Background and objectives
Skilled clinical reasoning is a critical tool for physicians, and educators agree that it should be formally taught and assessed. Objectives related to the mastery of clinical reasoning skills appear in the documentation of most medical schools and licensing bodies. We conducted this study to assess differences in the clinical reasoning skills of medical students following paper-based versus computer-based simulated instruction.

Materials and methods
A total of 52 sixth-semester medical students of the Dow University of Health Sciences were included in this study. All students attended a tutorial on clinical reasoning and its importance in clinical practice. Students were then randomly divided into two groups: group A received paper-based instruction, while group B received computer-based instruction (Flash-based scenarios developed with Articulate Storyline software [https://articulate.com/p/storyline-3]) focused on clinical reasoning skills in history-taking for acute and chronic upper abdominal pain. One week later, both groups were tested at two objective structured clinical examination (OSCE) stations assessing acute and chronic pain history-taking skills in relation to clinical reasoning.

Results
There were 27 students in group A and 25 in group B. The mean OSCE score was 28.6 ± 9.4 for group A (paper-based) and 38.5 ± 6.0 for group B (computer-based). Group B's mean clinical reasoning score was statistically significantly higher than group A's (p < 0.001).

Conclusion
A computer simulation program can enhance clinical reasoning skills. This technology could be used to acquaint students with real-life experiences and to identify areas needing further training before students face real patients.
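The abstract does not name the statistical test behind the reported p < 0.001. As a plausibility check only, the sketch below computes a Welch two-sample t-statistic from the reported summary statistics (group sizes, means, and standard deviations); the choice of Welch's test is an assumption, not the authors' stated method.

```python
import math

# Summary statistics reported in the abstract.
# NOTE: the use of Welch's t-test here is an assumption for illustration;
# the abstract does not specify which test was actually performed.
n_a, mean_a, sd_a = 27, 28.6, 9.4   # group A, paper-based
n_b, mean_b, sd_b = 25, 38.5, 6.0   # group B, computer-based

# Squared standard errors of each group's mean
se2_a = sd_a ** 2 / n_a
se2_b = sd_b ** 2 / n_b

# Welch's t-statistic for the difference in means
t = (mean_b - mean_a) / math.sqrt(se2_a + se2_b)

# Welch-Satterthwaite approximation for the degrees of freedom
df = (se2_a + se2_b) ** 2 / (
    se2_a ** 2 / (n_a - 1) + se2_b ** 2 / (n_b - 1)
)

print(f"t = {t:.2f}, df = {df:.1f}")
```

With these numbers the statistic comes out around t ≈ 4.6 on roughly 45 degrees of freedom, which comfortably exceeds the two-sided critical value for p = 0.001, so the reported significance level is consistent with the summary data.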