Analytical Study of the Quantitative Indices of Multiple-Choice Questions of the Immunology Department, School of Medicine, Ahvaz Jundishapur University of Medical Sciences

Article type: Research article

Authors

1 Faculty member, Department of Immunology, School of Medicine, Ahvaz Jundishapur University of Medical Sciences, Ahvaz, Iran.

2 Department of Immunology, School of Medicine, Ahvaz Jundishapur University of Medical Sciences, Ahvaz, Iran

Abstract

Given the widespread use of multiple-choice questions (MCQs) in the assessment of medical students, the present study analytically examined the MCQ examinations of the Immunology Department of Ahvaz Jundishapur University of Medical Sciences in 2017 (1396 SH). In this descriptive cross-sectional study, data from the department's MCQ examinations scored by an OP-Scan machine were examined (808 questions). The difficulty index, discrimination index, and distractor options were calculated, and the data were analyzed using descriptive statistics and the Pearson correlation coefficient. The mean difficulty index of the questions was 0.59 ± 0.25, and 46.2% of the questions had an appropriate difficulty index. The mean discrimination index was 0.25 ± 0.24, and 57.3% of the questions had an appropriate discrimination index. Combining the difficulty and discrimination indices showed that only 248 questions (30.7%) were ideal MCQs. Of the 2,424 distractors examined, 1,525 (62.9%) were functional distractors (FDs) and 889 (37.0%) were non-functional distractors (NFDs). Overall, the findings showed that the questions lacked the necessary quality and need to be reviewed and revised.

Keywords


Article title [English]

Analytical Study of Quantitative Indices of Multiple-choice Questions of Immunology Department in Ahvaz Jundishapur University of Medical Sciences

Authors [English]

  • Abdolhussein Shakurnia 1
  • Mehri Ghafourian Boroujerdnia 2
  • Ali Khodadadi 2
  • Ata Ghadiri 2
  • Afshin Amari 2
1 Faculty member, School of Medicine, Ahvaz Jundishapur University of Medical Sciences, Ahvaz, Iran.
2 Immunology Department, School of Medicine, Ahvaz Jundishapur University of Medical Sciences, Ahvaz, Iran
Abstract [English]

Considering the widespread use of MCQs in the assessment of medical students, the present study was conducted to perform an item analysis of the MCQs of the Immunology Department at Ahvaz Jundishapur University of Medical Sciences in 2017. In this descriptive cross-sectional study, data from MCQ examinations designed by immunology faculty members and scored by an OP-Scan machine were assessed, and the difficulty index, discrimination index, and distractor options were examined (808 MCQs). Data were analyzed using descriptive statistics and the Pearson correlation coefficient. The results showed that the mean difficulty index of the MCQs was 0.59 ± 0.25, and 46.2% of the MCQs had a suitable difficulty index. The mean discrimination index was 0.25 ± 0.24, and 57.3% of the MCQs had a suitable discrimination index. Combining the two indices showed that only 248 MCQs (30.7%) were ideal. Of the 2,424 distractors examined, 1,525 (62.9%) were functional distractors (FDs) and 889 (37.0%) were non-functional distractors (NFDs). The findings showed that the MCQs need to be reviewed and improved.
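The three indices reported above follow standard classical test theory definitions. A minimal sketch in Python, assuming the conventional formulas (proportion-correct difficulty, upper/lower 27% groups for discrimination, and a 5% selection threshold for functional distractors); the function names and thresholds are illustrative, not taken from the paper itself:

```python
# Classical item-analysis sketch. Formulas are the standard CTT conventions,
# assumed here; they are not quoted from the paper.
from typing import List


def difficulty_index(correct: List[bool]) -> float:
    """P = proportion of examinees who answered the item correctly."""
    return sum(correct) / len(correct)


def discrimination_index(scores: List[float], correct: List[bool],
                         frac: float = 0.27) -> float:
    """D = (upper-group correct - lower-group correct) / group size,
    using the top and bottom `frac` of examinees ranked by total score."""
    n = max(1, int(len(scores) * frac))
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    lower, upper = order[:n], order[-n:]
    return (sum(correct[i] for i in upper) - sum(correct[i] for i in lower)) / n


def is_functional_distractor(times_chosen: int, n_examinees: int,
                             threshold: float = 0.05) -> bool:
    """A distractor is commonly called functional if at least 5% of
    examinees select it; otherwise it is non-functional (NFD)."""
    return times_chosen / n_examinees >= threshold
```

Under the commonly cited cut-offs, an "ideal" item would combine a moderate difficulty index with a positive discrimination index and all distractors functional.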

Keywords [English]

  • MCQs
  • Item analysis
  • Difficulty index
  • Discrimination index
  • Distractor options