Development of Social Studies Aptitude Test for Testing Critical Thinking Skills: Implication for the Achievement of Education for Sustainable Development (ESD)
DOI: https://doi.org/10.53103/cjess.v3i4.163

Keywords: Social Studies Aptitude Test, Critical Thinking Skills, Education for Sustainable Development (ESD), Item Response Theory

Abstract
This study developed a Social Studies Aptitude Test (SSAT) that captures and measures critical thinking skills for the achievement of Education for Sustainable Development (ESD). Eight research questions guided the study, which adopted an instrumentation research design. The population comprised 72,854 Upper Basic School students, from which a sample of 1,000 students was drawn using simple random and cluster sampling techniques. The instrument was a 100-item multiple-choice SSAT developed by the researcher. The data were collated, entered into a computer and analysed using the chi-square goodness-of-fit test, frequencies, percentages, item characteristic curves and factor analysis. The findings revealed that most of the items in the SSAT had a good fit; that the test questions measured enquiry, intellectual, manipulative and societal-value skills; that the test items were distributed in line with the Upper Basic Education curriculum; and that the SSAT was reliable, with a reliability index of 0.89. The study also found that all 100 items measured a single construct; that most of the items (94 of 100) were satisfactory, good or moderate, needing little or no revision; that most of the items (89 of 100) were very easy or easy; and that most of the items (73 of 100) were not susceptible to guessing. Based on these findings, the study concluded that the SSAT items are valid and reliable, and recommended, among other things, that Social Studies teachers use the developed SSAT to assess secondary school students, especially during mock examinations in preparation for external examinations. The study contributes to knowledge by providing a test that measures the objectives of the revised Upper Basic Education curriculum as well as the achievement of Education for Sustainable Development (ESD).
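The abstract reports item difficulty and susceptibility to guessing alongside item characteristic curves, which is consistent with a three-parameter logistic (3PL) IRT analysis. The Python snippet below is a minimal sketch of how such an analysis represents and classifies items; the `p_correct` function, the item parameters, and the classification cut-offs are illustrative assumptions, not the study's actual estimates or thresholds.

```python
# Sketch of a 3PL item-response analysis: plot item characteristic curves
# and classify items by difficulty and pseudo-guessing.
# All parameter values and cut-offs below are illustrative assumptions.
import numpy as np
import matplotlib.pyplot as plt

def p_correct(theta, a, b, c):
    """3PL item characteristic curve:
    P(theta) = c + (1 - c) / (1 + exp(-a * (theta - b)))."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical parameter estimates for three items
# (a: discrimination, b: difficulty, c: pseudo-guessing).
items = {
    "item_01": (1.2, -1.5, 0.10),  # easy, low guessing
    "item_02": (0.8,  0.0, 0.30),  # moderate, guessing-prone
    "item_03": (1.6,  1.2, 0.15),  # hard, low guessing
}

theta = np.linspace(-4, 4, 200)
for name, (a, b, c) in items.items():
    plt.plot(theta, p_correct(theta, a, b, c), label=name)

    # Classify items the way the study reports them, using assumed cut-offs.
    difficulty = "easy" if b <= -0.5 else "moderate" if b <= 0.5 else "hard"
    guessing = "susceptible to guessing" if c > 0.25 else "not susceptible"
    print(f"{name}: b={b:+.1f} ({difficulty}), c={c:.2f} ({guessing})")

plt.xlabel("Ability (theta)")
plt.ylabel("P(correct)")
plt.title("Item characteristic curves (3PL)")
plt.legend()
plt.show()
```

In a full analysis the a, b and c parameters would be estimated from students' response data with an IRT package rather than assumed, and item fit would then be checked with a chi-square goodness-of-fit statistic, as the study reports.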
License
Copyright (c) 2023 Morrison Omokiniovo Jessa, John Nwanibeze Odili, Patrick Uzor Osadebe
This work is licensed under a Creative Commons Attribution 4.0 International License.
All articles published by CJESS are licensed under the Creative Commons Attribution 4.0 International License. This license permits third parties to copy, redistribute, remix, transform and build upon the original work, provided that the original work and source are appropriately cited.