The performance of candidates in large-scale examinations is often reported as a composite score that aggregates several components of a subject. The components reflect the fact that subjects comprise different topics or modalities, each assessed by a subset of items. Each subset of items measures a candidate's knowledge of a specific domain. However, more often than not, the construct validity or psychometric independence of each domain has not been empirically established, even though the domain has intuitive meaning. Factor analysis can be used to verify that the score reporting practice, as indicated by the number of domains, is supported by the underlying factor structure. In this paper, Social Studies and Science final examination test scores were used as dependent variables to extract underlying dimensions. The covariance matrix for each of the two subjects was submitted to a principal component analysis with Varimax rotation to produce factor loadings. The results indicated a unidimensional factor structure for Social Studies and a three-component model for Science. The findings were used to evaluate the adopted score reporting structure for each of the two subjects.
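The extraction procedure outlined above can be sketched in Python with NumPy. The data below are synthetic (two hypothetical latent abilities driving six component scores), and the average-eigenvalue retention rule is an illustrative assumption, since the abstract does not state which retention criterion was applied; it is a sketch of the general technique, not the paper's actual analysis.

```python
import numpy as np

def varimax(loadings, n_iter=100, tol=1e-6):
    """Rotate a loading matrix toward simple structure (Kaiser's varimax)."""
    L = loadings.copy()
    p, k = L.shape
    R = np.eye(k)
    prev = 0.0
    for _ in range(n_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - Lr @ np.diag((Lr ** 2).sum(axis=0)) / p)
        )
        R = u @ vt
        if s.sum() - prev < tol:      # stop when the criterion stabilises
            break
        prev = s.sum()
    return L @ R

# Synthetic stand-in for candidates' component scores: 200 candidates,
# 6 components driven by two latent abilities (hypothetical data).
rng = np.random.default_rng(0)
ability = rng.normal(size=(200, 2))
scores = np.column_stack([ability[:, 0]] * 3 + [ability[:, 1]] * 3)
scores += rng.normal(scale=0.5, size=scores.shape)

cov = np.cov(scores, rowvar=False)         # covariance matrix of the scores
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # principal components, largest first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Retain components whose eigenvalue exceeds the average eigenvalue
# (a covariance-matrix analogue of the Kaiser criterion).
k = int((eigvals > eigvals.mean()).sum())
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])   # unrotated loadings
rotated = varimax(loadings)                # varimax-rotated factor loadings
```

With this two-factor synthetic structure, the retention rule keeps two components, and after rotation each component score loads predominantly on the latent ability that generated it, which is the kind of pattern used above to judge whether a reported domain is psychometrically distinct.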