• Insa Schnittjer
  • Anna-Lena Gerken
The National Educational Panel Study (NEPS) investigates the development of competencies across the whole life span and designs tests for assessing these different competence domains. To evaluate the quality of the competence tests, a wide range of analyses based on item response theory (IRT) were performed. This paper describes the data and the scaling procedure for the mathematical competence test administered in grade 7 of starting cohort 3 (fifth grade). The mathematics test contained 23 items with different response formats representing different content areas and cognitive components. The test was administered to 6,194 students, and their responses were scaled using the partial credit model. Item fit statistics, differential item functioning, Rasch homogeneity, and the test's dimensionality were evaluated to ensure the quality of the test. These analyses showed that the test exhibited acceptable reliability and that the items fitted the model in a satisfactory way. Furthermore, test fairness could be confirmed for different subgroups. Limitations of the test were the large number of items targeting lower mathematical ability as well as the relatively high omission rates for some items. Overall, the mathematics test had acceptable psychometric properties that allowed for the estimation of reliable mathematics competence scores. Besides the scaling results, this paper also describes the data available in the Scientific Use File and provides the ConQuest syntax for scaling the data.
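The report itself scales the 23 mixed-format items with a partial credit model in ConQuest. As a rough illustration of the kind of estimation involved, the sketch below fits the simpler dichotomous Rasch model to simulated data via joint maximum likelihood with NumPy; all numbers and variable names are hypothetical and do not reproduce the NEPS procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 2,000 simulated test takers, 23 dichotomous items
# (the real test had 6,194 students and partial-credit items).
n_persons, n_items = 2000, 23
theta_true = rng.normal(0.0, 1.0, n_persons)   # person abilities
beta_true = np.linspace(-2.0, 1.0, n_items)    # item difficulties

# Simulate correct/incorrect responses: P(x = 1) = logistic(theta - beta).
p_true = 1.0 / (1.0 + np.exp(-(theta_true[:, None] - beta_true[None, :])))
X = (rng.random((n_persons, n_items)) < p_true).astype(float)

# Joint maximum likelihood: alternate diagonal Newton-Raphson steps
# on person and item parameters.
theta = np.zeros(n_persons)
beta = np.zeros(n_items)
for _ in range(50):
    P = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
    info = P * (1.0 - P)
    # Step for each person ability: gradient / information.
    theta += (X - P).sum(axis=1) / np.maximum(info.sum(axis=1), 1e-9)
    theta = np.clip(theta, -6.0, 6.0)  # guard against perfect scores

    P = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
    info = P * (1.0 - P)
    # Step for each item difficulty (sign flips relative to theta).
    beta -= (X - P).sum(axis=0) / np.maximum(info.sum(axis=0), 1e-9)
    beta -= beta.mean()  # identify the scale: mean item difficulty 0

# Recovered difficulties should correlate strongly with the true ones.
print(np.corrcoef(beta, beta_true)[0, 1])
```

A production analysis would instead use dedicated IRT software (such as ConQuest, as in the report) with marginal rather than joint estimation, plus the item fit, DIF, and dimensionality checks the abstract describes.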
Original language: English
Place of Publication: Bamberg
Publisher: Leibniz Institut für Bildungsverläufe, Nationales Bildungspanel
Number of pages: 28
State: Published - 01.2017

    Research areas

  • Methodological research and development - Item response theory, scaling, mathematical competence, scientific use file

ID: 693401