• Anna-Lena Gerken
  • Insa Schnittjer
The National Educational Panel Study (NEPS) aims at investigating the development of competencies across the whole life span and designs tests for assessing these different competence domains. In order to evaluate the quality of the competence tests, a wide range of analyses based on item response theory (IRT) were performed. This paper describes the data and the scaling procedure for the mathematical competence test administered to first-year students of Cohort 5. The mathematics test contained 21 items with different response formats representing different content areas and different cognitive components. The test was administered to 5,915 first-year students. Their responses were scaled using the partial-credit model. Item fit statistics, differential item functioning, Rasch homogeneity, and the test's dimensionality were evaluated to ensure the quality of the test. These analyses showed that the test exhibited acceptable reliability and that the items fitted the model in a satisfactory way. Furthermore, test fairness could be confirmed for different subgroups. Limitations of the test were the large number of items targeted toward lower mathematical ability as well as the relatively high omission rates for some items. Overall, the mathematics test had acceptable psychometric properties that allowed for the estimation of reliable mathematics competence scores. Besides the scaling results, this paper also describes the data available in the Scientific Use File and provides ConQuest syntax for scaling the data.
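The paper itself supplies the authoritative ConQuest syntax; as a rough orientation only, a partial-credit scaling run in ACER ConQuest typically looks like the following minimal sketch. All file names and column positions here are hypothetical placeholders, not taken from the paper; the actual syntax ships with the Scientific Use File documentation.

```
/* Hypothetical sketch of a ConQuest partial-credit scaling run.   */
/* File names and response columns are placeholders, not the       */
/* actual NEPS syntax distributed with the Scientific Use File.    */
datafile maths_sc5.dat;            /* placeholder data file        */
format responses 1-21;             /* 21 items, one column each    */
codes 0,1,2;                       /* valid response categories    */
model item + item*step;            /* partial-credit model         */
estimate;
show >> maths_sc5_results.shw;     /* write parameter estimates    */
```

The `item + item*step` model statement is what distinguishes the partial-credit model from the simple Rasch model (`model item;`) in ConQuest: it adds item-specific step parameters for the polytomous response categories.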
Original language: English
Place of publication: Bamberg
Publisher: Leibniz Institut für Bildungsverläufe, Nationales Bildungspanel
Number of pages: 27
State: Published - 01.2017

Research areas

  • Methodological research and development - item response theory, scaling, mathematical competence, scientific use file

ID: 693414