| | LLTM | 7D LLTM | RM | 7D RM | 2 classes | 3 classes |
|---|---|---|---|---|---|---|
| Log likelihood | −15,179 | −14,138 | −13,706 | −12,129 | −12,069 | −11,976 |
| Number of independent parameters | 18 | 38 | 71 | 98 | 143 | 215 |
| AIC | 30,394.80 | 28,351.76 | 27,554.28 | 24,453.43 | 24,424.69 | 24,381.42 |
| BIC | 30,478.69 | 28,528.86 | 27,885.18 | 24,910.16 | 25,091.15 | 25,383.45 |

Note. 7D LLTM = seven-dimensional linear logistic test model, 7D RM = seven-dimensional Rasch model.

response patterns, a situation that must be taken into account in model fitting (see Section 3). If model selection were based on the BIC, the decision would be to choose the seven-dimensional Rasch model, because the BIC value is smallest for this model. Yet before a decision is made, let us first take a closer look at the models and their estimates.
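The information criteria in the table follow the standard definitions, AIC = −2·logL + 2k and BIC = −2·logL + k·ln(n). A minimal sketch using the rounded log likelihoods from the table (the sample size n is not stated in this excerpt, so the BIC call below uses an arbitrary placeholder value):

```python
import math

def aic(log_lik: float, k: int) -> float:
    # Akaike information criterion: -2*logL + 2*k
    return -2.0 * log_lik + 2.0 * k

def bic(log_lik: float, k: int, n: int) -> float:
    # Bayesian information criterion: -2*logL + k*ln(n)
    return -2.0 * log_lik + k * math.log(n)

# Rounded log likelihoods and parameter counts from the table;
# the table's decimals (e.g., 30,394.80) reflect the unrounded log likelihoods.
print(aic(-15179, 18))        # LLTM
print(aic(-13706, 71))        # RM
print(bic(-13706, 71, 1000))  # n = 1000 is a placeholder, not from the source
```

Because the AIC penalty (2 per parameter) is lighter than the BIC penalty (ln(n) per parameter for any realistic n), the two criteria can rank the heavily parameterized mixture models differently, which is exactly what the table shows.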

For the first four models, it is necessary to constrain the sum of the person or item parameters to zero so that the models are identified. For these analyses the constraint was placed on the cases (persons), that is, all latent means were set to zero. In the first model, that is, the main effects model (5), known here as the LLTM, the probability of a task response is a function of one person parameter θv, one item parameter, and one cognitive component (method) parameter. It is the most restrictive model because it has only 18 independent parameters and a log likelihood of −15,179, which is the lowest value of all the tested models. The 18 parameters comprise the mean (which is set to zero and not counted as a parameter) and variance of the latent trait, 10 parameters for the content areas, and 7 parameters for the methods.

Analyses reveal a tremendous increase in the log likelihood for the ordinary Rasch model, in which no additive decomposition of the task difficulty into an item effect and a method effect is assumed. The log likelihood increases to −13,706 for the Rasch model, which has 70 item-method parameters. Both models can be compared by means of their task difficulties. These are the item-method parameters of the Rasch model and the sums of the content and the method parame-
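Since the LLTM is nested in the Rasch model (it restricts the 70 item-method parameters to an additive content-plus-method structure), the gain can be expressed as a likelihood-ratio statistic. A small sketch using the rounded values from the table:

```python
# Likelihood-ratio statistic for nested models: 2 * (logL_general - logL_restricted)
ll_lltm, k_lltm = -15179, 18  # LLTM (restricted model), rounded values from the table
ll_rm, k_rm = -13706, 71      # ordinary Rasch model (general model)

lr = 2 * (ll_rm - ll_lltm)  # improvement in fit, asymptotically chi-square
df = k_rm - k_lltm          # additional free parameters in the general model
print(lr, df)
```

The statistic is referred to a chi-square distribution with df degrees of freedom; a value this large relative to df is what the text describes as a tremendous improvement over the additive decomposition.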
