Some of the generalized Rasch models for multi-method data presented in Section 2 require parameter estimation methods that do not belong to the standard repertoire of the ordinary Rasch model, particularly if the data stem from incomplete test designs or if a multimatrix design has been used for data collection. In both cases, the data cube is incomplete, that is, it has many submatrices or subcubes where no responses were observed. In this section, some methods of parameter estimation and measures of accuracy will be discussed.

The application of the Rasch model requires the estimation of both types of parameters, those for persons and for items. There are at least three possibilities for estimating the parameters in the Rasch model:

■ Joint maximum likelihood (JML), that is, the maximization of the likelihood function, which contains both types of parameters.

■ Conditional maximum likelihood (CML), that is, the likelihood function is maximized after the elimination of the person parameters by conditioning on the sum scores.

■ Marginal maximum likelihood (MML), that is, the likelihood function is maximized while the person parameters are assumed to follow some distribution, typically the normal.

The best way to estimate the item parameters of the Rasch model is to use the CML approach because it leads to consistent item parameter estimates without making an assumption about the latent trait distribution. Such an assumption must be made for the MML approach. Usually a normal distribution is assumed, the parameters of which can be directly and consistently estimated together with the item parameters (Mislevy, 1984).
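The key device behind CML is that, in the Rasch model, the probability of a response pattern given its raw score no longer depends on the person parameter; it depends only on elementary symmetric functions of the transformed item parameters. As a minimal sketch (the function names `esf` and `cond_prob` are illustrative, not from the source), this conditional probability can be computed as:

```python
import math
from itertools import combinations

def esf(eps, r):
    """Elementary symmetric function of order r of the values eps."""
    # Sum of products over all r-element subsets; esf(eps, 0) == 1.
    return sum(math.prod(c) for c in combinations(eps, r))

def cond_prob(x, sigma):
    """Probability of response pattern x given its raw score.

    x: tuple of 0/1 responses; sigma: item difficulties.
    The person parameter cancels out, which is what makes
    CML estimates free of assumptions about the trait distribution.
    """
    eps = [math.exp(-s) for s in sigma]   # item "easiness" parameters
    r = sum(x)                            # raw score as sufficient statistic
    num = math.prod(e ** xi for e, xi in zip(eps, x))
    return num / esf(eps, r)
```

Because these conditional probabilities sum to one over all patterns with the same raw score, the CML estimation equations can be built from them without any reference to the latent trait distribution.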

Considering the JML approach, the estimates of the item and person parameters are only consistent if the number of persons and the number of items increase to infinity (Molenaar, 1995). As a consequence, the JML method is not used for estimating the item parameters anymore. However, it is still applied for the estimation of person parameters.

Traditionally these parameters are estimated as maximum likelihood estimators (MLE) using the first partial derivatives of the JML function over all items $i$ to which a person $v$ has responded:

$$\frac{\partial \log p(\mathbf{x} \mid \theta, \boldsymbol{\sigma})}{\partial \theta} = 0,$$

where $p(\mathbf{x} \mid \theta, \boldsymbol{\sigma})$ represents the likelihood of the response vector $\mathbf{x}$ of a certain person with parameter $\theta$ under the condition of known or sufficiently well estimated item parameters. For the Rasch model this likelihood is

$$p(\mathbf{x} \mid \theta, \boldsymbol{\sigma}) = \prod_{i=1}^{k} \frac{\exp\bigl(x_i(\theta - \sigma_i)\bigr)}{1 + \exp(\theta - \sigma_i)}.$$
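This likelihood is straightforward to evaluate numerically. A minimal sketch (the function name `rasch_likelihood` is illustrative):

```python
import math

def rasch_likelihood(x, theta, sigma):
    """Likelihood p(x | theta, sigma) of a response vector under the Rasch model.

    x: tuple of 0/1 responses; theta: person parameter;
    sigma: item difficulty parameters.
    """
    p = 1.0
    for xi, si in zip(x, sigma):
        # Each item contributes exp(x_i (theta - sigma_i)) / (1 + exp(theta - sigma_i)).
        p *= math.exp(xi * (theta - si)) / (1.0 + math.exp(theta - si))
    return p
```

For instance, when $\theta = \sigma_i$ for every item, each response has probability $0.5$, so a two-item pattern has likelihood $0.25$.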

The MLE has several disadvantages. First, it is infinite for persons who solve either none or all of the items. Second, it has a considerable bias, that is, the expectation of $(\hat{\theta}_v - \theta_v)$ is not zero. To circumvent these disadvantages, Warm (1989) modified the MLE via Bayes' theorem
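The first disadvantage is easy to see in code. The MLE solves the equation "raw score = sum of expected item probabilities", which has no finite root when the raw score is zero or perfect. A minimal Newton-Raphson sketch (the function name `mle_theta` and the starting value are assumptions for illustration, not the author's implementation):

```python
import math

def mle_theta(x, sigma, tol=1e-10, max_iter=100):
    """MLE of the person parameter theta given item difficulties sigma.

    Solves r = sum_i P_i(theta) by Newton-Raphson, where r is the raw score.
    For zero or perfect raw scores no finite solution exists, so +/-infinity
    is returned, illustrating the first disadvantage of the MLE.
    """
    r = sum(x)
    if r == 0:
        return float('-inf')
    if r == len(x):
        return float('inf')
    theta = 0.0  # assumed starting value
    for _ in range(max_iter):
        probs = [1.0 / (1.0 + math.exp(-(theta - s))) for s in sigma]
        grad = r - sum(probs)                 # first derivative of the log-likelihood
        info = sum(p * (1.0 - p) for p in probs)  # test information
        step = grad / info
        theta += step
        if abs(step) < tol:
            break
    return theta
```

At convergence, the sum of the model-implied probabilities equals the raw score exactly; for all-correct or all-incorrect patterns, that equality can only be approached as $\theta \to \pm\infty$.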
