β1j = γ10 + γ11 Zj + u1j (3)

The substitution of (2) and (3) in (1) produces the single equation

Yij = γ00 + γ10 Xij + γ01 Zj + γ11 Zj Xij + u1j Xij + u0j + eij (4)
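The substitution can be checked numerically. The sketch below uses made-up values for the gammas, residuals, and predictors (all hypothetical, chosen only for illustration) and confirms that building Yij from the level-1 and level-2 equations gives the same value as the combined single equation (4):

```python
# Hypothetical fixed coefficients, residuals, and predictors for one
# observation i in group j (illustrative values only).
g00, g10, g01, g11 = 2.0, 0.5, -0.3, 0.1   # the gammas
u0j, u1j, eij = 0.4, -0.2, 0.05            # group- and individual-level residuals
Xij, Zj = 1.5, 2.0                          # lowest- and highest-level predictors

# Level-2 equations (2) and (3): group-specific intercept and slope.
b0j = g00 + g01 * Zj + u0j
b1j = g10 + g11 * Zj + u1j

# Level-1 equation (1).
Y_level1 = b0j + b1j * Xij + eij

# Combined single equation (4), written out term by term.
Y_combined = (g00 + g10 * Xij + g01 * Zj + g11 * Zj * Xij
              + u1j * Xij + u0j + eij)

print(abs(Y_level1 - Y_combined) < 1e-12)  # → True
```

The two computations agree for any choice of values, which is exactly what the algebraic substitution asserts.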

In general there will be more than one explanatory variable at the lowest level, and also more than one explanatory variable at the highest level. The assumptions of the multilevel regression model are that the residual errors at the lowest level, eij, have a normal distribution with a mean of zero and a variance σe²; it is usually assumed that the groups have a common variance σe². The second-level residual errors u0j and u1j are assumed to be independent of the lowest-level errors eij, and to have a multivariate normal distribution with means of zero and variances σu0² and σu1². Other assumptions, identical to the common assumptions of multiple regression analysis, are fixed predictors and linear relationships.

The estimators generally used in multilevel analysis are Maximum Likelihood (ML) estimators, with standard errors estimated from the inverse of the information matrix. These standard errors can be used to establish a confidence interval or to test for significance. This is, in general, not correct for the variance components, because in that case the null hypothesis lies on the boundary of the parameter space (variances cannot be negative). Therefore, variances are generally tested using a likelihood-ratio test or a chi-square test described by Raudenbush and Bryk (2002). Two different likelihood functions are commonly used in multilevel regression analysis: Full Maximum Likelihood (FML) and Restricted Maximum Likelihood (RML; Raudenbush & Bryk, 2002; see also Goldstein, 1995). RML estimation is preferred when the interest is in estimating the variance components. For details on the statistical model and estimation techniques, we refer to the literature (e.g., Goldstein, 1995; Hox, 2002; Raudenbush & Bryk, 2002; Snijders & Bosker, 1999).
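The likelihood-ratio test for a variance component can be sketched in a few lines. The deviance values below are hypothetical (stand-ins for the -2 log-likelihoods of two nested model fits), and the halved p-value implements one common boundary correction, treating the reference distribution as a 50:50 mixture of chi-square(0) and chi-square(1):

```python
import math

def chi2_sf_1df(x):
    """Survival function of the chi-square distribution with 1 df: P(X > x)."""
    return math.erfc(math.sqrt(x / 2.0))

def lr_test_variance(deviance_restricted, deviance_full):
    """Likelihood-ratio test for a single variance component.

    Because the null hypothesis (variance = 0) lies on the boundary of the
    parameter space, the naive chi-square(1) p-value is conservative; a
    common correction halves it (a 50:50 mixture of chi2(0) and chi2(1)).
    """
    lr = deviance_restricted - deviance_full   # difference in -2 log-likelihood
    p_naive = chi2_sf_1df(lr)
    return lr, p_naive, p_naive / 2.0

# Hypothetical deviances from a model without and with the random slope:
lr, p_naive, p_boundary = lr_test_variance(1250.4, 1244.8)
print(round(lr, 1), round(p_naive, 3))  # → 5.6 0.018
```

Note that comparing deviances this way requires the two fits to be nested; when the models differ in their fixed parts, the comparison is only valid under FML, not RML.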
