Using Meta Analysis to Compare and Integrate Multimethod Research

A critical challenge in the systematic evaluation of multimethod research is how to compare and integrate results from multiple studies that use different outcome measures, research designs, samples, methodologies, and statistical analyses. An important limitation of the multimethod approach is that an individual study, or even a research program, is typically too narrow in focus to take full advantage of it. Hence, realizing the potential value of the multimethod approach may require researchers to compare and integrate the results of a large number of different studies. Although not traditionally viewed as a multimethod analysis, meta-analysis provides a framework for this task. Whereas the focus of meta-analysis historically has been on the synthesis of research, more recent work has focused on identifying methodological features of studies that substantially influence their results.

Meta-analysis poses methodological challenges of its own, however. For example, there is often an overly simplistic attempt to reduce the findings of each study to a single outcome variable. Although this greatly simplifies the statistical analyses, it ignores potentially important differences associated with specific outcomes. Becker (2000) reviewed different approaches to this problem (e.g., treating each outcome as independent, combining outcomes into a single score, creating independent data sets) but recommended newer approaches that require researchers to model the dependency among multiple outcomes using multivariate statistical techniques, such as the multilevel modeling approach outlined by Raudenbush and Bryk (2002). In summary, meta-analysis offers the multimethod analyst access to a wide variety of studies spanning an entire research literature, rather than the limited number of methods that can be incorporated into a single study.
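The basic synthesis step in meta-analysis can be illustrated with a minimal sketch of fixed-effect (inverse-variance) pooling of effect sizes. The effect sizes and sampling variances below are hypothetical, and this sketch deliberately treats one outcome per study; it does not address the dependent-outcomes problem that Becker (2000) and multilevel approaches are designed to handle:

```python
# Minimal fixed-effect meta-analysis sketch (inverse-variance weighting).
# The effect sizes (standardized mean differences) and sampling variances
# below are hypothetical values for illustration only.
effects = [0.30, 0.55, 0.12, 0.41]
variances = [0.04, 0.09, 0.02, 0.05]

# Each study is weighted by the inverse of its sampling variance,
# so more precise studies contribute more to the pooled estimate.
weights = [1.0 / v for v in variances]
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5  # standard error of the pooled effect

print(f"pooled effect = {pooled:.3f}, SE = {pooled_se:.3f}")
```

A random-effects model, or the multilevel approach cited above, would extend this by adding between-study variance and by modeling multiple dependent outcomes within each study.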
