Document Type

Article

Source of Publication

Australasian Journal of Educational Technology

Publication Date

1-1-2021

Abstract

As the empirical literature in educational technology continues to grow, meta-analyses are increasingly being used to synthesise research to inform practice. However, not all meta-analyses are equal. To examine their evolution over the past 30 years, this study systematically analysed the quality of 52 meta-analyses (1988–2017) on educational technology. Methodological and reporting quality is defined here as the completeness of the descriptive and methodological reporting features of meta-analyses. The study employed the Meta-Analysis Methodological Reporting Quality Guide (MMRQG), an instrument designed to assess 22 areas of reporting quality in meta-analyses. Overall, MMRQG scores were negatively related to average effect size (i.e., the higher the quality, the lower the effect size). Owing to the presence of poor-quality syntheses, the contribution of educational technologies to learning has been overestimated, potentially misleading researchers and practitioners. Nine MMRQG items discriminated between higher and lower average effect sizes. A publication date analysis revealed that older reviews (1988–2009) scored significantly lower on the MMRQG than more recent reviews (2010–2017). Although the increase in quality bodes well for the educational technology literature, many recent meta-analyses still show only moderate levels of quality. Identifying and using only the best evidence-based research is thus imperative to avoid bias.

Implications for practice or policy:

• Educational technology practitioners should make use of meta-analytical findings that systematically synthesise primary research.
• Academics, policymakers and practitioners should consider the methodological quality of meta-analyses, as they vary in reliability.
• Academics, policymakers and practitioners could avoid misleading bias in research evidence by using the MMRQG to evaluate the quality of meta-analyses.
• Meta-analyses with lower MMRQG scores should be considered with caution, as they seem to overestimate the effect of educational technology on learning.

Publisher

Australasian Society for Computers in Learning in Tertiary Education

Volume

37

Issue

4

First Page

100

Last Page

115

Disciplines

Education

Keywords

bias, educational technology, meta-analysis, reporting quality, research methodology, systematic review

Scopus ID

85118128663

Indexed in Scopus

yes

Open Access

yes

Open Access Type

Gold: This publication is openly available in an open access journal/series

