
The Blue DOT 14 - Multidisciplinary Science & Evidence For Education

Welcome to the 14th edition of the Institute's flagship publication, The Blue DOT. In this edition, we bring you news of the International Science and Evidence-based Education (ISEE) Assessment, which the Institute embarked on about 18 months ago. The ISEE Assessment contributes to re-envisioning the future of education and feeds into UNESCO's Futures of Education report; today it brings together over 250 authors from 70 countries. Read opinion pieces by thought leaders, experts and academics, watch interviews with our advisory board members, and explore the learnings of our research fellows as you navigate this edition on Multidisciplinary Science & Evidence for Education.



OPINION

Decisions are often made on the basis of incomplete and imperfect information, and the uncertainty around quantitative results is one of the key factors at play. As the eventual implementation of interventions may have positive or negative impacts on learners, understanding the uncertainty of impact estimates must be an integral part of educational practice and policymaking. In particular, the consistency or variability of effect sizes across studies of similar interventions is critical to assessing how well their desired outcomes generalize to other populations and contexts (Borenstein, 2019). Effect sizes¹ (with confidence intervals) provide a better indication of impact than significance testing (p-values) and should be discussed in all cases. This rationale implies moving away from a dichotomous interpretation of statistical significance as the means to gauge the effectiveness of an intervention.
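To make the contrast with bare p-values concrete, here is a minimal sketch of reporting an effect size with its confidence interval — Cohen's d for two independent groups, with the common large-sample approximation for its standard error. The sample data and the function name are invented for illustration.

```python
import math

def cohens_d_ci(group1, group2, z=1.96):
    """Return (d, lower, upper): Cohen's d with an approximate 95% CI."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    var1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    var2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    # Pooled standard deviation of the two samples
    pooled_sd = math.sqrt(((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd
    # Approximate standard error of d for independent groups
    se = math.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
    return d, d - z * se, d + z * se

intervention = [5, 6, 7, 8, 9]
control = [3, 4, 5, 6, 7]
d, lo, hi = cohens_d_ci(intervention, control)
# Here d is large (about 1.26) but the interval spans zero, so the
# evidence of impact is far more uncertain than the point estimate suggests.
```

The interval, not the point estimate alone, is what tells a policymaker how much the apparent impact could plausibly vary.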

How do we remain faithful to the evidence, whilst making it fit our context?

Using the best evidence as outlined above still offers no guarantee of success, because this evidence describes what has worked, rather than what will work. In addition, experimental results, even aggregated and recalibrated through meta-analysis, are always context-dependent, and interventions are never implemented in the same context (Smets & Struyven, 2018). An effectiveness prediction (Joyce & Cartwright, 2020), by contrast, is the prediction that a given intervention, documented as the most effective, will work concretely within the multi-faceted characteristics of a given context of application. Effectiveness predictions emphasize evidence regarding how a particular context is likely to provide the necessary conditions for the anticipated benefits to occur. Thus, this type of evidence is very different from evidence related to the generalization of experimental studies. Yet it focuses on the same key aspect as evidence of relative effectiveness, that is, causality between interventions and outcomes. This causality is further explored through an emphasis on mechanisms, which formalize causal processes (Caswell, Maidment, Ross, & Bradbury-Jones, 2020) hinging on structure, culture and agency (De Souza, 2016).

This last step, by focusing on adapting the context of implementation, avoids the circular reasoning involved in adapting the best possible intervention to a given context: an intervention not implemented as tested will likely yield unknown outcomes, because the causal link between an intervention and its outcome depends on the permanency of both; if you change one or the other, the prediction no longer holds.

Conclusion: how to remain up to date in obtaining and applying evidence?

The methodology for obtaining evidence of relative effectiveness in education has recently been updated. Reaping its benefits is a massive undertaking, consisting of a systematic literature search, followed by appropriate meta-analysis, followed by systematic literature reviews for predictions of effectiveness. Once completed in any pertinent educational domain, this work should be turned into a living review, in which the initial framework is used to integrate new studies as they are published. This integrative scholarly work is frequently overlooked. Nevertheless, it is the essential final step in using research for policymaking and practice in education. Governments, universities, and partner NGOs should collaborate further to gather the resources and expertise to pursue this international endeavor.

REFERENCES

Borenstein, M. (2019). Common Mistakes in Meta-Analysis and How to Avoid Them. Englewood, USA: Biostat.

Brighouse, H., Ladd, H. F., Loeb, S., & Swift, A. (2018). Educational Goods: Values, Evidence, and Decision-Making. Chicago, IL: University of Chicago Press.

Caswell, R. J., Maidment, I., Ross, J. D. C., & Bradbury-Jones, C. (2020). How, Why, for Whom and in What Context, Do Sexual Health Clinics Provide an Environment for Safe and Supported Disclosure of Sexual Violence: Protocol for a Realist Review. BMJ Open, 10, 1–8. doi:10.1136/bmjopen-2020-037599

De Souza, D. (2016). Critical Realism and Realist Review: Analyzing Complexity in Educational Restructuring and the Limits of Generalizing Program Theories Across Borders. American Journal of Evaluation, 37(2), 216–237.

Joyce, K., & Cartwright, N. (2020). Bridging the Gap Between Research and Practice: Predicting What Will Work Locally. American Educational Research Journal, 57(3), 1045–1082. doi:10.3102/0002831219866687

Smets, W., & Struyven, K. (2018). Realist Review of Literature on Catering for Different Instructional Needs. Education Sciences, 8, 113. doi:10.3390/educsci8030113
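The meta-analytic step described in the conclusion above — pool effect sizes across studies, check their consistency, and re-pool as a living review when new studies appear — can be sketched minimally as inverse-variance fixed-effect pooling with the I² consistency statistic. All study values and the function name are invented for illustration.

```python
import math

def pool(effects, variances, z=1.96):
    """Inverse-variance pooling; return (pooled, lower, upper, i_squared)."""
    weights = [1.0 / v for v in variances]
    total_w = sum(weights)
    pooled = sum(w * d for w, d in zip(weights, effects)) / total_w
    se = math.sqrt(1.0 / total_w)
    # Cochran's Q and I^2 gauge the variability of effects across studies:
    # high I^2 warns that a single pooled number hides real inconsistency.
    q = sum(w * (d - pooled) ** 2 for w, d in zip(weights, effects))
    df = len(effects) - 1
    i_squared = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, pooled - z * se, pooled + z * se, i_squared

effects, variances = [0.3, 0.5, 0.4], [0.04, 0.05, 0.02]
pooled, lo, hi, i2 = pool(effects, variances)

# A "living review" update is simply re-pooling once a new study is published:
pooled2, *_ = pool(effects + [0.45], variances + [0.03])
```

The design choice worth noting is that the living-review step reuses the initial framework unchanged: integrating a new study means appending its effect size and variance and re-running the same pooling, exactly as the conclusion recommends.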

Rebooting our relationship to technology: Insights from psychology into how we think about AI and EdTech in the aftermath of Covid-19

John Sabatini is a Distinguished Research Professor in the Institute for Intelligent Systems and the Department of Psychology at the University of Memphis. He conducts research in educational technology, reading literacy development and disabilities, assessment, cognitive psychology, and the learning sciences.

Art Graesser is an emeritus professor in the Department of Psychology and the Institute for Intelligent Systems at the University of Memphis. He conducts research in discourse processing, cognitive science, educational psychology, computational linguistics, and artificial intelligence in education. He developed AutoTutor, a system that helps people learn with conversational agents.

JOHN SABATINI, ARTHUR GRAESSER

Abstract

In this article, we explore whether it is time to reconsider our relationship to technologies in education, both as individuals and as societies. Multiple generations have now experienced different forms of digital devices and the functions and roles they play in society and education, layered upon traditional models of the role and function of education institutions. Over the same time period, we have learned a great deal from psychology about how humans reason, make decisions, and learn. In this article, we review recent and emerging AI and

¹ An effect size is an indication of the magnitude of differences between groups or of the intensity of the relation between variables.

