Evidence-based decision making within cultures of learning and experimentation, enabled by EdTech, leads to more impactful, responsible and equitable uses of data.
Education systems must prepare for an increasing abundance of data, along with the related opportunities and risks. This requires a commitment and the capacity to use data and evidence to inform decisions that improve teaching, learning and the management of education systems. Promoting open technology standards and prioritizing ‘interoperability’ (so that, for example, data can be shared across applications in ways that are efficient, appropriate and safe) can help avoid technology and vendor ‘lock-in’, where future decisions about the use of EdTech are constrained by technology choices made in the past and by ‘data silos’ that don’t talk to one another. Going forward, issues related to data privacy, ownership, usage and security will become more acute, and clear policy guidance and rules need to be put in place, recognizing that decisions involve trade-offs and that such guidance and rules need to evolve over time. Iteration, controlled experimentation and nimble evaluations are critical to creating cultures of learning that can help separate ‘hope’ from ‘hype’ and inform future EdTech decisions.
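To make the notion of interoperability concrete, the sketch below shows how a student record might be exported as self-describing JSON against a documented, versioned schema, so that any compliant application can parse it rather than being locked to one vendor's format. This is illustrative only: the field names, the schema URL and the `to_interoperable_json` helper are hypothetical assumptions, not drawn from any actual standard (real standards such as Ed-Fi or OneRoster define far richer, formally governed schemas).

```python
import json

def to_interoperable_json(student_id, grade_level, scores):
    """Serialize a student record to an open, self-describing format.

    Hypothetical minimal schema for illustration; real interoperability
    standards define and version their schemas formally.
    """
    record = {
        "schema": "example.org/student-record/v1",  # document the schema version
        "student_id": student_id,
        "grade_level": grade_level,
        "assessment_scores": scores,
    }
    return json.dumps(record, sort_keys=True)

# Because the format is open and documented, a different application
# can consume the record without any vendor-specific tooling.
exported = to_interoperable_json("S-001", 8, {"reading": 512, "math": 498})
imported = json.loads(exported)
assert imported["grade_level"] == 8
```

The key design point is that the schema identifier travels with the data, so a receiving system can validate what it is reading instead of guessing.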
Towards an evidence-based use of EdTech to address learning challenges
Despite the increasing adoption of technology in education over recent decades, policy makers still navigate the EdTech landscape with considerable uncertainty. The Atlantis Group (2019) claims that: “There is little independent evidence about which EdTech products improve learning outcomes, so popular but ineffective products stay on the market (…) Where independent trials have been conducted, they have painted a mixed picture of the efficacy of even some of the most popular and widely available products”. A weak evidence base can foster urban legends and simple stereotypes, such as the idea that young people are uniformly “digital natives”. “It is necessary to debunk the stereotype that all young people uniformly possess these skills, and therefore teachers have nothing to teach students in this new millennium” (UNESCO, 2019b). Fortunately, a growing body of evidence is helping decision makers and educators to debunk such myths and misconceptions.
The latest International Computer and Information Literacy Study, conducted by the International Association for the Evaluation of Educational Achievement (IEA) among some 50,000 students (aged 14) in 13 countries, sheds light on misperceptions such as the “digital natives” idea (79% of students scored below the autonomy level). It also showed that digital technologies are not “boys’ toys” (girls consistently scored higher). This international standardized test also challenges the idea that technologies are necessarily “disruptive”: the ICT applications that students report using most frequently are word processing and presentation software, both classic digital tools (Fraillon et al., 2019). Similarly, recent studies challenge the idea that reducing “screen time” improves well-being or quality of life (Hall et al., 2019; Orben and Przybylski, 2019).
In an increasingly digital world, additional educational challenges need to be addressed. For instance, in PISA 2018 students were asked to distinguish between fact and opinion when reading about an unfamiliar topic. "Understanding involves knowledge and information, concepts and ideas, practical skills and intuition. But fundamentally it involves integrating and applying all of these in ways that are appropriate to the learner’s context. Reading is no longer mainly about extracting information; it is about constructing knowledge, thinking critically and making well-founded judgments. Contrast this with the findings from this latest round of PISA, which shows that fewer than 1 in 10 students in OECD countries were able to distinguish between fact and opinion". The time that 15-year-old students spent online outside of school increased between 2012 and 2018 by an average of more than 1 hour per day, to roughly 3 hours per day. The more knowledge technology allows students to search for and access, the more important it becomes for them to comprehend what they read and make sense of the content. In addition to bridging the digital divide between those who have access to technology and those who don’t, today’s policies need to ensure that learners not only acquire digital skills but also develop higher-order capacities such as agency and critical thinking (OECD, 2019).
See the following resources for more details:
• Atlantis Group (2019) System Failure: Why EdTech policy needs a critical update https://www.varkeyfoundation.org/what-we-do/atlantis-group/system-failure/
• Fraillon, Julian; Ainley, John; Schulz, Wolfram; Friedman, Tim; Duckworth, Daniel (2019) IEA International Computer and Information Literacy Study 2018 International Report https://www.iea.nl/index.php/publications/study-reports/preparing-life-digital-world
• Hall, Jeffrey A., et al. (2019) Experimentally manipulating social media abstinence: results of a four-week diary study. Media Psychology: 1-17. https://www.tandfonline.com/doi/abs/10.1080/15213269.2019.1688171
• OECD (2019) PISA 2018: Insights and Interpretations. https://www.oecd.org/pisa/PISA%202018%20Insights%20and%20Interpretations%20FINAL%20PDF.pdf
• Orben, Amy and Przybylski, Andrew K. (2019) The association between adolescent well-being and digital technology use. Nature Human Behaviour 3(2): 173. https://www.nature.com/articles/s41562-018-0506-1
• UNESCO (2019b) Empowering students to become agents of social transformation through mobile learning in Brazil: case study by the UNESCO-Fazheng project https://en.unesco.org/themes/ict-education/mobile-learning/fazheng/case-studies
How to address challenges regarding privacy and data security
While digital technologies can enhance and diversify opportunities for learning, they also demand new data and privacy literacies. Issues related to student privacy and data security become more acute as new technologies are introduced into individual teaching and learning activities, and into schools more generally, in ways that are increasingly integral and personal.
The use of new technologies can (potentially) open a Pandora’s box of new risks and threats, including ‘cyberbullying’, unauthorized data release, intellectual property theft and ‘cybercrime’. As clicks are tracked, facial recognition tools become cheaper, and ever more data is collected and stored ‘in the cloud’, available for analysis and cross-referencing with other data sources, the potential for new educational technologies to become tools for surveillance raises new and uncomfortable questions for educational policymakers, teachers and learners (and their parents).
Education organizations must take measures to minimize the collection of personal data and to protect confidential information and the identities of individuals represented in data sets from unauthorized access and manipulation by third parties. How does the education system collect, store and share data? What data is collected? Where and how is it collected? Who has access to it? How long is data stored, and how is it destroyed? These are questions that need to be explored.
For instance, conducting security audits to identify weaknesses and update/patch vulnerable systems; training staff and students on data privacy and security; and reviewing all sensitive data to verify that outside access is appropriately limited are essential issues to review as part of an EdTech policy. Education systems and schools should help educators make informed decisions about the potential privacy implications of educational technologies.
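One concrete data-minimization measure behind the questions above is to pseudonymize direct identifiers and drop fields an analysis does not need before data leaves the institution. The sketch below is illustrative only: the field names, the whitelist and the `pseudonymize` helper are assumptions, and a production system would manage the secret key in a key vault and follow a formal data-protection review rather than hard-coding anything.

```python
import hashlib
import hmac

# Assumption for illustration: in practice this key lives in a managed
# secret store, never in source code.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(student_id: str) -> str:
    # Keyed hash: stable enough to link a student's records across files,
    # but not reversible without the key.
    return hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256).hexdigest()[:16]

def minimize(record: dict) -> dict:
    """Keep only the fields the analysis needs; replace the direct identifier."""
    ALLOWED = {"grade_level", "attendance_rate"}  # hypothetical whitelist
    out = {k: v for k, v in record.items() if k in ALLOWED}
    out["pseudonym"] = pseudonymize(record["student_id"])
    return out

raw = {"student_id": "S-001", "name": "Ana", "grade_level": 8, "attendance_rate": 0.96}
safe = minimize(raw)
assert "name" not in safe and "student_id" not in safe
```

The point of the whitelist design is that new sensitive fields added upstream are excluded by default, rather than leaking until someone remembers to block them.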
The algorithms we use daily might encode human prejudices, misconceptions and biases. The new generation of data-intensive technologies will demand that education systems expand not only current digital skills but also data awareness: the understanding that data is never just ‘raw’, since the decision to collect data is itself an action based on a judgment that the data is of value. Finally, technical challenges cannot be addressed with technical solutions alone. Further discussion of data protection, transparency, explainability, fairness, accountability, bias, quality and ethics among different organizations across society will be increasingly relevant.
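A first, admittedly crude, step toward the fairness and bias discussion above is simply measuring whether an algorithm's outcomes differ across groups of students. The sketch below computes a disparate-impact style ratio of positive-outcome rates between two groups; the data, the group labels and the 0.8 rule-of-thumb mentioned in the comment are illustrative assumptions, not a substitute for a full fairness audit.

```python
def positive_rate(outcomes):
    """Share of positive decisions (e.g. 'recommended for advanced course')."""
    return sum(outcomes) / len(outcomes)

def disparate_impact(group_a, group_b):
    """Ratio of positive rates; values far below 1.0 suggest group_a is disfavoured."""
    return positive_rate(group_a) / positive_rate(group_b)

# Hypothetical algorithmic decisions (1 = positive outcome) for two student groups.
group_a = [1, 0, 0, 1, 0, 0, 0, 0]   # 25% positive
group_b = [1, 1, 0, 1, 1, 0, 1, 0]   # 62.5% positive

ratio = disparate_impact(group_a, group_b)
# A ratio of 0.4 falls well below the common 0.8 rule of thumb,
# flagging the system for closer human review.
assert ratio < 0.8
```

A low ratio does not prove discrimination, and a high one does not prove fairness; the value of such a check is that it turns a vague concern about bias into a quantity that can be monitored over time.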
See the following resources for more details:
• Address Privacy & Security https://digitalprinciples.org/principle/address-privacy-security
• Institute for Ethical AI in Education
• Privacy toolkit http://myprivacy.uk