02

The problem and the potential of children’s education data

Sonia Livingstone

Kruakae Pothong

The data collected from children at or through their participation in school are exponentially increasing in variety and volume. This is partly mandated by government, partly determined by schools, and partly driven by the commercial desires of educational technology (EdTech) companies of all kinds, large and small, national and global, user-facing and business-to-business. Increasingly, children’s education data seem indispensable to public policy, planning and practice in education, health and welfare, and in schools, teaching, learning and assessment, safeguarding and administration. Meanwhile, commerce thrives on data – for research and development, advertising and marketing, and for many other valuable purposes within today’s highly profitable data ecosystem.

Whose interests are served by the intensifying ‘datafication’ of education and childhood? Datafication – the quantification and analysis of human activity – is increasingly informing public and private sector decision-making (Mayer-Schönberger & Cukier, 2013). The economic interests in data-driven EdTech are considerable, fuelled by the UK Government’s economic investment in the EdTech sector (DfE, 2019) and by the commercial ambitions of the rapidly growing global EdTech industry. The political interests are more subtle and diverse, encompassing efforts to shape the nature of education itself as well as the role of the private sector in public provision. These, in turn, have been fuelled by the demand surge for technologies to support remote learning during the COVID-19 lockdown (Walters, 2021), consolidating the appeal of quick-fix technological ‘solutions’ to society’s problems. Meanwhile, the public interest – including the interests of educators, the wider society and children in particular – is surprisingly little examined, even though grand claims abound about the transformative potential of technological innovation for education.

With daily news headlines announcing data breaches and cybercrime, experts debating ‘surveillance capitalism’ and algorithmic discrimination (Zuboff, 2019), and science fiction predicting a society run by robots, it is easy to take a dystopian view. Yet, innovation continues apace, the public is unwilling to give up its tech, and children themselves relish their digital expertise and agency. At the heart of this dilemma is trust, and the need for a viable mechanism for building trust (Edwards, 2004).

The Digital Futures Commission seeks to transcend the polarisation between technological optimists and pessimists by opening up a space for dialogue and deliberation about children’s data-driven education futures. This space draws on the experiences of ‘insiders’ and critical ‘outsider’ perspectives from academia, industry, civil society and those working directly with children/data/schools. It must be an inclusive and creative space, for, in the short history of our digital society, certain views and interests have quickly come to dominate, closing down possibilities for independent analysis and fresh thinking.

In this space of dialogue and deliberation, it is often easier to diagnose problems than to identify what ‘good’ looks like. But it is vital to find ways to ensure that data-driven EdTech benefits children, especially since the technological infrastructure on which a digital society relies is privately owned. The UK’s history of socioeconomic inequalities in education has already resulted in highly stratified childhood outcomes, which uses of technology tend to exacerbate (Helsper, 2021): can this be overcome or ameliorated? And the UK’s history of unresolved debates over the very purposes of education has left society ill-prepared to assert child-centred pedagogies over the instrumental approaches preferred by EdTech: can civil society rethink and redouble its advocacy? Regarding children’s education data, key questions include:

  • What data are collected from children at or through their participation in school, why, and how are they used?
  • How can we share data in the public interest, including to support children’s learning or welfare, without undermining their privacy?
  • Do uses of education data privilege some children over others, and can we design innovations specifically for those who are disadvantaged?
  • Should we better regulate, or differently incentivise, the EdTech market to benefit children’s education without commercially exploiting them?

The Digital Futures Commission is grounded in a clear human rights framework, namely, the United Nations Convention on the Rights of the Child (UNCRC, see UN General Assembly, 1989). As General Comment No. 25 by the UN Committee on the Rights of the Child asserts, children’s rights in relation to the digital environment require efforts on many fronts to mitigate risks, optimise opportunities and meet new challenges. And many of these efforts, in turn, demand critical attention to data. Thus far, we have found that child rights experts have paid too little attention to data, and data experts have paid too little attention to children and their rights. Meanwhile, educators and education policy often attend more to the digital products and services that can support learning than to the data processed by these technologies or the interests thereby served.

To advance the debate, drawing on the best available evidence and ideas, we invited essays from experts, including the data protection regulator, academia, private sector, non-governmental organisations and civil society. Within the broad remit of examining the potential for beneficial uses of children’s education data, each contributor was free to define the challenge as they saw fit. Some prioritise academic sources; others practical experience or professional insights. Some take a deliberately neutral stance; others are more critical, or political. Together, we believe they make a unique contribution towards a rights-respecting pathway for the uses of education data that benefits everyone.

Competing interests in education data

Education data can yield insights of many kinds. However, with the increasing datafication of children’s learning (Lupton & Williamson, 2017; Williamson, 2019), critical questions arise over whose interests are served by processing children’s education data. Two concerns have come to the fore. First, public and civil society bodies are being prevented from using education data in children’s best interests by risk-averse data protection regulation or bureaucratic practice. For example, even de-identified data is rarely shared in circumstances that would help a child or children at risk. Second, the EdTech sector finds itself relatively free to use even personally identifiable and sensitive data from children to pursue its commercial interests. This is because its complex data ecosystems are highly opaque, and its powerful players easily dwarf the capacity of a school to negotiate or even grasp the scale of their operations. The irony of this situation is painful, and children are doubly the losers.

Our first pair of essays sets out how greater data sharing could improve a host of child protection interventions. Indeed, Mark Mon-Williams, Mai Elshehaly and Kuldeep Sohal argue that, by combining datasets across institutions to piece together the information needed to warrant individual interventions, high-profile instances of systematic social care failures resulting in child deaths might have been prevented (Butler, 2021). They explore the potential of connected data to target efforts to mitigate risk and disadvantage and overcome the problematic fragmentation among services meant to safeguard children. The authors’ telling case studies illustrate how linking education and health datasets, combined with intersectional indicators of inequality, has informed policy and practice – for instance, providing for young children with undiagnosed autism – in ways not otherwise possible. Recognising the data protection risks and potential privacy infringements of creating ever-larger and more centralised databases about identifiable children, Mon-Williams et al. commit to the co-production of acceptable solutions with affected communities. While this adds to a project’s workload, it also lightens it by gaining community insights and ensuring community trust.

Leon Feinstein makes the case for state-mandated data collection from the most vulnerable children – to provide for them, as is their right, and to hold government to account for so doing. His case study of the lack of robust and comprehensive data on children with insecure immigration status shows that without such data collection these children are invisible to the system that is meant to support them. Hence their needs go unmet. Nor can society analyse the drivers of children’s problems or evaluate the interventions designed to improve their situation. Nonetheless, recording children’s immigration status at school and then sharing it with the Home Office or other government agencies has proved controversial.

Acknowledging the risk to the individual of sharing sensitive personal data, Feinstein advocates sharing only de-identified, aggregated data for explicit public purposes via secure services such as the Office for National Statistics’ (ONS) Five Safes framework. Also important are data ethics: this means taking seriously children’s right to be heard, including in practices of data collection and use, and weighing their views within a rights-based framework alongside their best interests, individually and collectively.

Yet the business models that drive EdTech and education data processing are not designed to meet these concerns. Indeed, they are attracting considerable criticism for pitting commercial interests against children’s best interests, as the following two essays examine. Michael Veale’s analysis of the vertically integrated business models of the major players – combining hardware, operating systems, cloud services and educational platforms – reveals how EdTech businesses, far more than public, educational or child rights considerations, set the standards and determine the rules of the game for the education system. And it is a long game they are playing: locking students early into particular tech practices and norms, providing schools with ‘free’ systems with profitable add-ons from which it is difficult to extricate themselves, and shaping the offer of content vendors to fit particular platform functionalities over others. Meanwhile, competition law and data protection regulation focus on consumer protection, which fails to take account of the particular needs of the education sector.

Alternative approaches exist, Veale suggests: more collaborative EdTech systems, national procurement frameworks, open source technologies and community-based projects – although these are difficult to scale or sustain, especially at low or no cost. Even these can be appropriated by major platforms able to adjust to and profit from diverse circumstances. But the increasingly global financial power brokers behind the EdTech brands already embedded in UK classrooms exert a very different influence, as Huw Davies, Rebecca Eynon, Janja Komljenovic and Ben Williamson examine. Crucially, the major investors in EdTech are generalists or tech evangelists rather than education experts, and their decisions are financially motivated.

Digital education platforms, Davies et al. show, play a crucial role in connecting young users to the surveillant and extractive data economy, guaranteeing what is seen as a reliable revenue stream from cradle to grave. EdTech investors are also political actors promoting normative educational futures in which learning is conceived as on-demand, personalised, lifelong and provided at scale via so-called ‘weapons of mass instruction’. Such visions prioritise efficiency gains and drill-and-skill over deep or child-led learning, encouraging external rather than intrinsic rewards and profoundly disintermediating the school, as the public education system comes to rely on EdTech platforms and companies appeal directly to parents and caregivers.

The struggle to make education data serve children’s best interests is fought not only in national policy circles but also in the everyday life of families and schools. Here, too, education data is occasioning plenty of trouble, undermining children’s rights in ways examined in the next section.

The trouble with data

Those working with data in practical settings are also raising the alarm about the complexities of education data and the difficulties of ensuring data underpin rather than undermine children’s needs and rights. Concerned that the everyday practices of schools now contribute, however inadvertently, to unregulated and risky data lakes, even data swamps, Heather Toomey documents a host of easily overlooked problems that demand rectification. Careful not to blame already over-pressed schools for ‘failing’ in their arguably impossible task, she highlights ways in which school cultures contribute to the datafication of childhood.

Teachers, administrators, safeguarding officers and other professionals set out to be conscientious in complying with regulations and respecting children’s rights. But they are busy, rushed, under-resourced, lacking relevant guidance or training, ever hopeful of finding a useful shortcut or workaround, and tempted to follow the usual practice rather than think things through from first principles. Dealing with EdTech can too easily take teachers’ attention from their primary task of educating the children in front of them. Moreover, not only is the complex data ecology they must navigate hardly transparent, but the very EdTech companies that pose difficulties for schools also proffer ‘solutions’ that can supposedly ease their path. And yet, broader uses of education data in children’s best interests are on offer – Toomey gives the example of how safeguarding needs may be met by interagency data sharing. Whether this can be enabled without further commercial exploitation of children’s data remains to be seen.

Education data may, for many reasons, often unintentional, enable discrimination, exclusion or inequality on multiple grounds, including gender, ethnicity, sexuality, disability, refugee status and more. Arguably, schools are precisely the institutions that should redress rather than perpetuate inequalities among children. Yet research reveals many biases, inaccuracies, distortions and other harms in the operation of data-driven and automated technologies that amplify and accentuate pre-existing sources of disadvantage in society. Najarian Peters examines the ‘dirty data’ processes that discriminate against Black children in the USA and UK, now perpetuated through EdTech. She charts a range of adverse outcomes from educational practices that result in Black children being recorded as less innocent or vulnerable and more aggressive or disruptive than their white classmates. No wonder Black parents more often choose home education for their children. Other than opting out, what are the prospects of righting the wrongs in education data and its uses? Peters calls for fair data practices, data subject rights, improved regulation and recognition of the Black Data Traditions by which Black communities seek to preserve their rights.

A common retort is that children’s parents have signed the necessary permissions with the school and that parents are, in any case, responsible for their children – and their children’s data. Yet parents, too, are poorly informed about data-driven EdTech and rarely able in practice to exercise their responsibilities. Rosalind Edwards, Val Gillies and Sarah Gorin commissioned a nationally representative survey of UK parents, which found that while parents were aware of data collected from their children, they were less aware of the uses to which data are put, including data sharing across agencies. Once made aware, parents expected to be asked for their consent, since only half trusted public services – including schools – to use information in children’s best interests. Inequalities matter – parents from relatively disadvantaged or discriminated-against groups, especially Black parents and lone parents, considered data linkage less legitimate, were less trusting of agencies and had more experiences of problematic uses of data regarding their child. Edwards et al. call for a public moratorium on data linkage while a meaningful national dialogue is held to ensure legitimacy.

Yet, far from any moratorium, the quantity and range of data collected from children in a typical day is escalating, as Jen Persson maps in her State of data 2020 report. That report highlighted the struggles of UK schools to manage education data and to navigate an at-times unclear or inadequate data governance landscape (Defend Digital Me, 2020). In her essay in this volume, she grounds her energetic call for a better system of education data processing within a holistic child rights framework. Different types of data and data processing are linked to different concerns and rights, few of which attract sufficient attention from the duty bearers – government, regulator, schools and businesses – charged with respecting children’s rights. Yet, she points out, at school, children have particularly little agency to determine what happens to them or their data. By contrast with many business-to-consumer uses of data, rarely can children consent to or withdraw consent from particular EdTech uses by the school. Nor do they have opportunities to exercise their data subject rights or to be consulted on the school’s education data policy.

The call for improved regulation is mounting on all sides, and we examine this next. It is notable, however, that the UK Government has recently proposed an alternative approach (Data Protection and Digital Information Bill, 2022). Whether the proposals amount to smarter regulation or simply fewer data protection constraints on the market remains open to debate. Below we consider the value of better regulation before turning to alternative approaches to respecting children’s rights in relation to data-driven EdTech.

The value of better regulation

Much of the work of the UK’s data protection authority focuses on preventing risky and unlawful sharing of personal data, including data from children, to which the Age Appropriate Design Code applies. However, tackling the thorny question of what ‘good’ data sharing looks like, Stephen Bonner, Melissa Mathieson, Michael Murray and Julia Cooke from the Information Commissioner’s Office also recognise the risks of not sharing children’s data when such sharing would be in children’s interests, enabling early intervention to prevent harm. To balance the risks of sharing with those of not sharing, in accordance with both the Age Appropriate Design Code (or Children’s Code) and their Data Sharing Code of Practice, they advocate a seven-point strategy: build on existing best practices; adopt a multistakeholder approach; ensure organisational accountability for data protection; prioritise data minimisation; promote transparency both in schools and from digital businesses; support confident data sharing through training and regulatory compliance; and underscore the importance of data accuracy and data subject rights.

Now that the UK is reconsidering its adherence to Europe’s General Data Protection Regulation (GDPR), what can be learned from European and international legal and human rights frameworks? Or even from data protection in other sectors? For Ingrida Milkaite, the critical challenge is less technological innovation than EdTech’s business model, which fuels ever more commercial data processing rather than serving educational goals. Given that the promised educational benefits remain unproven, and with costs to children’s rights also likely, the Council of Europe advocates the precautionary principle, especially regarding children’s sensitive and biometric data. The considerable power imbalance between children – and even schools – and EdTech businesses cannot be redressed through digital literacy education alone, important though this is. Consequently, Milkaite calls on national data protection authorities to strengthen their actions to enforce existing regulations, take a precautionary approach to technological innovation and uphold children’s rights in all contexts, including education.

One of the earliest laws to protect student privacy is the US Family Educational Rights and Privacy Act (FERPA, 1974), passed reactively at a time when state rather than commercial misuse of education data was the main concern. Half a century on, can other countries learn from the US experience? FERPA’s protections include the right to correct inaccuracies and prevent unauthorised sharing. Amelia Vance sets out the rationale for sector-specific protections – in this case, that students are required to attend school, that children are uniquely vulnerable to privacy harms, and that data processing is an integral part of school responsibilities. Yet, Vance argues, FERPA contains so many exceptions that, in practice, it has proved confusing and weak. Also problematic is its reliance on parental consent as a mechanism for data collection and sharing, since parents may give consent ill-advisedly or against their child’s interests for a host of practical reasons. Viewed from the UK, which has already benefited from provisions of the GDPR still lacking in the USA, the main lesson appears to be to avoid the mistakes made with FERPA.

Education is but one focus of innovation in our digital society. Riad Fawzi examines how the financial services sector has responded to financial technology (FinTech), which promises more diverse, tailored and affordable consumer services yet suffers from low public trust. As with EdTech, FinTech has harnessed modern technology to innovate business-to-business and business-to-consumer services. But both state and self-regulatory efforts are more advanced than appears to be the case for EdTech, resulting in a more mature regulatory ecosystem for FinTech and greater oversight and transparency. For both sectors, nonetheless, greater efforts are needed to merit public trust: Fawzi advocates a combination of regulation and self-regulation, as well as standards for security, privacy and digital identity, a commitment to customer service and, last but not least, provision of a truly valuable service – in this case, EdTech that meets children’s genuine educational needs.

In addition to complying with regulation, what else can and should EdTech businesses do? Whether regulation proves an enabler of or a brake on innovation, EdTech is innovating fast, and the drivers are not only commercial but also social and educational. So what better digital products and services can be hoped for, and are they in evidence?

Seeking design solutions

The most often mentioned benefit of education data is personalised learning – the promise of providing exactly the teaching materials that each child needs at just the moment when they need them. It is hoped that personalised learning can motivate, enable and reward all children as they learn while relieving teachers of the effort to support each child individually – and the guilt of attending to ‘difficult’ children while others lose out. Natalia Kucirkova weighs the evidence for the added value of deploying often-automated, data-driven, adaptive EdTech in the classroom, finding this not only weak but, where it exists, mainly focused on drill-and-skill learning.

Two problematic design principles underpin much of this technology – exponential growth (the idea that more data is always better) and recommendation systems that promote more similar content. But since educational theories instead value teachers’ knowledge of their pupils and a diversity of learning resources, better principles would minimise data collection, keep the ‘human in the loop’ and recommend multiple alternative opportunities. Redesigning EdTech with educational principles and learner agency at its core will require a substantial rethink by businesses.

One group for whom the benefits of data-driven technologies are eagerly anticipated is disabled children (Alper, 2017). Using the phrase ‘disabled children’ to emphasise the social model of disability, namely that any deficit lies not in the child but in society’s provision for all children, Sue Cranmer and Lyndsay Grant argue that, while there is evidence of digital technologies being used to benefit disabled children’s learning, traditionally such technologies have not been data-driven. When it comes to data-driven EdTech, critical concerns are growing about the biases, stigma and inequalities that automated uses of education data can impose on this group. Is there scope for empowering data-driven interventions to supplement long-standing efforts toward inclusive education?

The authors offer five suggestions to this end: systematic data collection to inform and target government actions; personalised learning provision that responds to accessibility or other disability-related difficulties; monitoring progress to identify when greater support is required; sharing data with relevant agencies for effective decision-making; and using data to represent diversity and redefine norms. In each case, however, they note potential risks as well as the lack of robust evidence for beneficial outcomes. They also note how rarely disabled children are themselves consulted or provided with genuine choices.

It may not be obvious that what education needs is a greater focus on students’ emotions. But through the advent of ‘affective computing’ or ‘emotional AI’ – or what Andrew McStay terms ‘automated empathy’ – an industry has grown to monitor and respond to children’s emotions at school. The technology now exists to record children’s facial expressions, keyboard presses and bodily movements and analyse the resulting data to segment, profile and score children on their attention, interest, uncertainties and feelings during learning. And already on the horizon are educational uses of automated biometric empathy in the metaverse. While not yet in operation in UK schools, McStay examines these developments as part of the broader agenda of personalised learning.

His essay sets out three critical concerns. First, he argues that the technology is inaccurate, being underpinned more by pseudoscience than robust evidence. Second, it infringes children’s rights to privacy, including freedom from surveillance, profiling and commercial exploitation. Third, it is unlikely to work in practice, for not only does it not meet a genuine educational need, but it is likely to generate unintended and adverse consequences as children seek to evade such scrutiny of their every move.

What of those working in EdTech itself? Solutions that build in safety, privacy and security by design are currently being sought in multiple domains, including in education, bringing into focus the role of designers and developers in protecting children’s interests. Ari Beckingham and Larissa Pschetz rethink the assumptions that underpin much EdTech design, concerned that too often design aims to maximise user attention rather than to encourage deep understanding holistically across formal and informal learning contexts. Instead, their research attends to the pace of learning, embedding ethical data practices in technology (for example, augmented reality [AR]) designed to encourage children to pay careful attention to the world around them and engage reflectively in their learning process. Such research seems to herald a promising alternative to the dominant focus of EdTech, foregrounding attention to pedagogy and inviting deliberation over the educational vision that data could and should serve.

Rethinking data futures

Without public trust in EdTech’s ambitions, policies and practices, scepticism about commercial uses of education data will likely grow rather than diminish. The final section of this volume is the most radical, exploring technical and market-led alternatives to the privacy-invasive systems of data harvesting run by data oligopolies. In other words, rather than placing ever-greater reliance on the regulator, can a new ecosystem of trusted data management technology (or a personal data store) and a new data management service (or a data trust) offer data subjects more effective control? The concept of data trusts as a solution for privacy protection was introduced over a decade ago (Edwards, 2004). Yet data trusts as technical and market solutions are only now gaining traction, with concrete proposals coming to the fore, as discussed in the essays in this section: the authors explore the hope that data trusts can help to realise the benefits of sharing education data for the public, including children.

Expectations of the data protection regulator in a digital society are becoming impossible to meet, not least because people want personalised services. But, as Roger Taylor argues, people wish to be protected not from personalisation in and of itself, but from harmful or exploitative use of data by providers of data-driven services. His proposed alternative is to separate the management of data (or data stewardship) from the provision of data-driven services and applications via the creation of data trusts as a service. Such data trusts manage individual users’ data on their behalf and must be governed independently, in ways that respect the interests of individual data subjects, with other public and private sector benefits likely to follow, as emerging good practice cases suggest. As several authors note, however, the political, policy and business challenges are considerable.

Defining data trusts as legal entities that provide independent stewardship of data, Jim Knight and Timo Hannay base their optimism on the experience of founding a data analytics company to examine the effects of remote learning on children’s outcomes during the pandemic. This taught them that however valuable the insights from education data, these cannot be obtained when public trust in technology companies to manage data fairly is dropping (Wisniewski, 2020). Trust is of particular importance in relation to big data and artificial intelligence (AI), where it is implausible that the public can understand and scrutinise the uses of their data. This applies especially to children and those responsible for them. Avoiding simple solutions, Knight and Hannay are careful to argue for data trusts as part of a wider mix of legislative, self-regulatory and other actions to promote the common good in a digital world. 

Also responding to calls for ever tighter data protection regulation, which he sees as resulting from fears linked to surveillance capitalism, Bill Thompson advocates an innovative technical approach to data management – the personal data store. With several different forms available, and more experimentation underway, the heart of this alternative is that the data owner – potentially, the child – stores their own data and controls access to it. The personal data store, he suggests, could be embedded in a trusted public service data ecosystem such as that developed at the BBC. Although the technical potential has existed for a while, with interest now growing in response to the extensive datafication of childhood (Barassi, 2020; Mascheroni, 2020), challenges remain, including gaining informed consent from minors, data breaches and the difficulty of rectifying poor choices. Nonetheless, with greater transparency and user control on offer, there are grounds for optimism.

Jun Zhao also addresses the potential of data trusts in calling for a new decentralised data governance structure for children’s data and data sharing. Recognising the host of data governance problems set out by the Digital Futures Commission (Day, 2021), and concerned not to burden individuals with excessive vigilance and comprehension regarding the commercial data ecology, Zhao joins those seeking a technical rather than a regulatory solution, whether through data commons, data trusts or data cooperatives. She examines existing and hypothetical cases in education to highlight how a data trust, with trustees dedicated to acting in children’s interests, can provide a much-needed intermediary between schools, students and EdTech companies. Can this model work at scale? If so, what legal framework is required, how might it be funded and who will be liable if something goes wrong? As new questions arise, the space for debate over children’s education data expands, and the potential for rights-respecting approaches is kept alive.

A rights-respecting approach to children’s education data

This volume does not position EdTech – or the data it generates – as either good or bad in and of itself. Instead, we emphasise the role of human actions and values in determining how technological design and systems, business logics, communities of practice and other socioeconomic and political factors (Ihde, 2002; Arthur, 2009) ‘serve human beings in the accomplishments of their individual and collective purposes’ (Buchanan, 2001, p. 9). Or fail to serve them. By examining the forces shaping children’s learning lives, we hope to identify the steps needed to better realise their rights in a digital world.

Talk of rights often focuses on particular areas of children’s lives, but the UNCRC insists on a holistic approach to children’s rights to participation, education, information, privacy, play and fullest development, among other rights, for rights cannot be ranked. Crucially, the UNCRC emphasises the child’s best interests as a primary consideration. This outweighs commercial interests and demands a comprehensive assessment of the needs and rights of each child and children collectively. 

Also significant in the UNCRC are what are called the general measures of implementation. These specify how states, as duty bearers, should act, taking all necessary steps, including ensuring that business and other actors meet their responsibilities to children. Indeed, it is notable that many authors in this volume have paid more attention to the organisations that process education data than to the data flows themselves. They have emphasised the importance of establishing appropriate legislation and implementing it effectively to prevent harm, improve provision and participation, and stimulate innovation that opens up new opportunities for society, including children.

This volume builds on the Digital Futures Commission’s recent critique of the UK’s governance of children’s education data (Day, 2021), followed by a multistakeholder roundtable discussion (Livingstone et al., 2021), a deep dive into the data-related challenges faced by schools (Turner et al., 2022), consultations on children’s hopes and concerns for their digital lives (Mukherjee & Livingstone, 2020), and sociolegal analysis of the problematic practices of prominent EdTech companies (Hooper et al., 2022). Here, our purpose is to look forward.

This volume offers critical, practical and creative reflections that can guide society in harnessing education data for good. It weaves together often-disconnected policy conversations about technologies as a means to support education (UN, 2022) with regulatory, market and technical solutions for data governance. Across the essays, it highlights the fundamental principles that should guide state and business activities – transparency, accountability, legitimacy, fairness and non-discrimination, appropriate remedy, consultation with those affected, ensuring public trust, and innovation in children’s best interests. These principles have been widely overlooked in relation to children’s education data, and it is time to prioritise them. The Digital Futures Commission is proud to have brought together these insightful essays, which will surely inform and advance the public and policy debate. They also provide a sound basis on which to develop our forthcoming blueprint for regulatory and practical change to ensure that future uses of education data serve children’s best interests.

  • Alper, M. (2017). Giving voice: Mobile communication, disability, and inequality. MIT Press. 
  • Arthur, W. B. (2009). The nature of technology: What it is and how it evolves. Simon & Schuster.
  • Barassi, V. (2020). Child data citizen: How tech companies are profiling us from before birth. MIT Press.
  • Buchanan, R. (2001). Design research and the new learning. Design Issues, 17(4), 3–23.
  • Butler, P. (2021). Anger over child deaths should not trigger knee-jerk overhaul of social care policy. The Guardian, 19 December. www.theguardian.com/society/2021/dec/19/child-deaths-policy-social-care
  • Data Protection and Digital Information (UK) Bill (2022). https://bills.parliament.uk/bills/3322
  • Day, E. (2021). Governance of data for children’s learning in UK state schools. Digital Futures Commission, 5Rights Foundation. 
  • Defend Digital Me (2020). The state of data 2020: Mapping a child’s digital footprint across England’s state education landscape. https://defenddigitalme.org/research/the-state-of-data-2020
  • DfE (Department for Education). (2019). Realising the potential of technology in education. www.gov.uk/government/publications/realising-the-potential-of-technology-in-education
  • Edwards, L. (2004). Reconstructing consumer privacy protection on-line: A modest proposal. International Review of Law, Computers & Technology, 18(3), 313–344.
  • FERPA (Family Educational Rights and Privacy Act). 20 USC § 1232g (1974).
  • Helsper, E. (2021). The digital disconnect: The social causes and consequences of digital inequalities. SAGE.
  • Hooper, L., Day, E., Livingstone, S., & Pothong, K. (2022). Problems with data governance in UK schools: The cases of Google Classroom and ClassDojo. Digital Futures Commission, 5Rights Foundation.
  • Ihde, D. (2002). Bodies in technology (Vol. 5). University of Minnesota Press.
  • Livingstone, S., Atabey, A., & Pothong, K. (2021). Addressing the problems and realising the benefits of processing children’s education data: Report on an expert roundtable. Digital Futures Commission, 5Rights Foundation.
  • Lupton, D., & Williamson, B. (2017). The datafied child: The dataveillance of children and implications for their rights. New Media & Society, 19(5), 780–794.
  • Mascheroni, G. (2020). Datafied childhoods: Contextualising datafication in everyday life. Current Sociology, 68(6), 798–813.
  • Mayer-Schönberger, V., & Cukier, K. (2013). Big data: A revolution that will transform how we live, work, and think. Houghton Mifflin Harcourt.
  • Mukherjee, S., & Livingstone, S. (2020). Children and young people’s voices. Digital Futures Commission, 5Rights Foundation.
  • Turner, S., Pothong, K., & Livingstone, S. (2022). Education data reality: The challenges for schools in managing children’s education data. Digital Futures Commission, 5Rights Foundation. 
  • UN (United Nations). (2022). More than 150 ministers of education discuss key elements for transforming education ahead of September Summit. Press release, 30 June. www.un.org/en/transforming-education-summit/tes-pre-summit-closing-press-release?mc_cid=23b00dde7e&mc_eid=67555cba3d
  • UN General Assembly. (1989). United Nations Convention on the Rights of the Child. Treaty Series, Vol. 1577, p. 3, 20 November. www.refworld.org/docid/3ae6b38f0.html
  • Walters, R. (2021). UK EdTech sector grows to £3.5bn as demand surges for digital classrooms and AR. FE News, 14 January. www.fenews.co.uk/skills/uk-edtech-sector-grows-to-3-5bn-as-demand-surges-for-digital-classrooms-and-ar
  • Williamson, B. (2019). Brain data: Scanning, scraping and sculpting the plastic learning brain through neurotechnology. Postdigital Science and Education, 1(1), 65–86.
  • Wisniewski, G. (2020). Losing faith: The UK’s faltering trust in tech. Edelman, 30 January. www.edelman.co.uk/research/losing-faith-uks-faltering-trust-tech
  • Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. Profile Books.

Sonia Livingstone OBE FBA is a professor in the Department of Media and Communications at the London School of Economics and Political Science. She has published 20 books on media audiences, especially children and young people’s risks and opportunities, media literacy and rights in the digital environment. Her new book is “Parenting for a Digital Future: How hopes and fears about technology shape children’s lives” (Oxford University Press, with Alicia Blum-Ross). She leads the Digital Futures Commission with the 5Rights Foundation and Global Kids Online with UNICEF, and works on several UKRI- and EC-funded projects concerned with children’s digital lives.

Digital Futures Commission and London School of Economics and Political Science

Dr Kruakae Pothong is a Researcher at 5Rights and a visiting research fellow in the Department of Media and Communications at the London School of Economics and Political Science. Her current research focuses on child-centred design of digital services. Her broader research interests span human–computer interaction, digital ethics, data protection, and Internet and related policies. She specialises in designing socio-technical research, using deliberative methods to elicit human values and expectations of technological advances, such as the Internet of Things (IoT) and distributed ledgers.

Digital Futures Commission and London School of Economics and Political Science