Research - 02.03.2021 - 00:00
In February 2021, CNN reported on a Hong Kong school that used AI facial recognition technology to analyse schoolchildren’s faces in order to identify their emotions during online teaching in the context of the ongoing Covid-19 pandemic. The software also monitors students’ engagement and performance, personalises their learning experience and predicts their grades.
The use of AI facial recognition software at Hong Kong schools is but one of the latest examples of the datafication of children. In my latest book, entitled Child | Data | Citizen, published by MIT Press in 2020, I argue that we need to look at the datafication of children if we want to address one of the most important transformations brought about by data technologies and AI innovation: the rise of the datafied citizen.
Digital versus datafied citizen
In contrast to the digital citizen, who uses online technologies and especially social media to construct their public self, the datafied citizen is defined by narratives produced through the processing of data traces: they are the product of practices of data inference and digital profiling. In other words, data traces are made to speak for and about us. Children are key to understanding how citizenship is being transformed by our data-driven cultures.
As I show in the book, educational data is but one of many different types of data that tech companies are gathering about children and families. From doctor’s appointments to artificial intelligence in the home, from social media to mobile apps, children’s everyday life is recorded, stored and shared in ways that were not possible before.
Today’s children are the very first generation of citizens who are being datafied even from before they are born. In my TED talk, which has reached more than 1.7 million views, I discuss how this data is sometimes gathered even from the moment of conception, as parents search for “ways to get pregnant” online or use ovulation and pregnancy apps, through to so-called “sharenting” activities such as the posting of a child’s important life events via social media channels.
So why does it matter that our children are being tracked? The technological transformations of the last decade and the key innovations in the field of big data and artificial intelligence have led us to a historical situation in which our data traces, decontextualised, sterilised and compared against standardised algorithms, are used to make decisions about key aspects of our daily lives. When we look for a job, take out an insurance policy, apply for a loan, enrol our children in a new school and in countless other situations, our data traces are used to judge us in ways that escape our understanding and control.
Children and young people are more at risk
In this context, children and young people are more at risk. The data traces that are produced about them today, and even from before they were born, may follow children throughout their entire lives as they are profiled in potentially harmful ways, closing down their future job and life opportunities. Insurance companies, banks, police and courts all use data traces to make decisions about us in ways that we cannot control, but that can significantly impact children’s chances in life. Given the fallibility and bias involved in profiling humans (algorithms are, after all, systems designed by humans from particular cultural contexts), we already witness various types of algorithmic discrimination today, including the perpetuation of gendered, racialised or class-based inequalities.
Data rights are human rights
So what can be done? As I argue in my TED talk, I believe that the solution lies not with individual parents and their behaviours and habits online. This is a systemic issue that needs a political solution. Data rights are human rights. But current data protection regulations such as the EU’s GDPR fall short of addressing these issues. This is not least because companies track us legally: we provide consent to terms and conditions on a daily basis, often without having the possibility to opt out. In such cases, individual consent becomes meaningless. Instead, we need to come together as institutions, organisations and as a collective in order to shift current debates from individual responsibility and privacy to a collective demand to protect our right to freedom of expression, self-representation and non-discrimination.
Such discussions only become more urgent in the context of the ongoing Covid-19 pandemic, as the recent example of AI technologies at Hong Kong schools illustrates. In this context, my research aims to stimulate and inform wider public debate about the power that tech companies have over our children’s futures and the future of our democracies.
Veronica Barassi is Professor in Media and Communications Studies at the University of St.Gallen.
Image: Adobe Stock / Maksim Kabakou