Should Virtual Assistants Be Allowed to Harvest Children’s Data?

Voice-activated tools have been found to be “eavesmining” on children, raising privacy concerns

Elliot Leavy
07/26/2022

In a development that will surprise few, voice-activated personal assistants (VAPAs) have been found to be continuously listening to, recording, and processing sounds in the household, a process dubbed “eavesmining”.

What is eavesmining? A portmanteau of “eavesdropping” and “data mining”, eavesmining is an integral part of what is known as surveillance capitalism: an economic system centered on the capture and commodification of personal data for the core purpose of profit-making.

Such data collection raises clear ethical concerns, as many consumers may be unaware that their voice prints are being cataloged by a machine and fed into algorithms used to sell them products further down the line.

When it comes to children, these concerns become even more acute, as this data will be accumulated over a lifetime, with little ability to take it back once it has been extracted.

Despite such concerns, the VAPA market has grown rapidly over the past decade, with recent market research predicting that the number of voice-activated devices will exceed 8.4 billion by 2024.

As we reported here earlier this month, when it comes to audio surveillance, it isn’t just what is said that is recorded but how it is said and by whom. Biometric voice data collection is now so advanced that it can capture personal features of voices that involuntarily reveal biometric and behavioral attributes such as age, gender, health, intoxication, and personality.

Further concerns have been raised in the wake of law enforcement agencies gaining access to this data during criminal investigations, highlighting the ongoing mission creep of surveillance across wider society. Households could be labeled, for example, as “noisy” or “troubled”, labels that governments could later use to profile those reliant on social assistance or families in crisis, with potentially dire consequences.

In any case, as we discussed in our latest interview with Ansgar Koene, Global AI Ethics and Regulatory Leader at EY, there are clear generational differences in what younger people expect in terms of privacy today. Read it now to learn more about the ethical hurdles businesses increasingly have to clear in order to operate today.
