Platforms Shirk Responsibility for Their Use of Children's Data
- "No one under 13 is allowed to create an account." – Snapchat
- "You must be at least 13 years old to use the Service." – Instagram
- "You will not use Facebook if you are under 13." – Facebook
- "THE SERVICE IS NOT FOR PERSONS UNDER THE AGE OF 13." – Musical.ly
- "We will not knowingly collect personally identifiable information from any person under the age of 13." – Steam
- "Minimum age requirement for having an account is 13 years." – YouTube
Many children use the apps, social media and games above even though they have not yet turned 13. Most simply lie about their age – often with their parents' tacit acceptance. The children, their parents and their teachers rarely read the terms of service (ToS) they agree to when signing up, so the children's data are at the services' full disposal.
This means that these services most likely treat children's data as they treat adults' data: building detailed personal profiles – sometimes including psychological traits – to micro-target commercial offers, prices and political messages.
But what good are fine words when only a few services actually take responsibility?
One in three internet users is a child, and in Denmark 64% of all children under 13 are on social media intended for people over 13. And yes, of course, it is the children's and parents' own fault. They could just read (and try to understand) the thousands of small words in the terms of service. Or do as the tech insiders in Silicon Valley, who understand how we are psychologically manipulated and therefore send their children to schools where computers, tablets and smartphones are banned.
But the responsibility lies not only with individuals, but also with legislators, who must make demands on privacy and data ethics, enforce legislation and act as role models. And, of course, with companies, who too often do nothing because safeguarding children is expensive and difficult. It is mainly when companies come under massive pressure that they act, and usually only in small steps, as in this YouTube example, where the company promises to stop more unsuitable videos that were not stopped before.
LEGO is one of the few data-ethical role models (another is Vaikai). LEGO has decided that its customers (the vast majority are children) must be protected, so there are no third-party cookies on LEGO's sites for children. LEGO does not use Google Analytics either, as it wants to remain in control of its data. Facebook Connect is not a login option, to avoid Facebook tracking the kids; children are encouraged to use pseudonyms; and LEGO has spent resources developing an effective consent management tool, so parents of children under the age of 13 must allow their children to use LEGO's sites. See a presentation by LEGO about this. Finally, LEGO screens all photos before they become visible – a practice most big tech companies leave to random users who stumble upon inappropriate content. LEGO's approach to data ethics earned a headline in Wired that says it all in one sentence: How LEGO built a social network for kids that's not creepy.
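The kind of parental-consent gate described above can be sketched in a few lines. This is a minimal illustration, not LEGO's actual implementation: the class names, the pseudonym field and the logic are invented for this example; only the 13-year threshold comes from the COPPA-style rules the services quote.

```python
from dataclasses import dataclass
from datetime import date

COPPA_AGE_LIMIT = 13  # under this age, verified parental consent is required


@dataclass
class Account:
    pseudonym: str            # children sign up with a pseudonym, not a real name
    birthdate: date
    parental_consent: bool = False  # set only after a parent explicitly approves


def age_on(birthdate: date, today: date) -> int:
    """Whole years between birthdate and today."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years


def may_use_service(account: Account, today: date) -> bool:
    """Users 13 or older pass; younger children need parental consent first."""
    if age_on(account.birthdate, today) >= COPPA_AGE_LIMIT:
        return True
    return account.parental_consent
```

A child's account is blocked until the consent flag is granted, while an account over the threshold passes straight through. A real consent tool would of course also have to verify that the consenting party actually is the parent, which is the hard part LEGO invested in.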