ePrivacy for Children: What is Data Protection Culture?*

The General Data Protection Regulation (GDPR) attracted widespread attention and comment in recent weeks when it came into force on 25 May 2018. Having taken several years to get from being proposed by the European Commission to entering into force, the GDPR has been designed as a concerted, holistic and unifying effort to regulate personal data protection in the digital age.

At a time when many public, private and third sector organisations have only recently ‘gone digital’ and when data has very rapidly come to be seen as ‘a new currency,’ the scope of application of the GDPR is vast. Serious fines can be applied to firms that do not abide by the new rules. This is no coincidence, of course; the recent Cambridge Analytica and Facebook violations of privacy forced the public debate to grow, and with it awareness of what is at stake.

It is not only the scandals on the surface that have piqued the interest of the average user, though; the capital and energy spent on the data-gathering fetish of social media platforms is also a key determinant of the process. The right to erasure is also more easily applicable from now on, signifying more meaningful control over data and the erosion of the post-capitalist surveillance society. However, in the decade of tl;dr (too long; didn’t read) and post-truth, this type of detailed regulation might be a little too complicated to understand for internet users of all ages.

Through the lens of a researcher-mother, one is quickly struck by the image of a hyper-socialised millennial generation on massive platforms like Facebook and Instagram. The GDPR brings special conditions for children’s data. Yet living in Turkey with your child right beside you is no comfort; you are still spending 16+ hours of your day connected to inter-networks.

The GDPR makes some specific requirements in respect of children’s data, for reasons set out in recital 38: “Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counselling services offered directly to a child.”

While this statement has much merit, it is only an explanatory recital, guiding implementation of the GDPR but lacking the legal force of an article. In a recent London School of Economics Media Policy Project roundtable, it became clear that there is considerable scope for interpretation, if not confusion, regarding: the legal basis for processing (including, crucially, when processing should be based on consent); the definition of an information society service (ISS) and the meaning of the phrase “directly offered to a child” in Article 8 (which specifies a so-called “digital age of consent” for children); the rules on profiling children; how parental consent is to be verified (for children younger than the age of consent); and when and how risk-based impact assessments should be conducted (including how they should cover intended or actual child users). It is also unclear in practice just how children will be enabled to claim their rights or seek redress when their privacy is infringed.

Already there are some surprises. WhatsApp, currently used by 24% of UK 12-15 year olds, announced it will restrict its services to those aged 16+, regardless of the fact that in many European countries the digital age of consent is set at 13. Instagram is now asking its users whether they are under or over 18 years old, perhaps because this is the age of majority under the United Nations Convention on the Rights of the Child (UNCRC). We will see how things unfold in the coming months.

In the meantime, Sonia Livingstone of the London School of Economics makes a few suggestions in light of a new project exploring how children themselves understand how their personal data is used, and how their data literacy develops between the ages of 11 and 16: (1) conducting focus group research with children; (2) organising child deliberation panels to formulate child-inclusive policy and educational/awareness-raising recommendations; and (3) creating an online toolkit to support and promote children’s digital privacy skills and awareness. The young generation reminds us once again of the responsibility for creating a commons data culture at grassroots level.

Do such changes mean effective age verification will now be introduced (leading to social media collecting even more personal data?), or will the GDPR become an unintended encouragement for children to lie about their age to gain access to beneficial services, as part of their right to participate? How will this protect them better? And what does this increasingly complex landscape mean for media literacy education, given that schools are often expected to overcome regulatory failures by teaching children how to engage with the internet critically? As the case of Turkey shows, teachers’ digital literacy skills need a serious and rapid boost and, even more fundamentally, policies regarding internet governance and community education must be redrafted.

Translated from the Original Text by Asli Telli Aydemir, Alternative Informatics (Alternatif Bilisim)
You can read the original text in Turkish here.

This article was published on the EDRi website.