We have written to ECR Group MEPs and the UK Children’s Commissioner asking them to support the e-Privacy Regulation, after reading that the “ECR Group [is] to force vote on controversial new ePrivacy rules”, and in particular the claim that:
“Consumers regularly use free online services and apps and freely give their data. So long as their privacy and data is protected, which it is under both existing rules and the upcoming data protection laws, we should not remove the incentive for businesses to produce free content.”
This Regulation will influence whether the Internet becomes better or worse for young people, who are at the forefront of today’s information society. We hope MEPs will choose to make it better.
Download our full letter here.
- The Regulation asks businesses and others to make their practices safer and more transparent. It does not remove the incentive to produce content traded in return for users’ personal data, unless you believe such an incentive exists, and is acceptable, only if users don’t know they are being used.
- Children generally do not give their data freely for third-party uses such as reselling and re-purposing. They usually give it expecting it to be used for one thing: access to the site, product, or app. They do so without an understanding of the law and its limitations, or any way to enforce them. Apps required in schools, for example, or bait-and-switch first-free, then-pay-for apps, often mean data protection law offers little in practice; because of schools’ statutory footing on data, children are required to give up their rights. Where rights cannot be enforced by users, we need them built in by design at the back end of the technology, and in company practices.
- EU Kids Online’s research shows children are now going online at a younger and younger age; they use devices such as mobile phones and toys more than what adults consider computers; and young children’s “lack of technical, critical and social skills may pose [a greater] risk”.
- Children’s privacy and data are often not protected, but are exploited by design. Recent analysis of privacy disclosure and of information collection and sharing practices within children’s apps, carried out by the Federal Trade Commission in the US, found that of the 400 children’s apps surveyed, “nearly 60% (235) of the apps reviewed transmitted device ID to the developer or, more commonly, an advertising network, analytics company, or other third-party”, while only 20% of privacy policies disclosed this; and “22% (88) of the apps reviewed contained links to social networking services, while only 9% (36) disclosed such linkage prior to download”.
- These secret data extractions and transfers to third-party social media mean children lose their autonomy and decision-making over who knows what about them, and how that information shapes their online experience, such as the adverts they see, is hidden from them. IMCO (on which the UK MEPs sit) and the current Council text suggest deleting Article 17 of the proposal, reducing protections still further.
- The e-Privacy Regulation in fact considers, more than data protection law does, how technology works and how it can harm us in ways we do not see, like the technological machinery of metadata profiling and price discrimination. It protects the right to freedom of communication, and data held on a device: for example, the confidentiality of the content of communications stored or accessed on an individual’s device, which for children includes toys. The GDPR does not cover this. In fact, GDPR Article 8(2) is likely to mean greater personal data grabs by companies from our children, if age verification is implemented badly.
- Children need the additional security and protections of the e-Privacy Regulation to thrive in a digital future. In developed countries, 94% of young people aged 15–24 use the Internet. Their protections, participation, and privacy must be made priorities if the Internet is to be a safe long-term vehicle for collaboration, commerce, knowledge, learning, play, and the promotion of democracy. It is vital to promote a safe and transparent infrastructure for the Internet of the future, for the benefit of all.
Fundamentally, MEPs need to ask: what kind of environment do we want the Internet to be?
This Regulation will influence whether it is acceptable for companies with exploitative practices to lurk in the hidden corners of privacy policies. Will we accept that they prey on those who don’t understand how ‘free’ content exploits their digital identity, especially our teens and young people, though this applies equally to adults?
Do we say it is OK for companies to follow them around every website, profile their activity, and nudge their behaviour? For a company to choose, in secret, what content we as users do or do not get to see? Companies trade and re-trade our digital identities for marketing and advertising purposes, influencing what we buy and the lives we lead, without our even being aware of it. Is that the online life we want for our young people? Is it wise to perpetuate a model that relies on ignorance?
Or we can choose to promote an environment that is open and transparent, that we can better understand, in which companies are upfront about how they operate: a world that offers people protection by default, and active control over how they interact with companies.
Business will thrive in both. But will our children?
We need the e-Privacy Regulation to be strong and forward-looking, able to inspire good practice and encourage change towards design by default, keeping up with the level of technological risk that reaches far into our everyday lives, both on and offline.