Could ‘hide by default’ be a solution to online privacy concerns for children?

September 17, 2018

The Information Commissioner’s Office’s consultation on an age-appropriate design code for information society services (a requirement of the Data Protection Act 2018, which supports and supplements the implementation of the GDPR) is open for submissions until September 19. The code will provide guidance on the design standards that the Information Commissioner will expect providers of online services to meet, where those services process personal data and are likely to be accessed by children. Wendy M. Grossman looks here at some of the key questions to consider.

Last week, defenddigitalme, a group that campaigns for children’s data privacy and other digital rights, and Sonia Livingstone’s group at the London School of Economics assembled a discussion of the Information Commissioner’s Office’s consultation on age-appropriate design for information society services, which is open for submissions until September 19. The eventual code will be used by the Information Commissioner when she considers regulatory action, may be used as evidence in court, and is intended to guide website design. It must take into account both the child-related provisions of the General Data Protection Regulation and the United Nations Convention on the Rights of the Child.

There are some baseline principles: data minimization, and comprehensible terms and conditions and privacy policies. The last is a design question: since most adults either can’t understand or can’t bear to read terms and conditions and privacy policies, what hope is there of making them comprehensible to children? The summer’s crop of GDPR notices is not a good sign.

There are other practical questions: when is a child not a child any more? Do age bands make sense when the capabilities of one eight-year-old may be very different from those of another? Capacity might be a better approach – but would we want Instagram making these assessments? Also, while we talk most about the data aggregated by commercial companies, government and schools collect much more, including biometrics.

Most important, what is the threat model? What you implement, and how, is very different if you’re trying to protect children’s spaces from ingress by abusers than if you’re trying to protect children from commercial data aggregation or from content deemed harmful. Lacking a threat model, “freedom”, “privacy”, and “security” are abstract concepts with no practical meaning.

There is no formal threat model; as the Yes, Minister episode “The Challenge” (series 3, episode 2) would predict, setting one would come too close to defining “failure standards”. The lack is particularly dangerous here, because “protecting children” means such different things to different people.

The other significant gap is research. We’ve commented here before on the stratification of social media demographics: you can practically carbon-date someone by the medium they prefer. This poses a particular problem for…

(Excerpt) To read the full article, click here.