Nine tech companies are being investigated by the information watchdog for putting children at risk online, following complaints against companies including Instagram, Apple and Omegle for failing in their duty of care.
The companies are under investigation by the Information Commissioner for breaching the Children’s Code, which came into full force last September and is designed to protect children from inappropriate and harmful content unsuitable for their age.
The complaints that sparked the investigation into the nine companies were made by the charity 5Rights, whose founder, Baroness Kidron, is considered the architect of the Children’s Code.
Instagram, Apple and Omegle were named by the charity for the most serious violations, including targeting children with self-harm content, allowing underage children to engage with sexual material and giving them free access to adult dating apps.
“High risk of potential harm to children”
The Information Commissioner’s Office (ICO) has written to the nine companies as part of the investigation into compliance with the code, which is enforced under data protection law with fines of up to 4% of global business turnover.
In a letter to Baroness Kidron, the ICO said the initial investigations focused on online companies potentially guilty of “poor compliance with privacy requirements” and which posed a “high risk of potential harm to children”.
It said the penalties ranged from “a commitment to improve compliance” to “enforcement notices and/or penalty notices for instances of significant non-compliance with the law.”
Apple and Google have been contacted by the ICO to find out how they determine app age ratings and whether processing of personal data is a factor.
The ICO has identified 40 other companies – in addition to the nine – which it has also contacted as part of its investigation into compliance with the new code.
“I await formal action accordingly”
A decision on penalties is expected within months and, if action is taken, would mark a world first for enforcement against design flaws that put children at risk online. “I await formal action accordingly,” Baroness Kidron said.
Avatar research by her charity 5Rights has shown that Instagram’s algorithms amplify harmful content, including material promoting self-harm, suicide and eating disorders, as well as highly sexualized images.
Alongside this content, child users were also shown age-appropriate advertisements for training courses, indicating that the company knew these users were under 18.
The researchers also downloaded 16 dating apps with a minimum age of 18 from the Apple App Store using an iCloud account registered to a 15-year-old.
This was done simply by pressing “Ok” to confirm being over the required age, despite the account being registered to a 15-year-old.
Inadequate age checks
Omegle, the online chat site, which pairs users with strangers for text or video chats, was flagged for inadequate age verification.
The platform has a minimum user age of 13 with parental consent, or 18 without, but does not use any form of age verification. Users simply declare that they are over 13 and tick a box saying they have parental authorization if they are under 18.
One of the teenagers interviewed as part of 5Rights’ Pathways research said that as a child he spent a lot of time on Omegle and encountered sexual content on the service, often while engaging with adults.
The ICO said its investigation is “ongoing” and plans to take “next steps” in the spring.
Apple, Instagram and Omegle were contacted for comment on Friday but had not responded at the time of publication.
The measures coincide with California’s unveiling of a bill for an equivalent code last week, following similar moves in other US states, Australia and New Zealand.