EU investigates Meta for potential breaches of Digital Services Act

The European Union is investigating Facebook and Instagram over whether they are so addictive that they are having “negative effects” on the “physical and mental health” of children.

Investigation Overview

The European Union has launched an investigation into major tech firms, including Meta, for potential breaches of the Digital Services Act (DSA). 

This scrutiny focuses on whether these companies have adequately verified the ages of their users and how they recommend content to children. Companies found in violation could face fines of up to 6% of their annual global turnover.

Meta’s Response

Meta, which owns Facebook and Instagram, has stated it has “spent a decade developing more than 50 tools and policies” to safeguard children on its platforms. 

“This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission,” Meta said in a statement. In September, Meta provided regulators with a report on the risks associated with its platforms as required by the DSA.


EU’s Concerns

The EU Commission has expressed concerns that Facebook and Instagram’s systems, including their algorithms, may contribute to behavioral addictions in children and create “rabbit-hole effects.” 

Rabbit-hole effects occur when algorithms suggest increasingly harmful content based on a user’s viewing history. 

The EU is also examining Meta’s methods for age assurance and verification, questioning the effectiveness of these measures in protecting young users.

Age Verification and Algorithm Concerns

Meta’s age verification processes, known as age assurance, are under particular scrutiny. The Commission is concerned about how effectively Meta ensures that users meet its minimum age requirement of 13.

This concern is echoed by the UK communications watchdog, Ofcom, which has highlighted that many younger children are using social media platforms, often with their parents’ knowledge.

Impact on Social Networks

Like most social networks, Meta requires users to be 13 or older. However, Ofcom’s recent report found that many younger children nonetheless hold accounts on these platforms.

The EU investigation and potential enforcement of the DSA highlight the ongoing challenge of protecting young users in the digital space and ensuring companies comply with stringent safety regulations.

The EU’s investigation marks a significant step in holding tech companies accountable for their impact on children and their adherence to safety protocols. The outcomes could lead to stricter enforcement and possibly substantial fines for companies that fail to meet the DSA’s requirements.

Gary Monroe

Gary Monroe is a seasoned contributor to the Los Angeles Business Magazine, where he offers insightful analysis on local business trends and economic developments. With a focus on Los Angeles' dynamic commercial landscape, Gary's articles provide valuable perspectives for entrepreneurs and business professionals in the city.
