
New Mexico lawsuit accuses Meta of creating breeding ground for child predators


Meta Under Fire for Endangering Children on Facebook and Instagram

A recent lawsuit accuses Meta of putting children at risk on Facebook and Instagram by exposing them to explicit content and making it easy for adult predators to contact them. The suit, filed in New Mexico, alleges that Meta violated the state's consumer protection laws and names the company's CEO, Mark Zuckerberg, as a defendant.

The lawsuit argues that the gap between Meta's public proclamations about child protection and its actual practices, which allegedly permit harmful content and behavior that endanger children, violates New Mexico law. The filing follows growing criticism of Meta's impact on young users: the social media giant has faced legal action from multiple school districts and state attorneys general over concerns about mental health, child safety, and privacy.

Last month, Arturo Bejar, a former Facebook employee turned whistleblower, testified before a Senate subcommittee that Meta's leadership, including Zuckerberg, had for years ignored warnings about the platforms' detrimental effects on young users.

Meta has also filed a lawsuit against the Federal Trade Commission (FTC), seeking to block the agency from reopening a 2020 data privacy settlement worth $5 billion and from imposing a ban on monetizing user data.

Meta formally denies claims that its platforms endanger children. The company says it uses advanced technology, consults child safety experts, reports content to the National Center for Missing and Exploited Children, and works with other companies and law enforcement agencies, including attorneys general, to combat child exploitation. Meta says it has also removed numerous accounts, groups, and devices that violated its child safety policies.

Last month, Meta announced technology that proactively detects and blocks accounts exhibiting suspicious behavior, along with a dedicated child safety team to improve its youth safety practices. The company offers roughly 30 tools to support teenagers and families, including options to set screen time limits and hide like counts on posts.

Using a random sample of Instagram accounts registered as belonging to users under 12, the New Mexico Attorney General's Office found that the accounts could reach explicit material through searches for sexual or self-harm content, including "soft pornography." The investigation also uncovered dozens of accounts sharing child pornography, including photos of young girls in undergarments and images suggesting that children were engaged in sexual activities.

Instagram's algorithms reportedly steered investigators to additional child pornography accounts and suggestive images, without any warning, when they searched for terms like "Lolita girls." The lawsuit seeks a $5,000 fine for each alleged violation of New Mexico's deceptive practices statute and an order prohibiting Meta from engaging in unfair, unreasonable, or deceptive practices.

Additional Insights:

Meta can take several steps to improve children's safety on Facebook and Instagram and address the allegations that it exposes minors to explicit content and makes it easy for adults to contact them:

  1. Enhance age verification: build a robust, system-wide age verification system that cannot be easily bypassed, preventing children under 13 from creating accounts and reducing their exposure to harmful content.
  2. Modify algorithms: tune recommendation algorithms so they do not suggest exploitative content to minors or proactively serve explicit images through recommended users and posts.
  3. Develop parental controls: build and promote user-friendly parental control features that let parents oversee their children's activity, including setting screen time limits, monitoring messages, and filtering content.
  4. Establish regular third-party audits: retain independent auditors to verify compliance with safety standards and identify areas for improvement.
  5. Improve data accuracy and transparency: ensure that data shown to parents, such as "time spent" figures, is accurate and reliable.
  6. Act on red flags: prioritize child safety over engagement metrics by taking meaningful action to mitigate known platform risks.

By addressing these concerns, Meta can significantly bolster its protection of children from online threats and reduce the negative impacts of its platforms on young users.

