Social Media Crackdown: EU Countries Strive to Shield Kids from Popular Networks Like TikTok and Instagram
Several EU countries are pushing for strict limits on minors' access to online platforms.
European countries are taking a firmer stance toward online platforms such as TikTok, Instagram, and YouTube when it comes to underage users. France, Greece, Denmark, and several other countries pushed for stricter restrictions at the meeting of EU digital ministers in Luxembourg on Friday. They called on the EU Commission to establish binding rules for all EU member states.
As things stand, these platforms do have age restrictions. TikTok, Instagram, Snapchat, and X (formerly Twitter) require users in the EU to be at least 13, while YouTube and Tumblr require a minimum age of 16. However, there is no real age verification: users simply enter a birthdate when registering. "Anyone who has ever been young knows how easy it is to change a birthdate," said French Digital Minister Clara Chappaz in Luxembourg. She noted that children as young as seven or eight often set up accounts, and said this must change quickly.
To address this, France wants platforms to obtain parental consent for users under 15. France passed such a law in 2023 but acknowledges that approval from Brussels is still pending. Spain, Slovenia, and Cyprus have echoed France's concerns.
The risks minors face online are varied: hate speech, bullying, dangerous diet tips, pornography, and addiction. According to the EU countries' position paper, excessive screen time can also worsen anxiety disorders and depression and undermine critical thinking. A major hurdle for reliable age verification is data protection. The EU Commission plans to develop an app that protects personal data while telling platforms only whether a user is old enough.
In the longer term, citizens could download a digital ID onto their smartphones that stores their age and automatically blocks apps unsuitable for minors. The EU Commission is already investigating several platforms, including TikTok, Meta (Facebook's parent company), and the porn providers Youporn, Stripchat, XVideos, and XNXX, for possible youth-protection violations. If the allegations are confirmed, the companies face substantial fines.
Background Insights:
- Several EU countries advocate stricter age verification and consent requirements for social media platforms such as TikTok, Instagram, and YouTube in order to protect children.
- Measures under discussion include parental consent for users under 15, a uniform digital age of adulthood, an EU-wide ban for children under 15 in certain circumstances, and alignment with age-restriction approaches adopted by countries such as Australia, New Zealand, and Norway.
- Doubts about the effectiveness of age verification persist but are expected to be addressed through technological advances.
- The EU's Digital Services Act (DSA) already addresses online child protection; stricter rules to prevent minors from creating accounts without proper age verification are currently under consideration.
- As part of its efforts to shield children from online risks, the European Commission aims to develop an app that safeguards personal data while ensuring age-appropriate access to social media platforms such as TikTok, Instagram, and YouTube.
- Beyond France, Spain, Slovenia, and Cyprus have also voiced concerns about the safety of minors on social media and advocate stricter policy measures, such as parental consent for users under 15.
- As part of the ongoing crackdown, the EU Commission is investigating popular platforms, including TikTok, Meta (Facebook's parent company), and the porn providers Youporn, Stripchat, XVideos, and XNXX, for possible youth-protection violations.