Starting in April, app stores in Singapore will be required to enforce new measures preventing users under 18 from downloading age-inappropriate apps. The initiative is part of the government's broader effort to shield young people from harmful content.
A recent study found that six social media platforms, including TikTok and X, failed to adequately detect child sexual abuse material and terrorism-related content, raising concerns about user safety. The findings underscore the urgent need for effective content-moderation mechanisms to protect vulnerable users, especially children and teenagers.
An independent developer on Roblox has warned that the platform's current safety measures are insufficient to protect children, urging parents to supervise their kids closely. The warning comes as the platform's user base grows significantly, with children under 13 making up about 40% of all players.
During a House of Commons session on Tuesday, British MPs criticized social media platforms for failing to combat misinformation and harmful content, voicing concern about the platforms' impact on society.
Germany is moving to impose strict sanctions on social media platforms in an effort to shield children from harmful and extremist content, a step taken amid growing concern about the negative effects of digital communication on young people.