The Australian government has proposed a law banning social media usage for children under 16.
Platforms like TikTok, Facebook, and Instagram will need to prevent underage users from creating accounts. They will have a year to adapt their systems after the law is enacted.
Similar measures in other nations
Other nations are pursuing similar steps. In the United Kingdom, online safety legislation requires platforms to strictly enforce age restrictions, including mandatory age verification through measures such as identity checks or parental consent.
France requires parental consent for users under 15, while President Emmanuel Macron advocates for a unified “digital majority” across the European Union. Norway has also proposed a minimum age of 15 for social media use, with an emphasis on privacy and data protection.
EU’s GDPR and DSA
Under the GDPR, parental consent is required to process the data of children under 16, while the Digital Services Act (DSA) further obligates very large online platforms (those with more than 45 million users in the EU) to assess the risks to younger users and implement safeguards.
In addition, the DSA imposes stricter penalties on platforms that fail to take the necessary steps to reduce exposure to harmful content, such as violence, hate speech, and cyberbullying. The aim is to ensure a safer digital space through cooperation between member states, tech companies, and NGOs.
These global efforts signal a growing commitment to securing the digital environment for children and holding social platforms accountable.