Australia has enacted a groundbreaking law prohibiting children under the age of 16 from creating accounts on certain social media platforms, marking a significant step in online safety regulation. The legislation, an amendment to the Online Safety Act 2021, passed both houses of parliament after months of debate but has drawn criticism for its rushed process and lack of clarity on implementation. While the law is supported by 77% of Australians, its enforcement won’t begin until after a 12-month preparatory period. During this time, the government plans to engage in extensive consultations with stakeholders to iron out implementation details and determine which platforms will be affected.
The new law imposes stringent requirements on tech companies, requiring them to take "reasonable steps" to prevent underage users from accessing their services. Platforms that fail to comply risk fines of up to AUD 50 million (USD 32 million). While the legislation does not name specific platforms, it broadly defines "age-restricted social media platforms" as those facilitating online social interactions, linking users, or allowing the posting of material. Exceptions include services used primarily for business, education, or health purposes. Despite the broad language, platforms like YouTube are reportedly exempt, though the government has not explicitly confirmed this. Messaging apps and gaming platforms may also be excluded, but specifics remain unclear.
Tech companies will face the logistical challenge of verifying the ages of all account holders, including existing users. The law prohibits platforms from relying solely on government-issued identification for age verification, though such documents may be used alongside other methods. Among the proposed strategies are credit card verification and facial recognition technology. Trials are underway to assess these approaches, with results expected in mid-2025. However, concerns have been raised over the potential biases and inaccuracies of facial recognition systems, which are known to perform unevenly across different demographic groups.
Prime Minister Anthony Albanese has championed the legislation as a vital step in protecting children online, emphasizing the dangers social media can pose, including bullying, scams, and exploitation. He described the law as a demonstration of the government’s commitment to parents and children. Opposition leaders largely supported the bill, viewing it as a necessary measure to hold tech companies accountable. However, critics argue the legislation risks isolating young people, and that those who circumvent the ban may face increased exposure to unrestricted content. Others fear the rushed process has produced an insufficiently developed framework, leaving gaps in how the law will be enforced.
The legislation also introduces a "digital duty of care," requiring platforms to assess and mitigate risks associated with their content and respond promptly to user complaints. Although the duty of care is widely supported, its implementation timeline remains uncertain. Experts and advocacy groups stress the importance of complementing these regulations with digital literacy initiatives for parents, teachers, and children to foster safer online behavior.
The parliamentary process leading to the law’s passage has faced scrutiny. Submissions to a Senate committee inquiry were open for just 24 hours, followed by a brief three-hour hearing, and the government passed the bill through both houses in less than a week. Critics, including the Australian Human Rights Commission, argue that such a rushed approach left insufficient time to address key details. Nevertheless, public opinion has overwhelmingly backed the law, with many viewing it as a decisive step in curbing the unchecked power of big tech companies.
The government now has the task of refining the law’s provisions through consultations and setting a timeline for its enforcement. Platforms will be required to deactivate accounts for users under 16 once the ban takes effect, although parents and children will not face penalties for non-compliance. Companies must demonstrate they have taken adequate steps to restrict underage access.
Australia’s approach is seen as a world first in imposing stringent social media age restrictions backed by significant penalties for violations. Advocates, including the Heads Up Alliance, argue that the law is a step in the right direction but emphasize the need for broader application to platforms like Discord and YouTube Shorts. The coming months will be crucial in shaping how effectively the law protects young users while ensuring equitable enforcement.
Background:
Australia’s efforts to regulate social media stem from growing concerns over the impact of online platforms on young people. Social media has been criticized for enabling cyberbullying, peer pressure, and exposure to harmful content, as well as providing a channel for online predators. Previous regulatory attempts in other countries have fallen short of holding platforms accountable for breaches of age restrictions. Australia’s law aims to address this gap by imposing strict fines on non-compliant companies. However, its rushed enactment has raised questions about the adequacy of its framework and the potential consequences of implementation gaps.