Meta has begun removing Australian children younger than 16 from its Instagram, Facebook, and Threads platforms, ahead of a government-mandated social media ban that takes effect on December 10. Last month, the tech giant notified users aged 13 to 15 that their accounts would be deactivated starting December 4.

Estimates suggest that approximately 150,000 users on Facebook and 350,000 on Instagram will be affected, while access to Threads, which requires an Instagram account, is also being restricted. Under the new regulations, social media companies in Australia face fines of up to A$49.5 million (US$33 million) for failing to enforce age restrictions.

A Meta spokesperson affirmed the company's commitment to complying with the law while advocating for a more standardized approach to age verification across platforms. Users identified as under 16 can download their content before their accounts are deactivated, and those who believe their age has been misclassified can appeal by submitting a 'video selfie' or providing government-issued ID.

In addition to Meta's platforms, the broader ban covers major social media sites such as YouTube, TikTok, Snapchat, and X (formerly Twitter). Though the measure is intended to protect children, critics argue it could isolate vulnerable youth from necessary social interactions.

Australian Communications Minister Anika Wells acknowledged that she anticipates initial challenges with the ban but said its objective is to safeguard future generations from the detrimental effects of social media algorithms. Concerns persist that the restrictions could drive younger users toward less regulated online spaces.

The government has found that a vast majority of children aged 10-15 are active on social media, with many reporting exposure to harmful content and cyberbullying. In light of these statistics, Australia's pioneering move in social media regulation is being watched globally and could set a precedent for policy on child safety online.