**Over 80% of Australian Kids Under 13 Use Social Media, Report Reveals**

**A new report from Australia's online safety regulator, eSafety, highlights widespread use of social media by children despite platform age restrictions, prompting fresh concern about online safety and regulation.**
A report from Australia's internet regulator eSafety found that more than 80% of children aged 12 and under used social media or messaging apps last year, even though those platforms are typically restricted to users aged 13 and older. The report identified YouTube, TikTok, and Snapchat as the services most frequently accessed by this age group.
The findings come as Australia prepares to enforce a ban on social media use by people under 16, expected to take effect by the end of this year. The report covers Discord, Google (YouTube), Meta (Facebook and Instagram), Reddit, Snap, TikTok, and Twitch; none of the companies immediately commented on the findings.
While users must generally be 13 or older to create an account, certain exceptions exist. Google's Family Link, for instance, allows younger children to use YouTube under guardian supervision, and the dedicated YouTube Kids app is designed specifically for children. YouTube Kids users were not included in the study.
eSafety Commissioner Julie Inman Grant said the report will help guide future efforts to safeguard children online, emphasizing the shared responsibility of social media companies, app developers, parents, educators, and lawmakers in improving online safety for minors.
In a survey of more than 1,500 Australian children aged eight to 12, 84% reported using at least one social media or messaging service since the start of last year. More than half accessed those services through a parent's or caregiver's account. About a third of the children surveyed had accounts of their own, and 80% of those said a parent or caregiver helped set the account up. Only 13% of children with their own accounts had them shut down by the platform for being underage.
The report pointed to industry-wide inconsistencies in how platforms verify users' ages at account creation. Its authors identified a key weakness: the sign-up process lacks robust preventive checks, allowing underage users to simply misstate their age.
Snapchat, TikTok, Twitch, and YouTube said they use tools that flag potentially underage users after sign-up based on engagement signals. Relying on user activity to detect underage accounts carries its own risk, however, as children may be exposed to online harms before any corrective action is taken.