
Australia slams tech giants over child safety

Australia’s eSafety Commissioner has criticised major global tech companies, accusing them of failing to adequately combat child sexual abuse material (CSAM) on their platforms.

In a report released Wednesday, the watchdog singled out YouTube and Apple for poor transparency and inaction, stating the companies were “turning a blind eye” to serious online crimes.

The report found that YouTube—owned by Google parent Alphabet—and Apple failed to track how many reports of CSAM they received from users and could not provide details on how long it took them to respond.

The platforms were also found to have significant gaps in safety protocols, including inadequate detection of livestreamed abuse, failure to block links to known CSAM, and incomplete deployment of image “hash-matching” technology used to identify and remove exploitative content.

“When left to their own devices, these companies aren’t prioritising the protection of children and are seemingly turning a blind eye to crimes occurring on their services,” said eSafety Commissioner Julie Inman Grant. “No other consumer-facing industry would be given the licence to operate while enabling such heinous crimes against children.”

The Australian government has already taken action based on the findings. Last week, it added YouTube to its landmark social media ban for teenagers, overturning a planned exemption for the video platform following advice from the eSafety office.

A Google spokesperson pushed back on the criticisms, saying that “eSafety’s comments are rooted in reporting metrics, not online safety performance.” They added that YouTube proactively removes more than 99% of abuse content before it is flagged or viewed by users, using a combination of artificial intelligence and hash-matching technology.

The report assessed the responses of several platforms, including Apple, Discord, Google, Meta, Microsoft, Skype, Snap, and WhatsApp. It concluded that all had safety shortcomings that elevated the risk of child exploitation on their services.

Meta—owner of Facebook, Instagram, and Threads—stated separately that it prohibits all graphic content involving child abuse and is committed to user safety.

However, the report noted that some platforms had made little to no progress in addressing known safety gaps, despite prior warnings from the eSafety office. Inman Grant said both Apple and Google failed to disclose how many trust and safety personnel they employ, or how many user reports related to CSAM they receive.

The eSafety Commissioner, an independent regulatory body, is the first of its kind globally and has sweeping powers to demand accountability from digital platforms operating in Australia. The office continues to press tech giants to improve transparency and invest in stronger protections for children online.