Australian watchdog slams tech giants over child abuse failures

Australia’s eSafety Commissioner has strongly criticized global tech companies for failing to adequately protect children from online sexual exploitation and abuse.

In a report released Wednesday, eSafety Commissioner Julie Inman Grant said major platforms such as Apple, Google, Meta, and Microsoft are leaving “significant gaps” in their detection and removal of child sexual abuse content. The findings follow transparency notices issued in July 2024 to eight tech companies, requiring biannual reporting on their handling of child sexual exploitation and abuse (CSEA) material.

The report flagged Apple and YouTube for not disclosing how many user reports of CSEA content they received or how long they took to respond. It also found that none of the eight companies—including WhatsApp, Discord, Snap, and Skype—fully detect livestreamed child abuse across all services.

Furthermore, Apple, Google, and WhatsApp reportedly fail to block known CSEA URLs, while Apple, Google, Microsoft, and Discord are not fully using hash-matching technology to identify known abuse content.

“It shows that when left to their own devices, these companies aren’t prioritising the protection of children,” Inman Grant said. “No other consumer-facing industry would be allowed to enable such heinous crimes.”

Despite its broadly critical findings, the report acknowledged limited progress. For instance, Discord and Snap have introduced language analysis tools to detect grooming.

The 2024 findings echo earlier reports from 2022 and 2023, showing that voluntary action by tech giants has been insufficient in preventing abuse.

The eSafety Commissioner’s office called for stronger accountability, clearer transparency, and urgent improvements in how tech firms handle CSEA threats across their platforms.
