Australia's internet watchdog announced on Tuesday that it has issued legal notices to several social media companies. The tech giants, including Google, Meta, X, WhatsApp, Telegram, and Reddit, are required to explain the measures they are taking to protect Australians from terrorist and violent extremist material and activity.
Australia's eSafety Commissioner said in its press release that the spread of extremist material and its role in online radicalisation remain a concern both in the country and internationally. The Australian regulator also noted that incidents like the 2019 terrorist attacks in Christchurch, New Zealand, and the 2022 attack at a grocery store in New York show how social media and other online services can be exploited by violent extremists.
eSafety Commissioner Julie Inman Grant issued the notices to the tech companies under transparency powers granted by the Online Safety Act, making it mandatory for them to answer a series of detailed questions about how they are tackling the issue. Notably, the social media companies have 49 days to provide their responses to the eSafety Commissioner.
Julie Inman Grant said that eSafety still receives reports that disturbing content, including video from the 2019 mosque shootings in New Zealand, continues to circulate on major social media platforms. “We remain concerned about how extremists weaponise technology like live-streaming, algorithms and recommender systems and other features to promote or share this hugely harmful material,” added the eSafety Commissioner.
The Australian regulator also cited a recent OECD report, according to which Telegram hosted the most terrorist or violent extremist content, with Google's YouTube and Elon Musk's X in second and third positions, respectively. Facebook, Instagram, and WhatsApp are also included in the list. "We want to know why this is and what they are doing to tackle the issue," added Julie Inman Grant in the statement.