An Australian regulator has issued legal notices to social media companies, including Facebook, YouTube, X, Telegram, and Reddit, demanding information on the measures they have taken to remove terrorism-related content.
The eSafety Commissioner expressed concern in a report seen by News.ng on Monday.
News.ng gathered that the platforms were not doing enough to stop extremists from exploiting live-streaming features, algorithms, and recommendation systems to recruit users.
Since 2022, the regulator has had the authority to compel large digital companies to disclose how much unlawful content they host and what steps they take to stop it from spreading. Failure to comply can attract hefty fines.
“We don’t know if they actually have the people and resources in place to even be able to respond to these notices, but we will see,” Ms Inman Grant told reporters in an interview.
“We’re not going to be afraid to take it as far as we need to, to get the answers we need or to find them out of compliance and fine them,” she added.
According to Commissioner Inman Grant, violent extremist groups primarily utilise Telegram as a recruiting and radicalisation tool.
The Dubai-based messaging service, which ranked first worldwide for the prevalence of terrorist propaganda in a 2022 Organisation for Economic Co-operation and Development assessment, did not immediately respond to a request for comment.
The media and the government had earlier criticised Facebook’s parent company, Meta, for announcing that it would no longer enter into agreements to pay Australian news publishers.
A parliamentary committee investigating foreign influence in Australia warned that the Chinese-owned apps WeChat and TikTok could pose serious security threats.
Australia declared last year that TikTok would not be allowed on official devices because of security concerns.