Tech giants, telcos, and Digital Rights Watch want clarity on monitoring requirements for abhorrent violent content online
A coalition of social media platforms, industry groups, and Digital Rights Watch has come together to recommend a range of amendments to Australia's online abhorrent violent material legislation.
The recommendations were submitted to the Parliamentary Joint Committee on Law Enforcement, which is currently conducting an inquiry into the effectiveness of the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (AVM Act).
The AVM Act requires content service providers to remove abhorrent violent material and notify police if their services are used to share such material, or risk fines of up to 10% of their annual global turnover. It also gives the eSafety Commissioner the power to issue notices to content service providers ordering them to remove specific abhorrent violent material.
The laws quickly passed through Parliament in 2019 in response to social media platforms being slow to remove videos of the Christchurch terrorist attack.
Put forward by tech industry advocacy group DIGI, the Communications Alliance, Digital Rights Watch, as well as IBM, Google, TikTok, and Twitter, the submission [PDF] proposes various amendments to the AVM Act, ranging from new review processes and clearer definitions to lower penalties.
Chief among the recommended amendments is greater clarity around when the law's monitoring and removal obligations are triggered. The coalition said many organisations are confused because guidance from the Attorney-General's Department has indicated there is no obligation to proactively monitor for abhorrent violent material, only a requirement to remove such material once it is found.
Despite this guidance, the coalition said the laws, as currently written, could be interpreted to impose monitoring obligations, because a content service provider is automatically categorised as "reckless" when it receives a notice from the eSafety Commissioner about certain material being available on its platform.
“This is because the Act presumes providers to be reckless at the time that a notice is issued by the eSafety Commissioner in combination with the associated definition of recklessness,” the coalition wrote in its submission.
Due to this, the coalition has recommended that the law's wording be tweaked to clarify that there is no requirement for proactive monitoring, so that the obligations are consistent with the Attorney-General's Department's guidance.
It also wants the law amended so that a provider is only deemed reckless if it has failed to expeditiously remove abhorrent violent material flagged by the eSafety Commissioner from its services.
The coalition has also asked for clarification regarding content that falls into a grey area, where it is unclear whether the material is captured by the AVM Act.
“One interpretation is that the ability to host important historical footage, such as the events surrounding the Holocaust, would be unlawful. We are concerned that uncertainty around these exemptions will lead to the take-down of material not intended to be removed under the Act,” the coalition wrote.
The coalition also said the AVM Act's scope is too broad and should not capture business-to-business infrastructure and cloud providers. In its submission, it said these platforms are private and closed by default, making them a much lower risk for the widespread dissemination of abhorrent violent material.
“For example, large internal IT systems for government departments, airports, and banks are highly unlikely to ever contain abhorrent violent material,” the coalition added.
As such, the coalition has called for these providers to be exempt from the laws, while also recommending a new provision that would enable the eSafety Commissioner to send a notice to these types of organisations if required.
In addition, the coalition has recommended the introduction of two new formal review processes: one to review any potential errors made by the eSafety Commissioner in issuing notices, and another that allows platforms to ask the Commissioner for a determination on whether something is categorised as abhorrent violent material.
For the first of these, the coalition envisions that notices would continue to be issued as they are now, but platforms would gain the option of a later review where necessary.
Along with the formal review processes, the coalition has recommended that providers be given the option to explain to the public why certain content has been removed.
The coalition, which primarily comprises organisations that fall within the AVM Act's scope, has also, unsurprisingly, said that the current penalties are disproportionate.
“The penalties that apply are out of alignment with other penalties that apply in our legal system. We believe that the most significant penalties be reserved for bad faith actors and/or those who repeatedly and/or flagrantly breach the Act,” the coalition wrote.
To address this, the coalition has recommended a three-strikes approach under which the current penalties would only apply once an organisation has failed to comply with the laws three times. For first-time offenders, the coalition said the laws should carry a maximum fine of AU$2.6 million, which it believes would be a sufficient deterrent.
The final recommendation made by the coalition focused on the requirement to notify law enforcement authorities about the possibility of a crime occurring. In the submission, the coalition said the current threshold is too hard to apply in practice, and called for a lower, more objective threshold under which platforms are required to notify law enforcement only when they are aware a crime has occurred.
Social media companies are increasingly facing heat in Australia over the content that resides on their platforms, with Prime Minister Scott Morrison last month criticising tech giants for the conduct they host, stating that platforms like Facebook have become a "coward's palace" for trolls.
Three weeks later, the federal government released an exposure draft of what it has labelled an Online Privacy Bill, which seeks to put social media platforms under more regulatory scrutiny.
At the same time, the Australian Competition and Consumer Commission has been investigating the conduct of digital platforms for years.