The Australian Attorney-General's Department is in the process of updating the national defamation law to account for the role digital platforms play in the distribution and display of online content.
While appearing before the Parliamentary Joint Committee on Intelligence and Security as part of its inquiry into extremist movements and radicalism in Australia, representatives from Facebook, Google, and Twitter were asked to comment on the work underway by the Attorney-General's Department.

Specifically, they were asked about the implications of digital platforms being treated as publishers under defamation law, and therefore being held liable for defamatory content published in Australia.
"One of the policy principles that I think is really important when it comes to defamation is to ensure that responsibility lies primarily with the speaker -- with the person who controls what they are saying," said Facebook Australia policy director Josh Machin.

"Because if there is a disconnect there, you potentially create the wrong incentives if you hold another party accountable for what someone else is saying."
Machin was asked to consider the impact of defamatory comments left on Facebook in a scenario where the platform was legally liable for such statements.

"I can tell you what we do right now," he offered instead.
"First, we review it against our community standards. If it violates our community standards, we will remove that content. So if someone is making statements that amount to abusive language, that would fall under our bullying policy, and if that policy is violated, the content can be removed immediately, because it already breaches the rules we have set up for discussion on Facebook."
If content does not violate Facebook's community standards, a legal review process takes place, which includes consideration of defamation law.
"We consider whether it is potentially defamatory, what the defences might be, and what potential liability could sit with us," he continued. "And certainly, if we receive a clear court order that something is defamatory, we will take steps to geo-block that content."
Samantha Yorke, head of government affairs and public policy at Google Australia, argued that the question from committee chair Senator James Paterson was not as hypothetical as it might seem.
"There are many legislative frameworks in Australia today that impose both criminal and civil liability on digital platforms in relation to potentially harmful content," she said. "In fact, Google has been found by several courts across various Australian jurisdictions to be a publisher for defamation law purposes merely by linking to websites containing defamatory material."
She said Google wants "legal certainty", and for the law to be clearer about the roles and responsibilities of digital platforms.
"It's actually quite difficult to talk about what we would do if the outcome is that digital platforms become as liable for the content they host as the person who actually created that content in the first place. You could see a scenario where we would have to behave like a traditional media company, scrutinising all content in advance and making editorial decisions about what gets published."
Yorke said this would clearly be a "significant departure" from Google's current practice.

"Given the sheer volume of content that is uploaded to our services every day, that would frankly be difficult, and it would of course require us to rethink how our business operates here," she said.
Kathleen Reen, Twitter's senior director of public policy and philanthropy for the APAC region, said her company sees more than a billion tweets every two days.

She considered the policies and procedures Twitter imposes on its users to be sufficient, particularly given that they are constantly evolving.
"Twitter is very well known for its commitment to freedom of expression. We wouldn't be doing our job well if I didn't also raise our concerns about protected speech and freedom of expression," Reen said.
"And we are concerned about how easy it could become to curb and control speech, including uncomfortable or unpopular speech and debate. There are many different views on what should and should not be brought into that conversation."
The Attorney-General's consultation on reform of the defamation law is open until May 18.
Defamation reforms could see a "significant departure" from how tech giants operate