How a Stabbing in Israel Echoes Through the Fight Over Online Speech

Washington — Stuart Force says he found solace on Facebook after his son was stabbed to death in Israel in 2016 by a member of the militant group Hamas. He turned to the site to read the hundreds of condolence messages on his son’s page.

But only a few months later, Mr. Force decided that Facebook was partly responsible for his son’s death, because the algorithms that power the social network had helped spread Hamas’s content. He joined relatives of other terror victims in suing the company, arguing that its algorithms had aided the crimes by regularly amplifying posts that encouraged terrorist attacks.

The suit failed when the Supreme Court declined to take it up last year. But arguments about the power of algorithms have continued to echo in Washington, where some members of Congress are citing the case in a fierce debate over the law that shields tech companies from liability for content posted by their users.

At a hearing on Thursday with the chief executives of Facebook, Twitter, and Google about the spread of misinformation, some lawmakers are expected to focus on how the companies’ algorithms are written to generate revenue by surfacing posts that users are inclined to click on and respond to. And some will argue that Section 230 of the Communications Decency Act, the law that shields the social networks from liability for their users’ content, should be changed to hold the companies responsible when their software turns the services from platforms into accomplices for crimes committed offline.

“The last few years have proven that the more outrageous and extremist content social media platforms promote, the more engagement and advertising dollars they rake in,” said Representative Frank Pallone Jr., the chairman of the Energy and Commerce Committee, which will question the three chief executives.

“By now it’s painfully clear that neither the market nor public pressure will stop social media companies from elevating disinformation and extremism, so we have no choice but to legislate, and now it’s a question of how best to do it,” added Mr. Pallone, a Democrat of New Jersey.

Former President Donald J. Trump called for a repeal of Section 230, and President Biden made similar comments while campaigning for the White House. But a full repeal looks increasingly doubtful, and lawmakers are instead focusing on smaller possible changes to the law.

Altering the legal shield to account for the power of algorithms could reshape the web, because algorithmic sorting, recommendation, and distribution are common across social media. The systems decide which links appear first in Facebook’s News Feed, which accounts Instagram recommends to users, and which video YouTube plays next.
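None of the platforms publish their ranking code, but the engagement-driven sorting described above can be illustrated with a minimal sketch. Everything here, including the names, signals, and weights, is hypothetical and only meant to show the mechanism at issue: posts are scored by predicted engagement and shown in that order, regardless of what they say.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int   # historical click count (hypothetical signal)
    shares: int   # historical share count (hypothetical signal)

def engagement_score(post: Post) -> float:
    # Hypothetical weighting: shares are treated as a stronger
    # engagement signal than clicks, so they count five times as much.
    return post.clicks + 5.0 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed surfaces the highest-scoring posts first. Note the ranker
    # never inspects the message itself -- the "content neutrality"
    # defenders of Section 230 point to.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("cat photo", clicks=120, shares=3),
    Post("outrage bait", clicks=80, shares=40),
])
print([p.text for p in feed])  # → ['outrage bait', 'cat photo']
```

In this toy example, the post with fewer clicks still ranks first because it is shared more, which is the dynamic critics of engagement-based ranking describe: amplification follows the reaction a post provokes, not its merit.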

The industry, free speech activists, and other defenders of the legal shield argue that social media’s algorithms are applied equally to posts regardless of their message. They say the algorithms act only on content provided by users and are therefore covered by Section 230, which protects sites that host people’s posts, photos, and videos.

The courts have agreed. A federal judge said that even the “most generous reading” of Mr. Force’s allegations “places them squarely within” the immunity granted to platforms under the law.

A spokesman for Facebook declined to comment on the case but pointed to comments from its chief executive, Mark Zuckerberg, supporting changes to Section 230. Elena Hernandez, a spokeswoman for YouTube, which is owned by Google, said the service had updated its search and discovery algorithms to ensure that more reliable content is displayed and labeled prominently in search results and recommendations.

Twitter said it was giving users more choice over the algorithms that rank their timelines.

“Algorithms are fundamental building blocks of Internet services, including Twitter,” said Lauren Culbertson, Twitter’s head of public policy. “Regulations need to reflect the reality of how different services operate and how content is ranked and amplified, while maximizing competition and balancing safety and free expression.”


The Force family’s case began in March 2016, when Mr. Force’s son Taylor Force, 28, was killed by Bashar Masalha while walking to dinner with graduate school classmates in Jaffa, an Israeli port city. Hamas, the Palestinian group, said Mr. Masalha, 22, was a member.

In the months that followed, Stuart Force and his wife, Robbi, worked to settle their son’s estate and clear out his apartment. That summer, they got a call from an Israeli litigation group with a question: Would the Force family be willing to sue Facebook?

After Mr. Force spent some time on a Hamas Facebook page, the family agreed to sue. The lawsuit fit into a broader effort by the Forces to limit the resources and tools available to Palestinian groups. Mr. Force and his wife also allied with lawmakers in Washington to pass legislation restricting aid to the Palestinian Authority, which governs part of the West Bank.

In American court, their lawyers argued that Facebook had given Hamas “a highly developed and sophisticated algorithm that facilitates Hamas’s ability to reach and engage an audience it could not otherwise reach.” The suit said Facebook’s algorithms had not only amplified posts but had aided Hamas by recommending groups, friends, and events to users.

The federal district judge, in New York, ruled against the claims, citing Section 230. Lawyers for the Force family appealed to a three-judge panel of the United States Court of Appeals for the Second Circuit, and two of the judges ruled entirely in Facebook’s favor. The third, Judge Robert Katzmann, wrote a 35-page dissent to part of the ruling, arguing that Facebook’s algorithmic recommendations should not be covered by the legal protections.

“Mounting evidence suggests that providers designed their algorithms to drive users toward content and people the users agreed with,” Judge Katzmann wrote.

Late last year, the Supreme Court rejected a call to hear a different case that would have tested the Section 230 shield. In a statement attached to the court’s decision, Justice Clarence Thomas cited Mr. Force’s lawsuit and Judge Katzmann’s opinion in calling for the court to consider whether Section 230’s protections had been expanded too far.

Justice Thomas said the court did not need to decide at this time whether to rein in the legal protections. “But in an appropriate case, it behooves us to do so,” he said.

Some lawmakers, lawyers, and academics say recognition of the power of social media’s algorithms in determining what people see is long overdue. The platforms usually do not reveal exactly what factors their algorithms use to make decisions, or how those factors are weighed against one another.

“Amplification and automated decision-making systems are creating opportunities for connection that are otherwise not possible,” said Olivier Sylvain, a professor of law at Fordham University, who has made the argument in the context of civil rights. “They are materially contributing to the content.”

That argument has appeared in a series of lawsuits contending that Facebook should be responsible for discrimination in housing when its platform could target advertisements according to a user’s race. A draft bill from Representative Yvette D. Clarke, Democrat of New York, would strip Section 230 immunity from targeted ads that violate civil rights law.

A bill introduced last year by Representatives Tom Malinowski of New Jersey and Anna G. Eshoo of California, both Democrats, would strip Section 230 protections from social media platforms when their algorithms amplify content that violates some antiterrorism and civil rights laws. A news release announcing the bill, which will be reintroduced on Wednesday, cited the Force family’s lawsuit against Facebook. Mr. Malinowski said he had been inspired in part by Judge Katzmann’s dissent.

Critics of the legislation say it may violate the First Amendment and, because there are so many algorithms on the web, could sweep up a wider range of services than lawmakers intend. They also say there is a more fundamental problem: regulating algorithmic amplification out of existence would not eliminate the impulses that drive it.

Daphne Keller, the director of the Program on Platform Regulation at Stanford University’s Cyber Policy Center, made that argument.
