{ "id": "5ce5b73dd2250c14b26aad600abdd8cc4ef2599435f7142fcfa1203c35ad6a02i0", "number": 258985, "address": "bc1p8xm0cnmzqmv2cww79s5v32wgsvqz3z68ncqwahmfx2r3ftrfuanqle3qdn", "content_type": "text/plain;charset=utf-8", "content_length": 8221, "genesis_block_height": 779024, "genesis_tx_id": "5ce5b73dd2250c14b26aad600abdd8cc4ef2599435f7142fcfa1203c35ad6a02", "timestamp": "2023-03-02T22:04:13.000Z", "last_updated": "2023-04-01T22:29:18.701Z", "p": "ons", "op": "post", "title": "Supreme Court Poised to Reconsider Key Tenets of Online Speech", "url": "https://www.nytimes.com/2023/01/19/technology/supreme-court-online-free-speech-social-media.html", "body": "# Supreme Court Poised to Reconsider Key Tenets of Online Speech\n\n**The cases could significantly affect the power and responsibilities of social media platforms. For years, giant social networks like Facebook, Twitter and Instagram have operated under two crucial tenets.**\n\nThe first is that the platforms have the power to decide what content to keep online and what to take down, free from government oversight. The second is that the websites cannot be held legally responsible for most of what their users post online, shielding the companies from lawsuits over libelous speech, extremist content and real-world harm linked to their platforms.\n\nNow the Supreme Court is poised to reconsider those rules, potentially leading to the most significant reset of the doctrines governing online speech since U.S. officials and courts decided to apply few regulations to the web in the 1990s.\n\nOn Friday, the Supreme Court is expected to discuss whether to hear two cases that challenge laws in Texas and Florida barring online platforms from taking down certain political content. Next month, the court is scheduled to hear a case that questions Section 230, a 1996 statute that protects the platforms from liability for the content posted by their users.\n\nThe cases could eventually alter the hands-off legal position that the United States has largely taken toward online speech, potentially upending the businesses of TikTok, Twitter, Snap and Meta, which owns Facebook and Instagram.\n\n“It’s a moment when everything might change,” said Daphne Keller, a former lawyer for Google who directs a program at Stanford University’s Cyber Policy Center.\n\nThe cases are part of a growing global battle over how to handle harmful speech online. In recent years, as Facebook and other sites attracted billions of users and became influential communications conduits, the power they wielded came under increasing scrutiny. Questions arose over how the social networks might have unduly affected elections, genocides, wars and political debates.\n\nIn some parts of the world, lawmakers have moved to rein in the platforms’ influence over speech. Last year, European legislators approved rules that require internet companies to carry out procedures for taking down illicit content and to be more transparent about how they recommend content to people.\n\nIn the United States, where freedom of speech is enshrined in the First Amendment, there has been less legislative action. While lawmakers in Washington have grilled the chief executives of the tech giants over the past three years about the content they take down, proposals to regulate harmful content haven’t gotten traction.\n\nPartisanship has made the logjam worse. Republicans, some of whom have accused Facebook, Twitter and other sites of censoring them, have pressured the platforms to leave more content up. 
In contrast, Democrats have said the platforms should remove more content, like health misinformation.\n\nThe Supreme Court case that challenges Section 230 of the Communications Decency Act is likely to have many ripple effects. While newspapers and magazines can be sued over what they publish, Section 230 shields online platforms from lawsuits over most content posted by their users. It also protects platforms from lawsuits when they take down posts.\n\nFor years, judges cited the law in dismissing claims against Facebook, Twitter and YouTube, ensuring that the companies did not take on new legal liability with each status update, post and viral video. Critics said the law was a Get Out of Jail Free card for the tech giants.\n\n“If they don’t have any liability at the back end for any of the harms that are facilitated, they have basically a mandate to be as reckless as possible,” said Mary Anne Franks, a University of Miami law professor.\n\nThe Supreme Court previously declined to hear several cases challenging the statute. In 2020, the court turned down a lawsuit, by the families of individuals killed in terrorist attacks, that said Facebook was responsible for promoting extremist content. In 2019, the court declined to hear the case of a man who said his former boyfriend sent people to harass him using the dating app Grindr. The man sued the app, saying it had a flawed product.\n\nBut on Feb. 21, the court plans to hear the case of Gonzalez v. Google, which was brought by the family of an American killed in Paris during an attack by followers of the Islamic State. In its lawsuit, the family said Section 230 should not shield YouTube from the claim that the video site supported terrorism when its algorithms recommended Islamic State videos to users. The suit argues that recommendations can count as their own form of content produced by the platform, removing them from the protection of Section 230.\n\nA day later, the court plans to consider a second case, Twitter v. Taamneh. It deals with a related question about when platforms are legally responsible for supporting terrorism under federal law.\n\nEric Schnapper, a University of Washington law professor who is one of the lawyers representing the plaintiffs in both cases, said in an interview that the arguments were narrow enough that they wouldn’t change wide swaths of the internet. “The whole system doesn’t break down,” he said.\n\nBut Halimah DeLaine Prado, Google’s general counsel, said in an interview that “any negative ruling in this case, narrow or otherwise, is going to fundamentally change how the internet works,” since it could result in the removal of recommendation algorithms that are “integral” to the web.\n\nTwitter did not respond to a request for comment.\n\nTech companies are also closely watching the Texas and Florida cases. Both states passed laws prohibiting social networks from taking down certain content after Twitter and Facebook barred President Donald J. Trump following the Jan. 6, 2021, riot at the U.S. Capitol. Texas’ law lets users sue if a large online platform removes their post because of the “viewpoint” it expresses. The Florida law fines platforms that permanently ban the accounts of a candidate for office in the state.\n\nNetChoice and the Computer & Communications Industry Association (CCIA), trade groups funded by Facebook, Google, Twitter and other tech companies, sued to block the laws in 2021. 
The groups argued that the companies had a constitutional right to decide what content to host.\n\n“It’s a roundabout way of punishing businesses for exercising First Amendment rights that others disagree with,” said Chris Marchese, a counsel at NetChoice.\n\nIn Florida, a federal judge agreed with the industry groups, ruling that the law impinged on the platforms’ First Amendment rights, and the U.S. Court of Appeals for the 11th Circuit upheld most of that decision. But the U.S. Court of Appeals for the Fifth Circuit upheld Texas’ law, rejecting “the idea that corporations have a freewheeling First Amendment right to censor what people say.”\n\nThat puts the Supreme Court under pressure to step in. When federal courts offer different answers to the same question, the Supreme Court often chooses to settle the dispute, said Jeff Kosseff, an associate professor of cybersecurity law at the U.S. Naval Academy.\n\nA spokeswoman for Florida’s attorney general, Ashley Moody, pointed to the state’s filings with the Supreme Court, where it argues that the ruling blocking the law strips states’ power “to protect their citizens’ access to information.” A spokesman for the Texas attorney general, Ken Paxton, did not respond to a request for comment.\n\nIf the Supreme Court’s justices decide to hear the challenges, they could take the cases up immediately, during the term ending in June, or hold them for the next term, which runs from October until the summer of 2024.\n\n“I think we’re, right now, in a place where the court is being positioned to make a new judgment on the internet,” Mr. Kosseff said.", "author": "David McCabe" }