For years, Washington has been baffled over how to regulate the internet, or whether it should even try. But the Supreme Court is due to hear a case next week that could completely transform the online world as we know it.
On Tuesday, the justices will hear arguments in Gonzalez v. Google, a case challenging Section 230 of the Communications Decency Act, a 1996 law that grants internet platforms immunity for most third-party content posted on their websites. Arguments will revolve around recommendation algorithms, which the plaintiffs say boosted extremist messaging before a terror attack. They argue that Section 230 protections should not apply to content recommended by an online company’s algorithm, and that Google is therefore legally responsible for extremist videos posted on its YouTube service.
While the hearing is scheduled for next week, a ruling is not expected until June.
Section 230 is the reason companies like Facebook or Twitter are not responsible for user-generated content, and why a website is not legally liable if someone posts a defamatory review. But it has come under fire in recent years from critics who say it allows disinformation to spread and protects sites known for disseminating hateful and extremist rhetoric. At the same time, experts fear that rollbacks to Section 230 could go too far and irreparably damage the foundations of free speech on which the internet was built.
Recent AI developments like ChatGPT have added a new dimension to the fight over Section 230, as bots that have so far proven unreliable at providing accurate information could soon be protected by the law.
Some experts say the Supreme Court’s rulings in these cases could represent a unique opportunity to clarify the scope of Section 230, but others warn that going too far could completely gut the law and leave our relationship with the internet hardly recognizable.
“The more the digital world is intertwined with our physical world, the more urgent this will become,” Lauren Krapf, senior counsel for technology policy and advocacy at the Anti-Defamation League, an anti-discrimination group, told Fortune.
The backbone of the modern web
Section 230 made the internet work the way it does today by allowing websites to publish most content without fear of legal liability, through a 26-word provision that has been hugely influential in shaping the modern internet: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
The Electronic Frontier Foundation, a digital rights organization, states that without Section 230, “the free and open Internet as we know it could not exist,” and the provision protecting internet companies is often called “the 26 words that created the Internet.”
But those words, written more than a quarter century ago, have come under intense scrutiny in recent years, and politicians on both sides of the aisle have targeted Section 230 as part of a broader effort to regulate the internet. Even tech leaders, including Meta CEO Mark Zuckerberg, have proposed that Congress require platforms to demonstrate that they have systems in place to identify illegal content. But how, and to what extent, the law should be revised has so far eluded consensus.
“We’re at a point where Congress really needs to update Section 230,” Krapf said. Her organization filed an amicus brief in the Google case on behalf of the plaintiffs, urging the Supreme Court to consider the ramifications of Section 230’s immunity provision.
But given the magnitude of Section 230’s effects, reaching agreement on how best to revise it is no easy task.
“Because [Section 230] is a high-stakes piece of the puzzle, I think there are a lot of different views on how it should be updated or reformed and what we should do about it,” Krapf said.
What makes Gonzalez v. Google different from previous attempts to reform Section 230 is that, for the first time, the matter has been taken to the Supreme Court instead of Congress, and the ruling could set a precedent for future interpretations of the law.
At the heart of the case is the dissemination of pro-terrorist messaging on online platforms. The Gonzalez family alleges that the Google-owned service YouTube was complicit in radicalizing Islamic State fighters in preparation for a 2015 terror attack in Paris that killed 130 people, including 23-year-old Nohemi Gonzalez, an American student who was studying abroad. A lower court ruled in favor of Google, citing Section 230 protections, and the Gonzalez family turned to the Supreme Court, arguing that Section 230 covers hosted content but not the algorithmic recommendations of that content at issue.
Gonzalez v. Google isn’t the only case presenting a potential challenge to Section 230 next week. A related case the court will hear on Wednesday, Twitter v. Taamneh, was brought by relatives of Jordanian citizen Nawras Alassaf, who was one of 39 people killed in a 2017 ISIS-affiliated mass shooting at an Istanbul nightclub.
Alassaf’s family sued Twitter, Google, and Facebook for failing to police pro-terrorist content on their websites, a lawsuit that a lower court allowed to advance. Twitter argued that allowing the suit to move forward was an unconstitutional extension of the Anti-Terrorism Act and appealed the decision to the highest court. Because the lower court never ruled on the merits, Section 230 was never discussed, but it will likely come up at the Supreme Court hearing next week.
Targeting recommendations could be a slippery slope
The Gonzalez family is asking the Supreme Court to clarify whether YouTube’s recommendations are covered by Section 230, and exceptions to the law are not unprecedented.
In 2018, former President Donald Trump signed a carve-out to the law that made online sites liable for content involving sex trafficking. But the difference in the Google case is that the plaintiffs are not targeting specific content, but rather the recommendations generated by the company’s algorithms.
“Their contention is that their lawsuit targets YouTube’s recommendations, not the content itself, because if they were targeting the content itself, Section 230 clearly comes into play and the lawsuit would be dismissed by the court,” Paul Barrett, deputy director and senior fellow at NYU’s Stern Center for Business and Human Rights, told Fortune.
Virtually all online platforms, including Google, Twitter, and Facebook, use algorithms to generate content recommendations tailored to users. But Barrett argued that targeting recommendations rather than content could be a slippery slope for future lawsuits against online platforms, given that recommendation algorithms have become central to everything tech companies do.
Barrett and the center he is affiliated with also filed an amicus brief with the court, which acknowledges the need to modernize Section 230 but argues that the law remains a crucial pillar of free expression online, and that an extreme ruling opening the door to lawsuits targeting algorithms rather than content could undermine those protections.
“Recommendation is not a separate, distinct, unusual activity for YouTube and the videos it recommends. Recommendation is, in fact, what social media platforms in general do,” he said.
If the Supreme Court rules in favor of the Gonzalez family, it could leave Section 230 vulnerable to future lawsuits targeting the algorithms of online platforms rather than their content, Barrett said, adding that in the extreme it could lead to the complete erosion of the protections the law grants to technology companies.
“I think what you would see is a very dramatic restriction or reduction of what’s available on most platforms because they just wouldn’t want to take the risk,” he said. Instead, he said, online platforms would self-censor, carrying far less content that could serve as legal bait.
Such an extreme gutting of Section 230 would make life much harder for large corporations, but it could pose an existential threat to smaller online platforms that are primarily crowd-powered and have fewer resources to fall back on, Barrett said, including popular sites like Wikipedia.
“We wanted to sound the alarm: ‘Hey, if you go down this road, you might do more than you think,’” Barrett said.
Both Barrett and Krapf agreed that Section 230 is probably overdue for refinement, and that the need grows more urgent as technology becomes more intertwined with our lives. Krapf described the court hearing as a good opportunity to get clarity on Section 230 as part of a larger need for Congress to regulate the behavior of tech companies and ensure consumers are protected in the digital world as well.
“I think the urgency just keeps building,” Krapf said. “We have seen reliance on our digital world come into its own over the past few years. And now, with a new wave of technological advancements at the forefront, we need better rules of the road.”