Social media companies are preparing for Monday's Supreme Court arguments that could fundamentally change how they police their sites.
After Facebook, Twitter and YouTube barred President Donald J. Trump in the wake of the January 6, 2021, riot at the U.S. Capitol, Florida made it illegal for technology companies to ban a candidate for office in the state from their sites. Texas later passed its own law prohibiting platforms from removing political content.
Two technology industry groups, NetChoice and the Computer & Communications Industry Association, filed suit to block the laws from taking effect. They argued that the companies have a First Amendment right to decide what appears on their platforms, just as newspapers decide what goes on their pages.
So what's the problem?
The Supreme Court's decisions in these cases, Moody v. NetChoice and NetChoice v. Paxton, are a major test of the power of social media companies, potentially reshaping millions of social media feeds by giving governments influence over how and what stays online.
“The question is whether they can be forced to carry content they don't want to,” said Daphne Keller, a lecturer at Stanford Law School who filed a brief with the Supreme Court supporting the tech groups' challenges to the Texas and Florida laws. “And perhaps more to the point, whether the government can force them to carry content they don't want to.”
Some legal experts speculate that if the Supreme Court allows the Texas and Florida laws to take effect, the companies could create versions of their feeds specifically for those states. Still, such a ruling could spur similar laws in other states, and it is technically complicated to restrict access to a website precisely based on location.
Critics of the laws say feeds in the two states could include neo-Nazi and other extremist content that the platforms previously would have removed for violating their standards. Or, the critics say, the platforms could ban discussion of anything remotely political by barring posts on many controversial issues.
What are the social media laws in Florida and Texas?
The Texas law prohibits social media platforms from removing content based on the “viewpoint” of the user or expressed in the post. It gives individuals and the state's attorney general the right to sue the platforms for violations.
The Florida law fines platforms that permanently ban candidates for office in the state from their sites. It also forbids the platforms from removing content from “journalistic enterprises” and requires companies to be upfront about their rules for moderating content.
Supporters of the laws, which were passed in 2021, say they protect conservatives from what they see as the liberal bias of the California-based platforms.
“People around the world use Facebook, YouTube, and X (formerly known as Twitter) to communicate with friends, family, politicians, journalists, and the broader public,” one legal brief argued. “And like the telegraph companies of old, today's social media giants use their control over the mechanics of this ‘modern public square’ to direct, and often stifle, public debate.”
Chase Sizemore, a spokesman for the Florida attorney general, said the state “looks forward to defending our social media law that protects Floridians.” A spokeswoman for the Texas attorney general had no comment.
What are the current rights of social media platforms?
Right now, the platforms decide what does and does not stay online.
Companies like Meta's Facebook and Instagram, TikTok, Snap, YouTube and X have long policed themselves, setting their own rules for what users can say while the government has taken a hands-off approach.
In 1997, the Supreme Court ruled that a law regulating indecent speech online was unconstitutional, distinguishing the internet from media where the government regulates content. The government, for instance, enforces decency standards on broadcast television and radio.
For years, bad actors have flooded social media with misleading information, hate speech and harassment, prompting the companies to create new rules over the past decade, including bans on false information about elections and the pandemic. Platforms have banned figures like the influencer Andrew Tate for breaking their rules, including those against hate speech.
But these measures have been met with a backlash on the right, with some conservatives accusing the platforms of censoring their views, and even prompting Elon Musk to say in 2022 that he wanted to buy Twitter to help ensure users' free speech.
What do social media platforms claim?
The tech groups say the First Amendment protects the companies' ability to make editorial choices about the content of their products, giving them the right to remove content as they see fit.
In their suit against the Texas law, the groups argued that, much like a magazine's decisions about what to publish, “a platform's decision about what content to host and what to exclude is intended to convey a message about the type of community the platform hopes to foster.”
Still, some legal scholars worry about the implications of granting social media companies unlimited power under the First Amendment, which is meant to protect freedom of speech as well as freedom of the press.
“I'm concerned about a world where corporations invoke the First Amendment,” said Olivier Sylvain, a professor at Fordham Law School who until recently was a senior adviser to Federal Trade Commission Chair Lina Khan.
How does this impact Big Tech’s responsibility for content?
A federal law known as Section 230 of the Communications Decency Act shields platforms from lawsuits over most user content. It also protects them from liability for how they choose to moderate that content.
The law has been criticized in recent years for making it impossible to hold platforms accountable for real-world harms that result from the posts they publish, such as online drug sales or terrorist videos.
The cases being argued Monday do not directly challenge that law. But Section 230 protections could play a role in the broader debate over whether the courts should uphold the Texas and Florida laws, which create new legal liability for platforms if they remove certain content or ban certain accounts.
Last year, the Supreme Court considered the scope of Section 230 protections in two cases involving Google's YouTube and Twitter. But the justices declined to hold the tech platforms legally liable for the content in question.
What's next?
The court will hear arguments from both sides on Monday. A decision is expected by June.
Legal experts say the court could rule the laws unconstitutional but provide a road map for how to fix them. Or the justices could uphold the companies' First Amendment rights entirely.
Carl Szabo, the general counsel of NetChoice, which lobbies against tech regulation on behalf of companies including Google and Meta, said that if his group's challenges fail, “Americans across the country would be required to see lawful but awful content” that could be construed as political and therefore covered by the laws.
“There's a lot of stuff that gets couched as political,” he said. “Terrorist recruitment is arguably political in nature.”
If the Supreme Court rules that the laws are unconstitutional, the status quo will stand: platforms, and no one else, will decide what speech stays online.
Adam Liptak contributed reporting.