By James Romose
November 6, 2022
at 6:00 p.m.
“The bird is freed,” Elon Musk tweeted overnight after completing his $44 billion purchase of Twitter.
What he didn’t say was that a series of court cases could soon clip his wings.
A self-proclaimed free speech absolutist, Musk has suggested he will relax Twitter’s content moderation rules, allow more objectionable language to remain on the site, and reinstate some users who have been banned. Three days after reassuring advertisers that he wouldn’t let Twitter become a “free-for-all hellscape,” he demonstrated his own free-wheeling approach to speech when he tweeted (then deleted) a link to a false conspiracy theory about House Speaker Nancy Pelosi’s husband.
Musk’s takeover and expected redesign of Twitter come at a remarkable time. Internet law may be about to enter its most dramatic transition since the days of CompuServe and AOL. As Georgetown law scholar Anupam Chander has written, Silicon Valley flourished in the United States in large part because of a well-designed legal regime. Late-20th-century legislators and courts enacted reforms that allowed emerging technology companies to operate without fear of legal liability – just as 19th-century judges crafted common-law principles to promote industrial development. The legal pillars that helped the internet grow are the same ones that would allow Musk to implement many of the reforms he has suggested. But those pillars are under threat.
Last month, the Supreme Court agreed to hear two cases that test the biggest pillar: Section 230 of the Communications Decency Act, the landmark 1996 law that immunizes technology companies from civil lawsuits stemming from the user-generated content they host on their platforms. Under Section 230, if a user posts defamation, harassment, or other forms of harmful speech (such as spreading conspiracy theories about an 82-year-old assault victim), the individual user can be sued, but the platform (with some exceptions) cannot.
Gonzalez v. Google and Twitter v. Taamneh could change that. Gonzalez asks whether Section 230 immunity disappears if a platform recommends or amplifies problematic content to users. Taamneh asks whether a company can be held liable for “aiding and abetting” terrorism if pro-terrorist content appears on its platform (even if the company aggressively removes most pro-terrorist speech).
Many legal and tech experts were shocked when the court agreed to hear these cases (which will be argued next year). As a general rule, the justices will not take such cases unless the circuit courts are divided on the underlying legal issues, and there is no real circuit split here. (Lower courts that have considered the issue have been fairly consistent in their broad interpretations of Section 230.) And the unusual posture of the two cases — lawsuits brought by families of people killed in terrorist attacks — may make them imperfect vehicles for resolving the panoply of issues that Section 230 touches.
So the fact that the court took the cases suggests that at least some justices want to restrict Section 230. One of them, Justice Clarence Thomas, has already telegraphed his view: Last year and earlier this year, he questioned the law’s broad protections and called on his colleagues to consider them carefully. (I have written before about how ideas Thomas puts forward in separate opinions tend to garner growing majorities on the new conservative court.)
Separately, two other cases are waiting in the wings. In NetChoice v. Paxton and Moody v. NetChoice, the tech industry is challenging laws in Texas and Florida that restrict platforms’ power to remove user-generated content. Politicians in those states believe tech companies are biased against politically conservative speech, and they are trying to curb what they call censorship. The tech companies argue that the First Amendment (not to mention Section 230!) protects their right to set their own rules for their platforms, including by banning speech that isn’t necessarily illegal but is harmful, like misinformation about elections or COVID vaccines.
The Supreme Court has not yet decided whether to take up the NetChoice dispute. But unlike in Gonzalez and Taamneh, there is a circuit split: The United States Court of Appeals for the 5th Circuit (in an opinion by a former clerk of Justice Samuel Alito) upheld the Texas law, while the United States Court of Appeals for the 11th Circuit struck down Florida’s similar law. So the justices will most likely weigh in.
The result for Twitter and other social media companies is a new world of largely unknown risks. If the Supreme Court narrows Section 230, Musk can forget about his promise of lighter moderation. Almost everything Twitter does is built around content recommendations produced by complex algorithms, which in turn respond to the unpredictable behavior of human users. (The same is true of every other big social media company. And of search engines.) If a company could be sued any time an automated quirk in its algorithm amplifies obscure, problematic content, the company will have no choice but to remove far more content on the front end.
If the court upholds the Texas and Florida laws, companies will also face new penalties for removing too much content. And the conundrum could get even worse: One can imagine blue states adopting their own platform regulations that directly conflict with those of red states – say, by requiring platforms to suppress the very misinformation that red states insist cannot be suppressed.
Chander thinks the ultimate loser in such a regime would be what Musk claims to stand for: freedom of speech and an open internet.
“If we impose enormous liability on platforms from the left and from the right,” he said, “it means that these platforms will now act in ways that significantly reduce the risks to them – with serious consequences for our practical freedom of expression online.”
Congress, of course, could solve this problem by clarifying the scope of Section 230. Its key provision, after all, is only 26 words long and 26 years old – perhaps it’s time for an update. Congress could also wield its power under the Constitution’s supremacy clause to preempt any state laws that conflict with Section 230’s protections. But reform proposals (from both the left and the right) have not taken off. Until they do, we’re all flying blind.
This column was originally published November 3 in the National Journal and is owned and licensed by National Journal Group LLC.