
Can Social Media Sites Be Held Accountable for Users’ Posts?

Law and society program director breaks down Section 230 debate.

During the presidential campaign, some debate centered on the future of Section 230 of the Communications Decency Act, a decades-old law which says that sites like Facebook and Twitter cannot be held responsible for the content of users’ posts on their platforms.

In legalese, the law holds that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

As the issue carries over after the inauguration of President Joe Biden—who has indicated that he is open to tweaking the existing liability law—it is worth examining what Section 230 means, and what may happen in the future if it were to be altered or, as some hope, eliminated altogether.

Evan Laine, director of the University’s law and society program and faculty director of the Arlen Specter Center for Public Service, explains the nuances of an issue that could impact the course of the national conversation moving forward.

Evan Laine is the faculty director of the Arlen Specter Center for Public Service, located in the Roxboro House, where he teaches courses including “American Conspiracy Theories, Myths and Media Creations.”

There are essentially two areas of controversy.

First, under Section 230, social media sites are not responsible for defamatory content that users post on their platforms.

Laine draws an analogy to help explain what the law means. If a market sells bad fruit, both the market and the grower can be held responsible for that product. Social media sites such as Facebook and Instagram, however, do not sell content; rather, they provide a place for others to post it.


Under Section 230, these sites bear no responsibility for the content that appears on them.

“It’s an open area where people come to speak, just as the street owner is not responsible for the conversations that occur on his street,” he says. “Providers are saying that the same applies when you go on Facebook: they’re not saying the alleged defamatory comments. A third party they cannot control is. Therefore, they have no liability.”

The second area of controversy arises when these sites ban or limit a third party’s comments as a violation of the site’s terms of service. What many critics get wrong, according to Laine, is claiming that this is a First Amendment violation. Because the platforms are private entities, the First Amendment, which regulates only government action, does not apply to them. The sites are therefore entitled to censor content, whereas the government, in most circumstances, cannot.


According to Laine, although it is well recognized that the First Amendment applies only to government action, future Section 230 legal arguments will likely invoke Marsh v. Alabama as precedent for extending constitutional protections to private entities in certain circumstances.

In that case, the Supreme Court held that a person distributing religious literature on the sidewalk of a “company town” had First Amendment protections. (The term “company town” describes a municipality where a private company “owned everything” from banks to grocery stores, and established onerous rules to control speech, according to Laine.)

“The argument was that they became a quasi-government, and were acting in that fashion. The Supreme Court agreed,” Laine explains.

Applying that 1946 case to 2021, Laine substitutes social media for the “company town.”

“People will argue that when this case hits the courts, Facebook, Twitter and Instagram are like ‘Marsh,’ in that social media have taken over as the place where people speak out since that’s no longer happening on Main Streets, but online,” he says. “The argument is that they have now become the political forum, and should be held to First Amendment standards. It will be very interesting to see how the Court rules in the future.”


He notes that critics on both sides of the political spectrum want to rein in social media sites, albeit for different reasons.

Though potential approaches have been bandied about, such as “bipartisan panels appointing content-moderation teams,” Laine maintains that “protecting the marketplace of ideas” matters more than letting reactionary measures take hold.

“I think it’s important for all opinions to get out there. An educated public should have the resources and intelligence to sort through everything and find the truth for themselves,” he says. “Who is to say ‘this content is good and this content is not good’ before it gets to the public?

“Yes, Facebook is irresponsible, and Instagram and Twitter are irresponsible. Nevertheless, the public, not government, should be who rejects questionable information from those sites. If the government removes Section 230 protection and opens these sites to defamation lawsuits, you’ll bankrupt those companies, and that will be the end of the Internet.”


In that regard, Laine points to the Hulk Hogan v. Gawker case as a cautionary tale, one that highlights how wealthy individuals can sue journalistic outfits into oblivion, with or without a credible claim, solely because of the size of their bank accounts.

Peter Thiel, a co-founder of PayPal, spent about $10 million to bankroll the Hogan privacy suit in order to take Gawker down, because he had been personally insulted by one of its articles.

“In that situation, the person with the most money wins,” Laine shares. “Essentially what happens is that their team starts filing an avalanche of motions, and it doesn’t matter if they lose time and again. All that matters is that the other side, with less money, has to respond to these motions while paying a fortune to their lawyers.

“In the end, even if the sued party wins or gets the case dismissed, it doesn’t matter. The money is the point, and the goal is to legally bankrupt the other side with attorney’s fees. It’s about the rich and powerful going after these sites. With such a threat, who would dare to speak out, even in truth, against the rich and powerful?”


How does that apply to the current situation?

“If you allow platforms to be sued, the internet ends. They’ll be sued into oblivion by those who can fund these expensive cases even if the stories are well-sourced and true,” he says. “That’s why you need Section 230.”

It is important to note that Section 230’s protections have limits, as in the case of Backpage.com, which the government shuttered because of the site’s alleged ties to human-trafficking crimes.

“Here’s the difference between Backpage and Facebook: When taking ads for cases involving human trafficking, Backpage knew how to word ads properly in code. They were actively working with advertisers in that regard,” he says. “Facebook is a provider of information. Backpage became involved in the information with code words. That’s different.”

In the end, this responsibility of readers to check their sources is an issue that surfaces often in Laine’s “American Conspiracy Theories, Myths and Media Creations” course.

By way of solutions, Laine urges students, and the public at large, to be vigilant about what they read, post and repost online. He also suggests that content providers be transparent about which accounts get blocked (with a public record of why), and that monetized ads be clearly identified as such.

“I believe in a marketplace of ideas, but where are you getting your information from, and how are you vetting the information?” he says. “Be very careful. That’s something we teach at Jefferson: Make sure the information you are consuming and sharing is credible and examined before you spread it. Any knee-jerk response will lead to a disaster. There aren’t two sides to every story.”