Wikipedia Worries Its Volunteer Editors Could Be Liable to Lawsuits Without Section 230
Where does Wikipedia, the world’s most-visited repository of information on the internet, stand without guaranteed digital liability protections? It’s a question weighing heavily on the people who make up the Wikimedia Foundation, the nonprofit organization that administers the site, which contains 58 million articles in multiple languages and sees more than 16 billion visits each month.
“Having that protection there is what has allowed Wikipedia to be written by thousands of volunteer editors around the world over the last 22 years,” said Wikimedia Foundation’s lead counsel Leighanna Mixter. “So without the protections of Section 230, that becomes a much more difficult scenario for us.”
In just a few weeks, on Feb. 21, the Supreme Court will hear arguments about whether Section 230 of the 1996 Communications Decency Act—the “26 words that created the internet,” in the words of journalist-turned-legal-scholar Jeff Kosseff—should even exist. Section 230 essentially makes it so websites and platforms aren’t considered the publishers of the content users post. It has shielded companies from being liable for dangerous or libelous content as much as it’s been a boon for people holding powerful entities’ feet to the fire.
In a Zoom interview with Gizmodo, Mixter and the nonprofit’s legal director Jacob Roberts said they have run scenarios for what the site would look like without Section 230, and it wasn’t pretty. The problem for Wikipedia especially is that most editing is done by scores of volunteer editors. The mostly self-organized Wikipedia community selects certain users to sand the rough edges off articles, edit content, or even block certain users. There are even select groups that determine the article that appears on Wikipedia’s front page.
And all those decisions could be considered content “recommendations.” Admins have regular conversations about what should remain in articles, what language needs to change, which linked sources are reliable, and on and on. In effect, the volunteer admins are the worker bees of the collective encyclopedic hive, and they have been mostly safe under the banner of 230. Depending on how the justices rule, the Wikimedia attorneys argued, the site could be liable for everything from how a page is laid out to what kinds of links editors provide, as that could itself be considered a kind of content “recommendation.”
“Without the protections of Section 230, platforms are kind of stuck between one extreme or another,” Mixter said. “Either don’t moderate anything, and let your communities run wild and contribute anything, or over-moderate and be very risk-averse.”
How Does 230 Protect Wikipedia and Its Community Editors?
Most of our current internet infrastructure is based on Section 230. This includes sites like Facebook or TikTok run by multi-billion dollar corporations, as well as nonprofits that have allowed an enormous amount of information to be shared freely online. When the Supreme Court hears arguments in Reynaldo Gonzalez v. Google, the question is more than just whether algorithmic moderation does enough to prevent bad content from appearing on sites; the case could also confront whether any sort of moderation decision amounts to acting as the publisher of that information.
Of course, it’s very different for Wikipedia compared to other major platforms with major companies and thousands of employees working to moderate content. Though the foundation is responsible for the site’s upkeep, it’s also the first bulwark against any legal threat to its volunteers. Roberts said most lawsuits against Wikipedia get quashed soon after they’re filed, thanks to 230. Without that protection, costly lawsuits could drag on much longer, which opens up the possibility that some ne’er-do-well with an agenda could threaten a lengthy court battle in order to antagonize Wikipedia or an editor into changing or removing content on the site.
Wikipedia’s not the only one to mention the impact on community editors. Reddit also filed an amicus brief in the case, arguing that the volunteer moderators of its multitudes of subreddits would be in the legal crosshairs. The message board site cited several instances in which moderators of various subreddits were sued over their decisions. Without Section 230, the company argued, things could get much worse.
That’s not to say Wikipedia hasn’t had to wrangle with the impact of its content, especially outside the United States. Wikipedia operates sites in multiple languages, each with its own set of editors. This past weekend, the Pakistani government banned Wikipedia for not blocking “sacrilegious” content. The country’s telecommunications arm said the site did remove some of the material, but not all of it. On Monday, the country lifted its ban after facing backlash. A recent report claimed several now-banned Wikipedia administrators and users were working for the Saudi Arabian government, manipulating and controlling information about the country. Other past Wikimedia admins working in Saudi Arabia have also reportedly been jailed for editing content regarding the country’s human rights abuses.
And there have been lawsuits as well. A court in Germany ruled that a Wikipedia article about a professor was defamatory, and forced the site to remove the content.
Roberts said laws in France and Germany have certainly led to some lawsuits against the foundation, but depending on how the Supreme Court rules on 230, a far greater onus could fall on the internet encyclopedia.
“It really could have an operational type of impact where we would be seeing a lot of very costly lawsuits,” he said. “And we might have to make the kind of decision that is between censoring content that maybe we could actually defend but at the end of the day, it would be too expensive to defend it and not have all of the money that we raise from donor fundraising go towards defending these sorts of lawsuits.”