Why We Should All Fear Facebook’s Oversight Board

A dark new world of private government is upon us.

Facebook’s days as a fun place to catch up with high school classmates and play Farmville are well behind it. Besieged by multiple investigations, antitrust lawsuits, and adverse judgments from the Federal Trade Commission (FTC), 48 U.S. states, and the European Commission for practices ranging from privacy violations to mopping up competitors with mergers, the company is investing urgently to detox its image and steer now-inevitable regulations in a favorable direction.

But policing political speech has become such a hot-button issue that Facebook has taken a relatively dramatic step, officially setting up a nominally independent content-moderating group known as the Oversight Board. The Board has secure funding from an irrevocable Facebook-financed trust, and it has the power to make binding judgments on the giant platform’s content—what posts can stay up despite being controversial, what must be taken down, and what designations will be applied to these posts.

Now, having previously taken tremendous heat over these decisions, up to and including suspending Donald Trump’s account after the Capitol insurrection, Facebook can hope to mostly avoid these business-impeding moral posers by fobbing them off on the FOB. But the Board’s significant power over one of the world’s biggest communications platforms can’t be denied, as witnessed in its dramatic May decision to uphold Trump’s original ban from the platform. Combined with its quite explicit modeling on the U.S. court system—based on founding “Articles” that mimic a constitution—it’s a whole new frontier in private government.

Listicles of Confederation

The Oversight Board was conceived years ago by CEO Mark Zuckerberg and his senior execs, who announced plans for it in November 2018 after intense criticism that the company allowed the spread of misinformation that led to the election of Trump, enabled the Myanmar military to propagandize in defense of its genocide of the Rohingya people, and facilitated other ugly episodes. But early in the process it became clear that proceeding would be difficult, as the hundreds of legal scholars, academics, and journalists Facebook consulted reached very little consensus on how to govern speech on the company’s platforms (including Instagram).

The Board was intended to help set rules for difficult calls on content. It exists as a supplement to the brute force of Facebook’s giant army of moderators, who do the actual scouring of the platform to remove violent, offensive, or inflammatory content. With A.I. content-monitoring still in development, the tens of thousands of moderators are mostly contractors or subcontractors—often based overseas—who spend their work days continuously watching flagged content from the platform. Exposed to a flood of images and video showing violence, sexual abuse, animal cruelty, and general online ugliness, the content moderators have what the Wall Street Journal called “the worst job in technology.” 

But the Board’s fundamental structure was clear to Zuckerberg from the beginning; he explained the emerging system to reporters in 2018. If a post is taken down, the poster can appeal within the company. Zuckerberg said, “The basic approach is going to be that if you’re not happy after getting your appeal, then you can also try to appeal to this broader board or body… Like some of the higher courts in other areas, maybe it’ll be able to choose which cases it thinks are incredibly important to look at.” From the first, Zuckerberg openly saw the Board as an institution of private government modeled on the U.S. courts.

Published in September 2019, the Oversight Board Charter is something else. Its mere 12 pages lay down rules that may govern the acceptable content limits for billions of people in the future. In line with this grave mandate and Zuckerberg’s original analogy to the courts, this document of corporate government is written like a public constitution, with enumerated Articles delineating specific powers and authority. The Charter opens by stating the value of connecting people and free expression, but it notes this has the potential for abuse. It also notes “internet services have a responsibility to set standards,” because of course who else would? This makes for a fine neoliberal view of public affairs. The Charter is eager to present the Board as fully independent of Facebook, casting it as a neutral arbiter calling balls and strikes for the platform’s content rules.

The document wants to look like other great founding documents of history, but the constant need for corporate hedging undermines it. For example: Article 1, Section 1, “Size,” states, “The board will consist of no less than eleven members. When it is fully staffed, the board is likely to be forty members. The board will increase or decrease in size as appropriate.” The effect is more like a slippery Terms of Service list than the Articles of Confederation. But again like the courts, “any prior board decisions will have precedential value” for deciding future similar cases.

The Board has the ability to request information from Facebook, deliberate based on the company’s Community Standards, and “instruct” Facebook to restore content on which it had previously ruled. It can remove a designation (e.g., for graphic violence) as well. It can also advise the company on policy, but while some over-earnest commentators have called the Board “judge and jury over Mark Zuckerberg,” the company’s own financial filings specifically state he is in fact unusually powerful over the company even for a CEO, due to his ownership of special voting shares that dominate the company’s board. More relevantly, the company and the Charter are very clear: “The board will have no authority or powers beyond those expressly defined by this charter,” meaning the Board’s scope is limited to content moderation and issues around it. This does not include review of Facebook’s powerful algorithms that decide what we see on our timelines, including what content is given wide circulation and what gets fewer eyeballs.

Funelected Government

Much as the court system is an independent branch of government, the Board is meant to be independent too. It is funded by an irrevocable $130 million trust, and its co-chairs said in a joint New York Times editorial that the company “recognize[s] that no company should settle these issues alone.” The Board’s materials frequently stress its independence from Facebook, but the senior co-chairs were chosen by the company and serve three-year terms that are renewable for up to six years.

Board members are paid six-figure salaries for a commitment of about 15 hours a week, according to the Times, a setup that resembles corporate board positions. The members are tremendously racially and geographically diverse, but are dominated by lawyers, professors, and law professors, who make up 13 out of 19 current members (not counting a co-chair with a public policy M.A.).

The co-chairs are the most important members, given their directing role and power to appoint the rest of the Board. Perhaps the most surprising of the four co-chairs is former Danish Prime Minister Helle Thorning-Schmidt, who brings the mystique of Scandinavian social democracy and is remembered in the U.S. for her notorious selfie with then-President Barack Obama at Nelson Mandela’s funeral. As head of the Social Democrats leading a Danish minority government, Thorning-Schmidt pursued a typical agenda for the “socialist” parties of Europe—overseeing the partial privatization of the state energy utility to Goldman Sachs, which despite buying only 18 percent of the company received special privileges, including a board seat and “veto power” over major strategic decisions. This led to fairly major demonstrations and the fall of the government. From neoliberal government privatization to private corporate government, Thorning-Schmidt is a portrait of why support for European social democrats has fallen through the floor.

Co-chair Catalina Botero-Marino is a Colombian attorney and law school dean with a long record of human rights advocacy, especially around freedom of expression. Botero-Marino’s research celebrates the loosening of speech restrictions in Latin America in the wake of the fall of the region’s Cold War military governments (mostly supported or installed by the U.S.). On the other hand, she has longstanding ties to U.S.-dominated pan-American entities that have largely been part of that same U.S. imperialism in the Western hemisphere. She was a special rapporteur for the Inter-American Commission on Human Rights, which has historically avoided confronting the U.S. and U.S.-backed regimes and today prefers focusing on both real and flimsy charges against non-U.S.-aligned states like Venezuela, Bolivia, and Ecuador. She is also a member of the Inter-American Dialogue, another NGO aligned with the Organization of American States (the leading U.S.-dominated regional group), which today focuses on combating leftist governments and Chinese influence among them.

Columbia Law School professor Jamal Greene was elevated to the Board after a long career there and at Harvard Law. His work focuses on the flaws of originalism, the legal theory popular on the current U.S. Supreme Court that the Constitution should be interpreted according to its original meaning at the time it was adopted. This view is controversial, seeming to cement the legal system into the mold of a document that counted enslaved Black people as three-fifths of a person for Census purposes and came with an incredibly constrained model of democracy. Greene later served as an aide to then-Senator and now U.S. Vice President Kamala Harris, sitting behind her during the infamous Kavanaugh hearings in 2018. This suggests a number of Board members have an affinity with the neoliberal Democratic leadership—another Board member, Pam Karlan of Stanford Law, has already recused herself from the Board to work for the Biden administration.

The final Board co-chair is Michael McConnell, a former circuit court judge, Stanford law professor, and a senior fellow at that school’s conservative Hoover Institution. McConnell is one of the two arch-conservatives present on the Board (sadly but unsurprisingly there are no leftists). In the past he “defended the Boy Scouts’ right to expel a gay scoutmaster in front of the Supreme Court and once argued that Bob Jones University had a legal right to ban interracial dating on religious grounds,” as the Wall Street Journal reported. Given the very senior role of the co-chairs, who oversee the case and membership committees (discussed below), the presence of an outspoken conservative among the relatively establishment liberal co-chairs may foreshadow a significant rightward tilt of the entity. That would align with Facebook’s incentive to mollify right-wing users, whose cries of censorship only grow.

The other outspoken right-wing Board member is John Samples, a V.P. at the libertarian Cato Institute think tank. Samples has a long record of predictably detestable opinions, including in his paper “Why the Government Should Not Regulate Content Moderation on Social Media,” which argues that online platforms should “self-regulate” to avoid any potential legal intrusion into online speech, and that “Facebook appears to be offering a private solution.” He objects, though, to policing abusive online speech or outright fake news, arguing such restrictions “clearly could not pass muster under American constitutional law.” But of course, the Constitution mainly limits the power of the state, not the power of private platforms like Instagram and Twitter.

McConnell has told the press, “Practically the only entities I trust less than companies would be the government.” The whole thing is a farce, though, because Samples now holds an influential position on what is more or less openly an arm of private government. This “private solution” is just a privatized carbon copy of the public court system, run by a profit-maximizing monopolist corporation.

Other Board members include law faculty from India to Brazil, heads of digital rights NGOs in Pakistan and Kenya, the senior editor of a Jakarta newspaper, and a Yemeni Nobel Peace Prize laureate. Most of these people seem like relatively earnest liberals who aspire to be neutral, thoughtful technocrats, but again the problem is who the fuck put them in charge. Some publicly traded corporation set the rules, and picked the senior co-chairs who then picked the rest of the Board. It really is the ultimate in private government—as Elizabeth Anderson described market companies to Current Affairs:

[Whenever] you have a group of people who have to take orders on pain of sanction, what you have is a little government. Now we can ask: what is the constitution of that government? Well, it is certainly not a democracy because the people who are taking orders don’t have any opportunities to elect their rulers or to hold them to account if they behave badly.

But this shadowy private hierarchy has another layer, because as the Board’s bylaws make clear, an enormous amount of authority is vested in the trustees. The trustees run the LLC that houses the Board. They are appointed by Facebook and have final say over many crucial Board processes—the co-chairs nominate their successors, but the trustees must approve them. The Board recommends new members, but only to the trustees, and while a two-thirds Board vote can expel a member, the trustees must affirm it. The bylaws specify three to 11 “individual trustees” and “one corporate trustee,” appointed by Facebook, who together approve the Board’s budget and hire the staff. This means the shadowy trustees have veto power over any wild decision like appointing a socialist to the Board.

This form of arm’s-length authority, an allegedly independent but orbiting deliberative body, could prove popular with Google and other gigantic platforms struggling with opposing content demands. Facebook apparently hopes other tech firms will join, which is part of the reason the body doesn’t have Facebook in its formal name or use Facebook’s design elements. Whether others join or not, the model could be attractive. I swear to God in 10 years we’ll be seating a new Supreme Court Justice who got their start clerking for this thing.

Unappealing Process

The private government’s appeals process is spelled out on its website. If your Facebook or Instagram post is taken down and you lose your appeal, you may then appeal to the Board via its website. Like the U.S. Supreme Court, the Board decides which cases it wants to hear, aiming for cases that are subtle or involve a difficult balance, with the goal of setting future precedent. Notably, the Board’s appeals process states it “abides by country-specific laws. Because of this, not all content decisions are eligible for appeal.” So if India’s fascist government legislates that Prime Minister Modi can’t be criticized, a removed post doing so might not be appealable.

Cases accepted for consideration are assigned to randomly selected five-person panels, with a goal of including at least one person from the poster’s country or region. The Charter notes that while Board membership is public, “the composition of individual panels may remain anonymous,” which I can see really infuriating people. The panel’s decision is then reviewed by the full Board, which will “strive to make decisions by consensus,” but if consensus cannot be reached, a simple majority will do.

Notably, a panel’s decision can be “re-reviewed” if a majority of the full Board requests it, in which case a new panel is convened to work on an expedited basis. Facebook can also request expedited decisions in urgent cases. Final decisions are accompanied by an explanation meant to clarify the basis for the ruling and to make the deliberations more transparent. Finally, “Facebook will promptly implement the board’s decision,” according to the document.

But not necessarily. As the Board’s bylaws state several times, content the Board orders reinstated or left up will be reviewed by “Facebook’s legal department… for the express purpose of ensuring that Facebook is not under a legal obligation to block access to that content.” And the bylaws confirm that some cases “will not be eligible for the board to review,” including “[where] the underlying content is unlawful in a jurisdiction with a connection to the content… and where a board decision to allow the content on the platform could lead to adverse governmental action against Facebook.” That’s a pretty damn big caveat, meaning that if the U.S. outlaws pro-BDS posts, or Brazil bans criticizing the destruction of the Amazon, the Board may be unlikely to hold a hearing over whether posts violating those laws should be allowed. As with other corporations committing disgraceful acts, the first defense is “we’re just following the law,” which of course is written to their advantage.

Verdicts so far have pushed back on the platform’s apparent over-reliance on algorithmic curation, ruling for example that Facebook must restore a removed Instagram post on breast cancer detection that included a woman’s nipple (a post the platform had already reinstated). Other decisions restored deleted posts promoting unproven COVID-19 therapies, and others making derogatory comments about Muslims; in these cases the Board ruled that while the posts were misinformation and hate speech respectively, they fell short of its demanding standard for removal. The Board is pushing for more human moderation rather than reliance on algorithms, and members have also stated their hopes for more authority over the automated systems that promote or suppress content.

By far the most prominent case to come before the Board has been whether Trump’s suspension should stand after his incitement of the Capitol occupation. While the legal reaction to Trump’s suspension was fraught, with limited and highly qualified support, Nick Clegg—Facebook’s V.P. for global affairs and a former member of parliament for the U.K. Liberal Democrats—predicted the Board would vindicate the platform’s original suspension, though he was unsure whether it would let Trump back on. A randomly assigned panel of five members, including at least one American per Board policy, deliberated the case, delaying its decision once, with its conclusion subject to a vote of the full Board: the dramatic May ruling, noted above, that upheld the suspension.

More broadly, the ruling could bear on the company’s existing policy, which gives global leaders more leeway to make inflammatory comments than the average user, with major potential consequences for the present cadre of global reactionary bastards from Bolsonaro to Johnson to Modi. But while we focus on the fate of the biggest world figures, the New York Times observed that “the fact remains that for most Facebook users, the company is the last and final word on what people can or can’t say. And Facebook faces little accountability for the process.” True, and chilling, words.

The national press coverage recognizes the incredible power of this legal entity, including its ability to deny a megaphone to gigantic figures like Trump (at least temporarily). The Times called it “a new kind of governance, in which transnational corporations compete for power with democratically elected leaders,” and “[a] new kind of corporate supercourt.” However, the paper later recognizes that this now-delegated power of the Facebook platform is largely a successor to the “gatekeeping” role of the formerly “widely trusted mass media.” While reassuring to the Times, this fact conforms less with the liberal ideal of a “free marketplace of ideas” and more with Ed Herman and Noam Chomsky’s idea of manufacturing consent through corporate control of the limits of debate, while encouraging debate within those narrow limits. Now the Big Tech platforms of YouTube, Twitter, and Instagram, along with their deputized Mod Squads, are doing that job where NBC, CNN, and the Washington Post once did.

The White Hand

Some Facebook users may not be aware of a cute practice among the company’s employees. If a company technician logs into a user’s account or accesses private portions of it, like direct messages, in order to fix bugs or deal with a platform issue, a notification appears for that user—but only if they also work for the company. For most of us, no such indicator appears. The company calls this in-house-only notification “Security Watchdog,” but its workforce has come up with a different, playful name for it—the Sauron alert, named of course for the nightmarish all-seeing eye from The Lord of the Rings.

If Facebook is the all-seeing, power-hungry Sauron, then perhaps the Board will be like Saruman—the learned wizard, motivated by a thirst for knowledge, who becomes corrupted by power and allies with Sauron. His futile desire for order (which manifests as tyranny) is ultimately undone by a ragtag alliance of peoples and powers who defeat Sauron and restore peace to Middle-earth.

Rather than with swords and magic staffs, hopefully our Sauron can be countered by a popular socialist movement that can in time replace this unaccountable private government with one reflecting a form of economic democracy, where not just content limits but the platform’s broader rules are set with public input.

Content is too important to be left to the mods. 
