How much oversight can an oversight board have if an oversight board has no real oversight?
We're about to find out!
Hello, Friday. Good to see ya.
Programming note: Yom Kippur, the holiest day of the Jewish year, starts Sunday evening, so I will not be doing a newsletter on Monday. To those I’ve wronged over the past year, I am sorry; please forgive me.
(Image via Getty)
What happens when we find ourselves drowning in oversight, but those who sit on an oversight board have absolutely no power to effect change? We’re about to find out.
Way back in November 2018, Facebook announced it was creating a board of independent folks “to exercise independent judgment over some of the most difficult and significant content decisions.” This came on the heels of an April 2018 thought experiment from Mark Zuckerberg about having a “Supreme Court” for content moderation, a tribunal of sorts that, with a thumbs-up or thumbs-down, would dictate which content survives and which content dies.
Here’s what Zuckerberg told Vox in April 2018:
But over the long term, what I’d really like to get to is an independent appeal. So maybe folks at Facebook make the first decision based on the community standards that are outlined, and then people can get a second opinion. You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.
In September 2019, Facebook peeled back the curtain a bit to show us how they were selecting board members.
Yesterday, the company said that the oversight board would kick off in October, just in time for the U.S. election.
The Guardian writes:
The launch will come at a time of intense scrutiny and pressure for the company that has lurched from controversy to controversy since it was used by Russia to interfere with the 2016 US presidential election. The consequences of Facebook’s failures in addressing hate speech and incitement, which have for years been linked to violence in several countries and ethnic cleansing in Myanmar, have become increasingly apparent in its home country in recent months. During a summer of civil unrest in the US, Facebook was linked to the growth of the violent Boogaloo movement and a militia’s “call to arms” on the night two Black Lives Matter protesters were shot and killed in Kenosha, Wisconsin.
The limits of the oversight board’s mandate have been a key point of controversy since the independent institution was proposed by Facebook’s chief executive, Mark Zuckerberg, in 2018. The board’s initial bylaws only allowed it to consider appeals from users who believe that individual pieces of content were unfairly removed, prompting criticism from experts, including Evelyn Douek, a lecturer at Harvard Law School who studies online speech regulation.
While the jokes fly about how this oversight board is Facebook’s Supreme Court, it’s no laughing matter, and a group of critics calling themselves “The Real Facebook Oversight Board” has taken it upon itself to conduct its own oversight, to “analyze and critique Facebook’s content moderation decisions, policies and other platform issues,” according to NBC News.
The newly formed board will scrutinize a broader range of issues in weekly public Zoom meetings. They include political advertising, networks of bot activity, the organization of "militias" through events pages, the dissemination of QAnon conspiracy theories through groups and the algorithmic amplification of disinformation.
Facebook, however, isn't welcoming the outside board.
When the company learned of it, a representative tracked down the people at the philanthropic investment firm Omidyar Network who had funded The Citizens to express the company's disappointment, said two people familiar with the situation who spoke on condition of anonymity to describe private matters.
The big picture: Pressure on Facebook to address misinformation and hate speech on its platform has increased ahead of the election.
Last week, the Stop Hate for Profit organization, a nonprofit aimed at pressuring social media companies to tackle hate speech and misinformation, entered the second phase of its boycott, targeting Facebook specifically. It worked to get dozens of celebrities to freeze their Instagram and Facebook accounts for one day.
The Financial Times reported Thursday that advertisers had struck a deal with Facebook and YouTube to address harmful content.
The bottom line: The tension between Facebook and accountability groups is increasing ahead of the election, and the company's independent oversight board is the latest target.
"Many groups have strong opinions on how Facebook should moderate content, and we welcome new efforts to encourage debate. The members of the Oversight Board are focused on building an institution that will make binding decisions on Facebook's most significant content issues," an Oversight Board spokesperson said.
So we now have a sanctioned oversight board and an unsanctioned oversight board and you know what? Neither will matter.
On the former, CNBC writes:
The Oversight Board said it expects to decide on a case, and for Facebook to have acted on this decision, within a maximum of 90 days.
This is laughable. Ninety days ago, for instance, was June 27. A lot has happened since then! A lie, it is said (by whom, we’re not entirely sure, but it looks like perhaps Jonathan Swift), will spread halfway around the world before the truth puts its boots on. Across social networks, the same is true of memes and disinformation. So how exactly will an oversight board adjudicate what should be moderated 90 days later?
Politico has a great look at how Facebook is wrestling with political ads, from putting in place a plan to block new political ads in the week before the election to reversing course and halting political ads at midnight on Nov. 4, after realizing that bad-faith actors might flood the zone with “We won!” ads before the election has actually been called. All of this is a form of content moderation.
As for the group of 25 very concerned people, what can they actually do beyond what they’re already doing as Facebook critics? How much power, how much change, can an unsanctioned, toothless oversight committee deliver?
The fundamental issue with both of these oversight groups is that Facebook is not a democracy governed by a checks-and-balances system; it’s a dictatorship overseen by one person. Zuckerberg controls every facet of the company.
The only way to have any meaningful oversight is for our government to step in. And the only way to impact Facebook, as we’ve learned, is to not have a Facebook and Messenger and Instagram and WhatsApp and Oculus account.
Thank you for allowing me in your inbox today and every day. If you have tips, thoughts on the newsletter, or want to have your own Media Nut oversight board, drop me a line. Or you can follow me on Twitter. If you enjoyed this edition, please consider sharing across your social networks (maybe it won’t get moderated?) and get your colleagues to sign up. Have a lovely weekend. Stay healthy. Stay safe. For those fasting, have an easy and meaningful fast. See you all on Tuesday.
Phish, “Avinu Malkeinu”
Some interesting links:
How Twitter survived its biggest hack—and plans to stop the next one (Wired)
Girls alleged abuse at reform school for years. It stayed open until they got on TikTok. (NBC News)
For media criticism:
Trump trying to steal the election is the only campaign story that matters (Press Run)
Ignore the strongman fantasies. If Trump loses the election, he'll lose his job. Period. (USA Today)
For editorial boards sounding alarms:
Trump’s contempt for truth leaves a toxic legacy around the world (WaPo)
For media mergers:
Penske Media paid $225 million for 80% stake in Hollywood Reporter parent (NY Post)
For media and financial literacy:
The Real State Of Unemployment Is Much Higher Than Official Numbers (Forbes)
For incredible data viz:
The actual number of Americans jailed or imprisoned, about 2.3 million (Matt Korostoff)
For revolving door:
Wolfgang Blau Exits Conde Nast in London (WWD)