Opinion: Grading Facebook’s Homework

For months, Facebook dodged questions about why CrowdTangle — an analytics tool from a company Facebook acquired in 2016 — shows that Facebook posts by hyper-partisan conservative pundits consistently and significantly outperform content representing other political views and major news organizations.

Facebook just rolled out a new metric that is supposed to explain the company’s apparent bias in favor of conservative content.

Well, rolled out is a bit of an overstatement. Instead of historical data and a way to track the metric going forward, all we got so far is a single snapshot of what it looks like right now, and even that snapshot still shows the same bias clear as day.

Still, a debate about how exactly to measure reach would be a convenient distraction from the things Facebook has said and done recently that can’t be explained away by metrics:

  1. After requesting ranking algorithm tweaks to reduce the distribution of progressive news publishers, Facebook policy leaders protected conservative partners from misinformation strikes while liberal pages received no such protection.
  2. Even after receiving 455 reports, Facebook refused to take down a conservative page until after the event it organized turned into a deadly shootout.
  3. In sworn testimony to the United States Senate, Zuckerberg claimed to be unaware of the internal study showing that 64% of extremist group joins are due to Facebook’s recommendation tools, even though he previously criticized the WSJ coverage of that study in an internal Q&A with employees.

Easy to see why Facebook PR prefers to focus on CrowdTangle: can’t perjure yourself with misleading math.

Without further ado, here’s my review of Emperor Mark Augustus Zuckerberg’s new clothes.

“The vast majority of what people in the US see on Facebook”

This is as helpful as “the vast majority of the time, an abusive husband isn’t beating his wife.” Facebook is not being criticized over the percentage or the total amount of political content. Our concern is the impact that misinformation and hate speech on Facebook have on US democracy and global peace: Facebook has the tools to curb them, refuses to use them, and, worse, wields those tools to further the apparent political preferences of its leadership and its owner.

“Political content makes up about 6% of what you see on Facebook.”

This is as misleading as “our AI systems proactively identify 90% of hate speech we remove” (a figure relative to just the hate speech Facebook chooses to remove, which consists in large part of what cis white men report as hate speech against them by underprivileged minorities).

This metric has too many dimensions to get away with a single percentage decoupled from the baseline it’s measured against.

6% relative to what? Content seen by all people in the US combined? Content seen by an average person? How many people see more than 6% of content about politics? How many see more than 50%? How many see no content about politics at all? Is it per session? Per day? Per month? Over the entire lifetime of a person’s Facebook account?
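To make the point concrete, here is a toy sketch with made-up numbers (not Facebook’s data): two populations can produce an identical headline average while the actual user experience differs completely.

```python
# Hypothetical numbers, purely for illustration: the same "6% average"
# can describe radically different distributions of political content.
even_spread = [6, 6, 6, 6, 6]     # every user sees 6% political content
concentrated = [0, 0, 0, 0, 30]   # four users see none; one sees 30%

def mean(values):
    return sum(values) / len(values)

print(mean(even_spread), mean(concentrated))  # both print 6.0
```

A single percentage hides which of these worlds we live in, which is exactly why the baseline matters.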

Facebook is notoriously data driven. You have the numbers, why are you giving us vague generalizations? Is it because closer examination of the data doesn’t fit your narrative? Or because you don’t want to be held to this new metric — or any metric — the way you’ve been held to the CrowdTangle data?

The 2016 presidential election was decided by a 0.56% margin across 3 states. Facebook just bragged about increasing voter turnout by 2.8%, more than the margin in every state Biden flipped this year. 6% might look like a small number, but in politics it can be a lot.

“CrowdTangle [shows] which Page posts are engaged with the most, what content will get likes, comments and reshares. It is not designed to show what is being seen the most.”

You glossed over the part where you were supposed to explain how “what is seen the most” matters more than “what is engaged with the most” when it comes to analyzing Facebook’s influence on elections, democracy, what people believe, and what they do.

I don’t think you can explain that. Not when you train your own customers to define ad efficiency in terms of click-through ratio, conversion rate, persuasion lift, and other indicators of how likely it is that people who see a piece of content will engage with it.

“Likes and comments don’t equate reach.”

It’s admirable that you’ve finally acknowledged the importance of the concept of reach, but you don’t get to redefine it as whatever growth metric provides the most palatable outlook this week.

“Our ranking models include much more than just engagement.”

Wait, did you just try to define reach as whatever formula Facebook uses on any given day for ranking the News Feed? And Stories, and Search, and Watch, and other product surfaces that each have their own ranking algorithms? Please tell us more. No, seriously, tell us what those algorithms are. What’s the point of holding yourself to a standard that nobody but you gets to see?

“Engagement with all posts (and not just posts with links)”

I don’t think this new table shows us what you think it shows:

  1. You had an allegedly more comprehensive engagement metric this entire time and you held it back from the public? Why?
  2. The top 2 pages, and 6 of the top 10, are conservative, and the President-elect, who as of the current count won the popular vote 50.7% to 47.5%, gets 6x less engagement than his opponent.

The bias is still there.

“How many people actually see the content. We call this reach.”

The percentage of people who, over the course of a week, saw anywhere from a single post to an entire feed full of content from a page is meaningless as a metric of Facebook’s bias and influence in politics, because it is completely disconnected from the one and only reason to post political content on Facebook: to get people to engage with it, internalize the narrative, and act on it.

“Engagement does not predict reach.”

Not when you define reach like that, it doesn’t. That’s why you don’t sell this notion of “reach” to your advertisers. Instead, you sell an absolute count of impressions, and tie it to engagement, persuasion, and other things people are willing to pay money for.

You don’t get to measure bias in a meaningless metric disconnected from reality. You can only measure it in the amount of value you enable different kinds of people to derive from your platform. Value as in “things people are willing to pay money for.”

“Here are lists of the US news publishers”

US news publishers: “Facebook, how come the biggest of us get less engagement than some fringe conservative influencers?”

Facebook: “Why don’t you instead look at this table that compares y’all to each other and not to fringe conservative influencers?”

“A smaller effect appears to have been due to the temporary measures we put in place to address potential misinformation and delegitimizing content on our platform related to the election.”

Thank you for admitting that:

  1. Conservative pages are more likely to spread misinformation and delegitimize elections.
  2. Facebook has the tools to curb misinformation.
  3. Facebook chose not to deploy these tools until the last few days of a two-year-long election cycle.

This shows that you are aware that your decision not to curb misinformation the rest of the time favors conservative pages.

“We need to have independent research to understand our role in elections.”

You already commissioned research to understand your impact on civil rights. You have failed the audit and you have refused to implement all but the most superficial of its recommendations.

You already committed to launching an Oversight Board, and then made sure neither the schedule nor the scope of its mission could interfere with whatever you wanted to do during this election.

And when the people who found your civil rights record problematic tried to launch an actual independent oversight board, you attacked them.

Final Grade: Emperor’s wardrobe is unable to function in compliance with Facebook Community Standards section 14.


This article was originally published on November 10, 2020 at https://medium.com/@angdraug/grading-facebooks-homework-378e5bf3d2e8.
