The Lies Facebook Tells Itself (and Us)
Mark Zuckerberg informed us a few days ago that he would be rewiring our information landscape. Posts from friends and family move up in the rankings; news and media fall off. He made a few oblique references as to why and assured us in an insipid 533-word blog post that the changes would mean that the 50 minutes users spend on the platform each day would be “time well spent.”
Anyone who has been even partially sentient over the past few years has noticed how we have become shrouded in our filter bubbles, secure like never before in the complacency of our convictions. This certainty in the righteousness of our own point of view makes us regard a neighbor with a yard sign the way a Capulet regards a Montague. It seems to me that we suddenly hate each other a whole lot more than we ever did before.
So it should come as no surprise that the place where filter bubbles are the thickest, where the self-satisfied certitude that comes from unchecked power is iron-clad, is at the headquarters of Facebook itself. This was brought home to me when I read an interview with the head of Facebook’s News Feed product, Adam Mosseri, by the savvy tech blogger Ben Thompson.
Mosseri, who has been at Facebook for nearly a decade (eons in Facebook chronos), was eager to explain to an interviewer why this change was rational, normal, good for humanity (the company counts one quarter of humanity as monthly active users). The interview was quite a get for Thompson, and he published it in near-verbatim format. In so doing, he laid bare just how removed from the rest of humanity Facebook management is, and how blissfully ignorant they are about the consequences of their actions.
I refined my outrage into five points Mosseri makes (down from 15 initially) that illustrate the degree to which Facebook executives live in a world of their own making where the rest of us are expected to comply.
#1 The changes are for our collective “well-being”
The most glaring assumption that jumps out of this interview (as well as official Facebook communiqués) is that we are all asked to swallow Facebook’s incredibly vague gauge of “well-being,” or “meaningful social interaction.” In fact, these terms are sometimes tossed about interchangeably. (Zuckerberg uses “well-being” three times in his post.)
In the interview, Mosseri implies that Facebook is doing this for our own mental health, and that it’s based on extensive research. Interactions = good. Passively consuming content = bad.
Aside from the disturbingly paternalistic assumptions therein, can I ask how Facebook defines well-being? And, since they have done such extensive research, can they share it with the public transparently? Mosseri’s answer: “We’ll certainly consider it…” (Facebook has a blog post that discusses a few of its conclusions here.)
To me, this strikes at the heart of the peril posed by Facebook: The platform probably has more power over information (and perhaps even our well-being) than any company has ever wielded. And yet it engages in zero public debate about the changes it makes. It simply rolls them out. We are asked to buy Facebook’s version of meaningful, as in this Mosseri statement: “So if you and I had a back and forth conversation on a post from a Page, that would actually count as a meaningful social interaction.” Hence, it would get a higher rank in the algorithm, etc.
But is every such exchange “meaningful”? I can think of plenty of Facebook exchanges that merely raised my blood pressure. These are sweeping categories. Facebook has placed itself as the imperious custodian of our well-being, but tells us nothing about how it cares for us. And does it care about the side effects? Just ask independent journalists in Bolivia what happens when Facebook starts using them as guinea pigs in an experiment about their well-being: Their audience drops, and the government’s ability to control public opinion increases. And when they complain to Facebook, they get an automated reply email.
#2 “This change actually has very little to do with false news…”
Mosseri actually said that. But that’s not as stunning as what came next: “I will say that the amount of attention on false news specifically and a number of other integrity issues, certainly caught us off guard in a number of ways and it’s certainly been something we’ve tried to respond responsibly [to].”
Let’s unpack this. For more than a year, Facebook has been under scrutiny because there has been a flood of outright fake and misleading “news” coursing through its pipes. As studies have shown, people share fake news on Facebook, often more than the real stuff. The Pope endorsed Donald Trump? That spreads on Facebook. People get pissed. When the senior leadership at Facebook says this caught them “off guard,” I have to pick my jaw up off the floor. Inside the Facebook HQ, the filter bubble is thicker than a security blanket. They really believe that all they are doing is connecting people and fostering “meaningful interactions.” They are not playing Russian roulette with our democratic institutions or selling ads to people who want to burn Jews.
And this filter bubble is so impenetrable that they believe one minute that they have the power to manipulate our mood (they do) and are shocked the next when they get blowback for allowing people to manipulate our politics.
Then the last part: it’s “something we’ve tried to respond responsibly [to].” No, Facebook, you have not. The only responsible response after these revelations would be a massive overhaul of your system and a transparent conversation with the public and Congress about how your algorithm works. You have produced the information equivalent of a massive E. coli contamination. Instead, your response has been an under-funded effort to infuse fact-checking into the News Feed, and a 41% uptick in what you pay your lobbyists.
#3 “Does the scrutiny accelerate the process? It’s really hard to say.”
Yes, it does, and no, it’s not. This statement is in response to Thompson’s question about the criticism Facebook has received in the past year over its distribution of fake and misleading news, and whether that has prompted the company to assume greater responsibility over what its users see. Mosseri’s full response is here:
Here’s another counterfactual: Do you think the revelations about years of sexual abuse, assault and downright rape in the workplace by powerful men (Harvey Weinstein, Matt Lauer, Charlie Rose, etc., etc.) have accelerated the conversation about women’s rights and equity in the workplace? I mean, it’s possible.
So let’s assume that Facebook continues to post $4.7 billion in net income each quarter and its stock rises another 40 percent over the next 12 months (market cap at this writing is $517 billion), and there is no public criticism about fake news, targeting voters, and so forth. Absent any external pressure, do you think that Zuckerberg and the rest of the boys in senior management (and Sheryl Sandberg) take it upon themselves to head to a sweat lodge to probe their souls about whether the way they are redrawing the map of our information economy is good for humanity? Sure, that’s likely.
#4 Does Facebook have any responsibility toward media companies?
It’s a great question posed by Thompson. And the answer confirms my worst fears.
Mosseri’s initial response is anodyne enough: “I think we have a number of responsibilities.” News stories are important to people, he says. But then, just as quickly, he contorts himself into a pretzel to explain why it’s also not the case: “…news is a minority of the media content on Facebook, and media is a minority of the overall content in the News Feed.” Ergo, it’s not that big of a responsibility.
Two major fallacies here. The first: if there is less quantity, then there is less importance. My five-year-old niece’s recent birthday was a big hit on Facebook, as I imagine many other birthdays were that day. So is that more important to the Facebook community (read: humanity) than the SNAFU alert sent to all the residents of Hawaii warning of an imminent missile attack? The numbers tell us it is.
The second: reporting, writing and editing a news story of any import takes time, resources and skill. Hence, there will be many fewer of them than there are birthday posts. So if it’s a numbers game, news loses. This is what I’d call self-serving math.
#5 “… there’s understandably going to be a lot of anxiety…”
Here’s some more math: The Pew Research Center reports that 45% of Americans get news from Facebook, a percentage that has been increasing sharply. Why? Because that’s the product Facebook created. It designed itself for that.
As the algorithm tweaks fall into place, and news publishers stand by as their audience plummets, Mosseri concedes: “there’s understandably going to be a lot of anxiety … it’s always a set of trade offs, we do the best we can with the information at hand.” (You possess ALL the information, by the way.) These are not the words of someone who sees the news media as partners, but as pawns. A post is a post is a post.
But that’s not how this company has operated. Since it burst on the scene, not all that many years ago, it has dangled carrot after carrot in front of news media. Do your headlines this way and you’ll be rewarded. Hey, pivot to video! No, try our Instant Articles product (or else). And then, like Lucy yanking the football, it’s gone. Facebook has moved on.
The heart of the issue is that Facebook wields immense power and is subject to minimal accountability. Changes come when Zuckerberg decrees them. Yes, it’s a publicly traded company. Yes, Congress shall make no law … But the power is real and the accountability is not.
And with all this heft, and all this research, Facebook seems to understand so little about the news it serves up. Take, for example, this notion that commenting or reacting to news is what makes news valuable. Yes, that’s true some of the time, but it’s also false some of the time. Sometimes we read the news to be informed. To catch up. To be better citizens. That I didn’t share or like an article about climate change doesn’t mean that I don’t care about climate change.
To treat the value of news purely through the lens of whether people have shared it or had “meaningful interactions” with other members of the Facebook “community” misses the value entirely.
And, dear Facebook, sharing and commenting on every piece of news is actually part of the problem: it is what has thrust news and journalism into this hyper-partisan shithole we’re in right now.
I have only one wish for Zuckerberg. In a few short years, he will be the father of a girl in her tweens. I can only assume that she, too, might become obsessed with the Instagram posts of her friends, whether they liked her pic, or that she might discover that everyone is hanging out without her. And it might drive her to tears. And then her wise parents will decide (unilaterally) that they need to limit her screen time to 30 minutes. It’s for her own well-being, after all.