Sunday, May 22, 2016

Douthat on the Great Facebook Massacre

Image via Michael Ferry at First Draft.
So apparently deep in the subterranean regions of the Facebook world, the News Curators toil, unseen and unappreciated, writing their headlines and teasers for the trending topics in the upper right of your timeline, and a little bit more. It turns out that the Trending Module algorithm doesn't work quite as well as advertised, or, more to the point, as well as Twitter's, and it needs to be goosed from time to time; one of the curators' jobs is to "inject" stories that fail to trend on their own into the mix, from the front pages of ten big-time sources (CNN, the New York Times, BBC, and the like), or just stories that seem so serious it's embarrassing not to have them (the disappearance of Malaysia Airlines flight 370, the killings of the Charlie Hebdo staff, Syria stories, #BlackLivesMatter). They're also empowered to switch a story from its coverage in a less respectable source like RedState or Breitbart to a more trustworthy one, or to deactivate it altogether if not enough sources are covering it.

Perhaps because their backgrounds are more literary than techie, they're not regarded as real Facebook people but "disposable outsiders". They're not employees but contract workers, like proofreaders or Uber drivers, supplied by a temp agency, a dozen or more squeezed into makeshift quarters in the New York offices. Though they have fancy degrees from Ivy schools and résumés from the New York publishing scene, they are not invited into the Facebook world. If there's an 8:00 happy hour for the proper employees, the News Curators aren't invited; they keep working, into the night. The turnover is pretty high. They believe that their real function is to train the algorithm—one day it will know how to do their jobs and they'll all be fired.

And it's possible that they act out, we're told. The political liberals on the team are suspected of acting out by "deactivating" a topic they don't like from time to time: IRS official Lois Lerner, Wisconsin governor Scott Walker. The conservative on the team acts out by ratting out his colleagues to Gizmodo.

Upon which, as we know, the conservative world goes nuts. And not, by the way, with the idea that Facebook falsely represents the stories, which it obviously doesn't, but that it falsely represents whether they are really "top" or not. Facebook is not accused of spreading falsehoods about Lois Lerner; it's accused of spreading falsehoods about whether people are reading about Lois Lerner, maybe. They were giving the impression that the Lois Lerner "scandal" was less trending than it really was. Millions of readers may have been misled into believing that other people were less interested in reading about the Lois Lerner "scandal" than they actually were. It may have been, if it was happening, a biased report of the story's trendingness. How can people possibly appreciate the evil of Lois Lerner if they don't even know how popular it is to read about her?

I got a much clearer picture of what's going on in this story from Monsignor Ross Douthat, Apostolic Nuncio to 42nd Street, in his column today, "Facebook's Subtle Empire". Hahaha, I kid, I got it from those Gizmodo stories, to which he links, but he doesn't seem to have read them himself, or he'd have a clearer picture too.

The column is an awful mess. It begins with a standard declinist narrative, about how in the good old days everybody knew what the news was and what it meant, but now we all choose our own news according to our own ideological and other proclivities—

Once there were three major television networks, and everyone believed what Walter Cronkite handed down from Sinai. Then came cable TV and the talk radio boom, and suddenly people could seek out ideologically congenial sources and tune out the old mass-culture authorities. Then finally the Internet smashed the remaining media monopolies, scattered news readers to the online winds, and opened an age of purely individualized news consumption.
But it then develops a counternarrative in which the decline works the opposite way, toward ever greater monopoly control, in

a new era of media consolidation, a return of centralized authority over how people get their news. From this perspective, Mark Zuckerberg’s empire has become an immensely powerful media organization in its own right...
Here I think there is a deliberate smushing together of two fundamentally different kinds of power: the editorial power, which Facebook exercises with an unprecedentedly light hand, trying as much as possible to put it into the hands of natural selection (and as much as possible not to pay for it), and the business power, with which Facebook tries to force you to consume whatever news you want from inside its environment, gobbling up distinguished content providers like the New York Times and Buzzfeed, which must now post on the inside of Zuckerberg's machine.

Douthat is in hyper–just-saying mode today. He's certainly not saying there is any attempt on Facebook's part to bias the news in a leftward direction, but some say there is, and they ought to be taken seriously, not necessarily by Douthat, he's just saying:

Between the algorithmic character of (much of) its news dissemination, the role of decentralized user choice, and the commercial imperatives of personalization, there’s little chance that the Facebook experience will ever bear the kind of ideological stamp that, say, the Time-Life empire bore in Henry Luce’s heyday.
But [celebrity plagiarist and former secretly paid shill for the Malaysian government Ben] Domenech is right that Zuckerberg’s empire still needs vigilant watchdogs and rigorous critiques.
Not that Domenech anywhere says that, by the way: the linked piece is an attack on Glenn Beck and Erich Erichssohn for not being angry enough with Facebook ("the complaints of useful idiots...stupid...dull-witted retromingency") and makes no assertions about what it needs, although it does say what Ross just denied, that the Facebook experience bears an ideological stamp:
Facebook curates the news; it is a news source for the vast majority of Americans. It put its trending algorithm forward as a source of news, with the false impression given that it accurately represented the trending topics of the Facebook community. Instead, it warped these results according to their ideological framework and their biases to falsely represent the top stories of the day.
Although at this point we're bleeding meaning pretty quickly, and the patient may not survive. It, Facebook, warped the results of the algorithm according to whose ideological framework and biases? The results, or the topics? What?

Ross acknowledges that any bias is going to be virtually undetectable, which means, in a technical sense, that it will be virtually unbiased:

True, any Facebook bias is likely to be subtler-than-subtle.
But he insists it will still be immensely powerful in its invisibility:

because so many people effectively live inside its architecture while online, there’s a power in a social network’s subtlety that no newspaper or news broadcast could ever match.
Power to do what? What are you trying to say?

Consider, for instance, the reported conversation at a Facebook meeting about whether the company might have an obligation to intervene against a figure like Donald Trump — something that a tweak of its news algorithm or even its Election Day notification could theoretically help accomplish.
Well, theoretically, but no conversation was reported, for one thing: what was reported was an employee poll on questions they'd like to ask Mark Zuckerberg at a weekly meeting on March 4. (The question "What responsibility does Facebook have to help prevent President Trump in 2017?" was the fifth most popular, with 61 votes, compared with 143 votes for no. 4, "As company growth slows, do you worry our ability to provide engineers with growth opportunities will degrade?", and so on.) It isn't known whether he answered the question at that meeting or not, but he did say, at the F8 developer conference the following month,
“I hear fearful voices calling for building walls and distancing people they label as ‘others,’” Zuckerberg said, never referring to Trump by name. “I hear them calling for blocking free expression, for slowing immigration, for reducing trade, and in some cases, even for cutting access to the internet.”
which suggests he's not too worried about saying how he feels openly and directly.

Ross sees hidden propaganda everywhere, and deploys a telling analogy:

Virtual architecture tells stories no less than the real variety: Like stained-glass windows in a medieval cathedral, even what seem like offhand choices — like Google’s choice of its Doodle subject, to cite a different new media entity — point people toward particular icons, particular ideals.
The word "propaganda" comes to us directly from the Roman Church and its Sacra Congregatio de Propaganda Fide (Sacred Congregation for the Propagation of the Faith), founded by Pope Gregory XV in 1622, and Ross knows that. Still, the disposition of the images in cathedral windows has always been very carefully planned and crafted, part of the construction of a whole teaching architecture for a huge and more or less fixed body of doctrine. Nothing can be subtler-than-subtle at the same time as it's tossed off by a couple of disgruntled 20-somethings in the course of an afternoon's wage-slavery.

The Google Doodle tells you very openly what the creators value. The idea that Facebook can conspire to infect its vast audience with some kind of ideological predisposition that the audience isn't even directly aware of is ridiculous. Ross knows it is, too, and his over-the-top emotional conclusion can't hide that:

So even if you don’t particularly care how Facebook treats conservative news sources, you should still want its power constantly checked, critiqued and watched — for the sake not just of its users’ politics, but their very selves and souls.
Selves and souls? Are you sure? Because you really haven't made a case for that.

I don't use Facebook News myself, ever—I spend a lot of time with Twitter, but I don't really use Facebook at all, to tell the truth, other than for the birthday-greetings function for all the old friends I haven't seen in decades—and I don't really care what happens to it, but if there's a problem I have a couple of suggestions. First, nobody should be getting their news from a single source, Facebook included (they mostly don't, fortunately). Second, if those News Curators are cutting up, Zuckerberg ought to try treating them more the way he treats the engineers and programmers, with a little respect. Hell, if he pays them enough he might even get more than one conservative on the staff; those guys don't generally work cheap. Just saying.
