PERSONALITIES WITHOUT PEOPLE. Guest Post by Katherine Behar!

Editorial note: The Occulture is delighted, in the wake of the Cambridge Analytica revelations, to publish Katherine Behar’s extremely lucid diagnosis of the dividualizing mechanisms increasingly modulating collective affordances. This paper was originally presented at Tuning Speculation V last November. Thanks to Katherine for her rapid-response, hereby chronoportated!

xenopraxis

______________________________________

I’d like to begin by vibrationally exchanging some gratitude as yet further creative debt to The Occulture. Thank you, David, Ted, Marc, Eldritch, and Rebekah. This conference is one of the highlights of my year, and so I’ll start with your words, not my own.

In summoning us this weekend, you appealed to the “impossible, imaginary, and/or unintelligible.”[1] Yet, the ideas that I intend to outline this morning hinge on the notion that the impossible and imaginary are more and more incompatible with the unintelligible. I’ll ask whether, again borrowing your excellent words, the “possibly impossible, and likely unlikely” is, by now, too like the “unknown unknowns and known unknowns” prophesied by Donald Rumsfeld, and so all too vulnerable to militaristic and capitalist cooptation and the general baseness of exchange.

Arguably, quivering indeterminacy is all the more valuable to today’s growing arsenal of speculative/predictive/data-based/probabilistic/etc. exchange practices. And by this I mean not only financial exchange practices, but also social ones. I worry that, as a result of probability’s marketplace hegemony on both fronts, it is today becoming impossible to be anything other than possibly? I’ll argue that this has interesting consequences: on an ontological level, for human/nonhuman distinctions, and on a pragmatic one, for politics.

So. My paper today is called “Personalities Without People.” Some Tuning Spec repeat offenders may recall a talk I gave last year in the immediate wake of the U.S. presidential election. That talk concerned the then barely emergent phenomenon of fake news. (And what a year it’s been.) Here I’ll be picking up and picking at a stray thread that I found and ignored in the course of my research last year, that being psychometrics. Secondarily, by approaching data and network ecologies through object-oriented feminism, I’ll try to expose what I see as a feminist quandary within the logics of psychometrics, and predictive data practices in general.

So let’s begin.

In the dazed days following the U.S. presidential election, I was reading obsessively about malignant clickbait and fake news on Facebook when I came across a New York Times article by McKenzie Funk titled “The Secret Agenda of a Facebook Quiz.”[2] And so I stumbled into the orbit of “psychometrics” and a company named Cambridge Analytica.

Cambridge Analytica has surfaced in mainstream news of late, so many of you have probably heard of this data firm hired by the Trump campaign. In summary, a few high notes:

The House Permanent Select Committee on Intelligence, the Senate Intelligence Committee, Special Counsel Robert Mueller’s Trump-Russia probe, and now the Senate Judiciary Committee are all investigating Cambridge Analytica. Most recently, the firm made headlines on October 25th, when news broke that CEO Alexander Nix had reached out to Julian Assange, seeking to team up with WikiLeaks to “help” release Hillary Clinton’s famous missing emails.[3]

Cambridge Analytica is a U.S.-based big data firm that “uses data to change audience behavior”[4] and it specializes in political campaigns,[5] drawing in part on the “psy ops” defense contracting work of its shadowy British parent company, SCL Group.

Cambridge works mostly in support of right-wing political campaigns in the U.S. and Britain, although none other than Hillary Clinton has noted that September’s overturned Kenyan election was also a Cambridge “project.”[6] Before the Trump campaign, the firm worked on Republican campaigns for Ted Cruz and Ben Carson, as well as the pro-Brexit Leave.EU campaign, among others.[7] And they are also under investigation in the UK.[8]

There’s plenty more to say, particularly about the company’s funders and board members. Rather than indulge my inner conspiracy theorist, and in the interest of time, I’ll refer you to Carole Cadwalladr’s remarkable series of articles on the company in The Guardian, which I highly recommend.[9]

For now, a word of caution: it is a fact that the Trump campaign hired Cambridge Analytica and that Cambridge sent three staff members to San Antonio, where they worked with Brad Parscale, the campaign’s digital director. Cambridge publicly identifies itself with psychometrics (or, interchangeably, “psychographics”) and boasts of having 5,000 data points on each of 230 million Americans. Nevertheless, Parscale, who was interviewed by the House Intelligence Committee on October 24th, has repeatedly insisted that in its work for the Trump campaign, Cambridge did things other than psychometrics and used data other than its own, specifically data obtained above board from the RNC.[10] My skepticism is beside the point. What interests me is less whether psychometrics is unsavory, or has been applied toward politically disagreeable ends, than what it suggests about identity and political subjecthood right now.

So what is psychometrics, and what is Cambridge Analytica up to?

According to Funk, “Cambridge Analytica … has been using Facebook as a tool to build psychological profiles that represent some 230 million adult Americans … by seeding the social network with personality quizzes.”[11]

We’ve all seen them. These quizzes ask innocuous questions like, Do you have a vivid imagination? Do you have a sharp tongue? Do you often feel blue? Do you get chores done right away? Do you like art?[12] The quizzes are a key ingredient of Cambridge Analytica’s special sauce,[13] which combines personality trait-based psychological profiling, micro-targeted advertising techniques, political messaging, and, of course, big data.

When Facebook quizzes ask if we feel blue or promptly do our chores, they are measuring our psychological traits through a metric known as the five-factor model, which assesses the “Big Five” personality traits known by the acronym OCEAN: openness, conscientiousness, extroversion, agreeableness, and neuroticism. Developed in the 1980s, the Big Five metric indicates an individual’s psychological character, priorities, and likely future behavior, and is now the psychometric gold standard. (As an aside, psychological typologies are far older, dating back to Sir Francis Galton, the father of eugenics.)
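To make the scoring mechanics concrete, here is a minimal sketch, in Python, of how Likert-style quiz answers might be folded into an OCEAN profile. The items, keying, and scale are hypothetical; this is a generic five-factor scoring pattern, not Cambridge Analytica’s actual instrument.

```python
# A minimal, illustrative sketch of Big Five (OCEAN) scoring from Likert-style
# quiz answers. Item wording and keying are hypothetical, not any firm's
# actual instrument.

from statistics import mean

# Each item: (question, trait it loads on, reverse-scored?)
ITEMS = [
    ("I have a vivid imagination.",   "openness",          False),
    ("I get chores done right away.", "conscientiousness", False),
    ("I am the life of the party.",   "extroversion",      False),
    ("I have a sharp tongue.",        "agreeableness",     True),   # reverse-keyed
    ("I often feel blue.",            "neuroticism",       False),
]

def score_ocean(answers):
    """answers: list of 1-5 Likert responses, one per item in ITEMS."""
    by_trait = {}
    for (_, trait, reverse), a in zip(ITEMS, answers):
        value = 6 - a if reverse else a          # flip reverse-keyed items
        by_trait.setdefault(trait, []).append(value)
    return {trait: mean(vals) for trait, vals in by_trait.items()}

print(score_ocean([5, 2, 4, 4, 3]))
# {'openness': 5, 'conscientiousness': 2, 'extroversion': 4,
#  'agreeableness': 2, 'neuroticism': 3}
```

The point is simply that a handful of innocuous answers collapses into five numbers, and those five numbers travel well.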

But psychometrics caught my attention for other reasons, having to do with object-oriented feminism (OOF).

As it happens, psychometric personality attributes are a near-perfect example of secondary qualities: attributes of objects that, in object-oriented theories like object-oriented feminism, become objects in their own right.[14] Gender, race, class, and so on are attributes that, by referring back to the human subject, once formed the basis of subject-oriented identity politics, as well as of demographics. But! These secondary qualities of people-objects are becoming detachable, remixable, independent objects. And the same goes for qualities like personality types. They could be arranged in a formation that looks like, or centers on, a human individual. But they could just as easily be organized otherwise, taken on their own, without a person in their midst.

This mutability is self-evident not only in psychometrics but also in the data practices we live with day-to-day. The digital systems that saturate and structure our lives quietly repeat this logic to us, with every step we and our step-counters take… joined as we are, at the hip, though only loosely so. I think this is part of a greater collapse, under the weight of data, of the personal or human, which can no longer be neatly isolated from the data-driven or nonhuman.

Even the phrase “identity theft” captures colloquial awareness that something as fundamental as identity is no longer a sure thing. Who we are has become a bad bet: a cluster of data points, at risk of dispersal, falling apart at the seams. Identity theft is scary enough, but try flipping this logic: What if the identity persists and we’re what’s lost?

In “The Data That Turned the World Upside Down,” Hannes Grassegger and Mikael Krogerus follow the story of Michal Kosinski, a Polish psychologist now based at Stanford.[15] In 2008, Kosinski posted “MyPersonality,” a Big Five questionnaire in the form of a Facebook app in an attempt to collect some data for grad school. Innocent enough. He soon had millions of respondents who took the quiz and gamely agreed to donate their profiles for his research. He went on to use this gigantic data set to hone predictive models correlating personality quiz results with Facebook user data, achieving unprecedented levels of accuracy.

Grassegger and Krogerus summarized his published research:

Slide: Kosinski study on Bloomberg

He could use 68 Facebook “likes” to predict skin color, sexual orientation, and party affiliation.

Before long, he was able to evaluate a person better than the average work colleague, merely on the basis of ten Facebook “likes.” Seventy “likes” were enough to outdo what a person’s friends knew, 150 what their parents knew, and 300 “likes” what their partner knew. More “likes” could even surpass what a person thought they knew about themselves.[16]

Eventually he could use just the number of profile pictures or contacts, or smartphone motion-sensor data, to “ascribe Big Five values.”
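For a feel for the mechanics, here is a toy sketch of like-based trait prediction. Kosinski and colleagues’ published models, as I understand them, reduce an enormous user-by-like matrix before regressing traits on it; this sketch substitutes synthetic data and a plain logistic regression over binary like indicators, purely as an illustration of the general shape of the approach.

```python
# A toy sketch of like-based trait prediction: synthetic data, a plain
# logistic regression, and a binary "high openness" target, purely for
# illustration. (Published work reduced the user-by-like matrix first.)

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_likes = 5_000, 300

X = rng.integers(0, 2, size=(n_users, n_likes))               # user-by-like matrix (1 = liked)
true_w = rng.normal(0, 1, n_likes)                             # hidden "signal" carried by some likes
y = (X @ true_w + rng.normal(0, 3, n_users) > 0).astype(int)   # e.g. high vs. low openness

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```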

Kosinski now suspects that Cambridge Analytica’s strategies for influencing elections are based on his methods. The firm created their own quizzes to harvest Facebook data in combination with data acquired from commercial brokers. They drive tremendous offline IRL political action by isolating and influencing uniquely identified individuals using predictive personality typing. Built on top of Kosinski’s predictive data tools, their method takes advantage of Facebook’s massive user base, its permissive privacy policies, and microtargeting.

Microtargeting in Facebook’s advertising platform relies on “dark posts,” newsfeed content items that are seen only by specified users, remaining invisible to everyone else. In his New York Times article, Funk explains how microtargeting of political messaging seeks

to push the exact right buttons for the exact right people at the exact right times. [… For example] a pro-gun voter whose Ocean score ranks him high on neuroticism could see storm clouds and a threat: The Democrat wants to take his guns away. A separate pro-gun voter deemed agreeable and introverted might see an ad emphasizing tradition and community values, a father and son hunting together.[17]
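Purely to illustrate the routing logic in Funk’s example, here is a hypothetical rule-based sketch. The thresholds, ad copy, and function are invented; a real system would presumably select and weight variants statistically rather than by hand-written rules.

```python
# A hypothetical, rule-based sketch of how an OCEAN profile might route a
# single issue (gun rights) to different ad framings, following Funk's
# example. Thresholds and copy are invented for illustration.

def pick_gun_ad(ocean):
    """ocean: dict of trait -> score on a 1-5 scale."""
    if ocean.get("neuroticism", 0) >= 4:
        return "threat framing: storm clouds, 'they want to take your guns'"
    if ocean.get("agreeableness", 0) >= 4 and ocean.get("extroversion", 5) <= 2:
        return "tradition framing: father and son hunting, community values"
    return "generic second-amendment message"

print(pick_gun_ad({"neuroticism": 4.5, "agreeableness": 3.0, "extroversion": 3.0}))
print(pick_gun_ad({"neuroticism": 2.0, "agreeableness": 4.2, "extroversion": 1.5}))
```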

In this way, psychometrics reveals a significant shift. Descriptive demographics are giving way to predictive psychometrics, probabilistically modeled on personality types.

For example, according to Grassegger and Krogerus, Cambridge Analytica created 32 personality types and focused on 17 states for the Trump campaign, using data models to isolate the groups predicted to be most actionable and to feed them hundreds of thousands of ad permutations in dark posts. The targeted groups included potential Trump voters as well as potential Clinton voters, like residents of Miami’s Little Haiti neighborhood, who saw dark posts about the Clinton Foundation’s difficulties following the Haiti earthquake, in an effort to persuade them to stay home from the polls.[18]

On October 27th, under pressure from critics and lawmakers, Facebook announced a new policy for political advertising, intended to add transparency to political dark posts.[19] In part, they plan to create a tool for users to see all of the ads an advertiser has sent to isolated user populations on the platform.[20]

While this policy revision is significant, it strikes me that the extraordinary algorithmic variability in these messages would make it impossible for any person to view every ad. Is it going too far to suggest that in their sheer number, or dare I say their capacity for speculative tuning, the totality of ads is not well suited for human consumption? What, if not “possibly impossible and likely unlikely,” is the human feat of swallowing, never mind digesting, all of this datic potential?

Parscale used Cambridge Analytica tools to inundate likely Trump supporters with Facebook ads, tested in real time for effectiveness and drawn from “100,000 distinct pieces of creative content.”[21] They ran 40,000–50,000 variants of ads every day.[22] Human or nonhuman, this over-the-top variability and customization in political content is a stunning repudiation of demographics, which assumes commonality, and which Nix dismisses as

A really ridiculous idea. The idea that all women should receive the same message because of their gender—or all African Americans because of their race.[23]

Now, this is a fairly radical statement, one that you might expect to hear from a woke intersectional feminist, not from the CEO of this company.

This makes me wonder, uncomfortably, if intersectionality itself might be understood as close to woke data mining.

They both participate in a broader trend toward parsing the personal with infinite granularity.

Consider recent social media mob attacks in the name of intersectionality, like those against painter Dana Schutz or philosopher Rebecca Tuvel, which exemplify what I call “intersectionality done badly,” in that they make a particular arrangement of secondary qualities a precondition for communication. Without a perfectly—and possibly impossible—symmetrical data match, solidarity is shut down.

Times are strange. Right-wing white supremacists in khakis are rallying around “identity politics,” and leftist gatekeepers are silencing outside opinions by invoking “intersectionality.” Both result in greater isolationism, which I suspect is bred of defensiveness—perhaps an intuition that our own secondary qualities are abandoning us.

As concerning as some of Cambridge Analytica’s practices may be, this isolationism across the spectrum suggests to me a larger pattern that is politically agnostic. My hunch is that the exhausting rise of networked data practices contributes to these vehement reassertions of an overbearingly autonomous (hence disconnected) self. Constant data transactions subtly reshape our self-conception as probabilistically computed, contingent, always available, and at risk. In exchange for tantalizing personalization, personhood is reduced to fragile data constellations requiring continuous maintenance to cohere.

Predictive models do pinpoint people and do produce personalization. But now the metrics themselves are becoming stand-in political subjects. Unmoored from the individuals they once defined, personality types are gaining autonomous agency. The metrics attract politicians’ deference even though they can only be probabilistically or probably? correlated to any person.

What’s more, with Ocean and microtargeting, as Grassegger and Krogerus point out,

“it also works in reverse: not only can psychological profiles be created from your data, but your data can also be used the other way round to search for specific profiles.”[24]

This reversibility is the independence of secondary qualities—attributes as objects with a logic of their own.

They write, “Essentially, what Kosinski had invented was sort of a people search engine.” But notwithstanding real names and addresses, there is no actual person to be found who is “conscientiousness.” People are varying combinations of all five factors. Moreover, speaking for myself, my neuroticism or agreeableness fluctuates wildly based on things like “proximity to lunch,” and don’t get me started on how variously I might answer the question “Do you like art?”
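A minimal sketch of what such a reverse lookup might amount to, assuming a store of estimated OCEAN profiles: rank users by their distance from a target profile. The users, scores, and Euclidean distance are invented for illustration.

```python
# A minimal sketch of the "reverse" direction: given a target OCEAN profile,
# rank known users by how closely their (estimated) profiles match. User data
# and the distance metric are invented for illustration.

import math

users = {
    "user_a": {"O": 4.5, "C": 2.0, "E": 3.5, "A": 4.0, "N": 4.4},
    "user_b": {"O": 2.1, "C": 4.3, "E": 1.8, "A": 3.9, "N": 2.2},
    "user_c": {"O": 3.8, "C": 3.0, "E": 2.2, "A": 4.5, "N": 4.1},
}

def distance(p, q):
    return math.sqrt(sum((p[t] - q[t]) ** 2 for t in "OCEAN"))

def find_profiles(target, k=2):
    """Return the k users whose estimated profiles sit closest to the target."""
    return sorted(users, key=lambda u: distance(users[u], target))[:k]

# e.g. "find me anxious, open people": a high-N, high-O target
print(find_profiles({"O": 4.5, "C": 2.5, "E": 3.0, "A": 3.5, "N": 4.5}))
# ['user_a', 'user_c']
```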

People congregate in waffling middles, yet isolationist extremism is undeniably on the rise. Is it the sway of data pushing us, like a centrifuge, to the outer edges of this media ecosystem?

This is exactly the type of polarization that Russian operatives at the Internet Research Agency, a Kremlin-linked St. Petersburg troll farm, sought to sow, through dark post advertising, as well as fake Facebook accounts moored in charismatically exaggerated false personalities.[25] As Jonathan Albright, of the Tow Center for Digital Journalism, has shown, by posting viral content on so-called “hot button topics” and relying on organic reach, the trolls perverted the intricacies of intersectional communities, and funneled the trust of users who self-identified with content into ever greater extremism, specifically because they calculated that it would lead to apathy and inaction.[26]

Put another way, both Ocean personalities and trolls are nonhuman. They’re in, of, and for data. Psychometrics can only indicate or find abstractions, like “openness,” and anticipations, meaning probabilistic future actions, presumed likelihoods, or possible trends in the data. So, insofar as any person is ever more than a pattern in data, what psychometrics finds with Kosinski’s reverse-lookup database is personalities without people—shells or placeholders for a self. And what is this abstraction of pure personality? Maybe a troll.

[1] The Occulture, “Tuning Speculations V: Vibratory (Ex)changes,” The Occulture, March 28, 2017, http://www.theocculture.net/tspecv/.

[2] McKenzie Funk, “The Secret Agenda of a Facebook Quiz,” The New York Times, November 19, 2016, https://www.nytimes.com/2016/11/20/opinion/the-secret-agenda-of-a-facebook-quiz.html.

[3] Betsy Woodruff, “Trump Data Guru: I Tried to Team Up With Julian Assange,” The Daily Beast, October 25, 2017, https://www.thedailybeast.com/trump-data-guru-i-tried-to-team-up-with-julian-assange; David Smith, “Julian Assange confirms Cambridge Analytica sought WikiLeaks’ help,” The Guardian, October 25, 2017, https://www.theguardian.com/us-news/2017/oct/26/julian-assange-confirms-cambridge-analytica-sought-wikileaks-help.

[4] Cambridge Analytica, “Cambridge Analytica: Data drives all that we do,” accessed November 15, 2017, https://cambridgeanalytica.org/.

[5] Cambridge Analytica, “CA Advantage | CA Political,” accessed November 15, 2017, https://ca-political.com/ca-advantage.

[6] Abdi Latif Dahir, “Hillary Clinton says Kenya’s annulled election was a ‘project’ of a controversial US data firm,” Quartz, September 19, 2017, https://qz.com/1081021/hillary-clinton-says-trump-linked-cambridge-analytica-had-role-in-kenyas-annulled-election/.

[7] Funk, “Secret Agenda.”

[8] See Carole Cadwalladr, “The great British Brexit robbery: how our democracy was hijacked,” The Guardian, May 7, 2017, https://www.theguardian.com/technology/2017/may/07/the-great-british-brexit-robbery-hijacked-democracy; see also Carole Cadwalladr, “British courts may unlock secrets of how Trump campaign profiled US voters,” The Guardian, September 30, 2017, https://www.theguardian.com/technology/2017/oct/01/cambridge-analytica-big-data-facebook-trump-voters.

[9] See articles linked from Carole Cadwalladr’s Guardian profile page, accessed November 15, 2017, https://www.theguardian.com/profile/carolecadwalladr.

[10] Issie Lapowsky, “What did Cambridge Analytica really do for Trump’s campaign?” Wired, October 25, 2017, https://www.wired.com/story/what-did-cambridge-analytica-really-do-for-trumps-campaign/; Sean Illing, “Cambridge Analytica, the shady data firm that might be a key Trump-Russia link, explained,” Vox, October 22, 2017, https://www.vox.com/policy-and-politics/2017/10/16/15657512/cambridge-analytica-trump-kushner-flynn-russia. See also, Julie Bykowicz, “House Panel to Interview Trump Campaign Digital Director Brad Parscale,” The Wall Street Journal, October 22, 2017, https://www.wsj.com/articles/house-panel-to-interview-trump-campaign-digital-director-brad-parscale-1508708941.

[11] Funk, “Secret Agenda.”

[12] Cambridge Analytica, “Ocean Personality Quiz: Discover Yourself,” accessed November 15, 2017, https://ocean.cambridgeanalytica.org/.

[13] Nicholas Confessore and Danny Hakim, “Data Firm Says ‘Secret Sauce’ Aided Trump; Many Scoff,” The New York Times, March 6, 2017, https://www.nytimes.com/2017/03/06/us/politics/cambridge-analytica.html.

[14] Katherine Behar, “An Introduction to OOF,” in Object-Oriented Feminism, edited by Katherine Behar (Minneapolis: University of Minnesota Press, 2016), 22–23.

[15] Hannes Grassegger and Mikael Krogerus, “The Data That Turned the World Upside Down,” Vice Motherboard, January 28, 2017, https://motherboard.vice.com/en_us/article/mg9vvn/how-our-likes-helped-trump-win.

[16] Ibid.

[17] Funk, “Secret Agenda.”

[18] Joshua Green and Sasha Issenberg, “Inside the Trump Bunker with Days to Go,” Bloomberg News, October 27, 2016, https://www.bloomberg.com/news/articles/2016-10-27/inside-the-trump-bunker-with-12-days-to-go. See also Grassegger and Krogerus, “Data That Turned.”

[19] The New York Times reported that Facebook “will maintain a publicly viewable database of ads purchased on the network.” See Mike Isaac and Daisuke Wakabayashi, “Russian Influence Reached 126 Million Through Facebook Alone,” The New York Times, October 30, 2017, https://www.nytimes.com/2017/10/30/technology/facebook-google-russia.html. See also Issie Lapowsky, “Facebook’s Election Ad Overhaul Takes Crucial First Steps,” Wired, September 21, 2017, https://www.wired.com/story/facebook-election-ad-reform/; Tony Romm and Kurt Wagner, “Facebook is taking a stricter stance on political advertising ahead of its testimony to the U.S. Congress next week,” Recode, October 27, 2017, https://www.recode.net/2017/10/27/16555926/facebook-political-advertising-ads-2016-russia.

[20] Lapowsky, “Facebook’s Election Ad Overhaul.”

[21] Green and Issenberg, “Inside the Trump Bunker.”

[22] Martin Moore, director of the King’s College Centre for the Study of Media, Communication, and Power, cited this statistic in Carole Cadwalladr’s “Google, democracy and the truth about internet search,” The Guardian, December 4, 2016, https://www.theguardian.com/technology/2016/dec/04/google-democracy-truth-internet-search-facebook.

[23] Grassegger and Krogerus, “The Data That Turned.”

[24] Ibid.

[25] Adrian Chen, “The Agency,” The New York Times Magazine, June 2, 2015, https://www.nytimes.com/2015/06/07/magazine/the-agency.html.

[26] Craig Timberg, “Russian propaganda may have been shared hundreds of millions of times, new research says,” The Washington Post, October 5, 2017, https://www.washingtonpost.com/news/the-switch/wp/2017/10/05/russian-propaganda-may-have-been-shared-hundreds-of-millions-of-times-new-research-says/. See also aggregated links about this research at the Tow Center for Digital Journalism website, accessed November 15, 2017, https://towcenter.org/research-director-jonathan-albright-on-russian-ad-networks/.