The digital revolution: at humanity's expense?

“Digital platforms have poor control over their manipulation of emotions”

On June 8th, 2021
Camille Alloing
Professor of Public Relations at the Université du Québec à Montréal
Key takeaways
  • The Cambridge Analytica scandal and other cases have recently alerted citizens to the possibility of social networks manipulating votes by playing on their emotions.
  • But for researcher Camille Alloing, social networks knowingly overestimate their capacity for manipulation in order to sell advertising space.
  • In the same way, platforms such as Facebook have conducted psychological experiments on the emotions of hundreds of thousands of their users...
  • And they did so without their users' knowledge, based on a caricatured and unreliable conception of emotion, with the sole aim of lending credibility to their hypothetical capacity for manipulation.

It is said that one of the best ways to manipulate social media users is to make them feel scared or empathetic. The alleged role of Cambridge Analytica and Facebook in the election of Donald Trump seems to prove it. However, UQÀM communications and information science researcher Camille Alloing suggests that the power social media holds over our emotions needs to be taken with a pinch of salt.

Could you explain what “affective capitalism” is?

Simply put, it is the part of capitalism that exploits our ability to be moved (and to move others) to generate value; something we particularly see on social media. However, it’s worth looking a little closer at the word “affect”. It is a term that can be used to refer to any emotion, but the important part is how it “sets us in motion,” meaning what causes us to take action.

When I “like” a post, I am affected. Unlike emotions (which remain difficult to analyse due to their subjective and unconscious nature), affective consequences can be identified (you can know that a video affected me because I pressed the “like” button). So, although we cannot ascertain whether digital platforms actually succeed in provoking emotions in users, we can analyse how users behave.

Given that most social media revenue comes from selling ad space, the goal of these platforms is to increase the time users spend on them, and therefore the number of ads viewed. To that end, affect is undeniably extremely useful – by generating empathy, more reactions are prompted, and content is shared more.

I have found that individuals are now part of structures that can affect them (and therefore make them feel emotions and make them act) although they cannot affect back. If I post something, and I’m expecting a response from my friends, I am alienated because I will only get that response if Facebook chooses (for reasons beyond my control) to share my post in my friends’ feeds.

You say that “affect is a powerful tool”. Is it the power to manipulate people through their emotions?

If I said that affecting a person meant you could successfully manipulate them, I would be in agreement with the arguments the platforms are putting out. Facebook, for example, has every reason to let people think that its algorithm is able to control users, because it helps it sell ad space. In this way, the Cambridge Analytica scandal [in which this company attempted to manipulate Facebook users to influence American swing voters in the 2016 US presidential election in favour of Donald Trump] provided incredible publicity for Facebook among its advertisers, who saw it as an opportunity to drastically increase their sales by manipulating users!

However, the role of social media in Trump’s election must be put in perspective, and we should be careful not to trust oversimplified explanations. Even though Facebook boasted that its targeted advertising was 89% accurate, in 2019 employees revealed that average accuracy in the US was in fact only half that (41%, and as low as 9% in some categories) [1]. Sure, these platforms’ algorithms and functionalities have tangible effects… but they are much less significant than you might think.

The research is there to facilitate well-balanced debates, and scientific studies [2][3] have shown that, contrary to what we might hear, social media platforms cannot actually manipulate us. That doesn’t mean they don’t try, but they can control neither who they affect nor what the consequences of their initiatives are. What’s more, it can quickly become dangerous, even more so given that their concept of human psychology leaves much to be desired. Believing that people are blindly subject to their emotions and cognitive biases is a form of class contempt.

In 2014, Facebook hired researchers to perform psychological tests that aimed to manipulate the emotions of 700,000 users, without their consent [4]. This “scientific” study was meant to demonstrate the platform’s ability to control the mood of its users and involved modifying people’s news feeds to show them more negative (or positive) content. As a result, they claimed that they could cause “emotional contagion,” as people would publish content that was more negative (or positive, depending on what they had been shown). However, on top of the obvious ethical issues, the experiment was statistically flawed, and the conclusions do not hold up. But I think it’s fair to say that scientific rigour was probably not their priority! Above all, the objective was to create good publicity among advertisers – Facebook uses research as a PR tool.

Yet it is important to remember that affecting someone is not necessarily negative – it all depends on our intentions. We are constantly affecting each other, and when we are feeling down, we need to be affected in a positive way. We simply need to carefully consider who we are allowing to affect us. Should private companies have this power? Should the government?

Should we be concerned by biometric emotion detection?

Yes. We are currently seeing the widespread dissemination of biometric tools that measure emotions. In our book [5], we mention a comedy club in Barcelona, the Teatreneu, where the price of your ticket is calculated from the number of times you laugh (30 cents per laugh). This example is pretty anecdotal, but less amusingly, biometric technology (which until recently was nothing but a set of basic experiments for commercial ends) is now being used to monitor citizens. The NYPD has spent more than $3 billion since 2016 on its algorithms, which use targeted ads to measure the attitudes of 250,000 residents towards the police [6].

The problem is also that this biometric emotion detection technology is very bad at its job. This is because it is based on the work of American psychologist Paul Ekman and his Facial Action Coding System [a method of analysing facial expressions that aims to associate certain facial movements with emotions], which does not actually work in practice.

Despite their ineffectiveness, these biometric tools are spreading at a rapid pace – yet technology is much more dangerous when it works badly than when it doesn’t work at all! If it’s 80% reliable, and you are part of the 20% margin of error, it will be up to you to prove it. I find it very concerning that poorly functioning tools are becoming tools of governance and surveillance, implemented without the consent of the main parties involved.

Interview by Juliette Parmentier
[1] http://www.wohlfruchter.com/cases/facebook-inc
[2] https://towardsdatascience.com/effect-of-cambridge-analyticas-facebook-ads-on-the-2016-us-presidential-election-dacb5462155d
[3] https://web.stanford.edu/~gentzkow/research/fakenews.pdf
