Psychometric Profiling of Digital Voter Data

The Role of Technology in Optimising Democratic Engagement

 

Introduction

Historically, the success of political parties has been heavily shaped by the degree of their democratic engagement, enabled by analysing voter data and drawing conclusions from voters’ identity groups, characteristics, and interests. Voter databases, however, can reveal far more than demographic information: they can intrusively extend to religion, political affiliation, voting history, digital footprints, social media profiles, purchasing history, and other deeply personal information that is never collected directly from the voters themselves. With the advent of information capitalism, the drive to optimise voter engagement has heightened the demand for maintaining and sharing such databases so that voters can be psychometrically profiled with the help of digital tools. 

 

Psychometric profiling is used by marketers and advertisers to assess the psychological characteristics of an individual or a group, providing largely untapped (and unregulated) insight into a user’s beliefs, inclinations, and tendencies. By appealing to such underlying and subconscious human traits, psychometrically informed advertisements and messages can be far more persuasive and influential in forging public opinion, especially through digital media. [1] Elaborate statistical techniques and algorithms can now infer such traits with greater accuracy and efficiency than ever before, feeding what amounts to a futures market in voters’ behaviour.
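
To make the inference step concrete, the sketch below trains a simple logistic-regression classifier to predict a private trait from a synthetic matrix of page ‘likes’, loosely in the spirit of the study cited above [1]. The data, features and model here are invented for illustration; the cited work used far larger real-world datasets.

```python
# Illustrative only: inferring a private trait from a digital footprint.
# All data below is synthetic; no real predictors or traits are modelled.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Rows: users. Columns: whether the user "liked" each of 20 pages (0/1).
likes = rng.integers(0, 2, size=(200, 20))

# Synthetic ground truth: the trait correlates with a handful of pages.
trait = (likes[:, [2, 7, 11]].sum(axis=1) >= 2).astype(int)

model = LogisticRegression().fit(likes, trait)

# Predicted probability of the trait for a new, unseen user.
new_user = rng.integers(0, 2, size=(1, 20))
print(f"inferred probability of trait: {model.predict_proba(new_user)[0, 1]:.2f}")
```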

 


How Can Voter Data Be Profiled Psychometrically?

The need to maximise voter engagement in order to influence democratic outcomes, together with the economic potential of doing so, has driven the psychometric profiling of voter data. Such profiling functions in ways that are subtle, often invisible, and too intricate for an individual to be informed of and, consequently, to object to. [2] However, a right to object to data profiling, such as that provided under Article 21 of the General Data Protection Regulation (GDPR), remains a rarity, and India’s Digital Personal Data Protection Act 2023 is no exception. 

 

Analytics firms gather this data either from public databases on voter behaviour or from State agencies (often unlawfully), and use it to build predictive models that generate aggregate projections of voters’ opinions and behaviours. [3] In this process, the patterns, differences and personality traits evident in voter behaviour are analysed and predicted by sorting voters into distinct targetable segments and societal groups. Once profiled, these groups are flooded with targeted content and advertisements that control and selectively curate the flow of information on political parties, candidates, manifestos, and so on. Streamlining content in this way influences and manipulates the voter’s psyche by activating implicit attitudes and biases, [4] and facilitates the propagation of hate speech, fake news and misinformation. This illustrates that the State controls the flow of information not only through censorship, but also by selectively targeting the recipients of sensitive information.
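
As a minimal sketch of what such segmentation can look like in practice, the toy example below clusters hypothetical voter feature vectors into targetable segments with k-means. The features, data and choice of algorithm are illustrative assumptions, not a description of any actual vendor’s pipeline.

```python
# Illustrative only: toy voter segmentation with k-means.
# Feature names and data are hypothetical; real profiling pipelines
# are proprietary and far more elaborate.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Each row is one voter: [age, turnout_rate, issue_engagement, ad_click_rate]
voters = np.array([
    [22, 0.10, 0.8, 0.30],
    [65, 0.95, 0.2, 0.05],
    [41, 0.60, 0.5, 0.15],
    [30, 0.20, 0.9, 0.40],
    [58, 0.90, 0.3, 0.10],
    [35, 0.50, 0.7, 0.25],
])

# Standardise features so no single attribute dominates the distance metric.
X = StandardScaler().fit_transform(voters)

# Partition voters into k targetable segments.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for voter, segment in zip(voters, kmeans.labels_):
    print(f"voter {voter} -> segment {segment}")
```

Each resulting segment could then, in principle, be served different content, which is what makes the curation of information described above possible at scale.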

 

This model is more commercially viable and profitable than on-ground campaigning, both in the number of people it can reach and in the subconscious, subliminal impact its messages carry. Such messaging can exert a deeper influence, enabling the micro-targeting of individuals, a practice that increasingly forms an integral part of political campaigning globally. [5]

 

These predictive models can be used to persuade users to support a particular candidate, to nudge them into behaviours subtly associated with a political inclination (such as attending a place of worship), to shape their socio-cultural opinions, and so on. [6] By grounding the messaging in sentimental or personal issues, such as religion, race or culture, such voter manipulation ultimately undermines citizens’ autonomy. The reverse approach is also widely practised: actively concealing material information from, or selectively exposing it to, particular audiences during election season in ways that could significantly sway voter mentalities. More often than not, however, such messages are designed not to inform, persuade or nudge voters towards more ‘reasoned’ and ‘informed’ decisions, but to appeal to their non-rational vulnerabilities as determined by algorithmic profiling. [7]

 


Learnings from the Facebook-Cambridge Analytica Case

Cambridge Analytica was at the epicentre of the data misuse scandal surrounding the 2016 US Presidential Election, having used responses to a personality test deployed on Facebook (now Meta)’s platform to psychometrically profile users and target them with specific political advertisements, agendas, and other highly volatile content. Based solely on the responses gathered by this test, algorithms formed assumptions about users’ political enthusiasm, orientation, frequency of voting, consistency in voting for the same political party, and so on. [8] By superimposing these outcomes on the extensive user profile information (UPI) Facebook had already gathered on millions of users, identity-linked personal data was extracted and segmented to determine users’ political inclinations and interests. Profiles were then assessed on the OCEAN scale, which reduces an individual’s personality to the ‘Big Five’ factors: openness, conscientiousness, extraversion, agreeableness and neuroticism. The profilers were thus “able to produce a model of the personality of every single person in the US”. [9] The resultant datasets were sold to Cambridge Analytica, where they were misused to influence the 2016 US Presidential Election [10] and the Brexit vote in the UK. [11]
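
The scoring step can be illustrated in the abstract. The sketch below is a minimal, hypothetical reconstruction, assuming a ten-item Likert questionnaire with an invented item-to-trait keying; the actual instrument, weights and models used were proprietary and far more sophisticated.

```python
# Illustrative only: scoring Likert responses (1-5) onto the Big Five
# (OCEAN) traits. The item-to-trait keying and reverse-keyed items below
# are hypothetical; real instruments define their own keys.
TRAIT_ITEMS = {
    "openness":          [(0, False), (5, True)],   # (item index, reverse-keyed?)
    "conscientiousness": [(1, False), (6, True)],
    "extraversion":      [(2, False), (7, True)],
    "agreeableness":     [(3, False), (8, True)],
    "neuroticism":       [(4, False), (9, True)],
}

def ocean_scores(responses: list[int]) -> dict[str, float]:
    """Average each trait's items; a reverse-keyed response r scores as 6 - r."""
    scores = {}
    for trait, items in TRAIT_ITEMS.items():
        vals = [(6 - responses[i]) if rev else responses[i] for i, rev in items]
        scores[trait] = sum(vals) / len(vals)
    return scores

# A hypothetical respondent's answers to the ten-item questionnaire.
print(ocean_scores([5, 3, 2, 4, 1, 2, 4, 5, 1, 3]))
```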

 

This, in turn, allowed Cambridge Analytica to deliver micro-targeted ads on hot-button, provocative and inflammatory issues to potential voters based on their profiles. [12] Ultimately, Facebook had to notify 87 million users that their privacy had been compromised and their sensitive personal data shared with Cambridge Analytica. [13]

 


Conclusion

One of the biggest takeaways from this case study is that, at a mass level, it has become algorithmically tractable to exploit the chain whereby personality traits drive behaviour and behaviour influences how one votes. [14] It also highlights the larger democratic fragility that digital media creates, by fostering user mentalities that are easily susceptible to misinformation and misguided content, and by deploying dark patterns across its platforms. The far-reaching consequences of such manipulation of public opinion have pushed regulatory bodies globally to tighten their grip on the flow of information on digital media during election season. 

 

From a socio-legal perspective, the concepts of proportionality and necessity have yet to find their way into legislation, courts or public discourse on psychometric profiling in political campaigning. This owes to several factors, including the enormous profits that corporations stand to realise and the extraordinary extent of control that governments can exercise over their citizens. But none of this can obscure the heavy influence that micro-targeting tools employed in political campaigning carry in undermining the functioning of a democracy and determining electoral outcomes. Fundamental changes to this model seem unlikely so long as such ‘propaganda-as-a-service businesses remain in such high demand’. [15]

 


Footnotes

[1] Michal Kosinski, David Stillwell and Thore Graepel, Private Traits and Attributes are Predictable from Digital Records of Human Behavior, PNAS. Available at: https://www.pnas.org/doi/10.1073/pnas.1218772110

[2] Profiling in Political Campaigning, ICO UK. Available at: https://ico.org.uk/for-organisations/guide-to-data-protection/key-dp-themes/guidance-for-the-use-of-personal-data-in-political-campaigning-1/profiling-in-political-campaigning/

[3] Sasha Issenberg, How Obama’s Team Used Big Data to Rally Voters, MIT Technology Review. Available at: https://www.technologyreview.com/2012/12/19/114510/how-obamas-team-used-big-data-to-rally-voters/

[4] Psychometric Profiling: Persuasion by Personality in Elections, Our Data Our Selves. Available at: https://ourdataourselves.tacticaltech.org/posts/psychometric-profiling/

[5] Micro-targeting in Elections, Privacy International. Available at: https://privacyinternational.org/news-analysis/3735/why-were-concerned-about-profiling-and-micro-targeting-elections

[6] Elizabeth Culliford, How Political Campaigns Use Your Data, Reuters. Available at: https://www.reuters.com/graphics/USA-ELECTION/DATA-VISUAL/yxmvjjgojvr/

[7] Supra note 4.

[8] Sai Manish, How Cambridge Analytica Profiled Voters and What It Means for India, Business Standard. Available at: https://www.business-standard.com/article/companies/how-cambridge-analytica-profiled-voters-and-what-it-means-for-india-119120900526_1.html

[9] Supra note 4.

[10] Matthew Nussbaum, Trump Campaign Sprints Away From Cambridge Analytica, Politico. Available at: https://www.politico.com/story/2018/03/20/trump-campaign-cambridge-analytica-473650

[11] Ellen Barry, Cambridge Analytica Whistle-Blower Contends Data-Mining Swung Brexit Vote, NY Times. Available at: https://www.nytimes.com/2018/03/27/world/europe/whistle-blower-data-mining-cambridge-analytica.html

[12] Alexander Nix, The Power of Big Data and Psychographics, Concordia Summit (YouTube). Available at: https://www.youtube.com/watch?v=n8Dd5aVXLCc&ab_channel=Concordia

[13] John Constine and Taylor Hatmaker, Facebook Admits Cambridge Analytica Hijacked Data on up to 87M Users, TechCrunch. Available at: https://techcrunch.com/2018/04/04/cambridge-analytica-87-million/

[14] Natasha Lomas, Facebook Data Misuse and Voter Manipulation Back in the Frame with Latest Cambridge Analytica Leaks, TechCrunch. Available at: https://techcrunch.com/2020/01/06/facebook-data-misuse-and-voter-manipulation-back-in-the-frame-with-latest-cambridge-analytica-leaks/

[15] Ibid.
