The Political Value of Your Data

The modern political data industry creates new opportunities for political manipulation and anti-democratic strategy: weaponized, data-driven systems produce problematic outputs that affect communities at scale.

Karina Nguyen
7 min read · Aug 9, 2020

Data-driven technology is an unavoidable feature of modern political campaigns. Some argue that it is a valuable addition to politics, a necessary contemporary approach to democratic processes, while others argue that it erodes trust in already flawed political systems. Given that mass communication has increasingly been replaced by individualized communication, it is fair to claim that the use of these technologies in political campaigning is not going away; in fact, we can only expect it to grow more sophisticated. For this reason, its techniques and methods need to be carefully examined. I argue that the modern political data industry, which creates new opportunities for political manipulation and anti-democratic strategy, is the product of weaponized, data-driven systems whose problematic outputs affect communities at scale.

As data prediction becomes embedded in business models, it acquires not only economic and moral value but also becomes a political asset, which significantly affects how and under which circumstances micro-targeting is carried out. Voter files and databases are usually multifaceted and complicated because they are aggregated from different sources, such as consumer data vendors and canvassing apps. Consumer data acquired commercially, such as the geolocation of one’s mobile phone, credit ratings, and even motor vehicle information, fuels digital campaigns and is thereby transformed into political value (ConsumerView, Experian). Jasanoff, in “The Ethics of Invention,” observes that a parallel to technological determinism and technocracy is the idea of “unintended consequences.” Interestingly, public data that is easily accessible and open to everyone is usually leveraged for political use too. For example, Google Maps data can be combined with consumer or even exposed data to infer household or neighborhood type, which can then be used to create or unlock detailed profiles of specific political audiences. In addition, a group of AI researchers from Stanford showed that voting patterns can be predicted from the cars visible in Google Street View imagery with the help of deep learning.
“If the number of sedans in a city is higher than the number of pickup trucks, that city is likely to vote for a Democrat in the next presidential election (88% chance); if not, then the city is likely to vote for a Republican (82% chance)” (Using deep learning and Google Street View to estimate the demographic makeup of neighborhoods across the United States, Stanford Research).

Emma’s Diary, a company that offers pregnant women and parents health advice and gifts, recently faced a fine for illegally selling the data of more than a million people (names, addresses, birth dates, household types) to the Labour Party, which had a data supply agreement with the data broker Experian, which in turn used that data to build a database for a targeted campaign (Emma’s Diary faces fine for selling mum’s data to Labour, Leo Kelion, BBC). Data has no value and is not a political asset until it is acted upon by actors, or in this case a group of actors, who leverage it without people’s consent to influence democratic choices. The power of inferences derived from complex combinations of data from different sources, acted upon by different actors to influence the public, becomes one of those “unintended consequences.”
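The city-level rule quoted above can be reduced to a toy sketch. This is only an illustration of the published heuristic, not the study’s actual deep-learning pipeline, and the function name and inputs are invented:

```python
def predict_city_vote(num_sedans: int, num_pickups: int) -> str:
    """Toy version of the Stanford Street View finding: a city's
    sedan-to-pickup mix correlates with its presidential vote."""
    if num_sedans > num_pickups:
        return "Democrat"    # the study reports this holds ~88% of the time
    return "Republican"      # the study reports this holds ~82% of the time
```

For example, a city whose Street View imagery tallies 3,200 sedans against 1,100 pickup trucks would be classed as likely Democratic under this rule.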

The most effective technique data brokers apply to collected data is psychometric profiling. It dates back to Francis Galton in the 1880s, followed by Alfred Binet, who in 1905 introduced the first intelligence test for higher psychological processes such as practical judgment and real-world abilities. With more data coming in from public platforms such as Facebook and Google, data brokers and political consultants can use that digital footprint to predict psychological characteristics at a larger scale (Computer-based personality judgments are more accurate than those made by humans, Wu Youyou, Michal Kosinski, and David Stillwell). Thus, in a 2016 Cambridge Analytica presentation, former CEO Alexander Nix said, “It’s personality that drives behaviour, and behaviour obviously influences how you vote.” The big data behind such campaigns falls into four categories: factual (demographics/geographics such as age, gender, ethnicity), attitudinal (habits/lifestyle such as consumer data, consumer confidence, buying styles and patterns, automotive data), personality (behavioral/human psychology), and persuasion (reciprocity, scarcity, authority, fear, and social proof). Cambridge Analytica focused on predicting personality data with the methodology it called OCEAN: openness (how open a person is to new experiences), conscientiousness (whether the person prefers order/habits/planning), extraversion (how social a person is), agreeableness (whether a person puts the needs of others and society ahead of their own), and neuroticism (how much a person tends to worry). “Every time you watch TV, the programs are being recorded and sent to the cable provider, matched to what you watch, and used to select the programs to advertise on to reach the highest density of the target audience. It not only lowers costs but also increases ROI.” (Psychometric Profiling: Persuasion by Personality in Elections, Varoon Bashyakarla) The OCEAN model tells what kind of person someone is; however, many more data sources can yield a far more comprehensive profile of each person. For example, social media “listening” can reveal a person’s political opinions through sentiment analysis with natural language processing, while tracking pixels and ISP data can reveal someone’s behavior and everyday habits.
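As a rough illustration of what social media “listening” means in practice, the sketch below scores posts against a tiny keyword lexicon. Real systems use trained NLP models; the lexicon and function here are invented toys:

```python
# Toy sentiment "listening": score a post against a tiny hand-made lexicon.
# Production systems use trained NLP models; this only illustrates the idea.
POSITIVE = {"support", "love", "great", "win"}
NEGATIVE = {"oppose", "hate", "corrupt", "fail"}

def sentiment_score(post: str) -> int:
    """Positive score suggests approval, negative suggests disapproval."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
```

Run over thousands of posts, even a crude score like this lets a campaign segment audiences by apparent political leaning before any ad is shown.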

Moreover, A/B testing and the conversion rates of a specifically designed political message show how people react and respond, and therefore yield more data for modifying or changing the tone of political campaigns. And that is how design is weaponized: the ideal user is prioritized over empathetic attention to structural vulnerabilities (threat modeling), which leaves some features at high risk of misuse (Engineering the public: Big data, surveillance, and computational politics, Tufekci). In many cases, deceptive design techniques perform well in real-time experimentation such as A/B tests: tricking users into doing something is likely to yield more conversions for for-profit businesses or political ads. One example of a deceptive technique is manipulative timing: when users see a notification determines how they respond to it, or whether they even notice it. Interruptions contribute to cognitive overload, and deceptive design deploys them to make it harder for users to be fully in control of their faculties at a critical moment of decision. These techniques are usually referred to as dark patterns in UX, such as the Roach Motel (design that makes it easy to get into a situation but hard to get out). As a result, the execution of political campaigns is deeply rooted in psychometric profiling, in addition to all the other techniques discussed above.
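The A/B-testing loop described above ultimately compares conversion rates between message variants. A minimal sketch follows; the variant counts are invented, and a real campaign would also test the difference for statistical significance before acting on it:

```python
def conversion_rate(conversions: int, impressions: int) -> float:
    """Fraction of people shown a message who acted on it."""
    return conversions / impressions if impressions else 0.0

# Two wordings of the same political message, shown to similar audiences.
rate_a = conversion_rate(120, 2000)   # variant A converts at 6%
rate_b = conversion_rate(180, 2000)   # variant B converts at 9%

# The campaign keeps the better-performing wording and iterates again.
winner = "B" if rate_b > rate_a else "A"
```

Each iteration feeds back into the profile data: which wording, tone, and timing moved which audience segment.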

Winner, in “Do Artifacts Have Politics,” argues that technical things have political qualities and that it is essential to consider the socioeconomic context in which a technology is embedded and the communities it affects. It is possible that automated personality tools can improve people’s lives through data-driven decisions: for example, products and services could adjust their approach based on a better understanding of their clients, and recruiters might better match candidates with jobs based on their personality traits. However, such tools and methodologies can easily be abused. With an abundance of data, the objective of targeted political campaigns today is to manipulate people’s decisions based on intelligence about them gathered through surveillance of their online activity, that is, inferring who they are from their data. This is unethical in itself, because manipulation means not only understanding who people are but also designing systems and executing processes in ways that increase the chance of people taking a specific action, influencing their decisions with emerging technologies such as AI-powered voter intelligence or AI-enabled “chatbots.”

Changing the compass of a political system depends on every actor involved at every stage. Data brokers, private companies, and political consultants can limit collection to the minimum viable data, thereby reducing privacy concerns and maintaining trust, and can share a transparent picture of the data flow (its origin, its path, and how it has been transformed) while providing reciprocal value to the people who share their data. Politicians who incorporate AI tools, as well as the creators of these intelligence systems, should clarify every moment of the intended audience’s relationship with the tool. The solution is not straightforward; it is far more complex. Still, bringing an ethical compass together with design and critical thinking might help different communities understand these systems and design future ones, built on emerging technologies, more ethically.

Sources:

  1. Experian, ConsumerView, 2019
  2. Tactical Tech, USA: American Digital Politics and the Commercial Sector, 2018
  3. Adam D. I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock, Experimental evidence of massive-scale emotional contagion through social networks, Proceedings of the National Academy of Sciences of the United States of America, 2014
  4. Stanford Researchers, Using deep learning and Google Street View to estimate the demographic makeup of neighborhoods across the United States, 2017
  5. Leo Kelion, BBC News, Emma’s Diary faces fine for selling mum’s data to Labour, 2017
  6. New York Times, How Companies Learn Your Secrets, 2012
  7. Wu Youyou, Michal Kosinski, and David Stillwell, Computer-based personality judgments are more accurate than those made by humans, 2015
  8. Varoon Bashyakarla, Psychometric Profiling: Persuasion by Personality in Elections, 2018
  9. Tufekci, Engineering the public: Big data, surveillance, and computational politics, 2014
  10. Winner, Do Artifacts Have Politics, 1980
  11. Jason J. Jones, Robert M. Bond, Eytan Bakshy, Dean Eckles, and James H. Fowler, Social influence and political mobilization: Further evidence from a randomized experiment in the 2012 U.S. presidential election, 2017
  12. Alberto Ferreira, Universal UX Design: Building Multicultural User Experience, Morgan Kaufmann, 2016
