In the 1950s, moviegoers were unknowingly subjected to a new advertising technique -- subliminal advertising. A psychologist named James Vicary wanted to see if he could change behavior by slipping messages into a movie without viewers being consciously aware of them -- messages like "Drink Coca-Cola" or "Buy Popcorn."
The results are scientifically disputed, and later attempts to replicate the effect largely failed. Still, some governments moved to outlaw the practice, for this reason: psychological manipulation, known at the time as brainwashing, was seen as akin to the tactics of the authoritarian regime depicted in George Orwell's 1984. In the name of freedom and democracy, citizens should not be subjected to psychological manipulation without their knowledge or consent.
Today, there is a new kind of advertising born of the internet: behavioral advertising (also known as behavioral targeting). Internet companies such as Facebook, Google and a whole host of others have gathered so much data on you that they know your desires and fears -- and how to exploit them to change your behavior.
These companies power behavioral advertising with psychographics -- the study of human behavior through an individual’s interests, values and personality traits. Psychographics is used widely in marketing, particularly by internet-based companies, to determine how best to target users with messages and advertisements.
While Facebook is just one example of a company that uses psychographics on users, it is also the company we hear about the most, in large part due to the enormous influence it has had over individuals and societies. Facebook started this psychological experiment with a key human element -- trust. Facebook’s original mission statement was “To give people the power to share and make the world more open and connected.” In 2017, Mark Zuckerberg revealed a new mission statement: “Give people the power to build community and bring the world closer together.”
The company began by bringing friends and family together on its platform, creating communities of trust. Then, using psychographics, its algorithms and artificial intelligence (AI) pushed news feed content that engaged two other powerful human emotions -- outrage and fear -- whether or not the information was true. And because that information was being shared among trusted groups, emotional credence was given to lies. Meanwhile, Facebook scored more advertising dollars, because people clicked and shared more frequently when fear and outrage were activated.
The result of all of this has been a fundamental breakdown in trust among people and in democratic institutions. Exacerbating the problem, "truth in advertising" rules don’t apply to political campaigns, which means no one is guarding against the unprecedented amount of misinformation in circulation. With no clear truth, people tend to react from the gut, going with whatever "feels" true.
According to a recent Pew Research Center survey, most Americans don’t know or understand what information Facebook collects on them or how it’s used. If we cannot discern the truth for ourselves, we give up not only free will and self-determination but democracy itself.
Despite all these real consequences, behavioral advertising remains completely legal. Rather than banning Facebook or Google, we should ban behavioral advertising and the gathering and selling of psychographic data. This one regulation would solve many of the problems born of social media and the internet without regulating companies individually.
Until that happens, the internet remains one big psychological experiment. We need to step away from the madness, take back control of our own choices and reclaim facts. Start using a privacy-focused search engine, such as DuckDuckGo, where you’re less likely to be profiled. Delete social media accounts that track you, and use alternative social platforms that don’t exploit your data.
As a community, the tech industry could be pushing for regulations like Europe’s General Data Protection Regulation, insisting that advertisers be transparent about how our data is compiled, used and, ultimately, sold to third parties. The Electronic Frontier Foundation, for example, is an organization that takes a stand for individual rights online.
If governments won’t ban psychological manipulation online, we should take every step possible to ensure we safeguard our own psyches.