The Silent Takeover: How Social Media Algorithms Eroded Our Autonomy
“Technology ended everything” — my father, in an interview
At first, I thought my father said this out of resentment toward the small trash icon he could never find at the top of his WhatsApp homepage. The line, however, turned out to be the grand ending to the stories of his youth and an introduction to the present-day dystopia. Most would agree that technology (handheld devices, home appliances, and even robotic mechanisms) has, one way or another, immensely improved our quality of life over the last fifty to seventy years. It revolutionized how we produce goods, consume media, commute between places, extract and distribute resources, and communicate with one another. From all this, it seems that technology created “everything” in our present rather than destroying it.
Social media strips democratic societies of autonomy: it undermines self-governance at the social level and free will at the individual level.
Like my father, I can feel someone rewriting the ingrained routines in my brain. What used to be a simple, eyes-half-open walk to the bathroom is now a thirty-minute-or-more routine of logging into my games, checking emails, scrolling through Reddit, falling down a YouTube Shorts rabbit hole, and then, at last, a step onto the cold floor to meet my ceramic sink friend, which greets me with a reflective shine and glimmering water droplets. I randomly pat myself down throughout the day when I cannot remember where I left my phone, as if looking for a cigarette pack or vape. I hit rock bottom when my screen-time report said eight consecutive hours on TikTok and five hours on YouTube. I am sure many users see similar changes to normal routines, like brushing their teeth, because of the urge to scroll mindlessly. I asked myself: why can I not stop scrolling even when I know, somewhere in the back of my head, that my For You page is endless? Are Big Tech companies aware of the addictive effects of their platforms? Are they changing anything to make this new tech-dependent generation less dependent?
The Social Dilemma, Jeff Orlowski's documentary on Netflix, answered my question of how technology destroyed everything by presenting the three goals every social media company's business model abides by:
- The engagement goal: increase usage and make sure users continue scrolling.
- The growth goal: make sure users come back and invite friends who invite even more friends.
- The advertisement goal: make sure that, while the two goals above are being met, the company makes as much money as possible from advertisements.
According to the documentary, Big Tech increases usage, or engagement, using an “adaptive” algorithm to create personalized experiences. Author and documentary interviewee Jaron Lanier describes such algorithms as “mak[ing] small changes to themselves to try to get better results; ‘better’ in this case mean[s] more engaging and therefore more profitable” (Lanier 15). With every click and view, you feed more information to these algorithms, whose adaptiveness lets them predict what you want and serve you the next post to look at. Think of a coffee shop you stop by on the way to work or school. Without your knowledge, a shop employee retrieves your coffee cup from the trash can, puts it in a ziplock bag, and sends it to a laboratory. The lab processes the cup to find the biological components in your saliva that contribute to your coffee addiction. The following day, the shop tweaks its recipe to make its coffee slightly more addictive, especially for you. They repeat these steps after every visit, gradually gathering more and more information about your saliva. Ultimately, they develop a cup of coffee that you find irresistible. It is the same with many social media algorithms.
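To make that feedback loop concrete, here is a minimal sketch in Python of the kind of self-adjusting recommender Lanier describes. Everything in it (the topic list, the watch-time numbers) is a hypothetical illustration, not any platform's actual code:

```python
import random

# Hypothetical content categories a feed could serve.
TOPICS = ["gaming", "politics", "fitness", "true crime"]

# Running estimate of how engaging each topic is for one user.
engagement = {t: 0.0 for t in TOPICS}
shown = {t: 0 for t in TOPICS}

def pick_topic(explore_rate=0.1):
    """Mostly serve the best-known topic; occasionally try a small change."""
    if random.random() < explore_rate:
        return random.choice(TOPICS)
    return max(TOPICS, key=lambda t: engagement[t])

def record_feedback(topic, watch_seconds):
    """Fold the observed engagement back into the model."""
    shown[topic] += 1
    # Incremental average: each view nudges the estimate slightly.
    engagement[topic] += (watch_seconds - engagement[topic]) / shown[topic]

# Simulated sessions: every view feeds the loop, and the feed drifts
# toward whatever this particular user finds hardest to resist.
for _ in range(1000):
    topic = pick_topic()
    watched = max(0.0, random.gauss(60 if topic == "true crime" else 20, 5))
    record_feedback(topic, watched)

print(max(engagement, key=engagement.get))  # the "irresistible" topic
```

Each iteration is one of Lanier's “small changes”: none is dramatic on its own, but over thousands of views the loop converges on whatever keeps this particular user watching.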
The tweaks an algorithm makes to customize user content are what make it “adaptive”: it can adjust to any environment and user by learning from the provided data and feedback. It collects your information (age, name, location, interests, opinions, and so on) to create a user model of you, or “voodoo doll,” as Tristan Harris likes to call it. Each like, comment, post, and video watched is like adding more hair to this voodoo doll. Bit by bit, your data trail makes the doll more personalized (Public Affairs at UC Berkeley). With such an accurate representation of you, adaptive algorithms can split-test what content keeps you on the platform for the next twenty minutes. They predict your clicks and recommend content relevant to the collected information, throwing you into a rabbit hole of scrolling until eleven hours of navigating an endless feed becomes the norm.
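As a hedged sketch of the voodoo-doll idea, the snippet below accumulates a per-user interest profile from interactions and then split-tests candidate posts against it. The signal names and weights are invented for illustration:

```python
from collections import Counter

# The "voodoo doll": a bag of weighted interest signals for one user.
profile = Counter()

# Every interaction adds another strand of hair to the doll.
SIGNAL_WEIGHTS = {"view": 1, "like": 3, "comment": 5, "share": 8}

def observe(action, tags):
    """Fold one view, like, comment, or share into the user model."""
    for tag in tags:
        profile[tag] += SIGNAL_WEIGHTS[action]

def pick_post(candidates):
    """Split test: serve the candidate that best matches the doll."""
    return max(candidates, key=lambda post: sum(profile[t] for t in post["tags"]))

observe("view", ["cats", "diy"])
observe("like", ["cats"])
observe("comment", ["politics"])

print(pick_post([
    {"id": 1, "tags": ["cats", "diy"]},
    {"id": 2, "tags": ["politics"]},
]))
```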
Since an algorithm can create an irresistible For You feed by conducting endless split tests to decipher your personality type and psychological triggers, it is safe to say that you have lost a sense of autonomy. At a social level, autonomy is a group's ability to maintain and regulate the structures that define its unity without outside interference. This definition acknowledges the existence of an outsider, such as a government or force, that holds greater authority and is oftentimes the rival of the group's cause. Think of the thirteen colonies against the British king, or Nelson Mandela against the apartheid system. Is social media another of these outsider forces, one that threatens the avowed democratic structures of American society and every other democracy? At the core of any democracy, the majority are the real leaders of a community because they choose whom to appoint as their representatives. In essence, the majority exercise free speech by discussing matters of public interest and tackling systemic issues. At the core of free speech stands the right of each member to express their opinions without censorship or constraint. However, is it still the same kind of free speech when a population is told what to say and act on by an adaptive algorithm that only shows them a reality aligned with their belief system? For example, in the Cambridge Analytica scandal, the company allegedly harvested the personal data of about 87 million Facebook users, mostly in the United States. Analytica rolled out a 120-question survey to determine a user's standing on the five-factor model of personality. It also used Facebook data to figure out a user's education, location, liked groups and pages, relationship status, and workplace (Detrow). The company allegedly collected about five thousand data points on each user to create psychographic profiles of the user's personality. With all that information, Analytica was able to micro-target, or “personalize,” political messages to influence users' behavior and ultimately drive them toward voting for a specific election candidate (Kroll et al.).
“We exploited Facebook to harvest millions of people’s profiles. And built models to exploit what we knew about them and target their inner demons,” stated Christopher Wylie, data consultant and Cambridge Analytica whistleblower.
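The mechanics Wylie describes can be pictured with a toy example: map a user's data points onto the five-factor (OCEAN) model, then lead with the message framing aimed at the dominant trait. The scores and framings below are invented; this illustrates the psychographic idea, not Analytica's actual model:

```python
# Toy five-factor (OCEAN) profile, as if distilled from survey answers
# and Facebook likes. All numbers and framings are hypothetical.
profile = {
    "openness": 0.2,
    "conscientiousness": 0.6,
    "extraversion": 0.4,
    "agreeableness": 0.5,
    "neuroticism": 0.8,
}

# One political message, framed differently per personality trait.
FRAMINGS = {
    "neuroticism": "Your family's safety is on the ballot.",
    "conscientiousness": "Protect the order you worked hard to build.",
    "openness": "Imagine what change could look like.",
}

# Micro-targeting: pick the framing aimed at the user's dominant trait.
dominant = max(FRAMINGS, key=lambda trait: profile[trait])
print(FRAMINGS[dominant])  # for this profile, the fear-based framing
```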
It is unknown how much influence Analytica had over the 2016 election; however, the situation reflects a control dichotomy and an asymmetry of power between voters and Big Tech (Public Affairs at UC Berkeley). Voters and users appointed social media as their representative, just as past societies appointed monarchs as leaders. Social media, for many individuals in democratic societies, is synonymous with free speech. Did you find something disturbing, or just want to get something off your chest? Tweet about it and wait for others to repost and comment. No one can stop you from saying what you want to a billion-sized audience. On the far end, you have algorithms in charge of placing each user into a virtual community of like-minded user profiles.
Such virtual communities are disconnected from tangible reality because they cling to extreme worldviews and values. Because we have small virtual communities within a larger society, we lose the ability to tackle non-virtual systemic issues and matters of public interest. The majority do not speak up against the fractured truths within their own virtual communities, or anyone else's, for fear of being outcast or canceled. Isn't the threat of being shunned from our communities, both virtual and non-virtual, a constraint on free speech and thus on democracy? Do not get me wrong: having a variety of communities enriches our societies because each brings a different lens to the table. However, many societies, America specifically, have reached such levels of intolerance that people cannot have a conversation with someone on the far end of the spectrum because feelings might get hurt. The coined term for such intolerance is the “snowflake generation,” whose members cannot handle opposing views and opinions.
The problem is not that there are small online groups within our society but that an algorithm is often the mediator and founder of those groups. Reddit, for instance, hosts one of the largest collections of online communities. Like many other SNS platforms, Reddit has a Home page with posts from different niches that might appeal to you, based on your subscribed communities and your general interests. The homepage is designed to get you to subscribe to a new community, look around the various subreddits, and see an advertisement or two. Nothing is inherently harmful about that process — unless the prospective community is an extremist group. But an algorithm orchestrated your whole relationship with the community. The user is never in control. Every time we open up a platform, we fight against a supercomputer that “has an avatar of each person on earth and can simulate three billion variations of a message and advertisement” (Public Affairs at UC Berkeley). So the algorithm is the appointed monarch that determines the state of our society.
“If you are not paying for the product, then you are the product,” says Tristan Harris, a former Google design ethicist (Orlowski). “The ability to micro-target each of us is treating us as the product,” he continues (Public Affairs at UC Berkeley).
Big Tech generates profit either by selling you, the user, a product directly (like Apple, which sells you a device) or by having advertisers pay the platform to show their ads and get users to buy something. Instagram, Facebook, and others offer “free services” to bring you, the product, onto the platform. Essentially, the advertisers are the real customers because they pay for the product, “you,” as Harris stated. It is an attention race in which tech companies try to capture as many eyeballs as possible by serving specific menus (Edelman). Think of such menus as design elements in social network service (SNS) platforms that present a selective set of “choices” to users and thereby shape their thinking. For example, the time-stamped notifications mounting on your lock screen as soon as you wake up hand you a menu of urgency, a need to catch up on everything you missed, which distracts you from other, more empowering choices like brushing your teeth, going for a walk, or getting breakfast (Edelman). When you receive an email in Gmail or Outlook, you get a menu of keys for typing a response rather than “empowering ways to communicate with a person” (Harris). These menus ignore what users want to do with their time by selling them the illusion of a complete set of available choices. Lanier adds to this idea of Big Tech manipulating one's thinking by saying:
“It’s the gradual, slight, imperceptible change in your behavior and perception that is the product.”
What part of you are these advertisers paying for? They want more from you than just scrolling through a feed, right? Social media runs on operant conditioning, meaning one can modify a user's behavior by giving rewards and punishments at unpredictable intervals (Maza et al.). In operant conditioning, rewarded behavior repeats, and punished behavior rarely recurs. The reward for users is the dopamine rush from pulling the lever on the “Vegas slot machine” (Harris). We pull the lever when we swipe down the YouTube page to get a fresh feed. We pull it again when we tap apps marked with the bright red badge, when we scroll to the next video, when we check the number of likes on a post, and when we glance at our lock screen at the sound of the notification bell. That minuscule dose of anticipation, mixed with the social and emotional stimuli the adaptive algorithm builds into our personalized feeds, is the reward of social media use. Our punishment is not receiving those rewards, in other words, not fulfilling that emotional and social need: missing out on the validation and acceptance we receive from likes (emotional), or on the endless morning notifications that make us feel connected to a group or society (social).
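A minimal sketch of that variable-ratio schedule, assuming an arbitrary payout probability; each swipe-down is one pull of Harris's slot-machine lever:

```python
import random

def pull_to_refresh(hit_probability=0.3):
    """One lever pull: sometimes the feed pays out, sometimes it doesn't."""
    if random.random() < hit_probability:
        return "reward: an irresistible post, a burst of likes"
    return "filler: keep scrolling"

# Because payouts arrive at unpredictable intervals, every miss
# heightens anticipation for the next pull instead of discouraging it;
# that is what makes the conditioned behavior so hard to extinguish.
for pull in range(10):
    print(pull, pull_to_refresh())
```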
It would be simple to blame the machine for causing a stimulus-packed addiction; however, if users never agreed to hand over their data in exchange for a “free service,” most algorithms would not function. Professor Shoshana Zuboff coined the term rendition to describe a two-sided equation in the capitalist attention market (i.e., Big Tech) that “claims human experience as free raw material.” The first side describes a process in which one thing is formed out of another, as when we render oil from fat, or as when technologies are designed to render our experiences into data. The second side is when a thing gives itself over to that process (it sur-renders), as in “rendering a verdict” or “rendering a service” (Zuboff 234). Zuboff claims both sides of the equation must be in place for the surveillance-capitalist industry to function. To use social media services, we strike a “novel agreement” with the platforms through a privacy policy: they get access to all of our information to improve their algorithms, and we get the stimuli we need to feel connected. But to keep that agreement going, we submit to a change process, an addiction, in which users undergo the behavior modification Lanier already described. So we give up part of our free will to a manipulative scheme.
Free will ignores the presence of a greater force and acknowledges only a person's capacity to make an unimpeded choice among the available options. Simply put, it is a user voluntarily steering their lived experience without an algorithm surveilling their every move or curating their menu of options. Yet we know that users must first provide information about themselves so the platform's algorithm can render it into data. That data creates a personalized environment in which a user slowly surrenders their volition to the algorithm and undergoes behavior modification, and the cycle repeats. In an echo of the Analytica scandal, one study investigated whether the private attributes of 111,123 Korean SNS users could be inferred from a few pieces of publicly available information on their Facebook profiles. The users did not disclose their answers for each attribute; yet the experimental results showed that gender, age, marital status, and relationship status could be inferred by machine-learning algorithms (Choi et al.). Not only does this highlight the power of such algorithms, but it also raises the question of user responsibility for public information. Yes, we have little to no control compared to the algorithms, but as users we should be aware of how much of our private information we put online. Another article argues that parents posting images of their children on social media exposes the children to potential harms like identity theft, data reshared on predatory websites, and the revelation of embarrassing information. The article's authors state that parental sharing infringes on the child's privacy because it is done without the child's consent (Keith and Steinberg). In this case, once again, an algorithm mediates the destination of your information, and you often cannot control how a third party uses it.
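The inference attack in the Choi et al. study can be sketched as follows: train a classifier on users who did disclose an attribute, then predict it for those who did not. The features and numbers here are synthetic stand-ins, not the study's dataset or method details:

```python
# Hedged sketch of attribute inference from public profile data.
# Requires scikit-learn; all data below is synthetic.
from sklearn.linear_model import LogisticRegression

# Public features per user: [friend count / 100, posts per week, share of selfies]
X_disclosed = [
    [3.2, 7, 0.6], [1.1, 2, 0.1], [4.5, 9, 0.7],
    [0.8, 1, 0.0], [2.9, 6, 0.5], [1.3, 3, 0.2],
]
y_disclosed = [1, 0, 1, 0, 1, 0]  # e.g., self-reported relationship status

model = LogisticRegression().fit(X_disclosed, y_disclosed)

# A user who never answered the question still "leaks" a likely answer.
undisclosed = [[3.0, 8, 0.55]]
print(model.predict(undisclosed))        # inferred attribute
print(model.predict_proba(undisclosed))  # and the model's confidence
```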
Civil disobedience is an exercise of free will that expresses opposition to a law established by a governing force (here, social media) in a non-violent manner. In this instance, civil disobedience is one's ability to say “no” to an algorithm and regain some control over our menus. Maybe do not link your Google or Facebook account to everything; use a web browser that limits data collection; turn on incognito mode on YouTube so at least your search and watch history are not stored; and, overall, just be aware of your online presence. Though for many this is common knowledge, we choose to continue the novel agreement with these platforms because we cannot be bothered with extra protection steps that might degrade our experience. Yet should we allow social media to “shape our behavior” and disconnect us from tangible experiences? The basic foundation of any democracy is the freedom to think and act as you please, within the bounds of the relevant social etiquette, of course. Yet many individuals cannot think, act, or interact face-to-face because their tangible world is the internet and social media. All in all, we have lost physical and mental control at the hands of our appointed social media government. At an individual level, we limited our menus to those shown on our screens, thus losing the power to say “no” and to explore a different set of menus, ones that align with how we want to manage our time.
It is impossible to get rid of social media, or to hit a big red button and reset our society. If anything, social media platforms ended everything by micro-targeting each of us and dissolving our autonomy. Socially, we gave the throne to adaptive algorithms of our own creation, only to be divided instead of connected — how ironic. It is as if we are being trained to be lone wolves: rewarded for being active and punished for being gone too long. We have reached such intolerance for opposing opinions — better said, opposing For You pages, since the algorithm is the leader — that we have lost all empathy and do not think twice before posting a mean tweet under the sheltering “anonymity” of our profiles. Through the personalization scheme, we lost every chance to say “no” to being stuck on little islands that constantly reshape themselves around our psychological traits and triggers. Individually, we forgot to explore the tangible menus outside our pocket devices. But that is to be expected in a surveillance economy that “sells certainty” to its advertising customers through the gradual behavioral modification of its product, You. The novel agreement between You and the platform is the leash of our rendition relationship. As long as we voluntarily hand over our information and partake in sabotaging the democratic structures that proclaim free speech and civil disobedience, we will be no more than “free raw material” in an economy built by our own hands.