In September 2021, the Texas Legislature passed a law meant to rein in the social media giants’ monopoly on online opinion. The bill, known as H.B. 20, requires large tech companies to produce regular reports on removed content, create a complaint system, and disclose their content-moderation procedures. It applies to platforms with more than 50 million users but was blocked almost immediately after passage. However, on May 11, 2022, the Fifth Circuit Court of Appeals lifted the injunction blocking the law, allowing it to take effect.

Big Tech censorship has long been the rallying cry of conservatives concerned about free speech online. The removal and banning of elected public figures such as Donald Trump and Marjorie Taylor Greene are alarming, but one danger of social media has largely gone unmentioned. That “unmentionable” is the algorithm that provides users with the content they consume.

What is an algorithm? When social media companies such as Facebook, Twitter, and Instagram began to grow, they knew that as engagement increased on their platforms, they would need some way to filter posts so that users weren’t pummeled with spam. Each company wanted its feed to reflect what was relevant to the individual user.

So each company created an algorithm: computer code that learns what a user likes. Algorithms are designed to learn from several variables, measuring things such as where you spend your time online and which pages you like more than others. The algorithm takes that data and shows you the posts it thinks will interest you.
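To make that concrete, here is a minimal sketch of engagement-based ranking in Python. The signals, weights, and field names are all hypothetical, not any platform’s actual code; real systems weigh thousands of signals, but the shape of the logic is the same: score every candidate post against your history, then show you the top of the list.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: the signals, weights, and field names here
# are hypothetical, not any platform's actual ranking code.

@dataclass
class Post:
    author: str
    topic: str

@dataclass
class User:
    liked_pages: set = field(default_factory=set)          # pages you like
    minutes_on_topic: dict = field(default_factory=dict)   # where you spend time

def score(post: Post, user: User) -> float:
    """Estimate how likely the user is to engage with this post."""
    s = 0.0
    if post.author in user.liked_pages:
        s += 3.0                                    # hypothetical weight
    s += user.minutes_on_topic.get(post.topic, 0)   # dwell time as a proxy for interest
    return s

def build_feed(posts: list[Post], user: User, n: int = 20) -> list[Post]:
    """Rank every candidate post and show the user only the top n."""
    return sorted(posts, key=lambda p: score(p, user), reverse=True)[:n]
```

Notice that nothing in this loop asks what is true, civil, or important; the only question the code can answer is what you are likely to click.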

In case you weren’t paying attention: your social media (Facebook, Twitter, YouTube, TikTok) studies you, and it probably knows you better than you know yourself. Algorithms are the reason you feel like your phone is listening to you; they serve up the ad for the product you just mentioned and the popular song stuck in your head. Algorithms have also been used as policing agents, determining what violates company content policy.

On its face, this information is deeply disturbing. You are being watched and measured every moment you spend on a platform, whether you are liking your mother’s anniversary post or your brother’s kayaking video. Your information is used by advertisers as well as foreign agents. Russia and China use it every day, populating these sites with automated accounts whose only purpose is to sow and reap division. Even so, these results are merely reactive: they are responses to our behavior.

That behavior is itself reinforced by algorithms’ ability to harness some of humanity’s worst tendencies. An algorithm takes what we find pleasurable and creates a feedback loop, using the physiological reactions our bodies have to stimuli to reinforce addictive usage patterns and build filter bubbles, showing us exactly what we want and trashing what we don’t. It makes us crave short-term satisfaction. For some users, the manipulation is so gratifying that its influence feels benevolent. Many users give in and let the algorithm’s current sweep them downriver. These users are not in control of what they see, and they do not care. This apathy lobotomizes opinion, placing the power of self-determination outside the individual.
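The feedback loop is easy to demonstrate. The toy simulation below, with entirely made-up numbers, shows the “rich get richer” dynamic: the feed shows more of whatever gets engagement, every engagement updates the profile, and after a few dozen rounds one topic dominates the feed. That is the filter bubble in miniature.

```python
import random

# A toy simulation of the feedback loop, under assumed parameters:
# the feed samples topics in proportion to estimated interest, and
# each engagement makes that topic more likely to be shown again.

random.seed(1)
topics = ["news", "sports", "music", "outrage"]
interest = {t: 1.0 for t in topics}   # start with no preference

for _ in range(50):
    # Show 10 posts, weighted by what the profile says you like...
    shown = random.choices(topics, weights=[interest[t] for t in topics], k=10)
    for t in shown:
        # ...and let every impression reinforce the profile.
        interest[t] += 0.5

total = sum(interest.values())
for t in topics:
    print(f"{t}: {interest[t] / total:.0%} of the feed")
# Whichever topic gets an early lead tends to crowd out the rest.
```

The user never chose to narrow their world; the narrowing is an emergent property of a system optimized for engagement.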

Free speech is predicated on the free exchange of ideas. The people we choose to interact with confirm or challenge our opinions in the town square. That exchange is unfiltered because we choose to seek out others and form these intellectual bonds; they are not chosen for us. Social media preemptively strikes down a core civil liberty, and that is not discussed enough.

While the future of H.B. 20 remains undecided, we must think about how we consume content online and how we regulate it. What would happen if a foreign power were able to build a psychological profile of every citizen in the United States? What would happen if that profile were used to divide and dismantle civil society simply through targeted content? We need to regulate algorithms and establish standards of privacy that protect both the user and their autonomous decisions.

Alex Urban is an intern at the John Locke Foundation. He is a graduate of East Carolina University and has worked for the past two years as a community organizer.