Big tech censorship and algorithms promote dystopian outcomes
Last week, I wrote an opinion piece in the Carolina Journal titled “The Attack on Free Speech You Haven’t Heard About.” I detailed how Big Tech uses algorithms to filter speech, promote content, and target advertisements. I concluded my piece by stating that current algorithm models discourage an honest and truthful marketplace of ideas. However, I left the implications of this subversive technology, as well as that of overt censorship, for another day.
Today is that day.
Social media censorship, both subliminal and overt, can permanently harm our civil society and our republic by manipulating the way we think and the degree to which we accept content policing.
When discussing the fact-based effects of social media censorship (and the effect of social media on democracy in general), researchers face a unique problem: the causal link is hard to define. The technology used to filter posts, the platforms themselves, and the usage patterns are constantly changing. It is difficult to pinpoint an exact cause of these phenomena because the sites change faster than we can study them. Popular posts that generate the most comments and shares are amplified to different extents, and the companies themselves don’t know why.
What we do know is that outrage sells. Content that is inflammatory or expresses moral shock is shared at a 17% higher rate, according to an NYU study. We also understand that the neural pathways created and reinforced by internet usage favor shallow thinking and brevity. Combine this with users’ tendency to seek out content and individuals that share their values, reinforce it with addictive algorithmic mechanisms that target dopamine release, and users are banded into tribes (both by choice and by design). This tribalism limits the range of ideas each user sees and can spur polarization by pushing content meant to excite. Some users welcome the filter, as it reinforces their self-image and satisfies their ideological desires.
These practices leave us with a populace that has a short memory, a shorter fuse, and a habit of letting authority guide its thinking. The situation reeks of vulnerability. Misinformation and disinformation have created false narratives that still abound on numerous platforms, spreading at twice the rate of truthful content. Because of the medium, many lack the capacity to search for truth beyond 280 characters. All the while, political manipulation has begun; the two biggest examples are the Black Lives Matter riots and the January 6 storming of the Capitol. Demagogues and provocateurs took moments that spoke to deep societal ills and twisted them to convince rational people that violence was the only way out (and that it was justified). When platforms overtly censored this information, even the parts that may have been false, the censorship only solidified the view that these issues were being suppressed. When company policies do not favor free speech, you throw a match within ten feet of a powder keg. I only pray a strong breeze doesn’t blow by.
Even so, this incendiary content was neither broadly organized nor specifically targeted. Organized, targeted manipulation is where the real danger lies, and it is my reason for writing this article.
Narrative is power. If you control it, you control everything. Social media is notorious for creating a dichotomy of facts that exist in separate political spheres. Harness this to your advantage, further inciting division and spurring your supporters to act on whatever insane premise you’ve promoted, and you have the capacity to create change overnight. We have seen this with mobs attacking figures, policies, and institutions that they find problematic.
Corporations and individuals alike blacklist these pariahs for social currency or monetary gain. For many, it creates a metaphorical cage they cannot escape. Yet it reinforces the idea that what happens on social media can be extended into the physical world. With this premise, replace followers with fighters and blocks with bullets, and you have a crisis on your hands.
If such a thing were to happen, it would find us already acclimated to a world where speech can cause harm. Our data, detailing every aspect of our lives (who we are, what we like, and how we are vulnerable), is open to manipulation. The capability to enforce totalitarian practices is in the palm of our hands, and we have no one to blame but ourselves.
We must regulate Big Tech so that it protects free speech. That starts with eliminating or reformatting the algorithms that cater to preferred content and drive outrage amplification. Continue the trend by relaxing misinformation policies and allowing users to decide for themselves what is true. Doing so may save us from ourselves by mitigating the platforms’ negative physiological and psychological effects while promoting actual discourse.
Social media giants’ right to free speech should not be weighed against the users’ civil liberties. There must be standards to uphold when these apps and websites are the forums for most political thought in the country. Following the logical thread of harnessing and censoring this content leads to a dystopian and eerily familiar place. I love Ray Bradbury, but I would prefer not to live in one of his novels. In that, I hope I am not alone.
Alex Urban is an intern at the John Locke Foundation. He is a graduate of East Carolina University and has worked for the past two years as a community organizer.