Facebook’s Facade: Unveiling the Truths of Big Tech’s Master Puppeteer
November 22, 2021
Close your eyes for a moment and reflect. Who are you? What do you believe in? Why does this matter to you?
Without our knowledge, an outside force shapes our ideological frameworks. An elusive figure wraps thin strings around our fingertips, yanking us left and right. However, the identity of this puppeteer is no longer hidden behind a wall of tech jargon and corporate obscurity.
On Tuesday, October 5th, at 10 a.m., the Russell Building in Washington, D.C. quaked with anticipation as Frances Haugen began her testimony. The former Facebook employee’s accusations were alarming but not surprising. The whistleblower had released tens of thousands of pages of internal documents that revealed just what the tech conglomerate had been up to.
To understand how Facebook operates, it is critical to map its business model. Because nearly all its revenue comes from ad interactions, the company is incentivized to keep users on the platform for extended periods, increasing their measure of “daily active users.” The inference is apparent: the longer people are on Facebook, the more ads they encounter and the greater the corporation’s profit.
Facebook’s Business Model by shivani123.muru
“The Algorithm” that journalists and politicians alike express disdain towards is a misnomer. There is no singular, omniscient algorithm that rules users. Our interactions on Facebook (and, by extension, Instagram) are developed through a series of processes with different functions and classification methods.
As soon as we open Facebook, we see the “News Feed,” simply dubbed by frequent users as “the Feed.” To decide which pieces of content are shown, Facebook utilizes a four-step approach.
First, an algorithm examines the inventory. This supply consists of all the possible content one can see, including posts from family and friends, results that align with the user’s interests, and targeted ads. Users used to see this personalized curation of information as a massive benefit. Recently, however, this same system has been dubbed exploitative as the stigma against personal data collection grows.
The program goes on to gather “signals” that categorize content based on the account that posted it; how the user has interacted with this account in the past; and whether the post is a video, link, or photo. These signals are used to predict how relevant content is to the individual and their likelihood of interacting with it.
Based on these predictions, each post is assigned a relevance score. Posts are then arranged from the highest score (most interesting) to the lowest (least interesting) whenever the feed is refreshed.
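The four steps above can be sketched in code. This is a deliberately simplified illustration, not Facebook’s actual system: the field names, media weights, and scoring formula are all assumptions chosen to show the structure of an inventory-signals-prediction-ranking pipeline.

```python
from dataclasses import dataclass

# Hypothetical, simplified model of the four-step feed-ranking
# pipeline. All weights and fields are illustrative assumptions,
# not Facebook's real implementation.

@dataclass
class Post:
    author: str
    media_type: str          # "video", "link", or "photo"
    past_interactions: int   # how often the user engaged with this author
    score: float = 0.0

def gather_signals(post: Post) -> dict:
    """Step 2: collect signals about the post and the user's history."""
    media_weight = {"video": 1.5, "link": 1.0, "photo": 1.2}
    return {
        "affinity": post.past_interactions,
        "media": media_weight.get(post.media_type, 1.0),
    }

def predict_relevance(signals: dict) -> float:
    """Step 3: predict how likely the user is to interact with the post."""
    return signals["affinity"] * signals["media"]

def rank_feed(inventory: list[Post]) -> list[Post]:
    """Steps 1 and 4: take the whole inventory, score it, sort highest first."""
    for post in inventory:
        post.score = predict_relevance(gather_signals(post))
    return sorted(inventory, key=lambda p: p.score, reverse=True)
```

A feed refresh then reduces to one call: `rank_feed(inventory)` returns the candidate posts ordered from most to least predicted interest.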
Recent Algorithmic Changes
Facebook made several modifications to different elements of this system in the past few years, but one of the most impactful changes was implemented in early 2018. The company revised its News Feed algorithm to encourage “meaningful social interactions” (MSIs). This new structure had several unintended consequences, boosting more emotionally charged posts. It did not take long for political figures and influencers to identify this pattern, increasingly posting sensationalist content.
However, the purpose of this change was to encourage interaction between friends and family members rather than the consumption of professionally manufactured content. So why did it have the opposite effect?
To understand how this occurred, we must explore the phenomenon of quantitative popularity. Posts and accounts with more views, likes, comments, and shares garner more attention and, consequently, have more power. These engagement-based metrics help set the narrative for what people view on their social media feeds.
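A small numerical sketch makes the mechanism concrete. The weights below are assumptions chosen only to illustrate the reported shape of MSI-style scoring: when comments and reshares count far more than passive likes, a post that provokes argument outranks a post that is merely enjoyed.

```python
# Illustrative engagement-based ("MSI"-style) scoring.
# The weights are hypothetical, picked to show the mechanism,
# not taken from Facebook's actual ranking system.
MSI_WEIGHTS = {"like": 1, "comment": 15, "share": 30}

def msi_score(engagement: dict) -> int:
    """Sum each engagement count multiplied by its weight."""
    return sum(MSI_WEIGHTS.get(kind, 0) * count
               for kind, count in engagement.items())

calm_post   = {"like": 500, "comment": 20,  "share": 10}  # pleasant, low-friction
heated_post = {"like": 200, "comment": 120, "share": 60}  # provokes argument

print(msi_score(calm_post))    # 500 + 300 + 300  = 1100
print(msi_score(heated_post))  # 200 + 1800 + 1800 = 3800
```

Even though the calm post has more than twice as many likes, the heated post scores roughly three times higher, so an engagement-maximizing feed surfaces it first.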
To model this, I created two separate Instagram accounts and treated each differently. In the first, my personal account, I browsed content on my “Explore” page authentically. Then, based on which reels or posts I lingered on and interacted with, my feed began to diverge from the initially generated one. Rather than generically popular content, I started seeing more Gossip Girl and Gilmore Girls content (my favorite TV shows) and more reels by @itsgurnaz, who shares the beauty of Indian culture. I never explicitly followed any of these accounts but continued to like or linger on them.
In the second account, I followed left-leaning individuals and friends whose political affiliation was more liberal. I then began engaging with their content and soon found only liberal thought represented in my feed. The same would presumably occur starting from a right-leaning standpoint. Many may argue that this is not inherently an issue, but it unfortunately is.
Firstly, through exposure to emotionally charged posts from one end of the spectrum, diversity of political thought is limited, and extremism takes hold of individuals. Secondly, misinformation on social media platforms is rampant. Cognitive biases such as the bandwagon effect and name familiarity encourage the spread of information that may not be entirely reliable.
The Next Generation
There are many benefits to the way these algorithms are structured. Today, we have unfiltered access to the world, and this globalization has brought opportunities to people of all backgrounds.
However, a significant consideration for the future is the effect of this platform on youth. A set narrative can impact many young people’s perspectives in the defining years of their lives. MSI algorithms encourage the spread of polarized content. One case study represents this phenomenon clearly: religious extremism in India. According to Facebook’s researchers, hate speech increased significantly during the nation’s riots. Young people are often manipulated into choosing specific sides in a moral battle of identity and humanity.
Furthermore, individuals are increasingly exposed to unrealistic expectations. We watch videos of perfect people living perfect lives. As a teenage girl, I scroll through Instagram and Facebook encountering those who are skinnier, smarter, and more successful than me. Ali Abdaal seems to work for 40 hours within a 24-hour period, Ruby Granger studies for 14 hours a day, @urmomashley has an entire mall in her closet, and “that girl” is embodied across multiple accounts. The term “that girl” refers to an aesthetic trend that encourages the perfect lifestyle: someone who has everything together, works out, eats healthy, is productive, and is effortlessly beautiful.
“Social media spreads a certain body type, and even if body positivity is spreading, that doesn’t mean people don’t edit their pictures to make them look skinny or flawless, which alters my perspective of what is ‘attractive,’” says South student Elisabeth Moreau.
As these trends become more popular among their young niche, the algorithms work to promote them, and influencers are incentivized to latch on to this type of content.
As a result of these social repercussions, a niche within the nonprofit tech sector emerged: digital wellbeing. This specific movement works to better the relationship between individuals and technology, emphasizing mental health. LookUp, a nonprofit within the digital wellness field, recently held the Youth 4 Youth iSummit, bringing together hundreds of speakers and participants to engage in this conversation about the future of social media. One specific seminar was “Live Focus Group: Gen Z Responds To Facebook Hearings – Demands More From Big Socials.” This session was led by moderator Rishi Bharwani, Director of Partnerships and Policy at Accountable Tech.
To preface the discussion, Senator Markey of Massachusetts was passionate in voicing his opinion: “We introduced the Kids Internet Design and Safety (KIDS) act. My legislation… limits advertising and commercial content like product placement, [and] prohibits amplification of harmful and violent content to children.”
Vinaya called for greater protections for younger individuals on social media:
“I feel like there should be a lot more regulation on those apps that are harming the younger generation because they can’t advocate for themselves like we are advocating for ourselves right now.”
— Accountable Tech (@accountabletech) October 15, 2021
Censorship and Algorithms
We see a common trend in the nonpartisan movement of digital wellness as policymakers call for changes within Facebook’s algorithm to better the mental health of youth and reduce the spread of misinformation. In May of 2021, the company began tagging accounts that repeatedly share fact-checker-flagged information, expanding penalties to minimize the spread of misinformation, and reframing the notifications users receive when sharing flagged posts. However, misinformation is not the only content that people wish to curb; hate speech has also been spreading rapidly. But this calls into question a highly subjective censorship process.
“What exactly is considered hate speech?” a student from South Forsyth High School asks.
On that note, who decides what is “appropriate” to post? It is not necessarily the role of Facebook to curate a platform for healthy political discourse or ideological framing at this level. According to its Investor Relations page, “Facebook’s mission is to give people the power to build community and bring the world closer together.” Perhaps, then, some of that responsibility falls to the people. After all, algorithmic changes can accomplish little if the users themselves nefariously exploit the platform.
For instance, it may be beneficial to encourage user discretion or to wait a few more years before giving children access to social media. You would not run up to someone and yell slurs in their face; what makes it acceptable to do so online? Why are the rules that govern online interactions more fluid than those governing face-to-face ones?
Thus, I began thinking about the benefits of contingencies on user etiquette, but this cycles back to the root of the problem: who decides? Censorship has never been an easy process, as we must consider current power structures and their influence on the practice. But this much is clear: as much as we benefit from global interconnectedness, we also face the disturbing effects of polarization, addiction, loneliness, and insecurity.
Today, Facebook guides the strings of the puppet show that is our world. Tomorrow, it may be someone or something else. Amidst it all, we must remember to see every side of the spectrum, formulate our opinions carefully, and explore our identities outside the realm of virtual deception.
Disclaimer: The opinions expressed in this piece belong solely to their respective author(s). They do not represent the opinions of South Forsyth High School or Forsyth County Schools.