Are Social Media Algorithms Controlling The Truth In The New Normal?

As likes and shares dictate what we see online, are social media algorithms filtering out the truth? Here’s how it affects the world today.

Ever searched for something on Google, and the next thing you know, your YouTube recommendations and Instagram ads are all about it? It may seem that our phones are constantly listening to our everyday conversations, but the real explanation is that most social media apps are driven by algorithms: formulas that rank and recommend content based on your data, filtering the enormous amount of content available online. Mark Zuckerberg has described the goal of these algorithms as providing “the perfect personalized newspaper for every person in the world.” Instead of sorting posts chronologically, your feed is ordered by the content you are most likely to engage with, regardless of whether it’s the latest news. This may seem harmless, but a newspaper isn’t meant to be personalized; it’s meant to deliver the truth. So as social media platforms become our main source of information, do they carry the same responsibility?
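
To make that distinction concrete, here is a minimal, purely illustrative sketch in Python of the difference between a chronological timeline and an engagement-ranked feed. The Post fields, the predicted_engagement scores, and the example posts are assumptions invented for this sketch; they are not any platform’s actual system.

# Illustrative sketch only: a toy "engagement-ranked" feed, not any real platform's algorithm.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    predicted_engagement: float  # assumed score: how likely you are to like, comment, or share

def chronological_feed(posts):
    # A traditional timeline: newest posts first, no personalization.
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)

def engagement_ranked_feed(posts):
    # A personalized feed: posts you are most likely to engage with come first,
    # even if they are older than other posts.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

posts = [
    Post("news_outlet", "Breaking: latest headlines", datetime(2021, 11, 2, 9, 0), 0.20),
    Post("close_friend", "Look at my new puppy!", datetime(2021, 11, 1, 18, 0), 0.90),
]

# The same two posts appear in a different order depending on how the feed is sorted.
print([p.author for p in chronological_feed(posts)])      # ['news_outlet', 'close_friend']
print([p.author for p in engagement_ranked_feed(posts)])  # ['close_friend', 'news_outlet']

In this toy example, the fresher news post leads the chronological timeline, while the engagement-ranked feed pushes the friend’s puppy photo to the top because it is predicted to get more interaction, which is exactly how timely news can slip down a personalized feed.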

Regine Cabato, a Filipino journalist at The Washington Post, explains how algorithms play a big part in forming echo chambers or “filter bubbles” that shape how we view and interact with the world. “Social media becomes a venue for confirmation bias; that is when what you see just affirms your own opinion,” she explains in our interview. This keeps the public from seeing the full picture of reality: global events go unnoticed, shifts in the political landscape are overlooked, and information is shared without sufficient and credible basis. “One example of how these bubbles can be limiting was the landslide loss of opposition candidates in the 2019 midterm elections; these candidates were popular on Twitter, but that discussion did not translate to votes from the ground,” Cabato says.

Arshie Larga, a pharmacist and content creator, benefited from the TikTok algorithm when his videos debunking myths about medicine and COVID-19 went viral. It allowed him to reach a wider audience, encouraging people to get vaccinated and explaining how vaccines could help end the pandemic. “Now, it’s not just my customers at the pharmacy who benefit from my advice on how to properly take their medicines, but also the people who follow my social media accounts,” he says. Yet even though the algorithm helped him reach his audience, he agrees that it can also lead to seeing the world with a closed mind. “The information we get is limited to whatever our mutuals post or share online, so we end up trapped in an echo chamber. That’s why some people feel they are always right, and we lose the ability to accept other people’s opinions.”

These algorithms have also proven prone to favoring emotion over fact: hate comments and heated arguments consistently draw more engagement. As a journalist, Cabato shares her firsthand experience with online trolls who try to harass her and discredit her work. “Earlier this year, I left Facebook for over a month after receiving dozens of friend requests from strangers, even after I restricted my privacy settings. In less than four days, I had racked up over 700 new friend requests, and I couldn’t open the site without a flood of notifications. It was bothersome and I would have preferred to quit Facebook altogether, but it’s so integrated into life and business in the Philippines. All the press conference livestreams, official announcements, and contacts to sources are there, so I had no choice but to reactivate. Once I reactivated, my engagement dropped whenever I shared articles I had published, possibly in part because news gets lower engagement and because my account had been inactive.”

It’s also important to remember that behind the algorithms are people, and we influence the algorithms as much as they influence us. Sensationalized stories and clickbait headlines are rampant because writers and content creators use them to make the algorithm favor their content. Fake news also spreads faster than facts, largely because people take headlines at face value. “Whether you have a big or small following on social media, it is our responsibility to ensure that every piece of content we post is factual and well researched,” Larga says, adding that his own content carries a heavy responsibility. “It would be very dangerous to spread false information about medicines because people’s lives are at stake. We always have to remember that there are people who believe everything they see on the internet without fact-checking or doing their own research, especially when they see that the person who posted the content has a huge following on social media.”

To be clear, algorithms are not inherently evil or malicious. As users, we benefit from them; they are a big part of why our time online is enjoyable. But we also have an obligation to uphold responsible media practices, including fact-checking. And as the technology matures and grows more complex along with the kinds of media and the number of channels we are exposed to, tech companies need to step in for truth to prevail. “Social media operators have an obligation to set the necessary restrictions to keep hate speech and potentially harmful and false information from prospering on their sites. These restrictions are necessary to protect the public’s interest and the rights and reputations of others,” Larga asserts. “Social media companies should be one step ahead of disinformation networks instead of playing catch-up, or cleaning up only after accounts are mass reported,” Cabato adds. “Deprioritizing false and malicious information in the algorithm and favoring facts on people’s newsfeeds is not equivalent to suppressing free speech. In fact, it informs people so that free speech can be exercised responsibly.”


This feature story on social media algorithms also appears in MEGA’s November issue, now available on Readly, Magzter, PressReader, and Zinio.
