How social media giants are spiraling out of control
Maybe connecting everyone in the world wasn't such a good idea after all: Facebook's dominant selling point has long been its ability to bridge physical barriers and bring large groups of people together on its platforms. But what if that very goal is at the root of many of the problems we're seeing today, from ethnic violence to conspiracy theories and social division?
The good ol’ days: In Facebook's early years, you might have seen only a handful of posts from friends you actually interacted with in real life. Today, billions of accounts belonging to people with varied interests from across the world can connect in some way with no more than a single tap on a screen.
The company argues that this evolution was a good thing: a major platform like Facebook can give some lonely kid who's the only person into anime in their town access to a whole network of individuals scattered around the globe with shared interests, a connection that might never have been possible offline.
But rapid and widespread connection carries costs: Users with harmful interests or violent inclinations also have access to that same vast network, where they can easily meet like-minded individuals and consume content that reinforces those views. We're talking about everything from militant recruitment to the rapid spread of conspiracy theories. The problem extends to non-violent communities too: people often find themselves deeply entrenched in internet echo chambers where everyone around them shares the same views and is rarely challenged.
There's also the problem of professional content creators: Content on Facebook's platforms in its early years was mostly driven by ordinary users sharing posts and images among themselves, but the rise of online influencers has fundamentally changed how the platform operates. Influencers now drive user engagement and help the platform expand its indirect network effects. This poses a separate but closely related issue: algorithmically boosted content from professional creators, a recipe that has been shown to be harmful to many young users on its platforms.
The problem isn't exclusive to Facebook; it is fundamentally tied to the way companies have designed social media networks. Platforms need ever-expanding networks to draw more people in and keep them spending more of their time engaging with content. Facebook whistleblower Frances Haugen at one point described the situation the company faces as a "feedback loop that they can't get out of."
Communicate the risks of using its platforms: To rein in some of the damaging effects of the vast global network Facebook has created, the company could begin by releasing research detailing the mental health issues that may arise from social media use, argues Andy Wu in the Harvard Business Review.
Bolster moderation efforts: Facebook could also expand its team of 15k moderators, who are currently tasked with reviewing questionable content posted to the site for violations of community guidelines.
Take accountability for connections: Some internal or external party needs to hold Facebook accountable specifically for its algorithm-generated connections. Content is already reviewed by the company's team of moderators and its Oversight Board, but a large component of what makes Facebook problematic remains unchecked: the suggested connections generated by its algorithms, says Wu. Public representatives could be stationed inside the company to oversee it, said Frances Haugen, the former Facebook product manager who testified against the company at a Senate hearing.
Cool down the algorithms: Facebook could also swap algorithms that prioritize whatever content is most likely to engage users for plain old chronologically ordered timelines. Some have even suggested the company orient its content algorithms toward boosting posts likely to promote charitable causes or civic engagement, rather than likes.