Back in my day, Facebook was for sharing cringe-worthy pictures from high school dances and pretending to keep up those high school friendships when you went off to college. By “back in my day,” I mean all of 10 years ago.
Facebook isn’t just for friends and family anymore. It’s the go-to place for proponents and skeptics of COVID-19 vaccines alike. Digital screaming matches are endemic in both corners. COVID-19 rants end friendships, and political diatribes scramble the invite list to Thanksgiving dinner. It’s easy to blame Facebook’s shift toward rage and reactivity on, well, everything else. Politics is exceedingly polarized, and science is heralded or scorned with religious devotion. But with more than 70% of Americans on the platform, it’s not so easy to tell whether Facebook is the cause of it all or merely the effect.
As Jeff Horwitz and colleagues recently reported in The Wall Street Journal, Facebook’s hands are not clean. They might not have brewed the toxic muck we’re all swimming in, but they appropriated the recipe and are now pumping it out on an industrial scale. I’m not here to tell everyone to rage-quit the platform. I’m on it, myself. But when we recognize how Facebook has been built, we can start gaming their system to build our community rather than break it.
We need to game their system because they scientifically designed it to make everyone — neighbors included — turn on each other. That feeling you get when you read your neighbor’s asinine political post? Psychologists know that as moral outrage, and it’s a powerful motivator. If you’re one of the 1.9 billion people who use Facebook on any given day, feeling that moral outrage makes you more likely to comment, re-post, and generally amplify the message. That’s a problem for America when coronavirus conspiracy theories and cries of political victimization spark the most moral outrage, and are therefore the most viral.
These virtual civil wars are a problem for America, but not for Facebook. In fact, they’re the solution to Facebook’s problem. Have you ever been doom-scrolling down your feed and thumbed through 10 or more posts that erode your faith in humanity before you find one that restores it? That’s the result of deliberate action from Facebook’s brightest minds. In 2018, the company was facing a crisis of declining user engagement. Less engagement means fewer eyeballs glued to their product for fewer hours, and their advertisement-based lifeblood dries up with it. And so, in a fit of business brilliance — and hopelessly naïve (or disingenuous) promises that it would bring people together — they tweaked their algorithm to boost the posts with the most engagement. What followed was the salvation of their bottom line and the erosion of the feel-good, yearbook-like ethos of Zuckerberg’s utopian vision for Facebook.
It’s not something you can turn off. Psychological research has found that emotionally charged content — the precise kind of political posts that spark moral outrage — captures your attention before you’re even consciously aware of it.
So let’s say you’re ready to reclaim your eyeballs and resign from the Facebook mob, but you aren’t willing to give up the pictures of the grandkids or community events you only hear about on the platform. It takes work, but you can control your Facebook experience and revitalize your online tranquility.
You don’t need to unfriend your radical uncle. You don’t need to excise trusted news sources or public figures from your life. You can tweak your timeline. Add those friends of yours who are reliable rays of sunshine to your “favorites” list. “Snooze” or “take a break” (actual Facebook terms) from those reliable bear pokers. And always, always, follow the Golden Rule of the internet: never read the comments.
As a psychology teacher, I expect nothing less from you, dear reader, than I do from my students: check your sources! Editors compose headlines for one purpose: to grab your attention. They’re good at their jobs, and most people never get past the headline to see how honest the actual content is. But checking your source doesn’t end with reading the article. Really check it out — evaluate it. Facebook isn’t the only one with ulterior motives. Every author, from concerned citizens to journalists to (gasp) politicians, has a bias.
Every author’s goal is to manipulate your emotions to make you care, because that’s the only way you’ll slow down long enough to read what they have to say. Some authors use their power of manipulation for good. If every piece an author writes stokes the fire of moral outrage, they’re probably not one of the good ones. Before you spread their message in a fit of righteous fury, take a breath and check the facts.
After all, who wants to live in a state of rage? More than that, who wants to be the kind of person who brings anger to people instead of happiness? Facebook is now an enemy of civil discourse, but we don’t have to quit cold turkey to fight against it. Our relationships — and our sanity — are worth the battle.
Nathan Ahlgrim is a psychology instructor at Catawba Valley Community College. He moved to Hickory in June 2020.