Who says there’s no good news?
Last Monday I started my week off with a smile when Facebook went down for nearly six hours.
One can imagine the increase in productivity employers saw while online agitators and “personalities” had a nice cool-down period.
The hits didn’t stop there, though.
The next day, Frances Haugen, a former Facebook employee, gave a stunning glimpse behind the scenes of the technology behemoth, revealing to a Senate panel that the company's leadership prioritizes “profits before people,” and called on lawmakers to intervene.
Her testimony came after leaked internal Facebook studies showed the company’s platforms, including Instagram, actively harm children and propagate dangerous misinformation.
Haugen alleges the company knowingly serves harmful, eating disorder-related content to young users, and warns that its platforms could also present national security concerns if exploited by authoritarian leaders.
At the root of the issue is the company’s algorithm, which is designed to deliver content that engages you, for better or worse. That design makes it responsible for fueling polarization, misinformation and other toxic content.
Haugen said this “engagement-based ranking” is also driving children and teenagers to destructive online content, leading to body image issues, mental health crises and bullying.
But when engagement means revenue, it’s hard to believe a company would intervene to pump the brakes on that.
The company now says it will unveil new features on its platforms, including suggesting teens “take a break” if the algorithm judges they’ve been scrolling Instagram for too long or repeatedly viewing content deemed harmful to their mental health.
Most people probably join social media to stay connected with friends and family, but they stick around for all the “you won’t believe this” content.
Some days are slow news days, a break everyone needs, journalists included. Facebook bypasses this by ensuring fresh “the world is burning” content is delivered to your feed daily, even hourly.
The angrier you get, the more money Facebook makes.
It’s clear there is a health issue here, but it’s not as black and white as earlier fights to hold big tobacco companies and car manufacturers accountable.
It’s tricky to put a warning label on free speech, especially when you’re dealing with a private company that’s not even remotely transparent.
While many have called on the government to intervene, others see a potential conflict in letting the government exert power over journalism and free speech.
But what I want to see is accountability. At Facebook, there’s no one saying, “The buck stops here.”
Instead, they blame the algorithm. They blame those creating the content. They shrug and say, “What can you do?”
Things might change if we held Facebook and other tech companies to the same standards publishers must abide by.
Unlike newspapers and traditional media, social media companies are shielded by Section 230 of the Communications Decency Act from legal liability for what others post on their services.
In short, content that would take The Newnan Times-Herald to court, instead, generates revenue for Facebook.
To fix these ongoing issues, Facebook and other social media platforms should be reclassified as publishers, which makes them liable for spreading false, defamatory or otherwise damaging information.
No one has a bigger audience, so why should they be exempt from the same standards of accountability that traditional media companies adhere to?
Until there is some form of accountability at Facebook, absolutely nothing will change. Clicks mean money, and Facebook is already planning for the next generation of users.
Clay Neely is co-publisher and managing editor of The Newnan Times-Herald. He can be reached at email@example.com