Facebook is the world’s largest social media network. With nearly 2 billion users, it’s a major force among internet users and has become one of the most popular and central ways that people communicate, create, and share online. It isn’t without its problems, though.
Since launching in 2004, Facebook has seen plenty of PR nightmares, crises, and criticism. Privacy concerns, instances of ad fraud, questionable content on Facebook Live, and more have threatened the company’s reputation. It remains atop the social media platform food chain, but here are the top 5 controversies and criticisms that have shaped Facebook as we know it today.
Fake accounts aren’t a problem unique to Facebook. Bots exist on Twitter, Instagram, and other prominent social media platforms, and they function a little differently on each. Facebook runs on an algorithm, though, and fake accounts there serve two primary functions. First, bots can commit a kind of ad fraud, clicking on ads and skewing metrics and budgets. This has been a point of contention for years and has, in some cases, led to advertiser criticism and controversy. Second, those bots become a method for gaming the algorithm and pushing specific content to the top of feeds.
In April 2017, Facebook purged 30,000 fake accounts linked to attempts to spread misinformation ahead of France’s presidential election. It’s impossible to know how many fake accounts exist worldwide, but since the 2016 United States presidential election, Facebook has been working with leaders in other countries to mitigate the effects of politically motivated fake account operations.
The most straightforward of Facebook’s scandals is one that may matter most to advertisers. In September 2016, Facebook publicly announced that it had found an error in the way it calculated a key metric that advertisers use to gauge audience response to videos: average duration of video viewed.
Facebook was calculating this metric by dividing the total amount of time spent watching a video by what it considers “views,” which it counts anytime someone watches a video for more than three seconds. To get an accurate read on average duration of video viewed, it should have divided the total amount of time spent watching the video by the total number of people who watched, regardless of how long they watched.
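The difference between the two calculations can be sketched in a few lines of Python. The watch times below are hypothetical, chosen only to illustrate why dividing by 3-second-plus “views” inflates the average:

```python
# Hypothetical per-viewer watch times in seconds (not real Facebook data).
watch_times = [1, 2, 5, 10, 30, 60, 2, 1, 45, 3]

total_time = sum(watch_times)                           # 159 seconds watched overall
all_viewers = len(watch_times)                          # 10 people started the video
counted_views = len([t for t in watch_times if t > 3])  # 5 "views" of more than 3 seconds

# What Facebook was reporting: total time divided by 3-second-plus views.
# Short viewers add watch time to the numerator but vanish from the denominator.
inflated_average = total_time / counted_views  # 159 / 5 = 31.8 seconds

# What the metric should be: total time divided by everyone who watched.
correct_average = total_time / all_viewers     # 159 / 10 = 15.9 seconds

print(inflated_average)  # 31.8
print(correct_average)   # 15.9
```

With these numbers the reported average is double the true one, which matches the shape of the complaint: the error only ever made videos look more engaging than they were.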
Average duration of video viewed isn’t a metric that determines how much advertisers pay, nor is it public, but it does give creators and advertisers valuable feedback on their content. Miscalculations like this one irk advertisers and make it more difficult for them to trust the audience information and data that Facebook guards so closely.
Related Post: How Facebook Miscalculated Its Video, Explained
Often used for spur-of-the-moment confessionals, event broadcasting, and engaging videos that enable dialogue between creators and audiences, Facebook Live has been wildly successful. But Facebook Live has a dark side.
Facebook Live lets users broadcast anything from virtually anywhere, and that access has led to a rise in offensive, dangerous, and inappropriate content on Facebook. Facebook Live has been used to broadcast dozens of acts of violence, including murders, suicides, and beatings.
Facebook has a flagging system in which users can report instances of violence, hate speech, or any other inappropriate content or behavior on a live broadcast, but flagging will always be an imperfect system. Livestreams are unpredictable by nature, and removing flagged videos takes time and depends on users to alert Facebook to content that should be halted and/or removed.
The company announced in May 2017 that it would hire 3,000 people to track reports of offensive content (especially on Facebook Live) and act on content that’s flagged. These 3,000 employees are in addition to an already robust force of 4,500 employees responsible for monitoring content on Facebook.
Related Post: The Top 30 Viral Facebook Live Videos of 2016
Though fake news isn’t just a Facebook problem, Facebook is a primary source of news and information for its users. The primary criticism surrounding Facebook and fake news isn’t that fake news can be shared on the platform; it’s that it’s allowed to exist in the News Feed alongside legitimate news stories.
Part of the problem goes back to fake accounts, which can be used to push fake news stories to the top of the feed, giving them credibility and making it more difficult for people to determine what’s true and what isn’t. Another key element is the lack of vetting for news on the platform. Because publications and stories appear in the News Feed without vetting, users might assume that those stories are fact-checked and accurate.
In reality, Facebook’s fact-checking system is lacking and has given rise to a News Feed ecosystem that serves fake news alongside real news, making it difficult for users to tell the difference. Though Facebook is taking steps to fix this (along with its fake account problem), fake news is a problem that will likely plague the platform for some time.
Related to the problem of fake news is the controversy around the way that Facebook surfaces information in general. In May 2016, Facebook ran into trouble when it came to light that people working on its Trending Topics team were influencing which stories appeared in Trending Topics and intentionally suppressing conservative media.
Facebook’s also run into trouble over decisions to remove (or not remove) content. The platform has strict guidelines that dictate what it will and will not allow on the site, but applying those guidelines involves subjective judgment. For example, Facebook found itself in the middle of controversy when it removed a Pulitzer Prize-winning photo for violating the site’s nudity guidelines.
The other major part of the argument about how Facebook serves content is the personalization algorithm itself. A major tenet of the Facebook feed is that it’s tailored to you and what you like. The flip side of that personalization, though, is the echo chamber: the algorithm filters out content it doesn’t think you’ll agree with, which can warp your worldview.
There’s no quick fix for problems like the subjectivity of offensive content or the rise of fake news, but one thing’s abundantly clear as we look at the biggest scandals and PR crises that Facebook’s faced: As the world moves forward and takes on new challenges in social media and communication, Facebook’s going to be at the center of the conversation.