Facebook and Misinformation

Facebook is currently under scrutiny for failing to properly regulate misinformation about the coronavirus. Facebook users will know that when news shared on the platform could be inaccurate, Facebook attaches a warning label to the post. Since February, Facebook's stated policy has been to simply remove any posts that could spread false information about the coronavirus. Last week, a former Facebook employee, Frances Haugen, testified before Congress about the trouble Facebook is having regulating misinformation about the coronavirus and the vaccine specifically. Haugen said that, given Facebook's current capabilities, the platform is likely removing only "10 to 20 percent of [false] content" about the coronavirus and the vaccine.

Facebook users are not the only ones outraged by this development; President Biden has already expressed strong feelings about how social media platforms have handled the pandemic. In July, President Biden said Facebook was "killing people" through the inaccurate information it spread. With this new information about Facebook's inability to remove inaccurate posts, debates have reignited about how safe social media platforms are in general. Parents have begun to express new concern for their children aged 13 and up, who are old enough to create Facebook and Instagram accounts. The question here is: who is in the wrong? Is Facebook at fault for failing to stop the spread of false information? Or are social media users at fault for posting false information in the first place, and then believing everything they see online?
