Facebook Inc. said it has removed more than 20 million posts on its main social network and photo-sharing app Instagram for violating rules on Covid-19 misinformation since the beginning of the pandemic.
The Menlo Park, California-based company also said it added information labels to more than 190 million Covid-19-related posts on Facebook that third-party fact-checking partners had rated as false or missing context.
The data, which covers actions taken through June, was released Wednesday as part of Facebook’s quarterly community standards enforcement report.
Facebook is seeking to address criticism that its platforms have been used to spread fear about vaccines and misleading information about the coronavirus. The company implemented new policies against Covid-19 misinformation, including banning repeat offenders who spread falsehoods and directing users to a central Covid-19 information hub.
Facebook also said it will start sharing a new quarterly report outlining the most widely viewed public posts on the service. The new report included the most popular external web domains viewed on Facebook, with YouTube topping the list, followed by Amazon.
Facebook said just 13% of all the content on its News Feed includes a link to an outside web page, like a news article, and that more than half of all posts people see come from their friends or family members. Many of the most popular posts are just questions meant to elicit a reply, like, “What is something you will never eat, no matter how hungry you get?”