There’s no denying that fake news and misinformation are rampant on Facebook, and a new peer-reviewed study from researchers at New York University and Université Grenoble Alpes confirms it.
The researchers analyzed Facebook posts from more than 2,500 news pages between August 2020 and July 2021. They found that pages that regularly post misinformation receive more engagement — likes, comments, and shares — than pages that don’t.
According to The Washington Post’s report on the study, publishers on the right have a much higher tendency to share misleading information than news pages in other political categories.
A Facebook spokesperson said the study focused only on engagement, not on “reach” — the term Facebook uses for the number of people who actually saw a piece of content, whether or not they interacted with it.
However, reach data is not available to researchers. Instead, anyone wanting to analyze Facebook’s misinformation problem has to rely on a separate tool called CrowdTangle, which the social media giant owns.
Even so, the researchers behind this study have had their access to that data cut off. Facebook claims that giving third-party researchers access could violate a settlement it reached with the Federal Trade Commission after the Cambridge Analytica scandal — a claim the FTC has already said is inaccurate.
In an attempt to quell speculation, Facebook released a “transparency report” back in August detailing its most-viewed posts during Q2 2021.
Just days later, however, The New York Times revealed that the social media giant had scrapped the transparency report it originally prepared for Q1 2021, because the most-viewed post from January to March was a false article claiming the death of a Florida doctor was caused by the COVID-19 vaccine — a post anti-vaxxers used to spread misinformation.
Via: The Verge