Facebook may be changing its corporate name, but it’s still peddling climate misinformation. According to a new report from the advocacy organization Stop Funding Heat and the ad hoc group of activists called the Real Facebook Oversight Board, the platform’s existing mechanisms don’t go nearly far enough to rein in false or misleading content about climate change. 

The groups analyzed 48,700 posts published between January and August 2021, covering 196 Facebook groups and pages that are known to publish false climate claims. They identified 38,925 instances of climate misinformation — only 3.6 percent of which had been evaluated by Facebook’s third-party fact-checkers. Eighty-five percent of the content bore no link to the platform’s Climate Science Center, a tool the company launched ostensibly to provide Facebook users with “factual resources from the world’s leading climate organizations.”

“The extent of climate misinformation on Facebook’s platform is a lot more than they are giving away,” said Sean Buchan, Stop Funding Heat’s research and partnerships manager and a lead author of the report.

According to the report, this mostly unfiltered climate misinformation was viewed up to 1.36 million times daily over the past eight months, nearly 14 times the daily traffic Facebook claims for its climate science information hub.



These findings build on the scientific community’s growing antipathy toward the social network and its role in proliferating misinformation. Thanks, in part, to Facebook’s recommendation algorithms, conspiracy theories have spread rapidly across the platform, from QAnon nonsense to baseless claims about the efficacy of COVID-19 vaccines.

Facebook’s parent company, now called Meta, has taken some steps to address its misinformation problem, including launching a third-party fact-checking program in 2016 and creating topical information centers meant to connect users with reputable news and information sources. There’s one for COVID-19, for example, and another for voting and election information.

Demonstrators in London protest climate misinformation on Facebook, Google, and YouTube. Ollie Millington / Getty Images

However, critics say that these efforts still don’t go far enough. Reflecting on the shortcomings of the company’s Climate Science Center — launched in September 2020 — Buchan did not mince words. “Either they don’t care or they don’t know how to fix it,” he said in a statement tied to the new report.

Granted, the climate misinformation is not coming from Meta or Facebook itself. The report by Stop Funding Heat and the Real Facebook Oversight Board found that one-fifth of the groups and pages analyzed could be classified as “full-time” or “dedicated” misinformers that regularly post overtly false content. Fewer than one in 10 of these groups’ posts had a fact-checking label applied, and only 10.6 percent carried a link to the Climate Science Center.


The report also identified “subtler” forms of climate misinformation that were able to bypass Facebook’s fact checks. This content, which accounted for the vast majority of misinformation in the analysis, included posts containing links to far-right news sites like Breitbart, or flawed reasoning from media personalities.

For example: “Better start saving up for your electric car,” read one post from the conservative commentator Glenn Beck, linking to an article claiming that U.S. efforts to slash emissions would be economically burdensome. (In reality, failure to act on climate change is projected to cost the global economy up to $23 trillion annually by 2050.) Another post from Australian senator Malcolm Roberts incorrectly suggested that snowy weather disproved global warming. (Weather is not the same as climate; even though it still snows, global average temperatures have already risen by 2.2 degrees Fahrenheit since 1880.) 

Facebook did not flag either post for a third-party fact check. As Stop Funding Heat discussed in a previous report in May 2021, this may be due to loopholes in Facebook’s fact-checking policies, which until August shielded content that was deemed to be “opinion.” (The same set of rules continues to exempt politicians from ever being fact-checked.)

In response to the study, Meta expressed concerns about the authors’ approach, saying that the report used “made up numbers and a flawed methodology to suggest content on Facebook is misinformation when it’s really just posts these groups disagree with politically.” Over the past year, Meta has repeatedly stated its commitment to climate action. It reached net-zero carbon emissions in 2020 and recently announced it was operating on 100 percent renewable energy.

Mark Zuckerberg, chair and CEO of Meta, Facebook’s parent company. AP Photo / Esteban Felix

Jake Carbone, a senior data analyst at the nonprofit think tank InfluenceMap who was not involved in the new report, said that these efforts are incommensurate with the scale of change needed to address the urgent problem of climate misinformation. “When stories look back on the impact that Facebook has had, they’re not going to talk about the emissions that came from its servers,” he said. “It’s really important to look at the impact that its information ecosystem is having.”

The report identified a number of measures, mostly involving greater transparency, that could help Meta address Facebook’s misinformation problem. The authors called for the company to make its definition of climate misinformation public and to share internal research about how misinformation spreads on its platform. They also suggested a total ban on climate misinformation in paid advertising, a channel through which millions of viewers are exposed to anti-climate action messaging.

“This content causes harm,” Buchan said, adding that without much stronger efforts and transparency from Facebook’s parent company, government regulators may need to step in. In the lead-up to the major climate conference known as COP26, he said, other big tech companies have rolled out new efforts to combat climate misinformation. Last month, Google announced a pledge to demonetize climate denial content on its video platform, YouTube. And this week, Twitter announced a new policy of “pre-bunking” climate disinformation in an attempt to get ahead of false content before users see it.

Meta, however, seems determined to double down on familiar Facebook tactics. In September the company announced a $1 million investment in a new fact-checking grant program; earlier this week (timed with the start of COP26) it pledged a series of initiatives to drive more Facebook users to the Climate Science Center and suggest ways for Messenger and Instagram users to shrink their carbon footprint. But critics say these programs are unlikely to change the underlying misinformation problems highlighted in Stop Funding Heat and the Real Facebook Oversight Board’s report.

“It’s a distinct lack of humility during this time,” Buchan said. “We’re hoping that Facebook can take leadership on this issue before it gets a lot worse.”