This story is part of Fix’s What’s Next Issue, which looks ahead to the ideas and innovations that will shape the climate conversation in 2022, and asks what it means to have hope now. Check out the full issue here.


Six years ago, 196 nations agreed to transform their societies and economies to reduce greenhouse gas emissions and hold global warming to well below 2 degrees Celsius above preindustrial levels. Under the Paris Agreement that came out of that meeting, they also promised to use the best available science to achieve carbon neutrality by the middle of the century.

Trouble is, emissions data for most countries is often self-reported, at least 10 years old, and available only in PDFs, spreadsheets, and videos that are difficult to access, analyze, and compare. Information for some sectors, such as dairy and beef production, is sparse to nonexistent. Studying what is available can be expensive and time-consuming. All of this presents a huge problem, because it’s hard to know what kind of progress you’re making if you don’t know where you’re starting from.

“It impressed upon me the importance of data, because in order to have accountability with respect to these targets, you need to have data transparency,” says Angel Hsu, who has attended 12 UN climate conferences over the years and founded Data-Driven Enviro Lab in 2016 to promote evidence-based environmental policies at the city, regional, and global level.


For Hsu, 2021’s COP26 meeting in Glasgow felt like a turning point, with widespread agreement among UN officials, government leaders, regulators, and activists about the need for current, precise environmental data. It helps that advances in artificial intelligence, remote-sensing technology, and other tools, coupled with the emergence of a climate-tech industry, have made it easier to compile that information, track progress, and hold nations accountable.

A growing number of organizations have been revolutionizing the scope, accuracy, and accessibility of this information. Climate TRACE, for example, is a coalition of 10 research laboratories, nonprofits, and tech firms. It has completed a data set that tracks 61 percent of the world’s emissions in 10 sectors at a country-by-country level. The analysis, which will be used by countries, corporations, and advocates to meet net-zero goals, is the most comprehensive and up-to-date yet compiled. This year the coalition, which is led by Al Gore, will expand its inventory of data available at the “asset level,” such as for individual farms, shipping vessels, and factories, and its “real-time” monthly and weekly measurements.

Achieving this data revolution will also require more involvement from tech companies like Google and Microsoft, both of which met with policy experts at COP26. Many of these companies maintain vast troves of information. For example, Google’s Environmental Insights Explorer has worked with Data-Driven Enviro Lab to assess the greenhouse gas emissions of 13,000 cities that made voluntary climate pledges. That’s been hugely helpful, because fewer than 10 percent of cities possess independently validated data. Tech firms’ advanced machine-learning techniques, vast data sources, and sophisticated modeling capabilities can provide more granular data to public officials, policymakers, and researchers.

Fix talked with four people who are leveraging the power of AI, satellites, and data science to facilitate more ambitious climate action. Their comments have been edited for length and clarity.



“At COP26, I saw a lot of these tech players showing up and trying to engage in the conversation. I think that was really heartening.”

Angel Hsu

Angel Hsu, founder of Data-Driven Enviro Lab and a Climate Action Data 2.0 member: When these climate agreements were first negotiated in the early ’90s, they were all relying on methods that, frankly, are really archaic today, like Excel spreadsheets, and that’s really what the UN is still relying on for different actors to measure and account for their greenhouse gas emissions. At COP26, I saw a lot of these tech players showing up and trying to engage in the conversation. I think that was really heartening, because I’ve been going to these climate conferences for the last 12 years and I have not seen that level of engagement. For them to say, “Yes, we want to help, we want to use our technologies, we want to use our data to help understand what these different actors are doing on climate and how credible their net-zero pledges are” was really promising.

“You’re seeing technologies becoming faster, more accessible, and cheaper, allowing people to bring together a large amount of information.”

Aaron Davitt

Aaron Davitt, remote sensing analyst at WattTime and Climate TRACE: During the last five years, there’s been a significant push to launch more satellites to monitor Earth with more detail and precision. 

For some sectors, like concentrated animal-feeding operations, we realized the reporting is pretty poor. There’s really no good information out there, even in the United States. Some states have records of where [these operations] are. Other states may be like, “There’s 10 in this county,” and that’s it. We’re working at Climate TRACE to identify where [these operations] are and then estimate the emissions from these dairy and livestock yards.

You’re seeing technologies becoming faster, more accessible, and cheaper, allowing people to bring together a large amount of information. Down the road, it’s just going to [create] better, quicker, more timely information.

“If you are a polluter, and you know that your data is being tracked and shared, then it gives you more incentive to meet your commitments.”

Christy Lewis

Christy Lewis, director of analysis at WattTime and a Climate TRACE member: We’re looking forward to being able to make that asset level [data] for many sectors. For example, for the shipping sector, this is the first time that we know the emissions associated with every individual ship. That’s highly valuable information.

It requires extensive data engineering. We have people scraping data from websites and other sources for dozens and dozens of countries. In one case, all of the data was in a video format, so one of our engineers wrote a tool to process the language in the video and then extract the data. The data comes in all different types of formats, but we need to get it.
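The heterogeneous-formats problem Lewis describes is, at its core, a normalization task: however a record arrives, it has to be mapped onto one common schema before it can be compared or aggregated. The sketch below is purely illustrative and is not Climate TRACE’s actual pipeline; the field names and schema are invented for the example.

```python
# Hypothetical illustration of normalizing emissions records that arrive
# in different formats (a JSON API payload vs. a scraped CSV line) into
# one common schema. Not Climate TRACE's real code; schema is invented.
import csv
import io
import json

def normalize_record(raw):
    """Map a raw record to a common {country, year, tonnes_co2} schema."""
    if isinstance(raw, dict):  # e.g. already parsed from a JSON API
        return {
            "country": raw["country_code"].upper(),
            "year": int(raw["year"]),
            "tonnes_co2": float(raw["emissions_t"]),
        }
    # Otherwise assume a CSV line: country,year,emissions
    row = next(csv.reader(io.StringIO(raw)))
    return {
        "country": row[0].upper(),
        "year": int(row[1]),
        "tonnes_co2": float(row[2]),
    }

# Two records from two imaginary sources, one JSON and one CSV.
sources = [
    '{"country_code": "br", "year": 2019, "emissions_t": "4.6e8"}',
    "IN,2019,2.6e9",
]
records = [
    normalize_record(json.loads(s)) if s.lstrip().startswith("{")
    else normalize_record(s)
    for s in sources
]
print(records)
```

The hard engineering Lewis describes sits upstream of a step like this: getting usable numbers out of PDFs, scraped pages, and video in the first place.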

We definitely don’t intend to be the climate police. But we think that just having this data available can inherently hold people accountable. If you are a polluter, and you know that your data is being tracked and shared, then it gives you more incentive to meet your commitments.

Hsu: What was interesting at COP26 was the fact that you had big tech companies like Google and Facebook and Amazon alongside [smaller outfits like] Planet Labs and Climate TRACE. I think that [tech] companies can play a role in providing data, particularly spatial and temporal coverage that was never before possible. Half a decade ago, I think a lot of companies would have been like, “Why does it matter? How’s this going to turn a profit?” I think that now it’s gotten to a point where a lot of these tech companies are trying to compete with each other to be seen as a sustainable tech provider in the climate space. One easy way they can do that is with data, because they have so much information and so much cloud-storage technology. I think that if consumers were aware of what it was being used for and could see some tangible benefit, there wouldn’t be these privacy concerns. That’s what I saw living in Asia and in China.

There’s this larger concern of what happens if we become too reliant [on the tech companies for climate data]. Google’s leadership could change, and then they could say, “We don’t want to invest in providing the world’s transport-related climate data,” or, “We don’t want to provide cloud technology to run these other technologies that help process algorithms.” I think the main question is: How can we develop enough robust redundancy and multiple forms so that we’re not too reliant on technology [companies] to solve these data questions for us?

“I feel like once we are able to deliver asset-level data, things are going to start to look really different.”

Abhilasha Purwar

Abhilasha Purwar, CEO and founder of Blue Sky Analytics and a Climate TRACE member: You will get a measure of how a lake is expanding or contracting, how glaciers are melting, what the risk of floods or droughts is. Most of the data will be at a resolution of 100 or 200 square meters, which is pretty good: like a small park, or a collection of two or three houses. That’s going to take a lot of processing power. I feel like once we are able to deliver asset-level data, things are going to start to look really different, because then you wouldn’t have any other option but to take action.

I don’t think this kind of revolution will come online by 2025, even with all our collective efforts. But the environmental monitoring people are going to do their job, so you cannot say, “We did not have data so we cannot take action.” There’s going to be data for all these places.

The only case where we might not be doing quality monitoring for, say, African or South Asian countries could be [due to] the lack of satellites focused on those countries. However, that’s changing because more satellites are being launched, so more pictures are being taken.

Hsu: We need to develop some type of digital architecture that can handle different data sources. We don’t have answers for that yet, because this has been a perennial challenge, and it is political. I’ll have people say, “Oh, no, don’t release your analysis, because we’ve done our own analysis, and if there are two different data [sets], then this could be problematic.” As a scientist, as an academic, I never think that’s a bad thing. The more analyses, the more independent efforts you have out there trying to do the same thing, the closer we get to triangulating the truth. There is never going to be, like, one ultimate representation of the truth. I think it’s [more] like, “How close can we get, and what are the uncertainties, and what are the assumptions that we’ve made getting to that number?” That’s really what I’m talking about in terms of this harmonized framework.

