A few years ago, I moved to a cool neighborhood. It didn’t seem cool at first; it was in a sprawling, industrial area. But I soon found out that the neighborhood had its charms – and a reputation. “I’d love to buy a place there!” a friend gushed. “Except for the air.”

“The air?” I said. The air seemed fine.

“Your neighborhood has some of the highest rates of asthma in the Bay Area. It’s all the trucks coming to and from the port.”

I’d done plenty of research before I moved. How long would it take to get to my job in the city? Where would I buy groceries? But I hadn’t thought to look up anything about the air. If I had, I could have found the EPA’s website, where you can search air quality by zip code, or found public health stories about my neighborhood. But it would have taken more than casual research to assess the risks.


But what if it was as easy to assess air quality, block by block, as it is to find grocery stores? An experimental collaboration between Google and Aclima, a tech company that specializes in networking environmental sensors, has big dreams of doing just that.

“It’s like a human body,” said Davida Herzl, Aclima’s CEO, when I talked to her recently. “When they let you into the hospital, they take your vital signs, try to treat you, and keep watching them to see if they are helping. For the first time, we’ll be able to take the vital signs of our environment.”

One thing that I’ve learned while writing about air quality: Often, the hardest part is interpreting the data accurately. Environmental sensors are tricky to calibrate; they need multiple backups and skilled technicians to maintain and interpret them. This is one of the reasons that the air quality monitoring stations set up by the EPA look for air quality averages, rather than anomalies. They’re deliberately located at a distance from freeways, for example, because a consistent source of pollution like a freeway would skew that average and make a city seem more polluted than it actually is.

Aclima’s workaround for this problem is something that wouldn’t have been possible even a few years ago, because it would have taken too much storage space and processing power. Aclima hooks up huge numbers of different sensors — some cheap, some expensive. Then it sifts through the enormous amount of data they generate, looking for patterns, as well as potential breaks in the network. Is a sensor broken? Are local weather patterns interfering with the data? With a resilient enough network, sensors can continue producing good data even in less-than-ideal conditions. And it can notice details that a simpler system might not, like how the air quality in the blocks closest to the freeway changes from hour to hour, and block to block.
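The broken-sensor check described above can be sketched in a few lines. This is an illustrative toy, not Aclima's actual algorithm: it assumes that a sensor whose reading deviates sharply from the rest of a dense network is more likely malfunctioning than measuring a real event, and flags it by comparing each reading against the network median.

```python
# Toy sketch of cross-checking sensors in a dense network (not
# Aclima's actual method): a reading far from the network median,
# measured in median-absolute-deviations, is flagged as suspect.
from statistics import median

def flag_outliers(readings, tolerance=3.0):
    """readings: dict of sensor_id -> latest value (e.g. ppm).
    Returns the set of sensor ids whose deviation from the median
    exceeds `tolerance` times the median absolute deviation."""
    values = list(readings.values())
    med = median(values)
    mad = median(abs(v - med) for v in values) or 1e-9  # avoid divide-by-zero
    return {sid for sid, v in readings.items()
            if abs(v - med) / mad > tolerance}

# A stuck or broken sensor stands out against its neighbors:
readings = {"s1": 0.41, "s2": 0.39, "s3": 0.43, "s4": 9.99, "s5": 0.40}
print(flag_outliers(readings))  # → {'s4'}
```

The point of the redundancy is exactly this kind of cross-check: with enough overlapping sensors, no single reading has to be trusted on its own.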


The first large network Aclima built was for Google. Beginning in 2011, Aclima installed what are now 6,000 air quality sensors in Google’s offices around the world – 21 buildings, on four continents. In this case, Aclima was processing half a billion data points a day, looking for patterns and anomalies in the weather inside Google.

What did it find? Both Google and Aclima were reluctant to talk specifics, though the link between carbon monoxide levels and productivity seems to have been one focus. Google is built on a well-known toxic waste site — the former location of manufacturing plants owned by Fairchild Semiconductor, Intel, Raytheon, and other computer chip makers. In 2012, Google reworked the ventilation systems of two buildings at its Mountain View headquarters after it found dangerous levels of vapor from the solvent TCE, or trichloroethylene, in both. According to Aclima, that particular discovery wasn’t theirs — Google has multiple air quality collaborations in the works.

For Aclima, measuring indoor air quality was what Melissa Lunden, Aclima’s director of research and a former atmospheric scientist at Lawrence Berkeley Labs, describes as “crawling before we could walk” — a prelude to the great outdoors, and a knitting together of pre-existing and recent technology. “Partly it’s sensors,” says Lunden. “You need hardware and sensors. But you also need to be able to send a lot of data quickly over a 3G network. You need to process and store large amounts of data. You need the cloud. You can see these things reflected in the ‘internet of things’ discussions that are occurring everywhere.”

Last summer, the company spent a month driving three Google Street View cars through Denver, Colo. Aclima is in the second year of a five-year Cooperative Research and Development Agreement with the EPA, in which the two organizations work to build low-cost portable sensors that detect fine and coarse particulate matter in the air in real time.

What Aclima found was that the data from Google’s cars matched up with that of the local EPA monitoring stations, but also provided more detail. For instance: Concentrations of ozone and nitric oxide were higher near freeways and major traffic arteries. And ozone levels spiked right around 4 p.m., when children were getting out of school.

The next test site is the Bay Area — inconvenient because of its strange weather patterns, but convenient because it’s where Aclima is located. Test runs of Street View cars tricked out with sensors have already begun. The goal is to add the information they gather to Google Maps coverage of the area. After that, Aclima plans to expand to the point where the project can do what its very earnest founders have been promising — make the link between planetary health and human health visible, by helping people actually see pollution as it flows and ebbs around the world. “It’s the best possible thing,” says Lunden. “We can say ‘Now, here’s the air quality in your neighborhood, here’s where you work, here’s where you live.’ Every person can make a data-driven decision.”

But then, if you’re in an expensive business like global-scale local data collection, it’s hard to make a living giving the data away for free. Aclima has – so far – worked with clients who have chosen to keep their data private. When the company’s representatives talk about the future, though, it sounds a lot bigger than an audience of just a few private clients. “Our vision is to wrap the world in a layer of environmental data, much the way that GPS came about back in the ’50s,” says Kim Hunter, Aclima’s director of communications. “GPS used to just be used by the government. It became more accessible over the last 50 years. And now the location awareness is embedded into everything we do.”

There’s a critical difference, though. GPS owes its ubiquity to political decisions that kept it in the public realm. In 1983, when Korean Air Lines Flight 007 was shot down after accidentally veering into Soviet airspace, killing 269 passengers, including then-Rep. Larry McDonald (D-Ga.), President Reagan publicly vowed to make GPS available to civilian aircraft as soon as it became operational in 1988. Then, in 1994, the FCC required all wireless carriers to implement GPS in cellphones, so emergency responders could track 911 callers.

Without mandates like that, your cellphone might never have been able to find the closest nearby coffee shop, give you directions to Reno — or track your movements and sell them to advertisers. The financial and political life of technology is strange, and long. In the decades that the U.S. has promoted collaborations between private companies and public research, we’ve played out just about every kind of tension between the public good that follows from public data and the private gain that can be had from charging for and restricting access to that kind of information.

It’s possible that Google will be willing to pay for Aclima’s data, the same way that it was willing to buy satellite images for Google Maps — because it helped them sell more ads and search referrals. It’s also possible that Aclima won’t find enough buyers for its data, and the whole project will need to find some other financial model, or win public support, or fold.

All of this is in the future. In the meantime, says Lunden, there is a lot of data, which is the exciting part. “We’re just trying to figure out how to get it into the hands of people who can make sense of it.”