
Machine Learning and Climate Change

The AI Revolution and Climate Change, Part 3

Guy Bayes
Chief Technology Officer

It’s a revolution and a real one this time, unlike the last couple of semi-duds. And one of the places where ML is being applied in a revolutionary way to a new problem is at Vibrant Planet, where we use ML to try to keep the earth’s biosphere from collapsing due to global warming.

Because while all those brainiacs were figuring out how to make you click on ads, the world was going to hell.

Nature and ecology, it turns out, are kinda complicated. There are lots of bits affecting other bits. Many, many bits, moving parts coexisting, eating each other, and so on. Ecologists understand it pretty well at a high level, but getting detailed is hard. One of the reasons why is that it’s pretty much impossible to express what happens in the natural world at any level of detail using the old style of general-purpose programming.

Also, detailed ecological data is hard to come by. The dratted plants and animals are not as good at posting every damn thing they do to Facebook like the humans do. Mostly we have to rely on Remote Sensing which is a fancy way of saying “fly something way up high, take pictures of stuff, maybe bounce a laser off it and try to make sense out of what’s going on down there. Figure out where the trees are. And hope the government funding holds out so you can do it again next year and see what’s changed.”
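To make the “take pictures and figure out where the trees are” part a little more concrete, here is a minimal sketch of the simplest remote-sensing trick in the book: computing NDVI, a standard vegetation index, from the red and near-infrared bands of a satellite scene. The file name, band order, and threshold are hypothetical assumptions, and this is the pre-ML baseline, nothing like a production pipeline.

```python
import numpy as np
import rasterio  # common geospatial raster library

# Hypothetical multiband GeoTIFF; band numbering is an assumption.
with rasterio.open("scene.tif") as src:
    red = src.read(3).astype("float32")
    nir = src.read(4).astype("float32")

# NDVI = (NIR - Red) / (NIR + Red); values near +1 indicate dense vegetation.
ndvi = (nir - red) / np.maximum(nir + red, 1e-6)

# A crude "where are the trees, roughly" mask; the 0.4 cutoff is illustrative.
likely_vegetation = ndvi > 0.4
print(f"{likely_vegetation.mean():.1%} of pixels look like dense vegetation")
```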

And oh, by the way, I forgot to mention that while all this other stuff was going on there was this guy called Elon that got really good at shooting rockets filled with satellite sensor packages way up high. Maybe we should call that Thing 4.5.

To make a hard problem harder, large-scale ecological computing is not a thing we as a species have invested much in. As opposed to cat pictures. We humans have done an awesome job of building tools and technology to understand the human world, mostly because that’s the part of the world most of us care about. I can pull up an app on my phone and go to this place called “Wikipedia” and have access to the sum total of human knowledge. I can do a video call with anyone in the world via a satellite connection. I can see street views of a city in Uzbekistan or get an accurate ETA for a drive to Albuquerque. I can have my DNA analyzed and learn I am 19% Norwegian.

By comparison, the natural world is just this green blob on Google Maps that we dig coal out of. We are very behind in using the new wave of technology to understand the natural world. As I mentioned, the beavers don’t post to Facebook. And we don’t track the beavers at scale using remote sensing, mostly because we just don’t care as much and hence don’t spend the money. Ecological science has been a barely funded labor of love for centuries. Most Ph.D. ecologists are shell-shocked saints working as teachers, running programs on twenty-year-old laptops, mumbling “don’t you realize everyone is going to die” while the tears drip into their Costco coffee. How does the budget of Facebook’s advertising platform compare to the amount our species spends on ecological science?

This went double for machine learning since, as we have noted, for many years doing machine learning required a “how about you start with a Ph.D. in math, plus be a kickass programmer, and it really helps to have another Ph.D. in computer science, please” level of knowledge. As soon as one of those rare birds appeared, they were quickly lured to the dark side by a truck full of stock options and ended up at a technology company or in the investment industry, not as an ecologist.

You can’t blame them, given the kind of money those big tech companies are willing to pay. And while most ecologists are decent enough programmers, they never even got exposed to all the cool new toys the Silicon Valley folks were using internally. They never had a chance to learn them.

Not to mention that the “oh, you need a football field full of computers” part was a bit of a problem for most universities and scientific research outfits. So the ecological sciences were understandably slow to get on the ML bandwagon and fell behind. Probably a good twenty years behind, on average. But this is changing. And it is changing FAST.

Because, as the barriers to entry fade for all the reasons discussed above, ecological scientists are discovering ML is really, really good at helping solve ecological problems. And as the world continues to go to shit in very noticeable “my house just burned down” ways, more and more technologists are abandoning cat picture land and starting to pitch in to solve the problem. Even if it doesn’t pay as well, you can’t eat money, and it doesn’t help to be rich if you are dead.

Feed ML enough data and run it across enough GPUs and it can make sense of satellite imagery and pick the trees out. It can understand fire risks and guess where the beavers probably are. And most importantly, it is the only thing that has a chance in hell of connecting all the dots of the complex web that is an ecosystem at a detailed level.
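As a toy illustration of “pick the trees out,” here is a hedged sketch of a per-pixel canopy classifier. Everything in it is a stand-in: random numbers sit in for real band values and hand-labeled pixels, and a production system would use deep segmentation models trained on GPUs rather than a random forest.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Stand-in training data: spectral band values per pixel (n_pixels x n_bands)
# and a label per pixel (1 = tree canopy, 0 = not) from a hand-labeled tile.
rng = np.random.default_rng(0)
X_train = rng.random((5000, 6))              # placeholder for band values
y_train = (X_train[:, 3] > 0.6).astype(int)  # placeholder for canopy labels

model = RandomForestClassifier(n_estimators=200, n_jobs=-1)
model.fit(X_train, y_train)

# Apply to a new scene: flatten (bands, height, width) into (pixels, bands),
# predict per pixel, then reshape back into a canopy mask.
scene = rng.random((6, 256, 256))            # placeholder for a new image tile
pixels = scene.reshape(6, -1).T
canopy_mask = model.predict(pixels).reshape(256, 256)
```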

Also, the VC industry is looking up from their Woodside mansions at the Santa Cruz mountains burning above them and thinking “hmm, I wonder if there is a buck to be made saving the world?” as they pack their go-bags. The jury is still out on that one, but some of the braver, or just more ethical, ones are starting to invest. A new crop of startups is starting to bloom that has a chance of luring good ML engineers away from the cat picture factories. The pairing of rockstar engineers with genius ecological scientists is starting to happen.

And the more these companies take their baby steps, the more they validate the opportunities for ML. The story keeps getting better. We are finding we have an opportunity to use this new technology to not only understand the natural world but to actually start actively managing it. We cannot stop or roll back climate change, but by optimizing the right interventions we may be able to alleviate the worst of it. We can actually do stuff rather than just looking at data visualizations of how screwed we are and wringing our hands. And we can even measure whether the things we do work, continuously improving our interventions.

This closed-loop, data-driven, low-level, active management of an ecosystem would be a new capability for mankind. For most of human history, or at least since the industrial revolution, we humans have been treating the natural world like a credit card with an infinite limit. We take and take and run up a bigger and bigger debt, thinking that it’s nature, it can take it, it’s a big planet, and daddy needs a new Caddy.

And when we do try to actively mess with ecosystems on a large scale, even with the best intentions, it usually ends up about as well as “let’s introduce rabbits to Australia” did, which is to say, terribly. Did I mention ecology is complicated? 

Also now that the debt is coming due, there is greater political will to DO SOMETHING ABOUT THE FIRES ALREADY and start this active management. The shit is, as they say, hitting the fan and the electorate is noticing the smoke. 

I think of 2023 as the year when climate change got real, at least for the US voter. Smoke in DC so bad you can’t go outside. Heatwaves across most of the country literally killing people. Lahaina. It’s not just a California problem anymore. Climate change is here, folks, and people, especially the younger generation, are demanding we do something.

Doing something is hard, so it’s a good thing we have these tools coming online, because we can’t roll back the damage we’ve done. We can’t undo it, at least not for centuries. Even if we stopped emitting carbon across the entire world tomorrow, it would be a century before the earth stops warming. We are at the beginning of a slow-rolling disaster, and more and more of the total effort of the human race is going to be devoted to alleviating the worst outcomes and saving as much of the biosphere as we can, not because we like trees, but so we don’t all die.

I do like trees though. A lot. 

At Vibrant Planet we started with “Let’s try to understand Western forests and see if we can’t keep all the trees from dying or burning down.” We use ML to understand the characteristics of a hundred billion trees: their height, their carbon content, their health, the characteristics of their species, and who likes to live on, in, and under them. We use ML to simulate fire and to simulate drought. And then to plan what we can do about it. Where to do the controlled burns, the thinning, the treatments. What will happen if we do them? And then we observe what happens and we get better at it.
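To give a flavor of the kind of modeling involved (this is not Vibrant Planet’s actual stack), here is a hedged sketch of one common pattern: regressing a tree or stand attribute, say aboveground carbon, from remote-sensing features such as LiDAR-derived canopy height and spectral indices, trained against field-measured plots. The data, feature names, and model choice are all illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 10_000
features = np.column_stack([
    rng.uniform(0, 60, n),    # canopy height (m), hypothetical LiDAR metric
    rng.uniform(0, 1, n),     # NDVI, hypothetical spectral index
    rng.uniform(0, 100, n),   # canopy cover (%), hypothetical
])
# Synthetic target standing in for field-measured carbon (tons per acre).
carbon = 0.8 * features[:, 0] + 20 * features[:, 1] + rng.normal(0, 3, n)

X_train, X_test, y_train, y_test = train_test_split(
    features, carbon, test_size=0.2, random_state=0
)
model = GradientBoostingRegressor().fit(X_train, y_train)
print(f"R^2 on held-out plots: {model.score(X_test, y_test):.2f}")
```

In practice the interesting part is the loop around a model like this: plan a treatment, run the fire and drought simulations, do the work on the ground, re-observe, and retrain.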

Using Machine Learning and the advanced technology that surrounds it to manage the effects of climate change will become the single most important application of that technology.

What Comes Next?

We are in a fight for the long-term survival of our civilization. The odds are frankly not great at this particular point in time. None of the countries on the planet are hitting their emissions-reduction targets, nor are they likely to. The carbon credit market, which was one of the better hopes for economic reform, is in shambles, because it turns out ecological science is hard, companies are shortsighted, and incentives matter. Some of the worst effects of climate change seem to be hitting harder and earlier than even the more pessimistic projections.

At least I didn’t have “Canada burns down and smokes out DC” on my 2023 bingo card.

The growth of the AI industry is not going to slow down anytime in my lifetime. However, as the technology expands into more industries it will gradually become more accessible to mainstream software engineers and scientists. I think this increase in accessibility will happen slower than many (especially tool abstraction providers) hope, but it will happen.

This is a new programming paradigm, and sending your average software engineer to a class or two is not going to turn them into an ML engineer. This means there will continue to be a large undersupply in the ML engineering labor market for the foreseeable future. Closing the labor gap will be as much a matter of upgrading the engineering curriculum and graduating a new generation of engineers as of retraining existing engineers or building a better, easier-to-use toolset. Though all of these things will be happening simultaneously.

This means ML engineers with the appropriate math background and the ability to leverage this new generation of ML Models and technology will continue to be a hot commodity, reminiscent of web developers in 1999.

Adding to this, as more and more models get trained and find their way to the inference stage, ML operations and data engineering are going to become a bigger and bigger challenge. Big data engineers with a good understanding of how to run such inference models at scale are also going to be in short supply.

None of this means general-purpose software engineering is going anywhere, however. None of these models mean anything unless they find their way to an end user through a software product that is understandable and actionable. The very nature of the ML revolution makes this “understandability” problem extremely challenging, as the models have become so complex and so abstract that even their authors have no idea exactly what is going on under the hood. And people have a hard time trusting things that cannot be understood or explained.

It’s also important not to see ML and AI as some kind of silver bullet that solves everything. The space where ML is good is still narrow, and it is only applicable to a small set of problems, much smaller than the wide set that software engineering solves.

This new world requires a cross-functional team to be successful: product managers, designers, scientists, data engineers, ML engineers, and software engineers all need to come together to build this next generation of software products.

As far as the NVIDIA monopoly goes, it is not likely to end anytime in the next two to three years. The stack that has grown up around NVIDIA is not just the NVIDIA hardware itself; it also includes a software stack to access and utilize that hardware, which has taken years and billions of dollars to develop. It will be hard for a challenger to catch up.

However, the massive earnings NVIDIA is bringing in, coupled with the scarcity of the actual hardware, will give challengers both the motivation and an opening to narrow the gap. AWS and Google, with their own custom chipsets and a desire to capture a larger share of the revenue, are more likely to produce a truly competitive offering than the other, smaller GPU manufacturers. Acquisitions of those smaller players are highly likely.

And finally, and more to the point of climate change: no matter which GPU vendor ends up on top, the application of these techniques will massively advance our ability to not only understand but actively steward our ecosystems. Our capabilities here will undergo a step-function change. Because they have to.

So there is some light. The technologies I’ve been describing are not standing still. They continue to advance at speeds I would not have imagined even five years ago. Their effects are just beginning to be felt in the ecological sciences. That is a good thing, because the effects of climate change will rapidly get worse over time. We are in for the fight of our lives, but at least we have some powerful weapons to fight with.

(Part 3 of 3 — See Part 1 or Part 2)
