Climate Change, Machine Learning, and High Performance Computing

By Victoria Lubas | April 14, 2021

Predicting the future begins with analyzing the present. From weather patterns to climate change, studying the physics of the atmosphere in new and profound ways can lead to a greater understanding of our current weather and more accurate predictions. With the help of high performance computing and NYU's new supercomputer, Greene, scientists and professors at NYU's Courant Institute of Mathematical Sciences are combining physics principles, existing climate equations, computer science, and machine learning to predict how changes in the atmosphere will impact the Earth's surface.

Edwin Gerber, Courant professor of Mathematics and Atmosphere/Ocean Science, is using climate modeling to represent equations of atmospheric flow on a computer, establish the importance of different atmospheric processes, and use that knowledge to make better use of complicated models.

Simplified Models Make Waves

Through the use of comprehensive General Circulation Models (GCMs), Gerber and his colleagues can virtually explore and examine elements of the climate and analyze which processes have the greatest overall impact. These models are complex enough to include nearly every process at work in the atmosphere; that sophistication, while highly advanced, means it takes several scientists to reach a collective understanding of both the model and the code involved. In order to make these large-scale models more understandable and usable, Gerber explained that his research lies in model hierarchies: creating simplified models with "a few equations that I can write down and say 'this is global warming,' …[then] take that understanding and you get insight from the simple models and work it back up."

Gerber continued, saying "a lot of what I do is try to develop models that are in between sort of the state of the art, really advanced...and the simple things that we talk about when I teach atmospheric science—the things we understand, the theory. I'm working in-between, and that's where [high performance computing] comes in really useful for us."

Gerber's work with simplified models gives climatologists the ability to identify and understand the models' elements in context, determine what is relevant, and work toward better predictions for climate change. Gerber explained that by working with these streamlined models, he and his colleagues can "identify key processes that [they] think really matter and explore them in the idealized models" or re-examine more complex models with a fuller understanding. Simplified models are useful for identifying significant elements of change, sparking a search for those elements in the more complicated models and then in the real atmosphere.

An example of such research is the study of gravity waves, which are thought to be important but are notoriously hard to track by satellite and cannot be resolved in a GCM that lacks fine resolution. Gerber explained that downward-facing, or nadir, satellites give very accurate horizontal information, such as latitude or longitude, but limited vertical resolution or sense of depth, because they detect radiation from all levels of the atmosphere and must infer the vertical structure by untangling that data. Gerber continued, "so-called limb sounders—because they view the limb [edge] of the atmosphere—look through horizontal cross sections of the atmosphere, e.g., the sun just as it sets. These measurements can give us insight into the vertical structure of the atmosphere, but suffer from limited resolution in the horizontal; again, it's an issue untangling the signal across a wide cross section of the atmosphere."

Gravity waves transport momentum from the troposphere, or surface and weather layer, to the stratosphere, or upper atmosphere, where they play a key role in atmospheric circulation. The horizontal-versus-vertical limitations of satellites make this transport hard to track. Google Loon—a project that aimed to provide Internet coverage around the world using balloons that float in the lower stratosphere—provided an affordable way to overcome the limitations of satellites and identify the location of gravity waves. Coupling information on the balloons' movements with high resolution model integrations allows climatologists to measure real gravity waves and brings them one step closer to including gravity waves in climate models.

Gerber's role in this project was to understand these effects in a global model by parameterizing the small scales, seen only in high-resolution simulations, in terms of the large-scale flow, and using machine learning to identify the missing pieces. Inputting this information into simpler models shed light on an interaction between gravity waves and the large-scale atmospheric flow that had been occurring unnoticed in the comprehensive model. The use of the simpler model allowed this information to be identified and applied back to the more comprehensive models with a fuller understanding. Gerber reflected that "as a theoretician, your greatest honor is if you can help influence people making observational campaigns or at least the way you influence observations and identify what we need to know."


A visualization of the atmospheric circulation as a function of latitude, longitude, and pressure (where the log of the pressure is proportional to physical height). Image: Martin Jucker

Advancements in Machine Learning

Gerber's work focuses on reconciling state-of-the-art and simplified models to reach an overall understanding, relying on machine learning and high performance computing (HPC). At their core, climate models are based on simple equations and the interactions they spark. Physicists can understand these equations while they remain linear, but once they become nonlinear, computers are essential to gaining an understanding; in effect, the computer acts as a pseudo-laboratory. Gerber explained, "it's not a true lab, but...I can represent the physics of the atmospheric flow and do experiments and see...if I change this part of the code…[or] these parameterizations, how does the model respond?"
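To make the pseudo-laboratory idea concrete, consider the Lorenz (1963) system, a famous three-equation caricature of atmospheric convection (a standard textbook example, chosen here for illustration rather than drawn from Gerber's own work). The equations take seconds to write down, but because they are nonlinear, the only way to see how the system behaves is to run numerical experiments:

```python
# The Lorenz (1963) system: three coupled nonlinear ODEs that caricature
# atmospheric convection. Easy to write down, impossible to solve by hand.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# Two numerical "experiments" that differ only in the 10th decimal place
# of the initial condition.
run1 = solve_ivp(lorenz, (0.0, 30.0), [1.0, 1.0, 1.0], max_step=0.01)
run2 = solve_ivp(lorenz, (0.0, 30.0), [1.0 + 1e-10, 1.0, 1.0], max_step=0.01)

print(run1.y[:, -1])  # the two final states differ completely: chaos
print(run2.y[:, -1])
```

Each run takes a fraction of a second on a laptop; the same experimental logic, scaled up to millions of grid points, is what fills a supercomputer like Greene.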

The first step in working with climate models is using filtered equations that eliminate irrelevant scales of motion, such as sound waves, in order to save computing power for the scales of motion that matter for weather. The vast amounts of data being processed mean that the computing must be both accurate and efficient in order to yield understandable and usable results. Machine learning can allow for better parameterizations and simulations by optimizing how the data is processed to maintain maximum speed and accuracy. Whether comprehensive or idealized, almost all climate models are based on Fortran code, mainly for historical reasons. This requires climatologists and HPC experts to couple atmospheric codes written in Fortran with machine learning interfaces built for Python, while still allowing the code to run efficiently on hundreds, if not thousands, of processors. It comes down to essential issues of coupling the code with the computer architecture.
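As a sketch of what that coupling can look like in practice, the snippet below shows a common bridge pattern: the Fortran routine is compiled into an importable Python extension module (NumPy's f2py tool is one standard way to do this), and NumPy arrays serve as the shared currency between the two languages. The module and routine names (gw_drag, compute_drag) are hypothetical, invented for illustration; a pure-Python stand-in lets the sketch run on its own:

```python
import numpy as np

try:
    # Hypothetical extension module, built from the Fortran source with:
    #   python -m numpy.f2py -c gw_drag.f90 -m gw_drag
    import gw_drag
except ImportError:
    gw_drag = None  # fall back to the pure-Python stand-in below

def compute_drag_standin(u, temp):
    """Pure-Python stand-in for the Fortran routine (illustrative only)."""
    return -0.01 * u * np.tanh(temp / 250.0)

nlev = 60
u = np.linspace(-20.0, 40.0, nlev)   # zonal wind on 60 model levels (m/s)
temp = np.full(nlev, 250.0)          # temperature profile (K)

# NumPy arrays cross the language boundary; f2py handles the row-major (C)
# versus column-major (Fortran) memory layouts automatically.
if gw_drag is not None:
    drag = gw_drag.compute_drag(u, temp)
else:
    drag = compute_drag_standin(u, temp)
print(drag[:3])
```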

Through the use of a neural network, machine learning can examine training data, learn relationships, accurately model the atmosphere, and identify missing pieces within the global code. There is hope that this technology can advance climate predictions, and at the rate the field's research is moving, Gerber expects machine learning applications to expand significantly within the next year.
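As a toy illustration of that workflow (synthetic data and invented inputs, not Gerber's actual setup), the sketch below fits a small neural network to map resolved-scale quantities to the unresolved tendency a parameterization should supply:

```python
# Learn a parameterization from data: fit a small neural network that maps
# resolved-scale inputs to the unresolved tendency they imply.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic "training data": coarse-grained wind shear and stability as
# inputs; a made-up nonlinear drag (plus noise) stands in for the target
# that would really come from high-resolution simulations or observations.
X = rng.normal(size=(5000, 2))                      # [shear, stability]
y = -0.1 * X[:, 0] * np.tanh(X[:, 1]) + 0.01 * rng.normal(size=5000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(X_train, y_train)
print(f"held-out R^2: {net.score(X_test, y_test):.3f}")  # close to 1 on this toy
```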


A schematic representation of the processes that must be simulated to accurately model our climate system, including clouds and atmospheric convection, trace gas transport, radiative transfer, and the atmospheric circulation. Image: Martin Jucker

Overcoming Potential Hurdles

High performance computing expertise comes into play when steps within this process stop working together efficiently or, worse, go unstable; NYU’s staff of HPC experts can ensure these resources are being used to their full potential. Adding new parameterizations can greatly expand what climatologists can learn from a model, but they can also slow models down to the point of being unusable. Just as the equations must be filtered to the relevant scales, the essential parts of the parameterizations must be identified and the model itself must be designed efficiently in order to run. Reconciling efficient code with proper use of processors requires the help of HPC experts to make sure all processors are working together, rather than some working in overdrive while others wait in standby.
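A back-of-the-envelope sketch (with made-up numbers) shows why that balance matters: in a parallel job the wall time is set by the slowest processor, so an uneven split leaves most of the machine idle.

```python
# Hypothetical seconds of work assigned to four processors.
uneven = [1, 1, 1, 9]                     # one processor carries most of the load
even = [sum(uneven) / len(uneven)] * 4    # same total work, spread evenly

# Wall time is set by the slowest processor in each case.
print("uneven split:", max(uneven), "s")  # 9 s; three processors sit idle for 8 s
print("even split:  ", max(even), "s")    # 3.0 s for the same total work
```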

In addition to the efficient use of HPC processors, Gerber elaborated that climatologists and weather prediction centers must consider other factors when assessing climate predictions and comparing the results of one model against another. “For example, an essential test of a climate model is its ability to simulate the observed changes in our climate system over the past century. But how has our atmosphere changed? New technology allows us to better observe the atmosphere today than ever before, but there is a danger that apparent trends over time could reflect changes in our ability to better observe the atmosphere: is it a trend, or a bias correction?” To combat this, climatologists “produce one continuous simulation of atmospheric state, given all available observations …and correcting the errors in earlier models [so as not to] artificially produce a trend.” Gerber added, “[Another] issue is the reproducibility of numerical results, a basic tenet in science. The strong links between software and hardware in HPC can hinder reproducibility, even if climate scientists are fully open about sharing their codes.”
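The floating-point half of that reproducibility problem is easy to demonstrate in a few lines: computer addition is not associative, so summing the same numbers in a different order (which is exactly what happens when a sum is split differently across processors) can change the final bits of the result:

```python
import numpy as np

# Floating-point addition is not associative:
print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))   # False

# The same effect at scale: one million numbers summed in two orders,
# as can happen when work is divided differently across processors.
rng = np.random.default_rng(0)
a = rng.normal(size=1_000_000)
print(a.sum() - a[::-1].sum())   # typically a tiny nonzero difference
```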

A final hurdle that climate scientists must overcome when working with HPC is the battle against truncation error. Gerber explained that the Navier-Stokes equations governing fluid flow generally cannot be solved by hand and must be solved numerically, which creates an issue: a continuous fluid cannot be represented on a computer without eventually being approximated. Just as rounding pi in your high school geometry homework resulted in a less accurate answer, discretizing a fluid causes a simulation to experience truncation error. However, bigger computers that make higher resolution affordable, such as NYU’s Greene, can significantly reduce this truncation error, producing better simulations and better climate and weather predictions.
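A few lines of code make truncation error concrete. Approximating the derivative of sin(x), whose exact value is cos(x), with a centered finite difference, each halving of the grid spacing (that is, each doubling of resolution) cuts the error by roughly a factor of four:

```python
import numpy as np

# Approximate d/dx sin(x) at x0 with a centered difference and compare
# against the exact answer, cos(x0), as the grid spacing shrinks.
x0 = 1.0
exact = np.cos(x0)
for dx in [0.1, 0.05, 0.025, 0.0125]:
    approx = (np.sin(x0 + dx) - np.sin(x0 - dx)) / (2 * dx)
    print(f"dx = {dx:<7} error = {abs(approx - exact):.2e}")
```

The same logic applies to a climate model's grid: finer spacing means smaller truncation error, and a machine like Greene is what makes that finer spacing affordable.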

Conclusion

A supercomputer as powerful as Greene gives a better simulation of the atmosphere: it makes higher resolution possible, helps code run more efficiently, and makes results easier to relate to comprehensive climate modeling. Gerber explained that having a state-of-the-art machine allows for state-of-the-art calculations, which, when coupled with outstanding support structures and knowledgeable HPC experts, ensures the codes behind those calculations are highly efficient. Gerber reflected, “you can give me a giant computer. I don’t have the expertise to use it [fully]. I’d say what I really think is important at NYU is you have very good staff who really understand how that computer works…[and] using it to the best purposes.”