Artificial Intelligence

Introduction

Developing transformative improvements in NOAA mission performance and cost-effectiveness

By Craig Collins

In February 2020, NOAA announced the agency’s Artificial Intelligence (AI) strategy, the culmination of years of effort to dramatically expand the application of AI in every NOAA mission area by improving the efficiency, effectiveness, and coordination of AI development and usage. By strengthening coordination, operational capabilities, workforce proficiency, and multisector partnerships, the strategy aims to sustain NOAA’s national and global leadership in AI in support of science, public safety, and security.

AI – essentially, the replication or simulation of human intelligence in machines – is one of several emerging science and technology focus areas the agency has prioritized for dramatic expansion. The others are: NOAA Unmanned Systems, Cloud Computing, Big Data, Citizen Science, and ‘Omics.

“These strategies will accelerate the implementation of the most effective science and technology applications to advance NOAA’s mission to protect life and property and grow the American Blue Economy,” said retired Navy Rear Adm. Tim Gallaudet, Ph.D., assistant secretary of commerce for oceans and atmosphere and deputy NOAA administrator. “They will also guide transformative advancements in the quality and timeliness of NOAA science, products, and services to advance the agency’s science and technology strategies.”

A mother humpback and calf.

In 2003, when Dr. Jamese Sims joined the National Weather Service as a student intern, she was given an assignment: improve the performance of the Gulf Stream Finder, a model used to predict the location of the warm-water current that flows northward off the Atlantic Coast.

“Understanding the Gulf Stream is important to our mission,” Sims said, “because we have partnerships with the Navy, and they need to know the precise locations of ocean currents. But better mapping of the current could also support some of our other line offices, as well as the work of our Ocean Prediction Center and the National Ocean Service.”

HARPs (High-frequency Acoustic Recording Packages) are long-term acoustic recorders deployed at a specific location and recovered after months to years to retrieve the recorded data and identify the species the HARP detected. AI allowed NOAA researchers to comb through this data and develop a model for identifying humpback whale songs that could be used to determine where the whales are in the Pacific, and how that has changed over time.

Dr. Sims – now Senior Physical Scientist and assigned to coordinate NOAA’s overall AI strategy – applied artificial intelligence to the problem: she wrote a computer algorithm that improved the accuracy of the Gulf Stream Finder by feeding it better satellite data. Known by programmers as “genetic optimization” algorithms, these programs select data in a way that mimics natural selection: only the fittest data, producing the most accurate predictions, survive to be selected for NOAA models. Genetic optimization algorithms have been used to “tune” other models, including the WAVEWATCH model developed by NOAA’s Dr. Hendrik Tolman in the 1990s to predict sea states. WAVEWATCH can now incorporate nearshore wave activity, and helps to predict dangerous rip currents.
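To make the idea concrete, here is a minimal sketch of a genetic optimization loop in Python. It is illustrative only, not NOAA’s code: the number of candidate satellite inputs, the population settings, and especially the toy fitness function (which scores a subset of inputs against a made-up target) are placeholders for the real step of running the Gulf Stream Finder with a candidate set of inputs and comparing its output with observations.

```python
import random

random.seed(0)

N_INPUTS = 20           # number of candidate satellite data fields (illustrative)
POP_SIZE = 30           # candidate input subsets per generation
GENERATIONS = 40
MUTATION_RATE = 0.05

# Stand-in fitness: a real application would run the forecast model with the
# selected inputs and score it against observed current positions. Here we
# pretend a hidden subset of inputs is the "right" one, just so the example
# runs end to end.
_TRUE_MASK = [1 if i % 3 == 0 else 0 for i in range(N_INPUTS)]

def fitness(mask):
    return sum(1 for m, t in zip(mask, _TRUE_MASK) if m == t)

def evolve():
    # Start with random subsets of the candidate inputs (1 = include the field).
    population = [[random.randint(0, 1) for _ in range(N_INPUTS)]
                  for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # "Natural selection": only the fittest subsets survive.
        population.sort(key=fitness, reverse=True)
        survivors = population[:POP_SIZE // 2]
        # Recombine surviving subsets (crossover) and occasionally flip bits
        # (mutation) to explore new combinations of inputs.
        children = []
        while len(survivors) + len(children) < POP_SIZE:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_INPUTS)
            children.append([bit ^ (random.random() < MUTATION_RATE)
                             for bit in a[:cut] + b[cut:]])
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
print("selected inputs:", best, "score:", fitness(best))
```

The pattern is the same at any scale: score each candidate, let the fittest survive, and recombine them until the selection stops improving.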

Artificial intelligence isn’t new to NOAA’s work. Teaching computers to recognize signs and patterns in data, and to make real-time decisions based on these patterns, has helped increase the impact of NOAA science for more than two decades. The National Weather Service’s first operational use of AI, in the mid-1990s, was an algorithm, developed in-house, that recognized which combinations of five distinct types of satellite data (e.g., wind speed and temperature) would produce more accurate modeling.

Technical innovations in NOAA’s remote sensing capabilities – on land, in the ocean, in the air, and in space – have yielded a wealth of data that long ago surpassed the ability of human analysts to absorb and process. In recent years, NOAA scientists and their research partners – often in collaboration with academic or private-sector experts in algorithms and computing – have discovered ways to put computers to work sorting, analyzing and acting on insights derived from these data.

Researchers Burkely Gallo and Alex Anderson-Frey present on their research, which uses machine learning techniques to help determine which forecast models are most accurate for specific weather events, at an Office of Oceanic and Atmospheric Research/National Weather Service “Shark Tank” event in February 2018. Artificial intelligence isn’t new to NOAA’s work, but the agency is aiming to expand its application of AI in all its mission areas.

In 2018, a research ecologist at NOAA’s Pacific Islands Fisheries Science Center teamed up with Google to develop a neural network that would comb through 15 years’ worth of underwater recordings of humpback whale calls, in locations throughout the Pacific; distinguish those calls from other noises (e.g., ship engines or dolphin calls); and locate whales in time and space. Within about nine months, NOAA researchers had a model that could be used to determine where humpbacks are in the Pacific, and how their locations have changed over time.
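The article doesn’t describe the model’s internals, but the general recipe for this kind of task is well established: slice the recordings into short windows, convert each window to a spectrogram, and train a convolutional network to label each window. The PyTorch sketch below is a generic stand-in for that kind of classifier, not the actual NOAA/Google model; the layer sizes are arbitrary and random tensors stand in for real spectrograms.

```python
import torch
import torch.nn as nn

class WhaleCallClassifier(nn.Module):
    """Minimal CNN that labels a spectrogram window as 'humpback call' or 'other'.
    An illustrative stand-in, not the NOAA/Google model."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, 2)   # 2 classes: humpback vs. other noise

    def forward(self, spectrograms):                 # (batch, 1, freq_bins, time_frames)
        x = self.features(spectrograms)
        return self.classifier(x.flatten(1))         # raw scores for each class

# Usage sketch: each recording is cut into short windows, turned into spectrograms,
# and scored by the network; the timestamped detections are then aggregated.
model = WhaleCallClassifier()
batch = torch.randn(8, 1, 128, 256)                  # 8 fake spectrogram windows
scores = model(batch)
is_humpback = scores.argmax(dim=1)                   # 1 = humpback call (by convention here)
print(is_humpback)
```

Timestamped detections from many recorders, aggregated over years of data, are what allow researchers to place the whales in time and space.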

In the last year, biologists with NOAA Fisheries have joined with Microsoft to develop and train artificial intelligence that could study millions of still images and identify individual organisms – underwater images of fish species, for example, which could be used to supplement acoustic and trawl surveys; and aerial images of sea ice, to monitor threatened ice seal and polar bear populations in Alaska. Work that normally took humans months of examining photographs was shortened to a few hours. Another Alaska team applied an algorithm to acoustic data collected from equipment scattered across the bottom of Cook Inlet, in order to identify the calls of endangered beluga whales and monitor how the dwindling population was using its winter habitat.
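The article doesn’t detail how the NOAA/Microsoft models were built; a common approach to this kind of still-image identification is transfer learning, in which a standard image backbone is fine-tuned on labeled survey photos. The sketch below uses torchvision’s ResNet-18 with a hypothetical, much-shortened species list; the class names, batch shapes, and the omission of real pretrained weights and preprocessing are all simplifications.

```python
import torch
import torch.nn as nn
from torchvision import models

# Hypothetical class list; real surveys distinguish many more species.
SPECIES = ["red_snapper", "grouper", "ice_seal", "polar_bear", "other"]

# Transfer learning: start from a standard image backbone and swap the final
# layer for one that predicts our survey classes. (In practice the backbone
# would be loaded with pretrained weights and fine-tuned on labeled imagery.)
model = models.resnet18()
model.fc = nn.Linear(model.fc.in_features, len(SPECIES))

def classify(image_batch):
    """image_batch: (N, 3, 224, 224) tensor of preprocessed survey photos."""
    model.eval()
    with torch.no_grad():
        predictions = model(image_batch).argmax(dim=1)
    return [SPECIES[i] for i in predictions]

# A batch of fake images stands in for frames from an underwater or aerial survey.
print(classify(torch.randn(4, 3, 224, 224)))
```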

Each of these examples represents a fairly basic machine learning task: using visual or auditory cues to identify objects. But in each case, the task was completed in a tiny fraction of the time it would have taken human intelligence. According to Dr. Sims, this has proved true for the thousands of images created by underwater cameras during fisheries-independent stock assessments. “In the data processing of imagery from a fishery survey, we’ve seen a reduction in time of 98 percent,” she said. “Whereas before it would take a month or more to analyze the imagery, we’re able to do that now using artificial intelligence, and it can take only a day or so.”

Such applications have enormous potential for freeing up talent and expertise within NOAA to solve big problems – and as the agency builds its own AI expertise and expands its ability to collect and process even more complex and varied data, Sims said, this impact will grow. “I believe with the use of AI we will be able to provide even better products and services than we have right now,” she said. “And we’re doing great right now.”
