Flood Maps Can Get Much Sharper With A Little Supercomputing Oomph

This is a calculated flood map for the city of St. Louis. Water depth goes from deep (dark blue) to shallow (white, light blue). Floodwater can come from the Illinois, Upper Mississippi and Missouri rivers, as well as from heavy local precipitation.
Courtesy of Dag Lohmann/Katrisk

A small company in California is hoping to make a big splash by providing detailed flood maps to homeowners and insurance companies. And to do that, the company is using one of the fastest supercomputers in the world.

The company is called Katrisk, based in Berkeley, Calif. Hydrologist and computer modeler Dag Lohmann is one of the company's founders. He says the flood maps the Federal Emergency Management Agency already produces will tell you how prone a particular area is to flooding.

But FEMA's maps don't tell you everything you might want to know about what might happen in a flood, Lohmann says.

"You don't know whether the flood is 2 inches high, 2 feet high, or 2 meters high," he says. When you're making a computer model of that sort of catastrophe, how often a particular region floods is just a start.

"You also have to know 'how severe is the flooding?' " Lohmann says. "You have to know, 'How deep will the water be?' "

In other words, in order to properly price a policy, an insurance company would want to know whether repairs from a flood in that region are more likely to involve replacing a few carpets, or removing tons of mud and debris.

Lohmann wants to make flood hazard maps that would provide that kind of information, detailed enough to make predictions for individual properties.

But making such detailed maps requires an enormous amount of computing.

"Ten years ago it would have been immensely expensive to run those kinds of computations," says Lohmann.

The Titan supercomputer at the Department of Energy's Oak Ridge National Laboratory changes all that, in part because it's a new kind of supercomputer. Instead of just traditional central processing units, or CPUs, the Oak Ridge computer uses lots of graphics processing units, or GPUs. These are the sort of processors computer gamers like for producing lifelike graphics.

"For flood modeling, they're ideal, these graphics cards," Lohmann says, because they can do calculations in parallel. "You have easily 2,000 to 3,000 — [and] now up to 4,000 — little processors in each graphics card."

Titan supercomputer at the Oak Ridge National Laboratory.
Oak Ridge National Laboratory

So what you do is divide a map into a grid. And each processor calculates the flood risk at a particular time for a different square on the grid. Stitch all the individual squares together, and you've got a highly detailed map.
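To make the grid idea concrete, here is a minimal, hypothetical CUDA sketch: each GPU thread is assigned one square of the map and computes a toy water depth for it, with all squares handled in parallel. The terrain data and the depth formula are stand-ins invented for the example; a real flood model like Katrisk's solves far more elaborate hydraulic equations over high-resolution elevation data.

```cuda
// Sketch of grid-parallel flood mapping: one GPU thread per map cell.
// This is NOT Katrisk's model -- the "depth" here is just water level minus
// terrain height, clamped at zero, to show how the work is divided.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void floodDepthKernel(const float* terrain, float waterLevel,
                                 float* depth, int width, int height)
{
    // Each thread picks one (x, y) cell of the map grid.
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    int idx = y * width + x;
    float d = waterLevel - terrain[idx];
    depth[idx] = d > 0.0f ? d : 0.0f;   // dry cells get zero depth
}

int main()
{
    const int width = 1024, height = 1024;
    const int n = width * height;

    // Hypothetical terrain sloping gently upward; a real run would load
    // a high-resolution digital elevation model instead.
    std::vector<float> terrain(n);
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            terrain[y * width + x] = 0.001f * x;

    float *dTerrain, *dDepth;
    cudaMalloc(&dTerrain, n * sizeof(float));
    cudaMalloc(&dDepth, n * sizeof(float));
    cudaMemcpy(dTerrain, terrain.data(), n * sizeof(float),
               cudaMemcpyHostToDevice);

    // Launch one thread per grid cell: thousands of squares in parallel.
    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x,
              (height + block.y - 1) / block.y);
    floodDepthKernel<<<grid, block>>>(dTerrain, 0.5f, dDepth, width, height);

    std::vector<float> depth(n);
    cudaMemcpy(depth.data(), dDepth, n * sizeof(float),
               cudaMemcpyDeviceToHost);
    printf("Depth at cell (0,0): %.3f m\n", depth[0]);

    cudaFree(dTerrain);
    cudaFree(dDepth);
    return 0;
}
```

Stitching the per-cell results back together gives the detailed map; in a production model, neighboring cells would also exchange water between time steps, which is where most of the real computational work lies.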

One especially nice thing for startup companies like Lohmann's: Computer time on the Oak Ridge computer is free. Because Oak Ridge is a national lab, Congress has instructed it to provide access to its supercomputer to researchers "in United States industry, institutions of higher education, national laboratories and other federal agencies."

The only catch is, you have to make your results public. In theory, that means another company could jump in and do the same thing that Lohmann is doing. In practice, Lohmann says, there would still be technical and scientific hurdles a competitor would have to overcome, even with access to his results.

Lohmann isn't the only entrepreneur to approach Oak Ridge with an idea in need of some supercomputing. For example, a company from Greenville, S.C., called SmartTruck used the Oak Ridge computer to design a system for reducing drag and saving fuel on long-haul trucks.

"People always bring their dreams here," says Jack Wells, director of science at the computing facility at Oak Ridge. "They bring their dreams for what they wanted to do with their life's work. And we help give them a big jump forward by giving them access to our Titan supercomputer."

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Joe Palca is a science correspondent for NPR. Since joining NPR in 1992, Palca has covered a range of science topics — everything from biomedical research to astronomy. He is currently focused on his series, "Joe's Big Idea." Stories in the series explore the minds and motivations of scientists and inventors. Palca is also the founder of NPR Scicommers, a science communication collective.