Jack Dongarra, the programmer who wrote a key piece of code for modern supercomputers, recently received one of computing’s highest awards: the Turing Award, named for mathematician, computer scientist, and World War Two codebreaker Alan Turing.
Scientific research often relies on modeling things with numbers, because computer simulations are usually the best way to study something you can’t – or at least really, really shouldn’t – make happen in the real world. You get to watch what happens and hopefully learn something useful, but nothing (usually) actually explodes and no one gets labeled a supervillain. And it turns out that a surprising number of the things scientists like to simulate – from weather to economies – can be described in numerical form by a type of math called linear algebra.
At its most basic, linear algebra uses equations of the form “y = mx + b” to describe the shape of a line on a graph. At the risk of inducing high school flashbacks, remember that “m” represents the slope of the line and “b” represents the point where the line crosses the y-axis of the graph, while “x” and “y” can represent any pair of coordinates along the line.
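For anyone who wants to see that formula in action, here’s a minimal Python sketch – the slope and intercept are arbitrary example values, not anything from Dongarra’s work:

```python
# Evaluate y = m*x + b for a line with slope 2 that crosses the y-axis at 1.
m, b = 2, 1                       # example slope and y-intercept
for x in [0, 1, 2, 3]:
    y = m * x + b
    print(f"x = {x} -> y = {y}")  # prints y = 1, 3, 5, 7
```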
These equations are a handy way to model how changing one variable will change another (if you already know how the variables are related). They’re equally useful run in reverse: figuring out how two variables are related (if you just have a bunch of data but don’t yet know the equation that ties it all together). And at their least basic, linear equations are the tools scientists in several fields use to build their mathematical simulations of the world around us – or of just us and our behavior, for that matter.
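That reverse use – recovering the equation from the data – can be sketched with NumPy’s least-squares routine. The measurements below are made up for illustration; the routine estimates the slope and intercept that best fit them:

```python
import numpy as np

# Made-up noisy measurements of a relationship that's secretly y = 2x + 1.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])

# Least squares finds the [m, b] that best satisfies A @ [m, b] = y,
# where each row of A is [x_i, 1].
A = np.column_stack([x, np.ones_like(x)])
(m, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"estimated slope m = {m:.2f}, intercept b = {b:.2f}")  # close to 2 and 1
```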
In the late 1970s, Dongarra wrote a library of computer programs called the Linear Algebra Package, or Linpack for short, which made it easier to program and solve large systems of linear equations on supercomputers.
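Linpack itself was written in Fortran, but the core job it did looks something like the sketch below (the coefficients are arbitrary). Under the hood, NumPy hands this call to LAPACK, the successor library Dongarra also helped build:

```python
import numpy as np

# Three linear equations in three unknowns, written as the matrix system A @ x = b:
#   2x + 1y + 1z = 5
#   1x + 3y + 2z = 10
#   1x + 0y + 4z = 11   (coefficients chosen purely for illustration)
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 3.0, 2.0],
              [1.0, 0.0, 4.0]])
b = np.array([5.0, 10.0, 11.0])

x = np.linalg.solve(A, b)   # LU factorization with pivoting, the classic Linpack job
print(x)
```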
About 15 years later, in the early 1990s, he used Linpack – the software he wrote – to measure how many calculations per second a supercomputer could perform. That measurement, counted in “floating point operations per second,” or FLOPS, provides a way to gauge a supercomputer’s speed and power. And of course, once engineers can consistently compare the speed and power of a piece of technology, they’re going to do it, and they’re going to make lists about it.
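The real Linpack benchmark is considerably more careful, but its core idea fits in a few lines: time a dense solve, then divide a standard operation count by the elapsed time. The 2/3·n³ + 2·n² figure below is the operation count conventionally used for this problem; everything else is illustrative:

```python
import time
import numpy as np

n = 2000                                  # problem size, chosen for illustration
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)                 # dense LU solve, as in the benchmark
elapsed = time.perf_counter() - start

flops = ((2 / 3) * n**3 + 2 * n**2) / elapsed   # conventional operation count / time
print(f"~{flops / 1e9:.1f} GFLOPS")
```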
The inevitable “Top500” list of the world’s most powerful supercomputers has helped track a major shift in how those machines are put together. For a long time, a supercomputer was a supercomputer because it had a much more powerful central processor (the chip that does the computer’s actual number-crunching) than an ordinary computer. In the early 2000s, though, parallel computing began to take over; the most powerful supercomputers in the world were actually huge arrays of ordinary, desktop-computer-sized processors all networked together, so that dozens or hundreds of processors could work on a problem at the same time. The Top500 list reflected that change; the newfangled parallel computers started proving themselves capable of more FLOPS than the old-school type.
Recently, though, a new kind of supercomputer has started to dominate the list: cloud computers, which are really just very big parallel computers whose processors may not even all be in the same location. Their development is driven mainly by private companies – big tech names like Amazon and Google. But without Dongarra’s work providing a way to actually measure their power, that trend would be much harder to spot.
“We will rely even more on cloud computing and eventually give up the ‘big iron’ machines inside the national laboratories today,” Dongarra predicted in a recent interview with the New York Times.
Today, Dongarra is a professor at the University of Tennessee and a researcher at Oak Ridge National Laboratory. The Association for Computing Machinery presented him with its prestigious Turing Award, which comes with a $1 million prize and well-deserved bragging rights, on March 30.