For five years, China had the world’s fastest computer, a symbolic achievement for a country trying to show that it is a tech powerhouse. But the United States retook the lead thanks to a machine, called Summit, built for the Oak Ridge National Laboratory in Tennessee.
The Chinese government’s aggressive push to become the leader in technologies like artificial intelligence, microchips and cellular networks has ignited a rivalry with the United States, the traditional front-runner in the digital realm. For years, American companies have accused China of stealing their intellectual property, and lawmakers have said that some Chinese companies, including ZTE and Huawei, pose a national security risk.
The Summit computer, which cost $200 million to build, is not just fast — it is also at the forefront of a new generation of supercomputers that embrace technologies at the center of the friction between the United States and China. The machines are adding artificial intelligence and the ability to handle vast amounts of data to traditional supercomputer technology to tackle the most daunting computing challenges in science, industry and national security.
The numbers used to describe supercomputer speeds are, well, super — as beyond human comprehension as the machines’ performance is beyond human capability.
Summit can do mathematical calculations at the rate of 200 quadrillion per second, or 200 petaflops. If a person did one calculation a second, she would have to live for more than 6.3 billion years to match what the machine can do in a second.
Stupefying? Jack Dongarra, a computer scientist at the University of Tennessee, offered another analogy: The university’s football stadium seats about 100,000 people. If it were full, and everyone in it had a modern laptop, it would take 20 stadiums full of similarly equipped people to match the computing firepower of Summit.
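The two analogies above can be checked with simple arithmetic. The sketch below assumes a Julian year of about 31.6 million seconds and a modern laptop sustaining roughly 100 gigaflops; both figures are our assumptions for illustration, not numbers from the article.

```python
# Back-of-the-envelope check of the two supercomputer analogies.
# Assumptions (ours, not the article's): a year of ~31,557,600 seconds
# and a modern laptop sustaining roughly 100 gigaflops.

SUMMIT_FLOPS = 200e15          # 200 petaflops = 2 x 10^17 calculations/second
SECONDS_PER_YEAR = 31_557_600  # Julian year

# One second of Summit's work, done by hand at one calculation per second:
years = SUMMIT_FLOPS / SECONDS_PER_YEAR
print(f"{years / 1e9:.1f} billion years")  # about 6.3 billion years

# Twenty stadiums of 100,000 laptops at an assumed ~100 gigaflops each:
LAPTOP_FLOPS = 100e9
stadiums = SUMMIT_FLOPS / (100_000 * LAPTOP_FLOPS)
print(f"{stadiums:.0f} stadiums")  # 20 stadiums
```

The laptop figure is the soft assumption here; at 50 gigaflops per machine the answer would be 40 stadiums, but the order of magnitude of the comparison holds.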
Supercomputers now perform tasks that include simulating nuclear tests, predicting climate trends, finding oil deposits and cracking encryption codes. Scientists say that further gains and fresh discoveries in fields like medicine, new materials and energy technology will rely on the approach that Summit embodies.
“These are big data and artificial intelligence machines,” said John E. Kelly, who oversees IBM Research. “That’s where the future lies.”
The global supercomputer rankings have been compiled for more than two decades by a small team of computer scientists led by Mr. Dongarra, who publish the Top 500 list. The newest list will not be released until later this month, but Mr. Dongarra said he had no doubt that the new machine was the fastest.
At 200 petaflops, Summit achieves more than twice the speed of the leading supercomputer in November, when the last Top 500 list was published. That machine is at China’s National Supercomputing Center in Wuxi.
Summit, built by IBM in a partnership with Nvidia, is made up of rows of black, refrigerator-size units that weigh a total of 340 tons and are housed in a 9,250-square-foot room. The machine is powered by 9,216 central processing chips and 27,648 graphics processors that are lashed together with 185 miles of fiber-optic cable.
Cooling Summit requires 4,000 gallons of water a minute, and the supercomputer consumes enough electricity to light up 8,100 American homes.
Supercomputers are a measure of a nation’s technological prowess. It is a narrow measure, of course, because raw speed is only one ingredient in computing performance, and it is software that stirs the machines to life to do useful things.
Although the United States may have regained the top spot, China passed America two years ago for the most supercomputers on the Top 500 list. In the November rankings, China had 202 machines; the United States had 144.
The global supercomputer sprint comes at a time when so much innovation comes from internet giants like Amazon, Facebook and Google in the United States, and Alibaba, Baidu and Tencent in China.
But the big government-supported supercomputer programs, scientists say, take a different technical approach and pursue more long-range research.
Modeling the climate, for example, requires scientific measurements of atmospheric temperatures, moisture and wind patterns that are fed into a huge program. The code may then run across an entire supercomputer for days, in an orchestrated sequence since each change in heat or moisture affects what happens next in the environment. That is a much more tightly coordinated computing task than handling millions of internet searches at once, each a separate task, said Ian Buck, a computer scientist and general manager of Nvidia’s data center business.
Scientists at the government labs are doing exploratory research in areas like new materials to make roads more robust, fundamental new designs for energy storage that might apply to electric cars or energy grids, and new power sources like harnessing fusion.
“Industry is great, and we work with them all the time,” said Rick Stevens, an associate director of the Argonne National Laboratory in Illinois. “But Google is never going to design new materials or design a safe nuclear reactor.”
While impressive, Summit can be seen as a placeholder. Supercomputers that are five times faster — 1,000 petaflops, or an exaflop — are in the works. The Energy Department last month closed the window for bids on three such so-called exascale supercomputers to be built over the next three years. There have been cuts elsewhere in the department, but the budget for its advanced computing program is being increased by 39 percent in the two fiscal years ending September 2019, said Paul Dabbar, the Energy Department’s under secretary for science.
“We’re doing this to help drive innovation in supercomputing and beyond,” Mr. Dabbar said.
Yet China, Japan and Europe all have exascale projects underway. The American lead, scientists say, could be short-lived.
At Oak Ridge, Thomas Zacharia, the lab director, cites a large health research project as an example of the future of supercomputing. Summit has begun ingesting and processing data generated by the Million Veteran Program. Begun in 2011, the Department of Veterans Affairs project is enlisting volunteers to give researchers access to all of their health records, contribute blood tests for genetic analysis, and answer survey questions about their lifestyles and habits. To date, 675,000 veterans have joined; the goal is to reach one million by 2021.
The eventual insights, Mr. Zacharia said, could “help us find new ways to treat our veterans and contribute to the whole area of precision medicine.”
Dr. J. Michael Gaziano, a principal investigator on the Million Veteran Program and a professor at the Harvard Medical School, said that the potential benefit might well be a modern, supercharged version of the Framingham Heart Study. That project, begun in 1948, tracked about 5,000 people in a Massachusetts town.
Over a couple of decades, the Framingham study found that heart disease, far from having the single cause posited by earlier explanations, had multiple contributing causes, including blood cholesterol, diet, exercise and smoking.
Today, given the flood of digital health data and supercomputers, Dr. Gaziano said that population science might be entering a new golden age.
“We have all this big, messy data to create a new field — rethinking how we think about diseases,” he said. “It’s a really exciting time.”