Network World (11/18/10) Jon Brodkin
At SC10, the annual supercomputing conference, high-performance computing (HPC) experts debated whether the industry will reach exaflop performance by 2020, and whether the achievement will be worth the expense, which could exceed $1 billion. The experts agreed that the industry can reach exascale within 10 years, but warned that such a machine could end up being too specialized to solve a broad range of problems. Exascale machines could help researchers cure diseases, improve climate research, and strengthen the ability to respond to natural and man-made disasters, according to some scientists on the panel.

However, politics could stand in the way of exascale computing. “Let me be blunt, [the U.S. Defense Advanced Research Projects Agency] doesn’t seem to have any interest at this point in exascale,” says the University of Illinois’ Marc Snir. One major challenge for future exascale systems will be power management, says the University of Notre Dame’s Peter Kogge. HPC researchers may want to collaborate with smartphone manufacturers, which face similar power constraints in extending battery life, says Microsoft’s Burton Smith.

In addition, supercomputers might not be as useful as their measured speed suggests if data management is not handled efficiently, says the San Diego Supercomputer Center’s Allan Snavely. He says there should be “envelopes of usefulness around machines” that apply to several different applications. Such an envelope would describe attributes such as memory operations and other measures that determine whether a computer is suited to a particular application.