RIKEN Plans Exascale Supercomputer ‘30 Times Faster’ than Today’s Fastest in Six Years

Japan hopes to bring its first exascale computer online by 2020.


Forget 2014; let’s talk about what to expect in 2020, just six years from now: say, a supercomputer finally capable of mongo-calculative deftness on par with what some believe to be the processing oomph of a single human brain.

Let’s put a number around that: one exaflops (FLOPS standing for “floating-point operations per second,” the industry benchmark in scientific calculation horsepower) equals 1 × 10^18, or one quintillion (a million trillion), operations per second. That’s a one followed by 18 zeroes, three more than you’d drop after a petaflops machine. Today’s fastest supercomputer, China’s Tianhe-2, is capable of 33.86 petaflops according to the official TOP500 ranking (as of November 18, 2013); an exascale computer would be roughly 30 times faster.

Japan-based RIKEN, headquartered just outside Tokyo, says it’s been selected by Japan’s Ministry of Education, Culture, Sports, Science and Technology to design an exascale supercomputer, and that this nascent exa-behemoth will begin working in 2020.

RIKEN says its exascale supercomputer will be “about 100 times faster than the K computer, RIKEN’s currently operating machine.” The K computer clinched the “fastest supercomputer in the world” spot in 2011.
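To sanity-check those multipliers, here’s a minimal Python sketch; the Tianhe-2 and K computer figures below are their published LINPACK results, and the rest is just powers of ten:

    # FLOPS unit arithmetic: exa vs. peta
    PETA = 10**15             # one petaflops = 10^15 floating-point ops/sec
    EXA = 10**18              # one exaflops = 10^18, i.e. 1,000 petaflops

    tianhe2 = 33.86 * PETA    # Tianhe-2, TOP500 LINPACK score (Nov. 2013)
    k_computer = 10.51 * PETA # RIKEN's K computer, LINPACK score

    print(EXA / tianhe2)      # ~29.5, hence "roughly 30 times faster"
    print(EXA / k_computer)   # ~95.1, close to RIKEN's "about 100 times" claim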

The first exascale computers had been projected to see the light of day by 2018, but the date keeps slipping, and some think even 2020 is too soon. Horst Simon, deputy director of Lawrence Berkeley National Laboratory, put up $2,000 of his own money earlier this year on a bet that we won’t reach exascale computing power by 2020. The reason? According to Simon’s May 6, 2013 presentation, titled “Why we need Exascale and why we won’t get there by 2020,” existing multicore architectures don’t address critical power scalability issues: we could reach exascale processing with existing technology, but the power requirements at that scale are extraordinary.
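Simon’s power-wall argument is easy to see with a back-of-envelope extrapolation. Here’s a minimal sketch, assuming Tianhe-2’s published figures (33.86 petaflops at a reported 17.8 megawatts) and naive linear scaling, which is roughly what simply building a bigger machine from today’s parts would give you:

    # Naive power extrapolation: what would an exascale machine draw
    # if flops-per-watt never improved beyond Tianhe-2's numbers?
    TIANHE2_PFLOPS = 33.86    # LINPACK performance, petaflops
    TIANHE2_MW = 17.8         # reported power draw, megawatts
    EXASCALE_PFLOPS = 1000.0  # one exaflops = 1,000 petaflops

    power_mw = (EXASCALE_PFLOPS / TIANHE2_PFLOPS) * TIANHE2_MW
    print(round(power_mw))    # ~526 MW, about half a nuclear reactor's output

The oft-cited informal target for a practical exascale system is around 20 megawatts, more than an order of magnitude below that naive extrapolation, and that gap is exactly what Simon argues existing multicore architectures can’t close by 2020.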

And getting to exascale is a big deal, much bigger than just bragging rights, because exascale computing is expected to do a lot more than nose around the periphery of “singularity” wonks’ brain-ware fantasies. For instance, it’s expected to yield dramatically (and even that word’s an understatement) more accurate climate simulations and weather forecasting, combustion engine advances, “extreme resolution” assessments of nuclear stockpile life expectancy, and radically better energy storage (battery) tech, while cutting research development times, from discovery to market, in half.

In short, whoever gets to exascale first is going to have an enormous advantage — to the extent you’re hearing people talk about the exascale computing paradigm in deadly serious national security terms.

1 comment
dobermanmac

Designing an exascale supercomputer isn’t a very concrete thing, because chip materials, architectures, and types keep changing. For instance, you can use massively parallel processing utilizing multi-core GPUs (graphics chips). Or you can use quantum computing strategies (like the Quantum Artificial Intelligence Lab harnessing the D-Wave). Or you can share the processing load among neuromorphic chips that are just coming out, graphics cards that are continually upgraded thanks to the demand for high-end video games running in real time, and the conventional von Neumann chips that are the main focus of chip manufacturing and standardized computing.

The long and short of it is that combining optimal software with inventive strategies for optimizing continually improving hardware will probably give a group the fastest computing in the shortest time frame. Besides, who knows when AI will progress to the point where it can improve itself faster than human organizations can? Let me make a prediction: by the time this Japanese firm possesses an exascale supercomputer, it will be outdated by several years, and they will have made several major changes to their original strategy for achieving the goal.

The Singularity is coming, and it will change EVERYTHING.