What Is a Supercomputer? The Top 10 Supercomputers in the World

The dictionary definition of a supercomputer is “a particularly powerful mainframe computer.”

The “power” of a supercomputer is defined by how many operations the system can process every second.


This is a broad definition with a goalpost that moves every time computer technology advances. Today's fastest personal computers process operations at speeds that only supercomputers could reach a few decades ago. That's how fast this technology changes.

So what is a supercomputer, really?

Let’s take a deeper look at what supercomputers look like today to find out.

The World’s Most Powerful Supercomputer

For many years, the US, China, and Japan have battled to own the fastest supercomputer in the world. "Fast" is measured in petaflops: a petaflop is one thousand teraflops, or one quadrillion floating-point operations per second.
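To make those units concrete, here is a minimal Python sketch (the constant names are my own, purely illustrative) of how flops, teraflops, and petaflops relate:

```python
# Unit conversions for floating-point operations per second (FLOPS).
# 1 teraflop/s = 10**12 FLOPS; 1 petaflop/s = 10**15 FLOPS.
TERAFLOP = 10**12
PETAFLOP = 10**15

def petaflops_to_flops(pf):
    """Convert a petaflop/s figure to raw floating-point operations per second."""
    return pf * PETAFLOP

# One petaflop is a thousand teraflops, as stated above.
assert PETAFLOP == 1000 * TERAFLOP

# The Sunway TaihuLight's 93.01 petaflop/s, expressed in raw FLOPS:
print(f"{petaflops_to_flops(93.01):.4e}")  # 9.3010e+16
```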

By 2013, the two most powerful supercomputers in existence were:

  • The Cray Titan at Oak Ridge, boasting a processing capability of 17.59 petaflops.
  • The IBM Sequoia at Lawrence Livermore National Laboratory, pushing 17.17 petaflops.

However, that year China took the lead with the NUDT Tianhe-2 in Guangzhou, at a whopping 33.86 petaflops. This system reigned until 2016, when the Sunway TaihuLight in Wuxi, China claimed the crown with an impressive 93.01 petaflops.

According to Top500, the next king of supercomputers should be Summit, a supercomputer at Oak Ridge National Laboratory expected to go live in the summer of 2018.

Looking Deeper Into a Supercomputer

Now that you know the state of the current supercomputer race, let's take a closer look at the inner workings of these impressive marvels of technology, starting with the Sunway TaihuLight.

The TaihuLight is a super-efficient computer with the following processor architecture:

  • 1.45 GHz SW26010 processors, each with four groups of cores.
  • Each core group contains 65 cores for a total of 260 cores per node.
  • Each system cabinet contains 1,024 nodes.
  • The entire system contains 40 cabinets.
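Those architecture figures multiply out to the total core count reported by Top500. A quick sketch in Python (variable names are my own) shows the arithmetic:

```python
# Core count of the Sunway TaihuLight, from the architecture described above.
core_groups_per_node = 4   # each SW26010 processor has four core groups
cores_per_group = 65       # 65 cores per group, as described above
nodes_per_cabinet = 1024
cabinets = 40

cores_per_node = core_groups_per_node * cores_per_group  # 260
total_cores = cores_per_node * nodes_per_cabinet * cabinets

print(total_cores)  # 10649600 -- the 10,649,600 cores on the Top500 list
```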

To connect all these cabinets, the builders of this impressive system created a custom network of PCIe 3.0 connections dubbed the "Sunway Network".

The network connects the switches, resource-sharing hardware, and all the supernodes via 7-inch cables that transmit data at 70 terabytes per second.

The following is the current list of the ten fastest supercomputers in the world, according to Top500.

  1. Sunway TaihuLight (China): 10,649,600 cores, 93,014 TFlop/s
  2. Tianhe-2 MilkyWay-2 (China): 3,120,000 cores, 33,962 TFlop/s
  3. Piz Daint (Switzerland): 361,760 cores, 19,590 TFlop/s
  4. Gyoukou (Japan): 19,860,000 cores, 19,136 TFlop/s
  5. Titan (United States): 560,640 cores, 17,590 TFlop/s
  6. Sequoia (United States): 1,572,864 cores, 17,173 TFlop/s
  7. Trinity (United States): 979,968 cores, 14,137 TFlop/s
  8. Cori (United States): 622,336 cores, 14,015 TFlop/s
  9. Oakforest-PACS (Japan): 556,104 cores, 13,555 TFlop/s
  10. K Computer – Sparc64 (Japan): 705,024 cores, 10,510 TFlop/s

It’s a vast playing field involving advanced laboratories and government facilities around the world. The leaders earn the respect of computer experts globally, and the race for the top is politically charged as well. And with supercomputer time costing thousands of dollars per hour, the institutions that operate these machines can earn significant revenue, too.

What Are Supercomputers Used For?

Why would anyone need a computer system that can process quadrillions of floating point operations per second?

The reality is that across many different industries, there are needs that require massive computational power. Here are just a few supercomputer uses in industry, government, and the military.

A 2014 report by the International Data Corporation (IDC) dove into dozens of real-world scenarios where supercomputers were used. The following examples are drawn from that study.


  • General Electric, a major player in the aerospace industry, collaborated with computer experts at Oak Ridge National Laboratory to produce advanced jet engine simulations. The simulations helped GE identify an engine phenomenon, which in turn allowed the company to improve its overall fuel efficiency.
  • Scientists at Lawrence Livermore National Labs used the supercomputers there to develop a new technique for subsurface data gathering. This allowed the US oil and gas industry to more easily identify oil reserves in the Gulf of Mexico and reduce America’s dependency on foreign oil.
  • Boeing engineers utilised supercomputers to create aircraft simulations that led to better aerodynamic designs, so they could produce more fuel-efficient and safer aircraft.
[Image: Boeing 2013 simulation]


  • The Centers for Disease Control and Cornell University collaborated to create a highly detailed model of the hepatitis C virus. Using a supercomputer at Cornell University, researchers were able to develop new therapies that eventually assisted the medical community in reducing or curing liver disease in patients.
  • The US Department of Defense used a supercomputer to develop new weather models that would help meteorologists predict potentially dangerous hurricanes and cyclones. The more advanced computer models of these storms made it possible to predict the dangers up to five days before impact.


  • The US Army uses supercomputers at the Army Research Laboratory to run advanced simulations that help researchers conduct “destructive live experiments and prototype demonstrations.” These would otherwise be cost-prohibitive to perform with real equipment.
  • One of the most unusual supercomputers used by the US military was called the “Condor Cluster,” created by the US Air Force in 2010. Engineers there connected 1,760 Sony PlayStation 3 consoles together to create the supercomputer core. It was capable of 500 TFlops and used for tasks like pattern recognition, processing satellite imagery, and conducting artificial intelligence research.

As you can see, demand for advanced computing power spans industry, government agencies, and the military.

Are Quantum Computers Next?

Every advancement in supercomputer speed is a direct result of a higher number of transistors packed into microprocessors on a smaller and smaller scale. As these microcircuits continue to shrink toward an atomic scale, many futurists predict that the next stage in supercomputers will be in the realm of quantum computers.

What Is a Quantum Computer?

Quantum computing is a unique take on computer technology. Instead of the traditional transistor-based microprocessor, scientists hope to capitalise on manipulating the state of subatomic particles.

Subatomic particles obey very strange laws of physics, many of which researchers have discovered only in the very recent past. By experimenting with various techniques to control the quantum states of these particles, scientists hope to replace the 1s and 0s of a classic transistor with an equivalent 1 or 0 state of these quantum particles.

These two-state particles are known as quantum bits (or "qubits").

The mind-bending aspect of quantum computing is that these particles can occupy a blend of states at once, a property known as superposition, which may allow these computers to store more than just a 1 or a 0.
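One way to picture a qubit is as a pair of complex amplitudes rather than a single bit. The plain-Python sketch below is purely illustrative (real quantum hardware does not work by storing arrays); it shows a qubit that is definitely 0 alongside one in an equal superposition of 0 and 1:

```python
import math

# A qubit's state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# |a|^2 is the probability of measuring 0, and |b|^2 of measuring 1.
zero = (1 + 0j, 0 + 0j)                               # definitely 0, like a classical bit
superposition = (1 / math.sqrt(2), 1 / math.sqrt(2))  # equal mix of 0 and 1

def measure_probabilities(state):
    """Return the probabilities of measuring 0 and 1 for a given state."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

print(measure_probabilities(zero))           # (1.0, 0.0)
print(measure_probabilities(superposition))  # approximately (0.5, 0.5)
```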

Are There Any Quantum Supercomputers?

The reason you haven’t heard about a quantum supercomputer making the Top500 list is simple: no stable quantum supercomputer exists yet.

IBM does offer online access to a 20-qubit system for researchers to use. Researchers typically use quantum computers for the specific calculations and analyses they’re well suited to, such as chemical simulations.

The creators of quantum computers use terms that, until recently, were mostly theoretical in nature. But now, the scientists building these systems describe very real interactions between qubits as “entanglement” and specific states of qubits as “coherence.” This makes it almost impossible to compare these systems directly to traditional computers.

Large systems that might have some hope of competing with today’s traditional supercomputers are still too unstable. For example, in November of 2017, IBM announced it had built a 50-qubit quantum computer. However, it could only hold its quantum state for 90 microseconds.


Quantum computers hold great promise for the future, but right now they offer little competition for traditional, silicon-based supercomputers.
