SKA telescope to provide a billion PCs' worth of processing

Petaflop machines the way of the future

Two technologies currently under research by IBM may hold the key to processing and storing the exabyte (10^18 bytes) of data expected to flow each day from the Square Kilometre Array (SKA) telescope project.
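For a sense of scale, a back-of-envelope calculation (illustrative only, using the rounded figure above) puts that daily volume at a sustained rate of roughly 11-12 terabytes per second:

# Back-of-envelope: sustained data rate implied by an exabyte per day.
# Illustrative arithmetic only, based on the rounded figure quoted above.
bytes_per_day = 10**18            # one exabyte
seconds_per_day = 24 * 60 * 60    # 86,400 seconds

rate = bytes_per_day / seconds_per_day
print(f"{rate / 10**12:.1f} TB/s sustained")   # ~11.6 TB/s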

The company, which is part of a research consortium that includes Australia's Commonwealth Scientific and Industrial Research Organisation (CSIRO), Curtin University of Technology and the University of Western Australia, is currently working down a technology roadmap that leads to the development of an exaflop machine — the processing equivalent of about a billion PCs. It is also developing a new form of solid-state storage, 'Racetrack Memory', which may hold the key to storing the SKA's vast volumes of astronomical data.

Last month, the New Zealand and Australian governments announced they would make a joint bid for the $3.1 billion SKA, a massive radio telescope distributed across Australia and New Zealand.

“Today’s announcement is the fruition of five years’ work by many people, under the umbrella of SKA New Zealand. SKA represents an enormous opportunity for New Zealand, with the potential for massive scientific breakthroughs, job creation and economic spinoffs," said Professor Sergei Gulyaev, director of AUT University’s Institute for Radio Astronomy and Space Research, at the time.

Much of the work on solving the massive engineering problems posed by the SKA is still at the whiteboard and analysis stage, but real progress is being made, according to Glenn Wightwick, director of IBM's Australia Development Laboratory for ANZ.

“In the last year or two IBM has built machines in the order of a petaflop and in the last couple of weeks IBM announced an ongoing partnership with the US Department of Energy to build a 20 petaflop machine by 2011-2012,” he said.

“[But] we know we need to build an exaflop machine for the SKA over the next 10 years — that’s 50 times larger and equivalent to about 1 billion PCs — so that gives you a sense of the size of the problem.”

Despite the size of the challenge, Wightwick said IBM’s roadmaps, and its belief that processor technology could be driven much further, meant it was unlikely that a billion cores would be required in 10 years’ time.

“We will need machines which probably have hundreds of thousands of processor cores in them and we roughly know how we can go about engineering it,” he said. “It wouldn’t be cost- or technically-feasible to bolt together 50 20-petaflop machines… and the power consumed would be crazy. By the time we deliver the 20 petaflop machine we will be well on the way to an exaflop machine.”
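As a rough illustration of that scaling (a sketch only, using the round numbers quoted above and an assumed figure of about one gigaflop per desktop PC, which is not from the article), the arithmetic works out as follows:

# Illustrative arithmetic for the scaling figures quoted above.
petaflop = 10**15
exaflop = 10**18

target = exaflop                 # exaflop-class machine needed for the SKA
current = 20 * petaflop          # the 20-petaflop machine mentioned above
print(target / current)          # 50.0 -> "50 times larger"

pc_flops = 10**9                 # assumed ~1 gigaflop per desktop PC (assumption)
print(target / pc_flops)         # 1e9 -> "about 1 billion PCs"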

Given the need to process data streaming off the SKA telescope in real time, it was unlikely IBM or the rest of the SKA research consortium would look to utilise grid or cloud computing environments, Wightwick said.

“Moving that much data around is a challenging problem. It’s therefore likely to be a machine or machines that are scaled large enough to process the volume,” he said. “In the past you might have just thought about how many processors you need and whether you can put enough of them together.

"Today, we have to think about how many processors we can feasibly put together, how much power and cooling we can get, where the machine can be sited, how much data we can transfer. It is a really interesting challenge.”

Storage is an equally challenging problem, Wightwick said. To address this, the SKA project was likely to use some form of stream computing to analyse data on the fly, sending useful data to storage and discarding the rest.
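The article does not describe how that stream-processing stage would work in practice; the following is only a minimal sketch of the general idea, with a made-up is_useful test and an in-memory archive standing in for real signal analysis and real storage:

# Minimal sketch of on-the-fly stream filtering (illustrative only; the
# chunk format, is_useful test and archive are hypothetical stand-ins).

def is_useful(chunk, threshold=0.5):
    # Placeholder test: keep chunks whose mean amplitude exceeds a threshold.
    return sum(chunk) / len(chunk) > threshold

def process_stream(incoming_chunks, archive):
    kept = discarded = 0
    for chunk in incoming_chunks:
        if is_useful(chunk):
            archive.append(chunk)    # send useful data on to storage
            kept += 1
        else:
            discarded += 1           # drop the rest on the fly
    return kept, discarded

# Toy usage:
storage = []
print(process_stream([[0.1, 0.2], [0.8, 0.9]], storage))   # (1, 1)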

“There will still be a massive storage capability and we aren’t even sure what it might look like,” he said. “Disk drives are approaching 1.5TB and we will continue to see increases in drive technology, but there are a lot of interesting storage technologies coming on board (such as Racetrack Memory).

“It is very fast, unlike flash, and is denser. It is still a research project but we know — based on these projects — what it is capable of. Five or 10 years out, it may be applicable to deploy in this project.”

IBM was also working with the CSIRO on developing advanced algorithms in areas such as correlation, which facilitate the integration of high volumes of data from multiple telescopes to create more detailed astronomical images.
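To give a flavour of what correlation means here (purely an illustrative sketch with simulated data, not IBM's or CSIRO's actual algorithms), cross-correlating the signals from two antennas recovers the relative delay between them, which interferometers use when combining data into images:

# Illustrative cross-correlation of two simulated antenna signals
# (toy data, not the real SKA correlator).
import numpy as np

rng = np.random.default_rng(0)
sky = rng.standard_normal(1024)                      # common sky signal (toy)
antenna_a = sky + 0.1 * rng.standard_normal(1024)    # receiver noise added
antenna_b = np.roll(sky, 5) + 0.1 * rng.standard_normal(1024)   # delayed copy

corr = np.correlate(antenna_a, antenna_b, mode="full")
lag = np.argmax(corr) - (len(antenna_a) - 1)
print(lag)   # about -5: recovers the inserted 5-sample offset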

IBM also recently announced that it had partnered with Western Australia’s International Centre for Radio Astronomy Research (ICRAR) on the technology for the SKA.

The telescope, 50 times more sensitive than current instruments, will use approximately 3,600 antennae spread over thousands of kilometres to peer into deep space.

The SKA will capture data on the evolution of galaxies, dark matter and dark energy, providing insight into the origins of the universe around 13 billion years ago.

AUT University has begun building a prototype of the 12-metre-diameter radio telescope required for the SKA at Warkworth. Around 5,000 such telescopes would be needed across Australia and New Zealand if the joint bid succeeds.
