Flashy, storage-happy supercomputers due in 2015

Supercomputers funded by the National Science Foundation and due for deployment in 2015 will have petabytes of storage

Supercomputing speed is typically boosted by adding more processors, but two new systems funded by the National Science Foundation, due to go live next January, will take an unconventional approach to speeding up calculations and data analysis.

Arrays of memory and flash storage -- totaling petabytes of capacity -- will be loaded on the Wrangler supercomputer at the Texas Advanced Computing Center (TACC) at the University of Texas at Austin and the Comet supercomputer at the San Diego Supercomputer Center (SDSC) at the University of California, San Diego. The supercomputers, which are currently under construction, have a new design with high levels of storage relative to the number of processors in the system.

The supercomputers will provide better throughput, in-memory and caching features, which could be a faster and more efficient way to solve complex problems, said NSF in a budget request published this week as part of President Barack Obama's 2015 US$3.9 trillion budget proposal sent to Congress.

The new batch of supercomputers will support research in disciplines such as economics, geosciences, medicine, earthquake engineering and climate and weather modeling.

NSF is requesting $7 billion to fund scientific research, of which $894 million is dedicated to research in areas such as software, chip manufacturing, semiconductors, cybersecurity and cognitive computing systems. NSF also funds the construction of supercomputers so scientists have access to computing resources for simulation and other tasks. The supercomputers are being built as part of NSF's Extreme Digital (XD) program, in which scientists share computing resources to advance research.

Compared to what NSF has funded in the past -- including the Blue Waters system -- the new machines have a different design, said Dan Olds, principal analyst at Gabriel Consulting Group.

Processors and other computing resources already deliver high levels of performance, but the real bottleneck has been throughput. NSF wants more sophisticated supercomputing designs so bits and bytes move between processing elements faster, Olds said.

"It has to do with the changing nature of high-performance computing," Olds said. "They want to control massive data streams instead of handling batch [jobs]."

The Comet supercomputer is more "suitable for both high throughput and data-intensive computing," NSF said. "Its heterogeneous configuration will support not only complex simulations, but also advanced analytics and visualization of output."

Servers are increasingly packing large arrays of DRAM for in-memory computing, which is considered beneficial for databases and other data-intensive applications. Solid-state drives are being used as a cache layer on which data is temporarily stored before being processed. SSDs are also becoming primary storage at the expense of hard drives, which are slower and more power hungry.
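
To illustrate the idea of a flash-backed cache tier sitting between DRAM and primary storage, here is a minimal Python sketch. The class name, capacities and file layout are hypothetical and do not represent either supercomputer's actual software stack.

```python
import os
import pickle

class TieredCache:
    """Illustrative two-tier cache: hot data in RAM, overflow spilled to a flash/SSD directory."""

    def __init__(self, ram_capacity, ssd_dir):
        self.ram_capacity = ram_capacity   # max number of objects kept in memory
        self.ram = {}                      # in-memory (DRAM) tier
        self.ssd_dir = ssd_dir             # directory on an SSD-backed file system
        os.makedirs(ssd_dir, exist_ok=True)

    def put(self, key, value):
        # Keys are assumed to be filename-safe strings in this sketch.
        if len(self.ram) >= self.ram_capacity:
            # Evict one entry from DRAM to the flash tier.
            old_key, old_value = self.ram.popitem()
            with open(os.path.join(self.ssd_dir, old_key), "wb") as f:
                pickle.dump(old_value, f)
        self.ram[key] = value

    def get(self, key):
        if key in self.ram:                # fast path: DRAM hit
            return self.ram[key]
        path = os.path.join(self.ssd_dir, key)
        if os.path.exists(path):           # slower path: read back from flash
            with open(path, "rb") as f:
                return pickle.load(f)
        return None                        # miss: caller fetches from primary storage

if __name__ == "__main__":
    cache = TieredCache(ram_capacity=2, ssd_dir="/tmp/flash_tier")
    for i in range(4):
        cache.put(f"key{i}", list(range(1000)))
    print(cache.get("key1") is not None)   # key1 was spilled to flash and read back
```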

Comet will be built by Dell and will have 1,024 processor cores, a massive 7PB array of high-performance storage and 6PB of "durable storage for data reliability," according to specifications published by SDSC. The supercomputer will use Intel Xeon chips and Nvidia graphics processors. Each node will have 128GB of memory and 320GB of flash, though it is unclear how many nodes the supercomputer will have. There will also be special nodes with 1.5TB of memory. It will have 100 Gigabit Ethernet and the InfiniBand interconnect for throughput. The system is built on the Lustre file system, which is designed to overcome bottlenecks on distributed computing systems.

"The Comet project ... is designed to efficiently deliver significant computing capacity (two petaflops) for the 98 percent of research that requires fewer than 1,000 simultaneous and tightly coupled cores to be conducted," NSF said.

SDSC is not saying much more about Comet as it goes through validation and deployment, said Jan Zverina, director of communications and media relations at the center, in an email. More details are likely to be shared later this year, Zverina said.

TACC's Wrangler will combine 120 servers with Intel Xeon server chips code-named Haswell. It was touted by NSF as the "most powerful data analysis system allocated in XD, with 10 petabytes (PB) of replicated, secure, high performance data storage." It will have 3,000 processing cores dedicated to data analysis, and flash storage layers for analytics. The supercomputer's bandwidth will be 1TBps (terabytes per second) and 275 million IOPS (input/output operations per second).
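
For context, dividing the quoted aggregate bandwidth by the quoted IOPS figure gives the average transfer size at which Wrangler could sustain both numbers at once. This is simple arithmetic on the published specs, not a statement about how the system will actually be driven.

```python
# Relationship between the quoted aggregate bandwidth and IOPS figures.
bandwidth_bytes_per_s = 1e12      # 1 TB/s aggregate bandwidth
iops = 275e6                      # 275 million I/O operations per second

avg_io_size = bandwidth_bytes_per_s / iops
print(f"Average I/O size to hit both limits: {avg_io_size / 1024:.1f} KB")  # ~3.6 KB per operation
```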

NSF's research priorities are relevant to the problems faced in computing today, Olds said, adding that the government agency is heading in the right direction on supercomputer development.

Agam Shah covers PCs, tablets, servers, chips and semiconductors for IDG News Service. Follow Agam on Twitter at @agamsh. Agam's e-mail address is agam_shah@idg.com
