Jefferson Lab cluster tops 100 teraflops

The fastest computer system in Hampton Roads has booted up with more than 100 teraflops of processing power. Located at the Department of Energy's Thomas Jefferson National Accelerator Facility, the cluster computer system was recently upgraded with video game components to assist scientists in modeling the smallest bits of matter in the universe.

"Our resources crossed 100 Teraflops of sustained processing while running a science application. We finally hit the peak in late September," said Chip Watson, manager of Jefferson Lab's High-Performance Computing group in the Information Technology Division.

The supercomputer, called Hadron, was built with components that were purchased and cobbled together over the last year. It runs on both graphics cards and ordinary computer processors. About 90 percent of its computing power comes from the graphics processing units, or GPUs, on the video game cards.

The Hadron system contains 480 GPUs and 266 CPUs mounted inside computing chassis and screwed into racks side by side. The two sections that run mainly on GPUs are dubbed 9g and 10g ('g' for GPU).

"We bought two different types of GPUs. We bought gaming cards, and we bought in the professional line for supercomputing. These cards are similar to the gaming cards, but they are configured somewhat differently and have error-correcting code built-in," Watson explained. "Some calculations we can do equally well on either card, and we do those calculations on the gaming cards. There are some calculations that involve much more, and those have to be on the professional-quality cards."

Hadron is being used to compute how quarks, the building blocks of matter, form the protons, neutrons and other particles that make up everyday matter.

"GPUs are being used solely for propagator calculations - computing how a quark will move from one point in space to another point," said Robert Edwards, a senior staff scientist in Jefferson Lab's Theory group.

Once those calculations are done, the scientists then take the results and use them on a CPU-powered cluster computer. "We take the results from the GPUs and we tie this group of quarks together to make mesons or baryons. From this, we can compute the mass of these particles and their excited states," Edwards explained.
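The step Edwards describes amounts to contracting quark propagators into a two-point correlation function whose exponential fall-off in time encodes the particle's mass, with excited states appearing as faster-decaying corrections. A minimal sketch of the mass extraction, assuming a synthetic correlator in place of real propagator data:

```python
import numpy as np

# Hypothetical synthetic correlator: in a real calculation C(t) is built by
# contracting the quark propagators from the GPU stage; here we fake one with
# a ground state plus one excited state (masses in lattice units, made up).
t = np.arange(32)
m_ground, m_excited = 0.25, 0.60
correlator = 1.0 * np.exp(-m_ground * t) + 0.3 * np.exp(-m_excited * t)

# Effective mass m_eff(t) = log(C(t) / C(t+1)); as t grows, the excited-state
# contamination dies away and m_eff plateaus at the ground-state mass.
m_eff = np.log(correlator[:-1] / correlator[1:])
print(m_eff[-5:])   # values approach 0.25, the ground-state mass
```

At large times the effective mass plateaus at the ground-state value; fits that keep the sub-leading exponentials are what give access to the excited states Edwards mentions.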

Watson used about $1 million of a nearly $5 million grant to purchase the 352 gaming cards and 128 professional graphics cards, plus associated hardware, currently installed in the Hadron system. The grant was received as part of ARRA (American Recovery and Reinvestment Act) funding under the auspices of the Department of Energy's USQCD (US Quantum Chromodynamics) collaboration. A chunk of the grant also went toward funding the effort to create computer code to adapt the GPUs for scientific computing.

Watson said that some of the GPUs and CPUs purchased for Hadron are still undergoing testing before being added to the system. "When we get all of the GPU systems running, the GPUs alone will exceed 100 teraflops," he said.

In the meantime, Watson said he and his team continue to monitor the marketplace for the next big leap in technology.

"It looks like GPUs will be the hot topic for the next two or three years. We're keeping an eye on several technologies that could develop, and we're looking for something better that comes along," Watson said. "But it's been a fun year deploying this disruptive technology. Quite a fun year."

More information: An article about the first portion of the Hadron system (9g) to come online is available here: www.jlab.org/news/OnTarget/201 … /2010-06/Story1.html

Provided by Jefferson Lab

Citation: Jefferson Lab cluster tops 100 teraflops (2010, October 15) retrieved 19 March 2024 from https://phys.org/news/2010-10-jefferson-lab-cluster-tops-100.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
