The San Diego Supercomputer Center

January 12, 2010 at 6:45 pm

One of the highlights of our meeting in La Jolla last week was a quick tour through the computing room at the San Diego Supercomputer Center. WARNING: The following post includes some serious geekery! Those who aren’t turned on by massively parallel computing prowess might want to stop reading now.

Kate at the San Diego Supercomputer Center!

My research group is known for its innovative approaches to global circulation modeling, and if there’s anybody in the world who uses a lot of cycles on big computers, it’s definitely us. So, it made sense for a few of us to go check out the big computers at the UC San Diego center, where our research has run more than a few trillion calculations.

The control station for all of those big-ass computers!

The building and computer rooms are always in a state of flux in this kind of facility. New machines are brought in all the time, new groups form to study new projects, and new people come and go. Last week, when I walked into the building, the first door on the right was for a Neural Network (Artificial Intelligence) in-house research group. If anybody is going to create a computer that takes over the world, it would be these people.

Christina checks out the Triton

We wandered through the computer room, looking at huge supercomputers, both old and new. The newest, biggest machine was the Triton Resource. This computer has 256 nodes with 8 processing cores on each node, which gives it more than 500 times the processing power of the most powerful desktop computers. It certainly isn’t the most powerful supercomputer in the world today, but it has some unique features. Each of those 256 eight-core nodes comes with 24 GB of memory, which makes this computer very, very good at sifting through huge amounts of data very quickly.
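
If you want to see what those specs add up to, here’s the quick back-of-the-envelope arithmetic in Python. The node count, cores per node, and memory per node are the figures quoted above; everything else is just multiplication, not anything official from SDSC.

```python
# Rough totals for the Triton Resource, using the numbers from the post.
nodes = 256           # compute nodes
cores_per_node = 8    # processing cores per node
mem_per_node_gb = 24  # memory per node, in GB

total_cores = nodes * cores_per_node    # 2,048 cores in all
total_mem_gb = nodes * mem_per_node_gb  # 6,144 GB, roughly 6 TB of RAM

print(f"Total cores:  {total_cores:,}")
print(f"Total memory: {total_mem_gb:,} GB (~{total_mem_gb / 1024:.0f} TB)")
```

That aggregate memory, spread across the nodes, is what lets the machine chew on big datasets without constantly going back to disk.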

A ginormous storage array

This is the specific challenge of supercomputing that UCSD has decided to tackle: the overwhelming tsunami of data that results from these huge model runs. The image above is of a room-sized hard-drive array. These people don’t even really know how much storage they have; the numbers are too big to wrap your brain around. But it’s what we need right now. With climate models doing 200-year runs, and saving the state of the entire world 4 times a simulated day, the trick isn’t having the cycles to run the model, but having the space available to store all that data. And UCSD’s Supercomputer Center has it all!
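
For a feel of why storage, not cycles, is the bottleneck, here’s a rough sketch of the output from one of those runs. The 200-year length and 4 saves per simulated day come from the paragraph above; the size of a single saved state (SNAPSHOT_GB) is a made-up placeholder, since the real number depends entirely on the model’s resolution and how many fields you write out.

```python
# Rough estimate of output from a 200-year run saving state 4x per simulated day.
years = 200
days_per_year = 365
saves_per_day = 4
SNAPSHOT_GB = 1.0  # hypothetical size of one saved model state, in GB

snapshots = years * days_per_year * saves_per_day  # 292,000 snapshots
total_tb = snapshots * SNAPSHOT_GB / 1024          # total volume in TB

print(f"Snapshots per run: {snapshots:,}")
print(f"Storage at {SNAPSHOT_GB} GB per snapshot: ~{total_tb:.0f} TB")
```

Even with a modest one gigabyte per snapshot, a single run lands in the hundreds of terabytes, and that’s before anyone does an ensemble of runs. Hence the room-sized storage array.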