UI builds supercomputer — Memory of the Big-STEM computer gives researchers advantage

A view of the back of the university’s new “super computer.” This super computer is made up of several computers that share a workload and is securely housed in the Buchanan Engineering Lab. It is one of the nation’s most powerful computers.

The Big-STEM computer is one of the most powerful computers in the nation. It gives University of Idaho researchers an advantage in furthering their research, said Jim Alves-Foss, director of the Center for Secure and Dependable Systems, who manages the Big-STEM project.


“It allows our faculty to experiment with much more interesting problems,” Alves-Foss said.

The project began in fall 2011 when a group of junior faculty tried to conduct complicated research but kept running into computing limitations, he said. No machine they tried at UI, or even at the Idaho National Laboratory, could give them the computing power they needed.

After discussing a way to solve their problems, the faculty approached Alves-Foss, asking him and others to help find a larger computer they could all share, he said.

The Big-STEM project team wrote a proposal and received $300,000 from the National Science Foundation last summer, the first half of its funding, Alves-Foss said. However, the team waited until fall to start buying the equipment, because they needed more funding and were waiting for technology updates.

The real power of the Big-STEM computer is its four terabytes of memory, Alves-Foss said. UI received $240,000 in funding from the Murdock Charitable Trust to double that memory, so when the machine goes online this summer, it will have eight terabytes of memory — 4,000 times the memory of an average computer.

The Big-STEM computer consists of multiple motherboards with processors in a single chassis about the size of a microwave, he said. Each processor holds one-eighth of the memory, with a high-speed data connection between the motherboards allowing them to share memory in a way that has never been done before, Alves-Foss said. Memory of this size gives researchers a chance for more accurate studies, he said.

“When you are doing a simulation of a complex structure where you have a lot of data points and a lot of interactions — if you don’t have enough memory — the simulation program will try to store the data on the hard drive and then swap it back and forth, which will either crash the program or make it tremendously slower,” Alves-Foss said.

Marty Ytreberg, associate professor of physics, studies intrinsically disordered proteins found in the human body that change shape and are very flexible. The Big-STEM computer helps him identify similarities among the proteins.

These complex proteins are found in more than 50 percent of people with cancer, Ytreberg said. If researchers understand these proteins, they could develop drugs that target them.

“There is no such thing as a drug that targets intrinsically disordered proteins,” he said. “It’s not something that has ever been done and we think this is a way to do that.”

If Ytreberg and his research team tried to research complex proteins on a normal computer, the system would crash or become extremely slow without the memory needed to store their data, he said.

Many other researchers are using the Big-STEM computer to conduct their research. For example, a mathematician is developing new algorithms for modeling down to the molecular level and testing them to see which is most efficient, Alves-Foss said.

Water resource researchers are modeling river flow to mitigate the impacts of dams, and can now model larger sections of a river at higher resolution. Professors are also working with Micron to study how microprocessor chips handle heat dissipation, with the goal of making the chips last longer, he said.

“The more precise you can get those models, the more accurate it will be, but you need more data points,” Alves-Foss said. “More data points mean more memory.”

Emily Aizawa can be reached at [email protected] 
