Seattle-based Cray has massive new data cruncher
Seattle Times business reporter
For a company whose supercomputers can simulate the origin of the universe, making another big leap in computing technology takes time.
But after 12 years of development, Cray says it's introducing a new type of supercomputer that analyzes vast amounts of data simultaneously.
Cray's first customer is Pacific Northwest National Laboratory (PNNL), which will put its Cray XMT machine to work studying how the bacterium Shewanella processes radioactive materials like uranium. The research could help the U.S. Department of Energy's cleanup of the contaminated Hanford nuclear site.
Seattle-based Cray is known for creating supercomputers that pack a lot of computational power and allow complex simulations useful for scientific research. This time, it is introducing technology that takes it in a new direction with more commercial applications, in fields such as business intelligence, bioinformatics and power-grid analysis.
What's unique about the Cray XMT system is a potentially huge shared memory that gives it the ability to understand and pull knowledge out of large unrelated data sets, said Jan Silverman, senior vice president of corporate strategy and business development.
While a home laptop can hold about 50,000 text Web pages in its main memory, a large Cray XMT system can hold 50 million. That would allow a program to mine the information in those pages to quickly find complex relationships in the data.
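A rough back-of-envelope check puts those figures in perspective. Assuming an average text Web page of about 40 KB (an illustrative figure, not one given by Cray), the comparison works out to roughly 2 GB of pages on a laptop versus about 2 TB on a large XMT:

```python
# Back-of-envelope check of the article's memory figures,
# assuming an average text Web page of ~40 KB (an assumed
# size for illustration, not a number from Cray).
PAGE_KB = 40

laptop_pages = 50_000
xmt_pages = 50_000_000

laptop_gb = laptop_pages * PAGE_KB / 1_000_000    # KB -> GB
xmt_tb = xmt_pages * PAGE_KB / 1_000_000_000      # KB -> TB

print(f"Laptop: ~{laptop_gb:.0f} GB of pages in main memory")
print(f"Large XMT: ~{xmt_tb:.0f} TB of pages in main memory")
```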
The company calls the Cray XMT, which has a base price of about $1 million, "the most massively multithreaded supercomputer in existence."
Most machines today run only one or two threads on a single processor; each Cray XMT processor handles 128. A multithreaded machine sends out tens of thousands of threads all looking for data at the same time, with some misses and some hits, Silverman said.
It's the difference between "sending out a scout and waiting for him to come back and say no, that's not the right information; go out again ... or sending out tens of thousands of scouts and they all come back at once," Silverman said.
That allows the computer to recognize, identify, classify and understand patterns to predict behavior, he said.
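Silverman's "scouts" picture can be sketched in miniature with an ordinary thread pool: many workers probe separate slices of the data at once instead of one slice at a time. This is an illustrative toy, not the XMT's hardware threading over shared memory, and the shard data and function names are invented:

```python
# Toy sketch of the "many scouts" idea: dispatch a worker into
# every shard of data simultaneously, then gather whichever
# scouts came back with hits. Invented data; illustration only.
from concurrent.futures import ThreadPoolExecutor

# Hypothetical shards of text pages.
shards = [
    ["uranium cleanup", "power grid"],
    ["shewanella bacterium", "hanford site"],
    ["seattle weather"],
]

def scout(shard, term):
    """Report every page in this shard mentioning the term."""
    return [page for page in shard if term in page]

def search_all(term):
    # Every scout goes out at once and they "all come back"
    # together, some with hits and some empty-handed.
    with ThreadPoolExecutor(max_workers=len(shards)) as pool:
        results = pool.map(scout, shards, [term] * len(shards))
    return [hit for hits in results for hit in hits]

print(search_all("uranium"))   # -> ['uranium cleanup']
```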
Exciting maybe, but "this is not a monster opportunity," Silverman said. The technology is "on the cutting edge of where computer science is," he said, adding "I'm not expecting the market to flood."
Cray has had a string of tough quarters recently, partly a result of slow or deferred sales.
The new focus on data analysis has practical applications, including national security. The technology could allow governments to analyze data from watch lists, cross-border travel, bank accounts, Internet downloads, blogs and other sources in real time to predict behavior patterns, Silverman said.
PNNL, a division of the Department of Energy, has purchased an XMT system for its data-intensive computing initiative, an internal research program.
Deborah Gracio, PNNL's division director for computational and statistical analytics, said researchers in Seattle and Richland will use the system to demonstrate how it could help understand and predict the behavior of networks, whether they are biological, in cyberspace or in the power grid.
PNNL and Washington State University, for example, plan to use the computer in their research into the functions of proteins in a system. Their microbial cell project studies the potential of Shewanella, a type of marine bacterium, as a biological solution to sites contaminated during the manufacture of nuclear weapons.
Because Shewanella can convert uranium from a water-soluble to an insoluble form, it could hold promise for cleaning up contaminated groundwater.
"We have to build up a network to look at how proteins communicate together," Gracio said. "This architecture is uniquely suited to the problem."
Another application is to analyze the Western electrical power grid, a network of some 14,000 nodes.
Each node of the grid can be linked to a single thread in the supercomputer. This allows researchers to see how each node contributes to the overall state of the grid and to predict different scenarios, Gracio said.
They'll be able to see, for example, the effect of a power outage in one substation on another substation or the impact of increased consumption from a new manufacturing line.
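The node-per-thread mapping Gracio describes can be sketched as one worker per grid node, each recomputing its own node's state in parallel. The grid model, substation names, and thresholds below are invented for illustration:

```python
# Toy sketch of the node-per-thread idea: each grid node gets its
# own worker, and all nodes are evaluated in parallel. The grid
# data and the 100 MW overload threshold are assumptions made
# for illustration, not PNNL's actual model.
from concurrent.futures import ThreadPoolExecutor

# Hypothetical substations with local load in megawatts.
grid = {"substation_a": 120.0, "substation_b": 80.0, "substation_c": 45.0}

def node_state(item):
    name, load = item
    # Each worker evaluates one node's contribution to the grid.
    return name, {"load_mw": load, "overloaded": load > 100.0}

def evaluate_grid(grid):
    with ThreadPoolExecutor(max_workers=len(grid)) as pool:
        return dict(pool.map(node_state, grid.items()))

state = evaluate_grid(grid)
print(state["substation_a"]["overloaded"])   # -> True
```

In this toy model, raising one substation's load (a new manufacturing line, say) and re-running `evaluate_grid` shows how a local change surfaces in the overall grid state.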
"What the big leap is here is the ability to parse a problem into tiny pieces," said Daniel Chavarria, PNNL's senior research scientist in high-performance computing, "and run all the activities in parallel."
Kristi Heim: 206-464-2718 or email@example.com
Copyright © 2007 The Seattle Times Company