
New Record For World's Most Powerful Computer

Skeptical Greg

IBM Claims Fastest-Computer Title

What is really exciting is that they plan to increase the power by a factor of 10 as early as next April!
IBM is scheduled to complete delivery of a larger version of Blue Gene to the Energy Department's Lawrence Livermore Laboratories in California in April. That supercomputer comprises 64 racks containing 130,000 processors and is designed to run at 360 teraflops, 10 times faster than the system that IBM says set the new record.
What do some of you science guys see as benefit of this type of computing power?
 
Diogenes said:
What do some of you science guys see as benefit of this type of computing power?

Be able to play Championship Manager 6?

Protein folding would be the main use in anything close to the areas I'm studying.
 
What do some of you science guys see as benefit of this type of computing power?

Adding another few hours to how far out we can predict the weather.

I'm only being a little facetious. NOAA (the US weather predicting agency) is a big consumer of supercomputer resources, as weather prediction involves dividing the atmosphere up into little cells for the purpose of calculation. The finer the grid, the better.

I heard that a fundamental limit on weather prediction is that they use only a 2-D grid, but the important atmospheric effects are actually 3-D. Maybe given enough CPU they can add that third dimension, which might actually give a huge leap in prediction capability. Predicting tornados for instance.
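
As a rough sketch of why finer grids eat compute so quickly (this assumes a simple explicit scheme where dividing the grid spacing also forces the time step down by about the same factor, which is just an illustrative assumption):

```python
# Rough cost scaling for refining a 3-D atmospheric grid (illustration only).
# Assumes an explicit scheme where dividing the grid spacing by r in each
# dimension also shrinks the usable time step by roughly the same factor.

def relative_cost(r):
    """Cost multiplier when every grid spacing is divided by r."""
    cells = r ** 3      # 3-D: number of cells grows with the cube
    steps = r           # smaller cells need proportionally smaller time steps
    return cells * steps

for r in (2, 4, 8):
    print(f"{r}x finer grid -> ~{relative_cost(r):,}x the compute")
# 2x finer -> ~16x, 4x finer -> ~256x, 8x finer -> ~4,096x the compute
```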

Aircraft flight simulation is another computation hog. I'm not talking about flight simulator games, but doing simulated wind tunnel experiments. Simulations are already so good that a lot of design and testing can be done without ever using a real wind tunnel.

What else? Hmmm... there are a number of math problems, especially in number theory, that are limited by computing power. Counting of primes for instance. We know how many primes there are up to about 10^22 or so, but that's it. Searching for large primes is another. Crypto stuff: code design, code breaking.
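
For a feel of the prime-counting side, here's the brute-force sieve approach; to be clear, the published counts near 10^22 come from far cleverer combinatorial and analytic methods, this is only the naive idea:

```python
# Brute-force prime counting with a sieve of Eratosthenes.
# Fine up to ~10^8 on a desktop; nowhere near how counts at 10^22 are done.

def count_primes(n):
    """Return pi(n), the number of primes <= n."""
    if n < 2:
        return 0
    sieve = bytearray([1]) * (n + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(sieve[p * p::p]))
    return sum(sieve)

print(count_primes(10 ** 6))  # 78498
```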
 
Re: Re: New Record For World's Most Powerful Computer

geni said:
Protein folding would be the main use in anything close to the areas I'm studying.

Ah yes, good one. That enables us to design and test drugs entirely within the computer, right?
 
Re: Re: Re: New Record For World's Most Powerful Computer

rppa said:
Ah yes, good one. That enables us to design and test drugs entirely within the computer, right?

Any of this kind of modelling, in fact. In my field, atomistic models of radiation damage in metals. We can currently model about 1,000,000 atoms at a time, which sounds like a lot but is only a cube 100 atoms per side - nowhere near big enough to encompass some of the effects.
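
A quick back-of-the-envelope on that million-atom figure, using the 100-atoms-per-side number from the post and an assumed ~72 bytes of bare state per atom (real MD codes carry much more):

```python
# Back-of-the-envelope for the million-atom figure above. The 72 bytes/atom
# (position, velocity, force as three 8-byte doubles each) is an assumed bare
# minimum; real MD codes also store neighbour lists, species data, etc.

atoms_per_side = 100
n_atoms = atoms_per_side ** 3            # 1,000,000 atoms in a 100^3 cube
bytes_per_atom = 3 * 3 * 8               # x, v, f vectors of doubles
print(f"{n_atoms:,} atoms, ~{n_atoms * bytes_per_atom / 1e6:.0f} MB of bare state")

# Doubling the cube's edge to 200 atoms per side needs 8x the atoms.
print(f"{(2 * atoms_per_side) ** 3:,} atoms at 200 per side")
```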
 
Re: Re: Re: New Record For World's Most Powerful Computer

rppa said:
Ah yes, good one. That enables us to design and test drugs entirely within the computer, right?

No. You can design them to a degree like that, but you run into problems with testing them (predicting all the metabolites is near impossible).
 
Keep in mind that the IBM computer listed is simply one small part of the full complex that will be built. The total computer is supposed to be 360 TFlops :jaw:
 
Re: Re: Re: New Record For World's Most Powerful Computer

rppa said:
Ah yes, good one. That enables us to design and test drugs entirely within the computer, right?
The problems associated with solving protein folding are not simply due to lack of processing power - although that does currently limit us to modelling the dynamics with Newtonian physics, and our force fields are limited (there's no way to model reactions with these schemes, for example).

The more rigorous quantum mechanical approaches are not only slow, but scale hideously with size, so modelling anything more than a small peptide is well beyond our computational means. Merely increasing computational power by a factor of ten isn't going to help noticeably.

There are other problems, too. Assuming we have assembled sufficient computational power to run a long molecular dynamics simulation (using Newtonian mechanics), it's very difficult to parallelise these schemes efficiently.

More fundamentally, even the fastest protein takes 10 microseconds to fold, and most proteins take longer. Our current methods for integrating the equations of motion introduce small errors. That is fine over the time scales normally used in protein simulation (nanoseconds, i.e. a thousand times shorter than even the fastest protein folds), but the errors would propagate out of control if a run went on for much longer.

What we need, along with radically faster computers, are radically improved methodologies.
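
For anyone curious what "integrating the equations of motion" looks like in practice, here's a minimal velocity-Verlet step for a single particle; the harmonic force and time step are just placeholders, not anything from a real force field:

```python
import numpy as np

# Minimal velocity-Verlet step, the kind of Newtonian integrator used in
# molecular dynamics. Each step carries a small local error, which is why
# nanosecond runs are fine but microsecond-scale folding runs are hard to trust.

def force(x):
    return -x                      # placeholder: harmonic spring, k = m = 1

def verlet_step(x, v, dt):
    a = force(x)
    x_new = x + v * dt + 0.5 * a * dt ** 2
    a_new = force(x_new)
    v_new = v + 0.5 * (a + a_new) * dt
    return x_new, v_new

x, v = np.array([1.0]), np.array([0.0])
for _ in range(1000):
    x, v = verlet_step(x, v, dt=0.01)
print(x, v)                        # stays close to the exact cos(10), -sin(10)
```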
 
Re: Re: New Record For World's Most Powerful Computer

rppa said:
What do some of you science guys see as benefit of this type of computing power?

Adding another few hours to how far out we can predict the weather.

I'm only being a little facetious. NOAA (the US weather predicting agency) is a big consumer of supercomputer resources, as weather prediction involves dividing the atmosphere up into little cells for the purpose of calculation. The finer the grid, the better.

I heard that a fundamental limit on weather prediction is that they use only a 2-D grid, but the important atmospheric effects are actually 3-D. Maybe given enough CPU they can add that third dimension, which might actually give a huge leap in prediction capability. Predicting tornados for instance.

I used to do this back in the day when computing power was nothing like what it is today, and we used 3-D grids all over the place. We even presented an immersive visualization of the tornadic case of a thunderstorm at SIGGRAPH '94. Grids were fairly coarse then--a global simulation might have 30 layers--but things can't be any worse than they were then.

However, at least then, there was strong reluctance to use 3-D oceanic transport models, especially when studying weather. Usually it's just a 2-D transport model that serves as a boundary condition for the 3-D atmospheric model.

Incidentally, whether fine grids are better is still an open question, I believe, due to the chaotic nature of weather. I've seen some models where, if the grid density is quadrupled, you get better local resolution, but overall prediction is worse. Also, every global model I saw predicted that the Sahara would have enormous rainfall. However, I've been out of that business for many years, so perhaps things are better.

Aircraft flight simulation is another computation hog. I'm not talking about flight simulator games, but doing simulated wind tunnel experiments. Simulations are already so good that a lot of design and testing can be done without ever using a real wind tunnel.

This is an area where 2-D grids are still fairly common, because many interesting surfaces (such as airfoils) don't change much along one dimension.
 
Another use of big computers is lattice QCD, where you calculate a property of a strongly interacting particle, like a proton or a neutron, in an attempt to show that QCD actually works.

There are no good calculation methods to get exact results from the quantum theory of the strong nuclear force, so massive number crunching is used instead. To calculate an amplitude for a particular event, you can add up all the different ways the event can happen, which includes all the different paths a particle can take to or from the event. To approximate this on your supercomputer, you limit your paths to those that hop from grid point to grid point on a 3D lattice of points.

An example would be the exchange of a gluon between two quarks in a proton. You calculate the momentum transferred by the gluon over all possible paths it can take to find the force between the two quarks. This tells you the binding energy, and therefore (thanks to Einstein) the mass of the particle.

Even on a 10x10x10 lattice, there are a hell of a lot of possible paths...
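
To make that concrete, here's a toy count of nearest-neighbour walks on a 10x10x10 lattice; it's not how lattice QCD is actually evaluated (real codes integrate over gauge-field configurations), just a feel for how fast "all possible paths" blows up:

```python
from functools import lru_cache

# Count nearest-neighbour walks of length L that stay inside a 10x10x10
# lattice, starting from the centre. Purely illustrative of the blow-up.

SIZE = 10
STEPS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

@lru_cache(maxsize=None)
def walks(x, y, z, length):
    if not (0 <= x < SIZE and 0 <= y < SIZE and 0 <= z < SIZE):
        return 0
    if length == 0:
        return 1
    return sum(walks(x + dx, y + dy, z + dz, length - 1) for dx, dy, dz in STEPS)

for L in (5, 10, 20):
    print(L, walks(5, 5, 5, L))    # grows roughly like 6**L
```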
 
Yeah, but I think what we really want to know is how many FPS will I get if I run Quake 3 on this thing?
 
StaticEngine said:
Yeah, but I think what we really want to know is how many FPS will I get if I run Quake 3 on this thing?
According to a randomly selected link from a Google search, a P4 2.2 GHz Northwood peaks at about 800 MFlops. 360 TFlops would be 450,000 times faster. Of course, that's ignoring the video card, and floating point is mostly useful for geometry transformations and triangle setup, not actual texture blitting, but with that much perf you could probably ray-trace a Quake level at 60 fps and do much nicer lighting effects in real time...

Edit to add: actually, if you assume 1280x1024 at 60 fps, 360 TFlops gives you over 4.5 million floating point operations per pixel per frame. The aforementioned P4 could render fewer than 180 such pixels per second, so yes, I think you really could ray-trace in real time with one of these babies.
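
Redoing that arithmetic explicitly (the 800 MFlops number is the rough figure from the Google hit, not a proper benchmark):

```python
# Back-of-the-envelope from the post above.
blue_gene_flops = 360e12          # 360 TFlops (planned full system)
p4_flops = 800e6                  # ~800 MFlops, per the Google hit

print(blue_gene_flops / p4_flops)                 # 450,000x faster

pixels = 1280 * 1024
fps = 60
flops_per_pixel = blue_gene_flops / fps / pixels
print(flops_per_pixel)                            # ~4.6 million per pixel per frame

print(p4_flops / flops_per_pixel)                 # ~175 such pixels per second on the P4
```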
 
Zombified said:
I think you really could raytrace in real time with one of these babies.
That, combined with some VR gear, could create a nice experience.
 
If Moore's Law continues unabated with no limits (yeah, I wish), this level of computing power would be available off-the-shelf in around 37 years, and probably as a gaming console a couple of years after that.
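
Sanity-checking that guess, assuming "Moore's Law" here means a doubling of off-the-shelf performance every two years:

```python
import math

# ~450,000x speedup needed (Blue Gene vs. a desktop P4, per the earlier post).
speedup_needed = 450_000
doublings = math.log2(speedup_needed)            # ~18.8 doublings
print(doublings * 2, "years at one doubling per two years")   # ~37.5 years
```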

Man, old-folks homes in the future are gonna be hella LAN parties... :D
 
Possibly by election time it can figure out a voter's roll for Florida?
 
