• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you do still see problems, let me know.

Breakthrough in semiconductor technology

Wow. Real-time, hyper-realistic, 3-D virtual reality video is just around the corner. There goes the porn industry...

Durn, I'm getting cynical in my old age.
 
I have newsfeeds from a number of "tech" sites, and there are several avenues that are being approached right now that may hit the market before 2010.

Several different "quantum computing" ideas, the above-mentioned technology; our 3-gig processors may be looking pretty sorry in a few years.
 
Don't all processors look pretty sorry in a few years?

Apart from specialized applications (and I know I sound like the guy who said no computer would need more than 256kb of RAM), what would be the use of such super-computers? OK, the VR video, but that only needs an order of magnitude over the present stuff.

I think there might be a "utility ceiling" out there, somewhere. For instance, I still keep a pretty sorry 400 MHz machine, which serves perfectly well for office work, internet browsing, and simpler games.

Hans
 
Never fear, Hans. No matter how powerful computers become, Microsoft's code bloat will expand to chew up the additional capacity. Next version after Vista will most likely require these processors and will still open as slowly as XP.

Linux forever! :)
 
Don't all processors look pretty sorry in a few years?

Apart from specialized applications (and I know I sound like the guy who said no computer would need more than 256kb of RAM), what would be the use of such super-computers? OK, the VR video, but that only needs an order of magnitude over the present stuff.

Off the top of my head:

Games,
AI,
*Interactive real-time video/audio processing,
Data crunching,
Windows Vista.

*This one is where it's at, not just generating plausible images but inferring a model of what's going on/what the user is doing from video footage and responding accordingly.

ETA: Damn it, Genesius nipped in there before me with the Vista joke. Clearly I'm suffering from Windows lag.
 
Apart from specialized applications (and I know I sound like the guy who said no computer would need more than 256kb of RAM), what would be the use of such super-computers? OK, the VR video, but that only needs an order of magnitude over the present stuff.

Games and badly written (e.g. Micro$haft) code.

Games have been driving computer development and sales for a long time; the "holy grail" is a machine that's fast enough to handle real-time photorealistic animation. Until we have that, game designers will always be pushing the hardware envelope for the immersive RPG and FPS-style games.

Also, when cycles are cheap enough, you stop worrying about them. Why bother to use a fast sorting algorithm when you can code up bubble sort and still have the program give "acceptable" responsiveness? Overall code efficiency, in both space and time, has dropped dramatically over the past twenty years precisely because there's little incentive to make code fast or compact any more.
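To put a rough number on the bubble-sort point, here's a minimal sketch (Python, with an arbitrary input size) timing a naive bubble sort against the language's built-in O(n log n) sort on the same data:

```python
import random
import time

def bubble_sort(items):
    """O(n^2) bubble sort: repeatedly swap adjacent out-of-order pairs."""
    items = list(items)
    n = len(items)
    for i in range(n):
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items

data = [random.randint(0, 10_000) for _ in range(1_000)]

t0 = time.perf_counter()
slow = bubble_sort(data)
t_bubble = time.perf_counter() - t0

t0 = time.perf_counter()
fast = sorted(data)  # built-in O(n log n) sort
t_builtin = time.perf_counter() - t0

assert slow == fast  # same answer, wildly different cost
print(f"bubble: {t_bubble:.4f}s  built-in: {t_builtin:.4f}s")
```

Both produce identical output; on a fast enough machine even the quadratic version feels "acceptable" at this size, which is exactly how the sloppiness creeps in.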
 
*Interactive real-time video/audio processing
Data crunching,

*This one is where it's at, not just generating plausible images but inferring a model of what's going on/what the user is doing from video footage and responding accordingly.

I'd almost suggest the second one is where it's at. A few years ago, I was impressed when my company brought in half a terabyte of disk storage. Now, real-time data gathering and remote sensing systems can generate constant streams of data. We could reach a point where extremely massive datasets are being generated daily.

In my opinion, things like SETI@home have only been the first forays into widely distributed computing. Push a 10 GHz multi-core processor onto every human being's desktop and dedicate 30%, say, to widespread distributed processing and... well... it's hard to imagine the sort of discoveries that could be made.
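The core trick behind projects like SETI@home is just splitting a big job into independent work units and merging the partial results. A minimal sketch of that idea (Python; the prime-counting "science" is a stand-in, and all the function names here are invented for illustration):

```python
# Sketch: SETI@home-style division of a big job into independent
# work units that could each be farmed out to an idle desktop.
def make_work_units(start, stop, unit_size):
    """Split the range [start, stop) into independent chunks."""
    return [(lo, min(lo + unit_size, stop))
            for lo in range(start, stop, unit_size)]

def process_unit(unit):
    """Stand-in for the real science: count primes in this chunk."""
    lo, hi = unit

    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))

    return sum(1 for n in range(lo, hi) if is_prime(n))

def merge(partials):
    """Combine the partial results sent back by volunteers' machines."""
    return sum(partials)

units = make_work_units(0, 10_000, 1_000)
total = merge(process_unit(u) for u in units)
print(total)  # → 1229, the number of primes below 10,000
```

Because each unit is independent, the `process_unit` calls could run on thousands of machines at once with no coordination beyond handing out units and collecting answers.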

Too bad commercial processors seem to be stuck in the 3-4 GHz range. The solution has been to increase core counts and densities. One thing I haven't seen about this processor is any info on transistor densities or the circuits implemented.

ETA: drkitten, I've been wondering about code efficiency. Are we better off having a select cadre of engineers coding tight assembly code, or millions of users coding new applications in Visual Basic? I don't know the answer. Even embedded systems are starting to be written using high-level languages, so I can envision a day when people won't care much about code efficiency on embedded systems.
 
Apart from specialized applications (and I know I sound like the guy who said no computer would need more than 256kb of RAM), what would be the use of such super-computers? OK, the VR video, but that only needs an order of magnitude over the present stuff.

AIs in games, allowing more units in real-time strategy games. Real-time raytracing. Running Championship Manager 7.
 
In my opinion, things like SETI@home have only been the first forays into widely distributed computing. Push a 10 GHz multi-core processor onto every human being's desktop and dedicate 30%, say, to widespread distributed processing and... well... it's hard to imagine the sort of discoveries that could be made.
Now that would be cool. :D
 
In my opinion, things like SETI@home have only been the first forays into widely distributed computing. Push a 10 GHz multi-core processor onto every human being's desktop and dedicate 30%, say, to widespread distributed processing and... well... it's hard to imagine the sort of discoveries that could be made.

Protein folding.
 
Hans said:
Apart from specialized applications (and I know I sound like the guy who said no computer would need more than 256kb of RAM), what would be the use of such super-computers? OK, the VR video, but that only needs an order of magnitude over the present stuff.
Forget all these exotic applications. I just finished producing a 1,000-page book in TeX. Portions of the book were generated on the fly with scripts. I had to reprocess the book hundreds of times to get all the details correct. I could easily have used an order of magnitude more processing power to make my life pleasant.

~~ Paul
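Reprocessing a TeX document until it stabilises is exactly what tools like latexmk automate: run LaTeX, scan the .log for the "rerun" warnings it emits, and go again until the log is clean. A minimal sketch of that loop (Python; `build` and `run_pass` are invented names, and the pattern list is not exhaustive):

```python
import re

# Warnings LaTeX prints when another pass is needed; the
# "Label(s) may have changed" message is the classic one.
RERUN_PATTERNS = [
    r"Rerun to get cross-references right",
    r"Label\(s\) may have changed",
]

def needs_rerun(log_text):
    """True if the .log from the last pass asks for another run."""
    return any(re.search(p, log_text) for p in RERUN_PATTERNS)

def build(run_pass, max_passes=10):
    """Call run_pass() (which runs TeX and returns the log text)
    until the log stops asking for a rerun; return the pass count."""
    for i in range(1, max_passes + 1):
        log = run_pass()
        if not needs_rerun(log):
            return i
    return max_passes

# Demo with a fake run_pass: first pass reports changed labels,
# second pass is clean.
logs = iter([
    "LaTeX Warning: Label(s) may have changed. "
    "Rerun to get cross-references right.",
    "Output written on book.pdf (1000 pages).",
])
print(build(lambda: next(logs)))  # → 2
```

It doesn't make any single pass faster, of course, but it at least takes the human out of the hundreds-of-reruns loop.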
 
High volume business process computing: faster CPU = £millions hardware savings (capital, operations and maintenance) not to mention improved opportunities for automation and innovation.

Computers aren't just used for internet browsing, video games and esoteric scientific purposes! They are the engines of international commerce.
 
Forget all these exotic applications. I just finished producing a 1,000-page book in TeX. Portions of the book were generated on the fly with scripts. I had to reprocess the book hundreds of times to get all the details correct. I could easily have used an order of magnitude more processing power to make my life pleasant.

~~ Paul
.... And the other comments to my post about utility ceiling.

I did not say we're there now; I said within the next order of magnitude from now. That will probably take care of most of the things you mention. As for data crunching: pure data crunching is already becoming close to "fast enough" - just look at what a spreadsheet can do these days.

There will always be special applications that can never have enough power, but they are not enough to drive development.

But.... we'll see. People may dream up new power-hogging applications as power becomes available.

Hans
 
People may dream up new power-hogging applications, as power becomes available.
Yeah. I have an old computer mag from the 1980s with an article about the upcoming CD-ROM storage technology and how it would fit in with home computer requirements. The conclusion was that it was a stupid amount of storage that no home user could ever need. We can see now (and might have suspected then) that this was a serious failure of imagination, but twenty years from now we might be saying the same thing about 350 GHz PCs.
 
I think we're pretty darn close now. I think we're there for the average home user (note: average home users don't need a $500 video card for playing games.)

I think we're there for the average office user.

Not quite there yet in my office where 3d studio max, CADD and 4 GB raster images are pretty common.

I'd rather see network speeds increase than computer processing at the moment, especially WAN.
 
.... And the other comments to my post about utility ceiling.

I did not say we're there now; I said within the next order of magnitude from now. That will probably take care of most of the things you mention. As for data crunching: pure data crunching is already becoming close to "fast enough" - just look at what a spreadsheet can do these days.

What a spreadsheet can do is hardly the point. That's a bit like a car engine designer saying that there is a utility ceiling on engine power due to there being a practical limitation on road speed, and failing to consider the power requirements of ocean liners.

There will always be special applications that can never have enough power, but they are not enough to drive development.

For the foreseeable future they are certainly enough (and they are not so special). It is not just technical capability that is the driver but the cost of implementing / deploying / using the technical capability.

But.... we'll see. People may dream up new power-hogging applications as power becomes available.

Of course - better tools allow for greater innovation within a given budget. You can call it power-hogging if you like, but that's missing the point.
 
