
Computers in 2020

Very likely, but what I think will have a bigger impact is augmented reality (someone already mentioned it here). And real-time augmented reality DOES require more computing power than most home users have today; or perhaps it needs about as much computing power, but in a much smaller package.
Yeah, I agree. I guess what I really meant to say is that at some point the hardware is going to be so small, cheap, powerful, and saturated into society that we're really not going to notice it much anymore. All we'll care about is the applications. I can picture a time when new software requiring more horsepower than most consumers have will simply come with the hardware it needs.

The next big wave of changes in computers has already begun, with solid-state data storage. Compared to magnetic and optical drives, they're not only smaller and lighter, but also less energy-demanding. Open up the case of a standard desktop computer and look at what takes up most of the space: drives, a power supply unit, and cables to and from those parts.

The remainder is mostly the motherboard, CPU, CPU cooler, and RAM, but various handheld gizmos have been using smaller, simpler versions of those (minus the cooler, which isn't needed when things are that tiny) for quite a while now, as long as the programming is simple enough to be done on that more limited hardware. So magnetic and optical disk drives, and the power supply units to feed them, were the only things left that justified a desktop computer's bulk. Replace them with flash memory, and accept not getting the very fastest performance that can possibly be achieved this month, and you have no reason for the computer (minus keyboard and monitor, at least) to be significantly larger than a PDA or OQO.

So the big separate "tower" part of computers will stop being made and will gradually disappear as the old ones get replaced with tiny computers like the OQO, bought not just for portability but also because, even if all you do is park one at home as the home computer, you can put it practically anywhere and hide it in an unobtrusive little corner, instead of having to arrange rooms around it and buy special furniture for it. The most space-consuming parts will then be the keyboard and monitor (which might themselves be good things behind which to "hide" the computer).
Exactly. With the stuff being worked on in labs now, we should soon be seeing true "system on a chip" technology. There won't be any need for the CPU, RAM, video, sound, networking, and storage to be separate; it will all come as a single, cheap, microscopic component.
 
I'll guess just the opposite. I'll guess the average computer will use 10 times the resources of today's computers in terms of RAM, hard drive space, etc., and still take forever to reboot. The average OS will probably be 10 times more bloated than XP, Vista and Windows 7. I also suspect that Linux (from Google) will be just as popular as Windows.

Bloat is not a requirement; it's optional. My PC is far from cutting edge, yet it's certainly configured well. It does a full reboot in under 30 seconds, running XP.
 
CPUs: Moore's law will come to an end. In fact, it may even reverse as the emphasis shifts to saving heat and power. 10 years from now, your CPU may be slower than the one you have now.

HD Space: Will continue at relatively the same rate it has for the past 10 years. No huge slowdowns. Solid state hard drives will increase in size and begin to take over as the main market, replacing spinning drives.

OS: Will focus on code optimization, due to limits in CPU performance, the huge market for relatively "thin" clients, and an emphasis on power savings and lowering greenhouse emissions. Phones, netbooks, and other novel devices will take market share. For example, Snow Leopard is smaller than the OS it replaced.

Bandwidth: In the future, bandwidth will be more important than any other computer feature. With bandwidth, you don't need a fast CPU, lots of storage, or anything else, as all that will be outsourced to a server. Bandwidth will increase dramatically over the next 10 years and will be the one thing you care about more than any other computer benchmark.
 
CPUs: Moore's law will come to an end. In fact, it may even reverse as the emphasis shifts to saving heat and power. 10 years from now, your CPU may be slower than the one you have now.
In terms of clock speeds, this has already happened. Intel had a 3.8GHz Pentium 4 out in 2004. Today, they don't have anything that runs at that clock speed, but a top-of-the-line Core i7 uses less power and has around 8 times the instruction throughput.
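To put rough numbers on that "8 times", here's a quick sketch treating throughput as cores x clock x instructions-per-cycle. The IPC figures are illustrative assumptions, not measured values:

```python
# Rough throughput comparison: a single-core Pentium 4 vs a quad-core
# Core i7. The instructions-per-cycle (IPC) figures are illustrative
# assumptions, not measured values.

p4_clock_ghz = 3.8       # one core at a high clock
p4_ipc = 1.0             # assumed effective IPC

i7_clock_ghz = 3.0       # lower clock per core
i7_cores = 4
i7_ipc = 2.5             # assumed: wider core, better caches and prediction

p4_throughput = p4_clock_ghz * p4_ipc
i7_throughput = i7_clock_ghz * i7_cores * i7_ipc

print(f"Relative throughput: {i7_throughput / p4_throughput:.1f}x")
# ~7.9x, in the ballpark of the "around 8 times" figure above,
# despite the lower clock speed.
```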

AMD has just released a 6-core CPU that runs at 1.8GHz and uses just 40W. That paves the way for their 12-core rollout early next year. Intel will be releasing 8-core chips in the same timeframe.

HD Space: Will continue at relatively the same rate it has for the past 10 years. No huge slowdowns. Solid state hard drives will increase in size and begin to take over as the main market, replacing spinning drives.
Disks are expected to hit current known limits at 1-2TB per platter, say 5-10TB for a drive. I don't know of any research that indicates that we can move beyond that.

Solid state drives, on the other hand, will see at least a 20-fold improvement in capacity and capacity/dollar.
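For scale, a 20-fold improvement over a decade works out to roughly 35% compound growth per year. A quick check, using the 20x figure above as the target (it's a projection, not data):

```python
# Implied compound growth for a 20x improvement over 10 years.
# The 20x target is this post's own projection, not measured data.

target_improvement = 20
years = 10

annual_growth = target_improvement ** (1 / years)
print(f"{annual_growth - 1:.0%} per year")   # ~35% per year, compounded
```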

OS: Will focus on code optimization, due to limits in CPU performance, the huge market for relatively "thin" clients, and an emphasis on power savings and lowering greenhouse emissions. Phones, netbooks, and other novel devices will take market share. For example, Snow Leopard is smaller than the OS it replaced.
This is definitely something the OS-makers are aware of. The other big thing of course is large numbers of cores. I have a quad-core desktop, but Vista frequently stalls or hiccups because of slow I/O devices and internal resource contention. Windows 7 is supposed to be much better at this.

Bandwidth: In the future, bandwidth will be more important than any other computer feature. With bandwidth, you don't need a fast CPU, lots of storage, or anything else, as all that will be outsourced to a server. Bandwidth will increase dramatically over the next 10 years and will be the one thing you care about more than any other computer benchmark.
No. Ain't gonna happen.

For one thing, it's hard to roll out bandwidth to the end user. Takes a long, long time.

For another, the last thing service providers want is to have to take over the end-user processing. They'd need 10 or 100 times the hardware and bandwidth, which means 10 or 100 times the cost.
 
Oh, and for a third thing, latency. You can always buy more bandwidth, but you can't buy less latency at any price.
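To put numbers on why latency can't be bought away: the speed of light in fibre sets a hard floor. A minimal sketch, assuming light travels roughly 1.5x slower in glass than in vacuum and ignoring routing and queuing delays entirely:

```python
# The latency floor set by the speed of light. Distances are rough
# great-circle figures; the fibre factor of ~1.5 (light is slower in
# glass than in vacuum) is a standard approximation.

SPEED_OF_LIGHT_KM_S = 300_000
FIBRE_FACTOR = 1.5

def min_round_trip_ms(distance_km: float) -> float:
    """Best-case round trip over fibre, ignoring routing and queuing."""
    one_way_s = distance_km * FIBRE_FACTOR / SPEED_OF_LIGHT_KM_S
    return 2 * one_way_s * 1000

for route, km in [("New York - London", 5_600), ("Sydney - London", 17_000)]:
    print(f"{route}: at least {min_round_trip_ms(km):.0f} ms round trip")
# New York - London: ~56 ms; Sydney - London: ~170 ms, before any
# switching or queuing delay. No bandwidth upgrade changes this.
```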
 
I predict we will eventually jettison the von Neumann architecture and move to some sort of platform that blends processing and memory, so that things are closer together and much more parallel (like a human brain).

You people reading this in the far future can be amazed at my prescience. I was right, right?
 
We don't get amazed anymore. It's much too stressful. We have computers to do that for us.

(Whoops. I meant, "They won't get amazed ...")
 
All physical interaction will be outlawed (by the Christians, who have gained ultimate political power). Sexual, in utero reproduction will have been replaced by a complete in vitro process (run by robots)...

Instead, people will sit next to each other in a pub and 'Jack In' to the "interweb" (the name was officially changed in 2012) and hold the entire conversation (or have sexual intercourse) "Inline"...

This will all be done, of course, with nanotech that interacts directly with your brain, effectively creating a "PseudoReality" that's virtually indistinguishable from the real reality...

Eventually PseudoReal will be actually indistinguishable from real reality, and people will opt to "Check Out", hooked up to life support machinery and tended by robots, living their entire lives Inline...

And then the robots take over or something... I dunno - who cares... I'm living in a virtual reality paradise remember...
 
I don't know about the wasted horsepower. I routinely run an application that decodes about 30 separate digital amateur radio or shortwave streams at the same time. My VIC-20 wouldn't have been able to do that. Some of the computer uses we take for granted today wouldn't be possible with less than multi-gig hard drives and processors.

Of course, mine is a next-to-the-bottom-of-the-line Dell E310, but I use all the processing power on a regular basis. Plus, I have 320 GB of online storage and it is ALL full.

3.5 Terabytes. In my home computer... (About 80% full)

Anyone beat that?

:D
 
No, but I do have a 1.5 Terabyte drive that has 50 gigs left, and I consider the drive "full". I suppose ten years ago I wouldn't have thought of 50 gigs left as "not very much".
 
I have 4.5TB in my Nagi (my Vista box), 2.25TB each in Yurie and Haruhi (Linux boxes), 3TB RAID-5 in Pepper, and 1.9TB RAID-5 in Salt (two little fileserver appliances). Oh, and a 1.2TB external drive. And I have a new 1TB drive that arrived today.

:p
 
<snip>
For another, the last thing service providers want is to have to take over the end-user processing. They'd need 10 or 100 times the hardware and bandwidth, which means 10 or 100 times the cost.

I do not agree with this. Imagine a computer with heaps of CPUs and huge numbers of massive disks (or some other similar data storage device), with fast data links to heaps of people around the world. The end user only needs a monitor, modem, speakers and other I/O devices. It would be much cheaper and better to have it this way because:
1. No need to worry about backups. Done for you. No need to buy an external hard drive or DVDs for this purpose.
2. No need to worry about buying software. Bought at wholesale prices for you.
3. You use the computer only when you want it. At other times other people use it, since it is in a central location.
4. Viruses (and all other bad software) would be very hard to spread. The computer would have the best protection. It would also only have to worry about a virus coming into the system, not one passed from one internal user to another.
 
I do not agree with this. Imagine a computer with heaps of CPUs and huge numbers of massive disks (or some other similar data storage device), with fast data links to heaps of people around the world. The end user only needs a monitor, modem, speakers and other I/O devices. It would be much cheaper and better to have it this way because:
1. No need to worry about backups. Done for you. No need to buy an external hard drive or DVDs for this purpose.
And when there's a failure, everyone's service goes down at the same time.

2. No need to worry about buying software. Bought at wholesale prices for you.
Except that there's no way they'd be using standard off-the-shelf software. The hardware and maintenance costs would be horrifying. Which means that no-one would use it.

3. You use the computer only when you want it. At other times other people use it, since it is in a central location.
The problem with that is that peak time is peak time. Everyone wants to be online at 2PM, and no-one at 2AM. You can't transfer resources across timezones either, because the latency would make applications unusable.

4. Viruses (and all other bad software) would be very hard to spread. The computer would have the best protection. It would also only have to worry about a virus coming into the system, not one passed from one internal user to another.
And the virus only has to attack one target, instead of millions of different targets.

Now, each of these is a tradeoff, albeit with more downside than upside.

But there's one huge reason why this will not happen: the bandwidth requirements would be astronomical. If you're lucky, and have FiOS or cable or ADSL2+, you might have 20 Mbit/s of bandwidth coming into your home. A standard 20" widescreen monitor uses about 2.5 Gbit/s, roughly 125 times as much. And that's at your end. A datacenter built to handle just 20,000 simultaneous users would need 50 Tbit/s. No-one has 50 Tbit/s of bandwidth today, and no-one will have it by 2020 either.
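The arithmetic behind those figures, assuming a 1680x1050 panel at 24-bit colour and 60Hz refresh (the resolution is an assumption for a typical 20" widescreen of the time):

```python
# Uncompressed video bandwidth for a remote display. The panel
# resolution is an assumption for a typical 20" widescreen
# (1680x1050, 24-bit colour, 60Hz refresh).

width, height = 1680, 1050
bits_per_pixel = 24
refresh_hz = 60

per_user = width * height * bits_per_pixel * refresh_hz   # bits/second
print(f"Per user: {per_user / 1e9:.1f} Gbit/s")           # ~2.5 Gbit/s

home_link = 20e6                                          # 20 Mbit/s line
print(f"Shortfall: {per_user / home_link:.0f}x")          # ~127x ("roughly 125")

users = 20_000
print(f"Datacenter: {users * per_user / 1e12:.0f} Tbit/s")  # ~51 Tbit/s
```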

Storing your letters and photos and music online, yes. That's already happened.

Getting rid of the local CPU and video chips, not a chance.
 
And when there's a failure, everyone's service goes down at the same time.
Irrelevant and misleading. You can build the system so that if some hardware fails, the rest of it bypasses the failure and carries on. The biggest problem would be ensuring that humans notice the failure and fix it. Look up fault-tolerant systems.
Also, a computer at home sometimes fails due to a hardware fault, in which case you have no computer for several days. Move the computer to a central location and you will be able to use the system more reliably.
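A minimal sketch of why redundancy helps here: the combined availability of N replicas, assuming failures are independent (a big assumption in practice) and a per-node availability of 99% picked purely for illustration:

```python
# Combined availability of N replicated nodes, assuming failures are
# independent (a big assumption in practice). The 99% per-node figure
# is picked purely for illustration.

def combined_availability(single_node: float, replicas: int) -> float:
    """Probability that at least one of the replicas is up."""
    return 1 - (1 - single_node) ** replicas

for n in (1, 2, 3):
    print(f"{n} replica(s): {combined_availability(0.99, n):.4%} available")
# 1 replica:  99.0000%
# 2 replicas: 99.9900%
# 3 replicas: 99.9999%
```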
Except that there's no way they'd be using standard off-the-shelf software. The hardware and maintenance costs would be horrifying. Which means that no-one would use it.
No. Not much different to what many offices use now.
The problem with that is that peak time is peak time. Everyone wants to be online at 2PM, and no-one at 2AM. You can't transfer resources across timezones either, because the latency would make applications unusable.
So we are waiting on a decent broadband system before this can become reality. Then one computer system could serve many people around the world, so the peak hour would not be much busier than the quiet hour.

And the virus only has to attack one target, instead of millions of different targets.
No different than now. One virus can attack every Windows-based computer. Someone has to find and close the loophole in the code. The advantage is that only one human has to react, and then it is fixed for everyone.
 
As for how much bandwidth you will need, well, I know I can watch TV on my computer via ADSL. For example, go here http://www.abc.net.au/iview/ and you can watch ABC TV. I hope it works outside of Australia. So if you can watch TV via the Internet, you should be able to do anything else via the Internet.
 
The "everything will be thin-client" vs "everything will be thick-client" seems to ignore that there are different economies involved.

In the beginning there was no choice: you had one computer and that was it. Given how much cheap computation there is sitting about, I don't think we're going to see everything move back into the "cloud" again.

Centralisation provides a simpler abstraction for computation at the expense of making computation more expensive for each node. That ain't going to change, and no matter how much bandwidth one throws at the problem, it does not change the fact that local computation will be cheaper and faster.
 
As for how much bandwidth you will need, well, I know I can watch TV on my computer via ADSL. For example, go here http://www.abc.net.au/iview/ and you can watch ABC TV. I hope it works outside of Australia. So if you can watch TV via the Internet, you should be able to do anything else via the Internet.
Not really. TV works well mainly because it's not interactive, so you don't notice any latency it might have. For this reason it's a very bad example of the "almightiness" of broadband internet.
Online gaming, for example, is a different story, and that only works because there's a lot of prediction going on in the local clients, which is oftentimes quite noticeable.
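A minimal sketch of that client-side prediction idea, sometimes called dead reckoning; the RemotePlayer structure and all the numbers here are invented for illustration:

```python
# Minimal sketch of client-side prediction ("dead reckoning"): the
# client extrapolates a remote player's position from the last state
# it received, instead of waiting a full round trip for the next one.
# RemotePlayer and all the numbers here are invented for illustration.

from dataclasses import dataclass

@dataclass
class RemotePlayer:
    x: float            # last position received from the server
    vx: float           # last velocity received from the server
    last_update: float  # timestamp of that update, in seconds

def predicted_x(player: RemotePlayer, now: float) -> float:
    """Extrapolate position assuming the velocity hasn't changed."""
    return player.x + player.vx * (now - player.last_update)

p = RemotePlayer(x=10.0, vx=3.0, last_update=0.0)
print(predicted_x(p, now=0.1))   # 10.3, drawn immediately, no waiting
# When the next real update arrives the client corrects the position;
# if the player changed direction, that correction is the visible snap.
```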
 
CPUs: Moore's law will come to an end. In fact, it may even reverse as the emphasis shifts to saving heat and power. 10 years from now, your CPU may be slower than the one you have now.
I don't think so. It's probably true that one branch of CPUs will get ever smaller and ubiquitous computing will become mainstream.

On the other hand, people will still want to have the fastest possible technology for a lot of applications, be it gaming, video editing or whatever. It's highly unlikely that people will settle for a less powerful CPU than what we have now, if massively parallel processors are available that enable extremely high resolution and quality games and videos, or for example household robots and a lot of other stuff.
Nvidia's CUDA is already showing the direction today, with Intel's Larrabee following suit next year. This kind of massively parallel computation on affordable devices is just in its infancy and will make a lot of stuff feasible that just isn't possible today, because our current CPUs are so awfully slow :)
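How much all those extra cores actually buy depends on how much of a program can run in parallel; Amdahl's law gives a quick feel. The parallel fractions and core counts below are illustrative assumptions:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n) for a program whose
# fraction p can run in parallel on n cores. The fractions and core
# counts below are illustrative assumptions.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

for p in (0.90, 0.99):
    for n in (4, 240):   # a quad-core CPU vs a CUDA-class GPU
        print(f"p={p:.2f}, {n:3d} cores: {amdahl_speedup(p, n):5.1f}x")
# p=0.90:   4 cores ~3.1x, 240 cores  ~9.6x  (serial code dominates)
# p=0.99:   4 cores ~3.9x, 240 cores ~70.8x  (graphics/video workloads)
```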
 
