
Computers in 2020

...We still won't have reliable voice recognition, face recognition, auto translation, self-driving cars or biometrics. Google won't be Google anymore, Apple might no longer be cool among the digirati, but IBM will still be IBM.


As a software engineer, skeptic, and dilettante futurist, let me thank you for your set of highly plausible predictions. I don't think you'll prove to be very far off.

I was skeptical about voice recognition, especially after some recent ridiculous encounters with automated customer service lines. But I also recently acquired an iPhone, and find the voice control ("play album abbey road") to be increasingly handy, even if not always successful. Maybe 80% success is good enough.

Also, as a former AI researcher, I can confidently predict that in 2020, AI will be predicted to arrive in "another ten years". Just like always.
 
CPUs: Moore's law will come to an end. In fact, it may even reverse as the emphasis shifts to saving heat and power. 10 years from now, your CPU may be slower than the one you have now.

A slower clock speed does not necessarily equal less power overall. A faster CPU can wake up, do its work, and fall back asleep sooner. This may well use fewer Coulombs than a slower CPU taking longer -- especially if peripherals only need to be awake for that shorter time.

But yes, Intel have stepped back from their 3+GHz cores, to slower ones. However, the one in your cell-phone may well be moving in the other direction.
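
To make the race-to-idle argument above concrete, here is a minimal sketch in C++; all the clock and power figures are invented round numbers for illustration, not measurements of any real CPU.

```cpp
#include <cstdio>

// Race-to-idle sketch: compare a fast CPU that finishes quickly and sleeps
// against a slow CPU that runs longer. All power/time figures are invented
// round numbers for illustration only.
int main() {
    const double task_cycles = 2.0e9;      // work to do, in CPU cycles
    const double window_s    = 10.0;       // total time window considered

    // Hypothetical "fast" CPU: 2 GHz, 10 W active, 0.1 W asleep.
    const double fast_hz = 2.0e9, fast_active_w = 10.0, fast_idle_w = 0.1;
    // Hypothetical "slow" CPU: 500 MHz, 4 W active, 0.1 W asleep.
    const double slow_hz = 0.5e9, slow_active_w = 4.0, slow_idle_w = 0.1;

    auto energy = [&](double hz, double active_w, double idle_w) {
        double busy_s = task_cycles / hz;              // time spent working
        double idle_s = window_s - busy_s;             // time spent asleep
        return active_w * busy_s + idle_w * idle_s;    // Joules over the window
    };

    std::printf("fast CPU: %.1f J\n", energy(fast_hz, fast_active_w, fast_idle_w));
    std::printf("slow CPU: %.1f J\n", energy(slow_hz, slow_active_w, slow_idle_w));
    return 0;
}
```

With these made-up numbers the fast chip finishes in 1 second and burns about 10.9 J over the window, while the slow chip works for 4 seconds and burns about 16.6 J, despite its lower peak power.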
 
Irrelevant and misleading. You can build systems so that if some hardware fails, the rest of the system bypasses the failure and carries on. The biggest problem would be ensuring that humans notice the failure and fix it. Look up fault-tolerant systems.
I've run fault tolerant systems. They fail. They just fail in unexpected ways.

Also, a computer at home sometimes suffers a hardware failure, in which case you have no computer for several days.
True.

But move the computer to a central location and you will be able to use the system more reliably.
Not necessarily true, and not the point. The point is, it sucks, and there are no practical solutions.

No. Not much different to what many offices use now.
Systems like Citrix and Terminal Server and X Window? Quite different actually. All of those still require you to have a computer on your desk, not just a monitor.

Furthermore, this sort of architecture suffers from significant diseconomies of scale. A dual-processor system costs about four times as much as a single-processor machine, and a quad-processor system costs about four times as much again. (Talking sockets here, not cores.) Beyond four processors, things start to get expensive.

We have home computers because that's what works. We aren't going to have widespread multi-gigabit internet access within ten years, and even where we do, the user experience would be worse than simply keeping your computer.
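
Putting rough numbers on the socket-scaling claim above: if the price quadruples every time the socket count doubles, the cost per socket doubles at every step, which is the diseconomy in question. A tiny sketch, using an entirely hypothetical $1,000 single-socket baseline:

```cpp
#include <cstdio>

// Illustration of the "four times as much per doubling" claim with made-up
// prices: the total quadruples each time the socket count doubles, so the
// cost *per socket* doubles at every step.
int main() {
    double cost = 1000.0;                  // hypothetical single-socket box
    for (int sockets = 1; sockets <= 8; sockets *= 2) {
        std::printf("%d socket(s): $%8.0f total, $%7.0f per socket\n",
                    sockets, cost, cost / sockets);
        cost *= 4.0;                       // next doubling of sockets
    }
    return 0;
}
```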

So we are waiting on a decent broadband system before this can become reality.
2.5 Gbit/s is beyond simply "decent". And that's what's required just for a current low-end display. Displays ten years from now will probably require 10-20 Gbit/s.
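
For reference, here is the back-of-the-envelope arithmetic behind those figures, assuming uncompressed video; the specific resolutions are my assumptions, chosen to match the numbers quoted.

```cpp
#include <cstdio>

// Uncompressed display bandwidth: width * height * bits-per-pixel * refresh.
// The resolutions below are assumptions chosen to illustrate the figures
// quoted in the post, not anything the original poster specified.
static double gbit_per_s(int w, int h, int bpp, int hz) {
    return double(w) * h * bpp * hz / 1e9;
}

int main() {
    std::printf("1680x1050 @ 24bpp, 60 Hz: %.2f Gbit/s\n",
                gbit_per_s(1680, 1050, 24, 60));   // ~2.5 Gbit/s
    std::printf("3840x2160 @ 24bpp, 60 Hz: %.2f Gbit/s\n",
                gbit_per_s(3840, 2160, 24, 60));   // ~12 Gbit/s
    return 0;
}
```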

Then one computer system could serve many people around the world, so the peak hour would not be much busier than the quiet hour.
Not a chance. Have you ever actually used a remote desktop from the other side of the planet? I have. It's... Well, it's crap. And I was just using it for the simplest system admin tasks. For daily use, it would be a nightmare.

You cannot eliminate latency.
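
To put a number on that: even over an ideal great-circle fibre run, the speed of light sets a floor on round-trip time to the other side of the planet. A rough calculation, where the distance and fibre refractive index are standard approximations:

```cpp
#include <cstdio>

// Speed-of-light lower bound on round-trip latency to the far side of the
// planet. Distance and fibre refractive index are textbook approximations;
// real routes are longer and add switching/queueing delay on top.
int main() {
    const double distance_km = 20000.0;   // roughly half the Earth's circumference
    const double c_km_per_s  = 300000.0;  // speed of light in vacuum
    const double fibre_index = 1.5;       // light in glass travels at ~c/1.5

    double one_way_s  = distance_km / (c_km_per_s / fibre_index);
    double round_trip = 2.0 * one_way_s;

    std::printf("one-way (ideal fibre): %.0f ms\n", one_way_s * 1000.0);   // ~100 ms
    std::printf("round trip           : %.0f ms\n", round_trip * 1000.0);  // ~200 ms
    return 0;
}
```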

No different than now. One virus can attack every Windows based computer.
Assuming that the same flaw is in XP and Vista and 7 (possible), and that none of the dozen or so major anti-virus products offer any protection, and that you, the user, are careless, yes.

Remote virtualised systems offer a whole new vector of attack, though: Compromise the virtualisation platform, and every hosted system is wide open.
 
As for how much bandwidth you will need, well, I know I can watch TV on my computer via ADSL. For example, go here http://www.abc.net.au/iview/ and you can watch ABC TV. I hope it works outside of Australia. So if you can watch TV via the Internet, you should be able to do anything else via the Internet.
Nope.

You can watch TV over ADSL because the broadcaster throws a lot of CPU power at compressing the stream in advance, and can then send that one stream out to many people.

If you send your computer's desktop over the network that way, first, you have to independently compress every user's display, which would require more computing resources than the actual programs you are running, and second, you can't do that anyway because the compression quality would be unacceptable for many tasks. (Like reading text on a web forum. You don't want to try that on an MPEG stream.)
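
A toy comparison of why broadcast compression scales and per-desktop compression does not; the assumption that one real-time encode occupies roughly one CPU core is mine, purely for illustration.

```cpp
#include <cstdio>

// Broadcast vs per-desktop encoding. Assume (purely for illustration) that
// one real-time video encode of a desktop-sized stream occupies one CPU core.
int main() {
    const int users = 100000;

    // Broadcast TV: the stream is compressed once and sent to everyone.
    const int broadcast_encode_cores = 1;

    // Remote desktops: every user's screen is different, so every user
    // needs their own real-time encode.
    const int desktop_encode_cores = users * 1;

    std::printf("broadcast      : %d encoding core(s) for %d viewers\n",
                broadcast_encode_cores, users);
    std::printf("remote desktops: %d encoding cores for %d users\n",
                desktop_encode_cores, users);
    return 0;
}
```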
 
They're awfully fast - too fast, in fact: memory has a huge problem keeping up.
Of course, this is a known problem (and also known to CPU/GPU designers). There are already mechanisms in place for dealing with it (CUDA, for example, exposes shared memory that can be accessed by all threads in a block and is very fast). But the programmer also has to be aware of the capabilities of the hardware, and take care to minimize memory access (for example, by not using lookup tables that might have been fashionable earlier).

Still, CUDA implementations today are oftentimes 100x faster than CPU implementations; that's what I meant with "CPUs are awfully slow" (which of course depends on the viewpoint). No question, there's stuff you can't parallelize, but for a lot of problems it is possible, and those will gain tremendous speedups in the coming years (graphics, physics simulation, computer vision, robotics...).
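
A small plain-C++ illustration of the lookup-table point above: the table-based version trades arithmetic for a memory access, which is exactly the trade that has stopped paying off on memory-bound hardware (and which CUDA's shared memory exists to soften when a table is genuinely needed). The table size and the choice of sin() are arbitrary.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// The old trick: precompute sin() into a table and look values up. On
// memory-bound hardware the lookup (a memory access) can be slower than
// simply recomputing the value.
static const double kTwoPi = 6.283185307179586;

static std::vector<float> make_sin_table(int n) {
    std::vector<float> t(n);
    for (int i = 0; i < n; ++i)
        t[i] = static_cast<float>(std::sin(kTwoPi * i / n));
    return t;
}

// Table-based approximation: one memory access per call.
static float sin_lookup(const std::vector<float>& table, float x) {
    int n = static_cast<int>(table.size());
    int idx = static_cast<int>(x / kTwoPi * n) % n;
    if (idx < 0) idx += n;
    return table[static_cast<size_t>(idx)];
}

int main() {
    const std::vector<float> table = make_sin_table(4096);
    const float x = 1.234f;
    std::printf("table lookup      : %f\n", sin_lookup(table, x));
    std::printf("direct computation: %f\n", std::sin(x));  // no memory traffic
    return 0;
}
```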
 
The "everything will be thin-client" vs "everything will be thick-client" seems to ignore that there are different economies involved.
Yup.

In the beginning there was no choice - you had one computer and that was it. Given how much cheap computation there is sitting about I don't think we're going to see everything back in the "cloud" again.
Pervasive always-on high-speed internet access allows some things to move back into the cloud. Gmail, for example. Many people just don't use a local email client any more. Mind you, with Google Gears, Gmail is evolving into a local email client. The Wheel of Reincarnation is turning at 15,000 RPM. ;)

Centralisation provides a simpler abstraction for computation at the expense of making computation more expensive for each node - that ain't going to change, and no matter how much bandwidth one throws at the problem, it does not change the fact that local computation will be cheaper and faster.
Yup.

Added bandwidth makes it easier to hook everything together. Cheaper transistors means we can add RAS features to make your home computer fault-tolerant without blowing out the budget. (And we'll need to, as more and more cores and more and more gigabytes of memory become the norm.)
 
An important rule to remember is that transistors and bandwidth are cheap, people are expensive, and latency is priceless. If you can work around a people issue by adding transistors, or work around a latency issue by adding bandwidth, that's always a win.

So removing the CPU from your desktop is never likely to make sense.
 
Most people seem to have a huge overkill of processing power at their disposal.

And they're getting clued in to this, too.
See the current success of netbooks.

Nah, ain't gonna stop until everyone's running around with little projection towers sticking out in front of their eyes, transmitting an overlaid web/info heads-up display directly into their eyes, with face, character, and general 3D-environment recognition auto-processing and adding floating info boxes above everything they see, for perusal at will.

Of course, with big gains in wireless data speeds, the mobile hardware could be very thin, or truly just a dumb terminal, so to speak, with all the work being done back in the cloud. But I don't think that's what you're getting at.
 
It occurs to me there won't be floating info boxes, since there's no need: everything will be analyzed and can just be 'clicked on' as desired.

Just read a sci-fi short story with essentially exactly that theme. People would sit around and try to guess about other people, then look up all the detailed personal information at a whim.
 
I'm not sure what the relevance of this thread is as the world will end in 2012...duh.
:D
 
Except that there's no way they'd be using standard off-the-shelf software. The hardware and maintenance costs would be horrifying. Which means that no-one would use it.
Considering how many people already use Google Docs, Google Calendar, etc., I would say evidence is against you.
 
I have 4.5TB in my Nagi (my Vista box), 2.25TB each in Yurie and Haruhi (Linux boxes), 3TB RAID-5 in Pepper, and 1.9TB RAID-5 in Salt (two little fileserver appliances). Oh, and a 1.2TB external drive. And I have a new 1TB drive that arrived today.

:p

Activision pays for us to get new boxes every year or so. Right now us programmers are running 8 cores (what is that, dual quad cores?) and we will be upgrading to 16 very soon they tell me.

One thing that does use up cycles is compiling code :)
 
Hopefully by 2020 we will have finally transitioned to a fully 64-bit environment.
 
The fact that code takes so long to compile is sick. Linking is often the bottleneck these days. Incredibuild does a good job of helping with the compiling part. Using SSD drives tends to help link times a lot.
 
Yep, linking is what kills us.

The frustrating thing is that we drastically decreased our link times by adopting a unity build system, but doing so made Incredibuild pointless, because we can't compile more than 8 or 9 files at a time due to dependencies.
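
For anyone unfamiliar with the term, a unity build just concatenates many translation units into one, so shared headers are parsed once instead of once per file; the price is that a distributed build tool has far fewer independent compiles to farm out. A minimal sketch, with invented file names:

```cpp
// unity_gameplay.cpp -- a minimal unity-build translation unit.
// Instead of compiling each .cpp separately, the build system compiles this
// one file, which simply #includes the others. Headers shared by these files
// are parsed once instead of once per file, which is where the time goes.
// File names here are invented purely for illustration.
#include "player.cpp"
#include "enemy.cpp"
#include "physics.cpp"
#include "animation.cpp"
// The downside mentioned above: a distributed build tool like Incredibuild
// now sees only a handful of large translation units, so there is little
// left to compile in parallel.
```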
 
In 2020, governments had already run out of new ideas for making cars and refrigerators etc. more energy-efficient and ecological, so they turned to the IT sector and made a new law banning computer operating systems that consume unreasonably much CPU power or memory compared to the tasks the computer user actually needs.

This new law made Windows XP, Vista and later Microsoft operating systems illegal. Now computers are running again with featherweight programming techniques, and old hardware from the 1990s is again enough to run an operating system that meets all the practical needs of a typical home user, gamers excepted.

The very second that any kind of usable holographic interface becomes a reality, a cell phone is too big...
 
Sounds reasonable. The OS, app, and data should all fit on the same 5 1/4" floppy.

Who could possibly need to address more than 1 MB of memory space?

QNX did all of that years ago; very cool, very small OS.
 
This sounds exactly like cloud computing. There is a company (can't think of their name) that is going to implement a gaming website where you can play any game regardless of the computing power of your home computer, and without downloading the game software. The software runs on their servers and they just stream the video to your home computer.


So... your computer would be like a dumb terminal on a mainframe?
Welcome to the 1970s. :) I think I'll watch TRON again. :D


I guess the reason I'm including this is that I think it's unlikely that any of us are going to be able to do a very good job of predicting the future much farther out than 10-20 years.


2020 is only 10.3 years away, so I don't see a problem.


Single-processor throughput has largely peaked. While some small improvements will be made, the trend of smaller transistors = faster and lower power has already ended. Over the next decade, new processor technology will revolve around multiple cores performing parallelizable tasks. After 2020, all bets are off, since further reducing the size of transistors will be all but impossible.


I think they'll just keep adding more and more processors to computers to make them more and more powerful. The processors will probably have a lower clock speed because of overheating problems.


With a little luck, by 2020 Australia might have a semidecent broadband network.


I won't hold my breath. :(
 
