Games and badly written (e.g. Micro$haft) code.
Games have been driving computer development and sales for a long time; the "holy grail" is a machine fast enough to handle real-time photorealistic animation. Until we have that, game designers will keep pushing the hardware envelope for immersive RPG- and FPS-style games.
Also, when cycles are cheap enough, you stop worrying about them. Why bother to use a fast sorting algorithm when you can code up bubble sort and the program still gives "acceptable" responsiveness? Overall code efficiency, in both space and time, has dropped dramatically over the past twenty years precisely because there's little incentive to make code fast or compact any more.
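To make the bubble-sort point concrete, here's a quick Python sketch (the bubble_sort function and timing harness are mine, for illustration; the exact numbers you see will depend on your machine):

```python
import random
import time

def bubble_sort(items):
    # Classic O(n^2) bubble sort, shown only for comparison.
    items = list(items)
    n = len(items)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # already sorted; bail out early
            break
    return items

data = [random.random() for _ in range(5000)]

start = time.perf_counter()
slow = bubble_sort(data)
bubble_time = time.perf_counter() - start

start = time.perf_counter()
fast = sorted(data)  # built-in Timsort, O(n log n)
fast_time = time.perf_counter() - start

print(f"bubble sort: {bubble_time:.3f}s  built-in sort: {fast_time:.3f}s")
```

On a modern machine both finish quickly enough to feel "acceptable" at small sizes, which is exactly why the slow version survives in shipped code; grow the input and the gap becomes painful.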
One of my old professors addressed this way back in my undergraduate years. This is roughly what he presented:
Assume three algorithms of complexity O(n), O(n log n), and O(n^2), and three processors that can perform 1,000, 10,000, and 100,000 operations per second. (I'll be using log base 10 for ease of calculation... I haven't had my coffee.)
Cases:
Processor      Algorithm     n        Operations   Time to complete
1,000 o/s      O(n^2)        10,000   10^8         100,000 s ~= 28 h
10,000 o/s     O(n^2)        10,000   10^8          10,000 s ~= 3 h
100,000 o/s    O(n^2)        10,000   10^8           1,000 s ~= 17 min
1,000 o/s      O(n log n)    10,000   40,000             40 s
10,000 o/s     O(n log n)    10,000   40,000              4 s
100,000 o/s    O(n log n)    10,000   40,000              0.4 s
1,000 o/s      O(n)          10,000   10,000             10 s
10,000 o/s     O(n)          10,000   10,000              1 s
100,000 o/s    O(n)          10,000   10,000              0.1 s
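The whole table can be regenerated in a few lines of Python (my own sketch, using log base 10 as in the example above):

```python
import math

# Operations needed by each algorithm at n = 10,000, and the wall-clock
# time on each processor, reproducing the professor's table.
n = 10_000
algorithms = {
    "O(n^2)":     n ** 2,                    # 10^8 operations
    "O(n log n)": int(n * math.log10(n)),    # 40,000 operations
    "O(n)":       n,                         # 10,000 operations
}
processors = [1_000, 10_000, 100_000]  # operations per second

for name, ops in algorithms.items():
    for speed in processors:
        seconds = ops / speed
        print(f"{speed:>7} o/s  {name:<10} {ops:>12} ops  {seconds:>10g} s")
```

Note the takeaway hiding in the numbers: the 100x faster processor running the O(n^2) algorithm (1,000 s) still loses badly to the slowest processor running the O(n) algorithm (10 s).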
And this is why it remains important to use efficient algorithms. =)