
Forced obsolescence? RAM usage

Don't worry, it isn't a completely forgotten art form. Performance-critical software, such as operating system kernels, compilers, and simulation tools, still counts bytes, and its authors will go to considerable effort to save even a dozen of them. Luckily, the vast majority of programmers are free to worry about other things.

A lot of embedded developers are still targeting systems with relatively tiny amounts of RAM and flash memory. It can actually be pretty enjoyable working under those constraints, IMO. :)
 
Nobody would find that to be a good investment

This is really what it comes down to. Time is money. Optimising code takes time, so better optimised code is more expensive. Once RAM is cheap enough, your customers will be more willing to buy more RAM than they will be to pay more for better optimised code. It's not actually a programming trend at all, it's an economic one.
 
Sounds right... and, for me at least, that is sad. But then again, I believe the goals that society at large treats as important are wrong...

And so, in the end and as a side effect, my suspicion is right... this causes forced obsolescence of perfectly good hardware. My old netbook, still functional, can't use Windows 7 anymore... it is inconceivably slow. So I'm using Ubuntu, and that lets me use a computer that would otherwise be trash.
 

You're wrong.

The software industry (I know, because I'm part of it) moves forward as fast as we can, in any way we can. It is far more productive to start working on new, more amazing software while a hardware vendor increases the power of their machines than it is for us to sit there continually optimizing old software further and further.

In cases where there is no new hardware, we do optimize the heck out of things. When new hardware is coming out, we focus on adding features to utilize the new hardware power and optimize later, and only if needed.

When the current generation of consoles (360, PS3) came out, the code for the first bunch of games was pretty atrocious. Highly unoptimized. But it didn't matter, because even cruddy code could lead to a gorgeous game compared to the original Xbox and even the PS2. Fast forward to games coming out right now -- their codebases are optimized like *crazy*. Graduating computer-sci students, and even people at most other companies in the industry, have no idea what kinds of tricks we use to get the stuff you see on screen with the now extremely limiting hardware in the current generation. We've literally been optimizing those engines for like 7 years without a hardware change.
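The post doesn't name any of those tricks, but one well-known example of the kind of late-generation console optimization being described is reorganizing hot data from array-of-structs to struct-of-arrays, so a tight loop streams only the bytes it actually touches instead of dragging every field through the cache. A minimal sketch; the particle type, field names, and `update_positions` function are all hypothetical, not from any real engine:

```c
#include <stddef.h>

#define N 1024

/* Array-of-structs: each particle's fields sit together, so a loop
   that only updates positions still pulls velocities and colors
   into the cache alongside them. */
struct particle_aos {
    float x, y, z;
    float vx, vy, vz;
    unsigned color;
};

/* Struct-of-arrays: each field lives in its own contiguous array,
   so the position-update loop reads exactly the data it uses. */
struct particles_soa {
    float x[N], y[N], z[N];
    float vx[N], vy[N], vz[N];
    unsigned color[N];
};

/* Hot loop over the SoA layout: sequential, cache-friendly accesses
   that also vectorize easily. */
void update_positions(struct particles_soa *p, float dt) {
    for (size_t i = 0; i < N; i++) {
        p->x[i] += p->vx[i] * dt;
        p->y[i] += p->vy[i] * dt;
        p->z[i] += p->vz[i] * dt;
    }
}
```

The behavior is identical either way; only the memory layout changes. That is typical of this class of trick: squeezing more out of fixed hardware by matching data layout to access patterns rather than by changing what the code computes.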

So it just depends on the conditions. I take offense at the suggestion that programmers these days are less concerned with proper coding. The very notion of "proper" is context-dependent, and the context is what needs to be delivered to the user.
 
