
Forced obsolescence? RAM usage

Bodhi Dharma Zen

Could it be that software programmers have become ... less concerned with proper coding, because if the computer has enough resources it can run badly written programs without being spectacularly slow?

I just noticed that Skype uses 200MB of RAM just by being there, in the background. Sure, I have plenty of resources left, but I also work with lots of images, so I need a fairly fast computer with lots of RAM. In my laptop, of course, it's a different story. I have only 2GB of RAM there and limited processing power, so using Skype really slows down the computer, and I don't get why. AFAIK, Skype's functionality has not changed in a decade or so, back when computers with 1GB of RAM were powerful. I bet (without remembering, of course) that Skype used to use much less RAM in the past...

And I'm sure the same goes for other programs. What would be the reason for this?



EDIT: MODERATORS: I just noticed that there is a particular subtopic for this, if you consider it necessary, please move the thread. Thanks.
 
Well first, there is the trade-off of having lots of fancy menu animations, video codecs, audio codecs, etc, that make a program like Skype use quite a bit of RAM.

And second, ya, RAM usage is way way way down there on the list of priorities. Things like:

A product that ships on time.
Security vulnerabilities
Bugs
New features
Low cost selection of third party components

etc, etc, etc, all come first. Making a piece of code run faster or use less memory takes time and generally makes it more complicated. In other words, it won't ship on time, and it will be the source of more security and crash bugs. Which would you choose?
 
I just noticed that Skype uses 200MB of RAM just by being there, in the background.

hmm ... just checked my Skype and it's only using 71 MB.

A lot of memory usage these days is simply about caching material for a faster response when you call it up - a program will use what you have if nothing else is using it.
 
RussDill,

Those are my feelings... as hardware is cheap, time and resources are invested elsewhere...

icerat,

Mine is using 150 MB right now, and I have one chat open... no video or audio, just chat. Regarding caching... yes, I use Windows 7 and I believe it does that a lot; in fact, I believe this started with Vista. Still... I remember using Skype a decade ago, on computers with 512 MB of RAM and processors slower than current Intel Atoms... and video was OK. Now on my netbook, I can't even use Skype to video chat... it has become unusably slow. I believe it's a way to pressure the public into buying new hardware... I mean, it's not that they can't make the software run perfectly well on less-than-speedy machines, but, as RussDill said, it is not their goal.
 
You can have a program that's

1) Low complexity
2) Low run-time
3) Low memory

But you only get to pick two. To improve any one of those three, you have to start sacrificing at least one of the others at some point.

You want it fast? We'll have to cache more in memory, or make the code use silly processor tricks that the next person to debug it will tear their eyes out trying to understand.

You want it low memory? See the above silly processor tricks, or be willing to wait for every slow memory/drive access.

You want it uncomplicated? That code will take forever to run, or it will have to spread itself over ten times the memory.

People need fast and bug-free, so #1 and #2 have highest priority. There's always more RAM, so #3 is way down the list.

Add in the "and make it pretty" factor of a modern GUI, and it's nearly impossible to even get two.
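
To make the trade-off concrete, here's a toy sketch in C (my own illustration, not code from any of the programs discussed): counting the set bits in a word either by looping over every bit, or by spending 256 bytes on a precomputed table. The loop version is simple and uses no extra memory; the table version is faster, but it buys that speed with memory and a bit more complexity.

/* Toy illustration of trading memory for speed (not from any real product). */
#include <stdio.h>
#include <stdint.h>

static uint8_t bit_table[256];          /* extra memory spent to buy speed */

static void build_table(void)
{
    for (int i = 0; i < 256; i++) {
        int n = 0;
        for (int v = i; v; v >>= 1)
            n += v & 1;
        bit_table[i] = (uint8_t)n;
    }
}

/* Low memory, more work per call: loop over every bit. */
static int popcount_loop(uint32_t x)
{
    int n = 0;
    for (; x; x >>= 1)
        n += x & 1;
    return n;
}

/* Low run-time, more memory: one table lookup per byte. */
static int popcount_table(uint32_t x)
{
    return bit_table[x & 0xff] + bit_table[(x >> 8) & 0xff]
         + bit_table[(x >> 16) & 0xff] + bit_table[(x >> 24) & 0xff];
}

int main(void)
{
    build_table();
    uint32_t v = 0xDEADBEEF;
    printf("loop:  %d bits set\n", popcount_loop(v));
    printf("table: %d bits set\n", popcount_table(v));
    return 0;
}

Scale that same decision up across thousands of functions, caches and lookup tables, and you get a program that responds faster but sits noticeably heavier in RAM.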
 
In the 1980s I wrote programs on a computer that had 256 KB of RAM. But that space was segmented into partitions of 64K each, and some of the partition space was reserved for a "systems communication" area. The net effect was we had about 56 K to work with for any given program. The average computer had 14 megabytes of disc space; the larger ones had 64 megs. And the OS didn't do paging (that is, it had no ability to swap RAM to disc on its own. We as programmers had to do that in our programs.)

It's really nice these days when you have a multi-megabyte file to work with to simply slurp the whole thing into a big array in RAM and just work with it.

Also, keep in mind that to a decent OS, memory that isn't in use is being wasted, like buying a 50 room mansion when all you need is a bungalow, leaving 38 rooms completely unused. So while the OS takes some pains to try to minimize the amount of memory each program uses, it's always looking for ways to use available RAM for things like caching and buffering.

And memory management in a modern operating system (OS) is complicated. A program might ask for 10 megs of RAM and the OS happily says, "Sure, here it is!" But it's only when the program actually goes to use that memory that the OS really gives it to the program. What that means is your task manager may be reporting the amount of memory the program has requested, as opposed to how much it's using, which is different again from memory being shared among a task's threads, and different yet again from memory being held in reserve for copy-on-write operations... basically, trying to figure out how much memory a given task actually uses is a bit like trying to figure out which of three cups the ball is under.
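
You can watch that "promised but not yet delivered" behaviour yourself. Here's a small Unix-flavoured C sketch (purely illustrative, nothing to do with Skype): it asks for 100 MB, pauses so you can look at the process in top or your task manager, then touches half of it and pauses again. The virtual size jumps as soon as malloc() returns, but the resident (actually backed) size only grows once the pages are written to.

/* Illustration of lazily committed memory; Unix-style sleep() assumed. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

#define CHUNK (100u * 1024u * 1024u)    /* ask the OS for 100 MB */

int main(void)
{
    char *block = malloc(CHUNK);
    if (!block) {
        perror("malloc");
        return 1;
    }

    printf("Allocated 100 MB; resident memory has barely moved.\n");
    sleep(10);                          /* check the task manager now */

    memset(block, 0xAA, CHUNK / 2);     /* touch half the pages */
    printf("Touched 50 MB; resident memory grows only now.\n");
    sleep(10);                          /* check again */

    free(block);
    return 0;
}

Which of those numbers your task manager shows you depends on the tool, which is exactly why two people can look at the same program and report very different figures.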
 
Yep.

And it's been this way for 50+ years.

It’s true but in some ways less true now than it’s been. Compared to the speed things are running at inside the processor, RAM is more similar to a hard disk than the RAM of 30 years ago.

- It can take hundreds, perhaps thousands, of clock cycles to get data from RAM.
- Despite the name, modern RAM is heavily geared towards sequential access, and you pay a big penalty for accessing it in a random way.


Truly high performance on today’s computers requires you to have your executable code in L1 cache, your frequently accessed data in L2 or L3 cache, and to use your RAM as a buffer to stream prefetched data from other sources: disk, network, etc.

Unfortunately the predominant programming methodology (object oriented) does the exact opposite, but most code really doesn’t need to be that fast, so they get away with it. Voice recognition, encryption, inline compression and 3D rendering tend to be the common exceptions.
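
The sequential-versus-random penalty is easy to demonstrate. Here's a rough C sketch (illustrative only, sizes picked arbitrarily): it sums the same large matrix twice, once row by row (walking memory sequentially) and once column by column (jumping around in memory). The arithmetic is identical, but on typical current hardware the second walk usually takes several times longer because almost every access misses the cache.

/* Rough demonstration of cache-friendly vs cache-hostile traversal. */
#include <stdio.h>
#include <time.h>

#define N 4096

static double a[N][N];                  /* ~128 MB, well beyond any cache */

int main(void)
{
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            a[i][j] = 1.0;

    clock_t t0 = clock();
    double row_sum = 0.0;
    for (int i = 0; i < N; i++)         /* row-major: sequential in memory */
        for (int j = 0; j < N; j++)
            row_sum += a[i][j];
    clock_t t1 = clock();

    double col_sum = 0.0;
    for (int j = 0; j < N; j++)         /* column-major: strided, cache misses */
        for (int i = 0; i < N; i++)
            col_sum += a[i][j];
    clock_t t2 = clock();

    printf("row-major: %.2fs  column-major: %.2fs  (sums %.0f / %.0f)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC,
           (double)(t2 - t1) / CLOCKS_PER_SEC, row_sum, col_sum);
    return 0;
}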
 
The responses are interesting. But I still fail to grasp why, if a piece of software has had the SAME functionality for 10 years, it now uses tons more resources.
 
In Firefox you can enter the address "about:memory" for a detailed usage breakdown.

There appear to be tons of separate garbage-collected heaps ("gc-heap"). A curious item, probably one of many, is a 3 GB section of one JavaScript unit labeled "short strings". That must be over 10,000 strings for that one unit(!?)
 
The responses are interesting. But I still fail to grasp why, if a piece of software has had the SAME functionality for 10 years, it now uses tons more resources.

But is it really the same functionality? Are your video streams running at the same resolution now that they did 10 years ago?
 
The responses are interesting. But I still fail to grasp why, if a piece of software has had the SAME functionality for 10 years, it now uses tons more resources.

The other question to ask (to jump off the platform Ziggurat placed) is whether it really is exactly the same functionality. Or is there (for example) now better error checking and trapping, more useful help files, wizards for common tasks, larger/better buffering, higher resolutions, more supported file formats, better logging, auditing, and/or security, added code to support interoperation with other software, or other similar "back-end" things that wouldn't be represented by a change in the visible GUI?
 
The responses are interesting. But I still fail to grasp why, if a piece of software has had the SAME functionality for 10 years, it now uses tons more resources.

Because modern GUI programs aren't standalone. There might be one person who writes the core, but it's going to link against a library that handles the screen graphics, and a library that handles memory usage, and a library that manages user authentication from the OS, and a library that allows you to "like" it on MySpaceTime, and a library that provides an XML/Moose/JSON API parser...

Simply relinking with current versions of every library (which loads of people are working on) can triple the executable size.
 
In terms of RAM usage, my current home computer is actually the first one I've owned where the amount of RAM significantly exceeds my actual requirements. You can pick up 16 GB of DDR3 for less than $100 - not the really high-speed stuff, but that only makes a tiny difference. I find that I pretty much never use over 6 GB, and that includes running a 2GB RAMdisk. Well, maybe I go a bit above that when running some big games, but even there it doesn't push the limit, since I'm not multi-tasking when playing games. The rest of that RAM doesn't actually go to waste, it gets used as a massive disk cache. And when ripping a DVD into an MP4, a potential 10 GB disk cache is actually nice, though not required. But basically, at least on the consumer side, I think hardware has finally outpaced software. And that's a good thing.
 
Anything that will run well in a virtual machine, I put in one. This provides a lot of recoverability options and a lot of isolation for web-based activities. Even trustworthy web sites are frequently compromised to put malware on your PC. With a VM this is usually isolated to just the guest, so if you have separate guests for business and financial functions they are usually safe. Additionally, if I do have a problem I don’t need to reinstall the OS, just restore the virtual disk file. It doesn’t even need to be restored on the same host.

Multiple 8GB VMs plus 16GB on the host for games and other memory-intensive functions add up pretty quickly. 32GB is basically the minimum I would go with these days.
 
I used to have to make real things run in 2K of ROM and 128 bytes of RAM, and at times I look at my computer (24 GB RAM, 4 TB disk, Quad i7) and wonder how it is that it's usually full given that.

The answer is that reducing program RAM footprint takes a LOT of effort, and comes with risks if done with any zeal.

I used to spend days to find a few dozen bytes I could dispense with so that new features could be added.

Nobody would find that to be a good investment of time now.
 
I used to have to make real things run in 2K of ROM and 128 bytes of RAM, and at times I look at my computer (24 GB RAM, 4 TB disk, Quad i7) and wonder how it is that it's usually full given that.

The answer is that reducing program RAM footprint takes a LOT of effort, and comes with risks if done with any zeal.

I used to spend days to find a few dozen bytes I could dispense with so that new features could be added.

Nobody would find that to be a good investment of time now.

Don't worry, it isn't a completely forgotten art form. Performance critical applications, such as operating system kernels, compilers, simulation tools, etc, still count bytes and make a big effort for a saving of even a dozen bytes. Luckily, the vast majority of programmers are free to worry about other things.
 
