That's only true if you aren't really using OO in the right way. I rarely write classes that are more than 1,000 lines of code - most are 100 lines or fewer.
I have written many hundreds of whole programs in fewer than 100 lines of code. A 1,000-line procedural program would be considered huge, even in COBOL! Many of my procedural programs compiled to under a hundred kilobytes of executable, with no need to drag around great lumps of library code to work either. And they did precisely and only the job they were designed to do, VERY quickly. Cleverer programmers could do far more than I could, in far less code.
While I can and do understand how OO works and the advantages it offers (I use it myself), to me as an olde-tyme programmer with over 30 years' experience, OO stands mostly for 'Orribly Optimised'!
To be honest, I think young programmers should be taught how to write well-constructed, optimally designed code in a highly limited environment BEFORE they graduate to OO. They should learn to appreciate how big a byte really is and what can actually be done with very limited functionality. They should be encouraged to keep these limitations in mind, because they continue to exist on even the most modern computers...
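Just to make that concrete, here's a minimal sketch in C of the sort of thing I mean (the flag names and values are invented purely for illustration): eight independent on/off settings packed into a single byte, rather than eight separate variables or a whole object apiece.

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical settings, one bit each - up to eight fit in a byte. */
enum {
    FLAG_LOGGING  = 1u << 0,
    FLAG_VERBOSE  = 1u << 1,
    FLAG_READONLY = 1u << 2,
    FLAG_CACHED   = 1u << 3
    /* ... bits 4 to 7 still free */
};

int main(void)
{
    uint8_t flags = 0;                    /* one byte holds all eight settings */

    flags |= FLAG_LOGGING | FLAG_CACHED;  /* switch two flags on */
    flags &= (uint8_t)~FLAG_CACHED;       /* switch one back off */

    if (flags & FLAG_LOGGING)
        printf("logging is on\n");

    printf("all settings fit in %zu byte(s)\n", sizeof flags);
    return 0;
}
```

Nothing clever, but a programmer who has had to think at that level knows exactly what their data costs in memory.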
It's all very fine to say "hardware is cheap", but it still isn't free... And given that much of the working life of most modern operating systems is spent dealing with bloatware produced by programmers who simply do not understand what their code actually does to memory and hardware (and many of these are current commercial programs!), imagine how much more could be achieved simply by optimising that software with the above performance goals in mind, without upgrading the hardware at all... Why, you could get the same features much faster and with far lower memory consumption. Which is better, no?
OK, I'll go and oil my Zimmer frame now, get my cardigan on, make a nice cup of tea, and sit by the fire... Thank you for listening, young man.