Programming Trends to Follow?

aggle-rithm

As software development matures as a discipline, and as more and more solutions appear to address various problems, I find myself torn among an ugly plethora of choices. I read up on the latest developments, and am constantly told "this is the next big thing!" It's something everyone is doing, all the jobs require this knowledge, etc., etc.

However, I know for a fact that yesterday's "next big thing" is often today's "what were we thinking?" When I first started working as a developer, Rapid Application Development (RAD) was the biggest thing since sliced bread; today, it is a stupid idea that we wasted too much time on ten years ago.

On the other hand, there is a serious problem with ignoring all the new trends and just sticking with what you know. That's what I did the first five years or so of my career, and I found myself seriously behind the times. Certain aspects of Agile development, Extreme Programming (XP), test-driven development, and domain-driven design have become essential tools for me in the last few years; none of this was taught in school.

Today the next big things are "the Cloud" (whatever that is), and dynamic programming languages like Ruby. There is still no universal development strategy for the Web. Even if you stick with Microsoft, you have to decide amongst ASP.Net, MVC, WebMatrix, Silverlight, and other technologies. After twenty years of the World Wide Web, you'd have thought they would have figured out how to develop for it by now.

At the same time, there are many people arguing that some tried and true technologies are dead and shouldn't be bothered with. C/C++, Java, PHP...these are platforms whose time has passed, according to some prognosticators, even though they are still dominant forces in the programming world.

Given that a single person can only learn a limited portion of what's out there, and we have no idea what's going to actually be used in three years...how does one choose what to focus on today?
 
Just a hint of what might be the key to sorting all this out...if it supports an established principle of programming that has been around for decades, then it will probably be successful in the long run. That's why I think it's better to focus on MVC than some of the other Microsoft offerings, because it's actually quite an old design pattern that never had a suitable platform available until recently.

It's not simply that it's "old", it's also that it is based on very solid principles of software development.
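
For anyone unfamiliar with the pattern, the idea is simple: the model holds the data and the rules, the view only renders, and the controller mediates between the two. Here's a minimal sketch in Java (the class names are just made up for illustration, not taken from any particular framework):

Code:
// Minimal MVC sketch: the model knows nothing about presentation,
// the view only renders, and the controller mediates between them.
class CounterModel {
    private int count = 0;
    int getCount() { return count; }
    void increment() { count++; }
}

class CounterView {
    void render(int count) { System.out.println("Count is now: " + count); }
}

class CounterController {
    private final CounterModel model;
    private final CounterView view;

    CounterController(CounterModel model, CounterView view) {
        this.model = model;
        this.view = view;
    }

    // Handles an "increment" request coming from the user interface.
    void onIncrement() {
        model.increment();             // update state
        view.render(model.getCount()); // refresh presentation
    }
}

public class MvcDemo {
    public static void main(String[] args) {
        new CounterController(new CounterModel(), new CounterView()).onIncrement();
    }
}

The model never knows how it is displayed, which is exactly the kind of decades-old principle I'm talking about.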
 
As I understand it, "The Cloud" is putting your data on other people's computers, so that you don't have to worry about it.
For some reason, this worries me.
 
As I understand it, "The Cloud" is putting your data on other people's computers, so that you don't have to worry about it.
For some reason, this worries me.

That's one definition. It's also used to describe software-as-a-service. At one point, Dell defined "the Cloud" as a class of servers.

In my mind, if the industry cannot agree on what a certain buzzword is, then that buzzword is not something you should waste your time on.

But I could be wrong.
 
'Cloud' is basically online storage. Dropbox, essentially. Google Docs. Save online and you can access your data anywhere.
 
As software development matures as a discipline, and as more and more solutions appear to address various problems, I find myself torn among an ugly plethora of choices. I read up on the latest developments, and am constantly told "this is the next big thing!" It's something everyone is doing, all the jobs require this knowledge, etc., etc.

However, I know for a fact that yesterday's "next big thing" is often today's "what were we thinking?" When I first started working as a developer, Rapid Application Development (RAD) was the biggest thing since sliced bread; today, it is a stupid idea that we wasted too much time on ten years ago.

On the other hand, there is a serious problem with ignoring all the new trends and just sticking with what you know. That's what I did the first five years or so of my career, and I found myself seriously behind the times. Certain aspects of Agile development, Extreme Programming (XP), test-driven development, and domain-driven design have become essential tools for me in the last few years; none of this was taught in school.

Today the next big things are "the Cloud" (whatever that is), and dynamic programming languages like Ruby. There is still no universal development strategy for the Web. Even if you stick with Microsoft, you have to decide amongst ASP.Net, MVC, WebMatrix, Silverlight, and other technologies. After twenty years of the World Wide Web, you'd have thought they would have figured out how to develop for it by now.

At the same time, there are many people arguing that some tried and true technologies are dead and shouldn't be bothered with. C/C++, Java, PHP...these are platforms whose time has passed, according to some prognosticators, even though they are still dominant forces in the programming world.

Given that a single person can only learn a limited portion of what's out there, and we have no idea what's going to actually be used in three years...how does one choose what to focus on today?

Change a few of the names, and this exact post could have been written anytime during the last 50 years.

How does one choose? The one that pays the highest rates.
 
'Cloud' is basically online storage. Dropbox, essentially. Google Docs. Save online and you can access your data anywhere.

It includes Software as a Service, though. This is the part that is more difficult for the average consumer to understand, so Microsoft is marketing "the cloud" as, essentially, online storage.

Maybe that's what it will end up being. .Net was originally more than just a software development platform, but that's all it is today.
 
Change a few of the names, and this exact post could have been written anytime during the last 50 years.

How does one choose? The one that pays the highest rates.

COBOL once paid the highest rates.

Then it paid nothing, because no one needs outdated technology.

(Although I recently heard of an outfit that uses COBOL to render HTML. Why, I'll never know.)
 
COBOL once paid the highest rates.

Then it paid nothing, because no one needs outdated technology.

(Although I recently heard of an outfit that uses COBOL to render HTML. Why, I'll never know.)

Exactly. Which is why I suggested going with what pays the highest rates, then when that changes, move on to the next thing. Or you can try to guess, and maybe pick the wrong technology to follow.
 
It works both ways. Java was considered a fad, now it's the new C++.
 
It works both ways. Java was considered a fad, now it's the new C++.

Interestingly, both Ruby and C++ developers say Java should be avoided. Ruby developers because Java is old-fashioned, C++ developers because it is slow (although not as slow as Ruby).
 
Given that a single person can only learn a limited portion of what's out there, and we have no idea what's going to actually be used an three years...how does one choose what to focus on today?
Have you ever attended a developers' trade show, convention, or expo? Attending these things every now and then will give you a good flavor of what is on the horizon: which technologies are slick and will stick, and which ones are inherently weak and flimsy.

I find them to be more revealing, in this area, than magazines, book store browsing, or Internet articles. The tech demos are often right there to evaluate, with experts to answer questions standing by.

Certain aspects of Agile development, Extreme Programming (XP), test-driven development, and domain-driven design have become essential tools for me in the last few years; none of this was taught in school.
Welcome to the real world.

This sort of thing happens in any profession, though I suspect that it is especially rapid in software development.

As long as one is always willing to learn, and keep tabs on what is going on, one will always be on the path to being a good developer.

You do not have to go it alone. Trade shows also present good opportunities for networking.

Today the next big things are "the Cloud" (whatever that is),
The Cloud is more of a hosting strategy than a development platform. True, it does require thinking about your application from a more distributed, service-oriented point of view. But, your choice of "cloud" or not is independent of which platform and languages you will be developing on.

I believe the concept of The Cloud will be around for a good long time. However, its usefulness is probably being oversold, at the moment. There could be a Cloud Backlash in the near future. But then it will slowly ramp up in popularity, again, once we get the knack for what it is truly useful for.

I am not, however, a cloud developer myself, yet. So, what do I know?

and dynamic programming languages like Ruby.
And here I thought Ruby was on the way out, due in part to performance concerns. Well, again, what do I know?

There is still no universal development strategy for the Web.
Nor should we expect there to ever be one. That is the nature of the web. The only thing its servers have to share is the TCP/IP protocol (and even that is changing from IPv4 to IPv6). Everything else is open to whatever anyone happens to stuff into any servers that can use the protocol.

Even if you stick with Microsoft, you have to decide amongst ASP.Net, MVC, WebMatrix, Silverlight, and other technologies.
First of all, WebMatrix is a management and coding tool, not a platform. You might have been thinking about Razor syntax.

And, Silverlight is of limited use on the Web.

If you choose to be a Microsoft-based web developer (as I am), you ought to learn all three: Razor, conventional ASP.NET (Web Forms), and ASP.NET MVC. Many skills are largely transferable between them, including the syntax and semantics of the languages. The size of the site might determine which you should consider using. But, so will whatever teams you happen to be working with, if any.

Silverlight is good for two things on the Web: Video playing (at least until HTML5 catches up), and Being All Arrogant ("Look at my web site! It is sooo special because it is Rich!").

(Adobe Flash might be even better in both cases, though if you are already a Microsoft coder, you might have an easier time coding for Silverlight.)

Of course, Silverlight is the platform of choice for Windows Phone 7. But, why would anyone want to develop anything for that?!

After twenty years of the World Wide Web, you'd have thought they would have figured out how to develop for it by now.
The problem is that you can't have one dream team of people coming up with all the good ideas, no matter how hard you try.

We do have committees to establish HTML and protocol standards, since those must be shared. But, not server technologies, because they do not have to be.

Someone comes up with a good idea for web development (MVC-style, for example) that a big player (Microsoft, in this case) had not considered before, and the big player would be stupid not to adopt it once it is proven (as we see with the roll-out of ASP.NET MVC).

And, this repeats with everything else: AJAX*, Clouds, agile programming, etc.

(*Funny how the core of AJAX, the XMLHttpRequest object, was invented by Microsoft, but it took outside parties to demonstrate how useful it could be before MS widely adopted it!)

At the same time, there are many people arguing that some tried and true technologies are dead and shouldn't be bothered with. C/C++, Java, PHP...these are platforms whose time has passed, according to some prognosticators, even though they are still dominant forces in the programming world.
C/C++ still dominates the desktop application world, and even a good portion of the mobile device app world.

Java is not going away, either. It also dominates a good portion of the mobile world, at least. And, other stuff.

PHP: Who knows? It is popular enough that it will stick around for a good long while. But, I was never much into it, myself.

There are some truly dead or dying languages, as far as opportunities for new projects are concerned: VB6, Pascal, FoxPro, just to name a few. Even COBOL and Fortran, though not entirely dead, offer few opportunities to grow as a developer, and are considered niche expertise.

(COBOL programmers might be paid well as a result. But, it is more difficult to master the skills to compete in that limited market.)

When I first started working as a developer, Rapid Application Development (RAD) was the biggest thing since sliced bread; today, it is a stupid idea that we wasted too much time on ten years ago.
RAD is still a viable strategy, in cases where software needs to be developed… well… rapidly. Highly customized data-entry applications, for example. Especially if they are limited to usage in a particular office or something.

I would not use a RAD approach to commercial software, of course. Ugh.

However, if you use a CMS for a web app, you are essentially utilizing a lot of RAD principles, right there. So, the approach is hardly a complete waste of time.

It includes Software as a Service, though. This is the part that is more difficult for the average consumer to understand, so Microsoft is marketing "the cloud" as, essentially, online storage.
Right.

Maybe that's what it will end up being. .Net was originally more than just a software development platform, but that's all it is today.
.NET was never anything more than a development platform (with a special focus on web services, for better or worse). It was marketed as something more than it really was, for a while.

With the Cloud you have the opposite: It IS a bit more than merely online storage. But, it is commonly sold as less than it could be, to most people.

Someday, I will see if I can find a use for it.
 
COBOL once paid the highest rates.

Then it paid nothing, because no one needs outdated technology.

(Although I recently heard of an outfit that uses COBOL to render HTML. Why, I'll never know.)
There is at least one banking system that I know of, used in a number of major banks and deposit/lending institutions, that is written in COBOL. I also know there are online-payment modules, which we have to install from time to time, that are written in COBOL.

I suppose I can understand why COBOL persists in financial institutions.

I agree completely about the HTML.
 
Maybe I'm wrong, but in my personal experience, these changes just sort of organically become best practices, and for the most part you'll build software the way your employer wants you to. Or you'll get a job babysitting some piece of legacy code and you'll code in that language for a few years. So if you become well versed in "agile" or become great with C#? Great, unless your client or employer wants to work differently.

IMHO, it's much more important to understand design patterns and work on creating orderly, well crafted code, because then you'll be equipped for anything.

(Although I do love working with our MVC system in AS3).
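
To illustrate the design-patterns point, here's a minimal Strategy-pattern sketch in Java (all the names are invented for the example); the same shape carries over to C#, AS3, or just about any other OO language:

Code:
import java.util.Arrays;
import java.util.List;

// Minimal Strategy sketch: the pricing algorithm varies independently
// of the checkout code that uses it.
interface PricingStrategy {
    double priceFor(double basePrice);
}

class RegularPricing implements PricingStrategy {
    public double priceFor(double basePrice) { return basePrice; }
}

class SalePricing implements PricingStrategy {
    public double priceFor(double basePrice) { return basePrice * 0.8; } // 20% off
}

class Checkout {
    private final PricingStrategy pricing;

    Checkout(PricingStrategy pricing) { this.pricing = pricing; }

    double total(List<Double> items) {
        double sum = 0;
        for (double item : items) sum += pricing.priceFor(item);
        return sum;
    }
}

public class StrategyDemo {
    public static void main(String[] args) {
        List<Double> cart = Arrays.asList(10.0, 20.0);
        System.out.println(new Checkout(new RegularPricing()).total(cart)); // 30.0
        System.out.println(new Checkout(new SalePricing()).total(cart));    // 24.0
    }
}

Swap in a new strategy and the checkout code never changes; that kind of thinking outlives any particular platform.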
 
There is at least one banking system that I know of, used in a number of major banks and deposit/lending institutions, that is written in COBOL. I also know there are online-payment modules, which we have to install from time to time, that are written in COBOL.

I suppose I can understand why COBOL persists in financial institutions.

I agree completely about the HTML.

I've been in IT for 30 years. There is tons of COBOL still around.
 
Interestingly, both Ruby and C++ developers say Java should be avoided. Ruby developers because Java is old-fashioned, C++ developers because it is slow (although not as slow as Ruby).

Java is a pig of a thing. Once you write any sufficiently complex application in Java, it becomes a massive lump of slowness that takes forever to start. SAP was going to use Java as the future direction for all its application development. They have quietly dropped it, and are moving applications from Java back to their own in-house programming language, ABAP. ABAP is a derivative of COBOL. Not as theoretically correct as Java, but a hell of a lot faster, and less memory intensive. I saw a Java part of it use 48 GB of memory the other day to process one single transaction of data. You can't do that in a commercial environment, even today.

The problem is the Object Oriented programming paradigm. It leads to inherently inefficient code. SAP managed to make an incredibly complex application that used the relational database method of describing the data, and that does work. Using OO to describe your data leads to an unmanageable pile of junk.

As for the future, functional programming is the way to go. When people will realise that and pay for it is another matter.
 
The problem is the Object Oriented programming paradigm. It leads to inherently inefficient code. SAP managed to make an incredibly complex application that used the relational database method of describing the data, and that does work. Using OO to describe your data leads to an unmanageable pile of junk.

It CAN lead to that. If used properly, it's a way to manage complexity. Sure, it would be more efficient to write spaghetti code, provided it works perfectly and you don't have to maintain it.

Assembly code would be WAY faster than any other method, but who has time for that?

As for the future, functional programming is the way to go.

If you can reconcile the fact that functional programming has no side effects with the need for the application to actually do something, sure...

Functional principles are working their way into mainstream languages, and that may be as far as it ever goes.
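
By way of illustration, here's the sort of thing I mean, in plain Java (a toy example of my own, not any particular framework): keep the core a pure function, and push the side effects out to the edges.

Code:
import java.util.Arrays;
import java.util.List;

public class FunctionalStyleDemo {

    // Pure function: the same input always yields the same output,
    // and nothing outside the method is modified.
    static int sumOfSquares(List<Integer> xs) {
        int total = 0;
        for (int x : xs) total += x * x; // local accumulation only
        return total;
    }

    public static void main(String[] args) {
        int result = sumOfSquares(Arrays.asList(1, 2, 3)); // the pure core
        System.out.println(result); // 14 -- the side effect lives at the edge
    }
}

You still need side effects somewhere for the program to do anything; the functional influence is in keeping them quarantined.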
 
It CAN lead to that. If used properly, it's a way to manage complexity. Sure, it would be more efficient to write spaghetti code, provided it works perfectly and you don't have to maintain it.

"If used properly" is a way out for Java, but OO leads to a complexity that cannot be managed. Relational Databases have proven to be far more capable of managing complex data, and it's the data that code exists to manage. This 48GB of memory was probably done perfectly correctly at the design stage, then has to be hacked to make it workable, which they are doing now.

There is no need to resort to spaghetti code to create efficient applications. I would doubt that spaghetti code can create efficient applications, as it rapidly becomes unmaintainable.
 
