aggle-rithm
Ardent Formulist
As software development matures as a discipline, and as more and more solutions appear to address various problems, I find myself torn among an ugly plethora of choices. I read up on the latest developments and am constantly told "this is the next big thing!" It's something everyone is doing, all the jobs require this knowledge, etc., etc.
However, I know for a fact that yesterday's "next big thing" is often today's "what were we thinking?" When I first started working as a developer, Rapid Application Development (RAD) was the biggest thing since sliced bread; today, it is a stupid idea that we wasted too much time on ten years ago.
On the other hand, there is a serious problem with ignoring all the new trends and just sticking with what you know. That's what I did the first five years or so of my career, and I found myself seriously behind the times. Certain aspects of Agile development, Extreme Programming (XP), test-driven development, and domain-driven design have become essential tools for me in the last few years; none of this was taught in school.
Today the next big things are "the Cloud" (whatever that is) and dynamic languages like Ruby. There is still no universal development strategy for the Web. Even if you stick with Microsoft, you have to decide amongst ASP.NET, MVC, WebMatrix, Silverlight, and other technologies. After twenty years of the World Wide Web, you'd have thought they would have figured out how to develop for it by now.
At the same time, there are many people arguing that some tried and true technologies are dead and shouldn't be bothered with. C/C++, Java, PHP...these are platforms whose time has passed, according to some prognosticators, even though they are still dominant forces in the programming world.
Given that a single person can only learn a limited portion of what's out there, and we have no idea what's actually going to be used in three years...how does one choose what to focus on today?