Your Tech Skills Have a Two Year Half-Life

itwbennett writes "Eric Bloom, an IT leadership coach and former CIO, has answered that eternal question 'does working on old software hurt your professional marketability' with a somewhat surprising 'no.' But, Bloom adds, 'a techie's skill set from a marketability perspective has a two year half-life. That is to say, that the exact set of skills you have today will only be half as marketable two years from now.'"
  • Depends... (Score:4, Insightful)

    by Oswald McWeany ( 2428506 ) on Wednesday October 26, 2011 @03:20PM (#37847586)

    Depends really on how specific your skills are.

    Knowing, for example, the Java or .NET programming languages won't decline in value that fast. Perhaps specialising in certain specific products will, and certainly the development environment will.

    On the non-programming side, knowing the basics of computer hardware doesn't decline in value that fast either. Perhaps specialising in certain models does.

    • I was going to make a similar but converse point... as a tech generalist, much of what I do is bleeding-edge. Old knowledge is as irrelevant to me as it would be to a potential employer.

      Just as doctors are supposed to keep up to date on their skills through continuing education, technologists are expected to keep fresh on new tech trends.

      • Re:Depends... (Score:5, Interesting)

        by SoothingMist ( 1517119 ) on Wednesday October 26, 2011 @04:02PM (#37848142)
        My experience has been that one has to balance keeping up with one's technical field and avoiding chasing fads. Too often "keeping fresh on new tech trends" boils down to chasing fads and, for instance, using a new language because it is there. What I have concentrated on are the technologies needed to solve difficult customer problems as they push their own application and technological domains. To make this work I keep up a constant cycle of study-learn-work-produce. That has worked well for 35 years and keeps me in demand as a senior research engineer (Ph.D.) at 60 years of age.
      • Re:Depends... (Score:4, Insightful)

        by jd ( 1658 ) <imipak@ y a hoo.com> on Wednesday October 26, 2011 @04:15PM (#37848340) Homepage Journal

        I'd be wary about that "old knowledge". It may prove useful. There's LOTS of legacy software out there. I stay familiar with Fortran because it's still bloody good for numeric computations and it's uneconomic to translate old Fortran codes, which means I'm going to encounter it. I spent time learning about Intel's iWARP chip (brilliant design, naff implementation) and Content Addressable Memory because these are ideas that have appeared multiple times and will therefore appear again. Understanding the principles now saves me time and effort for when they become important later on.

        That's not to say I stay away from the bleeding edge. I try to split my time 50:50 between the past that I may well encounter in the future (a trait that secured me my current job) and the future that I will certainly encounter in the future (a trait that secured me my jobs at NASA and Lightfleet). Both will come up, that is inevitable, but it's not possible to know in advance which one will come up first or in what way.

        Generalizing is best done by making the fewest assumptions about the past, present and future that you can while still leaving yourself enough time to learn the skills well.*

        *This is important. 100 half-baked skills are of equal value to 100 highly-tuned future-only skills that turned out to be a dead-end. None whatsoever. Mastering a smaller set of transferable skills, legacy skills and future skills, thus being totally generalized, is the obvious ideal.

        • I don't think the point is that it won't be useful. I think, by definition, the article is simply saying that it will be half as useful.

          It stands to reason: the older a technology is, the more people in the labor market know it (including the global labor market: India, etc.)

          To be honest, I think the life of a permanent contract is an unstable one. I prefer the Japanese model (now also fading) of in-house training to fill strategic goals, coupled with bilateral loyalty to the company. It's a much more sustainable…

      • Re:Depends... (Score:5, Informative)

        by cyberchondriac ( 456626 ) on Wednesday October 26, 2011 @04:44PM (#37848748) Journal
        I had once felt that way too, but there's a distinct difference: doctors need only keep up with advancements in medicine or new discoveries about extant biological systems; the human body itself doesn't really change (not over a few millennia, anyhow). It's a relatively stationary target.
        Software, OTOH, changes drastically and constantly; it's engineered by man and can be radically altered in any number of ways on a whim, sometimes even forcing a reinvention of the wheel. It's a moving, morphing target, much of it probably driven as much by planned obsolescence and profit as by utter necessity. (Does Word really need to keep "evolving" to do what it does?) Sometimes I really wanna say "screw all this" and go start a goat farm.
        • by Dahamma ( 304068 )

          Depends on what kind of doctor you are... recent advances in surgical techniques (laparoscopy, robotic surgery, etc.) mean surgeons have to learn new techniques all the time - and ones that require a lot more precision and practice than the latest HTML standards or OS APIs...

      • "as a tech generalist, much of what I do is bleeding-edge. Old knowledge is as irrelevant to me as it would be to a potential employer."

        Then I'd bet you are not so much a "tech generalist" as you think. What I learnt about 20 years ago is basically as valuable now as it was then, and I've been building what I know now upon it ever since.

        Maybe the fact that I do almost no Microsoft work helps explain it. I certainly don't value so much what I learnt about DOS 5.2 or Windows 3.11, but what I learnt…

        • You're a ways ahead of me, but what I learned between 2004-2010 for Windows "help-desk" stuff is still good for another 3-4 years. I purposely stayed away from the harder, volatile server-side stuff, because I like Durable Knowledge. So I'm backup Helpdesk and a "line" accounting administrator.

      • Old Knowledge is NOT irrelevant. It is useful for abstraction of concepts, which are longer-lasting principles that span eras. I have known people who work on bleeding-edge stuff but don't understand concepts, and they scare the crap out of me, because they have no real understanding of what they are doing. They do it because the manual says to.

    • by jaymz666 ( 34050 )

      I think the point is that if you are an admin of a specific release of a product from a vendor, the further behind on the upgrade path you are the less useful your skills are.

    • by Kagato ( 116051 )

      Knowing Java isn't good enough anymore. For instance, a developer who just does AWT or Swing is going to be of limited use to potential employers. You have to keep up to date on the common frameworks. What are SpringSource, Hibernate, Apache, etc. up to lately in the Java space? What about other languages that execute on the JVM (e.g. JRuby, Clojure)?

      • I've never seen a job with a JRuby, Clojure or Groovy requirement. (Certainly not in my country, Hungary, at least.)

    • Re:Depends... (Score:5, Interesting)

      by jd ( 1658 ) <imipak@ y a hoo.com> on Wednesday October 26, 2011 @04:03PM (#37848148) Homepage Journal

      It also depends on when you're talking about. After the Dot-Com crash, Java programmers were hurt FAR, FAR worse than C or Fortran programmers. Shortly before Y2K, Fortran and Cobol programmers were in massive demand. (For those who argue Y2K was a hoax because nothing happened, I'd point out that after a large fortune and a larger army of coders went to work on fixing the bugs, you should have EXPECTED nothing to happen. Fixing problems after the disaster is too late.)

      So the decay curve isn't a simple one. It has bounces and bottomless pits along the way.

      However, and I can't stress this enough, staying current isn't merely a matter of learning the next feature of the old language set. To stay relevant, you MUST diversify. A coder should also be a damn good system admin and be capable of database admin duties as well. Being able to do tech writing as well won't hurt. You don't know what's going to be in demand tomorrow, you only know what was in demand when you last applied for work.

      Programmers and systems admins shouldn't specialize on one OS either. As OS/2 demonstrated, the biggest thing out there in week 1 can be a forgotten memory by week 12. The market is slow at some times, fickle at others. You don't know how it'll be, the best thing you can do is hedge your bets. If you've covered (and stay current on) Linux, a BSD Unix variant, a SysV Unix variant, Windows Server, and at least one RTOS (doesn't matter which), you'll know 98% of everything you'll need. You can learn the specific lingo needed by a specific OS implementation quickly because that's only a 2% filler and not a 100% learn from scratch.

      Although workplaces don't do sabbaticals (which is stupid), you should still plan on spending the equivalent of 1 study year for every 7 work years. (If you spend 1 hour a day practicing, relearning, or expanding your skills excluding any workside stuff, you're well in excess of what is required. I can't guarantee that an hour a day will make you invulnerable to downturns, but I can guarantee that there will never be a time, even in the worst recession, that your skills aren't in demand.)
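
      (A quick sanity check on that arithmetic, assuming an 8-hour day and roughly 250 working days a year: 1 study year per 7 work years is 1 year in 8, about 12.5% of your time. An hour a day, every day, is 365 hours against ~2,000 annual work hours, about 18% -- so the daily hour does indeed overshoot the 1-in-7 target.)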

      • Re:Depends... (Score:5, Interesting)

        by Requiem18th ( 742389 ) on Wednesday October 26, 2011 @06:40PM (#37849848)

        Half-life is still a pretty damn good analogy.

        It is one of those mysterious, non-intuitive things about the subatomic world. You see, rather than ageing uniformly, atoms randomly decide whether to decay or not. Meaning that if you have a container filled with plutonium, after 24,100 years half of the atoms will have decayed: the supply in the container has decayed as a whole, but half of the atoms in it never decayed at all.

        The result of the analogy is that every two years half of the programmers will be unmarketable (unless they acquire new skills); the other half, however, don't need to learn anything, since their exact skill set is still in demand.
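
        In code, the deterministic version of the summary's claim is plain exponential decay. A minimal sketch in C -- the two-year half-life is Bloom's figure; the starting value of 100% and the function name are illustrative:

            /* Skill marketability under a fixed half-life:
             * value(t) = v0 * 0.5^(t / half_life)           */
            #include <math.h>
            #include <stdio.h>

            static double marketability(double v0, double years, double half_life)
            {
                return v0 * pow(0.5, years / half_life);
            }

            int main(void)
            {
                for (int year = 0; year <= 10; year += 2)
                    printf("after %2d years: %5.1f%%\n",
                           year, marketability(100.0, year, 2.0));
                return 0; /* prints 100.0, 50.0, 25.0, 12.5, 6.2, 3.1 */
            }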

    • Actually, it will. Not that .NET or Java are going away within two years, but they'll evolve and develop. What you know today about them is only worth half as much in 2 years when new libraries are out and the next version of .NET requires you yet again to relearn half of what you know.

    • by jekewa ( 751500 )

      As I sit here working on a Java web app written almost 10 years ago, updated continuously since, but still with threads of old libraries and methodologies within, I think the half-life is a little bit of a weak comparison to make.

      TFA is all about staying on top of your unnamed vendor's magic moving technology. It isn't about technology skills in general. Heck, it doesn't mention anything specific either. Well, there is that one line about staying up to date with PL/SQL.

  • Suppose I know some amount (X) of C now (Just out of college)
    Will that be less valuable after having 2 years experience in the field?
    • I don't think the theory applies universally to all tech skills. C has endured well over the years. So has SQL. Other languages, not so much. I don't see many ads for Ada or Lisp these days. Your actual mileage may vary.

      • by Lennie ( 16154 )

        But how marketable is SQL? Most people already know SQL. Let's say you apply for a programming job at a web-development company and they are all using fancy "NoSQL" databases. The question is whether you know all the new stuff, and when to use it and when not to.

        • by jd ( 1658 )

          Again, which dialect? T-SQL isn't identical to PL/SQL. You are correct about NoSQL (though that's often just a fancy way to describe a subclass of key/value databases, which also includes the BerkeleyDB family, XML databases and a myriad of other styles). However, it's not limited to that. Hierarchical databases exist, as do "star" databases (data warehouses) and object-oriented databases (sometimes considered a branch of NoSQL), and indexed sequential databases are still a popular format…

          SQL is also not a static…

        • How marketable is SQL? There are two ways to look at it. Will SQL help you distinguish yourself from others and leave them in the dust? No. But just try to get hired without it.

          There are lots of legacy databases out there, and you won't be talking to them without a fair understanding of SQL. Even the niftiest of whiz-bang query tools will generate flawed SQL every so often, leaving you on your own to figure it out.

      • by TiggertheMad ( 556308 ) on Wednesday October 26, 2011 @04:12PM (#37848306) Journal
        I suspect that Bloom is referring to 'tech' skills in a general sense. Most IT people are not programmers, and thus consume rather than create software products. If you have 'skills' using Office version X, it will probably not be as valuable in two years when a new and improved product, Office X+1, comes out.

        Obviously, if you think of IT as just programmers, what he is saying makes no real sense, since staples like C, Java, and .NET have been around a while and are not going to go away.
        • If you have 'skills' using Office version X, it will probably not be as valuable in two years when a new and improved product, Office X+1 comes out.

          That makes the big assumption that few of your Office X skills translate well to Office X+1. When it comes to the upgrade from Office 2007 to Office 2010, you might have a point (stupid ribbon...), but in general that's not the case. Someone who could operate Office 95 can get around Office 2007 just fine. Someone who learned pre-ANSI C, C++98, or Java 1.1 can deal with 99% of code written in C1x, C++11, and Java 7, with a short learning curve for anything significant that has changed. It's the same pattern…

      • It's not just the "barebone language". It's the various libraries and other tidbits that are considered "essential" today because they allow rapid development. The C standard hasn't changed in ages. Still, if our programmers didn't know their way around the various libraries we have collected over the past years (and we're still collecting, adding to, replacing and eliminating them), they'd be worth less than half of what they are.

        Various other things also apply. Security is one aspect that becomes more and more important…

      • C has evolved over the years. C99 updated the language a little and updated the libraries a lot more, especially things like stdint.h. POSIX has grown threading APIs, and various other things that weren't present in the mid '90s.
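
        To make that concrete -- an illustrative fragment, not from the thread -- the exact-width types C99 added in <stdint.h>, with their companion print macros in <inttypes.h>, are exactly the kind of library growth being described:

            /* C99: exact-width types end the guessing about how wide
             * an int or long really is on a given platform.          */
            #include <stdint.h>
            #include <inttypes.h>
            #include <stdio.h>

            int main(void)
            {
                uint32_t crc = 0xDEADBEEFu;       /* exactly 32 bits, everywhere */
                int64_t  big = INT64_C(1) << 40;  /* portable 64-bit constant    */
                printf("crc=0x%" PRIX32 " big=%" PRId64 "\n", crc, big);
                return 0;
            }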

        I don't see many ads for Ada or Lisp these days

        Look at Rolls Royce. They're hiring SPARK Ada programmers like crazy, as are a lot of other aerospace companies. It doesn't really have any competitors for systems where failure is not an option. As for Lisp, the last job offer I got to write Lisp was for an investment bank.

    • by ThorGod ( 456163 ) on Wednesday October 26, 2011 @03:26PM (#37847680) Journal

      Suppose I know some amount (X) of C now (Just out of college)
        Will that be less valuable after having 2 years experience in the field?

      No, it won't. He's talking about *certain* IT skills. I'm going to go out on a limb and bet he's referring to the kind of tools you learn in a simple ITT-Tech-type certification program.

      • by jd ( 1658 )

        Last time ITT Tech was covered by any forum I read, it was soundly ridiculed as not teaching any skills worth knowing, and the certification was denounced as being utterly worthless and accepted by nobody. ITT Tech probably taught Good Governance on Numenor.

    • by Tsingi ( 870990 )

      Suppose I know some amount (X) of C now (Just out of college) Will that be less valuable after having 2 years experience in the field?

      If you haven't learned anything new in your first two years as a professional C programmer, you might want to try another discipline.

      If I haven't learned anything new in any particular two year period, I get bored. Best option there is to either shake something up with my current venue, or quit.

    • by NFN_NLN ( 633283 )

      Suppose I know some amount (X) of C now (Just out of college)
      Will that be less valuable after having 2 years experience in the field?

      School-related C skills without work experience... no, they won't be worth less in 2 years. They'll be worth exactly the same... which is diddly.

      Only the 2 years field experience will mean anything when you apply for another job. And that field experience will decline... usually because a great deal of knowledge around programming is not about knowing the language but the framework, modules and libraries your project uses. And those are continually changing.

    • With what libraries and languages have you worked in C? Won't those change? If you're a games person, are you up on DX9? DX10? 11? Database backends? SQL? NoSQL? Have your version-control skills expanded to match existing systems? Still using CVS? SVN? Git? "The Cloud"... have any of your applications been designed with that kind of focus in mind, of starting and stopping at any point and being part of a model with dynamically changing resource allocations?

      Evolving skills are a demonstration of the ability…

    • by Hentes ( 2461350 )

      No, the 2 year claim is bullshit.

    • Not less valuable, but less marketable, as the article says. They are different things. Also, I don't believe it. But the working environment around here (which I've already jumped out of) may be unusual.

      TFA sounds absurd, as it claims that marketability depends on the specific version of software you have experience with. As if somebody would hire a person who knows JSF 3.1* but not 3.2* (it claims that minor numbers aren't as important, but still puts some weight on them).

      Have you ever seen a CV that tells version numbers…

      • Yes, I have seen version numbers for platforms on CVs. I have them on my CV and I look for them in applicant CVs. I'm a Linux admin who manages two student interns (Jr. Admins). I do the screening of my interns as well as helping to screen full-time co-workers. When reading CVs, I give a higher weight to those with version numbers. I'm not too worried about minor numbers (e.g. RHEL5.4 vs. RHEL5). I'm not too worried about older versions. Version numbers act as a shibboleth to weed out the posers from the…

        • "When interviewing a potential Linux admin, I always ask what version and flavour of Linux that they have experience with."

          When interviewing a potential Linux admin, I always ask a TECHNICAL QUESTION that I know is specific to the version and flavour of Linux they claim experience with, if I even give a damn about deep specific knowledge (which most of the time I find basically irrelevant). That's what speaks to their skill, not their ability to retain some version numbers from a fast google search…

    • by jd ( 1658 )

      In some cases, yes. C coding style recommendations have changed over the years. Some C dialects have died (K&R, for example) and others have grown. The standards have shifted, so those who have learned C99 will be at a disadvantage to those who know C1x for newer code -- though the reverse will be true for middle-aged code. Ancient code could be in any of a thousand dialects.

      The market for C is growing, but the number of shifts from C to C++, C# or Java (or other languages) is also growing, and the value…
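
      (To make the dialect gap above concrete, an illustrative fragment: the same function in the dead K&R dialect and in standard C.)

          #include <stdio.h>

          /* K&R (pre-ANSI) C: parameter types declared after the
           * parameter list; no prototype for callers to be
           * checked against.                                    */
          int add_kr(a, b)
          int a;
          int b;
          {
              return a + b;
          }

          /* ANSI C and later: a real prototype, so the compiler
           * can type-check every call site.                     */
          int add(int a, int b)
          {
              return a + b;
          }

          int main(void)
          {
              printf("%d %d\n", add_kr(2, 3), add(2, 3)); /* 5 5 */
              return 0;
          }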

  • by Xugumad ( 39311 ) on Wednesday October 26, 2011 @03:22PM (#37847620)

    As a general rule I don't even list things on my CV (resume) that I have less than two years experience in, these days...

    I'm willing to accept this is the case for startups wanting the latest buzzword-filled technology, but a LOT of places are happy at a much slower pace.

    • On my CV, I list things that I have less than 2 years' experience in, but I add skill-level qualifiers like "Novice", "Intermediate", and "Expert".

  • by Kittenman ( 971447 ) on Wednesday October 26, 2011 @03:25PM (#37847662)
    .. an IT leadership coach ... uh-huh. Veiled message is "take my course, buy my book". I'm still employed using skills I learnt in 1980. Eric Bloom can get the hell off my lawn.
    • It is highly field-dependent. I traded the front-line trenches of IT security for a comfortable management position not even a year ago, and I'm already struggling to stay current with the various threats coming our way. I simply don't have the time anymore to concentrate on it as much as I used to. I'd guess in a year, what I knew a year ago will be not only obsolete but simply laughable.

      Of course, COBOL won't change in the foreseeable future, or maybe ever. For most, the reality will be somewhere in between.

    • I'm still employed using skills I learnt in 1980.

      How many skills are you using that you learned in the 1980s but didn't use at all in the 1990s? I bet it's a much lower number.

  • I call bullshit (Score:5, Interesting)

    by cartman ( 18204 ) on Wednesday October 26, 2011 @03:25PM (#37847664)

    I still program in Java which I've been doing since 1998. I also sometimes program in Python which I've been doing since 1997. Obviously some things about those languages have changed, but many things haven't.

    OO languages are fairly similar to what they were 10 years ago. As is OO design, etc. There have been large changes to frameworks etc, but there is a significant "core skill set" which transfers over.

    In my case, my skills have not become less marketable at all over the last two years. Recently I spent two years out of work (voluntarily), and when I returned to the job market I had no problem whatsoever finding a job.

    I think the half-life of skills is more like 15 years.

    • by msobkow ( 48369 )

      If you take the time to read the article, you'll see he's actually talking about how long your skills in customizing a particular release of software are viable, not about how long languages or operating systems remain relevant.

      As many companies stick with the same release of software for even longer, I question his numbers, but I don't question the theory. The lifespan of customizable products is much shorter than that of the tool-related skillsets required to do the customization. Your skills as a programmer…

      • by cartman ( 18204 )

        If you take the time to read the article, you'll see he's actually talking about how long your skills in customizing a particular release of software are viable, not about how long languages or operating systems remain relevant.

        I read the entire article before commenting. It says nothing of the sort. I don't even know where you got what you're saying. Did you read the article?

        From the article: "The longer answer is that, in my opinion, a techie's skill set from a marketability perspective has a two year half-life…"

    • by epine ( 68316 )

      My 1979 APL skills gave me a huge leg up on learning the R language in 2008, except for the tax of unlearning elegance, and the odd rust flake or two.

      Are we talking skill cycles or fashion cycles on the two year tau?

  • by gestalt_n_pepper ( 991155 ) on Wednesday October 26, 2011 @03:25PM (#37847668)

    That said, I've been coding QA software in some VB-Form language since 1994. My pay during that time has only increased. This is the first year that I've had to do anything in a C-form language.

    The unfortunate fact of the matter is that a lot of new technologies are horse puckey. C++ was an actual improvement over C. The .NET platform, for all its many faults, has actually increased my productivity, but much of the rest -- Windows Presentation Foundation, Python, OCaml, Ruby, Silverlight, et al. -- are nifty, but nobody *needs* them. Frankly, if the world standardized on Java tomorrow, and we just used extensions thereof for different platforms and purposes, we could all concentrate on getting useful work done and quit dicking around with learning the latest obscure and allegedly more elegant syntax. The best language and syntax isn't the most logically consistent one, it's the one you know. In productivity terms, human factors trump formal systems elegance every time.

    • I used a few BASICs as a teenager, as well as C, C++ & Delphi. Then I tried Perl, which I absolutely love. I've tried a little Ruby; it was okay. Currently learning some Lisp, and going to have a look at Python soon. At work I mostly use Perl/HTML/JavaScript/SQL, with a little legacy maintenance of a Delphi app whose source we've thankfully just sold off to someone else, so I can use whatever the hell I want for future desktop-only apps.

      If you're going to stick with your "one size fits all" mentality…

    • by lewiscr ( 3314 )

      I disagree. Some of my best productivity gains came from learning a new language, then never using it. Instead, I'd use all the good ideas in my "normal" programming language.

      I became a better Perl programmer after I learned Ruby. I became a better programmer (in all languages) after learning Lisp, Prolog, and Erlang.

      I last wrote a Lisp or Prolog program in the late '90s, but I use those techniques every day.

    • Some of what you say is true, but there is no reason to get all monotheistic about it.

      Different languages have their uses.

       

  • sounds about right (Score:3, Interesting)

    by tverbeek ( 457094 ) on Wednesday October 26, 2011 @03:26PM (#37847672) Homepage

    This certainly fits my experience. I'm "over 39" and have specific tech skills that date back to the early 80s. Those are worthless. I continued doing highly technical work and staying current into the late 90s, when I went back to school to build up some of my non-technical skills. Not as good an idea as it sounded. I emerged from school several years later with just enough still-marketable skills to land a tech job that offered little opportunity to further advance my skills, then got laid off from that, took a retail job as a life raft.... and now my "freshest" marketable tech skills are a dozen years old, and close to worthless. I guess it's time to get out the paintbrushes and see if I can swing a new career as an artist; at least the half-life on those skills isn't as short.

    • by mark-t ( 151149 )

      I have some specific tech skills from the early '80s as well... and some are not remotely worthless.

      For instance, I learned C in 1982.

      Or isn't knowing a specific programming language considered a specific tech skill?

    • If you are trying to market yourself with buzzword technologies and languages, then yes, your marketability decreases over time. On the other hand, if you are marketing yourself for less trendy technical work, maybe not. There are still a lot of COBOL and FORTRAN programmers out there, and they command some pretty competitive salaries. There are a lot of systems that were installed over a decade ago that work just fine and just need people to support and maintain them, along with occasionally adding an int…
    • Interesting. I learned C, Unix and RDBMSes back in the early 80s. I only use C at home for hobby projects, but I still use Unix and SQL professionally. I learned Java back around the turn of the century and it's still paying my mortgage. Frankly, I'm disappointed I can't seem to find any new positions that use any of the technologies I've learned lately (like OSGi, SOA or NoSQL databases). It's different if you're a front-end guy, I guess -- I have seen some places looking for jQuery and HTML5 experience…

    • "I took a retail job as a life raft."

      I would love to be a life raft and get paid for it.

  • an IT leadership coach

    .... riiiiight. In other words, a buzzwad!

    Even COBOL refuses to die. C, C++ and its variants are still everywhere (Objective-C for Apple's iPhone App Store) decades later. Java has outlasted the fads of Ruby and Rails. HTML has been around ... well ... since the Internet. JavaScript continues to be the #1 web scripting language.

    So no, your skills don't have a half-life of "X" number of years.

    • Well, I actually agree with your point that skill marketability does not degrade so quickly. Perhaps more importantly, I think that good employers recognize that it is far more important to be able to quickly learn new skills than it is to already possess them; being largely self-taught and having a fairly wide skill set has impressed employers more than any single point on my resume, in my experience.

      I do have to take some exception with one of your points though:

      Java has outlasted the fads of Ruby and Rails…

    • by Bozdune ( 68800 )

      "IT skills" is an oxymoron. Generic software engineering skills never lose value.

    • HTML has been around ... well ... since the Internet

      Ignoring the fact that you are about two or three decades off with that, have you looked at HTML 1.0 recently? Even HTML 2.0 from 1994 didn't have CSS or any semantic markup. HTML 4 is still pretty relevant, but that's only from 1997, and web development for the past few years has really required JavaScript (which has changed significantly over the years, by the way) and asynchronous HTTP requests - which weren't mentioned at all in HTML 4.

      In short, your post makes me think that you've never done any development…

    • It has already been pointed out that HTML did evolve a lot. I'd like to add that the ecosystem it's part of has also evolved.

      HTML 4 as written in 2011 is vastly different from HTML 4 written in 1997 (for instance, we tend not to write our sites for a specific version of a specific browser anymore). CSS as written in 2011 is vastly different from CSS written in 1996; for instance, before we had vendor prefixes we had to use hacks to present different CSS depending on browser and sometimes browser version.
  • It depends on the specific skills and industry specialization. Among (many) other things, I've been intermittently doing embedded C code for 2+ decades. If the half-life rule applied here, then my embedded C coding skills would be roughly 1/1000th as marketable today as they were 20 years ago. Embedded C is still used in the defense and avionics industries (among others)... there's still fair demand for it (though admittedly not the sort of demand there was 10 or 20 years ago).
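    (The arithmetic behind that figure: 20 years at a two-year half-life is ten halvings, and (1/2)^10 = 1/1024, or roughly 1/1000th.)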
  • For software engineering, I could agree with him that languages, IDEs, paradigms, etc., are still evolving very quickly. For all I know, they will be evolving at that speed in perpetuity.

    On the other hand, I don't think this is true for all "techies." The tools for electrical design, for instance, haven't changed much since the introduction of 2D CAD tools for PCB layout in the 1980s. If you've been soldering, prototyping, debugging, and laying circuits out for the last 20 years, chances are pretty good…
  • by Cigaes ( 714444 ) on Wednesday October 26, 2011 @03:33PM (#37847786) Homepage

    "a techie's skill set from a marketability perspective has a two year half-life"

    Well, a marketie's skill set from a technical perspective has a zero year half-life.

  • My unix "Tech Skills" are still quite marketable after more than a decade. Sure, specifics like futzing with IRIX software streams might not be useful any more, but a good 80% or more is still standard.
  • Bullshit (Score:5, Interesting)

    by cjcela ( 1539859 ) on Wednesday October 26, 2011 @03:36PM (#37847806)
    I do not know why this is on the front page, and I do not know why the educated crowd of Slashdot listens to BS from the CIO/CEO/CXO of the day and his new genius theory to quantify things he should not, mainly because he does not understand what technology is about. These guys should be in marketing. There are new technologies and old technologies, and jobs for all of them if you are good and know the right people. If you are very good at Fortran or Cobol you can get a job. If you excel at Java or C you can get a job. None of these are new technologies by far, and the skills are highly portable from one to the other. The basic knowledge you need is always sort of the same: a mix of common sense, knowledge of the basics (algorithms, data structures, and a brief background on the problem domain you are working on), and some minimum social skills.
  • ...or is it most of the posters here, who bash him?
    Did he say that C/Java/whatever decays, or only the worth of the people who use those skills?

    I can imagine an interpretation of his statement which would make sense. In my youth I coded in BASIC, Forth, Pascal... That was sometime in the middle ages. I was OK for that time, but today those skills have decayed to nothingness. I made some money on Java projects only a few years ago; since then I haven't used Java at all. I imagine it would be much more difficult for me…

  • With exactly zero evidence to back it up. The faster we ignore this entire story, the better.

  • Windows 7 may go just as long.

    Some industrial systems still run Windows 9x, ISA cards and other older stuff.

    Two years is too quick, and lots of places do long testing of new OSes/software anyway before rollouts.

    • I work at an extremely profitable semiconductor fab making the latest chips. Many of our systems run DOS. Dozens of them. We make billions (not exaggerated) of dollars with DOS variants and IBM mainframe/3270-terminal-based systems that, frankly, work very well. I have a stack of floppies on my desk that actually get used. Support for this old stuff is expensive, but it's barely a blip compared to the rest of our costs.
  • Totally flawed analogy. The figure might hold true for the latest fashion in development technology, but it's insane to think that Fortran skills, for instance, will be half as marketable 2.5 years from now. They will probably have declined by a few percent, but the difference in value between 40-year-old tech and 41-year-old tech is negligible. It's more like the value of a technology falls by 100/(2 + years-since-hot) percent every year, as sketched below.
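
    (A sketch of that alternative model in C, with illustrative numbers -- the technology starts at 100% when it is hot, and each year loses 100/(2 + t) percent, where t is years since hot. The yearly factors telescope, so after t years the value is simply 100/(t + 1) percent: half after one year, a tenth after nine. That is hyperbolic decay, much gentler on old tech than a fixed half-life.)

        #include <stdio.h>

        int main(void)
        {
            double value = 100.0;               /* percent, at peak hotness */
            for (int t = 0; t <= 10; t++) {
                printf("year %2d: %6.1f%%\n", t, value);
                value *= 1.0 - 1.0 / (2.0 + t); /* lose 100/(2+t) percent   */
            }
            return 0;                           /* value(t) == 100/(t+1)    */
        }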

  • The article is mostly about IT in the sense of database/SQL skills. When all you have to sell is the ability to code against some vendor's API, and new versions keep appearing at regular intervals, you need to keep running just to keep your place, like on a treadmill. But there are many jobs where the coding skills are essential/necessary but not sufficient. In scientific application development (developers of CAD tools like AutoCAD, Ansoft, Ansys, Fluent, Cadence, Mentor, etc.) the marketability could improve with experience…
  • Every single statement referenced the "software vendor". Every software vendor's goal is to lock you in to not thinking and just buying your way out of any problem. Saying you have technology skills because you know some software from some vendor is like saying you can play guitar because you've got high scores in Rock Band on the Xbox/PS3/Wii. Even if you know something from that "software vendor" inside and out, you don't know shit unless you understand the fundamentals under the hood of what the…

  • This is a reflection of a serious problem in the area of hiring decent techie folks. There's a difference between a "marketable" skill and a "usable" skill. A marketable skill gets you hired by people who are clueless about what makes a good techie (hardware or software) and only know buzzwords, whereas a usable skill is what the people you're going to work with and for HOPE you have. Sometimes skills overlap between marketable and usable, but my own observation is the larger the company doing the…
  • Huh ? (Score:3, Insightful)

    by unity100 ( 970058 ) on Wednesday October 26, 2011 @04:09PM (#37848252) Homepage Journal
    I've been earning my living from PHP/MySQL/HTML/CSS for the last 6 years, and I'm earning even more today -- while having to turn down potential new clients.
  • by eriks ( 31863 ) on Wednesday October 26, 2011 @04:22PM (#37848432)

    As a programmer, I can say that programming itself, that is, *how* to write code, in terms of methodology -- is a skill that will never leave you once you have acquired it (so long as you keep using it).

    Almost any programmer worth their salt can learn a new language in a few weeks, if not days. Granted it may take more time to develop understanding of any idioms or warts the language may have, but you can learn that stuff on the fly, unless you're writing HA/mission critical code, in which case, there'd better be a review process, and it's reasonable to expect that someone on the team will be an expert in the technology being used.

    So I'd say unless you've given up programming entirely and have moved on to a different career, your skills are still valuable, and will stay reasonably "fresh" even if you're writing code in a 30-year-old language (as the article says), as long as you actually think while you write code, and aren't just a copy/paste/munge wizard, not that there's anything wrong with that, for certain kinds of things.

    This of course doesn't even consider the (imho) much more valuable part of being a software developer: being able to converse with non-technical people, in whatever human language you use, and then translate that into some sort of actionable programming work. That's often more than half the battle. Then of course there is testing, testing and testing.

    The article isn't completely wrong, but (like much of the "IT industry") I think it missed the point of what skills are actually important to doing software development. Knowing how to use a specific bit of kit is pretty far down on the list, I think, for any reasonably competent programmer/technologist.

    I treat anything with the word "marketability" in it with suspicion.

  • I just left a job doing Objective-C and iOS stuff to work on a Windows application that is all C++ and OpenGL.
  • OMG, so my skill at using a keyboard has put me in a completely untenable position, career-wise. I've been using a keyboard for over 28 years now. What do you people use today, do you talk into your mouse?

  • I design and configure GUI automated testing systems. The particulars change. The principles don't. I've started to design server and virtual machine environments to control the systems more precisely and easily. Think virtual machines are going away? Doubtful. I'm sure the particulars of these too will change, but the principles won't.

  • The fact I can troubleshoot classic MacOS 7.6.1 up through 9.2.2 and a number of old-world PPC related hardware issues over the phone without being anywhere near the machine in question is hardly Buzzword Compliant in this day and age.

    The fact that I learned basic troubleshooting out of self-defense in that environment, however, gave me a great baseline for dealing with hardware and basic software issues in the general sense. While any classic MacOS-related "certifications" may be long useless, the fact that…

  • Back in the "olden" days, I grew up programming BASIC, assembly, and later C on the Vic-20 and C64.

    Now I work for a major chip manufacturer, and some of my duties involve developing bootloaders, firmware, apps, and cryptographic libraries for SoC (System on Chip) and smart-card platforms. Those old programming skills let me develop on platforms with extreme memory, storage, and performance limitations. I've noticed that this is a challenge for those young whipper-snappers who have only developed on platforms…

  • Being current in some area of software today is about knowing its defects. You can read the manual about how it's supposed to work, but knowing what actually works is essential to high productivity.

    I've been thinking about this recently in connection with Mozilla Jetpack, which is a library for making add-ons for Firefox, etc. There are two websites, a blog, a forum, a Google group, a development committee, two completely different sets of development tools, "hack sessions", and an IRC channel. There's…

  • ...instead of CS. Ohm's law has awesome resale value.
  • Based on my experience, two years is a bit of an exaggeration. Just look at Windows XP. The software is ten years old and still in widespread use. The real danger is missing out on major trends. In my own career, I almost missed out on virtualization. My employer had no plans to virtualize, and that is where the industry went. If I had stayed put, my career would have been dead.

    On the other hand, my knowledge of IT allowed me to make a move into a better position with a company that did not…
