I was chatting with a couple of old, crusty software developers who were talking about updating resumes. The theme of the conversation had turned to our advanced age (over 40!) and the impact that being so seasoned had on the marketability of developers.
Let's define "seasoned" as 20+ years into a software development career. Clearly, any developer in their 20s or 30s is probably pretty safe from age bias.
Although my buddies were pure software gurus, I hadn't written a line of code in years. (I do still dream about using a VMS debugger to figure out why my FORTRAN code isn't working.)
As a manager, I don't think age is viewed negatively until you have about 30 years in. But for developers, I do think age can have an impact on people's perceptions.
Keep in mind that under the federal Age Discrimination in Employment Act (ADEA), workers 40 and over cannot be arbitrarily discriminated against because of age in employment decisions, including hiring. But proving discrimination is almost impossible if a job candidate wasn't invited to interview because of their age.
I have written before about my past as a hiring manager, noting that when I reviewed resumes I always jumped down to the bottom to learn where the candidate went to college. As part of that quick scan, I'd naturally notice their graduation year and do the quick math.
In my eyes, someone with 20+ years of experience has more upside than downside. They typically have learned valuable lessons throughout their career, and their maturity makes them more accountable than younger programmers.
Actually, it isn't so much how old they are, but where their graduation year fell in the era of software development. Here are some milestone dates that may trigger a preconception of your skill set, depending on the type of job you are applying for.
While listening to Michael Jackson's classic music (RIP, King of Pop), computer science students were still using printed punch cards in college to write their code. I do get a kick out of the stories of people dropping their punch cards on the way to the compiler minutes before a project was due.
I'm sure most of them adjusted to the not-so-tedious move to online compilers, so this year marker is not really a big deal.
There is a good chance they didn't receive formal education on web development. Even though Tim Berners-Lee invented the World Wide Web in 1989, the mainstream explosion of software built on TCP/IP, HTTP, and HTML didn't happen until the mid-90s.
This could be a key demarcation line when weighing the value of someone's computer science education for any web development position. The good news is that any Windows-based development would have been covered around this time period, which coincided with the release of Windows NT.
Object-oriented development didn't become popular until the late 90s, after Java was released. That's not to say developers didn't learn OO concepts and design with C++ after 1989, when version 2.0 moved it into the mainstream.
And for you hard-core OO developers, yes, it is a fact that Smalltalk (the only true OO development environment) was making strides in the early 90s, but it was unlikely to be a core topic in computer science programs.