Too Old To Write Software? Or Just the 'Wrong Era'?

Rightly or wrongly, age is considered when hiring developers, yet a more important indicator is what micro-era in software history a programmer belongs to.
(Page 1 of 2)

“Man, you are crazy if you put your college graduation date on your resume.”

I was chatting with a couple of old, crusty software developers who were talking about updating resumes. The theme of the conversation had turned to our advanced age (over 40!) and the impact that being so “seasoned” had on the marketability of developers.

Let’s qualify “seasoned” as having 20+ years in a software development career. Clearly, any developer in their 20’s or 30’s is probably pretty safe from age bias.

Although my buddies were pure software gurus, I hadn’t written a line of code in years. (I do still dream about using a VMS debugger to figure out why my FORTRAN code isn’t working.)

As a manager, I don’t think age is viewed negatively until you have about 30 years in. But for developers, I do think age can have an impact on people’s perceptions.

Keep in mind that under the federal Age Discrimination in Employment Act (ADEA), workers 40 and over cannot be arbitrarily discriminated against because of age in employment decisions, including hiring. But proving discrimination is almost impossible if a job candidate wasn’t invited to interview because of their age.

I have written before about my past as a hiring manager, noting that when I reviewed resumes I always jumped down to the bottom to learn where the candidate went to college. As part of that quick scan, I’d naturally take note of their graduation year and do the quick math.

In my eyes, someone with 20+ years of experience has more upside than downside. They typically have learned valuable lessons throughout their career, and their maturity makes them more accountable than younger programmers.

Not When, But What Era

Actually, it isn’t so much how old they are as where their graduation year fell in the history of software development. Here are some milestone dates that may trigger a preconception about your skill set, depending on the type of job you are applying for.

1984:

While listening to Michael Jackson’s classic music (RIP King of Pop), computer science students were still using punch cards in college to write their code. I do get a kick out of the stories of people dropping their stack of punch cards on the way to the card reader minutes before a project was due.

I’m sure most of them adjusted to the far less tedious move to interactive, online compilation, so this year marker is not really a big deal.

1993:

There is a good chance they didn’t receive formal education on web development. Even though Tim Berners-Lee invented the World Wide Web in 1989, the mainstream explosion of software built around TCP/IP, HTTP, and HTML didn’t happen until the mid-90’s.

This could be a key demarcation line when looking at the value of someone’s computer science education for any web development position. The good news is that any Windows-based development would have been covered around this time period, which coincided with the release of Windows NT.

1996:

Object-oriented development didn’t become popular until the late 90’s, after Java was released in 1995. That’s not to say developers didn’t learn OO concepts and design with C++ after 1989, when version 2.0 moved the language into the mainstream.

And for you hard-core OO developers: yes, it is a fact that Smalltalk (the only true OO development environment) was making strides in the early 90’s, but it was unlikely to be a core topic in computer science programs.

Next Page: Salary and software development experience

