Tuesday, September 17, 2024

Too Old To Write Software? Or Just the ‘Wrong Era’?


“Man, you are crazy if you put your college graduation date on your resume.”

I was chatting with a couple of old, crusty software developers who were talking about updating resumes. The conversation had turned to our advanced age (over 40!) and the impact that being so “seasoned” had on the marketability of developers.

Let’s qualify “seasoned” as 20+ years into a software development career. Any developer in their 20’s or 30’s is probably safe from age bias.

Unlike my buddies, who were pure software gurus, I hadn’t written a line of code in years. (I do still dream about using a VMS debugger to figure out why my FORTRAN code isn’t working.)

As a manager, I don’t think age is viewed negatively until you have about 30 years in. But for developers, I do think age can have an impact on people’s perceptions.

Keep in mind that under the federal Age Discrimination in Employment Act (ADEA), workers 40 and over cannot be arbitrarily discriminated against because of age in employment decisions, including hiring. But proving discrimination is almost impossible if a job candidate wasn’t invited to interview because of their age.

I have written before about my past as a hiring manager, noting that when I reviewed resumes I always jumped down to the bottom to learn where the candidate went to college. As part of that quick scan, I’d naturally notice their graduation year and do the quick math.

In my eyes, someone with 20+ years of experience has more upside than downside. They typically have learned valuable lessons throughout their career, and their maturity makes them more accountable than younger programmers.

Not When, But What Era

Actually, it isn’t so much how old they are, but where their graduation year falls in the history of software development. Here are some milestone dates that may trigger a preconception about your skill set, depending on the type of job you are applying for.

1984:

While listening to Michael Jackson’s classic music (RIP King of Pop), computer science students were still writing their code on punch cards in college. I still get a kick out of stories of people dropping their punch card decks on the way to the card reader minutes before a project was due.

I’m sure most of them adjusted just fine to the far less tedious world of online compilers, so this year marker is not really a big deal.

1993:

There is a good chance they didn’t receive formal education on web development. Even though Tim Berners-Lee invented the World Wide Web in 1989, the mainstream explosion of software built on TCP/IP, HTTP and HTML didn’t happen until the mid-90’s.

This could be a key demarcation line when looking at the value of someone’s computer science education for any web development position. The good news is that any Windows-based development would have been covered around this time period, which coincided with the release of Windows NT.

1996:

Object-oriented development didn’t become popular until the late 90’s, after Java was released. That’s not to say developers weren’t learning OO concepts and design with C++ after 1989, when version 2.0 moved the language into the mainstream.

And for you hard-core OO developers, yes, it is a fact that Smalltalk (the only true OO development environment) was making strides in the early 90’s, but it was unlikely to be a core topic in computer science programs.


1999:

Even though the World Wide Web Consortium (W3C) began work on XML in 1996 and published the 1.0 specification in 1998, it wasn’t widely accepted and taught until a few years later. Even as late as 2000 the future of XML was being debated, so it’s a good bet that college curriculums didn’t catch up for a few years.

Of course, everyone in college at this time was desperately trying to learn web development so they could join a startup and retire a few years after graduation. So maybe their ambition pushed them to learn XML as a leading-edge technology. If not, then one concern would be that without XML foundations they may be weaker at SOA development.

2001:

This marked the beginning of the .NET era, and I imagine that most universities weren’t teaching C#, VB.NET and ASP.NET until the early to mid-2000’s. If a hiring manager is looking for a .NET developer, then anyone who graduated in the 90’s is going to be more closely scrutinized.

And anyone who graduated after the bubble burst in 2000 had more realistic expectations of a long-term career, so maybe their attitudes were better than those of the people who graduated a few years prior.

In fact, 2001 marked another milestone when considering not what technologies a candidate learned in college, but how a candidate was taught to write code. With the release of Schwaber and Beedle’s book Agile Software Development with Scrum in 2001, the old waterfall approach to managing software projects went out the window. Programmers classically trained in older methods may have a difficult time adjusting to the fast-paced, fluid Scrum methodology.

Anyone who graduated in the mid to late 2000’s is (hopefully) a pretty safe bet to be well grounded in the latest and greatest development technologies and methodologies.

But hold the phone! (Landline, mobile or Skype?) Does it really matter what someone learned in their comp sci courses? Have the foundations changed that much since 1990? Many a COBOL programmer has learned OO development. Many a PowerBuilder developer has learned and excelled at .NET development.

To offset any age concerns (whether they are valid or not), it’s critical that your resume shows an earnest effort to stay up on the latest and greatest innovations. For instance, if a candidate showed ongoing education, such as a master’s degree or relevant certifications demonstrating a commitment to staying current in their area of expertise, then I’d grant them a reprieve.

However, if they have had no significant formal education since their bachelor’s degree, then that raises a red flag.

Not putting your graduation year on your resume is likely to draw even more attention to it. After all, your years of experience are pretty obvious from the length of your resume, so what does leaving off your graduation year really accomplish?

Experience and Salary

What does matter very much is your salary requirement.

Should a programmer with 30 years of experience who is now a Java expert be paid more than a developer with 10 years of Java experience? Don’t the extra years of experience count for something, even though they aren’t specific to the programming language for the job position?

When it comes down to it, you should have reasonable salary expectations and be proud of your experience. Highlight everything you’ve done throughout your career to stay current.

And if you haven’t stayed current, you’d better start now. To look really hip and leading edge, go take a course on iPhone or Android development.

Then again, about 75% of the world’s business data is still processed in COBOL, so some of us old farts may someday find a lucrative job writing mainframe code once again. Which just goes to show that there is a place for old programmers after all!


Eric Spiegel is CEO and co-founder of XTS, which provides software for planning, managing and auditing Citrix and other virtualization platforms.
