Although the W3C published the first XML specification in 1998, it wasn't widely adopted or taught right away. Even as late as 2000 the future of XML was still being debated, so it's a good bet college curricula didn't catch up for a few years.
Of course, everyone in college at the time was desperately trying to learn web development so they could join a startup and retire a few years after graduation. So maybe their ambition pushed them to learn it as a leading-edge technology. If not, then one concern would be that without XML foundations they may be weaker at SOA development.
This period also marked the beginning of the .NET era, and I imagine that most universities weren't teaching C#, VB.NET and ASP.NET until the early to mid-2000s. If a hiring manager is looking for a .NET developer, then anyone who graduated in the '90s is going to be more closely scrutinized.
And everyone who graduated after the bubble burst in 2000 had more realistic expectations of a long-term career, so maybe their attitudes were better than those of graduates from a few years prior.
In fact, 2001 marked another milestone, not in what technologies a candidate learned in college, but in how a candidate was taught to write code. With the release of Schwaber and Beedle's book Agile Software Development with Scrum in 2001, the old waterfall approach to managing software projects went out the window. Programmers classically trained in older methods may have a difficult time adjusting to the fast-paced, fluid Scrum methodology.
Anyone who graduated in the mid to late 2000s is (hopefully) a pretty safe bet to be well grounded in the latest and greatest development technologies and methodologies.
But hold the phone! (Landline, mobile or Skype?) Does it really matter what someone learned in their comp sci courses? Have the foundations changed that much since 1990? Many a COBOL programmer has learned OO development. Many a PowerBuilder developer has learned and excelled at .NET development.
To offset any age concerns (whether they are valid or not), it's critical that your resume shows an earnest effort to keep up with the latest innovations. For instance, if a candidate showed ongoing education, such as a master's degree or relevant certifications demonstrating a commitment to staying current in their area of expertise, then they'd be granted a reprieve.
However, if they have had no significant formal education since their bachelor's degree, then that raises a red flag.
Leaving your graduation year off your resume is likely to draw even more attention to it. Your years of experience are pretty obvious from the length of your resume, so what does omitting your graduation year actually accomplish?
What does matter, very much, are your salary requirements.
Should a programmer with 30 years of experience who is now a Java expert be paid more than a developer with 10 years of Java experience? Don't the extra years of experience count for something, even though they aren't specific to the programming language for the position?
When it comes down to it, you should have reasonable salary expectations and be proud of your experience. Highlight everything you've done throughout your career to stay current.
And if you haven't stayed current, you'd better start now. To look really hip and leading edge, go take a course on iPhone or Android development.
Then again, an estimated 75% of the world's business data is still processed in COBOL, so some of us old farts may someday find a lucrative job writing mainframe code once again. Which just goes to show that there is a place for old programmers after all!
Eric Spiegel is CEO and co-founder of XTS, which provides software for planning, managing and auditing Citrix and other virtualization platforms.