I've spent most of my career in industry, but I've also spent some years in academia. After some time at the Defense Advanced Research Projects Agency (DARPA), I started a software design and development company, which I ran for eight years before spending some time at both George Mason and Drexel Universities -- before becoming SVP and CTO of CIGNA and then of Safeguard Scientifics, Inc.
I returned to academia in 2001 with high hopes of bridging the gap between what happens in the classroom and what happens in the trenches. Here's what I saw when I returned to the ivory tower.
First, I saw the aftermath of the dot-com meltdown. Students who had been ecstatic about computer science, computer engineering and management information systems (MIS) were no longer very excited about careers in these fields, mostly because of the collapse of the job market.
The eBusiness frenzy was all but over, replaced by a completely different perspective on the nature of technology's contribution to business. From 1999 to 2004 the whole world altered its view of technology from strategic to operational. One of the worst moments of this trend was punctuated by Nicholas Carr's now way-too-famous "IT Doesn't Matter" article, published in May 2003 in the Harvard Business Review. Carr argued that information technology was essentially a commodity with little promise as a strategic differentiator.
Here's what I see now.
It seems to me that educators need to focus on where the field is today -- and is likely to be in three to five years -- if not 10. I think that Nick Carr was half right: Infrastructure technology is a commodity, but there's still a ton of strategic leverage to be gained through the efficient application of (especially) front office/customer-facing technology.
I think that the way we acquire, deploy and support (especially) infrastructure technology and our applications portfolios is changing so fast and so profoundly that the data centers of the 1990s will be unrecognizable by 2010. I think that the whole business-technology relationship is morphing into a symbiotic partnership that will require CIOs and CTOs to understand business models and processes as well as, or better than, computing and communications technology.
So what should we teach? Business technology alignment should begin in the classroom, shouldn't it?
Here's what I think we should do -- please tell me if it makes sense:
Computer science programs need to focus less on programming languages and much, much more on architectures, integration and interoperability. Much less on algorithms and discrete structures and much more on software engineering best practices.
In effect, I'm suggesting that computer science jettison its strict mathematical foundations in favor of courses (and internships) that link operating systems, data architectures and algorithmic problem-solving techniques to specific classes of problems that graduates will face when they hit the trenches. While the next version of Microsoft Office has to be written by someone, I'd prefer it if our software architects and engineers treated problem-solving holistically, anticipating the new Web Services-based service-oriented architectures (and event-driven architectures) likely to re-define the way we all think about software applications and transaction processing.
I think academic MIS programs have a tougher challenge. Since most MIS programs are in business schools, there's more pressure on the graduates to link what they've been taught to real problems -- a greater pressure to demonstrate relevance. I think MIS programs need to acknowledge Carr's commodity challenge and distinguish between "strategic" and "operational" technology, the latter of course being the commodity.
Since MIS majors tend to be technologically broad rather than deep, the strategic/operational distinction is actually a very useful one: Why not focus more on strategic technology than commoditized operational technology? And what about technology management? With more and more outsourcing, it seems to me that project, portfolio and vendor management might be good skills to develop.
What might an MIS curriculum look like? In addition to "the basics" like data communications, database management and enterprise applications, 21st-century MIS programs could focus on business analytics, supply-chain optimization, digital security and lots of technology management skills. Over and over again I hear companies express interest in hiring people who know how to write business cases for technology projects, how to manage projects (and portfolios), how to manage vendors and how to communicate all this effectively, both orally and in written documentation (including, of course, killer PowerPoint presentations). A third option would be to "verticalize" MIS curricula, re-defining courses around the requirements of specific industries, like pharmaceuticals, financial services, manufacturing and insurance. All of this would result in three or four curriculum layers: one for the basics, one for strategic technology, one for technology management, and one optional layer that's vertical.
Are we on the right track? What do you think computer science and MIS graduates need to know to succeed? What knowledge and skills are you looking for? Please let me know. Thanks.