Ten Tech Blunders: Whoops, We Stepped in It!

From Linux to the kingdom of Redmond, from Apple to digital music, some great snafus in the wonderful world of technology.
(Page 1 of 10)

Blunders. We all make them. With the exception of technology journalists – who are wise, mistake-free individuals – all people and companies commit serious snafus. They simply mess up, big time.

Fortunes are lost; wrong paths are taken. Minor concerns are treated as major problems, while looming disasters are disregarded. Visionary CEOs turn out to be blithering idiots (the most common source of blunders).

The field of technology is particularly susceptible because change is so constant. When the world turns upside down every 18 months, true foresight is required to look smart. No one can see ahead all the time.

The following list, then, is only a partial account. But of the many missteps, mess-ups, miscalculations and outright step-in-the-doodoo blunders, these are some of the choicest.

1) The Apple OS Decision

The Blunder:

Apple refused to sell its OS separately from its hardware, forever consigning itself to a tiny market share.

What Happened:

In the late 1970s the personal computer business was wide open, a veritable desktop Wild West. Nobody knew who would emerge as the dominant player.

For a period, Tandy’s fearsome hot rod of a PC, the TRS-80, looked like a winner. It had a deluxe cassette back-up system and its own word processing software, Scripsit, which enabled you to set your own text margins.

The Commodore PET, with its stylish black-and-green monitor and rugged metal case, included a built-in tape back-up system, the Datassette. Its mini-sized “chiclet keyboard” was hard to use, but at least it came with a keyboard, unlike some systems.

An early front-runner, Apple, launched the Apple II and soon thereafter, the improved Apple II+. It was easy to use – not just for techies – and boasted attractive color graphics and a hot spreadsheet program, VisiCalc. Apple used an open architecture; its many slots allowed you to attach third party gear like memory extenders or graphic cards.

Its architecture was so open that by 1980 various manufacturers were selling Apple clones – a move Apple hired lawyers to quash. The only hardware that would run the Apple OS would be made by Apple, thank you very much.

The tech world’s 5,000-pound gorilla, IBM, recognized a lucrative market when it saw one, and entered the PC market with its full weight. Seeing the competition from Apple, IBM opted for a similarly open architecture. In fact, it was even more open – the company actually published its BIOS specifications. The IBM PC, released in 1981 with an awe-inspiring 640KB of memory (if fully loaded), was a huge success.

Over the next several years, IBM’s decision to opt for open architecture defined the PC industry. But to IBM’s chagrin, the open hardware specs allowed companies like Compaq and Dell to sell clones, boxes that got cheaper and cheaper. A generation of PC buyers realized something: buy an inexpensive machine (pre-installed with Windows) and you were ready to roll. Who needed IBM?

For a brief moment as the clone market was zooming upward, Apple had a chance to license its OS and continue to be a top player. (In 1984, Apple’s annual sales of $1.5 billion dwarfed tiny Microsoft’s $98 million.) But, having sealed its fate with its anti-clone stance, Apple was left behind. Later, it realized the mistake and briefly allowed clones; but the time had passed.

The company’s original anti-clone decision was an expensive one. In April 2007, by one count, Apple had a whopping five percent share of the personal computing business.

Moral of the Story:

If you have a choice between the software and the hardware business, it’s usually more lucrative to choose the software business.

