For the Apple folks, don’t get overly excited, because Windows Vista isn’t exactly cutting through the enterprise market either, and it appears SP3 will flow into the segment in large numbers. Linux, which its supporters have long touted as the best platform overall, is still under 1% on the desktop even though it has doubled its market share. If this were any other product, after this many years with this little result, we’d write it off the way we did the Commodore platform, even though Commodore was vastly more successful in its time on the desktop.
Yet the security risk alone of staying on the same platform for too long, as Symantec recently pointed out, is about as far from a good idea as I can think of. In that post Symantec correctly notes that Microsoft’s older platforms are aggressively targeted because they have a greater need to be patched, and criminals use the patches to identify the underlying exploits. In this latest instance the focus is on Office, but the post makes broad references to the patch process as the source of the attacks. The newer the platform, the fewer the exploits, and given that the current crop of attackers is no longer script kiddies but professional criminals hunting for customer and personal information, the risks have never been greater.
For the companies this isn’t trivial either. Hardware OEMs live on churn and would like it a lot better if the market swapped out hardware annually rather than every three to five years. On the other hand, one of the big trends a number of us have identified is the move toward green initiatives, and high hardware churn is about as anti-green as it gets, given what ends up in landfills. And if you are smart (and if you are reading this I’m guessing you are), you’ve learned that new OSs run best on new hardware, so if you skip one you generally skip the other.
That creates a mess and I think it is well past time we rethought the desktop.
Dell On-Demand Desktop: Pros and Cons
Dell just released a more advanced version of its alternative to thin clients, called the “On-Demand” desktop. This offering places all of the storage on the network, creating a nearly no-compromise blend of desktop hardware, centralized storage and management, and networking technology.
While you could still experience network lag, and networks would need to be optimized for high performance, this creates physical separation between the hardware and the OS. It requires the entire solution to be tuned to ensure a seamless experience, which is why the bundle includes the desktop hardware, server, storage, and networking components. Built heavily on Citrix technology, the solution costs around $1,100 per desktop and addresses much of the complexity surrounding traditional desktop management.
This is a good way to centralize the software, allowing more assured patching. It should lower overall storage requirements and increase overall control and security (both physical and virtual) by centralizing the repository someplace safer than the desktop.
The advantage of this approach is that it’s very close to what PCs currently are in terms of experience and hardware. The hardware can be reconfigured with hard drives and redeployed, which provides protection against future changes that might make a more traditional configuration more practical. And there shouldn’t be huge problems bringing in components from other vendors (though practically speaking I doubt most will, because of potential support issues with Dell).
The disadvantages are that it doesn’t really decouple the OS from the hardware; it just physically moves where the OS resides. It places a lot of heat and complexity at the client end (just like PCs), and it doesn’t handle disconnected PCs like laptops, which are set to displace desktop systems at a relatively rapid rate. This solution isn’t particularly green, either, from the standpoint of limiting hardware disposal problems or reducing overall power costs.
Next page: HP’s Thin Client and Blade PCs
HP’s Thin Client and Blade PCs: Pros and Cons
HP has taken a more aggressive approach. They’re the only vendor in their class that offers traditional desktop PCs, laptops, blade PCs, and thin client devices. Blade PCs would be better than what Dell is offering across the board if there were common standards and multiple vendors that supported this solution. This would drive down the cost and risk substantially.
The advantages of Blade PCs are similar to what you get from Dell’s diskless workstations, except you also centralize the hardware. This means hardware failure recovery can be automated, and the expensive hardware itself is protected and secure. It also shifts heat out of the working environment, along with the related noise, and can provide a near-silent experience.
The storage advantages are similar to Dell’s solution, and even though the hardware and OS aren’t separate there’s a relatively high commonality in the blades, which allows for the hardware failover. So this comes close to the goal of a virtualized environment. The blades have to be more power efficient to work in high densities and the solution has fewer things to dispose of that could be problems in landfills, making it greener than a traditional PC or diskless workstation.
The disadvantages are a redundant client (though the desktop client has recently transitioned to more of a KVM and is extremely light and inexpensive), a lack of flexibility (if you change your mind about the blade solution, you can’t repurpose the blades as something else; they don’t make good blade servers, and there’d be too many anyway), and comparatively high cost. This solution also doesn’t embrace mobile workers, and it requires constant contact between the desktop component and the remote blade.
Thin Clients trade off performance for an even higher degree of solution simplicity and overall reliability. HP recently acquired Neoware, which had actually created a thin client laptop computer. This acquisition makes HP the clear leader in Thin Client solutions in terms of size and breadth of offering.
Advantages for Thin Clients include the fact that this is potentially the greenest of the desktop technologies. The clients have very little in them, the servers can be repurposed, and the combination leaves comparatively fewer things to be discarded per desktop user than any PC configuration. When based on clusters or bladed servers, reliability can approach PBX levels. However, when there is an outage it can be relatively catastrophic, since one server failure can take out a lot of users. Everything that has value is centralized and secured, and ongoing support costs are reported to be very low.
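To make the “PBX levels” reliability claim concrete, here is a minimal sketch of the underlying availability math. The 99.9% single-server figure is a hypothetical assumption for illustration, not vendor data:

```python
def cluster_availability(single: float, n: int) -> float:
    """Probability that at least one of n independent servers is up."""
    return 1 - (1 - single) ** n

# Assume each server alone delivers 99.9% uptime (hypothetical figure).
single = 0.999
for n in (1, 2, 3):
    avail = cluster_availability(single, n)
    downtime_hours = (1 - avail) * 365 * 24
    print(f"{n} server(s): {avail:.9f} (~{downtime_hours:.3f} h downtime/year)")
```

The same math also frames the flip side noted above: redundancy only helps against independent failures, so a fault in the shared layer still takes every attached user down at once.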
The disadvantages have to do with server scalability, installation cost, the lack of a disconnected solution, and performance. If it weren’t for these shortcomings we’d be up to our armpits in thin clients, because on paper they should be vastly better than PCs. The big problem is that the servers don’t scale well, and this won’t get fixed until the massively multi-core systems Intel and AMD are working on come to market. The thin client notebook requires a constant Internet connection, which makes it impractical for travel, particularly on airplanes. Graphics performance can be painful, and these devices are typically used for data entry applications today.
Next page: The SmartPhone UMPC Desktop
The SmartPhone UMPC Desktop
We have been talking for some time about people leaving their laptops at home and living off their BlackBerries. A small minority of people actually do this, and RIM, the maker of the BlackBerry, is doing very well as a result.
On the phone side, folks typically keep their devices for two years. If you factor in the carrier subsidy, they pay almost as much for the hardware as they do for a PC over the two years they own the product, and it is vastly less expensive to support. In addition, you get the OS with the hardware. And you could argue, based on numbers in large enterprises, that RIM has been more successful there than Apple and all of the Linux distros combined have been on the desktop.
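The two-year cost comparison above can be sketched with simple arithmetic. Every figure below is hypothetical, chosen only to illustrate how a carrier subsidy hides the true hardware cost inside the monthly bill:

```python
MONTHS = 24  # typical phone ownership period

# Smartphone: subsidized up-front price plus the hardware subsidy
# recovered through the monthly plan (all figures assumed).
phone_upfront = 199
subsidy_per_month = 20
phone_total = phone_upfront + subsidy_per_month * MONTHS

# Corporate desktop PC amortized over the same two years (assumed price).
pc_total = 800

print(f"Phone hardware over 2 years: ${phone_total}")  # $679
print(f"PC over 2 years:             ${pc_total}")     # $800
```

With these assumed numbers the phone’s true hardware cost lands within shouting distance of the PC’s, which is the point: the sticker price understates what the hardware actually costs over the contract.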
Thing is, most of us can’t live off a BlackBerry. But that is because the BlackBerry starts out as a super PDA. It wasn’t until the iPhone shipped that we got a chance to see what was possible in a fully featured phone with a near-complete browser.
With most of the solutions above you need a connection, and the ability to work disconnected is non-existent. But today’s smartphones can cache their work and, while limited in their disconnected state, they are increasingly able to run ever more capable browsers, allowing them to connect back to centralized services for a more complete desktop solution.
While initially this would likely lend itself more to one of the HP back-end products, Dell could, with an intermediate hosted server, provide similar capability if it chose to. HP actually has a smartphone line but, as of this writing, hasn’t looped it into any of these solutions, though it certainly could relatively quickly.
Intel is developing their UMPC platform, which could close this gap sharply, but the most compelling devices aren’t due until next year. The closest thing in market right now to a cell phone that could be used as a laptop replacement is the HTC x7501. That’s what I currently carry and I can almost leave my laptop at home. It’s an amazing device.
This would close the last gap on all of these solutions and give you a mobile appliance to carry, tied to a secure back end service that could be more easily updated and managed. Finally we would have a blended solution that would work whether you are stationary or mobile.
Next page: Virtualization, and Into the Future…
Looking off Into the Future
Two things are coming that may accelerate the move to new hardware. First, IBM is making a run at the Thin Client market as well, and its Cell processor may have some advantages in aggressively addressing the need for multiple cores to reach the required performance. The question will be whether the hardware workarounds are too ugly.
The other is hardware virtualization, which may finally eliminate the hard connection between the OS and the hardware underneath it. Most of the work I’m seeing done by hardware companies is, interestingly enough, using embedded Linux as the host OS, and this could be how Linux actually makes it onto the desktop in volume. Granted, it may still be Windows or Mac OS on top that you live in (I’m having a hard time believing Apple will support the Linux part of this; they’ll probably use embedded BSD UNIX as the host OS).
By the end of this decade you are going to have a number of very interesting alternatives to choose from. For most of us that won’t be soon enough, and for those already living off of Smart Phones, you’re 80% of the way there today.
It may be time, however, to step back, take a breath, rethink your desktop solution, and chat with some of these folks to see if what they have can make your life, and your employees’ lives, safer and more productive today.