Deploy Vista with a 'Thin' System Image

Traditional Imaging Techniques

Organizations build their reference computers, and therefore their system images, in many different ways. One common practice is to categorize the types of computer systems in use in the organization. Each IT role requires a different system configuration, in terms of both hardware and software. Someone in a generic role, such as an information worker, really needs only the base operating system, basic utilities such as anti-virus and anti-malware tools, and standard productivity tools such as Microsoft Office.

This basic image is easy to create and reproduce. The problem arises when technicians need to build images for other roles in the organization, each customized to a specific job. For example, a developer requires considerably more products than an information worker. The same applies to a graphic artist, whose configuration involves custom hardware as well as software. Some organizations have dealt with this problem by building multiple reference computers, one for every IT role, and capturing each as a separate system image. In some organizations, this can mean as many as 100 system images, each of which must be updated and maintained as patches and upgrades become available for its contents.

These ‘monolithic’ or ‘thick’ images require considerable effort to maintain. Remember that every reference PC must be depersonalized before capture. To maintain an image, then, technicians must re-personalize the reference system, update it, and depersonalize it again before re-capturing it. While some imaging tools, notably Microsoft’s new ImageX, allow offline mounting of a system image for maintenance purposes, the process is always simpler when it involves an interactive update of a live system rather than the injection of updates into an offline-mounted image.
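As a rough illustration, here is what the two maintenance paths look like with the Vista-era Windows AIK command-line tools (Sysprep, ImageX and Package Manager). The image file, mount folder and update package names below are placeholders for this sketch, not values from the article.

    REM Interactive path: update the live reference PC, then
    REM depersonalize it again so it can be re-captured.
    C:\Windows\System32\sysprep\sysprep.exe /generalize /oobe /shutdown

    REM After rebooting into Windows PE, re-capture the system partition.
    imagex /capture C: C:\images\infoworker.wim "Information Worker" /compress maximum

    REM Offline path: mount the image read/write, inject an update
    REM package with Package Manager, then commit the changes.
    imagex /mountrw C:\images\infoworker.wim 1 C:\mount
    start /w pkgmgr /o:"C:\mount;C:\mount\Windows" /ip /m:C:\updates\Windows6.0-KB000000-x86.cab
    imagex /unmount /commit C:\mount

The offline path avoids a full Sysprep cycle, but each package must be injected individually and cannot run its own interactive installer, which is why a live update of the reference PC is usually the simpler route.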

Thick images also consume a lot of storage space, both in the central locations where they are created and maintained and in the remote sites to which they must be duplicated to avoid transferring and applying them over wide area network (WAN) links. For example, a single Vista image takes up more than two gigabytes (GB) even before it is personalized; adding applications and utilities only makes it grow. The more thick images you maintain, the more storage space you will require.

Build a Better Image

To reduce image management overhead, you need to rethink how the image is built in the first place. Yes, it is a good idea to provide systems based on IT roles in the organization, and yes, it is a good idea to create reference computers and build images from them, but there are techniques you can use to simplify the process.
