
Deploy Vista with a ‘Thin’ System Image


When deploying new operating systems, most organizations try to define a single system standard and deploy it to every system in the network. Organizations around the world use different techniques to achieve this, but nearly all of them rely on the same core deployment strategy: the system image.

A system image is a capture of a preconfigured computer, usually called the reference computer, into a single image that is then deployed to every other system. The reference computer is prepared, then depersonalized through a process called system preparation—usually with the Microsoft Sysprep tool, which readies an installation for mass reproduction—and finally captured as a system image.
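As a minimal sketch of the depersonalization step on a Vista reference computer (the exact switches depend on your build process, but these are the standard Vista Sysprep options):

:: Strip machine-specific data, force the out-of-box experience on next
:: boot, and shut the reference computer down so it can be captured
c:\windows\system32\sysprep\sysprep.exe /generalize /oobe /shutdown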

System images originated with disk imaging software, which captures the disk of the reference computer sector by sector into an image that is then used to reproduce the entire contents of that disk on other systems. Perhaps the most famous disk imaging tool is Symantec Ghost, now integrated into the Symantec Ghost Solution Suite. Several other manufacturers offer disk imaging technologies: Altiris, Acronis, LANDesk, and more.

More recently, manufacturers have moved from sector-based disk images to file-based images, which capture all of the files required to reproduce the reference computer on other machines. File-based images are better than sector-based images because they are independent of the disk structure and can therefore be applied to a wider variety of systems. Most people use system images for PCs, but they can also be used for servers.

When it comes to Windows Vista, several changes have been introduced to make this process as simple as possible.

Vista installs through a new process called Image-Based Setup (IBS). That's because Microsoft introduced a new system image format with the delivery of Vista: the WIM format. Each Vista DVD contains at least two WIM files: one for Vista itself and one for the Windows Preinstallation Environment (Windows PE), a version of Windows that runs entirely in memory and can be used to install an operating system (OS) on bare-metal machines, that is, machines that do not yet include an OS.

Because it uses single-instance storage—only one copy of any given file is stored—the file-based WIM image format can include several different editions of Vista; the edition you install is determined by the product key you enter during installation. To work with WIM images, Microsoft has introduced ImageX, a command-line tool that generates and manipulates them. Other manufacturers have also updated their tools to work with Vista images, both disk- and file-based.
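As an illustration, and assuming ImageX from the Windows Automated Installation Kit is available in your Windows PE environment, inspecting and capturing WIM images looks roughly like this (drive letters, paths, and image names are placeholders):

:: List the Vista editions stored in the install image on the DVD
imagex /info d:\sources\install.wim

:: Capture the sysprepped reference system drive into a new WIM,
:: using maximum compression to keep the image small
imagex /compress maximum /capture c: n:\images\vista_reference.wim "Vista reference build"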

Image-Based Setup basically copies the Vista WIM to a computer and then expands it to perform the installation. This IBS process is the same whether you perform an upgrade over an existing operating system or a new installation on a bare metal machine. In the latter case, the installation first boots into Windows PE, then copies the Vista WIM and expands it.
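On a bare-metal machine the same idea can be scripted by hand from Windows PE; a rough sketch, assuming Windows PE has access to a drive N: holding the image and a diskpart answer file (all paths, drive letters, and file names are placeholders):

:: Prepare the disk using a diskpart script (create, format, and
:: activate a single NTFS partition assigned the letter C:)
diskpart /s n:\scripts\prepare-disk.txt

:: Apply the first image contained in the WIM onto the new C: drive
imagex /apply n:\images\vista_reference.wim 1 c: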

Other installation changes include the ability to install to different hardware without using a different image. Hardware-specific installations are tied to the Windows Hardware Abstraction Layer (HAL), the component that makes Windows work with specific hardware. Different vendors often require different HALs, and in previous versions of Windows you needed one image for each HAL in your network.

With Vista, the image is HAL-independent, reducing the need for multiple images. In addition, Vista is language agnostic. This means that it installs without any language; language packs are applied at installation and used to personalize the installation to your requirements. Because of this, it is possible to include multiple language packs in the same image and determine which applies during installation. These features bring the concept of one single worldwide image much closer to reality.

Traditional Imaging Techniques

Organizations use lots of different methods to build their reference computers and therefore their system images. One common aspect of system imaging is that of categorizing the types of computer systems in use in your organization. Each different IT role in your organization will require different system configurations both in terms of hardware and software. Someone playing a generic IT role—that of information worker, for example—only really requires the basic operating system, basic utilities such as anti-virus and anti-malware tools, and basic productivity tools such as Microsoft Office on their system.

This basic image is easy to create and reproduce. The problem occurs when technicians need to build images for other roles in the organization—configurations that must be customized to specific job functions. For example, a developer will require considerably more products than an information worker. The same applies to a graphic artist, and in that case involves custom hardware as well as software. In some cases, organizations have dealt with this problem by building multiple reference computers—one for every single IT role. Each reference computer is then captured as a different system image. In some organizations, this can mean as many as 100 system images—images which need to be updated and maintained as patches and upgrades become available for their contents.

These ‘monolithic’ or ‘thick’ images require considerable effort to maintain. Remember, every reference PC must be depersonalized before capture. To maintain an image, then, it must be re-personalized, updated, and depersonalized again before being re-captured. While some imaging tools, notably Microsoft’s new ImageX, allow offline mounting of a system image for maintenance purposes, it is always simpler to update a real system interactively than to inject updates into an offline-mounted image.
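For reference, offline maintenance with ImageX follows a mount, modify, and commit pattern, roughly as sketched below (paths are placeholders, and injecting actual updates requires a separate offline servicing tool such as Package Manager):

:: Mount image 1 of the WIM read/write so its files can be serviced
imagex /mountrw n:\images\vista_reference.wim 1 c:\mount

:: ...update or replace files under c:\mount here, for example by
:: injecting packages with an offline servicing tool...

:: Write the changes back into the WIM and release the mount
imagex /unmount /commit c:\mount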

Thick images also require lots of storage space, both in the central locations where they are created and maintained and in the remote sites where they must be duplicated to avoid transferring and applying them over wide area network (WAN) links. For example, a single Vista image takes up more than two gigabytes (GB) even before it is personalized; adding applications and utilities only makes it grow. The more thick images you maintain, the more storage space you will require.
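For illustration only, with hypothetical numbers: ten 5 GB thick images mirrored to twenty remote sites already tie up roughly 10 × 5 GB × 20 = 1,000 GB of storage, before a single PC has even been deployed.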

Build a Better Image

To reduce image management overhead, you need to rethink how you build images in the first place. Yes, it is a great idea to provision systems based on IT roles in the organization, and yes, it is a good idea to create reference computers and build images from them, but there are techniques you can use to simplify the process.

Begin by using a logical system model to build your standard operating environment (SOE). For example, we rely on the Point of Access to Secure Services (PASS) model. This model divides each required component into layers (see graphic below):

• The hardware layer helps reduce complexity by relying on standards and standard configurations. Systems are purchased in lots to reduce diversity as much as possible.

• The PASS System Kernel is built from seven layers, much like the OSI networking model, that together deliver services to end users. This kernel is deployed to every PC and, when properly constructed, can address the needs of over 50 percent of the end users in your organization.

• The Role-based Application Layer is applied on top of the kernel and addresses the special needs of roles beyond the information worker role. For example, an organization with 2,500 users might have eight special IT roles, each applied as needed to end-user PCs.

• The Ad Hoc Application Layer provides single applications that do not fit a particular IT role to the end users who need them. These applications are rare and usually account for less than five percent of the applications in your network.

Besides dividing image components into layers, the PASS model helps create a single view of the system stack. It also relies on other key technologies to reduce image overhead.

Figure: The PASS Logical System Construction Model

These technologies include:

• OS deployment technologies, ideally with support for multicasting, the ability to send a single stream of data to multiple PCs at once to load the system kernel. Microsoft’s ImageX, for example, does not support multicasting; bearing in mind that images are multiple gigabytes in size, not being able to multicast adds considerable time to the deployment.

• Software distribution systems, which are used to deploy the role-based application layer, ideally delivered as a group of applications on top of the kernel. The same tool delivers the ad hoc application layer when it is required. This approach also keeps the image smaller, since custom applications are not part of the initial image.

• Virtual machine technology in support of the reference computer build. By using a virtual machine to build and maintain the reference PC, organizations can save considerable time. Since a virtual machine is nothing more than a set of files in a folder, you can simply make a copy of the actual reference PC before you depersonalize it for imaging. This means you never need to rebuild or re-personalize your reference PC again.
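Since the reference virtual machine is just a folder of files, that archival step before depersonalization can be a plain copy; a minimal sketch using robocopy (folder names are placeholders, and the VM should be shut down first):

:: Archive the personalized reference VM before running Sysprep on it,
:: so the build can later be restored instead of rebuilt
robocopy "d:\vms\vista-reference" "d:\vms\vista-reference-archive" /e /copyall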

Windows Vista’s new feature set works very well with these technologies when delivering images to your PCs.

Move to a ‘Thin’ Image

The factor that will make complex, thick images a thing of the past is a new technology arriving alongside Vista: software, or application, virtualization.

Application virtualization protects the core operating system at all times from any changes performed during the ‘installation’ of an application. That is because application virtualization captures not an installation but the running state of the application. Since the running state is captured, no actual installation is required: you simply copy the application to the system and then activate it. Virtualization solutions also prevent the conflicts applications can otherwise cause with one another.

For example, organizations have successfully virtualized Microsoft Access 2000, 2003, and 2007 and run all three at the same time on the same PC. What’s even better is that a virtualized application becomes OS-agnostic; that is, it relies on the virtualization layer to make it work properly on the OS. This means the same virtualized application will work on both Windows XP and Windows Vista.

In addition, virtualized applications can be streamed to PCs. Streaming uses the same approach as video or audio streaming, delivering content to endpoints in a continuous stream. When enough content has arrived, the application can be launched. Microsoft Word, for example, needs only about 160 kilobytes (KB) to start working, which makes it much easier to deploy. Several vendors offer application virtualization solutions: Microsoft SoftGrid, Altiris Software Virtualization Solution, and Thinstall are a few options.

With the advantages inherent in application virtualization and software streaming, you can move to the concept of a ‘thin’ system image: deploy a smaller OS system image, then deploy a generalized application layer (GAL) to achieve the same result you would obtain with a thick image.

Figure: The Thin Kernel with the GAL

This gives you the best of all worlds: thin OS kernels are multicast to endpoints, and as soon as a system is up, GAL contents are streamed to it. Role-based contents, if required, are also streamed to the users who hold that IT role. Images are much easier to maintain and far fewer in number. You can’t quite get to one single worldwide image, because despite its best efforts Microsoft was not able to create an image that works on both 32-bit and 64-bit systems, but you can get down to two thin images. Now, that’s smart system management!
