One of the major advantages Linux offers is cross-platform
functionality. Far from being a PC operating system that has been
extended to other uses, it runs on cell phones, mainframes and
everything in between.
This offers IT departments the possibility of using Linux to consolidate
resources into a single skill set, or at least a single OS.
The danger, however, is that because developers are free to customize
it extensively, Linux may fork into a number of incompatible versions.
“The differences between the versions of Linux can increase the level of
effort on the part of the system administrators, as software installations
and verification can vary from system to system,” says Rob Pennington,
CTO and head of the Innovative Systems Laboratory at the National Center
for Supercomputing Applications. “The different OS types and versions
make it very time consuming to verify that all the pieces (libraries,
compilers, file systems, etc.) work together as expected.”
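The verification Pennington describes can start with something as simple as comparing toolchain fingerprints machine by machine. The following sketch is a minimal illustration rather than NCSA’s actual tooling: it gathers the libc, kernel and compiler versions on one node, so that the JSON output from each node can be diffed to spot mismatched pieces.

# A minimal sketch of cross-node verification: collect a "fingerprint" of
# the local toolchain so administrators can diff it across nodes or
# clusters. Standard library only; the gcc call assumes a compiler exists.
import json
import platform
import shutil
import subprocess

def toolchain_fingerprint():
    """Gather libc, kernel and compiler versions for one machine."""
    info = {
        "libc": " ".join(platform.libc_ver()),  # e.g. "glibc 2.31"
        "kernel": platform.release(),
        "machine": platform.machine(),
    }
    gcc = shutil.which("gcc")
    if gcc:  # compiler versions often differ between distributions
        info["gcc"] = subprocess.run(
            [gcc, "-dumpversion"], capture_output=True, text=True
        ).stdout.strip()
    return info

if __name__ == "__main__":
    # Run on each node, then diff the outputs to find mismatched pieces.
    print(json.dumps(toolchain_fingerprint(), sort_keys=True, indent=2))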
Pennington has five Linux clusters under his control, including the
16-teraflop Tungsten (W) cluster, which runs Red Hat Linux on 2,560 Intel
Xeon processors. Several of the clusters are part of the TeraGrid, a
40-gigabit-per-second network linking computers at nine universities and
national laboratories.
“A researcher should be able to use any of the systems without having to
know the paths to all of the necessary tools, the methods to submit jobs
for execution or the paths for the storage systems,” he adds. “This would
appear to be simple, but it becomes very complex when this same type of
goal is applied to multiple sites, such as those within the TeraGrid.”
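The uniform front end he is calling for reduces to a simple pattern: one table of per-site details, one call that hides them, so a researcher’s script never hard-codes a path or a batch command. The sketch below is purely illustrative; the site names, paths and submit commands (qsub for PBS, bsub for LSF) are stand-ins, not TeraGrid’s actual configuration.

# A toy illustration of the abstraction Pennington describes: one table
# of per-site details, two functions that hide them. All site names,
# paths and submit commands below are hypothetical.
import subprocess

SITES = {
    "siteA": {"compiler": "/opt/gcc/bin/gcc",
              "scratch": "/scratch/users",
              "submit": ["qsub"]},
    "siteB": {"compiler": "/usr/local/bin/gcc",
              "scratch": "/gpfs/scratch",
              "submit": ["bsub"]},
}

def tool_path(site: str, tool: str) -> str:
    """Resolve a site-specific path so user scripts never hard-code it."""
    return SITES[site][tool]

def submit_job(site: str, script: str) -> None:
    """Submit a job without the user knowing the site's batch system."""
    subprocess.run(SITES[site]["submit"] + [script], check=True)

# A researcher's workflow then reads the same at every site:
# submit_job("siteA", "simulate.sh")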
But the problem of Linux compatibility doesn’t just affect those
developing high-end research applications. Software vendors are also
significantly affected.
There are more than 380 Linux distributions, after all, and developers
need to make sure their products function well on at least the major ones
if their efforts are to be profitable.
“In the beginning, end users, application developers and system
administrators were delighted to have the flexibility to make very
personal corporate decisions,” says William Hurley, senior analyst for
Enterprise Strategy Group in Portland, Ore. “Though there is a lot of
freedom in that, the ultimate long-term goal is to standardize on a class
of technologies, not just within the organization but on explicit or de
facto industry standards, so it is easier to apply complementary
technologies.”
Kernel Control
The danger with Linux does not lie in the kernel itself. Although many
independent developers contribute their labor, what gets released
publicly is firmly under the control of Linus Torvalds.
“At the kernel itself, the community is very disciplined, so you don’t
see the kernel forking,” says Bill Weinberg, an open source architecture
specialist who works for the Open Source Development Labs (OSDL) in
Beaverton, Ore. “But there is the potential for divergence among some of
the Linux distributions, which makes it challenging for vendors to ship
shrink-wrapped software without having a lot of installation and
maintenance challenges across distributions.”
To make the job easier, several groups are creating standards and tools
to ensure software interoperability. They include:
The Consumer Electronics Linux Forum (CELF), formed in 2003 by eight
major consumer electronics companies (Hitachi, Matsushita, NEC, Philips,
Samsung, Sharp, Sony and Toshiba) that were later joined by more than 50
others, including IBM and LG Electronics. Its initial specifications,
released last year, covered topics such as reducing power consumption,
graphics functions and security;
The Open Source Development Labs (OSDL), headquartered in Beaverton,
Ore., which now employs Linus Torvalds and Andrew Morton to oversee the
development and maintenance of the Linux kernel. Among other activities,
the OSDL is creating standard Linux versions and test suites for certain
types of installations: Carrier Grade Linux, Data Center Linux and five
desktop profiles for different types of uses;
The Free Standards Group’s Linux Standard Base (www.linuxbase.org)
project. The LSB Specification defines the binary environment in which an
application executes, allowing both Linux distributors and application
vendors to develop to a common standard and ensuring interoperability.
Companies also can build their custom applications to the standard and
know they will run on any compliant version of Linux, as well as on Unix
servers.
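In practice, an installer can query a target system’s LSB conformance before unpacking anything. The sketch below shells out to lsb_release, the query command the LSB itself mandates; the parsing and gating logic around it is our own illustration, not part of the specification.

# A minimal sketch of how an installer might check LSB conformance.
# lsb_release is the LSB-mandated query command; the logic around it
# is illustrative only.
import shutil
import subprocess

def lsb_info() -> dict:
    """Return the fields reported by `lsb_release -a`, if available."""
    if shutil.which("lsb_release") is None:
        return {}
    out = subprocess.run(["lsb_release", "-a"],
                         capture_output=True, text=True).stdout
    return {k.strip(): v.strip()
            for k, v in (line.split(":", 1)
                         for line in out.splitlines() if ":" in line)}

info = lsb_info()
# The "LSB Version" field lists the spec modules the distribution claims,
# e.g. "core-3.1-ia32"; an installer can warn or refuse on a mismatch.
print(info.get("LSB Version", "no LSB conformance reported"))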
“The LSB offers data center managers a way to protect their data and
application investment for the long term,” says Free Standards Group
executive director Jim Zemlin. “If you don’t want a vendor gun held to
your head, invest in open-standards-based products.”
Unite or Die
It remains to be seen whether these standardization actions will work,
but early indications are positive.
While commercial software vendors try to create features to differentiate
their products, there are two factors limiting this in the open source
community. One is that users feel a personal stake in the software and
apply group pressure to keep everything open and interoperable. The other
is the nature of open source licensing, which limits exclusive,
proprietary code.
As long as these factors hold the Linux community in line, we can expect
to see continued expansion of its functionality and installed base. If
that doesn’t work, the boys in Redmond are standing by ready to pick up
the pieces.
“If Linux does begin to show a fractured face like Unix did, it will
create an unintended opportunity for Microsoft,” says Hurley.
“Microsoft has been aggressive in highlighting various studies showing a
positive TCO for Windows compared to Linux, and this would be another
front Microsoft will exploit to ensure placement of Windows.”