In the first part of this two-part article, we looked at the growth of GNU/Linux in several areas, including high-performance computing (supercomputers), mobile phones, desktops, miniature laptops, consoles, and set-top boxes. There is a great deal of overlap between some of these areas, but they are certainly separable.
In this concluding article we look at the growth of GNU/Linux in still more tech sectors. This ought to demonstrate the tremendous presence that GNU/Linux quietly gained throughout the year 2007.
The term “server” is very generic in the sense that it covers a broad range of equipment and applications. For e-mail and Web services, for instance, there is a diverse set of systems that operate behind the scenes, so to speak, to connect desktops and devices. Application servers exist which blur the line between the host (server) and the client. Even desktops and laptops can be compared to servers in terms of their function, but let’s try to sub-divide the domains at hand in a sensible fashion and begin with Web servers in particular.
Google is often cited as a major success story and a poster child for GNU/Linux. It is a pioneer capitalizing on disruptive trends as it concentrates on software as a service. In 2007, Google was believed to be using approximately one million GNU/Linux servers around the world, but nobody knows the real number for sure, except Google of course, which consistently keeps its cards close to its chest. Google uses GNU/Linux almost exclusively despite its brief experimentation with OpenSolaris quite some time ago (circa 2006).
Other large companies have already chosen GNU/Linux to run various types of servers. Examples include eBay and Amazon for some of their Web services (in-house) and Oracle for its products (clients). They became more vocal about their use of GNU/Linux in the past couple of years. Many others began taking pride in GNU/Linux rather than hiding it from the public eye. This trend can be generalized to account for other areas such as devices, which we will touch on in a moment.
The rise of the so-called ‘Web 2.0’ generation created a need for high-capacity servers that are highly reliable and available. Downtime is rarely acceptable when it comes to user-facing services, which send and receive data almost in real time, and it is hardly affordable because it can drive customers away.
With growth in the server market in general, and especially with the gradual decline of aging Unixes, GNU/Linux deployments kept rising in quantity. Because it is free software, however, it was impossible to keep track of the number of installations. Moreover, the number of servers does not say very much, because actual server capacity depends a great deal on the available hardware and the software that runs on it.
Modern hardware and resource-efficient software require fewer units to handle the same load. Additionally, there is the emergence of virtualization to consider here. VMware, the leading virtualization company (it held its IPO in 2007), actually started out on GNU/Linux for quicker market penetration. Server virtualization remains a GNU/Linux advantage – the platform is comfortably ahead of most counterparts. Ironically enough, when it comes to statistics, this also means better distribution and pooling of resources, which results in improved consolidation and therefore a decrease in the number of servers that are needed.
In servers, a great deal of disinformation is being spread to paint a deceptive picture. Although not all server units are sold and shipped, GNU/Linux still gets counted in this old-fashioned way. Another common mistake is to count only the revenue made through sales of servers, regardless of the number of servers sold. By such a measure, more expensive servers appear more popular among users, who are always assumed to be buyers, i.e. paying customers. As a measure of popularity or ubiquity, this is incompatible with free software like GNU/Linux.
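To see why revenue-based counting distorts the picture, consider a quick illustrative calculation. The vendor categories, unit counts and prices below are entirely hypothetical, invented only to show how revenue share and unit share can tell opposite stories:

```python
# Hypothetical figures, for illustration only: how revenue share
# and unit share of the server market can diverge completely.

servers = {
    # category: (units_deployed, average_price_per_unit_usd)
    "proprietary_unix": (1_000, 50_000),   # few, expensive machines
    "commercial_linux": (4_000, 5_000),    # many, cheaper machines
    "free_linux":       (10_000, 0),       # self-deployed, never "sold"
}

total_units = sum(units for units, _ in servers.values())
total_revenue = sum(units * price for units, price in servers.values())

for name, (units, price) in servers.items():
    unit_share = 100 * units / total_units
    revenue_share = 100 * units * price / total_revenue
    print(f"{name}: {unit_share:.1f}% of units, {revenue_share:.1f}% of revenue")
```

With these made-up numbers, the proprietary Unix vendor holds over 70% of the revenue with under 7% of the machines, while the freely deployed GNU/Linux boxes hold two thirds of the units and 0% of the revenue. A revenue-only survey would miss them entirely.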
Another important mistake is to assume that all GNU/Linux servers are sold, as opposed to deployed. As stated earlier, Google is estimated to have approximately one million servers, but the number remains unknown due to corporate secrecy. Google is able to build and even distribute its own servers, so such server usage can easily fly under the radar of industry analysts, whose definitions are strictly controlled by those who commission studies for vanity and marketing purposes. As pointed out earlier, there is also the issue of server capacity: if a Linux server can handle greater loads, then fewer such servers are required to do the same amount of work.
Let’s quickly look at some numbers from 2007. Market estimates claimed annual growth of 34% for GNU/Linux shipments, with strong evidence of growth in Red Hat’s latest financial figures from mid-December. That is commercialized Linux alone; additional figures remain unknown and uncounted. Red Hat is still the leader in the Linux server market. Looking at Red Hat’s year-over-year growth, the company boasts a 28% rise in sales, a 24% rise in cash flow, and a 39% improvement in net income. Red Hat’s shares rose 12% after these results were published. These figures, in general, put GNU/Linux ahead of everyone else when it comes to pace of growth.
Toward the end of the year, even the New York Stock Exchange adopted GNU/Linux as a server platform. It also talked about its decision openly in the press and this story served as an excellent sign of validation.
There are many other success stories that could be covered. Consider rendering farms and studios in Hollywood where Linux enjoys a de facto monopoly, with virtually all desktops and servers running Linux underneath a proprietary software stack. The application layer often hides an underlying embodiment of openness and freedom, which sits just ‘under the hood’. This is one of the least-covered success stories of GNU/Linux and it truly deserves greater attention.
Another class of servers that can be considered separately is the mainframe. IBM leads the way in mainframes, where Linux has become the natural path for most mainframes to follow and evolve along.
Progress is encouraging: IBM recently upgraded the z/VSE mainframe OS to accommodate Linux use in large- and medium-sized businesses. IBM also reported a surge of 390 percent in the number of sites running Linux on the mainframe. In fact, Linux is said to be driving a revival of mainframes, some of which had been prematurely buried.
In 2007, mainframes were seeing somewhat of a comeback, driven by ISV support from many in the Linux arena. System integrators are involved as well, and the number of supported applications doubled. Earlier last year, an agreement between Oracle and IBM actually helped strengthen mainframe computing. Both companies are known for their love for — and arguably a dependence on — GNU/Linux.
One of the more fascinating trends, whose potential was only realized in the past few years, is cloud computing. Large enterprises, not just technology companies but everything from banks to healthcare providers, wish to deploy clouds. Such phenomenal deployments could soon reach as far as governments, according to sources.
Due to some of Red Hat’s new products, which were introduced only a couple of months ago and are geared towards clouds, questions began to arise about the company’s future collaborators. Will it be Amazon or will it be IBM? Red Hat has already set itself the goal of maintaining a presence in over half the world’s servers by 2015. Free software appears to be at the heart of cloud computing, with companies like Google already taking a lead too. There are other lesser-known contenders to consider, such as Xcerion, whose Internet cloud might quietly mature and help the company grow as rapidly as VMware.
IBM’s Blue Cloud, which is bound to arrive within a few months, will use BladeCenter servers and run GNU/Linux. It will rely on free software and utilization enhancers such as Xen-based virtualization. On top of it, IBM’s Tivoli is expected to run and manage the cloud, so this might not be a free software cloud from top to bottom.
IBM’s datacenters are slowly evolving into ‘computing clouds’ and the significance of this, which is often underestimated, can be compared to the importance of the company’s embrace of GNU/Linux many years back. That embrace was seen as a big endorsement (never mind the generous investment) at the time, and it also helped Linux rid itself of damaging stereotypes.
Moving on to devices: in this context, devices are a large family of mostly embedded software. These tend to be miniature, but they needn’t be. Any taxonomy of the different devices is probably a subjective matter.
According to a 2007 survey from VDC, Linux is set to grow 278 percent in the domain that includes embedded, mobile and real-time applications. Linux is used very quietly in this area; people often use it without being aware of it. The closed nature of many Linux devices contributes to this low awareness, and several companies are too shy to admit their use of Linux due to potential (sometimes known) GPL violations.
According to another survey from 2007, 87 percent of those who built their devices using Linux plan to use Linux in their next project as well. In other words, only a few of those with Linux experience are actually looking elsewhere and assessing other options. This indicates great satisfaction from a developer’s point of view.
Moreover, and further to the study above, free distributions were favored considerably over paid distributions. Trends indicate that more and more developers are escaping dependence on commercialized distributions. This makes everything more affordable and hence attractive to both developers and prospective users.
In the year 2007 we saw many media players that run Linux. This includes Wizpy, new models of the portable media player from Archos, an iPod competitor from AOL (manufactured in Germany by Haier) and many lesser-known gadgets. There is a vast array of other devices, including network-attached storage units, home servers, children’s toys and innovative gadgets, with well-known examples like the Chumby, which makes a wonderful gift even to grown-up kids like ourselves. An extensive list of such devices is constantly being compiled at LinuxDevices, as well as at a few smaller Web sites. Many of the devices are designed and/or manufactured in the Far East, which secures low (and thus highly competitive) costs that lure even less receptive markets.
Linux also gained high status and earned a place in a large number of industrial components, including controllers, automation solutions, meters and monitors. Switches and routers, which arguably fall under the domain of servers as well, have played a role in the growth of GNU/Linux. For example, in 2007 3Com announced that it is betting on Linux and an open strategy. We recently saw a router and switch from Korenix, and Vyatta delivers a truly free, open source server based on GNU/Linux. It runs free software and adheres to the Red Hat-type business model, which is seen as quite faithful to the ideals of free software.
On the same note, while also considering hybrid devices, it’s worth stressing the importance and the different roles of Linux in telephony or, more generally, communication. This includes Asterisk and other software that handles VoIP. Toward the end of 2007, Asterisk boasted the millionth download of its software. Jon ‘maddog’ Hall, a Linux luminary who is also the Executive Director of Linux International, once said that open-source VoIP “will be bigger than Linux.”
In a realm where customization is king, it is natural to expect advantages from open systems. The robotics market in 2007 is said to have produced roughly 10 general-purpose software development frameworks; 9 of these support Linux.
In 2007, Hanson Robotics found that when maintaining a mix of free and proprietary software in robotics, the ideal ratio is 70% free open source software and only 30% proprietary. In this context, the Linux kernel is expected to play a major part. Linux is dominant in robotics in general, and free open source software more broadly gets chosen on its own merits.
Looking Into the Future
The ‘hidden agenda’ in this two-part article — as if there ever was an agenda — was to show that the ways in which Linux success is typically measured are deeply flawed.
Computing has a visible and a less visible presence in our lives. People perceive the desktop as very important because it is highly visible to the general population, but this can be deceiving. It is important to remember that there is no “year of Linux on the desktop”; if there ever was one, it is already behind us and it’s called “the tipping point.”
Any type of real-world usage grows gradually; it doesn’t balloon overnight and clearly not over the course of a single year. Trends are sometimes more meaningful than absolute numbers when it comes to predicting the future. Bearing that in mind, there is no going back as Linux will mature and its usage will further expand in many areas.
Let us not obsess too much over the desktop. In fact, the desktop might cease to be a primary target by the time that mythical, so-called ‘Linux domination’ is finally reached. Many call this “inevitable”, and such sooner-or-later destiny is at times recognized by those who have the most to lose. That inevitability may or may not include the desktop, whose future role is yet unknown. Mobile devices seem to be gradually replacing the desktop, at least in Japan.
Last but not least, it is important to remind ourselves not to be distracted by any single area of computing, which is just one among many. What sustains growth and fuels development is a market that is broader than local computer stores. As Linus Torvalds said recently, “Linux is much bigger than me.” Linux is also bigger than the desktop.