If you use the Linux desktop, you’ve never had it so good. Contrary to the critics who either haven’t used the desktop recently or quibble over a minor feature, Linux interfaces are better than anyone could have imagined when they first started being developed a dozen years ago.
At the same time, the major desktops — GNOME 3, KDE, and Ubuntu’s Unity — seem to have reached the point where they have lost direction. With GNOME 3 and Unity failing to be the successes everyone hoped and KDE settling down into maturity, none of the major desktops seems to be moving toward any well-defined goal.
Instead, the major desktops seem to be responding to the pressures around them rather than taking charge of their direction. Some of these pressures are self-created, while others are historical or common to all modern desktops, free and proprietary alike. Some are barely articulated, although they operate no less powerfully for that.
Whatever their origins, here are seven concerns that are shaping the Linux desktop today:
1) The Struggle to Define “Usability”
Not long ago, usability on Linux was conspicuous largely by its absence. Now, thanks to the latest release series by KDE, GNOME 3 and Ubuntu’s Unity, few concerns are discussed more.
The trouble is, exactly what usability means differs with the project. For GNOME 3 and Unity, usability seems to mean implementing the discussions of usability experts, and expecting users to change the habits of years on the assumption that the experts know best.
For instance, so far as I’m aware, few people were complaining that the GNOME desktop was cluttered. But the designers decided that a radical simplification was what users really needed, so GNOME 3 sported “distraction-free computing” whose dual screens were more distracting than anything they were supposed to correct.
Similarly, Unity reconfigured its desktop so that opening a menu takes users away from the desktop where they are actually working.
To make matters worse, both GNOME 3 and Unity are aimed at new users, with the needs of veterans largely ignored. In particular, administration and configuration tools are so deeply buried that, at first, users may assume that they no longer exist.
For all the controversy that greeted the initial release of the KDE 4 series, on the whole the current versions of KDE do a much better job of preserving the work flows that developed over the last decade while trying to extend them. The means for preserving or enabling those work flows may be different than in earlier versions of KDE, but at least they exist.
Other desktops like LXDE seem to be gaining users simply by continuing to provide basic desktop functionality — that is, they serve mainly as a space in which to launch applications. And, for many users, this is all the definition of “usability” that seems required.
2) Mobile vs. Workstation Design
Mobile devices are probably the most common form of computer in most people’s lives. If you want to simplify computers for users and reduce the amount of code that developers need to maintain, from one perspective it makes sense that the desktops on mobile devices should drive all desktop design — more sense, at least, than transferring workstation desktops unchanged on to phones and tablets.
Yet the reverse is also true: desktops that make sense on mobile devices are not the most efficient for workstations.
Mobile desktops are constrained by their small screens, so frequent changes of screens are unavoidable. Nor is screen-changing likely to produce repetitive stress injuries, since mobile devices tend to be used less intensively than workstations.
By contrast, workstations today are generally attached to high-resolution wide screens, many of which are over 21 inches. Under these conditions, the multiple screens on mobile devices are no longer a necessity, but a potential disruption to concentration.
Moreover, during four or five hours of work, the extra mouse clicks or keystrokes can add up to serious injury, even with regular breaks. Yet this difference seems generally unrecognized. The only major desktop that has faced up to this fact is KDE, whose Activities layouts are basically shells with the same code underneath, and include at least two layouts originally designed for netbooks (Newspaper Layout and Search and Launch).
3) The Pressure to Innovate
About five years ago, the Linux desktop reached functional parity with its proprietary rivals. This milestone went mostly unrecognized, but one of its consequences was the shift to usability.
Almost as important, though, was the question of what to do next. Having equaled the functionality — if not always the look — of Windows and OS X, how could the free desktop surpass them? What new features would keep users coming back and developers continuing to work?
The answers have offered mixed results. They include geolocation, new compositing effects, and experiments in panel layout, many of which have not become the necessities their originators once hoped. At times, the answers have been solutions for problems no one was complaining about, such as Unity’s app indicators or the constant attempts to reinvent the menu.
What the answers have not included is a strong road map of where any of the major desktops are heading. KDE had a definite direction for the first few releases of the fourth release series, but seems less goal-oriented now, and if GNOME or Unity has detailed plans, they have not been announced.
Yet the pressure for progress remains. Apparently, you do not need a focus on commercial markets to feel market pressures.
4) Assuming Hardware Acceleration
3-D graphics remain only partly implemented on Linux, especially if you avoid proprietary drivers. Even where support is supposed to be available, as with Intel chipsets, the results can still be buggy. Although exact figures are impossible to come by, one third to one half of Linux users probably can’t count on reliable, full hardware acceleration.
However, that didn’t stop GNOME 3 and Unity from assuming that hardware acceleration was the norm. Before release, they backed down enough to provide fallbacks based on the GNOME 2 series (while claiming that only basic hardware acceleration was needed). But, on both development teams, the assumption was clearly that 3-D graphics would be more dependable and widespread than they are.
I have to wonder why hardware acceleration should be required for a desktop — no matter what the compositing effects — and the assumption seems to have increased the frustration of many users. But in the long run, the assumption just might speed the development of hardware acceleration. Linux has no gaming industry to push hardware development, the way that Windows does. So, just possibly, the demand for hardware acceleration on the desktop will encourage driver development.
5) Copyright Assignment
Copyright assignment is the transfer of ownership rights from individual developers to a project. It doesn’t affect users directly, but, indirectly, it can affect a project’s rate of development, since many developers refuse to contribute to projects that require copyright re-assignment.
For example, many people consider that the copyright assignment required by Sun Microsystems hamstrung OpenOffice.org for years, and prevented a strong community from forming around the project.
On the other side, those in favor of copyright assignment point out that a project’s development can be impeded if a developer dies or drifts away from the project while still owning the copyright to a piece of code. This is the main rationale for copyright assignment by the GNU Project and Free Software Foundation Europe.
For example, the Linux kernel has never required copyright assignment. As a result, it would be difficult for it ever to shift from version 2 of the GNU General Public License, because dozens of former contributors would need to be tracked down.
These issues aren’t new, but they have been revived by the Canonical / Ubuntu contributor agreement and the Canonical-backed Project Harmony, whose mission is to standardize and coordinate contributor agreements throughout free software. Canonical CEO Mark Shuttleworth has been promoting Project Harmony as a necessary step for responsible code maintenance.
By contrast, critics like Bradley Kuhn argue that Project Harmony’s template agreements heavily favor the projects and companies that receive copyright assignment, while providing insufficient guarantees to the donors that their code will be used as they prefer. In particular, critics point to the fact that the templates allow recipients to relicense code — even under a proprietary license.
6) Fragmentation or Diversity?
Informally, GNOME and KDE have always consulted and borrowed from each other. However, official coordination has been lacking in the last couple of years. The freedesktop.org project has dwindled into inactivity, and, although the second Desktop Summit is currently happening in Berlin, one was not held last year.
Although it seems to be too early to tell, the release of GNOME 3 and Ubuntu’s Unity desktop may have increased this fragmentation. The problem is not just that both have been widely criticized, but also that the dissatisfaction has encouraged users to look for alternatives. Many, no doubt, look to variants of the GNOME 2 release series, but at least some are looking to other desktops like Xfce — including Linus Torvalds and a couple of other prominent kernel developers.
The latest GNOME and KDE desktops are not the only choices for graphical interfaces. So, on the one hand, any events that encourage users to explore alternatives seem healthy. Yet, on the other hand, with official cooperation already languishing, the proliferation of major projects can mean even greater difficulties in coordination.
The GNOME community, with its division between GNOME 3 and Unity, may be especially hard hit. They may be the same desktop below the surface, but, with the surfaces diverging, greater differences elsewhere may be inevitable.
7) Anticipating Future Needs
Like their proprietary counterparts, Linux desktops are doing their best to anticipate what will be needed in the future. Touch screen support is planned or well-advanced in each of GNOME 3, KDE, and Unity, and so are mouse gestures, which are one of several technologies needed for improved accessibility — something that all three desktops are currently weak in, but particularly KDE and Unity.
In addition, Unity’s simplicity and Apple-inspired design seems intended to ease Canonical’s entry into new markets, including OEM deals and embedded systems.
But the difficulty with anticipation is that deciding which technologies to support is a gamble, especially if you want to innovate and not just copy what is being done elsewhere.
Not long ago, for example, Nepomuk was supposed to revolutionize desktop search in KDE, increasing efficiency and, perhaps, freeing users from needing to know anything about the shape of the directory tree. Nepomuk is still being developed (and, in fact, seems finally to be coming into its own in the last couple of releases). Yet the interest in this technology has diminished so much that you have to wonder if the amount of effort was worth the result.
Something of the same decline may be happening now with cloud computing, one of the most popular buzzwords of the last three years.
Some developers and users have seized on cloud computing. Distributions like Jolicloud have even developed desktops designed specifically for cloud-based computing — anticipating the Chromebook by at least two years. Yet with articles starting to talk about the most over-hyped cloud technologies, cloud computing may have peaked as a trend. No doubt it will continue to be used, but the moment when everyone viewed it as the next great revolution in computing may have already passed.
Such are the challenges of development plans. To remain players, desktop designers need to start building a year or more before a need becomes a demand. Yet, during development, any particular need may become less urgent.
Waiting for the Change to Come
None of these pressures is large enough by itself to determine the direction of the desktop. Yet the combined influence of several might. Possibly, too, the conflict of these pressures might keep the desktop from developing clear goals, confining it for some years to minor improvements.
The same is even truer for individual desktops. A year ago, who could have anticipated the continued complaints about GNOME 3 or Unity? Or that less-used desktops would gain popularity as a result? Similarly, while some of the same concerns existed last year, the emphasis on many of them has shifted in both subtle and obvious ways.
The Linux desktop as a whole is probably not in danger. All the same, after years of trying to match proprietary desktops, followed by several years of attempted innovation, the Linux desktop has yet to find a new destination — or even a compass point. For now, it is responding to pressures rather than overcoming them.