For new software deployments, the newer the software installed, the greater the opportunity to leverage the latest features and the more time before inevitable obsolescence takes its toll. All software ages, so installing newer software gives the best chance that the software will last the longest. It provides the most flexibility for an unknown future.
Following this line of thinking might lead us to feel that deploying pre-release or even beta software would make sense as well. And while there might be specific cases where this does make sense, such as in “test groups” that check out software before releasing it to the company at large, in general it does not. The nature of pre-release software is that it is not supported and may contain code that never will be supported.

That brings us to the other situation, the one in which we are updating existing software. This, of course, is a completely different scenario from a fresh install, and there are many, many more factors involved. One of the biggest factors in most situations is licensing. Updating software regularly may incur licensing fees that need to be factored into the cost and benefit equation. Some products, like most open source software, do not have this cost and can be updated as soon as new versions are available.
The other really large factor in updating software is the human effort cost of the update itself – unlike a fresh installation, where the effort of installing is effectively a break-even between old software and new.
In reality, new software tends to be easier to install than old software, simply due to improvements and advancements. Maintaining a single version of software for a decade means that, during that time, no resources were dedicated to upgrade processes. Upgrading annually over the same period means resources were expended ten separate times on ten separate upgrades. That makes frequent updating much harder to cost justify.
But there is more than just the effort of the update process itself; there is also the continuous training needed for end users, who will be forced to experience more changes, more often, through constant upgrades.
This might make updating software sound like a negative, but it is not. It is simply an equation where each side needs to be weighed. Regular updates often mean small, incremental changes rather than large leaps, allowing end users to adapt more naturally.
Regular updates mean that update processes are often easier and more predictable. Regular updates mean that technical debt is always managed and the benefits of the newer versions – which may be features, efficiencies or security improvements – are available sooner, allowing them to be leveraged for a longer period of time.
Taking what we have learned from the two scenarios above, however, there is another important takeaway to be found here. Once the decision to perform an update has been made, the question is often “to what version do we update?” In reality, however, every update that is more than a standard patching process is really like a miniature “new software” buying decision. And the logic as to why we “always” install the newest available version when doing a fresh install also applies here. So when performing an update, we almost always should be updating as far as we can – hopefully to the current version.
To apply the Microsoft example again, we can take an organization that has Windows XP deployed today. The business decides to invest in an update cycle to a newer version, not just continued patching. There are several versions of the Windows desktop platform that are still under active support from Microsoft. These include Windows Vista, Windows 7, Windows 8 and Windows 8.1. Updating to one of the less current versions results in less time before that version’s end of life, which increases organizational risk.
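The end-of-life comparison can be made concrete with a quick sketch. The end-of-extended-support dates below are Microsoft's published dates for these versions (they should be verified against current lifecycle documentation), and the 2015 upgrade date is a hypothetical assumption for this organization:

```python
from datetime import date

# Published end-of-extended-support dates (verify against Microsoft's
# lifecycle documentation before relying on them in a real decision).
end_of_support = {
    "Windows Vista": date(2017, 4, 11),
    "Windows 7":     date(2020, 1, 14),
    "Windows 8.1":   date(2023, 1, 10),
}

# Hypothetical date the organization completes its migration off XP.
upgrade_date = date(2015, 1, 1)

for version, eol in sorted(end_of_support.items(), key=lambda kv: kv[1]):
    years_left = (eol - upgrade_date).days / 365.25
    print(f"{version}: {years_left:.1f} years of support remaining")
```

The output makes the risk visible at a glance: choosing a less current version can cut the remaining supported lifetime by more than half, which is exactly the increased organizational risk described above.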
Using older versions means continued investment in already old technologies, which means an increase in technical debt and less access to new features that may prove to be beneficial once available. In this particular example, newer versions are also considered to be more secure and require fewer hardware resources.
Every business needs to find the right balance for its own needs for existing software update cycles. Every business and every software package is different. Enterprise software like Microsoft Windows, Microsoft Office or an Oracle Database follows these models very well. Small software projects, and those falling near the bespoke range, may have a more dynamic and unpredictable release cycle but generally will still follow most of these rules.
But the rules of thumb are relatively easy:
When deploying new software or performing an update, shoot for the latest reasonable version. Use any deployment opportunity to eliminate as much technical debt as possible.
When software already exists, weigh factors such as human effort, licensing costs, environmental consistency and compatibility testing against benefits in features, performance and technical debt.
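The weighing described above can be sketched as a simple scoring exercise. The factor names, scores and weights below are purely illustrative assumptions, not a prescribed methodology; the point is only that both sides of the equation can be put down side by side:

```python
# Hypothetical update decision. All factor names, scores (0-10) and
# weights are illustrative assumptions for one imagined organization.
costs = {
    "human effort":           (7, 1.0),
    "licensing":              (4, 1.0),
    "compatibility testing":  (6, 0.5),
}
benefits = {
    "new features":           (5, 1.0),
    "performance":            (3, 0.5),
    "technical debt paydown": (8, 1.0),
}

def total(factors):
    """Weighted sum of (score, weight) pairs."""
    return sum(score * weight for score, weight in factors.values())

cost_score = total(costs)
benefit_score = total(benefits)
print(f"costs={cost_score}, benefits={benefit_score}")
if benefit_score > cost_score:
    print("Update favored")
```

Even a rough exercise like this forces the licensing and effort costs into the same conversation as the feature and technical-debt benefits, rather than letting one side of the equation dominate by default.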