Friday, May 24, 2024

When More is Less


It’s curious how we incessantly need more. We always seem to need more processors, more storage, more memory, more people, more everything … Why?

As the abundance of any given resource increases, so too, it seems, does the rate of inefficient use by the user, the customer and IT. There is a very real risk that the rate of inefficiency will overtake and negate any gains made, or even cause losses in excess of perceived benefits due to problems with utilization and management overhead.

Way back in the stone age of PC computing (early to mid-1990s) we marveled at Russian programmers who hand-coded and optimized in assembler to make the most efficient use of their relatively limited computing power. For them necessity was the mother of invention – they had to make do with the computing they had.

Today, we pick the lowest-cost route, using high-level languages that generate relatively unoptimized code, and then mask the performance cost with inordinately powerful processing and storage hardware. We buy ridiculously elaborate software suites with massive hardware and maintenance requirements, and then make productive use of only a fraction of their capacity.

Unrecognized constraints

The drivers all too often are the vendors and their relentless upgrade paths. Think of it in this way – if processor power and data storage have followed exponential growth curves, then why hasn’t our productivity followed a similar curve? We’ve definitely improved, don’t get me wrong, but at nowhere near the same level. Very real and often unrecognized constraints exist between us and our goals.

The point of business isn’t to run the latest and greatest technology – the point is to make money, or, to be more precise – to maximize the return on investment in a sustainable manner.

Whatever we do with computing should be subordinated to achieving our goals – not goals in and of themselves. Are we really moving toward our goals when we add more storage, more processors and version 1001 of some software package? We typically upgrade due to risks associated with falling outside of the upgrade path, the support path or having hardware that is no longer available or is at a point in its lifecycle that it is likely to fail.

Does risk avoidance by itself move us toward our goal, or are we optimizing in one area at the expense of the overall system? This is fun to ponder (with lots of caffeine) as we realize that an over-optimization of risk mitigation can itself create a risk and must always be tempered by the needs of the business.

Left unchecked, storage and processing requirements easily could expand to consume an unacceptably large portion of an organization’s earnings. It is easy to fill the capacity of a new resource and far harder to manage and optimize the utilization of scarce resources.

In other words, the path of least resistance is to buy additional capacity rather than negotiate the appropriate use of currently near-capacity systems.

Focus on the business

It isn’t a question of “can we expand” but one of “should we expand – can we sustain it?” It would certainly make sense to have policies and procedures in place to manage capacity utilization before it gets out of control.
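Such a policy can be as simple as refusing to buy capacity until known waste has been reclaimed and utilization still exceeds an agreed threshold. A minimal sketch, purely illustrative (the function name, the 85% threshold, and the “reclaimable” figure are assumptions for the example, not anything prescribed here):

```python
def should_expand(used, capacity, threshold=0.85, reclaimable=0.0):
    """Decide whether to buy more capacity.

    Expand only if utilization, after reclaiming known waste
    (orphaned volumes, stale snapshots, idle VMs), would still
    exceed the agreed threshold.
    """
    effective_used = used - reclaimable
    return (effective_used / capacity) > threshold


# A SAN at 90 TB of 100 TB looks full...
print(should_expand(90, 100))                    # True
# ...but not once 20 TB of reclaimable waste is counted.
print(should_expand(90, 100, reclaimable=20))    # False
```

The point of the sketch is the order of operations: negotiate reclamation first, and let expansion be the last resort rather than the path of least resistance.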

As a thought, when you get ready to install the inevitable next multi-petabyte SAN, take the time to negotiate service levels that set expectations around business requirements and what IT can deliver. Moreover, make the business customer assist in the cost justification – if it’s truly a business need, the customer is in a far better position to explain to senior management why the service is needed. That justification must reflect the total costs: hardware, software and the IT personnel required to care for the system.

This brings our topic full circle. If the people who comprise organizations have a tendency to over-consume and make inefficient use of resources, then IT needs to work with the business units to understand their requirements, including legal and regulatory requirements, and then work as stewards of the organization to shepherd decisions that maximize productivity while managing risks.

On one hand, it is very easy to add storage, to add processors, get the latest version, and to add staff, but should we? Where is the point of inflection wherein the model is no longer sustainable and we witness productivity or growth decline? At what point does the house of cards collapse?

Inefficiency impacts not just the user’s ability to gain value, but also IT’s ability to sustain effective and efficient management of ever-increasing volumes of tightly integrated resources.
