
The Temporary Death of Processes


Think back to the 1970s and early 1980s — the era of “big iron,” when mainframes and minis ruled.

Those systems cost a lot of money to purchase, load with applications and then maintain and operate. Management did not take purchases lightly and typically centralized the operation around a core group of people. Along with this centralization came the formation and sharing of best practices, both within the IT organization itself and through vendor-specific user groups.

Many processes that younger IT people view as key to IT today and perhaps even revolutionary are not new at all. For example, change management’s lineage can be traced back for quite some time.

What is particularly interesting in hindsight is that management took its eyes off processes when the PC revolution hit and decentralized everything.


Cost and Value

As mentioned earlier, computing was expensive before PCs. The low cost of PCs created an environment that rapidly decentralized computing in most organizations. If Bob in accounting needed a special pro forma balance sheet created, he didn’t go to IT and laboriously spec it out, wait and then test the result weeks, if not months, later.

Instead, with the advent of the PC, Bob opened up VisiCalc or Lotus 1-2-3 and created his own analysis based on his own knowledge. Whether he had errors or not was beside the point — he had control and the “delusion of speed.” In other words, even his needing to rewrite the spreadsheet 20 times because he had not thought through the model was irrelevant, because it sure seemed like things were getting done quickly and value was being created.


Control Environments

In many cases, this explosion of decentralized processing did lower costs and enhance value. It would be dishonest to say otherwise. However, what it did do was strip out layers of overhead that had been created to ensure that IT resources were used effectively and data integrity was protected.

Change management, access controls and application test protocols went right out the window because, borrowing another’s term, they weren’t “sexy” enough for end users to bother with. All of those controls were viewed by users as needless overhead, part of an evil “IT” empire to be done away with, or at least ignored.

The cumulative result was an environment that prioritized technology adoption and short-term need satisfaction over the creation and maintenance of a positive control environment, one that would have ensured the overall enterprise truly benefited from the systems put in place and that risks were appropriately managed.


Regulatory Compliance

It’s funny how the business world often seems to swing like a pendulum. We’ll have a negative event that triggers regulation and the pendulum swings one way; then, over time, as the event recedes in memory, the regulations — or at least compliance with them — loosen up. Next thing you know, another event happens and then the process begins again.

Right now, there are plenty of regulations impacting businesses and their IT organizations, with requirements ranging from adequate security to proper controls and beyond. Bear in mind (and here’s the rub), it is far easier to be compliant with a centralized IT organization that is following proven controls to begin with! The difficulty of achieving compliance in an organization whose IT systems grew organically (meaning “uncontrolled”) across multiple uncoordinated groups, versus in a mature IT organization that values controls, is like night and day.

Something for all organizations to be aware of is that the current political climate makes it clear that additional regulations are coming. They may involve corporate governance, national security or something new, but rest assured, they will come. What organizations need to do today is proactively move toward a normalized set of requirements, translate them into formalized IT controls, and apply those controls across the organization following a well-thought-out plan that balances costs, benefits and risks.

On the other hand, what organizations must actively guard against is implementing controls on an ad hoc basis due to various regulatory requirements. This approach will cause costs to skyrocket and ultimately may create an unsustainable control environment. It is possible to put controls in place that add value, but it is virtually impossible to do so without proper planning and oversight.

For organizations that are embarking on controls and don’t know where to start, I recommend using COBIT as the governance framework, identifying what is important from the framework, and then leveraging best practices from the Information Technology Infrastructure Library (ITIL) to accomplish those goals. If there is one control area to focus on heavily at the start, I’d recommend that organizations begin with change management.
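
To make that change management recommendation slightly more concrete, here is a minimal sketch, in Python, of what a formalized change record and approval gate might look like. It is illustrative only; the field names, the two-approval quorum and the rollback-plan requirement are assumptions standing in for whatever COBIT- or ITIL-informed policy an organization actually adopts, not anything mandated by either framework.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class ChangeState(Enum):
    """Lifecycle states for a change record (illustrative, not ITIL-mandated)."""
    DRAFT = "draft"
    APPROVED = "approved"
    IMPLEMENTED = "implemented"


@dataclass
class ChangeRequest:
    """A hypothetical, minimal change record; field names are assumptions."""
    summary: str
    requester: str
    risk_level: str          # e.g., "low", "medium", "high"
    rollback_plan: str
    approvals: list[str] = field(default_factory=list)
    state: ChangeState = ChangeState.DRAFT
    created: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def approve(self, approver: str, required_approvals: int = 2) -> None:
        """Record an independent approval; promote the change once a quorum is met."""
        if approver == self.requester:
            raise ValueError("Requesters may not approve their own changes.")
        if approver not in self.approvals:
            self.approvals.append(approver)
        if len(self.approvals) >= required_approvals:
            self.state = ChangeState.APPROVED

    def implement(self) -> None:
        """Gate implementation on approval and a documented rollback plan."""
        if self.state is not ChangeState.APPROVED:
            raise RuntimeError("Change has not been approved.")
        if not self.rollback_plan.strip():
            raise RuntimeError("A rollback plan is required before implementation.")
        self.state = ChangeState.IMPLEMENTED


if __name__ == "__main__":
    change = ChangeRequest(
        summary="Apply vendor patch to the payroll server",
        requester="bob.accounting",
        risk_level="medium",
        rollback_plan="Restore last known-good image from backup",
    )
    change.approve("it.manager")
    change.approve("security.officer")
    change.implement()
    print(change.state)  # ChangeState.IMPLEMENTED
```

The point is not the particular fields but that the control is written down and enforced: no change moves forward without independent approval and a rollback path, the very discipline that decentralized, ad hoc computing tended to discard.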


Summary

We witnessed the temporary death of processes as the delusion of speed swept through organizations over the past 15 years. Now, the winds of change have shifted. The need to meet regulatory requirements, not to mention implement sound business practices, is pushing IT governance and controls to the forefront of board discussions once again. IT must work with the various stakeholders to ensure that proper planning is performed to create a positive control environment that adds value to the organization.

Gaining real benefits from controls is absolutely possible and something all organizations must strive for.
