I was buying some clothes at a major department store the other day and was struck by the extra effort the clerk was putting in — simply because of a lack of standards. And if it was causing a problem in this store, imagine what it would do in your IT department.
The problem arose from those anti-theft ink tags that are attached to some clothes to dissuade shoplifters. If they’re not removed with the proper tool at the store, they often burst and splash ink all over the clothing.
What was gumming up the works was the fact that the tags weren’t attached to all the clothes, and when they were, they weren’t always attached in the same place. My bet is that it took the sales clerk at least five times longer to check, fold and bag my clothes simply because of the lack of standards regarding these tags and what they’re attached to.
It was the perfect example of the inefficiencies caused by a lack of standards.
Granted, there may be reasons that some of the clothing I bought was not tagged. It might be that it’s based on some mixture of value, likelihood of theft and so on. The clerk confided in me, rightly or wrongly, that she felt it was due to a lack of discipline. That could be.
What I knew for sure was that the store was dealing with wasted productivity, and the potential for customers to become frustrated with the added delays in the checkout line.
All of this was happening because of a lack of standards (or multiple standards), poor training, and the wrong message coming down from the top.
Standards and Training
With standardization come many efficiencies.
Simple standards make employee training easier because there is only one “way” to do something. People can develop and share best practices because they are all working from a common context.
Better yet, in IT, there are best practices that are developed and shared by various bodies. For example, the British Government’s Office of Government Commerce maintains the Information Technology Infrastructure Library (ITIL), which is a treasure trove of best practices for IT.
Now, simply having standards is not enough.
Without adequate training so that all practitioners understand what needs to be done, deviations will occur. It is far too easy to create a list of standards and policies that are never used because no one knows they exist or no one is taught how to apply them.
Another problem area is senior management: their actions and the way they communicate. Both can dramatically affect workers’ adherence to standards. For example, management may say that the ink tags are important, but if it does not allocate enough resources to get all the clothing tagged, employees will conclude that the standards really don’t matter.
Actions, or inactions, can make or break the best of standards.
Multiple Standards
Lastly, multiple standards can be problematic, as can multiple scenarios for when to apply a given criterion. For example, if a task can be performed under process A, B or C with different outcomes, then people downstream of that task are left wondering which standard was applied, and why.
In terms of people following a given process, the more selection criteria they must remember about when to follow it, the greater the likelihood that they will fail to follow it correctly. If we think of it as a simple function, the probability that at least one error occurs compounds rapidly as the number of decision makers and potential choices increases.
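The compounding effect is easy to see with a back-of-the-envelope calculation. The sketch below assumes each decision is independent and has the same per-decision success rate; the 95% figure is purely illustrative, not from the article.

```python
def compound_error_rate(per_decision_success: float, decisions: int) -> float:
    """Probability that at least one of `decisions` independent
    decisions is made incorrectly, given the success rate of each."""
    return 1 - per_decision_success ** decisions

# With an (assumed) 95% success rate on each individual decision:
for n in (1, 5, 10, 20):
    print(n, round(compound_error_rate(0.95, n), 3))
# 1 decision  -> 0.05  chance of error
# 20 decisions -> 0.642 chance of at least one error
```

Even a very reliable workforce produces frequent mistakes once enough branching decisions are stacked into a process, which is the case for keeping the decision criteria few and simple.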
What About IT?
Right now, many organizations are rushing to implement standard policies and procedures for regulatory compliance. In their rush, they must factor in the impact of standardization on a formerly variable organization, as well as training, complexity and communication from upper management.
Organizations, regardless of their function, must consider human factors in their process design or risk errors, inefficiencies and failed audits — all arising from their attempts to adopt, if not outright force into place, standard policies and procedures. All too often, we overemphasize technology at the expense of people and appropriate process.
In this day and age, we can no longer do that as we risk far more than an overlooked ink tag soiling our new purchases.