Network administrators, bombarded with security warnings on a daily basis, have
not been serious about applying security patches despite the clear danger of
worms, viruses and intruder attacks.
That’s the word from online security experts who estimate that up to 50
percent of all enterprises
could be sitting ducks for hacker attacks because of unpatched, vulnerable
computer systems.
While it is impossible to pin down exact percentages of critical or important
patches that have been downloaded and installed, experts believe the
application of fixes is delayed for months, even with the increased awareness
after the recent Code Red and Slammer incidents.
Last year alone, network administrators had to deal with more than 80
percent more vulnerabilities than in 2001, according to a report from
Symantec, which provides anti-virus software. Microsoft, the world’s leading
software vendor, issued 72 security alerts in 2002 and 10 already this year.
A Microsoft spokesman told internetnews.com there are no exact percentages
available for patches issued and downloaded because there is not a 1:1 ratio
of patch downloads to patch applications.
“While technologies such as Windows Update, Auto Update and SUS have
increased patch uptake, we cannot provide detailed download statistics.
Large enterprises often download a patch to a local server, then deploy it
across thousands of computers; therefore, patch downloads are not indicative
of the number of computers protected,” the spokesman explained.
Marty Lindner, team leader for incident handling at the CERT Coordination
Center, agreed it was nearly
impossible to figure out actual percentages. In large enterprises, for
instance, Lindner said the term
‘patch download’ doesn’t apply because those systems are typically protected
through an outsourced software
maintenance contract.
“In a major organization, if they have 100,000 machines, they aren’t
downloading and installing 100,000 patches. It really is hard to measure
because, even for smaller businesses, you have no way of knowing
what happens once a patch is downloaded. You don’t know how many machines
it is applied to and who is sharing a patch with whom,” Lindner explained.
Lindner’s CERT/CC, the federally funded
clearinghouse for warnings from all major vendors, reported 4,129
vulnerabilities in 2002, almost double the number issued the previous year.
The Center’s statistics show an
alarming trend upwards but Lindner said the lack of information is still a
major setback in the Center’s quest to secure susceptible systems.
Sheer Volume
Lindner blamed the administrators’ indifference to patch applications on
the large amounts of security
information being shuttled to enterprises on a daily basis. “The sheer
volume of security information
that’s seen by a network administrator is mind-boggling. In many cases, it’s
a huge task just figuring out which patch applies to you,” he explained.
Even after the sysadmin is made aware of the problem, it’s not a
straightforward case of applying a patch, Lindner explained. “People believe
you solve the problem by applying a patch but, typically, you can make a
configuration change or turn off the offending software and secure your
system,” he added.
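As an illustration of that kind of interim workaround, here is a minimal sketch (an example for this article, not anything from Lindner or CERT/CC) that stops and disables a Windows service by name; the service name is hypothetical, and the right mitigation always depends on the specific advisory.

    # Minimal sketch of the "turn off the offending software" stopgap described
    # above, assuming a Windows host and a hypothetical service name.
    import subprocess

    SERVICE = "VulnerableIndexingService"  # hypothetical name of the exposed component

    def disable_service(name: str) -> None:
        # Stop the running service, then keep it from starting again on reboot.
        subprocess.run(["sc", "stop", name], check=False)   # may already be stopped
        subprocess.run(["sc", "config", name, "start=", "disabled"], check=True)

    if __name__ == "__main__":
        disable_service(SERVICE)
        print(f"{SERVICE} stopped and disabled until a tested patch can be applied.")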
“The first challenge is to decide which patches apply to your system.
After you have weeded through
that, then you have to apply the patch and test it outside of production.
When you apply the patch, you
have to make the blind assumption that it’s fixing whatever needs to be
fixed. Even then, you take the
risk that you will break something that used to work,” Lindner said in an
interview, arguing that faulty
patches have been just as destructive as the vulnerable software they were
meant to fix.
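To make that triage concrete, the sketch below (an assumption of this article, not Lindner’s own process) filters a list of advisories against an inventory of installed software and pushes the matches toward lab testing first; the product names, versions and advisory fields are all hypothetical.

    # Sketch of the triage outlined above: keep only advisories that match the
    # installed inventory, and queue critical ones for lab testing first.
    installed = {"IIS": "5.0", "SQL Server": "2000", "Exchange": "5.5"}

    advisories = [
        {"id": "ADV-001", "product": "IIS", "affected": {"4.0", "5.0"}, "severity": "critical"},
        {"id": "ADV-002", "product": "Apache", "affected": {"1.3"}, "severity": "important"},
        {"id": "ADV-003", "product": "SQL Server", "affected": {"2000"}, "severity": "critical"},
    ]

    def triage(installed, advisories):
        # Keep only advisories whose product and version match what is deployed.
        applicable = [a for a in advisories
                      if installed.get(a["product"]) in a["affected"]]
        # Critical items sort to the front of the testing queue.
        return sorted(applicable, key=lambda a: a["severity"] != "critical")

    for adv in triage(installed, advisories):
        print(f'{adv["id"]}: test {adv["product"]} patch in the lab before rollout')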
Thomas Kristensen, chief technology officer at security research firm Secunia, believes network admins are more
likely to patch holes in mail servers and Web servers in a timely
manner.
“Generally, in a medium-sized business, they’ll use Windows update and
get patches relevant to their
systems and, even then, they’ll apply the patches based on whether it is
important or not,” Kristensen
said.
He said bug warnings around Web browsers or other client systems are
routinely ignored because they are
deemed unimportant. “Sometimes, they will hesitate and delay fixing a faulty
browser for several months and
assume they aren’t vulnerable because they’re using a firewall but that is a
dangerous assumption. The
intruders are sophisticated and are using attack scenarios that penetrate
the firewall,” Kristensen
told internetnews.com.
In many small- and medium-sized enterprises, Kristensen said it boiled
down to a matter of available resources to deal with patch applications.
“They just don’t have the tools or software to distribute patches in the
network. They’ll have to do it individually and it is a tremendous task for
a one-man staff to be running from machine to machine to plug a hole,” he
said.
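For a sense of what even a modest scripted alternative to that machine-to-machine routine looks like, the sketch below runs one patch command across a list of hosts; the host names and the per-host patch script are hypothetical, and it assumes SSH access to every target, which is itself a luxury in many small shops.

    # Rough sketch of scripting the "machine to machine" chore: run one
    # hypothetical patch script on each host over SSH and report the result.
    import subprocess

    HOSTS = ["srv-mail", "srv-web", "ws-001"]              # hypothetical host names
    PATCH_CMD = "/usr/local/sbin/apply_patch.sh Q123456"   # hypothetical script and patch ID

    for host in HOSTS:
        result = subprocess.run(["ssh", host, PATCH_CMD],
                                capture_output=True, text=True)
        status = "patched" if result.returncode == 0 else f"failed ({result.returncode})"
        print(f"{host}: {status}")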
CERT/CC’s Lindner agreed that the urgency to apply fixes was determined
by the cost factor. “Many
corporations choose to measure the risk associated with the cost of patching
a system. Sometimes, it is
a conscious decision that patching computer systems is not a high enough
priority to spend big dollars to do it,” Lindner asserted.
“It is quite possible companies have chosen, for better or worse, to use
their money on marketing as
opposed to patching systems. It is, in many cases, a straightforward
decision,” Lindner said, warning that
it’s dangerous to downplay the serious risk that can be caused by an exposed
system.
David Litchfield, co-founder of Next
Generation Security Software (NGSS), estimated that less than 5 percent
of vulnerable systems are patched in a timely manner. “Within a few weeks of
the advisory going out, about 20 percent are fixed but I’d say about 50
percent of enterprises don’t even bother to apply the patches,” he said in
an interview from his U.K. office.
“A large part of the problem is that the administrator is not even aware
of the patch. It is surprising that in some enterprises, there are no
vulnerability assessment (VA) tools being used,” he argued.
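A VA tool automates checks roughly like the toy example below, which grabs a service banner and compares it against versions known to need patching; the host, port and vulnerable-banner list are hypothetical, and real scanners go far beyond banner matching.

    # Toy version of a vulnerability assessment check: grab a service banner
    # and flag it if it matches a version known to need patching.
    import socket

    KNOWN_VULNERABLE = {"220 mail.example.com SMTP Server 4.0"}  # hypothetical banner

    def grab_banner(host: str, port: int, timeout: float = 5.0) -> str:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            return sock.recv(1024).decode(errors="replace").strip()

    banner = grab_banner("mail.example.com", 25)
    if banner in KNOWN_VULNERABLE:
        print(f"VULNERABLE: {banner}")
    else:
        print(f"banner seen: {banner}")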
At other times, Litchfield said IT admins are simply “fed up” with the
large number of patches being issued and are content to wait for service
packs that provide a bulk fix. “That’s why the Code Red and
Slammer attacks were so successful. There were literally millions of
unpatched systems around the world,” he added.
Litchfield called on governments around the world to take the lead in
educating companies and consumers
about the serious risks involved with bad software. “It would be a good
start to launch a massive user awareness campaign but the problem is coaching people
to read those documents. It’s like taking the horse to the
water but you can’t make them drink.”
Secunia’s Kristensen agreed that user awareness was a huge problem, even
with the increased publicity from the mainstream media. “One of the big
reasons why people aren’t installing patches is the lack of
knowledge about them actually existing,” he declared.
Citing internal research, Kristensen said admins are more eager to patch
a hole in a Web server or a mail server but, even then, only about 50
percent of the holes in susceptible servers are plugged.
“Even with all the media attention, I don’t think there’s much more than
two-thirds of services out there that have been updated,” he added.
Crying Wolf?
Then, there is the cry-wolf syndrome, born out of too many ‘critical’
warnings being issued, particularly by Microsoft. The Redmond, Wash.-based
firm acknowledged
there were legitimate fears that too many high-level alerts were being
issued.
Steve Lipner, director of security assurance for Microsoft, recently
announced the Severity Rating
Criteria would be modified to specify clearly which bugs needed to be
addressed immediately.
“There is also a widespread feeling that the Severity Ratings are
difficult to understand and apply. For these reasons, we have modified (the
criteria) to help customers more easily evaluate the impact of security
issues,” Lipner explained.
Of Microsoft’s 72 warnings in 2002, more than half were tagged with the
‘critical’ rating. Of the ten
issued this year, five have been described as critical. The ‘critical’
rating is reserved only for “a
vulnerability whose exploitation could allow the propagation of an Internet
worm without user action,”
Microsoft explained.
The new ratings criteria carry an ‘important’ tag for flaws that could
result in compromise of the
confidentiality, integrity, or availability of users’ data, or of the
integrity or availability of processing resources. Below that, the company
issues ‘moderate’ or ‘low’ warnings.
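One way an admin might act on those ratings is to map each one to an internal response window, as in the sketch below; the windows shown are hypothetical policy choices for illustration, not Microsoft guidance.

    # Sketch of mapping Microsoft's severity ratings, as described above, to a
    # hypothetical internal response policy.
    RESPONSE_WINDOW = {
        "critical":  "patch or mitigate within 24 hours (wormable without user action)",
        "important": "patch within one week (data or service compromise possible)",
        "moderate":  "patch at the next maintenance window",
        "low":       "fold into the next service pack rollout",
    }

    def response_for(bulletin_id: str, severity: str) -> str:
        action = RESPONSE_WINDOW.get(severity.lower(), "review manually")
        return f"{bulletin_id}: {action}"

    print(response_for("MS-EXAMPLE-01", "critical"))  # hypothetical bulletin ID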
For CERT/CC’s Lindner, the issue goes beyond software vulnerabilities and
points to faults with the engineering process. “The root cause of
problematic patches and problematic software is bad software engineering
practices. That’s where we have to fix things,” Lindner declared.
“When we find flaws in software and we have to build a patch, we’re using
the same bad software engineering practices to build the patch to fix the
software that’s poorly engineered. It’s a vicious circle,” he added.
Even as the experts continue to decry the slow pace of patch
applications, Lindner suggested a two-fold approach to fixing things.
First, he called for widespread adoption of better software engineering
practices and, more importantly, for protocols architected to be foolproof
from the start.
He said too many built-in flaws were being discovered in some of the most
crucial protocols. “Even if you wrote error-free software, there would still
be vulnerabilities because the protocols themselves have problems. That’s
what we have to concentrate on fixing,” said Lindner.