
Developers Held Liable for Software Bugs?


Hold software developers liable for security defects in their products!

Well, that’s what former White House and Microsoft security advisor Howard Schmidt says, anyway.

For sure, I’ve seen so many buffer overflow bugs that resulted in the remote execution of arbitrary code — the ultimate software security defect — that I’ve wanted to scream. But hold the developers liable? That’s a small word with huge ramifications in these United States.
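For readers who haven’t had the pleasure, the bug class looks roughly like the sketch below. It’s a minimal illustration in C; the function names and buffer size are invented for this column and come from no real product.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical request handler: it copies attacker-supplied input into a
       fixed-size stack buffer with no length check.  Input longer than the
       buffer overruns adjacent memory, and that overrun is what an attacker
       leverages to hijack control flow and run code of his choosing. */
    void handle_request(const char *input)
    {
        char buf[64];
        strcpy(buf, input);            /* the bug: no bound on the copy */
        printf("handling: %s\n", buf);
    }

    /* The same handler with the copy bounded and the string terminated. */
    void handle_request_bounded(const char *input)
    {
        char buf[64];
        strncpy(buf, input, sizeof(buf) - 1);
        buf[sizeof(buf) - 1] = '\0';
        printf("handling: %s\n", buf);
    }

The fix is a one-liner, which is exactly why seeing the unbounded version again and again is so maddening.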

No, I don’t believe we’re anywhere near ready to take such a drastic step. Let’s make sure we point the car in the right direction before we hit the gas.

Allow me to explain.

I’ll bet Mr. Schmidt was thinking about other engineering disciplines when he made the suggestion at ISC2’s SecureLondon conference recently. And it’s true that in some engineering disciplines we do hold professional engineers liable for their design failures, particularly when public safety is involved. However, we mustn’t forget there’s a world of difference between the practices in use in software engineering and those in, say, civil engineering.

When a civil engineer sets out to design a bridge, he calculates the loads the bridge is likely to have to withstand (e.g., cars, trucks, pedestrians, wind, temperature changes). At the end of the analysis, a factor of safety is applied to the estimate and he looks up what size beams and such to use for the structure. Neglecting to use the minimum-strength beams exposes the approving professional engineer to liability for the structure’s failure.
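To make that procedure concrete, here is a toy version in C. The load figures, factor of safety, and beam capacities are invented numbers for illustration, not values from any real structural code.

    #include <stdio.h>

    int main(void)
    {
        /* Invented figures, for illustration only. */
        double estimated_load_kn = 1200.0;  /* cars, trucks, pedestrians, wind */
        double factor_of_safety  = 1.5;     /* applied to the load estimate */
        double design_load_kn    = estimated_load_kn * factor_of_safety;

        /* "Look up" the smallest listed beam whose rated capacity meets or
           exceeds the design load. */
        double beam_capacity_kn[] = { 900.0, 1500.0, 2000.0, 3000.0 };
        int count = sizeof(beam_capacity_kn) / sizeof(beam_capacity_kn[0]);

        for (int i = 0; i < count; i++) {
            if (beam_capacity_kn[i] >= design_load_kn) {
                printf("design load %.0f kN: use the beam rated for %.0f kN\n",
                       design_load_kn, beam_capacity_kn[i]);
                return 0;
            }
        }

        printf("no listed beam is adequate; the design must change\n");
        return 0;
    }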

In that engineering world, the beam sizes and such are published in the form of structural codes — standards on which engineers base their designs. These tables were developed over decades of use and analysis, as well as trial and error. What physics student can forget the film footage of the famous Tacoma Narrows Bridge that collapsed due to harmonic loading generated by wind?

Even if one looks at the latest advances in software security best practices — and there are several that are worthy of note — we’re a far cry from any sort of published standards that can hold a candle to what civil engineers use.

And yes, as I said, there has been significant work done in the best practices arena for software engineers to learn from. This includes the Department of Homeland Security’s Build Security In effort and the Open Web Application Security Project. Each of these useful projects includes actionable guidance, and I believe they’ll go a long way toward improving the overall state of software security as they’re adopted.

But make no mistake about it: these are not standards, but best practices. And there’s a vast difference between the two.

Let’s also not forget that civilization was building bridges for (no doubt) centuries before it got to the point where it could hold engineers liable for their failures. The best practice efforts cited here need to be tested, used, and refined for at least several years before they’re remotely ready for that sort of thing.

I should also add that it would be a mistake to interpret what I’m saying here as defending shoddy software development practices in any way. Indeed, I find it staggeringly frustrating to see the same mistakes made time and time again, year after year. The 1988 Internet worm that Robert Morris wrote and launched exploited, among other things, a buffer overflow in the Berkeley UNIX finger daemon. From my perspective, that much-analyzed and widely publicized security failure should have marked the end of the buffer overflow, but a quick glance at the headlines is all that’s needed to correct that misguided expectation.
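The pattern behind that bug, an unbounded read into a fixed-size buffer, looks roughly like the sketch below. It illustrates the bug class only; it is not the actual fingerd source.

    #include <stdio.h>

    #define LINE_LEN 512   /* fixed-size line buffer */

    /* The vulnerable pattern: gets() has no idea how big the buffer is, so a
       long enough request simply writes past its end.  (gets() has since been
       removed from the C standard for exactly this reason.) */
    void read_request(void)
    {
        char line[LINE_LEN];
        gets(line);
    }

    /* The bounded version: fgets() stops at the buffer's size. */
    void read_request_bounded(void)
    {
        char line[LINE_LEN];
        if (fgets(line, sizeof(line), stdin) == NULL)
            return;
    }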

But perhaps these repeated mistakes are also the fault of our community’s reluctance to truly learn from them.

Here, I like to cite the transportation industry as a prime example. It does a spectacular job of studying its failures in painstaking detail and publishing the results for all to learn from. The software world needs to do a much better job of emulating this practice.

Sure enough, shoddy software security is pervasive and we’ve got to demand better from our product developers. It’s also a huge factor in preventing us from really getting the most out of the incredible technologies that have been developed recently. Secure software should be perceived as a business enabler, not an inhibitor, in much the same way a car’s brakes enable us to drive fast.

But we’re still a long way from even being able to seriously consider holding developers liable. Instead, we should be considering other measures, such as public embarrassment, ostracizing, and the like. Heck, some bugs might even warrant public caning.

No, I’m just kidding… sort of. My patience buffer must have overflowed.
