LINK: Should Developers Be Sued for Security Holes?

As a long-time website developer, and one who also holds several industry security certifications (CISSP, Security+, CIW Security Analyst), I have mixed feelings about this. Let me preface my comments by saying that I do believe some sort of certification process or legal responsibility for the software bugs mentioned in this article is coming. However, one of the first tenets of a good security plan is the concept of “defense in depth”: not relying on a single security control to protect you, but on many varied and overlapping controls, so that if one is breached, another still stands between the attacker and whatever they are seeking.

Software development is a business in a quickly changing threat landscape. Businesses exist to make money, and the truth of the matter is that incomplete error handling or data validation is often what is exploited in attacks against software. SQL injection, fuzzing, buffer overflows, and the like all probe for the places where the development team did not build out every possible error-handling path. Building out all of that error handling could swell the codebase fivefold: instead of the 20,000 lines of code your program needs to work at a basic level, handling every possible invalid input might require an additional 80,000 lines. Then, to know the software is programmed correctly, you have to test all 100,000 lines, and the trouble with testing is that its cost grows EXPONENTIALLY rather than LINEARLY: testing 100,000 lines of code is not merely five times the work of testing 20,000 lines, but perhaps fifty or even five hundred times the work.
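To make the input-validation point concrete, here is a minimal sketch of the classic SQL injection case, using Python's built-in `sqlite3` module with a hypothetical `users` table. The unsafe version builds the query by string concatenation, so attacker-controlled input rewrites the WHERE clause; the safe version uses a parameterized query, which treats the input strictly as data:

```python
import sqlite3

# In-memory database with a hypothetical users table for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (name) VALUES (?)",
                 [("alice",), ("bob",)])

# Attacker-controlled input typical of a SQL injection probe.
user_input = "alice' OR '1'='1"

# UNSAFE: concatenating the input into the SQL string lets it rewrite
# the WHERE clause, so the query matches every row in the table.
unsafe_query = "SELECT * FROM users WHERE name = '" + user_input + "'"
unsafe_rows = conn.execute(unsafe_query).fetchall()
print(len(unsafe_rows))  # 2 -- the injection matched all rows

# SAFE: a parameterized query binds the input as a single string value,
# so the injection attempt matches no user at all.
safe_rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print(len(safe_rows))  # 0
```

The fix costs one line of care rather than thousands of lines of defensive code, which is exactly why lists like the OWASP Top 10 keep hammering on a handful of well-understood mistakes.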

I have said for a long time that lots of bridges, buildings, and houses had to fall down before civilized society came up with building codes, and eventually, I see something similar to building codes in the software development arena. The international aspect of software development will be a challenge, as will be determining who is ultimately responsible for a software bug or glitch.

“Security” in the software world covers many things. For example, you might feel perfectly secure in your house, yet someone could throw a rock through your window, set it on fire, or blow it up with a bomb, and your security has been compromised. Living in the country, you might feel secure with unlocked doors, while in an urban area you might not feel secure even with deadbolt locks and bars on your windows.

The article likens suing software developers for security holes to suing a window manufacturer when someone breaks your house window. That seems silly to most people, but suing if your house falls over in a strong breeze is not silly, because building codes exist to keep that from happening. That is the type of software building code I see coming. Ultimately, though, I feel the software developers personally should not be held responsible, because of the economics of the situation. Developers are often keenly aware of what is built out and what is missing in their software, but they are spread across many projects and pulled in many different directions; in my opinion, a missing security control is really the responsibility of management, because management decides where the developers spend their time. Returning to the housing analogy: if your house falls over in the breeze, you're not going to sue the guy swinging the hammer, but rather the architect who designed the house or the company management that oversaw the building process.

Software building codes might also stifle innovation. Some of our favorite software might never have seen the light of day had such codes existed in the past, but for software that demands security, there will certainly be demand for security-certified products, I would think. The costs of building to these new codes would be passed on to the customer, though, and they could be immense once you factor in the costs of defending yourself in court on top of building, testing, and marketing the software in the first place.

For more information about software bugs and what you can do to prevent them, I recommend consulting the OWASP Top 10 and the SANS Top 25. These lists of the most common and most damaging security flaws in websites and software are something every developer, and every manager in charge of software development, should be aware of.

http://www.techrepublic.com/blog/european-technology/should-developers-be-sued-for-security-holes/1109

LINK: Mexico Hotel Giant Puts Its IT in Texas

http://www.computerworld.com/s/article/9229729/Mexico_hotel_giant_puts_its_IT_in_Texas_

LINK: Shifting IT Delivery to Tablets: The Strategic Issues

http://www.zdnet.com/blog/hinchcliffe/shifting-it-delivery-to-tablets-the-strategic-issues/2092

LINK: IBM Faces the Perils of “Bring Your Own Device”

http://www.technologyreview.com/business/40324

LINK: U.S. Tech Worker Shortage Looms, Study Warns

http://www.informationweek.com/news/global-cio/outsourcing/240000853

LINK: Why Does Software Security Keep Falling off Your Budget?

“Addressing the problem strictly off statistics, we know for a fact that approximately 3 out of 4 modern attacks against your enterprise or organization come at your applications. Whether it’s at your website, at the mobile app you’ve deployed, or your enterprise API – you’re being attacked through the place where the lowest defenses are – the application. So why is it that we keep spending 3 out of 4 budgetary dollars on network security?”

http://www.infosecisland.com/blogview/21304-Why-Does-Software-Security-Keep-Falling-off-your-Budget.html