The Seven Habits of Highly Insecure Software


access routes unprotected. The result: the reappearance of supposedly fixed bugs or alternate access routes that bypass security mechanisms.

Habit # 4: Insecure Defaults
We are all guilty of the mortal sin of clicking "Next" or "Finish" in an installation wizard without reading the details, accepting the recommended configuration. But is it a sin? The application's developers and testers know more about the application than we do, so it seems natural not to worry over obscure installation options and simply accept the defaults. Most users think this way, and I can't say that I blame them. So what does this mean for security-conscious testers? It means that we need to ensure security out of the box. We have to make sure that default values err on the side of security and that insecure configuration choices are clearly explained to users.
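To make the idea concrete, here is a minimal sketch, in Python, of what "secure out of the box" can look like in code. The option names (allow_remote_admin, require_tls, session_timeout_minutes) are hypothetical; the point is that every default favors the safer value, and anything that weakens security is surfaced to the user rather than buried in a wizard.

```python
from dataclasses import dataclass


# Hypothetical installation options: each default is the safer choice.
@dataclass
class InstallOptions:
    allow_remote_admin: bool = False      # off unless the admin turns it on
    require_tls: bool = True              # encrypted transport by default
    session_timeout_minutes: int = 15     # short sessions by default


def security_warnings(opts: InstallOptions) -> str:
    """Describe any setting that weakens security so the installer can warn the user."""
    warnings = []
    if opts.allow_remote_admin:
        warnings.append("remote administration is enabled")
    if not opts.require_tls:
        warnings.append("TLS is disabled")
    return "; ".join(warnings) or "all defaults are secure"


if __name__ == "__main__":
    print(security_warnings(InstallOptions()))                        # secure out of the box
    print(security_warnings(InstallOptions(allow_remote_admin=True))) # explicit, warned choice
```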

Habit # 5: Trust of the Registry and File System Data
When developers read information from the registry, they trust that the values are accurate and haven't been tampered with maliciously. This is especially true if their code wrote those values to the registry in the first place. One of the most extreme vulnerabilities is when sensitive data, such as passwords, is stored unprotected in the registry. We have found that passwords, configuration options, CD keys, and other sensitive data are often stored unencrypted in the registry, ripe for the reading.
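Here is a minimal Python sketch (Windows-only, using the standard winreg module) of the safer mindset: treat registry data as untrusted input and validate it before use, even if your own code wrote it. The key path and value name are hypothetical.

```python
import winreg

# Hypothetical location of an application setting.
KEY_PATH = r"Software\ExampleApp"
VALUE_NAME = "CacheDir"


def read_cache_dir() -> str:
    """Read a path from the registry, then validate it instead of trusting it."""
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        value, value_type = winreg.QueryValueEx(key, VALUE_NAME)

    # Anyone with access to this hive may have edited the value since we wrote it.
    if value_type != winreg.REG_SZ or not isinstance(value, str):
        raise ValueError("CacheDir has an unexpected type")
    if len(value) > 260 or ".." in value:
        raise ValueError("CacheDir looks tampered with")
    return value
```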

Habit # 6: Unconstrained Application Logic
It's pretty clear that we need to examine individual functions to make sure that they are secure. If a feature used in a Web browser is not supposed to allow the reading of any file except a cookie, then there's a pretty good chance that a test case was run to verify that. Features are not likely to be as well constrained when they are combined or when commands are executed in a loop. Constraining loops can be an exquisitely difficult programming task. Many denial of service attacks are made possible by getting some benign function (such as one that writes out a cookie) to execute over and over again and consume system resources.
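A minimal sketch of constraining such a benign, repeatable operation follows. The limits are hypothetical; what matters is that the function hits a hard ceiling instead of consuming disk or memory without bound when an attacker drives it in a loop.

```python
# Hypothetical hard limits on a cookie-writing function.
MAX_COOKIES = 50
MAX_COOKIE_BYTES = 4096

_cookie_store = {}  # name -> bytes


def write_cookie(name: str, data: bytes) -> None:
    """Write a cookie, refusing to grow past fixed resource limits."""
    if len(data) > MAX_COOKIE_BYTES:
        raise ValueError("cookie too large")
    if name not in _cookie_store and len(_cookie_store) >= MAX_COOKIES:
        raise RuntimeError("cookie quota exceeded")
    _cookie_store[name] = data
```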

Habit # 7: Poor Security Checks with Respect to Time
The ideal situation is that every time a sensitive operation is performed, checks are made to ensure it will succeed securely. If too much time elapses between time-of-check and time-of-use, the possibility that an attacker can get in the middle of the transaction must be considered. It is the old "bait and switch" con applied to computing: Bait the application with legitimate information, then switch that information with illegitimate data before the application notices.
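A minimal Python sketch of this time-of-check/time-of-use gap when creating a file, and one way to narrow it: let the operating system perform the check and the action as a single atomic step instead of checking first and acting later. The function names and path handling are illustrative, not a complete hardening recipe.

```python
import os


def save_report_racy(path: str, data: bytes) -> None:
    """Racy: the path can change between the existence check and the open."""
    if not os.path.exists(path):          # time of check
        with open(path, "wb") as f:       # time of use: an attacker may have swapped the path
            f.write(data)


def save_report_atomic(path: str, data: bytes) -> None:
    """Safer: O_CREAT | O_EXCL makes the existence check and creation one atomic step."""
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_EXCL, 0o600)
    with os.fdopen(fd, "wb") as f:
        f.write(data)
```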

Using these seven habits as a guide to where security bugs hide in your software project will help ensure a successful outcome. There's no such thing as 100 percent bug-free software. Our goal, however, is to get as close as possible.

About the author


Herbert H. Thompson, Ph.D., is director of security technology at Security Innovation LLC. He earned his Ph.D. in mathematics from the Florida Institute of Technology. Herbert is co-author (with James Whittaker) of How to Break Software Security: Effective Techniques for Security Testing (Addison-Wesley, 2003), and is the author of numerous papers on software security and testing.
