Today in Washington, DC, experts from more than 30 US and international cyber security organizations jointly released the consensus list of the 25 most dangerous programming errors that lead to security bugs and that enable cyber espionage and cyber crime. Shockingly, most of these errors are not well understood by programmers; their avoidance is not widely taught by computer science programs; and their presence is frequently not tested by organizations developing software for sale.
And not just there: take a look at most bloody bespoke development shops. But it's worse than that, as Alex Wolfe so pithily puts it:
It really makes you wonder where we are today, when one sees that two of the top problems on the SANS list are improper input validation and improper access control (authorization). (I'm guessing there's also not much thought given to writing error catches. Probably the SQA or test folks have to tell today's youthful coders to add that stuff in afterwards.) Other goodies on the list include "hard-coded password" and improper initialization.
I guess you can't make this stuff up, but if it's as common as SANS says, I don't know if there's any hope at all.
I concur. But I'd like to point out that while error handling and input validation aren't sexy to code, users don't look at either one when selecting software. So what did you think was going to happen?
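Unsexy as they are, both of those things are also cheap to write. Here's a minimal sketch in Python (the function and variable names are mine, not from any particular codebase) of what basic input validation plus an actual error catch looks like:

```python
import re
import sys

# Whitelist what a username may contain, rather than blacklisting "bad" characters.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{3,32}$")


def parse_username(raw: str) -> str:
    """Reject anything outside a known-good character set."""
    if not USERNAME_RE.match(raw):
        raise ValueError("username must be 3-32 characters of [A-Za-z0-9_]")
    return raw


def parse_port(raw: str) -> int:
    """Validate a user-supplied port number instead of trusting it."""
    try:
        port = int(raw)
    except ValueError:
        raise ValueError(f"port must be an integer, got {raw!r}")
    if not 1 <= port <= 65535:
        raise ValueError(f"port out of range: {port}")
    return port


if __name__ == "__main__":
    try:
        user = parse_username(sys.argv[1])
        port = parse_port(sys.argv[2])
    except (IndexError, ValueError) as exc:
        # The "error catch" Wolfe is talking about: fail loudly, with context,
        # instead of letting bad input wander deeper into the program.
        print(f"bad input: {exc}", file=sys.stderr)
        sys.exit(1)
    print(f"connecting as {user} on port {port}")
```

A few dozen lines like that, written up front instead of bolted on by QA afterwards, covers a surprising amount of the SANS list.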
2 comments:
Sexy to code?
You can borrow my rubber gloves if you like. They're a very sexy shade of yellow and made of rubber, if you like that sort of thing.
When I first started in my current job I was going through one of the 'live' sites that was about to fall into my lap.
Database passwords stored in .inc text files, absolutely no input validation, and users removing their own comments over a connection that used a full-rights database account. I could go on for hours.
First thing I did was google up the exposed text files and show them to the ops guy.
I'm not in the least surprised about any of this!
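The anecdote above bundles several of the Top 25 into one site: credentials in a web-readable file, a full-rights database account, and no input validation. A hypothetical sketch of the safer arrangement, using Python with sqlite3 standing in for whatever the real database was, and names of my own invention: the password comes from the environment, the query is parameterized, and the WHERE clause enforces that users can only touch their own comments.

```python
import os
import sqlite3


def get_db_password() -> str:
    """Credentials come from the environment (or a secrets store),
    never from a web-readable .inc file sitting in the docroot."""
    try:
        return os.environ["APP_DB_PASSWORD"]
    except KeyError:
        raise RuntimeError("APP_DB_PASSWORD is not set") from None


def delete_own_comment(conn: sqlite3.Connection, comment_id: int, user_id: int) -> bool:
    """Parameterized query: the user never writes SQL, and the WHERE clause
    means you can only delete a comment you actually wrote."""
    cur = conn.execute(
        "DELETE FROM comments WHERE id = ? AND author_id = ?",
        (comment_id, user_id),
    )
    conn.commit()
    return cur.rowcount == 1


if __name__ == "__main__":
    # sqlite3 is a stand-in here; with a real server database the password from
    # get_db_password() would go into the driver's connect() call, for an account
    # granted only SELECT/DELETE on the comments table, not full rights.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE comments (id INTEGER PRIMARY KEY, author_id INTEGER, body TEXT)")
    conn.execute("INSERT INTO comments VALUES (1, 42, 'first!')")
    conn.execute("INSERT INTO comments VALUES (2, 7, 'second')")
    print(delete_own_comment(conn, comment_id=1, user_id=42))  # True: own comment
    print(delete_own_comment(conn, comment_id=2, user_id=42))  # False: someone else's
```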