Infrastructure vs. Application Security Spending
A recent study published by 7Safe, the UK Security Breach Investigations Report, analyzed 62 cybercrime breach investigations and states that in “86% of all attacks, a weakness in a web interface was exploited” (vs. 14% at the infrastructure layer) and that the attackers were predominantly external (80%). These results are largely consistent with the US-based Verizon Data Breach Investigations Report (2008), which tracks over 500 cases. Combine this knowledge with the targeted Web-related attacks against Google, Adobe, Yahoo, the US House of Representatives, TechCrunch, Twitter, Heartland Payment Systems, bank after bank, university after university, country after country -- the story is the same. It’s a Web security world.
It has been said before, and it’s worth repeating: adding more firewalls, SSL, and the same ol’ anti-malware products is not going to solve this problem!
The reason that Web security problems persist is not a lack of knowledgeable people (though we could use more), perfected security tools (they could be much better), or effective software development processes (still maturing). A fundamental reason is something I and others have been harping on for a long while now: organizations spend their IT security dollars protecting themselves from yesterday’s attacks, at the network/infrastructure layer, while overlooking today’s real threats. Furthermore, these dollars are typically spent counter to how businesses invest their resources in technology. To illustrate this point I’m going to borrow inspiration from Gunnar Peterson (Software Security Architect & CTO at Arctec Group).