Hey, security people: developers want secure code too

I've been working in software security for 20 years with companies of all sizes, and when I hear security people interact with developers (or worse, talk about developers behind closed doors), my impression is that security people aren't really "getting" them -- like, at a fundamental, human level. The developers I know are generally passionate people who want to build cool stuff, and take pride in their work. They're fascinated by exploits. They view security problems as embarrassing and try to fix real issues quickly.

But:

  1. You can't ask them to care more than their boss cares, or more than is reflected in their KPIs and SLAs. They're normal human beings, so they'll prioritize what they're incentivized to prioritize. It's kind of obvious, but I'll say it anyway: anything you don't incentivize, you're effectively de-incentivizing. If there were an easy button to press, they would press it. Easy buttons are hard to come by, which brings us to point #2.

  2. Noisy and painful tools have screwed up the relationship and primed developers to be defensive, cynical, and dismissive toward security. I've written before about how this happened. This is the hardest thing to work around organizationally, because security teams are rarely measured on how easy they are to work with, or on how many false positives they put in front of developers -- they are, at their core, an organization that measures security and reacts to incidents. If they don't share in the pain they inflict on developers, there is little incentive for them to become better partners. Almost no tools bring solutions to problems. They cite, report, log, punish, warn, alert -- but where are the tools that fix? That patch?

  3. Most developers are conscientious and want their code to be high quality, performant, and yes -- secure. Most people want to be good at what they do. Yet developers are broadly written off as ignorant and combative. Some do exhibit those traits, for sure -- much like people from any demographic. My theory is that interactions with these types linger in memory because they are more notable, or more traumatic.

What Is the Developers' Responsibility?

Here are some ways developers must "meet the problem", though:

  1. Keeping an open mind about risk is important. Many developers don't intuitively grok that their code is part of the organization's attack surface, or how it can be exploited. I used to teach software security to developers, and it felt like every class started with two or three skeptics who, by the end of the two days, were genuinely interested in the topic. These skeptics always made the class more interesting through lively discussion. I personally tend to view skepticism as a favorable trait, but skepticism without genuine curiosity is... not helpful.

  2. Recognize that security people are conscientious too. They want what's best, but sometimes don't have the code-level expertise to help directly. Starting with an empathetic mindset will get developers closer to what should be their primary goal: alignment with the security team on what's worth an interruption. I've never met a security person who wasn't happy to "turn off a rule", or whatever the equivalent is, after a well-meaning discussion.

  3. Recognize that tooling is imperfect because security is hard. Even though tools can hit the website, see the code, or watch the system running, they don't have perfect knowledge or the business context to correctly classify all the vulnerabilities in an app. Some very smart people have been working on this problem for a very long time, and it's hard. That isn't an excuse for the tools not to improve, or a declaration of defeat. We should demand more from our tools, but with a spirit of empathy rather than self-righteousness. It's often more efficient and less frustrating to use the tools as anti-bikeshedding devices and code to "make our collectively chosen tool happy". Avoid expensive arguments that justify the use of anti-patterns when it might be a much quicker win to just fix the thing.

My bottom line: developers want a secure product, and they understandably want to do as little work as possible to get there. There are times developers will go above and beyond to help, but their goodwill isn't a stable leg in the foundation of a good product security program.

And if you want to squeeze every bit of help you can from them along the way, work on being reasonable, knowledgeable, and empathetic. It's hard to make enemies with that combination!