Cyber Security FM! (… or “why aren’t we teaching developers security”)

Recently, I was afforded the opportunity to present at the “Cyber Security FM” event, a community-driven collaboration with Women in ICT. This was a fantastic event with a great turnout, and it generated some really good discussion (apologies for the haphazard dinner organisation). Without further ado:

Slides are available here: CSFM – Combined Slides.

A few points of food for thought from the night:

  • Stop white-knighting. The number of comments we received about the name (especially with FM standing for “Fresh Meat”) was surprising. Personally, I don’t overly care what an event is called, as long as it brings the community together and is useful to the people who attend. “Cyber Security FM” has a nice ring to it anyway!
  • Non-technical subjects are awesome. A lot of us in the security testing industry like to talk about policy and procedure as if they were beneath us. In the modern environment, policy and procedure (covering everything from data classification to incident response) are the other side of the infosec coin – without them, technology can only go so far.
  • Exposure is everything. A lot of developers turn up to these events, and there’s always interesting dialogue about the exposure developers get to the security sphere.

At one point, the question of “why aren’t people teaching developers security” was raised. This is an excellent question for which I don’t really have a coherent answer. The general approach, across a number of industries, is to combine security expertise (advisory, security testing, automation) with education, carefully prioritising the time of security professionals for greatest effect.

That said, this approach is ridiculously difficult to scale. The development community outnumbers the security community by many orders of magnitude, and the two operate on diametrically opposed principles: developers care primarily about getting their code out to meet a deadline, whereas security requires additional controls, both technological and process-based, that slow the development process down.

The largest swathe of middle ground seems to be an increased focus on education and exposure across the board. Developers (and their managers, who drive the code delivery bus) must understand that security is a shared responsibility, and that they are part of that “shared”: delivering a product with a glaring security flaw is worse than delivering a product late.

It should be noted that this whole discussion seems to be one the information security community takes for granted, so I'm definitely keen to hear some alternative viewpoints. Does anyone have a demonstrably working approach to this kind of thing?

Future events will be announced here.

