In a digitally driven world, we are at an ever-increasing risk of data theft. With large organizations acting as the gatekeepers of our precious information, many are recognizing the need to implement stringent security standards.
Much of the initiative around "shifting left", that is, introducing security much earlier in the development process, simply doesn't move the needle far enough. There is an implication there that we are still beginning the process the wrong way, ultimately backpedaling to achieve the outcome of more secure software. We must start left, enacting a cultural shift that positively engages development teams and arms them with the knowledge they currently lack. However, all training and tools are not equal.
In this article, we explore the ways that key leaders like security awareness and development managers can truly empower their developer cohort, transforming them into the defensive front-line against costly cyber attacks.
In the age of frequent data breaches affecting some of the world's most trusted organizations, company leaders have looked to the security industry to provide guidance on avoiding the financial, reputational and showstopping disaster that is a successful attack.
For quite a while, AppSec specialists (including myself) have advised that we must indeed "shift left". In keeping with DevOps best practice, as well as better software security outcomes, many of us advised that the security part of a software build must come sooner in the software development lifecycle (SDLC). It should not be the final, costly step - rather, it should shift closer to the start of the process, with AppSec teams engaged early as software projects come to life.
This is not bad advice, and it's certainly better than the old way of doing things (which, if the amount of stolen data out there and age of the vulnerabilities used to heist it is any indication, isn't working anyway). However, if we actually started left, the security outcomes would be far more positive.
Shift left, start left... what's the difference? The difference lies in how you engage your development team. Truly, they are the key to delivering more secure software, much cheaper than later-cycle toolchains and manual code review can manage. In an ideal world, every single developer writing software would have the knowledge and tools to code securely from the very beginning. They would spot potential flaws, mitigating them before they're committed (and thus far more expensive to weed out and fix). There would be a dramatic reduction in the security bugs we've seen for decades - ones that are still responsible for allowing attackers in through the back door. Those windows of opportunity in the form of SQL injection, cross-site scripting and broken authentication would close.
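To make that concrete, here is a minimal sketch (illustrative only, using Python and an in-memory SQLite database with made-up names like `login_vulnerable` and `login_safe`) of the kind of SQL injection flaw a security-aware developer would spot before committing, and the parameterized-query fix that closes it:

```python
import sqlite3

# Throwaway in-memory database with a single user (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_vulnerable(username: str, password: str) -> bool:
    # String formatting lets attacker-controlled input rewrite the query.
    query = ("SELECT * FROM users WHERE username = '%s' AND password = '%s'"
             % (username, password))
    return conn.execute(query).fetchone() is not None

def login_safe(username: str, password: str) -> bool:
    # Parameterized query: input is bound as data, never parsed as SQL.
    query = "SELECT * FROM users WHERE username = ? AND password = ?"
    return conn.execute(query, (username, password)).fetchone() is not None

# The classic payload bypasses the vulnerable check but not the safe one.
payload = "' OR '1'='1"
print(login_vulnerable("alice", payload))  # True  - attacker walks in
print(login_safe("alice", payload))        # False - the back door is closed
```

The fix is a one-line habit, which is exactly the point: caught at the keyboard it costs nothing, caught in a pen-test months later it costs a remediation cycle.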
However, right now, there simply isn't enough emphasis on security at the vocational level, and on-the-job secure coding training varies wildly. As a result, developers rarely have what they need to enable an organization to start left. It's time for those in leadership positions to work together and advocate for broader security awareness, with their direct knowledge of and contact with developers vital in driving programs that work. After all, development managers were once in their developers' shoes, on the tools, at a time when the security space was difficult to navigate at the best of times.
You know how it goes: mention "security" to your typical developer, and you're likely to be met with an eye-roll at best, or puzzlement at worst. Generally, the whole security thing is seen as someone else's problem.
A developer has a primary responsibility of building software that is functional, brimming with innovative features and delivered within a tight project timeline. Security is rarely a priority at the coding level, and can even be seen as a tedious blocker to rapid delivery and getting creative. AppSec has the task of meticulously checking code, pen-testing and then reporting the bad news: the presence of security vulnerabilities in code that is often already committed, and functioning quite fine otherwise. It's an expensive process in an environment that is often stretched for resources and time, with the setup bound to cause a rift between two teams that have ultimately the same goal, but speak completely different languages.
Now, in this climate, the reception to mandatory security training is likely to be quite chilly. But the power to ignite a security mindset in every developer isn't a pipe dream. With the right kind of training and support, they can start to bake security into their software and take responsibility for the security outcomes they are able to control. If developers themselves can take care of the common bugs, that frees up the expensive specialists to iron out the truly complex problems. And if you're in the position of managing a development team, you can be instrumental in bridging this gap and helping your team see the benefits.
When was the last time you got really excited about learning something new? In the times you did, words like "mandatory", "compliance" or "seventeen hours of video" probably didn't come to mind.
Developers are no different. They are clever, creative and love to solve problems. It is quite unlikely that watching endless videos on security vulnerabilities is going to engage them, make the content memorable or keep it relevant to their everyday roles and responsibilities. In my time as a SANS instructor, it became apparent very early that the best training is hands-on, forcing participants to analyze and be challenged intellectually, using real-world examples that test their brain and build on prior learning. Gamification and friendly competition are also powerful tools to get everyone on board with new concepts, while remaining useful and practical in application.
Another scary factor is that a lot of security training goes unmonitored. Nobody likes to feel as though Big Brother is watching, but what is the point of spending time, money and effort on education if no-one is checking whether it is relevant?
The right solution can make secure coding fun, relevant, engaging and measurable. Challenge your developers, treat them well and make it a special event. Gamified training lights up the reward centers in the brain, and offering an incentive to keep learning, push knowledge boundaries and, quite simply, build a higher standard of software, is a win-win.
Creating secure software in an environment with a poor security culture is like trying to win a marathon with a boulder chained to your ankle: virtually impossible, and unnecessarily difficult.
Gamified training, head-to-head tournaments and a commitment to assisting developers in their security growth help immensely in driving a positive security culture, with AppSec and development teams gaining much more insight into each other's day-to-day work. Better relationships grow and thrive, and the (often limited) security budget isn't burned through on fixing a "Groundhog Day" scenario of the same small, annoying bugs time and time again.
There's another powerful byproduct, too: the unearthing of the security champions you never knew you had. Proper training that gets everyone involved, while also allowing for a thorough assessment, can uncover those who not only have an aptitude for security, but actively display a passion for it. These champions are vital in keeping the momentum going and acting as a point of contact between teams, overseeing peers and upholding best practice policies. Implementing a solid champion program, one that includes recognition and executive support, is a feather in the cap of the organization, as well as looking mighty impressive on the individual's CV and bolstering their future career.
When you commit to a positive security culture, responsibility is shared and a greater tier of secure software excellence can be achieved. Ultimately, every person in the software development lifecycle must adhere to a simple mantra: if it's not secure software, it's not good software.
Spread the word.