A version of this article originally appeared in Dark Reading. It has been updated for syndication here.
In our industry, many security experts have made a habit of predicting the hot-button issues for the coming year (and I am no exception), but with more than five billion sensitive data records stolen in 2019, I figured it would be more accurate to predict what won't be happening in cybersecurity in 2020, or indeed the foreseeable future.
I sat down with my co-founder, Matias Madou, and we had a few laughs putting together this list. It's intended to be somewhat of a lighthearted outlook, but it does make you reflect on the long road ahead to fortify every organization against malicious cyberattacks.
Now, let's look into our crystal ball and reveal our predictions. We won't be seeing:
I think every security professional on the planet has been waiting for the day when they are no longer derailed by identifying SQL injection bugs, ad nauseam.
Sadly, we've been waiting for this day for more than twenty years, and we'll keep waiting for at least one more. It's the vulnerability that, like a cockroach, has survived every tactic thus far to eradicate it for good.
It's frustrating to say the least, since the remedy has been known for pretty much the same length of time. However, the prioritization of security best practices at every stage of software development (especially right from the beginning) remains too low, and is certainly inadequate in light of the vast increase in code production since the discovery of SQL injection.
(Want to understand more about the coding patterns that could lead to SQL injection? Take the challenge here).
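For readers unfamiliar with the pattern, here is a minimal sketch of how SQL injection arises and how the long-known remedy, parameterized queries, prevents it. It uses Python's built-in sqlite3 module; the table, column names and helper functions are illustrative, not from any particular codebase.

```python
import sqlite3

# Set up a throwaway in-memory database with one privileged user.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_vulnerable(username):
    # Vulnerable: attacker-controlled input is concatenated into the SQL,
    # so the input can rewrite the query itself.
    query = f"SELECT role FROM users WHERE username = '{username}'"
    return conn.execute(query).fetchall()

def find_user_safe(username):
    # Safe: a parameterized query treats the input strictly as data,
    # never as SQL syntax.
    return conn.execute(
        "SELECT role FROM users WHERE username = ?", (username,)
    ).fetchall()

payload = "' OR '1'='1"
print(find_user_vulnerable(payload))  # returns every row: [('admin',)]
print(find_user_safe(payload))        # returns nothing: []
```

The classic `' OR '1'='1` payload turns the concatenated query's WHERE clause into a tautology, leaking every row, while the parameterized version simply finds no user with that literal name. Every mainstream database driver has supported placeholders like this for decades, which is exactly why the bug's persistence is so frustrating.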
Ah, developers and the AppSec team. Will they ever get along, or are they destined to a life of rivalry like Rocky vs. Apollo? The short answer is yes, they can get along, but right now their priorities are often very different.
When they meet in a project environment, they're on different pages and clash right at the final hurdle: when AppSec specialists are poring through a developer's code. The developer has built beautiful, functional features (which is their top priority) that are torn apart if security vulnerabilities are discovered. The software security guru has, in effect, called their baby ugly and forced the developer to go back and fix any bugs, often delaying deployment.
In our current state, this won't be fixed until developers and AppSec teams work towards a common goal, which is the creation of secure software. It's not going to happen as a default process in 2020 (and with many companies, quite a few years after that), but with the advent of the DevSecOps movement and beyond, developers are recognizing the need to upskill in security and work to a higher standard; one that includes security objectives from the beginning.
In 2020, 2025, 2030... it's almost guaranteed that we will be short-staffed globally when it comes to security expertise.
According to a report from (ISC)², there are around 2.93 million cybersecurity positions currently unfilled. This is almost certainly going to get worse before it gets better, and there is no hidden security army waiting to march to our rescue in the next few years.
In the immediate future, our best chance to address this skills shortage is to make security an organizational priority and upskill our existing workforce, and that means empowering developers with the training and tools to code securely, as well as creating a company-wide security culture. Most current AppSec teams are probably fighting against well-known, old security bugs (see point 1 above). If we ensure they don't have to spend precious time and effort fixing these common issues, they will have more bandwidth to focus on tough security problems (at the moment, these are likely to encompass issues with APIs, as well as building tools that can fit development pipelines).
The world is being digitized at a staggering rate, and societal demand is not going to waver. There are approximately 111 billion lines of code written each year, and this number will only grow larger and more terrifying (for the already-stretched AppSec teams, anyway).
Increased code volume inevitably increases potential security vulnerabilities, so any thinking around 2020 being a slower year for software production and breaches is about as realistic as unicorn racing being held at the Grand National.
As I said, more code means more vulnerabilities, and this presents more opportunities for attackers to find a way to steal data.
At least 5.3 billion records were stolen worldwide in 2019, and defense against attackers is still a bit of a desperate, reactive scramble. This number may not double in the next twelve months, but I think it will get close.
As an example, let's look at the historical trends of reported stolen data in the United States year-on-year:
Looking at the period from 2005 to 2010, there is a steady rise with a little ebb and flow between years. One factor worth highlighting is that in 2009, reported breaches declined sharply compared to 2008, yet the number of records stolen distinctly increased despite fewer breaches.
Data records are the new gold; they have value to an attacker. The chart reflects a general upward trend in breaches and number of stolen records, with a huge peak in 2017. The number of attacks trended downwards in 2018, perhaps due to tougher security measures, but the number of records obtained was the highest it has ever been. Cyberattacks will become increasingly sophisticated, high-volume and are not going away any time soon. And globally, we are unlikely to see a downturn in 2020 as companies rapidly produce more software than ever before. They are the gatekeepers of our data and need to work smarter to protect our privacy.
If there is one thing developers love, it's watching hours upon hours of computer-based training (CBT) videos. In fact, such is the demand for this captivating content, Netflix will announce a whole new sub-category dedicated to generic security training videos.
Er, nope. Not now, not in 2020, not ever.
For developers, their introduction to security is often by way of workplace compliance training. Secure coding is rarely part of their tertiary education, and on-the-job training can be the very first encounter with software security. And, unsurprisingly, they often don't like it.
For developers to take security seriously - and for training to be useful - it has to be relevant, engaging and contextual to their jobs. One-off compliance training - or an endless stream of dull videos - is not the way to a developer's heart, and it's not going to reduce vulnerabilities.
If you want the developer cohort to have any chance of becoming a security-aware defensive force against common vulnerabilities, get them working with real code examples - the kind they would come across in their day-to-day tasks. Make the learning bite-sized, easy to build upon, and incentivize it with a sense of fun. For a security culture to thrive, it must be positive, engaging and develop real skills and solutions across the organization.
Who knows? With the right training, a developer might recognize their inner security champion and spread the benefits of secure coding far and wide.
Okay, so this is clearly no laughing matter. I've said many times that the world simply won't care about cybersecurity until people start dying from a cyberattack-related incident.
The problem is that this has already happened, and it went largely unnoticed.
Cyberattacks against US hospitals have been linked to a rise in heart attack deaths in 2019. Of course, the attackers did not cause lethal cardiac events in patients, but their ransomware attacks on hospital systems and equipment slowed treatment times for critical care.
This study from the University of Central Florida analyzed 3,000 hospitals, 311 of which had experienced a data breach. Hospitals affected by a security incident took an average of 2.7 minutes longer to give suspected heart attack victims an ECG, likely because procedural changes, newly implemented security measures and IT support issues took up more time than they did previously. Identifying and treating a heart attack is a race against time, and those hospitals saw an additional 36 deaths per 10,000 heart attacks per year on average.
This is shocking, and unfortunately, I think it will get worse before it gets better. This should serve as a stark reminder that we all need to do more to uplift software security and make it front-of-mind in every business.
If you type "hacker" into an image search, you will inevitably uncover thousands of images of a hooded, faceless figure typing away at a laptop, or a similar figure in a Guy Fawkes mask.
This stereotyped image of a hacker is getting really tired, and, well, makes everyone look like a bad guy. There are plenty of security good guys and girls, and the negative connotations around the "hacker" image do everyone a disservice.
Do I see this changing any time soon? Probably not, but it's nice to dream. For now, it's important to remember that security doesn't have to be scary.