One of the big takeaways from last week's Cyber UK conference was the recurring theme that we have reached an inflection point in cybersecurity - an inflection point in the impact that cyber-related crime is having on our society; an inflection point in the proliferation of malware, spyware and ransomware; and an inflection point in terms of how governments are responding, most notably through legislation that is passing responsibility for security back onto industry.
If you want to know just how seriously governments are taking the issue, have a read of the UK government's cyber strategy, the EU Cyber Resilience Act or the US's new cybersecurity strategy (paywall) that "calls on software makers and American industry to take far greater responsibility to assure that their systems cannot be hacked, while accelerating efforts by the Federal Bureau of Investigation and the Defense Department to disrupt the activities of hackers and ransomware groups around the world".
Of course, security has always been industry's responsibility, but we're now at a point where it will be baked into legislation. This is no bad thing. Cybersecurity has moved from a nagging concern (should we pen test after the fact?) to a key pillar of how software is built today (let's threat model up front and early).
This is not negotiable. Companies only get one chance at this. Just look at LastPass.
A security-first mindset
Recent research has estimated that 70% of all security vulnerabilities are memory-safety-related issues. Many of these lie deep inside our digital infrastructure, in the billions of lines of memory-unsafe C/C++ code that underpin our operating systems, compilers and runtime systems. And whilst projects such as CHERI might eventually help, these issues aren’t going away any time soon.
That’s just the code. Then there’s the people problem. In the first half of 2022 alone there were an estimated 200+ million ransomware attacks. In 2021 there were over 600 million, and about 120 million of them resulted in reputational damage. These are staggering numbers.
The question is, given the scale of the challenge, what can you do about it? No software system can ever be made completely safe (not when people are involved), but there is still much that we can do to mitigate the risks. This includes ensuring your people are versed in the basics of cyber essentials, managing your software development supply chain properly, leveraging modern cloud, adopting immutable architectures and so on.
A security-first mindset is now the entry price for doing business. If you leave your windows and doors open at night, don't be surprised to wake up and find your contents gone in the morning. The same goes for your software.
Baselining security know-how
Back in the 1990s, I was lucky enough to work on a product that emulated the AS/400 midrange computer. It was a large and complex piece of software built with great care and attention, but it was also a product of its time, designed to run on-premises in some hidden-away machine room. There was of course no such thing as the cloud back then.
Still, we put considerable effort into doing things properly, especially at the system boundaries, where clear contracts and assertions, backed by strong defensive checks and rigorous automated testing, prevented invalid data from getting past the API.
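To make that idea concrete, here is a minimal sketch of that style of boundary checking, written in modern Kotlin rather than the language of the original product. The names (TransferRequest, validate, the account-ID format) are purely illustrative, not taken from anything we built back then.

```kotlin
// Illustrative only: defensive validation at a system boundary.
// TransferRequest, validate and the account-id format are hypothetical.

data class TransferRequest(val accountId: String, val amount: Long)

class ValidationException(message: String) : Exception(message)

private val ACCOUNT_ID_FORMAT = Regex("[A-Z0-9]{8,12}")

fun validate(request: TransferRequest): TransferRequest {
    // Enforce the documented contract at the boundary rather than
    // trusting callers to send well-formed data.
    if (!ACCOUNT_ID_FORMAT.matches(request.accountId)) {
        throw ValidationException("accountId does not match the expected format")
    }
    if (request.amount <= 0) {
        throw ValidationException("amount must be positive")
    }
    return request
}
```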
But we weren't really thinking security first - not in the way we need to think about it today. It was more of a design-oriented mindset, focused on validation and on catching bugs in the calling code, not on nefarious use.
Good engineering discipline has always resulted in more secure software, but what constitutes good discipline has evolved along with the threats and the ever-increasing complexities of the software ecosystem (most of which is now online).
The traditional baseline skills of engineering best practices and test-driven development now need to be augmented with skills such as threat modelling and security by design. Security know-how has become foundational knowledge for every engineer.
Security as a culture
From the moment a new employee walks through your door, they need a platform of learning and support that fosters a shared understanding of cybersecurity and the skills to mitigate and protect against threats. This is not a one-and-done deal - like any good engineering culture, you are never done with security. You've got to be on it all the time.
What we're really talking about here is an overarching culture of DevSecOps, of integrating security practices and testing into every part of your software development lifecycle. It is a company-wide way of thinking and working.
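As one small, hedged illustration of what "security in every part of the lifecycle" can look like, the sketch below puts an abuse-case test (a path-traversal attempt) alongside ordinary unit tests so it runs on every commit rather than in a one-off audit. The function resolveUserFile, the paths and the test names are hypothetical examples, not a prescribed practice.

```kotlin
// Illustrative only: a security check living in the ordinary test suite,
// so it runs on every commit rather than after deployment.

import org.junit.jupiter.api.Assertions.assertTrue
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.assertThrows
import java.nio.file.Path

// Resolves a user-supplied file name inside a fixed upload directory,
// rejecting anything that tries to escape it.
fun resolveUserFile(baseDir: Path, requestedName: String): Path {
    val resolved = baseDir.resolve(requestedName).normalize()
    require(resolved.startsWith(baseDir)) { "path escapes the upload directory" }
    return resolved
}

class UploadPathSecurityTest {
    private val uploads = Path.of("/srv/uploads")

    @Test
    fun `rejects a path traversal attempt`() {
        assertThrows<IllegalArgumentException> {
            resolveUserFile(uploads, "../../etc/passwd")
        }
    }

    @Test
    fun `accepts a plain file name`() {
        val path = resolveUserFile(uploads, "report.pdf")
        assertTrue(path.startsWith(uploads))
    }
}
```

The point is not this particular check; it is that hostile inputs get exercised by the same pipeline that runs the rest of your tests.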
If you are not already at least part way down this road, then you need to start now. Actually, you needed to start yesterday. This is root-and-branch stuff that will seep deeply into your company culture.
Building the culture: where to start?
First, if your teams aren’t already versed in cyber essentials, you need to invest in training today. Second, make sure you level-set everyone regularly. Don’t just assume people know this stuff. It’s a moving target and you have to constantly reset. Here at Instil we put our engineers through security training every 18 months; for new folks it’s within the first few months of their arrival.
The challenge in all this is that ‘security’ is a pretty big topic. No one course will cover it all. Instead, any training needs to be delivered across a series of workshops, each focused on a specific topic.
The good news is that we have you covered. Along with our partner, Vertical Structure (a great local company whose daily bread is all things security), we have co-authored a range of security courses for engineers that cover all the essential best practices and techniques for building more secure software:
- Threat Modelling – A collaborative approach to identifying and protecting against security threats before developing your software.
- Engineering Best Practices – It turns out that clean code that is easy to work with is generally more secure than poorly written code.
- Modern Testing – Rigorous and automated security testing must be an integral part of the software development process, not something done after deployment.
- Security by Design – Embed a secure-by-design mindset into your engineering teams by helping them understand what secure architectures, designs, and approaches look like.
- Cloud Security – The modern cloud does so much of the security heavy lifting, but you still need to know what good practice looks like.
- Continuous Delivery – Responding to change quickly and delivering software continuously requires high levels of engineering maturity. This is where the rubber hits the road.
These courses will set your teams in the right direction, but if you really want to make sure the security-first mindset sticks, you need to create an environment where everyone in your organisation is talking about security, all the time.
For example, we have a dedicated company-wide Slack channel where we constantly discuss issues, threats, events and so on. Get this right and it will change hearts and minds.
Ultimately, you own and drive your culture. We can help effect change, but the impetus for change has to come from within.