In our latest State of Ransomware report, the majority of respondents said that they had more cybersecurity budget than they need (65 percent) and more cybersecurity headcount than they need (64 percent). This is a pretty shocking finding when we consider how widely discussed the global skills shortage is. It suggests that the real issue lies in where budget is invested and how resources are used. And nowhere is this clearer than in the public sector.
With vast amounts of sensitive information offering lucrative political and financial rewards, the public sector is a goldmine for criminals. You might expect such a tantalising combination to make the public sector the most well-protected industry against cyberattacks. But, as we all know, this isn’t the case.
The reality is that the public sector is funded primarily by the taxpayer. As a result, it operates on a finite and often limited budget, with IT department allocations frequently much lower than they need to be. The consequences of this, combined with swingeing cuts over the past few years, mean that many organisations are still operating on outdated legacy technology, leaving them more vulnerable to security breaches.
This is one of the most salient points. A recent report from the Cabinet Office found that nearly half the money the UK is spending on IT goes on supporting legacy IT systems – to the tune of £2.3 billion a year. That’s about half of the £4.7 billion the central government spent on tech in 2019. “A recent analysis by government security indicates that almost 50 percent of current government IT spend is dedicated to ‘keeping the lights on’ activity on outdated legacy systems, with an estimated £13-22 billion risk over the coming five years,” the report notes.
Let’s be clear: this is not a cost-saving measure. The bill that taxpayers foot includes propping up out-of-date legacy systems built on obsolete or waning technical platforms. Beyond the cost of maintenance alone, this creates heightened cybersecurity risks and prevents the introduction of new government services, because the risk of spending on new IT systems is still deemed too high. To prove the point further, the report singles out the Home Office, which has the biggest tech budget, noting it has “not been able to retire any of their twelve large operational legacy systems.”

Why do departments do this? It might seem cheaper, but our latest ransomware report found that the average cost of recovering after an attack is $1.4 million. Factor in the average ransom payment of $812,360, and that’s quite a hefty cost to bear – on top of the reputational damage, the risk to countless people and any fines you then incur. This sounds like a gamble to me, and not a very smart one.
The brain drain
Once you have your systems upgraded, the next step is hiring the best talent. Here we run into a couple of problems.
The first problem is the so-called ‘brain drain’, or the exodus of talent to the private sector for financial reasons. Given both the limited budgets and the urgent need for cybersecurity talent, the public sector urgently needs cost-effective measures to attract and retain talent within its walls.
A number of methods to do this are within arm’s reach. First, highlighting the culture of the organisation, through video interviews with current employees for example, can be an excellent recruiting tool. Second, emphasising the job security and the ‘meaningful’ aspects of roles that serve the general public can also be a useful measure. Both stability and the idea of ‘giving something back’ are benefits in higher demand since the onset of the pandemic. Increased flexibility is also on every jobseeker’s mind, and with so many IT jobs able to function remotely, this is something the sector can look to offer more of.
Finally, to attract and retain talent, offering training programmes and upskilling current staff is vital. This demonstrates to staff that they are valued and will be supported throughout their careers, while helping them develop into true experts in their field.
Even with all of these measures in place however, there is no denying the global cybersecurity skills shortage – the second of our two problems. The United Kingdom alone would need to attract approximately 17,500 new people every year into its cybersecurity sector to meet demand, and similar workforce difficulties have been reported in Australia, Italy, Japan, and the United States.
Education will help in the long term, but solutions are needed now, and this is why so many businesses in other sectors are leaning on managed services providers to help. The growth of cloud technology makes it a more viable option than ever, both to alleviate the pressure on in-house security analysts or IT staff and to address the skills shortage.
The other benefit of using managed services is gaining access to threat hunters, who find malicious actors that evade initial endpoint security. Attackers’ growing use of legitimate tools means that organisations should not sit back and wait for their anti-malware tools to alert them to a problem. If an attacker sneaks in, they can remain in a network for months, quietly collecting data, harvesting confidential material or obtaining login details that allow them to move laterally across the environment. In fact, our recent Active Adversary Playbook found that the average attacker spent 15 days in the network – time in which a threat hunter could have detected and blocked them. However much it may seem like a cost-saving exercise, waiting for an attack can end up costing more in the long run than proactively seeking out attackers.
So, what is the lesson we can take away from this? It isn’t simply that the public sector needs more money; it also needs to allocate its resources better to attract and keep cybersecurity experts within its walls. With the right partners, guidance and experience, this is possible. And, with cyberattacks on the rise, there has never been a better time to heed the call.