Anyone responsible for data security who doesn’t get a shiver down their spine when they read about yet another high-profile ransomware attack in the news is either doing something very right, or something very wrong.
The danger of falling victim to a cybersecurity issue is getting greater as the volume of attacks continues to rise and bad actors become increasingly sophisticated. Interpol has highlighted how Covid-19 affected both the number and nature of cyberattacks during 2020, and notes: “Vulnerabilities related to working from home and the potential for increased financial benefit will see cybercriminals continue to ramp up their activities and develop more advanced and sophisticated modi operandi.”
There’s no such thing as 100% protection
The natural reaction to such worrying news is to seek protection and build the walls, and there are plenty of firms out there whose livelihood depends on providing just that. The best of them do a grand job, and their regular threat reports indicate just how many attacks they defeat.
But let’s not kid ourselves. No organisation can ever ensure 100% protection from an attack, especially when attack types are changing faster than most firms update their defences. Data often sits in too many locations, some forgotten by the user, and too many of these areas are likely to fall outside those covered by upfront protection, scanning services and threat intelligence. Even approaches to backup and restore can be haphazard, augmented over time as new systems are added, resulting in complex backup routines and outdated scripts that are no longer fit for purpose.
How many organisations can say, with absolute certainty, that there are no data silos or duplicate systems sitting outside the main ‘protected area’ but with access to the network? How many can provide absolute assurance that every backup, live or archived, is completely clean of ‘infection’ and reliable?
Put the spotlight on detection and restore
If 100% protection is not possible, what is an organisation to do to protect itself? We would not for a moment advocate giving up on a protection service. As a first line of defence it is absolutely necessary, but multiple lines of defence are needed for robust and reliable security. The trickier you can make it for an attacker, the less likely they are to succeed. Beyond upfront protection and firewalls, one of the first lines of defence must be threat detection. Knowing there is a problem, perhaps before it materialises into a full-blown extortion attempt, with some hope of restoring systems and ejecting the attacker, is invaluable.
Sadly, too many organisations fail to recognise this and are punished for it. Consider the malware attack that is discovered because an unwitting employee has an issue and needs a restore, only for the IT team to find, hours or even days later depending on how the restore has been set up, that the ransomware has reinstalled itself. It had planted itself quietly in the backup, where it sat undetected, waiting for a restore to reinject it into the business.
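The pre-restore check implied here can be sketched in a few lines. This is a minimal illustration, not a product recommendation: the `KNOWN_BAD_SHA256` set is a hypothetical stand-in for a real threat-intelligence feed or antivirus engine, and real ransomware often mutates past simple hash matching, so a production setup would layer this with behavioural scanning.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-bad file hashes; in practice this would come
# from a threat-intelligence feed or an antivirus signature database.
KNOWN_BAD_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_backup(backup_dir: str) -> list:
    """Return paths in the backup whose SHA-256 matches a known-bad hash."""
    flagged = []
    for path in Path(backup_dir).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in KNOWN_BAD_SHA256:
                flagged.append(path)
    return flagged

# Refuse to restore until the backup scans clean, e.g.:
# if scan_backup("/mnt/backups/latest"):
#     raise RuntimeError("Backup contains known malware; do not restore.")
```

The point is not the specific mechanism but the principle: a backup should be treated as untrusted input and scanned before, not after, it is restored into the business.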
Proof of the pudding is in the eating
None of this is idle speculation. Look at any sector and there are examples of very serious outages from the past year in the UK alone.
Early in 2020 Redcar and Cleveland Council suffered a long-lasting outage due to what it reported as a ransomware attack. The attack started on February 8, and it took a month for services to be up and running again. The cost of getting over the problem has been put by the council at over £10 million. In October 2020 Hackney Council was the victim of a cyberattack, and even weeks later it had still not been able to bring all the data back online. The cost of getting over this attack is, as we write, still unknown as recovery is ongoing.
Of course, for nearly any recovery strategy, the data is only as current as the last backup taken. Every organisation has different needs, but each must weigh up a variety of factors to determine how frequently to back up, including the cost of downtime and the resources needed to bring the business back online. The answer will differ depending on your business size, the team you can dedicate to recovery, the nature of the business, the regulations you operate under, and of course budget and critical operations.
A bank, for example, could not only lose business, and therefore money; if the backup data used to recover is even a few hours old, it is in trouble. A small retailer selling plants, by contrast, could get by with weekly backups. It’s all relative, and the only people capable of assessing the criticality of backup and recovery for your business are you and your team. What is a niggle for one business is front-page news and a CEO firing for another.
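That trade-off can be made concrete with some back-of-the-envelope arithmetic: in the worst case, everything since the last backup is lost and the business is down for the full restore window. All figures below are hypothetical illustrations, not benchmarks for any real bank or retailer.

```python
def worst_case_loss_cost(backup_interval_h: float,
                         data_loss_cost_per_h: float,
                         restore_time_h: float,
                         downtime_cost_per_h: float) -> float:
    """Illustrative worst case: every transaction since the last backup
    is lost, and the business is down for the full restore window."""
    return (backup_interval_h * data_loss_cost_per_h
            + restore_time_h * downtime_cost_per_h)

# Hypothetical figures: a bank backing up every 4 hours versus a small
# retailer backing up weekly (168 hours).
bank = worst_case_loss_cost(4, 50_000, 2, 100_000)   # £400,000 exposure
retailer = worst_case_loss_cost(168, 20, 6, 100)     # £3,960 exposure
```

Running the numbers like this, however roughly, is one way for a team to justify a backup interval rather than inheriting one by accident.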
But what we can be pretty certain of is that an organisation can’t just park its data in backup and hope for the best.
Through a robust, reliable backup and restore setup, with strong malware detection capabilities, organisations have a genuine chance to protect themselves, and get back up and running, malware free, in less than an hour. However, without the combination of a front line of defence protecting against cyberattacks and a reliable set of measures for recovery when the front line inevitably fails, no organisation has an appropriate level of protection and recovery. Now, as we head into the unknown of 2021, how does your business stand up to attack?
Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.