
Ahead of World Backup Day, please find comments below from Dr. Johannes Ullrich, Dean of Research at SANS Technology Institute, and Lisa Erickson, Head of Data Protection Product Management at Veritas.

Dr. Johannes Ullrich explains why data should be considered ‘at risk’ if it can’t be found in at least three locations, while Lisa Erickson says organisations should implement comprehensive classification systems to understand the kinds of data they hold, and therefore where and how it should be stored and for how long.

Risks related to data 

Comments by Dr. Johannes Ullrich, Dean of Research at SANS Technology Institute: “Data should be considered ‘at risk’ if it can’t be found in at least three locations. Organisations should aim to maintain an on-premise copy, a cloud or online-remote copy, as well as an offline remote copy of critical data. In particular, sophisticated ransomware will attempt to disrupt recovery from backups, and any online backup, remote or local, is at risk.”
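
As an illustration of the ‘three locations’ guidance, the minimal sketch below checks a hypothetical backup inventory for an on-premise, a cloud and an offline remote copy of each critical dataset; the dataset names, location labels and data structure are assumptions made for this example only.

```python
# Illustrative sketch of the "three locations" check described above.
# Dataset names and location labels are hypothetical.
REQUIRED_COPIES = {"on-premise", "cloud", "offline-remote"}

critical_datasets = {
    "finance-db": {"on-premise", "cloud", "offline-remote"},
    "customer-records": {"on-premise", "cloud"},  # missing an offline copy
}

def at_risk(locations: set) -> bool:
    """Data is 'at risk' if any of the three copy types is missing."""
    return not REQUIRED_COPIES.issubset(locations)

for name, locations in critical_datasets.items():
    status = "AT RISK" if at_risk(locations) else "ok"
    print(f"{name}: {status} ({', '.join(sorted(locations))})")
```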

Cloud-based solution

Dr. Johannes Ullrich adds, “Attackers are exploiting backup system vulnerabilities to access confidential information or to disrupt recovery after a ransomware incident. Cloud backups are often more vulnerable. Controls used to monitor access to on-premise backups do not always translate one-to-one to cloud-based systems."

He stated, “When designing a cloud-based solution, organisations need to consider how access is controlled, how requests to retrieve or store data are authenticated, and how the backup lifecycle, from creation through retrieval to eventual deletion, is managed.”
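
One way to picture that lifecycle, assuming for illustration that the cloud copy lived in AWS S3 (no specific provider is named here), is a lifecycle rule that archives older backups and eventually deletes them. In the sketch below the bucket name, prefix and day counts are placeholders; access control and request authentication would be handled separately, for example with IAM policies.

```python
# Sketch only: managing a backup lifecycle (creation -> archival -> deletion)
# with an S3 lifecycle rule via boto3. Bucket name, prefix and day counts are
# illustrative assumptions, not the setup described in the article.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "backup-lifecycle",
                "Status": "Enabled",
                "Filter": {"Prefix": "backups/"},
                # Move older backups to colder, cheaper storage...
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                # ...and delete them once the retention period has passed.
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```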

On-premise backups

Dr. Johannes Ullrich continues, “Backup systems need to be redundant AND diverse. It does not help to have three copies of your data with the same cloud provider (even if they are in different zones). Use different technologies such as on-premise, off-site/offline, and cloud.”

He adds, “One of the main reasons to invest in on-premise backups is to speed up recovery. Cloud and offsite backups will almost always be slower. In some cases, cloud backup providers may have mechanisms to accelerate the recovery of large data sets by shipping hard drives instead of using slower internet connections.”

Test recovery speed, encryption 

Dr. Johannes Ullrich stated, “Make sure you test recovery speed to better estimate how long it will take to recover large amounts of data. Any data leaving your direct control, for example physical backup media being shipped offsite or cloud-based online backups, must be encrypted before it leaves the network you control.”
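
A simple way to act on that advice is to time a restore of a known-size sample and extrapolate to the full data set. The sketch below is illustrative only; restore_sample stands in for whatever restore call the backup tool actually provides.

```python
# Sketch: estimate full recovery time from a timed sample restore.
# restore_sample() is a placeholder for the backup tool's own restore call.
import time

def estimate_recovery_hours(sample_bytes, total_bytes, restore_sample):
    start = time.monotonic()
    restore_sample()                      # restore a sample of known size
    elapsed = time.monotonic() - start
    throughput = sample_bytes / elapsed   # observed bytes per second
    return total_bytes / throughput / 3600

# Example: a 10 GB test restore used to estimate a 5 TB recovery.
# hours = estimate_recovery_hours(10 * 2**30, 5 * 2**40, my_restore_fn)
```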

He adds, “Backups need to be encrypted while in transit and at rest at the backup location. This may, in some cases, cause additional complexity, but rarely used backup data should always be encrypted.”
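
As one possible illustration of encrypting backups before they leave the network, the sketch below uses the Python cryptography package’s Fernet recipe; the file names and the key handling are simplified placeholders, and any comparable authenticated encryption scheme would serve.

```python
# Sketch: encrypt a backup archive locally before it is shipped offsite or
# uploaded to the cloud. Uses the 'cryptography' package's Fernet recipe;
# file names and key handling are simplified placeholders.
from cryptography.fernet import Fernet

key = Fernet.generate_key()               # keep this key secure and offline
fernet = Fernet(key)

with open("backup.tar", "rb") as f:       # hypothetical local backup archive
    ciphertext = fernet.encrypt(f.read())

with open("backup.tar.enc", "wb") as f:   # only this file leaves the network
    f.write(ciphertext)

# Recovery: Fernet(key).decrypt(ciphertext) returns the original bytes.
```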

Comprehensive data classification

Lisa Erickson, Head of Data Protection Product Management at Veritas, comments, “Start the backup process with comprehensive data classification and implement deduplication. IT departments can’t afford to save their data indiscriminately as they face tight budgets and scrutiny over cloud spend ROI; 83% of UK enterprises are already overspending on public cloud.”

She adds, "Organisations should implement comprehensive classification systems to understand the kinds of data they have and therefore where and how it should be stored and for how long. Implementing identification, categorisation, and retention policies will help organisations organise their data and ensure that the critical and sensitive data is retained appropriately."
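
A toy example of such a policy, with categories, keywords and retention periods invented purely for illustration (they are not Veritas recommendations), might look like this:

```python
# Sketch: map classified records to retention periods. Categories, keywords
# and retention periods are illustrative assumptions only.
RETENTION_DAYS = {"sensitive": 2555, "business": 1095, "general": 365}

def classify(record: dict) -> str:
    """Very rough keyword-based classification, for illustration only."""
    text = " ".join(str(v).lower() for v in record.values())
    if any(k in text for k in ("passport", "payroll", "medical")):
        return "sensitive"
    if any(k in text for k in ("invoice", "contract")):
        return "business"
    return "general"

record = {"type": "payroll export", "owner": "finance"}
category = classify(record)
print(category, "-> retain for", RETENTION_DAYS[category], "days")
```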

Deduplication 

Lisa Erickson continues, “Also, they can reduce their attack surfaces by establishing policies, technologies, and auditing that reduce their data footprint through methodologies like deduplication.”
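
Deduplication is commonly implemented through content addressing: each chunk of data is stored once, keyed by its hash, and repeated chunks only add a reference. The sketch below shows the idea in miniature; the in-memory store and fixed chunk size are simplifications, not a description of any vendor’s product.

```python
# Sketch: content-addressed deduplication. Each unique chunk is stored once,
# keyed by its SHA-256 digest; a backup is just a list of digests ("recipe").
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024   # 4 MiB, an arbitrary illustrative chunk size
store = {}                     # digest -> chunk (stand-in for a real backup store)

def dedup_write(data: bytes) -> list:
    """Split data into chunks, store each unique chunk once, return the recipe."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        recipe.append(digest)
    return recipe

def restore(recipe: list) -> bytes:
    """Reassemble the original data from the stored chunks."""
    return b"".join(store[d] for d in recipe)
```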

She concludes, “Double down on backup at the edge. Organisations often don’t apply the same level of protection to the edge as they do in the data centre, frequently due to skills and staffing shortages. Each edge device needs to be protected and backed up, and the resulting edge data needs to be assessed, categorised, and protected accordingly.”
