By Manoj Chugh, ICT Expert
A few years ago, while presenting to a group of CIOs, I posed a question about the importance of backing up data and investing in a disaster recovery strategy. As expected, I received a muffled response. Most felt that the deluge of data they held was "exhaust waste" that had to be somehow captured and protected for compliance purposes. It did not help their cause that doing so took budgets away from the more important need of creating high-performance, robust "production environments".
Data backed up in secondary storage was not helping them deliver value to the business in any way. I then quizzically asked my second question: "Who is the custodian of these data assets, both online and offline?" A deep, uncomfortable silence followed. I waited for a few moments and then stated the obvious: of course, it was everyone sitting in that room. They were the "keepers" of all these data assets. There was little option but to carve out money from their limited IT budgets and spend it on projects focused on managing non-production data, projects which did not really deliver meaningful impact. It was a bitter pill that IT leaders had to swallow. "Expensive insurance," many muttered. "Gosh, why do we have to do this?"
The world has moved on since then, at a rather accelerated pace. Today, data forms the backbone of the digital enterprise. It is the most valuable asset of a business. In fact, it is not only production data that matters, but also the data that is backed up: objects, video, voice; cumulative data sets that play a key role in helping organizations derive deep insights.
These data sets help businesses understand what their customers are thinking, drive better experiences, throw up ideas for potential new products, and reveal what motivates employees and how to increase productivity or improve profitability, among much else. Digital applications powered by unstructured data are driving growth. In fact, unstructured data is growing by leaps and bounds: it is expected to reach a whopping 90 Zettabytes over the next three years. A Zettabyte is 1,000 Exabytes; everything mankind has ever spoken is estimated to fit in about 5 Exabytes. We are talking of truly gargantuan numbers here.
Enterprise storage has often been compared to an iceberg. Above the surface of the ocean sits primary storage, which runs business-critical applications; this is 10% to 20% of the overall environment. The rest consists of file and object storage, backup, analytics, and dev/test, all dispersed across siloed secondary storage infrastructure including deduplication appliances, backup devices, NAS and cloud gateways. Each of these silos needs to be provisioned and managed through its own interfaces and processes, truly resembling the smokestacks of yesteryear. Scale-up and scale-out technologies have offered solutions, but until a few years ago one had to choose between one approach and the other. Thankfully that has changed: technologies now allow all secondary data to be consolidated and managed from a single web-scale platform that spans from the edge to the core and on to the cloud. This consolidation becomes particularly important as we work to secure these vast pools of disparate data.
With backup data now moving up the value chain, from being viewed as "toxic waste" to becoming a key enterprise asset, it is not surprising that cybercriminals see it as a prime target. Ransomware attacks have doubled over the past year, and organizations are paying five times more ransom than they were until recently. In certain markets, over 10% of enterprises have suffered a ransomware attack. This does not just have a cost implication: the impact on enterprise reputation can be disastrous, quite apart from opening an organization up to a slew of lawsuits. The ability to counter such attacks is becoming ever more critical.
Enterprise backup is considered part of an organization's defence strategy. Cybercriminals realize that if they penetrate this bastion, rich dividends will follow, as enterprises stand exposed and vulnerable. Since a large part of an organization's data assets sits in backup, protecting backup data from an attack, detecting one when it happens, and recovering quickly afterwards are becoming increasingly important. Backup data needs to be immutable: stored in a manner in which it cannot be encrypted, modified or deleted. Continuous monitoring needs to be enabled so that anomalies become visible. In the unlikely event of a breach, it must be possible to recover a clean copy of the data from anywhere in the global footprint, including the public cloud, to enable quick recovery.
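To make the immutability idea concrete, here is a minimal, illustrative sketch of a write-once, read-many (WORM) store in Python. The class and method names are hypothetical, not any vendor's API; real platforms enforce this at the file-system or object-store layer.

```python
import hashlib


class ImmutableSnapshotStore:
    """Toy write-once, read-many (WORM) store: snapshots can be
    written and read, but never overwritten or deleted."""

    def __init__(self):
        self._snapshots = {}  # snapshot_id -> immutable bytes

    def write(self, snapshot_id: str, data: bytes) -> str:
        if snapshot_id in self._snapshots:
            raise PermissionError(
                f"snapshot {snapshot_id!r} already exists and is immutable")
        self._snapshots[snapshot_id] = bytes(data)
        # Return a content hash so later reads can be verified for tampering.
        return hashlib.sha256(data).hexdigest()

    def read(self, snapshot_id: str) -> bytes:
        return self._snapshots[snapshot_id]

    def delete(self, snapshot_id: str):
        raise PermissionError("immutable store: deletion is not permitted")
```

The point of the sketch is the contract, not the storage: once written, a snapshot can only be read, and any attempt to overwrite or delete it fails loudly.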
Creating immutable file systems with unlimited read-only snapshots, and ensuring that the original backup cannot be mounted or accessed by an external system, are some of the mechanisms that can be put in place as part of an overall strategy. Baselining the rate of data change and monitoring for anomalies is important in detecting wrongdoing; machine learning plays a key role here, given the ever-increasing size of data sets. And since a large part of the data is unstructured or in object form, monitoring file-level anomalies matters as well. If under attack, it is important to be able to recover quickly: the ability to swiftly pull out fully clean snapshots, which can then be restored, will alleviate the pain and save the day! Fortunately, technologies from firms like Cohesity can deliver peace of mind in all these areas.
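The baselining idea above can be sketched very simply: compare each day's data-change rate against a rolling baseline and flag large deviations. This is a crude statistical stand-in for the machine-learning-driven monitoring the paragraph describes; the function name and the one-week window are illustrative assumptions.

```python
from statistics import mean, stdev


def change_rate_anomalies(daily_changed_gb, threshold=3.0):
    """Flag days whose data-change rate deviates from a rolling
    one-week baseline by more than `threshold` standard deviations.
    A sudden spike in changed data is a classic ransomware signature,
    since mass encryption rewrites large swathes of files at once."""
    anomalies = []
    for i in range(7, len(daily_changed_gb)):
        window = daily_changed_gb[i - 7:i]          # one-week baseline
        mu, sigma = mean(window), stdev(window)
        if sigma == 0:
            sigma = 1e-9                            # flat baseline guard
        z = (daily_changed_gb[i] - mu) / sigma      # deviation score
        if abs(z) > threshold:
            anomalies.append((i, daily_changed_gb[i]))
    return anomalies
```

For example, a week of roughly 10-12 GB of daily change followed by a 200 GB day would flag that final day as anomalous, exactly the kind of signal that should trigger investigation before clean snapshots are restored.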
With cyberthreats increasing by the day, protecting data assets is becoming ever more important. Most of the data we hold sits in backup form, and cybercriminals view it as low-hanging fruit. It is time to tighten our belts and get ready for the big fight!
The views expressed by the author are personal
For more articles written by Manoj Chugh, please refer to this link: