As the world moves towards an ever more digital future, the amount of data we all generate is growing. It’s increasing on a personal level as we all use more smartphone apps, take high-res photographs and shoot 4K videos, and it’s also growing in the corporate world as more and more core business functions move online and into the cloud.
Storing all this data, keeping it safe, deleting it when required to by law and being able to retrieve it for business use when needed is a complex operation. But the worst thing you can do – the very worst – is treat it as if it’s all the same.
It turns out this is exactly what a lot of companies do. The amount of data they routinely store has grown steadily over the years without them really noticing, and usually the first sign they have that something isn’t right is when a bill arrives.
“The first thing any company needs to look at when it comes to figuring out how best to store and retrieve data is, what kind of data are they dealing with? There are so many different types of data, so it’s really important to put some criteria around the issue and to ask: how critical is the data to the business?” said Colin Boyd, storage platforms and solutions sales manager for Dell Technologies Ireland.
“We advise customers to try and draw up a chart to note: how critical is this data, and does it need to be kept for a long time? It’s basically about profiling the data and getting it into some sort of structure and then looking at the technology behind that.”
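The profiling chart Boyd describes can be sketched in a few lines of code. The criteria and tier names below are illustrative assumptions for the sake of the example, not a Dell Technologies product feature:

```python
# A minimal sketch of data profiling: score each data set against a few
# criteria, then map the profile to a broad storage class. The criteria
# and tier names are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class DataProfile:
    name: str
    business_critical: bool   # how critical is the data to the business?
    retention_years: int      # does it need to be kept for a long time?
    accessed_often: bool      # will it be retrieved regularly?

def recommend_tier(profile: DataProfile) -> str:
    """Map a profile to a broad (hypothetical) storage class."""
    if profile.business_critical and profile.accessed_often:
        return "primary (fast, on-premise or hot cloud)"
    if profile.retention_years >= 5 and not profile.accessed_often:
        return "archive (low-cost cold storage)"
    return "secondary (standard cloud or nearline)"

for p in [DataProfile("customer orders", True, 7, True),
          DataProfile("CCTV footage", False, 1, False)]:
    print(p.name, "->", recommend_tier(p))
```

Even a structure this simple forces the questions Boyd raises – criticality, retention, access frequency – to be answered per data set rather than for the business as a whole.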
Boyd said that many companies are defaulting to the public cloud to store data that they want to keep because they think they might need it in future. The problem is that the amount of data being kept is growing – and with it, the associated costs.
“I can think of one customer in the pharma space which started to go down that road, basically keeping data because they wanted to have it forever and they were afraid to delete it. But what happened then was the costs of the public cloud servers started to accelerate,” said Boyd.
“Once you get into that spiral, it’s extremely difficult to control your costs. So they took the decision to bring that data back on premise. And we were able to help them at that point with making sure that it was stored appropriately.”
The key questions Dell Technologies asks concern the types and age of the data, its size, how important retention is and how often it is going to be accessed. Security of the data is another key consideration.
“The reality is that there is no one-size-fits-all solution and it’s really important to identify which solution is best aligned to the type of data. We take a very consultative approach, because basically it’s important to avoid just throwing data into a repository and then letting it sit there.”
A major problem for companies is that storage plans that once made sense can start to cost a lot more as time goes on and data grows steadily.
“What effectively happens if you keep just throwing your data into the cloud is that the type of data we’re using starts to snowball. For example, if you’re keeping CCTV images shot around your premises for five or six years – and there’s no reason why you need to do that, but people do – and that’s going into a public cloud platform, then you’re at the behest of the public cloud provider as to what they’re going to charge you for that,” said Boyd.
“The pricing can change year on year, and if it becomes too expensive and you want to bring that data back on premise, that becomes very, very expensive. So there has to be a good reason for keeping data for long periods of time.”
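The snowball Boyd describes is easy to see with some back-of-the-envelope arithmetic. The per-gigabyte rate and growth figures below are invented purely for illustration, not any provider's actual tariff:

```python
# Illustrative cost sketch: a company that keeps everything forever,
# adding a steady amount of new data each month to a public cloud
# priced per GB-month. All figures are assumptions.
PRICE_PER_GB_MONTH = 0.02      # assumed rate in euro per GB per month
MONTHLY_NEW_GB = 500           # assumed new data generated each month

def annual_bill(year: int) -> float:
    """Storage cost in the given year (1-based) if nothing is ever deleted."""
    total = 0.0
    for month in range(12 * (year - 1) + 1, 12 * year + 1):
        stored_gb = MONTHLY_NEW_GB * month   # everything kept so far
        total += stored_gb * PRICE_PER_GB_MONTH
    return total

print(round(annual_bill(1)))   # year one
print(round(annual_bill(5)))   # same policy, five years on
```

With these assumed numbers the year-five bill is more than eight times the year-one bill, despite an unchanged retention policy: the charge applies to the accumulated total, not just the new data.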
For John McCleverty, country manager for Ireland with Rubrik, it’s important to remember that data storage isn’t just about storage – it’s also about retrieval. If your storage solution can’t deliver the data you need back to you when you need it, then it’s not up to scratch.
“Back-up is just expected to work, and when this is proven wrong it causes significant business impact. The first question people ask is: how quickly can we restore? Traditional vendors start to let customers down in this area because their tools were typically designed to do back-up and not recovery,” he said.
“The core principle that we work around is to simplify back-up and recovery. Do more with less; with the consumerisation of technology, the days of product specialisation are numbered and you don't need to be a specialist. Customers want instant access to their data, regardless of where it resides, and we can offer this with Google-like search capabilities. Quickly find a file, message or object from a back-up and restore without any downtime.”
McCleverty said that data is “only going to get bigger” and, as that happens, further fragmentation will occur.
“Really what customers want when they come to us is to be able to store and secure their data while also having instant access to the parts of that data they need to work with. And then, ultimately, they want to know what more they can do with the data,” he said.
“So once we back up the data, we index it, meaning that it’s easily instantly searchable. But security is also a major issue – and the main reason companies should be backing up today is because of the huge risk of ransomware.”
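Rubrik's indexing is proprietary, but the underlying idea – making a back-up instantly searchable by indexing it at back-up time – can be illustrated with a simple inverted index over file paths. This is a sketch of the concept only, not how Rubrik's product is built:

```python
# Toy inverted index: built once when the back-up is taken, so that a
# later search is a dictionary lookup rather than a scan of the archive.
from collections import defaultdict

def build_index(paths):
    """Map each lower-cased path token to the set of paths containing it."""
    index = defaultdict(set)
    for path in paths:
        cleaned = path.lower()
        for sep in "/.-":
            cleaned = cleaned.replace(sep, " ")
        for token in cleaned.split():
            index[token].add(path)
    return index

def search(index, term):
    """Instant lookup: no need to walk the backed-up data itself."""
    return sorted(index.get(term.lower(), set()))

backup = ["/finance/Q3-report.xlsx", "/hr/contracts/report.pdf"]
idx = build_index(backup)
print(search(idx, "report"))   # both files found from the index alone
```

The design point is that the cost of indexing is paid once, at back-up time, which is what makes “Google-like” search over old back-ups feel instant.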
Ransomware attackers are becoming more sophisticated in their efforts to extract cash from Irish companies. It’s not uncommon for them to spend time tracking down not just where companies store their data in the cloud in order to compromise it, but also where they store their back-ups.
“They understand that ultimately if they get the back-up data, you’re unable to recover, meaning you will pay a ransom. We have natively built into our platform the ability to use a proprietary immutable file system that protects against ransomware and promotes cyber resilience,” said McCleverty.
“Cyber attacks can’t alter any of our customers’ back-ups. In the event of a ransomware attack, simply restore from your back-ups with a single click. This should be a de facto requirement for any customer looking at modernising their back-up environment.”
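The immutability principle McCleverty describes – once written, a back-up cannot be altered or deleted within its retention window – is the classic WORM (write once, read many) model. Rubrik's file system is proprietary; the toy store below only sketches the general idea, with invented names throughout:

```python
# A toy WORM store: writes are final, deletes are refused while the
# retention window is open, and restores verify a checksum taken at
# write time. Illustrative only; not Rubrik's implementation.
import hashlib
import time

class ImmutableStore:
    def __init__(self, retention_seconds: float):
        self.retention = retention_seconds
        self._objects = {}   # key -> (data, sha256 digest, written_at)

    def write(self, key: str, data: bytes) -> None:
        if key in self._objects:
            raise PermissionError("object is immutable: overwrite refused")
        digest = hashlib.sha256(data).hexdigest()
        self._objects[key] = (data, digest, time.time())

    def delete(self, key: str) -> None:
        _, _, written_at = self._objects[key]
        if time.time() - written_at < self.retention:
            raise PermissionError("retention window active: delete refused")
        del self._objects[key]

    def restore(self, key: str) -> bytes:
        data, digest, _ = self._objects[key]
        # A mismatch here would indicate tampering with the stored copy.
        assert hashlib.sha256(data).hexdigest() == digest
        return data
```

Under this model a ransomware process that reaches the back-up target simply gets a refusal when it tries to encrypt (overwrite) or delete an object, which is why recovery can be a single restore operation.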
According to Grant Caley, chief technologist for Britain and Ireland with NetApp, one of the biggest issues facing companies dealing with the explosion of data that has occurred in recent years is that very few companies have robust deletion policies.
Instead, their policy is of the ‘better safe than sorry’ variety, and they are inclined to store everything, regardless of whether they should be doing that or not.
“This is something that enterprise-class companies have complained about for years – the difficulty of getting data owners to take ownership – with the result that they can’t delete anything. And that just means that what we’re having to host and store grows, but there are solutions to that from a cost perspective,” said Caley.
“That’s matching and merging two technologies together: typically fast but expensive storage for data accessed regularly, combined with much cheaper, glacially slow storage for data that is rarely looked at.”
NetApp provides capabilities that manage this kind of data tiering automatically, so that the owner doesn’t have to.
“It’s based on how cold the data is, when it was last accessed, etc, and so it’s moved automatically between those different cost tiers. And that means you don’t even have to put a management system into it, because that’s also where the expense in the past came from.”
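The tiering rule Caley outlines comes down to a single test on each object: how long since it was last accessed. The threshold and tier names below are assumptions for illustration, not NetApp's actual defaults:

```python
# Minimal sketch of access-based tiering: data that has gone "cold"
# (not touched recently) is assigned to a cheaper tier automatically.
# The 30-day threshold and tier names are illustrative assumptions.
import time

COLD_AFTER = 30 * 24 * 3600    # assumed: 30 days without access -> cold

def assign_tier(last_access, now=None):
    """Pick a storage tier purely from how cold the data is."""
    now = time.time() if now is None else now
    if now - last_access > COLD_AFTER:
        return "capacity tier (low-cost object storage)"
    return "performance tier (fast flash)"

now = time.time()
print(assign_tier(now - 3600, now))            # accessed an hour ago
print(assign_tier(now - 90 * 24 * 3600, now))  # untouched for 90 days
```

Because the rule needs nothing but a timestamp the system already records, it can run continuously in the background – which is Caley's point about not needing a separate management layer.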
Caley said this kind of technology is going to become more important as data growth rates climb exponentially. He pointed to the latest consumer smartphones – such as Apple’s upcoming new iPhone, which is expected to ship with up to one terabyte of storage on board.
“That’s a fantastic amount of data in a single phone, and if you multiply that out by the number of people carrying devices like that as well as all the duplicate data they generate in back-ups etc, it’s very easy to see where we’re going. The data economy is just going to grow even more rapidly, and yet nobody really wants to think about that because nobody has the time to sort through all that data,” he said.
“On top of that, we now have machines generating their own data in the form of artificial intelligence, and it’s more than we can really keep up with. We need automation to handle the storing and retrieval of all that data to be able to cope with it.”
Infrastructure still an issue for storage
If you’re lucky enough to be located in a major urban area and your company has access to high-speed broadband, then there are a wide range of internet-enabled cloud storage solutions out there for you.
That’s not the case for everyone. There’s an idea often found in science fiction that says the future is here, but it’s not distributed evenly. In other words, just because a technology is available doesn’t mean it’s available to everyone everywhere, and this is the case for many internet-based storage techniques.
“There’s no point saying to somebody ‘put your data in the cloud’ when what that would effectively mean is that they would bring their entire office’s connection to a grinding halt. That’s something that we’ve seen happen,” said Michele Neylon of Blacknight Solutions.
“We’ve worked with clients in the past who have had to ship us actual disks to upload for them because the quality of the connections available to them was so poor.”
Neylon thinks that the range of storage technologies on offer to Irish companies is fantastic and there is a lot to recommend them, but also that parts of Ireland are still underserved when it comes to high-quality broadband, and so these technologies aren’t available to everyone equally.
“If it takes you an hour to upload one file, then the idea of mirroring your entire data set in the cloud can be a pipe dream. This conversation can be very different depending on whether you’re talking to a small business in the centre of Dublin, Cork or Galway or one based even just 10km or 15km outside these areas,” he said.
“One can have fantastic fibre connection speeds and the other can be effectively in the dark ages as far as connection goes.”
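The gap Neylon describes is straightforward to quantify. The connection speeds and data-set size below are illustrative assumptions:

```python
# Rough upload-time arithmetic: how long a given data set takes to push
# over a given uplink. Figures chosen purely for illustration.
def upload_hours(gigabytes: float, mbps: float) -> float:
    """Hours to upload `gigabytes` of data over an uplink of `mbps` Mbit/s."""
    bits = gigabytes * 8_000_000_000       # 1 GB = 8 x 10^9 bits
    return bits / (mbps * 1_000_000) / 3600

# A 500 GB data set: urban fibre versus a poor rural uplink.
print(round(upload_hours(500, 100), 1))    # ~11.1 hours at 100 Mbit/s
print(round(upload_hours(500, 2), 1))      # ~555.6 hours at 2 Mbit/s
```

On the assumed figures, the same mirroring job that finishes overnight on fibre would take more than three weeks of continuous uploading on a 2 Mbit/s line – which is why cloud storage advice has to start with the state of the connection.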