Defying Data Gravity for a Better Cloud

Large enterprises with global operations can have employees scattered across the planet. Smaller companies can be distributed as well, particularly now that many staffers sent home during COVID have decided to stay put, finding remote work a better work-life balance than the traditional office.

As a result, a lot of business data is now being generated in many different places. Yet employees and apps must be able to access it in real time, even if the data source is on the other side of the earth, because the ability to make fast, data-driven decisions has become table stakes. Not only do yesterday’s data storage approaches come up short, but the all-powerful cloud can deliver spotty performance when strained by masses of dispersed users. That becomes even more apparent when users are working with outdated home technology, as is often the case.

Simply put, you need speed and flexibility to enable those quick, focused decisions that can make millions of dollars or prevent losses just as large. Many who oversee technology will attest that that’s not easy unless, of course, you’re able to defy gravity, which is thankfully possible when it comes to data.

Getting Close to Users

When file data is stored in the cloud and workloads are on-premises, serious access slowdowns can occur. Remote access and data consolidation are often at odds: the more data is consolidated in one location, the more gravity it exerts, making data movement more complex and, in turn, sluggish. The situation is made even worse by that long-standing nemesis of remote access, latency.

What’s needed to attain limitless capacity can best be described as network-attached storage (NAS) that dynamically expands and reacts to “hot data” while maintaining the benefits of a standard file infrastructure. This removes the burden from traditional hardware. By transparently moving files to and from the cloud as needed, such a system can keep them in secure, cost-effective object storage. In doing so, enterprises gain the elastic, expanding capacity and flexibility that drives faster decision-making. It doesn’t hurt that this approach can also significantly lower costs.
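
To make the tiering idea concrete, here is a minimal sketch of one way a cold-data policy could work, assuming an S3-compatible object store reached through the boto3 SDK. The bucket name, 30-day threshold, and mount point are illustrative placeholders, not details of any particular product.

```python
import os
import time

import boto3   # AWS SDK for Python; any S3-compatible endpoint works similarly

COLD_AFTER_SECONDS = 30 * 24 * 3600        # treat files untouched for 30 days as "cold"
BUCKET = "my-archive-bucket"               # hypothetical bucket name
s3 = boto3.client("s3")

def tier_cold_files(root_dir: str) -> None:
    """Upload cold files to object storage while hot files stay on the NAS."""
    now = time.time()
    for dirpath, _dirnames, filenames in os.walk(root_dir):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if now - os.path.getatime(path) > COLD_AFTER_SECONDS:
                # Keying by relative path lets the file be recalled transparently later.
                key = os.path.relpath(path, root_dir)
                s3.upload_file(path, BUCKET, key)
                # A real service would now replace the file with a lightweight stub.

tier_cold_files("/mnt/nas/projects")       # illustrative mount point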

For instance, some file data services use virtual filers that cache heavy workloads and the data users access most often. Deployed at the edge, these filers bring data closer to end users, and that proximity alone raises performance to a level that can rival a standard NAS.
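
The caching behavior can be pictured with a short sketch: a least-recently-used cache sitting at the edge that falls back to the cloud master only on a miss. This is a simplified illustration rather than any vendor’s design, and fetch_from_cloud() is a hypothetical placeholder for the object-store read.

```python
from collections import OrderedDict

def fetch_from_cloud(path: str) -> bytes:
    """Hypothetical stand-in: pull the file's bytes from the cloud object store."""
    raise NotImplementedError("wire this up to the object-store backend")

class EdgeFileCache:
    """Keep the most recently used files on local (edge) storage."""

    def __init__(self, capacity: int = 1000):
        self.capacity = capacity
        self._cache = OrderedDict()          # path -> file bytes, ordered by recency

    def read(self, path: str) -> bytes:
        if path in self._cache:
            self._cache.move_to_end(path)    # cache hit: mark as most recently used
            return self._cache[path]
        data = fetch_from_cloud(path)        # cache miss: fall back to the cloud master
        self._cache[path] = data
        if len(self._cache) > self.capacity:
            self._cache.popitem(last=False)  # evict the least recently used file
        return data
```

In practice a virtual filer would hold file blocks on local disk rather than bytes in memory, but the hit-or-fetch logic is the same.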

Even so, users are creatures of habit, and they’ll be averse to any change unless access is familiar. So it’s imperative that those exploring such solutions ensure the deployment is indistinguishable from a traditional NAS, with the typical POSIX file-based interface that employees are used to.

With this method, a company can provide data access that users won’t perceive as any different from what they ordinarily experience, spurring adoption. All the while, the organization can pursue the cloud-first strategy it wants for the financial and scalability benefits inherent in the cloud model.

A Gold Standard

In such a solution, it should be standard practice to keep a master copy of every file in the cloud. Then, as end users work on cached or new files, each change can be versioned against that gold copy. Doing this delivers snapshot functionality on par with a traditional local NAS filer, and further, it’s possible to chunk, dedupe, encrypt, and store files as objects. Traditional files thus become immutable objects stored in the cloud, which can also go a long way toward staving off threats.
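
As a rough illustration of the chunk-and-dedupe step, the sketch below splits a file into fixed-size chunks, stores each unique chunk once under its SHA-256 hash, and records an ordered manifest as the new version. Encryption and the actual object-store backend are left out, and the in-memory dictionaries simply stand in for cloud storage.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024   # 4 MiB fixed-size chunks

def version_file(path, chunk_store, manifests):
    """Split a file into chunks, store only unseen chunks, and record a manifest."""
    chunk_hashes = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in chunk_store:    # dedupe: identical chunks are stored once
                chunk_store[digest] = chunk  # in practice: encrypt, then PUT to object storage
            chunk_hashes.append(digest)
    # The ordered list of chunk hashes is the file's new immutable version.
    manifests.setdefault(path, []).append(chunk_hashes)
```

Running version_file() twice on an unchanged file adds a new manifest but no new chunks, which is where the storage savings come from.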

Take, for example, a ransomware attack that encrypts locally accessible and cached files.

As we outlined above, all files will be maintained with a versioned history and stored as immutable objects. When ransomware is detected and isolated, an enterprise can quickly recover files, folders, and systems – in the form of those unencrypted and immutable objects – to a time just before the attack took place. A rapid recovery greatly curtails the financial toll of downtime and the damage it can do to a business’s reputation.
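
Recovery then amounts to choosing the right version. The sketch below assumes each stored version carries a creation timestamp (as the manifests above could) and simply picks the newest version written before the moment the attack was detected; the field names and dates are illustrative.

```python
from datetime import datetime
from typing import Optional

def restore_before(versions, attack_time: datetime) -> Optional[dict]:
    """Return the most recent clean version created before the attack."""
    clean = [v for v in versions if v["created"] < attack_time]
    return max(clean, key=lambda v: v["created"]) if clean else None

# Example: two stored versions, attack detected midday on May 2.
versions = [
    {"created": datetime(2023, 5, 1, 9, 0), "chunks": ["a1", "b2"]},
    {"created": datetime(2023, 5, 2, 17, 30), "chunks": ["a1", "c3"]},
]
print(restore_before(versions, datetime(2023, 5, 2, 12, 0)))  # restores the May 1 version
```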

A Cloud Above

To take simplicity even further, it’s feasible for companies to combine their distributed, siloed storage arrays under one cloud platform. This takes the complexity out of infrastructure on various levels, but equally important, it opens the possibility of having all functionality available through a single pane of glass. 

Should a team be working on a project with its members in various far-flung locations, collaboration is not a concern. Any revision anyone makes to a file can be updated automatically and quickly in the cloud master, and updates can be pushed to cached local copies just as fast. Add in distributed file locking, and no one can make changes while someone else is in the process of updating. This effectively eliminates file versioning confusion, with high-speed synchronization ensuring smooth collaboration.
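
Distributed locking can be sketched in a few lines: before a write, a site asks a shared lock table (which in reality would live in the cloud service, not a local dictionary) for exclusive access, and releases it once the edit has synced. The owner names and file path below are purely illustrative.

```python
locks = {}   # path -> owner; a local stand-in for a shared, cloud-hosted lock service

def try_acquire(path: str, owner: str) -> bool:
    """Grant the lock only if no other user currently holds it."""
    if locks.get(path) not in (None, owner):
        return False
    locks[path] = owner
    return True

def release(path: str, owner: str) -> None:
    """Release the lock, but only for the owner that holds it."""
    if locks.get(path) == owner:
        del locks[path]

# A site edits the file only while it holds the lock; the change then syncs
# to the cloud master and out to other sites' cached copies.
if try_acquire("/projects/report.docx", "alice@paris-office"):
    try:
        pass   # write the file, push the new version to the cloud master
    finally:
        release("/projects/report.docx", "alice@paris-office")
```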

In short, such an approach achieves that ever-expanding NAS concept, minus the collaboration hiccups and with less vulnerability to threats. It’s all about utilizing the cloud as more than just a place for storage, in a way that frees a dispersed organization from the grounding impact of data gravity and its costly limitations.
