
Companies Are Bringing Data Back from the Cloud. Now They Need a Place to Put It

In this special guest feature, Shridar Subramanian, Vice President of Global Product Management and Marketing at StorageCraft, discusses “cloud data repatriation,” a growing trend in which companies are returning at least some of their core data and applications to their on-premises data centers. StorageCraft is a global provider of data management, protection, and recovery solutions. For more than two decades, Shridar has brought innovations to market that help organizations build successful IT infrastructures, turning data from an operational challenge into a key strategic asset. Shridar joined StorageCraft with the acquisition of Exablox in January 2017. He holds an M.S. in computer science from Penn State University and an MBA from the University of Chicago.

Until recently, if you weren’t in the cloud, you were nowhere. Companies were racing to move their applications and data to the cloud, expecting benefits such as reduced costs, increased flexibility, and greater collaboration. Now, however, a rethink is happening. Companies are looking to return at least some of their core data and applications to their on-premises data centers. The trend is called “cloud data repatriation,” and it appears to be gaining steam. IDC reports that 80 percent of organizations repatriated workloads last year and that, on average, companies expect to return 50 percent of their public cloud applications to hosted private or on-premises locations over the next two years.

So what’s driving this surprising turn of events? First and foremost is the fact that the cloud isn’t quite the silver bullet it was hyped up to be. In many cases, moving all data to the cloud is not as cost-effective, secure, or scalable as many organizations anticipated.

Sure, public clouds offer a higher level of flexibility, but they can be surprisingly expensive, especially where data storage is concerned. As well as being costly to store in the cloud, data sets often prove both slow and costly to download when they’re needed on-prem. The cloud also has a history of being too slow and costly for the transmission of edge data, such as unstructured data produced by Internet of Things (IoT) devices. This unstructured data is growing at hyperspeed. Indeed, IDC predicts that the world’s total data will increase from 33 zettabytes in 2018 to 175 zettabytes by 2025, and that 80 percent of that data will be unstructured. Those are head-spinning numbers, and companies, understandably, are struggling to keep up.
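As a sanity check on those figures, the annual growth rate they imply can be worked out in a few lines. The zettabyte numbers are IDC’s; the calculation itself is just an illustration:

```python
# Back-of-the-envelope: annual growth rate implied by IDC's forecast
# of 33 ZB (2018) rising to 175 ZB (2025).

def implied_cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two data-volume estimates."""
    return (end / start) ** (1 / years) - 1

rate = implied_cagr(33, 175, 2025 - 2018)
print(f"Implied annual growth: {rate:.1%}")  # roughly 27% per year

# Unstructured share of the 2025 total (80 percent, per IDC):
print(f"Unstructured data by 2025: {175 * 0.80:.0f} ZB")  # 140 ZB
```

Even at “only” 27 percent compounded growth, data volume more than quintuples over the seven-year span, which is why storage planning horizons keep shrinking.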

Just when organizations think they’ve got a handle on their data, more and more of it comes flooding in, whether it’s email, multimedia phone messages, audio files, video files, text files, or social media posts. Moreover, machine-generated data, as well as data created by IoT devices, is adding new and more massive waves of unstructured data to the mix.

All this unstructured data presents large storage and security challenges. At first, when cloud storage rose to prominence, organizations thought the answer was to move the vast majority of their data—both structured and unstructured—to the cloud.

However, these same organizations soon figured out that the cloud is not only more expensive than they thought, it is also hard to access in a timely fashion when they need specific data, due to the cloud’s inherent latency.

That’s why we’re now seeing the cloud repatriation trend, in which more and more organizations are moving to a hybrid infrastructure that involves keeping some data and applications in the cloud while returning other data and applications to an on-premises infrastructure.

The reality is that data volumes in the cloud have become quite unmanageable. This means that it can be more beneficial in terms of cost, security, and performance to move at least some of your company’s data back on-premises.

But as companies bring their data back on-prem, this raises the question of where and how to store it all.

With the emergence of cloud repatriation, organizations need a data storage solution that can protect business data wherever it lives—on-premises, offsite, or in the cloud—and ensure that this data is always available, no matter what happens.

Any storage solution also needs to be highly scalable to keep pace with an organization’s data growth, which is often more than 100 percent per year. The right storage solution will allow organizations to cost-effectively add any number of drives, anytime and in any granularity to meet their expanding storage requirements with no configuration and no application downtime.
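To see how quickly that kind of growth outruns a fixed array, consider a rough sketch. The capacity and usage numbers below are illustrative assumptions, not figures from the article or any vendor:

```python
# Sketch: how quickly storage demand outgrows a fixed array when
# data roughly doubles yearly (the ~100% annual growth cited above).
# Starting capacity and usage figures are made up for illustration.

def years_until_full(capacity_tb: float, used_tb: float,
                     annual_growth: float = 1.0) -> int:
    """Years until usage exceeds capacity, assuming compound growth."""
    years = 0
    while used_tb <= capacity_tb:
        used_tb *= 1 + annual_growth
        years += 1
    return years

# A 500 TB array holding 100 TB today overflows in 3 growth cycles:
print(years_until_full(500, 100))  # 100 -> 200 -> 400 -> 800 TB
```

The point of the sketch is that at doubling rates, even a 5x capacity headroom buys only about three years, which is why incremental, no-downtime expansion matters.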

Lastly, an ideal storage system will offer analytics to help organizations quickly figure out which pieces of information are critical to their business and which are not. With this ability, organizations can make better decisions about which data sets can be pushed to the cloud, which can be stored locally and which need to be repatriated. Analytics can also help identify the data that should be backed up and the data that need not be, giving organizations an intelligent, tiered data architecture that provides rapid access to mission-critical information.
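As a loose illustration of that tiering idea, a placement policy driven by a couple of analytics signals might look like the following. The field names and thresholds are assumptions for the sketch, not any vendor’s actual design:

```python
# Illustrative sketch of analytics-driven tiered placement: route each
# data set to a storage tier using simple signals (criticality, access
# recency). Thresholds and names are assumptions, not a real product.

from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    mission_critical: bool
    days_since_access: int

def place(ds: DataSet) -> str:
    """Pick a storage tier from criticality and access recency."""
    if ds.mission_critical:
        return "on-premises"      # keep latency-sensitive data close
    if ds.days_since_access > 180:
        return "cloud-archive"    # cold data tolerates cloud latency
    return "cloud"                # warm, non-critical data

def should_back_up(ds: DataSet) -> bool:
    """Back up anything critical or recently active."""
    return ds.mission_critical or ds.days_since_access <= 30

logs = DataSet("web-logs", False, 365)
orders = DataSet("orders-db", True, 0)
print(place(logs), place(orders))  # cloud-archive on-premises
print(should_back_up(logs))        # False
```

In practice the signals would come from the storage system’s own analytics rather than hand-set fields, but the decision structure—criticality and access patterns driving placement and backup—is the same.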

The proliferation of data is reaching epic proportions, just when companies are discovering that they can’t simply upload it all into the cloud. What they need is a new approach to storage infrastructure that can manage their data growth and, at the same time, secure all of their unstructured data, wherever they put it.

Choosing the right storage system for the new reality of cloud repatriation gives organizations peace of mind. It assures them that they can cost-effectively manage their increasing data volumes and gives them the confidence that their data is always securely at their fingertips.

The good news is that storage technology is up to the challenge. It can be your secret weapon to getting better control of all your data and mitigating risk once and for all.

Sign up for the free insideBIGDATA newsletter.

Comments

  1. Total nonsense and hype to sell 1990s solutions. Their lame argument that there isn’t enough bandwidth between cloud and on-premises just shows the author’s lack of knowledge.

    Done right, the cloud is more secure than any on-premises solution and 1,000% more flexible.

    The only solutions that are not secure are ones not secured correctly; the physical location of the server doesn’t matter.

    Fake articles like this are for sales only. This isn’t reality.

    • Kelly Nowell says:

      Tom – I’m taking an objective look at this both ways. Can some of this be seen as FUD? Sure. But from a cost perspective it’s a real concern, especially if growth occurs faster than revenue. Depending on your business model (what you’re using cloud compute/storage resources for), the outcome can leave some companies cash poor and in a very difficult cash-flow situation. There is merit to a hybrid model, as that provides the most flexibility of all.

    • Clayton Barlow says:

      100% accurate. Total BS “article,” written by a person with a financial stake in moving back to on-prem.

    • Mr Armstrong says:

      Not for everyone. We can buy a complete replacement of hardware annually for the cost of cloud on our high-transaction, data-rich solution. Not to mention the inefficiencies it costs users in performance. Yes, if you do NOT know how to write your systems, do NOT know how to secure your systems, and do NOT know how to acquire your systems, you don’t belong on-premises. But there is no RIGHT one-size-fits-all…on-prem or cloud.

    • John Jones says:

      I agree. This article is nonsense. Companies save so much money and resources with cloud infrastructures. They’re cheaper, more secure and if done correctly, most of the management is automatic.

    • Mark Bretniv says:

      Poor Tom…you can see the spittle forming on the corner of your mouth as you typed that. Cloud isn’t for everyone, and a company that has already spent the capital on a DC would be better served by continuing to invest in it than by moving to a cloud provider and paying “rent.”
      While I agree the latency argument is BS, this article isn’t completely false. I’ve actually helped several clients I’ve contracted with move their data back from the cloud into their own DCs because they realized it wasn’t cost-effective long-term to keep it in the cloud.

    • Good luck with a <5ms latency requirement without a circuit that costs an arm and a leg. Hybrid will be the status quo going forward. Managed on-prem data centers mimic cloud offerings.

    • Really, I am seeing more companies ruing the day they thought the cost of cloud was the answer to everything. The problem is they uploaded unsuitable applications, and performance and service nosedived. I wouldn’t call a hyperconverged, hybrid on-prem option ’90s technology; I’d be a millionaire now if I could have sold that in the ’90s…

    • Tom, I disagree. Firstly, it was IDC, not StorageCraft, that made that prediction.
      Privacy and compliance are drivers in this move. Private hosting or edge computing offer cost benefits.

      For example, in the small and medium business market, companies ill-advised by their IT support get attracted to the cheapness of entry-level plans on Office 365, paying just a few dollars a month per user, e.g. less than NZ$15.

      However, if due diligence is performed and provision is made for security and industry-standard backup, the figure can rise to NZ$70 or more per user. A small 50-user network is then faced with paying NZ$3,500 per month, or over NZ$200,000 over a five-year life span. And that buys an awful lot of tin. And that is before the cost of local network support and maintenance, and monitoring of the Office 365 account.

      If you ask Microsoft reps the hard questions about Office 365, they will admit that the lower plans are insecure. A similar scenario exists for hosting non-Microsoft dedicated applications and platforms.

      However, technology is getting smaller, with backup and redundancy options shrinking along with it. In a few years’ time, with miniature storage, high-speed internet, 5G, IPv6, and miniature servers, companies will question the need to host shoebox-sized technology that runs their entire company in a data centre for hundreds of thousands of dollars a month versus on-premises, managed by one or two engineers.

  2. Sandeep Bansal says:

    Moving to the cloud from on-premises can be a big problem. The architect has to look at the cost point of view for the next 7 to 10 years. Backtracking to on-premises from the cloud will happen for sure, since moving from capex to opex creates an ongoing expense, which hurts when you pay month over month; an unanticipated scalability jump can be a big shock.

  3. Craig Harris says:

    A rather biased and misleading article to help a legacy company try to survive with solutions inferior to modern cloud-based offerings. Good luck to anyone falling for this!

  4. Ramakrishna Ponnaluri says:

    The case is that when companies have cash flow and good revenue, it’s easy to pay for cloud in good times, but when there is a global recession and belts have to be tightened, it becomes extremely difficult for companies to keep paying for cloud services. Keeping at least some core data within reach for business continuity, without paying for the cloud, will be key to survival in times of recession.

  5. This article might create confusion among system architects, but security has long been a major concern, and it doesn’t go away with the physical location of your infrastructure. The cloud is cheaper in terms of workforce and hardware, but you can’t compromise on performance optimization, which starts once you lay down the foundation.
