Data Lifecycle Management Evolves in a Hybrid Cloud World

Blending private and public clouds at enterprise scale creates mammoth complexities when managing an explosion of data from an ever-expanding assortment of digital technologies.

By Tom Mangan

January 6, 2023

Data has a lifecycle that gets tougher to manage, especially in increasingly complex hybrid multicloud environments.

Current studies estimate the global amount of stored digital data in 2019 was 41 billion terabytes. In the last three years, that number has more than doubled – reaching 91 billion terabytes in 2022 – and if current projections hold, it will double again by the end of 2025. The pace of this growth will only increase with the rise of data-driven IoT sensors, artificial intelligence and high-bandwidth 5G mobile networks. Data lifecycle management brings order to all this runaway expansion, using technology to rationalize the creation, storage, analysis, backup, and deletion of data.

It's always been a handful to manage the data lifecycle, but the challenges have exploded with the proliferation of APIs, IoT systems, virtual machines, microservices, and container architectures, along with everything else IT teams use to give their employers a digital edge.

And then there's cloud computing. Public cloud platforms can transform storage, archiving, and disaster recovery. Private clouds can harden security and ensure compliance. The appeal of using some combination of both is strong: 82% of organizations adopted a hybrid multicloud strategy in 2022.


The question for IT teams and their employers is how to manage the data lifecycle effectively in the new hybrid cloud world. It requires a clear strategy and the right technologies to address problems that arise from siloed IT systems, lack of control or visibility across those systems, and specialized teams and vendors. Technologies such as Nutanix Mine can automate backup operations to help manage the hyper-growth and sprawl of data.

The Weight of Data

It's reasonably quick and easy to move applications and virtual machines (VMs). Data, however, does not have that kind of flexibility.

"Data is sacrosanct — it has gravity," said Priyadarshi Prasad told The Forecast.

Prasad, former general manager of Nutanix Objects and Mine (software for storing unstructured data, protecting it, and managing data lifecycles), explained that data must be stored properly, accessed easily, and protected perpetually to ensure reliability and accuracy. It doesn't have the mobility of apps or VMs.

"It takes time to move data from one place to another, so maintaining a continuum of data lifecycle management in this hybrid cloud world is really a big challenge," Prasad said.

Troubles often arise when hybrid cloud technologies are incompatible with conventional private cloud tools. 

"It's like fitting a square peg in a round hole in some sense," Prasad added. 

"However, modern data lifecycle management tools can square the circle with APIs that standardize data mobility, simplifying the process of moving data within hybrid cloud environments. Using an API, you can start an on-premises or private cloud and move data to the public cloud if you like."


Alternatively, users can start cloud-native and use APIs to move data back to the private data center.

"The API-driven data movement occurs transparent to applications accessing the data, ensuring data accessibility while realizing data mobility," Prasad said.

Dealing with Silos

Mobility is central to data lifecycle management in hybrid cloud environments because data must have the flexibility to move seamlessly between private and public clouds. Silos – stores of data isolated within one part of an organization and inaccessible from other systems – tend to impede data mobility.

Silos have multiple dimensions: tools, people and processes. IT teams often use separate technologies to manage storage, applications, virtualization, networking and servers.

"They're all silos with different management interfaces, troubleshooting techniques and so on," said Mark Nijmeijer, former director of product management at Nutanix. 

Moreover, primary and secondary storage often create silos if backup and disaster recovery have separate architectures — requiring even more tools.

"IT teams also create silos through their expertise and work processes," Nijmeijer added. 

"Backup is a completely different scale than managing virtualization, so backup teams are very specialized," he said. 

Silos also happen during duplication and backup. Multiple copies of data in the primary architecture may become invisible when the data goes to backup. 

"You get this data sprawl," Nijmeijer said. "And you might store a lot more data than you actually need."

Achieving Scale

Public cloud platforms enable IT teams to scale rapidly in lockstep with new demand or seasonal business. In hybrid environments, primary and secondary architectures inevitably expand – or contract – on a scale that affects data lifecycle management.

"The moment you give me that infrastructure, I'm going to start generating data — new PowerPoints as an IT worker, new code, whatever it is," Prasad said.  

As the world adapted to the COVID-19 pandemic, many IT teams scaled up overnight with virtual desktop infrastructure (VDI) and desktop-as-a-service (DaaS) to serve their remote workforces. These technologies use file shares that require specific data-lifecycle approaches, according to Nijmeijer.

"You have to make sure those file shares are protected and can be easily restored and can scale out as well," Nijmeijer explained. "If you're growing from 100 users to 1,000 users, then your file services need to scale out very, very efficiently as well."  

Managing Backup and Disaster Recovery

Before the advent of low-cost cloud computing, tape drives were the go-to source of data backup.

"In the old world, tape was an order of magnitude cheaper than primary storage," Nijmeijer said. "But the moment you go to tape, you lose both data and visibility of the data." 

Recovering data from tape can take hours, days or weeks. 

"Hybrid cloud backup and disaster recovery can make backup data available much faster at dramatically lower costs,” Nijmeijer said.

But there may be trade-offs in hybrid clouds. 

"Now you need a cloud admin, somebody with an AWS (Amazon Web Services) account, et cetera, et cetera," Nijmeijer said. "You also lose data visibility. What's out there? How do I index it? How do I know how many copies of my data are there?" 

As more organizations shift to hybrid cloud strategies, Check Point reported that 46% of IT decision-makers cite an increasing lack of data visibility as a top security and strategic data management challenge.

Another piece is the orchestration of data recovery. Orchestration pulls everything back together when recovering systems from backups. Software and automation are increasingly crucial to data lifecycle management because IT systems and architectures are becoming so complex.
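In practice, orchestration largely means restoring components in dependency order: storage before databases, databases before the applications that need them. A toy sketch of that ordering step, using made-up component names (real orchestration tools also handle retries, health checks, and parallel restores):

```python
from graphlib import TopologicalSorter

def recovery_order(depends_on):
    """Return a restore order that brings up each component's
    dependencies before the component itself.

    `depends_on` maps a component to the set of components it needs.
    """
    return list(TopologicalSorter(depends_on).static_order())
```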

Tools for Simplifying Data Lifecycle Management

Data lifecycle management is a straightforward concept. Applications, sensors, and computing devices give life to data. At some point, data gets copied, analyzed, and stored on a hard disk or memory chip. When it's deleted, new data takes its place.

But things get tricky in an ever-expanding universe of data-driven workloads in hybrid cloud environments. At enterprise scale, it's nearly impossible without automation.
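At its core, that automation is a set of age-based rules applied to every piece of data. The thresholds below are invented for illustration; actual policies depend on cost, access patterns, and compliance requirements:

```python
def lifecycle_action(age_days, hot_days=30, warm_days=180, retain_days=2555):
    """Classify what should happen to a data object based on its age.

    Thresholds are illustrative: ~30 days hot, ~6 months warm,
    ~7 years of retention before deletion becomes allowed.
    """
    if age_days <= hot_days:
        return "keep-on-primary"      # actively used: fast storage
    if age_days <= warm_days:
        return "move-to-secondary"    # cooler: cheaper on-prem tier
    if age_days <= retain_days:
        return "archive-to-cloud"     # cold: low-cost cloud archive
    return "eligible-for-deletion"    # past retention: reclaim space
```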

That's driving tech companies to develop better tools for data lifecycle management in hybrid multicloud IT systems. Nutanix, which pioneered hyperconverged infrastructure (HCI) – software that virtualizes computing, storage and networking in a single management plane – developed Mine to automate and unify backup and disaster recovery, making it easier for IT pros to manage the data lifecycle across public and private clouds.

Mine brings value by simplifying complex data lifecycle management operations, according to Nijmeijer. 

"This makes it easy to manage data end-to-end, from creation to long-term archival, all under one platform,” he said. “This saves IT people a lot of time. Time that they typically spend keeping the lights on in the data center." 

This is an updated article that was originally published in July 2020. 

Tom Mangan is a contributing writer. He is a veteran B2B technology writer and editor, specializing in cloud computing and digital transformation. Contact him on his website or LinkedIn.

Michael Brenner updated this article. He’s a keynote speaker, author and CEO of Marketing Insider Group. Michael has written hundreds of articles on sites such as Forbes, Entrepreneur Magazine, and The Guardian, and he speaks at dozens of leadership conferences each year covering topics such as marketing, leadership, technology and business strategy. Follow him @BrennerMichael.

© 2023 Nutanix, Inc. All rights reserved.
