This is the third blog in a series on NetApp IT and the hybrid cloud.

Data management is critical to any successful hybrid cloud strategy, especially when using multiple clouds. In a true hybrid environment, enterprises should have the flexibility to move applications across different public and private environments based on their business requirements and cost considerations.


Business applications are relatively stateless and can be easily brought up and down in various environments. However, data must be managed differently because it has its own unique characteristics:

  • Life – Data needs to be maintained, synchronized, audited, archived, etc., throughout its lifecycle.
  • Value – Data is a corporate asset that must be protected.
  • Mass – Data requires time to move in and out of environments.

Cloud companies (such as Amazon Web Services (AWS) and Microsoft Azure) are aware of the demands of data management. They entice enterprises with a variety of perks (including free data uploads) to gain control of their data. The more data that cloud providers control, the more they lock in their customers for the long term and the more revenue they generate. Our primary goal is to use the Data Fabric, NetApp’s vision for the future of data management, to control our data and avoid vendor lock-in, among other things.


Data Management Considerations

Where data is hosted is critical to an enterprise data management strategy. It defines what type of choices and options an enterprise such as NetApp has as its cloud footprint grows. To provide IT with the maximum benefit of cloud services, a data management strategy should address the following five key areas:


  • Secure control and governance of data regardless of its location, and guaranteed data privacy, as mandated by government policies and in-country regulations such as Privacy Shield (previously Safe Harbor).
  • Access to data where and when applications need it to satisfy business use cases for disaster recovery, business continuity, and archiving.
  • Flexibility to migrate data and applications between different cloud providers, locations, etc., to avoid vendor lock-in.
  • Data compliance with company requirements (e.g., Sarbanes-Oxley (SOX), HIPAA) and the ability to satisfy audit and other governance processes.
  • Lower total cost of ownership, through reduced storage, personnel, and data transfer costs, as well as improved storage efficiencies.


NetApp IT Data Fabric Overview

NetApp IT’s Customer-1 program is the first adopter of NetApp products and services, running them in our IT production environment. Customer-1’s goal is to provide feedback to Product Engineering on a product’s performance so that a more stable product can be delivered to customers.


Customer-1 implemented the Data Fabric as the underlying architecture of our hybrid cloud strategy. The Data Fabric enables IT to manage data across multiple environments using standard tools, processes, and governance methodologies, independent of cloud providers or locations.


The Data Fabric delivers three major benefits to NetApp IT:


Data storage. NetApp® ONTAP® is the foundation of our Data Fabric, including management of all our public and private cloud data. Our private cloud leverages ONTAP-enabled FAS systems. As mentioned in my previous blog, we leverage NetApp Private Storage (NPS for Cloud) and ONTAP Cloud (a software-only version of ONTAP) for our public cloud workloads. NPS takes advantage of high-speed direct connections from a nearby colocation provider to leading cloud providers such as AWS and Azure. It also provides private storage options to augment elastic compute capabilities from these cloud providers. ONTAP Cloud delivers the same enterprise-class data management as on-premises storage. This enables NetApp IT to retain full control of our enterprise data at all times, irrespective of the data location or cloud provider.


Data replication. Within the Data Fabric, we use the SnapMirror® replication and SnapVault® backup features of ONTAP to move data between endpoints. This seamless transfer of bulk data provides the underlying transport for our data across various public and private clouds to deliver a variety of business use cases such as application migration, data replication, disaster recovery, etc.
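As an illustration, setting up a SnapMirror relationship between an on-premises volume and a destination volume (for example, on an NPS system) might look roughly like the following from the ONTAP CLI. This is a sketch only: the cluster, SVM, and volume names are hypothetical, cluster and SVM peering are assumed to be in place, and exact options vary by ONTAP version.

```shell
# Hypothetical names: svm_onprem:vol_app is the source volume,
# svm_nps:vol_app_dr the destination volume (of type DP).

# On the destination cluster, create the replication relationship:
snapmirror create -source-path svm_onprem:vol_app \
    -destination-path svm_nps:vol_app_dr \
    -type XDP -policy MirrorAllSnapshots

# Perform the initial baseline transfer:
snapmirror initialize -destination-path svm_nps:vol_app_dr

# Verify the relationship state and replication lag:
snapmirror show -destination-path svm_nps:vol_app_dr
```

Once initialized, the relationship updates incrementally on the policy's schedule, which is what makes it practical to keep bulk data synchronized across clouds.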


Data archiving. We use NetApp’s AltaVault® and StorageGRID® storage solutions for hybrid cloud data protection and archiving. AltaVault integrates with our NPS and FAS storage systems to back up data to a multi-site StorageGRID object data store. This combination provides us a truly scalable and tapeless backup solution to meet our data archiving and compliance requirements.



Data Fabric in Action

The Data Fabric enables the Customer-1 program to use the cloud as a flexible component in its integrated IT environment. We can choose the cloud that offers the right service level at the right price for that business customer. This framework opens up many benefits in how we manage our production environment:


  • We have complete control of data at all times, irrespective of the application location/cloud. The same on-premises data governance, security, privacy, and compliance methodologies are applied to cloud workloads as well, enforcing consistency across our IT environment and minimizing risk.
  • We can map the right workloads to the right clouds. When requirements change, due to performance or cost, we can easily move workloads in and out of a cloud without worrying about data migrations. This helps us avoid vendor lock-in, cloud data transfer delays, and extra charges.
  • We support a variety of cost-saving use cases. For example, the Data Fabric has enabled us to migrate our disaster recovery (DR) applications to the cloud and remove rarely used and costly compute from our on-premises data centers. We synchronize the data between our on-premises data center and our public cloud through NPS. Compute from public cloud providers is only used during a DR or testing event.
  • The Data Fabric allows us to use consistent storage standards and policies across various cloud stacks. By providing a homogeneous storage layer we can easily expand to meet data management capabilities across various technologies and cloud stacks, including AWS, Azure, and OpenStack.
  • Finally, the NetApp Data Fabric and NPS provide a rich set of enterprise features and capabilities that are not available in native public cloud storage. This allows us to standardize data management across all the platforms and eliminate the need for application re-design, new storage skills, new process development, etc., as technology changes.

The landscape of enterprise IT is changing rapidly with the rise of the cloud. Data management is a critical factor to consider in this journey. As Customer-1 for NetApp’s hybrid cloud strategy, we are using the Data Fabric to gain greater visibility and control over our enterprise data, regardless of where it physically sits. The Data Fabric enables us to combine on-premises capabilities with cloud provider resources to take advantage of a whole new level of compute power and automation for our business customers. More importantly, it supports the evolution of our data management strategy to meet the demands of the future.


The NetApp-on-NetApp blog series features advice from subject matter experts from NetApp IT who share their real-world experiences using NetApp’s industry-leading storage solutions to support business goals. Want to learn more about the program? Visit

Kamal Vyas

Kamal Vyas is the Chief Infrastructure Architect for NetApp IT and is responsible for NetApp IT’s hybrid cloud architecture, translating the NetApp IT vision into an infrastructure design that spans the public and private clouds. Kamal is a cloud evangelist, advocating for organizational adoption of and business transformation through the cloud and Data Fabric. Kamal has expertise in strategy, architecture and design of next-gen infrastructure solutions and transition to an IT as a Service model.