Backups are boring – at least as long as they work. Optimizing them usually means saving money, often by automating tasks as much as possible and choosing the best place to store backup data.


Restores are a different animal – how fast can you get your data back? Is your data intact and healthy? Are you going to lose your job because the restore process fails?


And then there is archiving. What is archiving? Despite what many people think, grabbing some duplicate backup tapes and storing them in a closet somewhere does not create a useful archive. At the other end of the spectrum are massive active archive repositories for scientific data, which researchers around the world access on a fairly regular basis to retrieve valuable information. In the end, archives need to be reliable, accessible and affordable.


In each of these cases, hybrid cloud architectures can help lower cost and improve performance, while maintaining the security and availability standards core to IT operations.


When evaluating use cases for the hybrid cloud, here are some considerations for choosing the right solution:

  • Cost – hybrid cloud lets you take advantage of new cost models provided by Amazon Web Services and other service providers
  • Risk – additional secure physical storage locations help address single-site or local-site risks (as in: if you lose a site, you lose your data), as well as risks inherent in single copies on physical media, such as tapes
  • Speed – with help from hybrid clouds, you can improve backup and recovery times for data centers and remote locations
  • Scale – you can dynamically extend backup repositories and archives from on-premises facilities to the cloud and back, providing virtually unlimited scale without waiting for new equipment to get delivered and installed


Why is the concept of the hybrid cloud so important? It enables you to leverage both on-premises and off-premises resources. With the right products this happens seamlessly and automatically, as determined by policies that you set once. What makes this happen under the covers is the data fabric.


With a data fabric, data is managed throughout its lifecycle, whether that means it lives on-premises (for example, in a private cloud), near the cloud (NetApp Private Storage is a great example of this), or in a hosted or hyperscale cloud (such as AWS). With a data fabric, our customers can integrate, move, secure and consistently manage data in the hybrid cloud, and benefit from NetApp’s investments and expertise in building enterprise-class hybrid deployments that are designed to evolve as customer needs change. More than 275 service provider partners are contributing to the development of the data fabric, enabled by NetApp technology.


Products and partnerships are the foundation for the data fabric, so what’s new in our announcement today?


NetApp StorageGRID® Webscale – new in version 10.1 is the addition of Amazon Simple Storage Service (S3) and geo-distributed erasure coding as storage tiers. With these additions, you can keep your data in the right place at the right time, based on performance, availability, durability and – yes – cost criteria. Data placement is managed by policies and spans tiers of disk (SSD, SAS, SATA), tape, public cloud (Amazon S3) and geo-distributed storage across multiple data centers. NetApp also announced the StorageGRID Webscale appliance for simpler deployments. You can combine appliances and virtualized software in a single deployment, based on your requirements.
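To make the idea of policy-based data placement concrete: StorageGRID Webscale’s actual policy engine is far richer than this, but a minimal sketch of how a policy might map an object’s age and access frequency to one of the tiers mentioned above could look like the following. The tier names, thresholds and function are invented for illustration and are not part of any NetApp API.

```python
# Illustrative only: a toy policy that places data on a tier based on
# age and access frequency. Tiers ordered roughly by latency and cost.
TIERS = ["ssd", "sas", "sata", "s3", "tape"]

def select_tier(age_days: int, accesses_per_month: float) -> str:
    """Pick a storage tier from simple, hypothetical policy rules:
    hot data stays on fast disk, cold data migrates toward cloud and tape."""
    if accesses_per_month >= 30:   # accessed roughly daily -> fastest tier
        return "ssd"
    if age_days < 30:              # recent data on primary disk
        return "sas"
    if age_days < 180:             # aging data on capacity disk
        return "sata"
    if accesses_per_month >= 1:    # cold but still retrieved -> public cloud
        return "s3"
    return "tape"                  # deep archive

# Example: a 90-day-old object read a few times a month lands on SATA.
print(select_tier(90, 3))   # -> sata
```

In a real deployment you would set such rules once as policies, and the system would evaluate and migrate data automatically rather than per call.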


NetApp SteelStore® – in addition to using SteelStore to seamlessly, efficiently and securely move data from traditional backup and archive environments into cloud environments (including StorageGRID Webscale on-premises object storage, Amazon S3, Amazon Glacier and many more), you can now run instances of SteelStore natively in the AWS cloud on a pay-per-use basis. This way you can protect cloud-based workloads and recover data in AWS during a disaster.


NetApp OnCommand® Insight – with version 7.1 you can now manage storage resources in your on-premises infrastructure as well as storage resources running in the cloud. This delivers on the idea of managing service level agreements across hybrid cloud infrastructure through performance monitoring, capacity management, identification of reclamation opportunities and greater awareness of IT costs.


NetApp Cloud ONTAP® for AWS software subscription and NetApp Cloud Manager® – Cloud ONTAP runs a full, enterprise-class storage OS right in AWS, and with Cloud Manager you can deploy it into the cloud rapidly and with ease. Cloud Manager offers you a single environment to provision storage, set up replication and automate NetApp service and support registration for Cloud ONTAP.


Ingo Fuchs

Ingo Fuchs is responsible for marketing products and solutions in NetApp’s hybrid cloud portfolio.

Ingo joined NetApp in 2010, addressing a wide range of topics from object storage to scale-out NAS, software-defined storage and Big Data, and most recently managing a team covering products and solutions across the hybrid cloud portfolio, from on-premises to hosted and hyperscaler environments.

Prior to joining NetApp, Ingo was responsible for Product and Solutions Marketing at DDN. Ingo’s background also includes Product Marketing roles at Quantum, Product Management at ADIC, managing the global Educational Services offering at BakBone Software, managing Alliances at a reseller organization in Germany and driving pre-sales activities at IBM in Germany. Ingo was named a Chief Architect at EDS (now HPE).

Ingo is a published author and holds an engineering degree in information technology from the Baden-Wuerttemberg Cooperative State University in Mannheim, Germany.