We are again at a fascinating point in the ever-accelerating world of IT. Every 7-10 years or so, a combination of IT trends and technologies simultaneously reach a point of maturity, dramatically changing the way IT infrastructure needs to be architected in order for organizations to stay competitive. In 2006-09, it was the combination of mature second platform (client-server) applications together with virtual machines, virtualized storage, and converged infrastructure that changed the industry for the next decade. Starting around 2010, the third computing platform emerged — meaning applications began to be developed to take advantage of new social, mobile, cloud, and analytics trends, capabilities, and requirements. A few years later, those applications are becoming ever more business critical. Today it is a combination of DevOps together with containers and next-generation storage that is intersecting. Combined with the maturity of cloud services, this is again significantly impacting the choices and investments every IT department needs to make.


The principles behind the DevOps movement have become the organizational catalyst many organizations needed to accelerate their efforts to modernize their entire approach to IT. There are now many documented and published examples of how this team effort across developers and operations results in higher IT performance, faster time to market, and increased employee and customer loyalty. Puppet’s 2016 State of DevOps report, based on the results of surveying 25,000 technical professionals, concludes that high-performing organizations using DevOps principles deploy 200x more frequently and spend 29% more time on new work. The report also shows that the number of DevOps teams increased by 22% in 2016. There is no doubt this trend will eventually affect everyone interested in next-generation IT.


But that’s only the beginning.


At the first European DevOps Enterprise Summit in London this summer, a wide range of transforming, digital enterprises presented their DevOps journeys. Most began over the past couple of years with an agile transformation project and are now looking to move to deployment at scale. The results presented were impressive, with every single organization benefitting from increased agility, increased quality, and improved innovation. Perhaps most surprising was that those running these projects saw far fewer outages compared to their traditional IT environments. Good news all round! At the end of each session, the presenter was asked what they wanted to know more about related to DevOps. The most common requests: How do I run DevOps infrastructure better at web scale? How can I benefit from containers at scale? If you have built or run DevOps teams for several years, then that is probably not a surprise, but where should you look for answers?


Over the past 10 years, as the DevOps movement emerged, the underlying infrastructure, designed by architects and built by operations teams, was already changing. Thanks to the hypergrowth of VMware, operating-system (OS) virtualization is now pervasive within most organizations and data centers. However, just as many developers have become used to the flexibility of the public cloud for the requirements of their modern applications, they have also become all too aware of the limitations of a traditional, non-automated virtual-machine environment.


Container-based infrastructure removes the need for the hypervisor and is seen by many as the answer. Containers are smaller, faster, and more agile. They are also easier to automate and share between applications. As a result, they facilitate the move to modern applications built around microservices. They provide teams with access to more immediate resource scaling, meaning innovation can move faster from idea to production. Increasing business and customer service demands will drive more and more organizations to move from VMs to containers. Those investing in DevOps infrastructure as an enabler to their business will move faster.
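To make the automation point concrete: a container image can be defined in a short, version-controlled text file that any team member can rebuild and run identically. A minimal sketch, assuming a hypothetical Python microservice (the service name, port, and file layout here are illustrative, not from any specific application):

```dockerfile
# Hypothetical example: packaging a small Python microservice as a container image.
FROM python:3.9-slim            # shared, versioned base image instead of a full guest OS
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8080                     # document the service port
CMD ["python", "app.py"]        # the runtime contract, captured in one file
```

Because there is no guest operating system to boot, `docker build` and `docker run` turn this definition into a running process in seconds, which is what makes containers so amenable to the automated pipelines DevOps teams depend on.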


If the answer to DevOps at web scale is based on containers, then the next logical question has to be the approach to storing the data those container-based applications will access. Many early virtualization projects were either delayed or simply failed to deliver as they moved toward production. All too often, the reason was a complete lack of focus on the storage and data management architecture. As a result, the adoption of VMware ESX and vSphere took longer as teams came to understand that traditional storage could no longer meet the requirements of virtualized infrastructure. In today’s highly competitive world, where IT is the foundation of digital business, organizations cannot afford to make the same mistake again with the Docker ecosystem and containers. As operations teams select which cloud architectures are needed to support their DevOps teams’ ambitions, they must not forget that the storage architecture chosen will either dramatically accelerate or hold back the applications development teams want to build. Next-generation DevOps infrastructure needs containers. Successfully building container-based infrastructure at scale needs a next-generation approach to storage.
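The storage decision surfaces early in practice: a container's filesystem is ephemeral, so any stateful service must map its data onto a volume whose lifecycle, and whose backing storage platform, operations teams choose deliberately. A minimal docker-compose sketch of the pattern (the service and volume names are hypothetical, and the `local` driver is only a placeholder for whatever volume driver the chosen storage platform provides):

```yaml
# Hypothetical example: a stateful service whose data outlives any one container.
version: "3"
services:
  db:
    image: postgres:9.6
    volumes:
      - dbdata:/var/lib/postgresql/data   # data lives on the named volume, not in the container

volumes:
  dbdata:
    driver: local   # swap in the volume driver/plugin of your storage platform here
```

Containers that mount `dbdata` can be destroyed and recreated freely; the data persists on whatever storage sits behind the driver, which is exactly why that choice accelerates or constrains everything built on top of it.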

John Rollason

John Rollason is Senior Director, Product & Solutions Marketing, Next Generation Data Center at NetApp. In this role, John is responsible for NetApp’s Next Generation Data Center marketing strategy. He works closely, not only with customers, partners, analysts and media, but also with leadership, sales, technical and product teams worldwide.

Prior to this role John was Marketing Director for SolidFire, acquired by NetApp in 2016, and responsible for global marketing strategy and delivery for the NetApp SolidFire Business Unit. John joined SolidFire in 2014.

Before joining SolidFire, John spent over nine years at NetApp, mainly in the role of Director, Product, Solutions & Alliances Marketing EMEA. Prior to NetApp, John worked for eight years at Nortel in a variety of positions in System Engineering, Business Development and Product Marketing.

John has spoken regularly at industry events for many years. He attended the University of Newcastle-upon-Tyne, gaining an M.Eng (Hons) in Electrical and Electronic Engineering, and is a Member of the Institution of Engineering & Technology.
