In part two of this series I wrote about trends that are changing how software is developed, and the need for integration and automation in order to make the application development process faster, easier, and more efficient.
NetApp provides a stable, standard data management platform for application development. As developers increasingly move workloads to the cloud, data management matters more than ever.
Rise of the Service-Based Model
Enterprises that run large data center infrastructures, with the heavy capital expenses they entail, are rapidly adopting a service-based, pay-as-you-use operating-expense model delivered via the cloud.
Under this model, services are catalogued for each stage of the software development lifecycle: writing code, running builds, deploying, providing support, sustaining applications, and so on.
Under the covers, multiple microservices and several infrastructure integrations are tied to one or more of these high-level service offerings. These microservices might run independently or interdependently to deliver a specific service level. Operations teams layer on additional services for workload-based infrastructure, databases, backup and restore, data archiving, disaster recovery, collaboration, APIs, and so on.
All of these high-level services need to be tightly integrated with application development tools and infrastructure ecosystem partners for smooth transitions between the different DevOps phases. In the DevOps practice, application owners can create models or templates that render the different services required for development and deployment in the cloud. The figure below illustrates how processes and technologies need to weave together to align with DevOps practice.
Data Fabric for DevOps
When it comes to applications that are born, deployed, and maintained in the cloud, there are additional challenges. The Data Fabric enabled by NetApp combines native NetApp technologies with integrations from ecosystem partners to address these challenges, which include the following:
- Providing cohesive integration with open source tools and exposing underlying infrastructure APIs, giving developers the flexibility to customize their development workflows, within economic boundaries, for applications born in the cloud
- Syncing a large code base with its dependent tools and kernel modules to give developers a complete prepackaged environment in their workspace, a step that takes longer in the cloud
- Scaling parallel builds during development and deployment of applications in the cloud
- Identifying and isolating errors and reverting to stable builds during continuous integration tests to improve code quality
- Setting up databases quickly for testing the application without affecting the production database
- Avoiding data lock-in with a single cloud or service provider, where control is limited, by moving data between availability zones within one cloud provider, across different hyperscalers, and to on-premises private clouds for backup, archive, collaboration, and disaster recovery
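On the continuous-integration point, isolating the build that introduced a failure and falling back to the last stable one is essentially what `git bisect` automates. As a minimal, hedged illustration of the idea (the build list and `passes` callback below are hypothetical stand-ins for a real CI system), a binary search over build history finds the first failing build in logarithmic time:

```python
def first_failing(builds, passes):
    """Binary-search (git-bisect style) for the first build whose tests fail.
    Assumes builds[0] passes, builds[-1] fails, and failures persist once
    introduced."""
    lo, hi = 0, len(builds) - 1        # builds[lo] is good, builds[hi] is bad
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if passes(builds[mid]):
            lo = mid                   # regression is after mid
        else:
            hi = mid                   # regression is at or before mid
    return hi                          # index of the first bad build

# Demo: seven builds, with the regression introduced in build 4.
results = [True, True, True, True, False, False, False]
idx = first_failing(list(range(7)), lambda b: results[b])
assert idx == 4        # first failing build
last_stable = idx - 1  # build 3 is the one to revert to
```

Checking only O(log n) builds instead of all of them is what makes this practical when CI runs are expensive.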
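On the test-database point, storage-level cloning (for example, NetApp snapshot and clone technology) makes this near-instantaneous at scale. As a storage-agnostic sketch of the same pattern, the following Python example (using SQLite purely for illustration, not NetApp's API) copies production data into a disposable in-memory instance that tests can mutate freely:

```python
import sqlite3

def clone_for_testing(prod_path: str) -> sqlite3.Connection:
    """Copy a production SQLite database into an in-memory instance
    so tests can mutate data without touching production."""
    prod = sqlite3.connect(prod_path)
    test = sqlite3.connect(":memory:")
    prod.backup(test)      # full copy via SQLite's online backup API
    prod.close()
    return test

# Demo: build a tiny "production" database, clone it, and mutate the clone.
prod = sqlite3.connect("prod.db")
prod.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)")
prod.execute("DELETE FROM users")
prod.execute("INSERT INTO users (name) VALUES ('alice'), ('bob')")
prod.commit()
prod.close()

test_db = clone_for_testing("prod.db")
test_db.execute("DELETE FROM users")   # destructive test operation
assert test_db.execute("SELECT COUNT(*) FROM users").fetchone()[0] == 0

# Production data is untouched.
check = sqlite3.connect("prod.db")
assert check.execute("SELECT COUNT(*) FROM users").fetchone()[0] == 2
```

The design point is the same regardless of the storage layer: give each test run a writable, disposable copy so the production data path is never in the blast radius.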
DevOps is not a revolutionary shift; it’s the natural evolution of the development culture that arose in response to revolutionary changes in our data universe. This evolution is not limited to one or two industries or to specific companies, because all businesses are increasingly becoming software businesses.
Organizations have been quick to adopt and adapt the DevOps culture for their developers, operations teams, and business stakeholders. It might be tempting to oversimplify the DevOps approach for the kingmakers (developers), and in the process overlook the need for flexible and reliable operations. However, a superior developer experience can be sustained only by a superior data management platform integrated with a rich ecosystem of partners.
Meet NetApp next week at MERGE 2016, the Perforce Conference, April 13-15, in San Francisco.