As manager of the database team in NetApp IT, I oversee more than 400 databases, some of which are business-critical to the company’s operations. As a 25-year industry veteran, I’m seeing huge changes in the way I do my job that I never would have anticipated even 10 years ago.
What’s behind this change? The rules around IT service delivery are changing. We are entering a world where IT stability is a given. IT infrastructure is becoming a utility, an electrical outlet that applications and databases plug into as needed. At the same time, the database team needs to respond quickly to the business. Add in the hybrid cloud, and IT service delivery quickly becomes complicated for database administrators (DBAs).
Five Trends to Watch
I see five trends that are affecting how the database team operates going forward:
Databases are a utility, just like infrastructure. Business customers are typically champing at the bit to secure their technology quickly and easily. Gone are the days when decisions about which database to use took weeks or months. Multi-tenant database environments are replacing standalone ones because they pool resources on a scalable database platform. These plug-and-play mini-databases can be set up (and taken down) quickly and work with the cloud automatically. They help consolidate and manage multiple databases as one, resulting in improved efficiency, simplified management, and maximum uptime. Applications can easily share compute power based on differing peak load times. Not only are licensing costs lower, but server utilization is higher, database management is streamlined, and database performance can be isolated. That isn’t to say that on-premises, standalone databases for critical applications will go away. Databases related to NetApp’s core competencies will always have a place in our operations.
Database-as-a-Service (DBaaS) is closer than you think. The days when users asked for a specific or customized database technology are gone. Administrators have to be prepared to offer Database-as-a-Service as part of the application development lifecycle. That means providing a menu of standard database solutions tailored to different business requirements. Users will be able to self-provision database technology from a catalog of solutions that offers tradeoffs in flexibility, cost, and performance. Self-provisioning brings with it a wide array of benefits: automatic creation of databases in the cloud or on premises, scheduled start and end dates, automatic integration with the CMDB for asset tracking, and even database creation in advance of actual use.
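To make the catalog idea concrete, here is a minimal sketch in Python of what a self-service menu entry and a provisioning request might look like. All names, fields, and prices here are hypothetical illustrations, not an actual NetApp IT catalog or API.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical catalog entry for a self-service database menu;
# the fields model the flexibility/cost/performance tradeoffs
# described above. Everything here is illustrative.
@dataclass
class CatalogOffering:
    name: str
    platform: str      # e.g. "Oracle", "SQL Server", "MongoDB"
    tier: str          # tradeoff bucket: "performance", "standard", "economy"
    monthly_cost: int  # illustrative pricing unit

CATALOG = [
    CatalogOffering("oltp-small", "Oracle", "standard", 300),
    CatalogOffering("analytics", "SQL Server", "performance", 900),
    CatalogOffering("dev-sandbox", "MongoDB", "economy", 50),
]

def provision(offering_name: str, start: date, end: date) -> dict:
    """Simulate a self-service request: pick from the catalog,
    schedule start/end dates, and emit a record a CMDB could ingest."""
    offering = next(o for o in CATALOG if o.name == offering_name)
    return {
        "platform": offering.platform,
        "tier": offering.tier,
        "active_from": start.isoformat(),
        "retire_on": end.isoformat(),
    }

request = provision("dev-sandbox", date(2025, 1, 1), date(2025, 6, 30))
print(request["platform"])  # MongoDB
```

The point of the sketch is the shape of the workflow: the user picks from a fixed menu rather than requesting custom technology, and start/end scheduling and asset-tracking data fall out of the request automatically.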
MongoDB is challenging the traditional database model. One of the issues application developers face is choosing their database schema before they start a project. Because it is hard to predict what data will or will not be collected today and tomorrow, that choice is difficult. Many developers are turning from the traditional relational database structure to a dynamic structure like MongoDB’s because data integration is easier and faster for some applications. The dynamic structure lets users store related data in a single document, easily perform ad-hoc queries, and add fields later on the fly. It’s also free; its open source roots only add to its appeal. The rise of MongoDB has significantly affected the database landscape, making the choice of database at the outset even more important.
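The schema flexibility described above can be shown in a few lines. This sketch uses plain Python dicts to stand in for MongoDB documents so it runs without a server; with the real pymongo driver, the equivalent operations would be `insert_one()` and `find()`.

```python
# Illustrative sketch of the dynamic (schemaless) document model.
# A plain list stands in for a MongoDB collection.
orders = []

# Documents in the same "collection" need not share a schema.
orders.append({"id": 1, "customer": "Acme", "total": 120.0})
orders.append({"id": 2, "customer": "Globex", "total": 80.0,
               "coupon": "SPRING10"})  # new field added on the fly

# Ad-hoc query: no schema migration was needed before filtering
# on the "coupon" field that only some documents carry.
discounted = [o for o in orders if "coupon" in o]
print(len(discounted))  # 1
```

Contrast this with a relational table, where adding the `coupon` column would require an up-front `ALTER TABLE` and a decision about what the column means for every existing row.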
Skill sets are changing to reflect the new landscape. Along with the proliferation of data and database types, the database administrator skill set is rapidly changing. Database administrators have historically specialized by database platform: Oracle, SQL Server, and so on. Now administrators have to be generalists who are familiar with different database platforms and able to recommend the right fit for an application. But they also have to be specialists who can support a database to meet business application requirements. In addition, customers are requiring more data services that are not necessarily tied to a traditional database platform. Decisions get even more complicated when you add in cloud-aware applications and databases. Database administrators have to understand the database as part of a full-stack view. Their view of the world has broadened.
The cloud demands better IT collaboration. As IT increasingly leverages the cloud, the push is for database administrators to be data-center agnostic. They have to be ready to support databases for applications that are hosted both on premises (private cloud) and in the public cloud. But delivery and support in the cloud demands a new approach. Database administrators can no longer operate in a silo; the cloud requires collaboration with all the teams supporting the application stack. That makes the support and delivery model much more complex than with on-premises applications. Having a team that provides a set of database services across data centers and database platforms is critical to realizing the efficiencies of the hybrid cloud.
Improving IT Service Delivery
The role of the database administrator is evolving quickly. Skill sets are changing and IT collaboration is a necessity. The most obvious factors driving this change are the rise of the cloud and non-relational databases. But underlying these factors is an even bigger impetus: finding new ways to deliver and support data and database services faster for the business community.
The NetApp-on-NetApp blog series features advice from subject matter experts from NetApp IT who share their real-world experiences using NetApp’s industry-leading storage solutions to support business goals. Want to learn more about the program? Visit www.NetAppIT.com.