Last week, VMware’s Rawlinson Rivera posted on both his personal blog and VMware’s blog about addressing a key missing link in IT automation: storage. Rawlinson did a great job of detailing what IT consumers want, what they get, and how not all virtual disks are equal.


The current paradigm, however, treats storage as an undifferentiated resource in the IT automation puzzle. The reality is that applications need more than just capacity-provisioned storage: applications and workloads require profiles driven by business requirements rather than the limited options traditional storage architectures have long provided. As applications move through their lifecycle, those requirements change, and taking weeks or months to reconfigure IT stops business agility in its tracks.


In his post, Rawlinson outlines his vision for exposing a richer set of data services for application lifecycle management and consumption. SolidFire’s Quality of Service (QoS) settings (minimum, maximum, and burst IOPS) were highlighted as examples of data services specifically designed and configured for the workload (e.g., MongoDB). Once the underlying tags, policies, and other configurations are set, additional data services such as encryption, snapshots, replication, and site protection are available. The goal here is to provide a more complete set of service levels for the business to define with IT.


The result is that the business can consume IT with clearly defined storage requirements such as performance profiles, data protection, or site fault tolerance. As an example, a business unit such as development may have three different levels of requirements for MongoDB or SQL blueprints, one of which might demand a high level of performance. Traditionally, storage would have to be carefully designed from a capacity model (i.e., provisioning a chunk of disks or flash by capacity) and then monitored over time for moves, adds, and changes, not to mention shifts in application design or usage. In the new model, SolidFire's QoS defines the minimum, maximum, and burst settings based on what IT designs, or in combination with what the consumer requires. The QoS settings are then exposed so that vRealize Automation can offer those performance controls directly to the IT consumer to set, change, and add to an application blueprint over time. The result is an immediate change to the infrastructure, completed in minutes rather than weeks.
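To make the min/max/burst model concrete, here is a minimal sketch of how an automation layer might construct a per-volume QoS request against SolidFire's Element JSON-RPC API. The `ModifyVolume` method and `qos` parameter names follow the Element API; the volume ID, IOPS tiers, and the helper function itself are illustrative assumptions, not SolidFire's or VMware's actual integration code.

```python
import json

def build_qos_request(volume_id, min_iops, max_iops, burst_iops):
    """Build a JSON-RPC payload setting per-volume QoS, modeled on the
    SolidFire Element API's ModifyVolume method. Endpoint, auth, and
    the chosen IOPS tiers are assumptions for illustration."""
    # SolidFire QoS tiers must be ordered: min <= max <= burst
    if not (min_iops <= max_iops <= burst_iops):
        raise ValueError("QoS must satisfy min <= max <= burst")
    return {
        "method": "ModifyVolume",
        "params": {
            "volumeID": volume_id,
            "qos": {
                "minIOPS": min_iops,    # guaranteed floor
                "maxIOPS": max_iops,    # sustained ceiling
                "burstIOPS": burst_iops # short-term burst allowance
            },
        },
        "id": 1,
    }

# Example: a high-performance tier for a MongoDB blueprint
payload = build_qos_request(volume_id=42, min_iops=5000,
                            max_iops=15000, burst_iops=20000)
print(json.dumps(payload, indent=2))
```

Because the QoS change is just an API call rather than a re-provisioning exercise, a tool like vRealize Automation can apply a new performance tier to a blueprint's volumes in minutes.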


To see the settings in action, watch the video below for a tech preview of SolidFire and vRealize Automation:



SolidFire’s technology solutions group has been working with VMware since last year to show you a vision of where IT automation is headed. SolidFire sees a wide range of IT automation and orchestration capabilities as critical to the future of IT, and you’ll see us deliver this functionality in the very near future.


Stay tuned, and register here to be alerted to future updates on SolidFire’s IT automation solutions.

Keith Norbie

At NetApp, Keith drives Strategic Alliances in partnership with the business units and currently leads VMware, Data Protection (Veeam, Commvault, Rubrik), and SAN/Brocade. The role spans strategy development, advising and collaborating with product managers, incubating new offerings, cross-functional solution development, and executive interlocks. All of this drives net new GTM revenue to NetApp via partners in key areas like Private and Hybrid Multi-Cloud, EUC (VDI), Modernized Data Protection, and Next Gen SAN for Enterprise Apps.

Keith joined NetApp in February 2016 with the acquisition of SolidFire and previously spent 20 years in the channel as an executive, including a successful acquisition built from a startup. He brings a passion for delivering results through clarity, focus (less is more), relationships, intense curiosity, and seeking signal from noise.
