The opportunity for NetApp to join the Open Compute Project (OCP) program this year as the only storage and data management systems provider is a real privilege. It also completes a circle for NetApp as a leading voice for the role of software in an open hardware ecosystem.


As founder Dave Hitz has often noted, NetApp was originally a software company at its inception 25 years ago, and we invented what became known as Software-Defined Storage (SDS). We got into the business of hardware customization because, as we scaled our systems in performance, capacity, and reliability, we encountered limitations in standard servers, networks, and storage media. As a result, we got very good at both the hardware and software sides of the business, which enabled us to deliver highly reliable, fast, and simple storage solutions.


Our development of enterprise storage and data management systems produced many remarkable innovations, each an important contribution in its time. For example, managing vibration within a rack of spinning disks has been a big deal to our data center customers, and we happen to be especially good at it. Now, with the emergence of the all-flash data center, that issue is going away.


On the other hand, our accumulated expertise in storage virtualization, hypervisor optimization, cloud and hybrid cloud computing, hardware abstraction, and managing data wherever it resides is more relevant today than ever. That’s especially true in the context of designing and building optimized and sustainable data centers for the future.


I mentioned the introduction of all-flash storage in the data center, which is one important factor in data center design. Other factors and trends include changes in workloads and data types, the growing role of the cloud, and the emergence of data as currency in the digital economy. Amidst these trends, we’re witnessing the arrival of commodity hardware that’s now capable of fulfilling some of the needs of the modern data center.


With the gap closing between what the data center needs and what commodity hardware can deliver, companies like NetApp are free to provide value through the delivery of software and data management services. It’s an exciting time. While we continue to provide the best engineered systems on the market, our software advantages are coming to the fore.


This is especially true in the context of the OCP data center as originally conceived by Facebook and joined by Intel, Goldman Sachs, Rackspace, Microsoft, Apple, Cisco, Juniper Networks, Nokia, Lenovo, Google, and others. In an OCP environment, NetApp software runs better than anyone else’s. Our Data Fabric gives IT organizations ubiquitous control of all their data, no matter where it’s physically located and no matter what applications are creating or accessing it.


NetApp has several software products that are ideal for open computing environments. SolidFire Element OS software enables customers to purchase the hardware to run a shared-nothing scale-out storage system separately from the software. Likewise, ONTAP Select software functions separately from the underlying hardware. AltaVault and StorageGRID Webscale are specialized NetApp solutions that function at the Hardware Abstraction Layer (HAL) and are available in both integrated and software-only versions and in hyperscale clouds.


With the ability of our software to enable the data-driven enterprise, we see the virtualized storage system as effectively a “data operating system.”


Evolution from Hardware-focused, to Software-defined, to Data-defined Storage

When we started NetApp, we quickly evolved to deliver engineered systems that combined commodity and custom hardware with innovative software. Our emphasis on software capabilities continues as data assumes a more and more prominent role in enterprise IT.


As enterprise customers, hyperscale cloud providers like Azure and AWS, service providers, systems providers, and technology partners engage in designing and building more sustainable data center solutions, we at NetApp embrace the OCP standard. Our history, expertise, and the NetApp Data Fabric approach position us better than any other provider to enable the data-driven enterprise in the context of open computing.

Mark Bregman

When Mark Bregman joined NetApp in September 2015, he brought to the company more than 30 years of technology experience and a passion for the process of innovation. He has held C-level and management roles for global firms including Symantec and IBM. Just prior to NetApp, Mark was CTO of SkywriterRX, Inc., an early-stage start-up using machine learning and natural language processing to analyze books. Before that, he held senior positions at Neustar, Symantec, Veritas, AirMedia, and IBM. He began his career at IBM’s Thomas J. Watson Research Center.
As NetApp SVP and CTO, Mark leads the company’s portfolio strategy and innovation agenda in support of the Data Fabric, NetApp’s vision for the future of data management. His responsibilities include evaluating where the biggest technical opportunities and risks are and helping to further develop and nurture the NetApp culture of innovation within the engineering team.
Mark is dedicated to addressing the underrepresentation of women in the fields of computer science and engineering. He has served as executive sponsor and an engaged member of the Women in Technology programs at all of his previous places of employment. Since 2009, he has served as a director of the Anita Borg Institute for Women and Technology. He also serves on the boards of the Bay Area Science and Innovation Consortium, ShoreTel, Inc., and SkywriterRX, Inc. He is a former member of the Naval Research Advisory Committee, a member of the American Physical Society and a senior member of IEEE. Mark holds a PhD, an MA, and an MPhil in physics from Columbia University and a BA in physics from Harvard College.