
New from NetApp: All-Flash and Hybrid Flash Systems, Software, Splunk Solution, and More


Today NetApp announced the new EF570 all-flash array and E5700 hybrid flash systems. These NetApp® systems provide the industry’s best price/performance for both IOPS and MB/s, and they are the first NVMe-enabled enterprise-class systems available in the market. In addition, a new NetApp Converged Infrastructure Solution for Data Analytics was announced, as well as […]

Top 5 Reasons to Attend Strata Data Conference


The Strata Data Conference is fast approaching. Here are five reasons why you should attend the Strata Data Conference in New York City from September 25-28 and visit NetApp in booth #937. 1. Learn how to reduce operational costs. Run analytics natively on NFSv3 data without adding infrastructure. Use a single storage back end […]

Top 5 Reasons to Attend the Splunk Conference 2017


The 8th Annual Splunk Conference is fast approaching. Here are five reasons why you should attend Splunk Conference 2017 in Washington, D.C., from September 25-28, and visit with NetApp in booth #T4. 1. Learn how to drive extreme performance. With NetApp EF-Series all-flash arrays and E-Series hybrid flash arrays, you can increase search performance […]

NetApp Solutions for MongoDB


Social, mobile, cloud, and Internet of Things (IoT) data is proliferating. This proliferation is driving enterprises to deploy hyperscale, distributed, data-centric applications such as customer analytics, e-commerce, security, surveillance, and business intelligence. To handle the data requirements of these high-volume, high–ingestion-rate, real-time applications, enterprises are rapidly adopting massively scalable and nonrelational databases such as MongoDB. […]

NetApp Solutions for Apache Spark Outperform JBOD


Apache Spark is an open-source cluster computing framework that was developed in response to limitations in the MapReduce cluster computing paradigm. Apache Spark is a relatively new programming framework for writing Hadoop applications that works directly with the Hadoop Distributed File System (HDFS). Spark is production ready, supports processing of streaming data, and is faster than MapReduce. […]
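To make the programming model concrete, here is a toy, single-machine word count written in the map/reduce style that Spark generalizes. This is a hypothetical plain-Python sketch, not code from the post or a NetApp solution; a real Spark job would express the same steps as distributed RDD or DataFrame transformations (flatMap, reduceByKey) over data in HDFS.

```python
from collections import Counter
from itertools import chain

def word_count(lines):
    """Count word occurrences across lines, map/reduce style."""
    # "flatMap" step: split every line into individual words
    words = chain.from_iterable(line.split() for line in lines)
    # "map" + "reduceByKey" steps: emit (word, 1) pairs and sum per key;
    # Counter performs the per-key aggregation in one pass
    return Counter(words)

lines = ["spark reads from hdfs", "spark is faster than mapreduce"]
counts = word_count(lines)
print(counts["spark"])  # → 2
```

In Spark the same logic scales out because each step is a transformation the engine can partition across the cluster, rather than a loop bound to one machine's memory.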

E2800: Get Turbocharged Flash Storage for Minimal Investment


On September 8, NetApp announced the E2800 all-flash and hybrid storage system. For a two-minute summary of the announcement, watch the video below. The NetApp® E2800 storage system, with both all-flash and hybrid solid-state drive/hard disk drive (SSD/HDD) configurations, is ideal for small and medium-sized businesses, plus enterprise remote and branch offices. It […]

Equate Accelerates SAP Reporting with NetApp


Since 1997, Equate has been the owner and single operator of several fully integrated world-class petrochemical complexes in Kuwait, North America, and Europe. These complexes produce over 6 million tons of the highest-quality petrochemicals annually. Equate needed a future-proof storage solution to solve performance and capacity bottlenecks and improve business continuity. Their plants work […]

Four Ways to Maximize Your MongoDB Investment on NetApp


Using internal storage for MongoDB presents challenges: performance degrades with node or drive failures and with growing user counts, scaling up and out on demand is difficult, management is time consuming, and copying, restoring, protecting, and moving data are inefficient and inflexible. NetApp® external storage all-flash MongoDB certified solutions (All Flash FAS […]

Data Lifecycle Management for the Data Lake


A data lake is a central location in which to store vast amounts of raw data in its native format, including structured, semi-structured, and unstructured data. It is typically built by using Hadoop. A data lake provides a cost-effective, highly scalable architecture for collecting and processing virtually any data format from any source. It enables […]