Data is the new gold for the modern financial business, but the question of how to effectively mine, manage and utilise that data has proved far more challenging. For the finance sector, the challenges are multiplied by the sheer volume of data from billions of transactions. These billions of transactions hide the good, the bad and the downright ugly. When you are faced with this unrelenting barrage of transactions, how do you know which is which?

For many years, forensic analytics has let you look back at historical data to help you:

  • Detect suspicious patterns or trends
  • Discover fraud
  • Identify money laundering activity

These reactive activities remain key in modern finance; however, today’s bankers are also looking for a proactive strategy that applies continuous surveillance and screening of data on arrival. Machine learning is increasingly used to drive operations that deliver near-real-time results, and it can tie in neatly with compliance and regulatory processes. The benefits go far beyond the realm of forensic accounting, offering business benefits through predictive analytics and the detection of positive trends, relationships, and sentiment.
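To make the screening idea concrete, here is a deliberately naive sketch (not NetApp tooling, and far simpler than a production model): flag a transaction when its amount deviates sharply from an account's history. The function name, threshold, and sample data are all illustrative assumptions.

```python
import statistics

def flag_suspicious(amounts, new_amount, threshold=3.0):
    """Toy screening rule: flag a new transaction if its amount is more
    than `threshold` standard deviations from the account's historical mean.
    Real surveillance systems use far richer features and models."""
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)
    if stdev == 0:
        return new_amount != mean
    return abs(new_amount - mean) / stdev > threshold

history = [42.0, 55.0, 48.0, 51.0, 60.0, 45.0]
print(flag_suspicious(history, 50.0))    # typical amount -> False
print(flag_suspicious(history, 5000.0))  # extreme outlier -> True
```

The point of even this toy version is that screening on arrival is cheap per transaction; the hard part, as the rest of this post argues, is feeding such checks with data at low enough latency.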

To deliver these real-time insights powered by machine learning, data needs to be stored in a way that can be accessed quickly, while still adhering to the demands of a highly regulated environment. This means the systems need to be both secure and able to offer ever-lower latencies. A few years ago, response times of many milliseconds were the norm. Now, high-performance analytics dictates that only latencies well under one millisecond are acceptable, and very soon that will shrink to a few tens of microseconds. The one aspect that does not change, however, is the universal requirement for consistent, repeatable, and predictable data management, regardless of where your data resides.
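One way to sanity-check whether a storage path meets these sub-millisecond targets is to sample individual read latencies and look at tail percentiles rather than averages, since "predictable" is a statement about the worst reads, not the typical one. A minimal sketch (the file path, block size, and sample count are placeholders, not a benchmark methodology):

```python
import os
import time

def sample_read_latencies(path, block_size=4096, samples=100):
    """Time individual block reads and return latencies in microseconds.
    For 'consistent, repeatable, predictable' storage, the p99 latency
    matters more than the mean."""
    latencies = []
    with open(path, "rb") as f:
        size = os.path.getsize(path)
        for i in range(samples):
            f.seek((i * block_size) % max(size - block_size, 1))
            start = time.perf_counter_ns()
            f.read(block_size)
            latencies.append((time.perf_counter_ns() - start) / 1000)
    return latencies

# Usage (path is a placeholder):
# import statistics
# lats = sample_read_latencies("/data/transactions.db")
# print(f"p50={statistics.median(lats):.1f}us "
#       f"p99={statistics.quantiles(lats, n=100)[98]:.1f}us")
```

Runs like this make the milliseconds-to-microseconds shift described above directly visible on your own hardware.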


The technology leading the next wave in the fight for even lower latencies and higher performance is NVMe.

To date, industries have reused the tried and tested storage protocols SAS and SATA. That’s fine, but these protocols were designed for the mechanical HDD age and carry the baggage of managing HDDs.



While we were all taught in school that HDDs are ‘random access’ devices, an HDD is limited by a single set of read/write heads that can only be in one place at a time. If you share HDDs across busy workloads, achieving the nirvana of consistent low latency is difficult, hence the admin overheads associated with short-stroking, load balancing and queuing.


Flash storage, on the other hand, behaves like memory. It can be accessed in parallel, by many applications at the same time, at extremely low latencies. This is where NVMe delivers: it is designed to drive parallel access to solid-state storage technologies, addressing next-generation bandwidth, IOPS, and latency requirements.
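The parallelism argument can be sketched at the application level. On an HDD, concurrent reads serialise behind one head position; on flash behind NVMe's multi-queue model, independent reads can genuinely proceed in parallel. A hypothetical illustration using a thread pool (file path and offsets are placeholders; this shows the access pattern, not the NVMe driver itself):

```python
import concurrent.futures

def read_block(path, offset, length=4096):
    """One independent read. On flash/NVMe these can overlap because
    there is no single mechanical head to contend for."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)

def parallel_read(path, offsets, workers=8):
    """Issue many reads concurrently via a thread pool; results come
    back in the order the offsets were submitted."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(read_block, path, off) for off in offsets]
        return [fut.result() for fut in futures]
```

On spinning disk, a pattern like this mostly queues; on NVMe flash, the concurrent requests are what let the device reach its bandwidth and IOPS potential.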


The latest FAS hybrid storage systems from NetApp include NVMe. The FAS9000 supports up to 16TB of NVMe FlashCache, while the All Flash A-Series will add NVMe and NVMf technologies in the near future.


There is no doubt in my mind that analytics and business intelligence will be top-of-mind projects for many CIOs in 2017, as these topics underpin the value derived from digital transformation.

If you are looking to meet the challenges of digital transformation and analytics, then NetApp All-Flash technologies are designed to deliver interoperability and future-proofing with confidence.


Ask yourself the following questions:

  • Do you need a Formula 1 car with extreme performance and the shortest path to your data?
    • Then the battle-hardened EF systems are perfect for analytics.
  • Do you need a high-performance saloon car with creature comforts, configurable for all passengers?
    • Then AFF systems are right if you need high performance and rich, cloud-ready data management.
  • Do you need a bullet train? No need to learn how to drive it, just buy the ticket for the service you desire.
    • Then SolidFire is perfect for next-generation linear scale-out requirements, where a guaranteed quality of service for individual workloads matters.



Laurence James

Laurence is responsible for driving market awareness for NetApp’s products across EMEA. His focus is on business growth and aligning NetApp’s offerings with customer and market needs.

Laurence works across all of NetApp’s products and has an in-depth understanding of diverse customer requirements to deliver value across the entire range of the product suite.

Working with a dedicated and experienced team, he now assists in developing and implementing campaigns that support the positioning of NetApp’s Cloud Infrastructure products.

Laurence has many years’ experience working with all aspects of enterprise IT and has held roles at Oracle, Sun Microsystems and StorageTek.

For nearly 20 years he was Principal IT Consultant at the UK Meteorological Office.
