AI has always fascinated me. During my university days, I read every book and article on the topic I could find. Luckily, I landed an internship and wrote my master’s thesis on AI. During my 10-month internship, I developed a search engine that made it fast and easy to search for components in a CAD application. (The title was “Integration of an inference engine with backward chaining in a CAD software.”) The solution was developed in Prolog, whose built-in inference strategy is backward chaining: you start from the goal (the component being searched for) and reason back to the known facts. That made it a better fit for this task than the forward-chaining, rule-based systems common at the time. After that short AI journey, I no longer worked directly in this space, but I always followed the development of this promising technology.
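To give a flavor of what backward chaining means in practice, here is a minimal sketch in Python. The original thesis code was written in Prolog and is not reproduced here; the rules and facts below (component diameters, materials) are invented purely for illustration.

```python
# Minimal backward-chaining sketch (illustrative only; the thesis code
# was Prolog). A rule maps a goal to alternative lists of subgoals that
# would prove it; facts are goals known to be true outright.

RULES = {
    "component_fits": [["diameter_ok", "material_ok"]],
    "diameter_ok": [["diameter_measured"]],
}

FACTS = {"diameter_measured", "material_ok"}

def prove(goal, facts=FACTS, rules=RULES):
    """Work backward from `goal` toward known facts, as Prolog does."""
    if goal in facts:                      # base case: goal is a known fact
        return True
    for subgoals in rules.get(goal, []):   # try each rule concluding `goal`
        if all(prove(g, facts, rules) for g in subgoals):
            return True
    return False

print(prove("component_fits"))  # True: every subgoal chain reaches a fact
print(prove("color_ok"))        # False: no rule or fact supports it
```

The key point is the direction of inference: the engine is handed a goal and recursively asks what would have to be true for it to hold, rather than deriving every possible conclusion from the facts up front.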


Today, in 2018, there are still detractors who say that AI is just hype and that it may end in a second AI winter. (The first AI winter took place in the 1990s.) In my opinion, we are in an AI spring, and I would like to explain why by comparing the AI environment of 25 years ago with that of 2018. The comparison is based on the four pillars of building an AI, as presented by René Buest, Director of Technology Research at Arago, at Cloud Expo Europe 2017 in Frankfurt.


Compute

Moore’s Law, in its popular form, says that CPU performance doubles every 18 months. By that measure, CPUs are around 100,000 times faster than they were 25 years ago, so AI is much faster and more accurate than it was two decades ago. Back then, it was unrealistic to use AI for real-time applications like autonomous driving, and the neural networks used for deep learning could not go as deep, and therefore were not as accurate, as they are today.
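The “around 100,000 times” figure follows directly from that doubling rate; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the speedup implied by performance
# doubling every 18 months, sustained over 25 years.
years = 25
doublings = years * 12 / 18      # about 16.7 doublings
speedup = 2 ** doublings
print(f"~{speedup:,.0f}x")       # on the order of 100,000x
```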


But that’s not all. Coincidentally, during the AI winter of the mid ’90s, another technology came to market that would revolutionize and boost AI: the graphics processing unit (GPU). GPUs were originally developed for rendering large images, but today they are key to neural networks and therefore to AI. GPUs are better suited than CPUs to parallel and distributed computing, and they are also more affordable than CPUs for these workloads.

Data

Machine learning and deep learning need data — a lot of data. Data is the basis for learning new capabilities during the AI training phase: the more data the system sees during training, the better its inference will be.


Data is the foundation of knowledge, and it’s common to use terabytes of data just to teach an AI a single capability, like telling a chihuahua apart from a muffin, as the well-known comparison of computer vision APIs showed.


In the inference phase, the requirements are different: the AI needs to ingest anywhere from small to large amounts of data in real time, rather than the static data sets used during training. The other important aspect is IOPS, but I won’t address that here, because it has already been covered in another blog, Is Your IT Infrastructure Ready to Support AI Workflows in Production?


Fortunately, storage capacity has increased exponentially over the past three decades. In the early ’90s, a large hard disk held less than 100MB, compared to drives of 15TB today. The price of storage has also dropped dramatically between the last AI winter and today: in the early ’90s, raw storage still cost more than $1,000/GB, compared to less than $0.03/GB today. And that figure is for raw storage; with storage efficiency, the price/performance is even better.
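Putting those numbers side by side makes the scale of the change concrete (the $/GB and capacity figures are the ones quoted above, not independently sourced):

```python
# Ratios implied by the figures in the text: capacity growth and
# price decline of raw storage between the early '90s and 2018.
capacity_early_90s_gb = 0.1      # a large ~100MB hard disk
capacity_today_gb = 15_000       # a 15TB drive
price_early_90s = 1000.0         # $/GB, raw storage, early '90s
price_today = 0.03               # $/GB, raw storage, 2018

capacity_growth = capacity_today_gb / capacity_early_90s_gb
price_drop = price_early_90s / price_today
print(f"capacity: ~{capacity_growth:,.0f}x larger")   # ~150,000x
print(f"price: ~{price_drop:,.0f}x cheaper per GB")   # ~33,000x
```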

Algorithms

Today it’s easier than ever to build an AI system. All the necessary building blocks have become available in the past few years: a broad ecosystem offers platforms and apps for every segment, such as machine learning, speech recognition, and gesture control.


When I was developing my search engine, there were basically two AI programming languages to choose from: LISP and Prolog. Everything had to be coded by hand!

People

Today many universities offer AI programs. That was not the case in the ’90s, when I studied computer science.


Knowledge about, and acceptance of, AI have improved dramatically since the first winter. Most of us already use personal assistants like Siri or Alexa, and we don’t care whether these intelligent systems are based on AI, as long as they help us find the fastest way to our destination or order a birthday gift online without having to turn on a computer.

The AI Spring

Very soon, AI will be everywhere. AI will solve the problems we raise with support centers, drive us to work, order our dinners and deliver them to our homes, and even diagnose whether we have a tumor. The systems surrounding us are becoming more intelligent thanks to AI. I can’t think of any area that won’t be affected by AI technology, sooner or later.


The AI spring is here!

Christian Lorentz

Christian has held a variety of positions as System Engineer, Consultant, Product Marketing Manager, and Product Manager for Digital Equipment, Chipcom, Cisco Systems, and Riverbed. Christian joined NetApp in 2015 as Sr. Product and Solution Marketing Manager for EMEA, and has over 20 years’ experience in the networking and storage arena.