Deep Learning: The Confluence of Big Data, Big Models, Big Compute

Posted on: Jan 11, 2019

Fueled by enterprises seeking greater insight from their analytics, deep learning is now seeing widespread adoption. While this artificial intelligence (AI) discipline was first conceived in the late 1950s, the current surge in deep learning and other AI methods is driven by growing hardware power, the explosion of big data and the desire for greater insight in several key industries.

Deep learning, and AI in general, has taken off because organizations of all sizes and industries are capturing a greater variety of data and can mine bigger data, including unstructured data such as text, speech and images. The global deep learning market is expected to grow 41 percent annually from 2017 to 2023, reaching $18 billion, according to a Market Research Future report.

And it’s not just large companies like Amazon, Facebook and Google that have big data. It’s everywhere. Deep learning needs big data, and now we have it.

Contrary to popular belief, more data does not always mean better results. However, deep learning models absolutely thrive on big data. Through progressive learning, they grind away and find nonlinear relationships in the data without requiring users to do feature engineering. Deep learning models can also overfit the training data, so it is good to have lots of data to validate how well the model generalizes.
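To make that validation point concrete, here is a minimal sketch of holdout validation in Python, assuming scikit-learn and synthetic data (the article names no library or dataset). A large gap between training and validation accuracy is the overfitting signal the paragraph describes.

    # Holdout validation sketch: train on one slice of the data,
    # measure generalization on a slice the model never saw.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Synthetic stand-in for "big data" with nonlinear structure.
    X, y = make_classification(n_samples=5000, n_features=20,
                               n_informative=10, random_state=0)

    # Hold out 20 percent of the data for validation.
    X_train, X_val, y_train, y_val = train_test_split(
        X, y, test_size=0.2, random_state=0)

    model = MLPClassifier(hidden_layer_sizes=(64, 64),
                          max_iter=200, random_state=0)
    model.fit(X_train, y_train)

    # A wide gap between these two scores means the model memorized
    # the training data rather than generalizing from it.
    print("train accuracy:     ", model.score(X_train, y_train))
    print("validation accuracy:", model.score(X_val, y_val))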

So what is a deep learning model? It's essentially a neural network with many layers. And these models can be enormous in size – often with more than 50 million parameters. The algorithms are not new, but we now have bigger data and more computing power, which enables next-generation deep learning applications such as computer vision or speech to text.
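To see how quickly those parameter counts add up, here is a minimal sketch assuming PyTorch (the article names no framework). The layer sizes below are illustrative, not taken from any particular model.

    # Stack a handful of wide, fully connected layers and count parameters.
    import torch.nn as nn

    width = 4096
    layers = [nn.Linear(1024, width), nn.ReLU()]
    for _ in range(3):                      # a few hidden layers
        layers += [nn.Linear(width, width), nn.ReLU()]
    layers.append(nn.Linear(width, 10))     # output layer
    model = nn.Sequential(*layers)

    # Even this modest stack already tops 50 million trainable parameters.
    n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
    print(f"{n_params:,} trainable parameters")

Each 4096-by-4096 layer alone contributes roughly 16.8 million weights, which is why depth and width drive model size so quickly.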

There are, however, some challenges for those adopting deep learning. Consider the following “big” issues:

  • Big data is expensive to collect, label and store.
  • Big models are hard to optimize.
  • Big computations are often expensive.

Let’s dig into each of these considerations.

Big Data

Deep learning is not a silver bullet, but it damn sure is a Swiss Army knife. It can be used for all kinds of applications. Let’s look at some example applications matched to the data and algorithm types involved.

Natural language processing. Deep learning innovator and scholar Andrew Ng has long predicted that as speech recognition goes from 95 percent to 99 percent accurate, it will become a primary way to interact with computers. Deep learning models currently have about a 4 percent error rate for speech to text, so working with unstructured spoken or typed text requires lots of data to produce the best results possible. Training data covering different accents is also needed, as is a good representative sample of various speaking speeds.
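For readers wondering how a figure like “4 percent error rate” is measured, the standard speech-to-text metric is word error rate (WER): the word-level edit distance between the reference transcript and the model’s output, divided by the length of the reference. Here is a minimal Python sketch of that calculation.

    # Word error rate via classic dynamic-programming edit distance over words.
    def wer(reference: str, hypothesis: str) -> float:
        ref, hyp = reference.split(), hypothesis.split()
        d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
        for i in range(len(ref) + 1):
            d[i][0] = i
        for j in range(len(hyp) + 1):
            d[0][j] = j
        for i in range(1, len(ref) + 1):
            for j in range(1, len(hyp) + 1):
                cost = 0 if ref[i - 1] == hyp[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,         # deletion
                              d[i][j - 1] + 1,         # insertion
                              d[i - 1][j - 1] + cost)  # substitution
        return d[len(ref)][len(hyp)] / len(ref)

    # One substituted word out of four reference words: WER = 0.25.
    print(wer("the quick brown fox", "the quick brown box"))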