
AI And The Third Wave Of Silicon Processors
Posted on: May 16, 2018
The semiconductor industry is currently caught in the middle of what I call the third great wave of silicon development for processing data. This time, the surge in investment is driven by the rising hype and promising future of artificial intelligence, which relies on a set of machine learning techniques known as deep learning. As a veteran with over 30 years in the chip business, I have seen this kind of cycle play out twice before, but the amount of money being plowed into the deep learning space today far exceeds what was invested during the other two cycles combined.
The first great wave of silicon processors began with the invention of the microprocessor itself in the early 1970s. There are several claimants to the title of the first microprocessor, but by the early 1980s, it was clear that microprocessors were going to be a big business, and almost every major semiconductor company (Intel, TI, Motorola, IBM, National Semiconductor) had jumped into the race, along with a number of hot startups. These startups (Zilog, MIPS, Sun Microsystems with SPARC, Inmos with its Transputer) took the new invention in new directions. And while Intel clearly dominated the market with its PC-driven volumes, many players continued to invest heavily well into the 90s.
As the microprocessor wars settled into an Intel-dominated détente (with periodic flare-ups from companies such as IBM, AMD, Motorola, HP and DEC), many experienced processor designers looking for a new challenge turned their energy to a new focus: 3-D graphics. The highly visible success of Silicon Graphics, Inc. showed that there was a market for beautifully rendered images on computers. By the early 90s the PC standard had evolved to allow the addition of graphics accelerator cards, and when SGI released the OpenGL standard in 1992, it opened the door to a market for independently designed graphics processing units (GPUs).
Startups such as Nvidia, Rendition, Raycer Graphics, ArtX and 3dfx took their shots at the business. At the end of the decade, ATI bought ArtX, and the survivors of this second wave of silicon processor development were set. While RISC-based architectures like ARM, MIPS, PowerPC and SPARC persisted (and in ARM's case, flourished), the action in microprocessors never returned to the level of the late 80s and early 90s. Competition between Nvidia and ATI (eventually acquired by AMD) drove rapid advances in GPUs, but the barrier to entry was high enough to scare off most new entrants.
In 2006, Geoffrey Hinton published a paper describing how a long-established technique, the neural network, could be improved by adding more layers to the network. This discovery turned machine learning into deep learning. In 2009, Andrew Ng, a researcher at Stanford University, published a paper showing how the computing power of GPUs could be used to dramatically accelerate the mathematical calculations required by convolutional neural networks (CNNs).
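As a rough illustration of that last point, the sketch below (plain NumPy, not drawn from either paper) shows why CNN arithmetic maps so naturally onto GPUs: a 2-D convolution can be lowered, via the standard "im2col" trick, into one large matrix multiplication, and dense matrix multiplication is exactly the massively parallel workload GPU hardware is built for. NumPy stands in here for a GPU library; on real hardware that single matmul is what gets dispatched to thousands of cores.

import numpy as np

def conv2d_as_matmul(image, kernels):
    # Convolve a single-channel image with a stack of K x K filters by
    # lowering the convolution to one dense matrix multiplication.
    # (As in deep learning frameworks, this is cross-correlation: no kernel flip.)
    H, W = image.shape
    F, K, _ = kernels.shape                  # F filters, each K x K
    out_h, out_w = H - K + 1, W - K + 1

    # im2col: gather every K x K patch of the image into one row of a big matrix.
    patches = np.empty((out_h * out_w, K * K))
    for i in range(out_h):
        for j in range(out_w):
            patches[i * out_w + j] = image[i:i + K, j:j + K].ravel()

    # One dense matmul performs every per-patch, per-filter dot product at once;
    # this is the step a GPU parallelizes.
    out = patches @ kernels.reshape(F, K * K).T   # shape (out_h*out_w, F)
    return out.T.reshape(F, out_h, out_w)

# Tiny usage example with random data.
image = np.random.rand(8, 8)
kernels = np.random.rand(4, 3, 3)
print(conv2d_as_matmul(image, kernels).shape)     # (4, 6, 6)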