Curb Your Enthusiasm: What AI Can't Do

Posted on: Sep 05, 2019
In my last post, “It’s Time to Demystify Machine Learning,” I shared a simple explanation of machine learning: teaching computers to learn by repeatedly correcting models derived from data until the machine can rapidly and correctly apply those models to new datasets. Now, to look at the other side, I'll share three things machine learning can’t do well.
No Pattern Lock-In
Our brains work by collecting sensory data from the world around us and encoding it at junctions between neurons called synapses. The more often an experience repeats, the stronger the synapses’ chemical bonds become, which is how practice improves our skills. Our brains also filter incoming data through past experiences to quickly and effortlessly assess and understand what's going on around us. Less frequent or one-time experiences form weaker bonds that eventually fall away.
This strength becomes a weakness when we face data unlike anything we've experienced before. In 1991, a tornado formed over Gull Lake in Minnesota. A homeowner shot incredible real-time video as the storm grew in strength and approached his home. Despite being in real danger, he kept filming, and even after he ran into the house he never shut off the camera as it collapsed around his family. When a human encounters data so unlike familiar patterns, we are slow to react and to re-form our mental models.
This is why, in videos from nightclub fires or bombings, we see people who keep dancing, look around, or think it's part of the show when they're actually in real danger. A cameraman taking live video during the Station nightclub fire saw the flames and backed away almost immediately; as a news photographer, his brain was more familiar with dangerous situations than those of the people dancing away as the band continued to play. Our failsafe does eventually kick in once it's clear we're in danger. The man filming the tornado and his family all survived, but his slow reaction to abnormal data didn't do him any favors.
Machine learning models lack this failsafe. When presented with data points outside the norm, the kind that make a human mind realize the model should be thrown out and rebuilt, a machine learning model will simply disregard those points rather than change itself. This is one factor limiting machine learning's usefulness.
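To make that concrete, here is a minimal, hypothetical sketch (not from the post, and the class and names are illustrative): a running-mean model that rejects any point more than a few standard deviations from what it has already learned. A genuine regime change is discarded as an outlier forever, so the model never "realizes" it should be rebuilt.

```python
class RunningMeanModel:
    """Tracks a running mean/variance and ignores points it deems anomalous."""

    def __init__(self, threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations (Welford's algorithm)
        self.threshold = threshold

    def std(self):
        return (self.m2 / self.n) ** 0.5 if self.n > 1 else float("inf")

    def update(self, x):
        # Points outside the norm are discarded, not used to rebuild the model.
        if self.n > 1 and abs(x - self.mean) > self.threshold * self.std():
            return False  # rejected: the model never learns the new regime
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return True

model = RunningMeanModel()
for x in [10, 11, 9, 10, 11, 10, 9, 11, 10]:
    model.update(x)

# A sudden regime shift (the "tornado") is rejected as noise:
accepted = model.update(100)
print(accepted)           # False
print(round(model.mean))  # still ~10
```

A human analyst seeing a value of 100 in this stream would suspect the world had changed and refit; the model, by design, filters the signal out.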