MACHINE LEARNING CAN CREATE FAKE ‘MASTER KEY’ FINGERPRINTS

Posted on: Nov 17, 2018

JUST LIKE ANY lock can be picked, any biometric scanner can be fooled. Researchers have shown for years that the popular fingerprint sensors used to guard smartphones can sometimes be tricked using a lifted print or a person's digitized fingerprint data. But new findings from computer scientists at New York University's Tandon School of Engineering could raise the stakes significantly. The group has developed machine learning methods for generating fake fingerprints—called DeepMasterPrints—that not only dupe smartphone sensors, but can successfully masquerade as prints from numerous different people. Think of it as a skeleton key for fingerprint-protected devices.

The work builds on research into the concept of a "master print" that combines common fingerprint traits. In initial tests last year, NYU researchers explored master prints by manually identifying various features and characteristics that could combine to make a fingerprint that authenticates multiple people. The new work vastly expands the possibilities, though, by developing machine learning models that can churn out master prints.

"Even if a biometric system has a very low false acceptance rate for real fingerprints, they now have to be fine-tuned to take into account synthetic fingerprints, too," says Philip Bontrager, a PhD candidate at NYU who worked on the research. "Most systems haven’t been hardened against an artificial fingerprint attack, so it’s something on the algorithmic side that people designing sensors have to be aware of now."

The research capitalizes on the shortcuts that mobile devices take when scanning a user's fingerprint. The sensors are small enough that they can only "see" part of your finger at any given time. As such, they make some assumptions based on a snippet, which also means that fake fingerprints likely need to satisfy fewer variables to trick them.
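The arithmetic behind that observation can be sketched quickly. The article does not give specific numbers, so the figures below are purely hypothetical: if a phone stores several partial impressions of the owner's finger and a fake print only has to match any one of them, the attacker's odds compound.

```python
# Illustrative arithmetic, not from the article: suppose a phone stores
# k partial impressions of the enrolled finger, and a fake print matches
# any single impression with probability p (independently). Then the
# chance the fake print unlocks the phone is 1 - (1 - p)^k.

def unlock_probability(p_single: float, k_impressions: int) -> float:
    """Probability a fake print matches at least one stored partial impression."""
    return 1.0 - (1.0 - p_single) ** k_impressions

# With hypothetical numbers: a 1% per-impression match rate and 8 stored snippets.
print(round(unlock_probability(0.01, 8), 4))  # -> 0.0773
```

Even under these made-up numbers, matching any one of eight snippets is nearly eight times easier than matching a single full print, which is why partial sensing shrinks the number of variables a fake must satisfy.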

The researchers trained neural networks on images of real fingerprints, so the system could begin to output a variety of realistic snippets. Then they used a technique called "evolutionary optimization" to assess what would succeed as a master print—with every characteristic as familiar and convincing as possible—and guide the output of the neural networks.
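The loop described above—generate candidates with a trained network, score them against a matcher, and let the best survive—can be sketched as a toy evolutionary search. Everything here is a stand-in: `generate` takes the place of the researchers' trained neural network, and `match_count` takes the place of a commercial matcher's fitness signal; neither reflects their actual models.

```python
import random

def generate(latent):
    # Stand-in for the trained neural network: in the real system this maps
    # a latent vector to a fingerprint image. Here it is an identity function.
    return latent

def match_count(image):
    # Stand-in fitness: rewards candidates close to an arbitrary "common"
    # pattern, loosely analogous to matching many enrolled identities.
    target = [0.5] * len(image)
    return -sum((a - b) ** 2 for a, b in zip(image, target))

def evolve(dim=8, pop_size=20, generations=50, sigma=0.1, seed=0):
    """Simple hill-climbing evolutionary search over the generator's input space."""
    rng = random.Random(seed)
    best = [rng.uniform(-1, 1) for _ in range(dim)]
    best_score = match_count(generate(best))
    for _ in range(generations):
        for _ in range(pop_size):
            # Mutate the current best candidate with Gaussian noise.
            cand = [x + rng.gauss(0, sigma) for x in best]
            score = match_count(generate(cand))
            if score > best_score:  # keep only improvements
                best, best_score = cand, score
    return best, best_score
```

The key design idea mirrors the article: the neural network guarantees candidates look like realistic fingerprint snippets, while the evolutionary step steers those candidates toward maximal cross-identity matching.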

The researchers then tested their synthetic fingerprints against the popular VeriFinger matcher—used in a number of consumer and government fingerprint authentication schemes worldwide—and two other commercial matching platforms, to see how many identities their synthetic prints matched with.


Fingerprint matchers can be set with different levels of security in mind. A top secret weapons facility would want the lowest possible chance of a false positive. A regular, consumer smartphone would want to keep obvious frauds out, but not be so sensitive that it frequently rejects the actual owner. Against a moderately stringent setting, the research team's master prints matched with anywhere from two or three percent of the records in the different commercial platforms up to about 20 percent, depending on which prints they tested.
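Those match rates become more alarming in a dictionary-attack setting. Smartphones typically allow a handful of unlock attempts before locking out, so an attacker could try several different master prints in sequence. The back-of-envelope below is hedged: it assumes the prints match independently, and the match rates are illustrative numbers chosen from the 2–20 percent range the article reports, not figures from the study.

```python
from math import prod

# Hedged back-of-envelope (assumes independent master prints; rates are
# illustrative, not from the study): if master print i matches a fraction
# p_i of users, the chance that at least one of the tried prints unlocks
# a randomly chosen phone is 1 - prod(1 - p_i).

def dictionary_attack_success(match_rates):
    """P(at least one master print matches), assuming independent prints."""
    return 1.0 - prod(1.0 - p for p in match_rates)

# Five hypothetical master prints, each matching 2-20% of enrolled records:
rates = [0.02, 0.03, 0.05, 0.10, 0.20]
print(round(dictionary_attack_success(rates), 3))  # -> 0.35
```

Under these assumed numbers, five attempts would be enough to unlock roughly a third of phones, which illustrates why per-print match rates of even a few percent matter at scale.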