
Faceshift demo

"There's this idea that, well, if you don't have the data, how would you ever learn? Well, turns out, if you want to get pictures of mountains you don't need to get it out of people's personal photo libraries," Craig Federighi, Apple senior vice president of software, said in an interview last week.Performance driven facial animation is an increasingly important tool. Sure, big batches of data and photos are needed to make sure these kind of computer vision applications work, but Apple insists that it doesn't need your photos, specifically. Part of the different approach comes down to how Apple treats user data. In contrast, Google uploads each user's entire photo library to look for new patterns. That's because Apple is pre-training - or teaching - its algorithm with these specific terms, then installing that pre-trained algorithm on people's devices. (The full list is at the bottom of the post.)


Developer Kay Yin decided to go under the hood, into system frameworks, to take a peek at how Apple actually pulls off this artificial-intelligence feat. It turns out that Apple's newest software includes a list of the objects, terms, and even facial expressions that Photos can detect: photo search works by matching against 4,432 different scenes and objects, and the app recognizes and distinguishes between only seven hardcoded facial expressions. "I found them by poking around plist files and system framework binaries" in the beta version of Apple's software, Yin told Business Insider. So Apple is looking through your photos for specific images like "alligator," "dog," or "burrito." But this approach also means that, at the moment, you won't be able to search for, say, "beef jerky," which isn't included in the list.
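To make the fixed-list limitation concrete, here is a hedged sketch of the kind of check Yin's discovery implies: load a plist of labels and test whether a search term is covered. The file name SceneTaxonomy.plist is hypothetical; the real plists live inside private system frameworks and are undocumented:

    import Foundation

    do {
        // Hypothetical path; Yin pulled the real lists out of private
        // framework plists and binaries.
        let url = URL(fileURLWithPath: "SceneTaxonomy.plist")
        let data = try Data(contentsOf: url)

        // Assume the plist's root is a flat array of label strings.
        let labels = try PropertyListDecoder().decode([String].self, from: data)

        // A term is searchable only if it appears in the shipped list.
        for term in ["burrito", "beef jerky"] {
            let known = labels.contains {
                $0.caseInsensitiveCompare(term) == .orderedSame
            }
            print("\(term): \(known ? "searchable" : "not in the list")")
        }
    } catch {
        print("could not read label list: \(error)")
    }

Run against the terms above, "burrito" would match while "beef jerky" would not, which is exactly the behavior the article describes.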






