So things have certainly been moving right along in the iOS machine learning world since we last posted in 2017, haven't they? Whilst we have our accustomed healthy skepticism of the frothily wild-eyed claims of The Universal Panacea Of Machine Learning you see floating around —
CONCERNED PARENT: If all your friends jumped off a bridge, would you?
MACHINE LEARNING ALGORITHM: “YESSSS!!!!”
— this new ability to train models on our devices, along with all the other new stuff in Core ML 3, is a bit of a tipping point from "curiosity" to "something to think of serious applications for," we'd say!
If you haven’t taken much notice of the lingo so far and need some bringing up to speed, check out this course list from our friends at CourseDuck:
And if you want lots and lots and lots of cross-platform machine learning resources, check out the awesome list
But we’re focusing on the device training today, and from what we can tell the reigning authority in that space is Matthijs Hollemans’ blog, most notably the four-piece series On-device training with Core ML completed last month:
- Introduction to on-device training
- Rock, Paper, Scissors (Lizard? Spock?)
- k-Nearest Neighbors
- Training a Neural Network
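To get a feel for why k-NN is such a natural fit for on-device training, here’s a minimal conceptual sketch — plain Swift, not the Core ML API — of the idea behind the updatable k-NN models Core ML 3 supports: "training" is just storing labeled feature vectors, and prediction is a majority vote among the k closest stored examples.

```swift
import Foundation

// Minimal k-nearest-neighbors classifier, for illustration only.
struct KNNClassifier {
    private var examples: [(features: [Double], label: String)] = []
    let k: Int

    init(k: Int = 3) { self.k = k }

    // On-device "training" is just appending a labeled example — cheap
    // enough to do live on a phone, which is why k-NN made it into Core ML 3.
    mutating func add(features: [Double], label: String) {
        examples.append((features, label))
    }

    func predict(_ query: [Double]) -> String? {
        guard !examples.isEmpty else { return nil }
        // Rank stored examples by squared Euclidean distance to the query.
        let neighbors = examples
            .map { (distance: zip($0.features, query)
                        .reduce(0) { $0 + ($1.0 - $1.1) * ($1.0 - $1.1) },
                    label: $0.label) }
            .sorted { $0.distance < $1.distance }
            .prefix(k)
        // Majority vote among the k nearest labels.
        var votes: [String: Int] = [:]
        for neighbor in neighbors { votes[neighbor.label, default: 0] += 1 }
        return votes.max { $0.value < $1.value }?.key
    }
}

var knn = KNNClassifier(k: 3)
knn.add(features: [0.0, 0.0], label: "rock")
knn.add(features: [0.1, 0.1], label: "rock")
knn.add(features: [1.0, 1.0], label: "paper")
print(knn.predict([0.05, 0.05]) ?? "unknown")  // "rock"
```

The real thing uses MLUpdateTask and an updatable model file, of course — see the series above for the genuine article.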
For sure read that whole series, and check out the rest of the blog too; we particularly liked
And now you have a Combine processing chain that, every time you send it a UIImage object with imagePublisher.send(image), will automatically run a Core ML model on that image and process the results. Pretty cool!
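In case that piques your curiosity, here’s a rough sketch of what such a pipeline might look like — names like `imagePublisher` and the `MyClassifier` model are our assumptions for illustration, not the post’s exact code. Each UIImage sent into the subject gets run through a Vision-wrapped Core ML model, and the top classification comes out the other end:

```swift
import Combine
import UIKit
import Vision

// A subject you can feed UIImages into from anywhere in the app.
let imagePublisher = PassthroughSubject<UIImage, Never>()

// Wrap a compiled Core ML model (here a hypothetical `MyClassifier`)
// so Vision can drive it.
let vnModel = try! VNCoreMLModel(for: MyClassifier().model)

let subscription = imagePublisher
    .compactMap { $0.cgImage }  // UIImage → CGImage
    .compactMap { cgImage -> VNClassificationObservation? in
        // Run the model synchronously on each incoming image.
        let request = VNCoreMLRequest(model: vnModel)
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])
        return request.results?.first as? VNClassificationObservation
    }
    .sink { observation in
        print("\(observation.identifier): \(observation.confidence)")
    }

// Later, wherever an image arrives:
// imagePublisher.send(image)
```

In a real app you’d want error handling and probably a `receive(on:)` hop off the main queue before the Vision work — but the shape of it really is that simple.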
And if you like those enough to pay for more, he’s got not just one but two books out: Machine Learning by Tutorials and Core ML Survival Guide — which we’re pretty sure makes him the go-to guru of iOS machine learning!
Other good introductions to new features and related goodies:
And here are some examples of classifiers and detectors we figure look useful, interesting, or just amusing:
And in that last category, our Machine Learning Rise Of Skynet Award goes to
Some time ago I created a little side project that involved an Arduino-powered servo motor that menacingly pointed at people’s faces with the help of CoreML, mimicking the Team Fortress 2 Engineer’s Sentry Gun. With iOS 13, I decided to re-write that using the new Socket APIs and SwiftUI…
“Mimicking,” he says now, but just you wait…