Under The Bridge

Musings
Hello WRLD: VR Developer Challenge

So ARKit by itself is pretty interesting, and adding Core Location to it gets even more interesting, but for the perfect trifecta, what do we need? Why, we need a VR world, so that we can explore real world locations without actually being there, amirite?

Well, turns out there’s an SDK for that: WRLD!

YOUR WORLD, REIMAGINED
A dynamic 3D mapping platform for smart cities and buildings, gaming, AR/VR experiences, and more.

Apparently they’ve been around for several years as “eeGeo”, but the much improved rebranding is new:

We are excited to announce today our rebrand of eeGeo to WRLD. We believe the name reflects the scope and breadth of our vision, which is to build an immersive 3D world to visualise and interact with every thing on the planet.

We’ve set an ambitious goal with our new name and vision on purpose. We can’t do it alone, as we globally operate as a collaboration between partners and our developer community. We’re committed to providing the best tools to build fully immersive experiences that benefit users in numerous ways – greater spatial understanding and decisioning, smart cities and building planning, or just creating awesome location-based games and VR experiences…

Yep, that’s pretty darn interesting. We’d already had some interesting ideas for how to mash up ARKit with real world locations, but were kinda stymied by the “well, it would only be interesting if you were actually there…” thing. But if you could take your informational ARKit app and place it in a VR world instead of the real world, well, that gets rather more interesting, doesn’t it?

And as it happens, there’s an opportunity to get some recognition for your experiments:

WRLD DEVELOPER CHALLENGE

What if you had the WRLD at your fingertips? What would you create?

WRLD is hosting a global challenge for developers to create the most useful application or immersive experience using any of the WRLD SDKs.

Contest starts October 1st and final submissions are due December 31st.

#WRLDChallenge

 

Hello WRLD Challenge

 

So there you go, have at it!

iPhone X Notch-alance

Remember those innocent days a few short months ago when we all spotted the new safeAreaLayoutGuide and thought

Our constraints are now with the top and bottom anchors of the safe area layout guide. This is not a big change…

Oh, sweet naiveté. Did anybody actually call out in advance that subtle little change was foreshadowing THE NOTCHALYPSE? Yes, yes, every time something changes it brings out the fear and loathing no matter what it is, but the level of OMG NEVER BUY APPLE AGAIN … we saw this time, that was unprecedented. Personally, we think a phone with a bat-eared screen is kinda neat, but we’re weird. 

There’s a certain level of enthusiasm out there for clipping and masking and otherwise working around the new roundedness and notchedness … but we think Marco has it right here:

This is the new shape of the iPhone. As long as the notch is clearly present and of approximately these proportions, it’s unique, simple, and recognizable.

It’s probably not going to significantly change for a long time, and Apple needs to make sure that the entire world recognizes it as well as we could recognize previous iPhones.

That’s why Apple has made no effort to hide the notch in software, and why app developers are being told to embrace it in our designs…

So, we’ve got some rewriting to do in the next few weeks to have our apps fit nicely into The Brave New Notched World, and we imagine you probably do too; let’s see what resources are out there!

First off, check out the relevant stuff over at the mothership:

Building Apps for iPhone X

Designing for iPhone X

Updating Your App for iOS 11

Human Interface Guidelines — iPhone X 

Positioning Content Relative to the Safe Area

The great PaintCode people have updated their invaluable 

The Ultimate Guide To iPhone Resolutions

and also have specs for all the Notch attributes at

iPhone X Screen Demystified

More good reads on upgrading UI for the X at

Design for iPhone X

Supporting iPhone X

iPhone X: Designing for the Notch

iPhone X: Dealing with Home Indicator

UI Design for iPhone X: Bottom Elements

How iOS Apps Adapt to the iPhone X Screen Size — and the Adaptivity app showcased here looks like a good purchase!

Adaptivity is an app for developers and designers to visualize how iOS’s Size Classes and margins for layout and readable content look on real devices and how they change with respect to orientation, iPad Slide Over/Split View and Dynamic Type size changes…

Templates to help you out at

Apply Pixel’s iPhone X and iOS 11 UI Kit

iOS Design Kit’s Free iOS 11 GUI for Sketch

If you’re concerned with how your web sites will deal with the Notch, check out

Designing Websites for iPhone X 

Understanding the WebView Viewport in iOS 11

And let’s finish off with the niftiest idea we’ve seen for adapting in a non-approved way with the Notch:

ScrollSnake: “What if scroll bars on the iPhone X worked like the game “Snake”?”

UPDATES:

UI Design for iPhone X: Top Elements and the Notch

Designing for iPhone X: Guidelines to designing for iOS 11

The Reality Side of AR

Got any plans for Friday October 27th? No? Well, that’s a great time for a vacation in Russia, and if you’re around Moscow that day, check out this conference: MBLTdev!

MBLT 2017

That first talk on the programme looks like a particularly good one:

Yes, we’re making our outside-North-America speaking debut by leading off the programme here, no less. OK, pressure’s on to make this a particularly good one then!

The basic idea is to take this trenchant observation on the initial release of ARKit

… and well, since he’s quite right, why don’t we just start doing something about that?

It seems that iBeacon technology has kinda languished — at least, if there are any killer apps out there, we’ve managed to miss them — but ARKit seems like just the thing to combine it with for something awesome, doesn’t it?

Well, visit MBLTdev in Moscow October 27th to find out just how good a job we do of pulling that off!

Validate My Experience

So we’re looking at updating one of our apps here for iOS 11 and joining the move to subscriptions, which appears to be the only reasonable path to sustaining a productivity app these days; that means checking what’s new in receipt validation to support it, since the last time we looked into the topic was four years ago, and RMStore, which we picked back then, has languished since.

Let’s look around a bit … hmmm … hmmm … oh, what’s this? A comprehensive series walking through the whole process, updated for Swift 3? Well, bit late to the party, but looks like it’s all still relevant, and receipt validation code is something we really ought to write ourselves:

Preparing to Test Receipt Validation for iOS

After having to piece together each step along the path of preparing to test receipt validation for iOS apps, I’ve decided to combine everything into the following guide. Whether you’re working to implement receipt validation for a new iOS app, or for an existing one, this walk-through should provide guidance to get you ready to work with receipts in your iOS application…

Loading a Receipt for Validation with Swift

There are at least 5 steps to validate a receipt, as the Receipt Validation Programming Guide outlines…
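
Step one is pleasantly mundane, happily; here’s a minimal sketch of grabbing the receipt data, with error handling of our own devising:

    import Foundation

    // A sketch of step one: load the receipt the App Store placed in
    // the app bundle. If it's missing (e.g. a development build), kick
    // off an SKReceiptRefreshRequest before trying to validate.
    func loadReceiptData() -> Data? {
        guard let receiptURL = Bundle.main.appStoreReceiptURL,
              FileManager.default.fileExists(atPath: receiptURL.path) else {
            return nil
        }
        return try? Data(contentsOf: receiptURL)
    }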

Extracting a PKCS7 Container for Receipt Validation with Swift

Before attempting to work with OpenSSL’s PKCS7 functions, you’ve got to do a little prep work to get the functions to play nicely with Swift. Unfortunately, Swift doesn’t work well with C union types. It simply can’t see things defined with a C union…

Receipt Validation – Verifying a Receipt Signature in Swift

The aim of this guide is to help you take a look inside the PKCS #7 container, and verify the presence and authenticity of the signature on the receipt…

Receipt Validation – Parse and Decode a Receipt with Swift

The aim of this guide is to help you parse a receipt and decode it so that you have readable pieces of metadata to inspect and finalize all of the receipt validation steps…

Finalizing Receipt Validation in Swift – Computing a GUID Hash

After finishing this guide, you’ll simply need to use the parsed receipt data to perform any app-specific enabling/disabling of features based on the data within a valid receipt. If the receipt is invalid, you’ll need to handle that as well. But all of the relatively difficult work of working with the Open SSL crypto library will be DONE after this guide….

… The bottom line is that from this point on, you no longer need Open SSL or any additional cryptic, low-level, unsafe pointer-type stuff to finish things out. I hope this series has been helpful in setting you up to validate receipts locally on a user’s device!

That … is a veritably monumental guide to the validation process. And conveniently finished up right when we need it too, thanks ever so much @andrewcbancroft!

UPDATE:

Turns out we beat Messr. Bancroft pulling together his own collected guide by just a day; now that you’ve read ours, go to the real thing!

Local Receipt Validation for iOS in Swift From Start to Finish

and code at SwiftyLocalReceiptValidator!

YapDatabase: Ditch Core Data

The choice of database technology for our apps has been something of a question for us the last while, where by “last while” we mean since iCloud Core Data was deprecated, which miffed us no end since we’d rather like our apps’ initial releases to Just Work™ across devices for iOS users without jumping through any great hoops or introducing any dependencies on non-Apple services.

So let’s see what’s new in Core Data this year in the iOS 11 release notes! Errr… nothing?

Hmmm. Let’s check out the What’s New In Core Data WWDC 2017 session

kinda thin, don’t you think?

Hmmm, hmmm. Sorta looking like it’s time to seriously evaluate the alternatives, it is.

We liked the looks of RocketData, very Swifty and all, but it … appears to be languishing.

Soooooo, what to do? Well, here is an article that presents an alternative of interest:

Ditching Core Data

I have been using Core Data since iOS 3.0. With few exceptions, it has served entirely as a network cache. Take data from a network response, turn it into a set of objects, save those objects, and update the user interface. Rinse and repeat.

The stack is pretty simple. A set of nested contexts to keep object building off of the main thread, a couple of models that have init(dictionary: NSDictionary), and some find-or-create helpers can get you a long way. The problem is that the apparent simplicity comes with an incredible amount of baggage. Instability, buggy classes, and outdated APIs make building modern apps difficult.

As a community, we have tried to solve these problems by building wrappers around Core Data. These wrappers add their own issues and none of them hide the fact that Core Data was simply not meant to be used like this…

Nodding along here? Yep, us too.

After some research and planning, I landed on a set of features that I wanted in a persistence framework. The list is pretty short:

  • My objects must be plain objects. Requiring model objects to extend a certain base class is painful; especially when you cannot see the source for those classes. I want to know that I can create objects however I like, use them wherever I like, and build my own functionality around them without worrying about what sandbox they belong to.
  • I should not have to worry about concurrency. Tools like closures, Grand Central Dispatch, and NSOperationQueue make it easy to write code that spans many threads. Unpacking objects on background threads that are later used to update interface elements is a common operation that should be simple and easy to reason about.
  • I should not have to write migration code. Frameworks and libraries can and should be smarter than this. Let me describe how I want the database to look. You take care of the rest.
  • I should be able to track and respond to atomic changes. Elements in my application should be able to reload themselves if a single object, or a collection of objects changes. Otherwise it may spend too much time updating things that may not need to be updated, which could have a negative impact on the user experience.

Sounds good. Only thing we’d add as a base requirement is “CloudKit sync” to make the cross-device experience seamless.

And the winner was:

I decided to use YapDatabase. It came highly recommended and it offers support for all of my requirements. YapDatabase is, at its core, a key-value store built on top of SQLite. It has a simple API and is incredibly flexible…

You will notice a few things about my model objects right away: they are completely immutable, they have no knowledge of the persistent store, and no magic superclass. These objects can be passed around the application (and across thread boundaries) without requiring knowledge of the database or how these objects are to be handled…

All database operations in YapDatabase are handled through database connections. YapDatabaseConnection has functions for reading from, writing to, and responding to individual changes in the database. It does all of this in a thread safe manner. One less thing to worry about!
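
Pulling that together, the basic flow looks something like this sketch; the Person model and “people” collection are our own inventions, and note that stored objects are serialized via NSCoding by default:

    import YapDatabase

    // A minimal sketch of the YapDatabase flow described above;
    // Person is our own NSCoding-conforming model class and
    // "people" our own collection name.
    let baseDir = NSSearchPathForDirectoriesInDomains(
        .applicationSupportDirectory, .userDomainMask, true).first!
    let database = YapDatabase(path: baseDir + "/app.sqlite")
    let connection = database.newConnection()
    let person = Person(identifier: "abc123", name: "Anna")

    // Writes happen inside a readWrite transaction on a connection…
    connection.readWrite { transaction in
        transaction.setObject(person, forKey: person.identifier, inCollection: "people")
    }

    // …and reads inside a read transaction, safely from any thread.
    connection.read { transaction in
        let fetched = transaction.object(forKey: person.identifier, inCollection: "people") as? Person
        print(fetched?.name ?? "not found")
    }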

OK, flipping down the feature list there, it looks like a solid solution for entry-level needs, with a track record and active maintenance; just that one little extra base requirement…

• Sync. Support for syncing with Apple’s CloudKit is available out of the box. There’s even a fully functioning example project that demonstrates writing a syncing Todo app.

… well, there we go then. Unless some dramatically superior alternative comes to our attention, next data-storing app we ship will use YapDatabase too, farewell Core Data!

Nine Releases Make An Xcode

All these new APIs are very well and all, but now let’s get to the really exciting stuff from WWDC:

What’s New in Xcode 9

  • All new editor. Fast, structure-based editor that lets you intelligently highlight and navigate your code. Includes great Markdown support.
  • Refactoring. Refactoring built right into the editing experience and works across Swift, Objective-C, Interface Builder, and many other file types.
  • Super-fast search. The Find navigator returns results instantly.
  • Debugging. Wirelessly debug iOS and tvOS devices over the network, new debuggers for Metal, and more features throughout Xcode.
  • Source Control. All new source control navigator and integrated support for GitHub accounts for quickly browsing repositories and pushing your repositories to the cloud.
  • Xcode Server built-in. Continuous integration bots can be run on any Mac with Xcode 9, no need to install macOS Server.
  • New Playground templates. Includes iOS templates designed to run well in both Xcode and Swift Playgrounds on iPad.
  • New Build System. An opt-in preview of Xcode’s new build system provides improved reliability and performance.

That is … a remarkable array of improvements. And those are just the highlights! Seriously, go Read The Whole Thing™. Most are runtime and debugging improvements; we’ll single out for further highlighting here the changes in asset catalogs, since there’s some new resource stuff you’ll want to be aware of to be iOS 11 savvy:

Asset Catalogs

  • Named colors support.
  • Added wide gamut app icons.
  • Added a larger iOS marketing icon to the App Icon set.
  • Added option to preserve image vector data for matching Dynamic Type scaling.
  • Added support for HEIF images.

The biggest one there is, as explained here,

Preserve Vector Data

There’s a new checkbox in the Asset Catalog Attributes Inspector called “Preserve Vector Data.” Checking this box will ensure that Xcode includes a copy of the PDF vector data in the compiled binary. At runtime, iOS can automatically upscale or downscale the vector data to output an image in your app whether you’re using the image in code or in a Storyboard scene. Remember: when using PDF vector data, set the “Scales” value to “Single Scale” in the Attribute Inspector to ensure the PDF vector data is loaded properly to populate the image.

This change also works in conjunction with the new Tab Bar icon HUD that Apple implemented as an accessibility feature in iOS 11. If you enable “Preserve Vector Data” this feature comes to your apps with no additional work. By enabling this feature, iOS 11 can also automatically scale images regardless of whether you’re increasing a UIImageView’s bounds or using Size Classes to change a UIImageView’s size…

Screenshot samples at The Unexpected Joy of Vector Images in iOS 11.
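
Code-wise there’s nothing new to call: load the asset as usual, plus one property if you want the accessibility-driven scaling mentioned above (asset name here is our own):

    import UIKit

    // With "Preserve Vector Data" checked on the "toolbar-icon" asset
    // (our own name), the image renders crisply at any size; this flag
    // opts into the iOS 11 accessibility image scaling as well.
    let imageView = UIImageView(image: UIImage(named: "toolbar-icon"))
    imageView.adjustsImageSizeForAccessibilityContentSizeCategory = true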

And apparently Apple is bowing to popular pressure and making Xcode all but dependent on Github, see Xcode GitHub Integration and The Marriage of Github and Xcode 9. We’d mutter something curmudgeonly about why don’t you go full fanboi and replace the documentation with hotlinks to Stack Overflow too, but we’re worried they might take that idea seriously…

Community support we do find quite appealing, though, is how enthusiastically the new stuff is being open sourced:

Apple open sources key file-level transformation Xcode components

This afternoon at WWDC we announced a new refactoring feature in Xcode 9 that supports Swift, C, Objective-C, and C++. We also announced we will be open sourcing the key parts of the engine that support file-level transformations, as well as the compiler pieces for the new index-while-building feature in Xcode…

And if that bit about “new build system” struck terror into your massively scripted heart, fear not, it appears to be pretty much a behind the scenes change all around:

New Xcode Build System and BuildSettingExtractor

The new system is written in Swift and promises to be a significant advance in a number of areas including performance and dependency management. The new system is built on top of the open source llbuild project and lays the foundation for integrating the Xcode build system with the Swift Package Manager

It appears that everything about defining build settings remains unchanged. Moving between the old and new build systems did not cause any build setting changes or recommended changes. The mechanisms for generating that giant bucket of key-value pairs known as build settings seem to be just the same as before.

This is great news. As developers, we don’t need to learn a new complex system for defining build settings. We get to enjoy the benefits of a new, faster, modern build system with our existing build settings left intact…

(And if you haven’t moved to .xcconfig files yet, or if you do them by hand, seriously do go check out BuildSettingExtractor. So handy, we even contributed to it — and that’s as high praise as it gets, around these parts!)

That’s enough for a TL;DR to get you salivating for the new stuff — but if you missed our link to New stuff from WWDC 2017 last time, go check it out now; more details on Xcode changes there … and everything else as well. Veritably encyclopedic, that reference!

OK, one last note that isn’t Xcode 9 specific but you’ll want to refer to it anyways: iOS Simulator Power Ups. Something for everyone there!

UPDATES:

Hands-on XCUITest Features with Xcode 9

Customizing the file header comment and other text macros in Xcode 9

iOS Simulator on steroids: Tips & Tricks

Little Xcode Beta Surprises 🎁: Core Graphics Codable Conformance

Measuring Swift compile times in Xcode 9 (gist example)

Awesome native Xcode extensions

Secret variables in Xcode AND your CI for fun and profit

Xcode 9 Vector Images

Managing different environments in your Swift project with ease

Conditionally embed your dynamic frameworks

Everything You Need to Know About Ruby for iOS Development

This iOS Goes To 11

It’s been enough of a while since WWDC ’17 for people to sort out what they find interesting about iOS 11 and all, so let’s take a look, shall we?

The canonical references up at the mothership are

A good TL;DR while you bookmark this for later is What’s new in iOS 11 for developers

iOS 11 By Examples has, surprise, examples of using new iOS 11 APIs:

  • Core ML: Image classification demo using Core ML framework
  • Vision: Face detection, landmarks, and object tracking
  • ARKit: Augmented reality experiences in your app or game
  • Core NFC: Reading of NFC tag payloads.
  • IdentityLookup: SMS and MMS filtering using IdentityLookup framework
  • DeviceCheck: Identifying devices that have used a promo, flagging fraudsters
  • Blogs/Newsletter: Other places that mentioned this list — probably mention more good stuff too!

Personally, we’d already been planning to try out iPad-only travel soon, and iOS 11 + new iPad Pro looks like MASSIVE WIN on both fronts — seriously, did anybody at all predict the super duper ProMotion screen? — and the multitasking stuff is a good bit better than we’d expected, and easy to use, see:

CoreNFC is one of those “what took you so long?” things, but hey, better late than never: CoreNFC tutorial
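
The API is refreshingly small, too; here’s a minimal sketch of reading NDEF tags, with session handling of our own:

    import CoreNFC

    // A sketch of Core NFC's NDEF reading flow: start a session, then
    // receive detected messages via the delegate callbacks.
    final class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
        func begin() {
            let session = NFCNDEFReaderSession(delegate: self, queue: nil,
                                               invalidateAfterFirstRead: true)
            session.begin()
        }

        func readerSession(_ session: NFCNDEFReaderSession,
                           didDetectNDEFs messages: [NFCNDEFMessage]) {
            for message in messages {
                for record in message.records {
                    print("Payload: \(record.payload)")
                }
            }
        }

        func readerSession(_ session: NFCNDEFReaderSession,
                           didInvalidateWithError error: Error) {
            print("Session ended: \(error)")
        }
    }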

Now that Metal is even more metal to support 120 fps displays and all, they’re Introducing Metal 2

If you do anything with file names in iOS or macOS, make sure you read APFS Native Normalization

One thing worth noting, as it removes what we (and likely you too) found to be the major annoyance with UIStackView: Stack View Custom Spacing
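
It’s literally a one-liner now; a tiny sketch, view names being our own:

    import UIKit

    // iOS 11: give one particular gap its own spacing, no invisible
    // spacer views required (stackView and headerView are ours).
    func tightenHeader(in stackView: UIStackView, after headerView: UIView) {
        stackView.setCustomSpacing(4, after: headerView)
    }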

Also, the old top and bottom layout guides are simplified into the Safe Area Layout Guide
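
In practice that means pinning everything to the one guide and trusting it to account for bars, notch, and home indicator alike; a sketch with our own view names:

    import UIKit

    final class NotchSavvyViewController: UIViewController {
        private let contentView = UIView()

        override func viewDidLoad() {
            super.viewDidLoad()
            view.addSubview(contentView)
            contentView.translatesAutoresizingMaskIntoConstraints = false
            // One guide replaces topLayoutGuide/bottomLayoutGuide, and
            // it adapts to the notch and home indicator on iPhone X.
            let guide = view.safeAreaLayoutGuide
            NSLayoutConstraint.activate([
                contentView.topAnchor.constraint(equalTo: guide.topAnchor),
                contentView.leadingAnchor.constraint(equalTo: guide.leadingAnchor),
                contentView.trailingAnchor.constraint(equalTo: guide.trailingAnchor),
                contentView.bottomAnchor.constraint(equalTo: guide.bottomAnchor)
            ])
        }
    }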

Some more little bites, to coin a phrase, at #309: UIFontMetrics and #310: Screen Edges in iOS 11 and #311: Round Corner Improvements
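
The UIFontMetrics one deserves a quick taste here, as it finally lets custom fonts scale along with Dynamic Type; the font name below is just our own pick:

    import UIKit

    // Scale a custom font the way the system fonts scale with
    // Dynamic Type (new in iOS 11; font name is our own choice).
    func applyScaledBodyFont(to label: UILabel) {
        let custom = UIFont(name: "AvenirNext-Regular", size: 17)!
        label.font = UIFontMetrics(forTextStyle: .body).scaledFont(for: custom)
        label.adjustsFontForContentSizeCategory = true
    }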

Now, to step back from the API level, some people are very excited about this Business Chat thing:

… I believe that Apple Pay Cash, and the ability to send low transaction amounts with no cost to the sender or the receiver (the business) will be one of the most transformative elements to the impact of Business Chat. This can give rise to a number of new businesses and applications that may be similar to the “gig economy” of Fiverr and other systems…

Business Chat can not only spell the end of the POS system and payment systems as we know it in retail sales, it may also spell the end of the shopping carts and payment systems in online sales. Some may argue this is bombastic…

Yes, maybe a touch. Doesn’t mean it’s wrong though; read the whole thing, and the Business Chat info at the mothership, and see what you think — or read this one that figures it’s not just shopping carts in the crosshairs: Apple Bank here we come

Last one we’ll call out here — some good thoughts on the new evolutions of the interface design language: Think Bigger: Design Changes in iOS 11

Need more? Check out your veritably canonical reference to everything new over at New stuff from WWDC 2017!

UPDATES:

Changes to location tracking in iOS 11; Location Permissions in iOS 11 and avoiding the Blue Bar of Shame

iOS 11, Privacy and Single Sign On

What’s New In UIKit Animations In Swift 4

Working with CoreNFC in iOS 11

ARKit And Kaboodle

The other obviously transformational technology introduced at WWDC this year was Apple’s sudden leap from

One thing is clear: Apple needs to get moving soon.

to, a mere 5 days later — how’s that for “soon?” —

If Apple gets this right, they will own the hardware market for years to come.

and what we’re sure must be by far the quickest ever adoption of an Apple-only technology outside the computer industry,

Ikea’s plans for ARKit revealed, virtual shopping tool will launch in fall with iOS 11

so let’s start collecting links on that shall we?

Introducing ARKit

With ARKit, iPhone and iPad can analyze the scene presented by the camera view and find horizontal planes in the room. ARKit can detect horizontal planes like tables and floors, and can track and place objects on smaller feature points as well. ARKit also makes use of the camera sensor to estimate the total amount of light available in a scene and applies the correct amount of lighting to virtual objects…

… yeah ok that’s pretty cool.
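
And pretty simple to adopt, to boot; here’s a minimal sketch of the horizontal plane detection described above, class and outlet names being ours:

    import UIKit
    import ARKit

    final class ARPlaneViewController: UIViewController, ARSCNViewDelegate {
        @IBOutlet var sceneView: ARSCNView!

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            sceneView.delegate = self
            // Run a world-tracking session that looks for horizontal planes.
            let configuration = ARWorldTrackingConfiguration()
            configuration.planeDetection = .horizontal
            sceneView.session.run(configuration)
        }

        // Called when ARKit adds a node for a newly detected anchor.
        func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode,
                      for anchor: ARAnchor) {
            guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
            print("Found a plane of extent \(planeAnchor.extent)")
        }
    }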

There’s already signup open for ARKit Weekly to keep abreast of news here, and the same obviously extra-keen folk have also started

Made With ARKit: Hand-picked curation of the coolest stuff made with #ARKit

Those are some pretty darn nifty videos there for, like, two weeks at this!

To grab some source code examples and interesting commentary, check out

Apple ARKit by Example

  1. Getting setup, draw a cube in virtual reality
  2. Plane Detection and Visualization
  3. Adding geometry and physics fun
  4. Physically Based Rendering

ARShooter – An Example Shooter Created Using iOS 11’s ARKit

ViewAR does first tests with ARKit

3 Things To Know About Apple’s ARKit

Apple’s new augmented reality platform may be its next game-changer

Pretty cool huh? It’s like the future’s so bright we have to wear shades … except the whole point of Apple AR (so far…) is that we don’t!

UPDATES:

ARBrush: “Quick demo of 3d drawing in ARKit using metal + SceneKit.”

ARKit Tutorial in Swift 4 for Xcode 9 using SceneKit

Watch a Tesla Model 3 come to life

ARTetris: “Augmented Reality Tetris made with ARKit and SceneKit”

Getting Started with ARKit: Waypoints

ARKit introduction

Wenderlich Introduction to ARKit:

  1. Getting Started
  2. Adding 3D Models
  3. Measuring Distances

I’m presenting the ARKit workshop at RWDevCon 2018!

ARKit-CoreLocation: “Combines the high accuracy of AR with the scale of GPS data.”

Inside iOS 11: The coolest Apple ARKit demos created so far

Using ARKit with Metal

ARKit and Autism: New Futures

Adventures in ARKit (with magic)

ARPaint: “Draw with bare fingers in the air using ARKit.”

ARKit for iOS Developers

Why is ARKit better than the alternatives?; Why Apple’s glasses won’t include ARKit; Q&A on ARKit

FaceRecognition-in-ARKit: “Detects faces using the Vision-API and runs the extracted face through a CoreML-model to identify the specific persons”

Latest demos with iOS 11 ARKit show plated food, 3D sculpting with Apple Pencil

New Augmented Reality Resources Now Available: ARKit Human Interface Guidelines

Apple invites developers, media to Cupertino HQ to showcase ARKit apps

arkit-occlusion: “A demonstration of vertical planes “tracking” and occlusions with ARKit+Scenekit”

How is ARCore better than ARKit?

Using ARKit with Metal and Using ARKit with Metal part 2

Why ARKit will be Apple’s biggest innovation in years

Animating a 3D model in AR with ARKit and Mixamo

ARKit and CoreLocation: Part One

IKEA App

How Is ARCore Better Than ARKit? + The Potential of Apple’s ARKit

How Apple’s iPhone X TrueDepth AR waltzed ahead of Google’s Tango

Core MLagueña

Checked out the WWDC 2017 Roundup yet? OK then, let’s start digging into this new stuff a little deeper. And we’ll start with the one with the most buzz around the web,

Introducing Core ML

Machine learning opens up opportunities for creating new and engaging experiences. Core ML is a new framework which you can use to easily integrate machine learning models into your app. See how Xcode and Core ML can help you make your app more intelligent with just a few lines of code.

Vision Framework: Building on Core ML

Vision is a new, powerful, and easy-to-use framework that provides solutions to computer vision challenges through a consistent interface. Understand how to use the Vision API to detect faces, compute facial landmarks, track objects, and more. Learn how to take things even further by providing custom machine learning models for Vision tasks using CoreML.

By “more intelligent” what do we mean exactly here? Why, check out

iOS 11: Machine Learning for everyone

The API is pretty simple. The only things you can do are:

  1. loading a trained model
  2. making predictions
  3. profit!!!

This may sound limited but in practice loading a model and making predictions is usually all you’d want to do in your app anyway…

Yep, probably. Some people are very excited about that approach:

Apple Introduces Core ML

When was the last time you opened up a PDF file and edited the design of the document directly?

You don’t.

PDF is not about making a document. PDF is about being able to easily view a document.

With Core ML, Apple has managed to achieve an equivalent of PDF for machine learning. With their .mlmodel format, the company is not venturing into the business of training models (at least not yet). Instead, they have rolled out a meticulously crafted red carpet for models that are already trained. It’s a carpet that deploys across their entire lineup of hardware.

As a business strategy, it’s shrewd. As a technical achievement, it’s stunning. It moves complex machine learning technology within reach of the average developer…

Well, speaking as that Average Developer here, yep this sure sounds like a great way to dip a toe into $CURRENT_BUZZWORD without, y’know, having to actually work at it. Great stuff!
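
For the record, dipping that toe through the Vision wrapper looks something like this sketch; it assumes you’ve added one of the models listed below (say, Inceptionv3) to your project so Xcode generates its model class:

    import CoreML
    import Vision

    // A minimal "load a trained model, make predictions" sketch using
    // Vision; Inceptionv3 is the Xcode-generated class for the model.
    func classify(_ image: CGImage) throws {
        let model = try VNCoreMLModel(for: Inceptionv3().model)
        let request = VNCoreMLRequest(model: model) { request, _ in
            guard let best = (request.results as? [VNClassificationObservation])?.first
                else { return }
            print("Saw: \(best.identifier) (confidence \(best.confidence))")
        }
        try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
    }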

Here’s some more reactions worth reading:

Here are some models to try it out with, or you can convert your own, built with XGBoost, Caffe, LibSVM, scikit-learn, or Keras:

  • Places205-GoogLeNet CoreML (Detects the scene of an image from 205 categories such as an airport terminal, bedroom, forest, coast, and more.)
  • ResNet50 CoreML (Detects the dominant objects present in an image from a set of 1000 categories such as trees, animals, food, vehicles, people, and more.)
  • Inception v3 CoreML (Detects the dominant objects present in an image from a set of 1000 categories such as trees, animals, food, vehicles, people, and more.)
  • VGG16 CoreML (Detects the dominant objects present in an image from a set of 1000 categories such as trees, animals, food, vehicles, people, and more.)

And some samples and tutorials:

Also note NSLinguisticTagger, part of the new ML family here too.
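
A quick taste of that one, using the new-in-iOS-11 unit-based tagging API:

    import Foundation

    // Tag each word in a sentence with its part of speech using
    // NSLinguisticTagger's iOS 11 unit-based API.
    let text = "Core ML makes machine learning approachable."
    let tagger = NSLinguisticTagger(tagSchemes: [.lexicalClass], options: 0)
    tagger.string = text
    let range = NSRange(location: 0, length: text.utf16.count)
    tagger.enumerateTags(in: range, unit: .word, scheme: .lexicalClass,
                         options: [.omitWhitespace, .omitPunctuation]) { tag, tokenRange, _ in
        if let tag = tag {
            let word = (text as NSString).substring(with: tokenRange)
            print("\(word): \(tag.rawValue)")
        }
    }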

For further updates we miss, check out awesome-core-ml and Machine Learning for iOS!

UPDATES:

YOLO: Core ML versus MPSNNGraph

Why Core ML will not work for your app (most likely)

Blog: Getting Started with Vision

MLCamera – Vision & Core ML with AVCaptureSession Inceptionv3 model

Can Core ML in iOS Really Do Hot Dog Detection Without Server-side Processing?

Bringing Machine Learning to your iOS Apps 🤖📲

Creating A Simple Game With CoreML In Swift 4

Custom classifiers in iOS11 using CoreML and Vision

Using Vision Framework for Text Detection in iOS 11

Smart Gesture Recognition in iOS 11 with Core ML and TensorFlow

Awesome-CoreML-Models: “Largest list of models for Core ML (for iOS 11+)”

WWDC 2017 Roundup

That was a particularly good WWDC this year, wasn’t it? Lots of possibly transformational technologies introduced; it’s going to be very interesting indeed to see how they play out!

We’ll explore those in more depth later on, but for now let’s sort out some resource links and initial reactions.

First up, you want to go download The Unofficial WWDC app for macOS:

Isn’t that pretty? Definitely your WWDC video viewing app of choice, we say!

Next, address the problem of getting all those session videos downloaded with the WWDC-Downloader script over at Github.

(And while that’s all downloading for your next transcontinental plane flight or whatever, have your heart warmed with

WWDC17 Scholarship winner Kenny Batista shared his experiences at Apple’s big week for developers!)

If you want to prioritize a bit, here’s a WWDC 2017 Viewing Guide that looks solid.

There are enough interesting new technologies that each deserves a roundup post of its own, since there’s going to be a lot of figuring out to keep track of; so for now we’ll just link a couple of overviews hitting the highest points:

The 2017 Apple Design Award Winners

WWDC 2017 Initial Impressions from the Wenderlich team

Another take from the Big Nerd Ranch crew at WWDC 2017: Helping You Get Things Done

Also check out their article New HEVC & HEIF Media Formats: What You Need to Know and Begun, The Codec War Has for background on the fur that’s no doubt going to be flying all over about that soon.

Oh, and we’ll put An In Depth Look At the New App Store and New rules following WWDC 2017 here too — there are some interesting changes there you want to be aware of. Most notably to us: finally, The new iOS App Store lets devs choose whether or not to reset ratings when updating!! Woo-woo-woohoo!

And as always, over at mjtsai.com you can find a near-canonical list of WWDC 2017 Links for deeper diving!

UPDATES:

HEIF: A First Nail in JPEG’s Coffin?