The advent of machine learning on mobile has opened the door to a host of new opportunities. While it has allowed ML experts to tap into the mobile space, the other end of that equation is the real show-stealer: letting mobile application developers dabble in machine learning is what has made mobile application development so exciting.

The best thing is, you needn’t be a machine learning expert to train or run models. Core ML, Apple’s machine learning framework, provides an easy-to-use API that lets you run inference (model predictions), fine-tune models, or re-train them, all on the device.
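To give a feel for how little code inference takes, here’s a minimal sketch. The `MobileNet` class name is the one Xcode auto-generates when you drop a `.mlmodel` file into a project, and `pixelBuffer` stands in for a frame you’ve prepared yourself:

```swift
import CoreML

// Xcode auto-generates a `MobileNet` class when you add MobileNet.mlmodel
// to your project; any image classification model works the same way.
let model = try MobileNet(configuration: MLModelConfiguration())

// `pixelBuffer` is a CVPixelBuffer prepared from an image or camera frame.
let prediction = try model.prediction(image: pixelBuffer)
print(prediction.classLabel) // e.g. "banana"
```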

Create ML, on the other hand, lets you create and train custom machine learning models (with current support for images, objects, text, recommender systems, and linear regression) using a drag-and-drop macOS tool or in Swift Playgrounds.
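As a rough illustration (not part of this tutorial’s build), training an image classifier in a macOS Playground can be as short as this; the folder paths are placeholders you’d swap for your own data:

```swift
import CreateML
import Foundation

// Assumes a training folder containing one subfolder of images per label.
let trainingDir = URL(fileURLWithPath: "/path/to/TrainingData")
let classifier = try MLImageClassifier(trainingData: .labeledDirectories(at: trainingDir))

// Check how training went, then export a Core ML model for use in an app.
print(classifier.trainingMetrics)
try classifier.write(to: URL(fileURLWithPath: "/path/to/Classifier.mlmodel"))
```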

If that didn’t amaze you, consider SwiftUI, the declarative UI framework that took the iOS community by storm when it was announced at WWDC 2019. It alone has led to an influx of developers learning Swift and iOS development, given how easy it makes building user interfaces quickly.

Together, SwiftUI, Core ML, and Vision (Apple’s computer vision framework that works on top of Core ML) give rise to smart, AI-based applications. But that’s not all…you can leverage the power of machine learning to build fun games as well.

In the next few sections, we’ll build a camera-based iOS application that lets you hunt down the emojis in your house: something like a treasure hunt, one of the more popular indoor games at a time when many of us find ourselves in quarantine.

Plan of Action

  • We’ll use a [MobileNet](https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet_v1.md) Core ML model to classify objects from the camera frames. If you want to read more about the MobileNet architecture, hop on over to this article for a detailed overview.
  • For setting up the camera, we’ll use AVFoundation, Apple’s own audio-video framework. With the help of UIViewRepresentable, we’ll integrate it into our SwiftUI view (a minimal wrapper is sketched just after this list).
  • We’ll drive our Core ML model with the Vision framework, matching the model’s inference with the correct emoji (because every emoticon has a meaning); a classification sketch also follows this list.
  • Our game will run against a timer: the user points the camera at different objects around a given area, trying to find the one that matches the emoji before time runs out.
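Here’s the camera sketch mentioned above: a bare-bones UIViewRepresentable wrapper. The `CameraPreview` name is ours, and the AVCaptureSession is assumed to be configured and started elsewhere with a camera input:

```swift
import SwiftUI
import AVFoundation

// Minimal SwiftUI wrapper around an AVFoundation preview layer.
// The session is assumed to be configured and running elsewhere.
struct CameraPreview: UIViewRepresentable {
    let session: AVCaptureSession

    func makeUIView(context: Context) -> UIView {
        let view = UIView(frame: UIScreen.main.bounds)
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {}
}
```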
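And the classification sketch: one way (ours, not necessarily the final code we’ll land on) to run the MobileNet model through Vision on a single camera frame and read back the top label, which you’d then map to an emoji via a lookup table:

```swift
import Vision
import CoreML

// Runs the bundled MobileNet model on one camera frame via Vision.
// Mapping the returned label to an emoji is left to your own lookup table.
func classify(pixelBuffer: CVPixelBuffer) {
    guard let coreMLModel = try? MobileNet(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else { return }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        // Compare `top.identifier` against the label behind the emoji being hunted.
        print("Saw \(top.identifier) (confidence \(top.confidence))")
    }
    request.imageCropAndScaleOption = .centerCrop

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```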

#ios #ios-app-development #swift #machine-learning #heartbeat
