dbolella/blog.md · Created November 10, 2020 14:52
Using Fritz AI in SwiftUI

SwiftUI is a young, yet powerful, UI framework by Apple that is both declarative and reactive by design. Its appeal to so many developers is that it is simple, modern, and NOT Interface Builder/Storyboard driven.

That being said, not everything about SwiftUI is polished just yet, and your features may still require a reliance on UIKit.

Take, for example, our very own Fritz Pre-Trained Image Segmentation Model. In our example code on how to fake Portrait Mode (which can be found here along with a complete explanation of how it works), you’ll see we rely on a ViewController. Within this ViewController, we are able to get the live Camera feed, use AVKit components, run our model, and manipulate the UI accordingly. This is just not possible with pure SwiftUI (yet).

This doesn’t mean it’s impossible to use our cool visual models in your SwiftUI app. On the contrary, it’s quite simple to turn our demo VC into a Representable View. From there, you could build out amazing UI/UX using the power and ease of SwiftUI and include awesome Fritz AI features simply by declaring them.

Let’s learn how by taking the VC in the demo project and placing it into a new SwiftUI project!

Setup

To really get an understanding of how we can get these features working in a SwiftUI app, let’s first walk through the setup. There are some key steps we’ll need to go through in order to make our features work in SwiftUI.

Note: we’ll be working with Xcode 12, SwiftUI 2.0, and CocoaPods.

Xcode and Pods

First, let’s create a new Xcode project. Choose iOS App, with Interface: SwiftUI and Life Cycle: SwiftUI App (the rest is your choice). Create your project and then close it.

Next, in your new project’s directory, add a Podfile with the following code:

https://gist.github.com/db21b4db5f07a1d9455e7430d7f72055
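The linked gist isn’t reproduced here, but a Podfile for this setup would look roughly like the sketch below. The target name `FritzSwiftUIDemo` is a placeholder for your own project name, and the exact Fritz pod subspecs (particularly for the segmentation model) are an assumption — check them against the Fritz AI documentation:

```ruby
platform :ios, '13.0'

target 'FritzSwiftUIDemo' do
  use_frameworks!

  # Fritz AI core SDK
  pod 'Fritz'
  # Pre-trained people segmentation model (subspec name may differ;
  # consult the Fritz docs for the current pod name)
  pod 'Fritz/VisionSegmentationModel/People'
end
```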

From the terminal, navigate to your project directory and run pod install. Once complete, open the new .xcworkspace file, which we’ll use from now on.

Permissions

In your project’s Info, be sure to add the following permission so we can use the Camera: Privacy - Camera Usage Description
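If you prefer to edit the Info.plist source directly rather than use the Xcode property editor, the entry looks like this (the description string below is just an example — use one that fits your app):

```xml
<!-- Raw key behind "Privacy - Camera Usage Description" -->
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to run on-device image segmentation.</string>
```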

Register on Fritz AI

If you haven’t already, sign up for a new Fritz AI account (the sandbox tier is the bare minimum needed!). Once you do, you’ll want to register your new iOS app. To do this, you’ll need the App Name and Bundle Identifier you gave your app.

When the registration prompts you to download the Fritz-info.plist file, follow the instructions on where to place the file in your project.

Before you hit Next in the registration process, we’ll want to finish setting up the app so we can complete the registration fully. Let’s do that now.

Bring Back the App Delegate

In the Fritz AI registration walkthrough, you’ll notice that it instructs you to place code in your project’s App Delegate. However, with SwiftUI 2.0, the App Delegate is essentially replaced by @main. That doesn’t mean we can’t bring the App Delegate back. Insert the following in the file <yourappnamehere>App.swift beneath the App struct:

https://gist.github.com/3e801148d8401f5b385db89533cf50de
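The gist above isn’t reproduced here, but based on the description — an App Delegate whose only job is to run FritzCore.configure() at launch — a minimal sketch would look something like this:

```swift
import SwiftUI
import Fritz

// Minimal App Delegate brought back solely to configure the Fritz SDK at launch.
class AppDelegate: NSObject, UIApplicationDelegate {
    func application(
        _ application: UIApplication,
        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil
    ) -> Bool {
        FritzCore.configure()
        return true
    }
}
```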

And in your App, add the following line:

https://gist.github.com/bcbe96dc95198faee8099fac8335faf4
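Again the gist isn’t shown inline, but the line in question is SwiftUI 2.0’s @UIApplicationDelegateAdaptor property wrapper, which hands UIKit lifecycle callbacks to our App Delegate. In context (the struct name FritzSwiftUIDemoApp is a placeholder for your own), it would look roughly like:

```swift
@main
struct FritzSwiftUIDemoApp: App {
    // Routes UIKit app-lifecycle events to AppDelegate,
    // so FritzCore.configure() runs at launch.
    @UIApplicationDelegateAdaptor(AppDelegate.self) var appDelegate

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}
```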

This way, we can run FritzCore.configure() on launch. This may seem like it eliminates the benefit of @main, but we still eliminate a ton of boilerplate and unnecessary code in our project.

Complete Registration

To wrap up registration, let’s run our app in the simulator to establish contact with Fritz AI. Once we see “Hello World”, we can go back to the registration page and hit Next. Fritz will check that our app has at least attempted to reach out to it and, once confirmed, complete the registration successfully!

Code!

Now comes the fun part, though you’ll find it almost too simple since we’ll be porting in the demo. You’ll want to copy both ViewController.swift and CustomBlurView.swift from the demo project into your project. I also chose to rename ViewController to ImageSegmentationViewController to make it more identifiable, so that’s how I’ll be referring to it here.

Wrapping a VC with Representable

Next, we’ll create a UIViewControllerRepresentable so that our VC will be seen and usable as a View by SwiftUI. On the bottom of our VC file, add the following:

https://gist.github.com/41ffd53888e8007259c5a1520eaa2b2d
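The gist isn’t reproduced here, but a UIViewControllerRepresentable wrapper for this kind of VC is a small, standard piece of code. A sketch, assuming ImageSegmentationViewController can be created with its default initializer:

```swift
// Wraps ImageSegmentationViewController so SwiftUI can use it as a View.
struct ImageSegmenter: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> ImageSegmentationViewController {
        ImageSegmentationViewController()
    }

    func updateUIViewController(
        _ uiViewController: ImageSegmentationViewController,
        context: Context
    ) {
        // Nothing to update from SwiftUI; the VC drives the camera
        // feed and segmentation on its own.
    }
}
```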

That’s… actually it. We’ve now created a new View called ImageSegmenter which contains our ImageSegmentationViewController.

Declare Our View!

All that’s left to do is declare our ImageSegmenter in our UI just like we would any other View. Go to ContentView.swift, wrap the Text in a VStack, and add ImageSegmenter below Text:

https://gist.github.com/ae59eb7b82da2fd014bad1ab11347f1a
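Following the description above, the resulting ContentView would look roughly like this (the Text content is whatever the template generated for you):

```swift
import SwiftUI

struct ContentView: View {
    var body: some View {
        VStack {
            Text("Hello, world!")
            // Our wrapped VC, declared like any other SwiftUI View.
            ImageSegmenter()
        }
    }
}
```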
