ARKit + MultiCam Test for iOS 13 - Not Successful
//
//  ViewController.swift
//  test
//
//  Created by Nicholas Arner on 9/19/19.
//  Copyright © 2019 NFA. All rights reserved.
//

import UIKit
import SceneKit
import ARKit
import AVFoundation

class ARKitMultiCamTestViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!
    // PreviewView: a UIView subclass that hosts the camera preview layer
    @IBOutlet var camPreview: PreviewView!

    var session: AVCaptureMultiCamSession?
    var input: AVCaptureDeviceInput?
    var output: AVCapturePhotoOutput?
    var previewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Set the view's delegate
        sceneView.delegate = self

        // Show statistics such as fps and timing information
        sceneView.showsStatistics = true

        // Create a new scene and set it on the view
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        sceneView.scene = scene

        // Keep the screen awake
        UIApplication.shared.isIdleTimerDisabled = true

        configureFrontCamera()
    }

    public func configureFrontCamera() {
        session = AVCaptureMultiCamSession()
        output = AVCapturePhotoOutput()

        // Bail out if the front wide-angle camera isn't available
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front) else {
            print("Front camera is unavailable")
            return
        }

        do {
            input = try AVCaptureDeviceInput(device: camera)
        } catch {
            print(error)
            return
        }

        guard let session = session, let input = input, let output = output else { return }

        if session.canAddInput(input) {
            session.addInput(input)
            if session.canAddOutput(output) {
                session.addOutput(output)

                // Attach a preview layer so the front-camera feed is visible
                let preview = AVCaptureVideoPreviewLayer(session: session)
                preview.videoGravity = .resizeAspectFill
                preview.connection?.videoOrientation = .portrait
                preview.frame = camPreview.bounds
                camPreview.layer.addSublayer(preview)
                previewLayer = preview

                session.startRunning()
            }
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Create and run a world-tracking session (back camera)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        // Pause the view's session
        sceneView.session.pause()
    }
}

In iOS 13, Apple introduced support for accessing multiple cameras on a device at once. They more or less had to, to take full advantage of all the cameras on the new iPhones (note that while the API ships with iOS 13, multi-camera capture is only supported on recent hardware; AVCaptureMultiCamSession.isMultiCamSupported reports availability at runtime).

They covered this in the WWDC19 session "Introducing Multi-Camera Capture for iOS" (https://developer.apple.com/videos/play/wwdc2019/249/) and released sample code for it:
https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/avmulticampip_capturing_from_multiple_cameras
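
Since support is hardware-gated, the first step is to query it at runtime. Here's a minimal sketch of a dual-camera session setup; the helper name and device choices are mine, not Apple's sample:

import AVFoundation

// Multi-cam capture is hardware-dependent; check support before building a session.
// A minimal sketch: error handling is trimmed, and the helper name is made up.
func makeDualCameraSession() -> AVCaptureMultiCamSession? {
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }

    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    // One input per camera; canAddInput also enforces the device's
    // multi-cam format constraints.
    for position in [AVCaptureDevice.Position.back, .front] {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: position),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return nil }
        session.addInput(input)
    }
    return session
}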

I was curious whether it would be possible to start an ARKit session using the back (world-facing) camera while running an AVCapture session with the front (selfie) camera. The hope was that, since you can use both cameras at once for video recording, you could dedicate one camera to video capture and the other to an ARScene.

Turns out that's sadly still not possible; it looks like whichever session is started first gets priority over the camera.
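
If you want to confirm that behavior rather than just watching the preview go black, you can observe capture-session interruptions. The notifications and keys below are standard AVFoundation; which interruption reason actually fires when ARKit owns the camera is my assumption:

import AVFoundation

// Log why the front-camera session stops when the ARKit session claims the hardware.
func observeInterruptions(on session: AVCaptureMultiCamSession) {
    NotificationCenter.default.addObserver(forName: .AVCaptureSessionWasInterrupted,
                                           object: session,
                                           queue: .main) { note in
        if let raw = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
           let reason = AVCaptureSession.InterruptionReason(rawValue: raw) {
            // Expecting something like .videoDeviceInUseByAnotherClient here
            print("Capture session interrupted: \(reason)")
        }
    }
    NotificationCenter.default.addObserver(forName: .AVCaptureSessionRuntimeError,
                                           object: session,
                                           queue: .main) { note in
        print("Capture session error: \(String(describing: note.userInfo?[AVCaptureSessionErrorKey]))")
    }
}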

Hopefully this will change in a future iOS update; it would let developers write apps that use the front-facing camera for gaze detection, eye tracking, facial feature mapping, and emotion recognition while running an AR scene.
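
Worth noting: iOS 13's ARKit does ship a narrower version of this idea. On A12-and-later devices, a world-tracking session can simultaneously track the user's face with the front camera via ARWorldTrackingConfiguration.userFaceTrackingEnabled. It doesn't hand you a raw front-camera capture feed like the experiment above was after, but it covers the gaze and face-anchor use cases. A sketch:

import ARKit

extension ARKitMultiCamTestViewController {
    // ARKit 3 (iOS 13) can track the user's face with the front camera while
    // the back camera drives world tracking, on supported (A12+) hardware.
    func runWorldTrackingWithUserFaceTracking() {
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsUserFaceTracking {
            configuration.userFaceTrackingEnabled = true
        }
        sceneView.session.run(configuration)
    }
}

// Face data then arrives as ARFaceAnchor updates through an ARSessionDelegate:
// func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
//     for case let faceAnchor as ARFaceAnchor in anchors { ... }
// }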
