In iOS 13, Apple introduced support for accessing multiple cameras on a device at once. They more or less had to, in order to take full advantage of all those cameras on the new iPhones (though the feature isn't limited to those phones; it works on any supported device running iOS 13).
They covered this in the WWDC19 session "Introducing Multi-Camera Capture for iOS" (https://developer.apple.com/videos/play/wwdc2019/249/) and released sample code for it:
https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/avmulticampip_capturing_from_multiple_cameras
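For reference, the heart of the new API is AVCaptureMultiCamSession, which only works on supported hardware and wants its inputs and outputs wired up with explicit connections. Here's a minimal sketch loosely following Apple's sample (the `configureCamera` helper is my own name, not Apple's):

```swift
import AVFoundation

// Minimal sketch of a dual-camera capture session (iOS 13+).
func makeMultiCamSession() throws -> AVCaptureMultiCamSession? {
    // Multi-cam capture is hardware dependent, so check support first.
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }

    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    // Wires one camera to its own video data output with an explicit
    // connection, as multi-cam sessions expect.
    func configureCamera(position: AVCaptureDevice.Position) throws -> AVCaptureVideoDataOutput? {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: position) else { return nil }
        let input = try AVCaptureDeviceInput(device: device)
        guard session.canAddInput(input) else { return nil }
        session.addInputWithNoConnections(input)

        let output = AVCaptureVideoDataOutput()
        guard session.canAddOutput(output) else { return nil }
        session.addOutputWithNoConnections(output)

        guard let videoPort = input.ports(for: .video,
                                          sourceDeviceType: device.deviceType,
                                          sourceDevicePosition: position).first else { return nil }
        let connection = AVCaptureConnection(inputPorts: [videoPort], output: output)
        guard session.canAddConnection(connection) else { return nil }
        session.addConnection(connection)
        return output
    }

    _ = try configureCamera(position: .back)   // world-facing camera
    _ = try configureCamera(position: .front)  // self-facing camera
    return session
}
```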
I was curious whether it would be possible to start an ARKit session using the back (world-facing) camera while running an AVCaptureSession with the front (self-facing) camera...the hope being that, since you can use both cameras at once for video recording, you could use one camera for video recording and the other as part of an ARScene.
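Roughly, the combination I was hoping for looked like this (just a sketch; the class and property names are hypothetical, and in a real app the ARSCNView would live in a view controller):

```swift
import UIKit
import ARKit
import AVFoundation

// Sketch: ARKit world tracking on the back camera, plus a plain
// AVCaptureSession on the front camera, running at the same time.
final class DualCameraExperiment {
    let arView = ARSCNView()
    let frontSession = AVCaptureSession()

    func start() {
        // 1. World-tracking AR session on the back (world-facing) camera.
        let configuration = ARWorldTrackingConfiguration()
        arView.session.run(configuration)

        // 2. Separate capture session on the front (self-facing) camera.
        guard let frontCamera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                        for: .video,
                                                        position: .front),
              let frontInput = try? AVCaptureDeviceInput(device: frontCamera),
              frontSession.canAddInput(frontInput) else { return }
        frontSession.addInput(frontInput)

        let output = AVCaptureVideoDataOutput()
        if frontSession.canAddOutput(output) {
            frontSession.addOutput(output)
        }
        frontSession.startRunning()
    }
}
```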
Turns out that's sadly still not possible; it looks like whichever session is started first takes priority, and the other one gets interrupted.
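One way to watch this happen (a sketch, assuming the `frontSession` from the experiment above) is to listen for the capture session's interruption notification and inspect the reason:

```swift
import AVFoundation

// Observe interruptions on the front-camera session to see the
// "whichever session starts first wins" behavior.
let observer = NotificationCenter.default.addObserver(
    forName: .AVCaptureSessionWasInterrupted,
    object: frontSession,
    queue: .main
) { notification in
    if let rawReason = notification.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
       let reason = AVCaptureSession.InterruptionReason(rawValue: rawReason) {
        // .videoDeviceInUseByAnotherClient means another client
        // (such as an ARSession) has taken the camera.
        print("Front session interrupted: \(reason)")
    }
}
```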
Hopefully this will change in a future iOS update...it would let developers write apps that use the front-facing camera for gaze detection, eye tracking, facial feature mapping, and emotion recognition while running an AR scene with the back camera.