My thoughts about the Apple Roadmap for AR Quicklook

What is revolutionary about AR Quicklook in iOS 13.4?

With iOS 13.4 Apple lifted the veil on some internals of its RealityKit universe.

Previously, their AR editor Reality Composer exported scenes only in the proprietary .reality file format. Only a few users and designers used it to manually create and update scenes, and developers had no access to documentation and no clue how to create those scenes from their own code base.

With iOS 13.4 and the new release of Reality Composer with USDZ export, the situation changed drastically.

USDZ files are just uncompressed zip archives containing one or more USD files (the binary form, .usdc) and folders with the textures and sounds needed by the scene (a.k.a. the USD stage).
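Because of that, you can peek inside any .usdz with nothing but Python's standard library; a minimal sketch (the file name is a placeholder):

```python
import zipfile

# A .usdz is a plain zip archive with stored (uncompressed) entries, so the
# standard zipfile module is enough to list its contents: .usdc layers,
# textures, audio files, ...
with zipfile.ZipFile("toy_robot.usdz") as usdz:  # placeholder file name
    for entry in usdz.infolist():
        print(entry.filename, entry.file_size, "bytes")
```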

The difference from .reality is that USD can be opened, traversed, changed, remixed and written with a well-documented open source library.

And we can even use the Python script usdcat to create a well-structured text file (.usda) that we can study, manually edit and convert back to binary .usdc. Feast your eyes!
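The same round trip is available from the Python bindings of the open source USD library; a minimal sketch, assuming the pxr modules are installed and with placeholder file names:

```python
from pxr import Usd

# Open a binary layer (Usd.Stage.Open also accepts .usda and .usdz) ...
stage = Usd.Stage.Open("scene.usdc")  # placeholder file name

# ... walk the scene graph ...
for prim in stage.Traverse():
    print(prim.GetPath(), prim.GetTypeName())

# ... and write it back out as human-readable .usda, roughly what
# `usdcat scene.usdc -o scene.usda` does on the command line.
stage.Export("scene.usda")
```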

The flavour of USD that Apple understands in AR Quicklook is a bit different from the USD used in the wild at Hollywood studios:

  • It's all valid USD, but not all "Hollywood" features are supported (yet).
  • On the other hand, Apple added so-called USD schemas to also store interaction behaviour, both by the user(s) and between elements (collisions, physics engine), plus lots of other great stuff; they prefixed these with "preliminary" while discussing with the community how to standardise them (see the inspection sketch after this list).
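A quick way to spot those additions is to dump the prim types of a Reality Composer export; a minimal sketch (the file name is a placeholder, and the exact type names depend on the export):

```python
from pxr import Usd

stage = Usd.Stage.Open("reality_composer_export.usdz")  # placeholder name

# Apple's additions show up as prim types (behaviours, anchors, physics,
# text, ...) whose names carry the "Preliminary" prefix.
for prim in stage.Traverse():
    if str(prim.GetTypeName()).startswith("Preliminary"):
        print(prim.GetPath(), prim.GetTypeName())
```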

You can develop using either C++ or Python (2.x as still used by studios, soon also 3.x) to either

  • read a scene and make changes to reflect daily updates
    (the USD library gives you access to change every aspect of the scene),
  • or produce localized text labels for versions in other languages
    (yes, texts are not stored as giant meshes like other scene elements; you store the content as a string together with font, size, color, depth, etc., and the geometry is created on the fly by AR Quicklook; see the sketch after this list),
  • or create this type of scene file from scratch, filled with your own geometry and a still small set of descriptive game logic (aka Behaviors) to interact with it.
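For the text label case, a localisation pass could look roughly like this; a minimal sketch, assuming the scene was already converted to .usda as described above, and with a made-up prim path and attribute name (check your own export with usdcat for the real ones):

```python
from pxr import Usd

stage = Usd.Stage.Open("scene.usda")  # placeholder file name

# Prim path and attribute name are illustrative; inspect your own export
# to see how the text prim is actually named and typed.
label = stage.GetPrimAtPath("/Root/Scenes/Scene/PriceTag")
if label:
    content = label.GetAttribute("content")   # the label's string payload
    if content:
        content.Set("Nur heute: 9,99 €")      # swap in the localized text

stage.GetRootLayer().Export("scene_de.usda")  # write a localized copy
```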

The file format is just part of a vast feature set that also includes management of massive vertex data, shading networks, animation solvers and even a library for realtime preview of all that.

USD was made open source at SIGGRAPH 2016 by Pixar Animation Studios (part of Disney); to find out more about it, look here: http://graphics.pixar.com/usd/downloads.html or simply google it.

Since then it has taken Hollywood by storm, because:

  1. It enables better organized studio workflows, as artists don't have to wait for each other and still all work with the newest content.
  2. It better interlinks digital content creation (DCC) tools.
  3. It supports the latest and greatest surface description for assets without the usual hassle of low-poly tessellation and LODs (see Subdivision Surfaces – OpenSubdiv).
  4. And it allows massive scenes with animated previews at interactive speed!

I guess 3) and 4) made some jaws drop at Apple too, and they rushed to test-drive it by porting the dynamic tessellation of slim, sparse but precise SubD geometry to Metal to run it on their GPUs.

That allows for tiny file sizes for AR/VR content while having a realtime rendering look that is superior to the current low-poly mesh rendering approaches of Google/Microsoft/Facebook on mobile chips.
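Authoring such a SubD asset is just a matter of a few attributes on a mesh prim; a minimal sketch with the pxr Python bindings (file and prim names are made up):

```python
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("subd_demo.usda")  # placeholder file name

# Author a very coarse cube-shaped control cage ...
mesh = UsdGeom.Mesh.Define(stage, "/Demo/Cage")
mesh.CreatePointsAttr([(-1, -1, -1), (1, -1, -1), (1, 1, -1), (-1, 1, -1),
                       (-1, -1,  1), (1, -1,  1), (1, 1,  1), (-1, 1,  1)])
mesh.CreateFaceVertexCountsAttr([4, 4, 4, 4, 4, 4])
mesh.CreateFaceVertexIndicesAttr([0, 1, 2, 3,  4, 5, 6, 7,  0, 4, 7, 3,
                                  1, 5, 6, 2,  0, 1, 5, 4,  3, 2, 6, 7])

# ... and mark it as a Catmull-Clark subdivision surface, so the renderer
# tessellates it on the fly: the file stays tiny, the result looks smooth.
mesh.CreateSubdivisionSchemeAttr(UsdGeom.Tokens.catmullClark)

stage.Save()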

Apple's Roadmap for using AR without installing an App

  • Make simple AR scenes with a single item available to most iOS users.
    This has worked since iOS 12 and can be used by web shops and for product promotion.
  • Make several items interactively and separately placeable
    (think of all the products you placed in your webshop basket).
    This has worked since iOS 13, but was not well received and IMHO also not well planned and designed.
  • Make AR aware of persons in front of or behind the virtual content.
    People occlusion works since iOS 13 on devices with A-series chips that are fast enough (A13+).
  • Let the website (the one the scene is loaded from) preset values and exchange events with AR Quicklook.
    This already works since iOS 13.? for Apple Pay, using
    • urlParams for input and
    • DOM events as output.
  • Sneak-peek more elaborate AR scene interactions and real-time physics effects.
    This came with iOS 13, which supports .reality files made with Reality Composer or the more artist-facing Adobe Aero app.
  • Open up a bit and show some internals for us all to chew on.
    Welcome iOS 13.4 and the new version of Reality Composer exporting interactive USDZ.
    This allows us developers to create interactive ARQL scenes without any Apple software / library!
    We can either take a scene manually made with RC or Aero and update it programmatically, or create our own by using the open source, cross-platform USD tools (see the packaging sketch after this list).
  • Make AR aware of your environment.
    On devices with a LiDAR sensor (2020 iPad Pro), the AR scene can interact with and be occluded by the real world near you (5 m range).
  • Multi-User-ARQL – allow many users to be part of a single (or several?) AR Quicklook scene(s) and all interact with them simultaneously.
    Sure to come: have a look at the RealityKit dev docs and see the use of UUIDs inside the new USDZ files!
    Will hopefully come with iOS 14 beta at WWDC 2020.
  • Faster room identification and AR orientation using 2 or more coin-sized AirTags. Their position can be tracked precisely utilizing the U1 chip found in new iOS devices.
  • Multi-user and single-user VR – the rumored A14-chip-based family of 2021 Macs (paired with a rumored 2nd-gen Valve Index 2 HMD) as well as all iPhone 11+ models (snapped into a smart HMD) will be able to join VR experiences.
    The smart HMD might bring eye tracking, a depth sensor to read your lips, a big touch pad (see patent) and its own battery.
  • AirPencil might be a 3rd-gen pencil that provides U1-compatible tracking and force feedback. Its main use will be to manipulate objects in virtual space (VR and AR).
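For the "no Apple software needed" packaging step mentioned above, the open source USD distribution ships its own helpers; a minimal sketch with the pxr Python bindings (file names are placeholders, and the usdzip command line tool does much the same):

```python
from pxr import Sdf, UsdUtils

# Bundle a hand-edited or generated .usda layer, plus everything it
# references (textures, audio, sublayers), into a single .usdz archive
# that AR Quicklook can open. File names are placeholders.
ok = UsdUtils.CreateNewARKitUsdzPackage(
    Sdf.AssetPath("my_scene.usda"),   # root layer of the scene
    "my_scene.usdz")                  # resulting package
print("packaged" if ok else "packaging failed")
```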