Cristian Díaz (elkraneo)
AccessibilityComponent custom rotors break VoiceOver

FB13769967

When you try to listen for a custom rotor navigation event inside an AccessibilityComponent, RealityKit rendering stops working as it normally would. The .customRotors collection property appears to be emptied, though there are no signs of Entity reconstruction. Focus is also lost, and other entities with custom rotors are reset as well. After running the custom rotor on the video, it was impossible to regain focus, and VoiceOver behaved erratically.
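A minimal sketch of the setup the report describes. The `customRotors` property is named in the report itself, but the exact rotor type and how it is constructed are assumptions, not verified against the SDK:

```swift
import RealityKit

// Hedged sketch: attach an AccessibilityComponent to an entity and
// populate its customRotors collection (property named in the report;
// the rotor definitions themselves are assumed).
let entity = Entity()
var accessibility = AccessibilityComponent()
accessibility.isAccessibilityElement = true
// Populating this is what reportedly breaks rendering and focus:
// accessibility.customRotors = [/* custom rotor definitions */]
entity.components.set(accessibility)
```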

elkraneo / States-v3.md — created May 1, 2024, 12:32; forked from andymatuschak/States-v3.md
A composable pattern for pure state machines with effects (draft v3)

A composable pattern for pure state machines with effects

State machines are everywhere in interactive systems, but they're rarely defined clearly and explicitly. Given some big blob of code including implicit state machines, which transitions are possible and under what conditions? What effects take place on what transitions?

There are existing design patterns for state machines, but all the patterns I've seen complect side effects with the structure of the state machine itself. Instances of these patterns are difficult to test without mocking, and they end up with more dependencies. Worse, the classic patterns compose poorly: hierarchical state machines are typically not straightforward extensions. The functional programming world has solutions, but they don't transpose neatly enough to be broadly usable in mainstream languages.

Here I present a composable pattern for pure state machines with effects,
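The shape of the pattern can be sketched as a pure, value-typed transition function that returns the next state together with an effect request for the caller to interpret; the names here are illustrative, not taken from the draft:

```swift
// Illustrative sketch of a pure state machine with effects: the
// transition function performs no side effects itself; it returns
// an Effect value describing what the caller should do.
enum TrafficLight { case red, green, yellow }
enum Event { case timerFired }
enum Effect: Equatable { case startTimer(seconds: Int) }

func transition(_ state: TrafficLight, on event: Event) -> (TrafficLight, Effect) {
    switch (state, event) {
    case (.red, .timerFired):    return (.green,  .startTimer(seconds: 30))
    case (.green, .timerFired):  return (.yellow, .startTimer(seconds: 5))
    case (.yellow, .timerFired): return (.red,    .startTimer(seconds: 20))
    }
}
```

Because the function is pure, it can be tested exhaustively without mocks, and a hierarchical machine can delegate to it from an enclosing transition function.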

| Category | Vision Pro | Meta Quest 3 |
| --- | --- | --- |
| Vision | 57 | 3 |
| Physical and Motor | 55 | 6 |
| Hearing | 9 | 3 |
| General | 10 | 0 |
import RealityKit
import RealityKitContent
import SwiftUI

struct ManipulationState {
  var active = false
  var transform: AffineTransform3D = .identity
}

struct ContentView: View {
import SwiftUI

let someText = "Qu’est-ce que c’est?"

struct ContentView: View {
  let nsAttributedStringFR = NSAttributedString(
    string: someText,
    attributes: [
      // This appears to have no effect
      .languageIdentifier: "fr-FR"
    ]
  )

  var body: some View {
    Text(AttributedString(nsAttributedStringFR))
  }
}
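For comparison, a hedged sketch using Foundation's `AttributedString`, which exposes `languageIdentifier` as a typed attribute; whether VoiceOver honors it in this context is untested here:

```swift
import Foundation

// Hedged alternative: set the language on an AttributedString
// directly instead of through NSAttributedString keys.
var attributed = AttributedString("Qu’est-ce que c’est?")
attributed.languageIdentifier = "fr-FR"
```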
import RealityKit
import SwiftUI
extension Entity {
/// Adds an image-based light.
///
/// This method assumes that the project contains a folder called
/// `Environment.skybox`
///
/// Tune the intensity parameter to get the brightness that you need.
let sceneUpdateSubscription = content.subscribe(to: SceneEvents.Update.self) { event in
  let deltaTime = event.deltaTime
  // Accumulate delta time
  self.accumulatedTime += deltaTime
  self.frameCount += 1
  // Calculate FPS every 10 frames
  if self.frameCount >= 10 {
    let fps = Double(self.frameCount) / self.accumulatedTime
    print("FPS: \(fps)")
    self.accumulatedTime = 0
    self.frameCount = 0
  }
}
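The accumulate-then-reset logic can be isolated into a small value type that is easy to test independently of RealityKit (illustrative, not from the gist):

```swift
// Illustrative FPS counter: averages the frame rate over a fixed
// window of frames, then resets, mirroring the logic above.
struct FPSCounter {
    private(set) var accumulatedTime: Double = 0
    private(set) var frameCount = 0
    let window: Int

    init(window: Int = 10) { self.window = window }

    /// Returns the average FPS once `window` frames have accumulated,
    /// otherwise nil.
    mutating func addFrame(deltaTime: Double) -> Double? {
        accumulatedTime += deltaTime
        frameCount += 1
        guard frameCount >= window else { return nil }
        let fps = Double(frameCount) / accumulatedTime
        accumulatedTime = 0
        frameCount = 0
        return fps
    }
}
```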

Runtime showStatistics debugOptions crash on iOS RealityKit

FB13373595

When the ".showStatistics" debug option is assigned to an ARView at runtime, the app crashes due to a Metal encoder buffer offset error.

The error reads as follows:

-[MTLDebugRenderCommandEncoder validateCommonDrawErrors:]:5775: failed assertion `Draw Errors Validation
Vertex Function(vsSdfFont): the offset into the buffer viewConstants that is bound at buffer index 4 must be a multiple of 256 but was set to 62096.
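A hedged reproduction sketch of the reported setup (iOS-only; the surrounding view installation is elided for context):

```swift
import RealityKit

// Hedged reproduction sketch of the report: the ARView is already
// rendering, then .showStatistics is inserted at runtime.
let arView = ARView(frame: .zero)
// ... view is installed in the hierarchy and rendering ...
arView.debugOptions.insert(.showStatistics)  // reportedly crashes here
```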

Safari on visionOS should support spatial audio

FB13371444

Implementing spatial audio in Safari could make access to data and interface interaction easier. Spatial sound would be highly beneficial for web browsing.

Safari on visionOS should support spatial content

FB13363974

It would be quite fascinating to have access to spatial content within Safari. For instance, the ability to view spatial video within iCloud Photos.