
Created October 29, 2021 18:36
Connect 2021 Carmack Talk Transcript

Welcome to Connect 2021.

My process for these talks is that I basically go back through the last year of internal posts and public tweets that I’ve made, looking for things that are interesting and worthwhile to talk about, and then I try to roughly sort them into some kind of a coherent topic flow. I’m also usually the one that’s a lot more kind of grumpy and unhappy with the way things are going, as I always want more and faster progress. But this year, I really have a lot of things to be quite happy about. By far the most important is that Quest 2 has been a really big success, and it was a heroic effort to get it out when we did last year in the face of all the headwinds that we had, but it was better, faster, cheaper, one of those just rare combinations that you almost never get to have in a product.

00:01:31 While there’s still a couple minor things, where some people miss the displays or continuous IPD adjustment, it was a fantastic product and the market has responded very positively to it. It would be a really big danger sign if we had been able to do all of that and didn’t see the multiple uptick that we saw with Quest 2 over Quest 1. That’s a sign that things are looking good, and the whole ecosystem has been benefiting, and it’s been a good thing. There’s a bunch of other things I’ve often talked about in past Connect talks, where I’ve been complaining about how we should really have this, why don’t we have this yet, and a bunch of them did come this year.

Air Link got released. We had Oculus Link out and it was doing better than expected internally, lots of uptake for it. There was a lot of internal debate about whether we needed to make completely custom hardware to be able to do wireless over Wi-Fi, but we shipped it, it’s great. Lots of people love it. It doesn’t work for everybody. Not everyone has good enough Wi-Fi for it to be a good solution for them, but it works for a lot of people. It’s kind of funny how we have the obvious extensions to this, where you’ve got Air Link to your own PC.

There’s a bunch of language where they’re kind of carefully protecting certain use cases, but there’s obvious things that I’d like to be able to offer, where you should be able to Air Link to your friend’s PC, and one step further, some kind of a cloud link where you can play VR games on cloud servers onto the mobile system potentially anywhere.

00:03:12 There’s even more challenges for that, and there’s always this spectrum where obviously pure native is best, wired Link has the next fewest problems, then Wi-Fi adds a whole new set of issues, and we’ll have the exact same arguments that we had on every single one of the steps. It will not be great for everybody; it cuts off more potential users.

It’s been almost funny how the conversation has changed so much that cloud VR rendering is now almost looked at as a plan of record for some of the metaverse options, where cloud couldn’t get the time of day a year ago, but now it’s almost looked at as plan of record to the point that I’m even gently pushing back a little on it, with like a whoa, hold up, there’s still a lot of challenges with cloud rendering and a lot of negatives and other things we need to factor in; maybe we shouldn’t jump right to that. But it’s been great to see the progress and, most importantly, the user value we’ve delivered with that.

App Lab got released, which has been from the beginning this long tension between more open development versus carefully curated spaces, and, again, enormous internal battles about who’s going to control access to the VR store, what we’re going to allow in, and we finally shipped it. We’ve already got some amusing data points, like Gorilla Tag having more multiplayer users than big-budget titles that have millions and millions of dollars poured into them.

00:04:48 I want to keep coming back to the sense that we should trust the market rather than our internal content czars that want to pick and choose the winners. While it’s supposed to not be the full store review process, there’s a significant review process, and we’ve been backed up for multiple months at times with this, and I think it needs to get solved. I’ve always suggested we should get to a reactive approach. You couldn’t run a social media network if every post had to get approved. I think we should be able to get there with App Lab as well, but that’s not our current plan of record, and it’s something that needs to be pushed on. I certainly have some sympathy for the review backlog, where I promised to do more public app reviews and I’m not even done with my third one yet. It’s hard to get through all of these things. So I’m pushing for a more open-ended approach there.

Another thing that was a niche feature, but 120 frames per second support landed this year, and that’s been another kind of fun one with the internal arguments about it. In previous years, I had talked about how we felt the original Quest 1 hardware could run 90 Hertz, not just 72. It would have been nice to be able to do that. The theory was that some applications could be lightweight enough to run at that speed, and maybe it would be great for Oculus Link support to run PC content at sort of its original native rates, but we never were able to get that one through.

00:06:28 There was this idea that if we were changing the clock rate on that, we would have to get it FCC certified again, recertified, and that was never going to happen for a device at that point in its kind of life span. But on Quest 2, there were no clock changes; it was just changes in what goes into the timing to be able to do that. But we still ran into a ton of internal resistance about it, from display teams saying it’s not certified to do that, it might not always switch, it might have ghosting. And we had people who said, let’s go into a walk-in refrigerator and let’s try it in there. People were saying, well, nobody will actually take advantage of this, 120 frames per second is too hard. But then it came down to more arguments about, well, okay, maybe it works on our display now, but what if we want to second-source something? What if we want to use displays from another vendor, where we’ve kind of said we want this 90 Hertz, but maybe not everything can be pushed to 120? And there’s something to that argument, where it’s always better for hardware vendors making a piece of hardware to be able to have multiple sources for the components that you use. It’s important in case something catastrophic happens to one of them, but it’s also important to give you a little leverage with the companies: if a company knows that they make the only part in the world that you can use, it kind of has you over a barrel and hurts your negotiating position for everything.

00:08:01 If there’s at least two companies sourcing components, you’re in a much better position to say, I’m going to take the one that meets my minimum and is the cheapest at that point. This winds up impacting us on a bunch of other things. We only found out this year that we have a couple different SKUs of flash from different vendors, and they have slightly different performance characteristics. It’s not anything that you would notice; they all met the specs that we listed, but some are better than the specs we had, and if you write really hardcore IO tests, you can tell the difference between some versions of Quest. But we did finally get it out, and it was literally over a year from the point where it was demonstrated that 120 Hertz works to the time that we could actually ship it.
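To put rough numbers on what those refresh rates ask of applications, here is a small illustrative calculation (my own, not from the talk) of the per-frame render budget at 72, 90, and 120 Hz:

```python
# Illustrative only: per-frame time budgets at the refresh rates
# discussed (72, 90, 120 Hz). Moving from 72 to 120 Hz cuts the time
# an app has to simulate and render each frame by roughly 40%.
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available per frame at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (72, 90, 120):
    print(f"{hz:3d} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
```

That shrinking budget is why only lightweight applications were expected to hit the higher rates.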

I found it ludicrous that it took that long. It was not much code at all, but it’s out there, and it’s a success story, and that’s one of the poster children that I get to use internally now about how sometimes our process messes things up. We do have applications taking advantage of it. One of the first ones that was a cheerleader for this was Eleven Table Tennis, the ping pong simulator game, where those sorts of superfast reactions and twitches are super important, and it was simple enough that they could cut down the scenes as much as necessary to get to a stable frame rate there. Interestingly, that game also pointed out some of the issues with the extrapolation of our controllers, where they see certain problems with even only extrapolating 30 milliseconds into the future, and serious players can actually tell the difference.
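The extrapolation issue is easy to picture with a toy model. The sketch below is my own illustration, not Oculus code: real tracking uses filtered IMU data, but even a simple constant-velocity prediction shows why a 30 millisecond horizon can be off by centimeters during a fast swing, since the hand is usually accelerating or decelerating at exactly the moments that matter.

```python
# Hypothetical sketch: constant-velocity extrapolation of a controller
# position 30 ms into the future. During a fast swing the hand is often
# accelerating, so a straight-line prediction overshoots or lags.
def extrapolate_position(pos, vel, horizon_s):
    """Predict position `horizon_s` seconds ahead assuming constant velocity."""
    return tuple(p + v * horizon_s for p, v in zip(pos, vel))

# A table tennis swing can easily reach several meters per second.
pos = (0.10, 1.20, -0.30)   # meters
vel = (4.0, 0.0, 0.0)       # meters/second along x
predicted = extrapolate_position(pos, vel, 0.030)
# The prediction moves the controller 12 cm along x; if the player is
# decelerating at the moment of contact, much of that is overshoot.
print(predicted)
```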

00:09:34 So we’ve still got room to grow at the high end for our responsiveness in VR for some applications. One of the ones that’s very near and dear to my heart: just this last week, we were able to release the unlocked Oculus Go system software build. And this has been far longer in the making than you’d likely think, because even before Oculus Go existed, when we were planning to ship our first stand-alone headset, there were serious discussions internally about, well, can we let users have root? And there’s always been this breakdown between, usually, the people on the system software side and a lot of the engineers going, yes, that would be a wonderful thing to offer users, but people wind up coming up with reasons about how, well, that might undermine some of our platform integrity, it might have privacy compromise issues, it might have -- it’s easy to come up with a big list of reasons why it’s not a good idea.

Oculus Go went through its sales life, it’s no longer on sale, and we’ve already reached the point where we’ve stopped accepting new applications to the store for it, so I was able to make the pitch again. It’s like, okay, we are going to discontinue this product in a finite amount of time. Basically the policy is that you get two years of support from the last time that it was on sale. There’s a clock ticking. At that point, no more support is guaranteed. I mean, it’s not a guarantee that we’ll shut it off immediately, but there’s no more obligation to keep supporting it.

00:11:05 So given that there’s this very finite time coming up, let’s talk again about what we’re going to do with this hardware. There’s well over a million Oculus Gos out there, and at some point things stop working and they become sort of e-waste. This is a problem that people talk about a lot, and there’s a clear thing you can do about it: you can go ahead and let the hardware be used for more things. I made vague references to this a couple times over the year, but there’s been a semi-serious effort going on for most of the last year, and it finally got through everything at the end and got the go-ahead, and surprisingly, the resistance wasn’t from where you’d expect.

The legal team was supportive of this a lot earlier than I would have expected. They went out and they did a bunch more work than I thought was going to be necessary, talking with partners. We even had partner contracts updated for this, and I wish I could give a shoutout thanks to them, but apparently they didn’t want to be explicitly named for this. Some of our partners that had no real need to do this went out of their way to execute a new contract with us that allowed us to go through and do this. But we had more people internally on the product teams worrying, hand-wringing about, oh, what could it do? Could someone go in and compromise system security with the software? But eventually everything got settled down and we got to do it, and I’m real happy it’s out there.

00:12:47 One of the scenarios I’ve always cared about is: I want somebody five years from now to find a dusty old shrink-wrapped box in a closet, take it out, power it up, and load the last operating system version onto it, rather than whatever was first in the factory flash with no over-the-air updates available. But still, this is only step one, where the next step is going to be preserving the content ecosystem.

I’m still trying to fight that battle internally to let us do something official and all-encompassing, but it’s possible that may still need to fall to third parties, where everyone that’s got a Go: back up your software off of it, make sure it lives somewhere, so that if we need to piece together the content repository years from now from a thousand separate little hard drive images, we have the possibility of doing that.

Now, obviously the metaverse is the dominant topic of the day, and I was quoted all the way back in the 90s as saying building the metaverse is a moral imperative. And even back then, most people missed that I was making a movie reference, but I was still at least partially serious about that. I really do care about it and I buy into the vision, but that leaves many people surprised to find out that I have been pretty actively arguing against every single metaverse effort that we’ve tried to spin up internally in the company, from even pre-acquisition times.

00:14:20 I have pretty good reasons to believe that setting out to build the metaverse is not actually the best way to wind up with the metaverse.

Kind of my primary thinking about that is a line I’ve been saying for years now: the metaverse is a honeypot trap for architecture astronauts. Architecture astronaut is a kind of chidingly pejorative term for programmers or designers that want to only look at things from the very highest levels, that don’t want to talk about GPU microarchitectures or merging network streams or dealing with any of the asset packing, any of the nuts and bolts details, but they want to talk in high abstract terms about how we’ll have generic objects that can contain other objects and have references to these and entitlements to that, and we can atomically pass things along.

Here we are, Mark Zuckerberg has decided now is the time to build the metaverse. So enormous wheels are turning and resources are flowing, and the efforts are definitely going to be made. The big challenge now is to try to take all of this energy and make sure it goes to something positive, and we’re able to build something with real near-term user value.

00:15:52 My worry is that we could spend years and thousands of people, possibly, and wind up with things that didn’t contribute all that much to the ways that people are using the devices and hardware today. So my biggest advice is that we need to concentrate on actual products rather than technology, architecture, or initiatives.

Now, I didn’t write game engines when I was working at id Software. I wrote games, and some of the technology that was in those games turned out to be reusable enough to be applied to other things, but it was always driven by the product itself; the technology was what enabled the product, and then almost accidentally enabled some other things after it. It’s hard for a lot of people to really accept how rarely future-proofing and planning for broad generalizations turns out to deliver value. It is really shocking how often that winds up getting in your way, making it harder to do the things you’re trying to do today in the name of things you hope to do tomorrow, and it’s not actually there and doesn’t actually work right when you get around to wanting to do it.

Horizon Worlds is a product. A product can be clearly judged: how many people are using it, what are they doing in it, how much commerce is going on, all those sorts of things. Horizon Workrooms: how do we compete against Zoom meetings? And I’m pretty excited about some of the prospects, some of the early signs that we’re seeing with this, where everybody is in a lot of Zoom meetings, and sometimes when you’re in Workrooms or Horizon interacting with a producer or TPM or a few other people, it winds up being better than staring at the wall of faces on the Zoom screen. And I was also super excited when I heard people spontaneously commenting about how conversation in Workrooms was better than what they were seeing in some of the other VR apps and also a lot of the traditional video conferencing systems, and that was because Workrooms had redone the audio stack, so their voice communication had a couple hundred milliseconds less latency than you’re seeing in other places.

00:18:15 Even where we’re at in Workrooms is still far from the speed of light of what we could be with this, but people noticed, and we can get twice as good as we are there and get that much better. But it was interesting, when I was digging down into some of the latency issues, it’s actually gotten worse from Oculus Go to Quest 2.

When you have echo cancellation and some of the other DSP processing that happens, it’s annoyingly high right now. It’s over a hundred milliseconds, even at the bare minimum, even if you do nothing wrong above that. It’s one of those things where I’m sort of hoping I can harness all this crisis energy with the metaverse going on. It’s like, okay, we need to dig down into the DSP chip and claw back that hundred milliseconds. Let’s go ahead and get our echo-cancelled voice communication over the network down to under a hundred milliseconds. Let’s be better than anything that anybody has ever seen. And that’s the type of nuts and bolts level thing where we can put a couple of people on it that are really good at nailing that down, and we can make an improvement that’s going to affect everything, all our existing products.
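As a rough illustration of why that hundred milliseconds matters, here is a hypothetical mouth-to-ear latency budget. Every number below is an invented round figure for the sake of the arithmetic, not a measurement from any headset, but it shows how on-device DSP can dominate the total before the network is even involved:

```python
# Invented round numbers for illustration only -- not measured values.
budget_ms = {
    "mic capture + DSP (echo cancellation, etc.)": 100,
    "encode": 20,
    "network transit": 40,
    "jitter buffer": 40,
    "decode + playout": 20,
}

total = sum(budget_ms.values())
print(f"total mouth-to-ear latency: {total} ms")
# Clawing back the DSP stage is the single biggest lever on the total.
```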

00:19:48 Let’s take that and make Workrooms, Horizon, Social Home, all of these things just much better directly. Unlike products, architectures and technologies, SDKs and toolkits can always claim victory and just say, we made a wonderful architecture; nobody used it correctly, nobody picked it up, the applications didn’t take advantage of it in the right way. And it is so easy to let yourself off the hook like that. So, you know, you’ve got to be using the things to make value from it.

I worry about this in a lot of ways with our advanced technologies, where we’re happy when interesting little proof-of-concept things come out, but they’re not showing up on the big board of, okay, hundreds of thousands of people are using this, it’s delivering millions of hours of value. Maybe they actually aren’t all that important, and maybe other things could have been more important to focus on. So Horizon has some strong points.

I do enjoy being in that area. Social conversation: I’ve said it feels like we’ve got line of sight on that sort of copresent social aspect, when everybody is holding their controllers right, nothing is looking goofy, people are looking at each other. The conversation works pretty well, and when you get reduced latency from the audio, it makes it more lifelike. It’s still a far cry from the metaverse of our visions and what we’d like to see.

00:21:20 Like the Q&A session I’m doing after this talk: we’re going to have 16 audience members in there. That’s a far cry from even the hallway talks. When I would be physically at Connect, there would often be 50, 60 people in a crowd around me just in the hallway, where we’re talking about, you know, whatever anybody wants. And last year, we had a little bit of the random entry where whoever kind of hopped through the portal first got into the Q&A session. So I’m a little disappointed we’re almost backing away from the virtual session, but we want to have something that feels like that, like the real Connect, where thousands of people are here milling around, some of them cluster around me outside, we all crowd into a room for the big keynotes, and you get all of that without having to have people fly across the country or across the world to get there.

That’s what we’ve always been pitching as the value of VR. I really do want this to be a north star event for what we’re doing. We have an event we do every year here. We have a user base for this. We should be doing this in the metaverse. If we can’t handle this, we can’t handle sort of the vision, and we can do this by next year. I thought we could have done it by this year if we had really made it a frontline focus, but a year ago we didn’t have that focus like we do now. I’ll be really disappointed if I’m sitting here next year in front of a video crew and a camera in physical reality doing this talk.

00:22:53 I want to be walking around the halls or walking around the stage as my avatar in front of thousands of people getting the feed across multiple platforms. I’m laying that gauntlet down right now. We should be able to do that. This should be exactly in line with our stated mission.

So we should make this happen, to make sure we’re doing something valuable at least to us, and then it will very likely be valuable to a lot of other places. There are problems with that, with capacity planning, where if you’re in Horizon now and you have 16 people in there, they’re already dropping down to low-fidelity avatars, getting pointy elbows and jittery updates as the system tries to manage that. That’s just 16 people, maybe 20 total with cameras and other things going on.

How do we get to something that looks like this? A small, even a large meeting is a problem. In Workrooms, we can’t have our VR leads meeting in it because there’s too many people on the call. How do we scale to the point of even a small club concert? These become really challenging things. There’s no way that you can just spawn 80 entities in Unity like this with our Horizon avatars and expect it to work. The magic being reached for in many cases is that cloud rendering would allow us to use more powerful systems and we could add a bunch more on there.

00:24:27 Like I said earlier, I’m very supportive of cloud rendering architectures, but I have to pull back a little and say, well, there’s going to be a lot of cost there. That would cut off a lot of people that don’t have the bandwidth to have that high quality a connection, and it does have negatives on the quality. What I had suggested, well before we spun up this metaverse stuff, is that we should do a Horizon-in-the-cloud as a separate technology development project. We can certainly run it as a cloud application now, just compile the PC version and run it globally, and that’s the most flexible way to do things, but it runs into some of these quality challenges.

Now, there’s a ton of other things that you can do with sort of hybrid applications, where you want to say, like, your local hands and controllers and your local UIs: you could make an application where all of that is done locally on your headset and only the crowd of other people is done with cloud rendering and kind of pulled into it. And there’s talk about wanting to make that sort of a general-purpose application interface, and I don’t think that’s going to work.

I don’t think many applications are interested in refactoring the way they do things. It’s not a trivial thing. Just run your app in the cloud is pretty trivial; you bite off a whole lot of downsides, but it just works.

Splitting your application up into locally rendered things and compositing with cloud fragments, that’s the type of thing that I think is application-specific, and rather than spinning out some general technology SDK, we should just try to take Horizon and work all of that out.

00:26:10 Maybe we find out there are more general ways to slice it, but I’m concerned it would be pretty specialized, and that could be what we want to do for cloud rendering, or for Horizon at least.

This goes back to the wheel of technology kind of reincarnating over and over again. Way back in, heck, probably the late 80s, there was a windowing system called NeWS that let you write certain UI things to be executed on your local system, which might be connected by a very low bandwidth connection or modem link to a larger system that’s doing the rest of the rendering. People had a hard time with that then, and I think this is very much the same thing, and people will continue to have a hard time with it, but sometimes if you’ve got to solve a problem, it’s the thing to do. Even with cloud rendering, PCs are fast, but there are already a dozen metaverse applications using all of the power of the PC, and none of them are magically making everything amazingly work.

Even with the power of the PC, you can’t scale up both avatar count and avatar quality at the same time. We can’t talk about Codec Avatars and crowds of people; they don’t work right together. And even worse, we can’t just take the very latest 3090 GPU and run a bunch of those in cloud instances for everyone. That’s fundamentally expensive, even before you account for the Nvidia data center tax of running things there. In fact, Nvidia is building all these technologies to let you fragment GPUs up into smaller pieces to give less power to individual cloud instances. That’s useful, but we might wind up with cloud instances that are only two times as powerful as our mobile system, and then you run encoding and all the other things, and it might be a wash or not such a great idea. Perhaps the saving grace would be that cloud rendering lets us project the metaverse onto any device trivially. Anything that can accept a video stream could go ahead and have the full-featured interface; lots of benefits there.

But still, if someone had asked me in the year 2000, when I was working on Doom 3 at the time, could you build the metaverse if you had a hundred times the processing power of your current system, and that’s about where we are right now, I would have said yes. It would have been a serious optimization challenge; there’s all these things you might have had to do to make it work out well. But if I had to make the metaverse work just on our mobile hardware today, I think it could be done. It would be an optimization challenge, which is sort of the problem, where everybody that wants to work on the metaverse talks about the limitless possibilities of it. It’s a challenge to fit things in, but you can make smarter decisions about exactly what is important and really optimize the heck out of things like that.

To be even more contrarian here, I have to ask: are we necessarily even aiming for all the right targets with the social metaverse, where copresence of people is the big bet? It’s completely understandable why a company like Meta would make that play. It’s what the company is built on, but, you know, in truth, a lot of the luxury items in reality are freedom from copresence: a private office, a private beach, a private plane. Sometimes "just add people" is not always a positive, especially for people on the introverted side of things.

There’s also this notion that building all of these 3D things, 3D art, 3D objects, that these are the critical factor that people are going to love so much in the metaverse, and I do keep coming back to the point that almost all of the value of the stuff we’ve built in our culture today is represented on flat screens. There’s trillions of dollars of software and media and assets and things that are built around flat screens, and I’ve made the pitch before that perhaps a sufficient argument for VR is to just say it’s screens and people as the primary thing, where you’ve got the ability to have your friends together in a small room, and you’ve got the ability to bring up all the things that you do on your own devices in VR in more flexible screens, and then the VR-specific things, the actual games, the Beat Sabers and things, those may be sort of the interstitial things that complement rather than define the medium.

00:31:23 And I made an extension of that, where for the metaverse, maybe the metaverse is lots of screens and lots of people. Maybe there is a screen-focused world where everything that people do with photography and videography has an amazing place in a virtual world, where you have the flexibility to have those presented all over the place, where everybody can do magical things with video and photos, while not everybody can do that many magical things with 3D art.

Some can, and maybe it extends like video has and it gets democratized, but also maybe it takes a while. I kind of keep pushing on: let’s get all of our screens right, and the ability to handle everything, all the types of apps, all types of cloud services, so that everything that anybody does on a piece of glass today you should be able to do in the metaverse, hopefully more flexibly than you can on existing devices. I don’t want it to be a thing that you have in addition to your TVs and consoles and computers and laptops and all of these things. It needs to take the place of at least one and hopefully more. That’s been so much of the value that mobile phones have brought into the world, where they replaced a whole pile of things and brought all their new value in addition to that, but it’s subsuming the other things that is a core part of the value.

00:32:54 Now, everybody agrees that a closed platform doesn’t deserve to be called the metaverse, but there’s a spectrum where you can have completely open, wild west sorts of things, and then you can have completely locked down, single-application platforms, and it’s a pretty good bet that we’re not going to be all the way over on the wild west side of things. I’m certainly partial to that direction, but for the strenuous advocates of that, it has to be accepted that centralized systems provide most of the value in the world today, and there’s reasons for that other than just accidents of history. It is easier to make better, more valuable experiences in many ways with a centralized system. I mean, all the issues with federation and standardization: there’s good value that comes out of all of that, but it comes at a cost, and you can’t really just ignore it. Like on the commerce side of things, you have to be able to make a living in the metaverse. Commerce is going to be some part of it, but I use adult entertainment as a litmus test. If there’s adult entertainment, it’s a very, very open platform from a commerce standpoint. We probably won’t be there. I halfway jokingly suggest certain things along those lines occasionally, but it’s unlikely that we’ll be in the completely open crypto world of things. I love the idea of unstoppable global cash transactions, but I’m also well aware of the swamp of scams and the spam I have to clear out of my timeline every morning dealing with that side of things.

00:34:33 So I’ve actually got a bet that I’m sorting out with somebody about the relative adoption of federated versus closed systems over the next 18 months, and it’s going to be interesting how things play out, because I can envision worlds where it goes either way, and we really don’t know at this point. We’re still figuring a lot of these things out. Now, the most extreme version of the metaverse is that you have one single universal app, something like Roblox, where if you’ve got a Turing-complete extension language and sufficient interfaces, in theory, you could do anything inside any app. We’re all running all of our apps inside the operating system, and however many turtles you stack on top of things, you open up levels of abstraction. But I doubt a single application will get to that level of taking over everything. The problem is, if you make a bad decision at the central level, nobody can fix it. You can cut off entire swaths of possibility, things that might be super important, and I just don’t believe that one player, one company, winds up making all the right decisions for this. So the next step down would be to have our metaverse be something like a giant Unity plug-in, so anybody could build an app with this massive base layer of functionality. That’s sort of where we are today, where we have Horizon Worlds, Horizon Workrooms, Horizon Venues, that are all Unity applications built using many of the same technologies, but it’s far from a clean integration right now.

00:36:14 There are huge issues where Workplace goes and does all this great work, and it doesn’t flow back into Horizon. It is great when you get -- I love VR applications where I’ve got friend scoreboards and you can see profile pictures and it’s easier to invite friends to things, and those are all good things, but it’s really not the metaverse. But on the other side of that, where Horizon Worlds, Workrooms, and Venues are all Unity apps, Horizon Home is really a rebranding of VR Shell, which is a C++ application, so we have to interact with all of these things to get the avatars and the profiles and all of these things that happen in the Unity world happening in our C++ world, and we’re still spinning that up and there are a bunch of challenges for it. So there’s some styling of the upcoming hardware as being kind of metaverse oriented, and I really don’t like that, and I keep pushing back wherever I hear some of these things. It’s important to say that all of these future-of-work and metaverse things will work just fine on Quest. The upcoming high-end hardware will add facial tracking and more world understanding that might add some things on top of these, but they are not central to the experience, and it is going to be significantly more expensive, so it’s going to be, you know, really interesting to see the relative adoption of this.

00:38:17 Now, hardware development takes a long time, and we have multiple new headsets in the pipeline, and this is the first year in a while that we haven’t announced a headset at Connect. Part of that really is an artifact of, again, a truly heroic effort to get Quest 2 to ship last year, but it’s still true that there are only so many points that we can test. The solution space for hardware is large. You can ask for more or less of a whole slew of different things, and you can set binary have-or-have-not features on lots of other things. We get to make one bet a year about this, and that is point-sampling a complex function and hoping that we can pull a lot of good information out of it. Eventually the mature VR headset market should cover all the niches that mobile phones do, but we only get to sample a couple of points. Early on in Oculus, we used to talk a lot about the possibility of second or third party headsets that interoperate with our ecosystem, and that turned out to be really challenging to do because, like Mark said, we sell our headsets at a loss or break even. There’s no profit in the headsets. So there’s no way that a company could go and say, I want to make a budget headset, I’m going to undercut the prices here, without wanting to be able to negotiate for a cut of the ecosystem revenue.

00:39:49 That’s just kind of the way those things work. On the high end, while we can imagine someone saying, I’m going to make a very exclusive headset, most of the things that we talk about are things that require deep core systems software integration to really make them valuable, and we can’t work that closely with another company on that. It’s just really, really challenging. There are still a couple of spots where it might work: if a company made a super wide FOV headset or a super high resolution headset that was still basically the same thing, still basically exactly the same sensors and exactly the same modalities that we have in Quest 2, maybe something like that could work, but there’s nothing like that really going on right now. So cost is really critical. I always think about classic disruptive innovation, where so much of the time the disruptive thing comes in cheaper and lower quality than what people were focusing on. It’s easy for people to fall into the trap of saying, well, I’ve seen better, so this is garbage, even if millions of people are getting a lot of great value out of it. And there’s also lots of evidence from a lot of consumer devices about very non-linear demand curves, where sometimes lowering the price by $50 or $100 can produce way larger increases in the number of sales than you would think from just extrapolating linearly, and we can’t disentangle that so much with Quest 2, because it was better, faster, and cheaper, and sales went up very non-linearly.

00:41:45 But I still suspect the price point was pretty important. The other side of that is that we have an existence proof that there are people for whom zero dollars is not cheap enough to make the current headsets valuable, because there are plenty of Quest 2 headsets in closets that are not being regularly used. We have lapsed VR users. Now, there are no lapsed mobile users. Everybody uses a mobile phone. They might switch to a different brand, but they never just give it up, yet there are a whole lot of lapsed VR users who did not find the experience valuable enough to even take something out of their closet that they already own. So obviously more value needs to be delivered for us to get to the value point that a mobile phone has. It’s an open question how much of that comes from software versus hardware. Obviously some has to come from both, but one of the test points that we’re looking at very closely: a lot of consoles go through this cycle where you get lapsed users and they resurrect when hot new content comes out. Resident Evil is one of our biggest tent-pole releases, and we’re watching very closely to see, does that cause people to bring the headset out of the closet, play Resident Evil, and check out what else is new and how much better the ecosystem has evolved? But it is possible that we need higher-end systems, that we need systems with features that we don’t have now. Some people really do believe that the facial tracking sensors, that ability to smirk or smile or look sadly forlorn at something in VR, will be a significant milestone, that there will be some bright line that we cross where it gets that much better, where the experience is worth that much more than it was before.

00:43:38 You know, maybe sensing your environment, letting you do more on the mixed reality side of things, is critical. A lot of people in respectable positions make that bet, but personally, I still think that the fundamental capabilities of Quest that we just extended in Quest 2 are a sufficient baseline, and that if we could do another better, faster, cheaper version of Quest 2, it would be fantastic, and we’d see another significant uptick like the one we saw this time. Beyond the untapped possibilities, there are the normal improvements: camera resolution will go up, RAM, processor speeds, everything gets wider and faster, and we’re going to get that anyway. The question is about the binary new things. Do we need to go from four cameras to six, eight, ten, twelve? There are some designs that have a dozen cameras on them, and I think that’s madness. I think you’re never going to get down to the low-end, potentially $50, budget-phone thing with a dozen cameras. Software will let us improve things in a lot of ways, while discrete physical components are going to have a real price floor on them. But I could be wrong. So we’re running the experiment with a higher-end headset coming up next. We will see how much value it adds for people. The tease about what’s going to be there was intentionally very vague, but you can see eye and facial sensors, better world sensors, and then the pancake lenses. The pancake lenses are much more expensive, but they are much more compact, you know, you can make a headset that is a lot smaller and slimmer.

00:45:26 Potentially better looking, and they are also potentially a lot clearer, where one of the points that I’ve made is that Quest 2’s view is optics-limited rather than resolution-limited. You are more limited by our optics train when you’re looking at things away from the very centre of the screen than you are by the display. You know, if we doubled the display resolution, most of it would just be a waste right now. We need something that’s going to add some additional clarity, so I’m hoping that that really works out, but that also might put a price floor on future budget systems if we wind up adopting it for everything. Thermals drive some of the design, and I’m running out of time fast on this. In so many ways, we are already limited by just the power that we can dissipate, where the system that we’ve got in Quest 1, if we were able to run it at maximum clocks for everything, could do most of the things that Quest 2 does, and if we could run Quest 2 at full clocks for everything, it would probably have more performance than whatever the next-gen SOC is. We are very conservative about what we’re willing to do on thermals at Oculus. Everybody is justifiably concerned: you don’t want to have a battery fire, you don’t want to have something cooking and smoking on someone’s head. But I think in many cases, as I’ve seen over and over with our displays and some of the other things, we’re overly conservative, where I keep saying, all right, if you want to talk about damaging something, I want somebody to blow something up in front of me and see smoke coming out of this SOC before I really agree there is a fundamental limit here.
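As an editorial aside, the clocks-versus-thermals tradeoff described here is essentially a feedback loop: the system holds the highest clock whose heat output stays under a chosen temperature limit, and that limit is exactly the contested "number picked out of thin air." A toy sketch of such a loop, with invented thresholds and numbers (nothing here reflects Quest's actual governor):

```python
# Illustrative sketch only: a toy thermal governor, not Oculus code.
# Clocks step down when the temperature nears the limit and ramp back
# up when there is headroom. The limit itself is the debatable number:
# raising it directly buys sustained clocks for the same hardware.

def govern(temp_c, clock_mhz, limit_c=70.0, step_mhz=50,
           min_mhz=400, max_mhz=2000):
    """Return the next clock given the current temperature."""
    if temp_c >= limit_c:                 # over the line: throttle hard
        return max(min_mhz, clock_mhz - 4 * step_mhz)
    if temp_c >= limit_c - 5:             # near the line: back off gently
        return max(min_mhz, clock_mhz - step_mhz)
    return min(max_mhz, clock_mhz + step_mhz)  # headroom: ramp up

# A more permissive limit sustains higher clocks at the same temperature.
assert govern(68.0, 1800, limit_c=70.0) == 1750   # conservative: throttling
assert govern(68.0, 1800, limit_c=80.0) == 1850   # permissive: still ramping
```

The point the sketch makes is that nothing in the control law changes when `limit_c` moves; the argument in the talk is purely about where that one constant gets drawn.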

00:47:09 We went through all of this back with Gear VR, and so many of the limits people pick for how hot you can run something are just numbers picked out of thin air. It’s not as if there is a bright line there that we’re going to cross. We have a ton of different optimizations and things that could add a lot of value to our systems if we allow ourselves to use them. There’s a spectrum from actively cooled systems like Quest and Quest 2, where we have a fan, down to systems that you might want to wear all day long, and the notional future AR glasses. Controllers didn’t get much attention in anything being talked about today. Historically, I clearly own the fact that I underestimated the amazing value that we get in gaming from the controllers, but it’s still clear that we are going to wind up in a world where you can eventually get a headset without controllers, where you just use your hands.

00:48:45 Hand tracking is nascent, with lots of room to improve, and eventually we’ll get to something like that, or maybe it uses brain-computer interfaces, more voice, other camera sensors for watching the eyes, different things like that. All these are things that just take work. One of the designs that I would like to see is an embrace of actual flexibility in the headsets. Our headsets are these big rigid things, and I complain that we have to build them like Tonka toys, with big plastic shells around them. I’d love to see us break up the modules into independent things so you can move them, let it be bendable, so you could have any IPD, like giant swim goggles, and we could improve the field of view by canting things. But still, I think we could add a lot more value with software improvements than with hardware, or at least with convenience, which may involve some hardware elements.

00:50:24 One of the things that I revisited recently is, in retrospect, even more striking. We have this half dozen headsets that we’ve built, and when you compare aspects of them, Gear VR versus Oculus Go is a really interesting comparison. By the time Oculus Go shipped, Samsung phones already had faster processors and better screens than what we shipped in Go, yet Oculus Go had integer multiples better retention than what we were seeing on Gear VR. It was all because docking your phone, taking it out of its case and putting it into a headset, was a pain. The friction of getting in there was, again, enormous. Integer multiples. Right now, you put on a Quest, and it seems like sometimes there’s a 50/50 chance you will have to redraw Guardian. You wait, and you finally launch something, and it takes another 20 seconds or more to get into. I tell people, imagine if your phone was like that.

00:51:57 Imagine if you pulled your phone out of your pocket and it took you two minutes to do the thing you want to do instead of two seconds. There’s an enormous amount of value to tap into by making that better. The ultimate in inconvenience was the recent Facebook outage that left a lot of people with headsets that were very broken, and there’s no real excusing that. It was horrible that it happened, but we had a really fantastic internal effort, where there was a team of people going through and cataloguing everything that didn’t work while it was going on. It’s that saying, never let a crisis go to waste, and I thought about asking if I could publicly release that, because it would have shown how much we really are trying to figure these things out and make it better, but it had far too much internal stuff to just dump out. I really thought it was a very positive thing, people looking at that, and it’s a clear case of: you can’t break people’s stuff like that. We need to be delivering value, not inhibiting and taking it away. Then there are the things that we’re just starting to see, the advance of multi-tasking into our systems, where you can have multiple web browsers, and we have the ability to pull up the different Android apps and the media apps, and to arrange them. I was delighted the last time I was looking at something: I thought the keyboard was awkwardly far away, and it’s like, oh yeah, I can grab that and move it up to where I want it to be now. That just worked magically well. We have cases where, like, oh, I can now upload a screenshot directly to Twitter from the browser.

00:53:33 It just works now. Okay, now we can start doing copy and paste, some of these things that you’re conditioned to expect but that simply haven’t been there in VR, and they’re starting to come in one at a time. We are going to get to this place where VR is a more flexible work environment than any single glass screen that you could do things on, but we’ve got to do a bunch of serious work in there, figuring out desktop and window management, all that stuff. We need to get virtual memory working, like real swapped virtual memory, not just a little compressed swap. If you have five applications opened up, let one of them page out to storage if necessary, so things don’t just crash and disappear as you work on something else. But we can get there. We have huge internal battles about different things, like where our plan of record is big on web apps, but I keep making this point: the platforms that succeed wind up having native apps. We should be running all Android apps and should be able to pull in the long tail of applications. We have a few hundred applications in VR; real platforms have millions of applications. We are not going to bridge that by having people bring them over one at a time. I have arguments with people about how we get people to add VR affordances to their applications, and the thinking is, maybe we can get the top 50 apps to come over and put in extra ways to explicitly handle scrolling events and different things like that, but I think it’s actually our obligation and duty to figure out, on our side, what we can do to make the VR platform take advantage of this trillion-plus dollars of content on all of the flat screens.
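The distinction drawn here between compressed swap and real swapped virtual memory is that paging state out to storage actually frees RAM rather than just shrinking its footprint. A toy sketch of the behavior being asked for, keeping a fixed number of apps resident and paging the least-recently-used one to disk instead of killing it (illustrative only; all names are invented, and this is nothing like the real system software):

```python
# Toy illustration (not Quest system code): keep N apps "alive" by
# paging the least-recently-used app's state out to storage instead of
# letting it crash and disappear when memory runs out.
import pickle, tempfile, os
from collections import OrderedDict

class AppPager:
    def __init__(self, max_resident=4):
        self.resident = OrderedDict()   # app name -> in-memory state
        self.swapped = {}               # app name -> swap file path
        self.max_resident = max_resident

    def touch(self, name, state=None):
        """Bring an app to the front, paging in (or creating) its state."""
        if name in self.resident:
            self.resident.move_to_end(name)
        elif name in self.swapped:
            path = self.swapped.pop(name)
            with open(path, "rb") as f:          # page back in from disk
                self.resident[name] = pickle.load(f)
            os.remove(path)
        else:
            self.resident[name] = state
        while len(self.resident) > self.max_resident:
            victim, vstate = self.resident.popitem(last=False)  # evict LRU
            fd, path = tempfile.mkstemp(suffix=".swap")
            with os.fdopen(fd, "wb") as f:       # page out, don't destroy
                pickle.dump(vstate, f)
            self.swapped[victim] = path
        return self.resident[name]
```

With `max_resident=2`, opening a third app pages the oldest one to disk, and touching it again restores its state intact instead of relaunching it from scratch, which is the user-visible difference being argued for.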

00:55:25 We need to figure out how to take advantage of that rather than saying, oh, they need to change their applications to come to us. We want to work with Android apps, cloud Windows desktops, remote desktop applications. I want to bring in everything so that everything can be done there. But it’s on us; maybe that means making a decision on the next-gen controller, whatever we do, to make it easier to act as a TV set or laptop or anything like that. It should be on us. We actually have a ton of features that are not well exposed right now, and there’s a problem where, for years, we’ve had all of these options for things like video recording. We have all of these ways where video comes out of the systems: streaming to desktop PC browsers, streaming to FB Live, to your phone with Twilight, recording it locally. And generally our defaults are not so great. We’ve got all these options for changing the bit rate and aspect ratios and resolutions, and you can get at them through our Oculus Developer Hub application, but for conventional users there isn’t really anything that lets you get there. I kind of wish that we had the equivalent of the old Quake console, where you do some magic chord and you get an old-school text terminal and you can just type in different things to turn features on and off, because when you have a beneficial feature, it’s got to go through our design process, it has to be internationalized, and the way things are set up, that usually has to go through separate processes for rendering the display and communicating what’s actually being done, and it is such an enormous tax on what we can accomplish.
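The Quake console being wished for is a simple, old pattern: string-keyed registries of console variables and commands that any subsystem can register into with one line, sidestepping settings-UI design entirely. A minimal sketch of that pattern (the variable names are invented for illustration; this is not Oculus code):

```python
# Minimal Quake-style console sketch: cvars and commands live in string
# registries, so exposing a new toggle is one line of registration
# instead of a full settings-UI design pass. Illustrative only.

class Console:
    def __init__(self):
        self.cvars = {}      # name -> value
        self.commands = {}   # name -> callable

    def register_cvar(self, name, default):
        self.cvars[name] = default

    def register_command(self, name, fn):
        self.commands[name] = fn

    def execute(self, line):
        """Parse 'name' or 'name value', like the old id consoles did."""
        name, _, arg = line.strip().partition(" ")
        if name in self.commands:
            return self.commands[name](arg)
        if name in self.cvars:
            if arg:                      # set: coerce to the existing type
                self.cvars[name] = type(self.cvars[name])(arg)
            return self.cvars[name]      # get, or echo back the new value
        return f"unknown: {name}"

con = Console()
con.register_cvar("capture_bitrate_mbps", 5)   # hypothetical toggles
con.register_cvar("record_stereo", 0)
con.register_command("status", lambda _: dict(con.cvars))

con.execute("capture_bitrate_mbps 20")
assert con.execute("capture_bitrate_mbps") == 20
assert con.execute("record_stereo 1") == 1
```

The appeal in the talk is exactly this asymmetry: a one-line `register_cvar` versus a cross-process, internationalized settings feature.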

00:57:13 We need to find ways to go faster with that, but there’s a lot of value that should come from simple little tweaks. Something I saw recently: someone internally made a big post about how to capture stereoscopic video from inside VR, and I was like, how did they do that? We don’t actually save that out. I was horrified to find that they did a full recording of the distorted two-eye view and then undistorted it to make a stereo video which we could play back, and I’m like, oh my God, that’s just horrible, that’s not the way you should do it. But they cared enough to do it, and they knew that getting it through our full system design process just wasn’t going to happen or rise to the top of any priority list. It was the type of thing that was not that hard. It took me an afternoon, a few hours, to go in and put in the real way to do it, but it’s controlled by a system property that you have to set over ADB right now. I would love to get it into an advanced settings option along with all the recording things, but that’s going to take six months for us to wind up getting done. We need to figure out how to move faster on so many of those things. We have more than enough people working on all of these features, but we trip over our own feet in so many ways. And I am totally running out of time here.
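The fix described here, recording the two pre-distortion eye buffers directly rather than undistorting the warped output afterwards, reduces at its core to packing the per-eye frames into a standard side-by-side stereo layout. A toy sketch with frames as plain row lists (names invented; not the shipped implementation):

```python
# Illustrative sketch of the "right way" described above: take each
# eye's pre-distortion rendered frame directly and pack them side by
# side for a standard stereo video, instead of recording the final
# lens-distorted output and trying to undistort it afterwards.
# Frames here are just lists of pixel rows; names are hypothetical.

def pack_side_by_side(left_eye, right_eye):
    """Concatenate two equally sized eye frames row by row."""
    assert len(left_eye) == len(right_eye), "eye buffers must match"
    return [l_row + r_row for l_row, r_row in zip(left_eye, right_eye)]

# Two tiny 2x2 "eye buffers":
left = [[1, 2], [3, 4]]
right = [[5, 6], [7, 8]]
frame = pack_side_by_side(left, right)
assert frame == [[1, 2, 5, 6], [3, 4, 7, 8]]   # a 2x4 stereo frame
```

For the system-property part, toggling a hypothetical flag over ADB would look like `adb shell setprop debug.oculus.stereo_capture 1`; `setprop` is the real Android mechanism, but that property name is invented here for illustration.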

00:58:44 I think I am going to do something that I’ve never done before, where I’m going to take these notes that I’ve been scanning through here, which I’m only maybe two-thirds of the way through, and dump them into a Facebook post and send a link. It’s an interesting insight into how I end up doing these things. There’s a bunch more about the system software, development things, and a few miscellaneous things coming up, and maybe we can explore those in other ways. I want to close with one little funny anecdote. I play a lot of Beat Saber, and sometimes I put on arm weights. I know that’s not recommended for a lot of people, but I worked up to it. It’s part of my exercise regimen. But there was a time this year when I was playing Beat Saber with my arm weights on, and in one of those sessions I got a phone call that I had to take. And I had to jump out of VR. Everyone else in the room was like, John’s avatar fell over and stopped moving. Did we give him a heart attack playing those songs in Beat Saber? I had to come back when I was finally done and say on Twitter, I’m actually okay. People were pinging me: did you have a heart attack, are you all right? I had a grand time in VR. I am proud of the work that we do.

01:00:16 And I’m very excited about the future. I will see a bunch of you in about half an hour at the q&a session. [♪] >> oh, fantastic. Thanks so much john. I like beat saber too. Am I the only one who can listen to john carmack speak tech all day? He speaks his mind. After a quick break, john will be live in a custom connect world within horizon worlds. If you are not in the session, don’t you worry. We will be broadcasting it on the reality labs page. Drop your questions into the chat, and we might get a few of those to john. Don’t forget, during the upcoming break you can dive deeper into the topics you’re interested in by accessing the incredible sessions and playlists on the reality labs page. So enjoy this 30-minute break, hop into a session, or get that fitxr workout you’ve been wanting to try. We’ll see you back here for your in-vr q&a with john carmack. We’ll be right back. >> this episode puts a spotlight on asian pacific creators in vr. 01:01:48 I’m your host tuning in from the netherlands. Maureen is the ceo and cofounder of baobab studios. >> it brings you to completely different worlds. Ones only limited by your imagination. When I watch animation, I’m still brought back to that 5-year-old sense of self. >> mark is the cofounder and coo. >> we used real live locations to show off the beautiful spots in asia. I think that vr is the best medium for achieving the goal of creating a safe space for users.

today I want to share about the ties between my family’s immigrant experience and how that has impacted my own practice. I know walking into a room, people are going to underestimate me. 01:03:19 My place in society is supposed to be a certain way, quiet and whatnot. I need to seem more confident to combat that. >> the general stereotype that is aligned with the idea that asians are creative. >> how can vr help? how will our industry address this? >> I think what kevin is doing is really amazing. Putting you in that experience so you can actually feel what it’s like. >> the one most important thing I would like to do is continually making things. Just be constantly nurturing. I want to say it is possible for a small game development studio in asia to make it. >> each creator’s perspectives and experiences. I feel honoured and touched today to hear all the stories. I really think the impact asians and pacific islanders have, I don’t think it can be overstated. 01:04:58 it’s been harder for me because people look at you and think the absolute worst. >> when you walk down the street and somebody yells out the window, you’re too fat to be seen in public, you should put a paper bag over your head, that’s hard to deal with. But then I found supernatural. And that’s all she wrote. >> imagine you’re standing in your living room. You put on this small headset and suddenly you’re standing in front of a glacier in iceland or on the great wall of china. A coach meets you there, and you smash targets. There are no mirrors, no judgements. Just you feeling powerful and losing yourself in the moment as you catch fire. >> the day I finished my first workout, I was hooked. And I’ve been using supernatural for 307 days to be exact. In a row. it’s raining men! >> I’m all about singing and performing and dancing. And I feel graceful when I’m in that headset. >> it would take me an hour to do a 20-minute workout when I started. 01:06:36 And now I can do two hours without resting. 
And that is insane. >> I started to make friends in the community, and it was amazing. Nobody judged me. Everybody was cheering me on, and that was an amazing feeling that I wasn’t really used to. because she embodied everything we’re about, vulnerability, inclusivity, celebrating yourself, and falling in love with movement, we knew she had to be more than just a community member. We knew we had to get her into the studio to inspire others. >> I think it was katelynn who reached out to me and said, hey, we want to include you on this customer feedback call. I was like, I’ll do that. >> I would love to kick things off by asking how you heard about us. Oh, it looks like we have someone joining. Hold on one second. I’m going to cry! my heart is racing. I feel like I’m meeting a celebrity right now. I’m freaking out. [ Laughter ] >> listen, at supernatural, all of us, our entire team, has been so into and thrilled and falling in love with watching you through your fitness journey. 01:08:10 We’ve put together a little slideshow that we want to share with you. You showed every step with total transparency and joy. So much beauty and vulnerability. We were so amazed at every step. We got to see you break 100,000 points. We got to see you break your 100th day of straight working out in supernatural. We were absolutely amazed. And then we saw this post. I don’t know if you know, but we all fell in love with you. Will you let us bring you to los angeles so you can coach your own supernatural workout? >> are you serious? >> yeah. [ Laughter ] >> everything that it is to be a coach, you’ve shown us. We’re so proud. We really would like to bring you out to be a guest coach to help make this dream a reality. >> yes! I should probably ask my husband, but I think he’s okay with it. >> every one of your coaches will tell you this. 01:09:41 If your heart wants to help other people in this way, you’ve got the qualifications and the rest of it is so easy. 
I’m going to be honest, I feel like I’ve just shattered the glass ceiling. [♪] [ Laughter ] >> you might as well be cher or madonna at this point. >> this is insane! >> it’s amazing to be a coach. There’s so much work that goes into it. So much heart and soul. It’s exhausting, but it’s rewarding. I wanted to focus on self-love and confidence. I want that to be the entire message of my workout. Because I don’t think you can get enough of that. >> love and care goes into every single one of these workouts. There’s a huge spectrum. I used to tell people I listened to anything from manson to hanson. >> I was a dancer. Anything with complex movement, I love it. >> we’ll make it work. >> yes. [♪] >> I had braced myself for a gruelling experience. In reality, it was amazing. >> her performance was so authentic. >> don’t be scared. 01:11:19 It took me 38 years to break out of my shell. >> I can’t wait to do her workout. I don’t know if my oculus is ready for all of the tears. I want to speak to those who are struggling with their weight. Who feel like they can’t do the workouts. I want them to know that they can. And they can do it at their own pace. That’s what I did. Just open up. Just let it take hold. Work out, listen to emotional music, and just open yourself up. Let it flow. >> start with walking knees. Getting the hips warmed up. I remember my first supernatural coaching experience. I was very nervous. How are you feeling? >> if I had my heartbeat monitor on, it would probably be over 100. [♪] >> vr is a fundamentally different platform than any of its predecessors. >> you can drag apps from the universal menu or your library to run them side by side. >> it’s a way for developers to safely and securely bring out innovative ideas. >> the developers we have worked with have experienced success unlike any other vr platform that we’ve seen. >> regardless of your background, your age, your speed, you are an athlete when you’re working out with us in supernatural. 
>> in today’s talk, you’re going to learn about currently available opportunities for multiplayer apps on quest and what you can integrate with that. >> they define virtual worlds as places where the imaginary meets the world. 01:15:25 I love the worlds I’ve been invited to collaborate on. >> vr’s seemingly unlimited real estate gives you space like nothing before. >> it will help people learn about our world in new ways. combining complex tasks into a simple ui removes hours of friction during play testing. and it’s a beautiful new design. >> over the next few years, we will unlock a suite of tools developers will need to realize the metaverse. Imagine quest experiences that blend virtual content with the real world. I am very excited to introduce the facebook presence platform. >> it enables you to build completely new voice-driven game play and experiences. we can’t wait to see what you’re going to build. and this is all just the beginning. buckle up. And let’s get started. [♪] >> hey. 01:17:51 hey, facebook. Stop video. [♪] >> you got me. you are so mine. Hey, behind you. >> what a shot! >> oh, god. [♪] >> descending into the darkness, the enemy found himself with old but familiar magic. 01:22:04 Whispering the call to adventure. Compelled by fate, he summoned the old fellowship. >> oh, hey! [♪] >> all right, boys. What’s the plan of attack here? >> get the key, and then we have to get to the exit. >> I’m going to try to take out this spider-looking thing. >> take him out! >> where are we going? are we going here? oh, a chest. We have a cart. >> nice. do I need to save you? >> do you have a cart? >> hang on. I’ve got this. [♪] >> what about this door over here? >> I have a feeling this will get ugly. they are killing you, man. >> taking a beating. >> we’ve got you. here they come. I got you, I got you. >> let’s do this. [♪] >> hey. 01:28:36 hey, facebook. Stop video. Hey. Can I take a photo? >> sure. 
[♪] >> vr is a fundamentally different platform than any of its predecessors. >> you can drag apps from the universal menu or your library to run side by side. >> it’s a way for developers to safely and securely bring out innovative ideas. The developers we have worked with have experienced success unlike any other vr platform that we’ve seen. >> regardless of your background, your age, your shape, your speed, you are an athlete when you’re working out with us in supernatural. >> in today’s talk, you’re going to learn about currently available opportunities for multiplayer apps on quest and why you should integrate with them. >> horizon is an ever-expanding vr world to explore, play, and create. my creation style has evolved with each friendship I’ve made in horizon. >> you get the flexibility and space for your apps like no other screen in your home and office. >> vr can help people not only visualize the world in a new way but can also help us learn about our world in new ways. >> removes hours of friction during play testing. >> and it’s a beautiful new design with a reorganized navigation system. >> over the next few years, we will unlock a suite of tools developers will need to realize the metaverse. 01:30:46 Imagine quest experiences that blend virtual content with the real world. I am very excited to introduce the facebook presence platform. >> the voice sdk enables you to build completely new voice-driven game play and experiences. >> we can’t wait to see what you’re going to build. >> and this is all just the beginning. >> buckle up. And let’s get started. [♪] >> welcome back, everybody, to connect 2021. I’m melinda davenport. I’ve had a lot of fun guiding you through connect 2021 here in the real world. But now it’s time to take the party into vr and over to my virtual counterpart. So here from inside horizon worlds is nate barsetti. Hey, nate. thanks, melinda. 01:32:19 Hello, everyone. I’m nate barsetti. 
I want to invite you all to this q&a with the legendary john carmack. We’re coming to you live from horizon worlds. And joining us today in vr is a live audience representing developers from the oculus start, oculus launch pad, and horizon accelerator programs. We’ll also be taking a selection of questions from those watching on fb live or in horizon venues. If you have something you’d like to ask, go ahead and put it in the comment feed on the fb live stream. To kick things off, I would actually like to ask the first question. John, you and I first interacted several years ago as avatars while watching a film with other oculus users in venues. Today, we’re back together in avatar form for this q&a, bringing your classic connect hallway talk to vr. What does this mean for public gatherings in general? >> the biggest thing is that we used to have this defy distance tagline. The connect hallway talks really were a wonderful thing. People would come up and say hi and ask a question, or just sit around listening while other people asked questions. It was a valuable experience that people got because they were able to fly from across the country, or in some cases across the world, which is an expensive endeavor that not everyone can undertake. 01:33:58 There’s no reason now to have it be limited like that by physical proximity. I am very much looking forward to this world where it is open to so many more people who can get here without necessarily having to get on an airplane. >> that’s amazing. That is the promise. Great. So I think we’ll just kick it off with some questions from our vr audience. Anyone have -- oh, I see someone here wants to ask a question. Why don’t you come up here. Introduce yourself. >> thanks, nate. Hi, john. I’m a producer at a free vr app for multigenerational families. I’m also a participant in oculus launch pad this year, where my team and I are trying to use voice ai to optimize the narrative process. 
What do you believe developers should focus on as we seek to optimize our ai-driven vr experiences? Thank you. >> ai-driven efforts are -- they can be magical and do things that were literally not possible under any circumstances a decade ago. But in many ways, training ais to do things is -- you know, it’s almost more like training animals than traditional software development. And it can be -- I would go into it thinking you have to scope down on something and make it small. 01:35:32 Everything is possible with ai if you give it a large enough training set. But you probably, especially as a small startup, will not be able to build the training sets to do all the things you imagined would be possible. So you have to leverage things people have already done and repurpose them. Voice-based things, I am hesitant to really -- I mean, going in and saying I want to do something broad with voice is going to be challenging because you’ll run off a lot of cliffs. But if you have a really tightly focussed thing, you can probably work something out. Especially if you’re targeting it for, like you said, multigenerational families. Great grandmother is not going to be able to manipulate the controller well, but if you do some studies to figure out how to map that in, there might be something there. As with everything good that I tell developers, you need to have your hundred beta testers to be able to run through and find out how it works. It’s probably easier than most things to be optimizing for your -- you know, making your test set yourself is always bound to cause problems when you open it up to a wider world. So diversity in your testers is probably my biggest advice. Thank you. >> wonderful. Thank you so much. >> thanks for that great question. 01:37:04 Let’s go to another question from the audience. Ash, do you want to come on up? Here we go. Introduce yourself. >> hi, john. I am a world builder and scripter in horizon worlds. I was actually part of the horizon creator accelerator program.
I actually want to piggyback off the last question a little bit. With all the announcements today, I heard a lot about technologies like project nazaré and other technology. My question is, with developments like these coming down the line, do you have any predictions on how we can use such technology to make vr more accessible? >> text is hard to read in vr, so there’s all the things you do about just making things much larger. 01:39:26 Making it like nintendo text rather than xbox text, ie how you present something on screen. And there’s different things you would do with audio. Making sure you don’t mix it in with background tracks. But these are things that in general -- like, the pitch for something like accessibility is usually that it makes things better for everybody. When you’ve got something where you’re talking about a very specific disability -- like, there was an interesting bit earlier when we talked about how adjusting the floor height for guardian was challenging for people in a wheelchair. So we did get in a little way to do that as a specific thing. But most of what winds up making a difference in accessibility is readability, worrying about colour blindness, worrying about good contrast, and good communication using the audio modality. I would generally urge more towards -- I’m not really picking on severe disability things. But how to make things that can work with broader ranges of people and make people that are at the fringes more comfortable. Like, what’s your specific use case? What are you helping people do? >> in horizon, we actually have different events where we invite people to be part of the community. 01:40:57 And sometimes we have users who are unable to speak to us because they can’t verbalize. It’s hard to do sign language when you don’t exactly have the tracking to communicate that way. So I was wondering if there was anything that could help with that. >> that’s the kind of thing more likely to be on the system side of things.
That’s the problem with being a third party application. It’s hard to integrate something like that to add value to a broad range of things. But I can imagine something like that. That seems like something we would have to do in the headset system software. Or at least at the horizon level. What might come out of that is proof of concept visualizations, and it gets pitched: this would be so much better if it’s applied everywhere. But it’s challenging to make -- unless it’s done as a non-profit with a grant or something, it’s hard to get a company to do something like that beyond the third party. >> sounds like we need to invite the audience to try to come up with that technology for us then. >> again, technology-wise, always think about what we can do today. It’s fun to hear about what might be coming in three to five to ten years. But it’s a mistake to actually get wrapped up much in that. 01:42:31 We have enormous possibilities just right in front of us now. And finding the right things with the capabilities we have here is my strongest advice. >> okay. Thank you. Thank you so much. >> we are off to a great start here. Thanks, everyone. I’m actually going to bring in a question from our audience on fb live. This one comes from oscar. Any possibility of an affordable full-body tracking system in the oculus ecosystem? >> so what we can do is the bottom camera on quest sees a reasonable part of your body. Especially when you look down. We have little hints of information there. We can infer a lot by how fast your hands are moving. And I think there is a clear path to us building more and more body tracking solutions using the camera data that we’ve got. We have sdks right now that tell us a lot about tracking. As we augment that with a few more hints of data that we get in -- like the cameras right now, the way the hand tracking works, it builds a rectangle around the hand and sends that through a neural network.
But we have things that track up the whole arm, like up to your shoulder. Like, if your hands are here where I can’t track them right now, it can see enough of your arm that it can infer things there. 01:44:08 And seeing a little bit of your torso can tell you the twist. And if you look down, you can see your feet from it. There’s a lot that can be done just from what we’ve got right now. It’s not everything. It’s not like having good trackers actually on your feet or external cameras looking at it. But that’s the theme that goes through so much of what we do. Let’s not think about what we wish we had in a perfect world but what value we can get by gentle modifications of things we already have or are already going to be getting. I think there’s definite -- it’s almost certain that we’ll get something along the lines of more body tracking. But probably not completely independent sensors like that. At least from us first party. >> great. Thanks. And thank you, oscar, for the question. If you’re just joining us, we’re live with john carmack in horizon worlds. We’re taking questions from our in-vr audience. But if you’re watching on fb live or in horizon venues, feel free to drop your comments on the fb live stream. Let’s get to more questions from our audience here. You, the gentleman with a suit on. Looking pretty dapper, by the way. Come on. Sure. Here’s the microphone. 01:45:39 >> hi, john. My name is michael. I’ve been working in horizon for about a year now. I love building in horizon. It’s fantastic. I was also part of the accelerator program. That was really fun as well. My question is in regards to nfts. Do you feel like, going forward with the metaverse and meta, will there be nfts? And will other companies be able to access those nfts? >> I’m kind of in an interesting situation where I’ve commented that my timeline is sort of bipolar in its distribution about this. Where so many people think nfts are just the devil and it’s the worst thing that ever happened.
And so many people think the future economy will be built on this thing. I’m in a weird place where I’m actually not personally sure about this. Like I said, I see a lot of value in this. The idea of this unstoppable global transaction and completely permission-free stuff. But every morning I have to clean out my timeline from all the scams and, you know, spam promotion. I can see people being a little bit down on that. 01:47:10 I would guess that we’re not going to be able to keep ourselves from having some degree of oversight and control, where just the way it comes out for everything, all somebody has to say is, oh, there’s terrorism and child porn being transmitted through transactions on a platform, and everybody will freak out and something must be done and won’t somebody think about the children and blah-blah-blah. I could be wrong. I don’t know one way or the other about this, but my guess would be it’s still going to be somewhat of a gated experience. But really I have no inside information on what our strategic goals are with that. Mark made it very clear that commerce is important. People have to make money. I mean, we want to -- it’s like the metaverse will be real when you have the first metaverse billionaire or something. It’s super important, but the steps between here and there may go over a lot of different paths. >> thanks. Thank you. >> thank you. All right. Let’s go to another question here. How about you in the back there. >> thanks a lot. Thank you, john, for doing this. My name is dylan hunter. I’m a developer where we’re building an app. 01:48:49 One of the things we’re trying to do is build our apps with the idea of hand tracking from the ground up. As opposed to adding it as an afterthought. We’ve stumbled across some really interesting paradigms but also a couple of interesting challenges. Such as hand-over-hand occlusion and inter-finger occlusion. By the way, amazing work. It’s incredible what you’ve been able to do with hand tracking.
I guess with project cambria coming out, I was curious if you kind of knew where hand tracking might be in the coming years. What the persistent challenges might still be, and if you have any advice as a developer. >> yeah. So I have been pretty happy with the pace of improvement that we’ve been able to make. If we go back in time, there were plenty of people who were like, we can’t do hand tracking until we do custom hardware for it. Now we just bore down and we got something out, revved it, doubled the frame rate, improved the quality. We made a lot of strides, and that’s still nowhere near maxed out. Like the point I was making about body tracking, we can get a lot better by not just looking at the hand but looking at the forearm and integrating that information. 01:50:24 If we know more about the body, we’ll know more about the possible positions the hand can be in. But there can be cases where, what you’re doing with your fingers when they’re covered by other things, we’re not going to be able to learn that. But there are hardware things that can help. Ranging from higher resolution to higher fidelity cameras. Colour cameras help a lot for some of the skin tracking inside of things. So I would suspect the next gen hardware -- while it doesn’t have explicit hand tracking sensors, that sensor suite should be able to make things better. But it’s going to mean a whole new set of training data. Whole new sets of work going into it. And it’s going to take more processing power, which will come at the expense of other things. I expect hand tracking will continue to get better on the existing hardware. The next gen high-end headset will probably be better quality than whatever quest 2 has at the same time. But you’ll still have to make smart developer decisions about setting everything up so you can bend the world around it. Maybe you need to tilt the board more up. You need to spread it out larger.
01:51:56 Just do all the things, since you obviously can’t do super fine placement. But a very large board, very large pieces. Figuring out all the ways to allow the user to get feedback as they’re going. I know it’s sort of a problem with chess where you can’t touch a piece. Maybe you break that in vr, where you want to make sure that the interaction of saying I’m going to touch this goes through a stage where it’s clear you’re about to grab this, rather than it happening instantly. One of the things that I’m -- I’ve always been pretty down on the idea of gesture controls. Where you do a gesture that turns into a discrete event. I always have said the best way to do things is this continuous process where you want to know that -- like, swiping with the hand is one of the better things. You’re just moving it over a continuous space as opposed to doing something that triggers a magical action. So maybe you go down and it’s telling you, all right, through feedback using every possible means of feedback: you want to change the visuals on what’s there, you want to have audio feedback, you want to have haptics -- well, you can’t really do haptics with bare hands. But you add new shells around things, add more elements. Most people will miss half of the feedback, but as long as they catch some of it, that lets you do it. 01:53:31 It starts changing colour, starts having sparkles around it, starts playing an audio thing. As it goes into the critical phase, it makes a popping sound and lifts off and grabs. Things like that. There’s a lot that you can do. It’s best for you to kind of think about it as a really messy black box. Hand tracking is like a really broken controller right now. It works sometimes, but it’s far from perfect. And if you just -- it’s easy to get upset about it. But if you just accept it like gravity, it’s just a constant of the universe, then you apply all your cleverness towards working around it.
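As an aside for readers, the staged, continuous grab feedback described above can be sketched in code. This is a minimal illustration, not any Oculus API: the function, field names, and distance thresholds are all invented for the example.

```python
# Sketch of staged, continuous grab feedback: instead of a discrete
# gesture event, cues accumulate as the hand approaches the object,
# so users catch at least some of the redundant signals even when the
# tracking is noisy. All names and thresholds here are hypothetical.

from dataclasses import dataclass


@dataclass
class Feedback:
    highlight: bool = False     # object starts changing colour
    sparkles: bool = False      # particle shell around the object
    audio_cue: bool = False     # "approach" sound starts playing
    pop_and_grab: bool = False  # critical phase: pop sound, object lifts


def grab_feedback(distance_m: float) -> Feedback:
    """Map hand-to-object distance (meters) to layered feedback cues."""
    fb = Feedback()
    if distance_m < 0.30:
        fb.highlight = True
    if distance_m < 0.20:
        fb.sparkles = True
    if distance_m < 0.10:
        fb.audio_cue = True
    if distance_m < 0.03:  # close enough: commit the grab
        fb.pop_and_grab = True
    return fb
```

The point of the design is redundancy: a user who misses the sparkles still gets the colour change or the audio, so the interaction reads as a continuous process rather than a magical instant trigger.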
If it magically gets better six months from now or a year from now, you just look all the much better. >> wonderful. Thank you so much. Really great. >> thank you. All right. So we’re going to go to another question from fb live. This one comes from anthony. How important are varifocal displays for vr? >> my personal opinions do not necessarily map to, you know, lots of other people here, where a perfect varifocal system obviously makes the system better. 01:55:07 But there’s both dollar cost and volume cost, weight cost, thermal cost, processing cost that you have to spend on this. But then there’s also the question that imperfect varifocal might not be that much better. There’s value when it’s done right, but what’s the shape of the approximation of that? It’s possible that it could dip negative, where if the ability to track onto something to determine what distance you want the focal plane to work at, if that’s not accurate or fast-responding, you could be looking at something and it gets blurrier than if you never had anything at all on it. So I don’t think we’re at a point right now where we can say we have a perfect line of sight to that. There are a lot of problems going from -- we had systems that worked well on someone in the lab, and then as they spread out trying broader ranges of people, we ran into more problems with it. There are problems with glasses, different eye shapes, different eyelid shapes. A lot of things there. We will learn a lot with the next headset that has eye and face tracking. We hopefully will have a ton of users trying that. 01:56:39 Although this really does come down to one of those privacy question issues, where there’s all sorts of these things where we could improve our technologies if we did collect data from all of these cameras. But we don’t. And that’s a big foundational point about this. If we were able to sample everyone’s eyes, that would be a wonderful dataset for us.
But all we can do is infer secondhand, ensuring that it’s working well. And I would make the point that -- for a lot of things like dealing with screens, if we just put the screens where the focal plane is, for those use cases, varifocal could only be a net negative. So it’s only for use cases outside of that where it has positive value. >> great. Thanks for the question, anthony. If you’re just joining us, we’re live with john carmack in horizon worlds. We’re taking questions from our in-vr audience of developers. But if you’re watching on fb live or horizon venues, feel free to drop your questions in the stream. Let’s get another question from the audience. Ma’am, up front there. Come on in. >> thank you. 01:58:10 Hi, john. Thank you for this opportunity to talk together and to listen to your ideas. My name is paige, and I’m the founding director of betterworld museum, horizon art museum, and women in horizon. I’ve had a wonderful opportunity to be in the horizon mini game accelerator and oculus launch pad. I’m currently obsessed with using horizon’s tools to build more resilient individuals, groups, and communities. I do this through climate hack-a-thons, women’s world building workshops, and more. I’m currently creating a game about women’s financial empowerment. With this game, I want to create stores with assets and have those assets be able to be either purchased by somebody in horizon or through facebook. Do you imagine one day these assets being able to be transferrable and/or even combined with some of the other creator tools? For instance, using a spark ar filter that I create inside of my horizon selfie cam. 01:59:45 >> a couple parts to that. I think mark was pretty clear that he wants commerce to be happening in our platforms and that it should be not tied into one place. So I think something like that will come along. But it’s probably not going to happen super soon, but I’m -- yeah.
I think that’s very much in line with the vision of what they want to do. >> that’s really exciting. Thanks for building these incredible tools that empower people around the world. >> and the spark ar things, I think there’s an opportunity for merging that into vr, where -- I did a hack internally several years ago that basically let us run other processes on the vr system and reproject them into vr. It’s one of those things that’s technically completely possible, to start putting a camera through there and feed it through spark and then put it back into vr, at least as a 2d shape. That is the type of thing where every time we pile in new features like that, our performance budget for all the things goes down, and that’s where we get into those hard design decisions. It’s cool here that we’ve got the cameras floating around with the screens on them for doing a multicamera view of all this. 02:01:16 But it also is contributing to the fact that we have 16 users in here rather than 30. And we can say, like, I could pretty much lay out exactly what we would need to do to let spark show up in horizon, but it might make avatars more expensive. All those are tough decisions, and that was my big point about how we need to treat the metaverse as a constrained optimization problem. This is cool, but this is actually more valuable, and make trades like that. >> those are important considerations when building in horizon especially. So thank you for that insight, john. I appreciate this. I love horizon worlds. I’ve been there since the beginning, and it was a real honour to share with my friend ash at the building diverse worlds in horizon session today. >> when I spend time in horizon doing these talks and things, there’s all the things I can point at. People’s avatars glitching, name tags we need to do differently. 02:02:49 But I can see a line of sight from here to where we need to go. It’s not there yet, but it doesn’t seem like we need something completely different.
We just need to polish off all the rough edges, and we can find and open up things to more people, more options. Commerce, all that. And I think we’ve got something that can be successful here. I believe it. Thank you. >> thanks, paige. Great. I think I might be neglecting this side of the room a little bit. Come on up here. Step up here. >> hey. Thank you. It’s really great to be here. My name’s chris. I’m the developer of big baller’s basketball. I really like to study the physics to make a realistic game. The question I have, which is not related to that, is really just about the future of vr. I think I’ve come to realize that vr really is the future, and it’s going to be really popular amongst many people. I can see it being in every person’s home just like a tv. But with that, there are some ethical things to think about. 02:04:28 My question about that is, people might get addicted to vr. And perhaps maybe a little bit disillusioned. I’m curious about your thoughts around that. I realize it’s a bit of a dark question, but what do you think? >> I do take a position that, in general on so many things, I don’t buy into the precautionary principle idea where it’s like, if there’s a chance of something going wrong, we should take mitigating measures for it. If we know something’s a problem, if there’s demonstrated harm, we need to mitigate the harm and make smart decisions about it. But with the idea of a speculative harm, I think generally the right thing to do is wait until harm actually manifests. I do point at the faa regulations for airplanes. There’s a saying that the faa regulations were written in blood. They only wrote a regulation once there were actual plane crashes. Some people will say, that’s horrible. We should be protecting people proactively. But in the real world, people are terrible at predicting the future. And there’s all sorts of nonobvious harms, or at least friction and detriments, that happen when people try to avoid harms in different ways.
So worries about vr addiction or, like some of the worries that I have to deal with about ai safety and other things, I do just really not worry about that. 02:06:03 And some people might say, well, that’s callous or that’s irresponsible. But show me a real problem. We’ve got real problems that we can then make value judgements and do engineering based on. So I encourage people not to think about the speculative problems. It doesn’t take long to look around and find something that’s a real-world problem that we can actually do something about today. >> thank you very much. >> all right. I’m going to take another one from this side of the room here. Come on up. There we go. >> hello. >> hi. >> my name is liza. I’m a member of oculus start and am making a game. I also work as a software developer at a company called emerge that does hardware and software for ultrasound-based touch. So I do various things. I have a really broad question. I just want to know, besides small apps or top-performing apps, what is the strangest thing you’ve seen in the data from users? >> I make a big point about telling everybody at oculus that we should be paying more attention to the data. 02:07:47 And I had a post just a month -- less than a month ago about how we have all this data that we don’t look at enough. And I made the point that I have access to all the dashboards, and I don’t spend as much time looking at them as I think I responsibly should. My argument now is, I’m a consultant, but I try to guilt everybody else into: this is your full-time job, you should be looking at these things. I was suggesting that half of every group’s posts should be an appropriate data graph for whatever the group is about. That you should be looking at the real data. Because far too often we’re just following people’s gut instincts or visions on things. Some of what you -- the reason you want a 30-year veteran in is because of gut instincts. But data in the end should trump gut instincts.
You only use gut instincts when you have nothing else to base anything on. So in terms of real surprises, gorilla tag was a nice one to be able to call out here. It’s, like, crazy large user numbers. It’s clearly sort of a one-man project. It’s like all of the things I would normally talk about, about how you increase the quality of different things and the feedback you want to give -- it has relatively few of those. But it resonates with people, and it just has a neat little mechanism that turns out to be a ton of fun for people, that could have been invented in horizon probably as a different mechanism. 02:09:27 I’m not sure if we have exactly the right scripting for that. But that sense that somebody found a little kernel of magic that nobody else had hit on yet, and it was really easy and obvious that it’s got that value there. Other really surprising things -- I think it’s nice now that we’re getting to a point where a lot of the things that are working are almost predictable. Of the things I wind up playing -- I play a ton of beat saber. Beat saber is obviously the correct thing to do in vr. It doesn’t have any negative thing about it, and you get some of the good things that happen in population: one with the climbing ability, feeding well into vr. Pistol whip is a great game because it has motion but no comfort issues, because it’s strictly linear. Subtle surprises -- I think the fitness stuff has worked out much better than we expected. People playing beat saber early on were like, isn’t it going to be gross being in a sweaty headset? But it turns out it works. We’re making steps now with accessories to make that a little bit better. I would say that was somewhat of a surprise, that fitness worked out that well. But in a lot of ways, we can look at a game spec that somebody lays out and point out, this is going to exploit vr this way, and this is where it’s going to run into some of the limitations of vr or where you’ll have problems with users.
02:11:15 But this many years into the experience, I think there is a pretty good knowledge base going on right now about how things are working. I’m excited to see how we test this idea with resident evil and the upcoming grand theft auto. Taking games that are generations old and pulling them into vr. I’ve been pushing for that from very early on. It’s do-able. You can take something that was triple a many years ago and make a great vr game out of it. We still occasionally get surprised by some little thing like gorilla tag. We can’t predict everything, just as oculus, as content people, whatever. People are going to come up with things and we’ll be like, damn, we didn’t think about that, but it totally works. >> would you like the mic back? >> yeah. Awesome. Thank you so much. That’s all I needed the mic for. >> great. We’re going to -- let’s go to a question from fb live. This one comes from eric. What do you see as the future of webxr? >> I didn’t have that in any of my notes. 02:12:53 I’m not really deeply involved with that side of things. But I go around in this battle with myself whenever I’m upset about app lab reviews or something like that. And what gets thrown back at me is, well, the escape valve is webxr. There’s no approval process whatsoever. Anyone can do anything. And sometimes I get into moods where I’m like, okay, maybe I should throw more personal effort into advancing this and advocating for it. And there’s been times over the years where I’m looking at the performance and tracing things through the stack. And there’s things that we can do to improve the performance. There are things that I certainly could advocate for. Most webxr export is generally done on pc, and it’s horrible on quest initially. It’s just a little more challenging to do direct native development on quest. But it’s going to be there forever. I think that it is an important aspect for things. But we haven’t seen that breakout thing.
It does give me some pause that we haven’t seen something like gorilla tag -- that somebody could have done in webxr, made that experience, and got that to work. I do think there is something about the communities that are really webxr focussed. They are the purists. 02:14:25 We have this metaverse idea where it will be built with web technologies and webxr. I think culturally some of that group is missing some of the user and quality focus that more traditional app developers have. And I would ask them to be more quality focussed on things. Where some of them are, like, well, this is great in theory. But when you look at the actual practice, it’s a little bit lacking. But I do think there are things we can do on our system software side to make it easier and faster to get into there. That is something where we totally could get to the point where links go around, you click on one, and bam, you’re in an experience. But right now it’s a big mess. It’s not a clean, smooth thing. We could do a lot better than that. The performance is good enough if you are really paying attention to your performance. If you’re just grabbing a big toolkit and using a pc, it’s going to be terrible. I’ve reviewed several people’s webxr spaces, and it’s all painful for me going through how many things are wrong from a presentation standpoint. 02:15:58 But it is possible. Where if you ask, how much texture space, how much can you get through the webxr stack, you can do great things that look really fantastic. But you have to approach it as an optimization problem rather than a creative problem. >> thanks for that question. Let’s get back into our in-vr audience. I’m going to go with you. You’ve had your hand up for a while. Come on up here. >> hey, john. First off, I’m a huge fan of your work, for many, many years. I’m the ceo of vr studio gibb games. We work on using current hardware for accessibility. Also we develop vr experiences. We have an upcoming murder mystery puzzle game.
One of the things I find really interesting when I’m doing development, and I’ve been doing level design for a long time, is that there’s a lot to user experience when you’re developing level design and ui. What I was curious about from you is, with all of your experience in 2d from back in the day to today, what are some of the major differences that developers that are getting into vr development, and are more familiar with the 2d space, might look for when they’re developing for vr? Like, what kind of major things might they want to keep in mind that are very different in a 3d experience from a 2d one? >> I don’t think it’s really fundamentally different. 02:17:45 The best conventional gaming levels were built by people who had an architect’s eye, and they had awesome composition skills, and they built things through the eye of the camera. You want to get there and still have that same hero shot in vr that you get in a traditional triple a game. So if anything, a lot of the work that was overdone in traditional gaming now comes into its own value in vr. I still think back to the original vr test level, which was a fragment of rage that I had pulled out for the pre-rift work. And I remember just sitting in there looking at this amazing detail that the artist had built in. I had seen that dozens of times and just run by it onto the next part of the game. But in vr, you stop and you can really appreciate so many things. I think there’s a lot more to get out of it. But in terms of things you would do specifically, I guess performance is more of a problem. Dropping frames is more of an issue, and you’ve got less performance headroom because we are stereo and high framerate. You can’t go from an equally powerful game and expect all of it to come right over. You can’t use the same textures for everything. You have to think about it more, but you get a lot more out of it.
When an artist gets to build their scene and then walk around and crouch down and look at the brick work at one corner of their level, it’s kind of an amazing thing there. 02:19:18 One thing I had wanted to mention in my talk was something for everybody building in vr -- and this applies to video games, but especially in vr where budgets are tighter. If you have something with a finite number of levels -- like, walkabout mini golf does this great thing. They build these mini golf models and then relight versions of them. But there’s another hack which is really easy to do from a programming standpoint. And that’s mirroring your level. You would be amazed at how different things feel by just doing a mirror flip of your world. Like, make your world and offer a flipped version of it. If you have things like text on posters, that’s a problem. But for most things, it’s surprising how much you can do. Games that I enjoy where it’s got maybe 8 or 10 levels or something -- just double that. Make them all mirror versions. Speaking of mini golf, they should go ahead and offer mirror versions of those levels, and they would play and feel completely different. 02:20:52 >> great. I saw your hand up before. Why don’t you come on up. You got it. Why don’t you move forward a little bit in real space. There you go. >> hi, john. I worked on an immersive language experience. Totally unrelated to my question: my fiancé and I were talking about your talk last year and how you mentioned haptic research that was being done, using vibration to create stimulation to prevent motion sickness. Do you know if there’s any progress that has been made in that field? >> I haven’t heard of anybody following up on that. It’s kind of surprising that people -- a couple of years ago at connect, they had a little headband with buzzers on it, and they had a demo of it. Running you all over the place that should make you sick. And they claimed broadly that not a lot of people had a problem with it then.
It seems very simple, but I have not heard of any third-party study on it. I have found that having haptic actuators can be useful. 02:22:35 Your brain can fuse sensory inputs into different things, so the idea of having small little buzzers, even just as UI feedback, may make sense all on its own. In cases where we can tell there are issues, we could turn those on to see if it made a difference. That could tie into something else I talked about last year that I still think is a good idea: we can tell, at the final instant when we’re rendering, whether something should be uncomfortable, because the headset has actually moved in a way that doesn’t match up with the way the visuals are presented. We need a little more information from the applications, but there are corrective measures we’ve considered, like adding audio, or reducing contrast, or blurring only the areas where the actual uncomfortable bits are. The sky doesn’t need to be blurred at all because there’s no discomfort there, but the things sitting right next to you might have more of an issue. With that ability to scan the scene and know if something is uncomfortable, we may be able to do visual things to avoid much of it. But if there’s an extra benefit in doing something with the haptics, that might be interesting. I’m not aware of any follow-up experimental work on it, but it does seem interesting. It’s amazing that we can have 10,000 people and still have all these potentially exciting ideas that do not get followed up on.

02:24:15 >> Great, who haven’t we gone to? I see you back there, Sandwich. Why don’t you come up.

>> Hi, John. I design games and script them in Horizon. My question is: what are some game genres that you think will work really well in VR that we haven’t seen content for yet?

>> You know, one of the things that I saw some demos of really early on -- I think there’s opportunity for taking even really classic retro games and having them be an interesting VR experience.
This notion of taking something that is traditionally a 2D game, and even if it stays exactly the same game, turning it into a 3D diorama. It’s a niche thing, but I can imagine you take Pac-Man or Space Invaders or anything like that, and instead of being on a 2D screen, everything turns into little models running around right in front of you like a living toy set. I suggested at one point doing that literally with the emulator, so it is the exact old 2D game, and you just figure out how to place the different models on top of it. I’m still sad we didn’t get to do a really good Oculus Arcade early on. 02:25:46 I think that idea fits some of the old games. People aren’t going to spend 20 or 30 bucks for things like that, but it would be great if, in Horizon or something like it, you could drop a quarter into essentially an old arcade game, so you aren’t restricted by the distance of going to an arcade. I think there are small benefits for things like that.

But the things that work really well in VR now -- again, Beat Saber is a perfect match for what VR does. We have other things that are clearly very valuable but still fighting some of the limitations of VR, like the lack of being able to physically interact with things. And then we’re still getting surprised by some things, like Gorilla Tag. I’m still very excited about everything being built. The quality of the games is way up over where it was, though we’re still nowhere near AAA quality for almost anything; none of the games have that depth or level of polish yet. I think it’s coming along in a healthy way. A few years ago there was a lot of despondency in VR development -- we’ve been at it for years, nothing is taking off. But now people are feeling good about it. It’s not like, well, I did a great job and it still fell apart. 02:27:19 Gaming is looking pretty good.
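The emulator-driven diorama idea above is mostly a mapping problem: read the 2D game’s sprite state each frame and re-place a 3D model for every sprite on a virtual tabletop. A toy sketch, with a made-up sprite-to-model table and coordinate convention (nothing here corresponds to a real emulator API):

```python
# Hypothetical sprite-to-model mapping; real sprite IDs depend on the game.
MODEL_FOR_SPRITE = {0: "pacman.glb", 1: "ghost.glb", 2: "pellet.glb"}

def sprites_to_diorama(sprites, cell_size=0.05, table_height=0.7):
    """Map emulator sprite entries (sprite_id, col, row) to 3D placements.

    Returns a list of (model_path, (x, y, z)) tuples, with the 2D playfield
    laid flat on a tabletop: columns map to x, rows map to depth (z), and
    every model sits at table height like a living toy set.
    """
    placements = []
    for sprite_id, col, row in sprites:
        model = MODEL_FOR_SPRITE.get(sprite_id, "unknown.glb")
        placements.append((model, (col * cell_size, table_height, row * cell_size)))
    return placements
```

The appeal of doing it this way is that the original game logic keeps running untouched in the emulator; the VR layer is purely a presentation skin over its sprite table.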
We are trying to expand our breadth a little bit now and not be so focused on gaming. But there’s no question gaming is still the dominant factor right now, and it’s coming along well.

>> You heard the man: Space Invaders 3D in Horizon by Christmas. I’m going to hold you to that. Oh, you want the mic?

>> Yeah. Someone in Horizon named Geoff created a Pac-Man simulator which is actually really similar to the idea you were talking about, where you have 3D Pac-Man and walls and you have to evade the -- or move away from the ghosts and everything. It’s cool what that guy has done in Horizon. I recommend checking it out.

>> Okay. Thanks.

>> Great. We’re going to continue on here. I see you there with your hand up. Come on over.

>> Okay. My bad. Just nervous jitters and stuff. I’m Joshua. I work with Laura on our VR language learning experience. 02:28:52 I work on my own game on the side, and I also work at a consulting firm. I’ve noticed over this past year, since some people have been able to work remotely, that people have been getting Zoom fatigue, where they just get tired of being on screen all day, 9 to 5. Have there been any trends in VR that have helped prevent that Zoom fatigue? With the whole pandemic going on right now, where is that coming along in VR?

>> We’re a long way from making a dent in Zoom meetings. That’s taken over the world, and there are enormous numbers of people doing that. But the early signs we got are really positive. There are a million things that don’t measure up right now, but I think we really have something there. Right now, what we’re doing right now -- I find it easier to be talking and looking around at other people than being in a Zoom meeting. My laptop’s over there with six people or whatever in a window, and I find this significantly better: significantly more interactive, more personable, better communication.
But we have to get to a world where everybody we want to talk to has a headset. 02:30:31 We are working on ways to let people come in when they don’t have a VR headset, kind of mixing and matching. That’s going to be a long process, but I think that’s one of those things that could surprise us. We’ve got expected growth patterns for different things, but if we really keep our foot on the gas, polishing that and making the experience open to more people, I think that is something that has a decent chance of having a real jump and getting more people using it. But nobody knows; it’s all speculative at this point. It could be like so many other things where a very small group of people winds up doing it. But I have some hope that it could be a big win for us.

>> With all the developments going on with the standalone headsets and keeping the price down, that already does a lot: getting more people introduced to VR, just giving more opportunities to come in. Thank you for having me here.

>> You’re welcome. You’re muted, Nate.

>> Thank you. 02:32:03 We’ve come to the end of our broadcast. I want to thank you all for asking really great questions today. And most of all, we want to thank you, John, for spending time with us today. We hope that all of you watching enjoyed today’s Connect. We have one more amazing event coming up. You’re not going to want to miss it.
