
What does Apple's Vision Pro mean for the metaverse?

Pic: Apple

Apple's stunning new AR headset has taken us several years into the future. What will that future look like? Apple didn't mention the metaverse, but Vision Pro and the metaverse are made for each other.

Whatever you may think of Apple’s new Vision Pro headset, there’s no doubt it grabbed a lot of headlines when it launched at the company’s WWDC 2023 event earlier this week. Despite its undoubted technical prowess, however, opinion about it is divided. RedShark is a fairly broad church when it comes to things like this, and our writers have differing thoughts on Vision Pro and its impact. So, given its (probable) importance, over the next few days we’ll look at both sides of what you can characterise either as a coin or as one of the most expensive poker chips Cupertino has ever gambled. First up, David Shapton with the optimistic point of view.

You could be forgiven for thinking the metaverse was last year's tech obsession. Meta (formerly Facebook) has made such a leaden attempt at a consumer metaverse that it deserved to be consigned to obscurity. The social media company can hopefully come back with something better, but like so many other tech companies, AI is probably occupying 110% of its attention right now.

But Meta doesn't own the metaverse. Nobody does. If they did, it wouldn't be the metaverse. Nor is Meta's consumer, games-oriented vision the core purpose of the new digital space.

What is the metaverse? There's no single definition, but the one I like is "a digital overlay above the physical world". That's my own definition, and it embraces much more than a shared 3D world. More on that later. First, let's talk about Vision Pro.

Meeting high expectations

My expectations were high. Apple would never release something this important as a half-baked concept.

But what the Cupertino tech giant revealed went beyond anything I'd imagined. I'm not being hyperbolic here: the Vision Pro is a tour-de-force of technology. But that's not what you'd conclude if you read the Twitter trolls. 

"At that price, it's just a joke."

No, it's niche and unaffordable for the majority at that price, but when you take on board the radical innovation at work here, it's surprising it's not $10,000. Apple's new device is about as far as you can get from a clunky headset with two small, low-resolution video screens. The attention to detail is incredible. Except that they're not merely details: they're essentials if "Spatial Computing" is going to gain traction. One of the most eye-grabbing (literally) is seeing the user's eyes through the screen, except that you don't, because the screen is opaque. Instead, you see a moving avatar view of their eyes, made from an initial face scan and driven by real-time eye-tracking sensors. During total immersion, the eyes aren't visible, and there's a Siri-like visualisation on the front screen instead.

And remember that home cinema enthusiasts routinely spend tens of thousands of dollars on screens and audio systems. But you'll probably get a better experience from Vision Pro. To an enthusiast, the cost is probably not out of the question. 

"It's just a toy."

If that were true, then the Apple Watch would be a toy, the Mac Pro, which can process twenty-two parallel streams of 8K ProRes video, would be a toy, and the iPhone - the mobile handset that's more powerful than most laptops - would be a toy.

"It doesn't have a killer app."

I think the device itself is the killer app. It's the first truly viable - and desirable - interface between a human and a genuinely "liveable" digital overlay (or Virtual Reality), and it can mix VR, AR and MR.

Vision Pro is remarkable because all its elements work together to create a whole greater than the sum of its parts. The impression of a different reality is almost an emergent property - and who knows what else might emerge when a complex ecosystem of apps fully uses the sensors and the display, and when Vision Pro users interact with each other.

The cameras, the motion sensors, and the 2x 4K screens are essential for a credible experience. I haven't tried it - only 30 people have as I write this - but they've all said that, in some way, it was better than expected or the best thing they've tried. Most talk about the resolution. Some talk about how you can fade in and fade out of "virtuality". Thanks to the outward-looking cameras, you can even read your Apple Watch or phone. Others talk about eye tracking and the accessory-free user interface. It's all gestures! You could also mention that this headset contains a pretty powerful computer. 

I was talking to Simon Wyndham (Senior RedShark Contributor) about it. He didn't say it was "another iPhone moment", but he did say that when the first iPhone came out, it wasn't clear what it was for. There was no app store. The camera was underwhelming. There was no video to speak of. And yet, smartphones are most people's primary computing platform. 

A lack of latency

What's the most impressive thing about Vision Pro? 

It's latency - or rather, the complete lack of it. Vision Pro solves the biggest issue with HMDs: lagginess. With other products, it's bad enough to make people feel sick. Latency isn't just something you can specify in a setup file (latency = 0); it doesn't work like that. You can only get rid of latency through great design, careful engineering, and brute-force processing fast enough to carry out every calculation in the space between frames.
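To put "the space between frames" into perspective, here's a rough back-of-envelope sketch. The refresh rates and the roughly 20 ms motion-to-photon threshold are my own illustrative assumptions, not figures Apple has published for Vision Pro.

```swift
// Back-of-envelope sketch: at a given display refresh rate, every sensor read,
// head-pose update and re-render has to fit inside one frame interval, or the
// image visibly lags behind your head movement.
import Foundation

func frameBudget(refreshHz: Double) -> Double {
    // Time available between frames, in milliseconds
    return 1000.0 / refreshHz
}

for hz in [60.0, 90.0, 120.0] {
    let ms = frameBudget(refreshHz: hz)
    print(String(format: "%.0f Hz display: %.1f ms to sense, track and render both eyes", hz, ms))
}

// Researchers often cite something in the region of 20 ms motion-to-photon as
// the point where people start to notice lag and feel queasy - an illustrative
// figure here, not an Apple specification.
```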

That's a big ask for a portable computer. Before Apple's reveal, I would have said it was impossible at this resolution. That it is possible is perhaps the biggest thing to take away from this product launch. 

Because I believe that "spatial computing" - the phrase Apple uses to describe its new platform - is about much more than a virtual reality headset. It's about the future and our relationship with it.

Just think for a minute about which other scenarios could use these capabilities. I'm pretty sure autonomous cars could. It's even possible to envisage cars without windows, with high-resolution screens instead. You could choose the scenery you'd rather see on your way to work. Swiss mountains? No problem. A rainforest? Easy. These would be generated in real time by a games engine, and the Apple tech would lock the imagery to every tiny movement of the car. Or maybe these headsets will be cheap enough in a few years for everyone to wear them - like glasses. And at that point, you wouldn't need video screens in the windows.

And that's not to mention the autonomous car itself, which would be driven by this technology with great precision, thanks to all the sensors and processing, and, again, absolutely no lag. 

Rise of the robots

And then there's something that may affect our future more than any other technological development, which would use a combination of Apple's technology and AI, another tech phenomenon that has kept us awake at night for the last six months: robots. 

People don't know what Vision Pro is for because the future in which it is likely to be a central platform doesn't exist yet. But it will, and now it will be shaped by Apple's new technology.

And knitting all these developments together will be the metaverse: a real-time digital overlay on top of the natural world. Back again, in glorious detail, without lag, and without us having to make a binary choice about being inside it or outside it. Where we can come and go as we please, and it will seem the most natural thing in the world.

Tags: Technology VR & AR The Metaverse
