At $3,500, Apple’s latest AR/VR headset was destined to be a hard sell for the average buyer. But we’re not here to discuss the mass adoption potential of AR, MR, or VR. Instead, let’s take a step back and digest the possibilities that the Apple Vision Pro and counterparts like the HoloLens and Meta Quest have demonstrated through their mere existence.
Despite what sceptics may say, the Apple Vision Pro does mark a technological breakthrough. Watching videos on a laptop or phone screen is one thing, but having displays fill your entire field of vision lets you relive another person’s experiences as if they were your own. The new experiences unlocked by the Apple Vision Pro are not too dissimilar from when monochrome film first gained colour, or when sound could finally play over previously silent films.
But what does this mean for both consumers and creators of media? Will augmented reality headsets change the way we record, edit, and enjoy movies, and could this development even revive the now somewhat forgotten metaverse?
Potential use cases of AR/MR headsets go far beyond the film industry, thanks largely to their controls and computing capacity. But before delving into details, it’s important to first understand how they differ from virtual reality headsets such as the Oculus series.
As the name suggests, virtual reality immerses the user in a 100% digital environment, blocking out the physical world entirely. This offers a higher level of immersion by allowing the senses to perceive an entirely new environment. Mixed reality and augmented reality, on the other hand, overlay digital information and objects onto the real world. AR/MR users remain aware of their physical surroundings, and can even use digital enhancements to interact with or learn more about the objects around them.
Lag has been one of the biggest challenges AR/MR developers have had to overcome to create seamless experiences that let users suspend their disbelief. While similar headsets, and even everyday devices such as smartphones and laptops, regularly suffer from lag, Apple Vision Pro users have been lauding the device for latency so low it is effectively imperceptible, allowing displays to move along with their vision in real time.
Apart from reducing cybersickness, the near-instant response enabled by the R1 chip also flings the door open for AR/MR headsets to be used in higher-risk scenarios that demand quick reactions. Training for military personnel, athletes, and first responders immediately comes to mind, with these devices making it possible for new staff to receive accurate, real-time information about the scenario and the individuals they are dealing with.
Realism is taken to a whole new level with a depth of vision that blows rectangular TikTok and Instagram reels out of the water. Viewers can interact with photos and videos as if they were actually present in the moment: zooming in doesn’t just enlarge the display, it lets users explore recordings three-dimensionally.
This feature could revolutionise journalism and the way people record their most precious moments. It could also break down barriers to communication and increase empathy by letting users see perspectives they would never otherwise experience.
The Apple Vision Pro set a new standard for UI by getting rid of hardware controllers altogether. Instead, users manipulate displays through hand and eye gestures. Single-handed pinching, double pinching, and pinch-and-hold execute actions equivalent to smartphone screen taps, while two-handed gestures allow for zoom and rotation. Perhaps the most impressive aspect is that hand gestures remain effective even with the user’s hands resting in their lap, making the headset more accessible to individuals with disabilities.
These gestures work in tandem with eye tracking, enabling users to select on-screen elements simply by looking at them. Once selected, digital elements can be manipulated using the aforementioned gestures. The system’s low latency also makes it possible to overlay digital elements onto the surrounding environment effortlessly. Professionals in real estate or interior design, for instance, could rearrange furnishings and building layouts to preview how they might look in real life, greatly enhancing the process and preventing costly missteps before the execution stage.
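For developers curious about how this gaze-and-pinch selection reaches an app, here is a minimal sketch in SwiftUI. The view name and the rotate-by-90° behaviour are purely hypothetical; the point is simply that on visionOS the system translates “look at a control, then pinch” into an ordinary tap, so a standard SwiftUI button responds without any custom gesture code.

```swift
import SwiftUI

// Hypothetical example: a small control panel for previewing a piece of
// furniture in a room. On visionOS, looking at the button and pinching
// is delivered to SwiftUI as a normal tap, so no gesture handling is needed.
struct FurniturePreviewControls: View {
    @State private var rotationDegrees: Double = 0

    var body: some View {
        VStack(spacing: 12) {
            Text("Sofa rotation: \(Int(rotationDegrees))°")

            // Selected by gaze, activated by a pinch.
            Button("Rotate 90°") {
                rotationDegrees = (rotationDegrees + 90)
                    .truncatingRemainder(dividingBy: 360)
            }
        }
        .padding()
    }
}
```

In other words, the interaction model largely reuses the touch paradigm developers already know, which is part of why the controller-free approach feels so natural.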
Not to be left behind by artificial intelligence (AI), the Apple Vision Pro circumvents an obvious limitation of video calling while wearing a headset: the device covers much of the wearer’s face, so AI fills in the gaps and creates a virtual avatar whenever users join video calls. This is made possible by an advanced encoder-decoder neural network, which Apple says has been trained on a diverse group of thousands of individuals.
While not a 100% replica, these digital personas are realistic enough for users to enjoy video calls, and could conceivably serve as digital representations of their owners in the metaverse or equivalent digital spaces.
With so much mention of the Apple Vision Pro, you’d think we were earning commission on its sales. To be clear, the only reason it comes up so often is that many of the aforementioned features are currently exclusive to the Apple Vision Pro. But for all its innovations, the headset still has glaring limitations that keep it from fulfilling its full potential, something competitors within the AR/MR market will no doubt work on improving in the coming years.
While one could technically wear AR/MR headsets outside, it’s easy to see why many would be apprehensive about donning $3,500 headgear that weighs about a third as much as a full-face helmet in public. In their current form, the weight and bulk of AR/VR headsets also make them awkward to carry around. Unless developers can find a way to create collapsible headsets, as has been done with modern-day cameras and stands, using mixed reality headsets on film sets may prove too much of a hassle for many.
But even if you had the neck strength to carry these hefty headsets around, you might not get much use out of them without access to charging sockets on site. One detail that proponents often leave out is that the Apple Vision Pro draws its power from a wired external battery pack that lasts only around two hours.
Unless manufacturers can find a less clunky way to power these supercomputers, batteries and charging cables will likely limit the use cases of mixed reality headsets, particularly in situations where there’s a risk of cable entanglement. Having to make sure a bulky battery pack doesn’t slip out of a bag or pocket will likely interfere with the immersive experience, too.
All of the Apple Vision Pro’s shortcomings don’t take away from the fact that this might be the closest humanity has ever been to full dive reality. For the uninitiated, full dive reality is a lot like the movie Ready Player One, where players are fully immersed in a world in terms of sight, touch, and sound. While the device is marketed as AR/MR, full immersion is still possible thanks to a dial that adjusts how much of the surroundings is blocked out.
The combination of VR-style immersion with AI-generated avatars may soon see Apple Vision Pro users inhabiting a metaverse significantly different from the one envisioned by tech giant Meta. Interestingly, Meta’s stock spent most of 2022 in a downward spiral, with the metaverse still weighing the company down despite an 11% recovery this year. In contrast, Apple’s performance has been nothing short of impressive, crossing the $3 trillion market cap. While a Meta-owned metaverse looks less likely by the day, the idea of an alternative owned by Apple or another player is far from implausible.
Sceptics will point to Microsoft’s January 2023 layoffs of its Industrial Metaverse Core team as proof of the metaverse’s infeasibility. But it’s easy to forget that the first half of 2023 was largely uncharted territory for the once ‘recession-proof’ tech industry, which experienced layoffs for the first time in years.
Microsoft was just one of many companies that conducted mass layoffs in the first half of this year, and those actions are not indicative of its long-term faith in the metaverse. Rather, tech companies were, and still are, playing the waiting game, watching to see whether others will do the heavy lifting of metaverse development before joining the fray. If anything, the Apple Vision Pro could reignite mass interest in virtual spaces.
Impressive as the technology is, creating avatars from face scans can easily be exploited by bad actors. The last thing users want is digital replicas of themselves saying or doing things they would never do in person.
This threat could give NFTs a legitimate use case: letting users own and control their digital likeness. Even Apple gave the green light to NFTs last fall after years of resistance. While the move drew criticism from the crypto community for imposing a whopping 30% commission on NFT-related transactions, the reality is that the backing of a tech giant like Apple will no doubt inject some much-needed credibility back into the space.
We might still be a couple of years away from full dive reality, but you don’t need to wait that long to start working on projects in virtual reality, mixed reality, or the metaverse. Early sign-ups to the Freelancer Nation network will receive their very own NFT access pass and endless opportunities to experiment with and showcase their best work on Web3 and other platforms of the future.
To stay up to date with the latest developments on the creative scene, subscribe to our Freelancer Nation newsletter for monthly updates on the latest social media trends and workshops specially curated to help creative freelancers supercharge their careers.
Subscribe to the Freelancer Nation Newsletter