Face-wearables - will any survive or thrive (Rift / Glass / HoloLens / Vive / Apple Vision etc)?

You can't predict that the market for a category of product that's seeing regular innovation won't tip, or won't tip soon, just because it has been around for a while and hasn't tipped yet. Nor can you predict if or when a market will tip from sales figures alone; because mass adoption tends to follow an s-curve pattern, it's expected that everything will look pretty flat until it rapidly doesn't. The consumer headset market is about eight years in (from the launch of the Vive). Eight years after the first smartphone shipped we were in 2004, three years before iPhone and still short of the left edge of this graph:

[Attached chart: US smartphone adoption over time]
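As a rough illustration of the s-curve point above, adoption is often modeled as a logistic function. A minimal sketch (the midpoint year and growth rate are made-up illustrative numbers, not fitted data):

```python
import math

def logistic_adoption(year, midpoint=2013.0, rate=0.45):
    """Fraction of the population that has adopted by `year`,
    modeled as a logistic (s-curve). Parameters are illustrative."""
    return 1.0 / (1.0 + math.exp(-rate * (year - midpoint)))

# Early years look almost flat, then the curve bends sharply upward.
for y in (2004, 2008, 2011, 2013, 2016, 2020):
    print(y, round(logistic_adoption(y), 3))
```

The "everything looks flat until it rapidly doesn't" effect falls out of the math: the curve spends years below a few percent before the knee.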
So...it took like 15-18 years or something for smartphones to "pop"? And here is the thing...it is a PHONE...things people were already buying (cellphones). It was an evolution of something people were already buying. XR headsets are a totally new thing, not building on something already being bought. Another 10 years out seems about right.
 
  • Like
Reactions: LordDaMan

lithven

Ars Tribunus Militum
1,932
Is it the particular product or the product category which hasn't found a mass market?

That's really the subject of this thread, isn't it?
It may be that the product category Apple wants (inside use, relatively stationary, mostly non-gaming) isn't capable of finding a mass market while the alternative, but very similar, markets of VR for gaming and/or AR usable out and about (such as what Meta is doing with Ray Ban) may have a mass market.

I'm not saying those alternative categories are going to be a long term success, I kind of still see them in the same vein as 3D televisions and movies with not a lot of hope for widespread adoption or long term usage even among those who buy-in. My point though is more that you may not be able to differentiate the product and the product category if Apple is the only major player pursuing that particular niche. That is unless you want to lump all HMDs together in which case Meta seems to be at the top currently.

On the other hand, if they settle for targeting a slightly expanded version of the market that Microsoft and Google tried with HoloLens and later iterations of Glass respectively, aiming at a professional audience with mostly work-specific custom software, there might be a market there. The AVP could dominate that kind of market, but it would still be a niche rather than a mass market, and based on their history I don't see Apple staying in such a limited niche.
 

ZnU

Ars Legatus Legionis
11,694
So...it took like 15-18 years or something for smartphones to "pop"? And here is the thing...it is a PHONE...things people were already buying (cellphones). It was an evolution of something people were already buying. XR headsets are a totally new thing, not building on something already being bought. Another 10 years out seems about right.

10 years to what, exactly? To top 50% adoption? Sure, that's not crazy. iPhone was pretty much perfectly timed, I think. So when I say Apple is 2-4 years early with Vision Pro, I mean we're 2-4 years from spatial computing's approximate equivalent of the point marked "iPhone Launch" on that graph. It then took an additional 5-6 years to 50% adoption and about 10 years to 75%.

But of course the industry anticipates adoption, so for the sort of people who post in computer forums for fun, it will feel like spatial computing is a big, important thing quite a lot sooner. Pretty much as soon as the adoption curve starts to unambiguously bend upward.

It may be that the product category Apple wants (inside use, relatively stationary, mostly non-gaming) isn't capable of finding a mass market while the alternative, but very similar, markets of VR for gaming and/or AR usable out and about (such as what Meta is doing with Ray Ban) may have a mass market.

Market definitions are really going to be a bit of a nightmare here. Are Quest 3 and Vision Pro the same type of device, or is that like trying to make sense of '90s PC adoption while counting the PlayStation as a desktop computer? What if Sony had really wanted the PlayStation to be used as a desktop computer, and it was technically capable of that, but consumers didn't seem interested in actually using it that way?
 
10 years to what, exactly?
I don't have a good metric. To get past niche. Of course this is from my personal perspective. Some would say they (VR/AR/XR) are beyond niche already. It doesn't seem that way to me. It seems to me that the "only" (meaning main) use is games, and even then it is a pretty small corner of the gaming world, and it doesn't seem to be doing much. I mean, my kid bought a VR set from his friend, used it a few times, and I don't think he's used it in a year or two. The metaverse seems stalled or on life support, as it doesn't seem like people are desperate for "Ready Player One" just yet. But hey... maybe I am just a crotchety old man (I'm not really that old, just mid-50s but feels older :biggreen:). Now get off my lawn with your newfangled gadgets.
 

Happysin

Ars Legatus Legionis
98,681
Subscriptor++
Maybe so, but how many people are nearsighted these days?

That rise started more with dramatic increases in childhood reading indoors. Screens did not originate our increase in myopia, though they arguably are making the trend stronger (not because screens are unique, just because children tend to be willing to stare at a screen longer than at a book). The interesting thing is that VR very likely avoids those problems by tricking your eyes into focusing on far-away things as if they were outside. The absolute distance to the screen isn't the issue; it's the focal distance your eyes are focusing at.
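The focal-distance point can be made concrete in diopters, the reciprocal of the focusing distance in meters. A small sketch (the distances are rough illustrative figures, including the assumed ~1.5 m virtual focal distance of typical headset optics):

```python
def accommodative_demand(focal_distance_m):
    """Accommodative demand in diopters: the reciprocal of the
    distance (in meters) the eye must focus at."""
    return 1.0 / focal_distance_m

# A book or phone held close forces strong near focus...
book = accommodative_demand(0.3)      # ~3.3 D
monitor = accommodative_demand(0.7)   # ~1.4 D
# ...while VR optics collimate the display so the eye focuses at a
# virtual distance on the order of 1-2 m, even though the panel sits
# a few centimeters away (the 1.5 m figure here is an assumption).
vr_virtual = accommodative_demand(1.5)  # ~0.67 D
print(book, monitor, vr_virtual)
```

The panel being centimeters from the eye is irrelevant; what matters is the optical distance the eye accommodates to.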
 

Shavano

Ars Legatus Legionis
59,253
Subscriptor
So...it took like 15-18 years or something for smartphones to "pop"? And here is the thing...it is a PHONE...things people were already buying (cellphones). It was an evolution of something people were already buying. XR headsets are a totally new thing, not building on something already being bought. Another 10 years out seems about right.
Nope, that's mischaracterizing what happened in a big way. There were really no phones we'd call a smartphone in 2005. By 2012, smartphones had half the market, so about 7 years. And what limited adoption before the iPhone was technology: touch screens big enough and responsive enough to make smartphones good enough to displace feature phones didn't exist before then.
 

ZnU

Ars Legatus Legionis
11,694
It doesn't seem that way to me. It seems to me that the "only" (meaning main) use is games, and even then it is a pretty small corner of the gaming world, and it doesn't seem to be doing much.

Quest has sold pretty well if you judge it by the standards of the game console market rather than e.g. the smartphone market: 30M units in 5 years. However, Quest 3 is being outsold by the cheaper Quest 2, now a four-year-old device. The level of price sensitivity implied by this suggests these things are mostly being bought for casual gaming. As toys, pretty much.

On the other hand, nobody was actually trying very hard to make headsets into more than this until Vision Pro. Quest's approach to XR UI is basically "What if huge virtual Android tablet?" and outside of games its ecosystem is bafflingly sparse, including with respect to very prominent consumer use cases like streaming video. So of course people are only buying it for games.

The metaverse seems stalled or on life-support as it doesn't seem like people are desperate for "Ready Player One" just yet.

The value proposition in VR social is that it should offer something more like real-life interaction, but there's still no non-pro headset with face/eye tracking, which are absolutely critical components of that. Leaving these off of consumer headsets is another baffling omission on Meta's part, given their strategic goals.
 

Horatio

Ars Legatus Legionis
24,069
Moderator
So of course people are only buying it for games.
Zuckerberg stated that the most popular apps are social apps, but I'm guessing that's because there are so many games that even the most popular ones (except maybe Beat Saber, but I think that's fitness, not games) don't hold significant usage share.
Quest's approach to XR UI is basically "What if huge virtual Android tablet?"
Sure, but without passthrough there's little point to anything else. I think this will change (well, they said it will) once the new cheaper leaked MR device is on the market and they can sunset Quest 2.
 

wco81

Ars Legatus Legionis
28,661
Meta hasn't said so, but for now they're prioritizing ad revenue and AI.

They're not going to ever say they're abandoning the metaverse -- or at least not for a long time, and maybe only after Zuckerberg is no longer in charge.

They'll keep making HMDs, but it will be interesting to see how their capital is deployed going forward, metaverse versus AI.
 
Eye strain is real, but the whole "you'll go blind sitting next to the TV" thing is an old wives' tale.
My dad worked for Zenith Electronics for 50 years and this is NOT an old wives' tale.

HOWEVER, it IS more relevant to CRT and standard definition than modern LCD screens.

The old rule of thumb was that to avoid eye strain you should sit ~7' away from a 27" CRT NTSC screen, and adjust forward for smaller and back for larger.

My dad actually would yell at us for being too close.
 
Quest has sold pretty well if you judge it by the standards of the game console market rather than e.g. the smartphone market: 30M units in 5 years. However, Quest 3 is being outsold by the cheaper Quest 2, now a four-year-old device. The level of price sensitivity implied by this suggests these things are mostly being bought for casual gaming. As toys, pretty much.

On the other hand, nobody was actually trying very hard to make headsets into more than this until Vision Pro. Quest's approach to XR UI is basically "What if huge virtual Android tablet?" and outside of games its ecosystem is bafflingly sparse, including with respect to very prominent consumer use cases like streaming video. So of course people are only buying it for games.



The value proposition in VR social is that it should offer something more like real-life interaction, but there's still no non-pro headset with face/eye tracking, which are absolutely critical components of that. Leaving these off of consumer headsets is another baffling omission on Meta's part, given their strategic goals.
Ahem, Microsoft certainly had a vision beyond gaming... though they went hard into technical use cases rather than the office worker, which is a bunch of what's now talked about with Vision Pro.
 

whm2074

Ars Centurion
459
Subscriptor
My dad worked for Zenith Electronics for 50 years and this is NOT an old wives' tale.

HOWEVER, it IS more relevant to CRT and standard definition than modern LCD screens.

The old rule of thumb was that to avoid eye strain you should sit ~7' away from a 27" CRT NTSC screen, and adjust forward for smaller and back for larger.

My dad actually would yell at us for being too close.
I was yelled at about that when I was a kid in the '80s. I have hearing loss, so it was a constant thing.
 
For all you know he was really just on you about blocking his view of the screen. In the '80s and '90s we plopped a CRT on the desks of tens of millions of workers, and people didn't start going blind at an unusual rate.
No, those CRTs were
1: Much smaller and
2: Progressive scan generally (early on not so much, and rare occasions of cheap badness later).

NTSC at 525 lines interlaced was shit on your eyes that close. Basically the rule was you needed to sit far enough away to not see the scan lines and especially not the individual RGB dots. Any closer and it was super bad for your eyes.
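That "far enough not to see the scan lines" rule can be sketched with basic angular-resolution arithmetic, assuming ~480 visible NTSC lines on a 4:3 screen and a roughly one-arcminute eye resolution limit (both rough textbook figures, not Zenith's actual guidance):

```python
import math

def min_viewing_distance_in(diagonal_in, visible_lines=480,
                            eye_resolution_arcmin=1.0):
    """Distance (inches) at which one scan line subtends the eye's
    resolution limit; any closer and the lines become visible."""
    height_in = diagonal_in * 3.0 / 5.0        # 4:3 screen: height = 3/5 of diagonal
    line_pitch_in = height_in / visible_lines  # vertical extent of one scan line
    theta_rad = math.radians(eye_resolution_arcmin / 60.0)
    return line_pitch_in / math.tan(theta_rad)

d = min_viewing_distance_in(27)
print(f"27-inch NTSC set: ~{d / 12:.1f} ft")  # ~9.7 ft, a bit stricter than the ~7 ft rule
```

The one-arcminute criterion comes out somewhat more conservative than the ~7 ft rule of thumb, which fits: people tolerate scan lines slightly above the resolution limit.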

People didn't go blind, but eye strain did increase significantly. There's a reason we now sell glasses specifically designed for computer workers.

My dad only watched the news and Star Trek... which was pretty ironic given how much television R&D he had his hand in.
 
Last edited:
  • Love
Reactions: whm2074
That rise started more with dramatic increases in childhood reading indoors. Screens did not originate our increase in myopia, though they arguably are making the trend stronger (not because screens are unique, just because children tend to be willing to stare at a screen longer than at a book). The interesting thing is that VR very likely avoids those problems by tricking your eyes into focusing on far-away things as if they were outside. The absolute distance to the screen isn't the issue; it's the focal distance your eyes are focusing at.

It's not so much reading as the lack of outdoor time. Bright outdoor light actually seems to moderate the growth of the eyeball (myopic eyes grow too long).
 

ZnU

Ars Legatus Legionis
11,694
Sure, but without passthrough there's little point to anything else. I think this will change (well, they said it will) once the new cheaper leaked MR device is in market and they can sunset Quest 2

Meta could have done free placement of windows, multiple volumes controlled by different apps in a single space, a general-purpose UI framework for spatial apps, etc. all without passthrough. Anyway, Quest Pro had passthrough and was pitched as a productivity device. Why didn't they get a little more ambitious with UI for that?

It's not just the system UI either. Look at Horizon Workrooms. In productivity contexts, Meta seems to have understood XR almost solely as a substitute for physical presence. You use it to put a bunch of people in the same virtual room. Great. Then, how do these people perform computing tasks? On virtualized versions of bounded 2D screens:

[Screenshot: Horizon Workrooms, participants working on virtualized 2D screens]

It barely seems to have occurred to Meta that XR offered new possibilities for UI.
 

Horatio

Ars Legatus Legionis
24,069
Moderator
Meta could have done free placement of windows, multiple volumes controlled by different apps in a single space, a general-purpose UI framework for spatial apps, etc. all without passthrough.
It's been a while, but pre-passthrough, did Quest even have the notion of spaces? I don't even know if Immersed et al. existed back then. Since Immersed did it, they could have done something similar, rather than the 3-panel cylinder (did that even exist pre-passthrough either?)
Anyway, Quest Pro had passthrough and was pitched as a productivity device. Why didn't they get a little more ambitious with UI for that?
I'm guessing it was deemed not worth it to make the system UI different than the Q2, and let the apps handle more ambitious UI. Also, the QP passthrough is godawful.
Look at Horizon Workrooms.
I've never done this (and I bet no one else has either ;) )
Then, how do these people perform computing tasks? On virtualized versions of bounded 2D screens:
Yeah, because it's probably being remoted from a laptop. In terms of notetaking, I guess you could float it, but I dunno, I don't do VR meetings, and I don't intend to.
It barely seems to have occurred to Meta that XR offered new possibilities for UI.
It's also possible they tried a bunch of stuff, and none of it landed with real humans. I have by now seen in the neighborhood of hundreds of XR UI prototypes, and yet everything I work on for "real" UI is 2D flat panels.
 

Shavano

Ars Legatus Legionis
59,253
Subscriptor
No, those CRTs were
1: Much smaller and
2: Progressive scan generally (early on not so much, and rare occasions of cheap badness later).

NTSC at 525 lines interlaced was shit on your eyes that close. Basically the rule was you needed to sit far enough away to not see the scan lines and especially not the individual RGB dots. Any closer and it was super bad for your eyes.

People didn't go blind, but eye strain did increase significantly. There's a reason we now sell glasses specifically designed for computer workers.
Yeah, age-related hyperopia is a bitch.
My dad only watched the news and Star Trek... which was pretty ironic given how much television R&D he had his hand in.
 

Shavano

Ars Legatus Legionis
59,253
Subscriptor
It's been a while, but pre-passthrough, did Quest even have the notion of spaces? I don't even know if Immersed et al. existed back then. Since Immersed did it, they could have done something similar, rather than the 3-panel cylinder (did that even exist pre-passthrough either?)

I'm guessing it was deemed not worth it to make the system UI different than the Q2, and let the apps handle more ambitious UI. Also, the QP passthrough is godawful.

I've never done this (and I bet no one else has either ;) )

Yeah, because it's probably being remoted from a laptop. In terms of notetaking, I guess you could float it, but I dunno, I don't do VR meetings, and I don't intend to.

It's also possible they tried a bunch of stuff, and none of it landed with real humans. I have by now seen in the neighborhood of hundreds of XR UI prototypes, and yet everything I work on for "real" UI is 2D flat panels.
I'm somewhat pessimistic that there will ever be anything worth wearing on my head for real work, at least as a standalone device. Nothing exists yet for an input device that's as good as a physical keyboard and mouse, let alone a physical keyboard and stylus. I suspect nothing CAN exist that you wear on your head for that. The Vision Pro works fine, maybe much better than fine, when paired with a Mac that you're using to capture physical input. But as long as the input capability is as limited as it is, when used standalone it's not offering that much more than an iPhone. Which is not to fault Apple; they've done as well as anybody, and in some ways better, but they're up against the inherent limitations of a device you wear on your head, just like everybody else is.

Here's what I think needs to be changed (might have been added already; I haven't been following this much):
(a) cast a rectangle of whatever you're seeing to any external screen (I'd be surprised if there isn't something like this under the hood that they're using for development but that might not be under user control)
(b) improved virtual keyboard function
(c) ditch the creepy eye projection
(d) improve the tools for creation and manipulation of virtual objects. You can leave that to the application developers, but if you do you're going to get a mishmash of UIs that differ in how you'll use your hands to create, pick up, put down, rotate, place, and resize virtual objects, and that will slow adoption and the development of applications that need these functions, which includes everything from design software to games to AR.
 

Horatio

Ars Legatus Legionis
24,069
Moderator
Per Mark Gurman

A fall 2026 release seems too soon, since the only strong direction they've received is that it's too heavy/uncomfortable, the FOV could be better, and passthrough blur is bad. Well, and cost, of course.
 

ZnU

Ars Legatus Legionis
11,694
It's also possible they tried a bunch of stuff, and none of it landed with real humans. I have seen by now, in the neighborhood of hundreds of XR UI prototypes, and yet, everything I work on for "real" UI is 2D flat panels.

You'll have a lot of 2D windows, sure. Humans have put a lot of effort into figuring out how to usefully represent information in 2D, not just over the last ~40 years of 2D computer UI, but technically over all the millennia since the invention of writing. But why restrict your 2D windows to bounded virtual screens? That just feels like lazy, unquestioning imitation of physical limitations that don't apply to virtual spaces.

Also, there likely are worthwhile opportunities to present more types of information in 3D now that headsets make this easy. It's going to take time to discover these, and they may not immediately click for users with a lifetime of experience working with information exclusively in 2D, but why not at least build the infrastructure to allow developers to start exploring this? Apple has a good start here with RealityKit/SwiftUI integration, though they're going to need to offer a lot more friendly, high-level functionality on the 3D side to get regular app developers to engage with it. We'll see how Meta responds.

I'm somewhat pessimistic that there will ever be anything worth wearing on my head for real work, at least as a standalone device. Nothing exists yet for an input device that's as good as a physical keyboard and mouse, let alone a physical keyboard and stylus. I suspect nothing CAN exist that you wear on your head for that. The Vision Pro works fine, maybe much better than fine, when paired with a Mac that you're using to capture physical input.

Keyboards and trackpads can be paired directly with Vision Pro, with no Mac in the loop. Mice can't be, for some reason, but presumably that's coming.

Big picture though, human-computer interaction is set up for the biggest shift since the GUI. Vision-language-action models are going to abstract many interactions from sequences of manual low-level user actions (clicks, keystrokes) into higher-level natural language requests. A glance at Apple's ML research shows they're working on multiple aspects of enabling tech for this, a notable recent example being this paper on UI understanding. This may substantially reduce the number of cases where precise, direct user input is required.
 
  • Like
Reactions: TheGnome
Nope, that's mischaracterizing what happened in a big way. There were really no phones we'd call a smartphone in 2005. By 2012, smartphones had half the market, so about 7 years. And what limited adoption before the iPhone was technology: touch screens big enough and responsive enough to make smartphones good enough to displace feature phones didn't exist before then.
I was just using the timing ZnU gave.
 
You can leave that to the application developers, but if you do you're going to get a mishmash of UIs that differ in how you'll use your hands to create, pick up, put down, rotate, place, and resize virtual objects, and that will slow adoption and the development of applications that need these functions, which includes everything from design software to games to AR.
Makes me think of Minority Report. Part of the problem with VR and motion capture is feedback: you get no good sense of when you are touching something.
 
I'm somewhat pessimistic that there will ever be anything worth wearing on my head for real work, at least as a standalone device. Nothing exists yet for an input device that's as good as a physical keyboard and mouse, let alone a physical keyboard and stylus. I suspect nothing CAN exist that you wear on your head for that. The Vision Pro works fine, maybe much better than fine, when paired with a Mac that you're using to capture physical input. But as long as the input capability is as limited as it is, when used standalone it's not offering that much more than an iPhone. Which is not to fault Apple; they've done as well as anybody, and in some ways better, but they're up against the inherent limitations of a device you wear on your head, just like everybody else is.

Here's what I think needs to be changed (might have been added already; I haven't been following this much)
(a) cast a rectangle of whatever you're seeing to any external screen (I'd be surprised if there isn't something like this under the hood that they're using for development but that might not be under user control)
(b) improved virtual keyboard function
(c) ditch the creepy eye projection
(d) improve the tools for creation and manipulation of virtual objects. You can leave that to the application developers, but if you do you're going to get a mishmash of UIs that differ in how you'll use your hands to create, pick up, put down, rotate, place, and resize virtual objects, and that will slow adoption and the development of applications that need these functions, which includes everything from design software to games to AR.
Go back some years to the 5th generation of game consoles. You had this new 3D environment, people were still learning that everything was really 3D now, and developers were trying to figure out how to get a controller to navigate it. A lot of game developers tried a lot of things. Some, like Nintendo, had a stick control the viewpoint of a camera that rotated around the playable character, which could lead to viewing angles where you couldn't really tell what was happening. Eventually everyone figured out, through some trial and error, that the best thing was a camera fixed behind the main character, with the second stick controlling where that camera points, i.e. what you are looking at.

The point is that Apple = Nintendo. Maybe Apple's way is the best way, maybe not. You just need a lot of different attempts at things to decide what's the best way forward. This is never going to be done by Apple, as they are super conservative with GUI design and almost never do anything new.
 

Horatio

Ars Legatus Legionis
24,069
Moderator
  • Like
Reactions: Nevarre

Exordium01

Ars Praefectus
3,977
Subscriptor
I'd be more worried about it being a worse device with fewer use cases, but then again, I'm not really all that bullish on VR in the short term and think the market would be better served by continuing to push the boundaries in a less cost-sensitive manner. Once there's a real killer application, they can figure out what corners to cut.
 

Happysin

Ars Legatus Legionis
98,681
Subscriptor++
I'd be more worried about it being a worse device with fewer use cases, but then again, I'm not really all that bullish on VR in the short term and think the market would be better served by continuing to push the boundaries in a less cost-sensitive manner. Once there's a real killer application, they can figure out what corners to cut.
Seems to me that Apple Vision Pro + 24 months = Apple Vision Air + price cut.