From what I understand the thing isn’t see-through and the eyes are actually projected on the outside. Can somebody explain why they had to add tech to do it?
deleted by creator
What do you mean? They added the outside screen to a VR headset to try to make it more acceptable to wear around others.
Meme or actually?
Actually
Because there are screens in the way? The choice was either to not have the wearer’s eyes be visible, or to use a screen to display eyes (not even real eyes, you can supposedly have cat eyes, for example). Considering the device is meant to be AR (augmented reality) and not VR, it kinda makes sense to show the user’s eyes since they’re still “connected” to the outside world. Otherwise you’d have a bunch of blank visors walking around and then people can’t tell if you’re looking at them or your furry waifu.
You know how Microsoft solved this problem?
With glass.
Then go and buy Microsoft’s product. Nobody forces you to get a Vision Pro
Lmao look at this bozo defending the shittiest Apple product.
What? Where did I defend the Vision Pro? I just don’t get why people get so emotionally invested about something nobody ever forces them to buy or use.
That was Google…
Microsoft HoloLens (glass and transparent screen) and Google Glass (tiny screen)
Google had Glass. Windows Mixed Reality used glass. The material. Like a window.
And it sucked: the FOV of the augmented area was tiny, the projected images were see-through, and you still couldn’t really see the person’s eyes because of the tinted glass. VR headsets with cameras are currently by far the best way to do AR.
You actually remove your eyes before inserting the optical couplers into your sockets. You put your eyes in the storage compartment on the front, giving the appearance that you’re looking out through the device.
So they can sell you custom eyes like cats and aliens and shit.
Maybe they think it makes you look less stupid.
They’re having Tesla-truck levels of success with that, then.
Imagine you’re sitting in a restaurant waiting for the waiter while doing some work on your Vision Pro. The waiter shows up and says ‘sir…’. You look at him and… there are two options:
it’s just a black screen, so it’s not clear if you’re actually looking at him. Are you paying attention? Or are you still ‘inside’ and can’t hear/see anyone
you have these fake eyes indicating that you’re actually looking at him
It’s a really stupid “solution” to a huge problem all VR/AR has. The actual solution? Don’t buy it.
I think this is kind of a temporary workaround. In Apple’s ideal world, the Vision Pro would actually be transparent and you could see the user’s eyes for real, but the tech isn’t ready to project what Apple is doing onto glasses. So they settled for a VR headset and put eyes on the outside. Eventually, in however many years it takes, they will actually use glasses and won’t have to put a screen on the outside. They must believe that being able to see Vision Pro users’ eyes is integral to the product, or at least important to the product being accepted by everyone.
Achieving realistic, fast camera passthrough on both sides is harder than you think
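Rough back-of-envelope on just the latency side (every number below is my own guess for illustration, not anything Apple or Meta has published), to show how little slack there is before the passthrough image starts lagging behind your head:

```python
# Hypothetical motion-to-photon budget for camera passthrough.
# All stage timings are illustrative guesses, not measured specs.

stages_ms = {
    "camera exposure + readout": 8.0,      # sensor capture time per frame
    "ISP (debayer, noise, color)": 3.0,    # raw sensor data -> usable image
    "lens undistortion + reprojection": 2.0,
    "compositing UI over the feed": 2.0,
    "wait for next display refresh": 5.5,  # ~half a 90 Hz frame on average
}

total = sum(stages_ms.values())
target = 20.0  # commonly cited comfort threshold for motion-to-photon latency

for stage, ms in stages_ms.items():
    print(f"{stage:34s} {ms:5.1f} ms")
print(f"{'total':34s} {total:5.1f} ms  (comfort target: ~{target:.0f} ms)")
```

And that’s only the world-to-wearer direction; the eyes-on-the-outside display has to be driven from eye tracking in real time too, which is presumably part of why it comes out looking as rough as reviewers say it does.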
That’s why we’ve been stuck with windows for centuries.
Have you tried Linux? /s
Yes, that’s my point. Why? Why make it even more complicated and more expensive for no good benefit?
For no good benefit? Try comparing the display to a HoloLens 2. There’s no current display technology that’s cheaper and lets you see through it while projecting light at the same intensity. You can look it up.
I think they’re asking why eyes need to be projected on the outside.
Or anything for that matter.
All I’d want is “Go away. Gaming.” But a Post-It would do just fine. Hell, I’d prefer googly eyes over my own projected; that’d be way cooler and more useful.
I accomplished everything I need by taping a piece of paper with sharpie eyes on my Quest 2 and it cost me $0 to do so!
I’ll point to someone down this thread about eye contact in that case. It’s not like it cost much though; reviewers have noted that EyeSight’s display quality is quite horrible, and it seems like all this feature added was a small screen
Heavier, too. It’s about as heavy as the competitors despite having a separate battery.
It’s not necessary to have the external screen.
The Quest has passthrough cameras to allow you to see the world with stuff displayed over it too, but Apple has decided that simulating eye contact is important.
It’s Apple’s unique selling point here, but they’d have what sounds like a high-quality headset without it.
To allow eye contact for social interactions. If you want ubiquitous AR in real life, that is what you need. This is an attempt to achieve it with current technology, and it “almost” works: a near miss that fails spectacularly.
So they could have stopped at many points but decided humanity must suffer
I mean, if the price tag isn’t going to dissuade you…
And the eyes are not the wearer’s eyes. They are just digital eyes.