I left the headline as in the original, but I see this as a massive win for Apple. The device is ridiculously expensive and isn’t even on sale yet, and it already has 150 apps specifically designed for it.

If Google did this, it wouldn’t get 150 dedicated apps even years after launch (and its guaranteed demise), even if it were something super cheap, like being made of fucking cardboard.

This is something that, as an Android user, I envy a lot about the Apple ecosystem.

Apple: this is a new feature => devs implement it in their apps the very next day, even if it launches officially in six months.

Google: this is a new feature => devs ignore it; apps only start to support it 5–6 Android versions later.

  • Zworf@beehaw.org
    11 months ago

    Agreed, I worry about this too. The Quest uses a similar gesture with hand tracking (finger pinching to click) and it feels really frustrating compared to the much more direct feel you get with the included controllers.

    With the Apple device you don’t even have controllers available if you want them, so gesture tracking must work perfectly. Apple does have a lot of experience in getting stuff like that just right, but I really wonder whether eye tracking + pinching stays comfortable for hours.

    • nicetriangle@kbin.social
      11 months ago

      Supposedly the gestures are one thing they did a really solid job on, based on the demo recaps I’ve watched. And the eye tracking supposedly works quite well for switching focus state. The main complaint I’ve heard is that the virtual keyboard sucks.

      I’ll be really interested to see more in-depth reviews when they start coming out.

      • Zworf@beehaw.org
        11 months ago

        > The main complaint I’ve heard is that the virtual keyboard sucks.

        Yeah, that I can imagine. I think it would be really annoying and exhausting to have to type by looking at the letters. That’s how you control the mouse pointer, right?

        But I really hope I can see it for real some day.

        • nicetriangle@kbin.social
          11 months ago

          Here’s what that Mark Gurman dude (Apple/Tech journalist for Bloomberg) tweeted about it:

          > The Vision Pro virtual keyboard is a complete write-off at least in 1.0. You have to poke each key one finger at a time like you did before you learned how to type. There is no magical in-air typing. You can also look at a character and pinch. You’ll want a Bluetooth keyboard.

          So it sounds like it’s either poke or look + pinch, and both options suck for a keyboard. I just think a virtual keyboard is a very difficult problem to solve, for several reasons, which is why every attempt at one thus far has been shit.

          And that’s kinda the whole problem with VR/MR. It’s some of the absolute hardest computing and optical and battery hardware and UI challenges we can find, all bundled into one product. It’s just an incredibly steep task and a lot of the solves aren’t even really a matter of “oh this is expensive” as much as it is “we’re not sure if this is even possible right now.”

          I really hope we eventually get a fully mature device. I quite like VR and see so much potential in it.

          • Zworf@beehaw.org
            11 months ago

            Ok, yes, with the Oculus it’s actually similar. You can poke at the letters, but the problem is that exact depth detection is not so great (mainly because your finger is pointing directly away from the tracking cams), so it’s a bit hit-and-miss.

            And moving the “virtual mouse pointer” and then pinching is also a pain. My Oculus doesn’t have eye tracking, but you can move your hand to move the “pointer”.

            Both methods are a PITA. Using the controllers to point and then clicking the trigger is better, but of course it’s still slow going that way. It’s like typing on a keyboard hanging in front of you by pressing the keys with a stick. Considering that’s the most comfortable option (which the Vision Pro doesn’t have, for lack of controllers), it’s pretty sad.

            But yeah I see the potential too… I hope it will come to pass.

            • emeralddawn45@discuss.tchncs.de
              11 months ago

              I can imagine a return to some sort of T9-style typing, where you could wear thin sensors on your fingertips and tap certain fingers a certain number of times to enter specific characters. People who were used to typing with T9 could do it very quickly and without looking.
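              As a rough sketch, a multi-tap scheme like that could map each finger to a small letter group and cycle through the group by tap count, like an old phone keypad — the finger-to-letter groupings below are entirely made up for illustration:

              ```python
              # Hypothetical T9-style multi-tap mapping: each finger carries a
              # small group of letters; tapping it n times cycles through them.
              FINGER_GROUPS = {
                  "index": "abc",
                  "middle": "def",
                  "ring": "ghi",
                  "pinky": "jkl",
              }

              def decode_tap(finger: str, taps: int) -> str:
                  """Return the character selected by tapping `finger` `taps` times."""
                  group = FINGER_GROUPS[finger]
                  return group[(taps - 1) % len(group)]
              ```

              Muscle memory would do the rest, just as it did on T9 keypads.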

              • Zworf@beehaw.org
                11 months ago

                True, but it’s still about adapting the user to the tech instead of the other way around. I don’t think Apple will go for that.

                I would personally think more in the direction of a separate sensor you can place somewhere in the house; from a third-person point of view, finger tracking would be much easier to do because your finger isn’t moving straight away from the camera.