Apple Vision Pro? Your thoughts?

Comments (4)

  • Andrew Hale

    When we run WorkLink on HoloLens, we have to limit the number of objects and polygons in our scene drastically. Once we exceed a few dozen parts or about a million polygons, HoloLens can't render fast enough; the display becomes jittery, which leads to a bad user experience. When we run on iPad Pro, we can usually display hundreds of objects and several million polygons without performance issues. We spend a lot less time in Pixyz/asset development for the iPad Pro, which is a labor savings in our pipeline. However, the user's hands are then occupied holding the iPad, which makes it hard to perform tasks. Apple announced that Vision Pro will include an M2 processor, so we might get performance similar to the iPad Pro (which also uses an M2), but in a hands-free headset! As an author, I would love the freedom of being able to bring in more 3D content without having to manage the polygonal budget so tightly.
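    A minimal sketch of the kind of per-device polygon-budget check described above, purely illustrative: the device names, budget figures, and check_scene_budget helper are assumptions based on the rough numbers in this comment, not an actual authoring pipeline.

        # Hypothetical per-device triangle budgets; the figures are assumptions
        # taken from the rough numbers above (HoloLens 2 struggles past ~1M
        # polygons, M2-class devices handle several million).
        TRIANGLE_BUDGETS = {
            "hololens2": 1_000_000,
            "ipad_pro_m2": 4_000_000,
            "vision_pro_m2": 4_000_000,  # assumption: similar to other M2 devices
        }

        def check_scene_budget(mesh_triangle_counts, device):
            """Sum triangle counts for every mesh in a scene and compare to the device budget."""
            total = sum(mesh_triangle_counts)
            budget = TRIANGLE_BUDGETS[device]
            within = total <= budget
            status = "OK" if within else "needs decimation"
            print(f"{device}: {total:,} / {budget:,} triangles -> {status}")
            return within

        # Example: a scene with 300 parts averaging ~10k triangles each.
        counts = [10_000] * 300
        check_scene_budget(counts, "hololens2")      # 3,000,000 triangles -> needs decimation
        check_scene_budget(counts, "vision_pro_m2")  # within budget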

    Also, they didn’t say exactly what the field of view is, but it seemed much larger than the HoloLens’s. Our users complain about the small HL2 FoV.

  • David Nedohin (Community moderator, AR Master)

    Thanks for the thoughts, Andrew!  These help our Product team as they continue to learn more about the device and how it will work for our customers!

  • Greg Wenger

    The Apple Vision Pro device would be a non-starter for our use cases. On a manufacturing floor, camera-based video pass-through AR (rather than optical see-through) is not going to be acceptably safe.

  • Kevin Desautels

    Just curious if there has been any movement on this. My site would be very interested in trying it as an HL2 replacement.

