RE: 1. Do the Pico headsets do this or something? Who doesn't have their spatial tracking process set as the highest priority?
Are you confusing dropped frames with spatial tracking loss? Apple hasn't invented a technology that never drops frames, so I expect that once we hop into fully immersive Unity apps we'll have familiar stuttery moments.
There are two processors. The R1 is a realtime processor that manages the vision tracking and passthrough rendering; the M2 is an ARM processor that handles general computing. A Unity app would run on the M2, which is where frames would be dropped if they aren't ready in time. The R1 shouldn't miss a frame of input or drop a frame of the outside world unless a hardware flaw causes it.
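To make that split concrete, here's a toy sketch of the frame-pacing consequence (hypothetical names and numbers, not any real visionOS API): the tracking/passthrough path composites every vsync no matter what, while the app only contributes fresh content when it finishes inside the frame budget.

```swift
import Foundation

// Toy model of the R1/M2 split (hypothetical names, not a real API).
// Passthrough composites every vsync; the app only contributes a fresh
// frame when its render finishes inside the budget.

let displayHz = 90.0
let frameBudget = 1.0 / displayHz

// Simulated app render cost; every 7th frame blows the budget.
func appRenderCost(_ index: Int) -> TimeInterval {
    index % 7 == 0 ? frameBudget * 1.6 : frameBudget * 0.6
}

var lastFreshFrame = 0
for vsync in 1...10 {
    let madeDeadline = appRenderCost(vsync) <= frameBudget
    if madeDeadline { lastFreshFrame = vsync }
    print("vsync \(vsync): passthrough fresh, app content " +
          (madeDeadline ? "fresh" : "stale (reusing frame \(lastFreshFrame))"))
}
```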
My point is that you can still get nauseous from chuggy app rendering even if the passthrough is flawless. Any change in perspective requires an app draw, and even if Apple is able to reproject old frames of your app, they will be warped and disorienting. They can't really do enough inpainting to make it flawless.
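For anyone unfamiliar with why reprojection warps: here's a minimal sketch of the standard timewarp-style math (my illustration, not Apple's compositor code). The warp is only exact at the depth each pixel was rendered at, and content revealed by the new viewpoint simply isn't in the stale frame, which is where inpainting would have to come in.

```swift
import simd

// Sketch of positional reprojection (timewarp-style; illustrative only).
// Given the eye pose a stale frame was rendered with and the current pose,
// build the matrix that warps old clip-space content to where it should
// appear now.
func reprojectionMatrix(projection: simd_float4x4,
                        staleView: simd_float4x4,    // world -> eye at render time
                        currentView: simd_float4x4)  // world -> eye now
                        -> simd_float4x4 {
    let staleClipFromWorld = projection * staleView
    let currentClipFromWorld = projection * currentView
    // current clip <- world <- stale clip
    return currentClipFromWorld * staleClipFromWorld.inverse
}
```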
I'm not sure. Inpainting doesn't seem like the right way to think about the R1. My understanding is that visionOS uses a rendered mode that gives the application limited control once it has sent its data. That lets the system re-render the application's stale data correctly. In exchange, apps have limited control over the render (whatever can be translated into a subset of MaterialX).
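For reference, that hand-off looks roughly like this on the RealityKit side (a sketch; the asset path, file name, and parameter name are all made up): the app loads a material authored as a shader graph, which is MaterialX under the hood, and can only set its exposed inputs, while the system owns the actual per-frame render.

```swift
import CoreGraphics
import RealityKit

// Sketch of the hand-off model (asset and parameter names are made up).
// The app supplies a shader-graph material and pokes its exposed inputs;
// the system compositor evaluates it per frame, so it can re-render stale
// app data against fresh head tracking on its own.
@MainActor
func makeTintedEntity() async throws -> ModelEntity {
    var material = try await ShaderGraphMaterial(
        named: "/Root/TintMaterial",  // hypothetical graph path
        from: "Scene.usda",           // hypothetical bundle asset
        in: Bundle.main
    )
    try material.setParameter(name: "tint",  // hypothetical exposed input
                              value: .color(CGColor(red: 0, green: 1,
                                                    blue: 1, alpha: 1)))
    return ModelEntity(mesh: .generateSphere(radius: 0.1),
                       materials: [material])
}
```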
That said, I know Unity's Polyspatial does have a flag to update the draw for each eye individually, so it seems like some disorientation should be possible. But it would come from inconsistent data across the eyes, not from the way the system processes your head's orientation or the passthrough.
Where do you see that? I thought Unity was having developers handle all of it through URP via Polyspatial, which does the conversion.
I can see how you could write a bad full-screen app, though. There are many ways to disorient users in addition to inefficient draws. The App Store approval process is only going to help so much there.
URP and Unity's shader graph are hardly a recipe for performance. There would have to be much more than that going on to protect devs from attempting shaders that are too expensive.
Full apps don't appear to use the Polyspatial system, but it's hard to tell exactly.