Yeah. Call me oldskool, but I still think a small controller with a laser pointer is optimal.
You can move the pointer at least as fast as your eyes, a button press is faster and more accurate than a pinch, and there's no occlusion. Plus, more buttons give you additional actions for the same on-screen target.
A controller with a laser pointer might be good, but not optimal.
For a pointing device, moving a stylus on a small graphics tablet (in relative mode, i.e. like a mouse, not in absolute mode) to position a visible cursor on a target is faster, more precise, and much more comfortable than pointing forward with a laser pointer or an equivalent device.
When you are not at a desk, one could make a pointing device with a stylus that you hold as if writing and point downwards, where the relative movements of the stylus control a visible cursor in front of you (or in any other direction). That would still be much better than a pointing device that forces you to point in the real direction of the target.
Especially when only the relative movements are of interest, it is easy to measure the 3D orientation of a stylus with a combined accelerometer/magnetometer sensor.
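To make that concrete, here is a minimal sketch of how a combined accelerometer/magnetometer gives you 3D orientation: roll and pitch come from the gravity vector, and the magnetometer reading is tilt-compensated to get yaw. The function name, the NED-style axis convention, and the sign conventions are my assumptions, not anything from a specific product; real devices also need calibration and filtering.

```python
import math

def orientation(ax, ay, az, mx, my, mz):
    """Estimate orientation from one accelerometer and one magnetometer sample.

    Assumes NED-style body axes (a sketch; sign conventions vary by sensor).
    Returns (roll, pitch, yaw) in radians.
    """
    # Roll and pitch from the gravity direction measured by the accelerometer.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # Rotate the magnetometer reading back into the horizontal plane
    # (tilt compensation), then take the heading angle.
    bx = (mx * math.cos(pitch)
          + my * math.sin(pitch) * math.sin(roll)
          + mz * math.sin(pitch) * math.cos(roll))
    by = my * math.cos(roll) - mz * math.sin(roll)
    yaw = math.atan2(-by, bx)
    return roll, pitch, yaw
```

For the relative-cursor idea above, you would then feed frame-to-frame deltas of yaw and pitch (times a gain) into the cursor position, so only changes in stylus orientation matter, not where it actually points.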
I think the Quest/Index/etc. controllers are a far better form factor than a cylindrical tube for this use case. But then I also think we should be adding Xbox controller input to CAD applications and the like, so maybe I am the weird one. We should get over the idea that gamepad = unprofessional, because these are seriously engineered, ergonomic, highly sensitive input devices.
If all you wanted was a pointing device to make selections on a 2D GUI, then the laser pointer form factor would be better. But I'm going to be an old fart and ask: why are you doing this in AR, then? Just use a screen. I'm more interested in the different human interface possibilities that are opened up by tactile input and 3D visual controls.
We should be using Index controllers to build "grabby" UIs. I'm curious what this could turn into. It opens up a whole new human-computer interface medium for intrinsically spatial applications, just like a good stylus on a tablet opened up creative use cases.
Gaze/click is for interacting with 2D planes.
Hand tracking is for interacting with the 3D world.