This is a version of the MiVRy Gesture Recognition Plug-in optimized for hand tracking with the Oculus Quest. It is unrestricted in its use and can be used for other gesture recognition projects as well.
Making good user interaction for VR is hard. The number of buttons often isn't enough, and memorizing button combinations is challenging for users.
Gestures are a great solution! Allow your users to wave their 3D controllers like a magic wand and make wonderful things happen. Draw an arrow to shoot a magic missile, make a spiral to summon a hurricane, shake your controller to reload your gun, or just swipe left and right to "undo" or "redo" previous operations.
MARUI has many years of experience creating VR/AR/XR user interfaces for 3D design software.
Now YOU can use its powerful gesture recognition module in Unity.
MiVRy is a highly advanced artificial intelligence that can learn to understand your 3D controller motions.
Gestures can be either direction-specific ("swipe left" vs. "swipe right") or direction-independent ("draw an arrow facing in any direction") - either way, you will receive the direction, position, and scale at which the user performed the gesture!
Draw a large 3D cube and it will appear right there, with the appropriate scale and orientation.
One-handed gestures, two-handed gestures, and multi-part sequential gestures are all supported.
Key features:
- Real 3D gestures - like waving a magic wand
- Record your own gestures - simple and straightforward
- Easy to use - single C# class
- Can have multiple sets of gestures simultaneously (for example: different sets of gestures for different buttons)
- High recognition fidelity
- Outputs the position, scale, and orientation at which the gesture was performed
- High performance (back-end written in optimized C/C++)
- Includes a Unity sample project that explains how to use the plug-in
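In practice, the single-class workflow boils down to three steps: record example strokes for each named gesture, train the recognizer, then feed live controller motion in and read back the identified gesture with its position, scale, and orientation. The sketch below illustrates that flow; the class name `GestureRecognition` and the method names (`createGesture`, `startStroke`, `contdStrokeQ`, `endStroke`, `startTraining`) are assumptions based on the included sample project - check the plug-in's own documentation for the exact signatures.

```csharp
using UnityEngine;

// Hypothetical sketch of the record / train / identify loop.
// Requires the MiVRy plug-in (.dll/.so) in the Unity project.
public class GestureExample : MonoBehaviour
{
    // Assumed: the plug-in exposes one C# class wrapping the native back-end.
    private GestureRecognition gr = new GestureRecognition();
    private int swipeLeftId;

    void Start()
    {
        // 1) Register a named gesture (assumed API).
        swipeLeftId = gr.createGesture("swipe left");
    }

    // 2) Record one example stroke for that gesture:
    // call once when the user starts moving (head pose gives the frame
    // of reference, so gestures work regardless of where the user faces).
    void BeginRecording(Vector3 headPos, Quaternion headRot)
    {
        gr.startStroke(headPos, headRot, swipeLeftId);
    }

    // Call every frame with the current controller pose.
    void UpdateStroke(Vector3 controllerPos, Quaternion controllerRot)
    {
        gr.contdStrokeQ(controllerPos, controllerRot);
    }

    void EndRecordingAndTrain()
    {
        gr.endStroke();
        // 3) After recording enough samples per gesture, train the AI.
        gr.startTraining();
    }

    // 4) To identify a live gesture: startStroke WITHOUT a gesture id,
    // feed the motion, then endStroke returns the recognized gesture id
    // plus the position, scale, and direction at which it was performed.
}
```

Recording on the order of twenty samples per gesture is a common starting point; the sample scenes bundled with the plug-in demonstrate the full loop, including saving and loading trained gesture sets.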
[IMPORTANT!] This license is for the gesture recognition plug-in (.dll and .so files and source code) and does NOT include any permission to use the asset files used in the samples (3D character models, textures, etc.).