AR Hand Mesh Tracking enables real-time hand tracking and 3D hand mesh animation, allowing gesture detection, hand movement tracking, and natural interaction with AR objects using only the camera.

AR Hand Mesh Tracking is a powerful plugin that delivers real-time hand tracking, full 3D hand mesh animation, and AR-style hand interaction, all without any AR SDK. The plugin detects left and right hands, tracks 20+ 3D hand landmarks, animates realistic 3D hand meshes, and follows hand movement in world space, so developers can build gesture-based AR interactions using only a camera.

Unlike traditional AR solutions, this system does not rely on AR Foundation, ARCore, or ARKit. It is ideal for lightweight AR experiences, rapid prototyping, and projects that need hand-based interaction before integrating an AR SDK.

Key Highlights
- Real-time hand landmark tracking (20+ points per hand)
- Full 3D hand mesh animation
- Hand movement tracking (position, rotation, scale)
- AR-style hand mesh interaction (no AR SDK required)
- Left and right hand detection
- Ready for custom gesture and action detection
- Interaction with AR objects, UI, and physics items
- Camera, video, and image input
- Lite and Heavy models for performance control
- Mobile and desktop friendly
- Plug-and-play prefabs with demo scenes

Perfect For
- AR hand interaction systems
- Gesture-based games and applications
- Mobile AR experiences
- AR prototyping and experimentation
- HCI, research, and education projects
- Hybrid AR projects using or preparing for AR SDKs

If you’re looking for a fast, flexible, SDK-free solution for AR hand tracking, AR Hand Mesh Tracking gives you everything you need to build natural, hand-driven interaction.

Technical Details
- Hand Tracking Type: Real-time monocular camera-based hand tracking
- Landmark Tracking: 20+ 3D landmarks per hand (wrist, palm, finger joints)
- Hand Detection: Automatic left and right hand identification
- Mesh Animation: Full 3D hand mesh animation with finger joints and wrist rotation
- Movement Tracking: World-space hand position, rotation, and scale tracking
- Gesture Support: Landmark-based data suitable for custom gesture and action detection
- Input Sources: Device camera, video file, static image
- Tracking Models: Lite model (optimized for mobile performance) and Heavy model (higher accuracy)
- Rendering Support: Built-in Render Pipeline, URP, HDRP
- Platforms: Android, iOS, Windows, macOS
- Unity Version: 2021 or later
- Dependencies: None (works without an AR SDK)
- Setup: Plug-and-play prefabs with demo scenes included

This package includes documentation and promotional materials generated with AI assistance (ChatGPT, DALL·E). All scripts, logic, and core functionality were manually developed by the author.
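To give a sense of how the landmark data supports custom gesture detection, here is a minimal sketch of a pinch detector driven by the distance between two fingertip joints. Everything here is an illustrative assumption rather than the plugin's actual API: it presumes the animated hand mesh exposes thumb-tip and index-tip joints as Transforms you can assign in the Inspector, and the class name, field names, and threshold value are hypothetical.

```csharp
using UnityEngine;

// Hypothetical pinch-gesture detector built on top of hand landmark output.
// Assign two fingertip joints from the animated hand mesh in the Inspector.
public class PinchDetector : MonoBehaviour
{
    public Transform thumbTip;  // assumed thumb-tip landmark joint
    public Transform indexTip;  // assumed index-tip landmark joint

    [Tooltip("World-space distance below which a pinch is registered.")]
    public float pinchThreshold = 0.03f; // tune per scene scale

    public bool IsPinching { get; private set; }

    void Update()
    {
        // A pinch is simply the two fingertips coming close together.
        float distance = Vector3.Distance(thumbTip.position, indexTip.position);
        bool pinching = distance < pinchThreshold;

        // Log only on state transitions, not every frame.
        if (pinching && !IsPinching)
            Debug.Log("Pinch started");
        else if (!pinching && IsPinching)
            Debug.Log("Pinch released");

        IsPinching = pinching;
    }
}
```

The same pattern extends to other gestures: compare distances or angles between landmark joints each frame and fire events on state changes.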




