**BodyTracking** enables real-time full-body pose tracking from webcam or video to animate humanoid avatars. No extra hardware needed: fast, accurate, cross-platform, and easy to use in Unity.

**BodyTracking – Real-Time Body Pose Tracking & Avatar Animation for Unity**

Bring your avatars to life with real-time full-body tracking using a standard RGB camera. BodyTracking is a powerful Unity tool that detects human pose landmarks and animates humanoid avatars directly from webcam or video, with no depth sensors or external hardware required.

🚀 Key Features:
- Real-time 3D human pose tracking
- Works with standard webcams or video files
- Full-body IK/FK humanoid animation
- Auto-scaling avatar based on camera distance
- Avatar rotation support (−90° to +90°)
- Smooth, customizable motion interpolation
- Compatible with any humanoid rig
- Runs up to 60 FPS on desktop
- Cross-platform: Windows, macOS, Linux, Android, iOS
- Drag-and-drop prefab setup
- Demo scenes included for instant testing

🎯 Ideal For:
- Virtual Try-On applications
- Fitness tracking and motion-based games
- Motion capture and animation workflows
- AR/VR avatars and live streaming puppeteering
- Gesture-based learning and interactive apps

🔧 Easy Setup:
1. Add the BodyTracking prefab to your scene
2. Select webcam or video input source
3. Attach your humanoid avatar
4. Configure motion parameters
5. Press Play: your avatar moves in real time!

🧠 Technology:
BodyTracking uses advanced machine learning to detect and track 3D body landmarks locally in real time. The system supports both IK and FK animation for natural movement and adjusts dynamically to the user's position.

No external hardware. No complex setup. Just plug, play, and track, all inside Unity.

Start building interactive, motion-driven experiences today with BodyTracking!

**Package Contents**
✅ BodyTracking Prefab with built-in runtime controller
✅ Machine Learning Models – Lite and Full versions for performance flexibility
✅ AvatarMocap Script with automatic IK/FK animation support
✅ Webcam & Video Input Manager for easy source switching
✅ Ready-to-Use Demo Scenes for both webcam and video tracking
✅ Full Configuration Support directly from the Unity Inspector
✅ Sample Humanoid Avatar for quick testing
✅ Comprehensive Documentation and step-by-step setup guide
✅ ThirdPartyNotice.txt for included dependencies

**Avatar Requirements**
- Must use a Humanoid rig (Unity Mecanim-compatible)
- No manual bone setup required
- IK and FK constraints are applied automatically

**Model Inference**
- Powered by a pre-trained machine learning model for 3D body pose estimation
- Optimized for real-time performance
- Runs on CPU by default, with optional GPU acceleration where supported
- Adjustable parameters: Smoothness, Visibility Threshold, and Avatar Scale
- Supports multiple input sources: webcam, MP4 video, or raw video feed

**Performance**
- Achieves up to 60 FPS on standard desktop hardware
- Mobile support (Android/iOS) using the Lite model and device camera configuration
- Works with both orthographic and perspective camera modes

**Supported Platforms**
- Windows
- macOS
- Linux
- Android (requires camera setup)
- iOS (requires camera setup)

**Unity Compatibility**
- Unity 2020.3 LTS or later
- Designed for the Built-in Render Pipeline
- Fully compatible with URP and HDRP (minor adjustments may be needed)

BodyTracking gives you everything you need to bring real-time human motion into Unity: fast, accurate, and easy to integrate.

This package includes documentation and promotional materials generated with AI assistance (ChatGPT, DALL·E). All scripts, logic, and core functionality were manually developed by the author.
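As a rough illustration of the motion parameters mentioned under Easy Setup and Model Inference, here is a minimal stand-in component. It is not the package's actual AvatarMocap script; the field names (smoothness, visibilityThreshold, avatarScale, avatarRotation) are assumptions for demonstration only, and in the package these values are configured on the BodyTracking prefab through the Unity Inspector.

```csharp
using UnityEngine;

// Illustrative stand-in (not the package's AvatarMocap script) showing the
// kind of parameters listed above and plausible ranges for them.
public class MocapSettingsExample : MonoBehaviour
{
    [Range(0f, 1f)]
    public float smoothness = 0.5f;          // higher = smoother motion, slightly more latency

    [Range(0f, 1f)]
    public float visibilityThreshold = 0.6f; // landmarks below this confidence are ignored

    public float avatarScale = 1f;           // typically overridden when auto-scaling is enabled

    [Range(-90f, 90f)]
    public float avatarRotation = 0f;        // rotation support is advertised as -90 to +90 degrees
}
```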
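Since the package requires a Humanoid (Mecanim) rig, a quick sanity check with standard Unity APIs can confirm an avatar qualifies before assigning it. This sketch uses only built-in Unity calls and is not part of the package itself.

```csharp
using UnityEngine;

// Runtime check that a model uses a Humanoid (Mecanim) rig.
// Assumes the avatar GameObject already has an Animator component.
public class HumanoidRigCheck : MonoBehaviour
{
    void Awake()
    {
        Animator animator = GetComponent<Animator>();
        if (animator == null || animator.avatar == null || !animator.avatar.isHuman)
        {
            Debug.LogWarning($"{name}: a Humanoid rig is required for body tracking.", this);
        }
    }
}
```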
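For context on the webcam and video input sources, the sketch below shows how the two feeds are typically obtained in Unity with the built-in WebCamTexture and VideoPlayer APIs. It is a general illustration, not the package's own input manager; the preview surface and file name are placeholders.

```csharp
using UnityEngine;
using UnityEngine.Video;

// General example of switching between a live webcam feed and an MP4 file,
// the two input source types the package supports.
public class InputSourceExample : MonoBehaviour
{
    public Renderer previewSurface;          // surface that displays the current feed (placeholder)
    public string videoFileName = "sample.mp4"; // placeholder file in StreamingAssets

    private WebCamTexture webcamTexture;

    public void UseWebcam()
    {
        webcamTexture = new WebCamTexture();
        previewSurface.material.mainTexture = webcamTexture;
        webcamTexture.Play();
    }

    public void UseVideoFile()
    {
        if (webcamTexture != null) webcamTexture.Stop();

        var player = gameObject.AddComponent<VideoPlayer>();
        player.source = VideoSource.Url;
        player.url = System.IO.Path.Combine(Application.streamingAssetsPath, videoFileName);
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = previewSurface;
        player.Play();
    }
}
```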
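On mobile, the "camera setup" noted under Supported Platforms usually includes requesting the camera permission at runtime on Android (iOS instead needs a camera usage description in Player Settings). This uses standard Unity APIs and is shown as a general example, not a step taken from the package documentation.

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Requests the Android camera permission at runtime so a webcam feed can start.
public class CameraPermissionExample : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
        {
            Permission.RequestUserPermission(Permission.Camera);
        }
#endif
    }
}
```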