Easy VTubing Kit
Matthieu Moncada
$132.00
(no ratings)
| Date | Price ($) |
|---|---|
| 07/30(2024) | 132.0 |
| 07/30(2024) | 92.4 |
| 08/13(2024) | 132.0 |
| 11/24(2024) | 132.0 |
This asset is made for you to easily create fully motion-captured 3D avatars. Simply import your 3D-modeled characters into Unity and get them fully ready for VTubing in a few clicks!

The example scene and all shaders/materials are built with HDRP 2021.3. However, all scripts work in any Render Pipeline, and all shaders can easily be converted if needed, as they are all made with ShaderGraph. Using Unity 2021.3 or higher is recommended for access to all features and better performance.

Realtime motion capture, IKs, animations, hair physics... everything is included to get your avatar ready as easily as possible.

This is designed with 3D artists and VTubing in mind, with no coding necessary! It can also be used to create any kind of game or application: you can use the motion capture in realtime for your character, or use the Record and Playback system to create detailed animations or cinematics with multiple characters.

Unlike many other VTubing creation programs, there are no restrictions on what you can do with your characters, except your imagination. Custom shaders, VFX, scripts... you now have the full power and versatility of Unity to create the most unique avatar, exactly the way you want it!

IMPORTANT - The hardware you need:
This asset uses the Unity Live Capture package, which needs an Apple device to work, such as an iPhone or iPad with iOS 14.6 or higher and ARKit face-tracking capabilities (a device supporting Face ID, or a device with an A12 Bionic chip). You must also be able to install the Live Capture app on your Apple device and run it on the same network as your computer.

This asset can NOT do any motion capture from a webcam, an Android device, or an older Apple device without face-tracking support.

What your 3D model needs:
Your model must have facial blendshapes for the facial motion capture expressions.
The blendshapes must match Apple's ARKit blendshape locations:
https://developer.apple.com/documentation/arkit/arfaceanchor/blendshapelocation

That's it! Everything else, like a humanoid rig for the skeleton, is not mandatory, but proper modeling and naming conventions help greatly with the automatic setup, animations, and IKs.

This is Version 1.0. Here's what we are working on for future releases:
- Strand-based hair and fur - for even more realistic hair and animal characters (WIP - 80% done!)
- Hand tracking - motion capture straight from a webcam (WIP - 40% done!)
- Twitch integration
- Support through PUN/Mirror for multiple synchronized avatars
- Improved and expanded control panel
- More features and examples!

Join our Discord for examples, help, info, or just to show off your skills:
https://discord.gg/pgjPwnuG

Features:
- Automatic setup from your 3D model to a realtime motion-captured avatar
- Control of your character's emotions, expressions, blush, breathing, etc.
- Outfits, props, and accessories support
- Record and playback of facial animations
- Body and head IK
- Hair and jiggle physics
- Poses and animation support
- Custom shaders for both high-definition and anime-style characters
- Easy VFX control
- Environment and lighting controls
- Automatic and/or manual camera control
- Makes use of Unity HDRP for high-quality graphics
- Character control from hotkeys, even when the application is out of focus
- Run and capture straight from the Unity editor, or build and run as a standalone app!

Dependencies:
TextMeshPro is used in the example scene.
This asset uses:
- UnityRawInput under the MIT License;
- Unity Live Capture under the Unity Companion License;
see the Third-PartyNotices.txt file in the package for details.
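As a sketch of the blendshape requirement described above: before importing a model, you can check that its blendshape names match ARKit's expected names. This standalone Python sketch (not part of the asset; the asset performs its setup inside Unity) uses a small representative subset of ARKit's 52 blendshape names - the full, authoritative list is at the Apple documentation URL above.

```python
# Sketch: check a model's blendshape names against Apple's ARKit
# blendshape locations. Only a representative subset of the 52 ARKit
# names is listed here; see the Apple documentation for the full list.
ARKIT_BLENDSHAPES = {
    "eyeBlinkLeft", "eyeBlinkRight",
    "jawOpen", "mouthClose",
    "mouthSmileLeft", "mouthSmileRight",
    "browInnerUp", "cheekPuff", "tongueOut",
}

def missing_blendshapes(model_shapes):
    """Return the ARKit blendshape names absent from the model.

    Names are case-sensitive, matching Apple's API (e.g. 'eyeBlinkLeft').
    """
    return sorted(ARKIT_BLENDSHAPES - set(model_shapes))

# Hypothetical model that lacks two of the expected blendshapes:
shapes = ["eyeBlinkLeft", "eyeBlinkRight", "jawOpen", "mouthClose",
          "mouthSmileLeft", "mouthSmileRight", "browInnerUp"]
print(missing_blendshapes(shapes))  # -> ['cheekPuff', 'tongueOut']
```

A model that reports no missing names for the full 52-entry list should be ready for the asset's facial motion capture.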