Step-by-step guide to setting up VTube Studio with your Live2D model, from installation to OBS integration and your first stream.
What Is VTube Studio?
VTube Studio is the most popular software for bringing Live2D models to life on stream. It handles face tracking, model rendering, expression switching, and integrates seamlessly with OBS Studio and Streamlabs. Whether you are a brand-new VTuber or migrating from another tracking application, VTube Studio offers an intuitive interface with powerful customization options under the hood.
This guide walks you through every step from installation to your first stream, including model file placement, tracking configuration, parameter tuning, expression hotkeys, OBS integration, and performance optimization. For professional Live2D models ready to use in VTube Studio, explore AnimArts services.
Installation and First Launch
VTube Studio is available on Steam for Windows and Mac, and on the App Store and Google Play for mobile devices. The desktop version is free to use with a small watermark; a one-time purchase removes it. Download and install from Steam, then launch the application.
On first launch, you will see a sample model on screen. The interface has a toolbar on the left side with icons for settings, model selection, expressions, and more. Take a moment to explore the menus before loading your own model.
Loading Your Model Files
VTube Studio reads Live2D model files from a specific folder on your system. The default location is:
- Windows: C:\Users\[YourName]\AppData\LocalLow\Denchisoft\VTube Studio\Live2DModels
- Mac: ~/Library/Application Support/Denchisoft/VTube Studio/Live2DModels
Create a new subfolder inside Live2DModels with your model's name. Place all model files inside that subfolder, including the .moc3 file, texture images, the .model3.json configuration file, physics file, and any expression or motion files. The folder structure should look like this:
Live2DModels/MyModel/MyModel.moc3
Live2DModels/MyModel/MyModel.model3.json
Live2DModels/MyModel/MyModel.physics3.json
Live2DModels/MyModel/textures/texture_00.png
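If you want a quick sanity check before loading, a short script can confirm that the essential files are present. This is just an illustrative helper, not part of VTube Studio; the file names follow the example layout above.

```python
from pathlib import Path

# The two files a model cannot load without; physics, expressions, and
# motions are optional extras.
REQUIRED_PATTERNS = [".moc3", ".model3.json"]

def missing_model_files(filenames):
    """Return the required file patterns not matched by any file name."""
    names = list(filenames)
    return [pattern for pattern in REQUIRED_PATTERNS
            if not any(name.endswith(pattern) for name in names)]

def check_model_folder(folder):
    """List required files missing from a model folder on disk."""
    return missing_model_files(p.name for p in Path(folder).iterdir())
```

For example, `check_model_folder` on a folder containing only the .moc3 file would report that the .model3.json is missing, which is the most common reason a model fails to appear in the list.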
Back in VTube Studio, click the model icon in the toolbar, and your model should appear in the list. Select it to load it on screen.
Setting Up Face Tracking
VTube Studio supports two main tracking methods: webcam tracking (computer vision via your desktop camera) and iPhone tracking (Apple ARKit via the iOS companion app). Each has distinct advantages, which we cover in detail in our webcam vs iPhone tracking comparison.
Webcam Tracking
To use webcam tracking, open Settings and navigate to the Camera tab. Select your webcam from the dropdown, choose a resolution (720p is usually sufficient for tracking), and click Start. VTube Studio's built-in computer vision will begin detecting your face. Position yourself so your face is centered in the preview window with adequate lighting.
iPhone Tracking
For iPhone tracking, install the VTube Studio companion app on your iPhone (X or newer with Face ID, which includes the TrueDepth camera). Connect both devices to the same Wi-Fi network. In the desktop app's Settings, switch the tracking mode to iPhone and enter the IP address displayed on your phone. iPhone tracking offers significantly more tracking parameters, including individual eye tracking, tongue detection, and cheek puff.
Parameter Tuning
Parameters are the numerical values that map your facial movements to model deformations. Getting these right is essential for natural-looking animation. The most important parameters to tune are:
- FaceAngleX, FaceAngleY, FaceAngleZ: These control head rotation on three axes. Adjust the input range to match your natural head movement range. Too wide and the model barely moves; too narrow and it overreacts.
- EyeOpenLeft, EyeOpenRight: Control eyelid opening. Tune the sensitivity so natural blinks register cleanly without the eyes flickering.
- MouthOpenY: Controls jaw opening. Set a comfortable minimum threshold so the mouth does not twitch when you are not speaking.
- MouthForm: Maps smile versus frown. This parameter often benefits from a slight smoothing increase to avoid jittery mouth shapes.
- EyeBallX, EyeBallY: Control gaze direction. If your model's eyes wander too much, reduce the output range.
For each parameter, VTube Studio provides input smoothing and output range controls. Start with the defaults and adjust incrementally while watching your model respond in real time.
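Conceptually, the input range and smoothing controls behave like a linear remap followed by a moving average. The sketch below illustrates that idea only; it is not VTube Studio's actual filter code, and the defaults shown are made up for the example.

```python
def remap(value, in_min, in_max, out_min=-1.0, out_max=1.0):
    """Map a raw tracking value from the input range to the output range,
    clamping at the edges. A narrower input range makes the model react
    more strongly to the same head movement; a wider one dampens it."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

def smooth(raw_values, smoothing=0.8):
    """Exponential moving average: higher smoothing gives steadier but
    laggier motion, the trade-off behind the smoothing slider."""
    current = raw_values[0]
    out = []
    for v in raw_values:
        current = smoothing * current + (1.0 - smoothing) * v
        out.append(current)
    return out
```

For example, with an input range of -30 to 30 degrees, remap(15, -30, 30) lands a 15-degree head turn at 0.5 of the output range; halving the input range to -15..15 would push the same turn all the way to 1.0.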
Expression Hotkeys
Expressions are pre-configured parameter overrides that change your model's appearance instantly -- think angry eyes, blushing cheeks, or heart-shaped pupils. To set up hotkeys, navigate to the Expression tab and assign keyboard shortcuts to each expression file.
Tips for effective expression management:
- Use function keys (F1 through F12) for your most-used expressions so they are easy to reach during a stream.
- Set expressions to toggle mode so the same keypress turns an expression on and off.
- Layer multiple expressions simultaneously -- for example, blush plus tears for a dramatic reaction.
- Test each expression with tracking active to make sure it blends smoothly with your live face data.
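The toggle and layering behavior from the tips above boils down to a small piece of state: each keypress flips one expression, and multiple expressions can be active at once. A conceptual sketch, not VTube Studio internals:

```python
# Set of currently active expressions; layered expressions coexist.
active_expressions = set()

def press_hotkey(expression):
    """Toggle one expression on or off and return what is now active."""
    if expression in active_expressions:
        active_expressions.discard(expression)
    else:
        active_expressions.add(expression)
    return sorted(active_expressions)
```

Pressing the blush hotkey, then the tears hotkey, leaves both layered; pressing blush again removes only the blush while the tears stay active.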
OBS Integration with Transparency
To capture your VTube Studio model in OBS with a transparent background, follow these steps:
- In VTube Studio, set the background to a solid green or use the built-in transparent background option (requires the Spout2 plugin or NDI output).
- In OBS, add a new Game Capture or Window Capture source pointed at VTube Studio.
- If using a green background, add a Chroma Key filter to the source and set the key color to match.
- If using Spout2, add a Spout2 Capture source instead, which provides native alpha transparency without any chroma keying.
- Position and resize your model overlay on your scene.
Spout2 is the recommended method because it preserves perfect transparency with no color spill around the edges of your model.
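The color-spill problem is easier to see with a toy chroma-key model: opacity is derived from each pixel's distance to the key color, so edge pixels that partially reflect the green background end up only partially transparent and green-tinted. A simplified sketch of the idea, not OBS's actual filter:

```python
def chroma_alpha(pixel, key=(0, 255, 0), threshold=150.0):
    """Return opacity in [0, 1] from the RGB distance to the key color.
    Pixels matching the key become fully transparent; anti-aliased edge
    pixels fall in between, which is where color spill appears. Spout2
    sidesteps this entirely by carrying a real alpha channel."""
    dist = sum((a - b) ** 2 for a, b in zip(pixel, key)) ** 0.5
    return min(1.0, dist / threshold)
```

A pure green pixel gets alpha 0.0 and a white pixel gets 1.0, but a half-green edge pixel lands somewhere in the middle, producing the fringe that chroma keying is known for.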
Performance Optimization Tips
VTube Studio is generally lightweight, but you can squeeze out extra performance with these tweaks:
- Lower the rendering resolution in settings if you are running on an older machine.
- Disable anti-aliasing if frame rates are tight.
- Close the tracking preview window during streaming -- it consumes resources.
- Set the frame rate cap to 30 FPS if your model does not need 60 FPS rendering.
- If desktop CPU overhead is a concern, consider iPhone tracking, since the face-tracking computation runs on the phone rather than your PC; webcam tracking avoids network latency but does its processing on your computer.
Troubleshooting Common Issues
Model Not Appearing in the List
Ensure all model files are inside a single subfolder within the Live2DModels directory. The .model3.json file must reference the correct relative paths to textures and the moc3 file.
Tracking Feels Laggy or Jittery
Increase the smoothing value for affected parameters. Make sure your lighting is consistent and your face is clearly visible to the camera. Avoid backlighting.
Expressions Do Not Activate
Verify that expression files are placed in the model's folder and that the .model3.json file lists them. Re-import the model if necessary.
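Both of the issues above trace back to the .model3.json file. The snippet below parses a minimal sketch of its FileReferences section; the file names are placeholders from this guide, and real files exported from Live2D Cubism contain additional sections.

```python
import json

# Minimal illustrative .model3.json content; all paths must be relative
# to the folder that contains the .model3.json file itself.
model_json = """
{
  "Version": 3,
  "FileReferences": {
    "Moc": "MyModel.moc3",
    "Textures": ["textures/texture_00.png"],
    "Physics": "MyModel.physics3.json",
    "Expressions": [
      {"Name": "blush", "File": "blush.exp3.json"}
    ]
  }
}
"""

refs = json.loads(model_json)["FileReferences"]
# If an expression is missing from this list, VTube Studio will not
# offer it, even when the .exp3.json file sits in the model folder.
```

If a texture or the moc3 path here does not match the actual file layout, the model fails to load; if an expression file is absent from the Expressions list, its hotkey silently does nothing.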
Next Steps
With VTube Studio configured and your model looking great, you are ready to go live. If you do not yet have a model, check our pricing page for commission options or contact AnimArts for a free quote. For more advanced topics, read our guide on preparing your PSD file to ensure your artwork is rigging-ready from the start. You can also visit our FAQ for answers to common commission questions.
Ready to Get Started?
Get a personalized quote for your project. We respond within 24 hours.