Will the New Valve “Deckard” Headset Have Face Tracking for VRChat?
The idea of a new Valve “Deckard” VR headset has excited a lot of PC VR fans, especially people who spend time in VRChat. One of the biggest questions is:
Will Valve’s next headset support face tracking for VRChat — that is, eye tracking plus mouth/expression tracking?
Because “Deckard” is not an officially released, fully detailed product, there are no confirmed specs from Valve that guarantee face tracking support. But we can break down:
- What face tracking actually is in VR
- How VRChat uses face and eye tracking today
- What’s realistic to expect from a next‑gen Valve headset
- Which variables will decide whether you can use face tracking in VRChat with it
By the end, you’ll understand what’s technically involved, what’s likely, and what still depends on your own setup and priorities.
What “Face Tracking” Means in VR (and in VRChat)
When people say face tracking in VR, they usually mean three related features:
Eye tracking
- Cameras or infrared sensors inside the headset track where your eyes are looking.
- Used for:
- Foveated rendering (rendering what you look at in higher detail)
- More realistic avatar eye movement in VRChat
- Better UI interaction (look-to-select)
Mouth / lip tracking
- A camera or sensor near your mouth reads lip movement, jaw position, and sometimes tongue position.
- Used for:
- Accurate lip sync instead of just microphone-based mouth flaps
- Expressions like smiles, frowns, and jaw movement
Full facial expression tracking (Face & Eye)
- Multiple sensors track the upper and lower face: eyes, eyebrows, cheeks, nose, and mouth.
- Used for:
- Very expressive avatars in VRChat
- Subtle emotional expression (raising an eyebrow, smirking, etc.)
For VRChat specifically, this kind of data is used to drive avatar blendshapes or expression parameters. The game listens for input from:
- Headset eye tracking APIs
- External face-tracking devices
- Microphone audio (as a fallback for basic mouth movement)
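The microphone fallback in that last item is essentially amplitude-to-mouth mapping: louder speech opens the avatar's mouth wider. A minimal illustrative sketch of the idea — the function name and thresholds are invented here, not VRChat's actual internals:

```python
import math

def mouth_open_from_audio(samples, floor=0.02, ceil=0.4):
    """Map raw audio samples (-1.0..1.0) to a 0..1 'mouth open' value.

    This is the basic idea behind microphone-driven lip sync:
    louder speech -> wider mouth; below a noise floor -> closed.
    """
    if not samples:
        return 0.0
    # Root-mean-square amplitude of the audio frame
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Normalize between a noise floor and a loudness ceiling, clamped to 0..1
    value = (rms - floor) / (ceil - floor)
    return max(0.0, min(1.0, value))

print(mouth_open_from_audio([0.0] * 256))        # silence -> 0.0
print(mouth_open_from_audio([0.3, -0.3] * 128))  # speech-level audio
```

Audio-driven lip sync like this needs no face sensors at all, which is why it remains the baseline every headset gets in VRChat.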
So when you ask whether Valve Deckard will have “face tracking for VRChat,” the real questions are:
- Will it include eye or face tracking hardware at all?
- Will Valve expose that data through an API that VRChat can support?
- Will VRChat add or adapt support for that specific tracking system?
Right now, none of those are confirmed in public documentation for Deckard.
Why Everyone Expects More Tracking from a Next‑Gen Valve Headset
Even without official specs, it’s possible to explain why many people expect some level of advanced tracking.
Next‑generation headsets increasingly include:
- Eye tracking for performance gains (foveated rendering)
- Face tracking for social presence in apps like VRChat and Horizon Worlds
- Inside-out tracking with multiple cameras for controllers and room‑scale positioning
Valve is heavily invested in SteamVR and high-end PC VR, where:
- Eye tracking can help squeeze more performance out of gaming PCs
- Social VR apps like VRChat are extremely popular
- Valve has historically experimented with advanced features (e.g., the Index’s finger-tracking controllers)
Because of that, many users expect any flagship “Deckard” device to offer at least:
- Eye tracking for performance & UX
- Possibly some form of facial / mouth tracking for social presence
However, this is still expectation, not confirmation. Valve has not publicly committed to specific face-tracking features or VRChat integration.
How VRChat Uses Headset Face Tracking Today
To understand what Deckard would need to offer, it helps to see how VRChat handles this with current devices.
VRChat can work with:
Headsets with built-in eye tracking
- Example categories: premium PC VR, some standalone headsets
- VRChat reads gaze direction, blinking, sometimes pupil dilation
- This drives avatar eyes: where they look, when they blink, etc.
Headsets with external face-tracking add-ons
- Some users attach separate face-tracking modules below the headset
- These capture mouth, cheeks, jaw, and sometimes tongue movement
- VRChat can map this to avatar expressions and visemes
Devices and software that send face data via OSC / APIs
- Tools that stream tracking data into VRChat
- Requires custom setup, calibration, and advanced avatar configuration
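Tools in that last category typically talk to VRChat’s documented OSC input (UDP port 9000, with addresses under `/avatar/parameters/`). A minimal sketch of what such a sender does, using only the Python standard library — the `MouthOpen` parameter name is hypothetical, since actual parameter names depend on the avatar:

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message carrying a single float argument."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to 4-byte boundaries
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

def send_face_parameter(name: str, value: float,
                        host: str = "127.0.0.1", port: int = 9000) -> bytes:
    """Send one avatar parameter to VRChat's default OSC input port."""
    packet = osc_message(f"/avatar/parameters/{name}", value)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, (host, port))
    sock.close()
    return packet

# Drive a hypothetical "MouthOpen" float parameter on the current avatar
packet = send_face_parameter("MouthOpen", 0.8)
```

Because this path is just UDP packets in a standard format, any future headset whose software can emit OSC could in principle feed VRChat this way, regardless of brand.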
In short, VRChat doesn’t care which brand of headset you have; it cares about:
- Is there tracking hardware? (cameras/sensors)
- Is there an API or driver that exposes that data?
- Has VRChat added support or can you pipe it in via existing systems?
So for Deckard to “have face tracking for VRChat,” all three would need to line up.
Key Variables That Will Decide Deckard’s Face Tracking for VRChat
There are several moving parts that determine whether Deckard will support face tracking in VRChat for you personally.
1. Hardware Sensors in the Headset
The first requirement is simple:
- Does Deckard physically include eye-tracking cameras inside the headset?
- Does it include mouth or lower-face tracking sensors (external camera, inward camera, IR sensors, etc.)?
Possible scenarios:
| Scenario | Sensors Included | What It Enables (In Theory) |
|---|---|---|
| A | Eye tracking only | Eye movement, blinking, gaze in VRChat |
| B | Eye + lower-face tracking | Full facial expression, mouth movement, richer avatars |
| C | No special face sensors | Only microphone-based lip sync and basic expressions |
Without hardware sensors, no software update can magically give true face tracking; it can only estimate from audio or controller input.
2. Valve’s Software & APIs
Even if the hardware is there, VRChat also needs:
- A software layer that reads raw sensor data
- A public API or driver that apps like VRChat can call
- Compatibility with SteamVR and VR runtimes that VRChat uses
Key variables here:
- Whether Valve exposes face/eye tracking through SteamVR’s OpenVR API or the cross-vendor OpenXR standard
- Whether the data format is standard enough for social VR apps to adopt
- Whether Valve limits some features to specific apps or keeps it generally open
The more open and standard the API, the more likely VRChat can support it.
3. VRChat’s Implementation
On VRChat’s side, developers decide:
- Which headset SDKs and tracking systems they officially support
- How deeply they integrate eye & face tracking into avatars
- Whether they treat some features as experimental or officially supported
Even if Deckard launches with full face tracking, VRChat would still need to:
- Update the client to read Deckard’s tracking data
- Map that data to avatar parameters
- Provide configuration options for users
That takes development time, testing, and demand from the community.
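The “map that data to avatar parameters” step usually amounts to normalizing raw sensor readings into the 0–1 range that blendshape parameters expect, plus some smoothing so expressions don’t jitter. A purely illustrative sketch — the names, ranges, and smoothing factor here are invented, not VRChat’s actual pipeline:

```python
def map_expression(raw: float, lo: float, hi: float) -> float:
    """Normalize a raw sensor reading into the 0..1 range that avatar
    blendshape parameters typically expect, clamped at both ends."""
    if hi <= lo:
        raise ValueError("hi must be greater than lo")
    return max(0.0, min(1.0, (raw - lo) / (hi - lo)))

def smooth(previous: float, target: float, alpha: float = 0.3) -> float:
    """Exponential smoothing so expressions ease between frames
    instead of snapping to every noisy sensor sample."""
    return previous + alpha * (target - previous)

# e.g. a hypothetical jaw sensor reporting 0..100, currently reading 55
jaw = map_expression(55.0, lo=0.0, hi=100.0)  # -> 0.55
```

Calibration UI largely exists to pick good `lo`/`hi` bounds per user, since resting face positions differ from person to person.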
4. Your Own PC and Avatar Setup
Finally, even if Deckard and VRChat both support face tracking, your experience depends on:
- PC performance
- Face tracking and foveated rendering add processing overhead
- Lower-end PCs may struggle with high-complexity avatars plus advanced tracking
- Avatar configuration
- Your avatar needs the right blendshapes / visemes and parameters
- Some models are optimized for basic lip sync only, others for full facial expressions
- Network stability
- More tracking data = more information to sync to other players
- Poor network conditions can make expression updates feel delayed or jittery
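The network point can be made concrete with back-of-the-envelope arithmetic; all numbers below are illustrative, not measured VRChat figures:

```python
def sync_bandwidth_bytes_per_sec(num_params: int, rate_hz: float,
                                 bytes_per_param: int = 4) -> float:
    """Rough upper bound on the extra data one avatar's expression
    parameters add to the sync stream, ignoring protocol overhead."""
    return num_params * bytes_per_param * rate_hz

# e.g. 50 float parameters updated 30 times per second
print(sync_bandwidth_bytes_per_sec(50, 30))  # 6000 bytes/sec per avatar
```

Per avatar that is small, but in a crowded instance it multiplies by the number of visible players, which is why update rates and interpolation matter.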
So “Does Deckard have face tracking for VRChat?” eventually turns into:
“Does my PC + Deckard + avatar + VRChat build all line up correctly?”
Different User Profiles, Different Outcomes
Even assuming Deckard supports some kind of face tracking, not everyone will get the same result. Here’s how different types of users might experience it.
Enthusiast VRChat Creators
- Likely to:
- Use custom avatars with rich facial rigs
- Experiment with advanced settings, mods, and OSC tools
- Keep both VRChat and SteamVR on newer builds
- For them, once VRChat adds support, tapping into Deckard’s face tracking would likely be quick.
- They might also combine Deckard sensors with additional trackers or software for even more expressive avatars.
Casual Social VR Users
- Typically:
- Use public or store-bought avatars
- Don’t tweak blendshapes or tracking configs deeply
- Prefer “it just works” setups
- They may only see:
- Eye movement and blinking if that’s enabled by default
- Basic lip sync driven by microphone if full face tracking requires extra steps
- Their experience depends on VRChat enabling simple, default mappings that require minimal setup.
Performance-Focused PC Gamers
- Main interest:
- Playing a mix of VR games, not just VRChat
- Using eye tracking for foveated rendering and performance
- They might:
- Use face tracking in VRChat casually
- Turn off some tracking features if they cost too much performance
- For them, Deckard’s face tracking may be a nice extra rather than the main selling point.
Each of these groups might have the same headset, but very different face-tracking results in VRChat based on configuration, hardware, and expectations.
What’s Clear Now—and What Still Depends on You
From what’s publicly known:
- Deckard is not a fully announced, fully specced consumer product yet.
- Valve has not confirmed specific face-tracking features, sensor layouts, or VRChat integration.
- Technically, a modern Valve headset could include:
- Eye tracking for foveated rendering and improved interaction
- Possibly some form of lower-face tracking for social VR
- For VRChat support, three things must align:
- Deckard needs actual face/eye-tracking hardware
- Valve needs to provide open, accessible APIs
- VRChat needs to implement support and map that to avatars
Where that leaves you depends on:
- How much you prioritize face tracking vs. other headset features
- What kind of PC hardware you have
- How complex and expressive your VRChat avatars are
- How comfortable you are with tweaking settings, SDKs, and experimental features
Understanding these pieces makes it easier to follow news about Deckard’s specs and VRChat updates—and then compare them to your own setup and expectations when real details arrive.