Monday, March 27, 2017

Dev update #28 - VRidge tracking API

VRidge API is out in beta. Networking and audio changes are nearly done. More exciting features are on the way in the coming months.


VRidge API out in beta channel

VRidge API is a way to interact with VRidge tracking internals. This is the first step in making our platform more compatible with third-party software and hardware solutions. We are starting with two endpoints that will allow you to take full control of the head tracking system and the controllers.

You can check out the example implementation and a desktop app sample on our GitHub here:

The API is versioned, so your plugins and drivers should keep working even after we alter the protocol. We cannot promise perfect backwards compatibility, but we designed the API in a way that should allow easy iteration - we have several ways to keep both new and old versions of API channels running side by side. Backwards compatibility will be our priority whenever we make changes to the API.
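As a rough sketch of how side-by-side channels could work (the version numbers and handshake below are our own illustration, not the actual VRidge protocol):

```python
# Hypothetical version negotiation: the server keeps several protocol
# versions alive, and a plugin asks for the one it was built against.
SUPPORTED_VERSIONS = {1, 2}  # channels the server still serves side by side

def negotiate(client_version: int) -> int:
    """Return the channel version the server will speak with this client."""
    if client_version in SUPPORTED_VERSIONS:
        return client_version  # old plugins keep their old channel
    raise ValueError(f"API v{client_version} is no longer supported")

channel = negotiate(1)  # an older plugin still connects over its v1 channel
```

The point of the design is that shipping v2 does not tear down v1; old clients keep negotiating the channel they already understand.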

We start with two endpoints that can be used separately:

The controller endpoint allows you to send VR motion controller state without writing a full OpenVR driver. This makes for a more stable experience, since only one driver is loaded in SteamVR. You can send multiple controller states, move controllers around, press and touch buttons, and interact with the touchpad. You can also use any language of your choice to send controller states, as long as you send properly formatted packets.
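To give a feel for what "properly formatted packets" might look like, here is a minimal sketch in Python. The field names and JSON layout are our own assumptions for illustration only - the real wire format is documented in the GitHub sample.

```python
import json

def controller_packet(controller_id, position, orientation,
                      trigger=0.0, touchpad=(0.0, 0.0), buttons=0):
    """Serialize one motion-controller state as a JSON packet.

    Field names are illustrative, not the actual VRidge wire format.
    """
    return json.dumps({
        "id": controller_id,
        "position": list(position),        # x, y, z in meters
        "orientation": list(orientation),  # quaternion x, y, z, w
        "trigger": trigger,                # analog trigger, 0.0 to 1.0
        "touchpad": list(touchpad),        # touchpad x, y in -1.0 to 1.0
        "buttons": buttons,                # bitmask of pressed buttons
    })

packet = controller_packet(0, (0.1, 1.2, -0.3), (0.0, 0.0, 0.0, 1.0),
                           trigger=0.5)
```

Because the endpoint only cares about well-formed packets, a client like this can be written in any language with a JSON (or equivalent) serializer.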

The head tracking endpoint allows you to control head tracking in a variety of modes. You can use it to provide positional, rotational, or combined data. You can also read mobile sensor data and provide an offset. If you find it necessary, you can modify phone tracking data in real time before it is used for rendering in VR.
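For example, a third-party client could keep the phone's rotational data and add its own positional offset (say, from an external tracker) before the pose reaches the renderer. A minimal sketch, with a made-up pose layout:

```python
def merge_pose(phone_pose, position_offset):
    """Keep the phone's orientation, add an external positional offset.

    The dict layout here is an assumption for illustration only.
    """
    x, y, z = phone_pose["position"]
    dx, dy, dz = position_offset
    return {
        "position": (x + dx, y + dy, z + dz),
        "orientation": phone_pose["orientation"],  # quaternion left as-is
    }

phone = {"position": (0.0, 1.5, 0.0), "orientation": (0.0, 0.0, 0.0, 1.0)}
merged = merge_pose(phone, (0.0, 0.25, -0.5))  # offset from external tracking
```

The same shape of hook works the other way around: read the sensor data, transform it however you like, and hand the result back before rendering.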

We already have plans to expand the API further with a way to expose compressed or raw video output. The ultimate goal is to have everything pluggable and modifiable. The VR hardware and software world is very fragmented right now, with no clear standard that has to be followed*. That's why we want to make parts of VRidge modifiable by controller and HMD manufacturers.

*OpenXR by Khronos is a good attempt to bring it all together. We hope that it succeeds, because with OpenVR, Oculus VR, OSVR, and all the bridges in between them, the current state of VR SDKs is somewhat chaotic.

In the future we will also start moving more and more of our communication protocols into the open API, because it allows creating third-party apps. We want to keep improving our Android app and eventually release an iOS app, but if someone wants to create an alternative app specifically crafted for their purposes, we want to support it. There are multiple reasons to do so. Maybe you need a branded app for your company. Maybe you want to demo your architectural designs with some extra options. Or maybe you want to try to hack a way to run VR on a TI-84 (after seeing Quake on an oscilloscope or Symmetra being played with a microwave, we believe that everything is possible).

While hacking and modding are fun, this is our ultimate rather than immediate goal. Our next plans are another round of performance and user experience improvements.

What's next?

Protocol, audio streaming, stable channel update (April)

There isn't much to say about it. Work on the audio and protocol changes was suspended while we were finishing the API. It will probably be our next release, since it's mostly complete and just needs some testing. The next blog post will most likely focus on those changes, with the audio & protocol update available in the beta channel. We will probably push the current beta to the stable channel then, too.

Smoothness, performance, latency - Moonlight & HDMI (Q2)

We have tried to tackle this several times, and there is still the unresearched option of time warp, which could provide significant improvements here. Trying to create the greatest VR streaming experience would be easier if we had something to compare against. That's why, before we start working on time warp (which can improve the experience, but we don't know that for sure), we want to create something that should have been added a long time ago.

NVIDIA Gamestream combined with the open source Moonlight project is a great streaming solution - probably the best one we've seen. We want to support it, with the VRidge app acting as a background tracker connected to the VRidge runtime on the desktop, which translates mobile tracking data into HMD tracking data and HMD frames into Moonlight & Gamestream compatible data. We want it to serve as a point of comparison for our own streaming.

Once we have Gamestream & Moonlight as a reference point for perfect streaming, we want to improve our own streaming and tweak it for VR.

AMD users - don't worry, our streaming will still be available and improved in the future. We just want to take advantage of a pre-built solution, because Team Green makes up the large majority of our userbase. And the streaming is already there; we just need to hook into it.

While we are reworking the rendering output, we want to add rendering to HDMI-connected displays* too. We already have render-to-screen working internally, because we use it for development (finding bugs is easier when you can use your second screen instead of a phone). We just need to wrap it up in nice GUI switches and add some stability to it.

*You can substitute HDMI with DisplayPort/D-SUB/DVI. It will also have an option to simply render to window.

Fewer restarts, more stability

We all hate the "Your PC is going to restart in 5 minutes and there's nothing you can do about it" messages. Especially when you're in the middle of a ranked match and you don't want your teammates to lose and your MMR to drop. This kind of experience ruins the fun, and we want to address it in VRidge too. The current runtime flow requires restarting nearly everything (mobile app, SteamVR & game, VRidge, sometimes the RiftCat desktop app) every time even the smallest thing goes wrong.
  • Connection interrupted? Sometimes you won't be able to reconnect. Gotta restart everything.
  • You want to change bitrate? Restart VRidge, SteamVR, game, mobile app.
  • USB cable disconnected? You need to restart everything. Oh, and the RiftCat desktop app crashes too.
Combine this with fragmented settings (some options need to be configured on the mobile side, some in the desktop app) and questionable design decisions (changing IPD & scale requires taking the phone out of the viewer), and the result is a frustrating experience.

We want to fix as many of these issues as possible. Our goal is for VRidge to always run and to connect with the mobile app and SteamVR in a hot-pluggable way. Some settings would still require restarts, but we want to limit those to situations where it absolutely needs to happen. We also want to move all settings to the desktop app.
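Hot-pluggable roughly means replacing "tear everything down on error" with a retry loop around the connection. A simplified sketch of that flow (all names here are our own illustration, not the actual VRidge runtime):

```python
import time

def run_with_reconnect(connect, handle, max_backoff=8.0, sleep=time.sleep):
    """Keep the runtime alive across dropped links instead of restarting.

    `connect` opens a link (raising ConnectionError on failure); `handle`
    serves it until the link drops, returning True on a clean shutdown.
    """
    backoff = 0.5
    while True:
        try:
            conn = connect()
            backoff = 0.5            # reset delay after a successful connect
            if handle(conn):
                return               # user quit: the only deliberate exit
        except ConnectionError:
            sleep(backoff)           # wait, then retry - no app restarts
            backoff = min(backoff * 2, max_backoff)
```

With this shape, a dropped Wi-Fi link or an unplugged USB cable becomes a short wait-and-retry instead of a full restart of the mobile app, SteamVR, and the game.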

Minor tweaks and other features

We will have smaller tweaks and some new motion control options available in the meantime too. The Moonlight, HDMI, and anti-restart updates might take a while to be released (late Q2 is the most likely possibility). That's why we will try to squeeze in some extra side options that were prototyped as side projects, but we're not ready to talk about them yet.

Fake VR is still on the table, but performance has higher priority, since over 50% of our feedback survey requests can be answered by having Moonlight as an option. The iOS app will have to wait until we complete moving our protocol into the open API.


  1. Does that mean I can write something to emulate Vive or Oculus controllers? If yes, can you provide some guidelines? I would like to emulate Vive and/or Oculus controllers.

    1. It would be nice to be able to use two mice as Rift controllers; of course that won't be as good in fast-paced games, but it would work in something like Google Earth VR. Just have it so you hold a button to change from forwards to up and down.

    2. Yes, with the API you can control both motion controllers and your VR 3D position in any way you want. Using a mouse/gamepad or even another phone as a controller is possible. There's a .NET sample of such a 3rd-party client interacting with tracking included in the GitHub repo.

      One of the "extra side projects" mentioned in the post is something kinda similar (using gamepad as a motion controller emulator).

    3. If you are implementing Gamestream for NVIDIA, can you support AMD VCE?

    4. Two different things. NVIDIA Gamestream is both an encoder and a streamer, while VCE is an encoder only.

      Implementing VCE is equivalent to implementing NVENC, which is something we've done in the past. The VCE implementation didn't happen because there was little interest in our community, and MF works quite well with AMD cards (excluding Win 7).

  2. Replies
    1. Just buy a phone that doesn't feel like you're in prison.

  3. This comment has been removed by the author.

  4. Hi, could fake VR maybe permit adding augmented reality, e.g. the capability to turn on the camera at the push of a button while in VR, so we can check reality around us (look at the keyboard or see our surroundings)?

    Best Regards

  5. I've seen setups that have a third PS3 Move controller on the headset for room-scale head tracking; what I'm wondering is, can that controller also be used to replace the gyroscopes in the phone?

    1. Most of the time, the gyroscope of the Move controller is inferior to the one in the cellphone...

  6. Will the audio streaming include a way to send microphone input back to the computer (for multiplayer games mostly)? It seems like the only thing missing to make it a complete package.

    1. That is a very good point. Multiplayer will be a big thing for my brother and me, and not needing to run an audio cable for my mic + headphones would be super.

    2. The first audio version will include the pc->mobile direction only. An Android mic -> PC recorder device sounds interesting, but it won't be included in the next update.

  7. Is it possible to provide a custom controller render model for SteamVR?

    1. No, not at this moment. Currently you would need a full controller driver to do so. We might add models in the future.

  8. Wow, what a fantastic solution this is. I just wish the head tracking was smoother with the Galaxy S6. I've noticed it judders with 3rd party apps run directly on the phone, but it's perfect with Oculus apps in Gear VR. I wonder: will VRidge for Gear VR ever be able to use the on-board head tracking sensors too?