Thursday, February 9, 2017

Dev update #27

This post is text only - there is no downloadable update in the RiftCat client this time. We want to update you on our current work and talk about the future.


Sideload VR replacement

January ended with very unfortunate news. Sideload VR - a great source of Gear VR prototype apps - announced that it is closing down. You can read the author's reasoning in the announcement post here.

We needed a replacement. At first we thought we would have to write our own repackaging service, but fortunately an alternative to Sideload VR already exists. From now on you will be able to find the Gear VR version on ConstructVR.

Desktop client links will be updated with the next downloadable update.


NOLO VR

Slightly over a week ago NOLO VR launched their Kickstarter campaign. We tried the hardware - we have units at our office - and we can say with confidence that it is not vaporware. It works just as advertised. There is no fake camera work or pre-rendered animation.

We are in touch with the NOLO VR team and are currently preparing the software side on our end so their hardware can work out of the box with VRidge. We have tried Leap Motion, PSMoveService and other experimental drivers, but NOLO VR was definitely the easiest and most reliable of them.

Their goal is to provide an end-user experience with nearly Vive-level tracking. It uses the same underlying technology (Lighthouse), so the tracking is indeed very precise.

Our goal is to provide software support where you simply plug their hardware in and it just works. We don't want to make you install multiple layers of FreeTrack/FreePIE-like software relays. Simply plug it in, start VRidge and enjoy the next level of streamed VR.

VRidge API

While implementing NOLO VR support, we decided to expand this work into another component of VRidge. You can already do a lot with FreeTrack and OpenVR drivers, so one could argue that we don't need another standard.

(xkcd: Standards)
But if you have actually tried to use FreeTrack, custom OpenVR drivers and VRidge all together, you might have noticed several problems: black screens, different coordinate systems mixing up, and so on. We are now working on a unified interface that keeps everything loaded as one driver in one 3D space. Here are some key features that will be available with the VRidge API:
  • Writing positional (or positional and rotational) data to override VRidge tracking - similar to FreeTrack, but with the ability to switch modes in real time without restarting VRidge. 
  • Reading and writing all VRidge tracking data before it is used to render a frame. You could combine external sensor data with the phone sensor data VRidge provides, merging both streams into a more precise measurement. This works in real time with a ~0.1 ms response time.
  • Reading data and providing an offset. You could use external sensors to compare their data with the phone's measurements and provide an offset that corrects drift.
  • Writing OpenVR controller 3D positions and button presses, with the ability to choose the origin point of the 3D tracking.
  • A way to do all of the above remotely. Wireless controllers won't need any desktop relays as long as they have some sort of Wi-Fi connectivity. This enables quite creative ideas, such as using mobile phones as motion controllers, similar to Daydream controller emulation.
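As an illustration of the offset idea from the list above, here is a minimal Python sketch of drift correction against an external reference. The function names and the simple blending scheme are our own assumptions for illustration; this is not the actual VRidge API.

```python
# Sketch of the "read + offset" drift-correction idea: nudge the
# phone-reported position toward an external tracker's measurement,
# correcting slow drift without discarding the phone's low-latency data.
# All names here are hypothetical - the real API is still in development.

def drift_offset(phone_pos, external_pos, smoothing=0.1):
    """Return a per-axis offset moving phone_pos toward external_pos.

    smoothing=0.1 applies 10% of the correction per update (gentle),
    smoothing=1.0 snaps fully to the external measurement.
    """
    return tuple(smoothing * (e - p) for p, e in zip(phone_pos, external_pos))

def corrected_pose(phone_pos, offset):
    """Apply the computed offset to the phone-reported position."""
    return tuple(p + o for p, o in zip(phone_pos, offset))

# Example: the phone has drifted 0.5 m on the x axis relative to the
# external tracker; a full-strength offset cancels the drift entirely.
phone = (1.5, 1.7, 0.0)
external = (1.0, 1.7, 0.0)
offset = drift_offset(phone, external, smoothing=1.0)
print(corrected_pose(phone, offset))  # -> (1.0, 1.7, 0.0)
```

In practice a small smoothing factor applied every update would correct drift gradually and invisibly instead of snapping the view.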
The API currently uses ZeroMQ as its transport. It is an industry-tested standard used (according to their website) by AT&T, Cisco, EA, Los Alamos Labs, NASA, Weta Digital, Zynga, Spotify, Samsung Electronics, Microsoft and CERN, among others. It lets developers connect to the VRidge API easily from any programming language - it doesn't matter if you use Python, Java, C#, C++, Objective-C or any other supported language (full list here).
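To show how little ceremony ZeroMQ needs, here is a hypothetical Python exchange using pyzmq. The endpoint, port, message format and field names are our own assumptions for illustration - the real VRidge protocol is not published yet - so the script stands up a fake in-process server to answer the request.

```python
# Hypothetical VRidge API exchange over ZeroMQ (pip install pyzmq).
# Endpoint, port and message fields are assumptions for illustration only.
import threading
import zmq

ENDPOINT = "tcp://127.0.0.1:5757"  # assumed port, not the real one

def fake_vridge_server():
    """Stand-in for the VRidge API endpoint: acknowledges one request."""
    rep = zmq.Context.instance().socket(zmq.REP)
    rep.bind(ENDPOINT)
    msg = rep.recv_json()
    rep.send_json({"ok": True, "echo": msg["cmd"]})
    rep.close()

server = threading.Thread(target=fake_vridge_server)
server.start()

# The client side could be written in any language ZeroMQ supports.
req = zmq.Context.instance().socket(zmq.REQ)
req.connect(ENDPOINT)
req.send_json({"cmd": "set-position", "xyz": [0.0, 1.7, 0.0]})
reply = req.recv_json()
print(reply)
server.join()
req.close()
```

The same request/reply pattern translates almost line for line into Java, C# or C++, which is exactly why a ZeroMQ transport keeps the API language-agnostic.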

We plan to provide samples with sources in Java (Android) and C# (WPF). We also want to create a direct connector between VRidge and PSMoveService, because it is very popular and can serve as another example.

Audio and network protocol changes

Audio streaming is done, but we decided to refactor networking a bit. Until now we used raw UDP datagrams for frame data, but while adding the audio stream we decided it was time to upgrade to rUDP (reliable UDP). It should reduce artifacting on 2.4 GHz networks. There's no reason to reinvent the wheel, so we decided to use ENet - the same library used by Moonlight + GameStream. We hope that retransmission will allow us to smooth out the experience further.
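ENet itself is a C library, but the core idea of reliable UDP - number the datagrams, acknowledge them, and retransmit the ones that get lost - can be sketched in a few lines. This toy Python model is only an illustration of the concept, not of how ENet is actually implemented:

```python
# Toy model of a reliable-UDP layer: sequence numbers plus retransmission
# until every packet is acknowledged. Real ENet also handles ordering,
# flow control and channels; this sketch shows only the retransmit loop.

def deliver(packets, drops):
    """Simulate sending numbered packets over a lossy link.

    packets: dict of seq -> payload to deliver.
    drops:   dict of seq -> how many times that packet is lost
             before a send attempt finally gets through.
    Returns (received, total_sends).
    """
    received = {}
    attempts = {seq: 0 for seq in packets}
    sends = 0
    pending = set(packets)
    while pending:  # keep retransmitting unacknowledged packets
        for seq in sorted(pending):
            sends += 1
            attempts[seq] += 1
            if attempts[seq] > drops.get(seq, 0):  # this attempt arrived
                received[seq] = packets[seq]
        pending -= set(received)
    return received, sends

# Three video packets; packet 1 is lost twice before arriving.
frames = {0: "I-frame", 1: "P-frame", 2: "P-frame"}
got, sends = deliver(frames, drops={1: 2})
print(got == frames, sends)  # -> True 5
```

On a clean 5 GHz link the retransmit path is rarely taken, but on a congested 2.4 GHz network it is exactly what turns a corrupted frame into a briefly delayed one.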

Survey results and our next steps

According to our survey, 80% of you noticed an improvement in smoothness with the 1.2 update. This is very good news, and it motivates us to optimize further now that we can see that users notice not only features but also the behind-the-scenes data flow optimizations.
Here are some more facts:
  • One out of three people uses Gear VR.
  • The majority of people found that HEVC improves the experience.
  • The most common video cards are the GTX 970, 1070 and 1080. Only 20% of respondents used AMD cards.
  • Only 15% of users use 2.4 GHz Wi-Fi. 5 GHz Wi-Fi and USB tethering are equally popular.
Last but definitely not least - the most important question of all. We are creating this software for you, and your feedback is the most important factor in our decision making.

You can see the raw data above. This is how we see it:
  1. You want better controller support. We think the VRidge controller API described earlier will be a step in the right direction.
  2. The second most popular option is Fake VR. It was discussed earlier and we have already done some work in this area. We plan to resume work on it after the controller API, sound streaming and protocol changes are out.
  3. Third is jitter reduction. We hope the improved networking protocol will reduce it further, but we still have time warp on our task board.
  4. Reducing latency. This is a tricky one, but time warp has the potential to greatly reduce perceived latency.
Other answers (10%) contain many sound streaming requests - a feature that is already nearly complete.
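To make the time warp idea in #3 and #4 concrete: instead of waiting for a fresh frame, the last rendered frame is re-projected using the head rotation that happened since it was rendered, so the view tracks the head even when a new frame is late. A toy 1-D Python sketch of the rotational case (the degrees-per-pixel figure is made up for illustration):

```python
# Toy sketch of rotational time warp: shift the last rendered "frame"
# by the yaw the head turned since rendering, so perceived latency
# shrinks even if a new frame has not arrived yet.

def timewarp(frame_row, degrees_turned, degrees_per_pixel=1.0):
    """Re-project a 1-D row of pixels to compensate for head yaw.

    A right turn shows pixels that were previously off the right edge,
    so the row is shifted left by the equivalent pixel count.
    """
    shift = round(degrees_turned / degrees_per_pixel)
    n = len(frame_row)
    return [frame_row[(i + shift) % n] for i in range(n)]

row = list("ABCDEFGH")
print("".join(timewarp(row, 2)))  # head turned right 2 degrees -> "CDEFGHAB"
```

A real implementation warps a 2-D image on the GPU using the full rotation delta, but the principle is the same: a cheap re-projection hides most of the streaming pipeline's latency.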

We now want to finish the controller API (#1), sound streaming and the networking changes. After that, most likely one developer will focus on Fake VR (#2) and the other will start prototyping time warp (#3 and #4). As an additional step to improve on #3 and #4, we eventually want to add a Moonlight-compatible mode.

We recently discussed Moonlight mode on reddit. We want to try time warp first for the reasons described in that discussion, but Moonlight & GameStream are always on our minds as a great example of how to stream things well.

We hope this answers your feedback. Of course, all of this can change - we can't predict the future - but we hope that no major obstacles appear and we can deliver the features you want the most.


  1. This comment has been removed by the author.

  2. Does the NOLO VR integration mean that positional data will be sent completely wirelessly through Vridge? And does their headset marker help with reducing drift? (The level of drift I experience is currently my biggest issue)

    1. I think it can reduce drift if we and NOLO VR code it properly. It's still a work in progress, so it's hard to tell for sure.

      Wireless comms is their 200k Kickstarter stretch goal. Currently the headset marker needs to be connected to the PC.

  3. I also hope for controller support.

  4. I also hope you are not going to leave AMD users behind... And not all of us are using the latest HEVC-capable phones... if we were rich we probably would have already bought an HTC Vive or Oculus instead of this... or not?

    1. We're not leaving anyone behind. AMD and non-HEVC support is not going anywhere and we will continue improving it too. :)

  5. Hello, so is the Sideload VR setup video out of date now? Will there be a new video for the ConstructVR setup?

    1. My goal this coming week is to get a new video created for that.

  6. Are you planning to use "AMD ReLive"?

  7. How about using a Gear VR with NOLO? If the tracking data are transmitted via USB from the NOLO device to the PC, it should reduce the amount of positional and rotational data sent via WiFi. This could increase the quality of the graphics data sent to the phone. Why do you send sound data? It is not necessary! I guess most RiftCat users would use headphones attached to their PC, so this data could be taken out of the stream!

    1. With NOLO planning on having the option to send the positional data wirelessly, the audio streaming is truly the last piece of the puzzle for a totally wireless experience. Plus, you can probably disable it if you prefer to conserve bandwidth for the video stream.

    2. I would like to have an options menu where I can choose what is transmitted. To say it with a popular phrase of today: "Video Streaming First". This should be implemented in both VRidge and VRidge for Gear VR! Yesterday I tried Aces High III with Gear VR. Big immersion, but the stuttering sucks! Same settings used for Elite Dangerous: no significant stutter, works like a charm. Sounds crazy, because the graphics in ED are heavier!

  8. Sorry if it is mentioned somewhere else, but do you plan on releasing HEVC for Quick Sync (short/mid/long-term goal)? Good job with the new ConstructVR release. Quick and effective.

  9. Although software rendering works well, I can't seem to use Media Foundation or Intel Quick Sync, and I don't know why. The screen just hangs on 'waiting VR stream' and sometimes the app just crashes back to the 'Ready..' screen. Media Foundation produces a 'disable Hyper-V' error, which is annoying.
    I'm on a 3570K@4.4, 16GB RAM, Win10 AE, and an RX480 8GB.. all latest WHQL drivers.

    1. forgot to mention, I'm using Mi3 (SD801) with 2GB RAM, 1080p, Nougat 7.1