Tuesday, July 4, 2017

Dev update #31 - sound streaming and networking protocol changes

Version 1.4 is an update of both the mobile and desktop apps. It's not backwards compatible, so you need to update both apps. It's currently available in the beta channel.

This update includes two major features - sound streaming (off by default, see paragraph below) and networking protocol changes (designed to decrease visual artifacting).

Keep in mind that this is the first iteration of these features, so you might encounter game-breaking bugs and need to roll back to stable. You may notice a few more minor bugfix updates on a weekly basis in July while we fix reported problems.

There's no Gear VR version yet, so keep using the stable channel if you are a Gear VR user.

Update (11 July): Gear VR beta version is now available here.

Known issues:

  • Higher bitrates may introduce higher than usual latency.
  • Streams may partially freeze (rotational tracking stops working) after a few minutes.


Sound streaming

We've been delaying it for a while for multiple reasons. We wanted it to work with the widest set of devices possible, and it took some time to tune all the parameters and codec settings to work on all of our test devices. This is the first public release, so it's likely it still won't work on some phones that we didn't test, so please send your feedback and bug reports to support@riftcat.com. If you have problems with audio on your Android device, please follow the steps below:

  1. Go to Android app Settings -> Diagnostics and check "Enable diagnostics" in there.
  2. Reproduce the problem by streaming (or trying to stream) with sound for at least 20-30 seconds. Try a few times; you can restart the mobile app between tries.
  3. Go to Android app Settings -> Diagnostics again and tap "Send bug report". Enter your e-mail address so we can contact you and maybe ask some more questions that could help us fix it.

Currently, sound streaming is disabled by default. In this state, it doesn't use any extra resources on the mobile or PC side. If you want to enable sound streaming, go to the PC app settings, expand "Other settings" and check the "Sound streaming" box. It's currently buried this deep because this is the first public release of the feature, so we expect many problems and device-specific issues. Once we polish it and fix reported problems, we're going to move it somewhere else in the UI. We're changing this screen in 2.0 anyway, so we might figure out a better layout for the settings.

Network protocol changes

This is a major protocol change and part of the 2.0 changes that should make things better (maybe not initially, but once we tweak it properly). Generally, it should make streaming more stable (in terms of artifacting). This unfortunately breaks compatibility with older versions of VRidge, so the 1.4 mobile app will require the 1.4 desktop app (and vice versa). We know that the 1.3 mobile app doesn't work properly on some older devices, so stick to the stable version if you are using one of these: Galaxy Grand Prime, Galaxy S5 Mini, Blue Advance 5.0. We should have one of those devices at the office soon so we can check what's wrong with 1.3 and the GPU used by those devices.

If you don't want to be bored by tech details, you can skip the rest of this section and read only the last paragraph.

Until now we were using UDP datagrams to stream video and tracking data. Adding sound streaming forced us to change our architecture so we can fit more channels (video and sound). Currently, the new networking layer is set to work in RUDP (reliable UDP) mode, which means it will retransmit lost network data. This should greatly reduce artifacts. Let us explain why.
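The VRidge networking code isn't public, so purely as an illustration of the retransmission idea, here is a toy model in Python. The function name, loss rate, and retry limit are all invented for the sketch, not our actual parameters:

```python
import random

def send_reliable(packets, loss_rate=0.3, max_retries=10, seed=7):
    """Toy RUDP model: keep retransmitting each packet until the
    simulated receiver gets a copy (or we give up after max_retries)."""
    rng = random.Random(seed)
    delivered, transmissions = [], 0
    for payload in packets:
        for _ in range(max_retries):
            transmissions += 1
            if rng.random() > loss_rate:  # this copy survived the network
                delivered.append(payload)
                break
    return delivered, transmissions

frames = ["I-frame", "P-frame 1", "P-frame 2", "P-frame 3"]
received, sent = send_reliable(frames)
# Every frame eventually arrives, in order, at the cost of some extra
# transmissions (the source of the added latency mentioned in known issues).
```

The trade-off discussed below falls out of this directly: reliability costs retransmissions, and retransmissions cost time, which matters more the higher the bitrate goes.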

The video stream is transmitted as a mix of keyframes (I-frames) and predicted frames (P-frames). Predicted frames mostly store differential information (e.g. "the bullet from the previous frame moved 5 pixels to the right"). This uses much less data because only the movement needs to be transmitted, not the full picture again. This has one major drawback though: if one frame is lost, all following frames that refer to it will show some sort of artifacts, because each next frame is based (at least partially) on information that wasn't received. It's even worse when an important I-frame is lost, because it's followed by movement of things that do not exist yet in your stream. It may look like this:

Source: https://www.reddit.com/r/glitch_art/comments/1xgkcy/flip_and_swtich/

It usually fixes itself in a second or two because a new I-frame (keyframe) is received and it clears out all the visual artifacts that accumulated since the missing frame. Digging deeper, it's even more complicated, because one picture frame usually consists of multiple (sometimes even a hundred) network packets. If any of these packets gets lost, you get a partially broken picture. This manifests itself as floating pixels and frames falling apart.
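To make the keyframe/delta mechanics concrete, here is a toy decoder. The single-number "picture" and the frame tuples are invented for the illustration: an I-frame carries a full value, a P-frame carries only a delta, and a lost P-frame leaves every following frame wrong until the next I-frame resets things:

```python
def decode(frames):
    """Toy decoder: ("I", v) sets the picture to v,
    ("P", d) adds a delta to whatever was decoded last."""
    picture, out = None, []
    for kind, value in frames:
        if kind == "I":
            picture = value             # keyframe: full refresh
        else:
            picture = picture + value   # predicted frame: delta only
        out.append(picture)
    return out

stream = [("I", 100), ("P", 5), ("P", 5), ("I", 120), ("P", 5)]
print(decode(stream))                   # [100, 105, 110, 120, 125]

# Drop the first P-frame: the next decode is off by the missing delta
# and stays wrong until the I-frame resets the picture.
lossy = [stream[0]] + stream[2:]
print(decode(lossy))                    # [100, 105, 120, 125]
```

In the lossy run, the second value should have been 110; it stays wrong until the I-frame arrives, which is exactly the "fixes itself in a second or two" behaviour described above.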

This is why we want to test how RUDP works out in this scenario. We're starting with RUDP enabled for all packets and will tweak it based on user feedback. There are a few configurations we can use: RUDP can be applied on a per-packet basis, so we could, for example, mark only keyframes as "critically important", or use RUDP for the audio channel only.
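A per-packet reliability policy like the ones listed could be as simple as a tagging function. The policy names and fields below are hypothetical, not VRidge's actual configuration:

```python
def needs_retransmission(channel, frame_kind, policy):
    """Hypothetical policy deciding which packets get RUDP treatment.
    channel: "video" or "audio"; frame_kind: "I", "P", or None."""
    if policy == "all":           # current 1.4 beta behaviour
        return True
    if policy == "keyframes":     # only I-frames are critical
        return channel == "video" and frame_kind == "I"
    if policy == "audio-only":    # audio reliable, video best-effort
        return channel == "audio"
    return False                  # plain UDP for everything

print(needs_retransmission("video", "I", "keyframes"))   # True
print(needs_retransmission("video", "P", "keyframes"))   # False
```

Under a "keyframes" policy, a lost P-frame would still cause a brief artifact, but a lost I-frame (the worst case described above) would always be retransmitted.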

Once we receive some feedback, we're going to set some decent defaults and possibly make the rest configurable. Some people prefer raw frames arriving ASAP even if one is broken every now and then. Others would prefer a 5 ms lag spike once every few seconds to get rid of all artifacting. We need your feedback to answer some questions: How does the new streaming protocol work for you? Does it make things better?

Let's find out - we're listening at support@riftcat.com.

What's next?

  • We are working to update our API to v3, which uses Protocol Buffers instead of the default .NET serialization, which is poorly suited to cross-platform and cross-language compatibility.
  • We continue our work on NOLO wireless mode. This should be out this July. It should eliminate all potential lag and CPU spikes that could be caused by the NOLO-SteamVR-VRidge interaction.
  • We recently got hold of a Galaxy S8, so we want to test whether the right-blurry-eye problem that some users report with the S7 is reproducible on the S8. Maybe we'll be able to fix it.
  • The 2.0 rework is still in progress, but the 1.4 protocol changes are one of the things that will play a major role in 2.0, so we want to test them properly with this beta update.
  • We need to figure out a way of deploying beta updates to Gear VR apps (we would like to have the same structure of stable/beta channels with Gear VR app too).


  1. Is Daydream support still being worked on? It's a bit annoying having to disable and enable the service again to work with VRidge and if the Daydream API is more responsive/accurate in any way those advantages would be nice.

    1. The API is the same, so we're not losing sensor precision or anything like that. We know it's a PITA to toggle between Cardboard and Daydream mode, and we have plans to give a proper Daydream mode a shot with 2.0. We're not sure if it will work out as well as we would like, but we will definitely try.

    2. Assuming you're using a recent version of the Google VR SDK, isn't enabling "Daydream mode" as simple as adding the right intent filter to the AndroidManifest file?

  2. I wish I could use Gear VR on my Nexus 5x phone.
    The Biggest problem I have right now is head tracking, my phone will spin me 180° every so often, a real kill joy. Anyone find a good solution to this yet? Or do iPhones and Galaxys not have this problem?

  3. I don't think it would be much of a bother (for users at least) to have a second GearVR Beta app. If they wanted to "switch" to the beta channel they could just choose between the two in the app screen.

    1. This is the solution we're most likely going to use. We just want to test a few more things that are Gear VR specific and fix at least one known issue on the PC side. I hope it will be ready later this week.

    2. Probably later today. We're running final tests.

  4. For Gear VR, couldn't you simply have 2 separate apps on Construct VR? (one for beta and one for stable)

  5. I hope it will solve the latency problem from Nolo (to have Nolo integration into vridge).

  6. Awesome work Rift Cat team! Your nifty app makes my day that bit more enjoyable. I hope to test the beta soon. :)

    New Zealand Rift Cat Fanatic

  7. Strangely audio streaming was on by default on mine.

  8. Riftcat now runs a lot smoother on my Samsung S8, no jiggle etc. just a great picture.

    But since the last update (1.4.1), Riftcat loses Phone tracking (left, right, up and down) after 5-10 minutes. I can still see the Movement of my Controller and when I restart the App it all works again.

    In Version 1.4.0 the picture freeze after a while. Had to restart the app too.

    Any help would be greatly appreciated.

    1. i experience that too..
      i used USB and WiFi and the result is same...

      now I reinstalled to version 1.3.3
      it's more stable for now

    2. Yep, confirmed here too. With 1.4 it started out great, then a short time later the phone screen image just grinds to a halt.
      With audio streaming on or off.

  9. Go Rifcat!!
    still waiting moonlight support.... :D
    and hope it will have function to play non VR game in future.

  10. Nice! But always disconnect......

  11. Any updates on the GearVR version?

  12. when will have moonlight support?riftcat stream is very bad with my phone lenovo k5 note

  13. Just tested the Gear Vr update and unfortunately headtracking stops after just a few seconds.

  14. Ok, I tried version 1.4 (both with the GearVR and with Cardboard) and it's terribly sluggish. I mean, turn my head to the right and wait 5-7 seconds for the video to catch up bad. Then after a minute or two, the streaming would simply freeze and never update again. Looking at the PC screen though, the movements were sent in real-time to the PC so it's really just the streaming that's gone to hell.

    I tried turning audio streaming on and off and it didn't help. Also, the audio was never played anywhere else than my speaker set.

    I had to go back to 1.3.3 where I can do 95 mbps streaming without any issue.

    1. We're investigating the freeze, and I think you're right about the lag at 100 Mbps. With RUDP (the new protocol) it tries to retransmit every single lost packet, and the higher the bitrate, the more likely it is that packets will be lost. It should work a lot better at a 15-25 Mbps bitrate.

      We're thinking about adding a toggle for the retransmission protocol because it may indeed add some latency at extremely high bitrates.

    2. Actually I think, works as intended. No more artefact for me, but I needed change bitrate to 25 from 50 to eliminate lag. I am using 5Ghz adhoc wifi (onbord Asus ROG with antene right on my table) and Galaxy S7. Before 1.4 there was a lot of artefact. Still a lot of jittering, looking up for moonlight. Rotation for me freeze after like 2-3 minutes. Good work anyway. All of above apply for GearVR version.

    3. As far as the toggle goes for retransmission protocol, maybe have 3 options: every frame, keyframes only, and off.