Tuesday, July 4, 2017

Dev update #31 - sound streaming and networking protocol changes




Version 1.4 is an update to both the mobile app and the desktop app. It's not backwards compatible, so you need to update both apps. It's currently available in the beta channel.

This update includes two major features - sound streaming (off by default, see the section below) and networking protocol changes (designed to reduce visual artifacting).

Keep in mind that this is the first iteration of these features, so you might encounter game-breaking bugs and need to roll back to stable. Expect a few minor bugfix updates on a weekly basis in July while we fix reported problems.

There's no Gear VR version yet, so keep using the stable channel if you are a Gear VR user.

Update (11 July): Gear VR beta version is now available here.

Known issues:

  • Higher bitrates may introduce higher than usual latency.
  • Streams may partially freeze (rotational tracking stops working) after a few minutes.

---

Sound streaming

We've been delaying sound streaming for a while for multiple reasons. We wanted it to work with the widest set of devices possible, and it took some time to tune all the parameters and codec settings to work on all of our test devices. This is the first public release, so it's likely that it still won't work on some phones we didn't test - please send your feedback and bug reports to support@riftcat.com. If you have problems with audio on your Android device, please follow the steps below:


  1. Go to Android app Settings -> Diagnostics and check "Enable diagnostics" in there.
  2. Reproduce the problem by streaming (or trying to stream) with sound for at least 20-30 seconds. Try a few times; you can restart the mobile app between tries.
  3. Go to Android app Settings -> Diagnostics again and tap "Send bug report". Enter your e-mail address so we can contact you and possibly ask follow-up questions that could help us fix it.


Sound streaming is currently disabled by default. In this state, it doesn't use any extra resources on the mobile or PC side. If you want to enable it, go to the PC app settings, expand "Other settings" and check the "Sound streaming" box. It's buried this deep because this is the first public release of the feature, so we expect many problems and device-specific issues. Once we polish it and fix reported problems, we're going to move it somewhere more prominent in the UI. We're redesigning this screen in 2.0 anyway, so we may figure out a better layout for the settings then.

Network protocol changes

This is a major protocol change and a part of the 2.0 changes that should make things better (maybe not initially, but once we tweak it properly). Generally, it should make streaming more stable in terms of artifacting. This unfortunately breaks compatibility with older versions of VRidge, so the 1.4 mobile app will require the 1.4 desktop app (and vice versa). We know that the 1.3 mobile app doesn't work properly on some older devices, so stick to the stable version if you are using one of these: Galaxy Grand Prime, Galaxy S5 Mini, BLU Advance 5.0. We should have one of those devices at the office soon so we can check out what's going wrong between 1.3 and the GPUs used by those devices.

If you don't want to be bored by tech details, you can skip the rest of this section and read only the last paragraph.

Until now we were using plain UDP datagrams to stream video data and tracking data. Adding sound streaming forced us to change our architecture so we can fit more channels (video and sound). Currently the new networking layer is set to work in RUDP (reliable UDP) mode, which means it will retransmit lost network data. This should greatly reduce artifacts. Let us explain why.
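The retransmission idea behind RUDP can be sketched in a few lines. This is purely illustrative - the class names, fields, and the stop-and-retransmit logic here are made up for the example and are not VRidge's actual protocol:

```python
# Minimal sketch of reliable UDP: every datagram carries a sequence number,
# the receiver acknowledges what arrived, and the sender keeps (and later
# resends) anything that was never acknowledged.

class RudpSender:
    def __init__(self):
        self.next_seq = 0
        self.unacked = {}            # seq -> payload, kept until acknowledged

    def send(self, payload):
        seq = self.next_seq
        self.next_seq += 1
        self.unacked[seq] = payload
        return (seq, payload)        # datagram handed to the UDP socket

    def on_ack(self, seq):
        self.unacked.pop(seq, None)

    def retransmit(self):
        # On a timeout, resend everything still unacknowledged.
        return sorted(self.unacked.items())


class RudpReceiver:
    def __init__(self):
        self.received = {}

    def on_datagram(self, seq, payload):
        self.received[seq] = payload
        return seq                   # ack sent back to the sender


# Simulate: packet 1 is lost on the wire, then recovered by retransmission.
sender, receiver = RudpSender(), RudpReceiver()
for data in [b"frame0", b"frame1", b"frame2"]:
    seq, payload = sender.send(data)
    if seq != 1:                     # drop packet 1
        sender.on_ack(receiver.on_datagram(seq, payload))

for seq, payload in sender.retransmit():
    sender.on_ack(receiver.on_datagram(seq, payload))

assert sorted(receiver.received) == [0, 1, 2]   # nothing is missing anymore
```

The cost of this recovery is latency: the resent packet arrives one round trip late, which is the trade-off discussed below.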

The video stream is transmitted as a mix of keyframes (I-frames) and predicted frames (P-frames). Predicted frames mostly store differential information (e.g. "the bullet from the previous frame moved 5 pixels to the right"). This uses much less data because only the movement needs to be transmitted, not the full picture again. It has one major drawback, though. If one frame is lost, all following frames that refer to it will show some sort of artifacts, because they are based (at least partially) on information that was never received. It's even worse when an important I-frame is lost, because what follows is the movement of things that don't yet exist in your stream. It may look like this:

Source: https://www.reddit.com/r/glitch_art/comments/1xgkcy/flip_and_swtich/
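This error accumulation can be shown with a toy delta coder. The frames, values, and GOP size below are made up for the example; this is a teaching sketch, not the real video codec VRidge uses:

```python
# Toy delta coder showing why one lost P-frame corrupts every frame after it
# until the next keyframe. A "frame" here is just a short list of pixel values.

def encode(frames, gop=4):
    """Every gop-th frame is an I-frame (full data); the rest carry deltas."""
    stream, prev = [], None
    for i, frame in enumerate(frames):
        if i % gop == 0:
            stream.append(("I", list(frame)))
        else:
            stream.append(("P", [cur - old for cur, old in zip(frame, prev)]))
        prev = frame
    return stream

def decode(stream):
    out, current = [], None
    for kind, data in stream:
        if kind == "I":
            current = list(data)     # keyframe: full picture, resets state
        elif current is not None:
            current = [px + d for px, d in zip(current, data)]  # apply deltas
        out.append(list(current) if current is not None else None)
    return out

frames = [[10, 10], [11, 10], [12, 10], [13, 10], [20, 20], [21, 20]]
stream = encode(frames)

# Simulate losing the P-frame at index 1: the decoder keeps applying later
# deltas to stale data, so frames 1 through 3 all come out wrong...
damaged = list(stream)
damaged[1] = ("P", [0, 0])           # the real delta never arrived
decoded = decode(damaged)
assert decoded[3] != frames[3]       # still corrupted three frames later
# ...until the I-frame at index 4 snaps the picture back to correct.
assert decoded[4] == frames[4]
```

Dropping a single delta corrupts every frame in the same group, and only the next keyframe restores a clean picture.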

It usually fixes itself in a second or two, because a new I-frame (keyframe) arrives and clears out all the visual artifacts that accumulated since the missing frame. Digging deeper, it's even more complicated, because one picture frame usually consists of multiple (sometimes even a hundred) network frames. If any of these frames gets lost, you get a partially broken picture. This manifests itself as floating pixels and frames falling apart.
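Here's a minimal sketch of that fragmentation step, with illustrative names and an assumed MTU (not VRidge's real packet format or sizes):

```python
# One encoded picture is split into many UDP-sized fragments; losing any
# single fragment leaves the picture partially broken.

MTU = 1200  # a commonly safe UDP payload size; purely an assumption here

def fragment(frame_id, data, mtu=MTU):
    """Split one encoded frame into (frame_id, index, total, chunk) tuples."""
    chunks = [data[i:i + mtu] for i in range(0, len(data), mtu)]
    total = len(chunks)
    return [(frame_id, idx, total, chunk) for idx, chunk in enumerate(chunks)]

def reassemble(packets):
    """Return the frame bytes, or None if any fragment is missing."""
    if not packets:
        return None
    total = packets[0][2]
    by_index = {idx: chunk for _, idx, _, chunk in packets}
    if len(by_index) != total:
        return None                   # incomplete -> partially broken picture
    return b"".join(by_index[i] for i in range(total))

encoded_frame = bytes(5000)           # a 5 KB picture -> 5 fragments
packets = fragment(42, encoded_frame)
assert reassemble(packets) == encoded_frame
assert reassemble(packets[:-1]) is None   # one lost fragment ruins the frame
```

With plain UDP there is no way to get that missing fragment back; with RUDP the sender simply retransmits it.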

This is why we want to test how RUDP works out in this scenario. We're starting with RUDP enabled for all packets and will tweak it based on user feedback. There are a few configurations we can use - RUDP can be applied on a per-packet basis, so we could, for example, mark only keyframes as "critically important", or use RUDP for the audio channel only.
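One of those per-packet configurations could look something like this. The channel names, frame kinds, and the policy itself are purely illustrative, not what VRidge ships:

```python
# A possible per-packet reliability policy: retransmit keyframes and audio,
# let predicted video frames be fire-and-forget.

RELIABLE, UNRELIABLE = "reliable", "unreliable"

def delivery_mode(channel, frame_kind=None):
    if channel == "audio":
        return RELIABLE        # small packets, and gaps are very audible
    if channel == "video" and frame_kind == "I":
        return RELIABLE        # a lost keyframe corrupts a whole frame group
    return UNRELIABLE          # a stale P-frame isn't worth a late resend

assert delivery_mode("audio") == RELIABLE
assert delivery_mode("video", "I") == RELIABLE
assert delivery_mode("video", "P") == UNRELIABLE
```

The trade-off is the one described below: full reliability removes artifacts but can add occasional lag spikes, while unreliable delivery keeps latency minimal at the cost of occasional broken frames.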

Once we receive some feedback, we're going to set some decent defaults and possibly make the rest configurable. Some people prefer raw frames arriving ASAP even if one is broken every now and then. Others would prefer a 5 ms lag spike once every few seconds to get rid of all artifacting. We need your feedback to answer some questions: How does the new streaming protocol work for you? Does it make things better?

Let's find out - we're listening at support@riftcat.com.

What's next?


  • We are working to update our API to v3, which uses Protocol Buffers instead of the default .NET serialization; the latter is poor for cross-platform and cross-language compatibility.
  • We continue our work on NOLO wireless mode. This should be out this July. It should eliminate any potential lag and CPU spikes caused by the NOLO-SteamVR-VRidge interaction.
  • We recently got hold of a Galaxy S8, so we want to test whether the blurry-right-eye problem that some users report on the S7 is reproducible on the S8. Maybe we'll be able to fix it.
  • The 2.0 rework is still in progress, but the 1.4 protocol changes will play a major role in 2.0, so we want to test them properly with this beta update.
  • We need to figure out a way of deploying beta updates to Gear VR apps (we would like to have the same stable/beta channel structure for the Gear VR app too).