
Hacking Skybox on Oculus Go for StereoPi live streaming


I’ll skip a lengthy intro and go straight to the point.

So, I have a stereo camera capable of outputting H.264 video over various protocols, and I have an Oculus Go headset. How do I watch a stereo live stream from the cam on the headset? Preferably with minimal latency and locally, so YouTube and other RTMP video services are a no-go.

Skipping ahead, here’s what I’ve managed to achieve: first, I played a pre-recorded video file from StereoPi, and then I played a live stream from StereoPi (MPEG-TS through UDP).


The stereo camera that I use is a StereoPi, so the examples below are all related to that piece of hardware. Basically, it’s a regular Raspberry Pi with two cameras on it, so you can try these examples on a regular Pi if you feel like it. You’ll just need to flash the StereoPi firmware onto it.

The first thing I tried was making a regular Android app that plays the camera stream in full screen and sideloading it onto the Oculus (through adb).

With some tweaking of the manifest I managed to persuade the headset to treat the app as a native one. It appeared under ‘Unknown Sources’ in the Library, I was able to launch it, and it played everything it should. But there was a problem: head movements were not being tracked, so the video from the camera just played in full screen on the headset. Sure, the stereo effect was there, but the slightest head movement messed with my brain, making me feel extremely uncomfortable.

Just in case, here’s the .apk: StereoPi for Oculus Go, and the discussion on sideloading.

The archive includes a copy of adb, so you can try sideloading it onto Oculus straight away. Simply run

adb install StereoPi.apk

Then go to Library -> Unknown Sources and you should see the com.virt2real.stereopi app there.

Oculus Go unknown sources


Run it and, if your StereoPi is connected to the same local network as your Oculus Go, you should see the stereo image from the camera.

But this wasn’t good enough… I wanted a proper native Oculus video player app, so that the screen would stay static and head movements wouldn’t mess with my brain. I’m not up for learning Unity for Oculus just yet, so I thought of trying an existing video player app from the Oculus app store. I use Skybox for watching 3D movies, so I decided to go with that particular app.

Besides regular media file playback from built-in storage and from network devices, Skybox has an interesting menu item called AirScreen. It turns out you can install a Skybox app onto a Windows (or Mac) computer, load your video files into it, and then watch them on your headset. So the desktop app acts as a video server, and the headset acts as a client. I couldn’t find the connection protocol documented anywhere, so I had to bring in tcpdump.

After some digging, it turned out that Skybox uses broadcast UDP messages to look for servers on the local network. The message looks something like this:

{"command":"search","project":"direwolf","deviceId":"66a86b57-b292-3957-9fc9-4041d5e1f841","deviceType":"vr","udpPort":"6881"}

All the messages are in JSON format, which is very convenient.
You need to reply to the sender’s host on the UDP port stated in the message, 6881 in this case:

{"udp":true,"project":"direwolf server","command":"searchResult","deviceId":"66a86b57-b292-3957-9fc9-4041d5e1f841","computerId":"53709de962eba2f9695c8a926562486c","computerName":"STEREO-PI","ip":"192.168.1.51","ips":["192.168.1.51"],"port":6888}

In this reply you indicate the host and port where your WebSocket server is running. All subsequent communication is done through WebSockets.
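To illustrate the discovery step, here is a minimal sketch of a responder in TypeScript on Node.js, using the built-in dgram module. It assumes the broadcasts arrive on the same port 6881 that the message names in udpPort, and it hardcodes the IP and computerId values from my capture above; a real server would fill those in dynamically.

import * as dgram from "dgram";

// Assumption: Skybox broadcasts its search message to the same port
// it names in the "udpPort" field of the message.
const DISCOVERY_PORT = 6881;

const sock = dgram.createSocket("udp4");

sock.on("message", (msg, rinfo) => {
  const req = JSON.parse(msg.toString());
  if (req.command !== "search") return;

  // Echo the deviceId back and advertise the host and port
  // where our WebSocket server listens.
  const reply = {
    udp: true,
    project: "direwolf server",
    command: "searchResult",
    deviceId: req.deviceId,
    computerId: "53709de962eba2f9695c8a926562486c",
    computerName: "STEREO-PI",
    ip: "192.168.1.51",
    ips: ["192.168.1.51"],
    port: 6888,
  };

  // Reply to the sender's host on the UDP port stated in the request.
  sock.send(JSON.stringify(reply), parseInt(req.udpPort, 10), rinfo.address);
});

sock.bind(DISCOVERY_PORT);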

For example, the first WebSocket message will be something like this:

{"command":"addDevice","deviceId":"66a86b57-b292-3957-9fc9-4041d5e1f841","deviceName":"Oculus Pacific","deviceType":"vr","showLoginCode":true}

Send back a reply that simply echoes the request:

{"command":"addDevice","deviceId":"66a86b57-b292-3957-9fc9-4041d5e1f841","deviceName":"Oculus Pacific","deviceType":"vr","showLoginCode":true}
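Here is a sketch of that handshake on the WebSocket side, assuming Node.js with the ws npm package (the real Skybox server surely does more bookkeeping than this):

import { WebSocketServer } from "ws";

// Listen on the port we advertised in searchResult.
const wss = new WebSocketServer({ port: 6888 });

wss.on("connection", (client) => {
  client.on("message", (data) => {
    const msg = JSON.parse(data.toString());

    if (msg.command === "addDevice") {
      // The reply simply echoes the request back.
      client.send(JSON.stringify(msg));
    }
    // ...the other requests (playlist contents etc.) are handled here too.
  });
});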

Then you should see your StereoPi in Skybox. After that you’ll get a bunch of other requests, which you have to answer, such as a request for the contents of your playlist.

Skybox playlist example:

[ { id: 'livestream-rtsp',
    name: 'Live Stream RTSP',
    duration: 0,
    size: 0,
    url: 'rtsp://192.168.1.51:554/h264',
    thumbnail: 'http://192.168.1.51/thumbnail/livestream.png',
    thumbnailWidth: 186,
    thumbnailHeight: 120,
    lastModified: 1,
    defaultVRSetting: 1,
    userVRSetting: 2,
    width: 1280,
    height: 720,
    orientDegree: '0',
    subtitles: [],
    ratioTypeFor2DScreen: 'default',
    rotationFor2DScreen: 0,
    exists: true,
    isBadMedia: false,
    addedTime: 1 },
  { id: 'livestream-mpegts',
    name: 'Live Stream MPEG-TS',
    duration: 0,
    size: 0,
    url: 'udp://@:3001',
    thumbnail: 'http://192.168.1.51/thumbnail/livestream.png',
    thumbnailWidth: 186,
    thumbnailHeight: 120,
    lastModified: 1,
    defaultVRSetting: 1,
    userVRSetting: 2,
    width: 1280,
    height: 720,
    orientDegree: '0',
    subtitles: [],
    ratioTypeFor2DScreen: 'default',
    rotationFor2DScreen: 0,
    exists: true,
    isBadMedia: false,
    addedTime: 1 } ]
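For illustration, the playlist reply could be wired into the same message handler as above. I never captured the exact command name Skybox uses to request the playlist, so getPlaylist below is a hypothetical placeholder, as are the trimmed-down entries:

import { WebSocket } from "ws";

// One playlist entry, trimmed to the essential fields from the example above.
const playlist = [
  {
    id: "livestream-mpegts",
    name: "Live Stream MPEG-TS",
    duration: 0,
    size: 0,
    url: "udp://@:3001", // VLC-style UDP source; Skybox accepts the same form
    width: 1280,
    height: 720,
    // ...remaining fields as in the example above
  },
];

// Hypothetical handler; "getPlaylist" stands in for the real command name.
function handlePlaylistRequest(client: WebSocket, msg: { command: string }): void {
  if (msg.command === "getPlaylist") {
    client.send(JSON.stringify({ command: "getPlaylist", result: playlist }));
  }
}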


This was particularly interesting, since the playlist generated by the desktop app carried the much-coveted RTSP acronym. It seemed that the server app streamed video files over RTSP, which would allow live video streaming, which was just what I needed. In reality, though, although the playlist does say RTSP, the links to the video files are plain HTTP. So the server app actually sends video files over HTTP, and that didn’t suit my needs.

I was about to get upset when I thought of sending the link in a format that VLC usually understands, i.e. rtsp://192.168.1.51:554/h264. And, lo and behold, Skybox started playing the video stream from an RTSP server in full-blown stereo. The latency, however, was enormous: about 20 seconds.

So I dug deeper and tried sending a UDP stream in MPEG-TS. VLC usually takes such streams with links like udp://@:3001, so I formatted the link that way for Skybox. Then I only needed to direct the MPEG-TS stream to the Oculus host on the specified UDP port. I used GStreamer for that, like so:

raspivid -3d sbs -w 1280 -h 720 -t 0 -o - | gst-launch-1.0 -q fdsrc ! h264parse ! mpegtsmux alignment=7 name=muxer ! rndbuffersize max=1316 min=1316 ! multiudpsink clients="192.168.1.60:3001" sync=false

(With alignment=7, the muxer emits seven 188-byte MPEG-TS packets at a time, which is why rndbuffersize is pinned to 1316 bytes: each UDP datagram carries whole TS packets and stays under the typical 1500-byte MTU.) In the Skybox playlist I clicked the ‘Live Stream MPEG-TS’ item and voilà, I got the MPEG-TS live stream full screen in a virtual movie theatre. The latency was much lower than over RTSP, only 2–3 seconds, but still much higher than in my simple app, which accepts raw H.264 streams over UDP (the latency there is usually 100–150 ms at 720p).

I hit a brick wall here: I haven’t managed to lower the latency any further. Maybe buffering needs to be disabled in Skybox itself. I’ll try asking the Skybox developers to add an option for that :-)

UPDATE: I got an answer from the Skybox developers. Here it is:

Hi there,

Thank you for reaching out.
We are really amazed by the live streaming you achieved.
For the latency issue, it’s probably because the SKYBOX PC Client has not been optimized to use the UDP protocol.
In the near term, we do not have plans to add the feature of live streaming into SKYBOX so we have no better advice for you.
We will talk about this feature request in the product meeting this week but no promise on when to deliver it…

Thank you for your understanding and support.


You can read the discussion here on the Skybox forum.

In conclusion

If for some reason you need to watch a live video stream on Oculus Go or another VR headset (Skybox is apparently available on many platforms), you can try the method I described. I’m not sure it’ll work with other stereo cameras, but it definitely works with StereoPi, tried and tested.

Links

Skybox server source code

Forum discussion thread with the update for our S.L.P. Raspbian Image

Thank you all for your attention.