UX

Common VR user experience issues

My experience so far is of 180 and 360 movies, some interactive movies and a couple of games. I have yet to experience any VR sickness. My background has involved a lot of software testing, so I generally feel the need to write up anything that looks or feels wrong. The following are the most common UX issues I have encountered to date. I have used the Oculus browser a couple of times, but for the most part these arose within the game apps or the media player apps.

Streaming artefacts

When using a slow internet connection (mine was at less than 7Mb/s at the time of writing), a streamed 360 video suffers from low resolution and compression artefacts. This gives blurry, chunky edges to high-contrast areas. While this is the same as you'd expect with a normal 2D movie, seeing it within a stereoscopic movie seems to amplify the discomfort. The only route to full quality is to download the file before playing, if the player app allows this.

Objects (or people) too close to the camera

I have viewed a number of objects placed far too close to the camera. As an object gets too close, it first becomes uncomfortable to look at. At a certain point the stereoscopic view breaks down and double vision ensues, making for a horrible experience. The production guidance I have found so far is that user comfort requires objects to be 0.75m to 3.5m from the camera within the environment. The same applies to both captured video and 3D-rendered scenes.

People are smaller

I think in all of the acted video experiences I have watched, the world appears smaller than real life, which becomes apparent when viewing people. They seem to be about 66–75% of full size. At least they are consistent within Amaze, so some standard parameters must be in use.

Rapid movement

A director of standard 2D video uses tricks such as depth of field and blurring to guide the viewer's eye and give the impression of smooth movement. Neither of these works in VR video. The viewer moves their eyes wherever they like, so everything must be in focus and sharp. This drives the need for higher frame rates. Even so, fast-moving parts such as helicopter rotors appear to strobe rather than sweep.

Camera vertical position

In real life, I am slightly over average height (6'1"). Most of the experiences I have seen have placed the camera at the actor's eye height, or lower. In normal video this doesn't seem to be an issue, but in VR, especially where other people in view are taller than the actor being filmed, it feels unnatural to me. Clearly this will be different for everyone and may just be an oddity of VR that cannot be overcome, but through some of the video experiences it can feel like I am crouching in an unnatural position.

Taking screenshots

Taking screenshots within Oculus Go leaves something to be desired. Currently it requires the user to drop out of their experience, choose Sharing > Take Photo from the navigation bar, then return to the experience; the snapshot is taken 5 seconds after the Take Photo button was pressed. This doesn't work for me at all in the missed spaceflight experience.

Another route is mirroring the screen to my Mac using VLC and taking the screenshot there. Unfortunately there is often some lag or degradation, and it is difficult to keep one eye on the mirror while staying inside the experience.

The easiest method I have found so far is to use Sharing > Record Video, then use the Android File Transfer tool to pull the mp4 video from the device and pick the frame(s) I want to use later.
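As a sketch, that video route can be scripted from the desktop side, assuming `adb` (with developer mode enabled on the Go) and `ffmpeg` are available — both tools, the storage path and the file name here are my assumptions, standing in for Android File Transfer:

```shell
# Pull a screen recording from the headset via adb.
# The Oculus Go stores captures under /sdcard/Oculus/VideoShots/
# (path and file name may vary by firmware - adjust to taste).
adb pull /sdcard/Oculus/VideoShots/recording.mp4 .

# Extract a single frame 5 seconds into the clip as a PNG.
ffmpeg -ss 5 -i recording.mp4 -frames:v 1 screenshot.png
```

Recording a short clip and cherry-picking frames afterwards also avoids the 5-second countdown problem entirely.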

How to set up mirroring is well described on Pixvana (https://pixvana.com/sharing-your-oculus-go-screen-on-your-laptop/).

This year's Oculus Connect (OC5) announced Casting for Oculus Go (https://www.oculus.com/blog/oculus-go-at-oc5-even-more-to-watch-and-play/). It will be interesting to see if and how that provides a better route to extract and share experiences.

Posted by creacog in 3d, Oculus, Oculus Go, UX, VR/AR

New year 2018

Personally speaking, 2018 is going to be a year of getting at least some Agile and DevOps adopted at work.

Step 1 – Get some tooling… Kicked off with Vagrant and commenced updating my brain from my previous PHP 5 use to PHP 7. So far so good.

Step 2 – Read up on Jeff Patton's User Story Mapping, which does seem to live up to its strap-line: "Discover the whole story, build the right product". From reading this, "shared understanding" will be my new catchphrase for the year.

Step 3 – Have a play with Docker… The first problem is my otherwise trusty mid-2009 MacBook Pro:

$ sysctl kern.hv_support
kern.hv_support: 0

So there’ll be no Docker practice on this machine.
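That check can be wrapped into a small guard script — a sketch, assuming a POSIX shell where a missing sysctl key simply fails (Docker Toolbox with VirtualBox was the usual advice for pre-2010 Macs at the time):

```shell
# Query the macOS kernel for Hypervisor.framework support.
# Docker for Mac needs kern.hv_support = 1; on machines (or OSes)
# without the key, fall back to 0.
hv=$(sysctl -n kern.hv_support 2>/dev/null || echo 0)
[ -n "$hv" ] || hv=0

if [ "$hv" = "1" ]; then
  echo "Hardware virtualization available: Docker for Mac should run"
else
  echo "No hypervisor support: consider Docker Toolbox (VirtualBox) instead"
fi
```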

Step 4 – Introduce work to some supporting tools: (JIRA, Confluence, Bitbucket)

Step 5 – Start using this stuff. There are plenty of legacy work projects to migrate, plus a couple of home projects.

This might be the excuse needed to justify updating my otherwise very long-lived Macs.

Posted by creacog in Developer, Personal, UX