Gesture-Controlled Media Player

Rapid prototype

When tasked with designing a gesture-based UI, our team developed an intuitive set of gesture controls for Netflix to improve the TV show and movie streaming experience, then tested them using a behavioral prototype.

Project Details

Time Frame: 2 days
Team Members: 3
Prototype: Behavioral/Wizard of Oz
Personal Contributions
  • Prototype Design
  • Video Recording
  • Task Design

Prototype

Our prototype was made from an iPhone placed inside an iPhone box. We punched a hole through the box so that the camera could see through it, which gave the device the appearance of a motion sensor and also let us record the test session from an additional angle. We cut a slot in the bottom of the box so that the iPhone inside could be plugged into a laptop to monitor the recording, and we painted the box black to disguise the fact that it was an iPhone box.

We used Chromecast to stream Netflix to a TV screen while controlling playback from another laptop. A second iPhone recorded the session from another corner of the room to capture what was happening on the TV screen while the participant performed the gestures.

The participants were informed that they would be testing a gesture-based device for viewing Netflix. They were taught three gestures: start/stop, adjust volume, and advance to the next episode. After learning each gesture, they were given a chance to use it and asked for their feedback.

Gestures

Play and Pause

To pause video playback, the user holds up their hand, palm forward, a familiar “stop” indicator. To resume playback, the user repeats the motion.

Scrub

To scrub to a particular position, the user pinches their fingers together and makes a sliding gesture, as if dragging a video scrubber on a computer or smartphone.

Adjust Volume

To control the video’s volume, the user raises or lowers their hand until the volume reaches the desired level or hits its maximum or minimum.

Next and Previous Episode

To advance to the next episode, the user makes a swiping motion with their hand similar to swiping on a touch screen. To go to the previous episode, they swipe in the opposite direction.
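
Because this was a behavioral prototype, no gesture recognition actually ran; a team member triggered every response from a laptop. As a rough illustration only, the sketch below shows one way this gesture set could map onto playback commands if a real recognizer emitted gesture events. The Gesture union and PlayerControls interface are assumptions made for the example, not part of the prototype.

```typescript
// Hypothetical mapping from recognized gestures to playback commands.
// The actual prototype was Wizard of Oz: a human operator controlled Netflix
// from a laptop, so none of this code existed in the study.

type Gesture =
  | { kind: "palmForward" }                          // play/pause toggle
  | { kind: "pinchSlide"; position: number }         // scrub target, 0..1 of the timeline
  | { kind: "raiseLower"; delta: number }            // volume change per update, -1..1
  | { kind: "swipe"; direction: "left" | "right" };  // previous/next episode

// Assumed player interface; stands in for whatever casting/playback API is used.
interface PlayerControls {
  togglePlayback(): void;
  seekToFraction(fraction: number): void;
  adjustVolume(delta: number): void;
  nextEpisode(): void;
  previousEpisode(): void;
}

function handleGesture(gesture: Gesture, player: PlayerControls): void {
  switch (gesture.kind) {
    case "palmForward":
      player.togglePlayback();
      break;
    case "pinchSlide":
      // Clamp the requested position to the valid range before seeking.
      player.seekToFraction(Math.min(1, Math.max(0, gesture.position)));
      break;
    case "raiseLower":
      player.adjustVolume(gesture.delta);
      break;
    case "swipe":
      if (gesture.direction === "right") {
        player.nextEpisode();
      } else {
        player.previousEpisode();
      }
      break;
  }
}
```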

Analysis

The first participant was very enthusiastic about gesture-based controls. She thought they would be great for her mother, who has difficulty learning to use remote controls, and felt her mother would find gestures much easier to learn. She also said that the swiping “advance to the next episode” gesture made sense because that is how she would do it on a touch screen. If it worked perfectly, she would use gestures instead of a remote.

The second participant thought the stop/start motion was a little awkward but conceded that “it makes sense.” She said the volume gesture matched how she had imagined adjusting the volume up and down, and that she preferred the gesture-based UI to natural language UIs.

Neither participant suspected that they were not interacting with a functioning prototype. Both of them were surprised when we revealed that the prototype was not as “real” as they were led to believe.