In-Flight Entertainment
Research & Development

Build the next-generation controller for IFE.

The Intention

An avionics company asked Tactile to research and develop a new physical controller for its business- and premium-class IFEC (In-Flight Entertainment & Connectivity) system. As the UX designer on the project, much of my work over six months went into careful, diligent research and discovery of which interaction patterns would be appropriate for a controller working in tandem with an IFEC. Because the final design is still in development, I won’t be showing it here.

My Role

Senior UX Designer
UX Research
UX / UI Design
Prototyping

Extra Credit

Tactile – Seattle
Aaron Piazza – Design Director
Adam Weisgerber – Industrial Designer
Spenser Dodge – Industrial Design & Research
Marcus Pape – UX Director

Research In-Flight Tasks and Tech.

Phase ONE

The first phase of our work was a series of conversations with our client to understand and plot out the primary IFEC tasks. Once those were defined, I scored each task against each candidate technological mechanism, drawing on the insights of our team and client. I used a simple 0–3 scale, where 1 meant “plausible” and 3 meant “must have.”

These scores gave us an overall impression while letting us narrow in on the most promising technologies and interactions.
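To illustrate the mechanics (not the project’s actual data), here is a minimal sketch of how such a task-versus-technology scoring matrix could be tallied; the task names, technology names, and scores below are hypothetical placeholders.

```typescript
// Sketch of a task-vs-technology scoring matrix on a 0–3 scale,
// where 1 = "plausible" and 3 = "must have".
// All task and technology names here are illustrative placeholders.
type Score = 0 | 1 | 2 | 3;

const technologies = ["Touchpad", "Jog wheel", "D-pad"] as const;
type Technology = (typeof technologies)[number];

// Each IFEC task scored against each technological mechanism.
const matrix: Record<string, Record<Technology, Score>> = {
  "Browse movie catalog":  { "Touchpad": 3, "Jog wheel": 2, "D-pad": 2 },
  "Adjust volume":         { "Touchpad": 1, "Jog wheel": 3, "D-pad": 2 },
  "Call flight attendant": { "Touchpad": 1, "Jog wheel": 0, "D-pad": 3 },
};

// Sum each technology's scores across all tasks to surface
// the most promising candidates overall.
const totals = technologies.map((tech) => ({
  tech,
  total: Object.values(matrix).reduce((sum, row) => sum + row[tech], 0),
}));

totals.sort((a, b) => b.total - a.total);
console.table(totals); // highest total = strongest overall fit
```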

IFE Task & Technology Scoring Matrix

Explore physical interaction patterns to control screen-based UI.

Phase TWO

As we gained a clearer picture of the tasks and technologies we wanted to focus on, guardrails formed around the experience. I distilled the actions needed to perform these tasks into basic interaction patterns, which gave our team and client the shared vocabulary needed to discuss best practices moving forward.
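As a rough illustration of what such a shared vocabulary can look like when written down, here is a minimal sketch; the pattern names, controls, and effects are hypothetical stand-ins, not the project’s actual taxonomy.

```typescript
// Illustrative vocabulary: basic interaction patterns mapped to the
// physical controls that could express them. All names are placeholders.
interface InteractionPattern {
  pattern: string;            // abstract action distilled from IFEC tasks
  physicalControls: string[]; // candidate hardware affordances
  onScreenEffect: string;     // what the UI does in response
}

const vocabulary: InteractionPattern[] = [
  {
    pattern: "Directional focus",
    physicalControls: ["D-pad", "swipe on touchpad"],
    onScreenEffect: "Move the highlight between UI elements",
  },
  {
    pattern: "Continuous adjust",
    physicalControls: ["jog wheel", "long-press direction"],
    onScreenEffect: "Scrub volume, brightness, or a timeline",
  },
  {
    pattern: "Commit / cancel",
    physicalControls: ["select button", "back button"],
    onScreenEffect: "Activate the focused element or step back",
  },
];

console.table(vocabulary);
```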


Basic Interaction Patterns Mapped to Physical Interfaces

Generating Discussion on Physical Interaction Patterns


Prototype the Experience.

Phase TWO-B

The industrial designer and I worked closely together to define product solutions as well as the overall experience. I defined the interactions and screens based on our task explorations, while he refined the details of the buttons and the physical housing of the remote.

Once I finished the screen interface, I built a prototype that used an Apple TV remote to give a sense of the experience. After the basic prototype was approved, our team custom-built a working physical controller that engaged directly with the on-screen interface.
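For a sense of how a prototype like this can wire directional input to an on-screen interface, here is a minimal sketch in TypeScript; browser arrow-key events stand in for the remote, and the `.ife-tile` class and `focused` style are hypothetical, since the actual prototype’s implementation isn’t shown here.

```typescript
// Sketch: drive on-screen focus from a physical controller's D-pad.
// Browser arrow-key events stand in for the remote's directional input.
const tiles = Array.from(
  document.querySelectorAll<HTMLElement>(".ife-tile"),
);
let focusIndex = 0;

// Highlight only the currently focused tile.
function render(): void {
  tiles.forEach((tile, i) =>
    tile.classList.toggle("focused", i === focusIndex),
  );
}

document.addEventListener("keydown", (event: KeyboardEvent) => {
  switch (event.key) {
    case "ArrowRight":
      focusIndex = Math.min(focusIndex + 1, tiles.length - 1);
      break;
    case "ArrowLeft":
      focusIndex = Math.max(focusIndex - 1, 0);
      break;
    case "Enter": // the remote's select button
      tiles[focusIndex]?.click();
      return;
    default:
      return;
  }
  render();
});

render();
```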

Final Prototype:

I simplified the interface (from the UI above) so that user testing could focus on interactions, not content. The captured video below shows how the on-screen interface reacts to a physical controller, while also being designed to accommodate touchscreen interactions.

Final IFE Prototype Demo
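One common way to let a single interface serve both a physical controller and a touchscreen (a sketch under assumed names, not the project’s actual code) is to normalize both input modes into the same UI intents:

```typescript
// Sketch: normalize controller and touch input into shared UI intents,
// so the same handlers serve both input modes. Names are illustrative.
type Intent =
  | { kind: "focus"; targetIndex: number }    // D-pad move or touch hover
  | { kind: "activate"; targetIndex: number } // select button or tap
;

function handleIntent(intent: Intent): void {
  switch (intent.kind) {
    case "focus":
      console.log(`Highlight tile ${intent.targetIndex}`);
      break;
    case "activate":
      console.log(`Open tile ${intent.targetIndex}`);
      break;
  }
}

// Touch path: a tap both focuses and activates in one gesture.
function onTap(targetIndex: number): void {
  handleIntent({ kind: "focus", targetIndex });
  handleIntent({ kind: "activate", targetIndex });
}

// Controller path: focus moves with the D-pad; activation is explicit.
function onDpadMove(targetIndex: number): void {
  handleIntent({ kind: "focus", targetIndex });
}
function onSelectPressed(currentIndex: number): void {
  handleIntent({ kind: "activate", targetIndex: currentIndex });
}
```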