
UX Design for Interactive A.I. Installation

Year: Summer 2020

Role: UX Design & Research, Masters student project

Client: Vancouver Art Gallery

Brief: Deliver an engaging, interactive installation that demonstrates a key element or concept relating to A.I. and visual culture.

In preparation for an upcoming exhibition, the Vancouver Art Gallery asked our project team to design an engaging, interactive installation that demonstrates a key element or concept relating to A.I. and visual culture.


Our solution, Face to Interface, is an interactive facial analysis AI installation that responds to visitors' expressions with an emotion “reading” and a stream of facial expression recognition (FER) images associated with that emotion. By revealing the “behind the scenes” of emotion recognition software, visitors learn how facial analysis works and can assess the validity of the technology themselves while engaging with the database.


Face to Interface is powered by Visage Technologies' facial analysis and face tracking SDKs.


This project was awarded the Gerri Sinclair Award for Digital Innovation.

Key Deliverables

  • UX Research

  • User Journey

  • User Testing Prototype


1. Touchless Interactivity

Due to COVID-19, we were uncertain about future restrictions, so we needed to design for a socially distanced, touchless experience.

2. Balance engagement with education

The installation needed to facilitate a deeper understanding of artificial intelligence through playful interaction.

3. Appeal to expert and novice visitors

The exhibit needed to engage both visitors with expert A.I. knowledge and those with no prior exposure to A.I.


Domain Research + Market Scan


We started this project by conducting broad domain research into the various facets of artificial intelligence to understand how we could narrow our focus. Since our project team worked remotely, we collected our research findings on a large Miro board and used a dot-voting technique to highlight topics we wanted to explore further.


We also conducted a market scan of previous A.I. installations to review approaches, assess interactivity, and inspire ideation.


Screenshot of the team's Miro board, illustrating a mind map of topics and dot votes highlighting points of interest

Secondary Research - UX in Galleries

Since we were not able to conduct onsite visitor research due to COVID-19 closures, I reviewed previous UX research studies conducted in art galleries and museums and consolidated key findings to guide our project design.



Some key findings included:

  • Fear of getting it “wrong” deters interaction if the rules of engagement aren’t clear [1]

  • Couples and groups may feel more comfortable experimenting due to “teammate validation” [1]

  • Consider how each interaction and piece works together to form the overall message [2]

  • Design an experience that supports observation to include onlookers in the installation [3]  


Site Visit + Visitor Journey Map


Keeping these findings in mind, we visited the gallery and I mapped out the visitor experience throughout the space. This helped us understand how our installation would integrate into the overall exhibit message and maintain an ongoing narrative from the first touch point to the last.


Visitor journey map I created to help identify how our installation fits into the overall gallery and exhibition experience.


Final Design


Physical Design



1. Supporting a socially distanced future


The installation was designed to allow multiple visitors to engage while maintaining a safe distance: screens were placed on each side of the kiosk, and the surrounding space was left open for traffic flow. Groups testing the installation together also enjoyed coaching one another and comparing how they could shift the emotion detection, indicating the experience worked for varying types of visitors.

2. Robust design with minimal upkeep


The installation requires only a monitor and webcam, minimizing the hardware that could need repair. The software was also stress-tested over a 24-hour period of continuous use to ensure the installation could handle ongoing public interaction.

Storyboard samples that I created to communicate the user experience. Visitor illustrations from Pimpmydrawing.

Onboarding Sequence


1. Help visitors understand how to interact


Early testing indicated that visitors did not expect to interact with an installation using their face. We needed to create a foolproof onboarding sequence that reassured visitors they were interacting with the installation correctly. This was done by:


  • Using simple, descriptive text

  • Forcing an expression to “unlock” the experience

  • Including a scanning animation over the face   


These steps also supported the narrative of the experience, purposely enforcing a mechanical and unsettling tone.
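The unlock gate described above can be sketched as a tiny state machine. This is a hypothetical illustration only; the actual installation was built on Visage Technologies' SDKs, and every name and threshold below is an assumption:

```python
# Sketch of the onboarding "unlock" gate (illustrative only; the real
# installation used Visage Technologies' face tracking SDK).
IDLE, SCANNING, UNLOCKED = "idle", "scanning", "unlocked"

UNLOCK_THRESHOLD = 0.8  # assumed confidence required to "unlock"

def step(state, face_present, expression_score):
    """Advance the onboarding state machine by one camera frame."""
    if not face_present:
        return IDLE  # no face: show the descriptive intro text
    if state == IDLE:
        return SCANNING  # face found: play the scanning animation
    if state == SCANNING and expression_score >= UNLOCK_THRESHOLD:
        return UNLOCKED  # forced expression detected: unlock the experience
    return state
```

A visitor who holds the prompted expression strongly enough moves from scanning to unlocked; anyone who steps out of frame falls back to the intro state.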

2. Communicate lost facial detection


Our beta test indicated that visitors became confused when face detection was lost and the installation reset, assuming this was part of the narrative. Adding a “Face Lost” message helped visitors understand the error so they could adjust accordingly.

Messaging used to indicate issues with the facial detection.
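One common way to implement this kind of messaging is a short grace period before the message appears and a longer timeout before the installation resets. A minimal sketch with assumed timings; the installation's actual logic and values may differ:

```python
# Assumed timings for the "Face Lost" messaging (illustrative values).
FACE_LOST_AFTER = 1.0   # seconds without detection before showing the message
RESET_AFTER = 10.0      # seconds without detection before resetting

def status(seconds_since_last_face):
    """Decide what to display based on how long the face has gone undetected."""
    if seconds_since_last_face < FACE_LOST_AFTER:
        return "tracking"           # brief dropouts are ignored
    if seconds_since_last_face < RESET_AFTER:
        return "face_lost_message"  # prompt the visitor to adjust position
    return "reset"                  # return to the onboarding sequence
```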

Emotion Detection Analysis


1. "Game-like" with an educational twist


The emotion detection was highly responsive to provide a playful, game-like experience, but on closer inspection, the faces streaming from the graph reveal a database full of often inaccurate and biased images.


By juxtaposing a playful experience with a deep dive into a database that is usually hidden from us, visitors can come to their own conclusions about the impacts of this technology.

2. Keep the player present


To maintain continuity with the onboarding sequence without distracting from the experience, a muted video of the visitor is visible behind the spider graph.

Over 30,000 FER images were included in the database, ensuring that visitors could continue discovering images during prolonged interaction.

FER images were pulled from a real database used to develop emotion recognition software.
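With a database this large, one simple way to keep surfacing fresh faces is to sample images for the detected emotion while avoiding immediate repeats. A hypothetical sketch; the installation's actual image-streaming logic is not documented here:

```python
import random

def make_streamer(database, seed=None):
    """Return a function that streams FER images for a detected emotion,
    avoiding repeats until every image for that run has been shown.

    `database` maps an emotion label to a list of image paths (assumed shape).
    """
    rng = random.Random(seed)
    recent = set()

    def next_image(emotion):
        pool = [img for img in database[emotion] if img not in recent]
        if not pool:          # everything shown once: start a fresh cycle
            recent.clear()
            pool = list(database[emotion])
        choice = rng.choice(pool)
        recent.add(choice)
        return choice

    return next_image
```

Cycling through the pool before repeating is what lets prolonged interaction keep revealing new faces from the database.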

Our project manager demonstrating the prototype using a similar build to the installation design.

Most A.I. experts involved in our beta test acknowledged the inconsistencies of the FER database, indicating that the experience facilitated reflection about the technology.

Project Outcomes

  • All beta test users expressed playfulness and delight

  • Solution supported touchless solo and group experiences​

  • A.I. experts understood the technological implications while novice visitors learned about FER databases  

  • The final installation was included in The Imitation Game exhibition at the Vancouver Art Gallery, March 5 to October 23, 2022

Three of the four project team members at the live exhibition at the Vancouver Art Gallery

Project Credits


Vlad Ryzhov - Technical Lead

Valentina Forte-Hernandez - Project Manager

Courtney Clarkson - UX Design & Research

Julia Read - UI Lead, Creative Consultant

Larry Bafia - Project Supervisor

Special thanks to the Centre for Digital Media for their ongoing support and resources, and the Vancouver Art Gallery for the collaboration.
