The Emotion Whisperer

The Emotion Whisperer is a subtle tool that cues visually impaired users to the body language they would otherwise miss. It comes with a pair of camera glasses that send images of conversation partners to an emotion recognition app.

Dogger Collectief

Simon Dogger lost his sight during his second year at the Design Academy Eindhoven. After a five-year medical rehabilitation period, Simon decided to finish his bachelor's degree in design and returned to Eindhoven. He was granted funds to hire alumni, known as the Dogger Collectief, who guide, conceptualize, and execute designs for him. The collective started in 2014, and in the spring of 2017 Simon graduated with a bachelor's degree in design.


The Emotion Whisperer is Simon’s graduation project for his bachelor’s degree. Inspired by Simon’s Mood Colors app, an entry in a contest organized by Orange Telecom, this six-month project received several awards, including the René Smeets Prize and the iF Design Talent Award 2017. Facial recognition software analyzes expressions and translates them into a sensory signal: the connected handheld device sends out a specific vibration, signaling the user about the body language of their conversation partner.


Simon Dogger

In collaboration with

Kevin Cools

Myra Wippler




Simon spent a considerable amount of time on the interaction design of the app and what he called the “sound interface.” Using Apple’s speech-to-text function as a guiding framework, Simon made sure that the Dogger Collectief included the guidelines of this sound interface from the very beginning. This ensured the app could be operated effectively without the need for adjustments later in the process.

To make the app as user-friendly as possible for its target audience, Simon set the following priorities:

  • Minimize the number of components
  • Minimize sentence length and distill the content
  • Optimize the position of the components
  • Simplify the physical interaction with the app (swipe, open, activate)

Simon envisioned that facial expressions could be measured on a scale from 0 to 100. Taking happiness as an example, that scale would look somewhat like this:

  • 0 = neutral
  • 20 = a starting smile
  • 70 = laughter
  • 100 = ROFL

Ultimately, this dynamic scale needs to be translated into the haptic feedback of the handheld device. However, current facial recognition software is not yet powerful enough to recognize the intensity of someone's emotion and place it on a scale. Simon therefore translated the recognition of emotions into a true/false statement, comparable to a boolean variable: a true return would prompt the device to give off its haptic feedback.
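The true/false scheme above can be sketched in Swift, the natural choice for an iPhone companion app. This is a hypothetical illustration, not the actual implementation: the EmotionReading type, the 0.6 confidence threshold, and the hapticSignal helper are all assumptions for the sake of the example.

```swift
// A reading as a facial recognition API might return it:
// an emotion label with a confidence score between 0.0 and 1.0.
struct EmotionReading {
    let emotion: String
    let confidence: Double
}

// Collapse the confidence into a boolean, since the software cannot
// reliably place emotion *intensity* on the envisioned 0-100 scale.
// The 0.6 threshold is an arbitrary assumption.
func isEmotionDetected(_ reading: EmotionReading, threshold: Double = 0.6) -> Bool {
    return reading.confidence >= threshold
}

// A true reading would prompt the handheld device to vibrate;
// here we just return a descriptive command string (or nil).
func hapticSignal(for reading: EmotionReading) -> String? {
    return isEmotionDetected(reading) ? "vibrate:\(reading.emotion)" : nil
}
```

In a real app the command string would be replaced by a call into a haptics framework driving the handheld device, but the core decision stays a simple boolean gate.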

My role

When Simon started his graduation project, I moved to Seattle to pursue my master’s degree at the University of Washington. Because of this move, my involvement in the project consisted of an advisory role on the interaction components.

As an iPhone owner, Simon is most experienced with Apple’s accessibility features. We discussed what he thought worked well with the VoiceOver function and where he believed the interaction fell short. Due to the absence of visual feedback, getting “lost” in an app seemed to be the biggest problem when using applications not designed with visually impaired users in mind.

We also examined Siri: being able to command the phone to take a certain action, instead of taking that action yourself, seemed like the most logical way to go. However, Siri was not available in the Netherlands at the start of this project, so explaining a voice user interface (VUI) left a lot to the imagination. Due to this lack of experience interacting with a VUI, a more traditional path was chosen.

Simon had played with the idea of communicating facial expressions before starting this project in December 2016. Because of this, he had made videos explaining how he believed an app could stay as usable as possible without the user getting lost due to the lack of visual guidance. These videos were helpful for discussing together which of their features could be materialized into the companion app of the Emotion Whisperer.