Historically, emotion has been thought of as a byproduct of design–not something that drives the user experience. But emotion is actually a critical new dimension in UI, and one for which designers are ultimately responsible. Huge Ideas recently spoke with Sherine Kazim, managing director of experience design, about her recent talk, “Can You Feel Me? How Emotive UI Will Change the Way We Design,” which she presented in June at the Awwwards Conference in New York City.
Huge Ideas: What exactly is emotive UI?
Sherine Kazim: There are certain things designers want you to feel when you use their products, and you can hear it in the way they talk. Clients often say: “We want to surprise and delight our users.” But really, that’s just scratching the surface. Humans are a mess of emotions–and designers are going to have to learn to engage with all of them.
Robert Plutchik, an academic psychologist who (literally) wrote the textbook on emotions, in 1980 introduced the concept of eight basic emotions–joy, trust, fear, surprise, sadness, disgust, anger, and anticipation. His “wheel of emotions” shows the interplay of those emotions, and their varying levels of intensity.
To me, the form of UI that most accurately reflects our emotional spectrum is…emojis. I created an emoji-based version of the wheel:
You can see in the chart that intensity dissipates as you move further out on the wheel–and in design, nobody is talking about that yet. We currently design things that make people feel basic emotions–maybe joy or sadness–but we don’t talk about how the UI should adjust as the user’s emotions intensify or dissipate.
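One way to picture the wheel in code is as a mapping from each basic emotion to its three intensity tiers–the innermost ring, the basic emotion, and the outermost ring. The tier names are Plutchik’s own; the `tier` helper and its 0-to-1 intensity scale are illustrative assumptions, a minimal sketch rather than anything from the talk:

```python
# Plutchik's eight basic emotions, each mapped to its three intensity
# tiers: (most intense, basic, least intense) -- i.e., the inner ring,
# the middle ring, and the outer ring of the wheel.
PLUTCHIK_WHEEL = {
    "joy":          ("ecstasy", "joy", "serenity"),
    "trust":        ("admiration", "trust", "acceptance"),
    "fear":         ("terror", "fear", "apprehension"),
    "surprise":     ("amazement", "surprise", "distraction"),
    "sadness":      ("grief", "sadness", "pensiveness"),
    "disgust":      ("loathing", "disgust", "boredom"),
    "anger":        ("rage", "anger", "annoyance"),
    "anticipation": ("vigilance", "anticipation", "interest"),
}

def tier(emotion: str, intensity: float) -> str:
    """Map a 0..1 intensity to the matching ring of the wheel.

    The 0.33/0.66 thresholds are arbitrary illustrative cut-offs.
    """
    intense, basic, mild = PLUTCHIK_WHEEL[emotion]
    if intensity >= 0.66:
        return intense
    if intensity >= 0.33:
        return basic
    return mild
```

For example, `tier("joy", 0.9)` lands on “ecstasy,” while `tier("joy", 0.1)` lands on “serenity”–the same dissipation the wheel draws spatially.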
Huge: When you’re designing products, do you think about people’s emotions being closer to the center of the wheel–more intense–or the outer ring, where their emotions are less intense?
SK: We tend to create products that aim for the center-wheel emotions, because it’s the easiest thing to convey. We rarely think about the full spectrum, and we don’t think about the dissipation. Currently, most designers think about the design intention and the user reaction: “I want to make you happy,” and the user is happy.
But here’s what a user interaction might look like in the real world:
When someone designed this particular object (that is, a mailbox flag), they weren’t thinking about emotion. They were thinking about an object. Then along comes a user (Ralph), and he clearly thinks, “Hey, this looks like a really fun thing to do.” Once the flag is down and the fun is over, you watch that emotion dissipate. In Ralph’s case, he even transitions to another emotion: sadness. That whole spectrum happens in a 3-second GIF: Ralph is feeling joy–but from a design point of view, we never got around to addressing the more intense ecstasy, or less intense serenity, even though Ralph went through all those emotions and more. We need to start intentionally designing for every single emotion that Ralph is going through.
What that forces us to do is fine-tune the communication between people and products–to design for a primary emotion as well as its dissipation, and for the relationship between the two. Ultimately, we’re trying to make deeper, more meaningful connections with the user, and the best way to do that is to fully understand and manage their emotional spectrum.
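As an illustration of what designing for dissipation might mean, here is a hypothetical sketch: the user’s intensity decays over time after a triggering moment, and the interface winds down with them instead of cutting off abruptly. The exponential half-life model and the `joy_treatment` responses are my own assumptions for the example, not anything prescribed in the talk:

```python
def dissipated_intensity(initial: float, elapsed_s: float,
                         half_life_s: float = 5.0) -> float:
    """Exponential decay: intensity halves every `half_life_s` seconds.

    A stand-in model; real products would infer intensity from signals
    such as interaction pace or dwell time.
    """
    return initial * 0.5 ** (elapsed_s / half_life_s)

def joy_treatment(intensity: float) -> str:
    """Pick a (hypothetical) UI treatment for the joy spectrum."""
    if intensity >= 0.66:
        return "full animation and sound"   # ecstasy
    if intensity >= 0.33:
        return "gentle motion"              # joy
    return "calm, static state"             # serenity
```

So a moment of pure ecstasy gets the full treatment, and a few seconds later, as `dissipated_intensity` falls, the same interface eases toward serenity.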
Huge: So how do you think about designing for one specific emotion–beyond the typical “delight”?
SK: Designing for emotion can be subjective, but here’s my interpretation of something that gave me joy:
This is super fun. Things bounce around, there’s bright color and movement and music that brings back amazing memories from when I was a kid. I even get to make something that’s a direct reflection of myself. The design intention is pure ecstasy, and the user reaction, pure ecstasy.
Or what about designing for fear–what does that look like? It’s difficult to come up with examples, but I found one in a Seoul train station–and it’s an example of designing for fear, gone wrong.
I remember sitting there looking at this piece and thinking, “OK, I don’t speak Korean but clearly I am supposed to run away from a bomb, run away from an explosion. Not sure why I’m running toward the trees.” I’m not entirely sure what I should do, and it seems the intention was to deliver the information in a neutral way. But my reaction to it was “Holy @#$!, I’m not getting on that train, because I don’t want to be anywhere near where these things can happen.” Absolute terror. You start to feel the responsibility of the designer in considering the emotions of the user–in this case, the person viewing the sign.
Or, think about designing for anticipation. Look at the Japan Quake Map. It’s relatively old, but it’s still very powerful. The UI of the quake map shows the quake depth. You see the clock running, but beyond that there’s nothing else going on–until you’ve waited for a couple of minutes:
That first day is the day Fukushima hit, five years ago. Without even realizing it, the data itself manages not just to create anticipation, but also to elicit an emotion–grief. All that with very few UI cues. That wasn’t intentional, but it illustrates why we need to take responsibility when we create something to put in front of a user.
Huge: What about designing for surprise? Is that even a good idea?
SK: Surprise for me is a tricky one, because I rarely agree with surprising a user in the interface–with one exception: games. When it comes to games, surprise can be central to their success.
Who would think to put an apple in front of a sleepwalker to stop him from hurting himself? It’s gorgeous, interesting and wonderfully surprising.
Huge: So that’s four of the eight basic emotions. What about the rest?
SK: What’s interesting is that I could only hit on half of the basic emotions–not to mention all the others on the rest of the wheel. There’s no clean way to capture that stuff in UI currently–at least, not visually. That’s because as designers we’ve done such an excellent job simplifying visual UI. And while I like that, I’d love to see products be more multi-dimensional, so they have their own personalities, just like users do.
Huge: How does AI fit into all of this?
SK: As AI becomes more prevalent in our lives, it’s only going to get more complex, because we’ll need to focus on the personality of the user and how it interacts with the personality of the AI. We can no longer rely solely on the visual representation of the UI. This is good, because it will ultimately allow us to form a more reciprocal relationship between the user and the brain behind the interface as we move away from screens.
It’s challenging, but know this: products and platforms will change, but our spectrum of emotion will not. This, to me, is the key takeaway–that we have a finite set of emotions, and there will never be a sadness 2.0. It’s actually great that we have that constraint to work within, because it’s going to help us design better product relationships for the future. It’s not just about designing for intention and reaction–it’s about understanding the full spectrum of emotion and the dissipation of each one, so that we can start to develop true product personalities. The minute we design for all the variations of basic emotions, we know we’re heading toward solid territory for future design.