
Interaction 17

Co-chair Josh Clark on the stage

I have been planning to attend the Interaction conference for years, but something always got in the way. This year I got lucky and went to New York with more than a thousand other design professionals. The summary: it was excellent. Mostly.

The conference is one of the biggest in the field and shines a light on relevant topics and trends in the industry. It’s a great place for designers and researchers to get a broad overview of what’s happening. As with any human endeavor, it had its good and bad sides.

The good

Parallel tracks over three days covered:

  • Virtual reality
  • Machine learning and artificial intelligence
  • Conversational interfaces and chatbots
  • How technology impacts our emotions and psychology
  • Creating and managing design organizations
  • Ethics, responsibility, and trust
  • Smart environments, designing off-screen, and new interaction paradigms
  • Design thinking and process

I’ll try to summarize some of the topics from the conference. Occasionally I’ll add my thoughts about the industry in general.

Virtual reality got the first half of the first day. I would have been surprised by this if I hadn’t tried it recently. The technology is transformative. Although virtual reality is not new, it has become much more accessible in the last couple of years. The most pressing questions designers have are about the design process: how do we prototype, how do we test, and what are the best practices? We’re still in the exploration phase.

One speaker pointed out the excessive application of the “VR” label to everything imaginable. According to her, virtual reality is an environment where a person has agency, where a person can interact with that environment in a meaningful way. Everything else is an improved version of 3D goggles. I’ve been thinking about this a lot, and the distinction makes sense. If you’re watching a 360° live feed of a concert or enjoying a 360° computer simulation of a rollercoaster ride, it’s no different from watching a movie. You’re just consuming media with the ability to turn your head. You can’t interact with it in any way.

The IBM design team showed a nifty setup where they prepare a paper prototype, put it in a cylinder, and take a photo with a 360° camera.
The photo is then opened on a phone and put in a Google Cardboard VR headset. A monitor mirrors the headset so other team members can see what is happening.

Machine learning is another technology that has been around for decades but has seen a renaissance in the last few years, thanks to improvements in computing power and access to more data. You need a lot of data for machine learning.

A couple of excellent presentations explained what machine learning can accomplish, where it should be used, and where it should be avoided. They also mentioned a big caveat: the quality of machine learning outputs will only be as good as the quality of the data fed into the algorithms. An additional problem is unintentional bias that can negatively affect human lives. For example, an algorithm used in the criminal justice system to assess the risk of re-offending is biased against black defendants. I’m afraid many people and companies today jump on the machine learning trend thinking about the technology, but not about its consequences. Machine learning is a wonderful technology, and we’ll see it misused due to ignorance.
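
The “only as good as the data” point is easy to demonstrate. Here is a minimal sketch, assuming Python with NumPy and scikit-learn and an entirely made-up synthetic dataset, of how a model trained on historically biased decisions reproduces that bias in its own predictions:

    # A minimal, hypothetical sketch (synthetic data, scikit-learn): a model trained on
    # historically biased labels reproduces that bias, even without seeing the group directly.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    # Two groups with the same underlying "risk", but past decisions (the labels we
    # train on) were systematically harsher on group 1 -- the bias is in the data.
    group = rng.integers(0, 2, size=n)
    risk = rng.normal(size=n)
    labels = (risk + 0.8 * group + rng.normal(scale=0.5, size=n) > 0.5).astype(int)

    # The model never sees `group`, only `risk` and a proxy feature correlated with
    # group membership (think neighborhood or zip code).
    proxy = group + rng.normal(scale=0.3, size=n)
    X = np.column_stack([risk, proxy])
    model = LogisticRegression().fit(X, labels)

    preds = model.predict(X)
    for g in (0, 1):
        print(f"group {g}: flagged as high risk {preds[group == g].mean():.0%} of the time")
    # Same underlying risk, very different outcomes -- the algorithm faithfully
    # learned the skew it was fed.

Nothing in the pipeline is “wrong” in a technical sense; the model just learned exactly what the data taught it.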

Maslow's hierarchy of robot needs

I couldn’t attend most of the talks about conversational interfaces and chatbots, but I heard from others that they were good. People at the conference mentioned the challenge of remembering what to say to Amazon’s Echo; it has been covered in the press too. Other products of the same type have the same problem.

I have reservations about the state of voice and conversational technology today, even though I’m optimistic about the future. If you say anything to another person, that person will understand you. He or she might not respond or comply, but will understand. A machine will hear you but understand only a tiny fraction of what you say. Until it can understand (almost) everything, it’s not as useful. It’s just too hard to remember all the supported voice commands without any help. In other words, it’s not a good conversation partner. Yet.

Voice and command-line interfaces look very different on the surface, but they suffer from the same problem: discoverability of a limited set of commands. What to type in the terminal? What to say to Alexa?

Many people use WeChat as an example of a successful conversational UI. Nielsen Norman Group published research on WeChat last year:

However, the key UX advantage of WeChat is not that it grew out of a chat service; it’s the integrated user experience. Each individual service is fine, but not necessarily better than those offered by other companies. In fact, our user testing of WeChat revealed many usability problems in various areas. What’s superior is how these services play together and reinforce each other. Most importantly, these benefits are not the result of a superior, simple conversational UI; instead, they are often provided through a simplified graphical user interface (GUI).

Design organizations have been a hot topic for a couple of years now (MX 15 was almost entirely about them). This signifies a shift in the industry: big companies are acquiring small studios or building in-house design teams. Designers are joining huge, complex organizations where they often struggle to find their place. How do you collaborate with dozens of other researchers and designers? How do you work with other roles across teams? How do you demonstrate your value to the organization? What process do you use, and who is responsible for what?

Ethics, trust, and how technology impacts humans are things we should always keep at the top of our minds. There were many diverse talks on this topic, and I liked all of them. Some common themes were:

  • We build apps optimized for engagement, not for meaningful interactions
  • People hand over too much information and control to technology, but they don’t understand the consequences
  • We often treat end users as objects, not as living humans who have needs and emotions
  • We build complex systems and rush to launch them without proper testing (I already mentioned machine learning and bias)

One speaker said something that made me think: “Technology is not neutral. Guns don’t kill people, but they make it easy for people to kill people.” I was always in the “tech is neutral” camp, but I have started to doubt that recently. People respond to incentives. Every technology, whether I like it or not, sets some incentives.

The bad

I would have been ecstatic about the conference if two things hadn’t happened.

A significant number of speakers did a terrible job, from keynotes to short presentations. High expectations amplified my disappointment. The conference is, after all, the best known in the industry, not a local meetup where a person might be speaking for the first time and not perform well. Some speakers read from their slides or, even worse, asked the audience to read from them. Some rambled for an hour, and the only thing I remember is the primary color of their slides. A couple of times I completely disengaged, and once I actually left the room.

It’s too easy to disengage if a talk is not interesting.

The other thing was the constant jabbing at US politics and dissatisfaction with the current president of the US. Sly remarks and jokes found their way into many talks. I understand people are shocked and angered by recent events (me included), but I don’t think this kind of conference should be a place to vent. Rage all you want on the street, in front of a political office, or on a social network. However, the international part of the audience, and locals with different political beliefs, might not find it amusing.

A detail from a street in New York.

At the same time, I have to admit some speakers handled this spectacularly. They described the current situation as “challenging times,” and were able to provide context for their talk without pushing their agenda or political beliefs. Kudos.

The ugly

Brace yourself, a rant is approaching at supersonic speeds. There is one thing that fills me with frustration every time I hear it. There were traces of it at the conference, but it’s a problem of the whole industry, and that’s how I want to address it. It’s four simple words.

“We, designers, are special.”

It comes in different forms: “We think differently. We’re the only ones who care about people. It’s up to us to fix everything. Only we can save the world. We are special snowflakes.” One speaker literally compared designers to another occupation and said that designers think more critically about themselves and their work. What the actual fuck?

Who are designers? A different species from homo sapiens? Genetically engineered superhumans who were trained from birth to fight bad usability and address all user needs? It seems some designers skipped their Empathy 101 lessons.

No, designers are regular people who learned particular skills over the years. Let me repeat the critical part: regular people who learned some skills. Nothing more, nothing less. If you’re a designer now, nothing is stopping you from going to law school and becoming a lawyer. Would you suddenly stop caring about your work? I don’t think so.

You can measure your design skills along multiple dimensions. None of these should be labeled “better than other humans.” Show some humility, guys and gals.

Interaction 18

At the end of the conference, next year’s edition was announced. Interaction 18 will take place in Lyon. That might be fun :)

A livestream of the Interaction 18 announcement.

