#1 Usability becomes a commodity
Design patterns are still a thing — a big thing. More and more, designers can rely on robust and comprehensive interaction pattern libraries for solving common design use cases. Now that the basics are covered, where do we want to focus our efforts?
You don’t need to reinvent the wheel when designing a door handle. Innovation for innovation’s sake, like trying to create a completely disruptive navigation system for your website or app, can cause usability problems in the long run. It all comes down to one question: what user need are you trying to address by introducing a new interaction pattern?
Luckily, interaction design pattern libraries and guidelines are helping keep designers honest and focused on what really matters for the user: getting things done in an easy and familiar way.
It was about time.
Meeting basic usability standards is crucial for any successful product these days — although someone in the room will always feel entitled to raise their hand and argue that Snapchat is not the most “intuitive” experience ever, yet is still successful.
Cover the basics, focus on the details
In an era where meeting basic usability requirements is a given, and competing products are reaching feature parity fairly quickly, what really differentiates digital products is how relevant and delightful the experience can be.
The word “usability” itself is losing a bit of importance. It requires too little from us.
Why does someone choose Gmail over Yahoo Mail, or Medium over Blogger, if the features are 99% the same? It’s definitely not about disrupting usability standards. It’s about the additional layer of sophistication that can only be achieved when you put enough time and brainpower into the tiniest details, the most subtle animations, the most elegant transitions — not just for the sake of creating whimsical Dribbble shots.
In 2017 designers should not be afraid of starting from design patterns to cover the basics, and then focusing the bulk of their time on the details that will make experiences feel more relevant, delightful — and therefore more memorable.
#2 The words we will stop using
Words are funny. They carry a lot of meaning, and it’s interesting to look at how that meaning evolves over time. Do you still need to sell your design as “mobile-friendly”? Do you still describe experiences as being “intuitive”? What does that even mean?
Back in 2011 everyone was talking about Responsive Design. The possibility of designing and building one single web experience that was able to fluidly adapt to multiple screen sizes was eye-opening at the time. Interestingly enough, fluid layouts are a native functionality of HTML — but over the years web experiences had been built with too much focus on large, desktop screen resolutions.
We went back to the basics and proudly coined it “Responsive Design”. It was the topic everyone was writing, reading, and tweeting about that year.
Interest in the search term “Responsive Design” over time (source: Google Trends)
Fast forward a couple years and designing websites that are responsive is the new norm. Now we only call out the exceptions, and in many teams, projects and companies, responsiveness is an assumption that everyone makes right from the get-go.
The evolving meaning of words
“Responsive Design” is just one example of an adjective that becomes unnecessary over time:
We don’t sell an experience as being “intuitive” — we prove it through user testing and positive feedback from customers.
We spend less and less time arguing whether a piece of content should live “above the fold” or “below the fold”; the plethora of screen sizes we see today is quickly making the very concept of “the fold” outdated.
We don’t say something is “just two clicks away”; the burden of extra clicking was a bigger problem when interactions were limited to a cursor and hyperlinks on a low-speed connection.
We don’t sell our process as being “human-centered”; it’s an assumption that any capable and successful company these days will bring users to the design process at some level and at some point in the project.
Want another example?
Google’s search results page on mobile
2016 was the year Google decided to remove the label “mobile friendly” from its search results. According to Google’s search team, “85% of all pages in the mobile search results now meet the appropriate criteria and show the mobile-friendly label”. Since the vast majority of mobile results are mobile friendly, there is no need to label them as such.
All websites are now (or should be) “mobile-friendly”, and as a designer you don’t need to keep stressing that over and over when you present your mockups to someone new.
Quickly, our vocabulary shifts and evolves to let us focus on new design challenges.
What words are you crossing out of your daily lexicon in 2017?
What new words will you be adding?
#3 Everything is a conversation
“Chatbot” is one of the hottest terms in our industry right now, and we are pretty confident you are going to be building one quite soon — if you haven’t already. But what does the future of Conversational Interfaces look like?
If you’re reading this article, there’s very little chance you haven’t heard about Conversational Interfaces in 2016. At uxdesign.cc, we’ve written about the technical and social challenges of designing conversations, helped designers who wanted to get started in that space, talked about prototyping bot experiences, and even curated some of the best chatbot experiences we’ve seen this year.
Every interface is a conversation
Essentially, a Conversational Interface is any user interface that mimics chatting with a real human.
But stepping back for a moment: isn’t every interface a conversation between the user and the machine?
Think of the most common apps you use every day. Like hailing a cab.
First, you tell Uber that you need a ride. Then, it asks you where you are, and once it has found a driver, it tells you the time estimate. When the ride is over, it asks you how it went. And you tell it your opinion by clicking on the stars and rating the ride.
Uber: a conversation about your latest ride
Traditional interfaces (the ones we design every day) are quite similar to a conversation — that just happens to manifest as buttons, menus and other interaction patterns. With Conversational UI the structure is the same. But instead of buttons, menus, and stars, you tell the machine what you want using words. And emojis 😘
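The Uber flow above can be thought of as a scripted dialog: the structure stays the same whether the replies arrive through buttons or through words. A minimal sketch of that idea in Python (all names and prompts here are hypothetical, not Uber’s actual API or copy):

```python
# A ride-hailing flow modeled as a conversation: an ordered list of
# system prompts, each paired with a user reply. Buttons, stars, or
# typed words are just different ways to supply the replies.

RIDE_FLOW = [
    ("request", "Where would you like to go?"),
    ("locate", "Where should the driver pick you up?"),
    ("confirm", "Your driver arrives in about {eta} minutes."),
    ("rate", "How was your ride? (1-5 stars)"),
]

def run_conversation(answers, eta=4):
    """Walk the flow, pairing each system prompt with the user's reply."""
    transcript = []
    for (step, prompt), reply in zip(RIDE_FLOW, answers):
        transcript.append(("bot", prompt.format(eta=eta)))
        transcript.append(("user", reply))
    return transcript

transcript = run_conversation(["Airport", "Home", "OK", "5"])
```

The point of the sketch is that the interaction model is a turn-taking structure; swapping the GUI for a chat window changes the rendering of each turn, not the shape of the conversation.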
Conversations will only get louder
“Chatbot” is the next big buzzword in design — and our industry is seeing a lot of interest from companies in exploring that space. Automated, conversational experiences allow brands to inspire, communicate with and serve their customers right where they are, in a much more scalable way.
Order pizza from Facebook Messenger with Pizza Hut’s new chatbot
Apps like WeChat have become the central destination for a plethora of services in China. Over half a billion people use WeChat, and it touches everything — communicating with friends, sharing daily moments, buying food, paying credit card bills. It’s IM, ecommerce, banking, dating, gaming, and marketing rolled into one platform, where you can shop, order food, book doctor appointments, find nearby parking spots, book hotels, hire a maid, nanny, or babysitter, hail a taxi, and so on. All through conversations — and mini-apps that run within those conversations.
Messenger, Kik, Slack, and many other messaging platforms have been working hard in 2016 to expand their capabilities and allow for similar experiences through conversations.
Not to mention voice interfaces — Siri, Alexa, Google Home, and so many others — a natural next step for chatbots, and a business opportunity that will inevitably affect the way you, as a designer, think about products and services in the near future.
The interactions of the future are not made of buttons.
Will 2017 be the year when companies shift some of their primary experiences to a chat format? Have we found real use cases for it, or are we just following the hype?
#4 The year we begin breaking the glass
As designers, when we start to move away from designing for the clicks, taps, and screens we’ve been staring at for years, we enter a new domain that might be unfamiliar to us. Now let me ask you: are we really prepared?
Screens are limited.
They are two dimensional.
They are cold.
As designers, we’ve been doing a pretty good job of mimicking behaviors and gestures and inventing metaphors that make screen-based interactions feel a bit more real.
“Swipe your finger on this shiny piece of glass to pretend you’re sliding a fictitious metallic panel to the right”
However, when we bring our services to the physical world, things start to change.
Voice-enabled interactions, for example, require a better understanding of humans — not only the topics they talk about, but how they talk about them. People are spending part of their day giving instructions to machines, and when they do so, intervals, pauses, intonation, culture, age, accent… everything has an impact on their experience.
The same thing happens with Virtual Reality. Immersive experiences require a better understanding not only of the gestures users do with their hands, but also all the subtleties of their body language, personality, posture, cultural background, and age.
Adding new specialists to the team
People expect to interact with machines the way they normally would with each other. However, machines are not necessarily prepared to deliver on those expectations; someone needs to train them to understand intonation, gestures, and ergonomics.
Companies will need more than just an interface, and designers more than just design tools to do their jobs.
Ethnography and research have always been important to web design; now they are at the core of the interactions we will create. The behaviors and gestures are out there in the world.
Someone needs to see them, understand them and translate them into digital interactions.
The reality is that this person doesn’t need to be you; they just need to work with you.
In 2017 we expect to see more and more design teams hiring psychologists, physiologists, anthropologists, researchers, and other specialists to work with them on designing these new experiences.
#5 Stitching all the pieces together
Both the Apple Watch and Alexa let users request an Uber ride without having to touch their phone. While this may sound frivolous, it sets the tone for what people expect from technology: a fully connected and ubiquitous experience. As designers, how do we connect all the pieces of the puzzle together?
For UX designers, designing connected, ubiquitous experiences is a twofold challenge.
If you work at a hardware company that is building these connected devices (like Apple Watch and Alexa), the biggest challenge is to understand how people will interact with these objects — voice, gesture, location or display — and design the right interaction model around that behavior.
However, chances are you work for a service company (like Uber) that is designing experiences to run on those devices. In that case, your job is to think through how the service will work on an increasingly vast ecosystem of channels and touchpoints that gets more fragmented every day. You won’t have as much control over your users’ experiences as you did when you were designing “just a mobile app” or “just a website”.
Can I request an Uber ride from Alexa, receive the ETA on my Apple Watch, split the fare with a friend on Messenger and rate the ride on my phone app?
As designers, how do we make sure such a fragmented experience still feels like it is coming from the same brand?
The challenge becomes to design the minimum possible interactions and to focus on people’s behavior — not just add noise to this already complex ecosystem. To help us on that, user journeys, ecosystem maps, and physical prototypes become important design tools this year.
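A user journey like the one in the question above can be captured in a lightweight artifact before it becomes a full ecosystem map. The sketch below is purely illustrative — the devices, steps, and helper function are hypothetical, not a real product definition:

```python
# A cross-device journey as an ordered list of touchpoints. The
# interesting design work lives in the transitions between devices,
# which handoffs() extracts for review.

journey = [
    {"step": "request ride", "device": "Alexa",       "modality": "voice"},
    {"step": "receive ETA",  "device": "Apple Watch", "modality": "glance"},
    {"step": "split fare",   "device": "Messenger",   "modality": "chat"},
    {"step": "rate ride",    "device": "phone app",   "modality": "touch"},
]

def handoffs(journey):
    """List the device-to-device transitions a designer must smooth over."""
    return [
        (a["device"], b["device"])
        for a, b in zip(journey, journey[1:])
        if a["device"] != b["device"]
    ]
```

Even this toy structure makes the point of the section concrete: the deliverable is not one screen but the list of handoffs, each of which must still feel like the same brand.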
“Technology should require the smallest possible amount of attention” – Amber Case, Calm Tech
We will need to keep in mind that users are more than metrics in a dashboard and, instead of talking about user retention, we will start talking about how relevant the interactions are.
In 2017 we won’t necessarily be designing the whole ecosystem, but the ways people transition from one touchpoint to the other.
#6 A body and space puzzle
From The Matrix, to Her, to Black Mirror, humans have long fantasized about living in a world of Virtual Reality — the only thing that varies is the opacity of the virtual layer. As these digital worlds become more real, so do the actions we need to take to enable these kinds of experiences.
The first idea about Virtual Reality that needs to be demystified is the paradox of “Virtual Reality” itself. It is a great name that has been around for a while, but we know that it goes beyond that: immersive experiences are about expanding the reality that we live in.
But if designing two-dimensional interfaces already requires so much work, knowledge and effort — what does it mean to design an entirely new world that is able to augment our own?
“Designing for VR should not mean transferring 2D practices to 3D, but finding a new paradigm.” — Jonathan Ravaz
Well, let’s break that down.
A new lexicon of interactions
The first and most obvious challenge is designing a new type of interface. Google, Facebook and other players are already defining new interaction standards for virtual spaces — natural gestures that carry similar meanings in the real world will help translate emotions and actions into the virtual space.
Conversational interfaces also play a big role here — after all, that’s how we mainly interact with other people and businesses in the world outside screens.
A new spatial paradigm
The second challenge is a physical one. Beyond interfaces, immersive experiences are defined by the interaction of our bodies with the space they are in. What is the relationship between our physical and virtual bodies?
Sound design, architecture, lighting, physics are just a few examples of things to be considered.
Are people expecting the virtual experience to be as realistic as the physical one? How far can we push those boundaries? Are humans ready for a more elastic reality?
A new relationship with the self
The last piece might be the most important — yet the one we know the least about. It’s impossible to talk about the relationship of body and space without considering social, psychological and cultural factors. A virtual reality can redefine one’s personal space, personal image (think avatars), and social interactions.
Before designing for Virtual Reality, we need to consider our own biases and the side effects such immersive experiences can have on users. It will require clear guidelines and an ongoing ethical discussion as we introduce new social paradigms.
What world would you create if you were Anthony Hopkins? (Westworld, HBO)
2017 won’t be the Virtual Reality year — yet. It might take another year or so before immersive experiences have a real impact beyond our tech bubble. But it will be the year when we decide what should and should not be designed for VR.
#7 Should designers design?
The questions we’ve been hearing all year: Should designers code? Should designers prototype? Should designers write copy? After dozens of articles trying to answer these questions, we are now probably closer to a definitive answer: it doesn’t matter.
Our field is about to change again.
Many UX designers started their careers as Information Architects, Visual Designers, Writers, Strategists. We are used to seeing job titles change from time to time, as companies start to understand the depth of our work, and to accommodate trends and market needs.
Today, UX Designers wear several hats under the same job description.
While everything was happening on a computer screen, it was still manageable for a single person to be responsible for research, strategy, and visual design. With the emerging plurality of interfaces, contexts, data and devices, it’s time to get more specialized professionals. And generalists as well.
More specialized professionals
We expect to see narrower job descriptions in the near future. Less “UX Designers” and more “Artificial Intelligence Designers”, “Experiential Designers”, “Verbal Designers”— and all the other ramifications that will arise from the technologies we are playing with, every day.
For example, a Data Designer could work with a VR Screenwriter and a Motion Designer to define how a certain VR experience should work.
A new type of generalist
Generalists will help tie specialists together, looking at the big picture. They could be managers, Design Ops leads, strategists, or the people responsible for the company’s Design System.
Being a generalist will be less about “doing all sorts of work”, and more about “connecting everything”.
Which hat do you want to wear?
In order to evolve our field, our design process needs to be even more iterative and collaborative. Specialists and generalists need to work together, mixing different skills and backgrounds, to deliver great work.
We daydream we can solve all problems ourselves, from research to code.
In reality, design is a team effort, and UX is way more than just a job title.
After all, we can only wear one hat at a time.
#8 Design must be automated
The design process goes only as far as the resources we have on hand allow: time, tools, team… It’s a lot to consider. Introducing automation into our work can surpass these limitations, taking our work to the next level.
Design Automation is generally associated with the futuristic idea of Artificial Intelligence designing websites and apps. That might still be far from reality, but automation has started to happen in a much more subtle way.
A few examples? Sketch add-ons to bring real data to mockups or easily create several versions of a page; tools like Zeplin to create specs without a bloodbath; InVision integrations aiming for seamless collaboration; and not to mention the new kids on the block, Figma and Subform.
Think about how we were working 5 years ago and how these tiny workflow automation tools have completely changed our work dynamic today.
You can imagine how different things will be in 5 years.
Research is the real opportunity
Recruiting participants, following up with them, and collecting feedback can be daunting — not to mention it is not a sexy thing to sell to the big bosses. Apart from very specific cases, we predominantly still use the same tools and processes that we used 10 years ago.
It’s time to automate that.
A few teams are ahead of the game: IDEO brought bots to their design team to help with research and data collection. Amber Cartwright co-designs with machines at Airbnb, a process she calls Invisible Design.
But no one is far behind. Think about the Slackbots that help us collect and be notified of all sorts of inputs, Intercom and Zendesk managing customer relationships and collecting data for us, or Pocket and IFTTT helping with our desk research.
The tools are already here; it’s just a matter of connecting them.
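The kind of "connect the tools" automation described above can be as simple as a script that triages raw research feedback into a digest a bot could post to a channel. A toy illustration in Python — the keyword rules and data are invented for the example, and a real pipeline would sit on the Slack, Intercom, or Zendesk APIs instead:

```python
# Tag free-text research feedback with simple keyword rules, then
# summarize the tags into counts — the kind of digest a Slackbot
# could post automatically after each round of user testing.

KEYWORD_TAGS = {
    "slow": "performance",
    "confusing": "usability",
    "love": "praise",
}

def tag_feedback(entries):
    """Attach zero or more topic tags to each feedback entry."""
    tagged = []
    for text in entries:
        tags = sorted({tag for kw, tag in KEYWORD_TAGS.items()
                       if kw in text.lower()})
        tagged.append({"text": text, "tags": tags or ["untriaged"]})
    return tagged

def digest(tagged):
    """Count how often each tag appears across all entries."""
    counts = {}
    for entry in tagged:
        for tag in entry["tags"]:
            counts[tag] = counts.get(tag, 0) + 1
    return counts

feedback = [
    "Checkout is slow on my phone",
    "I love the new onboarding",
    "The settings page is confusing and slow",
]
summary = digest(tag_feedback(feedback))
# summary == {"performance": 2, "praise": 1, "usability": 1}
```

Crude as it is, this is the shape of the shift the section describes: the repetitive collection-and-counting work moves to a script, and the designer’s time moves to interpreting the digest.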
As we start designing for our automated lives, we need first to automate our work. We will design alongside bots.
Automation will set us free
We won’t be replaced by robots by next tax season. But these basic automations will allow us to shift our focus from task-oriented work to more strategic thinking.
We will stop being so protective of interfaces once we fully automate design patterns and put them to work in our favor.
We will be able to look at our design approach more holistically once we automate the heavy lifting of user research.
We will finally be able to collaborate better with our peers once we stop working like robots do.
#9 Not your fault, but your responsibility
Diversity and ethics were two of the most important topics in UX this year. Many designers got into UX to have a meaningful impact on people’s lives. Are we finally at a point where we can do that?
The products and apps we design are used by millions — sometimes billions — of people, creating new markets, improving the economy, and shaping how people interact with each other. While delightful animations and novelty technologies can put a smile on someone’s face, we have the tools and the responsibility to impact much more than that with our work.
Companies are starting to realize they are not only responsible for their impact on society, but also that transformations in society can impact their product design. Airbnb hired a Director of Diversity not only to lead specific initiatives, but also to help shape their products and features with diversity in mind. Nextdoor learned that they could play a role in fighting racism by making some small improvements to the flow for posting a message on their platform.
Design can’t be just incidental. Our work has an impact on people’s lives.
We, as designers, also have responsibilities of our own. Every design decision carries some opinion or perspective about the world. Unfortunately, some of those decisions are based solely on the designer’s assumptions. From a simple ethnicity question in a form to the way we design a complete world in Virtual Reality, we could be missing the opportunity to break stereotypes and misconceptions.
The (frustrating) User Experience of defining your own ethnicity (source)
It’s not easy.
First, we need to understand our biases in order to question the design solutions we are creating and be as impartial as possible. Even when we do that, the end product can still fail in some way. When that happens, the underlying question becomes how intentional the error was, and what has been done to fix it.
A design can fail; designers shouldn’t.
Second, we need to consider the impact of our work and how it can give something positive back to society. We’re not in the game just to make stakeholders happy, and nothing is going to change if we don’t proactively act on it.
“If your company is just in it for the money, maybe you should look for a better company. It’s not your fault, but it is your responsibility” – Alan Cooper, Ranch Stories talk
#10 Recap & Highlights
Product of the year: Pokémon Go (we miss you)
Portfolio of the year: Adrian Zumbrunnen’s conversational portfolio
Tweet of the year: Project handoffs, by UX Reactions
Buzzword of the year: Chatbots
UX writer of the year: John Saito
UX blog of the year: A List Apart, for the second year in a row
Tool of the year: Figma
UX article of the year: Design Better Forms
Talk of the year: Alan Cooper’s Ranch stories
Book of the year: Calm Tech, by Amber Case
Side project of the year: The Design Team, which we’re clearly big fans of