Design teardown: Help Scout is redesigning with the customer in mind

Nick Francis • Dec 16, 2014

Author Neil Gaiman wisely said, “When people tell you something’s wrong or doesn’t work for them, they are almost always right. When they tell you exactly what they think is wrong and how to fix it, they are almost always wrong.”

Customer feedback critically informs the design process, but only when it tells you why something should work better—not how. Since a customer only uses your product one way, their advice will often be specific to their use case. When you’re designing for dozens, even hundreds of distinct use cases, you need to parse your feedback carefully.

On the other hand, customer feedback becomes doubly important when you’re redesigning a key product, because it can help you avoid increasing their learning curve.

That’s why I’d like to share how we used customer feedback to place their needs front and center when we redesigned a big part of our product—Help Scout Reports—from the ground up.

Why redesign?

It’s best to avoid a complete redesign if at all possible. Consistent iteration on the details usually produces better results because it:

  1. Avoids a sudden and dramatic increase in cognitive load. Customers adjust more quickly and easily to minor changes, so it’s easier for them to appreciate each one’s value.
  2. Breaks a massive, multi-part task into bite-size chunks. A complete redesign takes a lot of time and dedication from every team, and can easily span months. By pushing incremental changes, you can move toward a better solution in a more expedient manner.
  3. Makes it easier to respond to feedback quickly. You can act on feedback about an incremental change pretty speedily, whereas the sheer scope of an overhaul invites a correspondingly large volume of feedback that takes far longer to work through.

But in the case of Help Scout Reports, we didn’t have that option.

Reports was one of the first features we built, and it wasn’t architected to handle the scale we’ve grown to. No matter what, we had to re-architect everything on the back end. And since we had to start from scratch, it made sense to design the best possible solution we could imagine after 3 years of learning from our customers.

Here’s how we did that through each step of our process.

Step 1: Identifying the challenge(s)

Building a reporting dashboard poses a huge design challenge for two main reasons:

1. Everyone uses and interprets metrics differently
In a field like customer support, there’s no single success metric—there are dozens. Plus, every company places different levels of importance on each metric, so it’s difficult to design a single flow that’s meaningful for everyone. Instead, the goal is to present the data intuitively and get out of the user’s way.

2. So many metrics
The number of metrics also creates a design challenge. Every metric we add to the list makes the design more complex, which impacts designers, engineers, and most importantly, the end user. The more metrics we display, the harder it is for a customer to pull something meaningful out of the report.

Despite this challenge, we tended to include metrics more often than we excluded them. Each company uses different metrics to tell their story, so a new metric’s value often outweighs the cost of including it in a design. That’s one way we put the user first in our redesign.

Step 2: User research on the redesign

Every redesign should start with user research. Your support team holds a treasure trove of important data on what users want and where current features come up short.

I read hundreds of emails from customers, not just to understand the functionality they wanted, but also to familiarize myself with the language they used. Sometimes it’s worth following up with customers to understand their use case in context. Once you understand all the common use cases, the fun part is connecting the dots and designing a solution that works for all of them.

It’s important to note that support feedback only comes from a segment of your customers. Many customers won’t take the time to email support. That’s why I really like combining support data with surveys.

We used Wufoo to set up an open-ended, 3-question survey. Since I was most concerned with customers’ language and gaining greater understanding of their use case, open-ended questions were the way to go.

Here’s what we asked:

  1. What new metrics would you like to see that aren’t covered by our reports today?
  2. If you were able to get a daily/weekly/monthly reports email, what metrics would you most like to see at a glance?
  3. What else can we improve for you with the new reports?
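
If you’d rather analyze the open-ended answers in bulk than read them one at a time in the Wufoo UI, you can pull the raw entries over Wufoo’s REST API. Here’s a minimal sketch in TypeScript; the subdomain, form hash, and field IDs are placeholders that depend on your own account and form.

```typescript
// Minimal sketch: pull survey entries from Wufoo's REST API.
// Placeholders: the "example" subdomain, the "abc123" form hash, and the
// API key. Wufoo uses HTTP Basic auth with your API key as the username.
const WUFOO_SUBDOMAIN = "example";
const FORM_HASH = "abc123";
const API_KEY = process.env.WUFOO_API_KEY ?? "";

interface WufooEntry {
  EntryId: string;
  // Field IDs (Field1, Field2, ...) are assigned by Wufoo per form.
  [field: string]: string;
}

async function fetchEntries(page = 0, pageSize = 100): Promise<WufooEntry[]> {
  const url =
    `https://${WUFOO_SUBDOMAIN}.wufoo.com/api/v3/forms/${FORM_HASH}` +
    `/entries.json?pageStart=${page * pageSize}&pageSize=${pageSize}`;
  const res = await fetch(url, {
    headers: {
      // Any string works as the Basic-auth password; the key is what matters.
      Authorization: "Basic " + Buffer.from(`${API_KEY}:x`).toString("base64"),
    },
  });
  if (!res.ok) throw new Error(`Wufoo API error: ${res.status}`);
  const body = (await res.json()) as { Entries: WufooEntry[] };
  return body.Entries;
}

// Dump the answers so you can read customers' own words in one place.
fetchEntries().then((entries) => {
  for (const e of entries) console.log(e.Field1);
});
```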

Question 2 helped me understand what metrics were most important. The answers varied so widely that we didn’t build the email reports, but they still informed our design process a great deal.

One big idea the survey data made clear was that users want the ability to drill down on the numbers. In a support dashboard, every metric represents many distinct conversations, each filled with rich details to act on. By letting users drill down into those conversations, we could give them new ways to learn about and act on their users’ needs.
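
Conceptually, that means a dashboard number is never just a number: it’s an aggregate over a set of conversations, and the UI keeps a handle on that set. Here’s a hypothetical sketch of what a drillable metric might look like; none of these types come from Help Scout’s actual code.

```typescript
// Hypothetical sketch of a "drillable" metric: the aggregate value keeps
// a reference to the conversations it was computed from, so the UI can
// expand any number into the underlying support threads.
interface ConversationRef {
  id: number;
  subject: string;
  customer: string;
}

interface DrillableMetric {
  name: string;                     // e.g. "Replies to resolve"
  value: number;                    // the aggregate shown on the dashboard
  conversations: ConversationRef[]; // what the aggregate is made of
}

function drillDown(metric: DrillableMetric): ConversationRef[] {
  // Clicking a metric surfaces its source conversations,
  // each one a concrete, actionable customer story.
  return metric.conversations;
}
```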

Today, the drilldowns on each metric really set the Reports apart from similar tools.

Step 3: Prototyping the redesigned dashboard

So how can you tell a cohesive story when every company puts a different emphasis on each support metric?

In our case, the best we could do was to group metrics in an intuitive way that reflects concerns every support organization has. The reports cover key metrics across the board, from searching support documentation to reaching out to support and finding a resolution:

  1. Documentation: how can we improve our knowledge base and self-service tools?
  2. Conversations: how many support requests are we getting, and when?
  3. Productivity: how are we managing incoming support volume?
  4. Team: how are the individuals on my team performing?
  5. Happiness: are our customers satisfied? What can we improve?
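
A grouping like this can also drive the dashboard directly. As a hypothetical sketch (not Help Scout’s actual code), the taxonomy might be encoded as data, so navigation and section headers come from a single source of truth:

```typescript
// Hypothetical sketch: encode the report taxonomy as data so the
// dashboard's navigation and section headers are driven by one source
// of truth rather than hard-coded per page.
type ReportCategory =
  | "Documentation"
  | "Conversations"
  | "Productivity"
  | "Team"
  | "Happiness";

const REPORTS: Record<ReportCategory, string> = {
  Documentation: "How can we improve our knowledge base and self-service tools?",
  Conversations: "How many support requests are we getting, and when?",
  Productivity: "How are we managing incoming support volume?",
  Team: "How are the individuals on my team performing?",
  Happiness: "Are our customers satisfied? What can we improve?",
};

for (const [name, question] of Object.entries(REPORTS)) {
  console.log(`${name}: ${question}`);
}
```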

Wireframing each report was our next task. I used OmniGraffle for Reports, but have since switched to Balsamiq and love it.

With a solid draft of the wireframes in hand, I reached out to 10 customers who collectively covered our wide breadth of use cases. I had informal discussions with each about the overall flow and made some small adjustments before starting on the visual design.

Step 4: Visual design

When we reach the visual design stage, we stop looking for customer feedback. When it comes to the brand, design isn’t a democratic process. My co-founder Jared leads the visual design for our team, and we work together to get that side of things right. We don’t show it to anyone until it feels almost 100% finished.

But once we reached that level of confidence, we still had a couple months to polish while the engineering team did their thing. So I visited offices, set up phone calls, and did screenshares with customers around the world.

I spent a lot of time understanding each company’s support process, specifically how they measure success. Then I’d walk them through our InVision prototypes, making sure we provided all the necessary metrics in an intuitive location.

One of the most meaningful of these conversations revealed that our individual reporting was too competitive. While competition could work for a sales team report, it’s not a good idea for a support team. The discussion even led to a blog post, “Customer Support is More Than High Scores.” We changed the design and tone of the reports as a result, and it could have been a big problem if we hadn’t.

Step 5: Launch and post-launch

A redesign is a long, exhausting process for everyone involved, but it’s important to reiterate that launch only gets you about 70% of the way to the finish line. No matter how much research you do, considering the launched feature a finished product is a big mistake.

We launched Help Scout’s redesigned Reports on July 3, 2014. And we’ve only recently put the final touches on the redesign. We found a couple of tools really useful post-launch.

One product we’ve started using with any new feature is FullStory. It lets us securely record sessions in the application and watch how people interact with the product. It’s like doing 250 usability testing sessions in a couple of hours! We learn a lot from those sessions and disable it once we have enough data.
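
Turning recording off doesn’t have to mean ripping out the snippet. Assuming the FullStory snippet is installed and exposes its legacy FS.shutdown()/FS.restart() browser API, a feature flag can gate recording; the flag endpoint below is our own invention.

```typescript
// Rough sketch: gate FullStory session recording behind a flag so it can
// be switched off once enough sessions have been captured. Assumes the
// FullStory snippet is already installed and exposes the legacy
// FS.shutdown()/FS.restart() browser API; the flag endpoint is hypothetical.
declare const FS: { shutdown(): void; restart(): void };

async function syncRecordingFlag(): Promise<void> {
  // Hypothetical endpoint that returns { recordSessions: boolean }.
  const res = await fetch("/api/flags/fullstory");
  const { recordSessions } = (await res.json()) as { recordSessions: boolean };
  if (recordSessions) {
    FS.restart(); // resume capture while we're still collecting data
  } else {
    FS.shutdown(); // stop capture once we've learned enough
  }
}

syncRecordingFlag();
```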

We also use our own product to manage customer feedback on the redesigned Reports. I read through over 100 conversations with customers about the new feature, and those conversations directly informed improvements we made in the following months.
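
If you tag that feedback consistently (the reports-feedback tag below is just an example), you can pull it back out programmatically. Here’s a sketch against Help Scout’s Mailbox API 2.0, which lists conversations filtered by tag.

```typescript
// Sketch: list customer conversations tagged with feedback on the new
// Reports, via Help Scout's Mailbox API 2.0. The tag name and token
// handling are assumptions; adjust for your own mailbox setup.
const TOKEN = process.env.HELPSCOUT_TOKEN ?? "";

interface Conversation {
  id: number;
  subject: string;
}

async function listTaggedConversations(tag: string): Promise<Conversation[]> {
  const url =
    `https://api.helpscout.net/v2/conversations` +
    `?tag=${encodeURIComponent(tag)}&status=all`;
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${TOKEN}` },
  });
  if (!res.ok) throw new Error(`Help Scout API error: ${res.status}`);
  const body = (await res.json()) as {
    _embedded: { conversations: Conversation[] };
  };
  return body._embedded.conversations;
}

// Read through feedback on the redesigned Reports in one pass.
listTaggedConversations("reports-feedback").then((feedback) => {
  feedback.forEach((c) => console.log(`#${c.id}: ${c.subject}`));
});
```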

Things to consider in your next redesign

Every successful redesign works in large part because of customer feedback. We walked away from this one with 3 big lessons:

  1. Always start with research. It’s tempting to geek out too early on a redesign project, starting with prototypes or visual design. User research is the dirty work, but it’s also the most important work. Make sure you’ve got solid research to inform your design decisions along the way.
  2. Gather data from several sources. Your support inbox will frame the data differently than a survey, and a usability test will produce very different results from a phone interview. Make sure you use several UX tools to get a well-rounded view of the design problem.
  3. Keep iterating. Launch isn’t the end of any development cycle and you won’t get every detail right. Block out at least a couple months to keep iterating on the feature based on the feedback you get from customers. The last 20–30% is definitely the most important.
