Sometimes, product quality problems just look like numbers. As a Customer Support professional, you probably don't feel this way, since all of that time immersed with customers in real-world scenarios creates a unique and powerful perspective that gives meaning to the data. The engineers, however, might not "feel it" quite the way your customers do, but they are critical to resolving the issues.
We need to present more than data; we need to provoke an emotional response that creates commitment to resolve the problem. Quality charts are a lot of things, but a compelling emotional argument, they are not. So what can you do?
Well, let me tell you a story.
It all started with a systemic product problem. Widely known amongst the support team, it was an issue that was driving calls and causing customer dissatisfaction. It was one of those problems that span many different engineering domains and was therefore difficult to assign and fix end to end.
The engineers had looked at it, but there was uncertainty and doubt about the facts. We heard that it "couldn't work that way", or that perhaps the symptoms were being misreported or misunderstood. It was part denial, but even for those who believed the quality data, there were only abstract figures on charts to speak to the problem, and while the graph was going the wrong way, it didn't evoke the emotional engagement to truly recognise how poor the experience was. It was clear that everyone wanted to understand & help resolve the problem, but they were stuck. We needed a new approach.
We decided to take it out of the lab. We found a real live customer - someone who represented the greatest probability of encountering the problem - and asked her to let us shadow her while she used the product.
The process was like an episode of CSI - video recordings, meticulous notes, customer commentary, photos, high-drama sunglasses removal - we captured it all, in context.
Lucky for us, the problem occurred, and we had a clean, beautiful capture of it. No test rig, QA data or pre-production systems - this was as real as it gets.
Compiling the data
Next, we took our data and built a composite storyline in a series of PowerPoint slides, all sequenced by time. We integrated the pictures, video and associated data into a visually pleasing sequence. We annotated the sequence to show how long things took - when we were waiting for installs or reboots, for example.
The outcome would allow a viewer to ride along with our customer as she went through the process, with a heads-up display (HUD) of sorts overlaid to show what else was happening that she couldn't see at the time.
Next, we introduced Stu. Stu was a geeky little character in a lab coat, representing the interpretive voice of our support engineers through the entire sequence. Stu had no opinion, he just had facts - and he pointed out the meaning and relevance of the data on each slide. If the core narrative left you thinking "so what?", Stu had your answer.
Armed with our story, we took the show on the road, and booked some walk-thrus. We invited product managers, engineers and managers from all of the teams involved in the platform, and pulled them into a large conference room to hear a story.
We explained our methodology, and then invited them to fly co-pilot with our customer as we walked through the customer journey, complete with annotations and the ever-insightful perspective of our friend Stu, who was present on nearly every slide, adding context and meaning.
We got a lot of questions about the pictures & associated data, but there was no controversy over the facts themselves. The story was clear - you could see the problem, plain as day, and we all knew it didn't live up to the expectations we set for ourselves, or our customers. We had to make some changes.
For some, the impact was strong enough to compel them to share the story and ask their colleagues to book walk-thrus for them as well. Next thing we knew, we had a queue of sessions booked with various groups who were not only interested in this particular problem, but wanted to understand how we were conveying it, since it seemed so sticky with the audience. We'd seen problems like this before - how did this one get so much traction?
Stu officially had a fan club!
The single most significant outcome of the initiative was the emotional investment we provoked, and the action it created amongst our stakeholder community. The universal reaction to the problem was that it wasn't acceptable, and we must fix it now. Everybody took away actions to figure out how their piece could be contributing, and to explain some of the surprising observations with code reviews & testing. In fact, the company eventually formed a full team around the problem domain, with the objective of ensuring it didn't happen again.
The story did what no quality chart could - it captivated the team, helped us adopt the customer perspective, and gained commitment to act - because we all wanted the experience to be better. There is simply nothing like a ride-along with a real customer to bring meaning and importance to all of that code we crank out every day.
Build your story
Clearly, you can't do this for every problem - it doesn't scale. However, if you find yourself supporting a systemic problem at great cost, and struggling to convey the gravity of the matter, think of Stu.
You just might need to tell a story.