My role

Senior Product Designer

Team

Product Manager

Back End Engineer

2 Front End Engineers

About

Dixa is a Conversational Customer Service Platform (B2B SaaS) that enables companies to interact with their customers across multiple channels, such as telephone, email, and messenger.

Project overview

Dixa customers had long requested more ways to gather feedback about their services and CS agents. To address this need, a team consisting of a Product Manager, Engineers, and myself developed an IVR and SMS-based CSAT survey system. Through in-depth user research and prioritization of key features, we integrated this solution into the customer support workflow.


The new feature led to up to an 18% increase in customer feedback participation, enabled more timely issue resolution, and ultimately improved overall customer satisfaction for the companies that adopted it.

Context

Brief introduction

Before starting, let's clarify some terms used in this case study:


  • Dixa customers: Companies using Dixa’s platform.

  • End users: The customers of these companies.

  • CSAT method: The method used to deliver the survey to end users.


Dixa already offered the option to send CSAT (Customer Satisfaction) surveys via email, but only for text-based channels (email and chat). Here’s how it worked: after a chat between a CS agent and an end user, a survey would be sent to the user via email, allowing them to rate the interaction or support provided. However, many Dixa customers, some of whom relied on phone calls for up to 50% of their workflow, expressed interest in collecting CSAT feedback via telephone as well.


This feedback, combined with the risk of churn and the need to stay competitive, underscored the importance of building this feature. It became clear that offering phone-based CSAT surveys would not only meet Dixa customers' demands but also help us stay competitive in the CS market. This insight guided our roadmap and sparked a discovery process to determine the most effective approach.

Foundation

Internal insights

Aligning with the team and exploring existing insights within the company was essential. To set a clear direction for the project, I analyzed the numerous feedback entries available on our platform. Navigating through them helped highlight key customer needs and potential priorities. I then organized ideas and categorized feedback to identify patterns and key themes.


Together with my team and key stakeholders, we conducted alignment sessions and exercises to extract insights, revisit feedback, and define key areas to explore with customers. This ensured a deeper understanding before moving toward solutions.


With this groundwork, we mapped four CSAT methods that were technically feasible to build:

  • IVR – The user stays on the phone to answer the CSAT survey.

  • SMS – A survey is sent via SMS after the phone call.

  • Callback – The end user receives a follow-up call with the survey.

  • Email – The survey is sent by email.

Initial hypothesis

By providing an initial solution that addresses customer needs for CSAT on the phone channel, we can reduce churn, close the product gap compared to competitors, and meet both customer expectations and business goals for the quarter.

Research

Engaging with customers

Once the groundwork was in place, we moved into the research phase to deeply explore customer needs and ideas. I interviewed eight Dixa customers, primarily CS leaders from companies interested in using CSAT, to understand their experiences with surveys. The sessions combined exploration and validation.


I explored the importance of CSAT for them, their past experiences with other platforms, expectations, and ideas. For validation, I introduced the four potential CSAT methods we mapped internally in the previous steps, discussing how each would fit their reality, their preferred aspects, and which option would bring the most value.

After the user interviews, I organized the key findings to ensure easy access for the entire team.

MAIN FINDINGS

CSAT methods preferences

Users expressed interest in multiple options; no single solution fit all needs. The SMS method was the most preferred, followed closely by IVR.

Customization

Customers need customizable surveys that align with their company branding, including tone of voice. Limited default questions restrict access to meaningful feedback.

Flexibility

Customers want the ability to test multiple methods, as end-user preferences may vary by product and country. Easy setup for multiple languages is a must.

Solution

Definition and prototyping

After analyzing user insights, we needed to determine the best path forward. As previously mentioned, no single solution could address all needs, so trade-offs were necessary.


Guided by prioritization and high-level technical assessments from our engineering team, we decided to focus on an IVR-based solution. The rationale behind this decision was:


  • Timely delivery: IVR would allow us to deliver value within our project’s limited timeline.

  • Quick iteration: IVR could enable faster testing and learning, laying the foundation for future CSAT methods.

  • Technical feasibility: SMS posed regulatory challenges due to differing telephony providers across countries, so IVR was a more viable first step.

Journey mapping conducted with my team to identify key screens, technical constraints, and open questions needing answers.

With this decision made, we created a journey map of the ideal experience, focusing on the technical possibilities for the IVR implementation. This provided a clear view of touchpoints at each stage and enabled engineers to thoroughly explore technical options. It also helped us share assumptions and identify any uncertainties that needed further validation.


This process allowed me to begin prototyping layouts with our limitations and goals in mind. Throughout this phase, I prioritized frequent sharing of prototypes with the team to ensure alignment, gather feedback, and make necessary adjustments before moving to user testing.


For user testing, I worked with many of the same customers from the initial interview phase. We explored the proposed solution together, allowing me to gather feedback and fine-tune the layout based on real user insights.

PROPOSED LAYOUTS

Opportunities

Following insights from research and team discussions, we identified two key opportunities to enhance the CSAT experience beyond its core capabilities and better meet customer needs:

Customizable content

Customers could add multiple survey questions and choose between Text-to-Speech or Audio Upload/Recording. This flexibility ensures better brand consistency, as recorded audio feels more authentic than robotic Text-to-Speech.

Flexible language

A multilingual survey option allows all languages to be managed within a single survey, eliminating the need to create separate surveys for each language. This simplifies survey management and supports global reach.

Improvements

Initial feedback and improvements

After implementation and a beta phase with a limited group of companies, we launched the feature to the entire customer base with confidence in its positive reception. Evaluating the feature’s impact confirmed our initial hypothesis: there was a clear need for additional CSAT survey solutions via phone, specifically the SMS option.

Feature launch and beta feedback

The feature saw strong adoption among clients who had requested the IVR method. The overall feedback was positive among both customers and internal stakeholders.

It didn’t work for everyone

Reception was mixed among companies that preferred other survey methods, particularly SMS. Even after trying it, these companies reported that IVR didn’t align well with their specific needs.

Alignment with Dixa’s goals

Based on this feedback, we extended the scope to include SMS surveys, as this aligned with Dixa’s mid-term goals. Implementation would be simpler this time thanks to the foundation laid by the IVR work.

Iterate, improve & repeat

With this update, customers could choose between SMS and IVR for surveys. For SMS, end users would receive a text with a link after the call. We updated the main setup pages to clearly distinguish between the two methods, making configuration more intuitive.

On the landing page, end users could give a score from one to five and leave a comment. The company could customize the question in the CSAT setup. This was the initial model, with plans to add more question options and flexibility in future updates.

Impact

Feedback and impact

18% increase in end-user engagement

The IVR CSAT feature boosted end-user feedback participation, with some customers seeing an 18% increase in responses. The ability to rate support immediately after a call led to more timely, relevant, and actionable insights.

Positive impact for customers

Qualitative feedback gathered from customers showed that CSAT via phone helped companies train agents, standardize support, and quickly resolve issues. This improved customer satisfaction, strengthened loyalty, and addressed a key product gap.

Thank you for stopping by.
Let's connect ;)