Listen360 provides a number of different reporting tools to help you analyze your customer feedback. While some reports may overlap in the data they present, each has been designed with a unique purpose. It is important to note that the data source may differ even when two reports look comparable at first glance.
This article will help explain the mechanics and data sources of some of the most popular reporting tools available in Listen360 in an effort to help you better judge which metrics can be accurately compared. Covered in this article:
- Loyalty Summary
- Customer Loyalty Report
- Voice of the Customer Report
- Technician Performance Report
- Feedback Requests Report
- Response Rates Report
The Loyalty Summary panel appears on every dashboard and is one of the first and most common sets of data that you might view. This panel shows your overall customer loyalty in terms of Net Promoter Score (NPS), calculated as:
NPS = % Promoters - % Detractors
The chart shows how your score and contributing factors change over time. For example, selecting 1m (one month) will show everything since this day last month. A period of 1w (one week) would show all activity in the last seven days. Selecting MAX will show all of your activity. Here is an example:
The metrics in the top portion of this panel are all NPS-related:
81 NPS = 86% Promoters - 5% Detractors (3,502 Promoters, 372 Passives, 205 Detractors)
Keep in mind that the NPS calculation for a given period only takes into consideration the most recent feedback from each customer in that period.
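As an illustration, here is a minimal sketch of that calculation in Python. The field names (customer_id, completed_at, rating) and the rounding are assumptions for illustration only, not Listen360's internal implementation; the ratings follow the standard NPS bands (9-10 Promoter, 7-8 Passive, 0-6 Detractor).

```python
from datetime import datetime

# Hypothetical responses; field names are assumptions for illustration.
# NPS bands: 9-10 = Promoter, 7-8 = Passive, 0-6 = Detractor.
responses = [
    {"customer_id": "A", "completed_at": datetime(2024, 1, 5),  "rating": 10},
    {"customer_id": "A", "completed_at": datetime(2024, 2, 20), "rating": 6},   # A's most recent response
    {"customer_id": "B", "completed_at": datetime(2024, 2, 1),  "rating": 9},
    {"customer_id": "C", "completed_at": datetime(2024, 1, 15), "rating": 8},
]

# Keep only each customer's most recent response in the period.
latest = {}
for r in responses:
    current = latest.get(r["customer_id"])
    if current is None or r["completed_at"] > current["completed_at"]:
        latest[r["customer_id"]] = r

total = len(latest)
pct_promoters = round(100 * sum(1 for r in latest.values() if r["rating"] >= 9) / total)
pct_detractors = round(100 * sum(1 for r in latest.values() if r["rating"] <= 6) / total)

print(pct_promoters - pct_detractors)  # NPS = % Promoters - % Detractors -> 33 - 33 = 0
```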
The metrics below the NPS data are totals for the period. In the example above, 21,061 total feedback requests were sent in the past 3 months. Of those feedback requests, 4,070 responses were received.
Note that the Responses Received metric (4,070) does not match the sum of the Promoters, Passives, and Detractors (4,079). There are a number of factors that can be responsible for this difference:
- Survey Cap - Listen360 allows you to determine how often your customers are surveyed. Depending on your configuration, some of your customers may have been surveyed more than once in the given time period. Each response will be counted in the Responses Received metric, while only one response per customer will be counted in the NPS metrics. This may cause the Responses Received number to be higher than your NPS totals. To check your survey cap configuration, follow these steps.
- Manual Surveys - If your brand allows individual surveys to be sent manually to customers, your numbers in the Loyalty Summary may differ. Even if a response is later removed, each response will be counted in the Responses Received metric, while only one response per customer will be counted in the NPS metrics. This may cause the Responses Received number to be higher than your NPS totals. To see if you have access to send manual surveys, follow these steps.
- Date Range - The Responses Received metric only counts responses to surveys sent within the selected date range (3 months in the example above), while the NPS metrics only look at surveys completed within the selected date range. Because a customer can complete a survey that was sent before the range started, the Responses Received number may be lower than your NPS totals (see the sketch after this list).
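The date-range behavior is easier to see with a small sketch. The filtering below is illustrative only, and the sent_at/completed_at field names are assumptions, not Listen360's data model.

```python
from datetime import datetime

range_start, range_end = datetime(2024, 4, 1), datetime(2024, 6, 30)

# Hypothetical survey records.
surveys = [
    {"sent_at": datetime(2024, 3, 28), "completed_at": datetime(2024, 4, 2)},   # completed in range, sent before it
    {"sent_at": datetime(2024, 4, 10), "completed_at": datetime(2024, 4, 12)},  # sent and completed in range
    {"sent_at": datetime(2024, 6, 29), "completed_at": None},                   # sent in range, never completed
]

# Responses Received: responses to surveys *sent* within the range.
responses_received = sum(
    1 for s in surveys
    if range_start <= s["sent_at"] <= range_end and s["completed_at"] is not None
)

# NPS metrics: surveys *completed* within the range, regardless of when they were sent.
nps_responses = sum(
    1 for s in surveys
    if s["completed_at"] is not None and range_start <= s["completed_at"] <= range_end
)

print(responses_received, nps_responses)  # 1 vs 2 -> Responses Received can be lower than the NPS totals
```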
At the franchise or location level, the Customer Loyalty report lists customers and their level of loyalty to the location, indicated by a score and color label (green = Promoter, yellow = Passive, red = Detractor). The default view displays data from all time, but the Date Range filter can be used to select a specific time period.
The Customer Loyalty report is NPS-centric, meaning the NPS formula and methodology determine which data is included in the report. Because the NPS calculation only considers the most recent feedback from each customer in a given time period, only one response per customer will ever appear in this report for your selected date range. If a customer has provided more than one response, the response that shows in the report will depend on the selected date range.
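For example (hypothetical dates and ratings), a customer who was a Detractor in January and a Promoter in March will appear differently depending on the range you select:

```python
from datetime import datetime

# One customer, two responses (hypothetical data).
responses = [
    {"completed_at": datetime(2024, 1, 10), "rating": 4},   # Detractor
    {"completed_at": datetime(2024, 3, 22), "rating": 10},  # Promoter
]

def shown_response(responses, start, end):
    """Return the customer's most recent response completed within the range."""
    in_range = [r for r in responses if start <= r["completed_at"] <= end]
    return max(in_range, key=lambda r: r["completed_at"]) if in_range else None

# Jan-Feb range: only the January response falls in the range, so the customer shows as a Detractor.
print(shown_response(responses, datetime(2024, 1, 1), datetime(2024, 2, 29)))
# Jan-Apr range: the March response is more recent, so the customer shows as a Promoter.
print(shown_response(responses, datetime(2024, 1, 1), datetime(2024, 4, 30)))
```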
When viewed from the brand level or from a reporting dashboard, this report will display the NPS for each franchise or location underneath that reporting dashboard rather than individual customer responses.
Listen360 electronically "reads" every comment submitted by a customer. The algorithms that power the Voice of the Customer report are designed to help you understand precisely what your customers are saying across the whole system, and they summarize the feedback into categories such as Price, Products, Services, and Reliability. For example, if customers are complaining about prices, this report can show you each customer and comment that mentions pricing or any other pre-determined pricing-related term.
Some customer comments may include more than one topic or theme. These comments will be included in each category that applies. As a result, a single customer response can appear more than once in the Voice of the Customer report, so comparing the response totals from this report with other reports is not recommended.
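As a rough illustration of why the totals can differ, here is a simplified keyword-matching sketch. It is not Listen360's actual text-analytics engine, and the categories and keywords are assumptions.

```python
# A simplified keyword-based stand-in for Listen360's text analytics, for illustration only.
CATEGORY_KEYWORDS = {
    "Price": ["price", "expensive", "cost"],
    "Services": ["service", "cleaning"],
    "Reliability": ["late", "on time", "reliable"],
}

comment = "Great service, but the price was higher than expected."

matched = [
    category
    for category, keywords in CATEGORY_KEYWORDS.items()
    if any(keyword in comment.lower() for keyword in keywords)
]

# The same comment is counted once in each matching category,
# so category totals can exceed the number of unique responses.
print(matched)  # ['Price', 'Services']
```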
The people who perform the services that you offer are often an important part of the customer's experience. The Technician Performance report provides insight into what your customers think of each individual service provider. If a service provider is associated with a job or service and a survey is sent and completed for that job, the response will be tied to that service provider's name in this report.
In order to provide a Net Promoter Score for each service provider, the NPS calculation rules are applied: of the feedback associated with a service provider, only the most recent response from each customer in the given time period is shown in the report and included in the calculation. If a customer has provided more than one response, the response that shows will depend on the selected date range. Note that a customer may have a more recent completed survey that is tied to a different service provider, or to no service provider at all. In that case, the survey shown for your service provider is not the customer's absolute most recent survey, but their most recent survey associated with the service provider in question.
When looking at the number of customers surveyed in this report, keep in mind that jobs or services can have more than one service provider. If a customer leaves feedback for a job that involves two separate service providers, that response will show up twice in the report - once for each provider.
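Here is a small sketch of how this attribution might work; the field names and per-provider grouping below are assumptions for illustration, not Listen360's implementation.

```python
from collections import defaultdict

# Hypothetical response records.
responses = [
    {"customer_id": "A", "providers": ["Pat", "Sam"], "rating": 10},  # job with two technicians
    {"customer_id": "B", "providers": ["Pat"], "rating": 3},
]

by_provider = defaultdict(list)
for r in responses:
    # A response is attributed to every provider on the job,
    # so customer A's feedback appears under both Pat and Sam.
    for provider in r["providers"]:
        by_provider[provider].append(r)

for provider, rows in by_provider.items():
    promoters = sum(1 for r in rows if r["rating"] >= 9)
    detractors = sum(1 for r in rows if r["rating"] <= 6)
    nps = round(100 * (promoters - detractors) / len(rows))
    print(provider, len(rows), nps)  # Pat: 2 responses, NPS 0; Sam: 1 response, NPS 100
```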
The Feedback Requests report will help you verify that every franchise or location is successfully requesting feedback.
For each franchise, you'll easily find the volume and recency of surveys sent, as well as the dates of customer data syncs. The total number of requests refers to the number of feedback requests (surveys) sent, regardless of outcome.
The Response Rates report shows the percentage of feedback requests that have been completed out of the total number sent to customers. A feedback request is considered completed as long as the customer has provided a rating score; if they did not provide comments, the feedback request is still considered complete. For example, if 100 feedback requests have been sent to customers and 22 of those requests have received a response, then the response rate would be 22%.
The response rate metric is calculated per feedback request and not per customer. For example, if 3 feedback requests have been sent to the same customer over a year period and the customer only responded to the first feedback request, the response rate would be 33%.
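That per-request calculation looks like this in a minimal sketch (hypothetical data):

```python
# Feedback requests sent over the period; completed=True means the customer gave a
# rating score (comments are not required for a request to count as completed).
requests = [
    {"customer_id": "A", "completed": True},   # same customer surveyed three times...
    {"customer_id": "A", "completed": False},
    {"customer_id": "A", "completed": False},
]

# Response rate is per request, not per customer: 1 completed out of 3 sent.
response_rate = 100 * sum(1 for r in requests if r["completed"]) / len(requests)
print(f"{response_rate:.0f}%")  # 33%
```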