Using Net Promoter Score to help improve customer experience

Companies are continually looking for ways to improve the customer experience they provide and to put their customers at the heart of everything they do. A popular strategy to help achieve this is to listen to the Voice of the Customer, by asking customers for feedback on their experience.

However, to get the best results from a Voice of the Customer program, companies need to make sure they ask the right question, at the right time, in a way that is relevant to the context of the user interaction. They also need to ensure that they can apply the customer's feedback to actually drive improvement in the customer experience, i.e. to make the customer feedback actionable.

For the last 12 months I have been working with one of the world's leading telecoms companies on their Voice of the Customer strategy. This article looks at the challenges we faced in the program, and how we addressed them.

Challenge 1: Asking the right question. We have all seen lengthy web surveys where there is always one more question to answer. However, go too far in the other direction, asking only very generic questions, and the responses will lack focus.

To keep surveys short enough that customers are actually willing to respond, we made use of Net Promoter Score (NPS), a methodology that boils all the questions you may want to ask down to two simple ones – a 0–10 score and a free-text "why" response.

Net Promoter Score - 2 simple questions
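The score itself is derived from the 0–10 responses using the standard NPS bands: customers scoring 9–10 count as promoters, 7–8 as passives, and 0–6 as detractors, and NPS is the percentage of promoters minus the percentage of detractors. A minimal sketch of that calculation:

```python
from typing import List

def nps(scores: List[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no survey responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)
```

For example, five responses of 10, 9, 8, 6 and 10 give three promoters and one detractor, for an NPS of 40.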

By combining the responses to these questions with other information about the customer – what they were doing before and after the survey, why they were sent the survey, and other data such as CRM or sales information – we can associate each response with where the customer is in the user journey. This allows us to understand their response in the context of their most recent interaction with the company – the customer TouchPoint.

Challenge 2: How to collect feedback consistently across the customer journey. So that different responses can be compared, we needed to standardize the way in which customer feedback is collected, processed and analyzed. By using the same survey questions (the NPS questions) across all channels and all stages of the user journey, we can ensure that results from different stages in the customer journey, or from different channels, are comparable.

This allows us to answer questions like: which service channel delivers the highest level of customer satisfaction (web self-service, call centre, social, face-to-face, etc.)? Why do people prefer that channel? What about multi-channel experiences?

However, the method of collecting the customer responses also needs to be standardized to ensure they are comparable. In this case we used a combination of web- and SMS-based surveys, both asking the NPS survey questions with exactly the same wording. By using near-real-time (within 30 minutes) triggers to initiate presentation of the survey, response rates were dramatically increased. And by using automated survey triggers, volumes could be scaled far higher than traditional "clipboard" or call-centre customer surveys.
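As an illustration only (the event type and field names below are invented, not part of the actual solution), the core of a near-real-time trigger is simply a freshness check against the end of the customer interaction:

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    customer_id: str
    channel: str         # e.g. "web" or "call_centre" (hypothetical values)
    completed_at: float  # Unix timestamp of the end of the interaction

def due_for_survey(event: Interaction, now: float,
                   window_secs: int = 30 * 60) -> bool:
    """Trigger the survey only while the interaction is still fresh
    (within the 30-minute window mentioned above)."""
    return 0 <= now - event.completed_at <= window_secs
```

A scheduler or event pipeline would call this check as interactions complete and dispatch the web or SMS survey accordingly.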

Challenge 3: Sorting, categorizing and attributing the drivers of positive and negative experiences. Once we had the foundations in place – what we were going to ask and how we were going to ask it – we needed to look at how we would process the responses. Here, we worked closely with the Adobe partner Clarabridge.

By using Clarabridge's Natural Language Processing (NLP) capability, we were able to process each survey response to:

  • Quantify positive or negative sentiment
  • Identify what contributed to positive or negative sentiment in the feedback (i.e. the free-text or verbatim response to the "why" NPS question) – the NPS drivers
  • Categorize that response according to the business unit or follow-up action needed to address the driver or cause of that (negative) experience.
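Clarabridge's NLP is proprietary and far more sophisticated, but the shape of the categorization step can be illustrated with a toy keyword-based classifier (the categories and keywords below are invented for the example):

```python
# Toy illustration of verbatim categorization; the categories and
# keywords are hypothetical, not taken from the actual solution.
CATEGORY_KEYWORDS = {
    "billing": ["invoice", "bill", "charge", "refund"],
    "network": ["signal", "coverage", "dropped", "slow"],
    "support": ["agent", "call centre", "wait", "queue"],
}

def categorize(verbatim: str) -> str:
    """Map a free-text 'why' response to a follow-up category."""
    text = verbatim.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "uncategorized"
```

In practice this mapping is what routes each piece of feedback to the business unit responsible for acting on it.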

Challenge 4: Getting a complete picture. Through the combination of robust, scalable survey collection; automated processing and categorization of the responses using Clarabridge; and matching of survey responses with other sources of customer data (e.g. web and mobile behavioural data from Adobe Analytics, social data from tools such as Adobe Social, and other sources such as the CRM system, point-of-sale data and IVR/call-centre records), we can provide a complete picture of the user's segment, actions and feelings as they move through the customer journey.

This data, which combines qualitative Voice of the Customer feedback with quantitative data about actions taken, allows us to visualize trends in the customer experience, to quickly highlight problems in real time, and to apply business rules that identify individual pieces of customer feedback needing a reply or follow-up.

As the processing of the free-text "why" responses in Clarabridge produces quantitative data as its output (sentiment, NPS drivers, business category), this data can easily be combined with other sources of data (the NPS score, customer journey stage, customer actions, demographic, geographic and contract data, etc.) to give a complete data set for deeper analysis.

Together these sources of data provide both a macro view of the "big issues" or common trends, and a microscopic view of individual customer problems or challenges.

Challenge 5: Closing the loop with the customer. The final piece of the puzzle was how to use this Voice of the Customer feedback to actually improve the customer experience and to guide business change.

Using business rules that look for combinations of certain customer types, comments, driver categories and/or NPS scores, we can mark certain survey responses as needing follow-up. The follow-up action can then be managed using a lightweight dashboard UI provided to NPS task forces, or the data about the feedback and the required follow-up action can be fed into a CRM or workflow tool.
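As a sketch of what such rules can look like (the field names, segment labels and thresholds here are hypothetical, not the actual rules used in the program):

```python
from dataclasses import dataclass

@dataclass
class SurveyResponse:
    nps_score: int         # the 0-10 "how likely to recommend" score
    driver_category: str   # e.g. "billing", from the NLP categorization step
    customer_segment: str  # e.g. "high_value", from CRM data (hypothetical)

def needs_follow_up(r: SurveyResponse) -> bool:
    """Hypothetical rules: every detractor (0-6) gets follow-up, and
    high-value customers with billing complaints are followed up even
    when they score as passives (7-8)."""
    if r.nps_score <= 6:
        return True
    if (r.customer_segment == "high_value"
            and r.driver_category == "billing"
            and r.nps_score <= 8):
        return True
    return False
```

Responses flagged this way would then surface in the follow-up dashboard or be pushed into the CRM/workflow tool.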


Closed Loop Net Promoter Score Management


The solution put in place for this customer to address these challenges, the TouchPoint Solution, consists of three functional parts:

  • Multi-channel surveys (based on the Adobe Survey product)
  • The integration of behavioural data and Voice of the Customer data (using Adobe Analytics, Adobe Social and Clarabridge)
  • A set of dashboards to identify and follow up on the causes of customer dissatisfaction (developed using Adobe Experience Manager).

The TouchPoint Solution

The following is an example of the closed-loop dashboard:

Closed Loop NPS Dashboard

In summary, the TouchPoint solution helps:

  • Run multi-channel surveys and consistently measure NPS
    • Reduce the cost of Voice of the Customer survey collection
    • Support customer journey optimisation initiatives such as call-centre deflection and right-channeling
    • Measure and compare customer experience across multiple channels
    • Scale Voice of the Customer programs so all customers can be surveyed, as opposed to only a sample
    • Use NPS as a metric in a compensation plan to encourage teams to focus on delivering high levels of customer satisfaction
  • Integrate behavioural and Voice of the Customer data
    • Build a complete picture of customer behaviour and sentiment (for analysis in Analytics Premium)
    • Analyze NPS feedback in the context of multi-channel data (CRM, ePoS, IVR, online, demographic, etc.)
    • Apply natural language analytics to the NPS verbatim to rate sentiment and map feedback to follow-up categories or root causes
    • Turn qualitative free-text feedback into quantitative data, via categorisation and sentiment scoring, for statistical analysis
  • Identify and follow up on the causes of customer dissatisfaction
    • Workflow to "close the loop" with customers – allowing companies to follow up 1:1 with individual customers who give a low NPS score
    • Dashboards to monitor NPS trends by channel and business hierarchy (dashboards are mapped to the org structure, so execs see a macro view across channels, channel managers see data for their channel, and team leads see detailed data per employee)
    • Identify dissatisfied customers at risk of churn and manage follow-up actions
    • Identify the drivers behind positive and negative customer experiences
    • Guide and focus investment in customer experience improvement

If you're interested in developing a similar strategy for your company, please get in touch with me.
