Customer Experience Monitor
2014 Volvo Global CEM Study

Agenda
• Background Volvo Global CEM Study
• Online Reporting
• Support

Global CEM Study Research Objectives and Fieldwork Start
• Measuring global customer satisfaction with a recent new-car purchase or service experience at a Volvo Car retailer, both on an overall satisfaction basis and for a set of detailed satisfaction variables
− Overall Satisfaction – Top Box
− Detailed satisfaction variables – diagnostic
• The surveys provide VCC management and local markets with continuous performance measurements, used in the global performance and target follow-up as well as in local retailer network development work.
• Fieldwork started in the first week of 2012.

Methodology and Participation Rules
• Two surveys/questionnaire types
− Sales – 100% email
− Service – both email and postal
• Target group
− Sales survey: customers who bought a new Volvo
− Service survey: customers who visited a Volvo workshop for service or repair work
• Online survey (Sales & Service)
− Email invite to the online survey
− Sales – 100% of customer records with a valid email address
− Service – postal records are sent if we have not already emailed more than 30% of the total* records received in a rolling seven-day period
• Postal survey – Service only
− Postal surveys are sent to randomly selected customers of dealers who do not capture enough email addresses
*Total records = all records received, regardless of whether they passed or failed the business rules

Methodology and Participation Rules
• Re-Contact Rules
Sales:
− Sales records will be rejected if a sales survey has ever been sent for the same VIN (rule 9)
− Sales records will be rejected if a sales survey has been sent to the same email in the last six months (rule 12)
Service:
− Service records will be rejected if a service survey has been sent for the same VIN in the last six months (rule 9)
− Service records will be rejected if a service survey has been sent to the same email in
the last six months (rule 12)
− Service records will be rejected if a sales survey has been sent for the same VIN in the last six months (rule 64)
• Country/Market coverage: all VCNA markets included

Email Invitation and Opt Out
Below every email there is an opt-out link that the customer can click to be 'blacklisted'.

Postal Invitation Page

Sales Questionnaire (general)
• Average completion time of 8-10 minutes
• Relatively low drop-out rate (4-6%)
• Questions organized to align with JD Power focus areas
• Sales questionnaire content:
− Overall Satisfaction question
− Overall opinion of car
− Recommendation of retailer
− Overall opinion of facility
− Salesperson (3 questions)
− Paperwork and finance (4 questions)
− Customer delivery (3 questions)
− Customer treatment and follow-up (3 questions)
− Car history
− 2 open-ended questions
− 2 demographic questions (gender & age)

Service Questionnaire (general)
• Average completion time of 8-10 minutes
• Relatively low drop-out rate (4-6%)
• Questions organized to align with JD Power focus areas
• Service questionnaire content:
− Reason for visit (e.g. Routine Service/Maintenance, Customer Pay, Warranty, Recall, Body & Paint)
− Overall Satisfaction question
− Overall opinion of car
− Likelihood of repurchase
− Recommendation of retailer
− Appointment and vehicle drop-off (2 questions)
− Service advisor (2 questions)
− Service quality and value (8 questions)
− Vehicle pick-up (3 questions)
− Customer treatment and follow-up (2 questions)
− 2 open-ended questions
− 2 demographic questions (gender / age)

Questionnaire Scale
Respondents rate questions and items on a 1 to 5 point scale.
The wording and scores correspond as follows: Outstanding = 5, Very Good = 4, Good = 3, Poor = 2, Very Poor = 1.

Online Questionnaire Link
Sales: http://survey3.maritz.com/volvo/vas/us/
Service: http://survey3.maritz.com/volvo/vaa/us/
NOTE: Overall Satisfaction question – completion required

Postal Questionnaire Page 1
• The postal questionnaire is used for service only, as a supplement to emails.
• The first page of the postal questionnaire includes the invitation text.

Postal Questionnaire Page 2
Postal Questionnaire Page 3
Postal Questionnaire Page 4

Time Window Online Survey (Sales & Service)
• Customer record sample window: a SALES customer sample is valid between 0-45 days from the RDR date; a SERVICE customer sample is valid between 0-45 days from the RO open date.
• Event date: Sales = RDR date / Service = RO open date.
• Maximum 73 days between retailer visit and questionnaire completion: up to 45 days for the customer record sample to be processed (validity check / invitations sent out), plus up to 28 days for the questionnaire completion period.
• Time window to complete the questionnaire link: the online respondent has a further 28 days to complete the online questionnaire; the maximum time between the event and expiry of the questionnaire link is 73 days.
• On average, respondents fill out the online questionnaire within 3-4 days; postal questionnaires are usually returned after approximately 24 days.

Average Time to Complete the Survey (1/2)
• The sales online survey is completed after approximately 4 days.
• More than 75% of all respondents complete the sales survey within the first week.
| Event Type | Country | Event Year | Avg. days to complete | Surveys (N) | Methodology | First week N (%) | Second week N (%) | Third week N (%) | Fourth week N (%) | > Fourth week N (%) |
| Sales | CA | 2012 | 5 | 1616 | Web | 1222 (75.6%) | 252 (15.6%) | 94 (5.8%) | 48 (3.0%) | 0 (0.0%) |
| Sales | US | 2012 | 4 | 19343 | Web | 15463 (79.9%) | 2748 (14.2%) | 719 (3.7%) | 412 (2.1%) | 1 (0.0%) |
| Sales | CA | 2013 | 4 | 877 | Web | 680 (77.5%) | 140 (16.0%) | 30 (3.4%) | 27 (3.1%) | 0 (0.0%) |
| Sales | US | 2013 | 3 | 13120 | Web | 10930 (83.3%) | 1610 (12.3%) | 371 (2.8%) | 200 (1.5%) | 0 (0.0%) |

Average Time to Complete the Survey (2/2)
• The service online survey is completed after 3 days.
• Postal surveys are returned after approximately 24 days.
• More than 80% of all respondents complete the service online survey within the first week.

| Event Type | Country | Event Year | Avg. days to complete | Surveys (N) | Methodology | First week N (%) | Second week N (%) | Third week N (%) | Fourth week N (%) | > Fourth week N (%) |
| Service | CA | 2012 | 3 | 8342 | Email | 6863 (82.3%) | 1215 (14.6%) | 186 (2.2%) | 78 (0.9%) | 0 (0.0%) |
| Service | CA | 2012 | 30 | 1706 | Postal | 0 (0.0%) | 198 (11.6%) | 185 (10.8%) | 555 (32.5%) | 768 (45.0%) |
| Service | US | 2012 | 3 | 75178 | Email | 62697 (83.4%) | 10285 (13.7%) | 1546 (2.1%) | 650 (0.9%) | 0 (0.0%) |
| Service | US | 2012 | 20 | 18169 | Postal | 3 (0.0%) | 6214 (34.2%) | 6833 (37.6%) | 2441 (13.4%) | 2678 (14.7%) |
| Service | CA | 2013 | 3 | 5088 | Email | 4262 (83.8%) | 657 (12.9%) | 108 (2.1%) | 50 (1.0%) | 0 (0.0%) |
| Service | CA | 2013 | 28 | 918 | Postal | 0 (0.0%) | 130 (14.2%) | 99 (10.8%) | 329 (35.8%) | 360 (39.2%) |
| Service | US | 2013 | 3 | 51732 | Email | 44244 (85.5%) | 6229 (12.0%) | 871 (1.7%) | 319 (0.6%) | 1 (0.0%) |
| Service | US | 2013 | 19 | 10344 | Postal | 1 (0.0%) | 3434 (33.2%) | 4348 (42.0%) | 1270 (12.3%) | 1291 (12.5%) |

Data Cut-Off / Reporting Dates
The data for a specific event month is final 73 days after the last date of that month. This means that
questionnaires returned after this date are no longer counted once the monthly data on the reporting website is final.

Key Postal Dates 2014 (*2015)
| Event month | Last day in month | Last day to import customer record (45 days from month-end) | Final postal send date (+14 days) | Reporting final (+14 days) |
| January | Jan 31 | Mar 17 | Mar 31 | Apr 14 |
| February | Feb 28 | Apr 14 | Apr 28 | May 12 |
| March | Mar 31 | May 15 | May 29 | Jun 12 |
| April | Apr 30 | Jun 14 | Jun 28 | Jul 12 |
| May | May 31 | Jul 15 | Jul 29 | Aug 12 |
| June | Jun 30 | Aug 14 | Aug 28 | Sep 11 |
| July | Jul 31 | Sep 14 | Sep 28 | Oct 12 |
| August | Aug 31 | Oct 15 | Oct 29 | Nov 12 |
| September | Sep 30 | Nov 14 | Nov 28 | Dec 12 |
| October | Oct 31 | Dec 15 | Dec 29 | Jan 12* |
| November | Nov 30 | Jan 14* | Jan 28* | Feb 11* |
| December | Dec 31 | Feb 14* | Feb 28* | Mar 14* |

Time Window Postal Survey (Service only)
• Customer record sample window: a SERVICE customer sample is valid between 0-45 days from the RO open date.
• Time window to complete the postal questionnaire: unlike the online survey, this window is not based on the date of sample processing, but on the last date before the event month is closed in the reporting (which is the last date of the event month) plus 45 days plus 14 days.
• The kill date is different to allow for the fact that the questionnaire needs to be printed, posted, delivered to the respondent, posted back, delivered to Maritz, sorted, scanned, transcribed…

Quota System (only for Service survey) (1/3)
• The postal quota system for the service survey is based on records received/processed on a weekly basis from Friday to Thursday.
• Every day service records are received, all valid email records are sent out. Postal records are held back until the last day of the rolling 7-day period (Thursday).
• When it comes to Thursday (the day that postal records are selected for invitation): a.
For each retailer: count how many service records in total (valid and rejected) were received from the previous Wednesday to Thursday.
b. For each retailer: calculate what 30% of the total records would be.
c. For each retailer: count how many email records were sent from the records received from the previous Wednesday to Thursday.
d. For each retailer: subtract the number of email records sent from the 30% value, and only send postal records to make up the shortfall.
NOTE: The number of rejected records affects the number of postal records that could potentially be sent, because the 30% target is 30% of all records received.

Quota System (only for Service survey) (2/3)

Quota System (only for Service survey) (3/3)
Email-only example
Example 1:
− Records received in a week: 150
− Records that passed the business rules: 100
− Records with a valid email: 50
− 30% of the records received in the week: 45
− Difference between emails sent and the 30% value: 45 – 50 = -5 (no postal invitations sent)
Postal examples
Example 2:
− Records received in a week: 150
− Records that passed the business rules: 100
− Records with a valid email: 30
− 30% of the records received in the week: 45
− Difference between emails sent and the 30% value: 45 – 30 = 15 (15 postal invitations sent)
Example 3:
− Records received in a week: 150
− Records that passed the business rules: 100
− Records with a valid email: 15
− 30% of the records received in the week: 45
− Difference between emails sent and the 30% value: 45 – 15 = 30 (30 postal invitations sent)
This quota system has the benefit of not having to put retailers into categories: if a retailer's volume increases or decreases, the system automatically takes that into account for postal.

Agenda
• Background Volvo Global CEM Study
• Online Reporting
• Support

Reporting Specifics
• Ease of use, encourages knowledge transfer & action
• Ability to transfer data to other software applications (e.g.
Excel, PDF)
• Available in local languages (US English, French Canadian)
• Different reporting levels according to hierarchy
• Database updated daily
• Ability to generate, save and print individual reports

Reporting Tool Access (VRC²)
Users get direct access to the CEM reporting tool on VRC² (Intranet).

Customer Experience Monitor (CEM) VCNA Hierarchy
The VCNA Hierarchy table gives an overview of all entities and related KPI results.

Reporting Levels (1/2)
There are different reporting levels based on the organization of Regions, Markets and Retailers.

Reporting Levels (2/2)
On Country level, all different levels can be selected.

Global Dashboard
• This report shows the targets (where available) of all global countries for Sales and Service for the current year at one glance.
• Furthermore, the 12-month rolling data as well as the trend of overall satisfaction are indicated.

Reporting Dashboard
Overview of important scores at one glance for an entity.

Overall Satisfaction
• Overall Satisfaction is the main performance indicator in the CEM survey.
• The coloring at the top of the gauge indicates whether the entity's score is below/above the country target, or even within the score range of the Top 25% retailers in the country.
• On the dashboard one can find different score types for overall satisfaction:
− Top Box Overall Satisfaction: customers who answered 'outstanding'
− Bottom 2: those who answered 'very poor' or 'poor'
− Mean: average score of all surveys
− Response %: percentage of completed surveys vs. all customer surveys sent
− Completed Interviews: total number of completed surveys

Hot Alerts (1/4)
When customers state that their overall satisfaction with their sales or service experience was poor or very poor, they generate a Hot Alert (on the left-hand side of the dashboard). The platform provides Volvo with the option to keep track of all Hot Alerts occurring throughout the survey and to process these Hot Alerts adequately.
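As an illustrative sketch only (the platform's internal logic is not published here, and all names and data below are invented), the dashboard score types and the Hot Alert trigger described above can be expressed as:

```python
# Sketch of the dashboard score types and the Hot Alert rule described
# above. Data layout and function names are hypothetical, not the
# platform's actual API.

def score_summary(ratings, surveys_sent):
    """ratings: overall satisfaction answers on the 1-5 scale (5 = Outstanding)."""
    completed = len(ratings)
    return {
        # Top Box: customers who answered 'outstanding' (5)
        "top_box": sum(r == 5 for r in ratings) / completed,
        # Bottom 2: customers who answered 'very poor' (1) or 'poor' (2)
        "bottom_2": sum(r <= 2 for r in ratings) / completed,
        # Mean: average score of all surveys
        "mean": sum(ratings) / completed,
        # Response %: completed surveys vs. all customer surveys sent
        "response_pct": completed / surveys_sent,
        # Completed Interviews: total number of completed surveys
        "completed_interviews": completed,
    }

def is_hot_alert(rating):
    # A 'poor' (2) or 'very poor' (1) overall satisfaction answer
    # generates a Hot Alert.
    return rating <= 2

summary = score_summary([5, 4, 5, 2, 3, 5, 1, 4], surveys_sent=20)
print(summary["top_box"])   # 0.375
print(is_hot_alert(2))      # True
```

The same 1-5 scale from the questionnaire section drives both the gauge scores and the alert flag, which is why a single list of ratings suffices for this sketch.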
More detailed info can be found in the user manual.

Hot Alerts (2/4)
• New (Hot Alerts): the figure beside the red flag shows the count of new Hot Alerts that have been generated recently or have not yet been processed.
• Pending (Hot Alerts): as soon as handling of a new Hot Alert is in progress, and it has therefore been set to status 'Pending', the Hot Alert is shown under this category. The figure beside the blue flag shows the count of pending Hot Alerts.
• Resolved (Hot Alerts): the figure beside the green flag shows the count of resolved Hot Alerts, which have been fully processed and labeled 'Resolved'.

Hot Alerts (3/4)
When clicking on the flag on the dashboard, one is redirected to the Hot Alert overview.

Hot Alerts (4/4)
By clicking on the loupe icon one is redirected to the completed questionnaire. By clicking on the flame one is redirected to the Hot Alert history.

(Hot) Alert Subscriptions (1/2)
By selecting "Manage Alert Subscriptions" in the Settings menu you can register to receive email notifications when a Hot Alert occurs. After clicking on this you are forwarded to another screen, where you have to...

(Hot) Alert Subscriptions (2/2)
1. Select an entity
2. Enter an email address and click "add"
3. The notified email address is marked as a subscriber in the list

Available Reports (1/2)
Several specific reports with different functions are available in the reporting.
• Priority Report: shows for each entity its most important areas of improvement, i.e. those questions/criteria that need to be focused on in order to lift Overall Satisfaction
• Scorecard: shows key results of the CEM survey
• Ranking: comparison of results of different levels (e.g.
retailer, area) in ranking format
• Trend by Question: shows the development of the scores over time per question
• Result by Question: shows the results separately for all scaled questions
• Result Distribution: shows the distribution of all responses per question
• Result by Question and Level: shows the results for all scaled questions and different entities and hierarchy levels at one glance
• Verbatim Report: shows the verbatim given in the open-ended questions per completed questionnaire
• Individual Customer Responses: shows the individual results of each respondent
• Customer Data Report: shows the valid sample and received completes as well as the completion method per event month

Available Reports (2/2)
• Customer Data Quality Report: shows how much of the provided sample was invalid, valid, and the total amount of sample provided, for all hierarchical levels
• Raw Data: provides the raw data of all completed customer questionnaires
• Dynamic Ranking: in addition to the Ranking report, rankings within individual regions or markets are available
• Employee Report: shows results on employee level
• Response Rate Trend: shows the total number of invited customers, the total number of completed interviews, and the response rate in % for your entity or the levels that report into it
• Query Tool: enables cross-classified table analyses with two questions
• Website Activity Report: shows the total number of logins for the reporting website as well as the percentages of users and hierarchy levels logging in

Available Filter (1/3)
• Versus level: possibility to compare results with different levels – selection criteria: Region, Market, Retailer Group, Retailer
• View:
− Report into me: one can choose to see the results of lower levels (depending on access rights)
− Entity comparison (selection criteria: Country, Region, Market, Retailer Group, Retailer (Top 25% Dealers)): possibility to compare results of different levels or different units of the
same level, e.g. Market to Region, Region to Region, Retailer to Retailer (depending on access rights); may also be used to show single Retailer, Market, etc. results on Country level
− Among my peers: one can choose to see one's own results compared to others at the same level; for example, one Retailer Group can see its own results compared to other Retailer Groups. This is available for Market and lower levels; the Country level user is able to see all results anyway, so this is not applicable on Country and higher levels.

Available Filter (2/3)
• Time Period*: Rolling 12 months, Rolling 6 months, Last 3 months, Year to date, Choose Your Own (*via the additional filters BeginDate and EndDate the user can define a period)
• Score Type: Top Box, Bottom 2, Mean
• Questionnaire Type: Sales, Service
• Service Type (only in combination with Questionnaire Type 'Service'): Routine service/Maintenance Work, Repair under Warranty, Repair out of warranty, Recall
• Gender: Male, Female
• Age Range: <25 / 25-34 / 35-44 / 45-54 / 55-64 / >65
• Question number: all scaled questions of the questionnaire
• Time Period: any period from launch of the survey to the current month
• Time Period – Start: any month from launch of the survey to the current month
• Time Period – End: any month from launch of the survey to the current month
• Level to Rank: define the entity level for which you want to produce a ranking
• Rank Over: define the entity level within which you want to do the ranking

Available Filter (3/3)
• Group/Sort: with this filter you can sort verbatim according to different criteria (Dealer Name, Dealer Code, Event Date, Complete Date, Overall Satisfaction)
• Vehicle Mileage: 0-15K / 15-30K / 30-45K / 45-60K / 60-75K / 75-90K / >90K
• Vehicle Age: 0-3 years / 4-6 years / 7-9 years
NOTE: Not all filters may be applicable for all reports.
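As a small sketch of how the Vehicle Mileage and Vehicle Age filter bands above could be applied to raw values (the tool's actual logic is not documented here, and the treatment of exact boundary values is an assumption):

```python
# Hypothetical bucketing for the Vehicle Mileage / Vehicle Age filters
# listed above. Band boundaries follow the filter labels; boundary
# handling at the edges is an assumption.

def mileage_band(miles):
    bands = [(0, 15), (15, 30), (30, 45), (45, 60), (60, 75), (75, 90)]
    k = miles / 1000.0  # the filter labels are expressed in thousands
    for lo, hi in bands:
        if lo <= k < hi:
            return f"{lo}-{hi}K"
    return ">90K"

def age_band(years):
    if 0 <= years <= 3:
        return "0-3 years"
    if 4 <= years <= 6:
        return "4-6 years"
    if 7 <= years <= 9:
        return "7-9 years"
    return None  # outside the bands defined by the filter

print(mileage_band(22000))  # 15-30K
print(age_band(5))          # 4-6 years
```

Note that the Vehicle Age filter only defines bands up to 9 years, so older vehicles fall outside every band in this sketch.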
Priority Report (1/2)
• The Priority Report shows for each entity its most important areas of improvement, i.e. those questions/criteria that need to be focused on in order to lift Overall Satisfaction.
• The first page shows the top 3 priorities for increasing overall satisfaction, linked to best-practice comments that point out possible actions to improve those individual items.
• An explanation of the exact calculation of the indicated priority scores can be found in the help file behind the question mark.

Priority Report (2/2)
Via the "Next page" button at the top of the report, one can go on to the 2nd page, which displays the individual priority scores for all questions.

Scorecard
• The Scorecards show the market key results.
• All results are "Top Box".
• The orange line represents the top 25% dealers.
• The red line graph is the market target line.
• The line graph shows the 12-month rolling (12mR) progress.
• The bar graphs display the 3-month rolling (3mR) progress.

Ranking
This report shows the Top Box score of the different levels in a ranking.

Trend by Question
This report shows the development of the scores over time per question.

Result by Question
• This report shows the results separately for all scaled questions.
• One sees a comparison of market scores and Top 25% retailer scores.
• The questions with a 5-point scale are shown on top, and the yes/no questions are shown separately below.
• The table at the bottom displays the exact scores of the graphs.

Result Distribution
This report shows the distribution of all responses per question. It displays counts (= number of answers) for each score of each scaled question.

Result by Question and Level
With this report, comparisons of results between different entity levels, such as Market level or Retailer level, are possible.

Verbatim Report
• The Verbatim Report shows the verbatim given in the open-ended questions per respondent.
• This report allows one to relate each verbatim to the individual respondent.
• All respondents and their comments are listed and sorted by retailer, so that critical comments/requests by respondents can be related and issued directly to the retailer and employee.

Individual Customer Responses
This report shows the individual results of each respondent. The first page shows some key information for each customer, including overall satisfaction score, customer name, model, complete method, VIN number, event date and employee.

Customer Data Report
• This report shows how much valid sample was received and how many email/postal invites have been sent.
• If one opens the details by clicking on the plus sign, one sees how many postal invitations were completed online or sent back by post.

Customer Data Quality Report
• This report shows how much of the provided sample was invalid, valid, and the total amount of sample provided, for all hierarchical levels.
• It also distinguishes between error reasons that were within the retailer's control or not, so that the sample accuracy of a dealer can be classified in a fair way.
• The rejection reasons are indicated at the top of the list and are also linked to an icon.

Raw Data
• After filling in the Raw Data Request form with the begin and end event date of the required time period, one receives an email notification that the requested raw data is available for download on the CEM website under the "Documents" menu.
• The raw data is provided in an Excel spreadsheet for further analyses.

Dynamic Ranking
With this report, rankings within individual Regions or Markets are available by adjusting the filter settings "Level to Rank" and "Rank Over".

Employee Report (1/5)
• The Employee Report shows cumulated data from completed questionnaires, related to an individual employee.
• From the overview of the Employee Report, further detailed reports are available.
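The per-employee cumulation described above can be sketched as a simple grouping step (a sketch with invented sample data and field names; the platform's real data model is not shown here):

```python
# Sketch of cumulating completed questionnaires per employee, in the
# spirit of the Employee Report. Records and names are invented examples.

from collections import defaultdict

def cumulate_by_employee(responses):
    """responses: (employee, overall_satisfaction 1-5) pairs."""
    grouped = defaultdict(list)
    for employee, rating in responses:
        grouped[employee].append(rating)
    # For each employee: number of surveys and Top Box share (rating == 5)
    return {
        emp: {
            "n": len(ratings),
            "top_box": sum(r == 5 for r in ratings) / len(ratings),
        }
        for emp, ratings in grouped.items()
    }

responses = [("J. Smith", 5), ("J. Smith", 4), ("A. Jones", 5)]
print(cumulate_by_employee(responses))
```

Grouping first and aggregating second mirrors how the same Top Box metric can then be reported at employee, retailer, or country level.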
Employee Report (2/5)
• By clicking on (1), the blue underlined retailer name and code in the Employee Overview, one is forwarded to the Retailer Employee Summary, on which the results of all of the retailer's employees are shown at one glance for every single question.
• Furthermore, this overview contains benchmark figures for the retailer's related Country, Region and Market.

Employee Report (3/5)
• By clicking on (2), the blue underlined employee code in the Employee Overview, one is forwarded to the Retailer Employee Summary, on which the results of this particular retailer employee are shown at one glance for every single question.
• Furthermore, this overview contains benchmark figures for the retailer's related Country, Region and Market.

Employee Report (4/5)
• By clicking on (3) in the Employee Overview, one is forwarded to the Employee Report of this particular employee.
• The screen with the employee's cumulated results includes: (A) Dashboard, (B) Result by Question, (C) Result Distribution.
• These report tabs correspond in functionality and results to their counterparts on dealer and country level.

Employee Report (5/5)
• By clicking on (4) in the Employee Overview, one is forwarded to the Individual Customer Responses of this particular employee.
• This report corresponds in functionality and results to its counterpart on dealer and country level.

Response Rate Trend
This report shows the total number of invited customers, the total number of completed interviews, and the response rate in % for your entity or the levels that report into your entity.

Query Tool
• The Query Tool enables cross-classified table analyses with two questions.
• This report is available under the "Info" button in the navigation menu.

Website Activity Report (1/3)
• This report shows the total number of logins for the reporting website as well as the percentages of users and hierarchy levels logging in.
• This report is available under the "Info" button in the navigation menu.

Website Activity Report (2/3)
By clicking on the blue numbers in the table "Login Counts by Month and Level", an overview opens that shows the details of the users that were logging in.

Website Activity Report (3/3)
The graphs on the first page of the Website Activity Report, below the table "Login Counts by Month and Level", show the Total Report Accesses as well as the Report Accesses by Level for the last 6 months.

(Offline) Report Subscription (1/2) – only for users at country level
By selecting "Manage Report Subscriptions" in the Settings menu you can register persons to receive additional offline reports. After clicking on this menu you are forwarded to another screen, where you have to...

(Offline) Report Subscription (2/2) – only for users at country level
1. Select a report and a report instance
2. Enter an email address and click "add"
3. The email address is marked as a subscriber in the list

Agenda
• Background Volvo Global CEM Study
• Online Reporting
• Support

Help Desk – Customers
If customers need further assistance, they can fill out a request form on the Customer Satisfaction Help Desk website to ask questions of either Global CEM (Technical Support) or Volvo: https://www.mbsrv.de/Volvo_CFL_Helpdesk/

Help Desk – Customers
When they choose to write a comment to Volvo, they must complete the request form below.

Help Desk – Retailers
The Help Desk is also available for retailers via phone: 800.752.2541, fax: 888.681.7344, or email: [email protected]
The Help Desk hours of operation are Monday – Friday, 8:30 AM to 5:00 PM (Eastern Standard Time).

Help Desk – Retailer Access Request Form (1/2)
If one wants to access the CEM reporting via VRC² but has not yet applied for a user account, the screen below is shown. By clicking on the link, an application form opens, which needs to be printed, filled out and then sent via fax to the Help Desk.
Help Desk – Retailer Access Request Form (2/2)
• Retailer CEM access is granted to the following job roles: Principal, General Manager, Sales Manager, Service Manager, Office Manager.
• For all other retailer personnel, CEM access can be requested by completing the request form available on the website (sample to the right).