Showing posts with label Customer Satisfaction. Show all posts

Mar 10, 2013

Customer Satisfaction Survey Grading

What grades do you use in customer satisfaction surveys? How are grades influenced by the culture of the country? And what should you do if you run surveys in different countries with different backgrounds?

I have been discussing customer satisfaction here a few times. Related to that, let me share a brief anecdote with you:

We implemented new Service Desk software at a customer's site. They are a managed services company providing support all over the world, 24x7. After the resolution of every ticket, the contact person on the ticket receives an e-mail notification with a link to a short web-based survey.

There were just a few questions regarding the speed of resolution, communication, the competence of support people, and overall satisfaction with the ticket resolution.

Grades were 1-5, with an explanation above the form stating that 1 = poor and 5 = excellent.

We received quite a bunch of survey results at the beginning, which was the intention. Here and there a low score came in, but we were not alarmed; you can't please everyone. The Service Manager was in charge of treating all grades below 3 as a customer complaint and following up with those customers to raise their satisfaction.

Then one day we received two very bad results, averaging below 2. Both from the same market. Alert! We are doing something wrong.

So the Service Manager sent an apologetic e-mail inquiring what went wrong and how we could improve, blah...
The answer came quickly from both customers: they were sorry, but they had thought that 1 was the best grade and 5 the worst.

Was something wrong with the survey code? The customers sent us their screenshots; everything was fine, and the explanation in the form header clearly stated "1 = poor and 5 = excellent". Both customers were from Germany. Asked to explain how they had misunderstood this clear instruction, they said: "The German school grading system is 1 to 5, with one being the best grade (Sehr gut) and five the worst, insufficient (Nicht genügend)." They simply didn't bother reading the explanation; they automatically presumed that this survey from another country complied with their long-term grading experience in the German school system. Can't blame them.

So we looked around: what are the grading standards in school systems around the world?

The USA and countries it has influenced use A-F letter grades. Europe varies greatly depending on history: in some places 1 is bad, while in the German sphere of influence it is excellent (the Czech Republic and Slovakia as well).
Eastern European countries, Asia and Oceania use either the Russian 1-5 system or a 0-100% percentage system.

Customer Survey Grades


Interesting details:
  • Venezuela uses a rather exotic 0-20 grading system.
  • Ecuador and Serbia (opposite hemispheres) use similar 5-10 grading systems, where 5 = fail and 10 = excellent.
  • A lot of countries use different grading systems for primary, high school and university grades.
So what approach should we take to grading customer satisfaction in order to make it intuitive regardless of the customer's local educational background?
 
We had only two solutions:
  • Switch to 1, 2, 3, 4 grades, which customers can't map onto their finer-grained school systems. A 1-4 scale is also good because it eliminates the indifferent middle grade and forces the customer to decide between the better 3 and the worse 2.
  • Take the -2, -1, 0, 1, 2 approach. Good: it is self-explanatory; negative numbers can't be good. Bad: it leads the customer toward the mediocre "0" grade, suggesting that everything is OK.
For now we are using the second grading system, accepting the downside that some customers see nothing wrong in selecting a "0", which is a neutral grade.

An additional tweak would be to change grades to -1, 0, 1, 2, which would "push" a customer towards positive grades some more.
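Whichever scale wins, it helps to normalize grades onto a common 0-100% range internally, so historical results stay comparable if the scale changes yet again. A minimal sketch (the function name and the scales shown are illustrative, not part of our actual survey system):

```python
def normalize(grade, lo, hi):
    """Map a grade from an arbitrary scale [lo, hi] onto 0-100%."""
    if not lo <= grade <= hi:
        raise ValueError(f"grade {grade} outside scale [{lo}, {hi}]")
    return 100.0 * (grade - lo) / (hi - lo)

# The same stored percentage works across different survey scales:
assert normalize(5, 1, 5) == 100.0   # top grade on the 1-5 scale
assert normalize(0, -2, 2) == 50.0   # the neutral "0" on the -2..2 scale
```

With this in place, switching from 1-5 to 1-4 or to -2..2 does not break trend reports, since every stored result is already scale-independent.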

Which method do you think is most appropriate for you?


Related articles:

Customer Satisfaction 
How bad do we need it in IT Service Management? Where is it mentioned and where is it dealt with in ITIL V3? How do we manage it in real life?

Customer Satisfaction Survey: What Methods To Use?
How to gather customer satisfaction data? What methods are there? What ITIL says? What methods will work for you?

Jan 7, 2011

Customer Satisfaction Survey: What Methods To Use?

How to gather customer satisfaction data? What methods are there? What ITIL says? What methods will work for you?

USA Today has come out with a new survey - apparently, three out of every four people make up 75% of the population.
David Letterman
 
In my last post I spoke about Customer Satisfaction. OK, but how do you gather the data? How do you interview customers? How do you get the best results from surveys and help educate the customer about the process?
Here I give you a list of basic ITIL methods for gathering customer satisfaction data, with my comments on their worth in my company. Mind you, we are in the Managed Services business, so my grades are biased.

If you are a small IT department in a government agency, or the IT department of an airline, you may value things differently. Still, this will give you an idea.


Customer Satisfaction


After-Call Survey
Callers are asked to remain on the phone after the call and then asked to rate the service they were provided.

This method requires some kind of telephony automation, such as Interactive Voice Response (IVR). The thing with this method is that however you set up the system, it will often be despised by the customer. Most people don't like hearing recorded messages, and even more hate moving down a menu tree with the phone keypad. The outcome? Only the extremes: customers who are very satisfied and wish to reward an agent with good grades, or, more often, very pissed-off customers who will go out of their way to express their low level of satisfaction.

Call Center organizations are doomed to use this technology, since phone calls are their only way of communicating with the customer. In ITSM there are usually more ways to talk to the customer.

My worth: 1/5


Mail Surveys
We never considered sending mail interviews to customers as an option. Not in this century, at least. What kind of IT-related person would fill in a received paper form and send it back to us? Do we need feedback from that kind of person? No, thanks.

With e-mail it's maybe slightly better. We used to have a Service Desk application which would send an automated notification on the incident-resolve event, asking the customer contact to reply in an e-mail, grading their satisfaction with the speed and quality of resolution on a scale of 1 to 5. So lame. A response rate of 0.05% was enough to pacify our ISO 9001 auditor, but it was next to worthless to us otherwise. One good thing: receiving a small number of answers in unstructured text e-mails made the result processing much easier :-)

My worth: 1.5/5


Group Interviews
In this method, customers are gathered in small groups and interviewed together. This can perhaps be somewhat useful for a group of people working together; even then, their answers will be biased and interdependent. The possible value I see in this one is that it helps develop communication between customers of different knowledge levels and enhances collective process knowledge.

In my organization we try to keep customers on different SLAs as far from each other as possible. We do not conduct group interviews, but we do organize occasional ITSM workshops for our customers, and the feedback is usually very good.

My worth: 2/5


Phone Surveys
Some time after the interaction with Service Desk, an independent agent calls the customer.

You are in the middle of your monthly report, juggling a database and three spreadsheets, and some guy calls you to grade your experience with the Service Desk from ten days ago? You will brush him off. OK, maybe not the first time, but the next time, for sure.

So, if you want results with this method, you need trained people with strong communication skills and good opening lines. It is also a good idea not to survey every call, to let the customer rest from you now and then. If they anticipate your call, they start thinking up good excuses to brush you off. Surprise them occasionally.

My worth: 2/5


Personal interviews
The surveying person is usually the Incident Manager or Service Desk Manager, while the customer representative is the person responsible for the SLA on their side. These can also be the contact persons on Major Incidents after the post-mortem reporting, but that's another story and should be defined in the Major Incident procedure.

These interviews are usually periodic and should be conducted at least annually (usually they are, during a contract renewal), but preferably on a quarterly basis. During these, the two parties communicate customer needs and help focus Service Support on the things important to the customer. The questions in a survey should therefore be aligned with Service Desk and Incident Management KPIs.

These are of great value to us.

My worth: 4/5


Online Surveys
We have worked with several Service Desk automation tools from different vendors. In the end, we developed a web application which enables us to send different surveys to our customers. They can be ticket-related, periodic or ad hoc surveys, with predefined templates of multiple-choice and freeform answers. We like it so much that we host it for other departments of our company. And we do not sell it, it's that good :-)

Customers fill it in at their own convenience (OK, we sometimes call and beg them to fill in surveys), and after they submit the form, the results automatically end up in the report.

Web-based surveys are a life-saver to us.

My worth: 5/5


Other Methods
Blogs, tweets, wikis, forum discussions and chats are all nice to have and, if maintained well, they will influence customer experience positively.

However, it is next to impossible to mine a structured measurement of customer satisfaction out of them, and they shouldn’t be used for that purpose.


Additional About Surveys

Scoring
It is best to define simple questions with multiple-choice number ratings. How many different grades? For fine gradation in longer interviews we use grades 1-10. For faster, ticket-focused surveys we opted for coarser gradation. We used 1-5 for a long time, but most of the grades clustered at 1, 3 and 5. Eliminating the middle value and shifting to 1-4 made our customers think more. We still use the 1-4 system for most surveys, and we are satisfied with it.

In addition, it is often nice to put an optional comment field under every question. It doesn't cost much, and sometimes customers leave very useful information there.
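On a 1-4 scale, the "low grades trigger a follow-up" rule mentioned earlier can be sketched roughly like this (the threshold value and the data shape are illustrative assumptions, not our production logic):

```python
FOLLOW_UP_THRESHOLD = 3  # illustrative: average grades below this trigger a call

def needs_follow_up(responses):
    """Return ticket IDs whose average survey grade falls below the threshold.

    `responses` maps a ticket ID to the list of grades (1-4) the customer
    gave on that ticket's survey questions.
    """
    return sorted(
        ticket for ticket, grades in responses.items()
        if sum(grades) / len(grades) < FOLLOW_UP_THRESHOLD
    )

# Ticket "T2" averages about 1.67 and gets flagged for a follow-up call:
assert needs_follow_up({"T1": [4, 4, 3], "T2": [2, 1, 2]}) == ["T2"]
```

The point of automating this is that no unhappy customer slips through just because the Service Manager was busy that week.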

How many questions?
You certainly don't want to scare off the customer with a bunch of questions. OK, annual interviews can be longer, and internal ITIL/ISO 20000 audits and assessments can have a gazillion questions, but a survey on incident performance can have as few as one (how satisfied are you?), ideally three, and at most five multiple-choice questions. You are fishing for quantity here.

Anonymous or named?
Some customers would rather answer an anonymous survey than a named one. Well, that's their problem. This is one case where the customer is not always right. You want to know who is satisfied and who is not, so you can decide on your further actions.


Related posts:

Customer Satisfaction in ITIL Service Management: Do You Get It?
Elementary facts about ITSM Customer Satisfaction - How bad do we need it in IT Service Management? Where is it mentioned and where is it dealt with in ITIL V3? How do we manage it in real life?

Nov 30, 2010

Customer Satisfaction in ITIL Service Management: Do You Get It?

Customer Satisfaction:  How bad do we need it in IT Service Management? Where is it mentioned and where is it dealt with in ITIL V3? How do we manage it in real life?

- I Can't Get No...
- I'd rather be dead than singing "Satisfaction" when I'm forty-five.  
  Mick Jagger

Customer Satisfaction is very important. One of the main aims of ITIL is to put the customer in focus. This is usually done by finding ways to raise the level of customer satisfaction, often by creating a common language for better communication between the Business and IT.


Implicitly, customer satisfaction is mentioned in all ITIL processes and functions. Explicitly, there are parts of ITIL where we really take care of it. Where?


SERVICE STRATEGY
In the Strategy stage of the ITIL Service Lifecycle we talk generally about opportunities, markets, possible services, where we are and where we want to be. This is where we first define the level of satisfaction we want to achieve. People new to this field think that customer satisfaction should be as high as possible. Sorry, but that's not the case. If the customer is super satisfied, it often means that we are over-delivering, and he is getting more than he paid for.

PERCEPTION
Satisfaction is about PERCEPTION. It is not about the real, objective quality of service; it is about how the customer sees that quality. There are cases when a customer sees the service as much better than it is, and sometimes the service is perceived as much worse than it is in reality, usually due to bad communication or a few isolated cases that gained high visibility.

The customer interacts with the service and compares its quality. To what? Not necessarily to a previous or existing service provider's service, but always to his own EXPECTATIONS. That's why

Customer satisfaction =  Perception - Expectation

And our life depends on staying on positive side of this equation.


SERVICE DESIGN
Design is the phase where we define a new service. At the end of the design phase we know where we want to be and how satisfied our customer should be. The main places where we define customer satisfaction are Service Level Management, Availability Management and Capacity Management.

Service Level Management
Now we are at the right place.

The goal of SLM is to ensure that the agreed level of IT service is provided. And that any services we will provide in the future will be delivered as agreed.

The alleged purpose of SLM is to provide the metrics for conformance of the achieved level of service to the agreed one.

But in reality, the main goal of SLM is to improve communication between the Business and IT. The SLM process is the most important and valuable process in the creation of a service. It helps the Business KNOW what to EXPECT, and it also helps IT know what is important to provide.

A Service Level Agreement (SLA) is just a document. But the process of SLM helps IT and the Business understand each other. Negotiations, request weighting, estimations: it all helps the Business understand what resources are involved and how difficult it is to achieve every little fraction of an availability percentage. Likewise, the SLM process helps IT understand what is really important to the business, and how. So SLM is about bringing EXPECTATIONS into the domain of reality.

Service Availability and Capacity Management
Availability and Capacity are naturally opposed, and together they represent the process of narrowing the gap between what the customer wants and what he is willing to pay for. IT is here to comprehend what is important to the Business in order to stay alive in a cost-effective way.

So Availability and Capacity underpin the Service Level Management process by educating the customer about the price of the service and finding the optimal investment/gain ratio. The customer will understand that the price of availability grows exponentially. Simple math will find the crossing point of the cost and availability curves.
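That crossing point can be illustrated with a toy model. Assuming, purely for illustration, that cost explodes as downtime shrinks while business value grows only linearly with availability, the sweet spot is the level with the best net gain:

```python
def optimal_availability(levels, cost, value):
    """Pick the availability level with the largest net gain (value - cost)."""
    return max(levels, key=lambda a: value(a) - cost(a))

# Toy models, not real figures: cost grows without bound as availability
# approaches 100%, value grows linearly with availability.
cost = lambda a: 1.0 / (1.0 - a)   # e.g. 0.999 availability costs 1000 units
value = lambda a: 2000.0 * a       # each availability point is worth the same

levels = [0.9, 0.99, 0.999, 0.9999]
assert optimal_availability(levels, cost, value) == 0.99
```

In this toy example the jump from 99% to 99.9% costs far more than the extra availability is worth, which is exactly the conversation SLM should have with the customer.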

That's why ITIL says that "there is a direct correlation in most organizations between the service availability and customer and user satisfaction, where poor service performance is defined as being unavailable."

Better communication and a common language help IT stay alive even if some services deteriorate. It is still possible to retain most of the customer's satisfaction if he knows he can have confidence in us.


SERVICE TRANSITION
Customer satisfaction is mentioned as a factor of high importance repeatedly in Service Transition Fundamentals and Principles.

During Transition, in Change Management and especially in Release and Deployment Management we get in touch with the customer and it is very important that these interactions create good experiences.

But if we reach this stage of ITIL implementation and something goes wrong, it will usually be much more visible in the Operation phase, since badly implemented changes create a large percentage of RFCs and incidents.


SERVICE OPERATION
Most interaction with the customer happens in Operation stage, mainly in Service Desk function, through Request Fulfillment and Incident Management.

What we want to do is optimize costs and quality within the agreed service level:
Finding the Cost/Quality optimum for the Service

Customer satisfaction is the main KPI for these ITIL entities. Most of the non-technical introduction to the Service Desk chapter is actually about customer satisfaction. There is even a nice section about customer/user satisfaction surveys (6.2.5.1).


CONTINUAL SERVICE IMPROVEMENT
A lot of attention is naturally given to customer satisfaction in Continual Service Improvement. Satisfaction is something you can measure, and what you can measure you can manage and improve.

Where I come from, gathering and reporting of customer satisfaction data is done at the Service Desk, mostly after incident ticket closure and periodically at meetings with our top ten customers. That conforms well to our ISO 9000 and ISO 20000 requirements. Every year, at the management review, we check whether we reached last year's goal and define where we want to be next year. Let me remind you: it is not always about raising customer satisfaction; one year we even lowered the target values for Incident Management survey results.

I have a few examples to illustrate the above for you:

Anecdote 1: In the early days of our service support we had a customer. We were over-capacitated and eager to look our best, so we responded to and resolved tickets for this customer much quicker than the Service Level Agreement required. The customer was happy. As we got more customers, our response times became longer and longer, but still stayed well within the SLA parameters. Our customer started sending unhappy signals in our regular satisfaction surveys, so we scheduled a meeting.

The problem wasn't in our agreement, but in the perceived deterioration of our service. The customer had taken our eagerness for granted. So we had a long talk, reset our SLA thresholds and financial parameters, and continued to cooperate. Nevertheless, we now take care to respond to system incidents within 1/3 of the agreed response time. For all customers. Just in case.

Anecdote 2: During the night, our system monitoring tool notified our Service Desk of a major incident at our customer's site. Our on-call engineers were called and started investigating immediately. They spent all night working on the resolution and succeeded early in the morning.
Sadly, our Service Desk did not open an incident in our ticketing application until the resolution. So the customer was notified about the incident 5 hours after the service went down, while the SLA defined a 2-hour response time, hourly notifications on priority 1 incidents, and resolution within 8 hours.
Our customer went wild (very dissatisfied). Even though we had worked all night with doubled capacity to restore the service, we failed to meet the agreed notification time and kept the customer in the dark.

On the other hand, had we simply created the incident record, notified the customer every hour that we were working on it, and put one person on resolving it until the morning, our customer would have been much happier.

Service Level Management is KING
Of course, there are all kinds of customers, some nice and some less so, but at the end of the day it all comes down to delivering what you promised. So if your Operations processes are implemented well, a significant improvement can usually be gained by working on your Service Level Management process.

The opening Jagger quotes were just for my amusement. Here are a few customer satisfaction thoughts that I picked up for you:

- Your best customers leave quite an impression. Do the same, and they won't leave at all.
   SAP Ad


- Although your customers won’t love you if you give bad service, your competitors will.
   Kate Zabriskie


- Customers who don't get support become someone else's customers.
   Brigade Ad


- If the shopper feels like it was poor service, then it was poor service. We are in the customer perception business.
   Mark Perrault, Rally Stores


- You are serving a customer, not a life sentence. Learn how to enjoy your work.
   Laurie McIntosh


Related posts:

Customer Satisfaction Survey: What Methods To Use?
How to gather customer satisfaction data? What methods are there? What ITIL says? What methods will work for you?