
Mar 10, 2013

Customer Satisfaction Survey Grading

What grades do you use in customer satisfaction surveys? How are grades influenced by the culture of a country? And what should you do if you run surveys in different countries with different backgrounds?

I have been discussing customer satisfaction here a few times. Related to that, let me share a brief anecdote with you:

We implemented new Service Desk software at a customer's site. They are a managed services company providing support all over the world, 24x7. After the resolution of every ticket, the contact person on the ticket receives an e-mail notification with a link to a short web-based survey.

There were just a few questions regarding speed of resolution, communication, competence of support people and overall satisfaction with the ticket resolution.

Grades were 1-5, with an explanation at the top of the form that 1 = poor and 5 = excellent.

We received quite a bunch of survey results at the beginning, which was the intention. Here and there a low score came in, but we were not alarmed; you can't please everyone. The Service Manager was in charge of treating all grades below 3 as customer complaints and following up with those customers to raise their satisfaction.

Then one day we received two very bad results, averaging below 2. Both from the same market. Alert! We are doing something wrong.

So the Service Manager sent an apologetic mail asking what went wrong and how we could improve, blah...
The answer came quickly from both customers, saying they were sorry, but they had thought 1 was the best grade and 5 the worst.

Was something wrong with the survey code? The customers sent us their screenshots; everything was fine, and the explanation in the form header clearly stated "1=poor and 5=excellent". Both customers were from Germany. Asked to explain how they had misunderstood such a clear instruction, they said: "The German school grading system is 1 to 5, with one being the best grade (Sehr gut) and five the worst, insufficient (Nicht genügend)." They hadn't bothered to read the explanation; they automatically presumed that a survey from another country complied with their long-term experience of the German school grading system. Can't blame them.

Therefore we looked around: what are the grading standards in school systems around the world?

The USA and the countries it has influenced use ABCDEF grades. Europe differs very much depending on history: in some places 1 is bad, while under the German skirt it is excellent (the Czech Republic and Slovakia as well).
Eastern European countries, Asia and Oceania use either the Russian 1-5 system or a 0-100% percentage system.

Customer Survey Grades


Interesting details:
  • Venezuela uses a rather exotic 0-20 grading system
  • Ecuador and Serbia (opposite hemispheres) use similar 5-10 grading systems where 5=fail and 10=excellent.
  • A lot of countries use different grading systems for primary, high school and university grades.
So what approach should we take to grading customer satisfaction, to make it intuitive regardless of the customer's local educational background?
 
We had only two solutions:
  • Switch to 1, 2, 3, 4 grades, which customers won't be able to relate to their finer-grained school systems. A 1-4 scale is also good because it eliminates the indifferent middle grade and forces the customer to decide between the better and worse options, 2 or 3.
  • Take the -2, -1, 0, 1, 2 grading approach. Good: it is self-explanatory, because negative numbers can't be good. Bad: it leads the customer towards the mediocre "0" grade, suggesting that it is OK.
For now, we are using the second grading system, accepting the downside that some customers see nothing wrong in selecting a "0", which is a neutral grade.

An additional tweak would be to change the grades to -1, 0, 1, 2, which would "push" the customer towards positive grades a bit more.
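To make the arithmetic behind these scales concrete, here is a minimal Python sketch (not our survey tool, just an illustration using scale definitions I assumed for the example) that maps a raw grade from any of the scales discussed above onto a common 0-100 satisfaction index, where 100 always means "excellent":

    # Illustrative scale definitions; 'worst' and 'best' make the direction explicit.
    SCALES = {
        "1-5":   {"worst": 1,  "best": 5},
        "1-4":   {"worst": 1,  "best": 4},
        "-2..2": {"worst": -2, "best": 2},
        "-1..2": {"worst": -1, "best": 2},
    }

    def normalize(grade, scale):
        """Map a raw grade onto 0-100, where 100 always means 'excellent'."""
        s = SCALES[scale]
        return 100.0 * (grade - s["worst"]) / (s["best"] - s["worst"])

    print(normalize(4, "1-5"))    # 75.0
    print(normalize(0, "-2..2"))  # 50.0 -- the indifferent middle grade
    print(normalize(3, "1-4"))    # ~66.7 -- no neutral midpoint on a 1-4 scale

Whatever scale you choose, a mapping like this lets you compare results collected with different scales over time.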

What method do you think is most appropriate for you?


Related articles:

Customer Satisfaction 
How badly do we need it in IT Service Management? Where is it mentioned and where is it dealt with in ITIL V3? How do we manage it in real life?

Customer Satisfaction Survey: What Methods To Use?
How do you gather customer satisfaction data? What methods are there? What does ITIL say? Which methods will work for you?

Mar 24, 2011

ITIL Major Incident - All you want to know

What is a Major Incident in ITIL? What are the roles and responsibilities? How do you avoid common mistakes? What should you do after the resolution?

Trust me, I know what I'm doing!
Sledge Hammer

What is a Major Incident?
The definition of a Major Incident has to be clear to every employee in Service Support. Therefore it has to be clearly described in a separate document, the Major Incident Procedure.

What makes a Major Incident? It is usually defined by the impact the outage has, or could have, on the customer's business process. It may also be determined by the priority of the incident or by its urgency.

How come impact isn't always the only factor in defining a Major Incident? For example, an incident of high impact can be resolved by the Service Desk through a simple resolution procedure, like resetting a switch after a network-down event, or connecting a backup provider after an internet-down event.

Both examples are definitely high impact, but we don't have to recruit a bunch of higher-level people for them just yet. We just have to keep in mind that they are Priority 1 and have to be resolved ASAP. If they can't be resolved by the standard procedure, THEN they can be marked Major and handled with the appropriate procedure and policy. That's why most leading Incident Management tools on the market have a separate "Major" or "Hot" incident checkbox.

This was all theory. In practice, to simplify the procedure and make it easier for Service Desk staff, this is what I usually advise: all Priority 1 incidents are Major Incidents, unless they are exceptions. Exceptions can easily be defined for particular customers, contracts and incident categories. For example: Major Incidents are all Priority 1 incidents except cash register tickets, which are urgent but can be fixed by technicians, with no need to involve more senior people. Or: all categories except end user incidents. Simple.
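As a rough Python sketch (purely illustrative, not any particular tool's API), the rule above could look like this; the customer names and exception categories are made-up examples:

    from dataclasses import dataclass

    @dataclass
    class Incident:
        priority: int
        category: str
        customer: str

    # Per-customer exceptions: Priority 1 categories that stay "normal" incidents.
    EXCEPTIONS = {
        "RetailCustomer": {"cash register"},
        "OfficeCustomer": {"end user"},
    }

    def is_major(incident):
        """All Priority 1 incidents are Major, unless their category is an exception."""
        if incident.priority != 1:
            return False
        return incident.category not in EXCEPTIONS.get(incident.customer, set())

    print(is_major(Incident(1, "cash register", "RetailCustomer")))  # False - exception
    print(is_major(Incident(1, "network", "RetailCustomer")))        # True  - Major Incident

The point is that the Service Desk never has to interpret anything: the exceptions are written down once, per customer or contract, and everything else that is Priority 1 is Major.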

Major Incident Team
OK, now we have determined it’s a Major Incident. What next? We establish a Major Incident Team. Members are:
  • Service Desk Manager – he will be responsible for communication with the resolution team and for timely reporting to the customer.
  • Incident Manager – in reasonable service organizations the Incident Manager is usually also the Service Desk Manager. If not, then these two have to work closely together.
  • Major Incident Manager: a frequent mistake is to promote the Incident or Service Desk Manager into the Major Incident Manager. This doesn't have to, but can, cause some serious conflicts of interest: he has to survive somewhere between Incident Management, Problem Management, Business Management and the customer.
    The Major Incident Manager has to be a liaison between all internal parties involved and also well acquainted with the technical aspects of the outage. So he will often be recruited from people formerly engaged in a project, or from those involved in the service catalogue definition.
  • Problem Manager: remember him? He will be most helpful in the investigative phase, towards the closure phase, and a life saver in post mortem reporting. Better keep him on our side. Mind you, a Major Incident is still an incident, but it usually has some underlying cause which will be recognized as a Problem. Hence the Incident and Problem Managers have to work closely together here, each with his own goal in mind (service restoration vs. underlying cause).
  • Other members of the Major Incident Team: representatives of all parties involved, impacted users, competent technical staff, vendors... Good practice would be to choose people here the same way you would choose ECAB (Emergency Change Advisory Board) members. There is always a chance that you will be implementing an Emergency Change during the Major Incident resolution process.

Resolution Process
Major Incident resolution works on tight SLA parameters, and the Service Desk takes care of them. Ticket updates and frequent feedback to the customer are also performed by the Service Desk. Remember, the customer hates to be kept in the dark; even if the news is bad (no progress), they must be updated frequently.

The Major Incident Procedure has to define the escalation policy in case of an SLA breach. Usually the incident is escalated vertically to higher-level IT / business management and to the vendors of the services/equipment underpinning the service.
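To illustrate the idea (and only the idea), here is a small Python sketch of an escalation check; the four-hour resolution target and the 80% warning threshold are assumptions I made up for the example, not ITIL prescriptions:

    from datetime import datetime, timedelta

    SLA_TARGET = timedelta(hours=4)   # assumed resolution target for a Major Incident
    WARNING_AT = 0.8                  # warn when 80% of the SLA window is used up

    def escalation_level(opened_at, now):
        elapsed = now - opened_at
        if elapsed >= SLA_TARGET:
            return "breach: escalate to senior IT/business management and vendors"
        if elapsed >= WARNING_AT * SLA_TARGET:
            return "warning: notify the Major Incident Manager"
        return "on track"

    opened = datetime(2013, 3, 10, 9, 0)
    print(escalation_level(opened, datetime(2013, 3, 10, 12, 30)))  # warning
    print(escalation_level(opened, datetime(2013, 3, 10, 13, 30)))  # breach

The actual thresholds, and who gets notified at each step, belong in the Major Incident Procedure itself.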

After the battle
Upon resolution, the Major Incident Team stays “on call” and monitors the service for a period defined by the Major Incident Manager. He also schedules a short team meeting for the next day.

An Incident Review is performed at this meeting, points for improvement and lessons learned are defined, and a Post Mortem Major Incident Report is created.

The Incident Manager sends the report to the customer.

I have prepared a Major Incident Report template for you, free for download here.

Related posts:

Incident Management Elements
Key elements of Incident management.

Incident Management Mind Map
Download the incident management mind map.

All About Incident Classification
How to deal with incident categories.
Incident prioritization in ITIL.


Hope this helps. Have a nice day!

Jan 7, 2011

Customer Satisfaction Survey: What Methods To Use?

How do you gather customer satisfaction data? What methods are there? What does ITIL say? Which methods will work for you?

USA Today has come out with a new survey - apparently, three out of every four people make up 75% of the population.
David Letterman
 
In my last post I spoke about Customer Satisfaction. OK, but how do you gather the data? How do you interview customers? How do you get the best results from surveys and help educate the customer about the process?
Here I give you a list of basic ITIL methods for gathering customer satisfaction data, with my comments on their worth in my company. Mind you, we are in the Managed Services business, so my grades are biased.

If you are a small IT department in a government agency, or the IT of an airline company, you may look for different values. Still, this will give you an idea.


Customer Satisfaction


After-Call Survey
Callers are asked to remain on the phone after the call and then asked to rate the service they were provided.

This method requires that you have some kind of telephony automation, like Interactive Voice Response (IVR). The thing with this method is that however you set up the system, it will often be despised by customers. Most people don't like listening to recorded messages, and even more hate moving down a menu tree with the phone keypad. The outcome? Only the extremes: customers who are very satisfied and wish to reward an agent by giving him good grades or, more often, a very pissed off customer who will go out of his way to express his low level of satisfaction.

Call Center organizations are doomed to use this technology since phone calls are their only way of communicating with the customer. In ITSM there are usually more ways to talk to the customer.

My worth: 1/5


Mail Surveys
We never considered sending mail interviews to customers as an option. Not in this century, at least. What kind of IT-related person would fill in a received paper form and send it back to us? Do we need feedback from that kind of person? No, thanks.

With e-mail it's maybe slightly better. We used to have a Service Desk application which would send an automated notification on the incident resolve event, asking the customer contact to reply by e-mail, grading their satisfaction with the speed and quality of resolution from 1 to 5. So lame. A response rate of 0.05% was enough to pacify our ISO 9001 auditor, but it was next to worthless to us otherwise. One good thing was that receiving a small number of answers in unstructured text e-mails made the result processing much easier :-)

My worth: 1.5/5


Group Interviews
In this method, customers are gathered in small groups and interviewed together. This can maybe be somewhat useful for a group of people working together; even then, their answers will be biased and dependent on one another. The possible value I see in this one is that it helps develop communication between customers of different knowledge levels and enhances collective process knowledge.

In my organization we try to keep customers on different SLAs as far from each other as possible. We do not conduct group interviews, but we do organize ITSM workshops here and there for our customers, and the feedback is usually very good.

My worth: 2/5


Phone Surveys
Some time after the interaction with the Service Desk, an independent agent calls the customer.

You are in the middle of your monthly report, juggling a database and three spreadsheets, and some guy calls you to grade your experience with the Service Desk from ten days ago? You will brush him off. OK, maybe not the first time, but the next time, for sure.

So, if you want to get results with this method, you need trained people with strong communication skills and good opening lines. It is also a good idea not to run surveys on every call, to let the customer rest from you periodically. If they anticipate your call, they start thinking up good excuses to brush you off. Surprise them periodically instead.

My worth: 2/5


Personal interviews
The surveying person is usually the Incident Manager or the Service Desk Manager, while the customer representative is the person responsible for the SLA on their side. These can also be the contact persons on Major Incidents after post mortem reporting, but that's another story and should be defined in the Major Incident Procedure.

These interviews are usually periodic and should be conducted at least annually (and usually they are, during contract renewal), but preferably on a quarterly basis. During these, the two parties communicate customer needs and help focus Service Support on the things that are important to the customer. So the questions in such a survey should be aligned with Service Desk and Incident Management KPIs.

These are of great value to us.

My worth: 4/5


Online Surveys
We have worked with several Service Desk automation tools from different vendors. In the end, we developed a web application which enables us to send different surveys to our customers. They can be ticket-related, periodic or ad hoc surveys with predefined templates of multiple-choice and freeform answers. We like it so much that we host it for other departments of our company. And we do not sell it, it's that good :-)

Customers fill it in at their own convenience (OK, we sometimes call and beg them to fill in the surveys), and after they submit the form their answers are automatically included in the report.
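If you are curious how such surveys could be modelled, here is a minimal Python sketch; the field names are my own assumptions for illustration, not taken from our application:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Question:
        text: str
        choices: Optional[List[int]] = None   # e.g. [1, 2, 3, 4]; None means a freeform answer
        comment_field: bool = True            # optional comment box under every question

    @dataclass
    class Survey:
        kind: str                             # "ticket", "periodic" or "ad hoc"
        questions: List[Question] = field(default_factory=list)
        ticket_id: Optional[str] = None       # set only for ticket-related surveys

    ticket_survey = Survey(
        kind="ticket",
        ticket_id="INC-1234",
        questions=[
            Question("How satisfied are you with the resolution?", choices=[1, 2, 3, 4]),
            Question("Any comments on how this ticket was handled?"),
        ],
    )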

Web-based surveys are a life-saver to us.
My worth: 5/5


Other Methods
Blogs, tweets, wikis, forum discussions and chats are all nice to have and, if maintained well, they will influence customer experience positively.

However, it is next to impossible to mine a structured measurement of customer satisfaction out of them, and they shouldn’t be used for that purpose.


More About Surveys

Scoring
It is best to define simple questions with multiple-choice number ratings. How many different grades? For fine gradation in longer interviews we use grades 1-10. For faster, ticket-focused surveys we opted for a coarser gradation. We used 1-5 for a long time, but a lot of the grades clustered on 1, 5 and 3. Eliminating the middle value and shifting to 1-4 made our customers think more. We still use the 1-4 system for most surveys and we are satisfied with it.
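As a small illustration of the processing such a coarse scale enables, here is a Python sketch that computes the average grade and the share of responses we would treat as complaints; the complaint threshold is an assumption for the example:

    COMPLAINT_THRESHOLD = 2   # on a 1-4 scale, grades of 1 or 2 trigger a follow-up

    def summarize(grades):
        return {
            "responses": len(grades),
            "average": sum(grades) / len(grades),
            "complaint_rate": sum(g <= COMPLAINT_THRESHOLD for g in grades) / len(grades),
        }

    print(summarize([4, 3, 4, 2, 4, 1, 3, 4]))
    # {'responses': 8, 'average': 3.125, 'complaint_rate': 0.25}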

In addition, it is often nice to put an optional comment field under every question. Doesn’t cost much, and sometimes customers leave very useful info there.

How many questions?
You certainly don't want to scare off the customer with a bunch of questions. OK, annual interviews can be longer, and internal ITIL/ISO 20000 audits and assessments can have a gazillion questions, but a survey on incident performance can have just one multiple-choice question (how satisfied are you?), ideally three, and at most five. You are fishing for quantity here.

Anonymous or named?
Some customers would rather answer an anonymous survey than a named one. Well, that's their problem. This is one case where the customer is not always right. You want to know who is satisfied and who is not, so you can decide on your further actions.


Related posts:

Customer Satisfaction in ITIL Service Management: Do You Get It?
Elementary facts about ITSM Customer Satisfaction - How badly do we need it in IT Service Management? Where is it mentioned and where is it dealt with in ITIL V3? How do we manage it in real life?