
XLAs – Putting ‘feeling’ into the service experience

Chris Good

Date: October 2023

Key fact

Gartner predicts that by 2023, 30% of organisations will use AI to help set service levels, provide performance reporting, and offer proactive recommendations for SLA optimisation.

Service Level Agreements (SLAs) have been the go-to solution for decades to try to: 

  • Provide customers with optionality (balancing cost vs performance), 
  • Set customer expectations as to the ‘quality’ of the service they will receive, and 
  • Drive behaviours of those providing the service. 

Examples of the most common IT-related Technical SLAs are: 

  • Availability – when the service will be ‘up’ during required service hours or, more accurately, the permissible amount of time that the service may be ‘down’ (see the worked example below). 
  • Recoverability / Fulfilment – how quickly an Incident or Service Request will be resolved / fulfilled. This may also be supplemented with a ‘Response’ SLA… how quickly someone will review the Incident / Request and assign it to a named individual to progress.   

Many others are available, including First Contact Resolution (percentage of tickets that are resolved at first contact – typically the Service Desk), Reliability (how many times an issue occurs with a service), and Bounce Rate (how many times an Incident bounces from one team to another to resolve). Note – the type of SLA depends on the type of IT Service being provided. See our ‘What is an IT Service? The 7 Categories’ article for further background. 
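To make the Availability definition concrete, here is a minimal sketch of how an availability percentage converts into permissible downtime. It assumes a 24x7 service over a 30-day month; the targets shown are purely illustrative, not recommendations.

```python
# Illustrative only: convert an availability target into permissible downtime
# for a given service window. The window and targets are example figures.

def permissible_downtime_minutes(availability_pct: float, window_hours: float) -> float:
    """Maximum allowed downtime (in minutes) within the service window."""
    return window_hours * 60 * (1 - availability_pct / 100)

window_hours = 30 * 24  # a 24x7 service over a 30-day month
for target in (99.0, 99.5, 99.9):
    minutes = permissible_downtime_minutes(target, window_hours)
    print(f"{target}% availability allows roughly {minutes:.0f} minutes of downtime per month")
```

For example, a 99.9% target over that window allows roughly 43 minutes of downtime, whereas 99.0% allows just over 7 hours.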

But do any of the above really matter if the Customers (those who pay for the service) and Consumers (those who use the service) cannot generate sufficient value from the service they are using due to other factors that are not measured by these traditional technical measures?  

This is where ‘Experience Level Agreements’ (XLAs) come into play. XLAs put the human ‘feeling’ into measuring service performance with the aim of understanding why a user feels a certain way. In this article we explore how to define and utilise XLAs to create a collaborative relationship between customer, consumer and provider that drives continual improvement leading to increased value.  

Imagine the scenario… you’re a Service Manager responsible for providing a service to a customer. You’ve hit all Technical SLAs, have a report showing everything as green, and walk into a service review meeting with your customer with a spring in your step. You present the report with a smile on your face, only to see a room full of people with steam coming out of their ears. You pause… you ask if everything is ok… and you receive a (hopefully politely positioned) summary of why the service felt so bad to the customer. This is known as the Watermelon Effect… everything looks green on the outside, but dig a little deeper and it’s red on the inside. How can you avoid being blindsided like this? How can this misalignment of expectation be overcome? And how can you put in place targets that drive the behaviour of providers (be that internal teams or third parties) to deliver a service that is more closely aligned to the required business outcomes? 

Step 1 – Define Customer / Consumer Satisfaction Approach: 

Sending a survey out to users on the back of their ticket being closed is unlikely to drive reliable and valuable insight: 

  • Firstly, based on our research, the statistics are not representative – of the very few users who do respond, 80% are those who have had a terrible experience (and are willing to take the time to tell you about it), 10% are those who have had an amazing experience (and are willing to take the time to tell you about it), and 10% are those in between who simply have the time and/or corporate inclination to respond.  
  • Secondly, you rarely receive qualitative feedback. And qualitative feedback is key. 
  • Thirdly, the survey will be going to consumers of your IT Services. Customers (those that pay the bill) may have a very different perspective. 

Instead, we recommend three different customer/consumer satisfaction routes to truly understand the user experience across the following groups: 

  1. Consumers – those who are using the IT Services. 
  2. Customer Senior Managers – usually Heads of Department from within the Business.  
  3. Customer Executives – those who are paying for the IT Services. 

Consumers: 

  • Purpose – understand the attitude towards Incident and Service Request ticket handling performance. 
  • Owner – Service Desk 
  • Audience – A random sample of consumers who have had Incidents and Service Requests resolved/fulfilled is to be taken each month (see the sampling sketch after this list).  
  • Frequency – Monthly 
  • Questions – a consistent and concise set of questions. For example: 
    • My incident / request was easy to log / raise (strongly agree to strongly disagree scale). 
    • The IT first responder understood my incident / requirement and how important this was for me (strongly agree to strongly disagree scale). 
    • My incident / request was resolved / fulfilled within an appropriate timeframe (strongly agree to strongly disagree scale). 
  • Method – Surveyees are to be emailed notifying them of their inclusion in the sample, and diary invites sent. The Service Desk is then to call the surveyee and log their responses, including documenting any verbatim feedback. 
  • Reporting – Statistics are to be compiled monthly and reviewed as part of the monthly performance hubs (see Service Level Management process for more details). High level statistical results are to be made visible to all end users via a publication on the Service Portal. 
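As a minimal sketch of the sampling and aggregation mechanics described above (the field names, question keys, Likert mapping and sample size are illustrative assumptions, not tied to any particular ITSM toolset):

```python
import random
from statistics import mean

# Minimal sketch: draw a random monthly sample of resolved/fulfilled tickets
# for follow-up calls, then average the Likert responses per question.
# All field and question names below are hypothetical.

LIKERT = {"strongly agree": 5, "agree": 4, "neutral": 3,
          "disagree": 2, "strongly disagree": 1}

def monthly_sample(resolved_tickets: list[dict], sample_size: int = 50) -> list[dict]:
    """Randomly select tickets resolved/fulfilled this month for the survey calls."""
    return random.sample(resolved_tickets, min(sample_size, len(resolved_tickets)))

def aggregate(responses: list[dict]) -> dict:
    """Average score (1-5) per question across the month's logged responses."""
    questions = ("easy_to_log", "understood_importance", "timely_resolution")
    return {q: round(mean(LIKERT[r[q]] for r in responses if q in r), 2)
            for q in questions}

# Example usage with dummy data:
responses = [
    {"easy_to_log": "agree", "understood_importance": "strongly agree",
     "timely_resolution": "neutral"},
    {"easy_to_log": "strongly agree", "understood_importance": "agree",
     "timely_resolution": "agree"},
]
print(aggregate(responses))
```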

Customer Senior Managers: 

  • Purpose – Understand the attitude towards IT and the services IT provides. 
  • Owner – Business Relationship Managers 
  • Audience – Heads of Department.  
  • Frequency – Three times a year (adjusted to your organisation’s circumstances) 
  • Questions – a consistent and concise set of questions covering the seven different types of IT Services. 
  • Method – Surveyees are to be emailed requesting completion of an online survey, followed up by a discussion with the Business Relationship Manager. 
  • Reporting – Statistics are to be compiled three times a year and reviewed as part of the next Service Review with that Head of Department.  

Customer Executives: 

Similar to the above, but with Executives, and performed once a year in person with the Business Relationship Manager (or as appropriate to your organisation). 

For all three, any negative feedback items are to be investigated directly with the respondent. In addition, any Service Improvements identified as part of the data and verbatim analysis are to be logged on the Continual Service Improvement (CSI) tracker, assigned to an owner, prioritised, analysed (e.g. via user journey mapping and stakeholder interviews) and tracked to closure. 

Step 2 – Sentiment Analysis: 

Why rely on surveys alone when you could use automation tools to understand the sentiment of customer/consumer views from the interactions that are already happening? Sentiment Analysis tools do just that… scanning Voice and Chat interactions to identify positive, negative or neutral emotions in those conversations. Heat maps are produced of common themes (e.g. Service, type of Service, location, organisation unit, etc.), and customer relationship alerts can be auto-raised to prompt additional focus from managers, all presented in an intuitive dashboard. They are relatively easy to set up, though you will need to consider Information Security and Data Protection aspects. In addition, you could expand this further to include wider user journey monitoring tools. 
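As an illustration of the basic idea (not a depiction of any specific product), here is a minimal sketch that scores chat snippets with an off-the-shelf sentiment model and tallies results per theme. The library choice, theme tags and example text are assumptions; real tools layer heat maps, dashboards and alerting on top of this.

```python
from collections import Counter
from transformers import pipeline  # assumed dependency; any sentiment model would do

# Minimal sketch: score chat transcripts and count sentiment per theme.
# Theme tags and example data are hypothetical.

classifier = pipeline("sentiment-analysis")  # default English model, POSITIVE/NEGATIVE labels

interactions = [
    {"theme": "Service Desk", "text": "Thanks, that was resolved really quickly."},
    {"theme": "VPN", "text": "This is the third time the connection has dropped today."},
]

heat_map = Counter()
for interaction in interactions:
    label = classifier(interaction["text"])[0]["label"]  # e.g. "POSITIVE" or "NEGATIVE"
    heat_map[(interaction["theme"], label)] += 1

for (theme, label), count in heat_map.items():
    print(f"{theme}: {label} x{count}")

# A real deployment would auto-raise a customer relationship alert when the
# negative count for a theme crosses an agreed threshold.
```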

Step 3 – User Journey Monitoring: 

While beneficial, both of the above are still largely reactive. Configured effectively, genuine end user journey mapping and then monitoring (not technical component monitoring) help you understand the user experience in near-real time, before the user needs to contact support. Alerts can be set to prompt support teams to take action to reduce or potentially eradicate further issues. A user journey ‘success rate’ can be set, and further action taken if ‘success’ drops below that target. 
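As a minimal sketch of the ‘success rate’ mechanism (the journey, target and run data are illustrative assumptions, not recommendations):

```python
# Minimal sketch: compute a user journey success rate from monitored journey runs
# and flag when it drops below an agreed target. Names and figures are illustrative.

def success_rate(journey_runs: list[bool]) -> float:
    """Fraction of monitored journey runs (e.g. 'log in and open payslip') that completed."""
    return sum(journey_runs) / len(journey_runs) if journey_runs else 1.0

TARGET = 0.95  # illustrative target, agreed with the customer

runs = [True, True, False, True, True, True, False, True, True, True]
rate = success_rate(runs)
if rate < TARGET:
    print(f"Journey success rate {rate:.0%} is below target {TARGET:.0%} - raise a proactive alert")
```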

Step 4 – Review your Technical SLAs: 

How many SLAs do you have in place with your third parties? If you also have a clause for commercial penalties for not hitting these SLAs, how spread out are the service credits across each of these SLAs?  

With XLAs now in place, a review of your Technical SLAs would be beneficial to ensure they drive the behaviours that you, as a customer, would like to see from your provider. You may already have a material behaviour driver in place based on the cost model (e.g. a Service Desk provider paid a flat per-user charge will already be motivated to reduce ticket volumes and automate where possible), and a multitude of Technical SLAs may therefore distract the supplier from focusing on the service areas that are most important to you. 

That’s not to say these technical areas should not be measured… they should. However, they may be moved to a less commercially sensitive Key Performance Indicator (with a target) or Key Measure (no target) to provide summary-level visibility of what’s going on, identify trends and help identify new CSI opportunities. 

(It’s worth noting that, for internally provided services, the ‘A’ in XLA could easily be switched for a ‘T’ – Target. The word ‘Agreement’ can sometimes evoke commercial connotations at a time when you’re aiming for collaboration and value co-creation.) 

XLAs reflect a shift from a transactional technical approach to a more relationship-oriented approach. By emphasising the user experience, organisations can build stronger relationships with customers, ultimately contributing to better business outcomes. 

If you would like to discuss this insight article further, send your enquiry to contact@masonadvisory.com.  

If you want to find out more about our services, click here.
