Making AI work for service support: the chatbot opportunity
Date: June 2025

Key fact
AI-powered chatbots are transforming IT service support—enhancing user experience, reducing resolution times, and enabling 24/7 assistance across complex digital environments.
Chatbot technology powered by AI is transforming how organisations deliver service support, be that IT, Facilities, HR, Finance or any other enabling function. From automating ticket resolution to proactively deflecting support contacts, the potential to improve user experience while reducing cost is significant. However, getting it right requires a thoughtful and structured approach.
At a time when many businesses are still navigating what’s real and what’s hype, here’s a practical summary of where the market stands, and how organisations can take advantage of the opportunity.
Understanding the AI landscape in Service Management
There’s a lot of noise in the AI space, particularly when it comes to Service Management (SM). To cut through the confusion, it helps to break things down into three key categories:
- Native AI
This includes long-established capabilities such as decision logic, predictive analytics, and embedded automation. These technologies have matured over decades: they are stable, useful, and widely adopted, but typically rules-based and not ‘intelligent’ in the modern AI sense.
- Generative AI (GenAI)
GenAI powers tools that create and summarise content: for example, generating knowledge articles, summarising tickets/cases, or recommending solutions based on contextual understanding. These capabilities are often delivered via conversational interfaces, whether standalone chatbots (e.g., ServiceNow’s Virtual Agent) or assistants embedded within SM platforms. Since the launch of public GenAI tools like ChatGPT, GenAI has matured rapidly and is now highly effective in enterprise environments.
- Agentic AI
This is the current cutting edge of AI in service management: autonomous agents that can interpret user intent, combine data from multiple sources, and make decisions that trigger automated actions. While the automation itself isn’t new, the intelligent, autonomous decision-making layer is. Enterprise-ready agentic AI has only emerged in the last 6–12 months, and while the promise is huge, capabilities (and the hunt for useful use cases) are still catching up with the marketing hype.
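To make the distinction concrete, here is a minimal sketch in Python of what an agentic flow adds over a scripted one. The intents, signals and actions are hypothetical and not tied to any particular platform; the point is simply that the bot interprets intent, combines context from several systems, and then decides which automated action to trigger.

```python
# Minimal sketch of an agentic support flow (hypothetical names throughout).
# A scripted bot would map "reset my password" straight to a fixed workflow;
# an agentic layer instead interprets intent, pulls context from several
# systems, and chooses which automated action (if any) to trigger.

from dataclasses import dataclass
from typing import Callable


@dataclass
class Context:
    user_id: str
    intent: str
    signals: dict  # data gathered from identity, ITSM, device-management systems, etc.


def interpret_intent(utterance: str) -> str:
    """Stand-in for an LLM call that classifies what the user wants."""
    if "password" in utterance.lower():
        return "password_reset"
    if "laptop" in utterance.lower():
        return "device_issue"
    return "general_question"


def gather_context(user_id: str, intent: str) -> Context:
    """Combine data from multiple sources before deciding on an action."""
    signals = {
        "account_locked": True,      # e.g. from the identity platform
        "open_tickets": 0,           # e.g. from the ITSM tool
        "device_compliant": True,    # e.g. from device management
    }
    return Context(user_id=user_id, intent=intent, signals=signals)


# Registry of automated actions the agent is allowed to trigger.
ACTIONS: dict[str, Callable[[Context], str]] = {
    "unlock_account": lambda ctx: f"Unlocked account for {ctx.user_id}",
    "raise_ticket": lambda ctx: f"Raised a ticket for {ctx.user_id} ({ctx.intent})",
    "answer_from_knowledge": lambda ctx: "Here is the relevant knowledge article...",
}


def decide_and_act(ctx: Context) -> str:
    """The autonomous decision layer: pick an action based on intent plus context."""
    if ctx.intent == "password_reset" and ctx.signals.get("account_locked"):
        return ACTIONS["unlock_account"](ctx)
    if ctx.intent == "device_issue":
        return ACTIONS["raise_ticket"](ctx)
    return ACTIONS["answer_from_knowledge"](ctx)


if __name__ == "__main__":
    utterance = "I can't log in, I think my password has expired"
    intent = interpret_intent(utterance)
    ctx = gather_context(user_id="u123", intent=intent)
    print(decide_and_act(ctx))
```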
What we’re seeing in the market
The GenAI and Agentic AI space is currently crowded, with thousands of niche players offering specialised tools. However, the market is quickly consolidating through acquisitions (such as ServiceNow’s acquisition of Moveworks) and natural attrition. For most organisations, staying ahead in this space through in-house development alone is proving difficult.
That’s why we’re seeing a shift. Increasingly, enterprise IT teams are turning to Managed Service Providers (MSPs) and Service Integrators who offer:
- Deep expertise in chatbot configuration and ongoing “BotOps”
- Reusable use cases to reduce implementation time
- Prebuilt integrations to common systems (e.g., Workday, Intune, ServiceNow)
- Flexible delivery models aligned to business outcomes
Some partners are more advanced than others. This remains a fast-moving space, so effective sourcing practices are critical to select an appropriate partner and put the right commercial structure in place.
Key components of a chatbot-enabled support model
To make chatbot technology work, organisations need to consider three critical dimensions:
- Chatbot Functionality & Integration Reach
Includes the look and feel of the chatbot, interactive voice support, delivery model (standalone vs. SM tool integrated), out-of-the-box use cases, and the range of integrations with business systems.
- LLM capabilities
Large Language Models (LLMs) are what power the natural language flow: recognising intent, enriching context, generating meaningful responses, and managing follow-up interactions. Their sophistication determines how “human-like” and effective the experience feels, and how often trust-sapping hallucinations occur.
- Content / data quality
High-quality knowledge content is the backbone of successful AI support. Organisations must ensure robust catalogue items and well-structured knowledge articles (think: how would I explain this to an intern?). The ability to link seamlessly to existing repositories, including SharePoint, Confluence, and SM tools, via zero-copy connectors rather than duplicating data, is also vital, as the sketch after this list illustrates.
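As a rough illustration of how LLM capability and content quality interact, the sketch below walks through a simplified retrieval-then-generate flow. The article store, keyword search, and answer function are hypothetical stand-ins rather than any product’s API; the point is that the chatbot’s answer is only as good as the knowledge article it can retrieve.

```python
# Minimal sketch of how chatbot answer quality depends on the knowledge content
# behind it (a retrieval-then-generate flow). The article store, search logic,
# and answer function below are hypothetical stand-ins, not a product's API.

from dataclasses import dataclass


@dataclass
class KnowledgeArticle:
    title: str
    body: str
    source: str  # e.g. "SharePoint", "Confluence", "SM tool"


# In practice this would be a zero-copy connector into existing repositories;
# here it is a small in-memory list so the example runs standalone.
KNOWLEDGE_BASE = [
    KnowledgeArticle(
        title="Requesting VPN access",
        body="Raise a 'VPN access' catalogue item; approval is automatic for staff.",
        source="SM tool",
    ),
    KnowledgeArticle(
        title="Setting up MFA",
        body="Install the authenticator app and register via the security portal.",
        source="SharePoint",
    ),
]


def retrieve(query: str, top_n: int = 1) -> list[KnowledgeArticle]:
    """Naive keyword retrieval; a real service would use semantic search."""
    terms = set(query.lower().split())
    scored = [
        (len(terms & set((a.title + " " + a.body).lower().split())), a)
        for a in KNOWLEDGE_BASE
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [article for score, article in scored[:top_n] if score > 0]


def generate_answer(query: str, articles: list[KnowledgeArticle]) -> str:
    """Stand-in for the LLM call that grounds its response in retrieved content."""
    if not articles:
        return "I couldn't find a relevant article - raising this with the service desk."
    best = articles[0]
    return f"Based on '{best.title}' ({best.source}): {best.body}"


if __name__ == "__main__":
    question = "How do I get VPN access?"
    print(generate_answer(question, retrieve(question)))
```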
Skills and resources: a new operational layer
AI support tools introduce new operational challenges. To succeed, organisations need capabilities in:
- Chatbot training and improvement: Managing performance, resolving hallucinations, refining user journeys, and maintaining integrations.
- LLM and integration development: Continuously improving the chatbot’s features and linking to evolving data and automation platforms.
These are specialist skills in short supply, making retention difficult and strengthening the case for partnering with external experts.
Delivery models: from tool to full service
Depending on strategic goals, there are several sourcing options available:
1. Software only – You buy and manage everything in-house
2. Software + Implementation – A partner helps configure and deploy, but you operate and train it
3. Software + Full Management – A partner handles setup, maintenance, and training
4. Software-as-a-Service – Fully managed chatbot service integrated into your ecosystem
5. Support Experience as a Service (Remote) – Adds partner-run remote support (chat and voice)
6. Full Support Experience Service – Extends to onsite support, device management, security, and other End User Services
As you move down this list, internal cost falls and third-party cost rises. Understanding which approach is most effective for your requirements, given where you are today, is critical.
Structuring for success
Chatbots promise both cost reduction and service improvement, but to deliver on that promise, organisations are increasingly adopting holistic managed service models — especially options 5 and 6.
These models are often built around:
- Cost-per-user-per-month pricing
- Ticket volume reduction targets
- Experience Level Agreements (XLAs)
The commercial model creates strong incentives: the provider is rewarded for preventing incidents, deflecting requests, and automating effectively, all without compromising the user experience. The presence of XLAs adds guardrails to ensure user-centric outcomes.
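To make those incentive mechanics concrete, a simplified and purely illustrative model of such a contract might look like the sketch below: a per-user-per-month fee, a deflection target, and an XLA floor acting as a guardrail. Every figure and threshold is a made-up placeholder, not a benchmark.

```python
# Illustrative model of a chatbot managed-service commercial structure.
# Every number here is a made-up placeholder, not a market benchmark.

from dataclasses import dataclass


@dataclass
class ContractTerms:
    price_per_user_per_month: float  # cost-per-user-per-month pricing
    deflection_target: float         # e.g. 0.30 = 30% of contacts deflected
    xla_floor: float                 # minimum acceptable experience score (0-100)
    service_credit_rate: float       # share of fee credited back if targets are missed


def monthly_charge(terms: ContractTerms, users: int,
                   actual_deflection: float, xla_score: float) -> float:
    """Base fee, reduced by a service credit if the deflection or XLA target is missed."""
    fee = terms.price_per_user_per_month * users
    missed_deflection = actual_deflection < terms.deflection_target
    missed_xla = xla_score < terms.xla_floor
    if missed_deflection or missed_xla:
        fee *= (1 - terms.service_credit_rate)
    return fee


if __name__ == "__main__":
    terms = ContractTerms(price_per_user_per_month=4.0, deflection_target=0.30,
                          xla_floor=75.0, service_credit_rate=0.10)
    # Provider hits the deflection target but misses the XLA floor: the credit
    # applies, so chasing deflection at the expense of experience is penalised.
    print(monthly_charge(terms, users=5000, actual_deflection=0.35, xla_score=70.0))
```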
However, even in fully managed models, collaboration remains essential. Actions to improve automation, shift left, or reduce call volumes often require coordination and change activity from internal teams. The best results come from partnership, not outsourcing.
Final thoughts
AI-enabled support (especially chatbot-based solutions) represents a real opportunity to modernise service delivery. But it’s not a quick win or a tool you can just “switch on”. It’s a strategic capability that needs the right blend of technology, expertise, and commercial alignment.
For organisations willing to invest in the right foundations, and the right partnerships, the rewards are clear: better service, lower costs, and a support model fit for the future.
Interested in exploring how chatbot automation can enhance your service support?
Let’s talk about how we can help you plan, implement, or scale your AI service model.
If you would like to speak to one of our experts about this insight article, email contact@masonadvisory.com.