Static FAQ replaced by smart chatbot
The outcomes
- Customers interact with a dynamic conversational interface instead of a static FAQ
- Reduced call center volume and lower support costs
The context
A leading telecom provider in Belgium wanted to improve its online FAQ system to reduce call center volume, cut costs, and enhance customer satisfaction. The traditional FAQ format required customers to manually search for answers, creating friction and driving unnecessary support calls.
The logic
To meet these goals, the company needed a solution that could deliver accurate, immediate answers without requiring customers to navigate complex FAQ pages. A chatbot powered by Retrieval-Augmented Generation (RAG) offered the potential to transform the experience—provided it could handle ambiguous queries, maintain compliance with strict data regulations, and scale efficiently.
The solution
We implemented a RAG chatbot grounded in the company’s entire “Help Section,” powered by an OpenAI large language model (LLM). Rather than fine-tuning the model on the help content, the system retrieves the relevant help articles at query time and uses them as context for generating an answer. The solution was deployed on Azure OpenAI, with Azure DevOps supporting development and monitoring. Our team focused on refining the chatbot’s performance by improving its ability to rephrase unclear questions and by optimizing indexing strategies for better retrieval accuracy.
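The retrieval step at the heart of this approach can be illustrated with a minimal sketch. The corpus, document names, and TF-IDF scoring below are illustrative stand-ins; the production system would use an Azure-hosted index and an LLM for answer generation rather than returning the raw article.

```python
# Minimal sketch of the retrieval step in a RAG chatbot, assuming a tiny
# in-memory FAQ corpus. All names and content here are illustrative.
import math
import re
from collections import Counter

FAQ_DOCS = {
    "roaming": "Roaming charges apply when you use your phone abroad. "
               "Activate a travel pack to cap costs.",
    "invoice": "Your invoice is available in the My Account portal. "
               "Invoices are issued monthly.",
    "sim": "To activate a new SIM card, insert it and follow the SMS "
           "instructions, or use the self-service portal.",
}

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def tf_idf_vectors(docs):
    """Build simple TF-IDF vectors, one per document."""
    tokenized = {k: tokenize(v) for k, v in docs.items()}
    df = Counter()
    for toks in tokenized.values():
        df.update(set(toks))  # document frequency per term
    n = len(docs)
    vectors = {}
    for k, toks in tokenized.items():
        tf = Counter(toks)
        vectors[k] = {t: tf[t] * math.log((n + 1) / (df[t] + 1)) for t in tf}
    return vectors

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, vectors):
    """Return the best-matching FAQ entry for a customer question."""
    q_vec = dict(Counter(tokenize(query)))
    best = max(vectors, key=lambda k: cosine(q_vec, vectors[k]))
    return best, docs[best]

vectors = tf_idf_vectors(FAQ_DOCS)
key, answer = retrieve("How do I activate my new SIM card?", FAQ_DOCS, vectors)
print(key)  # -> sim
```

In the deployed system, the retrieved passage would then be fed to the LLM as grounding context, which is what allows the chatbot to answer with precise, source-linked responses.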
The traditional FAQ structure was replaced with a user-friendly search bar, allowing customers to ask questions directly and receive precise answers with related links. We ensured full compliance with European data protection regulations throughout the process. To support continuous improvement, we built a robust pipeline for data collection, cleaning, model evaluation, and deployment, enabling ongoing monitoring and refinement.
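The model-evaluation stage of such a pipeline can be sketched as scoring chatbot answers against reference answers. The token-overlap F1 metric below (common in QA evaluation) and the sample data are illustrative assumptions, not the company's actual evaluation setup.

```python
# Hedged sketch of an answer-evaluation step: score predictions against
# reference answers using token-overlap F1. Metric choice and data are
# illustrative.
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def token_f1(prediction, reference):
    """F1 over shared tokens between a predicted and a reference answer."""
    p, r = Counter(tokenize(prediction)), Counter(tokenize(reference))
    overlap = sum((p & r).values())  # multiset intersection
    if overlap == 0:
        return 0.0
    precision = overlap / sum(p.values())
    recall = overlap / sum(r.values())
    return 2 * precision * recall / (precision + recall)

# Hypothetical evaluation set; in practice this would come from the
# data-collection and cleaning stages of the pipeline.
eval_set = [
    {"reference": "Insert the SIM and follow the SMS instructions.",
     "prediction": "Insert the SIM card and follow the SMS instructions."},
]
scores = [token_f1(ex["prediction"], ex["reference"]) for ex in eval_set]
print(round(sum(scores) / len(scores), 2))  # -> 0.94
```

Tracking a score like this per deployment is one simple way to make "ongoing monitoring and refinement" measurable: a regression in the average score flags a change that hurt answer quality before it reaches customers.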
Next steps in applied excellence
The next phase will focus on completing A/B testing, integrating advanced language capabilities for even better query handling, and expanding the chatbot’s knowledge base to cover additional customer scenarios.