AI-Powered Knowledge Search Modernization for a Large Insurance Organization

March 25, 2024

Overview:

A major U.S. insurance organization needed a way to unify and search thousands of internal documents spread across multiple departments. Employees struggled to quickly locate accurate information on claims, 401(k), benefits, payroll processing, and regulatory guidance—resulting in workflow delays, inconsistent answers, and operational inefficiencies.

Our team partnered with the organization to design and implement an AI-enabled enterprise search tool powered by Retrieval-Augmented Generation (RAG), Elasticsearch, and Claude via Amazon Bedrock. The solution now provides employees with instant, accurate, context-aware answers tailored to their specific line of business.


The Challenge:

The client faced several major obstacles:

  • Massive volumes of unstructured documents across claims, benefits, 401(k), compliance, and other departments

  • Difficulty locating reliable answers quickly, slowing down claims resolution and customer servicing

  • Inconsistent search results from outdated enterprise search tools

  • No centralized, context-aware knowledge platform

  • Employees spending unnecessary time sifting through documents instead of focusing on higher-value work

The organization needed an AI solution capable of returning precise, department-specific insights without exposing irrelevant or cross-department data.


The Solution:

We built an internal AI-enabled enterprise search tool using a multi-layered architecture:

1. Retrieval-Augmented Generation (RAG) Pipeline:

  • Elasticsearch identifies the top 10 most relevant documents from large departmental libraries.

  • These documents are passed to the Claude LLM, which synthesizes a concise and accurate answer.
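
A minimal sketch of how this two-stage pipeline could be wired together, assuming an Elasticsearch index of departmental documents and Claude invoked through the Amazon Bedrock runtime (the cluster URL, index and field names, prompt, and default model ID are illustrative, not the client's actual configuration):

```python
import json

import boto3
from elasticsearch import Elasticsearch

es = Elasticsearch("https://search.internal.example:9200")  # hypothetical cluster URL
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def answer_question(question: str, index: str = "claims-library") -> str:
    # Stage 1: retrieve the top 10 most relevant documents from the departmental library.
    hits = es.search(
        index=index,
        query={"match": {"content": question}},
        size=10,
    )["hits"]["hits"]
    context = "\n\n".join(hit["_source"]["content"] for hit in hits)

    # Stage 2: pass the retrieved passages to Claude, which synthesizes a concise answer.
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{
            "role": "user",
            "content": (
                "Answer the question using only the context below.\n\n"
                f"Context:\n{context}\n\nQuestion: {question}"
            ),
        }],
    }
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",
        body=json.dumps(body),
    )
    return json.loads(response["body"].read())["content"][0]["text"]
```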

2. Department-Specific Search Libraries:

Each user can search within their dedicated library:

  • Claims

  • 401(k)

  • Benefits & Payroll

  • Other business units

This ensures results are accurate, relevant, and compliant, with no cross-contamination between business groups.
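
One way to enforce this scoping, sketched under the assumption that each business unit has its own Elasticsearch index (the mapping and helper below are hypothetical):

```python
# Hypothetical mapping of business units to their dedicated search libraries.
DEPARTMENT_INDICES = {
    "claims": "claims-library",
    "401k": "retirement-library",
    "benefits_payroll": "benefits-payroll-library",
}

def index_for_user(department: str) -> str:
    # Users only ever query the library for their own business unit;
    # an unknown department fails closed rather than falling back to a shared index.
    try:
        return DEPARTMENT_INDICES[department]
    except KeyError:
        raise PermissionError(f"No search library configured for department '{department}'")
```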

3. Flexible LLM Architecture:

The platform uses Amazon Bedrock, enabling seamless switching between multiple LLMs:

  • Tested multiple Claude variants

  • Experimented with OpenAI's GPT models (ChatGPT) for comparison

  • Ultimately selected Claude Haiku for its balance of answer quality, speed, and cost
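
Because every model hosted on Bedrock is called through the same runtime API, moving between candidates can be reduced to a configuration change. A minimal sketch; the keys and the default choice are illustrative, while the IDs shown are the documented Bedrock identifiers for the Claude 3 family:

```python
# Candidate models behind a single configuration switch; changing ACTIVE_MODEL
# is the only code change needed to route queries to a different LLM.
MODEL_IDS = {
    "claude-sonnet": "anthropic.claude-3-sonnet-20240229-v1:0",
    "claude-haiku": "anthropic.claude-3-haiku-20240307-v1:0",  # selected for speed and cost
}

ACTIVE_MODEL = MODEL_IDS["claude-haiku"]
# bedrock.invoke_model(modelId=ACTIVE_MODEL, body=...) as in the pipeline sketch above.
```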

4. Simple, Intuitive User Experience:

Users can:

  • Ask questions in natural language

  • Receive a synthesized answer

  • View the supporting documents returned by the system

The result: Google-like enterprise search, grounded entirely in the client’s internal knowledge.
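
A possible shape for what the interface receives per query, assuming the pipeline returns both the synthesized answer and the documents it was grounded in (the type and field names are illustrative):

```python
from dataclasses import dataclass

@dataclass
class SearchResponse:
    question: str        # the user's natural-language question
    answer: str          # Claude's synthesized, context-grounded answer
    sources: list[dict]  # supporting documents, e.g. {"title": ..., "department": ..., "score": ...}
```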


Business Impact:

Immediate Benefits:

  • Faster access to information, reducing time spent searching through documents

  • More consistent, accurate answers, improving downstream decision-making

  • Higher workflow efficiency across claims, benefits, and payroll teams

Organizational Momentum:

  • Two business units adopted the tool immediately

  • A third business unit requested access after seeing early success

  • Broad internal excitement and rapid organic adoption

Long-Term Impact:

  • Establishes a foundation for enterprise-wide AI adoption

  • Provides a scalable model for future knowledge applications

  • Helps employees work more efficiently with AI rather than being replaced by it

  • Signals a shift toward AI-first operational excellence


Why This AI Solution Stands Out:

Unlike traditional enterprise search, this platform:

  • Returns accurate answers, not just documents

  • Filters results by business unit

  • Uses LLM reasoning with internal context

  • Continuously improves as models evolve

  • Is built on a flexible foundation able to incorporate future AI advancements

  • Empowers employees rather than displacing them

It is, in effect, a private, enterprise-safe search engine purpose-built for complex insurance operations.


Key Outcomes:

  • Significant reduction in time spent searching internal documentation

  • Improved accuracy and consistency of operational decisions

  • Increased employee satisfaction due to easier access to information

  • Rapid, cross-department adoption

  • A scalable AI architecture designed for future expansion

Key Highlights:

AI Engineering, Enterprise Knowledge Management, Large Language Model (LLM) Integration

RAG Architecture, Elasticsearch, Amazon Bedrock, Claude LLM