November 2022 marked a key moment for generative AI with the launch of ChatGPT, catalysing widespread interest and adoption. This surge spurred enterprises to explore innovative applications of large language models (LLMs) on both structured and unstructured data, aiming to boost productivity and operational efficiency. Consequently, many tools emerged to minimise manual intervention in routine tasks, including AI data analysts, automated insight generators, and knowledge search functionalities.
However, as enterprises began applying these tools to real business problems, the initial enthusiasm started to wane, a bit like the excitement of unwrapping a new gadget only to find it hard to fit into daily routines. In today's technologically advanced landscape, it is relatively easy to build an AI assistant that answers questions over documents or structured data. These assistants perform well on straightforward queries but often falter on nuanced or complex business questions. Let's illustrate this with an assistant equipped with granular sales data and key dimensions such as region, brand, sales channel, and day of sale.
Challenges in Building Enterprise LLM Applications
The following question types illustrate the challenges an LLM faces and potential resolutions.
- Easy to Answer Reliably
- Example Questions:
- What are the sales through various channels?
- Which region has the highest sales?
- Challenges for LLM:
- None
- Potential Resolution:
- None
- Slightly Difficult, Solvable with Engineering
- Example Questions:
- How are my XYZ Chocolates sales trending?
- What were the incremental sales during the holiday campaign?
- Challenges for LLM:
- Recognise that XYZ Chocolates is a brand found in the “brand” column.
- Know the campaign dates and the method for calculating incremental sales.
- Potential Resolution:
- Provide context about brands, holiday calendars, and metric definitions. A "context layer" aiding the LLM with these nuances can help generate reliable answers (see the first sketch after this list).
- More Complex, Requiring Decision Aid
- Example Questions:
- What is the impact on sales due to an increase in inflation?
- Challenges for LLM:
- Understand the relationship between inflation and sales.
- Know the specifics of the inflation increase.
- Potential Resolution:
- Use a tool like an ML model to establish the relationship between inflation and sales. Seek user input when the question isn't specific (a toy driver-model sketch follows this list).
- Strategic Questions with Multiple Solutions
- Example Question:
- How can I increase sales by 5%?
- Challenges for LLM:
- Understand numerous data points and apply judgment; because multiple solutions are possible, such questions are often better answered by humans.
- Potential Resolution:
- Constrain the problem and provide LLMs with a few options to aid in planning and coming up with an answer. For example, with access to region-wise demand forecasts and marketing spend models, the LLM can suggest optimising marketing spending in regions with low demand forecasts as a way to increase sales.
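To make the "context layer" idea concrete, here is a minimal Python sketch, assuming a hypothetical `sales(date, brand, region, channel, revenue)` table and made-up brand, calendar, and metric entries. It only assembles the enriched prompt; the actual call to an LLM is left out.

```python
from dataclasses import dataclass, field

@dataclass
class ContextLayer:
    """Business context injected into every prompt (all entries are hypothetical)."""
    column_descriptions: dict = field(default_factory=lambda: {
        "brand": "Product brand, e.g. 'XYZ Chocolates'",
        "region": "Sales region (NA, EMEA, APAC)",
        "channel": "Sales channel (retail, e-commerce, wholesale)",
    })
    metric_definitions: dict = field(default_factory=lambda: {
        "incremental sales": "Sales during a campaign window minus the baseline "
                             "forecast for the same window.",
    })
    campaign_calendar: dict = field(default_factory=lambda: {
        "holiday campaign": ("2023-11-20", "2023-12-31"),
    })

    def to_prompt(self) -> str:
        """Flatten the context into plain text the LLM can condition on."""
        lines = ["Column descriptions:"]
        lines += [f"- {k}: {v}" for k, v in self.column_descriptions.items()]
        lines.append("Metric definitions:")
        lines += [f"- {k}: {v}" for k, v in self.metric_definitions.items()]
        lines.append("Campaign calendar:")
        lines += [f"- {k}: {v[0]} to {v[1]}" for k, v in self.campaign_calendar.items()]
        return "\n".join(lines)


def build_sql_prompt(question: str, context: ContextLayer) -> str:
    """Combine the user question with the context layer before calling the LLM."""
    return (
        "You are a SQL assistant for the table "
        "`sales(date, brand, region, channel, revenue)`.\n"
        f"{context.to_prompt()}\n"
        f"Question: {question}\n"
        "Return a single SQL query."
    )


if __name__ == "__main__":
    prompt = build_sql_prompt(
        "What were the incremental sales during the holiday campaign?",
        ContextLayer(),
    )
    print(prompt)  # This enriched prompt would be sent to the LLM of your choice.
```

The key design choice is that the context lives outside the prompt template, so it can be maintained as a shared organisational asset rather than hard-coded per question.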
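Similarly, for the inflation question, the "tool" can be as simple as a fitted statistical model exposed to the LLM as a worker agent. The NumPy sketch below uses fabricated quarterly figures and a hypothetical `what_if_sales` helper purely for illustration; a real driver model would need to handle confounders, seasonality, and uncertainty.

```python
import numpy as np

# Hypothetical quarterly observations: inflation rate (%) and sales (in millions).
inflation = np.array([2.0, 2.5, 3.1, 3.8, 4.5, 5.2])
sales = np.array([105.0, 103.0, 101.5, 99.0, 97.5, 95.0])

# Fit a simple linear relationship: sales ~ slope * inflation + intercept.
slope, intercept = np.polyfit(inflation, sales, deg=1)

def what_if_sales(inflation_increase_pct: float, current_inflation: float = 5.2) -> float:
    """Estimate sales at a new inflation level using the fitted relationship."""
    return slope * (current_inflation + inflation_increase_pct) + intercept

# "What is the impact on sales due to a 1 percentage point increase in inflation?"
impact = what_if_sales(1.0) - what_if_sales(0.0)
print(f"Estimated sales impact: {impact:+.1f}M")
```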
From the examples above, certain patterns emerge for designing reliable LLM assistants capable of providing contextual answers:
- Creation of a Semantic Context Layer: This layer equips the LLM with business nuances, including table and column descriptions, a glossary of terms, detailed data catalogues, metric definitions, user personas, historical SQL queries, and relationships between tables.
- Worker Agents: These use enterprise data to perform specific tasks well. Worker agents can range in complexity and often serve as reusable organisational assets that provide intelligence to LLM assistants. Examples include:
- Reusable Codebase: Customer prioritisation logic and methods for estimating incremental sales.
- Models: Models that establish relationships between drivers and target KPIs like sales are useful for “what if” questions.
- LLM Agents: Text-to-SQL generators or RAG pipelines over documents, each scoped to a narrow, well-defined task.
- Dashboards: summarised data insights that AI can read to answer questions.
- LLM Brain: Orchestrates the entire process, from question interpretation to answer generation, accessing the context layer and worker agents. Key functions (sketched in code after this list) include:
- Interpreting the question.
- Seeking user input if needed.
- Accessing the context layer for enrichment.
- Deciding which worker agents are needed to answer a question.
- Summarising the output in a human-readable format.
- Validating output for sensitive or harmful content.
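Below is a minimal orchestration sketch of this pattern. All agent names, routing rules, and outputs are hypothetical placeholders; in practice, the routing decision and the final summary would themselves come from the LLM rather than keyword rules.

```python
from typing import Callable, Dict

# --- Worker agents: reusable, narrowly scoped capabilities -------------------

def sql_agent(question: str) -> str:
    """Stub for a text-to-SQL agent running against the sales warehouse."""
    return "Top region by sales: EMEA (42.1M)"  # placeholder result

def driver_model_agent(question: str) -> str:
    """Stub for an ML model relating drivers (e.g. inflation) to sales."""
    return "A 1% rise in inflation is associated with a ~0.4% drop in sales."

def rag_agent(question: str) -> str:
    """Stub for retrieval-augmented answers over policy or strategy documents."""
    return "The growth plan prioritises e-commerce expansion in APAC."

WORKER_AGENTS: Dict[str, Callable[[str], str]] = {
    "sql": sql_agent,
    "driver_model": driver_model_agent,
    "rag": rag_agent,
}

# --- LLM brain: interpret, route, summarise ----------------------------------

def route(question: str) -> str:
    """Toy keyword router; a production system would ask the LLM to pick agents."""
    q = question.lower()
    if "inflation" in q or "impact" in q:
        return "driver_model"
    if "plan" in q or "strategy" in q:
        return "rag"
    return "sql"

def answer(question: str) -> str:
    agent_name = route(question)
    raw = WORKER_AGENTS[agent_name](question)
    # A real implementation would also validate the output for sensitive content
    # and ask the user for clarification when the question is underspecified.
    return f"[{agent_name}] {raw}"

if __name__ == "__main__":
    for q in ["Which region has the highest sales?",
              "What is the impact on sales due to an increase in inflation?"]:
        print(answer(q))
```

Keeping each worker agent behind a plain function-style interface is what makes it reusable across assistants.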
This approach has been used to develop complex real-life applications, such as:
- An assistant that helps pharma reps plan their meetings with physicians.
- An assistant that helps data users identify the right data from thousands of enterprise datasets.
- An assistant that helps plan marketing budgets and generates spending scenarios.
Challenges and the Way Forward
While this approach holds immense promise for revolutionising business operations, achieving analysis and reasoning skills on par with humans remains a challenge.
A primary challenge is the lack of comprehensive business context. Building and maintaining a robust context layer is essential for effective AI assistance, yet time-consuming. Also, while LLMs continue to improve, their ability to reason and coordinate multiple agents remains limited. Simply put, there is a current upper limit to the complexity they can handle, though that limit is expanding rapidly. To address this, ongoing research explores multi-agent systems in which agents with stronger reasoning capabilities collaborate to solve problems iteratively.
Another obstacle is data silos and disparate systems within enterprises. Integrating these data sources and systems is essential for AI assistants that can answer a wide range of questions.
Despite these challenges, the rapid evolution of AI models is driving progress in accuracy and efficiency. Domain-specific model training will be increasingly important as AI assistants tackle more complex tasks.
Custom business assistants with organisational intelligence, as opposed to off-the-shelf solutions, have the potential to dramatically enhance business productivity, efficiency, and security. Realising this potential requires ongoing innovation and model refinement. Their applications are vast, spanning but not limited to customer service, supply chain, HR, sales, and marketing. While these areas have already benefited from advanced analytics, the contextualisation and multi-agent approach of organisational intelligence promises to unlock even greater enterprise intelligence.