
Genesys Cloud Agent Copilot support for customer-provided LLMs

Announced on: 2026-02-02
Effective date: (YYYY-MM-DD)
Aha! idea

In a future release, Genesys Cloud will allow customers to use their own large language model (LLM) to generate Agent Copilot interaction summaries. Previously, Genesys Cloud generated Agent Copilot summaries using a native LLM provided by the platform. With this update, customers can connect their own LLM to handle summarization for Agent Copilot. This feature gives organizations direct control over how summaries are generated and which model is used.

This change is useful for enterprises with specific requirements around data handling, language support, or model behavior. Some organizations already operate approved or internally trained LLMs and need Agent Copilot summaries to align with those standards. Others require summary generation in languages that are not yet supported by the native Agent Copilot summarization experience. This update helps customers meet enterprise requirements while keeping Agent Copilot aligned with their existing AI strategy.

What’s changing

  • Support for customer-provided LLMs for Agent Copilot summaries – Customers can configure Genesys Cloud to send interaction data to their own LLM for Agent Copilot summarization instead of using the native model (a general integration pattern is sketched after this list).
  • More control over summarization behavior – Customers can use models trained on their own data or tuned to their business needs, enabling summaries that better reflect internal terminology and workflows.
  • Expanded language options – Customers can generate Agent Copilot summaries in languages supported by their chosen LLM, even if those languages are not currently supported by native Agent Copilot summarization.
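
To illustrate the general pattern only, the Python sketch below shows how a customer-hosted LLM endpoint might be called to summarize an interaction transcript. The endpoint URL, environment variables, payload fields, and model name are illustrative assumptions and do not represent the documented Genesys Cloud configuration or API contract.

```python
"""
Minimal sketch: summarizing an interaction transcript with a
customer-provided LLM. The OpenAI-compatible endpoint shape, the
environment variable names, and the model name are assumptions
made for illustration, not part of any Genesys Cloud contract.
"""
import os

import requests

# Customer-hosted chat-completions endpoint (hypothetical).
LLM_ENDPOINT = os.environ.get(
    "CUSTOMER_LLM_URL", "https://llm.example.com/v1/chat/completions"
)
LLM_API_KEY = os.environ.get("CUSTOMER_LLM_API_KEY", "")


def summarize_interaction(transcript: str, language: str = "en-US") -> str:
    """Send an interaction transcript to the customer's LLM and return a summary."""
    payload = {
        "model": "internal-summarizer-v1",  # hypothetical, customer-managed model
        "messages": [
            {
                "role": "system",
                "content": (
                    f"Summarize the following customer interaction in {language}. "
                    "Include the reason for contact, actions taken, and follow-ups."
                ),
            },
            {"role": "user", "content": transcript},
        ],
        "temperature": 0.2,
    }
    response = requests.post(
        LLM_ENDPOINT,
        json=payload,
        headers={"Authorization": f"Bearer {LLM_API_KEY}"},
        timeout=30,
    )
    response.raise_for_status()
    # Assumes an OpenAI-style response body.
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    demo_transcript = (
        "Agent: Thanks for calling, how can I help?\n"
        "Customer: My invoice shows a duplicate charge for March.\n"
        "Agent: I see it; I've issued a refund that should post in 3-5 days."
    )
    print(summarize_interaction(demo_transcript))
```

The language parameter in the sketch reflects the third bullet above: because the customer chooses the model, summaries can be produced in any locale that model supports, including languages the native Agent Copilot summarizer does not yet cover.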