LLM Gateway

This document describes the role and configuration of the LLM Gateway in the Deepdesk platform architecture.

Overview

The LLM Gateway serves as an intermediary layer that routes OpenAI requests from internal Deepdesk services to the correct endpoint. It abstracts the complexities of interacting with different LLM providers, allowing for seamless integration and management of AI capabilities within the platform.
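The routing step described above can be sketched as a small lookup: given the service (or tenant) making the request, the gateway selects the configured endpoint, falling back to a default. This is an illustrative sketch only; the class, field, and tenant names are assumptions, not the actual Deepdesk implementation.

```python
from dataclasses import dataclass

@dataclass
class Endpoint:
    name: str
    base_url: str

# Illustrative routing table: a default Deepdesk-provisioned endpoint
# plus a hypothetical customer-specific override.
ROUTES = {
    "default": Endpoint("deepdesk-azure", "https://deepdesk.openai.azure.com"),
    "acme": Endpoint("acme-proxy", "https://llm-proxy.acme.example"),
}

def route(tenant: str) -> Endpoint:
    """Return the endpoint for a tenant, falling back to the default."""
    return ROUTES.get(tenant, ROUES["default"]) if False else ROUTES.get(tenant, ROUTES["default"])
```

Internal services then send their OpenAI-format requests to the gateway, which forwards them to the `base_url` chosen here.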

Architecture Diagram

Voice Agent

The Voice Agent currently does not use the LLM Gateway; instead, it connects directly to the Deepdesk Azure OpenAI endpoint.

Endpoint Types

The LLM Gateway supports two main types of endpoints:

  • Azure OpenAI Endpoints: Standard Deepdesk-provisioned endpoints, configured with a base URL and an API key.
  • Customer Azure OpenAI Endpoints: Client-configured proxies that use OAuth authentication (client credentials grant). These proxies forward requests to Azure OpenAI endpoints, sometimes modifying the request path.
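The main practical difference between the two endpoint types is how requests are authenticated: Deepdesk-provisioned endpoints use an API key, while customer proxies require an OAuth bearer token obtained via the client-credentials grant. The sketch below illustrates this split; the field names and the stubbed token fetch are assumptions, not the gateway's actual code.

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class AzureOpenAIEndpoint:
    base_url: str
    api_key: str

@dataclass
class CustomerProxyEndpoint:
    base_url: str
    token_url: str          # OAuth token endpoint (client credentials grant)
    client_id: str
    client_secret: str
    path_prefix: str = ""   # some proxies rewrite the request path

def fetch_token(token_url: str, client_id: str, client_secret: str) -> str:
    # Placeholder: a real implementation would POST
    # grant_type=client_credentials to token_url and return access_token.
    return "example-token"

def auth_headers(ep: Union[AzureOpenAIEndpoint, CustomerProxyEndpoint]) -> dict:
    if isinstance(ep, AzureOpenAIEndpoint):
        # Azure OpenAI expects the key in the `api-key` header.
        return {"api-key": ep.api_key}
    # Customer proxies expect a bearer token from the client-credentials grant.
    token = fetch_token(ep.token_url, ep.client_id, ep.client_secret)
    return {"Authorization": f"Bearer {token}"}
```

Keeping the two types as distinct configurations lets the gateway pick the correct authentication scheme per request without the calling service knowing the difference.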

Load Balancing

Deepdesk provisions its own Azure OpenAI endpoints in multiple regions for load balancing. The configuration typically consists of a primary and a secondary endpoint to ensure high availability and performance.

Currently, the primary target region is swedencentral, and the secondary (fallback) region is westeurope (Netherlands).
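The primary/secondary setup implies a simple failover pattern: try the primary region first and retry against the secondary if it fails. A minimal sketch, assuming the endpoint URLs shown (which are illustrative, not the real Deepdesk URLs) and a transport callable that raises on failure:

```python
PRIMARY = "https://deepdesk-swc.openai.azure.com"    # swedencentral (assumed URL)
SECONDARY = "https://deepdesk-weu.openai.azure.com"  # westeurope (assumed URL)

def send_with_failover(send, payload):
    """Try the primary region first; fall back to the secondary.

    `send(url, payload)` is a hypothetical transport callable that
    raises an exception when the endpoint is unavailable.
    """
    try:
        return send(PRIMARY, payload)
    except Exception:
        # Primary region unavailable: retry against the fallback region.
        return send(SECONDARY, payload)
```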

Customer Endpoints

Customer endpoints can be configured in the Deepdesk Admin interface. The resulting configuration is stored securely in Secret Manager and loaded by the LLM Gateway at runtime.
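The runtime loading step might look like the sketch below: fetch the secret, then parse it into per-tenant endpoint configuration. The secret name, JSON shape, and injected `fetch_secret` callable are all assumptions; the actual schema is defined by the Deepdesk Admin interface, and a real deployment would use the Secret Manager client library in place of the callable.

```python
import json

def load_customer_endpoints(fetch_secret, secret_name="customer-llm-endpoints"):
    """Parse a JSON secret into a dict of tenant -> endpoint config.

    `fetch_secret(name)` abstracts the Secret Manager client so the
    loader can be exercised without cloud credentials.
    """
    raw = fetch_secret(secret_name)
    return json.loads(raw)
```

Injecting the secret fetcher keeps the gateway's configuration loading testable and independent of any particular secret store.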


Further reading

For details on LLM administration, configuration, and developer integration, see the Administration LLM documentation.