Requirements for Using Customer-Hosted Azure OpenAI LLMs

To integrate your own Azure-hosted deployments with Deepdesk, we need the following details for each deployment. Commonly used models include gpt-4o-mini and text-embedding-ada-002.

Required information per deployment

For each model deployment you want Deepdesk to use, please provide:

| Item | Description |
| --- | --- |
| Model / deployment name | Must match the OpenAI model name (e.g. gpt-4o-mini, text-embedding-ada-002) or use a prefix; see Deployment name and prefix below. |
| Deployment region URL | The full base URL of your Azure Cognitive Services endpoint, e.g. https://swedencentral.api.cognitive.microsoft.com/ |
| API key | A valid Azure OpenAI API key with access to the deployment. |
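
Taken together, these three items are enough to address a deployment. As a rough illustration (the api-version value here is a placeholder; use whichever version your deployment supports), the Azure OpenAI request URL is composed from the region URL and deployment name like this:

```python
def azure_openai_url(endpoint: str, deployment: str, api_version: str) -> str:
    """Compose the chat-completions URL for an Azure OpenAI deployment.

    `endpoint` is the deployment region URL and `deployment` the
    deployment name from the table above; `api_version` is illustrative.
    """
    base = endpoint.rstrip("/")
    return (f"{base}/openai/deployments/{deployment}"
            f"/chat/completions?api-version={api_version}")

# Using the example values from the table:
url = azure_openai_url(
    "https://swedencentral.api.cognitive.microsoft.com/",
    "gpt-4o-mini",
    "2024-02-01",
)
```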

Authentication

You can use API key authentication (above) or, optionally, Entra ID (managed identity) authentication for your Azure OpenAI deployments. If you use Entra ID, provide the Entra ID client credentials instead of (or in addition to) the API key, and we will configure the connection accordingly.

Deployment name and prefix

Deployment names must match the OpenAI model names, e.g. gpt-4o-mini and text-embedding-ada-002. You may add a prefix to distinguish these deployments from others. The format is:

<prefix>-<model-name>

Examples:

  • deepdesk-gpt-4o-mini
  • deepdesk-text-embedding-ada-002

Tell us which prefix you use so we can configure it in Deepdesk. In Admin this is set as the deployment prefix on the LLM Config (see the LLM User Guide).
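
The naming rule is plain string composition; this hypothetical helper shows how a prefix combines with a model name (deepdesk is just the example prefix used on this page):

```python
def deployment_name(model: str, prefix: str = "") -> str:
    """Return <prefix>-<model-name>, or the bare model name if no prefix."""
    return f"{prefix}-{model}" if prefix else model

# Matches the examples above:
deployment_name("gpt-4o-mini", prefix="deepdesk")  # deepdesk-gpt-4o-mini
deployment_name("text-embedding-ada-002", prefix="deepdesk")
```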

Notes

  • Quota: Ensure you have sufficient quota allocated in your Azure subscription for these models.
  • Security: API keys and endpoints are stored securely and used only for request routing under your account.

See also