LLM Providers

Connect your own LLM credentials for full control over AI responses and costs.

Overview

Monology uses a "bring your own key" model for LLM providers. You provide your own API credentials, giving you full control over costs, data privacy, and model selection.

Supported Providers

OpenAI
Direct integration with OpenAI's API

Required Credentials

  • API Key: Your OpenAI API key (starts with sk-)

Supported Models

  • GPT-4
  • GPT-4 Turbo
  • GPT-4o
  • GPT-4o-mini
  • GPT-3.5 Turbo
Getting an API Key: Sign up at platform.openai.com and create an API key in your account settings.

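Before saving a key, a cheap format pre-check can catch obvious mistakes (truncated or pasted-wrong keys) without a network call. A minimal sketch; the helper name and the exact key pattern are assumptions, not part of Monology:

```python
import re

def looks_like_openai_key(key: str) -> bool:
    """Cheap format pre-check before any network round-trip.

    OpenAI keys start with "sk-"; the rest of the pattern is an
    assumption and may change, so treat False as a warning only.
    """
    return bool(re.match(r"^sk-[A-Za-z0-9_-]{20,}$", key))

# A real validation would follow with an authenticated request,
# e.g. GET https://api.openai.com/v1/models with an
# "Authorization: Bearer <key>" header.
```
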
Azure OpenAI
Enterprise-grade option with Azure infrastructure

Required Credentials

  • Endpoint: Your Azure OpenAI resource endpoint
  • API Key: Azure OpenAI API key
  • Deployment Name: Name of your model deployment
  • API Version: Azure API version (e.g., 2024-02-15-preview)
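The four fields above map directly onto Azure's REST request shape: the endpoint and deployment name form the URL path, the API version is a query parameter, and the key travels in a header. A sketch assuming Azure's documented URL scheme (the resource name `myres` is an example):

```python
def azure_chat_url(endpoint: str, deployment: str, api_version: str) -> str:
    """Assemble the Azure OpenAI chat-completions URL from the
    credential fields above, per Azure's documented path scheme."""
    return (f"{endpoint.rstrip('/')}/openai/deployments/"
            f"{deployment}/chat/completions?api-version={api_version}")

url = azure_chat_url("https://myres.openai.azure.com",
                     "gpt-4o", "2024-02-15-preview")
# The API key is sent as an "api-key" request header, not in the URL.
```
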

Benefits

  • Enterprise SLAs and support
  • Data residency options
  • Integration with Azure security
  • Private network deployment
Setup: Create an Azure OpenAI resource in the Azure Portal and deploy a model.

Configuring LLM Credentials

  1. Open your workflow and select an Agent Node
  2. Navigate to the LLM Configuration section
  3. Select your provider (OpenAI or Azure OpenAI)
  4. Enter your credentials
  5. Click Validate to test the connection
  6. Save your changes

Credential Validation

Monology validates your credentials when you save them to ensure they work correctly:

  • Checks API key validity
  • Verifies endpoint accessibility (Azure)
  • Confirms model deployment exists (Azure)
  • Makes a test API call to confirm the credentials work end to end
Tip: If validation fails, double-check your credentials and ensure your API key has the necessary permissions.
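The checks above run in order, and the first failure is what gets reported back to you. That ordering can be sketched as a simple pipeline; every name below is illustrative, not Monology's actual implementation:

```python
from typing import Callable, Optional

def validate(checks: list[tuple[str, Callable[[], bool]]]) -> Optional[str]:
    """Run named checks in order; return the name of the first
    failing check, or None if all pass."""
    for name, check in checks:
        if not check():
            return name
    return None

# Illustrative wiring; real checks would call the provider's API.
failed = validate([
    ("api key format", lambda: True),
    ("endpoint reachable", lambda: True),
    ("deployment exists", lambda: False),  # simulate a bad deployment name
    ("test completion", lambda: True),
])
# `failed` names the check to surface as the validation error.
```
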

Security

Your API credentials are handled securely:

  • Credentials are encrypted at rest
  • Keys are never exposed in the UI after saving
  • API calls are made server-side
  • You can rotate keys anytime
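The "never exposed in the UI" guarantee is typically achieved by storing the key encrypted and rendering only a masked form after saving. A hedged sketch of the masking half (helper name and mask style assumed):

```python
def mask_key(key: str, visible: int = 4) -> str:
    """Return a display-safe form of a secret: every character but
    the last few is replaced, so the UI never echoes the full key."""
    if len(key) <= visible:
        return "*" * len(key)
    return "*" * (len(key) - visible) + key[-visible:]
```

Showing only the last few characters is enough for you to tell saved keys apart without the stored value ever being recoverable from the UI.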

Next Steps

Email Providers

Configure email sending for Action nodes