You can configure Onyx to work with any AI provider and model!

AI Configuration

Navigate to Admin Panel → Configuration → LLM to access the AI configuration page. From here, you can select which providers and models you want to use in Onyx.

[Image: AI Configuration Overview]

AI Providers

If cloud AI models are approved for your organization, we recommend using them: they generally offer the best combination of capability, speed, and cost.

Cloud Models

  • OpenAI: State-of-the-art models for reasoning, coding, image generation, and general tasks. Use models like GPT-4o, GPT-4.1, o3, and GPT-5.
  • Anthropic: Excels at natural, human-sounding language and coding. Use models like Claude 4 Sonnet and Claude 4 Opus.
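
Before adding a cloud provider in the Admin Panel, it can help to confirm that your API key and chosen model work outside of Onyx. The snippet below is a minimal sketch using the official `openai` Python client; the `OPENAI_API_KEY` environment variable and the GPT-4o model name are assumptions to adjust for your setup.

```python
# Minimal sanity check (not part of Onyx): verify an OpenAI API key and model
# before entering them on the LLM configuration page.
import os

from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])  # assumes the key is exported

# Send a trivial prompt to a model you plan to enable in Onyx (GPT-4o here).
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Reply with 'ok' if you can read this."}],
    max_tokens=5,
)
print(response.choices[0].message.content)
```

If the call returns a reply, the same key and model can be used when configuring the provider on the LLM configuration page.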

Self-Hosted Models

  • gpt-oss-20b: OpenAI’s open-weight model optimized for chain-of-thought reasoning.
  • Llama 4 and 3.3 families: Meta’s latest models, offering strong performance in a range of sizes to suit different hardware.
  • Qwen-3 family: Alibaba’s open-weight models, available in a wide range of sizes.
  • DeepSeek-R1: DeepSeek’s open-weight reasoning model.

Self-hosting models is recommended only for advanced users and teams with specific needs.
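
If you do self-host, most serving stacks (for example Ollama or vLLM) expose an OpenAI-compatible HTTP endpoint that Onyx can be pointed at. The sketch below assumes such an endpoint is running locally at `http://localhost:11434/v1` (Ollama's default) and serving a Llama 3.3 model; the base URL and model tag are assumptions to replace with your deployment's values.

```python
# Minimal check of a self-hosted, OpenAI-compatible endpoint (e.g. Ollama or
# vLLM) before configuring it in Onyx. Base URL and model name are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed local OpenAI-compatible endpoint
    api_key="unused",                      # many local servers ignore the key
)

response = client.chat.completions.create(
    model="llama3.3",  # the model tag served by your local stack (assumed)
    messages=[{"role": "user", "content": "Reply with 'ok' if you can read this."}],
    max_tokens=5,
)
print(response.choices[0].message.content)
```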

Configure Your Providers

Best Practices

  • Review the Terms of Service and Data Processing Agreements for the providers you want to use
  • Review the data you choose to index in Onyx
  • Clearly instruct your users on appropriate data to use with Onyx