# Models
The `models` configuration section controls model selection and behavior.
| Field | Description |
|---|---|
| `builtin` | Built-in model configurations |
| `custom` | Custom model configurations. Entries with the same name as a builtin will override the builtin. |
| `default` | The default model name |
| `no_stream` | Disable streaming for all models |
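For orientation, here is a minimal sketch of the top-level `models` block. The model name is a placeholder; it must match a builtin or custom entry:

```ron
(
    models: (
        // Custom model entries; see "Custom models" below.
        custom: [],
        // Name of the model to use by default ("mymodel" is illustrative).
        default: "mymodel",
        // Set to true to force non-streaming requests for every model.
        no_stream: false,
    ),
)
```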
## Builtin models
Tenx auto-configures builtin models based on the presence of API keys in environment variables:
| Environment Variable | Models Added |
|---|---|
| `ANTHROPIC_API_KEY` | `sonnet` (claude-3-5-sonnet-latest), `haiku` (claude-3-5-haiku-latest) |
| `DEEPSEEK_API_KEY` | `deepseek` (deepseek-chat) |
| `DEEPINFRA_API_KEY` | `qwen` (Qwen/Qwen2.5-Coder-32B-Instruct), `llama-8b-turbo` (meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo), `llama-70b` (meta-llama/Meta-Llama-3.1-70B-Instruct), `llama33-70b` (meta-llama/Llama-3.3-70B-Instruct), `qwq` (Qwen/QwQ-32B-Preview) |
| `GOOGLEAI_API_KEY` | `gemini-exp` (gemini-exp-1206), `gemini-flash-exp` (gemini-2.0-flash-exp), `gemini-flash-thinking-exp` (gemini-2.0-flash-thinking-exp-1219), `gemini-15pro` (gemini-1.5-pro), `gemini-15flash` (gemini-1.5-flash), `gemini-15flash8b` (gemini-1.5-flash-8b) |
| `GROQ_API_KEY` | `groq-llama33-70b` (llama-3.3-70b-versatile), `groq-llama31-8b` (llama-3.1-8b-instant) |
| `OPENAI_API_KEY` | `o1` (o1-preview), `o1-mini` (o1-mini), `gpt4o` (gpt-4o), `gpt4o-mini` (gpt-4o-mini) |
| `XAI_API_KEY` | `grok` (grok-beta) |
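For example, with `ANTHROPIC_API_KEY` set in the environment, the builtin short names can be referenced directly in the configuration (a minimal sketch):

```ron
(
    models: (
        // "haiku" is auto-configured as a builtin when
        // ANTHROPIC_API_KEY is present in the environment.
        default: "haiku",
    ),
)
```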
## Custom models
Model configurations come in two varieties: `claude`, which is specific to the Anthropic API, and `openai`, which can be used with any model compatible with the OpenAI API.
The possible fields for `claude` models are:

| Field | Description |
|---|---|
| `name` | The name used to refer to this model |
| `api_model` | The API model identifier |
| `key` | The API key |
| `key_env` | Environment variable to load the API key from if `key` is empty |
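As a sketch, a custom `claude` model entry might look like the following. The `claude(...)` variant name is an assumption, mirroring the `open_ai(...)` form used in the example below, and the entry itself is hypothetical:

```ron
(
    models: (
        custom: [
            claude(
                // Hypothetical entry pinning an Anthropic model.
                name: "my-sonnet",
                api_model: "claude-3-5-sonnet-latest",
                key: "",
                // Loaded from the environment because key is empty.
                key_env: "ANTHROPIC_API_KEY",
            ),
        ],
    ),
)
```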
The possible fields for `openai` models are:

| Field | Description |
|---|---|
| `name` | The name used to refer to this model |
| `api_model` | The API model identifier |
| `key` | The API key |
| `key_env` | The environment variable to load the API key from |
| `api_base` | The base URL for the API |
| `can_stream` | Whether the model can stream responses |
| `no_system_prompt` | Set to true if the model does not support a separate system prompt |
## Example
Example of configuring a custom model using Ollama:
```ron
(
    models: (
        custom: [
            open_ai(
                name: "codellama",
                api_model: "codellama",
                key: "", // Ollama doesn't need an API key
                key_env: "",
                api_base: "http://localhost:11434/v1",
                can_stream: true,
                no_system_prompt: false,
            ),
        ],
        default: "codellama",
    ),
)
```
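Because custom entries with a builtin's name take precedence, the same pattern can redefine a builtin. A hedged sketch that shadows the builtin `qwen` model with a local Ollama endpoint (the `api_model` tag here is illustrative):

```ron
(
    models: (
        custom: [
            // Shadows the builtin "qwen" entry, redirecting it to Ollama.
            open_ai(
                name: "qwen",
                api_model: "qwen2.5-coder",
                key: "",
                key_env: "",
                api_base: "http://localhost:11434/v1",
                can_stream: true,
                no_system_prompt: false,
            ),
        ],
        default: "qwen",
    ),
)
```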