
Provider Configuration

Supports 29+ models; use whichever one you want.

Quick Configuration

# providers.toml
[openai]
api_key = "sk-..."
model = "gpt-4"
[deepseek]
api_key = "sk-..."
model = "deepseek-chat"

Supported Providers

International Models

OpenAI
Claude
Gemini
Groq
Mistral
Cohere

Chinese Models

DeepSeek
Kimi
Qwen (通义千问)
Zhipu AI (智谱AI)
ERNIE Bot (文心一言)
Doubao (豆包)

Local Models

Ollama
LocalAI
vLLM

Configuration Examples

OpenAI

[openai]
api_key = "sk-xxxxxxxx"
model = "gpt-4-turbo"
base_url = "https://api.openai.com/v1" # optional

DeepSeek

[deepseek]
api_key = "sk-xxxxxxxx"
model = "deepseek-chat"

Ollama (local)

[ollama]
base_url = "http://localhost:11434"
model = "llama3"

Switching Providers

# Switch via the command line
axoncog provider use deepseek
# Or set the default in the config file
default_provider = "deepseek"