Add AI_PROVIDER and AI_MODEL support
@@ -4,15 +4,29 @@ All configuration is managed in `tools/ai-review/config.yml`.
## Provider Settings
The provider and model can be configured via Gitea secrets, so you don't need to edit `config.yml` per deployment:
| Secret | Description | Example values |
|--------|-------------|----------------|
| `AI_PROVIDER` | Which LLM provider to use | `openai`, `openrouter`, `ollama`, `anthropic`, `azure`, `gemini` |
| `AI_MODEL` | Model for the active provider | `gpt-4.1-mini`, `claude-3-5-sonnet-20241022`, `google/gemini-2.0-flash` |
These secrets override the values in `config.yml`, which serve as fallback defaults.
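The precedence described here (secret wins, `config.yml` is the fallback) can be sketched as follows. This is a hypothetical illustration, not the tool's actual loader: `resolve_provider_and_model` and the inline `config` dict are made-up stand-ins, and only the `AI_PROVIDER` and `AI_MODEL` names come from the table above.

```python
import os

# Hypothetical sketch of the override precedence: environment variables
# (populated from Gitea secrets) win, config.yml values are the fallback.
# The config dict stands in for a parsed config.yml.
def resolve_provider_and_model(config, env=os.environ):
    provider = env.get("AI_PROVIDER", config["provider"])
    # AI_MODEL overrides the model *for the active provider*.
    model = env.get("AI_MODEL", config["model"][provider])
    return provider, model

config = {
    "provider": "openai",
    "model": {
        "openai": "gpt-4.1-mini",
        "anthropic": "claude-3-5-sonnet-20241022",
    },
}

# No secrets set: fall back to config.yml defaults.
print(resolve_provider_and_model(config, env={}))
# → ('openai', 'gpt-4.1-mini')

# AI_PROVIDER set: that provider's model entry is picked up automatically.
print(resolve_provider_and_model(config, env={"AI_PROVIDER": "anthropic"}))
# → ('anthropic', 'claude-3-5-sonnet-20241022')
```

Note that setting `AI_PROVIDER` alone is enough: the model falls back to that provider's entry in the `model:` map, so `AI_MODEL` is only needed to deviate from it.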
```yaml
# LLM Provider: openai | openrouter | ollama | anthropic | azure | gemini
# Override with the AI_PROVIDER Gitea secret.
provider: openai
# Model per provider
# Override the active provider's model with the AI_MODEL Gitea secret.
model:
openai: gpt-4.1-mini
openrouter: anthropic/claude-3.5-sonnet
ollama: codellama:13b
anthropic: claude-3-5-sonnet-20241022
azure: gpt-4
gemini: gemini-1.5-pro
# Generation settings
temperature: 0 # 0 = deterministic
```

@@ -166,6 +180,8 @@ These override config file settings:
| Variable | Description |
|----------|-------------|
| `AI_PROVIDER` | Override the active provider (e.g. `openrouter`, `anthropic`) |
| `AI_MODEL` | Override the model for the active provider |
| `AI_REVIEW_TOKEN` | Gitea/GitHub API token |
| `AI_REVIEW_API_URL` | API base URL (`https://api.github.com` or Gitea URL) |
| `AI_REVIEW_REPO` | Target repository (owner/repo) |
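Reading the variables in this table could look roughly like the sketch below. The `read_env_settings` helper is hypothetical; only the five variable names come from the table. `AI_PROVIDER` and `AI_MODEL` are optional overrides (absence means "use `config.yml`"), while the `AI_REVIEW_*` values have no config fallback and are treated as required here.

```python
import os

# Hypothetical sketch of reading the environment variables from the table
# above. AI_PROVIDER / AI_MODEL are optional overrides; the AI_REVIEW_*
# values have no config.yml fallback and are treated as required.
def read_env_settings(env=os.environ):
    settings = {
        "provider": env.get("AI_PROVIDER"),  # None → use config.yml value
        "model": env.get("AI_MODEL"),        # None → use config.yml value
    }
    for key in ("AI_REVIEW_TOKEN", "AI_REVIEW_API_URL", "AI_REVIEW_REPO"):
        if key not in env:
            raise KeyError(f"missing required environment variable: {key}")
        settings[key.lower()] = env[key]
    return settings

example = read_env_settings(env={
    "AI_PROVIDER": "openrouter",
    "AI_REVIEW_TOKEN": "dummy-token",
    "AI_REVIEW_API_URL": "https://api.github.com",
    "AI_REVIEW_REPO": "owner/repo",
})
print(example["provider"])  # → openrouter
print(example["model"])     # → None (fall back to config.yml)
```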