Add AI_PROVIDER and AI_MODEL support

This commit is contained in:
2026-03-01 19:56:14 +01:00
parent f3851f9e96
commit 7cc5d26948
8 changed files with 60 additions and 4 deletions


@@ -4,15 +4,29 @@ All configuration is managed in `tools/ai-review/config.yml`.
## Provider Settings
The provider and model can be configured via Gitea secrets, so you don't need to edit `config.yml` per deployment:
| Secret | Description | Example values |
|--------|-------------|----------------|
| `AI_PROVIDER` | Which LLM provider to use | `openai`, `openrouter`, `ollama`, `anthropic`, `azure`, `gemini` |
| `AI_MODEL` | Model for the active provider | `gpt-4.1-mini`, `claude-3-5-sonnet-20241022`, `google/gemini-2.0-flash` |
These secrets override the values in `config.yml`, which serve as fallback defaults.
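The override precedence can be sketched as follows. This is a minimal illustration, not the tool's actual implementation; the function name and config shape are assumptions based on the behavior described above (secret/environment variable wins, `config.yml` is the fallback):

```python
import os

def resolve_provider_and_model(config: dict) -> tuple[str, str]:
    """Resolve the active provider and model. AI_PROVIDER / AI_MODEL
    (populated from Gitea secrets) override the config.yml defaults."""
    provider = os.environ.get("AI_PROVIDER") or config["provider"]
    model = os.environ.get("AI_MODEL") or config["model"][provider]
    return provider, model

# Example: config.yml defaults to openai, but the AI_PROVIDER secret
# switches the deployment to anthropic without editing the file.
config = {
    "provider": "openai",
    "model": {
        "openai": "gpt-4.1-mini",
        "anthropic": "claude-3-5-sonnet-20241022",
    },
}
os.environ["AI_PROVIDER"] = "anthropic"  # simulating the Gitea secret
print(resolve_provider_and_model(config))
```

Note that `AI_MODEL` is unset here, so the model falls back to the configured default for the *overridden* provider, not the original one.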
```yaml
# LLM Provider: openai | openrouter | ollama
# LLM Provider: openai | openrouter | ollama | anthropic | azure | gemini
# Override with the AI_PROVIDER Gitea secret.
provider: openai
# Model per provider
# Override the active provider's model with the AI_MODEL Gitea secret.
model:
openai: gpt-4.1-mini
openrouter: anthropic/claude-3.5-sonnet
ollama: codellama:13b
anthropic: claude-3-5-sonnet-20241022
azure: gpt-4
gemini: gemini-1.5-pro
# Generation settings
temperature: 0 # 0 = deterministic
@@ -166,6 +180,8 @@ These override config file settings:
| Variable | Description |
|----------|-------------|
| `AI_PROVIDER` | Override the active provider (e.g. `openrouter`, `anthropic`) |
| `AI_MODEL` | Override the model for the active provider |
| `AI_REVIEW_TOKEN` | Gitea/GitHub API token |
| `AI_REVIEW_API_URL` | API base URL (`https://api.github.com` or Gitea URL) |
| `AI_REVIEW_REPO` | Target repository (owner/repo) |
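For a local run outside CI, the same overrides can be supplied as plain environment variables. A sketch (the trailing invocation is hypothetical; substitute the tool's actual entry point):

```shell
export AI_PROVIDER=openrouter
export AI_MODEL=anthropic/claude-3.5-sonnet
echo "Using $AI_PROVIDER / $AI_MODEL"
# then invoke the review tool, e.g.:
# python tools/ai-review/<entrypoint>.py
```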