Add AI_PROVIDER and AI_MODEL support
All checks were successful

2026-03-01 19:56:14 +01:00
parent f3851f9e96
commit 7cc5d26948
8 changed files with 60 additions and 4 deletions


@@ -54,6 +54,8 @@ The wizard will generate workflow files, create configuration, and guide you thr
 ### 1. Set Repository/Organization Secrets
 ```
+AI_PROVIDER - LLM provider: openai | openrouter | ollama | anthropic | azure | gemini
+AI_MODEL - Model to use for the active provider (e.g. gpt-4.1-mini, claude-3-5-sonnet-20241022)
 OPENAI_API_KEY - OpenAI API key (or use OpenRouter/Ollama)
 SEARXNG_URL - (Optional) SearXNG instance URL for web search
 ```
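The workflow passes these secrets to the review tool as environment variables. A minimal sketch of validating the `AI_PROVIDER` value against the supported list above (hypothetical helper name and logic; the tool's actual internals may differ):

```python
import os

# Providers accepted per the secrets reference above.
SUPPORTED_PROVIDERS = {"openai", "openrouter", "ollama", "anthropic", "azure", "gemini"}

def read_provider_from_env(env=os.environ):
    """Read AI_PROVIDER and fail fast on unsupported values.

    Hypothetical validation sketch, not the tool's actual code.
    Returns None when the secret is unset, so config.yml can take over.
    """
    provider = env.get("AI_PROVIDER", "").strip().lower()
    if not provider:
        return None  # fall back to config.yml
    if provider not in SUPPORTED_PROVIDERS:
        raise ValueError(f"Unsupported AI_PROVIDER: {provider!r}")
    return provider
```

Failing fast here surfaces a typo in the secret as a clear error instead of a confusing API failure later.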
@@ -104,6 +106,8 @@ jobs:
 AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
 AI_REVIEW_REPO: ${{ gitea.repository }}
 AI_REVIEW_API_URL: https://your-gitea.example.com/api/v1
+AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
+AI_MODEL: ${{ secrets.AI_MODEL }}
 OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
 run: |
   cd .ai-review/tools/ai-review
@@ -474,8 +478,10 @@ The bot will search the codebase, read relevant files, and provide a comprehensi
 Edit `tools/ai-review/config.yml`:
 ```yaml
-provider: openai # openai | openrouter | ollama
+# Set via AI_PROVIDER secret — or hardcode here as fallback
+provider: openai # openai | openrouter | ollama | anthropic | azure | gemini
+# Set via AI_MODEL secret — or hardcode per provider here
 model:
   openai: gpt-4.1-mini
   openrouter: anthropic/claude-3.5-sonnet
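The precedence this diff documents is: `AI_PROVIDER`/`AI_MODEL` environment variables win, and `config.yml` supplies the fallback, including the per-provider `model` map. A hypothetical Python sketch of that resolution order (not the tool's actual code):

```python
import os

def resolve_provider_and_model(config, env=os.environ):
    """Resolve provider and model: env vars win, config.yml is the fallback.

    `config` mirrors the YAML above, e.g.
    {"provider": "openai", "model": {"openai": "gpt-4.1-mini", ...}}.
    Hypothetical sketch of the documented precedence.
    """
    provider = env.get("AI_PROVIDER") or config["provider"]
    # AI_MODEL overrides; otherwise look up the model map for the active provider.
    model = env.get("AI_MODEL") or config["model"].get(provider)
    return provider, model
```

Note the interaction: overriding only `AI_PROVIDER` still picks up that provider's model from the `config.yml` map, so the two secrets can be set independently.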
@@ -580,9 +586,18 @@ Replace `'Bartender'` with your bot's Gitea username. This prevents the bot from
+### Provider Configuration
+The provider and model can be set via Gitea secrets so you don't need to edit `config.yml`:
+| Secret | Description | Example |
+|--------|-------------|---------|
+| `AI_PROVIDER` | Which LLM provider to use | `openrouter` |
+| `AI_MODEL` | Model for the active provider | `google/gemini-2.0-flash` |
+The `config.yml` values are used as fallback when secrets are not set.
 ```yaml
-# In config.yml
-provider: anthropic # openai | anthropic | azure | gemini | openrouter | ollama
+# In config.yml (fallback defaults)
+provider: openai # openai | anthropic | azure | gemini | openrouter | ollama
 # Azure OpenAI
 azure:
@@ -600,6 +615,8 @@ gemini:
 | Variable | Provider | Description |
 |----------|----------|-------------|
+| `AI_PROVIDER` | All | Override the active provider (e.g. `openrouter`) |
+| `AI_MODEL` | All | Override the model for the active provider |
 | `OPENAI_API_KEY` | OpenAI | API key |
 | `ANTHROPIC_API_KEY` | Anthropic | API key |
 | `AZURE_OPENAI_ENDPOINT` | Azure | Service endpoint URL |