Add AI_PROVIDER and AI_MODEL support
All checks were successful
CI / ci (push) Successful in 9s
Deploy / deploy-local-runner (push) Has been skipped
Deploy / deploy-ssh (push) Successful in 7s
Docker / docker (push) Successful in 6s
Security / security (push) Successful in 7s

2026-03-01 19:56:14 +01:00
parent f3851f9e96
commit 7cc5d26948
8 changed files with 60 additions and 4 deletions

@@ -52,6 +52,8 @@ jobs:
AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
AI_REVIEW_REPO: ${{ gitea.repository }}
AI_REVIEW_API_URL: https://git.hiddenden.cafe/api/v1
AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
AI_MODEL: ${{ secrets.AI_MODEL }}
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
OLLAMA_HOST: ${{ secrets.OLLAMA_HOST }}

@@ -50,6 +50,8 @@ jobs:
AI_REVIEW_REPO: ${{ gitea.repository }}
AI_REVIEW_API_URL: https://git.hiddenden.cafe/api/v1
AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
AI_MODEL: ${{ secrets.AI_MODEL }}
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
OLLAMA_HOST: ${{ secrets.OLLAMA_HOST }}

@@ -48,6 +48,8 @@ jobs:
env:
  AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
  AI_REVIEW_API_URL: https://git.hiddenden.cafe/api/v1
  AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
  AI_MODEL: ${{ secrets.AI_MODEL }}
  OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
  OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
  OLLAMA_HOST: ${{ secrets.OLLAMA_HOST }}

@@ -36,6 +36,8 @@ jobs:
AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
AI_REVIEW_REPO: ${{ gitea.repository }}
AI_REVIEW_API_URL: https://git.hiddenden.cafe/api/v1
AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
AI_MODEL: ${{ secrets.AI_MODEL }}
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
OLLAMA_HOST: ${{ secrets.OLLAMA_HOST }}

@@ -54,6 +54,8 @@ The wizard will generate workflow files, create configuration, and guide you thr
### 1. Set Repository/Organization Secrets
```
AI_PROVIDER - LLM provider: openai | openrouter | ollama | anthropic | azure | gemini
AI_MODEL - Model to use for the active provider (e.g. gpt-4.1-mini, claude-3-5-sonnet-20241022)
OPENAI_API_KEY - OpenAI API key (or use OpenRouter/Ollama)
SEARXNG_URL - (Optional) SearXNG instance URL for web search
```
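Secrets can also be set programmatically instead of through the web UI. A minimal sketch, assuming Gitea's `PUT /repos/{owner}/{repo}/actions/secrets/{name}` Actions-secrets endpoint (available in recent Gitea releases; check your instance's Swagger docs before relying on it):

```python
import json
import urllib.request

def put_secret_request(api_url, owner, repo, name, value, token):
    """Build a PUT request that creates or updates a repository
    Actions secret via the Gitea API (assumed endpoint shape)."""
    url = f"{api_url}/repos/{owner}/{repo}/actions/secrets/{name}"
    body = json.dumps({"data": value}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="PUT",
        headers={
            "Authorization": f"token {token}",
            "Content-Type": "application/json",
        },
    )

# Example: stage the AI_PROVIDER secret (send with urllib.request.urlopen).
req = put_secret_request(
    "https://git.hiddenden.cafe/api/v1", "owner", "repo",
    "AI_PROVIDER", "openrouter", "YOUR_TOKEN",
)
print(req.get_method(), req.full_url)
```

The owner/repo/token values here are placeholders; the same call with `AI_MODEL` sets the model secret.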
@@ -104,6 +106,8 @@ jobs:
AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
AI_REVIEW_REPO: ${{ gitea.repository }}
AI_REVIEW_API_URL: https://your-gitea.example.com/api/v1
AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
AI_MODEL: ${{ secrets.AI_MODEL }}
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
run: |
cd .ai-review/tools/ai-review
@@ -474,8 +478,10 @@ The bot will search the codebase, read relevant files, and provide a comprehensi
Edit `tools/ai-review/config.yml`:
```yaml
-provider: openai # openai | openrouter | ollama
+# Set via AI_PROVIDER secret — or hardcode here as fallback
+provider: openai # openai | openrouter | ollama | anthropic | azure | gemini
+# Set via AI_MODEL secret — or hardcode per provider here
model:
  openai: gpt-4.1-mini
  openrouter: anthropic/claude-3.5-sonnet
```
@@ -580,9 +586,18 @@ Replace `'Bartender'` with your bot's Gitea username. This prevents the bot from
### Provider Configuration
The provider and model can be set via Gitea secrets so you don't need to edit `config.yml`:
| Secret | Description | Example |
|--------|-------------|---------|
| `AI_PROVIDER` | Which LLM provider to use | `openrouter` |
| `AI_MODEL` | Model for the active provider | `google/gemini-2.0-flash` |
The `config.yml` values are used as fallback when secrets are not set.
```yaml
-# In config.yml
-provider: anthropic # openai | anthropic | azure | gemini | openrouter | ollama
+# In config.yml (fallback defaults)
+provider: openai # openai | anthropic | azure | gemini | openrouter | ollama

# Azure OpenAI
azure:
```
@@ -600,6 +615,8 @@ gemini:
| Variable | Provider | Description |
|----------|----------|-------------|
| `AI_PROVIDER` | All | Override the active provider (e.g. `openrouter`) |
| `AI_MODEL` | All | Override the model for the active provider |
| `OPENAI_API_KEY` | OpenAI | API key |
| `ANTHROPIC_API_KEY` | Anthropic | API key |
| `AZURE_OPENAI_ENDPOINT` | Azure | Service endpoint URL |

@@ -4,15 +4,29 @@ All configuration is managed in `tools/ai-review/config.yml`.
## Provider Settings
The provider and model can be configured via Gitea secrets, so you don't need to edit `config.yml` per deployment:
| Secret | Description | Example values |
|--------|-------------|----------------|
| `AI_PROVIDER` | Which LLM provider to use | `openai`, `openrouter`, `ollama`, `anthropic`, `azure`, `gemini` |
| `AI_MODEL` | Model for the active provider | `gpt-4.1-mini`, `claude-3-5-sonnet-20241022`, `google/gemini-2.0-flash` |
These secrets override the values in `config.yml`, which serve as fallback defaults.
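The precedence described above (secret wins, `config.yml` is the fallback) can be sketched as a small helper; `resolve_provider_model` is a hypothetical name for illustration, not part of the tool:

```python
import os

def resolve_provider_model(config, env=os.environ):
    """Return (provider, model), preferring the AI_PROVIDER/AI_MODEL
    env vars (populated from Gitea secrets) over config.yml values."""
    provider = env.get("AI_PROVIDER") or config.get("provider", "openai")
    model = env.get("AI_MODEL") or config.get("model", {}).get(provider)
    return provider, model

config = {
    "provider": "openai",
    "model": {
        "openai": "gpt-4.1-mini",
        "openrouter": "anthropic/claude-3.5-sonnet",
    },
}

# No secrets set: fall back to config.yml.
print(resolve_provider_model(config, env={}))
# Secrets set: they override both provider and model.
print(resolve_provider_model(config, env={
    "AI_PROVIDER": "openrouter",
    "AI_MODEL": "google/gemini-2.0-flash",
}))
```

Note that `AI_MODEL` applies to whichever provider ends up active, so when overriding the provider you usually want to override the model alongside it.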
```yaml
-# LLM Provider: openai | openrouter | ollama
+# LLM Provider: openai | openrouter | ollama | anthropic | azure | gemini
+# Override with the AI_PROVIDER Gitea secret.
provider: openai
# Model per provider
# Override the active provider's model with the AI_MODEL Gitea secret.
model:
  openai: gpt-4.1-mini
  openrouter: anthropic/claude-3.5-sonnet
  ollama: codellama:13b
  anthropic: claude-3-5-sonnet-20241022
  azure: gpt-4
  gemini: gemini-1.5-pro

# Generation settings
temperature: 0 # 0 = deterministic
```
@@ -166,6 +180,8 @@ These override config file settings:
| Variable | Description |
|----------|-------------|
| `AI_PROVIDER` | Override the active provider (e.g. `openrouter`, `anthropic`) |
| `AI_MODEL` | Override the model for the active provider |
| `AI_REVIEW_TOKEN` | Gitea/GitHub API token |
| `AI_REVIEW_API_URL` | API base URL (`https://api.github.com` or Gitea URL) |
| `AI_REVIEW_REPO` | Target repository (owner/repo) |

@@ -4,8 +4,11 @@
# LLM Provider Configuration
# --------------------------
# Available providers: openai | openrouter | ollama | anthropic | azure | gemini
# This value can be overridden by setting the AI_PROVIDER Gitea secret.
provider: openai
# The model to use per provider.
# Override the active provider's model by setting the AI_MODEL Gitea secret.
model:
  openai: gpt-4.1-mini
  openrouter: anthropic/claude-3.5-sonnet

@@ -332,6 +332,18 @@ def main():
    setup_logging(args.verbose)
    config = load_config(args.config)

    # Allow overriding the provider via a Gitea/CI secret (AI_PROVIDER env var)
    ai_provider = os.environ.get("AI_PROVIDER")
    if ai_provider:
        config["provider"] = ai_provider

    # Allow overriding the model via a Gitea/CI secret (AI_MODEL env var)
    # Overrides the model for whichever provider is active.
    ai_model = os.environ.get("AI_MODEL")
    if ai_model:
        provider = config.get("provider", "openai")
        config.setdefault("model", {})[provider] = ai_model

    if args.command == "pr":
        run_pr_review(args, config)
    elif args.command == "issue":
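The override logic above can be exercised in isolation; this sketch repeats the same two steps from `main()` against a minimal config dict:

```python
import os

config = {"provider": "openai", "model": {"openai": "gpt-4.1-mini"}}

# Simulate the Gitea secrets arriving as env vars in the CI job.
os.environ["AI_PROVIDER"] = "ollama"
os.environ["AI_MODEL"] = "codellama:13b"

# Same override steps as in main():
ai_provider = os.environ.get("AI_PROVIDER")
if ai_provider:
    config["provider"] = ai_provider

ai_model = os.environ.get("AI_MODEL")
if ai_model:
    provider = config.get("provider", "openai")
    config.setdefault("model", {})[provider] = ai_model

print(config)
```

The model override is keyed on the *active* provider, so setting `AI_PROVIDER` first ensures `AI_MODEL` lands under the right key while the original `openai` entry stays untouched as a fallback.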