Add AI_PROVIDER and AI_MODEL support
README.md: 23 lines changed
@@ -54,6 +54,8 @@ The wizard will generate workflow files, create configuration, and guide you thr
 ### 1. Set Repository/Organization Secrets
 
 ```
+AI_PROVIDER - LLM provider: openai | openrouter | ollama | anthropic | azure | gemini
+AI_MODEL - Model to use for the active provider (e.g. gpt-4.1-mini, claude-3-5-sonnet-20241022)
 OPENAI_API_KEY - OpenAI API key (or use OpenRouter/Ollama)
 SEARXNG_URL - (Optional) SearXNG instance URL for web search
 ```
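Since `AI_PROVIDER` only accepts the fixed set of values listed above, a startup guard can fail fast on a typo'd secret. This is a hypothetical sketch, not code from the tool itself (`validate_provider` is an invented name):

```python
# Providers documented for the AI_PROVIDER secret
ALLOWED_PROVIDERS = {"openai", "openrouter", "ollama", "anthropic", "azure", "gemini"}

def validate_provider(name: str) -> str:
    """Reject values outside the documented AI_PROVIDER set."""
    if name not in ALLOWED_PROVIDERS:
        raise ValueError(
            f"unknown AI_PROVIDER {name!r}; expected one of {sorted(ALLOWED_PROVIDERS)}"
        )
    return name

print(validate_provider("openrouter"))
```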
@@ -104,6 +106,8 @@ jobs:
 AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
 AI_REVIEW_REPO: ${{ gitea.repository }}
 AI_REVIEW_API_URL: https://your-gitea.example.com/api/v1
+AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
+AI_MODEL: ${{ secrets.AI_MODEL }}
 OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
 run: |
   cd .ai-review/tools/ai-review
@@ -474,8 +478,10 @@ The bot will search the codebase, read relevant files, and provide a comprehensi
 Edit `tools/ai-review/config.yml`:
 
 ```yaml
-provider: openai # openai | openrouter | ollama
+# Set via AI_PROVIDER secret — or hardcode here as fallback
+provider: openai # openai | openrouter | ollama | anthropic | azure | gemini
 
+# Set via AI_MODEL secret — or hardcode per provider here
 model:
   openai: gpt-4.1-mini
   openrouter: anthropic/claude-3.5-sonnet
@@ -580,9 +586,18 @@ Replace `'Bartender'` with your bot's Gitea username. This prevents the bot from
 
+### Provider Configuration
+
+The provider and model can be set via Gitea secrets so you don't need to edit `config.yml`:
+
+| Secret | Description | Example |
+|--------|-------------|---------|
+| `AI_PROVIDER` | Which LLM provider to use | `openrouter` |
+| `AI_MODEL` | Model for the active provider | `google/gemini-2.0-flash` |
+
+The `config.yml` values are used as fallback when secrets are not set.
+
 ```yaml
-# In config.yml
-provider: anthropic # openai | anthropic | azure | gemini | openrouter | ollama
+# In config.yml (fallback defaults)
+provider: openai # openai | anthropic | azure | gemini | openrouter | ollama
 
 # Azure OpenAI
 azure:
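The secret-over-config precedence added here can be sketched as follows. This is a hypothetical helper (`resolve_provider` is an invented name; the tool's actual resolution code is not part of this diff):

```python
import os

def resolve_provider(config: dict) -> tuple:
    """Prefer the AI_PROVIDER / AI_MODEL secrets (exposed as env vars);
    fall back to the values hardcoded in config.yml."""
    provider = os.environ.get("AI_PROVIDER") or config.get("provider", "openai")
    model = os.environ.get("AI_MODEL") or config.get("model", {}).get(provider, "")
    return provider, model

# Fallback defaults mirroring the config.yml snippet above
config = {
    "provider": "openai",
    "model": {"openai": "gpt-4.1-mini", "openrouter": "anthropic/claude-3.5-sonnet"},
}

os.environ.pop("AI_MODEL", None)          # ensure no stale override
os.environ["AI_PROVIDER"] = "openrouter"  # simulate the repository secret
print(resolve_provider(config))  # ('openrouter', 'anthropic/claude-3.5-sonnet')
```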
@@ -600,6 +615,8 @@ gemini:
 
 | Variable | Provider | Description |
 |----------|----------|-------------|
+| `AI_PROVIDER` | All | Override the active provider (e.g. `openrouter`) |
+| `AI_MODEL` | All | Override the model for the active provider |
 | `OPENAI_API_KEY` | OpenAI | API key |
 | `ANTHROPIC_API_KEY` | Anthropic | API key |
 | `AZURE_OPENAI_ENDPOINT` | Azure | Service endpoint URL |
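A startup check against the variable table above might look like this. The mapping and function name are hypothetical, built only from the rows visible in this hunk (the real tool may require more variables per provider, e.g. for Azure or Gemini):

```python
import os

# Per-provider required variables, taken from the table above.
REQUIRED_ENV = {
    "openai": ["OPENAI_API_KEY"],
    "anthropic": ["ANTHROPIC_API_KEY"],
    "azure": ["AZURE_OPENAI_ENDPOINT"],
}

def missing_env(provider: str) -> list:
    """List required variables that are unset or empty for the provider."""
    return [v for v in REQUIRED_ENV.get(provider, []) if not os.environ.get(v)]

os.environ["OPENAI_API_KEY"] = "sk-example"  # simulate the secret being set
print(missing_env("openai"))  # []
```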