Add AI_PROVIDER and AI_MODEL support
@@ -52,6 +52,8 @@ jobs:
           AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
           AI_REVIEW_REPO: ${{ gitea.repository }}
           AI_REVIEW_API_URL: https://git.hiddenden.cafe/api/v1
+          AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
+          AI_MODEL: ${{ secrets.AI_MODEL }}
           OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
           OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
           OLLAMA_HOST: ${{ secrets.OLLAMA_HOST }}
@@ -50,6 +50,8 @@ jobs:
           AI_REVIEW_REPO: ${{ gitea.repository }}
           AI_REVIEW_API_URL: https://git.hiddenden.cafe/api/v1
+          AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
+          AI_MODEL: ${{ secrets.AI_MODEL }}
           OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
           OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
           OLLAMA_HOST: ${{ secrets.OLLAMA_HOST }}
@@ -48,6 +48,8 @@ jobs:
         env:
           AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
           AI_REVIEW_API_URL: https://git.hiddenden.cafe/api/v1
+          AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
+          AI_MODEL: ${{ secrets.AI_MODEL }}
           OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
           OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
           OLLAMA_HOST: ${{ secrets.OLLAMA_HOST }}
@@ -36,6 +36,8 @@ jobs:
           AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
           AI_REVIEW_REPO: ${{ gitea.repository }}
           AI_REVIEW_API_URL: https://git.hiddenden.cafe/api/v1
+          AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
+          AI_MODEL: ${{ secrets.AI_MODEL }}
           OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
           OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
           OLLAMA_HOST: ${{ secrets.OLLAMA_HOST }}
README.md (23 changed lines)
@@ -54,6 +54,8 @@ The wizard will generate workflow files, create configuration, and guide you thr
 ### 1. Set Repository/Organization Secrets
 
 ```
+AI_PROVIDER - LLM provider: openai | openrouter | ollama | anthropic | azure | gemini
+AI_MODEL - Model to use for the active provider (e.g. gpt-4.1-mini, claude-3-5-sonnet-20241022)
 OPENAI_API_KEY - OpenAI API key (or use OpenRouter/Ollama)
 SEARXNG_URL - (Optional) SearXNG instance URL for web search
 ```
@@ -104,6 +106,8 @@ jobs:
           AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
           AI_REVIEW_REPO: ${{ gitea.repository }}
           AI_REVIEW_API_URL: https://your-gitea.example.com/api/v1
+          AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
+          AI_MODEL: ${{ secrets.AI_MODEL }}
           OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
         run: |
           cd .ai-review/tools/ai-review
@@ -474,8 +478,10 @@ The bot will search the codebase, read relevant files, and provide a comprehensi
 Edit `tools/ai-review/config.yml`:
 
 ```yaml
-provider: openai # openai | openrouter | ollama
+# Set via AI_PROVIDER secret — or hardcode here as fallback
+provider: openai # openai | openrouter | ollama | anthropic | azure | gemini
+
+# Set via AI_MODEL secret — or hardcode per provider here
 model:
   openai: gpt-4.1-mini
   openrouter: anthropic/claude-3.5-sonnet
@@ -580,9 +586,18 @@ Replace `'Bartender'` with your bot's Gitea username. This prevents the bot from
 ### Provider Configuration
 
+The provider and model can be set via Gitea secrets so you don't need to edit `config.yml`:
+
+| Secret | Description | Example |
+|--------|-------------|---------|
+| `AI_PROVIDER` | Which LLM provider to use | `openrouter` |
+| `AI_MODEL` | Model for the active provider | `google/gemini-2.0-flash` |
+
+The `config.yml` values are used as fallback when secrets are not set.
+
 ```yaml
-# In config.yml
-provider: anthropic # openai | anthropic | azure | gemini | openrouter | ollama
+# In config.yml (fallback defaults)
+provider: openai # openai | anthropic | azure | gemini | openrouter | ollama
 
 # Azure OpenAI
 azure:
@@ -600,6 +615,8 @@ gemini:
 
 | Variable | Provider | Description |
 |----------|----------|-------------|
+| `AI_PROVIDER` | All | Override the active provider (e.g. `openrouter`) |
+| `AI_MODEL` | All | Override the model for the active provider |
 | `OPENAI_API_KEY` | OpenAI | API key |
 | `ANTHROPIC_API_KEY` | Anthropic | API key |
 | `AZURE_OPENAI_ENDPOINT` | Azure | Service endpoint URL |
@@ -4,15 +4,29 @@ All configuration is managed in `tools/ai-review/config.yml`.
 ## Provider Settings
 
+The provider and model can be configured via Gitea secrets, so you don't need to edit `config.yml` per deployment:
+
+| Secret | Description | Example values |
+|--------|-------------|----------------|
+| `AI_PROVIDER` | Which LLM provider to use | `openai`, `openrouter`, `ollama`, `anthropic`, `azure`, `gemini` |
+| `AI_MODEL` | Model for the active provider | `gpt-4.1-mini`, `claude-3-5-sonnet-20241022`, `google/gemini-2.0-flash` |
+
+These secrets override the values in `config.yml`, which serve as fallback defaults.
+
 ```yaml
-# LLM Provider: openai | openrouter | ollama
+# LLM Provider: openai | openrouter | ollama | anthropic | azure | gemini
+# Override with the AI_PROVIDER Gitea secret.
 provider: openai
 
 # Model per provider
+# Override the active provider's model with the AI_MODEL Gitea secret.
 model:
   openai: gpt-4.1-mini
   openrouter: anthropic/claude-3.5-sonnet
   ollama: codellama:13b
+  anthropic: claude-3-5-sonnet-20241022
+  azure: gpt-4
+  gemini: gemini-1.5-pro
 
 # Generation settings
 temperature: 0 # 0 = deterministic
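The precedence documented above (secrets override `config.yml`, which supplies fallback defaults) can be sketched in a few lines of Python. `resolve_model` is an illustrative helper, not a function from this repository:

```python
import os

# Minimal sketch of the documented precedence: Gitea secrets arrive as
# environment variables and win over config.yml values, which act as
# fallback defaults. resolve_model is a hypothetical helper for illustration.
def resolve_model(config, env=os.environ):
    """Return (provider, model) after applying AI_PROVIDER / AI_MODEL."""
    provider = env.get("AI_PROVIDER") or config.get("provider", "openai")
    model = env.get("AI_MODEL") or config.get("model", {}).get(provider)
    return provider, model

cfg = {
    "provider": "openai",
    "model": {"openai": "gpt-4.1-mini", "ollama": "codellama:13b"},
}
print(resolve_model(cfg, env={}))                        # ('openai', 'gpt-4.1-mini')
print(resolve_model(cfg, env={"AI_PROVIDER": "ollama"})) # ('ollama', 'codellama:13b')
```

Note that `AI_MODEL` is looked up against whichever provider ends up active, so overriding only `AI_PROVIDER` still falls back to that provider's entry in the `model:` map.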
@@ -166,6 +180,8 @@ These override config file settings:
 
 | Variable | Description |
 |----------|-------------|
+| `AI_PROVIDER` | Override the active provider (e.g. `openrouter`, `anthropic`) |
+| `AI_MODEL` | Override the model for the active provider |
 | `AI_REVIEW_TOKEN` | Gitea/GitHub API token |
 | `AI_REVIEW_API_URL` | API base URL (`https://api.github.com` or Gitea URL) |
 | `AI_REVIEW_REPO` | Target repository (owner/repo) |
@@ -4,8 +4,11 @@
 # LLM Provider Configuration
 # --------------------------
+# Available providers: openai | openrouter | ollama | anthropic | azure | gemini
+# This value can be overridden by setting the AI_PROVIDER Gitea secret.
 provider: openai
 
 # The model to use per provider.
+# Override the active provider's model by setting the AI_MODEL Gitea secret.
 model:
   openai: gpt-4.1-mini
   openrouter: anthropic/claude-3.5-sonnet
@@ -332,6 +332,18 @@ def main():
     setup_logging(args.verbose)
     config = load_config(args.config)
 
+    # Allow overriding the provider via a Gitea/CI secret (AI_PROVIDER env var)
+    ai_provider = os.environ.get("AI_PROVIDER")
+    if ai_provider:
+        config["provider"] = ai_provider
+
+    # Allow overriding the model via a Gitea/CI secret (AI_MODEL env var).
+    # Overrides the model for whichever provider is active.
+    ai_model = os.environ.get("AI_MODEL")
+    if ai_model:
+        provider = config.get("provider", "openai")
+        config.setdefault("model", {})[provider] = ai_model
+
     if args.command == "pr":
         run_pr_review(args, config)
     elif args.command == "issue":
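The override logic added to `main()` above can be exercised standalone. This sketch factors it into a hypothetical `apply_env_overrides` helper (the function name is not from the source) so the behavior is easy to test:

```python
import os

def apply_env_overrides(config, env=os.environ):
    """Apply AI_PROVIDER / AI_MODEL env-var overrides to a config dict.

    Mirrors the logic added to main(): the provider override is applied
    first, so AI_MODEL targets the provider that is actually active.
    """
    ai_provider = env.get("AI_PROVIDER")
    if ai_provider:
        config["provider"] = ai_provider

    ai_model = env.get("AI_MODEL")
    if ai_model:
        provider = config.get("provider", "openai")
        config.setdefault("model", {})[provider] = ai_model
    return config

config = {"provider": "openai", "model": {"openai": "gpt-4.1-mini"}}
overridden = apply_env_overrides(
    config,
    env={"AI_PROVIDER": "openrouter", "AI_MODEL": "anthropic/claude-3.5-sonnet"},
)
print(overridden["provider"])             # openrouter
print(overridden["model"]["openrouter"])  # anthropic/claude-3.5-sonnet
print(overridden["model"]["openai"])      # gpt-4.1-mini (fallback untouched)
```

Writing `AI_MODEL` into the per-provider `model:` map, rather than a flat `model` key, keeps the other providers' fallback models intact when only one provider is overridden.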