diff --git a/.gitea/workflows/ai-chat.yml b/.gitea/workflows/ai-chat.yml
index 19b7918..2f51877 100644
--- a/.gitea/workflows/ai-chat.yml
+++ b/.gitea/workflows/ai-chat.yml
@@ -52,6 +52,8 @@ jobs:
           AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
           AI_REVIEW_REPO: ${{ gitea.repository }}
           AI_REVIEW_API_URL: https://git.hiddenden.cafe/api/v1
+          AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
+          AI_MODEL: ${{ secrets.AI_MODEL }}
           OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
           OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
           OLLAMA_HOST: ${{ secrets.OLLAMA_HOST }}
diff --git a/.gitea/workflows/ai-codebase-review.yml b/.gitea/workflows/ai-codebase-review.yml
index 2269688..fe20727 100644
--- a/.gitea/workflows/ai-codebase-review.yml
+++ b/.gitea/workflows/ai-codebase-review.yml
@@ -50,6 +50,8 @@ jobs:
           AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
           AI_REVIEW_REPO: ${{ gitea.repository }}
           AI_REVIEW_API_URL: https://git.hiddenden.cafe/api/v1
+          AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
+          AI_MODEL: ${{ secrets.AI_MODEL }}
           OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
           OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
           OLLAMA_HOST: ${{ secrets.OLLAMA_HOST }}
diff --git a/.gitea/workflows/ai-comment-reply.yml b/.gitea/workflows/ai-comment-reply.yml
index dc469fe..2e78cf0 100644
--- a/.gitea/workflows/ai-comment-reply.yml
+++ b/.gitea/workflows/ai-comment-reply.yml
@@ -48,6 +48,8 @@ jobs:
        env:
          AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
          AI_REVIEW_API_URL: https://git.hiddenden.cafe/api/v1
+         AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
+         AI_MODEL: ${{ secrets.AI_MODEL }}
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
          OLLAMA_HOST: ${{ secrets.OLLAMA_HOST }}
diff --git a/.gitea/workflows/ai-issue-triage.yml b/.gitea/workflows/ai-issue-triage.yml
index bb9ad80..82b8c6a 100644
--- a/.gitea/workflows/ai-issue-triage.yml
+++ b/.gitea/workflows/ai-issue-triage.yml
@@ -36,6 +36,8 @@ jobs:
           AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
           AI_REVIEW_REPO: ${{ gitea.repository }}
           AI_REVIEW_API_URL: https://git.hiddenden.cafe/api/v1
+          AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
+          AI_MODEL: ${{ secrets.AI_MODEL }}
           OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
           OPENROUTER_API_KEY: ${{ secrets.OPENROUTER_API_KEY }}
           OLLAMA_HOST: ${{ secrets.OLLAMA_HOST }}
diff --git a/README.md b/README.md
index 6b24d82..609109d 100644
--- a/README.md
+++ b/README.md
@@ -54,6 +54,8 @@ The wizard will generate workflow files, create configuration, and guide you thr
 ### 1. Set Repository/Organization Secrets
 
 ```
+AI_PROVIDER    - LLM provider: openai | openrouter | ollama | anthropic | azure | gemini
+AI_MODEL       - Model to use for the active provider (e.g. gpt-4.1-mini, claude-3-5-sonnet-20241022)
 OPENAI_API_KEY - OpenAI API key (or use OpenRouter/Ollama)
 SEARXNG_URL    - (Optional) SearXNG instance URL for web search
 ```
@@ -104,6 +106,8 @@ jobs:
          AI_REVIEW_TOKEN: ${{ secrets.AI_REVIEW_TOKEN }}
          AI_REVIEW_REPO: ${{ gitea.repository }}
          AI_REVIEW_API_URL: https://your-gitea.example.com/api/v1
+         AI_PROVIDER: ${{ secrets.AI_PROVIDER }}
+         AI_MODEL: ${{ secrets.AI_MODEL }}
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
        run: |
          cd .ai-review/tools/ai-review
@@ -474,8 +478,10 @@ The bot will search the codebase, read relevant files, and provide a comprehensi
 Edit `tools/ai-review/config.yml`:
 
 ```yaml
-provider: openai  # openai | openrouter | ollama
+# Set via AI_PROVIDER secret — or hardcode here as fallback
+provider: openai  # openai | openrouter | ollama | anthropic | azure | gemini
 
+# Set via AI_MODEL secret — or hardcode per provider here
 model:
   openai: gpt-4.1-mini
   openrouter: anthropic/claude-3.5-sonnet
@@ -580,9 +586,18 @@ Replace `'Bartender'` with your bot's Gitea username. This prevents the bot from
 
 ### Provider Configuration
 
+The provider and model can be set via Gitea secrets so you don't need to edit `config.yml`:
+
+| Secret | Description | Example |
+|--------|-------------|---------|
+| `AI_PROVIDER` | Which LLM provider to use | `openrouter` |
+| `AI_MODEL` | Model for the active provider | `google/gemini-2.0-flash` |
+
+The `config.yml` values are used as fallback when secrets are not set.
+
 ```yaml
-# In config.yml
-provider: anthropic  # openai | anthropic | azure | gemini | openrouter | ollama
+# In config.yml (fallback defaults)
+provider: openai  # openai | anthropic | azure | gemini | openrouter | ollama
 
 # Azure OpenAI
 azure:
@@ -600,6 +615,8 @@ gemini:
 
 | Variable | Provider | Description |
 |----------|----------|-------------|
+| `AI_PROVIDER` | All | Override the active provider (e.g. `openrouter`) |
+| `AI_MODEL` | All | Override the model for the active provider |
 | `OPENAI_API_KEY` | OpenAI | API key |
 | `ANTHROPIC_API_KEY` | Anthropic | API key |
 | `AZURE_OPENAI_ENDPOINT` | Azure | Service endpoint URL |
diff --git a/docs/configuration.md b/docs/configuration.md
index e555927..5858c12 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -4,15 +4,29 @@ All configuration is managed in `tools/ai-review/config.yml`.
 
 ## Provider Settings
 
+The provider and model can be configured via Gitea secrets, so you don't need to edit `config.yml` per deployment:
+
+| Secret | Description | Example values |
+|--------|-------------|----------------|
+| `AI_PROVIDER` | Which LLM provider to use | `openai`, `openrouter`, `ollama`, `anthropic`, `azure`, `gemini` |
+| `AI_MODEL` | Model for the active provider | `gpt-4.1-mini`, `claude-3-5-sonnet-20241022`, `google/gemini-2.0-flash` |
+
+These secrets override the values in `config.yml`, which serve as fallback defaults.
+
 ```yaml
-# LLM Provider: openai | openrouter | ollama
+# LLM Provider: openai | openrouter | ollama | anthropic | azure | gemini
+# Override with the AI_PROVIDER Gitea secret.
 provider: openai
 
 # Model per provider
+# Override the active provider's model with the AI_MODEL Gitea secret.
 model:
   openai: gpt-4.1-mini
   openrouter: anthropic/claude-3.5-sonnet
   ollama: codellama:13b
+  anthropic: claude-3-5-sonnet-20241022
+  azure: gpt-4
+  gemini: gemini-1.5-pro
 
 # Generation settings
 temperature: 0  # 0 = deterministic
@@ -166,6 +180,8 @@ These override config file settings:
 
 | Variable | Description |
 |----------|-------------|
+| `AI_PROVIDER` | Override the active provider (e.g. `openrouter`, `anthropic`) |
+| `AI_MODEL` | Override the model for the active provider |
 | `AI_REVIEW_TOKEN` | Gitea/GitHub API token |
 | `AI_REVIEW_API_URL` | API base URL (`https://api.github.com` or Gitea URL) |
 | `AI_REVIEW_REPO` | Target repository (owner/repo) |
diff --git a/tools/ai-review/config.yml b/tools/ai-review/config.yml
index 82ae042..0a1f68d 100644
--- a/tools/ai-review/config.yml
+++ b/tools/ai-review/config.yml
@@ -4,8 +4,11 @@
 # LLM Provider Configuration
 # --------------------------
 # Available providers: openai | openrouter | ollama | anthropic | azure | gemini
+# This value can be overridden by setting the AI_PROVIDER Gitea secret.
 provider: openai
 
+# The model to use per provider.
+# Override the active provider's model by setting the AI_MODEL Gitea secret.
 model:
   openai: gpt-4.1-mini
   openrouter: anthropic/claude-3.5-sonnet
diff --git a/tools/ai-review/main.py b/tools/ai-review/main.py
index 9318fc5..32039bc 100644
--- a/tools/ai-review/main.py
+++ b/tools/ai-review/main.py
@@ -332,6 +332,18 @@ def main():
     setup_logging(args.verbose)
     config = load_config(args.config)
 
+    # Allow overriding the provider via a Gitea/CI secret (AI_PROVIDER env var)
+    ai_provider = os.environ.get("AI_PROVIDER")
+    if ai_provider:
+        config["provider"] = ai_provider
+
+    # Allow overriding the model via a Gitea/CI secret (AI_MODEL env var)
+    # Overrides the model for whichever provider is active.
+    ai_model = os.environ.get("AI_MODEL")
+    if ai_model:
+        provider = config.get("provider", "openai")
+        config.setdefault("model", {})[provider] = ai_model
+
     if args.command == "pr":
         run_pr_review(args, config)
     elif args.command == "issue":
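
The precedence the `main.py` hunk introduces (CI-secret env vars win, `config.yml` values remain the fallback, and `AI_MODEL` targets whichever provider ends up active) can be sketched in isolation. This is a minimal standalone sketch, not the tool's actual code path: the dict literal stands in for whatever `load_config` returns.

```python
import os


def apply_env_overrides(config):
    """Apply AI_PROVIDER / AI_MODEL env-var overrides to a config dict.

    Mirrors the precedence in the main.py hunk: env vars (set from Gitea
    secrets) win over config.yml values, which stay as fallback defaults.
    """
    provider = os.environ.get("AI_PROVIDER")
    if provider:
        config["provider"] = provider

    model = os.environ.get("AI_MODEL")
    if model:
        # The model override applies to whichever provider is now active,
        # including one just selected by AI_PROVIDER above.
        active = config.get("provider", "openai")
        config.setdefault("model", {})[active] = model
    return config


# Example: config.yml defaults overridden by the two secrets.
os.environ["AI_PROVIDER"] = "openrouter"
os.environ["AI_MODEL"] = "google/gemini-2.0-flash"
cfg = apply_env_overrides({"provider": "openai", "model": {"openai": "gpt-4.1-mini"}})
print(cfg["provider"])             # openrouter
print(cfg["model"]["openrouter"])  # google/gemini-2.0-flash
print(cfg["model"]["openai"])      # gpt-4.1-mini (fallback left untouched)
```

Note the ordering matters: `AI_PROVIDER` is applied first so that `AI_MODEL` keys the model under the overridden provider rather than the `config.yml` default.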