Compare commits


No commits in common. "ea9b55e4c384f5bfa0d63dc04f95739dd6ac1279" and "15fa6cef49a424f28c21a0d79bf2a3a3f42f4f1a" have entirely different histories.

3 changed files with 59 additions and 164 deletions


@@ -7,20 +7,16 @@
![Test program/library](https://gitfub.space/Jmaa/aider-gitea/actions/workflows/python-test.yml/badge.svg)
A code automation tool that integrates Gitea with AI assistants to automatically solve issues.
A code automation tool that integrates Gitea with Aider to automatically solve issues.
This program monitors your [Gitea](https://about.gitea.com/) repository for issues with the 'aider' label.
When such an issue is found, it:
1. Creates a new branch.
2. Invokes an AI assistant (Aider or Claude Code) to solve the issue using a Large-Language Model.
2. Invokes [Aider](https://aider.chat/) to solve the issue using a Large-Language Model.
3. Runs tests and code quality checks.
4. Creates a pull request with the solution.
The tool automatically selects the appropriate AI assistant based on the specified model:
- **Aider**: Used for non-Anthropic models (e.g., GPT, Ollama, Gemini)
- **Claude Code**: Used for Anthropic models (e.g., Claude, Sonnet, Haiku, Opus)
Inspired by [the AI workflows](https://github.com/oscoreio/ai-workflows/)
project.
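Put together, the monitoring cycle described above could be driven by a loop along the following lines. This is a rough, illustrative sketch only: `fetch_labelled_issues` is a hypothetical helper, and the call to `solve_issue_in_repository` mirrors the Python API example further down rather than any confirmed internal entry point.
```python
import time
from pathlib import Path

from aider_gitea import solve_issue_in_repository

def watch_repository(repository_config, checkout: Path, fetch_labelled_issues) -> None:
    """Poll for 'aider'-labelled issues and hand each one to the solver."""
    while True:
        # `fetch_labelled_issues` is a hypothetical stand-in for the Gitea query.
        for number, title, body in fetch_labelled_issues(label='aider'):
            solve_issue_in_repository(
                repository_config,
                checkout,
                f'issue-{number}',  # name of the new working branch
                title,
                body,
                str(number),
            )
        time.sleep(60)  # illustrative poll interval
```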
@@ -36,109 +32,48 @@ have the following permissions:
### Command Line
```bash
# Run with default settings (uses Aider)
python -m aider_gitea --aider-model gpt-4
# Use Claude Code with Anthropic models
python -m aider_gitea --aider-model claude-3-sonnet
python -m aider_gitea --aider-model claude-3-haiku
python -m aider_gitea --aider-model anthropic/claude-3-opus
# Use Aider with various models
python -m aider_gitea --aider-model gpt-4
python -m aider_gitea --aider-model ollama/llama3
python -m aider_gitea --aider-model gemini-pro
# Run with default settings
python -m aider_gitea
# Specify custom repository and owner
python -m aider_gitea --owner myorg --repo myproject --aider-model claude-3-sonnet
python -m aider_gitea --owner myorg --repo myproject
# Use a custom Gitea URL
python -m aider_gitea --gitea-url https://gitea.example.com --aider-model gpt-4
python -m aider_gitea --gitea-url https://gitea.example.com
# Specify a different base branch
python -m aider_gitea --base-branch develop --aider-model claude-3-haiku
python -m aider_gitea --base-branch develop
```
### AI Assistant Selection
The tool automatically routes to the appropriate AI assistant based on the model name:
**Claude Code Integration (Anthropic Models):**
- Model names containing: `claude`, `anthropic`, `sonnet`, `haiku`, `opus`
- Examples: `claude-3-sonnet`, `claude-3-haiku`, `anthropic/claude-3-opus`
- Requires: `ANTHROPIC_API_KEY` environment variable
**Aider Integration (All Other Models):**
- Any model not matching Anthropic patterns
- Examples: `gpt-4`, `ollama/llama3`, `gemini-pro`, `mistral-7b`
- Requires: `LLM_API_KEY` environment variable
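The routing rule above boils down to a substring check on the model name. A minimal sketch (the helper names here are illustrative, not the project's actual functions):
```python
ANTHROPIC_MARKERS = ('claude', 'anthropic', 'sonnet', 'haiku', 'opus')

def is_anthropic_model(model: str) -> bool:
    """True when the model name matches one of the Anthropic patterns."""
    lowered = model.lower()
    return any(marker in lowered for marker in ANTHROPIC_MARKERS)

def select_solver(model: str) -> str:
    # Anthropic-style names go to Claude Code; everything else goes to Aider.
    return 'Claude Code' if is_anthropic_model(model) else 'Aider'

assert select_solver('anthropic/claude-3-opus') == 'Claude Code'
assert select_solver('ollama/llama3') == 'Aider'
```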
### Python API
```python
from aider_gitea import solve_issue_in_repository, create_code_solver
from aider_gitea import solve_issue_in_repository
from pathlib import Path
import argparse
# Solve an issue programmatically with automatic AI assistant selection
repository_config = RepositoryConfig(
# Solve an issue programmatically
args = argparse.Namespace(
gitea_url="https://gitea.example.com",
owner="myorg",
repo="myproject",
base_branch="main"
)
# Set the model to control which AI assistant is used
import aider_gitea
aider_gitea.CODE_MODEL = "claude-3-sonnet" # Will use Claude Code
# aider_gitea.CODE_MODEL = "gpt-4" # Will use Aider
code_solver = create_code_solver() # Automatically selects based on model
solve_issue_in_repository(
repository_config,
args,
Path("/path/to/repo"),
"issue-123-fix-bug",
"Fix critical bug",
"The application crashes when processing large files",
"123",
gitea_client,
code_solver
"123"
)
```
### Environment Configuration
The tool uses environment variables for sensitive information:
**Required for all setups:**
- `GITEA_TOKEN`: Your Gitea API token
**For Aider (non-Anthropic models):**
- `LLM_API_KEY`: API key for the language model (OpenAI, Ollama, etc.)
**For Claude Code (Anthropic models):**
- `ANTHROPIC_API_KEY`: Your Anthropic API key for Claude models
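For example, a shell setup that covers both back ends might look like this (all values are placeholders):
```bash
# Required for all setups
export GITEA_TOKEN="<your-gitea-api-token>"

# For Aider (non-Anthropic models)
export LLM_API_KEY="<your-llm-provider-key>"

# For Claude Code (Anthropic models)
export ANTHROPIC_API_KEY="<your-anthropic-key>"

python -m aider_gitea --aider-model claude-3-sonnet
```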
### Model Examples
**Anthropic Models (→ Claude Code):**
```bash
--aider-model claude-3-sonnet
--aider-model claude-3-haiku
--aider-model claude-3-opus
--aider-model anthropic/claude-3-sonnet
```
**Non-Anthropic Models (→ Aider):**
```bash
--aider-model gpt-4
--aider-model gpt-3.5-turbo
--aider-model ollama/llama3
--aider-model ollama/codellama
--aider-model gemini-pro
--aider-model mistral-7b
```
- `LLM_API_KEY`: API key for the language model used by Aider
```
## Dependencies


@@ -346,7 +346,7 @@ class AiderCodeSolver(CodeSolverStrategy):
aider_command = self._create_aider_command(issue_content)
aider_did_not_crash = run_cmd(
aider_command,
cwd=repository_path,
repository_path,
check=False,
)
if not aider_did_not_crash:
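For context, `run_cmd` is used here as a thin wrapper that takes the working directory as its second positional argument and reports success as a boolean. A minimal sketch of such a helper, assuming it wraps `subprocess.run` (an assumption, not the project's actual implementation):
```python
import subprocess
from pathlib import Path

def run_cmd(cmd: list[str], cwd: Path, check: bool = True) -> bool:
    """Run a command in the given directory; return True when it exits cleanly."""
    # With check=True a non-zero exit raises CalledProcessError instead.
    result = subprocess.run(cmd, cwd=cwd, check=check)
    return result.returncode == 0
```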
@@ -368,11 +368,9 @@ class ClaudeCodeSolver(CodeSolverStrategy):
'claude',
'-p',
'--output-format',
'stream-json',
#'--max-turns', '100',
'--debug',
'--verbose',
'--dangerously-skip-permissions',
'json',
'--max-turns',
'10',
]
if CODE_MODEL:
@@ -383,6 +381,13 @@ class ClaudeCodeSolver(CodeSolverStrategy):
def solve_issue_round(self, repository_path: Path, issue_content: str) -> bool:
"""Solve an issue using Claude Code."""
import json
import os
# Set Anthropic API key environment variable
env = os.environ.copy()
env['ANTHROPIC_API_KEY'] = secrets.anthropic_api_key()
# Prepare the issue prompt for Claude Code
enhanced_issue = CLAUDE_CODE_MESSAGE_FORMAT.format(
issue=issue_content,
@@ -394,12 +399,31 @@ class ClaudeCodeSolver(CodeSolverStrategy):
claude_command = self._create_claude_command(enhanced_issue)
# Run Claude Code
run_cmd(
result = subprocess.run(
claude_command,
cwd=repository_path,
env=env,
capture_output=True,
text=True,
check=False,
)
if result.returncode != 0:
logger.error('Claude Code failed with return code %d', result.returncode)
logger.error('stderr: %s', result.stderr)
return False
# Parse response if it's JSON
try:
if result.stdout.strip():
response_data = json.loads(result.stdout)
logger.info(
'Claude Code response: %s',
response_data.get('text', 'No text field'),
)
except json.JSONDecodeError:
logger.info('Claude Code response (non-JSON): %s', result.stdout[:500])
# Run post-solver cleanup
run_post_solver_cleanup(repository_path, 'Claude Code')


@@ -12,20 +12,16 @@ PACKAGE_NAME = 'aider_gitea'
PACKAGE_DESCRIPTION = """
Aider Gitea.
A code automation tool that integrates Gitea with AI assistants to automatically solve issues.
A code automation tool that integrates Gitea with Aider to automatically solve issues.
This program monitors your [Gitea](https://about.gitea.com/) repository for issues with the 'aider' label.
When such an issue is found, it:
1. Creates a new branch.
2. Invokes an AI assistant (Aider or Claude Code) to solve the issue using a Large-Language Model.
2. Invokes [Aider](https://aider.chat/) to solve the issue using a Large-Language Model.
3. Runs tests and code quality checks.
4. Creates a pull request with the solution.
The tool automatically selects the appropriate AI assistant based on the specified model:
- **Aider**: Used for non-Anthropic models (e.g., GPT, Ollama, Gemini)
- **Claude Code**: Used for Anthropic models (e.g., Claude, Sonnet, Haiku, Opus)
Inspired by [the AI workflows](https://github.com/oscoreio/ai-workflows/)
project.
@@ -41,114 +37,53 @@ have the following permissions:
### Command Line
```bash
# Run with default settings (uses Aider)
python -m aider_gitea --aider-model gpt-4
# Use Claude Code with Anthropic models
python -m aider_gitea --aider-model claude-3-sonnet
python -m aider_gitea --aider-model claude-3-haiku
python -m aider_gitea --aider-model anthropic/claude-3-opus
# Use Aider with various models
python -m aider_gitea --aider-model gpt-4
python -m aider_gitea --aider-model ollama/llama3
python -m aider_gitea --aider-model gemini-pro
# Run with default settings
python -m aider_gitea
# Specify custom repository and owner
python -m aider_gitea --owner myorg --repo myproject --aider-model claude-3-sonnet
python -m aider_gitea --owner myorg --repo myproject
# Use a custom Gitea URL
python -m aider_gitea --gitea-url https://gitea.example.com --aider-model gpt-4
python -m aider_gitea --gitea-url https://gitea.example.com
# Specify a different base branch
python -m aider_gitea --base-branch develop --aider-model claude-3-haiku
python -m aider_gitea --base-branch develop
```
### AI Assistant Selection
The tool automatically routes to the appropriate AI assistant based on the model name:
**Claude Code Integration (Anthropic Models):**
- Model names containing: `claude`, `anthropic`, `sonnet`, `haiku`, `opus`
- Examples: `claude-3-sonnet`, `claude-3-haiku`, `anthropic/claude-3-opus`
- Requires: `ANTHROPIC_API_KEY` environment variable
**Aider Integration (All Other Models):**
- Any model not matching Anthropic patterns
- Examples: `gpt-4`, `ollama/llama3`, `gemini-pro`, `mistral-7b`
- Requires: `LLM_API_KEY` environment variable
### Python API
```python
from aider_gitea import solve_issue_in_repository, create_code_solver
from aider_gitea import solve_issue_in_repository
from pathlib import Path
import argparse
# Solve an issue programmatically with automatic AI assistant selection
repository_config = RepositoryConfig(
# Solve an issue programmatically
args = argparse.Namespace(
gitea_url="https://gitea.example.com",
owner="myorg",
repo="myproject",
base_branch="main"
)
# Set the model to control which AI assistant is used
import aider_gitea
aider_gitea.CODE_MODEL = "claude-3-sonnet" # Will use Claude Code
# aider_gitea.CODE_MODEL = "gpt-4" # Will use Aider
code_solver = create_code_solver() # Automatically selects based on model
solve_issue_in_repository(
repository_config,
args,
Path("/path/to/repo"),
"issue-123-fix-bug",
"Fix critical bug",
"The application crashes when processing large files",
"123",
gitea_client,
code_solver
"123"
)
```
### Environment Configuration
The tool uses environment variables for sensitive information:
**Required for all setups:**
- `GITEA_TOKEN`: Your Gitea API token
**For Aider (non-Anthropic models):**
- `LLM_API_KEY`: API key for the language model (OpenAI, Ollama, etc.)
**For Claude Code (Anthropic models):**
- `ANTHROPIC_API_KEY`: Your Anthropic API key for Claude models
### Model Examples
**Anthropic Models (→ Claude Code):**
```bash
--aider-model claude-3-sonnet
--aider-model claude-3-haiku
--aider-model claude-3-opus
--aider-model anthropic/claude-3-sonnet
```
**Non-Anthropic Models (→ Aider):**
```bash
--aider-model gpt-4
--aider-model gpt-3.5-turbo
--aider-model ollama/llama3
--aider-model ollama/codellama
--aider-model gemini-pro
--aider-model mistral-7b
```
- `LLM_API_KEY`: API key for the language model used by Aider
```
""".strip()
PACKAGE_DESCRIPTION_SHORT = """
A code automation tool that integrates Gitea with AI assistants to automatically solve issues.""".strip()
A code automation tool that integrates Gitea with Aider to automatically solve issues.""".strip()
def parse_version_file(text: str) -> str:
@@ -174,6 +109,7 @@ def find_python_packages() -> list[str]:
print(f'Found following packages: {packages}')
return sorted(packages)
with open(PACKAGE_NAME + '/_version.py') as f:
version = parse_version_file(f.read())
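`parse_version_file` presumably pulls the version string out of `_version.py`. A minimal sketch, assuming the conventional `__version__ = '...'` assignment (the regex is an assumption, not the project's actual parser):
```python
import re

def parse_version_file(text: str) -> str:
    """Extract the version string from a _version.py-style file."""
    match = re.search(r"__version__\s*=\s*['\"]([^'\"]+)['\"]", text)
    if match is None:
        raise ValueError('No __version__ assignment found')
    return match.group(1)

assert parse_version_file("__version__ = '1.2.3'") == '1.2.3'
```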