Compare commits

...

3 Commits

- ea9b55e4c3 (2025-06-09 13:00:44 +02:00): Misc
  - Some checks failed: Run Python tests (through Pytest) / Test (push) failing after 25s; Verify Python project can be installed, loaded and have version checked / Test (push) successful in 23s
- d0d22a8ac4 (2025-06-09 02:02:23 +02:00): 🤖 Repository layout updated to latest version
  - This commit was automatically generated by [a script](https://gitfub.space/Jmaa/repo-manager)
- 5d28388bc3 (2025-06-09 01:55:36 +02:00): Removed unneeded code
3 changed files with 164 additions and 59 deletions


@@ -7,16 +7,20 @@
 ![Test program/library](https://gitfub.space/Jmaa/aider-gitea/actions/workflows/python-test.yml/badge.svg)
-A code automation tool that integrates Gitea with Aider to automatically solve issues.
+A code automation tool that integrates Gitea with AI assistants to automatically solve issues.
 This program monitors your [Gitea](https://about.gitea.com/) repository for issues with the 'aider' label.
 When such an issue is found, it:
 1. Creates a new branch.
-2. Invokes [Aider](https://aider.chat/) to solve the issue using a Large-Language Model.
+2. Invokes an AI assistant (Aider or Claude Code) to solve the issue using a Large-Language Model.
 3. Runs tests and code quality checks.
 4. Creates a pull request with the solution.
+The tool automatically selects the appropriate AI assistant based on the specified model:
+- **Aider**: Used for non-Anthropic models (e.g., GPT, Ollama, Gemini)
+- **Claude Code**: Used for Anthropic models (e.g., Claude, Sonnet, Haiku, Opus)
 Inspired by [the AI workflows](https://github.com/oscoreio/ai-workflows/)
 project.
@@ -32,48 +36,109 @@ have the following permissions:
 ### Command Line
 ```bash
-# Run with default settings
-python -m aider_gitea
+# Run with default settings (uses Aider)
+python -m aider_gitea --aider-model gpt-4
+# Use Claude Code with Anthropic models
+python -m aider_gitea --aider-model claude-3-sonnet
+python -m aider_gitea --aider-model claude-3-haiku
+python -m aider_gitea --aider-model anthropic/claude-3-opus
+# Use Aider with various models
+python -m aider_gitea --aider-model gpt-4
+python -m aider_gitea --aider-model ollama/llama3
+python -m aider_gitea --aider-model gemini-pro
 # Specify custom repository and owner
-python -m aider_gitea --owner myorg --repo myproject
+python -m aider_gitea --owner myorg --repo myproject --aider-model claude-3-sonnet
 # Use a custom Gitea URL
-python -m aider_gitea --gitea-url https://gitea.example.com
+python -m aider_gitea --gitea-url https://gitea.example.com --aider-model gpt-4
 # Specify a different base branch
-python -m aider_gitea --base-branch develop
+python -m aider_gitea --base-branch develop --aider-model claude-3-haiku
 ```
+### AI Assistant Selection
+The tool automatically routes to the appropriate AI assistant based on the model name:
+**Claude Code Integration (Anthropic Models):**
+- Model names containing: `claude`, `anthropic`, `sonnet`, `haiku`, `opus`
+- Examples: `claude-3-sonnet`, `claude-3-haiku`, `anthropic/claude-3-opus`
+- Requires: `ANTHROPIC_API_KEY` environment variable
+**Aider Integration (All Other Models):**
+- Any model not matching Anthropic patterns
+- Examples: `gpt-4`, `ollama/llama3`, `gemini-pro`, `mistral-7b`
+- Requires: `LLM_API_KEY` environment variable
 ### Python API
 ```python
-from aider_gitea import solve_issue_in_repository
+from aider_gitea import solve_issue_in_repository, create_code_solver
 from pathlib import Path
-import argparse
-# Solve an issue programmatically
-args = argparse.Namespace(
+# Solve an issue programmatically with automatic AI assistant selection
+repository_config = RepositoryConfig(
     gitea_url="https://gitea.example.com",
     owner="myorg",
     repo="myproject",
     base_branch="main"
 )
+# Set the model to control which AI assistant is used
+import aider_gitea
+aider_gitea.CODE_MODEL = "claude-3-sonnet"  # Will use Claude Code
+# aider_gitea.CODE_MODEL = "gpt-4"  # Will use Aider
+code_solver = create_code_solver()  # Automatically selects based on model
 solve_issue_in_repository(
-    args,
+    repository_config,
     Path("/path/to/repo"),
     "issue-123-fix-bug",
     "Fix critical bug",
     "The application crashes when processing large files",
-    "123"
+    "123",
+    gitea_client,
+    code_solver
 )
 ```
 ### Environment Configuration
 The tool uses environment variables for sensitive information:
+**Required for all setups:**
 - `GITEA_TOKEN`: Your Gitea API token
-- `LLM_API_KEY`: API key for the language model used by Aider
+**For Aider (non-Anthropic models):**
+- `LLM_API_KEY`: API key for the language model (OpenAI, Ollama, etc.)
+**For Claude Code (Anthropic models):**
+- `ANTHROPIC_API_KEY`: Your Anthropic API key for Claude models
+### Model Examples
+**Anthropic Models (→ Claude Code):**
+```bash
+--aider-model claude-3-sonnet
+--aider-model claude-3-haiku
+--aider-model claude-3-opus
+--aider-model anthropic/claude-3-sonnet
+```
+**Non-Anthropic Models (→ Aider):**
+```bash
+--aider-model gpt-4
+--aider-model gpt-3.5-turbo
+--aider-model ollama/llama3
+--aider-model ollama/codellama
+--aider-model gemini-pro
+--aider-model mistral-7b
+```
 ```
 ## Dependencies
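The assistant-selection rule documented in the hunk above can be illustrated with a short sketch. The names `ANTHROPIC_MARKERS`, `is_anthropic_model` and `pick_solver` below are illustrative assumptions only; the repository's actual routing sits behind `create_code_solver()` and may be implemented differently.

```python
# Minimal sketch of the model-name routing described in the README hunk above.
# All names here are illustrative assumptions, not the project's real internals.
ANTHROPIC_MARKERS = ('claude', 'anthropic', 'sonnet', 'haiku', 'opus')


def is_anthropic_model(model: str) -> bool:
    """Return True when the model name matches one of the Anthropic patterns."""
    lowered = model.lower()
    return any(marker in lowered for marker in ANTHROPIC_MARKERS)


def pick_solver(model: str) -> str:
    """Route Anthropic-style model names to Claude Code, everything else to Aider."""
    return 'Claude Code' if is_anthropic_model(model) else 'Aider'


assert pick_solver('claude-3-sonnet') == 'Claude Code'
assert pick_solver('anthropic/claude-3-opus') == 'Claude Code'
assert pick_solver('gpt-4') == 'Aider'
assert pick_solver('ollama/llama3') == 'Aider'
```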


@@ -346,7 +346,7 @@ class AiderCodeSolver(CodeSolverStrategy):
         aider_command = self._create_aider_command(issue_content)
         aider_did_not_crash = run_cmd(
             aider_command,
-            repository_path,
+            cwd=repository_path,
             check=False,
         )
         if not aider_did_not_crash:
@@ -368,9 +368,11 @@ class ClaudeCodeSolver(CodeSolverStrategy):
             'claude',
             '-p',
             '--output-format',
-            'json',
-            '--max-turns',
-            '10',
+            'stream-json',
+            #'--max-turns', '100',
+            '--debug',
+            '--verbose',
+            '--dangerously-skip-permissions',
         ]
         if CODE_MODEL:
@@ -381,13 +383,6 @@ class ClaudeCodeSolver(CodeSolverStrategy):
     def solve_issue_round(self, repository_path: Path, issue_content: str) -> bool:
         """Solve an issue using Claude Code."""
-        import json
-        import os
-        # Set Anthropic API key environment variable
-        env = os.environ.copy()
-        env['ANTHROPIC_API_KEY'] = secrets.anthropic_api_key()
         # Prepare the issue prompt for Claude Code
         enhanced_issue = CLAUDE_CODE_MESSAGE_FORMAT.format(
             issue=issue_content,
@@ -399,31 +394,12 @@ class ClaudeCodeSolver(CodeSolverStrategy):
         claude_command = self._create_claude_command(enhanced_issue)
         # Run Claude Code
-        result = subprocess.run(
+        run_cmd(
             claude_command,
             cwd=repository_path,
-            env=env,
-            capture_output=True,
-            text=True,
             check=False,
         )
-        if result.returncode != 0:
-            logger.error('Claude Code failed with return code %d', result.returncode)
-            logger.error('stderr: %s', result.stderr)
-            return False
-        # Parse response if it's JSON
-        try:
-            if result.stdout.strip():
-                response_data = json.loads(result.stdout)
-                logger.info(
-                    'Claude Code response: %s',
-                    response_data.get('text', 'No text field'),
-                )
-        except json.JSONDecodeError:
-            logger.info('Claude Code response (non-JSON): %s', result.stdout[:500])
         # Run post-solver cleanup
         run_post_solver_cleanup(repository_path, 'Claude Code')
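The hunks above replace a direct `subprocess.run(...)` call with the project's `run_cmd` helper, invoked as `run_cmd(command, cwd=..., check=False)` and, in the Aider path, treated as a success flag. A minimal sketch consistent with that call shape, under the assumption that it is a thin wrapper over `subprocess.run`, is shown below; the repository's actual helper may behave differently (for example, it may log or stream output).

```python
# Hedged sketch of a run_cmd-style wrapper matching how the diff above calls it:
# positional command list, cwd= keyword, check flag, truthy result on success.
# The real helper in this repository may differ.
import subprocess
from pathlib import Path


def run_cmd(cmd: list[str], cwd: Path | None = None, check: bool = True) -> bool:
    """Run a command in cwd and report success as a boolean."""
    result = subprocess.run(cmd, cwd=cwd, check=check)
    return result.returncode == 0


# Example mirroring the call site above (command contents are illustrative):
# run_cmd(['claude', '-p', '--output-format', 'stream-json'], cwd=Path('.'), check=False)
```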


@@ -12,16 +12,20 @@ PACKAGE_NAME = 'aider_gitea'
 PACKAGE_DESCRIPTION = """
 Aider Gitea.
-A code automation tool that integrates Gitea with Aider to automatically solve issues.
+A code automation tool that integrates Gitea with AI assistants to automatically solve issues.
 This program monitors your [Gitea](https://about.gitea.com/) repository for issues with the 'aider' label.
 When such an issue is found, it:
 1. Creates a new branch.
-2. Invokes [Aider](https://aider.chat/) to solve the issue using a Large-Language Model.
+2. Invokes an AI assistant (Aider or Claude Code) to solve the issue using a Large-Language Model.
 3. Runs tests and code quality checks.
 4. Creates a pull request with the solution.
+The tool automatically selects the appropriate AI assistant based on the specified model:
+- **Aider**: Used for non-Anthropic models (e.g., GPT, Ollama, Gemini)
+- **Claude Code**: Used for Anthropic models (e.g., Claude, Sonnet, Haiku, Opus)
 Inspired by [the AI workflows](https://github.com/oscoreio/ai-workflows/)
 project.
@@ -37,53 +41,114 @@ have the following permissions:
 ### Command Line
 ```bash
-# Run with default settings
-python -m aider_gitea
+# Run with default settings (uses Aider)
+python -m aider_gitea --aider-model gpt-4
+# Use Claude Code with Anthropic models
+python -m aider_gitea --aider-model claude-3-sonnet
+python -m aider_gitea --aider-model claude-3-haiku
+python -m aider_gitea --aider-model anthropic/claude-3-opus
+# Use Aider with various models
+python -m aider_gitea --aider-model gpt-4
+python -m aider_gitea --aider-model ollama/llama3
+python -m aider_gitea --aider-model gemini-pro
 # Specify custom repository and owner
-python -m aider_gitea --owner myorg --repo myproject
+python -m aider_gitea --owner myorg --repo myproject --aider-model claude-3-sonnet
 # Use a custom Gitea URL
-python -m aider_gitea --gitea-url https://gitea.example.com
+python -m aider_gitea --gitea-url https://gitea.example.com --aider-model gpt-4
 # Specify a different base branch
-python -m aider_gitea --base-branch develop
+python -m aider_gitea --base-branch develop --aider-model claude-3-haiku
 ```
+### AI Assistant Selection
+The tool automatically routes to the appropriate AI assistant based on the model name:
+**Claude Code Integration (Anthropic Models):**
+- Model names containing: `claude`, `anthropic`, `sonnet`, `haiku`, `opus`
+- Examples: `claude-3-sonnet`, `claude-3-haiku`, `anthropic/claude-3-opus`
+- Requires: `ANTHROPIC_API_KEY` environment variable
+**Aider Integration (All Other Models):**
+- Any model not matching Anthropic patterns
+- Examples: `gpt-4`, `ollama/llama3`, `gemini-pro`, `mistral-7b`
+- Requires: `LLM_API_KEY` environment variable
 ### Python API
 ```python
-from aider_gitea import solve_issue_in_repository
+from aider_gitea import solve_issue_in_repository, create_code_solver
 from pathlib import Path
-import argparse
-# Solve an issue programmatically
-args = argparse.Namespace(
+# Solve an issue programmatically with automatic AI assistant selection
+repository_config = RepositoryConfig(
     gitea_url="https://gitea.example.com",
     owner="myorg",
     repo="myproject",
     base_branch="main"
 )
+# Set the model to control which AI assistant is used
+import aider_gitea
+aider_gitea.CODE_MODEL = "claude-3-sonnet"  # Will use Claude Code
+# aider_gitea.CODE_MODEL = "gpt-4"  # Will use Aider
+code_solver = create_code_solver()  # Automatically selects based on model
 solve_issue_in_repository(
-    args,
+    repository_config,
     Path("/path/to/repo"),
     "issue-123-fix-bug",
     "Fix critical bug",
     "The application crashes when processing large files",
-    "123"
+    "123",
+    gitea_client,
+    code_solver
 )
 ```
 ### Environment Configuration
 The tool uses environment variables for sensitive information:
+**Required for all setups:**
 - `GITEA_TOKEN`: Your Gitea API token
-- `LLM_API_KEY`: API key for the language model used by Aider
+**For Aider (non-Anthropic models):**
+- `LLM_API_KEY`: API key for the language model (OpenAI, Ollama, etc.)
+**For Claude Code (Anthropic models):**
+- `ANTHROPIC_API_KEY`: Your Anthropic API key for Claude models
+### Model Examples
+**Anthropic Models (→ Claude Code):**
+```bash
+--aider-model claude-3-sonnet
+--aider-model claude-3-haiku
+--aider-model claude-3-opus
+--aider-model anthropic/claude-3-sonnet
+```
+**Non-Anthropic Models (→ Aider):**
+```bash
+--aider-model gpt-4
+--aider-model gpt-3.5-turbo
+--aider-model ollama/llama3
+--aider-model ollama/codellama
+--aider-model gemini-pro
+--aider-model mistral-7b
+```
 ```
 """.strip()
 PACKAGE_DESCRIPTION_SHORT = """
-A code automation tool that integrates Gitea with Aider to automatically solve issues.""".strip()
+A code automation tool that integrates Gitea with AI assistants to automatically solve issues.""".strip()
 def parse_version_file(text: str) -> str:
@@ -109,7 +174,6 @@ def find_python_packages() -> list[str]:
     print(f'Found following packages: {packages}')
     return sorted(packages)
 with open(PACKAGE_NAME + '/_version.py') as f:
     version = parse_version_file(f.read())
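The final hunk only shows the signature of `parse_version_file` and the `_version.py` read that feeds it. A typical implementation extracts a `__version__ = '...'` assignment with a regular expression; the sketch below is an assumption about that behaviour, not the repository's actual code.

```python
# Illustrative guess at parse_version_file; the real implementation is not
# shown in this diff and may differ.
import re


def parse_version_file(text: str) -> str:
    """Extract the version string from a _version.py style file."""
    match = re.search(r"""__version__\s*=\s*['"]([^'"]+)['"]""", text)
    if match is None:
        raise ValueError('could not find __version__ assignment')
    return match.group(1)


assert parse_version_file("__version__ = '1.2.3'\n") == '1.2.3'
```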