Commit graph

294 commits

Author SHA1 Message Date
mrT23
8ff85a9daf
Fix markdown formatting in utils.py by removing extra newlines 2024-06-13 12:45:57 +03:00
mrT23
58bc54b193
Add file ignore functionality and update documentation for ignore patterns 2024-06-13 12:27:10 +03:00
mrT23
aa56c0097d
Add file ignore functionality and update documentation for ignore patterns 2024-06-13 12:20:21 +03:00
mrT23
20f6af803c
Add file ignore functionality and update documentation for ignore patterns 2024-06-13 12:09:52 +03:00
mrT23
2076454798
Add file ignore functionality and update documentation for ignore patterns 2024-06-13 12:01:50 +03:00
mrT23
55b52ad6b2
Add exception handling to process_can_be_split function and update pr_reviewer_prompts.toml formatting 2024-06-13 09:28:51 +03:00
mrT23
9c8bc6c86a
Update PR review prompts and terminology for clarity and consistency 2024-06-09 14:29:32 +03:00
ryan
b28f66aaa0 1. Update LangChainOpenAIHandler to support langchain version 0.2
2. Read openai_api_base from settings for LLMs that are compatible with OpenAI
2024-06-06 22:27:01 +08:00
Kamakura
c53c6aee7f fix wrong provider name 2024-06-04 15:09:30 +08:00
Kamakura
d3a7041f0d update algo/__init__.py 2024-06-04 00:00:22 +08:00
Kamakura
b4f0ad948f Update Python code formatting, configuration loading, and local model additions
1. Code Formatting:
   - Standardized Python code formatting across multiple files to align with PEP 8 guidelines. This includes adjustments to whitespace, line breaks, and inline comments.

2. Configuration Loader Enhancements:
   - Enhanced the `get_settings` function in `config_loader.py` to provide more robust handling of settings retrieval. Added detailed documentation to improve code maintainability and clarity.

3. Model Addition in `__init__.py`:
   - Added a new model "ollama/llama3" with a token limit to the MAX_TOKENS dictionary in `__init__.py` to support new AI capabilities and configurations.
2024-06-03 23:58:31 +08:00
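Item 3 of the commit above amounts to registering a context-window size for a new model name. A minimal sketch, assuming MAX_TOKENS is a plain dict mapping model identifiers to token limits (the surrounding entries and the 4096 value are illustrative assumptions):

```python
# Illustrative sketch of a model-name -> context-size registry.
MAX_TOKENS = {
    "gpt-4": 8000,               # assumed existing entries, for illustration only
    "gpt-3.5-turbo-16k": 16000,
}

# Register the new local model so token-budget calculations can look it up.
MAX_TOKENS["ollama/llama3"] = 4096  # assumed token limit
```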
mrT23
911c1268fc
Add large_patch_policy configuration and implement patch clipping logic 2024-05-29 13:52:44 +03:00
mrT23
17f46bb53b
Add large_patch_policy configuration and implement patch clipping logic 2024-05-29 13:42:44 +03:00
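The two commits above pair a large_patch_policy setting with logic that clips oversized patches. A minimal sketch of what such clipping could look like (the function signature, the "clip"/"skip" policy names, and the line-based budget are assumptions for illustration):

```python
def clip_patch(patch: str, max_lines: int, policy: str = "clip") -> str:
    """Apply a large-patch policy: drop or truncate a patch that exceeds the budget."""
    lines = patch.splitlines()
    if len(lines) <= max_lines:
        return patch
    if policy == "skip":
        return ""  # oversized patch is dropped entirely
    # "clip": keep the head of the patch and mark the truncation point
    return "\n".join(lines[:max_lines] + ["... (patch clipped)"])

# Example: a 500-line patch clipped with max_lines=100 keeps 100 lines plus a marker.
```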
mrT23
da44bd7d5e
extended_patch 2024-05-22 21:50:00 +03:00
mrT23
4cd9626217
grammar 2024-05-22 21:47:49 +03:00
mrT23
9b56c83c1d
APP_NAME 2024-05-19 12:18:22 +03:00
mrT23
dcd188193b
Update configuration_options.md to include tip on showing relevant configurations 2024-05-19 08:20:15 +03:00
mrT23
ea4ee1adbc
Add show_relevant_configurations function and integrate it across tools to output relevant configurations if enabled 2024-05-18 13:09:50 +03:00
Tal
be701aa868
Merge pull request #902 from Codium-ai/tr/self_reflect
Refactor Azure DevOps provider to use PR iterations for change detect…
2024-05-15 09:22:14 +03:00
mrT23
e56320540b
Refactor Azure DevOps provider to use PR iterations for change detection, improving accuracy of diff file identification 2024-05-15 09:05:01 +03:00
KennyDizi
36ad8935ad Add gpt-4o models 2024-05-14 08:24:34 +07:00
mrT23
34ad5f2aa2
toolbar emojis in pr-agent feedbacks 2024-05-05 13:33:54 +03:00
tacascer
2e34436589 chore: update GPT3.5 models 2024-04-22 20:25:32 -04:00
Tal
fae6cab2a7
Merge pull request #877 from randy-tsukemen/support-groq-llama3
Add Groq Llama3 support
2024-04-22 11:41:12 +03:00
Randy, Huang
0a53f09a7f Add GROQ.KEY support in LiteLLMAIHandler 2024-04-21 15:21:45 +09:00
Randy, Huang
7a9e73702d Fix duplicate assignment of replicate_key in LiteLLMAIHandler 2024-04-21 14:47:25 +09:00
mrT23
92ef2b4464
ask 2024-04-14 14:09:58 +03:00
mrT23
4683a29819
s 2024-04-14 12:34:14 +03:00
mrT23
8f0f08006f
s 2024-04-14 12:00:19 +03:00
mrT23
a4680ded93
protections 2024-04-12 20:32:47 +03:00
idubnori
9e4ffd824c
Merge branch 'main' into feature/gha-outputs-1 2024-04-10 23:27:44 +09:00
idubnori
ae633b3cea refine: github_action_output 2024-04-10 22:30:16 +09:00
idubnori
97dcb34d77 clean: rename to github_action_output 2024-04-10 22:16:09 +09:00
Yuta Nishi
108b1afa0e
add new models 2024-04-10 14:44:38 +09:00
idubnori
75c4befadf feat: set review data to github actions output 2024-04-10 01:02:05 +09:00
mrT23
84d8f78d0c
publish_output 2024-04-08 14:00:41 +03:00
mrT23
2be0e9108e
readme 2024-04-07 17:00:40 +03:00
mrT23
9c3673209d
TokenEncoder 2024-04-03 08:42:50 +03:00
gregoryboue
501b059575
feat: allows ollama usage
Fix https://github.com/Codium-ai/pr-agent/issues/657
2024-04-02 11:01:45 +02:00
Yuta Nishi
d064a352ad
feat(pr_agent): add support for gpt-4-turbo-preview model and update default settings 2024-03-26 14:47:05 +09:00
Tal
dd83b196b4
Merge pull request #781 from koid/feature/support-bedrock-claude3
added support for bedrock/claude3
2024-03-18 08:03:29 +02:00
mrT23
498b4cb34e
readme 2024-03-17 09:59:55 +02:00
mrT23
6d39773a17
prompt 2024-03-17 09:44:50 +02:00
mrT23
99a676d792
Merge remote-tracking branch 'origin/main' into tr/split 2024-03-17 09:00:04 +02:00
mrT23
669e076938
Enhance AI handler logging and add main PR language attribute to AI handler in various tools 2024-03-16 13:52:02 +02:00
mrT23
74345284bd
Enhance AI handler logging and add main PR language attribute to AI handler in various tools 2024-03-16 13:47:44 +02:00
koid
3bae515e39 add claude-3-haiku 2024-03-14 16:58:44 +09:00
koid
f94a0fd704 add Claude3Config 2024-03-13 11:24:51 +09:00
koid
1ed2cd064a add config litellm.drop_params 2024-03-13 11:20:02 +09:00
koid
d62796ac68 update max_tokens 2024-03-13 11:14:04 +09:00