diff --git a/.github/pull_request_template.md b/.github/pull_request_template.md index 2a45c2c16e..7fc17cd01f 100644 --- a/.github/pull_request_template.md +++ b/.github/pull_request_template.md @@ -9,9 +9,9 @@ - [ ] **Changelog:** Ensure a changelog entry following the format of [Keep a Changelog](https://keepachangelog.com/) is added at the bottom of the PR description. - [ ] **Documentation:** Have you updated relevant documentation [Open WebUI Docs](https://github.com/open-webui/docs), or other documentation sources? - [ ] **Dependencies:** Are there any new dependencies? Have you updated the dependency versions in the documentation? -- [ ] **Testing:** Have you written and run sufficient tests for validating the changes? +- [ ] **Testing:** Have you written and run sufficient tests to validate the changes? - [ ] **Code review:** Have you performed a self-review of your code, addressing any coding standard issues and ensuring adherence to the project's coding standards? -- [ ] **Prefix:** To cleary categorize this pull request, prefix the pull request title, using one of the following: +- [ ] **Prefix:** To clearly categorize this pull request, prefix the pull request title using one of the following: - **BREAKING CHANGE**: Significant changes that may affect compatibility - **build**: Changes that affect the build system or external dependencies - **ci**: Changes to our continuous integration processes or workflows @@ -22,7 +22,7 @@ - **i18n**: Internationalization or localization changes - **perf**: Performance improvement - **refactor**: Code restructuring for better maintainability, readability, or scalability - - **style**: Changes that do not affect the meaning of the code (white-space, formatting, missing semi-colons, etc.) + - **style**: Changes that do not affect the meaning of the code (white space, formatting, missing semi-colons, etc.) 
- **test**: Adding missing tests or correcting existing tests - **WIP**: Work in progress, a temporary label for incomplete or ongoing work diff --git a/CHANGELOG.md b/CHANGELOG.md index da4046e73f..f6e8f7d297 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -5,6 +5,55 @@ All notable changes to this project will be documented in this file. The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/), and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). +## [0.6.0] - 2025-03-31 + +### Added + +- 🧩 **External Tool Server Support via OpenAPI**: Connect Open WebUI to any OpenAPI-compatible REST server instantly—offering immediate integration with thousands of developer tools, SDKs, and SaaS systems for powerful extensibility. Learn more: https://github.com/open-webui/openapi-servers +- 🛠️ **MCP Server Support via MCPO**: You can now convert and expose your internal MCP tools as interoperable OpenAPI HTTP servers within Open WebUI for seamless, plug-n-play AI toolchain creation. Learn more: https://github.com/open-webui/mcpo +- 📨 **/messages Chat API Endpoint Support**: For power users building external AI systems, new endpoints allow precise control of messages asynchronously—feed long-running external responses into Open WebUI chats without coupling with the frontend. +- 📝 **Client-Side PDF Generation**: PDF exports are now generated fully client-side for drastically improved output quality—perfect for saving conversations or documents. +- 💼 **Enforced Temporary Chats Mode**: Admins can now enforce temporary chat sessions by default to align with stringent data retention and compliance requirements. +- 🌍 **Public Resource Sharing Permission Controls**: Fine-grained user group permissions now allow enabling/disabling public sharing for models, knowledge, prompts, and tools—ideal for privacy, team control, and internal deployments. 
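The sharing controls described above surface later in this diff as a nested `DEFAULT_USER_PERMISSIONS` dictionary (a `"sharing"` group with `public_models`, `public_knowledge`, `public_prompts`, and `public_tools` flags). As a rough sketch of how such nested flags can be resolved — the helper name and dotted-path traversal are illustrative, not Open WebUI's actual implementation:

```python
def has_permission(permissions: dict, path: str, default: bool = False) -> bool:
    """Resolve a dotted key like 'sharing.public_models' against a nested dict."""
    node = permissions
    for part in path.split("."):
        if not isinstance(node, dict) or part not in node:
            return default
        node = node[part]
    return bool(node)

# Shape mirrors the "sharing" group added to DEFAULT_USER_PERMISSIONS below.
perms = {"sharing": {"public_models": False, "public_tools": True}}
```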
+- 📦 **Custom pip Options for Tools/Functions**: You can now specify custom pip installation options with the "PIP_OPTIONS" and "PIP_PACKAGE_INDEX_OPTIONS" environment variables—improving compatibility, support for private indexes, and better control over Python environments. +- 🔢 **Editable Message Counter**: You can now double-click the message count number and jump straight to editing the index—quickly navigate complex chats or regenerate specific messages precisely. +- 🧠 **Embedding Prefix Support Added**: Add custom prefixes to your embeddings for instruct-style tokens, enabling stronger model alignment and more consistent RAG performance. +- 🙈 **Ability to Hide Base Models**: Optionally hide base models from the UI, helping users streamline model visibility and limit access to only usable endpoints. +- 📚 **Docling Content Extraction Support**: Open WebUI now supports Docling as a content extraction engine, enabling smarter and more accurate parsing of complex file formats—ideal for advanced document understanding and Retrieval-Augmented Generation (RAG) workflows. +- 🗃️ **Redis Sentinel Support Added**: Enhance deployment redundancy with support for Redis Sentinel for highly available, failover-safe Redis-based caching or pub/sub. +- 📚 **JSON Schema Format for Ollama**: Added support for defining the format using JSON schema in Ollama-compatible models, improving flexibility and validation of model outputs. +- 🔍 **Chat Sidebar Search “Clear” Button**: Quickly clear search filters in the chat sidebar using the new ✖️ button—streamline your chat navigation with one click. +- 🗂️ **Auto-Focus + Enter Submit for Folder Name**: When creating a new folder, the system automatically enters rename mode with the name preselected—simplifying your org workflow. +- 🧱 **Markdown Alerts Rendering**: Blockquotes with syntax hinting (e.g. ⚠️, ℹ️, ✅) now render styled Markdown alert banners, making messages and documentation more visually structured. 
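The embedding-prefix feature, configured via the `RAG_EMBEDDING_QUERY_PREFIX` and `RAG_EMBEDDING_CONTENT_PREFIX` variables added later in this diff, amounts to prepending an instruct-style token to each text before it is embedded. A minimal sketch of that idea — the function name and exact concatenation are assumptions, not the real implementation:

```python
from typing import Optional

def apply_prefix(texts: list[str], prefix: Optional[str]) -> list[str]:
    """Prepend an instruct-style prefix (e.g. 'query: ') to each text, if configured."""
    if not prefix:
        return texts
    return [f"{prefix}{text}" for text in texts]
```

Models trained with instruct-style prefixes (e.g. `query:` vs. `passage:`) typically expect different prefixes on the query and document sides, which is presumably why the diff defines the two variables separately.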
+- 🔁 **Hybrid Search Runs in Parallel Now**: Hybrid (BM25 + embedding) search components now run in parallel—dramatically reducing response times and speeding up document retrieval. +- 📋 **Cleaner UI for Tool Call Display**: Optimized the visual layout of called tools inside chat messages for better clarity and reduced visual clutter. +- 🧪 **Playwright Timeout Now Configurable**: The default timeout for Playwright processes is now shorter and adjustable via environment variables—making web scraping more robust and tunable to your environment. +- 📈 **OpenTelemetry Support for Observability**: Open WebUI now integrates with OpenTelemetry, allowing you to connect with tools like Grafana, Jaeger, or Prometheus for detailed performance insights and real-time visibility—entirely opt-in and fully self-hosted. Even if enabled, no data is ever sent to us, ensuring your privacy and ownership over all telemetry data. +- 🛠 **General UI Enhancements & UX Polish**: Numerous refinements across sidebar, code blocks, modal interactions, button alignment, scrollbar visibility, and folder behavior improve overall fluidity and usability of the interface. +- 🧱 **General Backend Refactoring**: Numerous backend components have been refactored to improve stability, maintainability, and performance—ensuring a more consistent and reliable system across all features. +- 🌍 **Internationalization Language Support Updates**: Added Estonian and Galician languages, improved Spanish (fully revised), Traditional Chinese, Simplified Chinese, Turkish, Catalan, Ukrainian, and German for a more localized and inclusive interface. 
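The parallel hybrid-search change can be pictured as running the lexical and vector retrievers concurrently and merging their results. A hedged sketch with stand-in retrievers (not the actual Open WebUI retrieval code):

```python
from concurrent.futures import ThreadPoolExecutor

def bm25_search(query: str) -> list[str]:       # stand-in for the lexical retriever
    return ["doc-a", "doc-b"]

def embedding_search(query: str) -> list[str]:  # stand-in for the vector retriever
    return ["doc-b", "doc-c"]

def hybrid_search(query: str) -> list[str]:
    # Run both retrievers in parallel instead of sequentially,
    # so total latency is max(bm25, embedding) rather than their sum.
    with ThreadPoolExecutor(max_workers=2) as pool:
        bm25_future = pool.submit(bm25_search, query)
        embedding_future = pool.submit(embedding_search, query)
        results = bm25_future.result() + embedding_future.result()
    # Deduplicate while preserving first-seen order.
    return list(dict.fromkeys(results))
```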
+- 🧪 **CSV Loader Encoding Issues**: The CSV loader now automatically detects character encodings, resolving import errors in non-UTF-8 datasets. +- ✅ **LDAP Auth Config Fix**: The certificate file path is now optional for LDAP setups, fixing authentication issues for users without preconfigured cert paths. +- 📥 **File Deletion in Bypass Mode**: Resolved an issue where files couldn’t be deleted from knowledge when “bypass embedding” mode was enabled. +- 🧩 **Hybrid Search Result Sorting & Deduplication Fixed**: Fixed citation and sorting issues in RAG hybrid and reranker modes, ensuring retrieved documents are shown in the correct order by score. +- 🧷 **Model Export/Import Broken for a Single Model**: Fixed a bug where individual models couldn’t be exported or re-imported, restoring full portability. +- 📫 **Auth Redirect Fix**: Logged-in users are now routed properly without unnecessary login prompts. + +### Changed + +- 🧠 **Prompt Autocompletion Disabled By Default**: Autocomplete suggestions while typing are now disabled unless explicitly re-enabled in user preferences—reducing distractions while composing prompts. +- 🧾 **Normalize Citation Numbering**: Source citations now properly begin from "1" instead of "0"—improving consistency and professional presentation in AI outputs. +- 📚 **Improved Error Handling from Pipelines**: Pipelines now show the actual returned error message from failed tasks rather than a generic "Connection closed"—making debugging far more user-friendly. + +### Removed + +- 🧾 **ENABLE_AUDIT_LOGS Setting Removed**: The deprecated setting “ENABLE_AUDIT_LOGS” has been fully removed—now controlled via “AUDIT_LOG_LEVEL” instead. 
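The removal of `ENABLE_AUDIT_LOGS` in favor of `AUDIT_LOG_LEVEL` (whose default becomes `NONE` in the env.py hunk of this diff) implies that audit logging is now active whenever the level is anything other than `NONE`. A small illustrative sketch of that check — the helper and the validation are assumptions:

```python
VALID_AUDIT_LEVELS = {"NONE", "METADATA", "REQUEST", "REQUEST_RESPONSE"}

def audit_enabled(env: dict) -> bool:
    """Audit logging is active iff AUDIT_LOG_LEVEL is a level other than NONE."""
    level = env.get("AUDIT_LOG_LEVEL", "NONE").upper()
    if level not in VALID_AUDIT_LEVELS:
        raise ValueError(f"Unknown AUDIT_LOG_LEVEL: {level}")
    return level != "NONE"
```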
+ ## [0.5.20] - 2025-03-05 ### Added diff --git a/Dockerfile b/Dockerfile index 274e23dbfc..4a5411611f 100644 --- a/Dockerfile +++ b/Dockerfile @@ -132,7 +132,7 @@ RUN if [ "$USE_OLLAMA" = "true" ]; then \ # install python dependencies COPY --chown=$UID:$GID ./backend/requirements.txt ./requirements.txt -RUN pip3 install uv && \ +RUN pip3 install --no-cache-dir uv && \ if [ "$USE_CUDA" = "true" ]; then \ # If you use CUDA the whisper and embedding model will be downloaded on first use pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/$USE_CUDA_DOCKER_VER --no-cache-dir && \ diff --git a/backend/open_webui/config.py b/backend/open_webui/config.py index 1e265f2ce7..0ac92bd23b 100644 --- a/backend/open_webui/config.py +++ b/backend/open_webui/config.py @@ -3,6 +3,7 @@ import logging import os import shutil import base64 +import redis from datetime import datetime from pathlib import Path @@ -17,6 +18,9 @@ from open_webui.env import ( DATA_DIR, DATABASE_URL, ENV, + REDIS_URL, + REDIS_SENTINEL_HOSTS, + REDIS_SENTINEL_PORT, FRONTEND_BUILD_DIR, OFFLINE_MODE, OPEN_WEBUI_DIR, @@ -26,6 +30,7 @@ from open_webui.env import ( log, ) from open_webui.internal.db import Base, get_db +from open_webui.utils.redis import get_redis_connection class EndpointFilter(logging.Filter): @@ -248,9 +253,17 @@ class PersistentConfig(Generic[T]): class AppConfig: _state: dict[str, PersistentConfig] + _redis: Optional[redis.Redis] = None - def __init__(self): + def __init__( + self, redis_url: Optional[str] = None, redis_sentinels: Optional[list] = [] + ): super().__setattr__("_state", {}) + if redis_url: + super().__setattr__( + "_redis", + get_redis_connection(redis_url, redis_sentinels, decode_responses=True), + ) def __setattr__(self, key, value): if isinstance(value, PersistentConfig): @@ -259,7 +272,31 @@ class AppConfig: self._state[key].value = value self._state[key].save() + if self._redis: + redis_key = f"open-webui:config:{key}" + 
self._redis.set(redis_key, json.dumps(self._state[key].value)) + def __getattr__(self, key): + if key not in self._state: + raise AttributeError(f"Config key '{key}' not found") + + # If Redis is available, check for an updated value + if self._redis: + redis_key = f"open-webui:config:{key}" + redis_value = self._redis.get(redis_key) + + if redis_value is not None: + try: + decoded_value = json.loads(redis_value) + + # Update the in-memory value if different + if self._state[key].value != decoded_value: + self._state[key].value = decoded_value + log.info(f"Updated {key} from Redis: {decoded_value}") + + except json.JSONDecodeError: + log.error(f"Invalid JSON format in Redis for {key}: {redis_value}") + return self._state[key].value @@ -943,6 +980,35 @@ USER_PERMISSIONS_WORKSPACE_TOOLS_ACCESS = ( os.environ.get("USER_PERMISSIONS_WORKSPACE_TOOLS_ACCESS", "False").lower() == "true" ) +USER_PERMISSIONS_WORKSPACE_MODELS_ALLOW_PUBLIC_SHARING = ( + os.environ.get( + "USER_PERMISSIONS_WORKSPACE_MODELS_ALLOW_PUBLIC_SHARING", "False" + ).lower() + == "true" +) + +USER_PERMISSIONS_WORKSPACE_KNOWLEDGE_ALLOW_PUBLIC_SHARING = ( + os.environ.get( + "USER_PERMISSIONS_WORKSPACE_KNOWLEDGE_ALLOW_PUBLIC_SHARING", "False" + ).lower() + == "true" +) + +USER_PERMISSIONS_WORKSPACE_PROMPTS_ALLOW_PUBLIC_SHARING = ( + os.environ.get( + "USER_PERMISSIONS_WORKSPACE_PROMPTS_ALLOW_PUBLIC_SHARING", "False" + ).lower() + == "true" +) + +USER_PERMISSIONS_WORKSPACE_TOOLS_ALLOW_PUBLIC_SHARING = ( + os.environ.get( + "USER_PERMISSIONS_WORKSPACE_TOOLS_ALLOW_PUBLIC_SHARING", "False" + ).lower() + == "true" +) + + USER_PERMISSIONS_CHAT_CONTROLS = ( os.environ.get("USER_PERMISSIONS_CHAT_CONTROLS", "True").lower() == "true" ) @@ -963,6 +1029,11 @@ USER_PERMISSIONS_CHAT_TEMPORARY = ( os.environ.get("USER_PERMISSIONS_CHAT_TEMPORARY", "True").lower() == "true" ) +USER_PERMISSIONS_CHAT_TEMPORARY_ENFORCED = ( + os.environ.get("USER_PERMISSIONS_CHAT_TEMPORARY_ENFORCED", "False").lower() + == "true" +) + 
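The `AppConfig` changes above add a write-through/read-through Redis layer around persistent config values: `__setattr__` publishes the new value to Redis as JSON, and `__getattr__` checks Redis for a newer copy before returning the in-memory value. The mechanism can be sketched with an in-memory stand-in for the Redis client (this illustrates the pattern, not the real class):

```python
import json

class FakeRedis:
    """Dict-backed stand-in for redis.Redis, enough for string get/set."""
    def __init__(self):
        self._data = {}
    def set(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data.get(key)

class SharedConfig:
    def __init__(self, redis_client):
        self._redis = redis_client
        self._state = {}

    def set(self, key, value):
        # Write-through: update memory and publish to Redis as JSON.
        self._state[key] = value
        self._redis.set(f"open-webui:config:{key}", json.dumps(value))

    def get(self, key):
        # Read-through: prefer the (possibly newer) copy in Redis.
        raw = self._redis.get(f"open-webui:config:{key}")
        if raw is not None:
            self._state[key] = json.loads(raw)
        return self._state.get(key)
```

With a real `redis.Redis` client, multiple workers pointed at the same Redis instance would see each other's config updates, which appears to be the point of the change.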
USER_PERMISSIONS_FEATURES_WEB_SEARCH = ( os.environ.get("USER_PERMISSIONS_FEATURES_WEB_SEARCH", "True").lower() == "true" ) @@ -985,12 +1056,19 @@ DEFAULT_USER_PERMISSIONS = { "prompts": USER_PERMISSIONS_WORKSPACE_PROMPTS_ACCESS, "tools": USER_PERMISSIONS_WORKSPACE_TOOLS_ACCESS, }, + "sharing": { + "public_models": USER_PERMISSIONS_WORKSPACE_MODELS_ALLOW_PUBLIC_SHARING, + "public_knowledge": USER_PERMISSIONS_WORKSPACE_KNOWLEDGE_ALLOW_PUBLIC_SHARING, + "public_prompts": USER_PERMISSIONS_WORKSPACE_PROMPTS_ALLOW_PUBLIC_SHARING, + "public_tools": USER_PERMISSIONS_WORKSPACE_TOOLS_ALLOW_PUBLIC_SHARING, + }, "chat": { "controls": USER_PERMISSIONS_CHAT_CONTROLS, "file_upload": USER_PERMISSIONS_CHAT_FILE_UPLOAD, "delete": USER_PERMISSIONS_CHAT_DELETE, "edit": USER_PERMISSIONS_CHAT_EDIT, "temporary": USER_PERMISSIONS_CHAT_TEMPORARY, + "temporary_enforced": USER_PERMISSIONS_CHAT_TEMPORARY_ENFORCED, }, "features": { "web_search": USER_PERMISSIONS_FEATURES_WEB_SEARCH, @@ -1055,6 +1133,12 @@ ENABLE_MESSAGE_RATING = PersistentConfig( os.environ.get("ENABLE_MESSAGE_RATING", "True").lower() == "true", ) +ENABLE_USER_WEBHOOKS = PersistentConfig( + "ENABLE_USER_WEBHOOKS", + "ui.enable_user_webhooks", + os.environ.get("ENABLE_USER_WEBHOOKS", "True").lower() == "true", +) + def validate_cors_origins(origins): for origin in origins: @@ -1276,7 +1360,7 @@ Strictly return in JSON format: ENABLE_AUTOCOMPLETE_GENERATION = PersistentConfig( "ENABLE_AUTOCOMPLETE_GENERATION", "task.autocomplete.enable", - os.environ.get("ENABLE_AUTOCOMPLETE_GENERATION", "True").lower() == "true", + os.environ.get("ENABLE_AUTOCOMPLETE_GENERATION", "False").lower() == "true", ) AUTOCOMPLETE_GENERATION_INPUT_MAX_LENGTH = PersistentConfig( @@ -1548,8 +1632,10 @@ QDRANT_API_KEY = os.environ.get("QDRANT_API_KEY", None) # OpenSearch OPENSEARCH_URI = os.environ.get("OPENSEARCH_URI", "https://localhost:9200") -OPENSEARCH_SSL = os.environ.get("OPENSEARCH_SSL", True) -OPENSEARCH_CERT_VERIFY = 
os.environ.get("OPENSEARCH_CERT_VERIFY", False) +OPENSEARCH_SSL = os.environ.get("OPENSEARCH_SSL", "true").lower() == "true" +OPENSEARCH_CERT_VERIFY = ( + os.environ.get("OPENSEARCH_CERT_VERIFY", "false").lower() == "true" +) OPENSEARCH_USERNAME = os.environ.get("OPENSEARCH_USERNAME", None) OPENSEARCH_PASSWORD = os.environ.get("OPENSEARCH_PASSWORD", None) @@ -1623,6 +1709,12 @@ TIKA_SERVER_URL = PersistentConfig( os.getenv("TIKA_SERVER_URL", "http://tika:9998"), # Default for sidecar deployment ) +DOCLING_SERVER_URL = PersistentConfig( + "DOCLING_SERVER_URL", + "rag.docling_server_url", + os.getenv("DOCLING_SERVER_URL", "http://docling:5001"), +) + DOCUMENT_INTELLIGENCE_ENDPOINT = PersistentConfig( "DOCUMENT_INTELLIGENCE_ENDPOINT", "rag.document_intelligence_endpoint", @@ -1646,6 +1738,11 @@ BYPASS_EMBEDDING_AND_RETRIEVAL = PersistentConfig( RAG_TOP_K = PersistentConfig( "RAG_TOP_K", "rag.top_k", int(os.environ.get("RAG_TOP_K", "3")) ) +RAG_TOP_K_RERANKER = PersistentConfig( + "RAG_TOP_K_RERANKER", + "rag.top_k_reranker", + int(os.environ.get("RAG_TOP_K_RERANKER", "3")), +) RAG_RELEVANCE_THRESHOLD = PersistentConfig( "RAG_RELEVANCE_THRESHOLD", "rag.relevance_threshold", @@ -1727,6 +1824,14 @@ RAG_EMBEDDING_BATCH_SIZE = PersistentConfig( ), ) +RAG_EMBEDDING_QUERY_PREFIX = os.environ.get("RAG_EMBEDDING_QUERY_PREFIX", None) + +RAG_EMBEDDING_CONTENT_PREFIX = os.environ.get("RAG_EMBEDDING_CONTENT_PREFIX", None) + +RAG_EMBEDDING_PREFIX_FIELD_NAME = os.environ.get( + "RAG_EMBEDDING_PREFIX_FIELD_NAME", None +) + RAG_RERANKING_MODEL = PersistentConfig( "RAG_RERANKING_MODEL", "rag.reranking_model", @@ -1950,6 +2055,12 @@ TAVILY_API_KEY = PersistentConfig( os.getenv("TAVILY_API_KEY", ""), ) +TAVILY_EXTRACT_DEPTH = PersistentConfig( + "TAVILY_EXTRACT_DEPTH", + "rag.web.search.tavily_extract_depth", + os.getenv("TAVILY_EXTRACT_DEPTH", "basic"), +) + JINA_API_KEY = PersistentConfig( "JINA_API_KEY", "rag.web.search.jina_api_key", @@ -2036,6 +2147,12 @@ PLAYWRIGHT_WS_URI = 
PersistentConfig( os.environ.get("PLAYWRIGHT_WS_URI", None), ) +PLAYWRIGHT_TIMEOUT = PersistentConfig( + "PLAYWRIGHT_TIMEOUT", + "rag.web.loader.engine.playwright.timeout", + int(os.environ.get("PLAYWRIGHT_TIMEOUT", "10")), +) + FIRECRAWL_API_KEY = PersistentConfig( "FIRECRAWL_API_KEY", "firecrawl.api_key", diff --git a/backend/open_webui/env.py b/backend/open_webui/env.py index 3b3d6d4f35..e3819fdc5e 100644 --- a/backend/open_webui/env.py +++ b/backend/open_webui/env.py @@ -105,7 +105,6 @@ for source in log_sources: log.setLevel(SRC_LOG_LEVELS["CONFIG"]) - WEBUI_NAME = os.environ.get("WEBUI_NAME", "Open WebUI") if WEBUI_NAME != "Open WebUI": WEBUI_NAME += " (Open WebUI)" @@ -130,7 +129,6 @@ else: except Exception: PACKAGE_DATA = {"version": "0.0.0"} - VERSION = PACKAGE_DATA["version"] @@ -161,7 +159,6 @@ try: except Exception: changelog_content = (pkgutil.get_data("open_webui", "CHANGELOG.md") or b"").decode() - # Convert markdown content to HTML html_content = markdown.markdown(changelog_content) @@ -192,7 +189,6 @@ for version in soup.find_all("h2"): changelog_json[version_number] = version_data - CHANGELOG = changelog_json #################################### @@ -209,7 +205,6 @@ ENABLE_FORWARD_USER_INFO_HEADERS = ( os.environ.get("ENABLE_FORWARD_USER_INFO_HEADERS", "False").lower() == "true" ) - #################################### # WEBUI_BUILD_HASH #################################### @@ -244,7 +239,6 @@ if FROM_INIT_PY: DATA_DIR = Path(os.getenv("DATA_DIR", OPEN_WEBUI_DIR / "data")) - STATIC_DIR = Path(os.getenv("STATIC_DIR", OPEN_WEBUI_DIR / "static")) FONTS_DIR = Path(os.getenv("FONTS_DIR", OPEN_WEBUI_DIR / "static" / "fonts")) @@ -256,7 +250,6 @@ if FROM_INIT_PY: os.getenv("FRONTEND_BUILD_DIR", OPEN_WEBUI_DIR / "frontend") ).resolve() - #################################### # Database #################################### @@ -321,7 +314,6 @@ RESET_CONFIG_ON_START = ( os.environ.get("RESET_CONFIG_ON_START", "False").lower() == "true" ) - 
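The `REDIS_SENTINEL_HOSTS`/`REDIS_SENTINEL_PORT` variables added to env.py, together with the `get_sentinels_from_env` helper imported in main.py, suggest a parser that turns a comma-separated host list into `(host, port)` pairs — the shape redis-py's `redis.sentinel.Sentinel` expects. A plausible sketch (the actual helper may differ):

```python
def get_sentinels_from_env(hosts: str, port: str) -> list[tuple[str, int]]:
    """Turn 'sentinel1,sentinel2' plus '26379' into [('sentinel1', 26379), ...]."""
    if not hosts:
        return []
    return [(host.strip(), int(port)) for host in hosts.split(",") if host.strip()]
```

The resulting list could then be handed to `redis.sentinel.Sentinel(sentinels).master_for(service_name)` to obtain a failover-aware client.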
ENABLE_REALTIME_CHAT_SAVE = ( os.environ.get("ENABLE_REALTIME_CHAT_SAVE", "False").lower() == "true" ) @@ -330,7 +322,9 @@ ENABLE_REALTIME_CHAT_SAVE = ( # REDIS #################################### -REDIS_URL = os.environ.get("REDIS_URL", "redis://localhost:6379/0") +REDIS_URL = os.environ.get("REDIS_URL", "") +REDIS_SENTINEL_HOSTS = os.environ.get("REDIS_SENTINEL_HOSTS", "") +REDIS_SENTINEL_PORT = os.environ.get("REDIS_SENTINEL_PORT", "26379") #################################### # WEBUI_AUTH (Required for security) @@ -387,6 +381,10 @@ WEBSOCKET_MANAGER = os.environ.get("WEBSOCKET_MANAGER", "") WEBSOCKET_REDIS_URL = os.environ.get("WEBSOCKET_REDIS_URL", REDIS_URL) WEBSOCKET_REDIS_LOCK_TIMEOUT = os.environ.get("WEBSOCKET_REDIS_LOCK_TIMEOUT", 60) +WEBSOCKET_SENTINEL_HOSTS = os.environ.get("WEBSOCKET_SENTINEL_HOSTS", "") + +WEBSOCKET_SENTINEL_PORT = os.environ.get("WEBSOCKET_SENTINEL_PORT", "26379") + AIOHTTP_CLIENT_TIMEOUT = os.environ.get("AIOHTTP_CLIENT_TIMEOUT", "") if AIOHTTP_CLIENT_TIMEOUT == "": @@ -399,18 +397,16 @@ else: AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST = os.environ.get( "AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST", - os.environ.get("AIOHTTP_CLIENT_TIMEOUT_OPENAI_MODEL_LIST", ""), + os.environ.get("AIOHTTP_CLIENT_TIMEOUT_OPENAI_MODEL_LIST", "10"), ) - if AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST == "": AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST = None else: try: AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST = int(AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST) except Exception: - AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST = 5 - + AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST = 10 #################################### # OFFLINE_MODE @@ -424,13 +420,12 @@ if OFFLINE_MODE: #################################### # AUDIT LOGGING #################################### -ENABLE_AUDIT_LOGS = os.getenv("ENABLE_AUDIT_LOGS", "false").lower() == "true" # Where to store log file AUDIT_LOGS_FILE_PATH = f"{DATA_DIR}/audit.log" # Maximum size of a file before rotating into a new log file AUDIT_LOG_FILE_ROTATION_SIZE = 
os.getenv("AUDIT_LOG_FILE_ROTATION_SIZE", "10MB") # METADATA | REQUEST | REQUEST_RESPONSE -AUDIT_LOG_LEVEL = os.getenv("AUDIT_LOG_LEVEL", "REQUEST_RESPONSE").upper() +AUDIT_LOG_LEVEL = os.getenv("AUDIT_LOG_LEVEL", "NONE").upper() try: MAX_BODY_LOG_SIZE = int(os.environ.get("MAX_BODY_LOG_SIZE") or 2048) except ValueError: @@ -442,3 +437,26 @@ AUDIT_EXCLUDED_PATHS = os.getenv("AUDIT_EXCLUDED_PATHS", "/chats,/chat,/folders" ) AUDIT_EXCLUDED_PATHS = [path.strip() for path in AUDIT_EXCLUDED_PATHS] AUDIT_EXCLUDED_PATHS = [path.lstrip("/") for path in AUDIT_EXCLUDED_PATHS] + +#################################### +# OPENTELEMETRY +#################################### + +ENABLE_OTEL = os.environ.get("ENABLE_OTEL", "False").lower() == "true" +OTEL_EXPORTER_OTLP_ENDPOINT = os.environ.get( + "OTEL_EXPORTER_OTLP_ENDPOINT", "http://localhost:4317" +) +OTEL_SERVICE_NAME = os.environ.get("OTEL_SERVICE_NAME", "open-webui") +OTEL_RESOURCE_ATTRIBUTES = os.environ.get( + "OTEL_RESOURCE_ATTRIBUTES", "" +) # e.g. 
key1=val1,key2=val2 +OTEL_TRACES_SAMPLER = os.environ.get( + "OTEL_TRACES_SAMPLER", "parentbased_always_on" +).lower() + +#################################### +# TOOLS/FUNCTIONS PIP OPTIONS +#################################### + +PIP_OPTIONS = os.getenv("PIP_OPTIONS", "").split() +PIP_PACKAGE_INDEX_OPTIONS = os.getenv("PIP_PACKAGE_INDEX_OPTIONS", "").split() diff --git a/backend/open_webui/functions.py b/backend/open_webui/functions.py index 2f94f701e9..340b60ba47 100644 --- a/backend/open_webui/functions.py +++ b/backend/open_webui/functions.py @@ -223,6 +223,9 @@ async def generate_function_chat_completion( extra_params = { "__event_emitter__": __event_emitter__, "__event_call__": __event_call__, + "__chat_id__": metadata.get("chat_id", None), + "__session_id__": metadata.get("session_id", None), + "__message_id__": metadata.get("message_id", None), "__task__": __task__, "__task_body__": __task_body__, "__files__": files, diff --git a/backend/open_webui/main.py b/backend/open_webui/main.py index 416460837e..bb78d90034 100644 --- a/backend/open_webui/main.py +++ b/backend/open_webui/main.py @@ -84,11 +84,12 @@ from open_webui.routers.retrieval import ( get_rf, ) -from open_webui.internal.db import Session +from open_webui.internal.db import Session, engine from open_webui.models.functions import Functions from open_webui.models.models import Models from open_webui.models.users import UserModel, Users +from open_webui.models.chats import Chats from open_webui.config import ( LICENSE_KEY, @@ -155,6 +156,7 @@ from open_webui.config import ( AUDIO_TTS_AZURE_SPEECH_REGION, AUDIO_TTS_AZURE_SPEECH_OUTPUT_FORMAT, PLAYWRIGHT_WS_URI, + PLAYWRIGHT_TIMEOUT, FIRECRAWL_API_BASE_URL, FIRECRAWL_API_KEY, RAG_WEB_LOADER_ENGINE, @@ -186,9 +188,11 @@ from open_webui.config import ( CHUNK_SIZE, CONTENT_EXTRACTION_ENGINE, TIKA_SERVER_URL, + DOCLING_SERVER_URL, DOCUMENT_INTELLIGENCE_ENDPOINT, DOCUMENT_INTELLIGENCE_KEY, RAG_TOP_K, + RAG_TOP_K_RERANKER, RAG_TEXT_SPLITTER, 
TIKTOKEN_ENCODING_NAME, PDF_EXTRACT_IMAGES, @@ -212,6 +216,7 @@ from open_webui.config import ( SERPSTACK_API_KEY, SERPSTACK_HTTPS, TAVILY_API_KEY, + TAVILY_EXTRACT_DEPTH, BING_SEARCH_V7_ENDPOINT, BING_SEARCH_V7_SUBSCRIPTION_KEY, BRAVE_SEARCH_API_KEY, @@ -248,6 +253,7 @@ from open_webui.config import ( ENABLE_CHANNELS, ENABLE_COMMUNITY_SHARING, ENABLE_MESSAGE_RATING, + ENABLE_USER_WEBHOOKS, ENABLE_EVALUATION_ARENA_MODELS, USER_PERMISSIONS, DEFAULT_USER_ROLE, @@ -312,6 +318,9 @@ from open_webui.env import ( AUDIT_EXCLUDED_PATHS, AUDIT_LOG_LEVEL, CHANGELOG, + REDIS_URL, + REDIS_SENTINEL_HOSTS, + REDIS_SENTINEL_PORT, GLOBAL_LOG_LEVEL, MAX_BODY_LOG_SIZE, SAFE_MODE, @@ -327,6 +336,7 @@ from open_webui.env import ( BYPASS_MODEL_ACCESS_CONTROL, RESET_CONFIG_ON_START, OFFLINE_MODE, + ENABLE_OTEL, ) @@ -354,6 +364,8 @@ from open_webui.utils.security_headers import SecurityHeadersMiddleware from open_webui.tasks import stop_task, list_tasks # Import from tasks.py +from open_webui.utils.redis import get_sentinels_from_env + if SAFE_MODE: print("SAFE MODE ENABLED") @@ -418,11 +430,27 @@ app = FastAPI( oauth_manager = OAuthManager(app) -app.state.config = AppConfig() +app.state.config = AppConfig( + redis_url=REDIS_URL, + redis_sentinels=get_sentinels_from_env(REDIS_SENTINEL_HOSTS, REDIS_SENTINEL_PORT), +) app.state.WEBUI_NAME = WEBUI_NAME app.state.LICENSE_METADATA = None + +######################################## +# +# OPENTELEMETRY +# +######################################## + +if ENABLE_OTEL: + from open_webui.utils.telemetry.setup import setup as setup_opentelemetry + + setup_opentelemetry(app=app, db_engine=engine) + + ######################################## # # OLLAMA @@ -492,6 +520,7 @@ app.state.config.MODEL_ORDER_LIST = MODEL_ORDER_LIST app.state.config.ENABLE_CHANNELS = ENABLE_CHANNELS app.state.config.ENABLE_COMMUNITY_SHARING = ENABLE_COMMUNITY_SHARING app.state.config.ENABLE_MESSAGE_RATING = ENABLE_MESSAGE_RATING +app.state.config.ENABLE_USER_WEBHOOKS = 
ENABLE_USER_WEBHOOKS app.state.config.ENABLE_EVALUATION_ARENA_MODELS = ENABLE_EVALUATION_ARENA_MODELS app.state.config.EVALUATION_ARENA_MODELS = EVALUATION_ARENA_MODELS @@ -535,6 +564,7 @@ app.state.FUNCTIONS = {} app.state.config.TOP_K = RAG_TOP_K +app.state.config.TOP_K_RERANKER = RAG_TOP_K_RERANKER app.state.config.RELEVANCE_THRESHOLD = RAG_RELEVANCE_THRESHOLD app.state.config.FILE_MAX_SIZE = RAG_FILE_MAX_SIZE app.state.config.FILE_MAX_COUNT = RAG_FILE_MAX_COUNT @@ -549,6 +579,7 @@ app.state.config.ENABLE_RAG_WEB_LOADER_SSL_VERIFICATION = ( app.state.config.CONTENT_EXTRACTION_ENGINE = CONTENT_EXTRACTION_ENGINE app.state.config.TIKA_SERVER_URL = TIKA_SERVER_URL +app.state.config.DOCLING_SERVER_URL = DOCLING_SERVER_URL app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT = DOCUMENT_INTELLIGENCE_ENDPOINT app.state.config.DOCUMENT_INTELLIGENCE_KEY = DOCUMENT_INTELLIGENCE_KEY @@ -612,8 +643,10 @@ app.state.config.RAG_WEB_SEARCH_CONCURRENT_REQUESTS = RAG_WEB_SEARCH_CONCURRENT_ app.state.config.RAG_WEB_LOADER_ENGINE = RAG_WEB_LOADER_ENGINE app.state.config.RAG_WEB_SEARCH_TRUST_ENV = RAG_WEB_SEARCH_TRUST_ENV app.state.config.PLAYWRIGHT_WS_URI = PLAYWRIGHT_WS_URI +app.state.config.PLAYWRIGHT_TIMEOUT = PLAYWRIGHT_TIMEOUT app.state.config.FIRECRAWL_API_BASE_URL = FIRECRAWL_API_BASE_URL app.state.config.FIRECRAWL_API_KEY = FIRECRAWL_API_KEY +app.state.config.TAVILY_EXTRACT_DEPTH = TAVILY_EXTRACT_DEPTH app.state.EMBEDDING_FUNCTION = None app.state.ef = None @@ -947,14 +980,24 @@ async def get_models(request: Request, user=Depends(get_verified_user)): return filtered_models - models = await get_all_models(request, user=user) + all_models = await get_all_models(request, user=user) - # Filter out filter pipelines - models = [ - model - for model in models - if "pipeline" not in model or model["pipeline"].get("type", None) != "filter" - ] + models = [] + for model in all_models: + # Filter out filter pipelines + if "pipeline" in model and model["pipeline"].get("type", None) == 
"filter": + continue + + model_tags = [ + tag.get("name") + for tag in model.get("info", {}).get("meta", {}).get("tags", []) + ] + tags = [tag.get("name") for tag in model.get("tags", [])] + + tags = list(set(model_tags + tags)) + model["tags"] = [{"name": tag} for tag in tags] + + models.append(model) model_order_list = request.app.state.config.MODEL_ORDER_LIST if model_order_list: @@ -1020,6 +1063,7 @@ async def chat_completion( "message_id": form_data.pop("id", None), "session_id": form_data.pop("session_id", None), "tool_ids": form_data.get("tool_ids", None), + "tool_servers": form_data.pop("tool_servers", None), "files": form_data.get("files", None), "features": form_data.get("features", None), "variables": form_data.get("variables", None), @@ -1046,6 +1090,14 @@ async def chat_completion( except Exception as e: log.debug(f"Error processing chat payload: {e}") + Chats.upsert_message_to_chat_by_id_and_message_id( + metadata["chat_id"], + metadata["message_id"], + { + "error": {"content": str(e)}, + }, + ) + raise HTTPException( status_code=status.HTTP_400_BAD_REQUEST, detail=str(e), @@ -1181,6 +1233,7 @@ async def get_app_config(request: Request): "enable_autocomplete_generation": app.state.config.ENABLE_AUTOCOMPLETE_GENERATION, "enable_community_sharing": app.state.config.ENABLE_COMMUNITY_SHARING, "enable_message_rating": app.state.config.ENABLE_MESSAGE_RATING, + "enable_user_webhooks": app.state.config.ENABLE_USER_WEBHOOKS, "enable_admin_export": ENABLE_ADMIN_EXPORT, "enable_admin_chat_access": ENABLE_ADMIN_CHAT_ACCESS, "enable_google_drive_integration": app.state.config.ENABLE_GOOGLE_DRIVE_INTEGRATION, diff --git a/backend/open_webui/models/folders.py b/backend/open_webui/models/folders.py index 19739bc5f5..1c97de26c9 100644 --- a/backend/open_webui/models/folders.py +++ b/backend/open_webui/models/folders.py @@ -9,6 +9,8 @@ from open_webui.models.chats import Chats from open_webui.env import SRC_LOG_LEVELS from pydantic import BaseModel, ConfigDict from 
sqlalchemy import BigInteger, Column, Text, JSON, Boolean +from open_webui.utils.access_control import get_permissions + log = logging.getLogger(__name__) log.setLevel(SRC_LOG_LEVELS["MODELS"]) @@ -234,15 +236,18 @@ class FolderTable: log.error(f"update_folder: {e}") return - def delete_folder_by_id_and_user_id(self, id: str, user_id: str) -> bool: + def delete_folder_by_id_and_user_id( + self, id: str, user_id: str, delete_chats=True + ) -> bool: try: with get_db() as db: folder = db.query(Folder).filter_by(id=id, user_id=user_id).first() if not folder: return False - # Delete all chats in the folder - Chats.delete_chats_by_user_id_and_folder_id(user_id, folder.id) + if delete_chats: + # Delete all chats in the folder + Chats.delete_chats_by_user_id_and_folder_id(user_id, folder.id) # Delete all children folders def delete_children(folder): @@ -250,9 +255,11 @@ class FolderTable: folder.id, user_id ) for folder_child in folder_children: - Chats.delete_chats_by_user_id_and_folder_id( - user_id, folder_child.id - ) + if delete_chats: + Chats.delete_chats_by_user_id_and_folder_id( + user_id, folder_child.id + ) + delete_children(folder_child) folder = db.query(Folder).filter_by(id=folder_child.id).first() diff --git a/backend/open_webui/retrieval/loaders/main.py b/backend/open_webui/retrieval/loaders/main.py index 7fa24ced37..295d0414a7 100644 --- a/backend/open_webui/retrieval/loaders/main.py +++ b/backend/open_webui/retrieval/loaders/main.py @@ -105,7 +105,7 @@ class TikaLoader: if r.ok: raw_metadata = r.json() - text = raw_metadata.get("X-TIKA:content", "") + text = raw_metadata.get("X-TIKA:content", "").strip() if "Content-Type" in raw_metadata: headers["Content-Type"] = raw_metadata["Content-Type"] @@ -117,6 +117,52 @@ class TikaLoader: raise Exception(f"Error calling Tika: {r.reason}") +class DoclingLoader: + def __init__(self, url, file_path=None, mime_type=None): + self.url = url.rstrip("/") + self.file_path = file_path + self.mime_type = mime_type + + def 
load(self) -> list[Document]: + with open(self.file_path, "rb") as f: + files = { + "files": ( + self.file_path, + f, + self.mime_type or "application/octet-stream", + ) + } + + params = { + "image_export_mode": "placeholder", + "table_mode": "accurate", + } + + endpoint = f"{self.url}/v1alpha/convert/file" + r = requests.post(endpoint, files=files, data=params) + + if r.ok: + result = r.json() + document_data = result.get("document", {}) + text = document_data.get("md_content", "") + + metadata = {"Content-Type": self.mime_type} if self.mime_type else {} + + log.debug("Docling extracted text: %s", text) + + return [Document(page_content=text, metadata=metadata)] + else: + error_msg = f"Error calling Docling API: {r.reason}" + if r.text: + try: + error_data = r.json() + if "detail" in error_data: + error_msg += f" - {error_data['detail']}" + except Exception: + error_msg += f" - {r.text}" + raise Exception(f"Error calling Docling: {error_msg}") + + class Loader: def __init__(self, engine: str = "", **kwargs): self.engine = engine @@ -149,6 +195,12 @@ class Loader: file_path=file_path, mime_type=file_content_type, ) + elif self.engine == "docling" and self.kwargs.get("DOCLING_SERVER_URL"): + loader = DoclingLoader( + url=self.kwargs.get("DOCLING_SERVER_URL"), + file_path=file_path, + mime_type=file_content_type, + ) elif ( self.engine == "document_intelligence" and self.kwargs.get("DOCUMENT_INTELLIGENCE_ENDPOINT") != "" @@ -176,7 +228,7 @@ class Loader: file_path, extract_images=self.kwargs.get("PDF_EXTRACT_IMAGES") ) elif file_ext == "csv": - loader = CSVLoader(file_path) + loader = CSVLoader(file_path, autodetect_encoding=True) elif file_ext == "rst": loader = UnstructuredRSTLoader(file_path, mode="elements") elif file_ext == "xml": diff --git a/backend/open_webui/retrieval/loaders/tavily.py b/backend/open_webui/retrieval/loaders/tavily.py new file mode 100644 index 0000000000..15a3d7f97f --- /dev/null +++ b/backend/open_webui/retrieval/loaders/tavily.py @@ -0,0 
+1,93 @@ +import requests +import logging +from typing import Iterator, List, Literal, Union + +from langchain_core.document_loaders import BaseLoader +from langchain_core.documents import Document +from open_webui.env import SRC_LOG_LEVELS + +log = logging.getLogger(__name__) +log.setLevel(SRC_LOG_LEVELS["RAG"]) + + +class TavilyLoader(BaseLoader): + """Extract web page content from URLs using Tavily Extract API. + + This is a LangChain document loader that uses Tavily's Extract API to + retrieve content from web pages and return it as Document objects. + + Args: + urls: URL or list of URLs to extract content from. + api_key: The Tavily API key. + extract_depth: Depth of extraction, either "basic" or "advanced". + continue_on_failure: Whether to continue if extraction of a URL fails. + """ + + def __init__( + self, + urls: Union[str, List[str]], + api_key: str, + extract_depth: Literal["basic", "advanced"] = "basic", + continue_on_failure: bool = True, + ) -> None: + """Initialize Tavily Extract client. + + Args: + urls: URL or list of URLs to extract content from. + api_key: The Tavily API key. + extract_depth: Depth of extraction, either "basic" or "advanced". + advanced extraction retrieves more data, including tables and + embedded content, with higher success but may increase latency. + basic costs 1 credit per 5 successful URL extractions, + advanced costs 2 credits per 5 successful URL extractions. + continue_on_failure: Whether to continue if extraction of a URL fails.
+ """ + if not urls: + raise ValueError("At least one URL must be provided.") + + self.api_key = api_key + self.urls = urls if isinstance(urls, list) else [urls] + self.extract_depth = extract_depth + self.continue_on_failure = continue_on_failure + self.api_url = "https://api.tavily.com/extract" + + def lazy_load(self) -> Iterator[Document]: + """Extract and yield documents from the URLs using Tavily Extract API.""" + batch_size = 20 + for i in range(0, len(self.urls), batch_size): + batch_urls = self.urls[i : i + batch_size] + try: + headers = { + "Content-Type": "application/json", + "Authorization": f"Bearer {self.api_key}", + } + # Use string for single URL, array for multiple URLs + urls_param = batch_urls[0] if len(batch_urls) == 1 else batch_urls + payload = {"urls": urls_param, "extract_depth": self.extract_depth} + # Make the API call + response = requests.post(self.api_url, headers=headers, json=payload) + response.raise_for_status() + response_data = response.json() + # Process successful results + for result in response_data.get("results", []): + url = result.get("url", "") + content = result.get("raw_content", "") + if not content: + log.warning(f"No content extracted from {url}") + continue + # Add URLs as metadata + metadata = {"source": url} + yield Document( + page_content=content, + metadata=metadata, + ) + for failed in response_data.get("failed_results", []): + url = failed.get("url", "") + error = failed.get("error", "Unknown error") + log.error(f"Failed to extract content from {url}: {error}") + except Exception as e: + if self.continue_on_failure: + log.error(f"Error extracting content from batch {batch_urls}: {e}") + else: + raise e diff --git a/backend/open_webui/retrieval/utils.py b/backend/open_webui/retrieval/utils.py index 029a33a56c..518a121367 100644 --- a/backend/open_webui/retrieval/utils.py +++ b/backend/open_webui/retrieval/utils.py @@ -1,30 +1,35 @@ import logging import os -import uuid from typing import Optional, Union -import 
asyncio import requests import hashlib +from concurrent.futures import ThreadPoolExecutor from huggingface_hub import snapshot_download from langchain.retrievers import ContextualCompressionRetriever, EnsembleRetriever from langchain_community.retrievers import BM25Retriever from langchain_core.documents import Document - from open_webui.config import VECTOR_DB from open_webui.retrieval.vector.connector import VECTOR_DB_CLIENT -from open_webui.utils.misc import get_last_user_message, calculate_sha256_string from open_webui.models.users import UserModel from open_webui.models.files import Files +from open_webui.retrieval.vector.main import GetResult + + from open_webui.env import ( SRC_LOG_LEVELS, OFFLINE_MODE, ENABLE_FORWARD_USER_INFO_HEADERS, ) +from open_webui.config import ( + RAG_EMBEDDING_QUERY_PREFIX, + RAG_EMBEDDING_CONTENT_PREFIX, + RAG_EMBEDDING_PREFIX_FIELD_NAME, +) log = logging.getLogger(__name__) log.setLevel(SRC_LOG_LEVELS["RAG"]) @@ -49,7 +54,7 @@ class VectorSearchRetriever(BaseRetriever): ) -> list[Document]: result = VECTOR_DB_CLIENT.search( collection_name=self.collection_name, - vectors=[self.embedding_function(query)], + vectors=[self.embedding_function(query, RAG_EMBEDDING_QUERY_PREFIX)], limit=self.top_k, ) @@ -102,18 +107,18 @@ def get_doc(collection_name: str, user: UserModel = None): def query_doc_with_hybrid_search( collection_name: str, + collection_result: GetResult, query: str, embedding_function, k: int, reranking_function, + k_reranker: int, r: float, ) -> dict: try: - result = VECTOR_DB_CLIENT.get(collection_name=collection_name) - bm25_retriever = BM25Retriever.from_texts( - texts=result.documents[0], - metadatas=result.metadatas[0], + texts=collection_result.documents[0], + metadatas=collection_result.metadatas[0], ) bm25_retriever.k = k @@ -128,7 +133,7 @@ def query_doc_with_hybrid_search( ) compressor = RerankCompressor( embedding_function=embedding_function, - top_n=k, + top_n=k_reranker, reranking_function=reranking_function, 
r_score=r, ) @@ -138,10 +143,23 @@ def query_doc_with_hybrid_search( ) result = compression_retriever.invoke(query) + + distances = [d.metadata.get("score") for d in result] + documents = [d.page_content for d in result] + metadatas = [d.metadata for d in result] + + # retrieve only min(k, k_reranker) items, sort and cut by distance if k < k_reranker + if k < k_reranker: + sorted_items = sorted( + zip(distances, documents, metadatas), key=lambda x: x[0], reverse=True + ) + sorted_items = sorted_items[:k] + distances, documents, metadatas = map(list, zip(*sorted_items)) + result = { - "distances": [[d.metadata.get("score") for d in result]], - "documents": [[d.page_content for d in result]], - "metadatas": [[d.metadata for d in result]], + "distances": [distances], + "documents": [documents], + "metadatas": [metadatas], } log.info( @@ -174,12 +192,9 @@ def merge_get_results(get_results: list[dict]) -> dict: return result -def merge_and_sort_query_results( - query_results: list[dict], k: int, reverse: bool = False -) -> dict: +def merge_and_sort_query_results(query_results: list[dict], k: int) -> dict: # Initialize lists to store combined data - combined = [] - seen_hashes = set() # To store unique document hashes + combined = dict() # To store documents with unique document hashes for data in query_results: distances = data["distances"][0] @@ -192,12 +207,17 @@ def merge_and_sort_query_results( document.encode() ).hexdigest() # Compute a hash for uniqueness - if doc_hash not in seen_hashes: - seen_hashes.add(doc_hash) - combined.append((distance, document, metadata)) + if doc_hash not in combined.keys(): + combined[doc_hash] = (distance, document, metadata) + continue # if doc is new, no further comparison is needed + # if doc is already in, but the new distance is better, update + if distance > combined[doc_hash][0]: + combined[doc_hash] = (distance, document, metadata) + + combined = list(combined.values()) # Sort the list based on distances - combined.sort(key=lambda
x: x[0], reverse=reverse) + combined.sort(key=lambda x: x[0], reverse=True) # Slice to keep only the top k elements sorted_distances, sorted_documents, sorted_metadatas = ( @@ -237,7 +257,7 @@ def query_collection( ) -> dict: results = [] for query in queries: - query_embedding = embedding_function(query) + query_embedding = embedding_function(query, prefix=RAG_EMBEDDING_QUERY_PREFIX) for collection_name in collection_names: if collection_name: try: @@ -253,12 +273,7 @@ def query_collection( else: pass - if VECTOR_DB == "chroma": - # Chroma uses unconventional cosine similarity, so we don't need to reverse the results - # https://docs.trychroma.com/docs/collections/configure#configuring-chroma-collections - return merge_and_sort_query_results(results, k=k, reverse=False) - else: - return merge_and_sort_query_results(results, k=k, reverse=True) + return merge_and_sort_query_results(results, k=k) def query_collection_with_hybrid_search( @@ -267,39 +282,66 @@ def query_collection_with_hybrid_search( embedding_function, k: int, reranking_function, + k_reranker: int, r: float, ) -> dict: results = [] error = False + # Fetch collection data once per collection sequentially + # Avoid fetching the same data multiple times later + collection_results = {} for collection_name in collection_names: try: - for query in queries: - result = query_doc_with_hybrid_search( - collection_name=collection_name, - query=query, - embedding_function=embedding_function, - k=k, - reranking_function=reranking_function, - r=r, - ) - results.append(result) - except Exception as e: - log.exception( - "Error when querying the collection with " f"hybrid_search: {e}" + collection_results[collection_name] = VECTOR_DB_CLIENT.get( + collection_name=collection_name ) - error = True + except Exception as e: + log.exception(f"Failed to fetch collection {collection_name}: {e}") + collection_results[collection_name] = None - if error: + log.info( + f"Starting hybrid search for {len(queries)} queries in 
{len(collection_names)} collections..." + ) + + def process_query(collection_name, query): + try: + result = query_doc_with_hybrid_search( + collection_name=collection_name, + collection_result=collection_results[collection_name], + query=query, + embedding_function=embedding_function, + k=k, + reranking_function=reranking_function, + k_reranker=k_reranker, + r=r, + ) + return result, None + except Exception as e: + log.exception(f"Error when querying the collection with hybrid_search: {e}") + return None, e + + tasks = [ + (collection_name, query) + for collection_name in collection_names + for query in queries + ] + + with ThreadPoolExecutor() as executor: + future_results = [executor.submit(process_query, cn, q) for cn, q in tasks] + task_results = [future.result() for future in future_results] + + for result, err in task_results: + if err is not None: + error = True + elif result is not None: + results.append(result) + + if error and not results: raise Exception( - "Hybrid search failed for all collections. Using Non hybrid search as fallback." + "Hybrid search failed for all collections. Using Non-hybrid search as fallback." 
) - if VECTOR_DB == "chroma": - # Chroma uses unconventional cosine similarity, so we don't need to reverse the results - # https://docs.trychroma.com/docs/collections/configure#configuring-chroma-collections - return merge_and_sort_query_results(results, k=k, reverse=False) - else: - return merge_and_sort_query_results(results, k=k, reverse=True) + return merge_and_sort_query_results(results, k=k) def get_embedding_function( @@ -311,29 +353,38 @@ def get_embedding_function( embedding_batch_size, ): if embedding_engine == "": - return lambda query, user=None: embedding_function.encode(query).tolist() + return lambda query, prefix=None, user=None: embedding_function.encode( + query, prompt=prefix if prefix else None + ).tolist() elif embedding_engine in ["ollama", "openai"]: - func = lambda query, user=None: generate_embeddings( + func = lambda query, prefix=None, user=None: generate_embeddings( engine=embedding_engine, model=embedding_model, text=query, + prefix=prefix, url=url, key=key, user=user, ) - def generate_multiple(query, user, func): + def generate_multiple(query, prefix, user, func): if isinstance(query, list): embeddings = [] for i in range(0, len(query), embedding_batch_size): embeddings.extend( - func(query[i : i + embedding_batch_size], user=user) + func( + query[i : i + embedding_batch_size], + prefix=prefix, + user=user, + ) ) return embeddings else: - return func(query, user) + return func(query, prefix, user) - return lambda query, user=None: generate_multiple(query, user, func) + return lambda query, prefix=None, user=None: generate_multiple( + query, prefix, user, func + ) else: raise ValueError(f"Unknown embedding engine: {embedding_engine}") @@ -345,6 +396,7 @@ def get_sources_from_files( embedding_function, k, reranking_function, + k_reranker, r, hybrid_search, full_context=False, @@ -461,6 +513,7 @@ def get_sources_from_files( embedding_function=embedding_function, k=k, reranking_function=reranking_function, + k_reranker=k_reranker, r=r, ) 
except Exception as e: @@ -553,9 +606,14 @@ def generate_openai_batch_embeddings( texts: list[str], url: str = "https://api.openai.com/v1", key: str = "", + prefix: str = None, user: UserModel = None, ) -> Optional[list[list[float]]]: try: + json_data = {"input": texts, "model": model} + if isinstance(RAG_EMBEDDING_PREFIX_FIELD_NAME, str) and isinstance(prefix, str): + json_data[RAG_EMBEDDING_PREFIX_FIELD_NAME] = prefix + r = requests.post( f"{url}/embeddings", headers={ @@ -572,7 +630,7 @@ def generate_openai_batch_embeddings( else {} ), }, - json={"input": texts, "model": model}, + json=json_data, ) r.raise_for_status() data = r.json() @@ -586,9 +644,18 @@ def generate_openai_batch_embeddings( def generate_ollama_batch_embeddings( - model: str, texts: list[str], url: str, key: str = "", user: UserModel = None + model: str, + texts: list[str], + url: str, + key: str = "", + prefix: str = None, + user: UserModel = None, ) -> Optional[list[list[float]]]: try: + json_data = {"input": texts, "model": model} + if isinstance(RAG_EMBEDDING_PREFIX_FIELD_NAME, str) and isinstance(prefix, str): + json_data[RAG_EMBEDDING_PREFIX_FIELD_NAME] = prefix + r = requests.post( f"{url}/api/embed", headers={ @@ -605,7 +672,7 @@ def generate_ollama_batch_embeddings( else {} ), }, - json={"input": texts, "model": model}, + json=json_data, ) r.raise_for_status() data = r.json() @@ -619,15 +686,34 @@ def generate_ollama_batch_embeddings( return None -def generate_embeddings(engine: str, model: str, text: Union[str, list[str]], **kwargs): +def generate_embeddings( + engine: str, + model: str, + text: Union[str, list[str]], + prefix: Union[str, None] = None, + **kwargs, +): url = kwargs.get("url", "") key = kwargs.get("key", "") user = kwargs.get("user") + if prefix is not None and RAG_EMBEDDING_PREFIX_FIELD_NAME is None: + if isinstance(text, list): + text = [f"{prefix}{text_element}" for text_element in text] + else: + text = f"{prefix}{text}" + if engine == "ollama": if isinstance(text, 
list): embeddings = generate_ollama_batch_embeddings( - **{"model": model, "texts": text, "url": url, "key": key, "user": user} + **{ + "model": model, + "texts": text, + "url": url, + "key": key, + "prefix": prefix, + "user": user, + } ) else: embeddings = generate_ollama_batch_embeddings( @@ -636,16 +722,20 @@ def generate_embeddings(engine: str, model: str, text: Union[str, list[str]], ** "texts": [text], "url": url, "key": key, + "prefix": prefix, "user": user, } ) return embeddings[0] if isinstance(text, str) else embeddings elif engine == "openai": if isinstance(text, list): - embeddings = generate_openai_batch_embeddings(model, text, url, key, user) + embeddings = generate_openai_batch_embeddings( + model, text, url, key, prefix, user + ) else: - embeddings = generate_openai_batch_embeddings(model, [text], url, key, user) - + embeddings = generate_openai_batch_embeddings( + model, [text], url, key, prefix, user + ) return embeddings[0] if isinstance(text, str) else embeddings @@ -681,9 +771,9 @@ class RerankCompressor(BaseDocumentCompressor): else: from sentence_transformers import util - query_embedding = self.embedding_function(query) + query_embedding = self.embedding_function(query, RAG_EMBEDDING_QUERY_PREFIX) document_embedding = self.embedding_function( - [doc.page_content for doc in documents] + [doc.page_content for doc in documents], RAG_EMBEDDING_CONTENT_PREFIX ) scores = util.cos_sim(query_embedding, document_embedding)[0] diff --git a/backend/open_webui/retrieval/vector/dbs/chroma.py b/backend/open_webui/retrieval/vector/dbs/chroma.py index 006ee20763..a6b97df3e9 100755 --- a/backend/open_webui/retrieval/vector/dbs/chroma.py +++ b/backend/open_webui/retrieval/vector/dbs/chroma.py @@ -75,10 +75,16 @@ class ChromaClient: n_results=limit, ) + # chromadb has cosine distance, 2 (worst) -> 0 (best). 
Re-ordering to 0 (worst) -> 1 (best) + # https://docs.trychroma.com/docs/collections/configure cosine equation + distances: list = result["distances"][0] + distances = [2 - dist for dist in distances] + distances = [[dist / 2 for dist in distances]] + return SearchResult( **{ "ids": result["ids"], - "distances": result["distances"], + "distances": distances, "documents": result["documents"], "metadatas": result["metadatas"], } @@ -166,12 +172,19 @@ class ChromaClient: filter: Optional[dict] = None, ): # Delete the items from the collection based on the ids. - collection = self.client.get_collection(name=collection_name) - if collection: - if ids: - collection.delete(ids=ids) - elif filter: - collection.delete(where=filter) + try: + collection = self.client.get_collection(name=collection_name) + if collection: + if ids: + collection.delete(ids=ids) + elif filter: + collection.delete(where=filter) + except Exception: + # If collection doesn't exist, that's fine - nothing to delete + log.debug( + f"Attempted to delete from non-existent collection {collection_name}. Ignoring." + ) def reset(self): # Resets the database. This will delete all collections and item entries.
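The ChromaClient change above folds Chroma's raw cosine distance (2 = worst match, 0 = best) into the same 0-to-1 relevance scale the other vector DB clients in this diff emit. A minimal sketch of that mapping, as a standalone helper (the function name is illustrative, not part of the codebase):

```python
def chroma_distance_to_score(distance: float) -> float:
    # Chroma reports cosine distance in [0, 2]: 0 = identical, 2 = opposite.
    # The diff first inverts it (2 - d), then halves it, yielding a
    # similarity score in [0, 1] where 1 is the best match.
    return (2.0 - distance) / 2.0

# Identical vectors -> 1.0, orthogonal -> 0.5, opposite -> 0.0
scores = [chroma_distance_to_score(d) for d in (0.0, 1.0, 2.0)]
print(scores)  # [1.0, 0.5, 0.0]
```

With all backends normalized this way, `merge_and_sort_query_results` can sort every result set with `reverse=True` instead of branching on `VECTOR_DB`.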
diff --git a/backend/open_webui/retrieval/vector/dbs/milvus.py b/backend/open_webui/retrieval/vector/dbs/milvus.py index ad05f9422d..26b4dd5ed2 100644 --- a/backend/open_webui/retrieval/vector/dbs/milvus.py +++ b/backend/open_webui/retrieval/vector/dbs/milvus.py @@ -64,7 +64,10 @@ class MilvusClient: for item in match: _ids.append(item.get("id")) - _distances.append(item.get("distance")) + # normalize milvus score from [-1, 1] to [0, 1] range + # https://milvus.io/docs/de/metric.md + _dist = (item.get("distance") + 1.0) / 2.0 + _distances.append(_dist) _documents.append(item.get("entity", {}).get("data", {}).get("text")) _metadatas.append(item.get("entity", {}).get("metadata")) diff --git a/backend/open_webui/retrieval/vector/dbs/opensearch.py b/backend/open_webui/retrieval/vector/dbs/opensearch.py index 2629bfcbac..432bcef412 100644 --- a/backend/open_webui/retrieval/vector/dbs/opensearch.py +++ b/backend/open_webui/retrieval/vector/dbs/opensearch.py @@ -1,4 +1,5 @@ from opensearchpy import OpenSearch +from opensearchpy.helpers import bulk from typing import Optional from open_webui.retrieval.vector.main import VectorItem, SearchResult, GetResult @@ -21,7 +22,13 @@ class OpenSearchClient: http_auth=(OPENSEARCH_USERNAME, OPENSEARCH_PASSWORD), ) + def _get_index_name(self, collection_name: str) -> str: + return f"{self.index_prefix}_{collection_name}" + def _result_to_get_result(self, result) -> GetResult: + if not result["hits"]["hits"]: + return None + ids = [] documents = [] metadatas = [] @@ -31,9 +38,12 @@ class OpenSearchClient: documents.append(hit["_source"].get("text")) metadatas.append(hit["_source"].get("metadata")) - return GetResult(ids=ids, documents=documents, metadatas=metadatas) + return GetResult(ids=[ids], documents=[documents], metadatas=[metadatas]) def _result_to_search_result(self, result) -> SearchResult: + if not result["hits"]["hits"]: + return None + ids = [] distances = [] documents = [] @@ -46,34 +56,40 @@ class OpenSearchClient: 
metadatas.append(hit["_source"].get("metadata")) return SearchResult( - ids=ids, distances=distances, documents=documents, metadatas=metadatas + ids=[ids], + distances=[distances], + documents=[documents], + metadatas=[metadatas], ) def _create_index(self, collection_name: str, dimension: int): body = { + "settings": {"index": {"knn": True}}, "mappings": { "properties": { "id": {"type": "keyword"}, "vector": { - "type": "dense_vector", - "dims": dimension, # Adjust based on your vector dimensions - "index": true, + "type": "knn_vector", + "dimension": dimension, # Adjust based on your vector dimensions + "index": True, "similarity": "faiss", "method": { "name": "hnsw", - "space_type": "ip", # Use inner product to approximate cosine similarity + "space_type": "innerproduct", # Use inner product to approximate cosine similarity "engine": "faiss", - "ef_construction": 128, - "m": 16, + "parameters": { + "ef_construction": 128, + "m": 16, + }, }, }, "text": {"type": "text"}, "metadata": {"type": "object"}, } - } + }, } self.client.indices.create( - index=f"{self.index_prefix}_{collection_name}", body=body + index=self._get_index_name(collection_name), body=body ) def _create_batches(self, items: list[VectorItem], batch_size=100): @@ -83,39 +99,45 @@ class OpenSearchClient: def has_collection(self, collection_name: str) -> bool: # has_collection here means has index. # We are simply adapting to the norms of the other DBs. - return self.client.indices.exists( - index=f"{self.index_prefix}_{collection_name}" - ) + return self.client.indices.exists(index=self._get_index_name(collection_name)) - def delete_colleciton(self, collection_name: str): + def delete_collection(self, collection_name: str): # delete_collection here means delete index. # We are simply adapting to the norms of the other DBs. 
- self.client.indices.delete(index=f"{self.index_prefix}_{collection_name}") + self.client.indices.delete(index=self._get_index_name(collection_name)) def search( - self, collection_name: str, vectors: list[list[float]], limit: int + self, collection_name: str, vectors: list[list[float | int]], limit: int ) -> Optional[SearchResult]: - query = { - "size": limit, - "_source": ["text", "metadata"], - "query": { - "script_score": { - "query": {"match_all": {}}, - "script": { - "source": "cosineSimilarity(params.vector, 'vector') + 1.0", - "params": { - "vector": vectors[0] - }, # Assuming single query vector - }, - } - }, - } + try: + if not self.has_collection(collection_name): + return None - result = self.client.search( - index=f"{self.index_prefix}_{collection_name}", body=query - ) + query = { + "size": limit, + "_source": ["text", "metadata"], + "query": { + "script_score": { + "query": {"match_all": {}}, + "script": { + "source": "(cosineSimilarity(params.query_value, doc[params.field]) + 1.0) / 2.0", + "params": { + "field": "vector", + "query_value": vectors[0], + }, # Assuming single query vector + }, + } + }, + } - return self._result_to_search_result(result) + result = self.client.search( + index=self._get_index_name(collection_name), body=query + ) + + return self._result_to_search_result(result) + + except Exception as e: + return None def query( self, collection_name: str, filter: dict, limit: Optional[int] = None @@ -129,13 +151,15 @@ class OpenSearchClient: } for field, value in filter.items(): - query_body["query"]["bool"]["filter"].append({"term": {field: value}}) + query_body["query"]["bool"]["filter"].append( + {"match": {"metadata." 
+ str(field): value}} + ) size = limit if limit else 10 try: result = self.client.search( - index=f"{self.index_prefix}_{collection_name}", + index=self._get_index_name(collection_name), body=query_body, size=size, ) @@ -146,14 +170,14 @@ class OpenSearchClient: return None def _create_index_if_not_exists(self, collection_name: str, dimension: int): - if not self.has_index(collection_name): + if not self.has_collection(collection_name): self._create_index(collection_name, dimension) def get(self, collection_name: str) -> Optional[GetResult]: query = {"query": {"match_all": {}}, "_source": ["text", "metadata"]} result = self.client.search( - index=f"{self.index_prefix}_{collection_name}", body=query + index=self._get_index_name(collection_name), body=query ) return self._result_to_get_result(result) @@ -165,18 +189,18 @@ class OpenSearchClient: for batch in self._create_batches(items): actions = [ { - "index": { - "_id": item["id"], - "_source": { - "vector": item["vector"], - "text": item["text"], - "metadata": item["metadata"], - }, - } + "_op_type": "index", + "_index": self._get_index_name(collection_name), + "_id": item["id"], + "_source": { + "vector": item["vector"], + "text": item["text"], + "metadata": item["metadata"], + }, } for item in batch ] - self.client.bulk(actions) + bulk(self.client, actions) def upsert(self, collection_name: str, items: list[VectorItem]): self._create_index_if_not_exists( @@ -186,26 +210,47 @@ class OpenSearchClient: for batch in self._create_batches(items): actions = [ { - "index": { - "_id": item["id"], - "_index": f"{self.index_prefix}_{collection_name}", - "_source": { - "vector": item["vector"], - "text": item["text"], - "metadata": item["metadata"], - }, - } + "_op_type": "update", + "_index": self._get_index_name(collection_name), + "_id": item["id"], + "doc": { + "vector": item["vector"], + "text": item["text"], + "metadata": item["metadata"], + }, + "doc_as_upsert": True, } for item in batch ] - self.client.bulk(actions) 
+ bulk(self.client, actions) - def delete(self, collection_name: str, ids: list[str]): - actions = [ - {"delete": {"_index": f"{self.index_prefix}_{collection_name}", "_id": id}} - for id in ids - ] - self.client.bulk(body=actions) + def delete( + self, + collection_name: str, + ids: Optional[list[str]] = None, + filter: Optional[dict] = None, + ): + if ids: + actions = [ + { + "_op_type": "delete", + "_index": self._get_index_name(collection_name), + "_id": id, + } + for id in ids + ] + bulk(self.client, actions) + elif filter: + query_body = { + "query": {"bool": {"filter": []}}, + } + for field, value in filter.items(): + query_body["query"]["bool"]["filter"].append( + {"match": {"metadata." + str(field): value}} + ) + self.client.delete_by_query( + index=self._get_index_name(collection_name), body=query_body + ) def reset(self): indices = self.client.indices.get(index=f"{self.index_prefix}_*") diff --git a/backend/open_webui/retrieval/vector/dbs/pgvector.py b/backend/open_webui/retrieval/vector/dbs/pgvector.py index eab02232f4..c38dbb0367 100644 --- a/backend/open_webui/retrieval/vector/dbs/pgvector.py +++ b/backend/open_webui/retrieval/vector/dbs/pgvector.py @@ -278,7 +278,9 @@ class PgvectorClient: for row in results: qid = int(row.qid) ids[qid].append(row.id) - distances[qid].append(row.distance) + # normalize and re-order pgvector cosine distance from [2, 0] to a [0, 1] score range + # https://github.com/pgvector/pgvector?tab=readme-ov-file#querying + distances[qid].append((2.0 - row.distance) / 2.0) documents[qid].append(row.text) metadatas[qid].append(row.vmetadata) diff --git a/backend/open_webui/retrieval/vector/dbs/qdrant.py b/backend/open_webui/retrieval/vector/dbs/qdrant.py index 28f0b37793..be0df6c6ac 100644 --- a/backend/open_webui/retrieval/vector/dbs/qdrant.py +++ b/backend/open_webui/retrieval/vector/dbs/qdrant.py @@ -99,7 +99,8 @@ class QdrantClient: ids=get_result.ids, documents=get_result.documents, metadatas=get_result.metadatas, -
distances=[[point.score for point in query_response.points]], + # qdrant distance is [-1, 1], normalize to [0, 1] + distances=[[(point.score + 1.0) / 2.0 for point in query_response.points]], ) def query(self, collection_name: str, filter: dict, limit: Optional[int] = None): diff --git a/backend/open_webui/retrieval/web/utils.py b/backend/open_webui/retrieval/web/utils.py index fd94a1a32f..942cb8483f 100644 --- a/backend/open_webui/retrieval/web/utils.py +++ b/backend/open_webui/retrieval/web/utils.py @@ -24,13 +24,17 @@ from langchain_community.document_loaders import PlaywrightURLLoader, WebBaseLoa from langchain_community.document_loaders.firecrawl import FireCrawlLoader from langchain_community.document_loaders.base import BaseLoader from langchain_core.documents import Document +from open_webui.retrieval.loaders.tavily import TavilyLoader from open_webui.constants import ERROR_MESSAGES from open_webui.config import ( ENABLE_RAG_LOCAL_WEB_FETCH, PLAYWRIGHT_WS_URI, + PLAYWRIGHT_TIMEOUT, RAG_WEB_LOADER_ENGINE, FIRECRAWL_API_BASE_URL, FIRECRAWL_API_KEY, + TAVILY_API_KEY, + TAVILY_EXTRACT_DEPTH, ) from open_webui.env import SRC_LOG_LEVELS @@ -113,7 +117,47 @@ def verify_ssl_cert(url: str) -> bool: return False -class SafeFireCrawlLoader(BaseLoader): +class RateLimitMixin: + async def _wait_for_rate_limit(self): + """Wait to respect the rate limit if specified.""" + if self.requests_per_second and self.last_request_time: + min_interval = timedelta(seconds=1.0 / self.requests_per_second) + time_since_last = datetime.now() - self.last_request_time + if time_since_last < min_interval: + await asyncio.sleep((min_interval - time_since_last).total_seconds()) + self.last_request_time = datetime.now() + + def _sync_wait_for_rate_limit(self): + """Synchronous version of rate limit wait.""" + if self.requests_per_second and self.last_request_time: + min_interval = timedelta(seconds=1.0 / self.requests_per_second) + time_since_last = datetime.now() - self.last_request_time + 
if time_since_last < min_interval: + time.sleep((min_interval - time_since_last).total_seconds()) + self.last_request_time = datetime.now() + + +class URLProcessingMixin: + def _verify_ssl_cert(self, url: str) -> bool: + """Verify SSL certificate for a URL.""" + return verify_ssl_cert(url) + + async def _safe_process_url(self, url: str) -> bool: + """Perform safety checks before processing a URL.""" + if self.verify_ssl and not self._verify_ssl_cert(url): + raise ValueError(f"SSL certificate verification failed for {url}") + await self._wait_for_rate_limit() + return True + + def _safe_process_url_sync(self, url: str) -> bool: + """Synchronous version of safety checks.""" + if self.verify_ssl and not self._verify_ssl_cert(url): + raise ValueError(f"SSL certificate verification failed for {url}") + self._sync_wait_for_rate_limit() + return True + + +class SafeFireCrawlLoader(BaseLoader, RateLimitMixin, URLProcessingMixin): def __init__( self, web_paths, @@ -184,7 +228,7 @@ class SafeFireCrawlLoader(BaseLoader): yield from loader.lazy_load() except Exception as e: if self.continue_on_failure: - log.exception(e, "Error loading %s", url) + log.exception(f"Error loading {url}: {e}") continue raise e @@ -204,47 +248,124 @@ class SafeFireCrawlLoader(BaseLoader): yield document except Exception as e: if self.continue_on_failure: - log.exception(e, "Error loading %s", url) + log.exception(f"Error loading {url}: {e}") continue raise e - def _verify_ssl_cert(self, url: str) -> bool: - return verify_ssl_cert(url) - async def _wait_for_rate_limit(self): - """Wait to respect the rate limit if specified.""" - if self.requests_per_second and self.last_request_time: - min_interval = timedelta(seconds=1.0 / self.requests_per_second) - time_since_last = datetime.now() - self.last_request_time - if time_since_last < min_interval: - await asyncio.sleep((min_interval - time_since_last).total_seconds()) - self.last_request_time = datetime.now() +class SafeTavilyLoader(BaseLoader, 
RateLimitMixin, URLProcessingMixin): + def __init__( + self, + web_paths: Union[str, List[str]], + api_key: str, + extract_depth: Literal["basic", "advanced"] = "basic", + continue_on_failure: bool = True, + requests_per_second: Optional[float] = None, + verify_ssl: bool = True, + trust_env: bool = False, + proxy: Optional[Dict[str, str]] = None, + ): + """Initialize SafeTavilyLoader with rate limiting and SSL verification support. - def _sync_wait_for_rate_limit(self): - """Synchronous version of rate limit wait.""" - if self.requests_per_second and self.last_request_time: - min_interval = timedelta(seconds=1.0 / self.requests_per_second) - time_since_last = datetime.now() - self.last_request_time - if time_since_last < min_interval: - time.sleep((min_interval - time_since_last).total_seconds()) - self.last_request_time = datetime.now() + Args: + web_paths: List of URLs/paths to process. + api_key: The Tavily API key. + extract_depth: Depth of extraction ("basic" or "advanced"). + continue_on_failure: Whether to continue if extraction of a URL fails. + requests_per_second: Number of requests per second to limit to. + verify_ssl: If True, verify SSL certificates. + trust_env: If True, use proxy settings from environment variables. + proxy: Optional proxy configuration. 
+ """ + # Initialize proxy configuration if using environment variables + proxy_server = proxy.get("server") if proxy else None + if trust_env and not proxy_server: + env_proxies = urllib.request.getproxies() + env_proxy_server = env_proxies.get("https") or env_proxies.get("http") + if env_proxy_server: + if proxy: + proxy["server"] = env_proxy_server + else: + proxy = {"server": env_proxy_server} - async def _safe_process_url(self, url: str) -> bool: - """Perform safety checks before processing a URL.""" - if self.verify_ssl and not self._verify_ssl_cert(url): - raise ValueError(f"SSL certificate verification failed for {url}") - await self._wait_for_rate_limit() - return True + # Store parameters for creating TavilyLoader instances + self.web_paths = web_paths if isinstance(web_paths, list) else [web_paths] + self.api_key = api_key + self.extract_depth = extract_depth + self.continue_on_failure = continue_on_failure + self.verify_ssl = verify_ssl + self.trust_env = trust_env + self.proxy = proxy - def _safe_process_url_sync(self, url: str) -> bool: - """Synchronous version of safety checks.""" - if self.verify_ssl and not self._verify_ssl_cert(url): - raise ValueError(f"SSL certificate verification failed for {url}") - self._sync_wait_for_rate_limit() - return True + # Add rate limiting + self.requests_per_second = requests_per_second + self.last_request_time = None + + def lazy_load(self) -> Iterator[Document]: + """Load documents with rate limiting support, delegating to TavilyLoader.""" + valid_urls = [] + for url in self.web_paths: + try: + self._safe_process_url_sync(url) + valid_urls.append(url) + except Exception as e: + log.warning(f"SSL verification failed for {url}: {str(e)}") + if not self.continue_on_failure: + raise e + if not valid_urls: + if self.continue_on_failure: + log.warning("No valid URLs to process after SSL verification") + return + raise ValueError("No valid URLs to process after SSL verification") + try: + loader = TavilyLoader( + 
urls=valid_urls, + api_key=self.api_key, + extract_depth=self.extract_depth, + continue_on_failure=self.continue_on_failure, + ) + yield from loader.lazy_load() + except Exception as e: + if self.continue_on_failure: + log.exception(f"Error extracting content from URLs: {e}") + else: + raise e + + async def alazy_load(self) -> AsyncIterator[Document]: + """Async version with rate limiting and SSL verification.""" + valid_urls = [] + for url in self.web_paths: + try: + await self._safe_process_url(url) + valid_urls.append(url) + except Exception as e: + log.warning(f"SSL verification failed for {url}: {str(e)}") + if not self.continue_on_failure: + raise e + + if not valid_urls: + if self.continue_on_failure: + log.warning("No valid URLs to process after SSL verification") + return + raise ValueError("No valid URLs to process after SSL verification") + + try: + loader = TavilyLoader( + urls=valid_urls, + api_key=self.api_key, + extract_depth=self.extract_depth, + continue_on_failure=self.continue_on_failure, + ) + async for document in loader.alazy_load(): + yield document + except Exception as e: + if self.continue_on_failure: + log.exception(f"Error loading URLs: {e}") + else: + raise e -class SafePlaywrightURLLoader(PlaywrightURLLoader): +class SafePlaywrightURLLoader(PlaywrightURLLoader, RateLimitMixin, URLProcessingMixin): """Load HTML pages safely with Playwright, supporting SSL verification, rate limiting, and remote browser connection. Attributes: @@ -256,6 +377,7 @@ class SafePlaywrightURLLoader(PlaywrightURLLoader): headless (bool): If True, the browser will run in headless mode. proxy (dict): Proxy override settings for the Playwright session. playwright_ws_url (Optional[str]): WebSocket endpoint URI for remote browser connection. + playwright_timeout (Optional[int]): Maximum operation time in milliseconds. 
""" def __init__( @@ -269,6 +391,7 @@ class SafePlaywrightURLLoader(PlaywrightURLLoader): remove_selectors: Optional[List[str]] = None, proxy: Optional[Dict[str, str]] = None, playwright_ws_url: Optional[str] = None, + playwright_timeout: Optional[int] = 10000, ): """Initialize with additional safety parameters and remote browser support.""" @@ -295,6 +418,7 @@ class SafePlaywrightURLLoader(PlaywrightURLLoader): self.last_request_time = None self.playwright_ws_url = playwright_ws_url self.trust_env = trust_env + self.playwright_timeout = playwright_timeout def lazy_load(self) -> Iterator[Document]: """Safely load URLs synchronously with support for remote browser.""" @@ -311,7 +435,7 @@ class SafePlaywrightURLLoader(PlaywrightURLLoader): try: self._safe_process_url_sync(url) page = browser.new_page() - response = page.goto(url) + response = page.goto(url, timeout=self.playwright_timeout) if response is None: raise ValueError(f"page.goto() returned None for url {url}") @@ -320,7 +444,7 @@ class SafePlaywrightURLLoader(PlaywrightURLLoader): yield Document(page_content=text, metadata=metadata) except Exception as e: if self.continue_on_failure: - log.exception(e, "Error loading %s", url) + log.exception(f"Error loading {url}: {e}") continue raise e browser.close() @@ -342,7 +466,7 @@ class SafePlaywrightURLLoader(PlaywrightURLLoader): try: await self._safe_process_url(url) page = await browser.new_page() - response = await page.goto(url) + response = await page.goto(url, timeout=self.playwright_timeout) if response is None: raise ValueError(f"page.goto() returned None for url {url}") @@ -351,46 +475,11 @@ class SafePlaywrightURLLoader(PlaywrightURLLoader): yield Document(page_content=text, metadata=metadata) except Exception as e: if self.continue_on_failure: - log.exception(e, "Error loading %s", url) + log.exception(f"Error loading {url}: {e}") continue raise e await browser.close() - def _verify_ssl_cert(self, url: str) -> bool: - return verify_ssl_cert(url) - - 
async def _wait_for_rate_limit(self): - """Wait to respect the rate limit if specified.""" - if self.requests_per_second and self.last_request_time: - min_interval = timedelta(seconds=1.0 / self.requests_per_second) - time_since_last = datetime.now() - self.last_request_time - if time_since_last < min_interval: - await asyncio.sleep((min_interval - time_since_last).total_seconds()) - self.last_request_time = datetime.now() - - def _sync_wait_for_rate_limit(self): - """Synchronous version of rate limit wait.""" - if self.requests_per_second and self.last_request_time: - min_interval = timedelta(seconds=1.0 / self.requests_per_second) - time_since_last = datetime.now() - self.last_request_time - if time_since_last < min_interval: - time.sleep((min_interval - time_since_last).total_seconds()) - self.last_request_time = datetime.now() - - async def _safe_process_url(self, url: str) -> bool: - """Perform safety checks before processing a URL.""" - if self.verify_ssl and not self._verify_ssl_cert(url): - raise ValueError(f"SSL certificate verification failed for {url}") - await self._wait_for_rate_limit() - return True - - def _safe_process_url_sync(self, url: str) -> bool: - """Synchronous version of safety checks.""" - if self.verify_ssl and not self._verify_ssl_cert(url): - raise ValueError(f"SSL certificate verification failed for {url}") - self._sync_wait_for_rate_limit() - return True - class SafeWebBaseLoader(WebBaseLoader): """WebBaseLoader with enhanced error handling for URLs.""" @@ -472,7 +561,7 @@ class SafeWebBaseLoader(WebBaseLoader): yield Document(page_content=text, metadata=metadata) except Exception as e: # Log the error and continue with the next URL - log.exception(e, "Error loading %s", path) + log.exception(f"Error loading {path}: {e}") async def alazy_load(self) -> AsyncIterator[Document]: """Async lazy load text from the url(s) in web_path.""" @@ -499,6 +588,7 @@ RAG_WEB_LOADER_ENGINES = defaultdict(lambda: SafeWebBaseLoader) 
RAG_WEB_LOADER_ENGINES["playwright"] = SafePlaywrightURLLoader RAG_WEB_LOADER_ENGINES["safe_web"] = SafeWebBaseLoader RAG_WEB_LOADER_ENGINES["firecrawl"] = SafeFireCrawlLoader +RAG_WEB_LOADER_ENGINES["tavily"] = SafeTavilyLoader def get_web_loader( @@ -518,13 +608,19 @@ def get_web_loader( "trust_env": trust_env, } - if PLAYWRIGHT_WS_URI.value: - web_loader_args["playwright_ws_url"] = PLAYWRIGHT_WS_URI.value + if RAG_WEB_LOADER_ENGINE.value == "playwright": + web_loader_args["playwright_timeout"] = PLAYWRIGHT_TIMEOUT.value * 1000 + if PLAYWRIGHT_WS_URI.value: + web_loader_args["playwright_ws_url"] = PLAYWRIGHT_WS_URI.value if RAG_WEB_LOADER_ENGINE.value == "firecrawl": web_loader_args["api_key"] = FIRECRAWL_API_KEY.value web_loader_args["api_url"] = FIRECRAWL_API_BASE_URL.value + if RAG_WEB_LOADER_ENGINE.value == "tavily": + web_loader_args["api_key"] = TAVILY_API_KEY.value + web_loader_args["extract_depth"] = TAVILY_EXTRACT_DEPTH.value + # Create the appropriate WebLoader based on the configuration WebLoaderClass = RAG_WEB_LOADER_ENGINES[RAG_WEB_LOADER_ENGINE.value] web_loader = WebLoaderClass(**web_loader_args) diff --git a/backend/open_webui/routers/audio.py b/backend/open_webui/routers/audio.py index d6f74eac4c..ea13726235 100644 --- a/backend/open_webui/routers/audio.py +++ b/backend/open_webui/routers/audio.py @@ -625,7 +625,9 @@ def transcription( ): log.info(f"file.content_type: {file.content_type}") - if file.content_type not in ["audio/mpeg", "audio/wav", "audio/ogg", "audio/x-m4a"]: + supported_filetypes = ("audio/mpeg", "audio/wav", "audio/ogg", "audio/x-m4a") + + if not file.content_type.startswith(supported_filetypes): raise HTTPException( status_code=status.HTTP_400_BAD_REQUEST, detail=ERROR_MESSAGES.FILE_NOT_SUPPORTED, diff --git a/backend/open_webui/routers/auths.py b/backend/open_webui/routers/auths.py index 399283ee4f..34a63ba3fa 100644 --- a/backend/open_webui/routers/auths.py +++ b/backend/open_webui/routers/auths.py @@ -210,7 +210,7 @@ async 
def ldap_auth(request: Request, response: Response, form_data: LdapForm): LDAP_APP_DN, LDAP_APP_PASSWORD, auto_bind="NONE", - authentication="SIMPLE", + authentication="SIMPLE" if LDAP_APP_DN else "ANONYMOUS", ) if not connection_app.bind(): raise HTTPException(400, detail="Application account bind failed") @@ -639,11 +639,12 @@ async def get_admin_config(request: Request, user=Depends(get_admin_user)): "ENABLE_API_KEY": request.app.state.config.ENABLE_API_KEY, "ENABLE_API_KEY_ENDPOINT_RESTRICTIONS": request.app.state.config.ENABLE_API_KEY_ENDPOINT_RESTRICTIONS, "API_KEY_ALLOWED_ENDPOINTS": request.app.state.config.API_KEY_ALLOWED_ENDPOINTS, - "ENABLE_CHANNELS": request.app.state.config.ENABLE_CHANNELS, "DEFAULT_USER_ROLE": request.app.state.config.DEFAULT_USER_ROLE, "JWT_EXPIRES_IN": request.app.state.config.JWT_EXPIRES_IN, "ENABLE_COMMUNITY_SHARING": request.app.state.config.ENABLE_COMMUNITY_SHARING, "ENABLE_MESSAGE_RATING": request.app.state.config.ENABLE_MESSAGE_RATING, + "ENABLE_CHANNELS": request.app.state.config.ENABLE_CHANNELS, + "ENABLE_USER_WEBHOOKS": request.app.state.config.ENABLE_USER_WEBHOOKS, } @@ -654,11 +655,12 @@ class AdminConfig(BaseModel): ENABLE_API_KEY: bool ENABLE_API_KEY_ENDPOINT_RESTRICTIONS: bool API_KEY_ALLOWED_ENDPOINTS: str - ENABLE_CHANNELS: bool DEFAULT_USER_ROLE: str JWT_EXPIRES_IN: str ENABLE_COMMUNITY_SHARING: bool ENABLE_MESSAGE_RATING: bool + ENABLE_CHANNELS: bool + ENABLE_USER_WEBHOOKS: bool @router.post("/admin/config") @@ -693,6 +695,8 @@ async def update_admin_config( ) request.app.state.config.ENABLE_MESSAGE_RATING = form_data.ENABLE_MESSAGE_RATING + request.app.state.config.ENABLE_USER_WEBHOOKS = form_data.ENABLE_USER_WEBHOOKS + return { "SHOW_ADMIN_DETAILS": request.app.state.config.SHOW_ADMIN_DETAILS, "WEBUI_URL": request.app.state.config.WEBUI_URL, @@ -705,6 +709,7 @@ async def update_admin_config( "JWT_EXPIRES_IN": request.app.state.config.JWT_EXPIRES_IN, "ENABLE_COMMUNITY_SHARING": 
request.app.state.config.ENABLE_COMMUNITY_SHARING, "ENABLE_MESSAGE_RATING": request.app.state.config.ENABLE_MESSAGE_RATING, + "ENABLE_USER_WEBHOOKS": request.app.state.config.ENABLE_USER_WEBHOOKS, } diff --git a/backend/open_webui/routers/chats.py b/backend/open_webui/routers/chats.py index 2efd043efe..74bb96c947 100644 --- a/backend/open_webui/routers/chats.py +++ b/backend/open_webui/routers/chats.py @@ -2,6 +2,8 @@ import json import logging from typing import Optional + +from open_webui.socket.main import get_event_emitter from open_webui.models.chats import ( ChatForm, ChatImportForm, @@ -372,6 +374,107 @@ async def update_chat_by_id( ) +############################ +# UpdateChatMessageById +############################ +class MessageForm(BaseModel): + content: str + + +@router.post("/{id}/messages/{message_id}", response_model=Optional[ChatResponse]) +async def update_chat_message_by_id( + id: str, message_id: str, form_data: MessageForm, user=Depends(get_verified_user) +): + chat = Chats.get_chat_by_id(id) + + if not chat: + raise HTTPException( + status_code=status.HTTP_401_UNAUTHORIZED, + detail=ERROR_MESSAGES.ACCESS_PROHIBITED, + ) + + if chat.user_id != user.id and user.role != "admin": + raise HTTPException( + status_code=status.HTTP_401_UNAUTHORIZED, + detail=ERROR_MESSAGES.ACCESS_PROHIBITED, + ) + + chat = Chats.upsert_message_to_chat_by_id_and_message_id( + id, + message_id, + { + "content": form_data.content, + }, + ) + + event_emitter = get_event_emitter( + { + "user_id": user.id, + "chat_id": id, + "message_id": message_id, + }, + False, + ) + + if event_emitter: + await event_emitter( + { + "type": "chat:message", + "data": { + "chat_id": id, + "message_id": message_id, + "content": form_data.content, + }, + } + ) + + return ChatResponse(**chat.model_dump()) + + +############################ +# SendChatMessageEventById +############################ +class EventForm(BaseModel): + type: str + data: dict + + 
+@router.post("/{id}/messages/{message_id}/event", response_model=Optional[bool]) +async def send_chat_message_event_by_id( + id: str, message_id: str, form_data: EventForm, user=Depends(get_verified_user) +): + chat = Chats.get_chat_by_id(id) + + if not chat: + raise HTTPException( + status_code=status.HTTP_401_UNAUTHORIZED, + detail=ERROR_MESSAGES.ACCESS_PROHIBITED, + ) + + if chat.user_id != user.id and user.role != "admin": + raise HTTPException( + status_code=status.HTTP_401_UNAUTHORIZED, + detail=ERROR_MESSAGES.ACCESS_PROHIBITED, + ) + + event_emitter = get_event_emitter( + { + "user_id": user.id, + "chat_id": id, + "message_id": message_id, + } + ) + + try: + if event_emitter: + await event_emitter(form_data.model_dump()) + else: + return False + return True + except: + return False + + ############################ # DeleteChatById ############################ diff --git a/backend/open_webui/routers/evaluations.py b/backend/open_webui/routers/evaluations.py index f0c4a6b065..8597fa2863 100644 --- a/backend/open_webui/routers/evaluations.py +++ b/backend/open_webui/routers/evaluations.py @@ -56,8 +56,19 @@ async def update_config( } +class FeedbackUserReponse(BaseModel): + id: str + name: str + email: str + role: str = "pending" + + last_active_at: int # timestamp in epoch + updated_at: int # timestamp in epoch + created_at: int # timestamp in epoch + + class FeedbackUserResponse(FeedbackResponse): - user: Optional[UserModel] = None + user: Optional[FeedbackUserReponse] = None @router.get("/feedbacks/all", response_model=list[FeedbackUserResponse]) @@ -65,7 +76,10 @@ async def get_all_feedbacks(user=Depends(get_admin_user)): feedbacks = Feedbacks.get_all_feedbacks() return [ FeedbackUserResponse( - **feedback.model_dump(), user=Users.get_user_by_id(feedback.user_id) + **feedback.model_dump(), + user=FeedbackUserReponse( + **Users.get_user_by_id(feedback.user_id).model_dump() + ), ) for feedback in feedbacks ] diff --git a/backend/open_webui/routers/files.py 
b/backend/open_webui/routers/files.py index 95b7f6461a..22e1269e37 100644 --- a/backend/open_webui/routers/files.py +++ b/backend/open_webui/routers/files.py @@ -5,7 +5,16 @@ from pathlib import Path from typing import Optional from urllib.parse import quote -from fastapi import APIRouter, Depends, File, HTTPException, Request, UploadFile, status +from fastapi import ( + APIRouter, + Depends, + File, + HTTPException, + Request, + UploadFile, + status, + Query, +) from fastapi.responses import FileResponse, StreamingResponse from open_webui.constants import ERROR_MESSAGES from open_webui.env import SRC_LOG_LEVELS @@ -15,6 +24,9 @@ from open_webui.models.files import ( FileModelResponse, Files, ) +from open_webui.models.knowledge import Knowledges + +from open_webui.routers.knowledge import get_knowledge, get_knowledge_list from open_webui.routers.retrieval import ProcessFileForm, process_file from open_webui.routers.audio import transcribe from open_webui.storage.provider import Storage @@ -27,6 +39,39 @@ log.setLevel(SRC_LOG_LEVELS["MODELS"]) router = APIRouter() + +############################ +# Check if the current user has access to a file through any knowledge bases the user may be in. 
+############################ + + +def has_access_to_file( + file_id: Optional[str], access_type: str, user=Depends(get_verified_user) +) -> bool: + file = Files.get_file_by_id(file_id) + log.debug(f"Checking if user has {access_type} access to file") + + if not file: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail=ERROR_MESSAGES.NOT_FOUND, + ) + + has_access = False + knowledge_base_id = file.meta.get("collection_name") if file.meta else None + + if knowledge_base_id: + knowledge_bases = Knowledges.get_knowledge_bases_by_user_id( + user.id, access_type + ) + for knowledge_base in knowledge_bases: + if knowledge_base.id == knowledge_base_id: + has_access = True + break + + return has_access + + ############################ # Upload File ############################ @@ -38,6 +83,7 @@ def upload_file( file: UploadFile = File(...), user=Depends(get_verified_user), file_metadata: dict = {}, + process: bool = Query(True), ): log.info(f"file.content_type: {file.content_type}") try: @@ -66,34 +112,33 @@ def upload_file( } ), ) - - try: - if file.content_type in [ - "audio/mpeg", - "audio/wav", - "audio/ogg", - "audio/x-m4a", - ]: - file_path = Storage.get_file(file_path) - result = transcribe(request, file_path) - process_file( - request, - ProcessFileForm(file_id=id, content=result.get("text", "")), - user=user, + if process: + try: + if file.content_type in [ + "audio/mpeg", + "audio/wav", + "audio/ogg", + "audio/x-m4a", + ]: + file_path = Storage.get_file(file_path) + result = transcribe(request, file_path) + process_file( + request, + ProcessFileForm(file_id=id, content=result.get("text", "")), + user=user, + ) + elif file.content_type not in ["image/png", "image/jpeg", "image/gif"]: + process_file(request, ProcessFileForm(file_id=id), user=user) + file_item = Files.get_file_by_id(id=id) + except Exception as e: + log.exception(e) + log.error(f"Error processing file: {file_item.id}") + file_item = FileModelResponse( + **{ + 
**file_item.model_dump(), + "error": str(e.detail) if hasattr(e, "detail") else str(e), + } ) - else: - process_file(request, ProcessFileForm(file_id=id), user=user) - - file_item = Files.get_file_by_id(id=id) - except Exception as e: - log.exception(e) - log.error(f"Error processing file: {file_item.id}") - file_item = FileModelResponse( - **{ - **file_item.model_dump(), - "error": str(e.detail) if hasattr(e, "detail") else str(e), - } - ) if file_item: return file_item @@ -160,7 +205,17 @@ async def delete_all_files(user=Depends(get_admin_user)): async def get_file_by_id(id: str, user=Depends(get_verified_user)): file = Files.get_file_by_id(id) - if file and (file.user_id == user.id or user.role == "admin"): + if not file: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail=ERROR_MESSAGES.NOT_FOUND, + ) + + if ( + file.user_id == user.id + or user.role == "admin" + or has_access_to_file(id, "read", user) + ): return file else: raise HTTPException( @@ -178,7 +233,17 @@ async def get_file_by_id(id: str, user=Depends(get_verified_user)): async def get_file_data_content_by_id(id: str, user=Depends(get_verified_user)): file = Files.get_file_by_id(id) - if file and (file.user_id == user.id or user.role == "admin"): + if not file: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail=ERROR_MESSAGES.NOT_FOUND, + ) + + if ( + file.user_id == user.id + or user.role == "admin" + or has_access_to_file(id, "read", user) + ): return {"content": file.data.get("content", "")} else: raise HTTPException( @@ -202,7 +267,17 @@ async def update_file_data_content_by_id( ): file = Files.get_file_by_id(id) - if file and (file.user_id == user.id or user.role == "admin"): + if not file: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail=ERROR_MESSAGES.NOT_FOUND, + ) + + if ( + file.user_id == user.id + or user.role == "admin" + or has_access_to_file(id, "write", user) + ): try: process_file( request, @@ -228,9 +303,22 @@ async 
def update_file_data_content_by_id( @router.get("/{id}/content") -async def get_file_content_by_id(id: str, user=Depends(get_verified_user)): +async def get_file_content_by_id( + id: str, user=Depends(get_verified_user), attachment: bool = Query(False) +): file = Files.get_file_by_id(id) - if file and (file.user_id == user.id or user.role == "admin"): + + if not file: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail=ERROR_MESSAGES.NOT_FOUND, + ) + + if ( + file.user_id == user.id + or user.role == "admin" + or has_access_to_file(id, "read", user) + ): try: file_path = Storage.get_file(file.path) file_path = Path(file_path) @@ -246,17 +334,22 @@ async def get_file_content_by_id(id: str, user=Depends(get_verified_user)): encoded_filename = quote(filename) headers = {} - if content_type == "application/pdf" or filename.lower().endswith( - ".pdf" - ): - headers["Content-Disposition"] = ( - f"inline; filename*=UTF-8''{encoded_filename}" - ) - content_type = "application/pdf" - elif content_type != "text/plain": + if attachment: headers["Content-Disposition"] = ( f"attachment; filename*=UTF-8''{encoded_filename}" ) + else: + if content_type == "application/pdf" or filename.lower().endswith( + ".pdf" + ): + headers["Content-Disposition"] = ( + f"inline; filename*=UTF-8''{encoded_filename}" + ) + content_type = "application/pdf" + elif content_type != "text/plain": + headers["Content-Disposition"] = ( + f"attachment; filename*=UTF-8''{encoded_filename}" + ) return FileResponse(file_path, headers=headers, media_type=content_type) @@ -282,7 +375,18 @@ async def get_file_content_by_id(id: str, user=Depends(get_verified_user)): @router.get("/{id}/content/html") async def get_html_file_content_by_id(id: str, user=Depends(get_verified_user)): file = Files.get_file_by_id(id) - if file and (file.user_id == user.id or user.role == "admin"): + + if not file: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail=ERROR_MESSAGES.NOT_FOUND, + ) 
+ + if ( + file.user_id == user.id + or user.role == "admin" + or has_access_to_file(id, "read", user) + ): try: file_path = Storage.get_file(file.path) file_path = Path(file_path) @@ -314,7 +418,17 @@ async def get_html_file_content_by_id(id: str, user=Depends(get_verified_user)): async def get_file_content_by_id(id: str, user=Depends(get_verified_user)): file = Files.get_file_by_id(id) - if file and (file.user_id == user.id or user.role == "admin"): + if not file: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail=ERROR_MESSAGES.NOT_FOUND, + ) + + if ( + file.user_id == user.id + or user.role == "admin" + or has_access_to_file(id, "read", user) + ): file_path = file.path # Handle Unicode filenames @@ -365,7 +479,18 @@ async def get_file_content_by_id(id: str, user=Depends(get_verified_user)): @router.delete("/{id}") async def delete_file_by_id(id: str, user=Depends(get_verified_user)): file = Files.get_file_by_id(id) - if file and (file.user_id == user.id or user.role == "admin"): + + if not file: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail=ERROR_MESSAGES.NOT_FOUND, + ) + + if ( + file.user_id == user.id + or user.role == "admin" + or has_access_to_file(id, "write", user) + ): # We should add Chroma cleanup here result = Files.delete_file_by_id(id) diff --git a/backend/open_webui/routers/folders.py b/backend/open_webui/routers/folders.py index ca2fbd2132..cf37f9329d 100644 --- a/backend/open_webui/routers/folders.py +++ b/backend/open_webui/routers/folders.py @@ -20,11 +20,13 @@ from open_webui.env import SRC_LOG_LEVELS from open_webui.constants import ERROR_MESSAGES -from fastapi import APIRouter, Depends, File, HTTPException, UploadFile, status +from fastapi import APIRouter, Depends, File, HTTPException, UploadFile, status, Request from fastapi.responses import FileResponse, StreamingResponse from open_webui.utils.auth import get_admin_user, get_verified_user +from open_webui.utils.access_control import 
has_permission + log = logging.getLogger(__name__) log.setLevel(SRC_LOG_LEVELS["MODELS"]) @@ -228,7 +230,18 @@ async def update_folder_is_expanded_by_id( @router.delete("/{id}") -async def delete_folder_by_id(id: str, user=Depends(get_verified_user)): +async def delete_folder_by_id( + request: Request, id: str, user=Depends(get_verified_user) +): + chat_delete_permission = has_permission( + user.id, "chat.delete", request.app.state.config.USER_PERMISSIONS + ) + if not chat_delete_permission: + raise HTTPException( + status_code=status.HTTP_403_FORBIDDEN, + detail=ERROR_MESSAGES.ACCESS_PROHIBITED, + ) + folder = Folders.get_folder_by_id_and_user_id(id, user.id) if folder: try: diff --git a/backend/open_webui/routers/images.py b/backend/open_webui/routers/images.py index c51d2f9960..275704f341 100644 --- a/backend/open_webui/routers/images.py +++ b/backend/open_webui/routers/images.py @@ -517,10 +517,8 @@ async def image_generations( images = [] for image in res["data"]: - if "url" in image: - image_data, content_type = load_url_image_data( - image["url"], headers - ) + if image_url := image.get("url", None): + image_data, content_type = load_url_image_data(image_url, headers) else: image_data, content_type = load_b64_image_data(image["b64_json"]) diff --git a/backend/open_webui/routers/knowledge.py b/backend/open_webui/routers/knowledge.py index 1969045505..bc1e2429e9 100644 --- a/backend/open_webui/routers/knowledge.py +++ b/backend/open_webui/routers/knowledge.py @@ -437,14 +437,24 @@ def remove_file_from_knowledge_by_id( ) # Remove content from the vector database - VECTOR_DB_CLIENT.delete( - collection_name=knowledge.id, filter={"file_id": form_data.file_id} - ) + try: + VECTOR_DB_CLIENT.delete( + collection_name=knowledge.id, filter={"file_id": form_data.file_id} + ) + except Exception as e: + log.debug("This was most likely caused by bypassing embedding processing") + log.debug(e) + pass - # Remove the file's collection from vector database - file_collection = 
f"file-{form_data.file_id}" - if VECTOR_DB_CLIENT.has_collection(collection_name=file_collection): - VECTOR_DB_CLIENT.delete_collection(collection_name=file_collection) + try: + # Remove the file's collection from vector database + file_collection = f"file-{form_data.file_id}" + if VECTOR_DB_CLIENT.has_collection(collection_name=file_collection): + VECTOR_DB_CLIENT.delete_collection(collection_name=file_collection) + except Exception as e: + log.debug("This was most likely caused by bypassing embedding processing") + log.debug(e) + pass # Delete file from database Files.delete_file_by_id(form_data.file_id) diff --git a/backend/open_webui/routers/memories.py b/backend/open_webui/routers/memories.py index c55a6a9cc9..e660ef852b 100644 --- a/backend/open_webui/routers/memories.py +++ b/backend/open_webui/routers/memories.py @@ -57,7 +57,9 @@ async def add_memory( { "id": memory.id, "text": memory.content, - "vector": request.app.state.EMBEDDING_FUNCTION(memory.content, user), + "vector": request.app.state.EMBEDDING_FUNCTION( + memory.content, user=user + ), "metadata": {"created_at": memory.created_at}, } ], @@ -82,7 +84,7 @@ async def query_memory( ): results = VECTOR_DB_CLIENT.search( collection_name=f"user-memory-{user.id}", - vectors=[request.app.state.EMBEDDING_FUNCTION(form_data.content, user)], + vectors=[request.app.state.EMBEDDING_FUNCTION(form_data.content, user=user)], limit=form_data.k, ) @@ -105,7 +107,9 @@ async def reset_memory_from_vector_db( { "id": memory.id, "text": memory.content, - "vector": request.app.state.EMBEDDING_FUNCTION(memory.content, user), + "vector": request.app.state.EMBEDDING_FUNCTION( + memory.content, user=user + ), "metadata": { "created_at": memory.created_at, "updated_at": memory.updated_at, @@ -161,7 +165,7 @@ async def update_memory_by_id( "id": memory.id, "text": memory.content, "vector": request.app.state.EMBEDDING_FUNCTION( - memory.content, user + memory.content, user=user ), "metadata": { "created_at": memory.created_at, 
diff --git a/backend/open_webui/routers/ollama.py b/backend/open_webui/routers/ollama.py index 959b8417aa..fcb263d1e0 100644 --- a/backend/open_webui/routers/ollama.py +++ b/backend/open_webui/routers/ollama.py @@ -295,7 +295,7 @@ async def update_config( } -@cached(ttl=3) +@cached(ttl=1) async def get_all_models(request: Request, user: UserModel = None): log.info("get_all_models()") if request.app.state.config.ENABLE_OLLAMA_API: @@ -336,6 +336,7 @@ async def get_all_models(request: Request, user: UserModel = None): ) prefix_id = api_config.get("prefix_id", None) + tags = api_config.get("tags", []) model_ids = api_config.get("model_ids", []) if len(model_ids) != 0 and "models" in response: @@ -350,6 +351,10 @@ async def get_all_models(request: Request, user: UserModel = None): for model in response.get("models", []): model["model"] = f"{prefix_id}.{model['model']}" + if tags: + for model in response.get("models", []): + model["tags"] = tags + def merge_models_lists(model_lists): merged_models = {} @@ -460,18 +465,27 @@ async def get_ollama_versions(request: Request, url_idx: Optional[int] = None): if request.app.state.config.ENABLE_OLLAMA_API: if url_idx is None: # returns lowest version - request_tasks = [ - send_get_request( - f"{url}/api/version", + request_tasks = [] + + for idx, url in enumerate(request.app.state.config.OLLAMA_BASE_URLS): + api_config = request.app.state.config.OLLAMA_API_CONFIGS.get( + str(idx), request.app.state.config.OLLAMA_API_CONFIGS.get( - str(idx), - request.app.state.config.OLLAMA_API_CONFIGS.get( - url, {} - ), # Legacy support - ).get("key", None), + url, {} + ), # Legacy support ) - for idx, url in enumerate(request.app.state.config.OLLAMA_BASE_URLS) - ] + + enable = api_config.get("enable", True) + key = api_config.get("key", None) + + if enable: + request_tasks.append( + send_get_request( + f"{url}/api/version", + key, + ) + ) + responses = await asyncio.gather(*request_tasks) responses = list(filter(lambda x: x is not None, 
responses)) @@ -1164,7 +1178,7 @@ async def generate_chat_completion( prefix_id = api_config.get("prefix_id", None) if prefix_id: payload["model"] = payload["model"].replace(f"{prefix_id}.", "") - + # payload["keep_alive"] = -1 # keep alive forever return await send_post_request( url=f"{url}/api/chat", payload=json.dumps(payload), diff --git a/backend/open_webui/routers/openai.py b/backend/open_webui/routers/openai.py index 73b182d3cd..0310014cf5 100644 --- a/backend/open_webui/routers/openai.py +++ b/backend/open_webui/routers/openai.py @@ -36,6 +36,9 @@ from open_webui.utils.payload import ( apply_model_params_to_body_openai, apply_model_system_prompt_to_body, ) +from open_webui.utils.misc import ( + convert_logit_bias_input_to_json, +) from open_webui.utils.auth import get_admin_user, get_verified_user from open_webui.utils.access_control import has_access @@ -350,6 +353,7 @@ async def get_all_models_responses(request: Request, user: UserModel) -> list: ) prefix_id = api_config.get("prefix_id", None) + tags = api_config.get("tags", []) if prefix_id: for model in ( @@ -357,6 +361,12 @@ async def get_all_models_responses(request: Request, user: UserModel) -> list: ): model["id"] = f"{prefix_id}.{model['id']}" + if tags: + for model in ( + response if isinstance(response, list) else response.get("data", []) + ): + model["tags"] = tags + log.debug(f"get_all_models:responses() {responses}") return responses @@ -374,7 +384,7 @@ async def get_filtered_models(models, user): return filtered_models -@cached(ttl=3) +@cached(ttl=1) async def get_all_models(request: Request, user: UserModel) -> dict[str, list]: log.info("get_all_models()") @@ -396,6 +406,7 @@ async def get_all_models(request: Request, user: UserModel) -> dict[str, list]: for idx, models in enumerate(model_lists): if models is not None and "error" not in models: + merged_list.extend( [ { @@ -406,18 +417,21 @@ async def get_all_models(request: Request, user: UserModel) -> dict[str, list]: "urlIdx": idx, } for 
model in models - if "api.openai.com" - not in request.app.state.config.OPENAI_API_BASE_URLS[idx] - or not any( - name in model["id"] - for name in [ - "babbage", - "dall-e", - "davinci", - "embedding", - "tts", - "whisper", - ] + if (model.get("id") or model.get("name")) + and ( + "api.openai.com" + not in request.app.state.config.OPENAI_API_BASE_URLS[idx] + or not any( + name in model["id"] + for name in [ + "babbage", + "dall-e", + "davinci", + "embedding", + "tts", + "whisper", + ] + ) ) ] ) @@ -666,6 +680,11 @@ async def generate_chat_completion( del payload["max_tokens"] # Convert the modified body back to JSON + if "logit_bias" in payload: + payload["logit_bias"] = json.loads( + convert_logit_bias_input_to_json(payload["logit_bias"]) + ) + payload = json.dumps(payload) r = None diff --git a/backend/open_webui/routers/pipelines.py b/backend/open_webui/routers/pipelines.py index 599208e43d..10c8e9b2ec 100644 --- a/backend/open_webui/routers/pipelines.py +++ b/backend/open_webui/routers/pipelines.py @@ -90,8 +90,8 @@ async def process_pipeline_inlet_filter(request, payload, user, models): headers=headers, json=request_data, ) as response: - response.raise_for_status() payload = await response.json() + response.raise_for_status() except aiohttp.ClientResponseError as e: res = ( await response.json() @@ -139,8 +139,8 @@ async def process_pipeline_outlet_filter(request, payload, user, models): headers=headers, json=request_data, ) as response: - response.raise_for_status() payload = await response.json() + response.raise_for_status() except aiohttp.ClientResponseError as e: try: res = ( diff --git a/backend/open_webui/routers/retrieval.py b/backend/open_webui/routers/retrieval.py index ac38c236e5..2bd908606a 100644 --- a/backend/open_webui/routers/retrieval.py +++ b/backend/open_webui/routers/retrieval.py @@ -74,7 +74,6 @@ from open_webui.utils.misc import ( ) from open_webui.utils.auth import get_admin_user, get_verified_user - from open_webui.config import ( 
ENV, RAG_EMBEDDING_MODEL_AUTO_UPDATE, @@ -83,6 +82,8 @@ from open_webui.config import ( RAG_RERANKING_MODEL_TRUST_REMOTE_CODE, UPLOAD_DIR, DEFAULT_LOCALE, + RAG_EMBEDDING_CONTENT_PREFIX, + RAG_EMBEDDING_QUERY_PREFIX, ) from open_webui.env import ( SRC_LOG_LEVELS, @@ -358,6 +359,7 @@ async def get_rag_config(request: Request, user=Depends(get_admin_user)): "content_extraction": { "engine": request.app.state.config.CONTENT_EXTRACTION_ENGINE, "tika_server_url": request.app.state.config.TIKA_SERVER_URL, + "docling_server_url": request.app.state.config.DOCLING_SERVER_URL, "document_intelligence_config": { "endpoint": request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT, "key": request.app.state.config.DOCUMENT_INTELLIGENCE_KEY, @@ -428,6 +430,7 @@ class DocumentIntelligenceConfigForm(BaseModel): class ContentExtractionConfig(BaseModel): engine: str = "" tika_server_url: Optional[str] = None + docling_server_url: Optional[str] = None document_intelligence_config: Optional[DocumentIntelligenceConfigForm] = None @@ -540,6 +543,9 @@ async def update_rag_config( request.app.state.config.TIKA_SERVER_URL = ( form_data.content_extraction.tika_server_url ) + request.app.state.config.DOCLING_SERVER_URL = ( + form_data.content_extraction.docling_server_url + ) if form_data.content_extraction.document_intelligence_config is not None: request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT = ( form_data.content_extraction.document_intelligence_config.endpoint @@ -648,6 +654,7 @@ async def update_rag_config( "content_extraction": { "engine": request.app.state.config.CONTENT_EXTRACTION_ENGINE, "tika_server_url": request.app.state.config.TIKA_SERVER_URL, + "docling_server_url": request.app.state.config.DOCLING_SERVER_URL, "document_intelligence_config": { "endpoint": request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT, "key": request.app.state.config.DOCUMENT_INTELLIGENCE_KEY, @@ -713,6 +720,7 @@ async def get_query_settings(request: Request, user=Depends(get_admin_user)): 
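The retrieval.py hunks import `RAG_EMBEDDING_CONTENT_PREFIX` and `RAG_EMBEDDING_QUERY_PREFIX` and thread a `prefix` argument through every embedding call: documents are embedded with the content prefix at index time and queries with the query prefix at search time. A sketch of the idea — asymmetric embedding models (E5-style) expect different prefixes on the two sides; the `"passage: "`/`"query: "` strings below are illustrative examples, not Open WebUI's defaults:

```python
# Example prefix values; the real ones come from configuration.
RAG_EMBEDDING_CONTENT_PREFIX = "passage: "
RAG_EMBEDDING_QUERY_PREFIX = "query: "

def embed(texts, prefix=None):
    # Stand-in for the real embedding function: record what would be
    # sent to the model instead of producing vectors.
    return [f"{prefix or ''}{t}" for t in texts]

# Index time: documents carry the content prefix.
indexed = embed(["Open WebUI supports RAG."], prefix=RAG_EMBEDDING_CONTENT_PREFIX)

# Query time: the same text pipeline applies the query prefix instead.
queried = embed(["does it support RAG?"], prefix=RAG_EMBEDDING_QUERY_PREFIX)
```

This is also why the hybrid-search lambdas in the later hunks change from `lambda query:` to `lambda query, prefix:` — the search layer now decides which prefix each embedding call gets.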
"status": True, "template": request.app.state.config.RAG_TEMPLATE, "k": request.app.state.config.TOP_K, + "k_reranker": request.app.state.config.TOP_K_RERANKER, "r": request.app.state.config.RELEVANCE_THRESHOLD, "hybrid": request.app.state.config.ENABLE_RAG_HYBRID_SEARCH, } @@ -720,6 +728,7 @@ async def get_query_settings(request: Request, user=Depends(get_admin_user)): class QuerySettingsForm(BaseModel): k: Optional[int] = None + k_reranker: Optional[int] = None r: Optional[float] = None template: Optional[str] = None hybrid: Optional[bool] = None @@ -731,6 +740,7 @@ async def update_query_settings( ): request.app.state.config.RAG_TEMPLATE = form_data.template request.app.state.config.TOP_K = form_data.k if form_data.k else 4 + request.app.state.config.TOP_K_RERANKER = form_data.k_reranker or 4 request.app.state.config.RELEVANCE_THRESHOLD = form_data.r if form_data.r else 0.0 request.app.state.config.ENABLE_RAG_HYBRID_SEARCH = ( @@ -741,6 +751,7 @@ async def update_query_settings( "status": True, "template": request.app.state.config.RAG_TEMPLATE, "k": request.app.state.config.TOP_K, + "k_reranker": request.app.state.config.TOP_K_RERANKER, "r": request.app.state.config.RELEVANCE_THRESHOLD, "hybrid": request.app.state.config.ENABLE_RAG_HYBRID_SEARCH, } @@ -881,7 +892,9 @@ def save_docs_to_vector_db( ) embeddings = embedding_function( - list(map(lambda x: x.replace("\n", " "), texts)), user=user + list(map(lambda x: x.replace("\n", " "), texts)), + prefix=RAG_EMBEDDING_CONTENT_PREFIX, + user=user, ) items = [ @@ -990,6 +1003,7 @@ def process_file( loader = Loader( engine=request.app.state.config.CONTENT_EXTRACTION_ENGINE, TIKA_SERVER_URL=request.app.state.config.TIKA_SERVER_URL, + DOCLING_SERVER_URL=request.app.state.config.DOCLING_SERVER_URL, PDF_EXTRACT_IMAGES=request.app.state.config.PDF_EXTRACT_IMAGES, DOCUMENT_INTELLIGENCE_ENDPOINT=request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT, 
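The `TOP_K_RERANKER` / `k_reranker` setting introduced above configures the second stage of hybrid retrieval: fetch `k` candidates by vector/keyword recall, then keep the top `k_reranker` after reranking. A toy sketch of that two-stage cut (the scores are made up; the real reranker is a cross-encoder model):

```python
def hybrid_search(candidates, k, k_reranker, rerank_score):
    first_stage = candidates[:k]  # recall stage: top-k candidates
    # precision stage: rerank the candidates, keep the best k_reranker
    reranked = sorted(first_stage, key=rerank_score, reverse=True)
    return reranked[:k_reranker]

docs = ["d1", "d2", "d3", "d4", "d5", "d6"]
scores = {"d1": 0.2, "d2": 0.9, "d3": 0.5, "d4": 0.8, "d5": 0.1, "d6": 0.7}
top = hybrid_search(docs, k=4, k_reranker=2, rerank_score=scores.get)
```

Separating the two knobs lets a deployment cast a wide recall net while still passing only a few high-precision chunks into the prompt.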
DOCUMENT_INTELLIGENCE_KEY=request.app.state.config.DOCUMENT_INTELLIGENCE_KEY, @@ -1488,6 +1502,7 @@ class QueryDocForm(BaseModel): collection_name: str query: str k: Optional[int] = None + k_reranker: Optional[int] = None r: Optional[float] = None hybrid: Optional[bool] = None @@ -1503,11 +1518,13 @@ def query_doc_handler( return query_doc_with_hybrid_search( collection_name=form_data.collection_name, query=form_data.query, - embedding_function=lambda query: request.app.state.EMBEDDING_FUNCTION( - query, user=user + embedding_function=lambda query, prefix: request.app.state.EMBEDDING_FUNCTION( + query, prefix=prefix, user=user ), k=form_data.k if form_data.k else request.app.state.config.TOP_K, reranking_function=request.app.state.rf, + k_reranker=form_data.k_reranker + or request.app.state.config.TOP_K_RERANKER, r=( form_data.r if form_data.r @@ -1519,7 +1536,7 @@ def query_doc_handler( return query_doc( collection_name=form_data.collection_name, query_embedding=request.app.state.EMBEDDING_FUNCTION( - form_data.query, user=user + form_data.query, prefix=RAG_EMBEDDING_QUERY_PREFIX, user=user ), k=form_data.k if form_data.k else request.app.state.config.TOP_K, user=user, @@ -1536,6 +1553,7 @@ class QueryCollectionsForm(BaseModel): collection_names: list[str] query: str k: Optional[int] = None + k_reranker: Optional[int] = None r: Optional[float] = None hybrid: Optional[bool] = None @@ -1551,11 +1569,13 @@ def query_collection_handler( return query_collection_with_hybrid_search( collection_names=form_data.collection_names, queries=[form_data.query], - embedding_function=lambda query: request.app.state.EMBEDDING_FUNCTION( - query, user=user + embedding_function=lambda query, prefix: request.app.state.EMBEDDING_FUNCTION( + query, prefix=prefix, user=user ), k=form_data.k if form_data.k else request.app.state.config.TOP_K, reranking_function=request.app.state.rf, + k_reranker=form_data.k_reranker + or request.app.state.config.TOP_K_RERANKER, r=( form_data.r if 
form_data.r @@ -1566,8 +1586,8 @@ def query_collection_handler( return query_collection( collection_names=form_data.collection_names, queries=[form_data.query], - embedding_function=lambda query: request.app.state.EMBEDDING_FUNCTION( - query, user=user + embedding_function=lambda query, prefix: request.app.state.EMBEDDING_FUNCTION( + query, prefix=prefix, user=user ), k=form_data.k if form_data.k else request.app.state.config.TOP_K, ) @@ -1644,7 +1664,11 @@ if ENV == "dev": @router.get("/ef/{text}") async def get_embeddings(request: Request, text: Optional[str] = "Hello World!"): - return {"result": request.app.state.EMBEDDING_FUNCTION(text)} + return { + "result": request.app.state.EMBEDDING_FUNCTION( + text, prefix=RAG_EMBEDDING_QUERY_PREFIX + ) + } class BatchProcessFilesForm(BaseModel): diff --git a/backend/open_webui/routers/users.py b/backend/open_webui/routers/users.py index 872212d3ce..4cf9102e14 100644 --- a/backend/open_webui/routers/users.py +++ b/backend/open_webui/routers/users.py @@ -2,6 +2,7 @@ import logging from typing import Optional from open_webui.models.auths import Auths +from open_webui.models.groups import Groups from open_webui.models.chats import Chats from open_webui.models.users import ( UserModel, @@ -17,7 +18,10 @@ from open_webui.constants import ERROR_MESSAGES from open_webui.env import SRC_LOG_LEVELS from fastapi import APIRouter, Depends, HTTPException, Request, status from pydantic import BaseModel + from open_webui.utils.auth import get_admin_user, get_password_hash, get_verified_user +from open_webui.utils.access_control import get_permissions + log = logging.getLogger(__name__) log.setLevel(SRC_LOG_LEVELS["MODELS"]) @@ -45,7 +49,7 @@ async def get_users( @router.get("/groups") async def get_user_groups(user=Depends(get_verified_user)): - return Users.get_user_groups(user.id) + return Groups.get_groups_by_member_id(user.id) ############################ @@ -54,8 +58,12 @@ async def 
get_user_groups(user=Depends(get_verified_user)): @router.get("/permissions") -async def get_user_permissisions(user=Depends(get_verified_user)): - return Users.get_user_groups(user.id) +async def get_user_permissisions(request: Request, user=Depends(get_verified_user)): + user_permissions = get_permissions( + user.id, request.app.state.config.USER_PERMISSIONS + ) + + return user_permissions ############################ @@ -68,12 +76,20 @@ class WorkspacePermissions(BaseModel): tools: bool = False +class SharingPermissions(BaseModel): + public_models: bool = True + public_knowledge: bool = True + public_prompts: bool = True + public_tools: bool = True + + class ChatPermissions(BaseModel): controls: bool = True file_upload: bool = True delete: bool = True edit: bool = True temporary: bool = True + temporary_enforced: bool = False class FeaturesPermissions(BaseModel): @@ -84,16 +100,20 @@ class FeaturesPermissions(BaseModel): class UserPermissions(BaseModel): workspace: WorkspacePermissions + sharing: SharingPermissions chat: ChatPermissions features: FeaturesPermissions @router.get("/default/permissions", response_model=UserPermissions) -async def get_user_permissions(request: Request, user=Depends(get_admin_user)): +async def get_default_user_permissions(request: Request, user=Depends(get_admin_user)): return { "workspace": WorkspacePermissions( **request.app.state.config.USER_PERMISSIONS.get("workspace", {}) ), + "sharing": SharingPermissions( + **request.app.state.config.USER_PERMISSIONS.get("sharing", {}) + ), "chat": ChatPermissions( **request.app.state.config.USER_PERMISSIONS.get("chat", {}) ), @@ -104,7 +124,7 @@ async def get_user_permissions(request: Request, user=Depends(get_admin_user)): @router.post("/default/permissions") -async def update_user_permissions( +async def update_default_user_permissions( request: Request, form_data: UserPermissions, user=Depends(get_admin_user) ): request.app.state.config.USER_PERMISSIONS = form_data.model_dump() diff --git 
a/backend/open_webui/socket/main.py b/backend/open_webui/socket/main.py index 8f5a9568b2..83dd74fff1 100644 --- a/backend/open_webui/socket/main.py +++ b/backend/open_webui/socket/main.py @@ -3,16 +3,24 @@ import socketio import logging import sys import time +from redis import asyncio as aioredis from open_webui.models.users import Users, UserNameResponse from open_webui.models.channels import Channels from open_webui.models.chats import Chats +from open_webui.utils.redis import ( + parse_redis_sentinel_url, + get_sentinels_from_env, + AsyncRedisSentinelManager, +) from open_webui.env import ( ENABLE_WEBSOCKET_SUPPORT, WEBSOCKET_MANAGER, WEBSOCKET_REDIS_URL, WEBSOCKET_REDIS_LOCK_TIMEOUT, + WEBSOCKET_SENTINEL_PORT, + WEBSOCKET_SENTINEL_HOSTS, ) from open_webui.utils.auth import decode_token from open_webui.socket.utils import RedisDict, RedisLock @@ -29,7 +37,19 @@ log.setLevel(SRC_LOG_LEVELS["SOCKET"]) if WEBSOCKET_MANAGER == "redis": - mgr = socketio.AsyncRedisManager(WEBSOCKET_REDIS_URL) + if WEBSOCKET_SENTINEL_HOSTS: + redis_config = parse_redis_sentinel_url(WEBSOCKET_REDIS_URL) + mgr = AsyncRedisSentinelManager( + WEBSOCKET_SENTINEL_HOSTS.split(","), + sentinel_port=int(WEBSOCKET_SENTINEL_PORT), + redis_port=redis_config["port"], + service=redis_config["service"], + db=redis_config["db"], + username=redis_config["username"], + password=redis_config["password"], + ) + else: + mgr = socketio.AsyncRedisManager(WEBSOCKET_REDIS_URL) sio = socketio.AsyncServer( cors_allowed_origins=[], async_mode="asgi", @@ -55,14 +75,30 @@ TIMEOUT_DURATION = 3 if WEBSOCKET_MANAGER == "redis": log.debug("Using Redis to manage websockets.") - SESSION_POOL = RedisDict("open-webui:session_pool", redis_url=WEBSOCKET_REDIS_URL) - USER_POOL = RedisDict("open-webui:user_pool", redis_url=WEBSOCKET_REDIS_URL) - USAGE_POOL = RedisDict("open-webui:usage_pool", redis_url=WEBSOCKET_REDIS_URL) + redis_sentinels = get_sentinels_from_env( + WEBSOCKET_SENTINEL_HOSTS, WEBSOCKET_SENTINEL_PORT + ) + 
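The socket/main.py hunk above calls `parse_redis_sentinel_url(WEBSOCKET_REDIS_URL)` and feeds the resulting `port`, `service`, `db`, `username`, and `password` into the Sentinel manager. A hypothetical sketch of what such parsing could look like — under Sentinel, the URL's host part is read as the service (master) name rather than a reachable host; the exact parsing rules of the real helper are an assumption here:

```python
from urllib.parse import urlparse

def parse_redis_sentinel_url(redis_url):
    # Assumption: reinterpret the redis:// URL's host as the Sentinel
    # service name; credentials, port, and db keep their usual meaning.
    parsed = urlparse(redis_url)
    return {
        "username": parsed.username,
        "password": parsed.password,
        "service": parsed.hostname or "mymaster",
        "port": parsed.port or 6379,
        "db": int((parsed.path or "/0").lstrip("/") or 0),
    }

cfg = parse_redis_sentinel_url("redis://:secret@mymaster:6379/0")
```

The Sentinel hosts themselves come separately from `WEBSOCKET_SENTINEL_HOSTS`/`WEBSOCKET_SENTINEL_PORT`, which is why the manager takes them as a distinct argument in the diff.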
SESSION_POOL = RedisDict( + "open-webui:session_pool", + redis_url=WEBSOCKET_REDIS_URL, + redis_sentinels=redis_sentinels, + ) + USER_POOL = RedisDict( + "open-webui:user_pool", + redis_url=WEBSOCKET_REDIS_URL, + redis_sentinels=redis_sentinels, + ) + USAGE_POOL = RedisDict( + "open-webui:usage_pool", + redis_url=WEBSOCKET_REDIS_URL, + redis_sentinels=redis_sentinels, + ) clean_up_lock = RedisLock( redis_url=WEBSOCKET_REDIS_URL, lock_name="usage_cleanup_lock", timeout_secs=WEBSOCKET_REDIS_LOCK_TIMEOUT, + redis_sentinels=redis_sentinels, ) aquire_func = clean_up_lock.aquire_lock renew_func = clean_up_lock.renew_lock @@ -269,11 +305,19 @@ async def disconnect(sid): # print(f"Unknown session ID {sid} disconnected") -def get_event_emitter(request_info): +def get_event_emitter(request_info, update_db=True): async def __event_emitter__(event_data): user_id = request_info["user_id"] + session_ids = list( - set(USER_POOL.get(user_id, []) + [request_info["session_id"]]) + set( + USER_POOL.get(user_id, []) + + ( + [request_info.get("session_id")] + if request_info.get("session_id") + else [] + ) + ) ) for session_id in session_ids: @@ -287,40 +331,41 @@ def get_event_emitter(request_info): to=session_id, ) - if "type" in event_data and event_data["type"] == "status": - Chats.add_message_status_to_chat_by_id_and_message_id( - request_info["chat_id"], - request_info["message_id"], - event_data.get("data", {}), - ) + if update_db: + if "type" in event_data and event_data["type"] == "status": + Chats.add_message_status_to_chat_by_id_and_message_id( + request_info["chat_id"], + request_info["message_id"], + event_data.get("data", {}), + ) - if "type" in event_data and event_data["type"] == "message": - message = Chats.get_message_by_id_and_message_id( - request_info["chat_id"], - request_info["message_id"], - ) + if "type" in event_data and event_data["type"] == "message": + message = Chats.get_message_by_id_and_message_id( + request_info["chat_id"], + request_info["message_id"], 
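The `get_event_emitter` hunk above changes how target sessions are collected: the request's `session_id` is only included when it is actually present, so emitters created for background or API-driven updates (which have no live socket session) no longer inject `None` into the recipient set. A sketch of just that collection step, with a plain dict standing in for `USER_POOL`:

```python
def collect_session_ids(user_pool, request_info):
    # Union of the user's active sessions and, if present, the session
    # that issued the request. Mirrors the structure in the diff.
    user_id = request_info["user_id"]
    return list(
        set(
            user_pool.get(user_id, [])
            + (
                [request_info.get("session_id")]
                if request_info.get("session_id")
                else []
            )
        )
    )

# No originating session: only the pool's sessions are targeted.
ids = collect_session_ids({"u1": ["s1", "s2"]}, {"user_id": "u1", "session_id": None})
```

Together with the new `update_db=True` parameter, this lets the `/messages` endpoints mentioned in the changelog emit events without assuming a frontend session exists.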
+ ) - content = message.get("content", "") - content += event_data.get("data", {}).get("content", "") + content = message.get("content", "") + content += event_data.get("data", {}).get("content", "") - Chats.upsert_message_to_chat_by_id_and_message_id( - request_info["chat_id"], - request_info["message_id"], - { - "content": content, - }, - ) + Chats.upsert_message_to_chat_by_id_and_message_id( + request_info["chat_id"], + request_info["message_id"], + { + "content": content, + }, + ) - if "type" in event_data and event_data["type"] == "replace": - content = event_data.get("data", {}).get("content", "") + if "type" in event_data and event_data["type"] == "replace": + content = event_data.get("data", {}).get("content", "") - Chats.upsert_message_to_chat_by_id_and_message_id( - request_info["chat_id"], - request_info["message_id"], - { - "content": content, - }, - ) + Chats.upsert_message_to_chat_by_id_and_message_id( + request_info["chat_id"], + request_info["message_id"], + { + "content": content, + }, + ) return __event_emitter__ diff --git a/backend/open_webui/socket/utils.py b/backend/open_webui/socket/utils.py index 46fafbb9e7..85a8bb7909 100644 --- a/backend/open_webui/socket/utils.py +++ b/backend/open_webui/socket/utils.py @@ -1,15 +1,17 @@ import json -import redis import uuid +from open_webui.utils.redis import get_redis_connection class RedisLock: - def __init__(self, redis_url, lock_name, timeout_secs): + def __init__(self, redis_url, lock_name, timeout_secs, redis_sentinels=[]): self.lock_name = lock_name self.lock_id = str(uuid.uuid4()) self.timeout_secs = timeout_secs self.lock_obtained = False - self.redis = redis.Redis.from_url(redis_url, decode_responses=True) + self.redis = get_redis_connection( + redis_url, redis_sentinels, decode_responses=True + ) def aquire_lock(self): # nx=True will only set this key if it _hasn't_ already been set @@ -31,9 +33,11 @@ class RedisLock: class RedisDict: - def __init__(self, name, redis_url): + def __init__(self, 
name, redis_url, redis_sentinels=[]): self.name = name - self.redis = redis.Redis.from_url(redis_url, decode_responses=True) + self.redis = get_redis_connection( + redis_url, redis_sentinels, decode_responses=True + ) def __setitem__(self, key, value): serialized_value = json.dumps(value) diff --git a/backend/open_webui/utils/filter.py b/backend/open_webui/utils/filter.py index 0edc2ac709..a11aeb092c 100644 --- a/backend/open_webui/utils/filter.py +++ b/backend/open_webui/utils/filter.py @@ -101,11 +101,12 @@ async def process_filter_functions( form_data = handler(**params) except Exception as e: - log.exception(f"Error in {filter_type} handler {filter_id}: {e}") + log.debug(f"Error in {filter_type} handler {filter_id}: {e}") raise e # Handle file cleanup for inlet if skip_files and "files" in form_data.get("metadata", {}): + del form_data["files"] del form_data["metadata"]["files"] return form_data, {} diff --git a/backend/open_webui/utils/middleware.py b/backend/open_webui/utils/middleware.py index 289d887dfd..532f173877 100644 --- a/backend/open_webui/utils/middleware.py +++ b/backend/open_webui/utils/middleware.py @@ -18,9 +18,7 @@ from uuid import uuid4 from concurrent.futures import ThreadPoolExecutor -from fastapi import Request -from fastapi import BackgroundTasks - +from fastapi import Request, HTTPException from starlette.responses import Response, StreamingResponse @@ -100,7 +98,7 @@ log.setLevel(SRC_LOG_LEVELS["MAIN"]) async def chat_completion_tools_handler( - request: Request, body: dict, user: UserModel, models, tools + request: Request, body: dict, extra_params: dict, user: UserModel, models, tools ) -> tuple[dict, dict]: async def get_content_from_response(response) -> Optional[str]: content = None @@ -135,6 +133,9 @@ async def chat_completion_tools_handler( "metadata": {"task": str(TASKS.FUNCTION_CALLING)}, } + event_caller = extra_params["__event_call__"] + metadata = extra_params["__metadata__"] + task_model_id = get_task_model_id( 
body["model"], request.app.state.config.TASK_MODEL, @@ -156,7 +157,6 @@ async def chat_completion_tools_handler( tools_function_calling_prompt = tools_function_calling_generation_template( template, tools_specs ) - log.info(f"{tools_function_calling_prompt=}") payload = get_tools_function_calling_payload( body["messages"], task_model_id, tools_function_calling_prompt ) @@ -189,34 +189,63 @@ async def chat_completion_tools_handler( tool_function_params = tool_call.get("parameters", {}) try: - required_params = ( - tools[tool_function_name] - .get("spec", {}) - .get("parameters", {}) - .get("required", []) + tool = tools[tool_function_name] + + spec = tool.get("spec", {}) + allowed_params = ( + spec.get("parameters", {}).get("properties", {}).keys() ) - tool_function = tools[tool_function_name]["callable"] tool_function_params = { k: v for k, v in tool_function_params.items() - if k in required_params + if k in allowed_params } - tool_output = await tool_function(**tool_function_params) + + if tool.get("direct", False): + tool_result = await event_caller( + { + "type": "execute:tool", + "data": { + "id": str(uuid4()), + "name": tool_function_name, + "params": tool_function_params, + "server": tool.get("server", {}), + "session_id": metadata.get("session_id", None), + }, + } + ) + else: + tool_function = tool["callable"] + tool_result = await tool_function(**tool_function_params) except Exception as e: - tool_output = str(e) + tool_result = str(e) + + if isinstance(tool_result, dict) or isinstance(tool_result, list): + tool_result = json.dumps(tool_result, indent=2) + + if isinstance(tool_result, str): + tool = tools[tool_function_name] + tool_id = tool.get("toolkit_id", "") + if tool.get("citation", False) or tool.get("direct", False): - if isinstance(tool_output, str): - if tools[tool_function_name]["citation"]: sources.append( { "source": { - "name": f"TOOL:{tools[tool_function_name]['toolkit_id']}/{tool_function_name}" + "name": ( + f"TOOL:" + 
f"{tool_id}/{tool_function_name}" + if tool_id + else f"{tool_function_name}" + ), }, - "document": [tool_output], + "document": [tool_result], "metadata": [ { - "source": f"TOOL:{tools[tool_function_name]['toolkit_id']}/{tool_function_name}" + "source": ( + f"TOOL:" + f"{tool_id}/{tool_function_name}" + if tool_id + else f"{tool_function_name}" + ) } ], } @@ -225,16 +254,20 @@ async def chat_completion_tools_handler( sources.append( { "source": {}, - "document": [tool_output], + "document": [tool_result], "metadata": [ { - "source": f"TOOL:{tools[tool_function_name]['toolkit_id']}/{tool_function_name}" + "source": ( + f"TOOL:" + f"{tool_id}/{tool_function_name}" + if tool_id + else f"{tool_function_name}" + ) } ], } ) - if tools[tool_function_name]["file_handler"]: + if tools[tool_function_name].get("file_handler", False): skip_files = True # check if "tool_calls" in result @@ -245,10 +278,10 @@ async def chat_completion_tools_handler( await tool_call_handler(result) except Exception as e: - log.exception(f"Error: {e}") + log.debug(f"Error: {e}") content = None except Exception as e: - log.exception(f"Error: {e}") + log.debug(f"Error: {e}") content = None log.debug(f"tool_contexts: {sources}") @@ -562,11 +595,12 @@ async def chat_completion_files_handler( request=request, files=files, queries=queries, - embedding_function=lambda query: request.app.state.EMBEDDING_FUNCTION( - query, user=user + embedding_function=lambda query, prefix: request.app.state.EMBEDDING_FUNCTION( + query, prefix=prefix, user=user ), k=request.app.state.config.TOP_K, reranking_function=request.app.state.rf, + k_reranker=request.app.state.config.TOP_K_RERANKER, r=request.app.state.config.RELEVANCE_THRESHOLD, hybrid_search=request.app.state.config.ENABLE_RAG_HYBRID_SEARCH, full_context=request.app.state.config.RAG_FULL_CONTEXT, @@ -766,12 +800,18 @@ async def process_chat_payload(request, form_data, user, metadata, model): } form_data["metadata"] = metadata + # Server side tools tool_ids = 
metadata.get("tool_ids", None) + # Client side tools + tool_servers = metadata.get("tool_servers", None) + log.debug(f"{tool_ids=}") + log.debug(f"{tool_servers=}") + + tools_dict = {} if tool_ids: - # If tool_ids field is present, then get the tools - tools = get_tools( + tools_dict = get_tools( request, tool_ids, user, @@ -782,20 +822,31 @@ async def process_chat_payload(request, form_data, user, metadata, model): "__files__": metadata.get("files", []), }, ) - log.info(f"{tools=}") + if tool_servers: + for tool_server in tool_servers: + tool_specs = tool_server.pop("specs", []) + + for tool in tool_specs: + tools_dict[tool["name"]] = { + "spec": tool, + "direct": True, + "server": tool_server, + } + + if tools_dict: if metadata.get("function_calling") == "native": # If the function calling is native, then call the tools function calling handler - metadata["tools"] = tools + metadata["tools"] = tools_dict form_data["tools"] = [ {"type": "function", "function": tool.get("spec", {})} - for tool in tools.values() + for tool in tools_dict.values() ] else: # If the function calling is not native, then call the tools function calling handler try: form_data, flags = await chat_completion_tools_handler( - request, form_data, user, models, tools + request, form_data, extra_params, user, models, tools_dict ) sources.extend(flags.get("sources", [])) @@ -814,7 +865,7 @@ async def process_chat_payload(request, form_data, user, metadata, model): for source_idx, source in enumerate(sources): if "document" in source: for doc_idx, doc_context in enumerate(source["document"]): - context_string += f"{source_idx}{doc_context}\n" + context_string += f"{source_idx + 1}{doc_context}\n" context_string = context_string.strip() prompt = get_last_user_message(form_data["messages"]) @@ -991,6 +1042,16 @@ async def process_chat_response( # Non-streaming response if not isinstance(response, StreamingResponse): if event_emitter: + if "error" in response: + error = 
response["error"].get("detail", response["error"]) + Chats.upsert_message_to_chat_by_id_and_message_id( + metadata["chat_id"], + metadata["message_id"], + { + "error": {"content": error}, + }, + ) + if "selected_model_id" in response: Chats.upsert_message_to_chat_by_id_and_message_id( metadata["chat_id"], @@ -1000,7 +1061,8 @@ async def process_chat_response( }, ) - if response.get("choices", [])[0].get("message", {}).get("content"): + choices = response.get("choices", []) + if choices and choices[0].get("message", {}).get("content"): content = response["choices"][0]["message"]["content"] if content: @@ -1081,8 +1143,6 @@ async def process_chat_response( for filter_id in get_sorted_filter_ids(model) ] - print(f"{filter_functions=}") - # Streaming response if event_emitter and event_caller: task_id = str(uuid4()) # Create a unique task ID. @@ -1121,36 +1181,51 @@ async def process_chat_response( elif block["type"] == "tool_calls": attributes = block.get("attributes", {}) - block_content = block.get("content", []) + tool_calls = block.get("content", []) results = block.get("results", []) if results: - result_display_content = "" + tool_calls_display_content = "" + for tool_call in tool_calls: - for result in results: - tool_call_id = result.get("tool_call_id", "") - tool_name = "" + tool_call_id = tool_call.get("id", "") + tool_name = tool_call.get("function", {}).get( + "name", "" + ) + tool_arguments = tool_call.get("function", {}).get( + "arguments", "" + ) - for tool_call in block_content: - if tool_call.get("id", "") == tool_call_id: - tool_name = tool_call.get("function", {}).get( - "name", "" - ) + tool_result = None + for result in results: + if tool_call_id == result.get("tool_call_id", ""): + tool_result = result.get("content", None) break - result_display_content = f"{result_display_content}\n> {tool_name}: {result.get('content', '')}" + if tool_result: + tool_calls_display_content = f'{tool_calls_display_content}\n
<details type="tool_calls" done="true" id="{tool_call_id}" name="{tool_name}" arguments="{html.escape(json.dumps(tool_arguments))}" result="{html.escape(json.dumps(tool_result))}">\n<summary>Tool Executed</summary>\n</details>
' + else: + tool_calls_display_content = f'{tool_calls_display_content}\n
<details type="tool_calls" done="false" id="{tool_call_id}" name="{tool_name}" arguments="{html.escape(json.dumps(tool_arguments))}">\n<summary>Executing...</summary>\n</details>
' if not raw: - content = f'{content}\n
<details type="tool_calls" done="true">\n<summary>Tool Executed</summary>\n{result_display_content}\n</details>
\n' + content = f"{content}\n{tool_calls_display_content}\n\n" else: tool_calls_display_content = "" - for tool_call in block_content: - tool_calls_display_content = f"{tool_calls_display_content}\n> Executing {tool_call.get('function', {}).get('name', '')}" + for tool_call in tool_calls: + tool_call_id = tool_call.get("id", "") + tool_name = tool_call.get("function", {}).get( + "name", "" + ) + tool_arguments = tool_call.get("function", {}).get( + "arguments", "" + ) + + tool_calls_display_content = f'{tool_calls_display_content}\n
<details type="tool_calls" done="false" id="{tool_call_id}" name="{tool_name}" arguments="{html.escape(json.dumps(tool_arguments))}">\n<summary>Executing...</summary>\n</details>
' if not raw: - content = f'{content}\n
<details type="tool_calls" done="false">\n<summary>Tool Executing...</summary>\n{tool_calls_display_content}\n</details>
\n' + content = f"{content}\n{tool_calls_display_content}\n\n" elif block["type"] == "reasoning": reasoning_display_content = "\n".join( @@ -1507,6 +1582,16 @@ async def process_chat_response( else: choices = data.get("choices", []) if not choices: + error = data.get("error", {}) + if error: + await event_emitter( + { + "type": "chat:completion", + "data": { + "error": error, + }, + } + ) usage = data.get("usage", {}) if usage: await event_emitter( @@ -1562,7 +1647,9 @@ async def process_chat_response( value = delta.get("content") - reasoning_content = delta.get("reasoning_content") + reasoning_content = delta.get( + "reasoning_content" + ) or delta.get("reasoning") if reasoning_content: if ( not content_blocks @@ -1757,6 +1844,15 @@ async def process_chat_response( ) except Exception as e: log.debug(e) + # Fallback to JSON parsing + try: + tool_function_params = json.loads( + tool_call.get("function", {}).get("arguments", "{}") + ) + except Exception as e: + log.debug( + f"Error parsing tool call arguments: {tool_call.get('function', {}).get('arguments', '{}')}" + ) tool_result = None @@ -1765,21 +1861,48 @@ async def process_chat_response( spec = tool.get("spec", {}) try: - required_params = spec.get("parameters", {}).get( - "required", [] + allowed_params = ( + spec.get("parameters", {}) + .get("properties", {}) + .keys() ) - tool_function = tool["callable"] + tool_function_params = { k: v for k, v in tool_function_params.items() - if k in required_params + if k in allowed_params } - tool_result = await tool_function( - **tool_function_params - ) + + if tool.get("direct", False): + tool_result = await event_caller( + { + "type": "execute:tool", + "data": { + "id": str(uuid4()), + "name": tool_name, + "params": tool_function_params, + "server": tool.get("server", {}), + "session_id": metadata.get( + "session_id", None + ), + }, + } + ) + + else: + tool_function = tool["callable"] + tool_result = await tool_function( + **tool_function_params + ) + except Exception 
as e: tool_result = str(e) + if isinstance(tool_result, dict) or isinstance( + tool_result, list + ): + tool_result = json.dumps(tool_result, indent=2) + results.append( { "tool_call_id": tool_call_id, @@ -1982,11 +2105,6 @@ async def process_chat_response( } ) - log.info(f"content_blocks={content_blocks}") - log.info( - f"serialize_content_blocks={serialize_content_blocks(content_blocks)}" - ) - try: res = await generate_chat_completion( request, diff --git a/backend/open_webui/utils/models.py b/backend/open_webui/utils/models.py index 149e41a41b..b631c2ae33 100644 --- a/backend/open_webui/utils/models.py +++ b/backend/open_webui/utils/models.py @@ -49,6 +49,7 @@ async def get_all_base_models(request: Request, user: UserModel = None): "created": int(time.time()), "owned_by": "ollama", "ollama": model, + "tags": model.get("tags", []), } for model in ollama_models["models"] ] diff --git a/backend/open_webui/utils/oauth.py b/backend/open_webui/utils/oauth.py index be3466362e..ab50247d8b 100644 --- a/backend/open_webui/utils/oauth.py +++ b/backend/open_webui/utils/oauth.py @@ -94,7 +94,7 @@ class OAuthManager: oauth_claim = auth_manager_config.OAUTH_ROLES_CLAIM oauth_allowed_roles = auth_manager_config.OAUTH_ALLOWED_ROLES oauth_admin_roles = auth_manager_config.OAUTH_ADMIN_ROLES - oauth_roles = None + oauth_roles = [] # Default/fallback role if no matching roles are found role = auth_manager_config.DEFAULT_USER_ROLE @@ -104,7 +104,7 @@ class OAuthManager: nested_claims = oauth_claim.split(".") for nested_claim in nested_claims: claim_data = claim_data.get(nested_claim, {}) - oauth_roles = claim_data if isinstance(claim_data, list) else None + oauth_roles = claim_data if isinstance(claim_data, list) else [] log.debug(f"Oauth Roles claim: {oauth_claim}") log.debug(f"User roles from oauth: {oauth_roles}") @@ -140,6 +140,7 @@ class OAuthManager: log.debug("Running OAUTH Group management") oauth_claim = auth_manager_config.OAUTH_GROUPS_CLAIM + user_oauth_groups = [] # 
Nested claim search for groups claim if oauth_claim: claim_data = user_data @@ -160,7 +161,7 @@ class OAuthManager: # Remove groups that user is no longer a part of for group_model in user_current_groups: - if group_model.name not in user_oauth_groups: + if user_oauth_groups and group_model.name not in user_oauth_groups: # Remove group from user log.debug( f"Removing user from group {group_model.name} as it is no longer in their oauth groups" @@ -186,8 +187,10 @@ class OAuthManager: # Add user to new groups for group_model in all_available_groups: - if group_model.name in user_oauth_groups and not any( - gm.name == group_model.name for gm in user_current_groups + if ( + user_oauth_groups + and group_model.name in user_oauth_groups + and not any(gm.name == group_model.name for gm in user_current_groups) ): # Add user to group log.debug( diff --git a/backend/open_webui/utils/payload.py b/backend/open_webui/utils/payload.py index 46656cc82b..5f8aafb785 100644 --- a/backend/open_webui/utils/payload.py +++ b/backend/open_webui/utils/payload.py @@ -63,6 +63,7 @@ def apply_model_params_to_body_openai(params: dict, form_data: dict) -> dict: "seed": lambda x: x, "stop": lambda x: [bytes(s, "utf-8").decode("unicode_escape") for s in x], "logit_bias": lambda x: x, + "response_format": dict, } return apply_model_params_to_body(params, form_data, mappings) @@ -110,6 +111,15 @@ def apply_model_params_to_body_ollama(params: dict, form_data: dict) -> dict: "num_thread": int, } + # Extract keep_alive from options if it exists + if "options" in form_data and "keep_alive" in form_data["options"]: + form_data["keep_alive"] = form_data["options"]["keep_alive"] + del form_data["options"]["keep_alive"] + + if "options" in form_data and "format" in form_data["options"]: + form_data["format"] = form_data["options"]["format"] + del form_data["options"]["format"] + return apply_model_params_to_body(params, form_data, mappings) @@ -231,6 +241,11 @@ def 
convert_payload_openai_to_ollama(openai_payload: dict) -> dict: "system" ] # To prevent Ollama warning of invalid option provided + # Extract keep_alive from options if it exists + if "keep_alive" in ollama_options: + ollama_payload["keep_alive"] = ollama_options["keep_alive"] + del ollama_options["keep_alive"] + # If there is the "stop" parameter in the openai_payload, remap it to the ollama_payload.options if "stop" in openai_payload: ollama_options = ollama_payload.get("options", {}) @@ -240,4 +255,13 @@ def convert_payload_openai_to_ollama(openai_payload: dict) -> dict: if "metadata" in openai_payload: ollama_payload["metadata"] = openai_payload["metadata"] + if "response_format" in openai_payload: + response_format = openai_payload["response_format"] + format_type = response_format.get("type", None) + + schema = response_format.get(format_type, None) + if schema: + format = schema.get("schema", None) + ollama_payload["format"] = format + return ollama_payload diff --git a/backend/open_webui/utils/plugin.py b/backend/open_webui/utils/plugin.py index e3fe9237f5..29a4d0cceb 100644 --- a/backend/open_webui/utils/plugin.py +++ b/backend/open_webui/utils/plugin.py @@ -7,7 +7,7 @@ import types import tempfile import logging -from open_webui.env import SRC_LOG_LEVELS +from open_webui.env import SRC_LOG_LEVELS, PIP_OPTIONS, PIP_PACKAGE_INDEX_OPTIONS from open_webui.models.functions import Functions from open_webui.models.tools import Tools @@ -165,15 +165,19 @@ def load_function_module_by_id(function_id, content=None): os.unlink(temp_file.name) -def install_frontmatter_requirements(requirements): +def install_frontmatter_requirements(requirements: str): if requirements: try: req_list = [req.strip() for req in requirements.split(",")] - for req in req_list: - log.info(f"Installing requirement: {req}") - subprocess.check_call([sys.executable, "-m", "pip", "install", req]) + log.info(f"Installing requirements: {' '.join(req_list)}") + subprocess.check_call( + 
[sys.executable, "-m", "pip", "install"] + + PIP_OPTIONS + + req_list + + PIP_PACKAGE_INDEX_OPTIONS + ) except Exception as e: - log.error(f"Error installing package: {req}") + log.error(f"Error installing packages: {' '.join(req_list)}") raise e else: diff --git a/backend/open_webui/utils/redis.py b/backend/open_webui/utils/redis.py new file mode 100644 index 0000000000..baccb16ad6 --- /dev/null +++ b/backend/open_webui/utils/redis.py @@ -0,0 +1,109 @@ +import socketio +import redis +from redis import asyncio as aioredis +from urllib.parse import urlparse + + +def parse_redis_sentinel_url(redis_url): + parsed_url = urlparse(redis_url) + if parsed_url.scheme != "redis": + raise ValueError("Invalid Redis URL scheme. Must be 'redis'.") + + return { + "username": parsed_url.username or None, + "password": parsed_url.password or None, + "service": parsed_url.hostname or "mymaster", + "port": parsed_url.port or 6379, + "db": int(parsed_url.path.lstrip("/") or 0), + } + + +def get_redis_connection(redis_url, redis_sentinels, decode_responses=True): + if redis_sentinels: + redis_config = parse_redis_sentinel_url(redis_url) + sentinel = redis.sentinel.Sentinel( + redis_sentinels, + port=redis_config["port"], + db=redis_config["db"], + username=redis_config["username"], + password=redis_config["password"], + decode_responses=decode_responses, + ) + + # Get a master connection from Sentinel + return sentinel.master_for(redis_config["service"]) + else: + # Standard Redis connection + return redis.Redis.from_url(redis_url, decode_responses=decode_responses) + + +def get_sentinels_from_env(sentinel_hosts_env, sentinel_port_env): + if sentinel_hosts_env: + sentinel_hosts = sentinel_hosts_env.split(",") + sentinel_port = int(sentinel_port_env) + return [(host, sentinel_port) for host in sentinel_hosts] + return [] + + +class AsyncRedisSentinelManager(socketio.AsyncRedisManager): + def __init__( + self, + sentinel_hosts, + sentinel_port=26379, + redis_port=6379, + 
service="mymaster", + db=0, + username=None, + password=None, + channel="socketio", + write_only=False, + logger=None, + redis_options=None, + ): + """ + Initialize the Redis Sentinel Manager. + This implementation mostly replicates the __init__ of AsyncRedisManager and + overrides _redis_connect() with a version that uses Redis Sentinel + + :param sentinel_hosts: List of Sentinel hosts + :param sentinel_port: Sentinel Port + :param redis_port: Redis Port (currently unsupported by aioredis!) + :param service: Master service name in Sentinel + :param db: Redis database to use + :param username: Redis username (if any) (currently unsupported by aioredis!) + :param password: Redis password (if any) + :param channel: The channel name on which the server sends and receives + notifications. Must be the same in all the servers. + :param write_only: If set to ``True``, only initialize to emit events. The + default of ``False`` initializes the class for emitting + and receiving. + :param redis_options: additional keyword arguments to be passed to + ``aioredis.from_url()``. 
+ """ + self._sentinels = [(host, sentinel_port) for host in sentinel_hosts] + self._redis_port = redis_port + self._service = service + self._db = db + self._username = username + self._password = password + self._channel = channel + self.redis_options = redis_options or {} + + # connect and call grandparent constructor + self._redis_connect() + super(socketio.AsyncRedisManager, self).__init__( + channel=channel, write_only=write_only, logger=logger + ) + + def _redis_connect(self): + """Establish connections to Redis through Sentinel.""" + sentinel = aioredis.sentinel.Sentinel( + self._sentinels, + port=self._redis_port, + db=self._db, + password=self._password, + **self.redis_options, + ) + + self.redis = sentinel.master_for(self._service) + self.pubsub = self.redis.pubsub(ignore_subscribe_messages=True) diff --git a/backend/open_webui/utils/telemetry/__init__.py b/backend/open_webui/utils/telemetry/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/backend/open_webui/utils/telemetry/constants.py b/backend/open_webui/utils/telemetry/constants.py new file mode 100644 index 0000000000..6ef511f934 --- /dev/null +++ b/backend/open_webui/utils/telemetry/constants.py @@ -0,0 +1,26 @@ +from opentelemetry.semconv.trace import SpanAttributes as _SpanAttributes + +# Span Tags +SPAN_DB_TYPE = "mysql" +SPAN_REDIS_TYPE = "redis" +SPAN_DURATION = "duration" +SPAN_SQL_STR = "sql" +SPAN_SQL_EXPLAIN = "explain" +SPAN_ERROR_TYPE = "error" + + +class SpanAttributes(_SpanAttributes): + """ + Span Attributes + """ + + DB_INSTANCE = "db.instance" + DB_TYPE = "db.type" + DB_IP = "db.ip" + DB_PORT = "db.port" + ERROR_KIND = "error.kind" + ERROR_OBJECT = "error.object" + ERROR_MESSAGE = "error.message" + RESULT_CODE = "result.code" + RESULT_MESSAGE = "result.message" + RESULT_ERRORS = "result.errors" diff --git a/backend/open_webui/utils/telemetry/exporters.py b/backend/open_webui/utils/telemetry/exporters.py new file mode 100644 index 0000000000..4bf166e655 --- 
/dev/null +++ b/backend/open_webui/utils/telemetry/exporters.py @@ -0,0 +1,31 @@ +import threading + +from opentelemetry.sdk.trace import ReadableSpan +from opentelemetry.sdk.trace.export import BatchSpanProcessor + + +class LazyBatchSpanProcessor(BatchSpanProcessor): + def __init__(self, *args, **kwargs): + super().__init__(*args, **kwargs) + self.done = True + with self.condition: + self.condition.notify_all() + self.worker_thread.join() + self.done = False + self.worker_thread = None + + def on_end(self, span: ReadableSpan) -> None: + if self.worker_thread is None: + self.worker_thread = threading.Thread( + name=self.__class__.__name__, target=self.worker, daemon=True + ) + self.worker_thread.start() + super().on_end(span) + + def shutdown(self) -> None: + self.done = True + with self.condition: + self.condition.notify_all() + if self.worker_thread: + self.worker_thread.join() + self.span_exporter.shutdown() diff --git a/backend/open_webui/utils/telemetry/instrumentors.py b/backend/open_webui/utils/telemetry/instrumentors.py new file mode 100644 index 0000000000..0ba42efd4b --- /dev/null +++ b/backend/open_webui/utils/telemetry/instrumentors.py @@ -0,0 +1,202 @@ +import logging +import traceback +from typing import Collection, Union + +from aiohttp import ( + TraceRequestStartParams, + TraceRequestEndParams, + TraceRequestExceptionParams, +) +from chromadb.telemetry.opentelemetry.fastapi import instrument_fastapi +from fastapi import FastAPI +from opentelemetry.instrumentation.httpx import ( + HTTPXClientInstrumentor, + RequestInfo, + ResponseInfo, +) +from opentelemetry.instrumentation.instrumentor import BaseInstrumentor +from opentelemetry.instrumentation.logging import LoggingInstrumentor +from opentelemetry.instrumentation.redis import RedisInstrumentor +from opentelemetry.instrumentation.requests import RequestsInstrumentor +from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor +from opentelemetry.instrumentation.aiohttp_client import 
AioHttpClientInstrumentor +from opentelemetry.trace import Span, StatusCode +from redis import Redis +from requests import PreparedRequest, Response +from sqlalchemy import Engine +from fastapi import status + +from open_webui.utils.telemetry.constants import SPAN_REDIS_TYPE, SpanAttributes + +from open_webui.env import SRC_LOG_LEVELS + +logger = logging.getLogger(__name__) +logger.setLevel(SRC_LOG_LEVELS["MAIN"]) + + +def requests_hook(span: Span, request: PreparedRequest): + """ + Http Request Hook + """ + + span.update_name(f"{request.method} {request.url}") + span.set_attributes( + attributes={ + SpanAttributes.HTTP_URL: request.url, + SpanAttributes.HTTP_METHOD: request.method, + } + ) + + +def response_hook(span: Span, request: PreparedRequest, response: Response): + """ + HTTP Response Hook + """ + + span.set_attributes( + attributes={ + SpanAttributes.HTTP_STATUS_CODE: response.status_code, + } + ) + span.set_status(StatusCode.ERROR if response.status_code >= 400 else StatusCode.OK) + + +def redis_request_hook(span: Span, instance: Redis, args, kwargs): + """ + Redis Request Hook + """ + + try: + connection_kwargs: dict = instance.connection_pool.connection_kwargs + host = connection_kwargs.get("host") + port = connection_kwargs.get("port") + db = connection_kwargs.get("db") + span.set_attributes( + { + SpanAttributes.DB_INSTANCE: f"{host}/{db}", + SpanAttributes.DB_NAME: f"{host}/{db}", + SpanAttributes.DB_TYPE: SPAN_REDIS_TYPE, + SpanAttributes.DB_PORT: port, + SpanAttributes.DB_IP: host, + SpanAttributes.DB_STATEMENT: " ".join([str(i) for i in args]), + SpanAttributes.DB_OPERATION: str(args[0]), + } + ) + except Exception: # pylint: disable=W0718 + logger.error(traceback.format_exc()) + + +def httpx_request_hook(span: Span, request: RequestInfo): + """ + HTTPX Request Hook + """ + + span.update_name(f"{request.method.decode()} {str(request.url)}") + span.set_attributes( + attributes={ + SpanAttributes.HTTP_URL: str(request.url), + 
SpanAttributes.HTTP_METHOD: request.method.decode(), + } + ) + + +def httpx_response_hook(span: Span, request: RequestInfo, response: ResponseInfo): + """ + HTTPX Response Hook + """ + + span.set_attribute(SpanAttributes.HTTP_STATUS_CODE, response.status_code) + span.set_status( + StatusCode.ERROR + if response.status_code >= status.HTTP_400_BAD_REQUEST + else StatusCode.OK + ) + + +async def httpx_async_request_hook(span: Span, request: RequestInfo): + """ + Async Request Hook + """ + + httpx_request_hook(span, request) + + +async def httpx_async_response_hook( + span: Span, request: RequestInfo, response: ResponseInfo +): + """ + Async Response Hook + """ + + httpx_response_hook(span, request, response) + + +def aiohttp_request_hook(span: Span, request: TraceRequestStartParams): + """ + Aiohttp Request Hook + """ + + span.update_name(f"{request.method} {str(request.url)}") + span.set_attributes( + attributes={ + SpanAttributes.HTTP_URL: str(request.url), + SpanAttributes.HTTP_METHOD: request.method, + } + ) + + +def aiohttp_response_hook( + span: Span, response: Union[TraceRequestExceptionParams, TraceRequestEndParams] +): + """ + Aiohttp Response Hook + """ + + if isinstance(response, TraceRequestEndParams): + span.set_attribute(SpanAttributes.HTTP_STATUS_CODE, response.response.status) + span.set_status( + StatusCode.ERROR + if response.response.status >= status.HTTP_400_BAD_REQUEST + else StatusCode.OK + ) + elif isinstance(response, TraceRequestExceptionParams): + span.set_status(StatusCode.ERROR) + span.set_attribute(SpanAttributes.ERROR_MESSAGE, str(response.exception)) + + +class Instrumentor(BaseInstrumentor): + """ + Instrument OT + """ + + def __init__(self, app: FastAPI, db_engine: Engine): + self.app = app + self.db_engine = db_engine + + def instrumentation_dependencies(self) -> Collection[str]: + return [] + + def _instrument(self, **kwargs): + instrument_fastapi(app=self.app) + SQLAlchemyInstrumentor().instrument(engine=self.db_engine) + 
RedisInstrumentor().instrument(request_hook=redis_request_hook) + RequestsInstrumentor().instrument( + request_hook=requests_hook, response_hook=response_hook + ) + LoggingInstrumentor().instrument() + HTTPXClientInstrumentor().instrument( + request_hook=httpx_request_hook, + response_hook=httpx_response_hook, + async_request_hook=httpx_async_request_hook, + async_response_hook=httpx_async_response_hook, + ) + AioHttpClientInstrumentor().instrument( + request_hook=aiohttp_request_hook, + response_hook=aiohttp_response_hook, + ) + + def _uninstrument(self, **kwargs): + if getattr(self, "instrumentors", None) is None: + return + for instrumentor in self.instrumentors: + instrumentor.uninstrument() diff --git a/backend/open_webui/utils/telemetry/setup.py b/backend/open_webui/utils/telemetry/setup.py new file mode 100644 index 0000000000..eb6a238c8d --- /dev/null +++ b/backend/open_webui/utils/telemetry/setup.py @@ -0,0 +1,23 @@ +from fastapi import FastAPI +from opentelemetry import trace +from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter +from opentelemetry.sdk.resources import SERVICE_NAME, Resource +from opentelemetry.sdk.trace import TracerProvider +from sqlalchemy import Engine + +from open_webui.utils.telemetry.exporters import LazyBatchSpanProcessor +from open_webui.utils.telemetry.instrumentors import Instrumentor +from open_webui.env import OTEL_SERVICE_NAME, OTEL_EXPORTER_OTLP_ENDPOINT + + +def setup(app: FastAPI, db_engine: Engine): + # set up trace + trace.set_tracer_provider( + TracerProvider( + resource=Resource.create(attributes={SERVICE_NAME: OTEL_SERVICE_NAME}) + ) + ) + # otlp export + exporter = OTLPSpanExporter(endpoint=OTEL_EXPORTER_OTLP_ENDPOINT) + trace.get_tracer_provider().add_span_processor(LazyBatchSpanProcessor(exporter)) + Instrumentor(app=app, db_engine=db_engine).instrument() diff --git a/backend/open_webui/utils/tools.py b/backend/open_webui/utils/tools.py index c44c30402d..bd2a731e6a 100644 --- 
a/backend/open_webui/utils/tools.py +++ b/backend/open_webui/utils/tools.py @@ -1,6 +1,9 @@ import inspect import logging import re +import inspect +import uuid + from typing import Any, Awaitable, Callable, get_type_hints from functools import update_wrapper, partial @@ -88,10 +91,11 @@ def get_tools( # TODO: This needs to be a pydantic model tool_dict = { - "toolkit_id": tool_id, - "callable": callable, "spec": spec, + "callable": callable, + "toolkit_id": tool_id, "pydantic_model": function_to_pydantic_model(callable), + # Misc info "file_handler": hasattr(module, "file_handler") and module.file_handler, "citation": hasattr(module, "citation") and module.citation, } diff --git a/backend/requirements.txt b/backend/requirements.txt index eb1ee6018e..ca2ea50609 100644 --- a/backend/requirements.txt +++ b/backend/requirements.txt @@ -37,13 +37,13 @@ asgiref==3.8.1 # AI libraries openai anthropic -google-generativeai==0.7.2 +google-generativeai==0.8.4 tiktoken langchain==0.3.19 langchain-community==0.3.18 -fake-useragent==1.5.1 +fake-useragent==2.1.0 chromadb==0.6.2 pymilvus==2.5.0 qdrant-client~=1.12.0 @@ -78,6 +78,7 @@ sentencepiece soundfile==0.13.1 azure-ai-documentintelligence==1.0.0 +pillow==11.1.0 opencv-python-headless==4.11.0.86 rapidocr-onnxruntime==1.3.24 rank-bm25==0.2.2 @@ -118,3 +119,16 @@ ldap3==2.9.1 ## Firecrawl firecrawl-py==1.12.0 + +## Trace +opentelemetry-api==1.30.0 +opentelemetry-sdk==1.30.0 +opentelemetry-exporter-otlp==1.30.0 +opentelemetry-instrumentation==0.51b0 +opentelemetry-instrumentation-fastapi==0.51b0 +opentelemetry-instrumentation-sqlalchemy==0.51b0 +opentelemetry-instrumentation-redis==0.51b0 +opentelemetry-instrumentation-requests==0.51b0 +opentelemetry-instrumentation-logging==0.51b0 +opentelemetry-instrumentation-httpx==0.51b0 +opentelemetry-instrumentation-aiohttp-client==0.51b0 \ No newline at end of file diff --git a/backend/start_windows.bat b/backend/start_windows.bat index 7049cd1b37..19f6f123c5 100644 --- 
a/backend/start_windows.bat +++ b/backend/start_windows.bat @@ -41,4 +41,5 @@ IF "%WEBUI_SECRET_KEY%%WEBUI_JWT_SECRET_KEY%" == " " ( :: Execute uvicorn SET "WEBUI_SECRET_KEY=%WEBUI_SECRET_KEY%" -uvicorn open_webui.main:app --host "%HOST%" --port "%PORT%" --forwarded-allow-ips '*' +uvicorn open_webui.main:app --host "%HOST%" --port "%PORT%" --forwarded-allow-ips '*' --ws auto +:: For SSL, use: uvicorn open_webui.main:app --host "%HOST%" --port "%PORT%" --forwarded-allow-ips '*' --ssl-keyfile "key.pem" --ssl-certfile "cert.pem" --ws auto diff --git a/package-lock.json b/package-lock.json index e98b0968cc..6eb5064d79 100644 --- a/package-lock.json +++ b/package-lock.json @@ -1,12 +1,12 @@ { "name": "open-webui", - "version": "0.5.20", + "version": "0.6.0", "lockfileVersion": 3, "requires": true, "packages": { "": { "name": "open-webui", - "version": "0.5.20", + "version": "0.6.0", "dependencies": { "@azure/msal-browser": "^4.5.0", "@codemirror/lang-javascript": "^6.2.2", @@ -37,6 +37,8 @@ "file-saver": "^2.0.5", "fuse.js": "^7.0.0", "highlight.js": "^11.9.0", + "html-entities": "^2.5.3", + "html2canvas-pro": "^1.5.8", "i18next": "^23.10.0", "i18next-browser-languagedetector": "^7.2.0", "i18next-resources-to-backend": "^1.2.0", @@ -59,7 +61,7 @@ "prosemirror-schema-list": "^1.4.1", "prosemirror-state": "^1.4.3", "prosemirror-view": "^1.34.3", - "pyodide": "^0.27.2", + "pyodide": "^0.27.3", "socket.io-client": "^4.2.0", "sortablejs": "^1.15.2", "svelte-sonner": "^0.3.19", @@ -158,9 +160,9 @@ } }, "node_modules/@babel/runtime": { - "version": "7.26.9", - "resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.26.9.tgz", - "integrity": "sha512-aA63XwOkcl4xxQa3HjPMqOP6LiK0ZDv3mUPYEFXkpHbaFjtGggE1A61FjFzJnB+p7/oy2gA8E+rcBNl/zC1tMg==", + "version": "7.27.0", + "resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.27.0.tgz", + "integrity": "sha512-VtPOkrdPHZsKc/clNqyi9WUA8TINkZ4cGk63UUE3u4pmB2k+ZMQRDuIOagv8UVd6j7k0T3+RRIb7beKTebNbcw==", "license":
"MIT", "dependencies": { "regenerator-runtime": "^0.14.0" @@ -625,371 +627,428 @@ } }, "node_modules/@esbuild/aix-ppc64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.20.2.tgz", - "integrity": "sha512-D+EBOJHXdNZcLJRBkhENNG8Wji2kgc9AZ9KiPr1JuZjsNtyHzrsfLRrY0tk2H2aoFu6RANO1y1iPPUCDYWkb5g==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.25.1.tgz", + "integrity": "sha512-kfYGy8IdzTGy+z0vFGvExZtxkFlA4zAxgKEahG9KE1ScBjpQnFsNOX8KTU5ojNru5ed5CVoJYXFtoxaq5nFbjQ==", "cpu": [ "ppc64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "aix" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/android-arm": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/android-arm/-/android-arm-0.20.2.tgz", - "integrity": "sha512-t98Ra6pw2VaDhqNWO2Oph2LXbz/EJcnLmKLGBJwEwXX/JAN83Fym1rU8l0JUWK6HkIbWONCSSatf4sf2NBRx/w==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/android-arm/-/android-arm-0.25.1.tgz", + "integrity": "sha512-dp+MshLYux6j/JjdqVLnMglQlFu+MuVeNrmT5nk6q07wNhCdSnB7QZj+7G8VMUGh1q+vj2Bq8kRsuyA00I/k+Q==", "cpu": [ "arm" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "android" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/android-arm64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/android-arm64/-/android-arm64-0.20.2.tgz", - "integrity": "sha512-mRzjLacRtl/tWU0SvD8lUEwb61yP9cqQo6noDZP/O8VkwafSYwZ4yWy24kan8jE/IMERpYncRt2dw438LP3Xmg==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/android-arm64/-/android-arm64-0.25.1.tgz", + "integrity": "sha512-50tM0zCJW5kGqgG7fQ7IHvQOcAn9TKiVRuQ/lN0xR+T2lzEFvAi1ZcS8DiksFcEpf1t/GYOeOfCAgDHFpkiSmA==", "cpu": [ "arm64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "android" ], "engines": { - "node": ">=12" + "node": ">=18" } }, 
"node_modules/@esbuild/android-x64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/android-x64/-/android-x64-0.20.2.tgz", - "integrity": "sha512-btzExgV+/lMGDDa194CcUQm53ncxzeBrWJcncOBxuC6ndBkKxnHdFJn86mCIgTELsooUmwUm9FkhSp5HYu00Rg==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/android-x64/-/android-x64-0.25.1.tgz", + "integrity": "sha512-GCj6WfUtNldqUzYkN/ITtlhwQqGWu9S45vUXs7EIYf+7rCiiqH9bCloatO9VhxsL0Pji+PF4Lz2XXCES+Q8hDw==", "cpu": [ "x64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "android" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/darwin-arm64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.20.2.tgz", - "integrity": "sha512-4J6IRT+10J3aJH3l1yzEg9y3wkTDgDk7TSDFX+wKFiWjqWp/iCfLIYzGyasx9l0SAFPT1HwSCR+0w/h1ES/MjA==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.25.1.tgz", + "integrity": "sha512-5hEZKPf+nQjYoSr/elb62U19/l1mZDdqidGfmFutVUjjUZrOazAtwK+Kr+3y0C/oeJfLlxo9fXb1w7L+P7E4FQ==", "cpu": [ "arm64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "darwin" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/darwin-x64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/darwin-x64/-/darwin-x64-0.20.2.tgz", - "integrity": "sha512-tBcXp9KNphnNH0dfhv8KYkZhjc+H3XBkF5DKtswJblV7KlT9EI2+jeA8DgBjp908WEuYll6pF+UStUCfEpdysA==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/darwin-x64/-/darwin-x64-0.25.1.tgz", + "integrity": "sha512-hxVnwL2Dqs3fM1IWq8Iezh0cX7ZGdVhbTfnOy5uURtao5OIVCEyj9xIzemDi7sRvKsuSdtCAhMKarxqtlyVyfA==", "cpu": [ "x64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "darwin" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/freebsd-arm64": { - "version": "0.20.2", - "resolved": 
"https://registry.npmjs.org/@esbuild/freebsd-arm64/-/freebsd-arm64-0.20.2.tgz", - "integrity": "sha512-d3qI41G4SuLiCGCFGUrKsSeTXyWG6yem1KcGZVS+3FYlYhtNoNgYrWcvkOoaqMhwXSMrZRl69ArHsGJ9mYdbbw==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/freebsd-arm64/-/freebsd-arm64-0.25.1.tgz", + "integrity": "sha512-1MrCZs0fZa2g8E+FUo2ipw6jw5qqQiH+tERoS5fAfKnRx6NXH31tXBKI3VpmLijLH6yriMZsxJtaXUyFt/8Y4A==", "cpu": [ "arm64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "freebsd" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/freebsd-x64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/freebsd-x64/-/freebsd-x64-0.20.2.tgz", - "integrity": "sha512-d+DipyvHRuqEeM5zDivKV1KuXn9WeRX6vqSqIDgwIfPQtwMP4jaDsQsDncjTDDsExT4lR/91OLjRo8bmC1e+Cw==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/freebsd-x64/-/freebsd-x64-0.25.1.tgz", + "integrity": "sha512-0IZWLiTyz7nm0xuIs0q1Y3QWJC52R8aSXxe40VUxm6BB1RNmkODtW6LHvWRrGiICulcX7ZvyH6h5fqdLu4gkww==", "cpu": [ "x64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "freebsd" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/linux-arm": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/linux-arm/-/linux-arm-0.20.2.tgz", - "integrity": "sha512-VhLPeR8HTMPccbuWWcEUD1Az68TqaTYyj6nfE4QByZIQEQVWBB8vup8PpR7y1QHL3CpcF6xd5WVBU/+SBEvGTg==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/linux-arm/-/linux-arm-0.25.1.tgz", + "integrity": "sha512-NdKOhS4u7JhDKw9G3cY6sWqFcnLITn6SqivVArbzIaf3cemShqfLGHYMx8Xlm/lBit3/5d7kXvriTUGa5YViuQ==", "cpu": [ "arm" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "linux" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/linux-arm64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/linux-arm64/-/linux-arm64-0.20.2.tgz", - "integrity": 
"sha512-9pb6rBjGvTFNira2FLIWqDk/uaf42sSyLE8j1rnUpuzsODBq7FvpwHYZxQ/It/8b+QOS1RYfqgGFNLRI+qlq2A==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/linux-arm64/-/linux-arm64-0.25.1.tgz", + "integrity": "sha512-jaN3dHi0/DDPelk0nLcXRm1q7DNJpjXy7yWaWvbfkPvI+7XNSc/lDOnCLN7gzsyzgu6qSAmgSvP9oXAhP973uQ==", "cpu": [ "arm64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "linux" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/linux-ia32": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/linux-ia32/-/linux-ia32-0.20.2.tgz", - "integrity": "sha512-o10utieEkNPFDZFQm9CoP7Tvb33UutoJqg3qKf1PWVeeJhJw0Q347PxMvBgVVFgouYLGIhFYG0UGdBumROyiig==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/linux-ia32/-/linux-ia32-0.25.1.tgz", + "integrity": "sha512-OJykPaF4v8JidKNGz8c/q1lBO44sQNUQtq1KktJXdBLn1hPod5rE/Hko5ugKKZd+D2+o1a9MFGUEIUwO2YfgkQ==", "cpu": [ "ia32" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "linux" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/linux-loong64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/linux-loong64/-/linux-loong64-0.20.2.tgz", - "integrity": "sha512-PR7sp6R/UC4CFVomVINKJ80pMFlfDfMQMYynX7t1tNTeivQ6XdX5r2XovMmha/VjR1YN/HgHWsVcTRIMkymrgQ==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/linux-loong64/-/linux-loong64-0.25.1.tgz", + "integrity": "sha512-nGfornQj4dzcq5Vp835oM/o21UMlXzn79KobKlcs3Wz9smwiifknLy4xDCLUU0BWp7b/houtdrgUz7nOGnfIYg==", "cpu": [ "loong64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "linux" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/linux-mips64el": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/linux-mips64el/-/linux-mips64el-0.20.2.tgz", - "integrity": 
"sha512-4BlTqeutE/KnOiTG5Y6Sb/Hw6hsBOZapOVF6njAESHInhlQAghVVZL1ZpIctBOoTFbQyGW+LsVYZ8lSSB3wkjA==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/linux-mips64el/-/linux-mips64el-0.25.1.tgz", + "integrity": "sha512-1osBbPEFYwIE5IVB/0g2X6i1qInZa1aIoj1TdL4AaAb55xIIgbg8Doq6a5BzYWgr+tEcDzYH67XVnTmUzL+nXg==", "cpu": [ "mips64el" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "linux" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/linux-ppc64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/linux-ppc64/-/linux-ppc64-0.20.2.tgz", - "integrity": "sha512-rD3KsaDprDcfajSKdn25ooz5J5/fWBylaaXkuotBDGnMnDP1Uv5DLAN/45qfnf3JDYyJv/ytGHQaziHUdyzaAg==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/linux-ppc64/-/linux-ppc64-0.25.1.tgz", + "integrity": "sha512-/6VBJOwUf3TdTvJZ82qF3tbLuWsscd7/1w+D9LH0W/SqUgM5/JJD0lrJ1fVIfZsqB6RFmLCe0Xz3fmZc3WtyVg==", "cpu": [ "ppc64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "linux" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/linux-riscv64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/linux-riscv64/-/linux-riscv64-0.20.2.tgz", - "integrity": "sha512-snwmBKacKmwTMmhLlz/3aH1Q9T8v45bKYGE3j26TsaOVtjIag4wLfWSiZykXzXuE1kbCE+zJRmwp+ZbIHinnVg==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/linux-riscv64/-/linux-riscv64-0.25.1.tgz", + "integrity": "sha512-nSut/Mx5gnilhcq2yIMLMe3Wl4FK5wx/o0QuuCLMtmJn+WeWYoEGDN1ipcN72g1WHsnIbxGXd4i/MF0gTcuAjQ==", "cpu": [ "riscv64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "linux" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/linux-s390x": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/linux-s390x/-/linux-s390x-0.20.2.tgz", - "integrity": 
"sha512-wcWISOobRWNm3cezm5HOZcYz1sKoHLd8VL1dl309DiixxVFoFe/o8HnwuIwn6sXre88Nwj+VwZUvJf4AFxkyrQ==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/linux-s390x/-/linux-s390x-0.25.1.tgz", + "integrity": "sha512-cEECeLlJNfT8kZHqLarDBQso9a27o2Zd2AQ8USAEoGtejOrCYHNtKP8XQhMDJMtthdF4GBmjR2au3x1udADQQQ==", "cpu": [ "s390x" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "linux" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/linux-x64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/linux-x64/-/linux-x64-0.20.2.tgz", - "integrity": "sha512-1MdwI6OOTsfQfek8sLwgyjOXAu+wKhLEoaOLTjbijk6E2WONYpH9ZU2mNtR+lZ2B4uwr+usqGuVfFT9tMtGvGw==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/linux-x64/-/linux-x64-0.25.1.tgz", + "integrity": "sha512-xbfUhu/gnvSEg+EGovRc+kjBAkrvtk38RlerAzQxvMzlB4fXpCFCeUAYzJvrnhFtdeyVCDANSjJvOvGYoeKzFA==", "cpu": [ "x64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "linux" ], "engines": { - "node": ">=12" + "node": ">=18" } }, - "node_modules/@esbuild/netbsd-x64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/netbsd-x64/-/netbsd-x64-0.20.2.tgz", - "integrity": "sha512-K8/DhBxcVQkzYc43yJXDSyjlFeHQJBiowJ0uVL6Tor3jGQfSGHNNJcWxNbOI8v5k82prYqzPuwkzHt3J1T1iZQ==", + "node_modules/@esbuild/netbsd-arm64": { + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/netbsd-arm64/-/netbsd-arm64-0.25.1.tgz", + "integrity": "sha512-O96poM2XGhLtpTh+s4+nP7YCCAfb4tJNRVZHfIE7dgmax+yMP2WgMd2OecBuaATHKTHsLWHQeuaxMRnCsH8+5g==", "cpu": [ - "x64" + "arm64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "netbsd" ], "engines": { - "node": ">=12" + "node": ">=18" } }, - "node_modules/@esbuild/openbsd-x64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/openbsd-x64/-/openbsd-x64-0.20.2.tgz", - "integrity": 
"sha512-eMpKlV0SThJmmJgiVyN9jTPJ2VBPquf6Kt/nAoo6DgHAoN57K15ZghiHaMvqjCye/uU4X5u3YSMgVBI1h3vKrQ==", + "node_modules/@esbuild/netbsd-x64": { + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/netbsd-x64/-/netbsd-x64-0.25.1.tgz", + "integrity": "sha512-X53z6uXip6KFXBQ+Krbx25XHV/NCbzryM6ehOAeAil7X7oa4XIq+394PWGnwaSQ2WRA0KI6PUO6hTO5zeF5ijA==", "cpu": [ "x64" ], "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "netbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/openbsd-arm64": { + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/openbsd-arm64/-/openbsd-arm64-0.25.1.tgz", + "integrity": "sha512-Na9T3szbXezdzM/Kfs3GcRQNjHzM6GzFBeU1/6IV/npKP5ORtp9zbQjvkDJ47s6BCgaAZnnnu/cY1x342+MvZg==", + "cpu": [ + "arm64" + ], + "dev": true, + "license": "MIT", "optional": true, "os": [ "openbsd" ], "engines": { - "node": ">=12" + "node": ">=18" } }, - "node_modules/@esbuild/sunos-x64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/sunos-x64/-/sunos-x64-0.20.2.tgz", - "integrity": "sha512-2UyFtRC6cXLyejf/YEld4Hajo7UHILetzE1vsRcGL3earZEW77JxrFjH4Ez2qaTiEfMgAXxfAZCm1fvM/G/o8w==", + "node_modules/@esbuild/openbsd-x64": { + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/openbsd-x64/-/openbsd-x64-0.25.1.tgz", + "integrity": "sha512-T3H78X2h1tszfRSf+txbt5aOp/e7TAz3ptVKu9Oyir3IAOFPGV6O9c2naym5TOriy1l0nNf6a4X5UXRZSGX/dw==", "cpu": [ "x64" ], "dev": true, + "license": "MIT", + "optional": true, + "os": [ + "openbsd" + ], + "engines": { + "node": ">=18" + } + }, + "node_modules/@esbuild/sunos-x64": { + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/sunos-x64/-/sunos-x64-0.25.1.tgz", + "integrity": "sha512-2H3RUvcmULO7dIE5EWJH8eubZAI4xw54H1ilJnRNZdeo8dTADEZ21w6J22XBkXqGJbe0+wnNJtw3UXRoLJnFEg==", + "cpu": [ + "x64" + ], + "dev": true, + "license": "MIT", "optional": true, "os": [ "sunos" ], "engines": { - "node": ">=12" + "node": 
">=18" } }, "node_modules/@esbuild/win32-arm64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/win32-arm64/-/win32-arm64-0.20.2.tgz", - "integrity": "sha512-GRibxoawM9ZCnDxnP3usoUDO9vUkpAxIIZ6GQI+IlVmr5kP3zUq+l17xELTHMWTWzjxa2guPNyrpq1GWmPvcGQ==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/win32-arm64/-/win32-arm64-0.25.1.tgz", + "integrity": "sha512-GE7XvrdOzrb+yVKB9KsRMq+7a2U/K5Cf/8grVFRAGJmfADr/e/ODQ134RK2/eeHqYV5eQRFxb1hY7Nr15fv1NQ==", "cpu": [ "arm64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "win32" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/win32-ia32": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/win32-ia32/-/win32-ia32-0.20.2.tgz", - "integrity": "sha512-HfLOfn9YWmkSKRQqovpnITazdtquEW8/SoHW7pWpuEeguaZI4QnCRW6b+oZTztdBnZOS2hqJ6im/D5cPzBTTlQ==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/win32-ia32/-/win32-ia32-0.25.1.tgz", + "integrity": "sha512-uOxSJCIcavSiT6UnBhBzE8wy3n0hOkJsBOzy7HDAuTDE++1DJMRRVCPGisULScHL+a/ZwdXPpXD3IyFKjA7K8A==", "cpu": [ "ia32" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "win32" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@esbuild/win32-x64": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/@esbuild/win32-x64/-/win32-x64-0.20.2.tgz", - "integrity": "sha512-N49X4lJX27+l9jbLKSqZ6bKNjzQvHaT8IIFUy+YIqmXQdjYCToGWwOItDrfby14c78aDd5NHQl29xingXfCdLQ==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/@esbuild/win32-x64/-/win32-x64-0.25.1.tgz", + "integrity": "sha512-Y1EQdcfwMSeQN/ujR5VayLOJ1BHaK+ssyk0AEzPjC+t1lITgsnccPqFjb6V+LsTp/9Iov4ysfjxLaGJ9RPtkVg==", "cpu": [ "x64" ], "dev": true, + "license": "MIT", "optional": true, "os": [ "win32" ], "engines": { - "node": ">=12" + "node": ">=18" } }, "node_modules/@eslint-community/eslint-utils": { @@ -2291,9 +2350,9 @@ } }, 
"node_modules/@sveltejs/adapter-static": { - "version": "3.0.6", - "resolved": "https://registry.npmjs.org/@sveltejs/adapter-static/-/adapter-static-3.0.6.tgz", - "integrity": "sha512-MGJcesnJWj7FxDcB/GbrdYD3q24Uk0PIL4QIX149ku+hlJuj//nxUbb0HxUTpjkecWfHjVveSUnUaQWnPRXlpg==", + "version": "3.0.8", + "resolved": "https://registry.npmjs.org/@sveltejs/adapter-static/-/adapter-static-3.0.8.tgz", + "integrity": "sha512-YaDrquRpZwfcXbnlDsSrBQNCChVOT9MGuSg+dMAyfsAa1SmiAhrA5jUYUiIMC59G92kIbY/AaQOWcBdq+lh+zg==", "dev": true, "license": "MIT", "peerDependencies": { @@ -2301,24 +2360,22 @@ } }, "node_modules/@sveltejs/kit": { - "version": "2.12.1", - "resolved": "https://registry.npmjs.org/@sveltejs/kit/-/kit-2.12.1.tgz", - "integrity": "sha512-M3rPijGImeOkI0DBJSwjqz+YFX2DyOf6NzWgHVk3mqpT06dlYCpcv5xh1q4rYEqB58yQlk4QA1Y35PUqnUiFKw==", - "hasInstallScript": true, + "version": "2.20.2", + "resolved": "https://registry.npmjs.org/@sveltejs/kit/-/kit-2.20.2.tgz", + "integrity": "sha512-Dv8TOAZC9vyfcAB9TMsvUEJsRbklRTeNfcYBPaeH6KnABJ99i3CvCB2eNx8fiiliIqe+9GIchBg4RodRH5p1BQ==", "license": "MIT", "dependencies": { "@types/cookie": "^0.6.0", "cookie": "^0.6.0", "devalue": "^5.1.0", - "esm-env": "^1.2.1", + "esm-env": "^1.2.2", "import-meta-resolve": "^4.1.0", "kleur": "^4.1.5", "magic-string": "^0.30.5", "mrmime": "^2.0.0", "sade": "^1.8.1", "set-cookie-parser": "^2.6.0", - "sirv": "^3.0.0", - "tiny-glob": "^0.2.9" + "sirv": "^3.0.0" }, "bin": { "svelte-kit": "svelte-kit.js" @@ -3884,7 +3941,6 @@ "resolved": "https://registry.npmjs.org/base64-arraybuffer/-/base64-arraybuffer-1.0.2.tgz", "integrity": "sha512-I3yl4r9QB5ZRY3XuJVEPfc2XhZO6YweFPI+UovAzn+8/hb3oJ6lnysaFcjVpkCPfVWFUDvoZ8kmVDP7WyRtYtQ==", "license": "MIT", - "optional": true, "engines": { "node": ">= 0.6.0" } @@ -3993,7 +4049,8 @@ "version": "1.0.0", "resolved": "https://registry.npmjs.org/boolbase/-/boolbase-1.0.0.tgz", "integrity": "sha512-JZOSA7Mo9sNGB8+UjSgzdLtokWAky1zbztM3WRLCbZ70/3cTANmQmOdR7y2g+J0e2WXywy1yS468tY+IruqEww==", 
- "dev": true + "dev": true, + "license": "ISC" }, "node_modules/brace-expansion": { "version": "2.0.1", @@ -4204,9 +4261,9 @@ } }, "node_modules/canvg": { - "version": "3.0.10", - "resolved": "https://registry.npmjs.org/canvg/-/canvg-3.0.10.tgz", - "integrity": "sha512-qwR2FRNO9NlzTeKIPIKpnTY6fqwuYSequ8Ru8c0YkYU7U0oW+hLUvWadLvAu1Rl72OMNiFhoLu4f8eUjQ7l/+Q==", + "version": "3.0.11", + "resolved": "https://registry.npmjs.org/canvg/-/canvg-3.0.11.tgz", + "integrity": "sha512-5ON+q7jCTgMp9cjpu4Jo6XbvfYwSB2Ow3kzHKfIyJfaCAOHLbdKPQqGKgfED/R5B+3TFFfe8pegYA+b423SRyA==", "license": "MIT", "optional": true, "dependencies": { @@ -4303,21 +4360,26 @@ } }, "node_modules/cheerio": { - "version": "1.0.0-rc.12", - "resolved": "https://registry.npmjs.org/cheerio/-/cheerio-1.0.0-rc.12.tgz", - "integrity": "sha512-VqR8m68vM46BNnuZ5NtnGBKIE/DfN0cRIzg9n40EIq9NOv90ayxLBXA8fXC5gquFRGJSTRqBq25Jt2ECLR431Q==", + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/cheerio/-/cheerio-1.0.0.tgz", + "integrity": "sha512-quS9HgjQpdaXOvsZz82Oz7uxtXiy6UIsIQcpBj7HRw2M63Skasm9qlDocAM7jNuaxdhpPU7c4kJN+gA5MCu4ww==", "dev": true, + "license": "MIT", "dependencies": { "cheerio-select": "^2.1.0", "dom-serializer": "^2.0.0", "domhandler": "^5.0.3", - "domutils": "^3.0.1", - "htmlparser2": "^8.0.1", - "parse5": "^7.0.0", - "parse5-htmlparser2-tree-adapter": "^7.0.0" + "domutils": "^3.1.0", + "encoding-sniffer": "^0.2.0", + "htmlparser2": "^9.1.0", + "parse5": "^7.1.2", + "parse5-htmlparser2-tree-adapter": "^7.0.0", + "parse5-parser-stream": "^7.1.2", + "undici": "^6.19.5", + "whatwg-mimetype": "^4.0.0" }, "engines": { - "node": ">= 6" + "node": ">=18.17" }, "funding": { "url": "https://github.com/cheeriojs/cheerio?sponsor=1" @@ -4328,6 +4390,7 @@ "resolved": "https://registry.npmjs.org/cheerio-select/-/cheerio-select-2.1.0.tgz", "integrity": "sha512-9v9kG0LvzrlcungtnJtpGNxY+fzECQKhK4EGJX2vByejiMX84MFNQw4UxPJl3bFbTMw+Dfs37XaIkCwTZfLh4g==", "dev": true, + "license": "BSD-2-Clause", "dependencies": { 
"boolbase": "^1.0.0", "css-select": "^5.1.0", @@ -4340,6 +4403,16 @@ "url": "https://github.com/sponsors/fb55" } }, + "node_modules/cheerio/node_modules/undici": { + "version": "6.21.2", + "resolved": "https://registry.npmjs.org/undici/-/undici-6.21.2.tgz", + "integrity": "sha512-uROZWze0R0itiAKVPsYhFov9LxrPMHLMEQFszeI2gCN6bnIIZ8twzBCJcN2LJrBBLfrP0t1FW0g+JmKVl8Vk1g==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=18.17" + } + }, "node_modules/chokidar": { "version": "3.6.0", "resolved": "https://registry.npmjs.org/chokidar/-/chokidar-3.6.0.tgz", @@ -4699,9 +4772,9 @@ } }, "node_modules/core-js": { - "version": "3.40.0", - "resolved": "https://registry.npmjs.org/core-js/-/core-js-3.40.0.tgz", - "integrity": "sha512-7vsMc/Lty6AGnn7uFpYT56QesI5D2Y/UkgKounk87OP9Z2H9Z8kj6jzcSGAxFmUtDOS0ntK6lbQz+Nsa0Jj6mQ==", + "version": "3.41.0", + "resolved": "https://registry.npmjs.org/core-js/-/core-js-3.41.0.tgz", + "integrity": "sha512-SJ4/EHwS36QMJd6h/Rg+GyR4A5xE0FSI3eZ+iBVpfqf1x0eTSg1smWLHrA+2jQThZSh97fmSgFSU8B61nxosxA==", "hasInstallScript": true, "license": "MIT", "optional": true, @@ -4759,7 +4832,6 @@ "resolved": "https://registry.npmjs.org/css-line-break/-/css-line-break-2.1.0.tgz", "integrity": "sha512-FHcKFCZcAha3LwfVBhCQbW2nCNbkZXn7KVUJcsT5/P8YmfsVja0FMPJr0B903j/E69HUphKiV9iQArX8SDYA4w==", "license": "MIT", - "optional": true, "dependencies": { "utrie": "^1.0.2" } @@ -4769,6 +4841,7 @@ "resolved": "https://registry.npmjs.org/css-select/-/css-select-5.1.0.tgz", "integrity": "sha512-nwoRF1rvRRnnCqqY7updORDsuqKzqYJ28+oSMaJMMgOauh3fvwHqMS7EZpIPqK8GL+g9mKxF1vP/ZjSeNjEVHg==", "dev": true, + "license": "BSD-2-Clause", "dependencies": { "boolbase": "^1.0.0", "css-what": "^6.1.0", @@ -4797,6 +4870,7 @@ "resolved": "https://registry.npmjs.org/css-what/-/css-what-6.1.0.tgz", "integrity": "sha512-HTUrgRJ7r4dsZKU6GjmpfRK1O76h97Z8MfS1G0FozR+oF2kG6Vfe8JE6zwrkbxigziPHinCJ+gCPjA9EaBDtRw==", "dev": true, + "license": "BSD-2-Clause", "engines": { "node": ">= 6" }, @@ 
-5574,6 +5648,7 @@ "resolved": "https://registry.npmjs.org/dom-serializer/-/dom-serializer-2.0.0.tgz", "integrity": "sha512-wIkAryiqt/nV5EQKqQpo3SToSOV9J0DnbJqwK7Wv/Trc92zIAYZ4FlMu+JPFW1DfGFt81ZTCGgDEabffXeLyJg==", "dev": true, + "license": "MIT", "dependencies": { "domelementtype": "^2.3.0", "domhandler": "^5.0.2", @@ -5593,13 +5668,15 @@ "type": "github", "url": "https://github.com/sponsors/fb55" } - ] + ], + "license": "BSD-2-Clause" }, "node_modules/domhandler": { "version": "5.0.3", "resolved": "https://registry.npmjs.org/domhandler/-/domhandler-5.0.3.tgz", "integrity": "sha512-cgwlv/1iFQiFnU96XXgROh8xTeetsnJiDsTc7TYCLFd9+/WNkIqPTxiM/8pSd8VIrhXGTf1Ny1q1hquVqDJB5w==", "dev": true, + "license": "BSD-2-Clause", "dependencies": { "domelementtype": "^2.3.0" }, @@ -5616,10 +5693,11 @@ "integrity": "sha512-cTOAhc36AalkjtBpfG6O8JimdTMWNXjiePT2xQH/ppBGi/4uIpmj8eKyIkMJErXWARyINV/sB38yf8JCLF5pbQ==" }, "node_modules/domutils": { - "version": "3.1.0", - "resolved": "https://registry.npmjs.org/domutils/-/domutils-3.1.0.tgz", - "integrity": "sha512-H78uMmQtI2AhgDJjWeQmHwJJ2bLPD3GMmO7Zja/ZZh84wkm+4ut+IUnUdRa8uCGX88DiVx1j6FRe1XfxEgjEZA==", + "version": "3.2.2", + "resolved": "https://registry.npmjs.org/domutils/-/domutils-3.2.2.tgz", + "integrity": "sha512-6kZKyUajlDuqlHKVX1w7gyslj9MPIXzIFiz/rGu35uC1wMi+kMhQwGhl4lt9unC9Vb9INnY9Z3/ZA3+FhASLaw==", "dev": true, + "license": "BSD-2-Clause", "dependencies": { "dom-serializer": "^2.0.0", "domelementtype": "^2.3.0", @@ -5654,6 +5732,20 @@ "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-9.2.2.tgz", "integrity": "sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg==" }, + "node_modules/encoding-sniffer": { + "version": "0.2.0", + "resolved": "https://registry.npmjs.org/encoding-sniffer/-/encoding-sniffer-0.2.0.tgz", + "integrity": "sha512-ju7Wq1kg04I3HtiYIOrUrdfdDvkyO9s5XM8QAj/bN61Yo/Vb4vgJxy5vi4Yxk01gWHbrofpPtpxM8bKger9jhg==", + "dev": true, + "license": "MIT", + 
"dependencies": { + "iconv-lite": "^0.6.3", + "whatwg-encoding": "^3.1.1" + }, + "funding": { + "url": "https://github.com/fb55/encoding-sniffer?sponsor=1" + } + }, "node_modules/end-of-stream": { "version": "1.4.4", "resolved": "https://registry.npmjs.org/end-of-stream/-/end-of-stream-1.4.4.tgz", @@ -5761,41 +5853,44 @@ "dev": true }, "node_modules/esbuild": { - "version": "0.20.2", - "resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.20.2.tgz", - "integrity": "sha512-WdOOppmUNU+IbZ0PaDiTst80zjnrOkyJNHoKupIcVyU8Lvla3Ugx94VzkQ32Ijqd7UhHJy75gNWDMUekcrSJ6g==", + "version": "0.25.1", + "resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.25.1.tgz", + "integrity": "sha512-BGO5LtrGC7vxnqucAe/rmvKdJllfGaYWdyABvyMoXQlfYMb2bbRuReWR5tEGE//4LcNJj9XrkovTqNYRFZHAMQ==", "dev": true, "hasInstallScript": true, + "license": "MIT", "bin": { "esbuild": "bin/esbuild" }, "engines": { - "node": ">=12" + "node": ">=18" }, "optionalDependencies": { - "@esbuild/aix-ppc64": "0.20.2", - "@esbuild/android-arm": "0.20.2", - "@esbuild/android-arm64": "0.20.2", - "@esbuild/android-x64": "0.20.2", - "@esbuild/darwin-arm64": "0.20.2", - "@esbuild/darwin-x64": "0.20.2", - "@esbuild/freebsd-arm64": "0.20.2", - "@esbuild/freebsd-x64": "0.20.2", - "@esbuild/linux-arm": "0.20.2", - "@esbuild/linux-arm64": "0.20.2", - "@esbuild/linux-ia32": "0.20.2", - "@esbuild/linux-loong64": "0.20.2", - "@esbuild/linux-mips64el": "0.20.2", - "@esbuild/linux-ppc64": "0.20.2", - "@esbuild/linux-riscv64": "0.20.2", - "@esbuild/linux-s390x": "0.20.2", - "@esbuild/linux-x64": "0.20.2", - "@esbuild/netbsd-x64": "0.20.2", - "@esbuild/openbsd-x64": "0.20.2", - "@esbuild/sunos-x64": "0.20.2", - "@esbuild/win32-arm64": "0.20.2", - "@esbuild/win32-ia32": "0.20.2", - "@esbuild/win32-x64": "0.20.2" + "@esbuild/aix-ppc64": "0.25.1", + "@esbuild/android-arm": "0.25.1", + "@esbuild/android-arm64": "0.25.1", + "@esbuild/android-x64": "0.25.1", + "@esbuild/darwin-arm64": "0.25.1", + "@esbuild/darwin-x64": "0.25.1", + 
"@esbuild/freebsd-arm64": "0.25.1", + "@esbuild/freebsd-x64": "0.25.1", + "@esbuild/linux-arm": "0.25.1", + "@esbuild/linux-arm64": "0.25.1", + "@esbuild/linux-ia32": "0.25.1", + "@esbuild/linux-loong64": "0.25.1", + "@esbuild/linux-mips64el": "0.25.1", + "@esbuild/linux-ppc64": "0.25.1", + "@esbuild/linux-riscv64": "0.25.1", + "@esbuild/linux-s390x": "0.25.1", + "@esbuild/linux-x64": "0.25.1", + "@esbuild/netbsd-arm64": "0.25.1", + "@esbuild/netbsd-x64": "0.25.1", + "@esbuild/openbsd-arm64": "0.25.1", + "@esbuild/openbsd-x64": "0.25.1", + "@esbuild/sunos-x64": "0.25.1", + "@esbuild/win32-arm64": "0.25.1", + "@esbuild/win32-ia32": "0.25.1", + "@esbuild/win32-x64": "0.25.1" } }, "node_modules/escape-string-regexp": { @@ -6001,9 +6096,9 @@ } }, "node_modules/esm-env": { - "version": "1.2.1", - "resolved": "https://registry.npmjs.org/esm-env/-/esm-env-1.2.1.tgz", - "integrity": "sha512-U9JedYYjCnadUlXk7e1Kr+aENQhtUaoaV9+gZm1T8LC/YBAPJx3NSPIAurFOC0U5vrdSevnUJS2/wUVxGwPhng==", + "version": "1.2.2", + "resolved": "https://registry.npmjs.org/esm-env/-/esm-env-1.2.2.tgz", + "integrity": "sha512-Epxrv+Nr/CaL4ZcFGPJIYLWFom+YeV1DqMLHJoEd9SYRxNbaFruBwfEX/kkHUJf55j2+TUbmDcmuilbP1TmXHA==", "license": "MIT" }, "node_modules/espree": { @@ -6651,11 +6746,6 @@ "url": "https://github.com/sponsors/sindresorhus" } }, - "node_modules/globalyzer": { - "version": "0.1.0", - "resolved": "https://registry.npmjs.org/globalyzer/-/globalyzer-0.1.0.tgz", - "integrity": "sha512-40oNTM9UfG6aBmuKxk/giHn5nQ8RVz/SS4Ir6zgzOv9/qC3kKZ9v4etGTcJbEl/NyVQH7FGU7d+X1egr57Md2Q==" - }, "node_modules/globby": { "version": "11.1.0", "resolved": "https://registry.npmjs.org/globby/-/globby-11.1.0.tgz", @@ -6676,11 +6766,6 @@ "url": "https://github.com/sponsors/sindresorhus" } }, - "node_modules/globrex": { - "version": "0.1.2", - "resolved": "https://registry.npmjs.org/globrex/-/globrex-0.1.2.tgz", - "integrity": "sha512-uHJgbwAMwNFf5mLst7IWLNg14x1CkeqglJb/K3doi4dw6q2IvAAmM/Y81kevy83wP+Sst+nutFTYOGg3d1lsxg==" - }, 
"node_modules/gopd": { "version": "1.0.1", "resolved": "https://registry.npmjs.org/gopd/-/gopd-1.0.1.tgz", @@ -6823,6 +6908,22 @@ "node": ">=12.0.0" } }, + "node_modules/html-entities": { + "version": "2.5.3", + "resolved": "https://registry.npmjs.org/html-entities/-/html-entities-2.5.3.tgz", + "integrity": "sha512-D3AfvN7SjhTgBSA8L1BN4FpPzuEd06uy4lHwSoRWr0lndi9BKaNzPLKGOWZ2ocSGguozr08TTb2jhCLHaemruw==", + "funding": [ + { + "type": "github", + "url": "https://github.com/sponsors/mdevils" + }, + { + "type": "patreon", + "url": "https://patreon.com/mdevils" + } + ], + "license": "MIT" + }, "node_modules/html-escaper": { "version": "3.0.3", "resolved": "https://registry.npmjs.org/html-escaper/-/html-escaper-3.0.3.tgz", @@ -6842,10 +6943,23 @@ "node": ">=8.0.0" } }, + "node_modules/html2canvas-pro": { + "version": "1.5.8", + "resolved": "https://registry.npmjs.org/html2canvas-pro/-/html2canvas-pro-1.5.8.tgz", + "integrity": "sha512-bVGAU7IvhBwBlRAmX6QhekX8lsaxmYoF6zIwf/HNlHscjx+KN8jw/U4PQRYqeEVm9+m13hcS1l5ChJB9/e29Lw==", + "license": "MIT", + "dependencies": { + "css-line-break": "^2.1.0", + "text-segmentation": "^1.0.3" + }, + "engines": { + "node": ">=16.0.0" + } + }, "node_modules/htmlparser2": { - "version": "8.0.2", - "resolved": "https://registry.npmjs.org/htmlparser2/-/htmlparser2-8.0.2.tgz", - "integrity": "sha512-GYdjWKDkbRLkZ5geuHs5NY1puJ+PXwP7+fHPRz06Eirsb9ugf6d8kkXav6ADhcODhFFPMIXyxkxSuMf3D6NCFA==", + "version": "9.1.0", + "resolved": "https://registry.npmjs.org/htmlparser2/-/htmlparser2-9.1.0.tgz", + "integrity": "sha512-5zfg6mHUoaer/97TxnGpxmbR7zJtPwIYFMZ/H5ucTlPZhKvtum05yiPK3Mgai3a0DyVxv7qYqoweaEd2nrYQzQ==", "dev": true, "funding": [ "https://github.com/fb55/htmlparser2?sponsor=1", @@ -6854,11 +6968,12 @@ "url": "https://github.com/sponsors/fb55" } ], + "license": "MIT", "dependencies": { "domelementtype": "^2.3.0", "domhandler": "^5.0.3", - "domutils": "^3.0.1", - "entities": "^4.4.0" + "domutils": "^3.1.0", + "entities": "^4.5.0" } }, 
"node_modules/http-signature": { @@ -6915,34 +7030,35 @@ } }, "node_modules/i18next-parser": { - "version": "9.0.1", - "resolved": "https://registry.npmjs.org/i18next-parser/-/i18next-parser-9.0.1.tgz", - "integrity": "sha512-/Pr93/yEBdwsMKRsk4Zn63K368ALhzh8BRVrM6JNGOHy86ZKpiNJI6m8l1S/4T4Ofy1J4dlwkD7N98M70GP4aA==", + "version": "9.3.0", + "resolved": "https://registry.npmjs.org/i18next-parser/-/i18next-parser-9.3.0.tgz", + "integrity": "sha512-VaQqk/6nLzTFx1MDiCZFtzZXKKyBV6Dv0cJMFM/hOt4/BWHWRgYafzYfVQRUzotwUwjqeNCprWnutzD/YAGczg==", "dev": true, + "license": "MIT", "dependencies": { - "@babel/runtime": "^7.23.2", + "@babel/runtime": "^7.25.0", "broccoli-plugin": "^4.0.7", - "cheerio": "^1.0.0-rc.2", - "colors": "1.4.0", - "commander": "~12.1.0", + "cheerio": "^1.0.0", + "colors": "^1.4.0", + "commander": "^12.1.0", "eol": "^0.9.1", - "esbuild": "^0.20.1", - "fs-extra": "^11.1.0", + "esbuild": "^0.25.0", + "fs-extra": "^11.2.0", "gulp-sort": "^2.0.0", - "i18next": "^23.5.1", - "js-yaml": "4.1.0", - "lilconfig": "^3.0.0", - "rsvp": "^4.8.2", + "i18next": "^23.5.1 || ^24.2.0", + "js-yaml": "^4.1.0", + "lilconfig": "^3.1.3", + "rsvp": "^4.8.5", "sort-keys": "^5.0.0", "typescript": "^5.0.4", - "vinyl": "~3.0.0", + "vinyl": "^3.0.0", "vinyl-fs": "^4.0.0" }, "bin": { "i18next": "bin/cli.js" }, "engines": { - "node": ">=18.0.0 || >=20.0.0 || >=22.0.0", + "node": "^18.0.0 || ^20.0.0 || ^22.0.0", "npm": ">=6", "yarn": ">=1" } @@ -7355,18 +7471,18 @@ } }, "node_modules/jspdf": { - "version": "3.0.0", - "resolved": "https://registry.npmjs.org/jspdf/-/jspdf-3.0.0.tgz", - "integrity": "sha512-QvuQZvOI8CjfjVgtajdL0ihrDYif1cN5gXiF9lb9Pd9JOpmocvnNyFO9sdiJ/8RA5Bu8zyGOUjJLj5kiku16ug==", + "version": "3.0.1", + "resolved": "https://registry.npmjs.org/jspdf/-/jspdf-3.0.1.tgz", + "integrity": "sha512-qaGIxqxetdoNnFQQXxTKUD9/Z7AloLaw94fFsOiJMxbfYdBbrBuhWmbzI8TVjrw7s3jBY1PFHofBKMV/wZPapg==", "license": "MIT", "dependencies": { - "@babel/runtime": "^7.26.0", + "@babel/runtime": "^7.26.7", 
"atob": "^2.1.2", "btoa": "^1.2.1", "fflate": "^0.8.1" }, "optionalDependencies": { - "canvg": "^3.0.6", + "canvg": "^3.0.11", "core-js": "^3.6.0", "dompurify": "^3.2.4", "html2canvas": "^1.0.0-rc.5" @@ -7738,10 +7854,11 @@ } }, "node_modules/lilconfig": { - "version": "3.1.1", - "resolved": "https://registry.npmjs.org/lilconfig/-/lilconfig-3.1.1.tgz", - "integrity": "sha512-O18pf7nyvHTckunPWCV1XUNXU1piu01y2b7ATJ0ppkUkk8ocqVWBrYjJBCwHDjD/ZWcfyrA0P4gKhzWGi5EINQ==", + "version": "3.1.3", + "resolved": "https://registry.npmjs.org/lilconfig/-/lilconfig-3.1.3.tgz", + "integrity": "sha512-/vlFKAoH5Cgt3Ie+JLhRbwOsCQePABiU3tJ1egGvyQ+33R/vcwM2Zl2QR/LzjsBeItPt3oSVXapn+m4nQDvpzw==", "dev": true, + "license": "MIT", "engines": { "node": ">=14" }, @@ -8873,6 +8990,7 @@ "resolved": "https://registry.npmjs.org/nth-check/-/nth-check-2.1.1.tgz", "integrity": "sha512-lqjrjmaOoAnWfMmBPL+XNnynZh2+swxiX3WUE0s4yEHI6m+AwrK2UZOimIRl3X/4QctVqS8AiZjFqyOGrMXb/w==", "dev": true, + "license": "BSD-2-Clause", "dependencies": { "boolbase": "^1.0.0" }, @@ -9079,24 +9197,39 @@ } }, "node_modules/parse5": { - "version": "7.1.2", - "resolved": "https://registry.npmjs.org/parse5/-/parse5-7.1.2.tgz", - "integrity": "sha512-Czj1WaSVpaoj0wbhMzLmWD69anp2WH7FXMB9n1Sy8/ZFF9jolSQVMu1Ij5WIyGmcBmhk7EOndpO4mIpihVqAXw==", + "version": "7.2.1", + "resolved": "https://registry.npmjs.org/parse5/-/parse5-7.2.1.tgz", + "integrity": "sha512-BuBYQYlv1ckiPdQi/ohiivi9Sagc9JG+Ozs0r7b/0iK3sKmrb0b9FdWdBbOdx6hBCM/F9Ir82ofnBhtZOjCRPQ==", "dev": true, + "license": "MIT", "dependencies": { - "entities": "^4.4.0" + "entities": "^4.5.0" }, "funding": { "url": "https://github.com/inikulin/parse5?sponsor=1" } }, "node_modules/parse5-htmlparser2-tree-adapter": { - "version": "7.0.0", - "resolved": "https://registry.npmjs.org/parse5-htmlparser2-tree-adapter/-/parse5-htmlparser2-tree-adapter-7.0.0.tgz", - "integrity": "sha512-B77tOZrqqfUfnVcOrUvfdLbz4pu4RopLD/4vmu3HUPswwTA8OH0EMW9BlWR2B0RCoiZRAHEUu7IxeP1Pd1UU+g==", + "version": 
"7.1.0", + "resolved": "https://registry.npmjs.org/parse5-htmlparser2-tree-adapter/-/parse5-htmlparser2-tree-adapter-7.1.0.tgz", + "integrity": "sha512-ruw5xyKs6lrpo9x9rCZqZZnIUntICjQAd0Wsmp396Ul9lN/h+ifgVV1x1gZHi8euej6wTfpqX8j+BFQxF0NS/g==", "dev": true, + "license": "MIT", + "dependencies": { + "domhandler": "^5.0.3", + "parse5": "^7.0.0" + }, + "funding": { + "url": "https://github.com/inikulin/parse5?sponsor=1" + } + }, + "node_modules/parse5-parser-stream": { + "version": "7.1.2", + "resolved": "https://registry.npmjs.org/parse5-parser-stream/-/parse5-parser-stream-7.1.2.tgz", + "integrity": "sha512-JyeQc9iwFLn5TbvvqACIF/VXG6abODeB3Fwmv/TGdLk2LfbWkaySGY72at4+Ty7EkPZj854u4CrICqNk2qIbow==", + "dev": true, + "license": "MIT", "dependencies": { - "domhandler": "^5.0.2", "parse5": "^7.0.0" }, "funding": { @@ -9802,9 +9935,9 @@ } }, "node_modules/pyodide": { - "version": "0.27.2", - "resolved": "https://registry.npmjs.org/pyodide/-/pyodide-0.27.2.tgz", - "integrity": "sha512-sfA2kiUuQVRpWI4BYnU3sX5PaTTt/xrcVEmRzRcId8DzZXGGtPgCBC0gCqjUTUYSa8ofPaSjXmzESc86yvvCHg==", + "version": "0.27.3", + "resolved": "https://registry.npmjs.org/pyodide/-/pyodide-0.27.3.tgz", + "integrity": "sha512-6NwKEbPk0M3Wic2T1TCZijgZH9VE4RkHp1VGljS1sou0NjGdsmY2R/fG5oLmdDkjTRMI1iW7WYaY9pofX8gg1g==", "license": "Apache-2.0", "dependencies": { "ws": "^8.5.0" @@ -11472,7 +11605,6 @@ "resolved": "https://registry.npmjs.org/text-segmentation/-/text-segmentation-1.0.3.tgz", "integrity": "sha512-iOiPUo/BGnZ6+54OsWxZidGCsdU8YbE4PSpdPinp7DeMtUJNJBoJ/ouUSTJjHkh1KntHaltHl/gDs2FC4i5+Nw==", "license": "MIT", - "optional": true, "dependencies": { "utrie": "^1.0.2" } @@ -11508,15 +11640,6 @@ "xtend": "~4.0.1" } }, - "node_modules/tiny-glob": { - "version": "0.2.9", - "resolved": "https://registry.npmjs.org/tiny-glob/-/tiny-glob-0.2.9.tgz", - "integrity": "sha512-g/55ssRPUjShh+xkfx9UPDXqhckHEsHr4Vd9zX55oSdGZc/MD0m3sferOkwWtp98bv+kcVfEHtRJgBVJzelrzg==", - "dependencies": { - "globalyzer": "0.1.0", - "globrex": 
"^0.1.2" - } - }, "node_modules/tinybench": { "version": "2.8.0", "resolved": "https://registry.npmjs.org/tinybench/-/tinybench-2.8.0.tgz", @@ -11821,7 +11944,6 @@ "resolved": "https://registry.npmjs.org/utrie/-/utrie-1.0.2.tgz", "integrity": "sha512-1MLa5ouZiOmQzUbjbu9VmjLzn1QLXBhwpUa7kdLUQK+KQ5KA9I1vk5U4YHe/X2Ch7PYnJfWuWT+VbuxbGwljhw==", "license": "MIT", - "optional": true, "dependencies": { "base64-arraybuffer": "^1.0.2" } @@ -11963,9 +12085,9 @@ } }, "node_modules/vite": { - "version": "5.4.14", - "resolved": "https://registry.npmjs.org/vite/-/vite-5.4.14.tgz", - "integrity": "sha512-EK5cY7Q1D8JNhSaPKVK4pwBFvaTmZxEnoKXLG/U9gmdDcihQGNzFlgIvaxezFR4glP1LsuiedwMBqCXH3wZccA==", + "version": "5.4.15", + "resolved": "https://registry.npmjs.org/vite/-/vite-5.4.15.tgz", + "integrity": "sha512-6ANcZRivqL/4WtwPGTKNaosuNJr5tWiftOC7liM7G9+rMb8+oeJeyzymDu4rTN93seySBmbjSfsS3Vzr19KNtA==", "license": "MIT", "dependencies": { "esbuild": "^0.21.3", @@ -12692,6 +12814,29 @@ "resolved": "https://registry.npmjs.org/web-worker/-/web-worker-1.3.0.tgz", "integrity": "sha512-BSR9wyRsy/KOValMgd5kMyr3JzpdeoR9KVId8u5GVlTTAtNChlsE4yTxeY7zMdNSyOmoKBv8NH2qeRY9Tg+IaA==" }, + "node_modules/whatwg-encoding": { + "version": "3.1.1", + "resolved": "https://registry.npmjs.org/whatwg-encoding/-/whatwg-encoding-3.1.1.tgz", + "integrity": "sha512-6qN4hJdMwfYBtE3YBTTHhoeuUrDBPZmbQaxWAqSALV/MeEnR5z1xd8UKud2RAkFoPkmB+hli1TZSnyi84xz1vQ==", + "dev": true, + "license": "MIT", + "dependencies": { + "iconv-lite": "0.6.3" + }, + "engines": { + "node": ">=18" + } + }, + "node_modules/whatwg-mimetype": { + "version": "4.0.0", + "resolved": "https://registry.npmjs.org/whatwg-mimetype/-/whatwg-mimetype-4.0.0.tgz", + "integrity": "sha512-QaKxh0eNIi2mE9p2vEdzfagOKHCcj1pJ56EEHGQOVxp8r9/iszLUUV7v89x9O1p/T+NlTM5W7jW6+cz4Fq1YVg==", + "dev": true, + "license": "MIT", + "engines": { + "node": ">=18" + } + }, "node_modules/wheel": { "version": "1.0.0", "resolved": "https://registry.npmjs.org/wheel/-/wheel-1.0.0.tgz", diff 
--git a/package.json b/package.json index 2e4b905b19..465fbba0fe 100644 --- a/package.json +++ b/package.json @@ -1,6 +1,6 @@ { "name": "open-webui", - "version": "0.5.20", + "version": "0.6.0", "private": true, "scripts": { "dev": "npm run pyodide:fetch && vite dev --host", @@ -80,6 +80,8 @@ "file-saver": "^2.0.5", "fuse.js": "^7.0.0", "highlight.js": "^11.9.0", + "html-entities": "^2.5.3", + "html2canvas-pro": "^1.5.8", "i18next": "^23.10.0", "i18next-browser-languagedetector": "^7.2.0", "i18next-resources-to-backend": "^1.2.0", @@ -102,7 +104,7 @@ "prosemirror-schema-list": "^1.4.1", "prosemirror-state": "^1.4.3", "prosemirror-view": "^1.34.3", - "pyodide": "^0.27.2", + "pyodide": "^0.27.3", "socket.io-client": "^4.2.0", "sortablejs": "^1.15.2", "svelte-sonner": "^0.3.19", diff --git a/pyproject.toml b/pyproject.toml index 0666ac8a26..2e8537a770 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -51,7 +51,7 @@ dependencies = [ "langchain==0.3.19", "langchain-community==0.3.18", - "fake-useragent==1.5.1", + "fake-useragent==2.1.0", "chromadb==0.6.2", "pymilvus==2.5.0", "qdrant-client~=1.12.0", @@ -84,6 +84,7 @@ dependencies = [ "soundfile==0.13.1", "azure-ai-documentintelligence==1.0.0", + "pillow==11.1.0", "opencv-python-headless==4.11.0.86", "rapidocr-onnxruntime==1.3.24", "rank-bm25==0.2.2", diff --git a/src/app.css b/src/app.css index 8bdc6f1ade..4061d3b5eb 100644 --- a/src/app.css +++ b/src/app.css @@ -106,7 +106,7 @@ li p { } ::-webkit-scrollbar { - height: 0.4rem; + height: 0.8rem; width: 0.4rem; } diff --git a/src/lib/apis/index.ts b/src/lib/apis/index.ts index 3fb4a5d01b..015e1272ac 100644 --- a/src/lib/apis/index.ts +++ b/src/lib/apis/index.ts @@ -1,6 +1,9 @@ import { WEBUI_API_BASE_URL, WEBUI_BASE_URL } from '$lib/constants'; +import { convertOpenApiToToolPayload } from '$lib/utils'; import { getOpenAIModelsDirect } from './openai'; +import { toast } from 'svelte-sonner'; + export const getModels = async ( token: string = '', connections: object | 
null = null, @@ -114,6 +117,13 @@ export const getModels = async ( } } + const tags = apiConfig.tags; + if (tags) { + for (const model of models) { + model.tags = tags; + } + } + localModels = localModels.concat(models); } } @@ -249,6 +259,179 @@ export const stopTask = async (token: string, id: string) => { return res; }; +export const getToolServerData = async (token: string, url: string) => { + let error = null; + + const res = await fetch(`${url}/openapi.json`, { + method: 'GET', + headers: { + Accept: 'application/json', + 'Content-Type': 'application/json', + ...(token && { authorization: `Bearer ${token}` }) + } + }) + .then(async (res) => { + if (!res.ok) throw await res.json(); + return res.json(); + }) + .catch((err) => { + console.log(err); + if ('detail' in err) { + error = err.detail; + } else { + error = err; + } + return null; + }); + + if (error) { + throw error; + } + + const data = { + openapi: res, + info: res.info, + specs: convertOpenApiToToolPayload(res) + }; + + console.log(data); + return data; +}; + +export const getToolServersData = async (i18n, servers: object[]) => { + return ( + await Promise.all( + servers + .filter((server) => server?.config?.enable) + .map(async (server) => { + const data = await getToolServerData(server?.key, server?.url).catch((err) => { + toast.error( + i18n.t(`Failed to connect to {{URL}} OpenAPI tool server`, { + URL: server?.url + }) + ); + return null; + }); + + if (data) { + const { openapi, info, specs } = data; + return { + url: server?.url, + openapi: openapi, + info: info, + specs: specs + }; + } + }) + ) + ).filter((server) => server); +}; + +export const executeToolServer = async ( + token: string, + url: string, + name: string, + params: Record<string, any>, + serverData: { openapi: any; info: any; specs: any } +) => { + let error = null; + + try { + // Find the matching operationId in the OpenAPI spec + const matchingRoute = Object.entries(serverData.openapi.paths).find(([_, methods]) => + Object.entries(methods
as any).some(([__, operation]: any) => operation.operationId === name) + ); + + if (!matchingRoute) { + throw new Error(`No matching route found for operationId: ${name}`); + } + + const [routePath, methods] = matchingRoute; + + const methodEntry = Object.entries(methods as any).find( + ([_, operation]: any) => operation.operationId === name + ); + + if (!methodEntry) { + throw new Error(`No matching method found for operationId: ${name}`); + } + + const [httpMethod, operation]: [string, any] = methodEntry; + + // Split parameters by type + const pathParams: Record<string, any> = {}; + const queryParams: Record<string, any> = {}; + let bodyParams: any = {}; + + if (operation.parameters) { + operation.parameters.forEach((param: any) => { + const paramName = param.name; + const paramIn = param.in; + if (params.hasOwnProperty(paramName)) { + if (paramIn === 'path') { + pathParams[paramName] = params[paramName]; + } else if (paramIn === 'query') { + queryParams[paramName] = params[paramName]; + } + } + }); + } + + let finalUrl = `${url}${routePath}`; + + // Replace path parameters (`{param}`) + Object.entries(pathParams).forEach(([key, value]) => { + finalUrl = finalUrl.replace(new RegExp(`{${key}}`, 'g'), encodeURIComponent(value)); + }); + + // Append query parameters to URL if any + if (Object.keys(queryParams).length > 0) { + const queryString = new URLSearchParams( + Object.entries(queryParams).map(([k, v]) => [k, String(v)]) + ).toString(); + finalUrl += `?${queryString}`; + } + + // Handle requestBody composite + if (operation.requestBody && operation.requestBody.content) { + const contentType = Object.keys(operation.requestBody.content)[0]; + if (params !== undefined) { + bodyParams = params; + } else { + // Optional: Fallback or explicit error if body is expected but not provided + throw new Error(`Request body expected for operation '${name}' but none found.`); + } + } + + // Prepare headers and request options + const headers: Record<string, string> = { + 'Content-Type': 'application/json', +
...(token && { authorization: `Bearer ${token}` }) + }; + + let requestOptions: RequestInit = { + method: httpMethod.toUpperCase(), + headers + }; + + if (['post', 'put', 'patch'].includes(httpMethod.toLowerCase()) && operation.requestBody) { + requestOptions.body = JSON.stringify(bodyParams); + } + + const res = await fetch(finalUrl, requestOptions); + if (!res.ok) { + const resText = await res.text(); + throw new Error(`HTTP error! Status: ${res.status}. Message: ${resText}`); + } + + return await res.json(); + } catch (err: any) { + error = err.message; + console.error('API Request Error:', error); + return { error }; + } +}; + export const getTaskConfig = async (token: string = '') => { let error = null; diff --git a/src/lib/components/AddConnectionModal.svelte b/src/lib/components/AddConnectionModal.svelte index cbd90b68da..864d850a6a 100644 --- a/src/lib/components/AddConnectionModal.svelte +++ b/src/lib/components/AddConnectionModal.svelte @@ -14,6 +14,7 @@ import SensitiveInput from '$lib/components/common/SensitiveInput.svelte'; import Tooltip from '$lib/components/common/Tooltip.svelte'; import Switch from '$lib/components/common/Switch.svelte'; + import Tags from './common/Tags.svelte'; export let onSubmit: Function = () => {}; export let onDelete: Function = () => {}; @@ -31,6 +32,7 @@ let prefixId = ''; let enable = true; + let tags = []; let modelId = ''; let modelIds = []; @@ -77,17 +79,21 @@ const submitHandler = async () => { loading = true; - if (!ollama && (!url || !key)) { + if (!ollama && !url) { loading = false; - toast.error('URL and Key are required'); + toast.error('URL is required'); return; } + // remove trailing slash from url + url = url.replace(/\/$/, ''); + const connection = { url, key, config: { enable: enable, + tags: tags, prefix_id: prefixId, model_ids: modelIds } @@ -101,6 +107,7 @@ url = ''; key = ''; prefixId = ''; + tags = []; modelIds = []; }; @@ -110,6 +117,7 @@ key = connection.key; enable = connection.config?.enable ?? 
true; + tags = connection.config?.tags ?? []; prefixId = connection.config?.prefix_id ?? ''; modelIds = connection.config?.model_ids ?? []; } @@ -179,7 +187,7 @@ - + + + +
+
+
{ + e.preventDefault(); + submitHandler(); + }} + > +
+
+
+
{$i18n.t('URL')}
+ +
+ +
+
+ +
+ + + +
+
+ +
+ {$i18n.t(`WebUI will make requests to "{{url}}/openapi.json"`, { + url: url + })} +
+ +
+
+
{$i18n.t('Key')}
+ +
+ +
+
+
+
+ +
+ {#if edit} + + {/if} + + +
+
+
+
+ + diff --git a/src/lib/components/admin/Settings.svelte b/src/lib/components/admin/Settings.svelte index 60edbd25a0..76e3ae59d5 100644 --- a/src/lib/components/admin/Settings.svelte +++ b/src/lib/components/admin/Settings.svelte @@ -71,7 +71,7 @@ - - - - {/if}
@@ -725,42 +651,164 @@
-
{$i18n.t('Top K')}
+
{$i18n.t('Full Context Mode')}
- + + +
- {#if querySettings.hybrid === true} -
-
-
{$i18n.t('Minimum Score')}
+ {#if !RAG_FULL_CONTEXT} +
+
{$i18n.t('Hybrid Search')}
+
+ { + toggleHybridSearch(); + }} + /> +
+
+ + {#if querySettings.hybrid === true} +
+
{$i18n.t('Reranking Model')}
+ +
+
+
+ +
+ +
+
+
+ {/if} + +
+
{$i18n.t('Top K')}
+
+ +
+
+ + {#if querySettings.hybrid === true} +
+
{$i18n.t('Top K Reranker')}
-
- {$i18n.t( - 'Note: If you set a minimum score, the search will only return documents with a score greater than or equal to the minimum score.' - )} + {/if} + + {#if querySettings.hybrid === true} +
+
+
{$i18n.t('Minimum Score')}
+
+ +
+
+
+ {$i18n.t( + 'Note: If you set a minimum score, the search will only return documents with a score greater than or equal to the minimum score.' + )} +
-
+ {/if} {/if}
diff --git a/src/lib/components/admin/Settings/Evaluations/ArenaModelModal.svelte b/src/lib/components/admin/Settings/Evaluations/ArenaModelModal.svelte index 8dd898b1ad..08bc0c2a1e 100644 --- a/src/lib/components/admin/Settings/Evaluations/ArenaModelModal.svelte +++ b/src/lib/components/admin/Settings/Evaluations/ArenaModelModal.svelte @@ -10,6 +10,7 @@ import PencilSolid from '$lib/components/icons/PencilSolid.svelte'; import { toast } from 'svelte-sonner'; import AccessControl from '$lib/components/workspace/common/AccessControl.svelte'; + import ConfirmDialog from '$lib/components/common/ConfirmDialog.svelte'; export let show = false; export let edit = false; @@ -44,6 +45,7 @@ let imageInputElement; let loading = false; + let showDeleteConfirmDialog = false; const addModelHandler = () => { if (selectedModelId) { @@ -115,6 +117,14 @@ }); + { + dispatch('delete', model); + show = false; + }} +/> +
@@ -378,8 +388,7 @@ class="px-3.5 py-1.5 text-sm font-medium dark:bg-black dark:hover:bg-gray-950 dark:text-white bg-white text-black hover:bg-gray-100 transition rounded-full flex flex-row space-x-1 items-center" type="button" on:click={() => { - dispatch('delete', model); - show = false; + showDeleteConfirmDialog = true; }} > {$i18n.t('Delete')} diff --git a/src/lib/components/admin/Settings/General.svelte b/src/lib/components/admin/Settings/General.svelte index 8bb353f62c..5c50bf3110 100644 --- a/src/lib/components/admin/Settings/General.svelte +++ b/src/lib/components/admin/Settings/General.svelte @@ -554,7 +554,6 @@
@@ -610,6 +609,14 @@
+
+
+ {$i18n.t('User Webhooks')} +
+ + +
+
{$i18n.t('WebUI URL')}
diff --git a/src/lib/components/admin/Settings/Images.svelte b/src/lib/components/admin/Settings/Images.svelte index e63158bcd2..88039a0e39 100644 --- a/src/lib/components/admin/Settings/Images.svelte +++ b/src/lib/components/admin/Settings/Images.svelte @@ -191,11 +191,15 @@ } if (config.comfyui.COMFYUI_WORKFLOW) { - config.comfyui.COMFYUI_WORKFLOW = JSON.stringify( - JSON.parse(config.comfyui.COMFYUI_WORKFLOW), - null, - 2 - ); + try { + config.comfyui.COMFYUI_WORKFLOW = JSON.stringify( + JSON.parse(config.comfyui.COMFYUI_WORKFLOW), + null, + 2 + ); + } catch (e) { + console.log(e); + } } requiredWorkflowNodes = requiredWorkflowNodes.map((node) => { diff --git a/src/lib/components/admin/Settings/Models.svelte b/src/lib/components/admin/Settings/Models.svelte index 96ed282d51..0fa069891d 100644 --- a/src/lib/components/admin/Settings/Models.svelte +++ b/src/lib/components/admin/Settings/Models.svelte @@ -29,6 +29,12 @@ import Wrench from '$lib/components/icons/Wrench.svelte'; import ArrowDownTray from '$lib/components/icons/ArrowDownTray.svelte'; import ManageModelsModal from './Models/ManageModelsModal.svelte'; + import ModelMenu from '$lib/components/admin/Settings/Models/ModelMenu.svelte'; + import EllipsisHorizontal from '$lib/components/icons/EllipsisHorizontal.svelte'; + import EyeSlash from '$lib/components/icons/EyeSlash.svelte'; + import Eye from '$lib/components/icons/Eye.svelte'; + + let shiftKey = false; let importFiles; let modelsImportInputElement: HTMLInputElement; @@ -146,8 +152,62 @@ ); }; + const hideModelHandler = async (model) => { + model.meta = { + ...model.meta, + hidden: !(model?.meta?.hidden ?? false) + }; + + console.log(model); + + toast.success( + model.meta.hidden + ? 
$i18n.t(`Model {{name}} is now hidden`, { + name: model.id + }) + : $i18n.t(`Model {{name}} is now visible`, { + name: model.id + }) + ); + + upsertModelHandler(model); + }; + + const exportModelHandler = async (model) => { + let blob = new Blob([JSON.stringify([model])], { + type: 'application/json' + }); + saveAs(blob, `${model.id}-${Date.now()}.json`); + }; + onMount(async () => { - init(); + await init(); + + const onKeyDown = (event) => { + if (event.key === 'Shift') { + shiftKey = true; + } + }; + + const onKeyUp = (event) => { + if (event.key === 'Shift') { + shiftKey = false; + } + }; + + const onBlur = () => { + shiftKey = false; + }; + + window.addEventListener('keydown', onKeyDown); + window.addEventListener('keyup', onKeyUp); + window.addEventListener('blur-sm', onBlur); + + return () => { + window.removeEventListener('keydown', onKeyDown); + window.removeEventListener('keyup', onKeyUp); + window.removeEventListener('blur-sm', onBlur); + }; }); @@ -211,7 +271,10 @@ {#if models.length > 0} {#each filteredModels as model, modelIdx (model.id)}
- - -
- - { - toggleModelHandler(model); + {#if shiftKey} + + -
+ {:else} + + + { + exportModelHandler(model); + }} + hideHandler={() => { + hideModelHandler(model); + }} + onClose={() => {}} + > + + + +
+ + { + toggleModelHandler(model); + }} + /> + +
+ {/if}
{/each} diff --git a/src/lib/components/admin/Settings/Models/ModelList.svelte b/src/lib/components/admin/Settings/Models/ModelList.svelte index 0b70d43d91..4a261ce5a5 100644 --- a/src/lib/components/admin/Settings/Models/ModelList.svelte +++ b/src/lib/components/admin/Settings/Models/ModelList.svelte @@ -33,6 +33,7 @@ if (modelListElement) { sortable = Sortable.create(modelListElement, { animation: 150, + handle: '.item-handle', onUpdate: async (event) => { positionChangeHandler(); } @@ -47,7 +48,7 @@
- +
{#if $models.find((model) => model.id === modelId)} diff --git a/src/lib/components/admin/Settings/Models/ModelMenu.svelte b/src/lib/components/admin/Settings/Models/ModelMenu.svelte new file mode 100644 index 0000000000..88465e42e2 --- /dev/null +++ b/src/lib/components/admin/Settings/Models/ModelMenu.svelte @@ -0,0 +1,116 @@ + + + { + if (e.detail === false) { + onClose(); + } + }} +> + + + + +
+ + { + hideHandler(); + }} + > + {#if model?.meta?.hidden ?? false} + + + + {:else} + + + + + {/if} + +
+ {#if model?.meta?.hidden ?? false} + {$i18n.t('Show Model')} + {:else} + {$i18n.t('Hide Model')} + {/if} +
+
+ + { + exportHandler(); + }} + > + + +
{$i18n.t('Export')}
+
+
+
+
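The `executeToolServer` helper added to `src/lib/apis/index.ts` earlier in this diff resolves a tool call by scanning the OpenAPI `paths` object for the operation whose `operationId` matches the requested name, then substitutes `{param}` path segments and appends the remaining parameters as a query string. A standalone sketch of that lookup, using a made-up spec fragment (the spec object and function names here are illustrative, not part of the PR):

```typescript
// Resolve an operationId to its route, mirroring the lookup in executeToolServer.
type Resolved = { path: string; method: string; operation: any };

const resolveOperation = (openapi: any, name: string): Resolved | null => {
	for (const [path, methods] of Object.entries<any>(openapi.paths ?? {})) {
		for (const [method, operation] of Object.entries<any>(methods)) {
			if (operation?.operationId === name) return { path, method, operation };
		}
	}
	return null;
};

// Substitute `{param}` segments, then append the leftover values as a query string.
const buildUrl = (base: string, path: string, params: Record<string, any>): string => {
	let url = base + path.replace(/\{(\w+)\}/g, (_, key) => encodeURIComponent(params[key]));
	const query = Object.entries(params)
		.filter(([key]) => !path.includes(`{${key}}`))
		.map(([k, v]) => [k, String(v)]);
	if (query.length > 0) url += `?${new URLSearchParams(query).toString()}`;
	return url;
};

// Hypothetical spec fragment for illustration only.
const spec = {
	paths: {
		'/users/{id}': { get: { operationId: 'getUser' } },
		'/search': { get: { operationId: 'search' } }
	}
};

console.log(resolveOperation(spec, 'getUser')?.path); // '/users/{id}'
console.log(buildUrl('https://api.example.com', '/users/{id}', { id: 42 })); // 'https://api.example.com/users/42'
```

The real helper additionally partitions parameters by their declared `in` location (`path` vs. `query`) before building the URL; this sketch keeps only the routing logic.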
diff --git a/src/lib/components/admin/Settings/WebSearch.svelte b/src/lib/components/admin/Settings/WebSearch.svelte index 0ae56cae3a..cec4cabe65 100644 --- a/src/lib/components/admin/Settings/WebSearch.svelte +++ b/src/lib/components/admin/Settings/WebSearch.svelte @@ -462,8 +462,12 @@
diff --git a/src/lib/components/admin/Users/Groups.svelte b/src/lib/components/admin/Users/Groups.svelte index 89b4141d6b..15497cb205 100644 --- a/src/lib/components/admin/Users/Groups.svelte +++ b/src/lib/components/admin/Users/Groups.svelte @@ -52,12 +52,19 @@ prompts: false, tools: false }, + sharing: { + public_models: false, + public_knowledge: false, + public_prompts: false, + public_tools: false + }, chat: { controls: true, file_upload: true, delete: true, edit: true, - temporary: true + temporary: true, + temporary_enforced: true }, features: { web_search: true, diff --git a/src/lib/components/admin/Users/Groups/EditGroupModal.svelte b/src/lib/components/admin/Users/Groups/EditGroupModal.svelte index 2e2fe60718..e492cc9b6d 100644 --- a/src/lib/components/admin/Users/Groups/EditGroupModal.svelte +++ b/src/lib/components/admin/Users/Groups/EditGroupModal.svelte @@ -9,6 +9,7 @@ import Users from './Users.svelte'; import UserPlusSolid from '$lib/components/icons/UserPlusSolid.svelte'; import WrenchSolid from '$lib/components/icons/WrenchSolid.svelte'; + import ConfirmDialog from '$lib/components/common/ConfirmDialog.svelte'; export let onSubmit: Function = () => {}; export let onDelete: Function = () => {}; @@ -25,6 +26,7 @@ let selectedTab = 'general'; let loading = false; + let showDeleteConfirmDialog = false; export let name = ''; export let description = ''; @@ -88,6 +90,14 @@ }); + { + onDelete(); + show = false; + }} +/> +
@@ -263,18 +273,19 @@ {/if}
--> -
+
{#if edit} + {:else} +
{/if}

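The Groups changes above extend the default permissions object with a `sharing` section (`public_models`, `public_knowledge`, `public_prompts`, `public_tools`) and a `temporary_enforced` chat flag. Groups saved before this change won't have the new keys, so readers typically merge stored permissions over the defaults. A hypothetical merge helper (not code from this PR; key names follow the diff, the helper is illustrative):

```typescript
type Permissions = Record<string, Record<string, boolean>>;

// Defaults mirroring the new sections added in Groups.svelte above.
const defaultPermissions: Permissions = {
	sharing: {
		public_models: false,
		public_knowledge: false,
		public_prompts: false,
		public_tools: false
	},
	chat: { temporary: true, temporary_enforced: true }
};

// Merge a group's stored permissions over the defaults, section by section,
// so groups created before this change still receive the new keys.
const mergePermissions = (stored: Partial<Permissions>): Permissions => {
	const merged: Permissions = {};
	for (const [section, flags] of Object.entries(defaultPermissions)) {
		merged[section] = { ...flags, ...(stored[section] ?? {}) };
	}
	return merged;
};

console.log(mergePermissions({ sharing: { public_models: true } }).sharing.public_models); // true
```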
diff --git a/src/lib/components/admin/Users/UserList/UserChatsModal.svelte b/src/lib/components/admin/Users/UserList/UserChatsModal.svelte index a5248d9bb5..ca9edd2611 100644 --- a/src/lib/components/admin/Users/UserList/UserChatsModal.svelte +++ b/src/lib/components/admin/Users/UserList/UserChatsModal.svelte @@ -12,6 +12,7 @@ import Modal from '$lib/components/common/Modal.svelte'; import Tooltip from '$lib/components/common/Tooltip.svelte'; import Spinner from '$lib/components/common/Spinner.svelte'; + import ConfirmDialog from '$lib/components/common/ConfirmDialog.svelte'; const i18n = getContext('i18n'); @@ -19,6 +20,8 @@ export let user; let chats = null; + let showDeleteConfirmDialog = false; + let chatToDelete = null; const deleteChatHandler = async (chatId) => { const res = await deleteChatById(localStorage.token, chatId).catch((error) => { @@ -50,6 +53,16 @@ } + { + if (chatToDelete) { + deleteChatHandler(chatToDelete); + chatToDelete = null; + } + }} +/> +
@@ -142,7 +155,8 @@
- This channel was created on {dayjs(channel.created_at / 1000000).format( - 'MMMM D, YYYY' - )}. This is the very beginning of the {channel.name} - channel. + {$i18n.t( + 'This channel was created on {{createdAt}}. This is the very beginning of the {{channelName}} channel.', + { + createdAt: dayjs(channel.created_at / 1000000).format('MMMM D, YYYY'), + channelName: channel.name + } + )}
{:else} diff --git a/src/lib/components/chat/Chat.svelte b/src/lib/components/chat/Chat.svelte index ca766c9f76..c168d628ea 100644 --- a/src/lib/components/chat/Chat.svelte +++ b/src/lib/components/chat/Chat.svelte @@ -35,7 +35,8 @@ showOverview, chatTitle, showArtifacts, - tools + tools, + toolServers } from '$lib/stores'; import { convertMessagesToHistory, @@ -119,6 +120,7 @@ let imageGenerationEnabled = false; let webSearchEnabled = false; let codeInterpreterEnabled = false; + let chat = null; let tags = []; @@ -212,7 +214,14 @@ const _chatId = JSON.parse(JSON.stringify($chatId)); let _messageId = JSON.parse(JSON.stringify(message.id)); - let messageChildrenIds = history.messages[_messageId].childrenIds; + let messageChildrenIds = []; + if (_messageId === null) { + messageChildrenIds = Object.keys(history.messages).filter( + (id) => history.messages[id].parentId === null + ); + } else { + messageChildrenIds = history.messages[_messageId].childrenIds; + } while (messageChildrenIds.length !== 0) { _messageId = messageChildrenIds.at(-1); @@ -286,18 +295,10 @@ } else if (type === 'chat:tags') { chat = await getChatById(localStorage.token, $chatId); allTags.set(await getAllTags(localStorage.token)); - } else if (type === 'message') { + } else if (type === 'chat:message:delta' || type === 'message') { message.content += data.content; - } else if (type === 'replace') { + } else if (type === 'chat:message' || type === 'replace') { message.content = data.content; - } else if (type === 'action') { - if (data.action === 'continue') { - const continueButton = document.getElementById('continue-response-button'); - - if (continueButton) { - continueButton.click(); - } - } } else if (type === 'confirmation') { eventCallback = cb; @@ -384,7 +385,7 @@ if (event.data.type === 'input:prompt:submit') { console.debug(event.data.text); - if (prompt !== '') { + if (event.data.text !== '') { await tick(); submitPrompt(event.data.text); } @@ -887,6 +888,8 @@ await chats.set(await 
getChatList(localStorage.token, $currentChatPage)); } } + + taskId = null; }; const chatActionHandler = async (chatId, actionId, modelId, responseMessageId, event = null) => { @@ -1276,12 +1279,13 @@ prompt = ''; // Reset chat input textarea - const chatInputElement = document.getElementById('chat-input'); + if (!($settings?.richTextInput ?? true)) { + const chatInputElement = document.getElementById('chat-input'); - if (chatInputElement) { - await tick(); - chatInputElement.style.height = ''; - chatInputElement.style.height = Math.min(chatInputElement.scrollHeight, 320) + 'px'; + if (chatInputElement) { + await tick(); + chatInputElement.style.height = ''; + } } const _files = JSON.parse(JSON.stringify(files)); @@ -1563,6 +1567,7 @@ files: (files?.length ?? 0) > 0 ? files : undefined, tool_ids: selectedToolIds.length > 0 ? selectedToolIds : undefined, + tool_servers: $toolServers, features: { image_generation: @@ -1621,7 +1626,7 @@ : {}) }, `${WEBUI_BASE_URL}/api` - ).catch((error) => { + ).catch(async (error) => { toast.error(`${error}`); responseMessage.error = { @@ -1634,10 +1639,12 @@ return null; }); - console.log(res); - if (res) { - taskId = res.task_id; + if (res.error) { + await handleOpenAIError(res.error, responseMessage); + } else { + taskId = res.task_id; + } } await tick(); @@ -1654,9 +1661,11 @@ console.error(innerError); if ('detail' in innerError) { + // FastAPI error toast.error(innerError.detail); errorMessage = innerError.detail; } else if ('error' in innerError) { + // OpenAI error if ('message' in innerError.error) { toast.error(innerError.error.message); errorMessage = innerError.error.message; @@ -1665,6 +1674,7 @@ errorMessage = innerError.error; } } else if ('message' in innerError) { + // OpenAI error toast.error(innerError.message); errorMessage = innerError.message; } @@ -1683,9 +1693,10 @@ history.messages[responseMessage.id] = responseMessage; }; - const stopResponse = () => { + const stopResponse = async () => { if (taskId) { - 
const res = stopTask(localStorage.token, taskId).catch((error) => { + const res = await stopTask(localStorage.token, taskId).catch((error) => { + toast.error(`${error}`); return null; }); @@ -2031,6 +2042,7 @@ bind:codeInterpreterEnabled bind:webSearchEnabled bind:atSelectedModel + toolServers={$toolServers} transparentBackground={$settings?.backgroundImageUrl ?? false} {stopResponse} {createMessagePair} @@ -2084,6 +2096,7 @@ bind:webSearchEnabled bind:atSelectedModel transparentBackground={$settings?.backgroundImageUrl ?? false} + toolServers={$toolServers} {stopResponse} {createMessagePair} on:upload={async (e) => { diff --git a/src/lib/components/chat/ContentRenderer/FloatingButtons.svelte b/src/lib/components/chat/ContentRenderer/FloatingButtons.svelte index 6dc48529bc..9286aaed0e 100644 --- a/src/lib/components/chat/ContentRenderer/FloatingButtons.svelte +++ b/src/lib/components/chat/ContentRenderer/FloatingButtons.svelte @@ -263,7 +263,7 @@
{:else}
- {#if $user.role === 'admin' || $user?.permissions.chat?.controls} -
- {#if chatFiles.length > 0} - -
- {#each chatFiles as file, fileIdx} - { - // Remove the file from the chatFiles array +
+ {#if chatFiles.length > 0} + +
+ {#each chatFiles as file, fileIdx} + { + // Remove the file from the chatFiles array - chatFiles.splice(fileIdx, 1); - chatFiles = chatFiles; - }} - on:click={() => { - console.log(file); - }} - /> - {/each} -
-
- -
- {/if} - - -
- + chatFiles.splice(fileIdx, 1); + chatFiles = chatFiles; + }} + on:click={() => { + console.log(file); + }} + /> + {/each}

+ {/if} + + +
+ +
+
+ + {#if $user.role === 'admin' || $user?.permissions.chat?.controls} +
@@ -90,10 +90,6 @@
-
- {:else} -
- {$i18n.t('You do not have permission to access this feature.')} -
- {/if} + {/if} +
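Earlier in this diff, the socket event handler in `Chat.svelte` was widened to accept both the legacy event types (`message`, `replace`) and the new namespaced ones (`chat:message:delta`, `chat:message`). A minimal pure-function sketch of that dispatch, with the event shape simplified for illustration:

```typescript
// Simplified event shape; the real events carry more fields.
type ChatEvent = { type: string; data: { content: string } };

// Apply a server event to the in-progress message content, accepting both
// the legacy and the namespaced event type names, as the diff above does.
const applyEvent = (content: string, event: ChatEvent): string => {
	const { type, data } = event;
	if (type === 'chat:message:delta' || type === 'message') {
		return content + data.content; // append streamed delta
	} else if (type === 'chat:message' || type === 'replace') {
		return data.content; // replace the full content
	}
	return content; // unrelated event types leave the message untouched
};

console.log(applyEvent('Hel', { type: 'message', data: { content: 'lo' } })); // 'Hello'
```

Accepting both spellings keeps older backends working while new ones emit the namespaced types.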
diff --git a/src/lib/components/chat/MessageInput.svelte b/src/lib/components/chat/MessageInput.svelte index 7db31010b6..9251ba4e4b 100644 --- a/src/lib/components/chat/MessageInput.svelte +++ b/src/lib/components/chat/MessageInput.svelte @@ -46,6 +46,7 @@ import Photo from '../icons/Photo.svelte'; import CommandLine from '../icons/CommandLine.svelte'; import { KokoroWorker } from '$lib/workers/KokoroWorker'; + import ToolServersModal from './ToolServersModal.svelte'; const i18n = getContext('i18n'); @@ -68,6 +69,8 @@ export let prompt = ''; export let files = []; + export let toolServers = []; + export let selectedToolIds = []; export let imageGenerationEnabled = false; @@ -82,6 +85,8 @@ webSearchEnabled }); + let showToolServers = false; + let loaded = false; let recording = false; @@ -343,6 +348,8 @@ + + {#if loaded}
@@ -417,54 +424,6 @@
{/if} - {#if webSearchEnabled || ($config?.features?.enable_web_search && ($settings?.webSearch ?? false)) === 'always'} -
-
-
- - - - -
-
{$i18n.t('Search the internet')}
-
-
- {/if} - - {#if imageGenerationEnabled} -
-
-
- - - - -
-
{$i18n.t('Generate an image')}
-
-
- {/if} - - {#if codeInterpreterEnabled} -
-
-
- - - - -
-
{$i18n.t('Execute code for analysis')}
-
-
- {/if} - {#if atSelectedModel !== undefined}
@@ -576,7 +535,7 @@ }} >
{#if files.length > 0} @@ -687,7 +646,8 @@ ))} placeholder={placeholder ? placeholder : $i18n.t('Send a Message')} largeTextAsFile={$settings?.largeTextAsFile ?? false} - autocomplete={$config?.features.enable_autocomplete_generation} + autocomplete={$config?.features?.enable_autocomplete_generation && + ($settings?.promptAutocomplete ?? false)} generateAutoCompletion={async (text) => { if (selectedModelIds.length === 0 || !selectedModelIds.at(0)) { toast.error($i18n.t('Please select a model first.')); @@ -895,7 +855,6 @@ on:keydown={async (e) => { const isCtrlPressed = e.ctrlKey || e.metaKey; // metaKey is for Cmd key on Mac - console.log('keydown', e); const commandsContainerElement = document.getElementById('commands-container'); @@ -997,7 +956,6 @@ return; } - console.log('keypress', e); // Prevent Enter key from creating a new line const isCtrlPressed = e.ctrlKey || e.metaKey; const enterPressed = @@ -1175,14 +1133,14 @@ @@ -1195,13 +1153,13 @@ on:click|preventDefault={() => (imageGenerationEnabled = !imageGenerationEnabled)} type="button" - class="px-1.5 @sm:px-2.5 py-1.5 flex gap-1.5 items-center text-sm rounded-full font-medium transition-colors duration-300 focus:outline-hidden max-w-full overflow-hidden {imageGenerationEnabled + class="px-1.5 @xl:px-2.5 py-1.5 flex gap-1.5 items-center text-sm rounded-full font-medium transition-colors duration-300 focus:outline-hidden max-w-full overflow-hidden {imageGenerationEnabled ? 
'bg-gray-100 dark:bg-gray-500/20 text-gray-600 dark:text-gray-400' : 'bg-transparent text-gray-600 dark:text-gray-300 border-gray-200 hover:bg-gray-100 dark:hover:bg-gray-800 '}" > @@ -1214,13 +1172,13 @@ on:click|preventDefault={() => (codeInterpreterEnabled = !codeInterpreterEnabled)} type="button" - class="px-1.5 @sm:px-2.5 py-1.5 flex gap-1.5 items-center text-sm rounded-full font-medium transition-colors duration-300 focus:outline-hidden max-w-full overflow-hidden {codeInterpreterEnabled + class="px-1.5 @xl:px-2.5 py-1.5 flex gap-1.5 items-center text-sm rounded-full font-medium transition-colors duration-300 focus:outline-hidden max-w-full overflow-hidden {codeInterpreterEnabled ? 'bg-gray-100 dark:bg-gray-500/20 text-gray-600 dark:text-gray-400' : 'bg-transparent text-gray-600 dark:text-gray-300 border-gray-200 hover:bg-gray-100 dark:hover:bg-gray-800 '}" > @@ -1231,6 +1189,47 @@
+ {#if toolServers.length > 0} + + + + {/if} + {#if !history?.currentId || history.messages[history.currentId]?.done == true}
diff --git a/src/lib/components/chat/MessageInput/Commands/Prompts.svelte b/src/lib/components/chat/MessageInput/Commands/Prompts.svelte index bb91e00a87..0e7a601e4b 100644 --- a/src/lib/components/chat/MessageInput/Commands/Prompts.svelte +++ b/src/lib/components/chat/MessageInput/Commands/Prompts.svelte @@ -1,5 +1,5 @@ + + + + +
+

+ + {alert.type} +

+ +
diff --git a/src/lib/components/chat/Messages/Markdown/MarkdownInlineTokens.svelte b/src/lib/components/chat/Messages/Markdown/MarkdownInlineTokens.svelte index dc24b6dee6..7693e4cb43 100644 --- a/src/lib/components/chat/Messages/Markdown/MarkdownInlineTokens.svelte +++ b/src/lib/components/chat/Messages/Markdown/MarkdownInlineTokens.svelte @@ -31,7 +31,7 @@ {:else if token.text.includes(` {:else} - {token.text} + {@html html} {/if} {:else if token.type === 'link'} {#if token.tokens} diff --git a/src/lib/components/chat/Messages/Markdown/MarkdownTokens.svelte b/src/lib/components/chat/Messages/Markdown/MarkdownTokens.svelte index 8de359a26c..2d5e7a30ea 100644 --- a/src/lib/components/chat/Messages/Markdown/MarkdownTokens.svelte +++ b/src/lib/components/chat/Messages/Markdown/MarkdownTokens.svelte @@ -14,10 +14,13 @@ import CodeBlock from '$lib/components/chat/Messages/CodeBlock.svelte'; import MarkdownInlineTokens from '$lib/components/chat/Messages/Markdown/MarkdownInlineTokens.svelte'; import KatexRenderer from './KatexRenderer.svelte'; + import AlertRenderer, { alertComponent } from './AlertRenderer.svelte'; import Collapsible from '$lib/components/common/Collapsible.svelte'; import Tooltip from '$lib/components/common/Tooltip.svelte'; import ArrowDownTray from '$lib/components/icons/ArrowDownTray.svelte'; + import Source from './Source.svelte'; + import { settings } from '$lib/stores'; const dispatch = createEventDispatcher(); @@ -84,6 +87,7 @@ {#if token.raw.includes('```')} -
+
-
+
{:else if token.type === 'blockquote'} -
- -
+ {@const alert = alertComponent(token)} + {#if alert} + + {:else} +
+ +
+ {/if} {:else if token.type === 'list'} {#if token.ordered}
    @@ -242,6 +251,7 @@ {:else if token.type === 'details'} message.parentId === null) .map((message) => message.id) ?? [])} + {gotoMessage} {showPreviousMessage} {showNextMessage} {editMessage} @@ -70,6 +72,7 @@ {messageId} isLastMessage={messageId === history.currentId} siblings={history.messages[history.messages[messageId].parentId]?.childrenIds ?? []} + {gotoMessage} {showPreviousMessage} {showNextMessage} {updateChat} diff --git a/src/lib/components/chat/Messages/MultiResponseMessages.svelte b/src/lib/components/chat/Messages/MultiResponseMessages.svelte index 1a8ceda79b..c46be0e83a 100644 --- a/src/lib/components/chat/Messages/MultiResponseMessages.svelte +++ b/src/lib/components/chat/Messages/MultiResponseMessages.svelte @@ -58,6 +58,35 @@ } } + const gotoMessage = async (modelIdx, messageIdx) => { + // Clamp messageIdx to ensure it's within valid range + groupedMessageIdsIdx[modelIdx] = Math.max( + 0, + Math.min(messageIdx, groupedMessageIds[modelIdx].messageIds.length - 1) + ); + + // Get the messageId at the specified index + let messageId = groupedMessageIds[modelIdx].messageIds[groupedMessageIdsIdx[modelIdx]]; + console.log(messageId); + + // Traverse the branch to find the deepest child message + let messageChildrenIds = history.messages[messageId].childrenIds; + while (messageChildrenIds.length !== 0) { + messageId = messageChildrenIds.at(-1); + messageChildrenIds = history.messages[messageId].childrenIds; + } + + // Update the current message ID in history + history.currentId = messageId; + + // Await UI updates + await tick(); + await updateChat(); + + // Trigger scrolling after navigation + triggerScroll(); + }; + const showPreviousMessage = async (modelIdx) => { groupedMessageIdsIdx[modelIdx] = Math.max(0, groupedMessageIdsIdx[modelIdx] - 1); @@ -224,6 +253,7 @@ messageId={_messageId} isLastMessage={true} siblings={groupedMessageIds[modelIdx].messageIds} + gotoMessage={(message, messageIdx) => gotoMessage(modelIdx, messageIdx)} 
showPreviousMessage={() => showPreviousMessage(modelIdx)} showNextMessage={() => showNextMessage(modelIdx)} {updateChat} diff --git a/src/lib/components/chat/Messages/ResponseMessage.svelte b/src/lib/components/chat/Messages/ResponseMessage.svelte index 91e788b8ea..a8c2e7c9fa 100644 --- a/src/lib/components/chat/Messages/ResponseMessage.svelte +++ b/src/lib/components/chat/Messages/ResponseMessage.svelte @@ -5,7 +5,7 @@ import { createEventDispatcher } from 'svelte'; import { onMount, tick, getContext } from 'svelte'; import type { Writable } from 'svelte/store'; - import type { i18n as i18nType } from 'i18next'; + import type { i18n as i18nType, t } from 'i18next'; const i18n = getContext>('i18n'); @@ -110,6 +110,7 @@ export let siblings; + export let gotoMessage: Function = () => {}; export let showPreviousMessage: Function; export let showNextMessage: Function; @@ -139,6 +140,8 @@ let editedContent = ''; let editTextAreaElement: HTMLTextAreaElement; + let messageIndexEdit = false; + let audioParts: Record = {}; let speaking = false; let speakingIdx: number | undefined; @@ -559,7 +562,7 @@
    - + {model?.name ?? message.model} @@ -739,7 +742,7 @@ {history} content={message.content} sources={message.sources} - floatingButtons={message?.done} + floatingButtons={message?.done && !readOnly} save={!readOnly} {model} onTaskClick={async (e) => { @@ -748,7 +751,9 @@ onSourceClick={async (id, idx) => { console.log(id, idx); let sourceButton = document.getElementById(`source-${message.id}-${idx}`); - const sourcesCollapsible = document.getElementById(`collapsible-sources`); + const sourcesCollapsible = document.getElementById( + `collapsible-${message.id}` + ); if (sourceButton) { sourceButton.click(); @@ -844,11 +849,50 @@ -
    - {siblings.indexOf(message.id) + 1}/{siblings.length} -
    + {#if messageIndexEdit} +
    + { + e.target.select(); + }} + on:blur={(e) => { + gotoMessage(message, e.target.value - 1); + messageIndexEdit = false; + }} + on:keydown={(e) => { + if (e.key === 'Enter') { + gotoMessage(message, e.target.value - 1); + messageIndexEdit = false; + } + }} + class="bg-transparent font-semibold self-center dark:text-gray-100 min-w-fit outline-hidden" + />/{siblings.length} +
    + {:else} + +
    { + messageIndexEdit = true; + + await tick(); + const input = document.getElementById(`message-index-input-${message.id}`); + if (input) { + input.focus(); + input.select(); + } + }} + > + {siblings.indexOf(message.id) + 1}/{siblings.length} +
    + {/if} -
    - {siblings.indexOf(message.id) + 1}/{siblings.length} -
    + {#if messageIndexEdit} +
    + { + e.target.select(); + }} + on:blur={(e) => { + gotoMessage(message, e.target.value - 1); + messageIndexEdit = false; + }} + on:keydown={(e) => { + if (e.key === 'Enter') { + gotoMessage(message, e.target.value - 1); + messageIndexEdit = false; + } + }} + class="bg-transparent font-semibold self-center dark:text-gray-100 min-w-fit outline-hidden" + />/{siblings.length} +
    + {:else} + +
    { + messageIndexEdit = true; + + await tick(); + const input = document.getElementById( + `message-index-input-${message.id}` + ); + if (input) { + input.focus(); + input.select(); + } + }} + > + {siblings.indexOf(message.id) + 1}/{siblings.length} +
    + {/if} -
    - {siblings.indexOf(message.id) + 1}/{siblings.length} -
    + {#if messageIndexEdit} +
    + { + e.target.select(); + }} + on:blur={(e) => { + gotoMessage(message, e.target.value - 1); + messageIndexEdit = false; + }} + on:keydown={(e) => { + if (e.key === 'Enter') { + gotoMessage(message, e.target.value - 1); + messageIndexEdit = false; + } + }} + class="bg-transparent font-semibold self-center dark:text-gray-100 min-w-fit outline-hidden" + />/{siblings.length} +
    + {:else} + +
    { + messageIndexEdit = true; + + await tick(); + const input = document.getElementById( + `message-index-input-${message.id}` + ); + if (input) { + input.focus(); + input.select(); + } + }} + > + {siblings.indexOf(message.id) + 1}/{siblings.length} +
    + {/if} + {#if items.find((item) => item.model?.owned_by === 'ollama') && items.find((item) => item.model?.owned_by === 'openai')} + + + {/if} + + {#if items.find((item) => item.model?.direct)} + + {/if} + {#each tags as tag}
    {/if} - {#each filteredItems as item, index} + {#each filteredItems.filter((item) => !(item.model?.info?.meta?.hidden ?? false)) as item, index} - {:else if $mobile && ($user.role === 'admin' || $user?.permissions?.chat?.controls)} - - - {/if} - {#if !$mobile && ($user.role === 'admin' || $user?.permissions?.chat?.controls)} - - - - {/if} + + +
-
-
-
{$i18n.t('Notification Webhook')}
+ {#if $config?.features?.enable_user_webhooks} +
+
+
{$i18n.t('Notification Webhook')}
-
- +
+ +
-
+ {/if}
diff --git a/src/lib/components/chat/Settings/Advanced/AdvancedParams.svelte b/src/lib/components/chat/Settings/Advanced/AdvancedParams.svelte index 59d230d1b7..67b1f4dc10 100644 --- a/src/lib/components/chat/Settings/Advanced/AdvancedParams.svelte +++ b/src/lib/components/chat/Settings/Advanced/AdvancedParams.svelte @@ -961,6 +961,7 @@
{$i18n.t('Context Length')} + {$i18n.t('(Ollama)')}
+ +
+ +
+ {#each servers as server, idx} + { + updateHandler(); + }} + onDelete={() => { + servers = servers.filter((_, i) => i !== idx); + updateHandler(); + }} + /> + {/each} +
+
+ +
+
+ {$i18n.t('Connect to your own OpenAPI compatible external tool servers.')} +
+ {$i18n.t( + 'CORS must be properly configured by the provider to allow requests from Open WebUI.' + )} +
+
+
+
+ {:else} +
+
+ +
+
+ {/if} +
+ +
+ +
+ diff --git a/src/lib/components/chat/Settings/Tools/Connection.svelte b/src/lib/components/chat/Settings/Tools/Connection.svelte new file mode 100644 index 0000000000..426b6a1374 --- /dev/null +++ b/src/lib/components/chat/Settings/Tools/Connection.svelte @@ -0,0 +1,97 @@ + + + { + showDeleteConfirmDialog = true; + }} + onSubmit={(connection) => { + url = connection.url; + key = connection.key; + config = connection.config; + onSubmit(connection); + }} +/> + + { + onDelete(); + showConfigModal = false; + }} +/> + +
+ + {#if !(config?.enable ?? true)} +
+ {/if} +
+
+ +
+ + +
+
+ +
+ + + +
+
diff --git a/src/lib/components/chat/SettingsModal.svelte b/src/lib/components/chat/SettingsModal.svelte index 20d70505fe..e3b20c2db7 100644 --- a/src/lib/components/chat/SettingsModal.svelte +++ b/src/lib/components/chat/SettingsModal.svelte @@ -15,9 +15,9 @@ import Chats from './Settings/Chats.svelte'; import User from '../icons/User.svelte'; import Personalization from './Settings/Personalization.svelte'; - import SearchInput from '../layout/Sidebar/SearchInput.svelte'; import Search from '../icons/Search.svelte'; import Connections from './Settings/Connections.svelte'; + import Tools from './Settings/Tools.svelte'; const i18n = getContext('i18n'); @@ -128,6 +128,11 @@ title: 'Connections', keywords: [] }, + { + id: 'tools', + title: 'Tools', + keywords: [] + }, { id: 'personalization', title: 'Personalization', @@ -459,7 +464,7 @@ {:else if tabId === 'connections'} {#if $user.role === 'admin' || ($user.role === 'user' && $config?.features?.enable_direct_connections)}
{/if} + {:else if tabId === 'tools'} + {#if $user.role === 'admin' || ($user.role === 'user' && $config?.features?.enable_direct_tools)} + + {/if} {:else if tabId === 'personalization'} +
+ +
+
+ Open WebUI can use tools provided by any OpenAPI server.
Learn more about OpenAPI tool servers. +
+
+ {#each $toolServers as toolServer} + +
+
+ {toolServer?.openapi?.info?.title} - v{toolServer?.openapi?.info?.version} +
+ +
+ {toolServer?.openapi?.info?.description} +
+ +
+ {toolServer?.url} +
+
+ +
+ {#each toolServer?.specs ?? [] as tool_spec} +
+
+ {tool_spec?.name} +
+ +
+ {tool_spec?.description} +
+
+ {/each} +
+
+ {/each} +
+
+
+ diff --git a/src/lib/components/common/Checkbox.svelte b/src/lib/components/common/Checkbox.svelte index 9722e25264..feae33cd25 100644 --- a/src/lib/components/common/Checkbox.svelte +++ b/src/lib/components/common/Checkbox.svelte @@ -4,6 +4,7 @@ export let state = 'unchecked'; export let indeterminate = false; + export let disabled = false; let _state = 'unchecked'; @@ -14,8 +15,12 @@ class=" outline -outline-offset-1 outline-[1.5px] outline-gray-200 dark:outline-gray-600 {state !== 'unchecked' ? 'bg-black outline-black ' - : 'hover:outline-gray-500 hover:bg-gray-50 dark:hover:bg-gray-800'} text-white transition-all rounded-sm inline-block w-3.5 h-3.5 relative" + : 'hover:outline-gray-500 hover:bg-gray-50 dark:hover:bg-gray-800'} text-white transition-all rounded-sm inline-block w-3.5 h-3.5 relative {disabled + ? 'opacity-50 cursor-not-allowed' + : ''}" on:click={() => { + if (disabled) return; + if (_state === 'unchecked') { _state = 'checked'; dispatch('change', _state); @@ -30,6 +35,7 @@ } }} type="button" + {disabled} >
{#if _state === 'checked'} diff --git a/src/lib/components/common/CodeEditor.svelte b/src/lib/components/common/CodeEditor.svelte index d545d7236a..82ff139379 100644 --- a/src/lib/components/common/CodeEditor.svelte +++ b/src/lib/components/common/CodeEditor.svelte @@ -33,15 +33,49 @@ const updateValue = () => { if (_value !== value) { + const changes = findChanges(_value, value); _value = value; - if (codeEditor) { - codeEditor.dispatch({ - changes: [{ from: 0, to: codeEditor.state.doc.length, insert: _value }] - }); + + if (codeEditor && changes.length > 0) { + codeEditor.dispatch({ changes }); } } }; + /** + * Finds multiple diffs in two strings and generates minimal change edits. + */ + function findChanges(oldStr, newStr) { + let changes = []; + let oldIndex = 0, + newIndex = 0; + + while (oldIndex < oldStr.length || newIndex < newStr.length) { + if (oldStr[oldIndex] !== newStr[newIndex]) { + let start = oldIndex; + + // Identify the changed portion + while (oldIndex < oldStr.length && oldStr[oldIndex] !== newStr[newIndex]) { + oldIndex++; + } + while (newIndex < newStr.length && newStr[newIndex] !== oldStr[start]) { + newIndex++; + } + + changes.push({ + from: start, + to: oldIndex, // Replace the differing part + insert: newStr.substring(start, newIndex) + }); + } else { + oldIndex++; + newIndex++; + } + } + + return changes; + } + export let id = ''; export let lang = ''; diff --git a/src/lib/components/common/Collapsible.svelte b/src/lib/components/common/Collapsible.svelte index cc52494093..dfed04acf2 100644 --- a/src/lib/components/common/Collapsible.svelte +++ b/src/lib/components/common/Collapsible.svelte @@ -1,4 +1,6 @@
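The `findChanges` helper introduced in `CodeEditor.svelte` above walks both strings character by character to emit minimal `{from, to, insert}` changes, so CodeMirror no longer replaces the whole document on every external update. A common alternative — a sketch using the same change-object shape, not the PR's implementation — trims the shared prefix and suffix and emits a single replacement:

```javascript
// Sketch: common-prefix/suffix minimal edit between two strings, producing
// one CodeMirror-style {from, to, insert} change. Alternative to the
// character-walking findChanges in the diff above; not the PR's code.
function minimalChange(oldStr, newStr) {
	if (oldStr === newStr) return [];
	// Advance past the shared prefix
	let start = 0;
	while (start < oldStr.length && start < newStr.length && oldStr[start] === newStr[start]) {
		start++;
	}
	// Retreat past the shared suffix (without crossing the prefix)
	let oldEnd = oldStr.length;
	let newEnd = newStr.length;
	while (oldEnd > start && newEnd > start && oldStr[oldEnd - 1] === newStr[newEnd - 1]) {
		oldEnd--;
		newEnd--;
	}
	// Replace only the differing middle
	return [{ from: start, to: oldEnd, insert: newStr.slice(start, newEnd) }];
}

console.log(minimalChange('hello world', 'hello brave world'));
// inserts 'brave ' at index 6 without touching the rest
```

This variant always yields at most one change, which keeps cursor and selection mapping predictable when the editor applies it.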
@@ -93,6 +116,22 @@ {:else} {$i18n.t('Analyzing...')} {/if} + {:else if attributes?.type === 'tool_calls'} + {#if attributes?.done === 'true'} + + {:else} + + {/if} {:else} {title} {/if} @@ -119,7 +158,19 @@ }} >
- +
+ + + {#if chevron} +
+ {#if open} + + {:else} + + {/if} +
+ {/if} +
{#if grow} {#if open && !hide} @@ -140,7 +191,29 @@ {#if !grow} {#if open && !hide}
- + {#if attributes?.type === 'tool_calls'} + {@const args = decode(attributes?.arguments)} + {@const result = decode(attributes?.result ?? '')} + + {#if attributes?.done === 'true'} + \`\`\`json +> ${formatJSONString(args)} +> ${formatJSONString(result)} +> \`\`\``} + /> + {:else} + \`\`\`json +> ${formatJSONString(args)} +> \`\`\``} + /> + {/if} + {:else} + + {/if}
{/if} {/if} diff --git a/src/lib/components/common/FileItem.svelte b/src/lib/components/common/FileItem.svelte index 8bc6f5ed42..772b078584 100644 --- a/src/lib/components/common/FileItem.svelte +++ b/src/lib/components/common/FileItem.svelte @@ -82,7 +82,7 @@ {#if !small}
- {name} + {decodeURIComponent(name)}
@@ -101,7 +101,11 @@
{:else} - +
{#if loading} @@ -109,7 +113,7 @@
{/if} -
{name}
+
{decodeURIComponent(name)}
{formatFileSize(size)}
diff --git a/src/lib/components/common/FileItemModal.svelte b/src/lib/components/common/FileItemModal.svelte index dd00b3a96f..ef55e96b2d 100644 --- a/src/lib/components/common/FileItemModal.svelte +++ b/src/lib/components/common/FileItemModal.svelte @@ -87,8 +87,12 @@
{#if enableFullContent} diff --git a/src/lib/components/common/ImagePreview.svelte b/src/lib/components/common/ImagePreview.svelte index ac7dc3a096..ef5b9d77c4 100644 --- a/src/lib/components/common/ImagePreview.svelte +++ b/src/lib/components/common/ImagePreview.svelte @@ -1,5 +1,6 @@ + + + + + diff --git a/src/lib/components/layout/Navbar/Menu.svelte b/src/lib/components/layout/Navbar/Menu.svelte index 5be5092e6c..73468a197f 100644 --- a/src/lib/components/layout/Navbar/Menu.svelte +++ b/src/lib/components/layout/Navbar/Menu.svelte @@ -6,6 +6,9 @@ import fileSaver from 'file-saver'; const { saveAs } = fileSaver; + import jsPDF from 'jspdf'; + import html2canvas from 'html2canvas-pro'; + import { downloadChatAsPDF } from '$lib/apis/utils'; import { copyToClipboard, createMessagesList } from '$lib/utils'; @@ -14,7 +17,8 @@ showControls, showArtifacts, mobile, - temporaryChatEnabled + temporaryChatEnabled, + theme } from '$lib/stores'; import { flyAndScale } from '$lib/utils/transitions'; @@ -58,27 +62,76 @@ }; const downloadPdf = async () => { - const history = chat.chat.history; - const messages = createMessagesList(history, history.currentId); - const blob = await downloadChatAsPDF(localStorage.token, chat.chat.title, messages); + const containerElement = document.getElementById('messages-container'); - // Create a URL for the blob - const url = window.URL.createObjectURL(blob); + if (containerElement) { + try { + const isDarkMode = document.documentElement.classList.contains('dark'); - // Create a link element to trigger the download - const a = document.createElement('a'); - a.href = url; - a.download = `chat-${chat.chat.title}.pdf`; + console.log('isDarkMode', isDarkMode); - // Append the link to the body and click it programmatically - document.body.appendChild(a); - a.click(); + // Define a fixed virtual screen size + const virtualWidth = 800; // Fixed width (adjust as needed) + // Clone the container to avoid layout shifts + const clonedElement = 
containerElement.cloneNode(true); + clonedElement.classList.add('text-black'); + clonedElement.classList.add('dark:text-white'); + clonedElement.style.width = `${virtualWidth}px`; // Apply fixed width + clonedElement.style.height = 'auto'; // Allow content to expand - document.body.appendChild(a); - a.click(); + document.body.appendChild(clonedElement); // Temporarily add to DOM - // Remove the link from the body - document.body.removeChild(a); + // Render to canvas with predefined width + const canvas = await html2canvas(clonedElement, { + backgroundColor: isDarkMode ? '#000' : '#fff', + useCORS: true, + scale: 2, // Render at 2x for sharper output + width: virtualWidth, // Set fixed virtual screen width + windowWidth: virtualWidth // Ensure consistent rendering + }); + + document.body.removeChild(clonedElement); // Clean up temp element + + const imgData = canvas.toDataURL('image/png'); + + // A4 page settings + const pdf = new jsPDF('p', 'mm', 'a4'); + const imgWidth = 210; // A4 width in mm + const pageHeight = 297; // A4 height in mm + + // Maintain aspect ratio + const imgHeight = (canvas.height * imgWidth) / canvas.width; + let heightLeft = imgHeight; + let position = 0; + + // Set page background for dark mode + if (isDarkMode) { + pdf.setFillColor(0, 0, 0); + pdf.rect(0, 0, imgWidth, pageHeight, 'F'); // Apply black bg + } + + pdf.addImage(imgData, 'PNG', 0, position, imgWidth, imgHeight); + heightLeft -= pageHeight; + + // Handle additional pages + while (heightLeft > 0) { + position -= pageHeight; + pdf.addPage(); + + if (isDarkMode) { + pdf.setFillColor(0, 0, 0); + pdf.rect(0, 0, imgWidth, pageHeight, 'F'); + } + + pdf.addImage(imgData, 'PNG', 0, position, imgWidth, imgHeight); + heightLeft -= pageHeight; + } + + pdf.save(`chat-${chat.chat.title}.pdf`); + } catch (error) { + console.error('Error generating PDF', error); + } + } }; const downloadJSONExport = async () => { diff --git 
a/src/lib/components/layout/Sidebar.svelte b/src/lib/components/layout/Sidebar.svelte index 0ab13e6ad7..d547779482 100644 --- a/src/lib/components/layout/Sidebar.svelte +++ b/src/lib/components/layout/Sidebar.svelte @@ -77,6 +77,7 @@ let allChatsLoaded = false; let folders = {}; + let newFolderId = null; const initFolders = async () => { const folderList = await getFolders(localStorage.token).catch((error) => { @@ -90,6 +91,11 @@ for (const folder of folderList) { // Ensure folder is added to folders with its data folders[folder.id] = { ...(folders[folder.id] || {}), ...folder }; + + if (newFolderId && folder.id === newFolderId) { + folders[folder.id].new = true; + newFolderId = null; + } } // Second pass: Tie child folders to their parents @@ -150,6 +156,7 @@ }); if (res) { + newFolderId = res.id; await initFolders(); } }; @@ -611,6 +618,7 @@ bind:value={search} on:input={searchDebounceHandler} placeholder={$i18n.t('Search')} + showClearButton={true} />
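The rewritten `downloadPdf` above renders the chat to one tall canvas and tiles it across A4 pages by drawing the same image at increasingly negative y offsets until the remaining height is exhausted. The offset arithmetic can be isolated as a small sketch — `paginate` is a hypothetical helper, with the A4 dimensions taken from the diff:

```javascript
// Sketch of the page-offset loop used when tiling one tall canvas image
// across A4 pages with jsPDF: the image is drawn once per page at a
// shifted y position. Dimensions (210x297 mm) match the diff above.
function paginate(canvasWidth, canvasHeight) {
	const imgWidth = 210; // A4 width in mm
	const pageHeight = 297; // A4 height in mm
	const imgHeight = (canvasHeight * imgWidth) / canvasWidth; // keep aspect ratio

	const positions = []; // y offset passed to addImage for each page
	let heightLeft = imgHeight;
	let position = 0;

	positions.push(position); // first page
	heightLeft -= pageHeight;

	while (heightLeft > 0) {
		position -= pageHeight; // shift the image up for the next page
		positions.push(position);
		heightLeft -= pageHeight;
	}

	return positions;
}

console.log(paginate(800, 3200)); // → [ 0, -297, -594 ]
```

An 800×3200 px canvas scales to 840 mm tall, so it spans three A4 pages; each page redraws the full image with `pdf.addImage(imgData, 'PNG', 0, position, …)`, and the page bounds clip it to the visible slice.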
diff --git a/src/lib/components/layout/Sidebar/ChatItem.svelte b/src/lib/components/layout/Sidebar/ChatItem.svelte index 80981d00c7..0c15a334e8 100644 --- a/src/lib/components/layout/Sidebar/ChatItem.svelte +++ b/src/lib/components/layout/Sidebar/ChatItem.svelte @@ -198,6 +198,19 @@ }); let showDeleteConfirm = false; + + const chatTitleInputKeydownHandler = (e) => { + if (e.key === 'Enter') { + e.preventDefault(); + editChatTitle(id, chatTitle); + confirmEdit = false; + chatTitle = ''; + } else if (e.key === 'Escape') { + e.preventDefault(); + confirmEdit = false; + chatTitle = ''; + } + }; @@ -246,6 +259,7 @@ bind:value={chatTitle} id="chat-title-input-{id}" class=" bg-transparent w-full outline-hidden mr-10" + on:keydown={chatTitleInputKeydownHandler} />
{:else} diff --git a/src/lib/components/layout/Sidebar/ChatMenu.svelte b/src/lib/components/layout/Sidebar/ChatMenu.svelte index 8f38ba520e..395b0ec3a6 100644 --- a/src/lib/components/layout/Sidebar/ChatMenu.svelte +++ b/src/lib/components/layout/Sidebar/ChatMenu.svelte @@ -6,6 +6,9 @@ import fileSaver from 'file-saver'; const { saveAs } = fileSaver; + import jsPDF from 'jspdf'; + import html2canvas from 'html2canvas-pro'; + const dispatch = createEventDispatcher(); import Dropdown from '$lib/components/common/Dropdown.svelte'; @@ -23,7 +26,7 @@ getChatPinnedStatusById, toggleChatPinnedStatusById } from '$lib/apis/chats'; - import { chats } from '$lib/stores'; + import { chats, theme } from '$lib/stores'; import { createMessagesList } from '$lib/utils'; import { downloadChatAsPDF } from '$lib/apis/utils'; import Download from '$lib/components/icons/Download.svelte'; @@ -77,31 +80,76 @@ const downloadPdf = async () => { const chat = await getChatById(localStorage.token, chatId); - if (!chat) { - return; + + const containerElement = document.getElementById('messages-container'); + + if (containerElement) { + try { + const isDarkMode = $theme.includes('dark'); // Check theme mode + + // Define a fixed virtual screen size + const virtualWidth = 1024; // Fixed width (adjust as needed) + const virtualHeight = 1400; // Fixed height (adjust as needed) + + // Clone the container to avoid layout shifts + const clonedElement = containerElement.cloneNode(true); + clonedElement.style.width = `${virtualWidth}px`; // Apply fixed width + clonedElement.style.height = 'auto'; // Allow content to expand + + document.body.appendChild(clonedElement); // Temporarily add to DOM + + // Render to canvas with predefined width + const canvas = await html2canvas(clonedElement, { + backgroundColor: isDarkMode ? 
'#000' : '#fff', + useCORS: true, + scale: 2, // Render at 2x for sharper output + width: virtualWidth, // Set fixed virtual screen width + windowWidth: virtualWidth, // Ensure consistent rendering + windowHeight: virtualHeight + }); + + document.body.removeChild(clonedElement); // Clean up temp element + + const imgData = canvas.toDataURL('image/png'); + + // A4 page settings + const pdf = new jsPDF('p', 'mm', 'a4'); + const imgWidth = 210; // A4 width in mm + const pageHeight = 297; // A4 height in mm + + // Maintain aspect ratio + const imgHeight = (canvas.height * imgWidth) / canvas.width; + let heightLeft = imgHeight; + let position = 0; + + // Set page background for dark mode + if (isDarkMode) { + pdf.setFillColor(0, 0, 0); + pdf.rect(0, 0, imgWidth, pageHeight, 'F'); // Apply black bg + } + + pdf.addImage(imgData, 'PNG', 0, position, imgWidth, imgHeight); + heightLeft -= pageHeight; + + // Handle additional pages + while (heightLeft > 0) { + position -= pageHeight; + pdf.addPage(); + + if (isDarkMode) { + pdf.setFillColor(0, 0, 0); + pdf.rect(0, 0, imgWidth, pageHeight, 'F'); + } + + pdf.addImage(imgData, 'PNG', 0, position, imgWidth, imgHeight); + heightLeft -= pageHeight; + } + + pdf.save(`chat-${chat.chat.title}.pdf`); + } catch (error) { + console.error('Error generating PDF', error); + } } - - const history = chat.chat.history; - const messages = createMessagesList(history, history.currentId); - const blob = await downloadChatAsPDF(localStorage.token, chat.chat.title, messages); - - // Create a URL for the blob - const url = window.URL.createObjectURL(blob); - - // Create a link element to trigger the download - const a = document.createElement('a'); - a.href = url; - a.download = `chat-${chat.chat.title}.pdf`; - - // Append the link to the body and click it programmatically - document.body.appendChild(a); - a.click(); - - // Remove the link from the body - document.body.removeChild(a); - - // Revoke the URL to release memory - 
window.URL.revokeObjectURL(url); }; const downloadJSONExport = async () => { diff --git a/src/lib/components/layout/Sidebar/RecursiveFolder.svelte b/src/lib/components/layout/Sidebar/RecursiveFolder.svelte index 085eb683b8..a7eb920d7d 100644 --- a/src/lib/components/layout/Sidebar/RecursiveFolder.svelte +++ b/src/lib/components/layout/Sidebar/RecursiveFolder.svelte @@ -201,7 +201,7 @@ dragged = false; }; - onMount(() => { + onMount(async () => { open = folders[folderId].is_expanded; if (folderElement) { folderElement.addEventListener('dragover', onDragOver); @@ -215,6 +215,13 @@ // Event listener for when dragging ends folderElement.addEventListener('dragend', onDragEnd); } + + if (folders[folderId]?.new) { + delete folders[folderId].new; + + await tick(); + editHandler(); + } }); onDestroy(() => { @@ -297,15 +304,15 @@ console.log('Edit'); await tick(); name = folders[folderId].name; - edit = true; + edit = true; await tick(); - // focus on the input - setTimeout(() => { - const input = document.getElementById(`folder-${folderId}-input`); + const input = document.getElementById(`folder-${folderId}-input`); + + if (input) { input.focus(); - }, 100); + } }; const exportHandler = async () => { @@ -394,6 +401,9 @@ id="folder-{folderId}-input" type="text" bind:value={name} + on:focus={(e) => { + e.target.select(); + }} on:blur={() => { nameUpdateHandler(); edit = false; @@ -427,7 +437,10 @@ > { - editHandler(); + // Requires a timeout to prevent the click event from closing the dropdown + setTimeout(() => { + editHandler(); + }, 200); }} on:delete={() => { showDeleteConfirm = true; diff --git a/src/lib/components/layout/Sidebar/SearchInput.svelte b/src/lib/components/layout/Sidebar/SearchInput.svelte index eddc5b0694..6dca9a4eb9 100644 --- a/src/lib/components/layout/Sidebar/SearchInput.svelte +++ b/src/lib/components/layout/Sidebar/SearchInput.svelte @@ -3,12 +3,14 @@ import { tags } from '$lib/stores'; import { getContext, createEventDispatcher, onMount, onDestroy, 
tick } from 'svelte'; import { fade } from 'svelte/transition'; + import XMark from '$lib/components/icons/XMark.svelte'; const dispatch = createEventDispatcher(); const i18n = getContext('i18n'); export let placeholder = ''; export let value = ''; + export let showClearButton = false; let selectedIdx = 0; @@ -59,6 +61,11 @@ loading = false; }; + const clearSearchInput = () => { + value = ''; + dispatch('input'); + }; + const documentClickHandler = (e) => { const searchContainer = document.getElementById('search-container'); const chatSearch = document.getElementById('chat-search'); @@ -98,7 +105,7 @@
{ @@ -140,6 +147,17 @@ } }} /> + + {#if showClearButton && value} +
+ +
+ {/if}
{#if focused && (filteredOptions.length > 0 || filteredTags.length > 0)} diff --git a/src/lib/components/workspace/Knowledge/CreateKnowledgeBase.svelte b/src/lib/components/workspace/Knowledge/CreateKnowledgeBase.svelte index 586564cd79..fefbbefcda 100644 --- a/src/lib/components/workspace/Knowledge/CreateKnowledgeBase.svelte +++ b/src/lib/components/workspace/Knowledge/CreateKnowledgeBase.svelte @@ -5,7 +5,7 @@ import { createNewKnowledge, getKnowledgeBases } from '$lib/apis/knowledge'; import { toast } from 'svelte-sonner'; - import { knowledge } from '$lib/stores'; + import { knowledge, user } from '$lib/stores'; import AccessControl from '../common/AccessControl.svelte'; let loading = false; @@ -112,7 +112,11 @@
- +
diff --git a/src/lib/components/workspace/Knowledge/KnowledgeBase.svelte b/src/lib/components/workspace/Knowledge/KnowledgeBase.svelte index 173d04a613..c6f47e8def 100644 --- a/src/lib/components/workspace/Knowledge/KnowledgeBase.svelte +++ b/src/lib/components/workspace/Knowledge/KnowledgeBase.svelte @@ -9,7 +9,7 @@ import { goto } from '$app/navigation'; import { page } from '$app/stores'; - import { mobile, showSidebar, knowledge as _knowledge } from '$lib/stores'; + import { mobile, showSidebar, knowledge as _knowledge, config, user } from '$lib/stores'; import { updateFileDataContentById, uploadFile, deleteFileById } from '$lib/apis/files'; import { @@ -131,6 +131,22 @@ return null; } + if ( + ($config?.file?.max_size ?? null) !== null && + file.size > ($config?.file?.max_size ?? 0) * 1024 * 1024 + ) { + console.log('File exceeds max size limit:', { + fileSize: file.size, + maxSize: ($config?.file?.max_size ?? 0) * 1024 * 1024 + }); + toast.error( + $i18n.t(`File size should not exceed {{maxSize}} MB.`, { + maxSize: $config?.file?.max_size + }) + ); + return; + } + knowledge.files = [...(knowledge.files ?? []), fileItem]; try { @@ -603,6 +619,7 @@ { changeDebounceHandler(); }} @@ -681,7 +698,7 @@ href={selectedFile.id ? `/api/v1/files/${selectedFile.id}/content` : '#'} target="_blank" > - {selectedFile?.meta?.name} + {decodeURIComponent(selectedFile?.meta?.name)}
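The upload guard added to `KnowledgeBase.svelte` above compares `File.size` (reported in bytes) against `config.file.max_size` (configured in megabytes), treating a missing limit as "no restriction". The check reduces to a small predicate — `exceedsMaxSize` is a hypothetical name for illustration:

```javascript
// Sketch of the upload-size guard: max_size is configured in MB, while
// File.size is in bytes, hence the 1024 * 1024 conversion. A null or
// undefined limit means uploads of any size are allowed.
function exceedsMaxSize(fileSizeBytes, maxSizeMb) {
	if (maxSizeMb === null || maxSizeMb === undefined) return false; // no limit set
	return fileSizeBytes > maxSizeMb * 1024 * 1024;
}

console.assert(exceedsMaxSize(3 * 1024 * 1024, 2) === true); // 3 MB > 2 MB limit
console.assert(exceedsMaxSize(1024, 2) === false); // 1 KB is fine
console.assert(exceedsMaxSize(1024, null) === false); // no limit configured
```

Keeping the conversion in one place avoids the classic bug of comparing bytes to megabytes directly, which would effectively disable the limit.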
diff --git a/src/lib/components/workspace/Models.svelte b/src/lib/components/workspace/Models.svelte index 9a01f3fd8c..3c509a0bcd 100644 --- a/src/lib/components/workspace/Models.svelte +++ b/src/lib/components/workspace/Models.svelte @@ -430,6 +430,12 @@ return null; }); } + } else { + if (model?.id && model?.name) { + await createNewModel(localStorage.token, model).catch((error) => { + return null; + }); + } } } @@ -474,7 +480,7 @@