Mirror of https://github.com/open-webui/open-webui.git (synced 2025-12-12 12:25:20 +00:00)

Merge branch 'dev' into feature/knowledge-sync-button

This commit is contained in: commit 259f2b5a7a

271 changed files with 11473 additions and 3649 deletions
@@ -3,8 +3,6 @@ pnpm-lock.yaml
 package-lock.json
 yarn.lock
 
-kubernetes/
-
 # Copy of .gitignore
 .DS_Store
 node_modules
CHANGELOG.md: 73 changes
@@ -5,6 +5,79 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.6.41] - 2025-12-02

### Added
- 🚦 Sign-in rate limiting was implemented to protect against brute force attacks, limiting login attempts to 15 per 3-minute window per email address using Redis with automatic fallback to in-memory storage when Redis is unavailable (see the sketch after this list). [Commit](https://github.com/open-webui/open-webui/commit/7b166370432414ce8f186747fb098e0c70fb2d6b)
- 📂 Administrators can now globally disable the folders feature and control user-level folder permissions through the admin panel, enabling minimalist interface configurations for deployments that don't require workspace organization features. [#19529](https://github.com/open-webui/open-webui/pull/19529), [#19210](https://github.com/open-webui/open-webui/discussions/19210), [#18459](https://github.com/open-webui/open-webui/discussions/18459), [#18299](https://github.com/open-webui/open-webui/discussions/18299)
- 👥 Group channels were introduced as a new channel type enabling membership-based collaboration spaces where users explicitly join as members rather than accessing through permissions, with support for public or private visibility, automatic member inclusion from specified user groups, member role tracking with invitation metadata, and post-creation member management allowing channel managers to add or remove members through the channel info modal. [Commit](https://github.com/open-webui/open-webui/commit/f589b7c1895a6a77166c047891acfa21bc0936c4), [Commit](https://github.com/open-webui/open-webui/commit/3f1d9ccbf8443a2fa5278f36202bad930a216680)
- 💬 Direct Message channels were introduced with a dedicated channel type selector and multi-user member selection interface, enabling private conversations between specific users without requiring full channel visibility. [Commit](https://github.com/open-webui/open-webui/commit/64b4d5d9c280b926746584aaf92b447d09deb386)
- 📨 Direct Message channels now support a complete user-to-user messaging system with member-based access control, automatic deduplication for one-on-one conversations, optional channel naming, and distinct visual presentation using participant avatars instead of channel icons. [Commit](https://github.com/open-webui/open-webui/commit/acccb9afdd557274d6296c70258bb897bbb6652f)
- 🙈 Users can now hide Direct Message channels from their sidebar while preserving message history, with automatic reactivation when new messages arrive from other participants, providing a cleaner interface for managing active conversations. [Commit](https://github.com/open-webui/open-webui/commit/acccb9afdd557274d6296c70258bb897bbb6652f)
- ☑️ A comprehensive user selection component was added to the channel creation modal, featuring search functionality, sortable user lists, pagination support, and multi-select checkboxes for building Direct Message participant lists. [Commit](https://github.com/open-webui/open-webui/commit/acccb9afdd557274d6296c70258bb897bbb6652f)
- 🔴 Channel unread message count tracking was implemented with visual badge indicators in the sidebar, automatically updating counts in real-time and marking messages as read when users view channels, with join/leave functionality to manage membership status. [Commit](https://github.com/open-webui/open-webui/commit/64b4d5d9c280b926746584aaf92b447d09deb386)
- 📌 Message pinning functionality was added to channels, allowing users to pin important messages for easy reference with visual highlighting, a dedicated pinned messages modal accessible from the navbar, and complete backend support for tracking pinned status, pin timestamp, and the user who pinned each message. [Commit](https://github.com/open-webui/open-webui/commit/64b4d5d9c280b926746584aaf92b447d09deb386), [Commit](https://github.com/open-webui/open-webui/commit/aae2fce17355419d9c29f8100409108037895201)
- 🟢 Direct Message channels now display an active status indicator for one-on-one conversations, showing a green dot when the other participant is currently online or a gray dot when offline. [Commit](https://github.com/open-webui/open-webui/commit/4b6773885cd7527c5a56b963781dac5e95105eec), [Commit](https://github.com/open-webui/open-webui/commit/39645102d14f34e71b34e5ddce0625790be33f6f)
- 🆔 Users can now start Direct Message conversations directly from user profile previews by clicking the "Message" button, enabling quick access to private messaging without navigating away from the current channel. [Commit](https://github.com/open-webui/open-webui/commit/a0826ec9fedb56320532616d568fa59dda831d4e)
- ⚡ Channel messages now appear instantly when sent using optimistic UI rendering, displaying with a pending state while the server confirms delivery, providing a more responsive messaging experience. [Commit](https://github.com/open-webui/open-webui/commit/25994dd3da90600401f53596d4e4fb067c1b8eaa)
- 👍 Channel message reactions now display the names of users who reacted when hovering over the emoji, showing up to three names with a count for additional reactors. [Commit](https://github.com/open-webui/open-webui/commit/05e79bdd0c7af70b631e958924e3656db1013b80)
- 🛠️ Channel creators can now edit and delete their own group and DM channels without requiring administrator privileges, enabling users to manage the channels they create independently. [Commit](https://github.com/open-webui/open-webui/commit/f589b7c1895a6a77166c047891acfa21bc0936c4)
- 🔌 A new API endpoint was added to directly get or create a Direct Message channel with a specific user by their ID, streamlining programmatic DM channel creation for integrations and frontend workflows. [Commit](https://github.com/open-webui/open-webui/commit/f589b7c1895a6a77166c047891acfa21bc0936c4)
- 💭 Users can now set a custom status with an emoji and message that displays in profile previews, the sidebar user menu, and Direct Message channel items in the sidebar, with the ability to clear status at any time, providing visibility into availability or current focus similar to team communication platforms. [Commit](https://github.com/open-webui/open-webui/commit/51621ba91a982e52da168ce823abffd11ad3e4fa), [Commit](https://github.com/open-webui/open-webui/commit/f5e8d4d5a004115489c35725408b057e24dfe318)
- 📤 A group export API endpoint was added, enabling administrators to export complete group data including member lists for backup and migration purposes. [Commit](https://github.com/open-webui/open-webui/commit/09b6ea38c579659f8ca43ae5ea3746df3ac561ad)
- 📡 A new API endpoint was added to retrieve all users belonging to a specific group, enabling programmatic access to group membership information for administrative workflows. [Commit](https://github.com/open-webui/open-webui/commit/01868e856a10f474f74fbd1b4425dafdf949222f)
- 👁️ The admin user list now displays an active status indicator next to each user, showing a visual green dot for users who have been active within the last three minutes. [Commit](https://github.com/open-webui/open-webui/commit/1b095d12ff2465b83afa94af89ded9593f8a8655)
- 🔑 The admin user edit modal now displays OAuth identity information with a per-provider breakdown, showing each linked identity provider and its associated subject identifier separately. [#19573](https://github.com/open-webui/open-webui/pull/19573)
- 🧩 OAuth role claim parsing now respects the "OAUTH_ROLES_SEPARATOR" configuration, enabling proper parsing of roles returned as comma-separated strings and providing consistent behavior with group claim handling. [#19514](https://github.com/open-webui/open-webui/pull/19514)
- 🎛️ Channel feature access can now be controlled through both the "USER_PERMISSIONS_FEATURES_CHANNELS" environment variable and group permission toggles in the admin panel, allowing administrators to restrict channel functionality for specific users or groups while defaulting to enabled for all users. [Commit](https://github.com/open-webui/open-webui/commit/f589b7c1895a6a77166c047891acfa21bc0936c4)
- 🎨 The model editor interface was refined with access control settings moved to a dedicated modal, group member counts now displayed when configuring permissions, reorganized layout with improved visual hierarchy, and redesigned prompt suggestions cards with tooltips for field guidance. [Commit](https://github.com/open-webui/open-webui/commit/e65d92fc6f49da5ca059e1c65a729e7973354b99), [Commit](https://github.com/open-webui/open-webui/commit/9d39b9b42c653ee2acf2674b2df343ecbceb4954)
- 🏗️ Knowledge base file management was rebuilt with a dedicated database table replacing the previous JSON array storage, enabling pagination support for large knowledge bases, significantly faster file listing performance, and more reliable file-knowledge base relationship tracking. [Commit](https://github.com/open-webui/open-webui/commit/d19023288e2ca40f86e2dc3fd9f230540f3e70d7)
- ☁️ Azure Document Intelligence model selection was added, allowing administrators to specify which model to use for document processing via the "DOCUMENT_INTELLIGENCE_MODEL" environment variable or admin UI setting, with "prebuilt-layout" as the default. [#19692](https://github.com/open-webui/open-webui/pull/19692), [Docs:#872](https://github.com/open-webui/docs/pull/872)
- 🚀 Milvus multitenancy vector database performance was improved by removing manual flush calls after upsert operations, eliminating rate limit errors and reducing load on etcd and MinIO/S3 storage by allowing Milvus to manage segment persistence automatically via its WAL and auto-flush policies. [#19680](https://github.com/open-webui/open-webui/pull/19680)
- ✨ Various improvements were implemented across the frontend and backend to enhance performance, stability, and security.
- 🌍 Translations for German, French, Portuguese (Brazil), Catalan, Simplified Chinese, and Traditional Chinese were enhanced and expanded.
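
A minimal sketch of the sign-in limiting scheme described above, assuming a redis-py style client and the 15-attempts-per-3-minute policy from the entry; the class and method names here are illustrative, not the actual implementation:

```python
import time

WINDOW_SECONDS = 3 * 60  # 3-minute window, per the changelog entry
MAX_ATTEMPTS = 15        # allowed attempts per window per email


class SigninRateLimiter:
    """Fixed-window limiter: Redis when available, in-memory dict otherwise."""

    def __init__(self, redis_client=None):
        self.redis = redis_client
        self.memory = {}  # email -> (window_start, count) fallback store

    def is_allowed(self, email: str) -> bool:
        if self.redis is not None:
            key = f"signin:{email}"
            count = self.redis.incr(key)  # atomic per-email counter
            if count == 1:
                self.redis.expire(key, WINDOW_SECONDS)  # start the window
            return count <= MAX_ATTEMPTS

        # In-memory fallback when Redis is unavailable
        now = time.time()
        window_start, count = self.memory.get(email, (now, 0))
        if now - window_start > WINDOW_SECONDS:
            window_start, count = now, 0  # window expired, reset
        self.memory[email] = (window_start, count + 1)
        return count + 1 <= MAX_ATTEMPTS
```
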
### Fixed
- 🔄 Tool call response token duplication was fixed by removing redundant message history additions in non-native function calling mode, resolving an issue where tool results were included twice in the context and causing 2x token consumption. [#19656](https://github.com/open-webui/open-webui/issues/19656), [Commit](https://github.com/open-webui/open-webui/commit/52ccab8)
- 🛡️ Web search domain filtering was corrected to properly block results when any resolved hostname or IP address matches a blocked domain, preventing blocked sites from appearing in search results due to permissive hostname resolution logic that previously allowed results through if any single resolved address passed the filter. [#19670](https://github.com/open-webui/open-webui/pull/19670), [#19669](https://github.com/open-webui/open-webui/issues/19669)
- 🧠 Custom models based on Ollama or OpenAI now properly inherit the connection type from their base model, ensuring they appear correctly in the "Local" or "External" model selection tabs instead of only appearing under "All". [#19183](https://github.com/open-webui/open-webui/issues/19183), [Commit](https://github.com/open-webui/open-webui/commit/39f7575)
- 🐍 SentenceTransformers embedding initialization was fixed by updating the transformers dependency to version 4.57.3, resolving a regression in v0.6.40 where document ingestion failed with "'NoneType' object has no attribute 'encode'" errors due to a bug in transformers 4.57.2. [#19512](https://github.com/open-webui/open-webui/issues/19512), [#19513](https://github.com/open-webui/open-webui/pull/19513)
- 📈 Active user count accuracy was significantly improved by replacing the socket-based USER_POOL tracking with a database-backed heartbeat mechanism, resolving long-standing issues where Redis deployments displayed inflated user counts due to stale sessions never being cleaned up on disconnect (see the sketch after this list). [#16074](https://github.com/open-webui/open-webui/discussions/16074), [Commit](https://github.com/open-webui/open-webui/commit/70948f8803e417459d5203839f8077fdbfbbb213)
- 👥 Default group assignment now applies consistently across all user registration methods including OAuth/SSO, LDAP, and admin-created users, fixing an issue where the "DEFAULT_GROUP_ID" setting was only being applied to users who signed up via the email/password signup form. [#19685](https://github.com/open-webui/open-webui/pull/19685)
- 🔦 Model list filtering in workspaces was corrected to properly include models shared with user groups, ensuring members can view models they have write access to through group permissions. [#19461](https://github.com/open-webui/open-webui/issues/19461), [Commit](https://github.com/open-webui/open-webui/commit/69722ba973768a5f689f2e2351bf583a8db9bba8)
- 🖼️ User profile image display in preview contexts was fixed by resolving a Pydantic validation error that prevented proper rendering. [Commit](https://github.com/open-webui/open-webui/commit/c7eb7136893b0ddfdc5d55ffc7a05bd84a00f5d6)
- 🔒 Redis TLS connection failures were resolved by updating the python-socketio dependency to version 5.15.0, restoring support for the "rediss://" URL schema. [#19480](https://github.com/open-webui/open-webui/issues/19480), [#19488](https://github.com/open-webui/open-webui/pull/19488)
- 📝 MCP tool server configuration was corrected to properly handle the "Function Name Filter List" as both string and list types, preventing AttributeError when the field is empty and ensuring backward compatibility. [#19486](https://github.com/open-webui/open-webui/issues/19486), [Commit](https://github.com/open-webui/open-webui/commit/c5b73d71843edc024325d4a6e625ec939a747279), [Commit](https://github.com/open-webui/open-webui/commit/477097c2e42985c14892301d0127314629d07df1)
- 📎 Web page attachment failures causing TypeError on metadata checks were resolved by correcting async threadpool parameter passing in vector database operations. [#19493](https://github.com/open-webui/open-webui/issues/19493), [Commit](https://github.com/open-webui/open-webui/commit/4370dee79e19d77062c03fba81780cb3b779fca3)
- 💾 Model allowlist persistence in multi-worker deployments was fixed by implementing Redis-based shared state for the internal models dictionary, ensuring configuration changes are consistently visible across all worker processes. [#19395](https://github.com/open-webui/open-webui/issues/19395), [Commit](https://github.com/open-webui/open-webui/commit/b5e5617d7f7ad3e4eec9f15f4cc7f07cb5afc2fa)
- ⏳ Chat history infinite loading was prevented by enhancing message data structure to properly track parent message relationships, resolving issues where missing parentId fields caused perpetual loading states. [#19225](https://github.com/open-webui/open-webui/issues/19225), [Commit](https://github.com/open-webui/open-webui/commit/ff4b1b9862d15adfa15eac17d2ce066c3d8ae38f)
- 🩹 Database migration robustness was improved by automatically detecting and correcting missing primary key constraints on the user table, ensuring successful schema upgrades for databases with non-standard configurations. [#19487](https://github.com/open-webui/open-webui/discussions/19487), [Commit](https://github.com/open-webui/open-webui/commit/453ea9b9a167c0b03d86c46e6efd086bf10056ce)
- 🏷️ OAuth group assignment now updates correctly on first login when users transition from admin to user role, ensuring group memberships reflect immediately when group management is enabled. [#19475](https://github.com/open-webui/open-webui/issues/19475), [#19476](https://github.com/open-webui/open-webui/pull/19476)
- 💡 Knowledge base file tooltips now properly display the parent collection name when referencing files with the hash symbol, preventing confusion between identically-named files in different collections. [#19491](https://github.com/open-webui/open-webui/issues/19491), [Commit](https://github.com/open-webui/open-webui/commit/3fe5a47b0ff84ac97f8e4ff56a19fa2ec065bf66)
- 🔐 Knowledge base file access inconsistencies were resolved where authorized non-admin users received "Not found" or permission errors for certain files due to race conditions during upload causing mismatched collection_name values, with file access validation now properly checking against knowledge base file associations. [#18689](https://github.com/open-webui/open-webui/issues/18689), [#19523](https://github.com/open-webui/open-webui/pull/19523), [Commit](https://github.com/open-webui/open-webui/commit/e301d1962e45900ababd3eabb7e9a2ad275a5761)
- 📦 Knowledge API batch file addition endpoint was corrected to properly handle async operations, resolving 500 Internal Server Error responses when adding multiple files simultaneously. [#19538](https://github.com/open-webui/open-webui/issues/19538), [Commit](https://github.com/open-webui/open-webui/commit/28659f60d94feb4f6a99bb1a5b54d7f45e5ea10f)
- 🤖 Embedding model auto-update functionality was fixed to properly respect the "RAG_EMBEDDING_MODEL_AUTO_UPDATE" setting by correctly passing the flag to the model path resolver, ensuring models update as expected when the auto-update option is enabled. [#19687](https://github.com/open-webui/open-webui/pull/19687)
- 📉 API response payload sizes were dramatically reduced by removing base64-encoded profile images from most endpoints, eliminating multi-megabyte responses caused by high-resolution avatars and enabling better browser caching. [#19519](https://github.com/open-webui/open-webui/issues/19519), [Commit](https://github.com/open-webui/open-webui/commit/384753c4c17f62a68d38af4bbcf55a21ee08e0f2)
- 📞 Redundant API calls on the admin user overview page were eliminated by consolidating reactive statements, reducing four duplicate requests to a single efficient call and significantly improving page load performance. [#19509](https://github.com/open-webui/open-webui/issues/19509), [Commit](https://github.com/open-webui/open-webui/commit/9f89cc5e9f7e1c6c9e2bc91177e08df7c79f66f9)
- 🧹 Duplicate API calls on the workspace models page were eliminated by removing redundant model list fetching, reducing two identical requests to a single call and improving page responsiveness. [#19517](https://github.com/open-webui/open-webui/issues/19517), [Commit](https://github.com/open-webui/open-webui/commit/d1bbf6be7a4d1d53fa8ad46ca4f62fc4b2e6a8cb)
- 🔘 The model valves button was corrected to prevent unintended form submission by adding explicit button type attribute, ensuring it no longer triggers message sending when the input area contains text. [#19534](https://github.com/open-webui/open-webui/pull/19534)
- 🗑️ Ollama model deletion was fixed by correcting the request payload format and ensuring the model selector properly displays the placeholder option. [Commit](https://github.com/open-webui/open-webui/commit/0f3156651c64bc5af188a65fc2908bdcecf30c74)
- 🎨 Image generation in temporary chats was fixed by correctly handling local chat sessions that are not persisted to the database. [Commit](https://github.com/open-webui/open-webui/commit/a7c7993bbf3a21cb7ba416525b89233cf2ad877f)
- 🕵️‍♂️ Audit logging was fixed by correctly awaiting the async user authentication call, resolving failures where coroutine objects were passed instead of user data. [#19658](https://github.com/open-webui/open-webui/pull/19658), [Commit](https://github.com/open-webui/open-webui/commit/dba86bc)
- 🌙 Dark mode select dropdown styling was corrected to use proper background colors, fixing an issue where dropdown borders and hover states appeared white instead of matching the dark theme. [#19693](https://github.com/open-webui/open-webui/pull/19693), [#19442](https://github.com/open-webui/open-webui/issues/19442)
- 🔍 Milvus vector database query filtering was fixed by correcting string quote handling in filter expressions and using the proper parameter name for queries, resolving false "duplicate content detected" errors that prevented uploading multiple files to knowledge bases. [#19602](https://github.com/open-webui/open-webui/pull/19602), [#18119](https://github.com/open-webui/open-webui/issues/18119), [#16345](https://github.com/open-webui/open-webui/issues/16345), [#17088](https://github.com/open-webui/open-webui/issues/17088), [#18485](https://github.com/open-webui/open-webui/issues/18485)
- 🆙 Milvus multitenancy vector database was updated to use query_iterator() for improved robustness and consistency with the standard Milvus implementation, fixing the same false duplicate detection errors and improving handling of large result sets in multi-tenant deployments. [#19695](https://github.com/open-webui/open-webui/pull/19695)
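
A minimal sketch of the heartbeat idea behind the active-user fix above: each authenticated request refreshes a `last_active_at` timestamp, and counting becomes a range query, so stale sessions age out without any disconnect event. This assumes a `last_active_at` column on the user table; the function names are illustrative:

```python
import time

import sqlalchemy as sa

ACTIVE_WINDOW_SECONDS = 3 * 60  # "active within the last three minutes"


def touch_user(conn, user_table, user_id: str) -> None:
    # Heartbeat: refresh the timestamp on each authenticated request
    conn.execute(
        user_table.update()
        .where(user_table.c.id == user_id)
        .values(last_active_at=int(time.time()))
    )


def get_active_user_count(conn, user_table) -> int:
    # Stale sessions simply fall out of the window; no cleanup pass needed
    cutoff = int(time.time()) - ACTIVE_WINDOW_SECONDS
    return conn.execute(
        sa.select(sa.func.count()).where(user_table.c.last_active_at >= cutoff)
    ).scalar_one()
```
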
### Changed
- ⚠️ **IMPORTANT for Multi-Instance Deployments** — This release includes database schema changes; multi-worker, multi-server, or load-balanced deployments must update all instances simultaneously rather than performing rolling updates, as running mixed versions will cause application failures due to schema incompatibility between old and new instances.
- 👮 Channel creation is now restricted to administrators only, with the channel add button hidden for regular users to maintain organizational control over communication channels. [Commit](https://github.com/open-webui/open-webui/commit/421aba7cd7cd708168b1f2565026c74525a67905)
- ➖ The active user count indicator was removed from the bottom-left user menu in the sidebar to streamline the interface. [Commit](https://github.com/open-webui/open-webui/commit/848f3fd4d86ca66656e0ff0335773945af8d7d8d)
- 🗂️ The user table was restructured with API keys migrated to a dedicated table supporting future multi-key functionality, OAuth data storage converted to a JSON structure enabling multiple identity providers per user account, and internal column types optimized from TEXT to JSON for the "info" and "settings" fields, with automatic migration preserving all existing data and associations. [#19573](https://github.com/open-webui/open-webui/pull/19573)
- 🔄 The knowledge base API was restructured to support the new file relationship model.

## [0.6.40] - 2025-11-25

### Fixed

@@ -1,35 +0,0 @@

### Installing Both Ollama and Open WebUI Using Kustomize

For a CPU-only pod:

```bash
kubectl apply -f ./kubernetes/manifest/base
```

For a GPU-enabled pod:

```bash
kubectl apply -k ./kubernetes/manifest
```

### Installing Both Ollama and Open WebUI Using Helm

Package the Helm chart first:

```bash
helm package ./kubernetes/helm/
```

For a CPU-only pod:

```bash
helm install ollama-webui ./ollama-webui-*.tgz
```

For a GPU-enabled pod:

```bash
helm install ollama-webui ./ollama-webui-*.tgz --set ollama.resources.limits.nvidia.com/gpu="1"
```

Check the `kubernetes/helm/values.yaml` file to see which parameters are available for customization.
LICENSE: 2 changes

@@ -1,4 +1,4 @@
-Copyright (c) 2023-2025 Timothy Jaeryang Baek (Open WebUI)
+Copyright (c) 2023- Open WebUI Inc. [Created by Timothy Jaeryang Baek]
 All rights reserved.
 
 Redistribution and use in source and binary forms, with or without
@@ -583,14 +583,16 @@ OAUTH_ROLES_CLAIM = PersistentConfig(
     os.environ.get("OAUTH_ROLES_CLAIM", "roles"),
 )
 
-SEP = os.environ.get("OAUTH_ROLES_SEPARATOR", ",")
+OAUTH_ROLES_SEPARATOR = os.environ.get("OAUTH_ROLES_SEPARATOR", ",")
 
 OAUTH_ALLOWED_ROLES = PersistentConfig(
     "OAUTH_ALLOWED_ROLES",
     "oauth.allowed_roles",
     [
         role.strip()
-        for role in os.environ.get("OAUTH_ALLOWED_ROLES", f"user{SEP}admin").split(SEP)
+        for role in os.environ.get(
+            "OAUTH_ALLOWED_ROLES", f"user{OAUTH_ROLES_SEPARATOR}admin"
+        ).split(OAUTH_ROLES_SEPARATOR)
         if role
     ],
 )
@@ -600,7 +602,9 @@ OAUTH_ADMIN_ROLES = PersistentConfig(
     "oauth.admin_roles",
     [
         role.strip()
-        for role in os.environ.get("OAUTH_ADMIN_ROLES", "admin").split(SEP)
+        for role in os.environ.get("OAUTH_ADMIN_ROLES", "admin").split(
+            OAUTH_ROLES_SEPARATOR
+        )
         if role
     ],
 )
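
For illustration, the two hunks above make the role separator configurable end to end. With hypothetical environment values (not project defaults), the parsing behaves as follows:

```python
import os

# Hypothetical deployment settings
os.environ["OAUTH_ROLES_SEPARATOR"] = ";"
os.environ["OAUTH_ALLOWED_ROLES"] = "user;admin;analyst"

OAUTH_ROLES_SEPARATOR = os.environ.get("OAUTH_ROLES_SEPARATOR", ",")

# Same comprehension as the diff above
allowed_roles = [
    role.strip()
    for role in os.environ.get(
        "OAUTH_ALLOWED_ROLES", f"user{OAUTH_ROLES_SEPARATOR}admin"
    ).split(OAUTH_ROLES_SEPARATOR)
    if role
]
print(allowed_roles)  # ['user', 'admin', 'analyst']
```
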
@@ -625,6 +629,12 @@ OAUTH_ACCESS_TOKEN_REQUEST_INCLUDE_CLIENT_ID = (
     == "true"
 )
 
+OAUTH_AUDIENCE = PersistentConfig(
+    "OAUTH_AUDIENCE",
+    "oauth.audience",
+    os.environ.get("OAUTH_AUDIENCE", ""),
+)
+
 
 def load_oauth_providers():
     OAUTH_PROVIDERS.clear()
@@ -1443,10 +1453,18 @@ USER_PERMISSIONS_FEATURES_CODE_INTERPRETER = (
     == "true"
 )
 
+USER_PERMISSIONS_FEATURES_FOLDERS = (
+    os.environ.get("USER_PERMISSIONS_FEATURES_FOLDERS", "True").lower() == "true"
+)
+
 USER_PERMISSIONS_FEATURES_NOTES = (
     os.environ.get("USER_PERMISSIONS_FEATURES_NOTES", "True").lower() == "true"
 )
 
+USER_PERMISSIONS_FEATURES_CHANNELS = (
+    os.environ.get("USER_PERMISSIONS_FEATURES_CHANNELS", "True").lower() == "true"
+)
+
 USER_PERMISSIONS_FEATURES_API_KEYS = (
     os.environ.get("USER_PERMISSIONS_FEATURES_API_KEYS", "False").lower() == "true"
 )
@@ -1499,12 +1517,16 @@ DEFAULT_USER_PERMISSIONS = {
         "temporary_enforced": USER_PERMISSIONS_CHAT_TEMPORARY_ENFORCED,
     },
     "features": {
+        # General features
         "api_keys": USER_PERMISSIONS_FEATURES_API_KEYS,
+        "notes": USER_PERMISSIONS_FEATURES_NOTES,
+        "folders": USER_PERMISSIONS_FEATURES_FOLDERS,
+        "channels": USER_PERMISSIONS_FEATURES_CHANNELS,
         "direct_tool_servers": USER_PERMISSIONS_FEATURES_DIRECT_TOOL_SERVERS,
+        # Chat features
         "web_search": USER_PERMISSIONS_FEATURES_WEB_SEARCH,
         "image_generation": USER_PERMISSIONS_FEATURES_IMAGE_GENERATION,
         "code_interpreter": USER_PERMISSIONS_FEATURES_CODE_INTERPRETER,
-        "notes": USER_PERMISSIONS_FEATURES_NOTES,
     },
 }
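
Group permission toggles from the admin panel override these env-driven defaults per user. A simplified model of that resolution (the actual merge logic lives elsewhere in the codebase, so treat this as a sketch, not the implementation):

```python
DEFAULT_FEATURES = {
    "folders": True,   # USER_PERMISSIONS_FEATURES_FOLDERS
    "channels": True,  # USER_PERMISSIONS_FEATURES_CHANNELS
    "notes": True,     # USER_PERMISSIONS_FEATURES_NOTES
}


def resolve_features(defaults: dict, group_overrides: list) -> dict:
    # Start from the env-driven defaults, then apply each group's toggles
    resolved = dict(defaults)
    for overrides in group_overrides:
        resolved.update(overrides)
    return resolved


# A user whose group disables channels but leaves the rest alone
print(resolve_features(DEFAULT_FEATURES, [{"channels": False}]))
# {'folders': True, 'channels': False, 'notes': True}
```
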
@@ -1514,6 +1536,12 @@ USER_PERMISSIONS = PersistentConfig(
     DEFAULT_USER_PERMISSIONS,
 )
 
+ENABLE_FOLDERS = PersistentConfig(
+    "ENABLE_FOLDERS",
+    "folders.enable",
+    os.environ.get("ENABLE_FOLDERS", "True").lower() == "true",
+)
+
 ENABLE_CHANNELS = PersistentConfig(
     "ENABLE_CHANNELS",
     "channels.enable",
@@ -2568,6 +2596,12 @@ DOCUMENT_INTELLIGENCE_KEY = PersistentConfig(
     os.getenv("DOCUMENT_INTELLIGENCE_KEY", ""),
 )
 
+DOCUMENT_INTELLIGENCE_MODEL = PersistentConfig(
+    "DOCUMENT_INTELLIGENCE_MODEL",
+    "rag.document_intelligence_model",
+    os.getenv("DOCUMENT_INTELLIGENCE_MODEL", "prebuilt-layout"),
+)
+
 MISTRAL_OCR_API_BASE_URL = PersistentConfig(
     "MISTRAL_OCR_API_BASE_URL",
     "rag.MISTRAL_OCR_API_BASE_URL",
@@ -2966,6 +3000,12 @@ WEB_LOADER_CONCURRENT_REQUESTS = PersistentConfig(
     int(os.getenv("WEB_LOADER_CONCURRENT_REQUESTS", "10")),
 )
 
+WEB_LOADER_TIMEOUT = PersistentConfig(
+    "WEB_LOADER_TIMEOUT",
+    "rag.web.loader.timeout",
+    os.getenv("WEB_LOADER_TIMEOUT", ""),
+)
+
 
 ENABLE_WEB_LOADER_SSL_VERIFICATION = PersistentConfig(
     "ENABLE_WEB_LOADER_SSL_VERIFICATION",
@@ -395,6 +395,13 @@ try:
 except ValueError:
     REDIS_SENTINEL_MAX_RETRY_COUNT = 2
 
+
+REDIS_SOCKET_CONNECT_TIMEOUT = os.environ.get("REDIS_SOCKET_CONNECT_TIMEOUT", "")
+try:
+    REDIS_SOCKET_CONNECT_TIMEOUT = float(REDIS_SOCKET_CONNECT_TIMEOUT)
+except ValueError:
+    REDIS_SOCKET_CONNECT_TIMEOUT = None
+
 ####################################
 # UVICORN WORKERS
 ####################################
@@ -620,9 +627,16 @@ ENABLE_WEBSOCKET_SUPPORT = (
 WEBSOCKET_MANAGER = os.environ.get("WEBSOCKET_MANAGER", "")
 
 WEBSOCKET_REDIS_OPTIONS = os.environ.get("WEBSOCKET_REDIS_OPTIONS", "")
 
+
 if WEBSOCKET_REDIS_OPTIONS == "":
-    log.debug("No WEBSOCKET_REDIS_OPTIONS provided, defaulting to None")
-    WEBSOCKET_REDIS_OPTIONS = None
+    if REDIS_SOCKET_CONNECT_TIMEOUT:
+        WEBSOCKET_REDIS_OPTIONS = {
+            "socket_connect_timeout": REDIS_SOCKET_CONNECT_TIMEOUT
+        }
+    else:
+        log.debug("No WEBSOCKET_REDIS_OPTIONS provided, defaulting to None")
+        WEBSOCKET_REDIS_OPTIONS = None
 else:
     try:
         WEBSOCKET_REDIS_OPTIONS = json.loads(WEBSOCKET_REDIS_OPTIONS)
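
The effect of this fallback is that a bare REDIS_SOCKET_CONNECT_TIMEOUT now reaches the websocket Redis client even when no explicit options JSON is provided. With hypothetical environment values:

```python
import json
import os

# Hypothetical deployment settings
os.environ["REDIS_SOCKET_CONNECT_TIMEOUT"] = "5"
os.environ["WEBSOCKET_REDIS_OPTIONS"] = ""  # not set explicitly

try:
    timeout = float(os.environ.get("REDIS_SOCKET_CONNECT_TIMEOUT", ""))
except ValueError:
    timeout = None

raw = os.environ.get("WEBSOCKET_REDIS_OPTIONS", "")
if raw:
    options = json.loads(raw)  # explicit JSON options win
elif timeout:
    options = {"socket_connect_timeout": timeout}  # new fallback path
else:
    options = None

print(options)  # {'socket_connect_timeout': 5.0}
```
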
@@ -61,11 +61,11 @@ from open_webui.utils import logger
 from open_webui.utils.audit import AuditLevel, AuditLoggingMiddleware
 from open_webui.utils.logger import start_logger
 from open_webui.socket.main import (
+    MODELS,
     app as socket_app,
     periodic_usage_pool_cleanup,
     get_event_emitter,
     get_models_in_use,
-    get_active_user_ids,
 )
 from open_webui.routers import (
     audio,
@@ -208,6 +208,7 @@ from open_webui.config import (
     FIRECRAWL_API_KEY,
     WEB_LOADER_ENGINE,
     WEB_LOADER_CONCURRENT_REQUESTS,
+    WEB_LOADER_TIMEOUT,
     WHISPER_MODEL,
     WHISPER_VAD_FILTER,
     WHISPER_LANGUAGE,
@@ -273,6 +274,7 @@ from open_webui.config import (
     DOCLING_PARAMS,
     DOCUMENT_INTELLIGENCE_ENDPOINT,
     DOCUMENT_INTELLIGENCE_KEY,
+    DOCUMENT_INTELLIGENCE_MODEL,
     MISTRAL_OCR_API_BASE_URL,
     MISTRAL_OCR_API_KEY,
     RAG_TEXT_SPLITTER,
@@ -352,6 +354,7 @@ from open_webui.config import (
     ENABLE_API_KEYS,
     ENABLE_API_KEYS_ENDPOINT_RESTRICTIONS,
     API_KEYS_ALLOWED_ENDPOINTS,
+    ENABLE_FOLDERS,
     ENABLE_CHANNELS,
     ENABLE_NOTES,
     ENABLE_COMMUNITY_SHARING,
@@ -767,6 +770,7 @@ app.state.config.WEBHOOK_URL = WEBHOOK_URL
 app.state.config.BANNERS = WEBUI_BANNERS
 
 
+app.state.config.ENABLE_FOLDERS = ENABLE_FOLDERS
 app.state.config.ENABLE_CHANNELS = ENABLE_CHANNELS
 app.state.config.ENABLE_NOTES = ENABLE_NOTES
 app.state.config.ENABLE_COMMUNITY_SHARING = ENABLE_COMMUNITY_SHARING
@@ -869,6 +873,7 @@ app.state.config.DOCLING_API_KEY = DOCLING_API_KEY
 app.state.config.DOCLING_PARAMS = DOCLING_PARAMS
 app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT = DOCUMENT_INTELLIGENCE_ENDPOINT
 app.state.config.DOCUMENT_INTELLIGENCE_KEY = DOCUMENT_INTELLIGENCE_KEY
+app.state.config.DOCUMENT_INTELLIGENCE_MODEL = DOCUMENT_INTELLIGENCE_MODEL
 app.state.config.MISTRAL_OCR_API_BASE_URL = MISTRAL_OCR_API_BASE_URL
 app.state.config.MISTRAL_OCR_API_KEY = MISTRAL_OCR_API_KEY
 app.state.config.MINERU_API_MODE = MINERU_API_MODE
@@ -918,6 +923,7 @@ app.state.config.WEB_SEARCH_CONCURRENT_REQUESTS = WEB_SEARCH_CONCURRENT_REQUESTS
 
 app.state.config.WEB_LOADER_ENGINE = WEB_LOADER_ENGINE
 app.state.config.WEB_LOADER_CONCURRENT_REQUESTS = WEB_LOADER_CONCURRENT_REQUESTS
+app.state.config.WEB_LOADER_TIMEOUT = WEB_LOADER_TIMEOUT
 
 app.state.config.WEB_SEARCH_TRUST_ENV = WEB_SEARCH_TRUST_ENV
 app.state.config.BYPASS_WEB_SEARCH_EMBEDDING_AND_RETRIEVAL = (
|
||||||
|
|
||||||
try:
|
try:
|
||||||
app.state.ef = get_ef(
|
app.state.ef = get_ef(
|
||||||
app.state.config.RAG_EMBEDDING_ENGINE,
|
app.state.config.RAG_EMBEDDING_ENGINE, app.state.config.RAG_EMBEDDING_MODEL
|
||||||
app.state.config.RAG_EMBEDDING_MODEL,
|
|
||||||
RAG_EMBEDDING_MODEL_AUTO_UPDATE,
|
|
||||||
)
|
)
|
||||||
if (
|
if (
|
||||||
app.state.config.ENABLE_RAG_HYBRID_SEARCH
|
app.state.config.ENABLE_RAG_HYBRID_SEARCH
|
||||||
|
|
@@ -993,7 +997,6 @@ try:
             app.state.config.RAG_RERANKING_MODEL,
             app.state.config.RAG_EXTERNAL_RERANKER_URL,
             app.state.config.RAG_EXTERNAL_RERANKER_API_KEY,
-            RAG_RERANKING_MODEL_AUTO_UPDATE,
         )
     else:
         app.state.rf = None
@@ -1030,6 +1033,7 @@ app.state.EMBEDDING_FUNCTION = get_embedding_function(
         if app.state.config.RAG_EMBEDDING_ENGINE == "azure_openai"
         else None
     ),
+    enable_async=app.state.config.ENABLE_ASYNC_EMBEDDING,
 )
 
 app.state.RERANKING_FUNCTION = get_reranking_function(
@@ -1215,7 +1219,7 @@ app.state.config.VOICE_MODE_PROMPT_TEMPLATE = VOICE_MODE_PROMPT_TEMPLATE
 #
 ########################################
 
-app.state.MODELS = {}
+app.state.MODELS = MODELS
 
 # Add the middleware to the app
 if ENABLE_COMPRESSION_MIDDLEWARE:
@@ -1575,6 +1579,7 @@ async def chat_completion(
         "user_id": user.id,
         "chat_id": form_data.pop("chat_id", None),
         "message_id": form_data.pop("id", None),
+        "parent_message_id": form_data.pop("parent_id", None),
         "session_id": form_data.pop("session_id", None),
         "filter_ids": form_data.pop("filter_ids", []),
         "tool_ids": form_data.get("tool_ids", None),
@@ -1631,6 +1636,7 @@ async def chat_completion(
             metadata["chat_id"],
             metadata["message_id"],
             {
+                "parentId": metadata.get("parent_message_id", None),
                 "model": model_id,
             },
         )
@@ -1663,6 +1669,7 @@ async def chat_completion(
             metadata["chat_id"],
             metadata["message_id"],
             {
+                "parentId": metadata.get("parent_message_id", None),
                 "error": {"content": str(e)},
             },
         )
@@ -1842,6 +1849,7 @@ async def get_app_config(request: Request):
             **(
                 {
                     "enable_direct_connections": app.state.config.ENABLE_DIRECT_CONNECTIONS,
+                    "enable_folders": app.state.config.ENABLE_FOLDERS,
                     "enable_channels": app.state.config.ENABLE_CHANNELS,
                     "enable_notes": app.state.config.ENABLE_NOTES,
                     "enable_web_search": app.state.config.ENABLE_WEB_SEARCH,
@@ -2014,7 +2022,10 @@ async def get_current_usage(user=Depends(get_verified_user)):
     This is an experimental endpoint and subject to change.
     """
     try:
-        return {"model_ids": get_models_in_use(), "user_ids": get_active_user_ids()}
+        return {
+            "model_ids": get_models_in_use(),
+            "user_count": Users.get_active_user_count(),
+        }
     except Exception as e:
         log.error(f"Error getting usage statistics: {e}")
         raise HTTPException(status_code=500, detail="Internal Server Error")
@@ -2077,7 +2088,7 @@ except Exception as e:
     )
 
 
-    async def register_client(self, request, client_id: str) -> bool:
+    async def register_client(request, client_id: str) -> bool:
         server_type, server_id = client_id.split(":", 1)
 
         connection = None
@ -0,0 +1,103 @@
|
||||||
|
"""Update messages and channel member table
|
||||||
|
|
||||||
|
Revision ID: 2f1211949ecc
|
||||||
|
Revises: 37f288994c47
|
||||||
|
Create Date: 2025-11-27 03:07:56.200231
|
||||||
|
|
||||||
|
"""
|
||||||
|
|
||||||
|
from typing import Sequence, Union
|
||||||
|
|
||||||
|
from alembic import op
|
||||||
|
import sqlalchemy as sa
|
||||||
|
import open_webui.internal.db
|
||||||
|
|
||||||
|
|
||||||
|
# revision identifiers, used by Alembic.
|
||||||
|
revision: str = "2f1211949ecc"
|
||||||
|
down_revision: Union[str, None] = "37f288994c47"
|
||||||
|
branch_labels: Union[str, Sequence[str], None] = None
|
||||||
|
depends_on: Union[str, Sequence[str], None] = None
|
||||||
|
|
||||||
|
|
||||||
|
def upgrade() -> None:
|
||||||
|
# New columns to be added to channel_member table
|
||||||
|
op.add_column("channel_member", sa.Column("status", sa.Text(), nullable=True))
|
||||||
|
op.add_column(
|
||||||
|
"channel_member",
|
||||||
|
sa.Column(
|
||||||
|
"is_active",
|
||||||
|
sa.Boolean(),
|
||||||
|
nullable=False,
|
||||||
|
default=True,
|
||||||
|
server_default=sa.sql.expression.true(),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
op.add_column(
|
||||||
|
"channel_member",
|
||||||
|
sa.Column(
|
||||||
|
"is_channel_muted",
|
||||||
|
sa.Boolean(),
|
||||||
|
nullable=False,
|
||||||
|
default=False,
|
||||||
|
server_default=sa.sql.expression.false(),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
op.add_column(
|
||||||
|
"channel_member",
|
||||||
|
sa.Column(
|
||||||
|
"is_channel_pinned",
|
||||||
|
sa.Boolean(),
|
||||||
|
nullable=False,
|
||||||
|
default=False,
|
||||||
|
server_default=sa.sql.expression.false(),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
op.add_column("channel_member", sa.Column("data", sa.JSON(), nullable=True))
|
||||||
|
op.add_column("channel_member", sa.Column("meta", sa.JSON(), nullable=True))
|
||||||
|
|
||||||
|
op.add_column(
|
||||||
|
"channel_member", sa.Column("joined_at", sa.BigInteger(), nullable=False)
|
||||||
|
)
|
||||||
|
op.add_column(
|
||||||
|
"channel_member", sa.Column("left_at", sa.BigInteger(), nullable=True)
|
||||||
|
)
|
||||||
|
|
||||||
|
op.add_column(
|
||||||
|
"channel_member", sa.Column("last_read_at", sa.BigInteger(), nullable=True)
|
||||||
|
)
|
||||||
|
|
||||||
|
op.add_column(
|
||||||
|
"channel_member", sa.Column("updated_at", sa.BigInteger(), nullable=True)
|
||||||
|
)
|
||||||
|
|
||||||
|
# New columns to be added to message table
|
||||||
|
op.add_column(
|
||||||
|
"message",
|
||||||
|
sa.Column(
|
||||||
|
"is_pinned",
|
||||||
|
sa.Boolean(),
|
||||||
|
nullable=False,
|
||||||
|
default=False,
|
||||||
|
server_default=sa.sql.expression.false(),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
op.add_column("message", sa.Column("pinned_at", sa.BigInteger(), nullable=True))
|
||||||
|
op.add_column("message", sa.Column("pinned_by", sa.Text(), nullable=True))
|
||||||
|
|
||||||
|
|
||||||
|
def downgrade() -> None:
|
||||||
|
op.drop_column("channel_member", "updated_at")
|
||||||
|
op.drop_column("channel_member", "last_read_at")
|
||||||
|
|
||||||
|
op.drop_column("channel_member", "meta")
|
||||||
|
op.drop_column("channel_member", "data")
|
||||||
|
|
||||||
|
op.drop_column("channel_member", "is_channel_pinned")
|
||||||
|
op.drop_column("channel_member", "is_channel_muted")
|
||||||
|
|
||||||
|
op.drop_column("message", "pinned_by")
|
||||||
|
op.drop_column("message", "pinned_at")
|
||||||
|
op.drop_column("message", "is_pinned")
|
||||||
|
|
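
With the new `is_pinned`, `pinned_at`, and `pinned_by` columns in place, fetching a channel's pinned messages reduces to a filtered query. A hedged sketch in SQLAlchemy Core against the migrated schema, assuming the message table's existing `channel_id` column; the application's real queries go through its own models:

```python
import sqlalchemy as sa

metadata = sa.MetaData()
message = sa.Table(
    "message",
    metadata,
    sa.Column("id", sa.Text()),
    sa.Column("channel_id", sa.Text()),  # assumed existing column
    sa.Column("is_pinned", sa.Boolean()),
    sa.Column("pinned_at", sa.BigInteger()),
    sa.Column("pinned_by", sa.Text()),
)


def get_pinned_messages(conn, channel_id: str):
    # Newest pins first, as a "pinned messages" modal would show them
    return conn.execute(
        sa.select(message)
        .where(message.c.channel_id == channel_id, message.c.is_pinned.is_(True))
        .order_by(message.c.pinned_at.desc())
    ).fetchall()
```
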
@@ -20,18 +20,46 @@ depends_on: Union[str, Sequence[str], None] = None
 
 
 def upgrade() -> None:
+    # Ensure the 'id' column in the 'user' table is unique and a primary key (needed for the ForeignKey constraint)
+    inspector = sa.inspect(op.get_bind())
+    columns = inspector.get_columns("user")
+
+    pk_columns = inspector.get_pk_constraint("user")["constrained_columns"]
+    id_column = next((col for col in columns if col["name"] == "id"), None)
+
+    if id_column and not id_column.get("unique", False):
+        unique_constraints = inspector.get_unique_constraints("user")
+        unique_columns = {tuple(u["column_names"]) for u in unique_constraints}
+
+        with op.batch_alter_table("user") as batch_op:
+            # If the primary key is wrong, drop it
+            if pk_columns and pk_columns != ["id"]:
+                batch_op.drop_constraint(
+                    inspector.get_pk_constraint("user")["name"], type_="primary"
+                )
+
+            # Add the unique constraint if missing
+            if ("id",) not in unique_columns:
+                batch_op.create_unique_constraint("uq_user_id", ["id"])
+
+            # Re-create the correct primary key
+            batch_op.create_primary_key("pk_user_id", ["id"])
+
     # Create oauth_session table
     op.create_table(
         "oauth_session",
-        sa.Column("id", sa.Text(), nullable=False),
-        sa.Column("user_id", sa.Text(), nullable=False),
+        sa.Column("id", sa.Text(), primary_key=True, nullable=False, unique=True),
+        sa.Column(
+            "user_id",
+            sa.Text(),
+            sa.ForeignKey("user.id", ondelete="CASCADE"),
+            nullable=False,
+        ),
         sa.Column("provider", sa.Text(), nullable=False),
         sa.Column("token", sa.Text(), nullable=False),
         sa.Column("expires_at", sa.BigInteger(), nullable=False),
         sa.Column("created_at", sa.BigInteger(), nullable=False),
         sa.Column("updated_at", sa.BigInteger(), nullable=False),
-        sa.PrimaryKeyConstraint("id"),
-        sa.ForeignKeyConstraint(["user_id"], ["user.id"], ondelete="CASCADE"),
     )
 
     # Create indexes for better performance
@ -0,0 +1,169 @@
|
||||||
|
"""Add knowledge_file table
|
||||||
|
|
||||||
|
Revision ID: 3e0e00844bb0
|
||||||
|
Revises: 90ef40d4714e
|
||||||
|
Create Date: 2025-12-02 06:54:19.401334
|
||||||
|
|
||||||
|
"""
|
||||||
|
|
||||||
|
from typing import Sequence, Union
|
||||||
|
|
||||||
|
from alembic import op
|
||||||
|
import sqlalchemy as sa
|
||||||
|
from sqlalchemy import inspect
|
||||||
|
import open_webui.internal.db
|
||||||
|
|
||||||
|
import time
|
||||||
|
import json
|
||||||
|
import uuid
|
||||||
|
|
||||||
|
# revision identifiers, used by Alembic.
|
||||||
|
revision: str = "3e0e00844bb0"
|
||||||
|
down_revision: Union[str, None] = "90ef40d4714e"
|
||||||
|
branch_labels: Union[str, Sequence[str], None] = None
|
||||||
|
depends_on: Union[str, Sequence[str], None] = None
|
||||||
|
|
||||||
|
|
||||||
|
def upgrade() -> None:
|
||||||
|
op.create_table(
|
||||||
|
"knowledge_file",
|
||||||
|
sa.Column("id", sa.Text(), primary_key=True),
|
||||||
|
sa.Column("user_id", sa.Text(), nullable=False),
|
||||||
|
sa.Column(
|
||||||
|
"knowledge_id",
|
||||||
|
sa.Text(),
|
||||||
|
sa.ForeignKey("knowledge.id", ondelete="CASCADE"),
|
||||||
|
nullable=False,
|
||||||
|
),
|
||||||
|
sa.Column(
|
||||||
|
"file_id",
|
||||||
|
sa.Text(),
|
||||||
|
sa.ForeignKey("file.id", ondelete="CASCADE"),
|
||||||
|
nullable=False,
|
||||||
|
),
|
||||||
|
sa.Column("created_at", sa.BigInteger(), nullable=False),
|
||||||
|
sa.Column("updated_at", sa.BigInteger(), nullable=False),
|
||||||
|
# indexes
|
||||||
|
sa.Index("ix_knowledge_file_knowledge_id", "knowledge_id"),
|
||||||
|
sa.Index("ix_knowledge_file_file_id", "file_id"),
|
||||||
|
sa.Index("ix_knowledge_file_user_id", "user_id"),
|
||||||
|
# unique constraints
|
||||||
|
sa.UniqueConstraint(
|
||||||
|
"knowledge_id", "file_id", name="uq_knowledge_file_knowledge_file"
|
||||||
|
), # prevent duplicate entries
|
||||||
|
)
|
||||||
|
|
||||||
|
connection = op.get_bind()
|
||||||
|
|
||||||
|
# 2. Read existing group with user_ids JSON column
|
||||||
|
knowledge_table = sa.Table(
|
||||||
|
"knowledge",
|
||||||
|
sa.MetaData(),
|
||||||
|
sa.Column("id", sa.Text()),
|
||||||
|
sa.Column("user_id", sa.Text()),
|
||||||
|
sa.Column("data", sa.JSON()), # JSON stored as text in SQLite + PG
|
||||||
|
)
|
||||||
|
|
||||||
|
results = connection.execute(
|
||||||
|
sa.select(
|
||||||
|
knowledge_table.c.id, knowledge_table.c.user_id, knowledge_table.c.data
|
||||||
|
)
|
||||||
|
).fetchall()
|
||||||
|
|
||||||
|
# 3. Insert members into group_member table
|
||||||
|
kf_table = sa.Table(
|
||||||
|
"knowledge_file",
|
||||||
|
sa.MetaData(),
|
||||||
|
sa.Column("id", sa.Text()),
|
||||||
|
sa.Column("user_id", sa.Text()),
|
||||||
|
sa.Column("knowledge_id", sa.Text()),
|
||||||
|
sa.Column("file_id", sa.Text()),
|
||||||
|
sa.Column("created_at", sa.BigInteger()),
|
||||||
|
sa.Column("updated_at", sa.BigInteger()),
|
||||||
|
)
|
||||||
|
|
||||||
|
file_table = sa.Table(
|
||||||
|
"file",
|
||||||
|
sa.MetaData(),
|
||||||
|
sa.Column("id", sa.Text()),
|
||||||
|
)
|
||||||
|
|
||||||
|
now = int(time.time())
|
||||||
|
for knowledge_id, user_id, data in results:
|
||||||
|
if not data:
|
||||||
|
continue
|
||||||
|
|
||||||
|
if isinstance(data, str):
|
||||||
|
try:
|
||||||
|
data = json.loads(data)
|
||||||
|
except Exception:
|
||||||
|
continue # skip invalid JSON
|
||||||
|
|
||||||
|
if not isinstance(data, dict):
|
||||||
|
continue
|
||||||
|
|
||||||
|
file_ids = data.get("file_ids", [])
|
||||||
|
|
||||||
|
for file_id in file_ids:
|
||||||
|
file_exists = connection.execute(
|
||||||
|
sa.select(file_table.c.id).where(file_table.c.id == file_id)
|
||||||
|
).fetchone()
|
||||||
|
|
||||||
|
if not file_exists:
|
||||||
|
continue # skip non-existing files
|
||||||
|
|
||||||
|
row = {
|
||||||
|
"id": str(uuid.uuid4()),
|
||||||
|
"user_id": user_id,
|
||||||
|
"knowledge_id": knowledge_id,
|
||||||
|
"file_id": file_id,
|
||||||
|
"created_at": now,
|
||||||
|
"updated_at": now,
|
||||||
|
}
|
||||||
|
connection.execute(kf_table.insert().values(**row))
|
||||||
|
|
||||||
|
with op.batch_alter_table("knowledge") as batch:
|
||||||
|
batch.drop_column("data")
|
||||||
|
|
||||||
|
|
||||||
|
def downgrade() -> None:
|
||||||
|
# 1. Add back the old data column
|
||||||
|
op.add_column("knowledge", sa.Column("data", sa.JSON(), nullable=True))
|
||||||
|
|
||||||
|
connection = op.get_bind()
|
||||||
|
|
||||||
|
# 2. Read knowledge_file entries and reconstruct data JSON
|
||||||
|
knowledge_table = sa.Table(
|
||||||
|
"knowledge",
|
||||||
|
sa.MetaData(),
|
||||||
|
sa.Column("id", sa.Text()),
|
||||||
|
sa.Column("data", sa.JSON()),
|
||||||
|
)
|
||||||
|
|
||||||
|
kf_table = sa.Table(
|
||||||
|
"knowledge_file",
|
||||||
|
sa.MetaData(),
|
||||||
|
sa.Column("id", sa.Text()),
|
||||||
|
sa.Column("knowledge_id", sa.Text()),
|
||||||
|
sa.Column("file_id", sa.Text()),
|
||||||
|
)
|
||||||
|
|
||||||
|
results = connection.execute(sa.select(knowledge_table.c.id)).fetchall()
|
||||||
|
|
||||||
|
for (knowledge_id,) in results:
|
||||||
|
file_ids = connection.execute(
|
||||||
|
sa.select(kf_table.c.file_id).where(kf_table.c.knowledge_id == knowledge_id)
|
||||||
|
).fetchall()
|
||||||
|
|
||||||
|
file_ids_list = [fid for (fid,) in file_ids]
|
||||||
|
|
||||||
|
data_json = {"file_ids": file_ids_list}
|
||||||
|
|
||||||
|
connection.execute(
|
||||||
|
knowledge_table.update()
|
||||||
|
.where(knowledge_table.c.id == knowledge_id)
|
||||||
|
.values(data=data_json)
|
||||||
|
)
|
||||||
|
|
||||||
|
# 3. Drop the knowledge_file table
|
||||||
|
op.drop_table("knowledge_file")
|
||||||
|
|
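
The pagination win called out in the changelog follows directly from this table: listing a knowledge base's files becomes a LIMIT/OFFSET query over rows instead of loading one large JSON array. An illustrative sketch against the migrated schema (not the application's actual query):

```python
import sqlalchemy as sa

metadata = sa.MetaData()
knowledge_file = sa.Table(
    "knowledge_file",
    metadata,
    sa.Column("id", sa.Text()),
    sa.Column("knowledge_id", sa.Text()),
    sa.Column("file_id", sa.Text()),
    sa.Column("created_at", sa.BigInteger()),
)


def list_file_ids(conn, knowledge_id: str, page: int = 1, page_size: int = 50):
    # Reads only one page of relationships per request
    return [
        row.file_id
        for row in conn.execute(
            sa.select(knowledge_file.c.file_id)
            .where(knowledge_file.c.knowledge_id == knowledge_id)
            .order_by(knowledge_file.c.created_at.desc())
            .limit(page_size)
            .offset((page - 1) * page_size)
        )
    ]
```
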
@@ -0,0 +1,81 @@
"""Update channel and channel members table

Revision ID: 90ef40d4714e
Revises: b10670c03dd5
Create Date: 2025-11-30 06:33:38.790341

"""

from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
import open_webui.internal.db


# revision identifiers, used by Alembic.
revision: str = "90ef40d4714e"
down_revision: Union[str, None] = "b10670c03dd5"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    # Update 'channel' table
    op.add_column("channel", sa.Column("is_private", sa.Boolean(), nullable=True))

    op.add_column("channel", sa.Column("archived_at", sa.BigInteger(), nullable=True))
    op.add_column("channel", sa.Column("archived_by", sa.Text(), nullable=True))

    op.add_column("channel", sa.Column("deleted_at", sa.BigInteger(), nullable=True))
    op.add_column("channel", sa.Column("deleted_by", sa.Text(), nullable=True))

    op.add_column("channel", sa.Column("updated_by", sa.Text(), nullable=True))

    # Update 'channel_member' table
    op.add_column("channel_member", sa.Column("role", sa.Text(), nullable=True))
    op.add_column("channel_member", sa.Column("invited_by", sa.Text(), nullable=True))
    op.add_column(
        "channel_member", sa.Column("invited_at", sa.BigInteger(), nullable=True)
    )

    # Create 'channel_webhook' table
    op.create_table(
        "channel_webhook",
        sa.Column("id", sa.Text(), primary_key=True, unique=True, nullable=False),
        sa.Column("user_id", sa.Text(), nullable=False),
        sa.Column(
            "channel_id",
            sa.Text(),
            sa.ForeignKey("channel.id", ondelete="CASCADE"),
            nullable=False,
        ),
        sa.Column("name", sa.Text(), nullable=False),
        sa.Column("profile_image_url", sa.Text(), nullable=True),
        sa.Column("token", sa.Text(), nullable=False),
        sa.Column("last_used_at", sa.BigInteger(), nullable=True),
        sa.Column("created_at", sa.BigInteger(), nullable=False),
        sa.Column("updated_at", sa.BigInteger(), nullable=False),
    )


def downgrade() -> None:
    # Downgrade 'channel' table
    op.drop_column("channel", "is_private")
    op.drop_column("channel", "archived_at")
    op.drop_column("channel", "archived_by")
    op.drop_column("channel", "deleted_at")
    op.drop_column("channel", "deleted_by")
    op.drop_column("channel", "updated_by")

    # Downgrade 'channel_member' table
    op.drop_column("channel_member", "role")
    op.drop_column("channel_member", "invited_by")
    op.drop_column("channel_member", "invited_at")

    # Drop 'channel_webhook' table
    op.drop_table("channel_webhook")

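Editor's note (not part of the commit): a minimal sketch of exercising this revision programmatically with Alembic's command API, assuming a standard alembic.ini next to the migrations directory:

from alembic import command
from alembic.config import Config

cfg = Config("alembic.ini")  # assumed config path

command.upgrade(cfg, "90ef40d4714e")    # apply the channel/member/webhook changes
command.downgrade(cfg, "b10670c03dd5")  # roll back to the previous revision
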
@@ -0,0 +1,251 @@
"""Update user table

Revision ID: b10670c03dd5
Revises: 2f1211949ecc
Create Date: 2025-11-28 04:55:31.737538

"""

from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa

import open_webui.internal.db
import json
import time

# revision identifiers, used by Alembic.
revision: str = "b10670c03dd5"
down_revision: Union[str, None] = "2f1211949ecc"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def _drop_sqlite_indexes_for_column(table_name, column_name, conn):
    """
    SQLite requires manual removal of any indexes referencing a column
    before ALTER TABLE ... DROP COLUMN can succeed.
    """
    indexes = conn.execute(sa.text(f"PRAGMA index_list('{table_name}')")).fetchall()

    for idx in indexes:
        index_name = idx[1]  # index name
        # Get indexed columns
        idx_info = conn.execute(
            sa.text(f"PRAGMA index_info('{index_name}')")
        ).fetchall()

        indexed_cols = [row[2] for row in idx_info]  # column names
        if column_name in indexed_cols:
            conn.execute(sa.text(f"DROP INDEX IF EXISTS {index_name}"))


def _convert_column_to_json(table: str, column: str):
    conn = op.get_bind()
    dialect = conn.dialect.name

    # SQLite cannot ALTER COLUMN → must recreate the column
    if dialect == "sqlite":
        # 1. Add temporary column
        op.add_column(table, sa.Column(f"{column}_json", sa.JSON(), nullable=True))

        # 2. Load old data
        rows = conn.execute(sa.text(f'SELECT id, {column} FROM "{table}"')).fetchall()

        for row in rows:
            uid, raw = row
            if raw is None:
                parsed = None
            else:
                try:
                    parsed = json.loads(raw)
                except Exception:
                    parsed = None  # safe fallback behavior

            conn.execute(
                sa.text(f'UPDATE "{table}" SET {column}_json = :val WHERE id = :id'),
                {"val": json.dumps(parsed) if parsed else None, "id": uid},
            )

        # 3. Drop old TEXT column
        op.drop_column(table, column)

        # 4. Rename new JSON column → original name
        op.alter_column(table, f"{column}_json", new_column_name=column)

    else:
        # PostgreSQL supports a direct CAST
        op.alter_column(
            table,
            column,
            type_=sa.JSON(),
            postgresql_using=f"{column}::json",
        )


def _convert_column_to_text(table: str, column: str):
    conn = op.get_bind()
    dialect = conn.dialect.name

    if dialect == "sqlite":
        op.add_column(table, sa.Column(f"{column}_text", sa.Text(), nullable=True))

        rows = conn.execute(sa.text(f'SELECT id, {column} FROM "{table}"')).fetchall()

        for uid, raw in rows:
            conn.execute(
                sa.text(f'UPDATE "{table}" SET {column}_text = :val WHERE id = :id'),
                {"val": json.dumps(raw) if raw else None, "id": uid},
            )

        op.drop_column(table, column)
        op.alter_column(table, f"{column}_text", new_column_name=column)

    else:
        op.alter_column(
            table,
            column,
            type_=sa.Text(),
            postgresql_using=f"to_json({column})::text",
        )


def upgrade() -> None:
    op.add_column(
        "user", sa.Column("profile_banner_image_url", sa.Text(), nullable=True)
    )
    op.add_column("user", sa.Column("timezone", sa.String(), nullable=True))

    op.add_column("user", sa.Column("presence_state", sa.String(), nullable=True))
    op.add_column("user", sa.Column("status_emoji", sa.String(), nullable=True))
    op.add_column("user", sa.Column("status_message", sa.Text(), nullable=True))
    op.add_column(
        "user", sa.Column("status_expires_at", sa.BigInteger(), nullable=True)
    )

    op.add_column("user", sa.Column("oauth", sa.JSON(), nullable=True))

    # Convert info (TEXT/JSONField) → JSON
    _convert_column_to_json("user", "info")
    # Convert settings (TEXT/JSONField) → JSON
    _convert_column_to_json("user", "settings")

    op.create_table(
        "api_key",
        sa.Column("id", sa.Text(), primary_key=True, unique=True),
        sa.Column("user_id", sa.Text(), sa.ForeignKey("user.id", ondelete="CASCADE")),
        sa.Column("key", sa.Text(), unique=True, nullable=False),
        sa.Column("data", sa.JSON(), nullable=True),
        sa.Column("expires_at", sa.BigInteger(), nullable=True),
        sa.Column("last_used_at", sa.BigInteger(), nullable=True),
        sa.Column("created_at", sa.BigInteger(), nullable=False),
        sa.Column("updated_at", sa.BigInteger(), nullable=False),
    )

    conn = op.get_bind()
    users = conn.execute(
        sa.text('SELECT id, oauth_sub FROM "user" WHERE oauth_sub IS NOT NULL')
    ).fetchall()

    for uid, oauth_sub in users:
        if oauth_sub:
            # Supported formats:
            #   provider@sub
            #   plain sub (stored as {"oidc": {"sub": sub}})
            if "@" in oauth_sub:
                provider, sub = oauth_sub.split("@", 1)
            else:
                provider, sub = "oidc", oauth_sub

            oauth_json = json.dumps({provider: {"sub": sub}})
            conn.execute(
                sa.text('UPDATE "user" SET oauth = :oauth WHERE id = :id'),
                {"oauth": oauth_json, "id": uid},
            )

    users_with_keys = conn.execute(
        sa.text('SELECT id, api_key FROM "user" WHERE api_key IS NOT NULL')
    ).fetchall()
    now = int(time.time())

    for uid, api_key in users_with_keys:
        if api_key:
            conn.execute(
                sa.text(
                    """
                    INSERT INTO api_key (id, user_id, key, created_at, updated_at)
                    VALUES (:id, :user_id, :key, :created_at, :updated_at)
                    """
                ),
                {
                    "id": f"key_{uid}",
                    "user_id": uid,
                    "key": api_key,
                    "created_at": now,
                    "updated_at": now,
                },
            )

    if conn.dialect.name == "sqlite":
        _drop_sqlite_indexes_for_column("user", "api_key", conn)
        _drop_sqlite_indexes_for_column("user", "oauth_sub", conn)

    with op.batch_alter_table("user") as batch_op:
        batch_op.drop_column("api_key")
        batch_op.drop_column("oauth_sub")


def downgrade() -> None:
    # --- 1. Restore the old oauth_sub column ---
    op.add_column("user", sa.Column("oauth_sub", sa.Text(), nullable=True))

    conn = op.get_bind()
    users = conn.execute(
        sa.text('SELECT id, oauth FROM "user" WHERE oauth IS NOT NULL')
    ).fetchall()

    for uid, oauth in users:
        try:
            data = json.loads(oauth)
            provider = list(data.keys())[0]
            sub = data[provider].get("sub")
            oauth_sub = f"{provider}@{sub}"
        except Exception:
            oauth_sub = None

        conn.execute(
            sa.text('UPDATE "user" SET oauth_sub = :oauth_sub WHERE id = :id'),
            {"oauth_sub": oauth_sub, "id": uid},
        )

    op.drop_column("user", "oauth")

    # --- 2. Restore the api_key column ---
    op.add_column("user", sa.Column("api_key", sa.String(), nullable=True))

    # Restore values from the api_key table
    keys = conn.execute(sa.text("SELECT user_id, key FROM api_key")).fetchall()
    for uid, key in keys:
        conn.execute(
            sa.text('UPDATE "user" SET api_key = :key WHERE id = :id'),
            {"key": key, "id": uid},
        )

    # Drop the new table
    op.drop_table("api_key")

    with op.batch_alter_table("user") as batch_op:
        batch_op.drop_column("profile_banner_image_url")
        batch_op.drop_column("timezone")

        batch_op.drop_column("presence_state")
        batch_op.drop_column("status_emoji")
        batch_op.drop_column("status_message")
        batch_op.drop_column("status_expires_at")

    # Convert info (JSON) → TEXT
    _convert_column_to_text("user", "info")
    # Convert settings (JSON) → TEXT
    _convert_column_to_text("user", "settings")

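Editor's note (not part of the commit): the oauth_sub → oauth conversion above is the core of this revision. Restated as a standalone function for clarity — same logic as the migration, example values invented:

import json

def oauth_sub_to_json(oauth_sub: str) -> str:
    # "provider@sub" splits into its parts; a bare sub is treated as OIDC.
    if "@" in oauth_sub:
        provider, sub = oauth_sub.split("@", 1)
    else:
        provider, sub = "oidc", oauth_sub
    return json.dumps({provider: {"sub": sub}})

assert oauth_sub_to_json("google@12345") == '{"google": {"sub": "12345"}}'
assert oauth_sub_to_json("12345") == '{"oidc": {"sub": "12345"}}'
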
@@ -3,7 +3,7 @@ import uuid
 from typing import Optional

 from open_webui.internal.db import Base, get_db
-from open_webui.models.users import UserModel, Users
+from open_webui.models.users import UserModel, UserProfileImageResponse, Users
 from open_webui.env import SRC_LOG_LEVELS
 from pydantic import BaseModel
 from sqlalchemy import Boolean, Column, String, Text

@@ -46,15 +46,7 @@ class ApiKey(BaseModel):
     api_key: Optional[str] = None


-class UserResponse(BaseModel):
-    id: str
-    email: str
-    name: str
-    role: str
-    profile_image_url: str
-
-
-class SigninResponse(Token, UserResponse):
+class SigninResponse(Token, UserProfileImageResponse):
     pass

@@ -96,7 +88,7 @@ class AuthsTable:
         name: str,
         profile_image_url: str = "/user.png",
         role: str = "pending",
-        oauth_sub: Optional[str] = None,
+        oauth: Optional[dict] = None,
     ) -> Optional[UserModel]:
         with get_db() as db:
             log.info("insert_new_auth")

@@ -110,7 +102,7 @@ class AuthsTable:
             db.add(result)

             user = Users.insert_new_user(
-                id, name, email, profile_image_url, role, oauth_sub
+                id, name, email, profile_image_url, role, oauth=oauth
            )

             db.commit()

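Editor's note (not part of the commit): a hedged call-site sketch for the new insert_new_auth signature. The email/password parameters are not shown in this hunk and are assumed; the oauth dict shape follows the user-table migration above:

user = Auths.insert_new_auth(
    email="jane@example.com",            # assumed parameter
    password=hashed_password,            # assumed parameter, pre-hashed by the caller
    name="Jane",
    oauth={"google": {"sub": "12345"}},  # replaces oauth_sub="google@12345"
)
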
@@ -4,10 +4,13 @@ import uuid
 from typing import Optional

 from open_webui.internal.db import Base, get_db
-from open_webui.utils.access_control import has_access
+from open_webui.models.groups import Groups

 from pydantic import BaseModel, ConfigDict
-from sqlalchemy import BigInteger, Boolean, Column, String, Text, JSON
+from sqlalchemy.dialects.postgresql import JSONB
+
+from sqlalchemy import BigInteger, Boolean, Column, String, Text, JSON, case, cast
 from sqlalchemy import or_, func, select, and_, text
 from sqlalchemy.sql import exists

@@ -26,12 +29,23 @@ class Channel(Base):
     name = Column(Text)
     description = Column(Text, nullable=True)

+    # Used to indicate if the channel is private (for 'group' type channels)
+    is_private = Column(Boolean, nullable=True)
+
     data = Column(JSON, nullable=True)
     meta = Column(JSON, nullable=True)
     access_control = Column(JSON, nullable=True)

     created_at = Column(BigInteger)
     updated_at = Column(BigInteger)
+    updated_by = Column(Text, nullable=True)
+
+    archived_at = Column(BigInteger, nullable=True)
+    archived_by = Column(Text, nullable=True)
+
+    deleted_at = Column(BigInteger, nullable=True)
+    deleted_by = Column(Text, nullable=True)


 class ChannelModel(BaseModel):

@@ -39,17 +53,122 @@ class ChannelModel(BaseModel):
     id: str
     user_id: str

     type: Optional[str] = None

     name: str
     description: Optional[str] = None

+    is_private: Optional[bool] = None
+
     data: Optional[dict] = None
     meta: Optional[dict] = None
     access_control: Optional[dict] = None

-    created_at: int  # timestamp in epoch
-    updated_at: int  # timestamp in epoch
+    created_at: int  # timestamp in epoch (time_ns)
+
+    updated_at: int  # timestamp in epoch (time_ns)
+    updated_by: Optional[str] = None
+
+    archived_at: Optional[int] = None  # timestamp in epoch (time_ns)
+    archived_by: Optional[str] = None
+
+    deleted_at: Optional[int] = None  # timestamp in epoch (time_ns)
+    deleted_by: Optional[str] = None
+
+
+class ChannelMember(Base):
+    __tablename__ = "channel_member"
+
+    id = Column(Text, primary_key=True, unique=True)
+    channel_id = Column(Text, nullable=False)
+    user_id = Column(Text, nullable=False)
+
+    role = Column(Text, nullable=True)
+    status = Column(Text, nullable=True)
+
+    is_active = Column(Boolean, nullable=False, default=True)
+
+    is_channel_muted = Column(Boolean, nullable=False, default=False)
+    is_channel_pinned = Column(Boolean, nullable=False, default=False)
+
+    data = Column(JSON, nullable=True)
+    meta = Column(JSON, nullable=True)
+
+    invited_at = Column(BigInteger, nullable=True)
+    invited_by = Column(Text, nullable=True)
+
+    joined_at = Column(BigInteger)
+    left_at = Column(BigInteger, nullable=True)
+
+    last_read_at = Column(BigInteger, nullable=True)
+
+    created_at = Column(BigInteger)
+    updated_at = Column(BigInteger)
+
+
+class ChannelMemberModel(BaseModel):
+    model_config = ConfigDict(from_attributes=True)
+
+    id: str
+    channel_id: str
+    user_id: str
+
+    role: Optional[str] = None
+    status: Optional[str] = None
+
+    is_active: bool = True
+
+    is_channel_muted: bool = False
+    is_channel_pinned: bool = False
+
+    data: Optional[dict] = None
+    meta: Optional[dict] = None
+
+    invited_at: Optional[int] = None  # timestamp in epoch (time_ns)
+    invited_by: Optional[str] = None
+
+    joined_at: Optional[int] = None  # timestamp in epoch (time_ns)
+    left_at: Optional[int] = None  # timestamp in epoch (time_ns)
+
+    last_read_at: Optional[int] = None  # timestamp in epoch (time_ns)
+
+    created_at: Optional[int] = None  # timestamp in epoch (time_ns)
+    updated_at: Optional[int] = None  # timestamp in epoch (time_ns)
+
+
+class ChannelWebhook(Base):
+    __tablename__ = "channel_webhook"
+
+    id = Column(Text, primary_key=True, unique=True)
+    channel_id = Column(Text, nullable=False)
+    user_id = Column(Text, nullable=False)
+
+    name = Column(Text, nullable=False)
+    profile_image_url = Column(Text, nullable=True)
+
+    token = Column(Text, nullable=False)
+    last_used_at = Column(BigInteger, nullable=True)
+
+    created_at = Column(BigInteger, nullable=False)
+    updated_at = Column(BigInteger, nullable=False)
+
+
+class ChannelWebhookModel(BaseModel):
+    model_config = ConfigDict(from_attributes=True)
+
+    id: str
+    channel_id: str
+    user_id: str
+
+    name: str
+    profile_image_url: Optional[str] = None
+
+    token: str
+    last_used_at: Optional[int] = None  # timestamp in epoch (time_ns)
+
+    created_at: int  # timestamp in epoch (time_ns)
+    updated_at: int  # timestamp in epoch (time_ns)
+
+
 ####################

@@ -58,27 +177,94 @@ class ChannelModel(BaseModel):

 class ChannelResponse(ChannelModel):
+    is_manager: bool = False
     write_access: bool = False

     user_count: Optional[int] = None


 class ChannelForm(BaseModel):
-    name: str
+    name: str = ""
     description: Optional[str] = None
+    is_private: Optional[bool] = None
     data: Optional[dict] = None
     meta: Optional[dict] = None
     access_control: Optional[dict] = None
+    group_ids: Optional[list[str]] = None
+    user_ids: Optional[list[str]] = None
+
+
+class CreateChannelForm(ChannelForm):
+    type: Optional[str] = None


 class ChannelTable:
+    def _collect_unique_user_ids(
+        self,
+        invited_by: str,
+        user_ids: Optional[list[str]] = None,
+        group_ids: Optional[list[str]] = None,
+    ) -> set[str]:
+        """
+        Collect unique user ids from:
+        - invited_by
+        - user_ids
+        - each group in group_ids
+        Returns a set for efficient SQL diffing.
+        """
+        users = set(user_ids or [])
+        users.add(invited_by)
+
+        for group_id in group_ids or []:
+            users.update(Groups.get_group_user_ids_by_id(group_id))
+
+        return users
+
+    def _create_membership_models(
+        self,
+        channel_id: str,
+        invited_by: str,
+        user_ids: set[str],
+    ) -> list[ChannelMember]:
+        """
+        Takes a set of NEW user IDs (already filtered to exclude existing members).
+        Returns ORM ChannelMember objects to be added.
+        """
+        now = int(time.time_ns())
+        memberships = []
+
+        for uid in user_ids:
+            model = ChannelMemberModel(
+                **{
+                    "id": str(uuid.uuid4()),
+                    "channel_id": channel_id,
+                    "user_id": uid,
+                    "status": "joined",
+                    "is_active": True,
+                    "is_channel_muted": False,
+                    "is_channel_pinned": False,
+                    "invited_at": now,
+                    "invited_by": invited_by,
+                    "joined_at": now,
+                    "left_at": None,
+                    "last_read_at": now,
+                    "created_at": now,
+                    "updated_at": now,
+                }
+            )
+            memberships.append(ChannelMember(**model.model_dump()))
+
+        return memberships
+
     def insert_new_channel(
-        self, type: Optional[str], form_data: ChannelForm, user_id: str
+        self, form_data: CreateChannelForm, user_id: str
     ) -> Optional[ChannelModel]:
         with get_db() as db:
             channel = ChannelModel(
                 **{
                     **form_data.model_dump(),
-                    "type": type,
+                    "type": form_data.type if form_data.type else None,
                     "name": form_data.name.lower(),
                     "id": str(uuid.uuid4()),
                     "user_id": user_id,

@@ -86,9 +272,21 @@ class ChannelTable:
                     "updated_at": int(time.time_ns()),
                 }
             )

             new_channel = Channel(**channel.model_dump())

+            if form_data.type in ["group", "dm"]:
+                users = self._collect_unique_user_ids(
+                    invited_by=user_id,
+                    user_ids=form_data.user_ids,
+                    group_ids=form_data.group_ids,
+                )
+                memberships = self._create_membership_models(
+                    channel_id=new_channel.id,
+                    invited_by=user_id,
+                    user_ids=users,
+                )
+
+                db.add_all(memberships)
             db.add(new_channel)
             db.commit()
             return channel

@@ -98,16 +296,346 @@ class ChannelTable:
             channels = db.query(Channel).all()
             return [ChannelModel.model_validate(channel) for channel in channels]

-    def get_channels_by_user_id(
-        self, user_id: str, permission: str = "read"
-    ) -> list[ChannelModel]:
-        channels = self.get_channels()
-        return [
-            channel
-            for channel in channels
-            if channel.user_id == user_id
-            or has_access(user_id, permission, channel.access_control)
-        ]
+    def _has_permission(self, db, query, filter: dict, permission: str = "read"):
+        group_ids = filter.get("group_ids", [])
+        user_id = filter.get("user_id")
+
+        dialect_name = db.bind.dialect.name
+
+        # Public access
+        conditions = []
+        if group_ids or user_id:
+            conditions.extend(
+                [
+                    Channel.access_control.is_(None),
+                    cast(Channel.access_control, String) == "null",
+                ]
+            )
+
+        # User-level permission
+        if user_id:
+            conditions.append(Channel.user_id == user_id)
+
+        # Group-level permission
+        if group_ids:
+            group_conditions = []
+            for gid in group_ids:
+                if dialect_name == "sqlite":
+                    group_conditions.append(
+                        Channel.access_control[permission]["group_ids"].contains([gid])
+                    )
+                elif dialect_name == "postgresql":
+                    group_conditions.append(
+                        cast(
+                            Channel.access_control[permission]["group_ids"],
+                            JSONB,
+                        ).contains([gid])
+                    )
+            conditions.append(or_(*group_conditions))
+
+        if conditions:
+            query = query.filter(or_(*conditions))
+
+        return query
+
+    def get_channels_by_user_id(self, user_id: str) -> list[ChannelModel]:
+        with get_db() as db:
+            user_group_ids = [
+                group.id for group in Groups.get_groups_by_member_id(user_id)
+            ]
+
+            membership_channels = (
+                db.query(Channel)
+                .join(ChannelMember, Channel.id == ChannelMember.channel_id)
+                .filter(
+                    Channel.deleted_at.is_(None),
+                    Channel.archived_at.is_(None),
+                    Channel.type.in_(["group", "dm"]),
+                    ChannelMember.user_id == user_id,
+                    ChannelMember.is_active.is_(True),
+                )
+                .all()
+            )
+
+            query = db.query(Channel).filter(
+                Channel.deleted_at.is_(None),
+                Channel.archived_at.is_(None),
+                or_(
+                    Channel.type.is_(None),  # true NULL/None
+                    Channel.type == "",  # empty string
+                    and_(Channel.type != "group", Channel.type != "dm"),
+                ),
+            )
+            query = self._has_permission(
+                db, query, {"user_id": user_id, "group_ids": user_group_ids}
+            )
+
+            standard_channels = query.all()
+
+            all_channels = membership_channels + standard_channels
+            return [ChannelModel.model_validate(c) for c in all_channels]
+
+    def get_dm_channel_by_user_ids(self, user_ids: list[str]) -> Optional[ChannelModel]:
+        with get_db() as db:
+            # Ensure uniqueness in case a list with duplicates is passed
+            unique_user_ids = list(set(user_ids))
+
+            match_count = func.sum(
+                case(
+                    (ChannelMember.user_id.in_(unique_user_ids), 1),
+                    else_=0,
+                )
+            )
+
+            subquery = (
+                db.query(ChannelMember.channel_id)
+                .group_by(ChannelMember.channel_id)
+                # 1. The channel must have exactly len(unique_user_ids) members
+                .having(func.count(ChannelMember.user_id) == len(unique_user_ids))
+                # 2. All of those members must be in unique_user_ids
+                .having(match_count == len(unique_user_ids))
+                .subquery()
+            )
+
+            channel = (
+                db.query(Channel)
+                .filter(
+                    Channel.id.in_(subquery),
+                    Channel.type == "dm",
+                )
+                .first()
+            )
+
+            return ChannelModel.model_validate(channel) if channel else None
+
+    def add_members_to_channel(
+        self,
+        channel_id: str,
+        invited_by: str,
+        user_ids: Optional[list[str]] = None,
+        group_ids: Optional[list[str]] = None,
+    ) -> list[ChannelMemberModel]:
+        with get_db() as db:
+            # 1. Collect all user_ids including groups + inviter
+            requested_users = self._collect_unique_user_ids(
+                invited_by, user_ids, group_ids
+            )
+
+            existing_users = {
+                row.user_id
+                for row in db.query(ChannelMember.user_id)
+                .filter(ChannelMember.channel_id == channel_id)
+                .all()
+            }
+
+            new_user_ids = requested_users - existing_users
+            if not new_user_ids:
+                return []  # Nothing to add
+
+            new_memberships = self._create_membership_models(
+                channel_id, invited_by, new_user_ids
+            )
+
+            db.add_all(new_memberships)
+            db.commit()
+
+            return [
+                ChannelMemberModel.model_validate(membership)
+                for membership in new_memberships
+            ]
+
+    def remove_members_from_channel(
+        self,
+        channel_id: str,
+        user_ids: list[str],
+    ) -> int:
+        with get_db() as db:
+            result = (
+                db.query(ChannelMember)
+                .filter(
+                    ChannelMember.channel_id == channel_id,
+                    ChannelMember.user_id.in_(user_ids),
+                )
+                .delete(synchronize_session=False)
+            )
+            db.commit()
+            return result  # number of rows deleted
+
+    def is_user_channel_manager(self, channel_id: str, user_id: str) -> bool:
+        with get_db() as db:
+            # Check if the user is the creator of the channel
+            # or has a 'manager' role in ChannelMember
+            channel = db.query(Channel).filter(Channel.id == channel_id).first()
+            if channel and channel.user_id == user_id:
+                return True
+
+            membership = (
+                db.query(ChannelMember)
+                .filter(
+                    ChannelMember.channel_id == channel_id,
+                    ChannelMember.user_id == user_id,
+                    ChannelMember.role == "manager",
+                )
+                .first()
+            )
+            return membership is not None
+
+    def join_channel(
+        self, channel_id: str, user_id: str
+    ) -> Optional[ChannelMemberModel]:
+        with get_db() as db:
+            # Check if the membership already exists
+            existing_membership = (
+                db.query(ChannelMember)
+                .filter(
+                    ChannelMember.channel_id == channel_id,
+                    ChannelMember.user_id == user_id,
+                )
+                .first()
+            )
+            if existing_membership:
+                return ChannelMemberModel.model_validate(existing_membership)
+
+            # Create new membership
+            channel_member = ChannelMemberModel(
+                **{
+                    "id": str(uuid.uuid4()),
+                    "channel_id": channel_id,
+                    "user_id": user_id,
+                    "status": "joined",
+                    "is_active": True,
+                    "is_channel_muted": False,
+                    "is_channel_pinned": False,
+                    "joined_at": int(time.time_ns()),
+                    "left_at": None,
+                    "last_read_at": int(time.time_ns()),
+                    "created_at": int(time.time_ns()),
+                    "updated_at": int(time.time_ns()),
+                }
+            )
+            new_membership = ChannelMember(**channel_member.model_dump())
+
+            db.add(new_membership)
+            db.commit()
+            return channel_member
+
+    def leave_channel(self, channel_id: str, user_id: str) -> bool:
+        with get_db() as db:
+            membership = (
+                db.query(ChannelMember)
+                .filter(
+                    ChannelMember.channel_id == channel_id,
+                    ChannelMember.user_id == user_id,
+                )
+                .first()
+            )
+            if not membership:
+                return False
+
+            membership.status = "left"
+            membership.is_active = False
+            membership.left_at = int(time.time_ns())
+            membership.updated_at = int(time.time_ns())
+
+            db.commit()
+            return True
+
+    def get_member_by_channel_and_user_id(
+        self, channel_id: str, user_id: str
+    ) -> Optional[ChannelMemberModel]:
+        with get_db() as db:
+            membership = (
+                db.query(ChannelMember)
+                .filter(
+                    ChannelMember.channel_id == channel_id,
+                    ChannelMember.user_id == user_id,
+                )
+                .first()
+            )
+            return ChannelMemberModel.model_validate(membership) if membership else None
+
+    def get_members_by_channel_id(self, channel_id: str) -> list[ChannelMemberModel]:
+        with get_db() as db:
+            memberships = (
+                db.query(ChannelMember)
+                .filter(ChannelMember.channel_id == channel_id)
+                .all()
+            )
+            return [
+                ChannelMemberModel.model_validate(membership)
+                for membership in memberships
+            ]
+
+    def pin_channel(self, channel_id: str, user_id: str, is_pinned: bool) -> bool:
+        with get_db() as db:
+            membership = (
+                db.query(ChannelMember)
+                .filter(
+                    ChannelMember.channel_id == channel_id,
+                    ChannelMember.user_id == user_id,
+                )
+                .first()
+            )
+            if not membership:
+                return False
+
+            membership.is_channel_pinned = is_pinned
+            membership.updated_at = int(time.time_ns())
+
+            db.commit()
+            return True
+
+    def update_member_last_read_at(self, channel_id: str, user_id: str) -> bool:
+        with get_db() as db:
+            membership = (
+                db.query(ChannelMember)
+                .filter(
+                    ChannelMember.channel_id == channel_id,
+                    ChannelMember.user_id == user_id,
+                )
+                .first()
+            )
+            if not membership:
+                return False
+
+            membership.last_read_at = int(time.time_ns())
+            membership.updated_at = int(time.time_ns())
+
+            db.commit()
+            return True
+
+    def update_member_active_status(
+        self, channel_id: str, user_id: str, is_active: bool
+    ) -> bool:
+        with get_db() as db:
+            membership = (
+                db.query(ChannelMember)
+                .filter(
+                    ChannelMember.channel_id == channel_id,
+                    ChannelMember.user_id == user_id,
+                )
+                .first()
+            )
+            if not membership:
+                return False
+
+            membership.is_active = is_active
+            membership.updated_at = int(time.time_ns())
+
+            db.commit()
+            return True
+
+    def is_user_channel_member(self, channel_id: str, user_id: str) -> bool:
+        with get_db() as db:
+            membership = (
+                db.query(ChannelMember)
+                .filter(
+                    ChannelMember.channel_id == channel_id,
+                    ChannelMember.user_id == user_id,
+                )
+                .first()
+            )
+            return membership is not None

     def get_channel_by_id(self, id: str) -> Optional[ChannelModel]:
         with get_db() as db:

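Editor's note (not part of the commit): a hedged sketch of how these pieces might compose into a DM get-or-create flow. Channels is assumed to be the module-level ChannelTable instance, matching the pattern other open_webui model tables use:

participants = [current_user.id, other_user.id]  # assumed user objects

channel = Channels.get_dm_channel_by_user_ids(participants)
if channel is None:
    # insert_new_channel creates the memberships itself for "group"/"dm" types.
    channel = Channels.insert_new_channel(
        CreateChannelForm(type="dm", user_ids=participants),
        user_id=current_user.id,
    )
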
@@ -123,8 +651,12 @@ class ChannelTable:
                 return None

             channel.name = form_data.name
+            channel.description = form_data.description
+            channel.is_private = form_data.is_private
+
             channel.data = form_data.data
             channel.meta = form_data.meta
+
             channel.access_control = form_data.access_control
             channel.updated_at = int(time.time_ns())

@@ -238,6 +238,7 @@ class FilesTable:
             try:
                 file = db.query(File).filter_by(id=id).first()
                 file.hash = hash
+                file.updated_at = int(time.time())
                 db.commit()

                 return FileModel.model_validate(file)

@@ -249,6 +250,7 @@ class FilesTable:
             try:
                 file = db.query(File).filter_by(id=id).first()
                 file.data = {**(file.data if file.data else {}), **data}
+                file.updated_at = int(time.time())
                 db.commit()
                 return FileModel.model_validate(file)
             except Exception as e:

@@ -260,6 +262,7 @@ class FilesTable:
             try:
                 file = db.query(File).filter_by(id=id).first()
                 file.meta = {**(file.meta if file.meta else {}), **meta}
+                file.updated_at = int(time.time())
                 db.commit()
                 return FileModel.model_validate(file)
             except Exception:

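Editor's note (not part of the commit): all three setters now touch updated_at with second-resolution epoch time, which differs from the channel and message tables above, which store nanoseconds. A tiny illustration of the two resolutions:

import time

ns = time.time_ns()            # nanosecond clock, as used by channel/message rows
seconds = ns // 1_000_000_000  # the second-resolution value the file table stores
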
@@ -11,7 +11,18 @@ from open_webui.models.files import FileMetadataResponse

 from pydantic import BaseModel, ConfigDict
-from sqlalchemy import BigInteger, Column, String, Text, JSON, func, ForeignKey
+from sqlalchemy import (
+    BigInteger,
+    Column,
+    String,
+    Text,
+    JSON,
+    and_,
+    func,
+    ForeignKey,
+    cast,
+    or_,
+)

 log = logging.getLogger(__name__)

@@ -41,7 +52,6 @@ class Group(Base):

 class GroupModel(BaseModel):
-    model_config = ConfigDict(from_attributes=True)
     id: str
     user_id: str

@@ -56,6 +66,8 @@ class GroupModel(BaseModel):
     created_at: int  # timestamp in epoch
     updated_at: int  # timestamp in epoch

+    model_config = ConfigDict(from_attributes=True)
+

 class GroupMember(Base):
     __tablename__ = "group_member"

@@ -84,17 +96,8 @@ class GroupMemberModel(BaseModel):
 ####################


-class GroupResponse(BaseModel):
-    id: str
-    user_id: str
-    name: str
-    description: str
-    permissions: Optional[dict] = None
-    data: Optional[dict] = None
-    meta: Optional[dict] = None
+class GroupResponse(GroupModel):
     member_count: Optional[int] = None
-    created_at: int  # timestamp in epoch
-    updated_at: int  # timestamp in epoch


 class GroupForm(BaseModel):

@@ -112,6 +115,11 @@ class GroupUpdateForm(GroupForm):
     pass


+class GroupListResponse(BaseModel):
+    items: list[GroupResponse] = []
+    total: int = 0
+
+
 class GroupTable:
     def insert_new_group(
         self, user_id: str, form_data: GroupForm

@@ -140,13 +148,87 @@ class GroupTable:
         except Exception:
             return None

-    def get_groups(self) -> list[GroupModel]:
+    def get_all_groups(self) -> list[GroupModel]:
         with get_db() as db:
+            groups = db.query(Group).order_by(Group.updated_at.desc()).all()
+            return [GroupModel.model_validate(group) for group in groups]
+
+    def get_groups(self, filter: Optional[dict] = None) -> list[GroupResponse]:
+        with get_db() as db:
+            query = db.query(Group)
+
+            if filter:
+                if "query" in filter:
+                    query = query.filter(Group.name.ilike(f"%{filter['query']}%"))
+                if "member_id" in filter:
+                    query = query.join(
+                        GroupMember, GroupMember.group_id == Group.id
+                    ).filter(GroupMember.user_id == filter["member_id"])
+
+                if "share" in filter:
+                    share_value = filter["share"]
+                    json_share = Group.data["config"]["share"].as_boolean()
+
+                    if share_value:
+                        query = query.filter(
+                            or_(
+                                Group.data.is_(None),
+                                json_share.is_(None),
+                                json_share == True,
+                            )
+                        )
+                    else:
+                        query = query.filter(
+                            and_(Group.data.isnot(None), json_share == False)
+                        )
+
+            groups = query.order_by(Group.updated_at.desc()).all()
             return [
-                GroupModel.model_validate(group)
-                for group in db.query(Group).order_by(Group.updated_at.desc()).all()
+                GroupResponse.model_validate(
+                    {
+                        **GroupModel.model_validate(group).model_dump(),
+                        "member_count": self.get_group_member_count_by_id(group.id),
+                    }
+                )
+                for group in groups
             ]

+    def search_groups(
+        self, filter: Optional[dict] = None, skip: int = 0, limit: int = 30
+    ) -> GroupListResponse:
+        with get_db() as db:
+            query = db.query(Group)
+
+            if filter:
+                if "query" in filter:
+                    query = query.filter(Group.name.ilike(f"%{filter['query']}%"))
+                if "member_id" in filter:
+                    query = query.join(
+                        GroupMember, GroupMember.group_id == Group.id
+                    ).filter(GroupMember.user_id == filter["member_id"])
+
+                if "share" in filter:
+                    # 'share' is stored in the data JSON; the ->> operator is
+                    # supported by both SQLite (3.38+) and PostgreSQL
+                    share_value = filter["share"]
+                    query = query.filter(
+                        Group.data.op("->>")("share") == str(share_value)
+                    )
+
+            total = query.count()
+            query = query.order_by(Group.updated_at.desc())
+            groups = query.offset(skip).limit(limit).all()
+
+            return GroupListResponse(
+                items=[
+                    GroupResponse.model_validate(
+                        {
+                            **GroupModel.model_validate(group).model_dump(),
+                            "member_count": self.get_group_member_count_by_id(
+                                group.id
+                            ),
+                        }
+                    )
+                    for group in groups
+                ],
+                total=total,
+            )
+
     def get_groups_by_member_id(self, user_id: str) -> list[GroupModel]:
         with get_db() as db:
             return [

@@ -293,7 +375,7 @@ class GroupTable:
     ) -> list[GroupModel]:

         # check for existing groups
-        existing_groups = self.get_groups()
+        existing_groups = self.get_all_groups()
         existing_group_names = {group.name for group in existing_groups}

         new_groups = []

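Editor's note (not part of the commit): a hedged usage sketch for the new pagination entry point. Groups is assumed to be the module-level GroupTable instance, and the filter keys follow the dict convention shown above:

page = Groups.search_groups(
    filter={"query": "eng", "member_id": user_id},  # user_id assumed in scope
    skip=0,
    limit=30,
)
print(page.total, [group.name for group in page.items])
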
@@ -7,13 +7,27 @@ import uuid
 from open_webui.internal.db import Base, get_db
 from open_webui.env import SRC_LOG_LEVELS

-from open_webui.models.files import FileMetadataResponse
+from open_webui.models.files import (
+    File,
+    FileModel,
+    FileMetadataResponse,
+    FileModelResponse,
+)
 from open_webui.models.groups import Groups
-from open_webui.models.users import Users, UserResponse
+from open_webui.models.users import User, UserModel, Users, UserResponse


 from pydantic import BaseModel, ConfigDict
-from sqlalchemy import BigInteger, Column, String, Text, JSON
+from sqlalchemy import (
+    BigInteger,
+    Column,
+    ForeignKey,
+    String,
+    Text,
+    JSON,
+    UniqueConstraint,
+    or_,
+)

 from open_webui.utils.access_control import has_access

@@ -34,9 +48,7 @@ class Knowledge(Base):
     name = Column(Text)
     description = Column(Text)

-    data = Column(JSON, nullable=True)
     meta = Column(JSON, nullable=True)

     access_control = Column(JSON, nullable=True)  # Controls data access levels.
     # Defines access control rules for this entry.
     # - `None`: Public access, available to all users with the "user" role.

@@ -67,7 +79,6 @@ class KnowledgeModel(BaseModel):
     name: str
     description: str

-    data: Optional[dict] = None
     meta: Optional[dict] = None

     access_control: Optional[dict] = None

@@ -76,11 +87,42 @@ class KnowledgeModel(BaseModel):
     updated_at: int  # timestamp in epoch


+class KnowledgeFile(Base):
+    __tablename__ = "knowledge_file"
+
+    id = Column(Text, unique=True, primary_key=True)
+
+    knowledge_id = Column(
+        Text, ForeignKey("knowledge.id", ondelete="CASCADE"), nullable=False
+    )
+    file_id = Column(Text, ForeignKey("file.id", ondelete="CASCADE"), nullable=False)
+    user_id = Column(Text, nullable=False)
+
+    created_at = Column(BigInteger, nullable=False)
+    updated_at = Column(BigInteger, nullable=False)
+
+    __table_args__ = (
+        UniqueConstraint(
+            "knowledge_id", "file_id", name="uq_knowledge_file_knowledge_file"
+        ),
+    )
+
+
+class KnowledgeFileModel(BaseModel):
+    id: str
+    knowledge_id: str
+    file_id: str
+    user_id: str
+
+    created_at: int  # timestamp in epoch
+    updated_at: int  # timestamp in epoch
+
+    model_config = ConfigDict(from_attributes=True)
+
+
 ####################
 # Forms
 ####################


 class KnowledgeUserModel(KnowledgeModel):
     user: Optional[UserResponse] = None

@@ -96,10 +138,18 @@ class KnowledgeUserResponse(KnowledgeUserModel):
 class KnowledgeForm(BaseModel):
     name: str
     description: str
-    data: Optional[dict] = None
     access_control: Optional[dict] = None


+class FileUserResponse(FileModelResponse):
+    user: Optional[UserResponse] = None
+
+
+class KnowledgeFileListResponse(BaseModel):
+    items: list[FileUserResponse]
+    total: int
+
+
 class KnowledgeTable:
     def insert_new_knowledge(
         self, user_id: str, form_data: KnowledgeForm

@@ -182,6 +232,182 @@ class KnowledgeTable:
         except Exception:
             return None

+    def get_knowledges_by_file_id(self, file_id: str) -> list[KnowledgeModel]:
+        try:
+            with get_db() as db:
+                knowledges = (
+                    db.query(Knowledge)
+                    .join(KnowledgeFile, Knowledge.id == KnowledgeFile.knowledge_id)
+                    .filter(KnowledgeFile.file_id == file_id)
+                    .all()
+                )
+                return [
+                    KnowledgeModel.model_validate(knowledge) for knowledge in knowledges
+                ]
+        except Exception:
+            return []
+
+    def search_files_by_id(
+        self,
+        knowledge_id: str,
+        user_id: str,
+        filter: dict,
+        skip: int = 0,
+        limit: int = 30,
+    ) -> KnowledgeFileListResponse:
+        try:
+            with get_db() as db:
+                query = (
+                    db.query(File, User)
+                    .join(KnowledgeFile, File.id == KnowledgeFile.file_id)
+                    .outerjoin(User, User.id == KnowledgeFile.user_id)
+                    .filter(KnowledgeFile.knowledge_id == knowledge_id)
+                )
+
+                if filter:
+                    query_key = filter.get("query")
+                    if query_key:
+                        query = query.filter(or_(File.filename.ilike(f"%{query_key}%")))
+
+                    view_option = filter.get("view_option")
+                    if view_option == "created":
+                        query = query.filter(KnowledgeFile.user_id == user_id)
+                    elif view_option == "shared":
+                        query = query.filter(KnowledgeFile.user_id != user_id)
+
+                    order_by = filter.get("order_by")
+                    direction = filter.get("direction")
+
+                    if order_by == "name":
+                        if direction == "asc":
+                            query = query.order_by(File.filename.asc())
+                        else:
+                            query = query.order_by(File.filename.desc())
+                    elif order_by == "created_at":
+                        if direction == "asc":
+                            query = query.order_by(File.created_at.asc())
+                        else:
+                            query = query.order_by(File.created_at.desc())
+                    elif order_by == "updated_at":
+                        if direction == "asc":
+                            query = query.order_by(File.updated_at.asc())
+                        else:
+                            query = query.order_by(File.updated_at.desc())
+                    else:
+                        query = query.order_by(File.updated_at.desc())
+
+                else:
+                    query = query.order_by(File.updated_at.desc())
+
+                # Count BEFORE pagination
+                total = query.count()
+
+                if skip:
+                    query = query.offset(skip)
+                if limit:
+                    query = query.limit(limit)
+
+                items = query.all()
+
+                files = []
+                for file, user in items:
+                    files.append(
+                        FileUserResponse(
+                            **FileModel.model_validate(file).model_dump(),
+                            user=(
+                                UserResponse(
+                                    **UserModel.model_validate(user).model_dump()
+                                )
+                                if user
+                                else None
+                            ),
+                        )
+                    )
+
+                return KnowledgeFileListResponse(items=files, total=total)
+        except Exception as e:
+            log.exception(e)
+            return KnowledgeFileListResponse(items=[], total=0)
+
+    def get_files_by_id(self, knowledge_id: str) -> list[FileModel]:
+        try:
+            with get_db() as db:
+                files = (
+                    db.query(File)
+                    .join(KnowledgeFile, File.id == KnowledgeFile.file_id)
+                    .filter(KnowledgeFile.knowledge_id == knowledge_id)
+                    .all()
+                )
+                return [FileModel.model_validate(file) for file in files]
+        except Exception:
+            return []
+
+    def get_file_metadatas_by_id(self, knowledge_id: str) -> list[FileMetadataResponse]:
+        try:
+            files = self.get_files_by_id(knowledge_id)
+            return [FileMetadataResponse(**file.model_dump()) for file in files]
+        except Exception:
+            return []
+
+    def add_file_to_knowledge_by_id(
+        self, knowledge_id: str, file_id: str, user_id: str
+    ) -> Optional[KnowledgeFileModel]:
+        with get_db() as db:
+            knowledge_file = KnowledgeFileModel(
+                **{
+                    "id": str(uuid.uuid4()),
+                    "knowledge_id": knowledge_id,
+                    "file_id": file_id,
+                    "user_id": user_id,
+                    "created_at": int(time.time()),
+                    "updated_at": int(time.time()),
+                }
+            )
+
+            try:
+                result = KnowledgeFile(**knowledge_file.model_dump())
+                db.add(result)
+                db.commit()
+                db.refresh(result)
+                if result:
+                    return KnowledgeFileModel.model_validate(result)
+                else:
+                    return None
+            except Exception:
+                return None
+
+    def remove_file_from_knowledge_by_id(self, knowledge_id: str, file_id: str) -> bool:
+        try:
+            with get_db() as db:
+                db.query(KnowledgeFile).filter_by(
+                    knowledge_id=knowledge_id, file_id=file_id
+                ).delete()
+                db.commit()
+                return True
+        except Exception:
+            return False
+
+    def reset_knowledge_by_id(self, id: str) -> Optional[KnowledgeModel]:
+        try:
+            with get_db() as db:
+                # Delete all knowledge_file entries for this knowledge_id
+                db.query(KnowledgeFile).filter_by(knowledge_id=id).delete()
+                db.commit()
+
+                # Update the knowledge entry's updated_at timestamp
+                db.query(Knowledge).filter_by(id=id).update(
+                    {
+                        "updated_at": int(time.time()),
+                    }
+                )
+                db.commit()
+
+            return self.get_knowledge_by_id(id=id)
+        except Exception as e:
+            log.exception(e)
+            return None

     def update_knowledge_by_id(
         self, id: str, form_data: KnowledgeForm, overwrite: bool = False
     ) -> Optional[KnowledgeModel]:

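Editor's note (not part of the commit): a hedged sketch tying the new join-table methods together. Knowledges is assumed to be the module-level KnowledgeTable instance, the IDs are placeholders, and the filename field comes from FileModelResponse:

# Link a file, then page through the knowledge base's files.
Knowledges.add_file_to_knowledge_by_id(
    knowledge_id=knowledge_id, file_id=file_id, user_id=user_id
)

result = Knowledges.search_files_by_id(
    knowledge_id=knowledge_id,
    user_id=user_id,
    filter={"query": "report", "order_by": "updated_at", "direction": "desc"},
)
print(result.total, [f.filename for f in result.items])
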
@@ -5,10 +5,11 @@ from typing import Optional

 from open_webui.internal.db import Base, get_db
 from open_webui.models.tags import TagModel, Tag, Tags
-from open_webui.models.users import Users, UserNameResponse
+from open_webui.models.users import Users, User, UserNameResponse
+from open_webui.models.channels import Channels, ChannelMember


-from pydantic import BaseModel, ConfigDict
+from pydantic import BaseModel, ConfigDict, field_validator
 from sqlalchemy import BigInteger, Boolean, Column, String, Text, JSON
 from sqlalchemy import or_, func, select, and_, text
 from sqlalchemy.sql import exists

@@ -39,7 +40,7 @@ class MessageReactionModel(BaseModel):

 class Message(Base):
     __tablename__ = "message"
-    id = Column(Text, primary_key=True)
+    id = Column(Text, primary_key=True, unique=True)

     user_id = Column(Text)
     channel_id = Column(Text, nullable=True)

@@ -47,6 +48,11 @@ class Message(Base):
     reply_to_id = Column(Text, nullable=True)
     parent_id = Column(Text, nullable=True)

+    # Pins
+    is_pinned = Column(Boolean, nullable=False, default=False)
+    pinned_at = Column(BigInteger, nullable=True)
+    pinned_by = Column(Text, nullable=True)
+
     content = Column(Text)
     data = Column(JSON, nullable=True)
     meta = Column(JSON, nullable=True)

@@ -65,12 +71,17 @@ class MessageModel(BaseModel):
     reply_to_id: Optional[str] = None
     parent_id: Optional[str] = None

+    # Pins
+    is_pinned: bool = False
+    pinned_by: Optional[str] = None
+    pinned_at: Optional[int] = None  # timestamp in epoch (time_ns)
+
     content: str
     data: Optional[dict] = None
     meta: Optional[dict] = None

-    created_at: int  # timestamp in epoch
-    updated_at: int  # timestamp in epoch
+    created_at: int  # timestamp in epoch (time_ns)
+    updated_at: int  # timestamp in epoch (time_ns)


 ####################

@@ -79,6 +90,7 @@ class MessageModel(BaseModel):

 class MessageForm(BaseModel):
+    temp_id: Optional[str] = None
     content: str
     reply_to_id: Optional[str] = None
     parent_id: Optional[str] = None

@@ -88,7 +100,7 @@ class MessageForm(BaseModel):

 class Reactions(BaseModel):
     name: str
-    user_ids: list[str]
+    users: list[dict]
     count: int


@@ -96,8 +108,25 @@ class MessageUserResponse(MessageModel):
     user: Optional[UserNameResponse] = None


+class MessageUserSlimResponse(MessageUserResponse):
+    data: bool | None = None
+
+    @field_validator("data", mode="before")
+    def convert_data_to_bool(cls, v):
+        # No data or not a dict → False
+        if not isinstance(v, dict):
+            return False
+
+        # True if ANY value in the dict is non-empty
+        return any(bool(val) for val in v.values())
+
+
 class MessageReplyToResponse(MessageUserResponse):
-    reply_to_message: Optional[MessageUserResponse] = None
+    reply_to_message: Optional[MessageUserSlimResponse] = None
+
+
+class MessageWithReactionsResponse(MessageUserSlimResponse):
+    reactions: list[Reactions]


 class MessageResponse(MessageReplyToResponse):

@ -111,9 +140,11 @@ class MessageTable:
|
||||||
self, form_data: MessageForm, channel_id: str, user_id: str
|
self, form_data: MessageForm, channel_id: str, user_id: str
|
||||||
) -> Optional[MessageModel]:
|
) -> Optional[MessageModel]:
|
||||||
with get_db() as db:
|
with get_db() as db:
|
||||||
id = str(uuid.uuid4())
|
channel_member = Channels.join_channel(channel_id, user_id)
|
||||||
|
|
||||||
|
id = str(uuid.uuid4())
|
||||||
ts = int(time.time_ns())
|
ts = int(time.time_ns())
|
||||||
|
|
||||||
message = MessageModel(
|
message = MessageModel(
|
||||||
**{
|
**{
|
||||||
"id": id,
|
"id": id,
|
||||||
|
|
@ -121,6 +152,9 @@ class MessageTable:
|
||||||
"channel_id": channel_id,
|
"channel_id": channel_id,
|
||||||
"reply_to_id": form_data.reply_to_id,
|
"reply_to_id": form_data.reply_to_id,
|
||||||
"parent_id": form_data.parent_id,
|
"parent_id": form_data.parent_id,
|
||||||
|
"is_pinned": False,
|
||||||
|
"pinned_at": None,
|
||||||
|
"pinned_by": None,
|
||||||
"content": form_data.content,
|
"content": form_data.content,
|
||||||
"data": form_data.data,
|
"data": form_data.data,
|
||||||
"meta": form_data.meta,
|
"meta": form_data.meta,
|
||||||
|
|
@ -128,8 +162,8 @@ class MessageTable:
|
||||||
"updated_at": ts,
|
"updated_at": ts,
|
||||||
}
|
}
|
||||||
)
|
)
|
||||||
|
|
||||||
result = Message(**message.model_dump())
|
result = Message(**message.model_dump())
|
||||||
|
|
||||||
db.add(result)
|
db.add(result)
|
||||||
db.commit()
|
db.commit()
|
||||||
db.refresh(result)
|
db.refresh(result)
|
||||||
|
|
@ -280,6 +314,30 @@ class MessageTable:
|
||||||
)
|
)
|
||||||
return messages
|
return messages
|
||||||
|
|
||||||
|
def get_last_message_by_channel_id(self, channel_id: str) -> Optional[MessageModel]:
|
||||||
|
with get_db() as db:
|
||||||
|
message = (
|
||||||
|
db.query(Message)
|
||||||
|
.filter_by(channel_id=channel_id)
|
||||||
|
.order_by(Message.created_at.desc())
|
||||||
|
.first()
|
||||||
|
)
|
||||||
|
return MessageModel.model_validate(message) if message else None
|
||||||
|
|
||||||
|
def get_pinned_messages_by_channel_id(
|
||||||
|
self, channel_id: str, skip: int = 0, limit: int = 50
|
||||||
|
) -> list[MessageModel]:
|
||||||
|
with get_db() as db:
|
||||||
|
all_messages = (
|
||||||
|
db.query(Message)
|
||||||
|
.filter_by(channel_id=channel_id, is_pinned=True)
|
||||||
|
.order_by(Message.pinned_at.desc())
|
||||||
|
.offset(skip)
|
||||||
|
.limit(limit)
|
||||||
|
.all()
|
||||||
|
)
|
||||||
|
return [MessageModel.model_validate(message) for message in all_messages]
|
||||||
|
|
||||||
def update_message_by_id(
|
def update_message_by_id(
|
||||||
self, id: str, form_data: MessageForm
|
self, id: str, form_data: MessageForm
|
||||||
) -> Optional[MessageModel]:
|
) -> Optional[MessageModel]:
|
||||||
|
|
@ -299,10 +357,44 @@ class MessageTable:
|
||||||
db.refresh(message)
|
db.refresh(message)
|
||||||
return MessageModel.model_validate(message) if message else None
|
return MessageModel.model_validate(message) if message else None
|
||||||
|
|
||||||
|
def update_is_pinned_by_id(
|
||||||
|
self, id: str, is_pinned: bool, pinned_by: Optional[str] = None
|
||||||
|
) -> Optional[MessageModel]:
|
||||||
|
with get_db() as db:
|
||||||
|
message = db.get(Message, id)
|
||||||
|
message.is_pinned = is_pinned
|
||||||
|
message.pinned_at = int(time.time_ns()) if is_pinned else None
|
||||||
|
message.pinned_by = pinned_by if is_pinned else None
|
||||||
|
db.commit()
|
||||||
|
db.refresh(message)
|
||||||
|
return MessageModel.model_validate(message) if message else None
|
||||||
|
|
||||||
|
def get_unread_message_count(
|
||||||
|
self, channel_id: str, user_id: str, last_read_at: Optional[int] = None
|
||||||
|
) -> int:
|
||||||
|
with get_db() as db:
|
||||||
|
query = db.query(Message).filter(
|
||||||
|
Message.channel_id == channel_id,
|
||||||
|
Message.parent_id == None, # only count top-level messages
|
||||||
|
Message.created_at > (last_read_at if last_read_at else 0),
|
||||||
|
)
|
||||||
|
if user_id:
|
||||||
|
query = query.filter(Message.user_id != user_id)
|
||||||
|
return query.count()
|
||||||
|
|
||||||
def add_reaction_to_message(
|
def add_reaction_to_message(
|
||||||
self, id: str, user_id: str, name: str
|
self, id: str, user_id: str, name: str
|
||||||
) -> Optional[MessageReactionModel]:
|
) -> Optional[MessageReactionModel]:
|
||||||
with get_db() as db:
|
with get_db() as db:
|
||||||
|
# check for existing reaction
|
||||||
|
existing_reaction = (
|
||||||
|
db.query(MessageReaction)
|
||||||
|
.filter_by(message_id=id, user_id=user_id, name=name)
|
||||||
|
.first()
|
||||||
|
)
|
||||||
|
if existing_reaction:
|
||||||
|
return MessageReactionModel.model_validate(existing_reaction)
|
||||||
|
|
||||||
reaction_id = str(uuid.uuid4())
|
reaction_id = str(uuid.uuid4())
|
||||||
reaction = MessageReactionModel(
|
reaction = MessageReactionModel(
|
||||||
id=reaction_id,
|
id=reaction_id,
|
||||||
|
|
@ -319,17 +411,30 @@ class MessageTable:
|
||||||
|
|
||||||
def get_reactions_by_message_id(self, id: str) -> list[Reactions]:
|
def get_reactions_by_message_id(self, id: str) -> list[Reactions]:
|
||||||
with get_db() as db:
|
with get_db() as db:
|
||||||
all_reactions = db.query(MessageReaction).filter_by(message_id=id).all()
|
# JOIN User so all user info is fetched in one query
|
||||||
|
results = (
|
||||||
|
db.query(MessageReaction, User)
|
||||||
|
.join(User, MessageReaction.user_id == User.id)
|
||||||
|
.filter(MessageReaction.message_id == id)
|
||||||
|
.all()
|
||||||
|
)
|
||||||
|
|
||||||
reactions = {}
|
reactions = {}
|
||||||
for reaction in all_reactions:
|
|
||||||
|
for reaction, user in results:
|
||||||
if reaction.name not in reactions:
|
if reaction.name not in reactions:
|
||||||
reactions[reaction.name] = {
|
reactions[reaction.name] = {
|
||||||
"name": reaction.name,
|
"name": reaction.name,
|
||||||
"user_ids": [],
|
"users": [],
|
||||||
"count": 0,
|
"count": 0,
|
||||||
}
|
}
|
||||||
reactions[reaction.name]["user_ids"].append(reaction.user_id)
|
|
||||||
|
reactions[reaction.name]["users"].append(
|
||||||
|
{
|
||||||
|
"id": user.id,
|
||||||
|
"name": user.name,
|
||||||
|
}
|
||||||
|
)
|
||||||
reactions[reaction.name]["count"] += 1
|
reactions[reaction.name]["count"] += 1
|
||||||
|
|
||||||
return [Reactions(**reaction) for reaction in reactions.values()]
|
return [Reactions(**reaction) for reaction in reactions.values()]
|
||||||
|
|
|
||||||
|
|
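
The slim response model above collapses a message's `data` payload to a boolean so list endpoints do not ship full payloads over the wire. A minimal, self-contained sketch of the same validator pattern; the model name and fields here are illustrative, not the app's:

    from typing import Optional

    from pydantic import BaseModel, field_validator


    class SlimMessage(BaseModel):
        content: str
        data: Optional[bool] = None

        @field_validator("data", mode="before")
        def convert_data_to_bool(cls, v):
            # No data or not a dict -> False
            if not isinstance(v, dict):
                return False
            # True if ANY value in the dict is non-empty
            return any(bool(val) for val in v.values())


    print(SlimMessage(content="hi", data={"files": []}).data)   # False
    print(SlimMessage(content="hi", data={"files": [1]}).data)  # True
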
@@ -13,6 +13,8 @@ from pydantic import BaseModel, ConfigDict

 from sqlalchemy import String, cast, or_, and_, func
 from sqlalchemy.dialects import postgresql, sqlite
+from sqlalchemy.dialects.postgresql import JSONB
 from sqlalchemy import BigInteger, Column, Text, JSON, Boolean


@@ -53,7 +55,7 @@ class ModelMeta(BaseModel):
 class Model(Base):
     __tablename__ = "model"

-    id = Column(Text, primary_key=True)
+    id = Column(Text, primary_key=True, unique=True)
     """
     The model's id as used in the API. If set to an existing model, it will override the model.
     """

@@ -220,6 +222,48 @@ class ModelsTable:
             or has_access(user_id, permission, model.access_control, user_group_ids)
         ]

+    def _has_permission(self, db, query, filter: dict, permission: str = "read"):
+        group_ids = filter.get("group_ids", [])
+        user_id = filter.get("user_id")
+
+        dialect_name = db.bind.dialect.name
+
+        # Public access
+        conditions = []
+        if group_ids or user_id:
+            conditions.extend(
+                [
+                    Model.access_control.is_(None),
+                    cast(Model.access_control, String) == "null",
+                ]
+            )
+
+        # User-level permission
+        if user_id:
+            conditions.append(Model.user_id == user_id)
+
+        # Group-level permission
+        if group_ids:
+            group_conditions = []
+            for gid in group_ids:
+                if dialect_name == "sqlite":
+                    group_conditions.append(
+                        Model.access_control[permission]["group_ids"].contains([gid])
+                    )
+                elif dialect_name == "postgresql":
+                    group_conditions.append(
+                        cast(
+                            Model.access_control[permission]["group_ids"],
+                            JSONB,
+                        ).contains([gid])
+                    )
+            conditions.append(or_(*group_conditions))
+
+        if conditions:
+            query = query.filter(or_(*conditions))
+
+        return query
+
     def search_models(
         self, user_id: str, filter: dict = {}, skip: int = 0, limit: int = 30
     ) -> ModelListResponse:

@@ -238,16 +282,20 @@ class ModelsTable:
                 )
             )

-            if filter.get("user_id"):
-                query = query.filter(Model.user_id == filter.get("user_id"))
-
             view_option = filter.get("view_option")

             if view_option == "created":
                 query = query.filter(Model.user_id == user_id)
             elif view_option == "shared":
                 query = query.filter(Model.user_id != user_id)

+            # Apply access control filtering
+            query = self._has_permission(
+                db,
+                query,
+                filter,
+                permission="write",
+            )
+
             tag = filter.get("tag")
             if tag:
                 # TODO: This is a simple implementation and should be improved for performance

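
The `_has_permission` helper above branches on the database dialect because SQLite's generic JSON type and Postgres JSONB need different containment expressions. A hedged, standalone sketch of that branching; the `Item` model is a hypothetical stand-in, and the JSON expressions mirror the ones used in the diff:

    from sqlalchemy import JSON, Column, Text, cast, or_
    from sqlalchemy.dialects.postgresql import JSONB
    from sqlalchemy.orm import declarative_base

    Base = declarative_base()


    class Item(Base):  # hypothetical model, for illustration only
        __tablename__ = "item"
        id = Column(Text, primary_key=True)
        user_id = Column(Text)
        access_control = Column(JSON, nullable=True)


    def filter_by_group_read(query, dialect_name: str, group_ids: list[str]):
        # Build one containment condition per group id, dialect-specific.
        conditions = []
        for gid in group_ids:
            if dialect_name == "sqlite":
                conditions.append(
                    Item.access_control["read"]["group_ids"].contains([gid])
                )
            elif dialect_name == "postgresql":
                conditions.append(
                    cast(Item.access_control["read"]["group_ids"], JSONB).contains([gid])
                )
        return query.filter(or_(*conditions)) if conditions else query
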
@@ -7,12 +7,15 @@ from functools import lru_cache
 from open_webui.internal.db import Base, get_db
 from open_webui.models.groups import Groups
 from open_webui.utils.access_control import has_access
-from open_webui.models.users import Users, UserResponse
+from open_webui.models.users import User, UserModel, Users, UserResponse


 from pydantic import BaseModel, ConfigDict
 from sqlalchemy import BigInteger, Boolean, Column, String, Text, JSON
-from sqlalchemy import or_, func, select, and_, text
+from sqlalchemy.dialects.postgresql import JSONB
+
+from sqlalchemy import or_, func, select, and_, text, cast, or_, and_, func
 from sqlalchemy.sql import exists

 ####################

@@ -23,7 +26,7 @@ from sqlalchemy.sql import exists
 class Note(Base):
     __tablename__ = "note"

-    id = Column(Text, primary_key=True)
+    id = Column(Text, primary_key=True, unique=True)
     user_id = Column(Text)

     title = Column(Text)

@@ -75,7 +78,138 @@ class NoteUserResponse(NoteModel):
     user: Optional[UserResponse] = None


+class NoteItemResponse(BaseModel):
+    id: str
+    title: str
+    data: Optional[dict]
+    updated_at: int
+    created_at: int
+    user: Optional[UserResponse] = None
+
+
+class NoteListResponse(BaseModel):
+    items: list[NoteUserResponse]
+    total: int
+
+
 class NoteTable:
+    def _has_permission(self, db, query, filter: dict, permission: str = "read"):
+        group_ids = filter.get("group_ids", [])
+        user_id = filter.get("user_id")
+        dialect_name = db.bind.dialect.name
+
+        conditions = []
+
+        # Handle read_only permission separately
+        if permission == "read_only":
+            # For read_only, we want items where:
+            # 1. User has explicit read permission (via groups or user-level)
+            # 2. BUT does NOT have write permission
+            # 3. Public items are NOT considered read_only
+
+            read_conditions = []
+
+            # Group-level read permission
+            if group_ids:
+                group_read_conditions = []
+                for gid in group_ids:
+                    if dialect_name == "sqlite":
+                        group_read_conditions.append(
+                            Note.access_control["read"]["group_ids"].contains([gid])
+                        )
+                    elif dialect_name == "postgresql":
+                        group_read_conditions.append(
+                            cast(
+                                Note.access_control["read"]["group_ids"],
+                                JSONB,
+                            ).contains([gid])
+                        )
+
+                if group_read_conditions:
+                    read_conditions.append(or_(*group_read_conditions))
+
+            # Combine read conditions
+            if read_conditions:
+                has_read = or_(*read_conditions)
+            else:
+                # If no read conditions, return empty result
+                return query.filter(False)
+
+            # Now exclude items where user has write permission
+            write_exclusions = []
+
+            # Exclude items owned by user (they have implicit write)
+            if user_id:
+                write_exclusions.append(Note.user_id != user_id)
+
+            # Exclude items where user has explicit write permission via groups
+            if group_ids:
+                group_write_conditions = []
+                for gid in group_ids:
+                    if dialect_name == "sqlite":
+                        group_write_conditions.append(
+                            Note.access_control["write"]["group_ids"].contains([gid])
+                        )
+                    elif dialect_name == "postgresql":
+                        group_write_conditions.append(
+                            cast(
+                                Note.access_control["write"]["group_ids"],
+                                JSONB,
+                            ).contains([gid])
+                        )
+
+                if group_write_conditions:
+                    # User should NOT have write permission
+                    write_exclusions.append(~or_(*group_write_conditions))
+
+            # Exclude public items (items without access_control)
+            write_exclusions.append(Note.access_control.isnot(None))
+            write_exclusions.append(cast(Note.access_control, String) != "null")
+
+            # Combine: has read AND does not have write AND not public
+            if write_exclusions:
+                query = query.filter(and_(has_read, *write_exclusions))
+            else:
+                query = query.filter(has_read)
+
+            return query
+
+        # Original logic for other permissions (read, write, etc.)
+        # Public access conditions
+        if group_ids or user_id:
+            conditions.extend(
+                [
+                    Note.access_control.is_(None),
+                    cast(Note.access_control, String) == "null",
+                ]
+            )
+
+        # User-level permission (owner has all permissions)
+        if user_id:
+            conditions.append(Note.user_id == user_id)
+
+        # Group-level permission
+        if group_ids:
+            group_conditions = []
+            for gid in group_ids:
+                if dialect_name == "sqlite":
+                    group_conditions.append(
+                        Note.access_control[permission]["group_ids"].contains([gid])
+                    )
+                elif dialect_name == "postgresql":
+                    group_conditions.append(
+                        cast(
+                            Note.access_control[permission]["group_ids"],
+                            JSONB,
+                        ).contains([gid])
+                    )
+            conditions.append(or_(*group_conditions))
+
+        if conditions:
+            query = query.filter(or_(*conditions))
+
+        return query
+
     def insert_new_note(
         self,
         form_data: NoteForm,

@@ -110,15 +244,105 @@ class NoteTable:
             notes = query.all()
             return [NoteModel.model_validate(note) for note in notes]

+    def search_notes(
+        self, user_id: str, filter: dict = {}, skip: int = 0, limit: int = 30
+    ) -> NoteListResponse:
+        with get_db() as db:
+            query = db.query(Note, User).outerjoin(User, User.id == Note.user_id)
+            if filter:
+                query_key = filter.get("query")
+                if query_key:
+                    query = query.filter(
+                        or_(
+                            Note.title.ilike(f"%{query_key}%"),
+                            Note.data["content"]["md"].ilike(f"%{query_key}%"),
+                        )
+                    )
+
+                view_option = filter.get("view_option")
+                if view_option == "created":
+                    query = query.filter(Note.user_id == user_id)
+                elif view_option == "shared":
+                    query = query.filter(Note.user_id != user_id)
+
+                # Apply access control filtering
+                if "permission" in filter:
+                    permission = filter["permission"]
+                else:
+                    permission = "write"
+
+                query = self._has_permission(
+                    db,
+                    query,
+                    filter,
+                    permission=permission,
+                )
+
+                order_by = filter.get("order_by")
+                direction = filter.get("direction")
+
+                if order_by == "name":
+                    if direction == "asc":
+                        query = query.order_by(Note.title.asc())
+                    else:
+                        query = query.order_by(Note.title.desc())
+                elif order_by == "created_at":
+                    if direction == "asc":
+                        query = query.order_by(Note.created_at.asc())
+                    else:
+                        query = query.order_by(Note.created_at.desc())
+                elif order_by == "updated_at":
+                    if direction == "asc":
+                        query = query.order_by(Note.updated_at.asc())
+                    else:
+                        query = query.order_by(Note.updated_at.desc())
+                else:
+                    query = query.order_by(Note.updated_at.desc())
+
+            else:
+                query = query.order_by(Note.updated_at.desc())
+
+            # Count BEFORE pagination
+            total = query.count()
+
+            if skip:
+                query = query.offset(skip)
+            if limit:
+                query = query.limit(limit)
+
+            items = query.all()
+
+            notes = []
+            for note, user in items:
+                notes.append(
+                    NoteUserResponse(
+                        **NoteModel.model_validate(note).model_dump(),
+                        user=(
+                            UserResponse(**UserModel.model_validate(user).model_dump())
+                            if user
+                            else None
+                        ),
+                    )
+                )
+
+            return NoteListResponse(items=notes, total=total)
+
     def get_notes_by_user_id(
         self,
         user_id: str,
+        permission: str = "read",
         skip: Optional[int] = None,
         limit: Optional[int] = None,
     ) -> list[NoteModel]:
         with get_db() as db:
-            query = db.query(Note).filter(Note.user_id == user_id)
-            query = query.order_by(Note.updated_at.desc())
+            user_group_ids = [
+                group.id for group in Groups.get_groups_by_member_id(user_id)
+            ]
+
+            query = db.query(Note).order_by(Note.updated_at.desc())
+            query = self._has_permission(
+                db, query, {"user_id": user_id, "group_ids": user_group_ids}, permission
+            )

             if skip is not None:
                 query = query.offset(skip)

@@ -128,56 +352,6 @@ class NoteTable:
             notes = query.all()
             return [NoteModel.model_validate(note) for note in notes]

-    def get_notes_by_permission(
-        self,
-        user_id: str,
-        permission: str = "write",
-        skip: Optional[int] = None,
-        limit: Optional[int] = None,
-    ) -> list[NoteModel]:
-        with get_db() as db:
-            user_groups = Groups.get_groups_by_member_id(user_id)
-            user_group_ids = {group.id for group in user_groups}
-
-            # Order newest-first. We stream to keep memory usage low.
-            query = (
-                db.query(Note)
-                .order_by(Note.updated_at.desc())
-                .execution_options(stream_results=True)
-                .yield_per(256)
-            )
-
-            results: list[NoteModel] = []
-            n_skipped = 0
-
-            for note in query:
-                # Fast-pass #1: owner
-                if note.user_id == user_id:
-                    permitted = True
-                # Fast-pass #2: public/open
-                elif note.access_control is None:
-                    # Technically this should mean public access for both read and write, but we'll only do read for now
-                    # We might want to change this behavior later
-                    permitted = permission == "read"
-                else:
-                    permitted = has_access(
-                        user_id, permission, note.access_control, user_group_ids
-                    )
-
-                if not permitted:
-                    continue
-
-                # Apply skip AFTER permission filtering so it counts only accessible notes
-                if skip and n_skipped < skip:
-                    n_skipped += 1
-                    continue
-
-                results.append(NoteModel.model_validate(note))
-                if limit is not None and len(results) >= limit:
-                    break
-
-            return results
-
     def get_note_by_id(self, id: str) -> Optional[NoteModel]:
         with get_db() as db:
             note = db.query(Note).filter(Note.id == id).first()

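
A hedged usage sketch for the new `search_notes()` API, assuming the filter keys the implementation reads (`query`, `view_option`, `permission`, `group_ids`, `order_by`, `direction`) and a module-level `Notes = NoteTable()` singleton; all values are illustrative.

    filter = {
        "query": "roadmap",          # matched against title and data.content.md
        "view_option": "shared",     # "created" | "shared" | unset
        "permission": "read_only",   # defaults to "write" when omitted
        "group_ids": ["group-123"],  # caller's group memberships
        "order_by": "updated_at",    # "name" | "created_at" | "updated_at"
        "direction": "desc",
    }

    result = Notes.search_notes(user_id="user-1", filter=filter, skip=0, limit=30)
    print(result.total, [note.id for note in result.items])
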
@@ -25,7 +25,7 @@ log.setLevel(SRC_LOG_LEVELS["MODELS"])
 class OAuthSession(Base):
     __tablename__ = "oauth_session"

-    id = Column(Text, primary_key=True)
+    id = Column(Text, primary_key=True, unique=True)
     user_id = Column(Text, nullable=False)
     provider = Column(Text, nullable=False)
     token = Column(

@@ -24,7 +24,7 @@ log.setLevel(SRC_LOG_LEVELS["MODELS"])
 class Tool(Base):
     __tablename__ = "tool"

-    id = Column(String, primary_key=True)
+    id = Column(String, primary_key=True, unique=True)
     user_id = Column(String)
     name = Column(Text)
     content = Column(Text)

@@ -5,14 +5,29 @@ from open_webui.internal.db import Base, JSONField, get_db

 from open_webui.env import DATABASE_USER_ACTIVE_STATUS_UPDATE_INTERVAL
 from open_webui.models.chats import Chats
 from open_webui.models.groups import Groups, GroupMember
+from open_webui.models.channels import ChannelMember

 from open_webui.utils.misc import throttle


 from pydantic import BaseModel, ConfigDict
-from sqlalchemy import BigInteger, Column, String, Text, Date, exists, select
+from sqlalchemy import (
+    BigInteger,
+    JSON,
+    Column,
+    String,
+    Boolean,
+    Text,
+    Date,
+    exists,
+    select,
+    cast,
+)
 from sqlalchemy import or_, case
+from sqlalchemy.dialects.postgresql import JSONB

 import datetime

@@ -21,59 +36,71 @@ import datetime
 ####################


-class User(Base):
-    __tablename__ = "user"
-
-    id = Column(String, primary_key=True)
-    name = Column(String)
-
-    email = Column(String)
-    username = Column(String(50), nullable=True)
-
-    role = Column(String)
-    profile_image_url = Column(Text)
-
-    bio = Column(Text, nullable=True)
-    gender = Column(Text, nullable=True)
-    date_of_birth = Column(Date, nullable=True)
-
-    info = Column(JSONField, nullable=True)
-    settings = Column(JSONField, nullable=True)
-
-    api_key = Column(String, nullable=True, unique=True)
-    oauth_sub = Column(Text, unique=True)
-
-    last_active_at = Column(BigInteger)
-
-    updated_at = Column(BigInteger)
-    created_at = Column(BigInteger)
-
-
 class UserSettings(BaseModel):
     ui: Optional[dict] = {}
     model_config = ConfigDict(extra="allow")
     pass


+class User(Base):
+    __tablename__ = "user"
+
+    id = Column(String, primary_key=True, unique=True)
+    email = Column(String)
+    username = Column(String(50), nullable=True)
+    role = Column(String)
+
+    name = Column(String)
+
+    profile_image_url = Column(Text)
+    profile_banner_image_url = Column(Text, nullable=True)
+
+    bio = Column(Text, nullable=True)
+    gender = Column(Text, nullable=True)
+    date_of_birth = Column(Date, nullable=True)
+    timezone = Column(String, nullable=True)
+
+    presence_state = Column(String, nullable=True)
+    status_emoji = Column(String, nullable=True)
+    status_message = Column(Text, nullable=True)
+    status_expires_at = Column(BigInteger, nullable=True)
+
+    info = Column(JSON, nullable=True)
+    settings = Column(JSON, nullable=True)
+
+    oauth = Column(JSON, nullable=True)
+
+    last_active_at = Column(BigInteger)
+    updated_at = Column(BigInteger)
+    created_at = Column(BigInteger)
+
+
 class UserModel(BaseModel):
     id: str
-    name: str

     email: str
     username: Optional[str] = None

     role: str = "pending"

+    name: str
+
     profile_image_url: str
+    profile_banner_image_url: Optional[str] = None

     bio: Optional[str] = None
     gender: Optional[str] = None
     date_of_birth: Optional[datetime.date] = None
+    timezone: Optional[str] = None
+
+    presence_state: Optional[str] = None
+    status_emoji: Optional[str] = None
+    status_message: Optional[str] = None
+    status_expires_at: Optional[int] = None

     info: Optional[dict] = None
     settings: Optional[UserSettings] = None

-    api_key: Optional[str] = None
-    oauth_sub: Optional[str] = None
+    oauth: Optional[dict] = None

     last_active_at: int  # timestamp in epoch
     updated_at: int  # timestamp in epoch

@@ -82,6 +109,38 @@ class UserModel(BaseModel):
     model_config = ConfigDict(from_attributes=True)


+class UserStatusModel(UserModel):
+    is_active: bool = False
+
+    model_config = ConfigDict(from_attributes=True)
+
+
+class ApiKey(Base):
+    __tablename__ = "api_key"
+
+    id = Column(Text, primary_key=True, unique=True)
+    user_id = Column(Text, nullable=False)
+    key = Column(Text, unique=True, nullable=False)
+    data = Column(JSON, nullable=True)
+    expires_at = Column(BigInteger, nullable=True)
+    last_used_at = Column(BigInteger, nullable=True)
+    created_at = Column(BigInteger, nullable=False)
+    updated_at = Column(BigInteger, nullable=False)
+
+
+class ApiKeyModel(BaseModel):
+    id: str
+    user_id: str
+    key: str
+    data: Optional[dict] = None
+    expires_at: Optional[int] = None
+    last_used_at: Optional[int] = None
+    created_at: int  # timestamp in epoch
+    updated_at: int  # timestamp in epoch
+
+    model_config = ConfigDict(from_attributes=True)
+
+
 ####################
 # Forms
 ####################

@@ -113,7 +172,13 @@ class UserGroupIdsListResponse(BaseModel):
     total: int


-class UserInfoResponse(BaseModel):
+class UserStatus(BaseModel):
+    status_emoji: Optional[str] = None
+    status_message: Optional[str] = None
+    status_expires_at: Optional[int] = None
+
+
+class UserInfoResponse(UserStatus):
     id: str
     name: str
     email: str

@@ -125,6 +190,12 @@ class UserIdNameResponse(BaseModel):
     name: str


+class UserIdNameStatusResponse(UserStatus):
+    id: str
+    name: str
+    is_active: Optional[bool] = None
+
+
 class UserInfoListResponse(BaseModel):
     users: list[UserInfoResponse]
     total: int

@@ -135,18 +206,18 @@ class UserIdNameListResponse(BaseModel):
     total: int


-class UserResponse(BaseModel):
-    id: str
-    name: str
-    email: str
-    role: str
-    profile_image_url: str
-
-
 class UserNameResponse(BaseModel):
     id: str
     name: str
     role: str


+class UserResponse(UserNameResponse):
+    email: str
+
+
+class UserProfileImageResponse(UserNameResponse):
+    email: str
     profile_image_url: str


@@ -171,20 +242,20 @@ class UsersTable:
         email: str,
         profile_image_url: str = "/user.png",
         role: str = "pending",
-        oauth_sub: Optional[str] = None,
+        oauth: Optional[dict] = None,
     ) -> Optional[UserModel]:
         with get_db() as db:
             user = UserModel(
                 **{
                     "id": id,
-                    "name": name,
                     "email": email,
+                    "name": name,
                     "role": role,
                     "profile_image_url": profile_image_url,
                     "last_active_at": int(time.time()),
                     "created_at": int(time.time()),
                     "updated_at": int(time.time()),
-                    "oauth_sub": oauth_sub,
+                    "oauth": oauth,
                 }
             )
             result = User(**user.model_dump())

@@ -207,8 +278,13 @@ class UsersTable:
     def get_user_by_api_key(self, api_key: str) -> Optional[UserModel]:
         try:
             with get_db() as db:
-                user = db.query(User).filter_by(api_key=api_key).first()
-                return UserModel.model_validate(user)
+                user = (
+                    db.query(User)
+                    .join(ApiKey, User.id == ApiKey.user_id)
+                    .filter(ApiKey.key == api_key)
+                    .first()
+                )
+                return UserModel.model_validate(user) if user else None
         except Exception:
             return None

@@ -220,12 +296,23 @@ class UsersTable:
         except Exception:
             return None

-    def get_user_by_oauth_sub(self, sub: str) -> Optional[UserModel]:
+    def get_user_by_oauth_sub(self, provider: str, sub: str) -> Optional[UserModel]:
         try:
-            with get_db() as db:
-                user = db.query(User).filter_by(oauth_sub=sub).first()
-                return UserModel.model_validate(user)
-        except Exception:
+            with get_db() as db:  # type: Session
+                dialect_name = db.bind.dialect.name
+
+                query = db.query(User)
+                if dialect_name == "sqlite":
+                    query = query.filter(User.oauth.contains({provider: {"sub": sub}}))
+                elif dialect_name == "postgresql":
+                    query = query.filter(
+                        User.oauth[provider].cast(JSONB)["sub"].astext == sub
+                    )
+
+                user = query.first()
+                return UserModel.model_validate(user) if user else None
+        except Exception as e:
+            # You may want to log the exception here
             return None

     def get_users(

@@ -248,6 +335,17 @@ class UsersTable:
                     )
                 )

+            channel_id = filter.get("channel_id")
+            if channel_id:
+                query = query.filter(
+                    exists(
+                        select(ChannelMember.id).where(
+                            ChannelMember.user_id == User.id,
+                            ChannelMember.channel_id == channel_id,
+                        )
+                    )
+                )
+
             user_ids = filter.get("user_ids")
             group_ids = filter.get("group_ids")

@@ -354,7 +452,17 @@ class UsersTable:
                 "total": total,
             }

-    def get_users_by_user_ids(self, user_ids: list[str]) -> list[UserModel]:
+    def get_users_by_group_id(self, group_id: str) -> list[UserModel]:
+        with get_db() as db:
+            users = (
+                db.query(User)
+                .join(GroupMember, User.id == GroupMember.user_id)
+                .filter(GroupMember.group_id == group_id)
+                .all()
+            )
+            return [UserModel.model_validate(user) for user in users]
+
+    def get_users_by_user_ids(self, user_ids: list[str]) -> list[UserStatusModel]:
         with get_db() as db:
             users = db.query(User).filter(User.id.in_(user_ids)).all()
             return [UserModel.model_validate(user) for user in users]

@@ -410,6 +518,21 @@ class UsersTable:
         except Exception:
             return None

+    def update_user_status_by_id(
+        self, id: str, form_data: UserStatus
+    ) -> Optional[UserModel]:
+        try:
+            with get_db() as db:
+                db.query(User).filter_by(id=id).update(
+                    {**form_data.model_dump(exclude_none=True)}
+                )
+                db.commit()
+
+                user = db.query(User).filter_by(id=id).first()
+                return UserModel.model_validate(user)
+        except Exception:
+            return None
+
     def update_user_profile_image_url_by_id(
         self, id: str, profile_image_url: str
     ) -> Optional[UserModel]:

@@ -426,7 +549,7 @@ class UsersTable:
             return None

     @throttle(DATABASE_USER_ACTIVE_STATUS_UPDATE_INTERVAL)
-    def update_user_last_active_by_id(self, id: str) -> Optional[UserModel]:
+    def update_last_active_by_id(self, id: str) -> Optional[UserModel]:
         try:
             with get_db() as db:
                 db.query(User).filter_by(id=id).update(

@@ -439,16 +562,35 @@ class UsersTable:
         except Exception:
             return None

-    def update_user_oauth_sub_by_id(
-        self, id: str, oauth_sub: str
+    def update_user_oauth_by_id(
+        self, id: str, provider: str, sub: str
     ) -> Optional[UserModel]:
+        """
+        Update or insert an OAuth provider/sub pair into the user's oauth JSON field.
+        Example resulting structure:
+        {
+            "google": { "sub": "123" },
+            "github": { "sub": "abc" }
+        }
+        """
         try:
             with get_db() as db:
-                db.query(User).filter_by(id=id).update({"oauth_sub": oauth_sub})
+                user = db.query(User).filter_by(id=id).first()
+                if not user:
+                    return None
+
+                # Load existing oauth JSON or create empty
+                oauth = user.oauth or {}
+
+                # Update or insert provider entry
+                oauth[provider] = {"sub": sub}
+
+                # Persist updated JSON
+                db.query(User).filter_by(id=id).update({"oauth": oauth})
                 db.commit()

-                user = db.query(User).filter_by(id=id).first()
                 return UserModel.model_validate(user)

         except Exception:
             return None

@@ -502,23 +644,45 @@ class UsersTable:
         except Exception:
             return False

-    def update_user_api_key_by_id(self, id: str, api_key: str) -> bool:
-        try:
-            with get_db() as db:
-                result = db.query(User).filter_by(id=id).update({"api_key": api_key})
-                db.commit()
-                return True if result == 1 else False
-        except Exception:
-            return False
-
     def get_user_api_key_by_id(self, id: str) -> Optional[str]:
         try:
             with get_db() as db:
-                user = db.query(User).filter_by(id=id).first()
-                return user.api_key
+                api_key = db.query(ApiKey).filter_by(user_id=id).first()
+                return api_key.key if api_key else None
         except Exception:
             return None

+    def update_user_api_key_by_id(self, id: str, api_key: str) -> bool:
+        try:
+            with get_db() as db:
+                db.query(ApiKey).filter_by(user_id=id).delete()
+                db.commit()
+
+                now = int(time.time())
+                new_api_key = ApiKey(
+                    id=f"key_{id}",
+                    user_id=id,
+                    key=api_key,
+                    created_at=now,
+                    updated_at=now,
+                )
+                db.add(new_api_key)
+                db.commit()
+
+                return True
+
+        except Exception:
+            return False
+
+    def delete_user_api_key_by_id(self, id: str) -> bool:
+        try:
+            with get_db() as db:
+                db.query(ApiKey).filter_by(user_id=id).delete()
+                db.commit()
+                return True
+        except Exception:
+            return False
+
     def get_valid_user_ids(self, user_ids: list[str]) -> list[str]:
         with get_db() as db:
             users = db.query(User).filter(User.id.in_(user_ids)).all()

@@ -532,5 +696,23 @@ class UsersTable:
         else:
             return None

+    def get_active_user_count(self) -> int:
+        with get_db() as db:
+            # Consider user active if last_active_at within the last 3 minutes
+            three_minutes_ago = int(time.time()) - 180
+            count = (
+                db.query(User).filter(User.last_active_at >= three_minutes_ago).count()
+            )
+            return count
+
+    def is_user_active(self, user_id: str) -> bool:
+        with get_db() as db:
+            user = db.query(User).filter_by(id=user_id).first()
+            if user and user.last_active_at:
+                # Consider user active if last_active_at within the last 3 minutes
+                three_minutes_ago = int(time.time()) - 180
+                return user.last_active_at >= three_minutes_ago
+            return False
+

 Users = UsersTable()

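
The `oauth_sub` column is replaced by a per-provider `oauth` JSON map. A small, runnable sketch of the merge semantics described in the `update_user_oauth_by_id` docstring; the helper is a plain-dict rendering of what the method persists:

    def merge_oauth(oauth: dict | None, provider: str, sub: str) -> dict:
        # Update or insert the provider entry, preserving other providers.
        oauth = dict(oauth or {})
        oauth[provider] = {"sub": sub}
        return oauth


    state = merge_oauth(None, "google", "123")
    state = merge_oauth(state, "github", "abc")
    assert state == {"google": {"sub": "123"}, "github": {"sub": "abc"}}
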
@@ -144,19 +144,17 @@ class DoclingLoader:
         with open(self.file_path, "rb") as f:
             headers = {}
             if self.api_key:
-                headers["Authorization"] = f"Bearer {self.api_key}"
+                headers["X-Api-Key"] = f"Bearer {self.api_key}"

-            files = {
-                "files": (
-                    self.file_path,
-                    f,
-                    self.mime_type or "application/octet-stream",
-                )
-            }
-
             r = requests.post(
                 f"{self.url}/v1/convert/file",
-                files=files,
+                files={
+                    "files": (
+                        self.file_path,
+                        f,
+                        self.mime_type or "application/octet-stream",
+                    )
+                },
                 data={
                     "image_export_mode": "placeholder",
                     **self.params,

@@ -322,12 +320,14 @@ class Loader:
                     file_path=file_path,
                     api_endpoint=self.kwargs.get("DOCUMENT_INTELLIGENCE_ENDPOINT"),
                    api_key=self.kwargs.get("DOCUMENT_INTELLIGENCE_KEY"),
+                    api_model=self.kwargs.get("DOCUMENT_INTELLIGENCE_MODEL"),
                 )
             else:
                 loader = AzureAIDocumentIntelligenceLoader(
                     file_path=file_path,
                     api_endpoint=self.kwargs.get("DOCUMENT_INTELLIGENCE_ENDPOINT"),
                     azure_credential=DefaultAzureCredential(),
+                    api_model=self.kwargs.get("DOCUMENT_INTELLIGENCE_MODEL"),
                 )
         elif self.engine == "mineru" and file_ext in [
             "pdf"

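
DoclingLoader now sends its API key in an `X-Api-Key` header (keeping the `Bearer ` prefix, as in the diff) and builds the multipart payload inline. A minimal request sketch; the URL, key, and file are placeholders:

    import requests

    url = "http://localhost:5001"  # assumed Docling server address
    api_key = "example-key"        # placeholder

    headers = {}
    if api_key:
        headers["X-Api-Key"] = f"Bearer {api_key}"

    with open("document.pdf", "rb") as f:
        r = requests.post(
            f"{url}/v1/convert/file",
            files={"files": ("document.pdf", f, "application/pdf")},
            data={"image_export_mode": "placeholder"},
            headers=headers,
        )
    r.raise_for_status()
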
@@ -1088,23 +1088,19 @@ async def get_sources_from_items(
                     or knowledge_base.user_id == user.id
                     or has_access(user.id, "read", knowledge_base.access_control)
                 ):
-                    file_ids = knowledge_base.data.get("file_ids", [])
+                    files = Knowledges.get_files_by_id(knowledge_base.id)

                     documents = []
                     metadatas = []
-                    for file_id in file_ids:
-                        file_object = Files.get_file_by_id(file_id)
-
-                        if file_object:
-                            documents.append(file_object.data.get("content", ""))
-                            metadatas.append(
-                                {
-                                    "file_id": file_id,
-                                    "name": file_object.filename,
-                                    "source": file_object.filename,
-                                }
-                            )
+                    for file in files:
+                        documents.append(file.data.get("content", ""))
+                        metadatas.append(
+                            {
+                                "file_id": file.id,
+                                "name": file.filename,
+                                "source": file.filename,
+                            }
+                        )

                     query_result = {
                         "documents": [documents],

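
The hunk above replaces the per-file lookup loop (one query per `file_id`) with a single fetch through the new knowledge/file link table. A hedged sketch of the resulting shape, assuming `get_files_by_id` returns objects with `id`, `filename`, and `data` as used in the diff:

    files = Knowledges.get_files_by_id(knowledge_base.id)  # one query, not N+1

    documents = [file.data.get("content", "") for file in files]
    metadatas = [
        {"file_id": file.id, "name": file.filename, "source": file.filename}
        for file in files
    ]
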
@@ -200,23 +200,24 @@ class MilvusClient(VectorDBBase):
     def query(self, collection_name: str, filter: dict, limit: int = -1):
         connections.connect(uri=MILVUS_URI, token=MILVUS_TOKEN, db_name=MILVUS_DB)

-        # Construct the filter string for querying
         collection_name = collection_name.replace("-", "_")
         if not self.has_collection(collection_name):
             log.warning(
                 f"Query attempted on non-existent collection: {self.collection_prefix}_{collection_name}"
             )
             return None
-        filter_string = " && ".join(
-            [
-                f'metadata["{key}"] == {json.dumps(value)}'
-                for key, value in filter.items()
-            ]
-        )
+
+        filter_expressions = []
+        for key, value in filter.items():
+            if isinstance(value, str):
+                filter_expressions.append(f'metadata["{key}"] == "{value}"')
+            else:
+                filter_expressions.append(f'metadata["{key}"] == {value}')
+
+        filter_string = " && ".join(filter_expressions)
+
         collection = Collection(f"{self.collection_prefix}_{collection_name}")
         collection.load()
-        all_results = []

         try:
             log.info(

@@ -224,24 +225,25 @@ class MilvusClient(VectorDBBase):
             )

             iterator = collection.query_iterator(
-                filter=filter_string,
+                expr=filter_string,
                 output_fields=[
                     "id",
                     "data",
                     "metadata",
                 ],
-                limit=limit,  # Pass the limit directly; -1 means no limit.
+                limit=limit if limit > 0 else -1,
             )

+            all_results = []
             while True:
-                result = iterator.next()
-                if not result:
+                batch = iterator.next()
+                if not batch:
                     iterator.close()
                     break
-                all_results += result
+                all_results.extend(batch)

-            log.info(f"Total results from query: {len(all_results)}")
-            return self._result_to_get_result([all_results])
+            log.debug(f"Total results from query: {len(all_results)}")
+            return self._result_to_get_result([all_results] if all_results else [[]])

         except Exception as e:
             log.exception(

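
Both Milvus changes standardize on the same drain pattern: `query_iterator` hands back results in batches, and the caller loops until an empty batch, then closes the iterator. A standalone sketch against pymilvus; the URI, collection, and filter expression are placeholders:

    from pymilvus import Collection, connections

    connections.connect(uri="http://localhost:19530")  # placeholder Milvus endpoint

    collection = Collection("open_webui_example")  # placeholder collection
    collection.load()

    iterator = collection.query_iterator(
        expr='metadata["hash"] == "abc123"',  # placeholder filter
        output_fields=["id", "data", "metadata"],
        limit=-1,  # -1 drains everything the filter matches
    )

    all_results = []
    while True:
        batch = iterator.next()
        if not batch:
            iterator.close()
            break
        all_results.extend(batch)

    print(f"fetched {len(all_results)} rows")
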
@@ -157,7 +157,6 @@ class MilvusClient(VectorDBBase):
             for item in items
         ]
         collection.insert(entities)
-        collection.flush()

     def search(
         self, collection_name: str, vectors: List[List[float]], limit: int

@@ -263,15 +262,23 @@ class MilvusClient(VectorDBBase):
             else:
                 expr.append(f"metadata['{key}'] == {value}")

-        results = collection.query(
+        iterator = collection.query_iterator(
             expr=" and ".join(expr),
             output_fields=["id", "text", "metadata"],
-            limit=limit,
+            limit=limit if limit else -1,
         )

-        ids = [res["id"] for res in results]
-        documents = [res["text"] for res in results]
-        metadatas = [res["metadata"] for res in results]
+        all_results = []
+        while True:
+            batch = iterator.next()
+            if not batch:
+                iterator.close()
+                break
+            all_results.extend(batch)
+
+        ids = [res["id"] for res in all_results]
+        documents = [res["text"] for res in all_results]
+        metadatas = [res["metadata"] for res in all_results]

         return GetResult(ids=[ids], documents=[documents], metadatas=[metadatas])

@@ -33,7 +33,7 @@ def get_filtered_results(results, filter_list):
         except Exception:
             pass

-        if any(is_string_allowed(hostname, filter_list) for hostname in hostnames):
+        if is_string_allowed(hostnames, filter_list):
             filtered_results.append(result)
             continue

@@ -33,6 +33,7 @@ from open_webui.config import (
     PLAYWRIGHT_WS_URL,
     PLAYWRIGHT_TIMEOUT,
     WEB_LOADER_ENGINE,
+    WEB_LOADER_TIMEOUT,
     FIRECRAWL_API_BASE_URL,
     FIRECRAWL_API_KEY,
     TAVILY_API_KEY,

@@ -674,6 +675,20 @@ def get_web_loader(

     if WEB_LOADER_ENGINE.value == "" or WEB_LOADER_ENGINE.value == "safe_web":
         WebLoaderClass = SafeWebBaseLoader

+        request_kwargs = {}
+        if WEB_LOADER_TIMEOUT.value:
+            try:
+                timeout_value = float(WEB_LOADER_TIMEOUT.value)
+            except ValueError:
+                timeout_value = None
+
+            if timeout_value:
+                request_kwargs["timeout"] = timeout_value
+
+        if request_kwargs:
+            web_loader_args["requests_kwargs"] = request_kwargs
+
     if WEB_LOADER_ENGINE.value == "playwright":
         WebLoaderClass = SafePlaywrightURLLoader
         web_loader_args["playwright_timeout"] = PLAYWRIGHT_TIMEOUT.value

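
The new `WEB_LOADER_TIMEOUT` handling parses the env-backed value defensively: bad input is ignored rather than raised, and the timeout is only forwarded when it parses to a usable float. A runnable distillation of that logic:

    def build_request_kwargs(raw_timeout: str | None) -> dict:
        # Mirrors the defensive parse in get_web_loader(): ignore unparseable
        # or zero values instead of failing the whole loader setup.
        request_kwargs = {}
        if raw_timeout:
            try:
                timeout_value = float(raw_timeout)
            except ValueError:
                timeout_value = None
            if timeout_value:
                request_kwargs["timeout"] = timeout_value
        return request_kwargs


    assert build_request_kwargs("10.5") == {"timeout": 10.5}
    assert build_request_kwargs("not-a-number") == {}
    assert build_request_kwargs(None) == {}
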
@@ -6,6 +6,7 @@ import logging
 from aiohttp import ClientSession
 import urllib


 from open_webui.models.auths import (
     AddUserForm,
     ApiKey,
@@ -16,9 +17,13 @@ from open_webui.models.auths import (
     SigninResponse,
     SignupForm,
     UpdatePasswordForm,
-    UserResponse,
 )
-from open_webui.models.users import Users, UpdateProfileForm
+from open_webui.models.users import (
+    UserProfileImageResponse,
+    Users,
+    UpdateProfileForm,
+    UserStatus,
+)
 from open_webui.models.groups import Groups
 from open_webui.models.oauth_sessions import OAuthSessions
@@ -60,6 +65,11 @@ from open_webui.utils.auth import (
 )
 from open_webui.utils.webhook import post_webhook
 from open_webui.utils.access_control import get_permissions, has_permission
+from open_webui.utils.groups import apply_default_group_assignment
+
+from open_webui.utils.redis import get_redis_client
+from open_webui.utils.rate_limit import RateLimiter
+

 from typing import Optional, List
@@ -73,17 +83,21 @@ router = APIRouter()
 log = logging.getLogger(__name__)
 log.setLevel(SRC_LOG_LEVELS["MAIN"])

+signin_rate_limiter = RateLimiter(
+    redis_client=get_redis_client(), limit=5 * 3, window=60 * 3
+)
+
 ############################
 # GetSessionUser
 ############################


-class SessionUserResponse(Token, UserResponse):
+class SessionUserResponse(Token, UserProfileImageResponse):
     expires_at: Optional[int] = None
     permissions: Optional[dict] = None


-class SessionUserInfoResponse(SessionUserResponse):
+class SessionUserInfoResponse(SessionUserResponse, UserStatus):
     bio: Optional[str] = None
     gender: Optional[str] = None
     date_of_birth: Optional[datetime.date] = None
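
Note: `RateLimiter` itself lives in `open_webui/utils/rate_limit.py`, which is not part of this excerpt. A hedged fixed-window sketch that matches its call sites here (`limit=5 * 3`, `window=60 * 3`, `is_limited(key)`), assuming Redis `INCR`/`EXPIRE` with an in-memory fallback when Redis is unavailable:

```python
import time


class FixedWindowRateLimiter:
    """Hypothetical stand-in for open_webui.utils.rate_limit.RateLimiter:
    allows `limit` hits per `window` seconds per key, using Redis INCR/EXPIRE
    when a client is available and a plain dict otherwise."""

    def __init__(self, redis_client=None, limit=15, window=180):
        self.redis = redis_client
        self.limit = limit
        self.window = window
        self._memory = {}  # key -> (window_start, count)

    def is_limited(self, key: str) -> bool:
        if self.redis:
            redis_key = f"rate_limit:{key}"
            count = self.redis.incr(redis_key)
            if count == 1:
                # First hit in this window: start the expiry clock.
                self.redis.expire(redis_key, self.window)
            return count > self.limit

        now = time.time()
        start, count = self._memory.get(key, (now, 0))
        if now - start > self.window:
            start, count = now, 0  # window elapsed, reset the counter
        count += 1
        self._memory[key] = (start, count)
        return count > self.limit
```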
@@ -140,6 +154,9 @@ async def get_session_user(
         "bio": user.bio,
         "gender": user.gender,
         "date_of_birth": user.date_of_birth,
+        "status_emoji": user.status_emoji,
+        "status_message": user.status_message,
+        "status_expires_at": user.status_expires_at,
         "permissions": user_permissions,
     }
@@ -149,7 +166,7 @@ async def get_session_user(
 ############################


-@router.post("/update/profile", response_model=UserResponse)
+@router.post("/update/profile", response_model=UserProfileImageResponse)
 async def update_profile(
     form_data: UpdateProfileForm, session_user=Depends(get_verified_user)
 ):
@@ -401,6 +418,11 @@ async def ldap_auth(request: Request, response: Response, form_data: LdapForm):
                     500, detail=ERROR_MESSAGES.CREATE_USER_ERROR
                 )

+            apply_default_group_assignment(
+                request.app.state.config.DEFAULT_GROUP_ID,
+                user.id,
+            )
+
         except HTTPException:
             raise
         except Exception as err:
@@ -449,7 +471,6 @@ async def ldap_auth(request: Request, response: Response, form_data: LdapForm):
             ):
                 if ENABLE_LDAP_GROUP_CREATION:
                     Groups.create_groups_by_group_names(user.id, user_groups)
-
             try:
                 Groups.sync_groups_by_group_names(user.id, user_groups)
                 log.info(
@@ -544,6 +565,12 @@ async def signin(request: Request, response: Response, form_data: SigninForm):
                 admin_email.lower(), lambda pw: verify_password(admin_password, pw)
             )
         else:
+            if signin_rate_limiter.is_limited(form_data.email.lower()):
+                raise HTTPException(
+                    status_code=status.HTTP_429_TOO_MANY_REQUESTS,
+                    detail=ERROR_MESSAGES.RATE_LIMIT_EXCEEDED,
+                )
+
             password_bytes = form_data.password.encode("utf-8")
             if len(password_bytes) > 72:
                 # TODO: Implement other hashing algorithms that support longer passwords
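
Note: with the guard above, excess sign-in attempts surface as HTTP 429 before any password check runs. A hypothetical client-side illustration (the `/api/v1/auths/signin` path is assumed from the router prefix, not shown in this diff):

```python
import requests

# Hypothetical client loop against a local instance; demonstrates the
# 429 response once the per-email window limit is exhausted.
for attempt in range(20):
    r = requests.post(
        "http://localhost:8080/api/v1/auths/signin",
        json={"email": "user@example.com", "password": "wrong"},
    )
    if r.status_code == 429:
        print(f"rate limited after {attempt + 1} attempts")
        break
```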
@@ -700,9 +727,10 @@ async def signup(request: Request, response: Response, form_data: SignupForm):
             # Disable signup after the first user is created
             request.app.state.config.ENABLE_SIGNUP = False

-        default_group_id = getattr(request.app.state.config, "DEFAULT_GROUP_ID", "")
-        if default_group_id and default_group_id:
-            Groups.add_users_to_group(default_group_id, [user.id])
+        apply_default_group_assignment(
+            request.app.state.config.DEFAULT_GROUP_ID,
+            user.id,
+        )

         return {
             "token": token,
@@ -807,7 +835,9 @@ async def signout(request: Request, response: Response):


 @router.post("/add", response_model=SigninResponse)
-async def add_user(form_data: AddUserForm, user=Depends(get_admin_user)):
+async def add_user(
+    request: Request, form_data: AddUserForm, user=Depends(get_admin_user)
+):
     if not validate_email_format(form_data.email.lower()):
         raise HTTPException(
             status.HTTP_400_BAD_REQUEST, detail=ERROR_MESSAGES.INVALID_EMAIL_FORMAT
@@ -832,6 +862,11 @@ async def add_user(form_data: AddUserForm, user=Depends(get_admin_user)):
     )

     if user:
+        apply_default_group_assignment(
+            request.app.state.config.DEFAULT_GROUP_ID,
+            user.id,
+        )
+
         token = create_token(data={"id": user.id})
         return {
             "token": token,
@@ -901,6 +936,7 @@ async def get_admin_config(request: Request, user=Depends(get_admin_user)):
         "JWT_EXPIRES_IN": request.app.state.config.JWT_EXPIRES_IN,
         "ENABLE_COMMUNITY_SHARING": request.app.state.config.ENABLE_COMMUNITY_SHARING,
         "ENABLE_MESSAGE_RATING": request.app.state.config.ENABLE_MESSAGE_RATING,
+        "ENABLE_FOLDERS": request.app.state.config.ENABLE_FOLDERS,
         "ENABLE_CHANNELS": request.app.state.config.ENABLE_CHANNELS,
         "ENABLE_NOTES": request.app.state.config.ENABLE_NOTES,
         "ENABLE_USER_WEBHOOKS": request.app.state.config.ENABLE_USER_WEBHOOKS,
@@ -922,6 +958,7 @@ class AdminConfig(BaseModel):
     JWT_EXPIRES_IN: str
     ENABLE_COMMUNITY_SHARING: bool
     ENABLE_MESSAGE_RATING: bool
+    ENABLE_FOLDERS: bool
     ENABLE_CHANNELS: bool
     ENABLE_NOTES: bool
     ENABLE_USER_WEBHOOKS: bool
@@ -946,6 +983,7 @@ async def update_admin_config(
         form_data.API_KEYS_ALLOWED_ENDPOINTS
     )

+    request.app.state.config.ENABLE_FOLDERS = form_data.ENABLE_FOLDERS
     request.app.state.config.ENABLE_CHANNELS = form_data.ENABLE_CHANNELS
     request.app.state.config.ENABLE_NOTES = form_data.ENABLE_NOTES

@@ -988,6 +1026,7 @@ async def update_admin_config(
         "JWT_EXPIRES_IN": request.app.state.config.JWT_EXPIRES_IN,
         "ENABLE_COMMUNITY_SHARING": request.app.state.config.ENABLE_COMMUNITY_SHARING,
         "ENABLE_MESSAGE_RATING": request.app.state.config.ENABLE_MESSAGE_RATING,
+        "ENABLE_FOLDERS": request.app.state.config.ENABLE_FOLDERS,
         "ENABLE_CHANNELS": request.app.state.config.ENABLE_CHANNELS,
         "ENABLE_NOTES": request.app.state.config.ENABLE_NOTES,
         "ENABLE_USER_WEBHOOKS": request.app.state.config.ENABLE_USER_WEBHOOKS,
@@ -1130,8 +1169,7 @@ async def generate_api_key(request: Request, user=Depends(get_current_user)):
 # delete api key
 @router.delete("/api_key", response_model=bool)
 async def delete_api_key(user=Depends(get_current_user)):
-    success = Users.update_user_api_key_by_id(user.id, None)
-    return success
+    return Users.delete_user_api_key_by_id(user.id)


 # get api key
File diff suppressed because it is too large
@@ -3,6 +3,7 @@ import logging
 from typing import Optional


+from open_webui.utils.misc import get_message_list
 from open_webui.socket.main import get_event_emitter
 from open_webui.models.chats import (
     ChatForm,
@@ -66,6 +67,64 @@ def get_session_user_chat_list(
     )


+############################
+# GetChatList
+############################
+
+
+@router.get("/stats/usage", response_model=list[ChatTitleIdResponse])
+def get_session_user_chat_usage(
+    user=Depends(get_verified_user),
+):
+    try:
+        chats = Chats.get_chats_by_user_id(user.id)
+
+        chat_stats = []
+        for chat in chats:
+            messages_map = chat.chat.get("history", {}).get("messages", {})
+            message_id = chat.chat.get("history", {}).get("currentId")
+
+            if messages_map and message_id:
+                try:
+                    message_list = get_message_list(messages_map, message_id)
+                    message_count = len(message_list)
+
+                    last_assistant_message = next(
+                        (
+                            message
+                            for message in reversed(message_list)
+                            if message["role"] == "assistant"
+                        ),
+                        None,
+                    )
+
+                    model_id = (
+                        last_assistant_message.get("model", None)
+                        if last_assistant_message
+                        else None
+                    )
+                    chat_stats.append(
+                        {
+                            "id": chat.id,
+                            "model_id": model_id,
+                            "message_count": message_count,
+                            "tags": chat.meta.get("tags", []),
+                            "model_ids": chat.chat.get("models", []),
+                            "updated_at": chat.updated_at,
+                            "created_at": chat.created_at,
+                        }
+                    )
+                except Exception as e:
+                    pass
+        return chat_stats
+
+    except Exception as e:
+        log.exception(e)
+        raise HTTPException(
+            status_code=status.HTTP_400_BAD_REQUEST, detail=ERROR_MESSAGES.DEFAULT()
+        )
+
+
 ############################
 # DeleteAllChats
 ############################
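
Note: `get_message_list` comes from `open_webui.utils.misc` and is not shown in this diff. The stats endpoint only needs it to linearize the branching `history.messages` map starting from `currentId`; a sketch of that contract (the parent-pointer walk is an assumption based on how the map is used above):

```python
def get_message_list(messages: dict, message_id: str) -> list[dict]:
    # Walk parent pointers from the current leaf back to the root,
    # then reverse so the list reads oldest-first.
    chain = []
    current = messages.get(message_id)
    while current is not None:
        chain.append(current)
        current = messages.get(current.get("parentId"))
    return list(reversed(chain))


messages = {
    "a": {"id": "a", "role": "user", "parentId": None},
    "b": {"id": "b", "role": "assistant", "parentId": "a", "model": "llama3"},
}
assert [m["id"] for m in get_message_list(messages, "b")] == ["a", "b"]
```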
@@ -22,6 +22,7 @@ from fastapi import (
 )

 from fastapi.responses import FileResponse, StreamingResponse

 from open_webui.constants import ERROR_MESSAGES
 from open_webui.env import SRC_LOG_LEVELS
 from open_webui.retrieval.vector.factory import VECTOR_DB_CLIENT
@@ -34,12 +35,19 @@ from open_webui.models.files import (
     Files,
 )
 from open_webui.models.knowledge import Knowledges
+from open_webui.models.groups import Groups
+

 from open_webui.routers.knowledge import get_knowledge, get_knowledge_list
 from open_webui.routers.retrieval import ProcessFileForm, process_file
 from open_webui.routers.audio import transcribe
+
 from open_webui.storage.provider import Storage


 from open_webui.utils.auth import get_admin_user, get_verified_user
+from open_webui.utils.access_control import has_access
+
 from pydantic import BaseModel

 log = logging.getLogger(__name__)
@@ -53,31 +61,37 @@ router = APIRouter()
 ############################


+# TODO: Optimize this function to use the knowledge_file table for faster lookups.
 def has_access_to_file(
     file_id: Optional[str], access_type: str, user=Depends(get_verified_user)
 ) -> bool:
     file = Files.get_file_by_id(file_id)
     log.debug(f"Checking if user has {access_type} access to file")

     if not file:
         raise HTTPException(
             status_code=status.HTTP_404_NOT_FOUND,
             detail=ERROR_MESSAGES.NOT_FOUND,
         )

-    has_access = False
-    knowledge_base_id = file.meta.get("collection_name") if file.meta else None
+    knowledge_bases = Knowledges.get_knowledges_by_file_id(file_id)
+    user_group_ids = {group.id for group in Groups.get_groups_by_member_id(user.id)}
+
+    for knowledge_base in knowledge_bases:
+        if knowledge_base.user_id == user.id or has_access(
+            user.id, access_type, knowledge_base.access_control, user_group_ids
+        ):
+            return True
+
+    knowledge_base_id = file.meta.get("collection_name") if file.meta else None
     if knowledge_base_id:
         knowledge_bases = Knowledges.get_knowledge_bases_by_user_id(
             user.id, access_type
         )
         for knowledge_base in knowledge_bases:
             if knowledge_base.id == knowledge_base_id:
-                has_access = True
-                break
+                return True

-    return has_access
+    return False
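
Note: the rewritten check passes a precomputed set of the user's group ids into `has_access`, so group membership is queried once rather than per knowledge base. A simplified sketch of the assumed `access_control` contract (shape inferred from its use here; a missing dict is treated as public-read):

```python
def has_access(user_id, access_type, access_control, user_group_ids=None):
    # A resource with no access_control dict is treated as public-read.
    if access_control is None:
        return access_type == "read"

    permission = access_control.get(access_type, {})
    if user_id in permission.get("user_ids", []):
        return True
    # The caller passes the group-id set in, so membership is not
    # re-queried for every knowledge base in the loop above.
    return bool(set(permission.get("group_ids", [])) & set(user_group_ids or []))
```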
@@ -165,7 +179,7 @@ def upload_file_handler(
     user=Depends(get_verified_user),
     background_tasks: Optional[BackgroundTasks] = None,
 ):
-    log.info(f"file.content_type: {file.content_type}")
+    log.info(f"file.content_type: {file.content_type} {process}")

     if isinstance(metadata, str):
         try:
@@ -46,7 +46,23 @@ router = APIRouter()


 @router.get("/", response_model=list[FolderNameIdResponse])
-async def get_folders(user=Depends(get_verified_user)):
+async def get_folders(request: Request, user=Depends(get_verified_user)):
+    if request.app.state.config.ENABLE_FOLDERS is False:
+        raise HTTPException(
+            status_code=status.HTTP_403_FORBIDDEN,
+            detail=ERROR_MESSAGES.ACCESS_PROHIBITED,
+        )
+
+    if user.role != "admin" and not has_permission(
+        user.id,
+        "features.folders",
+        request.app.state.config.USER_PERMISSIONS,
+    ):
+        raise HTTPException(
+            status_code=status.HTTP_403_FORBIDDEN,
+            detail=ERROR_MESSAGES.ACCESS_PROHIBITED,
+        )
+
     folders = Folders.get_folders_by_user_id(user.id)

     # Verify folder data integrity
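
Note: `has_permission(user.id, "features.folders", ...)` resolves a dotted key against the nested permissions dict (the real helper also merges in group-level permissions, which this sketch omits). The lookup itself reduces to:

```python
def resolve_permission(permission_key: str, permissions: dict, default=True) -> bool:
    # "features.folders" -> permissions["features"]["folders"]
    node = permissions
    for part in permission_key.split("."):
        if not isinstance(node, dict) or part not in node:
            return default  # unset keys fall back to the default policy
        node = node[part]
    return bool(node)


assert resolve_permission("features.folders", {"features": {"folders": False}}) is False
assert resolve_permission("features.notes", {"features": {}}) is True
```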
@@ -3,7 +3,7 @@ from pathlib import Path
 from typing import Optional
 import logging

-from open_webui.models.users import Users
+from open_webui.models.users import Users, UserInfoResponse
 from open_webui.models.groups import (
     Groups,
     GroupForm,
@@ -32,31 +32,17 @@ router = APIRouter()

 @router.get("/", response_model=list[GroupResponse])
 async def get_groups(share: Optional[bool] = None, user=Depends(get_verified_user)):
-    if user.role == "admin":
-        groups = Groups.get_groups()
-    else:
-        groups = Groups.get_groups_by_member_id(user.id)
-
-    group_list = []
-
-    for group in groups:
-        if share is not None:
-            # Check if the group has data and a config with share key
-            if (
-                group.data
-                and "share" in group.data.get("config", {})
-                and group.data["config"]["share"] != share
-            ):
-                continue
-
-        group_list.append(
-            GroupResponse(
-                **group.model_dump(),
-                member_count=Groups.get_group_member_count_by_id(group.id),
-            )
-        )
-
-    return group_list
+    filter = {}
+    if user.role != "admin":
+        filter["member_id"] = user.id
+
+    if share is not None:
+        filter["share"] = share
+
+    groups = Groups.get_groups(filter=filter)
+
+    return groups


 ############################
@@ -106,6 +92,50 @@ async def get_group_by_id(id: str, user=Depends(get_admin_user)):
         )


+############################
+# ExportGroupById
+############################
+
+
+class GroupExportResponse(GroupResponse):
+    user_ids: list[str] = []
+    pass
+
+
+@router.get("/id/{id}/export", response_model=Optional[GroupExportResponse])
+async def export_group_by_id(id: str, user=Depends(get_admin_user)):
+    group = Groups.get_group_by_id(id)
+    if group:
+        return GroupExportResponse(
+            **group.model_dump(),
+            member_count=Groups.get_group_member_count_by_id(group.id),
+            user_ids=Groups.get_group_user_ids_by_id(group.id),
+        )
+    else:
+        raise HTTPException(
+            status_code=status.HTTP_401_UNAUTHORIZED,
+            detail=ERROR_MESSAGES.NOT_FOUND,
+        )
+
+
+############################
+# GetUsersInGroupById
+############################
+
+
+@router.post("/id/{id}/users", response_model=list[UserInfoResponse])
+async def get_users_in_group(id: str, user=Depends(get_admin_user)):
+    try:
+        users = Users.get_users_by_group_id(id)
+        return users
+    except Exception as e:
+        log.exception(f"Error adding users to group {id}: {e}")
+        raise HTTPException(
+            status_code=status.HTTP_400_BAD_REQUEST,
+            detail=ERROR_MESSAGES.DEFAULT(e),
+        )
+
+
 ############################
 # UpdateGroupById
 ############################
@@ -5,6 +5,7 @@ from fastapi.concurrency import run_in_threadpool
 import logging

 from open_webui.models.knowledge import (
+    KnowledgeFileListResponse,
     Knowledges,
     KnowledgeForm,
     KnowledgeResponse,
@@ -43,97 +44,38 @@ router = APIRouter()

 @router.get("/", response_model=list[KnowledgeUserResponse])
 async def get_knowledge(user=Depends(get_verified_user)):
+    # Return knowledge bases with read access
     knowledge_bases = []

     if user.role == "admin" and BYPASS_ADMIN_ACCESS_CONTROL:
         knowledge_bases = Knowledges.get_knowledge_bases()
     else:
         knowledge_bases = Knowledges.get_knowledge_bases_by_user_id(user.id, "read")

-    # Get files for each knowledge base
-    knowledge_with_files = []
-    for knowledge_base in knowledge_bases:
-        files = []
-        if knowledge_base.data:
-            files = Files.get_file_metadatas_by_ids(
-                knowledge_base.data.get("file_ids", [])
-            )
-
-            # Check if all files exist
-            if len(files) != len(knowledge_base.data.get("file_ids", [])):
-                missing_files = list(
-                    set(knowledge_base.data.get("file_ids", []))
-                    - set([file.id for file in files])
-                )
-                if missing_files:
-                    data = knowledge_base.data or {}
-                    file_ids = data.get("file_ids", [])
-
-                    for missing_file in missing_files:
-                        file_ids.remove(missing_file)
-
-                    data["file_ids"] = file_ids
-                    Knowledges.update_knowledge_data_by_id(
-                        id=knowledge_base.id, data=data
-                    )
-
-                    files = Files.get_file_metadatas_by_ids(file_ids)
-
-        knowledge_with_files.append(
-            KnowledgeUserResponse(
-                **knowledge_base.model_dump(),
-                files=files,
-            )
-        )
-    return knowledge_with_files
+    return [
+        KnowledgeUserResponse(
+            **knowledge_base.model_dump(),
+            files=Knowledges.get_file_metadatas_by_id(knowledge_base.id),
+        )
+        for knowledge_base in knowledge_bases
+    ]


 @router.get("/list", response_model=list[KnowledgeUserResponse])
 async def get_knowledge_list(user=Depends(get_verified_user)):
+    # Return knowledge bases with write access
     knowledge_bases = []

     if user.role == "admin" and BYPASS_ADMIN_ACCESS_CONTROL:
         knowledge_bases = Knowledges.get_knowledge_bases()
     else:
         knowledge_bases = Knowledges.get_knowledge_bases_by_user_id(user.id, "write")

-    # Get files for each knowledge base
-    knowledge_with_files = []
-    for knowledge_base in knowledge_bases:
-        files = []
-        if knowledge_base.data:
-            files = Files.get_file_metadatas_by_ids(
-                knowledge_base.data.get("file_ids", [])
-            )
-
-            # Check if all files exist
-            if len(files) != len(knowledge_base.data.get("file_ids", [])):
-                missing_files = list(
-                    set(knowledge_base.data.get("file_ids", []))
-                    - set([file.id for file in files])
-                )
-                if missing_files:
-                    data = knowledge_base.data or {}
-                    file_ids = data.get("file_ids", [])
-
-                    for missing_file in missing_files:
-                        file_ids.remove(missing_file)
-
-                    data["file_ids"] = file_ids
-                    Knowledges.update_knowledge_data_by_id(
-                        id=knowledge_base.id, data=data
-                    )
-
-                    files = Files.get_file_metadatas_by_ids(file_ids)
-
-        knowledge_with_files.append(
-            KnowledgeUserResponse(
-                **knowledge_base.model_dump(),
-                files=files,
-            )
-        )
-    return knowledge_with_files
+    return [
+        KnowledgeUserResponse(
+            **knowledge_base.model_dump(),
+            files=Knowledges.get_file_metadatas_by_id(knowledge_base.id),
+        )
+        for knowledge_base in knowledge_bases
+    ]
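
Note: both list endpoints now delegate to `Knowledges.get_file_metadatas_by_id` instead of walking `data["file_ids"]`, and an earlier hunk adds a `# TODO: Optimize this function to use the knowledge_file table` comment. That points at an association-table design; the schema below is purely a guess, since the models diff is suppressed in this view:

```python
# Hypothetical association table backing the new Knowledges file helpers;
# the real schema lives in the (suppressed) models diff.
from sqlalchemy import Column, String, BigInteger
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class KnowledgeFile(Base):
    __tablename__ = "knowledge_file"

    id = Column(String, primary_key=True)
    knowledge_id = Column(String, nullable=False, index=True)
    file_id = Column(String, nullable=False, index=True)
    user_id = Column(String)          # who attached the file
    created_at = Column(BigInteger)   # epoch seconds, matching other tables
```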
@@ -193,26 +135,9 @@ async def reindex_knowledge_files(request: Request, user=Depends(get_verified_user)

     log.info(f"Starting reindexing for {len(knowledge_bases)} knowledge bases")

-    deleted_knowledge_bases = []
-
     for knowledge_base in knowledge_bases:
-        # -- Robust error handling for missing or invalid data
-        if not knowledge_base.data or not isinstance(knowledge_base.data, dict):
-            log.warning(
-                f"Knowledge base {knowledge_base.id} has no data or invalid data ({knowledge_base.data!r}). Deleting."
-            )
-            try:
-                Knowledges.delete_knowledge_by_id(id=knowledge_base.id)
-                deleted_knowledge_bases.append(knowledge_base.id)
-            except Exception as e:
-                log.error(
-                    f"Failed to delete invalid knowledge base {knowledge_base.id}: {e}"
-                )
-            continue
-
         try:
-            file_ids = knowledge_base.data.get("file_ids", [])
-            files = Files.get_files_by_ids(file_ids)
+            files = Knowledges.get_files_by_id(knowledge_base.id)
             try:
                 if VECTOR_DB_CLIENT.has_collection(collection_name=knowledge_base.id):
                     VECTOR_DB_CLIENT.delete_collection(
@@ -252,9 +177,7 @@ async def reindex_knowledge_files(request: Request, user=Depends(get_verified_user)
     for failed in failed_files:
         log.warning(f"File ID: {failed['file_id']}, Error: {failed['error']}")

-    log.info(
-        f"Reindexing completed. Deleted {len(deleted_knowledge_bases)} invalid knowledge bases: {deleted_knowledge_bases}"
-    )
+    log.info(f"Reindexing completed.")
     return True


@@ -272,19 +195,15 @@ async def get_knowledge_by_id(id: str, user=Depends(get_verified_user)):
     knowledge = Knowledges.get_knowledge_by_id(id=id)

     if knowledge:
-
         if (
             user.role == "admin"
             or knowledge.user_id == user.id
             or has_access(user.id, "read", knowledge.access_control)
         ):
-
-            file_ids = knowledge.data.get("file_ids", []) if knowledge.data else []
-            files = Files.get_file_metadatas_by_ids(file_ids)
-
             return KnowledgeFilesResponse(
                 **knowledge.model_dump(),
-                files=files,
+                files=Knowledges.get_file_metadatas_by_id(knowledge.id),
             )
     else:
         raise HTTPException(
@@ -336,12 +255,9 @@ async def update_knowledge_by_id(

     knowledge = Knowledges.update_knowledge_by_id(id=id, form_data=form_data)
     if knowledge:
-        file_ids = knowledge.data.get("file_ids", []) if knowledge.data else []
-        files = Files.get_file_metadatas_by_ids(file_ids)
-
         return KnowledgeFilesResponse(
             **knowledge.model_dump(),
-            files=files,
+            files=Knowledges.get_file_metadatas_by_id(knowledge.id),
         )
     else:
         raise HTTPException(
@@ -350,6 +266,59 @@ async def update_knowledge_by_id(
         )


+############################
+# GetKnowledgeFilesById
+############################
+
+
+@router.get("/{id}/files", response_model=KnowledgeFileListResponse)
+async def get_knowledge_files_by_id(
+    id: str,
+    query: Optional[str] = None,
+    view_option: Optional[str] = None,
+    order_by: Optional[str] = None,
+    direction: Optional[str] = None,
+    page: Optional[int] = 1,
+    user=Depends(get_verified_user),
+):
+    knowledge = Knowledges.get_knowledge_by_id(id=id)
+    if not knowledge:
+        raise HTTPException(
+            status_code=status.HTTP_400_BAD_REQUEST,
+            detail=ERROR_MESSAGES.NOT_FOUND,
+        )
+
+    if not (
+        user.role == "admin"
+        or knowledge.user_id == user.id
+        or has_access(user.id, "read", knowledge.access_control)
+    ):
+        raise HTTPException(
+            status_code=status.HTTP_400_BAD_REQUEST,
+            detail=ERROR_MESSAGES.ACCESS_PROHIBITED,
+        )
+
+    page = max(page, 1)
+
+    limit = 30
+    skip = (page - 1) * limit
+
+    filter = {}
+    if query:
+        filter["query"] = query
+    if view_option:
+        filter["view_option"] = view_option
+    if order_by:
+        filter["order_by"] = order_by
+    if direction:
+        filter["direction"] = direction
+
+    return Knowledges.search_files_by_id(
+        id, user.id, filter=filter, skip=skip, limit=limit
+    )
+
+
 ############################
 # AddFileToKnowledge
 ############################
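
Note: the `page` parameter above maps onto a fixed 30-item window; for reference, the arithmetic in isolation:

```python
def page_to_window(page: int, limit: int = 30) -> tuple[int, int]:
    page = max(page, 1)          # page numbers are 1-based; clamp bad input
    skip = (page - 1) * limit    # rows to skip before the requested page
    return skip, limit


assert page_to_window(1) == (0, 30)
assert page_to_window(3) == (60, 30)
```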
@@ -371,7 +340,6 @@ def add_file_to_knowledge_by_id(
     user=Depends(get_verified_user),
 ):
     knowledge = Knowledges.get_knowledge_by_id(id=id)
-
     if not knowledge:
         raise HTTPException(
             status_code=status.HTTP_400_BAD_REQUEST,
@@ -400,6 +368,11 @@ def add_file_to_knowledge_by_id(
                 detail=ERROR_MESSAGES.FILE_NOT_PROCESSED,
             )

+    # Add file to knowledge base
+    Knowledges.add_file_to_knowledge_by_id(
+        knowledge_id=id, file_id=form_data.file_id, user_id=user.id
+    )
+
     # Add content to the vector database
     try:
         process_file(
@@ -415,32 +388,10 @@ def add_file_to_knowledge_by_id(
         )

     if knowledge:
-        data = knowledge.data or {}
-        file_ids = data.get("file_ids", [])
-
-        if form_data.file_id not in file_ids:
-            file_ids.append(form_data.file_id)
-            data["file_ids"] = file_ids
-
-            knowledge = Knowledges.update_knowledge_data_by_id(id=id, data=data)
-
-            if knowledge:
-                files = Files.get_file_metadatas_by_ids(file_ids)
-
-                return KnowledgeFilesResponse(
-                    **knowledge.model_dump(),
-                    files=files,
-                )
-            else:
-                raise HTTPException(
-                    status_code=status.HTTP_400_BAD_REQUEST,
-                    detail=ERROR_MESSAGES.DEFAULT("knowledge"),
-                )
-        else:
-            raise HTTPException(
-                status_code=status.HTTP_400_BAD_REQUEST,
-                detail=ERROR_MESSAGES.DEFAULT("file_id"),
-            )
+        return KnowledgeFilesResponse(
+            **knowledge.model_dump(),
+            files=Knowledges.get_file_metadatas_by_id(knowledge.id),
+        )
     else:
         raise HTTPException(
             status_code=status.HTTP_400_BAD_REQUEST,
@@ -555,14 +506,9 @@ def update_file_from_knowledge_by_id(
         )

     if knowledge:
-        data = knowledge.data or {}
-        file_ids = data.get("file_ids", [])
-
-        files = Files.get_file_metadatas_by_ids(file_ids)
-
         return KnowledgeFilesResponse(
             **knowledge.model_dump(),
-            files=files,
+            files=Knowledges.get_file_metadatas_by_id(knowledge.id),
         )
     else:
         raise HTTPException(
@@ -607,11 +553,19 @@ def remove_file_from_knowledge_by_id(
             detail=ERROR_MESSAGES.NOT_FOUND,
         )

+    Knowledges.remove_file_from_knowledge_by_id(
+        knowledge_id=id, file_id=form_data.file_id
+    )
+
     # Remove content from the vector database
     try:
         VECTOR_DB_CLIENT.delete(
             collection_name=knowledge.id, filter={"file_id": form_data.file_id}
-        )
+        )  # Remove by file_id first
+
+        VECTOR_DB_CLIENT.delete(
+            collection_name=knowledge.id, filter={"hash": file.hash}
+        )  # Remove by hash as well in case of duplicates
     except Exception as e:
         log.debug("This was most likely caused by bypassing embedding processing")
         log.debug(e)
@@ -632,32 +586,10 @@ def remove_file_from_knowledge_by_id(
     Files.delete_file_by_id(form_data.file_id)

     if knowledge:
-        data = knowledge.data or {}
-        file_ids = data.get("file_ids", [])
-
-        if form_data.file_id in file_ids:
-            file_ids.remove(form_data.file_id)
-            data["file_ids"] = file_ids
-
-            knowledge = Knowledges.update_knowledge_data_by_id(id=id, data=data)
-
-            if knowledge:
-                files = Files.get_file_metadatas_by_ids(file_ids)
-
-                return KnowledgeFilesResponse(
-                    **knowledge.model_dump(),
-                    files=files,
-                )
-            else:
-                raise HTTPException(
-                    status_code=status.HTTP_400_BAD_REQUEST,
-                    detail=ERROR_MESSAGES.DEFAULT("knowledge"),
-                )
-        else:
-            raise HTTPException(
-                status_code=status.HTTP_400_BAD_REQUEST,
-                detail=ERROR_MESSAGES.DEFAULT("file_id"),
-            )
+        return KnowledgeFilesResponse(
+            **knowledge.model_dump(),
+            files=Knowledges.get_file_metadatas_by_id(knowledge.id),
+        )
     else:
         raise HTTPException(
             status_code=status.HTTP_400_BAD_REQUEST,
@@ -758,8 +690,7 @@ async def reset_knowledge_by_id(id: str, user=Depends(get_verified_user)):
             log.debug(e)
             pass

-    knowledge = Knowledges.update_knowledge_data_by_id(id=id, data={"file_ids": []})
+    knowledge = Knowledges.reset_knowledge_by_id(id=id)

     return knowledge


@@ -769,7 +700,7 @@ async def reset_knowledge_by_id(id: str, user=Depends(get_verified_user)):


 @router.post("/{id}/files/batch/add", response_model=Optional[KnowledgeFilesResponse])
-def add_files_to_knowledge_batch(
+async def add_files_to_knowledge_batch(
     request: Request,
     id: str,
     form_data: list[KnowledgeFileIdForm],
@@ -809,7 +740,7 @@ def add_files_to_knowledge_batch(

     # Process files
     try:
-        result = process_files_batch(
+        result = await process_files_batch(
             request=request,
             form_data=BatchProcessFilesForm(files=files, collection_name=id),
             user=user,
@@ -820,25 +751,19 @@ def add_files_to_knowledge_batch(
         )
         raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))

-    # Add successful files to knowledge base
-    data = knowledge.data or {}
-    existing_file_ids = data.get("file_ids", [])
-
     # Only add files that were successfully processed
     successful_file_ids = [r.file_id for r in result.results if r.status == "completed"]
     for file_id in successful_file_ids:
-        if file_id not in existing_file_ids:
-            existing_file_ids.append(file_id)
-
-    data["file_ids"] = existing_file_ids
-    knowledge = Knowledges.update_knowledge_data_by_id(id=id, data=data)
+        Knowledges.add_file_to_knowledge_by_id(
+            knowledge_id=id, file_id=file_id, user_id=user.id
+        )

     # If there were any errors, include them in the response
     if result.errors:
         error_details = [f"{err.file_id}: {err.error}" for err in result.errors]
         return KnowledgeFilesResponse(
             **knowledge.model_dump(),
-            files=Files.get_file_metadatas_by_ids(existing_file_ids),
+            files=Knowledges.get_file_metadatas_by_id(knowledge.id),
             warnings={
                 "message": "Some files failed to process",
                 "errors": error_details,
@@ -847,5 +772,5 @@ def add_files_to_knowledge_batch(

     return KnowledgeFilesResponse(
         **knowledge.model_dump(),
-        files=Files.get_file_metadatas_by_ids(existing_file_ids),
+        files=Knowledges.get_file_metadatas_by_id(knowledge.id),
     )
@@ -5,6 +5,7 @@ import json
 import asyncio
 import logging

+from open_webui.models.groups import Groups
 from open_webui.models.models import (
     ModelForm,
     ModelModel,
@@ -78,6 +79,10 @@ async def get_models(
         filter["direction"] = direction

     if not user.role == "admin" or not BYPASS_ADMIN_ACCESS_CONTROL:
+        groups = Groups.get_groups_by_member_id(user.id)
+        if groups:
+            filter["group_ids"] = [group.id for group in groups]
+
         filter["user_id"] = user.id

     return Models.search_models(user.id, filter=filter, skip=skip, limit=limit)
@@ -8,11 +8,21 @@ from pydantic import BaseModel

 from open_webui.socket.main import sio

+from open_webui.models.groups import Groups
 from open_webui.models.users import Users, UserResponse
-from open_webui.models.notes import Notes, NoteModel, NoteForm, NoteUserResponse
+from open_webui.models.notes import (
+    NoteListResponse,
+    Notes,
+    NoteModel,
+    NoteForm,
+    NoteUserResponse,
+)

-from open_webui.config import ENABLE_ADMIN_CHAT_ACCESS, ENABLE_ADMIN_EXPORT
+from open_webui.config import (
+    BYPASS_ADMIN_ACCESS_CONTROL,
+    ENABLE_ADMIN_CHAT_ACCESS,
+    ENABLE_ADMIN_EXPORT,
+)
 from open_webui.constants import ERROR_MESSAGES
 from open_webui.env import SRC_LOG_LEVELS
@@ -30,39 +40,17 @@ router = APIRouter()
 ############################


-@router.get("/", response_model=list[NoteUserResponse])
-async def get_notes(request: Request, user=Depends(get_verified_user)):
-
-    if user.role != "admin" and not has_permission(
-        user.id, "features.notes", request.app.state.config.USER_PERMISSIONS
-    ):
-        raise HTTPException(
-            status_code=status.HTTP_401_UNAUTHORIZED,
-            detail=ERROR_MESSAGES.UNAUTHORIZED,
-        )
-
-    notes = [
-        NoteUserResponse(
-            **{
-                **note.model_dump(),
-                "user": UserResponse(**Users.get_user_by_id(note.user_id).model_dump()),
-            }
-        )
-        for note in Notes.get_notes_by_permission(user.id, "write")
-    ]
-
-    return notes
-
-
-class NoteTitleIdResponse(BaseModel):
+class NoteItemResponse(BaseModel):
     id: str
     title: str
+    data: Optional[dict]
     updated_at: int
     created_at: int
+    user: Optional[UserResponse] = None


-@router.get("/list", response_model=list[NoteTitleIdResponse])
-async def get_note_list(
+@router.get("/", response_model=list[NoteItemResponse])
+async def get_notes(
     request: Request, page: Optional[int] = None, user=Depends(get_verified_user)
 ):
     if user.role != "admin" and not has_permission(
@@ -80,15 +68,64 @@ async def get_note_list(
     skip = (page - 1) * limit

     notes = [
-        NoteTitleIdResponse(**note.model_dump())
-        for note in Notes.get_notes_by_permission(
-            user.id, "write", skip=skip, limit=limit
+        NoteUserResponse(
+            **{
+                **note.model_dump(),
+                "user": UserResponse(**Users.get_user_by_id(note.user_id).model_dump()),
+            }
         )
+        for note in Notes.get_notes_by_user_id(user.id, "read", skip=skip, limit=limit)
     ]

     return notes


+@router.get("/search", response_model=NoteListResponse)
+async def search_notes(
+    request: Request,
+    query: Optional[str] = None,
+    view_option: Optional[str] = None,
+    permission: Optional[str] = None,
+    order_by: Optional[str] = None,
+    direction: Optional[str] = None,
+    page: Optional[int] = 1,
+    user=Depends(get_verified_user),
+):
+    if user.role != "admin" and not has_permission(
+        user.id, "features.notes", request.app.state.config.USER_PERMISSIONS
+    ):
+        raise HTTPException(
+            status_code=status.HTTP_401_UNAUTHORIZED,
+            detail=ERROR_MESSAGES.UNAUTHORIZED,
+        )
+
+    limit = None
+    skip = None
+    if page is not None:
+        limit = 60
+        skip = (page - 1) * limit
+
+    filter = {}
+    if query:
+        filter["query"] = query
+    if view_option:
+        filter["view_option"] = view_option
+    if permission:
+        filter["permission"] = permission
+    if order_by:
+        filter["order_by"] = order_by
+    if direction:
+        filter["direction"] = direction
+
+    if not user.role == "admin" or not BYPASS_ADMIN_ACCESS_CONTROL:
+        groups = Groups.get_groups_by_member_id(user.id)
+        if groups:
+            filter["group_ids"] = [group.id for group in groups]
+
+        filter["user_id"] = user.id
+
+    return Notes.search_notes(user.id, filter, skip=skip, limit=limit)
+
+
 ############################
 # CreateNewNote
 ############################
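
Note: `/search` exposes the same filter-dict pattern as the models and knowledge routers. A hypothetical client call for reference (the `/api/v1/notes/search` path is assumed from the router; response shape per `NoteListResponse`):

```python
import requests

# Hypothetical query against a local instance with a standard Bearer API key.
resp = requests.get(
    "http://localhost:8080/api/v1/notes/search",
    params={"query": "meeting", "order_by": "updated_at", "direction": "desc", "page": 1},
    headers={"Authorization": "Bearer sk-..."},
)
resp.raise_for_status()
print(resp.json())  # NoteListResponse payload
```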
@@ -98,7 +135,6 @@ async def get_note_list(
 async def create_new_note(
     request: Request, form_data: NoteForm, user=Depends(get_verified_user)
 ):
-
     if user.role != "admin" and not has_permission(
         user.id, "features.notes", request.app.state.config.USER_PERMISSIONS
     ):
@@ -122,7 +158,11 @@ async def create_new_note(
 ############################


-@router.get("/{id}", response_model=Optional[NoteModel])
+class NoteResponse(NoteModel):
+    write_access: bool = False
+
+
+@router.get("/{id}", response_model=Optional[NoteResponse])
 async def get_note_by_id(request: Request, id: str, user=Depends(get_verified_user)):
     if user.role != "admin" and not has_permission(
         user.id, "features.notes", request.app.state.config.USER_PERMISSIONS
@@ -146,7 +186,15 @@ async def get_note_by_id(request: Request, id: str, user=Depends(get_verified_user)):
             status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
         )

-    return note
+    write_access = (
+        user.role == "admin"
+        or (user.id == note.user_id)
+        or has_access(
+            user.id, type="write", access_control=note.access_control, strict=False
+        )
+    )
+
+    return NoteResponse(**note.model_dump(), write_access=write_access)


 ############################
@@ -879,6 +879,7 @@ async def delete_model(
     url = request.app.state.config.OLLAMA_BASE_URLS[url_idx]
     key = get_api_key(url_idx, url, request.app.state.config.OLLAMA_API_CONFIGS)

+    r = None
     try:
         headers = {
             "Content-Type": "application/json",
@@ -892,7 +893,7 @@ async def delete_model(
             method="DELETE",
             url=f"{url}/api/delete",
             headers=headers,
-            data=form_data.model_dump_json(exclude_none=True).encode(),
+            json=form_data,
         )
         r.raise_for_status()

@@ -949,10 +950,7 @@ async def show_model_info(
     headers = include_user_info_headers(headers, user)

     r = requests.request(
-        method="POST",
-        url=f"{url}/api/show",
-        headers=headers,
-        data=form_data.model_dump_json(exclude_none=True).encode(),
+        method="POST", url=f"{url}/api/show", headers=headers, json=form_data
     )
     r.raise_for_status()
||||||
|
|
@ -123,7 +123,7 @@ log.setLevel(SRC_LOG_LEVELS["RAG"])
|
||||||
def get_ef(
|
def get_ef(
|
||||||
engine: str,
|
engine: str,
|
||||||
embedding_model: str,
|
embedding_model: str,
|
||||||
auto_update: bool = False,
|
auto_update: bool = RAG_EMBEDDING_MODEL_AUTO_UPDATE,
|
||||||
):
|
):
|
||||||
ef = None
|
ef = None
|
||||||
if embedding_model and engine == "":
|
if embedding_model and engine == "":
|
||||||
|
|
@ -148,7 +148,7 @@ def get_rf(
|
||||||
reranking_model: Optional[str] = None,
|
reranking_model: Optional[str] = None,
|
||||||
external_reranker_url: str = "",
|
external_reranker_url: str = "",
|
||||||
external_reranker_api_key: str = "",
|
external_reranker_api_key: str = "",
|
||||||
auto_update: bool = False,
|
auto_update: bool = RAG_RERANKING_MODEL_AUTO_UPDATE,
|
||||||
):
|
):
|
||||||
rf = None
|
rf = None
|
||||||
if reranking_model:
|
if reranking_model:
|
||||||
|
|
@ -468,6 +468,7 @@ async def get_rag_config(request: Request, user=Depends(get_admin_user)):
|
||||||
"DOCLING_PARAMS": request.app.state.config.DOCLING_PARAMS,
|
"DOCLING_PARAMS": request.app.state.config.DOCLING_PARAMS,
|
||||||
"DOCUMENT_INTELLIGENCE_ENDPOINT": request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT,
|
"DOCUMENT_INTELLIGENCE_ENDPOINT": request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT,
|
||||||
"DOCUMENT_INTELLIGENCE_KEY": request.app.state.config.DOCUMENT_INTELLIGENCE_KEY,
|
"DOCUMENT_INTELLIGENCE_KEY": request.app.state.config.DOCUMENT_INTELLIGENCE_KEY,
|
||||||
|
"DOCUMENT_INTELLIGENCE_MODEL": request.app.state.config.DOCUMENT_INTELLIGENCE_MODEL,
|
||||||
"MISTRAL_OCR_API_BASE_URL": request.app.state.config.MISTRAL_OCR_API_BASE_URL,
|
"MISTRAL_OCR_API_BASE_URL": request.app.state.config.MISTRAL_OCR_API_BASE_URL,
|
||||||
"MISTRAL_OCR_API_KEY": request.app.state.config.MISTRAL_OCR_API_KEY,
|
"MISTRAL_OCR_API_KEY": request.app.state.config.MISTRAL_OCR_API_KEY,
|
||||||
# MinerU settings
|
# MinerU settings
|
||||||
|
|
@ -535,6 +536,7 @@ async def get_rag_config(request: Request, user=Depends(get_admin_user)):
|
||||||
"SOUGOU_API_SID": request.app.state.config.SOUGOU_API_SID,
|
"SOUGOU_API_SID": request.app.state.config.SOUGOU_API_SID,
|
||||||
"SOUGOU_API_SK": request.app.state.config.SOUGOU_API_SK,
|
"SOUGOU_API_SK": request.app.state.config.SOUGOU_API_SK,
|
||||||
"WEB_LOADER_ENGINE": request.app.state.config.WEB_LOADER_ENGINE,
|
"WEB_LOADER_ENGINE": request.app.state.config.WEB_LOADER_ENGINE,
|
||||||
|
"WEB_LOADER_TIMEOUT": request.app.state.config.WEB_LOADER_TIMEOUT,
|
||||||
"ENABLE_WEB_LOADER_SSL_VERIFICATION": request.app.state.config.ENABLE_WEB_LOADER_SSL_VERIFICATION,
|
"ENABLE_WEB_LOADER_SSL_VERIFICATION": request.app.state.config.ENABLE_WEB_LOADER_SSL_VERIFICATION,
|
||||||
"PLAYWRIGHT_WS_URL": request.app.state.config.PLAYWRIGHT_WS_URL,
|
"PLAYWRIGHT_WS_URL": request.app.state.config.PLAYWRIGHT_WS_URL,
|
||||||
"PLAYWRIGHT_TIMEOUT": request.app.state.config.PLAYWRIGHT_TIMEOUT,
|
"PLAYWRIGHT_TIMEOUT": request.app.state.config.PLAYWRIGHT_TIMEOUT,
|
||||||
|
|
@ -593,6 +595,7 @@ class WebConfig(BaseModel):
|
||||||
SOUGOU_API_SID: Optional[str] = None
|
SOUGOU_API_SID: Optional[str] = None
|
||||||
SOUGOU_API_SK: Optional[str] = None
|
SOUGOU_API_SK: Optional[str] = None
|
||||||
WEB_LOADER_ENGINE: Optional[str] = None
|
WEB_LOADER_ENGINE: Optional[str] = None
|
||||||
|
WEB_LOADER_TIMEOUT: Optional[str] = None
|
||||||
ENABLE_WEB_LOADER_SSL_VERIFICATION: Optional[bool] = None
|
ENABLE_WEB_LOADER_SSL_VERIFICATION: Optional[bool] = None
|
||||||
PLAYWRIGHT_WS_URL: Optional[str] = None
|
PLAYWRIGHT_WS_URL: Optional[str] = None
|
||||||
PLAYWRIGHT_TIMEOUT: Optional[int] = None
|
PLAYWRIGHT_TIMEOUT: Optional[int] = None
|
||||||
|
|
@ -647,6 +650,7 @@ class ConfigForm(BaseModel):
|
||||||
DOCLING_PARAMS: Optional[dict] = None
|
DOCLING_PARAMS: Optional[dict] = None
|
||||||
DOCUMENT_INTELLIGENCE_ENDPOINT: Optional[str] = None
|
DOCUMENT_INTELLIGENCE_ENDPOINT: Optional[str] = None
|
||||||
DOCUMENT_INTELLIGENCE_KEY: Optional[str] = None
|
DOCUMENT_INTELLIGENCE_KEY: Optional[str] = None
|
||||||
|
DOCUMENT_INTELLIGENCE_MODEL: Optional[str] = None
|
||||||
MISTRAL_OCR_API_BASE_URL: Optional[str] = None
|
MISTRAL_OCR_API_BASE_URL: Optional[str] = None
|
||||||
MISTRAL_OCR_API_KEY: Optional[str] = None
|
MISTRAL_OCR_API_KEY: Optional[str] = None
|
||||||
|
|
||||||
|
|
@ -842,6 +846,11 @@ async def update_rag_config(
|
||||||
if form_data.DOCUMENT_INTELLIGENCE_KEY is not None
|
if form_data.DOCUMENT_INTELLIGENCE_KEY is not None
|
||||||
else request.app.state.config.DOCUMENT_INTELLIGENCE_KEY
|
else request.app.state.config.DOCUMENT_INTELLIGENCE_KEY
|
||||||
)
|
)
|
||||||
|
request.app.state.config.DOCUMENT_INTELLIGENCE_MODEL = (
|
||||||
|
form_data.DOCUMENT_INTELLIGENCE_MODEL
|
||||||
|
if form_data.DOCUMENT_INTELLIGENCE_MODEL is not None
|
||||||
|
else request.app.state.config.DOCUMENT_INTELLIGENCE_MODEL
|
||||||
|
)
|
||||||
|
|
||||||
request.app.state.config.MISTRAL_OCR_API_BASE_URL = (
|
request.app.state.config.MISTRAL_OCR_API_BASE_URL = (
|
||||||
form_data.MISTRAL_OCR_API_BASE_URL
|
form_data.MISTRAL_OCR_API_BASE_URL
|
||||||
|
|
@ -927,7 +936,6 @@ async def update_rag_config(
|
||||||
request.app.state.config.RAG_RERANKING_MODEL,
|
request.app.state.config.RAG_RERANKING_MODEL,
|
||||||
request.app.state.config.RAG_EXTERNAL_RERANKER_URL,
|
request.app.state.config.RAG_EXTERNAL_RERANKER_URL,
|
||||||
request.app.state.config.RAG_EXTERNAL_RERANKER_API_KEY,
|
request.app.state.config.RAG_EXTERNAL_RERANKER_API_KEY,
|
||||||
True,
|
|
||||||
)
|
)
|
||||||
|
|
||||||
request.app.state.RERANKING_FUNCTION = get_reranking_function(
|
request.app.state.RERANKING_FUNCTION = get_reranking_function(
|
||||||
|
|
@ -1065,6 +1073,8 @@ async def update_rag_config(
|
||||||
|
|
||||||
# Web loader settings
|
# Web loader settings
|
||||||
request.app.state.config.WEB_LOADER_ENGINE = form_data.web.WEB_LOADER_ENGINE
|
request.app.state.config.WEB_LOADER_ENGINE = form_data.web.WEB_LOADER_ENGINE
|
||||||
|
request.app.state.config.WEB_LOADER_TIMEOUT = form_data.web.WEB_LOADER_TIMEOUT
|
||||||
|
|
||||||
request.app.state.config.ENABLE_WEB_LOADER_SSL_VERIFICATION = (
|
request.app.state.config.ENABLE_WEB_LOADER_SSL_VERIFICATION = (
|
||||||
form_data.web.ENABLE_WEB_LOADER_SSL_VERIFICATION
|
form_data.web.ENABLE_WEB_LOADER_SSL_VERIFICATION
|
||||||
)
|
)
|
||||||
|
|
@ -1132,6 +1142,7 @@ async def update_rag_config(
|
||||||
"DOCLING_PARAMS": request.app.state.config.DOCLING_PARAMS,
|
"DOCLING_PARAMS": request.app.state.config.DOCLING_PARAMS,
|
||||||
"DOCUMENT_INTELLIGENCE_ENDPOINT": request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT,
|
"DOCUMENT_INTELLIGENCE_ENDPOINT": request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT,
|
||||||
"DOCUMENT_INTELLIGENCE_KEY": request.app.state.config.DOCUMENT_INTELLIGENCE_KEY,
|
"DOCUMENT_INTELLIGENCE_KEY": request.app.state.config.DOCUMENT_INTELLIGENCE_KEY,
|
||||||
|
"DOCUMENT_INTELLIGENCE_MODEL": request.app.state.config.DOCUMENT_INTELLIGENCE_MODEL,
|
||||||
"MISTRAL_OCR_API_BASE_URL": request.app.state.config.MISTRAL_OCR_API_BASE_URL,
|
"MISTRAL_OCR_API_BASE_URL": request.app.state.config.MISTRAL_OCR_API_BASE_URL,
|
||||||
"MISTRAL_OCR_API_KEY": request.app.state.config.MISTRAL_OCR_API_KEY,
|
"MISTRAL_OCR_API_KEY": request.app.state.config.MISTRAL_OCR_API_KEY,
|
||||||
# MinerU settings
|
# MinerU settings
|
||||||
|
|
@ -1199,6 +1210,7 @@ async def update_rag_config(
|
||||||
"SOUGOU_API_SID": request.app.state.config.SOUGOU_API_SID,
|
"SOUGOU_API_SID": request.app.state.config.SOUGOU_API_SID,
|
||||||
"SOUGOU_API_SK": request.app.state.config.SOUGOU_API_SK,
|
"SOUGOU_API_SK": request.app.state.config.SOUGOU_API_SK,
|
||||||
"WEB_LOADER_ENGINE": request.app.state.config.WEB_LOADER_ENGINE,
|
"WEB_LOADER_ENGINE": request.app.state.config.WEB_LOADER_ENGINE,
|
||||||
|
"WEB_LOADER_TIMEOUT": request.app.state.config.WEB_LOADER_TIMEOUT,
|
||||||
"ENABLE_WEB_LOADER_SSL_VERIFICATION": request.app.state.config.ENABLE_WEB_LOADER_SSL_VERIFICATION,
|
"ENABLE_WEB_LOADER_SSL_VERIFICATION": request.app.state.config.ENABLE_WEB_LOADER_SSL_VERIFICATION,
|
||||||
"PLAYWRIGHT_WS_URL": request.app.state.config.PLAYWRIGHT_WS_URL,
|
"PLAYWRIGHT_WS_URL": request.app.state.config.PLAYWRIGHT_WS_URL,
|
||||||
"PLAYWRIGHT_TIMEOUT": request.app.state.config.PLAYWRIGHT_TIMEOUT,
|
"PLAYWRIGHT_TIMEOUT": request.app.state.config.PLAYWRIGHT_TIMEOUT,
|
||||||
|
|
@ -1249,7 +1261,7 @@ def save_docs_to_vector_db(
|
||||||
|
|
||||||
return ", ".join(docs_info)
|
return ", ".join(docs_info)
|
||||||
|
|
||||||
log.info(
|
log.debug(
|
||||||
f"save_docs_to_vector_db: document {_get_docs_info(docs)} {collection_name}"
|
f"save_docs_to_vector_db: document {_get_docs_info(docs)} {collection_name}"
|
||||||
)
|
)
|
||||||
|
|
||||||
|
|
@ -1394,6 +1406,7 @@ def save_docs_to_vector_db(
|
||||||
if request.app.state.config.RAG_EMBEDDING_ENGINE == "azure_openai"
|
if request.app.state.config.RAG_EMBEDDING_ENGINE == "azure_openai"
|
||||||
else None
|
else None
|
||||||
),
|
),
|
||||||
|
enable_async=request.app.state.config.ENABLE_ASYNC_EMBEDDING,
|
||||||
)
|
)
|
||||||
|
|
||||||
# Run async embedding in sync context
|
# Run async embedding in sync context
|
||||||
|
|
@ -1544,6 +1557,7 @@ def process_file(
|
||||||
PDF_EXTRACT_IMAGES=request.app.state.config.PDF_EXTRACT_IMAGES,
|
PDF_EXTRACT_IMAGES=request.app.state.config.PDF_EXTRACT_IMAGES,
|
||||||
DOCUMENT_INTELLIGENCE_ENDPOINT=request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT,
|
DOCUMENT_INTELLIGENCE_ENDPOINT=request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT,
|
||||||
DOCUMENT_INTELLIGENCE_KEY=request.app.state.config.DOCUMENT_INTELLIGENCE_KEY,
|
DOCUMENT_INTELLIGENCE_KEY=request.app.state.config.DOCUMENT_INTELLIGENCE_KEY,
|
||||||
|
DOCUMENT_INTELLIGENCE_MODEL=request.app.state.config.DOCUMENT_INTELLIGENCE_MODEL,
|
||||||
MISTRAL_OCR_API_BASE_URL=request.app.state.config.MISTRAL_OCR_API_BASE_URL,
|
MISTRAL_OCR_API_BASE_URL=request.app.state.config.MISTRAL_OCR_API_BASE_URL,
|
||||||
MISTRAL_OCR_API_KEY=request.app.state.config.MISTRAL_OCR_API_KEY,
|
MISTRAL_OCR_API_KEY=request.app.state.config.MISTRAL_OCR_API_KEY,
|
||||||
MINERU_API_MODE=request.app.state.config.MINERU_API_MODE,
|
MINERU_API_MODE=request.app.state.config.MINERU_API_MODE,
|
||||||
|
|
@ -1689,7 +1703,7 @@ async def process_text(
|
||||||
log.debug(f"text_content: {text_content}")
|
log.debug(f"text_content: {text_content}")
|
||||||
|
|
||||||
result = await run_in_threadpool(
|
result = await run_in_threadpool(
|
||||||
save_docs_to_vector_db, request, docs, collection_name, user
|
save_docs_to_vector_db, request, docs, collection_name, user=user
|
||||||
)
|
)
|
||||||
if result:
|
if result:
|
||||||
return {
|
return {
|
||||||
|
|
@ -1721,7 +1735,12 @@ async def process_web(
|
||||||
|
|
||||||
if not request.app.state.config.BYPASS_WEB_SEARCH_EMBEDDING_AND_RETRIEVAL:
|
if not request.app.state.config.BYPASS_WEB_SEARCH_EMBEDDING_AND_RETRIEVAL:
|
||||||
await run_in_threadpool(
|
await run_in_threadpool(
|
||||||
save_docs_to_vector_db, request, docs, collection_name, True, user
|
save_docs_to_vector_db,
|
||||||
|
request,
|
||||||
|
docs,
|
||||||
|
collection_name,
|
||||||
|
overwrite=True,
|
||||||
|
user=user,
|
||||||
)
|
)
|
||||||
else:
|
else:
|
||||||
collection_name = None
|
collection_name = None
|
||||||
|
|
@ -2464,7 +2483,12 @@ async def process_files_batch(
|
||||||
if all_docs:
|
if all_docs:
|
||||||
try:
|
try:
|
||||||
await run_in_threadpool(
|
await run_in_threadpool(
|
||||||
save_docs_to_vector_db, request, all_docs, collection_name, True, user
|
save_docs_to_vector_db,
|
||||||
|
request,
|
||||||
|
all_docs,
|
||||||
|
collection_name,
|
||||||
|
add=True,
|
||||||
|
user=user,
|
||||||
)
|
)
|
||||||
|
|
||||||
# Update all files with collection name
|
# Update all files with collection name
|
||||||
|
|
|
||||||
|
|
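The call sites above switch from positional to keyword arguments for save_docs_to_vector_db. A minimal sketch of why this matters, assuming a signature like the one the call sites imply (the parameter defaults here are assumptions, not taken from this commit):

def save_docs_to_vector_db(
    request, docs, collection_name, overwrite=False, add=False, user=None
):
    # Stub standing in for the real function; only the signature matters here.
    return True

# Positional style: True silently binds to `overwrite`, even when the caller
# meant `add` -- exactly the ambiguity the change above removes.
# save_docs_to_vector_db(request, docs, name, True, user)

# Keyword style: intent is explicit and survives future parameter reordering.
# save_docs_to_vector_db(request, docs, name, add=True, user=user)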
@@ -719,7 +719,7 @@ async def get_groups(
 ):
     """List SCIM Groups"""
     # Get all groups
-    groups_list = Groups.get_groups()
+    groups_list = Groups.get_all_groups()

     # Apply pagination
     total = len(groups_list)
@@ -6,7 +6,7 @@ import io

 from fastapi import APIRouter, Depends, HTTPException, Request, status
 from fastapi.responses import Response, StreamingResponse, FileResponse
-from pydantic import BaseModel
+from pydantic import BaseModel, ConfigDict


 from open_webui.models.auths import Auths

@@ -19,19 +19,14 @@ from open_webui.models.users import (
     UserGroupIdsModel,
     UserGroupIdsListResponse,
     UserInfoListResponse,
-    UserIdNameListResponse,
     UserRoleUpdateForm,
+    UserStatus,
     Users,
     UserSettings,
     UserUpdateForm,
 )


-from open_webui.socket.main import (
-    get_active_status_by_user_id,
-    get_active_user_ids,
-    get_user_active_status,
-)
 from open_webui.constants import ERROR_MESSAGES
 from open_webui.env import SRC_LOG_LEVELS, STATIC_DIR

@@ -51,23 +46,6 @@ log.setLevel(SRC_LOG_LEVELS["MODELS"])
 router = APIRouter()


-############################
-# GetActiveUsers
-############################
-
-
-@router.get("/active")
-async def get_active_users(
-    user=Depends(get_verified_user),
-):
-    """
-    Get a list of active users.
-    """
-    return {
-        "user_ids": get_active_user_ids(),
-    }
-
-
 ############################
 # GetUsers
 ############################

@@ -125,20 +103,31 @@ async def get_all_users(
     return Users.get_users()


-@router.get("/search", response_model=UserIdNameListResponse)
+@router.get("/search", response_model=UserInfoListResponse)
 async def search_users(
     query: Optional[str] = None,
+    order_by: Optional[str] = None,
+    direction: Optional[str] = None,
+    page: Optional[int] = 1,
     user=Depends(get_verified_user),
 ):
     limit = PAGE_ITEM_COUNT

-    page = 1  # Always return the first page for search
+    page = max(1, page)
     skip = (page - 1) * limit

     filter = {}
     if query:
         filter["query"] = query

+    filter = {}
+    if query:
+        filter["query"] = query
+    if order_by:
+        filter["order_by"] = order_by
+    if direction:
+        filter["direction"] = direction
+
     return Users.get_users(filter=filter, skip=skip, limit=limit)

@@ -219,11 +208,14 @@ class ChatPermissions(BaseModel):

 class FeaturesPermissions(BaseModel):
     api_keys: bool = False
+    notes: bool = True
+    channels: bool = True
+    folders: bool = True
     direct_tool_servers: bool = False

     web_search: bool = True
     image_generation: bool = True
     code_interpreter: bool = True
-    notes: bool = True


 class UserPermissions(BaseModel):

@@ -308,6 +300,43 @@ async def update_user_settings_by_session_user(
     )


+############################
+# GetUserStatusBySessionUser
+############################
+
+
+@router.get("/user/status")
+async def get_user_status_by_session_user(user=Depends(get_verified_user)):
+    user = Users.get_user_by_id(user.id)
+    if user:
+        return user
+    else:
+        raise HTTPException(
+            status_code=status.HTTP_400_BAD_REQUEST,
+            detail=ERROR_MESSAGES.USER_NOT_FOUND,
+        )
+
+
+############################
+# UpdateUserStatusBySessionUser
+############################
+
+
+@router.post("/user/status/update")
+async def update_user_status_by_session_user(
+    form_data: UserStatus, user=Depends(get_verified_user)
+):
+    user = Users.get_user_by_id(user.id)
+    if user:
+        user = Users.update_user_status_by_id(user.id, form_data)
+        return user
+    else:
+        raise HTTPException(
+            status_code=status.HTTP_400_BAD_REQUEST,
+            detail=ERROR_MESSAGES.USER_NOT_FOUND,
+        )
+
+
 ############################
 # GetUserInfoBySessionUser
 ############################

@@ -359,13 +388,16 @@ async def update_user_info_by_session_user(
 ############################


-class UserResponse(BaseModel):
+class UserActiveResponse(UserStatus):
     name: str
-    profile_image_url: str
-    active: Optional[bool] = None
+    profile_image_url: Optional[str] = None
+    groups: Optional[list] = []
+
+    is_active: bool
+    model_config = ConfigDict(extra="allow")


-@router.get("/{user_id}", response_model=UserResponse)
+@router.get("/{user_id}", response_model=UserActiveResponse)
 async def get_user_by_id(user_id: str, user=Depends(get_verified_user)):
     # Check if user_id is a shared chat
     # If it is, get the user_id from the chat

@@ -381,13 +413,13 @@ async def get_user_by_id(user_id: str, user=Depends(get_verified_user)):
     )

     user = Users.get_user_by_id(user_id)

     if user:
-        return UserResponse(
+        groups = Groups.get_groups_by_member_id(user_id)
+        return UserActiveResponse(
             **{
-                "name": user.name,
-                "profile_image_url": user.profile_image_url,
-                "active": get_active_status_by_user_id(user_id),
+                **user.model_dump(),
+                "groups": [{"id": group.id, "name": group.name} for group in groups],
+                "is_active": Users.is_user_active(user_id),
             }
         )
     else:

@@ -454,7 +486,7 @@ async def get_user_profile_image_by_id(user_id: str, user=Depends(get_verified_u
 @router.get("/{user_id}/active", response_model=dict)
 async def get_user_active_status_by_id(user_id: str, user=Depends(get_verified_user)):
     return {
-        "active": get_user_active_status(user_id),
+        "active": Users.is_user_active(user_id),
     }
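A sketch of calling the extended search endpoint above with the new paging and ordering parameters; the host, token, and the accepted order_by values are placeholders and assumptions, not defined by this diff:

import requests  # third-party HTTP client, assumed installed

resp = requests.get(
    "http://localhost:8080/api/v1/users/search",  # placeholder base URL
    params={"query": "alice", "order_by": "name", "direction": "asc", "page": 2},
    headers={"Authorization": "Bearer <token>"},  # placeholder credential
)
print(resp.json())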
@ -118,14 +118,16 @@ if WEBSOCKET_MANAGER == "redis":
|
||||||
redis_sentinels = get_sentinels_from_env(
|
redis_sentinels = get_sentinels_from_env(
|
||||||
WEBSOCKET_SENTINEL_HOSTS, WEBSOCKET_SENTINEL_PORT
|
WEBSOCKET_SENTINEL_HOSTS, WEBSOCKET_SENTINEL_PORT
|
||||||
)
|
)
|
||||||
SESSION_POOL = RedisDict(
|
|
||||||
f"{REDIS_KEY_PREFIX}:session_pool",
|
MODELS = RedisDict(
|
||||||
|
f"{REDIS_KEY_PREFIX}:models",
|
||||||
redis_url=WEBSOCKET_REDIS_URL,
|
redis_url=WEBSOCKET_REDIS_URL,
|
||||||
redis_sentinels=redis_sentinels,
|
redis_sentinels=redis_sentinels,
|
||||||
redis_cluster=WEBSOCKET_REDIS_CLUSTER,
|
redis_cluster=WEBSOCKET_REDIS_CLUSTER,
|
||||||
)
|
)
|
||||||
USER_POOL = RedisDict(
|
|
||||||
f"{REDIS_KEY_PREFIX}:user_pool",
|
SESSION_POOL = RedisDict(
|
||||||
|
f"{REDIS_KEY_PREFIX}:session_pool",
|
||||||
redis_url=WEBSOCKET_REDIS_URL,
|
redis_url=WEBSOCKET_REDIS_URL,
|
||||||
redis_sentinels=redis_sentinels,
|
redis_sentinels=redis_sentinels,
|
||||||
redis_cluster=WEBSOCKET_REDIS_CLUSTER,
|
redis_cluster=WEBSOCKET_REDIS_CLUSTER,
|
||||||
|
|
@ -148,8 +150,9 @@ if WEBSOCKET_MANAGER == "redis":
|
||||||
renew_func = clean_up_lock.renew_lock
|
renew_func = clean_up_lock.renew_lock
|
||||||
release_func = clean_up_lock.release_lock
|
release_func = clean_up_lock.release_lock
|
||||||
else:
|
else:
|
||||||
|
MODELS = {}
|
||||||
|
|
||||||
SESSION_POOL = {}
|
SESSION_POOL = {}
|
||||||
USER_POOL = {}
|
|
||||||
USAGE_POOL = {}
|
USAGE_POOL = {}
|
||||||
|
|
||||||
aquire_func = release_func = renew_func = lambda: True
|
aquire_func = release_func = renew_func = lambda: True
|
||||||
|
|
@ -225,16 +228,6 @@ def get_models_in_use():
|
||||||
return models_in_use
|
return models_in_use
|
||||||
|
|
||||||
|
|
||||||
def get_active_user_ids():
|
|
||||||
"""Get the list of active user IDs."""
|
|
||||||
return list(USER_POOL.keys())
|
|
||||||
|
|
||||||
|
|
||||||
def get_user_active_status(user_id):
|
|
||||||
"""Check if a user is currently active."""
|
|
||||||
return user_id in USER_POOL
|
|
||||||
|
|
||||||
|
|
||||||
def get_user_id_from_session_pool(sid):
|
def get_user_id_from_session_pool(sid):
|
||||||
user = SESSION_POOL.get(sid)
|
user = SESSION_POOL.get(sid)
|
||||||
if user:
|
if user:
|
||||||
|
|
@ -260,10 +253,36 @@ def get_user_ids_from_room(room):
|
||||||
return active_user_ids
|
return active_user_ids
|
||||||
|
|
||||||
|
|
||||||
def get_active_status_by_user_id(user_id):
|
async def emit_to_users(event: str, data: dict, user_ids: list[str]):
|
||||||
if user_id in USER_POOL:
|
"""
|
||||||
return True
|
Send a message to specific users using their user:{id} rooms.
|
||||||
return False
|
|
||||||
|
Args:
|
||||||
|
event (str): The event name to emit.
|
||||||
|
data (dict): The payload/data to send.
|
||||||
|
user_ids (list[str]): The target users' IDs.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
for user_id in user_ids:
|
||||||
|
await sio.emit(event, data, room=f"user:{user_id}")
|
||||||
|
except Exception as e:
|
||||||
|
log.debug(f"Failed to emit event {event} to users {user_ids}: {e}")
|
||||||
|
|
||||||
|
|
||||||
|
async def enter_room_for_users(room: str, user_ids: list[str]):
|
||||||
|
"""
|
||||||
|
Make all sessions of a user join a specific room.
|
||||||
|
Args:
|
||||||
|
room (str): The room to join.
|
||||||
|
user_ids (list[str]): The target user's IDs.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
for user_id in user_ids:
|
||||||
|
session_ids = get_session_ids_from_room(f"user:{user_id}")
|
||||||
|
for sid in session_ids:
|
||||||
|
await sio.enter_room(sid, room)
|
||||||
|
except Exception as e:
|
||||||
|
log.debug(f"Failed to make users {user_ids} join room {room}: {e}")
|
||||||
|
|
||||||
|
|
||||||
@sio.on("usage")
|
@sio.on("usage")
|
||||||
|
|
@ -293,11 +312,6 @@ async def connect(sid, environ, auth):
|
||||||
SESSION_POOL[sid] = user.model_dump(
|
SESSION_POOL[sid] = user.model_dump(
|
||||||
exclude=["date_of_birth", "bio", "gender"]
|
exclude=["date_of_birth", "bio", "gender"]
|
||||||
)
|
)
|
||||||
if user.id in USER_POOL:
|
|
||||||
USER_POOL[user.id] = USER_POOL[user.id] + [sid]
|
|
||||||
else:
|
|
||||||
USER_POOL[user.id] = [sid]
|
|
||||||
|
|
||||||
await sio.enter_room(sid, f"user:{user.id}")
|
await sio.enter_room(sid, f"user:{user.id}")
|
||||||
|
|
||||||
|
|
||||||
|
|
@ -316,21 +330,34 @@ async def user_join(sid, data):
|
||||||
if not user:
|
if not user:
|
||||||
return
|
return
|
||||||
|
|
||||||
SESSION_POOL[sid] = user.model_dump(exclude=["date_of_birth", "bio", "gender"])
|
SESSION_POOL[sid] = user.model_dump(
|
||||||
if user.id in USER_POOL:
|
exclude=[
|
||||||
USER_POOL[user.id] = USER_POOL[user.id] + [sid]
|
"profile_image_url",
|
||||||
else:
|
"profile_banner_image_url",
|
||||||
USER_POOL[user.id] = [sid]
|
"date_of_birth",
|
||||||
|
"bio",
|
||||||
|
"gender",
|
||||||
|
]
|
||||||
|
)
|
||||||
|
|
||||||
await sio.enter_room(sid, f"user:{user.id}")
|
await sio.enter_room(sid, f"user:{user.id}")
|
||||||
|
|
||||||
# Join all the channels
|
# Join all the channels
|
||||||
channels = Channels.get_channels_by_user_id(user.id)
|
channels = Channels.get_channels_by_user_id(user.id)
|
||||||
log.debug(f"{channels=}")
|
log.debug(f"{channels=}")
|
||||||
for channel in channels:
|
for channel in channels:
|
||||||
await sio.enter_room(sid, f"channel:{channel.id}")
|
await sio.enter_room(sid, f"channel:{channel.id}")
|
||||||
|
|
||||||
return {"id": user.id, "name": user.name}
|
return {"id": user.id, "name": user.name}
|
||||||
|
|
||||||
|
|
||||||
|
@sio.on("heartbeat")
|
||||||
|
async def heartbeat(sid, data):
|
||||||
|
user = SESSION_POOL.get(sid)
|
||||||
|
if user:
|
||||||
|
Users.update_last_active_by_id(user["id"])
|
||||||
|
|
||||||
|
|
||||||
@sio.on("join-channels")
|
@sio.on("join-channels")
|
||||||
async def join_channel(sid, data):
|
async def join_channel(sid, data):
|
||||||
auth = data["auth"] if "auth" in data else None
|
auth = data["auth"] if "auth" in data else None
|
||||||
|
|
@ -398,6 +425,11 @@ async def channel_events(sid, data):
|
||||||
event_data = data["data"]
|
event_data = data["data"]
|
||||||
event_type = event_data["type"]
|
event_type = event_data["type"]
|
||||||
|
|
||||||
|
user = SESSION_POOL.get(sid)
|
||||||
|
|
||||||
|
if not user:
|
||||||
|
return
|
||||||
|
|
||||||
if event_type == "typing":
|
if event_type == "typing":
|
||||||
await sio.emit(
|
await sio.emit(
|
||||||
"events:channel",
|
"events:channel",
|
||||||
|
|
@ -405,10 +437,12 @@ async def channel_events(sid, data):
|
||||||
"channel_id": data["channel_id"],
|
"channel_id": data["channel_id"],
|
||||||
"message_id": data.get("message_id", None),
|
"message_id": data.get("message_id", None),
|
||||||
"data": event_data,
|
"data": event_data,
|
||||||
"user": UserNameResponse(**SESSION_POOL[sid]).model_dump(),
|
"user": UserNameResponse(**user).model_dump(),
|
||||||
},
|
},
|
||||||
room=room,
|
room=room,
|
||||||
)
|
)
|
||||||
|
elif event_type == "last_read_at":
|
||||||
|
Channels.update_member_last_read_at(data["channel_id"], user["id"])
|
||||||
|
|
||||||
|
|
||||||
@sio.on("ydoc:document:join")
|
@sio.on("ydoc:document:join")
|
||||||
|
|
@ -652,13 +686,6 @@ async def disconnect(sid):
|
||||||
if sid in SESSION_POOL:
|
if sid in SESSION_POOL:
|
||||||
user = SESSION_POOL[sid]
|
user = SESSION_POOL[sid]
|
||||||
del SESSION_POOL[sid]
|
del SESSION_POOL[sid]
|
||||||
|
|
||||||
user_id = user["id"]
|
|
||||||
USER_POOL[user_id] = [_sid for _sid in USER_POOL[user_id] if _sid != sid]
|
|
||||||
|
|
||||||
if len(USER_POOL[user_id]) == 0:
|
|
||||||
del USER_POOL[user_id]
|
|
||||||
|
|
||||||
await YDOC_MANAGER.remove_user_from_all_documents(sid)
|
await YDOC_MANAGER.remove_user_from_all_documents(sid)
|
||||||
else:
|
else:
|
||||||
pass
|
pass
|
||||||
|
|
|
||||||
|
|
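A sketch of how the two new helpers above might be used together from other server code, for example to pull the members of a newly created group channel into its room and notify them (the event name, payload shape, and calling function are illustrative assumptions):

from open_webui.socket.main import emit_to_users, enter_room_for_users

async def announce_channel(channel_id: str, member_ids: list[str]):
    # Every active session of each member joins the channel room...
    await enter_room_for_users(f"channel:{channel_id}", member_ids)
    # ...and each member's personal user:{id} room receives the event.
    await emit_to_users(
        "events:channel",
        {"channel_id": channel_id, "data": {"type": "members:added"}},
        member_ids,
    )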
@@ -86,6 +86,15 @@ class RedisDict:
     def items(self):
         return [(k, json.loads(v)) for k, v in self.redis.hgetall(self.name).items()]

+    def set(self, mapping: dict):
+        pipe = self.redis.pipeline()
+
+        pipe.delete(self.name)
+        if mapping:
+            pipe.hset(self.name, mapping={k: json.dumps(v) for k, v in mapping.items()})
+
+        pipe.execute()
+
     def get(self, key, default=None):
         try:
             return self[key]
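The new RedisDict.set() replaces the whole hash in a single pipelined round trip (DEL followed by HSET) instead of writing keys one at a time. A usage sketch, assuming the sentinel and cluster constructor arguments are optional and using a placeholder URL:

d = RedisDict("open-webui:models", redis_url="redis://localhost:6379/0")
# Both commands are queued on one pipeline and sent together; with redis-py's
# default transactional pipeline, readers should not observe the hash in a
# half-written state between the delete and the rewrite.
d.set({"gpt-4": {"owned_by": "openai"}, "llama3": {"owned_by": "ollama"}})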
@@ -194,7 +194,7 @@ class AuditLoggingMiddleware:
         auth_header = request.headers.get("Authorization")

         try:
-            user = get_current_user(
+            user = await get_current_user(
                 request, None, None, get_http_authorization_cred(auth_header)
             )
             return user

@@ -235,7 +235,7 @@ async def invalidate_token(request, token):
         jti = decoded.get("jti")
         exp = decoded.get("exp")

-        if jti:
+        if jti and exp:
             ttl = exp - int(
                 datetime.now(UTC).timestamp()
             )  # Calculate time-to-live for the token

@@ -344,9 +344,7 @@ async def get_current_user(
         # Refresh the user's last active timestamp asynchronously
         # to prevent blocking the request
         if background_tasks:
-            background_tasks.add_task(
-                Users.update_user_last_active_by_id, user.id
-            )
+            background_tasks.add_task(Users.update_last_active_by_id, user.id)
         return user
     else:
         raise HTTPException(

@@ -397,8 +395,7 @@ def get_current_user_by_api_key(request, api_key: str):
         current_span.set_attribute("client.user.role", user.role)
         current_span.set_attribute("client.auth.type", "api_key")

-    Users.update_user_last_active_by_id(user.id)
+    Users.update_last_active_by_id(user.id)

     return user
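The `if jti and exp` guard above protects the TTL arithmetic from tokens that carry a jti but no exp claim. The computation it guards, in isolation (values are illustrative; datetime.UTC requires Python 3.11+):

from datetime import datetime, UTC

decoded = {"jti": "abc123", "exp": int(datetime.now(UTC).timestamp()) + 3600}

jti = decoded.get("jti")
exp = decoded.get("exp")
if jti and exp:
    # Remaining lifetime of the token, used as the denylist entry's TTL.
    ttl = exp - int(datetime.now(UTC).timestamp())
    print(ttl)  # roughly 3600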
@@ -10,7 +10,11 @@ from fastapi import (
     Request,
     UploadFile,
 )
+from typing import Optional
+from pathlib import Path

+from open_webui.storage.provider import Storage
+from open_webui.models.files import Files
 from open_webui.routers.files import upload_file_handler

 import mimetypes

@@ -113,3 +117,26 @@ def get_file_url_from_base64(request, base64_file_string, metadata, user):
     elif "data:audio/wav;base64" in base64_file_string:
         return get_audio_url_from_base64(request, base64_file_string, metadata, user)
     return None
+
+
+def get_image_base64_from_file_id(id: str) -> Optional[str]:
+    file = Files.get_file_by_id(id)
+    if not file:
+        return None
+
+    try:
+        file_path = Storage.get_file(file.path)
+        file_path = Path(file_path)
+
+        # Check if the file already exists in the cache
+        if file_path.is_file():
+            import base64
+
+            with open(file_path, "rb") as image_file:
+                encoded_string = base64.b64encode(image_file.read()).decode("utf-8")
+            content_type, _ = mimetypes.guess_type(file_path.name)
+            return f"data:{content_type};base64,{encoded_string}"
+        else:
+            return None
+    except Exception as e:
+        return None
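A sketch of consuming the new helper above; it returns a ready-to-embed data URL string or None. The import path is an assumption (the scraped diff does not show this file's name), and the file id is a placeholder:

from open_webui.utils.files import get_image_base64_from_file_id  # path assumed

data_url = get_image_base64_from_file_id("file-1234")  # placeholder id
if data_url:
    print(data_url[:40])  # e.g. "data:image/png;base64,iVBORw0KGgo..."

Note the helper swallows storage errors and returns None; callers that need to distinguish a missing file from a read failure would have to change that.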
backend/open_webui/utils/groups.py (new file, 24 lines)
@@ -0,0 +1,24 @@
+import logging
+from open_webui.models.groups import Groups
+
+log = logging.getLogger(__name__)
+
+
+def apply_default_group_assignment(
+    default_group_id: str,
+    user_id: str,
+) -> None:
+    """
+    Apply default group assignment to a user if default_group_id is provided.
+
+    Args:
+        default_group_id: ID of the default group to add the user to
+        user_id: ID of the user to add to the default group
+    """
+    if default_group_id:
+        try:
+            Groups.add_users_to_group(default_group_id, [user_id])
+        except Exception as e:
+            log.error(
+                f"Failed to add user {user_id} to default group {default_group_id}: {e}"
+            )
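A sketch of wiring the new helper into a signup path. DEFAULT_GROUP_ID is the config key this commit also reads in the OAuth callback further below; the surrounding function is illustrative:

from open_webui.utils.groups import apply_default_group_assignment

def on_user_created(request, user):
    # No-op when DEFAULT_GROUP_ID is empty; failures are logged, not raised.
    apply_default_group_assignment(
        request.app.state.config.DEFAULT_GROUP_ID,
        user.id,
    )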
@@ -32,7 +32,6 @@ from open_webui.models.users import Users
 from open_webui.socket.main import (
     get_event_call,
     get_event_emitter,
-    get_active_status_by_user_id,
 )
 from open_webui.routers.tasks import (
     generate_queries,

@@ -459,12 +458,6 @@ async def chat_completion_tools_handler(
                             }
                         )

-                    print(
-                        f"Tool {tool_function_name} result: {tool_result}",
-                        tool_result_files,
-                        tool_result_embeds,
-                    )
-
                     if tool_result:
                         tool = tools[tool_function_name]
                         tool_id = tool.get("tool_id", "")

@@ -492,12 +485,6 @@ async def chat_completion_tools_handler(
                             }
                         )

-                        # Citation is not enabled for this tool
-                        body["messages"] = add_or_update_user_message(
-                            f"\nTool `{tool_name}` Output: {tool_result}",
-                            body["messages"],
-                        )
-
                         if (
                             tools[tool_function_name]
                             .get("metadata", {})

@@ -729,17 +716,18 @@ async def chat_web_search_handler(
     return form_data


-def get_last_images(message_list):
+def get_images_from_messages(message_list):
     images = []

     for message in reversed(message_list):
-        images_flag = False
+        message_images = []
         for file in message.get("files", []):
             if file.get("type") == "image":
-                images.append(file.get("url"))
-                images_flag = True
+                message_images.append(file.get("url"))

-        if images_flag:
-            break
+        if message_images:
+            images.append(message_images)

     return images

@@ -773,23 +761,36 @@ async def chat_image_generation_handler(
     if not chat_id:
         return form_data

-    chat = Chats.get_chat_by_id_and_user_id(chat_id, user.id)
-
     __event_emitter__ = extra_params["__event_emitter__"]
-    await __event_emitter__(
-        {
-            "type": "status",
-            "data": {"description": "Creating image", "done": False},
-        }
-    )

-    messages_map = chat.chat.get("history", {}).get("messages", {})
-    message_id = chat.chat.get("history", {}).get("currentId")
-    message_list = get_message_list(messages_map, message_id)
+    if chat_id.startswith("local:"):
+        message_list = form_data.get("messages", [])
+    else:
+        chat = Chats.get_chat_by_id_and_user_id(chat_id, user.id)
+        await __event_emitter__(
+            {
+                "type": "status",
+                "data": {"description": "Creating image", "done": False},
+            }
+        )
+
+        messages_map = chat.chat.get("history", {}).get("messages", {})
+        message_id = chat.chat.get("history", {}).get("currentId")
+        message_list = get_message_list(messages_map, message_id)

     user_message = get_last_user_message(message_list)

     prompt = user_message
-    input_images = get_last_images(message_list)
+    message_images = get_images_from_messages(message_list)
+
+    # Limit to first 2 sets of images
+    # We may want to change this in the future to allow more images
+    input_images = []
+    for idx, images in enumerate(message_images):
+        if idx >= 2:
+            break
+        for image in images:
+            input_images.append(image)

     system_message_content = ""

@@ -845,7 +846,7 @@ async def chat_image_generation_handler(
             }
         )

-        system_message_content = f"<context>Image generation was attempted but failed. The system is currently unable to generate the image. Tell the user that an error occurred: {error_message}</context>"
+        system_message_content = f"<context>Image generation was attempted but failed. The system is currently unable to generate the image. Tell the user that the following error occurred: {error_message}</context>"

     else:
         # Create image(s)

@@ -908,7 +909,7 @@ async def chat_image_generation_handler(
             }
         )

-        system_message_content = "<context>The requested image has been created and is now being shown to the user. Let them know that it has been generated.</context>"
+        system_message_content = "<context>The requested image has been created by the system successfully and is now being shown to the user. Let the user know that the image they requested has been generated and is now shown in the chat.</context>"
     except Exception as e:
         log.debug(e)

@@ -929,7 +930,7 @@ async def chat_image_generation_handler(
             }
         )

-        system_message_content = f"<context>Image generation was attempted but failed. The system is currently unable to generate the image. Tell the user that an error occurred: {error_message}</context>"
+        system_message_content = f"<context>Image generation was attempted but failed because of an error. The system is currently unable to generate the image. Tell the user that the following error occurred: {error_message}</context>"

     if system_message_content:
         form_data["messages"] = add_or_update_system_message(

@@ -1409,11 +1410,12 @@ async def process_chat_payload(request, form_data, user, metadata, model):
                     headers=headers if headers else None,
                 )

-                function_name_filter_list = (
-                    mcp_server_connection.get("config", {})
-                    .get("function_name_filter_list", "")
-                    .split(",")
-                )
+                function_name_filter_list = mcp_server_connection.get(
+                    "config", {}
+                ).get("function_name_filter_list", "")
+                if isinstance(function_name_filter_list, str):
+                    function_name_filter_list = function_name_filter_list.split(",")

                 tool_specs = await mcp_clients[server_id].list_tool_specs()
                 for tool_spec in tool_specs:
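The renamed helper now returns one list of URLs per image-bearing message, newest message first, instead of stopping at the most recent one. A self-contained run of the new logic as shown above:

def get_images_from_messages(message_list):
    images = []
    for message in reversed(message_list):
        message_images = []
        for file in message.get("files", []):
            if file.get("type") == "image":
                message_images.append(file.get("url"))
        if message_images:
            images.append(message_images)
    return images

messages = [
    {"files": [{"type": "image", "url": "a.png"}]},
    {"files": []},
    {"files": [{"type": "image", "url": "b.png"}, {"type": "image", "url": "c.png"}]},
]
print(get_images_from_messages(messages))  # [['b.png', 'c.png'], ['a.png']]

Downstream, chat_image_generation_handler flattens only the first two of these per-message groups into input_images, capping how much image context is forwarded.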
@@ -1914,7 +1916,7 @@ async def process_chat_response(
             )

             # Send a webhook notification if the user is not active
-            if not get_active_status_by_user_id(user.id):
+            if not Users.is_user_active(user.id):
                 webhook_url = Users.get_user_webhook_url_by_id(user.id)
                 if webhook_url:
                     await post_webhook(

@@ -3209,7 +3211,7 @@ async def process_chat_response(
             )

             # Send a webhook notification if the user is not active
-            if not get_active_status_by_user_id(user.id):
+            if not Users.is_user_active(user.id):
                 webhook_url = Users.get_user_webhook_url_by_id(user.id)
                 if webhook_url:
                     await post_webhook(
@@ -6,7 +6,7 @@ import uuid
 import logging
 from datetime import timedelta
 from pathlib import Path
-from typing import Callable, Optional
+from typing import Callable, Optional, Sequence, Union
 import json
 import aiohttp

@@ -43,26 +43,28 @@ def get_allow_block_lists(filter_list):
     return allow_list, block_list


-def is_string_allowed(string: str, filter_list: Optional[list[str]] = None) -> bool:
+def is_string_allowed(
+    string: Union[str, Sequence[str]], filter_list: Optional[list[str]] = None
+) -> bool:
     """
     Checks if a string is allowed based on the provided filter list.
-    :param string: The string to check (e.g., domain or hostname).
+    :param string: The string or sequence of strings to check (e.g., domain or hostname).
     :param filter_list: List of allowed/blocked strings. Strings starting with "!" are blocked.
-    :return: True if the string is allowed, False otherwise.
+    :return: True if the string or sequence of strings is allowed, False otherwise.
     """
     if not filter_list:
         return True

     allow_list, block_list = get_allow_block_lists(filter_list)
-    print(string, allow_list, block_list)
+    strings = [string] if isinstance(string, str) else list(string)

     # If allow list is non-empty, require domain to match one of them
     if allow_list:
-        if not any(string.endswith(allowed) for allowed in allow_list):
+        if not any(s.endswith(allowed) for s in strings for allowed in allow_list):
             return False

     # Block list always removes matches
-    if any(string.endswith(blocked) for blocked in block_list):
+    if any(s.endswith(blocked) for s in strings for blocked in block_list):
         return False

     return True

@@ -622,14 +624,17 @@ def stream_chunks_handler(stream: aiohttp.StreamReader):
                     yield line
                 else:
                     yield b"data: {}"
+                    yield b"\n"
             else:
                 # Normal mode: check if line exceeds limit
                 if len(line) > max_buffer_size:
                     skip_mode = True
                     yield b"data: {}"
+                    yield b"\n"
                     log.info(f"Skip mode triggered, line size: {len(line)}")
                 else:
                     yield line
+                    yield b"\n"

         # Save the last incomplete fragment
         buffer = lines[-1]

@@ -644,5 +649,6 @@ def stream_chunks_handler(stream: aiohttp.StreamReader):
         # Process remaining buffer data
         if buffer and not skip_mode:
             yield buffer
+            yield b"\n"

     return yield_safe_stream_chunks()
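With the widened signature, callers can pass one hostname or a sequence; a single allow-list match admits the group, while any block-list hit rejects it. Example calls against the function shown above (assumed importable together with its helper):

filter_list = ["example.com", "!blocked.example.com"]

is_string_allowed("api.example.com", filter_list)      # True
is_string_allowed("blocked.example.com", filter_list)  # False: block list wins
is_string_allowed(["a.example.com", "other.net"], filter_list)  # True: one match suffices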
@@ -6,6 +6,7 @@ import sys
 from aiocache import cached
 from fastapi import Request

+from open_webui.socket.utils import RedisDict
 from open_webui.routers import openai, ollama
 from open_webui.functions import get_function_models

@@ -190,6 +191,8 @@ async def get_all_models(request, refresh: bool = False, user: UserModel = None)
         ):
             # Custom model based on a base model
             owned_by = "openai"
+            connection_type = None

             pipe = None

             for m in models:

@@ -200,6 +203,8 @@ async def get_all_models(request, refresh: bool = False, user: UserModel = None)
                     owned_by = m.get("owned_by", "unknown")
                     if "pipe" in m:
                         pipe = m["pipe"]
+
+                    connection_type = m.get("connection_type", None)
                     break

             model = {

@@ -208,6 +213,7 @@ async def get_all_models(request, refresh: bool = False, user: UserModel = None)
                 "object": "model",
                 "created": custom_model.created_at,
                 "owned_by": owned_by,
+                "connection_type": connection_type,
                 "preset": True,
                 **({"pipe": pipe} if pipe is not None else {}),
             }

@@ -323,7 +329,12 @@ async def get_all_models(request, refresh: bool = False, user: UserModel = None)

     log.debug(f"get_all_models() returned {len(models)} models")

-    request.app.state.MODELS = {model["id"]: model for model in models}
+    models_dict = {model["id"]: model for model in models}
+    if isinstance(request.app.state.MODELS, RedisDict):
+        request.app.state.MODELS.set(models_dict)
+    else:
+        request.app.state.MODELS = models_dict

     return models
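With MODELS now optionally Redis-backed (see the socket changes earlier in this commit), the refresh path writes through RedisDict.set() so all worker processes observe the same model map, while the plain-dict branch preserves single-process behavior. A sketch of the sharing effect, with a placeholder URL and assuming the sentinel/cluster arguments are optional:

writer = RedisDict("open-webui:models", redis_url="redis://localhost:6379/0")
writer.set({"m1": {"owned_by": "openai"}})

# A second worker constructing the same-named RedisDict reads the same data.
reader = RedisDict("open-webui:models", redis_url="redis://localhost:6379/0")
print(reader.get("m1"))  # {'owned_by': 'openai'}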
@@ -43,6 +43,7 @@ from open_webui.config import (
     ENABLE_OAUTH_GROUP_CREATION,
     OAUTH_BLOCKED_GROUPS,
     OAUTH_GROUPS_SEPARATOR,
+    OAUTH_ROLES_SEPARATOR,
     OAUTH_ROLES_CLAIM,
     OAUTH_SUB_CLAIM,
     OAUTH_GROUPS_CLAIM,

@@ -54,6 +55,7 @@ from open_webui.config import (
     OAUTH_ALLOWED_DOMAINS,
     OAUTH_UPDATE_PICTURE_ON_LOGIN,
     OAUTH_ACCESS_TOKEN_REQUEST_INCLUDE_CLIENT_ID,
+    OAUTH_AUDIENCE,
     WEBHOOK_URL,
     JWT_EXPIRES_IN,
     AppConfig,

@@ -71,6 +73,7 @@ from open_webui.env import (
 from open_webui.utils.misc import parse_duration
 from open_webui.utils.auth import get_password_hash, create_token
 from open_webui.utils.webhook import post_webhook
+from open_webui.utils.groups import apply_default_group_assignment

 from mcp.shared.auth import (
     OAuthClientMetadata as MCPOAuthClientMetadata,

@@ -124,6 +127,7 @@ auth_manager_config.OAUTH_ALLOWED_DOMAINS = OAUTH_ALLOWED_DOMAINS
 auth_manager_config.WEBHOOK_URL = WEBHOOK_URL
 auth_manager_config.JWT_EXPIRES_IN = JWT_EXPIRES_IN
 auth_manager_config.OAUTH_UPDATE_PICTURE_ON_LOGIN = OAUTH_UPDATE_PICTURE_ON_LOGIN
+auth_manager_config.OAUTH_AUDIENCE = OAUTH_AUDIENCE


 FERNET = None

@@ -1032,7 +1036,13 @@ class OAuthManager:

         if isinstance(claim_data, list):
             oauth_roles = claim_data
-        if isinstance(claim_data, str) or isinstance(claim_data, int):
+        elif isinstance(claim_data, str):
+            # Split by the configured separator if present
+            if OAUTH_ROLES_SEPARATOR and OAUTH_ROLES_SEPARATOR in claim_data:
+                oauth_roles = claim_data.split(OAUTH_ROLES_SEPARATOR)
+            else:
+                oauth_roles = [claim_data]
+        elif isinstance(claim_data, int):
             oauth_roles = [str(claim_data)]

         log.debug(f"Oauth Roles claim: {oauth_claim}")

@@ -1095,7 +1105,7 @@ class OAuthManager:
             user_oauth_groups = []

         user_current_groups: list[GroupModel] = Groups.get_groups_by_member_id(user.id)
-        all_available_groups: list[GroupModel] = Groups.get_groups()
+        all_available_groups: list[GroupModel] = Groups.get_all_groups()

         # Create groups if they don't exist and creation is enabled
         if auth_manager_config.ENABLE_OAUTH_GROUP_CREATION:

@@ -1139,7 +1149,7 @@ class OAuthManager:

             # Refresh the list of all available groups if any were created
             if groups_created:
-                all_available_groups = Groups.get_groups()
+                all_available_groups = Groups.get_all_groups()
                 log.debug("Refreshed list of all available groups after creation.")

         log.debug(f"Oauth Groups claim: {oauth_claim}")

@@ -1160,7 +1170,6 @@ class OAuthManager:
                     log.debug(
                         f"Removing user from group {group_model.name} as it is no longer in their oauth groups"
                     )
-
                     Groups.remove_users_from_group(group_model.id, [user.id])

         # In case a group is created, but perms are never assigned to the group by hitting "save"

@@ -1263,7 +1272,12 @@ class OAuthManager:
         client = self.get_client(provider)
         if client is None:
             raise HTTPException(404)
-        return await client.authorize_redirect(request, redirect_uri)
+
+        kwargs = {}
+        if auth_manager_config.OAUTH_AUDIENCE:
+            kwargs["audience"] = auth_manager_config.OAUTH_AUDIENCE
+
+        return await client.authorize_redirect(request, redirect_uri, **kwargs)

     async def handle_callback(self, request, provider, response):
         if provider not in OAUTH_PROVIDERS:

@@ -1322,7 +1336,10 @@ class OAuthManager:
             log.warning(f"OAuth callback failed, sub is missing: {user_data}")
             raise HTTPException(400, detail=ERROR_MESSAGES.INVALID_CRED)

-        provider_sub = f"{provider}@{sub}"
+        oauth_data = {}
+        oauth_data[provider] = {
+            "sub": sub,
+        }

         # Email extraction
         email_claim = auth_manager_config.OAUTH_EMAIL_CLAIM

@@ -1369,12 +1386,12 @@ class OAuthManager:
                     log.warning(f"Error fetching GitHub email: {e}")
                     raise HTTPException(400, detail=ERROR_MESSAGES.INVALID_CRED)
             elif ENABLE_OAUTH_EMAIL_FALLBACK:
-                email = f"{provider_sub}.local"
+                email = f"{provider}@{sub}.local"
             else:
                 log.warning(f"OAuth callback failed, email is missing: {user_data}")
                 raise HTTPException(400, detail=ERROR_MESSAGES.INVALID_CRED)
-        email = email.lower()

+        email = email.lower()
         # If allowed domains are configured, check if the email domain is in the list
         if (
             "*" not in auth_manager_config.OAUTH_ALLOWED_DOMAINS

@@ -1387,7 +1404,7 @@ class OAuthManager:
             raise HTTPException(400, detail=ERROR_MESSAGES.INVALID_CRED)

         # Check if the user exists
-        user = Users.get_user_by_oauth_sub(provider_sub)
+        user = Users.get_user_by_oauth_sub(provider, sub)
         if not user:
             # If the user does not exist, check if merging is enabled
             if auth_manager_config.OAUTH_MERGE_ACCOUNTS_BY_EMAIL:

@@ -1395,12 +1412,15 @@ class OAuthManager:
                 user = Users.get_user_by_email(email)
                 if user:
                     # Update the user with the new oauth sub
-                    Users.update_user_oauth_sub_by_id(user.id, provider_sub)
+                    Users.update_user_oauth_by_id(user.id, provider, sub)

         if user:
             determined_role = self.get_user_role(user, user_data)
             if user.role != determined_role:
                 Users.update_user_role_by_id(user.id, determined_role)
+                # Update the user object in memory as well,
+                # to avoid problems with the ENABLE_OAUTH_GROUP_MANAGEMENT check below
+                user.role = determined_role
             # Update profile picture if enabled and different from current
             if auth_manager_config.OAUTH_UPDATE_PICTURE_ON_LOGIN:
                 picture_claim = auth_manager_config.OAUTH_PICTURE_CLAIM

@@ -1451,7 +1471,7 @@ class OAuthManager:
                 name=name,
                 profile_image_url=picture_url,
                 role=self.get_user_role(None, user_data),
-                oauth_sub=provider_sub,
+                oauth=oauth_data,
             )

             if auth_manager_config.WEBHOOK_URL:

@@ -1465,6 +1485,12 @@ class OAuthManager:
                         "user": user.model_dump_json(exclude_none=True),
                     },
                 )
+
+            apply_default_group_assignment(
+                request.app.state.config.DEFAULT_GROUP_ID,
+                user.id,
+            )
+
         else:
             raise HTTPException(
                 status.HTTP_403_FORBIDDEN,
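The new separator handling lets a roles claim delivered as one delimited string expand into a role list. The branch in isolation (the separator value is illustrative; in the code above it comes from configuration):

OAUTH_ROLES_SEPARATOR = ","  # illustrative

claim_data = "admin,user"
if isinstance(claim_data, list):
    oauth_roles = claim_data
elif isinstance(claim_data, str):
    if OAUTH_ROLES_SEPARATOR and OAUTH_ROLES_SEPARATOR in claim_data:
        oauth_roles = claim_data.split(OAUTH_ROLES_SEPARATOR)
    else:
        oauth_roles = [claim_data]
elif isinstance(claim_data, int):
    oauth_roles = [str(claim_data)]

print(oauth_roles)  # ['admin', 'user']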
backend/open_webui/utils/rate_limit.py (new file, +139)
@@ -0,0 +1,139 @@
+import time
+from typing import Optional, Dict
+from open_webui.env import REDIS_KEY_PREFIX
+
+
+class RateLimiter:
+    """
+    General-purpose rate limiter using Redis with a rolling window strategy.
+    Falls back to in-memory storage if Redis is not available.
+    """
+
+    # In-memory fallback storage
+    _memory_store: Dict[str, Dict[int, int]] = {}
+
+    def __init__(
+        self,
+        redis_client,
+        limit: int,
+        window: int,
+        bucket_size: int = 60,
+        enabled: bool = True,
+    ):
+        """
+        :param redis_client: Redis client instance or None
+        :param limit: Max allowed events in the window
+        :param window: Time window in seconds
+        :param bucket_size: Bucket resolution
+        :param enabled: Turn on/off rate limiting globally
+        """
+        self.r = redis_client
+        self.limit = limit
+        self.window = window
+        self.bucket_size = bucket_size
+        self.num_buckets = window // bucket_size
+        self.enabled = enabled
+
+    def _bucket_key(self, key: str, bucket_index: int) -> str:
+        return f"{REDIS_KEY_PREFIX}:ratelimit:{key.lower()}:{bucket_index}"
+
+    def _current_bucket(self) -> int:
+        return int(time.time()) // self.bucket_size
+
+    def _redis_available(self) -> bool:
+        return self.r is not None
+
+    def is_limited(self, key: str) -> bool:
+        """
+        Main rate-limit check.
+        Gracefully handles missing or failing Redis.
+        """
+        if not self.enabled:
+            return False
+
+        if self._redis_available():
+            try:
+                return self._is_limited_redis(key)
+            except Exception:
+                return self._is_limited_memory(key)
+        else:
+            return self._is_limited_memory(key)
+
+    def get_count(self, key: str) -> int:
+        if not self.enabled:
+            return 0
+
+        if self._redis_available():
+            try:
+                return self._get_count_redis(key)
+            except Exception:
+                return self._get_count_memory(key)
+        else:
+            return self._get_count_memory(key)
+
+    def remaining(self, key: str) -> int:
+        used = self.get_count(key)
+        return max(0, self.limit - used)
+
+    def _is_limited_redis(self, key: str) -> bool:
+        now_bucket = self._current_bucket()
+        bucket_key = self._bucket_key(key, now_bucket)
+
+        attempts = self.r.incr(bucket_key)
+        if attempts == 1:
+            self.r.expire(bucket_key, self.window + self.bucket_size)
+
+        # Collect buckets
+        buckets = [
+            self._bucket_key(key, now_bucket - i) for i in range(self.num_buckets + 1)
+        ]
+
+        counts = self.r.mget(buckets)
+        total = sum(int(c) for c in counts if c)
+
+        return total > self.limit
+
+    def _get_count_redis(self, key: str) -> int:
+        now_bucket = self._current_bucket()
+        buckets = [
+            self._bucket_key(key, now_bucket - i) for i in range(self.num_buckets + 1)
+        ]
+        counts = self.r.mget(buckets)
+        return sum(int(c) for c in counts if c)
+
+    def _is_limited_memory(self, key: str) -> bool:
+        now_bucket = self._current_bucket()
+
+        # Init storage
+        if key not in self._memory_store:
+            self._memory_store[key] = {}
+
+        store = self._memory_store[key]
+
+        # Increment bucket
+        store[now_bucket] = store.get(now_bucket, 0) + 1
+
+        # Drop expired buckets
+        min_bucket = now_bucket - self.num_buckets
+        expired = [b for b in store if b < min_bucket]
+        for b in expired:
+            del store[b]
+
+        # Count totals
+        total = sum(store.values())
+        return total > self.limit
+
+    def _get_count_memory(self, key: str) -> int:
+        now_bucket = self._current_bucket()
+        if key not in self._memory_store:
+            return 0
+
+        store = self._memory_store[key]
+        min_bucket = now_bucket - self.num_buckets
+
+        # Remove expired
+        expired = [b for b in store if b < min_bucket]
+        for b in expired:
+            del store[b]
+
+        return sum(store.values())
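A note on the rolling-window design above: each key gets per-`bucket_size` counter buckets, and `is_limited` increments the current bucket before summing the last `num_buckets + 1` buckets, so the limit applies over a sliding window rather than resetting on a fixed boundary. Below is a minimal usage sketch; the limit/window values and the email key are illustrative assumptions, not taken from this diff.

    # Sketch: per-email sign-in throttling with the RateLimiter above.
    # Passing redis_client=None exercises the in-memory fallback path.
    from open_webui.utils.rate_limit import RateLimiter

    signin_limiter = RateLimiter(
        redis_client=None,  # or a real Redis client for multi-worker deployments
        limit=15,           # max attempts per rolling window (illustrative)
        window=180,         # window length in seconds (illustrative)
        bucket_size=60,     # counter resolution in seconds
    )

    email = "user@example.com"
    if signin_limiter.is_limited(email):  # increments, then checks the window
        raise RuntimeError("Too many sign-in attempts, try again later")
    print(signin_limiter.remaining(email), "attempts left")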
@@ -5,7 +5,14 @@ import logging

 import redis

-from open_webui.env import REDIS_SENTINEL_MAX_RETRY_COUNT
+from open_webui.env import (
+    REDIS_CLUSTER,
+    REDIS_SOCKET_CONNECT_TIMEOUT,
+    REDIS_SENTINEL_HOSTS,
+    REDIS_SENTINEL_MAX_RETRY_COUNT,
+    REDIS_SENTINEL_PORT,
+    REDIS_URL,
+)

 log = logging.getLogger(__name__)

@@ -108,6 +115,21 @@ def parse_redis_service_url(redis_url):
     }

+
+def get_redis_client(async_mode=False):
+    try:
+        return get_redis_connection(
+            redis_url=REDIS_URL,
+            redis_sentinels=get_sentinels_from_env(
+                REDIS_SENTINEL_HOSTS, REDIS_SENTINEL_PORT
+            ),
+            redis_cluster=REDIS_CLUSTER,
+            async_mode=async_mode,
+        )
+    except Exception as e:
+        log.debug(f"Failed to get Redis client: {e}")
+        return None
+
+
 def get_redis_connection(
     redis_url,
     redis_sentinels,

@@ -141,6 +163,7 @@ def get_redis_connection(
         username=redis_config["username"],
         password=redis_config["password"],
         decode_responses=decode_responses,
+        socket_connect_timeout=REDIS_SOCKET_CONNECT_TIMEOUT,
     )
     connection = SentinelRedisProxy(
         sentinel,

@@ -167,6 +190,7 @@ def get_redis_connection(
         username=redis_config["username"],
         password=redis_config["password"],
         decode_responses=decode_responses,
+        socket_connect_timeout=REDIS_SOCKET_CONNECT_TIMEOUT,
     )
     connection = SentinelRedisProxy(
         sentinel,
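The `get_redis_client` helper above deliberately returns `None` instead of raising when Redis cannot be reached, which is what lets callers such as `RateLimiter` degrade to in-memory counting. A hedged sketch of how the two pieces compose (variable names are illustrative, not from this commit):

    # Sketch: wiring the Redis helper into the rate limiter. If Redis is
    # unavailable, get_redis_client() yields None and RateLimiter silently
    # falls back to its process-local store.
    from open_webui.utils.redis import get_redis_client
    from open_webui.utils.rate_limit import RateLimiter

    limiter = RateLimiter(
        redis_client=get_redis_client(),  # may be None; that's acceptable
        limit=15,
        window=180,
    )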
@@ -45,7 +45,6 @@ from open_webui.env import (
     OTEL_METRICS_OTLP_SPAN_EXPORTER,
     OTEL_METRICS_EXPORTER_OTLP_INSECURE,
 )
-from open_webui.socket.main import get_active_user_ids
 from open_webui.models.users import Users

 _EXPORT_INTERVAL_MILLIS = 10_000  # 10 seconds

@@ -135,7 +134,7 @@ def setup_metrics(app: FastAPI, resource: Resource) -> None:
     ) -> Sequence[metrics.Observation]:
         return [
             metrics.Observation(
-                value=len(get_active_user_ids()),
+                value=Users.get_active_user_count(),
             )
         ]
@@ -150,11 +150,12 @@ async def get_tools(
     )

     specs = tool_server_data.get("specs", [])
-    function_name_filter_list = (
-        tool_server_connection.get("config", {})
-        .get("function_name_filter_list", "")
-        .split(",")
-    )
+    function_name_filter_list = tool_server_connection.get(
+        "config", {}
+    ).get("function_name_filter_list", "")
+    if isinstance(function_name_filter_list, str):
+        function_name_filter_list = function_name_filter_list.split(",")
+
     for spec in specs:
         function_name = spec["name"]
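The rewritten lookup above makes `function_name_filter_list` tolerant of both shapes the config can take: the legacy comma-separated string and a plain list. A standalone sketch of that normalization follows; the strip/empty-filter step is an illustrative extra, the diff itself only splits.

    # Sketch: accept "a,b" (legacy string) or ["a", "b"] (list) equivalently.
    def normalize_filter_list(config: dict) -> list:
        value = config.get("function_name_filter_list", "")
        if isinstance(value, str):
            value = value.split(",")
        return [name.strip() for name in value if name.strip()]

    assert normalize_filter_list({"function_name_filter_list": "a,b"}) == ["a", "b"]
    assert normalize_filter_list({"function_name_filter_list": ["a", "b"]}) == ["a", "b"]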
@@ -1,13 +1,13 @@
 # Minimal requirements for backend to run
 # WIP: use this as a reference to build a minimal docker image

-fastapi==0.118.0
+fastapi==0.124.0
 uvicorn[standard]==0.37.0
-pydantic==2.11.9
+pydantic==2.12.5
 python-multipart==0.0.20
 itsdangerous==2.2.0

-python-socketio==5.14.0
+python-socketio==5.15.0
 python-jose==3.5.0
 cryptography
 bcrypt==5.0.0

@@ -16,36 +16,36 @@ PyJWT[crypto]==2.10.1
 authlib==1.6.5

 requests==2.32.5
-aiohttp==3.12.15
+aiohttp==3.13.2
 async-timeout
 aiocache
 aiofiles
-starlette-compress==1.6.0
+starlette-compress==1.6.1
 httpx[socks,http2,zstd,cli,brotli]==0.28.1
 starsessions[redis]==2.2.1

-sqlalchemy==2.0.38
-alembic==1.14.0
-peewee==3.18.1
-peewee-migrate==1.12.2
+sqlalchemy==2.0.44
+alembic==1.17.2
+peewee==3.18.3
+peewee-migrate==1.14.3

-pycrdt==0.12.25
+pycrdt==0.12.44
 redis

-APScheduler==3.10.4
-RestrictedPython==8.0
+APScheduler==3.11.1
+RestrictedPython==8.1

 loguru==0.7.3
-asgiref==3.8.1
+asgiref==3.11.0

-mcp==1.21.2
+mcp==1.23.1
 openai

 langchain==0.3.27
 langchain-community==0.3.29
 fake-useragent==2.2.0

-chromadb==1.1.0
-black==25.9.0
+chromadb==1.3.5
+black==25.12.0
 pydub
 chardet==5.2.0
@@ -1,10 +1,10 @@
-fastapi==0.118.0
+fastapi==0.124.0
 uvicorn[standard]==0.37.0
-pydantic==2.11.9
+pydantic==2.12.5
 python-multipart==0.0.20
 itsdangerous==2.2.0

-python-socketio==5.14.0
+python-socketio==5.15.0
 python-jose==3.5.0
 cryptography
 bcrypt==5.0.0

@@ -13,90 +13,90 @@ PyJWT[crypto]==2.10.1
 authlib==1.6.5

 requests==2.32.5
-aiohttp==3.12.15
+aiohttp==3.13.2
 async-timeout
 aiocache
 aiofiles
-starlette-compress==1.6.0
+starlette-compress==1.6.1
 httpx[socks,http2,zstd,cli,brotli]==0.28.1
 starsessions[redis]==2.2.1

-sqlalchemy==2.0.38
-alembic==1.14.0
-peewee==3.18.1
-peewee-migrate==1.12.2
+sqlalchemy==2.0.44
+alembic==1.17.2
+peewee==3.18.3
+peewee-migrate==1.14.3

-pycrdt==0.12.25
+pycrdt==0.12.44
 redis

-APScheduler==3.10.4
-RestrictedPython==8.0
+APScheduler==3.11.1
+RestrictedPython==8.1

 loguru==0.7.3
-asgiref==3.8.1
+asgiref==3.11.0

 # AI libraries
 tiktoken
-mcp==1.21.2
+mcp==1.23.3

 openai
 anthropic
-google-genai==1.52.0
+google-genai==1.54.0
 google-generativeai==0.8.5

 langchain==0.3.27
 langchain-community==0.3.29

 fake-useragent==2.2.0
-chromadb==1.1.0
-weaviate-client==4.17.0
-opensearch-py==2.8.0
+chromadb==1.3.5
+weaviate-client==4.18.3
+opensearch-py==3.1.0

-transformers
-sentence-transformers==5.1.1
+transformers==4.57.3
+sentence-transformers==5.1.2
 accelerate
 pyarrow==20.0.0 # fix: pin pyarrow version to 20 for rpi compatibility #15897
 einops==0.8.1

-ftfy==6.2.3
+ftfy==6.3.1
 chardet==5.2.0
-pypdf==6.0.0
-fpdf2==2.8.2
-pymdown-extensions==10.14.2
-docx2txt==0.8
+pypdf==6.4.1
+fpdf2==2.8.5
+pymdown-extensions==10.18
+docx2txt==0.9
 python-pptx==1.0.2
-unstructured==0.18.18
+unstructured==0.18.21
 msoffcrypto-tool==5.4.2
-nltk==3.9.1
-Markdown==3.9
-pypandoc==1.15
-pandas==2.2.3
+nltk==3.9.2
+Markdown==3.10
+pypandoc==1.16.2
+pandas==2.3.3
 openpyxl==3.1.5
 pyxlsb==1.0.10
-xlrd==2.0.1
+xlrd==2.0.2
 validators==0.35.0
 psutil
 sentencepiece
 soundfile==0.13.1

-pillow==11.3.0
-opencv-python-headless==4.11.0.86
+pillow==12.0.0
+opencv-python-headless==4.12.0.88
 rapidocr-onnxruntime==1.4.4
 rank-bm25==0.2.2

-onnxruntime==1.20.1
-faster-whisper==1.1.1
+onnxruntime==1.23.2
+faster-whisper==1.2.1

-black==25.9.0
-youtube-transcript-api==1.2.2
+black==25.12.0
+youtube-transcript-api==1.2.3
 pytube==15.0.0

 pydub
-ddgs==9.0.0
+ddgs==9.9.3

 azure-ai-documentintelligence==1.0.2
-azure-identity==1.25.0
-azure-storage-blob==12.24.1
+azure-identity==1.25.1
+azure-storage-blob==12.27.1
 azure-search-documents==11.6.0

 ## Google Drive

@@ -104,49 +104,49 @@ google-api-python-client
 google-auth-httplib2
 google-auth-oauthlib

-googleapis-common-protos==1.70.0
-google-cloud-storage==2.19.0
+googleapis-common-protos==1.72.0
+google-cloud-storage==3.7.0

 ## Databases
 pymongo
-psycopg2-binary==2.9.10
-pgvector==0.4.1
+psycopg2-binary==2.9.11
+pgvector==0.4.2

-PyMySQL==1.1.1
-boto3==1.40.5
+PyMySQL==1.1.2
+boto3==1.42.5

-pymilvus==2.6.2
-qdrant-client==1.14.3
-playwright==1.49.1 # Caution: version must match docker-compose.playwright.yaml
-elasticsearch==9.1.0
+pymilvus==2.6.5
+qdrant-client==1.16.1
+playwright==1.57.0 # Caution: version must match docker-compose.playwright.yaml - Update the docker-compose.yaml if necessary
+elasticsearch==9.2.0
 pinecone==6.0.2
-oracledb==3.2.0
+oracledb==3.4.1

 av==14.0.1 # Caution: Set due to FATAL FIPS SELFTEST FAILURE, see discussion https://github.com/open-webui/open-webui/discussions/15720

-colbert-ai==0.2.21
+colbert-ai==0.2.22


 ## Tests
 docker~=7.1.0
 pytest~=8.4.1
-pytest-docker~=3.1.1
+pytest-docker~=3.2.5

 ## LDAP
 ldap3==2.9.1

 ## Firecrawl
-firecrawl-py==4.5.0
+firecrawl-py==4.10.4

 ## Trace
-opentelemetry-api==1.37.0
-opentelemetry-sdk==1.37.0
-opentelemetry-exporter-otlp==1.37.0
-opentelemetry-instrumentation==0.58b0
-opentelemetry-instrumentation-fastapi==0.58b0
-opentelemetry-instrumentation-sqlalchemy==0.58b0
-opentelemetry-instrumentation-redis==0.58b0
-opentelemetry-instrumentation-requests==0.58b0
-opentelemetry-instrumentation-logging==0.58b0
-opentelemetry-instrumentation-httpx==0.58b0
-opentelemetry-instrumentation-aiohttp-client==0.58b0
+opentelemetry-api==1.39.0
+opentelemetry-sdk==1.39.0
+opentelemetry-exporter-otlp==1.39.0
+opentelemetry-instrumentation==0.60b0
+opentelemetry-instrumentation-fastapi==0.60b0
+opentelemetry-instrumentation-sqlalchemy==0.60b0
+opentelemetry-instrumentation-redis==0.60b0
+opentelemetry-instrumentation-requests==0.60b0
+opentelemetry-instrumentation-logging==0.60b0
+opentelemetry-instrumentation-httpx==0.60b0
+opentelemetry-instrumentation-aiohttp-client==0.60b0
@@ -1,8 +1,8 @@
 services:
   playwright:
-    image: mcr.microsoft.com/playwright:v1.49.1-noble # Version must match requirements.txt
+    image: mcr.microsoft.com/playwright:v1.57.0-noble # Version must match requirements.txt
     container_name: playwright
-    command: npx -y playwright@1.49.1 run-server --port 3000 --host 0.0.0.0
+    command: npx -y playwright@1.57.0 run-server --port 3000 --host 0.0.0.0

   open-webui:
     environment:
@@ -1,4 +0,0 @@
-# Helm Charts
-Open WebUI Helm Charts are now hosted in a separate repo, which can be found here: https://github.com/open-webui/helm-charts
-
-The charts are released at https://helm.openwebui.com.

@@ -1,8 +0,0 @@
-resources:
-- open-webui.yaml
-- ollama-service.yaml
-- ollama-statefulset.yaml
-- webui-deployment.yaml
-- webui-service.yaml
-- webui-ingress.yaml
-- webui-pvc.yaml

@@ -1,12 +0,0 @@
-apiVersion: v1
-kind: Service
-metadata:
-  name: ollama-service
-  namespace: open-webui
-spec:
-  selector:
-    app: ollama
-  ports:
-    - protocol: TCP
-      port: 11434
-      targetPort: 11434

@@ -1,41 +0,0 @@
-apiVersion: apps/v1
-kind: StatefulSet
-metadata:
-  name: ollama
-  namespace: open-webui
-spec:
-  serviceName: "ollama"
-  replicas: 1
-  selector:
-    matchLabels:
-      app: ollama
-  template:
-    metadata:
-      labels:
-        app: ollama
-    spec:
-      containers:
-      - name: ollama
-        image: ollama/ollama:latest
-        ports:
-        - containerPort: 11434
-        resources:
-          requests:
-            cpu: "2000m"
-            memory: "2Gi"
-          limits:
-            cpu: "4000m"
-            memory: "4Gi"
-            nvidia.com/gpu: "0"
-        volumeMounts:
-        - name: ollama-volume
-          mountPath: /root/.ollama
-        tty: true
-  volumeClaimTemplates:
-  - metadata:
-      name: ollama-volume
-    spec:
-      accessModes: [ "ReadWriteOnce" ]
-      resources:
-        requests:
-          storage: 30Gi

@@ -1,4 +0,0 @@
-apiVersion: v1
-kind: Namespace
-metadata:
-  name: open-webui

@@ -1,38 +0,0 @@
-apiVersion: apps/v1
-kind: Deployment
-metadata:
-  name: open-webui-deployment
-  namespace: open-webui
-spec:
-  replicas: 1
-  selector:
-    matchLabels:
-      app: open-webui
-  template:
-    metadata:
-      labels:
-        app: open-webui
-    spec:
-      containers:
-      - name: open-webui
-        image: ghcr.io/open-webui/open-webui:main
-        ports:
-        - containerPort: 8080
-        resources:
-          requests:
-            cpu: "500m"
-            memory: "500Mi"
-          limits:
-            cpu: "1000m"
-            memory: "1Gi"
-        env:
-        - name: OLLAMA_BASE_URL
-          value: "http://ollama-service.open-webui.svc.cluster.local:11434"
-        tty: true
-        volumeMounts:
-        - name: webui-volume
-          mountPath: /app/backend/data
-      volumes:
-      - name: webui-volume
-        persistentVolumeClaim:
-          claimName: open-webui-pvc

@@ -1,20 +0,0 @@
-apiVersion: networking.k8s.io/v1
-kind: Ingress
-metadata:
-  name: open-webui-ingress
-  namespace: open-webui
-  #annotations:
-    # Use appropriate annotations for your Ingress controller, e.g., for NGINX:
-    # nginx.ingress.kubernetes.io/rewrite-target: /
-spec:
-  rules:
-  - host: open-webui.minikube.local
-    http:
-      paths:
-      - path: /
-        pathType: Prefix
-        backend:
-          service:
-            name: open-webui-service
-            port:
-              number: 8080

@@ -1,12 +0,0 @@
-apiVersion: v1
-kind: PersistentVolumeClaim
-metadata:
-  labels:
-    app: open-webui
-  name: open-webui-pvc
-  namespace: open-webui
-spec:
-  accessModes: ["ReadWriteOnce"]
-  resources:
-    requests:
-      storage: 2Gi

@@ -1,15 +0,0 @@
-apiVersion: v1
-kind: Service
-metadata:
-  name: open-webui-service
-  namespace: open-webui
-spec:
-  type: NodePort  # Use LoadBalancer if you're on a cloud that supports it
-  selector:
-    app: open-webui
-  ports:
-    - protocol: TCP
-      port: 8080
-      targetPort: 8080
-      # If using NodePort, you can optionally specify the nodePort:
-      # nodePort: 30000

@@ -1,8 +0,0 @@
-apiVersion: kustomize.config.k8s.io/v1beta1
-kind: Kustomization
-
-resources:
-- ../base
-
-patches:
-- path: ollama-statefulset-gpu.yaml

@@ -1,17 +0,0 @@
-apiVersion: apps/v1
-kind: StatefulSet
-metadata:
-  name: ollama
-  namespace: open-webui
-spec:
-  selector:
-    matchLabels:
-      app: ollama
-  serviceName: "ollama"
-  template:
-    spec:
-      containers:
-      - name: ollama
-        resources:
-          limits:
-            nvidia.com/gpu: "1"
package-lock.json (generated, 4 changes)
@@ -1,12 +1,12 @@
 {
     "name": "open-webui",
-    "version": "0.6.40",
+    "version": "0.6.41",
     "lockfileVersion": 3,
     "requires": true,
     "packages": {
         "": {
             "name": "open-webui",
-            "version": "0.6.40",
+            "version": "0.6.41",
             "dependencies": {
                 "@azure/msal-browser": "^4.5.0",
                 "@codemirror/lang-javascript": "^6.2.2",
@@ -1,6 +1,6 @@
 {
     "name": "open-webui",
-    "version": "0.6.40",
+    "version": "0.6.41",
     "private": true,
     "scripts": {
         "dev": "npm run pyodide:fetch && vite dev --host",
pyproject.toml (111 changes)
@@ -6,13 +6,13 @@ authors = [
 ]
 license = { file = "LICENSE" }
 dependencies = [
-    "fastapi==0.118.0",
+    "fastapi==0.124.0",
     "uvicorn[standard]==0.37.0",
-    "pydantic==2.11.9",
+    "pydantic==2.12.5",
     "python-multipart==0.0.20",
     "itsdangerous==2.2.0",

-    "python-socketio==5.14.0",
+    "python-socketio==5.15.0",
     "python-jose==3.5.0",
     "cryptography",
     "bcrypt==5.0.0",

@@ -21,97 +21,97 @@ dependencies = [
     "authlib==1.6.5",

     "requests==2.32.5",
-    "aiohttp==3.12.15",
+    "aiohttp==3.13.2",
     "async-timeout",
     "aiocache",
     "aiofiles",
-    "starlette-compress==1.6.0",
+    "starlette-compress==1.6.1",
     "httpx[socks,http2,zstd,cli,brotli]==0.28.1",
     "starsessions[redis]==2.2.1",

-    "sqlalchemy==2.0.38",
-    "alembic==1.14.0",
-    "peewee==3.18.1",
-    "peewee-migrate==1.12.2",
+    "sqlalchemy==2.0.44",
+    "alembic==1.17.2",
+    "peewee==3.18.3",
+    "peewee-migrate==1.14.3",

-    "pycrdt==0.12.25",
+    "pycrdt==0.12.44",
     "redis",

-    "APScheduler==3.10.4",
-    "RestrictedPython==8.0",
+    "APScheduler==3.11.1",
+    "RestrictedPython==8.1",

     "loguru==0.7.3",
-    "asgiref==3.8.1",
+    "asgiref==3.11.0",

     "tiktoken",
-    "mcp==1.21.2",
+    "mcp==1.23.3",

     "openai",
     "anthropic",
-    "google-genai==1.52.0",
+    "google-genai==1.54.0",
     "google-generativeai==0.8.5",

     "langchain==0.3.27",
     "langchain-community==0.3.29",

     "fake-useragent==2.2.0",
-    "chromadb==1.0.20",
-    "opensearch-py==2.8.0",
-    "PyMySQL==1.1.1",
-    "boto3==1.40.5",
+    "chromadb==1.3.5",
+    "opensearch-py==3.1.0",
+    "PyMySQL==1.1.2",
+    "boto3==1.42.5",

-    "transformers",
-    "sentence-transformers==5.1.1",
+    "transformers==4.57.3",
+    "sentence-transformers==5.1.2",
     "accelerate",
-    "pyarrow==20.0.0",
+    "pyarrow==20.0.0", # fix: pin pyarrow version to 20 for rpi compatibility #15897
     "einops==0.8.1",

-    "ftfy==6.2.3",
+    "ftfy==6.3.1",
     "chardet==5.2.0",
-    "pypdf==6.0.0",
-    "fpdf2==2.8.2",
-    "pymdown-extensions==10.14.2",
-    "docx2txt==0.8",
+    "pypdf==6.4.1",
+    "fpdf2==2.8.5",
+    "pymdown-extensions==10.18",
+    "docx2txt==0.9",
     "python-pptx==1.0.2",
-    "unstructured==0.18.18",
+    "unstructured==0.18.21",
     "msoffcrypto-tool==5.4.2",
-    "nltk==3.9.1",
-    "Markdown==3.9",
-    "pypandoc==1.15",
-    "pandas==2.2.3",
+    "nltk==3.9.2",
+    "Markdown==3.10",
+    "pypandoc==1.16.2",
+    "pandas==2.3.3",
     "openpyxl==3.1.5",
     "pyxlsb==1.0.10",
-    "xlrd==2.0.1",
+    "xlrd==2.0.2",
     "validators==0.35.0",
     "psutil",
     "sentencepiece",
     "soundfile==0.13.1",
     "azure-ai-documentintelligence==1.0.2",

-    "pillow==11.3.0",
-    "opencv-python-headless==4.11.0.86",
+    "pillow==12.0.0",
+    "opencv-python-headless==4.12.0.88",
     "rapidocr-onnxruntime==1.4.4",
     "rank-bm25==0.2.2",

-    "onnxruntime==1.20.1",
-    "faster-whisper==1.1.1",
+    "onnxruntime==1.23.2",
+    "faster-whisper==1.2.1",

-    "black==25.9.0",
-    "youtube-transcript-api==1.2.2",
+    "black==25.12.0",
+    "youtube-transcript-api==1.2.3",
     "pytube==15.0.0",

     "pydub",
-    "ddgs==9.0.0",
+    "ddgs==9.9.3",

     "google-api-python-client",
     "google-auth-httplib2",
     "google-auth-oauthlib",

-    "googleapis-common-protos==1.70.0",
-    "google-cloud-storage==2.19.0",
+    "googleapis-common-protos==1.72.0",
+    "google-cloud-storage==3.7.0",

-    "azure-identity==1.25.0",
-    "azure-storage-blob==12.24.1",
+    "azure-identity==1.25.1",
+    "azure-storage-blob==12.27.1",

     "ldap3==2.9.1",
 ]

@@ -130,8 +130,8 @@ classifiers = [

 [project.optional-dependencies]
 postgres = [
-    "psycopg2-binary==2.9.10",
-    "pgvector==0.4.1",
+    "psycopg2-binary==2.9.11",
+    "pgvector==0.4.2",
 ]

 all = [

@@ -142,18 +142,19 @@ all = [
     "gcp-storage-emulator>=2024.8.3",
     "docker~=7.1.0",
     "pytest~=8.3.2",
-    "pytest-docker~=3.1.1",
-    "playwright==1.49.1",
-    "elasticsearch==9.1.0",
+    "pytest-docker~=3.2.5",
+    "playwright==1.57.0", # Caution: version must match docker-compose.playwright.yaml - Update the docker-compose.yaml if necessary
+    "elasticsearch==9.2.0",

-    "qdrant-client==1.14.3",
-    "weaviate-client==4.17.0",
-    "pymilvus==2.6.2",
+    "qdrant-client==1.16.1",
+    "pymilvus==2.6.4",
+    "weaviate-client==4.18.3",
+    "pymilvus==2.6.5",
     "pinecone==6.0.2",
-    "oracledb==3.2.0",
-    "colbert-ai==0.2.21",
+    "oracledb==3.4.1",
+    "colbert-ai==0.2.22",

-    "firecrawl-py==4.5.0",
+    "firecrawl-py==4.10.4",
     "azure-search-documents==11.6.0",
 ]
@@ -637,7 +637,7 @@ input[type='number'] {

 .tiptap th,
 .tiptap td {
-    @apply px-3 py-1.5 border border-gray-100 dark:border-gray-850;
+    @apply px-3 py-1.5 border border-gray-100/30 dark:border-gray-850/30;
 }

 .tiptap th {

@@ -803,3 +803,7 @@ body {
     position: relative;
     z-index: 0;
 }
+
+#note-content-container .ProseMirror {
+    padding-bottom: 2rem; /* space for the bottom toolbar */
+}
@@ -1,10 +1,13 @@
 import { WEBUI_API_BASE_URL } from '$lib/constants';

 type ChannelForm = {
+    type?: string;
     name: string;
+    is_private?: boolean;
     data?: object;
     meta?: object;
     access_control?: object;
+    user_ids?: string[];
 };

 export const createNewChannel = async (token: string = '', channel: ChannelForm) => {

@@ -101,7 +104,38 @@ export const getChannelById = async (token: string = '', channel_id: string) =>
     return res;
 };

-export const getChannelUsersById = async (
+export const getDMChannelByUserId = async (token: string = '', user_id: string) => {
+    let error = null;
+
+    const res = await fetch(`${WEBUI_API_BASE_URL}/channels/users/${user_id}`, {
+        method: 'GET',
+        headers: {
+            Accept: 'application/json',
+            'Content-Type': 'application/json',
+            authorization: `Bearer ${token}`
+        }
+    })
+        .then(async (res) => {
+            if (!res.ok) throw await res.json();
+            return res.json();
+        })
+        .then((json) => {
+            return json;
+        })
+        .catch((err) => {
+            error = err.detail;
+            console.error(err);
+            return null;
+        });
+
+    if (error) {
+        throw error;
+    }
+
+    return res;
+};
+
+export const getChannelMembersById = async (
     token: string,
     channel_id: string,
     query?: string,

@@ -129,7 +163,7 @@ export const getChannelUsersById = async (
     }

     res = await fetch(
-        `${WEBUI_API_BASE_URL}/channels/${channel_id}/users?${searchParams.toString()}`,
+        `${WEBUI_API_BASE_URL}/channels/${channel_id}/members?${searchParams.toString()}`,
         {
             method: 'GET',
             headers: {

@@ -155,6 +189,124 @@ export const getChannelUsersById = async (
     return res;
 };

+export const updateChannelMemberActiveStatusById = async (
+    token: string = '',
+    channel_id: string,
+    is_active: boolean
+) => {
+    let error = null;
+
+    const res = await fetch(`${WEBUI_API_BASE_URL}/channels/${channel_id}/members/active`, {
+        method: 'POST',
+        headers: {
+            Accept: 'application/json',
+            'Content-Type': 'application/json',
+            authorization: `Bearer ${token}`
+        },
+        body: JSON.stringify({ is_active })
+    })
+        .then(async (res) => {
+            if (!res.ok) throw await res.json();
+            return res.json();
+        })
+        .then((json) => {
+            return json;
+        })
+        .catch((err) => {
+            error = err.detail;
+            console.error(err);
+            return null;
+        });
+
+    if (error) {
+        throw error;
+    }
+
+    return res;
+};
+
+type UpdateMembersForm = {
+    user_ids?: string[];
+    group_ids?: string[];
+};
+
+export const addMembersById = async (
+    token: string = '',
+    channel_id: string,
+    formData: UpdateMembersForm
+) => {
+    let error = null;
+
+    const res = await fetch(`${WEBUI_API_BASE_URL}/channels/${channel_id}/update/members/add`, {
+        method: 'POST',
+        headers: {
+            Accept: 'application/json',
+            'Content-Type': 'application/json',
+            authorization: `Bearer ${token}`
+        },
+        body: JSON.stringify({ ...formData })
+    })
+        .then(async (res) => {
+            if (!res.ok) throw await res.json();
+            return res.json();
+        })
+        .then((json) => {
+            return json;
+        })
+        .catch((err) => {
+            error = err.detail;
+            console.error(err);
+            return null;
+        });
+
+    if (error) {
+        throw error;
+    }
+
+    return res;
+};
+
+type RemoveMembersForm = {
+    user_ids?: string[];
+    group_ids?: string[];
+};
+
+export const removeMembersById = async (
+    token: string = '',
+    channel_id: string,
+    formData: RemoveMembersForm
+) => {
+    let error = null;
+
+    const res = await fetch(`${WEBUI_API_BASE_URL}/channels/${channel_id}/update/members/remove`, {
+        method: 'POST',
+        headers: {
+            Accept: 'application/json',
+            'Content-Type': 'application/json',
+            authorization: `Bearer ${token}`
+        },
+        body: JSON.stringify({ ...formData })
+    })
+        .then(async (res) => {
+            if (!res.ok) throw await res.json();
+            return res.json();
+        })
+        .then((json) => {
+            return json;
+        })
+        .catch((err) => {
+            error = err.detail;
+            console.error(err);
+            return null;
+        });
+
+    if (error) {
+        throw error;
+    }
+
+    return res;
+};
+
 export const updateChannelById = async (
     token: string = '',
     channel_id: string,

@@ -261,6 +413,44 @@ export const getChannelMessages = async (
     return res;
 };

+export const getChannelPinnedMessages = async (
+    token: string = '',
+    channel_id: string,
+    page: number = 1
+) => {
+    let error = null;
+
+    const res = await fetch(
+        `${WEBUI_API_BASE_URL}/channels/${channel_id}/messages/pinned?page=${page}`,
+        {
+            method: 'GET',
+            headers: {
+                Accept: 'application/json',
+                'Content-Type': 'application/json',
+                authorization: `Bearer ${token}`
+            }
+        }
+    )
+        .then(async (res) => {
+            if (!res.ok) throw await res.json();
+            return res.json();
+        })
+        .then((json) => {
+            return json;
+        })
+        .catch((err) => {
+            error = err.detail;
+            console.error(err);
+            return null;
+        });
+
+    if (error) {
+        throw error;
+    }
+
+    return res;
+};
+
 export const getChannelThreadMessages = async (
     token: string = '',
     channel_id: string,

@@ -301,7 +491,46 @@ export const getChannelThreadMessages = async (
     return res;
 };

+export const getMessageData = async (
+    token: string = '',
+    channel_id: string,
+    message_id: string
+) => {
+    let error = null;
+
+    const res = await fetch(
+        `${WEBUI_API_BASE_URL}/channels/${channel_id}/messages/${message_id}/data`,
+        {
+            method: 'GET',
+            headers: {
+                Accept: 'application/json',
+                'Content-Type': 'application/json',
+                authorization: `Bearer ${token}`
+            }
+        }
+    )
+        .then(async (res) => {
+            if (!res.ok) throw await res.json();
+            return res.json();
+        })
+        .then((json) => {
+            return json;
+        })
+        .catch((err) => {
+            error = err.detail;
+            console.error(err);
+            return null;
+        });
+
+    if (error) {
+        throw error;
+    }
+
+    return res;
+};
+
 type MessageForm = {
+    temp_id?: string;
     reply_to_id?: string;
     parent_id?: string;
     content: string;

@@ -341,6 +570,46 @@ export const sendMessage = async (token: string = '', channel_id: string, messag
     return res;
 };

+export const pinMessage = async (
+    token: string = '',
+    channel_id: string,
+    message_id: string,
+    is_pinned: boolean
+) => {
+    let error = null;
+
+    const res = await fetch(
+        `${WEBUI_API_BASE_URL}/channels/${channel_id}/messages/${message_id}/pin`,
+        {
+            method: 'POST',
+            headers: {
+                Accept: 'application/json',
+                'Content-Type': 'application/json',
+                authorization: `Bearer ${token}`
+            },
+            body: JSON.stringify({ is_pinned })
+        }
+    )
+        .then(async (res) => {
+            if (!res.ok) throw await res.json();
+            return res.json();
+        })
+        .then((json) => {
+            return json;
+        })
+        .catch((err) => {
+            error = err.detail;
+            console.error(err);
+            return null;
+        });
+
+    if (error) {
+        throw error;
+    }
+
+    return res;
+};
+
 export const updateMessage = async (
     token: string = '',
     channel_id: string,
@@ -1,16 +1,26 @@
 import { WEBUI_API_BASE_URL } from '$lib/constants';
 import { splitStream } from '$lib/utils';

-export const uploadFile = async (token: string, file: File, metadata?: object | null) => {
+export const uploadFile = async (
+    token: string,
+    file: File,
+    metadata?: object | null,
+    process?: boolean | null
+) => {
     const data = new FormData();
     data.append('file', file);
     if (metadata) {
         data.append('metadata', JSON.stringify(metadata));
     }

+    const searchParams = new URLSearchParams();
+    if (process !== undefined && process !== null) {
+        searchParams.append('process', String(process));
+    }
+
     let error = null;

-    const res = await fetch(`${WEBUI_API_BASE_URL}/files/`, {
+    const res = await fetch(`${WEBUI_API_BASE_URL}/files/?${searchParams.toString()}`, {
         method: 'POST',
         headers: {
             Accept: 'application/json',
@@ -132,6 +132,56 @@ export const getKnowledgeById = async (token: string, id: string) => {
     return res;
 };

+export const searchKnowledgeFilesById = async (
+    token: string,
+    id: string,
+    query: string | null = null,
+    viewOption: string | null = null,
+    orderBy: string | null = null,
+    direction: string | null = null,
+    page: number = 1
+) => {
+    let error = null;
+
+    const searchParams = new URLSearchParams();
+    if (query) searchParams.append('query', query);
+    if (viewOption) searchParams.append('view_option', viewOption);
+    if (orderBy) searchParams.append('order_by', orderBy);
+    if (direction) searchParams.append('direction', direction);
+    searchParams.append('page', page.toString());
+
+    const res = await fetch(
+        `${WEBUI_API_BASE_URL}/knowledge/${id}/files?${searchParams.toString()}`,
+        {
+            method: 'GET',
+            headers: {
+                Accept: 'application/json',
+                'Content-Type': 'application/json',
+                authorization: `Bearer ${token}`
+            }
+        }
+    )
+        .then(async (res) => {
+            if (!res.ok) throw await res.json();
+            return res.json();
+        })
+        .then((json) => {
+            return json;
+        })
+        .catch((err) => {
+            error = err.detail;
+
+            console.error(err);
+            return null;
+        });
+
+    if (error) {
+        throw error;
+    }
+
+    return res;
+};
+
 type KnowledgeUpdateForm = {
     name?: string;
     description?: string;
@@ -91,6 +91,65 @@ export const getNotes = async (token: string = '', raw: boolean = false) => {
     return grouped;
 };

+export const searchNotes = async (
+    token: string = '',
+    query: string | null = null,
+    viewOption: string | null = null,
+    permission: string | null = null,
+    sortKey: string | null = null,
+    page: number | null = null
+) => {
+    let error = null;
+    const searchParams = new URLSearchParams();
+
+    if (query !== null) {
+        searchParams.append('query', query);
+    }
+
+    if (viewOption !== null) {
+        searchParams.append('view_option', viewOption);
+    }
+
+    if (permission !== null) {
+        searchParams.append('permission', permission);
+    }
+
+    if (sortKey !== null) {
+        searchParams.append('order_by', sortKey);
+    }
+
+    if (page !== null) {
+        searchParams.append('page', `${page}`);
+    }
+
+    const res = await fetch(`${WEBUI_API_BASE_URL}/notes/search?${searchParams.toString()}`, {
+        method: 'GET',
+        headers: {
+            Accept: 'application/json',
+            'Content-Type': 'application/json',
+            authorization: `Bearer ${token}`
+        }
+    })
+        .then(async (res) => {
+            if (!res.ok) throw await res.json();
+            return res.json();
+        })
+        .then((json) => {
+            return json;
+        })
+        .catch((err) => {
+            error = err.detail;
+            console.error(err);
+            return null;
+        });
+
+    if (error) {
+        throw error;
+    }
+
+    return res;
+};
+
 export const getNoteList = async (token: string = '', page: number | null = null) => {
     let error = null;
     const searchParams = new URLSearchParams();

@@ -99,7 +158,7 @@ export const getNoteList = async (token: string = '', page: number | null = null
         searchParams.append('page', `${page}`);
     }

-    const res = await fetch(`${WEBUI_API_BASE_URL}/notes/list?${searchParams.toString()}`, {
+    const res = await fetch(`${WEBUI_API_BASE_URL}/notes/?${searchParams.toString()}`, {
         method: 'GET',
         headers: {
             Accept: 'application/json',
@@ -35,6 +35,7 @@ type ChunkConfigForm = {
 type DocumentIntelligenceConfigForm = {
     key: string;
     endpoint: string;
+    model: string;
 };

 type ContentExtractConfigForm = {
@@ -166,11 +166,33 @@ export const getUsers (
     return res;
 };

-export const getAllUsers = async (token: string) => {
+export const searchUsers = async (
+    token: string,
+    query?: string,
+    orderBy?: string,
+    direction?: string,
+    page = 1
+) => {
     let error = null;
     let res = null;

-    res = await fetch(`${WEBUI_API_BASE_URL}/users/all`, {
+    const searchParams = new URLSearchParams();
+
+    searchParams.set('page', `${page}`);
+
+    if (query) {
+        searchParams.set('query', query);
+    }
+
+    if (orderBy) {
+        searchParams.set('order_by', orderBy);
+    }
+
+    if (direction) {
+        searchParams.set('direction', direction);
+    }
+
+    res = await fetch(`${WEBUI_API_BASE_URL}/users/search?${searchParams.toString()}`, {
         method: 'GET',
         headers: {
             'Content-Type': 'application/json',

@@ -194,11 +216,11 @@ export const getAllUsers (
     return res;
 };

-export const searchUsers = async (token: string, query: string) => {
+export const getAllUsers = async (token: string) => {
     let error = null;
     let res = null;

-    res = await fetch(`${WEBUI_API_BASE_URL}/users/search?query=${encodeURIComponent(query)}`, {
+    res = await fetch(`${WEBUI_API_BASE_URL}/users/all`, {
         method: 'GET',
         headers: {
             'Content-Type': 'application/json',

@@ -305,6 +327,36 @@ export const getUserById = async (token: string, userId: string) => {
     return res;
 };

+export const updateUserStatus = async (token: string, formData: object) => {
+    let error = null;
+
+    const res = await fetch(`${WEBUI_API_BASE_URL}/users/user/status/update`, {
+        method: 'POST',
+        headers: {
+            'Content-Type': 'application/json',
+            Authorization: `Bearer ${token}`
+        },
+        body: JSON.stringify({
+            ...formData
+        })
+    })
+        .then(async (res) => {
+            if (!res.ok) throw await res.json();
+            return res.json();
+        })
+        .catch((err) => {
+            console.error(err);
+            error = err.detail;
+            return null;
+        });
+
+    if (error) {
+        throw error;
+    }
+
+    return res;
+};
+
 export const getUserInfo = async (token: string) => {
     let error = null;
     const res = await fetch(`${WEBUI_API_BASE_URL}/users/user/info`, {
@@ -358,7 +358,7 @@
     <div class="flex-shrink-0 self-start">
         <select
             id="select-bearer-or-session"
-            class={`w-full text-sm bg-transparent pr-5 ${($settings?.highContrastMode ?? false) ? 'placeholder:text-gray-700 dark:placeholder:text-gray-100' : 'outline-hidden placeholder:text-gray-300 dark:placeholder:text-gray-700'}`}
+            class={`dark:bg-gray-900 w-full text-sm bg-transparent pr-5 ${($settings?.highContrastMode ?? false) ? 'placeholder:text-gray-700 dark:placeholder:text-gray-100' : 'outline-hidden placeholder:text-gray-300 dark:placeholder:text-gray-700'}`}
             bind:value={auth_type}
         >
             <option value="none">{$i18n.t('None')}</option>
@@ -47,7 +47,7 @@
     let key = '';
     let headers = '';

-    let functionNameFilterList = [];
+    let functionNameFilterList = '';
     let accessControl = {};

     let id = '';

@@ -338,7 +338,7 @@
         oauthClientInfo = null;

         enable = true;
-        functionNameFilterList = [];
+        functionNameFilterList = '';
         accessControl = null;
     };

@@ -362,7 +362,7 @@
         oauthClientInfo = connection.info?.oauth_client_info ?? null;

         enable = connection.config?.enable ?? true;
-        functionNameFilterList = connection.config?.function_name_filter_list ?? [];
+        functionNameFilterList = connection.config?.function_name_filter_list ?? '';
         accessControl = connection.config?.access_control ?? null;
     }
 };

@@ -534,7 +534,7 @@
     <div class="flex-shrink-0 self-start">
         <select
             id="select-bearer-or-session"
-            class={`w-full text-sm bg-transparent pr-5 ${($settings?.highContrastMode ?? false) ? 'placeholder:text-gray-700 dark:placeholder:text-gray-100' : 'outline-hidden placeholder:text-gray-300 dark:placeholder:text-gray-700'}`}
+            class={`dark:bg-gray-900 w-full text-sm bg-transparent pr-5 ${($settings?.highContrastMode ?? false) ? 'placeholder:text-gray-700 dark:placeholder:text-gray-100' : 'outline-hidden placeholder:text-gray-300 dark:placeholder:text-gray-700'}`}
             bind:value={spec_type}
         >
             <option value="url">{$i18n.t('URL')}</option>

@@ -644,7 +644,7 @@
     <div class="flex-shrink-0 self-start">
         <select
             id="select-bearer-or-session"
-            class={`w-full text-sm bg-transparent pr-5 ${($settings?.highContrastMode ?? false) ? 'placeholder:text-gray-700 dark:placeholder:text-gray-100' : 'outline-hidden placeholder:text-gray-300 dark:placeholder:text-gray-700'}`}
+            class={`dark:bg-gray-900 w-full text-sm bg-transparent pr-5 ${($settings?.highContrastMode ?? false) ? 'placeholder:text-gray-700 dark:placeholder:text-gray-100' : 'outline-hidden placeholder:text-gray-300 dark:placeholder:text-gray-700'}`}
             bind:value={auth_type}
         >
             <option value="none">{$i18n.t('None')}</option>

@@ -818,10 +818,8 @@

     <hr class=" border-gray-100 dark:border-gray-700/10 my-2.5 w-full" />

-    <div class="my-2 -mx-2">
-        <div class="px-4 py-3 bg-gray-50 dark:bg-gray-950 rounded-3xl">
-            <AccessControl bind:accessControl />
-        </div>
+    <div class="my-2">
+        <AccessControl bind:accessControl />
     </div>
 {/if}
 </div>
@ -186,7 +186,7 @@
|
||||||
class="w-full text-sm text-left text-gray-500 dark:text-gray-400 table-auto max-w-full"
|
class="w-full text-sm text-left text-gray-500 dark:text-gray-400 table-auto max-w-full"
|
||||||
>
|
>
|
||||||
<thead class="text-xs text-gray-800 uppercase bg-transparent dark:text-gray-200">
|
<thead class="text-xs text-gray-800 uppercase bg-transparent dark:text-gray-200">
|
||||||
<tr class=" border-b-[1.5px] border-gray-50 dark:border-gray-850">
|
<tr class=" border-b-[1.5px] border-gray-50 dark:border-gray-850/30">
|
||||||
<th
|
<th
|
||||||
scope="col"
|
scope="col"
|
||||||
class="px-2.5 py-2 cursor-pointer select-none w-3"
|
class="px-2.5 py-2 cursor-pointer select-none w-3"
|
||||||
|
|
|
||||||
|
|
@ -387,7 +387,7 @@
|
||||||
: ''}"
|
: ''}"
|
||||||
>
|
>
|
||||||
<thead class="text-xs text-gray-800 uppercase bg-transparent dark:text-gray-200">
|
<thead class="text-xs text-gray-800 uppercase bg-transparent dark:text-gray-200">
|
||||||
<tr class=" border-b-[1.5px] border-gray-50 dark:border-gray-850">
|
<tr class=" border-b-[1.5px] border-gray-50 dark:border-gray-850/30">
|
||||||
<th
|
<th
|
||||||
scope="col"
|
scope="col"
|
||||||
class="px-2.5 py-2 cursor-pointer select-none w-3"
|
class="px-2.5 py-2 cursor-pointer select-none w-3"
|
||||||
|
|
|
||||||
|
|
@ -343,7 +343,7 @@
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
<div
|
<div
|
||||||
class="py-2 bg-white dark:bg-gray-900 rounded-3xl border border-gray-100 dark:border-gray-850"
|
class="py-2 bg-white dark:bg-gray-900 rounded-3xl border border-gray-100/30 dark:border-gray-850/30"
|
||||||
>
|
>
|
||||||
<div class="px-3.5 flex flex-1 items-center w-full space-x-2 py-0.5 pb-2">
|
<div class="px-3.5 flex flex-1 items-center w-full space-x-2 py-0.5 pb-2">
|
||||||
<div class="flex flex-1">
|
<div class="flex flex-1">
|
||||||
|
|
|
||||||
|
|
@@ -63,7 +63,7 @@
 </div>
 </div>

-<hr class="border-gray-50 dark:border-gray-850 my-1" />
+<hr class="border-gray-50 dark:border-gray-850/30 my-1" />
 {/if}

 <DropdownMenu.Item

@@ -122,7 +122,7 @@
 <div class="flex items-center">{$i18n.t('Export')}</div>
 </DropdownMenu.Item>

-<hr class="border-gray-50 dark:border-gray-850 my-1" />
+<hr class="border-gray-50 dark:border-gray-850/30 my-1" />

 <DropdownMenu.Item
 class="flex gap-2 items-center px-3 py-1.5 text-sm font-medium cursor-pointer hover:bg-gray-50 dark:hover:bg-gray-800 rounded-md"
@@ -212,7 +212,7 @@
 <div>
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Speech-to-Text')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 {#if STT_ENGINE !== 'web'}
 <div class="mb-2">

@@ -263,7 +263,7 @@
 </div>
 </div>

-<hr class="border-gray-100 dark:border-gray-850 my-2" />
+<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div>
 <div class=" mb-1.5 text-xs font-medium">{$i18n.t('STT Model')}</div>

@@ -289,7 +289,7 @@
 </div>
 </div>

-<hr class="border-gray-100 dark:border-gray-850 my-2" />
+<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div>
 <div class=" mb-1.5 text-xs font-medium">{$i18n.t('STT Model')}</div>

@@ -323,7 +323,7 @@
 />
 </div>

-<hr class="border-gray-100 dark:border-gray-850 my-2" />
+<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div>
 <div class=" mb-1.5 text-xs font-medium">{$i18n.t('Azure Region')}</div>

@@ -391,7 +391,7 @@
 </div>
 </div>

-<hr class="border-gray-100 dark:border-gray-850 my-2" />
+<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div>
 <div class=" mb-1.5 text-xs font-medium">{$i18n.t('STT Model')}</div>

@@ -416,7 +416,7 @@
 </div>
 </div>

-<hr class="border-gray-100 dark:border-gray-850 my-2" />
+<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div>
 <div class="flex items-center justify-between mb-2">

@@ -500,7 +500,7 @@
 <div>
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Text-to-Speech')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="mb-2 py-0.5 flex w-full justify-between">
 <div class=" self-center text-xs font-medium">{$i18n.t('Text-to-Speech Engine')}</div>

@@ -557,7 +557,7 @@
 <SensitiveInput placeholder={$i18n.t('API Key')} bind:value={TTS_API_KEY} required />
 </div>

-<hr class="border-gray-100 dark:border-gray-850 my-2" />
+<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div>
 <div class=" mb-1.5 text-xs font-medium">{$i18n.t('Azure Region')}</div>
@@ -43,7 +43,7 @@
 <div class="mb-3.5">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="mb-2.5">
 <div class=" flex w-full justify-between">

@@ -166,7 +166,7 @@
 <div class="mb-3.5">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Code Interpreter')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="mb-2.5">
 <div class=" flex w-full justify-between">

@@ -288,7 +288,7 @@
 </div>
 {/if}

-<hr class="border-gray-100 dark:border-gray-850 my-2" />
+<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div>
 <div class="py-0.5 w-full">
@@ -221,7 +221,7 @@
 <div class="mb-3.5">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="my-2">
 <div class="mt-2 space-y-2">

@@ -384,7 +384,7 @@
 </div>
 </div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="my-2">
 <div class="flex justify-between items-center text-sm">
@@ -143,7 +143,7 @@
 </div>
 </button>

-<hr class="border-gray-50 dark:border-gray-850 my-1" />
+<hr class="border-gray-50 dark:border-gray-850/30 my-1" />

 {#if $config?.features.enable_admin_export ?? true}
 <div class=" flex w-full justify-between">
|
|
@ -327,7 +327,7 @@
|
||||||
<div class="mb-3">
|
<div class="mb-3">
|
||||||
<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>
|
<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>
|
||||||
|
|
||||||
<hr class=" border-gray-100 dark:border-gray-850 my-2" />
|
<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
|
||||||
|
|
||||||
<div class="mb-2.5 flex flex-col w-full justify-between">
|
<div class="mb-2.5 flex flex-col w-full justify-between">
|
||||||
<div class="flex w-full justify-between mb-1">
|
<div class="flex w-full justify-between mb-1">
|
||||||
|
|
@@ -597,6 +597,20 @@
 required={false}
 />
 </div>
+<div class="my-0.5 flex flex-col w-full">
+<div class=" mb-1 text-xs font-medium">
+{$i18n.t('Document Intelligence Model')}
+</div>
+<div class="flex w-full">
+<div class="flex-1 mr-2">
+<input
+class="flex-1 w-full text-sm bg-transparent outline-hidden"
+placeholder={$i18n.t('Enter Document Intelligence Model')}
+bind:value={RAGConfig.DOCUMENT_INTELLIGENCE_MODEL}
+/>
+</div>
+</div>
+</div>
 {:else if RAGConfig.CONTENT_EXTRACTION_ENGINE === 'mistral_ocr'}
 <div class="my-0.5 flex gap-2 pr-2">
 <input
@@ -762,7 +776,7 @@
 <div class="mb-3">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Embedding')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class=" mb-2.5 flex flex-col w-full justify-between">
 <div class="flex w-full justify-between">

@@ -953,7 +967,7 @@
 <div class="mb-3">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Retrieval')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class=" mb-2.5 flex w-full justify-between">
 <div class=" self-center text-xs font-medium">{$i18n.t('Full Context Mode')}</div>

@@ -1211,7 +1225,7 @@
 <div class="mb-3">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Files')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class=" mb-2.5 flex w-full justify-between">
 <div class=" self-center text-xs font-medium">{$i18n.t('Allowed File Extensions')}</div>

@@ -1323,7 +1337,7 @@
 <div class="mb-3">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Integration')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class=" mb-2.5 flex w-full justify-between">
 <div class=" self-center text-xs font-medium">{$i18n.t('Google Drive')}</div>

@@ -1343,7 +1357,7 @@
 <div class="mb-3">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Danger Zone')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class=" mb-2.5 flex w-full justify-between">
 <div class=" self-center text-xs font-medium">{$i18n.t('Reset Upload Directory')}</div>
@@ -106,7 +106,7 @@
 <div class="mb-3">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="mb-2.5 flex w-full justify-between">
 <div class=" text-xs font-medium">{$i18n.t('Arena Models')}</div>

@@ -139,7 +139,7 @@
 </div>
 </div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="flex flex-col gap-2">
 {#if (evaluationConfig?.EVALUATION_ARENA_MODELS ?? []).length > 0}
@@ -292,10 +292,8 @@

 <hr class=" border-gray-100 dark:border-gray-700/10 my-2.5 w-full" />

-<div class="my-2 -mx-2">
-<div class="px-4 py-3 bg-gray-50 dark:bg-gray-950 rounded-3xl">
-<AccessControl bind:accessControl />
-</div>
+<div class="my-2">
+<AccessControl bind:accessControl />
 </div>

 <hr class=" border-gray-100 dark:border-gray-700/10 my-2.5 w-full" />
@@ -352,7 +350,7 @@

 <div class="flex items-center">
 <select
-class="w-full py-1 text-sm rounded-lg bg-transparent {selectedModelId
+class="dark:bg-gray-900 w-full py-1 text-sm rounded-lg bg-transparent {selectedModelId
 ? ''
 : 'text-gray-500'} placeholder:text-gray-300 dark:placeholder:text-gray-700 outline-hidden"
 bind:value={selectedModelId}
@@ -129,7 +129,7 @@
 <div class="mb-3.5">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="mb-2.5">
 <div class=" mb-1 text-xs font-medium flex space-x-2 items-center">
@@ -287,7 +287,7 @@
 <div class="mb-3">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Authentication')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class=" mb-2.5 flex w-full justify-between">
 <div class=" self-center text-xs font-medium">{$i18n.t('Default User Role')}</div>

@@ -660,7 +660,7 @@
 <div class="mb-3">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Features')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="mb-2.5 flex w-full items-center justify-between pr-2">
 <div class=" self-center text-xs font-medium">
@@ -676,6 +676,14 @@
 <Switch bind:state={adminConfig.ENABLE_MESSAGE_RATING} />
 </div>

+<div class="mb-2.5 flex w-full items-center justify-between pr-2">
+<div class=" self-center text-xs font-medium">
+{$i18n.t('Folders')}
+</div>
+
+<Switch bind:state={adminConfig.ENABLE_FOLDERS} />
+</div>
+
 <div class="mb-2.5 flex w-full items-center justify-between pr-2">
 <div class=" self-center text-xs font-medium">
 {$i18n.t('Notes')} ({$i18n.t('Beta')})
@@ -291,7 +291,7 @@
 <div class="mb-3">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="mb-2.5">
 <div class="flex w-full justify-between items-center">

@@ -309,7 +309,7 @@
 <div class="mb-3">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Create Image')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 {#if config.ENABLE_IMAGE_GENERATION}
 <div class="mb-2.5">

@@ -882,7 +882,7 @@
 <div class="mb-3">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Edit Image')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="mb-2.5">
 <div class="flex w-full justify-between items-center">
@@ -115,7 +115,7 @@
 <div class="mb-3.5">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Tasks')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class=" mb-2 font-medium flex items-center">
 <div class=" text-xs mr-1">{$i18n.t('Task Model')}</div>

@@ -423,7 +423,7 @@
 <div class="mb-3.5">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('UI')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="mb-2.5">
 <div class="flex w-full justify-between">
@@ -811,9 +811,8 @@
 bind:value={deleteModelTag}
 placeholder={$i18n.t('Select a model')}
 >
-{#if !deleteModelTag}
-<option value="" disabled selected>{$i18n.t('Select a model')}</option>
-{/if}
+<option value="" disabled selected>{$i18n.t('Select a model')}</option>
 {#each ollamaModels as model}
 <option value={model.id} class="bg-gray-50 dark:bg-gray-700"
 >{model.name + ' (' + (model.size / 1024 ** 3).toFixed(1) + ' GB)'}</option
@@ -19,7 +19,7 @@

 <div class="flex items-center -mr-1">
 <select
-class="w-full py-1 text-sm rounded-lg bg-transparent {selectedModelId
+class="dark:bg-gray-900 w-full py-1 text-sm rounded-lg bg-transparent {selectedModelId
 ? ''
 : 'text-gray-500'} placeholder:text-gray-300 dark:placeholder:text-gray-700 outline-hidden"
 bind:value={selectedModelId}
@@ -47,7 +47,7 @@
 if (pipeline && (pipeline?.valves ?? false)) {
 for (const property in valves_spec.properties) {
 if (valves_spec.properties[property]?.type === 'array') {
-valves[property] = valves[property].split(',').map((v) => v.trim());
+valves[property] = (valves[property] ?? '').split(',').map((v) => v.trim());
 }
 }

@@ -418,7 +418,7 @@
 </div>
 </div>

-<hr class="border-gray-100 dark:border-gray-850 my-3 w-full" />
+<hr class="border-gray-100/30 dark:border-gray-850/30 my-3 w-full" />

 {#if pipelines !== null}
 {#if pipelines.length > 0}
@@ -61,7 +61,7 @@
 <div class="mb-3">
 <div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>

-<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />

 <div class="mb-2.5 flex flex-col w-full justify-between">
 <!-- {$i18n.t(`Failed to connect to {{URL}} OpenAPI tool server`, {
Some files were not shown because too many files have changed in this diff