mirror of
https://github.com/open-webui/open-webui.git
synced 2025-12-11 20:05:19 +00:00
Merge pull request #19466 from open-webui/dev
Some checks failed
Python CI / Format Backend (push) Has been cancelled
Frontend Build / Format & Build Frontend (push) Has been cancelled
Frontend Build / Frontend Unit Tests (push) Has been cancelled
Release / release (push) Has been cancelled
Deploy to HuggingFace Spaces / check-secret (push) Has been cancelled
Create and publish Docker images with specific build args / build-main-image (linux/amd64, ubuntu-latest) (push) Has been cancelled
Create and publish Docker images with specific build args / build-main-image (linux/arm64, ubuntu-24.04-arm) (push) Has been cancelled
Create and publish Docker images with specific build args / build-cuda-image (linux/amd64, ubuntu-latest) (push) Has been cancelled
Create and publish Docker images with specific build args / build-cuda-image (linux/arm64, ubuntu-24.04-arm) (push) Has been cancelled
Create and publish Docker images with specific build args / build-cuda126-image (linux/amd64, ubuntu-latest) (push) Has been cancelled
Create and publish Docker images with specific build args / build-cuda126-image (linux/arm64, ubuntu-24.04-arm) (push) Has been cancelled
Create and publish Docker images with specific build args / build-ollama-image (linux/amd64, ubuntu-latest) (push) Has been cancelled
Create and publish Docker images with specific build args / build-ollama-image (linux/arm64, ubuntu-24.04-arm) (push) Has been cancelled
Create and publish Docker images with specific build args / build-slim-image (linux/amd64, ubuntu-latest) (push) Has been cancelled
Create and publish Docker images with specific build args / build-slim-image (linux/arm64, ubuntu-24.04-arm) (push) Has been cancelled
Release to PyPI / release (push) Has been cancelled
Deploy to HuggingFace Spaces / deploy (push) Has been cancelled
Create and publish Docker images with specific build args / merge-main-images (push) Has been cancelled
Create and publish Docker images with specific build args / merge-cuda-images (push) Has been cancelled
Create and publish Docker images with specific build args / merge-cuda126-images (push) Has been cancelled
Create and publish Docker images with specific build args / merge-ollama-images (push) Has been cancelled
Create and publish Docker images with specific build args / merge-slim-images (push) Has been cancelled
This commit (6f1486ffd0) is contained in: 0.6.41
229 changed files with 8893 additions and 2174 deletions

CHANGELOG.md (73 changed lines)
@@ -5,6 +5,79 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.6.41] - 2025-12-02

### Added

- 🚦 Sign-in rate limiting was implemented to protect against brute force attacks, limiting login attempts to 15 per 3-minute window per email address using Redis with automatic fallback to in-memory storage when Redis is unavailable (see the sketch after this list). [Commit](https://github.com/open-webui/open-webui/commit/7b166370432414ce8f186747fb098e0c70fb2d6b)
- 📂 Administrators can now globally disable the folders feature and control user-level folder permissions through the admin panel, enabling minimalist interface configurations for deployments that don't require workspace organization features. [#19529](https://github.com/open-webui/open-webui/pull/19529), [#19210](https://github.com/open-webui/open-webui/discussions/19210), [#18459](https://github.com/open-webui/open-webui/discussions/18459), [#18299](https://github.com/open-webui/open-webui/discussions/18299)
- 👥 Group channels were introduced as a new channel type enabling membership-based collaboration spaces where users explicitly join as members rather than accessing through permissions, with support for public or private visibility, automatic member inclusion from specified user groups, member role tracking with invitation metadata, and post-creation member management allowing channel managers to add or remove members through the channel info modal. [Commit](https://github.com/open-webui/open-webui/commit/f589b7c1895a6a77166c047891acfa21bc0936c4), [Commit](https://github.com/open-webui/open-webui/commit/3f1d9ccbf8443a2fa5278f36202bad930a216680)
- 💬 Direct Message channels were introduced with a dedicated channel type selector and multi-user member selection interface, enabling private conversations between specific users without requiring full channel visibility. [Commit](https://github.com/open-webui/open-webui/commit/64b4d5d9c280b926746584aaf92b447d09deb386)
- 📨 Direct Message channels now support a complete user-to-user messaging system with member-based access control, automatic deduplication for one-on-one conversations, optional channel naming, and distinct visual presentation using participant avatars instead of channel icons. [Commit](https://github.com/open-webui/open-webui/commit/acccb9afdd557274d6296c70258bb897bbb6652f)
- 🙈 Users can now hide Direct Message channels from their sidebar while preserving message history, with automatic reactivation when new messages arrive from other participants, providing a cleaner interface for managing active conversations. [Commit](https://github.com/open-webui/open-webui/commit/acccb9afdd557274d6296c70258bb897bbb6652f)
- ☑️ A comprehensive user selection component was added to the channel creation modal, featuring search functionality, sortable user lists, pagination support, and multi-select checkboxes for building Direct Message participant lists. [Commit](https://github.com/open-webui/open-webui/commit/acccb9afdd557274d6296c70258bb897bbb6652f)
- 🔴 Channel unread message count tracking was implemented with visual badge indicators in the sidebar, automatically updating counts in real-time and marking messages as read when users view channels, with join/leave functionality to manage membership status. [Commit](https://github.com/open-webui/open-webui/commit/64b4d5d9c280b926746584aaf92b447d09deb386)
- 📌 Message pinning functionality was added to channels, allowing users to pin important messages for easy reference with visual highlighting, a dedicated pinned messages modal accessible from the navbar, and complete backend support for tracking pinned status, pin timestamp, and the user who pinned each message. [Commit](https://github.com/open-webui/open-webui/commit/64b4d5d9c280b926746584aaf92b447d09deb386), [Commit](https://github.com/open-webui/open-webui/commit/aae2fce17355419d9c29f8100409108037895201)
- 🟢 Direct Message channels now display an active status indicator for one-on-one conversations, showing a green dot when the other participant is currently online or a gray dot when offline. [Commit](https://github.com/open-webui/open-webui/commit/4b6773885cd7527c5a56b963781dac5e95105eec), [Commit](https://github.com/open-webui/open-webui/commit/39645102d14f34e71b34e5ddce0625790be33f6f)
- 🆔 Users can now start Direct Message conversations directly from user profile previews by clicking the "Message" button, enabling quick access to private messaging without navigating away from the current channel. [Commit](https://github.com/open-webui/open-webui/commit/a0826ec9fedb56320532616d568fa59dda831d4e)
- ⚡ Channel messages now appear instantly when sent using optimistic UI rendering, displaying with a pending state while the server confirms delivery, providing a more responsive messaging experience. [Commit](https://github.com/open-webui/open-webui/commit/25994dd3da90600401f53596d4e4fb067c1b8eaa)
- 👍 Channel message reactions now display the names of users who reacted when hovering over the emoji, showing up to three names with a count for additional reactors. [Commit](https://github.com/open-webui/open-webui/commit/05e79bdd0c7af70b631e958924e3656db1013b80)
- 🛠️ Channel creators can now edit and delete their own group and DM channels without requiring administrator privileges, enabling users to manage the channels they create independently. [Commit](https://github.com/open-webui/open-webui/commit/f589b7c1895a6a77166c047891acfa21bc0936c4)
- 🔌 A new API endpoint was added to directly get or create a Direct Message channel with a specific user by their ID, streamlining programmatic DM channel creation for integrations and frontend workflows. [Commit](https://github.com/open-webui/open-webui/commit/f589b7c1895a6a77166c047891acfa21bc0936c4)
- 💭 Users can now set a custom status with an emoji and message that displays in profile previews, the sidebar user menu, and Direct Message channel items in the sidebar, with the ability to clear status at any time, providing visibility into availability or current focus similar to team communication platforms. [Commit](https://github.com/open-webui/open-webui/commit/51621ba91a982e52da168ce823abffd11ad3e4fa), [Commit](https://github.com/open-webui/open-webui/commit/f5e8d4d5a004115489c35725408b057e24dfe318)
- 📤 A group export API endpoint was added, enabling administrators to export complete group data including member lists for backup and migration purposes. [Commit](https://github.com/open-webui/open-webui/commit/09b6ea38c579659f8ca43ae5ea3746df3ac561ad)
- 📡 A new API endpoint was added to retrieve all users belonging to a specific group, enabling programmatic access to group membership information for administrative workflows. [Commit](https://github.com/open-webui/open-webui/commit/01868e856a10f474f74fbd1b4425dafdf949222f)
- 👁️ The admin user list now displays an active status indicator next to each user, showing a visual green dot for users who have been active within the last three minutes. [Commit](https://github.com/open-webui/open-webui/commit/1b095d12ff2465b83afa94af89ded9593f8a8655)
- 🔑 The admin user edit modal now displays OAuth identity information with a per-provider breakdown, showing each linked identity provider and its associated subject identifier separately. [#19573](https://github.com/open-webui/open-webui/pull/19573)
- 🧩 OAuth role claim parsing now respects the "OAUTH_ROLES_SEPARATOR" configuration, enabling proper parsing of roles returned as comma-separated strings and providing consistent behavior with group claim handling. [#19514](https://github.com/open-webui/open-webui/pull/19514)
- 🎛️ Channel feature access can now be controlled through both the "USER_PERMISSIONS_FEATURES_CHANNELS" environment variable and group permission toggles in the admin panel, allowing administrators to restrict channel functionality for specific users or groups while defaulting to enabled for all users. [Commit](https://github.com/open-webui/open-webui/commit/f589b7c1895a6a77166c047891acfa21bc0936c4)
- 🎨 The model editor interface was refined with access control settings moved to a dedicated modal, group member counts now displayed when configuring permissions, a reorganized layout with improved visual hierarchy, and redesigned prompt suggestion cards with tooltips for field guidance. [Commit](https://github.com/open-webui/open-webui/commit/e65d92fc6f49da5ca059e1c65a729e7973354b99), [Commit](https://github.com/open-webui/open-webui/commit/9d39b9b42c653ee2acf2674b2df343ecbceb4954)
- 🏗️ Knowledge base file management was rebuilt with a dedicated database table replacing the previous JSON array storage, enabling pagination support for large knowledge bases, significantly faster file listing performance, and more reliable file-knowledge base relationship tracking. [Commit](https://github.com/open-webui/open-webui/commit/d19023288e2ca40f86e2dc3fd9f230540f3e70d7)
- ☁️ Azure Document Intelligence model selection was added, allowing administrators to specify which model to use for document processing via the "DOCUMENT_INTELLIGENCE_MODEL" environment variable or admin UI setting, with "prebuilt-layout" as the default. [#19692](https://github.com/open-webui/open-webui/pull/19692), [Docs:#872](https://github.com/open-webui/docs/pull/872)
- 🚀 Milvus multitenancy vector database performance was improved by removing manual flush calls after upsert operations, eliminating rate limit errors and reducing load on etcd and MinIO/S3 storage by allowing Milvus to manage segment persistence automatically via its WAL and auto-flush policies. [#19680](https://github.com/open-webui/open-webui/pull/19680)
- ✨ Various improvements were implemented across the frontend and backend to enhance performance, stability, and security.
- 🌍 Translations for German, French, Portuguese (Brazil), Catalan, Simplified Chinese, and Traditional Chinese were enhanced and expanded.
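The rate-limiting entry above describes a fixed window of 15 attempts per 3 minutes per email address, backed by Redis with an in-memory fallback. A minimal sketch of that pattern follows; it is illustrative only (the key name, client wiring, and helper function are assumptions, not the Open WebUI implementation):

```python
# Illustrative sketch, not the actual Open WebUI code.
import time

import redis

WINDOW_SECONDS = 3 * 60
MAX_ATTEMPTS = 15

_memory_counts: dict[str, tuple[int, float]] = {}  # email key -> (count, window start)


def signin_allowed(r: redis.Redis, email: str) -> bool:
    """Return True if this email may attempt another sign-in in the current window."""
    key = f"signin_attempts:{email.lower()}"
    try:
        count = r.incr(key)  # atomic per-email counter
        if count == 1:
            r.expire(key, WINDOW_SECONDS)  # start the 3-minute window
        return count <= MAX_ATTEMPTS
    except redis.RedisError:
        # Fallback when Redis is unavailable: process-local fixed window.
        count, started = _memory_counts.get(key, (0, time.time()))
        if time.time() - started > WINDOW_SECONDS:
            count, started = 0, time.time()
        _memory_counts[key] = (count + 1, started)
        return count + 1 <= MAX_ATTEMPTS
```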
### Fixed

- 🔄 Tool call response token duplication was fixed by removing redundant message history additions in non-native function calling mode, resolving an issue where tool results were included twice in the context and causing 2x token consumption. [#19656](https://github.com/open-webui/open-webui/issues/19656), [Commit](https://github.com/open-webui/open-webui/commit/52ccab8)
- 🛡️ Web search domain filtering was corrected to properly block results when any resolved hostname or IP address matches a blocked domain, preventing blocked sites from appearing in search results due to permissive hostname resolution logic that previously allowed results through if any single resolved address passed the filter (see the sketch after this list). [#19670](https://github.com/open-webui/open-webui/pull/19670), [#19669](https://github.com/open-webui/open-webui/issues/19669)
- 🧠 Custom models based on Ollama or OpenAI now properly inherit the connection type from their base model, ensuring they appear correctly in the "Local" or "External" model selection tabs instead of only appearing under "All". [#19183](https://github.com/open-webui/open-webui/issues/19183), [Commit](https://github.com/open-webui/open-webui/commit/39f7575)
- 🐍 SentenceTransformers embedding initialization was fixed by updating the transformers dependency to version 4.57.3, resolving a regression in v0.6.40 where document ingestion failed with "'NoneType' object has no attribute 'encode'" errors due to a bug in transformers 4.57.2. [#19512](https://github.com/open-webui/open-webui/issues/19512), [#19513](https://github.com/open-webui/open-webui/pull/19513)
- 📈 Active user count accuracy was significantly improved by replacing the socket-based USER_POOL tracking with a database-backed heartbeat mechanism, resolving long-standing issues where Redis deployments displayed inflated user counts due to stale sessions never being cleaned up on disconnect. [#16074](https://github.com/open-webui/open-webui/discussions/16074), [Commit](https://github.com/open-webui/open-webui/commit/70948f8803e417459d5203839f8077fdbfbbb213)
- 👥 Default group assignment now applies consistently across all user registration methods including OAuth/SSO, LDAP, and admin-created users, fixing an issue where the "DEFAULT_GROUP_ID" setting was only being applied to users who signed up via the email/password signup form. [#19685](https://github.com/open-webui/open-webui/pull/19685)
- 🔦 Model list filtering in workspaces was corrected to properly include models shared with user groups, ensuring members can view models they have write access to through group permissions. [#19461](https://github.com/open-webui/open-webui/issues/19461), [Commit](https://github.com/open-webui/open-webui/commit/69722ba973768a5f689f2e2351bf583a8db9bba8)
- 🖼️ User profile image display in preview contexts was fixed by resolving a Pydantic validation error that prevented proper rendering. [Commit](https://github.com/open-webui/open-webui/commit/c7eb7136893b0ddfdc5d55ffc7a05bd84a00f5d6)
- 🔒 Redis TLS connection failures were resolved by updating the python-socketio dependency to version 5.15.0, restoring support for the "rediss://" URL schema. [#19480](https://github.com/open-webui/open-webui/issues/19480), [#19488](https://github.com/open-webui/open-webui/pull/19488)
- 📝 MCP tool server configuration was corrected to properly handle the "Function Name Filter List" as both string and list types, preventing AttributeError when the field is empty and ensuring backward compatibility. [#19486](https://github.com/open-webui/open-webui/issues/19486), [Commit](https://github.com/open-webui/open-webui/commit/c5b73d71843edc024325d4a6e625ec939a747279), [Commit](https://github.com/open-webui/open-webui/commit/477097c2e42985c14892301d0127314629d07df1)
- 📎 Web page attachment failures causing TypeError on metadata checks were resolved by correcting async threadpool parameter passing in vector database operations. [#19493](https://github.com/open-webui/open-webui/issues/19493), [Commit](https://github.com/open-webui/open-webui/commit/4370dee79e19d77062c03fba81780cb3b779fca3)
- 💾 Model allowlist persistence in multi-worker deployments was fixed by implementing Redis-based shared state for the internal models dictionary, ensuring configuration changes are consistently visible across all worker processes. [#19395](https://github.com/open-webui/open-webui/issues/19395), [Commit](https://github.com/open-webui/open-webui/commit/b5e5617d7f7ad3e4eec9f15f4cc7f07cb5afc2fa)
- ⏳ Chat history infinite loading was prevented by enhancing the message data structure to properly track parent message relationships, resolving issues where missing parentId fields caused perpetual loading states. [#19225](https://github.com/open-webui/open-webui/issues/19225), [Commit](https://github.com/open-webui/open-webui/commit/ff4b1b9862d15adfa15eac17d2ce066c3d8ae38f)
- 🩹 Database migration robustness was improved by automatically detecting and correcting missing primary key constraints on the user table, ensuring successful schema upgrades for databases with non-standard configurations. [#19487](https://github.com/open-webui/open-webui/discussions/19487), [Commit](https://github.com/open-webui/open-webui/commit/453ea9b9a167c0b03d86c46e6efd086bf10056ce)
- 🏷️ OAuth group assignment now updates correctly on first login when users transition from admin to user role, ensuring group memberships reflect immediately when group management is enabled. [#19475](https://github.com/open-webui/open-webui/issues/19475), [#19476](https://github.com/open-webui/open-webui/pull/19476)
- 💡 Knowledge base file tooltips now properly display the parent collection name when referencing files with the hash symbol, preventing confusion between identically-named files in different collections. [#19491](https://github.com/open-webui/open-webui/issues/19491), [Commit](https://github.com/open-webui/open-webui/commit/3fe5a47b0ff84ac97f8e4ff56a19fa2ec065bf66)
- 🔐 Knowledge base file access inconsistencies were resolved where authorized non-admin users received "Not found" or permission errors for certain files due to race conditions during upload causing mismatched collection_name values, with file access validation now properly checking against knowledge base file associations. [#18689](https://github.com/open-webui/open-webui/issues/18689), [#19523](https://github.com/open-webui/open-webui/pull/19523), [Commit](https://github.com/open-webui/open-webui/commit/e301d1962e45900ababd3eabb7e9a2ad275a5761)
- 📦 The Knowledge API batch file addition endpoint was corrected to properly handle async operations, resolving 500 Internal Server Error responses when adding multiple files simultaneously. [#19538](https://github.com/open-webui/open-webui/issues/19538), [Commit](https://github.com/open-webui/open-webui/commit/28659f60d94feb4f6a99bb1a5b54d7f45e5ea10f)
- 🤖 Embedding model auto-update functionality was fixed to properly respect the "RAG_EMBEDDING_MODEL_AUTO_UPDATE" setting by correctly passing the flag to the model path resolver, ensuring models update as expected when the auto-update option is enabled. [#19687](https://github.com/open-webui/open-webui/pull/19687)
- 📉 API response payload sizes were dramatically reduced by removing base64-encoded profile images from most endpoints, eliminating multi-megabyte responses caused by high-resolution avatars and enabling better browser caching. [#19519](https://github.com/open-webui/open-webui/issues/19519), [Commit](https://github.com/open-webui/open-webui/commit/384753c4c17f62a68d38af4bbcf55a21ee08e0f2)
- 📞 Redundant API calls on the admin user overview page were eliminated by consolidating reactive statements, reducing four duplicate requests to a single efficient call and significantly improving page load performance. [#19509](https://github.com/open-webui/open-webui/issues/19509), [Commit](https://github.com/open-webui/open-webui/commit/9f89cc5e9f7e1c6c9e2bc91177e08df7c79f66f9)
- 🧹 Duplicate API calls on the workspace models page were eliminated by removing redundant model list fetching, reducing two identical requests to a single call and improving page responsiveness. [#19517](https://github.com/open-webui/open-webui/issues/19517), [Commit](https://github.com/open-webui/open-webui/commit/d1bbf6be7a4d1d53fa8ad46ca4f62fc4b2e6a8cb)
- 🔘 The model valves button was corrected to prevent unintended form submission by adding an explicit button type attribute, ensuring it no longer triggers message sending when the input area contains text. [#19534](https://github.com/open-webui/open-webui/pull/19534)
- 🗑️ Ollama model deletion was fixed by correcting the request payload format and ensuring the model selector properly displays the placeholder option. [Commit](https://github.com/open-webui/open-webui/commit/0f3156651c64bc5af188a65fc2908bdcecf30c74)
- 🎨 Image generation in temporary chats was fixed by correctly handling local chat sessions that are not persisted to the database. [Commit](https://github.com/open-webui/open-webui/commit/a7c7993bbf3a21cb7ba416525b89233cf2ad877f)
- 🕵️♂️ Audit logging was fixed by correctly awaiting the async user authentication call, resolving failures where coroutine objects were passed instead of user data. [#19658](https://github.com/open-webui/open-webui/pull/19658), [Commit](https://github.com/open-webui/open-webui/commit/dba86bc)
- 🌙 Dark mode select dropdown styling was corrected to use proper background colors, fixing an issue where dropdown borders and hover states appeared white instead of matching the dark theme. [#19693](https://github.com/open-webui/open-webui/pull/19693), [#19442](https://github.com/open-webui/open-webui/issues/19442)
- 🔍 Milvus vector database query filtering was fixed by correcting string quote handling in filter expressions and using the proper parameter name for queries, resolving false "duplicate content detected" errors that prevented uploading multiple files to knowledge bases. [#19602](https://github.com/open-webui/open-webui/pull/19602), [#18119](https://github.com/open-webui/open-webui/issues/18119), [#16345](https://github.com/open-webui/open-webui/issues/16345), [#17088](https://github.com/open-webui/open-webui/issues/17088), [#18485](https://github.com/open-webui/open-webui/issues/18485)
- 🆙 The Milvus multitenancy vector database was updated to use query_iterator() for improved robustness and consistency with the standard Milvus implementation, fixing the same false duplicate detection errors and improving handling of large result sets in multi-tenant deployments. [#19695](https://github.com/open-webui/open-webui/pull/19695)
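The domain-filtering fix above tightens the decision rule: a result is blocked as soon as any resolved hostname or address matches the blocklist, instead of being let through when a single resolved address passes. A rough illustration of that strict check (illustrative only; function name and matching rule are assumptions, not the project's filter code):

```python
# Illustrative sketch of the stricter "block if any candidate matches" rule.
import socket
from urllib.parse import urlparse


def is_blocked(url: str, blocked_domains: set[str]) -> bool:
    host = urlparse(url).hostname or ""
    candidates = {host}
    try:
        # Collect every address the hostname resolves to.
        for info in socket.getaddrinfo(host, None):
            candidates.add(info[4][0])
    except socket.gaierror:
        pass
    # Block as soon as one resolved candidate matches a blocked domain.
    return any(
        c == d or c.endswith("." + d) for c in candidates for d in blocked_domains
    )
```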
### Changed

- ⚠️ **IMPORTANT for Multi-Instance Deployments** — This release includes database schema changes; multi-worker, multi-server, or load-balanced deployments must update all instances simultaneously rather than performing rolling updates, as running mixed versions will cause application failures due to schema incompatibility between old and new instances.
- 👮 Channel creation is now restricted to administrators only, with the channel add button hidden for regular users to maintain organizational control over communication channels. [Commit](https://github.com/open-webui/open-webui/commit/421aba7cd7cd708168b1f2565026c74525a67905)
- ➖ The active user count indicator was removed from the bottom-left user menu in the sidebar to streamline the interface. [Commit](https://github.com/open-webui/open-webui/commit/848f3fd4d86ca66656e0ff0335773945af8d7d8d)
- 🗂️ The user table was restructured with API keys migrated to a dedicated table supporting future multi-key functionality, OAuth data storage converted to a JSON structure enabling multiple identity providers per user account, and internal column types optimized from TEXT to JSON for the "info" and "settings" fields, with automatic migration preserving all existing data and associations. [#19573](https://github.com/open-webui/open-webui/pull/19573)
- 🔄 The knowledge base API was restructured to support the new file relationship model.

## [0.6.40] - 2025-11-25

### Fixed
backend/open_webui/config.py

@@ -583,14 +583,16 @@ OAUTH_ROLES_CLAIM = PersistentConfig(
    os.environ.get("OAUTH_ROLES_CLAIM", "roles"),
)

SEP = os.environ.get("OAUTH_ROLES_SEPARATOR", ",")
OAUTH_ROLES_SEPARATOR = os.environ.get("OAUTH_ROLES_SEPARATOR", ",")

OAUTH_ALLOWED_ROLES = PersistentConfig(
    "OAUTH_ALLOWED_ROLES",
    "oauth.allowed_roles",
    [
        role.strip()
        for role in os.environ.get("OAUTH_ALLOWED_ROLES", f"user{SEP}admin").split(SEP)
        for role in os.environ.get(
            "OAUTH_ALLOWED_ROLES", f"user{OAUTH_ROLES_SEPARATOR}admin"
        ).split(OAUTH_ROLES_SEPARATOR)
        if role
    ],
)

@@ -600,7 +602,9 @@ OAUTH_ADMIN_ROLES = PersistentConfig(
    "oauth.admin_roles",
    [
        role.strip()
        for role in os.environ.get("OAUTH_ADMIN_ROLES", "admin").split(SEP)
        for role in os.environ.get("OAUTH_ADMIN_ROLES", "admin").split(
            OAUTH_ROLES_SEPARATOR
        )
        if role
    ],
)
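The two hunks above replace the module-local SEP with the new OAUTH_ROLES_SEPARATOR setting, so role lists are split on a configurable separator. A quick illustration of the parsing, using hypothetical environment values:

```python
import os

# Hypothetical values for illustration only.
os.environ["OAUTH_ROLES_SEPARATOR"] = ";"
os.environ["OAUTH_ALLOWED_ROLES"] = "user;admin;analyst"

OAUTH_ROLES_SEPARATOR = os.environ.get("OAUTH_ROLES_SEPARATOR", ",")
allowed = [
    role.strip()
    for role in os.environ.get(
        "OAUTH_ALLOWED_ROLES", f"user{OAUTH_ROLES_SEPARATOR}admin"
    ).split(OAUTH_ROLES_SEPARATOR)
    if role
]
print(allowed)  # ['user', 'admin', 'analyst']
```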
@@ -1443,10 +1447,18 @@ USER_PERMISSIONS_FEATURES_CODE_INTERPRETER = (
    == "true"
)

USER_PERMISSIONS_FEATURES_FOLDERS = (
    os.environ.get("USER_PERMISSIONS_FEATURES_FOLDERS", "True").lower() == "true"
)

USER_PERMISSIONS_FEATURES_NOTES = (
    os.environ.get("USER_PERMISSIONS_FEATURES_NOTES", "True").lower() == "true"
)

USER_PERMISSIONS_FEATURES_CHANNELS = (
    os.environ.get("USER_PERMISSIONS_FEATURES_CHANNELS", "True").lower() == "true"
)

USER_PERMISSIONS_FEATURES_API_KEYS = (
    os.environ.get("USER_PERMISSIONS_FEATURES_API_KEYS", "False").lower() == "true"
)

@@ -1499,12 +1511,16 @@ DEFAULT_USER_PERMISSIONS = {
        "temporary_enforced": USER_PERMISSIONS_CHAT_TEMPORARY_ENFORCED,
    },
    "features": {
        # General features
        "api_keys": USER_PERMISSIONS_FEATURES_API_KEYS,
        "notes": USER_PERMISSIONS_FEATURES_NOTES,
        "folders": USER_PERMISSIONS_FEATURES_FOLDERS,
        "channels": USER_PERMISSIONS_FEATURES_CHANNELS,
        "direct_tool_servers": USER_PERMISSIONS_FEATURES_DIRECT_TOOL_SERVERS,
        # Chat features
        "web_search": USER_PERMISSIONS_FEATURES_WEB_SEARCH,
        "image_generation": USER_PERMISSIONS_FEATURES_IMAGE_GENERATION,
        "code_interpreter": USER_PERMISSIONS_FEATURES_CODE_INTERPRETER,
        "notes": USER_PERMISSIONS_FEATURES_NOTES,
    },
}

@@ -1514,6 +1530,12 @@ USER_PERMISSIONS = PersistentConfig(
    DEFAULT_USER_PERMISSIONS,
)

ENABLE_FOLDERS = PersistentConfig(
    "ENABLE_FOLDERS",
    "folders.enable",
    os.environ.get("ENABLE_FOLDERS", "True").lower() == "true",
)

ENABLE_CHANNELS = PersistentConfig(
    "ENABLE_CHANNELS",
    "channels.enable",
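The new flags above all follow the same convention: a string environment variable compared case-insensitively against "true", with folders and channels defaulting to enabled. A small illustration of the parsing (the environment values are hypothetical):

```python
import os

# Hypothetical deployment settings for illustration.
os.environ["ENABLE_FOLDERS"] = "False"                     # hide folders globally
os.environ["USER_PERMISSIONS_FEATURES_CHANNELS"] = "true"  # any casing works

ENABLE_FOLDERS = os.environ.get("ENABLE_FOLDERS", "True").lower() == "true"
USER_PERMISSIONS_FEATURES_CHANNELS = (
    os.environ.get("USER_PERMISSIONS_FEATURES_CHANNELS", "True").lower() == "true"
)
print(ENABLE_FOLDERS, USER_PERMISSIONS_FEATURES_CHANNELS)  # False True
```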
@@ -2568,6 +2590,12 @@ DOCUMENT_INTELLIGENCE_KEY = PersistentConfig(
    os.getenv("DOCUMENT_INTELLIGENCE_KEY", ""),
)

DOCUMENT_INTELLIGENCE_MODEL = PersistentConfig(
    "DOCUMENT_INTELLIGENCE_MODEL",
    "rag.document_intelligence_model",
    os.getenv("DOCUMENT_INTELLIGENCE_MODEL", "prebuilt-layout"),
)

MISTRAL_OCR_API_BASE_URL = PersistentConfig(
    "MISTRAL_OCR_API_BASE_URL",
    "rag.MISTRAL_OCR_API_BASE_URL",
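As the changelog entry notes, the Azure Document Intelligence model is now configurable and defaults to "prebuilt-layout". A minimal illustration of overriding it through the environment (the chosen value is an example of another Azure prebuilt model, not a project requirement):

```python
import os

# Hypothetical override for illustration; must be set before the config is loaded.
os.environ["DOCUMENT_INTELLIGENCE_MODEL"] = "prebuilt-read"

model = os.getenv("DOCUMENT_INTELLIGENCE_MODEL", "prebuilt-layout")
print(model)  # "prebuilt-read"; falls back to "prebuilt-layout" when unset
```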
backend/open_webui/main.py

@@ -61,11 +61,11 @@ from open_webui.utils import logger
from open_webui.utils.audit import AuditLevel, AuditLoggingMiddleware
from open_webui.utils.logger import start_logger
from open_webui.socket.main import (
    MODELS,
    app as socket_app,
    periodic_usage_pool_cleanup,
    get_event_emitter,
    get_models_in_use,
    get_active_user_ids,
)
from open_webui.routers import (
    audio,

@@ -273,6 +273,7 @@ from open_webui.config import (
    DOCLING_PARAMS,
    DOCUMENT_INTELLIGENCE_ENDPOINT,
    DOCUMENT_INTELLIGENCE_KEY,
    DOCUMENT_INTELLIGENCE_MODEL,
    MISTRAL_OCR_API_BASE_URL,
    MISTRAL_OCR_API_KEY,
    RAG_TEXT_SPLITTER,

@@ -352,6 +353,7 @@ from open_webui.config import (
    ENABLE_API_KEYS,
    ENABLE_API_KEYS_ENDPOINT_RESTRICTIONS,
    API_KEYS_ALLOWED_ENDPOINTS,
    ENABLE_FOLDERS,
    ENABLE_CHANNELS,
    ENABLE_NOTES,
    ENABLE_COMMUNITY_SHARING,

@@ -767,6 +769,7 @@ app.state.config.WEBHOOK_URL = WEBHOOK_URL
app.state.config.BANNERS = WEBUI_BANNERS


app.state.config.ENABLE_FOLDERS = ENABLE_FOLDERS
app.state.config.ENABLE_CHANNELS = ENABLE_CHANNELS
app.state.config.ENABLE_NOTES = ENABLE_NOTES
app.state.config.ENABLE_COMMUNITY_SHARING = ENABLE_COMMUNITY_SHARING

@@ -869,6 +872,7 @@ app.state.config.DOCLING_API_KEY = DOCLING_API_KEY
app.state.config.DOCLING_PARAMS = DOCLING_PARAMS
app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT = DOCUMENT_INTELLIGENCE_ENDPOINT
app.state.config.DOCUMENT_INTELLIGENCE_KEY = DOCUMENT_INTELLIGENCE_KEY
app.state.config.DOCUMENT_INTELLIGENCE_MODEL = DOCUMENT_INTELLIGENCE_MODEL
app.state.config.MISTRAL_OCR_API_BASE_URL = MISTRAL_OCR_API_BASE_URL
app.state.config.MISTRAL_OCR_API_KEY = MISTRAL_OCR_API_KEY
app.state.config.MINERU_API_MODE = MINERU_API_MODE

@@ -980,9 +984,7 @@ app.state.YOUTUBE_LOADER_TRANSLATION = None

try:
    app.state.ef = get_ef(
        app.state.config.RAG_EMBEDDING_ENGINE,
        app.state.config.RAG_EMBEDDING_MODEL,
        RAG_EMBEDDING_MODEL_AUTO_UPDATE,
        app.state.config.RAG_EMBEDDING_ENGINE, app.state.config.RAG_EMBEDDING_MODEL
    )
    if (
        app.state.config.ENABLE_RAG_HYBRID_SEARCH

@@ -993,7 +995,6 @@ try:
            app.state.config.RAG_RERANKING_MODEL,
            app.state.config.RAG_EXTERNAL_RERANKER_URL,
            app.state.config.RAG_EXTERNAL_RERANKER_API_KEY,
            RAG_RERANKING_MODEL_AUTO_UPDATE,
        )
    else:
        app.state.rf = None

@@ -1215,7 +1216,7 @@ app.state.config.VOICE_MODE_PROMPT_TEMPLATE = VOICE_MODE_PROMPT_TEMPLATE
#
########################################

app.state.MODELS = {}
app.state.MODELS = MODELS

# Add the middleware to the app
if ENABLE_COMPRESSION_MIDDLEWARE:

@@ -1575,6 +1576,7 @@ async def chat_completion(
        "user_id": user.id,
        "chat_id": form_data.pop("chat_id", None),
        "message_id": form_data.pop("id", None),
        "parent_message_id": form_data.pop("parent_id", None),
        "session_id": form_data.pop("session_id", None),
        "filter_ids": form_data.pop("filter_ids", []),
        "tool_ids": form_data.get("tool_ids", None),

@@ -1631,6 +1633,7 @@ async def chat_completion(
                metadata["chat_id"],
                metadata["message_id"],
                {
                    "parentId": metadata.get("parent_message_id", None),
                    "model": model_id,
                },
            )

@@ -1663,6 +1666,7 @@ async def chat_completion(
                metadata["chat_id"],
                metadata["message_id"],
                {
                    "parentId": metadata.get("parent_message_id", None),
                    "error": {"content": str(e)},
                },
            )

@@ -1842,6 +1846,7 @@ async def get_app_config(request: Request):
            **(
                {
                    "enable_direct_connections": app.state.config.ENABLE_DIRECT_CONNECTIONS,
                    "enable_folders": app.state.config.ENABLE_FOLDERS,
                    "enable_channels": app.state.config.ENABLE_CHANNELS,
                    "enable_notes": app.state.config.ENABLE_NOTES,
                    "enable_web_search": app.state.config.ENABLE_WEB_SEARCH,
@@ -2014,7 +2019,10 @@ async def get_current_usage(user=Depends(get_verified_user)):
    This is an experimental endpoint and subject to change.
    """
    try:
        return {"model_ids": get_models_in_use(), "user_ids": get_active_user_ids()}
        return {
            "model_ids": get_models_in_use(),
            "user_count": Users.get_active_user_count(),
        }
    except Exception as e:
        log.error(f"Error getting usage statistics: {e}")
        raise HTTPException(status_code=500, detail="Internal Server Error")
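The hunk above switches the usage endpoint from the socket-based active-user list to Users.get_active_user_count(), in line with the changelog's database-backed heartbeat fix. A rough sketch of what a heartbeat-style count can look like; the column name, window, and helper are assumptions for illustration, not the project's exact query:

```python
# Illustrative sketch, assuming the user table keeps a last_active_at epoch column
# that clients refresh periodically; "active" here means seen in the last 3 minutes.
import time

from sqlalchemy import func, select

ACTIVE_WINDOW_SECONDS = 3 * 60


def count_active_users(db, user_table) -> int:
    cutoff = int(time.time()) - ACTIVE_WINDOW_SECONDS
    stmt = select(func.count()).where(user_table.c.last_active_at >= cutoff)
    return db.execute(stmt).scalar_one()
```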
@@ -2077,7 +2085,7 @@ except Exception as e:
    )


async def register_client(self, request, client_id: str) -> bool:
async def register_client(request, client_id: str) -> bool:
    server_type, server_id = client_id.split(":", 1)

    connection = None

@@ -0,0 +1,103 @@
"""Update messages and channel member table

Revision ID: 2f1211949ecc
Revises: 37f288994c47
Create Date: 2025-11-27 03:07:56.200231

"""

from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
import open_webui.internal.db


# revision identifiers, used by Alembic.
revision: str = "2f1211949ecc"
down_revision: Union[str, None] = "37f288994c47"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    # New columns to be added to channel_member table
    op.add_column("channel_member", sa.Column("status", sa.Text(), nullable=True))
    op.add_column(
        "channel_member",
        sa.Column(
            "is_active",
            sa.Boolean(),
            nullable=False,
            default=True,
            server_default=sa.sql.expression.true(),
        ),
    )

    op.add_column(
        "channel_member",
        sa.Column(
            "is_channel_muted",
            sa.Boolean(),
            nullable=False,
            default=False,
            server_default=sa.sql.expression.false(),
        ),
    )
    op.add_column(
        "channel_member",
        sa.Column(
            "is_channel_pinned",
            sa.Boolean(),
            nullable=False,
            default=False,
            server_default=sa.sql.expression.false(),
        ),
    )

    op.add_column("channel_member", sa.Column("data", sa.JSON(), nullable=True))
    op.add_column("channel_member", sa.Column("meta", sa.JSON(), nullable=True))

    op.add_column(
        "channel_member", sa.Column("joined_at", sa.BigInteger(), nullable=False)
    )
    op.add_column(
        "channel_member", sa.Column("left_at", sa.BigInteger(), nullable=True)
    )

    op.add_column(
        "channel_member", sa.Column("last_read_at", sa.BigInteger(), nullable=True)
    )

    op.add_column(
        "channel_member", sa.Column("updated_at", sa.BigInteger(), nullable=True)
    )

    # New columns to be added to message table
    op.add_column(
        "message",
        sa.Column(
            "is_pinned",
            sa.Boolean(),
            nullable=False,
            default=False,
            server_default=sa.sql.expression.false(),
        ),
    )
    op.add_column("message", sa.Column("pinned_at", sa.BigInteger(), nullable=True))
    op.add_column("message", sa.Column("pinned_by", sa.Text(), nullable=True))


def downgrade() -> None:
    op.drop_column("channel_member", "updated_at")
    op.drop_column("channel_member", "last_read_at")

    op.drop_column("channel_member", "meta")
    op.drop_column("channel_member", "data")

    op.drop_column("channel_member", "is_channel_pinned")
    op.drop_column("channel_member", "is_channel_muted")

    op.drop_column("message", "pinned_by")
    op.drop_column("message", "pinned_at")
    op.drop_column("message", "is_pinned")
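The migration above adds is_pinned, pinned_at, and pinned_by to the message table, which backs the message-pinning feature in the changelog. A rough sketch of how a pin operation could populate those columns; table object, session wiring, and the helper name are assumptions, not the project's Messages model code:

```python
# Illustrative sketch only.
import time

import sqlalchemy as sa


def pin_message(db, message_table: sa.Table, message_id: str, user_id: str) -> None:
    # Record who pinned the message and when, using epoch seconds as elsewhere in the schema.
    db.execute(
        sa.update(message_table)
        .where(message_table.c.id == message_id)
        .values(is_pinned=True, pinned_at=int(time.time()), pinned_by=user_id)
    )
    db.commit()
```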
@@ -20,18 +20,46 @@ depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    # Ensure 'id' column in 'user' table is unique and primary key (ForeignKey constraint)
    inspector = sa.inspect(op.get_bind())
    columns = inspector.get_columns("user")

    pk_columns = inspector.get_pk_constraint("user")["constrained_columns"]
    id_column = next((col for col in columns if col["name"] == "id"), None)

    if id_column and not id_column.get("unique", False):
        unique_constraints = inspector.get_unique_constraints("user")
        unique_columns = {tuple(u["column_names"]) for u in unique_constraints}

        with op.batch_alter_table("user") as batch_op:
            # If primary key is wrong, drop it
            if pk_columns and pk_columns != ["id"]:
                batch_op.drop_constraint(
                    inspector.get_pk_constraint("user")["name"], type_="primary"
                )

            # Add unique constraint if missing
            if ("id",) not in unique_columns:
                batch_op.create_unique_constraint("uq_user_id", ["id"])

            # Re-create correct primary key
            batch_op.create_primary_key("pk_user_id", ["id"])

    # Create oauth_session table
    op.create_table(
        "oauth_session",
        sa.Column("id", sa.Text(), nullable=False),
        sa.Column("user_id", sa.Text(), nullable=False),
        sa.Column("id", sa.Text(), primary_key=True, nullable=False, unique=True),
        sa.Column(
            "user_id",
            sa.Text(),
            sa.ForeignKey("user.id", ondelete="CASCADE"),
            nullable=False,
        ),
        sa.Column("provider", sa.Text(), nullable=False),
        sa.Column("token", sa.Text(), nullable=False),
        sa.Column("expires_at", sa.BigInteger(), nullable=False),
        sa.Column("created_at", sa.BigInteger(), nullable=False),
        sa.Column("updated_at", sa.BigInteger(), nullable=False),
        sa.PrimaryKeyConstraint("id"),
        sa.ForeignKeyConstraint(["user_id"], ["user.id"], ondelete="CASCADE"),
    )

    # Create indexes for better performance
@@ -0,0 +1,169 @@
"""Add knowledge_file table

Revision ID: 3e0e00844bb0
Revises: 90ef40d4714e
Create Date: 2025-12-02 06:54:19.401334

"""

from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
from sqlalchemy import inspect
import open_webui.internal.db

import time
import json
import uuid

# revision identifiers, used by Alembic.
revision: str = "3e0e00844bb0"
down_revision: Union[str, None] = "90ef40d4714e"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    op.create_table(
        "knowledge_file",
        sa.Column("id", sa.Text(), primary_key=True),
        sa.Column("user_id", sa.Text(), nullable=False),
        sa.Column(
            "knowledge_id",
            sa.Text(),
            sa.ForeignKey("knowledge.id", ondelete="CASCADE"),
            nullable=False,
        ),
        sa.Column(
            "file_id",
            sa.Text(),
            sa.ForeignKey("file.id", ondelete="CASCADE"),
            nullable=False,
        ),
        sa.Column("created_at", sa.BigInteger(), nullable=False),
        sa.Column("updated_at", sa.BigInteger(), nullable=False),
        # indexes
        sa.Index("ix_knowledge_file_knowledge_id", "knowledge_id"),
        sa.Index("ix_knowledge_file_file_id", "file_id"),
        sa.Index("ix_knowledge_file_user_id", "user_id"),
        # unique constraints
        sa.UniqueConstraint(
            "knowledge_id", "file_id", name="uq_knowledge_file_knowledge_file"
        ),  # prevent duplicate entries
    )

    connection = op.get_bind()

    # 2. Read existing knowledge entries with their file_ids JSON data column
    knowledge_table = sa.Table(
        "knowledge",
        sa.MetaData(),
        sa.Column("id", sa.Text()),
        sa.Column("user_id", sa.Text()),
        sa.Column("data", sa.JSON()),  # JSON stored as text in SQLite + PG
    )

    results = connection.execute(
        sa.select(
            knowledge_table.c.id, knowledge_table.c.user_id, knowledge_table.c.data
        )
    ).fetchall()

    # 3. Insert file relationships into the knowledge_file table
    kf_table = sa.Table(
        "knowledge_file",
        sa.MetaData(),
        sa.Column("id", sa.Text()),
        sa.Column("user_id", sa.Text()),
        sa.Column("knowledge_id", sa.Text()),
        sa.Column("file_id", sa.Text()),
        sa.Column("created_at", sa.BigInteger()),
        sa.Column("updated_at", sa.BigInteger()),
    )

    file_table = sa.Table(
        "file",
        sa.MetaData(),
        sa.Column("id", sa.Text()),
    )

    now = int(time.time())
    for knowledge_id, user_id, data in results:
        if not data:
            continue

        if isinstance(data, str):
            try:
                data = json.loads(data)
            except Exception:
                continue  # skip invalid JSON

        if not isinstance(data, dict):
            continue

        file_ids = data.get("file_ids", [])

        for file_id in file_ids:
            file_exists = connection.execute(
                sa.select(file_table.c.id).where(file_table.c.id == file_id)
            ).fetchone()

            if not file_exists:
                continue  # skip non-existing files

            row = {
                "id": str(uuid.uuid4()),
                "user_id": user_id,
                "knowledge_id": knowledge_id,
                "file_id": file_id,
                "created_at": now,
                "updated_at": now,
            }
            connection.execute(kf_table.insert().values(**row))

    with op.batch_alter_table("knowledge") as batch:
        batch.drop_column("data")


def downgrade() -> None:
    # 1. Add back the old data column
    op.add_column("knowledge", sa.Column("data", sa.JSON(), nullable=True))

    connection = op.get_bind()

    # 2. Read knowledge_file entries and reconstruct data JSON
    knowledge_table = sa.Table(
        "knowledge",
        sa.MetaData(),
        sa.Column("id", sa.Text()),
        sa.Column("data", sa.JSON()),
    )

    kf_table = sa.Table(
        "knowledge_file",
        sa.MetaData(),
        sa.Column("id", sa.Text()),
        sa.Column("knowledge_id", sa.Text()),
        sa.Column("file_id", sa.Text()),
    )

    results = connection.execute(sa.select(knowledge_table.c.id)).fetchall()

    for (knowledge_id,) in results:
        file_ids = connection.execute(
            sa.select(kf_table.c.file_id).where(kf_table.c.knowledge_id == knowledge_id)
        ).fetchall()

        file_ids_list = [fid for (fid,) in file_ids]

        data_json = {"file_ids": file_ids_list}

        connection.execute(
            knowledge_table.update()
            .where(knowledge_table.c.id == knowledge_id)
            .values(data=data_json)
        )

    # 3. Drop the knowledge_file table
    op.drop_table("knowledge_file")
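With the file list stored relationally (see the migration above), a knowledge base's files can be listed with an ordinary paginated query instead of decoding a JSON array. A minimal illustration against the new table; the helper and the reflected Table object are assumptions, not the project's Knowledges model code:

```python
# Illustrative sketch only; assumes a knowledge_file Table matching the schema above.
import sqlalchemy as sa


def list_file_ids(db, knowledge_file: sa.Table, knowledge_id: str, limit=50, offset=0):
    stmt = (
        sa.select(knowledge_file.c.file_id)
        .where(knowledge_file.c.knowledge_id == knowledge_id)
        .order_by(knowledge_file.c.created_at)
        .limit(limit)
        .offset(offset)
    )
    return [row.file_id for row in db.execute(stmt)]
```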
@@ -0,0 +1,81 @@
"""Update channel and channel members table

Revision ID: 90ef40d4714e
Revises: b10670c03dd5
Create Date: 2025-11-30 06:33:38.790341

"""

from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
import open_webui.internal.db


# revision identifiers, used by Alembic.
revision: str = "90ef40d4714e"
down_revision: Union[str, None] = "b10670c03dd5"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    # Update 'channel' table
    op.add_column("channel", sa.Column("is_private", sa.Boolean(), nullable=True))

    op.add_column("channel", sa.Column("archived_at", sa.BigInteger(), nullable=True))
    op.add_column("channel", sa.Column("archived_by", sa.Text(), nullable=True))

    op.add_column("channel", sa.Column("deleted_at", sa.BigInteger(), nullable=True))
    op.add_column("channel", sa.Column("deleted_by", sa.Text(), nullable=True))

    op.add_column("channel", sa.Column("updated_by", sa.Text(), nullable=True))

    # Update 'channel_member' table
    op.add_column("channel_member", sa.Column("role", sa.Text(), nullable=True))
    op.add_column("channel_member", sa.Column("invited_by", sa.Text(), nullable=True))
    op.add_column(
        "channel_member", sa.Column("invited_at", sa.BigInteger(), nullable=True)
    )

    # Create 'channel_webhook' table
    op.create_table(
        "channel_webhook",
        sa.Column("id", sa.Text(), primary_key=True, unique=True, nullable=False),
        sa.Column("user_id", sa.Text(), nullable=False),
        sa.Column(
            "channel_id",
            sa.Text(),
            sa.ForeignKey("channel.id", ondelete="CASCADE"),
            nullable=False,
        ),
        sa.Column("name", sa.Text(), nullable=False),
        sa.Column("profile_image_url", sa.Text(), nullable=True),
        sa.Column("token", sa.Text(), nullable=False),
        sa.Column("last_used_at", sa.BigInteger(), nullable=True),
        sa.Column("created_at", sa.BigInteger(), nullable=False),
        sa.Column("updated_at", sa.BigInteger(), nullable=False),
    )

    pass


def downgrade() -> None:
    # Downgrade 'channel' table
    op.drop_column("channel", "is_private")
    op.drop_column("channel", "archived_at")
    op.drop_column("channel", "archived_by")
    op.drop_column("channel", "deleted_at")
    op.drop_column("channel", "deleted_by")
    op.drop_column("channel", "updated_by")

    # Downgrade 'channel_member' table
    op.drop_column("channel_member", "role")
    op.drop_column("channel_member", "invited_by")
    op.drop_column("channel_member", "invited_at")

    # Drop 'channel_webhook' table
    op.drop_table("channel_webhook")

    pass
@ -0,0 +1,251 @@
|
|||
"""Update user table
|
||||
|
||||
Revision ID: b10670c03dd5
|
||||
Revises: 2f1211949ecc
|
||||
Create Date: 2025-11-28 04:55:31.737538
|
||||
|
||||
"""
|
||||
|
||||
from typing import Sequence, Union
|
||||
|
||||
from alembic import op
|
||||
import sqlalchemy as sa
|
||||
|
||||
|
||||
import open_webui.internal.db
|
||||
import json
|
||||
import time
|
||||
|
||||
# revision identifiers, used by Alembic.
|
||||
revision: str = "b10670c03dd5"
|
||||
down_revision: Union[str, None] = "2f1211949ecc"
|
||||
branch_labels: Union[str, Sequence[str], None] = None
|
||||
depends_on: Union[str, Sequence[str], None] = None
|
||||
|
||||
|
||||
def _drop_sqlite_indexes_for_column(table_name, column_name, conn):
|
||||
"""
|
||||
SQLite requires manual removal of any indexes referencing a column
|
||||
before ALTER TABLE ... DROP COLUMN can succeed.
|
||||
"""
|
||||
indexes = conn.execute(sa.text(f"PRAGMA index_list('{table_name}')")).fetchall()
|
||||
|
||||
for idx in indexes:
|
||||
index_name = idx[1] # index name
|
||||
# Get indexed columns
|
||||
idx_info = conn.execute(
|
||||
sa.text(f"PRAGMA index_info('{index_name}')")
|
||||
).fetchall()
|
||||
|
||||
indexed_cols = [row[2] for row in idx_info] # col names
|
||||
if column_name in indexed_cols:
|
||||
conn.execute(sa.text(f"DROP INDEX IF EXISTS {index_name}"))
|
||||
|
||||
|
||||
def _convert_column_to_json(table: str, column: str):
|
||||
conn = op.get_bind()
|
||||
dialect = conn.dialect.name
|
||||
|
||||
# SQLite cannot ALTER COLUMN → must recreate column
|
||||
if dialect == "sqlite":
|
||||
# 1. Add temporary column
|
||||
op.add_column(table, sa.Column(f"{column}_json", sa.JSON(), nullable=True))
|
||||
|
||||
# 2. Load old data
|
||||
rows = conn.execute(sa.text(f'SELECT id, {column} FROM "{table}"')).fetchall()
|
||||
|
||||
for row in rows:
|
||||
uid, raw = row
|
||||
if raw is None:
|
||||
parsed = None
|
||||
else:
|
||||
try:
|
||||
parsed = json.loads(raw)
|
||||
except Exception:
|
||||
parsed = None # fallback safe behavior
|
||||
|
||||
conn.execute(
|
||||
sa.text(f'UPDATE "{table}" SET {column}_json = :val WHERE id = :id'),
|
||||
{"val": json.dumps(parsed) if parsed else None, "id": uid},
|
||||
)
|
||||
|
||||
# 3. Drop old TEXT column
|
||||
op.drop_column(table, column)
|
||||
|
||||
# 4. Rename new JSON column → original name
|
||||
op.alter_column(table, f"{column}_json", new_column_name=column)
|
||||
|
||||
else:
|
||||
# PostgreSQL supports direct CAST
|
||||
op.alter_column(
|
||||
table,
|
||||
column,
|
||||
type_=sa.JSON(),
|
||||
postgresql_using=f"{column}::json",
|
||||
)
|
||||
|
||||
|
||||
def _convert_column_to_text(table: str, column: str):
|
||||
conn = op.get_bind()
|
||||
dialect = conn.dialect.name
|
||||
|
||||
if dialect == "sqlite":
|
||||
op.add_column(table, sa.Column(f"{column}_text", sa.Text(), nullable=True))
|
||||
|
||||
rows = conn.execute(sa.text(f'SELECT id, {column} FROM "{table}"')).fetchall()
|
||||
|
||||
for uid, raw in rows:
|
||||
conn.execute(
|
||||
sa.text(f'UPDATE "{table}" SET {column}_text = :val WHERE id = :id'),
|
||||
{"val": json.dumps(raw) if raw else None, "id": uid},
|
||||
)
|
||||
|
||||
op.drop_column(table, column)
|
||||
op.alter_column(table, f"{column}_text", new_column_name=column)
|
||||
|
||||
else:
|
||||
op.alter_column(
|
||||
table,
|
||||
column,
|
||||
type_=sa.Text(),
|
||||
postgresql_using=f"to_json({column})::text",
|
||||
)
|
||||
|
||||
|
||||
def upgrade() -> None:
|
||||
op.add_column(
|
||||
"user", sa.Column("profile_banner_image_url", sa.Text(), nullable=True)
|
||||
)
|
||||
op.add_column("user", sa.Column("timezone", sa.String(), nullable=True))
|
||||
|
||||
op.add_column("user", sa.Column("presence_state", sa.String(), nullable=True))
|
||||
op.add_column("user", sa.Column("status_emoji", sa.String(), nullable=True))
|
||||
op.add_column("user", sa.Column("status_message", sa.Text(), nullable=True))
|
||||
op.add_column(
|
||||
"user", sa.Column("status_expires_at", sa.BigInteger(), nullable=True)
|
||||
)
|
||||
|
||||
op.add_column("user", sa.Column("oauth", sa.JSON(), nullable=True))
|
||||
|
||||
# Convert info (TEXT/JSONField) → JSON
|
||||
_convert_column_to_json("user", "info")
|
||||
# Convert settings (TEXT/JSONField) → JSON
|
||||
_convert_column_to_json("user", "settings")
|
||||
|
||||
op.create_table(
|
||||
"api_key",
|
||||
sa.Column("id", sa.Text(), primary_key=True, unique=True),
|
||||
sa.Column("user_id", sa.Text(), sa.ForeignKey("user.id", ondelete="CASCADE")),
|
||||
sa.Column("key", sa.Text(), unique=True, nullable=False),
|
||||
sa.Column("data", sa.JSON(), nullable=True),
|
||||
sa.Column("expires_at", sa.BigInteger(), nullable=True),
|
||||
sa.Column("last_used_at", sa.BigInteger(), nullable=True),
|
||||
sa.Column("created_at", sa.BigInteger(), nullable=False),
|
||||
sa.Column("updated_at", sa.BigInteger(), nullable=False),
|
||||
)
|
||||
|
||||
conn = op.get_bind()
|
||||
users = conn.execute(
|
||||
sa.text('SELECT id, oauth_sub FROM "user" WHERE oauth_sub IS NOT NULL')
|
||||
).fetchall()
|
||||
|
||||
for uid, oauth_sub in users:
|
||||
if oauth_sub:
|
||||
# Example formats supported:
|
||||
# provider@sub
|
||||
# plain sub (stored as {"oidc": {"sub": sub}})
|
||||
if "@" in oauth_sub:
|
||||
provider, sub = oauth_sub.split("@", 1)
|
||||
else:
|
||||
provider, sub = "oidc", oauth_sub
|
||||
|
||||
oauth_json = json.dumps({provider: {"sub": sub}})
|
||||
conn.execute(
|
||||
sa.text('UPDATE "user" SET oauth = :oauth WHERE id = :id'),
|
||||
{"oauth": oauth_json, "id": uid},
|
||||
)
|
||||
|
||||
users_with_keys = conn.execute(
|
||||
sa.text('SELECT id, api_key FROM "user" WHERE api_key IS NOT NULL')
|
||||
).fetchall()
|
||||
now = int(time.time())
|
||||
|
||||
for uid, api_key in users_with_keys:
|
||||
if api_key:
|
||||
conn.execute(
|
||||
sa.text(
|
||||
"""
|
||||
INSERT INTO api_key (id, user_id, key, created_at, updated_at)
|
||||
VALUES (:id, :user_id, :key, :created_at, :updated_at)
|
||||
"""
|
||||
),
|
||||
{
|
||||
"id": f"key_{uid}",
|
||||
"user_id": uid,
|
||||
"key": api_key,
|
||||
"created_at": now,
|
||||
"updated_at": now,
|
||||
},
|
||||
)
|
||||
|
||||
if conn.dialect.name == "sqlite":
|
||||
_drop_sqlite_indexes_for_column("user", "api_key", conn)
|
||||
_drop_sqlite_indexes_for_column("user", "oauth_sub", conn)
|
||||
|
||||
with op.batch_alter_table("user") as batch_op:
|
||||
batch_op.drop_column("api_key")
|
||||
batch_op.drop_column("oauth_sub")
|
||||
|
||||
|
||||
def downgrade() -> None:
|
||||
# --- 1. Restore old oauth_sub column ---
|
||||
op.add_column("user", sa.Column("oauth_sub", sa.Text(), nullable=True))
|
||||
|
||||
conn = op.get_bind()
|
||||
users = conn.execute(
|
||||
sa.text('SELECT id, oauth FROM "user" WHERE oauth IS NOT NULL')
|
||||
).fetchall()
|
||||
|
||||
for uid, oauth in users:
|
||||
try:
|
||||
data = json.loads(oauth)
|
||||
provider = list(data.keys())[0]
|
||||
sub = data[provider].get("sub")
|
||||
oauth_sub = f"{provider}@{sub}"
|
||||
except Exception:
|
||||
oauth_sub = None
|
||||
|
||||
conn.execute(
|
||||
sa.text('UPDATE "user" SET oauth_sub = :oauth_sub WHERE id = :id'),
|
||||
{"oauth_sub": oauth_sub, "id": uid},
|
||||
)
|
||||
|
||||
op.drop_column("user", "oauth")
|
||||
|
||||
# --- 2. Restore api_key field ---
|
||||
op.add_column("user", sa.Column("api_key", sa.String(), nullable=True))
|
||||
|
||||
# Restore values from api_key
|
||||
keys = conn.execute(sa.text("SELECT user_id, key FROM api_key")).fetchall()
|
||||
for uid, key in keys:
|
||||
conn.execute(
|
||||
sa.text('UPDATE "user" SET api_key = :key WHERE id = :id'),
|
||||
{"key": key, "id": uid},
|
||||
)
|
||||
|
||||
# Drop new table
|
||||
op.drop_table("api_key")
|
||||
|
||||
with op.batch_alter_table("user") as batch_op:
|
||||
batch_op.drop_column("profile_banner_image_url")
|
||||
batch_op.drop_column("timezone")
|
||||
|
||||
batch_op.drop_column("presence_state")
|
||||
batch_op.drop_column("status_emoji")
|
||||
batch_op.drop_column("status_message")
|
||||
batch_op.drop_column("status_expires_at")
|
||||
|
||||
# Convert info (JSON) → TEXT
|
||||
_convert_column_to_text("user", "info")
|
||||
# Convert settings (JSON) → TEXT
|
||||
_convert_column_to_text("user", "settings")
|
||||
|
|
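
# Illustrative sketch (not part of the migration above): a standalone restatement of the
# mapping the data migration applies when moving the legacy `oauth_sub` column into the
# new `oauth` JSON column. The example values are placeholders.
import json


def legacy_oauth_sub_to_oauth(oauth_sub: str) -> str:
    """Map 'provider@sub' (or a bare sub) to the new {provider: {"sub": sub}} JSON."""
    if "@" in oauth_sub:
        provider, sub = oauth_sub.split("@", 1)
    else:
        provider, sub = "oidc", oauth_sub
    return json.dumps({provider: {"sub": sub}})


assert legacy_oauth_sub_to_oauth("google@12345") == '{"google": {"sub": "12345"}}'
assert legacy_oauth_sub_to_oauth("12345") == '{"oidc": {"sub": "12345"}}'
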
@ -3,7 +3,7 @@ import uuid
|
|||
from typing import Optional
|
||||
|
||||
from open_webui.internal.db import Base, get_db
|
||||
from open_webui.models.users import UserModel, Users
|
||||
from open_webui.models.users import UserModel, UserProfileImageResponse, Users
|
||||
from open_webui.env import SRC_LOG_LEVELS
|
||||
from pydantic import BaseModel
|
||||
from sqlalchemy import Boolean, Column, String, Text
|
||||
|
|
@ -46,15 +46,7 @@ class ApiKey(BaseModel):
|
|||
api_key: Optional[str] = None
|
||||
|
||||
|
||||
class UserResponse(BaseModel):
|
||||
id: str
|
||||
email: str
|
||||
name: str
|
||||
role: str
|
||||
profile_image_url: str
|
||||
|
||||
|
||||
class SigninResponse(Token, UserResponse):
|
||||
class SigninResponse(Token, UserProfileImageResponse):
|
||||
pass
|
||||
|
||||
|
||||
|
|
@ -96,7 +88,7 @@ class AuthsTable:
|
|||
name: str,
|
||||
profile_image_url: str = "/user.png",
|
||||
role: str = "pending",
|
||||
oauth_sub: Optional[str] = None,
|
||||
oauth: Optional[dict] = None,
|
||||
) -> Optional[UserModel]:
|
||||
with get_db() as db:
|
||||
log.info("insert_new_auth")
|
||||
|
|
@ -110,7 +102,7 @@ class AuthsTable:
|
|||
db.add(result)
|
||||
|
||||
user = Users.insert_new_user(
|
||||
id, name, email, profile_image_url, role, oauth_sub
|
||||
id, name, email, profile_image_url, role, oauth=oauth
|
||||
)
|
||||
|
||||
db.commit()
|
||||
|
|
|
|||
|
|
@ -4,10 +4,13 @@ import uuid
|
|||
from typing import Optional
|
||||
|
||||
from open_webui.internal.db import Base, get_db
|
||||
from open_webui.utils.access_control import has_access
|
||||
from open_webui.models.groups import Groups
|
||||
|
||||
from pydantic import BaseModel, ConfigDict
|
||||
from sqlalchemy import BigInteger, Boolean, Column, String, Text, JSON
|
||||
from sqlalchemy.dialects.postgresql import JSONB
|
||||
|
||||
|
||||
from sqlalchemy import BigInteger, Boolean, Column, String, Text, JSON, case, cast
|
||||
from sqlalchemy import or_, func, select, and_, text
|
||||
from sqlalchemy.sql import exists
|
||||
|
||||
|
|
@ -26,12 +29,23 @@ class Channel(Base):
|
|||
name = Column(Text)
|
||||
description = Column(Text, nullable=True)
|
||||
|
||||
# Used to indicate if the channel is private (for 'group' type channels)
|
||||
is_private = Column(Boolean, nullable=True)
|
||||
|
||||
data = Column(JSON, nullable=True)
|
||||
meta = Column(JSON, nullable=True)
|
||||
access_control = Column(JSON, nullable=True)
|
||||
|
||||
created_at = Column(BigInteger)
|
||||
|
||||
updated_at = Column(BigInteger)
|
||||
updated_by = Column(Text, nullable=True)
|
||||
|
||||
archived_at = Column(BigInteger, nullable=True)
|
||||
archived_by = Column(Text, nullable=True)
|
||||
|
||||
deleted_at = Column(BigInteger, nullable=True)
|
||||
deleted_by = Column(Text, nullable=True)
|
||||
|
||||
|
||||
class ChannelModel(BaseModel):
|
||||
|
|
@ -39,17 +53,122 @@ class ChannelModel(BaseModel):
|
|||
|
||||
id: str
|
||||
user_id: str
|
||||
|
||||
type: Optional[str] = None
|
||||
|
||||
name: str
|
||||
description: Optional[str] = None
|
||||
|
||||
is_private: Optional[bool] = None
|
||||
|
||||
data: Optional[dict] = None
|
||||
meta: Optional[dict] = None
|
||||
access_control: Optional[dict] = None
|
||||
|
||||
created_at: int # timestamp in epoch
|
||||
updated_at: int # timestamp in epoch
|
||||
created_at: int # timestamp in epoch (time_ns)
|
||||
|
||||
updated_at: int # timestamp in epoch (time_ns)
|
||||
updated_by: Optional[str] = None
|
||||
|
||||
archived_at: Optional[int] = None # timestamp in epoch (time_ns)
|
||||
archived_by: Optional[str] = None
|
||||
|
||||
deleted_at: Optional[int] = None # timestamp in epoch (time_ns)
|
||||
deleted_by: Optional[str] = None
|
||||
|
||||
|
||||
class ChannelMember(Base):
|
||||
__tablename__ = "channel_member"
|
||||
|
||||
id = Column(Text, primary_key=True, unique=True)
|
||||
channel_id = Column(Text, nullable=False)
|
||||
user_id = Column(Text, nullable=False)
|
||||
|
||||
role = Column(Text, nullable=True)
|
||||
status = Column(Text, nullable=True)
|
||||
|
||||
is_active = Column(Boolean, nullable=False, default=True)
|
||||
|
||||
is_channel_muted = Column(Boolean, nullable=False, default=False)
|
||||
is_channel_pinned = Column(Boolean, nullable=False, default=False)
|
||||
|
||||
data = Column(JSON, nullable=True)
|
||||
meta = Column(JSON, nullable=True)
|
||||
|
||||
invited_at = Column(BigInteger, nullable=True)
|
||||
invited_by = Column(Text, nullable=True)
|
||||
|
||||
joined_at = Column(BigInteger)
|
||||
left_at = Column(BigInteger, nullable=True)
|
||||
|
||||
last_read_at = Column(BigInteger, nullable=True)
|
||||
|
||||
created_at = Column(BigInteger)
|
||||
updated_at = Column(BigInteger)
|
||||
|
||||
|
||||
class ChannelMemberModel(BaseModel):
|
||||
model_config = ConfigDict(from_attributes=True)
|
||||
|
||||
id: str
|
||||
channel_id: str
|
||||
user_id: str
|
||||
|
||||
role: Optional[str] = None
|
||||
status: Optional[str] = None
|
||||
|
||||
is_active: bool = True
|
||||
|
||||
is_channel_muted: bool = False
|
||||
is_channel_pinned: bool = False
|
||||
|
||||
data: Optional[dict] = None
|
||||
meta: Optional[dict] = None
|
||||
|
||||
invited_at: Optional[int] = None # timestamp in epoch (time_ns)
|
||||
invited_by: Optional[str] = None
|
||||
|
||||
joined_at: Optional[int] = None # timestamp in epoch (time_ns)
|
||||
left_at: Optional[int] = None # timestamp in epoch (time_ns)
|
||||
|
||||
last_read_at: Optional[int] = None # timestamp in epoch (time_ns)
|
||||
|
||||
created_at: Optional[int] = None # timestamp in epoch (time_ns)
|
||||
updated_at: Optional[int] = None # timestamp in epoch (time_ns)
|
||||
|
||||
|
||||
class ChannelWebhook(Base):
|
||||
__tablename__ = "channel_webhook"
|
||||
|
||||
id = Column(Text, primary_key=True, unique=True)
|
||||
channel_id = Column(Text, nullable=False)
|
||||
user_id = Column(Text, nullable=False)
|
||||
|
||||
name = Column(Text, nullable=False)
|
||||
profile_image_url = Column(Text, nullable=True)
|
||||
|
||||
token = Column(Text, nullable=False)
|
||||
last_used_at = Column(BigInteger, nullable=True)
|
||||
|
||||
created_at = Column(BigInteger, nullable=False)
|
||||
updated_at = Column(BigInteger, nullable=False)
|
||||
|
||||
|
||||
class ChannelWebhookModel(BaseModel):
|
||||
model_config = ConfigDict(from_attributes=True)
|
||||
|
||||
id: str
|
||||
channel_id: str
|
||||
user_id: str
|
||||
|
||||
name: str
|
||||
profile_image_url: Optional[str] = None
|
||||
|
||||
token: str
|
||||
last_used_at: Optional[int] = None # timestamp in epoch (time_ns)
|
||||
|
||||
created_at: int # timestamp in epoch (time_ns)
|
||||
updated_at: int # timestamp in epoch (time_ns)
|
||||
|
||||
|
||||
####################
|
||||
|
|
@ -58,27 +177,94 @@ class ChannelModel(BaseModel):
|
|||
|
||||
|
||||
class ChannelResponse(ChannelModel):
|
||||
is_manager: bool = False
|
||||
write_access: bool = False
|
||||
|
||||
user_count: Optional[int] = None
|
||||
|
||||
|
||||
class ChannelForm(BaseModel):
|
||||
name: str
|
||||
name: str = ""
|
||||
description: Optional[str] = None
|
||||
is_private: Optional[bool] = None
|
||||
data: Optional[dict] = None
|
||||
meta: Optional[dict] = None
|
||||
access_control: Optional[dict] = None
|
||||
group_ids: Optional[list[str]] = None
|
||||
user_ids: Optional[list[str]] = None
|
||||
|
||||
|
||||
class CreateChannelForm(ChannelForm):
|
||||
type: Optional[str] = None
|
||||
|
||||
|
||||
class ChannelTable:
|
||||
|
||||
def _collect_unique_user_ids(
|
||||
self,
|
||||
invited_by: str,
|
||||
user_ids: Optional[list[str]] = None,
|
||||
group_ids: Optional[list[str]] = None,
|
||||
) -> set[str]:
|
||||
"""
|
||||
Collect unique user ids from:
|
||||
- invited_by
|
||||
- user_ids
|
||||
- each group in group_ids
|
||||
Returns a set for efficient SQL diffing.
|
||||
"""
|
||||
users = set(user_ids or [])
|
||||
users.add(invited_by)
|
||||
|
||||
for group_id in group_ids or []:
|
||||
users.update(Groups.get_group_user_ids_by_id(group_id))
|
||||
|
||||
return users
|
||||
|
||||
def _create_membership_models(
|
||||
self,
|
||||
channel_id: str,
|
||||
invited_by: str,
|
||||
user_ids: set[str],
|
||||
) -> list[ChannelMember]:
|
||||
"""
|
||||
Takes a set of NEW user IDs (already filtered to exclude existing members).
|
||||
Returns ORM ChannelMember objects to be added.
|
||||
"""
|
||||
now = int(time.time_ns())
|
||||
memberships = []
|
||||
|
||||
for uid in user_ids:
|
||||
model = ChannelMemberModel(
|
||||
**{
|
||||
"id": str(uuid.uuid4()),
|
||||
"channel_id": channel_id,
|
||||
"user_id": uid,
|
||||
"status": "joined",
|
||||
"is_active": True,
|
||||
"is_channel_muted": False,
|
||||
"is_channel_pinned": False,
|
||||
"invited_at": now,
|
||||
"invited_by": invited_by,
|
||||
"joined_at": now,
|
||||
"left_at": None,
|
||||
"last_read_at": now,
|
||||
"created_at": now,
|
||||
"updated_at": now,
|
||||
}
|
||||
)
|
||||
memberships.append(ChannelMember(**model.model_dump()))
|
||||
|
||||
return memberships
|
||||
|
||||
def insert_new_channel(
|
||||
self, type: Optional[str], form_data: ChannelForm, user_id: str
|
||||
self, form_data: CreateChannelForm, user_id: str
|
||||
) -> Optional[ChannelModel]:
|
||||
with get_db() as db:
|
||||
channel = ChannelModel(
|
||||
**{
|
||||
**form_data.model_dump(),
|
||||
"type": type,
|
||||
"type": form_data.type if form_data.type else None,
|
||||
"name": form_data.name.lower(),
|
||||
"id": str(uuid.uuid4()),
|
||||
"user_id": user_id,
|
||||
|
|
@ -86,9 +272,21 @@ class ChannelTable:
|
|||
"updated_at": int(time.time_ns()),
|
||||
}
|
||||
)
|
||||
|
||||
new_channel = Channel(**channel.model_dump())
|
||||
|
||||
if form_data.type in ["group", "dm"]:
|
||||
users = self._collect_unique_user_ids(
|
||||
invited_by=user_id,
|
||||
user_ids=form_data.user_ids,
|
||||
group_ids=form_data.group_ids,
|
||||
)
|
||||
memberships = self._create_membership_models(
|
||||
channel_id=new_channel.id,
|
||||
invited_by=user_id,
|
||||
user_ids=users,
|
||||
)
|
||||
|
||||
db.add_all(memberships)
|
||||
db.add(new_channel)
|
||||
db.commit()
|
||||
return channel
|
||||
|
|
@ -98,16 +296,346 @@ class ChannelTable:
|
|||
channels = db.query(Channel).all()
|
||||
return [ChannelModel.model_validate(channel) for channel in channels]
|
||||
|
||||
def get_channels_by_user_id(
|
||||
self, user_id: str, permission: str = "read"
|
||||
) -> list[ChannelModel]:
|
||||
channels = self.get_channels()
|
||||
return [
|
||||
channel
|
||||
for channel in channels
|
||||
if channel.user_id == user_id
|
||||
or has_access(user_id, permission, channel.access_control)
|
||||
    def _has_permission(self, db, query, filter: dict, permission: str = "read"):
        group_ids = filter.get("group_ids", [])
        user_id = filter.get("user_id")

        dialect_name = db.bind.dialect.name

        # Public access
        conditions = []
        if group_ids or user_id:
            conditions.extend(
                [
                    Channel.access_control.is_(None),
                    cast(Channel.access_control, String) == "null",
                ]
            )

        # User-level permission
        if user_id:
            conditions.append(Channel.user_id == user_id)

        # Group-level permission
        if group_ids:
            group_conditions = []
            for gid in group_ids:
                if dialect_name == "sqlite":
                    group_conditions.append(
                        Channel.access_control[permission]["group_ids"].contains([gid])
                    )
                elif dialect_name == "postgresql":
                    group_conditions.append(
                        cast(
                            Channel.access_control[permission]["group_ids"],
                            JSONB,
                        ).contains([gid])
                    )
            conditions.append(or_(*group_conditions))

        if conditions:
            query = query.filter(or_(*conditions))

        return query
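
    # Illustrative sketch (an assumption about the stored shape, not taken from this diff):
    # _has_permission() only inspects access_control[permission]["group_ids"], and treats a
    # NULL/None access_control as public. A channel readable by groups g1 and g2 and
    # writable only by g1 could therefore carry:
    #
    #     {
    #         "read": {"group_ids": ["g1", "g2"]},
    #         "write": {"group_ids": ["g1"]},
    #     }
    #
    # The equivalent group-level check in plain Python:
    #
    #     def group_can(access_control, permission, gid):
    #         if access_control is None:
    #             return True  # public
    #         return gid in (access_control.get(permission, {}).get("group_ids") or [])
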
def get_channels_by_user_id(self, user_id: str) -> list[ChannelModel]:
|
||||
with get_db() as db:
|
||||
user_group_ids = [
|
||||
group.id for group in Groups.get_groups_by_member_id(user_id)
|
||||
]
|
||||
|
||||
membership_channels = (
|
||||
db.query(Channel)
|
||||
.join(ChannelMember, Channel.id == ChannelMember.channel_id)
|
||||
.filter(
|
||||
Channel.deleted_at.is_(None),
|
||||
Channel.archived_at.is_(None),
|
||||
Channel.type.in_(["group", "dm"]),
|
||||
ChannelMember.user_id == user_id,
|
||||
ChannelMember.is_active.is_(True),
|
||||
)
|
||||
.all()
|
||||
)
|
||||
|
||||
query = db.query(Channel).filter(
|
||||
Channel.deleted_at.is_(None),
|
||||
Channel.archived_at.is_(None),
|
||||
or_(
|
||||
Channel.type.is_(None), # True NULL/None
|
||||
Channel.type == "", # Empty string
|
||||
and_(Channel.type != "group", Channel.type != "dm"),
|
||||
),
|
||||
)
|
||||
query = self._has_permission(
|
||||
db, query, {"user_id": user_id, "group_ids": user_group_ids}
|
||||
)
|
||||
|
||||
standard_channels = query.all()
|
||||
|
||||
all_channels = membership_channels + standard_channels
|
||||
return [ChannelModel.model_validate(c) for c in all_channels]
|
||||
|
||||
def get_dm_channel_by_user_ids(self, user_ids: list[str]) -> Optional[ChannelModel]:
|
||||
with get_db() as db:
|
||||
# Ensure uniqueness in case a list with duplicates is passed
|
||||
unique_user_ids = list(set(user_ids))
|
||||
|
||||
match_count = func.sum(
|
||||
case(
|
||||
(ChannelMember.user_id.in_(unique_user_ids), 1),
|
||||
else_=0,
|
||||
)
|
||||
)
|
||||
|
||||
subquery = (
|
||||
db.query(ChannelMember.channel_id)
|
||||
.group_by(ChannelMember.channel_id)
|
||||
# 1. Channel must have exactly len(user_ids) members
|
||||
.having(func.count(ChannelMember.user_id) == len(unique_user_ids))
|
||||
# 2. All those members must be in unique_user_ids
|
||||
.having(match_count == len(unique_user_ids))
|
||||
.subquery()
|
||||
)
|
||||
|
||||
channel = (
|
||||
db.query(Channel)
|
||||
.filter(
|
||||
Channel.id.in_(subquery),
|
||||
Channel.type == "dm",
|
||||
)
|
||||
.first()
|
||||
)
|
||||
|
||||
return ChannelModel.model_validate(channel) if channel else None
|
||||
|
||||
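    # Illustrative sketch (comments only, not part of the diff): the two HAVING clauses in
    # get_dm_channel_by_user_ids() together require an exact membership match. Restated in
    # plain Python (assuming one membership row per user):
    #
    #     def is_exact_member_match(member_ids, requested_ids):
    #         requested = set(requested_ids)
    #         return len(member_ids) == len(requested) and all(
    #             uid in requested for uid in member_ids
    #         )
    #
    # so a channel with members {u1, u2, u3} is not returned for a DM lookup on [u1, u2].
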
def add_members_to_channel(
|
||||
self,
|
||||
channel_id: str,
|
||||
invited_by: str,
|
||||
user_ids: Optional[list[str]] = None,
|
||||
group_ids: Optional[list[str]] = None,
|
||||
) -> list[ChannelMemberModel]:
|
||||
with get_db() as db:
|
||||
# 1. Collect all user_ids including groups + inviter
|
||||
requested_users = self._collect_unique_user_ids(
|
||||
invited_by, user_ids, group_ids
|
||||
)
|
||||
|
||||
existing_users = {
|
||||
row.user_id
|
||||
for row in db.query(ChannelMember.user_id)
|
||||
.filter(ChannelMember.channel_id == channel_id)
|
||||
.all()
|
||||
}
|
||||
|
||||
new_user_ids = requested_users - existing_users
|
||||
if not new_user_ids:
|
||||
return [] # Nothing to add
|
||||
|
||||
new_memberships = self._create_membership_models(
|
||||
channel_id, invited_by, new_user_ids
|
||||
)
|
||||
|
||||
db.add_all(new_memberships)
|
||||
db.commit()
|
||||
|
||||
return [
|
||||
ChannelMemberModel.model_validate(membership)
|
||||
for membership in new_memberships
|
||||
]
|
||||
|
||||
def remove_members_from_channel(
|
||||
self,
|
||||
channel_id: str,
|
||||
user_ids: list[str],
|
||||
) -> int:
|
||||
with get_db() as db:
|
||||
result = (
|
||||
db.query(ChannelMember)
|
||||
.filter(
|
||||
ChannelMember.channel_id == channel_id,
|
||||
ChannelMember.user_id.in_(user_ids),
|
||||
)
|
||||
.delete(synchronize_session=False)
|
||||
)
|
||||
db.commit()
|
||||
return result # number of rows deleted
|
||||
|
||||
def is_user_channel_manager(self, channel_id: str, user_id: str) -> bool:
|
||||
with get_db() as db:
|
||||
# Check if the user is the creator of the channel
|
||||
# or has a 'manager' role in ChannelMember
|
||||
channel = db.query(Channel).filter(Channel.id == channel_id).first()
|
||||
if channel and channel.user_id == user_id:
|
||||
return True
|
||||
|
||||
membership = (
|
||||
db.query(ChannelMember)
|
||||
.filter(
|
||||
ChannelMember.channel_id == channel_id,
|
||||
ChannelMember.user_id == user_id,
|
||||
ChannelMember.role == "manager",
|
||||
)
|
||||
.first()
|
||||
)
|
||||
return membership is not None
|
||||
|
||||
def join_channel(
|
||||
self, channel_id: str, user_id: str
|
||||
) -> Optional[ChannelMemberModel]:
|
||||
with get_db() as db:
|
||||
# Check if the membership already exists
|
||||
existing_membership = (
|
||||
db.query(ChannelMember)
|
||||
.filter(
|
||||
ChannelMember.channel_id == channel_id,
|
||||
ChannelMember.user_id == user_id,
|
||||
)
|
||||
.first()
|
||||
)
|
||||
if existing_membership:
|
||||
return ChannelMemberModel.model_validate(existing_membership)
|
||||
|
||||
# Create new membership
|
||||
channel_member = ChannelMemberModel(
|
||||
**{
|
||||
"id": str(uuid.uuid4()),
|
||||
"channel_id": channel_id,
|
||||
"user_id": user_id,
|
||||
"status": "joined",
|
||||
"is_active": True,
|
||||
"is_channel_muted": False,
|
||||
"is_channel_pinned": False,
|
||||
"joined_at": int(time.time_ns()),
|
||||
"left_at": None,
|
||||
"last_read_at": int(time.time_ns()),
|
||||
"created_at": int(time.time_ns()),
|
||||
"updated_at": int(time.time_ns()),
|
||||
}
|
||||
)
|
||||
new_membership = ChannelMember(**channel_member.model_dump())
|
||||
|
||||
db.add(new_membership)
|
||||
db.commit()
|
||||
return channel_member
|
||||
|
||||
def leave_channel(self, channel_id: str, user_id: str) -> bool:
|
||||
with get_db() as db:
|
||||
membership = (
|
||||
db.query(ChannelMember)
|
||||
.filter(
|
||||
ChannelMember.channel_id == channel_id,
|
||||
ChannelMember.user_id == user_id,
|
||||
)
|
||||
.first()
|
||||
)
|
||||
if not membership:
|
||||
return False
|
||||
|
||||
membership.status = "left"
|
||||
membership.is_active = False
|
||||
membership.left_at = int(time.time_ns())
|
||||
membership.updated_at = int(time.time_ns())
|
||||
|
||||
db.commit()
|
||||
return True
|
||||
|
||||
def get_member_by_channel_and_user_id(
|
||||
self, channel_id: str, user_id: str
|
||||
) -> Optional[ChannelMemberModel]:
|
||||
with get_db() as db:
|
||||
membership = (
|
||||
db.query(ChannelMember)
|
||||
.filter(
|
||||
ChannelMember.channel_id == channel_id,
|
||||
ChannelMember.user_id == user_id,
|
||||
)
|
||||
.first()
|
||||
)
|
||||
return ChannelMemberModel.model_validate(membership) if membership else None
|
||||
|
||||
def get_members_by_channel_id(self, channel_id: str) -> list[ChannelMemberModel]:
|
||||
with get_db() as db:
|
||||
memberships = (
|
||||
db.query(ChannelMember)
|
||||
.filter(ChannelMember.channel_id == channel_id)
|
||||
.all()
|
||||
)
|
||||
return [
|
||||
ChannelMemberModel.model_validate(membership)
|
||||
for membership in memberships
|
||||
]
|
||||
|
||||
def pin_channel(self, channel_id: str, user_id: str, is_pinned: bool) -> bool:
|
||||
with get_db() as db:
|
||||
membership = (
|
||||
db.query(ChannelMember)
|
||||
.filter(
|
||||
ChannelMember.channel_id == channel_id,
|
||||
ChannelMember.user_id == user_id,
|
||||
)
|
||||
.first()
|
||||
)
|
||||
if not membership:
|
||||
return False
|
||||
|
||||
membership.is_channel_pinned = is_pinned
|
||||
membership.updated_at = int(time.time_ns())
|
||||
|
||||
db.commit()
|
||||
return True
|
||||
|
||||
def update_member_last_read_at(self, channel_id: str, user_id: str) -> bool:
|
||||
with get_db() as db:
|
||||
membership = (
|
||||
db.query(ChannelMember)
|
||||
.filter(
|
||||
ChannelMember.channel_id == channel_id,
|
||||
ChannelMember.user_id == user_id,
|
||||
)
|
||||
.first()
|
||||
)
|
||||
if not membership:
|
||||
return False
|
||||
|
||||
membership.last_read_at = int(time.time_ns())
|
||||
membership.updated_at = int(time.time_ns())
|
||||
|
||||
db.commit()
|
||||
return True
|
||||
|
||||
def update_member_active_status(
|
||||
self, channel_id: str, user_id: str, is_active: bool
|
||||
) -> bool:
|
||||
with get_db() as db:
|
||||
membership = (
|
||||
db.query(ChannelMember)
|
||||
.filter(
|
||||
ChannelMember.channel_id == channel_id,
|
||||
ChannelMember.user_id == user_id,
|
||||
)
|
||||
.first()
|
||||
)
|
||||
if not membership:
|
||||
return False
|
||||
|
||||
membership.is_active = is_active
|
||||
membership.updated_at = int(time.time_ns())
|
||||
|
||||
db.commit()
|
||||
return True
|
||||
|
||||
def is_user_channel_member(self, channel_id: str, user_id: str) -> bool:
|
||||
with get_db() as db:
|
||||
membership = (
|
||||
db.query(ChannelMember)
|
||||
.filter(
|
||||
ChannelMember.channel_id == channel_id,
|
||||
ChannelMember.user_id == user_id,
|
||||
)
|
||||
.first()
|
||||
)
|
||||
return membership is not None
|
||||
|
||||
def get_channel_by_id(self, id: str) -> Optional[ChannelModel]:
|
||||
with get_db() as db:
|
||||
|
|
@ -123,8 +651,12 @@ class ChannelTable:
|
|||
return None
|
||||
|
||||
channel.name = form_data.name
|
||||
channel.description = form_data.description
|
||||
channel.is_private = form_data.is_private
|
||||
|
||||
channel.data = form_data.data
|
||||
channel.meta = form_data.meta
|
||||
|
||||
channel.access_control = form_data.access_control
|
||||
channel.updated_at = int(time.time_ns())
|
||||
|
||||
|
|
|
|||
|
|
@ -11,7 +11,18 @@ from open_webui.models.files import FileMetadataResponse
|
|||
|
||||
|
||||
from pydantic import BaseModel, ConfigDict
|
||||
from sqlalchemy import BigInteger, Column, String, Text, JSON, func, ForeignKey
|
||||
from sqlalchemy import (
|
||||
BigInteger,
|
||||
Column,
|
||||
String,
|
||||
Text,
|
||||
JSON,
|
||||
and_,
|
||||
func,
|
||||
ForeignKey,
|
||||
cast,
|
||||
or_,
|
||||
)
|
||||
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
|
@ -41,7 +52,6 @@ class Group(Base):
|
|||
|
||||
|
||||
class GroupModel(BaseModel):
|
||||
model_config = ConfigDict(from_attributes=True)
|
||||
id: str
|
||||
user_id: str
|
||||
|
||||
|
|
@ -56,6 +66,8 @@ class GroupModel(BaseModel):
|
|||
created_at: int # timestamp in epoch
|
||||
updated_at: int # timestamp in epoch
|
||||
|
||||
model_config = ConfigDict(from_attributes=True)
|
||||
|
||||
|
||||
class GroupMember(Base):
|
||||
__tablename__ = "group_member"
|
||||
|
|
@ -84,17 +96,8 @@ class GroupMemberModel(BaseModel):
|
|||
####################
|
||||
|
||||
|
||||
class GroupResponse(BaseModel):
|
||||
id: str
|
||||
user_id: str
|
||||
name: str
|
||||
description: str
|
||||
permissions: Optional[dict] = None
|
||||
data: Optional[dict] = None
|
||||
meta: Optional[dict] = None
|
||||
class GroupResponse(GroupModel):
|
||||
member_count: Optional[int] = None
|
||||
created_at: int # timestamp in epoch
|
||||
updated_at: int # timestamp in epoch
|
||||
|
||||
|
||||
class GroupForm(BaseModel):
|
||||
|
|
@ -112,6 +115,11 @@ class GroupUpdateForm(GroupForm):
|
|||
pass
|
||||
|
||||
|
||||
class GroupListResponse(BaseModel):
|
||||
items: list[GroupResponse] = []
|
||||
total: int = 0
|
||||
|
||||
|
||||
class GroupTable:
|
||||
def insert_new_group(
|
||||
self, user_id: str, form_data: GroupForm
|
||||
|
|
@ -140,13 +148,87 @@ class GroupTable:
|
|||
except Exception:
|
||||
return None
|
||||
|
||||
def get_groups(self) -> list[GroupModel]:
|
||||
def get_all_groups(self) -> list[GroupModel]:
|
||||
with get_db() as db:
|
||||
groups = db.query(Group).order_by(Group.updated_at.desc()).all()
|
||||
return [GroupModel.model_validate(group) for group in groups]
|
||||
|
||||
def get_groups(self, filter) -> list[GroupResponse]:
|
||||
with get_db() as db:
|
||||
query = db.query(Group)
|
||||
|
||||
if filter:
|
||||
if "query" in filter:
|
||||
query = query.filter(Group.name.ilike(f"%{filter['query']}%"))
|
||||
if "member_id" in filter:
|
||||
query = query.join(
|
||||
GroupMember, GroupMember.group_id == Group.id
|
||||
).filter(GroupMember.user_id == filter["member_id"])
|
||||
|
||||
if "share" in filter:
|
||||
share_value = filter["share"]
|
||||
json_share = Group.data["config"]["share"].as_boolean()
|
||||
|
||||
if share_value:
|
||||
query = query.filter(
|
||||
or_(
|
||||
Group.data.is_(None),
|
||||
json_share.is_(None),
|
||||
json_share == True,
|
||||
)
|
||||
)
|
||||
else:
|
||||
query = query.filter(
|
||||
and_(Group.data.isnot(None), json_share == False)
|
||||
)
|
||||
groups = query.order_by(Group.updated_at.desc()).all()
|
||||
return [
|
||||
GroupModel.model_validate(group)
|
||||
for group in db.query(Group).order_by(Group.updated_at.desc()).all()
|
||||
GroupResponse.model_validate(
|
||||
{
|
||||
**GroupModel.model_validate(group).model_dump(),
|
||||
"member_count": self.get_group_member_count_by_id(group.id),
|
||||
}
|
||||
)
|
||||
for group in groups
|
||||
]
|
||||
|
||||
def search_groups(
|
||||
self, filter: Optional[dict] = None, skip: int = 0, limit: int = 30
|
||||
) -> GroupListResponse:
|
||||
with get_db() as db:
|
||||
query = db.query(Group)
|
||||
|
||||
if filter:
|
||||
if "query" in filter:
|
||||
query = query.filter(Group.name.ilike(f"%{filter['query']}%"))
|
||||
if "member_id" in filter:
|
||||
query = query.join(
|
||||
GroupMember, GroupMember.group_id == Group.id
|
||||
).filter(GroupMember.user_id == filter["member_id"])
|
||||
|
||||
if "share" in filter:
|
||||
# 'share' is stored in data JSON, support both sqlite and postgres
|
||||
share_value = filter["share"]
|
||||
print("Filtering by share:", share_value)
|
||||
query = query.filter(
|
||||
Group.data.op("->>")("share") == str(share_value)
|
||||
)
|
||||
|
||||
total = query.count()
|
||||
query = query.order_by(Group.updated_at.desc())
|
||||
groups = query.offset(skip).limit(limit).all()
|
||||
|
||||
            return {
                "items": [
                    GroupResponse.model_validate(
                        {
                            **GroupModel.model_validate(group).model_dump(),
                            "member_count": self.get_group_member_count_by_id(
                                group.id
                            ),
                        }
                    )
                    for group in groups
                ],
                "total": total,
            }
|
||||
|
||||
def get_groups_by_member_id(self, user_id: str) -> list[GroupModel]:
|
||||
with get_db() as db:
|
||||
return [
|
||||
|
|
@ -293,7 +375,7 @@ class GroupTable:
|
|||
) -> list[GroupModel]:
|
||||
|
||||
# check for existing groups
|
||||
existing_groups = self.get_groups()
|
||||
existing_groups = self.get_all_groups()
|
||||
existing_group_names = {group.name for group in existing_groups}
|
||||
|
||||
new_groups = []
|
||||
|
|
|
|||
|
|
@ -7,13 +7,21 @@ import uuid
|
|||
from open_webui.internal.db import Base, get_db
|
||||
from open_webui.env import SRC_LOG_LEVELS
|
||||
|
||||
from open_webui.models.files import FileMetadataResponse
|
||||
from open_webui.models.files import File, FileModel, FileMetadataResponse
|
||||
from open_webui.models.groups import Groups
|
||||
from open_webui.models.users import Users, UserResponse
|
||||
|
||||
|
||||
from pydantic import BaseModel, ConfigDict
|
||||
from sqlalchemy import BigInteger, Column, String, Text, JSON
|
||||
from sqlalchemy import (
|
||||
BigInteger,
|
||||
Column,
|
||||
ForeignKey,
|
||||
String,
|
||||
Text,
|
||||
JSON,
|
||||
UniqueConstraint,
|
||||
)
|
||||
|
||||
from open_webui.utils.access_control import has_access
|
||||
|
||||
|
|
@ -34,9 +42,7 @@ class Knowledge(Base):
|
|||
name = Column(Text)
|
||||
description = Column(Text)
|
||||
|
||||
data = Column(JSON, nullable=True)
|
||||
meta = Column(JSON, nullable=True)
|
||||
|
||||
access_control = Column(JSON, nullable=True) # Controls data access levels.
|
||||
# Defines access control rules for this entry.
|
||||
# - `None`: Public access, available to all users with the "user" role.
|
||||
|
|
@ -67,7 +73,6 @@ class KnowledgeModel(BaseModel):
|
|||
name: str
|
||||
description: str
|
||||
|
||||
data: Optional[dict] = None
|
||||
meta: Optional[dict] = None
|
||||
|
||||
access_control: Optional[dict] = None
|
||||
|
|
@ -76,11 +81,42 @@ class KnowledgeModel(BaseModel):
|
|||
updated_at: int # timestamp in epoch
|
||||
|
||||
|
||||
class KnowledgeFile(Base):
|
||||
__tablename__ = "knowledge_file"
|
||||
|
||||
id = Column(Text, unique=True, primary_key=True)
|
||||
|
||||
knowledge_id = Column(
|
||||
Text, ForeignKey("knowledge.id", ondelete="CASCADE"), nullable=False
|
||||
)
|
||||
file_id = Column(Text, ForeignKey("file.id", ondelete="CASCADE"), nullable=False)
|
||||
user_id = Column(Text, nullable=False)
|
||||
|
||||
created_at = Column(BigInteger, nullable=False)
|
||||
updated_at = Column(BigInteger, nullable=False)
|
||||
|
||||
__table_args__ = (
|
||||
UniqueConstraint(
|
||||
"knowledge_id", "file_id", name="uq_knowledge_file_knowledge_file"
|
||||
),
|
||||
)
|
||||
|
||||
|
||||
class KnowledgeFileModel(BaseModel):
|
||||
id: str
|
||||
knowledge_id: str
|
||||
file_id: str
|
||||
user_id: str
|
||||
|
||||
created_at: int # timestamp in epoch
|
||||
updated_at: int # timestamp in epoch
|
||||
|
||||
model_config = ConfigDict(from_attributes=True)
|
||||
|
||||
|
||||
####################
|
||||
# Forms
|
||||
####################
|
||||
|
||||
|
||||
class KnowledgeUserModel(KnowledgeModel):
|
||||
user: Optional[UserResponse] = None
|
||||
|
||||
|
|
@ -96,7 +132,6 @@ class KnowledgeUserResponse(KnowledgeUserModel):
|
|||
class KnowledgeForm(BaseModel):
|
||||
name: str
|
||||
description: str
|
||||
data: Optional[dict] = None
|
||||
access_control: Optional[dict] = None
|
||||
|
||||
|
||||
|
|
@ -182,6 +217,100 @@ class KnowledgeTable:
|
|||
except Exception:
|
||||
return None
|
||||
|
||||
def get_knowledges_by_file_id(self, file_id: str) -> list[KnowledgeModel]:
|
||||
try:
|
||||
with get_db() as db:
|
||||
knowledges = (
|
||||
db.query(Knowledge)
|
||||
.join(KnowledgeFile, Knowledge.id == KnowledgeFile.knowledge_id)
|
||||
.filter(KnowledgeFile.file_id == file_id)
|
||||
.all()
|
||||
)
|
||||
return [
|
||||
KnowledgeModel.model_validate(knowledge) for knowledge in knowledges
|
||||
]
|
||||
except Exception:
|
||||
return []
|
||||
|
||||
def get_files_by_id(self, knowledge_id: str) -> list[FileModel]:
|
||||
try:
|
||||
with get_db() as db:
|
||||
files = (
|
||||
db.query(File)
|
||||
.join(KnowledgeFile, File.id == KnowledgeFile.file_id)
|
||||
.filter(KnowledgeFile.knowledge_id == knowledge_id)
|
||||
.all()
|
||||
)
|
||||
return [FileModel.model_validate(file) for file in files]
|
||||
except Exception:
|
||||
return []
|
||||
|
||||
def get_file_metadatas_by_id(self, knowledge_id: str) -> list[FileMetadataResponse]:
|
||||
try:
|
||||
with get_db() as db:
|
||||
files = self.get_files_by_id(knowledge_id)
|
||||
return [FileMetadataResponse(**file.model_dump()) for file in files]
|
||||
except Exception:
|
||||
return []
|
||||
|
||||
def add_file_to_knowledge_by_id(
|
||||
self, knowledge_id: str, file_id: str, user_id: str
|
||||
) -> Optional[KnowledgeFileModel]:
|
||||
with get_db() as db:
|
||||
knowledge_file = KnowledgeFileModel(
|
||||
**{
|
||||
"id": str(uuid.uuid4()),
|
||||
"knowledge_id": knowledge_id,
|
||||
"file_id": file_id,
|
||||
"user_id": user_id,
|
||||
"created_at": int(time.time()),
|
||||
"updated_at": int(time.time()),
|
||||
}
|
||||
)
|
||||
|
||||
try:
|
||||
result = KnowledgeFile(**knowledge_file.model_dump())
|
||||
db.add(result)
|
||||
db.commit()
|
||||
db.refresh(result)
|
||||
if result:
|
||||
return KnowledgeFileModel.model_validate(result)
|
||||
else:
|
||||
return None
|
||||
except Exception:
|
||||
return None
|
||||
|
||||
def remove_file_from_knowledge_by_id(self, knowledge_id: str, file_id: str) -> bool:
|
||||
try:
|
||||
with get_db() as db:
|
||||
db.query(KnowledgeFile).filter_by(
|
||||
knowledge_id=knowledge_id, file_id=file_id
|
||||
).delete()
|
||||
db.commit()
|
||||
return True
|
||||
except Exception:
|
||||
return False
|
||||
|
||||
def reset_knowledge_by_id(self, id: str) -> Optional[KnowledgeModel]:
|
||||
try:
|
||||
with get_db() as db:
|
||||
# Delete all knowledge_file entries for this knowledge_id
|
||||
db.query(KnowledgeFile).filter_by(knowledge_id=id).delete()
|
||||
db.commit()
|
||||
|
||||
# Update the knowledge entry's updated_at timestamp
|
||||
db.query(Knowledge).filter_by(id=id).update(
|
||||
{
|
||||
"updated_at": int(time.time()),
|
||||
}
|
||||
)
|
||||
db.commit()
|
||||
|
||||
return self.get_knowledge_by_id(id=id)
|
||||
except Exception as e:
|
||||
log.exception(e)
|
||||
return None
|
||||
|
||||
def update_knowledge_by_id(
|
||||
self, id: str, form_data: KnowledgeForm, overwrite: bool = False
|
||||
) -> Optional[KnowledgeModel]:
|
||||
|
|
|
|||
|
|
@ -5,7 +5,8 @@ from typing import Optional
|
|||
|
||||
from open_webui.internal.db import Base, get_db
|
||||
from open_webui.models.tags import TagModel, Tag, Tags
|
||||
from open_webui.models.users import Users, UserNameResponse
|
||||
from open_webui.models.users import Users, User, UserNameResponse
|
||||
from open_webui.models.channels import Channels, ChannelMember
|
||||
|
||||
|
||||
from pydantic import BaseModel, ConfigDict
|
||||
|
|
@ -39,7 +40,7 @@ class MessageReactionModel(BaseModel):
|
|||
|
||||
class Message(Base):
|
||||
__tablename__ = "message"
|
||||
id = Column(Text, primary_key=True)
|
||||
id = Column(Text, primary_key=True, unique=True)
|
||||
|
||||
user_id = Column(Text)
|
||||
channel_id = Column(Text, nullable=True)
|
||||
|
|
@ -47,6 +48,11 @@ class Message(Base):
|
|||
reply_to_id = Column(Text, nullable=True)
|
||||
parent_id = Column(Text, nullable=True)
|
||||
|
||||
# Pins
|
||||
is_pinned = Column(Boolean, nullable=False, default=False)
|
||||
pinned_at = Column(BigInteger, nullable=True)
|
||||
pinned_by = Column(Text, nullable=True)
|
||||
|
||||
content = Column(Text)
|
||||
data = Column(JSON, nullable=True)
|
||||
meta = Column(JSON, nullable=True)
|
||||
|
|
@ -65,12 +71,17 @@ class MessageModel(BaseModel):
|
|||
reply_to_id: Optional[str] = None
|
||||
parent_id: Optional[str] = None
|
||||
|
||||
# Pins
|
||||
is_pinned: bool = False
|
||||
pinned_by: Optional[str] = None
|
||||
pinned_at: Optional[int] = None # timestamp in epoch (time_ns)
|
||||
|
||||
content: str
|
||||
data: Optional[dict] = None
|
||||
meta: Optional[dict] = None
|
||||
|
||||
created_at: int # timestamp in epoch
|
||||
updated_at: int # timestamp in epoch
|
||||
created_at: int # timestamp in epoch (time_ns)
|
||||
updated_at: int # timestamp in epoch (time_ns)
|
||||
|
||||
|
||||
####################
|
||||
|
|
@ -79,6 +90,7 @@ class MessageModel(BaseModel):
|
|||
|
||||
|
||||
class MessageForm(BaseModel):
|
||||
temp_id: Optional[str] = None
|
||||
content: str
|
||||
reply_to_id: Optional[str] = None
|
||||
parent_id: Optional[str] = None
|
||||
|
|
@ -88,7 +100,7 @@ class MessageForm(BaseModel):
|
|||
|
||||
class Reactions(BaseModel):
|
||||
name: str
|
||||
user_ids: list[str]
|
||||
users: list[dict]
|
||||
count: int
|
||||
|
||||
|
||||
|
|
@ -100,6 +112,10 @@ class MessageReplyToResponse(MessageUserResponse):
|
|||
reply_to_message: Optional[MessageUserResponse] = None
|
||||
|
||||
|
||||
class MessageWithReactionsResponse(MessageUserResponse):
|
||||
reactions: list[Reactions]
|
||||
|
||||
|
||||
class MessageResponse(MessageReplyToResponse):
|
||||
latest_reply_at: Optional[int]
|
||||
reply_count: int
|
||||
|
|
@ -111,9 +127,11 @@ class MessageTable:
|
|||
self, form_data: MessageForm, channel_id: str, user_id: str
|
||||
) -> Optional[MessageModel]:
|
||||
with get_db() as db:
|
||||
id = str(uuid.uuid4())
|
||||
channel_member = Channels.join_channel(channel_id, user_id)
|
||||
|
||||
id = str(uuid.uuid4())
|
||||
ts = int(time.time_ns())
|
||||
|
||||
message = MessageModel(
|
||||
**{
|
||||
"id": id,
|
||||
|
|
@ -121,6 +139,9 @@ class MessageTable:
|
|||
"channel_id": channel_id,
|
||||
"reply_to_id": form_data.reply_to_id,
|
||||
"parent_id": form_data.parent_id,
|
||||
"is_pinned": False,
|
||||
"pinned_at": None,
|
||||
"pinned_by": None,
|
||||
"content": form_data.content,
|
||||
"data": form_data.data,
|
||||
"meta": form_data.meta,
|
||||
|
|
@ -128,8 +149,8 @@ class MessageTable:
|
|||
"updated_at": ts,
|
||||
}
|
||||
)
|
||||
|
||||
result = Message(**message.model_dump())
|
||||
|
||||
db.add(result)
|
||||
db.commit()
|
||||
db.refresh(result)
|
||||
|
|
@ -280,6 +301,30 @@ class MessageTable:
|
|||
)
|
||||
return messages
|
||||
|
||||
def get_last_message_by_channel_id(self, channel_id: str) -> Optional[MessageModel]:
|
||||
with get_db() as db:
|
||||
message = (
|
||||
db.query(Message)
|
||||
.filter_by(channel_id=channel_id)
|
||||
.order_by(Message.created_at.desc())
|
||||
.first()
|
||||
)
|
||||
return MessageModel.model_validate(message) if message else None
|
||||
|
||||
def get_pinned_messages_by_channel_id(
|
||||
self, channel_id: str, skip: int = 0, limit: int = 50
|
||||
) -> list[MessageModel]:
|
||||
with get_db() as db:
|
||||
all_messages = (
|
||||
db.query(Message)
|
||||
.filter_by(channel_id=channel_id, is_pinned=True)
|
||||
.order_by(Message.pinned_at.desc())
|
||||
.offset(skip)
|
||||
.limit(limit)
|
||||
.all()
|
||||
)
|
||||
return [MessageModel.model_validate(message) for message in all_messages]
|
||||
|
||||
def update_message_by_id(
|
||||
self, id: str, form_data: MessageForm
|
||||
) -> Optional[MessageModel]:
|
||||
|
|
@ -299,10 +344,44 @@ class MessageTable:
|
|||
db.refresh(message)
|
||||
return MessageModel.model_validate(message) if message else None
|
||||
|
||||
def update_is_pinned_by_id(
|
||||
self, id: str, is_pinned: bool, pinned_by: Optional[str] = None
|
||||
) -> Optional[MessageModel]:
|
||||
with get_db() as db:
|
||||
message = db.get(Message, id)
|
||||
message.is_pinned = is_pinned
|
||||
message.pinned_at = int(time.time_ns()) if is_pinned else None
|
||||
message.pinned_by = pinned_by if is_pinned else None
|
||||
db.commit()
|
||||
db.refresh(message)
|
||||
return MessageModel.model_validate(message) if message else None
|
||||
|
||||
def get_unread_message_count(
|
||||
self, channel_id: str, user_id: str, last_read_at: Optional[int] = None
|
||||
) -> int:
|
||||
with get_db() as db:
|
||||
query = db.query(Message).filter(
|
||||
Message.channel_id == channel_id,
|
||||
Message.parent_id == None, # only count top-level messages
|
||||
Message.created_at > (last_read_at if last_read_at else 0),
|
||||
)
|
||||
if user_id:
|
||||
query = query.filter(Message.user_id != user_id)
|
||||
return query.count()
|
||||
|
||||
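    # Illustrative sketch (comments only, not part of the diff): the predicate that
    # get_unread_message_count() applies. A message counts as unread for a member when it
    # is top-level (no parent_id), newer than the member's last_read_at (or 0 when the
    # member has never read the channel), and was not sent by the member themselves:
    #
    #     unread = (
    #         message.parent_id is None
    #         and message.created_at > (last_read_at or 0)
    #         and message.user_id != user_id
    #     )
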
def add_reaction_to_message(
|
||||
self, id: str, user_id: str, name: str
|
||||
) -> Optional[MessageReactionModel]:
|
||||
with get_db() as db:
|
||||
# check for existing reaction
|
||||
existing_reaction = (
|
||||
db.query(MessageReaction)
|
||||
.filter_by(message_id=id, user_id=user_id, name=name)
|
||||
.first()
|
||||
)
|
||||
if existing_reaction:
|
||||
return MessageReactionModel.model_validate(existing_reaction)
|
||||
|
||||
reaction_id = str(uuid.uuid4())
|
||||
reaction = MessageReactionModel(
|
||||
id=reaction_id,
|
||||
|
|
@ -319,17 +398,30 @@ class MessageTable:
|
|||
|
||||
def get_reactions_by_message_id(self, id: str) -> list[Reactions]:
|
||||
with get_db() as db:
|
||||
all_reactions = db.query(MessageReaction).filter_by(message_id=id).all()
|
||||
# JOIN User so all user info is fetched in one query
|
||||
results = (
|
||||
db.query(MessageReaction, User)
|
||||
.join(User, MessageReaction.user_id == User.id)
|
||||
.filter(MessageReaction.message_id == id)
|
||||
.all()
|
||||
)
|
||||
|
||||
reactions = {}
|
||||
for reaction in all_reactions:
|
||||
|
||||
for reaction, user in results:
|
||||
if reaction.name not in reactions:
|
||||
reactions[reaction.name] = {
|
||||
"name": reaction.name,
|
||||
"user_ids": [],
|
||||
"users": [],
|
||||
"count": 0,
|
||||
}
|
||||
reactions[reaction.name]["user_ids"].append(reaction.user_id)
|
||||
|
||||
reactions[reaction.name]["users"].append(
|
||||
{
|
||||
"id": user.id,
|
||||
"name": user.name,
|
||||
}
|
||||
)
|
||||
reactions[reaction.name]["count"] += 1
|
||||
|
||||
return [Reactions(**reaction) for reaction in reactions.values()]
|
||||
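    # Illustrative sketch (placeholder ids/names, not part of the diff): the aggregated
    # shape returned by get_reactions_by_message_id(), one Reactions entry per emoji name:
    #
    #     [
    #         Reactions(
    #             name="👍",
    #             user_ids=["u1", "u2"],
    #             users=[{"id": "u1", "name": "Alice"}, {"id": "u2", "name": "Bob"}],
    #             count=2,
    #         )
    #     ]
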
|
|
|
|||
|
|
@ -13,6 +13,8 @@ from pydantic import BaseModel, ConfigDict
|
|||
|
||||
from sqlalchemy import String, cast, or_, and_, func
|
||||
from sqlalchemy.dialects import postgresql, sqlite
|
||||
|
||||
from sqlalchemy.dialects.postgresql import JSONB
|
||||
from sqlalchemy import BigInteger, Column, Text, JSON, Boolean
|
||||
|
||||
|
||||
|
|
@ -53,7 +55,7 @@ class ModelMeta(BaseModel):
|
|||
class Model(Base):
|
||||
__tablename__ = "model"
|
||||
|
||||
id = Column(Text, primary_key=True)
|
||||
id = Column(Text, primary_key=True, unique=True)
|
||||
"""
|
||||
The model's id as used in the API. If set to an existing model, it will override the model.
|
||||
"""
|
||||
|
|
@ -220,6 +222,48 @@ class ModelsTable:
|
|||
or has_access(user_id, permission, model.access_control, user_group_ids)
|
||||
]
|
||||
|
||||
def _has_permission(self, db, query, filter: dict, permission: str = "read"):
|
||||
group_ids = filter.get("group_ids", [])
|
||||
user_id = filter.get("user_id")
|
||||
|
||||
dialect_name = db.bind.dialect.name
|
||||
|
||||
# Public access
|
||||
conditions = []
|
||||
if group_ids or user_id:
|
||||
conditions.extend(
|
||||
[
|
||||
Model.access_control.is_(None),
|
||||
cast(Model.access_control, String) == "null",
|
||||
]
|
||||
)
|
||||
|
||||
# User-level permission
|
||||
if user_id:
|
||||
conditions.append(Model.user_id == user_id)
|
||||
|
||||
# Group-level permission
|
||||
if group_ids:
|
||||
group_conditions = []
|
||||
for gid in group_ids:
|
||||
if dialect_name == "sqlite":
|
||||
group_conditions.append(
|
||||
Model.access_control[permission]["group_ids"].contains([gid])
|
||||
)
|
||||
elif dialect_name == "postgresql":
|
||||
group_conditions.append(
|
||||
cast(
|
||||
Model.access_control[permission]["group_ids"],
|
||||
JSONB,
|
||||
).contains([gid])
|
||||
)
|
||||
conditions.append(or_(*group_conditions))
|
||||
|
||||
if conditions:
|
||||
query = query.filter(or_(*conditions))
|
||||
|
||||
return query
|
||||
|
||||
def search_models(
|
||||
self, user_id: str, filter: dict = {}, skip: int = 0, limit: int = 30
|
||||
) -> ModelListResponse:
|
||||
|
|
@ -238,16 +282,20 @@ class ModelsTable:
|
|||
)
|
||||
)
|
||||
|
||||
if filter.get("user_id"):
|
||||
query = query.filter(Model.user_id == filter.get("user_id"))
|
||||
|
||||
view_option = filter.get("view_option")
|
||||
|
||||
if view_option == "created":
|
||||
query = query.filter(Model.user_id == user_id)
|
||||
elif view_option == "shared":
|
||||
query = query.filter(Model.user_id != user_id)
|
||||
|
||||
# Apply access control filtering
|
||||
query = self._has_permission(
|
||||
db,
|
||||
query,
|
||||
filter,
|
||||
permission="write",
|
||||
)
|
||||
|
||||
tag = filter.get("tag")
|
||||
if tag:
|
||||
# TODO: This is a simple implementation and should be improved for performance
|
||||
|
|
|
|||
|
|
@ -23,7 +23,7 @@ from sqlalchemy.sql import exists
|
|||
class Note(Base):
|
||||
__tablename__ = "note"
|
||||
|
||||
id = Column(Text, primary_key=True)
|
||||
id = Column(Text, primary_key=True, unique=True)
|
||||
user_id = Column(Text)
|
||||
|
||||
title = Column(Text)
|
||||
|
|
|
|||
|
|
@ -25,7 +25,7 @@ log.setLevel(SRC_LOG_LEVELS["MODELS"])
|
|||
class OAuthSession(Base):
|
||||
__tablename__ = "oauth_session"
|
||||
|
||||
id = Column(Text, primary_key=True)
|
||||
id = Column(Text, primary_key=True, unique=True)
|
||||
user_id = Column(Text, nullable=False)
|
||||
provider = Column(Text, nullable=False)
|
||||
token = Column(
|
||||
|
|
|
|||
|
|
@ -24,7 +24,7 @@ log.setLevel(SRC_LOG_LEVELS["MODELS"])
|
|||
class Tool(Base):
|
||||
__tablename__ = "tool"
|
||||
|
||||
id = Column(String, primary_key=True)
|
||||
id = Column(String, primary_key=True, unique=True)
|
||||
user_id = Column(String)
|
||||
name = Column(Text)
|
||||
content = Column(Text)
|
||||
|
|
|
|||
|
|
@ -7,12 +7,27 @@ from open_webui.internal.db import Base, JSONField, get_db
|
|||
from open_webui.env import DATABASE_USER_ACTIVE_STATUS_UPDATE_INTERVAL
|
||||
from open_webui.models.chats import Chats
|
||||
from open_webui.models.groups import Groups, GroupMember
|
||||
from open_webui.models.channels import ChannelMember
|
||||
|
||||
|
||||
from open_webui.utils.misc import throttle
|
||||
|
||||
|
||||
from pydantic import BaseModel, ConfigDict
|
||||
from sqlalchemy import BigInteger, Column, String, Text, Date, exists, select
|
||||
from sqlalchemy import (
|
||||
BigInteger,
|
||||
JSON,
|
||||
Column,
|
||||
String,
|
||||
Boolean,
|
||||
Text,
|
||||
Date,
|
||||
exists,
|
||||
select,
|
||||
cast,
|
||||
)
|
||||
from sqlalchemy import or_, case
|
||||
from sqlalchemy.dialects.postgresql import JSONB
|
||||
|
||||
import datetime
|
||||
|
||||
|
|
@ -21,59 +36,71 @@ import datetime
|
|||
####################
|
||||
|
||||
|
||||
class User(Base):
|
||||
__tablename__ = "user"
|
||||
|
||||
id = Column(String, primary_key=True)
|
||||
name = Column(String)
|
||||
|
||||
email = Column(String)
|
||||
username = Column(String(50), nullable=True)
|
||||
|
||||
role = Column(String)
|
||||
profile_image_url = Column(Text)
|
||||
|
||||
bio = Column(Text, nullable=True)
|
||||
gender = Column(Text, nullable=True)
|
||||
date_of_birth = Column(Date, nullable=True)
|
||||
|
||||
info = Column(JSONField, nullable=True)
|
||||
settings = Column(JSONField, nullable=True)
|
||||
|
||||
api_key = Column(String, nullable=True, unique=True)
|
||||
oauth_sub = Column(Text, unique=True)
|
||||
|
||||
last_active_at = Column(BigInteger)
|
||||
|
||||
updated_at = Column(BigInteger)
|
||||
created_at = Column(BigInteger)
|
||||
|
||||
|
||||
class UserSettings(BaseModel):
|
||||
ui: Optional[dict] = {}
|
||||
model_config = ConfigDict(extra="allow")
|
||||
pass
|
||||
|
||||
|
||||
class User(Base):
|
||||
__tablename__ = "user"
|
||||
|
||||
id = Column(String, primary_key=True, unique=True)
|
||||
email = Column(String)
|
||||
username = Column(String(50), nullable=True)
|
||||
role = Column(String)
|
||||
|
||||
name = Column(String)
|
||||
|
||||
profile_image_url = Column(Text)
|
||||
profile_banner_image_url = Column(Text, nullable=True)
|
||||
|
||||
bio = Column(Text, nullable=True)
|
||||
gender = Column(Text, nullable=True)
|
||||
date_of_birth = Column(Date, nullable=True)
|
||||
timezone = Column(String, nullable=True)
|
||||
|
||||
presence_state = Column(String, nullable=True)
|
||||
status_emoji = Column(String, nullable=True)
|
||||
status_message = Column(Text, nullable=True)
|
||||
status_expires_at = Column(BigInteger, nullable=True)
|
||||
|
||||
info = Column(JSON, nullable=True)
|
||||
settings = Column(JSON, nullable=True)
|
||||
|
||||
oauth = Column(JSON, nullable=True)
|
||||
|
||||
last_active_at = Column(BigInteger)
|
||||
updated_at = Column(BigInteger)
|
||||
created_at = Column(BigInteger)
|
||||
|
||||
|
||||
class UserModel(BaseModel):
|
||||
id: str
|
||||
name: str
|
||||
|
||||
email: str
|
||||
username: Optional[str] = None
|
||||
|
||||
role: str = "pending"
|
||||
|
||||
name: str
|
||||
|
||||
profile_image_url: str
|
||||
profile_banner_image_url: Optional[str] = None
|
||||
|
||||
bio: Optional[str] = None
|
||||
gender: Optional[str] = None
|
||||
date_of_birth: Optional[datetime.date] = None
|
||||
timezone: Optional[str] = None
|
||||
|
||||
presence_state: Optional[str] = None
|
||||
status_emoji: Optional[str] = None
|
||||
status_message: Optional[str] = None
|
||||
status_expires_at: Optional[int] = None
|
||||
|
||||
info: Optional[dict] = None
|
||||
settings: Optional[UserSettings] = None
|
||||
|
||||
api_key: Optional[str] = None
|
||||
oauth_sub: Optional[str] = None
|
||||
oauth: Optional[dict] = None
|
||||
|
||||
last_active_at: int # timestamp in epoch
|
||||
updated_at: int # timestamp in epoch
|
||||
|
|
@ -82,6 +109,38 @@ class UserModel(BaseModel):
|
|||
model_config = ConfigDict(from_attributes=True)
|
||||
|
||||
|
||||
class UserStatusModel(UserModel):
|
||||
is_active: bool = False
|
||||
|
||||
model_config = ConfigDict(from_attributes=True)
|
||||
|
||||
|
||||
class ApiKey(Base):
|
||||
__tablename__ = "api_key"
|
||||
|
||||
id = Column(Text, primary_key=True, unique=True)
|
||||
user_id = Column(Text, nullable=False)
|
||||
key = Column(Text, unique=True, nullable=False)
|
||||
data = Column(JSON, nullable=True)
|
||||
expires_at = Column(BigInteger, nullable=True)
|
||||
last_used_at = Column(BigInteger, nullable=True)
|
||||
created_at = Column(BigInteger, nullable=False)
|
||||
updated_at = Column(BigInteger, nullable=False)
|
||||
|
||||
|
||||
class ApiKeyModel(BaseModel):
|
||||
id: str
|
||||
user_id: str
|
||||
key: str
|
||||
data: Optional[dict] = None
|
||||
expires_at: Optional[int] = None
|
||||
last_used_at: Optional[int] = None
|
||||
created_at: int # timestamp in epoch
|
||||
updated_at: int # timestamp in epoch
|
||||
|
||||
model_config = ConfigDict(from_attributes=True)
|
||||
|
||||
|
||||
####################
|
||||
# Forms
|
||||
####################
|
||||
|
|
@ -113,7 +172,13 @@ class UserGroupIdsListResponse(BaseModel):
|
|||
total: int
|
||||
|
||||
|
||||
class UserInfoResponse(BaseModel):
|
||||
class UserStatus(BaseModel):
|
||||
status_emoji: Optional[str] = None
|
||||
status_message: Optional[str] = None
|
||||
status_expires_at: Optional[int] = None
|
||||
|
||||
|
||||
class UserInfoResponse(UserStatus):
|
||||
id: str
|
||||
name: str
|
||||
email: str
|
||||
|
|
@ -125,6 +190,12 @@ class UserIdNameResponse(BaseModel):
|
|||
name: str
|
||||
|
||||
|
||||
class UserIdNameStatusResponse(UserStatus):
|
||||
id: str
|
||||
name: str
|
||||
is_active: Optional[bool] = None
|
||||
|
||||
|
||||
class UserInfoListResponse(BaseModel):
|
||||
users: list[UserInfoResponse]
|
||||
total: int
|
||||
|
|
@ -135,18 +206,18 @@ class UserIdNameListResponse(BaseModel):
|
|||
total: int
|
||||
|
||||
|
||||
class UserResponse(BaseModel):
|
||||
id: str
|
||||
name: str
|
||||
email: str
|
||||
role: str
|
||||
profile_image_url: str
|
||||
|
||||
|
||||
class UserNameResponse(BaseModel):
|
||||
id: str
|
||||
name: str
|
||||
role: str
|
||||
|
||||
|
||||
class UserResponse(UserNameResponse):
|
||||
email: str
|
||||
|
||||
|
||||
class UserProfileImageResponse(UserNameResponse):
|
||||
email: str
|
||||
profile_image_url: str
|
||||
|
||||
|
||||
|
|
@ -171,20 +242,20 @@ class UsersTable:
|
|||
email: str,
|
||||
profile_image_url: str = "/user.png",
|
||||
role: str = "pending",
|
||||
oauth_sub: Optional[str] = None,
|
||||
oauth: Optional[dict] = None,
|
||||
) -> Optional[UserModel]:
|
||||
with get_db() as db:
|
||||
user = UserModel(
|
||||
**{
|
||||
"id": id,
|
||||
"name": name,
|
||||
"email": email,
|
||||
"name": name,
|
||||
"role": role,
|
||||
"profile_image_url": profile_image_url,
|
||||
"last_active_at": int(time.time()),
|
||||
"created_at": int(time.time()),
|
||||
"updated_at": int(time.time()),
|
||||
"oauth_sub": oauth_sub,
|
||||
"oauth": oauth,
|
||||
}
|
||||
)
|
||||
result = User(**user.model_dump())
|
||||
|
|
@ -207,8 +278,13 @@ class UsersTable:
|
|||
def get_user_by_api_key(self, api_key: str) -> Optional[UserModel]:
|
||||
try:
|
||||
with get_db() as db:
|
||||
user = db.query(User).filter_by(api_key=api_key).first()
|
||||
return UserModel.model_validate(user)
|
||||
user = (
|
||||
db.query(User)
|
||||
.join(ApiKey, User.id == ApiKey.user_id)
|
||||
.filter(ApiKey.key == api_key)
|
||||
.first()
|
||||
)
|
||||
return UserModel.model_validate(user) if user else None
|
||||
except Exception:
|
||||
return None
|
||||
|
||||
|
|
@ -220,12 +296,23 @@ class UsersTable:
|
|||
except Exception:
|
||||
return None
|
||||
|
||||
def get_user_by_oauth_sub(self, sub: str) -> Optional[UserModel]:
|
||||
def get_user_by_oauth_sub(self, provider: str, sub: str) -> Optional[UserModel]:
|
||||
try:
|
||||
with get_db() as db:
|
||||
user = db.query(User).filter_by(oauth_sub=sub).first()
|
||||
return UserModel.model_validate(user)
|
||||
except Exception:
|
||||
with get_db() as db: # type: Session
|
||||
dialect_name = db.bind.dialect.name
|
||||
|
||||
query = db.query(User)
|
||||
if dialect_name == "sqlite":
|
||||
query = query.filter(User.oauth.contains({provider: {"sub": sub}}))
|
||||
elif dialect_name == "postgresql":
|
||||
query = query.filter(
|
||||
User.oauth[provider].cast(JSONB)["sub"].astext == sub
|
||||
)
|
||||
|
||||
user = query.first()
|
||||
return UserModel.model_validate(user) if user else None
|
||||
except Exception as e:
|
||||
# You may want to log the exception here
|
||||
return None
|
||||
|
||||
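    # Illustrative usage sketch (placeholder values, not part of the diff): with the oauth
    # JSON column, lookups are keyed by provider and sub instead of the legacy
    # "provider@sub" string:
    #
    #     user = Users.get_user_by_oauth_sub("google", "12345")
    #
    # matches a row whose oauth column contains {"google": {"sub": "12345"}}; linking an
    # additional provider to an existing account goes through update_user_oauth_by_id().
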
def get_users(
|
||||
|
|
@ -248,6 +335,17 @@ class UsersTable:
|
|||
)
|
||||
)
|
||||
|
||||
channel_id = filter.get("channel_id")
|
||||
if channel_id:
|
||||
query = query.filter(
|
||||
exists(
|
||||
select(ChannelMember.id).where(
ChannelMember.user_id == User.id,
ChannelMember.channel_id == channel_id,
)
)
)

user_ids = filter.get("user_ids")
group_ids = filter.get("group_ids")

@@ -354,7 +452,17 @@ class UsersTable:
"total": total,
}

def get_users_by_user_ids(self, user_ids: list[str]) -> list[UserModel]:
def get_users_by_group_id(self, group_id: str) -> list[UserModel]:
with get_db() as db:
users = (
db.query(User)
.join(GroupMember, User.id == GroupMember.user_id)
.filter(GroupMember.group_id == group_id)
.all()
)
return [UserModel.model_validate(user) for user in users]

def get_users_by_user_ids(self, user_ids: list[str]) -> list[UserStatusModel]:
with get_db() as db:
users = db.query(User).filter(User.id.in_(user_ids)).all()
return [UserModel.model_validate(user) for user in users]

@@ -410,6 +518,21 @@ class UsersTable:
except Exception:
return None

def update_user_status_by_id(
self, id: str, form_data: UserStatus
) -> Optional[UserModel]:
try:
with get_db() as db:
db.query(User).filter_by(id=id).update(
{**form_data.model_dump(exclude_none=True)}
)
db.commit()

user = db.query(User).filter_by(id=id).first()
return UserModel.model_validate(user)
except Exception:
return None

def update_user_profile_image_url_by_id(
self, id: str, profile_image_url: str
) -> Optional[UserModel]:

@@ -426,7 +549,7 @@ class UsersTable:
return None

@throttle(DATABASE_USER_ACTIVE_STATUS_UPDATE_INTERVAL)
def update_user_last_active_by_id(self, id: str) -> Optional[UserModel]:
def update_last_active_by_id(self, id: str) -> Optional[UserModel]:
try:
with get_db() as db:
db.query(User).filter_by(id=id).update(

@@ -439,16 +562,35 @@ class UsersTable:
except Exception:
return None

def update_user_oauth_sub_by_id(
self, id: str, oauth_sub: str
def update_user_oauth_by_id(
self, id: str, provider: str, sub: str
) -> Optional[UserModel]:
"""
Update or insert an OAuth provider/sub pair into the user's oauth JSON field.
Example resulting structure:
{
"google": { "sub": "123" },
"github": { "sub": "abc" }
}
"""
try:
with get_db() as db:
db.query(User).filter_by(id=id).update({"oauth_sub": oauth_sub})
user = db.query(User).filter_by(id=id).first()
if not user:
return None

# Load existing oauth JSON or create empty
oauth = user.oauth or {}

# Update or insert provider entry
oauth[provider] = {"sub": sub}

# Persist updated JSON
db.query(User).filter_by(id=id).update({"oauth": oauth})
db.commit()

user = db.query(User).filter_by(id=id).first()
return UserModel.model_validate(user)

except Exception:
return None

@@ -502,23 +644,45 @@ class UsersTable:
except Exception:
return False

def update_user_api_key_by_id(self, id: str, api_key: str) -> bool:
try:
with get_db() as db:
result = db.query(User).filter_by(id=id).update({"api_key": api_key})
db.commit()
return True if result == 1 else False
except Exception:
return False

def get_user_api_key_by_id(self, id: str) -> Optional[str]:
try:
with get_db() as db:
user = db.query(User).filter_by(id=id).first()
return user.api_key
api_key = db.query(ApiKey).filter_by(user_id=id).first()
return api_key.key if api_key else None
except Exception:
return None

def update_user_api_key_by_id(self, id: str, api_key: str) -> bool:
try:
with get_db() as db:
db.query(ApiKey).filter_by(user_id=id).delete()
db.commit()

now = int(time.time())
new_api_key = ApiKey(
id=f"key_{id}",
user_id=id,
key=api_key,
created_at=now,
updated_at=now,
)
db.add(new_api_key)
db.commit()

return True

except Exception:
return False

def delete_user_api_key_by_id(self, id: str) -> bool:
try:
with get_db() as db:
db.query(ApiKey).filter_by(user_id=id).delete()
db.commit()
return True
except Exception:
return False

def get_valid_user_ids(self, user_ids: list[str]) -> list[str]:
with get_db() as db:
users = db.query(User).filter(User.id.in_(user_ids)).all()

@@ -532,5 +696,23 @@ class UsersTable:
else:
return None

def get_active_user_count(self) -> int:
with get_db() as db:
# Consider user active if last_active_at within the last 3 minutes
three_minutes_ago = int(time.time()) - 180
count = (
db.query(User).filter(User.last_active_at >= three_minutes_ago).count()
)
return count

def is_user_active(self, user_id: str) -> bool:
with get_db() as db:
user = db.query(User).filter_by(id=user_id).first()
if user and user.last_active_at:
# Consider user active if last_active_at within the last 3 minutes
three_minutes_ago = int(time.time()) - 180
return user.last_active_at >= three_minutes_ago
return False


Users = UsersTable()
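Note: a minimal usage sketch for the activity helpers introduced in the hunk above, assuming the `Users` singleton and module path used elsewhere in this diff; the 3-minute window simply mirrors the hard-coded 180-second threshold shown there.

from open_webui.models.users import Users

def presence_summary(user_ids: list[str]) -> dict:
    # Relies on the is_user_active/get_active_user_count methods added above.
    return {
        "active_total": Users.get_active_user_count(),
        "per_user": {uid: Users.is_user_active(uid) for uid in user_ids},
    }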
@@ -322,12 +322,14 @@ class Loader:
file_path=file_path,
api_endpoint=self.kwargs.get("DOCUMENT_INTELLIGENCE_ENDPOINT"),
api_key=self.kwargs.get("DOCUMENT_INTELLIGENCE_KEY"),
api_model=self.kwargs.get("DOCUMENT_INTELLIGENCE_MODEL"),
)
else:
loader = AzureAIDocumentIntelligenceLoader(
file_path=file_path,
api_endpoint=self.kwargs.get("DOCUMENT_INTELLIGENCE_ENDPOINT"),
azure_credential=DefaultAzureCredential(),
api_model=self.kwargs.get("DOCUMENT_INTELLIGENCE_MODEL"),
)
elif self.engine == "mineru" and file_ext in [
"pdf"
@@ -1088,21 +1088,17 @@ async def get_sources_from_items(
or knowledge_base.user_id == user.id
or has_access(user.id, "read", knowledge_base.access_control)
):

file_ids = knowledge_base.data.get("file_ids", [])
files = Knowledges.get_files_by_id(knowledge_base.id)

documents = []
metadatas = []
for file_id in file_ids:
file_object = Files.get_file_by_id(file_id)

if file_object:
documents.append(file_object.data.get("content", ""))
for file in files:
documents.append(file.data.get("content", ""))
metadatas.append(
{
"file_id": file_id,
"name": file_object.filename,
"source": file_object.filename,
"file_id": file.id,
"name": file.filename,
"source": file.filename,
}
)
@@ -200,23 +200,24 @@ class MilvusClient(VectorDBBase):
def query(self, collection_name: str, filter: dict, limit: int = -1):
connections.connect(uri=MILVUS_URI, token=MILVUS_TOKEN, db_name=MILVUS_DB)

# Construct the filter string for querying
collection_name = collection_name.replace("-", "_")
if not self.has_collection(collection_name):
log.warning(
f"Query attempted on non-existent collection: {self.collection_prefix}_{collection_name}"
)
return None
filter_string = " && ".join(
[
f'metadata["{key}"] == {json.dumps(value)}'
for key, value in filter.items()
]
)

filter_expressions = []
for key, value in filter.items():
if isinstance(value, str):
filter_expressions.append(f'metadata["{key}"] == "{value}"')
else:
filter_expressions.append(f'metadata["{key}"] == {value}')

filter_string = " && ".join(filter_expressions)

collection = Collection(f"{self.collection_prefix}_{collection_name}")
collection.load()
all_results = []

try:
log.info(

@@ -224,24 +225,25 @@ class MilvusClient(VectorDBBase):
)

iterator = collection.query_iterator(
filter=filter_string,
expr=filter_string,
output_fields=[
"id",
"data",
"metadata",
],
limit=limit, # Pass the limit directly; -1 means no limit.
limit=limit if limit > 0 else -1,
)

all_results = []
while True:
result = iterator.next()
if not result:
batch = iterator.next()
if not batch:
iterator.close()
break
all_results += result
all_results.extend(batch)

log.info(f"Total results from query: {len(all_results)}")
return self._result_to_get_result([all_results])
log.debug(f"Total results from query: {len(all_results)}")
return self._result_to_get_result([all_results] if all_results else [[]])

except Exception as e:
log.exception(

@@ -157,7 +157,6 @@ class MilvusClient(VectorDBBase):
for item in items
]
collection.insert(entities)
collection.flush()

def search(
self, collection_name: str, vectors: List[List[float]], limit: int

@@ -263,15 +262,23 @@ class MilvusClient(VectorDBBase):
else:
expr.append(f"metadata['{key}'] == {value}")

results = collection.query(
iterator = collection.query_iterator(
expr=" and ".join(expr),
output_fields=["id", "text", "metadata"],
limit=limit,
limit=limit if limit else -1,
)

ids = [res["id"] for res in results]
documents = [res["text"] for res in results]
metadatas = [res["metadata"] for res in results]
all_results = []
while True:
batch = iterator.next()
if not batch:
iterator.close()
break
all_results.extend(batch)

ids = [res["id"] for res in all_results]
documents = [res["text"] for res in all_results]
metadatas = [res["metadata"] for res in all_results]

return GetResult(ids=[ids], documents=[documents], metadatas=[metadatas])
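Note: a sketch of the iterator-draining pattern the Milvus hunks above switch to, extracted into a standalone helper. It assumes the same pymilvus Collection.query_iterator / next / close calls shown in the diff; nothing beyond those calls is implied.

from pymilvus import Collection

def drain_query_iterator(collection: Collection, expr: str, limit: int = -1) -> list[dict]:
    # Pull batches until an empty batch is returned, then close the iterator,
    # matching the loop used in MilvusClient.query above.
    iterator = collection.query_iterator(
        expr=expr,
        output_fields=["id", "data", "metadata"],
        limit=limit if limit and limit > 0 else -1,
    )
    results: list[dict] = []
    while True:
        batch = iterator.next()
        if not batch:
            iterator.close()
            break
        results.extend(batch)
    return results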
@@ -33,7 +33,7 @@ def get_filtered_results(results, filter_list):
except Exception:
pass

if any(is_string_allowed(hostname, filter_list) for hostname in hostnames):
if is_string_allowed(hostnames, filter_list):
filtered_results.append(result)
continue
@@ -6,6 +6,7 @@ import logging
from aiohttp import ClientSession
import urllib

from open_webui.models.auths import (
AddUserForm,
ApiKey,

@@ -16,9 +17,13 @@ from open_webui.models.auths import (
SigninResponse,
SignupForm,
UpdatePasswordForm,
UserResponse,
)
from open_webui.models.users import Users, UpdateProfileForm
from open_webui.models.users import (
UserProfileImageResponse,
Users,
UpdateProfileForm,
UserStatus,
)
from open_webui.models.groups import Groups
from open_webui.models.oauth_sessions import OAuthSessions

@@ -60,6 +65,11 @@ from open_webui.utils.auth import (
)
from open_webui.utils.webhook import post_webhook
from open_webui.utils.access_control import get_permissions, has_permission
from open_webui.utils.groups import apply_default_group_assignment

from open_webui.utils.redis import get_redis_client
from open_webui.utils.rate_limit import RateLimiter

from typing import Optional, List

@@ -73,17 +83,21 @@ router = APIRouter()
log = logging.getLogger(__name__)
log.setLevel(SRC_LOG_LEVELS["MAIN"])

signin_rate_limiter = RateLimiter(
redis_client=get_redis_client(), limit=5 * 3, window=60 * 3
)

############################
# GetSessionUser
############################

class SessionUserResponse(Token, UserResponse):
class SessionUserResponse(Token, UserProfileImageResponse):
expires_at: Optional[int] = None
permissions: Optional[dict] = None

class SessionUserInfoResponse(SessionUserResponse):
class SessionUserInfoResponse(SessionUserResponse, UserStatus):
bio: Optional[str] = None
gender: Optional[str] = None
date_of_birth: Optional[datetime.date] = None

@@ -140,6 +154,9 @@ async def get_session_user(
"bio": user.bio,
"gender": user.gender,
"date_of_birth": user.date_of_birth,
"status_emoji": user.status_emoji,
"status_message": user.status_message,
"status_expires_at": user.status_expires_at,
"permissions": user_permissions,
}

@@ -149,7 +166,7 @@ async def get_session_user(
############################

@router.post("/update/profile", response_model=UserResponse)
@router.post("/update/profile", response_model=UserProfileImageResponse)
async def update_profile(
form_data: UpdateProfileForm, session_user=Depends(get_verified_user)
):

@@ -401,6 +418,11 @@ async def ldap_auth(request: Request, response: Response, form_data: LdapForm):
500, detail=ERROR_MESSAGES.CREATE_USER_ERROR
)

apply_default_group_assignment(
request.app.state.config.DEFAULT_GROUP_ID,
user.id,
)

except HTTPException:
raise
except Exception as err:

@@ -449,7 +471,6 @@ async def ldap_auth(request: Request, response: Response, form_data: LdapForm):
):
if ENABLE_LDAP_GROUP_CREATION:
Groups.create_groups_by_group_names(user.id, user_groups)

try:
Groups.sync_groups_by_group_names(user.id, user_groups)
log.info(

@@ -544,6 +565,12 @@ async def signin(request: Request, response: Response, form_data: SigninForm):
admin_email.lower(), lambda pw: verify_password(admin_password, pw)
)
else:
if signin_rate_limiter.is_limited(form_data.email.lower()):
raise HTTPException(
status_code=status.HTTP_429_TOO_MANY_REQUESTS,
detail=ERROR_MESSAGES.RATE_LIMIT_EXCEEDED,
)

password_bytes = form_data.password.encode("utf-8")
if len(password_bytes) > 72:
# TODO: Implement other hashing algorithms that support longer passwords

@@ -700,9 +727,10 @@ async def signup(request: Request, response: Response, form_data: SignupForm):
# Disable signup after the first user is created
request.app.state.config.ENABLE_SIGNUP = False

default_group_id = getattr(request.app.state.config, "DEFAULT_GROUP_ID", "")
if default_group_id and default_group_id:
Groups.add_users_to_group(default_group_id, [user.id])
apply_default_group_assignment(
request.app.state.config.DEFAULT_GROUP_ID,
user.id,
)

return {
"token": token,

@@ -807,7 +835,9 @@ async def signout(request: Request, response: Response):

@router.post("/add", response_model=SigninResponse)
async def add_user(form_data: AddUserForm, user=Depends(get_admin_user)):
async def add_user(
request: Request, form_data: AddUserForm, user=Depends(get_admin_user)
):
if not validate_email_format(form_data.email.lower()):
raise HTTPException(
status.HTTP_400_BAD_REQUEST, detail=ERROR_MESSAGES.INVALID_EMAIL_FORMAT

@@ -832,6 +862,11 @@ async def add_user(form_data: AddUserForm, user=Depends(get_admin_user)):
)

if user:
apply_default_group_assignment(
request.app.state.config.DEFAULT_GROUP_ID,
user.id,
)

token = create_token(data={"id": user.id})
return {
"token": token,

@@ -901,6 +936,7 @@ async def get_admin_config(request: Request, user=Depends(get_admin_user)):
"JWT_EXPIRES_IN": request.app.state.config.JWT_EXPIRES_IN,
"ENABLE_COMMUNITY_SHARING": request.app.state.config.ENABLE_COMMUNITY_SHARING,
"ENABLE_MESSAGE_RATING": request.app.state.config.ENABLE_MESSAGE_RATING,
"ENABLE_FOLDERS": request.app.state.config.ENABLE_FOLDERS,
"ENABLE_CHANNELS": request.app.state.config.ENABLE_CHANNELS,
"ENABLE_NOTES": request.app.state.config.ENABLE_NOTES,
"ENABLE_USER_WEBHOOKS": request.app.state.config.ENABLE_USER_WEBHOOKS,

@@ -922,6 +958,7 @@ class AdminConfig(BaseModel):
JWT_EXPIRES_IN: str
ENABLE_COMMUNITY_SHARING: bool
ENABLE_MESSAGE_RATING: bool
ENABLE_FOLDERS: bool
ENABLE_CHANNELS: bool
ENABLE_NOTES: bool
ENABLE_USER_WEBHOOKS: bool

@@ -946,6 +983,7 @@ async def update_admin_config(
form_data.API_KEYS_ALLOWED_ENDPOINTS
)

request.app.state.config.ENABLE_FOLDERS = form_data.ENABLE_FOLDERS
request.app.state.config.ENABLE_CHANNELS = form_data.ENABLE_CHANNELS
request.app.state.config.ENABLE_NOTES = form_data.ENABLE_NOTES

@@ -988,6 +1026,7 @@ async def update_admin_config(
"JWT_EXPIRES_IN": request.app.state.config.JWT_EXPIRES_IN,
"ENABLE_COMMUNITY_SHARING": request.app.state.config.ENABLE_COMMUNITY_SHARING,
"ENABLE_MESSAGE_RATING": request.app.state.config.ENABLE_MESSAGE_RATING,
"ENABLE_FOLDERS": request.app.state.config.ENABLE_FOLDERS,
"ENABLE_CHANNELS": request.app.state.config.ENABLE_CHANNELS,
"ENABLE_NOTES": request.app.state.config.ENABLE_NOTES,
"ENABLE_USER_WEBHOOKS": request.app.state.config.ENABLE_USER_WEBHOOKS,

@@ -1130,8 +1169,7 @@ async def generate_api_key(request: Request, user=Depends(get_current_user)):
# delete api key
@router.delete("/api_key", response_model=bool)
async def delete_api_key(user=Depends(get_current_user)):
success = Users.update_user_api_key_by_id(user.id, None)
return success
return Users.delete_user_api_key_by_id(user.id)

# get api key
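Note: the diff wires a Redis-backed signin rate limiter (`RateLimiter(redis_client=..., limit=5 * 3, window=60 * 3)` with `is_limited(key)`), but its internals are not shown here. The sketch below is a hypothetical fixed-window stand-in that only illustrates the limit/window semantics; the real open_webui.utils.rate_limit.RateLimiter may work differently.

import time

class FixedWindowRateLimiter:
    # Hypothetical stand-in for RateLimiter, not the project's implementation.
    def __init__(self, redis_client, limit: int, window: int):
        self.redis = redis_client
        self.limit = limit
        self.window = window

    def is_limited(self, key: str) -> bool:
        # Count hits in the current window; the first hit sets the expiry.
        bucket = f"rate_limit:{key}:{int(time.time()) // self.window}"
        count = self.redis.incr(bucket)
        if count == 1:
            self.redis.expire(bucket, self.window)
        return count > self.limit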
@@ -8,11 +8,14 @@ from pydantic import BaseModel

from open_webui.socket.main import (
emit_to_users,
enter_room_for_users,
sio,
get_user_ids_from_room,
get_active_status_by_user_id,
)
from open_webui.models.users import (
UserIdNameResponse,
UserIdNameStatusResponse,
UserListResponse,
UserModelResponse,
Users,

@@ -25,11 +28,13 @@ from open_webui.models.channels import (
ChannelModel,
ChannelForm,
ChannelResponse,
CreateChannelForm,
)
from open_webui.models.messages import (
Messages,
MessageModel,
MessageResponse,
MessageWithReactionsResponse,
MessageForm,
)

@@ -51,6 +56,7 @@ from open_webui.utils.access_control import (
has_access,
get_users_with_access,
get_permitted_group_and_user_ids,
has_permission,
)
from open_webui.utils.webhook import post_webhook
from open_webui.utils.channels import extract_mentions, replace_mentions

@@ -65,9 +71,64 @@ router = APIRouter()
############################

@router.get("/", response_model=list[ChannelModel])
async def get_channels(user=Depends(get_verified_user)):
return Channels.get_channels_by_user_id(user.id)
class ChannelListItemResponse(ChannelModel):
user_ids: Optional[list[str]] = None # 'dm' channels only
users: Optional[list[UserIdNameStatusResponse]] = None # 'dm' channels only

last_message_at: Optional[int] = None # timestamp in epoch (time_ns)
unread_count: int = 0

@router.get("/", response_model=list[ChannelListItemResponse])
async def get_channels(request: Request, user=Depends(get_verified_user)):
if user.role != "admin" and not has_permission(
user.id, "features.channels", request.app.state.config.USER_PERMISSIONS
):
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail=ERROR_MESSAGES.UNAUTHORIZED,
)

channels = Channels.get_channels_by_user_id(user.id)
channel_list = []
for channel in channels:
last_message = Messages.get_last_message_by_channel_id(channel.id)
last_message_at = last_message.created_at if last_message else None

channel_member = Channels.get_member_by_channel_and_user_id(channel.id, user.id)
unread_count = (
Messages.get_unread_message_count(
channel.id, user.id, channel_member.last_read_at
)
if channel_member
else 0
)

user_ids = None
users = None
if channel.type == "dm":
user_ids = [
member.user_id
for member in Channels.get_members_by_channel_id(channel.id)
]
users = [
UserIdNameStatusResponse(
**{**user.model_dump(), "is_active": Users.is_user_active(user.id)}
)
for user in Users.get_users_by_user_ids(user_ids)
]

channel_list.append(
ChannelListItemResponse(
**channel.model_dump(),
user_ids=user_ids,
users=users,
last_message_at=last_message_at,
unread_count=unread_count,
)
)

return channel_list

@router.get("/list", response_model=list[ChannelModel])

@@ -77,16 +138,141 @@ async def get_all_channels(user=Depends(get_verified_user)):
return Channels.get_channels_by_user_id(user.id)

############################
# GetDMChannelByUserId
############################

@router.get("/users/{user_id}", response_model=Optional[ChannelModel])
async def get_dm_channel_by_user_id(
request: Request, user_id: str, user=Depends(get_verified_user)
):
if user.role != "admin" and not has_permission(
user.id, "features.channels", request.app.state.config.USER_PERMISSIONS
):
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail=ERROR_MESSAGES.UNAUTHORIZED,
)

try:
existing_channel = Channels.get_dm_channel_by_user_ids([user.id, user_id])
if existing_channel:
participant_ids = [
member.user_id
for member in Channels.get_members_by_channel_id(existing_channel.id)
]

await emit_to_users(
"events:channel",
{"data": {"type": "channel:created"}},
participant_ids,
)
await enter_room_for_users(
f"channel:{existing_channel.id}", participant_ids
)

Channels.update_member_active_status(existing_channel.id, user.id, True)
return ChannelModel(**existing_channel.model_dump())

channel = Channels.insert_new_channel(
CreateChannelForm(
type="dm",
name="",
user_ids=[user_id],
),
user.id,
)

if channel:
participant_ids = [
member.user_id
for member in Channels.get_members_by_channel_id(channel.id)
]

await emit_to_users(
"events:channel",
{"data": {"type": "channel:created"}},
participant_ids,
)
await enter_room_for_users(f"channel:{channel.id}", participant_ids)

return ChannelModel(**channel.model_dump())
else:
raise Exception("Error creating channel")
except Exception as e:
log.exception(e)
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST, detail=ERROR_MESSAGES.DEFAULT()
)

############################
# CreateNewChannel
############################

@router.post("/create", response_model=Optional[ChannelModel])
async def create_new_channel(form_data: ChannelForm, user=Depends(get_admin_user)):
async def create_new_channel(
request: Request, form_data: CreateChannelForm, user=Depends(get_verified_user)
):
if user.role != "admin" and not has_permission(
user.id, "features.channels", request.app.state.config.USER_PERMISSIONS
):
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail=ERROR_MESSAGES.UNAUTHORIZED,
)

if form_data.type not in ["group", "dm"] and user.role != "admin":
# Only admins can create standard channels (joined by default)
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail=ERROR_MESSAGES.UNAUTHORIZED,
)

try:
channel = Channels.insert_new_channel(None, form_data, user.id)
if form_data.type == "dm":
existing_channel = Channels.get_dm_channel_by_user_ids(
[user.id, *form_data.user_ids]
)
if existing_channel:
participant_ids = [
member.user_id
for member in Channels.get_members_by_channel_id(
existing_channel.id
)
]
await emit_to_users(
"events:channel",
{"data": {"type": "channel:created"}},
participant_ids,
)
await enter_room_for_users(
f"channel:{existing_channel.id}", participant_ids
)

Channels.update_member_active_status(existing_channel.id, user.id, True)
return ChannelModel(**existing_channel.model_dump())

channel = Channels.insert_new_channel(form_data, user.id)

if channel:
participant_ids = [
member.user_id
for member in Channels.get_members_by_channel_id(channel.id)
]

await emit_to_users(
"events:channel",
{"data": {"type": "channel:created"}},
participant_ids,
)
await enter_room_for_users(f"channel:{channel.id}", participant_ids)

return ChannelModel(**channel.model_dump())
else:
raise Exception("Error creating channel")
except Exception as e:
log.exception(e)
raise HTTPException(
@@ -99,7 +285,15 @@ async def create_new_channel(form_data: ChannelForm, user=Depends(get_admin_user
############################

@router.get("/{id}", response_model=Optional[ChannelResponse])
class ChannelFullResponse(ChannelResponse):
user_ids: Optional[list[str]] = None # 'group'/'dm' channels only
users: Optional[list[UserIdNameStatusResponse]] = None # 'group'/'dm' channels only

last_read_at: Optional[int] = None # timestamp in epoch (time_ns)
unread_count: int = 0

@router.get("/{id}", response_model=Optional[ChannelFullResponse])
async def get_channel_by_id(id: str, user=Depends(get_verified_user)):
channel = Channels.get_channel_by_id(id)
if not channel:

@@ -107,6 +301,44 @@ async def get_channel_by_id(id: str, user=Depends(get_verified_user)):
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

user_ids = None
users = None

if channel.type in ["group", "dm"]:
if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)

user_ids = [
member.user_id for member in Channels.get_members_by_channel_id(channel.id)
]

users = [
UserIdNameStatusResponse(
**{**user.model_dump(), "is_active": Users.is_user_active(user.id)}
)
for user in Users.get_users_by_user_ids(user_ids)
]

channel_member = Channels.get_member_by_channel_and_user_id(channel.id, user.id)
unread_count = Messages.get_unread_message_count(
channel.id, user.id, channel_member.last_read_at if channel_member else None
)

return ChannelFullResponse(
**{
**channel.model_dump(),
"user_ids": user_ids,
"users": users,
"is_manager": Channels.is_user_channel_manager(channel.id, user.id),
"write_access": True,
"user_count": len(user_ids),
"last_read_at": channel_member.last_read_at if channel_member else None,
"unread_count": unread_count,
}
)
else:
if user.role != "admin" and not has_access(
user.id, type="read", access_control=channel.access_control
):

@@ -120,20 +352,35 @@ async def get_channel_by_id(id: str, user=Depends(get_verified_user)):

user_count = len(get_users_with_access("read", channel.access_control))

return ChannelResponse(
channel_member = Channels.get_member_by_channel_and_user_id(channel.id, user.id)
unread_count = Messages.get_unread_message_count(
channel.id, user.id, channel_member.last_read_at if channel_member else None
)

return ChannelFullResponse(
**{
**channel.model_dump(),
"user_ids": user_ids,
"users": users,
"is_manager": Channels.is_user_channel_manager(channel.id, user.id),
"write_access": write_access or user.role == "admin",
"user_count": user_count,
"last_read_at": channel_member.last_read_at if channel_member else None,
"unread_count": unread_count,
}
)

############################
# GetChannelMembersById
############################

PAGE_ITEM_COUNT = 30

@router.get("/{id}/users", response_model=UserListResponse)
async def get_channel_users_by_id(
@router.get("/{id}/members", response_model=UserListResponse)
async def get_channel_members_by_id(
id: str,
query: Optional[str] = None,
order_by: Optional[str] = None,
@@ -153,9 +400,30 @@ async def get_channel_users_by_id(
page = max(1, page)
skip = (page - 1) * limit

filter = {
"roles": ["!pending"],
if channel.type in ["group", "dm"]:
if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)

if channel.type == "dm":
user_ids = [
member.user_id for member in Channels.get_members_by_channel_id(channel.id)
]
users = Users.get_users_by_user_ids(user_ids)
total = len(users)

return {
"users": [
UserModelResponse(
**user.model_dump(), is_active=Users.is_user_active(user.id)
)
for user in users
],
"total": total,
}
else:
filter = {}

if query:
filter["query"] = query

@@ -164,7 +432,13 @@ async def get_channel_users_by_id(
if direction:
filter["direction"] = direction

permitted_ids = get_permitted_group_and_user_ids("read", channel.access_control)
if channel.type == "group":
filter["channel_id"] = channel.id
else:
filter["roles"] = ["!pending"]
permitted_ids = get_permitted_group_and_user_ids(
"read", channel.access_control
)
if permitted_ids:
filter["user_ids"] = permitted_ids.get("user_ids")
filter["group_ids"] = permitted_ids.get("group_ids")

@@ -177,7 +451,7 @@ async def get_channel_users_by_id(
return {
"users": [
UserModelResponse(
**user.model_dump(), is_active=get_active_status_by_user_id(user.id)
**user.model_dump(), is_active=Users.is_user_active(user.id)
)
for user in users
],
@@ -185,6 +459,131 @@ async def get_channel_users_by_id(
}

#################################################
# UpdateIsActiveMemberByIdAndUserId
#################################################

class UpdateActiveMemberForm(BaseModel):
is_active: bool

@router.post("/{id}/members/active", response_model=bool)
async def update_is_active_member_by_id_and_user_id(
id: str,
form_data: UpdateActiveMemberForm,
user=Depends(get_verified_user),
):
channel = Channels.get_channel_by_id(id)
if not channel:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

Channels.update_member_active_status(channel.id, user.id, form_data.is_active)
return True

#################################################
# AddMembersById
#################################################

class UpdateMembersForm(BaseModel):
user_ids: list[str] = []
group_ids: list[str] = []

@router.post("/{id}/update/members/add")
async def add_members_by_id(
request: Request,
id: str,
form_data: UpdateMembersForm,
user=Depends(get_verified_user),
):
if user.role != "admin" and not has_permission(
user.id, "features.channels", request.app.state.config.USER_PERMISSIONS
):
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail=ERROR_MESSAGES.UNAUTHORIZED,
)

channel = Channels.get_channel_by_id(id)
if not channel:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if channel.user_id != user.id and user.role != "admin":
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)

try:
memberships = Channels.add_members_to_channel(
channel.id, user.id, form_data.user_ids, form_data.group_ids
)

return memberships
except Exception as e:
log.exception(e)
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST, detail=ERROR_MESSAGES.DEFAULT()
)

#################################################
#
#################################################

class RemoveMembersForm(BaseModel):
user_ids: list[str] = []

@router.post("/{id}/update/members/remove")
async def remove_members_by_id(
request: Request,
id: str,
form_data: RemoveMembersForm,
user=Depends(get_verified_user),
):
if user.role != "admin" and not has_permission(
user.id, "features.channels", request.app.state.config.USER_PERMISSIONS
):
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail=ERROR_MESSAGES.UNAUTHORIZED,
)

channel = Channels.get_channel_by_id(id)
if not channel:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if channel.user_id != user.id and user.role != "admin":
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)

try:
deleted = Channels.remove_members_from_channel(channel.id, form_data.user_ids)

return deleted
except Exception as e:
log.exception(e)
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST, detail=ERROR_MESSAGES.DEFAULT()
)

############################
# UpdateChannelById
############################
@@ -192,14 +591,27 @@ async def get_channel_users_by_id(

@router.post("/{id}/update", response_model=Optional[ChannelModel])
async def update_channel_by_id(
id: str, form_data: ChannelForm, user=Depends(get_admin_user)
request: Request, id: str, form_data: ChannelForm, user=Depends(get_verified_user)
):
if user.role != "admin" and not has_permission(
user.id, "features.channels", request.app.state.config.USER_PERMISSIONS
):
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail=ERROR_MESSAGES.UNAUTHORIZED,
)

channel = Channels.get_channel_by_id(id)
if not channel:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if channel.user_id != user.id and user.role != "admin":
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)

try:
channel = Channels.update_channel_by_id(id, form_data)
return ChannelModel(**channel.model_dump())

@@ -216,13 +628,28 @@ async def update_channel_by_id(

@router.delete("/{id}/delete", response_model=bool)
async def delete_channel_by_id(id: str, user=Depends(get_admin_user)):
async def delete_channel_by_id(
request: Request, id: str, user=Depends(get_verified_user)
):
if user.role != "admin" and not has_permission(
user.id, "features.channels", request.app.state.config.USER_PERMISSIONS
):
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail=ERROR_MESSAGES.UNAUTHORIZED,
)

channel = Channels.get_channel_by_id(id)
if not channel:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if channel.user_id != user.id and user.role != "admin":
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)

try:
Channels.delete_channel_by_id(id)
return True

@@ -252,6 +679,12 @@ async def get_channel_messages(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if channel.type in ["group", "dm"]:
if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)
else:
if user.role != "admin" and not has_access(
user.id, type="read", access_control=channel.access_control
):

@@ -259,6 +692,10 @@ async def get_channel_messages(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)

channel_member = Channels.join_channel(
id, user.id
) # Ensure user is a member of the channel

message_list = Messages.get_messages_by_channel_id(id, skip, limit)
users = {}

@@ -288,6 +725,62 @@ async def get_channel_messages(
return messages

############################
# GetPinnedChannelMessages
############################

PAGE_ITEM_COUNT_PINNED = 20

@router.get("/{id}/messages/pinned", response_model=list[MessageWithReactionsResponse])
async def get_pinned_channel_messages(
id: str, page: int = 1, user=Depends(get_verified_user)
):
channel = Channels.get_channel_by_id(id)
if not channel:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if channel.type in ["group", "dm"]:
if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)
else:
if user.role != "admin" and not has_access(
user.id, type="read", access_control=channel.access_control
):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)

page = max(1, page)
skip = (page - 1) * PAGE_ITEM_COUNT_PINNED
limit = PAGE_ITEM_COUNT_PINNED

message_list = Messages.get_pinned_messages_by_channel_id(id, skip, limit)
users = {}

messages = []
for message in message_list:
if message.user_id not in users:
user = Users.get_user_by_id(message.user_id)
users[message.user_id] = user

messages.append(
MessageWithReactionsResponse(
**{
**message.model_dump(),
"reactions": Messages.get_reactions_by_message_id(message.id),
"user": UserNameResponse(**users[message.user_id].model_dump()),
}
)
)

return messages

############################
# PostNewMessage
############################
@@ -297,7 +790,9 @@ async def send_notification(name, webui_url, channel, message, active_user_ids):
users = get_users_with_access("read", channel.access_control)

for user in users:
if user.id not in active_user_ids:
if (user.id not in active_user_ids) and Channels.is_user_channel_member(
channel.id, user.id
):
if user.settings:
webhook_url = user.settings.ui.get("notifications", {}).get(
"webhook_url", None

@@ -501,6 +996,12 @@ async def new_message_handler(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if channel.type in ["group", "dm"]:
if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)
else:
if user.role != "admin" and not has_access(
user.id, type="write", access_control=channel.access_control, strict=False
):

@@ -511,13 +1012,21 @@ async def new_message_handler(
try:
message = Messages.insert_new_message(form_data, channel.id, user.id)
if message:
if channel.type in ["group", "dm"]:
members = Channels.get_members_by_channel_id(channel.id)
for member in members:
if not member.is_active:
Channels.update_member_active_status(
channel.id, member.user_id, True
)

message = Messages.get_message_by_id(message.id)
event_data = {
"channel_id": channel.id,
"message_id": message.id,
"data": {
"type": "message",
"data": message.model_dump(),
"data": {"temp_id": form_data.temp_id, **message.model_dump()},
},
"user": UserNameResponse(**user.model_dump()).model_dump(),
"channel": channel.model_dump(),

@@ -609,6 +1118,12 @@ async def get_channel_message(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if channel.type in ["group", "dm"]:
if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)
else:
if user.role != "admin" and not has_access(
user.id, type="read", access_control=channel.access_control
):

@@ -637,6 +1152,69 @@ async def get_channel_message(
)

############################
# PinChannelMessage
############################

class PinMessageForm(BaseModel):
is_pinned: bool

@router.post(
"/{id}/messages/{message_id}/pin", response_model=Optional[MessageUserResponse]
)
async def pin_channel_message(
id: str, message_id: str, form_data: PinMessageForm, user=Depends(get_verified_user)
):
channel = Channels.get_channel_by_id(id)
if not channel:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if channel.type in ["group", "dm"]:
if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)
else:
if user.role != "admin" and not has_access(
user.id, type="read", access_control=channel.access_control
):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)

message = Messages.get_message_by_id(message_id)
if not message:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if message.channel_id != id:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST, detail=ERROR_MESSAGES.DEFAULT()
)

try:
Messages.update_is_pinned_by_id(message_id, form_data.is_pinned, user.id)
message = Messages.get_message_by_id(message_id)
return MessageUserResponse(
**{
**message.model_dump(),
"user": UserNameResponse(
**Users.get_user_by_id(message.user_id).model_dump()
),
}
)
except Exception as e:
log.exception(e)
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST, detail=ERROR_MESSAGES.DEFAULT()
)

############################
# GetChannelThreadMessages
############################

@@ -658,6 +1236,12 @@ async def get_channel_thread_messages(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if channel.type in ["group", "dm"]:
if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)
else:
if user.role != "admin" and not has_access(
user.id, type="read", access_control=channel.access_control
):

@@ -717,10 +1301,18 @@ async def update_message_by_id(
status_code=status.HTTP_400_BAD_REQUEST, detail=ERROR_MESSAGES.DEFAULT()
)

if channel.type in ["group", "dm"]:
if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)
else:
if (
user.role != "admin"
and message.user_id != user.id
and not has_access(user.id, type="read", access_control=channel.access_control)
and not has_access(
user.id, type="read", access_control=channel.access_control
)
):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()

@@ -773,6 +1365,12 @@ async def add_reaction_to_message(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if channel.type in ["group", "dm"]:
if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)
else:
if user.role != "admin" and not has_access(
user.id, type="write", access_control=channel.access_control, strict=False
):

@@ -836,6 +1434,12 @@ async def remove_reaction_by_id_and_user_id_and_name(
status_code=status.HTTP_404_NOT_FOUND, detail=ERROR_MESSAGES.NOT_FOUND
)

if channel.type in ["group", "dm"]:
if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)
else:
if user.role != "admin" and not has_access(
user.id, type="write", access_control=channel.access_control, strict=False
):

@@ -913,11 +1517,20 @@ async def delete_message_by_id(
status_code=status.HTTP_400_BAD_REQUEST, detail=ERROR_MESSAGES.DEFAULT()
)

if channel.type in ["group", "dm"]:
if not Channels.is_user_channel_member(channel.id, user.id):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.DEFAULT()
)
else:
if (
user.role != "admin"
and message.user_id != user.id
and not has_access(
user.id, type="write", access_control=channel.access_control, strict=False
user.id,
type="write",
access_control=channel.access_control,
strict=False,
)
):
raise HTTPException(
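Note: the channel endpoints above repeat the same access gate many times. The helper below is a condensed sketch of that pattern, not project code; it assumes only the Channels model and has_access utility already imported in this diff.

from fastapi import HTTPException, status
from open_webui.models.channels import Channels
from open_webui.utils.access_control import has_access

def ensure_channel_read_access(channel, user) -> None:
    # group/dm channels require membership; other channels fall back to
    # access_control checks, with admins bypassing the check.
    if channel.type in ["group", "dm"]:
        if not Channels.is_user_channel_member(channel.id, user.id):
            raise HTTPException(status_code=status.HTTP_403_FORBIDDEN)
    elif user.role != "admin" and not has_access(
        user.id, type="read", access_control=channel.access_control
    ):
        raise HTTPException(status_code=status.HTTP_403_FORBIDDEN)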
@@ -22,6 +22,7 @@ from fastapi import (
)

from fastapi.responses import FileResponse, StreamingResponse

from open_webui.constants import ERROR_MESSAGES
from open_webui.env import SRC_LOG_LEVELS
from open_webui.retrieval.vector.factory import VECTOR_DB_CLIENT

@@ -34,12 +35,19 @@ from open_webui.models.files import (
Files,
)
from open_webui.models.knowledge import Knowledges
from open_webui.models.groups import Groups

from open_webui.routers.knowledge import get_knowledge, get_knowledge_list
from open_webui.routers.retrieval import ProcessFileForm, process_file
from open_webui.routers.audio import transcribe

from open_webui.storage.provider import Storage

from open_webui.utils.auth import get_admin_user, get_verified_user
from open_webui.utils.access_control import has_access

from pydantic import BaseModel

log = logging.getLogger(__name__)

@@ -53,31 +61,37 @@ router = APIRouter()
############################

# TODO: Optimize this function to use the knowledge_file table for faster lookups.
def has_access_to_file(
file_id: Optional[str], access_type: str, user=Depends(get_verified_user)
) -> bool:
file = Files.get_file_by_id(file_id)
log.debug(f"Checking if user has {access_type} access to file")

if not file:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail=ERROR_MESSAGES.NOT_FOUND,
)

has_access = False
knowledge_base_id = file.meta.get("collection_name") if file.meta else None
knowledge_bases = Knowledges.get_knowledges_by_file_id(file_id)
user_group_ids = {group.id for group in Groups.get_groups_by_member_id(user.id)}

for knowledge_base in knowledge_bases:
if knowledge_base.user_id == user.id or has_access(
user.id, access_type, knowledge_base.access_control, user_group_ids
):
return True

knowledge_base_id = file.meta.get("collection_name") if file.meta else None
if knowledge_base_id:
knowledge_bases = Knowledges.get_knowledge_bases_by_user_id(
user.id, access_type
)
for knowledge_base in knowledge_bases:
if knowledge_base.id == knowledge_base_id:
has_access = True
break
return True

return has_access
return False

############################
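Note: a hypothetical route illustrating how the rewritten has_access_to_file helper above might be used; the endpoint path and response shape are illustrative only and do not exist in this diff.

from fastapi import Depends, HTTPException, status

@router.get("/{id}/access")
async def check_file_read_access(id: str, user=Depends(get_verified_user)):
    # Ownership of any linked knowledge base, or a matching access_control
    # entry, grants access; otherwise the request is rejected.
    if user.role != "admin" and not has_access_to_file(id, "read", user):
        raise HTTPException(
            status_code=status.HTTP_403_FORBIDDEN,
            detail=ERROR_MESSAGES.ACCESS_PROHIBITED,
        )
    return {"file_id": id, "read_access": True}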
@@ -46,7 +46,23 @@ router = APIRouter()

@router.get("/", response_model=list[FolderNameIdResponse])
async def get_folders(user=Depends(get_verified_user)):
async def get_folders(request: Request, user=Depends(get_verified_user)):
if request.app.state.config.ENABLE_FOLDERS is False:
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN,
detail=ERROR_MESSAGES.ACCESS_PROHIBITED,
)

if user.role != "admin" and not has_permission(
user.id,
"features.folders",
request.app.state.config.USER_PERMISSIONS,
):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN,
detail=ERROR_MESSAGES.ACCESS_PROHIBITED,
)

folders = Folders.get_folders_by_user_id(user.id)

# Verify folder data integrity
@@ -3,7 +3,7 @@ from pathlib import Path
from typing import Optional
import logging

from open_webui.models.users import Users
from open_webui.models.users import Users, UserInfoResponse
from open_webui.models.groups import (
Groups,
GroupForm,

@@ -32,31 +32,17 @@ router = APIRouter()

@router.get("/", response_model=list[GroupResponse])
async def get_groups(share: Optional[bool] = None, user=Depends(get_verified_user)):
if user.role == "admin":
groups = Groups.get_groups()
else:
groups = Groups.get_groups_by_member_id(user.id)

group_list = []
filter = {}
if user.role != "admin":
filter["member_id"] = user.id

for group in groups:
if share is not None:
# Check if the group has data and a config with share key
if (
group.data
and "share" in group.data.get("config", {})
and group.data["config"]["share"] != share
):
continue
filter["share"] = share

group_list.append(
GroupResponse(
**group.model_dump(),
member_count=Groups.get_group_member_count_by_id(group.id),
)
)
groups = Groups.get_groups(filter=filter)

return group_list
return groups

############################

@@ -106,6 +92,50 @@ async def get_group_by_id(id: str, user=Depends(get_admin_user)):
)

############################
# ExportGroupById
############################

class GroupExportResponse(GroupResponse):
user_ids: list[str] = []
pass

@router.get("/id/{id}/export", response_model=Optional[GroupExportResponse])
async def export_group_by_id(id: str, user=Depends(get_admin_user)):
group = Groups.get_group_by_id(id)
if group:
return GroupExportResponse(
**group.model_dump(),
member_count=Groups.get_group_member_count_by_id(group.id),
user_ids=Groups.get_group_user_ids_by_id(group.id),
)
else:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail=ERROR_MESSAGES.NOT_FOUND,
)

############################
# GetUsersInGroupById
############################

@router.post("/id/{id}/users", response_model=list[UserInfoResponse])
async def get_users_in_group(id: str, user=Depends(get_admin_user)):
try:
users = Users.get_users_by_group_id(id)
return users
except Exception as e:
log.exception(f"Error adding users to group {id}: {e}")
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail=ERROR_MESSAGES.DEFAULT(e),
)

############################
# UpdateGroupById
############################
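Note: a small sketch of the filter-based group listing introduced above, assuming Groups.get_groups accepts the same filter keys shown in that hunk (member_id, share); the helper itself is illustrative and not part of the diff.

from open_webui.models.groups import Groups

def list_groups_for(user, share=None):
    # Non-admins only see groups they belong to; the optional share flag is
    # passed through as a filter instead of being checked per group.
    filter = {}
    if user.role != "admin":
        filter["member_id"] = user.id
    if share is not None:
        filter["share"] = share
    return Groups.get_groups(filter=filter)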
@ -42,97 +42,38 @@ router = APIRouter()
|
|||
|
||||
@router.get("/", response_model=list[KnowledgeUserResponse])
|
||||
async def get_knowledge(user=Depends(get_verified_user)):
|
||||
# Return knowledge bases with read access
|
||||
knowledge_bases = []
|
||||
|
||||
if user.role == "admin" and BYPASS_ADMIN_ACCESS_CONTROL:
|
||||
knowledge_bases = Knowledges.get_knowledge_bases()
|
||||
else:
|
||||
knowledge_bases = Knowledges.get_knowledge_bases_by_user_id(user.id, "read")
|
||||
|
||||
# Get files for each knowledge base
|
||||
knowledge_with_files = []
|
||||
for knowledge_base in knowledge_bases:
|
||||
files = []
|
||||
if knowledge_base.data:
|
||||
files = Files.get_file_metadatas_by_ids(
|
||||
knowledge_base.data.get("file_ids", [])
|
||||
)
|
||||
|
||||
# Check if all files exist
|
||||
if len(files) != len(knowledge_base.data.get("file_ids", [])):
|
||||
missing_files = list(
|
||||
set(knowledge_base.data.get("file_ids", []))
|
||||
- set([file.id for file in files])
|
||||
)
|
||||
if missing_files:
|
||||
data = knowledge_base.data or {}
|
||||
file_ids = data.get("file_ids", [])
|
||||
|
||||
for missing_file in missing_files:
|
||||
file_ids.remove(missing_file)
|
||||
|
||||
data["file_ids"] = file_ids
|
||||
Knowledges.update_knowledge_data_by_id(
|
||||
id=knowledge_base.id, data=data
|
||||
)
|
||||
|
||||
files = Files.get_file_metadatas_by_ids(file_ids)
|
||||
|
||||
knowledge_with_files.append(
|
||||
return [
|
||||
KnowledgeUserResponse(
|
||||
**knowledge_base.model_dump(),
|
||||
files=files,
|
||||
files=Knowledges.get_file_metadatas_by_id(knowledge_base.id),
|
||||
)
|
||||
)
|
||||
|
||||
return knowledge_with_files
|
||||
for knowledge_base in knowledge_bases
|
||||
]
|
||||
|
||||
|
||||
@router.get("/list", response_model=list[KnowledgeUserResponse])
|
||||
async def get_knowledge_list(user=Depends(get_verified_user)):
|
||||
# Return knowledge bases with write access
|
||||
knowledge_bases = []
|
||||
|
||||
if user.role == "admin" and BYPASS_ADMIN_ACCESS_CONTROL:
|
||||
knowledge_bases = Knowledges.get_knowledge_bases()
|
||||
else:
|
||||
knowledge_bases = Knowledges.get_knowledge_bases_by_user_id(user.id, "write")
|
||||
|
||||
# Get files for each knowledge base
|
||||
knowledge_with_files = []
|
||||
for knowledge_base in knowledge_bases:
|
||||
files = []
|
||||
if knowledge_base.data:
|
||||
files = Files.get_file_metadatas_by_ids(
|
||||
knowledge_base.data.get("file_ids", [])
|
||||
)
|
||||
|
||||
# Check if all files exist
|
||||
if len(files) != len(knowledge_base.data.get("file_ids", [])):
|
||||
missing_files = list(
|
||||
set(knowledge_base.data.get("file_ids", []))
|
||||
- set([file.id for file in files])
|
||||
)
|
||||
if missing_files:
|
||||
data = knowledge_base.data or {}
|
||||
file_ids = data.get("file_ids", [])
|
||||
|
||||
for missing_file in missing_files:
|
||||
file_ids.remove(missing_file)
|
||||
|
||||
data["file_ids"] = file_ids
|
||||
Knowledges.update_knowledge_data_by_id(
|
||||
id=knowledge_base.id, data=data
|
||||
)
|
||||
|
||||
files = Files.get_file_metadatas_by_ids(file_ids)
|
||||
|
||||
knowledge_with_files.append(
|
||||
return [
|
||||
KnowledgeUserResponse(
|
||||
**knowledge_base.model_dump(),
|
||||
files=files,
|
||||
files=Knowledges.get_file_metadatas_by_id(knowledge_base.id),
|
||||
)
|
||||
)
|
||||
return knowledge_with_files
|
||||
for knowledge_base in knowledge_bases
|
||||
]
|
||||
|
||||
|
||||
############################
|
||||
|
|
@@ -192,26 +133,9 @@ async def reindex_knowledge_files(request: Request, user=Depends(get_verified_us

    log.info(f"Starting reindexing for {len(knowledge_bases)} knowledge bases")

-    deleted_knowledge_bases = []
-
    for knowledge_base in knowledge_bases:
-        # -- Robust error handling for missing or invalid data
-        if not knowledge_base.data or not isinstance(knowledge_base.data, dict):
-            log.warning(
-                f"Knowledge base {knowledge_base.id} has no data or invalid data ({knowledge_base.data!r}). Deleting."
-            )
-            try:
-                Knowledges.delete_knowledge_by_id(id=knowledge_base.id)
-                deleted_knowledge_bases.append(knowledge_base.id)
-            except Exception as e:
-                log.error(
-                    f"Failed to delete invalid knowledge base {knowledge_base.id}: {e}"
-                )
-            continue
-
-        try:
-            file_ids = knowledge_base.data.get("file_ids", [])
-            files = Files.get_files_by_ids(file_ids)
+        files = Knowledges.get_files_by_id(knowledge_base.id)
        try:
            if VECTOR_DB_CLIENT.has_collection(collection_name=knowledge_base.id):
                VECTOR_DB_CLIENT.delete_collection(

@@ -251,9 +175,7 @@ async def reindex_knowledge_files(request: Request, user=Depends(get_verified_us
    for failed in failed_files:
        log.warning(f"File ID: {failed['file_id']}, Error: {failed['error']}")

-    log.info(
-        f"Reindexing completed. Deleted {len(deleted_knowledge_bases)} invalid knowledge bases: {deleted_knowledge_bases}"
-    )
+    log.info(f"Reindexing completed.")
    return True

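A minimal sketch of the per-knowledge-base reindex step after this change, using only the names that appear in the hunks above (Knowledges.get_files_by_id, VECTOR_DB_CLIENT.has_collection/delete_collection); the import paths and the re-embedding callback are assumptions, and error handling is reduced to collecting failures.

# Sketch only; import paths are assumed, not confirmed by this diff.
from open_webui.models.knowledge import Knowledges
from open_webui.retrieval.vector.factory import VECTOR_DB_CLIENT

def reindex_one(knowledge_base, process_file_fn):
    # Files now come from the knowledge/file relation instead of data["file_ids"]
    files = Knowledges.get_files_by_id(knowledge_base.id)

    # Drop the stale vector collection before re-adding content
    if VECTOR_DB_CLIENT.has_collection(collection_name=knowledge_base.id):
        VECTOR_DB_CLIENT.delete_collection(collection_name=knowledge_base.id)

    failed = []
    for file in files:
        try:
            process_file_fn(file)  # caller-supplied re-embedding step
        except Exception as e:
            failed.append({"file_id": file.id, "error": str(e)})
    return failed
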
@@ -271,19 +193,15 @@ async def get_knowledge_by_id(id: str, user=Depends(get_verified_user)):
    knowledge = Knowledges.get_knowledge_by_id(id=id)

    if knowledge:

        if (
            user.role == "admin"
            or knowledge.user_id == user.id
            or has_access(user.id, "read", knowledge.access_control)
        ):

-            file_ids = knowledge.data.get("file_ids", []) if knowledge.data else []
-            files = Files.get_file_metadatas_by_ids(file_ids)
-
            return KnowledgeFilesResponse(
                **knowledge.model_dump(),
-                files=files,
+                files=Knowledges.get_file_metadatas_by_id(knowledge.id),
            )
        else:
            raise HTTPException(

@@ -335,12 +253,9 @@ async def update_knowledge_by_id(

    knowledge = Knowledges.update_knowledge_by_id(id=id, form_data=form_data)
    if knowledge:
-        file_ids = knowledge.data.get("file_ids", []) if knowledge.data else []
-        files = Files.get_file_metadatas_by_ids(file_ids)
-
        return KnowledgeFilesResponse(
            **knowledge.model_dump(),
-            files=files,
+            files=Knowledges.get_file_metadatas_by_id(knowledge.id),
        )
    else:
        raise HTTPException(
|
||||
|
|
@ -366,7 +281,6 @@ def add_file_to_knowledge_by_id(
|
|||
user=Depends(get_verified_user),
|
||||
):
|
||||
knowledge = Knowledges.get_knowledge_by_id(id=id)
|
||||
|
||||
if not knowledge:
|
||||
raise HTTPException(
|
||||
status_code=status.HTTP_400_BAD_REQUEST,
|
||||
|
|
@ -395,6 +309,11 @@ def add_file_to_knowledge_by_id(
|
|||
detail=ERROR_MESSAGES.FILE_NOT_PROCESSED,
|
||||
)
|
||||
|
||||
# Add file to knowledge base
|
||||
Knowledges.add_file_to_knowledge_by_id(
|
||||
knowledge_id=id, file_id=form_data.file_id, user_id=user.id
|
||||
)
|
||||
|
||||
# Add content to the vector database
|
||||
try:
|
||||
process_file(
|
||||
|
|
@ -410,31 +329,9 @@ def add_file_to_knowledge_by_id(
|
|||
)
|
||||
|
||||
if knowledge:
|
||||
data = knowledge.data or {}
|
||||
file_ids = data.get("file_ids", [])
|
||||
|
||||
if form_data.file_id not in file_ids:
|
||||
file_ids.append(form_data.file_id)
|
||||
data["file_ids"] = file_ids
|
||||
|
||||
knowledge = Knowledges.update_knowledge_data_by_id(id=id, data=data)
|
||||
|
||||
if knowledge:
|
||||
files = Files.get_file_metadatas_by_ids(file_ids)
|
||||
|
||||
return KnowledgeFilesResponse(
|
||||
**knowledge.model_dump(),
|
||||
files=files,
|
||||
)
|
||||
else:
|
||||
raise HTTPException(
|
||||
status_code=status.HTTP_400_BAD_REQUEST,
|
||||
detail=ERROR_MESSAGES.DEFAULT("knowledge"),
|
||||
)
|
||||
else:
|
||||
raise HTTPException(
|
||||
status_code=status.HTTP_400_BAD_REQUEST,
|
||||
detail=ERROR_MESSAGES.DEFAULT("file_id"),
|
||||
files=Knowledges.get_file_metadatas_by_id(knowledge.id),
|
||||
)
|
||||
else:
|
||||
raise HTTPException(
|
||||
|
|
@ -494,14 +391,9 @@ def update_file_from_knowledge_by_id(
|
|||
)
|
||||
|
||||
if knowledge:
|
||||
data = knowledge.data or {}
|
||||
file_ids = data.get("file_ids", [])
|
||||
|
||||
files = Files.get_file_metadatas_by_ids(file_ids)
|
||||
|
||||
return KnowledgeFilesResponse(
|
||||
**knowledge.model_dump(),
|
||||
files=files,
|
||||
files=Knowledges.get_file_metadatas_by_id(knowledge.id),
|
||||
)
|
||||
else:
|
||||
raise HTTPException(
|
||||
|
|
@ -546,11 +438,19 @@ def remove_file_from_knowledge_by_id(
|
|||
detail=ERROR_MESSAGES.NOT_FOUND,
|
||||
)
|
||||
|
||||
Knowledges.remove_file_from_knowledge_by_id(
|
||||
knowledge_id=id, file_id=form_data.file_id
|
||||
)
|
||||
|
||||
# Remove content from the vector database
|
||||
try:
|
||||
VECTOR_DB_CLIENT.delete(
|
||||
collection_name=knowledge.id, filter={"file_id": form_data.file_id}
|
||||
)
|
||||
) # Remove by file_id first
|
||||
|
||||
VECTOR_DB_CLIENT.delete(
|
||||
collection_name=knowledge.id, filter={"hash": file.hash}
|
||||
) # Remove by hash as well in case of duplicates
|
||||
except Exception as e:
|
||||
log.debug("This was most likely caused by bypassing embedding processing")
|
||||
log.debug(e)
|
||||
|
|
@ -571,31 +471,9 @@ def remove_file_from_knowledge_by_id(
|
|||
Files.delete_file_by_id(form_data.file_id)
|
||||
|
||||
if knowledge:
|
||||
data = knowledge.data or {}
|
||||
file_ids = data.get("file_ids", [])
|
||||
|
||||
if form_data.file_id in file_ids:
|
||||
file_ids.remove(form_data.file_id)
|
||||
data["file_ids"] = file_ids
|
||||
|
||||
knowledge = Knowledges.update_knowledge_data_by_id(id=id, data=data)
|
||||
|
||||
if knowledge:
|
||||
files = Files.get_file_metadatas_by_ids(file_ids)
|
||||
|
||||
return KnowledgeFilesResponse(
|
||||
**knowledge.model_dump(),
|
||||
files=files,
|
||||
)
|
||||
else:
|
||||
raise HTTPException(
|
||||
status_code=status.HTTP_400_BAD_REQUEST,
|
||||
detail=ERROR_MESSAGES.DEFAULT("knowledge"),
|
||||
)
|
||||
else:
|
||||
raise HTTPException(
|
||||
status_code=status.HTTP_400_BAD_REQUEST,
|
||||
detail=ERROR_MESSAGES.DEFAULT("file_id"),
|
||||
files=Knowledges.get_file_metadatas_by_id(knowledge.id),
|
||||
)
|
||||
else:
|
||||
raise HTTPException(
|
||||
|
|
@ -697,8 +575,7 @@ async def reset_knowledge_by_id(id: str, user=Depends(get_verified_user)):
|
|||
log.debug(e)
|
||||
pass
|
||||
|
||||
knowledge = Knowledges.update_knowledge_data_by_id(id=id, data={"file_ids": []})
|
||||
|
||||
knowledge = Knowledges.reset_knowledge_by_id(id=id)
|
||||
return knowledge
|
||||
|
||||
|
||||
|
|
@ -708,7 +585,7 @@ async def reset_knowledge_by_id(id: str, user=Depends(get_verified_user)):
|
|||
|
||||
|
||||
@router.post("/{id}/files/batch/add", response_model=Optional[KnowledgeFilesResponse])
|
||||
def add_files_to_knowledge_batch(
|
||||
async def add_files_to_knowledge_batch(
|
||||
request: Request,
|
||||
id: str,
|
||||
form_data: list[KnowledgeFileIdForm],
|
||||
|
|
@ -748,7 +625,7 @@ def add_files_to_knowledge_batch(
|
|||
|
||||
# Process files
|
||||
try:
|
||||
result = process_files_batch(
|
||||
result = await process_files_batch(
|
||||
request=request,
|
||||
form_data=BatchProcessFilesForm(files=files, collection_name=id),
|
||||
user=user,
|
||||
|
|
@ -759,25 +636,19 @@ def add_files_to_knowledge_batch(
|
|||
)
|
||||
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(e))
|
||||
|
||||
# Add successful files to knowledge base
|
||||
data = knowledge.data or {}
|
||||
existing_file_ids = data.get("file_ids", [])
|
||||
|
||||
# Only add files that were successfully processed
|
||||
successful_file_ids = [r.file_id for r in result.results if r.status == "completed"]
|
||||
for file_id in successful_file_ids:
|
||||
if file_id not in existing_file_ids:
|
||||
existing_file_ids.append(file_id)
|
||||
|
||||
data["file_ids"] = existing_file_ids
|
||||
knowledge = Knowledges.update_knowledge_data_by_id(id=id, data=data)
|
||||
Knowledges.add_file_to_knowledge_by_id(
|
||||
knowledge_id=id, file_id=file_id, user_id=user.id
|
||||
)
|
||||
|
||||
# If there were any errors, include them in the response
|
||||
if result.errors:
|
||||
error_details = [f"{err.file_id}: {err.error}" for err in result.errors]
|
||||
return KnowledgeFilesResponse(
|
||||
**knowledge.model_dump(),
|
||||
files=Files.get_file_metadatas_by_ids(existing_file_ids),
|
||||
files=Knowledges.get_file_metadatas_by_id(knowledge.id),
|
||||
warnings={
|
||||
"message": "Some files failed to process",
|
||||
"errors": error_details,
|
||||
|
|
@ -786,5 +657,5 @@ def add_files_to_knowledge_batch(
|
|||
|
||||
return KnowledgeFilesResponse(
|
||||
**knowledge.model_dump(),
|
||||
files=Files.get_file_metadatas_by_ids(existing_file_ids),
|
||||
files=Knowledges.get_file_metadatas_by_id(knowledge.id),
|
||||
)
|
||||
|
|
|
|||
|
|
@ -5,6 +5,7 @@ import json
|
|||
import asyncio
|
||||
import logging
|
||||
|
||||
from open_webui.models.groups import Groups
|
||||
from open_webui.models.models import (
|
||||
ModelForm,
|
||||
ModelModel,
|
||||
|
|
@ -78,6 +79,10 @@ async def get_models(
|
|||
filter["direction"] = direction
|
||||
|
||||
if not user.role == "admin" or not BYPASS_ADMIN_ACCESS_CONTROL:
|
||||
groups = Groups.get_groups_by_member_id(user.id)
|
||||
if groups:
|
||||
filter["group_ids"] = [group.id for group in groups]
|
||||
|
||||
filter["user_id"] = user.id
|
||||
|
||||
return Models.search_models(user.id, filter=filter, skip=skip, limit=limit)
|
||||
|
|
|
|||
|
|
@ -879,6 +879,7 @@ async def delete_model(
|
|||
url = request.app.state.config.OLLAMA_BASE_URLS[url_idx]
|
||||
key = get_api_key(url_idx, url, request.app.state.config.OLLAMA_API_CONFIGS)
|
||||
|
||||
r = None
|
||||
try:
|
||||
headers = {
|
||||
"Content-Type": "application/json",
|
||||
|
|
@ -892,7 +893,7 @@ async def delete_model(
|
|||
method="DELETE",
|
||||
url=f"{url}/api/delete",
|
||||
headers=headers,
|
||||
data=form_data.model_dump_json(exclude_none=True).encode(),
|
||||
json=form_data,
|
||||
)
|
||||
r.raise_for_status()
|
||||
|
||||
|
|
@ -949,10 +950,7 @@ async def show_model_info(
|
|||
headers = include_user_info_headers(headers, user)
|
||||
|
||||
r = requests.request(
|
||||
method="POST",
|
||||
url=f"{url}/api/show",
|
||||
headers=headers,
|
||||
data=form_data.model_dump_json(exclude_none=True).encode(),
|
||||
method="POST", url=f"{url}/api/show", headers=headers, json=form_data
|
||||
)
|
||||
r.raise_for_status()
|
||||
|
||||
|
|
|
|||
|
|
@ -123,7 +123,7 @@ log.setLevel(SRC_LOG_LEVELS["RAG"])
|
|||
def get_ef(
|
||||
engine: str,
|
||||
embedding_model: str,
|
||||
auto_update: bool = False,
|
||||
auto_update: bool = RAG_EMBEDDING_MODEL_AUTO_UPDATE,
|
||||
):
|
||||
ef = None
|
||||
if embedding_model and engine == "":
|
||||
|
|
@ -148,7 +148,7 @@ def get_rf(
|
|||
reranking_model: Optional[str] = None,
|
||||
external_reranker_url: str = "",
|
||||
external_reranker_api_key: str = "",
|
||||
auto_update: bool = False,
|
||||
auto_update: bool = RAG_RERANKING_MODEL_AUTO_UPDATE,
|
||||
):
|
||||
rf = None
|
||||
if reranking_model:
|
||||
|
|
@ -468,6 +468,7 @@ async def get_rag_config(request: Request, user=Depends(get_admin_user)):
|
|||
"DOCLING_PARAMS": request.app.state.config.DOCLING_PARAMS,
|
||||
"DOCUMENT_INTELLIGENCE_ENDPOINT": request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT,
|
||||
"DOCUMENT_INTELLIGENCE_KEY": request.app.state.config.DOCUMENT_INTELLIGENCE_KEY,
|
||||
"DOCUMENT_INTELLIGENCE_MODEL": request.app.state.config.DOCUMENT_INTELLIGENCE_MODEL,
|
||||
"MISTRAL_OCR_API_BASE_URL": request.app.state.config.MISTRAL_OCR_API_BASE_URL,
|
||||
"MISTRAL_OCR_API_KEY": request.app.state.config.MISTRAL_OCR_API_KEY,
|
||||
# MinerU settings
|
||||
|
|
@ -647,6 +648,7 @@ class ConfigForm(BaseModel):
|
|||
DOCLING_PARAMS: Optional[dict] = None
|
||||
DOCUMENT_INTELLIGENCE_ENDPOINT: Optional[str] = None
|
||||
DOCUMENT_INTELLIGENCE_KEY: Optional[str] = None
|
||||
DOCUMENT_INTELLIGENCE_MODEL: Optional[str] = None
|
||||
MISTRAL_OCR_API_BASE_URL: Optional[str] = None
|
||||
MISTRAL_OCR_API_KEY: Optional[str] = None
|
||||
|
||||
|
|
@ -842,6 +844,11 @@ async def update_rag_config(
|
|||
if form_data.DOCUMENT_INTELLIGENCE_KEY is not None
|
||||
else request.app.state.config.DOCUMENT_INTELLIGENCE_KEY
|
||||
)
|
||||
request.app.state.config.DOCUMENT_INTELLIGENCE_MODEL = (
|
||||
form_data.DOCUMENT_INTELLIGENCE_MODEL
|
||||
if form_data.DOCUMENT_INTELLIGENCE_MODEL is not None
|
||||
else request.app.state.config.DOCUMENT_INTELLIGENCE_MODEL
|
||||
)
|
||||
|
||||
request.app.state.config.MISTRAL_OCR_API_BASE_URL = (
|
||||
form_data.MISTRAL_OCR_API_BASE_URL
|
||||
|
|
@ -927,7 +934,6 @@ async def update_rag_config(
|
|||
request.app.state.config.RAG_RERANKING_MODEL,
|
||||
request.app.state.config.RAG_EXTERNAL_RERANKER_URL,
|
||||
request.app.state.config.RAG_EXTERNAL_RERANKER_API_KEY,
|
||||
True,
|
||||
)
|
||||
|
||||
request.app.state.RERANKING_FUNCTION = get_reranking_function(
|
||||
|
|
@ -1132,6 +1138,7 @@ async def update_rag_config(
|
|||
"DOCLING_PARAMS": request.app.state.config.DOCLING_PARAMS,
|
||||
"DOCUMENT_INTELLIGENCE_ENDPOINT": request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT,
|
||||
"DOCUMENT_INTELLIGENCE_KEY": request.app.state.config.DOCUMENT_INTELLIGENCE_KEY,
|
||||
"DOCUMENT_INTELLIGENCE_MODEL": request.app.state.config.DOCUMENT_INTELLIGENCE_MODEL,
|
||||
"MISTRAL_OCR_API_BASE_URL": request.app.state.config.MISTRAL_OCR_API_BASE_URL,
|
||||
"MISTRAL_OCR_API_KEY": request.app.state.config.MISTRAL_OCR_API_KEY,
|
||||
# MinerU settings
|
||||
|
|
@ -1249,7 +1256,7 @@ def save_docs_to_vector_db(
|
|||
|
||||
return ", ".join(docs_info)
|
||||
|
||||
log.info(
|
||||
log.debug(
|
||||
f"save_docs_to_vector_db: document {_get_docs_info(docs)} {collection_name}"
|
||||
)
|
||||
|
||||
|
|
@ -1544,6 +1551,7 @@ def process_file(
|
|||
PDF_EXTRACT_IMAGES=request.app.state.config.PDF_EXTRACT_IMAGES,
|
||||
DOCUMENT_INTELLIGENCE_ENDPOINT=request.app.state.config.DOCUMENT_INTELLIGENCE_ENDPOINT,
|
||||
DOCUMENT_INTELLIGENCE_KEY=request.app.state.config.DOCUMENT_INTELLIGENCE_KEY,
|
||||
DOCUMENT_INTELLIGENCE_MODEL=request.app.state.config.DOCUMENT_INTELLIGENCE_MODEL,
|
||||
MISTRAL_OCR_API_BASE_URL=request.app.state.config.MISTRAL_OCR_API_BASE_URL,
|
||||
MISTRAL_OCR_API_KEY=request.app.state.config.MISTRAL_OCR_API_KEY,
|
||||
MINERU_API_MODE=request.app.state.config.MINERU_API_MODE,
|
||||
|
|
@ -1689,7 +1697,7 @@ async def process_text(
|
|||
log.debug(f"text_content: {text_content}")
|
||||
|
||||
result = await run_in_threadpool(
|
||||
save_docs_to_vector_db, request, docs, collection_name, user
|
||||
save_docs_to_vector_db, request, docs, collection_name, user=user
|
||||
)
|
||||
if result:
|
||||
return {
|
||||
|
|
@ -1721,7 +1729,12 @@ async def process_web(
|
|||
|
||||
if not request.app.state.config.BYPASS_WEB_SEARCH_EMBEDDING_AND_RETRIEVAL:
|
||||
await run_in_threadpool(
|
||||
save_docs_to_vector_db, request, docs, collection_name, True, user
|
||||
save_docs_to_vector_db,
|
||||
request,
|
||||
docs,
|
||||
collection_name,
|
||||
overwrite=True,
|
||||
user=user,
|
||||
)
|
||||
else:
|
||||
collection_name = None
|
||||
|
|
@ -2464,7 +2477,12 @@ async def process_files_batch(
|
|||
if all_docs:
|
||||
try:
|
||||
await run_in_threadpool(
|
||||
save_docs_to_vector_db, request, all_docs, collection_name, True, user
|
||||
save_docs_to_vector_db,
|
||||
request,
|
||||
all_docs,
|
||||
collection_name,
|
||||
add=True,
|
||||
user=user,
|
||||
)
|
||||
|
||||
# Update all files with collection name
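Both save_docs_to_vector_db call sites above now pass their flags by keyword, which run_in_threadpool simply forwards to the wrapped function. A minimal standalone illustration (the save_docs function is a stand-in, not the real implementation):

import asyncio
from starlette.concurrency import run_in_threadpool

def save_docs(docs, collection, overwrite=False, add=False, user=None):
    # Stand-in for the blocking save_docs_to_vector_db call
    return f"{len(docs)} docs -> {collection} (overwrite={overwrite}, add={add})"

async def main():
    # Keyword arguments avoid the bare positional True that was easy to misread
    result = await run_in_threadpool(save_docs, ["a", "b"], "kb-1", overwrite=True, user=None)
    print(result)

asyncio.run(main())
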
|
||||
|
|
|
|||
|
|
@ -719,7 +719,7 @@ async def get_groups(
|
|||
):
|
||||
"""List SCIM Groups"""
|
||||
# Get all groups
|
||||
groups_list = Groups.get_groups()
|
||||
groups_list = Groups.get_all_groups()
|
||||
|
||||
# Apply pagination
|
||||
total = len(groups_list)
|
||||
|
|
|
|||
|
|
@ -6,7 +6,7 @@ import io
|
|||
|
||||
from fastapi import APIRouter, Depends, HTTPException, Request, status
|
||||
from fastapi.responses import Response, StreamingResponse, FileResponse
|
||||
from pydantic import BaseModel
|
||||
from pydantic import BaseModel, ConfigDict
|
||||
|
||||
|
||||
from open_webui.models.auths import Auths
|
||||
|
|
@ -19,19 +19,14 @@ from open_webui.models.users import (
|
|||
UserGroupIdsModel,
|
||||
UserGroupIdsListResponse,
|
||||
UserInfoListResponse,
|
||||
UserIdNameListResponse,
|
||||
UserInfoListResponse,
|
||||
UserRoleUpdateForm,
|
||||
UserStatus,
|
||||
Users,
|
||||
UserSettings,
|
||||
UserUpdateForm,
|
||||
)
|
||||
|
||||
|
||||
from open_webui.socket.main import (
|
||||
get_active_status_by_user_id,
|
||||
get_active_user_ids,
|
||||
get_user_active_status,
|
||||
)
|
||||
from open_webui.constants import ERROR_MESSAGES
|
||||
from open_webui.env import SRC_LOG_LEVELS, STATIC_DIR
|
||||
|
||||
|
|
@ -51,23 +46,6 @@ log.setLevel(SRC_LOG_LEVELS["MODELS"])
|
|||
router = APIRouter()
|
||||
|
||||
|
||||
############################
|
||||
# GetActiveUsers
|
||||
############################
|
||||
|
||||
|
||||
@router.get("/active")
|
||||
async def get_active_users(
|
||||
user=Depends(get_verified_user),
|
||||
):
|
||||
"""
|
||||
Get a list of active users.
|
||||
"""
|
||||
return {
|
||||
"user_ids": get_active_user_ids(),
|
||||
}
|
||||
|
||||
|
||||
############################
|
||||
# GetUsers
|
||||
############################
|
||||
|
|
@ -125,20 +103,31 @@ async def get_all_users(
|
|||
return Users.get_users()
|
||||
|
||||
|
||||
@router.get("/search", response_model=UserIdNameListResponse)
|
||||
@router.get("/search", response_model=UserInfoListResponse)
|
||||
async def search_users(
|
||||
query: Optional[str] = None,
|
||||
order_by: Optional[str] = None,
|
||||
direction: Optional[str] = None,
|
||||
page: Optional[int] = 1,
|
||||
user=Depends(get_verified_user),
|
||||
):
|
||||
limit = PAGE_ITEM_COUNT
|
||||
|
||||
page = 1 # Always return the first page for search
|
||||
page = max(1, page)
|
||||
skip = (page - 1) * limit
|
||||
|
||||
filter = {}
|
||||
if query:
|
||||
filter["query"] = query
|
||||
|
||||
filter = {}
|
||||
if query:
|
||||
filter["query"] = query
|
||||
if order_by:
|
||||
filter["order_by"] = order_by
|
||||
if direction:
|
||||
filter["direction"] = direction
|
||||
|
||||
return Users.get_users(filter=filter, skip=skip, limit=limit)
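Page numbers map to an offset the usual way; with the clamp above, page 0 or a negative page falls back to the first page. A tiny illustration of the math (the item count here is a placeholder, the real value comes from configuration):

PAGE_ITEM_COUNT = 30  # illustrative value only

def page_to_skip(page: int, limit: int = PAGE_ITEM_COUNT) -> int:
    page = max(1, page)        # clamp, as in the handler above
    return (page - 1) * limit

print(page_to_skip(1))   # 0
print(page_to_skip(3))   # 60
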
|
||||
|
||||
|
||||
|
|
@ -219,11 +208,14 @@ class ChatPermissions(BaseModel):
|
|||
|
||||
class FeaturesPermissions(BaseModel):
|
||||
api_keys: bool = False
|
||||
notes: bool = True
|
||||
channels: bool = True
|
||||
folders: bool = True
|
||||
direct_tool_servers: bool = False
|
||||
|
||||
web_search: bool = True
|
||||
image_generation: bool = True
|
||||
code_interpreter: bool = True
|
||||
notes: bool = True
|
||||
|
||||
|
||||
class UserPermissions(BaseModel):
|
||||
|
|
@ -308,6 +300,43 @@ async def update_user_settings_by_session_user(
|
|||
)
|
||||
|
||||
|
||||
############################
|
||||
# GetUserStatusBySessionUser
|
||||
############################
|
||||
|
||||
|
||||
@router.get("/user/status")
|
||||
async def get_user_status_by_session_user(user=Depends(get_verified_user)):
|
||||
user = Users.get_user_by_id(user.id)
|
||||
if user:
|
||||
return user
|
||||
else:
|
||||
raise HTTPException(
|
||||
status_code=status.HTTP_400_BAD_REQUEST,
|
||||
detail=ERROR_MESSAGES.USER_NOT_FOUND,
|
||||
)
|
||||
|
||||
|
||||
############################
|
||||
# UpdateUserStatusBySessionUser
|
||||
############################
|
||||
|
||||
|
||||
@router.post("/user/status/update")
|
||||
async def update_user_status_by_session_user(
|
||||
form_data: UserStatus, user=Depends(get_verified_user)
|
||||
):
|
||||
user = Users.get_user_by_id(user.id)
|
||||
if user:
|
||||
user = Users.update_user_status_by_id(user.id, form_data)
|
||||
return user
|
||||
else:
|
||||
raise HTTPException(
|
||||
status_code=status.HTTP_400_BAD_REQUEST,
|
||||
detail=ERROR_MESSAGES.USER_NOT_FOUND,
|
||||
)
|
||||
|
||||
|
||||
############################
|
||||
# GetUserInfoBySessionUser
|
||||
############################
|
||||
|
|
@ -359,13 +388,15 @@ async def update_user_info_by_session_user(
|
|||
############################
|
||||
|
||||
|
||||
class UserResponse(BaseModel):
|
||||
class UserActiveResponse(UserStatus):
|
||||
name: str
|
||||
profile_image_url: str
|
||||
active: Optional[bool] = None
|
||||
profile_image_url: Optional[str] = None
|
||||
|
||||
is_active: bool
|
||||
model_config = ConfigDict(extra="allow")
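With extra="allow" the response model keeps whatever additional fields are spread in from user.model_dump() instead of silently dropping them. A minimal pydantic v2 illustration (the class name and fields here are a sketch, not the real model):

from typing import Optional
from pydantic import BaseModel, ConfigDict

class UserActiveSketch(BaseModel):
    # Unknown fields are kept on the instance instead of being discarded
    model_config = ConfigDict(extra="allow")
    name: str
    profile_image_url: Optional[str] = None
    is_active: bool = False

u = UserActiveSketch(name="Ada", is_active=True, role="admin")  # "role" is an extra field
print(u.model_dump())  # includes "role": "admin"
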
|
||||
|
||||
|
||||
@router.get("/{user_id}", response_model=UserResponse)
|
||||
@router.get("/{user_id}", response_model=UserActiveResponse)
|
||||
async def get_user_by_id(user_id: str, user=Depends(get_verified_user)):
|
||||
# Check if user_id is a shared chat
|
||||
# If it is, get the user_id from the chat
|
||||
|
|
@ -383,11 +414,10 @@ async def get_user_by_id(user_id: str, user=Depends(get_verified_user)):
|
|||
user = Users.get_user_by_id(user_id)
|
||||
|
||||
if user:
|
||||
return UserResponse(
|
||||
return UserActiveResponse(
|
||||
**{
|
||||
"name": user.name,
|
||||
"profile_image_url": user.profile_image_url,
|
||||
"active": get_active_status_by_user_id(user_id),
|
||||
**user.model_dump(),
|
||||
"is_active": Users.is_user_active(user_id),
|
||||
}
|
||||
)
|
||||
else:
|
||||
|
|
@ -454,7 +484,7 @@ async def get_user_profile_image_by_id(user_id: str, user=Depends(get_verified_u
|
|||
@router.get("/{user_id}/active", response_model=dict)
|
||||
async def get_user_active_status_by_id(user_id: str, user=Depends(get_verified_user)):
|
||||
return {
|
||||
"active": get_user_active_status(user_id),
|
||||
"active": Users.is_user_active(user_id),
|
||||
}
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@ -118,14 +118,16 @@ if WEBSOCKET_MANAGER == "redis":
|
|||
redis_sentinels = get_sentinels_from_env(
|
||||
WEBSOCKET_SENTINEL_HOSTS, WEBSOCKET_SENTINEL_PORT
|
||||
)
|
||||
SESSION_POOL = RedisDict(
|
||||
f"{REDIS_KEY_PREFIX}:session_pool",
|
||||
|
||||
MODELS = RedisDict(
|
||||
f"{REDIS_KEY_PREFIX}:models",
|
||||
redis_url=WEBSOCKET_REDIS_URL,
|
||||
redis_sentinels=redis_sentinels,
|
||||
redis_cluster=WEBSOCKET_REDIS_CLUSTER,
|
||||
)
|
||||
USER_POOL = RedisDict(
|
||||
f"{REDIS_KEY_PREFIX}:user_pool",
|
||||
|
||||
SESSION_POOL = RedisDict(
|
||||
f"{REDIS_KEY_PREFIX}:session_pool",
|
||||
redis_url=WEBSOCKET_REDIS_URL,
|
||||
redis_sentinels=redis_sentinels,
|
||||
redis_cluster=WEBSOCKET_REDIS_CLUSTER,
|
||||
|
|
@ -148,8 +150,9 @@ if WEBSOCKET_MANAGER == "redis":
|
|||
renew_func = clean_up_lock.renew_lock
|
||||
release_func = clean_up_lock.release_lock
|
||||
else:
|
||||
MODELS = {}
|
||||
|
||||
SESSION_POOL = {}
|
||||
USER_POOL = {}
|
||||
USAGE_POOL = {}
|
||||
|
||||
aquire_func = release_func = renew_func = lambda: True
|
||||
|
|
@ -225,16 +228,6 @@ def get_models_in_use():
|
|||
return models_in_use
|
||||
|
||||
|
||||
def get_active_user_ids():
|
||||
"""Get the list of active user IDs."""
|
||||
return list(USER_POOL.keys())
|
||||
|
||||
|
||||
def get_user_active_status(user_id):
|
||||
"""Check if a user is currently active."""
|
||||
return user_id in USER_POOL
|
||||
|
||||
|
||||
def get_user_id_from_session_pool(sid):
|
||||
user = SESSION_POOL.get(sid)
|
||||
if user:
|
||||
|
|
@ -260,10 +253,36 @@ def get_user_ids_from_room(room):
|
|||
return active_user_ids
|
||||
|
||||
|
||||
def get_active_status_by_user_id(user_id):
|
||||
if user_id in USER_POOL:
|
||||
return True
|
||||
return False
|
||||
async def emit_to_users(event: str, data: dict, user_ids: list[str]):
|
||||
"""
|
||||
Send a message to specific users using their user:{id} rooms.
|
||||
|
||||
Args:
|
||||
event (str): The event name to emit.
|
||||
data (dict): The payload/data to send.
|
||||
user_ids (list[str]): The target users' IDs.
|
||||
"""
|
||||
try:
|
||||
for user_id in user_ids:
|
||||
await sio.emit(event, data, room=f"user:{user_id}")
|
||||
except Exception as e:
|
||||
log.debug(f"Failed to emit event {event} to users {user_ids}: {e}")
|
||||
|
||||
|
||||
async def enter_room_for_users(room: str, user_ids: list[str]):
|
||||
"""
|
||||
Make all sessions of a user join a specific room.
|
||||
Args:
|
||||
room (str): The room to join.
|
||||
user_ids (list[str]): The target user's IDs.
|
||||
"""
|
||||
try:
|
||||
for user_id in user_ids:
|
||||
session_ids = get_session_ids_from_room(f"user:{user_id}")
|
||||
for sid in session_ids:
|
||||
await sio.enter_room(sid, room)
|
||||
except Exception as e:
|
||||
log.debug(f"Failed to make users {user_ids} join room {room}: {e}")
|
||||
|
||||
|
||||
@sio.on("usage")
|
||||
|
|
@ -293,11 +312,6 @@ async def connect(sid, environ, auth):
|
|||
SESSION_POOL[sid] = user.model_dump(
|
||||
exclude=["date_of_birth", "bio", "gender"]
|
||||
)
|
||||
if user.id in USER_POOL:
|
||||
USER_POOL[user.id] = USER_POOL[user.id] + [sid]
|
||||
else:
|
||||
USER_POOL[user.id] = [sid]
|
||||
|
||||
await sio.enter_room(sid, f"user:{user.id}")
|
||||
|
||||
|
||||
|
|
@ -316,21 +330,34 @@ async def user_join(sid, data):
|
|||
if not user:
|
||||
return
|
||||
|
||||
SESSION_POOL[sid] = user.model_dump(exclude=["date_of_birth", "bio", "gender"])
|
||||
if user.id in USER_POOL:
|
||||
USER_POOL[user.id] = USER_POOL[user.id] + [sid]
|
||||
else:
|
||||
USER_POOL[user.id] = [sid]
|
||||
SESSION_POOL[sid] = user.model_dump(
|
||||
exclude=[
|
||||
"profile_image_url",
|
||||
"profile_banner_image_url",
|
||||
"date_of_birth",
|
||||
"bio",
|
||||
"gender",
|
||||
]
|
||||
)
|
||||
|
||||
await sio.enter_room(sid, f"user:{user.id}")
|
||||
|
||||
# Join all the channels
|
||||
channels = Channels.get_channels_by_user_id(user.id)
|
||||
log.debug(f"{channels=}")
|
||||
for channel in channels:
|
||||
await sio.enter_room(sid, f"channel:{channel.id}")
|
||||
|
||||
return {"id": user.id, "name": user.name}
|
||||
|
||||
|
||||
@sio.on("heartbeat")
|
||||
async def heartbeat(sid, data):
|
||||
user = SESSION_POOL.get(sid)
|
||||
if user:
|
||||
Users.update_last_active_by_id(user["id"])
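On the client side this only needs a periodic emit on the same socket; a rough python-socketio sketch, where the server URL and socketio_path are placeholders for a concrete deployment.

import asyncio
import socketio

async def main():
    sio = socketio.AsyncClient()
    # URL and socketio_path are placeholders, adjust for your deployment
    await sio.connect("http://localhost:8080", socketio_path="/ws/socket.io")
    while True:
        await sio.emit("heartbeat", {})  # server refreshes last_active for this session's user
        await asyncio.sleep(30)

asyncio.run(main())
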
|
||||
|
||||
|
||||
@sio.on("join-channels")
|
||||
async def join_channel(sid, data):
|
||||
auth = data["auth"] if "auth" in data else None
|
||||
|
|
@ -398,6 +425,11 @@ async def channel_events(sid, data):
|
|||
event_data = data["data"]
|
||||
event_type = event_data["type"]
|
||||
|
||||
user = SESSION_POOL.get(sid)
|
||||
|
||||
if not user:
|
||||
return
|
||||
|
||||
if event_type == "typing":
|
||||
await sio.emit(
|
||||
"events:channel",
|
||||
|
|
@ -405,10 +437,12 @@ async def channel_events(sid, data):
|
|||
"channel_id": data["channel_id"],
|
||||
"message_id": data.get("message_id", None),
|
||||
"data": event_data,
|
||||
"user": UserNameResponse(**SESSION_POOL[sid]).model_dump(),
|
||||
"user": UserNameResponse(**user).model_dump(),
|
||||
},
|
||||
room=room,
|
||||
)
|
||||
elif event_type == "last_read_at":
|
||||
Channels.update_member_last_read_at(data["channel_id"], user["id"])
|
||||
|
||||
|
||||
@sio.on("ydoc:document:join")
|
||||
|
|
@ -652,13 +686,6 @@ async def disconnect(sid):
|
|||
if sid in SESSION_POOL:
|
||||
user = SESSION_POOL[sid]
|
||||
del SESSION_POOL[sid]
|
||||
|
||||
user_id = user["id"]
|
||||
USER_POOL[user_id] = [_sid for _sid in USER_POOL[user_id] if _sid != sid]
|
||||
|
||||
if len(USER_POOL[user_id]) == 0:
|
||||
del USER_POOL[user_id]
|
||||
|
||||
await YDOC_MANAGER.remove_user_from_all_documents(sid)
|
||||
else:
|
||||
pass
|
||||
|
|
|
|||
|
|
@@ -86,6 +86,15 @@ class RedisDict:
    def items(self):
        return [(k, json.loads(v)) for k, v in self.redis.hgetall(self.name).items()]

+    def set(self, mapping: dict):
+        pipe = self.redis.pipeline()
+
+        pipe.delete(self.name)
+        if mapping:
+            pipe.hset(self.name, mapping={k: json.dumps(v) for k, v in mapping.items()})
+
+        pipe.execute()
+
    def get(self, key, default=None):
        try:
            return self[key]

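The new set() swaps the whole hash in one pipeline (delete + hset), which is what lets callers replace a shared mapping in a single round trip. A rough usage sketch, assuming the constructor keywords shown elsewhere in this diff; the key name and URL are placeholders.

from open_webui.socket.utils import RedisDict

# Placeholder connection settings for the sketch
models = RedisDict(
    "open-webui:models",  # key name, prefix assumed
    redis_url="redis://localhost:6379/0",
    redis_sentinels=[],
    redis_cluster=False,
)

# Replace the entire mapping at once instead of writing key by key
models.set({"gpt-4o": {"owned_by": "openai"}, "llama3": {"owned_by": "ollama"}})
print(models.get("llama3"))
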
@ -194,7 +194,7 @@ class AuditLoggingMiddleware:
|
|||
auth_header = request.headers.get("Authorization")
|
||||
|
||||
try:
|
||||
user = get_current_user(
|
||||
user = await get_current_user(
|
||||
request, None, None, get_http_authorization_cred(auth_header)
|
||||
)
|
||||
return user
|
||||
|
|
|
|||
|
|
@ -235,7 +235,7 @@ async def invalidate_token(request, token):
|
|||
jti = decoded.get("jti")
|
||||
exp = decoded.get("exp")
|
||||
|
||||
if jti:
|
||||
if jti and exp:
|
||||
ttl = exp - int(
|
||||
datetime.now(UTC).timestamp()
|
||||
) # Calculate time-to-live for the token
|
||||
|
|
@ -344,9 +344,7 @@ async def get_current_user(
|
|||
# Refresh the user's last active timestamp asynchronously
|
||||
# to prevent blocking the request
|
||||
if background_tasks:
|
||||
background_tasks.add_task(
|
||||
Users.update_user_last_active_by_id, user.id
|
||||
)
|
||||
background_tasks.add_task(Users.update_last_active_by_id, user.id)
|
||||
return user
|
||||
else:
|
||||
raise HTTPException(
|
||||
|
|
@ -397,8 +395,7 @@ def get_current_user_by_api_key(request, api_key: str):
|
|||
current_span.set_attribute("client.user.role", user.role)
|
||||
current_span.set_attribute("client.auth.type", "api_key")
|
||||
|
||||
Users.update_user_last_active_by_id(user.id)
|
||||
|
||||
Users.update_last_active_by_id(user.id)
|
||||
return user
|
||||
|
||||
|
||||
|
|
|
|||
backend/open_webui/utils/groups.py (new file, 24 lines)
@@ -0,0 +1,24 @@
|
|||
import logging
|
||||
from open_webui.models.groups import Groups
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def apply_default_group_assignment(
|
||||
default_group_id: str,
|
||||
user_id: str,
|
||||
) -> None:
|
||||
"""
|
||||
Apply default group assignment to a user if default_group_id is provided.
|
||||
|
||||
Args:
|
||||
default_group_id: ID of the default group to add the user to
|
||||
user_id: ID of the user to add to the default group
|
||||
"""
|
||||
if default_group_id:
|
||||
try:
|
||||
Groups.add_users_to_group(default_group_id, [user_id])
|
||||
except Exception as e:
|
||||
log.error(
|
||||
f"Failed to add user {user_id} to default group {default_group_id}: {e}"
|
||||
)
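Call sites just pass the configured default group id and the new user's id, mirroring the oauth.py hunk later in this diff; a short hedged sketch of such a call site:

from open_webui.utils.groups import apply_default_group_assignment

def on_signup(default_group_id: str, user_id: str) -> None:
    # No-op when default_group_id is empty; failures are logged, not raised
    apply_default_group_assignment(default_group_id, user_id)

on_signup("", "user-123")  # empty group id: nothing happens
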
|
||||
|
|
@ -32,7 +32,6 @@ from open_webui.models.users import Users
|
|||
from open_webui.socket.main import (
|
||||
get_event_call,
|
||||
get_event_emitter,
|
||||
get_active_status_by_user_id,
|
||||
)
|
||||
from open_webui.routers.tasks import (
|
||||
generate_queries,
|
||||
|
|
@ -459,12 +458,6 @@ async def chat_completion_tools_handler(
|
|||
}
|
||||
)
|
||||
|
||||
print(
|
||||
f"Tool {tool_function_name} result: {tool_result}",
|
||||
tool_result_files,
|
||||
tool_result_embeds,
|
||||
)
|
||||
|
||||
if tool_result:
|
||||
tool = tools[tool_function_name]
|
||||
tool_id = tool.get("tool_id", "")
|
||||
|
|
@ -492,12 +485,6 @@ async def chat_completion_tools_handler(
|
|||
}
|
||||
)
|
||||
|
||||
# Citation is not enabled for this tool
|
||||
body["messages"] = add_or_update_user_message(
|
||||
f"\nTool `{tool_name}` Output: {tool_result}",
|
||||
body["messages"],
|
||||
)
|
||||
|
||||
if (
|
||||
tools[tool_function_name]
|
||||
.get("metadata", {})
|
||||
|
|
@ -773,9 +760,12 @@ async def chat_image_generation_handler(
|
|||
if not chat_id:
|
||||
return form_data
|
||||
|
||||
chat = Chats.get_chat_by_id_and_user_id(chat_id, user.id)
|
||||
|
||||
__event_emitter__ = extra_params["__event_emitter__"]
|
||||
|
||||
if chat_id.startswith("local:"):
|
||||
message_list = form_data.get("messages", [])
|
||||
else:
|
||||
chat = Chats.get_chat_by_id_and_user_id(chat_id, user.id)
|
||||
await __event_emitter__(
|
||||
{
|
||||
"type": "status",
|
||||
|
|
@ -786,6 +776,7 @@ async def chat_image_generation_handler(
|
|||
messages_map = chat.chat.get("history", {}).get("messages", {})
|
||||
message_id = chat.chat.get("history", {}).get("currentId")
|
||||
message_list = get_message_list(messages_map, message_id)
|
||||
|
||||
user_message = get_last_user_message(message_list)
|
||||
|
||||
prompt = user_message
|
||||
|
|
@ -845,7 +836,7 @@ async def chat_image_generation_handler(
|
|||
}
|
||||
)
|
||||
|
||||
system_message_content = f"<context>Image generation was attempted but failed. The system is currently unable to generate the image. Tell the user that an error occurred: {error_message}</context>"
|
||||
system_message_content = f"<context>Image generation was attempted but failed. The system is currently unable to generate the image. Tell the user that the following error occurred: {error_message}</context>"
|
||||
|
||||
else:
|
||||
# Create image(s)
|
||||
|
|
@ -908,7 +899,7 @@ async def chat_image_generation_handler(
|
|||
}
|
||||
)
|
||||
|
||||
system_message_content = "<context>The requested image has been created and is now being shown to the user. Let them know that it has been generated.</context>"
|
||||
system_message_content = "<context>The requested image has been created by the system successfully and is now being shown to the user. Let the user know that the image they requested has been generated and is now shown in the chat.</context>"
|
||||
except Exception as e:
|
||||
log.debug(e)
|
||||
|
||||
|
|
@ -929,7 +920,7 @@ async def chat_image_generation_handler(
|
|||
}
|
||||
)
|
||||
|
||||
system_message_content = f"<context>Image generation was attempted but failed. The system is currently unable to generate the image. Tell the user that an error occurred: {error_message}</context>"
|
||||
system_message_content = f"<context>Image generation was attempted but failed because of an error. The system is currently unable to generate the image. Tell the user that the following error occurred: {error_message}</context>"
|
||||
|
||||
if system_message_content:
|
||||
form_data["messages"] = add_or_update_system_message(
|
||||
|
|
@ -1409,11 +1400,12 @@ async def process_chat_payload(request, form_data, user, metadata, model):
|
|||
headers=headers if headers else None,
|
||||
)
|
||||
|
||||
function_name_filter_list = (
|
||||
mcp_server_connection.get("config", {})
|
||||
.get("function_name_filter_list", "")
|
||||
.split(",")
|
||||
)
|
||||
function_name_filter_list = mcp_server_connection.get(
|
||||
"config", {}
|
||||
).get("function_name_filter_list", "")
|
||||
|
||||
if isinstance(function_name_filter_list, str):
|
||||
function_name_filter_list = function_name_filter_list.split(",")
|
||||
|
||||
tool_specs = await mcp_clients[server_id].list_tool_specs()
|
||||
for tool_spec in tool_specs:
|
||||
|
|
@ -1914,7 +1906,7 @@ async def process_chat_response(
|
|||
)
|
||||
|
||||
# Send a webhook notification if the user is not active
|
||||
if not get_active_status_by_user_id(user.id):
|
||||
if not Users.is_user_active(user.id):
|
||||
webhook_url = Users.get_user_webhook_url_by_id(user.id)
|
||||
if webhook_url:
|
||||
await post_webhook(
|
||||
|
|
@ -3209,7 +3201,7 @@ async def process_chat_response(
|
|||
)
|
||||
|
||||
# Send a webhook notification if the user is not active
|
||||
if not get_active_status_by_user_id(user.id):
|
||||
if not Users.is_user_active(user.id):
|
||||
webhook_url = Users.get_user_webhook_url_by_id(user.id)
|
||||
if webhook_url:
|
||||
await post_webhook(
|
||||
|
|
|
|||
|
|
@ -6,7 +6,7 @@ import uuid
|
|||
import logging
|
||||
from datetime import timedelta
|
||||
from pathlib import Path
|
||||
from typing import Callable, Optional
|
||||
from typing import Callable, Optional, Sequence, Union
|
||||
import json
|
||||
import aiohttp
|
||||
|
||||
|
|
@ -43,26 +43,28 @@ def get_allow_block_lists(filter_list):
|
|||
return allow_list, block_list
|
||||
|
||||
|
||||
def is_string_allowed(string: str, filter_list: Optional[list[str]] = None) -> bool:
|
||||
def is_string_allowed(
|
||||
string: Union[str, Sequence[str]], filter_list: Optional[list[str]] = None
|
||||
) -> bool:
|
||||
"""
|
||||
Checks if a string is allowed based on the provided filter list.
|
||||
:param string: The string to check (e.g., domain or hostname).
|
||||
:param string: The string or sequence of strings to check (e.g., domain or hostname).
|
||||
:param filter_list: List of allowed/blocked strings. Strings starting with "!" are blocked.
|
||||
:return: True if the string is allowed, False otherwise.
|
||||
:return: True if the string or sequence of strings is allowed, False otherwise.
|
||||
"""
|
||||
if not filter_list:
|
||||
return True
|
||||
|
||||
allow_list, block_list = get_allow_block_lists(filter_list)
|
||||
print(string, allow_list, block_list)
|
||||
strings = [string] if isinstance(string, str) else list(string)
|
||||
|
||||
# If allow list is non-empty, require domain to match one of them
|
||||
if allow_list:
|
||||
if not any(string.endswith(allowed) for allowed in allow_list):
|
||||
if not any(s.endswith(allowed) for s in strings for allowed in allow_list):
|
||||
return False
|
||||
|
||||
# Block list always removes matches
|
||||
if any(string.endswith(blocked) for blocked in block_list):
|
||||
if any(s.endswith(blocked) for s in strings for blocked in block_list):
|
||||
return False
|
||||
|
||||
return True
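For reference, the allow/block semantics the widened signature keeps: entries prefixed with "!" block, everything else allows, and a non-empty allow list must match at least one of the given strings. A simplified, self-contained restatement of the rule (not the module itself):

def allowed(hosts: list[str], filter_list: list[str]) -> bool:
    # Simplified restatement of is_string_allowed, for illustration only
    allow = [f for f in filter_list if not f.startswith("!")]
    block = [f[1:] for f in filter_list if f.startswith("!")]
    if allow and not any(h.endswith(a) for h in hosts for a in allow):
        return False
    return not any(h.endswith(b) for h in hosts for b in block)

print(allowed(["api.example.com"], ["example.com", "!internal.example.com"]))             # True
print(allowed(["db.internal.example.com"], ["example.com", "!internal.example.com"]))     # False
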
|
||||
|
|
|
|||
|
|
@ -6,6 +6,7 @@ import sys
|
|||
from aiocache import cached
|
||||
from fastapi import Request
|
||||
|
||||
from open_webui.socket.utils import RedisDict
|
||||
from open_webui.routers import openai, ollama
|
||||
from open_webui.functions import get_function_models
|
||||
|
||||
|
|
@ -190,6 +191,8 @@ async def get_all_models(request, refresh: bool = False, user: UserModel = None)
|
|||
):
|
||||
# Custom model based on a base model
|
||||
owned_by = "openai"
|
||||
connection_type = None
|
||||
|
||||
pipe = None
|
||||
|
||||
for m in models:
|
||||
|
|
@ -200,6 +203,8 @@ async def get_all_models(request, refresh: bool = False, user: UserModel = None)
|
|||
owned_by = m.get("owned_by", "unknown")
|
||||
if "pipe" in m:
|
||||
pipe = m["pipe"]
|
||||
|
||||
connection_type = m.get("connection_type", None)
|
||||
break
|
||||
|
||||
model = {
|
||||
|
|
@ -208,6 +213,7 @@ async def get_all_models(request, refresh: bool = False, user: UserModel = None)
|
|||
"object": "model",
|
||||
"created": custom_model.created_at,
|
||||
"owned_by": owned_by,
|
||||
"connection_type": connection_type,
|
||||
"preset": True,
|
||||
**({"pipe": pipe} if pipe is not None else {}),
|
||||
}
|
||||
|
|
@ -323,7 +329,12 @@ async def get_all_models(request, refresh: bool = False, user: UserModel = None)
|
|||
|
||||
log.debug(f"get_all_models() returned {len(models)} models")
|
||||
|
||||
request.app.state.MODELS = {model["id"]: model for model in models}
|
||||
models_dict = {model["id"]: model for model in models}
|
||||
if isinstance(request.app.state.MODELS, RedisDict):
|
||||
request.app.state.MODELS.set(models_dict)
|
||||
else:
|
||||
request.app.state.MODELS = models_dict
|
||||
|
||||
return models
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@ -43,6 +43,7 @@ from open_webui.config import (
|
|||
ENABLE_OAUTH_GROUP_CREATION,
|
||||
OAUTH_BLOCKED_GROUPS,
|
||||
OAUTH_GROUPS_SEPARATOR,
|
||||
OAUTH_ROLES_SEPARATOR,
|
||||
OAUTH_ROLES_CLAIM,
|
||||
OAUTH_SUB_CLAIM,
|
||||
OAUTH_GROUPS_CLAIM,
|
||||
|
|
@ -71,6 +72,7 @@ from open_webui.env import (
|
|||
from open_webui.utils.misc import parse_duration
|
||||
from open_webui.utils.auth import get_password_hash, create_token
|
||||
from open_webui.utils.webhook import post_webhook
|
||||
from open_webui.utils.groups import apply_default_group_assignment
|
||||
|
||||
from mcp.shared.auth import (
|
||||
OAuthClientMetadata as MCPOAuthClientMetadata,
|
||||
|
|
@ -1032,7 +1034,13 @@ class OAuthManager:
|
|||
|
||||
if isinstance(claim_data, list):
|
||||
oauth_roles = claim_data
|
||||
if isinstance(claim_data, str) or isinstance(claim_data, int):
|
||||
elif isinstance(claim_data, str):
|
||||
# Split by the configured separator if present
|
||||
if OAUTH_ROLES_SEPARATOR and OAUTH_ROLES_SEPARATOR in claim_data:
|
||||
oauth_roles = claim_data.split(OAUTH_ROLES_SEPARATOR)
|
||||
else:
|
||||
oauth_roles = [claim_data]
|
||||
elif isinstance(claim_data, int):
|
||||
oauth_roles = [str(claim_data)]
|
||||
|
||||
log.debug(f"Oauth Roles claim: {oauth_claim}")
|
||||
|
|
@ -1095,7 +1103,7 @@ class OAuthManager:
|
|||
user_oauth_groups = []
|
||||
|
||||
user_current_groups: list[GroupModel] = Groups.get_groups_by_member_id(user.id)
|
||||
all_available_groups: list[GroupModel] = Groups.get_groups()
|
||||
all_available_groups: list[GroupModel] = Groups.get_all_groups()
|
||||
|
||||
# Create groups if they don't exist and creation is enabled
|
||||
if auth_manager_config.ENABLE_OAUTH_GROUP_CREATION:
|
||||
|
|
@ -1139,7 +1147,7 @@ class OAuthManager:
|
|||
|
||||
# Refresh the list of all available groups if any were created
|
||||
if groups_created:
|
||||
all_available_groups = Groups.get_groups()
|
||||
all_available_groups = Groups.get_all_groups()
|
||||
log.debug("Refreshed list of all available groups after creation.")
|
||||
|
||||
log.debug(f"Oauth Groups claim: {oauth_claim}")
|
||||
|
|
@ -1160,7 +1168,6 @@ class OAuthManager:
|
|||
log.debug(
|
||||
f"Removing user from group {group_model.name} as it is no longer in their oauth groups"
|
||||
)
|
||||
|
||||
Groups.remove_users_from_group(group_model.id, [user.id])
|
||||
|
||||
# In case a group is created, but perms are never assigned to the group by hitting "save"
|
||||
|
|
@ -1322,7 +1329,10 @@ class OAuthManager:
|
|||
log.warning(f"OAuth callback failed, sub is missing: {user_data}")
|
||||
raise HTTPException(400, detail=ERROR_MESSAGES.INVALID_CRED)
|
||||
|
||||
provider_sub = f"{provider}@{sub}"
|
||||
oauth_data = {}
|
||||
oauth_data[provider] = {
|
||||
"sub": sub,
|
||||
}
|
||||
|
||||
# Email extraction
|
||||
email_claim = auth_manager_config.OAUTH_EMAIL_CLAIM
|
||||
|
|
@ -1369,12 +1379,12 @@ class OAuthManager:
|
|||
log.warning(f"Error fetching GitHub email: {e}")
|
||||
raise HTTPException(400, detail=ERROR_MESSAGES.INVALID_CRED)
|
||||
elif ENABLE_OAUTH_EMAIL_FALLBACK:
|
||||
email = f"{provider_sub}.local"
|
||||
email = f"{provider}@{sub}.local"
|
||||
else:
|
||||
log.warning(f"OAuth callback failed, email is missing: {user_data}")
|
||||
raise HTTPException(400, detail=ERROR_MESSAGES.INVALID_CRED)
|
||||
email = email.lower()
|
||||
|
||||
email = email.lower()
|
||||
# If allowed domains are configured, check if the email domain is in the list
|
||||
if (
|
||||
"*" not in auth_manager_config.OAUTH_ALLOWED_DOMAINS
|
||||
|
|
@ -1387,7 +1397,7 @@ class OAuthManager:
|
|||
raise HTTPException(400, detail=ERROR_MESSAGES.INVALID_CRED)
|
||||
|
||||
# Check if the user exists
|
||||
user = Users.get_user_by_oauth_sub(provider_sub)
|
||||
user = Users.get_user_by_oauth_sub(provider, sub)
|
||||
if not user:
|
||||
# If the user does not exist, check if merging is enabled
|
||||
if auth_manager_config.OAUTH_MERGE_ACCOUNTS_BY_EMAIL:
|
||||
|
|
@ -1395,12 +1405,15 @@ class OAuthManager:
|
|||
user = Users.get_user_by_email(email)
|
||||
if user:
|
||||
# Update the user with the new oauth sub
|
||||
Users.update_user_oauth_sub_by_id(user.id, provider_sub)
|
||||
Users.update_user_oauth_by_id(user.id, provider, sub)
|
||||
|
||||
if user:
|
||||
determined_role = self.get_user_role(user, user_data)
|
||||
if user.role != determined_role:
|
||||
Users.update_user_role_by_id(user.id, determined_role)
|
||||
# Update the user object in memory as well,
|
||||
# to avoid problems with the ENABLE_OAUTH_GROUP_MANAGEMENT check below
|
||||
user.role = determined_role
|
||||
# Update profile picture if enabled and different from current
|
||||
if auth_manager_config.OAUTH_UPDATE_PICTURE_ON_LOGIN:
|
||||
picture_claim = auth_manager_config.OAUTH_PICTURE_CLAIM
|
||||
|
|
@ -1451,7 +1464,7 @@ class OAuthManager:
|
|||
name=name,
|
||||
profile_image_url=picture_url,
|
||||
role=self.get_user_role(None, user_data),
|
||||
oauth_sub=provider_sub,
|
||||
oauth=oauth_data,
|
||||
)
|
||||
|
||||
if auth_manager_config.WEBHOOK_URL:
|
||||
|
|
@ -1465,6 +1478,12 @@ class OAuthManager:
|
|||
"user": user.model_dump_json(exclude_none=True),
|
||||
},
|
||||
)
|
||||
|
||||
apply_default_group_assignment(
|
||||
request.app.state.config.DEFAULT_GROUP_ID,
|
||||
user.id,
|
||||
)
|
||||
|
||||
else:
|
||||
raise HTTPException(
|
||||
status.HTTP_403_FORBIDDEN,
|
||||
|
|
|
|||
backend/open_webui/utils/rate_limit.py (new file, 139 lines)
@@ -0,0 +1,139 @@
|
|||
import time
|
||||
from typing import Optional, Dict
|
||||
from open_webui.env import REDIS_KEY_PREFIX
|
||||
|
||||
|
||||
class RateLimiter:
|
||||
"""
|
||||
General-purpose rate limiter using Redis with a rolling window strategy.
|
||||
Falls back to in-memory storage if Redis is not available.
|
||||
"""
|
||||
|
||||
# In-memory fallback storage
|
||||
_memory_store: Dict[str, Dict[int, int]] = {}
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
redis_client,
|
||||
limit: int,
|
||||
window: int,
|
||||
bucket_size: int = 60,
|
||||
enabled: bool = True,
|
||||
):
|
||||
"""
|
||||
:param redis_client: Redis client instance or None
|
||||
:param limit: Max allowed events in the window
|
||||
:param window: Time window in seconds
|
||||
:param bucket_size: Bucket resolution
|
||||
:param enabled: Turn on/off rate limiting globally
|
||||
"""
|
||||
self.r = redis_client
|
||||
self.limit = limit
|
||||
self.window = window
|
||||
self.bucket_size = bucket_size
|
||||
self.num_buckets = window // bucket_size
|
||||
self.enabled = enabled
|
||||
|
||||
def _bucket_key(self, key: str, bucket_index: int) -> str:
|
||||
return f"{REDIS_KEY_PREFIX}:ratelimit:{key.lower()}:{bucket_index}"
|
||||
|
||||
def _current_bucket(self) -> int:
|
||||
return int(time.time()) // self.bucket_size
|
||||
|
||||
def _redis_available(self) -> bool:
|
||||
return self.r is not None
|
||||
|
||||
def is_limited(self, key: str) -> bool:
|
||||
"""
|
||||
Main rate-limit check.
|
||||
Gracefully handles missing or failing Redis.
|
||||
"""
|
||||
if not self.enabled:
|
||||
return False
|
||||
|
||||
if self._redis_available():
|
||||
try:
|
||||
return self._is_limited_redis(key)
|
||||
except Exception:
|
||||
return self._is_limited_memory(key)
|
||||
else:
|
||||
return self._is_limited_memory(key)
|
||||
|
||||
def get_count(self, key: str) -> int:
|
||||
if not self.enabled:
|
||||
return 0
|
||||
|
||||
if self._redis_available():
|
||||
try:
|
||||
return self._get_count_redis(key)
|
||||
except Exception:
|
||||
return self._get_count_memory(key)
|
||||
else:
|
||||
return self._get_count_memory(key)
|
||||
|
||||
def remaining(self, key: str) -> int:
|
||||
used = self.get_count(key)
|
||||
return max(0, self.limit - used)
|
||||
|
||||
def _is_limited_redis(self, key: str) -> bool:
|
||||
now_bucket = self._current_bucket()
|
||||
bucket_key = self._bucket_key(key, now_bucket)
|
||||
|
||||
attempts = self.r.incr(bucket_key)
|
||||
if attempts == 1:
|
||||
self.r.expire(bucket_key, self.window + self.bucket_size)
|
||||
|
||||
# Collect buckets
|
||||
buckets = [
|
||||
self._bucket_key(key, now_bucket - i) for i in range(self.num_buckets + 1)
|
||||
]
|
||||
|
||||
counts = self.r.mget(buckets)
|
||||
total = sum(int(c) for c in counts if c)
|
||||
|
||||
return total > self.limit
|
||||
|
||||
def _get_count_redis(self, key: str) -> int:
|
||||
now_bucket = self._current_bucket()
|
||||
buckets = [
|
||||
self._bucket_key(key, now_bucket - i) for i in range(self.num_buckets + 1)
|
||||
]
|
||||
counts = self.r.mget(buckets)
|
||||
return sum(int(c) for c in counts if c)
|
||||
|
||||
def _is_limited_memory(self, key: str) -> bool:
|
||||
now_bucket = self._current_bucket()
|
||||
|
||||
# Init storage
|
||||
if key not in self._memory_store:
|
||||
self._memory_store[key] = {}
|
||||
|
||||
store = self._memory_store[key]
|
||||
|
||||
# Increment bucket
|
||||
store[now_bucket] = store.get(now_bucket, 0) + 1
|
||||
|
||||
# Drop expired buckets
|
||||
min_bucket = now_bucket - self.num_buckets
|
||||
expired = [b for b in store if b < min_bucket]
|
||||
for b in expired:
|
||||
del store[b]
|
||||
|
||||
# Count totals
|
||||
total = sum(store.values())
|
||||
return total > self.limit
|
||||
|
||||
def _get_count_memory(self, key: str) -> int:
|
||||
now_bucket = self._current_bucket()
|
||||
if key not in self._memory_store:
|
||||
return 0
|
||||
|
||||
store = self._memory_store[key]
|
||||
min_bucket = now_bucket - self.num_buckets
|
||||
|
||||
# Remove expired
|
||||
expired = [b for b in store if b < min_bucket]
|
||||
for b in expired:
|
||||
del store[b]
|
||||
|
||||
return sum(store.values())
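A short usage sketch for the limiter above, assuming the get_redis_client() helper added to the Redis utilities later in this diff (module path assumed); with no Redis available it silently falls back to the in-memory store.

from open_webui.utils.rate_limit import RateLimiter
from open_webui.utils.redis import get_redis_client  # module path assumed

# 10 events per 60 seconds, counted in 15-second buckets
limiter = RateLimiter(get_redis_client(), limit=10, window=60, bucket_size=15)

def try_login(email: str) -> bool:
    key = f"login:{email}"
    if limiter.is_limited(key):  # also records this attempt in the current bucket
        return False
    print(f"{limiter.remaining(key)} attempts left in this window")
    return True
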
|
||||
|
|
@ -5,7 +5,13 @@ import logging
|
|||
|
||||
import redis
|
||||
|
||||
from open_webui.env import REDIS_SENTINEL_MAX_RETRY_COUNT
|
||||
from open_webui.env import (
|
||||
REDIS_CLUSTER,
|
||||
REDIS_SENTINEL_HOSTS,
|
||||
REDIS_SENTINEL_MAX_RETRY_COUNT,
|
||||
REDIS_SENTINEL_PORT,
|
||||
REDIS_URL,
|
||||
)
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
|
|
@ -108,6 +114,21 @@ def parse_redis_service_url(redis_url):
|
|||
}
|
||||
|
||||
|
||||
def get_redis_client(async_mode=False):
|
||||
try:
|
||||
return get_redis_connection(
|
||||
redis_url=REDIS_URL,
|
||||
redis_sentinels=get_sentinels_from_env(
|
||||
REDIS_SENTINEL_HOSTS, REDIS_SENTINEL_PORT
|
||||
),
|
||||
redis_cluster=REDIS_CLUSTER,
|
||||
async_mode=async_mode,
|
||||
)
|
||||
except Exception as e:
|
||||
log.debug(f"Failed to get Redis client: {e}")
|
||||
return None
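Callers can treat the return value as optional and keep working without Redis; a hedged sketch (the module path is assumed):

from open_webui.utils.redis import get_redis_client  # module path assumed

r = get_redis_client()
if r is not None:
    r.setex("open-webui:healthcheck", 30, "ok")  # any ordinary redis-py call
else:
    # No Redis configured or the connection failed; fall back to in-process state
    pass
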
|
||||
|
||||
|
||||
def get_redis_connection(
|
||||
redis_url,
|
||||
redis_sentinels,
|
||||
|
|
|
|||
|
|
@ -45,7 +45,6 @@ from open_webui.env import (
|
|||
OTEL_METRICS_OTLP_SPAN_EXPORTER,
|
||||
OTEL_METRICS_EXPORTER_OTLP_INSECURE,
|
||||
)
|
||||
from open_webui.socket.main import get_active_user_ids
|
||||
from open_webui.models.users import Users
|
||||
|
||||
_EXPORT_INTERVAL_MILLIS = 10_000 # 10 seconds
|
||||
|
|
@ -135,7 +134,7 @@ def setup_metrics(app: FastAPI, resource: Resource) -> None:
|
|||
) -> Sequence[metrics.Observation]:
|
||||
return [
|
||||
metrics.Observation(
|
||||
value=len(get_active_user_ids()),
|
||||
value=Users.get_active_user_count(),
|
||||
)
|
||||
]
|
||||
|
||||
|
|
|
|||
|
|
@ -150,11 +150,12 @@ async def get_tools(
|
|||
)
|
||||
|
||||
specs = tool_server_data.get("specs", [])
|
||||
function_name_filter_list = (
|
||||
tool_server_connection.get("config", {})
|
||||
.get("function_name_filter_list", "")
|
||||
.split(",")
|
||||
)
|
||||
function_name_filter_list = tool_server_connection.get(
|
||||
"config", {}
|
||||
).get("function_name_filter_list", "")
|
||||
|
||||
if isinstance(function_name_filter_list, str):
|
||||
function_name_filter_list = function_name_filter_list.split(",")
|
||||
|
||||
for spec in specs:
|
||||
function_name = spec["name"]
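The same normalization appears in the MCP path earlier in this diff: the config value may arrive as a ready-made list or a comma-separated string, and only strings get split. A tiny illustrative restatement (this version also strips whitespace and drops empty entries, which the original does not):

def normalize_filter_list(value) -> list[str]:
    # Accept either a list of names or a comma-separated string
    if isinstance(value, str):
        value = value.split(",")
    return [name.strip() for name in value if name and name.strip()]

print(normalize_filter_list("get_weather, search"))      # ['get_weather', 'search']
print(normalize_filter_list(["get_weather", "search"]))  # ['get_weather', 'search']
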
|
||||
|
|
|
|||
|
|
@ -1,13 +1,13 @@
|
|||
# Minimal requirements for backend to run
|
||||
# WIP: use this as a reference to build a minimal docker image
|
||||
|
||||
fastapi==0.118.0
|
||||
fastapi==0.123.0
|
||||
uvicorn[standard]==0.37.0
|
||||
pydantic==2.11.9
|
||||
pydantic==2.12.5
|
||||
python-multipart==0.0.20
|
||||
itsdangerous==2.2.0
|
||||
|
||||
python-socketio==5.14.0
|
||||
python-socketio==5.15.0
|
||||
python-jose==3.5.0
|
||||
cryptography
|
||||
bcrypt==5.0.0
|
||||
|
|
@ -20,14 +20,14 @@ aiohttp==3.12.15
|
|||
async-timeout
|
||||
aiocache
|
||||
aiofiles
|
||||
starlette-compress==1.6.0
|
||||
starlette-compress==1.6.1
|
||||
httpx[socks,http2,zstd,cli,brotli]==0.28.1
|
||||
starsessions[redis]==2.2.1
|
||||
|
||||
sqlalchemy==2.0.38
|
||||
alembic==1.14.0
|
||||
peewee==3.18.1
|
||||
peewee-migrate==1.12.2
|
||||
alembic==1.17.2
|
||||
peewee==3.18.3
|
||||
peewee-migrate==1.14.3
|
||||
|
||||
pycrdt==0.12.25
|
||||
redis
|
||||
|
|
@ -36,9 +36,9 @@ APScheduler==3.10.4
|
|||
RestrictedPython==8.0
|
||||
|
||||
loguru==0.7.3
|
||||
asgiref==3.8.1
|
||||
asgiref==3.11.0
|
||||
|
||||
mcp==1.21.2
|
||||
mcp==1.22.0
|
||||
openai
|
||||
|
||||
langchain==0.3.27
|
||||
|
|
@ -46,6 +46,6 @@ langchain-community==0.3.29
|
|||
fake-useragent==2.2.0
|
||||
|
||||
chromadb==1.1.0
|
||||
black==25.9.0
|
||||
black==25.11.0
|
||||
pydub
|
||||
chardet==5.2.0
|
||||
|
|
|
|||
|
|
@ -1,10 +1,10 @@
|
|||
fastapi==0.118.0
|
||||
fastapi==0.123.0
|
||||
uvicorn[standard]==0.37.0
|
||||
pydantic==2.11.9
|
||||
pydantic==2.12.5
|
||||
python-multipart==0.0.20
|
||||
itsdangerous==2.2.0
|
||||
|
||||
python-socketio==5.14.0
|
||||
python-socketio==5.15.0
|
||||
python-jose==3.5.0
|
||||
cryptography
|
||||
bcrypt==5.0.0
|
||||
|
|
@ -17,14 +17,14 @@ aiohttp==3.12.15
|
|||
async-timeout
|
||||
aiocache
|
||||
aiofiles
|
||||
starlette-compress==1.6.0
|
||||
starlette-compress==1.6.1
|
||||
httpx[socks,http2,zstd,cli,brotli]==0.28.1
|
||||
starsessions[redis]==2.2.1
|
||||
|
||||
sqlalchemy==2.0.38
|
||||
alembic==1.14.0
|
||||
peewee==3.18.1
|
||||
peewee-migrate==1.12.2
|
||||
alembic==1.17.2
|
||||
peewee==3.18.3
|
||||
peewee-migrate==1.14.3
|
||||
|
||||
pycrdt==0.12.25
|
||||
redis
|
||||
|
|
@ -33,11 +33,11 @@ APScheduler==3.10.4
|
|||
RestrictedPython==8.0
|
||||
|
||||
loguru==0.7.3
|
||||
asgiref==3.8.1
|
||||
asgiref==3.11.0
|
||||
|
||||
# AI libraries
|
||||
tiktoken
|
||||
mcp==1.21.2
|
||||
mcp==1.22.0
|
||||
|
||||
openai
|
||||
anthropic
|
||||
|
|
@ -52,24 +52,24 @@ chromadb==1.1.0
|
|||
weaviate-client==4.17.0
|
||||
opensearch-py==2.8.0
|
||||
|
||||
transformers
|
||||
sentence-transformers==5.1.1
|
||||
transformers==4.57.3
|
||||
sentence-transformers==5.1.2
|
||||
accelerate
|
||||
pyarrow==20.0.0 # fix: pin pyarrow version to 20 for rpi compatibility #15897
|
||||
einops==0.8.1
|
||||
|
||||
ftfy==6.2.3
|
||||
ftfy==6.3.1
|
||||
chardet==5.2.0
|
||||
pypdf==6.0.0
|
||||
pypdf==6.4.0
|
||||
fpdf2==2.8.2
|
||||
pymdown-extensions==10.14.2
|
||||
pymdown-extensions==10.17.2
|
||||
docx2txt==0.8
|
||||
python-pptx==1.0.2
|
||||
unstructured==0.18.18
|
||||
unstructured==0.18.21
|
||||
msoffcrypto-tool==5.4.2
|
||||
nltk==3.9.1
|
||||
Markdown==3.9
|
||||
pypandoc==1.15
|
||||
Markdown==3.10
|
||||
pypandoc==1.16.2
|
||||
pandas==2.2.3
|
||||
openpyxl==3.1.5
|
||||
pyxlsb==1.0.10
|
||||
|
|
@ -87,12 +87,12 @@ rank-bm25==0.2.2
|
|||
onnxruntime==1.20.1
|
||||
faster-whisper==1.1.1
|
||||
|
||||
black==25.9.0
|
||||
black==25.11.0
|
||||
youtube-transcript-api==1.2.2
|
||||
pytube==15.0.0
|
||||
|
||||
pydub
|
||||
ddgs==9.0.0
|
||||
ddgs==9.9.2
|
||||
|
||||
azure-ai-documentintelligence==1.0.2
|
||||
azure-identity==1.25.0
|
||||
|
|
@ -104,7 +104,7 @@ google-api-python-client
|
|||
google-auth-httplib2
|
||||
google-auth-oauthlib
|
||||
|
||||
googleapis-common-protos==1.70.0
|
||||
googleapis-common-protos==1.72.0
|
||||
google-cloud-storage==2.19.0
|
||||
|
||||
## Databases
|
||||
|
|
@ -113,11 +113,11 @@ psycopg2-binary==2.9.10
|
|||
pgvector==0.4.1
|
||||
|
||||
PyMySQL==1.1.1
|
||||
boto3==1.40.5
|
||||
boto3==1.41.5
|
||||
|
||||
pymilvus==2.6.2
|
||||
pymilvus==2.6.4
|
||||
qdrant-client==1.14.3
|
||||
playwright==1.49.1 # Caution: version must match docker-compose.playwright.yaml
|
||||
playwright==1.56.0 # Caution: version must match docker-compose.playwright.yaml
|
||||
elasticsearch==9.1.0
|
||||
pinecone==6.0.2
|
||||
oracledb==3.2.0
|
||||
|
|
@ -130,23 +130,23 @@ colbert-ai==0.2.21
|
|||
## Tests
|
||||
docker~=7.1.0
|
||||
pytest~=8.4.1
|
||||
pytest-docker~=3.1.1
|
||||
pytest-docker~=3.2.5
|
||||
|
||||
## LDAP
|
||||
ldap3==2.9.1
|
||||
|
||||
## Firecrawl
|
||||
firecrawl-py==4.5.0
|
||||
firecrawl-py==4.10.0
|
||||
|
||||
## Trace
|
||||
opentelemetry-api==1.37.0
|
||||
opentelemetry-sdk==1.37.0
|
||||
opentelemetry-exporter-otlp==1.37.0
|
||||
opentelemetry-instrumentation==0.58b0
|
||||
opentelemetry-instrumentation-fastapi==0.58b0
|
||||
opentelemetry-instrumentation-sqlalchemy==0.58b0
|
||||
opentelemetry-instrumentation-redis==0.58b0
|
||||
opentelemetry-instrumentation-requests==0.58b0
|
||||
opentelemetry-instrumentation-logging==0.58b0
|
||||
opentelemetry-instrumentation-httpx==0.58b0
|
||||
opentelemetry-instrumentation-aiohttp-client==0.58b0
|
||||
opentelemetry-api==1.38.0
|
||||
opentelemetry-sdk==1.38.0
|
||||
opentelemetry-exporter-otlp==1.38.0
|
||||
opentelemetry-instrumentation==0.59b0
|
||||
opentelemetry-instrumentation-fastapi==0.59b0
|
||||
opentelemetry-instrumentation-sqlalchemy==0.59b0
|
||||
opentelemetry-instrumentation-redis==0.59b0
|
||||
opentelemetry-instrumentation-requests==0.59b0
|
||||
opentelemetry-instrumentation-logging==0.59b0
|
||||
opentelemetry-instrumentation-httpx==0.59b0
|
||||
opentelemetry-instrumentation-aiohttp-client==0.59b0
|
||||
|
|
|
|||
package-lock.json (generated, 4 lines changed)
@@ -1,12 +1,12 @@
|
|||
{
|
||||
"name": "open-webui",
|
||||
"version": "0.6.40",
|
||||
"version": "0.6.41",
|
||||
"lockfileVersion": 3,
|
||||
"requires": true,
|
||||
"packages": {
|
||||
"": {
|
||||
"name": "open-webui",
|
||||
"version": "0.6.40",
|
||||
"version": "0.6.41",
|
||||
"dependencies": {
|
||||
"@azure/msal-browser": "^4.5.0",
|
||||
"@codemirror/lang-javascript": "^6.2.2",
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
{
|
||||
"name": "open-webui",
|
||||
"version": "0.6.40",
|
||||
"version": "0.6.41",
|
||||
"private": true,
|
||||
"scripts": {
|
||||
"dev": "npm run pyodide:fetch && vite dev --host",
|
||||
|
|
|
@@ -6,13 +6,13 @@ authors = [
 ]
 license = { file = "LICENSE" }
 dependencies = [
-    "fastapi==0.118.0",
+    "fastapi==0.123.0",
     "uvicorn[standard]==0.37.0",
-    "pydantic==2.11.9",
+    "pydantic==2.12.5",
     "python-multipart==0.0.20",
     "itsdangerous==2.2.0",
 
-    "python-socketio==5.14.0",
+    "python-socketio==5.15.0",
     "python-jose==3.5.0",
     "cryptography",
     "bcrypt==5.0.0",

@@ -25,14 +25,14 @@ dependencies = [
     "async-timeout",
     "aiocache",
     "aiofiles",
-    "starlette-compress==1.6.0",
+    "starlette-compress==1.6.1",
     "httpx[socks,http2,zstd,cli,brotli]==0.28.1",
     "starsessions[redis]==2.2.1",
 
     "sqlalchemy==2.0.38",
-    "alembic==1.14.0",
-    "peewee==3.18.1",
-    "peewee-migrate==1.12.2",
+    "alembic==1.17.2",
+    "peewee==3.18.3",
+    "peewee-migrate==1.14.3",
 
     "pycrdt==0.12.25",
     "redis",

@@ -41,10 +41,10 @@ dependencies = [
     "RestrictedPython==8.0",
 
     "loguru==0.7.3",
-    "asgiref==3.8.1",
+    "asgiref==3.11.0",
 
     "tiktoken",
-    "mcp==1.21.2",
+    "mcp==1.22.0",
 
     "openai",
     "anthropic",

@@ -58,26 +58,26 @@ dependencies = [
     "chromadb==1.0.20",
     "opensearch-py==2.8.0",
     "PyMySQL==1.1.1",
-    "boto3==1.40.5",
+    "boto3==1.41.5",
 
-    "transformers",
-    "sentence-transformers==5.1.1",
+    "transformers==4.57.3",
+    "sentence-transformers==5.1.2",
     "accelerate",
     "pyarrow==20.0.0",
     "einops==0.8.1",
 
-    "ftfy==6.2.3",
+    "ftfy==6.3.1",
     "chardet==5.2.0",
-    "pypdf==6.0.0",
+    "pypdf==6.4.0",
     "fpdf2==2.8.2",
-    "pymdown-extensions==10.14.2",
+    "pymdown-extensions==10.17.2",
     "docx2txt==0.8",
     "python-pptx==1.0.2",
-    "unstructured==0.18.18",
+    "unstructured==0.18.21",
     "msoffcrypto-tool==5.4.2",
     "nltk==3.9.1",
-    "Markdown==3.9",
-    "pypandoc==1.15",
+    "Markdown==3.10",
+    "pypandoc==1.16.2",
     "pandas==2.2.3",
     "openpyxl==3.1.5",
     "pyxlsb==1.0.10",

@@ -96,18 +96,18 @@ dependencies = [
     "onnxruntime==1.20.1",
     "faster-whisper==1.1.1",
 
-    "black==25.9.0",
+    "black==25.11.0",
     "youtube-transcript-api==1.2.2",
     "pytube==15.0.0",
 
     "pydub",
-    "ddgs==9.0.0",
+    "ddgs==9.9.2",
 
     "google-api-python-client",
     "google-auth-httplib2",
    "google-auth-oauthlib",
 
-    "googleapis-common-protos==1.70.0",
+    "googleapis-common-protos==1.72.0",
     "google-cloud-storage==2.19.0",
 
     "azure-identity==1.25.0",

@@ -142,18 +142,18 @@ all = [
     "gcp-storage-emulator>=2024.8.3",
     "docker~=7.1.0",
     "pytest~=8.3.2",
-    "pytest-docker~=3.1.1",
-    "playwright==1.49.1",
+    "pytest-docker~=3.2.5",
+    "playwright==1.56.0",
     "elasticsearch==9.1.0",
 
     "qdrant-client==1.14.3",
     "weaviate-client==4.17.0",
-    "pymilvus==2.6.2",
+    "pymilvus==2.6.4",
     "pinecone==6.0.2",
     "oracledb==3.2.0",
     "colbert-ai==0.2.21",
 
-    "firecrawl-py==4.5.0",
+    "firecrawl-py==4.10.0",
     "azure-search-documents==11.6.0",
 ]
 
@@ -637,7 +637,7 @@ input[type='number'] {
 
 .tiptap th,
 .tiptap td {
-	@apply px-3 py-1.5 border border-gray-100 dark:border-gray-850;
+	@apply px-3 py-1.5 border border-gray-100/30 dark:border-gray-850/30;
 }
 
 .tiptap th {
|
|||
|
|
@ -1,10 +1,13 @@
|
|||
import { WEBUI_API_BASE_URL } from '$lib/constants';
|
||||
|
||||
type ChannelForm = {
|
||||
type?: string;
|
||||
name: string;
|
||||
is_private?: boolean;
|
||||
data?: object;
|
||||
meta?: object;
|
||||
access_control?: object;
|
||||
user_ids?: string[];
|
||||
};
|
||||
|
||||
export const createNewChannel = async (token: string = '', channel: ChannelForm) => {
|
||||
|
|
@ -101,7 +104,38 @@ export const getChannelById = async (token: string = '', channel_id: string) =>
|
|||
return res;
|
||||
};
|
||||
|
||||
export const getChannelUsersById = async (
|
||||
export const getDMChannelByUserId = async (token: string = '', user_id: string) => {
|
||||
let error = null;
|
||||
|
||||
const res = await fetch(`${WEBUI_API_BASE_URL}/channels/users/${user_id}`, {
|
||||
method: 'GET',
|
||||
headers: {
|
||||
Accept: 'application/json',
|
||||
'Content-Type': 'application/json',
|
||||
authorization: `Bearer ${token}`
|
||||
}
|
||||
})
|
||||
.then(async (res) => {
|
||||
if (!res.ok) throw await res.json();
|
||||
return res.json();
|
||||
})
|
||||
.then((json) => {
|
||||
return json;
|
||||
})
|
||||
.catch((err) => {
|
||||
error = err.detail;
|
||||
console.error(err);
|
||||
return null;
|
||||
});
|
||||
|
||||
if (error) {
|
||||
throw error;
|
||||
}
|
||||
|
||||
return res;
|
||||
};
|
||||
|
||||
export const getChannelMembersById = async (
|
||||
token: string,
|
||||
channel_id: string,
|
||||
query?: string,
|
||||
|
|
@ -129,7 +163,7 @@ export const getChannelUsersById = async (
|
|||
}
|
||||
|
||||
res = await fetch(
|
||||
`${WEBUI_API_BASE_URL}/channels/${channel_id}/users?${searchParams.toString()}`,
|
||||
`${WEBUI_API_BASE_URL}/channels/${channel_id}/members?${searchParams.toString()}`,
|
||||
{
|
||||
method: 'GET',
|
||||
headers: {
|
||||
|
|
@ -155,6 +189,124 @@ export const getChannelUsersById = async (
|
|||
return res;
|
||||
};
|
||||
|
||||
export const updateChannelMemberActiveStatusById = async (
|
||||
token: string = '',
|
||||
channel_id: string,
|
||||
is_active: boolean
|
||||
) => {
|
||||
let error = null;
|
||||
|
||||
const res = await fetch(`${WEBUI_API_BASE_URL}/channels/${channel_id}/members/active`, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
Accept: 'application/json',
|
||||
'Content-Type': 'application/json',
|
||||
authorization: `Bearer ${token}`
|
||||
},
|
||||
body: JSON.stringify({ is_active })
|
||||
})
|
||||
.then(async (res) => {
|
||||
if (!res.ok) throw await res.json();
|
||||
return res.json();
|
||||
})
|
||||
.then((json) => {
|
||||
return json;
|
||||
})
|
||||
.catch((err) => {
|
||||
error = err.detail;
|
||||
console.error(err);
|
||||
return null;
|
||||
});
|
||||
|
||||
if (error) {
|
||||
throw error;
|
||||
}
|
||||
|
||||
return res;
|
||||
};
|
||||
|
||||
type UpdateMembersForm = {
|
||||
user_ids?: string[];
|
||||
group_ids?: string[];
|
||||
};
|
||||
|
||||
export const addMembersById = async (
|
||||
token: string = '',
|
||||
channel_id: string,
|
||||
formData: UpdateMembersForm
|
||||
) => {
|
||||
let error = null;
|
||||
|
||||
const res = await fetch(`${WEBUI_API_BASE_URL}/channels/${channel_id}/update/members/add`, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
Accept: 'application/json',
|
||||
'Content-Type': 'application/json',
|
||||
authorization: `Bearer ${token}`
|
||||
},
|
||||
body: JSON.stringify({ ...formData })
|
||||
})
|
||||
.then(async (res) => {
|
||||
if (!res.ok) throw await res.json();
|
||||
return res.json();
|
||||
})
|
||||
.then((json) => {
|
||||
return json;
|
||||
})
|
||||
.catch((err) => {
|
||||
error = err.detail;
|
||||
console.error(err);
|
||||
return null;
|
||||
});
|
||||
|
||||
if (error) {
|
||||
throw error;
|
||||
}
|
||||
|
||||
return res;
|
||||
};
|
||||
|
||||
type RemoveMembersForm = {
|
||||
user_ids?: string[];
|
||||
group_ids?: string[];
|
||||
};
|
||||
|
||||
export const removeMembersById = async (
|
||||
token: string = '',
|
||||
channel_id: string,
|
||||
formData: RemoveMembersForm
|
||||
) => {
|
||||
let error = null;
|
||||
|
||||
const res = await fetch(`${WEBUI_API_BASE_URL}/channels/${channel_id}/update/members/remove`, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
Accept: 'application/json',
|
||||
'Content-Type': 'application/json',
|
||||
authorization: `Bearer ${token}`
|
||||
},
|
||||
body: JSON.stringify({ ...formData })
|
||||
})
|
||||
.then(async (res) => {
|
||||
if (!res.ok) throw await res.json();
|
||||
return res.json();
|
||||
})
|
||||
.then((json) => {
|
||||
return json;
|
||||
})
|
||||
.catch((err) => {
|
||||
error = err.detail;
|
||||
console.error(err);
|
||||
return null;
|
||||
});
|
||||
|
||||
if (error) {
|
||||
throw error;
|
||||
}
|
||||
|
||||
return res;
|
||||
};
|
||||
|
||||
export const updateChannelById = async (
|
||||
token: string = '',
|
||||
channel_id: string,
|
||||
|
|
@ -261,6 +413,44 @@ export const getChannelMessages = async (
|
|||
return res;
|
||||
};
|
||||
|
||||
export const getChannelPinnedMessages = async (
|
||||
token: string = '',
|
||||
channel_id: string,
|
||||
page: number = 1
|
||||
) => {
|
||||
let error = null;
|
||||
|
||||
const res = await fetch(
|
||||
`${WEBUI_API_BASE_URL}/channels/${channel_id}/messages/pinned?page=${page}`,
|
||||
{
|
||||
method: 'GET',
|
||||
headers: {
|
||||
Accept: 'application/json',
|
||||
'Content-Type': 'application/json',
|
||||
authorization: `Bearer ${token}`
|
||||
}
|
||||
}
|
||||
)
|
||||
.then(async (res) => {
|
||||
if (!res.ok) throw await res.json();
|
||||
return res.json();
|
||||
})
|
||||
.then((json) => {
|
||||
return json;
|
||||
})
|
||||
.catch((err) => {
|
||||
error = err.detail;
|
||||
console.error(err);
|
||||
return null;
|
||||
});
|
||||
|
||||
if (error) {
|
||||
throw error;
|
||||
}
|
||||
|
||||
return res;
|
||||
};
|
||||
|
||||
export const getChannelThreadMessages = async (
|
||||
token: string = '',
|
||||
channel_id: string,
|
||||
|
|
@ -302,6 +492,7 @@ export const getChannelThreadMessages = async (
|
|||
};
|
||||
|
||||
type MessageForm = {
|
||||
temp_id?: string;
|
||||
reply_to_id?: string;
|
||||
parent_id?: string;
|
||||
content: string;
|
||||
|
|
@ -341,6 +532,46 @@ export const sendMessage = async (token: string = '', channel_id: string, messag
|
|||
return res;
|
||||
};
|
||||
|
||||
export const pinMessage = async (
|
||||
token: string = '',
|
||||
channel_id: string,
|
||||
message_id: string,
|
||||
is_pinned: boolean
|
||||
) => {
|
||||
let error = null;
|
||||
|
||||
const res = await fetch(
|
||||
`${WEBUI_API_BASE_URL}/channels/${channel_id}/messages/${message_id}/pin`,
|
||||
{
|
||||
method: 'POST',
|
||||
headers: {
|
||||
Accept: 'application/json',
|
||||
'Content-Type': 'application/json',
|
||||
authorization: `Bearer ${token}`
|
||||
},
|
||||
body: JSON.stringify({ is_pinned })
|
||||
}
|
||||
)
|
||||
.then(async (res) => {
|
||||
if (!res.ok) throw await res.json();
|
||||
return res.json();
|
||||
})
|
||||
.then((json) => {
|
||||
return json;
|
||||
})
|
||||
.catch((err) => {
|
||||
error = err.detail;
|
||||
console.error(err);
|
||||
return null;
|
||||
});
|
||||
|
||||
if (error) {
|
||||
throw error;
|
||||
}
|
||||
|
||||
return res;
|
||||
};
|
||||
|
||||
export const updateMessage = async (
|
||||
token: string = '',
|
||||
channel_id: string,
|
||||
|
|
|
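A minimal usage sketch, not part of the diff, showing how the member and pin helpers added to this file might be called from a component. The `channelId`, `userId`, `groupId`, and `messageId` values are placeholders, not identifiers taken from this PR.

```ts
// Sketch only: exercises the new channel member/pin helpers with placeholder IDs.
import {
	addMembersById,
	removeMembersById,
	updateChannelMemberActiveStatusById,
	pinMessage,
	getChannelPinnedMessages,
	getDMChannelByUserId
} from '$lib/apis/channels';

const token = localStorage.token;
const channelId = 'placeholder-channel-id';
const userId = 'placeholder-user-id';
const groupId = 'placeholder-group-id';
const messageId = 'placeholder-message-id';

const demo = async () => {
	// Add a user and a group to the channel, then remove the user again.
	await addMembersById(token, channelId, { user_ids: [userId], group_ids: [groupId] });
	await removeMembersById(token, channelId, { user_ids: [userId] });

	// Toggle the calling member's active flag for this channel.
	await updateChannelMemberActiveStatusById(token, channelId, false);

	// Pin a message, then read back the first page of pinned messages.
	await pinMessage(token, channelId, messageId, true);
	const pinned = await getChannelPinnedMessages(token, channelId, 1);

	// Look up the DM channel shared with another user.
	const dm = await getDMChannelByUserId(token, userId);

	return { pinned, dm };
};
```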
@@ -35,6 +35,7 @@ type ChunkConfigForm = {
 type DocumentIntelligenceConfigForm = {
 	key: string;
 	endpoint: string;
+	model: string;
 };
 
 type ContentExtractConfigForm = {

@@ -166,11 +166,33 @@ export const getUsers = async (
 	return res;
 };
 
-export const getAllUsers = async (token: string) => {
+export const searchUsers = async (
+	token: string,
+	query?: string,
+	orderBy?: string,
+	direction?: string,
+	page = 1
+) => {
 	let error = null;
 	let res = null;
 
-	res = await fetch(`${WEBUI_API_BASE_URL}/users/all`, {
+	const searchParams = new URLSearchParams();
+
+	searchParams.set('page', `${page}`);
+
+	if (query) {
+		searchParams.set('query', query);
+	}
+
+	if (orderBy) {
+		searchParams.set('order_by', orderBy);
+	}
+
+	if (direction) {
+		searchParams.set('direction', direction);
+	}
+
+	res = await fetch(`${WEBUI_API_BASE_URL}/users/search?${searchParams.toString()}`, {
 		method: 'GET',
 		headers: {
 			'Content-Type': 'application/json',

@@ -194,11 +216,11 @@ export const getAllUsers = async (token: string) => {
 	return res;
 };
 
-export const searchUsers = async (token: string, query: string) => {
+export const getAllUsers = async (token: string) => {
 	let error = null;
 	let res = null;
 
-	res = await fetch(`${WEBUI_API_BASE_URL}/users/search?query=${encodeURIComponent(query)}`, {
+	res = await fetch(`${WEBUI_API_BASE_URL}/users/all`, {
 		method: 'GET',
 		headers: {
 			'Content-Type': 'application/json',

@@ -305,6 +327,36 @@ export const getUserById = async (token: string, userId: string) => {
 	return res;
 };
 
+export const updateUserStatus = async (token: string, formData: object) => {
+	let error = null;
+
+	const res = await fetch(`${WEBUI_API_BASE_URL}/users/user/status/update`, {
+		method: 'POST',
+		headers: {
+			'Content-Type': 'application/json',
+			Authorization: `Bearer ${token}`
+		},
+		body: JSON.stringify({
+			...formData
+		})
+	})
+		.then(async (res) => {
+			if (!res.ok) throw await res.json();
+			return res.json();
+		})
+		.catch((err) => {
+			console.error(err);
+			error = err.detail;
+			return null;
+		});
+
+	if (error) {
+		throw error;
+	}
+
+	return res;
+};
+
 export const getUserInfo = async (token: string) => {
 	let error = null;
 	const res = await fetch(`${WEBUI_API_BASE_URL}/users/user/info`, {
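A short sketch, not part of the diff, of how the reworked user helpers above might be used. The `$lib/apis/users` module path and the status payload shape are assumptions; the diff only pins down the function signatures.

```ts
// Sketch only: module path and status payload are assumptions.
import { searchUsers, getAllUsers, updateUserStatus } from '$lib/apis/users';

const token = localStorage.token;

const demo = async () => {
	// Paginated, sorted search: page 2 of users matching "alice", ordered by name ascending.
	const results = await searchUsers(token, 'alice', 'name', 'asc', 2);

	// Unfiltered listing still goes through getAllUsers.
	const everyone = await getAllUsers(token);

	// Post an arbitrary status object for the current user.
	await updateUserStatus(token, { status: 'away' });

	return { results, everyone };
};
```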
@@ -358,7 +358,7 @@
 	<div class="flex-shrink-0 self-start">
 		<select
 			id="select-bearer-or-session"
-			class={`w-full text-sm bg-transparent pr-5 ${($settings?.highContrastMode ?? false) ? 'placeholder:text-gray-700 dark:placeholder:text-gray-100' : 'outline-hidden placeholder:text-gray-300 dark:placeholder:text-gray-700'}`}
+			class={`dark:bg-gray-900 w-full text-sm bg-transparent pr-5 ${($settings?.highContrastMode ?? false) ? 'placeholder:text-gray-700 dark:placeholder:text-gray-100' : 'outline-hidden placeholder:text-gray-300 dark:placeholder:text-gray-700'}`}
 			bind:value={auth_type}
 		>
 			<option value="none">{$i18n.t('None')}</option>
@@ -47,7 +47,7 @@
 	let key = '';
 	let headers = '';
 
-	let functionNameFilterList = [];
+	let functionNameFilterList = '';
 	let accessControl = {};
 
 	let id = '';

@@ -338,7 +338,7 @@
 		oauthClientInfo = null;
 
 		enable = true;
-		functionNameFilterList = [];
+		functionNameFilterList = '';
 		accessControl = null;
 	};
 

@@ -362,7 +362,7 @@
 			oauthClientInfo = connection.info?.oauth_client_info ?? null;
 
 			enable = connection.config?.enable ?? true;
-			functionNameFilterList = connection.config?.function_name_filter_list ?? [];
+			functionNameFilterList = connection.config?.function_name_filter_list ?? '';
 			accessControl = connection.config?.access_control ?? null;
 		}
 	};

@@ -534,7 +534,7 @@
 	<div class="flex-shrink-0 self-start">
 		<select
 			id="select-bearer-or-session"
-			class={`w-full text-sm bg-transparent pr-5 ${($settings?.highContrastMode ?? false) ? 'placeholder:text-gray-700 dark:placeholder:text-gray-100' : 'outline-hidden placeholder:text-gray-300 dark:placeholder:text-gray-700'}`}
+			class={`dark:bg-gray-900 w-full text-sm bg-transparent pr-5 ${($settings?.highContrastMode ?? false) ? 'placeholder:text-gray-700 dark:placeholder:text-gray-100' : 'outline-hidden placeholder:text-gray-300 dark:placeholder:text-gray-700'}`}
 			bind:value={spec_type}
 		>
 			<option value="url">{$i18n.t('URL')}</option>

@@ -644,7 +644,7 @@
 	<div class="flex-shrink-0 self-start">
 		<select
 			id="select-bearer-or-session"
-			class={`w-full text-sm bg-transparent pr-5 ${($settings?.highContrastMode ?? false) ? 'placeholder:text-gray-700 dark:placeholder:text-gray-100' : 'outline-hidden placeholder:text-gray-300 dark:placeholder:text-gray-700'}`}
+			class={`dark:bg-gray-900 w-full text-sm bg-transparent pr-5 ${($settings?.highContrastMode ?? false) ? 'placeholder:text-gray-700 dark:placeholder:text-gray-100' : 'outline-hidden placeholder:text-gray-300 dark:placeholder:text-gray-700'}`}
 			bind:value={auth_type}
 		>
 			<option value="none">{$i18n.t('None')}</option>

@@ -818,11 +818,9 @@
 
 	<hr class=" border-gray-100 dark:border-gray-700/10 my-2.5 w-full" />
 
-	<div class="my-2 -mx-2">
-		<div class="px-4 py-3 bg-gray-50 dark:bg-gray-950 rounded-3xl">
+	<div class="my-2">
 		<AccessControl bind:accessControl />
 	</div>
-	</div>
 {/if}
 </div>
 
@@ -186,7 +186,7 @@
 			class="w-full text-sm text-left text-gray-500 dark:text-gray-400 table-auto max-w-full"
 		>
 			<thead class="text-xs text-gray-800 uppercase bg-transparent dark:text-gray-200">
-				<tr class=" border-b-[1.5px] border-gray-50 dark:border-gray-850">
+				<tr class=" border-b-[1.5px] border-gray-50 dark:border-gray-850/30">
 					<th
 						scope="col"
 						class="px-2.5 py-2 cursor-pointer select-none w-3"

@@ -387,7 +387,7 @@
 				: ''}"
 		>
 			<thead class="text-xs text-gray-800 uppercase bg-transparent dark:text-gray-200">
-				<tr class=" border-b-[1.5px] border-gray-50 dark:border-gray-850">
+				<tr class=" border-b-[1.5px] border-gray-50 dark:border-gray-850/30">
 					<th
 						scope="col"
 						class="px-2.5 py-2 cursor-pointer select-none w-3"

@@ -343,7 +343,7 @@
 	</div>
 
 	<div
-		class="py-2 bg-white dark:bg-gray-900 rounded-3xl border border-gray-100 dark:border-gray-850"
+		class="py-2 bg-white dark:bg-gray-900 rounded-3xl border border-gray-100/30 dark:border-gray-850/30"
 	>
 		<div class="px-3.5 flex flex-1 items-center w-full space-x-2 py-0.5 pb-2">
 			<div class="flex flex-1">

@@ -63,7 +63,7 @@
 		</div>
 	</div>
 
-	<hr class="border-gray-50 dark:border-gray-850 my-1" />
+	<hr class="border-gray-50 dark:border-gray-850/30 my-1" />
 {/if}
 
 <DropdownMenu.Item

@@ -122,7 +122,7 @@
 	<div class="flex items-center">{$i18n.t('Export')}</div>
 </DropdownMenu.Item>
 
-<hr class="border-gray-50 dark:border-gray-850 my-1" />
+<hr class="border-gray-50 dark:border-gray-850/30 my-1" />
 
 <DropdownMenu.Item
 	class="flex gap-2 items-center px-3 py-1.5 text-sm font-medium cursor-pointer hover:bg-gray-50 dark:hover:bg-gray-800 rounded-md"

@@ -212,7 +212,7 @@
 <div>
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Speech-to-Text')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	{#if STT_ENGINE !== 'web'}
 		<div class="mb-2">

@@ -263,7 +263,7 @@
 		</div>
 	</div>
 
-	<hr class="border-gray-100 dark:border-gray-850 my-2" />
+	<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div>
 		<div class=" mb-1.5 text-xs font-medium">{$i18n.t('STT Model')}</div>

@@ -289,7 +289,7 @@
 		</div>
 	</div>
 
-	<hr class="border-gray-100 dark:border-gray-850 my-2" />
+	<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div>
 		<div class=" mb-1.5 text-xs font-medium">{$i18n.t('STT Model')}</div>

@@ -323,7 +323,7 @@
 		/>
 	</div>
 
-	<hr class="border-gray-100 dark:border-gray-850 my-2" />
+	<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div>
 		<div class=" mb-1.5 text-xs font-medium">{$i18n.t('Azure Region')}</div>

@@ -391,7 +391,7 @@
 		</div>
 	</div>
 
-	<hr class="border-gray-100 dark:border-gray-850 my-2" />
+	<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div>
 		<div class=" mb-1.5 text-xs font-medium">{$i18n.t('STT Model')}</div>

@@ -416,7 +416,7 @@
 		</div>
 	</div>
 
-	<hr class="border-gray-100 dark:border-gray-850 my-2" />
+	<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div>
 		<div class="flex items-center justify-between mb-2">

@@ -500,7 +500,7 @@
 <div>
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Text-to-Speech')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="mb-2 py-0.5 flex w-full justify-between">
 		<div class=" self-center text-xs font-medium">{$i18n.t('Text-to-Speech Engine')}</div>

@@ -557,7 +557,7 @@
 	<SensitiveInput placeholder={$i18n.t('API Key')} bind:value={TTS_API_KEY} required />
 </div>
 
-<hr class="border-gray-100 dark:border-gray-850 my-2" />
+<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 <div>
 	<div class=" mb-1.5 text-xs font-medium">{$i18n.t('Azure Region')}</div>

@@ -43,7 +43,7 @@
 <div class="mb-3.5">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="mb-2.5">
 		<div class=" flex w-full justify-between">

@@ -166,7 +166,7 @@
 <div class="mb-3.5">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Code Interpreter')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="mb-2.5">
 		<div class=" flex w-full justify-between">

@@ -288,7 +288,7 @@
 	</div>
 {/if}
 
-<hr class="border-gray-100 dark:border-gray-850 my-2" />
+<hr class="border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 <div>
 	<div class="py-0.5 w-full">

@@ -221,7 +221,7 @@
 <div class="mb-3.5">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="my-2">
 		<div class="mt-2 space-y-2">

@@ -384,7 +384,7 @@
 		</div>
 	</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="my-2">
 		<div class="flex justify-between items-center text-sm">

@@ -143,7 +143,7 @@
 	</div>
 </button>
 
-<hr class="border-gray-50 dark:border-gray-850 my-1" />
+<hr class="border-gray-50 dark:border-gray-850/30 my-1" />
 
 {#if $config?.features.enable_admin_export ?? true}
 	<div class=" flex w-full justify-between">

@@ -327,7 +327,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="mb-2.5 flex flex-col w-full justify-between">
 		<div class="flex w-full justify-between mb-1">

@@ -597,6 +597,20 @@
 						required={false}
 					/>
 				</div>
+				<div class="my-0.5 flex flex-col w-full">
+					<div class=" mb-1 text-xs font-medium">
+						{$i18n.t('Document Intelligence Model')}
+					</div>
+					<div class="flex w-full">
+						<div class="flex-1 mr-2">
+							<input
+								class="flex-1 w-full text-sm bg-transparent outline-hidden"
+								placeholder={$i18n.t('Enter Document Intelligence Model')}
+								bind:value={RAGConfig.DOCUMENT_INTELLIGENCE_MODEL}
+							/>
+						</div>
+					</div>
+				</div>
 			{:else if RAGConfig.CONTENT_EXTRACTION_ENGINE === 'mistral_ocr'}
 				<div class="my-0.5 flex gap-2 pr-2">
 					<input

@@ -762,7 +776,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Embedding')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class=" mb-2.5 flex flex-col w-full justify-between">
 		<div class="flex w-full justify-between">

@@ -953,7 +967,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Retrieval')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class=" mb-2.5 flex w-full justify-between">
 		<div class=" self-center text-xs font-medium">{$i18n.t('Full Context Mode')}</div>

@@ -1211,7 +1225,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Files')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class=" mb-2.5 flex w-full justify-between">
 		<div class=" self-center text-xs font-medium">{$i18n.t('Allowed File Extensions')}</div>

@@ -1323,7 +1337,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Integration')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class=" mb-2.5 flex w-full justify-between">
 		<div class=" self-center text-xs font-medium">{$i18n.t('Google Drive')}</div>

@@ -1343,7 +1357,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Danger Zone')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class=" mb-2.5 flex w-full justify-between">
 		<div class=" self-center text-xs font-medium">{$i18n.t('Reset Upload Directory')}</div>
@@ -106,7 +106,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="mb-2.5 flex w-full justify-between">
 		<div class=" text-xs font-medium">{$i18n.t('Arena Models')}</div>

@@ -139,7 +139,7 @@
 		</div>
 	</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="flex flex-col gap-2">
 		{#if (evaluationConfig?.EVALUATION_ARENA_MODELS ?? []).length > 0}

@@ -292,11 +292,9 @@
 
 	<hr class=" border-gray-100 dark:border-gray-700/10 my-2.5 w-full" />
 
-	<div class="my-2 -mx-2">
-		<div class="px-4 py-3 bg-gray-50 dark:bg-gray-950 rounded-3xl">
+	<div class="my-2">
 		<AccessControl bind:accessControl />
 	</div>
-	</div>
 
 	<hr class=" border-gray-100 dark:border-gray-700/10 my-2.5 w-full" />
 

@@ -352,7 +350,7 @@
 
 	<div class="flex items-center">
 		<select
-			class="w-full py-1 text-sm rounded-lg bg-transparent {selectedModelId
+			class="dark:bg-gray-900 w-full py-1 text-sm rounded-lg bg-transparent {selectedModelId
 				? ''
 				: 'text-gray-500'} placeholder:text-gray-300 dark:placeholder:text-gray-700 outline-hidden"
 			bind:value={selectedModelId}

@@ -129,7 +129,7 @@
 <div class="mb-3.5">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="mb-2.5">
 		<div class=" mb-1 text-xs font-medium flex space-x-2 items-center">

@@ -287,7 +287,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Authentication')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class=" mb-2.5 flex w-full justify-between">
 		<div class=" self-center text-xs font-medium">{$i18n.t('Default User Role')}</div>

@@ -660,7 +660,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Features')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="mb-2.5 flex w-full items-center justify-between pr-2">
 		<div class=" self-center text-xs font-medium">

@@ -676,6 +676,14 @@
 	<Switch bind:state={adminConfig.ENABLE_MESSAGE_RATING} />
 </div>
 
+<div class="mb-2.5 flex w-full items-center justify-between pr-2">
+	<div class=" self-center text-xs font-medium">
+		{$i18n.t('Folders')}
+	</div>
+
+	<Switch bind:state={adminConfig.ENABLE_FOLDERS} />
+</div>
+
 <div class="mb-2.5 flex w-full items-center justify-between pr-2">
 	<div class=" self-center text-xs font-medium">
 		{$i18n.t('Notes')} ({$i18n.t('Beta')})

@@ -291,7 +291,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="mb-2.5">
 		<div class="flex w-full justify-between items-center">

@@ -309,7 +309,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Create Image')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	{#if config.ENABLE_IMAGE_GENERATION}
 		<div class="mb-2.5">

@@ -882,7 +882,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Edit Image')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="mb-2.5">
 		<div class="flex w-full justify-between items-center">

@@ -115,7 +115,7 @@
 <div class="mb-3.5">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Tasks')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class=" mb-2 font-medium flex items-center">
 		<div class=" text-xs mr-1">{$i18n.t('Task Model')}</div>

@@ -423,7 +423,7 @@
 <div class="mb-3.5">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('UI')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="mb-2.5">
 		<div class="flex w-full justify-between">

@@ -811,9 +811,8 @@
 							bind:value={deleteModelTag}
+							placeholder={$i18n.t('Select a model')}
 						>
-							{#if !deleteModelTag}
 							<option value="" disabled selected>{$i18n.t('Select a model')}</option>
-							{/if}
 
 							{#each ollamaModels as model}
 								<option value={model.id} class="bg-gray-50 dark:bg-gray-700"
 									>{model.name + ' (' + (model.size / 1024 ** 3).toFixed(1) + ' GB)'}</option

@@ -19,7 +19,7 @@
 
 	<div class="flex items-center -mr-1">
 		<select
-			class="w-full py-1 text-sm rounded-lg bg-transparent {selectedModelId
+			class="dark:bg-gray-900 w-full py-1 text-sm rounded-lg bg-transparent {selectedModelId
 				? ''
 				: 'text-gray-500'} placeholder:text-gray-300 dark:placeholder:text-gray-700 outline-hidden"
 			bind:value={selectedModelId}

@@ -418,7 +418,7 @@
 	</div>
 </div>
 
-<hr class="border-gray-100 dark:border-gray-850 my-3 w-full" />
+<hr class="border-gray-100/30 dark:border-gray-850/30 my-3 w-full" />
 
 {#if pipelines !== null}
 	{#if pipelines.length > 0}

@@ -61,7 +61,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class="mb-2.5 flex flex-col w-full justify-between">
 		<!-- {$i18n.t(`Failed to connect to {{URL}} OpenAPI tool server`, {

@@ -97,7 +97,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('General')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class=" mb-2.5 flex w-full justify-between">
 		<div class=" self-center text-xs font-medium">

@@ -746,7 +746,7 @@
 <div class="mb-3">
 	<div class=" mt-0.5 mb-2.5 text-base font-medium">{$i18n.t('Loader')}</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850 my-2" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2" />
 
 	<div class=" mb-2.5 flex w-full justify-between">
 		<div class=" self-center text-xs font-medium">
@@ -100,6 +100,7 @@
 <EditGroupModal
 	bind:show={showAddGroupModal}
 	edit={false}
+	tabs={['general', 'permissions']}
 	permissions={defaultPermissions}
 	onSubmit={addGroupHandler}
 />

@@ -175,7 +176,7 @@
 			<div class="w-full basis-2/5 text-right">{$i18n.t('Users')}</div>
 		</div>
 
-		<hr class="mt-1.5 border-gray-100 dark:border-gray-850" />
+		<hr class="mt-1.5 border-gray-100/30 dark:border-gray-850/30" />
 
 		{#each filteredGroups as group}
 			<div class="my-2">

@@ -185,7 +186,7 @@
 			</div>
 		{/if}
 
-		<hr class="mb-2 border-gray-100 dark:border-gray-850" />
+		<hr class="mb-2 border-gray-100/30 dark:border-gray-850/30" />
 
 		<EditGroupModal
 			bind:show={showDefaultPermissionsModal}

@@ -84,11 +84,13 @@
 		},
 		features: {
 			api_keys: false,
+			notes: true,
+			channels: true,
+			folders: true,
 			direct_tool_servers: false,
 			web_search: true,
 			image_generation: true,
-			code_interpreter: true,
-			notes: true
+			code_interpreter: true
 		}
 	};
 

@@ -65,7 +65,7 @@
 		</div>
 	</div>
 
-	<hr class="border-gray-50 dark:border-gray-850 my-1" />
+	<hr class="border-gray-50 dark:border-gray-850/30 my-1" />
 
 	<div class="flex flex-col w-full mt-2">
 		<div class=" mb-1 text-xs text-gray-500">{$i18n.t('Setting')}</div>

@@ -54,11 +54,13 @@
 		},
 		features: {
 			api_keys: false,
+			notes: true,
+			channels: true,
+			folders: true,
 			direct_tool_servers: false,
 			web_search: true,
 			image_generation: true,
-			code_interpreter: true,
-			notes: true
+			code_interpreter: true
 		}
 	};
 

@@ -214,7 +216,7 @@
 		</div>
 	</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30" />
 
 	<div>
 		<div class=" mb-2 text-sm font-medium">{$i18n.t('Sharing Permissions')}</div>

@@ -390,7 +392,7 @@
 		{/if}
 	</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30" />
 
 	<div>
 		<div class=" mb-2 text-sm font-medium">{$i18n.t('Chat Permissions')}</div>

@@ -704,7 +706,7 @@
 		{/if}
 	</div>
 
-	<hr class=" border-gray-100 dark:border-gray-850" />
+	<hr class=" border-gray-100/30 dark:border-gray-850/30" />
 
 	<div>
 		<div class=" mb-2 text-sm font-medium">{$i18n.t('Features Permissions')}</div>

@@ -725,6 +727,54 @@
 			{/if}
 		</div>
 
+		<div class="flex flex-col w-full">
+			<div class="flex w-full justify-between my-1">
+				<div class=" self-center text-xs font-medium">
+					{$i18n.t('Notes')}
+				</div>
+				<Switch bind:state={permissions.features.notes} />
+			</div>
+			{#if defaultPermissions?.features?.notes && !permissions.features.notes}
+				<div>
+					<div class="text-xs text-gray-500">
+						{$i18n.t('This is a default user permission and will remain enabled.')}
+					</div>
+				</div>
+			{/if}
+		</div>
+
+		<div class="flex flex-col w-full">
+			<div class="flex w-full justify-between my-1">
+				<div class=" self-center text-xs font-medium">
+					{$i18n.t('Channels')}
+				</div>
+				<Switch bind:state={permissions.features.channels} />
+			</div>
+			{#if defaultPermissions?.features?.channels && !permissions.features.channels}
+				<div>
+					<div class="text-xs text-gray-500">
+						{$i18n.t('This is a default user permission and will remain enabled.')}
+					</div>
+				</div>
+			{/if}
+		</div>
+
+		<div class="flex flex-col w-full">
+			<div class="flex w-full justify-between my-1">
+				<div class=" self-center text-xs font-medium">
+					{$i18n.t('Folders')}
+				</div>
+				<Switch bind:state={permissions.features.folders} />
+			</div>
+			{#if defaultPermissions?.features?.folders && !permissions.features.folders}
+				<div>
+					<div class="text-xs text-gray-500">
+						{$i18n.t('This is a default user permission and will remain enabled.')}
+					</div>
+				</div>
+			{/if}
+		</div>
+
 		<div class="flex flex-col w-full">
 			<div class="flex w-full justify-between my-1">
 				<div class=" self-center text-xs font-medium">

@@ -788,21 +838,5 @@
 				</div>
 			{/if}
 		</div>
-
-		<div class="flex flex-col w-full">
-			<div class="flex w-full justify-between my-1">
-				<div class=" self-center text-xs font-medium">
-					{$i18n.t('Notes')}
-				</div>
-				<Switch bind:state={permissions.features.notes} />
-			</div>
-			{#if defaultPermissions?.features?.notes && !permissions.features.notes}
-				<div>
-					<div class="text-xs text-gray-500">
-						{$i18n.t('This is a default user permission and will remain enabled.')}
-					</div>
-				</div>
-			{/if}
-		</div>
 	</div>
 </div>
@@ -113,7 +113,7 @@
 			class="w-full text-sm text-left text-gray-500 dark:text-gray-400 table-auto max-w-full"
 		>
 			<thead class="text-xs text-gray-800 uppercase bg-transparent dark:text-gray-200">
-				<tr class=" border-b-[1.5px] border-gray-50 dark:border-gray-850">
+				<tr class=" border-b-[1.5px] border-gray-50/50 dark:border-gray-800/10">
 					<th
 						scope="col"
 						class="px-2.5 py-2 cursor-pointer text-left w-8"

@@ -33,6 +33,7 @@
 	import Banner from '$lib/components/common/Banner.svelte';
 	import Markdown from '$lib/components/chat/Messages/Markdown.svelte';
 	import Spinner from '$lib/components/common/Spinner.svelte';
+	import ProfilePreview from '$lib/components/channel/Messages/Message/ProfilePreview.svelte';
 
 	const i18n = getContext('i18n');
 

@@ -96,11 +97,7 @@
 		}
 	};
 
-	$: if (page) {
-		getUserList();
-	}
-
-	$: if (query !== null && orderBy && direction) {
+	$: if (query !== null && page !== null && orderBy !== null && direction !== null) {
 		getUserList();
 	}
 </script>

@@ -221,7 +218,7 @@
 	<div class="scrollbar-hidden relative whitespace-nowrap overflow-x-auto max-w-full">
 		<table class="w-full text-sm text-left text-gray-500 dark:text-gray-400 table-auto max-w-full">
 			<thead class="text-xs text-gray-800 uppercase bg-transparent dark:text-gray-200">
-				<tr class=" border-b-[1.5px] border-gray-50 dark:border-gray-850">
+				<tr class=" border-b-[1.5px] border-gray-50 dark:border-gray-850/30">
 					<th
 						scope="col"
 						class="px-2.5 py-2 cursor-pointer select-none"

@@ -359,14 +356,27 @@
 							</button>
 						</td>
 						<td class="px-3 py-1 font-medium text-gray-900 dark:text-white max-w-48">
-							<div class="flex items-center">
+							<div class="flex items-center gap-2">
+								<ProfilePreview {user} side="right" align="center" sideOffset={6}>
 									<img
-										class="rounded-full w-6 h-6 object-cover mr-2.5 flex-shrink-0"
+										class="rounded-full w-6 h-6 object-cover mr-0.5 flex-shrink-0"
 										src={`${WEBUI_API_BASE_URL}/users/${user.id}/profile/image`}
 										alt="user"
 									/>
+								</ProfilePreview>
 
 								<div class="font-medium truncate">{user.name}</div>
+
+								{#if user?.last_active_at && Date.now() / 1000 - user.last_active_at < 180}
+									<div>
+										<span class="relative flex size-1.5">
+											<span
+												class="absolute inline-flex h-full w-full animate-ping rounded-full bg-green-400 opacity-75"
+											></span>
+											<span class="relative inline-flex size-1.5 rounded-full bg-green-500"></span>
+										</span>
+									</div>
+								{/if}
 							</div>
 						</td>
 						<td class=" px-3 py-1"> {user.email} </td>
@@ -180,7 +180,7 @@
 
 				<div class="flex-1">
 					<select
-						class="w-full capitalize rounded-lg text-sm bg-transparent dark:disabled:text-gray-500 outline-hidden"
+						class="dark:bg-gray-900 w-full capitalize rounded-lg text-sm bg-transparent dark:disabled:text-gray-500 outline-hidden"
 						bind:value={_user.role}
 						placeholder={$i18n.t('Enter Your Role')}
 						required

@@ -207,7 +207,7 @@
 			</div>
 		</div>
 
-		<hr class=" border-gray-100 dark:border-gray-850 my-2.5 w-full" />
+		<hr class=" border-gray-100/30 dark:border-gray-850/30 my-2.5 w-full" />
 
 		<div class="flex flex-col w-full">
 			<div class=" mb-1 text-xs text-gray-500">{$i18n.t('Email')}</div>

@@ -180,12 +180,17 @@
 			</div>
 		</div>
 
-		{#if _user?.oauth_sub}
+		{#if _user?.oauth}
 			<div class="flex flex-col w-full">
 				<div class=" mb-1 text-xs text-gray-500">{$i18n.t('OAuth ID')}</div>
 
-				<div class="flex-1 text-sm break-all mb-1">
-					{_user.oauth_sub ?? ''}
+				<div class="flex-1 text-sm break-all mb-1 flex flex-col space-y-1">
+					{#each Object.keys(_user.oauth) as key}
+						<div>
+							<span class="text-gray-500">{key}</span>
+							<span class="">{_user.oauth[key]?.sub}</span>
+						</div>
+					{/each}
 				</div>
 			</div>
 		{/if}
@@ -4,8 +4,16 @@
 
 	import { onDestroy, onMount, tick } from 'svelte';
 	import { goto } from '$app/navigation';
+	import { v4 as uuidv4 } from 'uuid';
 
-	import { chatId, showSidebar, socket, user } from '$lib/stores';
+	import {
+		chatId,
+		channels,
+		channelId as _channelId,
+		showSidebar,
+		socket,
+		user
+	} from '$lib/stores';
 	import { getChannelById, getChannelMessages, sendMessage } from '$lib/apis/channels';
 
 	import Messages from './Messages.svelte';

@@ -15,9 +23,12 @@
 	import EllipsisVertical from '../icons/EllipsisVertical.svelte';
 	import Thread from './Thread.svelte';
 	import i18n from '$lib/i18n';
+	import Spinner from '../common/Spinner.svelte';
 
 	export let id = '';
 
+	let currentId = null;
+
 	let scrollEnd = true;
 	let messagesContainerElement = null;
 	let chatInputElement = null;

@@ -43,7 +54,37 @@
 		}
 	};
 
+	const updateLastReadAt = async (channelId) => {
+		$socket?.emit('events:channel', {
+			channel_id: channelId,
+			message_id: null,
+			data: {
+				type: 'last_read_at'
+			}
+		});
+
+		channels.set(
+			$channels.map((channel) => {
+				if (channel.id === channelId) {
+					return {
+						...channel,
+						unread_count: 0
+					};
+				}
+				return channel;
+			})
+		);
+	};
+
 	const initHandler = async () => {
+		if (currentId) {
+			updateLastReadAt(currentId);
+		}
+
+		currentId = id;
+		updateLastReadAt(id);
+		_channelId.set(id);
+
 		top = false;
 		messages = null;
 		channel = null;

@@ -78,7 +119,8 @@
 
 		if (type === 'message') {
 			if ((data?.parent_id ?? null) === null) {
-				messages = [data, ...messages];
+				const tempId = data?.temp_id ?? null;
+				messages = [{ ...data, temp_id: null }, ...messages.filter((m) => m?.temp_id !== tempId)];
 
 				if (typingUsers.find((user) => user.id === event.user.id)) {
 					typingUsers = typingUsers.filter((user) => user.id !== event.user.id);

@@ -143,11 +185,30 @@
 			return;
 		}
 
-		const res = await sendMessage(localStorage.token, id, {
+		const tempId = uuidv4();
+
+		const message = {
+			temp_id: tempId,
 			content: content,
 			data: data,
 			reply_to_id: replyToMessage?.id ?? null
-		}).catch((error) => {
+		};
+
+		const ts = Date.now() * 1000000; // nanoseconds
+		messages = [
+			{
+				...message,
+				id: tempId,
+				user_id: $user?.id,
+				user: $user,
+				reply_to_message: replyToMessage ?? null,
+				created_at: ts,
+				updated_at: ts
+			},
+			...messages
+		];
+
+		const res = await sendMessage(localStorage.token, id, message).catch((error) => {
 			toast.error(`${error}`);
 			return null;
 		});

@@ -170,6 +231,8 @@
 				}
 			}
 		});
+
+		updateLastReadAt(id);
 	};
 
 	let mediaQuery;

@@ -197,12 +260,32 @@
 	});
 
 	onDestroy(() => {
+		// last read at
+		updateLastReadAt(id);
+		_channelId.set(null);
 		$socket?.off('events:channel', channelEventHandler);
 	});
 </script>
 
 <svelte:head>
+	{#if channel?.type === 'dm'}
+		<title
+			>{channel?.name.trim() ||
+				channel?.users.reduce((a, e, i, arr) => {
+					if (e.id === $user?.id) {
+						return a;
+					}
+
+					if (a) {
+						return `${a}, ${e.name}`;
+					} else {
+						return e.name;
+					}
+				}, '')} • Open WebUI</title
+		>
+	{:else}
 		<title>#{channel?.name ?? 'Channel'} • Open WebUI</title>
+	{/if}
 </svelte:head>
 
 <div

@@ -213,10 +296,28 @@
 >
 	<PaneGroup direction="horizontal" class="w-full h-full">
 		<Pane defaultSize={50} minSize={50} class="h-full flex flex-col w-full relative">
-			<Navbar {channel} />
+			<Navbar
+				{channel}
+				onPin={(messageId, pinned) => {
+					messages = messages.map((message) => {
+						if (message.id === messageId) {
+							return {
+								...message,
+								is_pinned: pinned
+							};
+						}
+						return message;
+					});
+				}}
+				onUpdate={async () => {
+					channel = await getChannelById(localStorage.token, id).catch((error) => {
+						return null;
+					});
+				}}
+			/>
 
-			{#if channel && messages !== null}
 			<div class="flex-1 overflow-y-auto">
+				{#if channel}
 					<div
 						class=" pb-2.5 max-w-full z-10 scrollbar-hidden w-full h-full pt-6 flex-1 flex flex-col-reverse overflow-auto"
 						id="messages-container"

@@ -256,7 +357,6 @@
 						/>
 					{/key}
 				</div>
-			{/if}
 			</div>
 
 			<div class=" pb-[1rem] px-2.5">

@@ -277,6 +377,13 @@
 					{scrollEnd}
 				/>
 			</div>
+			{:else}
+				<div class=" flex items-center justify-center h-full w-full">
+					<div class="m-auto">
+						<Spinner className="size-5" />
+					</div>
+				</div>
+			{/if}
 		</Pane>
 
 		{#if !largeScreen}

@@ -300,7 +407,7 @@
 			{/if}
 		{:else if threadId !== null}
 			<PaneResizer
-				class="relative flex items-center justify-center group border-l border-gray-50 dark:border-gray-850 hover:border-gray-200 dark:hover:border-gray-800 transition z-20"
+				class="relative flex items-center justify-center group border-l border-gray-50 dark:border-gray-850/30 hover:border-gray-200 dark:hover:border-gray-800 transition z-20"
 				id="controls-resizer"
 			>
 				<div
@@ -3,22 +3,41 @@
 	import { getContext, onMount } from 'svelte';
 	const i18n = getContext('i18n');
 
+	import { removeMembersById } from '$lib/apis/channels';
+
 	import Spinner from '$lib/components/common/Spinner.svelte';
 	import Modal from '$lib/components/common/Modal.svelte';
 
-	import UserPlusSolid from '$lib/components/icons/UserPlusSolid.svelte';
-	import WrenchSolid from '$lib/components/icons/WrenchSolid.svelte';
-	import ConfirmDialog from '$lib/components/common/ConfirmDialog.svelte';
 	import XMark from '$lib/components/icons/XMark.svelte';
 	import Hashtag from '../icons/Hashtag.svelte';
 	import Lock from '../icons/Lock.svelte';
+	import UserList from './ChannelInfoModal/UserList.svelte';
+	import AddMembersModal from './ChannelInfoModal/AddMembersModal.svelte';
 
 	export let show = false;
 	export let channel = null;
 
 	export let onUpdate = () => {};
 
+	let showAddMembersModal = false;
 	const submitHandler = async () => {};
 
+	const removeMemberHandler = async (userId) => {
+		const res = await removeMembersById(localStorage.token, channel.id, {
+			user_ids: [userId]
+		}).catch((error) => {
+			toast.error(`${error}`);
+			return null;
+		});
+
+		if (res) {
+			toast.success($i18n.t('Member removed successfully'));
+			onUpdate();
+		} else {
+			toast.error($i18n.t('Failed to remove member'));
+		}
+	};
+
 	const init = () => {};
 
 	$: if (show) {

@@ -31,24 +50,29 @@
 </script>
 
 {#if channel}
+	<AddMembersModal bind:show={showAddMembersModal} {channel} {onUpdate} />
 	<Modal size="sm" bind:show>
 		<div>
 			<div class=" flex justify-between dark:text-gray-100 px-5 pt-4 mb-1.5">
 				<div class="self-center text-base">
 					<div class="flex items-center gap-0.5 shrink-0">
+						{#if channel?.type === 'dm'}
+							<div class=" text-left self-center overflow-hidden w-full line-clamp-1 flex-1">
+								{$i18n.t('Direct Message')}
+							</div>
+						{:else}
 							<div class=" size-4 justify-center flex items-center">
-								{#if channel?.access_control === null}
+								{#if channel?.type === 'group' ? !channel?.is_private : channel?.access_control === null}
 									<Hashtag className="size-3.5" strokeWidth="2.5" />
 								{:else}
 									<Lock className="size-5.5" strokeWidth="2" />
 								{/if}
 							</div>
 
-							<div
-								class=" text-left self-center overflow-hidden w-full line-clamp-1 capitalize flex-1"
-							>
+							<div class=" text-left self-center overflow-hidden w-full line-clamp-1 flex-1">
 								{channel.name}
 							</div>
+						{/if}
 					</div>
 				</div>
 				<button

@@ -61,7 +85,7 @@
 				</button>
 			</div>
 
-			<div class="flex flex-col md:flex-row w-full px-4 pb-4 md:space-x-4 dark:text-gray-200">
+			<div class="flex flex-col md:flex-row w-full px-3 pb-4 md:space-x-4 dark:text-gray-200">
 				<div class=" flex flex-col w-full sm:flex-row sm:justify-center sm:space-x-6">
 					<form
 						class="flex flex-col w-full"

@@ -71,7 +95,21 @@
 						}}
 					>
 						<div class="flex flex-col w-full h-full pb-2">
-							<UserList {channel} />
+							<UserList
+								{channel}
+								onAdd={channel?.type === 'group' && channel?.is_manager
+									? () => {
+											showAddMembersModal = true;
+										}
+									: null}
+								onRemove={channel?.type === 'group' && channel?.is_manager
+									? (userId) => {
+											removeMemberHandler(userId);
+										}
+									: null}
+								search={channel?.type !== 'dm'}
+								sort={channel?.type !== 'dm'}
+							/>
 						</div>
 					</form>
 				</div>
@@ -0,0 +1,106 @@
+<script lang="ts">
+	import { toast } from 'svelte-sonner';
+	import { getContext, onMount } from 'svelte';
+	const i18n = getContext('i18n');
+
+	import { addMembersById } from '$lib/apis/channels';
+
+	import Modal from '$lib/components/common/Modal.svelte';
+	import XMark from '$lib/components/icons/XMark.svelte';
+	import MemberSelector from '$lib/components/workspace/common/MemberSelector.svelte';
+	import Spinner from '$lib/components/common/Spinner.svelte';
+
+	export let show = false;
+	export let channel = null;
+
+	export let onUpdate = () => {};
+
+	let groupIds = [];
+	let userIds = [];
+
+	let loading = false;
+
+	const submitHandler = async () => {
+		const res = await addMembersById(localStorage.token, channel.id, {
+			user_ids: userIds,
+			group_ids: groupIds
+		}).catch((error) => {
+			toast.error(`${error}`);
+			return null;
+		});
+
+		if (res) {
+			toast.success($i18n.t('Members added successfully'));
+			onUpdate();
+			show = false;
+		} else {
+			toast.error($i18n.t('Failed to add members'));
+		}
+	};
+
+	const reset = () => {
+		userIds = [];
+		groupIds = [];
+		loading = false;
+	};
+
+	$: if (!show) {
+		reset();
+	}
+</script>
+
+{#if channel}
+	<Modal size="sm" bind:show>
+		<div>
+			<div class=" flex justify-between dark:text-gray-100 px-5 pt-4 mb-1.5">
+				<div class="self-center text-base">
+					<div class="flex items-center gap-0.5 shrink-0">
+						{$i18n.t('Add Members')}
+					</div>
+				</div>
+				<button
+					class="self-center"
+					on:click={() => {
+						show = false;
+					}}
+				>
+					<XMark className={'size-5'} />
+				</button>
+			</div>
+
+			<div class="flex flex-col md:flex-row w-full px-3 pb-4 md:space-x-4 dark:text-gray-200">
+				<div class=" flex flex-col w-full sm:flex-row sm:justify-center sm:space-x-6">
+					<form
+						class="flex flex-col w-full"
+						on:submit={(e) => {
+							e.preventDefault();
+							submitHandler();
+						}}
+					>
+						<div class="flex flex-col w-full h-full pb-2">
+							<MemberSelector bind:userIds bind:groupIds includeGroups={true} />
+						</div>
+
+						<div class="flex justify-end pt-3 text-sm font-medium gap-1.5">
+							<button
+								class="px-3.5 py-1.5 text-sm font-medium bg-black hover:bg-gray-950 text-white dark:bg-white dark:text-black dark:hover:bg-gray-100 transition rounded-full flex flex-row space-x-1 items-center {loading
+									? ' cursor-not-allowed'
+									: ''}"
+								type="submit"
+								disabled={loading}
+							>
+								{$i18n.t('Add')}
+
+								{#if loading}
+									<div class="ml-2 self-center">
+										<Spinner />
+									</div>
+								{/if}
+							</button>
+						</div>
+					</form>
+				</div>
+			</div>
+		</div>
+	</Modal>
+{/if}
@@ -1,6 +1,6 @@
<script>
	import { WEBUI_API_BASE_URL, WEBUI_BASE_URL } from '$lib/constants';
	import { WEBUI_NAME, config, user, showSidebar } from '$lib/stores';
	import { WEBUI_NAME, config, user as _user, showSidebar } from '$lib/stores';
	import { goto } from '$app/navigation';
	import { onMount, getContext } from 'svelte';

@@ -11,33 +11,27 @@
	dayjs.extend(localizedFormat);

	import { toast } from 'svelte-sonner';
	import { getChannelUsersById } from '$lib/apis/channels';
	import { getChannelMembersById } from '$lib/apis/channels';

	import Pagination from '$lib/components/common/Pagination.svelte';
	import ChatBubbles from '$lib/components/icons/ChatBubbles.svelte';
	import Tooltip from '$lib/components/common/Tooltip.svelte';

	import EditUserModal from '$lib/components/admin/Users/UserList/EditUserModal.svelte';
	import UserChatsModal from '$lib/components/admin/Users/UserList/UserChatsModal.svelte';
	import AddUserModal from '$lib/components/admin/Users/UserList/AddUserModal.svelte';

	import ConfirmDialog from '$lib/components/common/ConfirmDialog.svelte';
	import RoleUpdateConfirmDialog from '$lib/components/common/ConfirmDialog.svelte';

	import Badge from '$lib/components/common/Badge.svelte';
	import Plus from '$lib/components/icons/Plus.svelte';
	import ChevronUp from '$lib/components/icons/ChevronUp.svelte';
	import ChevronDown from '$lib/components/icons/ChevronDown.svelte';
	import About from '$lib/components/chat/Settings/About.svelte';
	import Banner from '$lib/components/common/Banner.svelte';
	import Markdown from '$lib/components/chat/Messages/Markdown.svelte';
	import Spinner from '$lib/components/common/Spinner.svelte';
	import ProfilePreview from '../Messages/Message/ProfilePreview.svelte';
	import XMark from '$lib/components/icons/XMark.svelte';

	const i18n = getContext('i18n');

	export let channel = null;

	export let onAdd = null;
	export let onRemove = null;

	export let search = true;
	export let sort = true;

	let page = 1;

	let users = null;
@@ -48,6 +42,10 @@
	let direction = 'asc'; // default sort order

	const setSortKey = (key) => {
		if (!sort) {
			return;
		}

		if (orderBy === key) {
			direction = direction === 'asc' ? 'desc' : 'asc';
		} else {
@@ -58,7 +56,7 @@

	const getUserList = async () => {
		try {
			const res = await getChannelUsersById(
			const res = await getChannelMembersById(
				localStorage.token,
				channel.id,
				query,
@@ -79,11 +77,13 @@
		}
	};

	$: if (page) {
		getUserList();
	}

	$: if (query !== null && orderBy && direction) {
	$: if (
		channel !== null &&
		page !== null &&
		query !== null &&
		orderBy !== null &&
		direction !== null
	) {
		getUserList();
	}
</script>
@@ -94,9 +94,33 @@
		<Spinner className="size-5" />
	</div>
{:else}
	<div class="flex gap-1">
	<div class="flex items-center justify-between px-2 mb-1">
		<div class="flex gap-1 items-center">
			<span class="text-sm">
				{$i18n.t('Members')}
			</span>
			<span class="text-sm text-gray-500">{total}</span>
		</div>

		{#if onAdd}
			<div class="">
				<button
					type="button"
					class=" px-3 py-1.5 gap-1 rounded-xl bg-black dark:text-white dark:bg-gray-850/50 text-black transition font-medium text-xs flex items-center justify-center"
					on:click={onAdd}
				>
					<Plus className="size-3.5 " />
					<span>{$i18n.t('Add Member')}</span>
				</button>
			</div>
		{/if}
	</div>
	<!-- <hr class="my-1 border-gray-100/5- dark:border-gray-850/50" /> -->

	{#if search}
		<div class="flex gap-1 px-1 mb-1">
			<div class=" flex w-full space-x-2">
				<div class="flex flex-1">
				<div class="flex flex-1 items-center">
					<div class=" self-center ml-1 mr-3">
						<svg
							xmlns="http://www.w3.org/2000/svg"
@@ -119,17 +143,19 @@
					</div>
				</div>
			</div>
	{/if}

	{#if users.length > 0}
		<div class="scrollbar-hidden relative whitespace-nowrap w-full max-w-full">
			<div class=" text-sm text-left text-gray-500 dark:text-gray-400 w-full max-w-full">
				<div
				<!-- <div
					class="text-xs text-gray-800 uppercase bg-transparent dark:text-gray-200 w-full mb-0.5"
				>
					<div
						class=" border-b-[1.5px] border-gray-50 dark:border-gray-850 flex items-center justify-between"
						class=" border-b-[1.5px] border-gray-50/50 dark:border-gray-800/10 flex items-center justify-between"
					>
						<button
							type="button"
							class="px-2.5 py-2 cursor-pointer select-none"
							on:click={() => setSortKey('name')}
						>
@@ -153,6 +179,7 @@
						</button>

						<button
							type="button"
							class="px-2.5 py-2 cursor-pointer select-none"
							on:click={() => setSortKey('role')}
						>
@@ -175,11 +202,11 @@
							</div>
						</button>
					</div>
				</div>
				</div> -->
				<div class="w-full">
					{#each users as user, userIdx}
					{#each users as user, userIdx (user.id)}
						<div class=" dark:border-gray-850 text-xs flex items-center justify-between">
							<div class="px-3 py-1.5 font-medium text-gray-900 dark:text-white flex-1">
							<div class="px-2 py-1.5 font-medium text-gray-900 dark:text-white flex-1">
								<div class="flex items-center gap-2">
									<ProfilePreview {user} side="right" align="center" sideOffset={6}>
										<img
@@ -206,8 +233,8 @@
								</div>
							</div>

							<div class="px-3 py-1">
								<div class=" translate-y-0.5">
							<div class="px-2 py-1 flex items-center gap-1 translate-y-0.5">
								<div class=" ">
									<Badge
										type={user.role === 'admin'
											? 'info'
@@ -217,6 +244,21 @@
										content={$i18n.t(user.role)}
									/>
								</div>

								{#if onRemove}
									<div>
										<button
											class=" rounded-full p-1 hover:bg-gray-100 dark:hover:bg-gray-850 transition disabled:opacity-50 disabled:cursor-not-allowed"
											type="button"
											disabled={user.id === $_user?.id}
											on:click={() => {
												onRemove(user.id);
											}}
										>
											<XMark />
										</button>
									</div>
								{/if}
							</div>
						</div>
					{/each}
@@ -775,7 +775,7 @@
		>
			<div
				id="message-input-container"
				class="flex-1 flex flex-col relative w-full shadow-lg rounded-3xl border border-gray-50 dark:border-gray-850 hover:border-gray-100 focus-within:border-gray-100 hover:dark:border-gray-800 focus-within:dark:border-gray-800 transition px-1 bg-white/90 dark:bg-gray-400/5 dark:text-gray-100"
				class="flex-1 flex flex-col relative w-full shadow-lg rounded-3xl border border-gray-50 dark:border-gray-850/30 hover:border-gray-100 focus-within:border-gray-100 hover:dark:border-gray-800 focus-within:dark:border-gray-800 transition px-1 bg-white/90 dark:bg-gray-400/5 dark:text-gray-100"
				dir={$settings?.chatDirection ?? 'auto'}
			>
				{#if replyToMessage !== null}
@@ -865,7 +865,7 @@
				<div
					class="scrollbar-hidden rtl:text-right ltr:text-left bg-transparent dark:text-gray-100 outline-hidden w-full pt-2.5 pb-[5px] px-1 resize-none h-fit max-h-96 overflow-auto"
				>
					{#key $settings?.richTextInput}
					{#key $settings?.richTextInput && $settings?.showFormattingToolbar}
						<RichTextInput
							id="chat-input"
							bind:this={chatInputElement}
@@ -111,7 +111,9 @@
		if (channelSuggestions) {
			// Add a dummy channel item
			_channels = [
				...$channels.map((c) => ({ type: 'channel', id: c.id, label: c.name, data: c }))
				...$channels
					.filter((c) => c?.type !== 'dm')
					.map((c) => ({ type: 'channel', id: c.id, label: c.name, data: c }))
			];
		} else {
			if (userSuggestions) {
@@ -16,7 +16,14 @@
	import Message from './Messages/Message.svelte';
	import Loader from '../common/Loader.svelte';
	import Spinner from '../common/Spinner.svelte';
	import { addReaction, deleteMessage, removeReaction, updateMessage } from '$lib/apis/channels';
	import {
		addReaction,
		deleteMessage,
		pinMessage,
		removeReaction,
		updateMessage
	} from '$lib/apis/channels';
	import { WEBUI_API_BASE_URL } from '$lib/constants';

	const i18n = getContext('i18n');
@@ -68,7 +75,31 @@
	<div class="px-5 max-w-full mx-auto">
		{#if channel}
			<div class="flex flex-col gap-1.5 pb-5 pt-10">
				<div class="text-2xl font-medium capitalize">{channel.name}</div>
				{#if channel?.type === 'dm'}
					<div class="flex ml-[1px] mr-0.5">
						{#each channel.users.filter((u) => u.id !== $user?.id).slice(0, 2) as u, index}
							<img
								src={`${WEBUI_API_BASE_URL}/users/${u.id}/profile/image`}
								alt={u.name}
								class=" size-7.5 rounded-full border-2 border-white dark:border-gray-900 {index ===
								1
									? '-ml-2.5'
									: ''}"
							/>
						{/each}
					</div>
				{/if}

				<div class="text-2xl font-medium capitalize">
					{#if channel?.name}
						{channel.name}
					{:else}
						{channel?.users
							?.filter((u) => u.id !== $user?.id)
							.map((u) => u.name)
							.join(', ')}
					{/if}
				</div>

				<div class=" text-gray-500">
					{$i18n.t(
@@ -97,11 +128,12 @@
						{message}
						{thread}
						replyToMessage={replyToMessage?.id === message.id}
						disabled={!channel?.write_access}
						disabled={!channel?.write_access || message?.temp_id}
						pending={!!message?.temp_id}
						showUserProfile={messageIdx === 0 ||
							messageList.at(messageIdx - 1)?.user_id !== message.user_id ||
							messageList.at(messageIdx - 1)?.meta?.model_id !== message?.meta?.model_id ||
							message?.reply_to_message}
							message?.reply_to_message !== null}
						onDelete={() => {
							messages = messages.filter((m) => m.id !== message.id);

@@ -130,6 +162,26 @@
						onReply={(message) => {
							onReply(message);
						}}
						onPin={async (message) => {
							messages = messages.map((m) => {
								if (m.id === message.id) {
									m.is_pinned = !m.is_pinned;
									m.pinned_by = !m.is_pinned ? null : $user?.id;
									m.pinned_at = !m.is_pinned ? null : Date.now() * 1000000;
								}
								return m;
							});

							const updatedMessage = await pinMessage(
								localStorage.token,
								message.channel_id,
								message.id,
								message.is_pinned
							).catch((error) => {
								toast.error(`${error}`);
								return null;
							});
						}}
						onThread={(id) => {
							onThread(id);
						}}
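A detail worth noting in the optimistic onPin handler above: pinned_at is written as Date.now() * 1000000, and the Message component divides created_at by 1000000 before handing it to dayjs, so channel timestamps are kept in nanoseconds while dayjs and Date.now() work in milliseconds. A minimal TypeScript sketch of that convention (the helper names are illustrative, not from the source):

// Channel message timestamps are stored in nanoseconds; dayjs/Date.now() use milliseconds.
const msToNs = (ms: number): number => ms * 1_000_000;
const nsToMs = (ns: number): number => ns / 1_000_000;

const pinnedAt = msToNs(Date.now()); // matches `Date.now() * 1000000` in the handler above
// dayjs(nsToMs(pinnedAt)).format('LT') // matches `created_at / 1000000` in the template code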
@@ -137,7 +189,7 @@
							if (
								(message?.reactions ?? [])
									.find((reaction) => reaction.name === name)
									?.user_ids?.includes($user?.id) ??
									?.users?.some((u) => u.id === $user?.id) ??
								false
							) {
								messages = messages.map((m) => {
@@ -145,8 +197,8 @@
									const reaction = m.reactions.find((reaction) => reaction.name === name);

									if (reaction) {
										reaction.user_ids = reaction.user_ids.filter((id) => id !== $user?.id);
										reaction.count = reaction.user_ids.length;
										reaction.users = reaction.users.filter((u) => u.id !== $user?.id);
										reaction.count = reaction.users.length;

										if (reaction.count === 0) {
											m.reactions = m.reactions.filter((r) => r.name !== name);
@@ -172,12 +224,12 @@
									const reaction = m.reactions.find((reaction) => reaction.name === name);

									if (reaction) {
										reaction.user_ids.push($user?.id);
										reaction.count = reaction.user_ids.length;
										reaction.users.push({ id: $user?.id, name: $user?.name });
										reaction.count = reaction.users.length;
									} else {
										m.reactions.push({
											name: name,
											user_ids: [$user?.id],
											users: [{ id: $user?.id, name: $user?.name }],
											count: 1
										});
									}
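The two hunks above switch reactions from an id-only user_ids array to a users array that also carries display names, which is what lets the tooltip list who reacted without extra lookups. The before/after shape, written out as TypeScript types for illustration (the type names themselves are not from the source):

// Before: only ids were kept per reaction.
type ReactionBefore = {
	name: string; // emoji short code, e.g. "thumbsup"
	user_ids: string[];
	count: number;
};

// After: each reactor is an object, so names can be rendered directly.
type ReactionAfter = {
	name: string;
	users: { id: string; name: string }[];
	count: number;
};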
@@ -36,6 +36,10 @@
	import Emoji from '$lib/components/common/Emoji.svelte';
	import Skeleton from '$lib/components/chat/Messages/Skeleton.svelte';
	import ArrowUpLeftAlt from '$lib/components/icons/ArrowUpLeftAlt.svelte';
	import PinSlash from '$lib/components/icons/PinSlash.svelte';
	import Pin from '$lib/components/icons/Pin.svelte';

	export let className = '';

	export let message;
	export let showUserProfile = true;
@@ -43,10 +47,12 @@

	export let replyToMessage = false;
	export let disabled = false;
	export let pending = false;

	export let onDelete: Function = () => {};
	export let onEdit: Function = () => {};
	export let onReply: Function = () => {};
	export let onPin: Function = () => {};
	export let onThread: Function = () => {};
	export let onReaction: Function = () => {};
@@ -69,13 +75,17 @@
{#if message}
	<div
		id="message-{message.id}"
		class="flex flex-col justify-between px-5 {showUserProfile
			? 'pt-1.5 pb-0.5'
			: ''} w-full max-w-full mx-auto group hover:bg-gray-300/5 dark:hover:bg-gray-700/5 transition relative {replyToMessage
			? 'border-l-4 border-blue-500 bg-blue-100/10 dark:bg-blue-100/5 pl-4'
			: ''} {(message?.reply_to_message?.meta?.model_id ?? message?.reply_to_message?.user_id) ===
		class="flex flex-col justify-between w-full max-w-full mx-auto group hover:bg-gray-300/5 dark:hover:bg-gray-700/5 transition relative {className
			? className
			: `px-5 ${
					replyToMessage ? 'border-l-4 border-blue-500 bg-blue-100/10 dark:bg-blue-100/5 pl-4' : ''
				} ${
					(message?.reply_to_message?.meta?.model_id ?? message?.reply_to_message?.user_id) ===
					$user?.id
						? 'border-l-4 border-orange-500 bg-orange-100/10 dark:bg-orange-100/5 pl-4'
						: ''
				} ${message?.is_pinned ? 'bg-yellow-100/20 dark:bg-yellow-100/5' : ''}`} {showUserProfile
			? 'pt-1.5 pb-0.5'
			: ''}"
	>
		{#if !edit && !disabled}
@@ -83,8 +93,9 @@
				class=" absolute {showButtons ? '' : 'invisible group-hover:visible'} right-1 -top-2 z-10"
			>
				<div
					class="flex gap-1 rounded-lg bg-white dark:bg-gray-850 shadow-md p-0.5 border border-gray-100 dark:border-gray-850"
					class="flex gap-1 rounded-lg bg-white dark:bg-gray-850 shadow-md p-0.5 border border-gray-100/30 dark:border-gray-850/30"
				>
					{#if onReaction}
						<EmojiPicker
							onClose={() => (showButtons = false)}
							onSubmit={(name) => {
@@ -103,7 +114,9 @@
							</button>
						</Tooltip>
					</EmojiPicker>
					{/if}

					{#if onReply}
						<Tooltip content={$i18n.t('Reply')}>
							<button
								class="hover:bg-gray-100 dark:hover:bg-gray-800 transition rounded-lg p-0.5"
@@ -114,8 +127,24 @@
								<ArrowUpLeftAlt className="size-5" />
							</button>
						</Tooltip>
					{/if}

					{#if !thread}
						<Tooltip content={message?.is_pinned ? $i18n.t('Unpin') : $i18n.t('Pin')}>
							<button
								class="hover:bg-gray-100 dark:hover:bg-gray-800 transition rounded-lg p-1"
								on:click={() => {
									onPin(message);
								}}
							>
								{#if message?.is_pinned}
									<PinSlash className="size-4" />
								{:else}
									<Pin className="size-4" />
								{/if}
							</button>
						</Tooltip>

					{#if !thread && onThread}
						<Tooltip content={$i18n.t('Reply in Thread')}>
							<button
								class="hover:bg-gray-100 dark:hover:bg-gray-800 transition rounded-lg p-1"
@@ -129,6 +158,7 @@
					{/if}

					{#if message.user_id === $user?.id || $user?.role === 'admin'}
						{#if onEdit}
						<Tooltip content={$i18n.t('Edit')}>
							<button
								class="hover:bg-gray-100 dark:hover:bg-gray-800 transition rounded-lg p-1"
@@ -140,7 +170,9 @@
								<Pencil />
							</button>
						</Tooltip>
						{/if}

						{#if onDelete}
						<Tooltip content={$i18n.t('Delete')}>
							<button
								class="hover:bg-gray-100 dark:hover:bg-gray-800 transition rounded-lg p-1"
@@ -150,6 +182,16 @@
							</button>
						</Tooltip>
						{/if}
					{/if}
				</div>
			</div>
		{/if}

		{#if message?.is_pinned}
			<div class="flex {showUserProfile ? 'mb-0.5' : 'mt-0.5'}">
				<div class="ml-8.5 flex items-center gap-1 px-1 rounded-full text-xs">
					<Pin className="size-3 text-yellow-500 dark:text-yellow-300" />
					<span class="text-gray-500">{$i18n.t('Pinned')}</span>
				</div>
			</div>
		{/if}
@@ -203,12 +245,13 @@
				</button>
			</div>
		{/if}

		<div
			class=" flex w-full message-{message.id}"
			class=" flex w-full message-{message.id} "
			id="message-{message.id}"
			dir={$settings.chatDirection}
		>
			<div class={`shrink-0 mr-3 w-9`}>
			<div class={`shrink-0 mr-1 w-9`}>
				{#if showUserProfile}
					{#if message?.meta?.model_id}
						<img
@@ -239,7 +282,7 @@
				{/if}
			</div>

			<div class="flex-auto w-0 pl-1">
			<div class="flex-auto w-0 pl-2">
				{#if showUserProfile}
					<Name>
						<div class=" self-end text-base shrink-0 font-medium truncate">
@@ -252,14 +295,18 @@

							{#if message.created_at}
								<div
									class=" self-center text-xs invisible group-hover:visible text-gray-400 font-medium first-letter:capitalize ml-0.5 translate-y-[1px]"
									class=" self-center text-xs text-gray-400 font-medium first-letter:capitalize ml-0.5 translate-y-[1px]"
								>
									<Tooltip content={dayjs(message.created_at / 1000000).format('LLLL')}>
										<span class="line-clamp-1">
											{#if dayjs(message.created_at / 1000000).isToday()}
												{dayjs(message.created_at / 1000000).format('LT')}
											{:else}
												{$i18n.t(formatDate(message.created_at / 1000000), {
													LOCALIZED_TIME: dayjs(message.created_at / 1000000).format('LT'),
													LOCALIZED_DATE: dayjs(message.created_at / 1000000).format('L')
												})}
											{/if}
										</span>
									</Tooltip>
								</div>
@@ -334,15 +381,16 @@
						</div>
					</div>
				{:else}
					<div class=" min-w-full markdown-prose">
					<div class=" min-w-full markdown-prose {pending ? 'opacity-50' : ''}">
						{#if (message?.content ?? '').trim() === '' && message?.meta?.model_id}
							<Skeleton />
						{:else}
							<Markdown
								id={message.id}
								content={message.content}
								paragraphTag="span"
							/>{#if message.created_at !== message.updated_at && (message?.meta?.model_id ?? null) === null}<span
								class="text-gray-500 text-[10px]">({$i18n.t('edited')})</span
								class="text-gray-500 text-[10px] pl-1 self-center">({$i18n.t('edited')})</span
							>{/if}
						{/if}
					</div>
@@ -351,28 +399,64 @@
				<div>
					<div class="flex items-center flex-wrap gap-y-1.5 gap-1 mt-1 mb-2">
						{#each message.reactions as reaction}
							<Tooltip content={`:${reaction.name}:`}>
							<Tooltip
								content={$i18n.t('{{NAMES}} reacted with {{REACTION}}', {
									NAMES: reaction.users
										.reduce((acc, u, idx) => {
											const name = u.id === $user?.id ? $i18n.t('You') : u.name;
											const total = reaction.users.length;

											// First three names always added normally
											if (idx < 3) {
												const separator =
													idx === 0
														? ''
														: idx === Math.min(2, total - 1)
															? ` ${$i18n.t('and')} `
															: ', ';
												return `${acc}${separator}${name}`;
											}

											// More than 4 → "and X others"
											if (idx === 3 && total > 4) {
												return (
													acc +
													` ${$i18n.t('and {{COUNT}} others', {
														COUNT: total - 3
													})}`
												);
											}

											return acc;
										}, '')
										.trim(),
									REACTION: `:${reaction.name}:`
								})}
							>
								<button
									class="flex items-center gap-1.5 transition rounded-xl px-2 py-1 cursor-pointer {reaction.user_ids.includes(
										$user?.id
									)
									class="flex items-center gap-1.5 transition rounded-xl px-2 py-1 cursor-pointer {reaction.users
										.map((u) => u.id)
										.includes($user?.id)
										? ' bg-blue-300/10 outline outline-blue-500/50 outline-1'
										: 'bg-gray-300/10 dark:bg-gray-500/10 hover:outline hover:outline-gray-700/30 dark:hover:outline-gray-300/30 hover:outline-1'}"
									on:click={() => {
										if (onReaction) {
											onReaction(reaction.name);
										}
									}}
								>
									<Emoji shortCode={reaction.name} />

									{#if reaction.user_ids.length > 0}
									{#if reaction.users.length > 0}
										<div class="text-xs font-medium text-gray-500 dark:text-gray-400">
											{reaction.user_ids?.length}
											{reaction.users?.length}
										</div>
									{/if}
								</button>
							</Tooltip>
						{/each}

						{#if onReaction}
							<EmojiPicker
								onSubmit={(name) => {
									onReaction(name);
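The NAMES reducer above joins the first three reactor names with commas and a localized "and", then collapses anything past the fourth into "and N others". Restated as a standalone TypeScript function so the output can be checked by eye; the i18n calls are replaced with plain strings and the function name is illustrative, not from the source.

// Mirrors the template's reducer, with $i18n.t(...) swapped for plain strings.
const summarizeReactors = (names: string[]): string =>
	names
		.reduce((acc, name, idx) => {
			const total = names.length;
			if (idx < 3) {
				const separator = idx === 0 ? '' : idx === Math.min(2, total - 1) ? ' and ' : ', ';
				return `${acc}${separator}${name}`;
			}
			if (idx === 3 && total > 4) {
				return `${acc} and ${total - 3} others`;
			}
			return acc;
		}, '')
		.trim();

// summarizeReactors(['You', 'Bob'])                         -> "You and Bob"
// summarizeReactors(['You', 'Bob', 'Carol'])                -> "You, Bob and Carol"
// summarizeReactors(['You', 'Bob', 'Carol', 'Dan', 'Erin']) -> "You, Bob and Carol and 2 others"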
@@ -386,6 +470,7 @@
								</div>
							</Tooltip>
						</EmojiPicker>
					{/if}
				</div>
			</div>
		{/if}
@@ -11,11 +11,21 @@
	export let align = 'center';
	export let side = 'right';
	export let sideOffset = 8;

	let openPreview = false;
</script>

<LinkPreview.Root openDelay={0} closeDelay={0}>
	<LinkPreview.Trigger class=" cursor-pointer no-underline! font-normal! ">
<LinkPreview.Root openDelay={0} closeDelay={200} bind:open={openPreview}>
	<LinkPreview.Trigger class="flex items-center">
		<button
			type="button"
			class=" cursor-pointer no-underline! font-normal!"
			on:click={() => {
				openPreview = !openPreview;
			}}
		>
			<slot />
		</button>
	</LinkPreview.Trigger>

	<UserStatusLinkPreview id={user?.id} {side} {align} {sideOffset} />
@@ -2,17 +2,43 @@
	import { getContext, onMount } from 'svelte';

	const i18n = getContext('i18n');

	import { user as _user, channels, socket } from '$lib/stores';
	import { WEBUI_API_BASE_URL, WEBUI_BASE_URL } from '$lib/constants';
	import { getChannels, getDMChannelByUserId } from '$lib/apis/channels';

	import ChatBubbles from '$lib/components/icons/ChatBubbles.svelte';
	import ChatBubble from '$lib/components/icons/ChatBubble.svelte';
	import ChatBubbleOval from '$lib/components/icons/ChatBubbleOval.svelte';
	import { goto } from '$app/navigation';
	import Emoji from '$lib/components/common/Emoji.svelte';
	import Tooltip from '$lib/components/common/Tooltip.svelte';

	export let user = null;

	const directMessageHandler = async () => {
		if (!user) {
			return;
		}

		const res = await getDMChannelByUserId(localStorage.token, user.id).catch((error) => {
			console.error('Error fetching DM channel:', error);
			return null;
		});

		if (res) {
			goto(`/channels/${res.id}`);
		}
	};
</script>

{#if user}
	<div class=" flex gap-3.5 w-full py-3 px-3 items-center">
	<div class="py-3">
		<div class=" flex gap-3.5 w-full px-3 items-center">
			<div class=" items-center flex shrink-0">
				<img
					src={`${WEBUI_API_BASE_URL}/users/${user?.id}/profile/image`}
					class=" size-12 object-cover rounded-xl"
					class=" size-14 object-cover rounded-xl"
					alt="profile"
				/>
			</div>
@@ -23,7 +49,7 @@
			</div>

			<div class=" flex items-center gap-2">
				{#if user?.active}
				{#if user?.is_active}
					<div>
						<span class="relative flex size-2">
							<span
@@ -46,4 +72,56 @@
				</div>
			</div>
		</div>

		{#if user?.status_emoji || user?.status_message}
			<div class="mx-2 mt-2">
				<Tooltip content={user?.status_message}>
					<div
						class="w-full gap-2 px-2.5 py-1.5 rounded-xl bg-gray-50 dark:text-white dark:bg-gray-900/50 text-black transition text-xs flex items-center"
					>
						{#if user?.status_emoji}
							<div class=" self-center shrink-0">
								<Emoji className="size-4" shortCode={user?.status_emoji} />
							</div>
						{/if}
						<div class=" self-center line-clamp-2 flex-1 text-left">
							{user?.status_message}
						</div>
					</div>
				</Tooltip>
			</div>
		{/if}

		{#if user?.bio}
			<div class="mx-3.5 mt-2">
				<Tooltip content={user?.bio}>
					<div class=" self-center line-clamp-3 flex-1 text-left text-xs">
						{user?.bio}
					</div>
				</Tooltip>
			</div>
		{/if}

		{#if $_user?.id !== user.id}
			<hr class="border-gray-100/50 dark:border-gray-800/50 my-2.5" />

			<div class=" flex flex-col w-full px-2.5 items-center">
				<button
					class="w-full text-left px-3 py-1.5 rounded-xl border border-gray-100/50 dark:border-gray-800/50 hover:bg-gray-50 dark:hover:bg-gray-850 transition flex items-center gap-2 text-sm"
					type="button"
					on:click={() => {
						directMessageHandler();
					}}
				>
					<div>
						<ChatBubbleOval className="size-4" />
					</div>

					<div class="font-medium">
						{$i18n.t('Message')}
					</div>
				</button>
			</div>
		{/if}
	</div>
{/if}
@@ -14,7 +14,6 @@
	export let sideOffset = 6;

	let user = null;

	onMount(async () => {
		if (id) {
			user = await getUserById(localStorage.token, id).catch((error) => {
@@ -27,7 +26,7 @@

{#if user}
	<LinkPreview.Content
		class="w-full max-w-[260px] rounded-2xl border border-gray-100 dark:border-gray-800 z-[99999] bg-white dark:bg-gray-850 dark:text-white shadow-lg transition"
		class="w-full max-w-[260px] rounded-2xl border border-gray-100 dark:border-gray-800 z-[9999] bg-white dark:bg-gray-850 dark:text-white shadow-lg transition"
		{side}
		{align}
		{sideOffset}
@@ -17,16 +17,24 @@
	import Lock from '../icons/Lock.svelte';
	import UserAlt from '../icons/UserAlt.svelte';
	import ChannelInfoModal from './ChannelInfoModal.svelte';
	import Users from '../icons/Users.svelte';
	import Pin from '../icons/Pin.svelte';
	import PinnedMessagesModal from './PinnedMessagesModal.svelte';

	const i18n = getContext('i18n');

	let showChannelPinnedMessagesModal = false;
	let showChannelInfoModal = false;

	export let channel;

	export let onPin = (messageId, pinned) => {};
	export let onUpdate = () => {};
</script>

<ChannelInfoModal bind:show={showChannelInfoModal} {channel} />
<nav class="sticky top-0 z-30 w-full px-1.5 py-1 -mb-8 flex items-center drag-region">
<PinnedMessagesModal bind:show={showChannelPinnedMessagesModal} {channel} {onPin} />
<ChannelInfoModal bind:show={showChannelInfoModal} {channel} {onUpdate} />
<nav class="sticky top-0 z-30 w-full px-1.5 py-1 -mb-8 flex items-center drag-region flex flex-col">
	<div
		id="navbar-bg-gradient-to-b"
		class=" bg-linear-to-b via-50% from-white via-white to-transparent dark:from-gray-900 dark:via-gray-900 dark:to-transparent pointer-events-none absolute inset-0 -bottom-7 z-[-1]"
@@ -60,30 +68,89 @@
		{/if}

		<div
			class="flex-1 overflow-hidden max-w-full py-0.5
			class="flex-1 overflow-hidden max-w-full py-0.5 flex items-center
		{$showSidebar ? 'ml-1' : ''}
		"
		>
			{#if channel}
				<div class="flex items-center gap-0.5 shrink-0">
					<div class=" size-4 justify-center flex items-center">
						{#if channel?.access_control === null}
							<Hashtag className="size-3" strokeWidth="2.5" />
					{#if channel?.type === 'dm'}
						{#if channel?.users}
							{@const channelMembers = channel.users.filter((u) => u.id !== $user?.id)}
							<div class="flex mr-1.5 relative">
								{#each channelMembers.slice(0, 2) as u, index}
									<img
										src={`${WEBUI_API_BASE_URL}/users/${u.id}/profile/image`}
										alt={u.name}
										class=" size-6.5 rounded-full border-2 border-white dark:border-gray-900 {index ===
										1
											? '-ml-3'
											: ''}"
									/>
								{/each}

								{#if channelMembers.length === 1}
									<div class="absolute bottom-0 right-0">
										<span class="relative flex size-2">
											{#if channelMembers[0]?.is_active}
												<span
													class="absolute inline-flex h-full w-full animate-ping rounded-full bg-green-400 opacity-75"
												></span>
											{/if}
											<span
												class="relative inline-flex size-2 rounded-full {channelMembers[0]
													?.is_active
													? 'bg-green-500'
													: 'bg-gray-300 dark:bg-gray-700'} border-[1.5px] border-white dark:border-gray-900"
											></span>
										</span>
									</div>
								{/if}
							</div>
						{:else}
							<Users className="size-4 ml-1 mr-0.5" strokeWidth="2" />
						{/if}
					{:else}
						<div class=" size-4.5 justify-center flex items-center">
							{#if channel?.type === 'group' ? !channel?.is_private : channel?.access_control === null}
								<Hashtag className="size-3.5" strokeWidth="2.5" />
							{:else}
								<Lock className="size-5" strokeWidth="2" />
							{/if}
						</div>
					{/if}

					<div
						class=" text-left self-center overflow-hidden w-full line-clamp-1 capitalize flex-1"
					>
					<div class=" text-left self-center overflow-hidden w-full line-clamp-1 flex-1">
						{#if channel?.name}
							{channel.name}
						{:else}
							{channel?.users
								?.filter((u) => u.id !== $user?.id)
								.map((u) => u.name)
								.join(', ')}
						{/if}
					</div>
				</div>
			{/if}
		</div>

		<div class="self-start flex flex-none items-center text-gray-600 dark:text-gray-400 gap-1">
			{#if channel}
				<Tooltip content={$i18n.t('Pinned Messages')}>
					<button
						class=" flex cursor-pointer py-1.5 px-1.5 border dark:border-gray-850 border-gray-50 rounded-xl text-gray-600 dark:text-gray-400 hover:bg-gray-50 dark:hover:bg-gray-850 transition"
						aria-label="Pinned Messages"
						type="button"
						on:click={() => {
							showChannelPinnedMessagesModal = true;
						}}
					>
						<div class=" flex items-center gap-0.5 m-auto self-center">
							<Pin className=" size-4" strokeWidth="1.5" />
						</div>
					</button>
				</Tooltip>

				{#if channel?.user_count !== undefined}
					<Tooltip content={$i18n.t('Users')}>
						<button
@@ -104,6 +171,7 @@
						</button>
					</Tooltip>
				{/if}
			{/if}

			{#if $user !== undefined}
				<UserMenu
src/lib/components/channel/PinnedMessagesModal.svelte (new file, 159 lines)
@@ -0,0 +1,159 @@
<script lang="ts">
	import { toast } from 'svelte-sonner';
	import { getContext, onMount } from 'svelte';
	const i18n = getContext('i18n');

	import { getChannelPinnedMessages, pinMessage } from '$lib/apis/channels';

	import Spinner from '$lib/components/common/Spinner.svelte';
	import Modal from '$lib/components/common/Modal.svelte';

	import XMark from '$lib/components/icons/XMark.svelte';
	import Message from './Messages/Message.svelte';
	import Loader from '../common/Loader.svelte';

	export let show = false;
	export let channel = null;
	export let onPin = (messageId, pinned) => {};

	let page = 1;
	let pinnedMessages = null;

	let allItemsLoaded = false;
	let loading = false;

	const getPinnedMessages = async () => {
		if (!channel) return;
		if (allItemsLoaded) return;

		loading = true;
		try {
			const res = await getChannelPinnedMessages(localStorage.token, channel.id, page).catch(
				(error) => {
					toast.error(`${error}`);
					return null;
				}
			);

			if (res) {
				pinnedMessages = [...(pinnedMessages ?? []), ...res];
			}

			if (res.length === 0) {
				allItemsLoaded = true;
			}
		} catch (error) {
			console.error('Error fetching pinned messages:', error);
		} finally {
			loading = false;
		}
	};

	const init = () => {
		page = 1;
		pinnedMessages = null;
		allItemsLoaded = false;

		getPinnedMessages();
	};

	$: if (show) {
		init();
	}

	onMount(() => {
		init();
	});
</script>

{#if channel}
	<Modal size="sm" bind:show>
		<div>
			<div class=" flex justify-between dark:text-gray-100 px-5 pt-4 mb-1.5">
				<div class="self-center text-base">
					<div class="flex items-center gap-0.5 shrink-0">
						{$i18n.t('Pinned Messages')}
					</div>
				</div>
				<button
					class="self-center"
					on:click={() => {
						show = false;
					}}
				>
					<XMark className={'size-5'} />
				</button>
			</div>

			<div class="flex flex-col md:flex-row w-full px-4 pb-4 md:space-x-4 dark:text-gray-200">
				<div class=" flex flex-col w-full sm:flex-row sm:justify-center sm:space-x-6">
					<div class="flex flex-col w-full h-full pb-2 gap-1">
						{#if pinnedMessages === null}
							<div class="my-10">
								<Spinner className="size-5" />
							</div>
						{:else}
							<div
								class="flex flex-col gap-2 max-h-[60vh] overflow-y-auto scrollbar-thin scrollbar-thumb-gray-300 dark:scrollbar-thumb-gray-700 scrollbar-track-transparent py-2"
							>
								{#if pinnedMessages.length === 0}
									<div class=" text-center text-xs text-gray-500 dark:text-gray-400 py-6">
										{$i18n.t('No pinned messages')}
									</div>
								{:else}
									{#each pinnedMessages as message, messageIdx (message.id)}
										<Message
											className="rounded-xl px-2"
											{message}
											{channel}
											onPin={async (message) => {
												pinnedMessages = pinnedMessages.filter((m) => m.id !== message.id);
												onPin(message.id, !message.is_pinned);

												const updatedMessage = await pinMessage(
													localStorage.token,
													message.channel_id,
													message.id,
													!message.is_pinned
												).catch((error) => {
													toast.error(`${error}`);
													return null;
												});

												init();
											}}
											onReaction={false}
											onThread={false}
											onReply={false}
											onEdit={false}
											onDelete={false}
										/>

										{#if messageIdx === pinnedMessages.length - 1 && !allItemsLoaded}
											<Loader
												on:visible={(e) => {
													console.log('visible');
													if (!loading) {
														page += 1;
														getPinnedMessages();
													}
												}}
											>
												<div
													class="w-full flex justify-center py-1 text-xs animate-pulse items-center gap-2"
												>
													<Spinner className=" size-4" />
													<div class=" ">{$i18n.t('Loading...')}</div>
												</div>
											</Loader>
										{/if}
									{/each}
								{/if}
							</div>
						{/if}
					</div>
				</div>
			</div>
		</div>
	</Modal>
{/if}
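PinnedMessagesModal pages through results the usual way: start at page 1, append each page to pinnedMessages, and flip allItemsLoaded once an empty page comes back, with the Loader sentinel bumping page as it scrolls into view. The same logic as a small framework-free TypeScript sketch; fetchPage is a stand-in for getChannelPinnedMessages(token, channel.id, page), not an API taken from this diff.

// One paging step of the modal's incremental-loading pattern.
type Paging<T> = { page: number; items: T[]; allItemsLoaded: boolean };

const loadNextPage = async <T>(
	state: Paging<T>,
	fetchPage: (page: number) => Promise<T[]>
): Promise<Paging<T>> => {
	if (state.allItemsLoaded) return state; // nothing left to fetch
	const res = await fetchPage(state.page);
	return {
		page: state.page + 1, // advance for the next call
		items: [...state.items, ...res], // append, as the modal does
		allItemsLoaded: res.length === 0 // an empty page marks the end
	};
};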
@@ -1955,7 +1955,9 @@

					session_id: $socket?.id,
					chat_id: $chatId,

					id: responseMessageId,
					parent_id: userMessage?.id ?? null,

					background_tasks: {
						...(!$temporaryChatEnabled &&
@@ -215,7 +215,7 @@

	{#if $showControls}
		<PaneResizer
			class="relative flex items-center justify-center group border-l border-gray-50 dark:border-gray-850 hover:border-gray-200 dark:hover:border-gray-800 transition z-20"
			class="relative flex items-center justify-center group border-l border-gray-50 dark:border-gray-850/30 hover:border-gray-200 dark:hover:border-gray-800 transition z-20"
			id="controls-resizer"
		>
			<div
Some files were not shown because too many files have changed in this diff.