Mirror of https://github.com/sourcebot-dev/sourcebot.git, synced 2025-12-11 20:05:25 +00:00
feat(web): Streamed code search (#623)
Some checks failed
Publish to ghcr / build (linux/amd64, blacksmith-4vcpu-ubuntu-2404) (push) Has been cancelled
Publish to ghcr / build (linux/arm64, blacksmith-8vcpu-ubuntu-2204-arm) (push) Has been cancelled
Update Roadmap Released / update (push) Has been cancelled
Publish to ghcr / merge (push) Has been cancelled
* generate protobuf types
* stream poc over SSE
* wip: make stream search api follow existing schema. Modify UI to support streaming
* fix scrolling issue
* Dockerfile
* wip on lezer parser grammar for query language
* add lezer tree -> grpc transformer
* remove spammy log message
* fix syntax highlighting by adding a module resolution for @lezer/common
* further wip on query language
* Add case sensitivity and regexp toggles
* Improved type safety / cleanup for query lang
* support search contexts
* update Dockerfile with query language package
* fix filter
* Add skeletons to filter panel when search is streaming
* add client side caching
* improved cancellation handling
* add isSearchExhaustive flag for flagging when a search captured all results
* Add back posthog search_finished event
* remove zoekt tenant enforcement
* migrate blocking search over to grpc. Centralize everything in searchApi
* branch handling
* plumb file weburl
* add repo_sets filter for repositories a user has access to
* refactor a bunch of stuff + add support for passing in Query IR to search api
* refactor
* dev README
* wip on better error handling
* error handling for stream path
* update mcp
* changelog wip
* type fix
* style
* Support rev:* wildcard
* changelog
* changelog nit
* feedback
* fix build
* update docs and remove unneeded test file
This commit is contained in: parent 09507d3e89, commit f3a8fa3dab
130 changed files with 6613 additions and 1212 deletions
@@ -6,8 +6,6 @@ DATABASE_URL="postgresql://postgres:postgres@localhost:5432/postgres"
 ZOEKT_WEBSERVER_URL="http://localhost:6070"
 # The command to use for generating ctags.
 CTAGS_COMMAND=ctags
-# logging, strict
-SRC_TENANT_ENFORCEMENT_MODE=strict

 # Auth.JS
 # You can generate a new secret with:
CHANGELOG.md (+14)

@@ -7,9 +7,23 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

 ## [Unreleased]

+<!-- Bump @sourcebot/mcp since there are breaking changes to the api in this release -->
+
 ### Added

+- Added support for streaming code search results. [#623](https://github.com/sourcebot-dev/sourcebot/pull/623)
+- Added buttons to toggle case sensitivity and regex patterns. [#623](https://github.com/sourcebot-dev/sourcebot/pull/623)
 - Added counts to members, requests, and invites tabs in the members settings. [#621](https://github.com/sourcebot-dev/sourcebot/pull/621)

+### Changed
+
+- Changed the default search behaviour to match patterns as substrings and **not** regular expressions. Regular expressions can be used by toggling the regex button in the search bar. [#623](https://github.com/sourcebot-dev/sourcebot/pull/623)
+- Renamed `public` query prefix to `visibility`. Allowed values for `visibility` are `public`, `private`, and `any`. [#623](https://github.com/sourcebot-dev/sourcebot/pull/623)
+- Changed `archived` query prefix to accept values `yes`, `no`, and `only`. [#623](https://github.com/sourcebot-dev/sourcebot/pull/623)
+
+### Removed
+
+- Removed `case` query prefix. [#623](https://github.com/sourcebot-dev/sourcebot/pull/623)
+- Removed `branch` and `b` query prefixes. Please use `rev:` instead. [#623](https://github.com/sourcebot-dev/sourcebot/pull/623)
+- Removed `regex` query prefix. [#623](https://github.com/sourcebot-dev/sourcebot/pull/623)

 ### Fixed

 - Fixed spurious infinite loads with explore panel, file tree, and file search command. [#617](https://github.com/sourcebot-dev/sourcebot/pull/617)
 - Wipe search context on init if entitlement no longer exists [#618](https://github.com/sourcebot-dev/sourcebot/pull/618)
@@ -43,10 +43,12 @@ COPY .yarn ./.yarn
 COPY ./packages/db ./packages/db
 COPY ./packages/schemas ./packages/schemas
 COPY ./packages/shared ./packages/shared
+COPY ./packages/queryLanguage ./packages/queryLanguage

 RUN yarn workspace @sourcebot/db install
 RUN yarn workspace @sourcebot/schemas install
 RUN yarn workspace @sourcebot/shared install
+RUN yarn workspace @sourcebot/query-language install
 # ------------------------------------

 # ------ Build Web ------

@@ -92,6 +94,7 @@ COPY --from=shared-libs-builder /app/node_modules ./node_modules
 COPY --from=shared-libs-builder /app/packages/db ./packages/db
 COPY --from=shared-libs-builder /app/packages/schemas ./packages/schemas
 COPY --from=shared-libs-builder /app/packages/shared ./packages/shared
+COPY --from=shared-libs-builder /app/packages/queryLanguage ./packages/queryLanguage

 # Fixes arm64 timeouts
 RUN yarn workspace @sourcebot/web install

@@ -130,6 +133,7 @@ COPY --from=shared-libs-builder /app/node_modules ./node_modules
 COPY --from=shared-libs-builder /app/packages/db ./packages/db
 COPY --from=shared-libs-builder /app/packages/schemas ./packages/schemas
 COPY --from=shared-libs-builder /app/packages/shared ./packages/shared
+COPY --from=shared-libs-builder /app/packages/queryLanguage ./packages/queryLanguage
 RUN yarn workspace @sourcebot/backend install
 RUN yarn workspace @sourcebot/backend build

@@ -173,7 +177,6 @@ ENV DATA_DIR=/data
 ENV DATA_CACHE_DIR=$DATA_DIR/.sourcebot
 ENV DATABASE_DATA_DIR=$DATA_CACHE_DIR/db
 ENV REDIS_DATA_DIR=$DATA_CACHE_DIR/redis
-ENV SRC_TENANT_ENFORCEMENT_MODE=strict
 ENV SOURCEBOT_PUBLIC_KEY_PATH=/app/public.pem

 # Valid values are: debug, info, warn, error

@@ -217,6 +220,9 @@ COPY --from=zoekt-builder \
 /cmd/zoekt-index \
 /usr/local/bin/

+# Copy zoekt proto files (needed for gRPC client at runtime)
+COPY vendor/zoekt/grpc/protos /app/vendor/zoekt/grpc/protos
+
 # Copy all of the things
 COPY --from=web-builder /app/packages/web/public ./packages/web/public
 COPY --from=web-builder /app/packages/web/.next/standalone ./

@@ -229,6 +235,7 @@ COPY --from=shared-libs-builder /app/node_modules ./node_modules
 COPY --from=shared-libs-builder /app/packages/db ./packages/db
 COPY --from=shared-libs-builder /app/packages/schemas ./packages/schemas
 COPY --from=shared-libs-builder /app/packages/shared ./packages/shared
+COPY --from=shared-libs-builder /app/packages/queryLanguage ./packages/queryLanguage

 # Fixes git "dubious ownership" issues when the volume is mounted with different permissions to the container.
 RUN git config --global safe.directory "*"
@@ -4,32 +4,51 @@ title: Writing search queries

 Sourcebot uses a powerful regex-based query language that enables precise code search within large codebases.

 ## Syntax reference guide

-Queries consist of space-separated regular expressions. Wrapping expressions in `""` combines them. By default, a file must have at least one match for each expression to be included.
+Queries consist of space-separated search patterns that are matched against file contents. A file must have at least one match for each expression to be included. Queries can optionally contain search filters to further refine the search results.

+## Keyword search (default)
+
+Keyword search matches search patterns exactly in file contents. Wrapping search patterns in `""` combines them as a single expression.
+
+| Example | Explanation |
+| :--- | :--- |
+| `foo` | Match files containing the keyword `foo` |
+| `foo bar` | Match files containing both `foo` **and** `bar` |
+| `"foo bar"` | Match files containing the phrase `foo bar` |
+| `"foo \"bar\""` | Match files containing `foo "bar"` exactly (escaped quotes) |
+
+## Regex search
+
+Toggle the regex button (`.*`) in the search bar to interpret search patterns as regular expressions.
+
 | Example | Explanation |
 | :--- | :--- |
 | `foo` | Match files with regex `/foo/` |
-| `foo bar` | Match files with regex `/foo/` **and** `/bar/` |
-| `"foo bar"` | Match files with regex `/foo bar/` |
+| `foo.*bar` | Match files with regex `/foo.*bar/` (foo followed by any characters, then bar) |
+| `^function\s+\w+` | Match files with regex `/^function\s+\w+/` (function at start of line, followed by whitespace and word characters) |
+| `"foo bar"` | Match files with regex `/foo bar/`. Quotes are not matched. |

-Multiple expressions can be or'd together with `or`, negated with `-`, or grouped with `()`.
-
-| Example | Explanation |
-| :--- | :--- |
-| `foo or bar` | Match files with regex `/foo/` **or** `/bar/` |
-| `foo -bar` | Match files with regex `/foo/` but **not** `/bar/` |
-| `foo (bar or baz)` | Match files with regex `/foo/` **and** either `/bar/` **or** `/baz/` |
-
-Expressions can be prefixed with certain keywords to modify search behavior. Some keywords can be negated using the `-` prefix.
+## Search filters
+
+Search queries (keyword or regex) can include multiple search filters to further refine the search results. Some filters can be negated using the `-` prefix.

 | Prefix | Description | Example |
 | :--- | :--- | :--- |
 | `file:` | Filter results from filepaths that match the regex. By default all files are searched. | `file:README` - Filter results to filepaths that match regex `/README/`<br/>`file:"my file"` - Filter results to filepaths that match regex `/my file/`<br/>`-file:test\.ts$` - Ignore results from filepaths that match regex `/test\.ts$/` |
-| `repo:` | Filter results from repos that match the regex. By default all repos are searched. | `repo:linux` - Filter results to repos that match regex `/linux/`<br/>`-repo:^web/.*` - Ignore results from repos that match regex `/^web\/.*` |
+| `repo:` | Filter results from repos that match the regex. By default all repos are searched. | `repo:linux` - Filter results to repos that match regex `/linux/`<br/>`-repo:^web/.*` - Ignore results from repos that match regex `/^web\/.*/` |
 | `rev:` | Filter results from a specific branch or tag. By default **only** the default branch is searched. | `rev:beta` - Filter results to branches that match regex `/beta/` |
 | `lang:` | Filter results by language (as defined by [linguist](https://github.com/github-linguist/linguist/blob/main/lib/linguist/languages.yml)). By default all languages are searched. | `lang:TypeScript` - Filter results to TypeScript files<br/>`-lang:YAML` - Ignore results from YAML files |
 | `sym:` | Match symbol definitions created by [universal ctags](https://ctags.io/) at index time. | `sym:\bmain\b` - Filter results to symbols that match regex `/\bmain\b/` |
 | `context:` | Filter results to a predefined [search context](/docs/features/search/search-contexts). | `context:web` - Filter results to the web context<br/>`-context:pipelines` - Ignore results from the pipelines context |

+## Boolean operators & grouping
+
+By default, space-separated expressions are and'd together. The `or` keyword and parentheses `()` can be used to create more complex boolean logic. Parentheses can be negated using the `-` prefix.
+
+| Example | Explanation |
+| :--- | :--- |
+| `foo or bar` | Match files containing `foo` **or** `bar` |
+| `foo (bar or baz)` | Match files containing `foo` **and** either `bar` **or** `baz`. |
+| `-(foo) bar` | Match files containing `bar` **and not** `foo`. |
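For illustration only (not part of this commit): a minimal sketch of how a query like the ones documented above is parsed by the new Lezer grammar added later in this diff. It assumes the `@sourcebot/query-language` workspace package is built and that it re-exports `parser` (as `packages/queryLanguage/src/index.ts` does); the expected tree shape follows the test fixtures in this commit.

```ts
// Sketch, not part of the diff. Assumes the built @sourcebot/query-language package
// re-exports `parser` from its generated ./parser module (see src/index.ts below).
import { parser } from "@sourcebot/query-language";

// A keyword query combining a grouped `or`, an implicit AND, and a negated filter.
const tree = parser.parse("foo (bar or baz) -lang:YAML");

// Tree.toString() prints the node names declared in query.grammar, e.g.
// Program(AndExpr(Term,ParenExpr(OrExpr(Term,Term)),NegateExpr(PrefixExpr(LangExpr))))
console.log(tree.toString());
```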
@@ -18,7 +18,7 @@
     "dev:prisma:studio": "yarn with-env yarn workspace @sourcebot/db prisma:studio",
     "dev:prisma:migrate:reset": "yarn with-env yarn workspace @sourcebot/db prisma:migrate:reset",
     "dev:prisma:db:push": "yarn with-env yarn workspace @sourcebot/db prisma:db:push",
-    "build:deps": "yarn workspaces foreach --recursive --topological --from '{@sourcebot/schemas,@sourcebot/db,@sourcebot/shared}' run build"
+    "build:deps": "yarn workspaces foreach --recursive --topological --from '{@sourcebot/schemas,@sourcebot/db,@sourcebot/shared,@sourcebot/query-language}' run build"
   },
   "devDependencies": {
     "concurrently": "^9.2.1",

@@ -27,6 +27,7 @@
   },
   "packageManager": "yarn@4.7.0",
   "resolutions": {
-    "prettier": "3.5.3"
+    "prettier": "3.5.3",
+    "@lezer/common": "1.3.0"
   }
 }
@@ -94,7 +94,6 @@ const listenToShutdownSignals = () => {
     const cleanup = async (signal: string) => {
         try {
             if (receivedSignal) {
-                logger.debug(`Recieved repeat signal ${signal}, ignoring.`);
                 return;
             }
             receivedSignal = true;
@@ -1 +1,3 @@
+import type { User, Account } from ".prisma/client";
+export type UserWithAccounts = User & { accounts: Account[] };
 export * from ".prisma/client";
@@ -7,6 +7,9 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

 ## [Unreleased]

+### Changed
+- Updated API client to match the latest Sourcebot release. [#555](https://github.com/sourcebot-dev/sourcebot/pull/555)
+
 ## [1.0.9] - 2025-11-17

 ### Added
@@ -70,16 +70,12 @@ server.tool(
             query += ` ( lang:${languages.join(' or lang:')} )`;
         }

-        if (caseSensitive) {
-            query += ` case:yes`;
-        } else {
-            query += ` case:no`;
-        }
-
         const response = await search({
             query,
             matches: env.DEFAULT_MATCHES,
             contextLines: env.DEFAULT_CONTEXT_LINES,
+            isRegexEnabled: true,
+            isCaseSensitivityEnabled: caseSensitive,
         });

         if (isServiceError(response)) {
@@ -21,15 +21,17 @@ export const symbolSchema = z.object({
     kind: z.string(),
 });

+export const searchOptionsSchema = z.object({
+    matches: z.number(), // The number of matches to return.
+    contextLines: z.number().optional(), // The number of context lines to return.
+    whole: z.boolean().optional(), // Whether to return the whole file as part of the response.
+    isRegexEnabled: z.boolean().optional(), // Whether to enable regular expression search.
+    isCaseSensitivityEnabled: z.boolean().optional(), // Whether to enable case sensitivity.
+});
+
 export const searchRequestSchema = z.object({
-    // The zoekt query to execute.
-    query: z.string(),
-    // The number of matches to return.
-    matches: z.number(),
-    // The number of context lines to return.
-    contextLines: z.number().optional(),
-    // Whether to return the whole file as part of the response.
-    whole: z.boolean().optional(),
+    query: z.string(), // The zoekt query to execute.
+    ...searchOptionsSchema.shape,
 });

 export const repositoryInfoSchema = z.object({

@@ -109,7 +111,7 @@ export const searchStatsSchema = z.object({
     regexpsConsidered: z.number(),

     // FlushReason explains why results were flushed.
-    flushReason: z.number(),
+    flushReason: z.string(),
 });

 export const searchResponseSchema = z.object({

@@ -139,7 +141,6 @@ export const searchResponseSchema = z.object({
         content: z.string().optional(),
     })),
     repositoryInfo: z.array(repositoryInfoSchema),
-    isBranchFilteringEnabled: z.boolean(),
     isSearchExhaustive: z.boolean(),
 });
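For illustration only (not part of this commit): a hedged sketch of a request object that satisfies the new `searchRequestSchema`. The import path is an assumption; the field names and types come from the schema above.

```ts
import { z } from "zod";
// Assumption: the exact export path within @sourcebot/schemas may differ.
import { searchRequestSchema } from "@sourcebot/schemas";

// A keyword (non-regex), case-sensitive search capped at 100 matches with 3 context lines.
const request: z.infer<typeof searchRequestSchema> = {
    query: "foo lang:TypeScript",
    matches: 100,
    contextLines: 3,
    isRegexEnabled: false,
    isCaseSensitivityEnabled: true,
};

// Throws if the object does not match the schema.
searchRequestSchema.parse(request);
```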
packages/queryLanguage/.gitignore (new file, vendored, 2 lines)

/node_modules/
/dist
packages/queryLanguage/package.json (new file, 20 lines)

{
  "name": "@sourcebot/query-language",
  "private": true,
  "main": "dist/index.js",
  "scripts": {
    "build": "lezer-generator src/query.grammar -o src/parser --typeScript --names && tsc",
    "test": "vitest",
    "postinstall": "yarn build"
  },
  "devDependencies": {
    "@lezer/generator": "^1.8.0",
    "tsx": "^4.19.1",
    "typescript": "^5.7.3",
    "vitest": "^2.1.9"
  },
  "dependencies": {
    "@lezer/common": "^1.3.0",
    "@lezer/lr": "^1.4.3"
  }
}
packages/queryLanguage/src/index.ts (new file, 7 lines)

import { parser } from "./parser";

type Tree = ReturnType<typeof parser.parse>;
type SyntaxNode = Tree['topNode'];
export type { Tree, SyntaxNode };
export * from "./parser";
export * from "./parser.terms";
packages/queryLanguage/src/parser.terms.ts (new file, 21 lines)

// This file was generated by lezer-generator. You probably shouldn't edit it.
export const
  negate = 22,
  Program = 1,
  OrExpr = 2,
  AndExpr = 3,
  NegateExpr = 4,
  PrefixExpr = 5,
  ArchivedExpr = 6,
  RevisionExpr = 7,
  ContentExpr = 8,
  ContextExpr = 9,
  FileExpr = 10,
  ForkExpr = 11,
  VisibilityExpr = 12,
  RepoExpr = 13,
  LangExpr = 14,
  SymExpr = 15,
  RepoSetExpr = 16,
  ParenExpr = 17,
  Term = 18
packages/queryLanguage/src/parser.ts (new file, 18 lines)

File diff suppressed because one or more lines are too long
packages/queryLanguage/src/query.grammar (new file, 102 lines)

@external tokens negateToken from "./tokens" { negate }

@top Program { query }

@precedence {
  negate,
  and,
  or @left
}

query {
  OrExpr |
  AndExpr |
  expr
}

OrExpr { andExpr (or andExpr)+ }

AndExpr { expr expr+ }

andExpr { AndExpr | expr }

expr {
  NegateExpr |
  ParenExpr |
  PrefixExpr |
  Term
}

NegateExpr { !negate negate (PrefixExpr | ParenExpr) }

ParenExpr { "(" query ")" }

PrefixExpr {
  ArchivedExpr |
  RevisionExpr |
  ContentExpr |
  ContextExpr |
  FileExpr |
  ForkExpr |
  VisibilityExpr |
  RepoExpr |
  LangExpr |
  SymExpr |
  RepoSetExpr
}

RevisionExpr { revisionKw value }
ContentExpr { contentKw value }
ContextExpr { contextKw value }
FileExpr { fileKw value }
RepoExpr { repoKw value }
LangExpr { langKw value }
SymExpr { symKw value }
RepoSetExpr { reposetKw value }

// Modifiers
ArchivedExpr { archivedKw archivedValue }
ForkExpr { forkKw forkValue }
VisibilityExpr { visibilityKw visibilityValue }

archivedValue { "yes" | "no" | "only" }
forkValue { "yes" | "no" | "only" }
visibilityValue { "public" | "private" | "any" }

Term { quotedString | word }

value { quotedString | word }

@skip { space }

@tokens {
  archivedKw { "archived:" }
  revisionKw { "rev:" }
  contentKw { "content:" | "c:" }
  contextKw { "context:" }
  fileKw { "file:" | "f:" }
  forkKw { "fork:" }
  visibilityKw { "visibility:" }
  repoKw { "repo:" | "r:" }
  langKw { "lang:" }
  symKw { "sym:" }
  reposetKw { "reposet:" }

  or { "or" ![a-zA-Z0-9_] }

  quotedString { '"' (!["\\\n] | "\\" _)* '"' }

  // Allow almost anything in a word except spaces, parens, quotes
  // Colons and dashes are allowed anywhere in words (including at the start)
  word { (![ \t\n()"]) (![ \t\n()":] | ":" | "-")* }

  space { $[ \t\n]+ }

  @precedence {
    quotedString,
    archivedKw, revisionKw, contentKw, contextKw, fileKw,
    forkKw, visibilityKw, repoKw, langKw,
    symKw, reposetKw, or,
    word
  }
}
packages/queryLanguage/src/tokens.ts (new file, 59 lines)

import { ExternalTokenizer } from "@lezer/lr";
import { negate } from "./parser.terms";

// External tokenizer for negation
// Only tokenizes `-` as negate when followed by a prefix keyword or `(`
export const negateToken = new ExternalTokenizer((input) => {
    if (input.next !== 45 /* '-' */) return; // Not a dash

    const startPos = input.pos;

    // Look ahead to see what follows the dash
    input.advance();

    // Skip whitespace
    let ch = input.next;
    while (ch === 32 || ch === 9 || ch === 10) {
        input.advance();
        ch = input.next;
    }

    // Check if followed by opening paren
    if (ch === 40 /* '(' */) {
        input.acceptToken(negate, -input.pos + startPos + 1); // Accept just the dash
        return;
    }

    // Check if followed by a prefix keyword (by checking for keyword followed by colon)
    // Look ahead until we hit a delimiter or colon
    const checkPos = input.pos;
    let foundColon = false;

    // Look ahead until we hit a delimiter or colon
    while (ch >= 0) {
        if (ch === 58 /* ':' */) {
            foundColon = true;
            break;
        }
        // Hit a delimiter (whitespace, paren, or quote) - not a prefix keyword
        if (ch === 32 || ch === 9 || ch === 10 || ch === 40 || ch === 41 || ch === 34) {
            break;
        }
        input.advance();
        ch = input.next;
    }

    // Reset position
    while (input.pos > checkPos) {
        input.advance(-1);
    }

    if (foundColon) {
        // It's a prefix keyword, accept as negate
        input.acceptToken(negate, -input.pos + startPos + 1);
        return;
    }

    // Otherwise, don't tokenize as negate (let word handle it)
});
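For illustration only (not part of this commit): the behaviour the external tokenizer above gives the grammar. A leading dash is only treated as negation when it precedes a prefix filter or a parenthesized group; otherwise it stays part of the word. The expected tree strings are taken from `test/negation.txt` below; the package import path is an assumption.

```ts
// Sketch, not part of the diff. Assumes the built @sourcebot/query-language package.
import { parser } from "@sourcebot/query-language";

// Dash followed by a prefix filter: tokenized as negate.
// Expected (per test/negation.txt): Program(NegateExpr(PrefixExpr(FileExpr)))
console.log(parser.parse("-file:test.js").toString());

// Dash followed by a plain word: kept as part of the term, not a negation.
// Expected (per test/negation.txt): Program(Term)
console.log(parser.parse("-test").toString());
```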
packages/queryLanguage/test/basic.txt (new file, 72 lines)

# Single term

hello

==>

Program(Term)

# Multiple terms

hello world

==>

Program(AndExpr(Term,Term))

# Multiple terms with various characters

console.log error_handler

==>

Program(AndExpr(Term,Term))

# Term with underscores

my_variable_name

==>

Program(Term)

# Term with dots

com.example.package

==>

Program(Term)

# Term with numbers

func123 test_456

==>

Program(AndExpr(Term,Term))

# Regex pattern

[a-z]+

==>

Program(Term)

# Wildcard pattern

test.*

==>

Program(Term)

# Multiple regex patterns

\w+ [0-9]+ \s*

==>

Program(AndExpr(Term,Term,Term))
packages/queryLanguage/test/grammar.test.ts (new file, 21 lines)

import { parser } from "../src/parser";
import { fileTests } from "@lezer/generator/dist/test";
import { describe, it } from "vitest";
import { fileURLToPath } from "url"
import * as fs from "fs";
import * as path from "path";

const caseDir = path.dirname(fileURLToPath(import.meta.url))

for (const file of fs.readdirSync(caseDir)) {
    if (!/\.txt$/.test(file)) {
        continue;
    }

    let name = /^[^\.]*/.exec(file)?.[0];
    describe(name ?? "unknown", () => {
        for (const { name, run } of fileTests(fs.readFileSync(path.join(caseDir, file), "utf8"), file)) {
            it(name, () => run(parser));
        }
    });
}
packages/queryLanguage/test/grouping.txt (new file, 120 lines)

# Empty parentheses

()

==>

Program(ParenExpr(Term(⚠)))

# Simple grouping

(test)

==>

Program(ParenExpr(Term))

# Multiple terms in group

(hello world)

==>

Program(ParenExpr(AndExpr(Term,Term)))

# Nested parentheses

((test))

==>

Program(ParenExpr(ParenExpr(Term)))

# Multiple groups

(first) (second)

==>

Program(AndExpr(ParenExpr(Term),ParenExpr(Term)))

# Group with multiple terms

(one two three)

==>

Program(ParenExpr(AndExpr(Term,Term,Term)))

# Mixed grouped and ungrouped

test (grouped) another

==>

Program(AndExpr(Term,ParenExpr(Term),Term))

# Deeply nested

(((nested)))

==>

Program(ParenExpr(ParenExpr(ParenExpr(Term))))

# Multiple nested groups

((a b) (c d))

==>

Program(ParenExpr(AndExpr(ParenExpr(AndExpr(Term,Term)),ParenExpr(AndExpr(Term,Term)))))

# Group at start

(start) middle end

==>

Program(AndExpr(ParenExpr(Term),Term,Term))

# Group at end

start middle (end)

==>

Program(AndExpr(Term,Term,ParenExpr(Term)))

# Complex grouping pattern

(a (b c) d)

==>

Program(ParenExpr(AndExpr(Term,ParenExpr(AndExpr(Term,Term)),Term)))

# Sequential groups

(a)(b)(c)

==>

Program(AndExpr(ParenExpr(Term),ParenExpr(Term),ParenExpr(Term)))

# Group with regex

([a-z]+)

==>

Program(ParenExpr(Term))

# Group with dots

(com.example.test)

==>

Program(ParenExpr(Term))
packages/queryLanguage/test/negation.txt (new file, 255 lines)

# Literal dash term

-test

==>

Program(Term)

# Quoted dash term

"-excluded"

==>

Program(Term)

# Dash in middle

test-case

==>

Program(Term)

# Multiple dash terms

-one -two -three

==>

Program(AndExpr(Term,Term,Term))

# Negate file prefix

-file:test.js

==>

Program(NegateExpr(PrefixExpr(FileExpr)))

# Negate repo prefix

-repo:archived

==>

Program(NegateExpr(PrefixExpr(RepoExpr)))

# Negate lang prefix

-lang:python

==>

Program(NegateExpr(PrefixExpr(LangExpr)))

# Negate content prefix

-content:TODO

==>

Program(NegateExpr(PrefixExpr(ContentExpr)))

# Negate revision prefix

-rev:develop

==>

Program(NegateExpr(PrefixExpr(RevisionExpr)))

# Negate archived prefix

-archived:yes

==>

Program(NegateExpr(PrefixExpr(ArchivedExpr)))

# Negate fork prefix

-fork:yes

==>

Program(NegateExpr(PrefixExpr(ForkExpr)))

# Negate visibility prefix

-visibility:any

==>

Program(NegateExpr(PrefixExpr(VisibilityExpr)))

# Negate context prefix

-context:backend

==>

Program(NegateExpr(PrefixExpr(ContextExpr)))

# Negate symbol prefix

-sym:OldClass

==>

Program(NegateExpr(PrefixExpr(SymExpr)))

# Negate parentheses

-(test)

==>

Program(NegateExpr(ParenExpr(Term)))

# Negate group with multiple terms

-(test exclude)

==>

Program(NegateExpr(ParenExpr(AndExpr(Term,Term))))

# Negate group with prefix

-(file:test.js console.log)

==>

Program(NegateExpr(ParenExpr(AndExpr(PrefixExpr(FileExpr),Term))))

# Prefix with negated term

file:test.js -console

==>

Program(AndExpr(PrefixExpr(FileExpr),Term))

# Multiple prefixes with negation

file:test.js -lang:python

==>

Program(AndExpr(PrefixExpr(FileExpr),NegateExpr(PrefixExpr(LangExpr))))

# Complex negation pattern

function -file:test.js -lang:java

==>

Program(AndExpr(Term,NegateExpr(PrefixExpr(FileExpr)),NegateExpr(PrefixExpr(LangExpr))))

# Negation inside parentheses

(-file:test.js)

==>

Program(ParenExpr(NegateExpr(PrefixExpr(FileExpr))))

# Multiple negations in group

(-file:a.js -lang:python)

==>

Program(ParenExpr(AndExpr(NegateExpr(PrefixExpr(FileExpr)),NegateExpr(PrefixExpr(LangExpr)))))

# Mixed in parentheses

(include -file:test.js)

==>

Program(ParenExpr(AndExpr(Term,NegateExpr(PrefixExpr(FileExpr)))))

# Negate nested group

-((file:test.js))

==>

Program(NegateExpr(ParenExpr(ParenExpr(PrefixExpr(FileExpr)))))

# Negate short form prefix

-f:test.js

==>

Program(NegateExpr(PrefixExpr(FileExpr)))

# Negate short form repo

-r:myrepo

==>

Program(NegateExpr(PrefixExpr(RepoExpr)))

# Negate short form content

-c:console

==>

Program(NegateExpr(PrefixExpr(ContentExpr)))

# Negate with prefix in quotes

-file:"test file.js"

==>

Program(NegateExpr(PrefixExpr(FileExpr)))

# Complex with multiple negated prefixes

lang:typescript -file:*.test.ts -file:*.spec.ts

==>

Program(AndExpr(PrefixExpr(LangExpr),NegateExpr(PrefixExpr(FileExpr)),NegateExpr(PrefixExpr(FileExpr))))

# Negated group with prefix

-(file:test.js lang:python)

==>

Program(NegateExpr(ParenExpr(AndExpr(PrefixExpr(FileExpr),PrefixExpr(LangExpr)))))

# Negate empty group

-()

==>

Program(NegateExpr(ParenExpr(Term(⚠))))

# Negate with space after dash

- file:test.js

==>

Program(NegateExpr(PrefixExpr(FileExpr)))
packages/queryLanguage/test/operators.txt (new file, 271 lines)

# Simple OR

test or example

==>

Program(OrExpr(Term,Term))

# Multiple OR

one or two or three

==>

Program(OrExpr(Term,Term,Term))

# OR with prefixes

file:test.js or file:example.js

==>

Program(OrExpr(PrefixExpr(FileExpr),PrefixExpr(FileExpr)))

# OR with negation

test or -file:excluded.js

==>

Program(OrExpr(Term,NegateExpr(PrefixExpr(FileExpr))))

# OR with quoted strings

"first option" or "second option"

==>

Program(OrExpr(Term,Term))

# OR with different prefixes

lang:python or lang:javascript

==>

Program(OrExpr(PrefixExpr(LangExpr),PrefixExpr(LangExpr)))

# Multiple terms with OR

function test or class example

==>

Program(OrExpr(AndExpr(Term,Term),AndExpr(Term,Term)))

# OR in parentheses

(test or example)

==>

Program(ParenExpr(OrExpr(Term,Term)))

# OR with parentheses outside

(test) or (example)

==>

Program(OrExpr(ParenExpr(Term),ParenExpr(Term)))

# Complex OR with grouping

(file:*.js lang:javascript) or (file:*.ts lang:typescript)

==>

Program(OrExpr(ParenExpr(AndExpr(PrefixExpr(FileExpr),PrefixExpr(LangExpr))),ParenExpr(AndExpr(PrefixExpr(FileExpr),PrefixExpr(LangExpr)))))

# OR with mixed content

test or file:example.js

==>

Program(OrExpr(Term,PrefixExpr(FileExpr)))

# Prefix OR term

file:test.js or example

==>

Program(OrExpr(PrefixExpr(FileExpr),Term))

# OR with short form prefixes

f:test.js or r:myrepo

==>

Program(OrExpr(PrefixExpr(FileExpr),PrefixExpr(RepoExpr)))

# OR with repo prefixes

repo:project1 or repo:project2

==>

Program(OrExpr(PrefixExpr(RepoExpr),PrefixExpr(RepoExpr)))

# OR with revision prefixes

rev:main or rev:develop

==>

Program(OrExpr(PrefixExpr(RevisionExpr),PrefixExpr(RevisionExpr)))

# OR with lang prefixes

lang:rust or lang:go

==>

Program(OrExpr(PrefixExpr(LangExpr),PrefixExpr(LangExpr)))

# OR with content

content:TODO or content:FIXME

==>

Program(OrExpr(PrefixExpr(ContentExpr),PrefixExpr(ContentExpr)))

# OR with negated terms

-file:test.js or -file:spec.js

==>

Program(OrExpr(NegateExpr(PrefixExpr(FileExpr)),NegateExpr(PrefixExpr(FileExpr))))

# OR in nested parentheses

((a or b) or (c or d))

==>

Program(ParenExpr(OrExpr(ParenExpr(OrExpr(Term,Term)),ParenExpr(OrExpr(Term,Term)))))

# Multiple OR with parentheses and implicit AND

(a or b) and (c or d)

==>

Program(AndExpr(ParenExpr(OrExpr(Term,Term)),Term,ParenExpr(OrExpr(Term,Term))))

# OR with wildcards

*.test.js or *.spec.js

==>

Program(OrExpr(Term,Term))

# OR with regex patterns

[a-z]+ or [0-9]+

==>

Program(OrExpr(Term,Term))

# OR with dots

com.example.test or org.example.test

==>

Program(OrExpr(Term,Term))

# OR with dashes

test-one or test-two

==>

Program(OrExpr(Term,Term))

# Word containing 'or'

order

==>

Program(Term)

# Word containing 'or' in middle

before

==>

Program(Term)

# OR at start

or test

==>

Program(⚠,Term)

# OR at end (or becomes term)

test or

==>

Program(AndExpr(Term,Term))

# Multiple consecutive OR

test or or example

==>

Program(OrExpr(Term,⚠,Term))

# OR with all prefix types

file:*.js or repo:myrepo or lang:javascript

==>

Program(OrExpr(PrefixExpr(FileExpr),PrefixExpr(RepoExpr),PrefixExpr(LangExpr)))

# Complex query with OR and negation

(lang:python or lang:ruby) -file:test.py

==>

Program(AndExpr(ParenExpr(OrExpr(PrefixExpr(LangExpr),PrefixExpr(LangExpr))),NegateExpr(PrefixExpr(FileExpr))))

# OR with quoted prefix values

file:"test one.js" or file:"test two.js"

==>

Program(OrExpr(PrefixExpr(FileExpr),PrefixExpr(FileExpr)))

# OR with empty parentheses

() or ()

==>

Program(OrExpr(ParenExpr(Term(⚠)),ParenExpr(Term(⚠))))

# OR with negated groups

-(file:a.js) or -(file:b.js)

==>

Program(OrExpr(NegateExpr(ParenExpr(PrefixExpr(FileExpr))),NegateExpr(ParenExpr(PrefixExpr(FileExpr)))))
packages/queryLanguage/test/precedence.txt (new file, 200 lines)

# OR has lowest precedence - implicit AND groups first

a b or c d

==>

Program(OrExpr(AndExpr(Term,Term),AndExpr(Term,Term)))

# Multiple OR operators are left-associative

a or b or c

==>

Program(OrExpr(Term,Term,Term))

# AND before OR

file:test.js error or file:test.go panic

==>

Program(OrExpr(AndExpr(PrefixExpr(FileExpr),Term),AndExpr(PrefixExpr(FileExpr),Term)))

# Negation binds tighter than AND

-file:test.js error

==>

Program(AndExpr(NegateExpr(PrefixExpr(FileExpr)),Term))

# Negation binds tighter than OR

-file:a.js or file:b.js

==>

Program(OrExpr(NegateExpr(PrefixExpr(FileExpr)),PrefixExpr(FileExpr)))

# Parentheses override precedence

(a or b) c

==>

Program(AndExpr(ParenExpr(OrExpr(Term,Term)),Term))

# Parentheses override - OR inside parens groups first

a (b or c)

==>

Program(AndExpr(Term,ParenExpr(OrExpr(Term,Term))))

# Complex: AND, OR, and negation

a -b or c d

==>

Program(OrExpr(AndExpr(Term,Term),AndExpr(Term,Term)))

# Negated group in OR expression

-(a b) or c

==>

Program(OrExpr(NegateExpr(ParenExpr(AndExpr(Term,Term))),Term))

# Multiple negations in OR

-file:a.js or -file:b.js or file:c.js

==>

Program(OrExpr(NegateExpr(PrefixExpr(FileExpr)),NegateExpr(PrefixExpr(FileExpr)),PrefixExpr(FileExpr)))

# Prefix binds to its value only

file:a.js b.js

==>

Program(AndExpr(PrefixExpr(FileExpr),Term))

# OR with prefixes and terms mixed

repo:backend error or repo:frontend warning

==>

Program(OrExpr(AndExpr(PrefixExpr(RepoExpr),Term),AndExpr(PrefixExpr(RepoExpr),Term)))

# Nested parentheses with OR

((a or b) c) or d

==>

Program(OrExpr(ParenExpr(AndExpr(ParenExpr(OrExpr(Term,Term)),Term)),Term))

# OR at different nesting levels

(a or (b or c))

==>

Program(ParenExpr(OrExpr(Term,ParenExpr(OrExpr(Term,Term)))))

# Implicit AND groups all adjacent terms before OR

a b c or d e f

==>

Program(OrExpr(AndExpr(Term,Term,Term),AndExpr(Term,Term,Term)))

# Mixed prefix and regular terms with OR

lang:go func or lang:rust fn

==>

Program(OrExpr(AndExpr(PrefixExpr(LangExpr),Term),AndExpr(PrefixExpr(LangExpr),Term)))

# Negation doesn't affect OR grouping

a or -b or c

==>

Program(OrExpr(Term,Term,Term))

# Parentheses can isolate OR from surrounding AND

a (b or c) d

==>

Program(AndExpr(Term,ParenExpr(OrExpr(Term,Term)),Term))

# Multiple parenthesized groups with AND

(a or b) (c or d)

==>

Program(AndExpr(ParenExpr(OrExpr(Term,Term)),ParenExpr(OrExpr(Term,Term))))

# Quoted strings are atomic - no precedence inside

"a or b"

==>

Program(Term)

# Prefix with OR value doesn't split

file:"a.js or b.js"

==>

Program(PrefixExpr(FileExpr))

# Negated prefix in complex expression

-file:test.js lang:go error or warning

==>

Program(OrExpr(AndExpr(NegateExpr(PrefixExpr(FileExpr)),PrefixExpr(LangExpr),Term),Term))

# OR followed by parenthesized AND

a or (b c)

==>

Program(OrExpr(Term,ParenExpr(AndExpr(Term,Term))))

# Empty parens don't affect precedence

() or a b

==>

Program(OrExpr(ParenExpr(Term(⚠)),AndExpr(Term,Term)))

# Negation of empty group

-() a

==>

Program(AndExpr(NegateExpr(ParenExpr(Term(⚠))),Term))
packages/queryLanguage/test/prefixes.txt (new file, 336 lines)

# File prefix

file:README.md

==>

Program(PrefixExpr(FileExpr))

# File prefix short form

f:index.ts

==>

Program(PrefixExpr(FileExpr))

# Repo prefix

repo:myproject

==>

Program(PrefixExpr(RepoExpr))

# Repo prefix short form

r:github.com/user/repo

==>

Program(PrefixExpr(RepoExpr))

# Content prefix

content:function

==>

Program(PrefixExpr(ContentExpr))

# Content prefix short form

c:console.log

==>

Program(PrefixExpr(ContentExpr))

# Revision prefix

rev:main

==>

Program(PrefixExpr(RevisionExpr))

# Lang prefix

lang:typescript

==>

Program(PrefixExpr(LangExpr))

# Archived prefix - no

archived:no

==>

Program(PrefixExpr(ArchivedExpr))

# Archived prefix - only

archived:only

==>

Program(PrefixExpr(ArchivedExpr))

# Fork prefix - yes

fork:yes

==>

Program(PrefixExpr(ForkExpr))

# Fork prefix - only

fork:only

==>

Program(PrefixExpr(ForkExpr))

# Visibility prefix - public

visibility:public

==>

Program(PrefixExpr(VisibilityExpr))

# Context prefix

context:web

==>

Program(PrefixExpr(ContextExpr))

# Symbol prefix

sym:MyClass

==>

Program(PrefixExpr(SymExpr))

# RepoSet prefix

reposet:repo1,repo2

==>

Program(PrefixExpr(RepoSetExpr))

# File with wildcard

file:*.ts

==>

Program(PrefixExpr(FileExpr))

# File with path

file:src/components/Button.tsx

==>

Program(PrefixExpr(FileExpr))

# Repo with full URL

repo:github.com/org/project

==>

Program(PrefixExpr(RepoExpr))

# Multiple prefixes

file:test.js repo:myproject

==>

Program(AndExpr(PrefixExpr(FileExpr),PrefixExpr(RepoExpr)))

# Prefix with term

file:test.js console.log

==>

Program(AndExpr(PrefixExpr(FileExpr),Term))

# Term then prefix

console.log file:handler.ts

==>

Program(AndExpr(Term,PrefixExpr(FileExpr)))

# Multiple prefixes and terms

lang:typescript function file:handler.ts

==>

Program(AndExpr(PrefixExpr(LangExpr),Term,PrefixExpr(FileExpr)))

# Prefix with regex pattern

file:[a-z]+\.test\.js

==>

Program(PrefixExpr(FileExpr))

# Content with spaces in value (no quotes)

content:hello

==>

Program(PrefixExpr(ContentExpr))

# Revision with slashes

rev:feature/new-feature

==>

Program(PrefixExpr(RevisionExpr))

# RepoSet with multiple repos

reposet:repo1,repo2,repo3

==>

Program(PrefixExpr(RepoSetExpr))

# Symbol with dots

sym:package.Class.method

==>

Program(PrefixExpr(SymExpr))

# Lang with various languages

lang:python

==>

Program(PrefixExpr(LangExpr))

# Archived prefix - yes

archived:yes

==>

Program(PrefixExpr(ArchivedExpr))

# Archived prefix - invalid value (error case)

archived:invalid

==>

Program(AndExpr(PrefixExpr(ArchivedExpr(⚠)),Term))

# Fork prefix - no

fork:no

==>

Program(PrefixExpr(ForkExpr))

# Fork prefix - invalid value (error case)

fork:invalid

==>

Program(AndExpr(PrefixExpr(ForkExpr(⚠)),Term))

# Visibility prefix - private

visibility:private

==>

Program(PrefixExpr(VisibilityExpr))

# Visibility prefix - any

visibility:any

==>

Program(PrefixExpr(VisibilityExpr))

# Visibility prefix - invalid value (error case)

visibility:invalid

==>

Program(AndExpr(PrefixExpr(VisibilityExpr(⚠)),Term))

# File with dashes

file:my-component.tsx

==>

Program(PrefixExpr(FileExpr))

# Repo with numbers

repo:project123

==>

Program(PrefixExpr(RepoExpr))

# Content with special chars

content:@Component

==>

Program(PrefixExpr(ContentExpr))

# Context with underscores

context:data_engineering

==>

Program(PrefixExpr(ContextExpr))

# Prefix in parentheses

(file:test.js)

==>

Program(ParenExpr(PrefixExpr(FileExpr)))
|
||||||
|
|
||||||
|
# Multiple prefixes in group
|
||||||
|
|
||||||
|
(file:*.ts lang:typescript)
|
||||||
|
|
||||||
|
==>
|
||||||
|
|
||||||
|
Program(ParenExpr(AndExpr(PrefixExpr(FileExpr),PrefixExpr(LangExpr))))
|
||||||
|
|
||||||
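
These fixtures follow the Lezer test-file convention: a `# `-prefixed test name, the query to parse, `==>`, and the expected parse tree. As a rough illustration of how the expected trees are produced (a minimal sketch; the `parser` export name from the query-language package is an assumption, not confirmed by this diff), Lezer's `Tree.toString()` prints exactly this bracketed form:

```typescript
// Sketch: print the parse tree for a query in the same bracketed form
// used by these fixtures. Assumes the query-language package exports a
// Lezer LRParser named `parser` — the actual export may differ.
import { parser } from "@sourcebot/query-language";

const tree = parser.parse("file:test.js repo:myproject");

// Tree.toString() nests node names, e.g.
// "Program(AndExpr(PrefixExpr(FileExpr),PrefixExpr(RepoExpr)))"
console.log(tree.toString());
```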
packages/queryLanguage/test/quoted.txt (new file, 479 lines)
@@ -0,0 +1,479 @@

# Simple quoted string
"hello"
==>
Program(Term)

# Quoted string with spaces
"hello world"
==>
Program(Term)

# Multiple words in quotes
"this is a search term"
==>
Program(Term)

# Quoted string with escaped quote
"hello \"world\""
==>
Program(Term)

# Quoted string with escaped backslash
"path\\to\\file"
==>
Program(Term)

# Double backslash
"test\\\\path"
==>
Program(Term)

# Multiple escaped quotes
"\"quoted\" \"words\""
==>
Program(Term)

# Mixed escaped characters
"test\\nvalue\"quoted"
==>
Program(Term)

# Empty quoted string
""
==>
Program(Term)

# Quoted string with only spaces
" "
==>
Program(Term)

# Quoted string in file prefix
file:"my file.txt"
==>
Program(PrefixExpr(FileExpr))

# Quoted string in repo prefix
repo:"github.com/user/repo name"
==>
Program(PrefixExpr(RepoExpr))

# Quoted string in content prefix
content:"console.log"
==>
Program(PrefixExpr(ContentExpr))

# Quoted string in revision prefix
rev:"feature/my feature"
==>
Program(PrefixExpr(RevisionExpr))

# Multiple quoted strings
"first string" "second string"
==>
Program(AndExpr(Term,Term))

# Quoted and unquoted mixed
unquoted "quoted string" another
==>
Program(AndExpr(Term,Term,Term))

# Quoted string with parentheses inside
"(test)"
==>
Program(Term)

# Quoted string with brackets
"[a-z]+"
==>
Program(Term)

# Quoted string with special chars
"test@example.com"
==>
Program(Term)

# Quoted string with colons
"key:value"
==>
Program(Term)

# Quoted string with dashes
"test-case-example"
==>
Program(Term)

# Quoted string with dots
"com.example.package"
==>
Program(Term)

# Quoted string with regex pattern
"\\w+\\s*=\\s*\\d+"
==>
Program(Term)

# Quoted string with forward slashes
"path/to/file"
==>
Program(Term)

# Quoted string with underscores
"my_variable_name"
==>
Program(Term)

# Quoted string with numbers
"test123"
==>
Program(Term)

# Quoted string with mixed case
"CamelCaseTest"
==>
Program(Term)

# Quoted prefix value with spaces
file:"test file.js"
==>
Program(PrefixExpr(FileExpr))

# Multiple prefixes with quoted values
file:"my file.txt" repo:"my repo"
==>
Program(AndExpr(PrefixExpr(FileExpr),PrefixExpr(RepoExpr)))

# Quoted string in parentheses
("quoted term")
==>
Program(ParenExpr(Term))

# Multiple quoted in parentheses
("first" "second")
==>
Program(ParenExpr(AndExpr(Term,Term)))

# Quoted with escaped newline
"line1\\nline2"
==>
Program(Term)

# Quoted with tab character
"value\\ttab"
==>
Program(Term)

# Lang prefix with quoted value
lang:"objective-c"
==>
Program(PrefixExpr(LangExpr))

# Sym prefix with quoted value
sym:"My Class"
==>
Program(PrefixExpr(SymExpr))

# Content with quoted phrase
content:"TODO: fix this"
==>
Program(PrefixExpr(ContentExpr))

# Quoted string with at symbol
"@decorator"
==>
Program(Term)

# Quoted string with hash
"#define"
==>
Program(Term)

# Quoted string with dollar sign
"$variable"
==>
Program(Term)

# Quoted string with percent
"100%"
==>
Program(Term)

# Quoted string with ampersand
"foo&bar"
==>
Program(Term)

# Quoted string with asterisk
"test*"
==>
Program(Term)

# Quoted string with plus
"a+b"
==>
Program(Term)

# Quoted string with equals
"a=b"
==>
Program(Term)

# Quoted string with angle brackets
"<template>"
==>
Program(Term)

# Quoted string with pipe
"a|b"
==>
Program(Term)

# Quoted string with tilde
"~/.config"
==>
Program(Term)

# Quoted string with backtick
"`code`"
==>
Program(Term)

# Quoted string with question mark
"what?"
==>
Program(Term)

# Quoted string with exclamation
"important!"
==>
Program(Term)

# Quoted string with semicolon
"stmt;"
==>
Program(Term)

# Quoted string with comma
"a,b,c"
==>
Program(Term)

# Multiple quotes in content
content:"function \"test\" {"
==>
Program(PrefixExpr(ContentExpr))

# Quoted prefix keyword becomes literal
"repo:hello"
==>
Program(Term)

# Quoted file prefix as literal
"file:test.js"
==>
Program(Term)

# Quoted lang prefix as literal
"lang:python"
==>
Program(Term)

# Quoted partial prefix
"repo:"
==>
Program(Term)

# Mix of quoted prefix and real prefix
"repo:test" file:actual.js
==>
Program(AndExpr(Term,PrefixExpr(FileExpr)))

# Quoted short form prefix
"f:test"
==>
Program(Term)

# Quoted revision prefix
"rev:main"
==>
Program(Term)
packages/queryLanguage/tsconfig.json (new file, 23 lines)
@@ -0,0 +1,23 @@
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "Node16",
    "moduleResolution": "Node16",
    "lib": ["ES2023"],
    "outDir": "dist",
    "rootDir": "src",
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true,
    "strict": true,
    "noImplicitAny": true,
    "strictNullChecks": true,
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true,
    "skipLibCheck": true,
    "isolatedModules": true,
    "resolveJsonModule": true
  },
  "include": ["src/index.ts"],
  "exclude": ["node_modules", "dist"]
}
packages/queryLanguage/vitest.config.ts (new file, 8 lines)
@@ -0,0 +1,8 @@
import { defineConfig } from 'vitest/config';

export default defineConfig({
    test: {
        environment: 'node',
        watch: false,
    }
});
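
The vitest config above runs the query-language package's tests in a plain Node environment with watch mode disabled. A suite that consumes the `.txt` grammar fixtures could be wired up with Lezer's `fileTests` helper, roughly as sketched below (the fixture location and the `parser` import path are assumptions; the PR's actual test harness is not shown in this excerpt):

```typescript
// Sketch of a vitest suite that runs the .txt grammar fixtures.
// `fileTests` parses the "# name / input / ==> / expected tree" format
// and returns one runnable case per section.
import { fileTests } from "@lezer/generator/dist/test";
import { describe, it } from "vitest";
import { readFileSync } from "fs";
import { parser } from "../src/index.js"; // assumed export path

for (const fixture of ["prefixes.txt", "quoted.txt"]) {
    // Fixture files are assumed to live next to this test file.
    const content = readFileSync(new URL(`./${fixture}`, import.meta.url), "utf8");
    describe(fixture, () => {
        for (const { name, run } of fileTests(content, fixture)) {
            it(name, () => run(parser)); // throws if the actual tree differs
        }
    });
}
```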
@@ -1,3 +1,4 @@
 # shadcn components
 src/components/
 next-env.d.ts
+src/proto/**

@@ -8,6 +8,7 @@
     "start": "next start",
     "lint": "cross-env SKIP_ENV_VALIDATION=1 eslint .",
     "test": "cross-env SKIP_ENV_VALIDATION=1 vitest",
+    "generate:protos": "proto-loader-gen-types --includeComments --longs=Number --enums=String --defaults --oneofs --grpcLib=@grpc/grpc-js --keepCase --includeDirs=../../vendor/zoekt/grpc/protos --outDir=src/proto zoekt/webserver/v1/webserver.proto zoekt/webserver/v1/query.proto",
     "dev:emails": "email dev --dir ./src/emails",
     "stripe:listen": "stripe listen --forward-to http://localhost:3000/api/stripe"
   },

@@ -52,6 +53,8 @@
     "@codemirror/state": "^6.4.1",
     "@codemirror/view": "^6.33.0",
     "@floating-ui/react": "^0.27.2",
+    "@grpc/grpc-js": "^1.14.1",
+    "@grpc/proto-loader": "^0.8.0",
     "@hookform/resolvers": "^3.9.0",
     "@iconify/react": "^5.1.0",
     "@iizukak/codemirror-lang-wgsl": "^0.3.0",

@@ -91,6 +94,7 @@
     "@shopify/lang-jsonc": "^1.0.0",
     "@sourcebot/codemirror-lang-tcl": "^1.0.12",
     "@sourcebot/db": "workspace:*",
+    "@sourcebot/query-language": "workspace:*",
     "@sourcebot/schemas": "workspace:*",
     "@sourcebot/shared": "workspace:*",
     "@ssddanbrown/codemirror-lang-twig": "^1.0.0",
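
The new `generate:protos` script generates TypeScript typings for zoekt's gRPC protos via `proto-loader-gen-types`. At runtime, a client for those protos is typically built with `@grpc/proto-loader` plus `@grpc/grpc-js`, along the lines of the hedged sketch below (the `WebserverService` package path and the address are assumptions for illustration; the PR's actual client wiring is not shown in this excerpt):

```typescript
// Sketch: loading the zoekt webserver proto and creating a gRPC client.
// Mirrors the flags passed to proto-loader-gen-types in the script above.
import * as grpc from "@grpc/grpc-js";
import * as protoLoader from "@grpc/proto-loader";

const packageDefinition = protoLoader.loadSync("zoekt/webserver/v1/webserver.proto", {
    includeDirs: ["../../vendor/zoekt/grpc/protos"],
    longs: Number,
    enums: String,
    defaults: true,
    oneofs: true,
    keepCase: true,
});

// The generated .d.ts files from generate:protos can type this object;
// `any` keeps the sketch self-contained.
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const proto = grpc.loadPackageDefinition(packageDefinition) as any;

// Address is a placeholder; the real deployment wiring may differ.
const client = new proto.zoekt.webserver.v1.WebserverService(
    "localhost:6070",
    grpc.credentials.createInsecure(),
);
```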
@@ -48,6 +48,10 @@ export const sew = async <T>(fn: () => Promise<T>): Promise<T | ServiceError> =>
         Sentry.captureException(e);
         logger.error(e);

+        if (e instanceof ServiceErrorException) {
+            return e.serviceError;
+        }
+
         if (e instanceof Error) {
             return unexpectedError(e.message);
         }

@@ -29,7 +29,9 @@ export default function Layout({
             >
                 <SearchBar
                     size="sm"
-                    defaultQuery={`repo:${repoName}${revisionName ? ` rev:${revisionName}` : ''} `}
+                    defaults={{
+                        query: `repo:${repoName}${revisionName ? ` rev:${revisionName}` : ''} `,
+                    }}
                     className="w-full"
                 />
             </TopBar>

@@ -6,7 +6,7 @@ import { memo, useEffect, useMemo, useState } from 'react'
 import { useCodeMirrorHighlighter } from '@/hooks/useCodeMirrorHighlighter'
 import tailwind from '@/tailwind'
 import { measure } from '@/lib/utils'
-import { SourceRange } from '@/features/search/types'
+import { SourceRange } from '@/features/search'

 // Define a plain text language
 const plainTextLanguage = StreamLanguage.define({

@@ -233,7 +233,7 @@ export const PathHeader = ({
                 }}
             >
                 <span className="mr-0.5">@</span>
-                {`${branchDisplayName}`}
+                {`${branchDisplayName.replace(/^refs\/(heads|tags)\//, '')}`}
             </p>
         )}
         <span>·</span>

@@ -16,57 +16,53 @@ export enum SearchPrefix {
     sym = "sym:",
     content = "content:",
     archived = "archived:",
-    case = "case:",
     fork = "fork:",
-    public = "public:",
+    visibility = "visibility:",
     context = "context:",
 }

-export const publicModeSuggestions: Suggestion[] = [
+export const visibilityModeSuggestions: Suggestion[] = [
     {
-        value: "yes",
+        value: "public",
         description: "Only include results from public repositories."
     },
     {
-        value: "no",
+        value: "private",
         description: "Only include results from private repositories."
     },
+    {
+        value: "any",
+        description: "Include results from both public and private repositories (default)."
+    },
 ];

 export const forkModeSuggestions: Suggestion[] = [
     {
         value: "yes",
+        description: "Include results from forked repositories (default)."
+    },
+    {
+        value: "no",
+        description: "Exclude results from forked repositories."
+    },
+    {
+        value: "only",
         description: "Only include results from forked repositories."
-    },
-    {
-        value: "no",
-        description: "Only include results from non-forked repositories."
-    },
-];
-
-export const caseModeSuggestions: Suggestion[] = [
-    {
-        value: "auto",
-        description: "Search patterns are case-insensitive if all characters are lowercase, and case sensitive otherwise (default)."
-    },
-    {
-        value: "yes",
-        description: "Case sensitive search."
-    },
-    {
-        value: "no",
-        description: "Case insensitive search."
-    },
+    }
 ];

 export const archivedModeSuggestions: Suggestion[] = [
     {
         value: "yes",
-        description: "Only include results in archived repositories."
+        description: "Include results from archived repositories (default)."
     },
     {
         value: "no",
-        description: "Only include results in non-archived repositories."
+        description: "Exclude results from archived repositories."
     },
+    {
+        value: "only",
+        description: "Only include results from archived repositories."
+    }
 ];
@@ -42,14 +42,18 @@ import { Separator } from "@/components/ui/separator";
 import { Tooltip, TooltipTrigger, TooltipContent } from "@/components/ui/tooltip";
 import { Toggle } from "@/components/ui/toggle";
 import { useDomain } from "@/hooks/useDomain";
-import { KeyboardShortcutHint } from "@/app/components/keyboardShortcutHint";
 import { createAuditAction } from "@/ee/features/audit/actions";
 import tailwind from "@/tailwind";
+import { CaseSensitiveIcon, RegexIcon } from "lucide-react";

 interface SearchBarProps {
     className?: string;
     size?: "default" | "sm";
-    defaultQuery?: string;
+    defaults?: {
+        isRegexEnabled?: boolean;
+        isCaseSensitivityEnabled?: boolean;
+        query?: string;
+    }
     autoFocus?: boolean;
 }

@@ -91,8 +95,12 @@ const searchBarContainerVariants = cva(
 export const SearchBar = ({
     className,
     size,
-    defaultQuery,
     autoFocus,
+    defaults: {
+        isRegexEnabled: defaultIsRegexEnabled = false,
+        isCaseSensitivityEnabled: defaultIsCaseSensitivityEnabled = false,
+        query: defaultQuery = "",
+    } = {}
 }: SearchBarProps) => {
     const router = useRouter();
     const domain = useDomain();

@@ -102,11 +110,13 @@ export const SearchBar = ({
     const [isSuggestionsEnabled, setIsSuggestionsEnabled] = useState(false);
     const [isSuggestionsBoxFocused, setIsSuggestionsBoxFocused] = useState(false);
     const [isHistorySearchEnabled, setIsHistorySearchEnabled] = useState(false);
+    const [isRegexEnabled, setIsRegexEnabled] = useState(defaultIsRegexEnabled);
+    const [isCaseSensitivityEnabled, setIsCaseSensitivityEnabled] = useState(defaultIsCaseSensitivityEnabled);

     const focusEditor = useCallback(() => editorRef.current?.view?.focus(), []);
     const focusSuggestionsBox = useCallback(() => suggestionBoxRef.current?.focus(), []);

-    const [_query, setQuery] = useState(defaultQuery ?? "");
+    const [_query, setQuery] = useState(defaultQuery);
     const query = useMemo(() => {
         // Replace any newlines with spaces to handle
         // copy & pasting text with newlines.

@@ -215,9 +225,11 @@ export const SearchBar = ({

         const url = createPathWithQueryParams(`/${domain}/search`,
             [SearchQueryParams.query, query],
+            [SearchQueryParams.isRegexEnabled, isRegexEnabled ? "true" : null],
+            [SearchQueryParams.isCaseSensitivityEnabled, isCaseSensitivityEnabled ? "true" : null],
         );
         router.push(url);
-    }, [domain, router]);
+    }, [domain, router, isRegexEnabled, isCaseSensitivityEnabled]);

     return (
         <div
@@ -275,18 +287,40 @@
                     indentWithTab={false}
                     autoFocus={autoFocus ?? false}
                 />
-                <Tooltip
-                    delayDuration={100}
-                >
+                <div className="flex flex-row items-center gap-1 ml-1">
+                    <Tooltip>
                         <TooltipTrigger asChild>
-                            <div>
-                                <KeyboardShortcutHint shortcut="/" />
-                            </div>
+                            <span>
+                                <Toggle
+                                    className="h-7 w-7 min-w-7 p-0 cursor-pointer"
+                                    pressed={isCaseSensitivityEnabled}
+                                    onPressedChange={setIsCaseSensitivityEnabled}
+                                >
+                                    <CaseSensitiveIcon className="w-4 h-4" />
+                                </Toggle>
+                            </span>
                         </TooltipTrigger>
                         <TooltipContent side="bottom" className="flex flex-row items-center gap-2">
-                            Focus search bar
+                            {isCaseSensitivityEnabled ? "Disable" : "Enable"} case sensitivity
                         </TooltipContent>
                     </Tooltip>
+                    <Tooltip>
+                        <TooltipTrigger asChild>
+                            <span>
+                                <Toggle
+                                    className="h-7 w-7 min-w-7 p-0 cursor-pointer"
+                                    pressed={isRegexEnabled}
+                                    onPressedChange={setIsRegexEnabled}
+                                >
+                                    <RegexIcon className="w-4 h-4" />
+                                </Toggle>
+                            </span>
+                        </TooltipTrigger>
+                        <TooltipContent side="bottom" className="flex flex-row items-center gap-2">
+                            {isRegexEnabled ? "Disable" : "Enable"} regular expressions
+                        </TooltipContent>
+                    </Tooltip>
+                </div>
                 <SearchSuggestionsBox
                     ref={suggestionBoxRef}
                     query={query}
@@ -7,9 +7,8 @@ import Fuse from "fuse.js";
 import { forwardRef, Ref, useEffect, useMemo, useState } from "react";
 import {
     archivedModeSuggestions,
-    caseModeSuggestions,
     forkModeSuggestions,
-    publicModeSuggestions,
+    visibilityModeSuggestions,
 } from "./constants";
 import { IconType } from "react-icons/lib";
 import { VscFile, VscFilter, VscRepo, VscSymbolMisc } from "react-icons/vsc";

@@ -32,9 +31,8 @@ export type SuggestionMode =
     "archived" |
     "file" |
     "language" |
-    "case" |
     "fork" |
-    "public" |
+    "visibility" |
     "revision" |
     "symbol" |
     "content" |

@@ -137,9 +135,9 @@ const SearchSuggestionsBox = forwardRef(({
     DefaultIcon?: IconType
 } => {
     switch (suggestionMode) {
-        case "public":
+        case "visibility":
             return {
-                list: publicModeSuggestions,
+                list: visibilityModeSuggestions,
                 onSuggestionClicked: createOnSuggestionClickedHandler(),
             }
         case "fork":

@@ -147,11 +145,6 @@ const SearchSuggestionsBox = forwardRef(({
                 list: forkModeSuggestions,
                 onSuggestionClicked: createOnSuggestionClickedHandler(),
             }
-        case "case":
-            return {
-                list: caseModeSuggestions,
-                onSuggestionClicked: createOnSuggestionClickedHandler(),
-            }
         case "archived":
             return {
                 list: archivedModeSuggestions,

@@ -183,7 +176,7 @@ const SearchSuggestionsBox = forwardRef(({
         case "file":
             return {
                 list: fileSuggestions,
-                onSuggestionClicked: createOnSuggestionClickedHandler(),
+                onSuggestionClicked: createOnSuggestionClickedHandler({ regexEscaped: true }),
                 isClientSideSearchEnabled: false,
                 DefaultIcon: VscFile,
             }

@@ -26,7 +26,7 @@ export const useRefineModeSuggestions = () => {
     },
     ] : []),
     {
-        value: SearchPrefix.public,
+        value: SearchPrefix.visibility,
         description: "Filter on repository visibility."
     },
     {

@@ -86,10 +86,6 @@ export const useRefineModeSuggestions = () => {
         value: SearchPrefix.archived,
         description: "Include results from archived repositories.",
     },
-    {
-        value: SearchPrefix.case,
-        description: "Control case-sensitivity of search patterns."
-    },
     {
         value: SearchPrefix.fork,
         description: "Include only results from forked repositories."

@@ -70,12 +70,6 @@ export const useSuggestionModeMappings = () => {
             SearchPrefix.archived
         ]
     },
-    {
-        suggestionMode: "case",
-        prefixes: [
-            SearchPrefix.case
-        ]
-    },
     {
         suggestionMode: "fork",
         prefixes: [

@@ -83,9 +77,9 @@ export const useSuggestionModeMappings = () => {
         ]
     },
     {
-        suggestionMode: "public",
+        suggestionMode: "visibility",
         prefixes: [
-            SearchPrefix.public
+            SearchPrefix.visibility
         ]
     },
     ...(isSearchContextsEnabled ? [
@@ -5,7 +5,7 @@ import { Suggestion, SuggestionMode } from "./searchSuggestionsBox";
 import { getRepos, search } from "@/app/api/(client)/client";
 import { getSearchContexts } from "@/actions";
 import { useMemo } from "react";
-import { SearchSymbol } from "@/features/search/types";
+import { SearchSymbol } from "@/features/search";
 import { languageMetadataMap } from "@/lib/languageMetadata";
 import {
     VscSymbolClass,

@@ -47,7 +47,7 @@ export const zoekt = () => {

     // Check for prefixes first
     // If these match, we return 'keyword'
-    if (stream.match(/(archived:|branch:|b:|rev:|c:|case:|content:|f:|file:|fork:|public:|r:|repo:|regex:|lang:|sym:|t:|type:|context:)/)) {
+    if (stream.match(/(archived:|rev:|content:|f:|file:|fork:|visibility:|r:|repo:|regex:|lang:|sym:|t:|type:|context:)/)) {
         return t.keyword.toString();
     }

@@ -15,6 +15,7 @@ import { useCallback, useRef } from "react";
 import { useHotkeys } from "react-hotkeys-hook";
 import { useSyntaxGuide } from "./syntaxGuideProvider";
 import { CodeSnippet } from "@/app/components/codeSnippet";
+import { ExternalLinkIcon, RegexIcon } from "lucide-react";

 const LINGUIST_LINK = "https://github.com/github-linguist/linguist/blob/main/lib/linguist/languages.yml";
 const CTAGS_LINK = "https://ctags.io/";
@@ -61,14 +62,55 @@ export const SyntaxReferenceGuide = () => {
             onOpenChange={handleOpenChange}
         >
             <DialogContent
-                className="max-h-[80vh] max-w-[700px] overflow-scroll"
+                className="max-h-[80vh] max-w-[700px] overflow-scroll gap-2"
             >
                 <DialogHeader>
-                    <DialogTitle>Syntax Reference Guide</DialogTitle>
+                    <DialogTitle>Syntax Reference Guide <Link href="https://docs.sourcebot.dev/docs/features/search/syntax-reference"><ExternalLinkIcon className="inline w-4 h-4 ml-1 mb-1 text-muted-foreground cursor-pointer" /></Link></DialogTitle>
                     <DialogDescription className="text-sm text-foreground">
-                        Queries consist of space-seperated regular expressions. Wrapping expressions in <CodeSnippet>{`""`}</CodeSnippet> combines them. By default, a file must have at least one match for each expression to be included.
+                        Queries consist of space-separated search patterns that are matched against file contents. A file must have at least one match for each expression to be included. Queries can optionally contain search filters to further refine the search results.
                     </DialogDescription>
                 </DialogHeader>

+                <div>
+                    <h3 className="text-lg font-semibold mt-4 mb-0">Keyword search (default)</h3>
+                    <p className="text-sm mb-2 mt-0">
+                        Keyword search matches search patterns exactly in file contents. Wrapping search patterns in <CodeSnippet>{`""`}</CodeSnippet> combines them as a single expression.
+                    </p>
+                    <Table>
+                        <TableHeader>
+                            <TableRow>
+                                <TableHead className="py-2">Example</TableHead>
+                                <TableHead className="py-2">Explanation</TableHead>
+                            </TableRow>
+                        </TableHeader>
+                        <TableBody>
+                            <TableRow>
+                                <TableCell className="py-2"><CodeSnippet>foo</CodeSnippet></TableCell>
+                                <TableCell className="py-2">Match files containing the keyword <CodeSnippet>foo</CodeSnippet></TableCell>
+                            </TableRow>
+                            <TableRow>
+                                <TableCell className="py-2"><CodeSnippet>foo bar</CodeSnippet></TableCell>
+                                <TableCell className="py-2">Match files containing both <CodeSnippet>foo</CodeSnippet> <b>and</b> <CodeSnippet>bar</CodeSnippet></TableCell>
+                            </TableRow>
+                            <TableRow>
+                                <TableCell className="py-2"><CodeSnippet>{`"foo bar"`}</CodeSnippet></TableCell>
+                                <TableCell className="py-2">Match files containing the phrase <CodeSnippet>foo bar</CodeSnippet></TableCell>
+                            </TableRow>
+                            <TableRow>
+                                <TableCell className="py-2"><CodeSnippet>{'"foo \\"bar\\""'}</CodeSnippet></TableCell>
+                                <TableCell className="py-2">Match files containing <CodeSnippet>foo "bar"</CodeSnippet> exactly (escaped quotes)</TableCell>
+                            </TableRow>
+                        </TableBody>
+                    </Table>
+                </div>
+
+                <Separator className="my-4"/>
+
+                <div>
+                    <h3 className="text-lg font-semibold mt-4 mb-0">Regex search</h3>
+                    <p className="text-sm mb-2 mt-0">
+                        Toggle the <RegexIcon className="inline w-4 h-4 align-middle mx-0.5 border rounded px-0.5 py-0.5" /> button to interpret search patterns as regular expressions.
+                    </p>
                 <Table>
                     <TableHeader>
                         <TableRow>

@@ -82,46 +124,27 @@ export const SyntaxReferenceGuide = () => {
                             <TableCell className="py-2">Match files with regex <CodeSnippet>/foo/</CodeSnippet></TableCell>
                         </TableRow>
                         <TableRow>
-                            <TableCell className="py-2"><CodeSnippet>foo bar</CodeSnippet></TableCell>
-                            <TableCell className="py-2">Match files with regex <CodeSnippet>/foo/</CodeSnippet> <b>and</b> <CodeSnippet>/bar/</CodeSnippet></TableCell>
+                            <TableCell className="py-2"><CodeSnippet>foo.*bar</CodeSnippet></TableCell>
+                            <TableCell className="py-2">Match files with regex <CodeSnippet>/foo.*bar/</CodeSnippet> (foo followed by any characters, then bar)</TableCell>
+                        </TableRow>
+                        <TableRow>
+                            <TableCell className="py-2"><CodeSnippet>{`^function\\s+\\w+`}</CodeSnippet></TableCell>
+                            <TableCell className="py-2">Match files with regex <CodeSnippet>/^function\s+\w+/</CodeSnippet> (function at start of line, followed by whitespace and word characters)</TableCell>
                         </TableRow>
                         <TableRow>
                             <TableCell className="py-2"><CodeSnippet>{`"foo bar"`}</CodeSnippet></TableCell>
-                            <TableCell className="py-2">Match files with regex <CodeSnippet>/foo bar/</CodeSnippet></TableCell>
+                            <TableCell className="py-2">Match files with regex <CodeSnippet>/foo bar/</CodeSnippet>. Quotes are not matched.</TableCell>
                         </TableRow>
                     </TableBody>
                 </Table>
+                </div>

-                <Separator className="my-2"/>
-                <p className="text-sm">
-                    {`Multiple expressions can be or'd together with `}<CodeSnippet>or</CodeSnippet>, negated with <CodeSnippet>-</CodeSnippet>, or grouped with <CodeSnippet>()</CodeSnippet>.
-                </p>
-                <Table>
-                    <TableHeader>
-                        <TableRow>
-                            <TableHead className="py-2">Example</TableHead>
-                            <TableHead className="py-2">Explanation</TableHead>
-                        </TableRow>
-                    </TableHeader>
-                    <TableBody>
-                        <TableRow>
-                            <TableCell className="py-2"><CodeSnippet>foo <Highlight>or</Highlight> bar</CodeSnippet></TableCell>
-                            <TableCell className="py-2">Match files with regex <CodeSnippet>/foo/</CodeSnippet> <b>or</b> <CodeSnippet>/bar/</CodeSnippet></TableCell>
-                        </TableRow>
-                        <TableRow>
-                            <TableCell className="py-2"><CodeSnippet>foo -bar</CodeSnippet></TableCell>
-                            <TableCell className="py-2">Match files with regex <CodeSnippet>/foo/</CodeSnippet> but <b>not</b> <CodeSnippet>/bar/</CodeSnippet></TableCell>
-                        </TableRow>
-                        <TableRow>
-                            <TableCell className="py-2"><CodeSnippet>foo (bar <Highlight>or</Highlight> baz)</CodeSnippet></TableCell>
-                            <TableCell className="py-2">Match files with regex <CodeSnippet>/foo/</CodeSnippet> <b>and</b> either <CodeSnippet>/bar/</CodeSnippet> <b>or</b> <CodeSnippet>/baz/</CodeSnippet></TableCell>
-                        </TableRow>
-                    </TableBody>
-                </Table>
+                <Separator className="my-4"/>

-                <Separator className="my-2"/>
-                <p className="text-sm">
-                    Expressions can be prefixed with certain keywords to modify search behavior. Some keywords can be negated using the <CodeSnippet>-</CodeSnippet> prefix.
+                <div>
+                    <h3 className="text-lg font-semibold mt-4 mb-0">Search filters</h3>
+                    <p className="text-sm mb-2 mt-0">
+                        Search queries (keyword or regex) can include multiple search filters to further refine the search results. Some filters can be negated using the <CodeSnippet>-</CodeSnippet> prefix.
                     </p>

                 <Table>

@@ -220,6 +243,38 @@
                         </TableRow>
                     </TableBody>
                 </Table>
+                </div>
+
+                <Separator className="my-4"/>
+
+                <div>
+                    <h3 className="text-lg font-semibold mt-4 mb-0">Boolean operators & grouping</h3>
+                    <p className="text-sm mb-2 mt-0">
+                        By default, space-seperated expressions are and'd together. Using the <CodeSnippet>or</CodeSnippet> keyword as well as parantheses <CodeSnippet>()</CodeSnippet> can be used to create more complex boolean logic. Parantheses can be negated using the <CodeSnippet>-</CodeSnippet> prefix.
+                    </p>
+                    <Table>
+                        <TableHeader>
+                            <TableRow>
+                                <TableHead className="py-2">Example</TableHead>
+                                <TableHead className="py-2">Explanation</TableHead>
+                            </TableRow>
+                        </TableHeader>
+                        <TableBody>
+                            <TableRow>
+                                <TableCell className="py-2"><CodeSnippet>foo <Highlight>or</Highlight> bar</CodeSnippet></TableCell>
+                                <TableCell className="py-2">Match files containing <CodeSnippet>foo</CodeSnippet> <b>or</b> <CodeSnippet>bar</CodeSnippet></TableCell>
+                            </TableRow>
+                            <TableRow>
+                                <TableCell className="py-2"><CodeSnippet>foo (bar <Highlight>or</Highlight> baz)</CodeSnippet></TableCell>
+                                <TableCell className="py-2">Match files containing <CodeSnippet>foo</CodeSnippet> <b>and</b> either <CodeSnippet>bar</CodeSnippet> <b>or</b> <CodeSnippet>baz</CodeSnippet>.</TableCell>
+                            </TableRow>
+                            <TableRow>
+                                <TableCell className="py-2"><CodeSnippet>-(foo) bar</CodeSnippet></TableCell>
+                                <TableCell className="py-2">Match files containing <CodeSnippet>bar</CodeSnippet> <b>and not</b> <CodeSnippet>foo</CodeSnippet>.</TableCell>
+                            </TableRow>
+                        </TableBody>
+                    </Table>
+                </div>
             </DialogContent>
         </Dialog>
     )
@@ -3,7 +3,7 @@
 import { EditorContextMenu } from "@/app/[domain]/components/editorContextMenu";
 import { Button } from "@/components/ui/button";
 import { ScrollArea } from "@/components/ui/scroll-area";
-import { SearchResultChunk } from "@/features/search/types";
+import { SearchResultChunk } from "@/features/search";
 import { useCodeMirrorTheme } from "@/hooks/useCodeMirrorTheme";
 import { useKeymapExtension } from "@/hooks/useKeymapExtension";
 import { useCodeMirrorLanguageExtension } from "@/hooks/useCodeMirrorLanguageExtension";

@@ -2,7 +2,7 @@

 import { useQuery } from "@tanstack/react-query";
 import { CodePreview } from "./codePreview";
-import { SearchResultFile } from "@/features/search/types";
+import { SearchResultFile } from "@/features/search";
 import { SymbolIcon } from "@radix-ui/react-icons";
 import { SetStateAction, Dispatch, useMemo } from "react";
 import { unwrapServiceError } from "@/lib/utils";

@@ -5,6 +5,7 @@ import { compareEntries, Entry } from "./entry";
 import { Input } from "@/components/ui/input";
 import Fuse from "fuse.js";
 import { cn } from "@/lib/utils"
+import { Skeleton } from "@/components/ui/skeleton";

 interface FilterProps {
     title: string,

@@ -12,6 +13,7 @@ interface FilterProps {
     entries: Entry[],
     onEntryClicked: (key: string) => void,
     className?: string,
+    isStreaming: boolean,
 }

 export const Filter = ({

@@ -20,6 +22,7 @@ export const Filter = ({
     entries,
     onEntryClicked,
     className,
+    isStreaming,
 }: FilterProps) => {
     const [searchFilter, setSearchFilter] = useState<string>("");

@@ -43,6 +46,10 @@ export const Filter = ({
             className
         )}>
             <h2 className="text-sm font-semibold">{title}</h2>
+            {(isStreaming && entries.length === 0) ? (
+                <Skeleton className="h-12 w-full" />
+            ) : (
+                <>
             <div className="pr-1">
                 <Input
                     placeholder={searchPlaceholder}

@@ -64,6 +71,9 @@ export const Filter = ({
                     />
                 ))}
             </div>
+                </>
+            )}
+
         </div>
     )
 }

@@ -1,7 +1,7 @@
 'use client';

 import { FileIcon } from "@/components/ui/fileIcon";
-import { RepositoryInfo, SearchResultFile } from "@/features/search/types";
+import { RepositoryInfo, SearchResultFile } from "@/features/search";
 import { cn, getCodeHostInfoForRepo } from "@/lib/utils";
 import { LaptopIcon } from "@radix-ui/react-icons";
 import Image from "next/image";
@@ -15,6 +15,8 @@ import { useGetSelectedFromQuery } from "./useGetSelectedFromQuery";
 interface FilePanelProps {
     matches: SearchResultFile[];
     repoInfo: Record<number, RepositoryInfo>;
+    onFilterChange?: () => void;
+    isStreaming: boolean;
 }

 /**

@@ -31,10 +33,14 @@ interface FilePanelProps {
 *
 * @param matches - Array of search result files to filter
 * @param repoInfo - Information about repositories including their display names and icons
+ * @param onFilterChange - Optional callback that is called whenever a filter is applied or removed
+ * @param isStreaming - Whether the search is streaming
 */
 export const FilterPanel = ({
     matches,
     repoInfo,
+    onFilterChange,
+    isStreaming,
 }: FilePanelProps) => {
     const router = useRouter();
     const searchParams = useSearchParams();

@@ -148,9 +154,11 @@ export const FilterPanel = ({

                     if (newParams.toString() !== searchParams.toString()) {
                         router.replace(`?${newParams.toString()}`, { scroll: false });
+                        onFilterChange?.();
                     }
                 }}
                 className="max-h-[50%]"
+                isStreaming={isStreaming}
             />
             <Filter
                 title="Filter By Language"

@@ -170,9 +178,11 @@ export const FilterPanel = ({

                     if (newParams.toString() !== searchParams.toString()) {
                         router.replace(`?${newParams.toString()}`, { scroll: false });
+                        onFilterChange?.();
                     }
                 }}
                 className="overflow-auto"
+                isStreaming={isStreaming}
             />
         </div>
     )

@@ -1,6 +1,6 @@
 'use client';

-import { SearchResultFile } from "@/features/search/types";
+import { SearchResultFile } from "@/features/search";
 import { useMemo } from "react";
 import { useGetSelectedFromQuery } from "./useGetSelectedFromQuery";
@@ -11,76 +11,76 @@ import {
 } from "@/components/ui/resizable";
 import { Separator } from "@/components/ui/separator";
 import { Tooltip, TooltipContent, TooltipTrigger } from "@/components/ui/tooltip";
-import { RepositoryInfo, SearchResultFile, SearchStats } from "@/features/search/types";
+import { RepositoryInfo, SearchResultFile, SearchStats } from "@/features/search";
 import useCaptureEvent from "@/hooks/useCaptureEvent";
 import { useDomain } from "@/hooks/useDomain";
 import { useNonEmptyQueryParam } from "@/hooks/useNonEmptyQueryParam";
 import { useSearchHistory } from "@/hooks/useSearchHistory";
 import { SearchQueryParams } from "@/lib/types";
-import { createPathWithQueryParams, measure, unwrapServiceError } from "@/lib/utils";
+import { createPathWithQueryParams } from "@/lib/utils";
-import { InfoCircledIcon, SymbolIcon } from "@radix-ui/react-icons";
+import { InfoCircledIcon } from "@radix-ui/react-icons";
-import { useQuery } from "@tanstack/react-query";
 import { useLocalStorage } from "@uidotdev/usehooks";
-import { AlertTriangleIcon, BugIcon, FilterIcon } from "lucide-react";
+import { AlertTriangleIcon, BugIcon, FilterIcon, RefreshCwIcon } from "lucide-react";
 import { useRouter } from "next/navigation";
 import { useCallback, useEffect, useMemo, useRef, useState } from "react";
 import { useHotkeys } from "react-hotkeys-hook";
 import { ImperativePanelHandle } from "react-resizable-panels";
-import { search } from "../../../api/(client)/client";
 import { CopyIconButton } from "../../components/copyIconButton";
 import { SearchBar } from "../../components/searchBar";
 import { TopBar } from "../../components/topBar";
+import { useStreamedSearch } from "../useStreamedSearch";
 import { CodePreviewPanel } from "./codePreviewPanel";
 import { FilterPanel } from "./filterPanel";
 import { useFilteredMatches } from "./filterPanel/useFilterMatches";
-import { SearchResultsPanel } from "./searchResultsPanel";
+import { SearchResultsPanel, SearchResultsPanelHandle } from "./searchResultsPanel";
+import { ServiceErrorException } from "@/lib/serviceError";

 interface SearchResultsPageProps {
     searchQuery: string;
     defaultMaxMatchCount: number;
+    isRegexEnabled: boolean;
+    isCaseSensitivityEnabled: boolean;
 }

 export const SearchResultsPage = ({
     searchQuery,
     defaultMaxMatchCount,
+    isRegexEnabled,
+    isCaseSensitivityEnabled,
 }: SearchResultsPageProps) => {
     const router = useRouter();
     const { setSearchHistory } = useSearchHistory();
-    const captureEvent = useCaptureEvent();
     const domain = useDomain();
     const { toast } = useToast();
+    const captureEvent = useCaptureEvent();

     // Encodes the number of matches to return in the search response.
     const _maxMatchCount = parseInt(useNonEmptyQueryParam(SearchQueryParams.matches) ?? `${defaultMaxMatchCount}`);
     const maxMatchCount = isNaN(_maxMatchCount) ? defaultMaxMatchCount : _maxMatchCount;

     const {
-        data: searchResponse,
-        isPending: isSearchPending,
-        isFetching: isFetching,
-        error
-    } = useQuery({
-        queryKey: ["search", searchQuery, maxMatchCount],
-        queryFn: () => measure(() => unwrapServiceError(search({
+        error,
+        files,
+        repoInfo,
+        timeToSearchCompletionMs,
+        timeToFirstSearchResultMs,
+        isStreaming,
+        numMatches,
+        isExhaustive,
+        stats,
+    } = useStreamedSearch({
         query: searchQuery,
         matches: maxMatchCount,
         contextLines: 3,
         whole: false,
-        })), "client.search"),
-        select: ({ data, durationMs }) => ({
-            ...data,
-            totalClientSearchDurationMs: durationMs,
-        }),
-        enabled: searchQuery.length > 0,
-        refetchOnWindowFocus: false,
-        retry: false,
-        staleTime: 0,
+        isRegexEnabled,
+        isCaseSensitivityEnabled,
     });

     useEffect(() => {
         if (error) {
             toast({
-                description: `❌ Search failed. Reason: ${error.message}`,
+                description: `❌ Search failed. Reason: ${error instanceof ServiceErrorException ? error.serviceError.message : error.message}`,
             });
         }
     }, [error, toast]);
@ -103,38 +103,51 @@ export const SearchResultsPage = ({
|
||||||
}, [searchQuery, setSearchHistory]);
|
}, [searchQuery, setSearchHistory]);
|
||||||
|
|
||||||
useEffect(() => {
|
useEffect(() => {
|
||||||
if (!searchResponse) {
|
if (isStreaming || !stats) {
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
const fileLanguages = searchResponse.files?.map(file => file.language) || [];
|
const fileLanguages = files.map(file => file.language) || [];
|
||||||
|
|
||||||
|
console.debug('timeToFirstSearchResultMs:', timeToFirstSearchResultMs);
|
||||||
|
console.debug('timeToSearchCompletionMs:', timeToSearchCompletionMs);
|
||||||
|
|
||||||
captureEvent("search_finished", {
|
captureEvent("search_finished", {
|
||||||
durationMs: searchResponse.totalClientSearchDurationMs,
|
durationMs: timeToSearchCompletionMs,
|
||||||
fileCount: searchResponse.stats.fileCount,
|
timeToSearchCompletionMs,
|
||||||
matchCount: searchResponse.stats.totalMatchCount,
|
timeToFirstSearchResultMs,
|
||||||
actualMatchCount: searchResponse.stats.actualMatchCount,
|
fileCount: stats.fileCount,
|
||||||
filesSkipped: searchResponse.stats.filesSkipped,
|
matchCount: stats.totalMatchCount,
|
||||||
contentBytesLoaded: searchResponse.stats.contentBytesLoaded,
|
actualMatchCount: stats.actualMatchCount,
|
||||||
indexBytesLoaded: searchResponse.stats.indexBytesLoaded,
|
filesSkipped: stats.filesSkipped,
|
||||||
crashes: searchResponse.stats.crashes,
|
contentBytesLoaded: stats.contentBytesLoaded,
|
||||||
shardFilesConsidered: searchResponse.stats.shardFilesConsidered,
|
indexBytesLoaded: stats.indexBytesLoaded,
|
||||||
filesConsidered: searchResponse.stats.filesConsidered,
|
crashes: stats.crashes,
|
||||||
filesLoaded: searchResponse.stats.filesLoaded,
|
shardFilesConsidered: stats.shardFilesConsidered,
|
||||||
shardsScanned: searchResponse.stats.shardsScanned,
|
filesConsidered: stats.filesConsidered,
|
||||||
shardsSkipped: searchResponse.stats.shardsSkipped,
|
filesLoaded: stats.filesLoaded,
|
||||||
shardsSkippedFilter: searchResponse.stats.shardsSkippedFilter,
|
shardsScanned: stats.shardsScanned,
|
||||||
ngramMatches: searchResponse.stats.ngramMatches,
|
shardsSkipped: stats.shardsSkipped,
|
||||||
ngramLookups: searchResponse.stats.ngramLookups,
|
shardsSkippedFilter: stats.shardsSkippedFilter,
|
||||||
wait: searchResponse.stats.wait,
|
ngramMatches: stats.ngramMatches,
|
||||||
matchTreeConstruction: searchResponse.stats.matchTreeConstruction,
|
ngramLookups: stats.ngramLookups,
|
||||||
matchTreeSearch: searchResponse.stats.matchTreeSearch,
|
wait: stats.wait,
|
||||||
regexpsConsidered: searchResponse.stats.regexpsConsidered,
|
matchTreeConstruction: stats.matchTreeConstruction,
|
||||||
flushReason: searchResponse.stats.flushReason,
|
matchTreeSearch: stats.matchTreeSearch,
|
||||||
|
regexpsConsidered: stats.regexpsConsidered,
|
||||||
|
flushReason: stats.flushReason,
|
||||||
fileLanguages,
|
fileLanguages,
|
||||||
|
isSearchExhaustive: isExhaustive,
|
||||||
});
|
});
|
||||||
}, [captureEvent, searchQuery, searchResponse]);
|
}, [
|
||||||
|
captureEvent,
|
||||||
|
files,
|
||||||
|
isStreaming,
|
||||||
|
isExhaustive,
|
||||||
|
stats,
|
||||||
|
timeToSearchCompletionMs,
|
||||||
|
timeToFirstSearchResultMs,
|
||||||
|
]);
|
||||||
|
|
||||||
const onLoadMoreResults = useCallback(() => {
|
const onLoadMoreResults = useCallback(() => {
|
||||||
const url = createPathWithQueryParams(`/${domain}/search`,
|
const url = createPathWithQueryParams(`/${domain}/search`,
|
||||||
|
|
@ -144,6 +157,13 @@ export const SearchResultsPage = ({
|
||||||
router.push(url);
|
router.push(url);
|
||||||
}, [maxMatchCount, router, searchQuery, domain]);
|
}, [maxMatchCount, router, searchQuery, domain]);
|
||||||
|
|
||||||
|
// Look for any files that are not on the default branch.
|
||||||
|
const isBranchFilteringEnabled = useMemo(() => {
|
||||||
|
return files.some((file) => {
|
||||||
|
return file.branches?.some((branch) => branch !== 'HEAD') ?? false;
|
||||||
|
});
|
||||||
|
}, [files]);
|
||||||
|
|
||||||
return (
|
return (
|
||||||
<div className="flex flex-col h-screen overflow-clip">
|
<div className="flex flex-col h-screen overflow-clip">
|
||||||
{/* TopBar */}
|
{/* TopBar */}
|
||||||
|
|
@ -152,32 +172,32 @@ export const SearchResultsPage = ({
|
||||||
>
|
>
|
||||||
<SearchBar
|
<SearchBar
|
||||||
size="sm"
|
size="sm"
|
||||||
defaultQuery={searchQuery}
|
defaults={{
|
||||||
|
isRegexEnabled,
|
||||||
|
isCaseSensitivityEnabled,
|
||||||
|
query: searchQuery,
|
||||||
|
}}
|
||||||
className="w-full"
|
className="w-full"
|
||||||
/>
|
/>
|
||||||
</TopBar>
|
</TopBar>
|
||||||
|
|
||||||
{(isSearchPending || isFetching) ? (
|
{error ? (
|
||||||
<div className="flex flex-col items-center justify-center h-full gap-2">
|
|
||||||
<SymbolIcon className="h-6 w-6 animate-spin" />
|
|
||||||
<p className="font-semibold text-center">Searching...</p>
|
|
||||||
</div>
|
|
||||||
) : error ? (
|
|
||||||
<div className="flex flex-col items-center justify-center h-full gap-2">
|
<div className="flex flex-col items-center justify-center h-full gap-2">
|
||||||
<AlertTriangleIcon className="h-6 w-6" />
|
<AlertTriangleIcon className="h-6 w-6" />
|
||||||
<p className="font-semibold text-center">Failed to search</p>
|
<p className="font-semibold text-center">Failed to search</p>
|
||||||
<p className="text-sm text-center">{error.message}</p>
|
<p className="text-sm text-center">{error instanceof ServiceErrorException ? error.serviceError.message : error.message}</p>
|
||||||
</div>
|
</div>
|
||||||
) : (
|
) : (
|
||||||
<PanelGroup
|
<PanelGroup
|
||||||
fileMatches={searchResponse.files}
|
fileMatches={files}
|
||||||
isMoreResultsButtonVisible={searchResponse.isSearchExhaustive === false}
|
|
||||||
onLoadMoreResults={onLoadMoreResults}
|
onLoadMoreResults={onLoadMoreResults}
|
||||||
isBranchFilteringEnabled={searchResponse.isBranchFilteringEnabled}
|
numMatches={numMatches}
|
||||||
repoInfo={searchResponse.repositoryInfo}
|
repoInfo={repoInfo}
|
||||||
searchDurationMs={searchResponse.totalClientSearchDurationMs}
|
searchDurationMs={timeToSearchCompletionMs}
|
||||||
numMatches={searchResponse.stats.actualMatchCount}
|
isStreaming={isStreaming}
|
||||||
searchStats={searchResponse.stats}
|
searchStats={stats}
|
||||||
|
isMoreResultsButtonVisible={!isExhaustive}
|
||||||
|
isBranchFilteringEnabled={isBranchFilteringEnabled}
|
||||||
/>
|
/>
|
||||||
)}
|
)}
|
||||||
</div>
|
</div>
|
||||||
|
|
@ -186,10 +206,11 @@ export const SearchResultsPage = ({
|
||||||
|
|
||||||
interface PanelGroupProps {
|
interface PanelGroupProps {
|
||||||
fileMatches: SearchResultFile[];
|
fileMatches: SearchResultFile[];
|
||||||
isMoreResultsButtonVisible?: boolean;
|
|
||||||
onLoadMoreResults: () => void;
|
onLoadMoreResults: () => void;
|
||||||
|
isStreaming: boolean;
|
||||||
|
isMoreResultsButtonVisible?: boolean;
|
||||||
isBranchFilteringEnabled: boolean;
|
isBranchFilteringEnabled: boolean;
|
||||||
repoInfo: RepositoryInfo[];
|
repoInfo: Record<number, RepositoryInfo>;
|
||||||
searchDurationMs: number;
|
searchDurationMs: number;
|
||||||
numMatches: number;
|
numMatches: number;
|
||||||
searchStats?: SearchStats;
|
searchStats?: SearchStats;
|
||||||
|
|
@ -198,9 +219,10 @@ interface PanelGroupProps {
|
||||||
const PanelGroup = ({
|
const PanelGroup = ({
|
||||||
fileMatches,
|
fileMatches,
|
||||||
isMoreResultsButtonVisible,
|
isMoreResultsButtonVisible,
|
||||||
|
isStreaming,
|
||||||
onLoadMoreResults,
|
onLoadMoreResults,
|
||||||
isBranchFilteringEnabled,
|
isBranchFilteringEnabled,
|
||||||
repoInfo: _repoInfo,
|
repoInfo,
|
||||||
searchDurationMs: _searchDurationMs,
|
searchDurationMs: _searchDurationMs,
|
||||||
numMatches,
|
numMatches,
|
||||||
searchStats,
|
searchStats,
|
||||||
|
|
@ -208,6 +230,7 @@ const PanelGroup = ({
|
||||||
const [previewedFile, setPreviewedFile] = useState<SearchResultFile | undefined>(undefined);
|
const [previewedFile, setPreviewedFile] = useState<SearchResultFile | undefined>(undefined);
|
||||||
const filteredFileMatches = useFilteredMatches(fileMatches);
|
const filteredFileMatches = useFilteredMatches(fileMatches);
|
||||||
const filterPanelRef = useRef<ImperativePanelHandle>(null);
|
const filterPanelRef = useRef<ImperativePanelHandle>(null);
|
||||||
|
const searchResultsPanelRef = useRef<SearchResultsPanelHandle>(null);
|
||||||
const [selectedMatchIndex, setSelectedMatchIndex] = useState(0);
|
const [selectedMatchIndex, setSelectedMatchIndex] = useState(0);
|
||||||
|
|
||||||
const [isFilterPanelCollapsed, setIsFilterPanelCollapsed] = useLocalStorage('isFilterPanelCollapsed', false);
|
const [isFilterPanelCollapsed, setIsFilterPanelCollapsed] = useLocalStorage('isFilterPanelCollapsed', false);
|
||||||
|
|
@ -228,13 +251,6 @@ const PanelGroup = ({
|
||||||
return Math.round(_searchDurationMs);
|
return Math.round(_searchDurationMs);
|
||||||
}, [_searchDurationMs]);
|
}, [_searchDurationMs]);
|
||||||
|
|
||||||
const repoInfo = useMemo(() => {
|
|
||||||
return _repoInfo.reduce((acc, repo) => {
|
|
||||||
acc[repo.id] = repo;
|
|
||||||
return acc;
|
|
||||||
}, {} as Record<number, RepositoryInfo>);
|
|
||||||
}, [_repoInfo]);
|
|
||||||
|
|
||||||
return (
|
return (
|
||||||
<ResizablePanelGroup
|
<ResizablePanelGroup
|
||||||
direction="horizontal"
|
direction="horizontal"
|
||||||
|
|
@ -255,6 +271,10 @@ const PanelGroup = ({
|
||||||
<FilterPanel
|
<FilterPanel
|
||||||
matches={fileMatches}
|
matches={fileMatches}
|
||||||
repoInfo={repoInfo}
|
repoInfo={repoInfo}
|
||||||
|
isStreaming={isStreaming}
|
||||||
|
onFilterChange={() => {
|
||||||
|
searchResultsPanelRef.current?.resetScroll();
|
||||||
|
}}
|
||||||
/>
|
/>
|
||||||
</ResizablePanel>
|
</ResizablePanel>
|
||||||
{isFilterPanelCollapsed && (
|
{isFilterPanelCollapsed && (
|
||||||
|
|
@ -291,6 +311,16 @@ const PanelGroup = ({
|
||||||
order={2}
|
order={2}
|
||||||
>
|
>
|
||||||
<div className="py-1 px-2 flex flex-row items-center">
|
<div className="py-1 px-2 flex flex-row items-center">
|
||||||
|
{isStreaming ? (
|
||||||
|
<>
|
||||||
|
<RefreshCwIcon className="h-4 w-4 animate-spin mr-2" />
|
||||||
|
<p className="text-sm font-medium mr-1">Searching...</p>
|
||||||
|
{numMatches > 0 && (
|
||||||
|
<p className="text-sm font-medium">{`Found ${numMatches} matches in ${fileMatches.length} ${fileMatches.length > 1 ? 'files' : 'file'}`}</p>
|
||||||
|
)}
|
||||||
|
</>
|
||||||
|
) : (
|
||||||
|
<>
|
||||||
<Tooltip>
|
<Tooltip>
|
||||||
<TooltipTrigger asChild>
|
<TooltipTrigger asChild>
|
||||||
<InfoCircledIcon className="w-4 h-4 mr-2" />
|
<InfoCircledIcon className="w-4 h-4 mr-2" />
|
||||||
|
|
@ -327,9 +357,12 @@ const PanelGroup = ({
|
||||||
(load more)
|
(load more)
|
||||||
</div>
|
</div>
|
||||||
)}
|
)}
|
||||||
|
</>
|
||||||
|
)}
|
||||||
</div>
|
</div>
|
||||||
{filteredFileMatches.length > 0 ? (
|
{filteredFileMatches.length > 0 ? (
|
||||||
<SearchResultsPanel
|
<SearchResultsPanel
|
||||||
|
ref={searchResultsPanelRef}
|
||||||
fileMatches={filteredFileMatches}
|
fileMatches={filteredFileMatches}
|
||||||
onOpenFilePreview={(fileMatch, matchIndex) => {
|
onOpenFilePreview={(fileMatch, matchIndex) => {
|
||||||
setSelectedMatchIndex(matchIndex ?? 0);
|
setSelectedMatchIndex(matchIndex ?? 0);
|
||||||
|
|
@ -340,6 +373,11 @@ const PanelGroup = ({
|
||||||
isBranchFilteringEnabled={isBranchFilteringEnabled}
|
isBranchFilteringEnabled={isBranchFilteringEnabled}
|
||||||
repoInfo={repoInfo}
|
repoInfo={repoInfo}
|
||||||
/>
|
/>
|
||||||
|
) : isStreaming ? (
|
||||||
|
<div className="flex flex-col items-center justify-center h-full gap-2">
|
||||||
|
<RefreshCwIcon className="h-6 w-6 animate-spin" />
|
||||||
|
<p className="font-semibold text-center">Searching...</p>
|
||||||
|
</div>
|
||||||
) : (
|
) : (
|
||||||
<div className="flex flex-col items-center justify-center h-full">
|
<div className="flex flex-col items-center justify-center h-full">
|
||||||
<p className="text-sm text-muted-foreground">No results found</p>
|
<p className="text-sm text-muted-foreground">No results found</p>
|
||||||
|
|
|
||||||
|
|
@ -1,6 +1,6 @@
|
||||||
'use client';
|
'use client';
|
||||||
|
|
||||||
import { SearchResultFile, SearchResultChunk } from "@/features/search/types";
|
import { SearchResultFile, SearchResultChunk } from "@/features/search";
|
||||||
import { LightweightCodeHighlighter } from "@/app/[domain]/components/lightweightCodeHighlighter";
|
import { LightweightCodeHighlighter } from "@/app/[domain]/components/lightweightCodeHighlighter";
|
||||||
import Link from "next/link";
|
import Link from "next/link";
|
||||||
import { getBrowsePath } from "@/app/[domain]/browse/hooks/utils";
|
import { getBrowsePath } from "@/app/[domain]/browse/hooks/utils";
|
||||||
|
|
|
||||||
|
|
@ -5,7 +5,7 @@ import { Separator } from "@/components/ui/separator";
|
||||||
import { DoubleArrowDownIcon, DoubleArrowUpIcon } from "@radix-ui/react-icons";
|
import { DoubleArrowDownIcon, DoubleArrowUpIcon } from "@radix-ui/react-icons";
|
||||||
import { useMemo } from "react";
|
import { useMemo } from "react";
|
||||||
import { FileMatch } from "./fileMatch";
|
import { FileMatch } from "./fileMatch";
|
||||||
import { RepositoryInfo, SearchResultFile } from "@/features/search/types";
|
import { RepositoryInfo, SearchResultFile } from "@/features/search";
|
||||||
import { Button } from "@/components/ui/button";
|
import { Button } from "@/components/ui/button";
|
||||||
|
|
||||||
export const MAX_MATCHES_TO_PREVIEW = 3;
|
export const MAX_MATCHES_TO_PREVIEW = 3;
|
||||||
|
|
@ -75,7 +75,7 @@ export const FileMatchContainer = ({
|
||||||
}
|
}
|
||||||
|
|
||||||
return `${branches[0]}${branches.length > 1 ? ` +${branches.length - 1}` : ''}`;
|
return `${branches[0]}${branches.length > 1 ? ` +${branches.length - 1}` : ''}`;
|
||||||
}, [isBranchFilteringEnabled, branches]);
|
}, [branches, isBranchFilteringEnabled]);
|
||||||
|
|
||||||
const repo = useMemo(() => {
|
const repo = useMemo(() => {
|
||||||
return repoInfo[file.repositoryId];
|
return repoInfo[file.repositoryId];
|
||||||
|
|
|
||||||
|
|
@ -1,10 +1,11 @@
|
||||||
'use client';
|
'use client';
|
||||||
|
|
||||||
import { RepositoryInfo, SearchResultFile } from "@/features/search/types";
|
import { RepositoryInfo, SearchResultFile } from "@/features/search";
|
||||||
import { FileMatchContainer, MAX_MATCHES_TO_PREVIEW } from "./fileMatchContainer";
|
|
||||||
import { useVirtualizer, VirtualItem } from "@tanstack/react-virtual";
|
import { useVirtualizer, VirtualItem } from "@tanstack/react-virtual";
|
||||||
import { useCallback, useEffect, useRef, useState } from "react";
|
import { useDebounce } from "@uidotdev/usehooks";
|
||||||
import { useDebounce, usePrevious } from "@uidotdev/usehooks";
|
import { forwardRef, useCallback, useEffect, useImperativeHandle, useRef } from "react";
|
||||||
|
import { useMap } from "usehooks-ts";
|
||||||
|
import { FileMatchContainer, MAX_MATCHES_TO_PREVIEW } from "./fileMatchContainer";
|
||||||
|
|
||||||
interface SearchResultsPanelProps {
|
interface SearchResultsPanelProps {
|
||||||
fileMatches: SearchResultFile[];
|
fileMatches: SearchResultFile[];
|
||||||
|
|
@ -15,6 +16,10 @@ interface SearchResultsPanelProps {
|
||||||
repoInfo: Record<number, RepositoryInfo>;
|
repoInfo: Record<number, RepositoryInfo>;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
export interface SearchResultsPanelHandle {
|
||||||
|
resetScroll: () => void;
|
||||||
|
}
|
||||||
|
|
||||||
const ESTIMATED_LINE_HEIGHT_PX = 20;
|
const ESTIMATED_LINE_HEIGHT_PX = 20;
|
||||||
const ESTIMATED_NUMBER_OF_LINES_PER_CODE_CELL = 10;
|
const ESTIMATED_NUMBER_OF_LINES_PER_CODE_CELL = 10;
|
||||||
const ESTIMATED_MATCH_CONTAINER_HEIGHT_PX = 30;
|
const ESTIMATED_MATCH_CONTAINER_HEIGHT_PX = 30;
|
||||||
|
|
@ -22,17 +27,25 @@ const ESTIMATED_MATCH_CONTAINER_HEIGHT_PX = 30;
|
||||||
type ScrollHistoryState = {
|
type ScrollHistoryState = {
|
||||||
scrollOffset?: number;
|
scrollOffset?: number;
|
||||||
measurementsCache?: VirtualItem[];
|
measurementsCache?: VirtualItem[];
|
||||||
showAllMatchesStates?: boolean[];
|
showAllMatchesMap?: [string, boolean][];
|
||||||
}
|
}
|
||||||
|
|
||||||
export const SearchResultsPanel = ({
|
/**
|
||||||
|
* Unique key for a given file match. Used to store the "show all matches" state for a
|
||||||
|
* given file match.
|
||||||
|
*/
|
||||||
|
const getFileMatchKey = (fileMatch: SearchResultFile) => {
|
||||||
|
return `${fileMatch.repository}-${fileMatch.fileName.text}`;
|
||||||
|
}
|
||||||
|
|
||||||
|
export const SearchResultsPanel = forwardRef<SearchResultsPanelHandle, SearchResultsPanelProps>(({
|
||||||
fileMatches,
|
fileMatches,
|
||||||
onOpenFilePreview,
|
onOpenFilePreview,
|
||||||
isLoadMoreButtonVisible,
|
isLoadMoreButtonVisible,
|
||||||
onLoadMoreButtonClicked,
|
onLoadMoreButtonClicked,
|
||||||
isBranchFilteringEnabled,
|
isBranchFilteringEnabled,
|
||||||
repoInfo,
|
repoInfo,
|
||||||
}: SearchResultsPanelProps) => {
|
}, ref) => {
|
||||||
const parentRef = useRef<HTMLDivElement>(null);
|
const parentRef = useRef<HTMLDivElement>(null);
|
||||||
|
|
||||||
// Restore the scroll offset, measurements cache, and other state from the history
|
// Restore the scroll offset, measurements cache, and other state from the history
|
||||||
|
|
@ -42,17 +55,17 @@ export const SearchResultsPanel = ({
|
||||||
const {
|
const {
|
||||||
scrollOffset: restoreOffset,
|
scrollOffset: restoreOffset,
|
||||||
measurementsCache: restoreMeasurementsCache,
|
measurementsCache: restoreMeasurementsCache,
|
||||||
showAllMatchesStates: restoreShowAllMatchesStates,
|
showAllMatchesMap: restoreShowAllMatchesStates,
|
||||||
} = history.state as ScrollHistoryState;
|
} = (history.state ?? {}) as ScrollHistoryState;
|
||||||
|
|
||||||
const [showAllMatchesStates, setShowAllMatchesStates] = useState(restoreShowAllMatchesStates || Array(fileMatches.length).fill(false));
|
const [showAllMatchesMap, showAllMatchesActions] = useMap<string, boolean>(restoreShowAllMatchesStates || []);
|
||||||
|
|
||||||
const virtualizer = useVirtualizer({
|
const virtualizer = useVirtualizer({
|
||||||
count: fileMatches.length,
|
count: fileMatches.length,
|
||||||
getScrollElement: () => parentRef.current,
|
getScrollElement: () => parentRef.current,
|
||||||
estimateSize: (index) => {
|
estimateSize: (index) => {
|
||||||
const fileMatch = fileMatches[index];
|
const fileMatch = fileMatches[index];
|
||||||
const showAllMatches = showAllMatchesStates[index];
|
const showAllMatches = showAllMatchesMap.get(getFileMatchKey(fileMatch));
|
||||||
|
|
||||||
// Quick guesstimation ;) This needs to be quick since the virtualizer will
|
// Quick guesstimation ;) This needs to be quick since the virtualizer will
|
||||||
// run this upfront for all items in the list.
|
// run this upfront for all items in the list.
|
||||||
|
|
@ -73,38 +86,33 @@ export const SearchResultsPanel = ({
|
||||||
debug: false,
|
debug: false,
|
||||||
});
|
});
|
||||||
|
|
||||||
// When the number of file matches changes, we need to reset our scroll state.
|
const resetScroll = useCallback(() => {
|
||||||
const prevFileMatches = usePrevious(fileMatches);
|
|
||||||
useEffect(() => {
|
|
||||||
if (!prevFileMatches) {
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
|
|
||||||
if (prevFileMatches.length !== fileMatches.length) {
|
|
||||||
setShowAllMatchesStates(Array(fileMatches.length).fill(false));
|
|
||||||
virtualizer.scrollToIndex(0);
|
virtualizer.scrollToIndex(0);
|
||||||
}
|
}, [virtualizer]);
|
||||||
}, [fileMatches.length, prevFileMatches, virtualizer]);
|
|
||||||
|
// Expose the resetScroll function to parent components
|
||||||
|
useImperativeHandle(ref, () => ({
|
||||||
|
resetScroll,
|
||||||
|
}), [resetScroll]);
|
||||||
|
|
||||||
|
|
||||||
// Save the scroll state to the history stack.
|
// Save the scroll state to the history stack.
|
||||||
const debouncedScrollOffset = useDebounce(virtualizer.scrollOffset, 100);
|
const debouncedScrollOffset = useDebounce(virtualizer.scrollOffset, 500);
|
||||||
useEffect(() => {
|
useEffect(() => {
|
||||||
history.replaceState(
|
history.replaceState(
|
||||||
{
|
{
|
||||||
scrollOffset: debouncedScrollOffset ?? undefined,
|
scrollOffset: debouncedScrollOffset ?? undefined,
|
||||||
measurementsCache: virtualizer.measurementsCache,
|
measurementsCache: virtualizer.measurementsCache,
|
||||||
showAllMatchesStates,
|
showAllMatchesMap: Array.from(showAllMatchesMap.entries()),
|
||||||
} satisfies ScrollHistoryState,
|
} satisfies ScrollHistoryState,
|
||||||
'',
|
'',
|
||||||
window.location.href
|
window.location.href
|
||||||
);
|
);
|
||||||
}, [debouncedScrollOffset, virtualizer.measurementsCache, showAllMatchesStates]);
|
}, [debouncedScrollOffset, virtualizer.measurementsCache, showAllMatchesMap]);
|
||||||
|
|
||||||
const onShowAllMatchesButtonClicked = useCallback((index: number) => {
|
const onShowAllMatchesButtonClicked = useCallback((fileMatchKey: string, index: number) => {
|
||||||
const states = [...showAllMatchesStates];
|
const wasShown = showAllMatchesMap.get(fileMatchKey) ?? false;
|
||||||
const wasShown = states[index];
|
showAllMatchesActions.set(fileMatchKey, !wasShown);
|
||||||
states[index] = !wasShown;
|
|
||||||
setShowAllMatchesStates(states);
|
|
||||||
|
|
||||||
// When collapsing, scroll to the top of the file match container. This ensures
|
// When collapsing, scroll to the top of the file match container. This ensures
|
||||||
// that the focused "show fewer matches" button is visible.
|
// that the focused "show fewer matches" button is visible.
|
||||||
|
|
@ -113,7 +121,7 @@ export const SearchResultsPanel = ({
|
||||||
align: 'start'
|
align: 'start'
|
||||||
});
|
});
|
||||||
}
|
}
|
||||||
}, [showAllMatchesStates, virtualizer]);
|
}, [showAllMatchesActions, showAllMatchesMap, virtualizer]);
|
||||||
|
|
||||||
|
|
||||||
return (
|
return (
|
||||||
|
|
@ -153,9 +161,9 @@ export const SearchResultsPanel = ({
|
||||||
onOpenFilePreview={(matchIndex) => {
|
onOpenFilePreview={(matchIndex) => {
|
||||||
onOpenFilePreview(file, matchIndex);
|
onOpenFilePreview(file, matchIndex);
|
||||||
}}
|
}}
|
||||||
showAllMatches={showAllMatchesStates[virtualRow.index]}
|
showAllMatches={showAllMatchesMap.get(getFileMatchKey(file)) ?? false}
|
||||||
onShowAllMatchesButtonClicked={() => {
|
onShowAllMatchesButtonClicked={() => {
|
||||||
onShowAllMatchesButtonClicked(virtualRow.index);
|
onShowAllMatchesButtonClicked(getFileMatchKey(file), virtualRow.index);
|
||||||
}}
|
}}
|
||||||
isBranchFilteringEnabled={isBranchFilteringEnabled}
|
isBranchFilteringEnabled={isBranchFilteringEnabled}
|
||||||
repoInfo={repoInfo}
|
repoInfo={repoInfo}
|
||||||
|
|
@ -177,4 +185,6 @@ export const SearchResultsPanel = ({
|
||||||
)}
|
)}
|
||||||
</div>
|
</div>
|
||||||
)
|
)
|
||||||
}
|
});
|
||||||
|
|
||||||
|
SearchResultsPanel.displayName = 'SearchResultsPanel';
|
||||||
|
|
|
||||||
|
|
@ -4,13 +4,19 @@ import { SearchResultsPage } from "./components/searchResultsPage";
|
||||||
|
|
||||||
interface SearchPageProps {
|
interface SearchPageProps {
|
||||||
params: Promise<{ domain: string }>;
|
params: Promise<{ domain: string }>;
|
||||||
searchParams: Promise<{ query?: string }>;
|
searchParams: Promise<{
|
||||||
|
query?: string;
|
||||||
|
isRegexEnabled?: "true" | "false";
|
||||||
|
isCaseSensitivityEnabled?: "true" | "false";
|
||||||
|
}>;
|
||||||
}
|
}
|
||||||
|
|
||||||
export default async function SearchPage(props: SearchPageProps) {
|
export default async function SearchPage(props: SearchPageProps) {
|
||||||
const { domain } = await props.params;
|
const { domain } = await props.params;
|
||||||
const searchParams = await props.searchParams;
|
const searchParams = await props.searchParams;
|
||||||
const query = searchParams?.query;
|
const query = searchParams?.query;
|
||||||
|
const isRegexEnabled = searchParams?.isRegexEnabled === "true";
|
||||||
|
const isCaseSensitivityEnabled = searchParams?.isCaseSensitivityEnabled === "true";
|
||||||
|
|
||||||
if (query === undefined || query.length === 0) {
|
if (query === undefined || query.length === 0) {
|
||||||
return <SearchLandingPage domain={domain} />
|
return <SearchLandingPage domain={domain} />
|
||||||
|
|
@ -20,6 +26,8 @@ export default async function SearchPage(props: SearchPageProps) {
|
||||||
<SearchResultsPage
|
<SearchResultsPage
|
||||||
searchQuery={query}
|
searchQuery={query}
|
||||||
defaultMaxMatchCount={env.DEFAULT_MAX_MATCH_COUNT}
|
defaultMaxMatchCount={env.DEFAULT_MAX_MATCH_COUNT}
|
||||||
|
isRegexEnabled={isRegexEnabled}
|
||||||
|
isCaseSensitivityEnabled={isCaseSensitivityEnabled}
|
||||||
/>
|
/>
|
||||||
)
|
)
|
||||||
}
|
}
|
||||||
|
|
|
||||||
288  packages/web/src/app/[domain]/search/useStreamedSearch.ts  Normal file
|
|
@ -0,0 +1,288 @@
|
||||||
|
'use client';
|
||||||
|
|
||||||
|
import { RepositoryInfo, SearchRequest, SearchResultFile, SearchStats, StreamedSearchResponse } from '@/features/search';
|
||||||
|
import { ServiceErrorException } from '@/lib/serviceError';
|
||||||
|
import { isServiceError } from '@/lib/utils';
|
||||||
|
import * as Sentry from '@sentry/nextjs';
|
||||||
|
import { useCallback, useEffect, useRef, useState } from 'react';
|
||||||
|
|
||||||
|
interface CacheEntry {
|
||||||
|
files: SearchResultFile[];
|
||||||
|
repoInfo: Record<number, RepositoryInfo>;
|
||||||
|
numMatches: number;
|
||||||
|
timeToSearchCompletionMs: number;
|
||||||
|
timeToFirstSearchResultMs: number;
|
||||||
|
timestamp: number;
|
||||||
|
isExhaustive: boolean;
|
||||||
|
}
|
||||||
|
|
||||||
|
const searchCache = new Map<string, CacheEntry>();
|
||||||
|
const CACHE_TTL = 5 * 60 * 1000;
|
||||||
|
|
||||||
|
const createCacheKey = (params: SearchRequest): string => {
|
||||||
|
return JSON.stringify({
|
||||||
|
query: params.query,
|
||||||
|
matches: params.matches,
|
||||||
|
contextLines: params.contextLines,
|
||||||
|
whole: params.whole,
|
||||||
|
isRegexEnabled: params.isRegexEnabled,
|
||||||
|
isCaseSensitivityEnabled: params.isCaseSensitivityEnabled,
|
||||||
|
});
|
||||||
|
};
|
||||||
|
|
||||||
|
const isCacheValid = (entry: CacheEntry): boolean => {
|
||||||
|
return Date.now() - entry.timestamp < CACHE_TTL;
|
||||||
|
};
|
||||||
|
|
||||||
|
export const useStreamedSearch = ({ query, matches, contextLines, whole, isRegexEnabled, isCaseSensitivityEnabled }: SearchRequest) => {
|
||||||
|
const [state, setState] = useState<{
|
||||||
|
isStreaming: boolean,
|
||||||
|
isExhaustive: boolean,
|
||||||
|
error: Error | null,
|
||||||
|
files: SearchResultFile[],
|
||||||
|
repoInfo: Record<number, RepositoryInfo>,
|
||||||
|
timeToSearchCompletionMs: number,
|
||||||
|
timeToFirstSearchResultMs: number,
|
||||||
|
numMatches: number,
|
||||||
|
stats?: SearchStats,
|
||||||
|
}>({
|
||||||
|
isStreaming: false,
|
||||||
|
isExhaustive: false,
|
||||||
|
error: null,
|
||||||
|
files: [],
|
||||||
|
repoInfo: {},
|
||||||
|
timeToSearchCompletionMs: 0,
|
||||||
|
timeToFirstSearchResultMs: 0,
|
||||||
|
numMatches: 0,
|
||||||
|
stats: undefined,
|
||||||
|
});
|
||||||
|
|
||||||
|
const abortControllerRef = useRef<AbortController | null>(null);
|
||||||
|
|
||||||
|
const cancel = useCallback(() => {
|
||||||
|
if (abortControllerRef.current) {
|
||||||
|
abortControllerRef.current.abort();
|
||||||
|
abortControllerRef.current = null;
|
||||||
|
}
|
||||||
|
setState(prev => ({
|
||||||
|
...prev,
|
||||||
|
isStreaming: false,
|
||||||
|
}));
|
||||||
|
}, []);
|
||||||
|
|
||||||
|
useEffect(() => {
|
||||||
|
const search = async () => {
|
||||||
|
const startTime = performance.now();
|
||||||
|
|
||||||
|
if (abortControllerRef.current) {
|
||||||
|
abortControllerRef.current.abort();
|
||||||
|
}
|
||||||
|
abortControllerRef.current = new AbortController();
|
||||||
|
|
||||||
|
const cacheKey = createCacheKey({
|
||||||
|
query,
|
||||||
|
matches,
|
||||||
|
contextLines,
|
||||||
|
whole,
|
||||||
|
isRegexEnabled,
|
||||||
|
isCaseSensitivityEnabled,
|
||||||
|
});
|
||||||
|
|
||||||
|
// Check if we have a valid cached result. If so, use it.
|
||||||
|
const cachedEntry = searchCache.get(cacheKey);
|
||||||
|
if (cachedEntry && isCacheValid(cachedEntry)) {
|
||||||
|
console.debug('Using cached search results');
|
||||||
|
setState({
|
||||||
|
isStreaming: false,
|
||||||
|
isExhaustive: cachedEntry.isExhaustive,
|
||||||
|
error: null,
|
||||||
|
files: cachedEntry.files,
|
||||||
|
repoInfo: cachedEntry.repoInfo,
|
||||||
|
timeToSearchCompletionMs: cachedEntry.timeToSearchCompletionMs,
|
||||||
|
timeToFirstSearchResultMs: cachedEntry.timeToFirstSearchResultMs,
|
||||||
|
numMatches: cachedEntry.numMatches,
|
||||||
|
});
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
setState({
|
||||||
|
isStreaming: true,
|
||||||
|
isExhaustive: false,
|
||||||
|
error: null,
|
||||||
|
files: [],
|
||||||
|
repoInfo: {},
|
||||||
|
timeToSearchCompletionMs: 0,
|
||||||
|
timeToFirstSearchResultMs: 0,
|
||||||
|
numMatches: 0,
|
||||||
|
});
|
||||||
|
|
||||||
|
try {
|
||||||
|
const response = await fetch('/api/stream_search', {
|
||||||
|
method: 'POST',
|
||||||
|
headers: {
|
||||||
|
'Content-Type': 'application/json',
|
||||||
|
},
|
||||||
|
body: JSON.stringify({
|
||||||
|
query,
|
||||||
|
matches,
|
||||||
|
contextLines,
|
||||||
|
whole,
|
||||||
|
isRegexEnabled,
|
||||||
|
isCaseSensitivityEnabled,
|
||||||
|
}),
|
||||||
|
signal: abortControllerRef.current.signal,
|
||||||
|
});
|
||||||
|
|
||||||
|
if (!response.ok) {
|
||||||
|
// Check if this is a service error response
|
||||||
|
const contentType = response.headers.get('content-type');
|
||||||
|
if (contentType?.includes('application/json')) {
|
||||||
|
const errorData = await response.json();
|
||||||
|
if (isServiceError(errorData)) {
|
||||||
|
throw new ServiceErrorException(errorData);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
throw new Error(`HTTP error! status: ${response.status}`);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!response.body) {
|
||||||
|
throw new Error('No response body');
|
||||||
|
}
|
||||||
|
|
||||||
|
const reader = response.body.getReader();
|
||||||
|
const decoder = new TextDecoder();
|
||||||
|
let buffer = '';
|
||||||
|
let numMessagesProcessed = 0;
|
||||||
|
|
||||||
|
while (true as boolean) {
|
||||||
|
const { done, value } = await reader.read();
|
||||||
|
|
||||||
|
if (done) {
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Decode the chunk and add to buffer
|
||||||
|
buffer += decoder.decode(value, { stream: true });
|
||||||
|
|
||||||
|
// Process complete SSE messages (separated by \n\n)
|
||||||
|
const messages = buffer.split('\n\n');
|
||||||
|
|
||||||
|
// Keep the last element (potentially incomplete message) in the buffer for the next chunk.
|
||||||
|
// Stream chunks can split messages mid-way, so we only process complete messages.
|
||||||
|
buffer = messages.pop() || '';
|
||||||
|
|
||||||
|
for (const message of messages) {
|
||||||
|
if (!message.trim()) {
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
// SSE messages start with "data: "
|
||||||
|
const dataMatch = message.match(/^data: (.+)$/);
|
||||||
|
if (!dataMatch) {
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
const data = dataMatch[1];
|
||||||
|
|
||||||
|
// Check for completion signal
|
||||||
|
if (data === '[DONE]') {
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
|
||||||
|
const response: StreamedSearchResponse = JSON.parse(data);
|
||||||
|
const isFirstMessage = numMessagesProcessed === 0;
|
||||||
|
switch (response.type) {
|
||||||
|
case 'chunk':
|
||||||
|
setState(prev => ({
|
||||||
|
...prev,
|
||||||
|
files: [
|
||||||
|
...prev.files,
|
||||||
|
...response.files
|
||||||
|
],
|
||||||
|
repoInfo: {
|
||||||
|
...prev.repoInfo,
|
||||||
|
...response.repositoryInfo.reduce((acc, repo) => {
|
||||||
|
acc[repo.id] = repo;
|
||||||
|
return acc;
|
||||||
|
}, {} as Record<number, RepositoryInfo>),
|
||||||
|
},
|
||||||
|
numMatches: prev.numMatches + response.stats.actualMatchCount,
|
||||||
|
...(isFirstMessage ? {
|
||||||
|
timeToFirstSearchResultMs: performance.now() - startTime,
|
||||||
|
} : {}),
|
||||||
|
}));
|
||||||
|
break;
|
||||||
|
case 'final':
|
||||||
|
setState(prev => ({
|
||||||
|
...prev,
|
||||||
|
isExhaustive: response.isSearchExhaustive,
|
||||||
|
stats: response.accumulatedStats,
|
||||||
|
...(isFirstMessage ? {
|
||||||
|
timeToFirstSearchResultMs: performance.now() - startTime,
|
||||||
|
} : {}),
|
||||||
|
}));
|
||||||
|
break;
|
||||||
|
case 'error':
|
||||||
|
throw new ServiceErrorException(response.error);
|
||||||
|
}
|
||||||
|
|
||||||
|
numMessagesProcessed++;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
const timeToSearchCompletionMs = performance.now() - startTime;
|
||||||
|
setState(prev => {
|
||||||
|
// Cache the final results after the stream has completed.
|
||||||
|
searchCache.set(cacheKey, {
|
||||||
|
files: prev.files,
|
||||||
|
repoInfo: prev.repoInfo,
|
||||||
|
isExhaustive: prev.isExhaustive,
|
||||||
|
numMatches: prev.numMatches,
|
||||||
|
timeToFirstSearchResultMs: prev.timeToFirstSearchResultMs,
|
||||||
|
timeToSearchCompletionMs,
|
||||||
|
timestamp: Date.now(),
|
||||||
|
});
|
||||||
|
return {
|
||||||
|
...prev,
|
||||||
|
timeToSearchCompletionMs,
|
||||||
|
isStreaming: false,
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
} catch (error) {
|
||||||
|
if ((error as Error).name === 'AbortError') {
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
console.error(error);
|
||||||
|
Sentry.captureException(error);
|
||||||
|
const timeToSearchCompletionMs = performance.now() - startTime;
|
||||||
|
setState(prev => ({
|
||||||
|
...prev,
|
||||||
|
isStreaming: false,
|
||||||
|
timeToSearchCompletionMs,
|
||||||
|
error: error instanceof Error ? error : null,
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
search();
|
||||||
|
|
||||||
|
return () => {
|
||||||
|
cancel();
|
||||||
|
}
|
||||||
|
}, [
|
||||||
|
query,
|
||||||
|
matches,
|
||||||
|
contextLines,
|
||||||
|
whole,
|
||||||
|
isRegexEnabled,
|
||||||
|
isCaseSensitivityEnabled,
|
||||||
|
cancel,
|
||||||
|
]);
|
||||||
|
|
||||||
|
return {
|
||||||
|
...state,
|
||||||
|
cancel,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
@ -4,10 +4,12 @@ import { ServiceError } from "@/lib/serviceError";
|
||||||
import { GetVersionResponse, GetReposResponse } from "@/lib/types";
|
import { GetVersionResponse, GetReposResponse } from "@/lib/types";
|
||||||
import { isServiceError } from "@/lib/utils";
|
import { isServiceError } from "@/lib/utils";
|
||||||
import {
|
import {
|
||||||
FileSourceResponse,
|
|
||||||
FileSourceRequest,
|
|
||||||
SearchRequest,
|
SearchRequest,
|
||||||
SearchResponse,
|
SearchResponse,
|
||||||
|
} from "@/features/search";
|
||||||
|
import {
|
||||||
|
FileSourceRequest,
|
||||||
|
FileSourceResponse,
|
||||||
} from "@/features/search/types";
|
} from "@/features/search/types";
|
||||||
import {
|
import {
|
||||||
FindRelatedSymbolsRequest,
|
FindRelatedSymbolsRequest,
|
||||||
|
|
|
||||||
|
|
@ -1,10 +1,9 @@
|
||||||
'use server';
|
'use server';
|
||||||
|
|
||||||
import { search } from "@/features/search/searchApi";
|
import { search, searchRequestSchema } from "@/features/search";
|
||||||
import { isServiceError } from "@/lib/utils";
|
import { isServiceError } from "@/lib/utils";
|
||||||
import { NextRequest } from "next/server";
|
import { NextRequest } from "next/server";
|
||||||
import { schemaValidationError, serviceErrorResponse } from "@/lib/serviceError";
|
import { schemaValidationError, serviceErrorResponse } from "@/lib/serviceError";
|
||||||
import { searchRequestSchema } from "@/features/search/schemas";
|
|
||||||
|
|
||||||
export const POST = async (request: NextRequest) => {
|
export const POST = async (request: NextRequest) => {
|
||||||
const body = await request.json();
|
const body = await request.json();
|
||||||
|
|
@ -15,7 +14,17 @@ export const POST = async (request: NextRequest) => {
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
|
|
||||||
const response = await search(parsed.data);
|
const {
|
||||||
|
query,
|
||||||
|
...options
|
||||||
|
} = parsed.data;
|
||||||
|
|
||||||
|
const response = await search({
|
||||||
|
queryType: 'string',
|
||||||
|
query,
|
||||||
|
options,
|
||||||
|
});
|
||||||
|
|
||||||
if (isServiceError(response)) {
|
if (isServiceError(response)) {
|
||||||
return serviceErrorResponse(response);
|
return serviceErrorResponse(response);
|
||||||
}
|
}
|
||||||
|
|
|
||||||
|
|
@ -4,7 +4,7 @@ import { getFileSource } from "@/features/search/fileSourceApi";
|
||||||
import { schemaValidationError, serviceErrorResponse } from "@/lib/serviceError";
|
import { schemaValidationError, serviceErrorResponse } from "@/lib/serviceError";
|
||||||
import { isServiceError } from "@/lib/utils";
|
import { isServiceError } from "@/lib/utils";
|
||||||
import { NextRequest } from "next/server";
|
import { NextRequest } from "next/server";
|
||||||
import { fileSourceRequestSchema } from "@/features/search/schemas";
|
import { fileSourceRequestSchema } from "@/features/search/types";
|
||||||
|
|
||||||
export const POST = async (request: NextRequest) => {
|
export const POST = async (request: NextRequest) => {
|
||||||
const body = await request.json();
|
const body = await request.json();
|
||||||
|
|
|
||||||
39  packages/web/src/app/api/(server)/stream_search/route.ts  Normal file
|
|
@ -0,0 +1,39 @@
'use server';

import { streamSearch, searchRequestSchema } from '@/features/search';
import { schemaValidationError, serviceErrorResponse } from '@/lib/serviceError';
import { isServiceError } from '@/lib/utils';
import { NextRequest } from 'next/server';

export const POST = async (request: NextRequest) => {
    const body = await request.json();
    const parsed = await searchRequestSchema.safeParseAsync(body);

    if (!parsed.success) {
        return serviceErrorResponse(schemaValidationError(parsed.error));
    }

    const {
        query,
        ...options
    } = parsed.data;

    const stream = await streamSearch({
        queryType: 'string',
        query,
        options,
    });

    if (isServiceError(stream)) {
        return serviceErrorResponse(stream);
    }

    return new Response(stream, {
        headers: {
            'Content-Type': 'text/event-stream',
            'Cache-Control': 'no-cache, no-transform',
            'Connection': 'keep-alive',
            'X-Accel-Buffering': 'no', // Disable nginx buffering if applicable
        },
    });
};
|
||||||
|
|
@ -4,7 +4,7 @@ import { getBrowsePath } from "@/app/[domain]/browse/hooks/utils";
|
||||||
import { PathHeader } from "@/app/[domain]/components/pathHeader";
|
import { PathHeader } from "@/app/[domain]/components/pathHeader";
|
||||||
import { LightweightCodeHighlighter } from "@/app/[domain]/components/lightweightCodeHighlighter";
|
import { LightweightCodeHighlighter } from "@/app/[domain]/components/lightweightCodeHighlighter";
|
||||||
import { FindRelatedSymbolsResponse } from "@/features/codeNav/types";
|
import { FindRelatedSymbolsResponse } from "@/features/codeNav/types";
|
||||||
import { RepositoryInfo, SourceRange } from "@/features/search/types";
|
import { RepositoryInfo, SourceRange } from "@/features/search";
|
||||||
import { useMemo, useRef } from "react";
|
import { useMemo, useRef } from "react";
|
||||||
import useCaptureEvent from "@/hooks/useCaptureEvent";
|
import useCaptureEvent from "@/hooks/useCaptureEvent";
|
||||||
import { useVirtualizer } from "@tanstack/react-virtual";
|
import { useVirtualizer } from "@tanstack/react-virtual";
|
||||||
|
|
|
||||||
|
|
@ -2,7 +2,7 @@ import { Badge } from "@/components/ui/badge";
|
||||||
import { Tooltip, TooltipContent, TooltipTrigger } from "@/components/ui/tooltip";
|
import { Tooltip, TooltipContent, TooltipTrigger } from "@/components/ui/tooltip";
|
||||||
import { LightweightCodeHighlighter } from "@/app/[domain]/components/lightweightCodeHighlighter";
|
import { LightweightCodeHighlighter } from "@/app/[domain]/components/lightweightCodeHighlighter";
|
||||||
import { useMemo } from "react";
|
import { useMemo } from "react";
|
||||||
import { SourceRange } from "@/features/search/types";
|
import { SourceRange } from "@/features/search";
|
||||||
|
|
||||||
interface SymbolDefinitionPreviewProps {
|
interface SymbolDefinitionPreviewProps {
|
||||||
symbolDefinition: {
|
symbolDefinition: {
|
||||||
|
|
|
||||||
|
|
@ -1,5 +1,5 @@
|
||||||
import { findSearchBasedSymbolDefinitions } from "@/app/api/(client)/client";
|
import { findSearchBasedSymbolDefinitions } from "@/app/api/(client)/client";
|
||||||
import { SourceRange } from "@/features/search/types";
|
import { SourceRange } from "@/features/search";
|
||||||
import { useDomain } from "@/hooks/useDomain";
|
import { useDomain } from "@/hooks/useDomain";
|
||||||
import { unwrapServiceError } from "@/lib/utils";
|
import { unwrapServiceError } from "@/lib/utils";
|
||||||
import { useQuery } from "@tanstack/react-query";
|
import { useQuery } from "@tanstack/react-query";
|
||||||
|
|
|
||||||
|
|
@ -1,6 +1,6 @@
|
||||||
import { sourcebot_context, sourcebot_pr_payload } from "@/features/agents/review-agent/types";
|
import { sourcebot_context, sourcebot_pr_payload } from "@/features/agents/review-agent/types";
|
||||||
import { getFileSource } from "@/features/search/fileSourceApi";
|
import { getFileSource } from "@/features/search/fileSourceApi";
|
||||||
import { fileSourceResponseSchema } from "@/features/search/schemas";
|
import { fileSourceResponseSchema } from "@/features/search/types";
|
||||||
import { isServiceError } from "@/lib/utils";
|
import { isServiceError } from "@/lib/utils";
|
||||||
import { createLogger } from "@sourcebot/shared";
|
import { createLogger } from "@sourcebot/shared";
|
||||||
|
|
||||||
|
|
|
||||||
|
|
@ -1,5 +1,5 @@
|
||||||
import { z } from "zod"
|
import { z } from "zod"
|
||||||
import { search } from "@/features/search/searchApi"
|
import { search } from "@/features/search"
|
||||||
import { InferToolInput, InferToolOutput, InferUITool, tool, ToolUIPart } from "ai";
|
import { InferToolInput, InferToolOutput, InferUITool, tool, ToolUIPart } from "ai";
|
||||||
import { isServiceError } from "@/lib/utils";
|
import { isServiceError } from "@/lib/utils";
|
||||||
import { getFileSource } from "../search/fileSourceApi";
|
import { getFileSource } from "../search/fileSourceApi";
|
||||||
|
|
@ -178,12 +178,15 @@ Multiple expressions can be or'd together with or, negated with -, or grouped wi
|
||||||
});
|
});
|
||||||
|
|
||||||
const response = await search({
|
const response = await search({
|
||||||
|
queryType: 'string',
|
||||||
query,
|
query,
|
||||||
|
options: {
|
||||||
matches: limit ?? 100,
|
matches: limit ?? 100,
|
||||||
// @todo: we can make this configurable.
|
|
||||||
contextLines: 3,
|
contextLines: 3,
|
||||||
whole: false,
|
whole: false,
|
||||||
// @todo(mt): handle multi-tenancy.
|
isCaseSensitivityEnabled: true,
|
||||||
|
isRegexEnabled: true,
|
||||||
|
}
|
||||||
});
|
});
|
||||||
|
|
||||||
if (isServiceError(response)) {
|
if (isServiceError(response)) {
|
||||||
|
|
|
||||||
|
|
@ -1,13 +1,13 @@
|
||||||
import 'server-only';
|
import 'server-only';
|
||||||
|
|
||||||
import { sew } from "@/actions";
|
import { sew } from "@/actions";
|
||||||
import { searchResponseSchema } from "@/features/search/schemas";
|
import { search } from "@/features/search";
|
||||||
import { search } from "@/features/search/searchApi";
|
|
||||||
import { ServiceError } from "@/lib/serviceError";
|
import { ServiceError } from "@/lib/serviceError";
|
||||||
import { isServiceError } from "@/lib/utils";
|
import { isServiceError } from "@/lib/utils";
|
||||||
import { withOptionalAuthV2 } from "@/withAuthV2";
|
import { withOptionalAuthV2 } from "@/withAuthV2";
|
||||||
import { SearchResponse } from "../search/types";
|
import { SearchResponse } from "../search/types";
|
||||||
import { FindRelatedSymbolsRequest, FindRelatedSymbolsResponse } from "./types";
|
import { FindRelatedSymbolsRequest, FindRelatedSymbolsResponse } from "./types";
|
||||||
|
import { QueryIR } from '../search/ir';
|
||||||
|
|
||||||
// The maximum number of matches to return from the search API.
|
// The maximum number of matches to return from the search API.
|
||||||
const MAX_REFERENCE_COUNT = 1000;
|
const MAX_REFERENCE_COUNT = 1000;
|
||||||
|
|
@ -20,12 +20,37 @@ export const findSearchBasedSymbolReferences = async (props: FindRelatedSymbolsR
|
||||||
revisionName = "HEAD",
|
revisionName = "HEAD",
|
||||||
} = props;
|
} = props;
|
||||||
|
|
||||||
const query = `\\b${symbolName}\\b rev:${revisionName} ${getExpandedLanguageFilter(language)} case:yes`;
|
const languageFilter = getExpandedLanguageFilter(language);
|
||||||
|
|
||||||
|
const query: QueryIR = {
|
||||||
|
and: {
|
||||||
|
children: [
|
||||||
|
{
|
||||||
|
regexp: {
|
||||||
|
regexp: `\\b${symbolName}\\b`,
|
||||||
|
case_sensitive: true,
|
||||||
|
file_name: false,
|
||||||
|
content: true,
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
branch: {
|
||||||
|
pattern: revisionName,
|
||||||
|
exact: true,
|
||||||
|
}
|
||||||
|
},
|
||||||
|
languageFilter,
|
||||||
|
]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
const searchResult = await search({
|
const searchResult = await search({
|
||||||
|
queryType: 'ir',
|
||||||
query,
|
query,
|
||||||
|
options: {
|
||||||
matches: MAX_REFERENCE_COUNT,
|
matches: MAX_REFERENCE_COUNT,
|
||||||
contextLines: 0,
|
contextLines: 0,
|
||||||
|
}
|
||||||
});
|
});
|
||||||
|
|
||||||
if (isServiceError(searchResult)) {
|
if (isServiceError(searchResult)) {
|
||||||
|
|
@ -44,12 +69,41 @@ export const findSearchBasedSymbolDefinitions = async (props: FindRelatedSymbols
|
||||||
revisionName = "HEAD",
|
revisionName = "HEAD",
|
||||||
} = props;
|
} = props;
|
||||||
|
|
||||||
const query = `sym:\\b${symbolName}\\b rev:${revisionName} ${getExpandedLanguageFilter(language)}`;
|
const languageFilter = getExpandedLanguageFilter(language);
|
||||||
|
|
||||||
|
const query: QueryIR = {
|
||||||
|
and: {
|
||||||
|
children: [
|
||||||
|
{
|
||||||
|
symbol: {
|
||||||
|
expr: {
|
||||||
|
regexp: {
|
||||||
|
regexp: `\\b${symbolName}\\b`,
|
||||||
|
case_sensitive: true,
|
||||||
|
file_name: false,
|
||||||
|
content: true,
|
||||||
|
}
|
||||||
|
},
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
branch: {
|
||||||
|
pattern: revisionName,
|
||||||
|
exact: true,
|
||||||
|
}
|
||||||
|
},
|
||||||
|
languageFilter,
|
||||||
|
]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
const searchResult = await search({
|
const searchResult = await search({
|
||||||
|
queryType: 'ir',
|
||||||
query,
|
query,
|
||||||
|
options: {
|
||||||
matches: MAX_REFERENCE_COUNT,
|
matches: MAX_REFERENCE_COUNT,
|
||||||
contextLines: 0,
|
contextLines: 0,
|
||||||
|
}
|
||||||
});
|
});
|
||||||
|
|
||||||
if (isServiceError(searchResult)) {
|
if (isServiceError(searchResult)) {
|
||||||
|
|
@ -59,12 +113,12 @@ export const findSearchBasedSymbolDefinitions = async (props: FindRelatedSymbols
|
||||||
return parseRelatedSymbolsSearchResponse(searchResult);
|
return parseRelatedSymbolsSearchResponse(searchResult);
|
||||||
}));
|
}));
|
||||||
|
|
||||||
const parseRelatedSymbolsSearchResponse = (searchResult: SearchResponse) => {
|
const parseRelatedSymbolsSearchResponse = (searchResult: SearchResponse): FindRelatedSymbolsResponse => {
|
||||||
const parser = searchResponseSchema.transform(async ({ files }) => ({
|
return {
|
||||||
stats: {
|
stats: {
|
||||||
matchCount: searchResult.stats.actualMatchCount,
|
matchCount: searchResult.stats.actualMatchCount,
|
||||||
},
|
},
|
||||||
files: files.flatMap((file) => {
|
files: searchResult.files.flatMap((file) => {
|
||||||
const chunks = file.chunks;
|
const chunks = file.chunks;
|
||||||
|
|
||||||
return {
|
return {
|
||||||
|
|
@ -82,20 +136,47 @@ const parseRelatedSymbolsSearchResponse = (searchResult: SearchResponse) => {
|
||||||
}
|
}
|
||||||
}).filter((file) => file.matches.length > 0),
|
}).filter((file) => file.matches.length > 0),
|
||||||
repositoryInfo: searchResult.repositoryInfo
|
repositoryInfo: searchResult.repositoryInfo
|
||||||
}));
|
};
|
||||||
|
|
||||||
return parser.parseAsync(searchResult);
|
|
||||||
}
|
}
|
||||||
|
|
||||||
// Expands the language filter to include all variants of the language.
|
// Expands the language filter to include all variants of the language.
|
||||||
const getExpandedLanguageFilter = (language: string) => {
|
const getExpandedLanguageFilter = (language: string): QueryIR => {
|
||||||
switch (language) {
|
switch (language) {
|
||||||
case "TypeScript":
|
case "TypeScript":
|
||||||
case "JavaScript":
|
case "JavaScript":
|
||||||
case "JSX":
|
case "JSX":
|
||||||
case "TSX":
|
case "TSX":
|
||||||
return `(lang:TypeScript or lang:JavaScript or lang:JSX or lang:TSX)`
|
return {
|
||||||
|
or: {
|
||||||
|
children: [
|
||||||
|
{
|
||||||
|
language: {
|
||||||
|
language: "TypeScript",
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
language: {
|
||||||
|
language: "JavaScript",
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
language: {
|
||||||
|
language: "JSX",
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
language: {
|
||||||
|
language: "TSX",
|
||||||
|
}
|
||||||
|
},
|
||||||
|
]
|
||||||
|
},
|
||||||
|
}
|
||||||
default:
|
default:
|
||||||
return `lang:${language}`
|
return {
|
||||||
|
language: {
|
||||||
|
language: language,
|
||||||
|
},
|
||||||
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
@ -1,5 +1,5 @@
|
||||||
import { z } from "zod";
|
import { z } from "zod";
|
||||||
import { rangeSchema, repositoryInfoSchema } from "../search/schemas";
|
import { rangeSchema, repositoryInfoSchema } from "../search/types";
|
||||||
|
|
||||||
export const findRelatedSymbolsRequestSchema = z.object({
|
export const findRelatedSymbolsRequestSchema = z.object({
|
||||||
symbolName: z.string(),
|
symbolName: z.string(),
|
||||||
|
|
|
||||||
31  packages/web/src/features/search/README.md  Normal file
|
|
@ -0,0 +1,31 @@
# `/search`

Code search interface for Sourcebot.

## Overview

The search feature parses user queries into an intermediate representation (IR), which is then executed against Zoekt's gRPC search backend. Query parsing uses Lezer for syntax analysis.

## Architecture

**Query Flow:**
1. User query string → Lezer parser (via `@sourcebot/query-language`)
2. Lezer syntax tree → Query IR (Zoekt gRPC `Q` proto)
3. Query IR → Zoekt backend → Search results

## Files

- **`index.ts`** - Public API exports for the search feature, including search functions and type definitions.

- **`parser.ts`** - Parses query strings into the query IR using the Lezer parser from `@sourcebot/query-language`.

- **`ir.ts`** - Defines the `QueryIR` type (internally the Zoekt gRPC `Q` proto) and provides utilities for traversing and querying the IR tree structure.

- **`types.ts`** - TypeScript types and Zod schemas for search requests, responses, file matches, stats, and streaming results.

- **`searchApi.ts`** - High-level search API that handles authentication, permission filtering, and orchestrates the query parsing and Zoekt backend calls.

- **`zoektSearcher.ts`** - Low-level interface to the Zoekt gRPC backend. Handles request construction, streaming search, response transformation, and repository metadata resolution.

- **`fileSourceApi.ts`** - Retrieves full file contents by executing a specialized search query against Zoekt for a specific file path.
|
||||||
|
|
||||||
|
|
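For reference, a minimal sketch of how the two query paths exposed by `searchApi.ts` are invoked, based on the call sites elsewhere in this change (`stream_search/route.ts`, `fileSourceApi.ts`, and the related-symbols API). The example query string, the option values, and the `runExampleSearches` wrapper are illustrative assumptions, not part of this diff:

```typescript
import { search } from '@/features/search';
import { QueryIR } from '@/features/search/ir';
import { isServiceError } from '@/lib/utils';

// Hypothetical wrapper, only so the sketch is self-contained (search is a server-only module).
const runExampleSearches = async () => {
    // 1. String path: the raw query string is parsed by the Lezer grammar into the IR
    //    before being sent to Zoekt. The toggles mirror the new SearchBar options.
    const stringResult = await search({
        queryType: 'string',
        query: 'useStreamedSearch repo:sourcebot', // illustrative query
        options: {
            matches: 100,
            contextLines: 3,
            whole: false,
            isRegexEnabled: true,
            isCaseSensitivityEnabled: false,
        },
    });

    // 2. IR path: callers that already know the shape of the query (e.g. fileSourceApi,
    //    the related-symbols API) can hand the Zoekt `Q` proto directly.
    const irQuery: QueryIR = {
        and: {
            children: [
                { regexp: { regexp: '\\buseStreamedSearch\\b', case_sensitive: true, file_name: false, content: true } },
                { branch: { pattern: 'HEAD', exact: true } },
            ],
        },
    };
    const irResult = await search({
        queryType: 'ir',
        query: irQuery,
        options: { matches: 100, contextLines: 0 },
    });

    if (!isServiceError(stringResult) && !isServiceError(irResult)) {
        console.log(stringResult.stats.actualMatchCount, irResult.files.length);
    }
};
```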
@ -1,29 +1,50 @@
|
||||||
import 'server-only';
|
import 'server-only';
|
||||||
import escapeStringRegexp from "escape-string-regexp";
|
|
||||||
import { fileNotFound, ServiceError, unexpectedError } from "../../lib/serviceError";
|
import { fileNotFound, ServiceError, unexpectedError } from "../../lib/serviceError";
|
||||||
import { FileSourceRequest, FileSourceResponse } from "./types";
|
import { FileSourceRequest, FileSourceResponse } from "./types";
|
||||||
import { isServiceError } from "../../lib/utils";
|
import { isServiceError } from "../../lib/utils";
|
||||||
import { search } from "./searchApi";
|
import { search } from "./searchApi";
|
||||||
import { sew } from "@/actions";
|
import { sew } from "@/actions";
|
||||||
import { withOptionalAuthV2 } from "@/withAuthV2";
|
import { withOptionalAuthV2 } from "@/withAuthV2";
|
||||||
|
import { QueryIR } from './ir';
|
||||||
// @todo (bkellam) #574 : We should really be using `git show <hash>:<path>` to fetch file contents here.
|
// @todo (bkellam) #574 : We should really be using `git show <hash>:<path>` to fetch file contents here.
|
||||||
// This will allow us to support permalinks to files at a specific revision that may not be indexed
|
// This will allow us to support permalinks to files at a specific revision that may not be indexed
|
||||||
// by zoekt.
|
// by zoekt. We should also refactor this out of the /search folder.
|
||||||
|
|
||||||
export const getFileSource = async ({ fileName, repository, branch }: FileSourceRequest): Promise<FileSourceResponse | ServiceError> => sew(() =>
|
export const getFileSource = async ({ fileName, repository, branch }: FileSourceRequest): Promise<FileSourceResponse | ServiceError> => sew(() =>
|
||||||
withOptionalAuthV2(async () => {
|
withOptionalAuthV2(async () => {
|
||||||
const escapedFileName = escapeStringRegexp(fileName);
|
const query: QueryIR = {
|
||||||
const escapedRepository = escapeStringRegexp(repository);
|
and: {
|
||||||
|
children: [
|
||||||
let query = `file:${escapedFileName} repo:^${escapedRepository}$`;
|
{
|
||||||
if (branch) {
|
repo: {
|
||||||
query = query.concat(` branch:${branch}`);
|
regexp: `^${repository}$`,
|
||||||
|
},
|
||||||
|
},
|
||||||
|
{
|
||||||
|
regexp: {
|
||||||
|
regexp: fileName,
|
||||||
|
case_sensitive: true,
|
||||||
|
file_name: true,
|
||||||
|
content: false
|
||||||
|
},
|
||||||
|
},
|
||||||
|
...(branch ? [{
|
||||||
|
branch: {
|
||||||
|
pattern: branch,
|
||||||
|
exact: true,
|
||||||
|
},
|
||||||
|
}]: [])
|
||||||
|
]
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
const searchResponse = await search({
|
const searchResponse = await search({
|
||||||
|
queryType: 'ir',
|
||||||
query,
|
query,
|
||||||
|
options: {
|
||||||
matches: 1,
|
matches: 1,
|
||||||
whole: true,
|
whole: true,
|
||||||
|
}
|
||||||
});
|
});
|
||||||
|
|
||||||
if (isServiceError(searchResponse)) {
|
if (isServiceError(searchResponse)) {
|
||||||
|
|
|
||||||
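The `@todo` above points at a `git show`-based alternative for fetching file contents. A hypothetical sketch of that approach (none of this exists in the PR; the clone path handling and helper name are made up for illustration):

```typescript
import { exec } from 'child_process';
import { promisify } from 'util';

const execAsync = promisify(exec);

// Reads a file's contents at an arbitrary revision straight from the git clone,
// so permalinks can resolve revisions that zoekt has not indexed.
const getFileAtRevision = async (repoClonePath: string, revision: string, filePath: string): Promise<string> => {
    // `git show <rev>:<path>` prints the blob contents at that revision.
    const { stdout } = await execAsync(`git show ${revision}:${filePath}`, { cwd: repoClonePath });
    return stdout;
};
```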
packages/web/src/features/search/index.ts (new file, 15 lines)
@@ -0,0 +1,15 @@
|
||||||
|
export { search, streamSearch } from './searchApi';
|
||||||
|
export {
|
||||||
|
searchRequestSchema,
|
||||||
|
} from './types';
|
||||||
|
export type {
|
||||||
|
SourceRange,
|
||||||
|
SearchSymbol,
|
||||||
|
RepositoryInfo,
|
||||||
|
SearchRequest,
|
||||||
|
SearchResultFile,
|
||||||
|
SearchStats,
|
||||||
|
StreamedSearchResponse,
|
||||||
|
SearchResultChunk,
|
||||||
|
SearchResponse,
|
||||||
|
} from './types';
|
||||||
packages/web/src/features/search/ir.ts (new file, 209 lines)
@@ -0,0 +1,209 @@
|
||||||
|
import { Q as QueryIR } from '@/proto/zoekt/webserver/v1/Q';
|
||||||
|
import { RawConfig } from '@/proto/zoekt/webserver/v1/RawConfig';
|
||||||
|
import { Regexp } from '@/proto/zoekt/webserver/v1/Regexp';
|
||||||
|
import { Symbol } from '@/proto/zoekt/webserver/v1/Symbol';
|
||||||
|
import { Language } from '@/proto/zoekt/webserver/v1/Language';
|
||||||
|
import { Repo } from '@/proto/zoekt/webserver/v1/Repo';
|
||||||
|
import { RepoRegexp } from '@/proto/zoekt/webserver/v1/RepoRegexp';
|
||||||
|
import { BranchesRepos } from '@/proto/zoekt/webserver/v1/BranchesRepos';
|
||||||
|
import { RepoIds } from '@/proto/zoekt/webserver/v1/RepoIds';
|
||||||
|
import { RepoSet } from '@/proto/zoekt/webserver/v1/RepoSet';
|
||||||
|
import { FileNameSet } from '@/proto/zoekt/webserver/v1/FileNameSet';
|
||||||
|
import { Type } from '@/proto/zoekt/webserver/v1/Type';
|
||||||
|
import { Substring } from '@/proto/zoekt/webserver/v1/Substring';
|
||||||
|
import { And } from '@/proto/zoekt/webserver/v1/And';
|
||||||
|
import { Or } from '@/proto/zoekt/webserver/v1/Or';
|
||||||
|
import { Not } from '@/proto/zoekt/webserver/v1/Not';
|
||||||
|
import { Branch } from '@/proto/zoekt/webserver/v1/Branch';
|
||||||
|
import { Boost } from '@/proto/zoekt/webserver/v1/Boost';
|
||||||
|
|
||||||
|
export type {
|
||||||
|
QueryIR,
|
||||||
|
}
|
||||||
|
|
||||||
|
// Type guards for each query node type
|
||||||
|
export const isRawConfigQuery = (query: QueryIR): query is QueryIR & { raw_config: RawConfig } => query.raw_config != null;
|
||||||
|
export const isRegexpQuery = (query: QueryIR): query is QueryIR & { regexp: Regexp } => query.regexp != null;
|
||||||
|
export const isSymbolQuery = (query: QueryIR): query is QueryIR & { symbol: Symbol } => query.symbol != null;
|
||||||
|
export const isLanguageQuery = (query: QueryIR): query is QueryIR & { language: Language } => query.language != null;
|
||||||
|
export const isConstQuery = (query: QueryIR): query is QueryIR & { const: boolean } => query.const != null;
|
||||||
|
export const isRepoQuery = (query: QueryIR): query is QueryIR & { repo: Repo } => query.repo != null;
|
||||||
|
export const isRepoRegexpQuery = (query: QueryIR): query is QueryIR & { repo_regexp: RepoRegexp } => query.repo_regexp != null;
|
||||||
|
export const isBranchesReposQuery = (query: QueryIR): query is QueryIR & { branches_repos: BranchesRepos } => query.branches_repos != null;
|
||||||
|
export const isRepoIdsQuery = (query: QueryIR): query is QueryIR & { repo_ids: RepoIds } => query.repo_ids != null;
|
||||||
|
export const isRepoSetQuery = (query: QueryIR): query is QueryIR & { repo_set: RepoSet } => query.repo_set != null;
|
||||||
|
export const isFileNameSetQuery = (query: QueryIR): query is QueryIR & { file_name_set: FileNameSet } => query.file_name_set != null;
|
||||||
|
export const isTypeQuery = (query: QueryIR): query is QueryIR & { type: Type } => query.type != null;
|
||||||
|
export const isSubstringQuery = (query: QueryIR): query is QueryIR & { substring: Substring } => query.substring != null;
|
||||||
|
export const isAndQuery = (query: QueryIR): query is QueryIR & { and: And } => query.and != null;
|
||||||
|
export const isOrQuery = (query: QueryIR): query is QueryIR & { or: Or } => query.or != null;
|
||||||
|
export const isNotQuery = (query: QueryIR): query is QueryIR & { not: Not } => query.not != null;
|
||||||
|
export const isBranchQuery = (query: QueryIR): query is QueryIR & { branch: Branch } => query.branch != null;
|
||||||
|
export const isBoostQuery = (query: QueryIR): query is QueryIR & { boost: Boost } => query.boost != null;
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Visitor pattern for traversing a QueryIR tree.
|
||||||
|
* Return false from any method to stop traversal early.
|
||||||
|
*/
|
||||||
|
export type QueryVisitor = {
|
||||||
|
onRawConfig?: (query: QueryIR) => boolean | void;
|
||||||
|
onRegexp?: (query: QueryIR) => boolean | void;
|
||||||
|
onSymbol?: (query: QueryIR) => boolean | void;
|
||||||
|
onLanguage?: (query: QueryIR) => boolean | void;
|
||||||
|
onConst?: (query: QueryIR) => boolean | void;
|
||||||
|
onRepo?: (query: QueryIR) => boolean | void;
|
||||||
|
onRepoRegexp?: (query: QueryIR) => boolean | void;
|
||||||
|
onBranchesRepos?: (query: QueryIR) => boolean | void;
|
||||||
|
onRepoIds?: (query: QueryIR) => boolean | void;
|
||||||
|
onRepoSet?: (query: QueryIR) => boolean | void;
|
||||||
|
onFileNameSet?: (query: QueryIR) => boolean | void;
|
||||||
|
onType?: (query: QueryIR) => boolean | void;
|
||||||
|
onSubstring?: (query: QueryIR) => boolean | void;
|
||||||
|
onAnd?: (query: QueryIR) => boolean | void;
|
||||||
|
onOr?: (query: QueryIR) => boolean | void;
|
||||||
|
onNot?: (query: QueryIR) => boolean | void;
|
||||||
|
onBranch?: (query: QueryIR) => boolean | void;
|
||||||
|
onBoost?: (query: QueryIR) => boolean | void;
|
||||||
|
};
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Traverses a QueryIR tree using the visitor pattern.
|
||||||
|
* @param query The query to traverse
|
||||||
|
* @param visitor An object with optional callback methods for each query type
|
||||||
|
* @returns false if traversal was stopped early, true otherwise
|
||||||
|
*/
|
||||||
|
export function traverseQueryIR(
|
||||||
|
query: QueryIR,
|
||||||
|
visitor: QueryVisitor
|
||||||
|
): boolean {
|
||||||
|
let shouldContinue: boolean | void = true;
|
||||||
|
|
||||||
|
if (isRawConfigQuery(query)) {
|
||||||
|
shouldContinue = visitor.onRawConfig?.(query);
|
||||||
|
|
||||||
|
} else if (isRegexpQuery(query)) {
|
||||||
|
shouldContinue = visitor.onRegexp?.(query);
|
||||||
|
|
||||||
|
} else if (isSymbolQuery(query)) {
|
||||||
|
shouldContinue = visitor.onSymbol?.(query);
|
||||||
|
if (shouldContinue !== false && query.symbol.expr) {
|
||||||
|
shouldContinue = traverseQueryIR(query.symbol.expr, visitor);
|
||||||
|
}
|
||||||
|
|
||||||
|
} else if (isLanguageQuery(query)) {
|
||||||
|
shouldContinue = visitor.onLanguage?.(query);
|
||||||
|
|
||||||
|
} else if (isConstQuery(query)) {
|
||||||
|
shouldContinue = visitor.onConst?.(query);
|
||||||
|
|
||||||
|
} else if (isRepoQuery(query)) {
|
||||||
|
shouldContinue = visitor.onRepo?.(query);
|
||||||
|
|
||||||
|
} else if (isRepoRegexpQuery(query)) {
|
||||||
|
shouldContinue = visitor.onRepoRegexp?.(query);
|
||||||
|
|
||||||
|
} else if (isBranchesReposQuery(query)) {
|
||||||
|
shouldContinue = visitor.onBranchesRepos?.(query);
|
||||||
|
|
||||||
|
} else if (isRepoIdsQuery(query)) {
|
||||||
|
shouldContinue = visitor.onRepoIds?.(query);
|
||||||
|
|
||||||
|
} else if (isRepoSetQuery(query)) {
|
||||||
|
shouldContinue = visitor.onRepoSet?.(query);
|
||||||
|
|
||||||
|
} else if (isFileNameSetQuery(query)) {
|
||||||
|
shouldContinue = visitor.onFileNameSet?.(query);
|
||||||
|
|
||||||
|
} else if (isTypeQuery(query)) {
|
||||||
|
shouldContinue = visitor.onType?.(query);
|
||||||
|
|
||||||
|
} else if (isSubstringQuery(query)) {
|
||||||
|
shouldContinue = visitor.onSubstring?.(query);
|
||||||
|
|
||||||
|
} else if (isAndQuery(query)) {
|
||||||
|
shouldContinue = visitor.onAnd?.(query);
|
||||||
|
if (shouldContinue !== false && query.and.children) {
|
||||||
|
for (const child of query.and.children) {
|
||||||
|
if (!traverseQueryIR(child, visitor)) {
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
} else if (isOrQuery(query)) {
|
||||||
|
shouldContinue = visitor.onOr?.(query);
|
||||||
|
if (shouldContinue !== false && query.or.children) {
|
||||||
|
for (const child of query.or.children) {
|
||||||
|
if (!traverseQueryIR(child, visitor)) {
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
} else if (isNotQuery(query)) {
|
||||||
|
shouldContinue = visitor.onNot?.(query);
|
||||||
|
if (shouldContinue !== false && query.not.child) {
|
||||||
|
shouldContinue = traverseQueryIR(query.not.child, visitor);
|
||||||
|
}
|
||||||
|
|
||||||
|
} else if (isBranchQuery(query)) {
|
||||||
|
shouldContinue = visitor.onBranch?.(query);
|
||||||
|
|
||||||
|
} else if (isBoostQuery(query)) {
|
||||||
|
shouldContinue = visitor.onBoost?.(query);
|
||||||
|
if (shouldContinue !== false && query.boost.child) {
|
||||||
|
shouldContinue = traverseQueryIR(query.boost.child, visitor);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return shouldContinue !== false;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Finds a node in the query tree that matches the predicate.
|
||||||
|
* @param query The query to search
|
||||||
|
* @param predicate A function that returns true if the node matches
|
||||||
|
* @returns The first matching query node, or undefined if none found
|
||||||
|
*/
|
||||||
|
export function findInQueryIR(
|
||||||
|
query: QueryIR,
|
||||||
|
predicate: (query: QueryIR) => boolean
|
||||||
|
): QueryIR | undefined {
|
||||||
|
let found: QueryIR | undefined;
|
||||||
|
|
||||||
|
traverseQueryIR(query, {
|
||||||
|
onRawConfig: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onRegexp: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onSymbol: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onLanguage: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onConst: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onRepo: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onRepoRegexp: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onBranchesRepos: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onRepoIds: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onRepoSet: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onFileNameSet: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onType: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onSubstring: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onAnd: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onOr: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onNot: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onBranch: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
onBoost: (q) => { if (predicate(q)) { found = q; return false; } },
|
||||||
|
});
|
||||||
|
|
||||||
|
return found;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if any node in the query tree matches the predicate.
|
||||||
|
* @param query The query to search
|
||||||
|
* @param predicate A function that returns true if the node matches
|
||||||
|
* @returns true if any node matches, false otherwise
|
||||||
|
*/
|
||||||
|
export function someInQueryIR(
|
||||||
|
query: QueryIR,
|
||||||
|
predicate: (query: QueryIR) => boolean
|
||||||
|
): boolean {
|
||||||
|
return findInQueryIR(query, predicate) !== undefined;
|
||||||
|
}
|
||||||
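A small usage sketch of the helpers above (hypothetical caller, not part of the PR): the type guards compose with `someInQueryIR` to answer questions about a parsed query, for example whether it already constrains the branch.

```typescript
import { QueryIR, isBranchQuery, someInQueryIR } from './ir';

// True if the query, or any sub-query, filters by branch.
const hasBranchFilter = (query: QueryIR): boolean =>
    someInQueryIR(query, (node) => isBranchQuery(node));
```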
packages/web/src/features/search/parser.ts (new file, 406 lines)
@@ -0,0 +1,406 @@
|
||||||
|
import { QueryIR } from './ir';
|
||||||
|
import {
|
||||||
|
AndExpr,
|
||||||
|
ArchivedExpr,
|
||||||
|
ContentExpr,
|
||||||
|
ContextExpr,
|
||||||
|
FileExpr,
|
||||||
|
ForkExpr,
|
||||||
|
LangExpr,
|
||||||
|
NegateExpr,
|
||||||
|
OrExpr,
|
||||||
|
ParenExpr,
|
||||||
|
PrefixExpr,
|
||||||
|
Program,
|
||||||
|
RepoExpr,
|
||||||
|
RepoSetExpr,
|
||||||
|
RevisionExpr,
|
||||||
|
SymExpr,
|
||||||
|
SyntaxNode,
|
||||||
|
Term,
|
||||||
|
Tree,
|
||||||
|
VisibilityExpr,
|
||||||
|
} from '@sourcebot/query-language';
|
||||||
|
import { parser as _parser } from '@sourcebot/query-language';
|
||||||
|
import { PrismaClient } from '@sourcebot/db';
|
||||||
|
import { SINGLE_TENANT_ORG_ID } from '@/lib/constants';
|
||||||
|
import { ServiceErrorException } from '@/lib/serviceError';
|
||||||
|
import { StatusCodes } from 'http-status-codes';
|
||||||
|
import { ErrorCode } from '@/lib/errorCodes';
|
||||||
|
|
||||||
|
// Configure the parser to throw errors when encountering invalid syntax.
|
||||||
|
const parser = _parser.configure({
|
||||||
|
strict: true,
|
||||||
|
});
|
||||||
|
|
||||||
|
type ArchivedValue = 'yes' | 'no' | 'only';
|
||||||
|
type VisibilityValue = 'public' | 'private' | 'any';
|
||||||
|
type ForkValue = 'yes' | 'no' | 'only';
|
||||||
|
|
||||||
|
const isArchivedValue = (value: string): value is ArchivedValue => {
|
||||||
|
return value === 'yes' || value === 'no' || value === 'only';
|
||||||
|
}
|
||||||
|
|
||||||
|
const isVisibilityValue = (value: string): value is VisibilityValue => {
|
||||||
|
return value === 'public' || value === 'private' || value === 'any';
|
||||||
|
}
|
||||||
|
|
||||||
|
const isForkValue = (value: string): value is ForkValue => {
|
||||||
|
return value === 'yes' || value === 'no' || value === 'only';
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Given a query string, parses it into the query intermediate representation.
|
||||||
|
*/
|
||||||
|
export const parseQuerySyntaxIntoIR = async ({
|
||||||
|
query,
|
||||||
|
options,
|
||||||
|
prisma,
|
||||||
|
}: {
|
||||||
|
query: string,
|
||||||
|
options: {
|
||||||
|
isCaseSensitivityEnabled?: boolean;
|
||||||
|
isRegexEnabled?: boolean;
|
||||||
|
},
|
||||||
|
prisma: PrismaClient,
|
||||||
|
}): Promise<QueryIR> => {
|
||||||
|
|
||||||
|
try {
|
||||||
|
// First parse the query into a Lezer tree.
|
||||||
|
const tree = parser.parse(query);
|
||||||
|
|
||||||
|
// Then transform the tree into the intermediate representation.
|
||||||
|
return transformTreeToIR({
|
||||||
|
tree,
|
||||||
|
input: query,
|
||||||
|
isCaseSensitivityEnabled: options.isCaseSensitivityEnabled ?? false,
|
||||||
|
isRegexEnabled: options.isRegexEnabled ?? false,
|
||||||
|
onExpandSearchContext: async (contextName: string) => {
|
||||||
|
const context = await prisma.searchContext.findUnique({
|
||||||
|
where: {
|
||||||
|
name_orgId: {
|
||||||
|
name: contextName,
|
||||||
|
orgId: SINGLE_TENANT_ORG_ID,
|
||||||
|
}
|
||||||
|
},
|
||||||
|
include: {
|
||||||
|
repos: true,
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
if (!context) {
|
||||||
|
throw new Error(`Search context "${contextName}" not found`);
|
||||||
|
}
|
||||||
|
|
||||||
|
return context.repos.map((repo) => repo.name);
|
||||||
|
},
|
||||||
|
});
|
||||||
|
} catch (error) {
|
||||||
|
if (error instanceof SyntaxError) {
|
||||||
|
throw new ServiceErrorException({
|
||||||
|
statusCode: StatusCodes.BAD_REQUEST,
|
||||||
|
errorCode: ErrorCode.FAILED_TO_PARSE_QUERY,
|
||||||
|
message: `Failed to parse query "${query}" with message: ${error.message}`,
|
||||||
|
});
|
||||||
|
}
|
||||||
|
throw error;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Given a Lezer tree, transforms it into the query intermediate representation.
|
||||||
|
*/
|
||||||
|
const transformTreeToIR = async ({
|
||||||
|
tree,
|
||||||
|
input,
|
||||||
|
isCaseSensitivityEnabled,
|
||||||
|
isRegexEnabled,
|
||||||
|
onExpandSearchContext,
|
||||||
|
}: {
|
||||||
|
tree: Tree;
|
||||||
|
input: string;
|
||||||
|
isCaseSensitivityEnabled: boolean;
|
||||||
|
isRegexEnabled: boolean;
|
||||||
|
onExpandSearchContext: (contextName: string) => Promise<string[]>;
|
||||||
|
}): Promise<QueryIR> => {
|
||||||
|
const transformNode = async (node: SyntaxNode): Promise<QueryIR> => {
|
||||||
|
switch (node.type.id) {
|
||||||
|
case Program: {
|
||||||
|
// Program wraps the actual query - transform its child
|
||||||
|
const child = node.firstChild;
|
||||||
|
if (!child) {
|
||||||
|
// Empty query - match nothing
|
||||||
|
return { const: false, query: "const" };
|
||||||
|
}
|
||||||
|
return transformNode(child);
|
||||||
|
}
|
||||||
|
case AndExpr:
|
||||||
|
return {
|
||||||
|
and: {
|
||||||
|
children: await Promise.all(getChildren(node).map(c => transformNode(c)))
|
||||||
|
},
|
||||||
|
query: "and"
|
||||||
|
}
|
||||||
|
|
||||||
|
case OrExpr:
|
||||||
|
return {
|
||||||
|
or: {
|
||||||
|
children: await Promise.all(getChildren(node).map(c => transformNode(c)))
|
||||||
|
},
|
||||||
|
query: "or"
|
||||||
|
};
|
||||||
|
|
||||||
|
case NegateExpr: {
|
||||||
|
// Find the child after the negate token
|
||||||
|
const negateChild = node.getChild("PrefixExpr") || node.getChild("ParenExpr");
|
||||||
|
if (!negateChild) {
|
||||||
|
throw new Error("NegateExpr missing child");
|
||||||
|
}
|
||||||
|
return {
|
||||||
|
not: {
|
||||||
|
child: await transformNode(negateChild)
|
||||||
|
},
|
||||||
|
query: "not"
|
||||||
|
};
|
||||||
|
}
|
||||||
|
case ParenExpr: {
|
||||||
|
// Parentheses just group - transform the inner query
|
||||||
|
const innerQuery = node.getChild("query") || node.firstChild;
|
||||||
|
if (!innerQuery) {
|
||||||
|
return { const: false, query: "const" };
|
||||||
|
}
|
||||||
|
return transformNode(innerQuery);
|
||||||
|
}
|
||||||
|
case PrefixExpr:
|
||||||
|
// PrefixExpr contains specific prefix types
|
||||||
|
return transformPrefixExpr(node);
|
||||||
|
|
||||||
|
case Term: {
|
||||||
|
const termText = input.substring(node.from, node.to).replace(/^"|"$/g, '');
|
||||||
|
|
||||||
|
return isRegexEnabled ? {
|
||||||
|
regexp: {
|
||||||
|
regexp: termText,
|
||||||
|
case_sensitive: isCaseSensitivityEnabled,
|
||||||
|
file_name: false,
|
||||||
|
content: true
|
||||||
|
},
|
||||||
|
query: "regexp"
|
||||||
|
} : {
|
||||||
|
substring: {
|
||||||
|
pattern: termText,
|
||||||
|
case_sensitive: isCaseSensitivityEnabled,
|
||||||
|
file_name: false,
|
||||||
|
content: true
|
||||||
|
},
|
||||||
|
query: "substring"
|
||||||
|
};
|
||||||
|
}
|
||||||
|
default:
|
||||||
|
console.warn(`Unhandled node type: ${node.type.name} (id: ${node.type.id})`);
|
||||||
|
return { const: true, query: "const" };
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
const transformPrefixExpr = async (node: SyntaxNode): Promise<QueryIR> => {
|
||||||
|
// Find which specific prefix type this is
|
||||||
|
const prefixNode = node.firstChild;
|
||||||
|
if (!prefixNode) {
|
||||||
|
throw new Error("PrefixExpr has no child");
|
||||||
|
}
|
||||||
|
|
||||||
|
const prefixTypeId = prefixNode.type.id;
|
||||||
|
|
||||||
|
// Extract the full text (e.g., "file:test.js") and split on the colon
|
||||||
|
const fullText = input.substring(prefixNode.from, prefixNode.to);
|
||||||
|
const colonIndex = fullText.indexOf(':');
|
||||||
|
if (colonIndex === -1) {
|
||||||
|
throw new Error(`${prefixNode.type.name} missing colon`);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get the value part after the colon and remove quotes if present
|
||||||
|
const value = fullText.substring(colonIndex + 1).replace(/^"|"$/g, '');
|
||||||
|
|
||||||
|
switch (prefixTypeId) {
|
||||||
|
case FileExpr:
|
||||||
|
return {
|
||||||
|
regexp: {
|
||||||
|
regexp: value,
|
||||||
|
case_sensitive: isCaseSensitivityEnabled,
|
||||||
|
file_name: true,
|
||||||
|
content: false
|
||||||
|
},
|
||||||
|
query: "regexp"
|
||||||
|
};
|
||||||
|
|
||||||
|
case RepoExpr:
|
||||||
|
return {
|
||||||
|
repo: {
|
||||||
|
regexp: value
|
||||||
|
},
|
||||||
|
query: "repo"
|
||||||
|
};
|
||||||
|
|
||||||
|
case RevisionExpr:
|
||||||
|
return {
|
||||||
|
branch: {
|
||||||
|
// Special case - "*" means search all branches. Passing in a
|
||||||
|
// blank string will match all branches.
|
||||||
|
pattern: value === '*' ? "" : value,
|
||||||
|
exact: false
|
||||||
|
},
|
||||||
|
query: "branch"
|
||||||
|
};
|
||||||
|
|
||||||
|
case ContentExpr:
|
||||||
|
return {
|
||||||
|
substring: {
|
||||||
|
pattern: value,
|
||||||
|
case_sensitive: isCaseSensitivityEnabled,
|
||||||
|
file_name: false,
|
||||||
|
content: true
|
||||||
|
},
|
||||||
|
query: "substring"
|
||||||
|
};
|
||||||
|
|
||||||
|
|
||||||
|
case LangExpr:
|
||||||
|
return {
|
||||||
|
language: {
|
||||||
|
language: value
|
||||||
|
},
|
||||||
|
query: "language"
|
||||||
|
};
|
||||||
|
|
||||||
|
case SymExpr:
|
||||||
|
// Symbol search wraps a pattern
|
||||||
|
return {
|
||||||
|
symbol: {
|
||||||
|
expr: {
|
||||||
|
regexp: {
|
||||||
|
regexp: value,
|
||||||
|
case_sensitive: isCaseSensitivityEnabled,
|
||||||
|
file_name: false,
|
||||||
|
content: true
|
||||||
|
},
|
||||||
|
query: "regexp"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
query: "symbol"
|
||||||
|
};
|
||||||
|
|
||||||
|
case VisibilityExpr: {
|
||||||
|
const rawValue = value.toLowerCase();
|
||||||
|
|
||||||
|
if (!isVisibilityValue(rawValue)) {
|
||||||
|
throw new Error(`Invalid visibility value: ${rawValue}. Expected 'public', 'private', or 'any'`);
|
||||||
|
}
|
||||||
|
|
||||||
|
const flags: ('FLAG_ONLY_PUBLIC' | 'FLAG_ONLY_PRIVATE')[] = [];
|
||||||
|
|
||||||
|
if (rawValue === 'any') {
|
||||||
|
// 'any' means no filter
|
||||||
|
} else if (rawValue === 'public') {
|
||||||
|
flags.push('FLAG_ONLY_PUBLIC');
|
||||||
|
} else if (rawValue === 'private') {
|
||||||
|
flags.push('FLAG_ONLY_PRIVATE');
|
||||||
|
}
|
||||||
|
|
||||||
|
return {
|
||||||
|
raw_config: {
|
||||||
|
flags
|
||||||
|
},
|
||||||
|
query: "raw_config"
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
case ArchivedExpr: {
|
||||||
|
const rawValue = value.toLowerCase();
|
||||||
|
|
||||||
|
if (!isArchivedValue(rawValue)) {
|
||||||
|
throw new Error(`Invalid archived value: ${rawValue}. Expected 'yes', 'no', or 'only'`);
|
||||||
|
}
|
||||||
|
|
||||||
|
const flags: ('FLAG_ONLY_ARCHIVED' | 'FLAG_NO_ARCHIVED')[] = [];
|
||||||
|
|
||||||
|
if (rawValue === 'yes') {
|
||||||
|
// 'yes' means include archived repositories (default)
|
||||||
|
} else if (rawValue === 'no') {
|
||||||
|
flags.push('FLAG_NO_ARCHIVED');
|
||||||
|
} else if (rawValue === 'only') {
|
||||||
|
flags.push('FLAG_ONLY_ARCHIVED');
|
||||||
|
}
|
||||||
|
|
||||||
|
return {
|
||||||
|
raw_config: {
|
||||||
|
flags
|
||||||
|
},
|
||||||
|
query: "raw_config"
|
||||||
|
};
|
||||||
|
}
|
||||||
|
case ForkExpr: {
|
||||||
|
const rawValue = value.toLowerCase();
|
||||||
|
|
||||||
|
if (!isForkValue(rawValue)) {
|
||||||
|
throw new Error(`Invalid fork value: ${rawValue}. Expected 'yes', 'no', or 'only'`);
|
||||||
|
}
|
||||||
|
|
||||||
|
const flags: ('FLAG_ONLY_FORKS' | 'FLAG_NO_FORKS')[] = [];
|
||||||
|
|
||||||
|
if (rawValue === 'yes') {
|
||||||
|
// 'yes' means include forks (default)
|
||||||
|
} else if (rawValue === 'no') {
|
||||||
|
flags.push('FLAG_NO_FORKS');
|
||||||
|
} else if (rawValue === 'only') {
|
||||||
|
flags.push('FLAG_ONLY_FORKS');
|
||||||
|
}
|
||||||
|
|
||||||
|
return {
|
||||||
|
raw_config: {
|
||||||
|
flags
|
||||||
|
},
|
||||||
|
query: "raw_config"
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
case ContextExpr: {
|
||||||
|
const repoNames = await onExpandSearchContext(value);
|
||||||
|
return {
|
||||||
|
repo_set: {
|
||||||
|
set: repoNames.reduce((acc, s) => {
|
||||||
|
acc[s.trim()] = true;
|
||||||
|
return acc;
|
||||||
|
}, {} as Record<string, boolean>)
|
||||||
|
},
|
||||||
|
query: "repo_set"
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
case RepoSetExpr: {
|
||||||
|
return {
|
||||||
|
repo_set: {
|
||||||
|
set: value.split(',').reduce((acc, s) => {
|
||||||
|
acc[s.trim()] = true;
|
||||||
|
return acc;
|
||||||
|
}, {} as Record<string, boolean>)
|
||||||
|
},
|
||||||
|
query: "repo_set"
|
||||||
|
};
|
||||||
|
}
|
||||||
|
default:
|
||||||
|
throw new Error(`Unknown prefix type: ${prefixNode.type.name} (id: ${prefixTypeId})`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return transformNode(tree.topNode);
|
||||||
|
}
|
||||||
|
|
||||||
|
const getChildren = (node: SyntaxNode): SyntaxNode[] => {
|
||||||
|
const children: SyntaxNode[] = [];
|
||||||
|
let child = node.firstChild;
|
||||||
|
while (child) {
|
||||||
|
children.push(child);
|
||||||
|
child = child.nextSibling;
|
||||||
|
}
|
||||||
|
return children;
|
||||||
|
}
|
||||||
|
|
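To illustrate the transform above: assuming the grammar joins space-separated expressions with an implicit AND, a query like `repo:linux lang:c panic` (with regex mode off) would roughly produce the following IR. This is an illustrative shape, not output captured from the parser:

```typescript
import { QueryIR } from './ir';

const exampleIR: QueryIR = {
    and: {
        children: [
            { repo: { regexp: 'linux' }, query: 'repo' },
            { language: { language: 'c' }, query: 'language' },
            {
                substring: {
                    pattern: 'panic',
                    case_sensitive: false,
                    file_name: false,
                    content: true,
                },
                query: 'substring',
            },
        ],
    },
    query: 'and',
};
```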
@@ -1,163 +0,0 @@
|
||||||
// @NOTE : Please keep this file in sync with @sourcebot/mcp/src/schemas.ts
|
|
||||||
import { CodeHostType } from "@sourcebot/db";
|
|
||||||
import { z } from "zod";
|
|
||||||
|
|
||||||
export const locationSchema = z.object({
|
|
||||||
// 0-based byte offset from the beginning of the file
|
|
||||||
byteOffset: z.number(),
|
|
||||||
// 1-based line number from the beginning of the file
|
|
||||||
lineNumber: z.number(),
|
|
||||||
// 1-based column number (in runes) from the beginning of line
|
|
||||||
column: z.number(),
|
|
||||||
});
|
|
||||||
|
|
||||||
export const rangeSchema = z.object({
|
|
||||||
start: locationSchema,
|
|
||||||
end: locationSchema,
|
|
||||||
});
|
|
||||||
|
|
||||||
export const symbolSchema = z.object({
|
|
||||||
symbol: z.string(),
|
|
||||||
kind: z.string(),
|
|
||||||
});
|
|
||||||
|
|
||||||
export const searchRequestSchema = z.object({
|
|
||||||
// The zoekt query to execute.
|
|
||||||
query: z.string(),
|
|
||||||
// The number of matches to return.
|
|
||||||
matches: z.number(),
|
|
||||||
// The number of context lines to return.
|
|
||||||
contextLines: z.number().optional(),
|
|
||||||
// Whether to return the whole file as part of the response.
|
|
||||||
whole: z.boolean().optional(),
|
|
||||||
});
|
|
||||||
|
|
||||||
export const repositoryInfoSchema = z.object({
|
|
||||||
id: z.number(),
|
|
||||||
codeHostType: z.nativeEnum(CodeHostType),
|
|
||||||
name: z.string(),
|
|
||||||
displayName: z.string().optional(),
|
|
||||||
webUrl: z.string().optional(),
|
|
||||||
});
|
|
||||||
|
|
||||||
// Many of these fields are defined in zoekt/api.go.
|
|
||||||
export const searchStatsSchema = z.object({
|
|
||||||
// The actual number of matches returned by the search.
|
|
||||||
// This will always be less than or equal to `totalMatchCount`.
|
|
||||||
actualMatchCount: z.number(),
|
|
||||||
|
|
||||||
// The total number of matches found during the search.
|
|
||||||
totalMatchCount: z.number(),
|
|
||||||
|
|
||||||
// The duration (in nanoseconds) of the search.
|
|
||||||
duration: z.number(),
|
|
||||||
|
|
||||||
// Number of files containing a match.
|
|
||||||
fileCount: z.number(),
|
|
||||||
|
|
||||||
// Candidate files whose contents weren't examined because we
|
|
||||||
// gathered enough matches.
|
|
||||||
filesSkipped: z.number(),
|
|
||||||
|
|
||||||
// Amount of I/O for reading contents.
|
|
||||||
contentBytesLoaded: z.number(),
|
|
||||||
|
|
||||||
// Amount of I/O for reading from index.
|
|
||||||
indexBytesLoaded: z.number(),
|
|
||||||
|
|
||||||
// Number of search shards that had a crash.
|
|
||||||
crashes: z.number(),
|
|
||||||
|
|
||||||
// Number of files in shards that we considered.
|
|
||||||
shardFilesConsidered: z.number(),
|
|
||||||
|
|
||||||
// Files that we evaluated. Equivalent to files for which all
|
|
||||||
// atom matches (including negations) evaluated to true.
|
|
||||||
filesConsidered: z.number(),
|
|
||||||
|
|
||||||
// Files for which we loaded file content to verify substring matches
|
|
||||||
filesLoaded: z.number(),
|
|
||||||
|
|
||||||
// Shards that we scanned to find matches.
|
|
||||||
shardsScanned: z.number(),
|
|
||||||
|
|
||||||
// Shards that we did not process because a query was canceled.
|
|
||||||
shardsSkipped: z.number(),
|
|
||||||
|
|
||||||
// Shards that we did not process because the query was rejected by the
|
|
||||||
// ngram filter indicating it had no matches.
|
|
||||||
shardsSkippedFilter: z.number(),
|
|
||||||
|
|
||||||
// Number of candidate matches as a result of searching ngrams.
|
|
||||||
ngramMatches: z.number(),
|
|
||||||
|
|
||||||
// NgramLookups is the number of times we accessed an ngram in the index.
|
|
||||||
ngramLookups: z.number(),
|
|
||||||
|
|
||||||
// Wall clock time for queued search.
|
|
||||||
wait: z.number(),
|
|
||||||
|
|
||||||
// Aggregate wall clock time spent constructing and pruning the match tree.
|
|
||||||
// This accounts for time such as lookups in the trigram index.
|
|
||||||
matchTreeConstruction: z.number(),
|
|
||||||
|
|
||||||
// Aggregate wall clock time spent searching the match tree. This accounts
|
|
||||||
// for the bulk of search work done looking for matches.
|
|
||||||
matchTreeSearch: z.number(),
|
|
||||||
|
|
||||||
// Number of times regexp was called on files that we evaluated.
|
|
||||||
regexpsConsidered: z.number(),
|
|
||||||
|
|
||||||
// FlushReason explains why results were flushed.
|
|
||||||
flushReason: z.number(),
|
|
||||||
});
|
|
||||||
|
|
||||||
export const searchResponseSchema = z.object({
|
|
||||||
stats: searchStatsSchema,
|
|
||||||
files: z.array(z.object({
|
|
||||||
fileName: z.object({
|
|
||||||
// The name of the file
|
|
||||||
text: z.string(),
|
|
||||||
// Any matching ranges
|
|
||||||
matchRanges: z.array(rangeSchema),
|
|
||||||
}),
|
|
||||||
webUrl: z.string().optional(),
|
|
||||||
repository: z.string(),
|
|
||||||
repositoryId: z.number(),
|
|
||||||
language: z.string(),
|
|
||||||
chunks: z.array(z.object({
|
|
||||||
content: z.string(),
|
|
||||||
matchRanges: z.array(rangeSchema),
|
|
||||||
contentStart: locationSchema,
|
|
||||||
symbols: z.array(z.object({
|
|
||||||
...symbolSchema.shape,
|
|
||||||
parent: symbolSchema.optional(),
|
|
||||||
})).optional(),
|
|
||||||
})),
|
|
||||||
branches: z.array(z.string()).optional(),
|
|
||||||
// Set if `whole` is true.
|
|
||||||
content: z.string().optional(),
|
|
||||||
})),
|
|
||||||
repositoryInfo: z.array(repositoryInfoSchema),
|
|
||||||
isBranchFilteringEnabled: z.boolean(),
|
|
||||||
isSearchExhaustive: z.boolean(),
|
|
||||||
__debug_timings: z.record(z.string(), z.number()).optional(),
|
|
||||||
});
|
|
||||||
|
|
||||||
export const fileSourceRequestSchema = z.object({
|
|
||||||
fileName: z.string(),
|
|
||||||
repository: z.string(),
|
|
||||||
branch: z.string().optional(),
|
|
||||||
});
|
|
||||||
|
|
||||||
export const fileSourceResponseSchema = z.object({
|
|
||||||
source: z.string(),
|
|
||||||
language: z.string(),
|
|
||||||
path: z.string(),
|
|
||||||
repository: z.string(),
|
|
||||||
repositoryCodeHostType: z.nativeEnum(CodeHostType),
|
|
||||||
repositoryDisplayName: z.string().optional(),
|
|
||||||
repositoryWebUrl: z.string().optional(),
|
|
||||||
branch: z.string().optional(),
|
|
||||||
webUrl: z.string().optional(),
|
|
||||||
});
|
|
||||||
|
|
@@ -1,429 +1,87 @@
|
||||||
import 'server-only';
|
|
||||||
import { sew } from "@/actions";
|
import { sew } from "@/actions";
|
||||||
|
import { getRepoPermissionFilterForUser } from "@/prisma";
|
||||||
import { withOptionalAuthV2 } from "@/withAuthV2";
|
import { withOptionalAuthV2 } from "@/withAuthV2";
|
||||||
import { PrismaClient, Repo } from "@sourcebot/db";
|
import { PrismaClient, UserWithAccounts } from "@sourcebot/db";
|
||||||
import { base64Decode, createLogger } from "@sourcebot/shared";
|
import { createLogger, env, hasEntitlement } from "@sourcebot/shared";
|
||||||
import { StatusCodes } from "http-status-codes";
|
import { QueryIR } from './ir';
|
||||||
import { ErrorCode } from "../../lib/errorCodes";
|
import { parseQuerySyntaxIntoIR } from './parser';
|
||||||
import { invalidZoektResponse, ServiceError } from "../../lib/serviceError";
|
import { SearchOptions } from "./types";
|
||||||
import { isServiceError, measure } from "../../lib/utils";
|
import { createZoektSearchRequest, zoektSearch, zoektStreamSearch } from './zoektSearcher';
|
||||||
import { SearchRequest, SearchResponse, SourceRange } from "./types";
|
|
||||||
import { zoektFetch } from "./zoektClient";
|
|
||||||
import { ZoektSearchResponse } from "./zoektSchema";
|
|
||||||
|
|
||||||
const logger = createLogger("searchApi");
|
const logger = createLogger("searchApi");
|
||||||
|
|
||||||
// List of supported query prefixes in zoekt.
|
type QueryStringSearchRequest = {
|
||||||
// @see : https://github.com/sourcebot-dev/zoekt/blob/main/query/parse.go#L417
|
queryType: 'string';
|
||||||
enum zoektPrefixes {
|
query: string;
|
||||||
archived = "archived:",
|
options: SearchOptions;
|
||||||
branchShort = "b:",
|
|
||||||
branch = "branch:",
|
|
||||||
caseShort = "c:",
|
|
||||||
case = "case:",
|
|
||||||
content = "content:",
|
|
||||||
fileShort = "f:",
|
|
||||||
file = "file:",
|
|
||||||
fork = "fork:",
|
|
||||||
public = "public:",
|
|
||||||
repoShort = "r:",
|
|
||||||
repo = "repo:",
|
|
||||||
regex = "regex:",
|
|
||||||
lang = "lang:",
|
|
||||||
sym = "sym:",
|
|
||||||
typeShort = "t:",
|
|
||||||
type = "type:",
|
|
||||||
reposet = "reposet:",
|
|
||||||
}
|
}
|
||||||
|
|
||||||
const transformZoektQuery = async (query: string, orgId: number, prisma: PrismaClient): Promise<string | ServiceError> => {
|
type QueryIRSearchRequest = {
|
||||||
const prevQueryParts = query.split(" ");
|
queryType: 'ir';
|
||||||
const newQueryParts = [];
|
query: QueryIR;
|
||||||
|
// Omit options that are specific to query syntax parsing.
|
||||||
for (const part of prevQueryParts) {
|
options: Omit<SearchOptions, 'isRegexEnabled' | 'isCaseSensitivityEnabled'>;
|
||||||
|
|
||||||
// Handle mapping `rev:` and `revision:` to `branch:`
|
|
||||||
if (part.match(/^-?(rev|revision):.+$/)) {
|
|
||||||
const isNegated = part.startsWith("-");
|
|
||||||
let revisionName = part.slice(part.indexOf(":") + 1);
|
|
||||||
|
|
||||||
// Special case: `*` -> search all revisions.
|
|
||||||
// In zoekt, providing a blank string will match all branches.
|
|
||||||
// @see: https://github.com/sourcebot-dev/zoekt/blob/main/eval.go#L560-L562
|
|
||||||
if (revisionName === "*") {
|
|
||||||
revisionName = "";
|
|
||||||
}
|
|
||||||
newQueryParts.push(`${isNegated ? "-" : ""}${zoektPrefixes.branch}${revisionName}`);
|
|
||||||
}
|
}
|
||||||
|
|
||||||
// Expand `context:` into `reposet:` atom.
|
type SearchRequest = QueryStringSearchRequest | QueryIRSearchRequest;
|
||||||
else if (part.match(/^-?context:.+$/)) {
|
|
||||||
const isNegated = part.startsWith("-");
|
|
||||||
const contextName = part.slice(part.indexOf(":") + 1);
|
|
||||||
|
|
||||||
const context = await prisma.searchContext.findUnique({
|
export const search = (request: SearchRequest) => sew(() =>
|
||||||
where: {
|
withOptionalAuthV2(async ({ prisma, user }) => {
|
||||||
name_orgId: {
|
const repoSearchScope = await getAccessibleRepoNamesForUser({ user, prisma });
|
||||||
name: contextName,
|
|
||||||
orgId,
|
// If needed, parse the query syntax into the query intermediate representation.
|
||||||
}
|
const query = request.queryType === 'string' ? await parseQuerySyntaxIntoIR({
|
||||||
},
|
query: request.query,
|
||||||
include: {
|
options: request.options,
|
||||||
repos: true,
|
prisma,
|
||||||
}
|
}) : request.query;
|
||||||
|
|
||||||
|
const zoektSearchRequest = await createZoektSearchRequest({
|
||||||
|
query,
|
||||||
|
options: request.options,
|
||||||
|
repoSearchScope,
|
||||||
});
|
});
|
||||||
|
|
||||||
// If the context doesn't exist, return an error.
|
return zoektSearch(zoektSearchRequest, prisma);
|
||||||
if (!context) {
|
|
||||||
return {
|
|
||||||
errorCode: ErrorCode.SEARCH_CONTEXT_NOT_FOUND,
|
|
||||||
message: `Search context "${contextName}" not found`,
|
|
||||||
statusCode: StatusCodes.NOT_FOUND,
|
|
||||||
} satisfies ServiceError;
|
|
||||||
}
|
|
||||||
|
|
||||||
const names = context.repos.map((repo) => repo.name);
|
|
||||||
newQueryParts.push(`${isNegated ? "-" : ""}${zoektPrefixes.reposet}${names.join(",")}`);
|
|
||||||
}
|
|
||||||
|
|
||||||
// no-op: add the original part to the new query parts.
|
|
||||||
else {
|
|
||||||
newQueryParts.push(part);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return newQueryParts.join(" ");
|
|
||||||
}
|
|
||||||
|
|
||||||
// Extracts a repository file URL from a zoekt template, branch, and file name.
|
|
||||||
const getFileWebUrl = (template: string, branch: string, fileName: string): string | undefined => {
|
|
||||||
// This is a hacky parser for templates generated by
|
|
||||||
// the go text/template package. Example template:
|
|
||||||
// {{URLJoinPath "https://github.com/sourcebot-dev/sourcebot" "blob" .Version .Path}}
|
|
||||||
|
|
||||||
if (!template.match(/^{{URLJoinPath\s.*}}(\?.+)?$/)) {
|
|
||||||
return undefined;
|
|
||||||
}
|
|
||||||
|
|
||||||
const url =
|
|
||||||
template.substring("{{URLJoinPath ".length, template.indexOf("}}"))
|
|
||||||
.split(" ")
|
|
||||||
.map((part) => {
|
|
||||||
// remove wrapping quotes
|
|
||||||
if (part.startsWith("\"")) part = part.substring(1);
|
|
||||||
if (part.endsWith("\"")) part = part.substring(0, part.length - 1);
|
|
||||||
// Replace variable references
|
|
||||||
if (part == ".Version") part = branch;
|
|
||||||
if (part == ".Path") part = fileName;
|
|
||||||
return part;
|
|
||||||
})
|
|
||||||
.join("/");
|
|
||||||
|
|
||||||
const optionalQueryParams =
|
|
||||||
template.substring(template.indexOf("}}") + 2)
|
|
||||||
.replace("{{.Version}}", branch)
|
|
||||||
.replace("{{.Path}}", fileName);
|
|
||||||
|
|
||||||
return encodeURI(url + optionalQueryParams);
|
|
||||||
}
|
|
||||||
|
|
||||||
export const search = async ({ query, matches, contextLines, whole }: SearchRequest): Promise<SearchResponse | ServiceError> => sew(() =>
|
|
||||||
withOptionalAuthV2(async ({ org, prisma }) => {
|
|
||||||
const transformedQuery = await transformZoektQuery(query, org.id, prisma);
|
|
||||||
if (isServiceError(transformedQuery)) {
|
|
||||||
return transformedQuery;
|
|
||||||
}
|
|
||||||
query = transformedQuery;
|
|
||||||
|
|
||||||
const isBranchFilteringEnabled = (
|
|
||||||
query.includes(zoektPrefixes.branch) ||
|
|
||||||
query.includes(zoektPrefixes.branchShort)
|
|
||||||
);
|
|
||||||
|
|
||||||
// We only want to show matches for the default branch when
|
|
||||||
// the user isn't explicitly filtering by branch.
|
|
||||||
if (!isBranchFilteringEnabled) {
|
|
||||||
query = query.concat(` branch:HEAD`);
|
|
||||||
}
|
|
||||||
|
|
||||||
const body = JSON.stringify({
|
|
||||||
q: query,
|
|
||||||
// @see: https://github.com/sourcebot-dev/zoekt/blob/main/api.go#L892
|
|
||||||
opts: {
|
|
||||||
ChunkMatches: true,
|
|
||||||
// @note: Zoekt has several different ways to limit a given search. The two that
|
|
||||||
// we care about are `MaxMatchDisplayCount` and `TotalMaxMatchCount`:
|
|
||||||
// - `MaxMatchDisplayCount` truncates the number of matches AFTER performing
|
|
||||||
// a search (specifically, after collating and sorting the results). The number of
|
|
||||||
// results returned by the API will be less than or equal to this value.
|
|
||||||
//
|
|
||||||
// - `TotalMaxMatchCount` truncates the number of matches DURING a search. The results
// returned by the API can be less than, equal to, or greater than this value.
|
|
||||||
// Why greater? Because this value is compared _after_ a given shard has finished
|
|
||||||
// being processed, the number of matches returned by the last shard may have exceeded
|
|
||||||
// this value.
|
|
||||||
//
|
|
||||||
// Let's define two variables:
|
|
||||||
// - `actualMatchCount` : The number of matches that are returned by the API. This is
|
|
||||||
// always less than or equal to `MaxMatchDisplayCount`.
|
|
||||||
// - `totalMatchCount` : The number of matches that zoekt found before it either
|
|
||||||
// 1) found all matches or 2) hit the `TotalMaxMatchCount` limit. This number is
|
|
||||||
// not bounded and can be less than, equal to, or greater than both `TotalMaxMatchCount`
|
|
||||||
// and `MaxMatchDisplayCount`.
|
|
||||||
//
|
|
||||||
//
|
|
||||||
// Our challenge is to determine whether or not the search returned all possible matches
// (it was exhaustive) or if it was truncated. By setting the `TotalMaxMatchCount` to
|
|
||||||
// `MaxMatchDisplayCount + 1`, we can determine which of these occurred by comparing
|
|
||||||
// `totalMatchCount` to `MaxMatchDisplayCount`.
|
|
||||||
//
|
|
||||||
// if (totalMatchCount ≤ actualMatchCount):
|
|
||||||
// Search is EXHAUSTIVE (found all possible matches)
|
|
||||||
// Proof: totalMatchCount ≤ MaxMatchDisplayCount < TotalMaxMatchCount
|
|
||||||
// Therefore Zoekt stopped naturally, not due to limit
|
|
||||||
//
|
|
||||||
// if (totalMatchCount > actualMatchCount):
|
|
||||||
// Search is TRUNCATED (more matches exist)
|
|
||||||
// Proof: totalMatchCount > MaxMatchDisplayCount + 1 = TotalMaxMatchCount
|
|
||||||
// Therefore Zoekt hit the limit and stopped searching
|
|
||||||
//
|
|
||||||
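// An illustrative example: with matches = 100, MaxMatchDisplayCount = 100 and
// TotalMaxMatchCount = 101. If zoekt finds 80 matches in total, then
// totalMatchCount (80) <= actualMatchCount (80), so the search was exhaustive.
// If zoekt hits the limit, totalMatchCount (>= 101) > actualMatchCount (<= 100),
// so more matches exist and the search was truncated.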
MaxMatchDisplayCount: matches,
|
|
||||||
TotalMaxMatchCount: matches + 1,
|
|
||||||
NumContextLines: contextLines,
|
|
||||||
Whole: !!whole,
|
|
||||||
ShardMaxMatchCount: -1,
|
|
||||||
MaxWallTime: 0, // zoekt expects a duration in nanoseconds
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
let header: Record<string, string> = {};
|
|
||||||
header = {
|
|
||||||
"X-Tenant-ID": org.id.toString()
|
|
||||||
};
|
|
||||||
|
|
||||||
const { data: searchResponse, durationMs: fetchDurationMs } = await measure(
|
|
||||||
() => zoektFetch({
|
|
||||||
path: "/api/search",
|
|
||||||
body,
|
|
||||||
header,
|
|
||||||
method: "POST",
|
|
||||||
}),
|
|
||||||
"zoekt_fetch",
|
|
||||||
false
|
|
||||||
);
|
|
||||||
|
|
||||||
if (!searchResponse.ok) {
|
|
||||||
return invalidZoektResponse(searchResponse);
|
|
||||||
}
|
|
||||||
|
|
||||||
const transformZoektSearchResponse = async ({ Result }: ZoektSearchResponse) => {
|
|
||||||
// @note (2025-05-12): in zoekt, repositories are identified by the `RepositoryID` field
|
|
||||||
// which corresponds to the `id` in the Repo table. In order to efficiently fetch repository
|
|
||||||
// metadata when transforming (potentially thousands) of file matches, we aggregate a unique
|
|
||||||
// set of repository ids* and map them to their corresponding Repo record.
|
|
||||||
//
|
|
||||||
// *Q: Why is `RepositoryID` optional? And why are we falling back to `Repository`?
|
|
||||||
// A: Prior to this change, the repository id was not plumbed into zoekt, so RepositoryID was
|
|
||||||
// always undefined. To make this a non-breaking change, we fallback to using the repository's name
|
|
||||||
// (`Repository`) as the identifier in these cases. This is not guaranteed to be unique, but in
|
|
||||||
// practice it is since the repository name includes the host and path (e.g., 'github.com/org/repo',
|
|
||||||
// 'gitea.com/org/repo', etc.).
|
|
||||||
//
|
|
||||||
// Note: When a repository is re-indexed (every hour) this ID will be populated.
|
|
||||||
// @see: https://github.com/sourcebot-dev/zoekt/pull/6
|
|
||||||
const repoIdentifiers = new Set(Result.Files?.map((file) => file.RepositoryID ?? file.Repository) ?? []);
|
|
||||||
const repos = new Map<string | number, Repo>();
|
|
||||||
|
|
||||||
(await prisma.repo.findMany({
|
|
||||||
where: {
|
|
||||||
id: {
|
|
||||||
in: Array.from(repoIdentifiers).filter((id) => typeof id === "number"),
|
|
||||||
},
|
|
||||||
orgId: org.id,
|
|
||||||
}
|
|
||||||
})).forEach(repo => repos.set(repo.id, repo));
|
|
||||||
|
|
||||||
(await prisma.repo.findMany({
|
|
||||||
where: {
|
|
||||||
name: {
|
|
||||||
in: Array.from(repoIdentifiers).filter((id) => typeof id === "string"),
|
|
||||||
},
|
|
||||||
orgId: org.id,
|
|
||||||
}
|
|
||||||
})).forEach(repo => repos.set(repo.name, repo));
|
|
||||||
|
|
||||||
const files = Result.Files?.map((file) => {
|
|
||||||
const fileNameChunks = file.ChunkMatches.filter((chunk) => chunk.FileName);
|
|
||||||
|
|
||||||
const webUrl = (() => {
|
|
||||||
const template: string | undefined = Result.RepoURLs[file.Repository];
|
|
||||||
if (!template) {
|
|
||||||
return undefined;
|
|
||||||
}
|
|
||||||
|
|
||||||
// If there are multiple branches pointing to the same revision of this file, it doesn't
|
|
||||||
// matter which branch we use here, so use the first one.
|
|
||||||
const branch = file.Branches && file.Branches.length > 0 ? file.Branches[0] : "HEAD";
|
|
||||||
return getFileWebUrl(template, branch, file.FileName);
|
|
||||||
})();
|
|
||||||
|
|
||||||
const identifier = file.RepositoryID ?? file.Repository;
|
|
||||||
const repo = repos.get(identifier);
|
|
||||||
|
|
||||||
// This can happen if the user doesn't have access to the repository.
|
|
||||||
if (!repo) {
|
|
||||||
return undefined;
|
|
||||||
}
|
|
||||||
|
|
||||||
return {
|
|
||||||
fileName: {
|
|
||||||
text: file.FileName,
|
|
||||||
matchRanges: fileNameChunks.length === 1 ? fileNameChunks[0].Ranges.map((range) => ({
|
|
||||||
start: {
|
|
||||||
byteOffset: range.Start.ByteOffset,
|
|
||||||
column: range.Start.Column,
|
|
||||||
lineNumber: range.Start.LineNumber,
|
|
||||||
},
|
|
||||||
end: {
|
|
||||||
byteOffset: range.End.ByteOffset,
|
|
||||||
column: range.End.Column,
|
|
||||||
lineNumber: range.End.LineNumber,
|
|
||||||
}
|
|
||||||
})) : [],
|
|
||||||
},
|
|
||||||
repository: repo.name,
|
|
||||||
repositoryId: repo.id,
|
|
||||||
webUrl: webUrl,
|
|
||||||
language: file.Language,
|
|
||||||
chunks: file.ChunkMatches
|
|
||||||
.filter((chunk) => !chunk.FileName) // Filter out filename chunks.
|
|
||||||
.map((chunk) => {
|
|
||||||
return {
|
|
||||||
content: base64Decode(chunk.Content),
|
|
||||||
matchRanges: chunk.Ranges.map((range) => ({
|
|
||||||
start: {
|
|
||||||
byteOffset: range.Start.ByteOffset,
|
|
||||||
column: range.Start.Column,
|
|
||||||
lineNumber: range.Start.LineNumber,
|
|
||||||
},
|
|
||||||
end: {
|
|
||||||
byteOffset: range.End.ByteOffset,
|
|
||||||
column: range.End.Column,
|
|
||||||
lineNumber: range.End.LineNumber,
|
|
||||||
}
|
|
||||||
}) satisfies SourceRange),
|
|
||||||
contentStart: {
|
|
||||||
byteOffset: chunk.ContentStart.ByteOffset,
|
|
||||||
column: chunk.ContentStart.Column,
|
|
||||||
lineNumber: chunk.ContentStart.LineNumber,
|
|
||||||
},
|
|
||||||
symbols: chunk.SymbolInfo?.map((symbol) => {
|
|
||||||
return {
|
|
||||||
symbol: symbol.Sym,
|
|
||||||
kind: symbol.Kind,
|
|
||||||
parent: symbol.Parent.length > 0 ? {
|
|
||||||
symbol: symbol.Parent,
|
|
||||||
kind: symbol.ParentKind,
|
|
||||||
} : undefined,
|
|
||||||
}
|
|
||||||
}) ?? undefined,
|
|
||||||
}
|
|
||||||
}),
|
|
||||||
branches: file.Branches,
|
|
||||||
content: file.Content ? base64Decode(file.Content) : undefined,
|
|
||||||
}
|
|
||||||
}).filter((file) => file !== undefined) ?? [];
|
|
||||||
|
|
||||||
const actualMatchCount = files.reduce(
|
|
||||||
(acc, file) =>
|
|
||||||
// Match count is the sum of the number of chunk matches and file name matches.
|
|
||||||
acc + file.chunks.reduce(
|
|
||||||
(acc, chunk) => acc + chunk.matchRanges.length,
|
|
||||||
0,
|
|
||||||
) + file.fileName.matchRanges.length,
|
|
||||||
0,
|
|
||||||
);
|
|
||||||
|
|
||||||
const totalMatchCount = Result.MatchCount;
|
|
||||||
const isSearchExhaustive = totalMatchCount <= actualMatchCount;
|
|
||||||
|
|
||||||
return {
|
|
||||||
files,
|
|
||||||
repositoryInfo: Array.from(repos.values()).map((repo) => ({
|
|
||||||
id: repo.id,
|
|
||||||
codeHostType: repo.external_codeHostType,
|
|
||||||
name: repo.name,
|
|
||||||
displayName: repo.displayName ?? undefined,
|
|
||||||
webUrl: repo.webUrl ?? undefined,
|
|
||||||
})),
|
|
||||||
isBranchFilteringEnabled,
|
|
||||||
isSearchExhaustive,
|
|
||||||
stats: {
|
|
||||||
actualMatchCount,
|
|
||||||
totalMatchCount,
|
|
||||||
duration: Result.Duration,
|
|
||||||
fileCount: Result.FileCount,
|
|
||||||
filesSkipped: Result.FilesSkipped,
|
|
||||||
contentBytesLoaded: Result.ContentBytesLoaded,
|
|
||||||
indexBytesLoaded: Result.IndexBytesLoaded,
|
|
||||||
crashes: Result.Crashes,
|
|
||||||
shardFilesConsidered: Result.ShardFilesConsidered,
|
|
||||||
filesConsidered: Result.FilesConsidered,
|
|
||||||
filesLoaded: Result.FilesLoaded,
|
|
||||||
shardsScanned: Result.ShardsScanned,
|
|
||||||
shardsSkipped: Result.ShardsSkipped,
|
|
||||||
shardsSkippedFilter: Result.ShardsSkippedFilter,
|
|
||||||
ngramMatches: Result.NgramMatches,
|
|
||||||
ngramLookups: Result.NgramLookups,
|
|
||||||
wait: Result.Wait,
|
|
||||||
matchTreeConstruction: Result.MatchTreeConstruction,
|
|
||||||
matchTreeSearch: Result.MatchTreeSearch,
|
|
||||||
regexpsConsidered: Result.RegexpsConsidered,
|
|
||||||
flushReason: Result.FlushReason,
|
|
||||||
}
|
|
||||||
} satisfies SearchResponse;
|
|
||||||
}
|
|
||||||
|
|
||||||
const { data: rawZoektResponse, durationMs: parseJsonDurationMs } = await measure(
|
|
||||||
() => searchResponse.json(),
|
|
||||||
"parse_json",
|
|
||||||
false
|
|
||||||
);
|
|
||||||
|
|
||||||
// @note: We do not use zod parseAsync here since in cases where the
|
|
||||||
// response is large (> 40MB), there can be significant performance issues.
|
|
||||||
const zoektResponse = rawZoektResponse as ZoektSearchResponse;
|
|
||||||
|
|
||||||
const { data: response, durationMs: transformZoektResponseDurationMs } = await measure(
|
|
||||||
() => transformZoektSearchResponse(zoektResponse),
|
|
||||||
"transform_zoekt_response",
|
|
||||||
false
|
|
||||||
);
|
|
||||||
|
|
||||||
const totalDurationMs = fetchDurationMs + parseJsonDurationMs + transformZoektResponseDurationMs;
|
|
||||||
|
|
||||||
// Debug log: timing breakdown
|
|
||||||
const timings = [
|
|
||||||
{ name: "zoekt_fetch", duration: fetchDurationMs },
|
|
||||||
{ name: "parse_json", duration: parseJsonDurationMs },
|
|
||||||
{ name: "transform_zoekt_response", duration: transformZoektResponseDurationMs },
|
|
||||||
];
|
|
||||||
|
|
||||||
logger.debug(`Search timing breakdown (query: "${query}"):`);
|
|
||||||
timings.forEach(({ name, duration }) => {
|
|
||||||
const percentage = ((duration / totalDurationMs) * 100).toFixed(1);
|
|
||||||
const durationStr = duration.toFixed(2).padStart(8);
|
|
||||||
const percentageStr = percentage.padStart(5);
|
|
||||||
logger.debug(` ${name.padEnd(25)} ${durationStr}ms (${percentageStr}%)`);
|
|
||||||
});
|
|
||||||
logger.debug(` ${"TOTAL".padEnd(25)} ${totalDurationMs.toFixed(2).padStart(8)}ms (100.0%)`);
|
|
||||||
|
|
||||||
return {
|
|
||||||
...response,
|
|
||||||
__debug_timings: {
|
|
||||||
zoekt_fetch: fetchDurationMs,
|
|
||||||
parse_json: parseJsonDurationMs,
|
|
||||||
transform_zoekt_response: transformZoektResponseDurationMs,
|
|
||||||
}
|
|
||||||
} satisfies SearchResponse;
|
|
||||||
}));
|
}));
|
||||||
|
|
||||||
|
export const streamSearch = (request: SearchRequest) => sew(() =>
|
||||||
|
withOptionalAuthV2(async ({ prisma, user }) => {
|
||||||
|
const repoSearchScope = await getAccessibleRepoNamesForUser({ user, prisma });
|
||||||
|
|
||||||
|
// If needed, parse the query syntax into the query intermediate representation.
|
||||||
|
const query = request.queryType === 'string' ? await parseQuerySyntaxIntoIR({
|
||||||
|
query: request.query,
|
||||||
|
options: request.options,
|
||||||
|
prisma,
|
||||||
|
}) : request.query;
|
||||||
|
|
||||||
|
const zoektSearchRequest = await createZoektSearchRequest({
|
||||||
|
query,
|
||||||
|
options: request.options,
|
||||||
|
repoSearchScope,
|
||||||
|
});
|
||||||
|
|
||||||
|
return zoektStreamSearch(zoektSearchRequest, prisma);
|
||||||
|
}));
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Returns a list of repository names that the user has access to.
|
||||||
|
* If permission syncing is disabled, returns undefined.
|
||||||
|
*/
|
||||||
|
const getAccessibleRepoNamesForUser = async ({ user, prisma }: { user?: UserWithAccounts, prisma: PrismaClient }) => {
|
||||||
|
if (
|
||||||
|
env.EXPERIMENT_EE_PERMISSION_SYNC_ENABLED !== 'true' ||
|
||||||
|
!hasEntitlement('permission-syncing')
|
||||||
|
) {
|
||||||
|
return undefined;
|
||||||
|
}
|
||||||
|
|
||||||
|
const accessibleRepos = await prisma.repo.findMany({
|
||||||
|
where: getRepoPermissionFilterForUser(user),
|
||||||
|
select: {
|
||||||
|
name: true,
|
||||||
|
}
|
||||||
|
});
|
||||||
|
return accessibleRepos.map(repo => repo.name);
|
||||||
|
}
|
||||||
|
|
|
||||||
|
|
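For comparison with the string path, a caller that already holds a `QueryIR` can use the `'ir'` variant and skip the query-language parser entirely. Continuing the earlier usage sketch (same assumed imports, plus `isServiceError`; hypothetical caller with error handling trimmed):

```typescript
const result = await search({
    queryType: 'ir',
    query: {
        regexp: {
            regexp: 'streamSearch',
            case_sensitive: true,
            file_name: false,
            content: true,
        },
        query: 'regexp',
    },
    options: {
        matches: 50,
        contextLines: 2,
    },
});

if (!isServiceError(result)) {
    console.log(`matched ${result.stats.actualMatchCount} times across ${result.files.length} files`);
}
```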
@@ -1,27 +1,164 @@
|
||||||
// @NOTE : Please keep this file in sync with @sourcebot/mcp/src/types.ts
|
import { CodeHostType } from "@sourcebot/db";
|
||||||
import {
|
|
||||||
fileSourceResponseSchema,
|
|
||||||
locationSchema,
|
|
||||||
// The previous implementation imported its zod schemas (searchRequestSchema, searchResponseSchema,
// rangeSchema, fileSourceRequestSchema, symbolSchema, repositoryInfoSchema, searchStatsSchema, ...)
// from "./schemas" and re-exported the inferred types (SearchResultLocation is now SourceLocation).
// The schemas are now defined inline:

import { z } from "zod";
import { serviceErrorSchema } from "@/lib/serviceError";

export const locationSchema = z.object({
    byteOffset: z.number(), // 0-based byte offset from the beginning of the file
    lineNumber: z.number(), // 1-based line number from the beginning of the file
    column: z.number(), // 1-based column number (in runes) from the beginning of line
});
export type SourceLocation = z.infer<typeof locationSchema>;

export const rangeSchema = z.object({
    start: locationSchema,
    end: locationSchema,
});
export type SourceRange = z.infer<typeof rangeSchema>;

export const symbolSchema = z.object({
    symbol: z.string(),
    kind: z.string(),
});
export type SearchSymbol = z.infer<typeof symbolSchema>;

export const repositoryInfoSchema = z.object({
    id: z.number(),
    codeHostType: z.nativeEnum(CodeHostType),
    name: z.string(),
    displayName: z.string().optional(),
    webUrl: z.string().optional(),
});
export type RepositoryInfo = z.infer<typeof repositoryInfoSchema>;

// @note: Many of these fields are defined in zoekt/api.go.
export const searchStatsSchema = z.object({
    actualMatchCount: z.number(), // The actual number of matches returned by the search. This will always be less than or equal to `totalMatchCount`.
    totalMatchCount: z.number(), // The total number of matches found during the search.
    duration: z.number(), // The duration (in nanoseconds) of the search.
    fileCount: z.number(), // Number of files containing a match.
    filesSkipped: z.number(), // Candidate files whose contents weren't examined because we gathered enough matches.
    contentBytesLoaded: z.number(), // Amount of I/O for reading contents.
    indexBytesLoaded: z.number(), // Amount of I/O for reading from index.
    crashes: z.number(), // Number of search shards that had a crash.
    shardFilesConsidered: z.number(), // Number of files in shards that we considered.
    filesConsidered: z.number(), // Files that we evaluated. Equivalent to files for which all atom matches (including negations) evaluated to true.
    filesLoaded: z.number(), // Files for which we loaded file content to verify substring matches
    shardsScanned: z.number(), // Shards that we scanned to find matches.
    shardsSkipped: z.number(), // Shards that we did not process because a query was canceled.
    shardsSkippedFilter: z.number(), // Shards that we did not process because the query was rejected by the ngram filter indicating it had no matches.
    ngramMatches: z.number(), // Number of candidate matches as a result of searching ngrams.
    ngramLookups: z.number(), // NgramLookups is the number of times we accessed an ngram in the index.
    wait: z.number(), // Wall clock time for queued search.
    matchTreeConstruction: z.number(), // Aggregate wall clock time spent constructing and pruning the match tree. This accounts for time such as lookups in the trigram index.
    matchTreeSearch: z.number(), // Aggregate wall clock time spent searching the match tree. This accounts for the bulk of search work done looking for matches.
    regexpsConsidered: z.number(), // Number of times regexp was called on files that we evaluated.
    flushReason: z.string(), // FlushReason explains why results were flushed.
});
export type SearchStats = z.infer<typeof searchStatsSchema>;

export const searchFileSchema = z.object({
    fileName: z.object({
        // The name of the file
        text: z.string(),
        // Any matching ranges
        matchRanges: z.array(rangeSchema),
    }),
    webUrl: z.string().optional(),
    repository: z.string(),
    repositoryId: z.number(),
    language: z.string(),
    chunks: z.array(z.object({
        content: z.string(),
        matchRanges: z.array(rangeSchema),
        contentStart: locationSchema,
        symbols: z.array(z.object({
            ...symbolSchema.shape,
            parent: symbolSchema.optional(),
        })).optional(),
    })),
    branches: z.array(z.string()).optional(),
    // Set if `whole` is true.
    content: z.string().optional(),
});
export type SearchResultFile = z.infer<typeof searchFileSchema>;
export type SearchResultChunk = SearchResultFile["chunks"][number];

export const searchOptionsSchema = z.object({
    matches: z.number(), // The number of matches to return.
    contextLines: z.number().optional(), // The number of context lines to return.
    whole: z.boolean().optional(), // Whether to return the whole file as part of the response.
    isRegexEnabled: z.boolean().optional(), // Whether to enable regular expression search.
    isCaseSensitivityEnabled: z.boolean().optional(), // Whether to enable case sensitivity.
});
export type SearchOptions = z.infer<typeof searchOptionsSchema>;

export const searchRequestSchema = z.object({
    query: z.string(), // The zoekt query to execute.
    ...searchOptionsSchema.shape,
});
export type SearchRequest = z.infer<typeof searchRequestSchema>;

export const searchResponseSchema = z.object({
    stats: searchStatsSchema,
    files: z.array(searchFileSchema),
    repositoryInfo: z.array(repositoryInfoSchema),
    isSearchExhaustive: z.boolean(),
});
export type SearchResponse = z.infer<typeof searchResponseSchema>;

/**
 * Sent after each chunk of results is processed.
 */
export const streamedSearchChunkResponseSchema = z.object({
    type: z.literal('chunk'),
    stats: searchStatsSchema,
    files: z.array(searchFileSchema),
    repositoryInfo: z.array(repositoryInfoSchema),
});
export type StreamedSearchChunkResponse = z.infer<typeof streamedSearchChunkResponseSchema>;

/**
 * Sent after the search is complete.
 */
export const streamedSearchFinalResponseSchema = z.object({
    type: z.literal('final'),
    accumulatedStats: searchStatsSchema,
    isSearchExhaustive: z.boolean(),
});
export type StreamedSearchFinalResponse = z.infer<typeof streamedSearchFinalResponseSchema>;

/**
 * Sent when an error occurs during streaming.
 */
export const streamedSearchErrorResponseSchema = z.object({
    type: z.literal('error'),
    error: serviceErrorSchema,
});
export type StreamedSearchErrorResponse = z.infer<typeof streamedSearchErrorResponseSchema>;

export const streamedSearchResponseSchema = z.discriminatedUnion('type', [
    streamedSearchChunkResponseSchema,
    streamedSearchFinalResponseSchema,
    streamedSearchErrorResponseSchema,
]);
export type StreamedSearchResponse = z.infer<typeof streamedSearchResponseSchema>;

export const fileSourceRequestSchema = z.object({
    fileName: z.string(),
    repository: z.string(),
    branch: z.string().optional(),
});
export type FileSourceRequest = z.infer<typeof fileSourceRequestSchema>;

export const fileSourceResponseSchema = z.object({
    source: z.string(),
    language: z.string(),
    path: z.string(),
    repository: z.string(),
    repositoryCodeHostType: z.nativeEnum(CodeHostType),
    repositoryDisplayName: z.string().optional(),
    repositoryWebUrl: z.string().optional(),
    branch: z.string().optional(),
    webUrl: z.string().optional(),
});
export type FileSourceResponse = z.infer<typeof fileSourceResponseSchema>;
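For context, each streamed response is delivered as a standard SSE `data:` event and the stream is terminated with a `[DONE]` sentinel (see zoektSearcher.ts below). A minimal sketch of parsing a single event with the discriminated union above; the import path is an assumption, not part of this diff:

import { streamedSearchResponseSchema, StreamedSearchResponse } from "./types"; // path assumed

// Parses one SSE `data:` payload into a typed streamed response.
// Returns undefined for the `[DONE]` sentinel that terminates the stream.
export const parseStreamedSearchEvent = (payload: string): StreamedSearchResponse | undefined => {
    if (payload === '[DONE]') {
        return undefined;
    }
    return streamedSearchResponseSchema.parse(JSON.parse(payload));
}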
@@ -1,34 +0,0 @@ (deleted file)
import { env } from "@sourcebot/shared";

interface ZoektRequest {
    path: string,
    body: string,
    method: string,
    header?: Record<string, string>,
    cache?: RequestCache,
}

export const zoektFetch = async ({
    path,
    body,
    method,
    header,
    cache,
}: ZoektRequest) => {
    const response = await fetch(
        new URL(path, env.ZOEKT_WEBSERVER_URL),
        {
            method,
            headers: {
                ...header,
                "Content-Type": "application/json",
            },
            body,
            cache,
        }
    );

    // @todo : add metrics

    return response;
}
@@ -1,135 +0,0 @@ (deleted file)
import { z } from "zod";

// @see : https://github.com/sourcebot-dev/zoekt/blob/main/api.go#L212
export const zoektLocationSchema = z.object({
    // 0-based byte offset from the beginning of the file
    ByteOffset: z.number(),
    // 1-based line number from the beginning of the file
    LineNumber: z.number(),
    // 1-based column number (in runes) from the beginning of line
    Column: z.number(),
});

export const zoektRangeSchema = z.object({
    Start: zoektLocationSchema,
    End: zoektLocationSchema,
});

// @see : https://github.com/sourcebot-dev/zoekt/blob/3780e68cdb537d5a7ed2c84d9b3784f80c7c5d04/api.go#L350
export const zoektSearchResponseStats = {
    ContentBytesLoaded: z.number(),
    IndexBytesLoaded: z.number(),
    Crashes: z.number(),
    Duration: z.number(),
    FileCount: z.number(),
    ShardFilesConsidered: z.number(),
    FilesConsidered: z.number(),
    FilesLoaded: z.number(),
    FilesSkipped: z.number(),
    ShardsScanned: z.number(),
    ShardsSkipped: z.number(),
    ShardsSkippedFilter: z.number(),
    MatchCount: z.number(),
    NgramMatches: z.number(),
    NgramLookups: z.number(),
    Wait: z.number(),
    MatchTreeConstruction: z.number(),
    MatchTreeSearch: z.number(),
    RegexpsConsidered: z.number(),
    FlushReason: z.number(),
}

export const zoektSymbolSchema = z.object({
    Sym: z.string(),
    Kind: z.string(),
    Parent: z.string(),
    ParentKind: z.string(),
});

// @see : https://github.com/sourcebot-dev/zoekt/blob/3780e68cdb537d5a7ed2c84d9b3784f80c7c5d04/api.go#L497
export const zoektSearchResponseSchema = z.object({
    Result: z.object({
        ...zoektSearchResponseStats,
        Files: z.array(z.object({
            FileName: z.string(),
            Repository: z.string(),
            RepositoryID: z.number().optional(),
            Version: z.string().optional(),
            Language: z.string(),
            Branches: z.array(z.string()).optional(),
            ChunkMatches: z.array(z.object({
                Content: z.string(),
                Ranges: z.array(zoektRangeSchema),
                FileName: z.boolean(),
                ContentStart: zoektLocationSchema,
                Score: z.number(),
                SymbolInfo: z.array(zoektSymbolSchema).nullable(),
            })),
            Checksum: z.string(),
            Score: z.number(),
            // Set if `whole` is true.
            Content: z.string().optional(),
        })).nullable(),
        RepoURLs: z.record(z.string(), z.string()),
    }),
});

export type ZoektSearchResponse = z.infer<typeof zoektSearchResponseSchema>;

// @see : https://github.com/sourcebot-dev/zoekt/blob/3780e68cdb537d5a7ed2c84d9b3784f80c7c5d04/api.go#L728
const zoektRepoStatsSchema = z.object({
    Repos: z.number(),
    Shards: z.number(),
    Documents: z.number(),
    IndexBytes: z.number(),
    ContentBytes: z.number(),
    NewLinesCount: z.number(),
    DefaultBranchNewLinesCount: z.number(),
    OtherBranchesNewLinesCount: z.number(),
});

// @see : https://github.com/sourcebot-dev/zoekt/blob/3780e68cdb537d5a7ed2c84d9b3784f80c7c5d04/api.go#L716
const zoektIndexMetadataSchema = z.object({
    IndexFormatVersion: z.number(),
    IndexFeatureVersion: z.number(),
    IndexMinReaderVersion: z.number(),
    IndexTime: z.string(),
    PlainASCII: z.boolean(),
    LanguageMap: z.record(z.string(), z.number()),
    ZoektVersion: z.string(),
    ID: z.string(),
});

// @see : https://github.com/sourcebot-dev/zoekt/blob/3780e68cdb537d5a7ed2c84d9b3784f80c7c5d04/api.go#L555
export const zoektRepositorySchema = z.object({
    Name: z.string(),
    URL: z.string(),
    Source: z.string(),
    Branches: z.array(z.object({
        Name: z.string(),
        Version: z.string(),
    })).nullable(),
    CommitURLTemplate: z.string(),
    FileURLTemplate: z.string(),
    LineFragmentTemplate: z.string(),
    RawConfig: z.record(z.string(), z.string()).nullable(),
    Rank: z.number(),
    IndexOptions: z.string(),
    HasSymbols: z.boolean(),
    Tombstone: z.boolean(),
    LatestCommitDate: z.string(),
    FileTombstones: z.string().optional(),
});

export const zoektListRepositoriesResponseSchema = z.object({
    List: z.object({
        Repos: z.array(z.object({
            Repository: zoektRepositorySchema,
            IndexMetadata: zoektIndexMetadataSchema,
            Stats: zoektRepoStatsSchema,
        })),
        Stats: zoektRepoStatsSchema,
    })
});
packages/web/src/features/search/zoektSearcher.ts (new file, 579 lines)
@@ -0,0 +1,579 @@
import { getCodeHostBrowseFileAtBranchUrl } from "@/lib/utils";
import { unexpectedError } from "@/lib/serviceError";
import type { ProtoGrpcType } from '@/proto/webserver';
import { FileMatch__Output as ZoektGrpcFileMatch } from "@/proto/zoekt/webserver/v1/FileMatch";
import { FlushReason as ZoektGrpcFlushReason } from "@/proto/zoekt/webserver/v1/FlushReason";
import { Range__Output as ZoektGrpcRange } from "@/proto/zoekt/webserver/v1/Range";
import type { SearchRequest as ZoektGrpcSearchRequest } from '@/proto/zoekt/webserver/v1/SearchRequest';
import { SearchResponse__Output as ZoektGrpcSearchResponse } from "@/proto/zoekt/webserver/v1/SearchResponse";
import { StreamSearchRequest as ZoektGrpcStreamSearchRequest } from "@/proto/zoekt/webserver/v1/StreamSearchRequest";
import { StreamSearchResponse__Output as ZoektGrpcStreamSearchResponse } from "@/proto/zoekt/webserver/v1/StreamSearchResponse";
import { WebserverServiceClient } from '@/proto/zoekt/webserver/v1/WebserverService';
import * as grpc from '@grpc/grpc-js';
import * as protoLoader from '@grpc/proto-loader';
import * as Sentry from '@sentry/nextjs';
import { PrismaClient, Repo } from "@sourcebot/db";
import { createLogger, env } from "@sourcebot/shared";
import path from 'path';
import { isBranchQuery, QueryIR, someInQueryIR } from './ir';
import { RepositoryInfo, SearchResponse, SearchResultFile, SearchStats, SourceRange, StreamedSearchErrorResponse, StreamedSearchResponse } from "./types";

const logger = createLogger("zoekt-searcher");

/**
 * Creates a ZoektGrpcSearchRequest given a query IR.
 */
export const createZoektSearchRequest = async ({
    query,
    options,
    repoSearchScope,
}: {
    query: QueryIR;
    options: {
        matches: number,
        contextLines?: number,
        whole?: boolean,
    };
    // Allows the caller to scope the search to a specific set of repositories.
    repoSearchScope?: string[];
}) => {
    // Find if there are any `rev:` filters in the query.
    const containsRevExpression = someInQueryIR(query, (q) => isBranchQuery(q));

    const zoektSearchRequest: ZoektGrpcSearchRequest = {
        query: {
            and: {
                children: [
                    query,
                    // If the query does not contain a `rev:` filter, we default to searching `HEAD`.
                    ...(!containsRevExpression ? [{
                        branch: {
                            pattern: 'HEAD',
                            exact: true,
                        }
                    }] : []),
                    ...(repoSearchScope ? [{
                        repo_set: {
                            set: repoSearchScope.reduce((acc, repo) => {
                                acc[repo] = true;
                                return acc;
                            }, {} as Record<string, boolean>)
                        }
                    }] : []),
                ]
            }
        },
        opts: {
            chunk_matches: true,
            // @note: Zoekt has several different ways to limit a given search. The two that
            // we care about are `MaxMatchDisplayCount` and `TotalMaxMatchCount`:
            // - `MaxMatchDisplayCount` truncates the number of matches AFTER performing
            //   a search (specifically, after collating and sorting the results). The number of
            //   results returned by the API will be less than or equal to this value.
            //
            // - `TotalMaxMatchCount` truncates the number of matches DURING a search. The results
            //   returned by the API can be less than, equal to, or greater than this value.
            //   Why greater? Because this value is compared _after_ a given shard has finished
            //   being processed, the number of matches returned by the last shard may have exceeded
            //   this value.
            //
            // Let's define two variables:
            // - `actualMatchCount` : The number of matches that are returned by the API. This is
            //   always less than or equal to `MaxMatchDisplayCount`.
            // - `totalMatchCount` : The number of matches that zoekt found before it either
            //   1) found all matches or 2) hit the `TotalMaxMatchCount` limit. This number is
            //   not bounded and can be less than, equal to, or greater than both `TotalMaxMatchCount`
            //   and `MaxMatchDisplayCount`.
            //
            // Our challenge is to determine whether or not the search returned all possible matches
            // (it was exhaustive) or if it was truncated. By setting the `TotalMaxMatchCount` to
            // `MaxMatchDisplayCount + 1`, we can determine which of these occurred by comparing
            // `totalMatchCount` to `MaxMatchDisplayCount`.
            //
            // if (totalMatchCount ≤ actualMatchCount):
            //     Search is EXHAUSTIVE (found all possible matches)
            //     Proof: totalMatchCount ≤ MaxMatchDisplayCount < TotalMaxMatchCount
            //     Therefore Zoekt stopped naturally, not due to limit
            //
            // if (totalMatchCount > actualMatchCount):
            //     Search is TRUNCATED (more matches exist)
            //     Proof: totalMatchCount > MaxMatchDisplayCount + 1 = TotalMaxMatchCount
            //     Therefore Zoekt hit the limit and stopped searching
            //
            max_match_display_count: options.matches,
            total_max_match_count: options.matches + 1,
            num_context_lines: options.contextLines ?? 0,
            whole: !!options.whole,
            shard_max_match_count: -1,
            max_wall_time: {
                seconds: 0,
            }
        },
    };

    return zoektSearchRequest;
}
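The limit arithmetic above is what the exhaustiveness flag relies on downstream. A small illustrative sketch (not part of the diff) of the comparison that zoektSearch and zoektStreamSearch perform on the resulting stats:

const isSearchExhaustive = (stats: { totalMatchCount: number, actualMatchCount: number }) =>
    stats.totalMatchCount <= stats.actualMatchCount;

// With options.matches = 100, zoekt is asked for at most 101 total matches:
//   totalMatchCount = 42,  actualMatchCount = 42  -> exhaustive (zoekt stopped naturally)
//   totalMatchCount = 101, actualMatchCount = 100 -> truncated (the limit was hit)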

export const zoektSearch = async (searchRequest: ZoektGrpcSearchRequest, prisma: PrismaClient): Promise<SearchResponse> => {
    const client = createGrpcClient();
    const metadata = new grpc.Metadata();

    return new Promise((resolve, reject) => {
        client.Search(searchRequest, metadata, (error, response) => {
            if (error || !response) {
                reject(error || new Error('No response received'));
                return;
            }

            (async () => {
                try {
                    const reposMapCache = await createReposMapForChunk(response, new Map<string | number, Repo>(), prisma);
                    const { stats, files, repositoryInfo } = await transformZoektSearchResponse(response, reposMapCache);

                    resolve({
                        stats,
                        files,
                        repositoryInfo,
                        isSearchExhaustive: stats.totalMatchCount <= stats.actualMatchCount,
                    } satisfies SearchResponse);
                } catch (err) {
                    reject(err);
                }
            })();
        });
    });
}

export const zoektStreamSearch = async (searchRequest: ZoektGrpcSearchRequest, prisma: PrismaClient): Promise<ReadableStream> => {
    const client = createGrpcClient();
    let grpcStream: ReturnType<WebserverServiceClient['StreamSearch']> | null = null;
    let isStreamActive = true;
    let pendingChunks = 0;
    let accumulatedStats: SearchStats = {
        actualMatchCount: 0,
        totalMatchCount: 0,
        duration: 0,
        fileCount: 0,
        filesSkipped: 0,
        contentBytesLoaded: 0,
        indexBytesLoaded: 0,
        crashes: 0,
        shardFilesConsidered: 0,
        filesConsidered: 0,
        filesLoaded: 0,
        shardsScanned: 0,
        shardsSkipped: 0,
        shardsSkippedFilter: 0,
        ngramMatches: 0,
        ngramLookups: 0,
        wait: 0,
        matchTreeConstruction: 0,
        matchTreeSearch: 0,
        regexpsConsidered: 0,
        flushReason: ZoektGrpcFlushReason.FLUSH_REASON_UNKNOWN_UNSPECIFIED,
    };

    return new ReadableStream({
        async start(controller) {
            const tryCloseController = () => {
                if (!isStreamActive && pendingChunks === 0) {
                    const finalResponse: StreamedSearchResponse = {
                        type: 'final',
                        accumulatedStats,
                        isSearchExhaustive: accumulatedStats.totalMatchCount <= accumulatedStats.actualMatchCount,
                    }

                    controller.enqueue(encodeSSEREsponseChunk(finalResponse));
                    controller.enqueue(encodeSSEREsponseChunk('[DONE]'));
                    controller.close();
                    client.close();
                    logger.debug('SSE stream closed');
                }
            };

            try {
                const metadata = new grpc.Metadata();

                const streamRequest: ZoektGrpcStreamSearchRequest = {
                    request: searchRequest,
                };

                grpcStream = client.StreamSearch(streamRequest, metadata);

                // `_reposMapCache` is used to cache repository metadata across all chunks.
                // This reduces the number of database queries required to transform file matches.
                const _reposMapCache = new Map<string | number, Repo>();

                // Handle incoming data chunks
                grpcStream.on('data', async (chunk: ZoektGrpcStreamSearchResponse) => {
                    if (!isStreamActive) {
                        logger.debug('SSE stream closed, skipping chunk');
                        return;
                    }

                    // Track that we're processing a chunk
                    pendingChunks++;

                    // grpcStream.on doesn't actually await on our handler, so we need to
                    // explicitly pause the stream here to prevent the stream from completing
                    // prior to our asynchronous work being completed.
                    grpcStream?.pause();

                    try {
                        if (!chunk.response_chunk) {
                            logger.warn('No response chunk received');
                            return;
                        }

                        const reposMapCache = await createReposMapForChunk(chunk.response_chunk, _reposMapCache, prisma);
                        const { stats, files, repositoryInfo } = await transformZoektSearchResponse(chunk.response_chunk, reposMapCache);

                        accumulatedStats = accumulateStats(accumulatedStats, stats);

                        const response: StreamedSearchResponse = {
                            type: 'chunk',
                            files,
                            repositoryInfo,
                            stats
                        }

                        controller.enqueue(encodeSSEREsponseChunk(response));
                    } catch (error) {
                        logger.error('Error processing chunk:', error);
                        Sentry.captureException(error);
                        isStreamActive = false;

                        const errorMessage = error instanceof Error ? error.message : 'Unknown error processing chunk';
                        const errorResponse: StreamedSearchErrorResponse = {
                            type: 'error',
                            error: unexpectedError(errorMessage),
                        };
                        controller.enqueue(encodeSSEREsponseChunk(errorResponse));
                    } finally {
                        pendingChunks--;
                        grpcStream?.resume();

                        // @note: we were hitting "Controller is already closed" errors when calling
                        // `controller.enqueue` above for the last chunk. The reasoning was the event
                        // handler for 'end' was being invoked prior to the completion of the last chunk,
                        // resulting in the controller being closed prematurely. The workaround was to
                        // keep track of the number of pending chunks and only close the controller
                        // when there are no more chunks to process. We need to explicitly call
                        // `tryCloseController` since there _seems_ to be no ordering guarantees between
                        // the 'end' event handler and this callback.
                        tryCloseController();
                    }
                });

                // Handle stream completion
                grpcStream.on('end', () => {
                    if (!isStreamActive) {
                        return;
                    }
                    isStreamActive = false;
                    tryCloseController();
                });

                // Handle errors
                grpcStream.on('error', (error: grpc.ServiceError) => {
                    logger.error('gRPC stream error:', error);
                    Sentry.captureException(error);

                    if (!isStreamActive) {
                        return;
                    }
                    isStreamActive = false;

                    // Send properly typed error response
                    const errorResponse: StreamedSearchErrorResponse = {
                        type: 'error',
                        error: unexpectedError(error.details || error.message),
                    };
                    controller.enqueue(encodeSSEREsponseChunk(errorResponse));

                    controller.close();
                    client.close();
                });
            } catch (error) {
                logger.error('Stream initialization error:', error);
                Sentry.captureException(error);

                const errorMessage = error instanceof Error ? error.message : 'Unknown error';
                const errorResponse: StreamedSearchErrorResponse = {
                    type: 'error',
                    error: unexpectedError(errorMessage),
                };
                controller.enqueue(encodeSSEREsponseChunk(errorResponse));

                controller.close();
                client.close();
            }
        },
        cancel() {
            logger.warn('SSE stream cancelled by client');
            isStreamActive = false;

            // Cancel the gRPC stream to stop receiving data
            if (grpcStream) {
                grpcStream.cancel();
            }

            client.close();
        }
    });
}

// Encodes a response chunk into a SSE-compatible format.
const encodeSSEREsponseChunk = (response: object | string) => {
    const data = typeof response === 'string' ? response : JSON.stringify(response);
    return new TextEncoder().encode(`data: ${data}\n\n`);
}
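For reference, a client consuming the ReadableStream returned above would split on the blank-line SSE delimiter and stop at the `[DONE]` sentinel. A hypothetical sketch; the '/api/search/stream' route, request body, and helper name are assumptions, not part of this PR:

import { StreamedSearchResponse } from "./types"; // path assumed

const consumeSearchStream = async (body: unknown, onEvent: (event: StreamedSearchResponse) => void) => {
    const response = await fetch('/api/search/stream', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(body),
    });

    const reader = response.body!.pipeThrough(new TextDecoderStream()).getReader();
    let buffer = '';
    for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        buffer += value;

        // Each event is framed as `data: <payload>\n\n`; `[DONE]` marks the end of the stream.
        let boundary = buffer.indexOf('\n\n');
        while (boundary !== -1) {
            const payload = buffer.slice(0, boundary).replace(/^data: /, '');
            buffer = buffer.slice(boundary + 2);
            if (payload === '[DONE]') {
                return;
            }
            onEvent(JSON.parse(payload) as StreamedSearchResponse);
            boundary = buffer.indexOf('\n\n');
        }
    }
}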

// Creates a mapping between all repository ids in a given response
// chunk. The mapping allows us to efficiently lookup repository metadata.
const createReposMapForChunk = async (chunk: ZoektGrpcSearchResponse, reposMapCache: Map<string | number, Repo>, prisma: PrismaClient): Promise<Map<string | number, Repo>> => {
    const reposMap = new Map<string | number, Repo>();
    await Promise.all(chunk.files.map(async (file) => {
        const id = getRepoIdForFile(file);

        const repo = await (async () => {
            // If it's in the cache, return the cached value.
            if (reposMapCache.has(id)) {
                return reposMapCache.get(id);
            }

            // Otherwise, query the database for the record.
            const repo = typeof id === 'number' ?
                await prisma.repo.findUnique({
                    where: {
                        id: id,
                    },
                }) :
                await prisma.repo.findFirst({
                    where: {
                        name: id,
                    },
                });

            // If a repository is found, cache it for future lookups.
            if (repo) {
                reposMapCache.set(id, repo);
            }

            return repo;
        })();

        // Only add the repository to the map if it was found.
        if (repo) {
            reposMap.set(id, repo);
        }
    }));

    return reposMap;
}

const transformZoektSearchResponse = async (response: ZoektGrpcSearchResponse, reposMapCache: Map<string | number, Repo>): Promise<{
    stats: SearchStats,
    files: SearchResultFile[],
    repositoryInfo: RepositoryInfo[],
}> => {
    const files = response.files.map((file) => {
        const fileNameChunks = file.chunk_matches.filter((chunk) => chunk.file_name);
        const repoId = getRepoIdForFile(file);
        const repo = reposMapCache.get(repoId);

        // This should never happen.
        if (!repo) {
            throw new Error(`Repository not found for file: ${file.file_name}`);
        }

        // @todo: address "file_name might not be a valid UTF-8 string" warning.
        const fileName = file.file_name.toString('utf-8');

        const convertRange = (range: ZoektGrpcRange): SourceRange => ({
            start: {
                byteOffset: range.start?.byte_offset ?? 0,
                column: range.start?.column ?? 1,
                lineNumber: range.start?.line_number ?? 1,
            },
            end: {
                byteOffset: range.end?.byte_offset ?? 0,
                column: range.end?.column ?? 1,
                lineNumber: range.end?.line_number ?? 1,
            }
        })

        return {
            fileName: {
                text: fileName,
                matchRanges: fileNameChunks.length === 1 ? fileNameChunks[0].ranges.map(convertRange) : [],
            },
            repository: repo.name,
            repositoryId: repo.id,
            language: file.language,
            webUrl: getCodeHostBrowseFileAtBranchUrl({
                webUrl: repo.webUrl,
                codeHostType: repo.external_codeHostType,
                // If a file has multiple branches, default to the first one.
                branchName: file.branches?.[0] ?? 'HEAD',
                filePath: fileName,
            }),
            chunks: file.chunk_matches
                .filter((chunk) => !chunk.file_name) // filter out filename chunks.
                .map((chunk) => {
                    return {
                        content: chunk.content.toString('utf-8'),
                        matchRanges: chunk.ranges.map(convertRange),
                        contentStart: chunk.content_start ? {
                            byteOffset: chunk.content_start.byte_offset,
                            column: chunk.content_start.column,
                            lineNumber: chunk.content_start.line_number,
                        } : {
                            byteOffset: 0,
                            column: 1,
                            lineNumber: 1,
                        },
                        symbols: chunk.symbol_info.map((symbol) => {
                            return {
                                symbol: symbol.sym,
                                kind: symbol.kind,
                                parent: symbol.parent ? {
                                    symbol: symbol.parent,
                                    kind: symbol.parent_kind,
                                } : undefined,
                            }
                        })
                    }
                }),
            branches: file.branches,
            content: file.content ? file.content.toString('utf-8') : undefined,
        }
    }).filter(file => file !== undefined);

    const actualMatchCount = files.reduce(
        (acc, file) =>
            // Match count is the sum of the number of chunk matches and file name matches.
            acc + file.chunks.reduce(
                (acc, chunk) => acc + chunk.matchRanges.length,
                0,
            ) + file.fileName.matchRanges.length,
        0,
    );

    const stats: SearchStats = {
        actualMatchCount,
        totalMatchCount: response.stats?.match_count ?? 0,
        duration: response.stats?.duration?.nanos ?? 0,
        fileCount: response.stats?.file_count ?? 0,
        filesSkipped: response.stats?.files_skipped ?? 0,
        contentBytesLoaded: response.stats?.content_bytes_loaded ?? 0,
        indexBytesLoaded: response.stats?.index_bytes_loaded ?? 0,
        crashes: response.stats?.crashes ?? 0,
        shardFilesConsidered: response.stats?.shard_files_considered ?? 0,
        filesConsidered: response.stats?.files_considered ?? 0,
        filesLoaded: response.stats?.files_loaded ?? 0,
        shardsScanned: response.stats?.shards_scanned ?? 0,
        shardsSkipped: response.stats?.shards_skipped ?? 0,
        shardsSkippedFilter: response.stats?.shards_skipped_filter ?? 0,
        ngramMatches: response.stats?.ngram_matches ?? 0,
        ngramLookups: response.stats?.ngram_lookups ?? 0,
        wait: response.stats?.wait?.nanos ?? 0,
        matchTreeConstruction: response.stats?.match_tree_construction?.nanos ?? 0,
        matchTreeSearch: response.stats?.match_tree_search?.nanos ?? 0,
        regexpsConsidered: response.stats?.regexps_considered ?? 0,
        flushReason: response.stats?.flush_reason?.toString() ?? ZoektGrpcFlushReason.FLUSH_REASON_UNKNOWN_UNSPECIFIED,
    }

    return {
        files,
        repositoryInfo: Array.from(reposMapCache.values()).map((repo) => ({
            id: repo.id,
            codeHostType: repo.external_codeHostType,
            name: repo.name,
            displayName: repo.displayName ?? undefined,
            webUrl: repo.webUrl ?? undefined,
        })),
        stats,
    }
}

// @note (2025-05-12): in zoekt, repositories are identified by the `RepositoryID` field
// which corresponds to the `id` in the Repo table. In order to efficiently fetch repository
// metadata when transforming (potentially thousands) of file matches, we aggregate a unique
// set of repository ids* and map them to their corresponding Repo record.
//
// *Q: Why is `RepositoryID` optional? And why are we falling back to `Repository`?
// A: Prior to this change, the repository id was not plumbed into zoekt, so RepositoryID was
// always undefined. To make this a non-breaking change, we fallback to using the repository's name
// (`Repository`) as the identifier in these cases. This is not guaranteed to be unique, but in
// practice it is since the repository name includes the host and path (e.g., 'github.com/org/repo',
// 'gitea.com/org/repo', etc.).
//
// Note: When a repository is re-indexed (every hour) this ID will be populated.
// @see: https://github.com/sourcebot-dev/zoekt/pull/6
const getRepoIdForFile = (file: ZoektGrpcFileMatch): string | number => {
    return file.repository_id ?? file.repository;
}

const createGrpcClient = (): WebserverServiceClient => {
    // Path to proto files - these should match your monorepo structure
    const protoBasePath = path.join(process.cwd(), '../../vendor/zoekt/grpc/protos');
    const protoPath = path.join(protoBasePath, 'zoekt/webserver/v1/webserver.proto');

    const packageDefinition = protoLoader.loadSync(protoPath, {
        keepCase: true,
        longs: Number,
        enums: String,
        defaults: true,
        oneofs: true,
        includeDirs: [protoBasePath],
    });

    const proto = grpc.loadPackageDefinition(packageDefinition) as unknown as ProtoGrpcType;

    // Extract host and port from ZOEKT_WEBSERVER_URL
    const zoektUrl = new URL(env.ZOEKT_WEBSERVER_URL);
    const grpcAddress = `${zoektUrl.hostname}:${zoektUrl.port}`;

    return new proto.zoekt.webserver.v1.WebserverService(
        grpcAddress,
        grpc.credentials.createInsecure(),
        {
            'grpc.max_receive_message_length': 500 * 1024 * 1024, // 500MB
            'grpc.max_send_message_length': 500 * 1024 * 1024, // 500MB
        }
    );
}

const accumulateStats = (a: SearchStats, b: SearchStats): SearchStats => {
    return {
        actualMatchCount: a.actualMatchCount + b.actualMatchCount,
        totalMatchCount: a.totalMatchCount + b.totalMatchCount,
        duration: a.duration + b.duration,
        fileCount: a.fileCount + b.fileCount,
        filesSkipped: a.filesSkipped + b.filesSkipped,
        contentBytesLoaded: a.contentBytesLoaded + b.contentBytesLoaded,
        indexBytesLoaded: a.indexBytesLoaded + b.indexBytesLoaded,
        crashes: a.crashes + b.crashes,
        shardFilesConsidered: a.shardFilesConsidered + b.shardFilesConsidered,
        filesConsidered: a.filesConsidered + b.filesConsidered,
        filesLoaded: a.filesLoaded + b.filesLoaded,
        shardsScanned: a.shardsScanned + b.shardsScanned,
        shardsSkipped: a.shardsSkipped + b.shardsSkipped,
        shardsSkippedFilter: a.shardsSkippedFilter + b.shardsSkippedFilter,
        ngramMatches: a.ngramMatches + b.ngramMatches,
        ngramLookups: a.ngramLookups + b.ngramLookups,
        wait: a.wait + b.wait,
        matchTreeConstruction: a.matchTreeConstruction + b.matchTreeConstruction,
        matchTreeSearch: a.matchTreeSearch + b.matchTreeSearch,
        regexpsConsidered: a.regexpsConsidered + b.regexpsConsidered,
        // Capture the first non-unknown flush reason.
        ...(a.flushReason === ZoektGrpcFlushReason.FLUSH_REASON_UNKNOWN_UNSPECIFIED ? {
            flushReason: b.flushReason
        } : {
            flushReason: a.flushReason,
        }),
    }
}
@@ -34,4 +34,5 @@ export enum ErrorCode {
     API_KEY_NOT_FOUND = 'API_KEY_NOT_FOUND',
     INVALID_API_KEY = 'INVALID_API_KEY',
     CHAT_IS_READONLY = 'CHAT_IS_READONLY',
+    FAILED_TO_PARSE_QUERY = 'FAILED_TO_PARSE_QUERY',
 }
@@ -1,6 +1,6 @@
 import { EditorSelection, Extension, StateEffect, StateField, Text, Transaction } from "@codemirror/state";
 import { Decoration, DecorationSet, EditorView } from "@codemirror/view";
-import { SourceRange } from "@/features/search/types";
+import { SourceRange } from "@/features/search";

 const setMatchState = StateEffect.define<{
     selectedMatchIndex: number,
@@ -5,7 +5,10 @@ export type PosthogEventMap = {
         contentBytesLoaded: number,
         indexBytesLoaded: number,
         crashes: number,
+        /** @deprecated: use timeToFirstSearchResultMs and timeToSearchCompletionMs instead */
         durationMs: number,
+        timeToFirstSearchResultMs: number,
+        timeToSearchCompletionMs: number,
         fileCount: number,
         shardFilesConsidered: number,
         filesConsidered: number,
@@ -22,8 +25,9 @@ export type PosthogEventMap = {
         matchTreeConstruction: number,
         matchTreeSearch: number,
         regexpsConsidered: number,
-        flushReason: number,
-        fileLanguages: string[]
+        flushReason: string,
+        fileLanguages: string[],
+        isSearchExhaustive: boolean
     },
     share_link_created: {},
     ////////////////////////////////////////////////////////////////
@@ -9,6 +9,8 @@ export type GetVersionResponse = z.infer<typeof getVersionResponseSchema>;
 export enum SearchQueryParams {
     query = "query",
     matches = "matches",
+    isRegexEnabled = "isRegexEnabled",
+    isCaseSensitivityEnabled = "isCaseSensitivityEnabled",
 }

 export type ApiKeyPayload = {
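Hypothetical usage of the new parameters when building a search URL; the '/search' route and query value are assumptions for illustration:

// Illustrative only.
const params = new URLSearchParams({
    [SearchQueryParams.query]: 'zoektStreamSearch',
    [SearchQueryParams.isRegexEnabled]: 'false',
    [SearchQueryParams.isCaseSensitivityEnabled]: 'true',
});
const searchUrl = `/search?${params.toString()}`;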
@@ -376,6 +376,42 @@ export const getCodeHostBrowseAtBranchUrl = ({
     }
 }

+export const getCodeHostBrowseFileAtBranchUrl = ({
+    webUrl,
+    codeHostType,
+    branchName,
+    filePath,
+}: {
+    webUrl?: string | null,
+    codeHostType: CodeHostType,
+    branchName: string,
+    filePath: string,
+}) => {
+    if (!webUrl) {
+        return undefined;
+    }
+
+    switch (codeHostType) {
+        case 'github':
+            return `${webUrl}/blob/${branchName}/${filePath}`;
+        case 'gitlab':
+            return `${webUrl}/-/blob/${branchName}/${filePath}`;
+        case 'gitea':
+            return `${webUrl}/src/branch/${branchName}/${filePath}`;
+        case 'azuredevops':
+            return `${webUrl}?path=${filePath}&version=${branchName}`;
+        case 'bitbucketCloud':
+            return `${webUrl}/src/${branchName}/${filePath}`;
+        case 'bitbucketServer':
+            return `${webUrl}/browse/${filePath}?at=${branchName}`;
+        case 'gerrit':
+            return `${webUrl}/+/${branchName}/${filePath}`;
+        case 'genericGitHost':
+            return undefined;
+
+    }
+}
+
 export const isAuthSupportedForCodeHost = (codeHostType: CodeHostType): boolean => {
     switch (codeHostType) {
         case "github":
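Illustrative call (values are made up) showing the URL shape produced for a GitHub-hosted repository:

const url = getCodeHostBrowseFileAtBranchUrl({
    webUrl: 'https://github.com/sourcebot-dev/sourcebot',
    codeHostType: 'github',
    branchName: 'main',
    filePath: 'packages/web/src/lib/utils.ts',
});
// => 'https://github.com/sourcebot-dev/sourcebot/blob/main/packages/web/src/lib/utils.ts'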
@@ -1,6 +1,6 @@
 import 'server-only';
 import { env, getDBConnectionString } from "@sourcebot/shared";
-import { Prisma, PrismaClient } from "@sourcebot/db";
+import { Prisma, PrismaClient, UserWithAccounts } from "@sourcebot/db";
 import { hasEntitlement } from "@sourcebot/shared";

 // @see: https://authjs.dev/getting-started/adapters/prisma
@@ -32,7 +32,7 @@ if (env.NODE_ENV !== "production") globalForPrisma.prisma = prisma
  * Creates a prisma client extension that scopes queries to striclty information
  * a given user should be able to access.
  */
-export const userScopedPrismaClientExtension = (accountIds?: string[]) => {
+export const userScopedPrismaClientExtension = (user?: UserWithAccounts) => {
     return Prisma.defineExtension(
         (prisma) => {
             return prisma.$extends({
@@ -46,24 +46,7 @@ export const userScopedPrismaClientExtension = (accountIds?: string[]) => {

                         argsWithWhere.where = {
                             ...(argsWithWhere.where || {}),
-                            OR: [
-                                // Only include repos that are permitted to the user
-                                ...(accountIds ? [
-                                    {
-                                        permittedAccounts: {
-                                            some: {
-                                                accountId: {
-                                                    in: accountIds,
-                                                }
-                                            }
-                                        }
-                                    },
-                                ] : []),
-                                // or are public.
-                                {
-                                    isPublic: true,
-                                }
-                            ]
+                            ...getRepoPermissionFilterForUser(user),
                         };

                         return query(args);
@@ -74,3 +57,29 @@ export const userScopedPrismaClientExtension = (accountIds?: string[]) => {
             })
         })
 }
+
+/**
+ * Returns a filter for repositories that the user has access to.
+ */
+export const getRepoPermissionFilterForUser = (user?: UserWithAccounts): Prisma.RepoWhereInput => {
+    return {
+        OR: [
+            // Only include repos that are permitted to the user
+            ...((user && user.accounts.length > 0) ? [
+                {
+                    permittedAccounts: {
+                        some: {
+                            accountId: {
+                                in: user.accounts.map(account => account.id),
+                            }
+                        }
+                    }
+                },
+            ] : []),
+            // or are public.
+            {
+                isPublic: true,
+            }
+        ]
+    }
+}
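A hypothetical use of the extracted filter outside the client extension, e.g. to compute the set of repository names a user may search; `prisma` and `user` are assumed to be in scope inside an async function:

// Hypothetical: fetch the names of all repositories a user may search.
const permittedRepos = await prisma.repo.findMany({
    where: getRepoPermissionFilterForUser(user),
    select: { name: true },
});
const repoSearchScope = permittedRepos.map(repo => repo.name);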
packages/web/src/proto/google/protobuf/Duration.ts (new file, 13 lines)
@@ -0,0 +1,13 @@
// Original file: null

import type { Long } from '@grpc/proto-loader';

export interface Duration {
  'seconds'?: (number | string | Long);
  'nanos'?: (number);
}

export interface Duration__Output {
  'seconds': (number);
  'nanos': (number);
}
packages/web/src/proto/google/protobuf/Timestamp.ts (new file, 13 lines)
@@ -0,0 +1,13 @@
// Original file: null

import type { Long } from '@grpc/proto-loader';

export interface Timestamp {
  'seconds'?: (number | string | Long);
  'nanos'?: (number);
}

export interface Timestamp__Output {
  'seconds': (number);
  'nanos': (number);
}
packages/web/src/proto/query.ts (new file, 55 lines)
@@ -0,0 +1,55 @@
import type * as grpc from '@grpc/grpc-js';
import type { MessageTypeDefinition } from '@grpc/proto-loader';

import type { And as _zoekt_webserver_v1_And, And__Output as _zoekt_webserver_v1_And__Output } from './zoekt/webserver/v1/And';
import type { Boost as _zoekt_webserver_v1_Boost, Boost__Output as _zoekt_webserver_v1_Boost__Output } from './zoekt/webserver/v1/Boost';
import type { Branch as _zoekt_webserver_v1_Branch, Branch__Output as _zoekt_webserver_v1_Branch__Output } from './zoekt/webserver/v1/Branch';
import type { BranchRepos as _zoekt_webserver_v1_BranchRepos, BranchRepos__Output as _zoekt_webserver_v1_BranchRepos__Output } from './zoekt/webserver/v1/BranchRepos';
import type { BranchesRepos as _zoekt_webserver_v1_BranchesRepos, BranchesRepos__Output as _zoekt_webserver_v1_BranchesRepos__Output } from './zoekt/webserver/v1/BranchesRepos';
import type { FileNameSet as _zoekt_webserver_v1_FileNameSet, FileNameSet__Output as _zoekt_webserver_v1_FileNameSet__Output } from './zoekt/webserver/v1/FileNameSet';
import type { Language as _zoekt_webserver_v1_Language, Language__Output as _zoekt_webserver_v1_Language__Output } from './zoekt/webserver/v1/Language';
import type { Not as _zoekt_webserver_v1_Not, Not__Output as _zoekt_webserver_v1_Not__Output } from './zoekt/webserver/v1/Not';
import type { Or as _zoekt_webserver_v1_Or, Or__Output as _zoekt_webserver_v1_Or__Output } from './zoekt/webserver/v1/Or';
import type { Q as _zoekt_webserver_v1_Q, Q__Output as _zoekt_webserver_v1_Q__Output } from './zoekt/webserver/v1/Q';
import type { RawConfig as _zoekt_webserver_v1_RawConfig, RawConfig__Output as _zoekt_webserver_v1_RawConfig__Output } from './zoekt/webserver/v1/RawConfig';
import type { Regexp as _zoekt_webserver_v1_Regexp, Regexp__Output as _zoekt_webserver_v1_Regexp__Output } from './zoekt/webserver/v1/Regexp';
import type { Repo as _zoekt_webserver_v1_Repo, Repo__Output as _zoekt_webserver_v1_Repo__Output } from './zoekt/webserver/v1/Repo';
import type { RepoIds as _zoekt_webserver_v1_RepoIds, RepoIds__Output as _zoekt_webserver_v1_RepoIds__Output } from './zoekt/webserver/v1/RepoIds';
import type { RepoRegexp as _zoekt_webserver_v1_RepoRegexp, RepoRegexp__Output as _zoekt_webserver_v1_RepoRegexp__Output } from './zoekt/webserver/v1/RepoRegexp';
import type { RepoSet as _zoekt_webserver_v1_RepoSet, RepoSet__Output as _zoekt_webserver_v1_RepoSet__Output } from './zoekt/webserver/v1/RepoSet';
import type { Substring as _zoekt_webserver_v1_Substring, Substring__Output as _zoekt_webserver_v1_Substring__Output } from './zoekt/webserver/v1/Substring';
import type { Symbol as _zoekt_webserver_v1_Symbol, Symbol__Output as _zoekt_webserver_v1_Symbol__Output } from './zoekt/webserver/v1/Symbol';
import type { Type as _zoekt_webserver_v1_Type, Type__Output as _zoekt_webserver_v1_Type__Output } from './zoekt/webserver/v1/Type';

type SubtypeConstructor<Constructor extends new (...args: any) => any, Subtype> = {
  new(...args: ConstructorParameters<Constructor>): Subtype;
};

export interface ProtoGrpcType {
  zoekt: {
    webserver: {
      v1: {
        And: MessageTypeDefinition<_zoekt_webserver_v1_And, _zoekt_webserver_v1_And__Output>
        Boost: MessageTypeDefinition<_zoekt_webserver_v1_Boost, _zoekt_webserver_v1_Boost__Output>
        Branch: MessageTypeDefinition<_zoekt_webserver_v1_Branch, _zoekt_webserver_v1_Branch__Output>
        BranchRepos: MessageTypeDefinition<_zoekt_webserver_v1_BranchRepos, _zoekt_webserver_v1_BranchRepos__Output>
        BranchesRepos: MessageTypeDefinition<_zoekt_webserver_v1_BranchesRepos, _zoekt_webserver_v1_BranchesRepos__Output>
        FileNameSet: MessageTypeDefinition<_zoekt_webserver_v1_FileNameSet, _zoekt_webserver_v1_FileNameSet__Output>
        Language: MessageTypeDefinition<_zoekt_webserver_v1_Language, _zoekt_webserver_v1_Language__Output>
        Not: MessageTypeDefinition<_zoekt_webserver_v1_Not, _zoekt_webserver_v1_Not__Output>
        Or: MessageTypeDefinition<_zoekt_webserver_v1_Or, _zoekt_webserver_v1_Or__Output>
        Q: MessageTypeDefinition<_zoekt_webserver_v1_Q, _zoekt_webserver_v1_Q__Output>
        RawConfig: MessageTypeDefinition<_zoekt_webserver_v1_RawConfig, _zoekt_webserver_v1_RawConfig__Output>
        Regexp: MessageTypeDefinition<_zoekt_webserver_v1_Regexp, _zoekt_webserver_v1_Regexp__Output>
        Repo: MessageTypeDefinition<_zoekt_webserver_v1_Repo, _zoekt_webserver_v1_Repo__Output>
        RepoIds: MessageTypeDefinition<_zoekt_webserver_v1_RepoIds, _zoekt_webserver_v1_RepoIds__Output>
        RepoRegexp: MessageTypeDefinition<_zoekt_webserver_v1_RepoRegexp, _zoekt_webserver_v1_RepoRegexp__Output>
        RepoSet: MessageTypeDefinition<_zoekt_webserver_v1_RepoSet, _zoekt_webserver_v1_RepoSet__Output>
        Substring: MessageTypeDefinition<_zoekt_webserver_v1_Substring, _zoekt_webserver_v1_Substring__Output>
        Symbol: MessageTypeDefinition<_zoekt_webserver_v1_Symbol, _zoekt_webserver_v1_Symbol__Output>
        Type: MessageTypeDefinition<_zoekt_webserver_v1_Type, _zoekt_webserver_v1_Type__Output>
      }
    }
  }
}
packages/web/src/proto/webserver.ts (new file, 112 lines)
@@ -0,0 +1,112 @@
import type * as grpc from '@grpc/grpc-js';
import type { EnumTypeDefinition, MessageTypeDefinition } from '@grpc/proto-loader';

import type { Duration as _google_protobuf_Duration, Duration__Output as _google_protobuf_Duration__Output } from './google/protobuf/Duration';
import type { Timestamp as _google_protobuf_Timestamp, Timestamp__Output as _google_protobuf_Timestamp__Output } from './google/protobuf/Timestamp';
import type { And as _zoekt_webserver_v1_And, And__Output as _zoekt_webserver_v1_And__Output } from './zoekt/webserver/v1/And';
import type { Boost as _zoekt_webserver_v1_Boost, Boost__Output as _zoekt_webserver_v1_Boost__Output } from './zoekt/webserver/v1/Boost';
import type { Branch as _zoekt_webserver_v1_Branch, Branch__Output as _zoekt_webserver_v1_Branch__Output } from './zoekt/webserver/v1/Branch';
import type { BranchRepos as _zoekt_webserver_v1_BranchRepos, BranchRepos__Output as _zoekt_webserver_v1_BranchRepos__Output } from './zoekt/webserver/v1/BranchRepos';
import type { BranchesRepos as _zoekt_webserver_v1_BranchesRepos, BranchesRepos__Output as _zoekt_webserver_v1_BranchesRepos__Output } from './zoekt/webserver/v1/BranchesRepos';
import type { ChunkMatch as _zoekt_webserver_v1_ChunkMatch, ChunkMatch__Output as _zoekt_webserver_v1_ChunkMatch__Output } from './zoekt/webserver/v1/ChunkMatch';
import type { FileMatch as _zoekt_webserver_v1_FileMatch, FileMatch__Output as _zoekt_webserver_v1_FileMatch__Output } from './zoekt/webserver/v1/FileMatch';
import type { FileNameSet as _zoekt_webserver_v1_FileNameSet, FileNameSet__Output as _zoekt_webserver_v1_FileNameSet__Output } from './zoekt/webserver/v1/FileNameSet';
import type { IndexMetadata as _zoekt_webserver_v1_IndexMetadata, IndexMetadata__Output as _zoekt_webserver_v1_IndexMetadata__Output } from './zoekt/webserver/v1/IndexMetadata';
import type { Language as _zoekt_webserver_v1_Language, Language__Output as _zoekt_webserver_v1_Language__Output } from './zoekt/webserver/v1/Language';
import type { LineFragmentMatch as _zoekt_webserver_v1_LineFragmentMatch, LineFragmentMatch__Output as _zoekt_webserver_v1_LineFragmentMatch__Output } from './zoekt/webserver/v1/LineFragmentMatch';
import type { LineMatch as _zoekt_webserver_v1_LineMatch, LineMatch__Output as _zoekt_webserver_v1_LineMatch__Output } from './zoekt/webserver/v1/LineMatch';
import type { ListOptions as _zoekt_webserver_v1_ListOptions, ListOptions__Output as _zoekt_webserver_v1_ListOptions__Output } from './zoekt/webserver/v1/ListOptions';
import type { ListRequest as _zoekt_webserver_v1_ListRequest, ListRequest__Output as _zoekt_webserver_v1_ListRequest__Output } from './zoekt/webserver/v1/ListRequest';
import type { ListResponse as _zoekt_webserver_v1_ListResponse, ListResponse__Output as _zoekt_webserver_v1_ListResponse__Output } from './zoekt/webserver/v1/ListResponse';
import type { Location as _zoekt_webserver_v1_Location, Location__Output as _zoekt_webserver_v1_Location__Output } from './zoekt/webserver/v1/Location';
import type { MinimalRepoListEntry as _zoekt_webserver_v1_MinimalRepoListEntry, MinimalRepoListEntry__Output as _zoekt_webserver_v1_MinimalRepoListEntry__Output } from './zoekt/webserver/v1/MinimalRepoListEntry';
import type { Not as _zoekt_webserver_v1_Not, Not__Output as _zoekt_webserver_v1_Not__Output } from './zoekt/webserver/v1/Not';
import type { Or as _zoekt_webserver_v1_Or, Or__Output as _zoekt_webserver_v1_Or__Output } from './zoekt/webserver/v1/Or';
import type { Progress as _zoekt_webserver_v1_Progress, Progress__Output as _zoekt_webserver_v1_Progress__Output } from './zoekt/webserver/v1/Progress';
import type { Q as _zoekt_webserver_v1_Q, Q__Output as _zoekt_webserver_v1_Q__Output } from './zoekt/webserver/v1/Q';
import type { Range as _zoekt_webserver_v1_Range, Range__Output as _zoekt_webserver_v1_Range__Output } from './zoekt/webserver/v1/Range';
import type { RawConfig as _zoekt_webserver_v1_RawConfig, RawConfig__Output as _zoekt_webserver_v1_RawConfig__Output } from './zoekt/webserver/v1/RawConfig';
import type { Regexp as _zoekt_webserver_v1_Regexp, Regexp__Output as _zoekt_webserver_v1_Regexp__Output } from './zoekt/webserver/v1/Regexp';
import type { Repo as _zoekt_webserver_v1_Repo, Repo__Output as _zoekt_webserver_v1_Repo__Output } from './zoekt/webserver/v1/Repo';
import type { RepoIds as _zoekt_webserver_v1_RepoIds, RepoIds__Output as _zoekt_webserver_v1_RepoIds__Output } from './zoekt/webserver/v1/RepoIds';
import type { RepoListEntry as _zoekt_webserver_v1_RepoListEntry, RepoListEntry__Output as _zoekt_webserver_v1_RepoListEntry__Output } from './zoekt/webserver/v1/RepoListEntry';
import type { RepoRegexp as _zoekt_webserver_v1_RepoRegexp, RepoRegexp__Output as _zoekt_webserver_v1_RepoRegexp__Output } from './zoekt/webserver/v1/RepoRegexp';
import type { RepoSet as _zoekt_webserver_v1_RepoSet, RepoSet__Output as _zoekt_webserver_v1_RepoSet__Output } from './zoekt/webserver/v1/RepoSet';
import type { RepoStats as _zoekt_webserver_v1_RepoStats, RepoStats__Output as _zoekt_webserver_v1_RepoStats__Output } from './zoekt/webserver/v1/RepoStats';
import type { Repository as _zoekt_webserver_v1_Repository, Repository__Output as _zoekt_webserver_v1_Repository__Output } from './zoekt/webserver/v1/Repository';
import type { RepositoryBranch as _zoekt_webserver_v1_RepositoryBranch, RepositoryBranch__Output as _zoekt_webserver_v1_RepositoryBranch__Output } from './zoekt/webserver/v1/RepositoryBranch';
import type { SearchOptions as _zoekt_webserver_v1_SearchOptions, SearchOptions__Output as _zoekt_webserver_v1_SearchOptions__Output } from './zoekt/webserver/v1/SearchOptions';
import type { SearchRequest as _zoekt_webserver_v1_SearchRequest, SearchRequest__Output as _zoekt_webserver_v1_SearchRequest__Output } from './zoekt/webserver/v1/SearchRequest';
import type { SearchResponse as _zoekt_webserver_v1_SearchResponse, SearchResponse__Output as _zoekt_webserver_v1_SearchResponse__Output } from './zoekt/webserver/v1/SearchResponse';
import type { Stats as _zoekt_webserver_v1_Stats, Stats__Output as _zoekt_webserver_v1_Stats__Output } from './zoekt/webserver/v1/Stats';
import type { StreamSearchRequest as _zoekt_webserver_v1_StreamSearchRequest, StreamSearchRequest__Output as _zoekt_webserver_v1_StreamSearchRequest__Output } from './zoekt/webserver/v1/StreamSearchRequest';
import type { StreamSearchResponse as _zoekt_webserver_v1_StreamSearchResponse, StreamSearchResponse__Output as _zoekt_webserver_v1_StreamSearchResponse__Output } from './zoekt/webserver/v1/StreamSearchResponse';
import type { Substring as _zoekt_webserver_v1_Substring, Substring__Output as _zoekt_webserver_v1_Substring__Output } from './zoekt/webserver/v1/Substring';
import type { Symbol as _zoekt_webserver_v1_Symbol, Symbol__Output as _zoekt_webserver_v1_Symbol__Output } from './zoekt/webserver/v1/Symbol';
import type { SymbolInfo as _zoekt_webserver_v1_SymbolInfo, SymbolInfo__Output as _zoekt_webserver_v1_SymbolInfo__Output } from './zoekt/webserver/v1/SymbolInfo';
import type { Type as _zoekt_webserver_v1_Type, Type__Output as _zoekt_webserver_v1_Type__Output } from './zoekt/webserver/v1/Type';
import type { WebserverServiceClient as _zoekt_webserver_v1_WebserverServiceClient, WebserverServiceDefinition as _zoekt_webserver_v1_WebserverServiceDefinition } from './zoekt/webserver/v1/WebserverService';

type SubtypeConstructor<Constructor extends new (...args: any) => any, Subtype> = {
  new(...args: ConstructorParameters<Constructor>): Subtype;
};

export interface ProtoGrpcType {
  google: {
    protobuf: {
      Duration: MessageTypeDefinition<_google_protobuf_Duration, _google_protobuf_Duration__Output>
      Timestamp: MessageTypeDefinition<_google_protobuf_Timestamp, _google_protobuf_Timestamp__Output>
    }
  }
  zoekt: {
    webserver: {
      v1: {
        And: MessageTypeDefinition<_zoekt_webserver_v1_And, _zoekt_webserver_v1_And__Output>
|
||||||
|
Boost: MessageTypeDefinition<_zoekt_webserver_v1_Boost, _zoekt_webserver_v1_Boost__Output>
|
||||||
|
Branch: MessageTypeDefinition<_zoekt_webserver_v1_Branch, _zoekt_webserver_v1_Branch__Output>
|
||||||
|
BranchRepos: MessageTypeDefinition<_zoekt_webserver_v1_BranchRepos, _zoekt_webserver_v1_BranchRepos__Output>
|
||||||
|
BranchesRepos: MessageTypeDefinition<_zoekt_webserver_v1_BranchesRepos, _zoekt_webserver_v1_BranchesRepos__Output>
|
||||||
|
ChunkMatch: MessageTypeDefinition<_zoekt_webserver_v1_ChunkMatch, _zoekt_webserver_v1_ChunkMatch__Output>
|
||||||
|
FileMatch: MessageTypeDefinition<_zoekt_webserver_v1_FileMatch, _zoekt_webserver_v1_FileMatch__Output>
|
||||||
|
FileNameSet: MessageTypeDefinition<_zoekt_webserver_v1_FileNameSet, _zoekt_webserver_v1_FileNameSet__Output>
|
||||||
|
FlushReason: EnumTypeDefinition
|
||||||
|
IndexMetadata: MessageTypeDefinition<_zoekt_webserver_v1_IndexMetadata, _zoekt_webserver_v1_IndexMetadata__Output>
|
||||||
|
Language: MessageTypeDefinition<_zoekt_webserver_v1_Language, _zoekt_webserver_v1_Language__Output>
|
||||||
|
LineFragmentMatch: MessageTypeDefinition<_zoekt_webserver_v1_LineFragmentMatch, _zoekt_webserver_v1_LineFragmentMatch__Output>
|
||||||
|
LineMatch: MessageTypeDefinition<_zoekt_webserver_v1_LineMatch, _zoekt_webserver_v1_LineMatch__Output>
|
||||||
|
ListOptions: MessageTypeDefinition<_zoekt_webserver_v1_ListOptions, _zoekt_webserver_v1_ListOptions__Output>
|
||||||
|
ListRequest: MessageTypeDefinition<_zoekt_webserver_v1_ListRequest, _zoekt_webserver_v1_ListRequest__Output>
|
||||||
|
ListResponse: MessageTypeDefinition<_zoekt_webserver_v1_ListResponse, _zoekt_webserver_v1_ListResponse__Output>
|
||||||
|
Location: MessageTypeDefinition<_zoekt_webserver_v1_Location, _zoekt_webserver_v1_Location__Output>
|
||||||
|
MinimalRepoListEntry: MessageTypeDefinition<_zoekt_webserver_v1_MinimalRepoListEntry, _zoekt_webserver_v1_MinimalRepoListEntry__Output>
|
||||||
|
Not: MessageTypeDefinition<_zoekt_webserver_v1_Not, _zoekt_webserver_v1_Not__Output>
|
||||||
|
Or: MessageTypeDefinition<_zoekt_webserver_v1_Or, _zoekt_webserver_v1_Or__Output>
|
||||||
|
Progress: MessageTypeDefinition<_zoekt_webserver_v1_Progress, _zoekt_webserver_v1_Progress__Output>
|
||||||
|
Q: MessageTypeDefinition<_zoekt_webserver_v1_Q, _zoekt_webserver_v1_Q__Output>
|
||||||
|
Range: MessageTypeDefinition<_zoekt_webserver_v1_Range, _zoekt_webserver_v1_Range__Output>
|
||||||
|
RawConfig: MessageTypeDefinition<_zoekt_webserver_v1_RawConfig, _zoekt_webserver_v1_RawConfig__Output>
|
||||||
|
Regexp: MessageTypeDefinition<_zoekt_webserver_v1_Regexp, _zoekt_webserver_v1_Regexp__Output>
|
||||||
|
Repo: MessageTypeDefinition<_zoekt_webserver_v1_Repo, _zoekt_webserver_v1_Repo__Output>
|
||||||
|
RepoIds: MessageTypeDefinition<_zoekt_webserver_v1_RepoIds, _zoekt_webserver_v1_RepoIds__Output>
|
||||||
|
RepoListEntry: MessageTypeDefinition<_zoekt_webserver_v1_RepoListEntry, _zoekt_webserver_v1_RepoListEntry__Output>
|
||||||
|
RepoRegexp: MessageTypeDefinition<_zoekt_webserver_v1_RepoRegexp, _zoekt_webserver_v1_RepoRegexp__Output>
|
||||||
|
RepoSet: MessageTypeDefinition<_zoekt_webserver_v1_RepoSet, _zoekt_webserver_v1_RepoSet__Output>
|
||||||
|
RepoStats: MessageTypeDefinition<_zoekt_webserver_v1_RepoStats, _zoekt_webserver_v1_RepoStats__Output>
|
||||||
|
Repository: MessageTypeDefinition<_zoekt_webserver_v1_Repository, _zoekt_webserver_v1_Repository__Output>
|
||||||
|
RepositoryBranch: MessageTypeDefinition<_zoekt_webserver_v1_RepositoryBranch, _zoekt_webserver_v1_RepositoryBranch__Output>
|
||||||
|
SearchOptions: MessageTypeDefinition<_zoekt_webserver_v1_SearchOptions, _zoekt_webserver_v1_SearchOptions__Output>
|
||||||
|
SearchRequest: MessageTypeDefinition<_zoekt_webserver_v1_SearchRequest, _zoekt_webserver_v1_SearchRequest__Output>
|
||||||
|
SearchResponse: MessageTypeDefinition<_zoekt_webserver_v1_SearchResponse, _zoekt_webserver_v1_SearchResponse__Output>
|
||||||
|
Stats: MessageTypeDefinition<_zoekt_webserver_v1_Stats, _zoekt_webserver_v1_Stats__Output>
|
||||||
|
StreamSearchRequest: MessageTypeDefinition<_zoekt_webserver_v1_StreamSearchRequest, _zoekt_webserver_v1_StreamSearchRequest__Output>
|
||||||
|
StreamSearchResponse: MessageTypeDefinition<_zoekt_webserver_v1_StreamSearchResponse, _zoekt_webserver_v1_StreamSearchResponse__Output>
|
||||||
|
Substring: MessageTypeDefinition<_zoekt_webserver_v1_Substring, _zoekt_webserver_v1_Substring__Output>
|
||||||
|
Symbol: MessageTypeDefinition<_zoekt_webserver_v1_Symbol, _zoekt_webserver_v1_Symbol__Output>
|
||||||
|
SymbolInfo: MessageTypeDefinition<_zoekt_webserver_v1_SymbolInfo, _zoekt_webserver_v1_SymbolInfo__Output>
|
||||||
|
Type: MessageTypeDefinition<_zoekt_webserver_v1_Type, _zoekt_webserver_v1_Type__Output>
|
||||||
|
WebserverService: SubtypeConstructor<typeof grpc.Client, _zoekt_webserver_v1_WebserverServiceClient> & { service: _zoekt_webserver_v1_WebserverServiceDefinition }
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
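Editor's note: the ProtoGrpcType interface above is the generated entry point that mirrors the proto package tree and exposes the WebserverService client constructor. A minimal usage sketch (not part of this diff) is shown below; the proto file path, the loader options, and the StreamSearch RPC name are assumptions inferred from the message names in this hunk, not confirmed by it.

import * as grpc from '@grpc/grpc-js';
import * as protoLoader from '@grpc/proto-loader';
import type { ProtoGrpcType } from './proto/webserver'; // hypothetical path to this generated file

// Load the zoekt webserver proto (path and options are illustrative; keepCase matches
// the snake_case field names seen in the generated message types).
const packageDefinition = protoLoader.loadSync('zoekt/webserver/v1/webserver.proto', {
  keepCase: true,
  longs: Number,
  defaults: true,
});

// Cast the loaded package definition to the generated type tree.
const proto = grpc.loadPackageDefinition(packageDefinition) as unknown as ProtoGrpcType;

// Construct a client for the zoekt webserver (address is illustrative).
const client = new proto.zoekt.webserver.v1.WebserverService(
  'localhost:6070',
  grpc.credentials.createInsecure()
);

// A server-streaming search would then look roughly like the following; the RPC name
// is assumed from the StreamSearchRequest/StreamSearchResponse messages:
// const stream = client.StreamSearch({ query, opts });
// stream.on('data', (resp) => { /* handle a StreamSearchResponse__Output */ });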
17 packages/web/src/proto/zoekt/webserver/v1/And.ts Normal file
@ -0,0 +1,17 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/query.proto

import type { Q as _zoekt_webserver_v1_Q, Q__Output as _zoekt_webserver_v1_Q__Output } from '../../../zoekt/webserver/v1/Q';

/**
 * And is matched when all its children are.
 */
export interface And {
  'children'?: (_zoekt_webserver_v1_Q)[];
}

/**
 * And is matched when all its children are.
 */
export interface And__Output {
  'children': (_zoekt_webserver_v1_Q__Output)[];
}
19 packages/web/src/proto/zoekt/webserver/v1/Boost.ts Normal file
@ -0,0 +1,19 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/query.proto

import type { Q as _zoekt_webserver_v1_Q, Q__Output as _zoekt_webserver_v1_Q__Output } from '../../../zoekt/webserver/v1/Q';

/**
 * Boost multiplies the score of its child by the boost factor.
 */
export interface Boost {
  'child'?: (_zoekt_webserver_v1_Q | null);
  'boost'?: (number | string);
}

/**
 * Boost multiplies the score of its child by the boost factor.
 */
export interface Boost__Output {
  'child': (_zoekt_webserver_v1_Q__Output | null);
  'boost': (number);
}
24 packages/web/src/proto/zoekt/webserver/v1/Branch.ts Normal file
@ -0,0 +1,24 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/query.proto


/**
 * Branch limits search to a specific branch.
 */
export interface Branch {
  'pattern'?: (string);
  /**
   * exact is true if we want to Pattern to equal branch.
   */
  'exact'?: (boolean);
}

/**
 * Branch limits search to a specific branch.
 */
export interface Branch__Output {
  'pattern': (string);
  /**
   * exact is true if we want to Pattern to equal branch.
   */
  'exact': (boolean);
}
26 packages/web/src/proto/zoekt/webserver/v1/BranchRepos.ts Normal file
@ -0,0 +1,26 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/query.proto


/**
 * BranchRepos is a (branch, sourcegraph repo ids bitmap) tuple. It is a
 * Sourcegraph addition.
 */
export interface BranchRepos {
  'branch'?: (string);
  /**
   * a serialized roaring bitmap of the target repo ids
   */
  'repos'?: (Buffer | Uint8Array | string);
}

/**
 * BranchRepos is a (branch, sourcegraph repo ids bitmap) tuple. It is a
 * Sourcegraph addition.
 */
export interface BranchRepos__Output {
  'branch': (string);
  /**
   * a serialized roaring bitmap of the target repo ids
   */
  'repos': (Buffer);
}
17 packages/web/src/proto/zoekt/webserver/v1/BranchesRepos.ts Normal file
@ -0,0 +1,17 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/query.proto

import type { BranchRepos as _zoekt_webserver_v1_BranchRepos, BranchRepos__Output as _zoekt_webserver_v1_BranchRepos__Output } from '../../../zoekt/webserver/v1/BranchRepos';

/**
 * BranchesRepos is a slice of BranchRepos to match.
 */
export interface BranchesRepos {
  'list'?: (_zoekt_webserver_v1_BranchRepos)[];
}

/**
 * BranchesRepos is a slice of BranchRepos to match.
 */
export interface BranchesRepos__Output {
  'list': (_zoekt_webserver_v1_BranchRepos__Output)[];
}
67 packages/web/src/proto/zoekt/webserver/v1/ChunkMatch.ts Normal file
@ -0,0 +1,67 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/webserver.proto

import type { Location as _zoekt_webserver_v1_Location, Location__Output as _zoekt_webserver_v1_Location__Output } from '../../../zoekt/webserver/v1/Location';
import type { Range as _zoekt_webserver_v1_Range, Range__Output as _zoekt_webserver_v1_Range__Output } from '../../../zoekt/webserver/v1/Range';
import type { SymbolInfo as _zoekt_webserver_v1_SymbolInfo, SymbolInfo__Output as _zoekt_webserver_v1_SymbolInfo__Output } from '../../../zoekt/webserver/v1/SymbolInfo';

export interface ChunkMatch {
  /**
   * A contiguous range of complete lines that fully contains Ranges.
   */
  'content'?: (Buffer | Uint8Array | string);
  /**
   * The location (inclusive) of the beginning of content
   * relative to the beginning of the file. It will always be at the
   * beginning of a line (Column will always be 1).
   */
  'content_start'?: (_zoekt_webserver_v1_Location | null);
  /**
   * True if this match is a match on the file name, in
   * which case Content will contain the file name.
   */
  'file_name'?: (boolean);
  /**
   * A set of matching ranges within this chunk. Each range is relative
   * to the beginning of the file (not the beginning of Content).
   */
  'ranges'?: (_zoekt_webserver_v1_Range)[];
  /**
   * The symbol information associated with Ranges. If it is non-nil,
   * its length will equal that of Ranges. Any of its elements may be nil.
   */
  'symbol_info'?: (_zoekt_webserver_v1_SymbolInfo)[];
  'score'?: (number | string);
  'debug_score'?: (string);
  'best_line_match'?: (number);
}

export interface ChunkMatch__Output {
  /**
   * A contiguous range of complete lines that fully contains Ranges.
   */
  'content': (Buffer);
  /**
   * The location (inclusive) of the beginning of content
   * relative to the beginning of the file. It will always be at the
   * beginning of a line (Column will always be 1).
   */
  'content_start': (_zoekt_webserver_v1_Location__Output | null);
  /**
   * True if this match is a match on the file name, in
   * which case Content will contain the file name.
   */
  'file_name': (boolean);
  /**
   * A set of matching ranges within this chunk. Each range is relative
   * to the beginning of the file (not the beginning of Content).
   */
  'ranges': (_zoekt_webserver_v1_Range__Output)[];
  /**
   * The symbol information associated with Ranges. If it is non-nil,
   * its length will equal that of Ranges. Any of its elements may be nil.
   */
  'symbol_info': (_zoekt_webserver_v1_SymbolInfo__Output)[];
  'score': (number);
  'debug_score': (string);
  'best_line_match': (number);
}
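Editor's note: because ChunkMatch content arrives as a Buffer on the wire, a consumer typically decodes it before rendering. A small illustrative sketch (not part of this diff), assuming UTF-8 text and an illustrative relative import path:

import type { ChunkMatch__Output } from './ChunkMatch'; // illustrative path

// content is a contiguous range of complete lines; decode and split for display.
// Binary or non-UTF-8 content would need different handling.
function chunkToLines(chunk: ChunkMatch__Output): string[] {
  return chunk.content.toString('utf8').split('\n');
}

// When file_name is true, content holds the file name rather than file text.
function chunkLabel(chunk: ChunkMatch__Output): string {
  return chunk.file_name
    ? `filename match: ${chunk.content.toString('utf8')}`
    : `${chunk.ranges.length} matching range(s)`;
}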
132 packages/web/src/proto/zoekt/webserver/v1/FileMatch.ts Normal file
@ -0,0 +1,132 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/webserver.proto

import type { LineMatch as _zoekt_webserver_v1_LineMatch, LineMatch__Output as _zoekt_webserver_v1_LineMatch__Output } from '../../../zoekt/webserver/v1/LineMatch';
import type { ChunkMatch as _zoekt_webserver_v1_ChunkMatch, ChunkMatch__Output as _zoekt_webserver_v1_ChunkMatch__Output } from '../../../zoekt/webserver/v1/ChunkMatch';

/**
 * FileMatch contains all the matches within a file.
 */
export interface FileMatch {
  /**
   * Ranking; the higher, the better.
   */
  'score'?: (number | string);
  /**
   * For debugging. Needs DebugScore set, but public so tests in
   * other packages can print some diagnostics.
   */
  'debug'?: (string);
  /**
   * The repository-relative path to the file.
   * 🚨 Warning: file_name might not be a valid UTF-8 string.
   */
  'file_name'?: (Buffer | Uint8Array | string);
  /**
   * Repository is the globally unique name of the repo of the
   * match
   */
  'repository'?: (string);
  'branches'?: (string)[];
  /**
   * One of line_matches or chunk_matches will be returned depending on whether
   * the SearchOptions.ChunkMatches is set.
   */
  'line_matches'?: (_zoekt_webserver_v1_LineMatch)[];
  'chunk_matches'?: (_zoekt_webserver_v1_ChunkMatch)[];
  /**
   * repository_id is a Sourcegraph extension. This is the ID of Repository in
   * Sourcegraph.
   */
  'repository_id'?: (number);
  'repository_priority'?: (number | string);
  /**
   * Only set if requested
   */
  'content'?: (Buffer | Uint8Array | string);
  /**
   * Checksum of the content.
   */
  'checksum'?: (Buffer | Uint8Array | string);
  /**
   * Detected language of the result.
   */
  'language'?: (string);
  /**
   * sub_repository_name is the globally unique name of the repo,
   * if it came from a subrepository
   */
  'sub_repository_name'?: (string);
  /**
   * sub_repository_path holds the prefix where the subrepository
   * was mounted.
   */
  'sub_repository_path'?: (string);
  /**
   * Commit SHA1 (hex) of the (sub)repo holding the file.
   */
  'version'?: (string);
}

/**
 * FileMatch contains all the matches within a file.
 */
export interface FileMatch__Output {
  /**
   * Ranking; the higher, the better.
   */
  'score': (number);
  /**
   * For debugging. Needs DebugScore set, but public so tests in
   * other packages can print some diagnostics.
   */
  'debug': (string);
  /**
   * The repository-relative path to the file.
   * 🚨 Warning: file_name might not be a valid UTF-8 string.
   */
  'file_name': (Buffer);
  /**
   * Repository is the globally unique name of the repo of the
   * match
   */
  'repository': (string);
  'branches': (string)[];
  /**
   * One of line_matches or chunk_matches will be returned depending on whether
   * the SearchOptions.ChunkMatches is set.
   */
  'line_matches': (_zoekt_webserver_v1_LineMatch__Output)[];
  'chunk_matches': (_zoekt_webserver_v1_ChunkMatch__Output)[];
  /**
   * repository_id is a Sourcegraph extension. This is the ID of Repository in
   * Sourcegraph.
   */
  'repository_id': (number);
  'repository_priority': (number);
  /**
   * Only set if requested
   */
  'content': (Buffer);
  /**
   * Checksum of the content.
   */
  'checksum': (Buffer);
  /**
   * Detected language of the result.
   */
  'language': (string);
  /**
   * sub_repository_name is the globally unique name of the repo,
   * if it came from a subrepository
   */
  'sub_repository_name': (string);
  /**
   * sub_repository_path holds the prefix where the subrepository
   * was mounted.
   */
  'sub_repository_path': (string);
  /**
   * Commit SHA1 (hex) of the (sub)repo holding the file.
   */
  'version': (string);
}
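Editor's note: a sketch (not part of this diff) of flattening a received FileMatch__Output into the handful of fields a result list typically needs; the helper name and import path are illustrative.

import type { FileMatch__Output } from './FileMatch'; // illustrative path

function summarizeFileMatch(match: FileMatch__Output) {
  return {
    // file_name is a Buffer because it may not be valid UTF-8; lossy-decode for display.
    fileName: match.file_name.toString('utf8'),
    repository: match.repository,
    branches: match.branches,
    language: match.language,
    // Only one of chunk_matches / line_matches is populated, depending on
    // SearchOptions.ChunkMatches; count whichever is non-empty.
    matchCount: match.chunk_matches.length > 0
      ? match.chunk_matches.length
      : match.line_matches.length,
    score: match.score,
  };
}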
16 packages/web/src/proto/zoekt/webserver/v1/FileNameSet.ts Normal file
@ -0,0 +1,16 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/query.proto


/**
 * FileNameSet is a list of file names to match.
 */
export interface FileNameSet {
  'set'?: (string)[];
}

/**
 * FileNameSet is a list of file names to match.
 */
export interface FileNameSet__Output {
  'set': (string)[];
}
20 packages/web/src/proto/zoekt/webserver/v1/FlushReason.ts Normal file
@ -0,0 +1,20 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/webserver.proto

export const FlushReason = {
  FLUSH_REASON_UNKNOWN_UNSPECIFIED: 'FLUSH_REASON_UNKNOWN_UNSPECIFIED',
  FLUSH_REASON_TIMER_EXPIRED: 'FLUSH_REASON_TIMER_EXPIRED',
  FLUSH_REASON_FINAL_FLUSH: 'FLUSH_REASON_FINAL_FLUSH',
  FLUSH_REASON_MAX_SIZE: 'FLUSH_REASON_MAX_SIZE',
} as const;

export type FlushReason =
  | 'FLUSH_REASON_UNKNOWN_UNSPECIFIED'
  | 0
  | 'FLUSH_REASON_TIMER_EXPIRED'
  | 1
  | 'FLUSH_REASON_FINAL_FLUSH'
  | 2
  | 'FLUSH_REASON_MAX_SIZE'
  | 3

export type FlushReason__Output = typeof FlushReason[keyof typeof FlushReason]
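Editor's note: FlushReason is generated as a const lookup object plus string-literal unions rather than a TypeScript enum, so handling it is just a switch over the string values. An illustrative handler (not part of this diff; the import path and wording of the messages are assumptions):

import { FlushReason, type FlushReason__Output } from './FlushReason'; // illustrative path

// Describe why a batch of streamed results was flushed to the client.
function describeFlush(reason: FlushReason__Output): string {
  switch (reason) {
    case FlushReason.FLUSH_REASON_TIMER_EXPIRED:
      return 'partial results flushed on a timer';
    case FlushReason.FLUSH_REASON_MAX_SIZE:
      return 'partial results flushed because the buffer hit its size limit';
    case FlushReason.FLUSH_REASON_FINAL_FLUSH:
      return 'final flush: the search is complete';
    default:
      return 'unknown flush reason';
  }
}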
26 packages/web/src/proto/zoekt/webserver/v1/IndexMetadata.ts Normal file
@ -0,0 +1,26 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/webserver.proto

import type { Timestamp as _google_protobuf_Timestamp, Timestamp__Output as _google_protobuf_Timestamp__Output } from '../../../google/protobuf/Timestamp';
import type { Long } from '@grpc/proto-loader';

export interface IndexMetadata {
  'index_format_version'?: (number | string | Long);
  'index_feature_version'?: (number | string | Long);
  'index_min_reader_version'?: (number | string | Long);
  'index_time'?: (_google_protobuf_Timestamp | null);
  'plain_ascii'?: (boolean);
  'language_map'?: ({[key: string]: number});
  'zoekt_version'?: (string);
  'id'?: (string);
}

export interface IndexMetadata__Output {
  'index_format_version': (number);
  'index_feature_version': (number);
  'index_min_reader_version': (number);
  'index_time': (_google_protobuf_Timestamp__Output | null);
  'plain_ascii': (boolean);
  'language_map': ({[key: string]: number});
  'zoekt_version': (string);
  'id': (string);
}
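Editor's note: the int64 fields here show the proto-loader convention used throughout these generated types: request messages accept number | string | Long, while the __Output variants reflect the loader's longs option (plain number here, which assumes the proto was loaded with longs set to Number). A tiny sketch of what that permits on the input side (not part of this diff; import path illustrative):

import type { IndexMetadata } from './IndexMetadata'; // illustrative path

// Any of the accepted int64 representations can be used when building a message;
// values read back from the server arrive as plain numbers under the assumed config.
const meta: IndexMetadata = {
  index_format_version: 20,
  index_feature_version: '12',
  plain_ascii: true,
};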
10 packages/web/src/proto/zoekt/webserver/v1/Language.ts Normal file
@ -0,0 +1,10 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/query.proto


export interface Language {
  'language'?: (string);
}

export interface Language__Output {
  'language': (string);
}
38 packages/web/src/proto/zoekt/webserver/v1/LineFragmentMatch.ts Normal file
@ -0,0 +1,38 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/webserver.proto

import type { SymbolInfo as _zoekt_webserver_v1_SymbolInfo, SymbolInfo__Output as _zoekt_webserver_v1_SymbolInfo__Output } from '../../../zoekt/webserver/v1/SymbolInfo';
import type { Long } from '@grpc/proto-loader';

export interface LineFragmentMatch {
  /**
   * Offset within the line, in bytes.
   */
  'line_offset'?: (number | string | Long);
  /**
   * Offset from file start, in bytes.
   */
  'offset'?: (number);
  /**
   * Number bytes that match.
   */
  'match_length'?: (number | string | Long);
  'symbol_info'?: (_zoekt_webserver_v1_SymbolInfo | null);
  '_symbol_info'?: "symbol_info";
}

export interface LineFragmentMatch__Output {
  /**
   * Offset within the line, in bytes.
   */
  'line_offset': (number);
  /**
   * Offset from file start, in bytes.
   */
  'offset': (number);
  /**
   * Number bytes that match.
   */
  'match_length': (number);
  'symbol_info'?: (_zoekt_webserver_v1_SymbolInfo__Output | null);
  '_symbol_info'?: "symbol_info";
}
50 packages/web/src/proto/zoekt/webserver/v1/LineMatch.ts Normal file
@ -0,0 +1,50 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/webserver.proto

import type { LineFragmentMatch as _zoekt_webserver_v1_LineFragmentMatch, LineFragmentMatch__Output as _zoekt_webserver_v1_LineFragmentMatch__Output } from '../../../zoekt/webserver/v1/LineFragmentMatch';
import type { Long } from '@grpc/proto-loader';

export interface LineMatch {
  'line'?: (Buffer | Uint8Array | string);
  'line_start'?: (number | string | Long);
  'line_end'?: (number | string | Long);
  'line_number'?: (number | string | Long);
  /**
   * before and after are only set when SearchOptions.NumContextLines is > 0
   */
  'before'?: (Buffer | Uint8Array | string);
  'after'?: (Buffer | Uint8Array | string);
  /**
   * If set, this was a match on the filename.
   */
  'file_name'?: (boolean);
  /**
   * The higher the better. Only ranks the quality of the match
   * within the file, does not take rank of file into account
   */
  'score'?: (number | string);
  'debug_score'?: (string);
  'line_fragments'?: (_zoekt_webserver_v1_LineFragmentMatch)[];
}

export interface LineMatch__Output {
  'line': (Buffer);
  'line_start': (number);
  'line_end': (number);
  'line_number': (number);
  /**
   * before and after are only set when SearchOptions.NumContextLines is > 0
   */
  'before': (Buffer);
  'after': (Buffer);
  /**
   * If set, this was a match on the filename.
   */
  'file_name': (boolean);
  /**
   * The higher the better. Only ranks the quality of the match
   * within the file, does not take rank of file into account
   */
  'score': (number);
  'debug_score': (string);
  'line_fragments': (_zoekt_webserver_v1_LineFragmentMatch__Output)[];
}
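Editor's note: a sketch (not part of this diff) of turning a LineMatch__Output into a display string, using the byte offsets carried by its line fragments; the helper name and import path are illustrative, and UTF-8 line content is assumed.

import type { LineMatch__Output } from './LineMatch'; // illustrative path

// Render "<line number>: <text>" plus the byte ranges of the matched fragments
// within the line (line_offset / match_length are byte offsets, per the comments above).
function formatLineMatch(m: LineMatch__Output): string {
  const text = m.line.toString('utf8').trimEnd();
  const fragments = m.line_fragments
    .map((f) => `[${f.line_offset}, ${f.line_offset + f.match_length})`)
    .join(' ');
  return fragments.length > 0
    ? `${m.line_number}: ${text} (matches at ${fragments})`
    : `${m.line_number}: ${text}`;
}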
34 packages/web/src/proto/zoekt/webserver/v1/ListOptions.ts Normal file
@ -0,0 +1,34 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/webserver.proto


// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/webserver.proto

export const _zoekt_webserver_v1_ListOptions_RepoListField = {
  REPO_LIST_FIELD_UNKNOWN_UNSPECIFIED: 'REPO_LIST_FIELD_UNKNOWN_UNSPECIFIED',
  REPO_LIST_FIELD_REPOS: 'REPO_LIST_FIELD_REPOS',
  REPO_LIST_FIELD_REPOS_MAP: 'REPO_LIST_FIELD_REPOS_MAP',
} as const;

export type _zoekt_webserver_v1_ListOptions_RepoListField =
  | 'REPO_LIST_FIELD_UNKNOWN_UNSPECIFIED'
  | 0
  | 'REPO_LIST_FIELD_REPOS'
  | 1
  | 'REPO_LIST_FIELD_REPOS_MAP'
  | 3

export type _zoekt_webserver_v1_ListOptions_RepoListField__Output = typeof _zoekt_webserver_v1_ListOptions_RepoListField[keyof typeof _zoekt_webserver_v1_ListOptions_RepoListField]

export interface ListOptions {
  /**
   * Field decides which field to populate in RepoList response.
   */
  'field'?: (_zoekt_webserver_v1_ListOptions_RepoListField);
}

export interface ListOptions__Output {
  /**
   * Field decides which field to populate in RepoList response.
   */
  'field': (_zoekt_webserver_v1_ListOptions_RepoListField__Output);
}
14 packages/web/src/proto/zoekt/webserver/v1/ListRequest.ts Normal file
@ -0,0 +1,14 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/webserver.proto

import type { Q as _zoekt_webserver_v1_Q, Q__Output as _zoekt_webserver_v1_Q__Output } from '../../../zoekt/webserver/v1/Q';
import type { ListOptions as _zoekt_webserver_v1_ListOptions, ListOptions__Output as _zoekt_webserver_v1_ListOptions__Output } from '../../../zoekt/webserver/v1/ListOptions';

export interface ListRequest {
  'query'?: (_zoekt_webserver_v1_Q | null);
  'opts'?: (_zoekt_webserver_v1_ListOptions | null);
}

export interface ListRequest__Output {
  'query': (_zoekt_webserver_v1_Q__Output | null);
  'opts': (_zoekt_webserver_v1_ListOptions__Output | null);
}
40 packages/web/src/proto/zoekt/webserver/v1/ListResponse.ts Normal file
@ -0,0 +1,40 @@
// Original file: ../../vendor/zoekt/grpc/protos/zoekt/webserver/v1/webserver.proto

import type { RepoListEntry as _zoekt_webserver_v1_RepoListEntry, RepoListEntry__Output as _zoekt_webserver_v1_RepoListEntry__Output } from '../../../zoekt/webserver/v1/RepoListEntry';
import type { MinimalRepoListEntry as _zoekt_webserver_v1_MinimalRepoListEntry, MinimalRepoListEntry__Output as _zoekt_webserver_v1_MinimalRepoListEntry__Output } from '../../../zoekt/webserver/v1/MinimalRepoListEntry';
import type { RepoStats as _zoekt_webserver_v1_RepoStats, RepoStats__Output as _zoekt_webserver_v1_RepoStats__Output } from '../../../zoekt/webserver/v1/RepoStats';
import type { Long } from '@grpc/proto-loader';

export interface ListResponse {
  /**
   * Returned when ListOptions.Field is RepoListFieldRepos.
   */
  'repos'?: (_zoekt_webserver_v1_RepoListEntry)[];
  /**
   * ReposMap is set when ListOptions.Field is RepoListFieldReposMap.
   */
  'repos_map'?: ({[key: number]: _zoekt_webserver_v1_MinimalRepoListEntry});
  'crashes'?: (number | string | Long);
  /**
   * Stats response to a List request.
   * This is the aggregate RepoStats of all repos matching the input query.
   */
  'stats'?: (_zoekt_webserver_v1_RepoStats | null);
}

export interface ListResponse__Output {
  /**
   * Returned when ListOptions.Field is RepoListFieldRepos.
   */
  'repos': (_zoekt_webserver_v1_RepoListEntry__Output)[];
  /**
   * ReposMap is set when ListOptions.Field is RepoListFieldReposMap.
   */
  'repos_map': ({[key: number]: _zoekt_webserver_v1_MinimalRepoListEntry__Output});
  'crashes': (number);
  /**
   * Stats response to a List request.
   * This is the aggregate RepoStats of all repos matching the input query.
   */
  'stats': (_zoekt_webserver_v1_RepoStats__Output | null);
}
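Editor's note: which ListResponse__Output field is populated depends on the ListOptions.field value sent with the request (REPO_LIST_FIELD_REPOS vs REPO_LIST_FIELD_REPOS_MAP). A small illustrative handler (not part of this diff; helper name and import path are assumptions):

import type { ListResponse__Output } from './ListResponse'; // illustrative path

// Count matching repositories regardless of which representation the server returned.
function countRepos(resp: ListResponse__Output): number {
  if (resp.repos.length > 0) {
    return resp.repos.length;
  }
  return Object.keys(resp.repos_map).length;
}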
Some files were not shown because too many files have changed in this diff.