Compare commits


10 Commits

Author SHA1 Message Date
Nathan Sobo
d104df955e WIP 2025-06-04 09:50:06 -07:00
Mikayla Maki
45de5c080a remove old comment 2025-05-31 22:17:44 -07:00
Mikayla Maki
03bed065e7 Slightly optimize the noop reordering case 2025-05-31 22:15:51 -07:00
Mikayla Maki
a3e7c4e961 Add a test showing that the channels can be in any order 2025-05-31 21:09:45 -07:00
Mikayla Maki
24ccd8268d Add missing channel moving tests
Fix a bug in channel reordering when moving a channel into another channel
2025-05-31 20:38:50 -07:00
Mikayla Maki
9168a94c6a Remove unnecessary complexity.
Fix a test that wasn't actually testing anything when ordering
constraints were removed.
2025-05-31 19:55:15 -07:00
Mikayla Maki
b96463e5bb remove unnecessary channel ordering 2025-05-31 19:31:44 -07:00
Nathan Sobo
0e12e31edc Fix channel reordering to return only swapped channels
The reorder_channel function was returning all sibling channels instead
of just the two that were swapped, causing test failures that expected
exactly 2 channels to be returned.

This change modifies the function to return only the two channels that
had their order values swapped, which is more efficient and matches
the test expectations.
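
In miniature (illustrative code, not the actual collab-server function): the swap mutates two adjacent entries and reports exactly those two.

```rust
// Minimal model of the fix: return only the swapped pair, so callers
// (and the tests) can assert on exactly two affected channels.
fn swap_orders(orders: &mut [i32], ix: usize) -> Vec<usize> {
    orders.swap(ix, ix + 1);
    vec![ix, ix + 1] // the two affected entries, not every sibling
}

fn main() {
    let mut orders = vec![1, 2, 3, 4];
    let affected = swap_orders(&mut orders, 1);
    assert_eq!(orders, vec![1, 3, 2, 4]);
    assert_eq!(affected.len(), 2);
}
```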
2025-05-31 15:53:45 -06:00
Nathan Sobo
a8287e4289 Fix CollabPanel channel reordering keybindings
The channel reordering keybindings (Cmd/Ctrl+Up/Down) were non-functional because the CollabPanel wasn't properly setting the key context to distinguish between editing and non-editing states.

## Problem

The keymaps specified 'CollabPanel && not_editing' as the required context for the reordering actions, but CollabPanel was only setting 'CollabPanel' without the editing state modifier. This meant the keybindings were never matched.

## Solution

Added a dispatch_context method to CollabPanel that:
- Checks if the channel_name_editor is focused
- Adds 'editing' context when renaming/creating channels
- Adds 'not_editing' context during normal navigation
- Follows the same pattern used by ProjectPanel and OutlinePanel

This allows the keybindings to work correctly while preventing conflicts between text editing operations (where arrows move the cursor) and channel navigation operations (where arrows reorder channels).
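
A sketch of the method's shape, assuming GPUI's `KeyContext` API and the `channel_name_editor` field named above (paraphrased from the pattern described here, not copied from the source):

```rust
fn dispatch_context(&self, window: &Window, cx: &Context<Self>) -> KeyContext {
    let mut dispatch_context = KeyContext::new_with_defaults();
    dispatch_context.add("CollabPanel");

    // Keymaps can now match "CollabPanel && editing" or "CollabPanel && not_editing".
    let state = if self
        .channel_name_editor
        .focus_handle(cx)
        .contains_focused(window, cx)
    {
        "editing"
    } else {
        "not_editing"
    };
    dispatch_context.add(state);

    dispatch_context
}
```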
2025-05-31 14:49:51 -06:00
Nathan Sobo
8019a3925d Add channel reordering functionality
This change introduces the ability for channel administrators to reorder channels within their parent context, providing better organizational control over channel hierarchies. Channels can now be moved up or down relative to their siblings through keyboard shortcuts.

## Problem

Previously, channels were displayed in alphabetical order with no way to customize their arrangement. This made it difficult for teams to organize channels in a logical order that reflected their workflow or importance, forcing users to prefix channel names with numbers or special characters as a workaround.

## Solution

The implementation adds a persistent `channel_order` field to channels that determines their display order within their parent. Channels with the same parent are sorted by this field rather than alphabetically.

## Technical Implementation

**Database Schema:**
- Added `channel_order` INTEGER column to the channels table with a default value of 1
- Created a compound index on `(parent_path, channel_order)` for efficient ordering queries
- Migration updates existing channels with deterministic ordering based on their current alphabetical position

**Core Changes:**
- Extended the `Channel` struct across the codebase to include the `channel_order` field
- Modified `ChannelIndex` sorting logic to use channel_order instead of alphabetical name comparison
- Channels now maintain their relative position among siblings
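
The extended struct, as it appears in the channel crate hunk further down this page (the `id` field is an assumption; it sits above the lines shown there):

```rust
pub struct Channel {
    pub id: ChannelId,
    pub name: SharedString,
    pub visibility: proto::ChannelVisibility,
    pub parent_path: Vec<ChannelId>,
    pub channel_order: i32, // position among siblings; replaces name-based sorting
}
```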

**RPC Protocol:**
- Added `ReorderChannel` RPC message with channel_id and direction (Up/Down)
- Extended existing Channel proto message to include the channel_order field
- Server validates that users have admin permissions before allowing reordering

**Reordering Logic:**
- When moving a channel up/down, the server finds the adjacent sibling in that direction
- Swaps the `channel_order` values between the two channels
- Broadcasts the updated channel list to all affected users
- Handles edge cases: channels at boundaries, channels with gaps in ordering
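
A self-contained sketch of that swap, with plain `(id, order)` tuples standing in for database rows (the real handler also verifies admin permissions and persists the result):

```rust
#[derive(Clone, Copy, Debug, PartialEq)]
enum Direction { Up, Down }

/// Swap `channel_order` values with the adjacent sibling.
/// Returns the two updated `(id, order)` pairs, or `None` at a boundary.
fn reorder(
    siblings: &mut [(u64, i32)],
    channel_id: u64,
    direction: Direction,
) -> Option<[(u64, i32); 2]> {
    // Sort by current order; gaps in the numbering are fine because we
    // swap the stored values rather than renumbering everything.
    siblings.sort_by_key(|&(_, order)| order);
    let ix = siblings.iter().position(|&(id, _)| id == channel_id)?;
    let neighbor = match direction {
        Direction::Up => ix.checked_sub(1)?,
        Direction::Down => (ix + 1 < siblings.len()).then_some(ix + 1)?,
    };
    let (a, b) = (siblings[ix].1, siblings[neighbor].1);
    siblings[ix].1 = b;
    siblings[neighbor].1 = a;
    Some([siblings[ix], siblings[neighbor]])
}
```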

**User Interface:**
- Added keyboard shortcuts:
  - macOS: `Cmd+Up/Down` to move channels
  - Linux: `Ctrl+Up/Down` to move channels
- Added `MoveChannelUp` and `MoveChannelDown` actions to the collab panel
- Reordering is only available when a channel is selected (not during editing)

**Error Handling:**
The change also improves error handling practices throughout the codebase by ensuring that errors from async operations propagate correctly to the UI layer instead of being silently discarded with `let _ =`.
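
For example, with generic names standing in for the actual client API:

```rust
use anyhow::{Context as _, Result};

fn request_reorder() -> Result<()> {
    Ok(()) // stand-in for a fallible request
}

fn move_channel_up() -> Result<()> {
    // Before: `let _ = request_reorder();` silently discarded failures.
    // After: propagate with `?` (or `.log_err()` where visibility suffices)
    // so the error reaches the UI layer.
    request_reorder().context("failed to reorder channel")?;
    Ok(())
}
```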

**Testing:**
Comprehensive tests were added to verify:
- Basic reordering functionality up and down
- Boundary conditions (first/last channels)
- Permission checks (non-admins cannot reorder)
- Ordering persistence across server restarts
- Correct broadcasting of changes to channel members
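
As a flavor of the boundary checks, exercising the `reorder` sketch above:

```rust
#[test]
fn boundary_channels_stay_put() {
    let mut siblings = vec![(1, 10), (2, 20), (3, 30)];
    // The first channel cannot move further up...
    assert_eq!(reorder(&mut siblings, 1, Direction::Up), None);
    // ...but moving it down swaps order values with its next sibling only.
    assert_eq!(
        reorder(&mut siblings, 1, Direction::Down),
        Some([(1, 20), (2, 10)])
    );
}
```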

**Known Issue:**
The keybindings are currently non-functional due to missing context setup in CollabPanel. The keymaps expect 'CollabPanel && not_editing' context, but the panel only provides 'CollabPanel'. This will be fixed in a follow-up commit.
2025-05-31 14:45:13 -06:00
171 changed files with 5210 additions and 6783 deletions


@@ -482,9 +482,7 @@ jobs:
- macos_tests
- windows_clippy
- windows_tests
if: |
github.repository_owner == 'zed-industries' &&
always()
if: always()
steps:
- name: Check all tests passed
run: |

.rules

@@ -5,6 +5,12 @@
* Prefer implementing functionality in existing files unless it is a new logical component. Avoid creating many small files.
* Avoid using functions that panic like `unwrap()`, instead use mechanisms like `?` to propagate errors.
* Be careful with operations like indexing which may panic if the indexes are out of bounds.
* Never silently discard errors with `let _ =` on fallible operations. Always handle errors appropriately:
- Propagate errors with `?` when the calling function should handle them
- Use `.log_err()` or similar when you need to ignore errors but want visibility
- Use explicit error handling with `match` or `if let Err(...)` when you need custom logic
- Example: avoid `let _ = client.request(...).await?;` - use `client.request(...).await?;` instead
* When implementing async operations that may fail, ensure errors propagate to the UI layer so users get meaningful feedback.
* Never create files with `mod.rs` paths - prefer `src/some_module.rs` instead of `src/some_module/mod.rs`.
# GPUI

Cargo.lock generated

@@ -114,7 +114,6 @@ dependencies = [
"serde_json_lenient",
"settings",
"smol",
"sqlez",
"streaming_diff",
"telemetry",
"telemetry_events",
@@ -134,7 +133,6 @@ dependencies = [
"workspace-hack",
"zed_actions",
"zed_llm_client",
"zstd",
]
[[package]]
@@ -527,7 +525,6 @@ dependencies = [
"fuzzy",
"gpui",
"indexed_docs",
"indoc",
"language",
"language_model",
"languages",
@@ -7073,7 +7070,6 @@ dependencies = [
"image",
"inventory",
"itertools 0.14.0",
"libc",
"log",
"lyon",
"media",
@@ -8762,7 +8758,6 @@ dependencies = [
"serde",
"serde_json",
"settings",
"shellexpand 2.1.2",
"smallvec",
"smol",
"streaming-iterator",


@@ -1,4 +1,5 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M11 13H10.4C9.76346 13 9.15302 12.7893 8.70296 12.4142C8.25284 12.0391 8 11.5304 8 11V5C8 4.46957 8.25284 3.96086 8.70296 3.58579C9.15302 3.21071 9.76346 3 10.4 3H11" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M5 13H5.6C6.23654 13 6.84698 12.7893 7.29704 12.4142C7.74716 12.0391 8 11.5304 8 11V5C8 4.46957 7.74716 3.96086 7.29704 3.58579C6.84698 3.21071 6.23654 3 5.6 3H5" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<svg width="24" height="24" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M17 20H16C14.9391 20 13.9217 19.6629 13.1716 19.0627C12.4214 18.4626 12 17.6487 12 16.8V7.2C12 6.35131 12.4214 5.53737 13.1716 4.93726C13.9217 4.33714 14.9391 4 16 4H17" stroke="black" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M7 20H8C9.06087 20 10.0783 19.5786 10.8284 18.8284C11.5786 18.0783 12 17.0609 12 16V15" stroke="black" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M7 4H8C9.06087 4 10.0783 4.42143 10.8284 5.17157C11.5786 5.92172 12 6.93913 12 8V9" stroke="black" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


@@ -1,3 +0,0 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M4 3L13 8L4 13V3Z" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


@@ -1,8 +0,0 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M4 12C2.35977 11.85 1 10.575 1 9" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M1.00875 15.2C1.00875 13.625 0.683456 12.275 4.00001 12.2" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M7 9C7 10.575 5.62857 11.85 4 12" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M4 12.2C6.98117 12.2 7 13.625 7 15.2" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<rect x="2.5" y="9" width="3" height="6" rx="1.5" fill="black"/>
<path d="M9 10L13 8L4 3V7.5" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


@@ -1,8 +1,3 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M2 5H4" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M8 5L14 5" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M12 11L14 11" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M2 11H8" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<circle cx="6" cy="5" r="2" fill="black" fill-opacity="0.1" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<circle cx="10" cy="11" r="2" fill="black" fill-opacity="0.1" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<svg width="17" height="17" viewBox="0 0 17 17" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M6.36667 3.79167C5.53364 3.79167 4.85833 4.46697 4.85833 5.3C4.85833 6.13303 5.53364 6.80833 6.36667 6.80833C7.1997 6.80833 7.875 6.13303 7.875 5.3C7.875 4.46697 7.1997 3.79167 6.36667 3.79167ZM2.1 5.925H3.67944C3.9626 7.14732 5.05824 8.05833 6.36667 8.05833C7.67509 8.05833 8.77073 7.14732 9.05389 5.925H14.9C15.2452 5.925 15.525 5.64518 15.525 5.3C15.525 4.95482 15.2452 4.675 14.9 4.675H9.05389C8.77073 3.45268 7.67509 2.54167 6.36667 2.54167C5.05824 2.54167 3.9626 3.45268 3.67944 4.675H2.1C1.75482 4.675 1.475 4.95482 1.475 5.3C1.475 5.64518 1.75482 5.925 2.1 5.925ZM13.3206 12.325C13.0374 13.5473 11.9418 14.4583 10.6333 14.4583C9.32491 14.4583 8.22927 13.5473 7.94611 12.325H2.1C1.75482 12.325 1.475 12.0452 1.475 11.7C1.475 11.3548 1.75482 11.075 2.1 11.075H7.94611C8.22927 9.85268 9.32491 8.94167 10.6333 8.94167C11.9418 8.94167 13.0374 9.85268 13.3206 11.075H14.9C15.2452 11.075 15.525 11.3548 15.525 11.7C15.525 12.0452 15.2452 12.325 14.9 12.325H13.3206ZM9.125 11.7C9.125 10.867 9.8003 10.1917 10.6333 10.1917C11.4664 10.1917 12.1417 10.867 12.1417 11.7C12.1417 12.533 11.4664 13.2083 10.6333 13.2083C9.8003 13.2083 9.125 12.533 9.125 11.7Z" fill="black"/>
</svg>


@@ -1,5 +1,5 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M8 2L6.72534 5.87534C6.6601 6.07367 6.5492 6.25392 6.40155 6.40155C6.25392 6.5492 6.07367 6.6601 5.87534 6.72534L2 8L5.87534 9.27466C6.07367 9.3399 6.25392 9.4508 6.40155 9.59845C6.5492 9.74608 6.6601 9.92633 6.72534 10.1247L8 14L9.27466 10.1247C9.3399 9.92633 9.4508 9.74608 9.59845 9.59845C9.74608 9.4508 9.92633 9.3399 10.1247 9.27466L14 8L10.1247 6.72534C9.92633 6.6601 9.74608 6.5492 9.59845 6.40155C9.4508 6.25392 9.3399 6.07367 9.27466 5.87534L8 2Z" fill="black" fill-opacity="0.15" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M3.33334 2V4.66666M2 3.33334H4.66666" stroke="black" stroke-opacity="0.75" stroke-width="1.25" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M12.6665 11.3333V14M11.3333 12.6666H13.9999" stroke="black" stroke-opacity="0.75" stroke-width="1.25" stroke-linecap="round" stroke-linejoin="round"/>
<svg width="14" height="14" viewBox="0 0 14 14" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M7 1.75L5.88467 5.14092C5.82759 5.31446 5.73055 5.47218 5.60136 5.60136C5.47218 5.73055 5.31446 5.82759 5.14092 5.88467L1.75 7L5.14092 8.11533C5.31446 8.17241 5.47218 8.26945 5.60136 8.39864C5.73055 8.52782 5.82759 8.68554 5.88467 8.85908L7 12.25L8.11533 8.85908C8.17241 8.68554 8.26945 8.52782 8.39864 8.39864C8.52782 8.26945 8.68554 8.17241 8.85908 8.11533L12.25 7L8.85908 5.88467C8.68554 5.82759 8.52782 5.73055 8.39864 5.60136C8.26945 5.47218 8.17241 5.31446 8.11533 5.14092L7 1.75Z" fill="black" fill-opacity="0.15" stroke="black" stroke-width="1.25" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M2.91667 1.75V4.08333M1.75 2.91667H4.08333" stroke="black" stroke-opacity="0.75" stroke-width="1.25" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M11.0833 9.91667V12.25M9.91667 11.0833H12.25" stroke="black" stroke-opacity="0.75" stroke-width="1.25" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


@@ -31,6 +31,8 @@
"ctrl-,": "zed::OpenSettings",
"ctrl-q": "zed::Quit",
"f4": "debugger::Start",
"alt-f4": "debugger::RerunLastSession",
"f5": "debugger::Continue",
"shift-f5": "debugger::Stop",
"ctrl-shift-f5": "debugger::Restart",
"f6": "debugger::Pause",
@@ -581,24 +583,11 @@
"ctrl-alt-r": "task::Rerun",
"alt-t": "task::Rerun",
"alt-shift-t": "task::Spawn",
"alt-shift-r": ["task::Spawn", { "reveal_target": "center" }],
"alt-shift-r": ["task::Spawn", { "reveal_target": "center" }]
// also possible to spawn tasks by name:
// "foo-bar": ["task::Spawn", { "task_name": "MyTask", "reveal_target": "dock" }]
// or by tag:
// "foo-bar": ["task::Spawn", { "task_tag": "MyTag" }],
"f5": "debugger::RerunLastSession"
}
},
{
"context": "Workspace && debugger_running",
"bindings": {
"f5": "zed::NoAction"
}
},
{
"context": "Workspace && debugger_stopped",
"bindings": {
"f5": "debugger::Continue"
}
},
{
@@ -884,8 +873,7 @@
"context": "DebugPanel",
"bindings": {
"ctrl-t": "debugger::ToggleThreadPicker",
"ctrl-i": "debugger::ToggleSessionPicker",
"shift-alt-escape": "debugger::ToggleExpandItem"
"ctrl-i": "debugger::ToggleSessionPicker"
}
},
{
@@ -909,7 +897,9 @@
"context": "CollabPanel && not_editing",
"bindings": {
"ctrl-backspace": "collab_panel::Remove",
"space": "menu::Confirm"
"space": "menu::Confirm",
"ctrl-up": "collab_panel::MoveChannelUp",
"ctrl-down": "collab_panel::MoveChannelDown"
}
},
{
@@ -940,13 +930,6 @@
"tab": "channel_modal::ToggleMode"
}
},
{
"context": "FileFinder",
"bindings": {
"ctrl-shift-a": "file_finder::ToggleSplitMenu",
"ctrl-shift-i": "file_finder::ToggleFilterMenu"
}
},
{
"context": "FileFinder || (FileFinder > Picker > Editor) || (FileFinder > Picker > menu)",
"bindings": {


@@ -4,6 +4,8 @@
"use_key_equivalents": true,
"bindings": {
"f4": "debugger::Start",
"alt-f4": "debugger::RerunLastSession",
"f5": "debugger::Continue",
"shift-f5": "debugger::Stop",
"shift-cmd-f5": "debugger::Restart",
"f6": "debugger::Pause",
@@ -633,8 +635,7 @@
"cmd-k shift-right": "workspace::SwapPaneRight",
"cmd-k shift-up": "workspace::SwapPaneUp",
"cmd-k shift-down": "workspace::SwapPaneDown",
"cmd-shift-x": "zed::Extensions",
"f5": "debugger::RerunLastSession"
"cmd-shift-x": "zed::Extensions"
}
},
{
@@ -651,20 +652,6 @@
// "foo-bar": ["task::Spawn", { "task_tag": "MyTag" }],
}
},
{
"context": "Workspace && debugger_running",
"use_key_equivalents": true,
"bindings": {
"f5": "zed::NoAction"
}
},
{
"context": "Workspace && debugger_stopped",
"use_key_equivalents": true,
"bindings": {
"f5": "debugger::Continue"
}
},
// Bindings from Sublime Text
{
"context": "Editor",
@@ -949,8 +936,7 @@
"context": "DebugPanel",
"bindings": {
"cmd-t": "debugger::ToggleThreadPicker",
"cmd-i": "debugger::ToggleSessionPicker",
"shift-alt-escape": "debugger::ToggleExpandItem"
"cmd-i": "debugger::ToggleSessionPicker"
}
},
{
@@ -965,7 +951,9 @@
"use_key_equivalents": true,
"bindings": {
"ctrl-backspace": "collab_panel::Remove",
"space": "menu::Confirm"
"space": "menu::Confirm",
"cmd-up": "collab_panel::MoveChannelUp",
"cmd-down": "collab_panel::MoveChannelDown"
}
},
{
@@ -1001,14 +989,6 @@
"tab": "channel_modal::ToggleMode"
}
},
{
"context": "FileFinder",
"use_key_equivalents": true,
"bindings": {
"cmd-shift-a": "file_finder::ToggleSplitMenu",
"cmd-shift-i": "file_finder::ToggleFilterMenu"
}
},
{
"context": "FileFinder || (FileFinder > Picker > Editor) || (FileFinder > Picker > menu)",
"use_key_equivalents": true,


@@ -51,11 +51,7 @@
"ctrl-k ctrl-l": "editor::ConvertToLowerCase",
"shift-alt-m": "markdown::OpenPreviewToTheSide",
"ctrl-backspace": "editor::DeleteToPreviousWordStart",
"ctrl-delete": "editor::DeleteToNextWordEnd",
"alt-right": "editor::MoveToNextSubwordEnd",
"alt-left": "editor::MoveToPreviousSubwordStart",
"alt-shift-right": "editor::SelectToNextSubwordEnd",
"alt-shift-left": "editor::SelectToPreviousSubwordStart"
"ctrl-delete": "editor::DeleteToNextWordEnd"
}
},
{


@@ -53,11 +53,7 @@
"cmd-shift-j": "editor::JoinLines",
"shift-alt-m": "markdown::OpenPreviewToTheSide",
"ctrl-backspace": "editor::DeleteToPreviousWordStart",
"ctrl-delete": "editor::DeleteToNextWordEnd",
"ctrl-right": "editor::MoveToNextSubwordEnd",
"ctrl-left": "editor::MoveToPreviousSubwordStart",
"ctrl-shift-right": "editor::SelectToNextSubwordEnd",
"ctrl-shift-left": "editor::SelectToPreviousSubwordStart"
"ctrl-delete": "editor::DeleteToNextWordEnd"
}
},
{


@@ -838,19 +838,6 @@
"tab": "editor::AcceptEditPrediction"
}
},
{
"context": "MessageEditor > Editor && VimControl",
"bindings": {
"enter": "agent::Chat",
// TODO: Implement search
"/": null,
"?": null,
"#": null,
"*": null,
"n": null,
"shift-n": null
}
},
{
"context": "os != macos && Editor && edit_prediction_conflict",
"bindings": {


@@ -128,8 +128,6 @@
//
// Default: true
"restore_on_file_reopen": true,
// Whether to automatically close files that have been deleted on disk.
"close_on_file_delete": false,
// Size of the drop target in the editor.
"drop_target_size": 0.2,
// Whether the window should be closed when using 'close active item' on a window with no tabs.
@@ -733,6 +731,13 @@
// The model to use.
"model": "claude-sonnet-4"
},
// The model to use when applying edits from the agent.
"editor_model": {
// The provider to use.
"provider": "zed.dev",
// The model to use.
"model": "claude-sonnet-4"
},
// Additional parameters for language model requests. When making a request to a model, parameters will be taken
// from the last entry in this list that matches the model's provider and name. In each entry, both provider
// and model are optional, so that you can specify parameters for either one.


@@ -1,7 +1,3 @@
// Some example tasks for common languages.
//
// For more documentation on how to configure debug tasks,
// see: https://zed.dev/docs/debugger
[
{
"label": "Debug active PHP file",


@@ -1,5 +0,0 @@
// Project-local debug tasks
//
// For more documentation on how to configure debug tasks,
// see: https://zed.dev/docs/debugger
[]


@@ -46,7 +46,6 @@ git.workspace = true
gpui.workspace = true
heed.workspace = true
html_to_markdown.workspace = true
indoc.workspace = true
http_client.workspace = true
indexed_docs.workspace = true
inventory.workspace = true
@@ -79,7 +78,6 @@ serde_json.workspace = true
serde_json_lenient.workspace = true
settings.workspace = true
smol.workspace = true
sqlez.workspace = true
streaming_diff.workspace = true
telemetry.workspace = true
telemetry_events.workspace = true
@@ -99,7 +97,6 @@ workspace-hack.workspace = true
workspace.workspace = true
zed_actions.workspace = true
zed_llm_client.workspace = true
zstd.workspace = true
[dev-dependencies]
buffer_diff = { workspace = true, features = ["test-support"] }


@@ -1017,15 +1017,6 @@ impl ActiveThread {
self.play_notification_sound(cx);
self.show_notification("Waiting for tool confirmation", IconName::Info, window, cx);
}
ThreadEvent::ToolUseLimitReached => {
self.play_notification_sound(cx);
self.show_notification(
"Consecutive tool use limit reached.",
IconName::Warning,
window,
cx,
);
}
ThreadEvent::StreamedAssistantText(message_id, text) => {
if let Some(rendered_message) = self.rendered_messages_by_id.get_mut(&message_id) {
rendered_message.append_text(text, cx);


@@ -1372,7 +1372,6 @@ impl AgentDiff {
| ThreadEvent::ToolFinished { .. }
| ThreadEvent::CheckpointChanged
| ThreadEvent::ToolConfirmationNeeded
| ThreadEvent::ToolUseLimitReached
| ThreadEvent::CancelEditing => {}
}
}
@@ -1465,10 +1464,7 @@ impl AgentDiff {
if !AgentSettings::get_global(cx).single_file_review {
for (editor, _) in self.reviewing_editors.drain() {
editor
.update(cx, |editor, cx| {
editor.end_temporary_diff_override(cx);
editor.unregister_addon::<EditorAgentDiffAddon>();
})
.update(cx, |editor, cx| editor.end_temporary_diff_override(cx))
.ok();
}
return;
@@ -1564,10 +1560,7 @@ impl AgentDiff {
if in_workspace {
editor
.update(cx, |editor, cx| {
editor.end_temporary_diff_override(cx);
editor.unregister_addon::<EditorAgentDiffAddon>();
})
.update(cx, |editor, cx| editor.end_temporary_diff_override(cx))
.ok();
self.reviewing_editors.remove(&editor);
}


@@ -734,7 +734,6 @@ impl Display for RulesContext {
#[derive(Debug, Clone)]
pub struct ImageContext {
pub project_path: Option<ProjectPath>,
pub full_path: Option<Arc<Path>>,
pub original_image: Arc<gpui::Image>,
// TODO: handle this elsewhere and remove `ignore-interior-mutability` opt-out in clippy.toml
// needed due to a false positive of `clippy::mutable_key_type`.


@@ -14,7 +14,7 @@ use http_client::HttpClientWithUrl;
use itertools::Itertools;
use language::{Buffer, CodeLabel, HighlightId};
use lsp::CompletionContext;
use project::{Completion, CompletionIntent, CompletionResponse, ProjectPath, Symbol, WorktreeId};
use project::{Completion, CompletionIntent, ProjectPath, Symbol, WorktreeId};
use prompt_store::PromptStore;
use rope::Point;
use text::{Anchor, OffsetRangeExt, ToPoint};
@@ -746,7 +746,7 @@ impl CompletionProvider for ContextPickerCompletionProvider {
_trigger: CompletionContext,
_window: &mut Window,
cx: &mut Context<Editor>,
) -> Task<Result<Vec<CompletionResponse>>> {
) -> Task<Result<Option<Vec<Completion>>>> {
let state = buffer.update(cx, |buffer, _cx| {
let position = buffer_position.to_point(buffer);
let line_start = Point::new(position.row, 0);
@@ -756,13 +756,13 @@ impl CompletionProvider for ContextPickerCompletionProvider {
MentionCompletion::try_parse(line, offset_to_line)
});
let Some(state) = state else {
return Task::ready(Ok(Vec::new()));
return Task::ready(Ok(None));
};
let Some((workspace, context_store)) =
self.workspace.upgrade().zip(self.context_store.upgrade())
else {
return Task::ready(Ok(Vec::new()));
return Task::ready(Ok(None));
};
let snapshot = buffer.read(cx).snapshot();
@@ -815,10 +815,10 @@ impl CompletionProvider for ContextPickerCompletionProvider {
cx.spawn(async move |_, cx| {
let matches = search_task.await;
let Some(editor) = editor.upgrade() else {
return Ok(Vec::new());
return Ok(None);
};
let completions = cx.update(|cx| {
Ok(Some(cx.update(|cx| {
matches
.into_iter()
.filter_map(|mat| match mat {
@@ -901,14 +901,7 @@ impl CompletionProvider for ContextPickerCompletionProvider {
),
})
.collect()
})?;
Ok(vec![CompletionResponse {
completions,
// Since this does its own filtering (see `filter_completions()` returns false),
// there is no benefit to computing whether this set of completions is incomplete.
is_incomplete: true,
}])
})?))
})
}


@@ -7,7 +7,7 @@ use assistant_context_editor::AssistantContext;
use collections::{HashSet, IndexSet};
use futures::{self, FutureExt};
use gpui::{App, Context, Entity, EventEmitter, Image, SharedString, Task, WeakEntity};
use language::{Buffer, File as _};
use language::Buffer;
use language_model::LanguageModelImage;
use project::image_store::is_image_file;
use project::{Project, ProjectItem, ProjectPath, Symbol};
@@ -304,13 +304,11 @@ impl ContextStore {
project.open_image(project_path.clone(), cx)
})?;
let image_item = open_image_task.await?;
let image = image_item.read_with(cx, |image_item, _| image_item.image.clone())?;
this.update(cx, |this, cx| {
let item = image_item.read(cx);
this.insert_image(
Some(item.project_path(cx)),
Some(item.file.full_path(cx).into()),
item.image.clone(),
Some(image_item.read(cx).project_path(cx)),
image,
remove_if_exists,
cx,
)
@@ -319,13 +317,12 @@ impl ContextStore {
}
pub fn add_image_instance(&mut self, image: Arc<Image>, cx: &mut Context<ContextStore>) {
self.insert_image(None, None, image, false, cx);
self.insert_image(None, image, false, cx);
}
fn insert_image(
&mut self,
project_path: Option<ProjectPath>,
full_path: Option<Arc<Path>>,
image: Arc<Image>,
remove_if_exists: bool,
cx: &mut Context<ContextStore>,
@@ -333,7 +330,6 @@ impl ContextStore {
let image_task = LanguageModelImage::from_image(image.clone(), cx).shared();
let context = AgentContextHandle::Image(ImageContext {
project_path,
full_path,
original_image: image,
image_task,
context_id: self.next_context_id.post_inc(),


@@ -152,7 +152,7 @@ impl HistoryStore {
let entries = join_all(entries)
.await
.into_iter()
.filter_map(|result| result.log_with_level(log::Level::Debug))
.filter_map(|result| result.log_err())
.collect::<VecDeque<_>>();
this.update(cx, |this, _| {


@@ -112,7 +112,6 @@ pub(crate) fn create_editor(
editor.set_placeholder_text("Message the agent @ to include context", cx);
editor.set_show_indent_guides(false, cx);
editor.set_soft_wrap();
editor.set_use_modal_editing(true);
editor.set_context_menu_options(ContextMenuOptions {
min_entries_visible: 12,
max_entries_visible: 12,


@@ -179,17 +179,18 @@ impl TerminalTransaction {
// Ensure that the assistant cannot accidentally execute commands that are streamed into the terminal
let input = Self::sanitize_input(hunk);
self.terminal
.update(cx, |terminal, _| terminal.input(input.into_bytes()));
.update(cx, |terminal, _| terminal.input(input));
}
pub fn undo(&self, cx: &mut App) {
self.terminal
.update(cx, |terminal, _| terminal.input(CLEAR_INPUT.as_bytes()));
.update(cx, |terminal, _| terminal.input(CLEAR_INPUT.to_string()));
}
pub fn complete(&self, cx: &mut App) {
self.terminal
.update(cx, |terminal, _| terminal.input(CARRIAGE_RETURN.as_bytes()));
self.terminal.update(cx, |terminal, _| {
terminal.input(CARRIAGE_RETURN.to_string())
});
}
fn sanitize_input(mut input: String) -> String {


@@ -106,7 +106,7 @@ impl TerminalInlineAssistant {
});
let prompt_editor_render = prompt_editor.clone();
let block = terminal_view::BlockProperties {
height: 4,
height: 2,
render: Box::new(move |_| prompt_editor_render.clone().into_any_element()),
};
terminal_view.update(cx, |terminal_view, cx| {
@@ -202,7 +202,7 @@ impl TerminalInlineAssistant {
.update(cx, |terminal, cx| {
terminal
.terminal()
.update(cx, |terminal, _| terminal.input(CLEAR_INPUT.as_bytes()));
.update(cx, |terminal, _| terminal.input(CLEAR_INPUT.to_string()));
})
.log_err();


@@ -1673,7 +1673,6 @@ impl Thread {
}
CompletionRequestStatus::ToolUseLimitReached => {
thread.tool_use_limit_reached = true;
cx.emit(ThreadEvent::ToolUseLimitReached);
}
}
}
@@ -2844,7 +2843,6 @@ pub enum ThreadEvent {
},
CheckpointChanged,
ToolConfirmationNeeded,
ToolUseLimitReached,
CancelEditing,
CompletionCanceled,
}


@@ -1,7 +1,8 @@
use std::borrow::Cow;
use std::cell::{Ref, RefCell};
use std::path::{Path, PathBuf};
use std::rc::Rc;
use std::sync::{Arc, Mutex};
use std::sync::Arc;
use agent_settings::{AgentProfile, AgentProfileId, AgentSettings, CompletionMode};
use anyhow::{Context as _, Result, anyhow};
@@ -16,7 +17,8 @@ use gpui::{
App, BackgroundExecutor, Context, Entity, EventEmitter, Global, ReadGlobal, SharedString,
Subscription, Task, prelude::*,
};
use heed::Database;
use heed::types::SerdeBincode;
use language_model::{LanguageModelToolResultContent, LanguageModelToolUseId, Role, TokenUsage};
use project::context_server_store::{ContextServerStatus, ContextServerStore};
use project::{Project, ProjectItem, ProjectPath, Worktree};
@@ -33,42 +35,6 @@ use crate::context_server_tool::ContextServerTool;
use crate::thread::{
DetailedSummaryState, ExceededWindowError, MessageId, ProjectSnapshot, Thread, ThreadId,
};
use indoc::indoc;
use sqlez::{
bindable::{Bind, Column},
connection::Connection,
statement::Statement,
};
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub enum DataType {
#[serde(rename = "json")]
Json,
#[serde(rename = "zstd")]
Zstd,
}
impl Bind for DataType {
fn bind(&self, statement: &Statement, start_index: i32) -> Result<i32> {
let value = match self {
DataType::Json => "json",
DataType::Zstd => "zstd",
};
value.bind(statement, start_index)
}
}
impl Column for DataType {
fn column(statement: &mut Statement, start_index: i32) -> Result<(Self, i32)> {
let (value, next_index) = String::column(statement, start_index)?;
let data_type = match value.as_str() {
"json" => DataType::Json,
"zstd" => DataType::Zstd,
_ => anyhow::bail!("Unknown data type: {}", value),
};
Ok((data_type, next_index))
}
}
const RULES_FILE_NAMES: [&'static str; 6] = [
".rules",
@@ -900,27 +866,25 @@ impl Global for GlobalThreadsDatabase {}
pub(crate) struct ThreadsDatabase {
executor: BackgroundExecutor,
connection: Arc<Mutex<Connection>>,
env: heed::Env,
threads: Database<SerdeBincode<ThreadId>, SerializedThread>,
}
impl ThreadsDatabase {
fn connection(&self) -> Arc<Mutex<Connection>> {
self.connection.clone()
}
impl heed::BytesEncode<'_> for SerializedThread {
type EItem = SerializedThread;
const COMPRESSION_LEVEL: i32 = 3;
}
impl Bind for ThreadId {
fn bind(&self, statement: &Statement, start_index: i32) -> Result<i32> {
self.to_string().bind(statement, start_index)
fn bytes_encode(item: &Self::EItem) -> Result<Cow<[u8]>, heed::BoxedError> {
serde_json::to_vec(item).map(Cow::Owned).map_err(Into::into)
}
}
impl Column for ThreadId {
fn column(statement: &mut Statement, start_index: i32) -> Result<(Self, i32)> {
let (id_str, next_index) = String::column(statement, start_index)?;
Ok((ThreadId::from(id_str.as_str()), next_index))
impl<'a> heed::BytesDecode<'a> for SerializedThread {
type DItem = SerializedThread;
fn bytes_decode(bytes: &'a [u8]) -> Result<Self::DItem, heed::BoxedError> {
// We implement this type manually because we want to call `SerializedThread::from_json`,
// instead of the Deserialize trait implementation for `SerializedThread`.
SerializedThread::from_json(bytes).map_err(Into::into)
}
}
@@ -936,8 +900,8 @@ impl ThreadsDatabase {
let database_future = executor
.spawn({
let executor = executor.clone();
let threads_dir = paths::data_dir().join("threads");
async move { ThreadsDatabase::new(threads_dir, executor) }
let database_path = paths::data_dir().join("threads/threads-db.1.mdb");
async move { ThreadsDatabase::new(database_path, executor) }
})
.then(|result| future::ready(result.map(Arc::new).map_err(Arc::new)))
.boxed()
@@ -946,144 +910,41 @@ impl ThreadsDatabase {
cx.set_global(GlobalThreadsDatabase(database_future));
}
pub fn new(threads_dir: PathBuf, executor: BackgroundExecutor) -> Result<Self> {
std::fs::create_dir_all(&threads_dir)?;
let sqlite_path = threads_dir.join("threads.db");
let mdb_path = threads_dir.join("threads-db.1.mdb");
let needs_migration_from_heed = mdb_path.exists();
let connection = Connection::open_file(&sqlite_path.to_string_lossy());
connection.exec(indoc! {"
CREATE TABLE IF NOT EXISTS threads (
id TEXT PRIMARY KEY,
summary TEXT NOT NULL,
updated_at TEXT NOT NULL,
data_type TEXT NOT NULL,
data BLOB NOT NULL
)
"})?()
.map_err(|e| anyhow!("Failed to create threads table: {}", e))?;
let db = Self {
executor: executor.clone(),
connection: Arc::new(Mutex::new(connection)),
};
if needs_migration_from_heed {
let db_connection = db.connection();
let executor_clone = executor.clone();
executor
.spawn(async move {
log::info!("Starting threads.db migration");
Self::migrate_from_heed(&mdb_path, db_connection, executor_clone)?;
std::fs::remove_dir_all(mdb_path)?;
log::info!("threads.db migrated to sqlite");
Ok::<(), anyhow::Error>(())
})
.detach();
}
Ok(db)
}
// Remove this migration after 2025-09-01
fn migrate_from_heed(
mdb_path: &Path,
connection: Arc<Mutex<Connection>>,
_executor: BackgroundExecutor,
) -> Result<()> {
use heed::types::SerdeBincode;
struct SerializedThreadHeed(SerializedThread);
impl heed::BytesEncode<'_> for SerializedThreadHeed {
type EItem = SerializedThreadHeed;
fn bytes_encode(
item: &Self::EItem,
) -> Result<std::borrow::Cow<[u8]>, heed::BoxedError> {
serde_json::to_vec(&item.0)
.map(std::borrow::Cow::Owned)
.map_err(Into::into)
}
}
impl<'a> heed::BytesDecode<'a> for SerializedThreadHeed {
type DItem = SerializedThreadHeed;
fn bytes_decode(bytes: &'a [u8]) -> Result<Self::DItem, heed::BoxedError> {
SerializedThread::from_json(bytes)
.map(SerializedThreadHeed)
.map_err(Into::into)
}
}
pub fn new(path: PathBuf, executor: BackgroundExecutor) -> Result<Self> {
std::fs::create_dir_all(&path)?;
const ONE_GB_IN_BYTES: usize = 1024 * 1024 * 1024;
let env = unsafe {
heed::EnvOpenOptions::new()
.map_size(ONE_GB_IN_BYTES)
.max_dbs(1)
.open(mdb_path)?
.open(path)?
};
let txn = env.write_txn()?;
let threads: heed::Database<SerdeBincode<ThreadId>, SerializedThreadHeed> = env
.open_database(&txn, Some("threads"))?
.ok_or_else(|| anyhow!("threads database not found"))?;
let mut txn = env.write_txn()?;
let threads = env.create_database(&mut txn, Some("threads"))?;
txn.commit()?;
for result in threads.iter(&txn)? {
let (thread_id, thread_heed) = result?;
Self::save_thread_sync(&connection, thread_id, thread_heed.0)?;
}
Ok(())
}
fn save_thread_sync(
connection: &Arc<Mutex<Connection>>,
id: ThreadId,
thread: SerializedThread,
) -> Result<()> {
let json_data = serde_json::to_string(&thread)?;
let summary = thread.summary.to_string();
let updated_at = thread.updated_at.to_rfc3339();
let connection = connection.lock().unwrap();
let compressed = zstd::encode_all(json_data.as_bytes(), Self::COMPRESSION_LEVEL)?;
let data_type = DataType::Zstd;
let data = compressed;
let mut insert = connection.exec_bound::<(ThreadId, String, String, DataType, Vec<u8>)>(indoc! {"
INSERT OR REPLACE INTO threads (id, summary, updated_at, data_type, data) VALUES (?, ?, ?, ?, ?)
"})?;
insert((id, summary, updated_at, data_type, data))?;
Ok(())
Ok(Self {
executor,
env,
threads,
})
}
pub fn list_threads(&self) -> Task<Result<Vec<SerializedThreadMetadata>>> {
let connection = self.connection.clone();
let env = self.env.clone();
let threads = self.threads;
self.executor.spawn(async move {
let connection = connection.lock().unwrap();
let mut select =
connection.select_bound::<(), (ThreadId, String, String)>(indoc! {"
SELECT id, summary, updated_at FROM threads ORDER BY updated_at DESC
"})?;
let rows = select(())?;
let txn = env.read_txn()?;
let mut iter = threads.iter(&txn)?;
let mut threads = Vec::new();
for (id, summary, updated_at) in rows {
while let Some((key, value)) = iter.next().transpose()? {
threads.push(SerializedThreadMetadata {
id,
summary: summary.into(),
updated_at: DateTime::parse_from_rfc3339(&updated_at)?.with_timezone(&Utc),
id: key,
summary: value.summary,
updated_at: value.updated_at,
});
}
@@ -1092,51 +953,36 @@ impl ThreadsDatabase {
}
pub fn try_find_thread(&self, id: ThreadId) -> Task<Result<Option<SerializedThread>>> {
let connection = self.connection.clone();
let env = self.env.clone();
let threads = self.threads;
self.executor.spawn(async move {
let connection = connection.lock().unwrap();
let mut select = connection.select_bound::<ThreadId, (DataType, Vec<u8>)>(indoc! {"
SELECT data_type, data FROM threads WHERE id = ? LIMIT 1
"})?;
let rows = select(id)?;
if let Some((data_type, data)) = rows.into_iter().next() {
let json_data = match data_type {
DataType::Zstd => {
let decompressed = zstd::decode_all(&data[..])?;
String::from_utf8(decompressed)?
}
DataType::Json => String::from_utf8(data)?,
};
let thread = SerializedThread::from_json(json_data.as_bytes())?;
Ok(Some(thread))
} else {
Ok(None)
}
let txn = env.read_txn()?;
let thread = threads.get(&txn, &id)?;
Ok(thread)
})
}
pub fn save_thread(&self, id: ThreadId, thread: SerializedThread) -> Task<Result<()>> {
let connection = self.connection.clone();
let env = self.env.clone();
let threads = self.threads;
self.executor
.spawn(async move { Self::save_thread_sync(&connection, id, thread) })
self.executor.spawn(async move {
let mut txn = env.write_txn()?;
threads.put(&mut txn, &id, &thread)?;
txn.commit()?;
Ok(())
})
}
pub fn delete_thread(&self, id: ThreadId) -> Task<Result<()>> {
let connection = self.connection.clone();
let env = self.env.clone();
let threads = self.threads;
self.executor.spawn(async move {
let connection = connection.lock().unwrap();
let mut delete = connection.exec_bound::<ThreadId>(indoc! {"
DELETE FROM threads WHERE id = ?
"})?;
delete(id)?;
let mut txn = env.write_txn()?;
threads.delete(&mut txn, &id)?;
txn.commit()?;
Ok(())
})
}


@@ -304,7 +304,7 @@ impl AddedContext {
AgentContextHandle::Thread(handle) => Some(Self::pending_thread(handle, cx)),
AgentContextHandle::TextThread(handle) => Some(Self::pending_text_thread(handle, cx)),
AgentContextHandle::Rules(handle) => Self::pending_rules(handle, prompt_store, cx),
AgentContextHandle::Image(handle) => Some(Self::image(handle, cx)),
AgentContextHandle::Image(handle) => Some(Self::image(handle)),
}
}
@@ -318,7 +318,7 @@ impl AddedContext {
AgentContext::Thread(context) => Self::attached_thread(context),
AgentContext::TextThread(context) => Self::attached_text_thread(context),
AgentContext::Rules(context) => Self::attached_rules(context),
AgentContext::Image(context) => Self::image(context.clone(), cx),
AgentContext::Image(context) => Self::image(context.clone()),
}
}
@@ -333,8 +333,14 @@ impl AddedContext {
fn file(handle: FileContextHandle, full_path: &Path, cx: &App) -> AddedContext {
let full_path_string: SharedString = full_path.to_string_lossy().into_owned().into();
let (name, parent) =
extract_file_name_and_directory_from_full_path(full_path, &full_path_string);
let name = full_path
.file_name()
.map(|n| n.to_string_lossy().into_owned().into())
.unwrap_or_else(|| full_path_string.clone());
let parent = full_path
.parent()
.and_then(|p| p.file_name())
.map(|n| n.to_string_lossy().into_owned().into());
AddedContext {
kind: ContextKind::File,
name,
@@ -364,8 +370,14 @@ impl AddedContext {
fn directory(handle: DirectoryContextHandle, full_path: &Path) -> AddedContext {
let full_path_string: SharedString = full_path.to_string_lossy().into_owned().into();
let (name, parent) =
extract_file_name_and_directory_from_full_path(full_path, &full_path_string);
let name = full_path
.file_name()
.map(|n| n.to_string_lossy().into_owned().into())
.unwrap_or_else(|| full_path_string.clone());
let parent = full_path
.parent()
.and_then(|p| p.file_name())
.map(|n| n.to_string_lossy().into_owned().into());
AddedContext {
kind: ContextKind::Directory,
name,
@@ -593,23 +605,13 @@ impl AddedContext {
}
}
fn image(context: ImageContext, cx: &App) -> AddedContext {
let (name, parent, icon_path) = if let Some(full_path) = context.full_path.as_ref() {
let full_path_string: SharedString = full_path.to_string_lossy().into_owned().into();
let (name, parent) =
extract_file_name_and_directory_from_full_path(full_path, &full_path_string);
let icon_path = FileIcons::get_icon(&full_path, cx);
(name, parent, icon_path)
} else {
("Image".into(), None, None)
};
fn image(context: ImageContext) -> AddedContext {
AddedContext {
kind: ContextKind::Image,
name,
parent,
name: "Image".into(),
parent: None,
tooltip: None,
icon_path,
icon_path: None,
status: match context.status() {
ImageStatus::Loading => ContextStatus::Loading {
message: "Loading…".into(),
@@ -637,22 +639,6 @@ impl AddedContext {
}
}
fn extract_file_name_and_directory_from_full_path(
path: &Path,
name_fallback: &SharedString,
) -> (SharedString, Option<SharedString>) {
let name = path
.file_name()
.map(|n| n.to_string_lossy().into_owned().into())
.unwrap_or_else(|| name_fallback.clone());
let parent = path
.parent()
.and_then(|p| p.file_name())
.map(|n| n.to_string_lossy().into_owned().into());
(name, parent)
}
#[derive(Debug, Clone)]
struct ContextFileExcerpt {
pub file_name_and_range: SharedString,
@@ -779,49 +765,37 @@ impl Component for AddedContext {
let mut next_context_id = ContextId::zero();
let image_ready = (
"Ready",
AddedContext::image(
ImageContext {
context_id: next_context_id.post_inc(),
project_path: None,
full_path: None,
original_image: Arc::new(Image::empty()),
image_task: Task::ready(Some(LanguageModelImage::empty())).shared(),
},
cx,
),
AddedContext::image(ImageContext {
context_id: next_context_id.post_inc(),
project_path: None,
original_image: Arc::new(Image::empty()),
image_task: Task::ready(Some(LanguageModelImage::empty())).shared(),
}),
);
let image_loading = (
"Loading",
AddedContext::image(
ImageContext {
context_id: next_context_id.post_inc(),
project_path: None,
full_path: None,
original_image: Arc::new(Image::empty()),
image_task: cx
.background_spawn(async move {
smol::Timer::after(Duration::from_secs(60 * 5)).await;
Some(LanguageModelImage::empty())
})
.shared(),
},
cx,
),
AddedContext::image(ImageContext {
context_id: next_context_id.post_inc(),
project_path: None,
original_image: Arc::new(Image::empty()),
image_task: cx
.background_spawn(async move {
smol::Timer::after(Duration::from_secs(60 * 5)).await;
Some(LanguageModelImage::empty())
})
.shared(),
}),
);
let image_error = (
"Error",
AddedContext::image(
ImageContext {
context_id: next_context_id.post_inc(),
project_path: None,
full_path: None,
original_image: Arc::new(Image::empty()),
image_task: Task::ready(None).shared(),
},
cx,
),
AddedContext::image(ImageContext {
context_id: next_context_id.post_inc(),
project_path: None,
original_image: Arc::new(Image::empty()),
image_task: Task::ready(None).shared(),
}),
);
Some(


@@ -372,8 +372,6 @@ impl AgentSettingsContent {
None,
None,
Some(language_model.supports_tools()),
Some(language_model.supports_images()),
None,
)),
api_url,
});
@@ -691,7 +689,6 @@ pub struct AgentSettingsContentV2 {
pub enum CompletionMode {
#[default]
Normal,
#[serde(alias = "max")]
Burn,
}


@@ -60,7 +60,6 @@ zed_actions.workspace = true
zed_llm_client.workspace = true
[dev-dependencies]
indoc.workspace = true
language_model = { workspace = true, features = ["test-support"] }
languages = { workspace = true, features = ["test-support"] }
pretty_assertions.workspace = true


@@ -1646,35 +1646,34 @@ impl ContextEditor {
let context = self.context.read(cx);
let mut text = String::new();
// If selection is empty, we want to copy the entire line
if selection.range().is_empty() {
let snapshot = context.buffer().read(cx).snapshot();
let point = snapshot.offset_to_point(selection.range().start);
selection.start = snapshot.point_to_offset(Point::new(point.row, 0));
selection.end = snapshot
.point_to_offset(cmp::min(Point::new(point.row + 1, 0), snapshot.max_point()));
for chunk in context.buffer().read(cx).text_for_range(selection.range()) {
text.push_str(chunk);
}
} else {
for message in context.messages(cx) {
if message.offset_range.start >= selection.range().end {
break;
} else if message.offset_range.end >= selection.range().start {
let range = cmp::max(message.offset_range.start, selection.range().start)
..cmp::min(message.offset_range.end, selection.range().end);
if !range.is_empty() {
for chunk in context.buffer().read(cx).text_for_range(range) {
text.push_str(chunk);
}
if message.offset_range.end < selection.range().end {
text.push('\n');
}
for message in context.messages(cx) {
if message.offset_range.start >= selection.range().end {
break;
} else if message.offset_range.end >= selection.range().start {
let range = cmp::max(message.offset_range.start, selection.range().start)
..cmp::min(message.offset_range.end, selection.range().end);
if range.is_empty() {
let snapshot = context.buffer().read(cx).snapshot();
let point = snapshot.offset_to_point(range.start);
selection.start = snapshot.point_to_offset(Point::new(point.row, 0));
selection.end = snapshot.point_to_offset(cmp::min(
Point::new(point.row + 1, 0),
snapshot.max_point(),
));
for chunk in context.buffer().read(cx).text_for_range(selection.range()) {
text.push_str(chunk);
}
} else {
for chunk in context.buffer().read(cx).text_for_range(range) {
text.push_str(chunk);
}
if message.offset_range.end < selection.range().end {
text.push('\n');
}
}
}
}
(text, CopyMetadata { creases }, vec![selection])
}
@@ -3265,92 +3264,74 @@ mod tests {
use super::*;
use fs::FakeFs;
use gpui::{App, TestAppContext, VisualTestContext};
use indoc::indoc;
use language::{Buffer, LanguageRegistry};
use pretty_assertions::assert_eq;
use prompt_store::PromptBuilder;
use text::OffsetRangeExt;
use unindent::Unindent;
use util::path;
#[gpui::test]
async fn test_copy_paste_whole_message(cx: &mut TestAppContext) {
let (context, context_editor, mut cx) = setup_context_editor_text(vec![
(Role::User, "What is the Zed editor?"),
(
Role::Assistant,
"Zed is a modern, high-performance code editor designed from the ground up for speed and collaboration.",
),
(Role::User, ""),
],cx).await;
// Select & Copy whole user message
assert_copy_paste_context_editor(
&context_editor,
message_range(&context, 0, &mut cx),
indoc! {"
What is the Zed editor?
Zed is a modern, high-performance code editor designed from the ground up for speed and collaboration.
What is the Zed editor?
"},
&mut cx,
);
// Select & Copy whole assistant message
assert_copy_paste_context_editor(
&context_editor,
message_range(&context, 1, &mut cx),
indoc! {"
What is the Zed editor?
Zed is a modern, high-performance code editor designed from the ground up for speed and collaboration.
What is the Zed editor?
Zed is a modern, high-performance code editor designed from the ground up for speed and collaboration.
"},
&mut cx,
);
}
#[gpui::test]
async fn test_copy_paste_no_selection(cx: &mut TestAppContext) {
let (context, context_editor, mut cx) = setup_context_editor_text(
vec![
(Role::User, "user1"),
(Role::Assistant, "assistant1"),
(Role::Assistant, "assistant2"),
(Role::User, ""),
],
cx,
)
.await;
cx.update(init_test);
// Copy and paste first assistant message
let message_2_range = message_range(&context, 1, &mut cx);
assert_copy_paste_context_editor(
&context_editor,
message_2_range.start..message_2_range.start,
indoc! {"
user1
assistant1
assistant2
assistant1
"},
&mut cx,
);
let fs = FakeFs::new(cx.executor());
let registry = Arc::new(LanguageRegistry::test(cx.executor()));
let prompt_builder = Arc::new(PromptBuilder::new(None).unwrap());
let context = cx.new(|cx| {
AssistantContext::local(
registry,
None,
None,
prompt_builder.clone(),
Arc::new(SlashCommandWorkingSet::default()),
cx,
)
});
let project = Project::test(fs.clone(), [path!("/test").as_ref()], cx).await;
let window = cx.add_window(|window, cx| Workspace::test_new(project.clone(), window, cx));
let workspace = window.root(cx).unwrap();
let cx = &mut VisualTestContext::from_window(*window, cx);
// Copy and cut second assistant message
let message_3_range = message_range(&context, 2, &mut cx);
assert_copy_paste_context_editor(
&context_editor,
message_3_range.start..message_3_range.start,
indoc! {"
user1
assistant1
assistant2
assistant1
assistant2
"},
&mut cx,
);
let context_editor = window
.update(cx, |_, window, cx| {
cx.new(|cx| {
ContextEditor::for_context(
context,
fs,
workspace.downgrade(),
project,
None,
window,
cx,
)
})
})
.unwrap();
context_editor.update_in(cx, |context_editor, window, cx| {
context_editor.editor.update(cx, |editor, cx| {
editor.set_text("abc\ndef\nghi", window, cx);
editor.move_to_beginning(&Default::default(), window, cx);
})
});
context_editor.update_in(cx, |context_editor, window, cx| {
context_editor.editor.update(cx, |editor, cx| {
editor.copy(&Default::default(), window, cx);
editor.paste(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "abc\nabc\ndef\nghi");
})
});
context_editor.update_in(cx, |context_editor, window, cx| {
context_editor.editor.update(cx, |editor, cx| {
editor.cut(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "abc\ndef\nghi");
editor.paste(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "abc\nabc\ndef\nghi");
})
});
}
#[gpui::test]
@@ -3427,129 +3408,6 @@ mod tests {
}
}
async fn setup_context_editor_text(
messages: Vec<(Role, &str)>,
cx: &mut TestAppContext,
) -> (
Entity<AssistantContext>,
Entity<ContextEditor>,
VisualTestContext,
) {
cx.update(init_test);
let fs = FakeFs::new(cx.executor());
let context = create_context_with_messages(messages, cx);
let project = Project::test(fs.clone(), [path!("/test").as_ref()], cx).await;
let window = cx.add_window(|window, cx| Workspace::test_new(project.clone(), window, cx));
let workspace = window.root(cx).unwrap();
let mut cx = VisualTestContext::from_window(*window, cx);
let context_editor = window
.update(&mut cx, |_, window, cx| {
cx.new(|cx| {
let editor = ContextEditor::for_context(
context.clone(),
fs,
workspace.downgrade(),
project,
None,
window,
cx,
);
editor
})
})
.unwrap();
(context, context_editor, cx)
}
fn message_range(
context: &Entity<AssistantContext>,
message_ix: usize,
cx: &mut TestAppContext,
) -> Range<usize> {
context.update(cx, |context, cx| {
context
.messages(cx)
.nth(message_ix)
.unwrap()
.anchor_range
.to_offset(&context.buffer().read(cx).snapshot())
})
}
fn assert_copy_paste_context_editor<T: editor::ToOffset>(
context_editor: &Entity<ContextEditor>,
range: Range<T>,
expected_text: &str,
cx: &mut VisualTestContext,
) {
context_editor.update_in(cx, |context_editor, window, cx| {
context_editor.editor.update(cx, |editor, cx| {
editor.change_selections(None, window, cx, |s| s.select_ranges([range]));
});
context_editor.copy(&Default::default(), window, cx);
context_editor.editor.update(cx, |editor, cx| {
editor.move_to_end(&Default::default(), window, cx);
});
context_editor.paste(&Default::default(), window, cx);
context_editor.editor.update(cx, |editor, cx| {
assert_eq!(editor.text(cx), expected_text);
});
});
}
fn create_context_with_messages(
mut messages: Vec<(Role, &str)>,
cx: &mut TestAppContext,
) -> Entity<AssistantContext> {
let registry = Arc::new(LanguageRegistry::test(cx.executor()));
let prompt_builder = Arc::new(PromptBuilder::new(None).unwrap());
cx.new(|cx| {
let mut context = AssistantContext::local(
registry,
None,
None,
prompt_builder.clone(),
Arc::new(SlashCommandWorkingSet::default()),
cx,
);
let mut message_1 = context.messages(cx).next().unwrap();
let (role, text) = messages.remove(0);
loop {
if role == message_1.role {
context.buffer().update(cx, |buffer, cx| {
buffer.edit([(message_1.offset_range, text)], None, cx);
});
break;
}
let mut ids = HashSet::default();
ids.insert(message_1.id);
context.cycle_message_roles(ids, cx);
message_1 = context.messages(cx).next().unwrap();
}
let mut last_message_id = message_1.id;
for (role, text) in messages {
context.insert_message_after(last_message_id, role, MessageStatus::Done, cx);
let message = context.messages(cx).last().unwrap();
last_message_id = message.id;
context.buffer().update(cx, |buffer, cx| {
buffer.edit([(message.offset_range, text)], None, cx);
})
}
context
})
}
fn init_test(cx: &mut App) {
let settings_store = SettingsStore::test(cx);
prompt_store::init(cx);


@@ -48,7 +48,7 @@ impl SlashCommandCompletionProvider {
name_range: Range<Anchor>,
window: &mut Window,
cx: &mut App,
) -> Task<Result<Vec<project::CompletionResponse>>> {
) -> Task<Result<Option<Vec<project::Completion>>>> {
let slash_commands = self.slash_commands.clone();
let candidates = slash_commands
.command_names(cx)
@@ -71,27 +71,28 @@ impl SlashCommandCompletionProvider {
.await;
cx.update(|_, cx| {
let completions = matches
.into_iter()
.filter_map(|mat| {
let command = slash_commands.command(&mat.string, cx)?;
let mut new_text = mat.string.clone();
let requires_argument = command.requires_argument();
let accepts_arguments = command.accepts_arguments();
if requires_argument || accepts_arguments {
new_text.push(' ');
}
Some(
matches
.into_iter()
.filter_map(|mat| {
let command = slash_commands.command(&mat.string, cx)?;
let mut new_text = mat.string.clone();
let requires_argument = command.requires_argument();
let accepts_arguments = command.accepts_arguments();
if requires_argument || accepts_arguments {
new_text.push(' ');
}
let confirm =
editor
.clone()
.zip(workspace.clone())
.map(|(editor, workspace)| {
let command_name = mat.string.clone();
let command_range = command_range.clone();
let editor = editor.clone();
let workspace = workspace.clone();
Arc::new(
let confirm =
editor
.clone()
.zip(workspace.clone())
.map(|(editor, workspace)| {
let command_name = mat.string.clone();
let command_range = command_range.clone();
let editor = editor.clone();
let workspace = workspace.clone();
Arc::new(
move |intent: CompletionIntent,
window: &mut Window,
cx: &mut App| {
@@ -117,27 +118,22 @@ impl SlashCommandCompletionProvider {
}
},
) as Arc<_>
});
Some(project::Completion {
replace_range: name_range.clone(),
documentation: Some(CompletionDocumentation::SingleLine(
command.description().into(),
)),
new_text,
label: command.label(cx),
icon_path: None,
insert_text_mode: None,
confirm,
source: CompletionSource::Custom,
});
Some(project::Completion {
replace_range: name_range.clone(),
documentation: Some(CompletionDocumentation::SingleLine(
command.description().into(),
)),
new_text,
label: command.label(cx),
icon_path: None,
insert_text_mode: None,
confirm,
source: CompletionSource::Custom,
})
})
})
.collect();
vec![project::CompletionResponse {
completions,
is_incomplete: false,
}]
.collect(),
)
})
})
}
@@ -151,7 +147,7 @@ impl SlashCommandCompletionProvider {
last_argument_range: Range<Anchor>,
window: &mut Window,
cx: &mut App,
) -> Task<Result<Vec<project::CompletionResponse>>> {
) -> Task<Result<Option<Vec<project::Completion>>>> {
let new_cancel_flag = Arc::new(AtomicBool::new(false));
let mut flag = self.cancel_flag.lock();
flag.store(true, SeqCst);
@@ -169,27 +165,28 @@ impl SlashCommandCompletionProvider {
let workspace = self.workspace.clone();
let arguments = arguments.to_vec();
cx.background_spawn(async move {
let completions = completions
.await?
.into_iter()
.map(|new_argument| {
let confirm =
editor
.clone()
.zip(workspace.clone())
.map(|(editor, workspace)| {
Arc::new({
let mut completed_arguments = arguments.clone();
if new_argument.replace_previous_arguments {
completed_arguments.clear();
} else {
completed_arguments.pop();
}
completed_arguments.push(new_argument.new_text.clone());
Ok(Some(
completions
.await?
.into_iter()
.map(|new_argument| {
let confirm =
editor
.clone()
.zip(workspace.clone())
.map(|(editor, workspace)| {
Arc::new({
let mut completed_arguments = arguments.clone();
if new_argument.replace_previous_arguments {
completed_arguments.clear();
} else {
completed_arguments.pop();
}
completed_arguments.push(new_argument.new_text.clone());
let command_range = command_range.clone();
let command_name = command_name.clone();
move |intent: CompletionIntent,
let command_range = command_range.clone();
let command_name = command_name.clone();
move |intent: CompletionIntent,
window: &mut Window,
cx: &mut App| {
if new_argument.after_completion.run()
@@ -213,41 +210,34 @@ impl SlashCommandCompletionProvider {
!new_argument.after_completion.run()
}
}
}) as Arc<_>
});
}) as Arc<_>
});
let mut new_text = new_argument.new_text.clone();
if new_argument.after_completion == AfterCompletion::Continue {
new_text.push(' ');
}
let mut new_text = new_argument.new_text.clone();
if new_argument.after_completion == AfterCompletion::Continue {
new_text.push(' ');
}
project::Completion {
replace_range: if new_argument.replace_previous_arguments {
argument_range.clone()
} else {
last_argument_range.clone()
},
label: new_argument.label,
icon_path: None,
new_text,
documentation: None,
confirm,
insert_text_mode: None,
source: CompletionSource::Custom,
}
})
.collect(),
))
})
} else {
Task::ready(Ok(vec![project::CompletionResponse {
completions: Vec::new(),
is_incomplete: false,
}]))
Task::ready(Ok(Some(Vec::new())))
}
}
}
@@ -261,7 +251,7 @@ impl CompletionProvider for SlashCommandCompletionProvider {
_: editor::CompletionContext,
window: &mut Window,
cx: &mut Context<Editor>,
) -> Task<Result<Vec<project::CompletionResponse>>> {
) -> Task<Result<Option<Vec<project::Completion>>>> {
let Some((name, arguments, command_range, last_argument_range)) =
buffer.update(cx, |buffer, _cx| {
let position = buffer_position.to_point(buffer);
@@ -305,10 +295,7 @@ impl CompletionProvider for SlashCommandCompletionProvider {
Some((name, arguments, command_range, last_argument_range))
})
else {
return Task::ready(Ok(vec![project::CompletionResponse {
completions: Vec::new(),
is_incomplete: false,
}]));
return Task::ready(Ok(Some(Vec::new())));
};
if let Some((arguments, argument_range)) = arguments {

View File

@@ -35,7 +35,6 @@ pub struct ChannelBuffer {
pub enum ChannelBufferEvent {
CollaboratorsChanged,
Disconnected,
Connected,
BufferEdited,
ChannelChanged,
}
@@ -104,17 +103,6 @@ impl ChannelBuffer {
}
}
pub fn connected(&mut self, cx: &mut Context<Self>) {
self.connected = true;
if self.subscription.is_none() {
let Ok(subscription) = self.client.subscribe_to_entity(self.channel_id.0) else {
return;
};
self.subscription = Some(subscription.set_entity(&cx.entity(), &mut cx.to_async()));
cx.emit(ChannelBufferEvent::Connected);
}
}
pub fn remote_id(&self, cx: &App) -> BufferId {
self.buffer.read(cx).remote_id()
}

View File

@@ -56,6 +56,7 @@ pub struct Channel {
pub name: SharedString,
pub visibility: proto::ChannelVisibility,
pub parent_path: Vec<ChannelId>,
pub channel_order: i32,
}
#[derive(Default, Debug)]
@@ -614,7 +615,24 @@ impl ChannelStore {
to: to.0,
})
.await?;
Ok(())
})
}
pub fn reorder_channel(
&mut self,
channel_id: ChannelId,
direction: proto::reorder_channel::Direction,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
let client = self.client.clone();
cx.spawn(async move |_, _| {
client
.request(proto::ReorderChannel {
channel_id: channel_id.0,
direction: direction.into(),
})
.await?;
Ok(())
})
}
@@ -972,7 +990,6 @@ impl ChannelStore {
.log_err();
if let Some(operations) = operations {
channel_buffer.connected(cx);
let client = this.client.clone();
cx.background_spawn(async move {
let operations = operations.await;
@@ -1013,8 +1030,8 @@ impl ChannelStore {
if let Some(this) = this.upgrade() {
this.update(cx, |this, cx| {
for (_, buffer) in &this.opened_buffers {
if let OpenEntityHandle::Open(buffer) = &buffer {
for (_, buffer) in this.opened_buffers.drain() {
if let OpenEntityHandle::Open(buffer) = buffer {
if let Some(buffer) = buffer.upgrade() {
buffer.update(cx, |buffer, cx| buffer.disconnect(cx));
}
@@ -1027,6 +1044,18 @@ impl ChannelStore {
});
}
#[cfg(any(test, feature = "test-support"))]
pub fn reset(&mut self) {
self.channel_invitations.clear();
self.channel_index.clear();
self.channel_participants.clear();
self.outgoing_invites.clear();
self.opened_buffers.clear();
self.opened_chats.clear();
self.disconnect_channel_buffers_task = None;
self.channel_states.clear();
}
pub(crate) fn update_channels(
&mut self,
payload: proto::UpdateChannels,
@@ -1051,6 +1080,7 @@ impl ChannelStore {
visibility: channel.visibility(),
name: channel.name.into(),
parent_path: channel.parent_path.into_iter().map(ChannelId).collect(),
channel_order: channel.channel_order,
}),
),
}

View File

@@ -61,11 +61,13 @@ impl ChannelPathsInsertGuard<'_> {
ret = existing_channel.visibility != channel_proto.visibility()
|| existing_channel.name != channel_proto.name
|| existing_channel.parent_path != parent_path;
|| existing_channel.parent_path != parent_path
|| existing_channel.channel_order != channel_proto.channel_order;
existing_channel.visibility = channel_proto.visibility();
existing_channel.name = channel_proto.name.into();
existing_channel.parent_path = parent_path;
existing_channel.channel_order = channel_proto.channel_order;
} else {
self.channels_by_id.insert(
ChannelId(channel_proto.id),
@@ -74,6 +76,7 @@ impl ChannelPathsInsertGuard<'_> {
visibility: channel_proto.visibility(),
name: channel_proto.name.into(),
parent_path,
channel_order: channel_proto.channel_order,
}),
);
self.insert_root(ChannelId(channel_proto.id));
@@ -100,17 +103,18 @@ impl Drop for ChannelPathsInsertGuard<'_> {
fn channel_path_sorting_key(
id: ChannelId,
channels_by_id: &BTreeMap<ChannelId, Arc<Channel>>,
) -> impl Iterator<Item = (&str, ChannelId)> {
let (parent_path, name) = channels_by_id
.get(&id)
.map_or((&[] as &[_], None), |channel| {
(
channel.parent_path.as_slice(),
Some((channel.name.as_ref(), channel.id)),
)
});
) -> impl Iterator<Item = (i32, ChannelId)> {
let (parent_path, order_and_id) =
channels_by_id
.get(&id)
.map_or((&[] as &[_], None), |channel| {
(
channel.parent_path.as_slice(),
Some((channel.channel_order, channel.id)),
)
});
parent_path
.iter()
.filter_map(|id| Some((channels_by_id.get(id)?.name.as_ref(), *id)))
.chain(name)
.filter_map(|id| Some((channels_by_id.get(id)?.channel_order, *id)))
.chain(order_and_id)
}
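The net effect of this change is that a channel's position in the flattened tree is determined by the chain of (channel_order, id) pairs along its ancestor path, rather than by names. A minimal standalone sketch of the idea, with data mirroring the order-independence test later in this diff (names and values here are illustrative):

fn main() {
    // Each entry: (path of (channel_order, id) pairs from the root, name).
    let mut channels = vec![
        (vec![(1, 1)], "b"),                 // root with order 1
        (vec![(2, 2)], "a"),                 // root with order 2
        (vec![(1, 1), (1, 3)], "x"),         // child of "b", order 1
        (vec![(1, 1), (2, 5)], "α"),         // child of "b", order 2
        (vec![(1, 1), (1, 3), (1, 6)], "β"), // child of "x", order 1
        (vec![(2, 2), (1, 4)], "y"),         // child of "a", order 1
    ];
    // Comparing the key paths lexicographically yields a depth-first
    // traversal whose siblings appear in channel_order, not name order.
    channels.sort_by(|a, b| a.0.cmp(&b.0));
    let names: Vec<_> = channels.iter().map(|c| c.1).collect();
    assert_eq!(names, ["b", "x", "β", "α", "a", "y"]);
}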

View File

@@ -21,12 +21,14 @@ fn test_update_channels(cx: &mut App) {
name: "b".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: Vec::new(),
channel_order: 1,
},
proto::Channel {
id: 2,
name: "a".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: Vec::new(),
channel_order: 2,
},
],
..Default::default()
@@ -37,8 +39,8 @@ fn test_update_channels(cx: &mut App) {
&channel_store,
&[
//
(0, "a".to_string()),
(0, "b".to_string()),
(0, "a".to_string()),
],
cx,
);
@@ -52,12 +54,14 @@ fn test_update_channels(cx: &mut App) {
name: "x".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![1],
channel_order: 1,
},
proto::Channel {
id: 4,
name: "y".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![2],
channel_order: 1,
},
],
..Default::default()
@@ -67,15 +71,111 @@ fn test_update_channels(cx: &mut App) {
assert_channels(
&channel_store,
&[
(0, "a".to_string()),
(1, "y".to_string()),
(0, "b".to_string()),
(1, "x".to_string()),
(0, "a".to_string()),
(1, "y".to_string()),
],
cx,
);
}
#[gpui::test]
fn test_update_channels_order_independent(cx: &mut App) {
/// Based on: https://stackoverflow.com/a/59939809
fn unique_permutations<T: Clone>(items: Vec<T>) -> Vec<Vec<T>> {
if items.len() == 1 {
vec![items]
} else {
let mut output: Vec<Vec<T>> = vec![];
for (ix, first) in items.iter().enumerate() {
let mut remaining_elements = items.clone();
remaining_elements.remove(ix);
for mut permutation in unique_permutations(remaining_elements) {
permutation.insert(0, first.clone());
output.push(permutation);
}
}
output
}
}
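// With the six distinct channels below, this exercises all 6! = 720 input orderings.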
let test_data = vec![
proto::Channel {
id: 6,
name: "β".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![1, 3],
channel_order: 1,
},
proto::Channel {
id: 5,
name: "α".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![1],
channel_order: 2,
},
proto::Channel {
id: 3,
name: "x".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![1],
channel_order: 1,
},
proto::Channel {
id: 4,
name: "y".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![2],
channel_order: 1,
},
proto::Channel {
id: 1,
name: "b".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: Vec::new(),
channel_order: 1,
},
proto::Channel {
id: 2,
name: "a".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: Vec::new(),
channel_order: 2,
},
];
let channel_store = init_test(cx);
let permutations = unique_permutations(test_data);
for test_instance in permutations {
channel_store.update(cx, |channel_store, _| channel_store.reset());
update_channels(
&channel_store,
proto::UpdateChannels {
channels: test_instance,
..Default::default()
},
cx,
);
assert_channels(
&channel_store,
&[
(0, "b".to_string()),
(1, "x".to_string()),
(2, "β".to_string()),
(1, "α".to_string()),
(0, "a".to_string()),
(1, "y".to_string()),
],
cx,
);
}
}
#[gpui::test]
fn test_dangling_channel_paths(cx: &mut App) {
let channel_store = init_test(cx);
@@ -89,18 +189,21 @@ fn test_dangling_channel_paths(cx: &mut App) {
name: "a".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![],
channel_order: 1,
},
proto::Channel {
id: 1,
name: "b".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![0],
channel_order: 1,
},
proto::Channel {
id: 2,
name: "c".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![0, 1],
channel_order: 1,
},
],
..Default::default()
@@ -147,6 +250,7 @@ async fn test_channel_messages(cx: &mut TestAppContext) {
name: "the-channel".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![],
channel_order: 1,
}],
..Default::default()
});

View File

@@ -266,11 +266,14 @@ CREATE TABLE "channels" (
"created_at" TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
"visibility" VARCHAR NOT NULL,
"parent_path" TEXT NOT NULL,
"requires_zed_cla" BOOLEAN NOT NULL DEFAULT FALSE
"requires_zed_cla" BOOLEAN NOT NULL DEFAULT FALSE,
"channel_order" INTEGER NOT NULL DEFAULT 1
);
CREATE INDEX "index_channels_on_parent_path" ON "channels" ("parent_path");
CREATE INDEX "index_channels_on_parent_path_and_order" ON "channels" ("parent_path", "channel_order");
CREATE TABLE IF NOT EXISTS "channel_chat_participants" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
"user_id" INTEGER NOT NULL REFERENCES users (id),

View File

@@ -0,0 +1,16 @@
-- Add channel_order column to channels table with default value
ALTER TABLE channels ADD COLUMN channel_order INTEGER NOT NULL DEFAULT 1;
-- Backfill channel_order for existing channels with a deterministic rank:
-- each channel gets its 1-based position among siblings ordered by (name, id).
-- Note: a correlated ROW_NUMBER() over `WHERE c2.id = channels.id` would see a
-- single row and always yield 1, so the rank is computed as a sibling count.
UPDATE channels
SET channel_order = (
    SELECT COUNT(*)
    FROM channels c2
    WHERE c2.parent_path = channels.parent_path
      AND (c2.name < channels.name
        OR (c2.name = channels.name AND c2.id <= channels.id))
);
-- Create index for efficient ordering queries
CREATE INDEX "index_channels_on_parent_path_and_order" ON "channels" ("parent_path", "channel_order");
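As a sanity check after the migration, a query along these lines (illustrative, not part of the migration) should show consecutive orders within each parent:

SELECT parent_path, id, name, channel_order
FROM channels
ORDER BY parent_path, channel_order, id;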

View File

@@ -582,6 +582,7 @@ pub struct Channel {
pub visibility: ChannelVisibility,
/// parent_path is the list of channel ids from the root down to this one (not including this one)
pub parent_path: Vec<ChannelId>,
pub channel_order: i32,
}
impl Channel {
@@ -591,6 +592,7 @@ impl Channel {
visibility: value.visibility,
name: value.clone().name,
parent_path: value.ancestors().collect(),
channel_order: value.channel_order,
}
}
@@ -600,8 +602,13 @@ impl Channel {
name: self.name.clone(),
visibility: self.visibility.into(),
parent_path: self.parent_path.iter().map(|c| c.to_proto()).collect(),
channel_order: self.channel_order,
}
}
pub fn root_id(&self) -> ChannelId {
self.parent_path.first().copied().unwrap_or(self.id)
}
}
#[derive(Debug, PartialEq, Eq, Hash)]

View File

@@ -4,7 +4,7 @@ use rpc::{
ErrorCode, ErrorCodeExt,
proto::{ChannelBufferVersion, VectorClockEntry, channel_member::Kind},
};
use sea_orm::{DbBackend, TryGetableMany};
use sea_orm::{ActiveValue, DbBackend, TryGetableMany};
impl Database {
#[cfg(test)]
@@ -59,16 +59,28 @@ impl Database {
parent = Some(parent_channel);
}
let parent_path = parent
.as_ref()
.map_or(String::new(), |parent| parent.path());
// Find the maximum channel_order among siblings so the new channel goes at the end
let max_order = max_order(&parent_path, &tx).await?;
log::info!(
"Creating channel '{}' with parent_path='{}', max_order={}, new_order={}",
name,
parent_path,
max_order,
max_order + 1
);
let channel = channel::ActiveModel {
id: ActiveValue::NotSet,
name: ActiveValue::Set(name.to_string()),
visibility: ActiveValue::Set(ChannelVisibility::Members),
parent_path: ActiveValue::Set(
parent
.as_ref()
.map_or(String::new(), |parent| parent.path()),
),
parent_path: ActiveValue::Set(parent_path),
requires_zed_cla: ActiveValue::NotSet,
channel_order: ActiveValue::Set(max_order + 1),
}
.insert(&*tx)
.await?;
@@ -531,11 +543,7 @@ impl Database {
.get_channel_descendants_excluding_self(channels.iter(), tx)
.await?;
for channel in channels {
if let Err(ix) = descendants.binary_search_by_key(&channel.path(), |c| c.path()) {
descendants.insert(ix, channel);
}
}
descendants.extend(channels);
let roles_by_channel_id = channel_memberships
.iter()
@@ -952,11 +960,14 @@ impl Database {
}
let root_id = channel.root_id();
let new_parent_path = new_parent.path();
let old_path = format!("{}{}/", channel.parent_path, channel.id);
let new_path = format!("{}{}/", new_parent.path(), channel.id);
let new_path = format!("{}{}/", &new_parent_path, channel.id);
let new_order = max_order(&new_parent_path, &tx).await? + 1;
let mut model = channel.into_active_model();
model.parent_path = ActiveValue::Set(new_parent.path());
model.channel_order = ActiveValue::Set(new_order);
let channel = model.update(&*tx).await?;
let descendent_ids =
@@ -986,6 +997,132 @@ impl Database {
})
.await
}
pub async fn reorder_channel(
&self,
channel_id: ChannelId,
direction: proto::reorder_channel::Direction,
user_id: UserId,
) -> Result<Vec<Channel>> {
self.transaction(|tx| async move {
let mut channel = self.get_channel_internal(channel_id, &tx).await?;
log::info!(
"Reordering channel {} (parent_path: '{}', order: {})",
channel.id,
channel.parent_path,
channel.channel_order
);
// Check if user is admin of the channel
self.check_user_is_channel_admin(&channel, user_id, &tx)
.await?;
// Find the sibling channel to swap with
let sibling_channel = match direction {
proto::reorder_channel::Direction::Up => {
log::info!(
"Looking for sibling with parent_path='{}' and order < {}",
channel.parent_path,
channel.channel_order
);
// Find channel with highest order less than current
channel::Entity::find()
.filter(
channel::Column::ParentPath
.eq(&channel.parent_path)
.and(channel::Column::ChannelOrder.lt(channel.channel_order)),
)
.order_by_desc(channel::Column::ChannelOrder)
.one(&*tx)
.await?
}
proto::reorder_channel::Direction::Down => {
log::info!(
"Looking for sibling with parent_path='{}' and order > {}",
channel.parent_path,
channel.channel_order
);
// Find channel with lowest order greater than current
channel::Entity::find()
.filter(
channel::Column::ParentPath
.eq(&channel.parent_path)
.and(channel::Column::ChannelOrder.gt(channel.channel_order)),
)
.order_by_asc(channel::Column::ChannelOrder)
.one(&*tx)
.await?
}
};
let mut sibling_channel = match sibling_channel {
Some(sibling) => {
log::info!(
"Found sibling {} (parent_path: '{}', order: {})",
sibling.id,
sibling.parent_path,
sibling.channel_order
);
sibling
}
None => {
log::warn!("No sibling found to swap with");
// No sibling to swap with
return Ok(vec![]);
}
};
let current_order = channel.channel_order;
let sibling_order = sibling_channel.channel_order;
channel::ActiveModel {
id: ActiveValue::Unchanged(sibling_channel.id),
channel_order: ActiveValue::Set(current_order),
..Default::default()
}
.update(&*tx)
.await?;
sibling_channel.channel_order = current_order;
channel::ActiveModel {
id: ActiveValue::Unchanged(channel.id),
channel_order: ActiveValue::Set(sibling_order),
..Default::default()
}
.update(&*tx)
.await?;
channel.channel_order = sibling_order;
log::info!(
"Reorder complete. Swapped channels {} and {}",
channel.id,
sibling_channel.id
);
let swapped_channels = vec![
Channel::from_model(channel),
Channel::from_model(sibling_channel),
];
Ok(swapped_channels)
})
.await
}
}
async fn max_order(parent_path: &str, tx: &TransactionHandle) -> Result<i32> {
let max_order = channel::Entity::find()
.filter(channel::Column::ParentPath.eq(parent_path))
.select_only()
.column_as(channel::Column::ChannelOrder.max(), "max_order")
.into_tuple::<Option<i32>>()
.one(&**tx)
.await?
.flatten()
.unwrap_or(0);
Ok(max_order)
}
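For reference, the builder chain above amounts to roughly the following SQL, with a NULL result (no siblings yet) handled by the unwrap_or(0) on the Rust side; the exact statement SeaORM emits may differ:

SELECT MAX(channel_order) AS max_order
FROM channels
WHERE parent_path = $1;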
#[derive(Copy, Clone, Debug, EnumIter, DeriveColumn)]

View File

@@ -10,6 +10,9 @@ pub struct Model {
pub visibility: ChannelVisibility,
pub parent_path: String,
pub requires_zed_cla: bool,
/// The order of this channel relative to its siblings within the same parent.
/// Lower values appear first. Channels are sorted by parent_path first, then by channel_order.
pub channel_order: i32,
}
impl Model {

View File

@@ -172,16 +172,35 @@ impl Drop for TestDb {
}
}
fn assert_channel_tree_matches(actual: Vec<Channel>, expected: Vec<Channel>) {
let expected_channels = expected.into_iter().collect::<HashSet<_>>();
let actual_channels = actual.into_iter().collect::<HashSet<_>>();
pretty_assertions::assert_eq!(expected_channels, actual_channels);
}
fn channel_tree(channels: &[(ChannelId, &[ChannelId], &'static str)]) -> Vec<Channel> {
channels
.iter()
.map(|(id, parent_path, name)| Channel {
use std::collections::HashMap;
let mut result = Vec::new();
let mut order_by_parent: HashMap<Vec<ChannelId>, i32> = HashMap::new();
for (id, parent_path, name) in channels {
let parent_key = parent_path.to_vec();
let order = *order_by_parent
.entry(parent_key.clone())
.and_modify(|e| *e += 1)
.or_insert(1);
result.push(Channel {
id: *id,
name: name.to_string(),
visibility: ChannelVisibility::Members,
parent_path: parent_path.to_vec(),
})
.collect()
parent_path: parent_key,
channel_order: order,
});
}
result
}
static GITHUB_USER_ID: AtomicI32 = AtomicI32::new(5);

View File

@@ -1,15 +1,15 @@
use crate::{
db::{
Channel, ChannelId, ChannelRole, Database, NewUserParams, RoomId, UserId,
tests::{channel_tree, new_test_connection, new_test_user},
tests::{assert_channel_tree_matches, channel_tree, new_test_connection, new_test_user},
},
test_both_dbs,
};
use rpc::{
ConnectionId,
proto::{self},
proto::{self, reorder_channel},
};
use std::sync::Arc;
use std::{collections::HashSet, sync::Arc};
test_both_dbs!(test_channels, test_channels_postgres, test_channels_sqlite);
@@ -59,28 +59,28 @@ async fn test_channels(db: &Arc<Database>) {
.unwrap();
let result = db.get_channels_for_user(a_id).await.unwrap();
assert_eq!(
assert_channel_tree_matches(
result.channels,
channel_tree(&[
(zed_id, &[], "zed"),
(crdb_id, &[zed_id], "crdb"),
(livestreaming_id, &[zed_id], "livestreaming",),
(livestreaming_id, &[zed_id], "livestreaming"),
(replace_id, &[zed_id], "replace"),
(rust_id, &[], "rust"),
(cargo_id, &[rust_id], "cargo"),
(cargo_ra_id, &[rust_id, cargo_id], "cargo-ra",)
],)
(cargo_ra_id, &[rust_id, cargo_id], "cargo-ra"),
]),
);
let result = db.get_channels_for_user(b_id).await.unwrap();
assert_eq!(
assert_channel_tree_matches(
result.channels,
channel_tree(&[
(zed_id, &[], "zed"),
(crdb_id, &[zed_id], "crdb"),
(livestreaming_id, &[zed_id], "livestreaming",),
(replace_id, &[zed_id], "replace")
],)
(livestreaming_id, &[zed_id], "livestreaming"),
(replace_id, &[zed_id], "replace"),
]),
);
// Update member permissions
@@ -94,14 +94,14 @@ async fn test_channels(db: &Arc<Database>) {
assert!(set_channel_admin.is_ok());
let result = db.get_channels_for_user(b_id).await.unwrap();
assert_eq!(
assert_channel_tree_matches(
result.channels,
channel_tree(&[
(zed_id, &[], "zed"),
(crdb_id, &[zed_id], "crdb"),
(livestreaming_id, &[zed_id], "livestreaming",),
(replace_id, &[zed_id], "replace")
],)
(livestreaming_id, &[zed_id], "livestreaming"),
(replace_id, &[zed_id], "replace"),
]),
);
// Remove a single channel
@@ -313,8 +313,8 @@ async fn test_channel_renames(db: &Arc<Database>) {
test_both_dbs!(
test_db_channel_moving,
test_channels_moving_postgres,
test_channels_moving_sqlite
test_db_channel_moving_postgres,
test_db_channel_moving_sqlite
);
async fn test_db_channel_moving(db: &Arc<Database>) {
@@ -343,16 +343,14 @@ async fn test_db_channel_moving(db: &Arc<Database>) {
.await
.unwrap();
let livestreaming_dag_id = db
.create_sub_channel("livestreaming_dag", livestreaming_id, a_id)
let livestreaming_sub_id = db
.create_sub_channel("livestreaming_sub", livestreaming_id, a_id)
.await
.unwrap();
// ========================================================================
// sanity check
// Initial DAG:
// /- gpui2
// zed -- crdb - livestreaming - livestreaming_dag
// zed -- crdb - livestreaming - livestreaming_sub
let result = db.get_channels_for_user(a_id).await.unwrap();
assert_channel_tree(
result.channels,
@@ -360,10 +358,242 @@ async fn test_db_channel_moving(db: &Arc<Database>) {
(zed_id, &[]),
(crdb_id, &[zed_id]),
(livestreaming_id, &[zed_id, crdb_id]),
(livestreaming_dag_id, &[zed_id, crdb_id, livestreaming_id]),
(livestreaming_sub_id, &[zed_id, crdb_id, livestreaming_id]),
(gpui2_id, &[zed_id]),
],
);
// Check that we can do a simple leaf -> leaf move
db.move_channel(livestreaming_sub_id, crdb_id, a_id)
.await
.unwrap();
// /- gpui2
// zed -- crdb -- livestreaming
// \- livestreaming_sub
let result = db.get_channels_for_user(a_id).await.unwrap();
assert_channel_tree(
result.channels,
&[
(zed_id, &[]),
(crdb_id, &[zed_id]),
(livestreaming_id, &[zed_id, crdb_id]),
(livestreaming_sub_id, &[zed_id, crdb_id]),
(gpui2_id, &[zed_id]),
],
);
// Check that we can move a whole subtree at once
db.move_channel(crdb_id, gpui2_id, a_id).await.unwrap();
// zed -- gpui2 -- crdb -- livestreaming
// \- livestreaming_sub
let result = db.get_channels_for_user(a_id).await.unwrap();
assert_channel_tree(
result.channels,
&[
(zed_id, &[]),
(gpui2_id, &[zed_id]),
(crdb_id, &[zed_id, gpui2_id]),
(livestreaming_id, &[zed_id, gpui2_id, crdb_id]),
(livestreaming_sub_id, &[zed_id, gpui2_id, crdb_id]),
],
);
}
test_both_dbs!(
test_channel_reordering,
test_channel_reordering_postgres,
test_channel_reordering_sqlite
);
async fn test_channel_reordering(db: &Arc<Database>) {
let admin_id = db
.create_user(
"admin@example.com",
None,
false,
NewUserParams {
github_login: "admin".into(),
github_user_id: 1,
},
)
.await
.unwrap()
.user_id;
let user_id = db
.create_user(
"user@example.com",
None,
false,
NewUserParams {
github_login: "user".into(),
github_user_id: 2,
},
)
.await
.unwrap()
.user_id;
// Create a root channel with some sub-channels
let root_id = db.create_root_channel("root", admin_id).await.unwrap();
// Invite user to root channel so they can see the sub-channels
db.invite_channel_member(root_id, user_id, admin_id, ChannelRole::Member)
.await
.unwrap();
db.respond_to_channel_invite(root_id, user_id, true)
.await
.unwrap();
let alpha_id = db
.create_sub_channel("alpha", root_id, admin_id)
.await
.unwrap();
let beta_id = db
.create_sub_channel("beta", root_id, admin_id)
.await
.unwrap();
let gamma_id = db
.create_sub_channel("gamma", root_id, admin_id)
.await
.unwrap();
// Initial order should be: root, alpha (order=1), beta (order=2), gamma (order=3)
let result = db.get_channels_for_user(admin_id).await.unwrap();
assert_channel_tree_order(
result.channels,
&[
(root_id, &[], 1),
(alpha_id, &[root_id], 1),
(beta_id, &[root_id], 2),
(gamma_id, &[root_id], 3),
],
);
// Test moving beta up (should swap with alpha)
let updated_channels = db
.reorder_channel(beta_id, reorder_channel::Direction::Up, admin_id)
.await
.unwrap();
// Verify that beta and alpha were returned as updated
assert_eq!(updated_channels.len(), 2);
let updated_ids: std::collections::HashSet<_> = updated_channels.iter().map(|c| c.id).collect();
assert!(updated_ids.contains(&alpha_id));
assert!(updated_ids.contains(&beta_id));
// Now order should be: root, beta (order=1), alpha (order=2), gamma (order=3)
let result = db.get_channels_for_user(admin_id).await.unwrap();
assert_channel_tree_order(
result.channels,
&[
(root_id, &[], 1),
(beta_id, &[root_id], 1),
(alpha_id, &[root_id], 2),
(gamma_id, &[root_id], 3),
],
);
// Test moving gamma down (should be no-op since it's already last)
let updated_channels = db
.reorder_channel(gamma_id, reorder_channel::Direction::Down, admin_id)
.await
.unwrap();
// Should return nothing
assert_eq!(updated_channels.len(), 0);
// Test moving alpha down (should swap with gamma)
let updated_channels = db
.reorder_channel(alpha_id, reorder_channel::Direction::Down, admin_id)
.await
.unwrap();
// Verify that alpha and gamma were returned as updated
assert_eq!(updated_channels.len(), 2);
let updated_ids: std::collections::HashSet<_> = updated_channels.iter().map(|c| c.id).collect();
assert!(updated_ids.contains(&alpha_id));
assert!(updated_ids.contains(&gamma_id));
// Now order should be: root, beta (order=1), gamma (order=2), alpha (order=3)
let result = db.get_channels_for_user(admin_id).await.unwrap();
assert_channel_tree_order(
result.channels,
&[
(root_id, &[], 1),
(beta_id, &[root_id], 1),
(gamma_id, &[root_id], 2),
(alpha_id, &[root_id], 3),
],
);
// Test that non-admin cannot reorder
let reorder_result = db
.reorder_channel(beta_id, reorder_channel::Direction::Up, user_id)
.await;
assert!(reorder_result.is_err());
// Test moving beta up (should be no-op since it's already first)
let updated_channels = db
.reorder_channel(beta_id, reorder_channel::Direction::Up, admin_id)
.await
.unwrap();
// Should return nothing
assert_eq!(updated_channels.len(), 0);
// Adding a channel to an existing ordering should add it to the end
let delta_id = db
.create_sub_channel("delta", root_id, admin_id)
.await
.unwrap();
let result = db.get_channels_for_user(admin_id).await.unwrap();
assert_channel_tree_order(
result.channels,
&[
(root_id, &[], 1),
(beta_id, &[root_id], 1),
(gamma_id, &[root_id], 2),
(alpha_id, &[root_id], 3),
(delta_id, &[root_id], 4),
],
);
// And moving a channel into an existing ordering should add it to the end
let eta_id = db
.create_sub_channel("eta", delta_id, admin_id)
.await
.unwrap();
let result = db.get_channels_for_user(admin_id).await.unwrap();
assert_channel_tree_order(
result.channels,
&[
(root_id, &[], 1),
(beta_id, &[root_id], 1),
(gamma_id, &[root_id], 2),
(alpha_id, &[root_id], 3),
(delta_id, &[root_id], 4),
(eta_id, &[root_id, delta_id], 1),
],
);
db.move_channel(eta_id, root_id, admin_id).await.unwrap();
let result = db.get_channels_for_user(admin_id).await.unwrap();
assert_channel_tree_order(
result.channels,
&[
(root_id, &[], 1),
(beta_id, &[root_id], 1),
(gamma_id, &[root_id], 2),
(alpha_id, &[root_id], 3),
(delta_id, &[root_id], 4),
(eta_id, &[root_id], 5),
],
);
}
test_both_dbs!(
@@ -422,6 +652,20 @@ async fn test_db_channel_moving_bugs(db: &Arc<Database>) {
(livestreaming_id, &[zed_id, projects_id]),
],
);
// Can't un-root a root channel
db.move_channel(zed_id, livestreaming_id, user_id)
.await
.unwrap_err();
let result = db.get_channels_for_user(user_id).await.unwrap();
assert_channel_tree(
result.channels,
&[
(zed_id, &[]),
(projects_id, &[zed_id]),
(livestreaming_id, &[zed_id, projects_id]),
],
);
}
test_both_dbs!(
@@ -745,10 +989,29 @@ fn assert_channel_tree(actual: Vec<Channel>, expected: &[(ChannelId, &[ChannelId
let actual = actual
.iter()
.map(|channel| (channel.id, channel.parent_path.as_slice()))
.collect::<Vec<_>>();
pretty_assertions::assert_eq!(
actual,
expected.to_vec(),
"wrong channel ids and parent paths"
);
.collect::<HashSet<_>>();
let expected = expected
.iter()
.map(|(id, parents)| (*id, *parents))
.collect::<HashSet<_>>();
pretty_assertions::assert_eq!(actual, expected, "wrong channel ids and parent paths");
}
#[track_caller]
fn assert_channel_tree_order(actual: Vec<Channel>, expected: &[(ChannelId, &[ChannelId], i32)]) {
let actual = actual
.iter()
.map(|channel| {
(
channel.id,
channel.parent_path.as_slice(),
channel.channel_order,
)
})
.collect::<HashSet<_>>();
let expected = expected
.iter()
.map(|(id, parents, order)| (*id, *parents, *order))
.collect::<HashSet<_>>();
pretty_assertions::assert_eq!(actual, expected, "wrong channel ids, parent paths, or channel orders");
}

View File

@@ -384,6 +384,7 @@ impl Server {
.add_request_handler(get_notifications)
.add_request_handler(mark_notification_as_read)
.add_request_handler(move_channel)
.add_request_handler(reorder_channel)
.add_request_handler(follow)
.add_message_handler(unfollow)
.add_message_handler(update_followers)
@@ -3195,6 +3196,51 @@ async fn move_channel(
Ok(())
}
async fn reorder_channel(
request: proto::ReorderChannel,
response: Response<proto::ReorderChannel>,
session: Session,
) -> Result<()> {
let channel_id = ChannelId::from_proto(request.channel_id);
let direction = request.direction();
let updated_channels = session
.db()
.await
.reorder_channel(channel_id, direction, session.user_id())
.await?;
if let Some(root_id) = updated_channels.first().map(|channel| channel.root_id()) {
let connection_pool = session.connection_pool().await;
for (connection_id, role) in connection_pool.channel_connection_ids(root_id) {
let channels = updated_channels
.iter()
.filter_map(|channel| {
if role.can_see_channel(channel.visibility) {
Some(channel.to_proto())
} else {
None
}
})
.collect::<Vec<_>>();
if channels.is_empty() {
continue;
}
let update = proto::UpdateChannels {
channels,
..Default::default()
};
session.peer.send(connection_id, update.clone())?;
}
}
response.send(Ack {})?;
Ok(())
}
/// Get the list of channel members
async fn get_channel_members(
request: proto::GetChannelMembers,

View File

@@ -1010,6 +1010,7 @@ async fn test_peers_following_each_other(cx_a: &mut TestAppContext, cx_b: &mut T
workspace_b.update_in(cx_b, |workspace, window, cx| {
workspace.active_pane().update(cx, |pane, cx| {
pane.close_inactive_items(&Default::default(), window, cx)
.unwrap()
.detach();
});
});

View File

@@ -354,10 +354,6 @@ impl ChannelView {
editor.set_read_only(true);
cx.notify();
}),
ChannelBufferEvent::Connected => self.editor.update(cx, |editor, cx| {
editor.set_read_only(false);
cx.notify();
}),
ChannelBufferEvent::ChannelChanged => {
self.editor.update(cx, |_, cx| {
cx.emit(editor::EditorEvent::TitleChanged);

View File

@@ -0,0 +1,455 @@
//! Link folding support for channel notes.
//!
//! This module provides functionality to automatically fold markdown links in channel notes,
//! displaying only the link text with an underline decoration. When the cursor is positioned
//! inside a link, the crease is temporarily removed to show the full markdown syntax.
//!
//! Example:
//! - When rendered: [Example](https://example.com) becomes "Example" with underline
//! - When cursor is inside: Full markdown syntax is shown
//!
//! The main components are:
//! - `parse_markdown_links`: Extracts markdown links from text
//! - `create_link_creases`: Creates visual creases for links
//! - `LinkFoldingManager`: Manages dynamic showing/hiding of link creases based on cursor position
use editor::{
Anchor, Editor, FoldPlaceholder,
display_map::{Crease, CreaseId},
};
use gpui::{App, Entity, Window};
use std::{collections::HashMap, ops::Range, sync::Arc};
use ui::prelude::*;
#[derive(Debug, Clone, PartialEq)]
pub struct MarkdownLink {
pub text: String,
pub url: String,
pub range: Range<usize>,
}
pub fn parse_markdown_links(text: &str) -> Vec<MarkdownLink> {
let mut links = Vec::new();
let mut chars = text.char_indices();
while let Some((start, ch)) = chars.next() {
if ch == '[' {
// Look for the closing bracket
let mut bracket_depth = 1;
let text_start = start + 1;
let mut text_end = None;
for (i, ch) in chars.by_ref() {
match ch {
'[' => bracket_depth += 1,
']' => {
bracket_depth -= 1;
if bracket_depth == 0 {
text_end = Some(i);
break;
}
}
_ => {}
}
}
if let Some(text_end) = text_end {
// Check if the next character is '('
if let Some((_, '(')) = chars.next() {
// Look for the closing parenthesis
let url_start = text_end + 2;
let mut url_end = None;
for (i, ch) in chars.by_ref() {
if ch == ')' {
url_end = Some(i);
break;
}
}
if let Some(url_end) = url_end {
links.push(MarkdownLink {
text: text[text_start..text_end].to_string(),
url: text[url_start..url_end].to_string(),
range: start..url_end + 1,
});
}
}
}
}
}
links
}
pub fn create_link_creases(
links: &[MarkdownLink],
buffer_snapshot: &editor::MultiBufferSnapshot,
) -> Vec<Crease<Anchor>> {
links
.iter()
.map(|link| {
// Convert byte offsets to Points first
let start_point = buffer_snapshot.offset_to_point(link.range.start);
let end_point = buffer_snapshot.offset_to_point(link.range.end);
// Create anchors from points
let start = buffer_snapshot.anchor_before(start_point);
let end = buffer_snapshot.anchor_after(end_point);
let link_text = link.text.clone();
Crease::simple(
start..end,
FoldPlaceholder {
render: Arc::new(move |_fold_id, _range, cx| {
div()
.child(link_text.clone())
.text_decoration_1()
.text_decoration_solid()
.text_color(cx.theme().colors().link_text_hover)
.into_any_element()
}),
constrain_width: false,
merge_adjacent: false,
type_tag: None,
},
)
})
.collect()
}
pub struct LinkFoldingManager {
editor: Entity<Editor>,
folded_links: Vec<MarkdownLink>,
link_creases: HashMap<CreaseId, Range<usize>>,
}
impl LinkFoldingManager {
pub fn new(editor: Entity<Editor>, window: &mut Window, cx: &mut App) -> Self {
let mut manager = Self {
editor: editor.clone(),
folded_links: Vec::new(),
link_creases: HashMap::default(),
};
manager.refresh_links(window, cx);
manager
}
pub fn handle_selections_changed(&mut self, window: &mut Window, cx: &mut App) {
self.update_link_visibility(window, cx);
}
pub fn handle_edited(&mut self, window: &mut Window, cx: &mut App) {
// Re-parse links and update folds
// Note: In a production implementation, this could be debounced
// to avoid updating on every keystroke (see the sketch after this impl)
self.refresh_links(window, cx);
}
fn refresh_links(&mut self, window: &mut Window, cx: &mut App) {
// Remove existing creases
if !self.link_creases.is_empty() {
self.editor.update(cx, |editor, cx| {
let crease_ids: Vec<_> = self.link_creases.keys().copied().collect();
editor.remove_creases(crease_ids, cx);
});
self.link_creases.clear();
}
// Parse new links
let buffer_text = self.editor.read(cx).buffer().read(cx).snapshot(cx).text();
let links = parse_markdown_links(&buffer_text);
self.folded_links = links;
// Insert creases for all links
if !self.folded_links.is_empty() {
self.editor.update(cx, |editor, cx| {
let buffer = editor.buffer().read(cx).snapshot(cx);
let creases = create_link_creases(&self.folded_links, &buffer);
let crease_ids = editor.insert_creases(creases.clone(), cx);
// Store the mapping of crease IDs to link ranges
for (crease_id, link) in crease_ids.into_iter().zip(self.folded_links.iter()) {
self.link_creases.insert(crease_id, link.range.clone());
}
// Fold the creases to activate the custom placeholder
editor.fold_creases(creases, true, window, cx);
});
}
// Update visibility based on cursor position
self.update_link_visibility(window, cx);
}
fn update_link_visibility(&mut self, window: &mut Window, cx: &mut App) {
if self.folded_links.is_empty() {
return;
}
self.editor.update(cx, |editor, cx| {
let buffer = editor.buffer().read(cx).snapshot(cx);
let selections = editor.selections.all::<usize>(cx);
// Remove creases where the cursor is inside or adjacent to the link
let mut creases_to_remove = Vec::new();
for (crease_id, link_range) in &self.link_creases {
let cursor_near_link = selections.iter().any(|selection| {
let cursor_offset = selection.head();
// Check if cursor is inside the link
if link_range.contains(&cursor_offset) {
return true;
}
// Check if cursor is adjacent (immediately before or after)
cursor_offset == link_range.start || cursor_offset == link_range.end
});
if cursor_near_link {
creases_to_remove.push(*crease_id);
}
}
if !creases_to_remove.is_empty() {
editor.remove_creases(creases_to_remove.clone(), cx);
for crease_id in creases_to_remove {
self.link_creases.remove(&crease_id);
}
}
// Re-add creases for links where cursor is not present or adjacent
let links_to_recreate: Vec<_> = self
.folded_links
.iter()
.filter(|link| {
!selections.iter().any(|selection| {
let cursor_offset = selection.head();
// Check if cursor is inside or adjacent to the link
link.range.contains(&cursor_offset)
|| cursor_offset == link.range.start
|| cursor_offset == link.range.end
})
})
.filter(|link| {
// Only recreate if not already present
!self.link_creases.values().any(|range| range == &link.range)
})
.cloned()
.collect();
if !links_to_recreate.is_empty() {
let creases = create_link_creases(&links_to_recreate, &buffer);
let crease_ids = editor.insert_creases(creases.clone(), cx);
for (crease_id, link) in crease_ids.into_iter().zip(links_to_recreate.iter()) {
self.link_creases.insert(crease_id, link.range.clone());
}
// Fold the creases to activate the custom placeholder
editor.fold_creases(creases, true, window, cx);
}
});
}
/// Checks if a position (in buffer coordinates) is within a folded link
/// and returns the link if found
pub fn link_at_position(&self, position: usize) -> Option<&MarkdownLink> {
// Check if the position falls within any folded link that has an active crease
self.folded_links.iter().find(|link| {
link.range.contains(&position)
&& self.link_creases.values().any(|range| range == &link.range)
})
}
/// Opens a link URL in the default browser
pub fn open_link(&self, url: &str, cx: &mut App) {
cx.open_url(url);
}
}
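The note in handle_edited above suggests debouncing; a minimal std-only sketch of that idea follows. Debouncer is a hypothetical helper, and a real implementation would schedule work on gpui's executor instead of spawning OS threads:

use std::sync::{
    Arc,
    atomic::{AtomicU64, Ordering},
};
use std::{thread, time::Duration};

struct Debouncer {
    generation: Arc<AtomicU64>,
}

impl Debouncer {
    fn new() -> Self {
        Self {
            generation: Arc::new(AtomicU64::new(0)),
        }
    }

    // Schedule `work` after `delay`; any newer call supersedes this one.
    fn call(&self, delay: Duration, work: impl FnOnce() + Send + 'static) {
        let scheduled = self.generation.fetch_add(1, Ordering::SeqCst) + 1;
        let latest = Arc::clone(&self.generation);
        thread::spawn(move || {
            thread::sleep(delay);
            if latest.load(Ordering::SeqCst) == scheduled {
                work();
            }
        });
    }
}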
#[cfg(test)]
mod tests {
use super::*;
#[gpui::test]
fn test_parse_markdown_links() {
let text = "This is a [link](https://example.com) and another [test](https://test.com).";
let links = parse_markdown_links(text);
assert_eq!(links.len(), 2);
assert_eq!(links[0].text, "link");
assert_eq!(links[0].url, "https://example.com");
assert_eq!(links[0].range, 10..37);
assert_eq!(links[1].text, "test");
assert_eq!(links[1].url, "https://test.com");
assert_eq!(links[1].range, 50..74);
}
#[gpui::test]
fn test_parse_nested_brackets() {
let text = "A [[nested] link](https://example.com) here.";
let links = parse_markdown_links(text);
assert_eq!(links.len(), 1);
assert_eq!(links[0].text, "[nested] link");
assert_eq!(links[0].url, "https://example.com");
}
#[gpui::test]
fn test_parse_no_links() {
let text = "This text has no links.";
let links = parse_markdown_links(text);
assert_eq!(links.len(), 0);
}
#[gpui::test]
fn test_parse_incomplete_links() {
let text = "This [link has no url] and [this](incomplete";
let links = parse_markdown_links(text);
assert_eq!(links.len(), 0);
}
#[gpui::test]
fn test_link_parsing_and_folding() {
// Test comprehensive link parsing
let text = "Check out [Zed](https://zed.dev) and [GitHub](https://github.com)!";
let links = parse_markdown_links(text);
assert_eq!(links.len(), 2);
assert_eq!(links[0].text, "Zed");
assert_eq!(links[0].url, "https://zed.dev");
assert_eq!(links[0].range, 10..32);
assert_eq!(links[1].text, "GitHub");
assert_eq!(links[1].url, "https://github.com");
assert_eq!(links[1].range, 37..65);
}
#[gpui::test]
fn test_link_detection_when_typing() {
// Test that links are detected as they're typed
let text1 = "Check out ";
let links1 = parse_markdown_links(text1);
assert_eq!(links1.len(), 0, "No links in plain text");
let text2 = "Check out [";
let links2 = parse_markdown_links(text2);
assert_eq!(links2.len(), 0, "Incomplete link not detected");
let text3 = "Check out [Zed]";
let links3 = parse_markdown_links(text3);
assert_eq!(links3.len(), 0, "Link without URL not detected");
let text4 = "Check out [Zed](";
let links4 = parse_markdown_links(text4);
assert_eq!(links4.len(), 0, "Link with incomplete URL not detected");
let text5 = "Check out [Zed](https://zed.dev)";
let links5 = parse_markdown_links(text5);
assert_eq!(links5.len(), 1, "Complete link should be detected");
assert_eq!(links5[0].text, "Zed");
assert_eq!(links5[0].url, "https://zed.dev");
// Test link detection in middle of text
let text6 = "Check out [Zed](https://zed.dev) for coding!";
let links6 = parse_markdown_links(text6);
assert_eq!(links6.len(), 1, "Link in middle of text should be detected");
assert_eq!(links6[0].range, 10..32);
}
#[gpui::test]
fn test_link_position_detection() {
// Test the logic for determining if a position is within a link
let links = vec![
MarkdownLink {
text: "Zed".to_string(),
url: "https://zed.dev".to_string(),
range: 10..32,
},
MarkdownLink {
text: "GitHub".to_string(),
url: "https://github.com".to_string(),
range: 50..78,
},
];
// Test positions inside the first link
assert!(links[0].range.contains(&10), "Start of first link");
assert!(links[0].range.contains(&20), "Middle of first link");
assert!(links[0].range.contains(&31), "Near end of first link");
// Test positions inside the second link
assert!(links[1].range.contains(&50), "Start of second link");
assert!(links[1].range.contains(&65), "Middle of second link");
assert!(links[1].range.contains(&77), "Near end of second link");
// Test positions outside any link
assert!(!links[0].range.contains(&9), "Before first link");
assert!(!links[0].range.contains(&32), "After first link");
assert!(!links[1].range.contains(&49), "Before second link");
assert!(!links[1].range.contains(&78), "After second link");
// Test finding a link at a specific position
let link_at_20 = links.iter().find(|link| link.range.contains(&20));
assert!(link_at_20.is_some());
assert_eq!(link_at_20.unwrap().text, "Zed");
assert_eq!(link_at_20.unwrap().url, "https://zed.dev");
let link_at_65 = links.iter().find(|link| link.range.contains(&65));
assert!(link_at_65.is_some());
assert_eq!(link_at_65.unwrap().text, "GitHub");
assert_eq!(link_at_65.unwrap().url, "https://github.com");
}
#[gpui::test]
fn test_cursor_adjacent_link_expansion() {
// Test the logic for determining if cursor is inside or adjacent to a link
let link = MarkdownLink {
text: "Example".to_string(),
url: "https://example.com".to_string(),
range: 10..37,
};
// Helper function to check if cursor should expand the link
let should_expand_link = |cursor_pos: usize, link: &MarkdownLink| -> bool {
// Link should be expanded if cursor is inside or adjacent
link.range.contains(&cursor_pos)
|| cursor_pos == link.range.start
|| cursor_pos == link.range.end
};
// Test cursor positions
assert!(
should_expand_link(10, &link),
"Cursor at position 10 (link start) should expand link"
);
assert!(
should_expand_link(37, &link),
"Cursor at position 37 (link end) should expand link"
);
assert!(
should_expand_link(20, &link),
"Cursor at position 20 (inside link) should expand link"
);
assert!(
!should_expand_link(9, &link),
"Cursor at position 9 (before link) should not expand link"
);
assert!(
!should_expand_link(38, &link),
"Cursor at position 38 (after link) should not expand link"
);
// Test the edge cases
assert_eq!(link.range.start, 10, "Link starts at position 10");
assert_eq!(link.range.end, 37, "Link ends at position 37");
assert!(link.range.contains(&10), "Range includes start position");
assert!(!link.range.contains(&37), "Range excludes end position");
}
}

View File

@@ -12,7 +12,7 @@ use language::{
Anchor, Buffer, BufferSnapshot, CodeLabel, LanguageRegistry, ToOffset,
language_settings::SoftWrap,
};
use project::{Completion, CompletionResponse, CompletionSource, search::SearchQuery};
use project::{Completion, CompletionSource, search::SearchQuery};
use settings::Settings;
use std::{
cell::RefCell,
@@ -64,9 +64,9 @@ impl CompletionProvider for MessageEditorCompletionProvider {
_: editor::CompletionContext,
_window: &mut Window,
cx: &mut Context<Editor>,
) -> Task<Result<Vec<CompletionResponse>>> {
) -> Task<Result<Option<Vec<Completion>>>> {
let Some(handle) = self.0.upgrade() else {
return Task::ready(Ok(Vec::new()));
return Task::ready(Ok(None));
};
handle.update(cx, |message_editor, cx| {
message_editor.completions(buffer, buffer_position, cx)
@@ -248,21 +248,22 @@ impl MessageEditor {
buffer: &Entity<Buffer>,
end_anchor: Anchor,
cx: &mut Context<Self>,
) -> Task<Result<Vec<CompletionResponse>>> {
) -> Task<Result<Option<Vec<Completion>>>> {
if let Some((start_anchor, query, candidates)) =
self.collect_mention_candidates(buffer, end_anchor, cx)
{
if !candidates.is_empty() {
return cx.spawn(async move |_, cx| {
let completion_response = Self::resolve_completions_for_candidates(
&cx,
query.as_str(),
&candidates,
start_anchor..end_anchor,
Self::completion_for_mention,
)
.await;
Ok(vec![completion_response])
Ok(Some(
Self::resolve_completions_for_candidates(
&cx,
query.as_str(),
&candidates,
start_anchor..end_anchor,
Self::completion_for_mention,
)
.await,
))
});
}
}
@@ -272,23 +273,21 @@ impl MessageEditor {
{
if !candidates.is_empty() {
return cx.spawn(async move |_, cx| {
let completion_response = Self::resolve_completions_for_candidates(
&cx,
query.as_str(),
candidates,
start_anchor..end_anchor,
Self::completion_for_emoji,
)
.await;
Ok(vec![completion_response])
Ok(Some(
Self::resolve_completions_for_candidates(
&cx,
query.as_str(),
candidates,
start_anchor..end_anchor,
Self::completion_for_emoji,
)
.await,
))
});
}
}
Task::ready(Ok(vec![CompletionResponse {
completions: Vec::new(),
is_incomplete: false,
}]))
Task::ready(Ok(Some(Vec::new())))
}
async fn resolve_completions_for_candidates(
@@ -297,19 +296,18 @@ impl MessageEditor {
candidates: &[StringMatchCandidate],
range: Range<Anchor>,
completion_fn: impl Fn(&StringMatch) -> (String, CodeLabel),
) -> CompletionResponse {
const LIMIT: usize = 10;
) -> Vec<Completion> {
let matches = fuzzy::match_strings(
candidates,
query,
true,
LIMIT,
10,
&Default::default(),
cx.background_executor().clone(),
)
.await;
let completions = matches
matches
.into_iter()
.map(|mat| {
let (new_text, label) = completion_fn(&mat);
@@ -324,12 +322,7 @@ impl MessageEditor {
source: CompletionSource::Custom,
}
})
.collect::<Vec<_>>();
CompletionResponse {
is_incomplete: completions.len() >= LIMIT,
completions,
}
.collect()
}
fn completion_for_mention(mat: &StringMatch) -> (String, CodeLabel) {

View File

@@ -14,9 +14,9 @@ use fuzzy::{StringMatchCandidate, match_strings};
use gpui::{
AnyElement, App, AsyncWindowContext, Bounds, ClickEvent, ClipboardItem, Context, DismissEvent,
Div, Entity, EventEmitter, FocusHandle, Focusable, FontStyle, InteractiveElement, IntoElement,
ListOffset, ListState, MouseDownEvent, ParentElement, Pixels, Point, PromptLevel, Render,
SharedString, Styled, Subscription, Task, TextStyle, WeakEntity, Window, actions, anchored,
canvas, deferred, div, fill, list, point, prelude::*, px,
KeyContext, ListOffset, ListState, MouseDownEvent, ParentElement, Pixels, Point, PromptLevel,
Render, SharedString, Styled, Subscription, Task, TextStyle, WeakEntity, Window, actions,
anchored, canvas, deferred, div, fill, list, point, prelude::*, px,
};
use menu::{Cancel, Confirm, SecondaryConfirm, SelectNext, SelectPrevious};
use project::{Fs, Project};
@@ -52,6 +52,8 @@ actions!(
StartMoveChannel,
MoveSelected,
InsertSpace,
MoveChannelUp,
MoveChannelDown,
]
);
@@ -1961,6 +1963,33 @@ impl CollabPanel {
})
}
fn move_channel_up(&mut self, _: &MoveChannelUp, window: &mut Window, cx: &mut Context<Self>) {
if let Some(channel) = self.selected_channel() {
self.channel_store.update(cx, |store, cx| {
store
.reorder_channel(channel.id, proto::reorder_channel::Direction::Up, cx)
.detach_and_prompt_err("Failed to move channel up", window, cx, |_, _, _| None)
});
}
}
fn move_channel_down(
&mut self,
_: &MoveChannelDown,
window: &mut Window,
cx: &mut Context<Self>,
) {
if let Some(channel) = self.selected_channel() {
self.channel_store.update(cx, |store, cx| {
store
.reorder_channel(channel.id, proto::reorder_channel::Direction::Down, cx)
.detach_and_prompt_err("Failed to move channel down", window, cx, |_, _, _| {
None
})
});
}
}
fn open_channel_notes(
&mut self,
channel_id: ChannelId,
@@ -1974,7 +2003,7 @@ impl CollabPanel {
fn show_inline_context_menu(
&mut self,
_: &menu::SecondaryConfirm,
_: &Secondary,
window: &mut Window,
cx: &mut Context<Self>,
) {
@@ -2003,6 +2032,21 @@ impl CollabPanel {
}
}
fn dispatch_context(&self, window: &Window, cx: &Context<Self>) -> KeyContext {
let mut dispatch_context = KeyContext::new_with_defaults();
dispatch_context.add("CollabPanel");
dispatch_context.add("menu");
let identifier = if self.channel_name_editor.focus_handle(cx).is_focused(window) {
"editing"
} else {
"not_editing"
};
dispatch_context.add(identifier);
dispatch_context
}
fn selected_channel(&self) -> Option<&Arc<Channel>> {
self.selection
.and_then(|ix| self.entries.get(ix))
@@ -2965,7 +3009,7 @@ fn render_tree_branch(
impl Render for CollabPanel {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
v_flex()
.key_context("CollabPanel")
.key_context(self.dispatch_context(window, cx))
.on_action(cx.listener(CollabPanel::cancel))
.on_action(cx.listener(CollabPanel::select_next))
.on_action(cx.listener(CollabPanel::select_previous))
@@ -2977,6 +3021,8 @@ impl Render for CollabPanel {
.on_action(cx.listener(CollabPanel::collapse_selected_channel))
.on_action(cx.listener(CollabPanel::expand_selected_channel))
.on_action(cx.listener(CollabPanel::start_move_selected_channel))
.on_action(cx.listener(CollabPanel::move_channel_up))
.on_action(cx.listener(CollabPanel::move_channel_down))
.track_focus(&self.focus_handle(cx))
.size_full()
.child(if self.user_store.read(cx).current_user().is_none() {

View File

@@ -581,7 +581,7 @@ async fn stream_completion(
api_key: String,
request: Request,
) -> Result<BoxStream<'static, Result<ResponseEvent>>> {
let is_vision_request = request.messages.last().map_or(false, |message| match message {
let is_vision_request = request.messages.iter().any(|message| match message {
ChatMessage::User { content }
| ChatMessage::Assistant { content, .. }
| ChatMessage::Tool { content, .. } => {
@@ -736,4 +736,116 @@ mod tests {
assert_eq!(schema.data[0].id, "gpt-4");
assert_eq!(schema.data[1].id, "claude-3.7-sonnet");
}
#[test]
fn test_vision_request_detection() {
fn message_contains_image(message: &ChatMessage) -> bool {
match message {
ChatMessage::User { content }
| ChatMessage::Assistant { content, .. }
| ChatMessage::Tool { content, .. } => {
matches!(content, ChatMessageContent::Multipart(parts) if
parts.iter().any(|part| matches!(part, ChatMessagePart::Image { .. })))
}
_ => false,
}
}
// Helper function to detect if a request is a vision request
fn is_vision_request(request: &Request) -> bool {
request.messages.iter().any(message_contains_image)
}
let request_with_image_in_last = Request {
intent: true,
n: 1,
stream: true,
temperature: 0.1,
model: "claude-3.7-sonnet".to_string(),
messages: vec![
ChatMessage::User {
content: ChatMessageContent::Plain("Hello".to_string()),
},
ChatMessage::Assistant {
content: ChatMessageContent::Plain("How can I help?".to_string()),
tool_calls: vec![],
},
ChatMessage::User {
content: ChatMessageContent::Multipart(vec![
ChatMessagePart::Text {
text: "What's in this image?".to_string(),
},
ChatMessagePart::Image {
image_url: ImageUrl {
url: "data:image/png;base64,abc123".to_string(),
},
},
]),
},
],
tools: vec![],
tool_choice: None,
};
let request_with_image_in_earlier = Request {
intent: true,
n: 1,
stream: true,
temperature: 0.1,
model: "claude-3.7-sonnet".to_string(),
messages: vec![
ChatMessage::User {
content: ChatMessageContent::Plain("Hello".to_string()),
},
ChatMessage::User {
content: ChatMessageContent::Multipart(vec![
ChatMessagePart::Text {
text: "What's in this image?".to_string(),
},
ChatMessagePart::Image {
image_url: ImageUrl {
url: "data:image/png;base64,abc123".to_string(),
},
},
]),
},
ChatMessage::Assistant {
content: ChatMessageContent::Plain("I see a cat in the image.".to_string()),
tool_calls: vec![],
},
ChatMessage::User {
content: ChatMessageContent::Plain("What color is it?".to_string()),
},
],
tools: vec![],
tool_choice: None,
};
let request_with_no_images = Request {
intent: true,
n: 1,
stream: true,
temperature: 0.1,
model: "claude-3.7-sonnet".to_string(),
messages: vec![
ChatMessage::User {
content: ChatMessageContent::Plain("Hello".to_string()),
},
ChatMessage::Assistant {
content: ChatMessageContent::Plain("How can I help?".to_string()),
tool_calls: vec![],
},
ChatMessage::User {
content: ChatMessageContent::Plain("Tell me about Rust.".to_string()),
},
],
tools: vec![],
tool_choice: None,
};
assert!(is_vision_request(&request_with_image_in_last));
assert!(is_vision_request(&request_with_image_in_earlier));
assert!(!is_vision_request(&request_with_no_images));
}
}

View File

@@ -370,19 +370,21 @@ pub trait DebugAdapter: 'static + Send + Sync {
None
}
/// Extracts the kind (attach/launch) of debug configuration from the given JSON config.
/// This method should only return error when the kind cannot be determined for a given configuration;
/// in particular, it *should not* validate whether the request as a whole is valid, because that's best left to the debug adapter itself to decide.
fn request_kind(
fn validate_config(
&self,
config: &serde_json::Value,
) -> Result<StartDebuggingRequestArgumentsRequest> {
match config.get("request") {
Some(val) if val == "launch" => Ok(StartDebuggingRequestArgumentsRequest::Launch),
Some(val) if val == "attach" => Ok(StartDebuggingRequestArgumentsRequest::Attach),
_ => Err(anyhow!(
"missing or invalid `request` field in config. Expected 'launch' or 'attach'"
)),
let map = config.as_object().context("Config isn't an object")?;
let request_variant = map
.get("request")
.and_then(|val| val.as_str())
.context("request argument is not found or invalid")?;
match request_variant {
"launch" => Ok(StartDebuggingRequestArgumentsRequest::Launch),
"attach" => Ok(StartDebuggingRequestArgumentsRequest::Attach),
_ => Err(anyhow!("request must be either 'launch' or 'attach'")),
}
}
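A hedged usage sketch of the default implementation above (assumes an adapter value implementing DebugAdapter is in scope, along with serde_json):

let config = serde_json::json!({ "request": "launch", "program": "./main" });
assert!(matches!(
    adapter.validate_config(&config),
    Ok(StartDebuggingRequestArgumentsRequest::Launch)
));
assert!(adapter.validate_config(&serde_json::json!({})).is_err());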
@@ -412,7 +414,7 @@ impl DebugAdapter for FakeAdapter {
serde_json::Value::Null
}
fn request_kind(
fn validate_config(
&self,
config: &serde_json::Value,
) -> Result<StartDebuggingRequestArgumentsRequest> {
@@ -457,7 +459,7 @@ impl DebugAdapter for FakeAdapter {
envs: HashMap::default(),
cwd: None,
request_args: StartDebuggingRequestArguments {
request: self.request_kind(&task_definition.config)?,
request: self.validate_config(&task_definition.config)?,
configuration: task_definition.config.clone(),
},
})

View File

@@ -52,7 +52,7 @@ pub fn send_telemetry(scenario: &DebugScenario, location: TelemetrySpawnLocation
return;
};
let kind = adapter
.request_kind(&scenario.config)
.validate_config(&scenario.config)
.ok()
.map(serde_json::to_value)
.and_then(Result::ok);

View File

@@ -4,7 +4,7 @@ use dap_types::{
messages::{Message, Response},
};
use futures::{AsyncRead, AsyncReadExt as _, AsyncWrite, FutureExt as _, channel::oneshot, select};
use gpui::{AppContext as _, AsyncApp, Task};
use gpui::AsyncApp;
use settings::Settings as _;
use smallvec::SmallVec;
use smol::{
@@ -22,7 +22,7 @@ use std::{
time::Duration,
};
use task::TcpArgumentsTemplate;
use util::{ConnectionResult, ResultExt as _};
use util::{ResultExt as _, TryFutureExt};
use crate::{adapters::DebugAdapterBinary, debugger_settings::DebuggerSettings};
@@ -126,7 +126,7 @@ pub(crate) struct TransportDelegate {
pending_requests: Requests,
transport: Transport,
server_tx: Arc<Mutex<Option<Sender<Message>>>>,
_tasks: Vec<Task<()>>,
_tasks: Vec<gpui::Task<Option<()>>>,
}
impl TransportDelegate {
@@ -141,7 +141,7 @@ impl TransportDelegate {
log_handlers: Default::default(),
current_requests: Default::default(),
pending_requests: Default::default(),
_tasks: Vec::new(),
_tasks: Default::default(),
};
let messages = this.start_handlers(transport_pipes, cx).await?;
Ok((messages, this))
@@ -166,76 +166,45 @@ impl TransportDelegate {
None
};
let adapter_log_handler = log_handler.clone();
cx.update(|cx| {
if let Some(stdout) = params.stdout.take() {
self._tasks.push(cx.background_spawn(async move {
match Self::handle_adapter_log(stdout, adapter_log_handler).await {
ConnectionResult::Timeout => {
log::error!("Timed out when handling debugger log");
}
ConnectionResult::ConnectionReset => {
log::info!("Debugger logs connection closed");
}
ConnectionResult::Result(Ok(())) => {}
ConnectionResult::Result(Err(e)) => {
log::error!("Error handling debugger log: {e}");
}
}
}));
self._tasks.push(
cx.background_executor()
.spawn(Self::handle_adapter_log(stdout, log_handler.clone()).log_err()),
);
}
let pending_requests = self.pending_requests.clone();
let output_log_handler = log_handler.clone();
self._tasks.push(cx.background_spawn(async move {
match Self::handle_output(
params.output,
client_tx,
pending_requests,
output_log_handler,
)
.await
{
Ok(()) => {}
Err(e) => log::error!("Error handling debugger output: {e}"),
}
}));
self._tasks.push(
cx.background_executor().spawn(
Self::handle_output(
params.output,
client_tx,
self.pending_requests.clone(),
log_handler.clone(),
)
.log_err(),
),
);
if let Some(stderr) = params.stderr.take() {
let log_handlers = self.log_handlers.clone();
self._tasks.push(cx.background_spawn(async move {
match Self::handle_error(stderr, log_handlers).await {
ConnectionResult::Timeout => {
log::error!("Timed out reading debugger error stream")
}
ConnectionResult::ConnectionReset => {
log::info!("Debugger closed its error stream")
}
ConnectionResult::Result(Ok(())) => {}
ConnectionResult::Result(Err(e)) => {
log::error!("Error handling debugger error: {e}")
}
}
}));
self._tasks.push(
cx.background_executor()
.spawn(Self::handle_error(stderr, self.log_handlers.clone()).log_err()),
);
}
let current_requests = self.current_requests.clone();
let pending_requests = self.pending_requests.clone();
let log_handler = log_handler.clone();
self._tasks.push(cx.background_spawn(async move {
match Self::handle_input(
params.input,
client_rx,
current_requests,
pending_requests,
log_handler,
)
.await
{
Ok(()) => {}
Err(e) => log::error!("Error handling debugger input: {e}"),
}
}));
self._tasks.push(
cx.background_executor().spawn(
Self::handle_input(
params.input,
client_rx,
self.current_requests.clone(),
self.pending_requests.clone(),
log_handler.clone(),
)
.log_err(),
),
);
})?;
{
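The spawned tasks above change from hand-written `match`-and-log blocks to futures wrapped in `log_err()`, which is why `_tasks` becomes `Vec<gpui::Task<Option<()>>>`. A hedged sketch of what such a combinator does, consistent with that type change (illustrative only; the real combinator comes from zed's `util::TryFutureExt`):

```rust
// Resolve Ok(t) to Some(t); log and swallow Err, yielding None. A sketch
// consistent with the Task<Option<()>> type above, not the actual
// util::TryFutureExt implementation.
async fn log_err<T, E: std::fmt::Display>(
    fut: impl std::future::Future<Output = Result<T, E>>,
) -> Option<T> {
    match fut.await {
        Ok(value) => Some(value),
        Err(error) => {
            eprintln!("error: {error}");
            None
        }
    }
}

fn main() {
    let out = futures::executor::block_on(log_err(async { Err::<(), &str>("stream closed") }));
    assert!(out.is_none());
}
```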
@@ -266,7 +235,7 @@ impl TransportDelegate {
async fn handle_adapter_log<Stdout>(
stdout: Stdout,
log_handlers: Option<LogHandlers>,
) -> ConnectionResult<()>
) -> Result<()>
where
Stdout: AsyncRead + Unpin + Send + 'static,
{
@@ -276,14 +245,13 @@ impl TransportDelegate {
let result = loop {
line.truncate(0);
match reader
.read_line(&mut line)
.await
.context("reading adapter log line")
{
Ok(0) => break ConnectionResult::ConnectionReset,
Ok(_) => {}
Err(e) => break ConnectionResult::Result(Err(e)),
let bytes_read = match reader.read_line(&mut line).await {
Ok(bytes_read) => bytes_read,
Err(e) => break Err(e.into()),
};
if bytes_read == 0 {
anyhow::bail!("Debugger log stream closed");
}
if let Some(log_handlers) = log_handlers.as_ref() {
@@ -369,35 +337,35 @@ impl TransportDelegate {
let mut reader = BufReader::new(server_stdout);
let result = loop {
match Self::receive_server_message(&mut reader, &mut recv_buffer, log_handlers.as_ref())
.await
{
ConnectionResult::Timeout => anyhow::bail!("Timed out when connecting to debugger"),
ConnectionResult::ConnectionReset => {
log::info!("Debugger closed the connection");
return Ok(());
}
ConnectionResult::Result(Ok(Message::Response(res))) => {
let message =
Self::receive_server_message(&mut reader, &mut recv_buffer, log_handlers.as_ref())
.await;
match message {
Ok(Message::Response(res)) => {
if let Some(tx) = pending_requests.lock().await.remove(&res.request_seq) {
if let Err(e) = tx.send(Self::process_response(res)) {
log::trace!("Did not send response `{:?}` for a cancelled request", e);
}
} else {
client_tx.send(Message::Response(res)).await?;
}
};
}
ConnectionResult::Result(Ok(message)) => client_tx.send(message).await?,
ConnectionResult::Result(Err(e)) => break Err(e),
Ok(message) => {
client_tx.send(message).await?;
}
Err(e) => break Err(e),
}
};
drop(client_tx);
log::debug!("Handle adapter output dropped");
result
}
async fn handle_error<Stderr>(stderr: Stderr, log_handlers: LogHandlers) -> ConnectionResult<()>
async fn handle_error<Stderr>(stderr: Stderr, log_handlers: LogHandlers) -> Result<()>
where
Stderr: AsyncRead + Unpin + Send + 'static,
{
@@ -407,12 +375,8 @@ impl TransportDelegate {
let mut reader = BufReader::new(stderr);
let result = loop {
match reader
.read_line(&mut buffer)
.await
.context("reading error log line")
{
Ok(0) => break ConnectionResult::ConnectionReset,
match reader.read_line(&mut buffer).await {
Ok(0) => anyhow::bail!("debugger error stream closed"),
Ok(_) => {
for (kind, log_handler) in log_handlers.lock().iter_mut() {
if matches!(kind, LogKind::Adapter) {
@@ -422,7 +386,7 @@ impl TransportDelegate {
buffer.truncate(0);
}
Err(error) => break ConnectionResult::Result(Err(error)),
Err(error) => break Err(error.into()),
}
};
@@ -456,7 +420,7 @@ impl TransportDelegate {
reader: &mut BufReader<Stdout>,
buffer: &mut String,
log_handlers: Option<&LogHandlers>,
) -> ConnectionResult<Message>
) -> Result<Message>
where
Stdout: AsyncRead + Unpin + Send + 'static,
{
@@ -464,58 +428,48 @@ impl TransportDelegate {
loop {
buffer.truncate(0);
match reader
if reader
.read_line(buffer)
.await
.with_context(|| "reading a message from server")
.with_context(|| "reading a message from server")?
== 0
{
Ok(0) => return ConnectionResult::ConnectionReset,
Ok(_) => {}
Err(e) => return ConnectionResult::Result(Err(e)),
anyhow::bail!("debugger reader stream closed, last string output: '{buffer}'");
};
if buffer == "\r\n" {
break;
}
if let Some(("Content-Length", value)) = buffer.trim().split_once(": ") {
match value.parse().context("invalid content length") {
Ok(length) => content_length = Some(length),
Err(e) => return ConnectionResult::Result(Err(e)),
let parts = buffer.trim().split_once(": ");
match parts {
Some(("Content-Length", value)) => {
content_length = Some(value.parse().context("invalid content length")?);
}
_ => {}
}
}
let content_length = match content_length.context("missing content length") {
Ok(length) => length,
Err(e) => return ConnectionResult::Result(Err(e)),
};
let content_length = content_length.context("missing content length")?;
let mut content = vec![0; content_length];
if let Err(e) = reader
reader
.read_exact(&mut content)
.await
.with_context(|| "reading after a loop")
{
return ConnectionResult::Result(Err(e));
}
.with_context(|| "reading after a loop")?;
let message_str = match std::str::from_utf8(&content).context("invalid utf8 from server") {
Ok(str) => str,
Err(e) => return ConnectionResult::Result(Err(e)),
};
let message = std::str::from_utf8(&content).context("invalid utf8 from server")?;
if let Some(log_handlers) = log_handlers {
for (kind, log_handler) in log_handlers.lock().iter_mut() {
if matches!(kind, LogKind::Rpc) {
log_handler(IoKind::StdOut, message_str);
log_handler(IoKind::StdOut, &message);
}
}
}
ConnectionResult::Result(
serde_json::from_str::<Message>(message_str).context("deserializing server message"),
)
Ok(serde_json::from_str::<Message>(message)?)
}
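For reference, the wire format `receive_server_message` parses is DAP's LSP-style framing: one or more header lines, a blank `\r\n`, then exactly `Content-Length` bytes of JSON. A tiny sketch of the write side (`frame_message` is illustrative; the real code goes through `TransportDelegate::build_rpc_message`):

```rust
// Frame a JSON payload as a DAP message: Content-Length header, blank line,
// then the body. The reader above undoes exactly this framing.
fn frame_message(payload: &str) -> String {
    format!("Content-Length: {}\r\n\r\n{}", payload.len(), payload)
}

fn main() {
    let framed = frame_message(r#"{"seq":1,"type":"request","command":"initialize"}"#);
    assert!(framed.starts_with("Content-Length: 49\r\n\r\n"));
}
```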
pub async fn shutdown(&self) -> Result<()> {
@@ -823,55 +777,73 @@ impl FakeTransport {
let response_handlers = this.response_handlers.clone();
let stdout_writer = Arc::new(Mutex::new(stdout_writer));
cx.background_spawn(async move {
let mut reader = BufReader::new(stdin_reader);
let mut buffer = String::new();
cx.background_executor()
.spawn(async move {
let mut reader = BufReader::new(stdin_reader);
let mut buffer = String::new();
loop {
match TransportDelegate::receive_server_message(&mut reader, &mut buffer, None)
.await
{
ConnectionResult::Timeout => {
anyhow::bail!("Timed out when connecting to debugger");
}
ConnectionResult::ConnectionReset => {
log::info!("Debugger closed the connection");
break Ok(());
}
ConnectionResult::Result(Err(e)) => break Err(e),
ConnectionResult::Result(Ok(message)) => {
match message {
Message::Request(request) => {
// redirect reverse requests to stdout writer/reader
if request.command == RunInTerminal::COMMAND
|| request.command == StartDebugging::COMMAND
{
let message =
serde_json::to_string(&Message::Request(request)).unwrap();
loop {
let message =
TransportDelegate::receive_server_message(&mut reader, &mut buffer, None)
.await;
let mut writer = stdout_writer.lock().await;
writer
.write_all(
TransportDelegate::build_rpc_message(message)
.as_bytes(),
)
.await
.unwrap();
writer.flush().await.unwrap();
} else {
let response = if let Some(handle) =
request_handlers.lock().get_mut(request.command.as_str())
match message {
Err(error) => {
break anyhow::anyhow!(error);
}
Ok(message) => {
match message {
Message::Request(request) => {
// redirect reverse requests to stdout writer/reader
if request.command == RunInTerminal::COMMAND
|| request.command == StartDebugging::COMMAND
{
handle(request.seq, request.arguments.unwrap_or(json!({})))
} else {
panic!("No request handler for {}", request.command);
};
let message =
serde_json::to_string(&Message::Response(response))
let message =
serde_json::to_string(&Message::Request(request))
.unwrap();
let mut writer = stdout_writer.lock().await;
writer
.write_all(
TransportDelegate::build_rpc_message(message)
.as_bytes(),
)
.await
.unwrap();
writer.flush().await.unwrap();
} else {
let response = if let Some(handle) = request_handlers
.lock()
.get_mut(request.command.as_str())
{
handle(
request.seq,
request.arguments.unwrap_or(json!({})),
)
} else {
panic!("No request handler for {}", request.command);
};
let message =
serde_json::to_string(&Message::Response(response))
.unwrap();
let mut writer = stdout_writer.lock().await;
writer
.write_all(
TransportDelegate::build_rpc_message(message)
.as_bytes(),
)
.await
.unwrap();
writer.flush().await.unwrap();
}
}
Message::Event(event) => {
let message =
serde_json::to_string(&Message::Event(event)).unwrap();
let mut writer = stdout_writer.lock().await;
writer
.write_all(
TransportDelegate::build_rpc_message(message)
@@ -881,35 +853,21 @@ impl FakeTransport {
.unwrap();
writer.flush().await.unwrap();
}
}
Message::Event(event) => {
let message =
serde_json::to_string(&Message::Event(event)).unwrap();
let mut writer = stdout_writer.lock().await;
writer
.write_all(
TransportDelegate::build_rpc_message(message).as_bytes(),
)
.await
.unwrap();
writer.flush().await.unwrap();
}
Message::Response(response) => {
if let Some(handle) =
response_handlers.lock().get(response.command.as_str())
{
handle(response);
} else {
log::error!("No response handler for {}", response.command);
Message::Response(response) => {
if let Some(handle) =
response_handlers.lock().get(response.command.as_str())
{
handle(response);
} else {
log::error!("No response handler for {}", response.command);
}
}
}
}
}
}
}
})
.detach();
})
.detach();
Ok((
TransportPipe::new(Box::new(stdin_writer), Box::new(stdout_reader), None, None),

View File

@@ -1,8 +1,11 @@
use std::{collections::HashMap, path::PathBuf, sync::OnceLock};
use anyhow::{Context as _, Result};
use anyhow::{Context as _, Result, anyhow};
use async_trait::async_trait;
use dap::adapters::{DebugTaskDefinition, latest_github_release};
use dap::{
StartDebuggingRequestArgumentsRequest,
adapters::{DebugTaskDefinition, latest_github_release},
};
use futures::StreamExt;
use gpui::AsyncApp;
use serde_json::Value;
@@ -34,7 +37,7 @@ impl CodeLldbDebugAdapter {
Value::String(String::from(task_definition.label.as_ref())),
);
let request = self.request_kind(&configuration)?;
let request = self.validate_config(&configuration)?;
Ok(dap::StartDebuggingRequestArguments {
request,
@@ -86,6 +89,48 @@ impl DebugAdapter for CodeLldbDebugAdapter {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
fn validate_config(
&self,
config: &serde_json::Value,
) -> Result<StartDebuggingRequestArgumentsRequest> {
let map = config
.as_object()
.ok_or_else(|| anyhow!("Config isn't an object"))?;
let request_variant = map
.get("request")
.and_then(|r| r.as_str())
.ok_or_else(|| anyhow!("request field is required and must be a string"))?;
match request_variant {
"launch" => {
// For launch, verify that one of the required configs exists
if !(map.contains_key("program")
|| map.contains_key("targetCreateCommands")
|| map.contains_key("cargo"))
{
return Err(anyhow!(
"launch request requires either 'program', 'targetCreateCommands', or 'cargo' field"
));
}
Ok(StartDebuggingRequestArgumentsRequest::Launch)
}
"attach" => {
// For attach, verify that either pid or program exists
if !(map.contains_key("pid") || map.contains_key("program")) {
return Err(anyhow!(
"attach request requires either 'pid' or 'program' field"
));
}
Ok(StartDebuggingRequestArgumentsRequest::Attach)
}
_ => Err(anyhow!(
"request must be either 'launch' or 'attach', got '{}'",
request_variant
)),
}
}
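Unlike the generic launch/attach mapping, the CodeLLDB validator also enforces per-variant required fields. A standalone sketch of just the launch-side check, with `check_launch` as an illustrative stand-in for the method above:

```rust
use anyhow::{Result, anyhow};
use serde_json::json;

// Stand-in for the CodeLLDB launch-side check: a "launch" request must
// carry one of `program`, `targetCreateCommands`, or `cargo`.
fn check_launch(config: &serde_json::Value) -> Result<()> {
    let map = config.as_object().ok_or_else(|| anyhow!("Config isn't an object"))?;
    if map.contains_key("program")
        || map.contains_key("targetCreateCommands")
        || map.contains_key("cargo")
    {
        Ok(())
    } else {
        Err(anyhow!(
            "launch request requires either 'program', 'targetCreateCommands', or 'cargo' field"
        ))
    }
}

fn main() {
    assert!(check_launch(&json!({ "request": "launch", "program": "a.out" })).is_ok());
    assert!(check_launch(&json!({ "request": "launch" })).is_err());
}
```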
fn config_from_zed_format(&self, zed_scenario: ZedDebugConfig) -> Result<DebugScenario> {
let mut configuration = json!({
"request": match zed_scenario.request {

View File

@@ -178,7 +178,7 @@ impl DebugAdapter for GdbDebugAdapter {
let gdb_path = user_setting_path.unwrap_or(gdb_path?);
let request_args = StartDebuggingRequestArguments {
request: self.request_kind(&config.config)?,
request: self.validate_config(&config.config)?,
configuration: config.config.clone(),
};

View File

@@ -1,6 +1,6 @@
use anyhow::{Context as _, bail};
use anyhow::{Context as _, anyhow, bail};
use dap::{
StartDebuggingRequestArguments,
StartDebuggingRequestArguments, StartDebuggingRequestArgumentsRequest,
adapters::{
DebugTaskDefinition, DownloadedFileType, download_adapter_from_github,
latest_github_release,
@@ -350,6 +350,24 @@ impl DebugAdapter for GoDebugAdapter {
})
}
fn validate_config(
&self,
config: &serde_json::Value,
) -> Result<StartDebuggingRequestArgumentsRequest> {
let map = config.as_object().context("Config isn't an object")?;
let request_variant = map
.get("request")
.and_then(|val| val.as_str())
.context("request argument is not found or invalid")?;
match request_variant {
"launch" => Ok(StartDebuggingRequestArgumentsRequest::Launch),
"attach" => Ok(StartDebuggingRequestArgumentsRequest::Attach),
_ => Err(anyhow!("request must be either 'launch' or 'attach'")),
}
}
fn config_from_zed_format(&self, zed_scenario: ZedDebugConfig) -> Result<DebugScenario> {
let mut args = match &zed_scenario.request {
dap::DebugRequest::Attach(attach_config) => {
@@ -470,7 +488,7 @@ impl DebugAdapter for GoDebugAdapter {
connection: None,
request_args: StartDebuggingRequestArguments {
configuration: task_definition.config.clone(),
request: self.request_kind(&task_definition.config)?,
request: self.validate_config(&task_definition.config)?,
},
})
}

View File

@@ -1,6 +1,9 @@
use adapters::latest_github_release;
use anyhow::Context as _;
use dap::{StartDebuggingRequestArguments, adapters::DebugTaskDefinition};
use anyhow::{Context as _, anyhow};
use dap::{
StartDebuggingRequestArguments, StartDebuggingRequestArgumentsRequest,
adapters::DebugTaskDefinition,
};
use gpui::AsyncApp;
use std::{collections::HashMap, path::PathBuf, sync::OnceLock};
use task::DebugRequest;
@@ -92,7 +95,7 @@ impl JsDebugAdapter {
}),
request_args: StartDebuggingRequestArguments {
configuration: task_definition.config.clone(),
request: self.request_kind(&task_definition.config)?,
request: self.validate_config(&task_definition.config)?,
},
})
}
@@ -104,6 +107,29 @@ impl DebugAdapter for JsDebugAdapter {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
fn validate_config(
&self,
config: &serde_json::Value,
) -> Result<dap::StartDebuggingRequestArgumentsRequest> {
match config.get("request") {
Some(val) if val == "launch" => {
if config.get("program").is_none() && config.get("url").is_none() {
return Err(anyhow!(
"either program or url is required for launch request"
));
}
Ok(StartDebuggingRequestArgumentsRequest::Launch)
}
Some(val) if val == "attach" => {
if !config.get("processId").is_some_and(|val| val.is_u64()) {
return Err(anyhow!("processId must be a number"));
}
Ok(StartDebuggingRequestArgumentsRequest::Attach)
}
_ => Err(anyhow!("missing or invalid request field in config")),
}
}
fn config_from_zed_format(&self, zed_scenario: ZedDebugConfig) -> Result<DebugScenario> {
let mut args = json!({
"type": "pwa-node",

View File

@@ -94,7 +94,7 @@ impl PhpDebugAdapter {
envs: HashMap::default(),
request_args: StartDebuggingRequestArguments {
configuration: task_definition.config.clone(),
request: <Self as DebugAdapter>::request_kind(self, &task_definition.config)?,
request: <Self as DebugAdapter>::validate_config(self, &task_definition.config)?,
},
})
}
@@ -282,7 +282,10 @@ impl DebugAdapter for PhpDebugAdapter {
Some(SharedString::new_static("PHP").into())
}
fn request_kind(&self, _: &serde_json::Value) -> Result<StartDebuggingRequestArgumentsRequest> {
fn validate_config(
&self,
_: &serde_json::Value,
) -> Result<StartDebuggingRequestArgumentsRequest> {
Ok(StartDebuggingRequestArgumentsRequest::Launch)
}

View File

@@ -1,6 +1,9 @@
use crate::*;
use anyhow::Context as _;
use dap::{DebugRequest, StartDebuggingRequestArguments, adapters::DebugTaskDefinition};
use anyhow::{Context as _, anyhow};
use dap::{
DebugRequest, StartDebuggingRequestArguments, StartDebuggingRequestArgumentsRequest,
adapters::DebugTaskDefinition,
};
use gpui::{AsyncApp, SharedString};
use json_dotpath::DotPaths;
use language::{LanguageName, Toolchain};
@@ -83,7 +86,7 @@ impl PythonDebugAdapter {
&self,
task_definition: &DebugTaskDefinition,
) -> Result<StartDebuggingRequestArguments> {
let request = self.request_kind(&task_definition.config)?;
let request = self.validate_config(&task_definition.config)?;
let mut configuration = task_definition.config.clone();
if let Ok(console) = configuration.dot_get_mut("console") {
@@ -251,6 +254,24 @@ impl DebugAdapter for PythonDebugAdapter {
})
}
fn validate_config(
&self,
config: &serde_json::Value,
) -> Result<StartDebuggingRequestArgumentsRequest> {
let map = config.as_object().context("Config isn't an object")?;
let request_variant = map
.get("request")
.and_then(|val| val.as_str())
.context("request is not valid")?;
match request_variant {
"launch" => Ok(StartDebuggingRequestArgumentsRequest::Launch),
"attach" => Ok(StartDebuggingRequestArgumentsRequest::Attach),
_ => Err(anyhow!("request must be either 'launch' or 'attach'")),
}
}
async fn dap_schema(&self) -> serde_json::Value {
json!({
"properties": {

View File

@@ -265,7 +265,7 @@ impl DebugAdapter for RubyDebugAdapter {
cwd: None,
envs: std::collections::HashMap::default(),
request_args: StartDebuggingRequestArguments {
request: self.request_kind(&definition.config)?,
request: self.validate_config(&definition.config)?,
configuration: definition.config.clone(),
},
})

View File

@@ -50,7 +50,6 @@ project.workspace = true
rpc.workspace = true
serde.workspace = true
serde_json.workspace = true
# serde_json_lenient.workspace = true
settings.workspace = true
shlex.workspace = true
sysinfo.workspace = true

View File

@@ -3,12 +3,11 @@ use crate::session::DebugSession;
use crate::session::running::RunningState;
use crate::{
ClearAllBreakpoints, Continue, Detach, FocusBreakpointList, FocusConsole, FocusFrames,
FocusLoadedSources, FocusModules, FocusTerminal, FocusVariables, NewProcessModal,
NewProcessMode, Pause, Restart, ShowStackTrace, StepBack, StepInto, StepOut, StepOver, Stop,
ToggleExpandItem, ToggleIgnoreBreakpoints, ToggleSessionPicker, ToggleThreadPicker,
persistence, spawn_task_or_modal,
FocusLoadedSources, FocusModules, FocusTerminal, FocusVariables, Pause, Restart,
ShowStackTrace, StepBack, StepInto, StepOut, StepOver, Stop, ToggleIgnoreBreakpoints,
ToggleSessionPicker, ToggleThreadPicker, persistence, spawn_task_or_modal,
};
use anyhow::Result;
use anyhow::{Context as _, Result, anyhow};
use command_palette_hooks::CommandPaletteFilter;
use dap::StartDebuggingRequestArguments;
use dap::adapters::DebugAdapterName;
@@ -25,7 +24,7 @@ use gpui::{
use language::Buffer;
use project::debugger::session::{Session, SessionStateEvent};
use project::{Fs, WorktreeId};
use project::{Fs, ProjectPath, WorktreeId};
use project::{Project, debugger::session::ThreadStatus};
use rpc::proto::{self};
use settings::Settings;
@@ -70,7 +69,6 @@ pub struct DebugPanel {
pub(crate) thread_picker_menu_handle: PopoverMenuHandle<ContextMenu>,
pub(crate) session_picker_menu_handle: PopoverMenuHandle<ContextMenu>,
fs: Arc<dyn Fs>,
is_zoomed: bool,
_subscriptions: [Subscription; 1],
}
@@ -105,7 +103,6 @@ impl DebugPanel {
fs: workspace.app_state().fs.clone(),
thread_picker_menu_handle,
session_picker_menu_handle,
is_zoomed: false,
_subscriptions: [focus_subscription],
debug_scenario_scheduled_last: true,
}
@@ -337,17 +334,10 @@ impl DebugPanel {
let Some(task_inventory) = task_store.read(cx).task_inventory() else {
return;
};
let workspace = self.workspace.clone();
let Some(scenario) = task_inventory.read(cx).last_scheduled_scenario().cloned() else {
window.defer(cx, move |window, cx| {
workspace
.update(cx, |workspace, cx| {
NewProcessModal::show(workspace, window, NewProcessMode::Launch, None, cx);
})
.ok();
});
return;
};
let workspace = self.workspace.clone();
cx.spawn_in(window, async move |this, cx| {
let task_contexts = workspace
@@ -952,69 +942,68 @@ impl DebugPanel {
cx.notify();
}
// TODO: restore once we have proper comment preserving file edits
// pub(crate) fn save_scenario(
// &self,
// scenario: &DebugScenario,
// worktree_id: WorktreeId,
// window: &mut Window,
// cx: &mut App,
// ) -> Task<Result<ProjectPath>> {
// self.workspace
// .update(cx, |workspace, cx| {
// let Some(mut path) = workspace.absolute_path_of_worktree(worktree_id, cx) else {
// return Task::ready(Err(anyhow!("Couldn't get worktree path")));
// };
pub(crate) fn save_scenario(
&self,
scenario: &DebugScenario,
worktree_id: WorktreeId,
window: &mut Window,
cx: &mut App,
) -> Task<Result<ProjectPath>> {
self.workspace
.update(cx, |workspace, cx| {
let Some(mut path) = workspace.absolute_path_of_worktree(worktree_id, cx) else {
return Task::ready(Err(anyhow!("Couldn't get worktree path")));
};
// let serialized_scenario = serde_json::to_value(scenario);
let serialized_scenario = serde_json::to_value(scenario);
// cx.spawn_in(window, async move |workspace, cx| {
// let serialized_scenario = serialized_scenario?;
// let fs =
// workspace.read_with(cx, |workspace, _| workspace.app_state().fs.clone())?;
cx.spawn_in(window, async move |workspace, cx| {
let serialized_scenario = serialized_scenario?;
let fs =
workspace.read_with(cx, |workspace, _| workspace.app_state().fs.clone())?;
// path.push(paths::local_settings_folder_relative_path());
// if !fs.is_dir(path.as_path()).await {
// fs.create_dir(path.as_path()).await?;
// }
// path.pop();
path.push(paths::local_settings_folder_relative_path());
if !fs.is_dir(path.as_path()).await {
fs.create_dir(path.as_path()).await?;
}
path.pop();
// path.push(paths::local_debug_file_relative_path());
// let path = path.as_path();
path.push(paths::local_debug_file_relative_path());
let path = path.as_path();
// if !fs.is_file(path).await {
// fs.create_file(path, Default::default()).await?;
// fs.write(
// path,
// initial_local_debug_tasks_content().to_string().as_bytes(),
// )
// .await?;
// }
if !fs.is_file(path).await {
let content =
serde_json::to_string_pretty(&serde_json::Value::Array(vec![
serialized_scenario,
]))?;
// let content = fs.load(path).await?;
// let mut values =
// serde_json_lenient::from_str::<Vec<serde_json::Value>>(&content)?;
// values.push(serialized_scenario);
// fs.save(
// path,
// &serde_json_lenient::to_string_pretty(&values).map(Into::into)?,
// Default::default(),
// )
// .await?;
fs.create_file(path, Default::default()).await?;
fs.save(path, &content.into(), Default::default()).await?;
} else {
let content = fs.load(path).await?;
let mut values = serde_json::from_str::<Vec<serde_json::Value>>(&content)?;
values.push(serialized_scenario);
fs.save(
path,
&serde_json::to_string_pretty(&values).map(Into::into)?,
Default::default(),
)
.await?;
}
// workspace.update(cx, |workspace, cx| {
// workspace
// .project()
// .read(cx)
// .project_path_for_absolute_path(&path, cx)
// .context(
// "Couldn't get project path for .zed/debug.json in active worktree",
// )
// })?
// })
// })
// .unwrap_or_else(|err| Task::ready(Err(err)))
// }
workspace.update(cx, |workspace, cx| {
workspace
.project()
.read(cx)
.project_path_for_absolute_path(&path, cx)
.context(
"Couldn't get project path for .zed/debug.json in active worktree",
)
})?
})
})
.unwrap_or_else(|err| Task::ready(Err(err)))
}
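The restored `save_scenario` boils down to appending to a JSON array on disk: create `.zed/debug.json` with a one-element array on first save, otherwise parse the existing array, push, and rewrite pretty-printed. A minimal sketch of that flow, assuming plain `std::fs` in place of the workspace `Fs` trait the real code goes through:

```rust
use anyhow::Result;
use std::path::Path;

// Append one serialized scenario to a JSON-array file, creating the file
// with a single-element array if it does not exist yet.
fn append_scenario(path: &Path, scenario: serde_json::Value) -> Result<()> {
    let mut values: Vec<serde_json::Value> = if path.is_file() {
        serde_json::from_str(&std::fs::read_to_string(path)?)?
    } else {
        Vec::new()
    };
    values.push(scenario);
    std::fs::write(path, serde_json::to_string_pretty(&values)?)?;
    Ok(())
}

fn main() -> Result<()> {
    let dir = std::env::temp_dir().join("debug-json-sketch");
    std::fs::create_dir_all(&dir)?;
    let path = dir.join("debug.json");
    append_scenario(&path, serde_json::json!({ "adapter": "fake-adapter", "request": "launch" }))?;
    append_scenario(&path, serde_json::json!({ "adapter": "fake-adapter", "request": "attach" }))?;
    Ok(())
}
```

The strict `serde_json` round-trip here is what the "restore once we have proper comment preserving file edits" TODO is about: any comments a user added to debug.json would be lost on rewrite.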
pub(crate) fn toggle_thread_picker(&mut self, window: &mut Window, cx: &mut Context<Self>) {
self.thread_picker_menu_handle.toggle(window, cx);
@@ -1023,22 +1012,6 @@ impl DebugPanel {
pub(crate) fn toggle_session_picker(&mut self, window: &mut Window, cx: &mut Context<Self>) {
self.session_picker_menu_handle.toggle(window, cx);
}
fn toggle_zoom(
&mut self,
_: &workspace::ToggleZoom,
window: &mut Window,
cx: &mut Context<Self>,
) {
if self.is_zoomed {
cx.emit(PanelEvent::ZoomOut);
} else {
if !self.focus_handle(cx).contains_focused(window, cx) {
cx.focus_self(window);
}
cx.emit(PanelEvent::ZoomIn);
}
}
}
async fn register_session_inner(
@@ -1194,15 +1167,6 @@ impl Panel for DebugPanel {
}
fn set_active(&mut self, _: bool, _: &mut Window, _: &mut Context<Self>) {}
fn is_zoomed(&self, _window: &Window, _cx: &App) -> bool {
self.is_zoomed
}
fn set_zoomed(&mut self, zoomed: bool, _window: &mut Window, cx: &mut Context<Self>) {
self.is_zoomed = zoomed;
cx.notify();
}
}
impl Render for DebugPanel {
@@ -1343,23 +1307,6 @@ impl Render for DebugPanel {
.ok();
}
})
.on_action(cx.listener(Self::toggle_zoom))
.on_action(cx.listener(|panel, _: &ToggleExpandItem, _, cx| {
let Some(session) = panel.active_session() else {
return;
};
let active_pane = session
.read(cx)
.running_state()
.read(cx)
.active_pane()
.clone();
active_pane.update(cx, |pane, cx| {
let is_zoomed = pane.is_zoomed();
pane.set_zoomed(!is_zoomed, cx);
});
cx.notify();
}))
.when(self.active_session.is_some(), |this| {
this.on_mouse_down(
MouseButton::Right,
@@ -1463,10 +1410,4 @@ impl workspace::DebuggerProvider for DebuggerProvider {
fn debug_scenario_scheduled_last(&self, cx: &App) -> bool {
self.0.read(cx).debug_scenario_scheduled_last
}
fn active_thread_state(&self, cx: &App) -> Option<ThreadStatus> {
let session = self.0.read(cx).active_session()?;
let thread = session.read(cx).running_state().read(cx).thread_id()?;
session.read(cx).session(cx).read(cx).thread_state(thread)
}
}

View File

@@ -3,7 +3,7 @@ use debugger_panel::{DebugPanel, ToggleFocus};
use editor::Editor;
use feature_flags::{DebuggerFeatureFlag, FeatureFlagViewExt};
use gpui::{App, EntityInputHandler, actions};
use new_process_modal::{NewProcessModal, NewProcessMode};
use new_session_modal::{NewSessionModal, NewSessionMode};
use project::debugger::{self, breakpoint_store::SourceBreakpoint};
use session::DebugSession;
use settings::Settings;
@@ -15,7 +15,7 @@ use workspace::{ItemHandle, ShutdownDebugAdapters, Workspace};
pub mod attach_modal;
pub mod debugger_panel;
mod dropdown_menus;
mod new_process_modal;
mod new_session_modal;
mod persistence;
pub(crate) mod session;
mod stack_trace_view;
@@ -49,7 +49,6 @@ actions!(
ToggleThreadPicker,
ToggleSessionPicker,
RerunLastSession,
ToggleExpandItem,
]
);
@@ -211,7 +210,7 @@ pub fn init(cx: &mut App) {
},
)
.register_action(|workspace: &mut Workspace, _: &Start, window, cx| {
NewProcessModal::show(workspace, window, NewProcessMode::Debug, None, cx);
NewSessionModal::show(workspace, window, NewSessionMode::Launch, None, cx);
})
.register_action(
|workspace: &mut Workspace, _: &RerunLastSession, window, cx| {
@@ -353,7 +352,7 @@ fn spawn_task_or_modal(
.detach_and_log_err(cx)
}
Spawn::ViaModal { reveal_target } => {
NewProcessModal::show(workspace, window, NewProcessMode::Task, *reveal_target, cx);
NewSessionModal::show(workspace, window, NewSessionMode::Task, *reveal_target, cx);
}
}
}

View File

@@ -8,8 +8,7 @@ pub mod variable_list;
use std::{any::Any, ops::ControlFlow, path::PathBuf, sync::Arc, time::Duration};
use crate::{
ToggleExpandItem,
new_process_modal::resolve_path,
new_session_modal::resolve_path,
persistence::{self, DebuggerPaneItem, SerializedLayout},
};
@@ -348,7 +347,6 @@ pub(crate) fn new_debugger_pane(
false
}
})));
pane.set_can_toggle_zoom(false, cx);
pane.display_nav_history_buttons(None);
pane.set_custom_drop_handle(cx, custom_drop_handle);
pane.set_should_display_tab_bar(|_, _| true);
@@ -474,19 +472,17 @@ pub(crate) fn new_debugger_pane(
},
)
.icon_size(IconSize::XSmall)
.on_click(cx.listener(move |pane, _, _, cx| {
let is_zoomed = pane.is_zoomed();
pane.set_zoomed(!is_zoomed, cx);
cx.notify();
.on_click(cx.listener(move |pane, _, window, cx| {
pane.toggle_zoom(&workspace::ToggleZoom, window, cx);
}))
.tooltip({
let focus_handle = focus_handle.clone();
move |window, cx| {
let zoomed_text =
if zoomed { "Minimize" } else { "Expand" };
if zoomed { "Zoom Out" } else { "Zoom In" };
Tooltip::for_action_in(
zoomed_text,
&ToggleExpandItem,
&workspace::ToggleZoom,
&focus_handle,
window,
cx,
@@ -570,7 +566,7 @@ impl RunningState {
}
}
pub(crate) fn relativize_paths(
pub(crate) fn relativlize_paths(
key: Option<&str>,
config: &mut serde_json::Value,
context: &TaskContext,
@@ -578,12 +574,12 @@ impl RunningState {
match config {
serde_json::Value::Object(obj) => {
obj.iter_mut()
.for_each(|(key, value)| Self::relativize_paths(Some(key), value, context));
.for_each(|(key, value)| Self::relativlize_paths(Some(key), value, context));
}
serde_json::Value::Array(array) => {
array
.iter_mut()
.for_each(|value| Self::relativize_paths(None, value, context));
.for_each(|value| Self::relativlize_paths(None, value, context));
}
serde_json::Value::String(s) if key == Some("program") || key == Some("cwd") => {
// Some built-in zed tasks wrap their arguments in quotes as they might contain spaces.
@@ -810,13 +806,13 @@ impl RunningState {
mut config,
tcp_connection,
} = scenario;
Self::relativize_paths(None, &mut config, &task_context);
Self::relativlize_paths(None, &mut config, &task_context);
Self::substitute_variables_in_config(&mut config, &task_context);
let request_type = dap_registry
.adapter(&adapter)
.ok_or_else(|| anyhow!("{}: is not a valid adapter name", &adapter))
.and_then(|adapter| adapter.request_kind(&config));
.and_then(|adapter| adapter.validate_config(&config));
let config_is_valid = request_type.is_ok();
@@ -958,10 +954,7 @@ impl RunningState {
config = scenario.config;
Self::substitute_variables_in_config(&mut config, &task_context);
} else {
let Err(e) = request_type else {
unreachable!();
};
anyhow::bail!("Zed cannot determine how to run this debug scenario. `build` field was not provided and Debug Adapter won't accept provided configuration because: {e}");
anyhow::bail!("No request or build provided");
};
Ok(DebugTaskDefinition {
@@ -1264,6 +1257,18 @@ impl RunningState {
Event::Focus => {
this.active_pane = source_pane.clone();
}
Event::ZoomIn => {
source_pane.update(cx, |pane, cx| {
pane.set_zoomed(true, cx);
});
cx.notify();
}
Event::ZoomOut => {
source_pane.update(cx, |pane, cx| {
pane.set_zoomed(false, cx);
});
cx.notify();
}
_ => {}
}
}

View File

@@ -13,7 +13,7 @@ use gpui::{
use language::{Buffer, CodeLabel, ToOffset};
use menu::Confirm;
use project::{
Completion, CompletionResponse,
Completion,
debugger::session::{CompletionsQuery, OutputToken, Session, SessionEvent},
};
use settings::Settings;
@@ -262,9 +262,9 @@ impl CompletionProvider for ConsoleQueryBarCompletionProvider {
_trigger: editor::CompletionContext,
_window: &mut Window,
cx: &mut Context<Editor>,
) -> Task<Result<Vec<CompletionResponse>>> {
) -> Task<Result<Option<Vec<Completion>>>> {
let Some(console) = self.0.upgrade() else {
return Task::ready(Ok(Vec::new()));
return Task::ready(Ok(None));
};
let support_completions = console
@@ -322,7 +322,7 @@ impl ConsoleQueryBarCompletionProvider {
buffer: &Entity<Buffer>,
buffer_position: language::Anchor,
cx: &mut Context<Editor>,
) -> Task<Result<Vec<CompletionResponse>>> {
) -> Task<Result<Option<Vec<Completion>>>> {
let (variables, string_matches) = console.update(cx, |console, cx| {
let mut variables = HashMap::default();
let mut string_matches = Vec::default();
@@ -354,43 +354,39 @@ impl ConsoleQueryBarCompletionProvider {
let query = buffer.read(cx).text();
cx.spawn(async move |_, cx| {
const LIMIT: usize = 10;
let matches = fuzzy::match_strings(
&string_matches,
&query,
true,
LIMIT,
10,
&Default::default(),
cx.background_executor().clone(),
)
.await;
let completions = matches
.iter()
.filter_map(|string_match| {
let variable_value = variables.get(&string_match.string)?;
Ok(Some(
matches
.iter()
.filter_map(|string_match| {
let variable_value = variables.get(&string_match.string)?;
Some(project::Completion {
replace_range: buffer_position..buffer_position,
new_text: string_match.string.clone(),
label: CodeLabel {
filter_range: 0..string_match.string.len(),
text: format!("{} {}", string_match.string, variable_value),
runs: Vec::new(),
},
icon_path: None,
documentation: None,
confirm: None,
source: project::CompletionSource::Custom,
insert_text_mode: None,
Some(project::Completion {
replace_range: buffer_position..buffer_position,
new_text: string_match.string.clone(),
label: CodeLabel {
filter_range: 0..string_match.string.len(),
text: format!("{} {}", string_match.string, variable_value),
runs: Vec::new(),
},
icon_path: None,
documentation: None,
confirm: None,
source: project::CompletionSource::Custom,
insert_text_mode: None,
})
})
})
.collect::<Vec<_>>();
Ok(vec![project::CompletionResponse {
is_incomplete: completions.len() >= LIMIT,
completions,
}])
.collect(),
))
})
}
@@ -400,7 +396,7 @@ impl ConsoleQueryBarCompletionProvider {
buffer: &Entity<Buffer>,
buffer_position: language::Anchor,
cx: &mut Context<Editor>,
) -> Task<Result<Vec<CompletionResponse>>> {
) -> Task<Result<Option<Vec<Completion>>>> {
let completion_task = console.update(cx, |console, cx| {
console.session.update(cx, |state, cx| {
let frame_id = console.stack_frame_list.read(cx).opened_stack_frame_id();
@@ -415,56 +411,53 @@ impl ConsoleQueryBarCompletionProvider {
cx.background_executor().spawn(async move {
let completions = completion_task.await?;
let completions = completions
.into_iter()
.map(|completion| {
let new_text = completion
.text
.as_ref()
.unwrap_or(&completion.label)
.to_owned();
let buffer_text = snapshot.text();
let buffer_bytes = buffer_text.as_bytes();
let new_bytes = new_text.as_bytes();
Ok(Some(
completions
.into_iter()
.map(|completion| {
let new_text = completion
.text
.as_ref()
.unwrap_or(&completion.label)
.to_owned();
let buffer_text = snapshot.text();
let buffer_bytes = buffer_text.as_bytes();
let new_bytes = new_text.as_bytes();
let mut prefix_len = 0;
for i in (0..new_bytes.len()).rev() {
if buffer_bytes.ends_with(&new_bytes[0..i]) {
prefix_len = i;
break;
let mut prefix_len = 0;
for i in (0..new_bytes.len()).rev() {
if buffer_bytes.ends_with(&new_bytes[0..i]) {
prefix_len = i;
break;
}
}
}
let buffer_offset = buffer_position.to_offset(&snapshot);
let start = buffer_offset - prefix_len;
let start = snapshot.clip_offset(start, Bias::Left);
let start = snapshot.anchor_before(start);
let replace_range = start..buffer_position;
let buffer_offset = buffer_position.to_offset(&snapshot);
let start = buffer_offset - prefix_len;
let start = snapshot.clip_offset(start, Bias::Left);
let start = snapshot.anchor_before(start);
let replace_range = start..buffer_position;
project::Completion {
replace_range,
new_text,
label: CodeLabel {
filter_range: 0..completion.label.len(),
text: completion.label,
runs: Vec::new(),
},
icon_path: None,
documentation: None,
confirm: None,
source: project::CompletionSource::BufferWord {
word_range: buffer_position..language::Anchor::MAX,
resolved: false,
},
insert_text_mode: None,
}
})
.collect();
Ok(vec![project::CompletionResponse {
completions,
is_incomplete: false,
}])
project::Completion {
replace_range,
new_text,
label: CodeLabel {
filter_range: 0..completion.label.len(),
text: completion.label,
runs: Vec::new(),
},
icon_path: None,
documentation: None,
confirm: None,
source: project::CompletionSource::BufferWord {
word_range: buffer_position..language::Anchor::MAX,
resolved: false,
},
insert_text_mode: None,
}
})
.collect(),
))
})
}
}
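The `prefix_len` loop above computes how much of the completion's `new_text` the user has already typed, by finding the longest prefix of `new_text` that is a suffix of the buffer; the replace range then reaches back over that overlap so accepting a completion doesn't duplicate typed text. A self-contained sketch of just that computation:

```rust
// Returns the length of the longest proper prefix of `new_text` that is
// already a suffix of `buffer_text`, mirroring the prefix_len loop above.
fn overlap_len(buffer_text: &str, new_text: &str) -> usize {
    let buffer_bytes = buffer_text.as_bytes();
    let new_bytes = new_text.as_bytes();
    for i in (0..new_bytes.len()).rev() {
        if buffer_bytes.ends_with(&new_bytes[0..i]) {
            return i;
        }
    }
    0
}

fn main() {
    // The user already typed "pri"; the completion is "println!"; the
    // replace range should start 3 bytes back.
    assert_eq!(overlap_len("    pri", "println!"), 3);
}
```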

View File

@@ -25,7 +25,7 @@ mod inline_values;
#[cfg(test)]
mod module_list;
#[cfg(test)]
mod new_process_modal;
mod new_session_modal;
#[cfg(test)]
mod persistence;
#[cfg(test)]

View File

@@ -1,13 +1,13 @@
use dap::DapRegistry;
use gpui::{BackgroundExecutor, TestAppContext, VisualTestContext};
use project::{FakeFs, Project};
use project::{FakeFs, Fs, Project};
use serde_json::json;
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use task::{DebugRequest, DebugScenario, LaunchRequest, TaskContext, VariableName, ZedDebugConfig};
use util::path;
// use crate::new_process_modal::NewProcessMode;
use crate::new_session_modal::NewSessionMode;
use crate::tests::{init_test, init_test_workspace};
#[gpui::test]
@@ -152,111 +152,111 @@ async fn test_debug_session_substitutes_variables_and_relativizes_paths(
}
}
// #[gpui::test]
// async fn test_save_debug_scenario_to_file(executor: BackgroundExecutor, cx: &mut TestAppContext) {
// init_test(cx);
#[gpui::test]
async fn test_save_debug_scenario_to_file(executor: BackgroundExecutor, cx: &mut TestAppContext) {
init_test(cx);
// let fs = FakeFs::new(executor.clone());
// fs.insert_tree(
// path!("/project"),
// json!({
// "main.rs": "fn main() {}"
// }),
// )
// .await;
let fs = FakeFs::new(executor.clone());
fs.insert_tree(
path!("/project"),
json!({
"main.rs": "fn main() {}"
}),
)
.await;
// let project = Project::test(fs.clone(), [path!("/project").as_ref()], cx).await;
// let workspace = init_test_workspace(&project, cx).await;
// let cx = &mut VisualTestContext::from_window(*workspace, cx);
let project = Project::test(fs.clone(), [path!("/project").as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
// workspace
// .update(cx, |workspace, window, cx| {
// crate::new_process_modal::NewProcessModal::show(
// workspace,
// window,
// NewProcessMode::Debug,
// None,
// cx,
// );
// })
// .unwrap();
workspace
.update(cx, |workspace, window, cx| {
crate::new_session_modal::NewSessionModal::show(
workspace,
window,
NewSessionMode::Launch,
None,
cx,
);
})
.unwrap();
// cx.run_until_parked();
cx.run_until_parked();
// let modal = workspace
// .update(cx, |workspace, _, cx| {
// workspace.active_modal::<crate::new_process_modal::NewProcessModal>(cx)
// })
// .unwrap()
// .expect("Modal should be active");
let modal = workspace
.update(cx, |workspace, _, cx| {
workspace.active_modal::<crate::new_session_modal::NewSessionModal>(cx)
})
.unwrap()
.expect("Modal should be active");
// modal.update_in(cx, |modal, window, cx| {
// modal.set_configure("/project/main", "/project", false, window, cx);
// modal.save_scenario(window, cx);
// });
modal.update_in(cx, |modal, window, cx| {
modal.set_configure("/project/main", "/project", false, window, cx);
modal.save_scenario(window, cx);
});
// cx.executor().run_until_parked();
cx.executor().run_until_parked();
// let debug_json_content = fs
// .load(path!("/project/.zed/debug.json").as_ref())
// .await
// .expect("debug.json should exist");
let debug_json_content = fs
.load(path!("/project/.zed/debug.json").as_ref())
.await
.expect("debug.json should exist");
// let expected_content = vec![
// "[",
// " {",
// r#" "adapter": "fake-adapter","#,
// r#" "label": "main (fake-adapter)","#,
// r#" "request": "launch","#,
// r#" "program": "/project/main","#,
// r#" "cwd": "/project","#,
// r#" "args": [],"#,
// r#" "env": {}"#,
// " }",
// "]",
// ];
let expected_content = vec![
"[",
" {",
r#" "adapter": "fake-adapter","#,
r#" "label": "main (fake-adapter)","#,
r#" "request": "launch","#,
r#" "program": "/project/main","#,
r#" "cwd": "/project","#,
r#" "args": [],"#,
r#" "env": {}"#,
" }",
"]",
];
// let actual_lines: Vec<&str> = debug_json_content.lines().collect();
// pretty_assertions::assert_eq!(expected_content, actual_lines);
let actual_lines: Vec<&str> = debug_json_content.lines().collect();
pretty_assertions::assert_eq!(expected_content, actual_lines);
// modal.update_in(cx, |modal, window, cx| {
// modal.set_configure("/project/other", "/project", true, window, cx);
// modal.save_scenario(window, cx);
// });
modal.update_in(cx, |modal, window, cx| {
modal.set_configure("/project/other", "/project", true, window, cx);
modal.save_scenario(window, cx);
});
// cx.executor().run_until_parked();
cx.executor().run_until_parked();
// let debug_json_content = fs
// .load(path!("/project/.zed/debug.json").as_ref())
// .await
// .expect("debug.json should exist after second save");
let debug_json_content = fs
.load(path!("/project/.zed/debug.json").as_ref())
.await
.expect("debug.json should exist after second save");
// let expected_content = vec![
// "[",
// " {",
// r#" "adapter": "fake-adapter","#,
// r#" "label": "main (fake-adapter)","#,
// r#" "request": "launch","#,
// r#" "program": "/project/main","#,
// r#" "cwd": "/project","#,
// r#" "args": [],"#,
// r#" "env": {}"#,
// " },",
// " {",
// r#" "adapter": "fake-adapter","#,
// r#" "label": "other (fake-adapter)","#,
// r#" "request": "launch","#,
// r#" "program": "/project/other","#,
// r#" "cwd": "/project","#,
// r#" "args": [],"#,
// r#" "env": {}"#,
// " }",
// "]",
// ];
let expected_content = vec![
"[",
" {",
r#" "adapter": "fake-adapter","#,
r#" "label": "main (fake-adapter)","#,
r#" "request": "launch","#,
r#" "program": "/project/main","#,
r#" "cwd": "/project","#,
r#" "args": [],"#,
r#" "env": {}"#,
" },",
" {",
r#" "adapter": "fake-adapter","#,
r#" "label": "other (fake-adapter)","#,
r#" "request": "launch","#,
r#" "program": "/project/other","#,
r#" "cwd": "/project","#,
r#" "args": [],"#,
r#" "env": {}"#,
" }",
"]",
];
// let actual_lines: Vec<&str> = debug_json_content.lines().collect();
// pretty_assertions::assert_eq!(expected_content, actual_lines);
// }
let actual_lines: Vec<&str> = debug_json_content.lines().collect();
pretty_assertions::assert_eq!(expected_content, actual_lines);
}
#[gpui::test]
async fn test_dap_adapter_config_conversion_and_validation(cx: &mut TestAppContext) {
@@ -322,7 +322,7 @@ async fn test_dap_adapter_config_conversion_and_validation(cx: &mut TestAppConte
);
let request_type = adapter
.request_kind(&debug_scenario.config)
.validate_config(&debug_scenario.config)
.unwrap_or_else(|_| {
panic!(
"Adapter {} should validate the config successfully",

View File

@@ -1,8 +1,9 @@
use fuzzy::{StringMatch, StringMatchCandidate};
use gpui::{
AnyElement, Entity, Focusable, FontWeight, ListSizingBehavior, ScrollStrategy, SharedString,
Size, StrikethroughStyle, StyledText, Task, UniformListScrollHandle, div, px, uniform_list,
Size, StrikethroughStyle, StyledText, UniformListScrollHandle, div, px, uniform_list,
};
use gpui::{AsyncWindowContext, WeakEntity};
use itertools::Itertools;
use language::CodeLabel;
use language::{Buffer, LanguageName, LanguageRegistry};
@@ -17,7 +18,6 @@ use task::TaskContext;
use std::collections::VecDeque;
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use std::{
cell::RefCell,
cmp::{Reverse, min},
@@ -47,10 +47,15 @@ pub const MENU_ASIDE_MAX_WIDTH: Pixels = px(500.);
// Constants for the markdown cache. The purpose of this cache is to reduce flickering due to
// documentation not yet being parsed.
//
// The size of the cache is set to 16, which is roughly 3 times more than the number of items
// fetched around the current selection. This way documentation is more often ready for render when
// revisiting previous entries, such as when pressing backspace.
const MARKDOWN_CACHE_MAX_SIZE: usize = 16;
// The size of the cache is set to the number of items fetched around the current selection plus one
// for the current selection and another to avoid cases where an adjacent selection exits the
// cache. The only current benefit of a larger cache would be doing less markdown parsing when the
// selection revisits items.
//
// One future benefit of a larger cache would be reducing flicker on backspace. This would require
// not recreating the menu on every change, by not re-querying the language server when
// `is_incomplete = false`.
const MARKDOWN_CACHE_MAX_SIZE: usize = MARKDOWN_CACHE_BEFORE_ITEMS + MARKDOWN_CACHE_AFTER_ITEMS + 2;
const MARKDOWN_CACHE_BEFORE_ITEMS: usize = 2;
const MARKDOWN_CACHE_AFTER_ITEMS: usize = 2;
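With `MARKDOWN_CACHE_BEFORE_ITEMS` and `MARKDOWN_CACHE_AFTER_ITEMS` both set to 2, the new sizing works out to MARKDOWN_CACHE_MAX_SIZE = 2 + 2 + 2 = 6 entries: the prefetch window around the selection, plus one slot for the selection itself and one spare so that moving to an adjacent entry doesn't immediately evict it.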
@@ -192,48 +197,27 @@ pub enum ContextMenuOrigin {
QuickActionBar,
}
#[derive(Clone)]
pub struct CompletionsMenu {
pub id: CompletionId,
sort_completions: bool,
pub initial_position: Anchor,
pub initial_query: Option<Arc<String>>,
pub is_incomplete: bool,
pub buffer: Entity<Buffer>,
pub completions: Rc<RefCell<Box<[Completion]>>>,
match_candidates: Arc<[StringMatchCandidate]>,
pub entries: Rc<RefCell<Box<[StringMatch]>>>,
match_candidates: Rc<[StringMatchCandidate]>,
pub entries: Rc<RefCell<Vec<StringMatch>>>,
pub selected_item: usize,
filter_task: Task<()>,
cancel_filter: Arc<AtomicBool>,
scroll_handle: UniformListScrollHandle,
resolve_completions: bool,
show_completion_documentation: bool,
pub(super) ignore_completion_provider: bool,
last_rendered_range: Rc<RefCell<Option<Range<usize>>>>,
markdown_cache: Rc<RefCell<VecDeque<(MarkdownCacheKey, Entity<Markdown>)>>>,
markdown_cache: Rc<RefCell<VecDeque<(usize, Entity<Markdown>)>>>,
language_registry: Option<Arc<LanguageRegistry>>,
language: Option<LanguageName>,
snippet_sort_order: SnippetSortOrder,
}
#[derive(Clone, Debug, PartialEq)]
enum MarkdownCacheKey {
ForCandidate {
candidate_id: usize,
},
ForCompletionMatch {
new_text: String,
markdown_source: SharedString,
},
}
// TODO: There should really be a wrapper around fuzzy match tasks that does this.
impl Drop for CompletionsMenu {
fn drop(&mut self) {
self.cancel_filter.store(true, Ordering::Relaxed);
}
}
impl CompletionsMenu {
pub fn new(
id: CompletionId,
@@ -241,8 +225,6 @@ impl CompletionsMenu {
show_completion_documentation: bool,
ignore_completion_provider: bool,
initial_position: Anchor,
initial_query: Option<Arc<String>>,
is_incomplete: bool,
buffer: Entity<Buffer>,
completions: Box<[Completion]>,
snippet_sort_order: SnippetSortOrder,
@@ -260,21 +242,17 @@ impl CompletionsMenu {
id,
sort_completions,
initial_position,
initial_query,
is_incomplete,
buffer,
show_completion_documentation,
ignore_completion_provider,
completions: RefCell::new(completions).into(),
match_candidates,
entries: Rc::new(RefCell::new(Box::new([]))),
entries: RefCell::new(Vec::new()).into(),
selected_item: 0,
filter_task: Task::ready(()),
cancel_filter: Arc::new(AtomicBool::new(false)),
scroll_handle: UniformListScrollHandle::new(),
resolve_completions: true,
last_rendered_range: RefCell::new(None).into(),
markdown_cache: RefCell::new(VecDeque::new()).into(),
markdown_cache: RefCell::new(VecDeque::with_capacity(MARKDOWN_CACHE_MAX_SIZE)).into(),
language_registry,
language,
snippet_sort_order,
@@ -325,20 +303,16 @@ impl CompletionsMenu {
positions: vec![],
string: completion.clone(),
})
.collect();
.collect::<Vec<_>>();
Self {
id,
sort_completions,
initial_position: selection.start,
initial_query: None,
is_incomplete: false,
buffer,
completions: RefCell::new(completions).into(),
match_candidates,
entries: RefCell::new(entries).into(),
selected_item: 0,
filter_task: Task::ready(()),
cancel_filter: Arc::new(AtomicBool::new(false)),
scroll_handle: UniformListScrollHandle::new(),
resolve_completions: false,
show_completion_documentation: false,
@@ -416,7 +390,14 @@ impl CompletionsMenu {
) {
if self.selected_item != match_index {
self.selected_item = match_index;
self.handle_selection_changed(provider, window, cx);
self.scroll_handle
.scroll_to_item(self.selected_item, ScrollStrategy::Top);
self.resolve_visible_completions(provider, cx);
self.start_markdown_parse_for_nearby_entries(cx);
if let Some(provider) = provider {
self.handle_selection_changed(provider, window, cx);
}
cx.notify();
}
}
@@ -437,25 +418,18 @@ impl CompletionsMenu {
}
fn handle_selection_changed(
&mut self,
provider: Option<&dyn CompletionProvider>,
&self,
provider: &dyn CompletionProvider,
window: &mut Window,
cx: &mut Context<Editor>,
cx: &mut App,
) {
self.scroll_handle
.scroll_to_item(self.selected_item, ScrollStrategy::Top);
if let Some(provider) = provider {
let entries = self.entries.borrow();
let entry = if self.selected_item < entries.len() {
Some(&entries[self.selected_item])
} else {
None
};
provider.selection_changed(entry, window, cx);
}
self.resolve_visible_completions(provider, cx);
self.start_markdown_parse_for_nearby_entries(cx);
cx.notify();
let entries = self.entries.borrow();
let entry = if self.selected_item < entries.len() {
Some(&entries[self.selected_item])
} else {
None
};
provider.selection_changed(entry, window, cx);
}
pub fn resolve_visible_completions(
@@ -470,19 +444,6 @@ impl CompletionsMenu {
return;
};
let entries = self.entries.borrow();
if entries.is_empty() {
return;
}
if self.selected_item >= entries.len() {
log::error!(
"bug: completion selected_item >= entries.len(): {} >= {}",
self.selected_item,
entries.len()
);
self.selected_item = entries.len() - 1;
}
// Attempt to resolve completions for every item that will be displayed. This matters
// because single line documentation may be displayed inline with the completion.
//
@@ -494,6 +455,7 @@ impl CompletionsMenu {
let visible_count = last_rendered_range
.clone()
.map_or(APPROXIMATE_VISIBLE_COUNT, |range| range.count());
let entries = self.entries.borrow();
let entry_range = if self.selected_item == 0 {
0..min(visible_count, entries.len())
} else if self.selected_item == entries.len() - 1 {
@@ -546,11 +508,11 @@ impl CompletionsMenu {
.update(cx, |editor, cx| {
// `resolve_completions` modified state affecting display.
cx.notify();
editor.with_completions_menu_matching_id(completion_id, |menu| {
if let Some(menu) = menu {
menu.start_markdown_parse_for_nearby_entries(cx)
}
});
editor.with_completions_menu_matching_id(
completion_id,
|| (),
|this| this.start_markdown_parse_for_nearby_entries(cx),
);
})
.ok();
}
@@ -586,11 +548,11 @@ impl CompletionsMenu {
return None;
}
let candidate_id = entries[index].candidate_id;
let completions = self.completions.borrow();
match &completions[candidate_id].documentation {
Some(CompletionDocumentation::MultiLineMarkdown(source)) if !source.is_empty() => self
.get_or_create_markdown(candidate_id, Some(source), false, &completions, cx)
.map(|(_, markdown)| markdown),
match &self.completions.borrow()[candidate_id].documentation {
Some(CompletionDocumentation::MultiLineMarkdown(source)) if !source.is_empty() => Some(
self.get_or_create_markdown(candidate_id, source.clone(), false, cx)
.1,
),
Some(_) => None,
_ => None,
}
@@ -599,75 +561,38 @@ impl CompletionsMenu {
fn get_or_create_markdown(
&self,
candidate_id: usize,
source: Option<&SharedString>,
source: SharedString,
is_render: bool,
completions: &[Completion],
cx: &mut Context<Editor>,
) -> Option<(bool, Entity<Markdown>)> {
) -> (bool, Entity<Markdown>) {
let mut markdown_cache = self.markdown_cache.borrow_mut();
let mut has_completion_match_cache_entry = false;
let mut matching_entry = markdown_cache.iter().find_position(|(key, _)| match key {
MarkdownCacheKey::ForCandidate { candidate_id: id } => *id == candidate_id,
MarkdownCacheKey::ForCompletionMatch { .. } => {
has_completion_match_cache_entry = true;
false
}
});
if has_completion_match_cache_entry && matching_entry.is_none() {
if let Some(source) = source {
matching_entry = markdown_cache.iter().find_position(|(key, _)| {
matches!(key, MarkdownCacheKey::ForCompletionMatch { markdown_source, .. }
if markdown_source == source)
});
} else {
// Heuristic guess that documentation can be reused when new_text matches. This is
// to mitigate documentation flicker while typing. If this is wrong, then resolution
// should cause the correct documentation to be displayed soon.
let completion = &completions[candidate_id];
matching_entry = markdown_cache.iter().find_position(|(key, _)| {
matches!(key, MarkdownCacheKey::ForCompletionMatch { new_text, .. }
if new_text == &completion.new_text)
});
}
}
if let Some((cache_index, (key, markdown))) = matching_entry {
let markdown = markdown.clone();
// Since the markdown source matches, the key can now be ForCandidate.
if source.is_some() && matches!(key, MarkdownCacheKey::ForCompletionMatch { .. }) {
markdown_cache[cache_index].0 = MarkdownCacheKey::ForCandidate { candidate_id };
}
if is_render && cache_index != 0 {
if let Some((cache_index, (_, markdown))) = markdown_cache
.iter()
.find_position(|(id, _)| *id == candidate_id)
{
let markdown = if is_render && cache_index != 0 {
// Move the current selection's cache entry to the front.
markdown_cache.rotate_right(1);
let cache_len = markdown_cache.len();
markdown_cache.swap(0, (cache_index + 1) % cache_len);
}
&markdown_cache[0].1
} else {
markdown
};
let is_parsing = markdown.update(cx, |markdown, cx| {
if let Some(source) = source {
// `reset` is called as it's possible for documentation to change due to resolve
// requests. It does nothing if `source` is unchanged.
markdown.reset(source.clone(), cx);
}
// `reset` is called as it's possible for documentation to change due to resolve
// requests. It does nothing if `source` is unchanged.
markdown.reset(source, cx);
markdown.is_parsing()
});
return Some((is_parsing, markdown));
return (is_parsing, markdown.clone());
}
let Some(source) = source else {
// Can't create markdown as there is no source.
return None;
};
if markdown_cache.len() < MARKDOWN_CACHE_MAX_SIZE {
let markdown = cx.new(|cx| {
Markdown::new(
source.clone(),
source,
self.language_registry.clone(),
self.language.clone(),
cx,
@@ -676,20 +601,17 @@ impl CompletionsMenu {
// Handles redraw when the markdown is done parsing. The current render is for a
// deferred draw, and so without this did not redraw when `markdown` notified.
cx.observe(&markdown, |_, _, cx| cx.notify()).detach();
markdown_cache.push_front((
MarkdownCacheKey::ForCandidate { candidate_id },
markdown.clone(),
));
Some((true, markdown))
markdown_cache.push_front((candidate_id, markdown.clone()));
(true, markdown)
} else {
debug_assert_eq!(markdown_cache.capacity(), MARKDOWN_CACHE_MAX_SIZE);
// Moves the last cache entry to the start. The ring buffer is full, so this does no
// copying and just shifts indexes.
markdown_cache.rotate_right(1);
markdown_cache[0].0 = MarkdownCacheKey::ForCandidate { candidate_id };
markdown_cache[0].0 = candidate_id;
let markdown = &markdown_cache[0].1;
markdown.update(cx, |markdown, cx| markdown.reset(source.clone(), cx));
Some((true, markdown.clone()))
markdown.update(cx, |markdown, cx| markdown.reset(source, cx));
(true, markdown.clone())
}
}
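The cache-hit branch above uses a `VecDeque` move-to-front trick: `rotate_right(1)` shifts the deque's head (the original's comment notes that for a full ring buffer this just shifts indexes), and a `swap` then lands the hit entry in slot 0. A small sketch of that step in isolation:

```rust
use std::collections::VecDeque;

// Move the entry at `index` to slot 0. The relative order of the remaining
// entries may be lightly perturbed, which is acceptable for a small cache;
// this mirrors the rotate_right + swap used for the markdown cache.
fn move_to_front<T>(cache: &mut VecDeque<T>, index: usize) {
    if index == 0 {
        return;
    }
    cache.rotate_right(1);
    let len = cache.len();
    cache.swap(0, (index + 1) % len);
}

fn main() {
    let mut cache: VecDeque<char> = ['a', 'b', 'c', 'd'].into_iter().collect();
    move_to_front(&mut cache, 2);
    assert_eq!(cache.iter().collect::<String>(), "cabd");
}
```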
@@ -852,46 +774,37 @@ impl CompletionsMenu {
}
let mat = &self.entries.borrow()[self.selected_item];
let completions = self.completions.borrow_mut();
let multiline_docs = match completions[mat.candidate_id].documentation.as_ref() {
Some(CompletionDocumentation::MultiLinePlainText(text)) => div().child(text.clone()),
Some(CompletionDocumentation::SingleLineAndMultiLinePlainText {
let multiline_docs = match self.completions.borrow_mut()[mat.candidate_id]
.documentation
.as_ref()?
{
CompletionDocumentation::MultiLinePlainText(text) => div().child(text.clone()),
CompletionDocumentation::SingleLineAndMultiLinePlainText {
plain_text: Some(text),
..
}) => div().child(text.clone()),
Some(CompletionDocumentation::MultiLineMarkdown(source)) if !source.is_empty() => {
let Some((false, markdown)) = self.get_or_create_markdown(
mat.candidate_id,
Some(source),
true,
&completions,
cx,
) else {
} => div().child(text.clone()),
CompletionDocumentation::MultiLineMarkdown(source) if !source.is_empty() => {
let (is_parsing, markdown) =
self.get_or_create_markdown(mat.candidate_id, source.clone(), true, cx);
if is_parsing {
return None;
};
Self::render_markdown(markdown, window, cx)
}
div().child(
MarkdownElement::new(markdown, hover_markdown_style(window, cx))
.code_block_renderer(markdown::CodeBlockRenderer::Default {
copy_button: false,
copy_button_on_hover: false,
border: false,
})
.on_url_click(open_markdown_url),
)
}
None => {
// Handle the case where documentation hasn't yet been resolved but there's a
// `new_text` match in the cache.
//
// TODO: It's inconsistent that documentation caching based on matching `new_text`
// only works for markdown. Consider generally caching the results of resolving
// completions.
let Some((false, markdown)) =
self.get_or_create_markdown(mat.candidate_id, None, true, &completions, cx)
else {
return None;
};
Self::render_markdown(markdown, window, cx)
}
Some(CompletionDocumentation::MultiLineMarkdown(_)) => return None,
Some(CompletionDocumentation::SingleLine(_)) => return None,
Some(CompletionDocumentation::Undocumented) => return None,
Some(CompletionDocumentation::SingleLineAndMultiLinePlainText {
plain_text: None,
..
}) => {
CompletionDocumentation::MultiLineMarkdown(_) => return None,
CompletionDocumentation::SingleLine(_) => return None,
CompletionDocumentation::Undocumented => return None,
CompletionDocumentation::SingleLineAndMultiLinePlainText {
plain_text: None, ..
} => {
return None;
}
};
@@ -911,177 +824,6 @@ impl CompletionsMenu {
)
}
fn render_markdown(
markdown: Entity<Markdown>,
window: &mut Window,
cx: &mut Context<Editor>,
) -> Div {
div().child(
MarkdownElement::new(markdown, hover_markdown_style(window, cx))
.code_block_renderer(markdown::CodeBlockRenderer::Default {
copy_button: false,
copy_button_on_hover: false,
border: false,
})
.on_url_click(open_markdown_url),
)
}
pub fn filter(
&mut self,
query: Option<Arc<String>>,
provider: Option<Rc<dyn CompletionProvider>>,
window: &mut Window,
cx: &mut Context<Editor>,
) {
self.cancel_filter.store(true, Ordering::Relaxed);
if let Some(query) = query {
self.cancel_filter = Arc::new(AtomicBool::new(false));
let matches = self.do_async_filtering(query, cx);
let id = self.id;
self.filter_task = cx.spawn_in(window, async move |editor, cx| {
let matches = matches.await;
editor
.update_in(cx, |editor, window, cx| {
editor.with_completions_menu_matching_id(id, |this| {
if let Some(this) = this {
this.set_filter_results(matches, provider, window, cx);
}
});
})
.ok();
});
} else {
self.filter_task = Task::ready(());
let matches = self.unfiltered_matches();
self.set_filter_results(matches, provider, window, cx);
}
}
pub fn do_async_filtering(
&self,
query: Arc<String>,
cx: &Context<Editor>,
) -> Task<Vec<StringMatch>> {
let matches_task = cx.background_spawn({
let query = query.clone();
let match_candidates = self.match_candidates.clone();
let cancel_filter = self.cancel_filter.clone();
let background_executor = cx.background_executor().clone();
async move {
fuzzy::match_strings(
&match_candidates,
&query,
query.chars().any(|c| c.is_uppercase()),
100,
&cancel_filter,
background_executor,
)
.await
}
});
let completions = self.completions.clone();
let sort_completions = self.sort_completions;
let snippet_sort_order = self.snippet_sort_order;
cx.foreground_executor().spawn(async move {
let mut matches = matches_task.await;
if sort_completions {
matches = Self::sort_string_matches(
matches,
Some(&query),
snippet_sort_order,
completions.borrow().as_ref(),
);
}
matches
})
}
/// Like `do_async_filtering` but there is no filter query, so no need to spawn tasks.
pub fn unfiltered_matches(&self) -> Vec<StringMatch> {
let mut matches = self
.match_candidates
.iter()
.enumerate()
.map(|(candidate_id, candidate)| StringMatch {
candidate_id,
score: Default::default(),
positions: Default::default(),
string: candidate.string.clone(),
})
.collect();
if self.sort_completions {
matches = Self::sort_string_matches(
matches,
None,
self.snippet_sort_order,
self.completions.borrow().as_ref(),
);
}
matches
}
pub fn set_filter_results(
&mut self,
matches: Vec<StringMatch>,
provider: Option<Rc<dyn CompletionProvider>>,
window: &mut Window,
cx: &mut Context<Editor>,
) {
*self.entries.borrow_mut() = matches.into_boxed_slice();
self.selected_item = 0;
self.handle_selection_changed(provider.as_deref(), window, cx);
}
fn sort_string_matches(
matches: Vec<StringMatch>,
query: Option<&str>,
snippet_sort_order: SnippetSortOrder,
completions: &[Completion],
) -> Vec<StringMatch> {
let mut sortable_items: Vec<SortableMatch<'_>> = matches
.into_iter()
.map(|string_match| {
let completion = &completions[string_match.candidate_id];
let is_snippet = matches!(
&completion.source,
CompletionSource::Lsp { lsp_completion, .. }
if lsp_completion.kind == Some(CompletionItemKind::SNIPPET)
);
let sort_text =
if let CompletionSource::Lsp { lsp_completion, .. } = &completion.source {
lsp_completion.sort_text.as_deref()
} else {
None
};
let (sort_kind, sort_label) = completion.sort_key();
SortableMatch {
string_match,
is_snippet,
sort_text,
sort_kind,
sort_label,
}
})
.collect();
Self::sort_matches(&mut sortable_items, query, snippet_sort_order);
sortable_items
.into_iter()
.map(|sortable| sortable.string_match)
.collect()
}
pub fn sort_matches(
matches: &mut Vec<SortableMatch<'_>>,
query: Option<&str>,
@@ -1115,7 +857,6 @@ impl CompletionsMenu {
let fuzzy_bracket_threshold = max_score * (3.0 / 5.0);
let query_start_lower = query
.and_then(|q| q.chars().next())
.and_then(|c| c.to_lowercase().next());
@@ -1149,7 +890,6 @@ impl CompletionsMenu {
};
let sort_mixed_case_prefix_length = Reverse(
query
.map(|q| {
q.chars()
.zip(mat.string_match.string.chars())
@@ -1180,32 +920,97 @@ impl CompletionsMenu {
});
}
pub async fn filter(
&mut self,
query: Option<&str>,
provider: Option<Rc<dyn CompletionProvider>>,
editor: WeakEntity<Editor>,
cx: &mut AsyncWindowContext,
) {
let mut matches = if let Some(query) = query {
fuzzy::match_strings(
&self.match_candidates,
query,
query.chars().any(|c| c.is_uppercase()),
100,
&Default::default(),
cx.background_executor().clone(),
)
.await
} else {
self.match_candidates
.iter()
.enumerate()
.map(|(candidate_id, candidate)| StringMatch {
candidate_id,
score: Default::default(),
positions: Default::default(),
string: candidate.string.clone(),
})
.collect()
};
if self.sort_completions {
let completions = self.completions.borrow();
let mut sortable_items: Vec<SortableMatch<'_>> = matches
.into_iter()
.map(|string_match| {
let completion = &completions[string_match.candidate_id];
let is_snippet = matches!(
&completion.source,
CompletionSource::Lsp { lsp_completion, .. }
if lsp_completion.kind == Some(CompletionItemKind::SNIPPET)
);
let sort_text =
if let CompletionSource::Lsp { lsp_completion, .. } = &completion.source {
lsp_completion.sort_text.as_deref()
} else {
None
};
let (sort_kind, sort_label) = completion.sort_key();
SortableMatch {
string_match,
is_snippet,
sort_text,
sort_kind,
sort_label,
}
})
.collect();
Self::sort_matches(&mut sortable_items, query, self.snippet_sort_order);
matches = sortable_items
.into_iter()
.map(|sortable| sortable.string_match)
.collect();
}
*self.entries.borrow_mut() = matches;
self.selected_item = 0;
// This keeps the display consistent when y_flipped.
self.scroll_handle.scroll_to_item(0, ScrollStrategy::Top);
if let Some(provider) = provider {
cx.update(|window, cx| {
// Since this is async, it's possible the menu has been closed and possibly even
// another opened. `provider.selection_changed` should not be called in this case.
let this_menu_still_active = editor
.read_with(cx, |editor, _cx| {
editor.with_completions_menu_matching_id(self.id, || false, |_| true)
})
.unwrap_or(false);
if this_menu_still_active {
self.handle_selection_changed(&*provider, window, cx);
}
})
.ok();
}
}
}
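Because `filter` now awaits fuzzy matching on the background executor, its results can arrive after the menu that requested them has been closed, or after another menu has replaced it; the `this_menu_still_active` check above drops such stale results by re-checking the menu id before touching the provider. A self-contained sketch of that id-token guard pattern (names hypothetical, not the editor's actual API):

```rust
// Each menu instance carries a unique id; async work captures the id and
// re-validates it before applying results, so results computed for a
// closed or replaced menu are silently dropped.
struct MenuHost {
    active_menu_id: Option<u64>,
}

impl MenuHost {
    fn apply_filter_results(&mut self, menu_id: u64, results: Vec<String>) {
        if self.active_menu_id != Some(menu_id) {
            return; // stale: the menu this task was filtering is gone
        }
        println!("menu {menu_id}: applying {} matches", results.len());
    }
}

fn main() {
    let mut host = MenuHost { active_menu_id: Some(2) };
    host.apply_filter_results(1, vec!["stale".into()]); // dropped
    host.apply_filter_results(2, vec!["fresh".into()]); // applied
}
```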

View File

@@ -0,0 +1,578 @@
use super::{
fold_map::{FoldPlaceholder, FoldSnapshot},
inlay_map::{InlayEdit, InlaySnapshot},
};
use collections::{HashMap, HashSet};
use gpui::{AnyElement, App, Context, Entity};
use language::{Anchor, Buffer, BufferSnapshot, Edit, Point};
use multi_buffer::{ExcerptId, MultiBuffer, MultiBufferSnapshot, ToOffset, ToPoint};
use parking_lot::Mutex;
use serde::{Deserialize, Serialize};
use std::{any::TypeId, ops::Range, sync::Arc};
use sum_tree::TreeMap;
use tree_sitter::{Node, Query, QueryCapture, QueryCursor};
use ui::{Color, IntoElement, Label};
use util::ResultExt;
/// Defines how to find foldable regions using Tree-sitter queries
#[derive(Clone, Debug)]
pub struct FoldingQuery {
pub query: Query,
pub auto_fold: bool, // Should regions auto-fold on load?
pub display_capture_ix: Option<u32>, // Which capture to show when folded
pub action_capture_ix: Option<u32>, // Which capture contains action data
pub proximity_expand: bool, // Expand when cursor is near?
}
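// For illustration only: a folding query along these lines could drive
// Markdown link folding. This is a sketch; the node names assume the
// markdown-inline grammar and are unverified. The whole link folds
// automatically, the link text is shown in place of the fold, and the
// destination becomes the click action.
const EXAMPLE_MARKDOWN_LINK_FOLD_QUERY: &str = r#"
(inline_link
(link_text) @fold.text
(link_destination) @fold.url) @fold.auto
"#;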
/// Represents a foldable region found by queries
#[derive(Clone, Debug)]
pub struct SyntaxFold {
pub id: SyntaxFoldId,
pub range: Range<Anchor>,
pub display_text: Option<String>, // Text to show when folded
pub action_data: Option<FoldAction>, // What happens on cmd+click
pub proximity_expand: bool,
pub auto_fold: bool,
pub query_index: usize, // Which query created this fold
}
/// Unique identifier for a syntax fold
#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash)]
pub struct SyntaxFoldId(usize);
/// Possible actions when clicking folded content
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
pub enum FoldAction {
OpenUrl(String), // Open URL in browser
GoToFile(String), // Open file in editor
RunCommand(String), // Run system command
Custom(String, serde_json::Value), // Extensibility point
}
/// Configuration for syntax folding
#[derive(Clone, Debug, Default)]
pub struct SyntaxFoldConfig {
pub enabled: bool,
pub auto_fold_on_open: bool,
pub proximity_expand_distance: u32, // Characters away from fold to trigger expansion
}
/// Snapshot of the syntax fold state at a point in time
pub struct SyntaxFoldSnapshot {
buffer_snapshot: MultiBufferSnapshot,
folds: TreeMap<Range<Anchor>, SyntaxFold>,
queries: Arc<Vec<FoldingQuery>>,
config: SyntaxFoldConfig,
}
/// Map that tracks syntax-driven folds in the buffer
pub struct SyntaxFoldMap {
buffer: Entity<MultiBuffer>,
queries: Arc<Vec<FoldingQuery>>,
config: SyntaxFoldConfig,
// All detected syntax folds
folds: TreeMap<Range<Anchor>, SyntaxFold>,
next_fold_id: usize,
// Track which folds are currently applied to the fold map
applied_folds: HashMap<SyntaxFoldId, Range<Anchor>>,
// Track which folds are temporarily expanded due to cursor proximity
proximity_expanded: Arc<Mutex<HashSet<SyntaxFoldId>>>,
// Track which folds have been manually toggled by the user
user_toggled: Arc<Mutex<HashSet<SyntaxFoldId>>>,
}
impl SyntaxFoldMap {
pub fn new(buffer: Entity<MultiBuffer>, cx: &mut App) -> Self {
let mut this = Self {
buffer: buffer.clone(),
queries: Arc::new(Vec::new()),
config: SyntaxFoldConfig::default(),
folds: TreeMap::default(),
next_fold_id: 0,
applied_folds: HashMap::default(),
proximity_expanded: Arc::new(Mutex::new(HashSet::default())),
user_toggled: Arc::new(Mutex::new(HashSet::default())),
};
// Initialize queries from languages
this.update_queries(cx);
// Perform initial fold detection if auto-fold is enabled
if this.config.enabled && this.config.auto_fold_on_open {
this.detect_folds(cx);
}
this
}
pub fn snapshot(&self, buffer_snapshot: MultiBufferSnapshot) -> SyntaxFoldSnapshot {
SyntaxFoldSnapshot {
buffer_snapshot,
folds: self.folds.clone(),
queries: self.queries.clone(),
config: self.config.clone(),
}
}
/// Returns ranges that should be folded/unfolded based on syntax analysis
pub fn sync(
&mut self,
buffer_snapshot: MultiBufferSnapshot,
buffer_edits: Vec<Edit<Point>>,
cx: &mut App,
) -> (Vec<Range<Anchor>>, Vec<Range<Anchor>>) {
// Re-detect folds in edited regions
for edit in &buffer_edits {
self.detect_folds_in_range(edit.new.clone(), &buffer_snapshot, cx);
}
// Determine which folds to apply/remove
let mut to_fold = Vec::new();
let mut to_unfold = Vec::new();
for (range, fold) in self.folds.iter() {
let should_be_folded = fold.auto_fold
&& !self.user_toggled.lock().contains(&fold.id)
&& !self.proximity_expanded.lock().contains(&fold.id);
let is_folded = self.applied_folds.contains_key(&fold.id);
if should_be_folded && !is_folded {
to_fold.push(range.clone());
self.applied_folds.insert(fold.id, range.clone());
} else if !should_be_folded && is_folded {
to_unfold.push(range.clone());
self.applied_folds.remove(&fold.id);
}
}
(to_fold, to_unfold)
}
/// Update queries from the current languages in the buffer
fn update_queries(&mut self, cx: &App) {
let mut queries = Vec::new();
let buffer = self.buffer.read(cx);
for excerpt in buffer.excerpts() {
if let Some(language) = excerpt.buffer.read(cx).language() {
if let Some(folding_query_str) = language.folding_query() {
if let Ok(query) =
Query::new(&language.grammar().unwrap().ts_language, folding_query_str)
{
// Parse query metadata from captures
let mut auto_fold = false;
let mut display_capture_ix = None;
let mut action_capture_ix = None;
let mut proximity_expand = false;
for (ix, name) in query.capture_names().iter().enumerate() {
match name.as_str() {
"fold.auto" => auto_fold = true,
"fold.text" | "fold.display" => {
display_capture_ix = Some(ix as u32)
}
"fold.action" | "fold.url" => action_capture_ix = Some(ix as u32),
"fold.proximity" => proximity_expand = true,
_ => {}
}
}
queries.push(FoldingQuery {
query,
auto_fold,
display_capture_ix,
action_capture_ix,
proximity_expand,
});
}
}
}
}
self.queries = Arc::new(queries);
}
/// Detect all folds in the buffer
fn detect_folds(&mut self, cx: &mut App) {
// Take an owned snapshot so the immutable borrow of `cx` ends before the
// mutable borrow needed by `detect_folds_in_excerpt` below.
let snapshot = self.buffer.read(cx).snapshot(cx);
for (excerpt_id, excerpt) in snapshot.excerpts() {
if let Some(tree) = excerpt.tree() {
let excerpt_range = excerpt.range.to_point(&excerpt.buffer);
self.detect_folds_in_excerpt(
excerpt_id,
&excerpt.buffer,
tree.root_node(),
excerpt_range,
cx,
);
}
}
}
/// Detect folds in a specific range
fn detect_folds_in_range(
&mut self,
range: Range<Point>,
buffer_snapshot: &MultiBufferSnapshot,
cx: &mut App,
) {
// Remove existing folds in this range
let range_anchors =
buffer_snapshot.anchor_before(range.start)..buffer_snapshot.anchor_after(range.end);
let mut folds_to_remove = Vec::new();
for (fold_range, fold) in self.folds.iter() {
if fold_range
.start
.cmp(&range_anchors.end, buffer_snapshot)
.is_lt()
&& fold_range
.end
.cmp(&range_anchors.start, buffer_snapshot)
.is_gt()
{
folds_to_remove.push(fold.id);
}
}
for fold_id in folds_to_remove {
self.remove_fold(fold_id);
}
// Re-detect folds in affected excerpts
for (excerpt_id, excerpt) in buffer_snapshot.excerpts() {
let excerpt_range = excerpt.range.to_point(&excerpt.buffer);
if excerpt_range.start < range.end && excerpt_range.end > range.start {
if let Some(tree) = excerpt.tree() {
self.detect_folds_in_excerpt(
excerpt_id,
&excerpt.buffer,
tree.root_node(),
excerpt_range,
cx,
);
}
}
}
}
/// Detect folds in a specific excerpt using tree-sitter queries
fn detect_folds_in_excerpt(
&mut self,
excerpt_id: ExcerptId,
buffer: &BufferSnapshot,
root: Node,
range: Range<Point>,
cx: &mut App,
) {
let mut cursor = QueryCursor::new();
cursor.set_point_range(
tree_sitter::Point::new(range.start.row as usize, range.start.column as usize)
..tree_sitter::Point::new(range.end.row as usize, range.end.column as usize),
);
// Collect the text once, into a local, so the byte slice handed to the
// cursor outlives the returned matches.
let byte_range = buffer.point_to_offset(range.start)..buffer.point_to_offset(range.end);
let text = buffer
.as_rope()
.chunks_in_range(byte_range)
.collect::<String>();
// Clone the query list so `self` stays mutably borrowable for
// `create_fold_from_match` inside the loop.
let queries = self.queries.clone();
for (query_index, folding_query) in queries.iter().enumerate() {
let matches = cursor.matches(&folding_query.query, root, text.as_bytes());
for match_ in matches {
if let Some(fold) = self.create_fold_from_match(
match_.captures,
query_index,
folding_query,
excerpt_id,
buffer,
cx,
) {
// Store the fold
self.folds.insert(fold.range.clone(), fold);
}
}
}
}
/// Create a SyntaxFold from a tree-sitter match
fn create_fold_from_match(
&mut self,
captures: &[QueryCapture],
query_index: usize,
query: &FoldingQuery,
excerpt_id: ExcerptId,
buffer: &BufferSnapshot,
cx: &App,
) -> Option<SyntaxFold> {
// Find the main fold capture (should be the one without a suffix)
let fold_capture = captures.iter().find(|c| {
let name = query.query.capture_names()[c.index as usize].as_str();
name == "fold" || name == "fold.auto"
})?;
let fold_node = fold_capture.node;
let start = Point::new(
fold_node.start_position().row as u32,
fold_node.start_position().column as u32,
);
let end = Point::new(
fold_node.end_position().row as u32,
fold_node.end_position().column as u32,
);
// Convert to anchors in the multi-buffer
let buffer_snapshot = self.buffer.read(cx).snapshot(cx);
let start_anchor = buffer_snapshot.anchor_in_excerpt(excerpt_id, start)?;
let end_anchor = buffer_snapshot.anchor_in_excerpt(excerpt_id, end)?;
// Extract display text if specified
let display_text = if let Some(display_ix) = query.display_capture_ix {
captures
.iter()
.find(|c| c.index == display_ix)
.and_then(|c| {
let node = c.node;
let start = node.start_byte();
let end = node.end_byte();
buffer.text_for_range(start..end).collect::<String>().into()
})
} else {
None
};
// Extract action data if specified
let action_data = if let Some(action_ix) = query.action_capture_ix {
captures
.iter()
.find(|c| c.index == action_ix)
.and_then(|c| {
let node = c.node;
let start = node.start_byte();
let end = node.end_byte();
let text = buffer.text_for_range(start..end).collect::<String>();
// Parse action based on capture name
let capture_name = &query.query.capture_names()[c.index as usize];
match capture_name.as_str() {
"fold.url" => Some(FoldAction::OpenUrl(text)),
"fold.file" => Some(FoldAction::GoToFile(text)),
"fold.command" => Some(FoldAction::RunCommand(text)),
_ => None,
}
})
} else {
None
};
let fold_id = SyntaxFoldId(self.next_fold_id);
self.next_fold_id += 1;
Some(SyntaxFold {
id: fold_id,
range: start_anchor..end_anchor,
display_text,
action_data,
proximity_expand: query.proximity_expand,
auto_fold: query.auto_fold,
query_index,
})
}
/// Create a fold placeholder for syntax folds
pub fn create_fold_placeholder(fold: &SyntaxFold) -> FoldPlaceholder {
let display_text = fold.display_text.clone();
FoldPlaceholder {
render: Arc::new(move |_id, _range, _cx| {
if let Some(text) = &display_text {
// Render the display text as a clickable element
Label::new(text.clone())
.color(Color::Accent)
.into_any_element()
} else {
// Default ellipsis
Label::new("").color(Color::Muted).into_any_element()
}
}),
constrain_width: true,
merge_adjacent: false,
type_tag: Some(TypeId::of::<SyntaxFold>()),
}
}
/// Remove a syntax fold
fn remove_fold(&mut self, fold_id: SyntaxFoldId) {
let mut range_to_remove = None;
for (range, fold) in self.folds.iter() {
if fold.id == fold_id {
range_to_remove = Some(range.clone());
break;
}
}
if let Some(range) = range_to_remove {
self.folds.remove(&range);
self.applied_folds.remove(&fold_id);
}
}
/// Handle cursor movement for proximity-based expansion
pub fn handle_cursor_moved(
&mut self,
cursor_offset: usize,
buffer_snapshot: &MultiBufferSnapshot,
) -> (Vec<Range<Anchor>>, Vec<Range<Anchor>>) {
let cursor_point = cursor_offset.to_point(buffer_snapshot);
let mut to_expand = Vec::new();
let mut to_collapse = Vec::new();
// Check each fold for proximity
for (range, fold) in self.folds.iter() {
if !fold.proximity_expand {
continue;
}
let fold_start = range.start.to_point(buffer_snapshot);
let fold_end = range.end.to_point(buffer_snapshot);
// Check if cursor is near the fold
let near_fold = Self::is_point_near_range(
cursor_point,
fold_start..fold_end,
self.config.proximity_expand_distance,
);
let is_expanded = self.proximity_expanded.lock().contains(&fold.id);
if near_fold && !is_expanded {
// Mark for expansion
self.proximity_expanded.lock().insert(fold.id);
to_expand.push(range.clone());
} else if !near_fold && is_expanded {
// Mark for collapse
self.proximity_expanded.lock().remove(&fold.id);
to_collapse.push(range.clone());
}
}
(to_expand, to_collapse)
}
/// Check if a point is near a range
fn is_point_near_range(point: Point, range: Range<Point>, distance: u32) -> bool {
// Check if point is within the range
if point >= range.start && point <= range.end {
return true;
}
// Check distance before the range. The point must actually precede the
// start: with saturating_sub alone, any same-row point would count as near.
if point.row == range.start.row
&& point < range.start
&& range.start.column - point.column <= distance
{
return true;
}
// Check distance after the range, requiring the point to follow the end.
if point.row == range.end.row && point > range.end && point.column - range.end.column <= distance
{
return true;
}
false
}
/// Get fold at a specific position
pub fn fold_at_position(
&self,
offset: usize,
buffer_snapshot: &MultiBufferSnapshot,
) -> Option<&SyntaxFold> {
let point = offset.to_point(buffer_snapshot);
for (range, fold) in self.folds.iter() {
let start = range.start.to_point(buffer_snapshot);
let end = range.end.to_point(buffer_snapshot);
if point >= start && point <= end {
return Some(fold);
}
}
None
}
/// Execute the action associated with a fold
pub fn execute_fold_action(&self, fold: &SyntaxFold, cx: &mut App) {
if let Some(action) = &fold.action_data {
match action {
FoldAction::OpenUrl(url) => {
// Use platform open to open URLs
cx.open_url(url.as_str());
}
FoldAction::GoToFile(path) => {
// This would be handled by the workspace
log::info!("Go to file: {}", path);
}
FoldAction::RunCommand(cmd) => {
log::info!("Run command: {}", cmd);
}
FoldAction::Custom(name, data) => {
log::info!("Custom action {}: {:?}", name, data);
}
}
}
}
}
impl Clone for SyntaxFoldSnapshot {
fn clone(&self) -> Self {
Self {
buffer_snapshot: self.buffer_snapshot.clone(),
folds: self.folds.clone(),
queries: self.queries.clone(),
config: self.config.clone(),
}
}
}
#[cfg(test)]
mod tests {
use super::*;
use gpui::{Context as _, TestAppContext};
use language::{Buffer, Language, LanguageConfig, LanguageMatcher};
use multi_buffer::MultiBuffer;
use std::sync::Arc;
// Note: These tests are placeholders showing the intended API.
// Real tests would require actual tree-sitter language support.
#[gpui::test]
async fn test_syntax_fold_api(cx: &mut TestAppContext) {
cx.update(|cx| {
// Create a buffer
let buffer = cx.new_model(|cx| Buffer::local("[Example](https://example.com)", cx));
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, language::Capability::ReadWrite);
multibuffer.push_buffer(buffer.clone(), cx);
multibuffer
});
// Create syntax fold map
let mut syntax_fold_map = SyntaxFoldMap::new(multibuffer.clone(), cx);
// Test basic API
syntax_fold_map.config.enabled = true;
syntax_fold_map.detect_folds(cx);
// Verify the API works
let buffer_snapshot = multibuffer.read(cx).snapshot(cx);
let snapshot = syntax_fold_map.snapshot(buffer_snapshot);
assert_eq!(snapshot.folds.len(), 0); // No folds without proper tree-sitter
});
}
}
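To make the proximity rule concrete, here is a standalone restatement with worked values. `Point` is a local stand-in for `language::Point` (ordered by row, then column), and the before/after checks require the point to strictly precede or follow the range, as in the fixed `is_point_near_range` above:

```rust
// Stand-in for language::Point: ordered by row, then column.
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
struct Point {
    row: u32,
    column: u32,
}

// Same rule as SyntaxFoldMap::is_point_near_range: inside the range is
// always near; outside it, only same-row column distance counts.
fn is_point_near_range(point: Point, start: Point, end: Point, distance: u32) -> bool {
    if point >= start && point <= end {
        return true;
    }
    if point.row == start.row && point < start && start.column - point.column <= distance {
        return true;
    }
    point.row == end.row && point > end && point.column - end.column <= distance
}

fn main() {
    let start = Point { row: 0, column: 10 };
    let end = Point { row: 0, column: 30 };
    assert!(is_point_near_range(Point { row: 0, column: 8 }, start, end, 2)); // 2 columns before
    assert!(!is_point_near_range(Point { row: 0, column: 7 }, start, end, 2)); // 3 columns before
    assert!(!is_point_near_range(Point { row: 1, column: 10 }, start, end, 2)); // different row
}
```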

View File

@@ -0,0 +1,320 @@
use super::*;
use crate::{
display_map::{DisplayMap, FoldMap, InlayMap},
test::editor_test_context::EditorTestContext,
};
use gpui::{Context, TestAppContext};
use language::{Language, LanguageConfig, LanguageMatcher, LanguageRegistry};
use multi_buffer::MultiBuffer;
use project::Project;
use indoc::indoc;
use std::sync::Arc;
#[gpui::test]
async fn test_markdown_link_folding(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
// Set up markdown language with folding query
cx.update_buffer(|buffer, cx| {
let registry = LanguageRegistry::test(cx.background_executor().clone());
let markdown = Arc::new(Language::new(
LanguageConfig {
name: "Markdown".into(),
matcher: LanguageMatcher {
path_suffixes: vec!["md".to_string()],
..Default::default()
},
..Default::default()
},
None,
));
registry.add(markdown.clone());
buffer.set_language_registry(registry);
buffer.set_language(Some(markdown), cx);
});
// Add markdown content with links
cx.set_state(indoc! {"
# Channel Notes
Check out [our website](https://example.com) for more info.
Here's a [link to docs](https://docs.example.com/guide) that you should read.
And another [reference](https://reference.example.com).
"});
cx.update_editor(|editor, cx| {
// Enable syntax folding
editor.display_map.update(cx, |map, cx| {
map.syntax_fold_config = SyntaxFoldConfig {
enabled: true,
auto_fold_on_open: true,
proximity_expand_distance: 2,
};
map.detect_syntax_folds(cx);
});
// Verify folds were created for each link
let snapshot = editor.snapshot(cx);
let fold_count = snapshot.fold_count();
assert_eq!(fold_count, 3, "Should have 3 folds for 3 links");
});
}
#[gpui::test]
async fn test_link_proximity_expansion(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
// Set up markdown with a single link
cx.set_state(indoc! {"
Check out [this link](https://example.com) for details.
"});
cx.update_editor(|editor, cx| {
// Enable syntax folding with proximity expansion
editor.display_map.update(cx, |map, cx| {
map.syntax_fold_config = SyntaxFoldConfig {
enabled: true,
auto_fold_on_open: true,
proximity_expand_distance: 3,
};
map.detect_syntax_folds(cx);
});
});
// Move cursor near the link
cx.set_selections_state(indoc! {"
Check out |[this link](https://example.com) for details.
"});
cx.update_editor(|editor, cx| {
editor.display_map.update(cx, |map, cx| {
// Simulate cursor movement handling
let cursor_offset = editor.selections.newest::<usize>(cx).head();
map.handle_cursor_movement(cursor_offset, cx);
});
// Verify link is expanded due to proximity
let snapshot = editor.snapshot(cx);
let expanded_count = snapshot.expanded_fold_count();
assert_eq!(
expanded_count, 1,
"Link should be expanded when cursor is near"
);
});
// Move cursor away from the link
cx.set_selections_state(indoc! {"
Check out [this link](https://example.com) for details.|
"});
cx.update_editor(|editor, cx| {
editor.display_map.update(cx, |map, cx| {
let cursor_offset = editor.selections.newest::<usize>(cx).head();
map.handle_cursor_movement(cursor_offset, cx);
});
// Verify link is folded again
let snapshot = editor.snapshot(cx);
let expanded_count = snapshot.expanded_fold_count();
assert_eq!(
expanded_count, 0,
"Link should be folded when cursor moves away"
);
});
}
#[gpui::test]
async fn test_link_click_action(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
cx.set_state(indoc! {"
Visit [GitHub](https://github.com) for code.
"});
cx.update_editor(|editor, cx| {
editor.display_map.update(cx, |map, cx| {
map.syntax_fold_config = SyntaxFoldConfig {
enabled: true,
auto_fold_on_open: true,
proximity_expand_distance: 2,
};
map.detect_syntax_folds(cx);
});
});
// Simulate clicking on the folded link
cx.update_editor(|editor, cx| {
let point = editor.pixel_position_of_cursor(cx);
// Find fold at click position
let display_map = editor.display_map.read(cx);
let snapshot = display_map.snapshot(cx);
let click_offset = snapshot.display_point_to_offset(point, cx);
if let Some(fold) = display_map.syntax_fold_at_offset(click_offset) {
// Verify the fold has the correct URL action
assert!(matches!(
fold.action_data,
Some(FoldAction::OpenUrl(ref url)) if url == "https://github.com"
));
// Execute the action (in tests, this would be mocked)
display_map.execute_fold_action(&fold, cx);
} else {
panic!("No fold found at click position");
}
});
}
#[gpui::test]
async fn test_nested_markdown_structures(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
// Test with nested brackets and complex markdown
cx.set_state(indoc! {"
# Documentation
See [the [nested] guide](https://example.com/guide) for details.
![Image description](https://example.com/image.png)
Code: `[not a link](just code)`
> Quote with [link in quote](https://quoted.com)
"});
cx.update_editor(|editor, cx| {
editor.display_map.update(cx, |map, cx| {
map.syntax_fold_config = SyntaxFoldConfig {
enabled: true,
auto_fold_on_open: true,
proximity_expand_distance: 2,
};
map.detect_syntax_folds(cx);
});
// Verify correct number of folds (should not fold code block content)
let snapshot = editor.snapshot(cx);
let fold_count = snapshot.fold_count();
assert_eq!(
fold_count, 3,
"Should have 3 folds: nested link, image, and quote link"
);
});
}
#[gpui::test]
async fn test_fold_persistence_across_edits(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
cx.set_state(indoc! {"
First [link one](https://one.com) here.
Second [link two](https://two.com) here.
"});
cx.update_editor(|editor, cx| {
editor.display_map.update(cx, |map, cx| {
map.syntax_fold_config = SyntaxFoldConfig {
enabled: true,
auto_fold_on_open: true,
proximity_expand_distance: 2,
};
map.detect_syntax_folds(cx);
});
});
// Edit text between links
cx.set_selections_state(indoc! {"
First [link one](https://one.com) here|.
Second [link two](https://two.com) here.
"});
cx.simulate_keystroke(" and more text");
cx.update_editor(|editor, cx| {
// Verify folds are maintained after edit
let snapshot = editor.snapshot(cx);
let fold_count = snapshot.fold_count();
assert_eq!(fold_count, 2, "Both folds should persist after edit");
});
// Add a new link
cx.simulate_keystroke("\nThird [link three](https://three.com) added.");
cx.update_editor(|editor, cx| {
// Verify new fold is detected
let snapshot = editor.snapshot(cx);
let fold_count = snapshot.fold_count();
assert_eq!(fold_count, 3, "New fold should be detected for added link");
});
}
#[gpui::test]
async fn test_channel_notes_integration(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
// Simulate channel notes content with many links
cx.set_state(indoc! {"
# Project Resources
## Documentation
- [API Reference](https://api.example.com/docs)
- [User Guide](https://guide.example.com)
- [Developer Handbook](https://dev.example.com/handbook)
## Tools
- [Build Status](https://ci.example.com/status)
- [Issue Tracker](https://issues.example.com)
- [Code Review](https://review.example.com)
## External Links
- [Blog Post](https://blog.example.com/announcement)
- [Video Tutorial](https://youtube.com/watch?v=demo)
- [Community Forum](https://forum.example.com)
"});
cx.update_editor(|editor, cx| {
editor.display_map.update(cx, |map, cx| {
map.syntax_fold_config = SyntaxFoldConfig {
enabled: true,
auto_fold_on_open: true,
proximity_expand_distance: 2,
};
map.detect_syntax_folds(cx);
});
// Verify all links are folded
let snapshot = editor.snapshot(cx);
let fold_count = snapshot.fold_count();
assert_eq!(fold_count, 9, "All 9 links should be folded");
// Verify visual appearance (folded links show only text)
let display_text = snapshot.display_text();
assert!(display_text.contains("API Reference"));
assert!(!display_text.contains("https://api.example.com/docs"));
});
}
fn init_test(cx: &mut TestAppContext) {
cx.update(|cx| {
let settings = crate::test::test_settings(cx);
cx.set_global(settings);
theme::init(theme::LoadThemes::JustBase, cx);
language::init(cx);
crate::init(cx);
});
}

File diff suppressed because it is too large

View File

@@ -1,7 +1,6 @@
use super::*;
use crate::{
JoinLines,
code_context_menus::CodeContextMenu,
inline_completion_tests::FakeInlineCompletionProvider,
linked_editing_ranges::LinkedEditingRanges,
scroll::scroll_amount::ScrollAmount,
@@ -8513,123 +8512,108 @@ async fn test_snippet_placeholder_choices(cx: &mut TestAppContext) {
async fn test_snippets(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let mut cx = EditorTestContext::new(cx).await;
cx.set_state(indoc! {"
a.ˇ b
a.ˇ b
a.ˇ b
"});
cx.update_editor(|editor, window, cx| {
let snippet = Snippet::parse("f(${1:one}, ${2:two}, ${1:three})$0").unwrap();
let insertion_ranges = editor
.selections
.all(cx)
.iter()
.map(|s| s.range().clone())
.collect::<Vec<_>>();
editor
.insert_snippet(&insertion_ranges, snippet, window, cx)
.unwrap();
});
cx.assert_editor_state(indoc! {"
a.f(«oneˇ», two, «threeˇ») b
a.f(«oneˇ», two, «threeˇ») b
a.f(«oneˇ», two, «threeˇ») b
"});
// Can't move earlier than the first tab stop
cx.update_editor(|editor, window, cx| {
assert!(!editor.move_to_prev_snippet_tabstop(window, cx))
});
cx.assert_editor_state(indoc! {"
a.f(«oneˇ», two, «threeˇ») b
a.f(«oneˇ», two, «threeˇ») b
a.f(«oneˇ», two, «threeˇ») b
"});
cx.update_editor(|editor, window, cx| assert!(editor.move_to_next_snippet_tabstop(window, cx)));
cx.assert_editor_state(indoc! {"
a.f(one, «twoˇ», three) b
a.f(one, «twoˇ», three) b
a.f(one, «twoˇ», three) b
"});
cx.update_editor(|editor, window, cx| assert!(editor.move_to_prev_snippet_tabstop(window, cx)));
cx.assert_editor_state(indoc! {"
a.f(«oneˇ», two, «threeˇ») b
a.f(«oneˇ», two, «threeˇ») b
a.f(«oneˇ», two, «threeˇ») b
"});
cx.update_editor(|editor, window, cx| assert!(editor.move_to_next_snippet_tabstop(window, cx)));
cx.assert_editor_state(indoc! {"
a.f(one, «twoˇ», three) b
a.f(one, «twoˇ», three) b
a.f(one, «twoˇ», three) b
"});
cx.update_editor(|editor, window, cx| assert!(editor.move_to_next_snippet_tabstop(window, cx)));
cx.assert_editor_state(indoc! {"
a.f(one, two, three)ˇ b
a.f(one, two, three)ˇ b
a.f(one, two, three)ˇ b
"});
// As soon as the last tab stop is reached, snippet state is gone
cx.update_editor(|editor, window, cx| {
assert!(!editor.move_to_prev_snippet_tabstop(window, cx))
});
cx.assert_editor_state(indoc! {"
a.f(one, two, three)ˇ b
a.f(one, two, three)ˇ b
a.f(one, two, three)ˇ b
"});
}
#[gpui::test]
async fn test_snippet_indentation(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let mut cx = EditorTestContext::new(cx).await;
cx.update_editor(|editor, window, cx| {
let snippet = Snippet::parse(indoc! {"
/*
* Multiline comment with leading indentation
*
* $1
*/
$0"})
.unwrap();
let insertion_ranges = editor
.selections
.all(cx)
.iter()
.map(|s| s.range().clone())
.collect::<Vec<_>>();
editor
.insert_snippet(&insertion_ranges, snippet, window, cx)
.unwrap();
});
cx.assert_editor_state(indoc! {"
/*
* Multiline comment with leading indentation
*
* ˇ
*/
"});
cx.update_editor(|editor, window, cx| assert!(editor.move_to_next_snippet_tabstop(window, cx)));
cx.assert_editor_state(indoc! {"
/*
* Multiline comment with leading indentation
*
*•
*/
ˇ"});
}
#[gpui::test]
@@ -10495,7 +10479,6 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
run_description: &'static str,
initial_state: String,
buffer_marked_text: String,
completion_label: &'static str,
completion_text: &'static str,
expected_with_insert_mode: String,
expected_with_replace_mode: String,
@@ -10508,7 +10491,6 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
run_description: "Start of word matches completion text",
initial_state: "before ediˇ after".into(),
buffer_marked_text: "before <edi|> after".into(),
completion_label: "editor",
completion_text: "editor",
expected_with_insert_mode: "before editorˇ after".into(),
expected_with_replace_mode: "before editorˇ after".into(),
@@ -10519,7 +10501,6 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
run_description: "Accept same text at the middle of the word",
initial_state: "before ediˇtor after".into(),
buffer_marked_text: "before <edi|tor> after".into(),
completion_label: "editor",
completion_text: "editor",
expected_with_insert_mode: "before editorˇtor after".into(),
expected_with_replace_mode: "before editorˇ after".into(),
@@ -10530,7 +10511,6 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
run_description: "End of word matches completion text -- cursor at end",
initial_state: "before torˇ after".into(),
buffer_marked_text: "before <tor|> after".into(),
completion_label: "editor",
completion_text: "editor",
expected_with_insert_mode: "before editorˇ after".into(),
expected_with_replace_mode: "before editorˇ after".into(),
@@ -10541,7 +10521,6 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
run_description: "End of word matches completion text -- cursor at start",
initial_state: "before ˇtor after".into(),
buffer_marked_text: "before <|tor> after".into(),
completion_label: "editor",
completion_text: "editor",
expected_with_insert_mode: "before editorˇtor after".into(),
expected_with_replace_mode: "before editorˇ after".into(),
@@ -10552,7 +10531,6 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
run_description: "Prepend text containing whitespace",
initial_state: "pˇfield: bool".into(),
buffer_marked_text: "<p|field>: bool".into(),
completion_label: "pub ",
completion_text: "pub ",
expected_with_insert_mode: "pub ˇfield: bool".into(),
expected_with_replace_mode: "pub ˇ: bool".into(),
@@ -10563,7 +10541,6 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
run_description: "Add element to start of list",
initial_state: "[element_ˇelement_2]".into(),
buffer_marked_text: "[<element_|element_2>]".into(),
completion_label: "element_1",
completion_text: "element_1",
expected_with_insert_mode: "[element_1ˇelement_2]".into(),
expected_with_replace_mode: "[element_1ˇ]".into(),
@@ -10574,7 +10551,6 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
run_description: "Add element to start of list -- first and second elements are equal",
initial_state: "[elˇelement]".into(),
buffer_marked_text: "[<el|element>]".into(),
completion_label: "element",
completion_text: "element",
expected_with_insert_mode: "[elementˇelement]".into(),
expected_with_replace_mode: "[elementˇ]".into(),
@@ -10585,7 +10561,6 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
run_description: "Ends with matching suffix",
initial_state: "SubˇError".into(),
buffer_marked_text: "<Sub|Error>".into(),
completion_label: "SubscriptionError",
completion_text: "SubscriptionError",
expected_with_insert_mode: "SubscriptionErrorˇError".into(),
expected_with_replace_mode: "SubscriptionErrorˇ".into(),
@@ -10596,7 +10571,6 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
run_description: "Suffix is a subsequence -- contiguous",
initial_state: "SubˇErr".into(),
buffer_marked_text: "<Sub|Err>".into(),
completion_label: "SubscriptionError",
completion_text: "SubscriptionError",
expected_with_insert_mode: "SubscriptionErrorˇErr".into(),
expected_with_replace_mode: "SubscriptionErrorˇ".into(),
@@ -10607,7 +10581,6 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
run_description: "Suffix is a subsequence -- non-contiguous -- replace intended",
initial_state: "Suˇscrirr".into(),
buffer_marked_text: "<Su|scrirr>".into(),
completion_label: "SubscriptionError",
completion_text: "SubscriptionError",
expected_with_insert_mode: "SubscriptionErrorˇscrirr".into(),
expected_with_replace_mode: "SubscriptionErrorˇ".into(),
@@ -10618,46 +10591,12 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
run_description: "Suffix is a subsequence -- non-contiguous -- replace unintended",
initial_state: "foo(indˇix)".into(),
buffer_marked_text: "foo(<ind|ix>)".into(),
completion_label: "node_index",
completion_text: "node_index",
expected_with_insert_mode: "foo(node_indexˇix)".into(),
expected_with_replace_mode: "foo(node_indexˇ)".into(),
expected_with_replace_subsequence_mode: "foo(node_indexˇix)".into(),
expected_with_replace_suffix_mode: "foo(node_indexˇix)".into(),
},
Run {
run_description: "Replace range ends before cursor - should extend to cursor",
initial_state: "before editˇo after".into(),
buffer_marked_text: "before <{ed}>it|o after".into(),
completion_label: "editor",
completion_text: "editor",
expected_with_insert_mode: "before editorˇo after".into(),
expected_with_replace_mode: "before editorˇo after".into(),
expected_with_replace_subsequence_mode: "before editorˇo after".into(),
expected_with_replace_suffix_mode: "before editorˇo after".into(),
},
Run {
run_description: "Uses label for suffix matching",
initial_state: "before ediˇtor after".into(),
buffer_marked_text: "before <edi|tor> after".into(),
completion_label: "editor",
completion_text: "editor()",
expected_with_insert_mode: "before editor()ˇtor after".into(),
expected_with_replace_mode: "before editor()ˇ after".into(),
expected_with_replace_subsequence_mode: "before editor()ˇ after".into(),
expected_with_replace_suffix_mode: "before editor()ˇ after".into(),
},
Run {
run_description: "Case insensitive subsequence and suffix matching",
initial_state: "before EDiˇtoR after".into(),
buffer_marked_text: "before <EDi|toR> after".into(),
completion_label: "editor",
completion_text: "editor",
expected_with_insert_mode: "before editorˇtoR after".into(),
expected_with_replace_mode: "before editorˇ after".into(),
expected_with_replace_subsequence_mode: "before editorˇ after".into(),
expected_with_replace_suffix_mode: "before editorˇ after".into(),
},
];
for run in runs {
@@ -10698,7 +10637,7 @@ async fn test_completion_mode(cx: &mut TestAppContext) {
handle_completion_request_with_insert_and_replace(
&mut cx,
&run.buffer_marked_text,
vec![(run.completion_label, run.completion_text)],
vec![run.completion_text],
counter.clone(),
)
.await;
@@ -10758,7 +10697,7 @@ async fn test_completion_with_mode_specified_by_action(cx: &mut TestAppContext)
handle_completion_request_with_insert_and_replace(
&mut cx,
&buffer_marked_text,
vec![(completion_text, completion_text)],
vec![completion_text],
counter.clone(),
)
.await;
@@ -10792,7 +10731,7 @@ async fn test_completion_with_mode_specified_by_action(cx: &mut TestAppContext)
handle_completion_request_with_insert_and_replace(
&mut cx,
&buffer_marked_text,
vec![(completion_text, completion_text)],
vec![completion_text],
counter.clone(),
)
.await;
@@ -10879,7 +10818,7 @@ async fn test_completion_replacing_surrounding_text_with_multicursors(cx: &mut T
handle_completion_request_with_insert_and_replace(
&mut cx,
completion_marked_buffer,
vec![(completion_text, completion_text)],
vec![completion_text],
Arc::new(AtomicUsize::new(0)),
)
.await;
@@ -10933,7 +10872,7 @@ async fn test_completion_replacing_surrounding_text_with_multicursors(cx: &mut T
handle_completion_request_with_insert_and_replace(
&mut cx,
completion_marked_buffer,
vec![(completion_text, completion_text)],
vec![completion_text],
Arc::new(AtomicUsize::new(0)),
)
.await;
@@ -10982,7 +10921,7 @@ async fn test_completion_replacing_surrounding_text_with_multicursors(cx: &mut T
handle_completion_request_with_insert_and_replace(
&mut cx,
completion_marked_buffer,
vec![(completion_text, completion_text)],
vec![completion_text],
Arc::new(AtomicUsize::new(0)),
)
.await;
@@ -11200,15 +11139,14 @@ async fn test_completion(cx: &mut TestAppContext) {
"});
cx.simulate_keystroke(".");
handle_completion_request(
&mut cx,
indoc! {"
one.|<>
two
three
"},
vec!["first_completion", "second_completion"],
counter.clone(),
)
.await;
cx.condition(|editor, _| editor.context_menu_visible())
@@ -11308,6 +11246,7 @@ async fn test_completion(cx: &mut TestAppContext) {
additional edit
"});
handle_completion_request(
&mut cx,
indoc! {"
one.second_completion
two s
@@ -11315,9 +11254,7 @@ async fn test_completion(cx: &mut TestAppContext) {
additional edit
"},
vec!["fourth_completion", "fifth_completion", "sixth_completion"],
counter.clone(),
)
.await;
cx.condition(|editor, _| editor.context_menu_visible())
@@ -11327,6 +11264,7 @@ async fn test_completion(cx: &mut TestAppContext) {
cx.simulate_keystroke("i");
handle_completion_request(
&mut cx,
indoc! {"
one.second_completion
two si
@@ -11334,9 +11272,7 @@ async fn test_completion(cx: &mut TestAppContext) {
additional edit
"},
vec!["fourth_completion", "fifth_completion", "sixth_completion"],
counter.clone(),
)
.await;
cx.condition(|editor, _| editor.context_menu_visible())
@@ -11370,11 +11306,10 @@ async fn test_completion(cx: &mut TestAppContext) {
editor.show_completions(&ShowCompletions { trigger: None }, window, cx);
});
handle_completion_request(
&mut cx,
"editor.<clo|>",
vec!["close", "clobber"],
counter.clone(),
)
.await;
cx.condition(|editor, _| editor.context_menu_visible())
@@ -11391,128 +11326,6 @@ async fn test_completion(cx: &mut TestAppContext) {
apply_additional_edits.await.unwrap();
}
#[gpui::test]
async fn test_completion_reuse(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let mut cx = EditorLspTestContext::new_rust(
lsp::ServerCapabilities {
completion_provider: Some(lsp::CompletionOptions {
trigger_characters: Some(vec![".".to_string()]),
..Default::default()
}),
..Default::default()
},
cx,
)
.await;
let counter = Arc::new(AtomicUsize::new(0));
cx.set_state("objˇ");
cx.simulate_keystroke(".");
// Initial completion request returns complete results
let is_incomplete = false;
handle_completion_request(
"obj.|<>",
vec!["a", "ab", "abc"],
is_incomplete,
counter.clone(),
&mut cx,
)
.await;
cx.run_until_parked();
assert_eq!(counter.load(atomic::Ordering::Acquire), 1);
cx.assert_editor_state("obj.ˇ");
check_displayed_completions(vec!["a", "ab", "abc"], &mut cx);
// Type "a" - filters existing completions
cx.simulate_keystroke("a");
cx.run_until_parked();
assert_eq!(counter.load(atomic::Ordering::Acquire), 1);
cx.assert_editor_state("obj.aˇ");
check_displayed_completions(vec!["a", "ab", "abc"], &mut cx);
// Type "b" - filters existing completions
cx.simulate_keystroke("b");
cx.run_until_parked();
assert_eq!(counter.load(atomic::Ordering::Acquire), 1);
cx.assert_editor_state("obj.abˇ");
check_displayed_completions(vec!["ab", "abc"], &mut cx);
// Type "c" - filters existing completions
cx.simulate_keystroke("c");
cx.run_until_parked();
assert_eq!(counter.load(atomic::Ordering::Acquire), 1);
cx.assert_editor_state("obj.abcˇ");
check_displayed_completions(vec!["abc"], &mut cx);
// Backspace to delete "c" - filters existing completions
cx.update_editor(|editor, window, cx| {
editor.backspace(&Backspace, window, cx);
});
cx.run_until_parked();
assert_eq!(counter.load(atomic::Ordering::Acquire), 1);
cx.assert_editor_state("obj.abˇ");
check_displayed_completions(vec!["ab", "abc"], &mut cx);
// Moving cursor to the left dismisses menu.
cx.update_editor(|editor, window, cx| {
editor.move_left(&MoveLeft, window, cx);
});
cx.run_until_parked();
assert_eq!(counter.load(atomic::Ordering::Acquire), 1);
cx.assert_editor_state("obj.aˇb");
cx.update_editor(|editor, _, _| {
assert_eq!(editor.context_menu_visible(), false);
});
// Type "b" - new request
cx.simulate_keystroke("b");
let is_incomplete = false;
handle_completion_request(
"obj.<ab|>a",
vec!["ab", "abc"],
is_incomplete,
counter.clone(),
&mut cx,
)
.await;
cx.run_until_parked();
assert_eq!(counter.load(atomic::Ordering::Acquire), 2);
cx.assert_editor_state("obj.abˇb");
check_displayed_completions(vec!["ab", "abc"], &mut cx);
// Backspace to delete "b" - since query was "ab" and is now "a", new request is made.
cx.update_editor(|editor, window, cx| {
editor.backspace(&Backspace, window, cx);
});
let is_incomplete = false;
handle_completion_request(
"obj.<a|>b",
vec!["a", "ab", "abc"],
is_incomplete,
counter.clone(),
&mut cx,
)
.await;
cx.run_until_parked();
assert_eq!(counter.load(atomic::Ordering::Acquire), 3);
cx.assert_editor_state("obj.aˇb");
check_displayed_completions(vec!["a", "ab", "abc"], &mut cx);
// Backspace to delete "a" - dismisses menu.
cx.update_editor(|editor, window, cx| {
editor.backspace(&Backspace, window, cx);
});
cx.run_until_parked();
assert_eq!(counter.load(atomic::Ordering::Acquire), 3);
cx.assert_editor_state("obj.ˇb");
cx.update_editor(|editor, _, _| {
assert_eq!(editor.context_menu_visible(), false);
});
}
#[gpui::test]
async fn test_word_completion(cx: &mut TestAppContext) {
let lsp_fetch_timeout_ms = 10;
@@ -12193,11 +12006,9 @@ async fn test_no_duplicated_completion_requests(cx: &mut TestAppContext) {
let task_completion_item = closure_completion_item.clone();
counter_clone.fetch_add(1, atomic::Ordering::Release);
async move {
Ok(Some(lsp::CompletionResponse::Array(vec![
task_completion_item,
])))
}
});
@@ -17271,64 +17082,6 @@ async fn test_indent_guide_ends_before_empty_line(cx: &mut TestAppContext) {
);
}
#[gpui::test]
async fn test_indent_guide_ignored_only_whitespace_lines(cx: &mut TestAppContext) {
let (buffer_id, mut cx) = setup_indent_guides_editor(
&"
function component() {
\treturn (
\t\t\t
\t\t<div>
\t\t\t<abc></abc>
\t\t</div>
\t)
}"
.unindent(),
cx,
)
.await;
assert_indent_guides(
0..8,
vec![
indent_guide(buffer_id, 1, 6, 0),
indent_guide(buffer_id, 2, 5, 1),
indent_guide(buffer_id, 4, 4, 2),
],
None,
&mut cx,
);
}
#[gpui::test]
async fn test_indent_guide_fallback_to_next_non_entirely_whitespace_line(cx: &mut TestAppContext) {
let (buffer_id, mut cx) = setup_indent_guides_editor(
&"
function component() {
\treturn (
\t
\t\t<div>
\t\t\t<abc></abc>
\t\t</div>
\t)
}"
.unindent(),
cx,
)
.await;
assert_indent_guides(
0..8,
vec![
indent_guide(buffer_id, 1, 6, 0),
indent_guide(buffer_id, 2, 5, 1),
indent_guide(buffer_id, 4, 4, 2),
],
None,
&mut cx,
);
}
#[gpui::test]
async fn test_indent_guide_continuing_off_screen(cx: &mut TestAppContext) {
let (buffer_id, mut cx) = setup_indent_guides_editor(
@@ -20263,6 +20016,7 @@ println!("5");
pane_1
.update_in(cx, |pane, window, cx| {
pane.close_inactive_items(&CloseInactiveItems::default(), window, cx)
.unwrap()
})
.await
.unwrap();
@@ -20299,6 +20053,7 @@ println!("5");
pane_2
.update_in(cx, |pane, window, cx| {
pane.close_inactive_items(&CloseInactiveItems::default(), window, cx)
.unwrap()
})
.await
.unwrap();
@@ -20474,6 +20229,7 @@ println!("5");
});
pane.update_in(cx, |pane, window, cx| {
pane.close_all_items(&CloseAllItems::default(), window, cx)
.unwrap()
})
.await
.unwrap();
@@ -20827,6 +20583,7 @@ async fn test_invisible_worktree_servers(cx: &mut TestAppContext) {
pane.update_in(cx, |pane, window, cx| {
pane.close_active_item(&CloseActiveItem::default(), window, cx)
})
.unwrap()
.await
.unwrap();
pane.update_in(cx, |pane, window, cx| {
@@ -21253,22 +21010,6 @@ pub fn handle_signature_help_request(
}
}
#[track_caller]
pub fn check_displayed_completions(expected: Vec<&'static str>, cx: &mut EditorLspTestContext) {
cx.update_editor(|editor, _, _| {
if let Some(CodeContextMenu::Completions(menu)) = editor.context_menu.borrow().as_ref() {
let entries = menu.entries.borrow();
let entries = entries
.iter()
.map(|entry| entry.string.as_str())
.collect::<Vec<_>>();
assert_eq!(entries, expected);
} else {
panic!("Expected completions menu");
}
});
}
/// Handle a completion request for a marked string that specifies where the
/// completion should be triggered (the '|' character) and which range the
/// returned completions should replace ('<' and '>' delimit that range).
@@ -21276,11 +21017,10 @@ pub fn check_displayed_completions(expected: Vec<&'static str>, cx: &mut EditorL
/// Also see `handle_completion_request_with_insert_and_replace`.
#[track_caller]
pub fn handle_completion_request(
cx: &mut EditorLspTestContext,
marked_string: &str,
completions: Vec<&'static str>,
counter: Arc<AtomicUsize>,
) -> impl Future<Output = ()> {
let complete_from_marker: TextRangeMarker = '|'.into();
let replace_range_marker: TextRangeMarker = ('<', '>').into();
@@ -21304,10 +21044,8 @@ pub fn handle_completion_request(
params.text_document_position.position,
complete_from_position
);
Ok(Some(lsp::CompletionResponse::Array(
completions
.iter()
.map(|completion_text| lsp::CompletionItem {
label: completion_text.to_string(),
@@ -21318,7 +21056,7 @@ pub fn handle_completion_request(
..Default::default()
})
.collect(),
)))
}
});
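As a concrete instance of the marker convention (a hypothetical marked string, not one of the calls in this diff):

```rust
fn main() {
    // `|` marks where completion is triggered (after "sec"); `<` and `>`
    // delimit the range "sec" that each completion's text edit replaces.
    let _marked_string = "one.<sec|>ond";
}
```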
@@ -21330,27 +21068,19 @@ pub fn handle_completion_request(
/// Similar to `handle_completion_request`, but a [`CompletionTextEdit::InsertAndReplace`] will be
/// given instead, which also contains an `insert` range.
///
/// This function uses the cursor position to mimic what Rust-Analyzer provides as the `insert` range,
/// that is, `replace_range.start..cursor_pos`.
pub fn handle_completion_request_with_insert_and_replace(
cx: &mut EditorLspTestContext,
marked_string: &str,
completions: Vec<&'static str>,
counter: Arc<AtomicUsize>,
) -> impl Future<Output = ()> {
let complete_from_marker: TextRangeMarker = '|'.into();
let replace_range_marker: TextRangeMarker = ('<', '>').into();
let (_, mut marked_ranges) = marked_text_ranges_by(
marked_string,
vec![complete_from_marker.clone(), replace_range_marker.clone()],
);
let complete_from_position =
@@ -21358,14 +21088,6 @@ pub fn handle_completion_request_with_insert_and_replace(
let replace_range =
cx.to_lsp_range(marked_ranges.remove(&replace_range_marker).unwrap()[0].clone());
let insert_range = match marked_ranges.remove(&insert_range_marker) {
Some(ranges) if !ranges.is_empty() => cx.to_lsp_range(ranges[0].clone()),
_ => lsp::Range {
start: replace_range.start,
end: complete_from_position,
},
};
let mut request =
cx.set_request_handler::<lsp::request::Completion, _, _>(move |url, params, _| {
let completions = completions.clone();
@@ -21379,13 +21101,16 @@ pub fn handle_completion_request_with_insert_and_replace(
Ok(Some(lsp::CompletionResponse::Array(
completions
.iter()
.map(|completion_text| lsp::CompletionItem {
label: completion_text.to_string(),
text_edit: Some(lsp::CompletionTextEdit::InsertAndReplace(
lsp::InsertReplaceEdit {
insert: lsp::Range {
start: replace_range.start,
end: complete_from_position,
},
replace: replace_range,
new_text: completion_text.to_string(),
},
)),
..Default::default()

View File

@@ -682,7 +682,7 @@ impl EditorElement {
editor.select(
SelectPhase::BeginColumnar {
position,
reset: false,
goal_column: point_for_position.exact_unclipped.column(),
},
window,

View File

@@ -1095,15 +1095,14 @@ mod tests {
//prompt autocompletion menu
cx.simulate_keystroke(".");
handle_completion_request(
&mut cx,
indoc! {"
one.|<>
two
three
"},
vec!["first_completion", "second_completion"],
counter.clone(),
)
.await;
cx.condition(|editor, _| editor.context_menu_visible()) // wait until completion menu is visible

View File

@@ -600,7 +600,7 @@ pub(crate) fn handle_from(
})
.collect::<Vec<_>>();
this.update_in(cx, |this, window, cx| {
this.change_selections_without_showing_completions(None, window, cx, |s| {
s.select(base_selections);
});
})

View File

@@ -22,7 +22,6 @@ use smol::stream::StreamExt;
use task::ResolvedTask;
use task::TaskContext;
use text::BufferId;
use ui::SharedString;
use util::ResultExt as _;
pub(crate) fn find_specific_language_server_in_selection<F>(
@@ -134,22 +133,13 @@ pub fn lsp_tasks(
cx.spawn(async move |cx| {
let mut lsp_tasks = Vec::new();
while let Some(server_to_query) = lsp_task_sources.next().await {
if let Some((server_id, buffers)) = server_to_query {
let source_kind = TaskSourceKind::Lsp(server_id);
let id_base = source_kind.to_id_base();
let mut new_lsp_tasks = Vec::new();
for buffer in buffers {
let lsp_buffer_context = lsp_task_context(&project, &buffer, cx)
.await
.unwrap_or_default();
@@ -178,14 +168,11 @@ pub fn lsp_tasks(
);
}
}
lsp_tasks.push((source_kind, new_lsp_tasks));
}
}
lsp_tasks
})
.race({
// `lsp::LSP_REQUEST_TIMEOUT` is larger than we want for the modal to open fast

View File

@@ -532,9 +532,7 @@ impl EditorTestContext {
#[track_caller]
pub fn assert_editor_selections(&mut self, expected_selections: Vec<Range<usize>>) {
let expected_marked_text =
generate_marked_text(&self.buffer_text(), &expected_selections, true)
.replace(" \n", "\n");
generate_marked_text(&self.buffer_text(), &expected_selections, true);
self.assert_selections(expected_selections, expected_marked_text)
}
@@ -563,8 +561,7 @@ impl EditorTestContext {
) {
let actual_selections = self.editor_selections();
let actual_marked_text =
generate_marked_text(&self.buffer_text(), &actual_selections, true)
.replace(" \n", "\n");
generate_marked_text(&self.buffer_text(), &actual_selections, true);
if expected_selections != actual_selections {
pretty_assertions::assert_eq!(
actual_marked_text,


@@ -246,7 +246,6 @@ impl ExampleContext {
| ThreadEvent::StreamedAssistantThinking(_, _)
| ThreadEvent::UsePendingTools { .. }
| ThreadEvent::CompletionCanceled => {}
ThreadEvent::ToolUseLimitReached => {}
ThreadEvent::ToolFinished {
tool_use_id,
pending_tool_use,


@@ -759,8 +759,8 @@ async fn test_extension_store_with_test_extension(cx: &mut TestAppContext) {
})
.await
.unwrap()
.unwrap()
.into_iter()
.flat_map(|response| response.completions)
.map(|c| c.label.text)
.collect::<Vec<_>>();
assert_eq!(


@@ -38,8 +38,8 @@ use std::{
};
use text::Point;
use ui::{
ButtonLike, ContextMenu, HighlightedLabel, Indicator, KeyBinding, ListItem, ListItemSpacing,
PopoverMenu, PopoverMenuHandle, TintColor, Tooltip, prelude::*,
ContextMenu, HighlightedLabel, IconButtonShape, ListItem, ListItemSpacing, PopoverMenu,
PopoverMenuHandle, Tooltip, prelude::*,
};
use util::{ResultExt, maybe, paths::PathWithPosition, post_inc};
use workspace::{
@@ -47,10 +47,7 @@ use workspace::{
notifications::NotifyResultExt, pane,
};
actions!(
file_finder,
[SelectPrevious, ToggleFilterMenu, ToggleSplitMenu]
);
actions!(file_finder, [SelectPrevious, ToggleMenu]);
impl ModalView for FileFinder {
fn on_before_dismiss(
@@ -59,14 +56,7 @@ impl ModalView for FileFinder {
cx: &mut Context<Self>,
) -> workspace::DismissDecision {
let submenu_focused = self.picker.update(cx, |picker, cx| {
picker
.delegate
.filter_popover_menu_handle
.is_focused(window, cx)
|| picker
.delegate
.split_popover_menu_handle
.is_focused(window, cx)
picker.delegate.popover_menu_handle.is_focused(window, cx)
});
workspace::DismissDecision::Dismiss(!submenu_focused)
}
@@ -222,30 +212,9 @@ impl FileFinder {
window.dispatch_action(Box::new(menu::SelectPrevious), cx);
}
fn handle_filter_toggle_menu(
&mut self,
_: &ToggleFilterMenu,
window: &mut Window,
cx: &mut Context<Self>,
) {
fn handle_toggle_menu(&mut self, _: &ToggleMenu, window: &mut Window, cx: &mut Context<Self>) {
self.picker.update(cx, |picker, cx| {
let menu_handle = &picker.delegate.filter_popover_menu_handle;
if menu_handle.is_deployed() {
menu_handle.hide(cx);
} else {
menu_handle.show(window, cx);
}
});
}
fn handle_split_toggle_menu(
&mut self,
_: &ToggleSplitMenu,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.picker.update(cx, |picker, cx| {
let menu_handle = &picker.delegate.split_popover_menu_handle;
let menu_handle = &picker.delegate.popover_menu_handle;
if menu_handle.is_deployed() {
menu_handle.hide(cx);
} else {
@@ -376,8 +345,7 @@ impl Render for FileFinder {
.w(modal_max_width)
.on_modifiers_changed(cx.listener(Self::handle_modifiers_changed))
.on_action(cx.listener(Self::handle_select_prev))
.on_action(cx.listener(Self::handle_filter_toggle_menu))
.on_action(cx.listener(Self::handle_split_toggle_menu))
.on_action(cx.listener(Self::handle_toggle_menu))
.on_action(cx.listener(Self::handle_toggle_ignored))
.on_action(cx.listener(Self::go_to_file_split_left))
.on_action(cx.listener(Self::go_to_file_split_right))
@@ -403,8 +371,7 @@ pub struct FileFinderDelegate {
history_items: Vec<FoundPath>,
separate_history: bool,
first_update: bool,
filter_popover_menu_handle: PopoverMenuHandle<ContextMenu>,
split_popover_menu_handle: PopoverMenuHandle<ContextMenu>,
popover_menu_handle: PopoverMenuHandle<ContextMenu>,
focus_handle: FocusHandle,
include_ignored: Option<bool>,
include_ignored_refresh: Task<()>,
@@ -791,8 +758,7 @@ impl FileFinderDelegate {
history_items,
separate_history,
first_update: true,
filter_popover_menu_handle: PopoverMenuHandle::default(),
split_popover_menu_handle: PopoverMenuHandle::default(),
popover_menu_handle: PopoverMenuHandle::default(),
focus_handle: cx.focus_handle(),
include_ignored: FileFinderSettings::get_global(cx).include_ignored,
include_ignored_refresh: Task::ready(()),
@@ -1171,13 +1137,8 @@ impl FileFinderDelegate {
fn key_context(&self, window: &Window, cx: &App) -> KeyContext {
let mut key_context = KeyContext::new_with_defaults();
key_context.add("FileFinder");
if self.filter_popover_menu_handle.is_focused(window, cx) {
key_context.add("filter_menu_open");
}
if self.split_popover_menu_handle.is_focused(window, cx) {
key_context.add("split_menu_open");
if self.popover_menu_handle.is_focused(window, cx) {
key_context.add("menu_open");
}
key_context
}
@@ -1531,112 +1492,62 @@ impl PickerDelegate for FileFinderDelegate {
)
}
fn render_footer(
&self,
window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Option<AnyElement> {
let focus_handle = self.focus_handle.clone();
fn render_footer(&self, _: &mut Window, cx: &mut Context<Picker<Self>>) -> Option<AnyElement> {
let context = self.focus_handle.clone();
Some(
h_flex()
.w_full()
.p_1p5()
.p_2()
.justify_between()
.border_t_1()
.border_color(cx.theme().colors().border_variant)
.child(
PopoverMenu::new("filter-menu-popover")
.with_handle(self.filter_popover_menu_handle.clone())
.attach(gpui::Corner::BottomRight)
.anchor(gpui::Corner::BottomLeft)
.offset(gpui::Point {
x: px(1.0),
y: px(1.0),
IconButton::new("toggle-ignored", IconName::Sliders)
.on_click({
let focus_handle = self.focus_handle.clone();
move |_, window, cx| {
focus_handle.dispatch_action(&ToggleIncludeIgnored, window, cx);
}
})
.trigger_with_tooltip(
IconButton::new("filter-trigger", IconName::Sliders)
.icon_size(IconSize::Small)
.icon_size(IconSize::Small)
.toggle_state(self.include_ignored.unwrap_or(false))
.when(self.include_ignored.is_some(), |this| {
this.indicator(Indicator::dot().color(Color::Info))
}),
{
let focus_handle = focus_handle.clone();
move |window, cx| {
Tooltip::for_action_in(
"Filter Options",
&ToggleFilterMenu,
&focus_handle,
window,
cx,
)
}
},
)
.menu({
let focus_handle = focus_handle.clone();
let include_ignored = self.include_ignored;
.style(ButtonStyle::Subtle)
.shape(IconButtonShape::Square)
.toggle_state(self.include_ignored.unwrap_or(false))
.tooltip({
let focus_handle = self.focus_handle.clone();
move |window, cx| {
Some(ContextMenu::build(window, cx, {
let focus_handle = focus_handle.clone();
move |menu, _, _| {
menu.context(focus_handle.clone())
.header("Filter Options")
.toggleable_entry(
"Include Ignored Files",
include_ignored.unwrap_or(false),
ui::IconPosition::End,
Some(ToggleIncludeIgnored.boxed_clone()),
move |window, cx| {
window.focus(&focus_handle);
window.dispatch_action(
ToggleIncludeIgnored.boxed_clone(),
cx,
);
},
)
}
}))
Tooltip::for_action_in(
"Use ignored files",
&ToggleIncludeIgnored,
&focus_handle,
window,
cx,
)
}
}),
)
.child(
h_flex()
.gap_0p5()
.gap_2()
.child(
PopoverMenu::new("split-menu-popover")
.with_handle(self.split_popover_menu_handle.clone())
.attach(gpui::Corner::BottomRight)
.anchor(gpui::Corner::BottomLeft)
.offset(gpui::Point {
x: px(1.0),
y: px(1.0),
})
Button::new("open-selection", "Open").on_click(|_, window, cx| {
window.dispatch_action(menu::Confirm.boxed_clone(), cx)
}),
)
.child(
PopoverMenu::new("menu-popover")
.with_handle(self.popover_menu_handle.clone())
.attach(gpui::Corner::TopRight)
.anchor(gpui::Corner::BottomRight)
.trigger(
ButtonLike::new("split-trigger")
.child(Label::new("Split…"))
.selected_style(ButtonStyle::Tinted(TintColor::Accent))
.children(
KeyBinding::for_action_in(
&ToggleSplitMenu,
&focus_handle,
window,
cx,
)
.map(|kb| kb.size(rems_from_px(12.))),
),
Button::new("actions-trigger", "Split…")
.selected_label_color(Color::Accent),
)
.menu({
let focus_handle = focus_handle.clone();
move |window, cx| {
Some(ContextMenu::build(window, cx, {
let focus_handle = focus_handle.clone();
let context = context.clone();
move |menu, _, _| {
menu.context(focus_handle.clone())
menu.context(context)
.action(
"Split Left",
pane::SplitLeft.boxed_clone(),
@@ -1654,21 +1565,6 @@ impl PickerDelegate for FileFinderDelegate {
}))
}
}),
)
.child(
Button::new("open-selection", "Open")
.key_binding(
KeyBinding::for_action_in(
&menu::Confirm,
&focus_handle,
window,
cx,
)
.map(|kb| kb.size(rems_from_px(12.))),
)
.on_click(|_, window, cx| {
window.dispatch_action(menu::Confirm.boxed_clone(), cx)
}),
),
)
.into_any(),
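
Both removed handlers and the surviving one share the same deploy-check toggle, just pointed at different handles. A self-contained sketch of that toggle with a stand-in handle type (the real PopoverMenuHandle methods also take a Window and a Context):

#[derive(Default)]
struct MenuHandle {
    deployed: bool, // stand-in for PopoverMenuHandle's internal state
}

impl MenuHandle {
    fn is_deployed(&self) -> bool {
        self.deployed
    }
    fn toggle(&mut self) {
        // mirrors the diff: hide when deployed, show otherwise
        if self.is_deployed() {
            self.deployed = false; // menu_handle.hide(cx)
        } else {
            self.deployed = true; // menu_handle.show(window, cx)
        }
    }
}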


@@ -739,6 +739,7 @@ async fn test_ignored_root(cx: &mut TestAppContext) {
.update_in(cx, |workspace, window, cx| {
workspace.active_pane().update(cx, |pane, cx| {
pane.close_active_item(&CloseActiveItem::default(), window, cx)
.unwrap()
})
})
.await


@@ -39,32 +39,15 @@ pub struct UserCaretPosition {
}
impl UserCaretPosition {
pub(crate) fn at_selection_end(
selection: &Selection<Point>,
snapshot: &MultiBufferSnapshot,
) -> Self {
pub fn at_selection_end(selection: &Selection<Point>, snapshot: &MultiBufferSnapshot) -> Self {
let selection_end = selection.head();
let (line, character) = if let Some((buffer_snapshot, point, _)) =
snapshot.point_to_buffer_point(selection_end)
{
let line_start = Point::new(point.row, 0);
let chars_to_last_position = buffer_snapshot
.text_summary_for_range::<text::TextSummary, _>(line_start..point)
.chars as u32;
(line_start.row, chars_to_last_position)
} else {
let line_start = Point::new(selection_end.row, 0);
let chars_to_last_position = snapshot
.text_summary_for_range::<text::TextSummary, _>(line_start..selection_end)
.chars as u32;
(selection_end.row, chars_to_last_position)
};
let line_start = Point::new(selection_end.row, 0);
let chars_to_last_position = snapshot
.text_summary_for_range::<text::TextSummary, _>(line_start..selection_end)
.chars as u32;
Self {
line: NonZeroU32::new(line + 1).expect("added 1"),
character: NonZeroU32::new(character + 1).expect("added 1"),
line: NonZeroU32::new(selection_end.row + 1).expect("added 1"),
character: NonZeroU32::new(chars_to_last_position + 1).expect("added 1"),
}
}
}
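
Both branches of this hunk end by converting 0-based editor coordinates into the 1-based positions shown to users. A hypothetical helper mirroring just that conversion:

use std::num::NonZeroU32;

// Editor rows and character offsets are 0-based; user-facing caret
// positions are 1-based, so both coordinates gain 1.
fn to_user_caret(row: u32, chars_from_line_start: u32) -> (NonZeroU32, NonZeroU32) {
    (
        NonZeroU32::new(row + 1).expect("added 1"),
        NonZeroU32::new(chars_from_line_start + 1).expect("added 1"),
    )
}

// A caret at the very start of a buffer maps to line 1, character 1:
// to_user_caret(0, 0) == (NonZeroU32::new(1).unwrap(), NonZeroU32::new(1).unwrap())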


@@ -202,7 +202,6 @@ pub enum Part {
InlineDataPart(InlineDataPart),
FunctionCallPart(FunctionCallPart),
FunctionResponsePart(FunctionResponsePart),
ThoughtPart(ThoughtPart),
}
#[derive(Debug, Serialize, Deserialize)]
@@ -236,13 +235,6 @@ pub struct FunctionResponsePart {
pub function_response: FunctionResponse,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ThoughtPart {
pub thought: bool,
pub thought_signature: String,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct CitationSource {
@@ -289,22 +281,6 @@ pub struct UsageMetadata {
pub total_token_count: Option<usize>,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ThinkingConfig {
pub thinking_budget: u32,
}
#[cfg_attr(feature = "schemars", derive(schemars::JsonSchema))]
#[derive(Copy, Clone, Debug, Default, Serialize, Deserialize, PartialEq, Eq)]
pub enum GoogleModelMode {
#[default]
Default,
Thinking {
budget_tokens: Option<u32>,
},
}
#[derive(Debug, Deserialize, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct GenerationConfig {
@@ -320,8 +296,6 @@ pub struct GenerationConfig {
pub top_p: Option<f64>,
#[serde(skip_serializing_if = "Option::is_none")]
pub top_k: Option<usize>,
#[serde(skip_serializing_if = "Option::is_none")]
pub thinking_config: Option<ThinkingConfig>,
}
#[derive(Debug, Serialize, Deserialize)]
@@ -514,8 +488,6 @@ pub enum Model {
/// The name displayed in the UI, such as in the assistant panel model dropdown menu.
display_name: Option<String>,
max_tokens: usize,
#[serde(default)]
mode: GoogleModelMode,
},
}
@@ -572,21 +544,6 @@ impl Model {
Model::Custom { max_tokens, .. } => *max_tokens,
}
}
pub fn mode(&self) -> GoogleModelMode {
match self {
Self::Gemini15Pro
| Self::Gemini15Flash
| Self::Gemini20Pro
| Self::Gemini20Flash
| Self::Gemini20FlashThinking
| Self::Gemini20FlashLite
| Self::Gemini25ProExp0325
| Self::Gemini25ProPreview0325
| Self::Gemini25FlashPreview0417 => GoogleModelMode::Default,
Self::Custom { mode, .. } => *mode,
}
}
}
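
The structs this hunk touches all use serde's camelCase renaming, so a field such as thinking_budget serializes as thinkingBudget on the wire. A runnable sketch of that round trip (serde and serde_json assumed):

use serde::{Deserialize, Serialize};

#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
struct ThinkingConfig {
    thinking_budget: u32,
}

fn main() -> serde_json::Result<()> {
    // the snake_case Rust field appears as camelCase JSON
    let json = serde_json::to_string(&ThinkingConfig { thinking_budget: 1024 })?;
    assert_eq!(json, r#"{"thinkingBudget":1024}"#);
    Ok(())
}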
impl std::fmt::Display for Model {


@@ -126,7 +126,6 @@ uuid.workspace = true
waker-fn = "1.2.0"
lyon = "1.0"
workspace-hack.workspace = true
libc.workspace = true
[target.'cfg(target_os = "macos")'.dependencies]
block = "0.1"


@@ -147,49 +147,14 @@ impl Keymap {
});
let mut bindings: SmallVec<[(KeyBinding, usize); 1]> = SmallVec::new();
// (pending, is_no_action, depth, keystrokes)
let mut pending_info_opt: Option<(bool, bool, usize, &[Keystroke])> = None;
let mut is_pending = None;
'outer: for (binding, pending) in possibilities {
for depth in (0..=context_stack.len()).rev() {
if self.binding_enabled(binding, &context_stack[0..depth]) {
let is_no_action = is_no_action(&*binding.action);
// We only want to consider a binding pending if it has an action
// This, however, means that if we have both a NoAction binding and a binding
// with an action at the same depth, we should still set is_pending to true.
if let Some(pending_info) = pending_info_opt.as_mut() {
let (
already_pending,
pending_is_no_action,
pending_depth,
pending_keystrokes,
) = *pending_info;
// We only want to change the pending status if it's not already pending AND if
// the existing pending status was set by a NoAction binding. This avoids a NoAction
// binding erroneously setting the pending status to true when a binding with an action
// already set it to false
//
// We also want to change the pending status if the keystrokes don't match,
// meaning it's different keystrokes than the NoAction that set pending to false
if pending
&& !already_pending
&& pending_is_no_action
&& (pending_depth == depth
|| pending_keystrokes != binding.keystrokes())
{
pending_info.0 = !is_no_action;
}
} else {
pending_info_opt = Some((
pending && !is_no_action,
is_no_action,
depth,
binding.keystrokes(),
));
if is_pending.is_none() {
is_pending = Some(pending);
}
if !pending {
bindings.push((binding.clone(), depth));
continue 'outer;
@@ -209,7 +174,7 @@ impl Keymap {
})
.collect();
(bindings, pending_info_opt.unwrap_or_default().0)
(bindings, is_pending.unwrap_or_default())
}
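
The simplified side of this hunk reduces the pending computation to a first-enabled-binding-wins rule. A condensed sketch of that rule over stand-in data, where each candidate is a (pending, enabled) pair in match order:

// The pending flag reported for a key sequence is whatever the first
// enabled candidate binding says; later candidates still contribute
// bindings but can no longer flip the flag.
fn pending_for_input(candidates: &[(bool, bool)]) -> bool {
    let mut is_pending = None;
    for (pending, enabled) in candidates {
        if *enabled && is_pending.is_none() {
            is_pending = Some(*pending);
        }
    }
    is_pending.unwrap_or_default()
}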
/// Check if the given binding is enabled, given a certain key context.
@@ -345,102 +310,6 @@ mod tests {
);
}
#[test]
/// Tests for https://github.com/zed-industries/zed/issues/30259
fn test_multiple_keystroke_binding_disabled() {
let bindings = [
KeyBinding::new("space w w", ActionAlpha {}, Some("workspace")),
KeyBinding::new("space w w", NoAction {}, Some("editor")),
];
let mut keymap = Keymap::default();
keymap.add_bindings(bindings.clone());
let space = || Keystroke::parse("space").unwrap();
let w = || Keystroke::parse("w").unwrap();
let space_w = [space(), w()];
let space_w_w = [space(), w(), w()];
let workspace_context = || [KeyContext::parse("workspace").unwrap()];
let editor_workspace_context = || {
[
KeyContext::parse("workspace").unwrap(),
KeyContext::parse("editor").unwrap(),
]
};
// Ensure `space` results in pending input on the workspace, but not editor
let space_workspace = keymap.bindings_for_input(&[space()], &workspace_context());
assert!(space_workspace.0.is_empty());
assert_eq!(space_workspace.1, true);
let space_editor = keymap.bindings_for_input(&[space()], &editor_workspace_context());
assert!(space_editor.0.is_empty());
assert_eq!(space_editor.1, false);
// Ensure `space w` results in pending input on the workspace, but not editor
let space_w_workspace = keymap.bindings_for_input(&space_w, &workspace_context());
assert!(space_w_workspace.0.is_empty());
assert_eq!(space_w_workspace.1, true);
let space_w_editor = keymap.bindings_for_input(&space_w, &editor_workspace_context());
assert!(space_w_editor.0.is_empty());
assert_eq!(space_w_editor.1, false);
// Ensure `space w w` results in the binding in the workspace, but not in the editor
let space_w_w_workspace = keymap.bindings_for_input(&space_w_w, &workspace_context());
assert!(!space_w_w_workspace.0.is_empty());
assert_eq!(space_w_w_workspace.1, false);
let space_w_w_editor = keymap.bindings_for_input(&space_w_w, &editor_workspace_context());
assert!(space_w_w_editor.0.is_empty());
assert_eq!(space_w_w_editor.1, false);
// Now test what happens if we have another binding defined AFTER the NoAction
// that should result in pending
let bindings = [
KeyBinding::new("space w w", ActionAlpha {}, Some("workspace")),
KeyBinding::new("space w w", NoAction {}, Some("editor")),
KeyBinding::new("space w x", ActionAlpha {}, Some("editor")),
];
let mut keymap = Keymap::default();
keymap.add_bindings(bindings.clone());
let space_editor = keymap.bindings_for_input(&[space()], &editor_workspace_context());
assert!(space_editor.0.is_empty());
assert_eq!(space_editor.1, true);
// Now test what happens if we have another binding defined BEFORE the NoAction
// that should result in pending
let bindings = [
KeyBinding::new("space w w", ActionAlpha {}, Some("workspace")),
KeyBinding::new("space w x", ActionAlpha {}, Some("editor")),
KeyBinding::new("space w w", NoAction {}, Some("editor")),
];
let mut keymap = Keymap::default();
keymap.add_bindings(bindings.clone());
let space_editor = keymap.bindings_for_input(&[space()], &editor_workspace_context());
assert!(space_editor.0.is_empty());
assert_eq!(space_editor.1, true);
// Now test what happens if we have another binding defined at a higher context
// that should result in pending
let bindings = [
KeyBinding::new("space w w", ActionAlpha {}, Some("workspace")),
KeyBinding::new("space w x", ActionAlpha {}, Some("workspace")),
KeyBinding::new("space w w", NoAction {}, Some("editor")),
];
let mut keymap = Keymap::default();
keymap.add_bindings(bindings.clone());
let space_editor = keymap.bindings_for_input(&[space()], &editor_workspace_context());
assert!(space_editor.0.is_empty());
assert_eq!(space_editor.1, true);
}
#[test]
fn test_bindings_for_action() {
let bindings = [


@@ -751,28 +751,12 @@ where
impl rwh::HasWindowHandle for WaylandWindow {
fn window_handle(&self) -> Result<rwh::WindowHandle<'_>, rwh::HandleError> {
let surface = self.0.surface().id().as_ptr() as *mut libc::c_void;
let c_ptr = NonNull::new(surface).ok_or(rwh::HandleError::Unavailable)?;
let handle = rwh::WaylandWindowHandle::new(c_ptr);
let raw_handle = rwh::RawWindowHandle::Wayland(handle);
Ok(unsafe { rwh::WindowHandle::borrow_raw(raw_handle) })
unimplemented!()
}
}
impl rwh::HasDisplayHandle for WaylandWindow {
fn display_handle(&self) -> Result<rwh::DisplayHandle<'_>, rwh::HandleError> {
let display = self
.0
.surface()
.backend()
.upgrade()
.ok_or(rwh::HandleError::Unavailable)?
.display_ptr() as *mut libc::c_void;
let c_ptr = NonNull::new(display).ok_or(rwh::HandleError::Unavailable)?;
let handle = rwh::WaylandDisplayHandle::new(c_ptr);
let raw_handle = rwh::RawDisplayHandle::Wayland(handle);
Ok(unsafe { rwh::DisplayHandle::borrow_raw(raw_handle) })
unimplemented!()
}
}
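
One side of this hunk implements the handles while the other stubs them with unimplemented!(). A condensed sketch of the raw-window-handle pattern the implemented side follows (a raw-window-handle 0.6-style API is assumed):

use raw_window_handle as rwh;
use std::ptr::NonNull;

// Wrap a non-null Wayland surface pointer in a window handle; a null
// pointer is reported as Unavailable rather than panicking.
fn wayland_raw_handle(
    surface: *mut std::ffi::c_void,
) -> Result<rwh::RawWindowHandle, rwh::HandleError> {
    let ptr = NonNull::new(surface).ok_or(rwh::HandleError::Unavailable)?;
    Ok(rwh::RawWindowHandle::Wayland(rwh::WaylandWindowHandle::new(ptr)))
}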


@@ -25,7 +25,7 @@ use derive_more::{Deref, DerefMut};
use futures::FutureExt;
use futures::channel::oneshot;
use parking_lot::RwLock;
use raw_window_handle::{HandleError, HasDisplayHandle, HasWindowHandle};
use raw_window_handle::{HandleError, HasWindowHandle};
use refineable::Refineable;
use slotmap::SlotMap;
use smallvec::SmallVec;
@@ -4428,14 +4428,6 @@ impl HasWindowHandle for Window {
}
}
impl HasDisplayHandle for Window {
fn display_handle(
&self,
) -> std::result::Result<raw_window_handle::DisplayHandle<'_>, HandleError> {
self.platform_window.display_handle()
}
}
/// An identifier for an [`Element`](crate::Element).
///
/// Can be constructed with a string, a number, or both, as well


@@ -179,8 +179,6 @@ pub enum IconName {
PhoneIncoming,
Pin,
Play,
PlayAlt,
PlayBug,
Plus,
PocketKnife,
Power,


@@ -11,7 +11,7 @@ use gpui::{
InteractiveElement, IntoElement, ObjectFit, ParentElement, Render, Styled, Task, WeakEntity,
Window, canvas, div, fill, img, opaque_grey, point, size,
};
use language::{DiskState, File as _};
use language::File as _;
use persistence::IMAGE_VIEWER;
use project::{ImageItem, Project, ProjectPath, image_store::ImageItemEvent};
use settings::Settings;
@@ -191,10 +191,6 @@ impl Item for ImageView {
focus_handle: cx.focus_handle(),
}))
}
fn has_deleted_file(&self, cx: &App) -> bool {
self.image_item.read(cx).file.disk_state() == DiskState::Deleted
}
}
fn breadcrumbs_text_for_image(project: &Project, image: &ImageItem, cx: &App) -> String {


@@ -11,7 +11,7 @@ use language::{
DiagnosticSeverity, LanguageServerId, Point, ToOffset as _, ToPoint as _,
};
use project::lsp_store::CompletionDocumentation;
use project::{Completion, CompletionResponse, CompletionSource, Project, ProjectPath};
use project::{Completion, CompletionSource, Project, ProjectPath};
use std::cell::RefCell;
use std::fmt::Write as _;
use std::ops::Range;
@@ -641,18 +641,18 @@ impl CompletionProvider for RustStyleCompletionProvider {
_: editor::CompletionContext,
_window: &mut Window,
cx: &mut Context<Editor>,
) -> Task<Result<Vec<CompletionResponse>>> {
) -> Task<Result<Option<Vec<project::Completion>>>> {
let Some(replace_range) = completion_replace_range(&buffer.read(cx).snapshot(), &position)
else {
return Task::ready(Ok(Vec::new()));
return Task::ready(Ok(Some(Vec::new())));
};
self.div_inspector.update(cx, |div_inspector, _cx| {
div_inspector.rust_completion_replace_range = Some(replace_range.clone());
});
Task::ready(Ok(vec![CompletionResponse {
completions: STYLE_METHODS
Task::ready(Ok(Some(
STYLE_METHODS
.iter()
.map(|(_, method)| Completion {
replace_range: replace_range.clone(),
@@ -667,8 +667,7 @@ impl CompletionProvider for RustStyleCompletionProvider {
confirm: None,
})
.collect(),
is_incomplete: false,
}]))
)))
}
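
This hunk toggles between two provider return shapes. A sketch of both with simplified generic types: one wraps the completions in a response carrying an is_incomplete flag, the other returns the bare list, with None meaning the provider does not apply:

struct CompletionResponse<T> {
    completions: Vec<T>,
    is_incomplete: bool, // whether the server would send more on refinement
}

fn wrapped<T>(items: Vec<T>) -> Vec<CompletionResponse<T>> {
    vec![CompletionResponse {
        completions: items,
        is_incomplete: false,
    }]
}

fn bare<T>(items: Vec<T>) -> Option<Vec<T>> {
    Some(items) // None would stand for "no completions from this provider"
}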
fn resolve_completions(


@@ -51,7 +51,6 @@ schemars.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
shellexpand.workspace = true
smallvec.workspace = true
smol.workspace = true
streaming-iterator.workspace = true


@@ -34,7 +34,7 @@ pub use highlight_map::HighlightMap;
use http_client::HttpClient;
pub use language_registry::{LanguageName, LoadedLanguage};
use lsp::{CodeActionKind, InitializeParams, LanguageServerBinary, LanguageServerBinaryOptions};
pub use manifest::{ManifestDelegate, ManifestName, ManifestProvider, ManifestQuery};
pub use manifest::{ManifestName, ManifestProvider, ManifestQuery};
use parking_lot::Mutex;
use regex::Regex;
use schemars::{
@@ -323,6 +323,7 @@ pub trait LspAdapterDelegate: Send + Sync {
fn http_client(&self) -> Arc<dyn HttpClient>;
fn worktree_id(&self) -> WorktreeId;
fn worktree_root_path(&self) -> &Path;
fn exists(&self, path: &Path, is_dir: Option<bool>) -> bool;
fn update_status(&self, language: LanguageServerName, status: BinaryStatus);
fn registered_lsp_adapters(&self) -> Vec<Arc<dyn LspAdapter>>;
async fn language_server_download_dir(&self, name: &LanguageServerName) -> Option<Arc<Path>>;


@@ -23,7 +23,6 @@ use serde_json::Value;
use settings::{
Settings, SettingsLocation, SettingsSources, SettingsStore, add_references_to_properties,
};
use shellexpand;
use std::{borrow::Cow, num::NonZeroU32, path::Path, sync::Arc};
use util::serde::default_true;
@@ -1332,10 +1331,9 @@ impl settings::Settings for AllLanguageSettings {
disabled_globs: completion_globs
.iter()
.filter_map(|g| {
let expanded_g = shellexpand::tilde(g).into_owned();
Some(DisabledGlob {
matcher: globset::Glob::new(&expanded_g).ok()?.compile_matcher(),
is_absolute: Path::new(&expanded_g).is_absolute(),
matcher: globset::Glob::new(g).ok()?.compile_matcher(),
is_absolute: Path::new(g).is_absolute(),
})
})
.collect(),
@@ -1714,12 +1712,10 @@ mod tests {
};
#[cfg(windows)]
let glob_str = glob_str.as_str();
let expanded_glob_str = shellexpand::tilde(glob_str).into_owned();
DisabledGlob {
matcher: globset::Glob::new(&expanded_glob_str)
.unwrap()
.compile_matcher(),
is_absolute: Path::new(&expanded_glob_str).is_absolute(),
matcher: globset::Glob::new(glob_str).unwrap().compile_matcher(),
is_absolute: Path::new(glob_str).is_absolute(),
}
})
.collect(),
@@ -1815,12 +1811,6 @@ mod tests {
let dot_env_file = make_test_file(&[".env"]);
let settings = build_settings(&[".env"]);
assert!(!settings.enabled_for_file(&dot_env_file, &cx));
// Test tilde expansion
let home = shellexpand::tilde("~").into_owned().to_string();
let home_file = make_test_file(&[&home, "test.rs"]);
let settings = build_settings(&["~/test.rs"]);
assert!(!settings.enabled_for_file(&home_file, &cx));
}
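
The expansion this hunk adds or removes is ordinary tilde expansion ahead of glob compilation. A minimal sketch (shellexpand and globset crates assumed), returning the matcher together with the absolute-path flag that DisabledGlob tracks:

// A settings glob such as "~/work/**/*.env" expands to an absolute pattern
// under the user's home directory before being compiled.
fn compile_disabled_glob(pattern: &str) -> Option<(globset::GlobMatcher, bool)> {
    let expanded = shellexpand::tilde(pattern).into_owned();
    Some((
        globset::Glob::new(&expanded).ok()?.compile_matcher(),
        std::path::Path::new(&expanded).is_absolute(),
    ))
}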
#[test]


@@ -1,7 +1,8 @@
use std::{borrow::Borrow, path::Path, sync::Arc};
use gpui::SharedString;
use settings::WorktreeId;
use crate::LspAdapterDelegate;
#[derive(Clone, Debug, PartialEq, Eq, Hash, PartialOrd, Ord)]
pub struct ManifestName(SharedString);
@@ -38,15 +39,10 @@ pub struct ManifestQuery {
/// Path to the file, relative to worktree root.
pub path: Arc<Path>,
pub depth: usize,
pub delegate: Arc<dyn ManifestDelegate>,
pub delegate: Arc<dyn LspAdapterDelegate>,
}
pub trait ManifestProvider {
fn name(&self) -> ManifestName;
fn search(&self, query: ManifestQuery) -> Option<Arc<Path>>;
}
pub trait ManifestDelegate: Send + Sync {
fn worktree_id(&self) -> WorktreeId;
fn exists(&self, path: &Path, is_dir: Option<bool>) -> bool;
}


@@ -14,7 +14,7 @@ use collections::HashMap;
use gpui::{AsyncApp, SharedString};
use settings::WorktreeId;
use crate::{LanguageName, ManifestName};
use crate::LanguageName;
/// Represents a single toolchain.
#[derive(Clone, Debug)]
@@ -44,13 +44,10 @@ pub trait ToolchainLister: Send + Sync {
async fn list(
&self,
worktree_root: PathBuf,
subroot_relative_path: Option<Arc<Path>>,
project_env: Option<HashMap<String, String>>,
) -> ToolchainList;
// Returns a term which we should use in UI to refer to a toolchain.
fn term(&self) -> SharedString;
/// Returns the name of the manifest file for this toolchain.
fn manifest_name(&self) -> ManifestName;
}
#[async_trait(?Send)]


@@ -4,7 +4,6 @@ use client::{Client, UserStore, zed_urls};
use futures::{
AsyncBufReadExt, FutureExt, Stream, StreamExt, future::BoxFuture, stream::BoxStream,
};
use google_ai::GoogleModelMode;
use gpui::{
AnyElement, AnyView, App, AsyncApp, Context, Entity, SemanticVersion, Subscription, Task,
};
@@ -751,8 +750,7 @@ impl LanguageModel for CloudLanguageModel {
let client = self.client.clone();
let llm_api_token = self.llm_api_token.clone();
let model_id = self.model.id.to_string();
let generate_content_request =
into_google(request, model_id.clone(), GoogleModelMode::Default);
let generate_content_request = into_google(request, model_id.clone());
async move {
let http_client = &client.http_client();
let token = llm_api_token.acquire(&client).await?;
@@ -924,8 +922,7 @@ impl LanguageModel for CloudLanguageModel {
}
zed_llm_client::LanguageModelProvider::Google => {
let client = self.client.clone();
let request =
into_google(request, self.model.id.to_string(), GoogleModelMode::Default);
let request = into_google(request, self.model.id.to_string());
let llm_api_token = self.llm_api_token.clone();
let future = self.request_limiter.stream(async move {
let PerformLlmCompletionResponse {


@@ -1,8 +1,7 @@
use anyhow::{Context as _, Result, anyhow};
use collections::{BTreeMap, HashMap};
use collections::BTreeMap;
use credentials_provider::CredentialsProvider;
use editor::{Editor, EditorElement, EditorStyle};
use futures::Stream;
use futures::{FutureExt, StreamExt, future::BoxFuture, stream::BoxStream};
use gpui::{
AnyView, AppContext as _, AsyncApp, Entity, FontStyle, Subscription, Task, TextStyle,
@@ -13,14 +12,11 @@ use language_model::{
AuthenticateError, LanguageModel, LanguageModelCompletionError, LanguageModelCompletionEvent,
LanguageModelId, LanguageModelName, LanguageModelProvider, LanguageModelProviderId,
LanguageModelProviderName, LanguageModelProviderState, LanguageModelRequest,
LanguageModelToolChoice, LanguageModelToolResultContent, LanguageModelToolUse, MessageContent,
RateLimiter, Role, StopReason,
LanguageModelToolChoice, RateLimiter, Role,
};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsStore};
use std::pin::Pin;
use std::str::FromStr;
use std::sync::Arc;
use theme::ThemeSettings;
use ui::{Icon, IconName, List, prelude::*};
@@ -32,13 +28,6 @@ const PROVIDER_ID: &str = "deepseek";
const PROVIDER_NAME: &str = "DeepSeek";
const DEEPSEEK_API_KEY_VAR: &str = "DEEPSEEK_API_KEY";
#[derive(Default)]
struct RawToolCall {
id: String,
name: String,
arguments: String,
}
#[derive(Default, Clone, Debug, PartialEq)]
pub struct DeepSeekSettings {
pub api_url: String,
@@ -291,11 +280,11 @@ impl LanguageModel for DeepSeekLanguageModel {
}
fn supports_tools(&self) -> bool {
true
false
}
fn supports_tool_choice(&self, _choice: LanguageModelToolChoice) -> bool {
true
false
}
fn supports_images(&self) -> bool {
@@ -350,12 +339,35 @@ impl LanguageModel for DeepSeekLanguageModel {
BoxStream<'static, Result<LanguageModelCompletionEvent, LanguageModelCompletionError>>,
>,
> {
let request = into_deepseek(request, &self.model, self.max_output_tokens());
let request = into_deepseek(
request,
self.model.id().to_string(),
self.max_output_tokens(),
);
let stream = self.stream_completion(request, cx);
async move {
let mapper = DeepSeekEventMapper::new();
Ok(mapper.map_stream(stream.await?).boxed())
let stream = stream.await?;
Ok(stream
.map(|result| {
result
.and_then(|response| {
response
.choices
.first()
.context("Empty response")
.map(|choice| {
choice
.delta
.content
.clone()
.unwrap_or_default()
.map(LanguageModelCompletionEvent::Text)
})
})
.map_err(LanguageModelCompletionError::Other)
})
.boxed())
}
.boxed()
}
@@ -363,67 +375,69 @@ impl LanguageModel for DeepSeekLanguageModel {
pub fn into_deepseek(
request: LanguageModelRequest,
model: &deepseek::Model,
model: String,
max_output_tokens: Option<u32>,
) -> deepseek::Request {
let is_reasoner = *model == deepseek::Model::Reasoner;
let is_reasoner = model == "deepseek-reasoner";
let mut messages = Vec::new();
for message in request.messages {
for content in message.content {
match content {
MessageContent::Text(text) | MessageContent::Thinking { text, .. } => messages
.push(match message.role {
Role::User => deepseek::RequestMessage::User { content: text },
Role::Assistant => deepseek::RequestMessage::Assistant {
content: Some(text),
tool_calls: Vec::new(),
},
Role::System => deepseek::RequestMessage::System { content: text },
}),
MessageContent::RedactedThinking(_) => {}
MessageContent::Image(_) => {}
MessageContent::ToolUse(tool_use) => {
let tool_call = deepseek::ToolCall {
id: tool_use.id.to_string(),
content: deepseek::ToolCallContent::Function {
function: deepseek::FunctionContent {
name: tool_use.name.to_string(),
arguments: serde_json::to_string(&tool_use.input)
.unwrap_or_default(),
},
},
};
let len = request.messages.len();
let merged_messages =
request
.messages
.into_iter()
.fold(Vec::with_capacity(len), |mut acc, msg| {
let role = msg.role;
let content = msg.string_contents();
if let Some(deepseek::RequestMessage::Assistant { tool_calls, .. }) =
messages.last_mut()
{
tool_calls.push(tool_call);
} else {
messages.push(deepseek::RequestMessage::Assistant {
content: None,
tool_calls: vec![tool_call],
});
if is_reasoner {
if let Some(last_msg) = acc.last_mut() {
match (last_msg, role) {
(deepseek::RequestMessage::User { content: last }, Role::User) => {
last.push(' ');
last.push_str(&content);
return acc;
}
(
deepseek::RequestMessage::Assistant {
content: last_content,
..
},
Role::Assistant,
) => {
*last_content = last_content
.take()
.map(|c| {
let mut s =
String::with_capacity(c.len() + content.len() + 1);
s.push_str(&c);
s.push(' ');
s.push_str(&content);
s
})
.or(Some(content));
return acc;
}
_ => {}
}
}
}
MessageContent::ToolResult(tool_result) => {
match &tool_result.content {
LanguageModelToolResultContent::Text(text) => {
messages.push(deepseek::RequestMessage::Tool {
content: text.to_string(),
tool_call_id: tool_result.tool_use_id.to_string(),
});
}
LanguageModelToolResultContent::Image(_) => {}
};
}
}
}
}
acc.push(match role {
Role::User => deepseek::RequestMessage::User { content },
Role::Assistant => deepseek::RequestMessage::Assistant {
content: Some(content),
tool_calls: Vec::new(),
},
Role::System => deepseek::RequestMessage::System { content },
});
acc
});
deepseek::Request {
model: model.id().to_string(),
messages,
model,
messages: merged_messages,
stream: true,
max_tokens: max_output_tokens,
temperature: if is_reasoner {
@@ -446,103 +460,6 @@ pub fn into_deepseek(
}
}
pub struct DeepSeekEventMapper {
tool_calls_by_index: HashMap<usize, RawToolCall>,
}
impl DeepSeekEventMapper {
pub fn new() -> Self {
Self {
tool_calls_by_index: HashMap::default(),
}
}
pub fn map_stream(
mut self,
events: Pin<Box<dyn Send + Stream<Item = Result<deepseek::StreamResponse>>>>,
) -> impl Stream<Item = Result<LanguageModelCompletionEvent, LanguageModelCompletionError>>
{
events.flat_map(move |event| {
futures::stream::iter(match event {
Ok(event) => self.map_event(event),
Err(error) => vec![Err(LanguageModelCompletionError::Other(anyhow!(error)))],
})
})
}
pub fn map_event(
&mut self,
event: deepseek::StreamResponse,
) -> Vec<Result<LanguageModelCompletionEvent, LanguageModelCompletionError>> {
let Some(choice) = event.choices.first() else {
return vec![Err(LanguageModelCompletionError::Other(anyhow!(
"Response contained no choices"
)))];
};
let mut events = Vec::new();
if let Some(content) = choice.delta.content.clone() {
events.push(Ok(LanguageModelCompletionEvent::Text(content)));
}
if let Some(tool_calls) = choice.delta.tool_calls.as_ref() {
for tool_call in tool_calls {
let entry = self.tool_calls_by_index.entry(tool_call.index).or_default();
if let Some(tool_id) = tool_call.id.clone() {
entry.id = tool_id;
}
if let Some(function) = tool_call.function.as_ref() {
if let Some(name) = function.name.clone() {
entry.name = name;
}
if let Some(arguments) = function.arguments.clone() {
entry.arguments.push_str(&arguments);
}
}
}
}
match choice.finish_reason.as_deref() {
Some("stop") => {
events.push(Ok(LanguageModelCompletionEvent::Stop(StopReason::EndTurn)));
}
Some("tool_calls") => {
events.extend(self.tool_calls_by_index.drain().map(|(_, tool_call)| {
match serde_json::Value::from_str(&tool_call.arguments) {
Ok(input) => Ok(LanguageModelCompletionEvent::ToolUse(
LanguageModelToolUse {
id: tool_call.id.clone().into(),
name: tool_call.name.as_str().into(),
is_input_complete: true,
input,
raw_input: tool_call.arguments.clone(),
},
)),
Err(error) => Err(LanguageModelCompletionError::BadInputJson {
id: tool_call.id.into(),
tool_name: tool_call.name.as_str().into(),
raw_input: tool_call.arguments.into(),
json_parse_error: error.to_string(),
}),
}
}));
events.push(Ok(LanguageModelCompletionEvent::Stop(StopReason::ToolUse)));
}
Some(stop_reason) => {
log::error!("Unexpected DeepSeek stop_reason: {stop_reason:?}",);
events.push(Ok(LanguageModelCompletionEvent::Stop(StopReason::EndTurn)));
}
None => {}
}
events
}
}
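
The event mapper in this hunk accumulates streamed tool-call fragments by index before flushing them on a "tool_calls" finish reason. A runnable sketch of just that accumulation step, with simplified delta types:

use std::collections::HashMap;

#[derive(Default)]
struct RawToolCall {
    id: String,
    name: String,
    arguments: String, // JSON arrives in fragments and is concatenated
}

struct Delta {
    index: usize,
    id: Option<String>,
    name: Option<String>,
    arguments: Option<String>,
}

fn accumulate(calls: &mut HashMap<usize, RawToolCall>, delta: Delta) {
    let entry = calls.entry(delta.index).or_default();
    if let Some(id) = delta.id {
        entry.id = id;
    }
    if let Some(name) = delta.name {
        entry.name = name;
    }
    if let Some(arguments) = delta.arguments {
        entry.arguments.push_str(&arguments);
    }
}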
struct ConfigurationView {
api_key_editor: Entity<Editor>,
state: Entity<State>,


@@ -4,8 +4,7 @@ use credentials_provider::CredentialsProvider;
use editor::{Editor, EditorElement, EditorStyle};
use futures::{FutureExt, Stream, StreamExt, future::BoxFuture};
use google_ai::{
FunctionDeclaration, GenerateContentResponse, GoogleModelMode, Part, SystemInstruction,
ThinkingConfig, UsageMetadata,
FunctionDeclaration, GenerateContentResponse, Part, SystemInstruction, UsageMetadata,
};
use gpui::{
AnyView, App, AsyncApp, Context, Entity, FontStyle, Subscription, Task, TextStyle, WhiteSpace,
@@ -46,41 +45,11 @@ pub struct GoogleSettings {
pub available_models: Vec<AvailableModel>,
}
#[derive(Clone, Copy, Debug, Default, PartialEq, Serialize, Deserialize, JsonSchema)]
#[serde(tag = "type", rename_all = "lowercase")]
pub enum ModelMode {
#[default]
Default,
Thinking {
/// The maximum number of tokens to use for reasoning. Must be lower than the model's `max_output_tokens`.
budget_tokens: Option<u32>,
},
}
impl From<ModelMode> for GoogleModelMode {
fn from(value: ModelMode) -> Self {
match value {
ModelMode::Default => GoogleModelMode::Default,
ModelMode::Thinking { budget_tokens } => GoogleModelMode::Thinking { budget_tokens },
}
}
}
impl From<GoogleModelMode> for ModelMode {
fn from(value: GoogleModelMode) -> Self {
match value {
GoogleModelMode::Default => ModelMode::Default,
GoogleModelMode::Thinking { budget_tokens } => ModelMode::Thinking { budget_tokens },
}
}
}
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, JsonSchema)]
pub struct AvailableModel {
name: String,
display_name: Option<String>,
max_tokens: usize,
mode: Option<ModelMode>,
}
pub struct GoogleLanguageModelProvider {
@@ -247,7 +216,6 @@ impl LanguageModelProvider for GoogleLanguageModelProvider {
name: model.name.clone(),
display_name: model.display_name.clone(),
max_tokens: model.max_tokens,
mode: model.mode.unwrap_or_default().into(),
},
);
}
@@ -375,7 +343,7 @@ impl LanguageModel for GoogleLanguageModel {
cx: &App,
) -> BoxFuture<'static, Result<usize>> {
let model_id = self.model.id().to_string();
let request = into_google(request, model_id.clone(), self.model.mode());
let request = into_google(request, model_id.clone());
let http_client = self.http_client.clone();
let api_key = self.state.read(cx).api_key.clone();
@@ -411,7 +379,7 @@ impl LanguageModel for GoogleLanguageModel {
>,
>,
> {
let request = into_google(request, self.model.id().to_string(), self.model.mode());
let request = into_google(request, self.model.id().to_string());
let request = self.stream_completion(request, cx);
let future = self.request_limiter.stream(async move {
let response = request
@@ -426,7 +394,6 @@ impl LanguageModel for GoogleLanguageModel {
pub fn into_google(
mut request: LanguageModelRequest,
model_id: String,
mode: GoogleModelMode,
) -> google_ai::GenerateContentRequest {
fn map_content(content: Vec<MessageContent>) -> Vec<Part> {
content
@@ -537,12 +504,6 @@ pub fn into_google(
stop_sequences: Some(request.stop),
max_output_tokens: None,
temperature: request.temperature.map(|t| t as f64).or(Some(1.0)),
thinking_config: match mode {
GoogleModelMode::Thinking { budget_tokens } => {
budget_tokens.map(|thinking_budget| ThinkingConfig { thinking_budget })
}
GoogleModelMode::Default => None,
},
top_p: None,
top_k: None,
}),
@@ -659,7 +620,6 @@ impl GoogleEventMapper {
)));
}
Part::FunctionResponsePart(_) => {}
Part::ThoughtPart(_) => {}
});
}
}
@@ -725,15 +685,10 @@ fn update_usage(usage: &mut UsageMetadata, new: &UsageMetadata) {
}
fn convert_usage(usage: &UsageMetadata) -> language_model::TokenUsage {
let prompt_tokens = usage.prompt_token_count.unwrap_or(0) as u32;
let cached_tokens = usage.cached_content_token_count.unwrap_or(0) as u32;
let input_tokens = prompt_tokens - cached_tokens;
let output_tokens = usage.candidates_token_count.unwrap_or(0) as u32;
language_model::TokenUsage {
input_tokens,
output_tokens,
cache_read_input_tokens: cached_tokens,
input_tokens: usage.prompt_token_count.unwrap_or(0) as u32,
output_tokens: usage.candidates_token_count.unwrap_or(0) as u32,
cache_read_input_tokens: usage.cached_content_token_count.unwrap_or(0) as u32,
cache_creation_input_tokens: 0,
}
}
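
The two variants of convert_usage in this hunk differ only in whether cached tokens are subtracted from the prompt count. A worked sketch: with prompt_token_count = 1200 and cached_content_token_count = 1000, the net variant reports input_tokens = 200 while the raw variant reports 1200; both report cache_read_input_tokens = 1000.

struct TokenUsage {
    input_tokens: u32,
    output_tokens: u32,
    cache_read_input_tokens: u32,
}

// Net variant: cache reads are carved out of the prompt count. The diff
// subtracts directly; saturating_sub here only guards the sketch against
// cached > prompt.
fn convert_net(prompt: u32, cached: u32, output: u32) -> TokenUsage {
    TokenUsage {
        input_tokens: prompt.saturating_sub(cached),
        output_tokens: output,
        cache_read_input_tokens: cached,
    }
}

// Raw variant: the prompt count is reported as-is.
fn convert_raw(prompt: u32, cached: u32, output: u32) -> TokenUsage {
    TokenUsage {
        input_tokens: prompt,
        output_tokens: output,
        cache_read_input_tokens: cached,
    }
}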

Some files were not shown because too many files have changed in this diff.