Compare commits

..

123 Commits

Author SHA1 Message Date
Miguel Raz Guzmán Macedo
fc9c3456e7 draft: Add git commit --cleanup=strip/--cleanup=whitespace option 2025-11-20 11:49:30 -06:00
Finn Evers
61f512af03 Move protobuf action to default linux runner (#43085)
Release Notes:

- N/A

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-20 12:48:39 +00:00
Arun Chapagain
dd5482a899 docs: Update developing extension docs for updating specific submodule (#42548)
Release Notes:

- N/A
2025-11-20 13:33:13 +01:00
Bhuminjay Soni
9094eb811b git: Compress diff for commit message generation (#42835)
This PR compresses diffs to keep them within a 20000-byte cap (rough sketch below) by:
- Truncating all lines to 256 chars
- Iteratively removing the last hunks from each file until the size is <= 20000 bytes
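
A minimal sketch of that two-step compression, assuming a simplified `FileDiff` representation (the names, the per-file minimum of one hunk, and the rendering here are illustrative, not the actual implementation):

```rust
const MAX_DIFF_BYTES: usize = 20_000;
const MAX_LINE_CHARS: usize = 256;

struct FileDiff {
    header: String,
    hunks: Vec<String>,
}

fn render(files: &[FileDiff]) -> String {
    files
        .iter()
        .map(|file| format!("{}\n{}", file.header, file.hunks.join("\n")))
        .collect::<Vec<_>>()
        .join("\n")
}

fn compress_diff(files: &mut Vec<FileDiff>) -> String {
    // Step 1: truncate every line to 256 characters.
    for file in files.iter_mut() {
        for hunk in file.hunks.iter_mut() {
            *hunk = hunk
                .lines()
                .map(|line| line.chars().take(MAX_LINE_CHARS).collect::<String>())
                .collect::<Vec<_>>()
                .join("\n");
        }
    }
    // Step 2: drop the last hunk of each file, round-robin, until the rendered
    // diff fits under the byte cap (keeping at least one hunk per file).
    while render(files).len() > MAX_DIFF_BYTES {
        let mut dropped_any = false;
        for file in files.iter_mut() {
            if file.hunks.len() > 1 {
                file.hunks.pop();
                dropped_any = true;
            }
        }
        if !dropped_any {
            break; // nothing left to drop without emptying a file
        }
    }
    render(files)
}
```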


Closes #34486

Release Notes:

- Improved: Compress large diffs for commit message generation (thanks
@11happy)

---------

Signed-off-by: 11happy <soni5happy@gmail.com>
Co-authored-by: Oleksiy Syvokon <oleksiy@zed.dev>
2025-11-20 11:30:34 +00:00
Lukas Wirth
29f9853978 svg_preview: Remove unnecessary dependency on editor (#43147)
Editor is a choke point in our compilation graph while also being a very
common crate that is being edited. So reducing things that depend on it
will generally improve compilation times for us.

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-20 12:18:50 +01:00
Aaron Saunders
1e45c99c80 Improve readability of files in the git changes panel (#41857)
Closes _unknown_

<img width="1212" height="463" alt="image"
src="https://github.com/user-attachments/assets/ec00fcf0-7eb9-4291-b1e2-66e014dc30ac"
/>


This PR places the file_name before the file_path so that the panel remains usable when it is slim, mirroring the behaviour of the file picker (cmd+P).

Release Notes:
-  Improved readability of files in the git changes panel
2025-11-20 06:14:46 -05:00
Danilo Leal
28f50977cf agent_ui: Add support for setting a model as the default for external agents (#43122)
This PR builds on top of the `default_mode` feature, which made it possible to set an external agent mode as the default by holding a modifier while clicking on the desired option. Now, if you want to have, for example, Haiku as your default Claude Code model, you can do that. This adds parity between external agents and Zed's built-in one, which has supported this for a while.

Note: This still doesn't work with external agents installed from
extensions. At the moment, this is limited to Claude Code, Codex, and
Gemini—the ones we include out of the box.

Release Notes:

- agent: Added the ability to set a model as the default for a given
built-in external agent (Claude Code, Codex CLI, or Gemini CLI).
2025-11-20 11:00:01 +01:00
Lukas Wirth
95cb467cd9 multi_buffer: Remove redundant TypedOffset/TypedPoint (#43139)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-20 09:31:18 +00:00
Miguel Raz Guzmán Macedo
681a56506b project_panel: Add CollapseAllEntries keybinding (#43112)
Motivated by user feature requests

* https://github.com/zed-industries/zed/issues/6880
* https://discord.com/channels/869392257814519848/1439453067119562793

In analogy with VSCode functionality, we're adding a keybinding to the
project panel.

This is particularly useful for large monorepos.

Release Notes:

- Added a keybinding for `CollapseAllEntries` in the project panel.

Co-authored-by: mikayla <mikayla@zed.dev>
2025-11-20 14:30:13 +05:30
Lukas Wirth
5052a460b4 vim: Fix increment panicking due to invalid utf8 offsets (#43101)
Fixes ZED-3ER

Release Notes:

- Fixed a panic when using vim increment on a multibyte character
2025-11-20 07:12:50 +00:00
Danilo Leal
66382acd52 ui: Remove outdated/unused component stories (#43118)
This PR removes basically all of the component stories, with the exception of the context menu, which is a bit more intricate to set up. All of the components that won't have a story after this PR will have an entry in the Component Preview, which serves basically the same purpose.

Release Notes:

- N/A
2025-11-20 01:52:13 -03:00
Danilo Leal
ba596267d8 ui: Remove the ToggleButton component (#43115)
This PR removes the old `ToggleButton` component, replacing it with the
newer `ToggleButtonGroup` component in the couple of places that used to
use it. Ended up also adding a few more methods to the newer toggle
button group so the UI for the extensions page and the debugger main
picker didn't get visually impacted much. Then, as I was already in the
extensions page, decided to bake in some reasonably small UI
improvements to it as well.

Release Notes:

- N/A
2025-11-20 01:28:25 -03:00
Finn Evers
1fab43d467 Allow styling the container of markdown elements (#43107)
Closes #43033

Release Notes:

- Fixed an issue where the padding on info popovers would overlay text when the content was scrollable.
2025-11-19 23:40:54 +00:00
Ben Kunkle
f2f40a5099 zeta2: Merge Sweep and Zeta2 Providers (#43097)
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-11-19 15:40:06 -08:00
Piotr Osiewicz
c70f2d16ad lsp_button: Do not surface language servers from different windows in current workspace (#42733)
This led to a problem where we'd have zombie entries in the LSP dropdown because they were treated as if they originated from an unknown worktree.

Closes #42077

Release Notes:

- Fixed LSP status list containing zombie entries for LSPs in other
windows

---------

Co-authored-by: HactarCE <6060305+HactarCE@users.noreply.github.com>
Co-authored-by: Kirill Bulatov <kirill@zed.dev>
2025-11-19 22:40:17 +00:00
Lukas Wirth
c98b2d6944 multi_buffer: Typed MultiBufferOffset (#42707)
This PR introduces a new `MultiBufferOffset` newtype wrapping a size. The goal of this is to make it clear at the type level when we are interacting with offsets of a multi buffer versus offsets of a language / text buffer. This improves readability quite a bit by making it clear what kind of offsets one is working with, while also reducing accidental bugs from using the wrong kind of offset with the wrong API.

This PR also uncovered two minor bugs due to that.

It does not yet introduce the MultiBufferPoint equivalent; that is for a follow-up PR.

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-19 22:00:58 +00:00
Cole Miller
5e21457f21 Fix panic in the git panel when toggling sort_by_path (#43074)
We call `entry_by_path` on the `bulk_staging` anchor entry at the beginning of `update_visible_entries`, but in the codepath where `sort_by_path` is toggled on or off, we clear entries without clearing `bulk_staging` or counts, causing that `entry_by_path` call to index out of bounds. Fixed by clearing `bulk_staging` as well.

Release Notes:

- N/A
2025-11-19 16:53:40 -05:00
Conrad Irwin
f312215e93 Potentially make zip test less flakey (#43099)
Authored-By: Claude

Release Notes:

- N/A
2025-11-19 14:50:32 -07:00
Jakub Konka
08692bb108 git: Clear pending ops for remote repos (#43098)
Release Notes:

- N/A
2025-11-19 22:47:51 +01:00
Max Brunsfeld
09e02a483a Allow running zeta evals against sweep (#43039)
This PR restructures the subcommands in `zeta-cli`, so that the
prediction engine (currently `zeta1` vs `zeta2`) is no longer the
highest order subcommand. Instead, there is just one layer of
subcommands: `eval`, `predict`, `context`, etc. Within these commands,
there are flags for using `zeta1`, `zeta2`, and now `sweep`.

Release Notes:

- N/A

---------

Co-authored-by: Ben Kunkle <ben@zed.dev>
Co-authored-by: Agus <agus@zed.dev>
2025-11-19 16:09:19 -05:00
David Kleingeld
68b87fc308 Use fixed calloop (#43081)
Calloop (used by our Linux executor) was running all futures regardless of how long they take. Unfortunately some of our futures are rather busy and take a while (>10ms).

Running all of them froze the editor for multiple seconds or even
minutes when opening a large project diff (git reset HEAD~2000 in
chromium for example).

Closes #ISSUE

Release Notes:

- N/A

---------

Co-authored-by: Jakub Konka <kubkon@jakubkonka.com>
2025-11-19 22:06:13 +01:00
Agus Zubiaga
ec220dcc05 sweep: Coalesce edits based on line distance rather than time (#43006)
Release Notes:

- N/A

---------

Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-11-19 20:32:14 +00:00
Joseph T. Lyons
b6c8c3f3d9 Remove migrated scripts (#43095)
These scripts have been migrated to:
https://github.com/zed-industries/release_notes

Release Notes:

- N/A
2025-11-19 20:28:40 +00:00
Smit Barmase
dccddf6f66 project_panel: Remove cmd-opt-. binding for hiding hidden files (#43091)
Lots of folks were triggering this by accident. Even though it’s the default in macOS Finder, it’s a good idea not to have it as the default for us.

Release Notes:

- Removed the default `cmd-opt-.` binding for toggling hidden files in
the Project Panel so it’s harder to hide them by accident.
2025-11-20 00:38:29 +05:30
Andrew Farkas
27cb01f4af Fix Helix mode search & selection (#42928)
This PR redoes the desired behavior changes of #41583 (reverted in
#42892) but less invasively

Closes #41125
Closes #41164

Release Notes:

- N/A

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-19 18:30:55 +00:00
Andrew Farkas
829be71061 Fix invalid Unicode in terms & conditions (#42906)
Closes #40210

Previously attempted in #40423 and #42756. Third time's the charm?

Release Notes:

- Fixed encoding error in terms & conditions displayed when installing
2025-11-19 13:00:35 -05:00
Finn Evers
2a2f5a9c7a Add callable workflow for extension repositories (#43082)
This starts the work on a workflow that can be invoked in extension CI
to test changes on extension repositories.

Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <agus@zed.dev>
Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-19 17:47:34 +00:00
Kirill Bulatov
97b429953e gpui: Do not render ligatures between different styled text runs (#43080)
An attempt to re-land https://github.com/zed-industries/zed/pull/41043
Part of https://github.com/zed-industries/zed/issues/5259 (as `>>>`
forms a ligature that we need to break into differently colored tokens)

Before:

<img width="301" height="86" alt="image"
src="https://github.com/user-attachments/assets/e710391a-b8ad-4343-8344-c86fc5cb86b6"
/>

and


https://github.com/user-attachments/assets/ae77ba64-ca50-4b5d-9ee4-a7d46fcaeb34


After:
<img width="1254" height="302" alt="image"
src="https://github.com/user-attachments/assets/7fd5dba5-d798-4153-acf2-e38a1cb712ae"
/>


When a certain combination of characters forms a ligature, it takes the color of the first character.
Even though the runs are already split by color and other properties, the underlying font system merges the runs together.

Attempts to modify color and other parameters unrelated to font size did not help on macOS, hence a somewhat odd approach was taken: runs get interleaved font sizes, alternating between normal and "normal + a tiny bit more".
This is the only option that helped split the ligatures, and it seems to render fine.

Release Notes:

- Fixed ligatures forming between different text kinds

---------

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-19 19:37:22 +02:00
John Tur
404ee53812 Fix Windows bundling (#43083)
The updated package from
https://github.com/zed-industries/zed/pull/43066 changed the paths of
these files in the nupkg.

Release Notes:

- N/A
2025-11-19 12:25:03 -05:00
localcc
17c30565fc Fix extension auto-install on first setup (#43078)
Release Notes:

- N/A
2025-11-19 16:41:39 +00:00
Lena
f05eef58c4 Stop the buggy stalebot for now (#43076)
Delay the stalebot runs until the end of the year since it's currently broken and leaves unhelpful comments on all the issues, including feature requests. Bad bot. Allegedly this bug will soon be gone (https://github.com/actions/stale/issues/1302), but it's too much work protecting issues from the bot until then.

Release Notes:

- N/A
2025-11-19 17:09:28 +01:00
Joseph T. Lyons
52716bacef Bump Zed to v0.215 (#43075)
Release Notes:

- N/A
2025-11-19 16:04:35 +00:00
Smit Barmase
79be5cbfe2 editor: Fix prepaint recursion when updating stale sizes (#42896)
The bug is in the `place_near` logic, specifically the `!row_block_types.contains_key(&(row - 1))` check. The problem isn’t really that condition itself; it’s that it relies on `row_block_types`, which does not take into account that when blocks resize, the start rows of subsequent blocks move up or down. Since `place_near` depends on this incorrect map, it ends up causing incorrect resize syncs to the block map, which then triggers more bad recursive calls. The reason it worked until now in most cases is that recursive resizes eventually stabilized things.

Before `place_near`, we never touched `row_block_types` during the first
prepaint pass because we knew it was based on outdated heights. Once all
heights are finalized, using it is fine.

The fix is to make sure `row_block_types` is accurate from the very first prepaint pass by keeping an offset whenever a block shrinks or expands. Ideally it should now take only one subsequent prepaint, but due to shrinking, new custom/diagnostics blocks might come into view from below, which needs further prepaint calls to resolve. Right now, tests pass after 2 subsequent prepaint calls; just to be safe, we have set the limit to 5.

<img width="500" alt="image"
src="https://github.com/user-attachments/assets/da3d32ff-5972-46d9-8597-b438e162552b"
/>

Release Notes:

- Fixed an issue where Zed would sometimes freeze while working with inline diagnostics.
2025-11-19 21:02:31 +05:30
Jakub Konka
a42676b6bb git: Put pending ops container out of snapshot (#43061)
This also fixes staging checkbox flickering.

Release Notes:

- Fixed staging checkbox flickering sporadically in the Git panel.
2025-11-19 14:56:10 +00:00
Ben Kunkle
39f8aefa8c zeta2: Improve context retrieval (#43014)
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...

Co-authored-by: Agus <agus@zed.dev>
Co-authored-by: Max <max@zed.dev>
2025-11-19 14:44:58 +00:00
Lukas Wirth
1c1dfba7e3 windows: Bundle fresher conpty.dll builds (#43066)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-19 15:11:37 +01:00
Finn Evers
2c4fd24d37 gpui: Restore last window close behavior on macOS (#43058)
Follow-up to https://github.com/zed-industries/zed/pull/42391

Release Notes:

- Fixed an issue where Zed did not respect the `on_last_window_closed`
setting on macOS
2025-11-19 13:22:29 +00:00
Lukas Wirth
3125e78904 windows: Bundle new conpty.dll/OpenConsole.exe and use it for local builds on x86_64 (#43059)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-19 13:12:57 +00:00
Vasyl Protsiv
74d61aad7f util: Fix zip extraction (#42714)
I was trying to use Zed for Rust debugging on Windows, but was getting this warning in the debugger console: "Could not initialize Python interpreter - some features will be unavailable (e.g. debug visualizers)."
As the warning suggests, this led to a bad debugging experience where variables were not visualized properly in the "Variables" panel.

After some investigation I found that the problem is that Zed silently failed to extract all files from the debug adapter package (https://github.com/vadimcn/codelldb/releases/download/v1.11.8/codelldb-win32-x64.vsix). In particular, the `python-lldb` folder was missing, which caused the warning.
The error occurred here:

cf7c64d77f/crates/util/src/archive.rs (L47)
And then gets ignored here:

cf7c64d77f/crates/dap/src/adapters.rs (L323-L326)
The simple fix is to update the `async_zip` crate to version 0.0.18, where this issue appears to be fixed. I also added logging instead of silently ignoring the error, as I believe that would have helped catch this earlier.

To reproduce the original issue you can try to follow these steps:
0. (Optional) Remove/rename old codelldb adapter at
`%localappdata%\Zed\debug_adapters\CodeLLDB`. Restart Zed.
1. Create a simple Rust project. Make sure you use gnu toolchain (target
`x86_64-pc-windows-gnu`)
```rust
fn world() -> String {
    "world".into()
}

fn main() {
    let w = world();
    println!("hello {}", w);
}
```

2. Put a breakpoint on line 7 (`println`)
3. In the command palette choose "debugger: start" and then select "run
*crate name*"

Screenshot before the fix:

<img width="893" height="411" alt="image"
src="https://github.com/user-attachments/assets/78097690-b55e-4989-bfa4-20452560f9fc"
/>


<details>
<summary>Console before the fix</summary>

```
Checking latest version of CodeLLDB...
Downloading from https://github.com/vadimcn/codelldb/releases/download/v1.11.8/codelldb-win32-x64.vsix...
Download complete
Could not initialize Python interpreter - some features will be unavailable (e.g. debug visualizers).
Console is in 'commands' mode, prefix expressions with '?'.
warning: (x86_64) D:\repro\target\x86_64-pc-windows-gnu\debug\repro.exe unable to locate separate debug file (dwo, dwp). Debugging will be degraded.
Launching: D:\repro\target\x86_64-pc-windows-gnu\debug\repro.exe
Launched process 13836 from 'D:\repro\target\x86_64-pc-windows-gnu\debug\repro.exe'
error: repro.exe [0x0000000000002074]: DIE has DW_AT_ranges(DW_FORM_sec_offset 0x000000000000001a) attribute, but range extraction failed (invalid range list offset 0x1a), please file a bug and attach the file at the start of this error message
error: repro.exe [0x000000000000208c]: DIE has DW_AT_ranges(DW_FORM_sec_offset 0x0000000000000025) attribute, but range extraction failed (invalid range list offset 0x25), please file a bug and attach the file at the start of this error message
error: repro.exe [0x00000000000020af]: DIE has DW_AT_ranges(DW_FORM_sec_offset 0x0000000000000030) attribute, but range extraction failed (invalid range list offset 0x30), please file a bug and attach the file at the start of this error message
error: repro.exe [0x00000000000020c4]: DIE has DW_AT_ranges(DW_FORM_sec_offset 0x000000000000003b) attribute, but range extraction failed (invalid range list offset 0x3b), please file a bug and attach the file at the start of this error message
error: repro.exe [0x00000000000020fc]: DIE has DW_AT_ranges(DW_FORM_sec_offset 0x0000000000000046) attribute, but range extraction failed (invalid range list offset 0x46), please file a bug and attach the file at the start of this error message
error: repro.exe [0x0000000000002130]: DIE has DW_AT_ranges(DW_FORM_sec_offset 0x0000000000000046) attribute, but range extraction failed (invalid range list offset 0x46), please file a bug and attach the file at the start of this error message
> ? w
< {...}
```
</details>

Screenshot after the fix:
<img width="634" height="295" alt="image"
src="https://github.com/user-attachments/assets/67e36a64-97d2-406c-9216-7ac5b01f4101"
/>

<details>
<summary>Console after the fix</summary>

```
Checking latest version of CodeLLDB...
Downloading from https://github.com/vadimcn/codelldb/releases/download/v1.11.8/codelldb-win32-x64.vsix...
Download complete
Console is in 'commands' mode, prefix expressions with '?'.
Loading Rust formatters from C:\Users\Vasyl\.rustup\toolchains\1.91.1-x86_64-pc-windows-msvc\lib/rustlib/etc
warning: (x86_64) D:\repro\target\x86_64-pc-windows-gnu\debug\repro.exe unable to locate separate debug file (dwo, dwp). Debugging will be degraded.
Launching: D:\repro\target\x86_64-pc-windows-gnu\debug\repro.exe
Launched process 10364 from 'D:\repro\target\x86_64-pc-windows-gnu\debug\repro.exe'
error: repro.exe [0x0000000000002074]: DIE has DW_AT_ranges(DW_FORM_sec_offset 0x000000000000001a) attribute, but range extraction failed (invalid range list offset 0x1a), please file a bug and attach the file at the start of this error message
error: repro.exe [0x000000000000208c]: DIE has DW_AT_ranges(DW_FORM_sec_offset 0x0000000000000025) attribute, but range extraction failed (invalid range list offset 0x25), please file a bug and attach the file at the start of this error message
error: repro.exe [0x00000000000020af]: DIE has DW_AT_ranges(DW_FORM_sec_offset 0x0000000000000030) attribute, but range extraction failed (invalid range list offset 0x30), please file a bug and attach the file at the start of this error message
error: repro.exe [0x00000000000020c4]: DIE has DW_AT_ranges(DW_FORM_sec_offset 0x000000000000003b) attribute, but range extraction failed (invalid range list offset 0x3b), please file a bug and attach the file at the start of this error message
error: repro.exe [0x00000000000020fc]: DIE has DW_AT_ranges(DW_FORM_sec_offset 0x0000000000000046) attribute, but range extraction failed (invalid range list offset 0x46), please file a bug and attach the file at the start of this error message
error: repro.exe [0x0000000000002130]: DIE has DW_AT_ranges(DW_FORM_sec_offset 0x0000000000000046) attribute, but range extraction failed (invalid range list offset 0x46), please file a bug and attach the file at the start of this error message
> ? w
< "world"
```
</details>

This fixes #33753

Release Notes:

- util: Fixed archive::extract_zip failing to extract some archives
2025-11-19 12:34:41 +01:00
Piotr Osiewicz
40dd4e2270 zeta: Add stats about context lines from patch that were retrieved during context retrieval (#43053)
A.K.A: Eval: Expect lines necessary to uniquely target every change in
"Expected Patch" to be included as context

Release Notes:

- N/A
2025-11-19 11:25:53 +00:00
xdBronch
9feb260216 lsp: Support deprecated completion item tag and advertise capability (#43000)
Release Notes:

- N/A
2025-11-19 12:19:58 +01:00
Lukas Wirth
5ccbe945a6 util: Check whether discovered powershell is actually executable (#43044)
Closes https://github.com/zed-industries/zed/issues/42944

The powershell we discovered might be in a directory with higher
permission requirements which will cause us to fail using it.

Release Notes:

- Fixed powershell discovery disregarding admin requirements
2025-11-19 09:26:49 +00:00
Bennet Bo Fenner
a910c594d6 agent_ui: Add mode_id to telemetry (#43045)
Release Notes:

- N/A
2025-11-19 08:49:56 +00:00
Mikayla Maki
19d2532cf8 Update google_ai.rs (#43034)
Release Notes:

- N/A
2025-11-19 05:41:24 +00:00
Cole Miller
785b81aa3a Revert "Fix track file renames in git panel (#42352)" (#43030)
This reverts commit b0a7defd09.

It looks like this doesn't interact correctly with the project diff or
with staging, let's revert and reland with bugs fixed.

Release Notes:

- N/A
2025-11-19 03:56:45 +00:00
Ben Heimberg
24c1617e74 git_ui: Dismiss pickers only on active window (#41320)
Small QOL improvement for the branch picker: it now only dismisses when focus is lost in the active window.
This can benefit those who need to switch windows mid branch creation to fetch the correct Jira ticket number, etc.

Added a `window.is_active_window()` guard to the `cancel` event in `picker.rs`

Release Notes:
- (Let's Git Together) Fixed a behavior where pickers would
automatically close upon the window becoming inactive.
2025-11-18 21:57:30 -05:00
Julia Ryan
1e2f15a3d7 Disable phpactor by default on windows (#43011)
We install phpactor by default, but on Windows it doesn't work out of the box (see [here](https://github.com/phpactor/phpactor/discussions/2579) for details). For now we'll default to using intelephense, but in the future we'd like to switch back if phpactor lands Windows support, given that it's open source.

Release Notes:

- N/A
2025-11-18 16:38:19 -08:00
Martin Bergo
7c0663b825 google_ai: Add gemini-3-pro-preview model (#43015)
Release Notes:

- Added the newly released Gemini 3 Pro Preview Model


https://docs.cloud.google.com/vertex-ai/generative-ai/docs/models/gemini/3-pro
2025-11-18 23:51:32 +00:00
Lukas Wirth
94a43dc73a extension_host: Fix IS_WASM_THREAD being set for wrong threads (#43005)
https://github.com/zed-industries/zed/pull/40883 implemented this
incorrectly. It was marking a random background thread as a wasm thread
(whatever thread picked up the wasm epoch timer background task),
instead of marking the threads that actually run the wasm extension.

This has two implications:
1. It didn't prevent extension panics from tearing Zed down as planned.
2. Worse, it actually made us hide legit panics in Sentry for one of our background workers.

Point 2 still technically applies to all tokio threads after this change, but we basically only use those for wasm extensions in the main Zed binary.

Release Notes:

- Fixed extension panics crashing Zed on Linux
2025-11-18 23:49:22 +00:00
Ben Kunkle
e8e0707256 zeta2: Improve queries parsing (#43012)
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...

---------

Co-authored-by: Agus <agus@zed.dev>
Co-authored-by: Max <max@zed.dev>
2025-11-18 23:46:29 +00:00
Tom Zaspel
d7c340c739 docs: Add documentation for OpenTofu support (#42448)
Closes -

Release Notes:

- N/A

Signed-off-by: Tom Zaspel <40226087+tzabbi@users.noreply.github.com>
2025-11-18 18:40:09 -05:00
Julia Ryan
16b24e892e Increase error verbosity (#43013)
Closes #42288

This will actually print the parsing error that prevented the vscode settings file from being loaded, which should make it easier for users to self-help when they have an invalid config.

Release Notes:

- N/A
2025-11-18 23:25:12 +00:00
Barani S
917148c5ce gpui: Use DWM API for backdrop effects and add Mica/Mica Alt support (#41842)
This PR updates window background rendering to use the **official DWM
backdrop API** (`DwmSetWindowAttribute`) instead of the legacy
`SetWindowCompositionAttribute`.
It also adds **Mica** and **Mica Alt** options to
`WindowBackgroundAppearance` for native Windows 11 effects.

### Motivation

Enables modern, stable, and GPU-accelerated backdrops consistent with
Windows 11’s Fluent Design.
Removes reliance on undocumented APIs while maintaining backward
compatibility with older Windows versions.

### Changes

* Added `MicaBackdrop` and `MicaAltBackdrop` variants.
* Switched to DWM API for applying backdrop effects (rough sketch below).
* Verified fallback behavior on Windows 10.
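
For reference, the Win32 call involved looks roughly like this; a hedged sketch using the `windows` crate, where the module paths and the mapping from `WindowBackgroundAppearance` are assumptions on my part, not the PR's actual code:

```rust
// Sketch only: binding paths follow the `windows` crate layout as I understand it.
use windows::core::Result;
use windows::Win32::Foundation::HWND;
use windows::Win32::Graphics::Dwm::{
    DwmSetWindowAttribute, DWMSBT_MAINWINDOW, DWMSBT_NONE, DWMSBT_TABBEDWINDOW,
    DWMWA_SYSTEMBACKDROP_TYPE, DWM_SYSTEMBACKDROP_TYPE,
};

enum Backdrop {
    None,
    Mica,    // stands in for WindowBackgroundAppearance::MicaBackdrop
    MicaAlt, // stands in for WindowBackgroundAppearance::MicaAltBackdrop
}

fn apply_backdrop(hwnd: HWND, backdrop: Backdrop) -> Result<()> {
    let kind: DWM_SYSTEMBACKDROP_TYPE = match backdrop {
        Backdrop::None => DWMSBT_NONE,
        Backdrop::Mica => DWMSBT_MAINWINDOW,      // Windows 11 Mica
        Backdrop::MicaAlt => DWMSBT_TABBEDWINDOW, // Windows 11 Mica Alt
    };
    unsafe {
        DwmSetWindowAttribute(
            hwnd,
            DWMWA_SYSTEMBACKDROP_TYPE,
            &kind as *const _ as *const core::ffi::c_void,
            std::mem::size_of::<DWM_SYSTEMBACKDROP_TYPE>() as u32,
        )
    }
}
```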

### Release Notes:

- Added `WindowBackgroundAppearance::MicaBackdrop` and
`WindowBackgroundAppearance::MicaAltBackdrop` for Windows 11 Mica and
Mica Alt window backdrops.

### Screenshots

- `WindowBackgroundAppearance::Blurred`
<img width="553" height="354" alt="image"
src="https://github.com/user-attachments/assets/57c9c25d-9412-4141-94b5-00000cc0b1ec"
/>

- `WindowBackgroundAppearance::MicaBackdrop`
<img width="553" height="354" alt="image"
src="https://github.com/user-attachments/assets/019f541c-3335-4c9e-b026-71f5a1786534"
/>

- `WindowBackgroundAppearance::MicaAltBackdrop`
<img width="553" height="354" alt="image"
src="https://github.com/user-attachments/assets/5128d600-c94d-4c89-b81a-8b842fe1337a"
/>

---------

Co-authored-by: John Tur <john-tur@outlook.com>
2025-11-18 18:20:32 -05:00
Piotr Osiewicz
951132fc13 chore: Fix build graph - again (#42999)
11.3s -> 10.0s for silly stuff like extracting actions from crates. The project panel still depends on git_ui, though.

Release Notes:

- N/A
2025-11-18 19:20:34 +01:00
Ben Kunkle
bf0dd4057c zeta2: Make new_text/old_text parsing more robust (#42997)
Closes #ISSUE

The model often uses the wrong closing tag, or has spaces around the closing tag name. This PR makes it so that opening tags are treated as authoritative, and any closing tag with the name `new_text`, `old_text`, or `edits` is accepted based on depth. This has the additional benefit that the parsing is more robust with contents that contain `new_text`, `old_text`, or `edits`. I.e. the following test passes:

```rust
    #[test]
    fn test_extract_xml_edits_with_conflicting_content() {
        let input = indoc! {r#"
            <edits path="component.tsx">
            <old_text>
            <new_text></new_text>
            </old_text>
            <new_text>
            <old_text></old_text>
            </new_text>
            </edits>
        "#};

        let result = extract_xml_replacements(input).unwrap();
        assert_eq!(result.file_path, "component.tsx");
        assert_eq!(result.replacements.len(), 1);
        assert_eq!(result.replacements[0].0, "<new_text></new_text>");
        assert_eq!(result.replacements[0].1, "<old_text></old_text>");
    }
```

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-18 12:36:37 -05:00
Conrad Irwin
3c4ca3f372 Remove settings::Maybe (#42933)
It's unclear how this would ever be useful

cc @probably-neb

Release Notes:

- N/A
2025-11-18 10:23:16 -07:00
Artur Shirokov
03132921c7 Add HTTP transport support for MCP servers (#39021)
### What this solves

This PR adds support for HTTP and SSE (Server-Sent Events) transports to
Zed's context server implementation, enabling communication with remote
MCP servers. Currently, Zed only supports local MCP servers via stdio
transport. This limitation prevents users from:

- Connecting to cloud-hosted MCP servers
- Using MCP servers running in containers or on remote machines
- Leveraging MCP servers that are designed to work over HTTP/SSE

### Why it's important

The MCP (Model Context Protocol) specification includes HTTP/SSE as
standard transport options, and many MCP server implementations are
being built with these transports in mind. Without this support, Zed
users are limited to a subset of the MCP ecosystem. This is particularly
important for:

- Enterprise users who need to connect to centralized MCP services
- Developers working with MCP servers that require network isolation
- Users wanting to leverage cloud-based context providers (e.g.,
knowledge bases, API integrations)

### Implementation approach

The implementation follows Zed's existing architectural patterns:

- **Transports**: Added `HttpTransport` and `SseTransport` to the
`context_server` crate, built on top of the existing `http_client` crate
- **Async handling**: Uses `gpui::spawn` for network operations instead
of introducing a new Tokio runtime
- **Settings**: Extended `ContextServerSettings` enum with a `Remote`
variant to support URL-based configuration
- **UI**: Updated the agent configuration UI with an "Add Remote Server"
option and dedicated modal for remote server management

### Changes included

- [x] HTTP transport implementation with request/response handling
- [x] SSE transport for server-sent events streaming
- [x] `build_transport` function to construct the appropriate transport based on the URL scheme (sketched below)
- [x] Settings system updates to support remote server configuration
- [x] UI updates for adding/editing remote servers
- [x] Unit tests using `FakeHttpClient` for both transports
- [x] Integration tests (WIP)
- [x] Documentation updates (WIP)
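
A rough sketch of the scheme-based dispatch that `build_transport` performs; `HttpTransport` and `SseTransport` mirror the PR's description, but the `Transport` enum, the signature, the `prefer_sse` flag, and the error handling are assumptions of mine:

```rust
use anyhow::{bail, Result};
use url::Url;

// Illustrative stand-ins; the real transports live in the `context_server` crate.
struct HttpTransport { endpoint: Url }
struct SseTransport { endpoint: Url }

enum Transport {
    Http(HttpTransport),
    Sse(SseTransport),
}

/// Pick a transport for a configured remote MCP server. How an SSE endpoint is
/// told apart from a plain HTTP one is assumed here; the PR only states that
/// the choice is made from the URL scheme.
fn build_transport(raw_url: &str, prefer_sse: bool) -> Result<Transport> {
    let url = Url::parse(raw_url)?;
    match url.scheme() {
        "http" | "https" if prefer_sse => Ok(Transport::Sse(SseTransport { endpoint: url })),
        "http" | "https" => Ok(Transport::Http(HttpTransport { endpoint: url })),
        other => bail!("unsupported MCP transport scheme: {other}"),
    }
}
```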

### Testing

- Unit tests for both `HttpTransport` and `SseTransport` using mocked
HTTP client
- Manual testing with example MCP servers over HTTP/SSE
- Settings validation and UI interaction testing

### Screenshots/Recordings

[TODO: Add screenshots of the new "Add Remote Server" UI and
configuration modal]

### Example configuration

Users can now configure remote MCP servers in their `settings.json`:

```json
{
  "context_servers": {
    "my-remote-server": {
      "enabled": true,
      "url": "http://localhost:3000/mcp"
    }
  }
}
```

### AI assistance disclosure

I used AI to help with:

- Understanding the MCP protocol specification and how HTTP/SSE
transports should work
- Reviewing Zed's existing patterns for async operations and suggesting
consistent approaches
- Generating boilerplate for test cases
- Debugging SSE streaming issues

All code has been manually reviewed, tested, and adapted to fit Zed's
architecture. The core logic, architectural decisions, and integration
with Zed's systems were done with human understanding of the codebase.
AI was primarily used as a reference tool and for getting unstuck on
specific technical issues.

Release notes:
* You can now configure MCP Servers that connect over HTTP in your
settings file. These are not yet available in the extensions API.
  ```
  {
    "context_servers": {
      "my-remote-server": {
        "enabled": true,
        "url": "http://localhost:3000/mcp"
      }
    }
  }
  ```

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-18 16:39:08 +00:00
Richard Feldman
c0fadae881 Thought signatures (#42915)
Implement Gemini API's [thought
signatures](https://ai.google.dev/gemini-api/docs/thinking#signatures)

Release Notes:

- Added thought signatures for Gemini tool calls
2025-11-18 10:41:19 -05:00
Agus Zubiaga
1c66c3991d Enable sweep flag for staff (#42987)
Release Notes:

- N/A
2025-11-18 15:39:27 +00:00
Agus Zubiaga
7e591a7e9a Fix sweep icon spacing (#42986)
Release Notes:

- N/A
2025-11-18 15:33:03 +00:00
Danilo Leal
c44d93745a agent_ui: Improve the modal to add LLM providers (#42983)
Closes https://github.com/zed-industries/zed/issues/42807

This PR makes the modal to add LLM providers a bit better to interact
with:

1. Added a scrollbar
2. Made the inputs navigable with tab
3. Added some responsiveness to ensure it resizes on shorter windows


https://github.com/user-attachments/assets/758ea5f0-6bcc-4a2b-87ea-114982f37caf

Release Notes:

- agent: Improved the modal to add LLM providers by making it responsive
and keyboard navigable.
2025-11-18 12:28:14 -03:00
Lukas Wirth
b4e4e0d3ac remote: Fix up incorrect logs (#42979)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-18 15:14:52 +00:00
Lukas Wirth
097024d46f util: Use process spawn helpers in more places (#42976)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-18 14:31:39 +00:00
Ben Brandt
f1c2afdee0 Update codex docs to include configuration for third-party providers (#42973)
Release Notes:

- N/A
2025-11-18 13:50:59 +00:00
Jakub Konka
ea120dfe18 Revert "git: Remove JobStatus from PendingOp in favour of in-flight pruning (#42955)" (#42970)

This reverts commit 696fdd8fed.

Release Notes:

- N/A
2025-11-18 13:30:40 +00:00
Lukas Wirth
d2988ffc77 vim: Fix snapshot out of bounds indexing (#42969)
Fixes ZED-38X

Release Notes:

- N/A
2025-11-18 13:02:40 +00:00
Engin Açıkgöz
f17d2c92b6 terminal_view: Fix terminal opening in root directory when editing single file worktree (#42953)
Fixes #42945

## Problem
When opening a single file via command line (e.g., `zed
~/Downloads/file.txt`), the terminal panel was opening in the root
directory (/) instead of the file's directory.

## Root Cause
The code only checked for the active project directory, which returns None when a single file is opened. Additionally, file worktrees weren't handling parent directory lookup.

## Solution
Added fallback logic to use the first project directory when there's no
active entry, and made file worktrees return their parent directory
instead of None.
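
A hedged sketch of that fallback order (the function and parameter names are placeholders, not the actual `terminal_view` code):

```rust
use std::path::PathBuf;

// Resolve the directory a new terminal should open in.
fn terminal_working_directory(
    active_entry_dir: Option<PathBuf>,  // directory of the active project entry, if any
    first_project_dir: Option<PathBuf>, // first project directory, if any
    single_file_path: Option<PathBuf>,  // path of a single-file worktree, if any
) -> Option<PathBuf> {
    active_entry_dir
        // No active entry (e.g. `zed ~/Downloads/file.txt`): fall back to the
        // first project directory.
        .or(first_project_dir)
        // For a single-file worktree, use the file's parent directory rather
        // than returning None, which previously landed the terminal in `/`.
        .or_else(|| single_file_path.and_then(|file| file.parent().map(PathBuf::from)))
}
```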

## Testing
- All existing tests pass
- Added test coverage for file worktree scenarios
- Manually tested with `zed ~/Downloads/file.txt` - terminal now opens
in correct directory

This improves the user experience for users who frequently open single
files from the command line.

## Release Notes

- Fixed terminal opening in root directory when editing single files
from the command line
2025-11-18 13:37:48 +01:00
Antonio Scandurra
c1d9dc369c Try reducing flakiness of fs-event tests by bumping timeout to 4s on CI (#42960)
Release Notes:

- N/A
2025-11-18 11:00:02 +00:00
Jakub Konka
696fdd8fed git: Remove JobStatus from PendingOp in favour of in-flight pruning (#42955)
The idea is that we only store running (`!self.finished`) or finished (`self.finished`) pending ops, while everything else (skipped or errored jobs) is pruned out immediately. We don't really need them in the grand scheme of things anyway.
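
In other words, the status enum goes away and anything that isn't running or finished never sticks around; a minimal sketch, assuming a simplified `PendingOp` shape:

```rust
// Assumed shape after the change: a pending op only records whether it finished.
struct PendingOp {
    job_id: u64,
    finished: bool,
}

enum JobOutcome {
    Finished,
    Skipped,
    Errored,
}

fn on_job_done(ops: &mut Vec<PendingOp>, job_id: u64, outcome: JobOutcome) {
    match outcome {
        // Keep the op and mark it finished.
        JobOutcome::Finished => {
            if let Some(op) = ops.iter_mut().find(|op| op.job_id == job_id) {
                op.finished = true;
            }
        }
        // Skipped/errored jobs aren't needed later, so prune them right away.
        JobOutcome::Skipped | JobOutcome::Errored => {
            ops.retain(|op| op.job_id != job_id);
        }
    }
}
```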

Release Notes:

- N/A
2025-11-18 10:22:34 +00:00
Lena
980f8bff2a Add a github issue label to shoo the stalebot away (#42950)
Labeling an issue with "never stale" will keep the stalebot away; the
bot can get annoying in some situations otherwise.

Release Notes:

- N/A
2025-11-18 10:15:09 +01:00
Kirill Bulatov
2a3bcbfe0f Properly check chunk version on lsp store update (#42951)
Release Notes:

- N/A

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-18 09:13:32 +00:00
aleanon
5225a84aff For and await highlighting rust (#42924)
Closes #42922

Release Notes:

- Fixed: The 'for' keyword in Rust is now highlighted as keyword.control only in for loops.
- Fixed: The 'await' keyword in Rust is now highlighted as keyword.control.
2025-11-18 09:11:36 +01:00
Max Brunsfeld
5c70f8391f Fix panic when using sweep AI without token env var (#42940)
Release Notes:

- N/A
2025-11-17 22:23:14 -08:00
Danilo Leal
10efbd5eb4 agent_ui: Show the "new thread" keybinding for the currently active agent (#42939)
This PR's goal is to improve discoverability of how Zed "remembers" the
currently selected agent when hitting `cmd-n` (or `ctrl-n`). Hitting
that binding starts a new thread with whatever agent is currently
selected.

In the example below, I am in a Claude Code thread and if I hit `cmd-n`,
a new, fresh CC thread will be started:

<img width="500" height="822" alt="Screenshot 2025-11-18 at 1  13@2x"
src="https://github.com/user-attachments/assets/d3acd1aa-459d-4078-9b62-bbac3b8c1600"
/>


Release Notes:

- agent: Improved discoverability of the `cmd-n` keybinding to create a
new thread with the currently selected agent.
2025-11-18 01:53:37 -03:00
Agus Zubiaga
0386f240a9 Add experimental Sweep edit prediction provider (#42927)
Only for staff

Release Notes:

- N/A

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-11-18 01:00:26 +00:00
Conrad Irwin
a39ba03bcc Use metrics-id for sentry user id when we have it (#42931)
This should make it easier to correlate Sentry reports with user reports and GitHub issues (for users who have diagnostics enabled).

Release Notes:

- N/A
2025-11-17 23:44:59 +00:00
Lukas Wirth
2c7bcfcb7b multi_buffer: Work around another panic bug in path_key (#42920)
Fixes ZED-346 for now until I find the time to dig into this bug
properly

Release Notes:

- Fixed a panic in the diagnostics pane
2025-11-17 22:38:48 +00:00
Lukas Wirth
6bea23e990 text: Temporarily remove assert_char_boundary panics (#42919)
As discussed in the first responders meeting: we have collected a lot of backtraces from these, but it's not quite clear yet what causes this. Removing these should ideally make things a bit more stable, even if we may run into panics later on when the faulty anchor is still used.

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-17 22:20:45 +00:00
Julia Ryan
98da1ea169 Fix remote extension syncing (#42918)
Closes #40906
Closes #39729

SFTP uploads weren't quoting the install directory, which was causing extension syncing to fail. We were also only running `install_extension` once per remote connection instead of once per project (thx @feeiyu for pointing this out), so extensions weren't being loaded in subsequently opened remote projects.

Release Notes:

- N/A

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-17 16:13:58 -06:00
Danilo Leal
98a83b47e6 agent_ui: Make input fields in Bedrock settings keyboard navigable (#42916)
Closes https://github.com/zed-industries/zed/issues/36587

This PR enables jumping from one input to the other, in the Bedrock
settings section, with tab.

Release Notes:

- N/A
2025-11-17 19:13:15 -03:00
Danilo Leal
5f356d04ff agent_ui: Fix model name label truncation (#42921)
Closes https://github.com/zed-industries/zed/issues/32739

Release Notes:

- agent: Fixed an issue where the label for model names wouldn't use all
the available space in the model picker.
2025-11-17 19:12:26 -03:00
Marshall Bowers
73d3f9611e collab: Add external_id column to billing_customers table (#42923)
This PR adds an `external_id` column to the `billing_customers` table.

Release Notes:

- N/A
2025-11-17 22:12:00 +00:00
Finn Evers
d9cfc2c883 Fix formatting in various files (#42917)
This fixes various issues where rustfmt failed to format code due to overly long strings, most of which I stumbled across over the last week, plus some additional ones I searched for whilst fixing the others.

Release Notes:

- N/A
2025-11-17 21:48:09 +00:00
Dino
ee420d530e vim: Change approach to fixing vim's temporary mode bug (#42894)
The `Vim.exit_temporary_normal` method had been updated (https://github.com/zed-industries/zed/pull/42742) to expect an `Option<&Motion>` that would then be used to determine whether to move the cursor right in case the motion was `Some(EndOfLine { .. })`. Unfortunately this meant that all callers now had to provide this argument, even if just `None`.

After merging those changes I remembered that we could probably play around with `clip_at_line_ends`, so this commit removes those initial changes in favor of updating the `vim::normal::Vim.move_cursor` method so that, if vim is in temporary mode and `EndOfLine` is used, it disables clipping at line ends so that the newline character can be selected.

Closes [#42278](https://github.com/zed-industries/zed/issues/42278)

Release Notes:

- N/A
2025-11-17 21:34:37 +00:00
Miguel Raz Guzmán Macedo
d801d0950e Add @miguelraz to reviewers and support sections (#42904)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-17 15:22:34 -06:00
Lukas Wirth
3f25d36b3c agent_ui: Fix text pasting no longer working (#42914)
Regressed in https://github.com/zed-industries/zed/pull/42908
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-17 21:04:50 +00:00
Joseph T. Lyons
f015368586 Update top-ranking issues script (#42911)
- Added Windows category
- Removed unused import
- Fixed a type error reported by `ty`

Release Notes:

- N/A
2025-11-17 15:20:22 -05:00
Ben Kunkle
4bf3b9d62e zeta2: Output bucketed_analysis.md (#42890)
Closes #ISSUE

Makes it so that a file named `bucketed_analysis.md` is written to the runs directory after an eval is run with > 1 repetitions. This file buckets the predictions made by the model by comparing the edits, so that seeing how many times different failure modes were encountered becomes much easier.

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-17 15:17:39 -05:00
Lukas Wirth
599a217ea5 workspace: Fix logging of errors in prompt_err (#42908)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-17 20:39:50 +01:00
ozzy
b0a7defd09 Fix track file renames in git panel (#42352)
Closes #30549

Release Notes:

- Fixed: Git renames now properly show as renamed files in the git panel
instead of appearing as deleted + untracked files
<img width="351" height="132" alt="Screenshot 2025-11-10 at 17 39 44"
src="https://github.com/user-attachments/assets/80e9c286-1abd-4498-a7d5-bd21633e6597"
/>
<img width="500" height="95" alt="Screenshot 2025-11-10 at 17 39 55"
src="https://github.com/user-attachments/assets/e4c59796-df3a-4d12-96f4-e6706b13a32f"
/>
2025-11-17 13:25:51 -06:00
Davis Vaughan
57e3bcfcf8 Revise R documentation - about Air in particular (#42755)
Returning the favor from @rgbkrk in
https://github.com/posit-dev/air/pull/445

I noticed the R docs around Air are a bit incorrect / out of date. I'll
make a few more comments inline. Feel free to take over for any other
edits.

Release Notes:

- Improved R language support documentation
2025-11-17 11:53:42 -07:00
Oleksiy Syvokon
b2f561165f zeta2: Support qwen3-minimal prompt format (#42902)
This prompt is for a fine-tuned model. It has the following changes,
compared to `minimal`:
- No instructions at all, except for one sentence at the beginning of
the prompt.
- Output is a simplified unified diff -- hunk headers have no line
counts (e.g., `@@ -20 +20 @@`)
- Qwen's FIM tokens are used where possible (`<|file_sep|>`,
`<|fim_prefix|>`, `<|fim_suffix|>`, etc.)

To evaluate this model:
```
ZED_ZETA2_MODEL=zeta2-exp [usual zeta-cli eval params ...]  --prompt-format minimal-qwen
```

This will point to the most recent Baseten deployment of zeta2-exp
(which may change in the future, so the prompt-format may get out of
sync).

Release Notes:

- N/A
2025-11-17 20:36:05 +02:00
localcc
fd1494c31a Fix remote server completions not being queried from all LSP servers (#42723)
Closes #41294

Release Notes:

- Fixed remote LSPs not being queried
2025-11-17 18:07:49 +00:00
Danilo Leal
faa1136651 agent_ui: Don't create a new terminal when hitting the new thread binding from the terminal (#42898)
Closes https://github.com/zed-industries/zed/issues/32701

Release Notes:

- agent: Fixed a bug where hitting the `NewThread` keybinding when
focused inside a terminal within the agent panel would create a new
terminal tab instead of a new thread.
2025-11-17 15:05:38 -03:00
Conrad Irwin
6bf5e92a25 Revert "Keep selection in SwitchToHelixNormalMode (#41583)" (#42892)
Closes #ISSUE

Release Notes:

- Fixes vim "go to definition" making a selection
2025-11-17 11:01:34 -07:00
Lukas Wirth
46ad6c0bbb ci: Remove remaining nextest compiles (#42630)
Follow up to https://github.com/zed-industries/zed/pull/42556

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-17 17:59:48 +00:00
Lukas Wirth
671500de1b agent_ui: Fix images copied from win explorer not being pastable (#42858)
Closes https://github.com/zed-industries/zed/issues/41505

A bit adhoc but it gets the job done for now

Release Notes:

- Fixed images copied from windows explorer not being pastable in the
agent panel
2025-11-17 17:58:22 +00:00
Kirill Bulatov
0519c645fb Deduplicate inlays when getting those from multiple language servers (#42899)
Part of https://github.com/zed-industries/zed/issues/42671

Release Notes:

- Deduplicate inlay hints from different language servers
2025-11-17 17:53:05 +00:00
Richard Feldman
23872b0523 Fix stale edits (#42895)
Closes #34069

<img width="532" height="880" alt="Screenshot 2025-11-17 at 11 14 19 AM"
src="https://github.com/user-attachments/assets/abc50c32-d54d-4310-a6e6-83008db7ed81"
/>

<img width="525" height="863" alt="Screenshot 2025-11-17 at 12 22 50 PM"
src="https://github.com/user-attachments/assets/15a69792-c2c7-4727-add9-c1f9baa5e665"
/>

Release Notes:

- Agent file edits now error if the file has changed since last read
(allowing the agent to read changes and avoid overwriting changes made
outside Zed)
2025-11-17 12:23:18 -05:00
Richard Feldman
4b050b651a Support Agent Servers on remoting (#42683)
<img width="348" height="359" alt="Screenshot 2025-11-13 at 6 53 39 PM"
src="https://github.com/user-attachments/assets/6fe75796-8ceb-4f98-9d35-005c90417fd4"
/>

Also added support for per-target env vars to Agent Server Extensions

Closes https://github.com/zed-industries/zed/issues/42291

Release Notes:

- Per-target env vars are now supported on Agent Server Extensions
- Agent Server Extensions are now available when doing SSH remoting

---------

Co-authored-by: Lukas Wirth <me@lukaswirth.dev>
Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
2025-11-17 10:48:14 -05:00
Danilo Leal
bb46bc167a settings_ui: Add "Edit in settings.json" button to subpage header (#42886)
Closes https://github.com/zed-industries/zed/issues/42094

This will make it consistent with the regular/main page. Also ended up
fixing a bug along the way where this button wouldn't work for subpage
items.

Release Notes:

- settings ui: Fixed a bug where the "Edit in settings.json" button wouldn't work for subpages like all the Language pages.
2025-11-17 11:58:43 -03:00
Oleksiy Syvokon
b274f80dd9 zeta2: Print average length of prompts and outputs (#42885)
Release Notes:

- N/A
2025-11-17 16:56:58 +02:00
Danilo Leal
d77ab99ab1 keymap_editor: Make "toggle exact match mode" the default for binding search (#42883)
I think having the "exact mode" turned on by default is usually what users will expect when searching for a specific keybinding. When it's turned off, it's very odd to search for a super common binding like "command-enter" and get no results. That happens because without that mode, we're trying to match for subsequent matches, which I'm betting is an edge case. Hopefully, this change will make the keymap editor feel like it works better.

I'm also adding the toggle icon button inside the keystroke input for
consistency with the project search input.

Making this change very inspired by [Sam Rose's
feedback](https://bsky.app/profile/samwho.dev/post/3m5juszqyd22w).

Release Notes:

- keymap editor: Made "exact match mode" the default keystroke search mode, so that whatever you search for is matched exactly against results.
2025-11-17 11:43:15 -03:00
Finn Evers
97792f7fb9 Prefer loading extension.toml before extension.json (#42884)
Closes #42406

The issue for the fish extension is that an `extension.json` is still present next to an `extension.toml`, although the former is deprecated.

We should prefer the `extension.toml` if it is present and only fall back to the `extension.json` if needed. This PR tackles this.
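
A minimal sketch of that preference order (helper name and return type are illustrative):

```rust
use std::path::{Path, PathBuf};

// Prefer the newer TOML manifest; only fall back to the deprecated JSON one
// when no extension.toml exists next to it.
fn manifest_path(extension_dir: &Path) -> Option<PathBuf> {
    let toml = extension_dir.join("extension.toml");
    if toml.exists() {
        return Some(toml);
    }
    let json = extension_dir.join("extension.json");
    json.exists().then_some(json)
}
```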

Release Notes:

- N/A
2025-11-17 14:29:50 +00:00
tidely
9bebf314e0 http_client: Remove unused HttpClient::type_name method (#42803)
Closes #ISSUE

Remove unused method `HttpClient::type_name`. Looking at the PR from a
year ago when it was added, it was never actually used for anything and
seems like a prototyping artifact.

Other misc changes for the `http_client` crate include:

- Use `derive_more::Deref` for `HttpClientWithUrl` (already used for
`HttpClientWithProxy`)
- Move `http_client::proxy()` higher up in the trait definition. (It was
in between methods that have default implementations)

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-17 15:28:22 +01:00
Danilo Leal
4092e81ada keymap_editor: Adjust some items of the UI (#42876)
- Only showing the "Create" menu item in the right-click context menu
for actions that _do not_ contain a binding already assigned to them
- Only show the "Clear Input" icon button in the keystroke modal when
the input is focused/in recording mode
- Add a subtle hover style to the table rows just to make it easier to
navigate

Release Notes:

- N/A
2025-11-17 10:59:46 -03:00
Kirill Bulatov
e0b64773d9 Properly sanitize out inlay hints from remote hosts (#42878)
Part of https://github.com/zed-industries/zed/issues/42671

Release Notes:

- Fixed remote hosts causing duplicate hints to be displayed
2025-11-17 15:53:18 +02:00
Piotr Osiewicz
f1bebd79d1 zeta2: Add skip-prediction flag to eval CLI (#42872)
Release Notes:

- N/A
2025-11-17 13:37:51 +00:00
Lukas Wirth
a66a539a09 Reduce macro burden for rust-analyzer (#42871)
This enables optimizations for our own proc-macros as well as some heavy
hitters. Additionally this gates the `derive_inspector_reflection` to be
skipped for rust-analyzer as it currently slows down rust-analyzer way
too much

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-17 12:31:00 +00:00
Lucas Parry
a2d3e3baf9 project_panel: Add sort mode (#40160)
Closes #4533 (partly at least)

Release Notes:

- Added `project_panel.sort_mode` option to control explorer file sort
(directories first, mixed, files first)

 ## Summary

Adds three sorting modes for the project panel to give users more control over how files and directories are displayed (a rough comparator sketch follows the list):

- **`directories_first`** (default): Current behaviour - directories grouped before files
- **`mixed`**: Files and directories sorted together alphabetically
- **`files_first`**: Files grouped before directories
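
A rough sketch of the comparator idea behind these modes (the actual project panel code and setting plumbing differ; this only shows the ordering):

```rust
use std::cmp::Ordering;

enum SortMode {
    DirectoriesFirst,
    Mixed,
    FilesFirst,
}

struct Entry {
    name: String,
    is_dir: bool,
}

fn compare(mode: &SortMode, a: &Entry, b: &Entry) -> Ordering {
    // Group by kind first (unless mixed), then compare names case-insensitively.
    let group = match mode {
        SortMode::DirectoriesFirst => b.is_dir.cmp(&a.is_dir),
        SortMode::FilesFirst => a.is_dir.cmp(&b.is_dir),
        SortMode::Mixed => Ordering::Equal,
    };
    group.then_with(|| a.name.to_lowercase().cmp(&b.name.to_lowercase()))
}
```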

 ## Motivation

Users coming from different editors and file managers have different
expectations for file sorting. Some prefer directories grouped at the
top (traditional), while others prefer the macOS Finder-style mixed
sorting where "Apple1/", "apple2.tsx" and "Apple3/" appear
alphabetically mixed together.


 ### Screenshots

New sort options in settings:
<img width="515" height="160" alt="image"
src="https://github.com/user-attachments/assets/8f4e6668-6989-4881-a9bd-ed1f4f0beb40"
/>


Directories first | Mixed | Files first
-------------|-----|-----
<img width="328" height="888" alt="image"
src="https://github.com/user-attachments/assets/308e5c7a-6e6a-46ba-813d-6e268222925c"
/> | <img width="327" height="891" alt="image"
src="https://github.com/user-attachments/assets/8274d8ca-b60f-456e-be36-e35a3259483c"
/> | <img width="328" height="890" alt="image"
src="https://github.com/user-attachments/assets/3c3b1332-cf08-4eaf-9bed-527c00b41529"
/>


### Agent usage

Copilot-cli/claude-code/codex-cli helped out a lot. I'm not from a rust
background, but really wanted this solved, and it gave me a chance to
play with some of the coding agents I'm not permitted to use for work
stuff

---------

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
2025-11-17 17:52:46 +05:30
Serophots
175162af4f project_panel: Fix preview tabs disabling focusing files after just one click in project panel (#42836)
Closes #41484

With preview tabs disabled, when you click once on a file in the project panel, rather than focusing that file, Zed will incorrectly focus the text editor panel. This means that if you click on a file to focus it, then follow up with a keybind like backspace to delete that file, it doesn't delete that file because the backspace goes through to the text editor instead.

Incorrect behaviour seen here:


https://github.com/user-attachments/assets/8c2dea90-bd90-4507-8ba6-344be348f151



Release Notes:

- Fixed improper UI focus behaviour in the project panel when preview
tabs are disabled
2025-11-17 16:53:12 +05:30
Dino
cdcc068906 vim: Fix temporary mode exit on end of line (#42742)
When using the end of line motion ($) while in temporary mode, the cursor would be placed in insert mode just before the last character instead of after it, as it is in NeoVim.

This happens because `EndOfLine` kind of assumes that we're in `Normal` mode and simply places the cursor on the last character instead of the newline character.

This commit moves the cursor one position to the right when exiting temporary mode if the motion used was `Motion::EndOfLine`.

- Updated `vim::normal::Vim.exit_temporary_normal` to accept an `Option<&Motion>` argument, in case callers want this new logic to potentially be applied
`Option<&Motion>` argument, in case callers want this new logic to
potentially be applied

Closes #42278 

Release Notes:

- Fixed temporary mode exit when using `$` to move to the end of the
line
2025-11-17 11:14:49 +00:00
Mayank Verma
86484aaded languages: Clean up invalid init calls after recent API changes (#42866)
Related to https://github.com/zed-industries/zed/pull/41670

Release Notes:

- Cleaned up invalid init calls after recent API changes in
https://github.com/zed-industries/zed/pull/42238
2025-11-17 10:29:29 +00:00
Mayank Verma
d32934a893 languages: Fix indentation for if/else statements in C/C++ without braces (#41670)
Closes #41179

Release Notes:

- Fixed indentation for if/else statements in C/C++ without braces
2025-11-17 10:05:54 +01:00
warrenjokinen
b463266fa1 Remove mention of Fireside Hacks (#42853)
Fireside Hack events are no longer being held.

Closes #ISSUE

Release Notes:

- N/A
2025-11-16 23:26:38 -05:00
Agus Zubiaga
b0525a26a6 Report automatically discarded zeta predictions (#42761)
We weren't reporting predictions that were generated but never made it out of the provider, such as predictions that failed to interpolate and those that were cancelled because another request completed first.

Release Notes:

- N/A

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-11-16 11:51:13 -08:00
Smit Barmase
1683052e6c editor: Fix MoveToEnclosingBracket and unmatched forward/backward Vim motions in Markdown code blocks (#42813)
We now correctly use bracket ranges from the deepest syntax layer when
finding enclosing brackets.

Release Notes:

- Fixed an issue where `MoveToEnclosingBracket` didn’t work correctly
inside Markdown code blocks.
- Fixed an issue where unmatched forward/backward Vim motions didn’t
work correctly inside Markdown code blocks.

---------

Co-authored-by: MuskanPaliwal <muskan10112002@gmail.com>
2025-11-15 23:57:49 +05:30
Alvaro Parker
07cc87b288 Fix wild install script (#42747)
Use
[`command`](https://www.gnu.org/software/bash/manual/bash.html#index-command)
instead of `which` to check if `wild` is installed.

Using `which` will result in an error being printed to stdout: 

```bash
./script/install-wild
which: invalid option -- 's'
/usr/local/bin/wild
Warning: existing wild 0.6.0 found at /usr/local/bin/wild. Skipping installation.
```

Release Notes:

- N/A
2025-11-15 15:15:37 +01:00
Danilo Leal
1277f328c4 docs: Improve custom keybinding for external agent example (#42776)
Follow up to https://github.com/zed-industries/zed/pull/42772 adding
some comments to improve clarity.

Release Notes:

- N/A
2025-11-14 23:08:46 +00:00
Danilo Leal
b3097cfc8a docs: Add section about keybinding for external agent threads (#42772)
Release Notes:

- N/A
2025-11-14 19:54:52 -03:00
Ivan Pasquariello
305206fd48 Make drag and double click enabled on the whole title bar on macOS (#41839)
Closes #4947

Took inspiration from @tasuren's implementation, with the addition that double-clicking anywhere on the title bar now maximizes/restores the window.

I was not able to test the application on Linux; there's no need to test on Windows since the feature is enabled by the OS.

Release Notes:

- Fixed title bar not fully draggable on macOS
- Fixed not being able to maximize/restore the window by double-clicking
anywhere on the title bar on macOS
2025-11-14 22:46:35 +00:00
Ben Kunkle
c387203ac8 zeta2: Prediction prompt engineering (#42758)
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...

---------

Co-authored-by: Agus Zubiaga <agus@zed.dev>
Co-authored-by: Michael Sloan <mgsloan@gmail.com>
2025-11-14 16:50:55 -05:00
Danilo Leal
a260ba6428 agent_ui: Simplify labels in new thread menu (#42746)
Drop the "new", it's simpler! 😆 

| Before | After |
|--------|--------|
| <img width="800" height="932" alt="Screenshot 2025-11-14 at 2 48@2x" src="https://github.com/user-attachments/assets/efa67d57-9b5c-4eef-8dc7-f36c8e6a4a90" /> | <img width="800" height="772" alt="Screenshot 2025-11-14 at 2 47@2x" src="https://github.com/user-attachments/assets/042d2a0b-24b4-4ad5-8411-82e0eafb993f" /> |




Release Notes:

- N/A
2025-11-14 18:02:09 +00:00
343 changed files with 13248 additions and 6785 deletions

View File

@@ -16,13 +16,5 @@ rustflags = [
"target-feature=+crt-static", # This fixes the linking issue when compiling livekit on Windows
]
[target.x86_64-unknown-linux-gnu]
linker = "clang"
rustflags = ["-C", "link-arg=--ld-path=wild"]
[target.aarch64-unknown-linux-gnu]
linker = "clang"
rustflags = ["-C", "link-arg=--ld-path=wild"]
[env]
MACOSX_DEPLOYMENT_TARGET = "10.15.7"

View File

@@ -4,10 +4,8 @@ description: "Runs the tests"
runs:
using: "composite"
steps:
- name: Install Rust
shell: bash -euxo pipefail {0}
run: |
cargo install cargo-nextest --locked
- name: Install nextest
uses: taiki-e/install-action@nextest
- name: Install Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4

View File

@@ -11,9 +11,8 @@ runs:
using: "composite"
steps:
- name: Install test runner
shell: powershell
working-directory: ${{ inputs.working-directory }}
run: cargo install cargo-nextest --locked
uses: taiki-e/install-action@nextest
- name: Install Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4

View File

@@ -7,7 +7,7 @@ on:
- published
jobs:
rebuild_releases_page:
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: after_release::rebuild_releases_page::refresh_cloud_releases
@@ -21,7 +21,7 @@ jobs:
post_to_discord:
needs:
- rebuild_releases_page
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- id: get-release-url
@@ -71,7 +71,7 @@ jobs:
max-versions-to-keep: 5
token: ${{ secrets.WINGET_TOKEN }}
create_sentry_release:
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo

View File

@@ -42,7 +42,7 @@ jobs:
exit 1
;;
esac
which cargo-set-version > /dev/null || cargo install cargo-edit
which cargo-set-version > /dev/null || cargo install cargo-edit -f --no-default-features --features "set-version"
output="$(cargo set-version -p zed --bump patch 2>&1 | sed 's/.* //')"
export GIT_COMMITTER_NAME="Zed Bot"
export GIT_COMMITTER_EMAIL="hi@zed.dev"

View File

@@ -1,7 +1,7 @@
name: "Close Stale Issues"
on:
schedule:
- cron: "0 7,9,11 * * 3"
- cron: "0 8 31 DEC *"
workflow_dispatch:
jobs:
@@ -26,3 +26,4 @@ jobs:
ascending: true
enable-statistics: true
stale-issue-label: "stale"
exempt-issue-labels: "never stale"

View File

@@ -39,8 +39,7 @@ jobs:
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: compare_perf::run_perf::install_hyperfine
run: cargo install hyperfine
shell: bash -euxo pipefail {0}
uses: taiki-e/install-action@hyperfine
- name: steps::git_checkout
run: git fetch origin ${{ inputs.base }} && git checkout ${{ inputs.base }}
shell: bash -euxo pipefail {0}

View File

@@ -12,7 +12,7 @@ on:
- main
jobs:
danger:
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo

View File

@@ -43,9 +43,7 @@ jobs:
fetch-depth: 0
- name: Install cargo nextest
shell: bash -euxo pipefail {0}
run: |
cargo install cargo-nextest --locked
uses: taiki-e/install-action@nextest
- name: Limit target directory size
shell: bash -euxo pipefail {0}

138 .github/workflows/extension_tests.yml vendored Normal file
View File

@@ -0,0 +1,138 @@
# Generated from xtask::workflows::extension_tests
# Rebuild with `cargo xtask workflows`.
name: extension_tests
env:
CARGO_TERM_COLOR: always
RUST_BACKTRACE: '1'
CARGO_INCREMENTAL: '0'
ZED_EXTENSION_CLI_SHA: 7cfce605704d41ca247e3f84804bf323f6c6caaf
on:
workflow_call:
inputs:
run_tests:
description: Whether the workflow should run rust tests
required: true
type: boolean
jobs:
orchestrate:
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
fetch-depth: ${{ github.ref == 'refs/heads/main' && 2 || 350 }}
- id: filter
name: filter
run: |
if [ -z "$GITHUB_BASE_REF" ]; then
echo "Not in a PR context (i.e., push to main/stable/preview)"
COMPARE_REV="$(git rev-parse HEAD~1)"
else
echo "In a PR context comparing to pull_request.base.ref"
git fetch origin "$GITHUB_BASE_REF" --depth=350
COMPARE_REV="$(git merge-base "origin/${GITHUB_BASE_REF}" HEAD)"
fi
CHANGED_FILES="$(git diff --name-only "$COMPARE_REV" ${{ github.sha }})"
check_pattern() {
local output_name="$1"
local pattern="$2"
local grep_arg="$3"
echo "$CHANGED_FILES" | grep "$grep_arg" "$pattern" && \
echo "${output_name}=true" >> "$GITHUB_OUTPUT" || \
echo "${output_name}=false" >> "$GITHUB_OUTPUT"
}
check_pattern "check_rust" '^(Cargo.lock|Cargo.toml|.*\.rs)$' -qP
check_pattern "check_extension" '^.*\.scm$' -qP
shell: bash -euxo pipefail {0}
outputs:
check_rust: ${{ steps.filter.outputs.check_rust }}
check_extension: ${{ steps.filter.outputs.check_extension }}
check_rust:
needs:
- orchestrate
if: needs.orchestrate.outputs.check_rust == 'true'
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::cargo_fmt
run: cargo fmt --all -- --check
shell: bash -euxo pipefail {0}
- name: extension_tests::run_clippy
run: cargo clippy --release --all-targets --all-features -- --deny warnings
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
if: inputs.run_tests
uses: taiki-e/install-action@nextest
- name: steps::cargo_nextest
if: inputs.run_tests
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
timeout-minutes: 3
check_extension:
needs:
- orchestrate
if: needs.orchestrate.outputs.check_extension == 'true'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- id: cache-zed-extension-cli
name: extension_tests::cache_zed_extension_cli
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830
with:
path: zed-extension
key: zed-extension-${{ env.ZED_EXTENSION_CLI_SHA }}
- name: extension_tests::download_zed_extension_cli
if: steps.cache-zed-extension-cli.outputs.cache-hit != 'true'
run: |
wget --quiet "https://zed-extension-cli.nyc3.digitaloceanspaces.com/$ZED_EXTENSION_CLI_SHA/x86_64-unknown-linux-gnu/zed-extension"
chmod +x zed-extension
shell: bash -euxo pipefail {0}
- name: extension_tests::check
run: |
mkdir -p /tmp/ext-scratch
mkdir -p /tmp/ext-output
./zed-extension --source-dir . --scratch-dir /tmp/ext-scratch --output-dir /tmp/ext-output
shell: bash -euxo pipefail {0}
timeout-minutes: 1
tests_pass:
needs:
- orchestrate
- check_rust
- check_extension
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions') && always()
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: run_tests::tests_pass
run: |
set +x
EXIT_CODE=0
check_result() {
echo "* $1: $2"
if [[ "$2" != "skipped" && "$2" != "success" ]]; then EXIT_CODE=1; fi
}
check_result "orchestrate" "${{ needs.orchestrate.result }}"
check_result "check_rust" "${{ needs.check_rust.result }}"
check_result "check_extension" "${{ needs.check_extension.result }}"
exit $EXIT_CODE
shell: bash -euxo pipefail {0}
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true

View File

@@ -10,7 +10,7 @@ on:
- v*
jobs:
run_tests_mac:
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
@@ -42,7 +42,7 @@ jobs:
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_linux:
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
@@ -89,7 +89,7 @@ jobs:
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_windows:
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: self-32vcpu-windows-2022
steps:
- name: steps::checkout_repo
@@ -121,7 +121,7 @@ jobs:
shell: pwsh
timeout-minutes: 60
check_scripts:
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
@@ -150,7 +150,7 @@ jobs:
shell: bash -euxo pipefail {0}
timeout-minutes: 60
create_draft_release:
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo

View File

@@ -12,7 +12,7 @@ on:
- cron: 0 7 * * *
jobs:
check_style:
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
@@ -28,7 +28,7 @@ jobs:
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_windows:
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: self-32vcpu-windows-2022
steps:
- name: steps::checkout_repo
@@ -361,7 +361,7 @@ jobs:
needs:
- check_style
- run_tests_windows
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-32x64-ubuntu-2004
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
@@ -392,7 +392,7 @@ jobs:
needs:
- check_style
- run_tests_windows
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: self-mini-macos
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
@@ -434,7 +434,7 @@ jobs:
- bundle_mac_x86_64
- bundle_windows_aarch64
- bundle_windows_x86_64
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-4x8-ubuntu-2204
steps:
- name: steps::checkout_repo

View File

@@ -15,7 +15,7 @@ on:
- v[0-9]+.[0-9]+.x
jobs:
orchestrate:
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
@@ -59,7 +59,7 @@ jobs:
run_nix: ${{ steps.filter.outputs.run_nix }}
run_tests: ${{ steps.filter.outputs.run_tests }}
check_style:
if: github.repository_owner == 'zed-industries'
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-4x8-ubuntu-2204
steps:
- name: steps::checkout_repo
@@ -493,7 +493,10 @@ jobs:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: self-mini-macos
runs-on: namespace-profile-16x32-ubuntu-2204
env:
GIT_AUTHOR_NAME: Protobuf Action
GIT_AUTHOR_EMAIL: ci@zed.dev
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
@@ -538,7 +541,7 @@ jobs:
- check_scripts
- build_nix_linux_x86_64
- build_nix_mac_aarch64
if: github.repository_owner == 'zed-industries' && always()
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions') && always()
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: run_tests::tests_pass

38 Cargo.lock generated
View File

@@ -322,6 +322,7 @@ dependencies = [
"assistant_slash_command",
"assistant_slash_commands",
"assistant_text_thread",
"async-fs",
"audio",
"buffer_diff",
"chrono",
@@ -343,6 +344,7 @@ dependencies = [
"gpui",
"html_to_markdown",
"http_client",
"image",
"indoc",
"itertools 0.14.0",
"jsonschema",
@@ -1238,15 +1240,15 @@ dependencies = [
[[package]]
name = "async_zip"
version = "0.0.17"
version = "0.0.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "00b9f7252833d5ed4b00aa9604b563529dd5e11de9c23615de2dcdf91eb87b52"
checksum = "0d8c50d65ce1b0e0cb65a785ff615f78860d7754290647d3b983208daa4f85e6"
dependencies = [
"async-compression",
"crc32fast",
"futures-lite 2.6.1",
"pin-project",
"thiserror 1.0.69",
"thiserror 2.0.17",
]
[[package]]
@@ -2615,26 +2617,24 @@ dependencies = [
[[package]]
name = "calloop"
version = "0.13.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b99da2f8558ca23c71f4fd15dc57c906239752dd27ff3c00a1d56b685b7cbfec"
version = "0.14.3"
source = "git+https://github.com/zed-industries/calloop#eb6b4fd17b9af5ecc226546bdd04185391b3e265"
dependencies = [
"bitflags 2.9.4",
"log",
"polling",
"rustix 0.38.44",
"rustix 1.1.2",
"slab",
"thiserror 1.0.69",
"tracing",
]
[[package]]
name = "calloop-wayland-source"
version = "0.3.0"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "95a66a987056935f7efce4ab5668920b5d0dac4a7c99991a67395f13702ddd20"
checksum = "138efcf0940a02ebf0cc8d1eff41a1682a46b431630f4c52450d6265876021fa"
dependencies = [
"calloop",
"rustix 0.38.44",
"rustix 1.1.2",
"wayland-backend",
"wayland-client",
]
@@ -3209,6 +3209,7 @@ dependencies = [
"rustc-hash 2.1.1",
"schemars 1.0.4",
"serde",
"serde_json",
"strum 0.27.2",
]
@@ -3688,6 +3689,7 @@ dependencies = [
"collections",
"futures 0.3.31",
"gpui",
"http_client",
"log",
"net",
"parking_lot",
@@ -5318,6 +5320,7 @@ dependencies = [
"workspace",
"zed_actions",
"zeta",
"zeta2",
]
[[package]]
@@ -5861,6 +5864,7 @@ dependencies = [
"lsp",
"parking_lot",
"pretty_assertions",
"proto",
"semantic_version",
"serde",
"serde_json",
@@ -8724,7 +8728,6 @@ dependencies = [
"ui",
"ui_input",
"util",
"vim",
"workspace",
"zed_actions",
]
@@ -14034,6 +14037,7 @@ dependencies = [
"paths",
"pretty_assertions",
"project",
"prompt_store",
"proto",
"rayon",
"release_channel",
@@ -16557,10 +16561,10 @@ checksum = "0193cc4331cfd2f3d2011ef287590868599a2f33c3e69bc22c1a3d3acf9e02fb"
name = "svg_preview"
version = "0.1.0"
dependencies = [
"editor",
"file_icons",
"gpui",
"language",
"multi_buffer",
"ui",
"workspace",
]
@@ -21200,7 +21204,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.214.0"
version = "0.215.0"
dependencies = [
"acp_tools",
"activity_indicator",
@@ -21213,7 +21217,6 @@ dependencies = [
"audio",
"auto_update",
"auto_update_ui",
"backtrace",
"bincode 1.3.3",
"breadcrumbs",
"call",
@@ -21278,7 +21281,6 @@ dependencies = [
"mimalloc",
"miniprofiler_ui",
"nc",
"nix 0.29.0",
"node_runtime",
"notifications",
"onboarding",
@@ -21320,7 +21322,6 @@ dependencies = [
"task",
"tasks_ui",
"telemetry",
"telemetry_events",
"terminal_view",
"theme",
"theme_extension",
@@ -21725,6 +21726,7 @@ version = "0.1.0"
dependencies = [
"anyhow",
"arrayvec",
"brotli",
"chrono",
"client",
"clock",

View File

@@ -461,7 +461,7 @@ async-tar = "0.5.1"
async-task = "4.7"
async-trait = "0.1"
async-tungstenite = "0.31.0"
async_zip = { version = "0.0.17", features = ["deflate", "deflate64"] }
async_zip = { version = "0.0.18", features = ["deflate", "deflate64"] }
aws-config = { version = "1.6.1", features = ["behavior-version-latest"] }
aws-credential-types = { version = "1.2.2", features = [
"hardcoded-credentials",
@@ -478,6 +478,7 @@ bitflags = "2.6.0"
blade-graphics = { version = "0.7.0" }
blade-macros = { version = "0.3.0" }
blade-util = { version = "0.3.0" }
brotli = "8.0.2"
bytes = "1.0"
cargo_metadata = "0.19"
cargo_toml = "0.21"
@@ -631,6 +632,7 @@ scap = { git = "https://github.com/zed-industries/scap", rev = "4afea48c3b002197
schemars = { version = "1.0", features = ["indexmap2"] }
semver = "1.0"
serde = { version = "1.0.221", features = ["derive", "rc"] }
serde_derive = "1.0.221"
serde_json = { version = "1.0.144", features = ["preserve_order", "raw_value"] }
serde_json_lenient = { version = "0.2", features = [
"preserve_order",
@@ -724,6 +726,7 @@ yawc = "0.2.5"
zeroize = "1.8"
zstd = "0.11"
[workspace.dependencies.windows]
version = "0.61"
features = [
@@ -779,6 +782,7 @@ features = [
notify = { git = "https://github.com/zed-industries/notify.git", rev = "b4588b2e5aee68f4c0e100f140e808cbce7b1419" }
notify-types = { git = "https://github.com/zed-industries/notify.git", rev = "b4588b2e5aee68f4c0e100f140e808cbce7b1419" }
windows-capture = { git = "https://github.com/zed-industries/windows-capture.git", rev = "f0d6c1b6691db75461b732f6d5ff56eed002eeb9" }
calloop = { git = "https://github.com/zed-industries/calloop" }
[profile.dev]
split-debuginfo = "unpacked"
@@ -792,6 +796,19 @@ codegen-units = 16
codegen-units = 16
[profile.dev.package]
# proc-macros start
gpui_macros = { opt-level = 3 }
derive_refineable = { opt-level = 3 }
settings_macros = { opt-level = 3 }
sqlez_macros = { opt-level = 3, codegen-units = 1 }
ui_macros = { opt-level = 3 }
util_macros = { opt-level = 3 }
serde_derive = { opt-level = 3 }
quote = { opt-level = 3 }
syn = { opt-level = 3 }
proc-macro2 = { opt-level = 3 }
# proc-macros end
taffy = { opt-level = 3 }
cranelift-codegen = { opt-level = 3 }
cranelift-codegen-meta = { opt-level = 3 }
@@ -833,7 +850,6 @@ semantic_version = { codegen-units = 1 }
session = { codegen-units = 1 }
snippet = { codegen-units = 1 }
snippets_ui = { codegen-units = 1 }
sqlez_macros = { codegen-units = 1 }
story = { codegen-units = 1 }
supermaven_api = { codegen-units = 1 }
telemetry_events = { codegen-units = 1 }

View File

@@ -44,6 +44,7 @@ design
docs
= @probably-neb
= @miguelraz
extension
= @kubkon
@@ -98,6 +99,9 @@ settings_ui
= @danilo-leal
= @probably-neb
support
= @miguelraz
tasks
= @SomeoneToIgnore
= @Veykril

32 assets/icons/sweep_ai.svg Normal file
View File

@@ -0,0 +1,32 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<g clip-path="url(#clip0_3348_16)">
<path fill-rule="evenodd" clip-rule="evenodd" d="M7.97419 6.27207C8.44653 6.29114 8.86622 6.27046 9.23628 6.22425C9.08884 7.48378 8.7346 8.72903 8.16697 9.90688C8.04459 9.83861 7.92582 9.76008 7.81193 9.67108C7.64539 9.54099 7.49799 9.39549 7.37015 9.23818C7.5282 9.54496 7.64901 9.86752 7.73175 10.1986C7.35693 10.6656 6.90663 11.0373 6.412 11.3101C5.01165 10.8075 4.03638 9.63089 4.03638 7.93001C4.03638 6.96185 4.35234 6.07053 4.88281 5.36157C5.34001 5.69449 6.30374 6.20455 7.97419 6.27207ZM8.27511 11.5815C10.3762 11.5349 11.8115 10.7826 12.8347 7.93001C11.6992 7.93001 11.4246 7.10731 11.1188 6.19149C11.0669 6.03596 11.0141 5.87771 10.956 5.72037C10.6733 5.86733 10.2753 6.02782 9.74834 6.13895C9.59658 7.49345 9.20592 8.83238 8.56821 10.0897C8.89933 10.2093 9.24674 10.262 9.5908 10.2502C9.08928 10.4803 8.62468 10.8066 8.22655 11.2255C8.2457 11.3438 8.26186 11.4625 8.27511 11.5815ZM6.62702 7.75422C6.62702 7.50604 6.82821 7.30485 7.07639 7.30485C7.32457 7.30485 7.52576 7.50604 7.52576 7.75422V8.23616C7.52576 8.48435 7.32457 8.68554 7.07639 8.68554C6.82821 8.68554 6.62702 8.48435 6.62702 8.23616V7.75422ZM5.27746 7.30485C5.05086 7.30485 4.86716 7.48854 4.86716 7.71513V8.27525C4.86716 8.50185 5.05086 8.68554 5.27746 8.68554C5.50406 8.68554 5.68776 8.50185 5.68776 8.27525V7.71513C5.68776 7.48854 5.50406 7.30485 5.27746 7.30485Z" fill="white"/>
<mask id="mask0_3348_16" style="mask-type:luminance" maskUnits="userSpaceOnUse" x="4" y="5" width="9" height="7">
<path fill-rule="evenodd" clip-rule="evenodd" d="M7.97419 6.27207C8.44653 6.29114 8.86622 6.27046 9.23628 6.22425C9.08884 7.48378 8.7346 8.72903 8.16697 9.90688C8.04459 9.83861 7.92582 9.76008 7.81193 9.67108C7.64539 9.54099 7.49799 9.39549 7.37015 9.23818C7.5282 9.54496 7.64901 9.86752 7.73175 10.1986C7.35693 10.6656 6.90663 11.0373 6.412 11.3101C5.01165 10.8075 4.03638 9.63089 4.03638 7.93001C4.03638 6.96185 4.35234 6.07053 4.88281 5.36157C5.34001 5.69449 6.30374 6.20455 7.97419 6.27207ZM8.27511 11.5815C10.3762 11.5349 11.8115 10.7826 12.8347 7.93001C11.6992 7.93001 11.4246 7.10731 11.1188 6.19149C11.0669 6.03596 11.0141 5.87771 10.956 5.72037C10.6733 5.86733 10.2753 6.02782 9.74834 6.13895C9.59658 7.49345 9.20592 8.83238 8.56821 10.0897C8.89933 10.2093 9.24674 10.262 9.5908 10.2502C9.08928 10.4803 8.62468 10.8066 8.22655 11.2255C8.2457 11.3438 8.26186 11.4625 8.27511 11.5815ZM6.62702 7.75422C6.62702 7.50604 6.82821 7.30485 7.07639 7.30485C7.32457 7.30485 7.52576 7.50604 7.52576 7.75422V8.23616C7.52576 8.48435 7.32457 8.68554 7.07639 8.68554C6.82821 8.68554 6.62702 8.48435 6.62702 8.23616V7.75422ZM5.27746 7.30485C5.05086 7.30485 4.86716 7.48854 4.86716 7.71513V8.27525C4.86716 8.50185 5.05086 8.68554 5.27746 8.68554C5.50406 8.68554 5.68776 8.50185 5.68776 8.27525V7.71513C5.68776 7.48854 5.50406 7.30485 5.27746 7.30485Z" fill="white"/>
</mask>
<g mask="url(#mask0_3348_16)">
<path d="M9.23617 6.22425L9.39588 6.24293L9.41971 6.0393L9.21624 6.06471L9.23617 6.22425ZM8.16687 9.90688L8.08857 10.0473L8.23765 10.1305L8.31174 9.97669L8.16687 9.90688ZM7.37005 9.23819L7.49487 9.13676L7.22714 9.3118L7.37005 9.23819ZM7.73165 10.1986L7.85702 10.2993L7.90696 10.2371L7.88761 10.1597L7.73165 10.1986ZM6.41189 11.3101L6.35758 11.4615L6.42594 11.486L6.48954 11.4509L6.41189 11.3101ZM4.88271 5.36157L4.97736 5.23159L4.84905 5.13817L4.75397 5.26525L4.88271 5.36157ZM8.27501 11.5815L8.11523 11.5993L8.13151 11.7456L8.27859 11.7423L8.27501 11.5815ZM12.8346 7.93001L12.986 7.98428L13.0631 7.76921H12.8346V7.93001ZM10.9559 5.72037L11.1067 5.66469L11.0436 5.49354L10.8817 5.5777L10.9559 5.72037ZM9.74824 6.13896L9.71508 5.98161L9.60139 6.0056L9.58846 6.12102L9.74824 6.13896ZM8.56811 10.0897L8.42469 10.017L8.34242 10.1792L8.51348 10.241L8.56811 10.0897ZM9.5907 10.2502L9.65775 10.3964L9.58519 10.0896L9.5907 10.2502ZM8.22644 11.2255L8.10992 11.1147L8.05502 11.1725L8.06773 11.2512L8.22644 11.2255ZM9.21624 6.06471C8.85519 6.10978 8.44439 6.13015 7.98058 6.11139L7.96756 6.43272C8.44852 6.45215 8.87701 6.43111 9.25607 6.3838L9.21624 6.06471ZM8.31174 9.97669C8.88724 8.78244 9.2464 7.51988 9.39588 6.24293L9.07647 6.20557C8.93108 7.44772 8.58175 8.67563 8.02203 9.83708L8.31174 9.97669ZM8.2452 9.76645C8.12998 9.70219 8.01817 9.62826 7.91082 9.54438L7.71285 9.79779C7.8333 9.8919 7.95895 9.97503 8.08857 10.0473L8.2452 9.76645ZM7.91082 9.54438C7.75387 9.4218 7.61512 9.28479 7.49487 9.13676L7.24526 9.33957C7.38066 9.50619 7.53671 9.66023 7.71285 9.79779L7.91082 9.54438ZM7.22714 9.3118C7.37944 9.60746 7.49589 9.91837 7.57564 10.2376L7.88761 10.1597C7.80196 9.81663 7.67679 9.48248 7.513 9.16453L7.22714 9.3118ZM7.60624 10.098C7.24483 10.5482 6.81083 10.9065 6.33425 11.1693L6.48954 11.4509C7.00223 11.1682 7.46887 10.7829 7.85702 10.2993L7.60624 10.098ZM3.87549 7.93001C3.87548 9.7042 4.89861 10.9378 6.35758 11.4615L6.46622 11.1588C5.12449 10.6772 4.19707 9.55763 4.19707 7.93001H3.87549ZM4.75397 5.26525C4.20309 6.00147 3.87549 6.92646 3.87549 7.93001H4.19707C4.19707 6.99724 4.50139 6.13959 5.01145 5.45791L4.75397 5.26525ZM7.98058 6.11139C6.34236 6.04516 5.40922 5.54604 4.97736 5.23159L4.78806 5.49157C5.27058 5.84291 6.26491 6.3639 7.96756 6.43272L7.98058 6.11139ZM8.27859 11.7423C9.34696 11.7185 10.2682 11.515 11.0542 10.9376C11.8388 10.3612 12.4683 9.4273 12.986 7.98428L12.6833 7.8757C12.1776 9.28534 11.5779 10.1539 10.8638 10.6784C10.1511 11.202 9.30417 11.3978 8.27143 11.4208L8.27859 11.7423ZM12.8346 7.76921C12.3148 7.76921 12.0098 7.58516 11.7925 7.30552C11.5639 7.0114 11.4266 6.60587 11.2712 6.14061L10.9662 6.24242C11.1166 6.69294 11.2695 7.15667 11.5385 7.50285C11.8188 7.86347 12.2189 8.09078 12.8346 8.09078V7.76921ZM11.2712 6.14061C11.2195 5.98543 11.1658 5.82478 11.1067 5.66469L10.805 5.77606C10.8621 5.93065 10.9142 6.0865 10.9662 6.24242L11.2712 6.14061ZM10.8817 5.5777C10.6115 5.71821 10.2273 5.87362 9.71508 5.98161L9.78143 6.29626C10.3232 6.18206 10.735 6.0165 11.0301 5.86301L10.8817 5.5777ZM9.58846 6.12102C9.43882 7.45684 9.05355 8.77717 8.42469 10.017L8.71149 10.1625C9.35809 8.88764 9.75417 7.53011 9.90806 6.15685L9.58846 6.12102ZM9.58519 10.0896C9.26119 10.1006 8.93423 10.051 8.62269 9.93854L8.51348 10.241C8.86427 10.3677 9.23205 10.4234 9.5962 10.4109L9.58519 10.0896ZM8.34301 11.3363C8.72675 10.9325 9.17443 10.6181 9.65775 10.3964L9.52365 10.1041C9.00392 10.3425 8.52241 10.6807 8.10992 11.1147L8.34301 11.3363ZM8.43483 11.5638C8.4213 11.4421 8.40475 11.3207 8.3852 11.1998L8.06773 11.2512C8.08644 
11.3668 8.10225 11.4829 8.11523 11.5993L8.43483 11.5638ZM7.07629 7.14405C6.73931 7.14405 6.46613 7.41724 6.46613 7.75423H6.7877C6.7877 7.59484 6.91691 7.46561 7.07629 7.46561V7.14405ZM7.68646 7.75423C7.68646 7.41724 7.41326 7.14405 7.07629 7.14405V7.46561C7.23567 7.46561 7.36489 7.59484 7.36489 7.75423H7.68646ZM7.68646 8.23616V7.75423H7.36489V8.23616H7.68646ZM7.07629 8.84634C7.41326 8.84634 7.68646 8.57315 7.68646 8.23616H7.36489C7.36489 8.39555 7.23567 8.52474 7.07629 8.52474V8.84634ZM6.46613 8.23616C6.46613 8.57315 6.73931 8.84634 7.07629 8.84634V8.52474C6.91691 8.52474 6.7877 8.39555 6.7877 8.23616H6.46613ZM6.46613 7.75423V8.23616H6.7877V7.75423H6.46613ZM5.02785 7.71514C5.02785 7.57734 5.13956 7.46561 5.27736 7.46561V7.14405C4.96196 7.14405 4.70627 7.39974 4.70627 7.71514H5.02785ZM5.02785 8.27525V7.71514H4.70627V8.27525H5.02785ZM5.27736 8.52474C5.13956 8.52474 5.02785 8.41305 5.02785 8.27525H4.70627C4.70627 8.59065 4.96196 8.84634 5.27736 8.84634V8.52474ZM5.52687 8.27525C5.52687 8.41305 5.41516 8.52474 5.27736 8.52474V8.84634C5.59277 8.84634 5.84845 8.59065 5.84845 8.27525H5.52687ZM5.52687 7.71514V8.27525H5.84845V7.71514H5.52687ZM5.27736 7.46561C5.41516 7.46561 5.52687 7.57734 5.52687 7.71514H5.84845C5.84845 7.39974 5.59277 7.14405 5.27736 7.14405V7.46561Z" fill="white"/>
</g>
<path fill-rule="evenodd" clip-rule="evenodd" d="M7.12635 14.5901C7.22369 14.3749 7.3069 14.1501 7.37454 13.9167C7.54132 13.3412 7.5998 12.7599 7.56197 12.1948C7.53665 12.5349 7.47589 12.8775 7.37718 13.2181C7.23926 13.694 7.03667 14.1336 6.78174 14.5301C6.89605 14.5547 7.01101 14.5747 7.12635 14.5901Z" fill="white"/>
<path d="M9.71984 7.74796C9.50296 7.74796 9.29496 7.83412 9.14159 7.98745C8.98822 8.14082 8.9021 8.34882 8.9021 8.5657C8.9021 8.78258 8.98822 8.99057 9.14159 9.14394C9.29496 9.29728 9.50296 9.38344 9.71984 9.38344V8.5657V7.74796Z" fill="white"/>
<mask id="mask1_3348_16" style="mask-type:luminance" maskUnits="userSpaceOnUse" x="5" y="2" width="8" height="9">
<path d="M12.3783 2.9985H5.36792V10.3954H12.3783V2.9985Z" fill="white"/>
<path fill-rule="evenodd" clip-rule="evenodd" d="M9.75733 3.61999C9.98577 5.80374 9.60089 8.05373 8.56819 10.0898C8.43122 10.0403 8.29704 9.9794 8.16699 9.90688C9.15325 7.86033 9.49538 5.61026 9.22757 3.43526C9.39923 3.51584 9.57682 3.57729 9.75733 3.61999Z" fill="black"/>
</mask>
<g mask="url(#mask1_3348_16)">
<path d="M8.56815 10.0898L8.67689 10.1449L8.62812 10.241L8.52678 10.2044L8.56815 10.0898ZM9.75728 3.61998L9.78536 3.50136L9.86952 3.52127L9.87853 3.6073L9.75728 3.61998ZM8.16695 9.90687L8.1076 10.0133L8.00732 9.9574L8.05715 9.85398L8.16695 9.90687ZM9.22753 3.43524L9.10656 3.45014L9.07958 3.23116L9.27932 3.32491L9.22753 3.43524ZM8.45945 10.0346C9.48122 8.02009 9.86217 5.79374 9.63608 3.63266L9.87853 3.6073C10.1093 5.81372 9.72048 8.0873 8.67689 10.1449L8.45945 10.0346ZM8.22633 9.80041C8.35056 9.86971 8.47876 9.92791 8.60956 9.97514L8.52678 10.2044C8.38363 10.1527 8.24344 10.0891 8.1076 10.0133L8.22633 9.80041ZM9.34849 3.42035C9.61905 5.61792 9.27346 7.89158 8.27675 9.9598L8.05715 9.85398C9.03298 7.82905 9.37158 5.60258 9.10656 3.45014L9.34849 3.42035ZM9.72925 3.7386C9.54064 3.69399 9.3551 3.62977 9.17573 3.54558L9.27932 3.32491C9.44327 3.40188 9.61288 3.46058 9.78536 3.50136L9.72925 3.7386Z" fill="white"/>
</g>
<path fill-rule="evenodd" clip-rule="evenodd" d="M11.4118 3.46925L11.2416 3.39926L11.1904 3.57611L11.349 3.62202C11.1904 3.57611 11.1904 3.57615 11.1904 3.5762L11.1903 3.57631L11.1902 3.57658L11.19 3.57741L11.1893 3.58009C11.1886 3.58233 11.1878 3.58548 11.1867 3.58949C11.1845 3.5975 11.1814 3.60897 11.1777 3.62359C11.1703 3.6528 11.1603 3.69464 11.1493 3.74656C11.1275 3.85017 11.102 3.99505 11.0869 4.16045C11.0573 4.4847 11.0653 4.91594 11.2489 5.26595C11.2613 5.28944 11.2643 5.31174 11.2625 5.32629C11.261 5.33849 11.2572 5.34226 11.2536 5.3449C11.0412 5.50026 10.5639 5.78997 9.76653 5.96607C9.76095 6.02373 9.75493 6.08134 9.74848 6.13895C10.601 5.95915 11.1161 5.65017 11.3511 5.4782C11.4413 5.41219 11.4471 5.28823 11.3952 5.18922C11.1546 4.73063 11.2477 4.08248 11.3103 3.78401C11.3314 3.68298 11.349 3.62202 11.349 3.62202C11.3745 3.6325 11.4002 3.63983 11.4259 3.64425C11.9083 3.72709 12.4185 2.78249 12.6294 2.33939C12.6852 2.22212 12.6234 2.08843 12.497 2.05837C11.2595 1.76399 5.46936 0.631807 4.57214 4.96989C4.55907 5.03307 4.57607 5.10106 4.62251 5.14584C4.87914 5.39322 5.86138 6.18665 7.9743 6.27207C8.44664 6.29114 8.86633 6.27046 9.23638 6.22425C9.24295 6.16797 9.24912 6.1117 9.25491 6.05534C8.88438 6.10391 8.46092 6.12641 7.98094 6.10702C5.91152 6.02337 4.96693 5.24843 4.73714 5.02692C4.73701 5.02679 4.73545 5.02525 4.73422 5.0208C4.73292 5.01611 4.73254 5.00987 4.73388 5.00334C4.94996 3.95861 5.4573 3.25195 6.11188 2.77714C6.77039 2.29947 7.58745 2.04983 8.42824 1.94075C10.1122 1.72228 11.8454 2.07312 12.4588 2.21906C12.4722 2.22225 12.4787 2.22927 12.4819 2.2362C12.4853 2.24342 12.4869 2.25443 12.4803 2.2684C12.3706 2.49879 12.183 2.85746 11.9656 3.13057C11.8564 3.26783 11.7479 3.37295 11.6469 3.43216C11.5491 3.48956 11.4752 3.49529 11.4118 3.46925Z" fill="white"/>
<mask id="mask2_3348_16" style="mask-type:luminance" maskUnits="userSpaceOnUse" x="3" y="9" width="7" height="6">
<path fill-rule="evenodd" clip-rule="evenodd" d="M8.22654 11.2255C8.62463 10.8066 9.08923 10.4803 9.59075 10.2502C8.97039 10.2715 8.33933 10.0831 7.81189 9.67109C7.64534 9.541 7.49795 9.39549 7.37014 9.23819C7.52815 9.54497 7.64896 9.86752 7.7317 10.1986C6.70151 11.4821 5.1007 12.0466 3.57739 11.8125C3.85909 12.527 4.32941 13.178 4.97849 13.6851C5.8625 14.3756 6.92544 14.6799 7.96392 14.6227C8.32513 13.5174 8.4085 12.351 8.22654 11.2255Z" fill="white"/>
</mask>
<g mask="url(#mask2_3348_16)">
<path d="M9.59085 10.2502L9.58389 10.0472L9.67556 10.4349L9.59085 10.2502ZM8.22663 11.2255L8.02607 11.258L8.00999 11.1585L8.07936 11.0856L8.22663 11.2255ZM7.37024 9.23819L7.18961 9.33119L7.52789 9.11006L7.37024 9.23819ZM7.7318 10.1986L7.92886 10.1494L7.95328 10.2472L7.8902 10.3258L7.7318 10.1986ZM3.57749 11.8125L3.3885 11.887L3.25879 11.5579L3.60835 11.6117L3.57749 11.8125ZM7.96402 14.6227L8.15711 14.6858L8.11397 14.8179L7.97519 14.8255L7.96402 14.6227ZM9.67556 10.4349C9.19708 10.6544 8.7538 10.9657 8.37387 11.3655L8.07936 11.0856C8.49566 10.6475 8.98161 10.3062 9.50614 10.0656L9.67556 10.4349ZM7.93704 9.51099C8.42551 9.89261 9.00942 10.0669 9.58389 10.0472L9.59781 10.4533C8.93151 10.4761 8.25334 10.2737 7.68693 9.83118L7.93704 9.51099ZM7.52789 9.11006C7.64615 9.25565 7.78261 9.39038 7.93704 9.51099L7.68693 9.83118C7.50827 9.69161 7.34994 9.53537 7.21254 9.36627L7.52789 9.11006ZM7.5347 10.2479C7.45573 9.93178 7.34043 9.62393 7.18961 9.33119L7.55082 9.14514C7.71611 9.466 7.84242 9.80326 7.92886 10.1494L7.5347 10.2479ZM3.60835 11.6117C5.06278 11.8352 6.59038 11.2962 7.57335 10.0715L7.8902 10.3258C6.81284 11.6681 5.1388 12.258 3.54663 12.0133L3.60835 11.6117ZM4.85352 13.8452C4.17512 13.3152 3.68312 12.6343 3.3885 11.887L3.76648 11.738C4.03524 12.4197 4.4839 13.0409 5.10364 13.525L4.85352 13.8452ZM7.97519 14.8255C6.8895 14.8853 5.77774 14.5672 4.85352 13.8452L5.10364 13.525C5.94745 14.1842 6.96157 14.4744 7.95285 14.4198L7.97519 14.8255ZM8.42716 11.1931C8.61419 12.3499 8.52858 13.5491 8.15711 14.6858L7.77093 14.5596C8.12191 13.4857 8.20296 12.352 8.02607 11.258L8.42716 11.1931Z" fill="white"/>
</g>
</g>
<defs>
<clipPath id="clip0_3348_16">
<rect width="9.63483" height="14" fill="white" transform="translate(3.19995 1.5)"/>
</clipPath>
</defs>
</svg>


View File

@@ -43,8 +43,7 @@
"f11": "zed::ToggleFullScreen",
"ctrl-alt-z": "edit_prediction::RateCompletions",
"ctrl-alt-shift-i": "edit_prediction::ToggleMenu",
"ctrl-alt-l": "lsp_tool::ToggleMenu",
"ctrl-alt-.": "project_panel::ToggleHideHidden"
"ctrl-alt-l": "lsp_tool::ToggleMenu"
}
},
{
@@ -866,6 +865,7 @@
"context": "ProjectPanel",
"bindings": {
"left": "project_panel::CollapseSelectedEntry",
"ctrl-left": "project_panel::CollapseAllEntries",
"right": "project_panel::ExpandSelectedEntry",
"new": "project_panel::NewFile",
"ctrl-n": "project_panel::NewFile",

View File

@@ -49,8 +49,7 @@
"ctrl-cmd-f": "zed::ToggleFullScreen",
"ctrl-cmd-z": "edit_prediction::RateCompletions",
"ctrl-cmd-i": "edit_prediction::ToggleMenu",
"ctrl-cmd-l": "lsp_tool::ToggleMenu",
"cmd-alt-.": "project_panel::ToggleHideHidden"
"ctrl-cmd-l": "lsp_tool::ToggleMenu"
}
},
{
@@ -313,7 +312,7 @@
"use_key_equivalents": true,
"bindings": {
"cmd-n": "agent::NewTextThread",
"cmd-alt-t": "agent::NewThread"
"cmd-alt-n": "agent::NewExternalAgentThread"
}
},
{
@@ -936,6 +935,7 @@
"use_key_equivalents": true,
"bindings": {
"left": "project_panel::CollapseSelectedEntry",
"cmd-left": "project_panel::CollapseAllEntries",
"right": "project_panel::ExpandSelectedEntry",
"cmd-n": "project_panel::NewFile",
"cmd-d": "project_panel::Duplicate",

View File

@@ -41,8 +41,7 @@
"shift-f11": "debugger::StepOut",
"f11": "zed::ToggleFullScreen",
"ctrl-shift-i": "edit_prediction::ToggleMenu",
"shift-alt-l": "lsp_tool::ToggleMenu",
"ctrl-alt-.": "project_panel::ToggleHideHidden"
"shift-alt-l": "lsp_tool::ToggleMenu"
}
},
{
@@ -880,6 +879,7 @@
"use_key_equivalents": true,
"bindings": {
"left": "project_panel::CollapseSelectedEntry",
"ctrl-left": "project_panel::CollapseAllEntries",
"right": "project_panel::ExpandSelectedEntry",
"ctrl-n": "project_panel::NewFile",
"alt-n": "project_panel::NewDirectory",

View File

@@ -476,6 +476,9 @@
"alt-p": "editor::SelectPreviousSyntaxNode",
"alt-n": "editor::SelectNextSyntaxNode",
"n": "vim::HelixSelectNext",
"shift-n": "vim::HelixSelectPrevious",
// Goto mode
"g e": "vim::EndOfDocument",
"g h": "vim::StartOfLine",

View File

@@ -742,6 +742,16 @@
// "never"
"show": "always"
},
// Sort order for entries in the project panel.
// This setting can take three values:
//
// 1. Show directories first, then files:
// "directories_first"
// 2. Mix directories and files together:
// "mixed"
// 3. Show files first, then directories:
// "files_first"
"sort_mode": "directories_first",
// Whether to enable drag-and-drop operations in the project panel.
"drag_and_drop": true,
// Whether to hide the root entry when only one folder is open in the window.
@@ -1306,7 +1316,10 @@
// "hunk_style": "staged_hollow"
// 2. Show unstaged hunks hollow and staged hunks filled:
// "hunk_style": "unstaged_hollow"
"hunk_style": "staged_hollow"
"hunk_style": "staged_hollow",
// Should the name or path be displayed first in the git view.
// "path_style": "file_name_first" or "file_path_first"
"path_style": "file_name_first"
},
// The list of custom Git hosting providers.
"git_hosting_providers": [
@@ -2049,6 +2062,18 @@
"dev": {
// "theme": "Andromeda"
},
// Settings overrides to use when using linux
"linux": {},
// Settings overrides to use when using macos
"macos": {},
// Settings overrides to use when using windows
"windows": {
"languages": {
"PHP": {
"language_servers": ["intelephense", "!phpactor", "..."]
}
}
},
// Whether to show full labels in line indicator or short ones
//
// Values:

View File

@@ -150,6 +150,7 @@ impl DbThread {
.unwrap_or_default(),
input: tool_use.input,
is_input_complete: true,
thought_signature: None,
},
));
}

View File

@@ -1108,6 +1108,7 @@ fn tool_use(
raw_input: serde_json::to_string_pretty(&input).unwrap(),
input: serde_json::to_value(input).unwrap(),
is_input_complete: true,
thought_signature: None,
})
}

View File

@@ -274,6 +274,7 @@ async fn test_prompt_caching(cx: &mut TestAppContext) {
raw_input: json!({"text": "test"}).to_string(),
input: json!({"text": "test"}),
is_input_complete: true,
thought_signature: None,
};
fake_model
.send_last_completion_stream_event(LanguageModelCompletionEvent::ToolUse(tool_use.clone()));
@@ -461,6 +462,7 @@ async fn test_tool_authorization(cx: &mut TestAppContext) {
raw_input: "{}".into(),
input: json!({}),
is_input_complete: true,
thought_signature: None,
},
));
fake_model.send_last_completion_stream_event(LanguageModelCompletionEvent::ToolUse(
@@ -470,6 +472,7 @@ async fn test_tool_authorization(cx: &mut TestAppContext) {
raw_input: "{}".into(),
input: json!({}),
is_input_complete: true,
thought_signature: None,
},
));
fake_model.end_last_completion_stream();
@@ -520,6 +523,7 @@ async fn test_tool_authorization(cx: &mut TestAppContext) {
raw_input: "{}".into(),
input: json!({}),
is_input_complete: true,
thought_signature: None,
},
));
fake_model.end_last_completion_stream();
@@ -554,6 +558,7 @@ async fn test_tool_authorization(cx: &mut TestAppContext) {
raw_input: "{}".into(),
input: json!({}),
is_input_complete: true,
thought_signature: None,
},
));
fake_model.end_last_completion_stream();
@@ -592,6 +597,7 @@ async fn test_tool_hallucination(cx: &mut TestAppContext) {
raw_input: "{}".into(),
input: json!({}),
is_input_complete: true,
thought_signature: None,
},
));
fake_model.end_last_completion_stream();
@@ -621,6 +627,7 @@ async fn test_resume_after_tool_use_limit(cx: &mut TestAppContext) {
raw_input: "{}".into(),
input: serde_json::to_value(&EchoToolInput { text: "def".into() }).unwrap(),
is_input_complete: true,
thought_signature: None,
};
fake_model
.send_last_completion_stream_event(LanguageModelCompletionEvent::ToolUse(tool_use.clone()));
@@ -731,6 +738,7 @@ async fn test_send_after_tool_use_limit(cx: &mut TestAppContext) {
raw_input: "{}".into(),
input: serde_json::to_value(&EchoToolInput { text: "def".into() }).unwrap(),
is_input_complete: true,
thought_signature: None,
};
let tool_result = LanguageModelToolResult {
tool_use_id: "tool_id_1".into(),
@@ -1037,6 +1045,7 @@ async fn test_mcp_tools(cx: &mut TestAppContext) {
raw_input: json!({"text": "test"}).to_string(),
input: json!({"text": "test"}),
is_input_complete: true,
thought_signature: None,
},
));
fake_model.end_last_completion_stream();
@@ -1080,6 +1089,7 @@ async fn test_mcp_tools(cx: &mut TestAppContext) {
raw_input: json!({"text": "mcp"}).to_string(),
input: json!({"text": "mcp"}),
is_input_complete: true,
thought_signature: None,
},
));
fake_model.send_last_completion_stream_event(LanguageModelCompletionEvent::ToolUse(
@@ -1089,6 +1099,7 @@ async fn test_mcp_tools(cx: &mut TestAppContext) {
raw_input: json!({"text": "native"}).to_string(),
input: json!({"text": "native"}),
is_input_complete: true,
thought_signature: None,
},
));
fake_model.end_last_completion_stream();
@@ -1788,6 +1799,7 @@ async fn test_building_request_with_pending_tools(cx: &mut TestAppContext) {
raw_input: "{}".into(),
input: json!({}),
is_input_complete: true,
thought_signature: None,
};
let echo_tool_use = LanguageModelToolUse {
id: "tool_id_2".into(),
@@ -1795,6 +1807,7 @@ async fn test_building_request_with_pending_tools(cx: &mut TestAppContext) {
raw_input: json!({"text": "test"}).to_string(),
input: json!({"text": "test"}),
is_input_complete: true,
thought_signature: None,
};
fake_model.send_last_completion_stream_text_chunk("Hi!");
fake_model.send_last_completion_stream_event(LanguageModelCompletionEvent::ToolUse(
@@ -2000,6 +2013,7 @@ async fn test_tool_updates_to_completion(cx: &mut TestAppContext) {
raw_input: input.to_string(),
input,
is_input_complete: false,
thought_signature: None,
},
));
@@ -2012,6 +2026,7 @@ async fn test_tool_updates_to_completion(cx: &mut TestAppContext) {
raw_input: input.to_string(),
input,
is_input_complete: true,
thought_signature: None,
},
));
fake_model.end_last_completion_stream();
@@ -2214,6 +2229,7 @@ async fn test_send_retry_finishes_tool_calls_on_error(cx: &mut TestAppContext) {
raw_input: json!({"text": "test"}).to_string(),
input: json!({"text": "test"}),
is_input_complete: true,
thought_signature: None,
};
fake_model.send_last_completion_stream_event(LanguageModelCompletionEvent::ToolUse(
tool_use_1.clone(),

View File

@@ -607,6 +607,8 @@ pub struct Thread {
pub(crate) prompt_capabilities_rx: watch::Receiver<acp::PromptCapabilities>,
pub(crate) project: Entity<Project>,
pub(crate) action_log: Entity<ActionLog>,
/// Tracks the last time files were read by the agent, to detect external modifications
pub(crate) file_read_times: HashMap<PathBuf, fs::MTime>,
}
impl Thread {
@@ -665,6 +667,7 @@ impl Thread {
prompt_capabilities_rx,
project,
action_log,
file_read_times: HashMap::default(),
}
}
@@ -860,6 +863,7 @@ impl Thread {
updated_at: db_thread.updated_at,
prompt_capabilities_tx,
prompt_capabilities_rx,
file_read_times: HashMap::default(),
}
}
@@ -999,6 +1003,7 @@ impl Thread {
self.add_tool(NowTool);
self.add_tool(OpenTool::new(self.project.clone()));
self.add_tool(ReadFileTool::new(
cx.weak_entity(),
self.project.clone(),
self.action_log.clone(),
));

View File

@@ -309,6 +309,40 @@ impl AgentTool for EditFileTool {
})?
.await?;
// Check if the file has been modified since the agent last read it
if let Some(abs_path) = abs_path.as_ref() {
let (last_read_mtime, current_mtime, is_dirty) = self.thread.update(cx, |thread, cx| {
let last_read = thread.file_read_times.get(abs_path).copied();
let current = buffer.read(cx).file().and_then(|file| file.disk_state().mtime());
let dirty = buffer.read(cx).is_dirty();
(last_read, current, dirty)
})?;
// Check for unsaved changes first - these indicate modifications we don't know about
if is_dirty {
anyhow::bail!(
"This file cannot be written to because it has unsaved changes. \
Please end the current conversation immediately by telling the user you want to write to this file (mention its path explicitly) but you can't write to it because it has unsaved changes. \
Ask the user to save that buffer's changes and to inform you when it's ok to proceed."
);
}
// Check if the file was modified on disk since we last read it
if let (Some(last_read), Some(current)) = (last_read_mtime, current_mtime) {
// MTime can be unreliable for comparisons, so our newtype intentionally
// doesn't support comparing them. If the mtime at all different
// (which could be because of a modification or because e.g. system clock changed),
// we pessimistically assume it was modified.
if current != last_read {
anyhow::bail!(
"The file {} has been modified since you last read it. \
Please read the file again to get the current state before editing it.",
input.path.display()
);
}
}
}
let diff = cx.new(|cx| Diff::new(buffer.clone(), cx))?;
event_stream.update_diff(diff.clone());
let _finalize_diff = util::defer({
@@ -421,6 +455,17 @@ impl AgentTool for EditFileTool {
log.buffer_edited(buffer.clone(), cx);
})?;
// Update the recorded read time after a successful edit so consecutive edits work
if let Some(abs_path) = abs_path.as_ref() {
if let Some(new_mtime) = buffer.read_with(cx, |buffer, _| {
buffer.file().and_then(|file| file.disk_state().mtime())
})? {
self.thread.update(cx, |thread, _| {
thread.file_read_times.insert(abs_path.to_path_buf(), new_mtime);
})?;
}
}
let new_snapshot = buffer.read_with(cx, |buffer, _cx| buffer.snapshot())?;
let (new_text, unified_diff) = cx
.background_spawn({
@@ -1748,10 +1793,426 @@ mod tests {
}
}
#[gpui::test]
async fn test_file_read_times_tracking(cx: &mut TestAppContext) {
init_test(cx);
let fs = project::FakeFs::new(cx.executor());
fs.insert_tree(
"/root",
json!({
"test.txt": "original content"
}),
)
.await;
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
Some(model.clone()),
cx,
)
});
let action_log = thread.read_with(cx, |thread, _| thread.action_log().clone());
// Initially, file_read_times should be empty
let is_empty = thread.read_with(cx, |thread, _| thread.file_read_times.is_empty());
assert!(is_empty, "file_read_times should start empty");
// Create read tool
let read_tool = Arc::new(crate::ReadFileTool::new(
thread.downgrade(),
project.clone(),
action_log,
));
// Read the file to record the read time
cx.update(|cx| {
read_tool.clone().run(
crate::ReadFileToolInput {
path: "root/test.txt".to_string(),
start_line: None,
end_line: None,
},
ToolCallEventStream::test().0,
cx,
)
})
.await
.unwrap();
// Verify that file_read_times now contains an entry for the file
let has_entry = thread.read_with(cx, |thread, _| {
thread.file_read_times.len() == 1
&& thread
.file_read_times
.keys()
.any(|path| path.ends_with("test.txt"))
});
assert!(
has_entry,
"file_read_times should contain an entry after reading the file"
);
// Read the file again - should update the entry
cx.update(|cx| {
read_tool.clone().run(
crate::ReadFileToolInput {
path: "root/test.txt".to_string(),
start_line: None,
end_line: None,
},
ToolCallEventStream::test().0,
cx,
)
})
.await
.unwrap();
// Should still have exactly one entry
let has_one_entry = thread.read_with(cx, |thread, _| thread.file_read_times.len() == 1);
assert!(
has_one_entry,
"file_read_times should still have one entry after re-reading"
);
}
fn init_test(cx: &mut TestAppContext) {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
});
}
#[gpui::test]
async fn test_consecutive_edits_work(cx: &mut TestAppContext) {
init_test(cx);
let fs = project::FakeFs::new(cx.executor());
fs.insert_tree(
"/root",
json!({
"test.txt": "original content"
}),
)
.await;
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
Some(model.clone()),
cx,
)
});
let languages = project.read_with(cx, |project, _| project.languages().clone());
let action_log = thread.read_with(cx, |thread, _| thread.action_log().clone());
let read_tool = Arc::new(crate::ReadFileTool::new(
thread.downgrade(),
project.clone(),
action_log,
));
let edit_tool = Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
languages,
Templates::new(),
));
// Read the file first
cx.update(|cx| {
read_tool.clone().run(
crate::ReadFileToolInput {
path: "root/test.txt".to_string(),
start_line: None,
end_line: None,
},
ToolCallEventStream::test().0,
cx,
)
})
.await
.unwrap();
// First edit should work
let edit_result = {
let edit_task = cx.update(|cx| {
edit_tool.clone().run(
EditFileToolInput {
display_description: "First edit".into(),
path: "root/test.txt".into(),
mode: EditFileMode::Edit,
},
ToolCallEventStream::test().0,
cx,
)
});
cx.executor().run_until_parked();
model.send_last_completion_stream_text_chunk(
"<old_text>original content</old_text><new_text>modified content</new_text>"
.to_string(),
);
model.end_last_completion_stream();
edit_task.await
};
assert!(
edit_result.is_ok(),
"First edit should succeed, got error: {:?}",
edit_result.as_ref().err()
);
// Second edit should also work because the edit updated the recorded read time
let edit_result = {
let edit_task = cx.update(|cx| {
edit_tool.clone().run(
EditFileToolInput {
display_description: "Second edit".into(),
path: "root/test.txt".into(),
mode: EditFileMode::Edit,
},
ToolCallEventStream::test().0,
cx,
)
});
cx.executor().run_until_parked();
model.send_last_completion_stream_text_chunk(
"<old_text>modified content</old_text><new_text>further modified content</new_text>".to_string(),
);
model.end_last_completion_stream();
edit_task.await
};
assert!(
edit_result.is_ok(),
"Second consecutive edit should succeed, got error: {:?}",
edit_result.as_ref().err()
);
}
#[gpui::test]
async fn test_external_modification_detected(cx: &mut TestAppContext) {
init_test(cx);
let fs = project::FakeFs::new(cx.executor());
fs.insert_tree(
"/root",
json!({
"test.txt": "original content"
}),
)
.await;
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
Some(model.clone()),
cx,
)
});
let languages = project.read_with(cx, |project, _| project.languages().clone());
let action_log = thread.read_with(cx, |thread, _| thread.action_log().clone());
let read_tool = Arc::new(crate::ReadFileTool::new(
thread.downgrade(),
project.clone(),
action_log,
));
let edit_tool = Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
languages,
Templates::new(),
));
// Read the file first
cx.update(|cx| {
read_tool.clone().run(
crate::ReadFileToolInput {
path: "root/test.txt".to_string(),
start_line: None,
end_line: None,
},
ToolCallEventStream::test().0,
cx,
)
})
.await
.unwrap();
// Simulate external modification - advance time and save file
cx.background_executor
.advance_clock(std::time::Duration::from_secs(2));
fs.save(
path!("/root/test.txt").as_ref(),
&"externally modified content".into(),
language::LineEnding::Unix,
)
.await
.unwrap();
// Reload the buffer to pick up the new mtime
let project_path = project
.read_with(cx, |project, cx| {
project.find_project_path("root/test.txt", cx)
})
.expect("Should find project path");
let buffer = project
.update(cx, |project, cx| project.open_buffer(project_path, cx))
.await
.unwrap();
buffer
.update(cx, |buffer, cx| buffer.reload(cx))
.await
.unwrap();
cx.executor().run_until_parked();
// Try to edit - should fail because file was modified externally
let result = cx
.update(|cx| {
edit_tool.clone().run(
EditFileToolInput {
display_description: "Edit after external change".into(),
path: "root/test.txt".into(),
mode: EditFileMode::Edit,
},
ToolCallEventStream::test().0,
cx,
)
})
.await;
assert!(
result.is_err(),
"Edit should fail after external modification"
);
let error_msg = result.unwrap_err().to_string();
assert!(
error_msg.contains("has been modified since you last read it"),
"Error should mention file modification, got: {}",
error_msg
);
}
#[gpui::test]
async fn test_dirty_buffer_detected(cx: &mut TestAppContext) {
init_test(cx);
let fs = project::FakeFs::new(cx.executor());
fs.insert_tree(
"/root",
json!({
"test.txt": "original content"
}),
)
.await;
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
Some(model.clone()),
cx,
)
});
let languages = project.read_with(cx, |project, _| project.languages().clone());
let action_log = thread.read_with(cx, |thread, _| thread.action_log().clone());
let read_tool = Arc::new(crate::ReadFileTool::new(
thread.downgrade(),
project.clone(),
action_log,
));
let edit_tool = Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
languages,
Templates::new(),
));
// Read the file first
cx.update(|cx| {
read_tool.clone().run(
crate::ReadFileToolInput {
path: "root/test.txt".to_string(),
start_line: None,
end_line: None,
},
ToolCallEventStream::test().0,
cx,
)
})
.await
.unwrap();
// Open the buffer and make it dirty by editing without saving
let project_path = project
.read_with(cx, |project, cx| {
project.find_project_path("root/test.txt", cx)
})
.expect("Should find project path");
let buffer = project
.update(cx, |project, cx| project.open_buffer(project_path, cx))
.await
.unwrap();
// Make an in-memory edit to the buffer (making it dirty)
buffer.update(cx, |buffer, cx| {
let end_point = buffer.max_point();
buffer.edit([(end_point..end_point, " added text")], None, cx);
});
// Verify buffer is dirty
let is_dirty = buffer.read_with(cx, |buffer, _| buffer.is_dirty());
assert!(is_dirty, "Buffer should be dirty after in-memory edit");
// Try to edit - should fail because buffer has unsaved changes
let result = cx
.update(|cx| {
edit_tool.clone().run(
EditFileToolInput {
display_description: "Edit with dirty buffer".into(),
path: "root/test.txt".into(),
mode: EditFileMode::Edit,
},
ToolCallEventStream::test().0,
cx,
)
})
.await;
assert!(result.is_err(), "Edit should fail when buffer is dirty");
let error_msg = result.unwrap_err().to_string();
assert!(
error_msg.contains("cannot be written to because it has unsaved changes"),
"Error should mention unsaved changes, got: {}",
error_msg
);
}
}

View File

@@ -1,7 +1,7 @@
use action_log::ActionLog;
use agent_client_protocol::{self as acp, ToolCallUpdateFields};
use anyhow::{Context as _, Result, anyhow};
use gpui::{App, Entity, SharedString, Task};
use gpui::{App, Entity, SharedString, Task, WeakEntity};
use indoc::formatdoc;
use language::Point;
use language_model::{LanguageModelImage, LanguageModelToolResultContent};
@@ -12,7 +12,7 @@ use settings::Settings;
use std::sync::Arc;
use util::markdown::MarkdownCodeBlock;
use crate::{AgentTool, ToolCallEventStream, outline};
use crate::{AgentTool, Thread, ToolCallEventStream, outline};
/// Reads the content of the given file in the project.
///
@@ -42,13 +42,19 @@ pub struct ReadFileToolInput {
}
pub struct ReadFileTool {
thread: WeakEntity<Thread>,
project: Entity<Project>,
action_log: Entity<ActionLog>,
}
impl ReadFileTool {
pub fn new(project: Entity<Project>, action_log: Entity<ActionLog>) -> Self {
pub fn new(
thread: WeakEntity<Thread>,
project: Entity<Project>,
action_log: Entity<ActionLog>,
) -> Self {
Self {
thread,
project,
action_log,
}
@@ -195,6 +201,17 @@ impl AgentTool for ReadFileTool {
anyhow::bail!("{file_path} not found");
}
// Record the file read time and mtime
if let Some(mtime) = buffer.read_with(cx, |buffer, _| {
buffer.file().and_then(|file| file.disk_state().mtime())
})? {
self.thread
.update(cx, |thread, _| {
thread.file_read_times.insert(abs_path.to_path_buf(), mtime);
})
.ok();
}
let mut anchor = None;
// Check if specific line ranges are provided
@@ -285,11 +302,15 @@ impl AgentTool for ReadFileTool {
#[cfg(test)]
mod test {
use super::*;
use crate::{ContextServerRegistry, Templates, Thread};
use gpui::{AppContext, TestAppContext, UpdateGlobal as _};
use language::{Language, LanguageConfig, LanguageMatcher, tree_sitter_rust};
use language_model::fake_provider::FakeLanguageModel;
use project::{FakeFs, Project};
use prompt_store::ProjectContext;
use serde_json::json;
use settings::SettingsStore;
use std::sync::Arc;
use util::path;
#[gpui::test]
@@ -300,7 +321,20 @@ mod test {
fs.insert_tree(path!("/root"), json!({})).await;
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let tool = Arc::new(ReadFileTool::new(project, action_log));
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
Some(model),
cx,
)
});
let tool = Arc::new(ReadFileTool::new(thread.downgrade(), project, action_log));
let (event_stream, _) = ToolCallEventStream::test();
let result = cx
@@ -333,7 +367,20 @@ mod test {
.await;
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let tool = Arc::new(ReadFileTool::new(project, action_log));
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
Some(model),
cx,
)
});
let tool = Arc::new(ReadFileTool::new(thread.downgrade(), project, action_log));
let result = cx
.update(|cx| {
let input = ReadFileToolInput {
@@ -363,7 +410,20 @@ mod test {
let language_registry = project.read_with(cx, |project, _| project.languages().clone());
language_registry.add(Arc::new(rust_lang()));
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let tool = Arc::new(ReadFileTool::new(project, action_log));
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
Some(model),
cx,
)
});
let tool = Arc::new(ReadFileTool::new(thread.downgrade(), project, action_log));
let result = cx
.update(|cx| {
let input = ReadFileToolInput {
@@ -435,7 +495,20 @@ mod test {
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let tool = Arc::new(ReadFileTool::new(project, action_log));
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
Some(model),
cx,
)
});
let tool = Arc::new(ReadFileTool::new(thread.downgrade(), project, action_log));
let result = cx
.update(|cx| {
let input = ReadFileToolInput {
@@ -463,7 +536,20 @@ mod test {
.await;
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let tool = Arc::new(ReadFileTool::new(project, action_log));
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
Some(model),
cx,
)
});
let tool = Arc::new(ReadFileTool::new(thread.downgrade(), project, action_log));
// start_line of 0 should be treated as 1
let result = cx
@@ -607,7 +693,20 @@ mod test {
let project = Project::test(fs.clone(), [path!("/project_root").as_ref()], cx).await;
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let tool = Arc::new(ReadFileTool::new(project, action_log));
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
Some(model),
cx,
)
});
let tool = Arc::new(ReadFileTool::new(thread.downgrade(), project, action_log));
// Reading a file outside the project worktree should fail
let result = cx
@@ -821,7 +920,24 @@ mod test {
.await;
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let tool = Arc::new(ReadFileTool::new(project.clone(), action_log.clone()));
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
Some(model),
cx,
)
});
let tool = Arc::new(ReadFileTool::new(
thread.downgrade(),
project.clone(),
action_log.clone(),
));
// Test reading allowed files in worktree1
let result = cx
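Aside on the read-tracking introduced above: the hunks thread a WeakEntity<Thread> into ReadFileTool so that, on every successful read, the file's on-disk mtime is recorded in thread.file_read_times. A minimal, self-contained sketch of the bookkeeping this enables — the struct and method names below are illustrative, not the actual Zed API:

use std::{collections::HashMap, path::PathBuf, time::SystemTime};

// Illustrative stand-in for the `file_read_times` map the diff adds to `Thread`.
struct FileReadTimes(HashMap<PathBuf, SystemTime>);

impl FileReadTimes {
    // Called after a successful read, mirroring `thread.file_read_times.insert(...)` above.
    fn record(&mut self, path: PathBuf, mtime: SystemTime) {
        self.0.insert(path, mtime);
    }

    // A later edit can consult this to detect files that changed on disk since they were read.
    fn is_stale(&self, path: &PathBuf, current_mtime: SystemTime) -> bool {
        match self.0.get(path) {
            Some(read_at) => current_mtime > *read_at,
            None => true, // never read in this thread: require a fresh read first
        }
    }
}

This sits alongside the dirty-buffer guard exercised by the test at the top of this section, which covers unsaved in-memory edits; the mtime map addresses the on-disk side of the same staleness problem.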

View File

@@ -35,6 +35,7 @@ pub struct AcpConnection {
auth_methods: Vec<acp::AuthMethod>,
agent_capabilities: acp::AgentCapabilities,
default_mode: Option<acp::SessionModeId>,
default_model: Option<acp::ModelId>,
root_dir: PathBuf,
// NB: Don't move this into the wait_task, since we need to ensure the process is
// killed on drop (setting kill_on_drop on the command seems to not always work).
@@ -57,6 +58,7 @@ pub async fn connect(
command: AgentServerCommand,
root_dir: &Path,
default_mode: Option<acp::SessionModeId>,
default_model: Option<acp::ModelId>,
is_remote: bool,
cx: &mut AsyncApp,
) -> Result<Rc<dyn AgentConnection>> {
@@ -66,6 +68,7 @@ pub async fn connect(
command.clone(),
root_dir,
default_mode,
default_model,
is_remote,
cx,
)
@@ -82,6 +85,7 @@ impl AcpConnection {
command: AgentServerCommand,
root_dir: &Path,
default_mode: Option<acp::SessionModeId>,
default_model: Option<acp::ModelId>,
is_remote: bool,
cx: &mut AsyncApp,
) -> Result<Self> {
@@ -207,6 +211,7 @@ impl AcpConnection {
sessions,
agent_capabilities: response.agent_capabilities,
default_mode,
default_model,
_io_task: io_task,
_wait_task: wait_task,
_stderr_task: stderr_task,
@@ -245,39 +250,61 @@ impl AgentConnection for AcpConnection {
let conn = self.connection.clone();
let sessions = self.sessions.clone();
let default_mode = self.default_mode.clone();
let default_model = self.default_model.clone();
let cwd = cwd.to_path_buf();
let context_server_store = project.read(cx).context_server_store().read(cx);
let mcp_servers = if project.read(cx).is_local() {
context_server_store
.configured_server_ids()
.iter()
.filter_map(|id| {
let configuration = context_server_store.configuration_for_server(id)?;
let command = configuration.command();
Some(acp::McpServer::Stdio {
name: id.0.to_string(),
command: command.path.clone(),
args: command.args.clone(),
env: if let Some(env) = command.env.as_ref() {
env.iter()
.map(|(name, value)| acp::EnvVariable {
name: name.clone(),
value: value.clone(),
meta: None,
})
.collect()
} else {
vec![]
},
let mcp_servers =
if project.read(cx).is_local() {
context_server_store
.configured_server_ids()
.iter()
.filter_map(|id| {
let configuration = context_server_store.configuration_for_server(id)?;
match &*configuration {
project::context_server_store::ContextServerConfiguration::Custom {
command,
..
}
| project::context_server_store::ContextServerConfiguration::Extension {
command,
..
} => Some(acp::McpServer::Stdio {
name: id.0.to_string(),
command: command.path.clone(),
args: command.args.clone(),
env: if let Some(env) = command.env.as_ref() {
env.iter()
.map(|(name, value)| acp::EnvVariable {
name: name.clone(),
value: value.clone(),
meta: None,
})
.collect()
} else {
vec![]
},
}),
project::context_server_store::ContextServerConfiguration::Http {
url,
headers,
} => Some(acp::McpServer::Http {
name: id.0.to_string(),
url: url.to_string(),
headers: headers.iter().map(|(name, value)| acp::HttpHeader {
name: name.clone(),
value: value.clone(),
meta: None,
}).collect(),
}),
}
})
})
.collect()
} else {
// In SSH projects, the external agent is running on the remote
// machine, and currently we only run MCP servers on the local
// machine. So don't pass any MCP servers to the agent in that case.
Vec::new()
};
.collect()
} else {
// In SSH projects, the external agent is running on the remote
// machine, and currently we only run MCP servers on the local
// machine. So don't pass any MCP servers to the agent in that case.
Vec::new()
};
cx.spawn(async move |cx| {
let response = conn
@@ -312,6 +339,7 @@ impl AgentConnection for AcpConnection {
let default_mode = default_mode.clone();
let session_id = response.session_id.clone();
let modes = modes.clone();
let conn = conn.clone();
async move |_| {
let result = conn.set_session_mode(acp::SetSessionModeRequest {
session_id,
@@ -346,6 +374,53 @@ impl AgentConnection for AcpConnection {
}
}
if let Some(default_model) = default_model {
if let Some(models) = models.as_ref() {
let mut models_ref = models.borrow_mut();
let has_model = models_ref.available_models.iter().any(|model| model.model_id == default_model);
if has_model {
let initial_model_id = models_ref.current_model_id.clone();
cx.spawn({
let default_model = default_model.clone();
let session_id = response.session_id.clone();
let models = models.clone();
let conn = conn.clone();
async move |_| {
let result = conn.set_session_model(acp::SetSessionModelRequest {
session_id,
model_id: default_model,
meta: None,
})
.await.log_err();
if result.is_none() {
models.borrow_mut().current_model_id = initial_model_id;
}
}
}).detach();
models_ref.current_model_id = default_model;
} else {
let available_models = models_ref
.available_models
.iter()
.map(|model| format!("- `{}`: {}", model.model_id, model.name))
.collect::<Vec<_>>()
.join("\n");
log::warn!(
"`{default_model}` is not a valid {name} model. Available options:\n{available_models}",
);
}
} else {
log::warn!(
"`{name}` does not support model selection, but `default_model` was set in settings.",
);
}
}
let session_id = response.session_id;
let action_log = cx.new(|_| ActionLog::new(project.clone()))?;
let thread = cx.new(|cx| {
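Note on the default_model block above: as with default_mode just before it, the configured id is validated against available_models, applied to the local state immediately, and rolled back only if the set_session_model request fails (unknown ids are logged together with the available options instead of failing session setup). A compact sketch of that optimistic-update pattern in isolation — types are simplified and this is not the ACP API itself:

use std::{cell::RefCell, future::Future, rc::Rc};

// Hypothetical reduction of the "set locally, revert if the agent rejects it" flow above.
async fn apply_default_model<T: Clone>(
    current: Rc<RefCell<T>>,
    desired: T,
    set_on_agent: impl Future<Output = Result<(), ()>>,
) {
    let previous = current.borrow().clone();
    *current.borrow_mut() = desired; // show the configured default right away
    if set_on_agent.await.is_err() {
        *current.borrow_mut() = previous; // the request failed: restore the old selection
    }
}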

View File

@@ -68,6 +68,18 @@ pub trait AgentServer: Send {
) {
}
fn default_model(&self, _cx: &mut App) -> Option<agent_client_protocol::ModelId> {
None
}
fn set_default_model(
&self,
_model_id: Option<agent_client_protocol::ModelId>,
_fs: Arc<dyn Fs>,
_cx: &mut App,
) {
}
fn connect(
&self,
root_dir: Option<&Path>,

View File

@@ -55,6 +55,27 @@ impl AgentServer for ClaudeCode {
});
}
fn default_model(&self, cx: &mut App) -> Option<acp::ModelId> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).claude.clone()
});
settings
.as_ref()
.and_then(|s| s.default_model.clone().map(|m| acp::ModelId(m.into())))
}
fn set_default_model(&self, model_id: Option<acp::ModelId>, fs: Arc<dyn Fs>, cx: &mut App) {
update_settings_file(fs, cx, |settings, _| {
settings
.agent_servers
.get_or_insert_default()
.claude
.get_or_insert_default()
.default_model = model_id.map(|m| m.to_string())
});
}
fn connect(
&self,
root_dir: Option<&Path>,
@@ -68,6 +89,7 @@ impl AgentServer for ClaudeCode {
let store = delegate.store.downgrade();
let extra_env = load_proxy_env(cx);
let default_mode = self.default_mode(cx);
let default_model = self.default_model(cx);
cx.spawn(async move |cx| {
let (command, root_dir, login) = store
@@ -90,6 +112,7 @@ impl AgentServer for ClaudeCode {
command,
root_dir.as_ref(),
default_mode,
default_model,
is_remote,
cx,
)
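The setter above persists the chosen id under the agent's entry in agent_servers. An illustrative, not canonical, settings fragment held in a Rust string so the example stays in the document's language — the key layout follows the settings structs referenced in the diff, and the model id is made up:

// Hypothetical resulting `settings.json` fragment after picking a default Claude Code model.
const EXAMPLE_AGENT_SERVERS_SETTINGS: &str = r#"{
    "agent_servers": {
        "claude": {
            "default_model": "claude-haiku-example"
        }
    }
}"#;

fn main() {
    println!("{EXAMPLE_AGENT_SERVERS_SETTINGS}");
}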

View File

@@ -56,6 +56,27 @@ impl AgentServer for Codex {
});
}
fn default_model(&self, cx: &mut App) -> Option<acp::ModelId> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).codex.clone()
});
settings
.as_ref()
.and_then(|s| s.default_model.clone().map(|m| acp::ModelId(m.into())))
}
fn set_default_model(&self, model_id: Option<acp::ModelId>, fs: Arc<dyn Fs>, cx: &mut App) {
update_settings_file(fs, cx, |settings, _| {
settings
.agent_servers
.get_or_insert_default()
.codex
.get_or_insert_default()
.default_model = model_id.map(|m| m.to_string())
});
}
fn connect(
&self,
root_dir: Option<&Path>,
@@ -69,6 +90,7 @@ impl AgentServer for Codex {
let store = delegate.store.downgrade();
let extra_env = load_proxy_env(cx);
let default_mode = self.default_mode(cx);
let default_model = self.default_model(cx);
cx.spawn(async move |cx| {
let (command, root_dir, login) = store
@@ -92,6 +114,7 @@ impl AgentServer for Codex {
command,
root_dir.as_ref(),
default_mode,
default_model,
is_remote,
cx,
)

View File

@@ -61,6 +61,34 @@ impl crate::AgentServer for CustomAgentServer {
});
}
fn default_model(&self, cx: &mut App) -> Option<acp::ModelId> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings
.get::<AllAgentServersSettings>(None)
.custom
.get(&self.name())
.cloned()
});
settings
.as_ref()
.and_then(|s| s.default_model.clone().map(|m| acp::ModelId(m.into())))
}
fn set_default_model(&self, model_id: Option<acp::ModelId>, fs: Arc<dyn Fs>, cx: &mut App) {
let name = self.name();
update_settings_file(fs, cx, move |settings, _| {
if let Some(settings) = settings
.agent_servers
.get_or_insert_default()
.custom
.get_mut(&name)
{
settings.default_model = model_id.map(|m| m.to_string())
}
});
}
fn connect(
&self,
root_dir: Option<&Path>,
@@ -72,6 +100,7 @@ impl crate::AgentServer for CustomAgentServer {
let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
let is_remote = delegate.project.read(cx).is_via_remote_server();
let default_mode = self.default_mode(cx);
let default_model = self.default_model(cx);
let store = delegate.store.downgrade();
let extra_env = load_proxy_env(cx);
@@ -98,6 +127,7 @@ impl crate::AgentServer for CustomAgentServer {
command,
root_dir.as_ref(),
default_mode,
default_model,
is_remote,
cx,
)

View File

@@ -476,6 +476,7 @@ pub async fn init_test(cx: &mut TestAppContext) -> Arc<FakeFs> {
env: None,
ignore_system_version: None,
default_mode: None,
default_model: None,
}),
gemini: Some(crate::gemini::tests::local_command().into()),
codex: Some(BuiltinAgentServerSettings {
@@ -484,6 +485,7 @@ pub async fn init_test(cx: &mut TestAppContext) -> Arc<FakeFs> {
env: None,
ignore_system_version: None,
default_mode: None,
default_model: None,
}),
custom: collections::HashMap::default(),
},

View File

@@ -37,6 +37,7 @@ impl AgentServer for Gemini {
let store = delegate.store.downgrade();
let mut extra_env = load_proxy_env(cx);
let default_mode = self.default_mode(cx);
let default_model = self.default_model(cx);
cx.spawn(async move |cx| {
extra_env.insert("SURFACE".to_owned(), "zed".to_owned());
@@ -69,6 +70,7 @@ impl AgentServer for Gemini {
command,
root_dir.as_ref(),
default_mode,
default_model,
is_remote,
cx,
)

View File

@@ -98,6 +98,8 @@ util.workspace = true
watch.workspace = true
workspace.workspace = true
zed_actions.workspace = true
image.workspace = true
async-fs.workspace = true
[dev-dependencies]
acp_thread = { workspace = true, features = ["test-support"] }

View File

@@ -13,7 +13,7 @@ use collections::{HashMap, HashSet};
use editor::{
Addon, Anchor, AnchorRangeExt, ContextMenuOptions, ContextMenuPlacement, Editor, EditorElement,
EditorEvent, EditorMode, EditorSnapshot, EditorStyle, ExcerptId, FoldPlaceholder, Inlay,
MultiBuffer, ToOffset,
MultiBuffer, MultiBufferOffset, ToOffset,
actions::Paste,
code_context_menus::CodeContextMenu,
display_map::{Crease, CreaseId, FoldId},
@@ -28,6 +28,7 @@ use gpui::{
EventEmitter, FocusHandle, Focusable, Image, ImageFormat, Img, KeyContext, SharedString,
Subscription, Task, TextStyle, WeakEntity, pulsating_between,
};
use itertools::Either;
use language::{Buffer, Language, language_settings::InlayHintKind};
use language_model::LanguageModelImage;
use postage::stream::Stream as _;
@@ -208,7 +209,7 @@ impl MessageEditor {
let acp::AvailableCommandInput::Unstructured { mut hint } =
available_command.input.clone()?;
let mut hint_pos = parsed_command.source_range.end + 1;
let mut hint_pos = MultiBufferOffset(parsed_command.source_range.end) + 1usize;
if hint_pos > snapshot.len() {
hint_pos = snapshot.len();
hint.insert(0, ' ');
@@ -306,9 +307,9 @@ impl MessageEditor {
return Task::ready(());
};
let excerpt_id = start_anchor.excerpt_id;
let end_anchor = snapshot
.buffer_snapshot()
.anchor_before(start_anchor.to_offset(&snapshot.buffer_snapshot()) + content_len + 1);
let end_anchor = snapshot.buffer_snapshot().anchor_before(
start_anchor.to_offset(&snapshot.buffer_snapshot()) + content_len + 1usize,
);
let crease = if let MentionUri::File { abs_path } = &mention_uri
&& let Some(extension) = abs_path.extension()
@@ -738,8 +739,8 @@ impl MessageEditor {
};
let crease_range = crease.range().to_offset(&snapshot.buffer_snapshot());
if crease_range.start > ix {
let chunk = text[ix..crease_range.start].into();
if crease_range.start.0 > ix {
let chunk = text[ix..crease_range.start.0].into();
chunks.push(chunk);
}
let chunk = match mention {
@@ -807,7 +808,7 @@ impl MessageEditor {
}),
};
chunks.push(chunk);
ix = crease_range.end;
ix = crease_range.end.0;
}
if ix < text.len() {
@@ -861,7 +862,7 @@ impl MessageEditor {
let snapshot = editor.display_snapshot(cx);
let cursor = editor.selections.newest::<text::Point>(&snapshot).head();
let offset = cursor.to_offset(&snapshot);
if offset > 0 {
if offset.0 > 0 {
snapshot
.buffer_snapshot()
.reversed_chars_at(offset)
@@ -912,74 +913,114 @@ impl MessageEditor {
if !self.prompt_capabilities.borrow().image {
return;
}
let images = cx
.read_from_clipboard()
.map(|item| {
item.into_entries()
.filter_map(|entry| {
if let ClipboardEntry::Image(image) = entry {
Some(image)
} else {
None
}
})
.collect::<Vec<_>>()
})
.unwrap_or_default();
if images.is_empty() {
let Some(clipboard) = cx.read_from_clipboard() else {
return;
}
cx.stop_propagation();
};
cx.spawn_in(window, async move |this, cx| {
use itertools::Itertools;
let (mut images, paths) = clipboard
.into_entries()
.filter_map(|entry| match entry {
ClipboardEntry::Image(image) => Some(Either::Left(image)),
ClipboardEntry::ExternalPaths(paths) => Some(Either::Right(paths)),
_ => None,
})
.partition_map::<Vec<_>, Vec<_>, _, _, _>(std::convert::identity);
let replacement_text = MentionUri::PastedImage.as_link().to_string();
for image in images {
let (excerpt_id, text_anchor, multibuffer_anchor) =
self.editor.update(cx, |message_editor, cx| {
let snapshot = message_editor.snapshot(window, cx);
let (excerpt_id, _, buffer_snapshot) =
snapshot.buffer_snapshot().as_singleton().unwrap();
if !paths.is_empty() {
images.extend(
cx.background_spawn(async move {
let mut images = vec![];
for path in paths.into_iter().flat_map(|paths| paths.paths().to_owned()) {
let Ok(content) = async_fs::read(path).await else {
continue;
};
let Ok(format) = image::guess_format(&content) else {
continue;
};
images.push(gpui::Image::from_bytes(
match format {
image::ImageFormat::Png => gpui::ImageFormat::Png,
image::ImageFormat::Jpeg => gpui::ImageFormat::Jpeg,
image::ImageFormat::WebP => gpui::ImageFormat::Webp,
image::ImageFormat::Gif => gpui::ImageFormat::Gif,
image::ImageFormat::Bmp => gpui::ImageFormat::Bmp,
image::ImageFormat::Tiff => gpui::ImageFormat::Tiff,
image::ImageFormat::Ico => gpui::ImageFormat::Ico,
_ => continue,
},
content,
));
}
images
})
.await,
);
}
let text_anchor = buffer_snapshot.anchor_before(buffer_snapshot.len());
let multibuffer_anchor = snapshot
.buffer_snapshot()
.anchor_in_excerpt(*excerpt_id, text_anchor);
message_editor.edit(
[(
multi_buffer::Anchor::max()..multi_buffer::Anchor::max(),
format!("{replacement_text} "),
)],
if images.is_empty() {
return;
}
let replacement_text = MentionUri::PastedImage.as_link().to_string();
let Ok(editor) = this.update(cx, |this, cx| {
cx.stop_propagation();
this.editor.clone()
}) else {
return;
};
for image in images {
let Ok((excerpt_id, text_anchor, multibuffer_anchor)) =
editor.update_in(cx, |message_editor, window, cx| {
let snapshot = message_editor.snapshot(window, cx);
let (excerpt_id, _, buffer_snapshot) =
snapshot.buffer_snapshot().as_singleton().unwrap();
let text_anchor = buffer_snapshot.anchor_before(buffer_snapshot.len());
let multibuffer_anchor = snapshot
.buffer_snapshot()
.anchor_in_excerpt(*excerpt_id, text_anchor);
message_editor.edit(
[(
multi_buffer::Anchor::max()..multi_buffer::Anchor::max(),
format!("{replacement_text} "),
)],
cx,
);
(*excerpt_id, text_anchor, multibuffer_anchor)
})
else {
break;
};
let content_len = replacement_text.len();
let Some(start_anchor) = multibuffer_anchor else {
continue;
};
let Ok(end_anchor) = editor.update(cx, |editor, cx| {
let snapshot = editor.buffer().read(cx).snapshot(cx);
snapshot.anchor_before(start_anchor.to_offset(&snapshot) + content_len)
}) else {
continue;
};
let image = Arc::new(image);
let Ok(Some((crease_id, tx))) = cx.update(|window, cx| {
insert_crease_for_mention(
excerpt_id,
text_anchor,
content_len,
MentionUri::PastedImage.name().into(),
IconName::Image.path().into(),
Some(Task::ready(Ok(image.clone())).shared()),
editor.clone(),
window,
cx,
);
(*excerpt_id, text_anchor, multibuffer_anchor)
});
let content_len = replacement_text.len();
let Some(start_anchor) = multibuffer_anchor else {
continue;
};
let end_anchor = self.editor.update(cx, |editor, cx| {
let snapshot = editor.buffer().read(cx).snapshot(cx);
snapshot.anchor_before(start_anchor.to_offset(&snapshot) + content_len)
});
let image = Arc::new(image);
let Some((crease_id, tx)) = insert_crease_for_mention(
excerpt_id,
text_anchor,
content_len,
MentionUri::PastedImage.name().into(),
IconName::Image.path().into(),
Some(Task::ready(Ok(image.clone())).shared()),
self.editor.clone(),
window,
cx,
) else {
continue;
};
let task = cx
.spawn_in(window, {
async move |_, cx| {
)
}) else {
continue;
};
let task = cx
.spawn(async move |cx| {
let format = image.format;
let image = cx
.update(|_, cx| LanguageModelImage::from_image(image, cx))
@@ -994,15 +1035,16 @@ impl MessageEditor {
} else {
Err("Failed to convert image".into())
}
}
})
.shared();
this.update(cx, |this, _| {
this.mention_set
.mentions
.insert(crease_id, (MentionUri::PastedImage, task.clone()))
})
.shared();
.ok();
self.mention_set
.mentions
.insert(crease_id, (MentionUri::PastedImage, task.clone()));
cx.spawn_in(window, async move |this, cx| {
if task.await.notify_async_err(cx).is_none() {
this.update(cx, |this, cx| {
this.editor.update(cx, |editor, cx| {
@@ -1012,9 +1054,9 @@ impl MessageEditor {
})
.ok();
}
})
.detach();
}
}
})
.detach();
}
pub fn insert_dragged_files(
@@ -1090,7 +1132,7 @@ impl MessageEditor {
let cursor_anchor = editor.selections.newest_anchor().head();
let cursor_offset = cursor_anchor.to_offset(&editor_buffer.snapshot(cx));
let anchor = buffer.update(cx, |buffer, _cx| {
buffer.anchor_before(cursor_offset.min(buffer.len()))
buffer.anchor_before(cursor_offset.0.min(buffer.len()))
});
let Some(workspace) = self.workspace.upgrade() else {
return;
@@ -1216,7 +1258,7 @@ impl MessageEditor {
});
for (range, mention_uri, mention) in mentions {
let anchor = snapshot.anchor_before(range.start);
let anchor = snapshot.anchor_before(MultiBufferOffset(range.start));
let Some((crease_id, tx)) = insert_crease_for_mention(
anchor.excerpt_id,
anchor.text_anchor,
@@ -1671,7 +1713,7 @@ mod tests {
use agent::{HistoryStore, outline};
use agent_client_protocol as acp;
use assistant_text_thread::TextThreadStore;
use editor::{AnchorRangeExt as _, Editor, EditorMode};
use editor::{AnchorRangeExt as _, Editor, EditorMode, MultiBufferOffset};
use fs::FakeFs;
use futures::StreamExt as _;
use gpui::{
@@ -2640,7 +2682,7 @@ mod tests {
editor.display_map.update(cx, |display_map, cx| {
display_map
.snapshot(cx)
.folds_in_range(0..snapshot.len())
.folds_in_range(MultiBufferOffset(0)..snapshot.len())
.map(|fold| fold.range.to_point(&snapshot))
.collect()
})
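Several hunks in this file (and in the test modules further down) migrate raw usize offsets to the MultiBufferOffset newtype, which is why call sites now wrap literals (MultiBufferOffset(0)) or unwrap with .0. A minimal sketch of the kind of wrapper involved — the real type lives in Zed's multi_buffer crate and carries more trait impls than shown here:

// Illustrative offset newtype; not the actual `MultiBufferOffset` definition.
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Debug)]
struct Offset(usize);

impl std::ops::Add<usize> for Offset {
    type Output = Offset;
    fn add(self, rhs: usize) -> Offset {
        Offset(self.0 + rhs)
    }
}

fn main() {
    let start = Offset(8);
    let end = start + 7usize; // typed arithmetic, as in `hint_pos + 1usize` above
    assert_eq!(end.0, 15); // unwrap with `.0` when a plain usize is needed
    assert!(Offset(0) < end); // ranges like MultiBufferOffset(0)..snapshot.len() stay ordered
}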

View File

@@ -11,7 +11,7 @@ use ui::{
PopoverMenu, PopoverMenuHandle, Tooltip, prelude::*,
};
use crate::{CycleModeSelector, ToggleProfileSelector};
use crate::{CycleModeSelector, ToggleProfileSelector, ui::HoldForDefault};
pub struct ModeSelector {
connection: Rc<dyn AgentSessionModes>,
@@ -56,6 +56,10 @@ impl ModeSelector {
self.set_mode(all_modes[next_index].id.clone(), cx);
}
pub fn mode(&self) -> acp::SessionModeId {
self.connection.current_mode()
}
pub fn set_mode(&mut self, mode: acp::SessionModeId, cx: &mut Context<Self>) {
let task = self.connection.set_mode(mode, cx);
self.setting_mode = true;
@@ -104,36 +108,11 @@ impl ModeSelector {
entry.documentation_aside(side, DocumentationEdge::Bottom, {
let description = description.clone();
move |cx| {
move |_| {
v_flex()
.gap_1()
.child(Label::new(description.clone()))
.child(
h_flex()
.pt_1()
.border_t_1()
.border_color(cx.theme().colors().border_variant)
.gap_0p5()
.text_sm()
.text_color(Color::Muted.color(cx))
.child("Hold")
.child(h_flex().flex_shrink_0().children(
ui::render_modifiers(
&gpui::Modifiers::secondary_key(),
PlatformStyle::platform(),
None,
Some(ui::TextSize::Default.rems(cx).into()),
true,
),
))
.child(div().map(|this| {
if is_default {
this.child("to also unset as default")
} else {
this.child("to also set as default")
}
})),
)
.child(HoldForDefault::new(is_default))
.into_any_element()
}
})

View File

@@ -1,8 +1,10 @@
use std::{cmp::Reverse, rc::Rc, sync::Arc};
use acp_thread::{AgentModelInfo, AgentModelList, AgentModelSelector};
use agent_servers::AgentServer;
use anyhow::Result;
use collections::IndexMap;
use fs::Fs;
use futures::FutureExt;
use fuzzy::{StringMatchCandidate, match_strings};
use gpui::{AsyncWindowContext, BackgroundExecutor, DismissEvent, Task, WeakEntity};
@@ -14,14 +16,18 @@ use ui::{
};
use util::ResultExt;
use crate::ui::HoldForDefault;
pub type AcpModelSelector = Picker<AcpModelPickerDelegate>;
pub fn acp_model_selector(
selector: Rc<dyn AgentModelSelector>,
agent_server: Rc<dyn AgentServer>,
fs: Arc<dyn Fs>,
window: &mut Window,
cx: &mut Context<AcpModelSelector>,
) -> AcpModelSelector {
let delegate = AcpModelPickerDelegate::new(selector, window, cx);
let delegate = AcpModelPickerDelegate::new(selector, agent_server, fs, window, cx);
Picker::list(delegate, window, cx)
.show_scrollbar(true)
.width(rems(20.))
@@ -35,10 +41,12 @@ enum AcpModelPickerEntry {
pub struct AcpModelPickerDelegate {
selector: Rc<dyn AgentModelSelector>,
agent_server: Rc<dyn AgentServer>,
fs: Arc<dyn Fs>,
filtered_entries: Vec<AcpModelPickerEntry>,
models: Option<AgentModelList>,
selected_index: usize,
selected_description: Option<(usize, SharedString)>,
selected_description: Option<(usize, SharedString, bool)>,
selected_model: Option<AgentModelInfo>,
_refresh_models_task: Task<()>,
}
@@ -46,6 +54,8 @@ pub struct AcpModelPickerDelegate {
impl AcpModelPickerDelegate {
fn new(
selector: Rc<dyn AgentModelSelector>,
agent_server: Rc<dyn AgentServer>,
fs: Arc<dyn Fs>,
window: &mut Window,
cx: &mut Context<AcpModelSelector>,
) -> Self {
@@ -86,6 +96,8 @@ impl AcpModelPickerDelegate {
Self {
selector,
agent_server,
fs,
filtered_entries: Vec::new(),
models: None,
selected_model: None,
@@ -181,6 +193,21 @@ impl PickerDelegate for AcpModelPickerDelegate {
if let Some(AcpModelPickerEntry::Model(model_info)) =
self.filtered_entries.get(self.selected_index)
{
if window.modifiers().secondary() {
let default_model = self.agent_server.default_model(cx);
let is_default = default_model.as_ref() == Some(&model_info.id);
self.agent_server.set_default_model(
if is_default {
None
} else {
Some(model_info.id.clone())
},
self.fs.clone(),
cx,
);
}
self.selector
.select_model(model_info.id.clone(), cx)
.detach_and_log_err(cx);
@@ -225,6 +252,8 @@ impl PickerDelegate for AcpModelPickerDelegate {
),
AcpModelPickerEntry::Model(model_info) => {
let is_selected = Some(model_info) == self.selected_model.as_ref();
let default_model = self.agent_server.default_model(cx);
let is_default = default_model.as_ref() == Some(&model_info.id);
let model_icon_color = if is_selected {
Color::Accent
@@ -239,8 +268,8 @@ impl PickerDelegate for AcpModelPickerDelegate {
this
.on_hover(cx.listener(move |menu, hovered, _, cx| {
if *hovered {
menu.delegate.selected_description = Some((ix, description.clone()));
} else if matches!(menu.delegate.selected_description, Some((id, _)) if id == ix) {
menu.delegate.selected_description = Some((ix, description.clone(), is_default));
} else if matches!(menu.delegate.selected_description, Some((id, _, _)) if id == ix) {
menu.delegate.selected_description = None;
}
cx.notify();
@@ -251,17 +280,17 @@ impl PickerDelegate for AcpModelPickerDelegate {
.inset(true)
.spacing(ListItemSpacing::Sparse)
.toggle_state(selected)
.start_slot::<Icon>(model_info.icon.map(|icon| {
Icon::new(icon)
.color(model_icon_color)
.size(IconSize::Small)
}))
.child(
h_flex()
.w_full()
.pl_0p5()
.gap_1p5()
.w(px(240.))
.when_some(model_info.icon, |this, icon| {
this.child(
Icon::new(icon)
.color(model_icon_color)
.size(IconSize::Small)
)
})
.child(Label::new(model_info.name.clone()).truncate()),
)
.end_slot(div().pr_3().when(is_selected, |this| {
@@ -283,14 +312,24 @@ impl PickerDelegate for AcpModelPickerDelegate {
_window: &mut Window,
_cx: &mut Context<Picker<Self>>,
) -> Option<ui::DocumentationAside> {
self.selected_description.as_ref().map(|(_, description)| {
let description = description.clone();
DocumentationAside::new(
DocumentationSide::Left,
DocumentationEdge::Top,
Rc::new(move |_| Label::new(description.clone()).into_any_element()),
)
})
self.selected_description
.as_ref()
.map(|(_, description, is_default)| {
let description = description.clone();
let is_default = *is_default;
DocumentationAside::new(
DocumentationSide::Left,
DocumentationEdge::Top,
Rc::new(move |_| {
v_flex()
.gap_1()
.child(Label::new(description.clone()))
.child(HoldForDefault::new(is_default))
.into_any_element()
}),
)
})
}
}

View File

@@ -1,6 +1,9 @@
use std::rc::Rc;
use std::sync::Arc;
use acp_thread::{AgentModelInfo, AgentModelSelector};
use agent_servers::AgentServer;
use fs::Fs;
use gpui::{Entity, FocusHandle};
use picker::popover_menu::PickerPopoverMenu;
use ui::{
@@ -20,13 +23,15 @@ pub struct AcpModelSelectorPopover {
impl AcpModelSelectorPopover {
pub(crate) fn new(
selector: Rc<dyn AgentModelSelector>,
agent_server: Rc<dyn AgentServer>,
fs: Arc<dyn Fs>,
menu_handle: PopoverMenuHandle<AcpModelSelector>,
focus_handle: FocusHandle,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
Self {
selector: cx.new(move |cx| acp_model_selector(selector, window, cx)),
selector: cx.new(move |cx| acp_model_selector(selector, agent_server, fs, window, cx)),
menu_handle,
focus_handle,
}

View File

@@ -51,7 +51,7 @@ use ui::{
PopoverMenuHandle, SpinnerLabel, TintColor, Tooltip, WithScrollbar, prelude::*,
};
use util::{ResultExt, size::format_file_size, time::duration_alt_display};
use workspace::{CollaboratorId, Workspace};
use workspace::{CollaboratorId, NewTerminal, Workspace};
use zed_actions::agent::{Chat, ToggleModelSelector};
use zed_actions::assistant::OpenRulesLibrary;
@@ -69,8 +69,8 @@ use crate::ui::{
};
use crate::{
AgentDiffPane, AgentPanel, AllowAlways, AllowOnce, ContinueThread, ContinueWithBurnMode,
CycleModeSelector, ExpandMessageEditor, Follow, KeepAll, OpenAgentDiff, OpenHistory, RejectAll,
RejectOnce, ToggleBurnMode, ToggleProfileSelector,
CycleModeSelector, ExpandMessageEditor, Follow, KeepAll, NewThread, OpenAgentDiff, OpenHistory,
RejectAll, RejectOnce, ToggleBurnMode, ToggleProfileSelector,
};
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
@@ -591,9 +591,13 @@ impl AcpThreadView {
.connection()
.model_selector(thread.read(cx).session_id())
.map(|selector| {
let agent_server = this.agent.clone();
let fs = this.project.read(cx).fs().clone();
cx.new(|cx| {
AcpModelSelectorPopover::new(
selector,
agent_server,
fs,
PopoverMenuHandle::default(),
this.focus_handle(cx),
window,
@@ -1135,6 +1139,7 @@ impl AcpThreadView {
self.is_loading_contents = true;
let model_id = self.current_model_id(cx);
let mode_id = self.current_mode_id(cx);
let guard = cx.new(|_| ());
cx.observe_release(&guard, |this, _guard, cx| {
this.is_loading_contents = false;
@@ -1169,7 +1174,8 @@ impl AcpThreadView {
"Agent Message Sent",
agent = agent_telemetry_id,
session = session_id,
model = model_id
model = model_id,
mode = mode_id
);
thread.send(contents, cx)
@@ -1182,6 +1188,7 @@ impl AcpThreadView {
agent = agent_telemetry_id,
session = session_id,
model = model_id,
mode = mode_id,
status,
turn_time_ms,
);
@@ -3144,7 +3151,7 @@ impl AcpThreadView {
.text_ui_sm(cx)
.h_full()
.children(terminal_view.map(|terminal_view| {
if terminal_view
let element = if terminal_view
.read(cx)
.content_mode(window, cx)
.is_scrollable()
@@ -3152,7 +3159,15 @@ impl AcpThreadView {
div().h_72().child(terminal_view).into_any_element()
} else {
terminal_view.into_any_element()
}
};
div()
.on_action(cx.listener(|_this, _: &NewTerminal, window, cx| {
window.dispatch_action(NewThread.boxed_clone(), cx);
cx.stop_propagation();
}))
.child(element)
.into_any_element()
})),
)
})
@@ -5397,6 +5412,16 @@ impl AcpThreadView {
)
}
fn current_mode_id(&self, cx: &App) -> Option<Arc<str>> {
if let Some(thread) = self.as_native_thread(cx) {
Some(thread.read(cx).profile().0.clone())
} else if let Some(mode_selector) = self.mode_selector() {
Some(mode_selector.read(cx).mode().0)
} else {
None
}
}
fn current_model_id(&self, cx: &App) -> Option<String> {
self.model_selector
.as_ref()
@@ -6045,6 +6070,7 @@ pub(crate) mod tests {
use acp_thread::StubAgentConnection;
use agent_client_protocol::SessionId;
use assistant_text_thread::TextThreadStore;
use editor::MultiBufferOffset;
use fs::FakeFs;
use gpui::{EventEmitter, SemanticVersion, TestAppContext, VisualTestContext};
use project::Project;
@@ -7213,7 +7239,7 @@ pub(crate) mod tests {
Editor::for_buffer(buffer.clone(), Some(project.clone()), window, cx);
editor.change_selections(Default::default(), window, cx, |selections| {
selections.select_ranges([8..15]);
selections.select_ranges([MultiBufferOffset(8)..MultiBufferOffset(15)]);
});
editor
@@ -7275,7 +7301,7 @@ pub(crate) mod tests {
Editor::for_buffer(buffer.clone(), Some(project.clone()), window, cx);
editor.change_selections(Default::default(), window, cx, |selections| {
selections.select_ranges([8..15]);
selections.select_ranges([MultiBufferOffset(8)..MultiBufferOffset(15)]);
});
editor

View File

@@ -1,5 +1,5 @@
mod add_llm_provider_modal;
mod configure_context_server_modal;
pub mod configure_context_server_modal;
mod configure_context_server_tools_modal;
mod manage_profiles_modal;
mod tool_picker;
@@ -12,7 +12,7 @@ use client::zed_urls;
use cloud_llm_client::{Plan, PlanV1, PlanV2};
use collections::HashMap;
use context_server::ContextServerId;
use editor::{Editor, SelectionEffects, scroll::Autoscroll};
use editor::{Editor, MultiBufferOffset, SelectionEffects, scroll::Autoscroll};
use extension::ExtensionManifest;
use extension_host::ExtensionStore;
use fs::Fs;
@@ -46,9 +46,8 @@ pub(crate) use configure_context_server_modal::ConfigureContextServerModal;
pub(crate) use configure_context_server_tools_modal::ConfigureContextServerToolsModal;
pub(crate) use manage_profiles_modal::ManageProfilesModal;
use crate::{
AddContextServer,
agent_configuration::add_llm_provider_modal::{AddLlmProviderModal, LlmCompatibleProvider},
use crate::agent_configuration::add_llm_provider_modal::{
AddLlmProviderModal, LlmCompatibleProvider,
};
pub struct AgentConfiguration {
@@ -553,7 +552,9 @@ impl AgentConfiguration {
move |window, cx| {
Some(ContextMenu::build(window, cx, |menu, _window, _cx| {
menu.entry("Add Custom Server", None, {
|window, cx| window.dispatch_action(AddContextServer.boxed_clone(), cx)
|window, cx| {
window.dispatch_action(crate::AddContextServer.boxed_clone(), cx)
}
})
.entry("Install from Extensions", None, {
|window, cx| {
@@ -651,7 +652,7 @@ impl AgentConfiguration {
let is_running = matches!(server_status, ContextServerStatus::Running);
let item_id = SharedString::from(context_server_id.0.clone());
// Servers without a configuration can only be provided by extensions.
let provided_by_extension = server_configuration.is_none_or(|config| {
let provided_by_extension = server_configuration.as_ref().is_none_or(|config| {
matches!(
config.as_ref(),
ContextServerConfiguration::Extension { .. }
@@ -707,7 +708,10 @@ impl AgentConfiguration {
"Server is stopped.",
),
};
let is_remote = server_configuration
.as_ref()
.map(|config| matches!(config.as_ref(), ContextServerConfiguration::Http { .. }))
.unwrap_or(false);
let context_server_configuration_menu = PopoverMenu::new("context-server-config-menu")
.trigger_with_tooltip(
IconButton::new("context-server-config-menu", IconName::Settings)
@@ -730,14 +734,25 @@ impl AgentConfiguration {
let language_registry = language_registry.clone();
let workspace = workspace.clone();
move |window, cx| {
ConfigureContextServerModal::show_modal_for_existing_server(
context_server_id.clone(),
language_registry.clone(),
workspace.clone(),
window,
cx,
)
.detach_and_log_err(cx);
if is_remote {
crate::agent_configuration::configure_context_server_modal::ConfigureContextServerModal::show_modal_for_existing_server(
context_server_id.clone(),
language_registry.clone(),
workspace.clone(),
window,
cx,
)
.detach();
} else {
ConfigureContextServerModal::show_modal_for_existing_server(
context_server_id.clone(),
language_registry.clone(),
workspace.clone(),
window,
cx,
)
.detach();
}
}
}).when(tool_count > 0, |this| this.entry("View Tools", None, {
let context_server_id = context_server_id.clone();
@@ -1333,6 +1348,7 @@ async fn open_new_agent_servers_entry_in_settings_editor(
args: vec![],
env: Some(HashMap::default()),
default_mode: None,
default_model: None,
},
);
}
@@ -1347,7 +1363,15 @@ async fn open_new_agent_servers_entry_in_settings_editor(
.map(|(range, _)| range.clone())
.collect::<Vec<_>>();
item.edit(edits, cx);
item.edit(
edits.into_iter().map(|(range, s)| {
(
MultiBufferOffset(range.start)..MultiBufferOffset(range.end),
s,
)
}),
cx,
);
if let Some((unique_server_name, buffer)) =
unique_server_name.zip(item.buffer().read(cx).as_singleton())
{
@@ -1360,7 +1384,9 @@ async fn open_new_agent_servers_entry_in_settings_editor(
window,
cx,
|selections| {
selections.select_ranges(vec![range]);
selections.select_ranges(vec![
MultiBufferOffset(range.start)..MultiBufferOffset(range.end),
]);
},
);
}

View File

@@ -3,16 +3,42 @@ use std::sync::Arc;
use anyhow::Result;
use collections::HashSet;
use fs::Fs;
use gpui::{DismissEvent, Entity, EventEmitter, FocusHandle, Focusable, Render, Task};
use gpui::{
DismissEvent, Entity, EventEmitter, FocusHandle, Focusable, Render, ScrollHandle, Task,
};
use language_model::LanguageModelRegistry;
use language_models::provider::open_ai_compatible::{AvailableModel, ModelCapabilities};
use settings::{OpenAiCompatibleSettingsContent, update_settings_file};
use ui::{
Banner, Checkbox, KeyBinding, Modal, ModalFooter, ModalHeader, Section, ToggleState, prelude::*,
Banner, Checkbox, KeyBinding, Modal, ModalFooter, ModalHeader, Section, ToggleState,
WithScrollbar, prelude::*,
};
use ui_input::InputField;
use workspace::{ModalView, Workspace};
fn single_line_input(
label: impl Into<SharedString>,
placeholder: impl Into<SharedString>,
text: Option<&str>,
tab_index: isize,
window: &mut Window,
cx: &mut App,
) -> Entity<InputField> {
cx.new(|cx| {
let input = InputField::new(window, cx, placeholder)
.label(label)
.tab_index(tab_index)
.tab_stop(true);
if let Some(text) = text {
input
.editor()
.update(cx, |editor, cx| editor.set_text(text, window, cx));
}
input
})
}
#[derive(Clone, Copy)]
pub enum LlmCompatibleProvider {
OpenAi,
@@ -41,12 +67,14 @@ struct AddLlmProviderInput {
impl AddLlmProviderInput {
fn new(provider: LlmCompatibleProvider, window: &mut Window, cx: &mut App) -> Self {
let provider_name = single_line_input("Provider Name", provider.name(), None, window, cx);
let api_url = single_line_input("API URL", provider.api_url(), None, window, cx);
let provider_name =
single_line_input("Provider Name", provider.name(), None, 1, window, cx);
let api_url = single_line_input("API URL", provider.api_url(), None, 2, window, cx);
let api_key = single_line_input(
"API Key",
"000000000000000000000000000000000000000000000000",
None,
3,
window,
cx,
);
@@ -55,12 +83,13 @@ impl AddLlmProviderInput {
provider_name,
api_url,
api_key,
models: vec![ModelInput::new(window, cx)],
models: vec![ModelInput::new(0, window, cx)],
}
}
fn add_model(&mut self, window: &mut Window, cx: &mut App) {
self.models.push(ModelInput::new(window, cx));
let model_index = self.models.len();
self.models.push(ModelInput::new(model_index, window, cx));
}
fn remove_model(&mut self, index: usize) {
@@ -84,11 +113,14 @@ struct ModelInput {
}
impl ModelInput {
fn new(window: &mut Window, cx: &mut App) -> Self {
fn new(model_index: usize, window: &mut Window, cx: &mut App) -> Self {
let base_tab_index = (3 + (model_index * 4)) as isize;
let model_name = single_line_input(
"Model Name",
"e.g. gpt-4o, claude-opus-4, gemini-2.5-pro",
None,
base_tab_index + 1,
window,
cx,
);
@@ -96,6 +128,7 @@ impl ModelInput {
"Max Completion Tokens",
"200000",
Some("200000"),
base_tab_index + 2,
window,
cx,
);
@@ -103,16 +136,26 @@ impl ModelInput {
"Max Output Tokens",
"Max Output Tokens",
Some("32000"),
base_tab_index + 3,
window,
cx,
);
let max_tokens = single_line_input("Max Tokens", "Max Tokens", Some("200000"), window, cx);
let max_tokens = single_line_input(
"Max Tokens",
"Max Tokens",
Some("200000"),
base_tab_index + 4,
window,
cx,
);
let ModelCapabilities {
tools,
images,
parallel_tool_calls,
prompt_cache_key,
} = ModelCapabilities::default();
Self {
name: model_name,
max_completion_tokens,
@@ -165,24 +208,6 @@ impl ModelInput {
}
}
fn single_line_input(
label: impl Into<SharedString>,
placeholder: impl Into<SharedString>,
text: Option<&str>,
window: &mut Window,
cx: &mut App,
) -> Entity<InputField> {
cx.new(|cx| {
let input = InputField::new(window, cx, placeholder).label(label);
if let Some(text) = text {
input
.editor()
.update(cx, |editor, cx| editor.set_text(text, window, cx));
}
input
})
}
fn save_provider_to_settings(
input: &AddLlmProviderInput,
cx: &mut App,
@@ -258,6 +283,7 @@ fn save_provider_to_settings(
pub struct AddLlmProviderModal {
provider: LlmCompatibleProvider,
input: AddLlmProviderInput,
scroll_handle: ScrollHandle,
focus_handle: FocusHandle,
last_error: Option<SharedString>,
}
@@ -278,6 +304,7 @@ impl AddLlmProviderModal {
provider,
last_error: None,
focus_handle: cx.focus_handle(),
scroll_handle: ScrollHandle::new(),
}
}
@@ -418,6 +445,19 @@ impl AddLlmProviderModal {
)
})
}
fn on_tab(&mut self, _: &menu::SelectNext, window: &mut Window, _: &mut Context<Self>) {
window.focus_next();
}
fn on_tab_prev(
&mut self,
_: &menu::SelectPrevious,
window: &mut Window,
_: &mut Context<Self>,
) {
window.focus_prev();
}
}
impl EventEmitter<DismissEvent> for AddLlmProviderModal {}
@@ -431,15 +471,27 @@ impl Focusable for AddLlmProviderModal {
impl ModalView for AddLlmProviderModal {}
impl Render for AddLlmProviderModal {
fn render(&mut self, _window: &mut ui::Window, cx: &mut ui::Context<Self>) -> impl IntoElement {
fn render(&mut self, window: &mut ui::Window, cx: &mut ui::Context<Self>) -> impl IntoElement {
let focus_handle = self.focus_handle(cx);
div()
let window_size = window.viewport_size();
let rem_size = window.rem_size();
let is_large_window = window_size.height / rem_size > rems_from_px(600.).0;
let modal_max_height = if is_large_window {
rems_from_px(450.)
} else {
rems_from_px(200.)
};
v_flex()
.id("add-llm-provider-modal")
.key_context("AddLlmProviderModal")
.w(rems(34.))
.elevation_3(cx)
.on_action(cx.listener(Self::cancel))
.on_action(cx.listener(Self::on_tab))
.on_action(cx.listener(Self::on_tab_prev))
.capture_any_mouse_down(cx.listener(|this, _, window, cx| {
this.focus_handle(cx).focus(window);
}))
@@ -462,17 +514,25 @@ impl Render for AddLlmProviderModal {
)
})
.child(
v_flex()
.id("modal_content")
div()
.size_full()
.max_h_128()
.overflow_y_scroll()
.px(DynamicSpacing::Base12.rems(cx))
.gap(DynamicSpacing::Base04.rems(cx))
.child(self.input.provider_name.clone())
.child(self.input.api_url.clone())
.child(self.input.api_key.clone())
.child(self.render_model_section(cx)),
.vertical_scrollbar_for(self.scroll_handle.clone(), window, cx)
.child(
v_flex()
.id("modal_content")
.size_full()
.tab_group()
.max_h(modal_max_height)
.pl_3()
.pr_4()
.gap_2()
.overflow_y_scroll()
.track_scroll(&self.scroll_handle)
.child(self.input.provider_name.clone())
.child(self.input.api_url.clone())
.child(self.input.api_key.clone())
.child(self.render_model_section(cx)),
),
)
.footer(
ModalFooter::new().end_slot(
@@ -642,7 +702,7 @@ mod tests {
let cx = setup_test(cx).await;
cx.update(|window, cx| {
let model_input = ModelInput::new(window, cx);
let model_input = ModelInput::new(0, window, cx);
model_input.name.update(cx, |input, cx| {
input.editor().update(cx, |editor, cx| {
editor.set_text("somemodel", window, cx);
@@ -678,7 +738,7 @@ mod tests {
let cx = setup_test(cx).await;
cx.update(|window, cx| {
let mut model_input = ModelInput::new(window, cx);
let mut model_input = ModelInput::new(0, window, cx);
model_input.name.update(cx, |input, cx| {
input.editor().update(cx, |editor, cx| {
editor.set_text("somemodel", window, cx);
@@ -703,7 +763,7 @@ mod tests {
let cx = setup_test(cx).await;
cx.update(|window, cx| {
let mut model_input = ModelInput::new(window, cx);
let mut model_input = ModelInput::new(0, window, cx);
model_input.name.update(cx, |input, cx| {
input.editor().update(cx, |editor, cx| {
editor.set_text("somemodel", window, cx);
@@ -767,7 +827,7 @@ mod tests {
models.iter().enumerate()
{
if i >= input.models.len() {
input.models.push(ModelInput::new(window, cx));
input.models.push(ModelInput::new(i, window, cx));
}
let model = &mut input.models[i];
set_text(&model.name, name, window, cx);
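On the tab_index plumbing added in this file: indices 1–3 are reserved for the provider name, API URL, and API key fields, and each ModelInput then receives a block of four consecutive indices via base_tab_index = 3 + model_index * 4. A tiny check of that assumed numbering (illustrative only):

fn base_tab_index(model_index: usize) -> isize {
    (3 + model_index * 4) as isize
}

fn main() {
    // Provider fields: name = 1, API URL = 2, API key = 3.
    assert_eq!(base_tab_index(0) + 1, 4); // first model's name field
    assert_eq!(base_tab_index(0) + 4, 7); // first model's max-tokens field
    assert_eq!(base_tab_index(1) + 1, 8); // second model picks up right after the first
}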

View File

@@ -4,6 +4,7 @@ use std::{
};
use anyhow::{Context as _, Result};
use collections::HashMap;
use context_server::{ContextServerCommand, ContextServerId};
use editor::{Editor, EditorElement, EditorStyle};
use gpui::{
@@ -20,6 +21,7 @@ use project::{
project_settings::{ContextServerSettings, ProjectSettings},
worktree_store::WorktreeStore,
};
use serde::Deserialize;
use settings::{Settings as _, update_settings_file};
use theme::ThemeSettings;
use ui::{
@@ -37,6 +39,11 @@ enum ConfigurationTarget {
id: ContextServerId,
command: ContextServerCommand,
},
ExistingHttp {
id: ContextServerId,
url: String,
headers: HashMap<String, String>,
},
Extension {
id: ContextServerId,
repository_url: Option<SharedString>,
@@ -47,9 +54,11 @@ enum ConfigurationTarget {
enum ConfigurationSource {
New {
editor: Entity<Editor>,
is_http: bool,
},
Existing {
editor: Entity<Editor>,
is_http: bool,
},
Extension {
id: ContextServerId,
@@ -97,6 +106,7 @@ impl ConfigurationSource {
match target {
ConfigurationTarget::New => ConfigurationSource::New {
editor: create_editor(context_server_input(None), jsonc_language, window, cx),
is_http: false,
},
ConfigurationTarget::Existing { id, command } => ConfigurationSource::Existing {
editor: create_editor(
@@ -105,6 +115,20 @@ impl ConfigurationSource {
window,
cx,
),
is_http: false,
},
ConfigurationTarget::ExistingHttp {
id,
url,
headers: auth,
} => ConfigurationSource::Existing {
editor: create_editor(
context_server_http_input(Some((id, url, auth))),
jsonc_language,
window,
cx,
),
is_http: true,
},
ConfigurationTarget::Extension {
id,
@@ -141,16 +165,30 @@ impl ConfigurationSource {
fn output(&self, cx: &mut App) -> Result<(ContextServerId, ContextServerSettings)> {
match self {
ConfigurationSource::New { editor } | ConfigurationSource::Existing { editor } => {
parse_input(&editor.read(cx).text(cx)).map(|(id, command)| {
(
id,
ContextServerSettings::Custom {
enabled: true,
command,
},
)
})
ConfigurationSource::New { editor, is_http }
| ConfigurationSource::Existing { editor, is_http } => {
if *is_http {
parse_http_input(&editor.read(cx).text(cx)).map(|(id, url, auth)| {
(
id,
ContextServerSettings::Http {
enabled: true,
url,
headers: auth,
},
)
})
} else {
parse_input(&editor.read(cx).text(cx)).map(|(id, command)| {
(
id,
ContextServerSettings::Custom {
enabled: true,
command,
},
)
})
}
}
ConfigurationSource::Extension {
id,
@@ -212,6 +250,66 @@ fn context_server_input(existing: Option<(ContextServerId, ContextServerCommand)
)
}
fn context_server_http_input(
existing: Option<(ContextServerId, String, HashMap<String, String>)>,
) -> String {
let (name, url, headers) = match existing {
Some((id, url, headers)) => {
let header = if headers.is_empty() {
r#"// "Authorization": "Bearer <token>"#.to_string()
} else {
let json = serde_json::to_string_pretty(&headers).unwrap();
let mut lines = json.split("\n").collect::<Vec<_>>();
if lines.len() > 1 {
lines.remove(0);
lines.pop();
}
lines
.into_iter()
.map(|line| format!(" {}", line))
.collect::<String>()
};
(id.0.to_string(), url, header)
}
None => (
"some-remote-server".to_string(),
"https://example.com/mcp".to_string(),
r#"// "Authorization": "Bearer <token>"#.to_string(),
),
};
format!(
r#"{{
/// The name of your remote MCP server
"{name}": {{
/// The URL of the remote MCP server
"url": "{url}",
"headers": {{
/// Any headers to send along
{headers}
}}
}}
}}"#
)
}
fn parse_http_input(text: &str) -> Result<(ContextServerId, String, HashMap<String, String>)> {
#[derive(Deserialize)]
struct Temp {
url: String,
#[serde(default)]
headers: HashMap<String, String>,
}
let value: HashMap<String, Temp> = serde_json_lenient::from_str(text)?;
if value.len() != 1 {
anyhow::bail!("Expected exactly one context server configuration");
}
let (key, value) = value.into_iter().next().unwrap();
Ok((ContextServerId(key.into()), value.url, value.headers))
}
fn resolve_context_server_extension(
id: ContextServerId,
worktree_store: Entity<WorktreeStore>,
@@ -312,6 +410,15 @@ impl ConfigureContextServerModal {
id: server_id,
command,
}),
ContextServerSettings::Http {
enabled: _,
url,
headers,
} => Some(ConfigurationTarget::ExistingHttp {
id: server_id,
url,
headers,
}),
ContextServerSettings::Extension { .. } => {
match workspace
.update(cx, |workspace, cx| {
@@ -353,6 +460,7 @@ impl ConfigureContextServerModal {
state: State::Idle,
original_server_id: match &target {
ConfigurationTarget::Existing { id, .. } => Some(id.clone()),
ConfigurationTarget::ExistingHttp { id, .. } => Some(id.clone()),
ConfigurationTarget::Extension { id, .. } => Some(id.clone()),
ConfigurationTarget::New => None,
},
@@ -481,7 +589,7 @@ impl ModalView for ConfigureContextServerModal {}
impl Focusable for ConfigureContextServerModal {
fn focus_handle(&self, cx: &App) -> FocusHandle {
match &self.source {
ConfigurationSource::New { editor } => editor.focus_handle(cx),
ConfigurationSource::New { editor, .. } => editor.focus_handle(cx),
ConfigurationSource::Existing { editor, .. } => editor.focus_handle(cx),
ConfigurationSource::Extension { editor, .. } => editor
.as_ref()
@@ -527,9 +635,10 @@ impl ConfigureContextServerModal {
}
fn render_modal_content(&self, cx: &App) -> AnyElement {
// All variants now use a single-editor approach
let editor = match &self.source {
ConfigurationSource::New { editor } => editor,
ConfigurationSource::Existing { editor } => editor,
ConfigurationSource::New { editor, .. } => editor,
ConfigurationSource::Existing { editor, .. } => editor,
ConfigurationSource::Extension { editor, .. } => {
let Some(editor) = editor else {
return div().into_any_element();
@@ -601,6 +710,36 @@ impl ConfigureContextServerModal {
move |_, _, cx| cx.open_url(&repository_url)
}),
)
} else if let ConfigurationSource::New { is_http, .. } = &self.source {
let label = if *is_http {
"Run command"
} else {
"Connect via HTTP"
};
let tooltip = if *is_http {
"Configure an MCP serevr that runs on stdin/stdout."
} else {
"Configure an MCP server that you connect to over HTTP"
};
Some(
Button::new("toggle-kind", label)
.tooltip(Tooltip::text(tooltip))
.on_click(cx.listener(|this, _, window, cx| match &mut this.source {
ConfigurationSource::New { editor, is_http } => {
*is_http = !*is_http;
let new_text = if *is_http {
context_server_http_input(None)
} else {
context_server_input(None)
};
editor.update(cx, |editor, cx| {
editor.set_text(new_text, window, cx);
})
}
_ => {}
})),
)
} else {
None
},
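For reference, a stand-alone illustration of the shape parse_http_input above expects: one top-level key naming the server, with a url and an optional headers map. The JSON below is invented for the example, and it uses plain serde_json rather than the serde_json_lenient parser the modal uses to tolerate JSONC comments:

use std::collections::HashMap;

use serde::Deserialize;

#[derive(Deserialize)]
struct HttpServerEntry {
    url: String,
    #[serde(default)]
    headers: HashMap<String, String>,
}

fn main() -> anyhow::Result<()> {
    let text = r#"{
        "some-remote-server": {
            "url": "https://example.com/mcp",
            "headers": { "Authorization": "Bearer <token>" }
        }
    }"#;
    let servers: HashMap<String, HttpServerEntry> = serde_json::from_str(text)?;
    anyhow::ensure!(servers.len() == 1, "expected exactly one context server configuration");
    let (name, entry) = servers.into_iter().next().unwrap();
    println!("{name} -> {} ({} header(s))", entry.url, entry.headers.len());
    Ok(())
}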

View File

@@ -13,8 +13,8 @@ use editor::{
scroll::Autoscroll,
};
use gpui::{
Action, AnyElement, AnyView, App, AppContext, Empty, Entity, EventEmitter, FocusHandle,
Focusable, Global, SharedString, Subscription, Task, WeakEntity, Window, prelude::*,
Action, AnyElement, App, AppContext, Empty, Entity, EventEmitter, FocusHandle, Focusable,
Global, SharedString, Subscription, Task, WeakEntity, Window, prelude::*,
};
use language::{Buffer, Capability, DiskState, OffsetRangeExt, Point};
@@ -580,11 +580,11 @@ impl Item for AgentDiffPane {
type_id: TypeId,
self_handle: &'a Entity<Self>,
_: &'a App,
) -> Option<AnyView> {
) -> Option<gpui::AnyEntity> {
if type_id == TypeId::of::<Self>() {
Some(self_handle.to_any())
Some(self_handle.clone().into())
} else if type_id == TypeId::of::<Editor>() {
Some(self.editor.to_any())
Some(self.editor.clone().into())
} else {
None
}

View File

@@ -1892,6 +1892,9 @@ impl AgentPanel {
.anchor(Corner::TopRight)
.with_handle(self.new_thread_menu_handle.clone())
.menu({
let selected_agent = self.selected_agent.clone();
let is_agent_selected = move |agent_type: AgentType| selected_agent == agent_type;
let workspace = self.workspace.clone();
let is_via_collab = workspace
.update(cx, |workspace, cx| {
@@ -1905,7 +1908,6 @@ impl AgentPanel {
let active_thread = active_thread.clone();
Some(ContextMenu::build(window, cx, |menu, _window, cx| {
menu.context(focus_handle.clone())
.header("Zed Agent")
.when_some(active_thread, |this, active_thread| {
let thread = active_thread.read(cx);
@@ -1929,9 +1931,11 @@ impl AgentPanel {
}
})
.item(
ContextMenuEntry::new("New Thread")
.action(NewThread.boxed_clone())
.icon(IconName::Thread)
ContextMenuEntry::new("Zed Agent")
.when(is_agent_selected(AgentType::NativeAgent) | is_agent_selected(AgentType::TextThread), |this| {
this.action(Box::new(NewExternalAgentThread { agent: None }))
})
.icon(IconName::ZedAgent)
.icon_color(Color::Muted)
.handler({
let workspace = workspace.clone();
@@ -1955,10 +1959,10 @@ impl AgentPanel {
}),
)
.item(
ContextMenuEntry::new("New Text Thread")
ContextMenuEntry::new("Text Thread")
.action(NewTextThread.boxed_clone())
.icon(IconName::TextThread)
.icon_color(Color::Muted)
.action(NewTextThread.boxed_clone())
.handler({
let workspace = workspace.clone();
move |window, cx| {
@@ -1983,7 +1987,10 @@ impl AgentPanel {
.separator()
.header("External Agents")
.item(
ContextMenuEntry::new("New Claude Code")
ContextMenuEntry::new("Claude Code")
.when(is_agent_selected(AgentType::ClaudeCode), |this| {
this.action(Box::new(NewExternalAgentThread { agent: None }))
})
.icon(IconName::AiClaude)
.disabled(is_via_collab)
.icon_color(Color::Muted)
@@ -2009,7 +2016,10 @@ impl AgentPanel {
}),
)
.item(
ContextMenuEntry::new("New Codex CLI")
ContextMenuEntry::new("Codex CLI")
.when(is_agent_selected(AgentType::Codex), |this| {
this.action(Box::new(NewExternalAgentThread { agent: None }))
})
.icon(IconName::AiOpenAi)
.disabled(is_via_collab)
.icon_color(Color::Muted)
@@ -2035,7 +2045,10 @@ impl AgentPanel {
}),
)
.item(
ContextMenuEntry::new("New Gemini CLI")
ContextMenuEntry::new("Gemini CLI")
.when(is_agent_selected(AgentType::Gemini), |this| {
this.action(Box::new(NewExternalAgentThread { agent: None }))
})
.icon(IconName::AiGemini)
.icon_color(Color::Muted)
.disabled(is_via_collab)
@@ -2061,8 +2074,8 @@ impl AgentPanel {
}),
)
.map(|mut menu| {
let agent_server_store_read = agent_server_store.read(cx);
let agent_names = agent_server_store_read
let agent_server_store = agent_server_store.read(cx);
let agent_names = agent_server_store
.external_agents()
.filter(|name| {
name.0 != GEMINI_NAME
@@ -2071,21 +2084,38 @@ impl AgentPanel {
})
.cloned()
.collect::<Vec<_>>();
let custom_settings = cx
.global::<SettingsStore>()
.get::<AllAgentServersSettings>(None)
.custom
.clone();
for agent_name in agent_names {
let icon_path = agent_server_store_read.agent_icon(&agent_name);
let mut entry =
ContextMenuEntry::new(format!("New {}", agent_name));
let icon_path = agent_server_store.agent_icon(&agent_name);
let mut entry = ContextMenuEntry::new(agent_name.clone());
let command = custom_settings
.get(&agent_name.0)
.map(|settings| settings.command.clone())
.unwrap_or(placeholder_command());
if let Some(icon_path) = icon_path {
entry = entry.custom_icon_svg(icon_path);
} else {
entry = entry.icon(IconName::Terminal);
}
entry = entry
.when(
is_agent_selected(AgentType::Custom {
name: agent_name.0.clone(),
command: command.clone(),
}),
|this| {
this.action(Box::new(NewExternalAgentThread { agent: None }))
},
)
.icon_color(Color::Muted)
.disabled(is_via_collab)
.handler({
@@ -2125,6 +2155,7 @@ impl AgentPanel {
}
}
});
menu = menu.item(entry);
}
@@ -2157,7 +2188,7 @@ impl AgentPanel {
.id("selected_agent_icon")
.when_some(selected_agent_custom_icon, |this, icon_path| {
let label = selected_agent_label.clone();
this.px(DynamicSpacing::Base02.rems(cx))
this.px_1()
.child(Icon::from_external_svg(icon_path).color(Color::Muted))
.tooltip(move |_window, cx| {
Tooltip::with_meta(label.clone(), None, "Selected Agent", cx)
@@ -2166,7 +2197,7 @@ impl AgentPanel {
.when(!has_custom_icon, |this| {
this.when_some(self.selected_agent.icon(), |this, icon| {
let label = selected_agent_label.clone();
this.px(DynamicSpacing::Base02.rems(cx))
this.px_1()
.child(Icon::new(icon).color(Color::Muted))
.tooltip(move |_window, cx| {
Tooltip::with_meta(label.clone(), None, "Selected Agent", cx)
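The menu-building hunks above lean on gpui's fluent builder helpers (`when`, `when_some`) to attach an action or icon only when the entry matches the currently selected agent. A simplified sketch of what those combinators do (signatures are illustrative; the actual helpers in gpui may differ in detail):

trait FluentBuilder: Sized {
    // Apply `then` only if `condition` holds; otherwise return the builder unchanged.
    fn when(self, condition: bool, then: impl FnOnce(Self) -> Self) -> Self {
        if condition { then(self) } else { self }
    }

    // Apply `then` with the inner value only if `option` is Some.
    fn when_some<T>(self, option: Option<T>, then: impl FnOnce(Self, T) -> Self) -> Self {
        match option {
            Some(value) => then(self, value),
            None => self,
        }
    }
}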


@@ -346,7 +346,9 @@ fn update_command_palette_filter(cx: &mut App) {
filter.show_namespace("supermaven");
filter.show_action_types(edit_prediction_actions.iter());
}
EditPredictionProvider::Zed | EditPredictionProvider::Codestral => {
EditPredictionProvider::Zed
| EditPredictionProvider::Codestral
| EditPredictionProvider::Experimental(_) => {
filter.show_namespace("edit_prediction");
filter.hide_namespace("copilot");
filter.hide_namespace("supermaven");


@@ -429,7 +429,12 @@ impl CodegenAlternative {
let prompt = self
.builder
.generate_inline_transformation_prompt(user_prompt, language_name, buffer, range)
.generate_inline_transformation_prompt(
user_prompt,
language_name,
buffer,
range.start.0..range.end.0,
)
.context("generating content prompt")?;
let context_task = self.context_store.as_ref().and_then(|context_store| {


@@ -1082,7 +1082,7 @@ impl MentionCompletion {
#[cfg(test)]
mod tests {
use super::*;
use editor::AnchorRangeExt;
use editor::{AnchorRangeExt, MultiBufferOffset};
use gpui::{EventEmitter, FocusHandle, Focusable, TestAppContext, VisualTestContext};
use project::{Project, ProjectPath};
use serde_json::json;
@@ -1677,7 +1677,7 @@ mod tests {
editor.display_map.update(cx, |display_map, cx| {
display_map
.snapshot(cx)
.folds_in_range(0..snapshot.len())
.folds_in_range(MultiBufferOffset(0)..snapshot.len())
.map(|fold| fold.range.to_point(&snapshot))
.collect()
})


@@ -16,6 +16,7 @@ use agent_settings::AgentSettings;
use anyhow::{Context as _, Result};
use client::telemetry::Telemetry;
use collections::{HashMap, HashSet, VecDeque, hash_map};
use editor::MultiBufferOffset;
use editor::RowExt;
use editor::SelectionEffects;
use editor::scroll::ScrollOffset;
@@ -803,7 +804,7 @@ impl InlineAssistant {
(
editor
.selections
.newest::<usize>(&editor.display_snapshot(cx)),
.newest::<MultiBufferOffset>(&editor.display_snapshot(cx)),
editor.buffer().read(cx).snapshot(cx),
)
});
@@ -836,7 +837,7 @@ impl InlineAssistant {
(
editor
.selections
.newest::<usize>(&editor.display_snapshot(cx)),
.newest::<MultiBufferOffset>(&editor.display_snapshot(cx)),
editor.buffer().read(cx).snapshot(cx),
)
});
@@ -853,12 +854,14 @@ impl InlineAssistant {
} else {
let distance_from_selection = assist_range
.start
.abs_diff(selection.start)
.min(assist_range.start.abs_diff(selection.end))
.0
.abs_diff(selection.start.0)
.min(assist_range.start.0.abs_diff(selection.end.0))
+ assist_range
.end
.abs_diff(selection.start)
.min(assist_range.end.abs_diff(selection.end));
.0
.abs_diff(selection.start.0)
.min(assist_range.end.0.abs_diff(selection.end.0));
match closest_assist_fallback {
Some((_, old_distance)) => {
if distance_from_selection < old_distance {
@@ -935,7 +938,7 @@ impl InlineAssistant {
EditorEvent::Edited { transaction_id } => {
let buffer = editor.read(cx).buffer().read(cx);
let edited_ranges =
buffer.edited_ranges_for_transaction::<usize>(*transaction_id, cx);
buffer.edited_ranges_for_transaction::<MultiBufferOffset>(*transaction_id, cx);
let snapshot = buffer.snapshot(cx);
for assist_id in editor_assists.assist_ids.clone() {
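These hunks swap raw `usize` offsets for a `MultiBufferOffset` newtype, so plain arithmetic now goes through the inner `.0` field. A minimal sketch of the pattern, assuming a simple tuple-struct wrapper (the real type in Zed's multi_buffer crate may implement more traits):

#[derive(Copy, Clone, Debug, PartialEq, Eq, PartialOrd, Ord)]
pub struct MultiBufferOffset(pub usize);

// Mirrors the distance computation above: compare each end of the assist range
// against the nearer end of the selection, unwrapping with `.0` before abs_diff.
fn distance_from_selection(
    assist: std::ops::Range<MultiBufferOffset>,
    selection: std::ops::Range<MultiBufferOffset>,
) -> usize {
    assist.start.0.abs_diff(selection.start.0).min(assist.start.0.abs_diff(selection.end.0))
        + assist.end.0.abs_diff(selection.start.0).min(assist.end.0.abs_diff(selection.end.0))
}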


@@ -2,7 +2,7 @@ use agent::HistoryStore;
use collections::{HashMap, VecDeque};
use editor::actions::Paste;
use editor::display_map::{CreaseId, EditorMargins};
use editor::{Addon, AnchorRangeExt as _};
use editor::{Addon, AnchorRangeExt as _, MultiBufferOffset};
use editor::{
ContextMenuOptions, Editor, EditorElement, EditorEvent, EditorMode, EditorStyle, MultiBuffer,
actions::{MoveDown, MoveUp},
@@ -1165,7 +1165,7 @@ impl GenerationMode {
/// Stored information that can be used to resurrect a context crease when creating an editor for a past message.
#[derive(Clone, Debug)]
pub struct MessageCrease {
pub range: Range<usize>,
pub range: Range<MultiBufferOffset>,
pub icon_path: SharedString,
pub label: SharedString,
/// None for a deserialized message, Some otherwise.


@@ -492,17 +492,15 @@ impl PickerDelegate for LanguageModelPickerDelegate {
.inset(true)
.spacing(ListItemSpacing::Sparse)
.toggle_state(selected)
.start_slot(
Icon::new(model_info.icon)
.color(model_icon_color)
.size(IconSize::Small),
)
.child(
h_flex()
.w_full()
.pl_0p5()
.gap_1p5()
.w(px(240.))
.child(
Icon::new(model_info.icon)
.color(model_icon_color)
.size(IconSize::Small),
)
.child(Label::new(model_info.model.name().0).truncate()),
)
.end_slot(div().pr_3().when(is_selected, |this| {


@@ -9,8 +9,8 @@ use assistant_slash_commands::{DefaultSlashCommand, FileSlashCommand, selections
use client::{proto, zed_urls};
use collections::{BTreeSet, HashMap, HashSet, hash_map};
use editor::{
Anchor, Editor, EditorEvent, MenuEditPredictionsPolicy, MultiBuffer, MultiBufferSnapshot,
RowExt, ToOffset as _, ToPoint,
Anchor, Editor, EditorEvent, MenuEditPredictionsPolicy, MultiBuffer, MultiBufferOffset,
MultiBufferSnapshot, RowExt, ToOffset as _, ToPoint,
actions::{MoveToEndOfLine, Newline, ShowCompletions},
display_map::{
BlockPlacement, BlockProperties, BlockStyle, Crease, CreaseMetadata, CustomBlockId, FoldId,
@@ -22,11 +22,11 @@ use editor::{FoldPlaceholder, display_map::CreaseId};
use fs::Fs;
use futures::FutureExt;
use gpui::{
Action, Animation, AnimationExt, AnyElement, AnyView, App, ClipboardEntry, ClipboardItem,
Empty, Entity, EventEmitter, FocusHandle, Focusable, FontWeight, Global, InteractiveElement,
IntoElement, ParentElement, Pixels, Render, RenderImage, SharedString, Size,
StatefulInteractiveElement, Styled, Subscription, Task, WeakEntity, actions, div, img, point,
prelude::*, pulsating_between, size,
Action, Animation, AnimationExt, AnyElement, App, ClipboardEntry, ClipboardItem, Empty, Entity,
EventEmitter, FocusHandle, Focusable, FontWeight, Global, InteractiveElement, IntoElement,
ParentElement, Pixels, Render, RenderImage, SharedString, Size, StatefulInteractiveElement,
Styled, Subscription, Task, WeakEntity, actions, div, img, point, prelude::*,
pulsating_between, size,
};
use language::{
BufferSnapshot, LspAdapterDelegate, ToOffset,
@@ -66,7 +66,7 @@ use workspace::{
};
use workspace::{
Save, Toast, Workspace,
item::{self, FollowableItem, Item, ItemHandle},
item::{self, FollowableItem, Item},
notifications::NotificationId,
pane,
searchable::{SearchEvent, SearchableItem},
@@ -390,7 +390,7 @@ impl TextThreadEditor {
let cursor = user_message
.start
.to_offset(self.text_thread.read(cx).buffer().read(cx));
cursor..cursor
MultiBufferOffset(cursor)..MultiBufferOffset(cursor)
};
self.editor.update(cx, |editor, cx| {
editor.change_selections(Default::default(), window, cx, |selections| {
@@ -431,7 +431,7 @@ impl TextThreadEditor {
let cursors = self.cursors(cx);
self.text_thread.update(cx, |text_thread, cx| {
let messages = text_thread
.messages_for_offsets(cursors, cx)
.messages_for_offsets(cursors.into_iter().map(|cursor| cursor.0), cx)
.into_iter()
.map(|message| message.id)
.collect();
@@ -439,9 +439,11 @@ impl TextThreadEditor {
});
}
fn cursors(&self, cx: &mut App) -> Vec<usize> {
fn cursors(&self, cx: &mut App) -> Vec<MultiBufferOffset> {
let selections = self.editor.update(cx, |editor, cx| {
editor.selections.all::<usize>(&editor.display_snapshot(cx))
editor
.selections
.all::<MultiBufferOffset>(&editor.display_snapshot(cx))
});
selections
.into_iter()
@@ -1580,7 +1582,11 @@ impl TextThreadEditor {
fn get_clipboard_contents(
&mut self,
cx: &mut Context<Self>,
) -> (String, CopyMetadata, Vec<text::Selection<usize>>) {
) -> (
String,
CopyMetadata,
Vec<text::Selection<MultiBufferOffset>>,
) {
let (mut selection, creases) = self.editor.update(cx, |editor, cx| {
let mut selection = editor
.selections
@@ -1638,30 +1644,26 @@ impl TextThreadEditor {
// If selection is empty, we want to copy the entire line
if selection.range().is_empty() {
let snapshot = text_thread.buffer().read(cx).snapshot();
let snapshot = self.editor.read(cx).buffer().read(cx).snapshot(cx);
let point = snapshot.offset_to_point(selection.range().start);
selection.start = snapshot.point_to_offset(Point::new(point.row, 0));
selection.end = snapshot
.point_to_offset(cmp::min(Point::new(point.row + 1, 0), snapshot.max_point()));
for chunk in text_thread
.buffer()
.read(cx)
.text_for_range(selection.range())
{
for chunk in snapshot.text_for_range(selection.range()) {
text.push_str(chunk);
}
} else {
for message in text_thread.messages(cx) {
if message.offset_range.start >= selection.range().end {
if message.offset_range.start >= selection.range().end.0 {
break;
} else if message.offset_range.end >= selection.range().start {
let range = cmp::max(message.offset_range.start, selection.range().start)
..cmp::min(message.offset_range.end, selection.range().end);
} else if message.offset_range.end >= selection.range().start.0 {
let range = cmp::max(message.offset_range.start, selection.range().start.0)
..cmp::min(message.offset_range.end, selection.range().end.0);
if !range.is_empty() {
for chunk in text_thread.buffer().read(cx).text_for_range(range) {
text.push_str(chunk);
}
if message.offset_range.end < selection.range().end {
if message.offset_range.end < selection.range().end.0 {
text.push('\n');
}
}
@@ -1679,7 +1681,7 @@ impl TextThreadEditor {
) {
cx.stop_propagation();
let images = if let Some(item) = cx.read_from_clipboard() {
let mut images = if let Some(item) = cx.read_from_clipboard() {
item.into_entries()
.filter_map(|entry| {
if let ClipboardEntry::Image(image) = entry {
@@ -1693,6 +1695,40 @@ impl TextThreadEditor {
Vec::new()
};
if let Some(paths) = cx.read_from_clipboard() {
for path in paths
.into_entries()
.filter_map(|entry| {
if let ClipboardEntry::ExternalPaths(paths) = entry {
Some(paths.paths().to_owned())
} else {
None
}
})
.flatten()
{
let Ok(content) = std::fs::read(path) else {
continue;
};
let Ok(format) = image::guess_format(&content) else {
continue;
};
images.push(gpui::Image::from_bytes(
match format {
image::ImageFormat::Png => gpui::ImageFormat::Png,
image::ImageFormat::Jpeg => gpui::ImageFormat::Jpeg,
image::ImageFormat::WebP => gpui::ImageFormat::Webp,
image::ImageFormat::Gif => gpui::ImageFormat::Gif,
image::ImageFormat::Bmp => gpui::ImageFormat::Bmp,
image::ImageFormat::Tiff => gpui::ImageFormat::Tiff,
image::ImageFormat::Ico => gpui::ImageFormat::Ico,
_ => continue,
},
content,
));
}
}
let metadata = if let Some(item) = cx.read_from_clipboard() {
item.entries().first().and_then(|entry| {
if let ClipboardEntry::String(text) = entry {
@@ -1709,7 +1745,7 @@ impl TextThreadEditor {
self.editor.update(cx, |editor, cx| {
let paste_position = editor
.selections
.newest::<usize>(&editor.display_snapshot(cx))
.newest::<MultiBufferOffset>(&editor.display_snapshot(cx))
.head();
editor.paste(action, window, cx);
@@ -1757,13 +1793,16 @@ impl TextThreadEditor {
editor.transact(window, cx, |editor, _window, cx| {
let edits = editor
.selections
.all::<usize>(&editor.display_snapshot(cx))
.all::<MultiBufferOffset>(&editor.display_snapshot(cx))
.into_iter()
.map(|selection| (selection.start..selection.end, "\n"));
editor.edit(edits, cx);
let snapshot = editor.buffer().read(cx).snapshot(cx);
for selection in editor.selections.all::<usize>(&editor.display_snapshot(cx)) {
for selection in editor
.selections
.all::<MultiBufferOffset>(&editor.display_snapshot(cx))
{
image_positions.push(snapshot.anchor_before(selection.end));
}
});
@@ -1855,7 +1894,7 @@ impl TextThreadEditor {
let range = selection
.map(|endpoint| endpoint.to_offset(&buffer))
.range();
text_thread.split_message(range, cx);
text_thread.split_message(range.start.0..range.end.0, cx);
}
});
}
@@ -2549,11 +2588,11 @@ impl Item for TextThreadEditor {
type_id: TypeId,
self_handle: &'a Entity<Self>,
_: &'a App,
) -> Option<AnyView> {
) -> Option<gpui::AnyEntity> {
if type_id == TypeId::of::<Self>() {
Some(self_handle.to_any())
Some(self_handle.clone().into())
} else if type_id == TypeId::of::<Editor>() {
Some(self.editor.to_any())
Some(self.editor.clone().into())
} else {
None
}
@@ -2592,12 +2631,11 @@ impl SearchableItem for TextThreadEditor {
&mut self,
index: usize,
matches: &[Self::Match],
collapse: bool,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.editor.update(cx, |editor, cx| {
editor.activate_match(index, matches, collapse, window, cx);
editor.activate_match(index, matches, window, cx);
});
}
@@ -2930,7 +2968,7 @@ pub fn make_lsp_adapter_delegate(
#[cfg(test)]
mod tests {
use super::*;
use editor::SelectionEffects;
use editor::{MultiBufferOffset, SelectionEffects};
use fs::FakeFs;
use gpui::{App, TestAppContext, VisualTestContext};
use indoc::indoc;
@@ -3136,15 +3174,16 @@ mod tests {
text_thread: &Entity<TextThread>,
message_ix: usize,
cx: &mut TestAppContext,
) -> Range<usize> {
text_thread.update(cx, |text_thread, cx| {
) -> Range<MultiBufferOffset> {
let range = text_thread.update(cx, |text_thread, cx| {
text_thread
.messages(cx)
.nth(message_ix)
.unwrap()
.anchor_range
.to_offset(&text_thread.buffer().read(cx).snapshot())
})
});
MultiBufferOffset(range.start)..MultiBufferOffset(range.end)
}
fn assert_copy_paste_text_thread_editor<T: editor::ToOffset>(
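The new paste path above also accepts file paths on the clipboard: each path is read from disk, sniffed with `image::guess_format`, and mapped onto the subset of formats gpui understands. Condensed into a standalone helper (illustrative only; the editor code keeps this inline in its loop):

fn load_clipboard_image(path: &std::path::Path) -> Option<gpui::Image> {
    let content = std::fs::read(path).ok()?;
    let format = match image::guess_format(&content).ok()? {
        image::ImageFormat::Png => gpui::ImageFormat::Png,
        image::ImageFormat::Jpeg => gpui::ImageFormat::Jpeg,
        image::ImageFormat::WebP => gpui::ImageFormat::Webp,
        image::ImageFormat::Gif => gpui::ImageFormat::Gif,
        image::ImageFormat::Bmp => gpui::ImageFormat::Bmp,
        image::ImageFormat::Tiff => gpui::ImageFormat::Tiff,
        image::ImageFormat::Ico => gpui::ImageFormat::Ico,
        // Unsupported formats are skipped, matching the `continue` in the original loop.
        _ => return None,
    };
    Some(gpui::Image::from_bytes(format, content))
}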


@@ -4,6 +4,7 @@ mod burn_mode_tooltip;
mod claude_code_onboarding_modal;
mod context_pill;
mod end_trial_upsell;
mod hold_for_default;
mod onboarding_modal;
mod unavailable_editing_tooltip;
mod usage_callout;
@@ -14,6 +15,7 @@ pub use burn_mode_tooltip::*;
pub use claude_code_onboarding_modal::*;
pub use context_pill::*;
pub use end_trial_upsell::*;
pub use hold_for_default::*;
pub use onboarding_modal::*;
pub use unavailable_editing_tooltip::*;
pub use usage_callout::*;


@@ -0,0 +1,40 @@
use gpui::{App, IntoElement, Modifiers, RenderOnce, Window};
use ui::{prelude::*, render_modifiers};
#[derive(IntoElement)]
pub struct HoldForDefault {
is_default: bool,
}
impl HoldForDefault {
pub fn new(is_default: bool) -> Self {
Self { is_default }
}
}
impl RenderOnce for HoldForDefault {
fn render(self, _window: &mut Window, cx: &mut App) -> impl IntoElement {
h_flex()
.pt_1()
.border_t_1()
.border_color(cx.theme().colors().border_variant)
.gap_0p5()
.text_sm()
.text_color(Color::Muted.color(cx))
.child("Hold")
.child(h_flex().flex_shrink_0().children(render_modifiers(
&Modifiers::secondary_key(),
PlatformStyle::platform(),
None,
Some(TextSize::Default.rems(cx).into()),
true,
)))
.child(div().map(|this| {
if self.is_default {
this.child("to unset as default")
} else {
this.child("to set as default")
}
}))
}
}
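HoldForDefault is a RenderOnce component, so it can be dropped into any element tree as a child. A hypothetical placement beneath other picker content (the wrapper function and `v_flex` layout here are assumptions, not the actual menu wiring):

use ui::prelude::*;

// Hypothetical helper: render the "hold to set as default" hint as a footer.
fn picker_footer(is_default: bool) -> impl IntoElement {
    v_flex().child(HoldForDefault::new(is_default))
}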


@@ -250,6 +250,7 @@ impl PasswordProxy {
.await
.with_context(|| format!("creating askpass script at {askpass_script_path:?}"))?;
make_file_executable(&askpass_script_path).await?;
// todo(shell): There might be no powershell on the system
#[cfg(target_os = "windows")]
let askpass_helper = format!(
"powershell.exe -ExecutionPolicy Bypass -File {}",


@@ -667,7 +667,7 @@ pub struct TextThread {
buffer: Entity<Buffer>,
pub(crate) parsed_slash_commands: Vec<ParsedSlashCommand>,
invoked_slash_commands: HashMap<InvokedSlashCommandId, InvokedSlashCommand>,
edits_since_last_parse: language::Subscription,
edits_since_last_parse: language::Subscription<usize>,
slash_commands: Arc<SlashCommandWorkingSet>,
pub(crate) slash_command_output_sections: Vec<SlashCommandOutputSection<language::Anchor>>,
thought_process_output_sections: Vec<ThoughtProcessOutputSection<language::Anchor>>,


@@ -10,8 +10,8 @@ use paths::remote_servers_dir;
use release_channel::{AppCommitSha, ReleaseChannel};
use serde::{Deserialize, Serialize};
use settings::{RegisterSetting, Settings, SettingsStore};
use smol::fs::File;
use smol::{fs, io::AsyncReadExt};
use smol::{fs::File, process::Command};
use std::mem;
use std::{
env::{
@@ -23,6 +23,7 @@ use std::{
sync::Arc,
time::Duration,
};
use util::command::new_smol_command;
use workspace::Workspace;
const SHOULD_SHOW_UPDATE_NOTIFICATION_KEY: &str = "auto-updater-should-show-updated-notification";
@@ -121,7 +122,7 @@ impl Drop for MacOsUnmounter<'_> {
let mount_path = mem::take(&mut self.mount_path);
self.background_executor
.spawn(async move {
let unmount_output = Command::new("hdiutil")
let unmount_output = new_smol_command("hdiutil")
.args(["detach", "-force"])
.arg(&mount_path)
.output()
@@ -799,7 +800,7 @@ async fn install_release_linux(
.await
.context("failed to create directory into which to extract update")?;
let output = Command::new("tar")
let output = new_smol_command("tar")
.arg("-xzf")
.arg(&downloaded_tar_gz)
.arg("-C")
@@ -834,7 +835,7 @@ async fn install_release_linux(
to = PathBuf::from(prefix);
}
let output = Command::new("rsync")
let output = new_smol_command("rsync")
.args(["-av", "--delete"])
.arg(&from)
.arg(&to)
@@ -866,7 +867,7 @@ async fn install_release_macos(
let mut mounted_app_path: OsString = mount_path.join(running_app_filename).into();
mounted_app_path.push("/");
let output = Command::new("hdiutil")
let output = new_smol_command("hdiutil")
.args(["attach", "-nobrowse"])
.arg(&downloaded_dmg)
.arg("-mountroot")
@@ -886,7 +887,7 @@ async fn install_release_macos(
background_executor: cx.background_executor(),
};
let output = Command::new("rsync")
let output = new_smol_command("rsync")
.args(["-av", "--delete"])
.arg(&mounted_app_path)
.arg(&running_app_path)
@@ -917,7 +918,7 @@ async fn cleanup_windows() -> Result<()> {
}
async fn install_release_windows(downloaded_installer: PathBuf) -> Result<Option<PathBuf>> {
let output = Command::new(downloaded_installer)
let output = new_smol_command(downloaded_installer)
.arg("/verysilent")
.arg("/update=true")
.arg("!desktopicon")


@@ -123,7 +123,7 @@ impl Render for Breadcrumbs {
.upgrade()
.zip(zed_actions::outline::TOGGLE_OUTLINE.get())
{
callback(editor.to_any(), window, cx);
callback(editor.to_any_view(), window, cx);
}
}
})


@@ -293,10 +293,11 @@ impl Telemetry {
}
pub fn metrics_enabled(self: &Arc<Self>) -> bool {
let state = self.state.lock();
let enabled = state.settings.metrics;
drop(state);
enabled
self.state.lock().settings.metrics
}
pub fn diagnostics_enabled(self: &Arc<Self>) -> bool {
self.state.lock().settings.diagnostics
}
pub fn set_authenticated_user_info(


@@ -78,6 +78,8 @@ pub enum PromptFormat {
OnlySnippets,
/// One-sentence instructions used in fine-tuned models
Minimal,
/// One-sentence instructions + FIM-like template
MinimalQwen,
}
impl PromptFormat {
@@ -105,6 +107,7 @@ impl std::fmt::Display for PromptFormat {
PromptFormat::NumLinesUniDiff => write!(f, "Numbered Lines / Unified Diff"),
PromptFormat::OldTextNewText => write!(f, "Old Text / New Text"),
PromptFormat::Minimal => write!(f, "Minimal"),
PromptFormat::MinimalQwen => write!(f, "Minimal + Qwen FIM"),
}
}
}


@@ -19,4 +19,5 @@ ordered-float.workspace = true
rustc-hash.workspace = true
schemars.workspace = true
serde.workspace = true
serde_json.workspace = true
strum.workspace = true


@@ -3,7 +3,8 @@ pub mod retrieval_prompt;
use anyhow::{Context as _, Result, anyhow};
use cloud_llm_client::predict_edits_v3::{
self, DiffPathFmt, Excerpt, Line, Point, PromptFormat, ReferencedDeclaration,
self, DiffPathFmt, Event, Excerpt, IncludedFile, Line, Point, PromptFormat,
ReferencedDeclaration,
};
use indoc::indoc;
use ordered_float::OrderedFloat;
@@ -31,7 +32,7 @@ const MARKED_EXCERPT_INSTRUCTIONS: &str = indoc! {"
Other code is provided for context, and `…` indicates when code has been skipped.
# Edit History:
## Edit History
"};
@@ -49,7 +50,7 @@ const LABELED_SECTIONS_INSTRUCTIONS: &str = indoc! {r#"
println!("{i}");
}
# Edit History:
## Edit History
"#};
@@ -89,7 +90,7 @@ const NUMBERED_LINES_INSTRUCTIONS: &str = indoc! {r#"
const STUDENT_MODEL_INSTRUCTIONS: &str = indoc! {r#"
You are a code completion assistant that analyzes edit history to identify and systematically complete incomplete refactorings or patterns across the entire codebase.
# Edit History:
## Edit History
"#};
@@ -119,14 +120,15 @@ const XML_TAGS_INSTRUCTIONS: &str = indoc! {r#"
# Instructions
You are an edit prediction agent in a code editor.
Your job is to predict the next edit that the user will make,
based on their last few edits and their current cursor location.
# Output Format
Analyze the history of edits made by the user in order to infer what they are currently trying to accomplish.
Then complete the remainder of the current change if it is incomplete, or predict the next edit the user intends to make.
Always continue along the user's current trajectory, rather than changing course.
You must briefly explain your understanding of the user's goal, in one
or two sentences, and then specify their next edit, using the following
XML format:
## Output Format
You should briefly explain your understanding of the user's overall goal in one sentence, then explain what the next change
along the users current trajectory will be in another, and finally specify the next edit using the following XML-like format:
<edits path="my-project/src/myapp/cli.py">
<old_text>
@@ -152,20 +154,34 @@ const XML_TAGS_INSTRUCTIONS: &str = indoc! {r#"
- Always close all tags properly
- Don't include the <|user_cursor|> marker in your output.
# Edit History:
## Edit History
"#};
const OLD_TEXT_NEW_TEXT_REMINDER: &str = indoc! {r#"
---
Remember that the edits in the edit history have already been deployed.
The files are currently as shown in the Code Excerpts section.
Remember that the edits in the edit history have already been applied.
"#};
pub fn build_prompt(
request: &predict_edits_v3::PredictEditsRequest,
) -> Result<(String, SectionLabels)> {
let mut section_labels = Default::default();
match request.prompt_format {
PromptFormat::MinimalQwen => {
let prompt = MinimalQwenPrompt {
events: request.events.clone(),
cursor_point: request.cursor_point,
cursor_path: request.excerpt_path.clone(),
included_files: request.included_files.clone(),
};
return Ok((prompt.render(), section_labels));
}
_ => (),
};
let mut insertions = match request.prompt_format {
PromptFormat::MarkedExcerpt => vec![
(
@@ -191,6 +207,7 @@ pub fn build_prompt(
vec![(request.cursor_point, CURSOR_MARKER)]
}
PromptFormat::OnlySnippets => vec![],
PromptFormat::MinimalQwen => unreachable!(),
};
let mut prompt = match request.prompt_format {
@@ -200,6 +217,7 @@ pub fn build_prompt(
PromptFormat::OldTextNewText => XML_TAGS_INSTRUCTIONS.to_string(),
PromptFormat::OnlySnippets => String::new(),
PromptFormat::Minimal => STUDENT_MODEL_INSTRUCTIONS.to_string(),
PromptFormat::MinimalQwen => unreachable!(),
};
if request.events.is_empty() {
@@ -216,23 +234,32 @@ pub fn build_prompt(
let excerpts_preamble = match request.prompt_format {
PromptFormat::Minimal => indoc! {"
# Part of the file under the cursor:
## Part of the file under the cursor
(The cursor marker <|user_cursor|> indicates the current user cursor position.
The file is in current state, edits from edit history has been applied.
We only show part of the file around the cursor.
You can only edit exactly this part of the file.
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.)
"},
PromptFormat::NumLinesUniDiff => indoc! {"
# Code Excerpts
(The cursor marker <|user_cursor|> indicates the current user cursor position.
The file is in current state, edits from edit history has been applied.
We only show part of the file around the cursor.
You can only edit exactly this part of the file.
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.)
"},
PromptFormat::NumLinesUniDiff | PromptFormat::OldTextNewText => indoc! {"
## Code Excerpts
The cursor marker <|user_cursor|> indicates the current user cursor position.
The file is in current state, edits from edit history have been applied.
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
"},
Here is some excerpts of code that you should take into account to predict the next edit.
The cursor position is marked by `<|user_cursor|>` as it stands after the last edit in the history.
In addition other excerpts are included to better understand what the edit will be, including the declaration
or references of symbols around the cursor, or other similar code snippets that may need to be updated
following patterns that appear in the edit history.
Consider each of them carefully in relation to the edit history, and that the user may not have navigated
to the next place they want to edit yet.
Lines starting with `…` indicate omitted line ranges. These may appear inside multi-line code constructs.
"},
_ => indoc! {"
# Code Excerpts
## Code Excerpts
The cursor marker <|user_cursor|> indicates the current user cursor position.
The file is in current state, edits from edit history have been applied.
@@ -242,8 +269,6 @@ pub fn build_prompt(
prompt.push_str(excerpts_preamble);
prompt.push('\n');
let mut section_labels = Default::default();
if !request.referenced_declarations.is_empty() || !request.signatures.is_empty() {
let syntax_based_prompt = SyntaxBasedPrompt::populate(request)?;
section_labels = syntax_based_prompt.write(&mut insertions, &mut prompt)?;
@@ -760,6 +785,7 @@ impl<'a> SyntaxBasedPrompt<'a> {
writeln!(output, "<|section_{}|>", section_index).ok();
}
}
PromptFormat::MinimalQwen => unreachable!(),
}
let push_full_snippet = |output: &mut String| {
@@ -869,3 +895,69 @@ fn declaration_size(declaration: &ReferencedDeclaration, style: DeclarationStyle
DeclarationStyle::Declaration => declaration.text.len(),
}
}
struct MinimalQwenPrompt {
events: Vec<Event>,
cursor_point: Point,
cursor_path: Arc<Path>, // TODO: make a common struct with cursor_point
included_files: Vec<IncludedFile>,
}
impl MinimalQwenPrompt {
const INSTRUCTIONS: &str = "You are a code completion assistant that analyzes edit history to identify and systematically complete incomplete refactorings or patterns across the entire codebase.\n";
fn render(&self) -> String {
let edit_history = self.fmt_edit_history();
let context = self.fmt_context();
format!(
"{instructions}\n\n{edit_history}\n\n{context}",
instructions = MinimalQwenPrompt::INSTRUCTIONS,
edit_history = edit_history,
context = context
)
}
fn fmt_edit_history(&self) -> String {
if self.events.is_empty() {
"(No edit history)\n\n".to_string()
} else {
let mut events_str = String::new();
push_events(&mut events_str, &self.events);
format!(
"The following are the latest edits made by the user, from earlier to later.\n\n{}",
events_str
)
}
}
fn fmt_context(&self) -> String {
let mut context = String::new();
let include_line_numbers = true;
for related_file in &self.included_files {
writeln!(context, "<|file_sep|>{}", DiffPathFmt(&related_file.path)).unwrap();
if related_file.path == self.cursor_path {
write!(context, "<|fim_prefix|>").unwrap();
write_excerpts(
&related_file.excerpts,
&[(self.cursor_point, "<|fim_suffix|>")],
related_file.max_row,
include_line_numbers,
&mut context,
);
writeln!(context, "<|fim_middle|>").unwrap();
} else {
write_excerpts(
&related_file.excerpts,
&[],
related_file.max_row,
include_line_numbers,
&mut context,
);
}
}
context
}
}
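For the file under the cursor, `MinimalQwenPrompt::fmt_context` emits a Qwen-style fill-in-the-middle layout: a `<|file_sep|>` header, the excerpt opened with `<|fim_prefix|>` and split at the cursor by `<|fim_suffix|>`, and a trailing `<|fim_middle|>` where the model is expected to continue. A simplified illustration of that shape (the real output also prepends line numbers and can include additional context files):

fn fim_context(path: &str, before_cursor: &str, after_cursor: &str) -> String {
    format!(
        "<|file_sep|>{path}\n<|fim_prefix|>{before_cursor}<|fim_suffix|>{after_cursor}\n<|fim_middle|>\n"
    )
}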


@@ -40,9 +40,47 @@ pub fn build_prompt(request: predict_edits_v3::PlanContextRetrievalRequest) -> R
pub struct SearchToolInput {
/// An array of queries to run for gathering context relevant to the next prediction
#[schemars(length(max = 3))]
#[serde(deserialize_with = "deserialize_queries")]
pub queries: Box<[SearchToolQuery]>,
}
fn deserialize_queries<'de, D>(deserializer: D) -> Result<Box<[SearchToolQuery]>, D::Error>
where
D: serde::Deserializer<'de>,
{
use serde::de::Error;
#[derive(Deserialize)]
#[serde(untagged)]
enum QueryCollection {
Array(Box<[SearchToolQuery]>),
DoubleArray(Box<[Box<[SearchToolQuery]>]>),
Single(SearchToolQuery),
}
#[derive(Deserialize)]
#[serde(untagged)]
enum MaybeDoubleEncoded {
SingleEncoded(QueryCollection),
DoubleEncoded(String),
}
let result = MaybeDoubleEncoded::deserialize(deserializer)?;
let normalized = match result {
MaybeDoubleEncoded::SingleEncoded(value) => value,
MaybeDoubleEncoded::DoubleEncoded(value) => {
serde_json::from_str(&value).map_err(D::Error::custom)?
}
};
Ok(match normalized {
QueryCollection::Array(items) => items,
QueryCollection::Single(search_tool_query) => Box::new([search_tool_query]),
QueryCollection::DoubleArray(double_array) => double_array.into_iter().flatten().collect(),
})
}
/// Search for relevant code by path, syntax hierarchy, and content.
#[derive(Debug, Clone, Serialize, Deserialize, JsonSchema, Hash)]
pub struct SearchToolQuery {
@@ -92,3 +130,115 @@ const TOOL_USE_REMINDER: &str = indoc! {"
--
Analyze the user's intent in one to two sentences, then call the `search` tool.
"};
#[cfg(test)]
mod tests {
use serde_json::json;
use super::*;
#[test]
fn test_deserialize_queries() {
let single_query_json = indoc! {r#"{
"queries": {
"glob": "**/*.rs",
"syntax_node": ["fn test"],
"content": "assert"
}
}"#};
let flat_input: SearchToolInput = serde_json::from_str(single_query_json).unwrap();
assert_eq!(flat_input.queries.len(), 1);
assert_eq!(flat_input.queries[0].glob, "**/*.rs");
assert_eq!(flat_input.queries[0].syntax_node, vec!["fn test"]);
assert_eq!(flat_input.queries[0].content, Some("assert".to_string()));
let flat_json = indoc! {r#"{
"queries": [
{
"glob": "**/*.rs",
"syntax_node": ["fn test"],
"content": "assert"
},
{
"glob": "**/*.ts",
"syntax_node": [],
"content": null
}
]
}"#};
let flat_input: SearchToolInput = serde_json::from_str(flat_json).unwrap();
assert_eq!(flat_input.queries.len(), 2);
assert_eq!(flat_input.queries[0].glob, "**/*.rs");
assert_eq!(flat_input.queries[0].syntax_node, vec!["fn test"]);
assert_eq!(flat_input.queries[0].content, Some("assert".to_string()));
assert_eq!(flat_input.queries[1].glob, "**/*.ts");
assert_eq!(flat_input.queries[1].syntax_node.len(), 0);
assert_eq!(flat_input.queries[1].content, None);
let nested_json = indoc! {r#"{
"queries": [
[
{
"glob": "**/*.rs",
"syntax_node": ["fn test"],
"content": "assert"
}
],
[
{
"glob": "**/*.ts",
"syntax_node": [],
"content": null
}
]
]
}"#};
let nested_input: SearchToolInput = serde_json::from_str(nested_json).unwrap();
assert_eq!(nested_input.queries.len(), 2);
assert_eq!(nested_input.queries[0].glob, "**/*.rs");
assert_eq!(nested_input.queries[0].syntax_node, vec!["fn test"]);
assert_eq!(nested_input.queries[0].content, Some("assert".to_string()));
assert_eq!(nested_input.queries[1].glob, "**/*.ts");
assert_eq!(nested_input.queries[1].syntax_node.len(), 0);
assert_eq!(nested_input.queries[1].content, None);
let double_encoded_queries = serde_json::to_string(&json!({
"queries": serde_json::to_string(&json!([
{
"glob": "**/*.rs",
"syntax_node": ["fn test"],
"content": "assert"
},
{
"glob": "**/*.ts",
"syntax_node": [],
"content": null
}
])).unwrap()
}))
.unwrap();
let double_encoded_input: SearchToolInput =
serde_json::from_str(&double_encoded_queries).unwrap();
assert_eq!(double_encoded_input.queries.len(), 2);
assert_eq!(double_encoded_input.queries[0].glob, "**/*.rs");
assert_eq!(double_encoded_input.queries[0].syntax_node, vec!["fn test"]);
assert_eq!(
double_encoded_input.queries[0].content,
Some("assert".to_string())
);
assert_eq!(double_encoded_input.queries[1].glob, "**/*.ts");
assert_eq!(double_encoded_input.queries[1].syntax_node.len(), 0);
assert_eq!(double_encoded_input.queries[1].content, None);
// ### ERROR Switching from var declarations to lexical declarations [RUN 073]
// invalid search json {"queries": ["express/lib/response.js", "var\\s+[a-zA-Z_][a-zA-Z0-9_]*\\s*=.*;", "function.*\\(.*\\).*\\{.*\\}"]}
}
}


@@ -0,0 +1,4 @@
alter table billing_customers
add column external_id text;
create unique index uix_billing_customers_on_external_id on billing_customers (external_id);


@@ -7,7 +7,7 @@ use channel::ACKNOWLEDGE_DEBOUNCE_INTERVAL;
use client::{Collaborator, ParticipantIndex, UserId};
use collab_ui::channel_view::ChannelView;
use collections::HashMap;
use editor::{Anchor, Editor, ToOffset};
use editor::{Anchor, Editor, MultiBufferOffset, ToOffset};
use futures::future;
use gpui::{BackgroundExecutor, Context, Entity, TestAppContext, Window};
use rpc::{RECEIVE_TIMEOUT, proto::PeerId};
@@ -180,7 +180,7 @@ async fn test_channel_notes_participant_indices(
notes.editor.update(cx, |editor, cx| {
editor.insert("a", window, cx);
editor.change_selections(Default::default(), window, cx, |selections| {
selections.select_ranges(vec![0..1]);
selections.select_ranges(vec![MultiBufferOffset(0)..MultiBufferOffset(1)]);
});
});
});
@@ -190,7 +190,7 @@ async fn test_channel_notes_participant_indices(
editor.move_down(&Default::default(), window, cx);
editor.insert("b", window, cx);
editor.change_selections(Default::default(), window, cx, |selections| {
selections.select_ranges(vec![1..2]);
selections.select_ranges(vec![MultiBufferOffset(1)..MultiBufferOffset(2)]);
});
});
});
@@ -200,7 +200,7 @@ async fn test_channel_notes_participant_indices(
editor.move_down(&Default::default(), window, cx);
editor.insert("c", window, cx);
editor.change_selections(Default::default(), window, cx, |selections| {
selections.select_ranges(vec![2..3]);
selections.select_ranges(vec![MultiBufferOffset(2)..MultiBufferOffset(3)]);
});
});
});
@@ -287,12 +287,12 @@ async fn test_channel_notes_participant_indices(
editor_a.update_in(cx_a, |editor, window, cx| {
editor.change_selections(Default::default(), window, cx, |selections| {
selections.select_ranges(vec![0..1]);
selections.select_ranges(vec![MultiBufferOffset(0)..MultiBufferOffset(1)]);
});
});
editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(Default::default(), window, cx, |selections| {
selections.select_ranges(vec![2..3]);
selections.select_ranges(vec![MultiBufferOffset(2)..MultiBufferOffset(3)]);
});
});
executor.run_until_parked();
@@ -327,7 +327,7 @@ fn assert_remote_selections(
let end = s.selection.end.to_offset(snapshot.buffer_snapshot());
let user_id = collaborators.get(&peer_id).unwrap().user_id;
let participant_index = hub.user_participant_indices(cx).get(&user_id).copied();
(participant_index, start..end)
(participant_index, start.0..end.0)
})
.collect::<Vec<_>>();
assert_eq!(


@@ -4,7 +4,8 @@ use crate::{
};
use call::ActiveCall;
use editor::{
DocumentColorsRenderMode, Editor, FETCH_COLORS_DEBOUNCE_TIMEOUT, RowInfo, SelectionEffects,
DocumentColorsRenderMode, Editor, FETCH_COLORS_DEBOUNCE_TIMEOUT, MultiBufferOffset, RowInfo,
SelectionEffects,
actions::{
ConfirmCodeAction, ConfirmCompletion, ConfirmRename, ContextMenuFirst,
ExpandMacroRecursively, MoveToEnd, Redo, Rename, SelectAll, ToggleCodeActions, Undo,
@@ -288,7 +289,7 @@ async fn test_newline_above_or_below_does_not_move_guest_cursor(
"});
}
#[gpui::test(iterations = 10)]
#[gpui::test]
async fn test_collaborating_with_completion(cx_a: &mut TestAppContext, cx_b: &mut TestAppContext) {
let mut server = TestServer::start(cx_a.executor()).await;
let client_a = server.create_client(cx_a, "user_a").await;
@@ -307,17 +308,35 @@ async fn test_collaborating_with_completion(cx_a: &mut TestAppContext, cx_b: &mu
..lsp::ServerCapabilities::default()
};
client_a.language_registry().add(rust_lang());
let mut fake_language_servers = client_a.language_registry().register_fake_lsp(
let mut fake_language_servers = [
client_a.language_registry().register_fake_lsp(
"Rust",
FakeLspAdapter {
capabilities: capabilities.clone(),
..FakeLspAdapter::default()
},
),
client_a.language_registry().register_fake_lsp(
"Rust",
FakeLspAdapter {
name: "fake-analyzer",
capabilities: capabilities.clone(),
..FakeLspAdapter::default()
},
),
];
client_b.language_registry().add(rust_lang());
client_b.language_registry().register_fake_lsp_adapter(
"Rust",
FakeLspAdapter {
capabilities: capabilities.clone(),
..FakeLspAdapter::default()
},
);
client_b.language_registry().add(rust_lang());
client_b.language_registry().register_fake_lsp_adapter(
"Rust",
FakeLspAdapter {
name: "fake-analyzer",
capabilities,
..FakeLspAdapter::default()
},
@@ -352,7 +371,8 @@ async fn test_collaborating_with_completion(cx_a: &mut TestAppContext, cx_b: &mu
Editor::for_buffer(buffer_b.clone(), Some(project_b.clone()), window, cx)
});
let fake_language_server = fake_language_servers.next().await.unwrap();
let fake_language_server = fake_language_servers[0].next().await.unwrap();
let second_fake_language_server = fake_language_servers[1].next().await.unwrap();
cx_a.background_executor.run_until_parked();
buffer_b.read_with(cx_b, |buffer, _| {
@@ -362,7 +382,7 @@ async fn test_collaborating_with_completion(cx_a: &mut TestAppContext, cx_b: &mu
// Type a completion trigger character as the guest.
editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([13..13])
s.select_ranges([MultiBufferOffset(13)..MultiBufferOffset(13)])
});
editor.handle_input(".", window, cx);
});
@@ -414,6 +434,11 @@ async fn test_collaborating_with_completion(cx_a: &mut TestAppContext, cx_b: &mu
.next()
.await
.unwrap();
second_fake_language_server
.set_request_handler::<lsp::request::Completion, _, _>(|_, _| async move { Ok(None) })
.next()
.await
.unwrap();
cx_a.executor().finish_waiting();
// Open the buffer on the host.
@@ -479,7 +504,7 @@ async fn test_collaborating_with_completion(cx_a: &mut TestAppContext, cx_b: &mu
// resolved
editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([46..46])
s.select_ranges([MultiBufferOffset(46)..MultiBufferOffset(46)])
});
editor.handle_input("; a", window, cx);
editor.handle_input(".", window, cx);
@@ -522,6 +547,10 @@ async fn test_collaborating_with_completion(cx_a: &mut TestAppContext, cx_b: &mu
])))
});
// Second language server also needs to handle the request (returns None)
let mut second_completion_response = second_fake_language_server
.set_request_handler::<lsp::request::Completion, _, _>(|_, _| async move { Ok(None) });
// The completion now gets a new `text_edit.new_text` when resolving the completion item
let mut resolve_completion_response = fake_language_server
.set_request_handler::<lsp::request::ResolveCompletionItem, _, _>(|params, _| async move {
@@ -545,6 +574,7 @@ async fn test_collaborating_with_completion(cx_a: &mut TestAppContext, cx_b: &mu
cx_b.executor().run_until_parked();
completion_response.next().await.unwrap();
second_completion_response.next().await.unwrap();
editor_b.update_in(cx_b, |editor, window, cx| {
assert!(editor.context_menu_visible());
@@ -563,6 +593,77 @@ async fn test_collaborating_with_completion(cx_a: &mut TestAppContext, cx_b: &mu
"use d::SomeTrait;\nfn main() { a.first_method(); a.third_method(, , ) }"
);
});
// Ensure buffer is synced before proceeding with the next test
cx_a.executor().run_until_parked();
cx_b.executor().run_until_parked();
// Test completions from the second fake language server
// Add another completion trigger to test the second language server
editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([MultiBufferOffset(68)..MultiBufferOffset(68)])
});
editor.handle_input("; b", window, cx);
editor.handle_input(".", window, cx);
});
buffer_b.read_with(cx_b, |buffer, _| {
assert_eq!(
buffer.text(),
"use d::SomeTrait;\nfn main() { a.first_method(); a.third_method(, , ); b. }"
);
});
// Set up completion handlers for both language servers
let mut first_lsp_completion = fake_language_server
.set_request_handler::<lsp::request::Completion, _, _>(|_, _| async move { Ok(None) });
let mut second_lsp_completion = second_fake_language_server
.set_request_handler::<lsp::request::Completion, _, _>(|params, _| async move {
assert_eq!(
params.text_document_position.text_document.uri,
lsp::Uri::from_file_path(path!("/a/main.rs")).unwrap(),
);
assert_eq!(
params.text_document_position.position,
lsp::Position::new(1, 54),
);
Ok(Some(lsp::CompletionResponse::Array(vec![
lsp::CompletionItem {
label: "analyzer_method(…)".into(),
detail: Some("fn(&self) -> Result<T>".into()),
text_edit: Some(lsp::CompletionTextEdit::Edit(lsp::TextEdit {
new_text: "analyzer_method()".to_string(),
range: lsp::Range::new(
lsp::Position::new(1, 54),
lsp::Position::new(1, 54),
),
})),
insert_text_format: Some(lsp::InsertTextFormat::SNIPPET),
..Default::default()
},
])))
});
cx_b.executor().run_until_parked();
// Await both language server responses
first_lsp_completion.next().await.unwrap();
second_lsp_completion.next().await.unwrap();
cx_b.executor().run_until_parked();
// Confirm the completion from the second language server works
editor_b.update_in(cx_b, |editor, window, cx| {
assert!(editor.context_menu_visible());
editor.confirm_completion(&ConfirmCompletion { item_ix: Some(0) }, window, cx);
assert_eq!(
editor.text(cx),
"use d::SomeTrait;\nfn main() { a.first_method(); a.third_method(, , ); b.analyzer_method() }"
);
});
}
#[gpui::test(iterations = 10)]
@@ -850,7 +951,7 @@ async fn test_collaborating_with_renames(cx_a: &mut TestAppContext, cx_b: &mut T
// Move cursor to a location that can be renamed.
let prepare_rename = editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([7..7])
s.select_ranges([MultiBufferOffset(7)..MultiBufferOffset(7)])
});
editor.rename(&Rename, window, cx).unwrap()
});
@@ -877,17 +978,17 @@ async fn test_collaborating_with_renames(cx_a: &mut TestAppContext, cx_b: &mut T
let buffer = editor.buffer().read(cx).snapshot(cx);
assert_eq!(
rename.range.start.to_offset(&buffer)..rename.range.end.to_offset(&buffer),
6..9
MultiBufferOffset(6)..MultiBufferOffset(9)
);
rename.editor.update(cx, |rename_editor, cx| {
let rename_selection = rename_editor.selections.newest::<usize>(&rename_editor.display_snapshot(cx));
let rename_selection = rename_editor.selections.newest::<MultiBufferOffset>(&rename_editor.display_snapshot(cx));
assert_eq!(
rename_selection.range(),
0..3,
MultiBufferOffset(0)..MultiBufferOffset(3),
"Rename that was triggered from zero selection caret, should propose the whole word."
);
rename_editor.buffer().update(cx, |rename_buffer, cx| {
rename_buffer.edit([(0..3, "THREE")], None, cx);
rename_buffer.edit([(MultiBufferOffset(0)..MultiBufferOffset(3), "THREE")], None, cx);
});
});
});
@@ -898,7 +999,7 @@ async fn test_collaborating_with_renames(cx_a: &mut TestAppContext, cx_b: &mut T
});
let prepare_rename = editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([7..8])
s.select_ranges([MultiBufferOffset(7)..MultiBufferOffset(8)])
});
editor.rename(&Rename, window, cx).unwrap()
});
@@ -925,16 +1026,16 @@ async fn test_collaborating_with_renames(cx_a: &mut TestAppContext, cx_b: &mut T
let buffer = editor.buffer().read(cx).snapshot(cx);
let lsp_rename_start = rename.range.start.to_offset(&buffer);
let lsp_rename_end = rename.range.end.to_offset(&buffer);
assert_eq!(lsp_rename_start..lsp_rename_end, 6..9);
assert_eq!(lsp_rename_start..lsp_rename_end, MultiBufferOffset(6)..MultiBufferOffset(9));
rename.editor.update(cx, |rename_editor, cx| {
let rename_selection = rename_editor.selections.newest::<usize>(&rename_editor.display_snapshot(cx));
let rename_selection = rename_editor.selections.newest::<MultiBufferOffset>(&rename_editor.display_snapshot(cx));
assert_eq!(
rename_selection.range(),
1..2,
MultiBufferOffset(1)..MultiBufferOffset(2),
"Rename that was triggered from a selection, should have the same selection range in the rename proposal"
);
rename_editor.buffer().update(cx, |rename_buffer, cx| {
rename_buffer.edit([(0..lsp_rename_end - lsp_rename_start, "THREE")], None, cx);
rename_buffer.edit([(MultiBufferOffset(0)..MultiBufferOffset(lsp_rename_end - lsp_rename_start), "THREE")], None, cx);
});
});
});
@@ -1137,7 +1238,7 @@ async fn test_slow_lsp_server(cx_a: &mut TestAppContext, cx_b: &mut TestAppConte
// Move cursor to a location, this should trigger the code lens call.
editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([7..7])
s.select_ranges([MultiBufferOffset(7)..MultiBufferOffset(7)])
});
});
let () = request_started_rx.next().await.unwrap();
@@ -1159,7 +1260,7 @@ async fn test_slow_lsp_server(cx_a: &mut TestAppContext, cx_b: &mut TestAppConte
editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([1..1])
s.select_ranges([MultiBufferOffset(1)..MultiBufferOffset(1)])
});
});
let () = request_started_rx.next().await.unwrap();
@@ -1181,7 +1282,7 @@ async fn test_slow_lsp_server(cx_a: &mut TestAppContext, cx_b: &mut TestAppConte
editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([2..2])
s.select_ranges([MultiBufferOffset(2)..MultiBufferOffset(2)])
});
});
let () = request_started_rx.next().await.unwrap();
@@ -1619,7 +1720,7 @@ async fn test_on_input_format_from_host_to_guest(
cx_a.focus(&editor_a);
editor_a.update_in(cx_a, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([13..13])
s.select_ranges([MultiBufferOffset(13)..MultiBufferOffset(13)])
});
editor.handle_input(">", window, cx);
});
@@ -1728,7 +1829,7 @@ async fn test_on_input_format_from_guest_to_host(
cx_b.focus(&editor_b);
editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([13..13])
s.select_ranges([MultiBufferOffset(13)..MultiBufferOffset(13)])
});
editor.handle_input(":", window, cx);
});
@@ -1956,7 +2057,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
let after_client_edit = edits_made.fetch_add(1, atomic::Ordering::Release) + 1;
editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([13..13].clone())
s.select_ranges([MultiBufferOffset(13)..MultiBufferOffset(13)].clone())
});
editor.handle_input(":", window, cx);
});
@@ -1980,7 +2081,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
let after_host_edit = edits_made.fetch_add(1, atomic::Ordering::Release) + 1;
editor_a.update_in(cx_a, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([13..13])
s.select_ranges([MultiBufferOffset(13)..MultiBufferOffset(13)])
});
editor.handle_input("a change to increment both buffers' versions", window, cx);
});
@@ -2169,16 +2270,28 @@ async fn test_inlay_hint_refresh_is_forwarded(
} else {
"initial hint"
};
Ok(Some(vec![lsp::InlayHint {
position: lsp::Position::new(0, character),
label: lsp::InlayHintLabel::String(label.to_string()),
kind: None,
text_edits: None,
tooltip: None,
padding_left: None,
padding_right: None,
data: None,
}]))
Ok(Some(vec![
lsp::InlayHint {
position: lsp::Position::new(0, character),
label: lsp::InlayHintLabel::String(label.to_string()),
kind: None,
text_edits: None,
tooltip: None,
padding_left: None,
padding_right: None,
data: None,
},
lsp::InlayHint {
position: lsp::Position::new(1090, 1090),
label: lsp::InlayHintLabel::String("out-of-bounds hint".to_string()),
kind: None,
text_edits: None,
tooltip: None,
padding_left: None,
padding_right: None,
data: None,
},
]))
}
})
.next()
@@ -2408,7 +2521,7 @@ async fn test_lsp_document_color(cx_a: &mut TestAppContext, cx_b: &mut TestAppCo
editor_a.update_in(cx_a, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([13..13].clone())
s.select_ranges([MultiBufferOffset(13)..MultiBufferOffset(13)].clone())
});
editor.handle_input(":", window, cx);
});
@@ -2845,7 +2958,7 @@ async fn test_lsp_pull_diagnostics(
editor_a_main.update(cx_a, |editor, cx| {
let snapshot = editor.buffer().read(cx).snapshot(cx);
let all_diagnostics = snapshot
.diagnostics_in_range(0..snapshot.len())
.diagnostics_in_range(MultiBufferOffset(0)..snapshot.len())
.collect::<Vec<_>>();
assert_eq!(
all_diagnostics.len(),
@@ -2974,7 +3087,7 @@ async fn test_lsp_pull_diagnostics(
editor_a_main.update(cx_a, |editor, cx| {
let snapshot = editor.buffer().read(cx).snapshot(cx);
let all_diagnostics = snapshot
.diagnostics_in_range(0..snapshot.len())
.diagnostics_in_range(MultiBufferOffset(0)..snapshot.len())
.collect::<Vec<_>>();
assert_eq!(
all_diagnostics.len(),
@@ -3021,7 +3134,7 @@ async fn test_lsp_pull_diagnostics(
editor_b_main.update(cx_b, |editor, cx| {
let snapshot = editor.buffer().read(cx).snapshot(cx);
let all_diagnostics = snapshot
.diagnostics_in_range(0..snapshot.len())
.diagnostics_in_range(MultiBufferOffset(0)..snapshot.len())
.collect::<Vec<_>>();
assert_eq!(
all_diagnostics.len(),
@@ -3068,7 +3181,7 @@ async fn test_lsp_pull_diagnostics(
editor_b_lib.update(cx_b, |editor, cx| {
let snapshot = editor.buffer().read(cx).snapshot(cx);
let all_diagnostics = snapshot
.diagnostics_in_range(0..snapshot.len())
.diagnostics_in_range(MultiBufferOffset(0)..snapshot.len())
.collect::<Vec<_>>();
let expected_messages = [
expected_pull_diagnostic_lib_message,
@@ -3135,7 +3248,7 @@ async fn test_lsp_pull_diagnostics(
editor_b_lib.update(cx_b, |editor, cx| {
let snapshot = editor.buffer().read(cx).snapshot(cx);
let all_diagnostics = snapshot
.diagnostics_in_range(0..snapshot.len())
.diagnostics_in_range(MultiBufferOffset(0)..snapshot.len())
.collect::<Vec<_>>();
let expected_messages = [
expected_workspace_pull_diagnostics_lib_message,
@@ -3270,7 +3383,7 @@ async fn test_lsp_pull_diagnostics(
editor_b_lib.update(cx_b, |editor, cx| {
let snapshot = editor.buffer().read(cx).snapshot(cx);
let all_diagnostics = snapshot
.diagnostics_in_range(0..snapshot.len())
.diagnostics_in_range(MultiBufferOffset(0)..snapshot.len())
.collect::<Vec<_>>();
let expected_messages = [
expected_workspace_pull_diagnostics_lib_message,
@@ -3288,7 +3401,7 @@ async fn test_lsp_pull_diagnostics(
editor_b_main.update(cx_b, |editor, cx| {
let snapshot = editor.buffer().read(cx).snapshot(cx);
let all_diagnostics = snapshot
.diagnostics_in_range(0..snapshot.len())
.diagnostics_in_range(MultiBufferOffset(0)..snapshot.len())
.collect::<Vec<_>>();
assert_eq!(all_diagnostics.len(), 2);
@@ -3307,7 +3420,7 @@ async fn test_lsp_pull_diagnostics(
editor_a_main.update(cx_a, |editor, cx| {
let snapshot = editor.buffer().read(cx).snapshot(cx);
let all_diagnostics = snapshot
.diagnostics_in_range(0..snapshot.len())
.diagnostics_in_range(MultiBufferOffset(0)..snapshot.len())
.collect::<Vec<_>>();
assert_eq!(all_diagnostics.len(), 2);
let expected_messages = [


@@ -6,7 +6,7 @@ use collab_ui::{
channel_view::ChannelView,
notifications::project_shared_notification::ProjectSharedNotification,
};
use editor::{Editor, MultiBuffer, PathKey, SelectionEffects};
use editor::{Editor, MultiBuffer, MultiBufferOffset, PathKey, SelectionEffects};
use gpui::{
AppContext as _, BackgroundExecutor, BorrowAppContext, Entity, SharedString, TestAppContext,
VisualContext, VisualTestContext, point,
@@ -124,7 +124,7 @@ async fn test_basic_following(
editor.select_left(&Default::default(), window, cx);
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![3..2]
vec![MultiBufferOffset(3)..MultiBufferOffset(2)]
);
});
editor_a2.update_in(cx_a, |editor, window, cx| {
@@ -133,7 +133,7 @@ async fn test_basic_following(
editor.select_left(&Default::default(), window, cx);
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![2..1]
vec![MultiBufferOffset(2)..MultiBufferOffset(1)]
);
});
@@ -158,13 +158,13 @@ async fn test_basic_following(
editor_b2.update(cx_b, |editor, cx| editor
.selections
.ranges(&editor.display_snapshot(cx))),
vec![2..1]
vec![MultiBufferOffset(2)..MultiBufferOffset(1)]
);
assert_eq!(
editor_b1.update(cx_b, |editor, cx| editor
.selections
.ranges(&editor.display_snapshot(cx))),
vec![3..3]
vec![MultiBufferOffset(3)..MultiBufferOffset(3)]
);
executor.run_until_parked();
@@ -386,7 +386,10 @@ async fn test_basic_following(
// Changes to client A's editor are reflected on client B.
editor_a1.update_in(cx_a, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([1..1, 2..2])
s.select_ranges([
MultiBufferOffset(1)..MultiBufferOffset(1),
MultiBufferOffset(2)..MultiBufferOffset(2),
])
});
});
executor.advance_clock(workspace::item::LEADER_UPDATE_THROTTLE);
@@ -396,7 +399,10 @@ async fn test_basic_following(
editor_b1.update(cx_b, |editor, cx| {
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
&[1..1, 2..2]
&[
MultiBufferOffset(1)..MultiBufferOffset(1),
MultiBufferOffset(2)..MultiBufferOffset(2)
]
);
});
@@ -408,7 +414,7 @@ async fn test_basic_following(
editor_a1.update_in(cx_a, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([3..3])
s.select_ranges([MultiBufferOffset(3)..MultiBufferOffset(3)])
});
editor.set_scroll_position(point(0., 100.), window, cx);
});
@@ -417,7 +423,7 @@ async fn test_basic_following(
editor_b1.update(cx_b, |editor, cx| {
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
&[3..3]
&[MultiBufferOffset(3)..MultiBufferOffset(3)]
);
});
@@ -1694,7 +1700,7 @@ async fn test_following_stops_on_unshare(cx_a: &mut TestAppContext, cx_b: &mut T
// b should follow a to position 1
editor_a.update_in(cx_a, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([1..1])
s.select_ranges([MultiBufferOffset(1)..MultiBufferOffset(1)])
})
});
cx_a.executor()
@@ -1703,7 +1709,7 @@ async fn test_following_stops_on_unshare(cx_a: &mut TestAppContext, cx_b: &mut T
editor_b.update(cx_b, |editor, cx| {
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![1..1]
vec![MultiBufferOffset(1)..MultiBufferOffset(1)]
)
});
@@ -1719,7 +1725,7 @@ async fn test_following_stops_on_unshare(cx_a: &mut TestAppContext, cx_b: &mut T
// b should not follow a to position 2
editor_a.update_in(cx_a, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([2..2])
s.select_ranges([MultiBufferOffset(2)..MultiBufferOffset(2)])
})
});
cx_a.executor()
@@ -1728,7 +1734,7 @@ async fn test_following_stops_on_unshare(cx_a: &mut TestAppContext, cx_b: &mut T
editor_b.update(cx_b, |editor, cx| {
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![1..1]
vec![MultiBufferOffset(1)..MultiBufferOffset(1)]
)
});
cx_b.update(|_, cx| {
@@ -1829,7 +1835,7 @@ async fn test_following_into_excluded_file(
editor.select_left(&Default::default(), window, cx);
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![3..2]
vec![MultiBufferOffset(3)..MultiBufferOffset(2)]
);
});
editor_for_excluded_a.update_in(cx_a, |editor, window, cx| {
@@ -1838,7 +1844,7 @@ async fn test_following_into_excluded_file(
editor.select_left(&Default::default(), window, cx);
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![18..17]
vec![MultiBufferOffset(18)..MultiBufferOffset(17)]
);
});
@@ -1864,7 +1870,7 @@ async fn test_following_into_excluded_file(
editor_for_excluded_b.update(cx_b, |editor, cx| editor
.selections
.ranges(&editor.display_snapshot(cx))),
vec![18..17]
vec![MultiBufferOffset(18)..MultiBufferOffset(17)]
);
editor_for_excluded_a.update_in(cx_a, |editor, window, cx| {
@@ -2040,7 +2046,7 @@ async fn test_following_to_channel_notes_without_a_shared_project(
notes.editor.update(cx, |editor, cx| {
editor.insert("Hello from A.", window, cx);
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |selections| {
selections.select_ranges(vec![3..4]);
selections.select_ranges(vec![MultiBufferOffset(3)..MultiBufferOffset(4)]);
});
});
});
@@ -2076,8 +2082,8 @@ async fn test_following_to_channel_notes_without_a_shared_project(
assert_eq!(
editor
.selections
.ranges::<usize>(&editor.display_snapshot(cx)),
&[3..4]
.ranges::<MultiBufferOffset>(&editor.display_snapshot(cx)),
&[MultiBufferOffset(3)..MultiBufferOffset(4)]
);
})
});


@@ -11,7 +11,7 @@ use editor::{
display_map::ToDisplayPoint, scroll::Autoscroll,
};
use gpui::{
AnyView, App, ClipboardItem, Context, Entity, EventEmitter, Focusable, Pixels, Point, Render,
App, ClipboardItem, Context, Entity, EventEmitter, Focusable, Pixels, Point, Render,
Subscription, Task, VisualContext as _, WeakEntity, Window, actions,
};
use project::Project;
@@ -25,7 +25,7 @@ use util::ResultExt;
use workspace::{CollaboratorId, item::TabContentParams};
use workspace::{
ItemNavHistory, Pane, SaveIntent, Toast, ViewId, Workspace, WorkspaceId,
item::{FollowableItem, Item, ItemEvent, ItemHandle},
item::{FollowableItem, Item, ItemEvent},
searchable::SearchableItemHandle,
};
use workspace::{item::Dedup, notifications::NotificationId};
@@ -441,11 +441,11 @@ impl Item for ChannelView {
type_id: TypeId,
self_handle: &'a Entity<Self>,
_: &'a App,
) -> Option<AnyView> {
) -> Option<gpui::AnyEntity> {
if type_id == TypeId::of::<Self>() {
Some(self_handle.to_any())
Some(self_handle.clone().into())
} else if type_id == TypeId::of::<Editor>() {
Some(self.editor.to_any())
Some(self.editor.clone().into())
} else {
None
}

View File

@@ -1496,7 +1496,7 @@ impl CollabPanel {
fn reset_filter_editor_text(&mut self, window: &mut Window, cx: &mut Context<Self>) -> bool {
self.filter_editor.update(cx, |editor, cx| {
if editor.buffer().read(cx).len(cx) > 0 {
if editor.buffer().read(cx).len(cx).0 > 0 {
editor.set_text("", window, cx);
true
} else {

View File

@@ -12,7 +12,7 @@ workspace = true
path = "src/context_server.rs"
[features]
test-support = []
test-support = ["gpui/test-support"]
[dependencies]
anyhow.workspace = true
@@ -20,6 +20,7 @@ async-trait.workspace = true
collections.workspace = true
futures.workspace = true
gpui.workspace = true
http_client = { workspace = true, features = ["test-support"] }
log.workspace = true
net.workspace = true
parking_lot.workspace = true
@@ -32,3 +33,6 @@ smol.workspace = true
tempfile.workspace = true
url = { workspace = true, features = ["serde"] }
util.workspace = true
[dev-dependencies]
gpui = { workspace = true, features = ["test-support"] }

View File

@@ -6,6 +6,8 @@ pub mod test;
pub mod transport;
pub mod types;
use collections::HashMap;
use http_client::HttpClient;
use std::path::Path;
use std::sync::Arc;
use std::{fmt::Display, path::PathBuf};
@@ -15,6 +17,9 @@ use client::Client;
use gpui::AsyncApp;
use parking_lot::RwLock;
pub use settings::ContextServerCommand;
use url::Url;
use crate::transport::HttpTransport;
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
pub struct ContextServerId(pub Arc<str>);
@@ -52,6 +57,25 @@ impl ContextServer {
}
}
pub fn http(
id: ContextServerId,
endpoint: &Url,
headers: HashMap<String, String>,
http_client: Arc<dyn HttpClient>,
executor: gpui::BackgroundExecutor,
) -> Result<Self> {
let transport = match endpoint.scheme() {
"http" | "https" => {
log::info!("Using HTTP transport for {}", endpoint);
let transport =
HttpTransport::new(http_client, endpoint.to_string(), headers, executor);
Arc::new(transport) as _
}
_ => anyhow::bail!("unsupported MCP url scheme {}", endpoint.scheme()),
};
Ok(Self::new(id, transport))
}
pub fn new(id: ContextServerId, transport: Arc<dyn crate::transport::Transport>) -> Self {
Self {
id,

View File

@@ -1,11 +1,12 @@
pub mod http;
mod stdio_transport;
use std::pin::Pin;
use anyhow::Result;
use async_trait::async_trait;
use futures::Stream;
use std::pin::Pin;
pub use http::*;
pub use stdio_transport::*;
#[async_trait]

View File

@@ -0,0 +1,259 @@
use anyhow::{Result, anyhow};
use async_trait::async_trait;
use collections::HashMap;
use futures::{Stream, StreamExt};
use gpui::BackgroundExecutor;
use http_client::{AsyncBody, HttpClient, Request, Response, http::Method};
use parking_lot::Mutex as SyncMutex;
use smol::channel;
use std::{pin::Pin, sync::Arc};
use crate::transport::Transport;
// Constants from MCP spec
const HEADER_SESSION_ID: &str = "Mcp-Session-Id";
const EVENT_STREAM_MIME_TYPE: &str = "text/event-stream";
const JSON_MIME_TYPE: &str = "application/json";
/// HTTP Transport with session management and SSE support
pub struct HttpTransport {
http_client: Arc<dyn HttpClient>,
endpoint: String,
session_id: Arc<SyncMutex<Option<String>>>,
executor: BackgroundExecutor,
response_tx: channel::Sender<String>,
response_rx: channel::Receiver<String>,
error_tx: channel::Sender<String>,
error_rx: channel::Receiver<String>,
// Authentication headers to include in requests
headers: HashMap<String, String>,
}
impl HttpTransport {
pub fn new(
http_client: Arc<dyn HttpClient>,
endpoint: String,
headers: HashMap<String, String>,
executor: BackgroundExecutor,
) -> Self {
let (response_tx, response_rx) = channel::unbounded();
let (error_tx, error_rx) = channel::unbounded();
Self {
http_client,
executor,
endpoint,
session_id: Arc::new(SyncMutex::new(None)),
response_tx,
response_rx,
error_tx,
error_rx,
headers,
}
}
/// Send a message and handle the response based on content type
async fn send_message(&self, message: String) -> Result<()> {
let is_notification =
!message.contains("\"id\":") || message.contains("notifications/initialized");
let mut request_builder = Request::builder()
.method(Method::POST)
.uri(&self.endpoint)
.header("Content-Type", JSON_MIME_TYPE)
.header(
"Accept",
format!("{}, {}", JSON_MIME_TYPE, EVENT_STREAM_MIME_TYPE),
);
for (key, value) in &self.headers {
request_builder = request_builder.header(key.as_str(), value.as_str());
}
// Add session ID if we have one (except for initialize)
if let Some(ref session_id) = *self.session_id.lock() {
request_builder = request_builder.header(HEADER_SESSION_ID, session_id.as_str());
}
let request = request_builder.body(AsyncBody::from(message.into_bytes()))?;
let mut response = self.http_client.send(request).await?;
// Handle different response types based on status and content-type
match response.status() {
status if status.is_success() => {
// Check content type
let content_type = response
.headers()
.get("content-type")
.and_then(|v| v.to_str().ok());
// Extract session ID from response headers if present
if let Some(session_id) = response
.headers()
.get(HEADER_SESSION_ID)
.and_then(|v| v.to_str().ok())
{
*self.session_id.lock() = Some(session_id.to_string());
log::debug!("Session ID set: {}", session_id);
}
match content_type {
Some(ct) if ct.starts_with(JSON_MIME_TYPE) => {
// JSON response - read and forward immediately
let mut body = String::new();
futures::AsyncReadExt::read_to_string(response.body_mut(), &mut body)
.await?;
// Only send non-empty responses
if !body.is_empty() {
self.response_tx
.send(body)
.await
.map_err(|_| anyhow!("Failed to send JSON response"))?;
}
}
Some(ct) if ct.starts_with(EVENT_STREAM_MIME_TYPE) => {
// SSE stream - set up streaming
self.setup_sse_stream(response).await?;
}
_ => {
// For notifications, 202 Accepted with no content type is ok
if is_notification && status.as_u16() == 202 {
log::debug!("Notification accepted");
} else {
return Err(anyhow!("Unexpected content type: {:?}", content_type));
}
}
}
}
status if status.as_u16() == 202 => {
// Accepted - notification acknowledged, no response needed
log::debug!("Notification accepted");
}
_ => {
let mut error_body = String::new();
futures::AsyncReadExt::read_to_string(response.body_mut(), &mut error_body).await?;
self.error_tx
.send(format!("HTTP {}: {}", response.status(), error_body))
.await
.map_err(|_| anyhow!("Failed to send error"))?;
}
}
Ok(())
}
/// Set up SSE streaming from the response
async fn setup_sse_stream(&self, mut response: Response<AsyncBody>) -> Result<()> {
let response_tx = self.response_tx.clone();
let error_tx = self.error_tx.clone();
// Spawn a task to handle the SSE stream
smol::spawn(async move {
let reader = futures::io::BufReader::new(response.body_mut());
let mut lines = futures::AsyncBufReadExt::lines(reader);
let mut data_buffer = Vec::new();
let mut in_message = false;
while let Some(line_result) = lines.next().await {
match line_result {
Ok(line) => {
if line.is_empty() {
// Empty line signals end of event
if !data_buffer.is_empty() {
let message = data_buffer.join("\n");
// Filter out ping messages and empty data
if !message.trim().is_empty() && message != "ping" {
if let Err(e) = response_tx.send(message).await {
log::error!("Failed to send SSE message: {}", e);
break;
}
}
data_buffer.clear();
}
in_message = false;
} else if let Some(data) = line.strip_prefix("data: ") {
// Handle data lines
let data = data.trim();
if !data.is_empty() {
// Check if this is a ping message
if data == "ping" {
log::trace!("Received SSE ping");
continue;
}
data_buffer.push(data.to_string());
in_message = true;
}
} else if line.starts_with("event:")
|| line.starts_with("id:")
|| line.starts_with("retry:")
{
// Ignore other SSE fields
continue;
} else if in_message {
// Continuation of data
data_buffer.push(line);
}
}
Err(e) => {
let _ = error_tx.send(format!("SSE stream error: {}", e)).await;
break;
}
}
}
})
.detach();
Ok(())
}
}
#[async_trait]
impl Transport for HttpTransport {
async fn send(&self, message: String) -> Result<()> {
self.send_message(message).await
}
fn receive(&self) -> Pin<Box<dyn Stream<Item = String> + Send>> {
Box::pin(self.response_rx.clone())
}
fn receive_err(&self) -> Pin<Box<dyn Stream<Item = String> + Send>> {
Box::pin(self.error_rx.clone())
}
}
impl Drop for HttpTransport {
fn drop(&mut self) {
// Try to clean up the session on drop
let http_client = self.http_client.clone();
let endpoint = self.endpoint.clone();
let session_id = self.session_id.lock().clone();
let headers = self.headers.clone();
if let Some(session_id) = session_id {
self.executor
.spawn(async move {
let mut request_builder = Request::builder()
.method(Method::DELETE)
.uri(&endpoint)
.header(HEADER_SESSION_ID, &session_id);
// Add authentication headers if present
for (key, value) in headers {
request_builder = request_builder.header(key.as_str(), value.as_str());
}
let request = request_builder.body(AsyncBody::empty());
if let Ok(request) = request {
let _ = http_client.send(request).await;
}
})
.detach();
}
}
}
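Taken together with the ContextServer::http constructor added two files up, this transport gets wired in roughly as in the sketch below; the server id, endpoint, and header values are invented for illustration, and the surrounding gpui/http_client plumbing is assumed rather than taken from the diff.

use collections::HashMap;
use std::sync::Arc;
use url::Url;

// Hypothetical call site; only ContextServer::http and ContextServerId come
// from the hunks above.
fn connect_http_mcp(
    http_client: Arc<dyn http_client::HttpClient>,
    executor: gpui::BackgroundExecutor,
) -> anyhow::Result<ContextServer> {
    let endpoint = Url::parse("https://mcp.example.com/stream")?;
    let mut headers = HashMap::default();
    headers.insert("Authorization".to_string(), "Bearer <token>".to_string());
    ContextServer::http(
        ContextServerId("example-http-server".into()),
        &endpoint,
        headers,
        http_client,
        executor,
    )
}

One design point worth noting: both plain JSON responses and SSE data: events are forwarded through the same receive() stream, so callers never need to know which framing the server chose.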

View File

@@ -270,7 +270,7 @@ fn common_prefix<T1: Iterator<Item = char>, T2: Iterator<Item = char>>(a: T1, b:
mod tests {
use super::*;
use editor::{
Editor, ExcerptRange, MultiBuffer, SelectionEffects,
Editor, ExcerptRange, MultiBuffer, MultiBufferOffset, SelectionEffects,
test::editor_lsp_test_context::EditorLspTestContext,
};
use fs::FakeFs;
@@ -1081,8 +1081,9 @@ mod tests {
vec![complete_from_marker, replace_range_marker.clone()],
);
let range = marked_ranges.remove(&replace_range_marker).unwrap()[0].clone();
let replace_range =
cx.to_lsp_range(marked_ranges.remove(&replace_range_marker).unwrap()[0].clone());
cx.to_lsp_range(MultiBufferOffset(range.start)..MultiBufferOffset(range.end));
let mut request =
cx.set_request_handler::<lsp::request::Completion, _, _>(move |url, params, _| {

View File

@@ -289,15 +289,11 @@ impl minidumper::ServerHandler for CrashServer {
pub fn panic_hook(info: &PanicHookInfo) {
// Don't handle a panic on threads that are not relevant to the main execution.
if extension_host::wasm_host::IS_WASM_THREAD.with(|v| v.load(Ordering::Acquire)) {
log::error!("wasm thread panicked!");
return;
}
let message = info
.payload()
.downcast_ref::<&str>()
.map(|s| s.to_string())
.or_else(|| info.payload().downcast_ref::<String>().cloned())
.unwrap_or_else(|| "Box<Any>".to_string());
let message = info.payload_as_str().unwrap_or("Box<Any>").to_owned();
let span = info
.location()

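The payload handling above leans on PanicHookInfo::payload_as_str(), stabilized in Rust 1.81, which covers both &'static str and String panic payloads; the deleted downcast chain did the same by hand. A minimal standalone hook, for comparison:

fn main() {
    std::panic::set_hook(Box::new(|info| {
        // payload_as_str() handles &'static str and String payloads; anything
        // else (a custom Box<dyn Any>) falls back to the placeholder.
        let message = info.payload_as_str().unwrap_or("Box<Any>");
        eprintln!("panic: {message}");
    }));
    panic!("demo panic with a formatted {} payload", 42);
}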
View File

@@ -324,6 +324,7 @@ pub async fn download_adapter_from_github(
extract_zip(&version_path, file)
.await
// we cannot check the status, as some adapters include files with names that trigger `Illegal byte sequence`
.inspect_err(|e| log::warn!("ZIP extraction error: {}. Ignoring...", e))
.ok();
util::fs::remove_matching(&adapter_path, |entry| {

View File

@@ -1029,13 +1029,11 @@ impl SearchableItem for DapLogView {
&mut self,
index: usize,
matches: &[Self::Match],
collapse: bool,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.editor.update(cx, |e, cx| {
e.activate_match(index, matches, collapse, window, cx)
})
self.editor
.update(cx, |e, cx| e.activate_match(index, matches, window, cx))
}
fn select_matches(

View File

@@ -14,13 +14,12 @@ use collections::IndexMap;
use dap::adapters::DebugAdapterName;
use dap::{DapRegistry, StartDebuggingRequestArguments};
use dap::{client::SessionId, debugger_settings::DebuggerSettings};
use editor::Editor;
use editor::{Editor, MultiBufferOffset, ToPoint};
use gpui::{
Action, App, AsyncWindowContext, ClipboardItem, Context, DismissEvent, Entity, EntityId,
EventEmitter, FocusHandle, Focusable, MouseButton, MouseDownEvent, Point, Subscription, Task,
WeakEntity, anchored, deferred,
};
use text::ToPoint as _;
use itertools::Itertools as _;
use language::Buffer;
@@ -1216,11 +1215,11 @@ impl DebugPanel {
let mut last_offset = None;
while let Some(mat) = matches.next() {
if let Some(pos) = mat.captures.first().map(|m| m.node.byte_range().end) {
last_offset = Some(pos)
last_offset = Some(MultiBufferOffset(pos))
}
}
let mut edits = Vec::new();
let mut cursor_position = 0;
let mut cursor_position = MultiBufferOffset(0);
if let Some(pos) = last_offset {
edits.push((pos..pos, format!(",\n{new_scenario}")));
@@ -1234,24 +1233,25 @@ impl DebugPanel {
if let Some(mat) = matches.next() {
if let Some(pos) = mat.captures.first().map(|m| m.node.byte_range().end - 1) {
edits.push((pos..pos, format!("\n{new_scenario}\n")));
cursor_position = pos + "\n ".len();
edits.push((
MultiBufferOffset(pos)..MultiBufferOffset(pos),
format!("\n{new_scenario}\n"),
));
cursor_position = MultiBufferOffset(pos) + "\n ".len();
}
} else {
edits.push((0..0, format!("[\n{}\n]", new_scenario)));
cursor_position = "[\n ".len();
edits.push((
MultiBufferOffset(0)..MultiBufferOffset(0),
format!("[\n{}\n]", new_scenario),
));
cursor_position = MultiBufferOffset("[\n ".len());
}
}
editor.transact(window, cx, |editor, window, cx| {
editor.edit(edits, cx);
let snapshot = editor
.buffer()
.read(cx)
.as_singleton()
.unwrap()
.read(cx)
.snapshot();
let snapshot = editor.buffer().read(cx).read(cx);
let point = cursor_position.to_point(&snapshot);
drop(snapshot);
editor.go_to_singleton_buffer_point(point, window, cx);
});
Ok(editor.save(SaveOptions::default(), project, window, cx))

View File

@@ -1,7 +1,7 @@
use std::any::TypeId;
use debugger_panel::DebugPanel;
use editor::Editor;
use editor::{Editor, MultiBufferOffsetUtf16};
use gpui::{Action, App, DispatchPhase, EntityInputHandler, actions};
use new_process_modal::{NewProcessModal, NewProcessMode};
use onboarding_modal::DebuggerOnboardingModal;
@@ -390,11 +390,14 @@ pub fn init(cx: &mut App) {
maybe!({
let text = editor
.update(cx, |editor, cx| {
let range = editor
.selections
.newest::<MultiBufferOffsetUtf16>(
&editor.display_snapshot(cx),
)
.range();
editor.text_for_range(
editor
.selections
.newest(&editor.display_snapshot(cx))
.range(),
range.start.0.0..range.end.0.0,
&mut None,
window,
cx,

View File

@@ -25,12 +25,9 @@ use settings::Settings;
use task::{DebugScenario, RevealTarget, VariableName, ZedDebugConfig};
use theme::ThemeSettings;
use ui::{
ActiveTheme, Button, ButtonCommon, ButtonSize, CheckboxWithLabel, Clickable, Color, Context,
ContextMenu, Disableable, DropdownMenu, FluentBuilder, Icon, IconName, IconSize,
IconWithIndicator, Indicator, InteractiveElement, IntoElement, KeyBinding, Label,
LabelCommon as _, LabelSize, ListItem, ListItemSpacing, ParentElement, RenderOnce,
SharedString, Styled, StyledExt, ToggleButton, ToggleState, Toggleable, Tooltip, Window, div,
h_flex, relative, rems, v_flex,
CheckboxWithLabel, ContextMenu, DropdownMenu, FluentBuilder, IconWithIndicator, Indicator,
KeyBinding, ListItem, ListItemSpacing, ToggleButtonGroup, ToggleButtonSimple, ToggleState,
Tooltip, prelude::*,
};
use util::{ResultExt, rel_path::RelPath, shell::ShellKind};
use workspace::{ModalView, Workspace, notifications::DetachAndPromptErr, pane};
@@ -620,72 +617,64 @@ impl Render for NewProcessModal {
.border_b_1()
.border_color(cx.theme().colors().border_variant)
.child(
ToggleButton::new(
"debugger-session-ui-tasks-button",
NewProcessMode::Task.to_string(),
)
.size(ButtonSize::Default)
.toggle_state(matches!(self.mode, NewProcessMode::Task))
.style(ui::ButtonStyle::Subtle)
.on_click(cx.listener(|this, _, window, cx| {
this.mode = NewProcessMode::Task;
this.mode_focus_handle(cx).focus(window);
cx.notify();
}))
.tooltip(Tooltip::text("Run predefined task"))
.first(),
)
.child(
ToggleButton::new(
"debugger-session-ui-launch-button",
NewProcessMode::Debug.to_string(),
)
.size(ButtonSize::Default)
.style(ui::ButtonStyle::Subtle)
.toggle_state(matches!(self.mode, NewProcessMode::Debug))
.on_click(cx.listener(|this, _, window, cx| {
this.mode = NewProcessMode::Debug;
this.mode_focus_handle(cx).focus(window);
cx.notify();
}))
.tooltip(Tooltip::text("Start a predefined debug scenario"))
.middle(),
)
.child(
ToggleButton::new(
"debugger-session-ui-attach-button",
NewProcessMode::Attach.to_string(),
)
.size(ButtonSize::Default)
.toggle_state(matches!(self.mode, NewProcessMode::Attach))
.style(ui::ButtonStyle::Subtle)
.on_click(cx.listener(|this, _, window, cx| {
this.mode = NewProcessMode::Attach;
ToggleButtonGroup::single_row(
"debugger-mode-buttons",
[
ToggleButtonSimple::new(
NewProcessMode::Task.to_string(),
cx.listener(|this, _, window, cx| {
this.mode = NewProcessMode::Task;
this.mode_focus_handle(cx).focus(window);
cx.notify();
}),
)
.tooltip(Tooltip::text("Run predefined task")),
ToggleButtonSimple::new(
NewProcessMode::Debug.to_string(),
cx.listener(|this, _, window, cx| {
this.mode = NewProcessMode::Debug;
this.mode_focus_handle(cx).focus(window);
cx.notify();
}),
)
.tooltip(Tooltip::text("Start a predefined debug scenario")),
ToggleButtonSimple::new(
NewProcessMode::Attach.to_string(),
cx.listener(|this, _, window, cx| {
this.mode = NewProcessMode::Attach;
if let Some(debugger) = this.debugger.as_ref() {
Self::update_attach_picker(&this.attach_mode, debugger, window, cx);
}
this.mode_focus_handle(cx).focus(window);
cx.notify();
}))
.tooltip(Tooltip::text("Attach the debugger to a running process"))
.middle(),
)
.child(
ToggleButton::new(
"debugger-session-ui-custom-button",
NewProcessMode::Launch.to_string(),
if let Some(debugger) = this.debugger.as_ref() {
Self::update_attach_picker(
&this.attach_mode,
debugger,
window,
cx,
);
}
this.mode_focus_handle(cx).focus(window);
cx.notify();
}),
)
.tooltip(Tooltip::text("Attach the debugger to a running process")),
ToggleButtonSimple::new(
NewProcessMode::Launch.to_string(),
cx.listener(|this, _, window, cx| {
this.mode = NewProcessMode::Launch;
this.mode_focus_handle(cx).focus(window);
cx.notify();
}),
)
.tooltip(Tooltip::text("Launch a new process with a debugger")),
],
)
.size(ButtonSize::Default)
.toggle_state(matches!(self.mode, NewProcessMode::Launch))
.style(ui::ButtonStyle::Subtle)
.on_click(cx.listener(|this, _, window, cx| {
this.mode = NewProcessMode::Launch;
this.mode_focus_handle(cx).focus(window);
cx.notify();
}))
.tooltip(Tooltip::text("Launch a new process with a debugger"))
.last(),
.label_size(LabelSize::Default)
.auto_width()
.selected_index(match self.mode {
NewProcessMode::Task => 0,
NewProcessMode::Debug => 1,
NewProcessMode::Attach => 2,
NewProcessMode::Launch => 3,
}),
),
)
.child(v_flex().child(self.render_mode(window, cx)))

View File

@@ -8,7 +8,7 @@ use collections::HashMap;
use dap::{CompletionItem, CompletionItemType, OutputEvent};
use editor::{
Bias, CompletionProvider, Editor, EditorElement, EditorMode, EditorStyle, ExcerptId,
SizingBehavior,
MultiBufferOffset, SizingBehavior,
};
use fuzzy::StringMatchCandidate;
use gpui::{
@@ -161,7 +161,9 @@ impl Console {
) -> Task<Result<()>> {
self.console.update(cx, |_, cx| {
cx.spawn_in(window, async move |console, cx| {
let mut len = console.update(cx, |this, cx| this.buffer().read(cx).len(cx))?;
let mut len = console
.update(cx, |this, cx| this.buffer().read(cx).len(cx))?
.0;
let (output, spans, background_spans) = cx
.background_spawn(async move {
let mut all_spans = Vec::new();
@@ -227,8 +229,8 @@ impl Console {
for (range, color) in spans {
let Some(color) = color else { continue };
let start_offset = range.start;
let range =
buffer.anchor_after(range.start)..buffer.anchor_before(range.end);
let range = buffer.anchor_after(MultiBufferOffset(range.start))
..buffer.anchor_before(MultiBufferOffset(range.end));
let style = HighlightStyle {
color: Some(terminal_view::terminal_element::convert_color(
&color,
@@ -247,8 +249,8 @@ impl Console {
for (range, color) in background_spans {
let Some(color) = color else { continue };
let start_offset = range.start;
let range =
buffer.anchor_after(range.start)..buffer.anchor_before(range.end);
let range = buffer.anchor_after(MultiBufferOffset(range.start))
..buffer.anchor_before(MultiBufferOffset(range.end));
console.highlight_background_key::<ConsoleAnsiHighlight>(
start_offset,
&[range],
@@ -961,7 +963,7 @@ fn color_fetcher(color: ansi::Color) -> fn(&Theme) -> Hsla {
mod tests {
use super::*;
use crate::tests::init_test;
use editor::test::editor_test_context::EditorTestContext;
use editor::{MultiBufferOffset, test::editor_test_context::EditorTestContext};
use gpui::TestAppContext;
use language::Point;
@@ -993,8 +995,8 @@ mod tests {
cx.update_editor(|editor, _, cx| {
editor.edit(
vec![(
snapshot.offset_for_anchor(&replace_range.start)
..snapshot.offset_for_anchor(&replace_range.end),
MultiBufferOffset(snapshot.offset_for_anchor(&replace_range.start))
..MultiBufferOffset(snapshot.offset_for_anchor(&replace_range.end)),
replacement,
)],
cx,

View File

@@ -7,7 +7,7 @@ use editor::{
RowHighlightOptions, SelectionEffects, ToPoint, scroll::Autoscroll,
};
use gpui::{
AnyView, App, AppContext, Entity, EventEmitter, Focusable, IntoElement, Render, SharedString,
App, AppContext, Entity, EventEmitter, Focusable, IntoElement, Render, SharedString,
Subscription, Task, WeakEntity, Window,
};
use language::{BufferSnapshot, Capability, Point, Selection, SelectionGoal, TreeSitterOptions};
@@ -418,11 +418,11 @@ impl Item for StackTraceView {
type_id: TypeId,
self_handle: &'a Entity<Self>,
_: &'a App,
) -> Option<AnyView> {
) -> Option<gpui::AnyEntity> {
if type_id == TypeId::of::<Self>() {
Some(self_handle.to_any())
Some(self_handle.clone().into())
} else if type_id == TypeId::of::<Editor>() {
Some(self.editor.to_any())
Some(self.editor.clone().into())
} else {
None
}

View File

@@ -680,11 +680,11 @@ impl Item for BufferDiagnosticsEditor {
type_id: std::any::TypeId,
self_handle: &'a Entity<Self>,
_: &'a App,
) -> Option<gpui::AnyView> {
) -> Option<gpui::AnyEntity> {
if type_id == TypeId::of::<Self>() {
Some(self_handle.to_any())
Some(self_handle.clone().into())
} else if type_id == TypeId::of::<Editor>() {
Some(self.editor.to_any())
Some(self.editor.clone().into())
} else {
None
}

View File

@@ -17,7 +17,7 @@ use editor::{
multibuffer_context_lines,
};
use gpui::{
AnyElement, AnyView, App, AsyncApp, Context, Entity, EventEmitter, FocusHandle, FocusOutEvent,
AnyElement, App, AsyncApp, Context, Entity, EventEmitter, FocusHandle, FocusOutEvent,
Focusable, Global, InteractiveElement, IntoElement, ParentElement, Render, SharedString,
Styled, Subscription, Task, WeakEntity, Window, actions, div,
};
@@ -491,7 +491,7 @@ impl ProjectDiagnosticsEditor {
cx: &mut Context<Self>,
) -> Task<Result<()>> {
let was_empty = self.multibuffer.read(cx).is_empty();
let mut buffer_snapshot = buffer.read(cx).snapshot();
let buffer_snapshot = buffer.read(cx).snapshot();
let buffer_id = buffer_snapshot.remote_id();
let max_severity = if self.include_warnings {
@@ -602,7 +602,6 @@ impl ProjectDiagnosticsEditor {
cx,
)
.await;
buffer_snapshot = cx.update(|_, cx| buffer.read(cx).snapshot())?;
let initial_range = buffer_snapshot.anchor_after(b.initial_range.start)
..buffer_snapshot.anchor_before(b.initial_range.end);
let excerpt_range = ExcerptRange {
@@ -881,11 +880,11 @@ impl Item for ProjectDiagnosticsEditor {
type_id: TypeId,
self_handle: &'a Entity<Self>,
_: &'a App,
) -> Option<AnyView> {
) -> Option<gpui::AnyEntity> {
if type_id == TypeId::of::<Self>() {
Some(self_handle.to_any())
Some(self_handle.clone().into())
} else if type_id == TypeId::of::<Editor>() {
Some(self.editor.to_any())
Some(self.editor.clone().into())
} else {
None
}
@@ -1010,11 +1009,14 @@ async fn heuristic_syntactic_expand(
snapshot: BufferSnapshot,
cx: &mut AsyncApp,
) -> Option<RangeInclusive<BufferRow>> {
let start = snapshot.clip_point(input_range.start, Bias::Right);
let end = snapshot.clip_point(input_range.end, Bias::Left);
let input_row_count = input_range.end.row - input_range.start.row;
if input_row_count > max_row_count {
return None;
}
let input_range = start..end;
// If the outline node contains the diagnostic and is small enough, just use that.
let outline_range = snapshot.outline_range_containing(input_range.clone());
if let Some(outline_range) = outline_range.clone() {

View File

@@ -1,7 +1,7 @@
use super::*;
use collections::{HashMap, HashSet};
use editor::{
DisplayPoint, EditorSettings, Inlay,
DisplayPoint, EditorSettings, Inlay, MultiBufferOffset,
actions::{GoToDiagnostic, GoToPreviousDiagnostic, Hover, MoveToBeginning},
display_map::DisplayRow,
test::{
@@ -878,7 +878,8 @@ async fn test_random_diagnostics_with_inlays(cx: &mut TestAppContext, mut rng: S
diagnostics.editor.update(cx, |editor, cx| {
let snapshot = editor.snapshot(window, cx);
if !snapshot.buffer_snapshot().is_empty() {
let position = rng.random_range(0..snapshot.buffer_snapshot().len());
let position = rng
.random_range(MultiBufferOffset(0)..snapshot.buffer_snapshot().len());
let position = snapshot.buffer_snapshot().clip_offset(position, Bias::Left);
log::info!(
"adding inlay at {position}/{}: {:?}",

View File

@@ -1,6 +1,6 @@
use std::time::Duration;
use editor::Editor;
use editor::{Editor, MultiBufferOffset};
use gpui::{
Context, Entity, EventEmitter, IntoElement, ParentElement, Render, Styled, Subscription, Task,
WeakEntity, Window,
@@ -171,14 +171,19 @@ impl DiagnosticIndicator {
let buffer = editor.buffer().read(cx).snapshot(cx);
let cursor_position = editor
.selections
.newest::<usize>(&editor.display_snapshot(cx))
.newest::<MultiBufferOffset>(&editor.display_snapshot(cx))
.head();
(buffer, cursor_position)
});
let new_diagnostic = buffer
.diagnostics_in_range::<usize>(cursor_position..cursor_position)
.diagnostics_in_range::<MultiBufferOffset>(cursor_position..cursor_position)
.filter(|entry| !entry.range.is_empty())
.min_by_key(|entry| (entry.diagnostic.severity, entry.range.len()))
.min_by_key(|entry| {
(
entry.diagnostic.severity,
entry.range.end - entry.range.start,
)
})
.map(|entry| entry.diagnostic);
if new_diagnostic != self.current_diagnostic.as_ref() {
let new_diagnostic = new_diagnostic.cloned();

View File

@@ -18,12 +18,12 @@ client.workspace = true
cloud_llm_client.workspace = true
codestral.workspace = true
copilot.workspace = true
edit_prediction.workspace = true
editor.workspace = true
feature_flags.workspace = true
fs.workspace = true
gpui.workspace = true
indoc.workspace = true
edit_prediction.workspace = true
language.workspace = true
paths.workspace = true
project.workspace = true
@@ -35,6 +35,7 @@ ui.workspace = true
workspace.workspace = true
zed_actions.workspace = true
zeta.workspace = true
zeta2.workspace = true
[dev-dependencies]
copilot = { workspace = true, features = ["test-support"] }

View File

@@ -3,7 +3,9 @@ use client::{Client, UserStore, zed_urls};
use cloud_llm_client::UsageLimit;
use codestral::CodestralCompletionProvider;
use copilot::{Copilot, Status};
use editor::{Editor, SelectionEffects, actions::ShowEditPrediction, scroll::Autoscroll};
use editor::{
Editor, MultiBufferOffset, SelectionEffects, actions::ShowEditPrediction, scroll::Autoscroll,
};
use feature_flags::{FeatureFlagAppExt, PredictEditsRateCompletionsFeatureFlag};
use fs::Fs;
use gpui::{
@@ -18,7 +20,9 @@ use language::{
};
use project::DisableAiSettings;
use regex::Regex;
use settings::{Settings, SettingsStore, update_settings_file};
use settings::{
EXPERIMENTAL_SWEEP_EDIT_PREDICTION_PROVIDER_NAME, Settings, SettingsStore, update_settings_file,
};
use std::{
sync::{Arc, LazyLock},
time::Duration,
@@ -34,6 +38,7 @@ use workspace::{
};
use zed_actions::OpenBrowser;
use zeta::RateCompletions;
use zeta2::SweepFeatureFlag;
actions!(
edit_prediction,
@@ -78,7 +83,7 @@ impl Render for EditPredictionButton {
let all_language_settings = all_language_settings(None, cx);
match all_language_settings.edit_predictions.provider {
match &all_language_settings.edit_predictions.provider {
EditPredictionProvider::None => div().hidden(),
EditPredictionProvider::Copilot => {
@@ -297,6 +302,15 @@ impl Render for EditPredictionButton {
.with_handle(self.popover_menu_handle.clone()),
)
}
EditPredictionProvider::Experimental(provider_name) => {
if *provider_name == EXPERIMENTAL_SWEEP_EDIT_PREDICTION_PROVIDER_NAME
&& cx.has_flag::<SweepFeatureFlag>()
{
div().child(Icon::new(IconName::SweepAi))
} else {
div()
}
}
EditPredictionProvider::Zed => {
let enabled = self.editor_enabled.unwrap_or(true);
@@ -525,7 +539,7 @@ impl EditPredictionButton {
set_completion_provider(fs.clone(), cx, provider);
})
}
EditPredictionProvider::None => continue,
EditPredictionProvider::None | EditPredictionProvider::Experimental(_) => continue,
};
}
}
@@ -1095,7 +1109,12 @@ async fn open_disabled_globs_setting_in_editor(
});
if !edits.is_empty() {
item.edit(edits, cx);
item.edit(
edits
.into_iter()
.map(|(r, s)| (MultiBufferOffset(r.start)..MultiBufferOffset(r.end), s)),
cx,
);
}
let text = item.buffer().read(cx).snapshot(cx).text();
@@ -1110,6 +1129,7 @@ async fn open_disabled_globs_setting_in_editor(
.map(|inner_match| inner_match.start()..inner_match.end())
});
if let Some(range) = range {
let range = MultiBufferOffset(range.start)..MultiBufferOffset(range.end);
item.change_selections(
SelectionEffects::scroll(Autoscroll::newest()),
window,

View File

@@ -2,6 +2,7 @@ use criterion::{BenchmarkId, Criterion, criterion_group, criterion_main};
use editor::MultiBuffer;
use gpui::TestDispatcher;
use itertools::Itertools;
use multi_buffer::MultiBufferOffset;
use rand::{Rng, SeedableRng, rngs::StdRng};
use std::num::NonZeroU32;
use text::Bias;
@@ -24,7 +25,9 @@ fn to_tab_point_benchmark(c: &mut Criterion) {
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot.clone());
let fold_point = fold_snapshot.to_fold_point(
inlay_snapshot.to_point(InlayOffset(rng.random_range(0..length))),
inlay_snapshot.to_point(InlayOffset(
rng.random_range(MultiBufferOffset(0)..MultiBufferOffset(length)),
)),
Bias::Left,
);
let (_, snapshot) = TabMap::new(fold_snapshot, NonZeroU32::new(4).unwrap());
@@ -69,7 +72,9 @@ fn to_fold_point_benchmark(c: &mut Criterion) {
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot.clone());
let fold_point = fold_snapshot.to_fold_point(
inlay_snapshot.to_point(InlayOffset(rng.random_range(0..length))),
inlay_snapshot.to_point(InlayOffset(
rng.random_range(MultiBufferOffset(0)..MultiBufferOffset(length)),
)),
Bias::Left,
);

View File

@@ -8,6 +8,7 @@ use gpui::{
use itertools::Itertools;
use language::CodeLabel;
use language::{Buffer, LanguageName, LanguageRegistry};
use lsp::CompletionItemTag;
use markdown::{Markdown, MarkdownElement};
use multi_buffer::{Anchor, ExcerptId};
use ordered_float::OrderedFloat;
@@ -840,7 +841,16 @@ impl CompletionsMenu {
if completion
.source
.lsp_completion(false)
.and_then(|lsp_completion| lsp_completion.deprecated)
.and_then(|lsp_completion| {
match (lsp_completion.deprecated, &lsp_completion.tags)
{
(Some(true), _) => Some(true),
(_, Some(tags)) => Some(
tags.contains(&CompletionItemTag::DEPRECATED),
),
_ => None,
}
})
.unwrap_or(false)
{
highlight.strikethrough = Some(StrikethroughStyle {

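The hunk above now flags a completion as deprecated when either the legacy deprecated boolean is set or the tags array carries CompletionItemTag::DEPRECATED, the LSP 3.15 replacement for that field. As a standalone sketch of the same decision, assuming the lsp-types field names:

use lsp_types::{CompletionItem, CompletionItemTag};

// Mirrors the match in the diff: an explicit deprecated flag wins, otherwise
// fall back to the tags array; absence of both means "not deprecated".
fn is_deprecated(item: &CompletionItem) -> bool {
    match (item.deprecated, item.tags.as_ref()) {
        (Some(true), _) => true,
        (_, Some(tags)) => tags.contains(&CompletionItemTag::DEPRECATED),
        _ => false,
    }
}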
View File

@@ -44,12 +44,10 @@ pub use invisibles::{is_invisible, replacement};
use collections::{HashMap, HashSet};
use gpui::{App, Context, Entity, Font, HighlightStyle, LineLayout, Pixels, UnderlineStyle};
use language::{
OffsetUtf16, Point, Subscription as BufferSubscription, language_settings::language_settings,
};
use language::{Point, Subscription as BufferSubscription, language_settings::language_settings};
use multi_buffer::{
Anchor, AnchorRangeExt, MultiBuffer, MultiBufferPoint, MultiBufferRow, MultiBufferSnapshot,
RowInfo, ToOffset, ToPoint,
Anchor, AnchorRangeExt, MultiBuffer, MultiBufferOffset, MultiBufferOffsetUtf16,
MultiBufferPoint, MultiBufferRow, MultiBufferSnapshot, RowInfo, ToOffset, ToPoint,
};
use project::InlayId;
use project::project_settings::DiagnosticSeverity;
@@ -104,7 +102,7 @@ type InlayHighlights = TreeMap<TypeId, TreeMap<InlayId, (HighlightStyle, InlayHi
pub struct DisplayMap {
/// The buffer that we are displaying.
buffer: Entity<MultiBuffer>,
buffer_subscription: BufferSubscription,
buffer_subscription: BufferSubscription<MultiBufferOffset>,
/// Decides where the [`Inlay`]s should be displayed.
inlay_map: InlayMap,
/// Decides where the fold indicators should be and tracks parts of a source file that are currently folded.
@@ -198,7 +196,7 @@ impl DisplayMap {
pub fn set_state(&mut self, other: &DisplaySnapshot, cx: &mut Context<Self>) {
self.fold(
other
.folds_in_range(0..other.buffer_snapshot().len())
.folds_in_range(MultiBufferOffset(0)..other.buffer_snapshot().len())
.map(|fold| {
Crease::simple(
fold.range.to_offset(other.buffer_snapshot()),
@@ -794,7 +792,7 @@ impl DisplaySnapshot {
}
pub fn is_empty(&self) -> bool {
self.buffer_snapshot().len() == 0
self.buffer_snapshot().len() == MultiBufferOffset(0)
}
pub fn row_infos(&self, start_row: DisplayRow) -> impl Iterator<Item = RowInfo> + '_ {
@@ -1133,7 +1131,10 @@ impl DisplaySnapshot {
})
}
pub fn buffer_chars_at(&self, mut offset: usize) -> impl Iterator<Item = (char, usize)> + '_ {
pub fn buffer_chars_at(
&self,
mut offset: MultiBufferOffset,
) -> impl Iterator<Item = (char, MultiBufferOffset)> + '_ {
self.buffer_snapshot().chars_at(offset).map(move |ch| {
let ret = (ch, offset);
offset += ch.len_utf8();
@@ -1143,8 +1144,8 @@ impl DisplaySnapshot {
pub fn reverse_buffer_chars_at(
&self,
mut offset: usize,
) -> impl Iterator<Item = (char, usize)> + '_ {
mut offset: MultiBufferOffset,
) -> impl Iterator<Item = (char, MultiBufferOffset)> + '_ {
self.buffer_snapshot()
.reversed_chars_at(offset)
.map(move |ch| {
@@ -1526,7 +1527,7 @@ impl DisplayPoint {
map.display_point_to_point(self, Bias::Left)
}
pub fn to_offset(self, map: &DisplaySnapshot, bias: Bias) -> usize {
pub fn to_offset(self, map: &DisplaySnapshot, bias: Bias) -> MultiBufferOffset {
let wrap_point = map.block_snapshot.to_wrap_point(self.0, bias);
let tab_point = map.wrap_snapshot().to_tab_point(wrap_point);
let fold_point = map.tab_snapshot().to_fold_point(tab_point, bias).0;
@@ -1536,13 +1537,13 @@ impl DisplayPoint {
}
}
impl ToDisplayPoint for usize {
impl ToDisplayPoint for MultiBufferOffset {
fn to_display_point(&self, map: &DisplaySnapshot) -> DisplayPoint {
map.point_to_display_point(self.to_point(map.buffer_snapshot()), Bias::Left)
}
}
impl ToDisplayPoint for OffsetUtf16 {
impl ToDisplayPoint for MultiBufferOffsetUtf16 {
fn to_display_point(&self, map: &DisplaySnapshot) -> DisplayPoint {
self.to_offset(map.buffer_snapshot()).to_display_point(map)
}
@@ -1685,7 +1686,7 @@ pub mod tests {
let block_properties = (0..rng.random_range(1..=1))
.map(|_| {
let position = buffer.anchor_after(buffer.clip_offset(
rng.random_range(0..=buffer.len()),
rng.random_range(MultiBufferOffset(0)..=buffer.len()),
Bias::Left,
));
@@ -1727,8 +1728,12 @@ pub mod tests {
for _ in 0..rng.random_range(1..=3) {
buffer.read_with(cx, |buffer, cx| {
let buffer = buffer.read(cx);
let end = buffer.clip_offset(rng.random_range(0..=buffer.len()), Right);
let start = buffer.clip_offset(rng.random_range(0..=end), Left);
let end = buffer.clip_offset(
rng.random_range(MultiBufferOffset(0)..=buffer.len()),
Right,
);
let start = buffer
.clip_offset(rng.random_range(MultiBufferOffset(0)..=end), Left);
ranges.push(start..end);
});
}
@@ -1954,7 +1959,7 @@ pub mod tests {
)
);
let ix = snapshot.buffer_snapshot().text().find("seven").unwrap();
let ix = MultiBufferOffset(snapshot.buffer_snapshot().text().find("seven").unwrap());
buffer.update(cx, |buffer, cx| {
buffer.edit([(ix..ix, "and ")], None, cx);
});
@@ -2083,7 +2088,7 @@ pub mod tests {
&[],
vec![Inlay::edit_prediction(
0,
buffer_snapshot.anchor_after(0),
buffer_snapshot.anchor_after(MultiBufferOffset(0)),
"\n",
)],
cx,
@@ -2094,7 +2099,11 @@ pub mod tests {
// Regression test: updating the display map does not crash when a
// block is immediately followed by a multi-line inlay.
buffer.update(cx, |buffer, cx| {
buffer.edit([(1..1, "b")], None, cx);
buffer.edit(
[(MultiBufferOffset(1)..MultiBufferOffset(1), "b")],
None,
cx,
);
});
map.update(cx, |m, cx| assert_eq!(m.snapshot(cx).text(), "\n\n\nab"));
}
@@ -2694,6 +2703,7 @@ pub mod tests {
HighlightKey::Type(TypeId::of::<MyType>()),
highlighted_ranges
.into_iter()
.map(|range| MultiBufferOffset(range.start)..MultiBufferOffset(range.end))
.map(|range| {
buffer_snapshot.anchor_before(range.start)
..buffer_snapshot.anchor_before(range.end)

View File

@@ -11,8 +11,8 @@ use collections::{Bound, HashMap, HashSet};
use gpui::{AnyElement, App, EntityId, Pixels, Window};
use language::{Patch, Point};
use multi_buffer::{
Anchor, ExcerptId, ExcerptInfo, MultiBuffer, MultiBufferRow, MultiBufferSnapshot, RowInfo,
ToOffset, ToPoint as _,
Anchor, ExcerptId, ExcerptInfo, MultiBuffer, MultiBufferOffset, MultiBufferRow,
MultiBufferSnapshot, RowInfo, ToOffset, ToPoint as _,
};
use parking_lot::Mutex;
use std::{
@@ -1208,7 +1208,7 @@ impl BlockMapWriter<'_> {
pub fn remove_intersecting_replace_blocks(
&mut self,
ranges: impl IntoIterator<Item = Range<usize>>,
ranges: impl IntoIterator<Item = Range<MultiBufferOffset>>,
inclusive: bool,
) {
let wrap_snapshot = self.0.wrap_snapshot.borrow();
@@ -1283,7 +1283,7 @@ impl BlockMapWriter<'_> {
fn blocks_intersecting_buffer_range(
&self,
range: Range<usize>,
range: Range<MultiBufferOffset>,
inclusive: bool,
) -> &[Arc<CustomBlock>] {
if range.is_empty() && !inclusive {
@@ -3043,8 +3043,10 @@ mod tests {
let block_properties = (0..block_count)
.map(|_| {
let buffer = cx.update(|cx| buffer.read(cx).read(cx).clone());
let offset =
buffer.clip_offset(rng.random_range(0..=buffer.len()), Bias::Left);
let offset = buffer.clip_offset(
rng.random_range(MultiBufferOffset(0)..=buffer.len()),
Bias::Left,
);
let mut min_height = 0;
let placement = match rng.random_range(0..3) {
0 => {
@@ -3244,7 +3246,7 @@ mod tests {
// Note that this needs to be synced with the related section in BlockMap::sync
expected_blocks.extend(block_map.header_and_footer_blocks(
&buffer_snapshot,
0..,
MultiBufferOffset(0)..,
&wraps_snapshot,
));

View File

@@ -1,7 +1,7 @@
use collections::BTreeMap;
use gpui::HighlightStyle;
use language::Chunk;
use multi_buffer::{MultiBufferChunks, MultiBufferSnapshot, ToOffset as _};
use multi_buffer::{MultiBufferChunks, MultiBufferOffset, MultiBufferSnapshot, ToOffset as _};
use std::{
cmp,
iter::{self, Peekable},
@@ -14,7 +14,7 @@ use crate::display_map::{HighlightKey, TextHighlights};
pub struct CustomHighlightsChunks<'a> {
buffer_chunks: MultiBufferChunks<'a>,
buffer_chunk: Option<Chunk<'a>>,
offset: usize,
offset: MultiBufferOffset,
multibuffer_snapshot: &'a MultiBufferSnapshot,
highlight_endpoints: Peekable<vec::IntoIter<HighlightEndpoint>>,
@@ -24,14 +24,14 @@ pub struct CustomHighlightsChunks<'a> {
#[derive(Debug, Copy, Clone, Eq, PartialEq)]
struct HighlightEndpoint {
offset: usize,
offset: MultiBufferOffset,
tag: HighlightKey,
style: Option<HighlightStyle>,
}
impl<'a> CustomHighlightsChunks<'a> {
pub fn new(
range: Range<usize>,
range: Range<MultiBufferOffset>,
language_aware: bool,
text_highlights: Option<&'a TextHighlights>,
multibuffer_snapshot: &'a MultiBufferSnapshot,
@@ -52,7 +52,7 @@ impl<'a> CustomHighlightsChunks<'a> {
}
}
pub fn seek(&mut self, new_range: Range<usize>) {
pub fn seek(&mut self, new_range: Range<MultiBufferOffset>) {
self.highlight_endpoints =
create_highlight_endpoints(&new_range, self.text_highlights, self.multibuffer_snapshot);
self.offset = new_range.start;
@@ -63,7 +63,7 @@ impl<'a> CustomHighlightsChunks<'a> {
}
fn create_highlight_endpoints(
range: &Range<usize>,
range: &Range<MultiBufferOffset>,
text_highlights: Option<&TextHighlights>,
buffer: &MultiBufferSnapshot,
) -> iter::Peekable<vec::IntoIter<HighlightEndpoint>> {
@@ -117,7 +117,7 @@ impl<'a> Iterator for CustomHighlightsChunks<'a> {
type Item = Chunk<'a>;
fn next(&mut self) -> Option<Self::Item> {
let mut next_highlight_endpoint = usize::MAX;
let mut next_highlight_endpoint = MultiBufferOffset(usize::MAX);
while let Some(endpoint) = self.highlight_endpoints.peek().copied() {
if endpoint.offset <= self.offset {
if let Some(style) = endpoint.style {
@@ -224,20 +224,22 @@ mod tests {
let range_count = rng.random_range(1..10);
let text = buffer_snapshot.text();
for _ in 0..range_count {
if buffer_snapshot.len() == 0 {
if buffer_snapshot.len() == MultiBufferOffset(0) {
continue;
}
let mut start = rng.random_range(0..=buffer_snapshot.len().saturating_sub(10));
let mut start = rng.random_range(
MultiBufferOffset(0)..=buffer_snapshot.len().saturating_sub_usize(10),
);
while !text.is_char_boundary(start) {
start = start.saturating_sub(1);
while !text.is_char_boundary(start.0) {
start = start.saturating_sub_usize(1);
}
let end_end = buffer_snapshot.len().min(start + 100);
let end_end = buffer_snapshot.len().min(start + 100usize);
let mut end = rng.random_range(start..=end_end);
while !text.is_char_boundary(end) {
end = end.saturating_sub(1);
while !text.is_char_boundary(end.0) {
end = end.saturating_sub_usize(1);
}
if start < end {
@@ -253,8 +255,12 @@ mod tests {
}
// Get all chunks and verify their bitmaps
let chunks =
CustomHighlightsChunks::new(0..buffer_snapshot.len(), false, None, &buffer_snapshot);
let chunks = CustomHighlightsChunks::new(
MultiBufferOffset(0)..buffer_snapshot.len(),
false,
None,
&buffer_snapshot,
);
for chunk in chunks {
let chunk_text = chunk.text;

View File

@@ -5,16 +5,17 @@ use super::{
inlay_map::{InlayBufferRows, InlayChunks, InlayEdit, InlayOffset, InlayPoint, InlaySnapshot},
};
use gpui::{AnyElement, App, ElementId, HighlightStyle, Pixels, Window};
use language::{Edit, HighlightId, Point, TextSummary};
use language::{Edit, HighlightId, Point};
use multi_buffer::{
Anchor, AnchorRangeExt, MultiBufferRow, MultiBufferSnapshot, RowInfo, ToOffset,
Anchor, AnchorRangeExt, MBTextSummary, MultiBufferOffset, MultiBufferRow, MultiBufferSnapshot,
RowInfo, ToOffset,
};
use project::InlayId;
use std::{
any::TypeId,
cmp::{self, Ordering},
fmt, iter,
ops::{Add, AddAssign, Deref, DerefMut, Range, Sub},
ops::{Add, AddAssign, Deref, DerefMut, Range, Sub, SubAssign},
sync::Arc,
usize,
};
@@ -261,7 +262,7 @@ impl FoldMapWriter<'_> {
fold_ixs_to_delete.dedup();
self.0.snapshot.folds = {
let mut cursor = self.0.snapshot.folds.cursor::<usize>(buffer);
let mut cursor = self.0.snapshot.folds.cursor::<MultiBufferOffset>(buffer);
let mut folds = SumTree::new(buffer);
for fold_ix in fold_ixs_to_delete {
folds.append(cursor.slice(&fold_ix, Bias::Right), buffer);
@@ -413,7 +414,7 @@ impl FoldMap {
let mut new_transforms = SumTree::<Transform>::default();
let mut cursor = self.snapshot.transforms.cursor::<InlayOffset>(());
cursor.seek(&InlayOffset(0), Bias::Right);
cursor.seek(&InlayOffset(MultiBufferOffset(0)), Bias::Right);
while let Some(mut edit) = inlay_edits_iter.next() {
if let Some(item) = cursor.item()
@@ -436,7 +437,7 @@ impl FoldMap {
cursor.seek(&edit.old.end, Bias::Right);
cursor.next();
let mut delta = edit.new_len().0 as isize - edit.old_len().0 as isize;
let mut delta = edit.new_len() as isize - edit.old_len() as isize;
loop {
edit.old.end = *cursor.start();
@@ -446,7 +447,7 @@ impl FoldMap {
}
let next_edit = inlay_edits_iter.next().unwrap();
delta += next_edit.new_len().0 as isize - next_edit.old_len().0 as isize;
delta += next_edit.new_len() as isize - next_edit.old_len() as isize;
if next_edit.old.end >= edit.old.end {
edit.old.end = next_edit.old.end;
@@ -458,8 +459,9 @@ impl FoldMap {
}
}
edit.new.end =
InlayOffset(((edit.new.start + edit.old_len()).0 as isize + delta) as usize);
edit.new.end = InlayOffset(MultiBufferOffset(
((edit.new.start + edit.old_len()).0.0 as isize + delta) as usize,
));
let anchor = inlay_snapshot
.buffer
@@ -522,7 +524,7 @@ impl FoldMap {
new_transforms.push(
Transform {
summary: TransformSummary {
output: TextSummary::from(ELLIPSIS),
output: MBTextSummary::from(ELLIPSIS),
input: inlay_snapshot
.text_summary_for_range(fold_range.start..fold_range.end),
},
@@ -579,7 +581,7 @@ impl FoldMap {
edit.old.start = old_transforms.start().0;
}
let old_start =
old_transforms.start().1.0 + (edit.old.start - old_transforms.start().0).0;
old_transforms.start().1.0 + (edit.old.start - old_transforms.start().0);
old_transforms.seek_forward(&edit.old.end, Bias::Right);
if old_transforms.item().is_some_and(|t| t.is_fold()) {
@@ -587,14 +589,14 @@ impl FoldMap {
edit.old.end = old_transforms.start().0;
}
let old_end =
old_transforms.start().1.0 + (edit.old.end - old_transforms.start().0).0;
old_transforms.start().1.0 + (edit.old.end - old_transforms.start().0);
new_transforms.seek(&edit.new.start, Bias::Left);
if new_transforms.item().is_some_and(|t| t.is_fold()) {
edit.new.start = new_transforms.start().0;
}
let new_start =
new_transforms.start().1.0 + (edit.new.start - new_transforms.start().0).0;
new_transforms.start().1.0 + (edit.new.start - new_transforms.start().0);
new_transforms.seek_forward(&edit.new.end, Bias::Right);
if new_transforms.item().is_some_and(|t| t.is_fold()) {
@@ -602,7 +604,7 @@ impl FoldMap {
edit.new.end = new_transforms.start().0;
}
let new_end =
new_transforms.start().1.0 + (edit.new.end - new_transforms.start().0).0;
new_transforms.start().1.0 + (edit.new.end - new_transforms.start().0);
fold_edits.push(FoldEdit {
old: FoldOffset(old_start)..FoldOffset(old_end),
@@ -649,9 +651,13 @@ impl FoldSnapshot {
#[cfg(test)]
pub fn text(&self) -> String {
self.chunks(FoldOffset(0)..self.len(), false, Highlights::default())
.map(|c| c.text)
.collect()
self.chunks(
FoldOffset(MultiBufferOffset(0))..self.len(),
false,
Highlights::default(),
)
.map(|c| c.text)
.collect()
}
#[cfg(test)]
@@ -659,8 +665,8 @@ impl FoldSnapshot {
self.folds.items(&self.inlay_snapshot.buffer).len()
}
pub fn text_summary_for_range(&self, range: Range<FoldPoint>) -> TextSummary {
let mut summary = TextSummary::default();
pub fn text_summary_for_range(&self, range: Range<FoldPoint>) -> MBTextSummary {
let mut summary = MBTextSummary::default();
let mut cursor = self
.transforms
@@ -670,7 +676,7 @@ impl FoldSnapshot {
let start_in_transform = range.start.0 - cursor.start().0.0;
let end_in_transform = cmp::min(range.end, cursor.end().0).0 - cursor.start().0.0;
if let Some(placeholder) = transform.placeholder.as_ref() {
summary = TextSummary::from(
summary = MBTextSummary::from(
&placeholder.text
[start_in_transform.column as usize..end_in_transform.column as usize],
);
@@ -689,14 +695,14 @@ impl FoldSnapshot {
if range.end > cursor.end().0 {
cursor.next();
summary += &cursor
summary += cursor
.summary::<_, TransformSummary>(&range.end, Bias::Right)
.output;
if let Some(transform) = cursor.item() {
let end_in_transform = range.end.0 - cursor.start().0.0;
if let Some(placeholder) = transform.placeholder.as_ref() {
summary +=
TextSummary::from(&placeholder.text[..end_in_transform.column as usize]);
MBTextSummary::from(&placeholder.text[..end_in_transform.column as usize]);
} else {
let inlay_start = self.inlay_snapshot.to_offset(cursor.start().1);
let inlay_end = self
@@ -839,8 +845,8 @@ impl FoldSnapshot {
transform_cursor.seek(&range.start, Bias::Right);
let inlay_start = {
let overshoot = range.start.0 - transform_cursor.start().0.0;
transform_cursor.start().1 + InlayOffset(overshoot)
let overshoot = range.start - transform_cursor.start().0;
transform_cursor.start().1 + overshoot
};
let transform_end = transform_cursor.end();
@@ -851,8 +857,8 @@ impl FoldSnapshot {
{
inlay_start
} else if range.end < transform_end.0 {
let overshoot = range.end.0 - transform_cursor.start().0.0;
transform_cursor.start().1 + InlayOffset(overshoot)
let overshoot = range.end - transform_cursor.start().0;
transform_cursor.start().1 + overshoot
} else {
transform_end.1
};
@@ -921,7 +927,7 @@ impl FoldSnapshot {
}
}
fn push_isomorphic(transforms: &mut SumTree<Transform>, summary: TextSummary) {
fn push_isomorphic(transforms: &mut SumTree<Transform>, summary: MBTextSummary) {
let mut did_merge = false;
transforms.update_last(
|last| {
@@ -950,13 +956,13 @@ fn push_isomorphic(transforms: &mut SumTree<Transform>, summary: TextSummary) {
fn intersecting_folds<'a>(
inlay_snapshot: &'a InlaySnapshot,
folds: &'a SumTree<Fold>,
range: Range<usize>,
range: Range<MultiBufferOffset>,
inclusive: bool,
) -> FilterCursor<'a, 'a, impl 'a + FnMut(&FoldSummary) -> bool, Fold, usize> {
) -> FilterCursor<'a, 'a, impl 'a + FnMut(&FoldSummary) -> bool, Fold, MultiBufferOffset> {
let buffer = &inlay_snapshot.buffer;
let start = buffer.anchor_before(range.start.to_offset(buffer));
let end = buffer.anchor_after(range.end.to_offset(buffer));
let mut cursor = folds.filter::<_, usize>(buffer, move |summary| {
let mut cursor = folds.filter::<_, MultiBufferOffset>(buffer, move |summary| {
let start_cmp = start.cmp(&summary.max_end, buffer);
let end_cmp = end.cmp(&summary.min_start, buffer);
@@ -1061,8 +1067,8 @@ impl Transform {
#[derive(Clone, Debug, Default, Eq, PartialEq)]
struct TransformSummary {
output: TextSummary,
input: TextSummary,
output: MBTextSummary,
input: MBTextSummary,
}
impl sum_tree::Item for Transform {
@@ -1079,8 +1085,8 @@ impl sum_tree::ContextLessSummary for TransformSummary {
}
fn add_summary(&mut self, other: &Self) {
self.input += &other.input;
self.output += &other.output;
self.input += other.input;
self.output += other.output;
}
}
@@ -1211,7 +1217,7 @@ impl sum_tree::SeekTarget<'_, FoldSummary, FoldRange> for FoldRange {
}
}
impl<'a> sum_tree::Dimension<'a, FoldSummary> for usize {
impl<'a> sum_tree::Dimension<'a, FoldSummary> for MultiBufferOffset {
fn zero(_cx: &MultiBufferSnapshot) -> Self {
Default::default()
}
@@ -1357,8 +1363,8 @@ impl FoldChunks<'_> {
self.transform_cursor.seek(&range.start, Bias::Right);
let inlay_start = {
let overshoot = range.start.0 - self.transform_cursor.start().0.0;
self.transform_cursor.start().1 + InlayOffset(overshoot)
let overshoot = range.start - self.transform_cursor.start().0;
self.transform_cursor.start().1 + overshoot
};
let transform_end = self.transform_cursor.end();
@@ -1370,8 +1376,8 @@ impl FoldChunks<'_> {
{
inlay_start
} else if range.end < transform_end.0 {
let overshoot = range.end.0 - self.transform_cursor.start().0.0;
self.transform_cursor.start().1 + InlayOffset(overshoot)
let overshoot = range.end - self.transform_cursor.start().0;
self.transform_cursor.start().1 + overshoot
} else {
transform_end.1
};
@@ -1423,8 +1429,8 @@ impl<'a> Iterator for FoldChunks<'a> {
let transform_start = self.transform_cursor.start();
let transform_end = self.transform_cursor.end();
let inlay_end = if self.max_output_offset < transform_end.0 {
let overshoot = self.max_output_offset.0 - transform_start.0.0;
transform_start.1 + InlayOffset(overshoot)
let overshoot = self.max_output_offset - transform_start.0;
transform_start.1 + overshoot
} else {
transform_end.1
};
@@ -1441,15 +1447,15 @@ impl<'a> Iterator for FoldChunks<'a> {
// Otherwise, take a chunk from the buffer's text.
if let Some((buffer_chunk_start, mut inlay_chunk)) = self.inlay_chunk.clone() {
let chunk = &mut inlay_chunk.chunk;
let buffer_chunk_end = buffer_chunk_start + InlayOffset(chunk.text.len());
let buffer_chunk_end = buffer_chunk_start + chunk.text.len();
let transform_end = self.transform_cursor.end().1;
let chunk_end = buffer_chunk_end.min(transform_end);
let bit_start = (self.inlay_offset - buffer_chunk_start).0;
let bit_end = (chunk_end - buffer_chunk_start).0;
let bit_start = self.inlay_offset - buffer_chunk_start;
let bit_end = chunk_end - buffer_chunk_start;
chunk.text = &chunk.text[bit_start..bit_end];
let bit_end = (chunk_end - buffer_chunk_start).0;
let bit_end = chunk_end - buffer_chunk_start;
let mask = 1u128.unbounded_shl(bit_end as u32).wrapping_sub(1);
chunk.tabs = (chunk.tabs >> bit_start) & mask;
@@ -1483,7 +1489,7 @@ impl<'a> Iterator for FoldChunks<'a> {
}
#[derive(Copy, Clone, Debug, Default, Eq, Ord, PartialOrd, PartialEq)]
pub struct FoldOffset(pub usize);
pub struct FoldOffset(pub MultiBufferOffset);
impl FoldOffset {
pub fn to_point(self, snapshot: &FoldSnapshot) -> FoldPoint {
@@ -1493,7 +1499,7 @@ impl FoldOffset {
let overshoot = if item.is_none_or(|t| t.is_fold()) {
Point::new(0, (self.0 - start.0.0) as u32)
} else {
let inlay_offset = start.1.input.len + self.0 - start.0.0;
let inlay_offset = start.1.input.len + (self - start.0);
let inlay_point = snapshot.inlay_snapshot.to_point(InlayOffset(inlay_offset));
inlay_point.0 - start.1.input.lines
};
@@ -1505,7 +1511,7 @@ impl FoldOffset {
let (start, _, _) = snapshot
.transforms
.find::<Dimensions<FoldOffset, InlayOffset>, _>((), &self, Bias::Right);
let overshoot = self.0 - start.0.0;
let overshoot = self - start.0;
InlayOffset(start.1.0 + overshoot)
}
}
@@ -1518,17 +1524,46 @@ impl Add for FoldOffset {
}
}
impl Sub for FoldOffset {
type Output = <MultiBufferOffset as Sub>::Output;
fn sub(self, rhs: Self) -> Self::Output {
self.0 - rhs.0
}
}
impl<T> SubAssign<T> for FoldOffset
where
MultiBufferOffset: SubAssign<T>,
{
fn sub_assign(&mut self, rhs: T) {
self.0 -= rhs;
}
}
impl<T> Add<T> for FoldOffset
where
MultiBufferOffset: Add<T, Output = MultiBufferOffset>,
{
type Output = Self;
fn add(self, rhs: T) -> Self::Output {
Self(self.0 + rhs)
}
}
impl AddAssign for FoldOffset {
fn add_assign(&mut self, rhs: Self) {
self.0 += rhs.0;
}
}
impl Sub for FoldOffset {
type Output = Self;
fn sub(self, rhs: Self) -> Self::Output {
Self(self.0 - rhs.0)
impl<T> AddAssign<T> for FoldOffset
where
MultiBufferOffset: AddAssign<T>,
{
fn add_assign(&mut self, rhs: T) {
self.0 += rhs;
}
}
@@ -1538,7 +1573,7 @@ impl<'a> sum_tree::Dimension<'a, TransformSummary> for FoldOffset {
}
fn add_summary(&mut self, summary: &'a TransformSummary, _: ()) {
self.0 += &summary.output.len;
self.0 += summary.output.len;
}
}
@@ -1558,7 +1593,7 @@ impl<'a> sum_tree::Dimension<'a, TransformSummary> for InlayOffset {
}
fn add_summary(&mut self, summary: &'a TransformSummary, _: ()) {
self.0 += &summary.input.len;
self.0 += summary.input.len;
}
}
@@ -1596,12 +1631,12 @@ mod tests {
edits,
&[
FoldEdit {
old: FoldOffset(2)..FoldOffset(16),
new: FoldOffset(2)..FoldOffset(5),
old: FoldOffset(MultiBufferOffset(2))..FoldOffset(MultiBufferOffset(16)),
new: FoldOffset(MultiBufferOffset(2))..FoldOffset(MultiBufferOffset(5)),
},
FoldEdit {
old: FoldOffset(18)..FoldOffset(29),
new: FoldOffset(7)..FoldOffset(10)
old: FoldOffset(MultiBufferOffset(18))..FoldOffset(MultiBufferOffset(29)),
new: FoldOffset(MultiBufferOffset(7))..FoldOffset(MultiBufferOffset(10)),
},
]
);
@@ -1626,12 +1661,12 @@ mod tests {
edits,
&[
FoldEdit {
old: FoldOffset(0)..FoldOffset(1),
new: FoldOffset(0)..FoldOffset(3),
old: FoldOffset(MultiBufferOffset(0))..FoldOffset(MultiBufferOffset(1)),
new: FoldOffset(MultiBufferOffset(0))..FoldOffset(MultiBufferOffset(3)),
},
FoldEdit {
old: FoldOffset(6)..FoldOffset(6),
new: FoldOffset(8)..FoldOffset(11),
old: FoldOffset(MultiBufferOffset(6))..FoldOffset(MultiBufferOffset(6)),
new: FoldOffset(MultiBufferOffset(8))..FoldOffset(MultiBufferOffset(11)),
},
]
);
@@ -1668,15 +1703,24 @@ mod tests {
let mut map = FoldMap::new(inlay_snapshot.clone()).0;
let (mut writer, _, _) = map.write(inlay_snapshot.clone(), vec![]);
writer.fold(vec![(5..8, FoldPlaceholder::test())]);
writer.fold(vec![(
MultiBufferOffset(5)..MultiBufferOffset(8),
FoldPlaceholder::test(),
)]);
let (snapshot, _) = map.read(inlay_snapshot.clone(), vec![]);
assert_eq!(snapshot.text(), "abcde⋯ijkl");
// Create an fold adjacent to the start of the first fold.
let (mut writer, _, _) = map.write(inlay_snapshot.clone(), vec![]);
writer.fold(vec![
(0..1, FoldPlaceholder::test()),
(2..5, FoldPlaceholder::test()),
(
MultiBufferOffset(0)..MultiBufferOffset(1),
FoldPlaceholder::test(),
),
(
MultiBufferOffset(2)..MultiBufferOffset(5),
FoldPlaceholder::test(),
),
]);
let (snapshot, _) = map.read(inlay_snapshot.clone(), vec![]);
assert_eq!(snapshot.text(), "⋯b⋯ijkl");
@@ -1684,8 +1728,14 @@ mod tests {
// Create an fold adjacent to the end of the first fold.
let (mut writer, _, _) = map.write(inlay_snapshot.clone(), vec![]);
writer.fold(vec![
(11..11, FoldPlaceholder::test()),
(8..10, FoldPlaceholder::test()),
(
MultiBufferOffset(11)..MultiBufferOffset(11),
FoldPlaceholder::test(),
),
(
MultiBufferOffset(8)..MultiBufferOffset(10),
FoldPlaceholder::test(),
),
]);
let (snapshot, _) = map.read(inlay_snapshot.clone(), vec![]);
assert_eq!(snapshot.text(), "⋯b⋯kl");
@@ -1697,15 +1747,25 @@ mod tests {
// Create two adjacent folds.
let (mut writer, _, _) = map.write(inlay_snapshot.clone(), vec![]);
writer.fold(vec![
(0..2, FoldPlaceholder::test()),
(2..5, FoldPlaceholder::test()),
(
MultiBufferOffset(0)..MultiBufferOffset(2),
FoldPlaceholder::test(),
),
(
MultiBufferOffset(2)..MultiBufferOffset(5),
FoldPlaceholder::test(),
),
]);
let (snapshot, _) = map.read(inlay_snapshot, vec![]);
assert_eq!(snapshot.text(), "⋯fghijkl");
// Edit within one of the folds.
let buffer_snapshot = buffer.update(cx, |buffer, cx| {
buffer.edit([(0..1, "12345")], None, cx);
buffer.edit(
[(MultiBufferOffset(0)..MultiBufferOffset(1), "12345")],
None,
cx,
);
buffer.snapshot(cx)
});
let (inlay_snapshot, inlay_edits) =
@@ -1849,7 +1909,7 @@ mod tests {
for fold_range in map.merged_folds().into_iter().rev() {
let fold_inlay_start = inlay_snapshot.to_inlay_offset(fold_range.start);
let fold_inlay_end = inlay_snapshot.to_inlay_offset(fold_range.end);
expected_text.replace_range(fold_inlay_start.0..fold_inlay_end.0, "");
expected_text.replace_range(fold_inlay_start.0.0..fold_inlay_end.0.0, "");
}
assert_eq!(snapshot.text(), expected_text);
@@ -1898,7 +1958,7 @@ mod tests {
.chars()
.count();
let mut fold_point = FoldPoint::new(0, 0);
let mut fold_offset = FoldOffset(0);
let mut fold_offset = FoldOffset(MultiBufferOffset(0));
let mut char_column = 0;
for c in expected_text.chars() {
let inlay_point = fold_point.to_inlay_point(&snapshot);
@@ -1944,18 +2004,18 @@ mod tests {
for _ in 0..5 {
let mut start = snapshot.clip_offset(
FoldOffset(rng.random_range(0..=snapshot.len().0)),
FoldOffset(rng.random_range(MultiBufferOffset(0)..=snapshot.len().0)),
Bias::Left,
);
let mut end = snapshot.clip_offset(
FoldOffset(rng.random_range(0..=snapshot.len().0)),
FoldOffset(rng.random_range(MultiBufferOffset(0)..=snapshot.len().0)),
Bias::Right,
);
if start > end {
mem::swap(&mut start, &mut end);
}
let text = &expected_text[start.0..end.0];
let text = &expected_text[start.0.0..end.0.0];
assert_eq!(
snapshot
.chunks(start..end, false, Highlights::default())
@@ -2004,9 +2064,12 @@ mod tests {
}
for _ in 0..5 {
let end =
buffer_snapshot.clip_offset(rng.random_range(0..=buffer_snapshot.len()), Right);
let start = buffer_snapshot.clip_offset(rng.random_range(0..=end), Left);
let end = buffer_snapshot.clip_offset(
rng.random_range(MultiBufferOffset(0)..=buffer_snapshot.len()),
Right,
);
let start =
buffer_snapshot.clip_offset(rng.random_range(MultiBufferOffset(0)..=end), Left);
let expected_folds = map
.snapshot
.folds
@@ -2046,7 +2109,7 @@ mod tests {
let bytes = start.to_offset(&snapshot)..end.to_offset(&snapshot);
assert_eq!(
snapshot.text_summary_for_range(lines),
TextSummary::from(&text[bytes.start.0..bytes.end.0])
MBTextSummary::from(&text[bytes.start.0.0..bytes.end.0.0])
)
}
@@ -2054,8 +2117,8 @@ mod tests {
for (snapshot, edits) in snapshot_edits.drain(..) {
let new_text = snapshot.text();
for edit in edits {
let old_bytes = edit.new.start.0..edit.new.start.0 + edit.old_len().0;
let new_bytes = edit.new.start.0..edit.new.end.0;
let old_bytes = edit.new.start.0.0..edit.new.start.0.0 + edit.old_len();
let new_bytes = edit.new.start.0.0..edit.new.end.0.0;
text.replace_range(old_bytes, &new_text[new_bytes]);
}
@@ -2126,7 +2189,7 @@ mod tests {
// Get all chunks and verify their bitmaps
let chunks = snapshot.chunks(
FoldOffset(0)..FoldOffset(snapshot.len().0),
FoldOffset(MultiBufferOffset(0))..FoldOffset(snapshot.len().0),
false,
Highlights::default(),
);
@@ -2195,7 +2258,7 @@ mod tests {
}
impl FoldMap {
fn merged_folds(&self) -> Vec<Range<usize>> {
fn merged_folds(&self) -> Vec<Range<MultiBufferOffset>> {
let inlay_snapshot = self.snapshot.inlay_snapshot.clone();
let buffer = &inlay_snapshot.buffer;
let mut folds = self.snapshot.folds.items(buffer);
@@ -2236,8 +2299,12 @@ mod tests {
let buffer = &inlay_snapshot.buffer;
let mut to_unfold = Vec::new();
for _ in 0..rng.random_range(1..=3) {
let end = buffer.clip_offset(rng.random_range(0..=buffer.len()), Right);
let start = buffer.clip_offset(rng.random_range(0..=end), Left);
let end = buffer.clip_offset(
rng.random_range(MultiBufferOffset(0)..=buffer.len()),
Right,
);
let start =
buffer.clip_offset(rng.random_range(MultiBufferOffset(0)..=end), Left);
to_unfold.push(start..end);
}
let inclusive = rng.random();
@@ -2252,8 +2319,12 @@ mod tests {
let buffer = &inlay_snapshot.buffer;
let mut to_fold = Vec::new();
for _ in 0..rng.random_range(1..=2) {
let end = buffer.clip_offset(rng.random_range(0..=buffer.len()), Right);
let start = buffer.clip_offset(rng.random_range(0..=end), Left);
let end = buffer.clip_offset(
rng.random_range(MultiBufferOffset(0)..=buffer.len()),
Right,
);
let start =
buffer.clip_offset(rng.random_range(MultiBufferOffset(0)..=end), Left);
to_fold.push((start..end, FoldPlaceholder::test()));
}
log::info!("folding {:?}", to_fold);


@@ -4,7 +4,10 @@ use crate::{
};
use collections::BTreeSet;
use language::{Chunk, Edit, Point, TextSummary};
use multi_buffer::{MultiBufferRow, MultiBufferRows, MultiBufferSnapshot, RowInfo, ToOffset};
use multi_buffer::{
MBTextSummary, MultiBufferOffset, MultiBufferRow, MultiBufferRows, MultiBufferSnapshot,
RowInfo, ToOffset,
};
use project::InlayId;
use std::{
cmp,
@@ -42,7 +45,7 @@ impl std::ops::Deref for InlaySnapshot {
#[derive(Clone, Debug)]
enum Transform {
Isomorphic(TextSummary),
Isomorphic(MBTextSummary),
Inlay(Inlay),
}
@@ -56,8 +59,8 @@ impl sum_tree::Item for Transform {
output: *summary,
},
Transform::Inlay(inlay) => TransformSummary {
input: TextSummary::default(),
output: inlay.text().summary(),
input: MBTextSummary::default(),
output: MBTextSummary::from(inlay.text().summary()),
},
}
}
@@ -65,8 +68,8 @@ impl sum_tree::Item for Transform {
#[derive(Clone, Debug, Default)]
struct TransformSummary {
input: TextSummary,
output: TextSummary,
input: MBTextSummary,
output: MBTextSummary,
}
impl sum_tree::ContextLessSummary for TransformSummary {
@@ -75,15 +78,15 @@ impl sum_tree::ContextLessSummary for TransformSummary {
}
fn add_summary(&mut self, other: &Self) {
self.input += &other.input;
self.output += &other.output;
self.input += other.input;
self.output += other.output;
}
}
pub type InlayEdit = Edit<InlayOffset>;
#[derive(Copy, Clone, Debug, Default, Eq, Ord, PartialOrd, PartialEq)]
pub struct InlayOffset(pub usize);
pub struct InlayOffset(pub MultiBufferOffset);
impl Add for InlayOffset {
type Output = Self;
@@ -94,10 +97,30 @@ impl Add for InlayOffset {
}
impl Sub for InlayOffset {
type Output = Self;
type Output = <MultiBufferOffset as Sub>::Output;
fn sub(self, rhs: Self) -> Self::Output {
Self(self.0 - rhs.0)
self.0 - rhs.0
}
}
impl<T> SubAssign<T> for InlayOffset
where
MultiBufferOffset: SubAssign<T>,
{
fn sub_assign(&mut self, rhs: T) {
self.0 -= rhs;
}
}
impl<T> Add<T> for InlayOffset
where
MultiBufferOffset: Add<T, Output = MultiBufferOffset>,
{
type Output = Self;
fn add(self, rhs: T) -> Self::Output {
Self(self.0 + rhs)
}
}
@@ -107,9 +130,12 @@ impl AddAssign for InlayOffset {
}
}
impl SubAssign for InlayOffset {
fn sub_assign(&mut self, rhs: Self) {
self.0 -= rhs.0;
impl<T> AddAssign<T> for InlayOffset
where
MultiBufferOffset: AddAssign<T>,
{
fn add_assign(&mut self, rhs: T) {
self.0 += rhs;
}
}
@@ -119,7 +145,7 @@ impl<'a> sum_tree::Dimension<'a, TransformSummary> for InlayOffset {
}
fn add_summary(&mut self, summary: &'a TransformSummary, _: ()) {
self.0 += &summary.output.len;
self.0 += summary.output.len;
}
}
@@ -152,13 +178,13 @@ impl<'a> sum_tree::Dimension<'a, TransformSummary> for InlayPoint {
}
}
impl<'a> sum_tree::Dimension<'a, TransformSummary> for usize {
impl<'a> sum_tree::Dimension<'a, TransformSummary> for MultiBufferOffset {
fn zero(_cx: ()) -> Self {
Default::default()
}
fn add_summary(&mut self, summary: &'a TransformSummary, _: ()) {
*self += &summary.input.len;
*self += summary.input.len;
}
}
@@ -181,7 +207,7 @@ pub struct InlayBufferRows<'a> {
}
pub struct InlayChunks<'a> {
transforms: Cursor<'a, 'static, Transform, Dimensions<InlayOffset, usize>>,
transforms: Cursor<'a, 'static, Transform, Dimensions<InlayOffset, MultiBufferOffset>>,
buffer_chunks: CustomHighlightsChunks<'a>,
buffer_chunk: Option<Chunk<'a>>,
inlay_chunks: Option<text::ChunkWithBitmaps<'a>>,
@@ -332,12 +358,12 @@ impl<'a> Iterator for InlayChunks<'a> {
let offset_in_inlay = self.output_offset - self.transforms.start().0;
if let Some((style, highlight)) = inlay_style_and_highlight {
let range = &highlight.range;
if offset_in_inlay.0 < range.start {
next_inlay_highlight_endpoint = range.start - offset_in_inlay.0;
} else if offset_in_inlay.0 >= range.end {
if offset_in_inlay < range.start {
next_inlay_highlight_endpoint = range.start - offset_in_inlay;
} else if offset_in_inlay >= range.end {
next_inlay_highlight_endpoint = usize::MAX;
} else {
next_inlay_highlight_endpoint = range.end - offset_in_inlay.0;
next_inlay_highlight_endpoint = range.end - offset_in_inlay;
highlight_style = highlight_style
.map(|highlight| highlight.highlight(*style))
.or_else(|| Some(*style));
@@ -350,7 +376,7 @@ impl<'a> Iterator for InlayChunks<'a> {
let start = offset_in_inlay;
let end = cmp::min(self.max_output_offset, self.transforms.end().0)
- self.transforms.start().0;
let chunks = inlay.text().chunks_in_range(start.0..end.0);
let chunks = inlay.text().chunks_in_range(start..end);
text::ChunkWithBitmaps(chunks)
});
let ChunkBitmaps {
@@ -488,7 +514,7 @@ impl InlayMap {
pub fn sync(
&mut self,
buffer_snapshot: MultiBufferSnapshot,
mut buffer_edits: Vec<text::Edit<usize>>,
mut buffer_edits: Vec<text::Edit<MultiBufferOffset>>,
) -> (InlaySnapshot, Vec<InlayEdit>) {
let snapshot = &mut self.snapshot;
@@ -519,7 +545,7 @@ impl InlayMap {
let mut new_transforms = SumTree::default();
let mut cursor = snapshot
.transforms
.cursor::<Dimensions<usize, InlayOffset>>(());
.cursor::<Dimensions<MultiBufferOffset, InlayOffset>>(());
let mut buffer_edits_iter = buffer_edits.iter().peekable();
while let Some(buffer_edit) = buffer_edits_iter.next() {
new_transforms.append(cursor.slice(&buffer_edit.old.start, Bias::Left), ());
@@ -531,11 +557,9 @@ impl InlayMap {
}
// Remove all the inlays and transforms contained by the edit.
let old_start =
cursor.start().1 + InlayOffset(buffer_edit.old.start - cursor.start().0);
let old_start = cursor.start().1 + (buffer_edit.old.start - cursor.start().0);
cursor.seek(&buffer_edit.old.end, Bias::Right);
let old_end =
cursor.start().1 + InlayOffset(buffer_edit.old.end - cursor.start().0);
let old_end = cursor.start().1 + (buffer_edit.old.end - cursor.start().0);
// Push the unchanged prefix.
let prefix_start = new_transforms.summary().input.len;
@@ -687,7 +711,10 @@ impl InlayMap {
let snapshot = &mut self.snapshot;
for i in 0..rng.random_range(1..=5) {
if self.inlays.is_empty() || rng.random() {
let position = snapshot.buffer.random_byte_range(0, rng).start;
let position = snapshot
.buffer
.random_byte_range(MultiBufferOffset(0), rng)
.start;
let bias = if rng.random() {
Bias::Left
} else {
@@ -740,9 +767,11 @@ impl InlayMap {
impl InlaySnapshot {
pub fn to_point(&self, offset: InlayOffset) -> InlayPoint {
let (start, _, item) = self
.transforms
.find::<Dimensions<InlayOffset, InlayPoint, usize>, _>((), &offset, Bias::Right);
let (start, _, item) = self.transforms.find::<Dimensions<
InlayOffset,
InlayPoint,
MultiBufferOffset,
>, _>((), &offset, Bias::Right);
let overshoot = offset.0 - start.0.0;
match item {
Some(Transform::Isomorphic(_)) => {
@@ -801,22 +830,24 @@ impl InlaySnapshot {
None => self.buffer.max_point(),
}
}
pub fn to_buffer_offset(&self, offset: InlayOffset) -> usize {
let (start, _, item) =
self.transforms
.find::<Dimensions<InlayOffset, usize>, _>((), &offset, Bias::Right);
pub fn to_buffer_offset(&self, offset: InlayOffset) -> MultiBufferOffset {
let (start, _, item) = self
.transforms
.find::<Dimensions<InlayOffset, MultiBufferOffset>, _>((), &offset, Bias::Right);
match item {
Some(Transform::Isomorphic(_)) => {
let overshoot = offset - start.0;
start.1 + overshoot.0
start.1 + overshoot
}
Some(Transform::Inlay(_)) => start.1,
None => self.buffer.len(),
}
}
pub fn to_inlay_offset(&self, offset: usize) -> InlayOffset {
let mut cursor = self.transforms.cursor::<Dimensions<usize, InlayOffset>>(());
pub fn to_inlay_offset(&self, offset: MultiBufferOffset) -> InlayOffset {
let mut cursor = self
.transforms
.cursor::<Dimensions<MultiBufferOffset, InlayOffset>>(());
cursor.seek(&offset, Bias::Left);
loop {
match cursor.item() {
@@ -973,14 +1004,16 @@ impl InlaySnapshot {
}
}
pub fn text_summary(&self) -> TextSummary {
pub fn text_summary(&self) -> MBTextSummary {
self.transforms.summary().output
}
pub fn text_summary_for_range(&self, range: Range<InlayOffset>) -> TextSummary {
let mut summary = TextSummary::default();
pub fn text_summary_for_range(&self, range: Range<InlayOffset>) -> MBTextSummary {
let mut summary = MBTextSummary::default();
let mut cursor = self.transforms.cursor::<Dimensions<InlayOffset, usize>>(());
let mut cursor = self
.transforms
.cursor::<Dimensions<InlayOffset, MultiBufferOffset>>(());
cursor.seek(&range.start, Bias::Right);
let overshoot = range.start.0 - cursor.start().0.0;
@@ -996,7 +1029,12 @@ impl InlaySnapshot {
Some(Transform::Inlay(inlay)) => {
let suffix_start = overshoot;
let suffix_end = cmp::min(cursor.end().0, range.end).0 - cursor.start().0.0;
summary = inlay.text().cursor(suffix_start).summary(suffix_end);
summary = MBTextSummary::from(
inlay
.text()
.cursor(suffix_start)
.summary::<TextSummary>(suffix_end),
);
cursor.next();
}
None => {}
@@ -1014,7 +1052,7 @@ impl InlaySnapshot {
let prefix_end = prefix_start + overshoot;
summary += self
.buffer
.text_summary_for_range::<TextSummary, _>(prefix_start..prefix_end);
.text_summary_for_range::<MBTextSummary, _>(prefix_start..prefix_end);
}
Some(Transform::Inlay(inlay)) => {
let prefix_end = overshoot;
@@ -1070,7 +1108,9 @@ impl InlaySnapshot {
language_aware: bool,
highlights: Highlights<'a>,
) -> InlayChunks<'a> {
let mut cursor = self.transforms.cursor::<Dimensions<InlayOffset, usize>>(());
let mut cursor = self
.transforms
.cursor::<Dimensions<InlayOffset, MultiBufferOffset>>(());
cursor.seek(&range.start, Bias::Right);
let buffer_range = self.to_buffer_offset(range.start)..self.to_buffer_offset(range.end);
@@ -1122,8 +1162,8 @@ impl InlaySnapshot {
}
}
fn push_isomorphic(sum_tree: &mut SumTree<Transform>, summary: TextSummary) {
if summary.len == 0 {
fn push_isomorphic(sum_tree: &mut SumTree<Transform>, summary: MBTextSummary) {
if summary.len == MultiBufferOffset(0) {
return;
}
@@ -1279,7 +1319,10 @@ mod tests {
&[],
vec![Inlay::mock_hint(
post_inc(&mut next_inlay_id),
buffer.read(cx).snapshot(cx).anchor_after(3),
buffer
.read(cx)
.snapshot(cx)
.anchor_after(MultiBufferOffset(3)),
"|123|",
)],
);
@@ -1335,7 +1378,15 @@ mod tests {
// Edits before or after the inlay should not affect it.
buffer.update(cx, |buffer, cx| {
buffer.edit([(2..3, "x"), (3..3, "y"), (4..4, "z")], None, cx)
buffer.edit(
[
(MultiBufferOffset(2)..MultiBufferOffset(3), "x"),
(MultiBufferOffset(3)..MultiBufferOffset(3), "y"),
(MultiBufferOffset(4)..MultiBufferOffset(4), "z"),
],
None,
cx,
)
});
let (inlay_snapshot, _) = inlay_map.sync(
buffer.read(cx).snapshot(cx),
@@ -1344,7 +1395,13 @@ mod tests {
assert_eq!(inlay_snapshot.text(), "abxy|123|dzefghi");
// An edit surrounding the inlay should invalidate it.
buffer.update(cx, |buffer, cx| buffer.edit([(4..5, "D")], None, cx));
buffer.update(cx, |buffer, cx| {
buffer.edit(
[(MultiBufferOffset(4)..MultiBufferOffset(5), "D")],
None,
cx,
)
});
let (inlay_snapshot, _) = inlay_map.sync(
buffer.read(cx).snapshot(cx),
buffer_edits.consume().into_inner(),
@@ -1356,12 +1413,18 @@ mod tests {
vec![
Inlay::mock_hint(
post_inc(&mut next_inlay_id),
buffer.read(cx).snapshot(cx).anchor_before(3),
buffer
.read(cx)
.snapshot(cx)
.anchor_before(MultiBufferOffset(3)),
"|123|",
),
Inlay::edit_prediction(
post_inc(&mut next_inlay_id),
buffer.read(cx).snapshot(cx).anchor_after(3),
buffer
.read(cx)
.snapshot(cx)
.anchor_after(MultiBufferOffset(3)),
"|456|",
),
],
@@ -1369,7 +1432,13 @@ mod tests {
assert_eq!(inlay_snapshot.text(), "abx|123||456|yDzefghi");
// Edits ending where the inlay starts should not move it if it has a left bias.
buffer.update(cx, |buffer, cx| buffer.edit([(3..3, "JKL")], None, cx));
buffer.update(cx, |buffer, cx| {
buffer.edit(
[(MultiBufferOffset(3)..MultiBufferOffset(3), "JKL")],
None,
cx,
)
});
let (inlay_snapshot, _) = inlay_map.sync(
buffer.read(cx).snapshot(cx),
buffer_edits.consume().into_inner(),
@@ -1571,17 +1640,26 @@ mod tests {
vec![
Inlay::mock_hint(
post_inc(&mut next_inlay_id),
buffer.read(cx).snapshot(cx).anchor_before(0),
buffer
.read(cx)
.snapshot(cx)
.anchor_before(MultiBufferOffset(0)),
"|123|\n",
),
Inlay::mock_hint(
post_inc(&mut next_inlay_id),
buffer.read(cx).snapshot(cx).anchor_before(4),
buffer
.read(cx)
.snapshot(cx)
.anchor_before(MultiBufferOffset(4)),
"|456|",
),
Inlay::edit_prediction(
post_inc(&mut next_inlay_id),
buffer.read(cx).snapshot(cx).anchor_before(7),
buffer
.read(cx)
.snapshot(cx)
.anchor_before(MultiBufferOffset(7)),
"\n|567|\n",
),
],
@@ -1658,7 +1736,7 @@ mod tests {
.collect::<Vec<_>>();
let mut expected_text = Rope::from(&buffer_snapshot.text());
for (offset, inlay) in inlays.iter().rev() {
expected_text.replace(*offset..*offset, &inlay.text().to_string());
expected_text.replace(offset.0..offset.0, &inlay.text().to_string());
}
assert_eq!(inlay_snapshot.text(), expected_text.to_string());
@@ -1681,7 +1759,7 @@ mod tests {
let mut text_highlights = TextHighlights::default();
let text_highlight_count = rng.random_range(0_usize..10);
let mut text_highlight_ranges = (0..text_highlight_count)
.map(|_| buffer_snapshot.random_byte_range(0, &mut rng))
.map(|_| buffer_snapshot.random_byte_range(MultiBufferOffset(0), &mut rng))
.collect::<Vec<_>>();
text_highlight_ranges.sort_by_key(|range| (range.start, Reverse(range.end)));
log::info!("highlighting text ranges {text_highlight_ranges:?}");
@@ -1744,12 +1822,13 @@ mod tests {
}
for _ in 0..5 {
let mut end = rng.random_range(0..=inlay_snapshot.len().0);
let mut end = rng.random_range(0..=inlay_snapshot.len().0.0);
end = expected_text.clip_offset(end, Bias::Right);
let mut start = rng.random_range(0..=end);
start = expected_text.clip_offset(start, Bias::Right);
let range = InlayOffset(start)..InlayOffset(end);
let range =
InlayOffset(MultiBufferOffset(start))..InlayOffset(MultiBufferOffset(end));
log::info!("calling inlay_snapshot.chunks({range:?})");
let actual_text = inlay_snapshot
.chunks(
@@ -1771,25 +1850,27 @@ mod tests {
);
assert_eq!(
inlay_snapshot.text_summary_for_range(InlayOffset(start)..InlayOffset(end)),
expected_text.slice(start..end).summary()
inlay_snapshot.text_summary_for_range(
InlayOffset(MultiBufferOffset(start))..InlayOffset(MultiBufferOffset(end))
),
MBTextSummary::from(expected_text.slice(start..end).summary())
);
}
for edit in inlay_edits {
prev_inlay_text.replace_range(
edit.new.start.0..edit.new.start.0 + edit.old_len().0,
&inlay_snapshot.text()[edit.new.start.0..edit.new.end.0],
edit.new.start.0.0..edit.new.start.0.0 + edit.old_len(),
&inlay_snapshot.text()[edit.new.start.0.0..edit.new.end.0.0],
);
}
assert_eq!(prev_inlay_text, inlay_snapshot.text());
assert_eq!(expected_text.max_point(), inlay_snapshot.max_point().0);
assert_eq!(expected_text.len(), inlay_snapshot.len().0);
assert_eq!(expected_text.len(), inlay_snapshot.len().0.0);
let mut buffer_point = Point::default();
let mut inlay_point = inlay_snapshot.to_inlay_point(buffer_point);
let mut buffer_chars = buffer_snapshot.chars_at(0);
let mut buffer_chars = buffer_snapshot.chars_at(MultiBufferOffset(0));
loop {
// Ensure conversion from buffer coordinates to inlay coordinates
// is consistent.
@@ -1930,7 +2011,7 @@ mod tests {
// Get all chunks and verify their bitmaps
let chunks = snapshot.chunks(
InlayOffset(0)..InlayOffset(snapshot.len().0),
InlayOffset(MultiBufferOffset(0))..snapshot.len(),
false,
Highlights::default(),
);
@@ -2064,7 +2145,7 @@ mod tests {
// Collect chunks - this previously would panic
let chunks: Vec<_> = inlay_snapshot
.chunks(
InlayOffset(0)..InlayOffset(inlay_snapshot.len().0),
InlayOffset(MultiBufferOffset(0))..inlay_snapshot.len(),
false,
highlights,
)
@@ -2178,7 +2259,7 @@ mod tests {
let chunks: Vec<_> = inlay_snapshot
.chunks(
InlayOffset(0)..InlayOffset(inlay_snapshot.len().0),
InlayOffset(MultiBufferOffset(0))..inlay_snapshot.len(),
false,
highlights,
)


@@ -648,6 +648,7 @@ mod tests {
inlay_map::InlayMap,
},
};
use multi_buffer::MultiBufferOffset;
use rand::{Rng, prelude::StdRng};
use util;
@@ -1156,7 +1157,7 @@ mod tests {
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let chunks = fold_snapshot.chunks(
FoldOffset(0)..fold_snapshot.len(),
FoldOffset(MultiBufferOffset(0))..fold_snapshot.len(),
false,
Default::default(),
);
@@ -1318,7 +1319,7 @@ mod tests {
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let chunks = fold_snapshot.chunks(
FoldOffset(0)..fold_snapshot.len(),
FoldOffset(MultiBufferOffset(0))..fold_snapshot.len(),
false,
Default::default(),
);

File diff suppressed because it is too large.


@@ -32,9 +32,10 @@ use language::{
tree_sitter_python,
};
use language_settings::Formatter;
use languages::markdown_lang;
use languages::rust_lang;
use lsp::CompletionParams;
use multi_buffer::{IndentGuide, PathKey};
use multi_buffer::{IndentGuide, MultiBufferOffset, MultiBufferOffsetUtf16, PathKey};
use parking_lot::Mutex;
use pretty_assertions::{assert_eq, assert_ne};
use project::{
@@ -196,7 +197,7 @@ fn test_edit_events(cx: &mut TestAppContext) {
// No event is emitted when the mutation is a no-op.
_ = editor2.update(cx, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([0..0])
s.select_ranges([MultiBufferOffset(0)..MultiBufferOffset(0)])
});
editor.backspace(&Backspace, window, cx);
@@ -221,7 +222,7 @@ fn test_undo_redo_with_selection_restoration(cx: &mut TestAppContext) {
_ = editor.update(cx, |editor, window, cx| {
editor.start_transaction_at(now, window, cx);
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([2..4])
s.select_ranges([MultiBufferOffset(2)..MultiBufferOffset(4)])
});
editor.insert("cd", window, cx);
@@ -229,38 +230,46 @@ fn test_undo_redo_with_selection_restoration(cx: &mut TestAppContext) {
assert_eq!(editor.text(cx), "12cd56");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![4..4]
vec![MultiBufferOffset(4)..MultiBufferOffset(4)]
);
editor.start_transaction_at(now, window, cx);
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([4..5])
s.select_ranges([MultiBufferOffset(4)..MultiBufferOffset(5)])
});
editor.insert("e", window, cx);
editor.end_transaction_at(now, cx);
assert_eq!(editor.text(cx), "12cde6");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![5..5]
vec![MultiBufferOffset(5)..MultiBufferOffset(5)]
);
now += group_interval + Duration::from_millis(1);
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([2..2])
s.select_ranges([MultiBufferOffset(2)..MultiBufferOffset(2)])
});
// Simulate an edit in another editor
buffer.update(cx, |buffer, cx| {
buffer.start_transaction_at(now, cx);
buffer.edit([(0..1, "a")], None, cx);
buffer.edit([(1..1, "b")], None, cx);
buffer.edit(
[(MultiBufferOffset(0)..MultiBufferOffset(1), "a")],
None,
cx,
);
buffer.edit(
[(MultiBufferOffset(1)..MultiBufferOffset(1), "b")],
None,
cx,
);
buffer.end_transaction_at(now, cx);
});
assert_eq!(editor.text(cx), "ab2cde6");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![3..3]
vec![MultiBufferOffset(3)..MultiBufferOffset(3)]
);
// Last transaction happened past the group interval in a different editor.
@@ -269,7 +278,7 @@ fn test_undo_redo_with_selection_restoration(cx: &mut TestAppContext) {
assert_eq!(editor.text(cx), "12cde6");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![2..2]
vec![MultiBufferOffset(2)..MultiBufferOffset(2)]
);
// First two transactions happened within the group interval in this editor.
@@ -279,7 +288,7 @@ fn test_undo_redo_with_selection_restoration(cx: &mut TestAppContext) {
assert_eq!(editor.text(cx), "123456");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![0..0]
vec![MultiBufferOffset(0)..MultiBufferOffset(0)]
);
// Redo the first two transactions together.
@@ -287,7 +296,7 @@ fn test_undo_redo_with_selection_restoration(cx: &mut TestAppContext) {
assert_eq!(editor.text(cx), "12cde6");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![5..5]
vec![MultiBufferOffset(5)..MultiBufferOffset(5)]
);
// Redo the last transaction on its own.
@@ -295,7 +304,7 @@ fn test_undo_redo_with_selection_restoration(cx: &mut TestAppContext) {
assert_eq!(editor.text(cx), "ab2cde6");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
vec![6..6]
vec![MultiBufferOffset(6)..MultiBufferOffset(6)]
);
// Test empty transactions.
@@ -328,7 +337,9 @@ fn test_ime_composition(cx: &mut TestAppContext) {
assert_eq!(editor.text(cx), "äbcde");
assert_eq!(
editor.marked_text_ranges(cx),
Some(vec![OffsetUtf16(0)..OffsetUtf16(1)])
Some(vec![
MultiBufferOffsetUtf16(OffsetUtf16(0))..MultiBufferOffsetUtf16(OffsetUtf16(1))
])
);
// Finalize IME composition.
@@ -348,7 +359,9 @@ fn test_ime_composition(cx: &mut TestAppContext) {
editor.replace_and_mark_text_in_range(Some(0..1), "à", None, window, cx);
assert_eq!(
editor.marked_text_ranges(cx),
Some(vec![OffsetUtf16(0)..OffsetUtf16(1)])
Some(vec![
MultiBufferOffsetUtf16(OffsetUtf16(0))..MultiBufferOffsetUtf16(OffsetUtf16(1))
])
);
// Undoing during an IME composition cancels it.
@@ -361,7 +374,9 @@ fn test_ime_composition(cx: &mut TestAppContext) {
assert_eq!(editor.text(cx), "ābcdè");
assert_eq!(
editor.marked_text_ranges(cx),
Some(vec![OffsetUtf16(4)..OffsetUtf16(5)])
Some(vec![
MultiBufferOffsetUtf16(OffsetUtf16(4))..MultiBufferOffsetUtf16(OffsetUtf16(5))
])
);
// Finalize IME composition with an invalid replacement range, ensuring it gets clipped.
@@ -372,9 +387,9 @@ fn test_ime_composition(cx: &mut TestAppContext) {
// Start a new IME composition with multiple cursors.
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([
OffsetUtf16(1)..OffsetUtf16(1),
OffsetUtf16(3)..OffsetUtf16(3),
OffsetUtf16(5)..OffsetUtf16(5),
MultiBufferOffsetUtf16(OffsetUtf16(1))..MultiBufferOffsetUtf16(OffsetUtf16(1)),
MultiBufferOffsetUtf16(OffsetUtf16(3))..MultiBufferOffsetUtf16(OffsetUtf16(3)),
MultiBufferOffsetUtf16(OffsetUtf16(5))..MultiBufferOffsetUtf16(OffsetUtf16(5)),
])
});
editor.replace_and_mark_text_in_range(Some(4..5), "XYZ", None, window, cx);
@@ -382,9 +397,9 @@ fn test_ime_composition(cx: &mut TestAppContext) {
assert_eq!(
editor.marked_text_ranges(cx),
Some(vec![
OffsetUtf16(0)..OffsetUtf16(3),
OffsetUtf16(4)..OffsetUtf16(7),
OffsetUtf16(8)..OffsetUtf16(11)
MultiBufferOffsetUtf16(OffsetUtf16(0))..MultiBufferOffsetUtf16(OffsetUtf16(3)),
MultiBufferOffsetUtf16(OffsetUtf16(4))..MultiBufferOffsetUtf16(OffsetUtf16(7)),
MultiBufferOffsetUtf16(OffsetUtf16(8))..MultiBufferOffsetUtf16(OffsetUtf16(11))
])
);
@@ -394,9 +409,9 @@ fn test_ime_composition(cx: &mut TestAppContext) {
assert_eq!(
editor.marked_text_ranges(cx),
Some(vec![
OffsetUtf16(1)..OffsetUtf16(2),
OffsetUtf16(5)..OffsetUtf16(6),
OffsetUtf16(9)..OffsetUtf16(10)
MultiBufferOffsetUtf16(OffsetUtf16(1))..MultiBufferOffsetUtf16(OffsetUtf16(2)),
MultiBufferOffsetUtf16(OffsetUtf16(5))..MultiBufferOffsetUtf16(OffsetUtf16(6)),
MultiBufferOffsetUtf16(OffsetUtf16(9))..MultiBufferOffsetUtf16(OffsetUtf16(10))
])
);
@@ -756,7 +771,11 @@ fn test_clone(cx: &mut TestAppContext) {
_ = editor.update(cx, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges(selection_ranges.clone())
s.select_ranges(
selection_ranges
.iter()
.map(|range| MultiBufferOffset(range.start)..MultiBufferOffset(range.end)),
)
});
editor.fold_creases(
vec![
@@ -793,9 +812,11 @@ fn test_clone(cx: &mut TestAppContext) {
);
assert_eq!(
cloned_snapshot
.folds_in_range(0..text.len())
.folds_in_range(MultiBufferOffset(0)..MultiBufferOffset(text.len()))
.collect::<Vec<_>>(),
snapshot
.folds_in_range(MultiBufferOffset(0)..MultiBufferOffset(text.len()))
.collect::<Vec<_>>(),
snapshot.folds_in_range(0..text.len()).collect::<Vec<_>>(),
);
assert_set_eq!(
cloned_editor
@@ -1417,7 +1438,11 @@ fn test_fold_at_level(cx: &mut TestAppContext) {
);
editor.change_selections(SelectionEffects::default(), window, cx, |s| {
s.select_ranges(positions)
s.select_ranges(
positions
.iter()
.map(|range| MultiBufferOffset(range.start)..MultiBufferOffset(range.end)),
)
});
editor.fold_at_level(&FoldAtLevel(2), window, cx);
@@ -3699,7 +3724,11 @@ fn test_insert_with_old_selections(cx: &mut TestAppContext) {
let buffer = MultiBuffer::build_simple("a( X ), b( Y ), c( Z )", cx);
let mut editor = build_editor(buffer, window, cx);
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([3..4, 11..12, 19..20])
s.select_ranges([
MultiBufferOffset(3)..MultiBufferOffset(4),
MultiBufferOffset(11)..MultiBufferOffset(12),
MultiBufferOffset(19)..MultiBufferOffset(20),
])
});
editor
});
@@ -3707,12 +3736,24 @@ fn test_insert_with_old_selections(cx: &mut TestAppContext) {
_ = editor.update(cx, |editor, window, cx| {
// Edit the buffer directly, deleting ranges surrounding the editor's selections
editor.buffer.update(cx, |buffer, cx| {
buffer.edit([(2..5, ""), (10..13, ""), (18..21, "")], None, cx);
buffer.edit(
[
(MultiBufferOffset(2)..MultiBufferOffset(5), ""),
(MultiBufferOffset(10)..MultiBufferOffset(13), ""),
(MultiBufferOffset(18)..MultiBufferOffset(21), ""),
],
None,
cx,
);
assert_eq!(buffer.read(cx).text(), "a(), b(), c()".unindent());
});
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
&[2..2, 7..7, 12..12],
&[
MultiBufferOffset(2)..MultiBufferOffset(2),
MultiBufferOffset(7)..MultiBufferOffset(7),
MultiBufferOffset(12)..MultiBufferOffset(12)
],
);
editor.insert("Z", window, cx);
@@ -3721,7 +3762,11 @@ fn test_insert_with_old_selections(cx: &mut TestAppContext) {
// The selections are moved after the inserted characters
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
&[3..3, 9..9, 15..15],
&[
MultiBufferOffset(3)..MultiBufferOffset(3),
MultiBufferOffset(9)..MultiBufferOffset(9),
MultiBufferOffset(15)..MultiBufferOffset(15)
],
);
});
}
@@ -4691,7 +4736,7 @@ async fn test_custom_newlines_cause_no_false_positive_diffs(
assert_eq!(
snapshot
.buffer_snapshot()
.diff_hunks_in_range(0..snapshot.buffer_snapshot().len())
.diff_hunks_in_range(MultiBufferOffset(0)..snapshot.buffer_snapshot().len())
.collect::<Vec<_>>(),
Vec::new(),
"Should not have any diffs for files with custom newlines"
@@ -5963,27 +6008,27 @@ fn test_transpose(cx: &mut TestAppContext) {
let mut editor = build_editor(MultiBuffer::build_simple("abc", cx), window, cx);
editor.set_style(EditorStyle::default(), window, cx);
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([1..1])
s.select_ranges([MultiBufferOffset(1)..MultiBufferOffset(1)])
});
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "bac");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[2..2]
[MultiBufferOffset(2)..MultiBufferOffset(2)]
);
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "bca");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[3..3]
[MultiBufferOffset(3)..MultiBufferOffset(3)]
);
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "bac");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[3..3]
[MultiBufferOffset(3)..MultiBufferOffset(3)]
);
editor
@@ -5993,37 +6038,37 @@ fn test_transpose(cx: &mut TestAppContext) {
let mut editor = build_editor(MultiBuffer::build_simple("abc\nde", cx), window, cx);
editor.set_style(EditorStyle::default(), window, cx);
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([3..3])
s.select_ranges([MultiBufferOffset(3)..MultiBufferOffset(3)])
});
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "acb\nde");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[3..3]
[MultiBufferOffset(3)..MultiBufferOffset(3)]
);
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([4..4])
s.select_ranges([MultiBufferOffset(4)..MultiBufferOffset(4)])
});
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "acbd\ne");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[5..5]
[MultiBufferOffset(5)..MultiBufferOffset(5)]
);
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "acbde\n");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[6..6]
[MultiBufferOffset(6)..MultiBufferOffset(6)]
);
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "acbd\ne");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[6..6]
[MultiBufferOffset(6)..MultiBufferOffset(6)]
);
editor
@@ -6033,41 +6078,62 @@ fn test_transpose(cx: &mut TestAppContext) {
let mut editor = build_editor(MultiBuffer::build_simple("abc\nde", cx), window, cx);
editor.set_style(EditorStyle::default(), window, cx);
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([1..1, 2..2, 4..4])
s.select_ranges([
MultiBufferOffset(1)..MultiBufferOffset(1),
MultiBufferOffset(2)..MultiBufferOffset(2),
MultiBufferOffset(4)..MultiBufferOffset(4),
])
});
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "bacd\ne");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[2..2, 3..3, 5..5]
[
MultiBufferOffset(2)..MultiBufferOffset(2),
MultiBufferOffset(3)..MultiBufferOffset(3),
MultiBufferOffset(5)..MultiBufferOffset(5)
]
);
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "bcade\n");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[3..3, 4..4, 6..6]
[
MultiBufferOffset(3)..MultiBufferOffset(3),
MultiBufferOffset(4)..MultiBufferOffset(4),
MultiBufferOffset(6)..MultiBufferOffset(6)
]
);
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "bcda\ne");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[4..4, 6..6]
[
MultiBufferOffset(4)..MultiBufferOffset(4),
MultiBufferOffset(6)..MultiBufferOffset(6)
]
);
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "bcade\n");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[4..4, 6..6]
[
MultiBufferOffset(4)..MultiBufferOffset(4),
MultiBufferOffset(6)..MultiBufferOffset(6)
]
);
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "bcaed\n");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[5..5, 6..6]
[
MultiBufferOffset(5)..MultiBufferOffset(5),
MultiBufferOffset(6)..MultiBufferOffset(6)
]
);
editor
@@ -6077,27 +6143,27 @@ fn test_transpose(cx: &mut TestAppContext) {
let mut editor = build_editor(MultiBuffer::build_simple("🍐🏀✋", cx), window, cx);
editor.set_style(EditorStyle::default(), window, cx);
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([4..4])
s.select_ranges([MultiBufferOffset(4)..MultiBufferOffset(4)])
});
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "🏀🍐✋");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[8..8]
[MultiBufferOffset(8)..MultiBufferOffset(8)]
);
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "🏀✋🍐");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[11..11]
[MultiBufferOffset(11)..MultiBufferOffset(11)]
);
editor.transpose(&Default::default(), window, cx);
assert_eq!(editor.text(cx), "🏀🍐✋");
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
[11..11]
[MultiBufferOffset(11)..MultiBufferOffset(11)]
);
editor
@@ -9730,7 +9796,11 @@ async fn test_autoindent(cx: &mut TestAppContext) {
editor.update_in(cx, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([5..5, 8..8, 9..9])
s.select_ranges([
MultiBufferOffset(5)..MultiBufferOffset(5),
MultiBufferOffset(8)..MultiBufferOffset(8),
MultiBufferOffset(9)..MultiBufferOffset(9),
])
});
editor.newline(&Newline, window, cx);
assert_eq!(editor.text(cx), "fn a(\n \n) {\n \n}\n");
@@ -9795,7 +9865,11 @@ async fn test_autoindent_disabled(cx: &mut TestAppContext) {
editor.update_in(cx, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([5..5, 8..8, 9..9])
s.select_ranges([
MultiBufferOffset(5)..MultiBufferOffset(5),
MultiBufferOffset(8)..MultiBufferOffset(8),
MultiBufferOffset(9)..MultiBufferOffset(9),
])
});
editor.newline(&Newline, window, cx);
assert_eq!(
@@ -10452,7 +10526,7 @@ async fn test_autoclose_with_embedded_language(cx: &mut TestAppContext) {
let snapshot = editor.snapshot(window, cx);
let cursors = editor
.selections
.ranges::<usize>(&editor.display_snapshot(cx));
.ranges::<MultiBufferOffset>(&editor.display_snapshot(cx));
let languages = cursors
.iter()
.map(|c| snapshot.language_at(c.start).unwrap().name())
@@ -11142,17 +11216,26 @@ async fn test_snippet_placeholder_choices(cx: &mut TestAppContext) {
let snippet = Snippet::parse("type ${1|,i32,u32|} = $2").unwrap();
editor
.insert_snippet(&insertion_ranges, snippet, window, cx)
.insert_snippet(
&insertion_ranges
.iter()
.map(|range| MultiBufferOffset(range.start)..MultiBufferOffset(range.end))
.collect::<Vec<_>>(),
snippet,
window,
cx,
)
.unwrap();
fn assert(editor: &mut Editor, cx: &mut Context<Editor>, marked_text: &str) {
let (expected_text, selection_ranges) = marked_text_ranges(marked_text, false);
assert_eq!(editor.text(cx), expected_text);
assert_eq!(
editor
.selections
.ranges::<usize>(&editor.display_snapshot(cx)),
editor.selections.ranges(&editor.display_snapshot(cx)),
selection_ranges
.iter()
.map(|range| MultiBufferOffset(range.start)..MultiBufferOffset(range.end))
.collect::<Vec<_>>()
);
}
@@ -11176,10 +11259,11 @@ async fn test_snippet_tabstop_navigation_with_placeholders(cx: &mut TestAppConte
let (expected_text, selection_ranges) = marked_text_ranges(marked_text, false);
assert_eq!(editor.text(cx), expected_text);
assert_eq!(
editor
.selections
.ranges::<usize>(&editor.display_snapshot(cx)),
editor.selections.ranges(&editor.display_snapshot(cx)),
selection_ranges
.iter()
.map(|range| MultiBufferOffset(range.start)..MultiBufferOffset(range.end))
.collect::<Vec<_>>()
);
}
@@ -11197,7 +11281,15 @@ async fn test_snippet_tabstop_navigation_with_placeholders(cx: &mut TestAppConte
let snippet = Snippet::parse("type ${1|,i32,u32|} = $2; $3").unwrap();
editor
.insert_snippet(&insertion_ranges, snippet, window, cx)
.insert_snippet(
&insertion_ranges
.iter()
.map(|range| MultiBufferOffset(range.start)..MultiBufferOffset(range.end))
.collect::<Vec<_>>(),
snippet,
window,
cx,
)
.unwrap();
assert_state(
@@ -11645,7 +11737,7 @@ async fn test_redo_after_noop_format(cx: &mut TestAppContext) {
});
editor.update_in(cx, |editor, window, cx| {
editor.change_selections(SelectionEffects::default(), window, cx, |s| {
s.select_ranges([0..0])
s.select_ranges([MultiBufferOffset(0)..MultiBufferOffset(0)])
});
});
assert!(!cx.read(|cx| editor.is_dirty(cx)));
@@ -11811,7 +11903,7 @@ async fn test_multibuffer_format_during_save(cx: &mut TestAppContext) {
SelectionEffects::scroll(Autoscroll::Next),
window,
cx,
|s| s.select_ranges(Some(1..2)),
|s| s.select_ranges(Some(MultiBufferOffset(1)..MultiBufferOffset(2))),
);
editor.insert("|one|two|three|", window, cx);
});
@@ -11821,7 +11913,7 @@ async fn test_multibuffer_format_during_save(cx: &mut TestAppContext) {
SelectionEffects::scroll(Autoscroll::Next),
window,
cx,
|s| s.select_ranges(Some(60..70)),
|s| s.select_ranges(Some(MultiBufferOffset(60)..MultiBufferOffset(70))),
);
editor.insert("|four|five|six|", window, cx);
});
@@ -11989,7 +12081,7 @@ async fn test_autosave_with_dirty_buffers(cx: &mut TestAppContext) {
SelectionEffects::scroll(Autoscroll::Next),
window,
cx,
|s| s.select_ranges(Some(10..10)),
|s| s.select_ranges(Some(MultiBufferOffset(10)..MultiBufferOffset(10))),
);
editor.insert("// edited", window, cx);
});
@@ -13427,7 +13519,7 @@ async fn test_signature_help(cx: &mut TestAppContext) {
cx.update_editor(|editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([0..0])
s.select_ranges([MultiBufferOffset(0)..MultiBufferOffset(0)])
});
});
@@ -16412,7 +16504,11 @@ fn test_editing_overlapping_excerpts(cx: &mut TestAppContext) {
);
assert_eq!(editor.text(cx), expected_text);
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges(selection_ranges)
s.select_ranges(
selection_ranges
.iter()
.map(|range| MultiBufferOffset(range.start)..MultiBufferOffset(range.end)),
)
});
editor.handle_input("X", window, cx);
@@ -16430,6 +16526,9 @@ fn test_editing_overlapping_excerpts(cx: &mut TestAppContext) {
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
expected_selections
.iter()
.map(|range| MultiBufferOffset(range.start)..MultiBufferOffset(range.end))
.collect::<Vec<_>>()
);
editor.newline(&Newline, window, cx);
@@ -16450,6 +16549,9 @@ fn test_editing_overlapping_excerpts(cx: &mut TestAppContext) {
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
expected_selections
.iter()
.map(|range| MultiBufferOffset(range.start)..MultiBufferOffset(range.end))
.collect::<Vec<_>>()
);
});
}
@@ -16825,7 +16927,7 @@ async fn test_following(cx: &mut TestAppContext) {
// Update the selections only
_ = leader.update(cx, |leader, window, cx| {
leader.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([1..1])
s.select_ranges([MultiBufferOffset(1)..MultiBufferOffset(1)])
});
});
follower
@@ -16843,7 +16945,7 @@ async fn test_following(cx: &mut TestAppContext) {
_ = follower.update(cx, |follower, _, cx| {
assert_eq!(
follower.selections.ranges(&follower.display_snapshot(cx)),
vec![1..1]
vec![MultiBufferOffset(1)..MultiBufferOffset(1)]
);
});
assert!(*is_still_following.borrow());
@@ -16878,7 +16980,7 @@ async fn test_following(cx: &mut TestAppContext) {
// via autoscroll, not via the leader's exact scroll position.
_ = leader.update(cx, |leader, window, cx| {
leader.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([0..0])
s.select_ranges([MultiBufferOffset(0)..MultiBufferOffset(0)])
});
leader.request_autoscroll(Autoscroll::newest(), cx);
leader.set_scroll_position(gpui::Point::new(1.5, 3.5), window, cx);
@@ -16899,7 +17001,7 @@ async fn test_following(cx: &mut TestAppContext) {
assert_eq!(follower.scroll_position(cx), gpui::Point::new(1.5, 0.0));
assert_eq!(
follower.selections.ranges(&follower.display_snapshot(cx)),
vec![0..0]
vec![MultiBufferOffset(0)..MultiBufferOffset(0)]
);
});
assert!(*is_still_following.borrow());
@@ -16907,7 +17009,7 @@ async fn test_following(cx: &mut TestAppContext) {
// Creating a pending selection that precedes another selection
_ = leader.update(cx, |leader, window, cx| {
leader.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([1..1])
s.select_ranges([MultiBufferOffset(1)..MultiBufferOffset(1)])
});
leader.begin_selection(DisplayPoint::new(DisplayRow(0), 0), true, 1, window, cx);
});
@@ -16926,7 +17028,10 @@ async fn test_following(cx: &mut TestAppContext) {
_ = follower.update(cx, |follower, _, cx| {
assert_eq!(
follower.selections.ranges(&follower.display_snapshot(cx)),
vec![0..0, 1..1]
vec![
MultiBufferOffset(0)..MultiBufferOffset(0),
MultiBufferOffset(1)..MultiBufferOffset(1)
]
);
});
assert!(*is_still_following.borrow());
@@ -16950,13 +17055,17 @@ async fn test_following(cx: &mut TestAppContext) {
_ = follower.update(cx, |follower, _, cx| {
assert_eq!(
follower.selections.ranges(&follower.display_snapshot(cx)),
vec![0..2]
vec![MultiBufferOffset(0)..MultiBufferOffset(2)]
);
});
// Scrolling locally breaks the follow
_ = follower.update(cx, |follower, window, cx| {
let top_anchor = follower.buffer().read(cx).read(cx).anchor_after(0);
let top_anchor = follower
.buffer()
.read(cx)
.read(cx)
.anchor_after(MultiBufferOffset(0));
follower.set_scroll_anchor(
ScrollAnchor {
anchor: top_anchor,
@@ -17503,6 +17612,89 @@ async fn test_move_to_enclosing_bracket(cx: &mut TestAppContext) {
);
}
#[gpui::test]
async fn test_move_to_enclosing_bracket_in_markdown_code_block(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let language_registry = Arc::new(language::LanguageRegistry::test(cx.executor()));
language_registry.add(markdown_lang());
language_registry.add(rust_lang());
let buffer = cx.new(|cx| {
let mut buffer = language::Buffer::local(
indoc! {"
```rs
impl Worktree {
pub async fn open_buffers(&self, path: &Path) -> impl Iterator<&Buffer> {
}
}
```
"},
cx,
);
buffer.set_language_registry(language_registry.clone());
buffer.set_language(Some(markdown_lang()), cx);
buffer
});
let buffer = cx.new(|cx| MultiBuffer::singleton(buffer, cx));
let editor = cx.add_window(|window, cx| build_editor(buffer.clone(), window, cx));
cx.executor().run_until_parked();
_ = editor.update(cx, |editor, window, cx| {
// Case 1: Test outer enclosing brackets
select_ranges(
editor,
&indoc! {"
```rs
impl Worktree {
pub async fn open_buffers(&self, path: &Path) -> impl Iterator<&Buffer> {
}
```
"},
window,
cx,
);
editor.move_to_enclosing_bracket(&MoveToEnclosingBracket, window, cx);
assert_text_with_selections(
editor,
&indoc! {"
```rs
impl Worktree ˇ{
pub async fn open_buffers(&self, path: &Path) -> impl Iterator<&Buffer> {
}
}
```
"},
cx,
);
// Case 2: Test inner enclosing brackets
select_ranges(
editor,
&indoc! {"
```rs
impl Worktree {
pub async fn open_buffers(&self, path: &Path) -> impl Iterator<&Buffer> {
}
```
"},
window,
cx,
);
editor.move_to_enclosing_bracket(&MoveToEnclosingBracket, window, cx);
assert_text_with_selections(
editor,
&indoc! {"
```rs
impl Worktree {
pub async fn open_buffers(&self, path: &Path) -> impl Iterator<&Buffer> ˇ{
}
}
```
"},
cx,
);
});
}
#[gpui::test]
async fn test_on_type_formatting_not_triggered(cx: &mut TestAppContext) {
init_test(cx, |_| {});
@@ -19367,7 +19559,7 @@ async fn test_multibuffer_in_navigation_history(cx: &mut TestAppContext) {
SelectionEffects::scroll(Autoscroll::Next),
window,
cx,
|s| s.select_ranges(Some(1..2)),
|s| s.select_ranges(Some(MultiBufferOffset(1)..MultiBufferOffset(2))),
);
editor.open_excerpts(&OpenExcerpts, window, cx);
});
@@ -19423,7 +19615,7 @@ async fn test_multibuffer_in_navigation_history(cx: &mut TestAppContext) {
SelectionEffects::scroll(Autoscroll::Next),
window,
cx,
|s| s.select_ranges(Some(39..40)),
|s| s.select_ranges(Some(MultiBufferOffset(39)..MultiBufferOffset(40))),
);
editor.open_excerpts(&OpenExcerpts, window, cx);
});
@@ -19483,7 +19675,7 @@ async fn test_multibuffer_in_navigation_history(cx: &mut TestAppContext) {
SelectionEffects::scroll(Autoscroll::Next),
window,
cx,
|s| s.select_ranges(Some(70..70)),
|s| s.select_ranges(Some(MultiBufferOffset(70)..MultiBufferOffset(70))),
);
editor.open_excerpts(&OpenExcerpts, window, cx);
});
@@ -22273,7 +22465,7 @@ async fn test_find_enclosing_node_with_task(cx: &mut TestAppContext) {
(buffer.read(cx).remote_id(), 3),
RunnableTasks {
templates: vec![],
offset: snapshot.anchor_before(43),
offset: snapshot.anchor_before(MultiBufferOffset(43)),
column: 0,
extra_variables: HashMap::default(),
context_range: BufferOffset(43)..BufferOffset(85),
@@ -22283,7 +22475,7 @@ async fn test_find_enclosing_node_with_task(cx: &mut TestAppContext) {
(buffer.read(cx).remote_id(), 8),
RunnableTasks {
templates: vec![],
offset: snapshot.anchor_before(86),
offset: snapshot.anchor_before(MultiBufferOffset(86)),
column: 0,
extra_variables: HashMap::default(),
context_range: BufferOffset(86)..BufferOffset(191),
@@ -25665,7 +25857,10 @@ fn assert_selection_ranges(marked_text: &str, editor: &mut Editor, cx: &mut Cont
assert_eq!(editor.text(cx), text);
assert_eq!(
editor.selections.ranges(&editor.display_snapshot(cx)),
ranges,
ranges
.iter()
.map(|range| MultiBufferOffset(range.start)..MultiBufferOffset(range.end))
.collect::<Vec<_>>(),
"Assert selections are {}",
marked_text
);
@@ -25911,10 +26106,12 @@ pub fn handle_completion_request(
vec![complete_from_marker.clone(), replace_range_marker.clone()],
);
let complete_from_position =
cx.to_lsp(marked_ranges.remove(&complete_from_marker).unwrap()[0].start);
let complete_from_position = cx.to_lsp(MultiBufferOffset(
marked_ranges.remove(&complete_from_marker).unwrap()[0].start,
));
let range = marked_ranges.remove(&replace_range_marker).unwrap()[0].clone();
let replace_range =
cx.to_lsp_range(marked_ranges.remove(&replace_range_marker).unwrap()[0].clone());
cx.to_lsp_range(MultiBufferOffset(range.start)..MultiBufferOffset(range.end));
let mut request =
cx.set_request_handler::<lsp::request::Completion, _, _>(move |url, params, _| {
@@ -25975,13 +26172,18 @@ pub fn handle_completion_request_with_insert_and_replace(
],
);
let complete_from_position =
cx.to_lsp(marked_ranges.remove(&complete_from_marker).unwrap()[0].start);
let complete_from_position = cx.to_lsp(MultiBufferOffset(
marked_ranges.remove(&complete_from_marker).unwrap()[0].start,
));
let range = marked_ranges.remove(&replace_range_marker).unwrap()[0].clone();
let replace_range =
cx.to_lsp_range(marked_ranges.remove(&replace_range_marker).unwrap()[0].clone());
cx.to_lsp_range(MultiBufferOffset(range.start)..MultiBufferOffset(range.end));
let insert_range = match marked_ranges.remove(&insert_range_marker) {
Some(ranges) if !ranges.is_empty() => cx.to_lsp_range(ranges[0].clone()),
Some(ranges) if !ranges.is_empty() => {
let range1 = ranges[0].clone();
cx.to_lsp_range(MultiBufferOffset(range1.start)..MultiBufferOffset(range1.end))
}
_ => lsp::Range {
start: replace_range.start,
end: complete_from_position,
@@ -26031,7 +26233,10 @@ fn handle_resolve_completion_request(
.iter()
.map(|(marked_string, new_text)| {
let (_, marked_ranges) = marked_text_ranges(marked_string, false);
let replace_range = cx.to_lsp_range(marked_ranges[0].clone());
let replace_range = cx.to_lsp_range(
MultiBufferOffset(marked_ranges[0].start)
..MultiBufferOffset(marked_ranges[0].end),
);
lsp::TextEdit::new(replace_range, new_text.to_string())
})
.collect::<Vec<_>>()
@@ -26115,7 +26320,7 @@ fn assert_hunk_revert(
let snapshot = editor.snapshot(window, cx);
let reverted_hunk_statuses = snapshot
.buffer_snapshot()
.diff_hunks_in_range(0..snapshot.buffer_snapshot().len())
.diff_hunks_in_range(MultiBufferOffset(0)..snapshot.buffer_snapshot().len())
.map(|hunk| hunk.status().kind)
.collect::<Vec<_>>();
@@ -26739,7 +26944,9 @@ async fn test_newline_replacement_in_single_line(cx: &mut TestAppContext) {
editor.update(cx, |editor, cx| {
assert_eq!(editor.display_text(cx), "oops⋯⋯wow⋯");
});
editor.update(cx, |editor, cx| editor.edit([(3..5, "")], cx));
editor.update(cx, |editor, cx| {
editor.edit([(MultiBufferOffset(3)..MultiBufferOffset(5), "")], cx)
});
cx.run_until_parked();
editor.update(cx, |editor, cx| {
assert_eq!(editor.display_text(cx), "oop⋯wow⋯");
@@ -26776,7 +26983,7 @@ async fn test_non_utf_8_opens(cx: &mut TestAppContext) {
.unwrap();
assert_eq!(
handle.to_any().entity_type(),
handle.to_any_view().entity_type(),
TypeId::of::<InvalidItemView>()
);
}
@@ -27811,7 +28018,7 @@ async fn test_multibuffer_selections_with_folding(cx: &mut TestAppContext) {
// Scenario 1: Unfolded buffers, position cursor on "2", select all matches, then insert
cx.update_editor(|editor, window, cx| {
editor.change_selections(None.into(), window, cx, |s| {
s.select_ranges([2..3]);
s.select_ranges([MultiBufferOffset(2)..MultiBufferOffset(3)]);
});
});
cx.assert_excerpts_with_selections(indoc! {"
@@ -27868,7 +28075,7 @@ async fn test_multibuffer_selections_with_folding(cx: &mut TestAppContext) {
// Select "2" and select all matches
cx.update_editor(|editor, window, cx| {
editor.change_selections(None.into(), window, cx, |s| {
s.select_ranges([2..3]);
s.select_ranges([MultiBufferOffset(2)..MultiBufferOffset(3)]);
});
editor
.select_all_matches(&SelectAllMatches, window, cx)
@@ -27919,7 +28126,7 @@ async fn test_multibuffer_selections_with_folding(cx: &mut TestAppContext) {
// Select "2" and select all matches
cx.update_editor(|editor, window, cx| {
editor.change_selections(None.into(), window, cx, |s| {
s.select_ranges([2..3]);
s.select_ranges([MultiBufferOffset(2)..MultiBufferOffset(3)]);
});
editor
.select_all_matches(&SelectAllMatches, window, cx)


@@ -74,6 +74,7 @@ use smallvec::{SmallVec, smallvec};
use std::{
any::TypeId,
borrow::Cow,
cell::Cell,
cmp::{self, Ordering},
fmt::{self, Write},
iter, mem,
@@ -185,6 +186,13 @@ impl SelectionLayout {
}
}
#[derive(Default)]
struct RenderBlocksOutput {
blocks: Vec<BlockLayout>,
row_block_types: HashMap<DisplayRow, bool>,
resized_blocks: Option<HashMap<CustomBlockId, u32>>,
}
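For context on the struct introduced here: render_blocks previously returned a Result, using the Err variant to carry resized blocks, which discarded the computed layout whenever a resize occurred. The struct keeps all three pieces and makes the resize pass optional. A rough sketch of the shape, assuming placeholder Block/Row/BlockId aliases in place of the real BlockLayout, DisplayRow, and CustomBlockId types:

use std::collections::HashMap;

// Placeholder aliases; the real types are BlockLayout, DisplayRow, and
// CustomBlockId in the editor crate.
type Block = String;
type Row = u32;
type BlockId = u64;

#[derive(Default)]
struct RenderBlocksOutput {
    blocks: Vec<Block>,
    row_block_types: HashMap<Row, bool>,
    // `Some` only when at least one block changed height and another
    // prepaint pass is needed.
    resized_blocks: Option<HashMap<BlockId, u32>>,
}

fn render_blocks(resized: HashMap<BlockId, u32>) -> RenderBlocksOutput {
    RenderBlocksOutput {
        blocks: vec!["header".into()],
        row_block_types: HashMap::from([(0, true)]),
        resized_blocks: (!resized.is_empty()).then_some(resized),
    }
}

fn main() {
    let output = render_blocks(HashMap::from([(7, 3)]));
    // Unlike the old Ok/Err split, the caller always gets the laid-out blocks
    // and separately decides whether another pass is required.
    if let Some(resized) = &output.resized_blocks {
        println!("{} block(s) changed height", resized.len());
    }
    println!(
        "laid out {} block(s) across {} row(s)",
        output.blocks.len(),
        output.row_block_types.len()
    );
}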
pub struct EditorElement {
editor: Entity<Editor>,
style: EditorStyle,
@@ -3667,6 +3675,7 @@ impl EditorElement {
latest_selection_anchors: &HashMap<BufferId, Anchor>,
is_row_soft_wrapped: impl Copy + Fn(usize) -> bool,
sticky_header_excerpt_id: Option<ExcerptId>,
block_resize_offset: &mut i32,
window: &mut Window,
cx: &mut App,
) -> Option<(AnyElement, Size<Pixels>, DisplayRow, Pixels)> {
@@ -3820,7 +3829,10 @@ impl EditorElement {
};
let mut element_height_in_lines = ((final_size.height / line_height).ceil() as u32).max(1);
let mut row = block_row_start;
let effective_row_start = block_row_start.0 as i32 + *block_resize_offset;
debug_assert!(effective_row_start >= 0);
let mut row = DisplayRow(effective_row_start.max(0) as u32);
let mut x_offset = px(0.);
let mut is_block = true;
@@ -3850,6 +3862,7 @@ impl EditorElement {
}
};
if element_height_in_lines != block.height() {
*block_resize_offset += element_height_in_lines as i32 - block.height() as i32;
resized_blocks.insert(custom_block_id, element_height_in_lines);
}
}
@@ -4254,7 +4267,7 @@ impl EditorElement {
sticky_header_excerpt_id: Option<ExcerptId>,
window: &mut Window,
cx: &mut App,
) -> Result<(Vec<BlockLayout>, HashMap<DisplayRow, bool>), HashMap<CustomBlockId, u32>> {
) -> RenderBlocksOutput {
let (fixed_blocks, non_fixed_blocks) = snapshot
.blocks_in_range(rows.clone())
.partition::<Vec<_>, _>(|(_, block)| block.style() == BlockStyle::Fixed);
@@ -4266,6 +4279,7 @@ impl EditorElement {
let mut blocks = Vec::new();
let mut resized_blocks = HashMap::default();
let mut row_block_types = HashMap::default();
let mut block_resize_offset: i32 = 0;
for (row, block) in fixed_blocks {
let block_id = block.id();
@@ -4296,6 +4310,7 @@ impl EditorElement {
latest_selection_anchors,
is_row_soft_wrapped,
sticky_header_excerpt_id,
&mut block_resize_offset,
window,
cx,
) {
@@ -4354,6 +4369,7 @@ impl EditorElement {
latest_selection_anchors,
is_row_soft_wrapped,
sticky_header_excerpt_id,
&mut block_resize_offset,
window,
cx,
) {
@@ -4410,6 +4426,7 @@ impl EditorElement {
latest_selection_anchors,
is_row_soft_wrapped,
sticky_header_excerpt_id,
&mut block_resize_offset,
window,
cx,
) {
@@ -4429,9 +4446,12 @@ impl EditorElement {
if resized_blocks.is_empty() {
*scroll_width =
(*scroll_width).max(fixed_block_max_width - editor_margins.gutter.width);
Ok((blocks, row_block_types))
} else {
Err(resized_blocks)
}
RenderBlocksOutput {
blocks,
row_block_types,
resized_blocks: (!resized_blocks.is_empty()).then_some(resized_blocks),
}
}
@@ -8772,8 +8792,48 @@ impl EditorElement {
}
}
#[derive(Default)]
pub struct EditorRequestLayoutState {
// We use prepaint depth to limit the number of times prepaint is
// called recursively. We need this so that we can update stale
// data, e.g. block heights in the block map.
prepaint_depth: Rc<Cell<usize>>,
}
impl EditorRequestLayoutState {
// In ideal conditions we only need one more subsequent prepaint call for a resize to take effect,
// i.e. MAX_PREPAINT_DEPTH = 2, but since blocks can be moved inline (place_near), more lines from
// below get exposed, and we end up querying blocks for those lines too in subsequent renders.
// Setting MAX_PREPAINT_DEPTH = 3 passes all tests. Just to be on the safe side we set it to 5, so
// that subsequent shrinking does not lead to incorrect block placement.
const MAX_PREPAINT_DEPTH: usize = 5;
fn increment_prepaint_depth(&self) -> EditorPrepaintGuard {
let depth = self.prepaint_depth.get();
self.prepaint_depth.set(depth + 1);
EditorPrepaintGuard {
prepaint_depth: self.prepaint_depth.clone(),
}
}
fn can_prepaint(&self) -> bool {
self.prepaint_depth.get() < Self::MAX_PREPAINT_DEPTH
}
}
struct EditorPrepaintGuard {
prepaint_depth: Rc<Cell<usize>>,
}
impl Drop for EditorPrepaintGuard {
fn drop(&mut self) {
let depth = self.prepaint_depth.get();
self.prepaint_depth.set(depth.saturating_sub(1));
}
}
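The guard above is a small RAII counter: each prepaint pass bumps the depth on entry and restores it on drop, and can_prepaint refuses further recursion past MAX_PREPAINT_DEPTH. A standalone sketch of the same pattern, with hypothetical LayoutState/DepthGuard names:

use std::cell::Cell;
use std::rc::Rc;

// Illustrative cap; the real value in this diff is MAX_PREPAINT_DEPTH = 5.
const MAX_DEPTH: usize = 5;

#[derive(Default)]
struct LayoutState {
    depth: Rc<Cell<usize>>,
}

struct DepthGuard {
    depth: Rc<Cell<usize>>,
}

impl Drop for DepthGuard {
    fn drop(&mut self) {
        // Restore the counter even on early returns or panics.
        self.depth.set(self.depth.get().saturating_sub(1));
    }
}

impl LayoutState {
    fn enter(&self) -> DepthGuard {
        self.depth.set(self.depth.get() + 1);
        DepthGuard {
            depth: self.depth.clone(),
        }
    }

    fn can_recurse(&self) -> bool {
        self.depth.get() < MAX_DEPTH
    }
}

fn prepaint(state: &LayoutState, needs_another_pass: bool) {
    let _guard = state.enter();
    // ... lay things out ...
    if needs_another_pass && state.can_recurse() {
        // Re-run with the updated sizes; the guard keeps the depth honest.
        prepaint(state, false);
    }
}

fn main() {
    let state = LayoutState::default();
    prepaint(&state, true);
    assert_eq!(state.depth.get(), 0); // all guards dropped
}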
impl Element for EditorElement {
type RequestLayoutState = ();
type RequestLayoutState = EditorRequestLayoutState;
type PrepaintState = EditorLayout;
fn id(&self) -> Option<ElementId> {
@@ -8790,7 +8850,7 @@ impl Element for EditorElement {
_inspector_id: Option<&gpui::InspectorElementId>,
window: &mut Window,
cx: &mut App,
) -> (gpui::LayoutId, ()) {
) -> (gpui::LayoutId, Self::RequestLayoutState) {
let rem_size = self.rem_size(cx);
window.with_rem_size(rem_size, |window| {
self.editor.update(cx, |editor, cx| {
@@ -8857,7 +8917,7 @@ impl Element for EditorElement {
}
};
(layout_id, ())
(layout_id, EditorRequestLayoutState::default())
})
})
}
@@ -8867,10 +8927,11 @@ impl Element for EditorElement {
_: Option<&GlobalElementId>,
_inspector_id: Option<&gpui::InspectorElementId>,
bounds: Bounds<Pixels>,
_: &mut Self::RequestLayoutState,
request_layout: &mut Self::RequestLayoutState,
window: &mut Window,
cx: &mut App,
) -> Self::PrepaintState {
let _prepaint_depth_guard = request_layout.increment_prepaint_depth();
let text_style = TextStyleRefinement {
font_size: Some(self.style.text.font_size),
line_height: Some(self.style.text.line_height),
@@ -9394,7 +9455,20 @@ impl Element for EditorElement {
// If the fold widths have changed, we need to prepaint
// the element again to account for any changes in
// wrapping.
return self.prepaint(None, _inspector_id, bounds, &mut (), window, cx);
if request_layout.can_prepaint() {
return self.prepaint(
None,
_inspector_id,
bounds,
request_layout,
window,
cx,
);
} else {
debug_panic!(
"skipping recursive prepaint at max depth. renderer widths may be stale."
);
}
}
let longest_line_blame_width = self
@@ -9481,20 +9555,35 @@ impl Element for EditorElement {
)
})
})
.unwrap_or_else(|| Ok((Vec::default(), HashMap::default())));
let (mut blocks, row_block_types) = match blocks {
Ok(blocks) => blocks,
Err(resized_blocks) => {
self.editor.update(cx, |editor, cx| {
editor.resize_blocks(
resized_blocks,
autoscroll_request.map(|(autoscroll, _)| autoscroll),
cx,
)
});
return self.prepaint(None, _inspector_id, bounds, &mut (), window, cx);
.unwrap_or_default();
let RenderBlocksOutput {
mut blocks,
row_block_types,
resized_blocks,
} = blocks;
if let Some(resized_blocks) = resized_blocks {
self.editor.update(cx, |editor, cx| {
editor.resize_blocks(
resized_blocks,
autoscroll_request.map(|(autoscroll, _)| autoscroll),
cx,
)
});
if request_layout.can_prepaint() {
return self.prepaint(
None,
_inspector_id,
bounds,
request_layout,
window,
cx,
);
} else {
debug_panic!(
"skipping recursive prepaint at max depth. block layout may be stale."
);
}
};
}
let sticky_buffer_header = sticky_header_excerpt.map(|sticky_header_excerpt| {
window.with_element_namespace("blocks", |window| {

Some files were not shown because too many files have changed in this diff.