Compare commits


126 Commits

Author SHA1 Message Date
Anthony
19265bc340 Merge remote-tracking branch 'origin/main' into container-queries 2025-10-30 17:19:30 -04:00
Dino
8aa2158418 vim: Improve pasting while in replace mode (#41549)
- Update `vim::normal::Vim.normal_replace` to work with more than one
  character
- Add `vim::replace::Vim.paste_replace` to handle pasting the 
  clipboard's contents while in replace mode
- Update vim's handling of the `editor::actions::Paste` action so that
  the `paste_replace` method is called when vim is in replace mode,
  otherwise it'll just call the regular `editor::Editor.paste` method
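A minimal sketch of that dispatch, with hypothetical types standing in for Zed's actual `Vim` and `Editor` (only the routing logic mirrors the description above; the buffer handling is simplified to ASCII strings):

```rust
// Hypothetical sketch of the paste routing described above.
// `Mode`, `Vim`, and `handle_paste` are illustrative, not Zed's real API.
#[derive(PartialEq)]
enum Mode {
    Normal,
    Replace,
}

struct Vim {
    mode: Mode,
}

impl Vim {
    /// Route the editor's Paste action depending on the current mode.
    fn handle_paste(&self, clipboard: &str, buffer: &mut String, cursor: usize) {
        if self.mode == Mode::Replace {
            self.paste_replace(clipboard, buffer, cursor);
        } else {
            // Fall through to the regular editor paste: insert at the cursor.
            buffer.insert_str(cursor, clipboard);
        }
    }

    /// Overwrite one existing character per pasted character, so the number
    /// of replaced characters matches the length of the pasted contents.
    fn paste_replace(&self, clipboard: &str, buffer: &mut String, cursor: usize) {
        let end = (cursor + clipboard.len()).min(buffer.len());
        buffer.replace_range(cursor..end, clipboard);
    }
}
```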

Closes #41378 

Release Notes:

- Improved pasting while in Vim's Replace mode, ensuring that Zed
replaces the same number of characters as the length of the contents
being pasted
2025-10-30 20:33:03 +00:00
Anthony Eid
5ae0768ce4 debugger: Polish breakpoint list UI (#41598)
This PR fixes breakpoint icon alignment so that icons sit at the end of
a rendered entry, and enables editing breakpoint properties when there's
no active session.

The alignment issue was caused by some icons being invisible, so the
layout phase always accounted for the space they would take up. Only
laying out the icons when they are visible fixed the issue.

#### Before
<img width="1014" height="316" alt="image"
src="https://github.com/user-attachments/assets/9a9ced06-e219-4d9d-8793-6bdfdaca48e8"
/>

#### After
<img width="502" height="167" alt="Screenshot 2025-10-30 at 3 21 17 PM"
src="https://github.com/user-attachments/assets/23744868-e354-461c-a940-9b6812e1bcf4"
/>

Release Notes:

- Breakpoint list: Allow adding conditions, logs, and hit conditions to
breakpoints when there's no active session
2025-10-30 16:15:21 -04:00
Anthony Eid
44e5a962e6 debugger: Add horizontal scroll bars to variable list, memory view, and breakpoint list (#41594)
Closes #40360

This PR adds heuristics to determine which variable/breakpoint list
entry is the widest when rendered, so the uniform list can correctly
identify the longest item and use it to calculate the scrollbar size.

The heuristic can be off if a non-monospace font is used in the UI; in
most cases, though, it's more than accurate enough.
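The heuristic can be sketched as a character-count estimate; `widest_entry_px` and `avg_char_width_px` are hypothetical names, and the estimate is exact only for monospace fonts, which is the caveat mentioned above:

```rust
// Hypothetical width heuristic: approximate each entry's rendered width by
// its character count times an average glyph width, and keep the maximum.
// Exact for monospace UI fonts; an estimate for proportional ones.
fn widest_entry_px(entries: &[&str], avg_char_width_px: f32) -> f32 {
    entries
        .iter()
        .map(|e| e.chars().count() as f32 * avg_char_width_px)
        .fold(0.0, f32::max)
}
```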

Release Notes:

- debugger: Add horizontal scroll bars to variable list, memory view,
and breakpoint list

---------

Co-authored-by: MrSubidubi <dev@bahn.sh>
2025-10-30 19:43:32 +00:00
Lukas Wirth
3944234bab windows: Don't flood windows message queue with gpui messages (#41595)
Release Notes:

- N/A

Co-authored-by: Max Brunsfeld <max@zed.dev>
2025-10-30 20:09:32 +01:00
Lukas Wirth
ac3b232dda Reduce amount of foreground tasks spawned on multibuffer/editor updates (#41479)
When doing a project-wide search in Zed on Windows for `hang`, Zed
starts to freeze for a couple of seconds, ultimately erroring with
`Not enough quota is available to process this command.` when
dispatching Windows messages. The cause is that we simply overload the
Windows message pump due to the sheer number of foreground tasks we
spawn when populating the project search.

This PR is an attempt at reducing this.
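One way to reduce foreground-task pressure, sketched here with hypothetical names rather than Zed's actual code: buffer incoming results and schedule a single flush task, instead of spawning one task per result.

```rust
use std::collections::VecDeque;

// Sketch of coalescing: rather than one foreground task per search result
// (which can flood an OS message queue), buffer results and schedule a
// single flush task that drains everything queued so far.
struct Coalescer {
    pending: VecDeque<String>,
    flush_scheduled: bool,
}

impl Coalescer {
    fn new() -> Self {
        Self { pending: VecDeque::new(), flush_scheduled: false }
    }

    /// Buffer an item. Returns true only when the caller should schedule
    /// a flush task; subsequent pushes piggyback on the pending flush.
    fn push(&mut self, item: String) -> bool {
        self.pending.push_back(item);
        !std::mem::replace(&mut self.flush_scheduled, true)
    }

    /// The flush task drains every buffered item in one go.
    fn flush(&mut self) -> Vec<String> {
        self.flush_scheduled = false;
        self.pending.drain(..).collect()
    }
}
```

With this shape, N results cost one scheduled task instead of N.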

Release Notes:

- Reduced hangs and stutters in large project file searches
2025-10-30 17:40:56 +00:00
Paweł Kondzior
743180342a agent_ui: Insert thread summary as proper mention URI (#40722)
This ensures the thread summary is treated as a tracked mention with
accessible context.

Changes:
- Fixed `MessageEditor::insert_thread_summary()` to use proper mention
URI format
- Added test coverage to verify the fix

Release Notes:

- Fixed an issue where "New From Summary" was not properly inserting
thread summaries as contextual mentions when creating new threads.
Thread summaries are now inserted as proper mention URIs.
2025-10-30 17:19:32 +01:00
Bennet Fenner
3825ce523e agent_ui: Fix agent: Chat with follow not working (#41581)
Release Notes:

- Fixed an issue where `agent: Chat with follow` was not working anymore

Co-authored-by: Ben Brandt <benjamin.j.brandt@gmail.com>
2025-10-30 16:03:12 +00:00
Anthony Eid
b4cf7e440e debugger: Get rid of initialize_args in php debugger setup docs (#41579)
Related to issue: #40887

Release Notes:

- N/A

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-10-30 11:47:59 -04:00
Conrad Irwin
bdb2d6c8de Don't skip tests in nightly release (#41573)
Release Notes:

- N/A
2025-10-30 14:59:30 +00:00
Lukas Wirth
0c73252c9d project: Spawn terminal process on background executor (#41216)
Attempt 2 for https://github.com/zed-industries/zed/pull/40774

We were previously spawning the process on the foreground thread, which
can block for an arbitrary amount of time. Likewise, we no longer block
deserialization on the terminal loading.

Release Notes:

- Improved startup time on systems with slow process spawning
capabilities
2025-10-30 13:55:19 +00:00
Danilo Leal
c7aa805398 docs: Improve the Inline Assistant content (#41566)
Release Notes:

- N/A
2025-10-30 13:53:31 +00:00
Lukas Wirth
94ba24dadd terminal: Properly kill child process on terminal exit (#41562)
Release Notes:

- Fixed terminal processes occasionally leaking

Co-authored-by: Jakub <jakub@zed.dev>
2025-10-30 13:40:31 +00:00
Agus Zubiaga
046b43f135 collab panel: Open selected channel notes (#41560)
Adds an action to open the notes for the currently selected channel in
the collab panel, which is mapped to `alt-enter` in all platforms.

Release Notes:

- collab: Add `collab_panel::OpenSelectedChannelNotes` action
(`alt-enter` by default)
2025-10-30 13:10:19 +00:00
Caleb Jasik
426040f08f Add cmd-d shortcut for (terminal) pane::SplitRight (#41139)
Add default keybinding for `pane::SplitRight` in the `Terminal` context
for all platforms.

Release Notes:

- Added VS Code's terminal split keybindings (`cmd-d` on macOS,
`ctrl-shift-5` on Windows and Linux)

---------

Co-authored-by: dino <dinojoaocosta@gmail.com>
2025-10-30 12:28:06 +00:00
Finn Evers
785b5ade6e extension_host: Do not try auto installing suppressed extensions (#41551)
Release Notes:

- Fixed an issue where Zed would try to install extensions specified
under `auto_install_extensions` which were moved into core.
2025-10-30 12:24:32 +00:00
A. Teo Welton
344f63c6ca Language: Fix minor C++ completion label formatting issue (#41544)
Closes #39515

**Details:**
- Improved the logic for formatting completion labels, as some (such as
`namespace`) were missing space characters.
- Added extra logic per a stale PR #39533
[comment](https://github.com/zed-industries/zed/pull/39533#issuecomment-3368549433),
ensuring that cases where extra spaces are not necessary (such as
functions) are not affected.
- I'll note that I was not able to figure out how to fix the coloring of
`namespace` within completion labels as mentioned in that comment; if
someone would provide me with direction, I would be happy to look into
that too.

Previous:
<img width="812" height="530" alt="previous"
src="https://github.com/user-attachments/assets/b38f1590-ca2d-489d-9dcb-2d478eb6ed03"
/>

Fixed:
<img width="812" height="530" alt="fixed"
src="https://github.com/user-attachments/assets/020b151d-e5d9-467e-99c1-5b0cab057169"
/>


Release Notes:

- Fixed minor issue where some `clangd` labels would be missing a space
in formatting
2025-10-30 10:47:44 +00:00
claytonrcarter
e30d5998e4 bundle: Restore local install on macOS (#41482)
I just pulled and ran a local build via `script/bundle-mac -l -i` but
found that the resulting bundle wasn't installed as expected. (me:
"ToggleAllDocks!! Wait! Where is it?!") Looking into, it looks like the
`-l` flag was removed in #41392, leaving the `$local_only` var orphaned,
which then left the `-i/$local_install` flag unreachable. I suspect that
this was unintentional, so this PR re-adds the `-l/$local_only` flag to
`script/bundle-mac`.

I ran the build again and confirmed that local install seemed to work as
expected. (ie "ToggleAllDocks!! 🎉")

While here, I also removed the last reference to `$local_arch`, because
all other references to that were removed in #41392.

/cc @osiewicz 

Release Notes:

- N/A

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-10-29 21:55:02 -06:00
Conrad Irwin
277ae27ca2 Use gh-workflow for tests (take 2) (#41420)
This re-implements the reverted commit 8b051d6cc3.

Release Notes:

- N/A

---------

Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-10-29 21:28:43 -04:00
Danilo Leal
64fdc1d5b6 docs: Fix Codestral section title in edit prediction page (#41509)
Follow up to https://github.com/zed-industries/zed/pull/41507 as I
realized I didn't change the title for this section.

Release Notes:

- N/A
2025-10-30 01:18:43 +00:00
Danilo Leal
992448b560 edit prediction: Add ability to switch providers from the status bar menu (#41504)
Closes https://github.com/zed-industries/zed/issues/41500

<img width="500" height="1122" alt="Screenshot 2025-10-29 at 9  43@2x"
src="https://github.com/user-attachments/assets/ac2a81ad-99bb-43cd-b032-f2485fc23166"
/>

Release Notes:

- Added the ability to switch between configured edit prediction
providers through the status bar menu.
2025-10-29 22:16:13 -03:00
Danilo Leal
802b0e4968 docs: Add content about EP with Codestral (#41507)
This was missing after we added support to Codestral as an edit
prediction provider.

Release Notes:

- N/A
2025-10-29 22:07:38 -03:00
Agus Zubiaga
b8cdd38efb zeta2: Improve context search performance (#41501)
We'll now perform all searches from the context model concurrently, and
combine queries for the same glob into one, reducing the total number of
project searches.

For better readability, the debug context view now displays each
top-level regex alternation individually, grouped by its corresponding
glob:

<img width="1592" height="672" alt="CleanShot 2025-10-29 at 19 56 03@2x"
src="https://github.com/user-attachments/assets/f6e8408e-09d6-4e27-ba11-a739a772aa12"
/>

  
Release Notes:

- N/A
2025-10-29 23:06:49 +00:00
Danilo Leal
87f9ba380f settings_ui: Close the settings window when going to the JSON file (#41491)
Release Notes:

- N/A
2025-10-29 19:19:36 -03:00
Danilo Leal
12dae07108 agent_ui: Fix history view background color when zoomed in (#41493)
Release Notes:

- N/A
2025-10-29 17:58:31 -03:00
Danilo Leal
cf0f442869 settings_ui: Fix links for edit prediction items (#41492)
Follow up to the bonus commit we added in
https://github.com/zed-industries/zed/pull/41172/.

Release Notes:

- N/A
2025-10-29 17:58:22 -03:00
Joseph T. Lyons
de9c4127a5 Remove references to how-to blog posts (#41489)
Release Notes:

- N/A
2025-10-29 19:48:50 +00:00
Joseph T. Lyons
e7089fe45c Update release process doc (#41488)
Release Notes:

- N/A
2025-10-29 19:47:32 +00:00
Kirill Bulatov
901b6ffd28 Support numeric tokens in work report LSP requests (#41448)
Closes https://github.com/zed-industries/zed/issues/41347

Release Notes:

- Indicate progress for more kinds of language servers
2025-10-29 19:35:16 +00:00
Danilo Leal
edc380db80 settings_ui: Add edit prediction settings (#41480)
Release Notes:

- N/A

---------

Co-authored-by: Ben Kunkle <Ben.kunkle@gmail.com>
2025-10-29 16:18:06 -03:00
Danilo Leal
33adfa443e docs: Add content about adding selection as context in the agent panel (#41485)
Release Notes:

- N/A
2025-10-29 16:17:45 -03:00
Finn Evers
9e5438906a svg_preview: Update preview on every buffer edit (#41270)
Closes https://github.com/zed-industries/zed/issues/39104

This fixes an issue where the preview would not work for remote buffers
in the process.

Release Notes:

- Fixed an issue where the SVG preview would not work in remote
scenarios.
- The SVG preview will now rerender on every keypress instead of only on
saves.
2025-10-29 18:49:39 +01:00
Joseph T. Lyons
fbe2907919 Document zed: reveal log in file manager in crash report template (#41053)
Merge once stable is v0.210 (10/29/2025).

Release Notes:

- N/A
2025-10-29 13:35:23 -04:00
Paul Xu
02f5a514ce gpui: Add justify_evenly to Styled (#41262)
Release Notes:
- gpui: Add `justify_evenly()` to `Styled`.
2025-10-29 12:56:53 -04:00
tidely
4bd4d76276 gpui: Fix GPUI prompts from bleeding clicks into lower windows (#41442)
Closes #41180 

When using the fallback prompt renderer (default on Wayland), clicks
would bleed through into underlying windows. When the click happens to
hit a button that creates a prompt, it drops the
`RenderablePromptHandle` which is contained within `Window`, causing the
`Receiver` which returns the index of the clicked `PromptButton` to
return `Err(Canceled)` even though a button was pressed.

This bug appears in the GPUI `window.rs` example, which can be run using
`cargo run -p gpui --example window`. macOS has a native
`PromptRenderer` and thus needs additional code to be adjusted to
reproduce the issue.
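The failure mode reduces to a dropped channel sender; the same effect can be shown with std channels (the analogy to the dropped `RenderablePromptHandle` is ours, not GPUI's API):

```rust
use std::sync::mpsc;

// Dropping the sending half (standing in for the dropped
// RenderablePromptHandle) makes the receiver report an error even
// though no button index was ever sent -- analogous to Err(Canceled).
fn demo() -> Result<usize, mpsc::RecvError> {
    let (tx, rx) = mpsc::channel::<usize>();
    drop(tx); // the handle is dropped before a button click is delivered
    rx.recv()
}
```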

Release Notes:

- N/A
2025-10-29 12:53:06 -04:00
Cameron Mcloughlin
7a7e820030 settings_ui: Remove OpenSettingsAt from command palette (#41358)
Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-10-29 16:29:53 +00:00
Bennet Fenner
0e45500158 prompt_store: Remove unused code (#41473)
Release Notes:

- N/A
2025-10-29 16:27:30 +00:00
Danilo Leal
16c399876c settings_ui: Add ability to copy a link for a given setting (#41172)
Release Notes:

- settings_ui: Added the ability to copy a link to a given setting,
allowing users to quickly open the settings window at the correct
location in a faster way.

---------

Co-authored-by: cameron <cameron.studdstreet@gmail.com>
Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-10-29 13:15:08 -03:00
Lukas Wirth
3583e129d1 editor: Limit the amount of git processes spawned per multibuffer (#41472)
Release Notes:

- Reduced the number of concurrent git processes spawned for blaming
2025-10-29 16:08:41 +00:00
skewb1k
75b1da0f65 Fix Gruvbox accent colors (#41470)
In #11503, the "accents" option was incorrectly at the top level. This
moves it under the "style" key so it takes effect.

### Before/After
<img width="872" height="499" alt="1761750444_screenshot"
src="https://github.com/user-attachments/assets/2720d576-33b7-42df-9290-7b6a56f5b6a6"
/>
<img width="901" height="501" alt="1761750448_screenshot"
src="https://github.com/user-attachments/assets/bd6b7ccb-77ef-467c-b7cc-a5107b093db5"
/>

Release Notes:

- N/A
2025-10-29 15:59:42 +00:00
Shardul Vaidya
207a202477 bedrock: Add support for Claude Haiku 4.5 model (#41045)
Release Notes:

- bedrock: Added support for Claude Haiku 4.5

---------

Co-authored-by: Ona <no-reply@ona.com>
2025-10-29 16:41:43 +01:00
Yordis Prieto
0871c539ee acp_tools: Add button to clear messages (#41206)
Added a "Clear Messages" button to the ACP logs toolbar that removes all
messages.

## Motivation

When debugging ACP protocol implementations, the message list can become
cluttered with old messages. This feature allows clearing all messages
with a single click to start fresh, making it easier to focus on new
interactions without closing and reopening the ACP logs view.

Release Notes:

- N/A
2025-10-29 16:40:02 +01:00
Hilmar Wiegand
b92664c52d gpui: Implement support for wlr layer shell (#35610)
This reintroduces `layer_shell` support after #32651 was reverted. On
top of that, it allows setting options for the created surface,
restricts the enum variant to the `wayland` feature, and adds an example
that renders a clock widget using the protocol.

I've renamed the `WindowKind` variant to `LayerShell` from `Overlay`,
since the protocol can also be used to render wallpapers and such, which
doesn't really fit with the word.

Things I'm still unsure of:
- We need to get the layer options types to the user somehow, but
nothing from the `platform::linux` crate was exported, I'm assuming
intentionally. I've kept the types inside the module (instead of doing
`pub use layer_shell::*`) to not pollute the global namespace with
generic words like `Anchor` or `Layer`. Let me know if you want to do
this differently.
- I've added the options to the `WindowKind` variant. That's the only
clean way I see to supply them when the window is created. This makes
the kind no longer implement `Copy`.
- The options don't have setter methods yet and can only be defined on
window creation. We'd have to make fallible functions for setting them,
which only work if the underlying surface is a `layer_shell` surface.
That feels un-rust-y.

CC @zeroeightysix  
Thanks to @wuliuqii, whose layer-shell implementation I've also looked
at while putting this together.

Release Notes:

- Add support for the `layer_shell` protocol on wayland

---------

Co-authored-by: Ridan Vandenbergh <ridanvandenbergh@gmail.com>
2025-10-29 11:32:01 -04:00
Anthony Eid
19099e808c editor: Add action to move between snippet tabstop positions (#41466)
Closes #41407

This solves a problem where users couldn't navigate between snippet
tabstops while the completion menu was open.

I named the actions `{Next, Previous}SnippetTabstop` instead of
`Placeholder` to be more in line with the LSP spec naming convention and
our codebase names.

Release Notes:

- Editor: Add actions to move between snippet tabstop positions
2025-10-29 15:26:09 +00:00
Ben Kunkle
f29ac79bbd Add myself as a docs reviewer (#41463)
Release Notes:

- N/A
2025-10-29 14:21:56 +00:00
Angelo Verlain
797ac5ead4 docs: Update docs for using ESLint as the only formatter (#40679)
Release Notes:

- N/A

---------

Co-authored-by: Ben Kunkle <Ben.kunkle@gmail.com>
2025-10-29 13:58:35 +00:00
Lukas Wirth
37c6cd43e0 project: Fix inlay hints duplicating on chunk start (#41461)
Release Notes:

- N/A
2025-10-29 13:52:03 +00:00
Justin Su
01a1b9b2c1 Document Go hard tabs in default settings (#41459)
Closes https://github.com/zed-industries/zed/issues/40876

This is already present in the code but missing from the default
settings, which is confusing.

Release Notes:

- N/A
2025-10-29 09:44:26 -04:00
Anthony Eid
d44437d543 display map: Fix left shift debug panic (#38656)
Closes https://github.com/zed-industries/zed/issues/38558

The bug occurred because `TabStopCursor`'s `chunk_position.1` was
bounded between 0 and 128. The fix was changing the bound to 0 and 127.

This also allowed me to simplify some of the tab stop cursor code to be
a bit faster (fewer branches and no unbounded shifts).
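The off-by-one can be illustrated with a masked shift: on a 128-bit word, valid shift amounts are 0 through 127, so shifting by 128 overflows (a panic in debug builds). Masking keeps the shift bounded. This is a hypothetical sketch, not the actual `TabStopCursor` code:

```rust
// A shift amount on u128 must be in 0..=127; shifting by 128 is an
// arithmetic overflow (debug panic). Masking the position with 127
// keeps the shift bounded, matching the 0..=127 bound described above.
fn bit_for_position(pos: u32) -> u128 {
    1u128 << (pos & 127) // 128 wraps to 0 instead of panicking
}
```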

Release Notes:

- N/A
2025-10-29 09:34:33 -04:00
Finn Evers
6be029ff17 Document plain text soft wrap in default settings (#41456)
Closes #41169

This was already present in code before, but not documented in the
default settings, which could lead to confusion.

Release Notes:

- N/A
2025-10-29 12:38:18 +00:00
Finn Evers
d59ecf790f ui: Don't show scrollbar track in too many cases (#41455)
Follow-up to https://github.com/zed-industries/zed/pull/41354 which
introduced a small regression.

Release Notes:

- N/A
2025-10-29 12:20:57 +00:00
Lukas Wirth
bde7e55adb editor: Render diagnostic popover even if the source is out of view (#41449)
This happens quite often with cargo-based diagnostics, which may span
several lines (sometimes the entire screen); forcing the user to scroll
up to the start of the diagnostic just to see the hover message is not
great.

Release Notes:

- Fixed diagnostics hovers not working if the diagnostic spans out of
view
2025-10-29 12:18:34 +00:00
Lukas Wirth
b7d31fabc5 vim: Add helix mode toggle (#41454)
Just for parity with Vim. This also prevents both modes from being
enabled at the same time, as that is a buggy state.

Release Notes:

- Added command to toggle helix mode
2025-10-29 12:16:31 +00:00
Finn Evers
1a223e23fb Revert "Support relative line number on wrapped lines (#39268)" (#41450)
Closes #41422

This completely broke line numbering as described in the linked issue,
and scrolling up no longer showed the correct numbers.

Release Notes:

- NOTE: The `relative_line_numbers` change
(https://github.com/zed-industries/zed/pull/39268) was reverted and did
not make the release cut!
2025-10-29 11:07:23 +00:00
Xiaobo Liu
f2c03d0d0a gpui: Fix typo in ForegroundExecutor documentation (#41446)
Release Notes:

- N/A

Signed-off-by: Xiaobo Liu <cppcoffee@gmail.com>
2025-10-29 10:53:52 +00:00
Lukas Wirth
6fa823417f editor: When expanding first excerpt up, scroll it into view (#41445)
Before

https://github.com/user-attachments/assets/2390e924-112a-43fa-8ab8-429a55456d12

After

https://github.com/user-attachments/assets/b47c95f0-ccd9-40a6-ab04-28295158102e

Release Notes:

- Fixed an issue where expanding the first excerpt upwards would expand
it out of view
2025-10-29 10:32:42 +00:00
Mayank Verma
8725a2d166 go_to_line: Fix scroll position restore on dismiss (#41234)
Closes #35347

Release Notes:

- Fixed Go To Line jumping back to previous position on dismiss
2025-10-29 08:47:48 +00:00
Conrad Irwin
f9c97d29c8 Bump Zed to v0.212 (#41417)
Release Notes:

- N/A
2025-10-29 02:10:33 +00:00
Conrad Irwin
5192233b59 Fix people who use gh instead of env vars (#41418)
Release Notes:

- N/A
2025-10-29 02:09:22 +00:00
Callum Tolley
d0d7b9cdcd Update docs to use == instead of = (#41415)
Closes #41219

Release Notes:

- Updated docs to use `==` instead of `=` in keymap context.

Hopefully I'm not mistaken here, but I think the docs have a bug in them.
2025-10-28 19:34:19 -06:00
Ben Kunkle
8b051d6cc3 Revert "Use gh workflow for tests" (#41411)
Reverts zed-industries/zed#41384

The branch-protection rules work much better when there is a Job that
runs every time and can be depended on to pass, we no longer have this.

Release Notes:

- N/A
2025-10-29 00:37:57 +00:00
John Tur
16d84a31ec Adjust Windows fusion manifests (#41408)
- Declare UAC support. This will prevent Windows from flagging
`auto_update_helper.exe` as a legacy setup program that needs to run as
administrator.
- Declare support for Windows 10. This will stop Windows from applying
various application compatibility profiles.

The UAC policy is not really appropriate to apply to all GPUI
applications (e.g. an installer written in GPUI may want to declare
itself as `requireAdministrator` instead of `asInvoker`). I tried
splitting this into a Zed.exe-only manifest and enabling manifest file
merging, but I ran out of my time-box. We can fix this later if this is
flagged by GPUI users.

Release Notes:

- N/A
2025-10-28 20:21:31 -04:00
Ben Kunkle
4adff4aa8a Use gh workflow for tests (#41384)
Follow up for: #41304

Splits CI tests (cherry-picks and PRs only for now) into separate
workflows using `gh-workflow`. Includes a couple restructures to
- run more things in parallel
- remove our previous shell script based checking to filter tests based
on files changed, instead using the builtin `paths:` workflow filters


Splitting the docs/style/rust tests & checks into separate workflows
means we lose the complete summary showing all the tests in one view,
but it's possible to re-add in the future if we go back to checking what
files changed ourselves or always run everything.

Release Notes:

- N/A

---------

Co-authored-by: Conrad <conrad@zed.dev>
2025-10-28 23:31:38 +00:00
Danilo Leal
7de3c67b1d docs: Improve header on mobile (#41404)
Release Notes:

- N/A
2025-10-28 19:34:13 -03:00
Max Brunsfeld
60bd417d8b Allow inspection of zeta2's LLM-based context retrieval (#41340)
Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <agus@zed.dev>
2025-10-28 22:26:48 +00:00
Danilo Leal
d31194dcf8 docs: Fix keybinding display in /configuring-zed (#41402)
Release Notes:

- N/A
2025-10-28 19:20:49 -03:00
Piotr Osiewicz
4cc6d6a398 ci: Notarize in parallel (different flavor) (#41392)
Release Notes:

- N/A

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
Co-authored-by: Ben Kunkle <ben@zed.dev>
Co-authored-by: Conrad Irwin <conrad@zed.dev>
2025-10-28 16:18:17 -06:00
Piotr Osiewicz
b9eafb80fd extensions: Load extension byte repr in background thread (again) (#41398)
Release Notes:

- N/A
2025-10-28 20:54:35 +00:00
Cameron Mcloughlin
5e7927f628 [WIP] editor: Implement next/prev reference (#41078)
Co-authored-by: Cole <cole@zed.dev>
Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-10-28 20:41:44 +00:00
Katie Geer
b75736568b docs: Reorganize introduction (#41387)
Release Notes:
- Remove Windows/Linux from Getting Started
- Consolidate download & system into new Installation page
- Move Remote Dev out of Windows and into Remote Development
- Add Uninstall page
- Add updates page
- Remove addl learning materials from intro
2025-10-28 17:39:40 -03:00
Lukas Wirth
360074effb rope: Prevent stack overflows by bumping rayon stack sizes (#41397)
Thread stacks in Rust have 2 megabytes of stack by default, which sum
trees (or ropes in this case) can easily exceed depending on the
workload.
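The same mechanism can be shown with a std thread rather than rayon's pool (the PR configures rayon; this sketch, with assumed names and sizes, just illustrates why a larger stack helps deeply recursive traversals):

```rust
use std::thread;

// Run a closure on a worker thread with a larger-than-default stack.
// Rust threads default to a small stack (~2 MiB for spawned threads),
// which deep sum-tree recursion can exceed; requesting more avoids the
// overflow. The 16 MiB figure here is an illustrative choice.
fn run_with_big_stack<T: Send + 'static>(f: impl FnOnce() -> T + Send + 'static) -> T {
    thread::Builder::new()
        .stack_size(16 * 1024 * 1024) // 16 MiB instead of the default
        .spawn(f)
        .expect("failed to spawn thread")
        .join()
        .expect("worker thread panicked")
}
```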

Release Notes:

- Fixed stack overflows when constructing large ropes
2025-10-28 20:21:49 +00:00
Conrad Irwin
b1922b7156 Move Nightly release to gh-workflow (#41349)
Follow up to #41304 to move nightly release over

Release Notes:

- N/A

---------

Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-10-28 13:57:23 -06:00
Danilo Leal
54fd7ea699 agent_ui: Don't show root project in path prefix in @-mention menu (#41372)
This PR hides the worktree root name from the path prefix displayed when
you @-mention a file or directory in the agent panel. Given the tight UI
real estate we have, I believe that having the project name in there is
redundant—the project you're in is already displayed in the title bar.
Not only was it the very first word you'd see after the file's name, but
it also made the path longer than it needs to be. A bit of a design
cleanup here :)

(PS: We still show the project name if there are more than one in the
same workspace.)

Release Notes:

- N/A
2025-10-28 16:15:53 -03:00
Danilo Leal
8564602c3b command palette: Make footer buttons justified to the right (#41382)
Release Notes:

- N/A
2025-10-28 16:15:44 -03:00
Marshall Bowers
e604ef3af9 Add a setting to prevent sharing projects in public channels (#41395)
This PR adds a setting to prevent projects from being shared in public
channels.

This can be enabled by adding the following to the project settings
(`.zed/settings.json`):

```json
{
  "prevent_sharing_in_public_channels": true
}
```

This will then disable the "Share" button when not in a private channel:

<img width="380" height="115" alt="Screenshot 2025-10-28 at 2 28 10 PM"
src="https://github.com/user-attachments/assets/6761ac34-c0d5-4451-a443-adf7a1c42bcd"
/>

Release Notes:

- collaboration: Added a `prevent_sharing_in_public_channels` project
setting for preventing projects from being shared in public channels.
2025-10-28 18:48:07 +00:00
Bennet Fenner
1f5101d9fd agent_ui: Trim whitespace when submitting message (#41391)
Closes #41017

Release Notes:

- N/A
2025-10-28 19:02:39 +01:00
Christian Durán Carvajal
37540d1ff7 agent: Only include tool guidance in system prompt when profile has tools enabled (#40413)
Zed was previously sending all of the tools to the LLM on the first
message of a conversation regardless of the selected agent profile. This
added extra context, and tended to create scenarios where the LLM would
attempt to use a tool and fail since it was not available.

To reproduce, create a new conversation where you ask the Minimal mode
which tools it has access to, or try to write to a file.

Release Notes:

- Fixed an issue in the agent where all tools would be presented as
available even when using the `Minimal` profile

---------

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-10-28 16:56:39 +00:00
Piotr Osiewicz
d9d24582bb lsp: Fix workspace diagnostics when registered statically (#41386)
Closes #41379

Release Notes:

- Fixed diagnostics for Ruff and Biome
2025-10-28 17:16:42 +01:00
Sushant Mishra
a4a2acfaa5 gpui: Set initial window title on Wayland (#36844)
Closes #36843 

Release Notes:

- Fixed: set the initial window title correctly on startup in Wayland.
2025-10-28 16:52:01 +01:00
Jason Lee
55554a8653 markdown_preview: Improve nested list item prefix style (#39606)
Release Notes:

- Improved nested list item prefix style for Markdown preview.


## Before

<img width="667" height="450" alt="SCR-20251006-rtis"
src="https://github.com/user-attachments/assets/439160c4-7982-463c-9017-268d47c42c0c"
/>

## After

<img width="739" height="440" alt="SCR-20251006-rzlb"
src="https://github.com/user-attachments/assets/f6c237d9-3ff0-4468-ae9c-6853c5c2946a"
/>

---------

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-10-28 16:44:05 +01:00
Paweł Kondzior
d00ad02b05 agent_ui: Add file name and line number to symbol completions (#40508)
Symbol completions in the context picker now show "SymbolName
filename.txt L123" instead of just "SymbolName". This helps distinguish
between symbols with the same name in different files.
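A minimal sketch of that label format, using the `build_symbol_label` helper name mentioned in this PR's changes (the signature is assumed; the real helper also handles highlight styling):

```rust
// Hypothetical shape of the label builder: "SymbolName filename.txt L123".
// Signature is assumed for illustration; the real helper also applies
// the directory-completion highlighting style mentioned above.
fn build_symbol_label(symbol: &str, file_name: &str, line: u32) -> String {
    format!("{symbol} {file_name} L{line}")
}
```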

## Motivation

I noticed that the message prompt editor only showed symbol names
without any context, making it hard to use in large projects with many
symbols sharing the same name. The inline prompt editor already includes
file context for symbol completions, so I brought that same UX
improvement to the message prompt editor. I've decided to match the
highlighting style with directory completion entries from the message
editor.

## Changes

- Extract file name from both InProject and OutsideProject symbol
locations
- Add `build_symbol_label` helper to format labels with file context
- Update test expectation for new label format

## Screenshots
Inline Prompt Editor
<img width="843" height="334" alt="Screenshot 2025-10-17 at 17 47 52"
src="https://github.com/user-attachments/assets/16752e1a-0b8c-4f58-a0b2-25ae5130f18c"
/>
Old Message Editor:
<img width="466" height="363" alt="Screenshot 2025-10-17 at 17 31 44"
src="https://github.com/user-attachments/assets/900659f3-0632-4dff-bc9a-ab2dad351964"
/>
New Message Editor:
<img width="482" height="359" alt="Screenshot 2025-10-17 at 17 31 23"
src="https://github.com/user-attachments/assets/bf382167-f88b-4f3d-ba3e-1e251a8df962"
/>


Release Notes:

- Added file names and line numbers to symbol completions in the agent
panel
2025-10-28 16:43:48 +01:00
David Kleingeld
233a1eb46e Open keymap editor from command palette entry (#40825)
Release Notes:

- Adds a footer to the command palette with buttons to add or change the
selected action's keybinding. Both open the keymap editor, though the
add button takes you directly to the modal for recording a new keybind.


https://github.com/user-attachments/assets/0ee6b91e-b1dd-4d7f-ad64-cc79689ceeb2

---------

Co-authored-by: Joseph T. Lyons <JosephTLyons@gmail.com>
Co-authored-by: Julia Ryan <juliaryan3.14@gmail.com>
Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-10-28 15:14:58 +00:00
Conrad Irwin
c656101862 Use gh-workflow for the run-bundling aspects of CI.yml (#41304)
To help make our GitHub Actions easier to understand, we're planning to
split the existing `ci.yml` into three separate workflows:

* run_bundling.yml (this PR)
* run_tests.yml 
* make_release.yml

To avoid the duplication that this might otherwise cause, we're planning
to write the workflows with gh-workflow, and use rust instead of
encoding logic in YAML conditions.

Release Notes:

- N/A

---------

Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-10-28 09:12:04 -06:00
Lionel Henry
3248a05406 Propagate Jupyter client errors (#40886)
Closes #40884

- Make IOPub task return a `Result`
- Create a monitoring task that watches over IOPub, Control, Routing and
Shell tasks.
- If any of these tasks fail, report the error with `kernel_errored()`
(which is already used to report process crashes)


https://github.com/user-attachments/assets/3125f6c7-099a-41ca-b668-fe694ecc68b9

This is not perfect. I did not have time to look into this, but:

- When such errors happen, the kernel should be shut down.
- The kernel should no longer appear as online in the UI

But at least the user is getting feedback on what went wrong.

Release Notes:

- Jupyter client errors are now surfaced in the UI (#40884)
2025-10-28 14:45:27 +00:00
Bennet Fenner
baaf87aa23 Fix unit_evals.yml (#41377)
Release Notes:

- N/A
2025-10-28 14:30:47 +00:00
Piotr Osiewicz
8991f58b97 ci: Bump target directory size limit for mac runners (#41375)
Release Notes:

- N/A
2025-10-28 14:12:03 +00:00
Bennet Fenner
2579f86bcd acp_thread: Fix @mention file path format (#41310)
After #38882 we were always including file/directory mentions as
`zed:///agent/file?path=a/b/c.rs`.
However, for most resource links (files/directories/symbols/selections)
we want to use a common format, so that ACP servers don't have to
implement custom handling for parsing `ResourceLink`s coming from Zed.

This is what it looks like now:
```
[@index.js](file:///Users/.../projects/reqwest/examples/wasm_github_fetch/index.js) 
[@wasm](file:///Users/.../projects/reqwest/src/wasm) 
[@Error](file:///Users/.../projects/reqwest/src/async_impl/client.rs?symbol=Error#L2661:2661) 
[@error.rs (23:27)](file:///Users/.../projects/reqwest/src/error.rs#L23:27) 
```

Release Notes:

- N/A

---------

Co-authored-by: Cole Miller <cole@zed.dev>
2025-10-28 15:06:19 +01:00
Kirill Bulatov
5423fafc83 Use proper inlay hint range when filtering out hints (#41363)
Follow-up of https://github.com/zed-industries/zed/pull/40183

Release Notes:

- N/A

---------

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-10-28 12:58:38 +00:00
Finn Evers
8a01e48339 ui: Properly update scrollbar track color (#41354)
Closes https://github.com/zed-industries/zed/issues/41334

This comes down to a caching issue.

Release Notes:

- Fixed an issue where the scrollbar track color would not update in
case the theme was changed.
2025-10-28 09:30:58 +00:00
Adir Shemesh
1b43217c05 Add a jetbrains-like Toggle All Docks action (#40567)
The current Jetbrains keymap has `ctrl-shift-f12` set to
`CloseAllDocks`. On Jetbrains IDEs this hotkey actually toggles the
docks, which is very convenient: You press it once to hide all docks and
just focus on the code, and then you can press it again to toggle your
docks right back to how they were. Unlike `CloseAllDocks`, a toggle
means the editor needs to remember the previous dock state, which
necessitated some code changes.
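The state round-trip described above can be sketched in a few lines; this is a hypothetical simplification (a plain `Vec<bool>` of open flags, not Zed's actual workspace/dock API):

```rust
/// Remembers which docks were open so a second toggle can restore them.
/// Hypothetical sketch; Zed's real dock state is richer than `Vec<bool>`.
struct DockToggle {
    saved: Option<Vec<bool>>, // one open/closed flag per dock
}

impl DockToggle {
    fn new() -> Self {
        Self { saved: None }
    }

    /// First call hides all docks and saves their state;
    /// the next call restores the saved state.
    fn toggle_all(&mut self, docks: &mut Vec<bool>) {
        match self.saved.take() {
            Some(previous) => *docks = previous, // restore prior layout
            None => {
                self.saved = Some(docks.clone()); // remember current layout
                docks.iter_mut().for_each(|open| *open = false); // hide all
            }
        }
    }
}

fn main() {
    let mut docks = vec![true, false, true];
    let mut toggle = DockToggle::new();
    toggle.toggle_all(&mut docks); // hide everything
    assert_eq!(docks, vec![false, false, false]);
    toggle.toggle_all(&mut docks); // restore the previous layout
    assert_eq!(docks, vec![true, false, true]);
}
```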

Release Notes:

- Added a `Toggle All Docks` editor action and updated the keymaps to
use it
2025-10-28 08:54:05 +00:00
Kirill Bulatov
1b6cde7032 Revert "Fix ESLint linebreak-style errors by preserving line endings in LSP communication (#38773)" (#41355)
This reverts commit 435eab6896.

This caused format-on-save to scroll to the bottom instead of keeping
the scroll position.

Release Notes:

- N/A
2025-10-28 08:45:02 +00:00
Finn Evers
bd0bcdb0ed Fix line number settings migration (#41351)
Follow-up to https://github.com/zed-industries/zed/pull/39268

Also updates the documentation.

Release Notes:

- N/A
2025-10-28 08:08:49 +00:00
Anthony Eid
2b5699117f editor: Fix calculate relative line number panic (#41352)
### Reproduction steps

1. Turn on relative line numbers
2. Start a debugging session and hit an active debug line
3. minimize Zed so the editor element with the active debug line has
zero visible rows

#### Before 

https://github.com/user-attachments/assets/57cc7a4d-478d-481a-8b70-f14c879bd858
#### After 

https://github.com/user-attachments/assets/19614104-f9aa-4b76-886b-1ad4a5985403

Release Notes:

- debugger: Fix a panic that could occur when minimizing Zed
2025-10-28 08:08:02 +00:00
John Tur
0857ddadc5 Always delete OpenConsole.exe on Windows uninstall (#41348)
By default, the uninstaller will only delete files that were written by
the original installer. When users upgrade Zed, these new
OpenConsole.exe files will have been written by auto_upgrade_helper, not
the installer. Force them to be deleted on uninstall, so they do not
hang around.

Release Notes:

- N/A
2025-10-28 07:14:56 +00:00
Coenen Benjamin
3e3618b3ff debugger: Add horizontal scrollbar for frame item and tooltip for variables (#41261)
Closes #40360 

I first tried to use a horizontal scrollbar for variables as well, but
since it's a list that can be collapsed it didn't feel natural, so I
ended up adding a tooltip that shows the full value of the variable when
you hover the item (cf. screenshots).




https://github.com/user-attachments/assets/70c4150d-b967-46b0-8720-82bbad9c9cca




https://github.com/user-attachments/assets/d0b52189-b090-4824-8eb7-2f455fa58b33



Release Notes:

- debugger: Added a horizontal scrollbar for frame items and a tooltip
for variables.

Signed-off-by: Benjamin <5719034+bnjjj@users.noreply.github.com>
2025-10-28 03:04:21 -04:00
Anthony Eid
2163580b16 Fix tab switcher spacing bug (#41329)
The tab switcher's render method calls each workspace item's
`Item::tab_content` function, which can return an element of variable
size. Because the tab switcher was using a uniform list under the hood,
this caused spacing issues when `tab_content` elements had different
sizes.

The fix is to change the picker to use a material list under the hood.

Release Notes:

- N/A
2025-10-28 03:00:55 -04:00
h-michaelson20
4778d61bdd Fix copy button not working for REPL error output (#40669)
## Description

Fixes the copy button functionality in REPL interactive mode error
output sections.

When executing Python code that produces errors in the REPL (e.g.,
`NameError`), the copy button in the error output section was
unresponsive. The stdout/stderr copy button worked correctly, but the
error traceback section copy button had no effect when clicked.

Fixes #40207

## Changes

Modified the following:
- `src/outputs.rs`: Fixed context issues in `render_output_controls` by
replacing `cx.listener()` with simple closures, and added a custom
button implementation for `ErrorOutput` that copies/opens the complete
error (name + message + traceback)
- `src/outputs/plain.rs`: Made the `full_text()` method public to allow
access from button handlers
- `src/outputs/user_error.rs`: Added a `Clone` derive to the `ErrorView`
struct and removed a couple of pieces of commented-out code

## Why This Matters

The copy button was clearly broken and it is useful to have for REPL
workflows. Users could potentially need to copy error messages for a
variety of reasons.

## Testing

See attached demo for proof that the fix is working as intended. (this
is my first ever commit, if there are additional test cases I need to
write or run, please let me know!)


https://github.com/user-attachments/assets/da158205-4119-47eb-a271-196ef8d196e4

Release Notes:

- Fixed copy button not working for REPL error output
2025-10-28 06:52:53 +00:00
Lukas Wirth
46c5d515bf recent_projects: Surface project opening errors to user (#41308)
Release Notes:

- N/A
2025-10-28 06:44:44 +00:00
Xiaobo Liu
73bd12ebbe editor: Optimize selection overlap checking (#41281)
Replace the binary search approach with a more efficient partition_point
method for checking selection overlaps. This eliminates the need to
collect and sort selection ranges separately, reducing memory allocation
and improving performance when handling multiple selections.
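The optimization described above relies on the fact that the selection ranges are sorted and non-overlapping, so a single `partition_point` lookup suffices; a minimal sketch over plain `Range<usize>` values (hypothetical simplified types, not Zed's actual selection API):

```rust
/// Check whether `candidate` overlaps any range in `ranges`, where
/// `ranges` is sorted by start position and contains no overlaps.
fn overlaps_any(ranges: &[std::ops::Range<usize>], candidate: &std::ops::Range<usize>) -> bool {
    // Index of the first range whose end is past the candidate's start.
    let i = ranges.partition_point(|r| r.end <= candidate.start);
    // Only that range needs checking: everything before it ends too
    // early, and everything after it starts at or beyond it.
    ranges.get(i).is_some_and(|r| r.start < candidate.end)
}

fn main() {
    let ranges = vec![0..3, 5..8, 10..12];
    assert!(overlaps_any(&ranges, &(2..4)));  // overlaps 0..3
    assert!(!overlaps_any(&ranges, &(3..5))); // falls in the gap
    assert!(overlaps_any(&ranges, &(7..11))); // overlaps 5..8
}
```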

Release Notes:

- N/A

Signed-off-by: Xiaobo Liu <cppcoffee@gmail.com>
2025-10-28 07:40:38 +01:00
versecafe
cc829e7fdb remote: 60 second timeout on initial connection (#41339)
Closes #41316

Release Notes:

- Fixes #41316 

> This keeps the 5-second heartbeat behavior once the connection is
made
2025-10-28 06:58:35 +01:00
Jakub Konka
fdf5bf7e6a remote: Support building x86_64-linux-musl proxy in nix-darwin (#41291)
This change adds two things to our remote server build
process:
1. It now checks whether all required tooling is installed before using it, installing it on demand if needed. This includes checks for `rustup` and `cargo-zigbuild` in your `PATH`.
2. Next, if `ZED_BUILD_REMOTE_SERVER` contains `musl` and `ZED_ZSTD_MUSL_LIB`
is set, we will pass its value (the path) to `cargo-zigbuild` as `-C
link-arg=-L{path}`.

Release Notes:

- N/A
2025-10-28 06:58:09 +01:00
John Tur
c94536a2d6 Fix Windows updater failing to copy OpenConsole.exe (#41338)
Release Notes:

- N/A
2025-10-28 05:10:57 +00:00
John Tur
9db474051b Reland Windows Arm64 builds in CI (#40855)
Release Notes:

- windows: Added builds for Arm64 architecture

---------

Co-authored-by: Julia Ryan <juliaryan3.14@gmail.com>
2025-10-27 23:59:34 -04:00
Lukas
1d0bb5a7a6 Make 'wrap selections in tag' work with line selection mode (#41030)
The `wrap selections in tag` action did not take `line_mode` into
account, which meant that when selecting lines with `shift-v`, the
start/end tags would be inserted in the middle of the selection (where
the cursor sits).


https://github.com/user-attachments/assets/a1cbf3da-d52a-42e2-aecf-1a7b6d1dbb32

This PR fixes this behaviour by checking if the selection uses line_mode
and then adjusting start and end points accordingly.
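The boundary adjustment can be sketched as a small function over (row, column) points; the types here are hypothetical simplifications, not the editor's real selection model:

```rust
/// Where to insert the open/close tags for a selection from `start` to
/// `end` (both `(row, column)`). In line mode the points snap to the
/// start of the first line and the end of the last line.
fn tag_insertion_points(
    start: (usize, usize),
    end: (usize, usize),
    line_mode: bool,
    line_len: &[usize], // length of each buffer line, indexed by row
) -> ((usize, usize), (usize, usize)) {
    if line_mode {
        ((start.0, 0), (end.0, line_len[end.0]))
    } else {
        (start, end)
    }
}

fn main() {
    let lines = [10, 20, 15]; // three lines of lengths 10, 20, 15
    // Line-mode selection with the cursor mid-line on rows 0 and 2:
    let (open, close) = tag_insertion_points((0, 4), (2, 7), true, &lines);
    assert_eq!(open, (0, 0));   // snapped to the start of the first line
    assert_eq!(close, (2, 15)); // snapped to the end of the last line
}
```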

NOTE: I looked into amending the test cases for this, but I am unsure
how to express line mode with range markers. I would appreciate some
guidance on this and then I am happy to add test cases.

After:


https://github.com/user-attachments/assets/a212c41f-b0db-4f50-866f-fced7bc677ca

Release Notes:

- Fixed `Editor: wrap selection in tags` when in vim visual line mode

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-10-28 03:07:41 +00:00
Josh Piasecki
6823847978 Add buffer_search_deployed key context (#41193)
Release Notes:

- Pane key context now includes 'buffer_search_deployed' identifier

The goal of this PR is to add a new identifier to the key context that
lets users target the state where the `BufferSearchBar` is deployed,
even if it is not focused.

Requested in #36930.

Same rationale as #40454: this will allow users to make more flexible
keybindings by including some additional information higher up the key
context tree.

I thought adding this context to `Pane` seemed more appropriate than
`Editor` since `Terminal` also has a `BufferSearchBar`; however, I ran
into some import issues between `BufferSearchBar`, `Search`, `Pane`, and
`Workspace` which made it difficult to add this context directly inside
`Pane`'s render function.

Instead, I added a new method called `contributes_context` to
`ToolbarItem`, which allows any toolbar item to add additional context
at the `Pane` level; this feels like it might come in handy.

Here are some screenshots of the context being displayed in the editor
and the terminal:

<img width="1653" height="1051" alt="Screenshot 2025-10-25 at 14 34 03"
src="https://github.com/user-attachments/assets/21c5b07a-8d36-4e0b-ad09-378b12d2ea38"
/>

<img width="1444" height="1167" alt="Screenshot 2025-10-25 at 12 32 21"
src="https://github.com/user-attachments/assets/86afe72f-b238-43cd-8230-9cb59fb93b2c"
/>
2025-10-27 20:37:37 -06:00
Conrad Irwin
3a7bdf43f5 Fix unwrap in branch diff (#41330)
Closes #ISSUE

Release Notes:

- N/A
2025-10-28 02:24:54 +00:00
Thomas Heartman
d5e297147f Support relative line number on wrapped lines (#39268)
**Problem:** Current relative line numbering creates a mismatch with
vim-style navigation when soft wrap is enabled. Users must mentally
calculate whether target lines are wrapped segments or logical lines,
making `<n>j/k` navigation unreliable and cognitively demanding.

**How things work today:**
- Real line navigation (`j/k` moves by logical lines): Requires
determining if visible lines are wrapped segments before jumping. Can't
jump to wrapped lines directly.
- Display line navigation (`j/k` moves by display rows): Line numbers
don't correspond to actual row distances for multi-line jumps.

**Proposed solution:** Count and number each display line (including
wrapped segments) for relative numbering. This creates direct
visual-to-navigational correspondence where the relative number shown
always matches the `<n>j/k` distance needed.

**Benefits:**
- Eliminates mental overhead of distinguishing wrapped vs. logical lines
- Makes relative line numbers consistently actionable regardless of wrap
state
- Preserves intuitive "what you see is what you navigate" principle
- Maintains vim workflow efficiency in narrow window scenarios
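Reduced to its essence, the proposed numbering is just a display-row distance: every display row (wrapped segments included) is counted, so the shown number always matches the `<n>j/k` count when `j/k` move by display lines. A minimal sketch under that assumption:

```rust
/// Relative number for each display row: the distance in display rows
/// from the cursor's row. Wrapped segments are ordinary rows here.
fn relative_numbers(row_count: usize, cursor_row: usize) -> Vec<usize> {
    (0..row_count)
        .map(|row| row.abs_diff(cursor_row))
        .collect()
}

fn main() {
    // Five display rows (wrapped segments included), cursor on row 2.
    assert_eq!(relative_numbers(5, 2), vec![2, 1, 0, 1, 2]);
}
```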

Also explained and discussed in
https://github.com/zed-industries/zed/discussions/25733.

Release Notes:

- Added support for counting wrapped lines as relative lines and for
displaying line numbers for wrapped segments. Changes
`relative_line_numbers` from a boolean to an enum: `enabled`,
`disabled`, or `wrapped`.

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-10-27 20:20:45 -06:00
Lukas Wirth
1c4923e1c8 gpui: Add a timeout to #[gpui::test] tests (#41303)
Release Notes:

- N/A
2025-10-28 01:37:41 +00:00
Agus Zubiaga
ee80ba6693 zeta2: LLM-based context gathering (#41326)
Release Notes:

- N/A

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
Co-authored-by: Max Brunsfeld <max@zed.dev>
2025-10-27 22:54:42 +00:00
Abdelhakim Qbaich
fd306c97f4 Fix default settings entry for basedpyright (#40812)
If you set `{"basedpyright": {"analysis": {"typeCheckingMode":
"off"}}}`, you will notice that it doesn't actually work, but
`{"basedpyright.analysis": {"typeCheckingMode": "off"}}` does.

This PR changes how the default is set.

Release Notes:

- N/A
2025-10-27 22:57:04 +01:00
Anthony Eid
b3483a157c settings_ui: Fix tabbing in settings UI main page content (#41209)
Tabbing into the main page would move focus to the navigation panel
instead of auto-scrolling. This PR fixes that bug.


Release Notes:

- N/A
2025-10-27 17:30:34 -04:00
Conrad Irwin
ac66e912d5 Don't upload symbols to DO anymore (#41317)
Sentry now symbolicates stack traces, no need to make our builds slower

Release Notes:

- N/A
2025-10-27 15:22:53 -06:00
Anthony Eid
5e37a7b78c Fix shell welcome prompt showing up in Zed's stdout (#41311)
The bug occurred because `smol::process::Command::from(_)` doesn't
carry over the stdio configuration. Moving the stdio configuration to
after the conversion to a `smol` command fixed the issue.

I added the `std::process::Command::{stdout, stderr, stdin}` functions
to our disallowed list in clippy to prevent any bugs like this appearing
in the future.
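The fixed pattern (configure stdio on the command object that is actually spawned) can be illustrated with `std::process` alone; a hedged sketch assuming a Unix-like environment where `echo` is on `PATH`:

```rust
use std::process::{Command, Stdio};

/// Run `echo <msg>` with stdout captured and stderr discarded,
/// returning the trimmed output. The stdio settings are applied on the
/// command that gets spawned, mirroring the fix above where they had to
/// be applied after the conversion to a `smol` command.
fn captured_echo(msg: &str) -> String {
    let output = Command::new("echo")
        .arg(msg)
        .stdout(Stdio::piped()) // capture instead of inheriting our stdout
        .stderr(Stdio::null())
        .output()
        .expect("failed to run echo");
    String::from_utf8_lossy(&output.stdout).trim().to_string()
}

fn main() {
    assert_eq!(captured_echo("hello"), "hello");
}
```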

Release Notes:

- N/A
2025-10-27 20:04:36 +00:00
Cameron Mcloughlin
00278f43bb Add Rust convenience Tree-sitter injections for common crates (#41258) 2025-10-27 19:58:04 +00:00
Mikayla Maki
ac3d2a338b Tune the focus-visible heuristics a bit (#41314)
This isn't quite right yet, as a proper solution would remember the
input modality at the moment of focus change, rather than at painting
time. But this gets us close enough for now.

Release Notes:

- N/A
2025-10-27 19:53:53 +00:00
Conrad Irwin
58f07ff709 Try gh-workflow (#41155)
Experimenting with not writing YAML by hand...

Release Notes:

- N/A
2025-10-27 13:39:01 -06:00
Danilo Leal
db0f7a8b23 docs: Mention the settings and keymap UIs more prominently (#41302)
Release Notes:

- N/A
2025-10-27 15:17:11 -03:00
Marshall Bowers
f5ad4c8bd9 Remove PostgREST (#41299)
This PR removes the PostgREST containers and deployments, as we're no
longer using it.

Release Notes:

- N/A
2025-10-27 13:27:59 -04:00
Piotr Osiewicz
172984978f collab: Add 'Copy channel notes link' to right click menu on channels (#41298)
Release Notes:

- Added a "Copy Channel Notes Link" action to right-click menu of Zed
channels.
2025-10-27 17:00:36 +00:00
Mohin Hasin Rabbi
ba26ca4aee docs: Document per-release channel configuration (#40833)
## Summary
- Document the `stable`/`preview`/`nightly` top-level keys that let
users scope settings overrides per release channel.
- Provide an example `settings.json` snippet and call out that overrides
replace array values rather than merging them.
- Mention that UI-driven changes edit the root config so per-channel
blocks might need manual updates.

## Testing
- Not run (docs only).

Fixes #40458.

Release Notes:

- N/A

---------

Co-authored-by: MrSubidubi <finn@zed.dev>
2025-10-27 16:50:32 +00:00
Finn Evers
1ae8e0c53a keymap_editor: Clear action query when showing matching keybindings (#41296)
Closes https://github.com/zed-industries/zed/issues/41050

Release Notes:

- Fixed an issue where showing matching keystrokes in the keybind editor
modal would not clear an active text query.
2025-10-27 17:49:13 +01:00
pedrxd
a70f80df95 Add support for changing the Codestral endpoint (#41116)
```json
  "edit_predictions": {
    "codestral": {
      "api_url": "https://codestral.mistral.ai",
      "model": "codestral-latest",
      "max_tokens": 150
    }
  },
```

Release Notes:

- Added support for changing the Codestral endpoint. This was discussed
at #34371.

---------

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-10-27 16:00:27 +00:00
Dino
821a4880bd cli: Use --wait to prefer focused window (#41051)
Introduce a new `prefer_focused_window` field to the
`workspace::OpenOptions` struct that, when provided, will make it so
that Zed opens the provided path in the currently focused window.

This will now automatically be set to true when the `--wait` flag is
used with the CLI.

Closes #40551 

Release Notes:

- Improved the `--wait` flag in Zed's CLI so that it always opens the
provided file in the currently focused window

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-10-27 15:58:41 +00:00
Danilo Leal
7f17d4b61d Add Tailwind CSS and Ruff to built-in features list (#41285)
Closes https://github.com/zed-industries/zed/issues/41168

This PR adds both Tailwind CSS and Ruff (linter for Python) as built-in
features; a banner mentioning this should show up now for these two when
searching for them in the extensions UI.

There will also be a corresponding zed.dev site PR adding a "Ruff is
built-in" card to the zed.dev/extensions page.

Release Notes:

- N/A
2025-10-27 12:38:56 -03:00
Bennet Fenner
ebf4a23b18 Use paths::external_agents_dir (#41286)
We were not actually using `paths::agent_servers` and were manually
constructing the path to the `external_agents` folder in a few places.

Release Notes:

- N/A
2025-10-27 15:24:44 +00:00
Anthony
4ba9259890 Start work to add container queries to gpui 2025-08-18 04:03:03 -04:00
262 changed files with 12496 additions and 7179 deletions

View File

@@ -35,10 +35,8 @@ body:
attributes:
label: If applicable, attach your `Zed.log` file to this issue.
description: |
macOS: `~/Library/Logs/Zed/Zed.log`
Windows: `C:\Users\YOU\AppData\Local\Zed\logs\Zed.log`
Linux: `~/.local/share/zed/logs/Zed.log` or $XDG_DATA_HOME
If you only need the most recent lines, you can run the `zed: open log` command palette action to see the last 1000.
From the command palette, run `zed: open log` to see the last 1000 lines.
Or run `zed: reveal log in file manager` to reveal the log file itself.
value: |
<details><summary>Zed.log</summary>

View File

@@ -2,16 +2,9 @@ name: CI
on:
push:
branches:
- main
- "v[0-9]+.[0-9]+.x"
tags:
- "v*"
pull_request:
branches:
- "**"
concurrency:
# Allow only one workflow per any non-`main` branch.
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
@@ -273,15 +266,12 @@ jobs:
uses: ./.github/actions/run_tests
- name: Build collab
# we should do this on a linux x86 machine
run: cargo build -p collab
- name: Build other binaries and features
run: |
cargo build --workspace --bins --all-features
cargo check -p gpui --features "macos-blade"
cargo check -p workspace
cargo build -p remote_server
cargo check -p gpui --examples
cargo build --workspace --bins --examples
# Since the macOS runners are stateful, we need to remove the config file to prevent potential bugs.
- name: Clean CI config file
@@ -516,9 +506,7 @@ jobs:
name: Create a macOS bundle
runs-on:
- self-mini-macos
if: |
( startsWith(github.ref, 'refs/tags/v')
|| contains(github.event.pull_request.labels.*.name, 'run-bundling') )
if: startsWith(github.ref, 'refs/tags/v')
needs: [macos_tests]
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
@@ -549,16 +537,14 @@ jobs:
ref: ${{ github.ref }}
- name: Limit target directory size
run: script/clear-target-dir-if-larger-than 100
run: script/clear-target-dir-if-larger-than 300
- name: Determine version and release channel
if: ${{ startsWith(github.ref, 'refs/tags/v') }}
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
- name: Draft release notes
if: ${{ startsWith(github.ref, 'refs/tags/v') }}
run: |
mkdir -p target/
# Ignore any errors that occur while drafting release notes to not fail the build.
@@ -567,29 +553,17 @@ jobs:
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Create macOS app bundle
run: script/bundle-mac
- name: Create macOS app bundle (aarch64)
run: script/bundle-mac aarch64-apple-darwin
- name: Create macOS app bundle (x64)
run: script/bundle-mac x86_64-apple-darwin
- name: Rename binaries
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
run: |
mv target/aarch64-apple-darwin/release/Zed.dmg target/aarch64-apple-darwin/release/Zed-aarch64.dmg
mv target/x86_64-apple-darwin/release/Zed.dmg target/x86_64-apple-darwin/release/Zed-x86_64.dmg
- name: Upload app bundle (aarch64) to workflow run if main branch or specific label
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed-aarch64.dmg
- name: Upload app bundle (x86_64) to workflow run if main branch or specific label
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg
path: target/x86_64-apple-darwin/release/Zed-x86_64.dmg
- uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
name: Upload app bundle to release
if: ${{ env.RELEASE_CHANNEL == 'preview' || env.RELEASE_CHANNEL == 'stable' }}
@@ -610,8 +584,7 @@ jobs:
runs-on:
- namespace-profile-16x32-ubuntu-2004 # ubuntu 20.04 for minimal glibc
if: |
( startsWith(github.ref, 'refs/tags/v')
|| contains(github.event.pull_request.labels.*.name, 'run-bundling') )
( startsWith(github.ref, 'refs/tags/v') )
needs: [linux_tests]
steps:
- name: Checkout repo
@@ -628,7 +601,6 @@ jobs:
token: ${{ SECRETS.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
if: startsWith(github.ref, 'refs/tags/v')
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
@@ -636,23 +608,8 @@ jobs:
- name: Create Linux .tar.gz bundle
run: script/bundle-linux
- name: Upload Artifact to Workflow - zed (run-bundling)
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4
if: contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
- name: Upload Artifact to Workflow - zed-remote-server (run-bundling)
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4
if: contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.gz
path: target/zed-remote-server-linux-x86_64.gz
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
if: ${{ !(contains(github.event.pull_request.labels.*.name, 'run-bundling')) }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
@@ -669,7 +626,6 @@ jobs:
- namespace-profile-8x32-ubuntu-2004-arm-m4 # ubuntu 20.04 for minimal glibc
if: |
startsWith(github.ref, 'refs/tags/v')
|| contains(github.event.pull_request.labels.*.name, 'run-bundling')
needs: [linux_tests]
steps:
- name: Checkout repo
@@ -686,7 +642,6 @@ jobs:
token: ${{ SECRETS.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
if: startsWith(github.ref, 'refs/tags/v')
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
@@ -694,23 +649,8 @@ jobs:
- name: Create and upload Linux .tar.gz bundles
run: script/bundle-linux
- name: Upload Artifact to Workflow - zed (run-bundling)
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4
if: contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
- name: Upload Artifact to Workflow - zed-remote-server (run-bundling)
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4
if: contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.gz
path: target/zed-remote-server-linux-aarch64.gz
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
if: ${{ !(contains(github.event.pull_request.labels.*.name, 'run-bundling')) }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
@@ -724,8 +664,7 @@ jobs:
timeout-minutes: 60
runs-on: github-8vcpu-ubuntu-2404
if: |
false && ( startsWith(github.ref, 'refs/tags/v')
|| contains(github.event.pull_request.labels.*.name, 'run-bundling') )
false && ( startsWith(github.ref, 'refs/tags/v') )
needs: [linux_tests]
name: Build Zed on FreeBSD
steps:
@@ -776,24 +715,19 @@ jobs:
nix-build:
name: Build with Nix
uses: ./.github/workflows/nix.yml
uses: ./.github/workflows/nix_build.yml
needs: [job_spec]
if: github.repository_owner == 'zed-industries' &&
(contains(github.event.pull_request.labels.*.name, 'run-nix') ||
needs.job_spec.outputs.run_nix == 'true')
secrets: inherit
with:
flake-output: debug
# excludes the final package to only cache dependencies
cachix-filter: "-zed-editor-[0-9.]*-nightly"
bundle-windows-x64:
timeout-minutes: 120
name: Create a Windows installer
name: Create a Windows installer for x86_64
runs-on: [self-32vcpu-windows-2022]
if: |
( startsWith(github.ref, 'refs/tags/v')
|| contains(github.event.pull_request.labels.*.name, 'run-bundling') )
( startsWith(github.ref, 'refs/tags/v') )
needs: [windows_tests]
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
@@ -818,7 +752,6 @@ jobs:
- name: Determine version and release channel
working-directory: ${{ env.ZED_WORKSPACE }}
if: ${{ startsWith(github.ref, 'refs/tags/v') }}
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel.ps1
@@ -827,16 +760,55 @@ jobs:
working-directory: ${{ env.ZED_WORKSPACE }}
run: script/bundle-windows.ps1
- name: Upload installer (x86_64) to Workflow - zed (run-bundling)
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4
if: contains(github.event.pull_request.labels.*.name, 'run-bundling')
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.exe
path: ${{ env.SETUP_PATH }}
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: ${{ env.SETUP_PATH }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
bundle-windows-aarch64:
timeout-minutes: 120
name: Create a Windows installer for aarch64
runs-on: [self-32vcpu-windows-2022]
if: |
( startsWith(github.ref, 'refs/tags/v') )
needs: [windows_tests]
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: "http://timestamp.acs.microsoft.com"
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ SECRETS.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
working-directory: ${{ env.ZED_WORKSPACE }}
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel.ps1
- name: Build Zed installer
working-directory: ${{ env.ZED_WORKSPACE }}
run: script/bundle-windows.ps1 -Architecture aarch64
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
if: ${{ !(contains(github.event.pull_request.labels.*.name, 'run-bundling')) }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
@@ -850,7 +822,7 @@ jobs:
false
&& startsWith(github.ref, 'refs/tags/v')
&& endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre')
needs: [bundle-mac, bundle-linux-x86_x64, bundle-linux-aarch64, bundle-windows-x64]
needs: [bundle-mac, bundle-linux-x86_x64, bundle-linux-aarch64, bundle-windows-x64, bundle-windows-aarch64]
runs-on:
- self-mini-macos
steps:

View File

@@ -1,42 +1,40 @@
name: Danger
# Generated from xtask::workflows::danger
# Rebuild with `cargo xtask workflows`.
name: danger
on:
pull_request:
branches: [main]
types:
- opened
- synchronize
- reopened
- edited
- opened
- synchronize
- reopened
- edited
branches:
- main
jobs:
danger:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2 # v4.0.0
with:
version: 9
- name: Setup Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
with:
node-version: "20"
cache: "pnpm"
cache-dependency-path: "script/danger/pnpm-lock.yaml"
- run: pnpm install --dir script/danger
- name: Run Danger
run: pnpm run --dir script/danger danger ci
env:
# This GitHub token is not used, but the value needs to be here to prevent
# Danger from throwing an error.
GITHUB_TOKEN: "not_a_real_token"
# All requests are instead proxied through an instance of
# https://github.com/maxdeviant/danger-proxy that allows Danger to securely
# authenticate with GitHub while still being able to run on PRs from forks.
DANGER_GITHUB_API_BASE_URL: "https://danger-proxy.fly.dev/github"
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_pnpm
uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2
with:
version: '9'
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
cache: pnpm
cache-dependency-path: script/danger/pnpm-lock.yaml
- name: danger::install_deps
run: pnpm install --dir script/danger
shell: bash -euxo pipefail {0}
- name: danger::run
run: pnpm run --dir script/danger danger ci
shell: bash -euxo pipefail {0}
env:
GITHUB_TOKEN: not_a_real_token
DANGER_GITHUB_API_BASE_URL: https://danger-proxy.fly.dev/github

View File

@@ -49,7 +49,7 @@ jobs:
- name: Limit target directory size
shell: bash -euxo pipefail {0}
run: script/clear-target-dir-if-larger-than 100
run: script/clear-target-dir-if-larger-than 300
- name: Run tests
shell: bash -euxo pipefail {0}

View File

@@ -1,69 +0,0 @@
name: "Nix build"
on:
workflow_call:
inputs:
flake-output:
type: string
default: "default"
cachix-filter:
type: string
default: ""
jobs:
nix-build:
timeout-minutes: 60
name: (${{ matrix.system.os }}) Nix Build
continue-on-error: true # TODO: remove when we want this to start blocking CI
strategy:
fail-fast: false
matrix:
system:
- os: x86 Linux
runner: namespace-profile-16x32-ubuntu-2204
install_nix: true
- os: arm Mac
runner: [macOS, ARM64, test]
install_nix: false
if: github.repository_owner == 'zed-industries'
runs-on: ${{ matrix.system.runner }}
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: 1 # breaks the livekit rust sdk examples which we don't actually depend on
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
# on our macs we manually install nix. for some reason the cachix action is running
# under a non-login /bin/bash shell which doesn't source the proper script to add the
# nix profile to PATH, so we manually add them here
- name: Set path
if: ${{ ! matrix.system.install_nix }}
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
- uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f # v31
if: ${{ matrix.system.install_nix }}
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad # v16
with:
name: zed
authToken: "${{ secrets.CACHIX_AUTH_TOKEN }}"
pushFilter: "${{ inputs.cachix-filter }}"
cachixArgs: "-v"
- run: nix build .#${{ inputs.flake-output }} -L --accept-flake-config
- name: Limit /nix/store to 50GB on macs
if: ${{ ! matrix.system.install_nix }}
run: |
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
fi
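
The "Limit /nix/store to 50GB" step above runs garbage collection only when the store exceeds a size threshold. A minimal sketch of that check, with the size passed in as a parameter (a stand-in for `$(du -sm /nix/store | cut -f1)`) so the logic is testable without a real Nix store:

```shell
# Return success (0) when the store size exceeds the limit, i.e. when
# garbage collection should run. Sizes are in megabytes, matching `du -sm`.
store_needs_gc() {
  store_mb="$1"
  limit_mb="${2:-50000}"   # 50000 MB ~= the 50GB cap in the workflow
  [ "$store_mb" -gt "$limit_mb" ]
}

# In CI this would be: store_needs_gc "$(du -sm /nix/store | cut -f1)"
if store_needs_gc 61440 50000; then
  echo "would run: nix-collect-garbage -d"
fi
```

The `|| true` on the real `nix-collect-garbage -d` call keeps a failed collection from failing the job, since the step is best-effort housekeeping.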

.github/workflows/nix_build.yml (new file, 97 lines)

@@ -0,0 +1,97 @@
# Generated from xtask::workflows::nix_build
# Rebuild with `cargo xtask workflows`.
name: nix_build
env:
CARGO_TERM_COLOR: always
RUST_BACKTRACE: '1'
CARGO_INCREMENTAL: '0'
on:
pull_request:
branches:
- '**'
paths:
- nix/**
- flake.*
- Cargo.*
- rust-toolchain.toml
- .cargo/config.toml
push:
branches:
- main
- v[0-9]+.[0-9]+.x
paths:
- nix/**
- flake.*
- Cargo.*
- rust-toolchain.toml
- .cargo/config.toml
workflow_call: {}
jobs:
build_nix_linux_x86_64:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-32x64-ubuntu-2004
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::install_nix
uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- name: nix_build::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
build_nix_mac_aarch64:
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::set_path
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: nix_build::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
- name: nix_build::limit_store
run: |-
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
fi
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
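
The `concurrency.group` expression above can be read as a small conditional: on `main` the group key embeds the commit SHA (every push gets its own group, so nothing is cancelled), while on any other ref the fixed suffix `anysha` makes newer runs cancel older ones for the same branch. A hedged sketch of how the key resolves, with the workflow name, ref, and SHA as assumed inputs:

```shell
# Emulates: ${{ github.workflow }}-${{ github.ref_name }}-
#           ${{ github.ref_name == 'main' && github.sha || 'anysha' }}
concurrency_group() {
  workflow="$1"
  ref_name="$2"
  sha="$3"
  if [ "$ref_name" = "main" ]; then
    suffix="$sha"      # per-commit group: pushes to main never cancel each other
  else
    suffix="anysha"    # shared group: a newer PR/branch run cancels the previous one
  fi
  echo "${workflow}-${ref_name}-${suffix}"
}

concurrency_group nix_build main 19265bc      # -> nix_build-main-19265bc
concurrency_group nix_build my-branch 19265bc # -> nix_build-my-branch-anysha
```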


@@ -1,93 +1,113 @@
name: Release Nightly
on:
schedule:
# Fire every day at 7:00am UTC (Roughly before EU workday and after US workday)
- cron: "0 7 * * *"
push:
tags:
- "nightly"
# Generated from xtask::workflows::release_nightly
# Rebuild with `cargo xtask workflows`.
name: release_nightly
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: 0
RUST_BACKTRACE: 1
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
on:
push:
tags:
- nightly
schedule:
- cron: 0 7 * * *
jobs:
style:
timeout-minutes: 60
name: Check formatting and Clippy lints
check_style:
if: github.repository_owner == 'zed-industries'
runs-on:
- self-hosted
- macOS
runs-on: self-mini-macos
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
fetch-depth: 0
- name: Run style checks
uses: ./.github/actions/check_style
- name: Run clippy
run: ./script/clippy
tests:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
fetch-depth: 0
- name: steps::cargo_fmt
run: cargo fmt --all -- --check
shell: bash -euxo pipefail {0}
- name: ./script/clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
timeout-minutes: 60
name: Run tests
run_tests_mac:
if: github.repository_owner == 'zed-industries'
runs-on:
- self-hosted
- macOS
needs: style
runs-on: self-mini-macos
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Run tests
uses: ./.github/actions/run_tests
windows-tests:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
name: Run tests on Windows
run_tests_windows:
if: github.repository_owner == 'zed-industries'
runs-on: [self-32vcpu-windows-2022]
runs-on: self-32vcpu-windows-2022
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Configure CI
run: |
New-Item -ItemType Directory -Path "./../.cargo" -Force
Copy-Item -Path "./.cargo/ci-config.toml" -Destination "./../.cargo/config.toml"
- name: Run tests
uses: ./.github/actions/run_tests_windows
- name: Limit target directory size
run: ./script/clear-target-dir-if-larger-than.ps1 1024
- name: Clean CI config file
if: always()
run: Remove-Item -Recurse -Path "./../.cargo" -Force -ErrorAction SilentlyContinue
bundle-mac:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
New-Item -ItemType Directory -Path "./../.cargo" -Force
Copy-Item -Path "./.cargo/ci-config.toml" -Destination "./../.cargo/config.toml"
shell: pwsh
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy.ps1
shell: pwsh
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: pwsh
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than.ps1 250
shell: pwsh
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: pwsh
- name: steps::cleanup_cargo_config
if: always()
run: |
Remove-Item -Recurse -Path "./../.cargo" -Force -ErrorAction SilentlyContinue
shell: pwsh
timeout-minutes: 60
name: Create a macOS bundle
bundle_mac_nightly_x86_64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on:
- self-mini-macos
needs: tests
runs-on: self-mini-macos
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
@@ -95,161 +115,162 @@ jobs:
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: Install Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
with:
node-version: "18"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Set release channel to nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ SECRETS.SENTRY_AUTH_TOKEN }}
- name: Create macOS app bundle
run: script/bundle-mac
- name: Upload Zed Nightly
run: script/upload-nightly macos
bundle-linux-x86:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: release_nightly::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
shell: bash -euxo pipefail {0}
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly macos x86_64
shell: bash -euxo pipefail {0}
timeout-minutes: 60
name: Create a Linux *.tar.gz bundle for x86
bundle_mac_nightly_aarch64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on:
- namespace-profile-16x32-ubuntu-2004 # ubuntu 20.04 for minimal glibc
needs: tests
runs-on: self-mini-macos
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Install Linux dependencies
run: ./script/linux && ./script/install-mold 2.34.0
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ SECRETS.SENTRY_AUTH_TOKEN }}
- name: Limit target directory size
run: script/clear-target-dir-if-larger-than 100
- name: Set release channel to nightly
run: |
set -euo pipefail
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
- name: Create Linux .tar.gz bundle
run: script/bundle-linux
- name: Upload Zed Nightly
run: script/upload-nightly linux-targz
bundle-linux-arm:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: release_nightly::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly macos aarch64
shell: bash -euxo pipefail {0}
timeout-minutes: 60
name: Create a Linux *.tar.gz bundle for ARM
bundle_linux_nightly_x86_64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on:
- namespace-profile-8x32-ubuntu-2004-arm-m4 # ubuntu 20.04 for minimal glibc
needs: tests
runs-on: namespace-profile-32x64-ubuntu-2004
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Install Linux dependencies
run: ./script/linux
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ SECRETS.SENTRY_AUTH_TOKEN }}
- name: Limit target directory size
run: script/clear-target-dir-if-larger-than 100
- name: Set release channel to nightly
run: |
set -euo pipefail
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
- name: Create Linux .tar.gz bundle
run: script/bundle-linux
- name: Upload Zed Nightly
run: script/upload-nightly linux-targz
freebsd:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::add_rust_to_path
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: ./script/linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: ./script/install-mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 100
shell: bash -euxo pipefail {0}
- name: release_nightly::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly linux-targz x86_64
shell: bash -euxo pipefail {0}
timeout-minutes: 60
if: false && github.repository_owner == 'zed-industries'
runs-on: github-8vcpu-ubuntu-2404
needs: tests
name: Build Zed on FreeBSD
steps:
- uses: actions/checkout@v4
- name: Build FreeBSD remote-server
id: freebsd-build
uses: vmactions/freebsd-vm@c3ae29a132c8ef1924775414107a97cac042aad5 # v1.2.0
with:
# envs: "MYTOKEN MYTOKEN2"
usesh: true
release: 13.5
copyback: true
prepare: |
pkg install -y \
bash curl jq git \
rustup-init cmake-core llvm-devel-lite pkgconf protobuf # libx11 alsa-lib rust-bindgen-cli
run: |
freebsd-version
sysctl hw.model
sysctl hw.ncpu
sysctl hw.physmem
sysctl hw.usermem
git config --global --add safe.directory /home/runner/work/zed/zed
rustup-init --profile minimal --default-toolchain none -y
. "$HOME/.cargo/env"
./script/bundle-freebsd
mkdir -p out/
mv "target/zed-remote-server-freebsd-x86_64.gz" out/
rm -rf target/
cargo clean
- name: Upload Zed Nightly
run: script/upload-nightly freebsd
bundle-nix:
name: Build and cache Nix package
needs: tests
secrets: inherit
uses: ./.github/workflows/nix.yml
bundle-windows-x64:
timeout-minutes: 60
name: Create a Windows installer
bundle_linux_nightly_aarch64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on: [self-32vcpu-windows-2022]
needs: windows-tests
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::add_rust_to_path
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: ./script/linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 100
shell: bash -euxo pipefail {0}
- name: release_nightly::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly linux-targz aarch64
shell: bash -euxo pipefail {0}
timeout-minutes: 60
bundle_windows_nightly_x86_64:
needs:
- check_style
- run_tests_windows
if: github.repository_owner == 'zed-industries'
runs-on: self-32vcpu-windows-2022
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
@@ -259,65 +280,177 @@ jobs:
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: "http://timestamp.acs.microsoft.com"
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Set release channel to nightly
working-directory: ${{ env.ZED_WORKSPACE }}
run: |
$ErrorActionPreference = "Stop"
$version = git rev-parse --short HEAD
Write-Host "Publishing version: $version on release channel nightly"
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ SECRETS.SENTRY_AUTH_TOKEN }}
- name: Build Zed installer
working-directory: ${{ env.ZED_WORKSPACE }}
run: script/bundle-windows.ps1
- name: Upload Zed Nightly
working-directory: ${{ env.ZED_WORKSPACE }}
run: script/upload-nightly.ps1 windows
update-nightly-tag:
name: Update nightly tag
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::set_release_channel_to_nightly
run: |
$ErrorActionPreference = "Stop"
$version = git rev-parse --short HEAD
Write-Host "Publishing version: $version on release channel nightly"
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::build_zed_installer
run: script/bundle-windows.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::upload_zed_nightly_windows
run: script/upload-nightly.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
timeout-minutes: 60
bundle_windows_nightly_aarch64:
needs:
- check_style
- run_tests_windows
if: github.repository_owner == 'zed-industries'
runs-on: self-32vcpu-windows-2022
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::set_release_channel_to_nightly
run: |
$ErrorActionPreference = "Stop"
$version = git rev-parse --short HEAD
Write-Host "Publishing version: $version on release channel nightly"
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::build_zed_installer
run: script/bundle-windows.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::upload_zed_nightly_windows
run: script/upload-nightly.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
timeout-minutes: 60
build_nix_linux_x86_64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-32x64-ubuntu-2004
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::install_nix
uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- name: nix_build::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
- name: nix_build::build
run: nix build .#default -L --accept-flake-config
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
build_nix_mac_aarch64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::set_path
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: nix_build::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
- name: nix_build::build
run: nix build .#default -L --accept-flake-config
shell: bash -euxo pipefail {0}
- name: nix_build::limit_store
run: |-
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
fi
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
update_nightly_tag:
needs:
- bundle_mac_nightly_x86_64
- bundle_mac_nightly_aarch64
- bundle_linux_nightly_x86_64
- bundle_linux_nightly_aarch64
- bundle_windows_nightly_x86_64
- bundle_windows_nightly_aarch64
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
needs:
- bundle-mac
- bundle-linux-x86
- bundle-linux-arm
- bundle-windows-x64
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
fetch-depth: 0
- name: Update nightly tag
run: |
if [ "$(git rev-parse nightly)" = "$(git rev-parse HEAD)" ]; then
echo "Nightly tag already points to current commit. Skipping tagging."
exit 0
fi
git config user.name github-actions
git config user.email github-actions@github.com
git tag -f nightly
git push origin nightly --force
- name: Create Sentry release
uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c # v3
env:
SENTRY_ORG: zed-dev
SENTRY_PROJECT: zed
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
with:
environment: production
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
fetch-depth: 0
- name: release_nightly::update_nightly_tag
run: |
if [ "$(git rev-parse nightly)" = "$(git rev-parse HEAD)" ]; then
echo "Nightly tag already points to current commit. Skipping tagging."
exit 0
fi
git config user.name github-actions
git config user.email github-actions@github.com
git tag -f nightly
git push origin nightly --force
shell: bash -euxo pipefail {0}
- name: release_nightly::create_sentry_release
uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c
with:
environment: production
env:
SENTRY_ORG: zed-dev
SENTRY_PROJECT: zed
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
timeout-minutes: 60

.github/workflows/run_bundling.yml (new file, 236 lines)

@@ -0,0 +1,236 @@
# Generated from xtask::workflows::run_bundling
# Rebuild with `cargo xtask workflows`.
name: run_bundling
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
on:
pull_request:
types:
- labeled
- synchronize
jobs:
bundle_mac_x86_64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-mini-macos
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg
path: target/x86_64-apple-darwin/release/Zed.dmg
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-x86_64.gz
path: target/zed-remote-server-macos-x86_64.gz
timeout-minutes: 60
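
The artifact names in the job above use `${{ github.event.pull_request.head.sha || github.sha }}`: on a pull-request event the PR head SHA is used, otherwise (when that field is empty) it falls back to the pushed commit SHA. A minimal sketch of the fallback, assuming the two SHAs are supplied as positional parameters (shell's `:-` default mirrors the expression's `||`, which treats an empty string as falsy):

```shell
# Build the .dmg artifact name, preferring the PR head SHA when present.
artifact_name() {
  pr_head_sha="$1"   # empty on non-PR events
  sha="$2"           # the pushed commit SHA
  echo "Zed_${pr_head_sha:-$sha}-x86_64.dmg"
}

artifact_name "def456" "abc123"  # PR event  -> Zed_def456-x86_64.dmg
artifact_name ""       "abc123"  # push event -> Zed_abc123-x86_64.dmg
```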
bundle_mac_arm64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-mini-macos
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed.dmg
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-aarch64.gz
path: target/zed-remote-server-macos-aarch64.gz
timeout-minutes: 60
bundle_linux_x86_64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: namespace-profile-32x64-ubuntu-2004
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/release/zed-remote-server-*.tar.gz
timeout-minutes: 60
bundle_linux_arm64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/release/zed-remote-server-*.tar.gz
timeout-minutes: 60
bundle_windows_x86_64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-32vcpu-windows-2022
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows
run: script/bundle-windows.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.exe
path: ${{ env.SETUP_PATH }}
timeout-minutes: 60
bundle_windows_arm64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-32vcpu-windows-2022
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows
run: script/bundle-windows.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.exe
path: ${{ env.SETUP_PATH }}
timeout-minutes: 60
concurrency:
group: ${{ github.workflow }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true

.github/workflows/run_tests.yml vendored Normal file

@@ -0,0 +1,549 @@
# Generated from xtask::workflows::run_tests
# Rebuild with `cargo xtask workflows`.
name: run_tests
env:
CARGO_TERM_COLOR: always
RUST_BACKTRACE: '1'
CARGO_INCREMENTAL: '0'
on:
pull_request:
branches:
- '**'
push:
branches:
- main
- v[0-9]+.[0-9]+.x
jobs:
orchestrate:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
fetch-depth: ${{ github.ref == 'refs/heads/main' && 2 || 350 }}
- id: filter
name: filter
run: |
if [ -z "$GITHUB_BASE_REF" ]; then
echo "Not in a PR context (i.e., push to main/stable/preview)"
COMPARE_REV="$(git rev-parse HEAD~1)"
else
echo "In a PR context comparing to pull_request.base.ref"
git fetch origin "$GITHUB_BASE_REF" --depth=350
COMPARE_REV="$(git merge-base "origin/${GITHUB_BASE_REF}" HEAD)"
fi
CHANGED_FILES="$(git diff --name-only "$COMPARE_REV" ${{ github.sha }})"
check_pattern() {
local output_name="$1"
local pattern="$2"
local grep_arg="$3"
echo "$CHANGED_FILES" | grep "$grep_arg" "$pattern" && \
echo "${output_name}=true" >> "$GITHUB_OUTPUT" || \
echo "${output_name}=false" >> "$GITHUB_OUTPUT"
}
check_pattern "run_action_checks" '^\.github/(workflows/|actions/|actionlint.yml)|tooling/xtask|script/' -qP
check_pattern "run_docs" '^docs/' -qP
check_pattern "run_licenses" '^(Cargo.lock|script/.*licenses)' -qP
check_pattern "run_nix" '^(nix/|flake\.|Cargo\.|rust-toolchain.toml|\.cargo/config.toml)' -qP
check_pattern "run_tests" '^(docs/|script/update_top_ranking_issues/|\.github/(ISSUE_TEMPLATE|workflows/(?!run_tests)))' -qvP
shell: bash -euxo pipefail {0}
outputs:
run_action_checks: ${{ steps.filter.outputs.run_action_checks }}
run_docs: ${{ steps.filter.outputs.run_docs }}
run_licenses: ${{ steps.filter.outputs.run_licenses }}
run_nix: ${{ steps.filter.outputs.run_nix }}
run_tests: ${{ steps.filter.outputs.run_tests }}
check_style:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-4x8-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_pnpm
uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2
with:
version: '9'
- name: ./script/prettier
run: ./script/prettier
shell: bash -euxo pipefail {0}
- name: ./script/check-todos
run: ./script/check-todos
shell: bash -euxo pipefail {0}
- name: ./script/check-keymaps
run: ./script/check-keymaps
shell: bash -euxo pipefail {0}
- name: run_tests::check_style::check_for_typos
uses: crate-ci/typos@80c8a4945eec0f6d464eaf9e65ed98ef085283d1
with:
config: ./typos.toml
- name: steps::cargo_fmt
run: cargo fmt --all -- --check
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_windows:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: self-32vcpu-windows-2022
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
New-Item -ItemType Directory -Path "./../.cargo" -Force
Copy-Item -Path "./.cargo/ci-config.toml" -Destination "./../.cargo/config.toml"
shell: pwsh
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy.ps1
shell: pwsh
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: pwsh
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than.ps1 250
shell: pwsh
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: pwsh
- name: steps::cleanup_cargo_config
if: always()
run: |
Remove-Item -Recurse -Path "./../.cargo" -Force -ErrorAction SilentlyContinue
shell: pwsh
timeout-minutes: 60
run_tests_linux:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 100
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_mac:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
doctests:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- id: run_doctests
name: run_tests::doctests::run_doctests
run: |
cargo test --workspace --doc --no-fail-fast
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
check_workspace_binaries:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: namespace-profile-8x16-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: cargo build -p collab
run: cargo build -p collab
shell: bash -euxo pipefail {0}
- name: cargo build --workspace --bins --examples
run: cargo build --workspace --bins --examples
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
check_postgres_and_protobuf_migrations:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- name: run_tests::check_postgres_and_protobuf_migrations::remove_untracked_files
run: git clean -df
shell: bash -euxo pipefail {0}
- name: run_tests::check_postgres_and_protobuf_migrations::ensure_fresh_merge
run: |
if [ -z "$GITHUB_BASE_REF" ];
then
echo "BUF_BASE_BRANCH=$(git merge-base origin/main HEAD)" >> "$GITHUB_ENV"
else
git checkout -B temp
git merge -q "origin/$GITHUB_BASE_REF" -m "merge main into temp"
echo "BUF_BASE_BRANCH=$GITHUB_BASE_REF" >> "$GITHUB_ENV"
fi
shell: bash -euxo pipefail {0}
- name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_setup_action
uses: bufbuild/buf-setup-action@v1
with:
version: v1.29.0
- name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_breaking_action
uses: bufbuild/buf-breaking-action@v1
with:
input: crates/proto/proto/
against: https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/
timeout-minutes: 60
check_dependencies:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_tests::check_dependencies::install_cargo_machete
uses: clechasseur/rs-cargo@8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386
with:
command: install
args: cargo-machete@0.7.0
- name: run_tests::check_dependencies::run_cargo_machete
uses: clechasseur/rs-cargo@8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386
with:
command: machete
- name: run_tests::check_dependencies::check_cargo_lock
run: cargo update --locked --workspace
shell: bash -euxo pipefail {0}
- name: run_tests::check_dependencies::check_vulnerable_dependencies
if: github.event_name == 'pull_request'
uses: actions/dependency-review-action@67d4f4bd7a9b17a0db54d2a7519187c65e339de8
with:
license-check: false
timeout-minutes: 60
check_docs:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_docs == 'true'
runs-on: namespace-profile-8x16-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
- name: run_tests::check_docs::lychee_link_check
uses: lycheeverse/lychee-action@82202e5e9c2f4ef1a55a3d02563e1cb6041e5332
with:
args: --no-progress --exclude '^http' './docs/src/**/*'
fail: true
jobSummary: false
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: run_tests::check_docs::install_mdbook
uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08
with:
mdbook-version: 0.4.37
- name: run_tests::check_docs::build_docs
run: |
mkdir -p target/deploy
mdbook build ./docs --dest-dir=../target/deploy/docs/
shell: bash -euxo pipefail {0}
- name: run_tests::check_docs::lychee_link_check
uses: lycheeverse/lychee-action@82202e5e9c2f4ef1a55a3d02563e1cb6041e5332
with:
args: --no-progress --exclude '^http' 'target/deploy/docs'
fail: true
jobSummary: false
timeout-minutes: 60
check_licenses:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_licenses == 'true'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: ./script/check-licenses
run: ./script/check-licenses
shell: bash -euxo pipefail {0}
- name: ./script/generate-licenses
run: ./script/generate-licenses
shell: bash -euxo pipefail {0}
check_scripts:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_action_checks == 'true'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_tests::check_scripts::run_shellcheck
run: ./script/shellcheck-scripts error
shell: bash -euxo pipefail {0}
- id: get_actionlint
name: run_tests::check_scripts::download_actionlint
run: bash <(curl https://raw.githubusercontent.com/rhysd/actionlint/main/scripts/download-actionlint.bash)
shell: bash -euxo pipefail {0}
- name: run_tests::check_scripts::run_actionlint
run: |
${{ steps.get_actionlint.outputs.executable }} -color
shell: bash -euxo pipefail {0}
- name: run_tests::check_scripts::check_xtask_workflows
run: |
cargo xtask workflows
if ! git diff --exit-code .github; then
echo "Error: .github directory has uncommitted changes after running 'cargo xtask workflows'"
echo "Please run 'cargo xtask workflows' locally and commit the changes"
exit 1
fi
shell: bash -euxo pipefail {0}
timeout-minutes: 60
build_nix_linux_x86_64:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_nix == 'true'
runs-on: namespace-profile-32x64-ubuntu-2004
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::install_nix
uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- name: nix_build::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
build_nix_mac_aarch64:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_nix == 'true'
runs-on: self-mini-macos
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::set_path
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: nix_build::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
- name: nix_build::limit_store
run: |-
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
fi
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
tests_pass:
needs:
- orchestrate
- check_style
- run_tests_windows
- run_tests_linux
- run_tests_mac
- doctests
- check_workspace_binaries
- check_postgres_and_protobuf_migrations
- check_dependencies
- check_docs
- check_licenses
- check_scripts
- build_nix_linux_x86_64
- build_nix_mac_aarch64
if: github.repository_owner == 'zed-industries' && always()
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: run_tests::tests_pass
run: |
set +x
EXIT_CODE=0
check_result() {
echo "* $1: $2"
if [[ "$2" != "skipped" && "$2" != "success" ]]; then EXIT_CODE=1; fi
}
check_result "orchestrate" "${{ needs.orchestrate.result }}"
check_result "check_style" "${{ needs.check_style.result }}"
check_result "run_tests_windows" "${{ needs.run_tests_windows.result }}"
check_result "run_tests_linux" "${{ needs.run_tests_linux.result }}"
check_result "run_tests_mac" "${{ needs.run_tests_mac.result }}"
check_result "doctests" "${{ needs.doctests.result }}"
check_result "check_workspace_binaries" "${{ needs.check_workspace_binaries.result }}"
check_result "check_postgres_and_protobuf_migrations" "${{ needs.check_postgres_and_protobuf_migrations.result }}"
check_result "check_dependencies" "${{ needs.check_dependencies.result }}"
check_result "check_docs" "${{ needs.check_docs.result }}"
check_result "check_licenses" "${{ needs.check_licenses.result }}"
check_result "check_scripts" "${{ needs.check_scripts.result }}"
check_result "build_nix_linux_x86_64" "${{ needs.build_nix_linux_x86_64.result }}"
check_result "build_nix_mac_aarch64" "${{ needs.build_nix_mac_aarch64.result }}"
exit $EXIT_CODE
shell: bash -euxo pipefail {0}
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true


@@ -1,21 +0,0 @@
name: Script
on:
pull_request:
paths:
- "script/**"
push:
branches:
- main
jobs:
shellcheck:
name: "ShellCheck Scripts"
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- name: Shellcheck ./scripts
run: |
./script/shellcheck-scripts error


@@ -63,7 +63,7 @@ jobs:
- name: Run unit evals
shell: bash -euxo pipefail {0}
run: cargo nextest run --workspace --no-fail-fast --features eval --no-capture -E 'test(::eval_)'
run: cargo nextest run --workspace --no-fail-fast --features unit-eval --no-capture -E 'test(::eval_)'
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}

Cargo.lock generated

@@ -4902,6 +4902,18 @@ dependencies = [
"syn 2.0.106",
]
[[package]]
name = "derive_setters"
version = "0.1.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ae5c625eda104c228c06ecaf988d1c60e542176bd7a490e60eeda3493244c0c9"
dependencies = [
"darling 0.20.11",
"proc-macro2",
"quote",
"syn 2.0.106",
]
[[package]]
name = "deunicode"
version = "1.6.2"
@@ -6942,6 +6954,33 @@ dependencies = [
"wasm-bindgen",
]
[[package]]
name = "gh-workflow"
version = "0.8.0"
source = "git+https://github.com/zed-industries/gh-workflow?rev=0090c6b6ef82fff02bc8616645953e778d1acc08#0090c6b6ef82fff02bc8616645953e778d1acc08"
dependencies = [
"async-trait",
"derive_more 2.0.1",
"derive_setters",
"gh-workflow-macros",
"indexmap 2.11.4",
"merge",
"serde",
"serde_json",
"serde_yaml",
"strum_macros 0.27.2",
]
[[package]]
name = "gh-workflow-macros"
version = "0.8.0"
source = "git+https://github.com/zed-industries/gh-workflow?rev=0090c6b6ef82fff02bc8616645953e778d1acc08#0090c6b6ef82fff02bc8616645953e778d1acc08"
dependencies = [
"heck 0.5.0",
"quote",
"syn 2.0.106",
]
[[package]]
name = "gif"
version = "0.13.3"
@@ -7224,6 +7263,7 @@ dependencies = [
"async-task",
"backtrace",
"bindgen 0.71.1",
"bitflags 2.9.4",
"blade-graphics",
"blade-macros",
"blade-util",
@@ -7303,6 +7343,7 @@ dependencies = [
"wayland-cursor",
"wayland-protocols 0.31.2",
"wayland-protocols-plasma",
"wayland-protocols-wlr",
"windows 0.61.3",
"windows-core 0.61.2",
"windows-numerics",
@@ -9811,6 +9852,28 @@ dependencies = [
"gpui",
]
[[package]]
name = "merge"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "10bbef93abb1da61525bbc45eeaff6473a41907d19f8f9aa5168d214e10693e9"
dependencies = [
"merge_derive",
"num-traits",
]
[[package]]
name = "merge_derive"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "209d075476da2e63b4b29e72a2ef627b840589588e71400a25e3565c4f849d07"
dependencies = [
"proc-macro-error",
"proc-macro2",
"quote",
"syn 1.0.109",
]
[[package]]
name = "metal"
version = "0.29.0"
@@ -12800,6 +12863,30 @@ dependencies = [
"toml_edit 0.23.7",
]
[[package]]
name = "proc-macro-error"
version = "1.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "da25490ff9892aab3fcf7c36f08cfb902dd3e71ca0f9f9517bea02a73a5ce38c"
dependencies = [
"proc-macro-error-attr",
"proc-macro2",
"quote",
"syn 1.0.109",
"version_check",
]
[[package]]
name = "proc-macro-error-attr"
version = "1.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a1be40180e52ecc98ad80b184934baf3d0d29f979574e439af5a55274b35f869"
dependencies = [
"proc-macro2",
"quote",
"version_check",
]
[[package]]
name = "proc-macro-error-attr2"
version = "2.0.0"
@@ -13038,7 +13125,6 @@ dependencies = [
"paths",
"rope",
"serde",
"serde_json",
"text",
"util",
"uuid",
@@ -14226,7 +14312,6 @@ dependencies = [
"log",
"rand 0.9.2",
"rayon",
"regex",
"sum_tree",
"unicode-segmentation",
"util",
@@ -15224,6 +15309,19 @@ dependencies = [
"syn 2.0.106",
]
[[package]]
name = "serde_yaml"
version = "0.9.34+deprecated"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6a8b1a1a2ebf674015cc02edccce75287f1a0130d394307b36743c2f5d504b47"
dependencies = [
"indexmap 2.11.4",
"itoa",
"ryu",
"serde",
"unsafe-libyaml",
]
[[package]]
name = "serial2"
version = "0.2.33"
@@ -16397,7 +16495,7 @@ dependencies = [
"editor",
"file_icons",
"gpui",
"multi_buffer",
"language",
"ui",
"workspace",
]
@@ -17049,6 +17147,7 @@ dependencies = [
"parking_lot",
"postage",
"rand 0.9.2",
"regex",
"rope",
"smallvec",
"sum_tree",
@@ -17341,6 +17440,7 @@ dependencies = [
"anyhow",
"auto_update",
"call",
"channel",
"chrono",
"client",
"cloud_llm_client",
@@ -18411,6 +18511,12 @@ version = "0.2.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7264e107f553ccae879d21fbea1d6724ac785e8c3bfc762137959b5802826ef3"
[[package]]
name = "unsafe-libyaml"
version = "0.2.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "673aac59facbab8a9007c7f6108d11f63b603f7cabff99fabf650fea5c32b861"
[[package]]
name = "untrusted"
version = "0.9.0"
@@ -19385,6 +19491,19 @@ dependencies = [
"wayland-scanner",
]
[[package]]
name = "wayland-protocols-wlr"
version = "0.3.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "efd94963ed43cf9938a090ca4f7da58eb55325ec8200c3848963e98dc25b78ec"
dependencies = [
"bitflags 2.9.4",
"wayland-backend",
"wayland-client",
"wayland-protocols 0.32.9",
"wayland-scanner",
]
[[package]]
name = "wayland-scanner"
version = "0.31.7"
@@ -20818,9 +20937,12 @@ name = "xtask"
version = "0.1.0"
dependencies = [
"anyhow",
"backtrace",
"cargo_metadata",
"cargo_toml",
"clap",
"gh-workflow",
"indexmap 2.11.4",
"indoc",
"toml 0.8.23",
"toml_edit 0.22.27",
@@ -21006,7 +21128,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.211.0"
version = "0.212.0"
dependencies = [
"acp_tools",
"activity_indicator",
@@ -21533,6 +21655,7 @@ dependencies = [
"clock",
"cloud_llm_client",
"cloud_zeta2_prompt",
"collections",
"edit_prediction",
"edit_prediction_context",
"feature_flags",
@@ -21546,6 +21669,7 @@ dependencies = [
"pretty_assertions",
"project",
"release_channel",
"schemars 1.0.4",
"serde",
"serde_json",
"settings",
@@ -21560,6 +21684,7 @@ dependencies = [
name = "zeta2_tools"
version = "0.1.0"
dependencies = [
"anyhow",
"chrono",
"clap",
"client",
@@ -21577,6 +21702,7 @@ dependencies = [
"ordered-float 2.10.1",
"pretty_assertions",
"project",
"regex-syntax",
"serde",
"serde_json",
"settings",


@@ -508,6 +508,7 @@ fork = "0.2.0"
futures = "0.3"
futures-batch = "0.6.1"
futures-lite = "1.13"
gh-workflow = { git = "https://github.com/zed-industries/gh-workflow", rev = "0090c6b6ef82fff02bc8616645953e778d1acc08" }
git2 = { version = "0.20.1", default-features = false }
globset = "0.4"
handlebars = "4.3"


@@ -1,2 +0,0 @@
app: postgrest crates/collab/postgrest_app.conf
llm: postgrest crates/collab/postgrest_llm.conf


@@ -1,2 +1 @@
postgrest_llm: postgrest crates/collab/postgrest_llm.conf
website: cd ../zed.dev; npm run dev -- --port=3000


@@ -38,6 +38,7 @@ linux
= @smitbarmase
= @p1n3appl3
= @cole-miller
= @probably-neb
windows
= @reflectronic
@@ -78,7 +79,7 @@ crashes
ai
= @rtfeldman
= @danilo-leal
= @benbrandt
= @benbrandt
design
= @danilo-leal
@@ -98,6 +99,7 @@ languages
= @Veykril
= @smitbarmase
= @SomeoneToIgnore
= @probably-neb
project_panel
= @smitbarmase
@@ -105,3 +107,6 @@ project_panel
tasks
= @SomeoneToIgnore
= @Veykril
docs
= @probably-neb

assets/icons/link.svg Normal file

@@ -0,0 +1,5 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M6.2 11H5C4.20435 11 3.44129 10.6839 2.87868 10.1213C2.31607 9.55871 2 8.79565 2 8C2 7.20435 2.31607 6.44129 2.87868 5.87868C3.44129 5.31607 4.20435 5 5 5H6.2" stroke="#C4CAD4" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M9.80005 5H11C11.7957 5 12.5588 5.31607 13.1214 5.87868C13.684 6.44129 14 7.20435 14 8C14 8.79565 13.684 9.55871 13.1214 10.1213C12.5588 10.6839 11.7957 11 11 11H9.80005" stroke="#C4CAD4" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M5.6001 8H10.4001" stroke="#C4CAD4" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
</svg>



@@ -609,7 +609,7 @@
"ctrl-alt-b": "workspace::ToggleRightDock",
"ctrl-b": "workspace::ToggleLeftDock",
"ctrl-j": "workspace::ToggleBottomDock",
"ctrl-alt-y": "workspace::CloseAllDocks",
"ctrl-alt-y": "workspace::ToggleAllDocks",
"ctrl-alt-0": "workspace::ResetActiveDockSize",
// For 0px parameter, uses UI font size value.
"ctrl-alt--": ["workspace::DecreaseActiveDockSize", { "px": 0 }],
@@ -731,6 +731,14 @@
"tab": "editor::ComposeCompletion"
}
},
{
"context": "Editor && in_snippet",
"use_key_equivalents": true,
"bindings": {
"alt-right": "editor::NextSnippetTabstop",
"alt-left": "editor::PreviousSnippetTabstop"
}
},
// Bindings for accepting edit predictions
//
// alt-l is provided as an alternative to tab/alt-tab. and will be displayed in the UI. This is
@@ -1012,7 +1020,8 @@
"context": "CollabPanel",
"bindings": {
"alt-up": "collab_panel::MoveChannelUp",
"alt-down": "collab_panel::MoveChannelDown"
"alt-down": "collab_panel::MoveChannelDown",
"alt-enter": "collab_panel::OpenSelectedChannelNotes"
}
},
{
@@ -1126,7 +1135,8 @@
"ctrl-shift-space": "terminal::ToggleViMode",
"ctrl-shift-r": "terminal::RerunTask",
"ctrl-alt-r": "terminal::RerunTask",
"alt-t": "terminal::RerunTask"
"alt-t": "terminal::RerunTask",
"ctrl-shift-5": "pane::SplitRight"
}
},
{
@@ -1298,5 +1308,12 @@
"ctrl-enter up": "dev::Zeta2RatePredictionPositive",
"ctrl-enter down": "dev::Zeta2RatePredictionNegative"
}
},
{
"context": "Zeta2Context > Editor",
"bindings": {
"alt-left": "dev::Zeta2ContextGoBack",
"alt-right": "dev::Zeta2ContextGoForward"
}
}
]


@@ -679,7 +679,7 @@
"cmd-alt-b": "workspace::ToggleRightDock",
"cmd-r": "workspace::ToggleRightDock",
"cmd-j": "workspace::ToggleBottomDock",
"alt-cmd-y": "workspace::CloseAllDocks",
"alt-cmd-y": "workspace::ToggleAllDocks",
// For 0px parameter, uses UI font size value.
"ctrl-alt-0": "workspace::ResetActiveDockSize",
"ctrl-alt--": ["workspace::DecreaseActiveDockSize", { "px": 0 }],
@@ -801,6 +801,14 @@
"tab": "editor::ComposeCompletion"
}
},
{
"context": "Editor && in_snippet",
"use_key_equivalents": true,
"bindings": {
"alt-right": "editor::NextSnippetTabstop",
"alt-left": "editor::PreviousSnippetTabstop"
}
},
{
"context": "Editor && edit_prediction",
"bindings": {
@@ -1077,7 +1085,8 @@
"use_key_equivalents": true,
"bindings": {
"alt-up": "collab_panel::MoveChannelUp",
"alt-down": "collab_panel::MoveChannelDown"
"alt-down": "collab_panel::MoveChannelDown",
"alt-enter": "collab_panel::OpenSelectedChannelNotes"
}
},
{
@@ -1209,6 +1218,7 @@
"ctrl-alt-down": "pane::SplitDown",
"ctrl-alt-left": "pane::SplitLeft",
"ctrl-alt-right": "pane::SplitRight",
"cmd-d": "pane::SplitRight",
"cmd-alt-r": "terminal::RerunTask"
}
},
@@ -1404,5 +1414,12 @@
"cmd-enter up": "dev::Zeta2RatePredictionPositive",
"cmd-enter down": "dev::Zeta2RatePredictionNegative"
}
},
{
"context": "Zeta2Context > Editor",
"bindings": {
"alt-left": "dev::Zeta2ContextGoBack",
"alt-right": "dev::Zeta2ContextGoForward"
}
}
]


@@ -614,7 +614,7 @@
"ctrl-alt-b": "workspace::ToggleRightDock",
"ctrl-b": "workspace::ToggleLeftDock",
"ctrl-j": "workspace::ToggleBottomDock",
"ctrl-shift-y": "workspace::CloseAllDocks",
"ctrl-shift-y": "workspace::ToggleAllDocks",
"alt-r": "workspace::ResetActiveDockSize",
// For 0px parameter, uses UI font size value.
"shift-alt--": ["workspace::DecreaseActiveDockSize", { "px": 0 }],
@@ -736,6 +736,14 @@
"tab": "editor::ComposeCompletion"
}
},
{
"context": "Editor && in_snippet",
"use_key_equivalents": true,
"bindings": {
"alt-right": "editor::NextSnippetTabstop",
"alt-left": "editor::PreviousSnippetTabstop"
}
},
// Bindings for accepting edit predictions
//
// alt-l is provided as an alternative to tab/alt-tab. and will be displayed in the UI. This is
@@ -1030,7 +1038,8 @@
"use_key_equivalents": true,
"bindings": {
"alt-up": "collab_panel::MoveChannelUp",
"alt-down": "collab_panel::MoveChannelDown"
"alt-down": "collab_panel::MoveChannelDown",
"alt-enter": "collab_panel::OpenSelectedChannelNotes"
}
},
{
@@ -1152,7 +1161,8 @@
"ctrl-shift-space": "terminal::ToggleViMode",
"ctrl-shift-r": "terminal::RerunTask",
"ctrl-alt-r": "terminal::RerunTask",
"alt-t": "terminal::RerunTask"
"alt-t": "terminal::RerunTask",
"ctrl-shift-5": "pane::SplitRight"
}
},
{
@@ -1327,5 +1337,12 @@
"ctrl-enter up": "dev::Zeta2RatePredictionPositive",
"ctrl-enter down": "dev::Zeta2RatePredictionNegative"
}
},
{
"context": "Zeta2Context > Editor",
"bindings": {
"alt-left": "dev::Zeta2ContextGoBack",
"alt-right": "dev::Zeta2ContextGoForward"
}
}
]


@@ -91,7 +91,7 @@
{
"context": "Workspace",
"bindings": {
"ctrl-shift-f12": "workspace::CloseAllDocks",
"ctrl-shift-f12": "workspace::ToggleAllDocks",
"ctrl-shift-r": ["pane::DeploySearch", { "replace_enabled": true }],
"alt-shift-f10": "task::Spawn",
"ctrl-e": "file_finder::Toggle",


@@ -93,7 +93,7 @@
{
"context": "Workspace",
"bindings": {
"cmd-shift-f12": "workspace::CloseAllDocks",
"cmd-shift-f12": "workspace::ToggleAllDocks",
"cmd-shift-r": ["pane::DeploySearch", { "replace_enabled": true }],
"ctrl-alt-r": "task::Spawn",
"cmd-e": "file_finder::Toggle",


@@ -220,6 +220,8 @@
"[ {": ["vim::UnmatchedBackward", { "char": "{" }],
"] )": ["vim::UnmatchedForward", { "char": ")" }],
"[ (": ["vim::UnmatchedBackward", { "char": "(" }],
"[ r": "vim::GoToPreviousReference",
"] r": "vim::GoToNextReference",
// tree-sitter related commands
"[ x": "vim::SelectLargerSyntaxNode",
"] x": "vim::SelectSmallerSyntaxNode"


@@ -1,179 +0,0 @@
You are a highly skilled software engineer with extensive knowledge in many programming languages, frameworks, design patterns, and best practices.
## Communication
1. Be conversational but professional.
2. Refer to the user in the second person and yourself in the first person.
3. Format your responses in markdown. Use backticks to format file, directory, function, and class names.
4. NEVER lie or make things up.
5. Refrain from apologizing all the time when results are unexpected. Instead, just try your best to proceed or explain the circumstances to the user without apologizing.
{{#if has_tools}}
## Tool Use
1. Make sure to adhere to the tools schema.
2. Provide every required argument.
3. DO NOT use tools to access items that are already available in the context section.
4. Use only the tools that are currently available.
5. DO NOT use a tool that is not available just because it appears in the conversation. This means the user turned it off.
6. NEVER run commands that don't terminate on their own such as web servers (like `npm run start`, `npm run dev`, `python -m http.server`, etc) or file watchers.
7. Avoid HTML entity escaping - use plain characters instead.
## Searching and Reading
If you are unsure how to fulfill the user's request, gather more information with tool calls and/or clarifying questions.
{{! TODO: If there are files, we should mention it but otherwise omit that fact }}
If appropriate, use tool calls to explore the current project, which contains the following root directories:
{{#each worktrees}}
- `{{abs_path}}`
{{/each}}
- Bias towards not asking the user for help if you can find the answer yourself.
- When providing paths to tools, the path should always start with the name of a project root directory listed above.
- Before you read or edit a file, you must first find the full path. DO NOT ever guess a file path!
{{# if (has_tool 'grep') }}
- When looking for symbols in the project, prefer the `grep` tool.
- As you learn about the structure of the project, use that information to scope `grep` searches to targeted subtrees of the project.
- The user might specify a partial file path. If you don't know the full path, use `find_path` (not `grep`) before you read the file.
{{/if}}
{{else}}
You are being tasked with providing a response, but you have no ability to use tools or to read or write any aspect of the user's system (other than any context the user might have provided to you).
As such, if you need the user to perform any actions for you, you must request them explicitly. Bias towards giving a response to the best of your ability, and then making requests for the user to take action (e.g. to give you more context) only optionally.
The one exception to this is if the user references something you don't know about - for example, the name of a source code file, function, type, or other piece of code that you have no awareness of. In this case, you MUST NOT MAKE SOMETHING UP, or assume you know what that thing is or how it works. Instead, you must ask the user for clarification rather than giving a response.
{{/if}}
## Code Block Formatting
Whenever you mention a code block, you MUST use ONLY use the following format:
```path/to/Something.blah#L123-456
(code goes here)
```
The `#L123-456` means the line number range 123 through 456, and the path/to/Something.blah
is a path in the project. (If there is no valid path in the project, then you can use
/dev/null/path.extension for its path.) This is the ONLY valid way to format code blocks, because the Markdown parser
does not understand the more common ```language syntax, or bare ``` blocks. It only
understands this path-based syntax, and if the path is missing, then it will error and you will have to do it over again.
Just to be really clear about this, if you ever find yourself writing three backticks followed by a language name, STOP!
You have made a mistake. You can only ever put paths after triple backticks!
<example>
Based on all the information I've gathered, here's a summary of how this system works:
1. The README file is loaded into the system.
2. The system finds the first two headers, including everything in between. In this case, that would be:
```path/to/README.md#L8-12
# First Header
This is the info under the first header.
## Sub-header
```
3. Then the system finds the last header in the README:
```path/to/README.md#L27-29
## Last Header
This is the last header in the README.
```
4. Finally, it passes this information on to the next process.
</example>
<example>
In Markdown, hash marks signify headings. For example:
```/dev/null/example.md#L1-3
# Level 1 heading
## Level 2 heading
### Level 3 heading
```
</example>
Here are examples of ways you must never render code blocks:
<bad_example_do_not_do_this>
In Markdown, hash marks signify headings. For example:
```
# Level 1 heading
## Level 2 heading
### Level 3 heading
```
</bad_example_do_not_do_this>
This example is unacceptable because it does not include the path.
<bad_example_do_not_do_this>
In Markdown, hash marks signify headings. For example:
```markdown
# Level 1 heading
## Level 2 heading
### Level 3 heading
```
</bad_example_do_not_do_this>
This example is unacceptable because it has the language instead of the path.
<bad_example_do_not_do_this>
In Markdown, hash marks signify headings. For example:
# Level 1 heading
## Level 2 heading
### Level 3 heading
</bad_example_do_not_do_this>
This example is unacceptable because it uses indentation to mark the code block
instead of backticks with a path.
<bad_example_do_not_do_this>
In Markdown, hash marks signify headings. For example:
```markdown
/dev/null/example.md#L1-3
# Level 1 heading
## Level 2 heading
### Level 3 heading
```
</bad_example_do_not_do_this>
This example is unacceptable because the path is in the wrong place. The path must be directly after the opening backticks.
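The path-based fence rule described above can be checked mechanically. A minimal sketch, assuming a hypothetical validator (this helper is not part of the prompt template or Zed's code), that accepts only info strings of the form `path/to/file.ext#Lstart-end`:

```rust
// Hypothetical validator for the `path#Lstart-end` fence format described above.
// Returns the path and the 1-based line range on success, None otherwise.
fn parse_fence_info(info: &str) -> Option<(&str, u32, u32)> {
    // Split "path/to/README.md#L8-12" into the path and the range part.
    let (path, range) = info.rsplit_once("#L")?;
    if path.is_empty() {
        return None; // An empty path is invalid.
    }
    let (start, end) = range.split_once('-')?;
    let (start, end) = (start.parse().ok()?, end.parse().ok()?);
    (start <= end).then_some((path, start, end))
}

fn main() {
    assert_eq!(
        parse_fence_info("path/to/README.md#L8-12"),
        Some(("path/to/README.md", 8, 12))
    );
    // A plain language tag like "markdown" has no "#L" range and is rejected,
    // matching the bad examples above.
    assert_eq!(parse_fence_info("markdown"), None);
}
```

A fence like ```` ```markdown ```` fails because it carries no `#L` range, which is exactly the failure mode the bad examples illustrate.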
{{#if has_tools}}
## Fixing Diagnostics
1. Make 1-2 attempts at fixing diagnostics, then defer to the user.
2. Never simplify code you've written just to solve diagnostics. Complete, mostly correct code is more valuable than perfect code that doesn't solve the problem.
## Debugging
When debugging, only make code changes if you are certain that you can solve the problem.
Otherwise, follow debugging best practices:
1. Address the root cause instead of the symptoms.
2. Add descriptive logging statements and error messages to track variable and code state.
3. Add test functions and statements to isolate the problem.
{{/if}}
## Calling External APIs
1. Unless explicitly requested by the user, use the best suited external APIs and packages to solve the task. There is no need to ask the user for permission.
2. When selecting which version of an API or package to use, choose one that is compatible with the user's dependency management file(s). If no such file exists or if the package is not present, use the latest version that is in your training data.
3. If an external API requires an API Key, be sure to point this out to the user. Adhere to best security practices (e.g. DO NOT hardcode an API key in a place where it can be exposed)
## System Information
Operating System: {{os}}
Default Shell: {{shell}}
{{#if (or has_rules has_user_rules)}}
## User's Custom Instructions
The following additional instructions are provided by the user, and should be followed to the best of your ability{{#if has_tools}} without interfering with the tool use guidelines{{/if}}.
{{#if has_rules}}
There are project rules that apply to these root directories:
{{#each worktrees}}
{{#if rules_file}}
`{{root_name}}/{{rules_file.path_in_worktree}}`:
``````
{{{rules_file.text}}}
``````
{{/if}}
{{/each}}
{{/if}}
{{#if has_user_rules}}
The user has specified the following rules that should be applied:
{{#each user_rules}}
{{#if title}}
Rules title: {{title}}
{{/if}}
``````
{{contents}}
``````
{{/each}}
{{/if}}
{{/if}}


@@ -1329,7 +1329,7 @@
"model": null,
"max_tokens": null
},
// Whether edit predictions are enabled when editing text threads.
// Whether edit predictions are enabled when editing text threads in the agent panel.
// This setting has no effect if globally disabled.
"enabled_in_text_threads": true
},
@@ -1700,6 +1700,7 @@
"preferred_line_length": 72
},
"Go": {
"hard_tabs": true,
"code_actions_on_format": {
"source.organizeImports": true
},
@@ -1769,7 +1770,8 @@
}
},
"Plain Text": {
"allow_rewrap": "anywhere"
"allow_rewrap": "anywhere",
"soft_wrap": "editor_width"
},
"Python": {
"code_actions_on_format": {


@@ -6,8 +6,8 @@
{
"name": "Gruvbox Dark",
"appearance": "dark",
"accents": ["#cc241dff", "#98971aff", "#d79921ff", "#458588ff", "#b16286ff", "#689d6aff", "#d65d0eff"],
"style": {
"accents": ["#cc241dff", "#98971aff", "#d79921ff", "#458588ff", "#b16286ff", "#689d6aff", "#d65d0eff"],
"border": "#5b534dff",
"border.variant": "#494340ff",
"border.focused": "#303a36ff",
@@ -412,8 +412,8 @@
{
"name": "Gruvbox Dark Hard",
"appearance": "dark",
"accents": ["#cc241dff", "#98971aff", "#d79921ff", "#458588ff", "#b16286ff", "#689d6aff", "#d65d0eff"],
"style": {
"accents": ["#cc241dff", "#98971aff", "#d79921ff", "#458588ff", "#b16286ff", "#689d6aff", "#d65d0eff"],
"border": "#5b534dff",
"border.variant": "#494340ff",
"border.focused": "#303a36ff",
@@ -818,8 +818,8 @@
{
"name": "Gruvbox Dark Soft",
"appearance": "dark",
"accents": ["#cc241dff", "#98971aff", "#d79921ff", "#458588ff", "#b16286ff", "#689d6aff", "#d65d0eff"],
"style": {
"accents": ["#cc241dff", "#98971aff", "#d79921ff", "#458588ff", "#b16286ff", "#689d6aff", "#d65d0eff"],
"border": "#5b534dff",
"border.variant": "#494340ff",
"border.focused": "#303a36ff",
@@ -1224,8 +1224,8 @@
{
"name": "Gruvbox Light",
"appearance": "light",
"accents": ["#cc241dff", "#98971aff", "#d79921ff", "#458588ff", "#b16286ff", "#689d6aff", "#d65d0eff"],
"style": {
"accents": ["#cc241dff", "#98971aff", "#d79921ff", "#458588ff", "#b16286ff", "#689d6aff", "#d65d0eff"],
"border": "#c8b899ff",
"border.variant": "#ddcca7ff",
"border.focused": "#adc5ccff",
@@ -1630,8 +1630,8 @@
{
"name": "Gruvbox Light Hard",
"appearance": "light",
"accents": ["#cc241dff", "#98971aff", "#d79921ff", "#458588ff", "#b16286ff", "#689d6aff", "#d65d0eff"],
"style": {
"accents": ["#cc241dff", "#98971aff", "#d79921ff", "#458588ff", "#b16286ff", "#689d6aff", "#d65d0eff"],
"border": "#c8b899ff",
"border.variant": "#ddcca7ff",
"border.focused": "#adc5ccff",
@@ -2036,8 +2036,8 @@
{
"name": "Gruvbox Light Soft",
"appearance": "light",
"accents": ["#cc241dff", "#98971aff", "#d79921ff", "#458588ff", "#b16286ff", "#689d6aff", "#d65d0eff"],
"style": {
"accents": ["#cc241dff", "#98971aff", "#d79921ff", "#458588ff", "#b16286ff", "#689d6aff", "#d65d0eff"],
"border": "#c8b899ff",
"border.variant": "#ddcca7ff",
"border.focused": "#adc5ccff",

ci/Dockerfile.namespace

@@ -0,0 +1,21 @@
ARG NAMESPACE_BASE_IMAGE_REF=""
# Your image must build FROM NAMESPACE_BASE_IMAGE_REF
FROM ${NAMESPACE_BASE_IMAGE_REF} AS base
# Remove problematic git-lfs packagecloud source
RUN sudo rm -f /etc/apt/sources.list.d/*git-lfs*.list
# Install git and SSH for cloning private repositories
RUN sudo apt-get update && \
sudo apt-get install -y git openssh-client
# Clone the Zed repository
RUN git clone https://github.com/zed-industries/zed.git ~/zed
# Run the Linux installation script
WORKDIR /home/runner/zed
RUN ./script/linux
# Clean up unnecessary files to reduce image size
RUN sudo apt-get clean && sudo rm -rf \
/home/runner/zed


@@ -9,6 +9,9 @@ disallowed-methods = [
{ path = "std::process::Command::spawn", reason = "Spawning `std::process::Command` can block the current thread for an unknown duration", replacement = "smol::process::Command::spawn" },
{ path = "std::process::Command::output", reason = "Spawning `std::process::Command` can block the current thread for an unknown duration", replacement = "smol::process::Command::output" },
{ path = "std::process::Command::status", reason = "Spawning `std::process::Command` can block the current thread for an unknown duration", replacement = "smol::process::Command::status" },
{ path = "std::process::Command::stdin", reason = "`smol::process::Command::from()` does not preserve stdio configuration", replacement = "smol::process::Command::stdin" },
{ path = "std::process::Command::stdout", reason = "`smol::process::Command::from()` does not preserve stdio configuration", replacement = "smol::process::Command::stdout" },
{ path = "std::process::Command::stderr", reason = "`smol::process::Command::from()` does not preserve stdio configuration", replacement = "smol::process::Command::stderr" },
{ path = "serde_json::from_reader", reason = "Parsing from a buffer is much slower than first reading the buffer into a Vec/String, see https://github.com/serde-rs/json/issues/160#issuecomment-253446892. Use `serde_json::from_slice` instead." },
{ path = "serde_json_lenient::from_reader", reason = "Parsing from a buffer is much slower than first reading the buffer into a Vec/String, see https://github.com/serde-rs/json/issues/160#issuecomment-253446892, Use `serde_json_lenient::from_slice` instead." },
]


@@ -33,32 +33,6 @@ services:
volumes:
- ./livekit.yaml:/livekit.yaml
postgrest_app:
image: docker.io/postgrest/postgrest
container_name: postgrest_app
ports:
- 8081:8081
environment:
PGRST_DB_URI: postgres://postgres@postgres:5432/zed
volumes:
- ./crates/collab/postgrest_app.conf:/etc/postgrest.conf
command: postgrest /etc/postgrest.conf
depends_on:
- postgres
postgrest_llm:
image: docker.io/postgrest/postgrest
container_name: postgrest_llm
ports:
- 8082:8082
environment:
PGRST_DB_URI: postgres://postgres@postgres:5432/zed_llm
volumes:
- ./crates/collab/postgrest_llm.conf:/etc/postgrest.conf
command: postgrest /etc/postgrest.conf
depends_on:
- postgres
stripe-mock:
image: docker.io/stripe/stripe-mock:v0.178.0
ports:


@@ -35,7 +35,7 @@ use std::rc::Rc;
use std::time::{Duration, Instant};
use std::{fmt::Display, mem, path::PathBuf, sync::Arc};
use ui::App;
use util::{ResultExt, get_default_system_shell_preferring_bash};
use util::{ResultExt, get_default_system_shell_preferring_bash, paths::PathStyle};
use uuid::Uuid;
#[derive(Debug)]
@@ -95,9 +95,14 @@ pub enum AssistantMessageChunk {
}
impl AssistantMessageChunk {
pub fn from_str(chunk: &str, language_registry: &Arc<LanguageRegistry>, cx: &mut App) -> Self {
pub fn from_str(
chunk: &str,
language_registry: &Arc<LanguageRegistry>,
path_style: PathStyle,
cx: &mut App,
) -> Self {
Self::Message {
block: ContentBlock::new(chunk.into(), language_registry, cx),
block: ContentBlock::new(chunk.into(), language_registry, path_style, cx),
}
}
@@ -186,6 +191,7 @@ impl ToolCall {
tool_call: acp::ToolCall,
status: ToolCallStatus,
language_registry: Arc<LanguageRegistry>,
path_style: PathStyle,
terminals: &HashMap<acp::TerminalId, Entity<Terminal>>,
cx: &mut App,
) -> Result<Self> {
@@ -199,6 +205,7 @@ impl ToolCall {
content.push(ToolCallContent::from_acp(
item,
language_registry.clone(),
path_style,
terminals,
cx,
)?);
@@ -223,6 +230,7 @@ impl ToolCall {
&mut self,
fields: acp::ToolCallUpdateFields,
language_registry: Arc<LanguageRegistry>,
path_style: PathStyle,
terminals: &HashMap<acp::TerminalId, Entity<Terminal>>,
cx: &mut App,
) -> Result<()> {
@@ -260,12 +268,13 @@ impl ToolCall {
// Reuse existing content if we can
for (old, new) in self.content.iter_mut().zip(content.by_ref()) {
old.update_from_acp(new, language_registry.clone(), terminals, cx)?;
old.update_from_acp(new, language_registry.clone(), path_style, terminals, cx)?;
}
for new in content {
self.content.push(ToolCallContent::from_acp(
new,
language_registry.clone(),
path_style,
terminals,
cx,
)?)
@@ -450,21 +459,23 @@ impl ContentBlock {
pub fn new(
block: acp::ContentBlock,
language_registry: &Arc<LanguageRegistry>,
path_style: PathStyle,
cx: &mut App,
) -> Self {
let mut this = Self::Empty;
this.append(block, language_registry, cx);
this.append(block, language_registry, path_style, cx);
this
}
pub fn new_combined(
blocks: impl IntoIterator<Item = acp::ContentBlock>,
language_registry: Arc<LanguageRegistry>,
path_style: PathStyle,
cx: &mut App,
) -> Self {
let mut this = Self::Empty;
for block in blocks {
this.append(block, &language_registry, cx);
this.append(block, &language_registry, path_style, cx);
}
this
}
@@ -473,6 +484,7 @@ impl ContentBlock {
&mut self,
block: acp::ContentBlock,
language_registry: &Arc<LanguageRegistry>,
path_style: PathStyle,
cx: &mut App,
) {
if matches!(self, ContentBlock::Empty)
@@ -482,7 +494,7 @@ impl ContentBlock {
return;
}
let new_content = self.block_string_contents(block);
let new_content = self.block_string_contents(block, path_style);
match self {
ContentBlock::Empty => {
@@ -492,7 +504,7 @@ impl ContentBlock {
markdown.update(cx, |markdown, cx| markdown.append(&new_content, cx));
}
ContentBlock::ResourceLink { resource_link } => {
let existing_content = Self::resource_link_md(&resource_link.uri);
let existing_content = Self::resource_link_md(&resource_link.uri, path_style);
let combined = format!("{}\n{}", existing_content, new_content);
*self = Self::create_markdown_block(combined, language_registry, cx);
@@ -511,11 +523,11 @@ impl ContentBlock {
}
}
fn block_string_contents(&self, block: acp::ContentBlock) -> String {
fn block_string_contents(&self, block: acp::ContentBlock, path_style: PathStyle) -> String {
match block {
acp::ContentBlock::Text(text_content) => text_content.text,
acp::ContentBlock::ResourceLink(resource_link) => {
Self::resource_link_md(&resource_link.uri)
Self::resource_link_md(&resource_link.uri, path_style)
}
acp::ContentBlock::Resource(acp::EmbeddedResource {
resource:
@@ -524,14 +536,14 @@ impl ContentBlock {
..
}),
..
}) => Self::resource_link_md(&uri),
}) => Self::resource_link_md(&uri, path_style),
acp::ContentBlock::Image(image) => Self::image_md(&image),
acp::ContentBlock::Audio(_) | acp::ContentBlock::Resource(_) => String::new(),
}
}
fn resource_link_md(uri: &str) -> String {
if let Some(uri) = MentionUri::parse(uri).log_err() {
fn resource_link_md(uri: &str, path_style: PathStyle) -> String {
if let Some(uri) = MentionUri::parse(uri, path_style).log_err() {
uri.as_link().to_string()
} else {
uri.to_string()
@@ -577,6 +589,7 @@ impl ToolCallContent {
pub fn from_acp(
content: acp::ToolCallContent,
language_registry: Arc<LanguageRegistry>,
path_style: PathStyle,
terminals: &HashMap<acp::TerminalId, Entity<Terminal>>,
cx: &mut App,
) -> Result<Self> {
@@ -584,6 +597,7 @@ impl ToolCallContent {
acp::ToolCallContent::Content { content } => Ok(Self::ContentBlock(ContentBlock::new(
content,
&language_registry,
path_style,
cx,
))),
acp::ToolCallContent::Diff { diff } => Ok(Self::Diff(cx.new(|cx| {
@@ -607,6 +621,7 @@ impl ToolCallContent {
&mut self,
new: acp::ToolCallContent,
language_registry: Arc<LanguageRegistry>,
path_style: PathStyle,
terminals: &HashMap<acp::TerminalId, Entity<Terminal>>,
cx: &mut App,
) -> Result<()> {
@@ -622,7 +637,7 @@ impl ToolCallContent {
};
if needs_update {
*self = Self::from_acp(new, language_registry, terminals, cx)?;
*self = Self::from_acp(new, language_registry, path_style, terminals, cx)?;
}
Ok(())
}
@@ -1142,6 +1157,7 @@ impl AcpThread {
cx: &mut Context<Self>,
) {
let language_registry = self.project.read(cx).languages().clone();
let path_style = self.project.read(cx).path_style(cx);
let entries_len = self.entries.len();
if let Some(last_entry) = self.entries.last_mut()
@@ -1153,12 +1169,12 @@ impl AcpThread {
}) = last_entry
{
*id = message_id.or(id.take());
content.append(chunk.clone(), &language_registry, cx);
content.append(chunk.clone(), &language_registry, path_style, cx);
chunks.push(chunk);
let idx = entries_len - 1;
cx.emit(AcpThreadEvent::EntryUpdated(idx));
} else {
let content = ContentBlock::new(chunk.clone(), &language_registry, cx);
let content = ContentBlock::new(chunk.clone(), &language_registry, path_style, cx);
self.push_entry(
AgentThreadEntry::UserMessage(UserMessage {
id: message_id,
@@ -1178,6 +1194,7 @@ impl AcpThread {
cx: &mut Context<Self>,
) {
let language_registry = self.project.read(cx).languages().clone();
let path_style = self.project.read(cx).path_style(cx);
let entries_len = self.entries.len();
if let Some(last_entry) = self.entries.last_mut()
&& let AgentThreadEntry::AssistantMessage(AssistantMessage { chunks }) = last_entry
@@ -1187,10 +1204,10 @@ impl AcpThread {
match (chunks.last_mut(), is_thought) {
(Some(AssistantMessageChunk::Message { block }), false)
| (Some(AssistantMessageChunk::Thought { block }), true) => {
block.append(chunk, &language_registry, cx)
block.append(chunk, &language_registry, path_style, cx)
}
_ => {
let block = ContentBlock::new(chunk, &language_registry, cx);
let block = ContentBlock::new(chunk, &language_registry, path_style, cx);
if is_thought {
chunks.push(AssistantMessageChunk::Thought { block })
} else {
@@ -1199,7 +1216,7 @@ impl AcpThread {
}
}
} else {
let block = ContentBlock::new(chunk, &language_registry, cx);
let block = ContentBlock::new(chunk, &language_registry, path_style, cx);
let chunk = if is_thought {
AssistantMessageChunk::Thought { block }
} else {
@@ -1251,6 +1268,7 @@ impl AcpThread {
) -> Result<()> {
let update = update.into();
let languages = self.project.read(cx).languages().clone();
let path_style = self.project.read(cx).path_style(cx);
let ix = match self.index_for_tool_call(update.id()) {
Some(ix) => ix,
@@ -1267,6 +1285,7 @@ impl AcpThread {
meta: None,
}),
&languages,
path_style,
cx,
))],
status: ToolCallStatus::Failed,
@@ -1286,7 +1305,7 @@ impl AcpThread {
match update {
ToolCallUpdate::UpdateFields(update) => {
let location_updated = update.fields.locations.is_some();
call.update_fields(update.fields, languages, &self.terminals, cx)?;
call.update_fields(update.fields, languages, path_style, &self.terminals, cx)?;
if location_updated {
self.resolve_locations(update.id, cx);
}
@@ -1325,6 +1344,7 @@ impl AcpThread {
cx: &mut Context<Self>,
) -> Result<(), acp::Error> {
let language_registry = self.project.read(cx).languages().clone();
let path_style = self.project.read(cx).path_style(cx);
let id = update.id.clone();
if let Some(ix) = self.index_for_tool_call(&id) {
@@ -1332,7 +1352,13 @@ impl AcpThread {
unreachable!()
};
call.update_fields(update.fields, language_registry, &self.terminals, cx)?;
call.update_fields(
update.fields,
language_registry,
path_style,
&self.terminals,
cx,
)?;
call.status = status;
cx.emit(AcpThreadEvent::EntryUpdated(ix));
@@ -1341,6 +1367,7 @@ impl AcpThread {
update.try_into()?,
status,
language_registry,
self.project.read(cx).path_style(cx),
&self.terminals,
cx,
)?;
@@ -1620,6 +1647,7 @@ impl AcpThread {
let block = ContentBlock::new_combined(
message.clone(),
self.project.read(cx).languages().clone(),
self.project.read(cx).path_style(cx),
cx,
);
let request = acp::PromptRequest {


@@ -7,10 +7,10 @@ use std::{
fmt,
ops::RangeInclusive,
path::{Path, PathBuf},
str::FromStr,
};
use ui::{App, IconName, SharedString};
use url::Url;
use util::paths::PathStyle;
#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize, Hash)]
pub enum MentionUri {
@@ -49,7 +49,7 @@ pub enum MentionUri {
}
impl MentionUri {
pub fn parse(input: &str) -> Result<Self> {
pub fn parse(input: &str, path_style: PathStyle) -> Result<Self> {
fn parse_line_range(fragment: &str) -> Result<RangeInclusive<u32>> {
let range = fragment
.strip_prefix("L")
@@ -74,25 +74,34 @@ impl MentionUri {
let path = url.path();
match url.scheme() {
"file" => {
let path = url.to_file_path().ok().context("Extracting file path")?;
let path = if path_style.is_windows() {
path.trim_start_matches("/")
} else {
path
};
if let Some(fragment) = url.fragment() {
let line_range = parse_line_range(fragment)?;
if let Some(name) = single_query_param(&url, "symbol")? {
Ok(Self::Symbol {
name,
abs_path: path,
abs_path: path.into(),
line_range,
})
} else {
Ok(Self::Selection {
abs_path: Some(path),
abs_path: Some(path.into()),
line_range,
})
}
} else if input.ends_with("/") {
Ok(Self::Directory { abs_path: path })
Ok(Self::Directory {
abs_path: path.into(),
})
} else {
Ok(Self::File { abs_path: path })
Ok(Self::File {
abs_path: path.into(),
})
}
}
"zed" => {
@@ -213,18 +222,14 @@ impl MentionUri {
pub fn to_uri(&self) -> Url {
match self {
MentionUri::File { abs_path } => {
let mut url = Url::parse("zed:///").unwrap();
url.set_path("/agent/file");
url.query_pairs_mut()
.append_pair("path", &abs_path.to_string_lossy());
let mut url = Url::parse("file:///").unwrap();
url.set_path(&abs_path.to_string_lossy());
url
}
MentionUri::PastedImage => Url::parse("zed:///agent/pasted-image").unwrap(),
MentionUri::Directory { abs_path } => {
let mut url = Url::parse("zed:///").unwrap();
url.set_path("/agent/directory");
url.query_pairs_mut()
.append_pair("path", &abs_path.to_string_lossy());
let mut url = Url::parse("file:///").unwrap();
url.set_path(&abs_path.to_string_lossy());
url
}
MentionUri::Symbol {
@@ -232,10 +237,9 @@ impl MentionUri {
name,
line_range,
} => {
let mut url = Url::parse("zed:///").unwrap();
url.set_path(&format!("/agent/symbol/{name}"));
url.query_pairs_mut()
.append_pair("path", &abs_path.to_string_lossy());
let mut url = Url::parse("file:///").unwrap();
url.set_path(&abs_path.to_string_lossy());
url.query_pairs_mut().append_pair("symbol", name);
url.set_fragment(Some(&format!(
"L{}:{}",
line_range.start() + 1,
@@ -247,13 +251,14 @@ impl MentionUri {
abs_path,
line_range,
} => {
let mut url = Url::parse("zed:///").unwrap();
if let Some(abs_path) = abs_path {
url.set_path("/agent/selection");
url.query_pairs_mut()
.append_pair("path", &abs_path.to_string_lossy());
let mut url = if let Some(path) = abs_path {
let mut url = Url::parse("file:///").unwrap();
url.set_path(&path.to_string_lossy());
url
} else {
let mut url = Url::parse("zed:///").unwrap();
url.set_path("/agent/untitled-buffer");
url
};
url.set_fragment(Some(&format!(
"L{}:{}",
@@ -288,14 +293,6 @@ impl MentionUri {
}
}
impl FromStr for MentionUri {
type Err = anyhow::Error;
fn from_str(s: &str) -> anyhow::Result<Self> {
Self::parse(s)
}
}
pub struct MentionLink<'a>(&'a MentionUri);
impl fmt::Display for MentionLink<'_> {
@@ -338,93 +335,81 @@ mod tests {
#[test]
fn test_parse_file_uri() {
let old_uri = uri!("file:///path/to/file.rs");
let parsed = MentionUri::parse(old_uri).unwrap();
let file_uri = uri!("file:///path/to/file.rs");
let parsed = MentionUri::parse(file_uri, PathStyle::local()).unwrap();
match &parsed {
MentionUri::File { abs_path } => {
assert_eq!(abs_path.to_str().unwrap(), path!("/path/to/file.rs"));
assert_eq!(abs_path, Path::new(path!("/path/to/file.rs")));
}
_ => panic!("Expected File variant"),
}
let new_uri = parsed.to_uri().to_string();
assert!(new_uri.starts_with("zed:///agent/file"));
assert_eq!(MentionUri::parse(&new_uri).unwrap(), parsed);
assert_eq!(parsed.to_uri().to_string(), file_uri);
}
#[test]
fn test_parse_directory_uri() {
let old_uri = uri!("file:///path/to/dir/");
let parsed = MentionUri::parse(old_uri).unwrap();
let file_uri = uri!("file:///path/to/dir/");
let parsed = MentionUri::parse(file_uri, PathStyle::local()).unwrap();
match &parsed {
MentionUri::Directory { abs_path } => {
assert_eq!(abs_path.to_str().unwrap(), path!("/path/to/dir/"));
assert_eq!(abs_path, Path::new(path!("/path/to/dir/")));
}
_ => panic!("Expected Directory variant"),
}
let new_uri = parsed.to_uri().to_string();
assert!(new_uri.starts_with("zed:///agent/directory"));
assert_eq!(MentionUri::parse(&new_uri).unwrap(), parsed);
assert_eq!(parsed.to_uri().to_string(), file_uri);
}
#[test]
fn test_to_directory_uri_without_slash() {
let uri = MentionUri::Directory {
abs_path: PathBuf::from(path!("/path/to/dir")),
abs_path: PathBuf::from(path!("/path/to/dir/")),
};
let uri_string = uri.to_uri().to_string();
assert!(uri_string.starts_with("zed:///agent/directory"));
assert_eq!(MentionUri::parse(&uri_string).unwrap(), uri);
let expected = uri!("file:///path/to/dir/");
assert_eq!(uri.to_uri().to_string(), expected);
}
#[test]
fn test_parse_symbol_uri() {
let old_uri = uri!("file:///path/to/file.rs?symbol=MySymbol#L10:20");
let parsed = MentionUri::parse(old_uri).unwrap();
let symbol_uri = uri!("file:///path/to/file.rs?symbol=MySymbol#L10:20");
let parsed = MentionUri::parse(symbol_uri, PathStyle::local()).unwrap();
match &parsed {
MentionUri::Symbol {
abs_path: path,
name,
line_range,
} => {
assert_eq!(path.to_str().unwrap(), path!("/path/to/file.rs"));
assert_eq!(path, Path::new(path!("/path/to/file.rs")));
assert_eq!(name, "MySymbol");
assert_eq!(line_range.start(), &9);
assert_eq!(line_range.end(), &19);
}
_ => panic!("Expected Symbol variant"),
}
let new_uri = parsed.to_uri().to_string();
assert!(new_uri.starts_with("zed:///agent/symbol/MySymbol"));
assert_eq!(MentionUri::parse(&new_uri).unwrap(), parsed);
assert_eq!(parsed.to_uri().to_string(), symbol_uri);
}
#[test]
fn test_parse_selection_uri() {
let old_uri = uri!("file:///path/to/file.rs#L5:15");
let parsed = MentionUri::parse(old_uri).unwrap();
let selection_uri = uri!("file:///path/to/file.rs#L5:15");
let parsed = MentionUri::parse(selection_uri, PathStyle::local()).unwrap();
match &parsed {
MentionUri::Selection {
abs_path: path,
line_range,
} => {
assert_eq!(
path.as_ref().unwrap().to_str().unwrap(),
path!("/path/to/file.rs")
);
assert_eq!(path.as_ref().unwrap(), Path::new(path!("/path/to/file.rs")));
assert_eq!(line_range.start(), &4);
assert_eq!(line_range.end(), &14);
}
_ => panic!("Expected Selection variant"),
}
let new_uri = parsed.to_uri().to_string();
assert!(new_uri.starts_with("zed:///agent/selection"));
assert_eq!(MentionUri::parse(&new_uri).unwrap(), parsed);
assert_eq!(parsed.to_uri().to_string(), selection_uri);
}
#[test]
fn test_parse_untitled_selection_uri() {
let selection_uri = uri!("zed:///agent/untitled-buffer#L1:10");
let parsed = MentionUri::parse(selection_uri).unwrap();
let parsed = MentionUri::parse(selection_uri, PathStyle::local()).unwrap();
match &parsed {
MentionUri::Selection {
abs_path: None,
@@ -441,7 +426,7 @@ mod tests {
#[test]
fn test_parse_thread_uri() {
let thread_uri = "zed:///agent/thread/session123?name=Thread+name";
let parsed = MentionUri::parse(thread_uri).unwrap();
let parsed = MentionUri::parse(thread_uri, PathStyle::local()).unwrap();
match &parsed {
MentionUri::Thread {
id: thread_id,
@@ -458,7 +443,7 @@ mod tests {
#[test]
fn test_parse_rule_uri() {
let rule_uri = "zed:///agent/rule/d8694ff2-90d5-4b6f-be33-33c1763acd52?name=Some+rule";
let parsed = MentionUri::parse(rule_uri).unwrap();
let parsed = MentionUri::parse(rule_uri, PathStyle::local()).unwrap();
match &parsed {
MentionUri::Rule { id, name } => {
assert_eq!(id.to_string(), "d8694ff2-90d5-4b6f-be33-33c1763acd52");
@@ -472,7 +457,7 @@ mod tests {
#[test]
fn test_parse_fetch_http_uri() {
let http_uri = "http://example.com/path?query=value#fragment";
let parsed = MentionUri::parse(http_uri).unwrap();
let parsed = MentionUri::parse(http_uri, PathStyle::local()).unwrap();
match &parsed {
MentionUri::Fetch { url } => {
assert_eq!(url.to_string(), http_uri);
@@ -485,7 +470,7 @@ mod tests {
#[test]
fn test_parse_fetch_https_uri() {
let https_uri = "https://example.com/api/endpoint";
let parsed = MentionUri::parse(https_uri).unwrap();
let parsed = MentionUri::parse(https_uri, PathStyle::local()).unwrap();
match &parsed {
MentionUri::Fetch { url } => {
assert_eq!(url.to_string(), https_uri);
@@ -497,40 +482,55 @@ mod tests {
#[test]
fn test_invalid_scheme() {
assert!(MentionUri::parse("ftp://example.com").is_err());
assert!(MentionUri::parse("ssh://example.com").is_err());
assert!(MentionUri::parse("unknown://example.com").is_err());
assert!(MentionUri::parse("ftp://example.com", PathStyle::local()).is_err());
assert!(MentionUri::parse("ssh://example.com", PathStyle::local()).is_err());
assert!(MentionUri::parse("unknown://example.com", PathStyle::local()).is_err());
}
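Taken together, these cases pin down the accepted scheme set: `file`, `zed`, `http`, and `https` parse, while anything else is an error. A toy predicate capturing just that gate (not Zed's actual implementation, which parses the full URI):

```rust
/// Toy scheme gate matching the accepted/rejected cases in the tests:
/// file://, zed://, http://, and https:// parse; other schemes error.
fn scheme_supported(uri: &str) -> bool {
    ["file://", "zed://", "http://", "https://"]
        .iter()
        .any(|scheme| uri.starts_with(scheme))
}

fn main() {
    assert!(!scheme_supported("ftp://example.com"));
    assert!(!scheme_supported("ssh://example.com"));
    assert!(scheme_supported("https://example.com/api/endpoint"));
    assert!(scheme_supported("zed:///agent/thread/session123"));
}
```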
#[test]
fn test_invalid_zed_path() {
assert!(MentionUri::parse("zed:///invalid/path").is_err());
assert!(MentionUri::parse("zed:///agent/unknown/test").is_err());
assert!(MentionUri::parse("zed:///invalid/path", PathStyle::local()).is_err());
assert!(MentionUri::parse("zed:///agent/unknown/test", PathStyle::local()).is_err());
}
#[test]
fn test_invalid_line_range_format() {
// Missing L prefix
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#10:20")).is_err());
assert!(
MentionUri::parse(uri!("file:///path/to/file.rs#10:20"), PathStyle::local()).is_err()
);
// Missing colon separator
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#L1020")).is_err());
assert!(
MentionUri::parse(uri!("file:///path/to/file.rs#L1020"), PathStyle::local()).is_err()
);
// Invalid numbers
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#L10:abc")).is_err());
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#Labc:20")).is_err());
assert!(
MentionUri::parse(uri!("file:///path/to/file.rs#L10:abc"), PathStyle::local()).is_err()
);
assert!(
MentionUri::parse(uri!("file:///path/to/file.rs#Labc:20"), PathStyle::local()).is_err()
);
}
#[test]
fn test_invalid_query_parameters() {
// Invalid query parameter name
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#L10:20?invalid=test")).is_err());
assert!(
MentionUri::parse(
uri!("file:///path/to/file.rs#L10:20?invalid=test"),
PathStyle::local()
)
.is_err()
);
// Too many query parameters
assert!(
MentionUri::parse(uri!(
"file:///path/to/file.rs#L10:20?symbol=test&another=param"
))
MentionUri::parse(
uri!("file:///path/to/file.rs#L10:20?symbol=test&another=param"),
PathStyle::local()
)
.is_err()
);
}
@@ -538,8 +538,14 @@ mod tests {
#[test]
fn test_zero_based_line_numbers() {
// Test that 0-based line numbers are rejected (should be 1-based)
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#L0:10")).is_err());
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#L1:0")).is_err());
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#L0:0")).is_err());
assert!(
MentionUri::parse(uri!("file:///path/to/file.rs#L0:10"), PathStyle::local()).is_err()
);
assert!(
MentionUri::parse(uri!("file:///path/to/file.rs#L1:0"), PathStyle::local()).is_err()
);
assert!(
MentionUri::parse(uri!("file:///path/to/file.rs#L0:0"), PathStyle::local()).is_err()
);
}
}
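The zero-based rejection tests above, together with the symbol test mapping `#L10:20` to a stored `9..=19` range, imply a fragment parser that validates 1-based input and converts it to 0-based storage. A minimal sketch of that logic, with a hypothetical function name (the real parsing lives inside `MentionUri::parse`):

```rust
/// Hypothetical sketch of "#L<start>:<end>" fragment parsing as exercised by
/// the tests above: input is 1-based, storage is 0-based, and zero or
/// malformed numbers are rejected.
fn parse_line_fragment(fragment: &str) -> Option<(u32, u32)> {
    let (start, end) = fragment.strip_prefix('L')?.split_once(':')?;
    let start: u32 = start.parse().ok()?;
    let end: u32 = end.parse().ok()?;
    if start == 0 || end == 0 || end < start {
        return None; // 0-based input is rejected, per the tests above
    }
    Some((start - 1, end - 1)) // convert to the 0-based form the tests assert
}

fn main() {
    assert_eq!(parse_line_fragment("L10:20"), Some((9, 19)));
    assert_eq!(parse_line_fragment("L0:10"), None); // 0-based start rejected
    assert_eq!(parse_line_fragment("10:20"), None); // missing L prefix
    assert_eq!(parse_line_fragment("L10:abc"), None); // invalid number
}
```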

View File

@@ -259,6 +259,15 @@ impl AcpTools {
serde_json::to_string_pretty(&messages).ok()
}
fn clear_messages(&mut self, cx: &mut Context<Self>) {
if let Some(connection) = self.watched_connection.as_mut() {
connection.messages.clear();
connection.list_state.reset(0);
self.expanded.clear();
cx.notify();
}
}
fn render_message(
&mut self,
index: usize,
@@ -547,10 +556,16 @@ impl Render for AcpToolsToolbarItemView {
};
let acp_tools = acp_tools.clone();
let has_messages = acp_tools
.read(cx)
.watched_connection
.as_ref()
.is_some_and(|connection| !connection.messages.is_empty());
h_flex()
.gap_2()
.child(
.child({
let acp_tools = acp_tools.clone();
IconButton::new(
"copy_all_messages",
if self.just_copied {
@@ -565,13 +580,7 @@ impl Render for AcpToolsToolbarItemView {
} else {
"Copy All Messages"
}))
.disabled(
acp_tools
.read(cx)
.watched_connection
.as_ref()
.is_none_or(|connection| connection.messages.is_empty()),
)
.disabled(!has_messages)
.on_click(cx.listener(move |this, _, _window, cx| {
if let Some(content) = acp_tools.read(cx).serialize_observed_messages() {
cx.write_to_clipboard(ClipboardItem::new_string(content));
@@ -586,7 +595,18 @@ impl Render for AcpToolsToolbarItemView {
})
.detach();
}
})),
}))
})
.child(
IconButton::new("clear_messages", IconName::Trash)
.icon_size(IconSize::Small)
.tooltip(Tooltip::text("Clear Messages"))
.disabled(!has_messages)
.on_click(cx.listener(move |_this, _, _window, cx| {
acp_tools.update(cx, |acp_tools, cx| {
acp_tools.clear_messages(cx);
});
})),
)
.into_any()
}

View File

@@ -11,7 +11,7 @@ use language::{
LanguageServerStatusUpdate, ServerHealth,
};
use project::{
LanguageServerProgress, LspStoreEvent, Project, ProjectEnvironmentEvent,
LanguageServerProgress, LspStoreEvent, ProgressToken, Project, ProjectEnvironmentEvent,
git_store::{GitStoreEvent, Repository},
};
use smallvec::SmallVec;
@@ -61,7 +61,7 @@ struct ServerStatus {
struct PendingWork<'a> {
language_server_id: LanguageServerId,
progress_token: &'a str,
progress_token: &'a ProgressToken,
progress: &'a LanguageServerProgress,
}
@@ -313,9 +313,9 @@ impl ActivityIndicator {
let mut pending_work = status
.pending_work
.iter()
.map(|(token, progress)| PendingWork {
.map(|(progress_token, progress)| PendingWork {
language_server_id: server_id,
progress_token: token.as_str(),
progress_token,
progress,
})
.collect::<SmallVec<[_; 4]>>();
@@ -358,11 +358,7 @@ impl ActivityIndicator {
..
}) = pending_work.next()
{
let mut message = progress
.title
.as_deref()
.unwrap_or(progress_token)
.to_string();
let mut message = progress.title.clone().unwrap_or(progress_token.to_string());
if let Some(percentage) = progress.percentage {
write!(&mut message, " ({}%)", percentage).unwrap();
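The switch from `&'a str` to `&'a ProgressToken` means the token is no longer guaranteed to be a string — LSP allows integer progress tokens as well — so the fallback becomes `progress_token.to_string()` instead of a plain borrow. A standalone sketch of that shape (type and names are assumed stand-ins, not Zed's definitions):

```rust
use std::fmt;

/// Assumed stand-in for the ProgressToken in the diff; LSP progress tokens
/// may be either integers or strings.
#[derive(Clone, PartialEq, Debug)]
enum ProgressToken {
    String(String),
    Integer(i64),
}

impl fmt::Display for ProgressToken {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            ProgressToken::String(s) => write!(f, "{s}"),
            ProgressToken::Integer(i) => write!(f, "{i}"),
        }
    }
}

/// Mirrors the fallback in the diff: prefer the progress title, otherwise
/// stringify the token.
fn progress_message(title: Option<&str>, token: &ProgressToken) -> String {
    title.map(str::to_owned).unwrap_or_else(|| token.to_string())
}

fn main() {
    let token = ProgressToken::String("rustAnalyzer/indexing".to_owned());
    assert_eq!(progress_message(None, &token), "rustAnalyzer/indexing");
    assert_eq!(progress_message(Some("Indexing"), &token), "Indexing");
    assert_eq!(progress_message(None, &ProgressToken::Integer(7)), "7");
}
```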
@@ -773,7 +769,7 @@ impl Render for ActivityIndicator {
let Some(content) = self.content_to_render(cx) else {
return result;
};
let this = cx.entity().downgrade();
let activity_indicator = cx.entity().downgrade();
let truncate_content = content.message.len() > MAX_MESSAGE_LEN;
result.gap_2().child(
PopoverMenu::new("activity-indicator-popover")
@@ -815,22 +811,21 @@ impl Render for ActivityIndicator {
)
.anchor(gpui::Corner::BottomLeft)
.menu(move |window, cx| {
let strong_this = this.upgrade()?;
let strong_this = activity_indicator.upgrade()?;
let mut has_work = false;
let menu = ContextMenu::build(window, cx, |mut menu, _, cx| {
for work in strong_this.read(cx).pending_language_server_work(cx) {
has_work = true;
let this = this.clone();
let activity_indicator = activity_indicator.clone();
let mut title = work
.progress
.title
.as_deref()
.unwrap_or(work.progress_token)
.to_owned();
.clone()
.unwrap_or(work.progress_token.to_string());
if work.progress.is_cancellable {
let language_server_id = work.language_server_id;
let token = work.progress_token.to_string();
let token = work.progress_token.clone();
let title = SharedString::from(title);
menu = menu.custom_entry(
move |_, _| {
@@ -842,18 +837,23 @@ impl Render for ActivityIndicator {
.into_any_element()
},
move |_, cx| {
this.update(cx, |this, cx| {
this.project.update(cx, |project, cx| {
project.cancel_language_server_work(
language_server_id,
Some(token.clone()),
let token = token.clone();
activity_indicator
.update(cx, |activity_indicator, cx| {
activity_indicator.project.update(
cx,
|project, cx| {
project.cancel_language_server_work(
language_server_id,
Some(token),
cx,
);
},
);
});
this.context_menu_handle.hide(cx);
cx.notify();
})
.ok();
activity_indicator.context_menu_handle.hide(cx);
cx.notify();
})
.ok();
},
);
} else {

View File

@@ -11,7 +11,7 @@ path = "src/agent.rs"
[features]
test-support = ["db/test-support"]
eval = []
edit-agent-eval = []
unit-eval = []
e2e = []
[lints]

View File

@@ -1035,12 +1035,13 @@ impl acp_thread::AgentConnection for NativeAgentConnection {
let session_id = params.session_id.clone();
log::info!("Received prompt request for session: {}", session_id);
log::debug!("Prompt blocks count: {}", params.prompt.len());
let path_style = self.0.read(cx).project.read(cx).path_style(cx);
self.run_turn(session_id, cx, |thread, cx| {
self.run_turn(session_id, cx, move |thread, cx| {
let content: Vec<UserMessageContent> = params
.prompt
.into_iter()
.map(Into::into)
.map(|block| UserMessageContent::from_content_block(block, path_style))
.collect::<Vec<_>>();
log::debug!("Converted prompt to message: {} chars", content.len());
log::debug!("Message id: {:?}", id);

View File

@@ -31,7 +31,7 @@ use std::{
use util::path;
#[test]
#[cfg_attr(not(feature = "edit-agent-eval"), ignore)]
#[cfg_attr(not(feature = "unit-eval"), ignore)]
fn eval_extract_handle_command_output() {
// Test how well agent generates multiple edit hunks.
//
@@ -108,7 +108,7 @@ fn eval_extract_handle_command_output() {
}
#[test]
#[cfg_attr(not(feature = "edit-agent-eval"), ignore)]
#[cfg_attr(not(feature = "unit-eval"), ignore)]
fn eval_delete_run_git_blame() {
// Model | Pass rate
// ----------------------------|----------
@@ -171,7 +171,7 @@ fn eval_delete_run_git_blame() {
}
#[test]
#[cfg_attr(not(feature = "edit-agent-eval"), ignore)]
#[cfg_attr(not(feature = "unit-eval"), ignore)]
fn eval_translate_doc_comments() {
// Model | Pass rate
// ============================================
@@ -234,7 +234,7 @@ fn eval_translate_doc_comments() {
}
#[test]
#[cfg_attr(not(feature = "edit-agent-eval"), ignore)]
#[cfg_attr(not(feature = "unit-eval"), ignore)]
fn eval_use_wasi_sdk_in_compile_parser_to_wasm() {
// Model | Pass rate
// ============================================
@@ -360,7 +360,7 @@ fn eval_use_wasi_sdk_in_compile_parser_to_wasm() {
}
#[test]
#[cfg_attr(not(feature = "edit-agent-eval"), ignore)]
#[cfg_attr(not(feature = "unit-eval"), ignore)]
fn eval_disable_cursor_blinking() {
// Model | Pass rate
// ============================================
@@ -446,7 +446,7 @@ fn eval_disable_cursor_blinking() {
}
#[test]
#[cfg_attr(not(feature = "edit-agent-eval"), ignore)]
#[cfg_attr(not(feature = "unit-eval"), ignore)]
fn eval_from_pixels_constructor() {
// Results for 2025-06-13
//
@@ -656,7 +656,7 @@ fn eval_from_pixels_constructor() {
}
#[test]
#[cfg_attr(not(feature = "edit-agent-eval"), ignore)]
#[cfg_attr(not(feature = "unit-eval"), ignore)]
fn eval_zode() {
// Model | Pass rate
// ============================================
@@ -763,7 +763,7 @@ fn eval_zode() {
}
#[test]
#[cfg_attr(not(feature = "edit-agent-eval"), ignore)]
#[cfg_attr(not(feature = "unit-eval"), ignore)]
fn eval_add_overwrite_test() {
// Model | Pass rate
// ============================================
@@ -995,7 +995,7 @@ fn eval_add_overwrite_test() {
}
#[test]
#[cfg_attr(not(feature = "edit-agent-eval"), ignore)]
#[cfg_attr(not(feature = "unit-eval"), ignore)]
fn eval_create_empty_file() {
// Check that Edit Agent can create a file without writing its
// thoughts into it. This issue is not specific to empty files, but

View File

@@ -160,6 +160,42 @@ async fn test_system_prompt(cx: &mut TestAppContext) {
);
}
#[gpui::test]
async fn test_system_prompt_without_tools(cx: &mut TestAppContext) {
let ThreadTest { model, thread, .. } = setup(cx, TestModel::Fake).await;
let fake_model = model.as_fake();
thread
.update(cx, |thread, cx| {
thread.send(UserMessageId::new(), ["abc"], cx)
})
.unwrap();
cx.run_until_parked();
let mut pending_completions = fake_model.pending_completions();
assert_eq!(
pending_completions.len(),
1,
"unexpected pending completions: {:?}",
pending_completions
);
let pending_completion = pending_completions.pop().unwrap();
assert_eq!(pending_completion.messages[0].role, Role::System);
let system_message = &pending_completion.messages[0];
let system_prompt = system_message.content[0].to_str().unwrap();
assert!(
!system_prompt.contains("## Tool Use"),
"unexpected system message: {:?}",
system_message
);
assert!(
!system_prompt.contains("## Fixing Diagnostics"),
"unexpected system message: {:?}",
system_message
);
}
#[gpui::test]
async fn test_prompt_caching(cx: &mut TestAppContext) {
let ThreadTest { model, thread, .. } = setup(cx, TestModel::Fake).await;

View File

@@ -50,7 +50,7 @@ use std::{
time::{Duration, Instant},
};
use std::{fmt::Write, path::PathBuf};
use util::{ResultExt, debug_panic, markdown::MarkdownCodeBlock};
use util::{ResultExt, debug_panic, markdown::MarkdownCodeBlock, paths::PathStyle};
use uuid::Uuid;
const TOOL_CANCELED_MESSAGE: &str = "Tool canceled by user";
@@ -1816,9 +1816,15 @@ impl Thread {
log::debug!("Completion intent: {:?}", completion_intent);
log::debug!("Completion mode: {:?}", self.completion_mode);
let messages = self.build_request_messages(cx);
let available_tools: Vec<_> = self
.running_turn
.as_ref()
.map(|turn| turn.tools.keys().cloned().collect())
.unwrap_or_default();
log::debug!("Request includes {} tools", available_tools.len());
let messages = self.build_request_messages(available_tools, cx);
log::debug!("Request will include {} messages", messages.len());
log::debug!("Request includes {} tools", tools.len());
let request = LanguageModelRequest {
thread_id: Some(self.id.to_string()),
@@ -1909,7 +1915,11 @@ impl Thread {
self.running_turn.as_ref()?.tools.get(name).cloned()
}
fn build_request_messages(&self, cx: &App) -> Vec<LanguageModelRequestMessage> {
fn build_request_messages(
&self,
available_tools: Vec<SharedString>,
cx: &App,
) -> Vec<LanguageModelRequestMessage> {
log::trace!(
"Building request messages from {} thread messages",
self.messages.len()
@@ -1917,7 +1927,7 @@ impl Thread {
let system_prompt = SystemPromptTemplate {
project: self.project_context.read(cx),
available_tools: self.tools.keys().cloned().collect(),
available_tools,
}
.render(&self.templates)
.context("failed to build system prompt")
@@ -2538,8 +2548,8 @@ impl From<&str> for UserMessageContent {
}
}
impl From<acp::ContentBlock> for UserMessageContent {
fn from(value: acp::ContentBlock) -> Self {
impl UserMessageContent {
pub fn from_content_block(value: acp::ContentBlock, path_style: PathStyle) -> Self {
match value {
acp::ContentBlock::Text(text_content) => Self::Text(text_content.text),
acp::ContentBlock::Image(image_content) => Self::Image(convert_image(image_content)),
@@ -2548,7 +2558,7 @@ impl From<acp::ContentBlock> for UserMessageContent {
Self::Text("[audio]".to_string())
}
acp::ContentBlock::ResourceLink(resource_link) => {
match MentionUri::parse(&resource_link.uri) {
match MentionUri::parse(&resource_link.uri, path_style) {
Ok(uri) => Self::Mention {
uri,
content: String::new(),
@@ -2561,7 +2571,7 @@ impl From<acp::ContentBlock> for UserMessageContent {
}
acp::ContentBlock::Resource(resource) => match resource.resource {
acp::EmbeddedResourceResource::TextResourceContents(resource) => {
match MentionUri::parse(&resource.uri) {
match MentionUri::parse(&resource.uri, path_style) {
Ok(uri) => Self::Mention {
uri,
content: resource.text,
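Replacing `impl From<acp::ContentBlock>` with an inherent `from_content_block(value, path_style)` is the standard move when a conversion grows a context parameter, since `From::from` accepts only the source value. A reduced sketch of the pattern with stand-in types (the real ones are `acp::ContentBlock`, `UserMessageContent`, and `PathStyle`):

```rust
/// Stand-in types illustrating the From-to-inherent-constructor pattern.
#[derive(Clone, Copy)]
enum PathStyle {
    Posix,
    Windows,
}

#[derive(Debug, PartialEq)]
enum Content {
    Text(String),
    Mention(String),
}

impl Content {
    /// Inherent constructor carrying the context that `From` cannot.
    fn from_block(raw: &str, path_style: PathStyle) -> Self {
        match raw.strip_prefix("file://") {
            Some(path) => {
                let path = match path_style {
                    PathStyle::Posix => path.to_owned(),
                    PathStyle::Windows => path.replace('/', "\\"),
                };
                Content::Mention(path)
            }
            None => Content::Text(raw.to_owned()),
        }
    }
}

fn main() {
    assert_eq!(
        Content::from_block("hello", PathStyle::Posix),
        Content::Text("hello".into())
    );
    assert_eq!(
        Content::from_block("file:///a/b", PathStyle::Windows),
        Content::Mention("\\a\\b".into())
    );
}
```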

View File

@@ -253,17 +253,22 @@ impl ContextPickerCompletionProvider {
) -> Option<Completion> {
let project = workspace.read(cx).project().clone();
let label = CodeLabel::plain(symbol.name.clone(), None);
let abs_path = match &symbol.path {
SymbolLocation::InProject(project_path) => {
project.read(cx).absolute_path(&project_path, cx)?
}
let (abs_path, file_name) = match &symbol.path {
SymbolLocation::InProject(project_path) => (
project.read(cx).absolute_path(&project_path, cx)?,
project_path.path.file_name()?.to_string().into(),
),
SymbolLocation::OutsideProject {
abs_path,
signature: _,
} => PathBuf::from(abs_path.as_ref()),
} => (
PathBuf::from(abs_path.as_ref()),
abs_path.file_name().map(|f| f.to_string_lossy())?,
),
};
let label = build_symbol_label(&symbol.name, &file_name, symbol.range.start.0.row + 1, cx);
let uri = MentionUri::Symbol {
abs_path,
name: symbol.name.clone(),
@@ -570,6 +575,7 @@ impl ContextPickerCompletionProvider {
.unwrap_or_default();
let workspace = workspace.read(cx);
let project = workspace.project().read(cx);
let include_root_name = workspace.visible_worktrees(cx).count() > 1;
if let Some(agent_panel) = workspace.panel::<AgentPanel>(cx)
&& let Some(thread) = agent_panel.read(cx).active_agent_thread(cx)
@@ -596,7 +602,11 @@ impl ContextPickerCompletionProvider {
project
.worktree_for_id(project_path.worktree_id, cx)
.map(|worktree| {
let path_prefix = worktree.read(cx).root_name().into();
let path_prefix = if include_root_name {
worktree.read(cx).root_name().into()
} else {
RelPath::empty().into()
};
Match::File(FileMatch {
mat: fuzzy::PathMatch {
score: 1.,
@@ -674,6 +684,17 @@ impl ContextPickerCompletionProvider {
}
}
fn build_symbol_label(symbol_name: &str, file_name: &str, line: u32, cx: &App) -> CodeLabel {
let comment_id = cx.theme().syntax().highlight_id("comment").map(HighlightId);
let mut label = CodeLabelBuilder::default();
label.push_str(symbol_name, None);
label.push_str(" ", None);
label.push_str(&format!("{} L{}", file_name, line), comment_id);
label.build()
}
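Ignoring the comment-color highlighting, the label this builder produces is the symbol name followed by a `file Lline` suffix — the shape the completion-label assertions elsewhere in this diff check against:

```rust
/// Plain-text shape of build_symbol_label's output, without the comment
/// highlight applied to the suffix.
fn symbol_label_text(symbol_name: &str, file_name: &str, line: u32) -> String {
    format!("{symbol_name} {file_name} L{line}")
}

fn main() {
    // Matches the "MySymbol one.txt L1" completion label asserted later.
    assert_eq!(
        symbol_label_text("MySymbol", "one.txt", 1),
        "MySymbol one.txt L1"
    );
}
```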
fn build_code_label_for_full_path(file_name: &str, directory: Option<&str>, cx: &App) -> CodeLabel {
let comment_id = cx.theme().syntax().highlight_id("comment").map(HighlightId);
let mut label = CodeLabelBuilder::default();
@@ -812,9 +833,21 @@ impl CompletionProvider for ContextPickerCompletionProvider {
path: mat.path.clone(),
};
// If path is empty, this means we're matching with the root directory itself
// so we use the path_prefix as the name
let path_prefix = if mat.path.is_empty() {
project
.read(cx)
.worktree_for_id(project_path.worktree_id, cx)
.map(|wt| wt.read(cx).root_name().into())
.unwrap_or_else(|| mat.path_prefix.clone())
} else {
mat.path_prefix.clone()
};
Self::completion_for_path(
project_path,
&mat.path_prefix,
&path_prefix,
is_recent,
mat.is_dir,
source_range.clone(),

View File

@@ -1,4 +1,5 @@
use crate::{
ChatWithFollow,
acp::completion_provider::{ContextPickerCompletionProvider, SlashCommandCompletion},
context_picker::{ContextPickerAction, fetch_context_picker::fetch_url_content},
};
@@ -49,7 +50,7 @@ use text::OffsetRangeExt;
use theme::ThemeSettings;
use ui::{ButtonLike, TintColor, Toggleable, prelude::*};
use util::{ResultExt, debug_panic, rel_path::RelPath};
use workspace::{Workspace, notifications::NotifyResultExt as _};
use workspace::{CollaboratorId, Workspace, notifications::NotifyResultExt as _};
use zed_actions::agent::Chat;
pub struct MessageEditor {
@@ -234,8 +235,16 @@ impl MessageEditor {
window: &mut Window,
cx: &mut Context<Self>,
) {
let uri = MentionUri::Thread {
id: thread.id.clone(),
name: thread.title.to_string(),
};
let content = format!("{}\n", uri.as_link());
let content_len = content.len() - 1;
let start = self.editor.update(cx, |editor, cx| {
editor.set_text(format!("{}\n", thread.title), window, cx);
editor.set_text(content, window, cx);
editor
.buffer()
.read(cx)
@@ -244,18 +253,8 @@ impl MessageEditor {
.text_anchor
});
self.confirm_mention_completion(
thread.title.clone(),
start,
thread.title.len(),
MentionUri::Thread {
id: thread.id.clone(),
name: thread.title.to_string(),
},
window,
cx,
)
.detach();
self.confirm_mention_completion(thread.title, start, content_len, uri, window, cx)
.detach();
}
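`uri.as_link()` here produces the markdown mention form that `test_insert_thread_summary` later asserts on: `[@<title>](<uri>)`. A minimal sketch of that link shape:

```rust
/// Sketch of the mention-link shape asserted in test_insert_thread_summary.
fn mention_link(title: &str, uri: &str) -> String {
    format!("[@{title}]({uri})")
}

fn main() {
    assert_eq!(
        mention_link("Previous Conversation", "zed:///agent/thread/thread-123"),
        "[@Previous Conversation](zed:///agent/thread/thread-123)"
    );
}
```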
#[cfg(test)]
@@ -702,7 +701,7 @@ impl MessageEditor {
let mut all_tracked_buffers = Vec::new();
let result = editor.update(cx, |editor, cx| {
let mut ix = 0;
let mut ix = text.chars().position(|c| !c.is_whitespace()).unwrap_or(0);
let mut chunks: Vec<acp::ContentBlock> = Vec::new();
let text = editor.text(cx);
editor.display_map.update(cx, |map, cx| {
@@ -714,15 +713,6 @@ impl MessageEditor {
let crease_range = crease.range().to_offset(&snapshot.buffer_snapshot());
if crease_range.start > ix {
//todo(): Custom slash command ContentBlock?
// let chunk = if prevent_slash_commands
// && ix == 0
// && parse_slash_command(&text[ix..]).is_some()
// {
// format!(" {}", &text[ix..crease_range.start]).into()
// } else {
// text[ix..crease_range.start].into()
// };
let chunk = text[ix..crease_range.start].into();
chunks.push(chunk);
}
@@ -783,15 +773,6 @@ impl MessageEditor {
}
if ix < text.len() {
//todo(): Custom slash command ContentBlock?
// let last_chunk = if prevent_slash_commands
// && ix == 0
// && parse_slash_command(&text[ix..]).is_some()
// {
// format!(" {}", text[ix..].trim_end())
// } else {
// text[ix..].trim_end().to_owned()
// };
let last_chunk = text[ix..].trim_end().to_owned();
if !last_chunk.is_empty() {
chunks.push(last_chunk.into());
@@ -831,6 +812,21 @@ impl MessageEditor {
self.send(cx);
}
fn chat_with_follow(
&mut self,
_: &ChatWithFollow,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.workspace
.update(cx, |this, cx| {
this.follow(CollaboratorId::Agent, window, cx)
})
.log_err();
self.send(cx);
}
fn cancel(&mut self, _: &editor::actions::Cancel, _: &mut Window, cx: &mut Context<Self>) {
cx.emit(MessageEditorEvent::Cancel)
}
@@ -1062,6 +1058,7 @@ impl MessageEditor {
) {
self.clear(window, cx);
let path_style = self.project.read(cx).path_style(cx);
let mut text = String::new();
let mut mentions = Vec::new();
@@ -1074,7 +1071,8 @@ impl MessageEditor {
resource: acp::EmbeddedResourceResource::TextResourceContents(resource),
..
}) => {
let Some(mention_uri) = MentionUri::parse(&resource.uri).log_err() else {
let Some(mention_uri) = MentionUri::parse(&resource.uri, path_style).log_err()
else {
continue;
};
let start = text.len();
@@ -1090,7 +1088,9 @@ impl MessageEditor {
));
}
acp::ContentBlock::ResourceLink(resource) => {
if let Some(mention_uri) = MentionUri::parse(&resource.uri).log_err() {
if let Some(mention_uri) =
MentionUri::parse(&resource.uri, path_style).log_err()
{
let start = text.len();
write!(&mut text, "{}", mention_uri.as_link()).ok();
let end = text.len();
@@ -1105,7 +1105,7 @@ impl MessageEditor {
meta: _,
}) => {
let mention_uri = if let Some(uri) = uri {
MentionUri::parse(&uri)
MentionUri::parse(&uri, path_style)
} else {
Ok(MentionUri::PastedImage)
};
@@ -1290,6 +1290,7 @@ impl Render for MessageEditor {
div()
.key_context("MessageEditor")
.on_action(cx.listener(Self::chat))
.on_action(cx.listener(Self::chat_with_follow))
.on_action(cx.listener(Self::cancel))
.capture_action(cx.listener(Self::paste))
.flex_1()
@@ -1598,6 +1599,7 @@ mod tests {
use gpui::{
AppContext, Entity, EventEmitter, FocusHandle, Focusable, TestAppContext, VisualTestContext,
};
use language_model::LanguageModelRegistry;
use lsp::{CompletionContext, CompletionTriggerKind};
use project::{CompletionIntent, Project, ProjectPath};
use serde_json::json;
@@ -2179,10 +2181,10 @@ mod tests {
assert_eq!(
current_completion_labels(editor),
&[
format!("eight.txt dir{slash}b{slash}"),
format!("seven.txt dir{slash}b{slash}"),
format!("six.txt dir{slash}b{slash}"),
format!("five.txt dir{slash}b{slash}"),
format!("eight.txt b{slash}"),
format!("seven.txt b{slash}"),
format!("six.txt b{slash}"),
format!("five.txt b{slash}"),
]
);
editor.set_text("", window, cx);
@@ -2210,10 +2212,10 @@ mod tests {
assert_eq!(
current_completion_labels(editor),
&[
format!("eight.txt dir{slash}b{slash}"),
format!("seven.txt dir{slash}b{slash}"),
format!("six.txt dir{slash}b{slash}"),
format!("five.txt dir{slash}b{slash}"),
format!("eight.txt b{slash}"),
format!("seven.txt b{slash}"),
format!("six.txt b{slash}"),
format!("five.txt b{slash}"),
"Files & Directories".into(),
"Symbols".into(),
"Threads".into(),
@@ -2246,7 +2248,7 @@ mod tests {
assert!(editor.has_visible_completions_menu());
assert_eq!(
current_completion_labels(editor),
vec![format!("one.txt dir{slash}a{slash}")]
vec![format!("one.txt a{slash}")]
);
});
@@ -2293,7 +2295,10 @@ mod tests {
panic!("Unexpected mentions");
};
pretty_assertions::assert_eq!(content, "1");
pretty_assertions::assert_eq!(uri, &url_one.parse::<MentionUri>().unwrap());
pretty_assertions::assert_eq!(
uri,
&MentionUri::parse(&url_one, PathStyle::local()).unwrap()
);
}
let contents = message_editor
@@ -2314,7 +2319,10 @@ mod tests {
let [(uri, Mention::UriOnly)] = contents.as_slice() else {
panic!("Unexpected mentions");
};
pretty_assertions::assert_eq!(uri, &url_one.parse::<MentionUri>().unwrap());
pretty_assertions::assert_eq!(
uri,
&MentionUri::parse(&url_one, PathStyle::local()).unwrap()
);
}
cx.simulate_input(" ");
@@ -2375,7 +2383,10 @@ mod tests {
panic!("Unexpected mentions");
};
pretty_assertions::assert_eq!(content, "8");
pretty_assertions::assert_eq!(uri, &url_eight.parse::<MentionUri>().unwrap());
pretty_assertions::assert_eq!(
uri,
&MentionUri::parse(&url_eight, PathStyle::local()).unwrap()
);
}
editor.update(&mut cx, |editor, cx| {
@@ -2460,7 +2471,7 @@ mod tests {
format!("Lorem [@one.txt]({url_one}) Ipsum [@eight.txt]({url_eight}) @symbol ")
);
assert!(editor.has_visible_completions_menu());
assert_eq!(current_completion_labels(editor), &["MySymbol"]);
assert_eq!(current_completion_labels(editor), &["MySymbol one.txt L1"]);
});
editor.update_in(&mut cx, |editor, window, cx| {
@@ -2516,7 +2527,7 @@ mod tests {
format!("Lorem [@one.txt]({url_one}) Ipsum [@eight.txt]({url_eight}) [@MySymbol]({}) @file x.png", symbol.to_uri())
);
assert!(editor.has_visible_completions_menu());
assert_eq!(current_completion_labels(editor), &[format!("x.png dir{slash}")]);
assert_eq!(current_completion_labels(editor), &["x.png "]);
});
editor.update_in(&mut cx, |editor, window, cx| {
@@ -2558,7 +2569,7 @@ mod tests {
format!("Lorem [@one.txt]({url_one}) Ipsum [@eight.txt]({url_eight}) [@MySymbol]({}) @file x.png", symbol.to_uri())
);
assert!(editor.has_visible_completions_menu());
assert_eq!(current_completion_labels(editor), &[format!("x.png dir{slash}")]);
assert_eq!(current_completion_labels(editor), &["x.png "]);
});
editor.update_in(&mut cx, |editor, window, cx| {
@@ -2734,4 +2745,137 @@ mod tests {
_ => panic!("Expected Text mention for small file"),
}
}
#[gpui::test]
async fn test_insert_thread_summary(cx: &mut TestAppContext) {
init_test(cx);
cx.update(LanguageModelRegistry::test);
let fs = FakeFs::new(cx.executor());
fs.insert_tree("/project", json!({"file": ""})).await;
let project = Project::test(fs, [Path::new(path!("/project"))], cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let text_thread_store = cx.new(|cx| TextThreadStore::fake(project.clone(), cx));
let history_store = cx.new(|cx| HistoryStore::new(text_thread_store, cx));
// Create a thread metadata to insert as summary
let thread_metadata = agent::DbThreadMetadata {
id: acp::SessionId("thread-123".into()),
title: "Previous Conversation".into(),
updated_at: chrono::Utc::now(),
};
let message_editor = cx.update(|window, cx| {
cx.new(|cx| {
let mut editor = MessageEditor::new(
workspace.downgrade(),
project.clone(),
history_store.clone(),
None,
Default::default(),
Default::default(),
"Test Agent".into(),
"Test",
EditorMode::AutoHeight {
min_lines: 1,
max_lines: None,
},
window,
cx,
);
editor.insert_thread_summary(thread_metadata.clone(), window, cx);
editor
})
});
// Construct expected values for verification
let expected_uri = MentionUri::Thread {
id: thread_metadata.id.clone(),
name: thread_metadata.title.to_string(),
};
let expected_link = format!("[@{}]({})", thread_metadata.title, expected_uri.to_uri());
message_editor.read_with(cx, |editor, cx| {
let text = editor.text(cx);
assert!(
text.contains(&expected_link),
"Expected editor text to contain thread mention link.\nExpected substring: {}\nActual text: {}",
expected_link,
text
);
let mentions = editor.mentions();
assert_eq!(
mentions.len(),
1,
"Expected exactly one mention after inserting thread summary"
);
assert!(
mentions.contains(&expected_uri),
"Expected mentions to contain the thread URI"
);
});
}
#[gpui::test]
async fn test_whitespace_trimming(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree("/project", json!({"file.rs": "fn main() {}"}))
.await;
let project = Project::test(fs, [Path::new(path!("/project"))], cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let text_thread_store = cx.new(|cx| TextThreadStore::fake(project.clone(), cx));
let history_store = cx.new(|cx| HistoryStore::new(text_thread_store, cx));
let message_editor = cx.update(|window, cx| {
cx.new(|cx| {
MessageEditor::new(
workspace.downgrade(),
project.clone(),
history_store.clone(),
None,
Default::default(),
Default::default(),
"Test Agent".into(),
"Test",
EditorMode::AutoHeight {
min_lines: 1,
max_lines: None,
},
window,
cx,
)
})
});
let editor = message_editor.update(cx, |message_editor, _| message_editor.editor.clone());
cx.run_until_parked();
editor.update_in(cx, |editor, window, cx| {
editor.set_text(" hello world ", window, cx);
});
let (content, _) = message_editor
.update(cx, |message_editor, cx| message_editor.contents(false, cx))
.await
.unwrap();
assert_eq!(
content,
vec![acp::ContentBlock::Text(acp::TextContent {
text: "hello world".into(),
annotations: None,
meta: None
})]
);
}
}
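The trimming behavior `test_whitespace_trimming` pins down — skip leading whitespace before the first chunk, trim trailing whitespace from the last — reduces to the following sketch for the single-chunk case:

```rust
/// Single-chunk reduction of the trimming the test above asserts: leading
/// whitespace is skipped (byte offset via str::find), the tail is trimmed.
fn trimmed_body(text: &str) -> &str {
    let start = text
        .find(|c: char| !c.is_whitespace())
        .unwrap_or(text.len());
    text[start..].trim_end()
}

fn main() {
    assert_eq!(trimmed_body("   hello world   "), "hello world");
    assert_eq!(trimmed_body("   "), "");
}
```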

View File

@@ -450,6 +450,7 @@ impl Render for AcpThreadHistory {
v_flex()
.key_context("ThreadHistory")
.size_full()
.bg(cx.theme().colors().panel_background)
.on_action(cx.listener(Self::select_previous))
.on_action(cx.listener(Self::select_next))
.on_action(cx.listener(Self::select_first))

View File

@@ -4305,7 +4305,8 @@ impl AcpThreadView {
return;
};
if let Some(mention) = MentionUri::parse(&url).log_err() {
if let Some(mention) = MentionUri::parse(&url, workspace.read(cx).path_style(cx)).log_err()
{
workspace.update(cx, |workspace, cx| match mention {
MentionUri::File { abs_path } => {
let project = workspace.project();

View File

@@ -662,6 +662,7 @@ pub(crate) fn recent_context_picker_entries(
let mut recent = Vec::with_capacity(6);
let workspace = workspace.read(cx);
let project = workspace.project().read(cx);
let include_root_name = workspace.visible_worktrees(cx).count() > 1;
recent.extend(
workspace
@@ -675,9 +676,16 @@ pub(crate) fn recent_context_picker_entries(
.filter_map(|(project_path, _)| {
project
.worktree_for_id(project_path.worktree_id, cx)
.map(|worktree| RecentEntry::File {
project_path,
path_prefix: worktree.read(cx).root_name().into(),
.map(|worktree| {
let path_prefix = if include_root_name {
worktree.read(cx).root_name().into()
} else {
RelPath::empty().into()
};
RecentEntry::File {
project_path,
path_prefix,
}
})
}),
);
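
The hunk above stops prepending the worktree root name to recent entries unless more than one worktree is visible. The decision can be sketched with a hypothetical helper (not Zed's API; `RelPath` is replaced by a plain `String` here):

```rust
// Sketch of the prefix choice above, under the assumption that a
// worktree exposes its root name as a &str. With a single visible
// worktree the prefix is empty, mirroring RelPath::empty().
fn path_prefix_for(visible_worktree_count: usize, root_name: &str) -> String {
    if visible_worktree_count > 1 {
        format!("{root_name}/")
    } else {
        String::new()
    }
}

fn main() {
    // One worktree: no prefix, so completions read "one.txt a/".
    assert_eq!(path_prefix_for(1, "dir"), "");
    // Multiple worktrees: the root name disambiguates, "four.txt project2/b/".
    assert_eq!(path_prefix_for(2, "project2"), "project2/");
}
```

This is the same condition the updated tests exercise: single-worktree labels drop the `dir/` prefix, while the new multi-worktree test expects `project1/` and `project2/` prefixes.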

View File

@@ -655,13 +655,12 @@ impl ContextPickerCompletionProvider {
let SymbolLocation::InProject(symbol_path) = &symbol.path else {
return None;
};
let path_prefix = workspace
let _path_prefix = workspace
.read(cx)
.project()
.read(cx)
.worktree_for_id(symbol_path.worktree_id, cx)?
.read(cx)
.root_name();
.worktree_for_id(symbol_path.worktree_id, cx)?;
let path_prefix = RelPath::empty();
let (file_name, directory) = super::file_context_picker::extract_file_name_and_directory(
&symbol_path.path,
@@ -818,9 +817,21 @@ impl CompletionProvider for ContextPickerCompletionProvider {
return None;
}
// If path is empty, this means we're matching with the root directory itself
// so we use the path_prefix as the name
let path_prefix = if mat.path.is_empty() {
project
.read(cx)
.worktree_for_id(project_path.worktree_id, cx)
.map(|wt| wt.read(cx).root_name().into())
.unwrap_or_else(|| mat.path_prefix.clone())
} else {
mat.path_prefix.clone()
};
Some(Self::completion_for_path(
project_path,
&mat.path_prefix,
&path_prefix,
is_recent,
mat.is_dir,
excerpt_id,
@@ -1309,10 +1320,10 @@ mod tests {
assert_eq!(
current_completion_labels(editor),
&[
format!("seven.txt dir{slash}b{slash}"),
format!("six.txt dir{slash}b{slash}"),
format!("five.txt dir{slash}b{slash}"),
format!("four.txt dir{slash}a{slash}"),
format!("seven.txt b{slash}"),
format!("six.txt b{slash}"),
format!("five.txt b{slash}"),
format!("four.txt a{slash}"),
"Files & Directories".into(),
"Symbols".into(),
"Fetch".into()
@@ -1344,7 +1355,7 @@ mod tests {
assert!(editor.has_visible_completions_menu());
assert_eq!(
current_completion_labels(editor),
vec![format!("one.txt dir{slash}a{slash}")]
vec![format!("one.txt a{slash}")]
);
});
@@ -1356,12 +1367,12 @@ mod tests {
editor.update(&mut cx, |editor, cx| {
assert_eq!(
editor.text(cx),
format!("Lorem [@one.txt](@file:dir{slash}a{slash}one.txt) ")
format!("Lorem [@one.txt](@file:a{slash}one.txt) ")
);
assert!(!editor.has_visible_completions_menu());
assert_eq!(
fold_ranges(editor, cx),
vec![Point::new(0, 6)..Point::new(0, 37)]
vec![Point::new(0, 6)..Point::new(0, 33)]
);
});
@@ -1370,12 +1381,12 @@ mod tests {
editor.update(&mut cx, |editor, cx| {
assert_eq!(
editor.text(cx),
format!("Lorem [@one.txt](@file:dir{slash}a{slash}one.txt) ")
format!("Lorem [@one.txt](@file:a{slash}one.txt) ")
);
assert!(!editor.has_visible_completions_menu());
assert_eq!(
fold_ranges(editor, cx),
vec![Point::new(0, 6)..Point::new(0, 37)]
vec![Point::new(0, 6)..Point::new(0, 33)]
);
});
@@ -1384,12 +1395,12 @@ mod tests {
editor.update(&mut cx, |editor, cx| {
assert_eq!(
editor.text(cx),
format!("Lorem [@one.txt](@file:dir{slash}a{slash}one.txt) Ipsum "),
format!("Lorem [@one.txt](@file:a{slash}one.txt) Ipsum "),
);
assert!(!editor.has_visible_completions_menu());
assert_eq!(
fold_ranges(editor, cx),
vec![Point::new(0, 6)..Point::new(0, 37)]
vec![Point::new(0, 6)..Point::new(0, 33)]
);
});
@@ -1398,12 +1409,12 @@ mod tests {
editor.update(&mut cx, |editor, cx| {
assert_eq!(
editor.text(cx),
format!("Lorem [@one.txt](@file:dir{slash}a{slash}one.txt) Ipsum @file "),
format!("Lorem [@one.txt](@file:a{slash}one.txt) Ipsum @file "),
);
assert!(editor.has_visible_completions_menu());
assert_eq!(
fold_ranges(editor, cx),
vec![Point::new(0, 6)..Point::new(0, 37)]
vec![Point::new(0, 6)..Point::new(0, 33)]
);
});
@@ -1416,14 +1427,14 @@ mod tests {
editor.update(&mut cx, |editor, cx| {
assert_eq!(
editor.text(cx),
format!("Lorem [@one.txt](@file:dir{slash}a{slash}one.txt) Ipsum [@seven.txt](@file:dir{slash}b{slash}seven.txt) ")
format!("Lorem [@one.txt](@file:a{slash}one.txt) Ipsum [@seven.txt](@file:b{slash}seven.txt) ")
);
assert!(!editor.has_visible_completions_menu());
assert_eq!(
fold_ranges(editor, cx),
vec![
Point::new(0, 6)..Point::new(0, 37),
Point::new(0, 45)..Point::new(0, 80)
Point::new(0, 6)..Point::new(0, 33),
Point::new(0, 41)..Point::new(0, 72)
]
);
});
@@ -1433,14 +1444,14 @@ mod tests {
editor.update(&mut cx, |editor, cx| {
assert_eq!(
editor.text(cx),
format!("Lorem [@one.txt](@file:dir{slash}a{slash}one.txt) Ipsum [@seven.txt](@file:dir{slash}b{slash}seven.txt) \n@")
format!("Lorem [@one.txt](@file:a{slash}one.txt) Ipsum [@seven.txt](@file:b{slash}seven.txt) \n@")
);
assert!(editor.has_visible_completions_menu());
assert_eq!(
fold_ranges(editor, cx),
vec![
Point::new(0, 6)..Point::new(0, 37),
Point::new(0, 45)..Point::new(0, 80)
Point::new(0, 6)..Point::new(0, 33),
Point::new(0, 41)..Point::new(0, 72)
]
);
});
@@ -1454,20 +1465,203 @@ mod tests {
editor.update(&mut cx, |editor, cx| {
assert_eq!(
editor.text(cx),
format!("Lorem [@one.txt](@file:dir{slash}a{slash}one.txt) Ipsum [@seven.txt](@file:dir{slash}b{slash}seven.txt) \n[@six.txt](@file:dir{slash}b{slash}six.txt) ")
format!("Lorem [@one.txt](@file:a{slash}one.txt) Ipsum [@seven.txt](@file:b{slash}seven.txt) \n[@six.txt](@file:b{slash}six.txt) ")
);
assert!(!editor.has_visible_completions_menu());
assert_eq!(
fold_ranges(editor, cx),
vec![
Point::new(0, 6)..Point::new(0, 37),
Point::new(0, 45)..Point::new(0, 80),
Point::new(1, 0)..Point::new(1, 31)
Point::new(0, 6)..Point::new(0, 33),
Point::new(0, 41)..Point::new(0, 72),
Point::new(1, 0)..Point::new(1, 27)
]
);
});
}
#[gpui::test]
async fn test_context_completion_provider_multiple_worktrees(cx: &mut TestAppContext) {
init_test(cx);
let app_state = cx.update(AppState::test);
cx.update(|cx| {
language::init(cx);
editor::init(cx);
workspace::init(app_state.clone(), cx);
Project::init_settings(cx);
});
app_state
.fs
.as_fake()
.insert_tree(
path!("/project1"),
json!({
"a": {
"one.txt": "",
"two.txt": "",
}
}),
)
.await;
app_state
.fs
.as_fake()
.insert_tree(
path!("/project2"),
json!({
"b": {
"three.txt": "",
"four.txt": "",
}
}),
)
.await;
let project = Project::test(
app_state.fs.clone(),
[path!("/project1").as_ref(), path!("/project2").as_ref()],
cx,
)
.await;
let window = cx.add_window(|window, cx| Workspace::test_new(project.clone(), window, cx));
let workspace = window.root(cx).unwrap();
let worktrees = project.update(cx, |project, cx| {
let worktrees = project.worktrees(cx).collect::<Vec<_>>();
assert_eq!(worktrees.len(), 2);
worktrees
});
let mut cx = VisualTestContext::from_window(*window.deref(), cx);
let slash = PathStyle::local().separator();
for (worktree_idx, paths) in [
vec![rel_path("a/one.txt"), rel_path("a/two.txt")],
vec![rel_path("b/three.txt"), rel_path("b/four.txt")],
]
.iter()
.enumerate()
{
let worktree_id = worktrees[worktree_idx].read_with(&cx, |wt, _| wt.id());
for path in paths {
workspace
.update_in(&mut cx, |workspace, window, cx| {
workspace.open_path(
ProjectPath {
worktree_id,
path: (*path).into(),
},
None,
false,
window,
cx,
)
})
.await
.unwrap();
}
}
let editor = workspace.update_in(&mut cx, |workspace, window, cx| {
let editor = cx.new(|cx| {
Editor::new(
editor::EditorMode::full(),
multi_buffer::MultiBuffer::build_simple("", cx),
None,
window,
cx,
)
});
workspace.active_pane().update(cx, |pane, cx| {
pane.add_item(
Box::new(cx.new(|_| AtMentionEditor(editor.clone()))),
true,
true,
None,
window,
cx,
);
});
editor
});
let context_store = cx.new(|_| ContextStore::new(project.downgrade()));
let editor_entity = editor.downgrade();
editor.update_in(&mut cx, |editor, window, cx| {
window.focus(&editor.focus_handle(cx));
editor.set_completion_provider(Some(Rc::new(ContextPickerCompletionProvider::new(
workspace.downgrade(),
context_store.downgrade(),
None,
None,
editor_entity,
None,
))));
});
cx.simulate_input("@");
// With multiple worktrees, we should see the project name as prefix
editor.update(&mut cx, |editor, cx| {
assert_eq!(editor.text(cx), "@");
assert!(editor.has_visible_completions_menu());
let labels = current_completion_labels(editor);
assert!(
labels.contains(&format!("four.txt project2{slash}b{slash}")),
"Expected 'four.txt project2{slash}b{slash}' in labels: {:?}",
labels
);
assert!(
labels.contains(&format!("three.txt project2{slash}b{slash}")),
"Expected 'three.txt project2{slash}b{slash}' in labels: {:?}",
labels
);
});
editor.update_in(&mut cx, |editor, window, cx| {
editor.context_menu_next(&editor::actions::ContextMenuNext, window, cx);
editor.context_menu_next(&editor::actions::ContextMenuNext, window, cx);
editor.context_menu_next(&editor::actions::ContextMenuNext, window, cx);
editor.context_menu_next(&editor::actions::ContextMenuNext, window, cx);
editor.confirm_completion(&editor::actions::ConfirmCompletion::default(), window, cx);
});
cx.run_until_parked();
editor.update(&mut cx, |editor, cx| {
assert_eq!(editor.text(cx), "@file ");
assert!(editor.has_visible_completions_menu());
});
cx.simulate_input("one");
editor.update(&mut cx, |editor, cx| {
assert_eq!(editor.text(cx), "@file one");
assert!(editor.has_visible_completions_menu());
assert_eq!(
current_completion_labels(editor),
vec![format!("one.txt project1{slash}a{slash}")]
);
});
editor.update_in(&mut cx, |editor, window, cx| {
editor.confirm_completion(&editor::actions::ConfirmCompletion::default(), window, cx);
});
editor.update(&mut cx, |editor, cx| {
assert_eq!(
editor.text(cx),
format!("[@one.txt](@file:project1{slash}a{slash}one.txt) ")
);
assert!(!editor.has_visible_completions_menu());
});
}
fn fold_ranges(editor: &Editor, cx: &mut App) -> Vec<Range<Point>> {
let snapshot = editor.buffer().read(cx).snapshot(cx);
editor.display_map.update(cx, |display_map, cx| {

View File

@@ -197,34 +197,50 @@ pub(crate) fn search_files(
if query.is_empty() {
let workspace = workspace.read(cx);
let project = workspace.project().read(cx);
let visible_worktrees = workspace.visible_worktrees(cx).collect::<Vec<_>>();
let include_root_name = visible_worktrees.len() > 1;
let recent_matches = workspace
.recent_navigation_history(Some(10), cx)
.into_iter()
.filter_map(|(project_path, _)| {
let worktree = project.worktree_for_id(project_path.worktree_id, cx)?;
Some(FileMatch {
.map(|(project_path, _)| {
let path_prefix = if include_root_name {
project
.worktree_for_id(project_path.worktree_id, cx)
.map(|wt| wt.read(cx).root_name().into())
.unwrap_or_else(|| RelPath::empty().into())
} else {
RelPath::empty().into()
};
FileMatch {
mat: PathMatch {
score: 0.,
positions: Vec::new(),
worktree_id: project_path.worktree_id.to_usize(),
path: project_path.path,
path_prefix: worktree.read(cx).root_name().into(),
path_prefix,
distance_to_relative_ancestor: 0,
is_dir: false,
},
is_recent: true,
})
}
});
let file_matches = project.worktrees(cx).flat_map(|worktree| {
let file_matches = visible_worktrees.into_iter().flat_map(|worktree| {
let worktree = worktree.read(cx);
let path_prefix: Arc<RelPath> = if include_root_name {
worktree.root_name().into()
} else {
RelPath::empty().into()
};
worktree.entries(false, 0).map(move |entry| FileMatch {
mat: PathMatch {
score: 0.,
positions: Vec::new(),
worktree_id: worktree.id().to_usize(),
path: entry.path.clone(),
path_prefix: worktree.root_name().into(),
path_prefix: path_prefix.clone(),
distance_to_relative_ancestor: 0,
is_dir: entry.is_dir(),
},
@@ -235,6 +251,7 @@ pub(crate) fn search_files(
Task::ready(recent_matches.chain(file_matches).collect())
} else {
let worktrees = workspace.read(cx).visible_worktrees(cx).collect::<Vec<_>>();
let include_root_name = worktrees.len() > 1;
let candidate_sets = worktrees
.into_iter()
.map(|worktree| {
@@ -243,7 +260,7 @@ pub(crate) fn search_files(
PathMatchCandidateSet {
snapshot: worktree.snapshot(),
include_ignored: worktree.root_entry().is_some_and(|entry| entry.is_ignored),
include_root_name: true,
include_root_name,
candidates: project::Candidates::Entries,
}
})
@@ -276,6 +293,12 @@ pub fn extract_file_name_and_directory(
path_prefix: &RelPath,
path_style: PathStyle,
) -> (SharedString, Option<SharedString>) {
// If path is empty, this means we're matching with the root directory itself
// so we use the path_prefix as the name
if path.is_empty() && !path_prefix.is_empty() {
return (path_prefix.display(path_style).to_string().into(), None);
}
let full_path = path_prefix.join(path);
let file_name = full_path.file_name().unwrap_or_default();
let display_path = full_path.display(path_style);
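
The new early return in `extract_file_name_and_directory` handles matching the worktree root itself: an empty relative path with a non-empty prefix yields the prefix as the display name. A plain-`String` sketch of that logic (`split_name` is a hypothetical stand-in, not Zed's helper):

```rust
// Sketch of the root-directory special case above: when the relative
// path is empty, the prefix (the worktree root name) becomes the entry
// name and there is no directory component.
fn split_name(path: &str, prefix: &str) -> (String, Option<String>) {
    if path.is_empty() && !prefix.is_empty() {
        return (prefix.to_string(), None);
    }
    let full = if prefix.is_empty() {
        path.to_string()
    } else {
        format!("{prefix}/{path}")
    };
    match full.rsplit_once('/') {
        Some((dir, file)) => (file.to_string(), Some(format!("{dir}/"))),
        None => (full, None),
    }
}

fn main() {
    // Matching the root directory itself.
    assert_eq!(split_name("", "zed"), ("zed".to_string(), None));
    // Ordinary file, no prefix (single visible worktree).
    assert_eq!(
        split_name("a/one.txt", ""),
        ("one.txt".to_string(), Some("a/".to_string()))
    );
}
```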

View File

@@ -1,16 +1,32 @@
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0" xmlns:asmv3="urn:schemas-microsoft-com:asm.v3">
<asmv3:application>
<asmv3:windowsSettings>
<dpiAware xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">true</dpiAware>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
<trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
<security>
<requestedPrivileges>
<requestedExecutionLevel level="asInvoker" uiAccess="false" />
</requestedPrivileges>
</security>
</trustInfo>
<compatibility xmlns="urn:schemas-microsoft-com:compatibility.v1">
<application>
<!-- Windows 10 -->
<supportedOS Id="{8e0f7a12-bfb3-4fe8-b9a5-48fd50a15a9a}" />
</application>
</compatibility>
<application xmlns="urn:schemas-microsoft-com:asm.v3">
<windowsSettings>
<dpiAware xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">true/pm</dpiAware>
<dpiAwareness xmlns="http://schemas.microsoft.com/SMI/2016/WindowsSettings">PerMonitorV2</dpiAwareness>
</asmv3:windowsSettings>
</asmv3:application>
</windowsSettings>
</application>
<dependency>
<dependentAssembly>
<assemblyIdentity type='win32'
<assemblyIdentity
type='win32'
name='Microsoft.Windows.Common-Controls'
version='6.0.0.0' processorArchitecture='*'
publicKeyToken='6595b64144ccf1df' />
version='6.0.0.0'
processorArchitecture='*'
publicKeyToken='6595b64144ccf1df'
/>
</dependentAssembly>
</dependency>
</assembly>

View File

@@ -36,13 +36,31 @@ pub(crate) const JOBS: &[Job] = &[
std::fs::remove_file(&zed_wsl)
.context(format!("Failed to remove old file {}", zed_wsl.display()))
},
// TODO: remove after a few weeks once everyone is on the new version and this file never exists
|app_dir| {
let open_console = app_dir.join("OpenConsole.exe");
log::info!("Removing old file: {}", open_console.display());
std::fs::remove_file(&open_console).context(format!(
"Failed to remove old file {}",
open_console.display()
))
if open_console.exists() {
log::info!("Removing old file: {}", open_console.display());
std::fs::remove_file(&open_console).context(format!(
"Failed to remove old file {}",
open_console.display()
))?
}
Ok(())
},
|app_dir| {
let archs = ["x64", "arm64"];
for arch in archs {
let open_console = app_dir.join(format!("{arch}\\OpenConsole.exe"));
if open_console.exists() {
log::info!("Removing old file: {}", open_console.display());
std::fs::remove_file(&open_console).context(format!(
"Failed to remove old file {}",
open_console.display()
))?
}
}
Ok(())
},
|app_dir| {
let conpty = app_dir.join("conpty.dll");
@@ -100,20 +118,32 @@ pub(crate) const JOBS: &[Job] = &[
))
},
|app_dir| {
let open_console_source = app_dir.join("install\\OpenConsole.exe");
let open_console_dest = app_dir.join("OpenConsole.exe");
log::info!(
"Copying new file {} to {}",
open_console_source.display(),
open_console_dest.display()
);
std::fs::copy(&open_console_source, &open_console_dest)
.map(|_| ())
.context(format!(
"Failed to copy new file {} to {}",
open_console_source.display(),
open_console_dest.display()
))
let archs = ["x64", "arm64"];
for arch in archs {
let open_console_source = app_dir.join(format!("install\\{arch}\\OpenConsole.exe"));
let open_console_dest = app_dir.join(format!("{arch}\\OpenConsole.exe"));
if open_console_source.exists() {
log::info!(
"Copying new file {} to {}",
open_console_source.display(),
open_console_dest.display()
);
let parent = open_console_dest.parent().context(format!(
"Failed to get parent directory of {}",
open_console_dest.display()
))?;
std::fs::create_dir_all(parent)
.context(format!("Failed to create directory {}", parent.display()))?;
std::fs::copy(&open_console_source, &open_console_dest)
.map(|_| ())
.context(format!(
"Failed to copy new file {} to {}",
open_console_source.display(),
open_console_dest.display()
))?
}
}
Ok(())
},
|app_dir| {
let conpty_source = app_dir.join("install\\conpty.dll");

View File

@@ -66,6 +66,8 @@ pub enum Model {
Claude3Sonnet,
#[serde(rename = "claude-3-5-haiku", alias = "claude-3-5-haiku-latest")]
Claude3_5Haiku,
#[serde(rename = "claude-haiku-4-5", alias = "claude-haiku-4-5-latest")]
ClaudeHaiku4_5,
Claude3_5Sonnet,
Claude3Haiku,
// Amazon Nova Models
@@ -147,6 +149,8 @@ impl Model {
Ok(Self::Claude3Sonnet)
} else if id.starts_with("claude-3-5-haiku") {
Ok(Self::Claude3_5Haiku)
} else if id.starts_with("claude-haiku-4-5") {
Ok(Self::ClaudeHaiku4_5)
} else if id.starts_with("claude-3-7-sonnet") {
Ok(Self::Claude3_7Sonnet)
} else if id.starts_with("claude-3-7-sonnet-thinking") {
@@ -180,6 +184,7 @@ impl Model {
Model::Claude3Sonnet => "claude-3-sonnet",
Model::Claude3Haiku => "claude-3-haiku",
Model::Claude3_5Haiku => "claude-3-5-haiku",
Model::ClaudeHaiku4_5 => "claude-haiku-4-5",
Model::Claude3_7Sonnet => "claude-3-7-sonnet",
Model::Claude3_7SonnetThinking => "claude-3-7-sonnet-thinking",
Model::AmazonNovaLite => "amazon-nova-lite",
@@ -246,6 +251,7 @@ impl Model {
Model::Claude3Sonnet => "anthropic.claude-3-sonnet-20240229-v1:0",
Model::Claude3Haiku => "anthropic.claude-3-haiku-20240307-v1:0",
Model::Claude3_5Haiku => "anthropic.claude-3-5-haiku-20241022-v1:0",
Model::ClaudeHaiku4_5 => "anthropic.claude-haiku-4-5-20251001-v1:0",
Model::Claude3_7Sonnet | Model::Claude3_7SonnetThinking => {
"anthropic.claude-3-7-sonnet-20250219-v1:0"
}
@@ -309,6 +315,7 @@ impl Model {
Self::Claude3Sonnet => "Claude 3 Sonnet",
Self::Claude3Haiku => "Claude 3 Haiku",
Self::Claude3_5Haiku => "Claude 3.5 Haiku",
Self::ClaudeHaiku4_5 => "Claude Haiku 4.5",
Self::Claude3_7Sonnet => "Claude 3.7 Sonnet",
Self::Claude3_7SonnetThinking => "Claude 3.7 Sonnet Thinking",
Self::AmazonNovaLite => "Amazon Nova Lite",
@@ -363,6 +370,7 @@ impl Model {
| Self::Claude3Opus
| Self::Claude3Sonnet
| Self::Claude3_5Haiku
| Self::ClaudeHaiku4_5
| Self::Claude3_7Sonnet
| Self::ClaudeSonnet4
| Self::ClaudeOpus4
@@ -385,7 +393,7 @@ impl Model {
Self::Claude3Opus | Self::Claude3Sonnet | Self::Claude3_5Haiku => 4_096,
Self::Claude3_7Sonnet | Self::Claude3_7SonnetThinking => 128_000,
Self::ClaudeSonnet4 | Self::ClaudeSonnet4Thinking => 64_000,
Self::ClaudeSonnet4_5 | Self::ClaudeSonnet4_5Thinking => 64_000,
Self::ClaudeSonnet4_5 | Self::ClaudeSonnet4_5Thinking | Self::ClaudeHaiku4_5 => 64_000,
Self::ClaudeOpus4
| Self::ClaudeOpus4Thinking
| Self::ClaudeOpus4_1
@@ -404,6 +412,7 @@ impl Model {
| Self::Claude3Opus
| Self::Claude3Sonnet
| Self::Claude3_5Haiku
| Self::ClaudeHaiku4_5
| Self::Claude3_7Sonnet
| Self::ClaudeOpus4
| Self::ClaudeOpus4Thinking
@@ -438,7 +447,8 @@ impl Model {
| Self::ClaudeSonnet4Thinking
| Self::ClaudeSonnet4_5
| Self::ClaudeSonnet4_5Thinking
| Self::Claude3_5Haiku => true,
| Self::Claude3_5Haiku
| Self::ClaudeHaiku4_5 => true,
// Amazon Nova models (all support tool use)
Self::AmazonNovaPremier
@@ -464,6 +474,7 @@ impl Model {
// Nova models support only text caching
// https://docs.aws.amazon.com/bedrock/latest/userguide/prompt-caching.html#prompt-caching-models
Self::Claude3_5Haiku
| Self::ClaudeHaiku4_5
| Self::Claude3_7Sonnet
| Self::Claude3_7SonnetThinking
| Self::ClaudeSonnet4
@@ -500,7 +511,7 @@ impl Model {
min_total_token: 1024,
}),
Self::Claude3_5Haiku => Some(BedrockModelCacheConfiguration {
Self::Claude3_5Haiku | Self::ClaudeHaiku4_5 => Some(BedrockModelCacheConfiguration {
max_cache_anchors: 4,
min_total_token: 2048,
}),
@@ -569,6 +580,7 @@ impl Model {
(
Model::AmazonNovaPremier
| Model::Claude3_5Haiku
| Model::ClaudeHaiku4_5
| Model::Claude3_5Sonnet
| Model::Claude3_5SonnetV2
| Model::Claude3_7Sonnet
@@ -606,6 +618,7 @@ impl Model {
// Models available in EU
(
Model::Claude3_5Sonnet
| Model::ClaudeHaiku4_5
| Model::Claude3_7Sonnet
| Model::Claude3_7SonnetThinking
| Model::ClaudeSonnet4
@@ -624,6 +637,7 @@ impl Model {
(
Model::Claude3_5Sonnet
| Model::Claude3_5SonnetV2
| Model::ClaudeHaiku4_5
| Model::Claude3Haiku
| Model::Claude3Sonnet
| Model::Claude3_7Sonnet

View File

@@ -70,7 +70,6 @@ pub struct DiffHunk {
pub secondary_status: DiffHunkSecondaryStatus,
}
// FIXME
/// We store [`InternalDiffHunk`]s internally so we don't need to store the additional row range.
#[derive(Debug, Clone, PartialEq, Eq)]
struct InternalDiffHunk {
@@ -269,16 +268,6 @@ impl BufferDiffSnapshot {
self.secondary_diff.as_deref()
}
pub fn valid_and_invalid_hunks_intersecting_range<'a>(
&'a self,
range: Range<Anchor>,
buffer: &'a text::BufferSnapshot,
) -> impl 'a + Iterator<Item = DiffHunk> {
let unstaged_counterpart = self.secondary_diff.as_ref().map(|diff| &diff.inner);
self.inner
.hunks_intersecting_range(range, buffer, unstaged_counterpart)
}
pub fn hunks_intersecting_range<'a>(
&'a self,
range: Range<Anchor>,
@@ -287,7 +276,6 @@ impl BufferDiffSnapshot {
let unstaged_counterpart = self.secondary_diff.as_ref().map(|diff| &diff.inner);
self.inner
.hunks_intersecting_range(range, buffer, unstaged_counterpart)
.filter(|hunk| hunk.buffer_range.start.is_valid(buffer))
}
pub fn hunks_intersecting_range_rev<'a>(
@@ -574,6 +562,10 @@ impl BufferDiffInner {
let (start_point, (start_anchor, start_base)) = summaries.next()?;
let (mut end_point, (mut end_anchor, end_base)) = summaries.next()?;
if !start_anchor.is_valid(buffer) {
continue;
}
if end_point.column > 0 && end_point < max_point {
end_point.row += 1;
end_point.column = 0;
@@ -679,7 +671,7 @@ impl BufferDiffInner {
old_cursor.next();
new_cursor.next();
let mut start = None;
let mut end: Option<Anchor> = None;
let mut end = None;
loop {
match (new_cursor.item(), old_cursor.item()) {
@@ -691,11 +683,7 @@ impl BufferDiffInner {
{
Ordering::Less => {
start.get_or_insert(new_hunk.buffer_range.start);
if end.is_none_or(|end| {
end.cmp(&new_hunk.buffer_range.end, new_snapshot).is_lt()
}) {
end.replace(new_hunk.buffer_range.end);
}
end.replace(new_hunk.buffer_range.end);
new_cursor.next();
}
Ordering::Equal => {
@@ -707,14 +695,8 @@ impl BufferDiffInner {
.cmp(&new_hunk.buffer_range.end, new_snapshot)
.is_ge()
{
debug_assert!(end.is_none_or(|end| {
end.cmp(&old_hunk.buffer_range.end, new_snapshot).is_le()
}));
end.replace(old_hunk.buffer_range.end);
} else {
debug_assert!(end.is_none_or(|end| {
end.cmp(&new_hunk.buffer_range.end, new_snapshot).is_le()
}));
end.replace(new_hunk.buffer_range.end);
}
}
@@ -724,31 +706,19 @@ impl BufferDiffInner {
}
Ordering::Greater => {
start.get_or_insert(old_hunk.buffer_range.start);
if end.is_none_or(|end| {
end.cmp(&old_hunk.buffer_range.end, new_snapshot).is_lt()
}) {
end.replace(old_hunk.buffer_range.end);
}
end.replace(old_hunk.buffer_range.end);
old_cursor.next();
}
}
}
(Some(new_hunk), None) => {
start.get_or_insert(new_hunk.buffer_range.start);
if end
.is_none_or(|end| end.cmp(&new_hunk.buffer_range.end, new_snapshot).is_lt())
{
end.replace(new_hunk.buffer_range.end);
}
end.replace(new_hunk.buffer_range.end);
new_cursor.next();
}
(None, Some(old_hunk)) => {
start.get_or_insert(old_hunk.buffer_range.start);
if end
.is_none_or(|end| end.cmp(&old_hunk.buffer_range.end, new_snapshot).is_lt())
{
end.replace(old_hunk.buffer_range.end);
}
end.replace(old_hunk.buffer_range.end);
old_cursor.next();
}
(None, None) => break,
@@ -1170,7 +1140,6 @@ impl BufferDiff {
.map(|diff| &diff.read(cx).inner);
self.inner
.hunks_intersecting_range(range, buffer_snapshot, unstaged_counterpart)
.filter(|hunk| hunk.buffer_range.start.is_valid(buffer_snapshot))
}
pub fn hunks_intersecting_range_rev<'a>(

View File

@@ -358,6 +358,7 @@ fn main() -> Result<()> {
rayon::ThreadPoolBuilder::new()
.num_threads(4)
.stack_size(10 * 1024 * 1024)
.thread_name(|ix| format!("RayonWorker{}", ix))
.build_global()
.unwrap();
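
Naming the worker threads makes them identifiable in debuggers, profilers, and panic backtraces. Rayon's `ThreadPoolBuilder::thread_name` plays the same role for pool workers that `std::thread::Builder::name` plays for a single thread; a std-only sketch of the effect:

```rust
use std::thread;

fn main() {
    // Each pool worker gets a name like "RayonWorker0"; with std's
    // Builder the name is attached the same way and is visible from
    // inside the thread via thread::current().name().
    let handle = thread::Builder::new()
        .name("RayonWorker0".to_string())
        .stack_size(10 * 1024 * 1024)
        .spawn(|| thread::current().name().map(str::to_owned))
        .unwrap();
    let observed = handle.join().unwrap();
    assert_eq!(observed.as_deref(), Some("RayonWorker0"));
}
```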

View File

@@ -23,7 +23,11 @@ pub struct PredictEditsRequest {
pub cursor_point: Point,
/// Within `signatures`
pub excerpt_parent: Option<usize>,
#[serde(skip_serializing_if = "Vec::is_empty", default)]
pub included_files: Vec<IncludedFile>,
#[serde(skip_serializing_if = "Vec::is_empty", default)]
pub signatures: Vec<Signature>,
#[serde(skip_serializing_if = "Vec::is_empty", default)]
pub referenced_declarations: Vec<ReferencedDeclaration>,
pub events: Vec<Event>,
#[serde(default)]
@@ -44,6 +48,19 @@ pub struct PredictEditsRequest {
pub prompt_format: PromptFormat,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct IncludedFile {
pub path: Arc<Path>,
pub max_row: Line,
pub excerpts: Vec<Excerpt>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Excerpt {
pub start_line: Line,
pub text: Arc<str>,
}
#[derive(Debug, Clone, Copy, Serialize, Deserialize, PartialEq, EnumIter)]
pub enum PromptFormat {
MarkedExcerpt,

View File

@@ -1,11 +1,14 @@
//! Zeta2 prompt planning and generation code shared with cloud.
use anyhow::{Context as _, Result, anyhow};
use cloud_llm_client::predict_edits_v3::{self, Line, Point, PromptFormat, ReferencedDeclaration};
use cloud_llm_client::predict_edits_v3::{
self, Excerpt, Line, Point, PromptFormat, ReferencedDeclaration,
};
use indoc::indoc;
use ordered_float::OrderedFloat;
use rustc_hash::{FxHashMap, FxHashSet};
use serde::Serialize;
use std::cmp;
use std::fmt::Write;
use std::sync::Arc;
use std::{cmp::Reverse, collections::BinaryHeap, ops::Range, path::Path};
@@ -96,7 +99,177 @@ const UNIFIED_DIFF_REMINDER: &str = indoc! {"
If you're editing multiple files, be sure to reflect filename in the hunk's header.
"};
pub struct PlannedPrompt<'a> {
pub fn build_prompt(
request: &predict_edits_v3::PredictEditsRequest,
) -> Result<(String, SectionLabels)> {
let mut insertions = match request.prompt_format {
PromptFormat::MarkedExcerpt => vec![
(
Point {
line: request.excerpt_line_range.start,
column: 0,
},
EDITABLE_REGION_START_MARKER_WITH_NEWLINE,
),
(request.cursor_point, CURSOR_MARKER),
(
Point {
line: request.excerpt_line_range.end,
column: 0,
},
EDITABLE_REGION_END_MARKER_WITH_NEWLINE,
),
],
PromptFormat::LabeledSections => vec![(request.cursor_point, CURSOR_MARKER)],
PromptFormat::NumLinesUniDiff => {
vec![(request.cursor_point, CURSOR_MARKER)]
}
PromptFormat::OnlySnippets => vec![],
};
let mut prompt = match request.prompt_format {
PromptFormat::MarkedExcerpt => MARKED_EXCERPT_INSTRUCTIONS.to_string(),
PromptFormat::LabeledSections => LABELED_SECTIONS_INSTRUCTIONS.to_string(),
PromptFormat::NumLinesUniDiff => NUMBERED_LINES_INSTRUCTIONS.to_string(),
// only intended for use via zeta_cli
PromptFormat::OnlySnippets => String::new(),
};
if request.events.is_empty() {
prompt.push_str("(No edit history)\n\n");
} else {
prompt.push_str(
"The following are the latest edits made by the user, from earlier to later.\n\n",
);
push_events(&mut prompt, &request.events);
}
if request.prompt_format == PromptFormat::NumLinesUniDiff {
if request.referenced_declarations.is_empty() {
prompt.push_str(indoc! {"
# File under the cursor:
The cursor marker <|user_cursor|> indicates the current user cursor position.
The file is in current state, edits from edit history have been applied.
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
"});
} else {
// Note: This hasn't been trained on yet
prompt.push_str(indoc! {"
# Code Excerpts:
The cursor marker <|user_cursor|> indicates the current user cursor position.
Other excerpts of code from the project have been included as context based on their similarity to the code under the cursor.
Context excerpts are not guaranteed to be relevant, so use your own judgement.
Files are in their current state, edits from edit history have been applied.
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
"});
}
} else {
prompt.push_str("\n## Code\n\n");
}
let mut section_labels = Default::default();
if !request.referenced_declarations.is_empty() || !request.signatures.is_empty() {
let syntax_based_prompt = SyntaxBasedPrompt::populate(request)?;
section_labels = syntax_based_prompt.write(&mut insertions, &mut prompt)?;
} else {
if request.prompt_format == PromptFormat::LabeledSections {
anyhow::bail!("PromptFormat::LabeledSections cannot be used with ContextMode::Llm");
}
for related_file in &request.included_files {
writeln!(&mut prompt, "`````filename={}", related_file.path.display()).unwrap();
write_excerpts(
&related_file.excerpts,
if related_file.path == request.excerpt_path {
&insertions
} else {
&[]
},
related_file.max_row,
request.prompt_format == PromptFormat::NumLinesUniDiff,
&mut prompt,
);
write!(&mut prompt, "`````\n\n").unwrap();
}
}
if request.prompt_format == PromptFormat::NumLinesUniDiff {
prompt.push_str(UNIFIED_DIFF_REMINDER);
}
Ok((prompt, section_labels))
}
pub fn write_excerpts<'a>(
excerpts: impl IntoIterator<Item = &'a Excerpt>,
sorted_insertions: &[(Point, &str)],
file_line_count: Line,
include_line_numbers: bool,
output: &mut String,
) {
let mut current_row = Line(0);
let mut sorted_insertions = sorted_insertions.iter().peekable();
for excerpt in excerpts {
if excerpt.start_line > current_row {
writeln!(output, "").unwrap();
}
if excerpt.text.is_empty() {
return;
}
current_row = excerpt.start_line;
for mut line in excerpt.text.lines() {
if include_line_numbers {
write!(output, "{}|", current_row.0 + 1).unwrap();
}
while let Some((insertion_location, insertion_marker)) = sorted_insertions.peek() {
match current_row.cmp(&insertion_location.line) {
cmp::Ordering::Equal => {
let (prefix, suffix) = line.split_at(insertion_location.column as usize);
output.push_str(prefix);
output.push_str(insertion_marker);
line = suffix;
sorted_insertions.next();
}
cmp::Ordering::Less => break,
cmp::Ordering::Greater => {
sorted_insertions.next();
break;
}
}
}
output.push_str(line);
output.push('\n');
current_row.0 += 1;
}
}
if current_row < file_line_count {
writeln!(output, "").unwrap();
}
}
fn push_events(output: &mut String, events: &[predict_edits_v3::Event]) {
if events.is_empty() {
return;
};
writeln!(output, "`````diff").unwrap();
for event in events {
writeln!(output, "{}", event).unwrap();
}
writeln!(output, "`````\n").unwrap();
}
pub struct SyntaxBasedPrompt<'a> {
request: &'a predict_edits_v3::PredictEditsRequest,
/// Snippets to include in the prompt. These may overlap - they are merged / deduplicated in
/// `to_prompt_string`.
@@ -120,13 +293,13 @@ pub enum DeclarationStyle {
Declaration,
}
#[derive(Clone, Debug, Serialize)]
#[derive(Default, Clone, Debug, Serialize)]
pub struct SectionLabels {
pub excerpt_index: usize,
pub section_ranges: Vec<(Arc<Path>, Range<Line>)>,
}
impl<'a> PlannedPrompt<'a> {
impl<'a> SyntaxBasedPrompt<'a> {
/// Greedy one-pass knapsack algorithm to populate the prompt plan. Does the following:
///
/// Initializes a priority queue by populating it with each snippet, finding the
@@ -149,7 +322,7 @@ impl<'a> PlannedPrompt<'a> {
///
/// * Does not include file paths / other text when considering max_bytes.
pub fn populate(request: &'a predict_edits_v3::PredictEditsRequest) -> Result<Self> {
let mut this = PlannedPrompt {
let mut this = Self {
request,
snippets: Vec::new(),
budget_used: request.excerpt.len(),
@@ -354,7 +527,11 @@ impl<'a> PlannedPrompt<'a> {
/// Renders the planned context. Each file starts with "```FILE_PATH\n` and ends with triple
/// backticks, with a newline after each file. Outputs a line with "..." between nonconsecutive
/// chunks.
pub fn to_prompt_string(&'a self) -> Result<(String, SectionLabels)> {
pub fn write(
&'a self,
excerpt_file_insertions: &mut Vec<(Point, &'static str)>,
prompt: &mut String,
) -> Result<SectionLabels> {
let mut file_to_snippets: FxHashMap<&'a std::path::Path, Vec<&PlannedSnippet<'a>>> =
FxHashMap::default();
for snippet in &self.snippets {
@@ -383,95 +560,10 @@ impl<'a> PlannedPrompt<'a> {
excerpt_file_snippets.push(&excerpt_snippet);
file_snippets.push((&self.request.excerpt_path, excerpt_file_snippets, true));
let mut excerpt_file_insertions = match self.request.prompt_format {
PromptFormat::MarkedExcerpt => vec![
(
Point {
line: self.request.excerpt_line_range.start,
column: 0,
},
EDITABLE_REGION_START_MARKER_WITH_NEWLINE,
),
(self.request.cursor_point, CURSOR_MARKER),
(
Point {
line: self.request.excerpt_line_range.end,
column: 0,
},
EDITABLE_REGION_END_MARKER_WITH_NEWLINE,
),
],
PromptFormat::LabeledSections => vec![(self.request.cursor_point, CURSOR_MARKER)],
PromptFormat::NumLinesUniDiff => {
vec![(self.request.cursor_point, CURSOR_MARKER)]
}
PromptFormat::OnlySnippets => vec![],
};
let mut prompt = match self.request.prompt_format {
PromptFormat::MarkedExcerpt => MARKED_EXCERPT_INSTRUCTIONS.to_string(),
PromptFormat::LabeledSections => LABELED_SECTIONS_INSTRUCTIONS.to_string(),
PromptFormat::NumLinesUniDiff => NUMBERED_LINES_INSTRUCTIONS.to_string(),
// only intended for use via zeta_cli
PromptFormat::OnlySnippets => String::new(),
};
if self.request.events.is_empty() {
prompt.push_str("(No edit history)\n\n");
} else {
prompt.push_str(
"The following are the latest edits made by the user, from earlier to later.\n\n",
);
Self::push_events(&mut prompt, &self.request.events);
}
if self.request.prompt_format == PromptFormat::NumLinesUniDiff {
if self.request.referenced_declarations.is_empty() {
prompt.push_str(indoc! {"
# File under the cursor:
The cursor marker <|user_cursor|> indicates the current user cursor position.
The file is in current state, edits from edit history have been applied.
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
"});
} else {
// Note: This hasn't been trained on yet
prompt.push_str(indoc! {"
# Code Excerpts:
The cursor marker <|user_cursor|> indicates the current user cursor position.
Other excerpts of code from the project have been included as context based on their similarity to the code under the cursor.
Context excerpts are not guaranteed to be relevant, so use your own judgement.
Files are in their current state, edits from edit history have been applied.
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
"});
}
} else {
prompt.push_str("\n## Code\n\n");
}
let section_labels =
self.push_file_snippets(&mut prompt, &mut excerpt_file_insertions, file_snippets)?;
self.push_file_snippets(prompt, excerpt_file_insertions, file_snippets)?;
if self.request.prompt_format == PromptFormat::NumLinesUniDiff {
prompt.push_str(UNIFIED_DIFF_REMINDER);
}
Ok((prompt, section_labels))
}
fn push_events(output: &mut String, events: &[predict_edits_v3::Event]) {
if events.is_empty() {
return;
};
writeln!(output, "`````diff").unwrap();
for event in events {
writeln!(output, "{}", event).unwrap();
}
writeln!(output, "`````\n").unwrap();
Ok(section_labels)
}
fn push_file_snippets(

View File

@@ -66,6 +66,14 @@ impl CodestralCompletionProvider {
Self::api_key(cx).is_some()
}
/// This is so we can immediately show Codestral as a provider users can
/// switch to in the edit prediction menu, if the API key has been added
pub fn ensure_api_key_loaded(http_client: Arc<dyn HttpClient>, cx: &mut App) {
MistralLanguageModelProvider::global(http_client, cx)
.load_codestral_api_key(cx)
.detach();
}
fn api_key(cx: &App) -> Option<Arc<str>> {
MistralLanguageModelProvider::try_global(cx)
.and_then(|provider| provider.codestral_api_key(CODESTRAL_API_URL, cx))
@@ -79,6 +87,7 @@ impl CodestralCompletionProvider {
suffix: String,
model: String,
max_tokens: Option<u32>,
api_url: String,
) -> Result<String> {
let start_time = Instant::now();
@@ -111,7 +120,7 @@ impl CodestralCompletionProvider {
let http_request = http_client::Request::builder()
.method(http_client::Method::POST)
.uri(format!("{}/v1/fim/completions", CODESTRAL_API_URL))
.uri(format!("{}/v1/fim/completions", api_url))
.header("Content-Type", "application/json")
.header("Authorization", format!("Bearer {}", api_key))
.body(http_client::AsyncBody::from(request_body))?;
@@ -211,6 +220,12 @@ impl EditPredictionProvider for CodestralCompletionProvider {
.clone()
.unwrap_or_else(|| "codestral-latest".to_string());
let max_tokens = settings.edit_predictions.codestral.max_tokens;
let api_url = settings
.edit_predictions
.codestral
.api_url
.clone()
.unwrap_or_else(|| CODESTRAL_API_URL.to_string());
self.pending_request = Some(cx.spawn(async move |this, cx| {
if debounce {
@@ -242,6 +257,7 @@ impl EditPredictionProvider for CodestralCompletionProvider {
suffix,
model,
max_tokens,
api_url,
)
.await
{

View File

@@ -1,175 +0,0 @@
---
kind: Service
apiVersion: v1
metadata:
namespace: ${ZED_KUBE_NAMESPACE}
name: postgrest
annotations:
service.beta.kubernetes.io/do-loadbalancer-name: "postgrest-${ZED_KUBE_NAMESPACE}"
service.beta.kubernetes.io/do-loadbalancer-tls-ports: "443"
service.beta.kubernetes.io/do-loadbalancer-certificate-id: ${ZED_DO_CERTIFICATE_ID}
service.beta.kubernetes.io/do-loadbalancer-disable-lets-encrypt-dns-records: "true"
spec:
type: LoadBalancer
selector:
app: nginx
ports:
- name: web
protocol: TCP
port: 443
targetPort: 8080
---
apiVersion: apps/v1
kind: Deployment
metadata:
namespace: ${ZED_KUBE_NAMESPACE}
name: nginx
spec:
replicas: 1
selector:
matchLabels:
app: nginx
template:
metadata:
labels:
app: nginx
spec:
containers:
- name: nginx
image: nginx:latest
ports:
- containerPort: 8080
protocol: TCP
volumeMounts:
- name: nginx-config
mountPath: /etc/nginx/nginx.conf
subPath: nginx.conf
volumes:
- name: nginx-config
configMap:
name: nginx-config
---
apiVersion: v1
kind: ConfigMap
metadata:
namespace: ${ZED_KUBE_NAMESPACE}
name: nginx-config
data:
nginx.conf: |
events {}
http {
server {
listen 8080;
location /app/ {
proxy_pass http://postgrest-app:8080/;
}
location /llm/ {
proxy_pass http://postgrest-llm:8080/;
}
}
}
---
apiVersion: v1
kind: Service
metadata:
namespace: ${ZED_KUBE_NAMESPACE}
name: postgrest-app
spec:
selector:
app: postgrest-app
ports:
- protocol: TCP
port: 8080
targetPort: 8080
---
apiVersion: v1
kind: Service
metadata:
namespace: ${ZED_KUBE_NAMESPACE}
name: postgrest-llm
spec:
selector:
app: postgrest-llm
ports:
- protocol: TCP
port: 8080
targetPort: 8080
---
apiVersion: apps/v1
kind: Deployment
metadata:
namespace: ${ZED_KUBE_NAMESPACE}
name: postgrest-app
spec:
replicas: 1
selector:
matchLabels:
app: postgrest-app
template:
metadata:
labels:
app: postgrest-app
spec:
containers:
- name: postgrest
image: "postgrest/postgrest"
ports:
- containerPort: 8080
protocol: TCP
env:
- name: PGRST_SERVER_PORT
value: "8080"
- name: PGRST_DB_URI
valueFrom:
secretKeyRef:
name: database
key: url
- name: PGRST_JWT_SECRET
valueFrom:
secretKeyRef:
name: postgrest
key: jwt_secret
---
apiVersion: apps/v1
kind: Deployment
metadata:
namespace: ${ZED_KUBE_NAMESPACE}
name: postgrest-llm
spec:
replicas: 1
selector:
matchLabels:
app: postgrest-llm
template:
metadata:
labels:
app: postgrest-llm
spec:
containers:
- name: postgrest
image: "postgrest/postgrest"
ports:
- containerPort: 8080
protocol: TCP
env:
- name: PGRST_SERVER_PORT
value: "8080"
- name: PGRST_DB_URI
valueFrom:
secretKeyRef:
name: llm-database
key: url
- name: PGRST_JWT_SECRET
valueFrom:
secretKeyRef:
name: postgrest
key: jwt_secret

View File

@@ -1,4 +0,0 @@
db-uri = "postgres://postgres@localhost/zed"
server-port = 8081
jwt-secret = "the-postgrest-jwt-secret-for-authorization"
log-level = "info"

View File

@@ -1,4 +0,0 @@
db-uri = "postgres://postgres@localhost/zed_llm"
server-port = 8082
jwt-secret = "the-postgrest-jwt-secret-for-authorization"
log-level = "info"

View File

@@ -17,12 +17,14 @@ use editor::{
use fs::Fs;
use futures::{SinkExt, StreamExt, channel::mpsc, lock::Mutex};
use git::repository::repo_path;
use gpui::{App, Rgba, TestAppContext, UpdateGlobal, VisualContext, VisualTestContext};
use gpui::{
App, Rgba, SharedString, TestAppContext, UpdateGlobal, VisualContext, VisualTestContext,
};
use indoc::indoc;
use language::FakeLspAdapter;
use lsp::LSP_REQUEST_TIMEOUT;
use project::{
ProjectPath, SERVER_PROGRESS_THROTTLE_TIMEOUT,
ProgressToken, ProjectPath, SERVER_PROGRESS_THROTTLE_TIMEOUT,
lsp_store::lsp_ext_command::{ExpandedMacro, LspExtExpandMacro},
};
use recent_projects::disconnected_overlay::DisconnectedOverlay;
@@ -1283,12 +1285,14 @@ async fn test_language_server_statuses(cx_a: &mut TestAppContext, cx_b: &mut Tes
});
executor.run_until_parked();
let token = ProgressToken::String(SharedString::from("the-token"));
project_a.read_with(cx_a, |project, cx| {
let status = project.language_server_statuses(cx).next().unwrap().1;
assert_eq!(status.name.0, "the-language-server");
assert_eq!(status.pending_work.len(), 1);
assert_eq!(
status.pending_work["the-token"].message.as_ref().unwrap(),
status.pending_work[&token].message.as_ref().unwrap(),
"the-message"
);
});
@@ -1322,7 +1326,7 @@ async fn test_language_server_statuses(cx_a: &mut TestAppContext, cx_b: &mut Tes
assert_eq!(status.name.0, "the-language-server");
assert_eq!(status.pending_work.len(), 1);
assert_eq!(
status.pending_work["the-token"].message.as_ref().unwrap(),
status.pending_work[&token].message.as_ref().unwrap(),
"the-message-2"
);
});
@@ -1332,7 +1336,7 @@ async fn test_language_server_statuses(cx_a: &mut TestAppContext, cx_b: &mut Tes
assert_eq!(status.name.0, "the-language-server");
assert_eq!(status.pending_work.len(), 1);
assert_eq!(
status.pending_work["the-token"].message.as_ref().unwrap(),
status.pending_work[&token].message.as_ref().unwrap(),
"the-message-2"
);
});

View File

@@ -54,6 +54,10 @@ actions!(
CollapseSelectedChannel,
/// Expands the selected channel in the tree view.
ExpandSelectedChannel,
/// Opens the meeting notes for the selected channel in the panel.
///
/// Use `collab::OpenChannelNotes` to open the channel notes for the current call.
OpenSelectedChannelNotes,
/// Starts moving a channel to a new location.
StartMoveChannel,
/// Moves the selected item to the current location.
@@ -1265,6 +1269,13 @@ impl CollabPanel {
window.handler_for(&this, move |this, _, cx| {
this.copy_channel_link(channel_id, cx)
}),
)
.entry(
"Copy Channel Notes Link",
None,
window.handler_for(&this, move |this, _, cx| {
this.copy_channel_notes_link(channel_id, cx)
}),
);
let mut has_destructive_actions = false;
@@ -1849,6 +1860,17 @@ impl CollabPanel {
}
}
fn open_selected_channel_notes(
&mut self,
_: &OpenSelectedChannelNotes,
window: &mut Window,
cx: &mut Context<Self>,
) {
if let Some(channel) = self.selected_channel() {
self.open_channel_notes(channel.id, window, cx);
}
}
fn set_channel_visibility(
&mut self,
channel_id: ChannelId,
@@ -2220,6 +2242,15 @@ impl CollabPanel {
cx.write_to_clipboard(item)
}
fn copy_channel_notes_link(&mut self, channel_id: ChannelId, cx: &mut Context<Self>) {
let channel_store = self.channel_store.read(cx);
let Some(channel) = channel_store.channel_for_id(channel_id) else {
return;
};
let item = ClipboardItem::new_string(channel.notes_link(None, cx));
cx.write_to_clipboard(item)
}
fn render_signed_out(&mut self, cx: &mut Context<Self>) -> Div {
let collab_blurb = "Work with your team in realtime with collaborative editing, voice, shared notes and more.";
@@ -2960,6 +2991,7 @@ impl Render for CollabPanel {
.on_action(cx.listener(CollabPanel::remove_selected_channel))
.on_action(cx.listener(CollabPanel::show_inline_context_menu))
.on_action(cx.listener(CollabPanel::rename_selected_channel))
.on_action(cx.listener(CollabPanel::open_selected_channel_notes))
.on_action(cx.listener(CollabPanel::collapse_selected_channel))
.on_action(cx.listener(CollabPanel::expand_selected_channel))
.on_action(cx.listener(CollabPanel::start_move_selected_channel))

View File

@@ -20,6 +20,7 @@ command_palette_hooks.workspace = true
db.workspace = true
fuzzy.workspace = true
gpui.workspace = true
menu.workspace = true
log.workspace = true
picker.workspace = true
postage.workspace = true

View File

@@ -22,7 +22,7 @@ use persistence::COMMAND_PALETTE_HISTORY;
use picker::{Picker, PickerDelegate};
use postage::{sink::Sink, stream::Stream};
use settings::Settings;
use ui::{HighlightedLabel, KeyBinding, ListItem, ListItemSpacing, h_flex, prelude::*, v_flex};
use ui::{HighlightedLabel, KeyBinding, ListItem, ListItemSpacing, prelude::*};
use util::ResultExt;
use workspace::{ModalView, Workspace, WorkspaceSettings};
use zed_actions::{OpenZedUrl, command_palette::Toggle};
@@ -143,7 +143,7 @@ impl Focusable for CommandPalette {
}
impl Render for CommandPalette {
fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
fn render(&mut self, _window: &mut Window, _: &mut Context<Self>) -> impl IntoElement {
v_flex()
.key_context("CommandPalette")
.w(rems(34.))
@@ -261,6 +261,17 @@ impl CommandPaletteDelegate {
HashMap::new()
}
}
fn selected_command(&self) -> Option<&Command> {
let action_ix = self
.matches
.get(self.selected_ix)
.map(|m| m.candidate_id)
.unwrap_or(self.selected_ix);
// This gets called in headless tests where no commands are loaded,
// so we need to return an Option here.
self.commands.get(action_ix)
}
}
impl PickerDelegate for CommandPaletteDelegate {
@@ -411,7 +422,20 @@ impl PickerDelegate for CommandPaletteDelegate {
.log_err();
}
fn confirm(&mut self, _: bool, window: &mut Window, cx: &mut Context<Picker<Self>>) {
fn confirm(&mut self, secondary: bool, window: &mut Window, cx: &mut Context<Picker<Self>>) {
if secondary {
let Some(selected_command) = self.selected_command() else {
return;
};
let action_name = selected_command.action.name();
let open_keymap = Box::new(zed_actions::ChangeKeybinding {
action: action_name.to_string(),
});
window.dispatch_action(open_keymap, cx);
self.dismissed(window, cx);
return;
}
if self.matches.is_empty() {
self.dismissed(window, cx);
return;
@@ -448,6 +472,7 @@ impl PickerDelegate for CommandPaletteDelegate {
) -> Option<Self::ListItem> {
let matching_command = self.matches.get(ix)?;
let command = self.commands.get(matching_command.candidate_id)?;
Some(
ListItem::new(ix)
.inset(true)
@@ -470,6 +495,59 @@ impl PickerDelegate for CommandPaletteDelegate {
),
)
}
fn render_footer(
&self,
window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Option<AnyElement> {
let selected_command = self.selected_command()?;
let keybind =
KeyBinding::for_action_in(&*selected_command.action, &self.previous_focus_handle, cx);
let focus_handle = &self.previous_focus_handle;
let keybinding_buttons = if keybind.has_binding(window) {
Button::new("change", "Change Keybinding…")
.key_binding(
KeyBinding::for_action_in(&menu::SecondaryConfirm, focus_handle, cx)
.map(|kb| kb.size(rems_from_px(12.))),
)
.on_click(move |_, window, cx| {
window.dispatch_action(menu::SecondaryConfirm.boxed_clone(), cx);
})
} else {
Button::new("add", "Add Keybinding…")
.key_binding(
KeyBinding::for_action_in(&menu::SecondaryConfirm, focus_handle, cx)
.map(|kb| kb.size(rems_from_px(12.))),
)
.on_click(move |_, window, cx| {
window.dispatch_action(menu::SecondaryConfirm.boxed_clone(), cx);
})
};
Some(
h_flex()
.w_full()
.p_1p5()
.gap_1()
.justify_end()
.border_t_1()
.border_color(cx.theme().colors().border_variant)
.child(keybinding_buttons)
.child(
Button::new("run-action", "Run")
.key_binding(
KeyBinding::for_action_in(&menu::Confirm, &focus_handle, cx)
.map(|kb| kb.size(rems_from_px(12.))),
)
.on_click(|_, window, cx| {
window.dispatch_action(menu::Confirm.boxed_clone(), cx)
}),
)
.into_any(),
)
}
}
pub fn humanize_action_name(name: &str) -> String {

View File

@@ -12,6 +12,7 @@ use gpui::{
Action, AppContext, ClickEvent, Entity, FocusHandle, Focusable, MouseButton, ScrollStrategy,
Task, UniformListScrollHandle, WeakEntity, actions, uniform_list,
};
use itertools::Itertools;
use language::Point;
use project::{
Project,
@@ -24,7 +25,7 @@ use project::{
};
use ui::{
Divider, DividerColor, FluentBuilder as _, Indicator, IntoElement, ListItem, Render,
StatefulInteractiveElement, Tooltip, WithScrollbar, prelude::*,
ScrollAxes, StatefulInteractiveElement, Tooltip, WithScrollbar, prelude::*,
};
use util::rel_path::RelPath;
use workspace::Workspace;
@@ -55,6 +56,7 @@ pub(crate) struct BreakpointList {
focus_handle: FocusHandle,
scroll_handle: UniformListScrollHandle,
selected_ix: Option<usize>,
max_width_index: Option<usize>,
input: Entity<Editor>,
strip_mode: Option<ActiveBreakpointStripMode>,
serialize_exception_breakpoints_task: Option<Task<anyhow::Result<()>>>,
@@ -95,6 +97,7 @@ impl BreakpointList {
dap_store,
worktree_store,
breakpoints: Default::default(),
max_width_index: None,
workspace,
session,
focus_handle,
@@ -546,7 +549,7 @@ impl BreakpointList {
.session
.as_ref()
.map(|session| SupportedBreakpointProperties::from(session.read(cx).capabilities()))
.unwrap_or_else(SupportedBreakpointProperties::empty);
.unwrap_or_else(SupportedBreakpointProperties::all);
let strip_mode = self.strip_mode;
uniform_list(
@@ -570,6 +573,8 @@ impl BreakpointList {
.collect()
}),
)
.with_horizontal_sizing_behavior(gpui::ListHorizontalSizingBehavior::Unconstrained)
.with_width_from_item(self.max_width_index)
.track_scroll(self.scroll_handle.clone())
.flex_1()
}
@@ -732,6 +737,26 @@ impl Render for BreakpointList {
.chain(exception_breakpoints),
);
let text_pixels = ui::TextSize::Default.pixels(cx).to_f64() as f32;
self.max_width_index = self
.breakpoints
.iter()
.map(|entry| match &entry.kind {
BreakpointEntryKind::LineBreakpoint(line_bp) => {
let name_and_line = format!("{}:{}", line_bp.name, line_bp.line);
let dir_len = line_bp.dir.as_ref().map(|d| d.len()).unwrap_or(0);
(name_and_line.len() + dir_len) as f32 * text_pixels
}
BreakpointEntryKind::ExceptionBreakpoint(exc_bp) => {
exc_bp.data.label.len() as f32 * text_pixels
}
BreakpointEntryKind::DataBreakpoint(data_bp) => {
data_bp.0.context.human_readable_label().len() as f32 * text_pixels
}
})
.position_max_by(|left, right| left.total_cmp(right));
v_flex()
.id("breakpoint-list")
.key_context("BreakpointList")
@@ -749,7 +774,14 @@ impl Render for BreakpointList {
.size_full()
.pt_1()
.child(self.render_list(cx))
.vertical_scrollbar_for(self.scroll_handle.clone(), window, cx)
.custom_scrollbars(
ui::Scrollbars::new(ScrollAxes::Both)
.tracked_scroll_handle(self.scroll_handle.clone())
.with_track_along(ScrollAxes::Both, cx.theme().colors().panel_background)
.tracked_entity(cx.entity_id()),
window,
cx,
)
.when_some(self.strip_mode, |this, _| {
this.child(Divider::horizontal().color(DividerColor::Border))
.child(
@@ -1376,8 +1408,10 @@ impl RenderOnce for BreakpointOptionsStrip {
h_flex()
.gap_px()
.mr_3() // Space to avoid overlapping with the scrollbar
.child(
div()
.justify_end()
.when(has_logs || self.is_selected, |this| {
this.child(
div()
.map(self.add_focus_styles(
ActiveBreakpointStripMode::Log,
supports_logs,
@@ -1406,45 +1440,46 @@ impl RenderOnce for BreakpointOptionsStrip {
)
}),
)
.when(!has_logs && !self.is_selected, |this| this.invisible()),
)
.child(
div()
.map(self.add_focus_styles(
ActiveBreakpointStripMode::Condition,
supports_condition,
window,
cx,
))
.child(
IconButton::new(
SharedString::from(format!("{id}-condition-toggle")),
IconName::SplitAlt,
)
.shape(ui::IconButtonShape::Square)
.style(style_for_toggle(
)
})
.when(has_condition || self.is_selected, |this| {
this.child(
div()
.map(self.add_focus_styles(
ActiveBreakpointStripMode::Condition,
has_condition,
supports_condition,
window,
cx,
))
.icon_size(IconSize::Small)
.icon_color(color_for_toggle(has_condition))
.when(has_condition, |this| this.indicator(Indicator::dot().color(Color::Info)))
.disabled(!supports_condition)
.toggle_state(self.is_toggled(ActiveBreakpointStripMode::Condition))
.on_click(self.on_click_callback(ActiveBreakpointStripMode::Condition))
.tooltip(|_window, cx| {
Tooltip::with_meta(
"Set Condition",
None,
"Set condition to evaluate when a breakpoint is hit. Program execution will stop only when the condition is met.",
cx,
.child(
IconButton::new(
SharedString::from(format!("{id}-condition-toggle")),
IconName::SplitAlt,
)
}),
)
.when(!has_condition && !self.is_selected, |this| this.invisible()),
)
.child(
div()
.shape(ui::IconButtonShape::Square)
.style(style_for_toggle(
ActiveBreakpointStripMode::Condition,
has_condition,
))
.icon_size(IconSize::Small)
.icon_color(color_for_toggle(has_condition))
.when(has_condition, |this| this.indicator(Indicator::dot().color(Color::Info)))
.disabled(!supports_condition)
.toggle_state(self.is_toggled(ActiveBreakpointStripMode::Condition))
.on_click(self.on_click_callback(ActiveBreakpointStripMode::Condition))
.tooltip(|_window, cx| {
Tooltip::with_meta(
"Set Condition",
None,
"Set condition to evaluate when a breakpoint is hit. Program execution will stop only when the condition is met.",
cx,
)
}),
)
)
})
.when(has_hit_condition || self.is_selected, |this| {
this.child(div()
.map(self.add_focus_styles(
ActiveBreakpointStripMode::HitCondition,
supports_hit_condition,
@@ -1475,10 +1510,8 @@ impl RenderOnce for BreakpointOptionsStrip {
cx,
)
}),
)
.when(!has_hit_condition && !self.is_selected, |this| {
this.invisible()
}),
)
))
})
}
}

View File

@@ -10,8 +10,9 @@ use std::{
use editor::{Editor, EditorElement, EditorStyle};
use gpui::{
Action, Along, AppContext, Axis, DismissEvent, DragMoveEvent, Empty, Entity, FocusHandle,
Focusable, MouseButton, Point, ScrollStrategy, ScrollWheelEvent, Subscription, Task, TextStyle,
UniformList, UniformListScrollHandle, WeakEntity, actions, anchored, deferred, uniform_list,
Focusable, ListHorizontalSizingBehavior, MouseButton, Point, ScrollStrategy, ScrollWheelEvent,
Subscription, Task, TextStyle, UniformList, UniformListScrollHandle, WeakEntity, actions,
anchored, deferred, uniform_list,
};
use notifications::status_toast::{StatusToast, ToastIcon};
use project::debugger::{MemoryCell, dap_command::DataBreakpointContext, session::Session};
@@ -229,6 +230,7 @@ impl MemoryView {
},
)
.track_scroll(view_state.scroll_handle)
.with_horizontal_sizing_behavior(ListHorizontalSizingBehavior::Unconstrained)
.on_scroll_wheel(cx.listener(|this, evt: &ScrollWheelEvent, window, _| {
let mut view_state = this.view_state();
let delta = evt.delta.pixel_delta(window.line_height());
@@ -917,7 +919,17 @@ impl Render for MemoryView {
)
.with_priority(1)
}))
.vertical_scrollbar_for(self.view_state_handle.clone(), window, cx),
.custom_scrollbars(
ui::Scrollbars::new(ui::ScrollAxes::Both)
.tracked_scroll_handle(self.view_state_handle.clone())
.with_track_along(
ui::ScrollAxes::Both,
cx.theme().colors().panel_background,
)
.tracked_entity(cx.entity_id()),
window,
cx,
),
)
}
}

View File

@@ -566,6 +566,7 @@ impl StackFrameList {
this.activate_selected_entry(window, cx);
}))
.hover(|style| style.bg(cx.theme().colors().element_hover).cursor_pointer())
.overflow_x_scroll()
.child(
v_flex()
.gap_0p5()

View File

@@ -11,15 +11,18 @@ use gpui::{
FocusHandle, Focusable, Hsla, MouseDownEvent, Point, Subscription, TextStyleRefinement,
UniformListScrollHandle, WeakEntity, actions, anchored, deferred, uniform_list,
};
use itertools::Itertools;
use menu::{SelectFirst, SelectLast, SelectNext, SelectPrevious};
use project::debugger::{
dap_command::DataBreakpointContext,
session::{Session, SessionEvent, Watcher},
};
use std::{collections::HashMap, ops::Range, sync::Arc};
use ui::{ContextMenu, ListItem, ScrollableHandle, Tooltip, WithScrollbar, prelude::*};
use ui::{ContextMenu, ListItem, ScrollAxes, ScrollableHandle, Tooltip, WithScrollbar, prelude::*};
use util::{debug_panic, maybe};
static INDENT_STEP_SIZE: Pixels = px(10.0);
actions!(
variable_list,
[
@@ -185,6 +188,7 @@ struct VariableColor {
pub struct VariableList {
entries: Vec<ListEntry>,
max_width_index: Option<usize>,
entry_states: HashMap<EntryPath, EntryState>,
selected_stack_frame_id: Option<StackFrameId>,
list_handle: UniformListScrollHandle,
@@ -243,6 +247,7 @@ impl VariableList {
disabled: false,
edited_path: None,
entries: Default::default(),
max_width_index: None,
entry_states: Default::default(),
weak_running,
memory_view,
@@ -368,6 +373,26 @@ impl VariableList {
}
self.entries = entries;
let text_pixels = ui::TextSize::Default.pixels(cx).to_f64() as f32;
let indent_size = INDENT_STEP_SIZE.to_f64() as f32;
self.max_width_index = self
.entries
.iter()
.map(|entry| match &entry.entry {
DapEntry::Scope(scope) => scope.name.len() as f32 * text_pixels,
DapEntry::Variable(variable) => {
(variable.value.len() + variable.name.len()) as f32 * text_pixels
+ (entry.path.indices.len() as f32 * indent_size)
}
DapEntry::Watcher(watcher) => {
(watcher.value.len() + watcher.expression.len()) as f32 * text_pixels
+ (entry.path.indices.len() as f32 * indent_size)
}
})
.position_max_by(|left, right| left.total_cmp(right));
cx.notify();
}
@@ -1129,6 +1154,7 @@ impl VariableList {
this.color(Color::from(color))
}),
)
.tooltip(Tooltip::text(value))
}
})
.into_any_element()
@@ -1243,7 +1269,7 @@ impl VariableList {
.disabled(self.disabled)
.selectable(false)
.indent_level(state.depth)
.indent_step_size(px(10.))
.indent_step_size(INDENT_STEP_SIZE)
.always_show_disclosure_icon(true)
.when(var_ref > 0, |list_item| {
list_item.toggle(state.is_expanded).on_toggle(cx.listener({
@@ -1444,7 +1470,7 @@ impl VariableList {
.disabled(self.disabled)
.selectable(false)
.indent_level(state.depth)
.indent_step_size(px(10.))
.indent_step_size(INDENT_STEP_SIZE)
.always_show_disclosure_icon(true)
.when(var_ref > 0, |list_item| {
list_item.toggle(state.is_expanded).on_toggle(cx.listener({
@@ -1506,7 +1532,6 @@ impl Render for VariableList {
.key_context("VariableList")
.id("variable-list")
.group("variable-list")
.overflow_y_scroll()
.size_full()
.on_action(cx.listener(Self::select_first))
.on_action(cx.listener(Self::select_last))
@@ -1532,6 +1557,9 @@ impl Render for VariableList {
}),
)
.track_scroll(self.list_handle.clone())
.with_width_from_item(self.max_width_index)
.with_sizing_behavior(gpui::ListSizingBehavior::Auto)
.with_horizontal_sizing_behavior(gpui::ListHorizontalSizingBehavior::Unconstrained)
.gap_1_5()
.size_full()
.flex_grow(),
@@ -1545,7 +1573,15 @@ impl Render for VariableList {
)
.with_priority(1)
}))
.vertical_scrollbar_for(self.list_handle.clone(), window, cx)
// .vertical_scrollbar_for(self.list_handle.clone(), window, cx)
.custom_scrollbars(
ui::Scrollbars::new(ScrollAxes::Both)
.tracked_scroll_handle(self.list_handle.clone())
.with_track_along(ScrollAxes::Both, cx.theme().colors().panel_background)
.tracked_entity(cx.entity_id()),
window,
cx,
)
}
}

View File

@@ -1,5 +1,5 @@
use anyhow::Result;
use client::{UserStore, zed_urls};
use client::{Client, UserStore, zed_urls};
use cloud_llm_client::UsageLimit;
use codestral::CodestralCompletionProvider;
use copilot::{Copilot, Status};
@@ -192,6 +192,7 @@ impl Render for EditPredictionButton {
Some(ContextMenu::build(window, cx, |menu, _, _| {
let fs = fs.clone();
let activate_url = activate_url.clone();
menu.entry("Sign In", None, move |_, cx| {
cx.open_url(activate_url.as_str())
})
@@ -244,15 +245,8 @@ impl Render for EditPredictionButton {
} else {
Some(ContextMenu::build(window, cx, |menu, _, _| {
let fs = fs.clone();
menu.entry("Use Zed AI instead", None, move |_, cx| {
set_completion_provider(
fs.clone(),
cx,
EditPredictionProvider::Zed,
)
})
.separator()
.entry(
menu.entry(
"Configure Codestral API Key",
None,
move |window, cx| {
@@ -262,6 +256,18 @@ impl Render for EditPredictionButton {
);
},
)
.separator()
.entry(
"Use Zed AI instead",
None,
move |_, cx| {
set_completion_provider(
fs.clone(),
cx,
EditPredictionProvider::Zed,
)
},
)
}))
}
})
@@ -412,6 +418,7 @@ impl EditPredictionButton {
fs: Arc<dyn Fs>,
user_store: Entity<UserStore>,
popover_menu_handle: PopoverMenuHandle<ContextMenu>,
client: Arc<Client>,
cx: &mut Context<Self>,
) -> Self {
if let Some(copilot) = Copilot::global(cx) {
@@ -421,6 +428,8 @@ impl EditPredictionButton {
cx.observe_global::<SettingsStore>(move |_, cx| cx.notify())
.detach();
CodestralCompletionProvider::ensure_api_key_loaded(client.http_client(), cx);
Self {
editor_subscription: None,
editor_enabled: None,
@@ -435,6 +444,89 @@ impl EditPredictionButton {
}
}
fn get_available_providers(&self, cx: &App) -> Vec<EditPredictionProvider> {
let mut providers = Vec::new();
providers.push(EditPredictionProvider::Zed);
if let Some(copilot) = Copilot::global(cx) {
if matches!(copilot.read(cx).status(), Status::Authorized) {
providers.push(EditPredictionProvider::Copilot);
}
}
if let Some(supermaven) = Supermaven::global(cx) {
if let Supermaven::Spawned(agent) = supermaven.read(cx) {
if matches!(agent.account_status, AccountStatus::Ready) {
providers.push(EditPredictionProvider::Supermaven);
}
}
}
if CodestralCompletionProvider::has_api_key(cx) {
providers.push(EditPredictionProvider::Codestral);
}
providers
}
fn add_provider_switching_section(
&self,
mut menu: ContextMenu,
current_provider: EditPredictionProvider,
cx: &App,
) -> ContextMenu {
let available_providers = self.get_available_providers(cx);
let other_providers: Vec<_> = available_providers
.into_iter()
.filter(|p| *p != current_provider && *p != EditPredictionProvider::None)
.collect();
if !other_providers.is_empty() {
menu = menu.separator().header("Switch Providers");
for provider in other_providers {
let fs = self.fs.clone();
menu = match provider {
EditPredictionProvider::Zed => menu.item(
ContextMenuEntry::new("Zed AI")
.documentation_aside(
DocumentationSide::Left,
DocumentationEdge::Top,
|_| {
Label::new("Zed's edit prediction is powered by Zeta, an open-source, open-dataset model.")
.into_any_element()
},
)
.handler(move |_, cx| {
set_completion_provider(fs.clone(), cx, provider);
}),
),
EditPredictionProvider::Copilot => {
menu.entry("GitHub Copilot", None, move |_, cx| {
set_completion_provider(fs.clone(), cx, provider);
})
}
EditPredictionProvider::Supermaven => {
menu.entry("Supermaven", None, move |_, cx| {
set_completion_provider(fs.clone(), cx, provider);
})
}
EditPredictionProvider::Codestral => {
menu.entry("Codestral", None, move |_, cx| {
set_completion_provider(fs.clone(), cx, provider);
})
}
EditPredictionProvider::None => continue,
};
}
}
menu
}
pub fn build_copilot_start_menu(
&mut self,
window: &mut Window,
@@ -572,8 +664,10 @@ impl EditPredictionButton {
}
menu = menu.separator().header("Privacy");
if let Some(provider) = &self.edit_prediction_provider {
let data_collection = provider.data_collection_state(cx);
if data_collection.is_supported() {
let provider = provider.clone();
let enabled = data_collection.is_enabled();
@@ -691,7 +785,7 @@ impl EditPredictionButton {
}
}),
).item(
ContextMenuEntry::new("View Documentation")
ContextMenuEntry::new("View Docs")
.icon(IconName::FileGeneric)
.icon_color(Color::Muted)
.handler(move |_, cx| {
@@ -711,6 +805,7 @@ impl EditPredictionButton {
if let Some(editor_focus_handle) = self.editor_focus_handle.clone() {
menu = menu
.separator()
.header("Actions")
.entry(
"Predict Edit at Cursor",
Some(Box::new(ShowEditPrediction)),
@@ -721,7 +816,11 @@ impl EditPredictionButton {
}
},
)
.context(editor_focus_handle);
.context(editor_focus_handle)
.when(
cx.has_flag::<PredictEditsRateCompletionsFeatureFlag>(),
|this| this.action("Rate Completions", RateCompletions.boxed_clone()),
);
}
menu
@@ -733,15 +832,11 @@ impl EditPredictionButton {
cx: &mut Context<Self>,
) -> Entity<ContextMenu> {
ContextMenu::build(window, cx, |menu, window, cx| {
self.build_language_settings_menu(menu, window, cx)
.separator()
.entry("Use Zed AI instead", None, {
let fs = self.fs.clone();
move |_window, cx| {
set_completion_provider(fs.clone(), cx, EditPredictionProvider::Zed)
}
})
.separator()
let menu = self.build_language_settings_menu(menu, window, cx);
let menu =
self.add_provider_switching_section(menu, EditPredictionProvider::Copilot, cx);
menu.separator()
.link(
"Go to Copilot Settings",
OpenBrowser {
@@ -759,8 +854,11 @@ impl EditPredictionButton {
cx: &mut Context<Self>,
) -> Entity<ContextMenu> {
ContextMenu::build(window, cx, |menu, window, cx| {
self.build_language_settings_menu(menu, window, cx)
.separator()
let menu = self.build_language_settings_menu(menu, window, cx);
let menu =
self.add_provider_switching_section(menu, EditPredictionProvider::Supermaven, cx);
menu.separator()
.action("Sign Out", supermaven::SignOut.boxed_clone())
})
}
@@ -770,14 +868,12 @@ impl EditPredictionButton {
window: &mut Window,
cx: &mut Context<Self>,
) -> Entity<ContextMenu> {
let fs = self.fs.clone();
ContextMenu::build(window, cx, |menu, window, cx| {
self.build_language_settings_menu(menu, window, cx)
.separator()
.entry("Use Zed AI instead", None, move |_, cx| {
set_completion_provider(fs.clone(), cx, EditPredictionProvider::Zed)
})
.separator()
let menu = self.build_language_settings_menu(menu, window, cx);
let menu =
self.add_provider_switching_section(menu, EditPredictionProvider::Codestral, cx);
menu.separator()
.entry("Configure Codestral API Key", None, move |window, cx| {
window.dispatch_action(zed_actions::agent::OpenSettings.boxed_clone(), cx);
})
@@ -872,10 +968,10 @@ impl EditPredictionButton {
.separator();
}
self.build_language_settings_menu(menu, window, cx).when(
cx.has_flag::<PredictEditsRateCompletionsFeatureFlag>(),
|this| this.action("Rate Completions", RateCompletions.boxed_clone()),
)
let menu = self.build_language_settings_menu(menu, window, cx);
let menu = self.add_provider_switching_section(menu, EditPredictionProvider::Zed, cx);
menu
})
}


@@ -21,8 +21,7 @@ fn to_tab_point_benchmark(c: &mut Criterion) {
let buffer_snapshot = cx.read(|cx| buffer.read(cx).snapshot(cx));
use editor::display_map::*;
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot.clone());
let fold_point = fold_snapshot.to_fold_point(
inlay_snapshot.to_point(InlayOffset(rng.random_range(0..length))),
@@ -66,8 +65,7 @@ fn to_fold_point_benchmark(c: &mut Criterion) {
let buffer_snapshot = cx.read(|cx| buffer.read(cx).snapshot(cx));
use editor::display_map::*;
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot.clone());
let fold_point = fold_snapshot.to_fold_point(


@@ -539,6 +539,10 @@ actions!(
GoToParentModule,
/// Goes to the previous change in the file.
GoToPreviousChange,
/// Goes to the next reference to the symbol under the cursor.
GoToNextReference,
/// Goes to the previous reference to the symbol under the cursor.
GoToPreviousReference,
/// Goes to the type definition of the symbol at cursor.
GoToTypeDefinition,
/// Goes to type definition in a split pane.
@@ -617,6 +621,8 @@ actions!(
NextEditPrediction,
/// Scrolls to the next screen.
NextScreen,
/// Goes to the next snippet tabstop if one exists.
NextSnippetTabstop,
/// Opens the context menu at cursor position.
OpenContextMenu,
/// Opens excerpts from the current file.
@@ -650,6 +656,8 @@ actions!(
Paste,
/// Navigates to the previous edit prediction.
PreviousEditPrediction,
/// Goes to the previous snippet tabstop if one exists.
PreviousSnippetTabstop,
/// Redoes the last undone edit.
Redo,
/// Redoes the last selection change.


@@ -20,7 +20,6 @@
mod block_map;
mod crease_map;
mod custom_highlights;
mod filter_map;
mod fold_map;
mod inlay_map;
pub(crate) mod invisibles;
@@ -28,8 +27,7 @@ mod tab_map;
mod wrap_map;
use crate::{
EditorStyle, RowExt, display_map::filter_map::FilterSnapshot, hover_links::InlayHighlight,
inlays::Inlay, movement::TextLayoutDetails,
EditorStyle, RowExt, hover_links::InlayHighlight, inlays::Inlay, movement::TextLayoutDetails,
};
pub use block_map::{
Block, BlockChunks as DisplayChunks, BlockContext, BlockId, BlockMap, BlockPlacement,
@@ -39,7 +37,6 @@ pub use block_map::{
use block_map::{BlockRow, BlockSnapshot};
use collections::{HashMap, HashSet};
pub use crease_map::*;
pub use filter_map::FilterMode;
use fold_map::FoldSnapshot;
pub use fold_map::{
ChunkRenderer, ChunkRendererContext, ChunkRendererId, Fold, FoldId, FoldPlaceholder, FoldPoint,
@@ -75,9 +72,7 @@ use ui::{SharedString, px};
use unicode_segmentation::UnicodeSegmentation;
use wrap_map::{WrapMap, WrapSnapshot};
pub use crate::display_map::{
filter_map::FilterMap, fold_map::FoldMap, inlay_map::InlayMap, tab_map::TabMap,
};
pub use crate::display_map::{fold_map::FoldMap, inlay_map::InlayMap, tab_map::TabMap};
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
pub enum FoldStatus {
@@ -106,9 +101,6 @@ pub struct DisplayMap {
/// The buffer that we are displaying.
buffer: Entity<MultiBuffer>,
buffer_subscription: BufferSubscription,
/// Filters out certain entire lines (based on git status). Used to
/// implement split diff view
filter_map: FilterMap,
/// Decides where the [`Inlay`]s should be displayed.
inlay_map: InlayMap,
/// Decides where the fold indicators should be and tracks parts of a source file that are currently folded.
@@ -141,7 +133,6 @@ impl DisplayMap {
excerpt_header_height: u32,
fold_placeholder: FoldPlaceholder,
diagnostics_max_severity: DiagnosticSeverity,
filter_mode: Option<FilterMode>,
cx: &mut Context<Self>,
) -> Self {
let buffer_subscription = buffer.update(cx, |buffer, _| buffer.subscribe());
@@ -149,8 +140,7 @@ impl DisplayMap {
let tab_size = Self::tab_size(&buffer, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let crease_map = CreaseMap::new(&buffer_snapshot);
let (filter_map, snapshot) = FilterMap::new(filter_mode, buffer_snapshot);
let (inlay_map, snapshot) = InlayMap::new(snapshot);
let (inlay_map, snapshot) = InlayMap::new(buffer_snapshot);
let (fold_map, snapshot) = FoldMap::new(snapshot);
let (tab_map, snapshot) = TabMap::new(snapshot, tab_size);
let (wrap_map, snapshot) = WrapMap::new(snapshot, font, font_size, wrap_width, cx);
@@ -161,9 +151,8 @@ impl DisplayMap {
DisplayMap {
buffer,
buffer_subscription,
filter_map,
inlay_map,
fold_map,
inlay_map,
tab_map,
wrap_map,
block_map,
@@ -180,8 +169,7 @@ impl DisplayMap {
pub fn snapshot(&mut self, cx: &mut Context<Self>) -> DisplaySnapshot {
let buffer_snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let (filter_snapshot, edits) = self.filter_map.sync(buffer_snapshot, edits);
let (inlay_snapshot, edits) = self.inlay_map.sync(filter_snapshot, edits);
let (inlay_snapshot, edits) = self.inlay_map.sync(buffer_snapshot, edits);
let (fold_snapshot, edits) = self.fold_map.read(inlay_snapshot, edits);
let tab_size = Self::tab_size(&self.buffer, cx);
let (tab_snapshot, edits) = self.tab_map.sync(fold_snapshot, edits, tab_size);
@@ -222,8 +210,7 @@ impl DisplayMap {
let buffer_snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.filter_map.sync(buffer_snapshot.clone(), edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(buffer_snapshot.clone(), edits);
let (mut fold_map, snapshot, edits) = self.fold_map.write(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
@@ -296,7 +283,6 @@ impl DisplayMap {
let snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.filter_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (mut fold_map, snapshot, edits) = self.fold_map.write(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
@@ -326,7 +312,6 @@ impl DisplayMap {
.collect::<Vec<_>>();
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.filter_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (mut fold_map, snapshot, edits) = self.fold_map.write(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
@@ -349,7 +334,6 @@ impl DisplayMap {
let snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.filter_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
@@ -368,7 +352,6 @@ impl DisplayMap {
let snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.filter_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
@@ -387,7 +370,6 @@ impl DisplayMap {
let snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.filter_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
@@ -432,7 +414,6 @@ impl DisplayMap {
let snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.filter_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
@@ -447,7 +428,6 @@ impl DisplayMap {
let snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.filter_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
@@ -466,7 +446,6 @@ impl DisplayMap {
let snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.filter_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
@@ -485,7 +464,6 @@ impl DisplayMap {
let snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.filter_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
@@ -556,30 +534,6 @@ impl DisplayMap {
.update(cx, |map, cx| map.set_wrap_width(width, cx))
}
#[cfg(test)]
pub fn set_filter_mode(&mut self, filter_mode: Option<FilterMode>, cx: &mut Context<Self>) {
let snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.filter_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (_, snapshot, edits) = self.fold_map.write(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
self.block_map.read(snapshot, edits);
let (filter_snapshot, filter_edits) = self.filter_map.set_mode(filter_mode);
let (snapshot, edits) = self.inlay_map.sync(filter_snapshot, filter_edits);
let (_, snapshot, edits) = self.fold_map.write(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
self.block_map.read(snapshot, edits);
}
pub fn update_fold_widths(
&mut self,
widths: impl IntoIterator<Item = (ChunkRendererId, Pixels)>,
@@ -588,7 +542,6 @@ impl DisplayMap {
let snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.filter_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (mut fold_map, snapshot, edits) = self.fold_map.write(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
@@ -618,19 +571,12 @@ impl DisplayMap {
to_insert: Vec<Inlay>,
cx: &mut Context<Self>,
) {
// when I type `ggdG`
// - calculate start and end of edit in display coords
// - call display_map.edit(vec![edit_range])
// - call buffer.set_text(new_text)
//
//
if to_remove.is_empty() && to_insert.is_empty() {
return;
}
let buffer_snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let (snapshot, edits) = self.filter_map.sync(buffer_snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.inlay_map.sync(buffer_snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
@@ -827,16 +773,6 @@ impl DisplaySnapshot {
.inlay_snapshot
}
pub fn filter_snapshot(&self) -> &FilterSnapshot {
&self
.block_snapshot
.wrap_snapshot
.tab_snapshot
.fold_snapshot
.inlay_snapshot
.filter_snapshot
}
pub fn buffer_snapshot(&self) -> &MultiBufferSnapshot {
&self
.block_snapshot
@@ -844,8 +780,7 @@ impl DisplaySnapshot {
.tab_snapshot
.fold_snapshot
.inlay_snapshot
.filter_snapshot
.buffer_snapshot
.buffer
}
#[cfg(test)]
@@ -867,15 +802,11 @@ impl DisplaySnapshot {
pub fn prev_line_boundary(&self, mut point: MultiBufferPoint) -> (Point, DisplayPoint) {
loop {
let mut filter_point = self.filter_snapshot().to_filter_point(point);
let mut inlay_point = self.inlay_snapshot().to_inlay_point(filter_point);
let mut inlay_point = self.inlay_snapshot().to_inlay_point(point);
let mut fold_point = self.fold_snapshot().to_fold_point(inlay_point, Bias::Left);
fold_point.0.column = 0;
inlay_point = fold_point.to_inlay_point(self.fold_snapshot());
filter_point = self.inlay_snapshot().to_filter_point(inlay_point);
point = self
.filter_snapshot()
.to_buffer_point(filter_point, Bias::Left);
point = self.inlay_snapshot().to_buffer_point(inlay_point);
let mut display_point = self.point_to_display_point(point, Bias::Left);
*display_point.column_mut() = 0;
@@ -893,15 +824,11 @@ impl DisplaySnapshot {
) -> (MultiBufferPoint, DisplayPoint) {
let original_point = point;
loop {
let mut filter_point = self.filter_snapshot().to_filter_point(point);
let mut inlay_point = self.inlay_snapshot().to_inlay_point(filter_point);
let mut inlay_point = self.inlay_snapshot().to_inlay_point(point);
let mut fold_point = self.fold_snapshot().to_fold_point(inlay_point, Bias::Right);
fold_point.0.column = self.fold_snapshot().line_len(fold_point.row());
inlay_point = fold_point.to_inlay_point(self.fold_snapshot());
filter_point = self.inlay_snapshot().to_filter_point(inlay_point);
point = self
.filter_snapshot()
.to_buffer_point(filter_point, Bias::Right);
point = self.inlay_snapshot().to_buffer_point(inlay_point);
let mut display_point = self.point_to_display_point(point, Bias::Right);
*display_point.column_mut() = self.line_len(display_point.row());
@@ -930,8 +857,7 @@ impl DisplaySnapshot {
}
pub fn point_to_display_point(&self, point: MultiBufferPoint, bias: Bias) -> DisplayPoint {
let filter_point = self.filter_snapshot().to_filter_point(point);
let inlay_point = self.inlay_snapshot().to_inlay_point(filter_point);
let inlay_point = self.inlay_snapshot().to_inlay_point(point);
let fold_point = self.fold_snapshot().to_fold_point(inlay_point, bias);
let tab_point = self.tab_snapshot().to_tab_point(fold_point);
let wrap_point = self.wrap_snapshot().tab_point_to_wrap_point(tab_point);
@@ -940,9 +866,8 @@ impl DisplaySnapshot {
}
pub fn display_point_to_point(&self, point: DisplayPoint, bias: Bias) -> Point {
let inlay_point = self.display_point_to_inlay_point(point, bias);
let filter_point = self.inlay_snapshot().to_filter_point(inlay_point);
self.filter_snapshot().to_buffer_point(filter_point, bias)
self.inlay_snapshot()
.to_buffer_point(self.display_point_to_inlay_point(point, bias))
}
pub fn display_point_to_inlay_offset(&self, point: DisplayPoint, bias: Bias) -> InlayOffset {
@@ -951,10 +876,8 @@ impl DisplaySnapshot {
}
pub fn anchor_to_inlay_offset(&self, anchor: Anchor) -> InlayOffset {
let filter_offset = self
.filter_snapshot()
.to_filter_offset(anchor.to_offset(self.buffer_snapshot()));
self.inlay_snapshot().to_inlay_offset(filter_offset)
self.inlay_snapshot()
.to_inlay_offset(anchor.to_offset(self.buffer_snapshot()))
}
pub fn display_point_to_anchor(&self, point: DisplayPoint, bias: Bias) -> Anchor {
@@ -1598,9 +1521,8 @@ impl DisplayPoint {
let tab_point = map.wrap_snapshot().to_tab_point(wrap_point);
let fold_point = map.tab_snapshot().to_fold_point(tab_point, bias).0;
let inlay_point = fold_point.to_inlay_point(map.fold_snapshot());
let filter_point = map.inlay_snapshot().to_filter_point(inlay_point);
map.filter_snapshot()
.to_buffer_offset(map.filter_snapshot().point_to_offset(filter_point), bias)
map.inlay_snapshot()
.to_buffer_offset(map.inlay_snapshot().to_offset(inlay_point))
}
}
@@ -1705,7 +1627,6 @@ pub mod tests {
excerpt_header_height,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -1955,7 +1876,6 @@ pub mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -2066,7 +1986,6 @@ pub mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -2129,7 +2048,6 @@ pub mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -2227,7 +2145,6 @@ pub mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -2329,7 +2246,6 @@ pub mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -2434,7 +2350,6 @@ pub mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -2524,7 +2439,6 @@ pub mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -2666,7 +2580,6 @@ pub mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -2755,7 +2668,6 @@ pub mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -2881,7 +2793,6 @@ pub mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
);
let snapshot = map.buffer.read(cx).snapshot(cx);
@@ -2920,7 +2831,6 @@ pub mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -2997,7 +2907,6 @@ pub mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});


@@ -257,16 +257,6 @@ pub enum BlockId {
ExcerptBoundary(ExcerptId),
FoldedBuffer(ExcerptId),
Custom(CustomBlockId),
Spacer(SpacerId),
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Ord, PartialOrd, Hash)]
pub struct SpacerId(pub usize);
impl From<SpacerId> for EntityId {
fn from(value: SpacerId) -> Self {
EntityId::from(value.0 as u64)
}
}
impl From<BlockId> for ElementId {
@@ -277,7 +267,6 @@ impl From<BlockId> for ElementId {
("ExcerptBoundary", EntityId::from(excerpt_id)).into()
}
BlockId::FoldedBuffer(id) => ("FoldedBuffer", EntityId::from(id)).into(),
BlockId::Spacer(id) => ("Spacer", EntityId::from(id)).into(),
}
}
}
@@ -288,7 +277,6 @@ impl std::fmt::Display for BlockId {
Self::Custom(id) => write!(f, "Block({id:?})"),
Self::ExcerptBoundary(id) => write!(f, "ExcerptHeader({id:?})"),
Self::FoldedBuffer(id) => write!(f, "FoldedBuffer({id:?})"),
Self::Spacer(id) => write!(f, "Spacer({id:?})"),
}
}
}
@@ -314,10 +302,6 @@ pub enum Block {
excerpt: ExcerptInfo,
height: u32,
},
Spacer {
height: u32,
id: SpacerId,
},
}
impl Block {
@@ -333,7 +317,6 @@ impl Block {
excerpt: next_excerpt,
..
} => BlockId::ExcerptBoundary(next_excerpt.id),
Block::Spacer { id, .. } => BlockId::Spacer(*id),
}
}
@@ -342,8 +325,7 @@ impl Block {
Block::Custom(block) => block.height.is_some(),
Block::ExcerptBoundary { .. }
| Block::FoldedBuffer { .. }
| Block::BufferHeader { .. }
| Block::Spacer { .. } => true,
| Block::BufferHeader { .. } => true,
}
}
@@ -352,8 +334,7 @@ impl Block {
Block::Custom(block) => block.height.unwrap_or(0),
Block::ExcerptBoundary { height, .. }
| Block::FoldedBuffer { height, .. }
| Block::BufferHeader { height, .. }
| Block::Spacer { height, .. } => *height,
| Block::BufferHeader { height, .. } => *height,
}
}
@@ -362,8 +343,7 @@ impl Block {
Block::Custom(block) => block.style,
Block::ExcerptBoundary { .. }
| Block::FoldedBuffer { .. }
| Block::BufferHeader { .. }
| Block::Spacer { .. } => BlockStyle::Sticky,
| Block::BufferHeader { .. } => BlockStyle::Sticky,
}
}
@@ -373,7 +353,6 @@ impl Block {
Block::FoldedBuffer { .. } => false,
Block::ExcerptBoundary { .. } => true,
Block::BufferHeader { .. } => true,
Block::Spacer { .. } => false,
}
}
@@ -383,7 +362,6 @@ impl Block {
Block::FoldedBuffer { .. } => false,
Block::ExcerptBoundary { .. } => false,
Block::BufferHeader { .. } => false,
Block::Spacer { .. } => false,
}
}
@@ -396,7 +374,6 @@ impl Block {
Block::FoldedBuffer { .. } => false,
Block::ExcerptBoundary { .. } => false,
Block::BufferHeader { .. } => false,
Block::Spacer { .. } => false,
}
}
@@ -406,7 +383,6 @@ impl Block {
Block::FoldedBuffer { .. } => true,
Block::ExcerptBoundary { .. } => false,
Block::BufferHeader { .. } => false,
Block::Spacer { .. } => true,
}
}
@@ -416,7 +392,6 @@ impl Block {
Block::FoldedBuffer { .. } => true,
Block::ExcerptBoundary { .. } => true,
Block::BufferHeader { .. } => true,
Block::Spacer { .. } => false,
}
}
@@ -426,7 +401,6 @@ impl Block {
Block::FoldedBuffer { .. } => true,
Block::ExcerptBoundary { .. } => false,
Block::BufferHeader { .. } => true,
Block::Spacer { .. } => false,
}
}
}
@@ -453,11 +427,6 @@ impl Debug for Block {
.field("excerpt", excerpt)
.field("height", height)
.finish(),
Self::Spacer { height, id } => f
.debug_struct("Spacer")
.field("height", height)
.field("id", id)
.finish(),
}
}
}
@@ -587,7 +556,6 @@ impl BlockMap {
{
// Preserve the transform (push and next)
new_transforms.push(transform.clone(), ());
// ^ FIXME wrong?
cursor.next();
// Preserve below blocks at end of edit
@@ -777,11 +745,10 @@ impl BlockMap {
}
new_transforms.append(cursor.suffix(), ());
// FIXME
// debug_assert_eq!(
// new_transforms.summary().input_rows,
// wrap_snapshot.max_point().row() + 1
// );
debug_assert_eq!(
new_transforms.summary().input_rows,
wrap_snapshot.max_point().row() + 1
);
drop(cursor);
*transforms = new_transforms;
@@ -1462,42 +1429,31 @@ impl BlockSnapshot {
pub fn block_for_id(&self, block_id: BlockId) -> Option<Block> {
let buffer = self.wrap_snapshot.buffer_snapshot();
let wrap_row = match block_id {
let wrap_point = match block_id {
BlockId::Custom(custom_block_id) => {
let custom_block = self.custom_blocks_by_id.get(&custom_block_id)?;
return Some(Block::Custom(custom_block.clone()));
}
BlockId::ExcerptBoundary(next_excerpt_id) => {
let excerpt_range = buffer.range_for_excerpt(next_excerpt_id)?;
Some(
self.wrap_snapshot
.make_wrap_point(excerpt_range.start, Bias::Left)
.row(),
)
}
BlockId::FoldedBuffer(excerpt_id) => Some(
self.wrap_snapshot
.make_wrap_point(buffer.range_for_excerpt(excerpt_id)?.start, Bias::Left)
.row(),
),
BlockId::Spacer(_) => None,
.make_wrap_point(excerpt_range.start, Bias::Left)
}
BlockId::FoldedBuffer(excerpt_id) => self
.wrap_snapshot
.make_wrap_point(buffer.range_for_excerpt(excerpt_id)?.start, Bias::Left),
};
let wrap_row = WrapRow(wrap_point.row());
let mut cursor = self.transforms.cursor::<WrapRow>(());
if let Some(wrap_row) = wrap_row {
cursor.seek(&WrapRow(wrap_row), Bias::Left);
} else {
cursor.next();
}
cursor.seek(&wrap_row, Bias::Left);
while let Some(transform) = cursor.item() {
if let Some(block) = transform.block.as_ref() {
if block.id() == block_id {
return Some(block.clone());
}
} else if let Some(wrap_row) = wrap_row
&& *cursor.start() > WrapRow(wrap_row)
{
} else if *cursor.start() > wrap_row {
break;
}
@@ -2013,10 +1969,7 @@ fn offset_for_row(s: &str, target: u32) -> (u32, usize) {
mod tests {
use super::*;
use crate::{
display_map::{
filter_map::FilterMap, fold_map::FoldMap, inlay_map::InlayMap, tab_map::TabMap,
wrap_map::WrapMap,
},
display_map::{fold_map::FoldMap, inlay_map::InlayMap, tab_map::TabMap, wrap_map::WrapMap},
test::test_font,
};
use gpui::{App, AppContext as _, Element, div, font, px};
@@ -2051,8 +2004,7 @@ mod tests {
let buffer = cx.update(|cx| MultiBuffer::build_simple(text, cx));
let buffer_snapshot = cx.update(|cx| buffer.read(cx).snapshot(cx));
let subscription = buffer.update(cx, |buffer, _| buffer.subscribe());
let (mut filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (mut inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (mut fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (mut tab_map, tab_snapshot) = TabMap::new(fold_snapshot, 1.try_into().unwrap());
let (wrap_map, wraps_snapshot) =
@@ -2201,9 +2153,8 @@ mod tests {
buffer.snapshot(cx)
});
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot, subscription.consume().into_inner());
let (inlay_snapshot, inlay_edits) = inlay_map.sync(filter_snapshot, filter_edits);
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot, subscription.consume().into_inner());
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tab_snapshot, tab_edits) =
tab_map.sync(fold_snapshot, fold_edits, 4.try_into().unwrap());
@@ -2257,14 +2208,13 @@ mod tests {
}
let multi_buffer_snapshot = multi_buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, multi_buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(multi_buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, wrap_snapshot) = WrapMap::new(tab_snapshot, font, font_size, Some(wrap_width), cx);
let (_, wraps_snapshot) = WrapMap::new(tab_snapshot, font, font_size, Some(wrap_width), cx);
let block_map = BlockMap::new(wrap_snapshot.clone(), 1, 1);
let snapshot = block_map.read(wrap_snapshot.clone(), Default::default());
let block_map = BlockMap::new(wraps_snapshot.clone(), 1, 1);
let snapshot = block_map.read(wraps_snapshot, Default::default());
// Each excerpt has a header above and footer below. Excerpts are also *separated* by a newline.
assert_eq!(snapshot.text(), "\nBuff\ner 1\n\nBuff\ner 2\n\nBuff\ner 3");
@@ -2292,8 +2242,7 @@ mod tests {
let buffer = cx.update(|cx| MultiBuffer::build_simple(text, cx));
let buffer_snapshot = cx.update(|cx| buffer.read(cx).snapshot(cx));
let _subscription = buffer.update(cx, |buffer, _| buffer.subscribe());
let (_filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (_inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_tab_map, tab_snapshot) = TabMap::new(fold_snapshot, 1.try_into().unwrap());
let (_wrap_map, wraps_snapshot) =
@@ -2392,8 +2341,7 @@ mod tests {
let buffer = cx.update(|cx| MultiBuffer::build_simple(text, cx));
let buffer_snapshot = cx.update(|cx| buffer.read(cx).snapshot(cx));
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, wraps_snapshot) = cx.update(|cx| {
@@ -2437,8 +2385,7 @@ mod tests {
let buffer = cx.update(|cx| MultiBuffer::build_simple(text, cx));
let buffer_subscription = buffer.update(cx, |buffer, _cx| buffer.subscribe());
let buffer_snapshot = cx.update(|cx| buffer.read(cx).snapshot(cx));
let (mut filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (mut inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (mut fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot);
let tab_size = 1.try_into().unwrap();
let (mut tab_map, tab_snapshot) = TabMap::new(fold_snapshot, tab_size);
@@ -2465,9 +2412,8 @@ mod tests {
buffer.edit([(Point::new(2, 0)..Point::new(3, 0), "")], None, cx);
buffer.snapshot(cx)
});
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot, buffer_subscription.consume().into_inner());
let (inlay_snapshot, inlay_edits) = inlay_map.sync(filter_snapshot, filter_edits);
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot, buffer_subscription.consume().into_inner());
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tab_snapshot, tab_edits) = tab_map.sync(fold_snapshot, fold_edits, tab_size);
let (wraps_snapshot, wrap_edits) = wrap_map.update(cx, |wrap_map, cx| {
@@ -2487,11 +2433,10 @@ mod tests {
);
buffer.snapshot(cx)
});
let (filter_snapshot, filter_edits) = filter_map.sync(
let (inlay_snapshot, inlay_edits) = inlay_map.sync(
buffer_snapshot.clone(),
buffer_subscription.consume().into_inner(),
);
let (inlay_snapshot, inlay_edits) = inlay_map.sync(filter_snapshot, filter_edits);
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tab_snapshot, tab_edits) = tab_map.sync(fold_snapshot, fold_edits, tab_size);
let (wraps_snapshot, wrap_edits) = wrap_map.update(cx, |wrap_map, cx| {
@@ -2606,8 +2551,7 @@ mod tests {
let buffer_id_2 = buffer_ids[1];
let buffer_id_3 = buffer_ids[2];
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, wrap_snapshot) =
@@ -2952,8 +2896,7 @@ mod tests {
assert_eq!(buffer_ids.len(), 1);
let buffer_id = buffer_ids[0];
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, wrap_snapshot) =
@@ -3028,8 +2971,7 @@ mod tests {
};
let mut buffer_snapshot = cx.update(|cx| buffer.read(cx).snapshot(cx));
let (mut filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (mut inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (mut fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (mut tab_map, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let font = test_font();
@@ -3086,10 +3028,8 @@ mod tests {
})
.collect::<Vec<_>>();
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot.clone(), vec![]);
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(filter_snapshot, filter_edits);
inlay_map.sync(buffer_snapshot.clone(), vec![]);
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tab_snapshot, tab_edits) =
tab_map.sync(fold_snapshot, fold_edits, tab_size);
@@ -3126,9 +3066,8 @@ mod tests {
.map(|block| block.id)
.collect::<HashSet<_>>();
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot.clone(), vec![]);
let (inlay_snapshot, inlay_edits) = inlay_map.sync(filter_snapshot, vec![]);
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot.clone(), vec![]);
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tab_snapshot, tab_edits) =
tab_map.sync(fold_snapshot, fold_edits, tab_size);
@@ -3148,10 +3087,8 @@ mod tests {
log::info!("Noop fold/unfold operation on a singleton buffer");
continue;
}
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot.clone(), vec![]);
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(filter_snapshot.clone(), vec![]);
inlay_map.sync(buffer_snapshot.clone(), vec![]);
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tab_snapshot, tab_edits) =
tab_map.sync(fold_snapshot, fold_edits, tab_size);
@@ -3238,9 +3175,8 @@ mod tests {
}
}
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot.clone(), buffer_edits);
let (inlay_snapshot, inlay_edits) = inlay_map.sync(filter_snapshot, filter_edits);
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot.clone(), buffer_edits);
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tab_snapshot, tab_edits) = tab_map.sync(fold_snapshot, fold_edits, tab_size);
let (wraps_snapshot, wrap_edits) = wrap_map.update(cx, |wrap_map, cx| {
@@ -3616,8 +3552,7 @@ mod tests {
let text = "abc\ndef\nghi\njkl\nmno";
let buffer = cx.update(|cx| MultiBuffer::build_simple(text, cx));
let buffer_snapshot = cx.update(|cx| buffer.read(cx).snapshot(cx));
let (_filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (_inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_tab_map, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_wrap_map, wraps_snapshot) =
@@ -3667,8 +3602,7 @@ mod tests {
assert_eq!(buffer_ids.len(), 1);
let buffer_id = buffer_ids[0];
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, wrap_snapshot) =
@@ -3713,8 +3647,7 @@ mod tests {
assert_eq!(buffer_ids.len(), 1);
let buffer_id = buffer_ids[0];
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, wrap_snapshot) =

File diff suppressed because it is too large


@@ -145,7 +145,7 @@ impl FoldMapWriter<'_> {
let mut folds = Vec::new();
let snapshot = self.0.snapshot.inlay_snapshot.clone();
for (range, fold_text) in ranges.into_iter() {
let buffer = &snapshot.filter_snapshot.buffer_snapshot;
let buffer = &snapshot.buffer;
let range = range.start.to_offset(buffer)..range.end.to_offset(buffer);
// Ignore any empty ranges.
@@ -159,7 +159,6 @@ impl FoldMapWriter<'_> {
if fold_range.0.start.excerpt_id != fold_range.0.end.excerpt_id {
continue;
}
// FIXME we should ignore any folds that start or end in a filtered region
folds.push(Fold {
id: FoldId(post_inc(&mut self.0.next_fold_id.0)),
@@ -167,17 +166,15 @@ impl FoldMapWriter<'_> {
placeholder: fold_text,
});
let filter_range = snapshot.filter_snapshot.to_filter_offset(range.start)
..snapshot.filter_snapshot.to_filter_offset(range.end);
let inlay_range = snapshot.to_inlay_offset(filter_range.start)
..snapshot.to_inlay_offset(filter_range.end);
let inlay_range =
snapshot.to_inlay_offset(range.start)..snapshot.to_inlay_offset(range.end);
edits.push(InlayEdit {
old: inlay_range.clone(),
new: inlay_range,
});
}
let buffer = &snapshot.filter_snapshot.buffer_snapshot;
let buffer = &snapshot.buffer;
folds.sort_unstable_by(|a, b| sum_tree::SeekTarget::cmp(&a.range, &b.range, buffer));
self.0.snapshot.folds = {
@@ -236,7 +233,7 @@ impl FoldMapWriter<'_> {
let mut edits = Vec::new();
let mut fold_ixs_to_delete = Vec::new();
let snapshot = self.0.snapshot.inlay_snapshot.clone();
let buffer = &snapshot.filter_snapshot.buffer_snapshot;
let buffer = &snapshot.buffer;
for range in ranges.into_iter() {
let range = range.start.to_offset(buffer)..range.end.to_offset(buffer);
let mut folds_cursor =
@@ -244,14 +241,10 @@ impl FoldMapWriter<'_> {
while let Some(fold) = folds_cursor.item() {
let offset_range =
fold.range.start.to_offset(buffer)..fold.range.end.to_offset(buffer);
let filter_range = snapshot
.filter_snapshot
.to_filter_offset(offset_range.start)
..snapshot.filter_snapshot.to_filter_offset(offset_range.end);
if should_unfold(fold) {
if offset_range.end > offset_range.start {
let inlay_range = snapshot.to_inlay_offset(filter_range.start)
..snapshot.to_inlay_offset(filter_range.end);
let inlay_range = snapshot.to_inlay_offset(offset_range.start)
..snapshot.to_inlay_offset(offset_range.end);
edits.push(InlayEdit {
old: inlay_range.clone(),
new: inlay_range,
@@ -289,7 +282,7 @@ impl FoldMapWriter<'_> {
) -> (FoldSnapshot, Vec<FoldEdit>) {
let mut edits = Vec::new();
let inlay_snapshot = self.0.snapshot.inlay_snapshot.clone();
let buffer = &inlay_snapshot.filter_snapshot.buffer_snapshot;
let buffer = &inlay_snapshot.buffer;
for (id, new_width) in new_widths {
let ChunkRendererId::Fold(id) = id else {
@@ -300,12 +293,8 @@ impl FoldMapWriter<'_> {
{
let buffer_start = metadata.range.start.to_offset(buffer);
let buffer_end = metadata.range.end.to_offset(buffer);
let filter_range = inlay_snapshot
.filter_snapshot
.to_filter_offset(buffer_start)
..inlay_snapshot.filter_snapshot.to_filter_offset(buffer_end);
let inlay_range = inlay_snapshot.to_inlay_offset(filter_range.start)
..inlay_snapshot.to_inlay_offset(filter_range.end);
let inlay_range = inlay_snapshot.to_inlay_offset(buffer_start)
..inlay_snapshot.to_inlay_offset(buffer_end);
edits.push(InlayEdit {
old: inlay_range.clone(),
new: inlay_range.clone(),
@@ -339,7 +328,7 @@ impl FoldMap {
pub fn new(inlay_snapshot: InlaySnapshot) -> (Self, FoldSnapshot) {
let this = Self {
snapshot: FoldSnapshot {
folds: SumTree::new(&inlay_snapshot.filter_snapshot.buffer_snapshot),
folds: SumTree::new(&inlay_snapshot.buffer),
transforms: SumTree::from_item(
Transform {
summary: TransformSummary {
@@ -472,41 +461,25 @@ impl FoldMap {
edit.new.end =
InlayOffset(((edit.new.start + edit.old_len()).0 as isize + delta) as usize);
let filter_offset = inlay_snapshot.to_filter_offset(edit.new.start);
let anchor = inlay_snapshot
.filter_snapshot
.buffer_snapshot
.anchor_before(
inlay_snapshot
.filter_snapshot
.to_buffer_offset(filter_offset, Bias::Left),
);
.buffer
.anchor_before(inlay_snapshot.to_buffer_offset(edit.new.start));
let mut folds_cursor = self
.snapshot
.folds
.cursor::<FoldRange>(&inlay_snapshot.filter_snapshot.buffer_snapshot);
.cursor::<FoldRange>(&inlay_snapshot.buffer);
folds_cursor.seek(&FoldRange(anchor..Anchor::max()), Bias::Left);
let mut folds = iter::from_fn({
let inlay_snapshot = &inlay_snapshot;
move || {
let item = folds_cursor.item().map(|fold| {
let buffer_start = fold
.range
.start
.to_offset(&inlay_snapshot.filter_snapshot.buffer_snapshot);
let buffer_end = fold
.range
.end
.to_offset(&inlay_snapshot.filter_snapshot.buffer_snapshot);
let filter_range = inlay_snapshot
.filter_snapshot
.to_filter_offset(buffer_start)
..inlay_snapshot.filter_snapshot.to_filter_offset(buffer_end);
let buffer_start = fold.range.start.to_offset(&inlay_snapshot.buffer);
let buffer_end = fold.range.end.to_offset(&inlay_snapshot.buffer);
(
fold.clone(),
inlay_snapshot.to_inlay_offset(filter_range.start)
..inlay_snapshot.to_inlay_offset(filter_range.end),
inlay_snapshot.to_inlay_offset(buffer_start)
..inlay_snapshot.to_inlay_offset(buffer_end),
)
});
folds_cursor.next();
@@ -659,7 +632,7 @@ pub struct FoldSnapshot {
impl FoldSnapshot {
pub fn buffer(&self) -> &MultiBufferSnapshot {
&self.inlay_snapshot.filter_snapshot.buffer_snapshot
&self.inlay_snapshot.buffer
}
fn fold_width(&self, fold_id: &FoldId) -> Option<Pixels> {
@@ -675,9 +648,7 @@ impl FoldSnapshot {
#[cfg(test)]
pub fn fold_count(&self) -> usize {
self.folds
.items(&self.inlay_snapshot.filter_snapshot.buffer_snapshot)
.len()
self.folds.items(&self.inlay_snapshot.buffer).len()
}
pub fn text_summary_for_range(&self, range: Range<FoldPoint>) -> TextSummary {
@@ -798,7 +769,7 @@ impl FoldSnapshot {
where
T: ToOffset,
{
let buffer = &self.inlay_snapshot.filter_snapshot.buffer_snapshot;
let buffer = &self.inlay_snapshot.buffer;
let range = range.start.to_offset(buffer)..range.end.to_offset(buffer);
let mut folds = intersecting_folds(&self.inlay_snapshot, &self.folds, range, false);
iter::from_fn(move || {
@@ -812,12 +783,8 @@ impl FoldSnapshot {
where
T: ToOffset,
{
let buffer_offset = offset.to_offset(&self.inlay_snapshot.filter_snapshot.buffer_snapshot);
let filter_offset = self
.inlay_snapshot
.filter_snapshot
.to_filter_offset(buffer_offset);
let inlay_offset = self.inlay_snapshot.to_inlay_offset(filter_offset);
let buffer_offset = offset.to_offset(&self.inlay_snapshot.buffer);
let inlay_offset = self.inlay_snapshot.to_inlay_offset(buffer_offset);
let (_, _, item) = self
.transforms
.find::<InlayOffset, _>((), &inlay_offset, Bias::Right);
@@ -825,22 +792,15 @@ impl FoldSnapshot {
}
pub fn is_line_folded(&self, buffer_row: MultiBufferRow) -> bool {
let filter_point = self
let mut inlay_point = self
.inlay_snapshot
.filter_snapshot
.to_filter_point(Point::new(buffer_row.0, 0));
let mut inlay_point = self.inlay_snapshot.to_inlay_point(filter_point);
.to_inlay_point(Point::new(buffer_row.0, 0));
let mut cursor = self.transforms.cursor::<InlayPoint>(());
cursor.seek(&inlay_point, Bias::Right);
loop {
match cursor.item() {
Some(transform) => {
let filter_point = self.inlay_snapshot.to_filter_point(inlay_point);
// FIXME unclear what bias to use here
let buffer_point = self
.inlay_snapshot
.filter_snapshot
.to_buffer_point(filter_point, Bias::Left);
let buffer_point = self.inlay_snapshot.to_buffer_point(inlay_point);
if buffer_point.row != buffer_row.0 {
return false;
} else if transform.placeholder.is_some() {
@@ -985,7 +945,7 @@ fn intersecting_folds<'a>(
range: Range<usize>,
inclusive: bool,
) -> FilterCursor<'a, 'a, impl 'a + FnMut(&FoldSummary) -> bool, Fold, usize> {
let buffer = &inlay_snapshot.filter_snapshot.buffer_snapshot;
let buffer = &inlay_snapshot.buffer;
let start = buffer.anchor_before(range.start.to_offset(buffer));
let end = buffer.anchor_after(range.end.to_offset(buffer));
let mut cursor = folds.filter::<_, usize>(buffer, move |summary| {
@@ -1599,7 +1559,6 @@ pub type FoldEdit = Edit<FoldOffset>;
#[cfg(test)]
mod tests {
use super::*;
use crate::display_map::filter_map::FilterMap;
use crate::{MultiBuffer, ToPoint, display_map::inlay_map::InlayMap};
use Bias::{Left, Right};
use collections::HashSet;
@@ -1616,8 +1575,7 @@ mod tests {
let buffer = MultiBuffer::build_simple(&sample_text(5, 6, 'a'), cx);
let subscription = buffer.update(cx, |buffer, _| buffer.subscribe());
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (mut filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let mut map = FoldMap::new(inlay_snapshot.clone()).0;
let (mut writer, _, _) = map.write(inlay_snapshot, vec![]);
@@ -1652,9 +1610,8 @@ mod tests {
buffer.snapshot(cx)
});
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot, subscription.consume().into_inner());
let (inlay_snapshot, inlay_edits) = inlay_map.sync(filter_snapshot, filter_edits);
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot, subscription.consume().into_inner());
let (snapshot3, edits) = map.read(inlay_snapshot, inlay_edits);
assert_eq!(snapshot3.text(), "123a⋯c123c⋯eeeee");
assert_eq!(
@@ -1675,9 +1632,8 @@ mod tests {
buffer.edit([(Point::new(2, 6)..Point::new(4, 3), "456")], None, cx);
buffer.snapshot(cx)
});
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot, subscription.consume().into_inner());
let (inlay_snapshot, inlay_edits) = inlay_map.sync(filter_snapshot, filter_edits);
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot, subscription.consume().into_inner());
let (snapshot4, _) = map.read(inlay_snapshot.clone(), inlay_edits);
assert_eq!(snapshot4.text(), "123a⋯c123456eee");
@@ -1698,8 +1654,7 @@ mod tests {
let buffer = MultiBuffer::build_simple("abcdefghijkl", cx);
let subscription = buffer.update(cx, |buffer, _| buffer.subscribe());
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (mut filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot);
{
let mut map = FoldMap::new(inlay_snapshot.clone()).0;
@@ -1745,9 +1700,8 @@ mod tests {
buffer.edit([(0..1, "12345")], None, cx);
buffer.snapshot(cx)
});
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot, subscription.consume().into_inner());
let (inlay_snapshot, inlay_edits) = inlay_map.sync(filter_snapshot, filter_edits);
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot, subscription.consume().into_inner());
let (snapshot, _) = map.read(inlay_snapshot, inlay_edits);
assert_eq!(snapshot.text(), "12345⋯fghijkl");
}
@@ -1757,8 +1711,7 @@ mod tests {
fn test_overlapping_folds(cx: &mut gpui::App) {
let buffer = MultiBuffer::build_simple(&sample_text(5, 6, 'a'), cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let mut map = FoldMap::new(inlay_snapshot.clone()).0;
let (mut writer, _, _) = map.write(inlay_snapshot.clone(), vec![]);
writer.fold(vec![
@@ -1777,8 +1730,7 @@ mod tests {
let buffer = MultiBuffer::build_simple(&sample_text(5, 6, 'a'), cx);
let subscription = buffer.update(cx, |buffer, _| buffer.subscribe());
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (mut filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let mut map = FoldMap::new(inlay_snapshot.clone()).0;
let (mut writer, _, _) = map.write(inlay_snapshot.clone(), vec![]);
@@ -1793,9 +1745,8 @@ mod tests {
buffer.edit([(Point::new(2, 2)..Point::new(3, 1), "")], None, cx);
buffer.snapshot(cx)
});
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot, subscription.consume().into_inner());
let (inlay_snapshot, inlay_edits) = inlay_map.sync(filter_snapshot, filter_edits);
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot, subscription.consume().into_inner());
let (snapshot, _) = map.read(inlay_snapshot, inlay_edits);
assert_eq!(snapshot.text(), "aa⋯eeeee");
}
@@ -1804,8 +1755,7 @@ mod tests {
fn test_folds_in_range(cx: &mut gpui::App) {
let buffer = MultiBuffer::build_simple(&sample_text(5, 6, 'a'), cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let mut map = FoldMap::new(inlay_snapshot.clone()).0;
let (mut writer, _, _) = map.write(inlay_snapshot.clone(), vec![]);
@@ -1847,8 +1797,7 @@ mod tests {
MultiBuffer::build_random(&mut rng, cx)
};
let mut buffer_snapshot = buffer.read(cx).snapshot(cx);
let (mut filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (mut inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let mut map = FoldMap::new(inlay_snapshot.clone()).0;
let (mut initial_snapshot, _) = map.read(inlay_snapshot, vec![]);
@@ -1878,10 +1827,8 @@ mod tests {
}),
};
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot.clone(), buffer_edits);
let (inlay_snapshot, new_inlay_edits) =
inlay_map.sync(filter_snapshot.clone(), filter_edits);
inlay_map.sync(buffer_snapshot.clone(), buffer_edits);
log::info!("inlay text {:?}", inlay_snapshot.text());
let inlay_edits = Patch::new(inlay_edits)
@@ -1892,11 +1839,9 @@ mod tests {
let mut expected_text: String = inlay_snapshot.text().to_string();
for fold_range in map.merged_folds().into_iter().rev() {
let fold_filter_range = filter_snapshot.to_filter_offset(fold_range.start)
..filter_snapshot.to_filter_offset(fold_range.end);
let fold_inlay_range = inlay_snapshot.to_inlay_offset(fold_filter_range.start)
..inlay_snapshot.to_inlay_offset(fold_filter_range.end);
expected_text.replace_range(fold_inlay_range.start.0..fold_inlay_range.end.0, "");
let fold_inlay_start = inlay_snapshot.to_inlay_offset(fold_range.start);
let fold_inlay_end = inlay_snapshot.to_inlay_offset(fold_range.end);
expected_text.replace_range(fold_inlay_start.0..fold_inlay_end.0, "");
}
assert_eq!(snapshot.text(), expected_text);
@@ -1909,20 +1854,18 @@ mod tests {
let mut prev_row = 0;
let mut expected_buffer_rows = Vec::new();
for fold_range in map.merged_folds() {
let fold_filter_range = filter_snapshot.to_filter_offset(fold_range.start)
..filter_snapshot.to_filter_offset(fold_range.end);
let fold_row_range = inlay_snapshot
.to_point(inlay_snapshot.to_inlay_offset(fold_filter_range.start))
.row()
..inlay_snapshot
.to_point(inlay_snapshot.to_inlay_offset(fold_filter_range.end))
.row();
let fold_start = inlay_snapshot
.to_point(inlay_snapshot.to_inlay_offset(fold_range.start))
.row();
let fold_end = inlay_snapshot
.to_point(inlay_snapshot.to_inlay_offset(fold_range.end))
.row();
expected_buffer_rows.extend(
inlay_snapshot
.row_infos(prev_row)
.take((1 + fold_row_range.start - prev_row) as usize),
.take((1 + fold_start - prev_row) as usize),
);
prev_row = 1 + fold_row_range.end;
prev_row = 1 + fold_end;
}
expected_buffer_rows.extend(inlay_snapshot.row_infos(prev_row));
@@ -2120,8 +2063,7 @@ mod tests {
let buffer = MultiBuffer::build_simple(&text, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let mut map = FoldMap::new(inlay_snapshot.clone()).0;
let (mut writer, _, _) = map.write(inlay_snapshot.clone(), vec![]);
@@ -2163,8 +2105,7 @@ mod tests {
MultiBuffer::build_random(&mut rng, cx)
};
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (mut fold_map, _) = FoldMap::new(inlay_snapshot.clone());
// Perform random mutations
@@ -2248,7 +2189,7 @@ mod tests {
impl FoldMap {
fn merged_folds(&self) -> Vec<Range<usize>> {
let inlay_snapshot = self.snapshot.inlay_snapshot.clone();
let buffer = &inlay_snapshot.filter_snapshot.buffer_snapshot;
let buffer = &inlay_snapshot.buffer;
let mut folds = self.snapshot.folds.items(buffer);
// Ensure sorting doesn't change how folds get merged and displayed.
folds.sort_by(|a, b| a.range.cmp(&b.range, buffer));
@@ -2284,7 +2225,7 @@ mod tests {
match rng.random_range(0..=100) {
0..=39 if !self.snapshot.folds.is_empty() => {
let inlay_snapshot = self.snapshot.inlay_snapshot.clone();
let buffer = &inlay_snapshot.filter_snapshot.buffer_snapshot;
let buffer = &inlay_snapshot.buffer;
let mut to_unfold = Vec::new();
for _ in 0..rng.random_range(1..=3) {
let end = buffer.clip_offset(rng.random_range(0..=buffer.len()), Right);
@@ -2300,7 +2241,7 @@ mod tests {
}
_ => {
let inlay_snapshot = self.snapshot.inlay_snapshot.clone();
let buffer = &inlay_snapshot.filter_snapshot.buffer_snapshot;
let buffer = &inlay_snapshot.buffer;
let mut to_fold = Vec::new();
for _ in 0..rng.random_range(1..=2) {
let end = buffer.clip_offset(rng.random_range(0..=buffer.len()), Right);


@@ -1,13 +1,10 @@
use crate::{
ChunkRenderer, HighlightStyles,
display_map::filter_map::{
FilterChunks, FilterOffset, FilterPoint, FilterRow, FilterRows, FilterSnapshot,
},
inlays::{Inlay, InlayContent},
};
use collections::BTreeSet;
use language::{Chunk, Edit, Point, TextSummary};
use multi_buffer::{MultiBufferRow, MultiBufferRows, RowInfo, ToOffset};
use multi_buffer::{MultiBufferRow, MultiBufferRows, MultiBufferSnapshot, RowInfo, ToOffset};
use project::InlayId;
use std::{
cmp,
@@ -30,7 +27,7 @@ pub struct InlayMap {
#[derive(Clone)]
pub struct InlaySnapshot {
pub filter_snapshot: FilterSnapshot,
pub buffer: MultiBufferSnapshot,
transforms: SumTree<Transform>,
pub version: usize,
}
@@ -147,37 +144,37 @@ impl<'a> sum_tree::Dimension<'a, TransformSummary> for InlayPoint {
}
}
impl<'a> sum_tree::Dimension<'a, TransformSummary> for FilterOffset {
impl<'a> sum_tree::Dimension<'a, TransformSummary> for usize {
fn zero(_cx: ()) -> Self {
Default::default()
}
fn add_summary(&mut self, summary: &'a TransformSummary, _: ()) {
self.0 += &summary.input.len;
*self += &summary.input.len;
}
}
impl<'a> sum_tree::Dimension<'a, TransformSummary> for FilterPoint {
impl<'a> sum_tree::Dimension<'a, TransformSummary> for Point {
fn zero(_cx: ()) -> Self {
Default::default()
}
fn add_summary(&mut self, summary: &'a TransformSummary, _: ()) {
self.0 += &summary.input.lines;
*self += &summary.input.lines;
}
}
#[derive(Clone)]
pub struct InlayBufferRows<'a> {
transforms: Cursor<'a, 'static, Transform, Dimensions<InlayPoint, FilterPoint>>,
filter_rows: FilterRows<'a>,
transforms: Cursor<'a, 'static, Transform, Dimensions<InlayPoint, Point>>,
buffer_rows: MultiBufferRows<'a>,
inlay_row: u32,
max_filter_row: FilterRow,
max_buffer_row: MultiBufferRow,
}
pub struct InlayChunks<'a> {
transforms: Cursor<'a, 'static, Transform, Dimensions<InlayOffset, FilterOffset>>,
filter_chunks: FilterChunks<'a>,
transforms: Cursor<'a, 'static, Transform, Dimensions<InlayOffset, usize>>,
buffer_chunks: CustomHighlightsChunks<'a>,
buffer_chunk: Option<Chunk<'a>>,
inlay_chunks: Option<text::ChunkWithBitmaps<'a>>,
/// text, char bitmap, tabs bitmap
@@ -200,9 +197,9 @@ impl InlayChunks<'_> {
pub fn seek(&mut self, new_range: Range<InlayOffset>) {
self.transforms.seek(&new_range.start, Bias::Right);
let filter_range = self.snapshot.to_filter_offset(new_range.start)
..self.snapshot.to_filter_offset(new_range.end);
self.filter_chunks.seek(filter_range);
let buffer_range = self.snapshot.to_buffer_offset(new_range.start)
..self.snapshot.to_buffer_offset(new_range.end);
self.buffer_chunks.seek(buffer_range);
self.inlay_chunks = None;
self.buffer_chunk = None;
self.output_offset = new_range.start;
@@ -226,9 +223,9 @@ impl<'a> Iterator for InlayChunks<'a> {
Transform::Isomorphic(_) => {
let chunk = self
.buffer_chunk
.get_or_insert_with(|| self.filter_chunks.next().unwrap());
.get_or_insert_with(|| self.buffer_chunks.next().unwrap());
if chunk.text.is_empty() {
*chunk = self.filter_chunks.next().unwrap();
*chunk = self.buffer_chunks.next().unwrap();
}
let desired_bytes = self.transforms.end().0.0 - self.output_offset.0;
@@ -418,20 +415,20 @@ impl InlayBufferRows<'_> {
let inlay_point = InlayPoint::new(row, 0);
self.transforms.seek(&inlay_point, Bias::Left);
let mut filter_point = self.transforms.start().1;
let filter_row = if row == 0 {
let mut buffer_point = self.transforms.start().1;
let buffer_row = MultiBufferRow(if row == 0 {
0
} else {
match self.transforms.item() {
Some(Transform::Isomorphic(_)) => {
filter_point += FilterPoint(inlay_point.0 - self.transforms.start().0.0);
filter_point.0.row
buffer_point += inlay_point.0 - self.transforms.start().0.0;
buffer_point.row
}
_ => cmp::min(filter_point.0.row + 1, self.max_filter_row.0),
_ => cmp::min(buffer_point.row + 1, self.max_buffer_row.0),
}
};
});
self.inlay_row = inlay_point.row();
self.filter_rows.seek(FilterRow(filter_row));
self.buffer_rows.seek(buffer_row);
}
}
@@ -440,11 +437,11 @@ impl Iterator for InlayBufferRows<'_> {
fn next(&mut self) -> Option<Self::Item> {
let buffer_row = if self.inlay_row == 0 {
self.filter_rows.next().unwrap()
self.buffer_rows.next().unwrap()
} else {
match self.transforms.item()? {
Transform::Inlay(_) => Default::default(),
Transform::Isomorphic(_) => self.filter_rows.next().unwrap(),
Transform::Isomorphic(_) => self.buffer_rows.next().unwrap(),
}
};
@@ -467,14 +464,11 @@ impl InlayPoint {
}
impl InlayMap {
pub fn new(filter_snapshot: FilterSnapshot) -> (Self, InlaySnapshot) {
pub fn new(buffer: MultiBufferSnapshot) -> (Self, InlaySnapshot) {
let version = 0;
let snapshot = InlaySnapshot {
filter_snapshot: filter_snapshot.clone(),
transforms: SumTree::from_iter(
Some(Transform::Isomorphic(filter_snapshot.text_summary())),
(),
),
buffer: buffer.clone(),
transforms: SumTree::from_iter(Some(Transform::Isomorphic(buffer.text_summary())), ()),
version,
};
@@ -489,126 +483,136 @@ impl InlayMap {
pub fn sync(
&mut self,
filter_snapshot: FilterSnapshot,
filter_edits: Vec<text::Edit<FilterOffset>>,
buffer_snapshot: MultiBufferSnapshot,
mut buffer_edits: Vec<text::Edit<usize>>,
) -> (InlaySnapshot, Vec<InlayEdit>) {
let snapshot = &mut self.snapshot;
if filter_edits.is_empty() {
if snapshot.filter_snapshot.version != filter_snapshot.version {
if buffer_edits.is_empty()
&& snapshot.buffer.trailing_excerpt_update_count()
!= buffer_snapshot.trailing_excerpt_update_count()
{
buffer_edits.push(Edit {
old: snapshot.buffer.len()..snapshot.buffer.len(),
new: buffer_snapshot.len()..buffer_snapshot.len(),
});
}
if buffer_edits.is_empty() {
if snapshot.buffer.edit_count() != buffer_snapshot.edit_count()
|| snapshot.buffer.non_text_state_update_count()
!= buffer_snapshot.non_text_state_update_count()
|| snapshot.buffer.trailing_excerpt_update_count()
!= buffer_snapshot.trailing_excerpt_update_count()
{
snapshot.version += 1;
}
snapshot.filter_snapshot = filter_snapshot;
return (snapshot.clone(), Vec::new());
}
let mut inlay_edits = Patch::default();
let mut new_transforms = SumTree::default();
let mut cursor = snapshot
.transforms
.cursor::<Dimensions<FilterOffset, InlayOffset>>(());
let mut filter_edits_iter = filter_edits.iter().peekable();
while let Some(filter_edit) = filter_edits_iter.next() {
new_transforms.append(cursor.slice(&filter_edit.old.start, Bias::Left), ());
if let Some(Transform::Isomorphic(transform)) = cursor.item()
&& cursor.end().0 == filter_edit.old.start
{
push_isomorphic(&mut new_transforms, *transform);
cursor.next();
}
// Remove all the inlays and transforms contained by the edit.
let old_start =
cursor.start().1 + InlayOffset(filter_edit.old.start.0 - cursor.start().0.0);
cursor.seek(&filter_edit.old.end, Bias::Right);
let old_end =
cursor.start().1 + InlayOffset(filter_edit.old.end.0 - cursor.start().0.0);
// Push the unchanged prefix.
let prefix_start = new_transforms.summary().input.len;
let prefix_end = filter_edit.new.start;
push_isomorphic(
&mut new_transforms,
filter_snapshot.text_summary_for_range(FilterOffset(prefix_start)..prefix_end),
);
let new_start = InlayOffset(new_transforms.summary().output.len);
let start_ix = match self.inlays.binary_search_by(|probe| {
let buffer_position = probe.position.to_offset(&filter_snapshot.buffer_snapshot);
filter_snapshot
.to_filter_offset(buffer_position)
.cmp(&filter_edit.new.start)
.then(std::cmp::Ordering::Greater)
}) {
Ok(ix) | Err(ix) => ix,
};
for inlay in &self.inlays[start_ix..] {
if !inlay.position.is_valid(&filter_snapshot.buffer_snapshot) {
continue;
}
if filter_snapshot.is_anchor_filtered(inlay.position) {
continue;
}
let buffer_offset = inlay.position.to_offset(&filter_snapshot.buffer_snapshot);
let filter_offset = filter_snapshot.to_filter_offset(buffer_offset);
if filter_offset > filter_edit.new.end {
break;
snapshot.buffer = buffer_snapshot;
(snapshot.clone(), Vec::new())
} else {
let mut inlay_edits = Patch::default();
let mut new_transforms = SumTree::default();
let mut cursor = snapshot
.transforms
.cursor::<Dimensions<usize, InlayOffset>>(());
let mut buffer_edits_iter = buffer_edits.iter().peekable();
while let Some(buffer_edit) = buffer_edits_iter.next() {
new_transforms.append(cursor.slice(&buffer_edit.old.start, Bias::Left), ());
if let Some(Transform::Isomorphic(transform)) = cursor.item()
&& cursor.end().0 == buffer_edit.old.start
{
push_isomorphic(&mut new_transforms, *transform);
cursor.next();
}
// Remove all the inlays and transforms contained by the edit.
let old_start =
cursor.start().1 + InlayOffset(buffer_edit.old.start - cursor.start().0);
cursor.seek(&buffer_edit.old.end, Bias::Right);
let old_end =
cursor.start().1 + InlayOffset(buffer_edit.old.end - cursor.start().0);
// Push the unchanged prefix.
let prefix_start = new_transforms.summary().input.len;
let prefix_end = buffer_offset;
let prefix_end = buffer_edit.new.start;
push_isomorphic(
&mut new_transforms,
filter_snapshot.text_summary_for_range(
FilterOffset(prefix_start)..FilterOffset(prefix_end),
),
buffer_snapshot.text_summary_for_range(prefix_start..prefix_end),
);
let new_start = InlayOffset(new_transforms.summary().output.len);
new_transforms.push(Transform::Inlay(inlay.clone()), ());
}
let start_ix = match self.inlays.binary_search_by(|probe| {
probe
.position
.to_offset(&buffer_snapshot)
.cmp(&buffer_edit.new.start)
.then(std::cmp::Ordering::Greater)
}) {
Ok(ix) | Err(ix) => ix,
};
// Apply the rest of the edit.
let transform_start = new_transforms.summary().input.len;
push_isomorphic(
&mut new_transforms,
filter_snapshot
.text_summary_for_range(FilterOffset(transform_start)..filter_edit.new.end),
);
let new_end = InlayOffset(new_transforms.summary().output.len);
inlay_edits.push(Edit {
old: old_start..old_end,
new: new_start..new_end,
});
for inlay in &self.inlays[start_ix..] {
if !inlay.position.is_valid(&buffer_snapshot) {
continue;
}
let buffer_offset = inlay.position.to_offset(&buffer_snapshot);
if buffer_offset > buffer_edit.new.end {
break;
}
// If the next edit doesn't intersect the current isomorphic transform, then
// we can push its remainder.
if filter_edits_iter
.peek()
.is_none_or(|edit| edit.old.start >= cursor.end().0)
{
let transform_start = FilterOffset(new_transforms.summary().input.len);
let transform_end = filter_edit.new.end + (cursor.end().0 - filter_edit.old.end);
let prefix_start = new_transforms.summary().input.len;
let prefix_end = buffer_offset;
push_isomorphic(
&mut new_transforms,
buffer_snapshot.text_summary_for_range(prefix_start..prefix_end),
);
new_transforms.push(Transform::Inlay(inlay.clone()), ());
}
// Apply the rest of the edit.
let transform_start = new_transforms.summary().input.len;
push_isomorphic(
&mut new_transforms,
filter_snapshot.text_summary_for_range(transform_start..transform_end),
buffer_snapshot.text_summary_for_range(transform_start..buffer_edit.new.end),
);
cursor.next();
let new_end = InlayOffset(new_transforms.summary().output.len);
inlay_edits.push(Edit {
old: old_start..old_end,
new: new_start..new_end,
});
// If the next edit doesn't intersect the current isomorphic transform, then
// we can push its remainder.
if buffer_edits_iter
.peek()
.is_none_or(|edit| edit.old.start >= cursor.end().0)
{
let transform_start = new_transforms.summary().input.len;
let transform_end =
buffer_edit.new.end + (cursor.end().0 - buffer_edit.old.end);
push_isomorphic(
&mut new_transforms,
buffer_snapshot.text_summary_for_range(transform_start..transform_end),
);
cursor.next();
}
}
new_transforms.append(cursor.suffix(), ());
if new_transforms.is_empty() {
new_transforms.push(Transform::Isomorphic(Default::default()), ());
}
drop(cursor);
snapshot.transforms = new_transforms;
snapshot.version += 1;
snapshot.buffer = buffer_snapshot;
snapshot.check_invariants();
(snapshot.clone(), inlay_edits.into_inner())
}
new_transforms.append(cursor.suffix(), ());
if new_transforms.is_empty() {
new_transforms.push(Transform::Isomorphic(Default::default()), ());
}
drop(cursor);
snapshot.transforms = new_transforms;
snapshot.version += 1;
snapshot.filter_snapshot = filter_snapshot;
snapshot.check_invariants();
(snapshot.clone(), inlay_edits.into_inner())
}
pub fn splice(
@@ -622,11 +626,8 @@ impl InlayMap {
self.inlays.retain(|inlay| {
let retain = !to_remove.contains(&inlay.id);
if !retain {
let buffer_offset = inlay
.position
.to_offset(&snapshot.filter_snapshot.buffer_snapshot);
let filter_offset = snapshot.filter_snapshot.to_filter_offset(buffer_offset);
edits.insert(filter_offset);
let offset = inlay.position.to_offset(&snapshot.buffer);
edits.insert(offset);
}
retain
});
@@ -637,17 +638,11 @@ impl InlayMap {
continue;
}
let buffer_offset = inlay_to_insert
.position
.to_offset(&snapshot.filter_snapshot.buffer_snapshot);
let filter_offset = snapshot.filter_snapshot.to_filter_offset(buffer_offset);
let offset = inlay_to_insert.position.to_offset(&snapshot.buffer);
match self.inlays.binary_search_by(|probe| {
probe
.position
.cmp(
&inlay_to_insert.position,
&snapshot.filter_snapshot.buffer_snapshot,
)
.cmp(&inlay_to_insert.position, &snapshot.buffer)
.then(std::cmp::Ordering::Less)
}) {
Ok(ix) | Err(ix) => {
@@ -655,18 +650,18 @@ impl InlayMap {
}
}
edits.insert(filter_offset);
edits.insert(offset);
}
let filter_edits = edits
let buffer_edits = edits
.into_iter()
.map(|offset| Edit {
old: offset..offset,
new: offset..offset,
})
.collect();
let filter_snapshot = snapshot.filter_snapshot.clone();
let (snapshot, edits) = self.sync(filter_snapshot, filter_edits);
let buffer_snapshot = snapshot.buffer.clone();
let (snapshot, edits) = self.sync(buffer_snapshot, buffer_edits);
(snapshot, edits)
}
@@ -688,11 +683,7 @@ impl InlayMap {
let snapshot = &mut self.snapshot;
for i in 0..rng.random_range(1..=5) {
if self.inlays.is_empty() || rng.random() {
let position = snapshot
.filter_snapshot
.buffer_snapshot
.random_byte_range(0, rng)
.start;
let position = snapshot.buffer.random_byte_range(0, rng).start;
let bias = if rng.random() {
Bias::Left
} else {
@@ -711,19 +702,13 @@ impl InlayMap {
let next_inlay = if i % 2 == 0 {
Inlay::mock_hint(
post_inc(next_inlay_id),
snapshot
.filter_snapshot
.buffer_snapshot
.anchor_at(position, bias),
snapshot.buffer.anchor_at(position, bias),
&text,
)
} else {
Inlay::edit_prediction(
post_inc(next_inlay_id),
snapshot
.filter_snapshot
.buffer_snapshot
.anchor_at(position, bias),
snapshot.buffer.anchor_at(position, bias),
&text,
)
};
@@ -753,15 +738,15 @@ impl InlaySnapshot {
pub fn to_point(&self, offset: InlayOffset) -> InlayPoint {
let (start, _, item) = self
.transforms
.find::<Dimensions<InlayOffset, InlayPoint, FilterOffset>, _>((), &offset, Bias::Right);
.find::<Dimensions<InlayOffset, InlayPoint, usize>, _>((), &offset, Bias::Right);
let overshoot = offset.0 - start.0.0;
match item {
Some(Transform::Isomorphic(_)) => {
let filter_offset_start = start.2;
let filter_offset_end = filter_offset_start + FilterOffset(overshoot);
let filter_start = self.filter_snapshot.offset_to_point(filter_offset_start);
let filter_end = self.filter_snapshot.offset_to_point(filter_offset_end);
InlayPoint(start.1.0 + (filter_end - filter_start).0)
let buffer_offset_start = start.2;
let buffer_offset_end = buffer_offset_start + overshoot;
let buffer_start = self.buffer.offset_to_point(buffer_offset_start);
let buffer_end = self.buffer.offset_to_point(buffer_offset_end);
InlayPoint(start.1.0 + (buffer_end - buffer_start))
}
Some(Transform::Inlay(inlay)) => {
let overshoot = inlay.text().offset_to_point(overshoot);
@@ -782,15 +767,15 @@ impl InlaySnapshot {
pub fn to_offset(&self, point: InlayPoint) -> InlayOffset {
let (start, _, item) = self
.transforms
.find::<Dimensions<InlayPoint, InlayOffset, FilterPoint>, _>((), &point, Bias::Right);
.find::<Dimensions<InlayPoint, InlayOffset, Point>, _>((), &point, Bias::Right);
let overshoot = point.0 - start.0.0;
match item {
Some(Transform::Isomorphic(_)) => {
let filter_point_start = start.2;
let filter_point_end = filter_point_start + FilterPoint(overshoot);
let buffer_offset_start = self.filter_snapshot.point_to_offset(filter_point_start);
let buffer_offset_end = self.filter_snapshot.point_to_offset(filter_point_end);
InlayOffset(start.1.0 + (buffer_offset_end - buffer_offset_start).0)
let buffer_point_start = start.2;
let buffer_point_end = buffer_point_start + overshoot;
let buffer_offset_start = self.buffer.point_to_offset(buffer_point_start);
let buffer_offset_end = self.buffer.point_to_offset(buffer_point_end);
InlayOffset(start.1.0 + (buffer_offset_end - buffer_offset_start))
}
Some(Transform::Inlay(inlay)) => {
let overshoot = inlay.text().point_to_offset(overshoot);
@@ -799,39 +784,35 @@ impl InlaySnapshot {
None => self.len(),
}
}
pub(crate) fn to_filter_point(&self, point: InlayPoint) -> FilterPoint {
let (start, _, item) = self
.transforms
.find::<Dimensions<InlayPoint, FilterPoint>, _>((), &point, Bias::Right);
pub fn to_buffer_point(&self, point: InlayPoint) -> Point {
let (start, _, item) =
self.transforms
.find::<Dimensions<InlayPoint, Point>, _>((), &point, Bias::Right);
match item {
Some(Transform::Isomorphic(_)) => {
let overshoot = point.0 - start.0.0;
start.1 + FilterPoint(overshoot)
start.1 + overshoot
}
Some(Transform::Inlay(_)) => start.1,
None => self.filter_snapshot.max_point(),
None => self.buffer.max_point(),
}
}
pub(crate) fn to_filter_offset(&self, offset: InlayOffset) -> FilterOffset {
let (start, _, item) = self
.transforms
.find::<Dimensions<InlayOffset, FilterOffset>, _>((), &offset, Bias::Right);
pub fn to_buffer_offset(&self, offset: InlayOffset) -> usize {
let (start, _, item) =
self.transforms
.find::<Dimensions<InlayOffset, usize>, _>((), &offset, Bias::Right);
match item {
Some(Transform::Isomorphic(_)) => {
let overshoot = offset - start.0;
start.1 + FilterOffset(overshoot.0)
start.1 + overshoot.0
}
Some(Transform::Inlay(_)) => start.1,
None => self.filter_snapshot.len(),
None => self.buffer.len(),
}
}
pub fn to_inlay_offset(&self, offset: FilterOffset) -> InlayOffset {
let mut cursor = self
.transforms
.cursor::<Dimensions<FilterOffset, InlayOffset>>(());
pub fn to_inlay_offset(&self, offset: usize) -> InlayOffset {
let mut cursor = self.transforms.cursor::<Dimensions<usize, InlayOffset>>(());
cursor.seek(&offset, Bias::Left);
loop {
match cursor.item() {
@@ -847,7 +828,7 @@ impl InlaySnapshot {
return cursor.end().1;
} else {
let overshoot = offset - cursor.start().0;
return InlayOffset(cursor.start().1.0 + overshoot.0);
return InlayOffset(cursor.start().1.0 + overshoot);
}
}
Some(Transform::Inlay(inlay)) => {
@@ -863,11 +844,8 @@ impl InlaySnapshot {
}
}
}
pub fn to_inlay_point(&self, point: FilterPoint) -> InlayPoint {
let mut cursor = self
.transforms
.cursor::<Dimensions<FilterPoint, InlayPoint>>(());
pub fn to_inlay_point(&self, point: Point) -> InlayPoint {
let mut cursor = self.transforms.cursor::<Dimensions<Point, InlayPoint>>(());
cursor.seek(&point, Bias::Left);
loop {
match cursor.item() {
@@ -883,7 +861,7 @@ impl InlaySnapshot {
return cursor.end().1;
} else {
let overshoot = point - cursor.start().0;
return InlayPoint(cursor.start().1.0 + overshoot.0);
return InlayPoint(cursor.start().1.0 + overshoot);
}
}
Some(Transform::Inlay(inlay)) => {
@@ -901,9 +879,7 @@ impl InlaySnapshot {
}
pub fn clip_point(&self, mut point: InlayPoint, mut bias: Bias) -> InlayPoint {
let mut cursor = self
.transforms
.cursor::<Dimensions<InlayPoint, FilterPoint>>(());
let mut cursor = self.transforms.cursor::<Dimensions<InlayPoint, Point>>(());
cursor.seek(&point, Bias::Left);
loop {
match cursor.item() {
@@ -939,11 +915,10 @@ impl InlaySnapshot {
}
} else {
let overshoot = point.0 - cursor.start().0.0;
let filter_point = cursor.start().1 + FilterPoint(overshoot);
let clipped_filter_point =
self.filter_snapshot.clip_point(filter_point, bias);
let clipped_overshoot = clipped_filter_point - cursor.start().1;
let clipped_point = InlayPoint(cursor.start().0.0 + clipped_overshoot.0);
let buffer_point = cursor.start().1 + overshoot;
let clipped_buffer_point = self.buffer.clip_point(buffer_point, bias);
let clipped_overshoot = clipped_buffer_point - cursor.start().1;
let clipped_point = InlayPoint(cursor.start().0.0 + clipped_overshoot);
if clipped_point == point {
return clipped_point;
} else {
@@ -1001,21 +976,17 @@ impl InlaySnapshot {
pub fn text_summary_for_range(&self, range: Range<InlayOffset>) -> TextSummary {
let mut summary = TextSummary::default();
let mut cursor = self
.transforms
.cursor::<Dimensions<InlayOffset, FilterOffset>>(());
let mut cursor = self.transforms.cursor::<Dimensions<InlayOffset, usize>>(());
cursor.seek(&range.start, Bias::Right);
let overshoot = range.start.0 - cursor.start().0.0;
match cursor.item() {
Some(Transform::Isomorphic(_)) => {
let filter_start = cursor.start().1;
let suffix_start = filter_start + FilterOffset(overshoot);
let suffix_end = filter_start
+ FilterOffset(cmp::min(cursor.end().0, range.end).0 - cursor.start().0.0);
summary = self
.filter_snapshot
.text_summary_for_range(suffix_start..suffix_end);
let buffer_start = cursor.start().1;
let suffix_start = buffer_start + overshoot;
let suffix_end =
buffer_start + (cmp::min(cursor.end().0, range.end).0 - cursor.start().0.0);
summary = self.buffer.text_summary_for_range(suffix_start..suffix_end);
cursor.next();
}
Some(Transform::Inlay(inlay)) => {
@@ -1036,10 +1007,10 @@ impl InlaySnapshot {
match cursor.item() {
Some(Transform::Isomorphic(_)) => {
let prefix_start = cursor.start().1;
let prefix_end = prefix_start + FilterOffset(overshoot);
let prefix_end = prefix_start + overshoot;
summary += self
.filter_snapshot
.text_summary_for_range(prefix_start..prefix_end);
.buffer
.text_summary_for_range::<TextSummary, _>(prefix_start..prefix_end);
}
Some(Transform::Inlay(inlay)) => {
let prefix_end = overshoot;
@@ -1053,31 +1024,29 @@ impl InlaySnapshot {
}
pub fn row_infos(&self, row: u32) -> InlayBufferRows<'_> {
let mut cursor = self
.transforms
.cursor::<Dimensions<InlayPoint, FilterPoint>>(());
let mut cursor = self.transforms.cursor::<Dimensions<InlayPoint, Point>>(());
let inlay_point = InlayPoint::new(row, 0);
cursor.seek(&inlay_point, Bias::Left);
let max_filter_row = self.filter_snapshot.max_row();
let mut filter_point = cursor.start().1;
let filter_row = if row == 0 {
FilterRow(0)
let max_buffer_row = self.buffer.max_row();
let mut buffer_point = cursor.start().1;
let buffer_row = if row == 0 {
MultiBufferRow(0)
} else {
match cursor.item() {
Some(Transform::Isomorphic(_)) => {
filter_point.0 += inlay_point.0 - cursor.start().0.0;
FilterRow(filter_point.0.row)
buffer_point += inlay_point.0 - cursor.start().0.0;
MultiBufferRow(buffer_point.row)
}
_ => cmp::min(FilterRow(filter_point.0.row + 1), max_filter_row),
_ => cmp::min(MultiBufferRow(buffer_point.row + 1), max_buffer_row),
}
};
InlayBufferRows {
transforms: cursor,
inlay_row: inlay_point.row(),
filter_rows: self.filter_snapshot.row_infos(filter_row),
max_filter_row,
buffer_rows: self.buffer.row_infos(buffer_row),
max_buffer_row,
}
}
@@ -1097,19 +1066,20 @@ impl InlaySnapshot {
language_aware: bool,
highlights: Highlights<'a>,
) -> InlayChunks<'a> {
let mut cursor = self
.transforms
.cursor::<Dimensions<InlayOffset, FilterOffset>>(());
let mut cursor = self.transforms.cursor::<Dimensions<InlayOffset, usize>>(());
cursor.seek(&range.start, Bias::Right);
let filter_range = self.to_filter_offset(range.start)..self.to_filter_offset(range.end);
let filter_chunks = self
.filter_snapshot
.chunks(filter_range, language_aware, &highlights);
let buffer_range = self.to_buffer_offset(range.start)..self.to_buffer_offset(range.end);
let buffer_chunks = CustomHighlightsChunks::new(
buffer_range,
language_aware,
highlights.text_highlights,
&self.buffer,
);
InlayChunks {
transforms: cursor,
filter_chunks,
buffer_chunks,
inlay_chunks: None,
inlay_chunk: None,
buffer_chunk: None,
@@ -1131,10 +1101,7 @@ impl InlaySnapshot {
fn check_invariants(&self) {
#[cfg(any(debug_assertions, feature = "test-support"))]
{
assert_eq!(
self.transforms.summary().input,
self.filter_snapshot.text_summary()
);
assert_eq!(self.transforms.summary().input, self.buffer.text_summary());
let mut transforms = self.transforms.iter().peekable();
while let Some(transform) = transforms.next() {
let transform_is_isomorphic = matches!(transform, Transform::Isomorphic(_));
@@ -1201,7 +1168,7 @@ mod tests {
use super::*;
use crate::{
MultiBuffer,
display_map::{HighlightKey, InlayHighlights, TextHighlights, filter_map::FilterMap},
display_map::{HighlightKey, InlayHighlights, TextHighlights},
hover_links::InlayHighlight,
};
use gpui::{App, HighlightStyle};
@@ -1325,9 +1292,7 @@ mod tests {
fn test_basic_inlays(cx: &mut App) {
let buffer = MultiBuffer::build_simple("abcdefghi", cx);
let buffer_edits = buffer.update(cx, |buffer, _| buffer.subscribe());
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (mut filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer.read(cx).snapshot(cx));
assert_eq!(inlay_snapshot.text(), "abcdefghi");
let mut next_inlay_id = 0;
@@ -1341,27 +1306,27 @@ mod tests {
);
assert_eq!(inlay_snapshot.text(), "abc|123|defghi");
assert_eq!(
inlay_snapshot.to_inlay_point(FilterPoint(Point::new(0, 0))),
inlay_snapshot.to_inlay_point(Point::new(0, 0)),
InlayPoint::new(0, 0)
);
assert_eq!(
inlay_snapshot.to_inlay_point(FilterPoint(Point::new(0, 1))),
inlay_snapshot.to_inlay_point(Point::new(0, 1)),
InlayPoint::new(0, 1)
);
assert_eq!(
inlay_snapshot.to_inlay_point(FilterPoint(Point::new(0, 2))),
inlay_snapshot.to_inlay_point(Point::new(0, 2)),
InlayPoint::new(0, 2)
);
assert_eq!(
inlay_snapshot.to_inlay_point(FilterPoint(Point::new(0, 3))),
inlay_snapshot.to_inlay_point(Point::new(0, 3)),
InlayPoint::new(0, 3)
);
assert_eq!(
inlay_snapshot.to_inlay_point(FilterPoint(Point::new(0, 4))),
inlay_snapshot.to_inlay_point(Point::new(0, 4)),
InlayPoint::new(0, 9)
);
assert_eq!(
inlay_snapshot.to_inlay_point(FilterPoint(Point::new(0, 5))),
inlay_snapshot.to_inlay_point(Point::new(0, 5)),
InlayPoint::new(0, 10)
);
assert_eq!(
@@ -1393,20 +1358,18 @@ mod tests {
buffer.update(cx, |buffer, cx| {
buffer.edit([(2..3, "x"), (3..3, "y"), (4..4, "z")], None, cx)
});
let (filter_snapshot, filter_edits) = filter_map.sync(
let (inlay_snapshot, _) = inlay_map.sync(
buffer.read(cx).snapshot(cx),
buffer_edits.consume().into_inner(),
);
let (inlay_snapshot, _) = inlay_map.sync(filter_snapshot, filter_edits);
assert_eq!(inlay_snapshot.text(), "abxy|123|dzefghi");
// An edit surrounding the inlay should invalidate it.
buffer.update(cx, |buffer, cx| buffer.edit([(4..5, "D")], None, cx));
let (filter_snapshot, filter_edits) = filter_map.sync(
let (inlay_snapshot, _) = inlay_map.sync(
buffer.read(cx).snapshot(cx),
buffer_edits.consume().into_inner(),
);
let (inlay_snapshot, _) = inlay_map.sync(filter_snapshot, filter_edits);
assert_eq!(inlay_snapshot.text(), "abxyDzefghi");
let (inlay_snapshot, _) = inlay_map.splice(
@@ -1428,11 +1391,10 @@ mod tests {
// Edits ending where the inlay starts should not move it if it has a left bias.
buffer.update(cx, |buffer, cx| buffer.edit([(3..3, "JKL")], None, cx));
let (filter_snapshot, filter_edits) = filter_map.sync(
let (inlay_snapshot, _) = inlay_map.sync(
buffer.read(cx).snapshot(cx),
buffer_edits.consume().into_inner(),
);
let (inlay_snapshot, _) = inlay_map.sync(filter_snapshot, filter_edits);
assert_eq!(inlay_snapshot.text(), "abx|123|JKL|456|yDzefghi");
assert_eq!(
@@ -1621,8 +1583,7 @@ mod tests {
#[gpui::test]
fn test_inlay_buffer_rows(cx: &mut App) {
let buffer = MultiBuffer::build_simple("abc\ndef\nghi", cx);
let (mut filter_map, filter_snapshot) = FilterMap::new(None, buffer.read(cx).snapshot(cx));
let (mut inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer.read(cx).snapshot(cx));
assert_eq!(inlay_snapshot.text(), "abc\ndef\nghi");
let mut next_inlay_id = 0;
@@ -1676,8 +1637,7 @@ mod tests {
let mut buffer_snapshot = buffer.read(cx).snapshot(cx);
let mut next_inlay_id = 0;
log::info!("buffer text: {:?}", buffer_snapshot.text());
let (mut filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (mut inlay_map, mut inlay_snapshot) = InlayMap::new(filter_snapshot.clone());
let (mut inlay_map, mut inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
for _ in 0..operations {
let mut inlay_edits = Patch::default();
@@ -1700,10 +1660,8 @@ mod tests {
}),
};
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot.clone(), buffer_edits);
let (new_inlay_snapshot, new_inlay_edits) =
inlay_map.sync(filter_snapshot, filter_edits);
inlay_map.sync(buffer_snapshot.clone(), buffer_edits);
inlay_snapshot = new_inlay_snapshot;
inlay_edits = inlay_edits.compose(new_inlay_edits);
@@ -1851,15 +1809,14 @@ mod tests {
assert_eq!(expected_text.len(), inlay_snapshot.len().0);
let mut buffer_point = Point::default();
let mut inlay_point = inlay_snapshot.to_inlay_point(FilterPoint(buffer_point));
let mut inlay_point = inlay_snapshot.to_inlay_point(buffer_point);
let mut buffer_chars = buffer_snapshot.chars_at(0);
loop {
// Ensure conversion from buffer coordinates to inlay coordinates
// is consistent.
let buffer_offset = buffer_snapshot.point_to_offset(buffer_point);
assert_eq!(
inlay_snapshot
.to_point(inlay_snapshot.to_inlay_offset(FilterOffset(buffer_offset))),
inlay_snapshot.to_point(inlay_snapshot.to_inlay_offset(buffer_offset)),
inlay_point
);
@@ -1886,7 +1843,7 @@ mod tests {
}
// Ensure that moving forward in the buffer always moves the inlay point forward as well.
let new_inlay_point = inlay_snapshot.to_inlay_point(FilterPoint(buffer_point));
let new_inlay_point = inlay_snapshot.to_inlay_point(buffer_point);
assert!(new_inlay_point > inlay_point);
inlay_point = new_inlay_point;
} else {
@@ -1946,19 +1903,19 @@ mod tests {
// Ensure the clipped points are at valid buffer locations.
assert_eq!(
inlay_snapshot
.to_inlay_point(inlay_snapshot.to_filter_point(clipped_left_point)),
.to_inlay_point(inlay_snapshot.to_buffer_point(clipped_left_point)),
clipped_left_point,
"to_filter_point({:?}) = {:?}",
"to_buffer_point({:?}) = {:?}",
clipped_left_point,
inlay_snapshot.to_filter_point(clipped_left_point),
inlay_snapshot.to_buffer_point(clipped_left_point),
);
assert_eq!(
inlay_snapshot
.to_inlay_point(inlay_snapshot.to_filter_point(clipped_right_point)),
.to_inlay_point(inlay_snapshot.to_buffer_point(clipped_right_point)),
clipped_right_point,
"to_filter_point({:?}) = {:?}",
"to_buffer_point({:?}) = {:?}",
clipped_right_point,
inlay_snapshot.to_filter_point(clipped_right_point),
inlay_snapshot.to_buffer_point(clipped_right_point),
);
}
}
@@ -1981,8 +1938,7 @@ mod tests {
};
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (mut inlay_map, _) = InlayMap::new(filter_snapshot.clone());
let (mut inlay_map, _) = InlayMap::new(buffer_snapshot.clone());
// Perform random mutations to add inlays
let mut next_inlay_id = 0;
@@ -1991,7 +1947,7 @@ mod tests {
inlay_map.randomly_mutate(&mut next_inlay_id, &mut rng);
}
let (snapshot, _) = inlay_map.sync(filter_snapshot, vec![]);
let (snapshot, _) = inlay_map.sync(buffer_snapshot, vec![]);
// Get all chunks and verify their bitmaps
let chunks = snapshot.chunks(
@@ -2101,8 +2057,7 @@ mod tests {
//
// See https://github.com/zed-industries/zed/issues/33641
let buffer = MultiBuffer::build_simple("fn main() {}\n", cx);
let (_filter_map, filter_snapshot) = FilterMap::new(None, buffer.read(cx).snapshot(cx));
let (mut inlay_map, _) = InlayMap::new(filter_snapshot);
let (mut inlay_map, _) = InlayMap::new(buffer.read(cx).snapshot(cx));
// Create an inlay with text that contains a multi-byte character
// The string "SortingDirec…" contains an ellipsis character '…' which is 3 bytes (E2 80 A6)
@@ -2220,8 +2175,7 @@ mod tests {
for test_case in test_cases {
let buffer = MultiBuffer::build_simple("test", cx);
let (_filter_map, filter_snapshot) = FilterMap::new(None, buffer.read(cx).snapshot(cx));
let (mut inlay_map, _) = InlayMap::new(filter_snapshot);
let (mut inlay_map, _) = InlayMap::new(buffer.read(cx).snapshot(cx));
let position = buffer.read(cx).snapshot(cx).anchor_before(Point::new(0, 2));
let inlay = Inlay {
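The inlay_map changes above replace the removed FilterMap's `FilterOffset`/`FilterPoint` coordinates with plain buffer coordinates (`usize`/`Point`). The conversion the file implements can be modeled as bookkeeping over a list of transforms; the following is a minimal illustrative sketch under that reading, not Zed's actual SumTree-based implementation (bias handling at inlay boundaries is omitted):

```rust
// Simplified model of the inlay coordinate translation in the diff above:
// the document is an alternation of isomorphic runs (buffer text shown
// verbatim) and inlays (injected text with no buffer counterpart).

enum Transform {
    Isomorphic(usize),   // a run of `len` buffer bytes shown verbatim
    Inlay(&'static str), // injected text occupying output space only
}

/// Buffer offset -> offset in the text with inlays spliced in.
fn to_inlay_offset(transforms: &[Transform], buffer_offset: usize) -> usize {
    let mut buffer_pos = 0; // input coordinate consumed so far
    let mut inlay_pos = 0; // output coordinate produced so far
    for transform in transforms {
        match transform {
            Transform::Isomorphic(len) => {
                if buffer_offset < buffer_pos + len {
                    // Target falls inside this run: add the overshoot.
                    return inlay_pos + (buffer_offset - buffer_pos);
                }
                buffer_pos += len;
                inlay_pos += len;
            }
            // Inlays consume no input, only output.
            Transform::Inlay(text) => inlay_pos += text.len(),
        }
    }
    inlay_pos
}

fn main() {
    // "abc" + "|123|" inlay + "defghi", as in test_basic_inlays.
    let transforms = [
        Transform::Isomorphic(3),
        Transform::Inlay("|123|"),
        Transform::Isomorphic(6),
    ];
    assert_eq!(to_inlay_offset(&transforms, 2), 2);
    assert_eq!(to_inlay_offset(&transforms, 4), 9); // jumps past the inlay
}
```

The asserted values line up with the `to_inlay_point` expectations in `test_basic_inlays` above, where buffer column 4 maps to inlay column 9 once the 5-byte `|123|` inlay is spliced in at column 3.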
@@ -169,11 +169,7 @@ pub struct TabSnapshot {
impl TabSnapshot {
pub fn buffer_snapshot(&self) -> &MultiBufferSnapshot {
&self
.fold_snapshot
.inlay_snapshot
.filter_snapshot
.buffer_snapshot
&self.fold_snapshot.inlay_snapshot.buffer
}
pub fn line_len(&self, row: u32) -> u32 {
@@ -323,15 +319,7 @@ impl TabSnapshot {
}
pub fn make_tab_point(&self, point: Point, bias: Bias) -> TabPoint {
let filter_point = self
.fold_snapshot
.inlay_snapshot
.filter_snapshot
.to_filter_point(point);
let inlay_point = self
.fold_snapshot
.inlay_snapshot
.to_inlay_point(filter_point);
let inlay_point = self.fold_snapshot.inlay_snapshot.to_inlay_point(point);
let fold_point = self.fold_snapshot.to_fold_point(inlay_point, bias);
self.to_tab_point(fold_point)
}
@@ -339,14 +327,9 @@ impl TabSnapshot {
pub fn to_point(&self, point: TabPoint, bias: Bias) -> Point {
let fold_point = self.to_fold_point(point, bias).0;
let inlay_point = fold_point.to_inlay_point(&self.fold_snapshot);
let filter_point = self
.fold_snapshot
.inlay_snapshot
.to_filter_point(inlay_point);
self.fold_snapshot
.inlay_snapshot
.filter_snapshot
.to_buffer_point(filter_point, bias)
.to_buffer_point(inlay_point)
}
fn expand_tabs<'a, I>(&self, mut cursor: TabStopCursor<'a, I>, column: u32) -> u32
@@ -653,7 +636,6 @@ mod tests {
use crate::{
MultiBuffer,
display_map::{
filter_map::FilterMap,
fold_map::{FoldMap, FoldOffset},
inlay_map::InlayMap,
},
@@ -768,8 +750,7 @@ mod tests {
];
let buffer = MultiBuffer::build_simple("", cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
@@ -805,8 +786,7 @@ mod tests {
let buffer = MultiBuffer::build_simple(input, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
@@ -845,8 +825,7 @@ mod tests {
let text = "γ\tw⭐\n🍐🍗 \t";
let buffer = MultiBuffer::build_simple(text, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
@@ -885,8 +864,7 @@ mod tests {
let buffer = MultiBuffer::build_simple(&input, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, mut tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
tab_snapshot.max_expansion_column = rng.random_range(0..323);
@@ -931,8 +909,7 @@ mod tests {
let buffer = MultiBuffer::build_simple(input, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, mut tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
@@ -979,8 +956,7 @@ mod tests {
let buffer = MultiBuffer::build_simple(input, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, mut tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
@@ -994,8 +970,7 @@ mod tests {
let buffer = MultiBuffer::build_simple(input, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
@@ -1055,8 +1030,7 @@ mod tests {
let buffer_snapshot = buffer.read(cx).snapshot(cx);
log::info!("Buffer text: {:?}", buffer_snapshot.text());
let (filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot);
log::info!("InlayMap text: {:?}", inlay_snapshot.text());
let (mut fold_map, _) = FoldMap::new(inlay_snapshot.clone());
fold_map.randomly_mutate(&mut rng);
@@ -1132,8 +1106,7 @@ mod tests {
// Create buffer and tab map
let buffer = MultiBuffer::build_simple(&text, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (mut fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (mut tab_map, _) = TabMap::new(fold_snapshot, tab_size);
@@ -1172,8 +1145,7 @@ mod tests {
let text = "\tfoo\tbarbarbar\t\tbaz\n";
let buffer = MultiBuffer::build_simple(text, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let chunks = fold_snapshot.chunks(
FoldOffset(0)..fold_snapshot.len(),
@@ -1211,8 +1183,7 @@ mod tests {
let buffer = MultiBuffer::build_simple(input, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let chunks = fold_snapshot.chunks_at(FoldPoint::new(0, 0));
@@ -1270,8 +1241,7 @@ mod tests {
// Build the buffer and create cursor
let buffer = MultiBuffer::build_simple(&input, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
// First, collect all expected tab positions
@@ -1337,8 +1307,7 @@ mod tests {
let text = "\r\t😁foo\tb😀arbar🤯bar\t\tbaz\n";
let buffer = MultiBuffer::build_simple(text, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot);
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot);
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let chunks = fold_snapshot.chunks(
FoldOffset(0)..fold_snapshot.len(),
@@ -1382,8 +1351,7 @@ mod tests {
// Build the buffer and create cursor
let buffer = MultiBuffer::build_simple(&input, cx);
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (_, inlay_snapshot) = InlayMap::new(filter_snapshot);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
// First, collect all expected tab positions
@@ -1472,7 +1440,7 @@ where
self.current_chunk.as_ref().and_then(|(chunk, idx)| {
let mut idx = *idx;
let mut diff = 0;
while idx > 0 && chunk.chars & (1 << idx) == 0 {
while idx > 0 && chunk.chars & (1u128.unbounded_shl(idx)) == 0 {
idx -= 1;
diff += 1;
}
@@ -1492,7 +1460,7 @@ where
fn is_char_boundary(&self) -> bool {
self.current_chunk
.as_ref()
.is_some_and(|(chunk, idx)| (chunk.chars & (1 << *idx.min(&127))) != 0)
.is_some_and(|(chunk, idx)| (chunk.chars & 1u128.unbounded_shl(*idx)) != 0)
}
/// distance: length to move forward while searching for the next tab stop
@@ -1515,18 +1483,20 @@ where
self.byte_offset += overshoot;
self.char_offset += get_char_offset(
chunk_position..(chunk_position + overshoot).saturating_sub(1).min(127),
chunk_position..(chunk_position + overshoot).saturating_sub(1),
chunk.chars,
);
self.current_chunk = Some((chunk, chunk_position + overshoot));
if chunk_position + overshoot < 128 {
self.current_chunk = Some((chunk, chunk_position + overshoot));
}
return None;
}
self.byte_offset += chunk_distance;
self.char_offset += get_char_offset(
chunk_position..(chunk_position + chunk_distance).saturating_sub(1).min(127),
chunk_position..(chunk_position + chunk_distance).saturating_sub(1),
chunk.chars,
);
distance_traversed += chunk_distance;
@@ -1578,8 +1548,6 @@ where
#[inline(always)]
fn get_char_offset(range: Range<u32>, bit_map: u128) -> u32 {
// This edge case can happen when we're at chunk position 128
if range.start == range.end {
return if (1u128 << range.start) & bit_map == 0 {
0
@@ -1587,7 +1555,7 @@ fn get_char_offset(range: Range<u32>, bit_map: u128) -> u32 {
1
};
}
let end_shift: u128 = 127u128 - range.end.min(127) as u128;
let end_shift: u128 = 127u128 - range.end as u128;
let mut bit_mask = (u128::MAX >> range.start) << range.start;
bit_mask = (bit_mask << end_shift) >> end_shift;
let bit_map = bit_map & bit_mask;
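The hunks above swap `1 << idx` for `1u128.unbounded_shl(idx)` and drop the `.min(127)` clamps. The motivation, as a small sketch: a plain shift by 128 or more bits on a `u128` overflows (panicking in debug builds, wrapping the shift amount in release), while `unbounded_shl` (stable since Rust 1.87) is defined to return 0 for oversized shifts, so a cursor sitting exactly at chunk position 128 needs no clamp. `bit_is_set` here is an illustrative helper, not a function from the diff:

```rust
// Why `unbounded_shl` replaces `<<` in the TabStopCursor code above:
// shifting a u128 by >= 128 is an overflow with `<<`, but is defined to
// yield 0 with `unbounded_shl`, making the boundary check at chunk
// position 128 safe without a `.min(127)` clamp.

fn bit_is_set(bitmap: u128, idx: u32) -> bool {
    // Safe even when idx == 128 (one past the last valid bit):
    // the mask becomes 0 and the check is simply false.
    bitmap & 1u128.unbounded_shl(idx) != 0
}

fn main() {
    let bitmap: u128 = 0b1011;
    assert!(bit_is_set(bitmap, 0));
    assert!(bit_is_set(bitmap, 1));
    assert!(!bit_is_set(bitmap, 2));
    assert!(bit_is_set(bitmap, 3));
    // Oversized shift yields 0, so position 128 is never "a boundary":
    assert_eq!(1u128.unbounded_shl(128), 0);
    assert!(!bit_is_set(u128::MAX, 128));
}
```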
@@ -31,21 +31,18 @@ pub struct WrapMap {
#[derive(Clone)]
pub struct WrapSnapshot {
pub(super) tab_snapshot: TabSnapshot,
// FIXME
pub(crate) transforms: SumTree<Transform>,
transforms: SumTree<Transform>,
interpolated: bool,
}
// FIXME
#[derive(Clone, Debug, Default, Eq, PartialEq)]
pub(crate) struct Transform {
struct Transform {
summary: TransformSummary,
display_text: Option<&'static str>,
}
// FIXME
#[derive(Clone, Debug, Default, Eq, PartialEq)]
pub(crate) struct TransformSummary {
struct TransformSummary {
input: TextSummary,
output: TextSummary,
}
@@ -571,18 +568,15 @@ impl WrapSnapshot {
let mut old_start = old_cursor.start().output.lines;
old_start += tab_edit.old.start.0 - old_cursor.start().input.lines;
// todo(lw): Should these be seek_forward?
old_cursor.seek(&tab_edit.old.end, Bias::Right);
old_cursor.seek_forward(&tab_edit.old.end, Bias::Right);
let mut old_end = old_cursor.start().output.lines;
old_end += tab_edit.old.end.0 - old_cursor.start().input.lines;
// todo(lw): Should these be seek_forward?
new_cursor.seek(&tab_edit.new.start, Bias::Right);
let mut new_start = new_cursor.start().output.lines;
new_start += tab_edit.new.start.0 - new_cursor.start().input.lines;
// todo(lw): Should these be seek_forward?
new_cursor.seek(&tab_edit.new.end, Bias::Right);
new_cursor.seek_forward(&tab_edit.new.end, Bias::Right);
let mut new_end = new_cursor.start().output.lines;
new_end += tab_edit.new.end.0 - new_cursor.start().input.lines;
@@ -1204,9 +1198,7 @@ mod tests {
use super::*;
use crate::{
MultiBuffer,
display_map::{
filter_map::FilterMap, fold_map::FoldMap, inlay_map::InlayMap, tab_map::TabMap,
},
display_map::{fold_map::FoldMap, inlay_map::InlayMap, tab_map::TabMap},
test::test_font,
};
use gpui::{LineFragment, px, test::observe};
@@ -1255,8 +1247,7 @@ mod tests {
});
let mut buffer_snapshot = buffer.read_with(cx, |buffer, cx| buffer.snapshot(cx));
log::info!("Buffer text: {:?}", buffer_snapshot.text());
let (mut filter_map, filter_snapshot) = FilterMap::new(None, buffer_snapshot.clone());
let (mut inlay_map, inlay_snapshot) = InlayMap::new(filter_snapshot.clone());
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
log::info!("InlayMap text: {:?}", inlay_snapshot.text());
let (mut fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot.clone());
log::info!("FoldMap text: {:?}", fold_snapshot.text());
@@ -1340,9 +1331,8 @@ mod tests {
}
log::info!("Buffer text: {:?}", buffer_snapshot.text());
let (filter_snapshot, filter_edits) =
filter_map.sync(buffer_snapshot.clone(), buffer_edits);
let (inlay_snapshot, inlay_edits) = inlay_map.sync(filter_snapshot, filter_edits);
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot.clone(), buffer_edits);
log::info!("InlayMap text: {:?}", inlay_snapshot.text());
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
log::info!("FoldMap text: {:?}", fold_snapshot.text());

View File

@@ -1728,24 +1728,6 @@ impl Editor {
clone
}
pub fn filtered(
buffer: Entity<MultiBuffer>,
project: Entity<Project>,
filter_mode: FilterMode,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
Editor::new_internal(
EditorMode::full(),
buffer,
Some(project),
None,
Some(filter_mode),
window,
cx,
)
}
pub fn new(
mode: EditorMode,
buffer: Entity<MultiBuffer>,
@@ -1753,7 +1735,7 @@ impl Editor {
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
Editor::new_internal(mode, buffer, project, None, None, window, cx)
Editor::new_internal(mode, buffer, project, None, window, cx)
}
fn new_internal(
@@ -1761,7 +1743,6 @@ impl Editor {
multi_buffer: Entity<MultiBuffer>,
project: Option<Entity<Project>>,
display_map: Option<Entity<DisplayMap>>,
filter_mode: Option<FilterMode>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
@@ -1825,7 +1806,6 @@ impl Editor {
MULTI_BUFFER_EXCERPT_HEADER_HEIGHT,
fold_placeholder,
diagnostics_max_severity,
filter_mode,
cx,
)
})
@@ -2459,6 +2439,10 @@ impl Editor {
key_context.add("renaming");
}
if !self.snippet_stack.is_empty() {
key_context.add("in_snippet");
}
match self.context_menu.borrow().as_ref() {
Some(CodeContextMenu::Completions(menu)) => {
if menu.visible() {
@@ -2875,7 +2859,6 @@ impl Editor {
MULTI_BUFFER_EXCERPT_HEADER_HEIGHT,
Default::default(),
DiagnosticSeverity::Off,
None,
cx,
)
}));
@@ -9968,6 +9951,38 @@ impl Editor {
self.outdent(&Outdent, window, cx);
}
pub fn next_snippet_tabstop(
&mut self,
_: &NextSnippetTabstop,
window: &mut Window,
cx: &mut Context<Self>,
) {
if self.mode.is_single_line() || self.snippet_stack.is_empty() {
return;
}
if self.move_to_next_snippet_tabstop(window, cx) {
self.hide_mouse_cursor(HideMouseCursorOrigin::TypingAction, cx);
return;
}
}
pub fn previous_snippet_tabstop(
&mut self,
_: &PreviousSnippetTabstop,
window: &mut Window,
cx: &mut Context<Self>,
) {
if self.mode.is_single_line() || self.snippet_stack.is_empty() {
return;
}
if self.move_to_prev_snippet_tabstop(window, cx) {
self.hide_mouse_cursor(HideMouseCursorOrigin::TypingAction, cx);
return;
}
}
pub fn tab(&mut self, _: &Tab, window: &mut Window, cx: &mut Context<Self>) {
if self.mode.is_single_line() {
cx.propagate();
@@ -10540,7 +10555,7 @@ impl Editor {
for selection in self
.selections
.all::<Point>(&self.display_snapshot(cx))
.all_adjusted(&self.display_snapshot(cx))
.iter()
{
let Some(wrap_config) = snapshot
@@ -14353,10 +14368,6 @@ impl Editor {
let last_selection = selections.iter().max_by_key(|s| s.id).unwrap();
let mut next_selected_range = None;
// Collect and sort selection ranges for efficient overlap checking
let mut selection_ranges: Vec<_> = selections.iter().map(|s| s.range()).collect();
selection_ranges.sort_by_key(|r| r.start);
let bytes_after_last_selection =
buffer.bytes_in_range(last_selection.end..buffer.len());
let bytes_before_first_selection = buffer.bytes_in_range(0..first_selection.start);
@@ -14378,18 +14389,11 @@ impl Editor {
|| (!buffer.is_inside_word(offset_range.start, None)
&& !buffer.is_inside_word(offset_range.end, None))
{
// Use binary search to check for overlap (O(log n))
let overlaps = selection_ranges
.binary_search_by(|range| {
if range.end <= offset_range.start {
std::cmp::Ordering::Less
} else if range.start >= offset_range.end {
std::cmp::Ordering::Greater
} else {
std::cmp::Ordering::Equal
}
})
.is_ok();
let idx = selections
.partition_point(|selection| selection.end <= offset_range.start);
let overlaps = selections
.get(idx)
.map_or(false, |selection| selection.start < offset_range.end);
if !overlaps {
next_selected_range = Some(offset_range);
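The change above swaps a hand-rolled `binary_search_by` for `partition_point`, the idiomatic way to run an O(log n) overlap test against sorted, non-overlapping ranges. A standalone sketch of the same idea (names are illustrative):

```rust
use std::ops::Range;

/// True if `candidate` overlaps any range in `sorted`, which must hold
/// sorted, non-overlapping half-open ranges.
fn overlaps_any(sorted: &[Range<usize>], candidate: &Range<usize>) -> bool {
    // Index of the first range whose end is past the candidate's start...
    let idx = sorted.partition_point(|r| r.end <= candidate.start);
    // ...which overlaps iff it starts before the candidate ends.
    sorted.get(idx).map_or(false, |r| r.start < candidate.end)
}

fn main() {
    let ranges = vec![0..3, 5..8, 10..12];
    assert!(overlaps_any(&ranges, &(2..6)));  // crosses 0..3 and 5..8
    assert!(!overlaps_any(&ranges, &(3..5))); // falls in the gap
    println!("ok");
}
```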
@@ -15874,7 +15878,7 @@ impl Editor {
) {
let current_scroll_position = self.scroll_position(cx);
let lines_to_expand = EditorSettings::get_global(cx).expand_excerpt_lines;
let mut should_scroll_up = false;
let mut scroll = None;
if direction == ExpandExcerptDirection::Down {
let multi_buffer = self.buffer.read(cx);
@@ -15887,17 +15891,30 @@ impl Editor {
let excerpt_end_row = Point::from_anchor(&excerpt_range.end, &buffer_snapshot).row;
let last_row = buffer_snapshot.max_point().row;
let lines_below = last_row.saturating_sub(excerpt_end_row);
should_scroll_up = lines_below >= lines_to_expand;
if lines_below >= lines_to_expand {
scroll = Some(
current_scroll_position
+ gpui::Point::new(0.0, lines_to_expand as ScrollOffset),
);
}
}
}
if direction == ExpandExcerptDirection::Up
&& self
.buffer
.read(cx)
.snapshot(cx)
.excerpt_before(excerpt)
.is_none()
{
scroll = Some(current_scroll_position);
}
self.buffer.update(cx, |buffer, cx| {
buffer.expand_excerpts([excerpt], lines_to_expand, direction, cx)
});
if should_scroll_up {
let new_scroll_position =
current_scroll_position + gpui::Point::new(0.0, lines_to_expand as ScrollOffset);
if let Some(new_scroll_position) = scroll {
self.set_scroll_position(new_scroll_position, window, cx);
}
}
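The rewrite above replaces the boolean `should_scroll_up` with an `Option` holding the target scroll position, so the expand-down case and the expand-up-at-first-excerpt case both feed a single `set_scroll_position` call. The decision can be sketched on its own (simplified stand-ins for the editor's types):

```rust
/// Returns the scroll position to apply after expanding an excerpt,
/// or `None` to leave the viewport alone.
fn scroll_after_expand(
    current_y: f64,
    expanding_down: bool,
    lines_below_excerpt: u32,
    lines_to_expand: u32,
    is_first_excerpt: bool,
) -> Option<f64> {
    if expanding_down && lines_below_excerpt >= lines_to_expand {
        // Content grew below the viewport; scroll down to keep it stable.
        Some(current_y + lines_to_expand as f64)
    } else if !expanding_down && is_first_excerpt {
        // Expanding the very first excerpt upward; re-apply the current
        // position so the viewport does not jump.
        Some(current_y)
    } else {
        None
    }
}

fn main() {
    assert_eq!(scroll_after_expand(10.0, true, 5, 3, false), Some(13.0));
    assert_eq!(scroll_after_expand(10.0, true, 2, 3, false), None);
    assert_eq!(scroll_after_expand(1.0, false, 0, 3, true), Some(1.0));
    println!("ok");
}
```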
@@ -16715,6 +16732,139 @@ impl Editor {
})
}
fn go_to_next_reference(
&mut self,
_: &GoToNextReference,
window: &mut Window,
cx: &mut Context<Self>,
) {
let task = self.go_to_reference_before_or_after_position(Direction::Next, 1, window, cx);
if let Some(task) = task {
task.detach();
};
}
fn go_to_prev_reference(
&mut self,
_: &GoToPreviousReference,
window: &mut Window,
cx: &mut Context<Self>,
) {
let task = self.go_to_reference_before_or_after_position(Direction::Prev, 1, window, cx);
if let Some(task) = task {
task.detach();
};
}
pub fn go_to_reference_before_or_after_position(
&mut self,
direction: Direction,
count: usize,
window: &mut Window,
cx: &mut Context<Self>,
) -> Option<Task<Result<()>>> {
let selection = self.selections.newest_anchor();
let head = selection.head();
let multi_buffer = self.buffer.read(cx);
let (buffer, text_head) = multi_buffer.text_anchor_for_position(head, cx)?;
let workspace = self.workspace()?;
let project = workspace.read(cx).project().clone();
let references =
project.update(cx, |project, cx| project.references(&buffer, text_head, cx));
Some(cx.spawn_in(window, async move |editor, cx| -> Result<()> {
let Some(locations) = references.await? else {
return Ok(());
};
if locations.is_empty() {
// totally normal - the cursor may be on something that is not
// a symbol (e.g. a keyword)
return Ok(());
}
let multi_buffer = editor.read_with(cx, |editor, _| editor.buffer().clone())?;
let multi_buffer_snapshot =
multi_buffer.read_with(cx, |multi_buffer, cx| multi_buffer.snapshot(cx))?;
let (locations, current_location_index) =
multi_buffer.update(cx, |multi_buffer, cx| {
let mut locations = locations
.into_iter()
.filter_map(|loc| {
let start = multi_buffer.buffer_anchor_to_anchor(
&loc.buffer,
loc.range.start,
cx,
)?;
let end = multi_buffer.buffer_anchor_to_anchor(
&loc.buffer,
loc.range.end,
cx,
)?;
Some(start..end)
})
.collect::<Vec<_>>();
// There is an O(n) implementation, but since this list will be
// small (usually <100 items), avoiding the extra O(log n) factor
// isn't worth the (surprisingly large amount of) extra complexity.
locations
.sort_unstable_by(|l, r| l.start.cmp(&r.start, &multi_buffer_snapshot));
let head_offset = head.to_offset(&multi_buffer_snapshot);
let current_location_index = locations.iter().position(|loc| {
loc.start.to_offset(&multi_buffer_snapshot) <= head_offset
&& loc.end.to_offset(&multi_buffer_snapshot) >= head_offset
});
(locations, current_location_index)
})?;
let Some(current_location_index) = current_location_index else {
// This indicates something has gone wrong, because we already
// handle the "no references" case above
log::error!(
"failed to find current reference under cursor. Total references: {}",
locations.len()
);
return Ok(());
};
let destination_location_index = match direction {
Direction::Next => (current_location_index + count) % locations.len(),
Direction::Prev => {
(current_location_index + locations.len() - count % locations.len())
% locations.len()
}
};
// TODO(cameron): is this needed?
// the thinking is to avoid "jumping to the current location" (avoid
// polluting "jumplist" in vim terms)
if current_location_index == destination_location_index {
return Ok(());
}
let Range { start, end } = locations[destination_location_index];
editor.update_in(cx, |editor, window, cx| {
let effects = SelectionEffects::default();
editor.unfold_ranges(&[start..end], false, false, cx);
editor.change_selections(effects, window, cx, |s| {
s.select_ranges([start..start]);
});
})?;
Ok(())
}))
}
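The wrap-around index arithmetic used for `destination_location_index` generalizes to any cyclic list: stepping forward adds `count` modulo the length, and stepping backward adds `len - count % len` so the unsigned math never underflows. A minimal sketch:

```rust
/// Step `count` places through a cycle of `len` items (len > 0),
/// forward or backward, without unsigned underflow.
fn cycle_step(current: usize, count: usize, len: usize, forward: bool) -> usize {
    if forward {
        (current + count) % len
    } else {
        (current + len - count % len) % len
    }
}

fn main() {
    assert_eq!(cycle_step(3, 1, 4, true), 0);  // wraps to the start
    assert_eq!(cycle_step(0, 1, 4, false), 3); // wraps to the end
    assert_eq!(cycle_step(0, 5, 4, false), 3); // count larger than len
    println!("ok");
}
```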
pub fn find_all_references(
&mut self,
_: &FindAllReferences,
@@ -19178,7 +19328,6 @@ impl Editor {
self.buffer.clone(),
None,
Some(self.display_map.clone()),
None,
window,
cx,
);
@@ -19303,13 +19452,6 @@ impl Editor {
}
}
#[cfg(test)]
pub(crate) fn set_filter_mode(&mut self, filter_mode: Option<FilterMode>, cx: &mut App) {
self.display_map.update(cx, |display_map, cx| {
display_map.set_filter_mode(filter_mode, cx);
});
}
pub fn set_soft_wrap(&mut self) {
self.soft_wrap_mode_override = Some(language_settings::SoftWrap::EditorWidth)
}

View File

@@ -1506,62 +1506,6 @@ fn test_move_cursor(cx: &mut TestAppContext) {
});
}
// FIXME
#[gpui::test]
async fn test_filtered_editor(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let mut cx = EditorTestContext::new(cx).await;
cx.update_editor(|editor, window, cx| {
editor.set_expand_all_diff_hunks(cx);
editor.set_text("one\nTWO\nTHREE\nfour\n", window, cx);
});
cx.set_head_text("one\ntwo\nthree\nfour\n");
cx.update_editor(|editor, _, cx| {
editor.set_filter_mode(Some(FilterMode::RemoveDeletions), cx);
});
cx.run_until_parked();
let text = cx.editor(|editor, window, cx| editor.display_text(cx));
pretty_assertions::assert_eq!(text, "one\nTWO\nTHREE\nfour\n");
cx.set_selections_state(indoc! {"
ˇone
two
three
TWO
THREE
four
"});
cx.update_editor(|editor, window, cx| {
editor.move_down(&Default::default(), window, cx);
});
cx.run_until_parked();
cx.update_editor(|editor, window, cx| {
editor.move_down(&Default::default(), window, cx);
});
cx.run_until_parked();
cx.update_editor(|editor, window, cx| {
editor.move_down(&Default::default(), window, cx);
});
cx.run_until_parked();
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
one
two
three
TWO
THREE
ˇfour
"});
// let row_infos = cx.editor(|editor, window, cx| {
// editor
// .display_snapshot(cx)
// .row_infos(DisplayRow(0))
// .collect::<Vec<_>>()
// });
// dbg!(&row_infos);
}
#[gpui::test]
fn test_move_cursor_multibyte(cx: &mut TestAppContext) {
init_test(cx, |_| {});
@@ -11122,6 +11066,129 @@ async fn test_snippet_placeholder_choices(cx: &mut TestAppContext) {
});
}
#[gpui::test]
async fn test_snippet_tabstop_navigation_with_placeholders(cx: &mut TestAppContext) {
init_test(cx, |_| {});
fn assert_state(editor: &mut Editor, cx: &mut Context<Editor>, marked_text: &str) {
let (expected_text, selection_ranges) = marked_text_ranges(marked_text, false);
assert_eq!(editor.text(cx), expected_text);
assert_eq!(
editor
.selections
.ranges::<usize>(&editor.display_snapshot(cx)),
selection_ranges
);
}
let (text, insertion_ranges) = marked_text_ranges(
indoc! {"
ˇ
"},
false,
);
let buffer = cx.update(|cx| MultiBuffer::build_simple(&text, cx));
let (editor, cx) = cx.add_window_view(|window, cx| build_editor(buffer, window, cx));
_ = editor.update_in(cx, |editor, window, cx| {
let snippet = Snippet::parse("type ${1|,i32,u32|} = $2; $3").unwrap();
editor
.insert_snippet(&insertion_ranges, snippet, window, cx)
.unwrap();
assert_state(
editor,
cx,
indoc! {"
type «» = ;•
"},
);
assert!(
editor.context_menu_visible(),
"Context menu should be visible for placeholder choices"
);
editor.next_snippet_tabstop(&NextSnippetTabstop, window, cx);
assert_state(
editor,
cx,
indoc! {"
type = «»;•
"},
);
assert!(
!editor.context_menu_visible(),
"Context menu should be hidden after moving to next tabstop"
);
editor.next_snippet_tabstop(&NextSnippetTabstop, window, cx);
assert_state(
editor,
cx,
indoc! {"
type = ; ˇ
"},
);
editor.next_snippet_tabstop(&NextSnippetTabstop, window, cx);
assert_state(
editor,
cx,
indoc! {"
type = ; ˇ
"},
);
});
_ = editor.update_in(cx, |editor, window, cx| {
editor.select_all(&SelectAll, window, cx);
editor.backspace(&Backspace, window, cx);
let snippet = Snippet::parse("fn ${1|,foo,bar|} = ${2:value}; $3").unwrap();
let insertion_ranges = editor
.selections
.all(&editor.display_snapshot(cx))
.iter()
.map(|s| s.range())
.collect::<Vec<_>>();
editor
.insert_snippet(&insertion_ranges, snippet, window, cx)
.unwrap();
assert_state(editor, cx, "fn «» = value;•");
assert!(
editor.context_menu_visible(),
"Context menu should be visible for placeholder choices"
);
editor.next_snippet_tabstop(&NextSnippetTabstop, window, cx);
assert_state(editor, cx, "fn = «valueˇ»;•");
editor.previous_snippet_tabstop(&PreviousSnippetTabstop, window, cx);
assert_state(editor, cx, "fn «» = value;•");
assert!(
editor.context_menu_visible(),
"Context menu should be visible again after returning to first tabstop"
);
editor.previous_snippet_tabstop(&PreviousSnippetTabstop, window, cx);
assert_state(editor, cx, "fn «» = value;•");
});
}
#[gpui::test]
async fn test_snippets(cx: &mut TestAppContext) {
init_test(cx, |_| {});
@@ -12685,12 +12752,6 @@ async fn test_strip_whitespace_and_format_via_lsp(cx: &mut TestAppContext) {
);
}
});
#[cfg(target_os = "windows")]
let line_ending = "\r\n";
#[cfg(not(target_os = "windows"))]
let line_ending = "\n";
// Handle formatting requests to the language server.
cx.lsp
.set_request_handler::<lsp::request::Formatting, _, _>({
@@ -12714,7 +12775,7 @@ async fn test_strip_whitespace_and_format_via_lsp(cx: &mut TestAppContext) {
),
(
lsp::Range::new(lsp::Position::new(3, 4), lsp::Position::new(3, 4)),
line_ending.into()
"\n".into()
),
]
);
@@ -12725,14 +12786,14 @@ async fn test_strip_whitespace_and_format_via_lsp(cx: &mut TestAppContext) {
lsp::Position::new(1, 0),
lsp::Position::new(1, 0),
),
new_text: line_ending.into(),
new_text: "\n".into(),
},
lsp::TextEdit {
range: lsp::Range::new(
lsp::Position::new(2, 0),
lsp::Position::new(2, 0),
),
new_text: line_ending.into(),
new_text: "\n".into(),
},
]))
}
@@ -26718,83 +26779,6 @@ async fn test_paste_url_from_other_app_creates_markdown_link_selectively_in_mult
));
}
#[gpui::test]
async fn test_non_linux_line_endings_registration(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let unix_newlines_file_text = "fn main() {
let a = 5;
}";
let clrf_file_text = unix_newlines_file_text.lines().join("\r\n");
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
path!("/a"),
json!({
"first.rs": &clrf_file_text,
}),
)
.await;
let project = Project::test(fs, [path!("/a").as_ref()], cx).await;
let workspace = cx.add_window(|window, cx| Workspace::test_new(project.clone(), window, cx));
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let registered_text = Arc::new(Mutex::new(Vec::new()));
let language_registry = project.read_with(cx, |project, _| project.languages().clone());
language_registry.add(rust_lang());
let mut fake_servers = language_registry.register_fake_lsp(
"Rust",
FakeLspAdapter {
capabilities: lsp::ServerCapabilities {
color_provider: Some(lsp::ColorProviderCapability::Simple(true)),
..lsp::ServerCapabilities::default()
},
name: "rust-analyzer",
initializer: Some({
let registered_text = registered_text.clone();
Box::new(move |fake_server| {
fake_server.handle_notification::<lsp::notification::DidOpenTextDocument, _>({
let registered_text = registered_text.clone();
move |params, _| {
registered_text.lock().push(params.text_document.text);
}
});
})
}),
..FakeLspAdapter::default()
},
);
let editor = workspace
.update(cx, |workspace, window, cx| {
workspace.open_abs_path(
PathBuf::from(path!("/a/first.rs")),
OpenOptions::default(),
window,
cx,
)
})
.unwrap()
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
let _fake_language_server = fake_servers.next().await.unwrap();
cx.executor().run_until_parked();
assert_eq!(
editor.update(cx, |editor, cx| editor.text(cx)),
unix_newlines_file_text,
"Default text API returns \n-separated text",
);
assert_eq!(
vec![clrf_file_text],
registered_text.lock().drain(..).collect::<Vec<_>>(),
"Expected the language server to receive the exact same text from the FS",
);
}
#[gpui::test]
async fn test_race_in_multibuffer_save(cx: &mut TestAppContext) {
init_test(cx, |_| {});
@@ -26998,3 +26982,123 @@ async fn test_end_of_editor_context(cx: &mut TestAppContext) {
assert!(!e.key_context(window, cx).contains("end_of_input"));
});
}
#[gpui::test]
async fn test_next_prev_reference(cx: &mut TestAppContext) {
const CYCLE_POSITIONS: &[&'static str] = &[
indoc! {"
fn foo() {
let ˇabc = 123;
let x = abc + 1;
let y = abc + 2;
let z = abc + 2;
}
"},
indoc! {"
fn foo() {
let abc = 123;
let x = ˇabc + 1;
let y = abc + 2;
let z = abc + 2;
}
"},
indoc! {"
fn foo() {
let abc = 123;
let x = abc + 1;
let y = ˇabc + 2;
let z = abc + 2;
}
"},
indoc! {"
fn foo() {
let abc = 123;
let x = abc + 1;
let y = abc + 2;
let z = ˇabc + 2;
}
"},
];
init_test(cx, |_| {});
let mut cx = EditorLspTestContext::new_rust(
lsp::ServerCapabilities {
references_provider: Some(lsp::OneOf::Left(true)),
..Default::default()
},
cx,
)
.await;
// importantly, the cursor starts in the middle of the symbol
cx.set_state(indoc! {"
fn foo() {
let aˇbc = 123;
let x = abc + 1;
let y = abc + 2;
let z = abc + 2;
}
"});
let reference_ranges = [
lsp::Position::new(1, 8),
lsp::Position::new(2, 12),
lsp::Position::new(3, 12),
lsp::Position::new(4, 12),
]
.map(|start| lsp::Range::new(start, lsp::Position::new(start.line, start.character + 3)));
cx.lsp
.set_request_handler::<lsp::request::References, _, _>(move |params, _cx| async move {
Ok(Some(
reference_ranges
.map(|range| lsp::Location {
uri: params.text_document_position.text_document.uri.clone(),
range,
})
.to_vec(),
))
});
let _move = async |direction, count, cx: &mut EditorLspTestContext| {
cx.update_editor(|editor, window, cx| {
editor.go_to_reference_before_or_after_position(direction, count, window, cx)
})
.unwrap()
.await
.unwrap()
};
_move(Direction::Next, 1, &mut cx).await;
cx.assert_editor_state(CYCLE_POSITIONS[1]);
_move(Direction::Next, 1, &mut cx).await;
cx.assert_editor_state(CYCLE_POSITIONS[2]);
_move(Direction::Next, 1, &mut cx).await;
cx.assert_editor_state(CYCLE_POSITIONS[3]);
// loops back to the start
_move(Direction::Next, 1, &mut cx).await;
cx.assert_editor_state(CYCLE_POSITIONS[0]);
// loops back to the end
_move(Direction::Prev, 1, &mut cx).await;
cx.assert_editor_state(CYCLE_POSITIONS[3]);
_move(Direction::Prev, 1, &mut cx).await;
cx.assert_editor_state(CYCLE_POSITIONS[2]);
_move(Direction::Prev, 1, &mut cx).await;
cx.assert_editor_state(CYCLE_POSITIONS[1]);
_move(Direction::Prev, 1, &mut cx).await;
cx.assert_editor_state(CYCLE_POSITIONS[0]);
_move(Direction::Next, 3, &mut cx).await;
cx.assert_editor_state(CYCLE_POSITIONS[3]);
_move(Direction::Prev, 2, &mut cx).await;
cx.assert_editor_state(CYCLE_POSITIONS[1]);
}

View File

@@ -232,6 +232,8 @@ impl EditorElement {
register_action(editor, window, Editor::blame_hover);
register_action(editor, window, Editor::delete);
register_action(editor, window, Editor::tab);
register_action(editor, window, Editor::next_snippet_tabstop);
register_action(editor, window, Editor::previous_snippet_tabstop);
register_action(editor, window, Editor::backtab);
register_action(editor, window, Editor::indent);
register_action(editor, window, Editor::outdent);
@@ -495,6 +497,8 @@ impl EditorElement {
register_action(editor, window, Editor::collapse_all_diff_hunks);
register_action(editor, window, Editor::go_to_previous_change);
register_action(editor, window, Editor::go_to_next_change);
register_action(editor, window, Editor::go_to_prev_reference);
register_action(editor, window, Editor::go_to_next_reference);
register_action(editor, window, |editor, action, window, cx| {
if let Some(task) = editor.format(action, window, cx) {
@@ -3173,7 +3177,7 @@ impl EditorElement {
i += 1;
}
delta = 1;
i = head_idx.min(buffer_rows.len() as u32 - 1);
i = head_idx.min(buffer_rows.len().saturating_sub(1) as u32);
while i > 0 && buffer_rows[i as usize].buffer_row.is_none() {
i -= 1;
}
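The `saturating_sub` fix above matters when `buffer_rows` is empty: `len() - 1` underflows (panicking in debug builds), while `len().saturating_sub(1)` clamps to 0. In isolation:

```rust
/// Clamp a head index to the last valid row, tolerating an empty slice.
fn clamp_to_last_row(head_idx: u32, row_count: usize) -> u32 {
    head_idx.min(row_count.saturating_sub(1) as u32)
}

fn main() {
    assert_eq!(clamp_to_last_row(5, 10), 5); // within bounds
    assert_eq!(clamp_to_last_row(5, 3), 2);  // clamped to the last index
    assert_eq!(clamp_to_last_row(5, 0), 0);  // empty slice: no panic
    println!("ok");
}
```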
@@ -3757,16 +3761,6 @@ impl EditorElement {
result.into_any()
}
Block::Spacer { height, .. } => v_flex()
.id(block_id)
.w_full()
.child(
div()
.h(*height as f32 * window.line_height())
.debug_bg_magenta(),
)
.into_any(),
};
// Discover the element's content height, then round up to the nearest multiple of line height.
@@ -5115,6 +5109,7 @@ impl EditorElement {
snapshot,
visible_display_row_range.clone(),
max_size,
&editor.text_layout_details(window),
window,
cx,
)

View File

@@ -1,6 +1,7 @@
use crate::Editor;
use anyhow::Result;
use collections::HashMap;
use futures::StreamExt;
use git::{
GitHostingProviderRegistry, GitRemote, Oid,
blame::{Blame, BlameEntry, ParsedCommitMessage},
@@ -507,7 +508,7 @@ impl GitBlame {
let buffer_edits = buffer.update(cx, |buffer, _| buffer.subscribe());
let blame_buffer = project.blame_buffer(&buffer, None, cx);
Some((id, snapshot, buffer_edits, blame_buffer))
Some(async move { (id, snapshot, buffer_edits, blame_buffer.await) })
})
.collect::<Vec<_>>()
});
@@ -517,10 +518,14 @@ impl GitBlame {
let (result, errors) = cx
.background_spawn({
async move {
let blame = futures::stream::iter(blame)
.buffered(4)
.collect::<Vec<_>>()
.await;
let mut res = vec![];
let mut errors = vec![];
for (id, snapshot, buffer_edits, blame) in blame {
match blame.await {
match blame {
Ok(Some(Blame {
entries,
messages,
@@ -597,6 +602,7 @@ impl GitBlame {
}
fn regenerate_on_edit(&mut self, cx: &mut Context<Self>) {
// todo(lw): hot foreground spawn
self.regenerate_on_edit_task = cx.spawn(async move |this, cx| {
cx.background_executor()
.timer(REGENERATE_ON_EDIT_DEBOUNCE_INTERVAL)

View File

@@ -3,6 +3,7 @@ use crate::{
EditorSnapshot, GlobalDiagnosticRenderer, Hover,
display_map::{InlayOffset, ToDisplayPoint, invisibles::is_invisible},
hover_links::{InlayHighlight, RangeInEditor},
movement::TextLayoutDetails,
scroll::ScrollAmount,
};
use anyhow::Context as _;
@@ -766,9 +767,13 @@ impl HoverState {
snapshot: &EditorSnapshot,
visible_rows: Range<DisplayRow>,
max_size: Size<Pixels>,
text_layout_details: &TextLayoutDetails,
window: &mut Window,
cx: &mut Context<Editor>,
) -> Option<(DisplayPoint, Vec<AnyElement>)> {
if !self.visible() {
return None;
}
// If there is a diagnostic, position the popovers based on that.
// Otherwise use the start of the hover range
let anchor = self
@@ -791,11 +796,29 @@ impl HoverState {
}
})
})?;
let point = anchor.to_display_point(&snapshot.display_snapshot);
let mut point = anchor.to_display_point(&snapshot.display_snapshot);
// Don't render if the relevant point isn't on screen
if !self.visible() || !visible_rows.contains(&point.row()) {
return None;
// Clamp the point within the visible rows in case the popup source spans multiple lines
if point.row() < visible_rows.start {
point = crate::movement::down_by_rows(
&snapshot.display_snapshot,
point,
(visible_rows.start - point.row()).0,
text::SelectionGoal::None,
true,
text_layout_details,
)
.0;
} else if visible_rows.end <= point.row() {
point = crate::movement::up_by_rows(
&snapshot.display_snapshot,
point,
(visible_rows.end - point.row()).0,
text::SelectionGoal::None,
true,
text_layout_details,
)
.0;
}
let mut elements = Vec::new();
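The branches above pull an off-screen popover anchor back into the half-open `visible_rows` range by moving it down or up by the row difference. Stripped of the display-map machinery (the real code moves through `down_by_rows`/`up_by_rows`), the intended net effect is a clamp; a simplified sketch:

```rust
use std::ops::Range;

/// Clamp `row` into the half-open range `visible` (assumed non-empty).
fn clamp_into_visible(row: u32, visible: &Range<u32>) -> u32 {
    if row < visible.start {
        visible.start    // anchor was above the viewport: move down
    } else if visible.end <= row {
        visible.end - 1  // anchor was below the viewport: move up
    } else {
        row
    }
}

fn main() {
    assert_eq!(clamp_into_visible(2, &(5..10)), 5);
    assert_eq!(clamp_into_visible(12, &(5..10)), 9);
    assert_eq!(clamp_into_visible(7, &(5..10)), 7);
    println!("ok");
}
```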

View File

@@ -344,7 +344,7 @@ impl Editor {
.extend(invalidate_hints_for_buffers);
let mut buffers_to_query = HashMap::default();
for (excerpt_id, (buffer, buffer_version, visible_range)) in visible_excerpts {
for (_, (buffer, buffer_version, visible_range)) in visible_excerpts {
let buffer_id = buffer.read(cx).remote_id();
if !self.registered_buffers.contains_key(&buffer_id) {
continue;
@@ -358,13 +358,11 @@ impl Editor {
buffers_to_query
.entry(buffer_id)
.or_insert_with(|| VisibleExcerpts {
excerpts: Vec::new(),
ranges: Vec::new(),
buffer_version: buffer_version.clone(),
buffer: buffer.clone(),
});
visible_excerpts.buffer_version = buffer_version;
visible_excerpts.excerpts.push(excerpt_id);
visible_excerpts.ranges.push(buffer_anchor_range);
}
@@ -850,7 +848,6 @@ impl Editor {
#[derive(Debug)]
struct VisibleExcerpts {
excerpts: Vec<ExcerptId>,
ranges: Vec<Range<text::Anchor>>,
buffer_version: Global,
buffer: Entity<language::Buffer>,
@@ -1184,17 +1181,17 @@ pub mod tests {
})
.unwrap();
let progress_token = "test_progress_token";
let progress_token = 42;
fake_server
.request::<lsp::request::WorkDoneProgressCreate>(lsp::WorkDoneProgressCreateParams {
token: lsp::ProgressToken::String(progress_token.to_string()),
token: lsp::ProgressToken::Number(progress_token),
})
.await
.into_response()
.expect("work done progress create request failed");
cx.executor().run_until_parked();
fake_server.notify::<lsp::notification::Progress>(lsp::ProgressParams {
token: lsp::ProgressToken::String(progress_token.to_string()),
token: lsp::ProgressToken::Number(progress_token),
value: lsp::ProgressParamsValue::WorkDone(lsp::WorkDoneProgress::Begin(
lsp::WorkDoneProgressBegin::default(),
)),
@@ -1214,7 +1211,7 @@ pub mod tests {
.unwrap();
fake_server.notify::<lsp::notification::Progress>(lsp::ProgressParams {
token: lsp::ProgressToken::String(progress_token.to_string()),
token: lsp::ProgressToken::Number(progress_token),
value: lsp::ProgressParamsValue::WorkDone(lsp::WorkDoneProgress::End(
lsp::WorkDoneProgressEnd::default(),
)),
@@ -2017,7 +2014,7 @@ pub mod tests {
task_lsp_request_ranges.lock().push(params.range);
task_lsp_request_count.fetch_add(1, Ordering::Release);
Ok(Some(vec![lsp::InlayHint {
position: params.range.end,
position: params.range.start,
label: lsp::InlayHintLabel::String(
params.range.end.line.to_string(),
),
@@ -2698,7 +2695,7 @@ let c = 3;"#
),
(
"main.rs",
lsp::Range::new(lsp::Position::new(50, 0), lsp::Position::new(100, 11))
lsp::Range::new(lsp::Position::new(50, 0), lsp::Position::new(100, 0))
),
],
lsp_request_ranges
@@ -2757,7 +2754,7 @@ let c = 3;"#
),
(
"main.rs",
lsp::Range::new(lsp::Position::new(50, 0), lsp::Position::new(100, 11))
lsp::Range::new(lsp::Position::new(50, 0), lsp::Position::new(100, 0))
),
],
lsp_request_ranges

View File

@@ -1012,7 +1012,6 @@ mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -1211,7 +1210,6 @@ mod tests {
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});

View File

@@ -1,130 +0,0 @@
use crate::EditorEvent;
use super::{Editor, EditorElement, EditorStyle};
use gpui::{App, Context, Entity, Render, Window};
use ui::{Element, IntoElement};
use workspace::ItemHandle;
// stage 1: render two side-by-side editors with the correct scrolling behavior
// stage 2: add alignment map to insert blank lines
/// An editor that can be rendered with a split diff layout.
///
/// When `secondary` is `None`, it is rendered with an inline diff style.
pub struct SplittableEditor {
primary: Entity<Editor>,
secondary: Option<Entity<Editor>>,
}
impl SplittableEditor {
fn subscribe(&mut self, window: &mut Window, cx: &mut Context<Self>) {
cx.subscribe_in(
&self.primary,
window,
|this, editor, event: &EditorEvent, window, cx| {},
)
.detach();
}
fn sync_state(&mut self, cx: &mut App) {}
}
impl SplittableEditor {}
struct SplitEditorElement {
primary: Entity<Editor>,
secondary: Entity<Editor>,
style: EditorStyle,
}
struct SplitEditorElementLayout {}
impl Element for SplitEditorElement {
type RequestLayoutState = ();
type PrepaintState = SplitEditorElementLayout;
fn id(&self) -> Option<ui::ElementId> {
todo!()
}
fn source_location(&self) -> Option<&'static std::panic::Location<'static>> {
todo!()
}
fn request_layout(
&mut self,
id: Option<&gpui::GlobalElementId>,
inspector_id: Option<&gpui::InspectorElementId>,
window: &mut ui::Window,
cx: &mut ui::App,
) -> (gpui::LayoutId, Self::RequestLayoutState) {
}
fn prepaint(
&mut self,
id: Option<&gpui::GlobalElementId>,
inspector_id: Option<&gpui::InspectorElementId>,
bounds: gpui::Bounds<ui::Pixels>,
request_layout: &mut Self::RequestLayoutState,
window: &mut ui::Window,
cx: &mut ui::App,
) -> Self::PrepaintState {
todo!()
}
fn paint(
&mut self,
id: Option<&gpui::GlobalElementId>,
inspector_id: Option<&gpui::InspectorElementId>,
bounds: gpui::Bounds<ui::Pixels>,
request_layout: &mut Self::RequestLayoutState,
prepaint: &mut Self::PrepaintState,
window: &mut ui::Window,
cx: &mut ui::App,
) {
todo!()
}
}
impl Render for SplittableEditor {
fn render(
&mut self,
window: &mut ui::Window,
cx: &mut ui::Context<Self>,
) -> impl ui::IntoElement {
enum SplittableEditorElement {
Single(EditorElement),
Split(SplitEditorElement),
}
impl Element for SplittableEditorElement {}
impl IntoElement for SplittableEditorElement {
type Element = Self;
fn into_element(self) -> Self::Element {
self
}
}
let style;
if let Some(secondary) = self.secondary.clone() {
SplittableEditorElement::Split(SplitEditorElement {
primary: self.primary.clone(),
secondary,
style,
})
} else {
SplittableEditorElement::Single(EditorElement::new(&self.primary.clone(), style))
}
}
}
impl IntoElement for SplitEditorElement {
type Element = Self;
fn into_element(self) -> Self::Element {
self
}
}

View File

@@ -72,7 +72,6 @@ pub fn marked_display_snapshot(
1,
FoldPlaceholder::test(),
DiagnosticSeverity::Warning,
None,
cx,
)
});
@@ -239,11 +238,6 @@ pub fn editor_content_with_blocks(editor: &Entity<Editor>, cx: &mut VisualTestCo
lines[row as usize].push_str("§ -----");
}
}
Block::Spacer { height, .. } => {
for row in row.0..row.0 + height {
lines[row as usize].push_str("@@@@@@@");
}
}
}
}
lines.join("\n")

View File

@@ -5,7 +5,7 @@ use crate::{
use anyhow::{Context as _, Result, bail};
use async_compression::futures::bufread::GzipDecoder;
use async_tar::Archive;
use futures::io::BufReader;
use futures::{AsyncReadExt, io::Cursor};
use heck::ToSnakeCase;
use http_client::{self, AsyncBody, HttpClient};
use serde::Deserialize;
@@ -411,6 +411,8 @@ impl ExtensionBuilder {
let mut clang_path = wasi_sdk_dir.clone();
clang_path.extend(["bin", &format!("clang{}", env::consts::EXE_SUFFIX)]);
log::info!("downloading wasi-sdk to {}", wasi_sdk_dir.display());
if fs::metadata(&clang_path).is_ok_and(|metadata| metadata.is_file()) {
return Ok(clang_path);
}
@@ -423,13 +425,19 @@ impl ExtensionBuilder {
log::info!("downloading wasi-sdk to {}", wasi_sdk_dir.display());
let mut response = self.http.get(&url, AsyncBody::default(), true).await?;
let body = BufReader::new(response.body_mut());
let body = GzipDecoder::new(body);
let body = GzipDecoder::new({
// stream the entire request into memory at once as the artifact is quite big (100MB+)
let mut b = vec![];
response.body_mut().read_to_end(&mut b).await?;
Cursor::new(b)
});
let tar = Archive::new(body);
log::info!("un-tarring wasi-sdk to {}", wasi_sdk_dir.display());
tar.unpack(&tar_out_dir)
.await
.context("failed to unpack wasi-sdk archive")?;
log::info!("finished downloading wasi-sdk");
let inner_dir = fs::read_dir(&tar_out_dir)?
.next()

View File

@@ -360,7 +360,7 @@ impl ExtensionStore {
}
extension_id = reload_rx.next() => {
let Some(extension_id) = extension_id else { break; };
this.update( cx, |this, _| {
this.update(cx, |this, _| {
this.modified_extensions.extend(extension_id);
})?;
index_changed = true;
@@ -608,7 +608,7 @@ impl ExtensionStore {
.extension_index
.extensions
.contains_key(extension_id.as_ref());
!is_already_installed
!is_already_installed && !SUPPRESSED_EXTENSIONS.contains(&extension_id.as_ref())
})
.cloned()
.collect::<Vec<_>>();

View File

@@ -31,7 +31,8 @@ use util::test::TempTree;
#[cfg(test)]
#[ctor::ctor]
fn init_logger() {
zlog::init_test();
// Show info logs while we debug why the extension_store tests hang.
zlog::init_test_with("info");
}
#[gpui::test]
@@ -529,10 +530,6 @@ async fn test_extension_store(cx: &mut TestAppContext) {
});
}
// todo(windows)
// Disable this test on Windows for now. Because this test hangs at
// `let fake_server = fake_servers.next().await.unwrap();`.
// Reenable this test when we figure out why.
#[gpui::test]
async fn test_extension_store_with_test_extension(cx: &mut TestAppContext) {
init_test(cx);

View File

@@ -658,7 +658,7 @@ impl WasmHost {
};
cx.spawn(async move |cx| {
let (extension_task, manifest, work_dir, tx, zed_api_version) =
load_extension_task.await?;
cx.background_executor().spawn(load_extension_task).await?;
// we need to run the task in an extension context as wasmtime_wasi may
// call into tokio, accessing its runtime handle
let task = Arc::new(gpui_tokio::Tokio::spawn(cx, extension_task)?);

View File

@@ -1,5 +1,3 @@
mod extension_card;
mod feature_upsell;
pub use extension_card::*;
pub use feature_upsell::*;

View File

@@ -1,77 +0,0 @@
use gpui::{AnyElement, Div, StyleRefinement};
use smallvec::SmallVec;
use ui::prelude::*;
#[derive(IntoElement)]
pub struct FeatureUpsell {
base: Div,
text: SharedString,
docs_url: Option<SharedString>,
children: SmallVec<[AnyElement; 2]>,
}
impl FeatureUpsell {
pub fn new(text: impl Into<SharedString>) -> Self {
Self {
base: h_flex(),
text: text.into(),
docs_url: None,
children: SmallVec::new(),
}
}
pub fn docs_url(mut self, docs_url: impl Into<SharedString>) -> Self {
self.docs_url = Some(docs_url.into());
self
}
}
impl ParentElement for FeatureUpsell {
fn extend(&mut self, elements: impl IntoIterator<Item = AnyElement>) {
self.children.extend(elements)
}
}
// Style methods.
impl FeatureUpsell {
fn style(&mut self) -> &mut StyleRefinement {
self.base.style()
}
gpui::border_style_methods!({
visibility: pub
});
}
impl RenderOnce for FeatureUpsell {
fn render(self, _window: &mut Window, cx: &mut App) -> impl IntoElement {
self.base
.py_2()
.px_4()
.justify_between()
.flex_wrap()
.border_color(cx.theme().colors().border_variant)
.child(Label::new(self.text))
.child(h_flex().gap_2().children(self.children).when_some(
self.docs_url,
|el, docs_url| {
el.child(
Button::new("open_docs", "View Documentation")
.icon(IconName::ArrowUpRight)
.icon_size(IconSize::Small)
.icon_position(IconPosition::End)
.on_click({
move |_event, _window, cx| {
telemetry::event!(
"Documentation Viewed",
source = "Feature Upsell",
url = docs_url,
);
cx.open_url(&docs_url)
}
}),
)
},
))
}
}

View File

@@ -24,8 +24,8 @@ use settings::{Settings, SettingsContent};
use strum::IntoEnumIterator as _;
use theme::ThemeSettings;
use ui::{
CheckboxWithLabel, Chip, ContextMenu, PopoverMenu, ScrollableHandle, ToggleButton, Tooltip,
WithScrollbar, prelude::*,
Banner, Chip, ContextMenu, Divider, PopoverMenu, ScrollableHandle, Switch, ToggleButton,
Tooltip, WithScrollbar, prelude::*,
};
use vim_mode_setting::VimModeSetting;
use workspace::{
@@ -34,7 +34,7 @@ use workspace::{
};
use zed_actions::ExtensionCategoryFilter;
use crate::components::{ExtensionCard, FeatureUpsell};
use crate::components::ExtensionCard;
use crate::extension_version_selector::{
ExtensionVersionSelector, ExtensionVersionSelectorDelegate,
};
@@ -225,9 +225,9 @@ impl ExtensionFilter {
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone, Copy)]
enum Feature {
ExtensionRuff,
ExtensionTailwind,
Git,
OpenIn,
Vim,
LanguageBash,
LanguageC,
LanguageCpp,
@@ -236,13 +236,28 @@ enum Feature {
LanguageReact,
LanguageRust,
LanguageTypescript,
OpenIn,
Vim,
}
fn keywords_by_feature() -> &'static BTreeMap<Feature, Vec<&'static str>> {
static KEYWORDS_BY_FEATURE: OnceLock<BTreeMap<Feature, Vec<&'static str>>> = OnceLock::new();
KEYWORDS_BY_FEATURE.get_or_init(|| {
BTreeMap::from_iter([
(Feature::ExtensionRuff, vec!["ruff"]),
(Feature::ExtensionTailwind, vec!["tail", "tailwind"]),
(Feature::Git, vec!["git"]),
(Feature::LanguageBash, vec!["sh", "bash"]),
(Feature::LanguageC, vec!["c", "clang"]),
(Feature::LanguageCpp, vec!["c++", "cpp", "clang"]),
(Feature::LanguageGo, vec!["go", "golang"]),
(Feature::LanguagePython, vec!["python", "py"]),
(Feature::LanguageReact, vec!["react"]),
(Feature::LanguageRust, vec!["rust", "rs"]),
(
Feature::LanguageTypescript,
vec!["type", "typescript", "ts"],
),
(
Feature::OpenIn,
vec![
@@ -257,17 +272,6 @@ fn keywords_by_feature() -> &'static BTreeMap<Feature, Vec<&'static str>> {
],
),
(Feature::Vim, vec!["vim"]),
(Feature::LanguageBash, vec!["sh", "bash"]),
(Feature::LanguageC, vec!["c", "clang"]),
(Feature::LanguageCpp, vec!["c++", "cpp", "clang"]),
(Feature::LanguageGo, vec!["go", "golang"]),
(Feature::LanguagePython, vec!["python", "py"]),
(Feature::LanguageReact, vec!["react"]),
(Feature::LanguageRust, vec!["rust", "rs"]),
(
Feature::LanguageTypescript,
vec!["type", "typescript", "ts"],
),
])
})
}
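For context, the table above feeds the upsell matcher: a feature banner is surfaced when the user's search query hits one of that feature's keywords. A hypothetical miniature of that lookup — plain strings stand in for the `Feature` enum, and a naive substring match stands in for whatever `ExtensionsPage` actually does:

```rust
use std::collections::BTreeMap;

// Hypothetical miniature of keywords_by_feature: feature name -> search keywords.
fn keywords_by_feature() -> BTreeMap<&'static str, Vec<&'static str>> {
    BTreeMap::from_iter([
        ("Git", vec!["git"]),
        ("LanguageRust", vec!["rust", "rs"]),
        ("Vim", vec!["vim"]),
    ])
}

// A feature upsell is shown when any of its keywords occurs in the query.
// BTreeMap iteration is sorted by key, so the result order is deterministic.
fn features_for_query(query: &str) -> Vec<&'static str> {
    let query = query.to_lowercase();
    keywords_by_feature()
        .iter()
        .filter(|(_, keywords)| keywords.iter().any(|k| query.contains(k)))
        .map(|(feature, _)| *feature)
        .collect()
}

fn main() {
    assert_eq!(features_for_query("rust analyzer"), vec!["LanguageRust"]);
    assert_eq!(features_for_query("Vim"), vec!["Vim"]);
    assert!(features_for_query("elixir").is_empty());
}
```

This also shows why the diff merely re-sorts the entries: with a `BTreeMap` the declaration order of the tuples does not affect lookup, only readability.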
@@ -1336,58 +1340,172 @@ impl ExtensionsPage {
}
}
fn render_feature_upsells(&self, cx: &mut Context<Self>) -> impl IntoElement {
let upsells_count = self.upsells.len();
v_flex().children(self.upsells.iter().enumerate().map(|(ix, feature)| {
let upsell = match feature {
Feature::Git => FeatureUpsell::new(
"Zed comes with basic Git support. More Git features are coming in the future.",
)
.docs_url("https://zed.dev/docs/git"),
Feature::OpenIn => FeatureUpsell::new(
"Zed supports linking to a source line on GitHub and others.",
)
.docs_url("https://zed.dev/docs/git#git-integrations"),
Feature::Vim => FeatureUpsell::new("Vim support is built-in to Zed!")
.docs_url("https://zed.dev/docs/vim")
.child(CheckboxWithLabel::new(
"enable-vim",
Label::new("Enable vim mode"),
if VimModeSetting::get_global(cx).0 {
ui::ToggleState::Selected
} else {
ui::ToggleState::Unselected
},
cx.listener(move |this, selection, _, cx| {
telemetry::event!("Vim Mode Toggled", source = "Feature Upsell");
this.update_settings(selection, cx, |setting, value| {
setting.vim_mode = Some(value)
});
}),
)),
Feature::LanguageBash => FeatureUpsell::new("Shell support is built-in to Zed!")
.docs_url("https://zed.dev/docs/languages/bash"),
Feature::LanguageC => FeatureUpsell::new("C support is built-in to Zed!")
.docs_url("https://zed.dev/docs/languages/c"),
Feature::LanguageCpp => FeatureUpsell::new("C++ support is built-in to Zed!")
.docs_url("https://zed.dev/docs/languages/cpp"),
Feature::LanguageGo => FeatureUpsell::new("Go support is built-in to Zed!")
.docs_url("https://zed.dev/docs/languages/go"),
Feature::LanguagePython => FeatureUpsell::new("Python support is built-in to Zed!")
.docs_url("https://zed.dev/docs/languages/python"),
Feature::LanguageReact => FeatureUpsell::new("React support is built-in to Zed!")
.docs_url("https://zed.dev/docs/languages/typescript"),
Feature::LanguageRust => FeatureUpsell::new("Rust support is built-in to Zed!")
.docs_url("https://zed.dev/docs/languages/rust"),
Feature::LanguageTypescript => {
FeatureUpsell::new("Typescript support is built-in to Zed!")
.docs_url("https://zed.dev/docs/languages/typescript")
fn render_feature_upsell_banner(
&self,
label: SharedString,
docs_url: SharedString,
vim: bool,
cx: &mut Context<Self>,
) -> impl IntoElement {
let docs_url_button = Button::new("open_docs", "View Documentation")
.icon(IconName::ArrowUpRight)
.icon_size(IconSize::Small)
.icon_position(IconPosition::End)
.on_click({
move |_event, _window, cx| {
telemetry::event!(
"Documentation Viewed",
source = "Feature Upsell",
url = docs_url,
);
cx.open_url(&docs_url)
}
};
});
upsell.when(ix < upsells_count, |upsell| upsell.border_b_1())
}))
div()
.pt_4()
.px_4()
.child(
Banner::new()
.severity(Severity::Success)
.child(Label::new(label).mt_0p5())
.map(|this| {
if vim {
this.action_slot(
h_flex()
.gap_1()
.child(docs_url_button)
.child(Divider::vertical().color(ui::DividerColor::Border))
.child(
h_flex()
.pl_1()
.gap_1()
.child(Label::new("Enable Vim mode"))
.child(
Switch::new(
"enable-vim",
if VimModeSetting::get_global(cx).0 {
ui::ToggleState::Selected
} else {
ui::ToggleState::Unselected
},
)
.on_click(cx.listener(
move |this, selection, _, cx| {
telemetry::event!(
"Vim Mode Toggled",
source = "Feature Upsell"
);
this.update_settings(
selection,
cx,
|setting, value| {
setting.vim_mode = Some(value)
},
);
},
))
.color(ui::SwitchColor::Accent),
),
),
)
} else {
this.action_slot(docs_url_button)
}
}),
)
.into_any_element()
}
fn render_feature_upsells(&self, cx: &mut Context<Self>) -> impl IntoElement {
let mut container = v_flex();
for feature in &self.upsells {
let banner = match feature {
Feature::ExtensionRuff => self.render_feature_upsell_banner(
"Ruff (linter for Python) support is built-in to Zed!".into(),
"https://zed.dev/docs/languages/python#code-formatting--linting".into(),
false,
cx,
),
Feature::ExtensionTailwind => self.render_feature_upsell_banner(
"Tailwind CSS support is built-in to Zed!".into(),
"https://zed.dev/docs/languages/tailwindcss".into(),
false,
cx,
),
Feature::Git => self.render_feature_upsell_banner(
"Zed comes with basic Git support—more features are coming in the future."
.into(),
"https://zed.dev/docs/git".into(),
false,
cx,
),
Feature::LanguageBash => self.render_feature_upsell_banner(
"Shell support is built-in to Zed!".into(),
"https://zed.dev/docs/languages/bash".into(),
false,
cx,
),
Feature::LanguageC => self.render_feature_upsell_banner(
"C support is built-in to Zed!".into(),
"https://zed.dev/docs/languages/c".into(),
false,
cx,
),
Feature::LanguageCpp => self.render_feature_upsell_banner(
"C++ support is built-in to Zed!".into(),
"https://zed.dev/docs/languages/cpp".into(),
false,
cx,
),
Feature::LanguageGo => self.render_feature_upsell_banner(
"Go support is built-in to Zed!".into(),
"https://zed.dev/docs/languages/go".into(),
false,
cx,
),
Feature::LanguagePython => self.render_feature_upsell_banner(
"Python support is built-in to Zed!".into(),
"https://zed.dev/docs/languages/python".into(),
false,
cx,
),
Feature::LanguageReact => self.render_feature_upsell_banner(
"React support is built-in to Zed!".into(),
"https://zed.dev/docs/languages/typescript".into(),
false,
cx,
),
Feature::LanguageRust => self.render_feature_upsell_banner(
"Rust support is built-in to Zed!".into(),
"https://zed.dev/docs/languages/rust".into(),
false,
cx,
),
Feature::LanguageTypescript => self.render_feature_upsell_banner(
"Typescript support is built-in to Zed!".into(),
"https://zed.dev/docs/languages/typescript".into(),
false,
cx,
),
Feature::OpenIn => self.render_feature_upsell_banner(
"Zed supports linking to a source line on GitHub and others.".into(),
"https://zed.dev/docs/git#git-integrations".into(),
false,
cx,
),
Feature::Vim => self.render_feature_upsell_banner(
"Vim support is built-in to Zed!".into(),
"https://zed.dev/docs/vim".into(),
true,
cx,
),
};
container = container.child(banner);
}
container
}
}

View File

@@ -1279,18 +1279,17 @@ impl GitRepository for RealGitRepository {
.remote_url("upstream")
.or_else(|| self.remote_url("origin"));
self.executor
.spawn(async move {
crate::blame::Blame::for_path(
&git_binary_path,
&working_directory?,
&path,
&content,
remote_url,
)
.await
})
.boxed()
async move {
crate::blame::Blame::for_path(
&git_binary_path,
&working_directory?,
&path,
&content,
remote_url,
)
.await
}
.boxed()
}
fn diff(&self, diff: DiffType) -> BoxFuture<'_, Result<String>> {

View File

@@ -3,9 +3,7 @@ use std::any::Any;
use ::settings::Settings;
use command_palette_hooks::CommandPaletteFilter;
use commit_modal::CommitModal;
use editor::{
Editor, MultiBuffer, actions::DiffClipboardWithSelectionData, display_map::FilterMode,
};
use editor::{Editor, actions::DiffClipboardWithSelectionData};
use ui::{
Headline, HeadlineSize, Icon, IconName, IconSize, IntoElement, ParentElement, Render, Styled,
StyledExt, div, h_flex, rems, v_flex,
@@ -27,10 +25,7 @@ use onboarding::GitOnboardingModal;
use project::git_store::Repository;
use project_diff::ProjectDiff;
use ui::prelude::*;
use util::rel_path::RelPath;
use workspace::{
ModalView, OpenOptions, SplitDirection, Workspace, notifications::DetachAndPromptErr,
};
use workspace::{ModalView, Workspace, notifications::DetachAndPromptErr};
use zed_actions;
use crate::{git_panel::GitPanel, text_diff_view::TextDiffView};
@@ -56,8 +51,7 @@ actions!(
git,
[
/// Resets the git onboarding state to show the tutorial again.
ResetOnboarding,
OpenSplitDiff
ResetOnboarding
]
);
@@ -227,69 +221,6 @@ pub fn init(cx: &mut App) {
};
},
);
workspace.register_action(|workspace, _: &OpenSplitDiff, window, cx| {
open_split_diff(workspace, window, cx);
});
})
.detach();
}
fn open_split_diff(
workspace: &mut Workspace,
window: &mut Window,
cx: &mut Context<'_, Workspace>,
) {
let buffer = workspace.project().update(cx, |project, cx| {
let worktree = project.worktrees(cx).next().unwrap();
project.open_buffer(
(
worktree.read(cx).id(),
RelPath::unix("scripts/spellcheck.sh").unwrap(),
),
cx,
)
});
cx.spawn_in(window, async move |workspace, cx| {
let buffer = buffer.await?;
workspace.update_in(cx, |workspace, window, cx| {
let multibuffer = cx.new(|cx| {
let mut multibuffer = MultiBuffer::singleton(buffer, cx);
multibuffer.set_all_diff_hunks_expanded(cx);
multibuffer
});
// let deletions_only_editor = cx.new(|cx| {
// Editor::filtered(
// multibuffer.clone(),
// workspace.project().clone(),
// FilterMode::RemoveInsertions,
// window,
// cx,
// )
// });
let insertions_only_editor = cx.new(|cx| {
Editor::filtered(
multibuffer,
workspace.project().clone(),
FilterMode::RemoveDeletions,
window,
cx,
)
});
// workspace.add_item_to_active_pane(
// Box::new(deletions_only_editor),
// None,
// true,
// window,
// cx,
// );
workspace.split_item(
SplitDirection::horizontal(cx),
Box::new(insertions_only_editor),
window,
cx,
);
})?;
anyhow::Ok(())
})
.detach();
}

View File

@@ -1,4 +1,4 @@
use editor::{Editor, MultiBufferSnapshot};
use editor::{Editor, EditorEvent, MultiBufferSnapshot};
use gpui::{App, Entity, FocusHandle, Focusable, Styled, Subscription, Task, WeakEntity};
use settings::Settings;
use std::{fmt::Write, num::NonZeroU32, time::Duration};
@@ -81,7 +81,7 @@ impl CursorPosition {
fn update_position(
&mut self,
editor: Entity<Editor>,
editor: &Entity<Editor>,
debounce: Option<Duration>,
window: &mut Window,
cx: &mut Context<Self>,
@@ -269,19 +269,21 @@ impl StatusItemView for CursorPosition {
cx: &mut Context<Self>,
) {
if let Some(editor) = active_pane_item.and_then(|item| item.act_as::<Editor>(cx)) {
self._observe_active_editor =
Some(
cx.observe_in(&editor, window, |cursor_position, editor, window, cx| {
Self::update_position(
cursor_position,
editor,
Some(UPDATE_DEBOUNCE),
window,
cx,
)
}),
);
self.update_position(editor, None, window, cx);
self._observe_active_editor = Some(cx.subscribe_in(
&editor,
window,
|cursor_position, editor, event, window, cx| match event {
EditorEvent::SelectionsChanged { .. } => Self::update_position(
cursor_position,
editor,
Some(UPDATE_DEBOUNCE),
window,
cx,
),
_ => {}
},
));
self.update_position(&editor, None, window, cx);
} else {
self.position = None;
self._observe_active_editor = None;

View File

@@ -16,7 +16,7 @@ use text::{Bias, Point};
use theme::ActiveTheme;
use ui::prelude::*;
use util::paths::FILE_ROW_COLUMN_DELIMITER;
use workspace::ModalView;
use workspace::{DismissDecision, ModalView};
pub fn init(cx: &mut App) {
LineIndicatorFormat::register(cx);
@@ -31,7 +31,16 @@ pub struct GoToLine {
_subscriptions: Vec<Subscription>,
}
impl ModalView for GoToLine {}
impl ModalView for GoToLine {
fn on_before_dismiss(
&mut self,
_window: &mut Window,
_cx: &mut Context<Self>,
) -> DismissDecision {
self.prev_scroll_position.take();
DismissDecision::Dismiss(true)
}
}
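The `on_before_dismiss` override works because `Option::take` both returns and clears the stored position in place: after it runs, any later code path that tries to restore the scroll finds `None` and leaves the editor where it is. A toy version of the pattern, with plain fields rather than the real `GoToLine` state:

```rust
/// One-shot "restore on cancel" state, mirroring prev_scroll_position.
struct Modal {
    prev_scroll_position: Option<f32>,
}

impl Modal {
    // Runs before any dismissal path (outside click, Escape, Enter).
    fn on_before_dismiss(&mut self) {
        // take() replaces the stored value with None in place.
        self.prev_scroll_position.take();
    }

    // The cancel path restores only if something is still stored.
    fn position_to_restore(&mut self) -> Option<f32> {
        self.prev_scroll_position.take()
    }
}

fn main() {
    let mut modal = Modal { prev_scroll_position: Some(100.0) };
    modal.on_before_dismiss();
    // Dismissal already consumed the saved position, so nothing is restored.
    assert_eq!(modal.position_to_restore(), None);
}
```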
impl Focusable for GoToLine {
fn focus_handle(&self, cx: &App) -> FocusHandle {
@@ -769,4 +778,171 @@ mod tests {
state
})
}
#[gpui::test]
async fn test_scroll_position_on_outside_click(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let file_content = (0..100)
.map(|i| format!("struct Line{};", i))
.collect::<Vec<_>>()
.join("\n");
fs.insert_tree(path!("/dir"), json!({"a.rs": file_content}))
.await;
let project = Project::test(fs, [path!("/dir").as_ref()], cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let worktree_id = workspace.update(cx, |workspace, cx| {
workspace.project().update(cx, |project, cx| {
project.worktrees(cx).next().unwrap().read(cx).id()
})
});
let _buffer = project
.update(cx, |project, cx| {
project.open_local_buffer(path!("/dir/a.rs"), cx)
})
.await
.unwrap();
let editor = workspace
.update_in(cx, |workspace, window, cx| {
workspace.open_path((worktree_id, rel_path("a.rs")), None, true, window, cx)
})
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
let go_to_line_view = open_go_to_line_view(&workspace, cx);
let scroll_position_before_input =
editor.update(cx, |editor, cx| editor.scroll_position(cx));
cx.simulate_input("47");
let scroll_position_after_input =
editor.update(cx, |editor, cx| editor.scroll_position(cx));
assert_ne!(scroll_position_before_input, scroll_position_after_input);
drop(go_to_line_view);
workspace.update_in(cx, |workspace, window, cx| {
workspace.hide_modal(window, cx);
});
cx.run_until_parked();
let scroll_position_after_auto_dismiss =
editor.update(cx, |editor, cx| editor.scroll_position(cx));
assert_eq!(
scroll_position_after_auto_dismiss, scroll_position_after_input,
"Dismissing via outside click should maintain new scroll position"
);
}
#[gpui::test]
async fn test_scroll_position_on_cancel(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let file_content = (0..100)
.map(|i| format!("struct Line{};", i))
.collect::<Vec<_>>()
.join("\n");
fs.insert_tree(path!("/dir"), json!({"a.rs": file_content}))
.await;
let project = Project::test(fs, [path!("/dir").as_ref()], cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let worktree_id = workspace.update(cx, |workspace, cx| {
workspace.project().update(cx, |project, cx| {
project.worktrees(cx).next().unwrap().read(cx).id()
})
});
let _buffer = project
.update(cx, |project, cx| {
project.open_local_buffer(path!("/dir/a.rs"), cx)
})
.await
.unwrap();
let editor = workspace
.update_in(cx, |workspace, window, cx| {
workspace.open_path((worktree_id, rel_path("a.rs")), None, true, window, cx)
})
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
let go_to_line_view = open_go_to_line_view(&workspace, cx);
let scroll_position_before_input =
editor.update(cx, |editor, cx| editor.scroll_position(cx));
cx.simulate_input("47");
let scroll_position_after_input =
editor.update(cx, |editor, cx| editor.scroll_position(cx));
assert_ne!(scroll_position_before_input, scroll_position_after_input);
cx.dispatch_action(menu::Cancel);
drop(go_to_line_view);
cx.run_until_parked();
let scroll_position_after_cancel =
editor.update(cx, |editor, cx| editor.scroll_position(cx));
assert_eq!(
scroll_position_after_cancel, scroll_position_after_input,
"Cancel should maintain new scroll position"
);
}
#[gpui::test]
async fn test_scroll_position_on_confirm(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let file_content = (0..100)
.map(|i| format!("struct Line{};", i))
.collect::<Vec<_>>()
.join("\n");
fs.insert_tree(path!("/dir"), json!({"a.rs": file_content}))
.await;
let project = Project::test(fs, [path!("/dir").as_ref()], cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let worktree_id = workspace.update(cx, |workspace, cx| {
workspace.project().update(cx, |project, cx| {
project.worktrees(cx).next().unwrap().read(cx).id()
})
});
let _buffer = project
.update(cx, |project, cx| {
project.open_local_buffer(path!("/dir/a.rs"), cx)
})
.await
.unwrap();
let editor = workspace
.update_in(cx, |workspace, window, cx| {
workspace.open_path((worktree_id, rel_path("a.rs")), None, true, window, cx)
})
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
let go_to_line_view = open_go_to_line_view(&workspace, cx);
let scroll_position_before_input =
editor.update(cx, |editor, cx| editor.scroll_position(cx));
cx.simulate_input("47");
let scroll_position_after_input =
editor.update(cx, |editor, cx| editor.scroll_position(cx));
assert_ne!(scroll_position_before_input, scroll_position_after_input);
cx.dispatch_action(menu::Confirm);
drop(go_to_line_view);
cx.run_until_parked();
let scroll_position_after_confirm =
editor.update(cx, |editor, cx| editor.scroll_position(cx));
assert_eq!(
scroll_position_after_confirm, scroll_position_after_input,
"Confirm should maintain new scroll position"
);
}
}

View File

@@ -39,6 +39,7 @@ macos-blade = [
"objc2-metal",
]
wayland = [
"bitflags",
"blade-graphics",
"blade-macros",
"blade-util",
@@ -52,6 +53,7 @@ wayland = [
"wayland-cursor",
"wayland-protocols",
"wayland-protocols-plasma",
"wayland-protocols-wlr",
"filedescriptor",
"xkbcommon",
"open",
@@ -85,7 +87,8 @@ doctest = false
[dependencies]
anyhow.workspace = true
async-task = "4.7"
backtrace = { version = "0.3", optional = true }
backtrace = { workspace = true, optional = true }
bitflags = { workspace = true, optional = true }
blade-graphics = { workspace = true, optional = true }
blade-macros = { workspace = true, optional = true }
blade-util = { workspace = true, optional = true }
@@ -202,6 +205,9 @@ wayland-protocols = { version = "0.31.2", features = [
wayland-protocols-plasma = { version = "0.2.0", features = [
"client",
], optional = true }
wayland-protocols-wlr = { version = "0.3.9", features = [
"client",
], optional = true }
# X11
as-raw-xcb-connection = { version = "1", optional = true }
@@ -234,7 +240,7 @@ windows-numerics = "0.2"
windows-registry = "0.5"
[dev-dependencies]
backtrace = "0.3"
backtrace.workspace = true
collections = { workspace = true, features = ["test-support"] }
env_logger.workspace = true
http_client = { workspace = true, features = ["test-support"] }

View File

@@ -21,6 +21,7 @@ impl Render for HolyGrailExample {
div()
.gap_1()
.id("Random")
.grid()
.bg(rgb(0x505050))
.size(px(500.0))
@@ -37,7 +38,9 @@ impl Render for HolyGrailExample {
)
.child(
block(gpui::red())
.col_span(1)
.id("Table of contents")
.row_span(1)
.container_3xs(|style| style.col_span(1).row_span_auto())
.h_56()
.child("Table of contents"),
)
@@ -49,8 +52,10 @@ impl Render for HolyGrailExample {
)
.child(
block(gpui::blue())
.col_span(1)
.row_span(3)
.id("Ads")
.col_span(3)
.row_span(1)
.container_3xs(|style| style.col_span(1).row_span(3))
.child("AD :(")
.text_color(gpui::white()),
)

View File

@@ -0,0 +1,87 @@
fn main() {
#[cfg(all(target_os = "linux", feature = "wayland"))]
example::main();
#[cfg(not(all(target_os = "linux", feature = "wayland")))]
panic!("This example requires the `wayland` feature and a Linux system.");
}
#[cfg(all(target_os = "linux", feature = "wayland"))]
mod example {
use std::time::{Duration, SystemTime, UNIX_EPOCH};
use gpui::{
App, Application, Bounds, Context, FontWeight, Size, Window, WindowBackgroundAppearance,
WindowBounds, WindowKind, WindowOptions, div, layer_shell::*, point, prelude::*, px, rems,
rgba, white,
};
struct LayerShellExample;
impl LayerShellExample {
fn new(cx: &mut Context<Self>) -> Self {
cx.spawn(async move |this, cx| {
loop {
let _ = this.update(cx, |_, cx| cx.notify());
cx.background_executor()
.timer(Duration::from_millis(500))
.await;
}
})
.detach();
LayerShellExample
}
}
impl Render for LayerShellExample {
fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
let now = SystemTime::now()
.duration_since(UNIX_EPOCH)
.unwrap()
.as_secs();
let hours = (now / 3600) % 24;
let minutes = (now / 60) % 60;
let seconds = now % 60;
div()
.size_full()
.flex()
.items_center()
.justify_center()
.text_size(rems(4.5))
.font_weight(FontWeight::EXTRA_BOLD)
.text_color(white())
.bg(rgba(0x0000044))
.rounded_xl()
.child(format!("{:02}:{:02}:{:02}", hours, minutes, seconds))
}
}
pub fn main() {
Application::new().run(|cx: &mut App| {
cx.open_window(
WindowOptions {
titlebar: None,
window_bounds: Some(WindowBounds::Windowed(Bounds {
origin: point(px(0.), px(0.)),
size: Size::new(px(500.), px(200.)),
})),
app_id: Some("gpui-layer-shell-example".to_string()),
window_background: WindowBackgroundAppearance::Transparent,
kind: WindowKind::LayerShell(LayerShellOptions {
namespace: "gpui".to_string(),
anchor: Anchor::LEFT | Anchor::RIGHT | Anchor::BOTTOM,
margin: Some((px(0.), px(0.), px(40.), px(0.))),
keyboard_interactivity: KeyboardInteractivity::None,
..Default::default()
}),
..Default::default()
},
|_, cx| cx.new(LayerShellExample::new),
)
.unwrap();
});
}
}
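The clock arithmetic in `render` above reduces epoch seconds to a UTC wall-clock time; pulled out as a free function (a stand-alone sketch, not part of the example) it is easy to sanity-check:

```rust
/// Reduce seconds since the Unix epoch to (hours, minutes, seconds) in UTC,
/// exactly as the layer-shell clock example does inline.
fn hms(now: u64) -> (u64, u64, u64) {
    ((now / 3600) % 24, (now / 60) % 60, now % 60)
}

fn main() {
    // 45_296 s into a day is 12:34:56.
    assert_eq!(hms(45_296), (12, 34, 56));
    assert_eq!(hms(86_399), (23, 59, 59)); // last second of the day
    assert_eq!(hms(86_400), (0, 0, 0)); // wraps at midnight
    let (h, m, s) = hms(45_296);
    println!("{:02}:{:02}:{:02}", h, m, s);
}
```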

View File

@@ -1,16 +1,32 @@
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0" xmlns:asmv3="urn:schemas-microsoft-com:asm.v3">
<asmv3:application>
<asmv3:windowsSettings>
<dpiAware xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">true</dpiAware>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
<trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
<security>
<requestedPrivileges>
<requestedExecutionLevel level="asInvoker" uiAccess="false" />
</requestedPrivileges>
</security>
</trustInfo>
<compatibility xmlns="urn:schemas-microsoft-com:compatibility.v1">
<application>
<!-- Windows 10 -->
<supportedOS Id="{8e0f7a12-bfb3-4fe8-b9a5-48fd50a15a9a}" />
</application>
</compatibility>
<application xmlns="urn:schemas-microsoft-com:asm.v3">
<windowsSettings>
<dpiAware xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">true/pm</dpiAware>
<dpiAwareness xmlns="http://schemas.microsoft.com/SMI/2016/WindowsSettings">PerMonitorV2</dpiAwareness>
</asmv3:windowsSettings>
</asmv3:application>
</windowsSettings>
</application>
<dependency>
<dependentAssembly>
<assemblyIdentity type='win32'
<assemblyIdentity
type='win32'
name='Microsoft.Windows.Common-Controls'
version='6.0.0.0' processorArchitecture='*'
publicKeyToken='6595b64144ccf1df' />
version='6.0.0.0'
processorArchitecture='*'
publicKeyToken='6595b64144ccf1df'
/>
</dependentAssembly>
</dependency>
</assembly>

View File

@@ -176,7 +176,7 @@ impl AsyncApp {
lock.open_window(options, build_root_view)
}
/// Schedule a future to be polled in the background.
/// Schedule a future to be polled in the foreground.
#[track_caller]
pub fn spawn<AsyncFn, R>(&self, f: AsyncFn) -> Task<R>
where

View File

@@ -676,6 +676,61 @@ pub trait InteractiveElement: Sized {
self
}
/// Apply the given style when container width is less than or equal to 256.0 Pixels
fn container_3xs(mut self, f: impl Fn(StyleRefinement) -> StyleRefinement + 'static) -> Self {
self.container_query_with(256.0, f)
}
/// Apply the given style when container width is less than or equal to 288.0 Pixels
fn container_2xs(mut self, f: impl Fn(StyleRefinement) -> StyleRefinement + 'static) -> Self {
self.container_query_with(288.0, f)
}
/// Apply the given style when container width is less than or equal to 320.0 Pixels
fn container_xs(mut self, f: impl Fn(StyleRefinement) -> StyleRefinement + 'static) -> Self {
self.container_query_with(320.0, f)
}
/// Apply the given style when container width is less than or equal to 384.0 Pixels
fn container_sm(mut self, f: impl Fn(StyleRefinement) -> StyleRefinement + 'static) -> Self {
self.container_query_with(384.0, f)
}
/// Apply the given style when container width is less than or equal to 448.0 Pixels
fn container_md(mut self, f: impl Fn(StyleRefinement) -> StyleRefinement + 'static) -> Self {
self.container_query_with(448.0, f)
}
/// Apply the given style when container width is less than or equal to 512.0 Pixels
fn container_lg(mut self, f: impl Fn(StyleRefinement) -> StyleRefinement + 'static) -> Self {
self.container_query_with(512.0, f)
}
/// Apply the given style when container width is less than or equal to 576.0 Pixels
fn container_xl(mut self, f: impl Fn(StyleRefinement) -> StyleRefinement + 'static) -> Self {
self.container_query_with(576.0, f)
}
/// Apply the given style when width is less than or equal to the given value
fn container_query_with(
mut self,
width: impl Into<Pixels>,
f: impl Fn(StyleRefinement) -> StyleRefinement + 'static,
) -> Self {
let width = width.into();
self.interactivity()
.container_queries
.push(Box::new(move |layout_width, style| {
if layout_width >= width {
f(style)
} else {
style
}
}));
self
}
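Taken out of gpui, the breakpoint mechanism above is just a list of boxed closures folded over a style refinement. A minimal sketch with plain `f32` widths and a toy refinement type, both stand-ins for gpui's `Pixels` and `StyleRefinement`:

```rust
#[derive(Default, Debug, PartialEq)]
struct StyleRefinement {
    col_span: Option<u16>,
    row_span: Option<u16>,
}

// Each query mirrors the closure pushed by container_query_with:
// refine the style only once the layout width reaches the breakpoint.
type ContainerQuery = Box<dyn Fn(f32, StyleRefinement) -> StyleRefinement>;

#[derive(Default)]
struct Interactivity {
    container_queries: Vec<ContainerQuery>,
}

impl Interactivity {
    fn container_query_with(
        &mut self,
        width: f32,
        f: impl Fn(StyleRefinement) -> StyleRefinement + 'static,
    ) {
        self.container_queries.push(Box::new(move |layout_width, style| {
            if layout_width >= width { f(style) } else { style }
        }));
    }

    // Stand-in for the request_layout step: fold every query over an empty
    // refinement, using the width cached from the previous frame (if any).
    fn resolve(&self, cached_width: Option<f32>) -> StyleRefinement {
        let mut refinement = StyleRefinement::default();
        if let Some(width) = cached_width {
            for query in &self.container_queries {
                refinement = query(width, refinement);
            }
        }
        refinement
    }
}

fn main() {
    let mut interactivity = Interactivity::default();
    // Rough equivalent of .container_3xs(|style| style.col_span(1)).
    interactivity.container_query_with(256.0, |mut style| {
        style.col_span = Some(1);
        style
    });
    assert_eq!(interactivity.resolve(Some(300.0)).col_span, Some(1));
    assert_eq!(interactivity.resolve(Some(200.0)).col_span, None);
    // First frame: no cached bounds yet, so no refinement applies.
    assert_eq!(interactivity.resolve(None), StyleRefinement::default());
}
```

Note the one-frame lag this implies: in the diff, the width comes from `element_state.cached_bounds`, which is written during the previous paint pass, so container-query styles settle on the frame after a resize.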
/// Apply the given style to this element when the mouse hovers over a group member
fn group_hover(
mut self,
@@ -1532,6 +1587,7 @@ pub struct Interactivity {
pub(crate) click_listeners: Vec<ClickListener>,
pub(crate) drag_listener: Option<(Arc<dyn Any>, DragListener)>,
pub(crate) hover_listener: Option<Box<dyn Fn(&bool, &mut Window, &mut App)>>,
pub(crate) container_queries: Vec<Box<dyn Fn(Pixels, StyleRefinement) -> StyleRefinement>>,
pub(crate) tooltip_builder: Option<TooltipBuilder>,
pub(crate) window_control: Option<WindowControlArea>,
pub(crate) hitbox_behavior: HitboxBehavior,
@@ -1625,7 +1681,22 @@ impl Interactivity {
);
}
let style = self.compute_style_internal(None, element_state.as_mut(), window, cx);
let mut style =
self.compute_style_internal(None, element_state.as_mut(), window, cx);
let mut style_refinement = StyleRefinement::default();
if let Some(element_state) = element_state.as_ref()
&& let Some(bounds) = element_state.cached_bounds
{
let current_width = bounds.size.width;
for query in self.container_queries.iter() {
style_refinement = query(current_width, style_refinement);
}
style.refine(&style_refinement);
}
let layout_id = f(style, window, cx);
(layout_id, element_state)
},
@@ -1807,6 +1878,10 @@ impl Interactivity {
let mut element_state =
element_state.map(|element_state| element_state.unwrap_or_default());
if let Some(element_state) = element_state.as_mut() {
element_state.cached_bounds = Some(bounds);
}
let style = self.compute_style_internal(hitbox, element_state.as_mut(), window, cx);
#[cfg(any(feature = "test-support", test))]
@@ -2592,6 +2667,7 @@ pub struct InteractiveElementState {
pub(crate) pending_mouse_down: Option<Rc<RefCell<Option<MouseDownEvent>>>>,
pub(crate) scroll_offset: Option<Rc<RefCell<Point<Pixels>>>>,
pub(crate) active_tooltip: Option<Rc<RefCell<Option<ActiveTooltip>>>>,
pub(crate) cached_bounds: Option<Bounds<Pixels>>,
}
/// Whether or not the element or a group that contains it is clicked by the mouse.

Some files were not shown because too many files have changed in this diff Show More