Compare commits


97 Commits

Author SHA1 Message Date
Richard Feldman
70d3749f25 Add regression tests for rules behavior 2025-11-12 11:11:11 -05:00
Richard Feldman
8492b15d72 Fix .rules behavior 2025-11-12 10:48:06 -05:00
Richard Feldman
83075ce1d7 Reproduce rules update bug 2025-11-12 10:44:04 -05:00
Smit Barmase
1d75a9c4b2 Reverts "add OpenExcerptsSplit and dispatches on click" (#42538)
Partially reverts https://github.com/zed-industries/zed/pull/42283 to
restore the old behavior of excerpt clicking.

Release Notes:

- N/A
2025-11-12 20:47:29 +05:30
Richard Feldman
c5ab1d4679 Stop thread on Restore Checkpoint (#42537)
Closes #35142

In addition to cleaning up the terminals, also stops the conversation.

Release Notes:

- Restoring a checkpoint now stops the agent conversation.
2025-11-12 15:13:40 +00:00
Smit Barmase
1fdd95a9b3 Revert "editor: Improve multi-buffer header filename click to jump to the latest selection from that buffer" (#42534)
Reverts zed-industries/zed#42480

This panics on Nightly in cases where the anchor might not be valid for
that snapshot. Taking it back before the cutoff.

Release Notes:

- N/A
2025-11-12 20:31:43 +05:30
localcc
49634f6041 Miniprofiler (#42385)
Release Notes:

- Added hang detection and a built-in performance profiler
2025-11-12 15:31:20 +01:00
Jakub Konka
2119ac42d7 git_panel: Fix partially staged changes not showing up (#42530)
Release Notes:

- N/A
2025-11-12 15:13:29 +01:00
Hans
e833d1af8d vim: Fix change surround adding unwanted spaces with quotes (#42431)
Update `Vim.change_surround` to ensure there are no overlapping edits by
keeping track of where the open string range ends and ensuring that the
closing string range's start never goes below the open string range's
end.
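
The clamping idea can be sketched like this (a minimal illustration with hypothetical names, not `Vim.change_surround`'s actual signature):

```rust
use std::ops::Range;

/// Given the edit ranges for the opening and closing delimiters, clamp the
/// closing range's start so the two edits can never overlap.
fn clamp_close_range(open: &Range<usize>, close: Range<usize>) -> Range<usize> {
    let start = close.start.max(open.end);
    let end = close.end.max(start);
    start..end
}

fn main() {
    // Non-overlapping ranges pass through unchanged.
    assert_eq!(clamp_close_range(&(0..1), 5..6), 5..6);
    // An overlapping close range is pushed past the end of the open range.
    assert_eq!(clamp_close_range(&(0..3), 2..4), 3..4);
}
```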

Closes #42316 

Release Notes:

- Fixed vim's change surround (`cs`) inserting spaces with quotes by
preventing overlapping edits

---------

Co-authored-by: dino <dinojoaocosta@gmail.com>
2025-11-12 13:04:24 +00:00
Ben Kunkle
7be76c74d6 Use set -x in script/clear-target-dir-if-larger-than (#42525)
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-12 12:52:19 +00:00
Piotr Osiewicz
c2980cba18 remote_server: Bump fork to 0.4.0 (#42520)
Release Notes:

- N/A
2025-11-12 11:57:53 +00:00
Lena
a0be53a190 Wake up stalebot with an updated config (#42516)
- switch the bot from looking at the `bug`/`crash` labels, which we don't
  use anymore, to the Bug/Crash issue types, which we do use
- shorten the period of time after which a bug is suspected to be stale
  (at our pace, they can indeed be outdated in 60 days)
- extend the grace period for someone to come around and say "nope, this
  problem still exists" (people might be away for a couple of weeks).


Release Notes:

- N/A
2025-11-12 12:40:26 +01:00
Lena
70feff3c7a Add a one-off cleanup script for GH issue types (#42515)
Mainly for historical purposes and in case we want to do something similar enough in the future.

Release Notes:

- N/A
2025-11-12 11:40:31 +01:00
Finn Evers
f46990bac8 extensions_ui: Add XML extension suggestion for XML files (#42514)
Closes #41798

Release Notes:

- N/A
2025-11-12 10:12:02 +00:00
Lukas Wirth
78f466559a vim: Fix empty selections panic in insert_at_previous (#42504)
Fixes ZED-15C

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-12 09:54:22 +00:00
CnsMaple
4f158c1983 docs: Update basedpyright settings examples (#42497)
The
[example](https://docs.basedpyright.com/latest/configuration/language-server-settings/#zed)
on the official website of basedpyright is correct.

Release Notes:

- Update basedpyright settings examples
2025-11-12 10:05:17 +01:00
Kirill Bulatov
ddf762e368 Revert "gpui: Unify the index_for_x methods (#42162)" (#42505)
This reverts commit 082b80ec89.

This broke clicking, e.g. in snippets like

```rs
let x = vec![
    1, 2, //
    3,
];
```

clicking between `2` and `,` is quite off now.

Release Notes:

- N/A
2025-11-12 08:24:06 +00:00
Lukas Wirth
f2cadad49a gpui: Fix RefCell already borrowed in WindowsPlatform::run (#42506)
Relands #42440 

Fixes ZED-1VX

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-12 08:19:32 +00:00
Lukas Wirth
231d1b1d58 diagnostics: Close diagnosticsless buffers on refresh (#42503)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-12 08:11:50 +00:00
Andrew Farkas
2bcfc12951 Absolutize LSP and DAP paths more conservatively (#42482)
Fixes a regression caused by #42135 where LSP and DAP binaries weren't
being resolved via the `PATH` env var.

Now we absolutize the path if (path is relative AND (path has multiple
components OR path exists in worktree)).

- Relative paths with multiple components might not exist in the
worktree because they are ignored. Paths with a single component will at
least have an entry saying that they exist and are ignored.
- Relative paths with multiple components will never use the `PATH` env
var, so they can be safely absolutized
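
The predicate above can be sketched as follows (illustrative names; the real check lives in Zed's LSP/DAP path-resolution code):

```rust
use std::path::Path;

/// Absolutize only when the path is relative AND (it has multiple
/// components OR it exists in the worktree), per the rule described above.
fn should_absolutize(path: &Path, exists_in_worktree: bool) -> bool {
    let multiple_components = path.components().count() > 1;
    path.is_relative() && (multiple_components || exists_in_worktree)
}

fn main() {
    // A bare binary name is left alone so `PATH` lookup works...
    assert!(!should_absolutize(Path::new("rust-analyzer"), false));
    // ...unless the worktree actually contains an entry with that name.
    assert!(should_absolutize(Path::new("rust-analyzer"), true));
    // Multi-component relative paths never use `PATH`, so always absolutize.
    assert!(should_absolutize(Path::new("bin/analyzer"), false));
    // Absolute paths are never touched.
    assert!(!should_absolutize(Path::new("/usr/bin/analyzer"), false));
}
```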

Release Notes:

- N/A
2025-11-12 01:36:22 +00:00
Richard Feldman
cf6ae01d07 Show recommended models under normal category too (#42489)
<img width="395" height="444" alt="Screenshot 2025-11-11 at 4 04 57 PM"
src="https://github.com/user-attachments/assets/8da68721-6e33-4d01-810d-4aa1e2f3402d"
/>

Discussed with @danilo-leal and we're going with the "it's checked in
both places" design!

Closes #40910

Release Notes:

- Recommended AI models now also appear in their normal category, in
addition to "Recommended:"
2025-11-11 22:10:46 +00:00
Miguel Cárdenas
2ad7ecbcf0 project_panel: Add auto_open settings (#40435)
- Based on #40234, and an improvement of #40331

Release Notes:

- Added granular settings to control when files auto-open in the project
panel (project_panel.auto_open.on_create, on_paste, on_drop)

<img width="662" height="367" alt="Screenshot_2025-10-16_17-28-31"
src="https://github.com/user-attachments/assets/930a0a50-fc89-4c5d-8d05-b1fa2279de8b"
/>

---------

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
2025-11-12 03:23:40 +05:30
Lukas Wirth
854c6873c7 Revert "gpui: Fix RefCell already borrowed in WindowsPlatform::run" (#42481)
Reverts zed-industries/zed#42440

There are invalid temporaries in here keeping the borrows alive for too
long.
2025-11-11 21:42:59 +00:00
Andrew Farkas
da94f898e6 Add support for multi-word snippet prefixes (#42398)
Supersedes #41126

Closes #39559, #35397, and #41426

Release Notes:

- Added support for multi-word snippet prefixes

---------

Co-authored-by: Agus Zubiaga <hi@aguz.me>
Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
Co-authored-by: Cole Miller <cole@zed.dev>
2025-11-11 16:34:25 -05:00
Richard Feldman
f62bfe1dfa Use enterprise_uri for settings when provided (#42485)
Closes #34945

Release Notes:

- Fixed `enterprise_uri` not being used for GitHub settings URL when
provided
2025-11-11 21:31:42 +00:00
Richard Feldman
a56693d9e8 Fix panic when opening an invalid URL (#42483)
Now instead of a panic we see this:

<img width="511" height="132" alt="Screenshot 2025-11-11 at 3 47 25 PM"
src="https://github.com/user-attachments/assets/48ba2f41-c5c0-4030-9331-0d3acfbf9461"
/>


Release Notes:

- Trying to open invalid URLs in a browser now shows an error instead of
panicking
2025-11-11 21:24:37 +00:00
Smit Barmase
b4b7a23c39 editor: Improve multi-buffer header filename click to jump to the latest selection from that buffer (#42480)
Closes https://github.com/zed-industries/zed/pull/42099

Regressed in https://github.com/zed-industries/zed/pull/42283

Release Notes:

- Clicking the multi-buffer header file name or the "Open file" button
now jumps to the most recent selection in that buffer, if one exists.
2025-11-12 02:04:37 +05:30
Richard Feldman
0d56ed7d91 Only send unit eval failures to Slack for cron job (#42479)
Release Notes:

- N/A
2025-11-11 20:19:34 +00:00
Lay Sheth
e01e0b83c4 Avoid panics in LSP store path handling (#42117)
Release Notes:

- Fixed incorrect journal path handling
2025-11-11 20:51:57 +02:00
Richard Feldman
908ef03502 Split out cron and non-cron unit evals (#42472)
Release Notes:

- N/A

---------

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-11-11 13:45:48 -05:00
feeiyu
5f4d0dbaab Fix circular reference issue around PopoverMenu (#42461)
Follow up to https://github.com/zed-industries/zed/pull/42351

Release Notes:

- N/A
2025-11-11 19:20:38 +02:00
brequet
c50f821613 docs: Fix typo in configuring-zed.md (#42454)
Fix a minor typo in the setting key: `auto_install_extension` should be
`auto_install_extensions`.

Release Notes:

- N/A
2025-11-11 17:58:18 +01:00
Marshall Bowers
7e491ac500 collab: Drop embeddings table (#42466)
This PR drops the `embeddings` table, as it is no longer used.

Release Notes:

- N/A
2025-11-11 11:44:04 -05:00
Richard Feldman
9e1e732db8 Use longer timeout on evals (#42465)
The GPT-5 ones in particular can take a long time!

Release Notes:

- N/A

---------

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-11-11 16:37:20 +00:00
Lukas Wirth
83351283e4 settings: Skip terminal env vars with substitutions in vscode import (#42464)
Closes https://github.com/zed-industries/zed/issues/40547

Release Notes:

- Fixed vscode import creating faulty terminal env vars in terminal
settings
2025-11-11 16:15:12 +00:00
Marshall Bowers
03acbb7de3 collab: Remove unused embeddings queries and model (#42463)
This PR removes the queries and database model for embeddings, as
they're no longer used.

Release Notes:

- N/A
2025-11-11 16:13:59 +00:00
Richard Feldman
0268b17096 Add more secrets to eval workflows (#42459)
Release Notes:

- N/A
2025-11-11 16:07:57 +00:00
Danilo Leal
993919d360 agent_ui: Add icon button to trigger the @-mention completions menu (#42449)
Closes https://github.com/zed-industries/zed/issues/37087

This PR adds an icon button to the footer of the message editor, enabling
you to trigger and interact with the @-mention completions menu with the
mouse. This is a first step towards making other types of context you
can add in Zed's agent panel more discoverable. Next, I want to improve
the discoverability of images and selections, given that you wouldn't
necessarily know they work in Zed without a clear way to see them. But I
think that for now, this is enough to close the issue above, which had
lots of productive comments and discussion!

<img width="500" height="540" alt="Screenshot 2025-11-11 at 10  46 3@2x"
src="https://github.com/user-attachments/assets/fd028442-6f77-4153-bea1-c0b815da4ac6"
/>

Release Notes:

- agent: Added an icon button in the agent panel that lets you trigger
the @-mention menu (for adding context) with the mouse as well.
2025-11-11 12:50:56 -03:00
Danilo Leal
8467a3dbd6 agent_ui: Allow to uninstall agent servers from the settings view (#42445)
This PR also adds items within the "Add Agent" menu to:
1. Add more agent servers from extensions, opening up the extensions
page with "Agent Servers" already filtered
2. Go to the agent server + ACP docs to learn more about them

I feel like having them there is a nice way to promote this knowledge
from within the product and have users learn more about them.

<img width="500" height="540" alt="Screenshot 2025-11-11 at 10  46 3@2x"
src="https://github.com/user-attachments/assets/9449df2e-1568-44d8-83ca-87cbb9eefdd2"
/>

Release Notes:

- agent: Enabled uninstalling agent servers from the agent panel's
settings view.
2025-11-11 12:47:08 -03:00
Bennet Bo Fenner
ee2e690657 agent_servers: Fix panic when setting default mode (#42452)
Closes ZED-35A

Release Notes:

- Fixed an issue where Zed would panic when trying to set the default
mode for ACP agents
2025-11-11 15:25:27 +00:00
tidely
28d019be2e ollama: Fix tool calling (#42275)
Closes #42303

Ollama added tool call identifiers
(https://github.com/ollama/ollama/pull/12956) in its latest version
[v0.12.10](https://github.com/ollama/ollama/releases/tag/v0.12.10). This
broke our json schema and made all tool calls fail.

This PR fixes the schema and uses the Ollama provided tool call
identifier when available. We remain backwards compatible and still use
our own identifier with older versions of Ollama. I added a `TODO` to
remove the `Option` around the new field when most users have updated
their installations to v0.12.10 or above.
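
The backward-compatible fallback described above can be sketched roughly like this (field and function names are illustrative, not the actual Ollama provider code):

```rust
/// Prefer the tool call identifier Ollama v0.12.10+ provides; fall back to
/// a locally generated one for older versions.
struct ToolCall {
    // TODO (as in the PR): drop the Option once most installs are >= v0.12.10.
    id: Option<String>,
}

fn tool_call_id(call: &ToolCall, next_local_id: impl FnOnce() -> String) -> String {
    call.id.clone().unwrap_or_else(next_local_id)
}

fn main() {
    let newer = ToolCall { id: Some("call_abc".into()) };
    let older = ToolCall { id: None };
    // New Ollama versions supply an identifier; we use it as-is.
    assert_eq!(tool_call_id(&newer, || "local-0".into()), "call_abc");
    // Older versions don't, so we generate our own.
    assert_eq!(tool_call_id(&older, || "local-0".into()), "local-0");
}
```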

Note to reviewer: The fix to this issue should likely get cherry-picked
into the next release, since Ollama becomes unusable as an agent without
it.

Release Notes:

- Fixed tool calling when using the latest version of Ollama
2025-11-11 16:10:47 +01:00
Lukas Wirth
a19d11184d remote: Add more context to error logging in wsl (#42450)
cc https://github.com/zed-industries/zed/issues/40892

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-11 15:09:56 +00:00
Lukas Wirth
38e2c7aa66 editor: Hide file blame on editor cancel (ESC) (#42436)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-11 13:56:04 +00:00
liuyanghejerry
10d5d78ded Improve error messages on extension loading (#42266)
This pull request improves the error message shown when extension loading
goes wrong.

Before:

```
2025-11-08T21:16:02+08:00 ERROR [extension_host::extension_host] failed to load arkts extension.toml

Caused by:
    No such file or directory (os error 2)
```

Now:

```
2025-11-08T22:57:00+08:00 ERROR [extension_host::extension_host] failed to load arkts extension.toml, "/Users/user_name_placeholder/Library/Application Support/Zed/extensions/installed/arkts/extension.toml"

Caused by:
    No such file or directory (os error 2)

```

Release Notes:

- N/A
2025-11-11 15:45:03 +02:00
Terra
dfd7e85d5d Replace deprecated json.schemastore.org with www.schemastore.org (#42336)
Release Notes:

- N/A

According to
[microsoft/vscode#254689](https://github.com/microsoft/vscode/issues/254689),
the json.schemastore.org domain has been deprecated and should now use
www.schemastore.org (or schemastore.org) instead.

This PR updates all occurrences of the old domain within the Zed
codebase,
including code, documentation, and configuration files.
2025-11-11 15:43:25 +02:00
Lukas Wirth
b8fcd3ea04 gpui: Fix RefCell already borrowed in WindowsPlatform::run (#42440)
Fixes ZED-1VX

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-11 13:43:06 +00:00
Libon
9be5e31aca Add clear recent files history command (#42176)
![2025-11-07
181619](https://github.com/user-attachments/assets/a9bef7a6-dc0b-4db2-85e5-2e1df7b21cfa)


Release Notes:

- Added "workspace: clear navigation history" command
2025-11-11 15:42:00 +02:00
Kirill Bulatov
58db38722b Find proper applicable chunks for visible ranges (#42422)
Release Notes:

- Fixed inlay hints not being queried for certain long-ranged jumps

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-11 13:38:28 +00:00
Agus Zubiaga
f2ad0d716f zeta cli: Print log paths when running predict (#42396)
Release Notes:

- N/A

Co-authored-by: Michael Sloan <mgsloan@gmail.com>
Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-11-11 09:56:20 -03:00
Lukas Wirth
777b46533f auto_update: Ignore dir removal errors on windows (#42435)
The auto-update helper already removes these when successful, so these
removals will always fail in the common case.

Additionally, this replaces a mutable const with a static, as otherwise
we'd rebuild the job list on every access.

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-11 12:55:19 +00:00
Miguel Raz Guzmán Macedo
b3dd51560b docs: Fix broken links in docs with lychee (#42404)
Lychee is a [Rust based](https://lychee.cli.rs) async parallel link
checker.

I ran it against the codebase to suss out stale links and fixed those
up.

There are currently 2 remaining cases that I don't know how to resolve:

1. https://flathub.org/apps/dev.zed.Zed - nginx is giving a 502 bad
gateway
2.
https://github.com/zed-industries/zed/actions/workflows/ci.yml/badge.svg
- I don't want to mess with the CI pipeline in this PR.

Once again, I'll punt to the Docs Czar to see if this gets incorporated
into CI later.

---

## Running `lychee` locally:

```
cargo binstall -y lychee
lychee .
```

---
Release Notes:

- N/A

Signed-off-by: mrg <miguelraz@ciencias.unam.mx>
2025-11-11 13:55:02 +01:00
dDostalker
25489c2b7a Fix adding a Python virtual environment, may duplicate the "open this dictionary" string when modifying content. (#41840)
Release Notes:

- Fixed an issue when adding a Python virtual environment that may cause
duplicate "open this dictionary" entries

- Trigger condition: type `C:\`, delete `\`, then repeatedly add `\`.

- Video

bug:

https://github.com/user-attachments/assets/f68008bb-9138-4451-a842-25b58574493b

fix:

https://github.com/user-attachments/assets/2913b8c2-adee-4275-af7e-e055fd78915f
2025-11-11 13:22:32 +01:00
Alexandre Anício
dc372e8a84 editor: Unfold buffers with selections on edit + Remove selections on buffer fold (#37953)
Closes #36376 

Problem:
Multi-cursor edits/selections in multi-buffer views were jumping to
incorrect locations after toggling buffer folds. When users created
multiple selections across different buffers in a multi-buffer view
(like project search results) and then folded one of the buffers,
subsequent text insertion would either:

1. Insert text at wrong locations (like at the top of the first unfolded
buffer)
2. Replace the entire content in some buffers instead of inserting at
the intended cursor positions
3. Create orphaned selections that caused corruption in the editing
experience

The issue seems to happen because when a buffer gets folded in a
multi-buffer view, the existing selections associated with that buffer
become invalid anchor points.

Solution:
1. Selection Cleanup on Buffer Folding
- Added `remove_selections_from_buffer()` method that filters out all
selections from a buffer when it gets folded
- This prevents invalid selections from corrupting subsequent editing
operations
- Includes edge case handling: if all selections are removed (all
buffers folded), it creates a default selection at the start of the
first buffer to prevent panics

2. Unfolding buffers before editing  
- Added `unfold_buffers_with_selections()` call in `handle_input()`
ensures buffers with active selections are automatically unfolded before
editing
- This helps in fixing an edge case (covered in the tests) where, if you
fold all buffers in a multi-buffer view, and try to insert text in a
selection, it gets unfolded before the edit happens. Without this, the
inserted text would override the entire buffer content.
- If we don't care about this edge case, we could remove this method. I
find it ok to add since we already trigger buffer unfolding after edits
with `Event::ExcerptsEdited`.
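
The selection cleanup described above can be sketched roughly like this (simplified, hypothetical types; not the editor's actual data model):

```rust
/// A selection tied to a buffer within a multi-buffer view.
#[derive(Debug, PartialEq, Clone)]
struct Selection {
    buffer_id: u32,
    offset: usize,
}

/// When a buffer is folded, drop its selections; if none remain, fall back
/// to a default selection at the start of the first buffer so later edits
/// always have a valid anchor.
fn on_buffer_folded(mut selections: Vec<Selection>, folded: u32, first_buffer: u32) -> Vec<Selection> {
    selections.retain(|s| s.buffer_id != folded);
    if selections.is_empty() {
        selections.push(Selection { buffer_id: first_buffer, offset: 0 });
    }
    selections
}

fn main() {
    let sels = vec![
        Selection { buffer_id: 1, offset: 10 },
        Selection { buffer_id: 2, offset: 5 },
    ];
    // Folding buffer 2 drops only its selections.
    let after = on_buffer_folded(sels, 2, 1);
    assert_eq!(after, vec![Selection { buffer_id: 1, offset: 10 }]);
    // Folding the last buffer with selections falls back to a default one.
    let after = on_buffer_folded(after, 1, 1);
    assert_eq!(after, vec![Selection { buffer_id: 1, offset: 0 }]);
}
```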

Release Notes:

- Fixed multi-cursor edits jumping to incorrect locations after toggling
buffer folds in multi-buffer views (e.g., project search)
- Multi-cursor selections now properly handle buffer folding/unfolding
operations
- Text insertion no longer occurs at the wrong positions when buffers
are folded during multi-cursor editing
- Eliminated content replacement bugs where entire buffer contents were
incorrectly overwritten
- Added safe fallback behavior when all buffers in a multi-buffer view
are folded

---------

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
2025-11-11 16:59:44 +05:30
Lukas Wirth
1c4bb60209 gpui: Fix invalid unwrap in windows window creation (#42426)
Fixes ZED-34M

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-11 10:55:19 +00:00
Dino
97100ce52f editor: Respect search case sensitivity when selecting occurrences (#42121)
Update how the editor's `select_*` methods work in order to respect the
`search.case_sensitive` setting, or to be overridden by the
`BufferSearchBar` search options.

- Update both the `SearchableItem` and `SearchableItemHandle` traits
  with a new `set_search_is_case_sensitive` method that allows callers
  to set the case sensitivity of the search
- Update the `BufferSearchBar` to leverage
  `SearchableItemHandle.set_search_is_case_sensitive` in order to sync
  its case sensitivity options with the searchable item
- Update the implementation of the `SearchableItem` trait for `Editor`
  so as to store the argument provided to the
  `set_search_is_case_sensitive` method
- Update the way search queries are built by `Editor` so as to rely on
  `SearchableItem.set_search_is_case_sensitive` argument, if not `None`,
  or default to the editor's `search.case_sensitive` settings
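
The precedence in the last bullet can be sketched as a tiny helper (hypothetical name; assumes `None` means "no override from the search bar"):

```rust
/// The search bar's case-sensitivity option, if set, wins over the
/// editor's `search.case_sensitive` setting.
fn effective_case_sensitivity(search_bar_override: Option<bool>, editor_setting: bool) -> bool {
    search_bar_override.unwrap_or(editor_setting)
}

fn main() {
    // No override: the editor setting applies.
    assert!(effective_case_sensitivity(None, true));
    assert!(!effective_case_sensitivity(None, false));
    // An explicit override from the search bar takes precedence.
    assert!(!effective_case_sensitivity(Some(false), true));
    assert!(effective_case_sensitivity(Some(true), false));
}
```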

Closes #41070 

Release Notes:

- Improved the "Select Next Occurrence", "Select Previous Occurrence",
and "Select All Occurrences" actions so that they respect the case
sensitivity search settings

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-11 10:26:40 +00:00
Dino
dcf56144b5 vim: Sort whole buffer when no range is specified (#42376)
- Introduce a `default_range` field to `VimCommand`, to be optionally
  used when no range is specified for the command
- Update `VimCommand.parse` to take into consideration the
  `default_range`
- Introduce `CommandRange::buffer` to obtain the `CommandRange` which
  corresponds to the whole buffer
- Update the `VimCommand` definitions for both `sort` and `sort i` to
  default to the whole buffer when no range is specified
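
The fallback can be sketched like this (illustrative types, not the actual `VimCommand` definitions):

```rust
/// A line range a command operates on.
#[derive(Debug, PartialEq, Clone)]
struct CommandRange {
    start: u32,
    end: u32,
}

/// An explicitly parsed range wins; otherwise use the command's
/// `default_range` (the whole buffer for `:sort`); otherwise the current line.
fn effective_range(
    parsed: Option<CommandRange>,
    default_range: Option<CommandRange>,
    current_line: u32,
) -> CommandRange {
    parsed.or(default_range).unwrap_or(CommandRange {
        start: current_line,
        end: current_line,
    })
}

fn main() {
    let whole_buffer = CommandRange { start: 0, end: 99 };
    // `:sort` with no range now targets the whole buffer...
    assert_eq!(effective_range(None, Some(whole_buffer.clone()), 42), whole_buffer);
    // ...while an explicit range still wins.
    assert_eq!(
        effective_range(Some(CommandRange { start: 3, end: 7 }), Some(whole_buffer), 42),
        CommandRange { start: 3, end: 7 }
    );
}
```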

Closes #41750 

Release Notes:

- Improved vim's `:sort` command to sort the whole buffer's content when
no range is specified
2025-11-11 10:04:30 +00:00
Lukas Wirth
46db753f79 diagnostics: Fix panic due non-sorted diagnostics excerpt ranges (#42416)
Fixes ZED-356

Release Notes:

- N/A *or* Added/Fixed/Improved ...

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
2025-11-11 09:57:11 +01:00
Lukas Wirth
1a807a7a6a terminal: Spawn terminal process on main thread on macos again (#42411)
Closes https://github.com/zed-industries/zed/issues/42365, follow up to
https://github.com/zed-industries/zed/pull/42234

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-11 07:57:30 +00:00
Alvaro Parker
f90d0789fb git: Add notification to git clone (#41712)
Adds a simple notification when cloning a repo using the integrated git
clone in Zed. Before this, the user had no feedback after starting the
cloning action.

Demo:


https://github.com/user-attachments/assets/72fcdf1b-fc99-4fe5-8db2-7c30b170f12f

Not sure about that icon I'm using for the animation, but that can be
easily changed.

Release Notes:

- Added a notification when cloning a repo from Zed
2025-11-11 01:02:13 -05:00
Conrad Irwin
9e717c7711 Use cloud for auto-update (#42246)
We've had several outages with a proximate cause of "Vercel is
complicated", and auto-update is considered a critical feature, so let's
not use Vercel for that.
Release Notes:

- Auto Updates (and remote server binaries) are now downloaded via
https://cloud.zed.dev instead of https://zed.dev. As before, these URLs
redirect to the GitHub release for actual downloads.
2025-11-10 23:00:55 -07:00
CnsMaple
823844ef18 vim: Fix increment order (#42256)
before:


https://github.com/user-attachments/assets/d490573c-4c2b-4645-a685-d683f06c611f


after:


https://github.com/user-attachments/assets/a69067a1-6e68-4f05-ba56-18eadb1c54df

Release Notes:

- Fix vim increment order
2025-11-10 21:48:27 -07:00
Conrad Irwin
70bcf93355 Add an event_source to events (#42125)
Release Notes:

- N/A
2025-11-10 21:32:09 -07:00
Conrad Irwin
378b30eba5 Use cloud.zed.dev for install.sh (#42399)
Similar to #42246, we'd like to avoid having Vercel on the critical
path.

https://zed.dev/install.sh is served from Cloudflare by intercepting a
route on that page, so this makes the shell-based install flow
Vercel-independent.

Release Notes:

- `./script/install.sh` will now fetch assets via
`https://cloud.zed.dev/` instead of `https://zed.dev`. As before, it
will redirect to GitHub releases to complete the download.
2025-11-10 23:55:19 +00:00
Marshall Bowers
83e7c21b2c collab: Remove unused user queries (#42400)
This PR removes queries on users that are no longer used.

Release Notes:

- N/A
2025-11-10 23:47:39 +00:00
Finn Evers
e488b6cd0b agent_ui: Fix issue where MCP extension could not be uninstalled (#42384)
Closes https://github.com/zed-industries/zed/issues/42312

The issue here was that we assumed context servers provided by
extensions would always need a config present in the settings, when
actually the opposite was the case: context servers provided by
extensions are the only context servers that do not need a config to be
in place in order to be available in the UI.

Release Notes:

- Fixed an issue where context servers provided by extensions could not
be uninstalled if they were previously unconfigured.
2025-11-11 00:25:27 +01:00
Kirill Bulatov
f52549c1c4 Small documentation fixes (#42397)
Release Notes:

- N/A

Co-authored-by: Ole Jørgen Brønner <olejorgenb@gmail.com>
2025-11-11 01:16:28 +02:00
Conrad Irwin
359521e91d Allow passing model_name to evals (#42395)
Release Notes:

- N/A
2025-11-10 23:00:52 +00:00
Max Brunsfeld
b607077c08 Add old_text/new_text as a zeta2 prompt format (#42171)
Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <agus@zed.dev>
Co-authored-by: Oleksiy Syvokon <oleksiy.syvokon@gmail.com>
Co-authored-by: Ben Kunkle <ben@zed.dev>
Co-authored-by: Michael Sloan <mgsloan@gmail.com>
2025-11-10 15:44:54 -07:00
Marshall Bowers
e5fce424b3 Update CI badge in README (#42394)
This PR updates the CI badge in the README, after the CI workflow
reorganization.

Release Notes:

- N/A
2025-11-10 22:05:46 +00:00
Andrew Farkas
a8b04369ae Refactor completions (#42122)
This is progress toward multi-word snippets (including snippets with
prefixes containing symbols)

Release Notes:

- Removed `trigger` argument in `ShowCompletions` command

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-10 17:00:59 -05:00
Marshall Bowers
11b38db3e3 collab: Drop channel_messages table and its dependents (#42392)
This PR drops the `channel_messages` table and its
dependents—`channel_message_mentions` and `observed_channel_messages`—as
they are no longer used.

Release Notes:

- N/A
2025-11-10 21:59:05 +00:00
John Tur
112b5c16b7 Add QuitMode policy to GPUI (#42391)
Applications can select a policy for when the app quits using the new
function `Application::with_quit_mode`:
- Only on explicit calls to `App::quit`
- When the last window is closed
- Platform default (former on macOS, latter everywhere else) 
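
A rough sketch of the policy (the enum and method names here follow the PR description; the exact GPUI API may differ):

```rust
/// When should the application quit?
#[derive(Clone, Copy, PartialEq, Debug)]
enum QuitMode {
    /// Only on explicit calls to `App::quit`.
    OnExplicitQuit,
    /// When the last window is closed.
    OnLastWindowClosed,
    /// Platform default: explicit-only on macOS, last-window elsewhere.
    PlatformDefault,
}

fn quits_on_last_window(mode: QuitMode, is_macos: bool) -> bool {
    match mode {
        QuitMode::OnExplicitQuit => false,
        QuitMode::OnLastWindowClosed => true,
        QuitMode::PlatformDefault => !is_macos,
    }
}

fn main() {
    // macOS keeps apps alive with no windows open by default.
    assert!(!quits_on_last_window(QuitMode::PlatformDefault, true));
    // Everywhere else, closing the last window quits by default.
    assert!(quits_on_last_window(QuitMode::PlatformDefault, false));
    // An explicit policy overrides the platform default.
    assert!(quits_on_last_window(QuitMode::OnLastWindowClosed, true));
}
```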

Release Notes:

- N/A
2025-11-10 16:45:43 -05:00
Marshall Bowers
32ec1037e1 collab: Remove unused models left over from chat (#42390)
This PR removes some database models that were left over from the chat
feature.

Release Notes:

- N/A
2025-11-10 21:39:44 +00:00
Connor Tsui
a44fc9a1de Rename ThemeMode to ThemeAppearanceMode (#42279)
There was a TODO in `crates/settings/src/settings_content/theme.rs` to
make this rename.

This PR is just splitting off this change from
https://github.com/zed-industries/zed/pull/40035 to make reviewing that
one a bit easier since that PR is a bit more involved than expected.

Release Notes:

- N/A

Signed-off-by: Connor Tsui <connor.tsui20@gmail.com>
2025-11-10 14:26:01 -07:00
Ole Jørgen Brønner
efcd7f7d10 Slightly improve completion in settings.json (for lsp.<language-server>.) (#42263)
Document "any-typed" (`serde_json::Value`) "lsp" keys to include them in
json-language-server completions.

The vscode-json-languageserver seems to skip generically typed keys when
offering completion.

For this schema

```
    "LspSettings": {
        "type": "object",
        "properties": {
            ...
            "initialization_options": true,
            ...
         }
     }
```

"initialization_options" is not offered in the completion.

The effect is easy to verify by triggering completion inside:

```
    "lsp": {
        "basedpyright": {
           COMPLETE HERE
```

<img width="797" height="215" alt="image"
src="https://github.com/user-attachments/assets/d1d1391c-d02c-4028-9888-8869f4d18b0f"
/>

By adding a documentation string the keys are offered even if they are
generically typed:

<img width="809" height="238" alt="image"
src="https://github.com/user-attachments/assets/9a072da9-961b-4e15-9aec-3d56933cbe67"
/>

---

Note: I did some cursory research into whether it's possible to make
vscode-json-languageserver change this behavior, without success. IMO,
not offering completions here is a bug (or at minimum should be
configurable).

---

Release Notes:

- N/A

---------

Co-authored-by: Kirill Bulatov <mail4score@gmail.com>
2025-11-10 23:08:40 +02:00
John Tur
aaf2f9d309 Ignore "Option as Meta" setting outside of macOS (#42367)
The "Option" key only exists on a Mac. On other operating systems, it is
always expected that the Alt key generates escaped characters.

Fixes https://github.com/zed-industries/zed/issues/40583

Release Notes:

- N/A
2025-11-10 15:11:18 -05:00
Finn Evers
62e3a49212 editor: Fix rare panic in wrap map (#39379)
Closes ZED-1SV
Closes ZED-TG
Closes ZED-22G
Closes ZED-22J

This seems to fix the reported error there, but ultimately, this might
benefit from a test to reproduce. Hence, marking as draft for now.

Release Notes:

- Fixed a rare panic whilst wrapping lines.
2025-11-10 20:08:48 +00:00
Finn Evers
87d0401e64 editor: Show relative line numbers for deleted rows (#42378)
Closes #42191

This PR adds support for relative line numbers in deleted hunks. Note
that this only applies in cases where there is a form of relative
numbering.

It also adds some tests for this functionality, as well as missing tests
for other cases in line layout that were previously untested.

Release Notes:

- Line numbers will now be shown in deleted git hunks if relative line
numbering is enabled
2025-11-10 21:00:50 +01:00
Danilo Leal
2c375e2e0a agent_ui: Ensure message editor placeholder text is accurate (#42375)
This PR creates a dedicated function for the agent panel message
editor's placeholder text so that we can wait for agent initialization
to capture whether they support slash commands or not. On the one (nice)
hand, this allows us to stop matching agents by name and makes this a
bit more generic. On the other (bad) hand, the "/ for commands" bit can
take a second to show up, because we can only know whether an agent
supports it after it is initialized.

This is particularly relevant now that we have agents coming from
extensions and for them, we would obviously not be able to match by
name.

Release Notes:

- agent: Fixed agent panel message editor's placeholder text by making
it more accurate as to whether agents support slash commands,
particularly those coming from extensions.
2025-11-10 16:50:52 -03:00
Conrad Irwin
c24f9e47b4 Try to download wasi-sdk ahead of time (#42377)
This hopefully resolves the lingering test failures on Linux,
but also adds some logging just in case this isn't the problem...

Release Notes:

- N/A

---------

Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-11-10 19:50:43 +00:00
Andrew Farkas
3fbfea491d Support relative paths in LSP & DAP binaries (#42135)
Closes #41214

Release Notes:

- Added support for relative paths in LSP and DAP binaries

---------

Co-authored-by: Cole Miller <cole@zed.dev>
Co-authored-by: Julia Ryan <juliaryan3.14@gmail.com>
2025-11-10 19:33:00 +00:00
Finn Evers
2b369d7532 rust: Explicitly capture lifetime identifier (#42372)
Closes #42030

This matches what VSCode does, and essentially what this capture already
did. However, the identifier capture was overridden by other captures,
hence the need to be explicit here.

| Before | After | 
| - | - |
| <img width="930" height="346" alt="Bildschirmfoto 2025-11-10 um 17 56
28"
src="https://github.com/user-attachments/assets/e938c863-0981-4368-ab0a-a01dd04cfb24"
/> | <img width="930" height="346" alt="Bildschirmfoto 2025-11-10 um 17
54 35"
src="https://github.com/user-attachments/assets/f3b74011-c75c-448a-819e-80e7e8684e92"
/> |


Release Notes:

- Improved lifetime highlighting in Rust using the `lifetime` capture.
2025-11-10 19:41:14 +01:00
Danilo Leal
ed61a79cc5 agent_ui: Fix history view losing focus when empty (#42374)
Closes https://github.com/zed-industries/zed/issues/42356

This PR fixes the history view losing focus by simply always displaying
the search editor. I don't think it's too weird to have it even when the
list is empty, and it also ends up matching how regular pickers work.

Release Notes:

- agent: Fixed a bug where navigating the agent panel with the keyboard
wouldn't work if you visited the history view and it was empty/had no
entries.
2025-11-10 15:29:55 -03:00
Tim Vermeulen
aa6270e658 editor: Add sticky scroll (#42242)
Closes #5344


https://github.com/user-attachments/assets/37ec58b0-7cf6-4eea-9b34-dccf03d3526b

Release Notes:

- Added a setting to stick scopes to the top of the editor

---------

Co-authored-by: KyleBarton <kjb@initialcapacity.io>
Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-10 11:24:30 -07:00
Dino
d896af2f15 git: Handle buffer file path changes (#41944)
Update `GitStore.on_buffer_store_event` so that, when a
`BufferStoreEvent::BufferChangedFilePath` event is received, we check
whether there's any diff state for the buffer and, if so, move it to the
new file path, provided the file exists in the repository.

Closes #40499

Release Notes:

- Fixed issue with git diff tracking when updating a buffer's file from
an untracked to a tracked file
2025-11-10 18:19:08 +00:00
Agus Zubiaga
c748b177c4 zeta2 cli: Cache at LLM request level (#42371)
We'll now cache LLM responses at the request level (by hash of
URL+contents) for both context and prediction. This way we don't need to
worry about mistakenly using the cache when we change the prompt or its
components.

Release Notes:

- N/A

---------

Co-authored-by: Oleksiy Syvokon <oleksiy.syvokon@gmail.com>
2025-11-10 14:23:52 -03:00
Tryanks
ddf5937899 gpui: Move 'app closing on last window closed' behavior to app-side (#41436)
This commit is a continuation of #36548. As per [mikayla-maki's
Comment](https://github.com/zed-industries/zed/pull/36548#issuecomment-3412140698),
I removed the process management behavior located in GPUI and
reimplemented it in Zed.

Release Notes:

- N/A

---------

Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
2025-11-10 17:19:35 +00:00
Piotr Osiewicz
6e1d86f311 fs: Handle io::ErrorKind::NotADirectory in fs::metadata (#42370)
New error variants were stabilized in 1.83, and this might've led to us
mis-handling not-a-directory errors.

Co-authored-by: Dino <dino@zed.dev>

Release Notes:

- N/A

Co-authored-by: Dino <dino@zed.dev>
2025-11-10 18:18:40 +01:00
Abul Hossain Khan
a3f04e8b36 agent_ui: Fix thread history item showing GMT time instead of local time on Windows (#42198)
Closes #42178
Now it's consistent with the DateAndTime path, which already does
timezone conversion.

- **Future Work**
Happy to tackle the TODO in `time_format.rs` about implementing native
Windows APIs for proper localized formatting (similar to macOS's
`CFDateFormatter`) as a follow-up.

Release Notes:

- agent: Fixed the thread history item timestamp, which was being shown
in GMT instead of in the user's local timezone on Windows.
2025-11-10 13:34:59 -03:00
Danilo Leal
3c81ee6ba6 agent_ui: Allow to configure a default model for profiles through modal (#42359)
Follow-up to https://github.com/zed-industries/zed/pull/39220

This PR allows configuring a default model for a given profile through
the profile management modal.

| Option In Picker | Model Selector |
|--------|--------|
| <img width="1172" height="538" alt="Screenshot 2025-11-10 at 12  24
2@2x"
src="https://github.com/user-attachments/assets/33dfb6f1-f8fd-42f9-b824-3dab807094da"
/> | <img width="1172" height="1120" alt="Screenshot 2025-11-10 at 12 
24@2x"
src="https://github.com/user-attachments/assets/50360b0a-fbb1-455e-9cf7-9fa987345038"
/> |

Release Notes:

- N/A
2025-11-10 13:12:13 -03:00
Miguel Raz Guzmán Macedo
35ae2f5b2b typo: Use tips from proselint (#42362)
I ran [proselint](https://github.com/amperser/proselint) (recommended by
cURL author [Daniel
Stenberg](https://daniel.haxx.se/blog/2022/09/22/taking-curl-documentation-quality-up-one-more-notch/))
against all the `.md` files in the codebase to see if I could fix some
easy typos.

The tool is noisier than I would like and picking up the overrides to
the default config in a `.proselintrc.json` was much harder than I
expected.

There are many other small nits [1] that I believe are best left to your
docs czar whenever they want to consider incorporating a tool like this
into big releases or CI, but these seemed like small wins for now to
open a conversation about a tool like proselint.

---

[1]: Such nits include
- inconsistent spacing (1 vs. 2 spaces)
- "color" vs "colour"
- ab/use of `very` 
- awkward or superfluous phrasing.

Release Notes:

- N/A

Signed-off-by: mrg <miguelraz@ciencias.unam.mx>
2025-11-10 17:51:44 +02:00
Agus Zubiaga
d420dd63ed zeta: Improve unified diff prompt (#42354)
Extracts some of the improvements to the unified diff prompt from
https://github.com/zed-industries/zed/pull/42171 and adds some others
about how context works, to improve the reliability of predictions.

We also now strip the `<|user_cursor|>` marker if it appears in the
output rather than failing.

Release Notes:

- N/A

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-11-10 14:58:42 +00:00
feeiyu
42ed032f12 Fix circular reference issue between EditPredictionButton and PopoverMenuHandle (#42351)

While working on issue #40906, I discovered that RemoteClient was not
being released after the remote project closed.
Analysis revealed a circular reference between EditPredictionButton and
PopoverMenuHandle.

Dependency Chain: RemoteClient → Project → ZetaEditPredictionProvider →
EditPredictionButton ↔ PopoverMenuHandle

<img width="400" height="300" alt="image"
src="https://github.com/user-attachments/assets/6b716c9b-6938-471a-b044-397314b729d4"
/>

a) EditPredictionButton holds a reference to PopoverMenuHandle

5f8226457e/crates/zed/src/zed.rs (L386-L394)

b) PopoverMenuHandle holds a reference to an Fn which captures
`Entity<EditPredictionButton>`

5fc54986c7/crates/edit_prediction_button/src/edit_prediction_button.rs (L382-L389)


a9bc890497/crates/ui/src/components/popover_menu.rs (L376-L384)


Release Notes:

- N/A
2025-11-10 16:52:03 +02:00
David
2d84af91bf agent: Add ability to set a default_model per profile (#39220)
Split off from https://github.com/zed-industries/zed/pull/39175

Requires https://github.com/zed-industries/zed/pull/39219 to be merged
first

Adds support for `default_model` for profiles: 

```
      "my-profile": {
        "name": "Coding Agent",
        "tools": {},
        "enable_all_context_servers": false,
        "context_servers": {},
        "default_model": {
          "provider": "copilot_chat",
          "model": "grok-code-fast-1"
        }
      }
```

Zed will then switch to the default model whenever the profile is
activated.

![2025-09-30 17 09
06](https://github.com/user-attachments/assets/43f07b7b-85d9-4aff-82ce-25d6f5050d50)


Release Notes:

- Added `default_model` configuration to agent profile

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-10 11:11:24 -03:00
Abdugani Toshmukhamedov
7aacc7566c Add support for closing window tabs with middle mouse click (#41628)
This change adds support for closing system window tabs by pressing the
middle mouse button. It improves tab management UX by matching common
tab behavior.

Release Notes:

- Added support for closing system window tabs with middle mouse click.
2025-11-10 15:09:37 +01:00
Caleb Van Dyke
8d632958db Add better labels for completions for ty lsp (#42233)
Verified that this works locally. I modeled it after how basedpyright
and pyright work. Here is a screenshot of what it looks like (issue has
screenshots of the old state):

<img width="593" height="258" alt="Screenshot 2025-11-07 at 2 40 50 PM"
src="https://github.com/user-attachments/assets/5d2371fc-360b-422f-ba59-0a95f2083c87"
/>

Closes #42232

Release Notes:

- python/ty: Code completion menu now shows packages that will be
imported when a given entry is accepted.
2025-11-10 13:28:12 +01:00
Lukas Wirth
0149de4b54 git: Fix panic in git2 due to empty repo paths (#42304)
Fixes ZED-1VR

Release Notes:

- Fixed sporadic panic in git features
2025-11-10 09:27:51 +00:00
220 changed files with 8499 additions and 2540 deletions

View File

@@ -1,4 +1,4 @@
# yaml-language-server: $schema=https://json.schemastore.org/github-issue-config.json
# yaml-language-server: $schema=https://www.schemastore.org/github-issue-config.json
blank_issues_enabled: false
contact_links:
- name: Feature Request

View File

@@ -15,13 +15,13 @@ jobs:
stale-issue-message: >
Hi there! 👋
We're working to clean up our issue tracker by closing older issues that might not be relevant anymore. If you are able to reproduce this issue in the latest version of Zed, please let us know by commenting on this issue, and we will keep it open. If you can't reproduce it, feel free to close the issue yourself. Otherwise, we'll close it in 7 days.
We're working to clean up our issue tracker by closing older bugs that might not be relevant anymore. If you are able to reproduce this issue in the latest version of Zed, please let us know by commenting on this issue, and it will be kept open. If you can't reproduce it, feel free to close the issue yourself. Otherwise, it will close automatically in 14 days.
Thanks for your help!
close-issue-message: "This issue was closed due to inactivity. If you're still experiencing this problem, please open a new issue with a link to this issue."
days-before-stale: 120
days-before-close: 7
any-of-issue-labels: "bug,panic / crash"
days-before-stale: 60
days-before-close: 14
only-issue-types: "Bug,Crash"
operations-per-run: 1000
ascending: true
enable-statistics: true

View File

@@ -35,6 +35,9 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: compare_perf::run_perf::install_hyperfine
run: cargo install hyperfine
shell: bash -euxo pipefail {0}

View File

@@ -57,16 +57,19 @@ jobs:
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
@@ -202,6 +205,9 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
@@ -242,6 +248,9 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}

View File

@@ -93,6 +93,9 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
@@ -140,6 +143,9 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}

View File

@@ -6,24 +6,21 @@ env:
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
GOOGLE_AI_API_KEY: ${{ secrets.GOOGLE_AI_API_KEY }}
GOOGLE_CLOUD_PROJECT: ${{ secrets.GOOGLE_CLOUD_PROJECT }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_EVAL_TELEMETRY: '1'
MODEL_NAME: ${{ inputs.model_name }}
on:
pull_request:
types:
- synchronize
- reopened
- labeled
branches:
- '**'
schedule:
- cron: 0 0 * * *
workflow_dispatch: {}
workflow_dispatch:
inputs:
model_name:
description: model_name
required: true
type: string
jobs:
agent_evals:
if: |
github.repository_owner == 'zed-industries' &&
(github.event_name != 'pull_request' || contains(github.event.pull_request.labels.*.name, 'run-eval'))
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
@@ -40,6 +37,9 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
@@ -49,14 +49,19 @@ jobs:
run: cargo build --package=eval
shell: bash -euxo pipefail {0}
- name: run_agent_evals::agent_evals::run_eval
run: cargo run --package=eval -- --repetitions=8 --concurrency=1
run: cargo run --package=eval -- --repetitions=8 --concurrency=1 --model "${MODEL_NAME}"
shell: bash -euxo pipefail {0}
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
GOOGLE_AI_API_KEY: ${{ secrets.GOOGLE_AI_API_KEY }}
GOOGLE_CLOUD_PROJECT: ${{ secrets.GOOGLE_CLOUD_PROJECT }}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
timeout-minutes: 600
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true

View File

@@ -34,6 +34,9 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
@@ -74,6 +77,9 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}

View File

@@ -0,0 +1,69 @@
# Generated from xtask::workflows::run_cron_unit_evals
# Rebuild with `cargo xtask workflows`.
name: run_cron_unit_evals
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
on:
schedule:
- cron: 47 1 * * 2
workflow_dispatch: {}
jobs:
cron_unit_evals:
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 250
shell: bash -euxo pipefail {0}
- name: ./script/run-unit-evals
run: ./script/run-unit-evals
shell: bash -euxo pipefail {0}
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
GOOGLE_AI_API_KEY: ${{ secrets.GOOGLE_AI_API_KEY }}
GOOGLE_CLOUD_PROJECT: ${{ secrets.GOOGLE_CLOUD_PROJECT }}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
- name: run_agent_evals::cron_unit_evals::send_failure_to_slack
if: ${{ failure() }}
uses: slackapi/slack-github-action@b0fa283ad8fea605de13dc3f449259339835fc52
with:
method: chat.postMessage
token: ${{ secrets.SLACK_APP_ZED_UNIT_EVALS_BOT_TOKEN }}
payload: |
channel: C04UDRNNJFQ
text: "Unit Evals Failed: https://github.com/zed-industries/zed/actions/runs/${{ github.run_id }}"
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true

View File

@@ -143,16 +143,19 @@ jobs:
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
@@ -232,6 +235,9 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
@@ -263,16 +269,19 @@ jobs:
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: cargo build -p collab
run: cargo build -p collab
shell: bash -euxo pipefail {0}
@@ -348,6 +357,9 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: run_tests::check_docs::install_mdbook
uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08
with:

View File

@@ -1,17 +1,26 @@
# Generated from xtask::workflows::run_agent_evals
# Generated from xtask::workflows::run_unit_evals
# Rebuild with `cargo xtask workflows`.
name: run_agent_evals
name: run_unit_evals
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_EVAL_TELEMETRY: '1'
MODEL_NAME: ${{ inputs.model_name }}
on:
schedule:
- cron: 47 1 * * 2
workflow_dispatch: {}
workflow_dispatch:
inputs:
model_name:
description: model_name
required: true
type: string
commit_sha:
description: commit_sha
required: true
type: string
jobs:
unit_evals:
run_unit_evals:
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
@@ -33,6 +42,9 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
@@ -44,15 +56,10 @@ jobs:
shell: bash -euxo pipefail {0}
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
- name: run_agent_evals::unit_evals::send_failure_to_slack
if: ${{ failure() }}
uses: slackapi/slack-github-action@b0fa283ad8fea605de13dc3f449259339835fc52
with:
method: chat.postMessage
token: ${{ secrets.SLACK_APP_ZED_UNIT_EVALS_BOT_TOKEN }}
payload: |
channel: C04UDRNNJFQ
text: "Unit Evals Failed: https://github.com/zed-industries/zed/actions/runs/${{ github.run_id }}"
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
GOOGLE_AI_API_KEY: ${{ secrets.GOOGLE_AI_API_KEY }}
GOOGLE_CLOUD_PROJECT: ${{ secrets.GOOGLE_CLOUD_PROJECT }}
UNIT_EVAL_COMMIT: ${{ inputs.commit_sha }}
- name: steps::cleanup_cargo_config
if: always()
run: |

Cargo.lock generated
View File

@@ -96,6 +96,7 @@ dependencies = [
"auto_update",
"editor",
"extension_host",
"fs",
"futures 0.3.31",
"gpui",
"language",
@@ -1330,10 +1331,14 @@ version = "0.1.0"
dependencies = [
"anyhow",
"client",
"clock",
"ctor",
"db",
"futures 0.3.31",
"gpui",
"http_client",
"log",
"parking_lot",
"paths",
"release_channel",
"serde",
@@ -1344,6 +1349,7 @@ dependencies = [
"util",
"which 6.0.3",
"workspace",
"zlog",
]
[[package]]
@@ -6242,7 +6248,7 @@ dependencies = [
"futures-core",
"futures-sink",
"nanorand",
"spin",
"spin 0.9.8",
]
[[package]]
@@ -6353,9 +6359,9 @@ checksum = "aa9a19cbb55df58761df49b23516a86d432839add4af60fc256da840f66ed35b"
[[package]]
name = "fork"
version = "0.2.0"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "05dc8b302e04a1c27f4fe694439ef0f29779ca4edc205b7b58f00db04e29656d"
checksum = "30268f1eefccc9d72f43692e8b89e659aeb52e84016c3b32b6e7e9f1c8f38f94"
dependencies = [
"libc",
]
@@ -7281,6 +7287,7 @@ dependencies = [
"calloop",
"calloop-wayland-source",
"cbindgen",
"circular-buffer",
"cocoa 0.26.0",
"cocoa-foundation 0.2.0",
"collections",
@@ -7336,6 +7343,7 @@ dependencies = [
"slotmap",
"smallvec",
"smol",
"spin 0.10.0",
"stacksafe",
"strum 0.27.2",
"sum_tree",
@@ -7799,6 +7807,7 @@ dependencies = [
"parking_lot",
"serde",
"serde_json",
"serde_urlencoded",
"sha2",
"tempfile",
"url",
@@ -9065,7 +9074,7 @@ version = "1.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bbd2bcb4c963f2ddae06a2efc7e9f3591312473c50c6685e1f298068316e66fe"
dependencies = [
"spin",
"spin 0.9.8",
]
[[package]]
@@ -10007,6 +10016,18 @@ version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "68354c5c6bd36d73ff3feceb05efa59b6acb7626617f4962be322a825e61f79a"
[[package]]
name = "miniprofiler_ui"
version = "0.1.0"
dependencies = [
"gpui",
"serde_json",
"smol",
"util",
"workspace",
"zed_actions",
]
[[package]]
name = "miniz_oxide"
version = "0.8.9"
@@ -13071,6 +13092,7 @@ dependencies = [
"settings",
"smallvec",
"telemetry",
"tempfile",
"theme",
"ui",
"util",
@@ -15846,6 +15868,15 @@ dependencies = [
"lock_api",
]
[[package]]
name = "spin"
version = "0.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d5fe4ccb98d9c292d56fec89a5e07da7fc4cf0dc11e156b41793132775d3e591"
dependencies = [
"lock_api",
]
[[package]]
name = "spirv"
version = "0.3.0+sdk-1.3.268.0"
@@ -21157,6 +21188,7 @@ dependencies = [
"breadcrumbs",
"call",
"channel",
"chrono",
"clap",
"cli",
"client",
@@ -21214,6 +21246,7 @@ dependencies = [
"menu",
"migrator",
"mimalloc",
"miniprofiler_ui",
"nc",
"nix 0.29.0",
"node_runtime",

View File

@@ -110,6 +110,7 @@ members = [
"crates/menu",
"crates/migrator",
"crates/mistral",
"crates/miniprofiler_ui",
"crates/multi_buffer",
"crates/nc",
"crates/net",
@@ -341,6 +342,7 @@ menu = { path = "crates/menu" }
migrator = { path = "crates/migrator" }
mistral = { path = "crates/mistral" }
multi_buffer = { path = "crates/multi_buffer" }
miniprofiler_ui = { path = "crates/miniprofiler_ui" }
nc = { path = "crates/nc" }
net = { path = "crates/net" }
node_runtime = { path = "crates/node_runtime" }
@@ -504,7 +506,7 @@ emojis = "0.6.1"
env_logger = "0.11"
exec = "0.3.1"
fancy-regex = "0.14.0"
fork = "0.2.0"
fork = "0.4.0"
futures = "0.3"
futures-batch = "0.6.1"
futures-lite = "1.13"
@@ -840,7 +842,7 @@ ui_input = { codegen-units = 1 }
zed_actions = { codegen-units = 1 }
[profile.release]
debug = "full"
debug = "limited"
lto = "thin"
codegen-units = 1

View File

@@ -1,7 +1,7 @@
# Zed
[![Zed](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/zed-industries/zed/main/assets/badge/v0.json)](https://zed.dev)
[![CI](https://github.com/zed-industries/zed/actions/workflows/ci.yml/badge.svg)](https://github.com/zed-industries/zed/actions/workflows/ci.yml)
[![CI](https://github.com/zed-industries/zed/actions/workflows/run_tests.yml/badge.svg)](https://github.com/zed-industries/zed/actions/workflows/run_tests.yml)
Welcome to Zed, a high-performance, multiplayer code editor from the creators of [Atom](https://github.com/atom/atom) and [Tree-sitter](https://github.com/tree-sitter/tree-sitter).

assets/icons/at_sign.svg Normal file
View File

@@ -0,0 +1,4 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M8.00156 10.3996C9.32705 10.3996 10.4016 9.32509 10.4016 7.99961C10.4016 6.67413 9.32705 5.59961 8.00156 5.59961C6.67608 5.59961 5.60156 6.67413 5.60156 7.99961C5.60156 9.32509 6.67608 10.3996 8.00156 10.3996Z" stroke="black" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M10.4 5.6V8.6C10.4 9.07739 10.5896 9.53523 10.9272 9.8728C11.2648 10.2104 11.7226 10.4 12.2 10.4C12.6774 10.4 13.1352 10.2104 13.4728 9.8728C13.8104 9.53523 14 9.07739 14 8.6V8C14 6.64839 13.5436 5.33636 12.7048 4.27651C11.8661 3.21665 10.694 2.47105 9.37852 2.16051C8.06306 1.84997 6.68129 1.99269 5.45707 2.56554C4.23285 3.13838 3.23791 4.1078 2.63344 5.31672C2.02898 6.52565 1.85041 7.90325 2.12667 9.22633C2.40292 10.5494 3.11782 11.7405 4.15552 12.6065C5.19323 13.4726 6.49295 13.9629 7.84411 13.998C9.19527 14.0331 10.5187 13.611 11.6 12.8" stroke="black" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


View File

@@ -605,6 +605,10 @@
// to both the horizontal and vertical delta values while scrolling. Fast scrolling
// happens when a user holds the alt or option key while scrolling.
"fast_scroll_sensitivity": 4.0,
"sticky_scroll": {
// Whether to stick scopes to the top of the editor.
"enabled": false
},
"relative_line_numbers": "disabled",
// If 'search_wrap' is disabled, search result do not wrap around the end of the file.
"search_wrap": true,
@@ -612,9 +616,13 @@
"search": {
// Whether to show the project search button in the status bar.
"button": true,
// Whether to only match on whole words.
"whole_word": false,
// Whether to match case sensitively.
"case_sensitive": false,
// Whether to include gitignored files in search results.
"include_ignored": false,
// Whether to interpret the search query as a regular expression.
"regex": false,
// Whether to center the cursor on each search match when navigating.
"center_on_match": false
@@ -740,8 +748,15 @@
"hide_root": false,
// Whether to hide the hidden entries in the project panel.
"hide_hidden": false,
// Whether to automatically open files when pasting them in the project panel.
"open_file_on_paste": true
// Settings for automatically opening files.
"auto_open": {
// Whether to automatically open newly created files in the editor.
"on_create": true,
// Whether to automatically open files after pasting or duplicating them.
"on_paste": true,
// Whether to automatically open files dropped from external sources.
"on_drop": true
}
},
"outline_panel": {
// Whether to show the outline panel button in the status bar

View File

@@ -1866,10 +1866,14 @@ impl AcpThread {
.checkpoint
.as_ref()
.map(|c| c.git_checkpoint.clone());
// Cancel any in-progress generation before restoring
let cancel_task = self.cancel(cx);
let rewind = self.rewind(id.clone(), cx);
let git_store = self.project.read(cx).git_store().clone();
cx.spawn(async move |_, cx| {
cancel_task.await;
rewind.await?;
if let Some(checkpoint) = checkpoint {
git_store
@@ -1894,9 +1898,25 @@ impl AcpThread {
cx.update(|cx| truncate.run(id.clone(), cx))?.await?;
this.update(cx, |this, cx| {
if let Some((ix, _)) = this.user_message_mut(&id) {
// Collect all terminals from entries that will be removed
let terminals_to_remove: Vec<acp::TerminalId> = this.entries[ix..]
.iter()
.flat_map(|entry| entry.terminals())
.filter_map(|terminal| terminal.read(cx).id().clone().into())
.collect();
let range = ix..this.entries.len();
this.entries.truncate(ix);
cx.emit(AcpThreadEvent::EntriesRemoved(range));
// Kill and remove the terminals
for terminal_id in terminals_to_remove {
if let Some(terminal) = this.terminals.remove(&terminal_id) {
terminal.update(cx, |terminal, cx| {
terminal.kill(cx);
});
}
}
}
this.action_log().update(cx, |action_log, cx| {
action_log.reject_all_edits(Some(telemetry), cx)
@@ -3803,4 +3823,314 @@ mod tests {
}
});
}
/// Tests that restoring a checkpoint properly cleans up terminals that were
/// created after that checkpoint, and cancels any in-progress generation.
///
/// Reproduces issue #35142: When a checkpoint is restored, any terminal processes
/// that were started after that checkpoint should be terminated, and any in-progress
/// AI generation should be canceled.
#[gpui::test]
async fn test_restore_checkpoint_kills_terminal(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, [], cx).await;
let connection = Rc::new(FakeAgentConnection::new());
let thread = cx
.update(|cx| connection.new_thread(project, Path::new(path!("/test")), cx))
.await
.unwrap();
// Send first user message to create a checkpoint
cx.update(|cx| {
thread.update(cx, |thread, cx| {
thread.send(vec!["first message".into()], cx)
})
})
.await
.unwrap();
// Send second message (creates another checkpoint) - we'll restore to this one
cx.update(|cx| {
thread.update(cx, |thread, cx| {
thread.send(vec!["second message".into()], cx)
})
})
.await
.unwrap();
// Create 2 terminals BEFORE the checkpoint that have completed running
let terminal_id_1 = acp::TerminalId(uuid::Uuid::new_v4().to_string().into());
let mock_terminal_1 = cx.new(|cx| {
let builder = ::terminal::TerminalBuilder::new_display_only(
::terminal::terminal_settings::CursorShape::default(),
::terminal::terminal_settings::AlternateScroll::On,
None,
0,
)
.unwrap();
builder.subscribe(cx)
});
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Created {
terminal_id: terminal_id_1.clone(),
label: "echo 'first'".to_string(),
cwd: Some(PathBuf::from("/test")),
output_byte_limit: None,
terminal: mock_terminal_1.clone(),
},
cx,
);
});
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Output {
terminal_id: terminal_id_1.clone(),
data: b"first\n".to_vec(),
},
cx,
);
});
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Exit {
terminal_id: terminal_id_1.clone(),
status: acp::TerminalExitStatus {
exit_code: Some(0),
signal: None,
meta: None,
},
},
cx,
);
});
let terminal_id_2 = acp::TerminalId(uuid::Uuid::new_v4().to_string().into());
let mock_terminal_2 = cx.new(|cx| {
let builder = ::terminal::TerminalBuilder::new_display_only(
::terminal::terminal_settings::CursorShape::default(),
::terminal::terminal_settings::AlternateScroll::On,
None,
0,
)
.unwrap();
builder.subscribe(cx)
});
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Created {
terminal_id: terminal_id_2.clone(),
label: "echo 'second'".to_string(),
cwd: Some(PathBuf::from("/test")),
output_byte_limit: None,
terminal: mock_terminal_2.clone(),
},
cx,
);
});
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Output {
terminal_id: terminal_id_2.clone(),
data: b"second\n".to_vec(),
},
cx,
);
});
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Exit {
terminal_id: terminal_id_2.clone(),
status: acp::TerminalExitStatus {
exit_code: Some(0),
signal: None,
meta: None,
},
},
cx,
);
});
// Get the second message ID to restore to
let second_message_id = thread.read_with(cx, |thread, _| {
// At this point we have:
// - Index 0: First user message (with checkpoint)
// - Index 1: Second user message (with checkpoint)
// No assistant responses because FakeAgentConnection just returns EndTurn
let AgentThreadEntry::UserMessage(message) = &thread.entries[1] else {
panic!("expected user message at index 1");
};
message.id.clone().unwrap()
});
// Create a terminal AFTER the checkpoint we'll restore to.
// This simulates the AI agent starting a long-running terminal command.
let terminal_id = acp::TerminalId(uuid::Uuid::new_v4().to_string().into());
let mock_terminal = cx.new(|cx| {
let builder = ::terminal::TerminalBuilder::new_display_only(
::terminal::terminal_settings::CursorShape::default(),
::terminal::terminal_settings::AlternateScroll::On,
None,
0,
)
.unwrap();
builder.subscribe(cx)
});
// Register the terminal as created
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Created {
terminal_id: terminal_id.clone(),
label: "sleep 1000".to_string(),
cwd: Some(PathBuf::from("/test")),
output_byte_limit: None,
terminal: mock_terminal.clone(),
},
cx,
);
});
// Simulate the terminal producing output (still running)
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Output {
terminal_id: terminal_id.clone(),
data: b"terminal is running...\n".to_vec(),
},
cx,
);
});
// Create a tool call entry that references this terminal
// This represents the agent requesting a terminal command
thread.update(cx, |thread, cx| {
thread
.handle_session_update(
acp::SessionUpdate::ToolCall(acp::ToolCall {
id: acp::ToolCallId("terminal-tool-1".into()),
title: "Running command".into(),
kind: acp::ToolKind::Execute,
status: acp::ToolCallStatus::InProgress,
content: vec![acp::ToolCallContent::Terminal {
terminal_id: terminal_id.clone(),
}],
locations: vec![],
raw_input: Some(
serde_json::json!({"command": "sleep 1000", "cd": "/test"}),
),
raw_output: None,
meta: None,
}),
cx,
)
.unwrap();
});
// Verify terminal exists and is in the thread
let terminal_exists_before =
thread.read_with(cx, |thread, _| thread.terminals.contains_key(&terminal_id));
assert!(
terminal_exists_before,
"Terminal should exist before checkpoint restore"
);
// Verify the terminal's underlying task is still running (not completed)
let terminal_running_before = thread.read_with(cx, |thread, _cx| {
let terminal_entity = thread.terminals.get(&terminal_id).unwrap();
terminal_entity.read_with(cx, |term, _cx| {
term.output().is_none() // output() returns None while the command is still running
})
});
assert!(
terminal_running_before,
"Terminal should be running before checkpoint restore"
);
// Verify we have the expected entries before restore
let entry_count_before = thread.read_with(cx, |thread, _| thread.entries.len());
assert!(
entry_count_before > 1,
"Should have multiple entries before restore"
);
// Restore the checkpoint to the second message.
// This should:
// 1. Cancel any in-progress generation (via the cancel() call)
// 2. Remove the terminal that was created after that point
thread
.update(cx, |thread, cx| {
thread.restore_checkpoint(second_message_id, cx)
})
.await
.unwrap();
// Verify that no send_task is in progress after restore
// (cancel() clears the send_task)
let has_send_task_after = thread.read_with(cx, |thread, _| thread.send_task.is_some());
assert!(
!has_send_task_after,
"Should not have a send_task after restore (cancel should have cleared it)"
);
// Verify the entries were truncated (restoring to index 1 truncates at 1, keeping only index 0)
let entry_count = thread.read_with(cx, |thread, _| thread.entries.len());
assert_eq!(
entry_count, 1,
"Should have 1 entry after restore (only the first user message)"
);
// Verify the 2 completed terminals from before the checkpoint still exist
let terminal_1_exists = thread.read_with(cx, |thread, _| {
thread.terminals.contains_key(&terminal_id_1)
});
assert!(
terminal_1_exists,
"Terminal 1 (from before checkpoint) should still exist"
);
let terminal_2_exists = thread.read_with(cx, |thread, _| {
thread.terminals.contains_key(&terminal_id_2)
});
assert!(
terminal_2_exists,
"Terminal 2 (from before checkpoint) should still exist"
);
// Verify they're still in completed state
let terminal_1_completed = thread.read_with(cx, |thread, _cx| {
let terminal_entity = thread.terminals.get(&terminal_id_1).unwrap();
terminal_entity.read_with(cx, |term, _cx| term.output().is_some())
});
assert!(terminal_1_completed, "Terminal 1 should still be completed");
let terminal_2_completed = thread.read_with(cx, |thread, _cx| {
let terminal_entity = thread.terminals.get(&terminal_id_2).unwrap();
terminal_entity.read_with(cx, |term, _cx| term.output().is_some())
});
assert!(terminal_2_completed, "Terminal 2 should still be completed");
// Verify the running terminal (created after checkpoint) was removed
let terminal_3_exists =
thread.read_with(cx, |thread, _| thread.terminals.contains_key(&terminal_id));
assert!(
!terminal_3_exists,
"Terminal 3 (created after checkpoint) should have been removed"
);
// Verify total count is 2 (the two from before the checkpoint)
let terminal_count = thread.read_with(cx, |thread, _| thread.terminals.len());
assert_eq!(
terminal_count, 2,
"Should have exactly 2 terminals (the completed ones from before checkpoint)"
);
}
}

View File

@@ -17,6 +17,7 @@ anyhow.workspace = true
auto_update.workspace = true
editor.workspace = true
extension_host.workspace = true
fs.workspace = true
futures.workspace = true
gpui.workspace = true
language.workspace = true


@@ -51,6 +51,7 @@ pub struct ActivityIndicator {
project: Entity<Project>,
auto_updater: Option<Entity<AutoUpdater>>,
context_menu_handle: PopoverMenuHandle<ContextMenu>,
fs_jobs: Vec<fs::JobInfo>,
}
#[derive(Debug)]
@@ -99,6 +100,27 @@ impl ActivityIndicator {
})
.detach();
let fs = project.read(cx).fs().clone();
let mut job_events = fs.subscribe_to_jobs();
cx.spawn(async move |this, cx| {
while let Some(job_event) = job_events.next().await {
this.update(cx, |this: &mut ActivityIndicator, cx| {
match job_event {
fs::JobEvent::Started { info } => {
this.fs_jobs.retain(|j| j.id != info.id);
this.fs_jobs.push(info);
}
fs::JobEvent::Completed { id } => {
this.fs_jobs.retain(|j| j.id != id);
}
}
cx.notify();
})?;
}
anyhow::Ok(())
})
.detach();
cx.subscribe(
&project.read(cx).lsp_store(),
|activity_indicator, _, event, cx| {
@@ -201,7 +223,8 @@ impl ActivityIndicator {
statuses: Vec::new(),
project: project.clone(),
auto_updater,
context_menu_handle: Default::default(),
context_menu_handle: PopoverMenuHandle::default(),
fs_jobs: Vec::new(),
}
});
@@ -432,6 +455,23 @@ impl ActivityIndicator {
});
}
// Show any long-running fs command
for fs_job in &self.fs_jobs {
if Instant::now().duration_since(fs_job.start) >= GIT_OPERATION_DELAY {
return Some(Content {
icon: Some(
Icon::new(IconName::ArrowCircle)
.size(IconSize::Small)
.with_rotate_animation(2)
.into_any_element(),
),
message: fs_job.message.clone().into(),
on_click: None,
tooltip_message: None,
});
}
}
// Show any language server installation info.
let mut downloading = SmallVec::<[_; 3]>::new();
let mut checking_for_update = SmallVec::<[_; 3]>::new();


@@ -133,9 +133,7 @@ impl LanguageModels {
for model in provider.provided_models(cx) {
let model_info = Self::map_language_model_to_info(&model, &provider);
let model_id = model_info.id.clone();
if !recommended_models.contains(&(model.provider_id(), model.id())) {
provider_models.push(model_info);
}
provider_models.push(model_info);
models.insert(model_id, model);
}
if !provider_models.is_empty() {
@@ -370,13 +368,15 @@ impl NativeAgent {
cx: &mut AsyncApp,
) -> Result<()> {
while needs_refresh.changed().await.is_ok() {
let project_context = this
let new_project_context_data = this
.update(cx, |this, cx| {
Self::build_project_context(&this.project, this.prompt_store.as_ref(), cx)
})?
.await;
this.update(cx, |this, cx| {
this.project_context = cx.new(|_| project_context);
this.project_context.update(cx, |project_context, _cx| {
*project_context = new_project_context_data;
});
})?;
}


@@ -29,13 +29,14 @@ use pretty_assertions::assert_eq;
use project::{
Project, context_server_store::ContextServerStore, project_settings::ProjectSettings,
};
use prompt_store::ProjectContext;
use prompt_store::{ProjectContext, UserPromptId, UserRulesContext};
use reqwest_client::ReqwestClient;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use serde_json::json;
use settings::{Settings, SettingsStore};
use std::{path::Path, rc::Rc, sync::Arc, time::Duration};
use util::path;
mod test_tools;
@@ -933,7 +934,7 @@ async fn test_profiles(cx: &mut TestAppContext) {
// Test that test-1 profile (default) has echo and delay tools
thread
.update(cx, |thread, cx| {
thread.set_profile(AgentProfileId("test-1".into()));
thread.set_profile(AgentProfileId("test-1".into()), cx);
thread.send(UserMessageId::new(), ["test"], cx)
})
.unwrap();
@@ -953,7 +954,7 @@ async fn test_profiles(cx: &mut TestAppContext) {
// Switch to test-2 profile, and verify that it has only the infinite tool.
thread
.update(cx, |thread, cx| {
thread.set_profile(AgentProfileId("test-2".into()));
thread.set_profile(AgentProfileId("test-2".into()), cx);
thread.send(UserMessageId::new(), ["test2"], cx)
})
.unwrap();
@@ -1002,8 +1003,8 @@ async fn test_mcp_tools(cx: &mut TestAppContext) {
)
.await;
cx.run_until_parked();
thread.update(cx, |thread, _| {
thread.set_profile(AgentProfileId("test".into()))
thread.update(cx, |thread, cx| {
thread.set_profile(AgentProfileId("test".into()), cx)
});
let mut mcp_tool_calls = setup_context_server(
@@ -1169,8 +1170,8 @@ async fn test_mcp_tool_truncation(cx: &mut TestAppContext) {
.await;
cx.run_until_parked();
thread.update(cx, |thread, _| {
thread.set_profile(AgentProfileId("test".into()));
thread.update(cx, |thread, cx| {
thread.set_profile(AgentProfileId("test".into()), cx);
thread.add_tool(EchoTool);
thread.add_tool(DelayTool);
thread.add_tool(WordListTool);
@@ -2577,3 +2578,145 @@ fn setup_context_server(
cx.run_until_parked();
mcp_tool_calls_rx
}
/// Tests that rules toggled after thread creation are applied to the first message.
///
/// This test verifies the fix for https://github.com/zed-industries/zed/issues/39057
/// where rules toggled in the Rules Library after opening a new agent thread were not
/// being applied to the first message sent in that thread.
///
/// The test simulates:
/// 1. Creating a thread with an initial rule in the project context
/// 2. Updating the project context entity with new rules (simulating what
/// NativeAgent::maintain_project_context does when rules are toggled)
/// 3. Sending a message through the thread
/// 4. Verifying that the newly toggled rules appear in the system prompt
///
/// The fix ensures that threads see updated rules by updating the project_context
/// entity in place rather than creating a new entity that threads wouldn't reference.
#[gpui::test]
async fn test_rules_toggled_after_thread_creation_are_applied(cx: &mut TestAppContext) {
let ThreadTest {
model,
thread,
project_context,
..
} = setup(cx, TestModel::Fake).await;
let fake_model = model.as_fake();
let initial_rule_content = "Initial rule before thread creation.";
project_context.update(cx, |context, _cx| {
context.user_rules.push(UserRulesContext {
uuid: UserPromptId::new(),
title: Some("Initial Rule".to_string()),
contents: initial_rule_content.to_string(),
});
context.has_user_rules = true;
});
let rule_id = UserPromptId::new();
let new_rule_content = "Always respond in uppercase.";
project_context.update(cx, |context, _cx| {
context.user_rules.clear();
context.user_rules.push(UserRulesContext {
uuid: rule_id,
title: Some("New Rule".to_string()),
contents: new_rule_content.to_string(),
});
context.has_user_rules = true;
});
thread
.update(cx, |thread, cx| {
thread.send(UserMessageId::new(), ["test message"], cx)
})
.unwrap();
cx.run_until_parked();
let mut pending_completions = fake_model.pending_completions();
assert_eq!(pending_completions.len(), 1);
let pending_completion = pending_completions.pop().unwrap();
let system_message = &pending_completion.messages[0];
let system_prompt = system_message.content[0].to_str().unwrap();
assert!(
system_prompt.contains(new_rule_content),
"System prompt should contain the rule content that was toggled after thread creation"
);
}
/// Verifies the fix for issue #39057: entity replacement breaks rule updates.
///
/// The buggy code in maintain_project_context created a NEW entity:
/// `this.project_context = cx.new(|_| new_data);`
/// The fixed code updates the EXISTING entity in place:
/// `this.project_context.update(cx, |ctx, _| *ctx = new_data);`
///
/// This test simulates the FIXED behavior directly:
/// 1. Creating a thread (the thread holds a reference to the project_context entity)
/// 2. Building new project context data with rules (simulating a rule toggle)
/// 3. Updating the existing entity in place, as the fixed maintain_project_context does
/// 4. Sending a message
/// 5. Asserting the rules appear in the system prompt
///
/// With the buggy entity replacement, the thread would keep a stale reference to
/// the old entity (which has no rules), and the assertion below would fail.
#[gpui::test]
async fn test_project_context_entity_updated_in_place(cx: &mut TestAppContext) {
let ThreadTest {
model,
thread,
project_context,
..
} = setup(cx, TestModel::Fake).await;
let fake_model = model.as_fake();
let rule_content = "Rule toggled after thread creation.";
// Simulate what maintain_project_context does: build new context data with updated rules
let new_context_data = {
let mut context = ProjectContext::default();
context.user_rules.push(UserRulesContext {
uuid: UserPromptId::new(),
title: Some("Toggled Rule".to_string()),
contents: rule_content.to_string(),
});
context.has_user_rules = true;
context
};
// THE FIX: Update the existing entity in place (not creating a new entity)
// This is what the fixed maintain_project_context does.
// The buggy version would do: this.project_context = cx.new(|_| new_context_data);
// which would leave the thread with a stale reference.
project_context.update(cx, |context, _cx| {
*context = new_context_data;
});
thread
.update(cx, |thread, cx| {
thread.send(UserMessageId::new(), ["test message"], cx)
})
.unwrap();
cx.run_until_parked();
let mut pending_completions = fake_model.pending_completions();
assert_eq!(pending_completions.len(), 1);
let pending_completion = pending_completions.pop().unwrap();
let system_message = &pending_completion.messages[0];
let system_prompt = system_message.content[0].to_str().unwrap();
assert!(
system_prompt.contains(rule_content),
"Thread should see rules because entity was updated in place.\n\
Without the fix, maintain_project_context would create a NEW entity,\n\
leaving the thread with a reference to the old entity (which has no rules).\n\
This test passes because we simulate the FIXED behavior (update in place)."
);
}


@@ -30,16 +30,17 @@ use gpui::{
};
use language_model::{
LanguageModel, LanguageModelCompletionError, LanguageModelCompletionEvent, LanguageModelExt,
LanguageModelImage, LanguageModelProviderId, LanguageModelRegistry, LanguageModelRequest,
LanguageModelRequestMessage, LanguageModelRequestTool, LanguageModelToolResult,
LanguageModelToolResultContent, LanguageModelToolSchemaFormat, LanguageModelToolUse,
LanguageModelToolUseId, Role, SelectedModel, StopReason, TokenUsage, ZED_CLOUD_PROVIDER_ID,
LanguageModelId, LanguageModelImage, LanguageModelProviderId, LanguageModelRegistry,
LanguageModelRequest, LanguageModelRequestMessage, LanguageModelRequestTool,
LanguageModelToolResult, LanguageModelToolResultContent, LanguageModelToolSchemaFormat,
LanguageModelToolUse, LanguageModelToolUseId, Role, SelectedModel, StopReason, TokenUsage,
ZED_CLOUD_PROVIDER_ID,
};
use project::Project;
use prompt_store::ProjectContext;
use schemars::{JsonSchema, Schema};
use serde::{Deserialize, Serialize};
use settings::{Settings, update_settings_file};
use settings::{LanguageModelSelection, Settings, update_settings_file};
use smol::stream::StreamExt;
use std::{
collections::BTreeMap,
@@ -798,7 +799,8 @@ impl Thread {
let profile_id = db_thread
.profile
.unwrap_or_else(|| AgentSettings::get_global(cx).default_profile.clone());
let model = LanguageModelRegistry::global(cx).update(cx, |registry, cx| {
let mut model = LanguageModelRegistry::global(cx).update(cx, |registry, cx| {
db_thread
.model
.and_then(|model| {
@@ -811,6 +813,16 @@ impl Thread {
.or_else(|| registry.default_model())
.map(|model| model.model)
});
if model.is_none() {
model = Self::resolve_profile_model(&profile_id, cx);
}
if model.is_none() {
model = LanguageModelRegistry::global(cx).update(cx, |registry, _cx| {
registry.default_model().map(|model| model.model)
});
}
let (prompt_capabilities_tx, prompt_capabilities_rx) =
watch::channel(Self::prompt_capabilities(model.as_deref()));
@@ -951,6 +963,11 @@ impl Thread {
cx.notify()
}
#[cfg(any(test, feature = "test-support"))]
pub fn replace_project_context(&mut self, new_context: Entity<ProjectContext>) {
self.project_context = new_context;
}
#[cfg(any(test, feature = "test-support"))]
pub fn last_message(&self) -> Option<Message> {
if let Some(message) = self.pending_message.clone() {
@@ -1007,8 +1024,17 @@ impl Thread {
&self.profile_id
}
pub fn set_profile(&mut self, profile_id: AgentProfileId) {
pub fn set_profile(&mut self, profile_id: AgentProfileId, cx: &mut Context<Self>) {
if self.profile_id == profile_id {
return;
}
self.profile_id = profile_id;
// Swap to the profile's preferred model when available.
if let Some(model) = Self::resolve_profile_model(&self.profile_id, cx) {
self.set_model(model, cx);
}
}
pub fn cancel(&mut self, cx: &mut Context<Self>) {
@@ -1065,6 +1091,35 @@ impl Thread {
})
}
/// Look up the active profile and resolve its preferred model if one is configured.
fn resolve_profile_model(
profile_id: &AgentProfileId,
cx: &mut Context<Self>,
) -> Option<Arc<dyn LanguageModel>> {
let selection = AgentSettings::get_global(cx)
.profiles
.get(profile_id)?
.default_model
.clone()?;
Self::resolve_model_from_selection(&selection, cx)
}
/// Translate a stored model selection into the configured model from the registry.
fn resolve_model_from_selection(
selection: &LanguageModelSelection,
cx: &mut Context<Self>,
) -> Option<Arc<dyn LanguageModel>> {
let selected = SelectedModel {
provider: LanguageModelProviderId::from(selection.provider.0.clone()),
model: LanguageModelId::from(selection.model.clone()),
};
LanguageModelRegistry::global(cx).update(cx, |registry, cx| {
registry
.select_model(&selected, cx)
.map(|configured| configured.model)
})
}
pub fn resume(
&mut self,
cx: &mut Context<Self>,


@@ -136,7 +136,7 @@ impl AcpConnection {
while let Ok(n) = stderr.read_line(&mut line).await
&& n > 0
{
log::warn!("agent stderr: {}", &line);
log::warn!("agent stderr: {}", line.trim());
line.clear();
}
Ok(())


@@ -50,13 +50,14 @@ impl crate::AgentServer for CustomAgentServer {
fn set_default_mode(&self, mode_id: Option<acp::SessionModeId>, fs: Arc<dyn Fs>, cx: &mut App) {
let name = self.name();
update_settings_file(fs, cx, move |settings, _| {
settings
if let Some(settings) = settings
.agent_servers
.get_or_insert_default()
.custom
.get_mut(&name)
.unwrap()
.default_mode = mode_id.map(|m| m.to_string())
{
settings.default_mode = mode_id.map(|m| m.to_string())
}
});
}


@@ -6,8 +6,8 @@ use convert_case::{Case, Casing as _};
use fs::Fs;
use gpui::{App, SharedString};
use settings::{
AgentProfileContent, ContextServerPresetContent, Settings as _, SettingsContent,
update_settings_file,
AgentProfileContent, ContextServerPresetContent, LanguageModelSelection, Settings as _,
SettingsContent, update_settings_file,
};
use util::ResultExt as _;
@@ -53,19 +53,30 @@ impl AgentProfile {
let base_profile =
base_profile_id.and_then(|id| AgentSettings::get_global(cx).profiles.get(&id).cloned());
// Copy toggles from the base profile so the new profile starts with familiar defaults.
let tools = base_profile
.as_ref()
.map(|profile| profile.tools.clone())
.unwrap_or_default();
let enable_all_context_servers = base_profile
.as_ref()
.map(|profile| profile.enable_all_context_servers)
.unwrap_or_default();
let context_servers = base_profile
.as_ref()
.map(|profile| profile.context_servers.clone())
.unwrap_or_default();
// Preserve the base profile's model preference when cloning into a new profile.
let default_model = base_profile
.as_ref()
.and_then(|profile| profile.default_model.clone());
let profile_settings = AgentProfileSettings {
name: name.into(),
tools: base_profile
.as_ref()
.map(|profile| profile.tools.clone())
.unwrap_or_default(),
enable_all_context_servers: base_profile
.as_ref()
.map(|profile| profile.enable_all_context_servers)
.unwrap_or_default(),
context_servers: base_profile
.map(|profile| profile.context_servers)
.unwrap_or_default(),
tools,
enable_all_context_servers,
context_servers,
default_model,
};
update_settings_file(fs, cx, {
@@ -96,6 +107,8 @@ pub struct AgentProfileSettings {
pub tools: IndexMap<Arc<str>, bool>,
pub enable_all_context_servers: bool,
pub context_servers: IndexMap<Arc<str>, ContextServerPreset>,
/// Default language model to apply when this profile becomes active.
pub default_model: Option<LanguageModelSelection>,
}
impl AgentProfileSettings {
@@ -144,6 +157,7 @@ impl AgentProfileSettings {
)
})
.collect(),
default_model: self.default_model.clone(),
},
);
@@ -153,15 +167,23 @@ impl AgentProfileSettings {
impl From<AgentProfileContent> for AgentProfileSettings {
fn from(content: AgentProfileContent) -> Self {
let AgentProfileContent {
name,
tools,
enable_all_context_servers,
context_servers,
default_model,
} = content;
Self {
name: content.name.into(),
tools: content.tools,
enable_all_context_servers: content.enable_all_context_servers.unwrap_or_default(),
context_servers: content
.context_servers
name: name.into(),
tools,
enable_all_context_servers: enable_all_context_servers.unwrap_or_default(),
context_servers: context_servers
.into_iter()
.map(|(server_id, preset)| (server_id, preset.into()))
.collect(),
default_model,
}
}
}


@@ -109,6 +109,8 @@ impl ContextPickerCompletionProvider {
icon_path: Some(mode.icon().path().into()),
documentation: None,
source: project::CompletionSource::Custom,
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
// This ensures that when a user accepts this completion, the
// completion menu will still be shown after "@category " is
@@ -146,6 +148,8 @@ impl ContextPickerCompletionProvider {
documentation: None,
insert_text_mode: None,
source: project::CompletionSource::Custom,
match_start: None,
snippet_deduplication_key: None,
icon_path: Some(icon_for_completion),
confirm: Some(confirm_completion_callback(
thread_entry.title().clone(),
@@ -177,6 +181,8 @@ impl ContextPickerCompletionProvider {
documentation: None,
insert_text_mode: None,
source: project::CompletionSource::Custom,
match_start: None,
snippet_deduplication_key: None,
icon_path: Some(icon_path),
confirm: Some(confirm_completion_callback(
rule.title,
@@ -233,6 +239,8 @@ impl ContextPickerCompletionProvider {
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(completion_icon_path),
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
file_name,
@@ -284,6 +292,8 @@ impl ContextPickerCompletionProvider {
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(icon_path),
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
symbol.name.into(),
@@ -316,6 +326,8 @@ impl ContextPickerCompletionProvider {
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(icon_path),
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
url_to_fetch.to_string().into(),
@@ -384,6 +396,8 @@ impl ContextPickerCompletionProvider {
icon_path: Some(action.icon().path().into()),
documentation: None,
source: project::CompletionSource::Custom,
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
// This ensures that when a user accepts this completion, the
// completion menu will still be shown after "@category " is
@@ -694,14 +708,18 @@ fn build_symbol_label(symbol_name: &str, file_name: &str, line: u32, cx: &App) -
}
fn build_code_label_for_full_path(file_name: &str, directory: Option<&str>, cx: &App) -> CodeLabel {
let comment_id = cx.theme().syntax().highlight_id("comment").map(HighlightId);
let path = cx
.theme()
.syntax()
.highlight_id("variable")
.map(HighlightId);
let mut label = CodeLabelBuilder::default();
label.push_str(file_name, None);
label.push_str(" ", None);
if let Some(directory) = directory {
label.push_str(directory, comment_id);
label.push_str(directory, path);
}
label.build()
@@ -770,6 +788,8 @@ impl CompletionProvider for ContextPickerCompletionProvider {
)),
source: project::CompletionSource::Custom,
icon_path: None,
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(Arc::new({
let editor = editor.clone();


@@ -15,6 +15,7 @@ use editor::{
EditorEvent, EditorMode, EditorSnapshot, EditorStyle, ExcerptId, FoldPlaceholder, Inlay,
MultiBuffer, ToOffset,
actions::Paste,
code_context_menus::CodeContextMenu,
display_map::{Crease, CreaseId, FoldId},
scroll::Autoscroll,
};
@@ -272,6 +273,15 @@ impl MessageEditor {
self.editor.read(cx).is_empty(cx)
}
pub fn is_completions_menu_visible(&self, cx: &App) -> bool {
self.editor
.read(cx)
.context_menu()
.borrow()
.as_ref()
.is_some_and(|menu| matches!(menu, CodeContextMenu::Completions(_)) && menu.visible())
}
pub fn mentions(&self) -> HashSet<MentionUri> {
self.mention_set
.mentions
@@ -836,6 +846,45 @@ impl MessageEditor {
cx.emit(MessageEditorEvent::Send)
}
pub fn trigger_completion_menu(&mut self, window: &mut Window, cx: &mut Context<Self>) {
let editor = self.editor.clone();
cx.spawn_in(window, async move |_, cx| {
editor
.update_in(cx, |editor, window, cx| {
let menu_is_open =
editor.context_menu().borrow().as_ref().is_some_and(|menu| {
matches!(menu, CodeContextMenu::Completions(_)) && menu.visible()
});
let has_at_sign = {
let snapshot = editor.display_snapshot(cx);
let cursor = editor.selections.newest::<text::Point>(&snapshot).head();
let offset = cursor.to_offset(&snapshot);
if offset > 0 {
snapshot
.buffer_snapshot()
.reversed_chars_at(offset)
.next()
.map(|sign| sign == '@')
.unwrap_or(false)
} else {
false
}
};
if menu_is_open && has_at_sign {
return;
}
editor.insert("@", window, cx);
editor.show_completions(&editor::actions::ShowCompletions, window, cx);
})
.log_err();
})
.detach();
}
fn chat(&mut self, _: &Chat, _: &mut Window, cx: &mut Context<Self>) {
self.send(cx);
}
@@ -1195,6 +1244,17 @@ impl MessageEditor {
self.editor.read(cx).text(cx)
}
pub fn set_placeholder_text(
&mut self,
placeholder: &str,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.editor.update(cx, |editor, cx| {
editor.set_placeholder_text(placeholder, window, cx);
});
}
#[cfg(test)]
pub fn set_text(&mut self, text: &str, window: &mut Window, cx: &mut Context<Self>) {
self.editor.update(cx, |editor, cx| {


@@ -457,25 +457,23 @@ impl Render for AcpThreadHistory {
.on_action(cx.listener(Self::select_last))
.on_action(cx.listener(Self::confirm))
.on_action(cx.listener(Self::remove_selected_thread))
.when(!self.history_store.read(cx).is_empty(cx), |parent| {
parent.child(
h_flex()
.h(px(41.)) // Match the toolbar perfectly
.w_full()
.py_1()
.px_2()
.gap_2()
.justify_between()
.border_b_1()
.border_color(cx.theme().colors().border)
.child(
Icon::new(IconName::MagnifyingGlass)
.color(Color::Muted)
.size(IconSize::Small),
)
.child(self.search_editor.clone()),
)
})
.child(
h_flex()
.h(px(41.)) // Match the toolbar perfectly
.w_full()
.py_1()
.px_2()
.gap_2()
.justify_between()
.border_b_1()
.border_color(cx.theme().colors().border)
.child(
Icon::new(IconName::MagnifyingGlass)
.color(Color::Muted)
.size(IconSize::Small),
)
.child(self.search_editor.clone()),
)
.child({
let view = v_flex()
.id("list-container")
@@ -484,19 +482,15 @@ impl Render for AcpThreadHistory {
.flex_grow();
if self.history_store.read(cx).is_empty(cx) {
view.justify_center()
.child(
h_flex().w_full().justify_center().child(
Label::new("You don't have any past threads yet.")
.size(LabelSize::Small),
),
)
} else if self.search_produced_no_matches() {
view.justify_center().child(
h_flex().w_full().justify_center().child(
Label::new("No threads match your search.").size(LabelSize::Small),
),
view.justify_center().items_center().child(
Label::new("You don't have any past threads yet.")
.size(LabelSize::Small)
.color(Color::Muted),
)
} else if self.search_produced_no_matches() {
view.justify_center()
.items_center()
.child(Label::new("No threads match your search.").size(LabelSize::Small))
} else {
view.child(
uniform_list(
@@ -673,7 +667,7 @@ impl EntryTimeFormat {
timezone,
time_format::TimestampFormat::EnhancedAbsolute,
),
EntryTimeFormat::TimeOnly => time_format::format_time(timestamp),
EntryTimeFormat::TimeOnly => time_format::format_time(timestamp.to_offset(timezone)),
}
}
}


@@ -125,8 +125,9 @@ impl ProfileProvider for Entity<agent::Thread> {
}
fn set_profile(&self, profile_id: AgentProfileId, cx: &mut App) {
self.update(cx, |thread, _cx| {
thread.set_profile(profile_id);
self.update(cx, |thread, cx| {
// Apply the profile and let the thread swap to its default model.
thread.set_profile(profile_id, cx);
});
}
@@ -336,19 +337,7 @@ impl AcpThreadView {
let prompt_capabilities = Rc::new(RefCell::new(acp::PromptCapabilities::default()));
let available_commands = Rc::new(RefCell::new(vec![]));
let placeholder = if agent.name() == "Zed Agent" {
format!("Message the {} — @ to include context", agent.name())
} else if agent.name() == "Claude Code"
|| agent.name() == "Codex"
|| !available_commands.borrow().is_empty()
{
format!(
"Message {} — @ to include context, / for commands",
agent.name()
)
} else {
format!("Message {} — @ to include context", agent.name())
};
let placeholder = placeholder_text(agent.name().as_ref(), false);
let message_editor = cx.new(|cx| {
let mut editor = MessageEditor::new(
@@ -1455,7 +1444,14 @@ impl AcpThreadView {
});
}
let has_commands = !available_commands.is_empty();
self.available_commands.replace(available_commands);
let new_placeholder = placeholder_text(self.agent.name().as_ref(), has_commands);
self.message_editor.update(cx, |editor, cx| {
editor.set_placeholder_text(&new_placeholder, window, cx);
});
}
AcpThreadEvent::ModeUpdated(_mode) => {
// The connection keeps track of the mode
@@ -4192,6 +4188,8 @@ impl AcpThreadView {
.justify_between()
.child(
h_flex()
.gap_0p5()
.child(self.render_add_context_button(cx))
.child(self.render_follow_toggle(cx))
.children(self.render_burn_mode_toggle(cx)),
)
@@ -4506,6 +4504,29 @@ impl AcpThreadView {
}))
}
fn render_add_context_button(&self, cx: &mut Context<Self>) -> impl IntoElement {
let message_editor = self.message_editor.clone();
let menu_visible = message_editor.read(cx).is_completions_menu_visible(cx);
IconButton::new("add-context", IconName::AtSign)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.when(!menu_visible, |this| {
this.tooltip(move |_window, cx| {
Tooltip::with_meta("Add Context", None, "Or type @ to include context", cx)
})
})
.on_click(cx.listener(move |_this, _, window, cx| {
let message_editor_clone = message_editor.clone();
window.defer(cx, move |window, cx| {
message_editor_clone.update(cx, |message_editor, cx| {
message_editor.trigger_completion_menu(window, cx);
});
});
}))
}
fn render_markdown(&self, markdown: Entity<Markdown>, style: MarkdownStyle) -> MarkdownElement {
let workspace = self.workspace.clone();
MarkdownElement::new(markdown, style).on_url_click(move |text, window, cx| {
@@ -5707,6 +5728,19 @@ fn loading_contents_spinner(size: IconSize) -> AnyElement {
.into_any_element()
}
fn placeholder_text(agent_name: &str, has_commands: bool) -> String {
if agent_name == "Zed Agent" {
format!("Message the {} — @ to include context", agent_name)
} else if has_commands {
format!(
"Message {} — @ to include context, / for commands",
agent_name
)
} else {
format!("Message {} — @ to include context", agent_name)
}
}
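Extracting the placeholder into a free function makes the three-way branch easy to exercise in isolation. A minimal sketch in plain Rust (no gpui types), mirroring the cases above — the built-in Zed Agent, agents that advertise slash commands, and everything else:

```rust
/// Mirrors `placeholder_text`: the built-in Zed Agent gets a fixed hint,
/// agents with slash commands mention "/", and all others get the default.
fn placeholder_text(agent_name: &str, has_commands: bool) -> String {
    if agent_name == "Zed Agent" {
        format!("Message the {agent_name} — @ to include context")
    } else if has_commands {
        format!("Message {agent_name} — @ to include context, / for commands")
    } else {
        format!("Message {agent_name} — @ to include context")
    }
}
```

Note that the old inline version hard-coded "Claude Code" and "Codex"; deriving the hint from `has_commands` lets the placeholder update when `AvailableCommandsUpdated` fires.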
impl Focusable for AcpThreadView {
fn focus_handle(&self, cx: &App) -> FocusHandle {
match self.thread_state {

View File

@@ -8,6 +8,7 @@ use std::{ops::Range, sync::Arc};
use agent::ContextServerRegistry;
use anyhow::Result;
use client::zed_urls;
use cloud_llm_client::{Plan, PlanV1, PlanV2};
use collections::HashMap;
use context_server::ContextServerId;
@@ -26,18 +27,20 @@ use language_model::{
use language_models::AllLanguageModelSettings;
use notifications::status_toast::{StatusToast, ToastIcon};
use project::{
agent_server_store::{AgentServerStore, CLAUDE_CODE_NAME, CODEX_NAME, GEMINI_NAME},
agent_server_store::{
AgentServerStore, CLAUDE_CODE_NAME, CODEX_NAME, ExternalAgentServerName, GEMINI_NAME,
},
context_server_store::{ContextServerConfiguration, ContextServerStatus, ContextServerStore},
};
use settings::{Settings, SettingsStore, update_settings_file};
use ui::{
Button, ButtonStyle, Chip, CommonAnimationExt, ContextMenu, Disclosure, Divider, DividerColor,
ElevationIndex, IconName, IconPosition, IconSize, Indicator, LabelSize, PopoverMenu, Switch,
SwitchColor, Tooltip, WithScrollbar, prelude::*,
Button, ButtonStyle, Chip, CommonAnimationExt, ContextMenu, ContextMenuEntry, Disclosure,
Divider, DividerColor, ElevationIndex, IconName, IconPosition, IconSize, Indicator, LabelSize,
PopoverMenu, Switch, SwitchColor, Tooltip, WithScrollbar, prelude::*,
};
use util::ResultExt as _;
use workspace::{Workspace, create_and_open_local_file};
use zed_actions::ExtensionCategoryFilter;
use zed_actions::{ExtensionCategoryFilter, OpenBrowser};
pub(crate) use configure_context_server_modal::ConfigureContextServerModal;
pub(crate) use configure_context_server_tools_modal::ConfigureContextServerToolsModal;
@@ -415,6 +418,7 @@ impl AgentConfiguration {
cx: &mut Context<Self>,
) -> impl IntoElement {
let providers = LanguageModelRegistry::read_global(cx).providers();
let popover_menu = PopoverMenu::new("add-provider-popover")
.trigger(
Button::new("add-provider", "Add Provider")
@@ -425,7 +429,6 @@ impl AgentConfiguration {
.icon_color(Color::Muted)
.label_size(LabelSize::Small),
)
.anchor(gpui::Corner::TopRight)
.menu({
let workspace = self.workspace.clone();
move |window, cx| {
@@ -447,6 +450,11 @@ impl AgentConfiguration {
})
}))
}
})
.anchor(gpui::Corner::TopRight)
.offset(gpui::Point {
x: px(0.0),
y: px(2.0),
});
v_flex()
@@ -541,7 +549,6 @@ impl AgentConfiguration {
.icon_color(Color::Muted)
.label_size(LabelSize::Small),
)
.anchor(gpui::Corner::TopRight)
.menu({
move |window, cx| {
Some(ContextMenu::build(window, cx, |menu, _window, _cx| {
@@ -564,6 +571,11 @@ impl AgentConfiguration {
})
}))
}
})
.anchor(gpui::Corner::TopRight)
.offset(gpui::Point {
x: px(0.0),
y: px(2.0),
});
v_flex()
@@ -638,15 +650,13 @@ impl AgentConfiguration {
let is_running = matches!(server_status, ContextServerStatus::Running);
let item_id = SharedString::from(context_server_id.0.clone());
let is_from_extension = server_configuration
.as_ref()
.map(|config| {
matches!(
config.as_ref(),
ContextServerConfiguration::Extension { .. }
)
})
.unwrap_or(false);
// Servers without a configuration can only be provided by extensions.
let provided_by_extension = server_configuration.is_none_or(|config| {
matches!(
config.as_ref(),
ContextServerConfiguration::Extension { .. }
)
});
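The `is_none_or` rewrite encodes "a missing configuration implies an extension-provided server" in one expression. A standalone sketch with a stand-in enum (the real `ContextServerConfiguration` lives in Zed's `project` crate):

```rust
// Stand-in for project::ContextServerConfiguration.
enum ContextServerConfiguration {
    Extension,
    Custom,
}

// Servers without a configuration can only be provided by extensions,
// so `None` maps to `true`; otherwise inspect the variant.
fn provided_by_extension(config: Option<&ContextServerConfiguration>) -> bool {
    config.is_none_or(|config| matches!(config, ContextServerConfiguration::Extension))
}
```

`Option::is_none_or` (stable since Rust 1.82) replaces the older `map(..).unwrap_or(false)` pattern, which defaulted `None` to `false` — the opposite of what this call site wants.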
let error = if let ContextServerStatus::Error(error) = server_status.clone() {
Some(error)
@@ -660,7 +670,7 @@ impl AgentConfiguration {
.tools_for_server(&context_server_id)
.count();
let (source_icon, source_tooltip) = if is_from_extension {
let (source_icon, source_tooltip) = if provided_by_extension {
(
IconName::ZedSrcExtension,
"This MCP server was installed from an extension.",
@@ -710,7 +720,6 @@ impl AgentConfiguration {
let fs = self.fs.clone();
let context_server_id = context_server_id.clone();
let language_registry = self.language_registry.clone();
let context_server_store = self.context_server_store.clone();
let workspace = self.workspace.clone();
let context_server_registry = self.context_server_registry.clone();
@@ -752,23 +761,10 @@ impl AgentConfiguration {
.entry("Uninstall", None, {
let fs = fs.clone();
let context_server_id = context_server_id.clone();
let context_server_store = context_server_store.clone();
let workspace = workspace.clone();
move |_, cx| {
let is_provided_by_extension = context_server_store
.read(cx)
.configuration_for_server(&context_server_id)
.as_ref()
.map(|config| {
matches!(
config.as_ref(),
ContextServerConfiguration::Extension { .. }
)
})
.unwrap_or(false);
let uninstall_extension_task = match (
is_provided_by_extension,
provided_by_extension,
resolve_extension_for_context_server(&context_server_id, cx),
) {
(true, Some((id, manifest))) => {
@@ -959,7 +955,7 @@ impl AgentConfiguration {
.cloned()
.collect::<Vec<_>>();
let user_defined_agents = user_defined_agents
let user_defined_agents: Vec<_> = user_defined_agents
.into_iter()
.map(|name| {
let icon = if let Some(icon_path) = agent_server_store.agent_icon(&name) {
@@ -967,27 +963,93 @@ impl AgentConfiguration {
} else {
AgentIcon::Name(IconName::Ai)
};
self.render_agent_server(icon, name, true)
.into_any_element()
(name, icon)
})
.collect::<Vec<_>>();
.collect();
let add_agens_button = Button::new("add-agent", "Add Agent")
.style(ButtonStyle::Outlined)
.icon_position(IconPosition::Start)
.icon(IconName::Plus)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.label_size(LabelSize::Small)
.on_click(move |_, window, cx| {
if let Some(workspace) = window.root().flatten() {
let workspace = workspace.downgrade();
window
.spawn(cx, async |cx| {
open_new_agent_servers_entry_in_settings_editor(workspace, cx).await
let add_agent_popover = PopoverMenu::new("add-agent-server-popover")
.trigger(
Button::new("add-agent", "Add Agent")
.style(ButtonStyle::Outlined)
.icon_position(IconPosition::Start)
.icon(IconName::Plus)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.label_size(LabelSize::Small),
)
.menu({
move |window, cx| {
Some(ContextMenu::build(window, cx, |menu, _window, _cx| {
menu.entry("Install from Extensions", None, {
|window, cx| {
window.dispatch_action(
zed_actions::Extensions {
category_filter: Some(
ExtensionCategoryFilter::AgentServers,
),
id: None,
}
.boxed_clone(),
cx,
)
}
})
.detach_and_log_err(cx);
.entry("Add Custom Agent", None, {
move |window, cx| {
if let Some(workspace) = window.root().flatten() {
let workspace = workspace.downgrade();
window
.spawn(cx, async |cx| {
open_new_agent_servers_entry_in_settings_editor(
workspace, cx,
)
.await
})
.detach_and_log_err(cx);
}
}
})
.separator()
.header("Learn More")
.item(
ContextMenuEntry::new("Agent Servers Docs")
.icon(IconName::ArrowUpRight)
.icon_color(Color::Muted)
.icon_position(IconPosition::End)
.handler({
move |window, cx| {
window.dispatch_action(
Box::new(OpenBrowser {
url: zed_urls::agent_server_docs(cx),
}),
cx,
);
}
}),
)
.item(
ContextMenuEntry::new("ACP Docs")
.icon(IconName::ArrowUpRight)
.icon_color(Color::Muted)
.icon_position(IconPosition::End)
.handler({
move |window, cx| {
window.dispatch_action(
Box::new(OpenBrowser {
url: "https://agentclientprotocol.com/".into(),
}),
cx,
);
}
}),
)
}))
}
})
.anchor(gpui::Corner::TopRight)
.offset(gpui::Point {
x: px(0.0),
y: px(2.0),
});
v_flex()
@@ -998,7 +1060,7 @@ impl AgentConfiguration {
.child(self.render_section_title(
"External Agents",
"All agents connected through the Agent Client Protocol.",
add_agens_button.into_any_element(),
add_agent_popover.into_any_element(),
))
.child(
v_flex()
@@ -1009,26 +1071,29 @@ impl AgentConfiguration {
AgentIcon::Name(IconName::AiClaude),
"Claude Code",
false,
cx,
))
.child(Divider::horizontal().color(DividerColor::BorderFaded))
.child(self.render_agent_server(
AgentIcon::Name(IconName::AiOpenAi),
"Codex CLI",
false,
cx,
))
.child(Divider::horizontal().color(DividerColor::BorderFaded))
.child(self.render_agent_server(
AgentIcon::Name(IconName::AiGemini),
"Gemini CLI",
false,
cx,
))
.map(|mut parent| {
for agent in user_defined_agents {
for (name, icon) in user_defined_agents {
parent = parent
.child(
Divider::horizontal().color(DividerColor::BorderFaded),
)
.child(agent);
.child(self.render_agent_server(icon, name, true, cx));
}
parent
}),
@@ -1041,6 +1106,7 @@ impl AgentConfiguration {
icon: AgentIcon,
name: impl Into<SharedString>,
external: bool,
cx: &mut Context<Self>,
) -> impl IntoElement {
let name = name.into();
let icon = match icon {
@@ -1055,28 +1121,53 @@ impl AgentConfiguration {
let tooltip_id = SharedString::new(format!("agent-source-{}", name));
let tooltip_message = format!("The {} agent was installed from an extension.", name);
let agent_server_name = ExternalAgentServerName(name.clone());
let uninstall_btn_id = SharedString::from(format!("uninstall-{}", name));
let uninstall_button = IconButton::new(uninstall_btn_id, IconName::Trash)
.icon_color(Color::Muted)
.icon_size(IconSize::Small)
.tooltip(Tooltip::text("Uninstall Agent Extension"))
.on_click(cx.listener(move |this, _, _window, cx| {
let agent_name = agent_server_name.clone();
if let Some(ext_id) = this.agent_server_store.update(cx, |store, _cx| {
store.get_extension_id_for_agent(&agent_name)
}) {
ExtensionStore::global(cx)
.update(cx, |store, cx| store.uninstall_extension(ext_id, cx))
.detach_and_log_err(cx);
}
}));
h_flex()
.gap_1p5()
.child(icon)
.child(Label::new(name))
.when(external, |this| {
this.child(
div()
.id(tooltip_id)
.flex_none()
.tooltip(Tooltip::text(tooltip_message))
.child(
Icon::new(IconName::ZedSrcExtension)
.size(IconSize::Small)
.color(Color::Muted),
),
)
})
.gap_1()
.justify_between()
.child(
Icon::new(IconName::Check)
.color(Color::Success)
.size(IconSize::Small),
h_flex()
.gap_1p5()
.child(icon)
.child(Label::new(name))
.when(external, |this| {
this.child(
div()
.id(tooltip_id)
.flex_none()
.tooltip(Tooltip::text(tooltip_message))
.child(
Icon::new(IconName::ZedSrcExtension)
.size(IconSize::Small)
.color(Color::Muted),
),
)
})
.child(
Icon::new(IconName::Check)
.color(Color::Success)
.size(IconSize::Small),
),
)
.when(external, |this| this.child(uninstall_button))
}
}

View File

@@ -7,8 +7,10 @@ use agent_settings::{AgentProfile, AgentProfileId, AgentSettings, builtin_profil
use editor::Editor;
use fs::Fs;
use gpui::{DismissEvent, Entity, EventEmitter, FocusHandle, Focusable, Subscription, prelude::*};
use language_model::LanguageModel;
use settings::Settings as _;
use language_model::{LanguageModel, LanguageModelRegistry};
use settings::{
LanguageModelProviderSetting, LanguageModelSelection, Settings as _, update_settings_file,
};
use ui::{
KeyBinding, ListItem, ListItemSpacing, ListSeparator, Navigable, NavigableEntry, prelude::*,
};
@@ -16,6 +18,7 @@ use workspace::{ModalView, Workspace};
use crate::agent_configuration::manage_profiles_modal::profile_modal_header::ProfileModalHeader;
use crate::agent_configuration::tool_picker::{ToolPicker, ToolPickerDelegate};
use crate::language_model_selector::{LanguageModelSelector, language_model_selector};
use crate::{AgentPanel, ManageProfiles};
enum Mode {
@@ -32,6 +35,11 @@ enum Mode {
tool_picker: Entity<ToolPicker>,
_subscription: Subscription,
},
ConfigureDefaultModel {
profile_id: AgentProfileId,
model_picker: Entity<LanguageModelSelector>,
_subscription: Subscription,
},
}
impl Mode {
@@ -83,6 +91,7 @@ pub struct ChooseProfileMode {
pub struct ViewProfileMode {
profile_id: AgentProfileId,
fork_profile: NavigableEntry,
configure_default_model: NavigableEntry,
configure_tools: NavigableEntry,
configure_mcps: NavigableEntry,
cancel_item: NavigableEntry,
@@ -180,6 +189,7 @@ impl ManageProfilesModal {
self.mode = Mode::ViewProfile(ViewProfileMode {
profile_id,
fork_profile: NavigableEntry::focusable(cx),
configure_default_model: NavigableEntry::focusable(cx),
configure_tools: NavigableEntry::focusable(cx),
configure_mcps: NavigableEntry::focusable(cx),
cancel_item: NavigableEntry::focusable(cx),
@@ -187,6 +197,83 @@ impl ManageProfilesModal {
self.focus_handle(cx).focus(window);
}
fn configure_default_model(
&mut self,
profile_id: AgentProfileId,
window: &mut Window,
cx: &mut Context<Self>,
) {
let fs = self.fs.clone();
let profile_id_for_closure = profile_id.clone();
let model_picker = cx.new(|cx| {
let fs = fs.clone();
let profile_id = profile_id_for_closure.clone();
language_model_selector(
{
let profile_id = profile_id.clone();
move |cx| {
let settings = AgentSettings::get_global(cx);
settings
.profiles
.get(&profile_id)
.and_then(|profile| profile.default_model.as_ref())
.and_then(|selection| {
let registry = LanguageModelRegistry::read_global(cx);
let provider_id = language_model::LanguageModelProviderId(
gpui::SharedString::from(selection.provider.0.clone()),
);
let provider = registry.provider(&provider_id)?;
let model = provider
.provided_models(cx)
.iter()
.find(|m| m.id().0 == selection.model.as_str())?
.clone();
Some(language_model::ConfiguredModel { provider, model })
})
}
},
move |model, cx| {
let provider = model.provider_id().0.to_string();
let model_id = model.id().0.to_string();
let profile_id = profile_id.clone();
update_settings_file(fs.clone(), cx, move |settings, _cx| {
let agent_settings = settings.agent.get_or_insert_default();
if let Some(profiles) = agent_settings.profiles.as_mut() {
if let Some(profile) = profiles.get_mut(profile_id.0.as_ref()) {
profile.default_model = Some(LanguageModelSelection {
provider: LanguageModelProviderSetting(provider.clone()),
model: model_id.clone(),
});
}
}
});
},
false, // Do not use popover styles for the model picker
window,
cx,
)
.modal(false)
});
let dismiss_subscription = cx.subscribe_in(&model_picker, window, {
let profile_id = profile_id.clone();
move |this, _picker, _: &DismissEvent, window, cx| {
this.view_profile(profile_id.clone(), window, cx);
}
});
self.mode = Mode::ConfigureDefaultModel {
profile_id,
model_picker,
_subscription: dismiss_subscription,
};
self.focus_handle(cx).focus(window);
}
fn configure_mcp_tools(
&mut self,
profile_id: AgentProfileId,
@@ -277,6 +364,7 @@ impl ManageProfilesModal {
Mode::ViewProfile(_) => {}
Mode::ConfigureTools { .. } => {}
Mode::ConfigureMcps { .. } => {}
Mode::ConfigureDefaultModel { .. } => {}
}
}
@@ -299,6 +387,9 @@ impl ManageProfilesModal {
Mode::ConfigureMcps { profile_id, .. } => {
self.view_profile(profile_id.clone(), window, cx)
}
Mode::ConfigureDefaultModel { profile_id, .. } => {
self.view_profile(profile_id.clone(), window, cx)
}
}
}
}
@@ -313,6 +404,7 @@ impl Focusable for ManageProfilesModal {
Mode::ViewProfile(_) => self.focus_handle.clone(),
Mode::ConfigureTools { tool_picker, .. } => tool_picker.focus_handle(cx),
Mode::ConfigureMcps { tool_picker, .. } => tool_picker.focus_handle(cx),
Mode::ConfigureDefaultModel { model_picker, .. } => model_picker.focus_handle(cx),
}
}
}
@@ -544,6 +636,47 @@ impl ManageProfilesModal {
}),
),
)
.child(
div()
.id("configure-default-model")
.track_focus(&mode.configure_default_model.focus_handle)
.on_action({
let profile_id = mode.profile_id.clone();
cx.listener(move |this, _: &menu::Confirm, window, cx| {
this.configure_default_model(
profile_id.clone(),
window,
cx,
);
})
})
.child(
ListItem::new("model-item")
.toggle_state(
mode.configure_default_model
.focus_handle
.contains_focused(window, cx),
)
.inset(true)
.spacing(ListItemSpacing::Sparse)
.start_slot(
Icon::new(IconName::ZedAssistant)
.size(IconSize::Small)
.color(Color::Muted),
)
.child(Label::new("Configure Default Model"))
.on_click({
let profile_id = mode.profile_id.clone();
cx.listener(move |this, _, window, cx| {
this.configure_default_model(
profile_id.clone(),
window,
cx,
);
})
}),
),
)
.child(
div()
.id("configure-builtin-tools")
@@ -668,6 +801,7 @@ impl ManageProfilesModal {
.into_any_element(),
)
.entry(mode.fork_profile)
.entry(mode.configure_default_model)
.entry(mode.configure_tools)
.entry(mode.configure_mcps)
.entry(mode.cancel_item)
@@ -753,6 +887,29 @@ impl Render for ManageProfilesModal {
.child(go_back_item)
.into_any_element()
}
Mode::ConfigureDefaultModel {
profile_id,
model_picker,
..
} => {
let profile_name = settings
.profiles
.get(profile_id)
.map(|profile| profile.name.clone())
.unwrap_or_else(|| "Unknown".into());
v_flex()
.pb_1()
.child(ProfileModalHeader::new(
format!("{profile_name} — Configure Default Model"),
Some(IconName::Ai),
))
.child(ListSeparator)
.child(v_flex().w(rems(34.)).child(model_picker.clone()))
.child(ListSeparator)
.child(go_back_item)
.into_any_element()
}
Mode::ConfigureMcps {
profile_id,
tool_picker,

View File

@@ -314,6 +314,7 @@ impl PickerDelegate for ToolPickerDelegate {
)
})
.collect(),
default_model: default_profile.default_model.clone(),
});
if let Some(server_id) = server_id {

View File

@@ -47,6 +47,7 @@ impl AgentModelSelector {
}
}
},
true, // Use popover styles for picker
window,
cx,
)

View File

@@ -278,6 +278,8 @@ impl ContextPickerCompletionProvider {
icon_path: Some(mode.icon().path().into()),
documentation: None,
source: project::CompletionSource::Custom,
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
// This ensures that when a user accepts this completion, the
// completion menu will still be shown after "@category " is
@@ -386,6 +388,8 @@ impl ContextPickerCompletionProvider {
icon_path: Some(action.icon().path().into()),
documentation: None,
source: project::CompletionSource::Custom,
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
// This ensures that when a user accepts this completion, the
// completion menu will still be shown after "@category " is
@@ -417,6 +421,8 @@ impl ContextPickerCompletionProvider {
replace_range: source_range.clone(),
new_text,
label: CodeLabel::plain(thread_entry.title().to_string(), None),
match_start: None,
snippet_deduplication_key: None,
documentation: None,
insert_text_mode: None,
source: project::CompletionSource::Custom,
@@ -484,6 +490,8 @@ impl ContextPickerCompletionProvider {
replace_range: source_range.clone(),
new_text,
label: CodeLabel::plain(rules.title.to_string(), None),
match_start: None,
snippet_deduplication_key: None,
documentation: None,
insert_text_mode: None,
source: project::CompletionSource::Custom,
@@ -524,6 +532,8 @@ impl ContextPickerCompletionProvider {
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(IconName::ToolWeb.path().into()),
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
IconName::ToolWeb.path().into(),
@@ -612,6 +622,8 @@ impl ContextPickerCompletionProvider {
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(completion_icon_path),
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
crease_icon_path,
@@ -689,6 +701,8 @@ impl ContextPickerCompletionProvider {
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(IconName::Code.path().into()),
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
IconName::Code.path().into(),

View File

@@ -1,6 +1,6 @@
use std::{cmp::Reverse, sync::Arc};
use collections::{HashSet, IndexMap};
use collections::IndexMap;
use fuzzy::{StringMatch, StringMatchCandidate, match_strings};
use gpui::{Action, AnyElement, App, BackgroundExecutor, DismissEvent, Subscription, Task};
use language_model::{
@@ -19,14 +19,26 @@ pub type LanguageModelSelector = Picker<LanguageModelPickerDelegate>;
pub fn language_model_selector(
get_active_model: impl Fn(&App) -> Option<ConfiguredModel> + 'static,
on_model_changed: impl Fn(Arc<dyn LanguageModel>, &mut App) + 'static,
popover_styles: bool,
window: &mut Window,
cx: &mut Context<LanguageModelSelector>,
) -> LanguageModelSelector {
let delegate = LanguageModelPickerDelegate::new(get_active_model, on_model_changed, window, cx);
Picker::list(delegate, window, cx)
.show_scrollbar(true)
.width(rems(20.))
.max_height(Some(rems(20.).into()))
let delegate = LanguageModelPickerDelegate::new(
get_active_model,
on_model_changed,
popover_styles,
window,
cx,
);
if popover_styles {
Picker::list(delegate, window, cx)
.show_scrollbar(true)
.width(rems(20.))
.max_height(Some(rems(20.).into()))
} else {
Picker::list(delegate, window, cx).show_scrollbar(true)
}
}
fn all_models(cx: &App) -> GroupedModels {
@@ -45,7 +57,7 @@ fn all_models(cx: &App) -> GroupedModels {
})
.collect();
let other = providers
let all = providers
.iter()
.flat_map(|provider| {
provider
@@ -58,7 +70,7 @@ fn all_models(cx: &App) -> GroupedModels {
})
.collect();
GroupedModels::new(other, recommended)
GroupedModels::new(all, recommended)
}
#[derive(Clone)]
@@ -75,12 +87,14 @@ pub struct LanguageModelPickerDelegate {
selected_index: usize,
_authenticate_all_providers_task: Task<()>,
_subscriptions: Vec<Subscription>,
popover_styles: bool,
}
impl LanguageModelPickerDelegate {
fn new(
get_active_model: impl Fn(&App) -> Option<ConfiguredModel> + 'static,
on_model_changed: impl Fn(Arc<dyn LanguageModel>, &mut App) + 'static,
popover_styles: bool,
window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Self {
@@ -113,6 +127,7 @@ impl LanguageModelPickerDelegate {
}
},
)],
popover_styles,
}
}
@@ -195,33 +210,24 @@ impl LanguageModelPickerDelegate {
struct GroupedModels {
recommended: Vec<ModelInfo>,
other: IndexMap<LanguageModelProviderId, Vec<ModelInfo>>,
all: IndexMap<LanguageModelProviderId, Vec<ModelInfo>>,
}
impl GroupedModels {
pub fn new(other: Vec<ModelInfo>, recommended: Vec<ModelInfo>) -> Self {
let recommended_ids = recommended
.iter()
.map(|info| (info.model.provider_id(), info.model.id()))
.collect::<HashSet<_>>();
let mut other_by_provider: IndexMap<_, Vec<ModelInfo>> = IndexMap::default();
for model in other {
if recommended_ids.contains(&(model.model.provider_id(), model.model.id())) {
continue;
}
pub fn new(all: Vec<ModelInfo>, recommended: Vec<ModelInfo>) -> Self {
let mut all_by_provider: IndexMap<_, Vec<ModelInfo>> = IndexMap::default();
for model in all {
let provider = model.model.provider_id();
if let Some(models) = other_by_provider.get_mut(&provider) {
if let Some(models) = all_by_provider.get_mut(&provider) {
models.push(model);
} else {
other_by_provider.insert(provider, vec![model]);
all_by_provider.insert(provider, vec![model]);
}
}
Self {
recommended,
other: other_by_provider,
all: all_by_provider,
}
}
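The new grouping keeps every model under its provider rather than filtering out ones that also appear in the recommended list. A std-only sketch of the same loop, assuming `(provider, model)` string pairs (the real code uses `IndexMap` to preserve provider insertion order; `BTreeMap` stands in here):

```rust
use std::collections::BTreeMap;

// Group (provider, model) pairs by provider. Unlike the old code, nothing
// is skipped for already being recommended — duplicates are intentional.
fn group_by_provider(
    models: Vec<(&'static str, &'static str)>,
) -> BTreeMap<&'static str, Vec<&'static str>> {
    let mut by_provider: BTreeMap<&str, Vec<&str>> = BTreeMap::new();
    for (provider, model) in models {
        by_provider.entry(provider).or_default().push(model);
    }
    by_provider
}
```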
@@ -237,7 +243,7 @@ impl GroupedModels {
);
}
for models in self.other.values() {
for models in self.all.values() {
if models.is_empty() {
continue;
}
@@ -252,20 +258,6 @@ impl GroupedModels {
}
entries
}
fn model_infos(&self) -> Vec<ModelInfo> {
let other = self
.other
.values()
.flat_map(|model| model.iter())
.cloned()
.collect::<Vec<_>>();
self.recommended
.iter()
.chain(&other)
.cloned()
.collect::<Vec<_>>()
}
}
enum LanguageModelPickerEntry {
@@ -410,8 +402,9 @@ impl PickerDelegate for LanguageModelPickerDelegate {
.collect::<Vec<_>>();
let available_models = all_models
.model_infos()
.iter()
.all
.values()
.flat_map(|models| models.iter())
.filter(|m| configured_provider_ids.contains(&m.model.provider_id()))
.cloned()
.collect::<Vec<_>>();
@@ -530,6 +523,10 @@ impl PickerDelegate for LanguageModelPickerDelegate {
_window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Option<gpui::AnyElement> {
if !self.popover_styles {
return None;
}
Some(
h_flex()
.w_full()
@@ -745,46 +742,52 @@ mod tests {
}
#[gpui::test]
fn test_exclude_recommended_models(_cx: &mut TestAppContext) {
fn test_recommended_models_also_appear_in_other(_cx: &mut TestAppContext) {
let recommended_models = create_models(vec![("zed", "claude")]);
let all_models = create_models(vec![
("zed", "claude"), // Should be filtered out from "other"
("zed", "claude"), // Should also appear in "other"
("zed", "gemini"),
("copilot", "o3"),
]);
let grouped_models = GroupedModels::new(all_models, recommended_models);
let actual_other_models = grouped_models
.other
let actual_all_models = grouped_models
.all
.values()
.flatten()
.cloned()
.collect::<Vec<_>>();
// Recommended models should not appear in "other"
assert_models_eq(actual_other_models, vec!["zed/gemini", "copilot/o3"]);
// Recommended models should also appear in "all"
assert_models_eq(
actual_all_models,
vec!["zed/claude", "zed/gemini", "copilot/o3"],
);
}
#[gpui::test]
fn test_dont_exclude_models_from_other_providers(_cx: &mut TestAppContext) {
fn test_models_from_different_providers(_cx: &mut TestAppContext) {
let recommended_models = create_models(vec![("zed", "claude")]);
let all_models = create_models(vec![
("zed", "claude"), // Should be filtered out from "other"
("zed", "claude"), // Should also appear in "other"
("zed", "gemini"),
("copilot", "claude"), // Should not be filtered out from "other"
("copilot", "claude"), // Different provider, should appear in "other"
]);
let grouped_models = GroupedModels::new(all_models, recommended_models);
let actual_other_models = grouped_models
.other
let actual_all_models = grouped_models
.all
.values()
.flatten()
.cloned()
.collect::<Vec<_>>();
// Recommended models should not appear in "other"
assert_models_eq(actual_other_models, vec!["zed/gemini", "copilot/claude"]);
// All models should appear in "all" regardless of recommended status
assert_models_eq(
actual_all_models,
vec!["zed/claude", "zed/gemini", "copilot/claude"],
);
}
}

View File

@@ -15,8 +15,8 @@ use std::{
sync::{Arc, atomic::AtomicBool},
};
use ui::{
DocumentationAside, DocumentationEdge, DocumentationSide, HighlightedLabel, LabelSize,
ListItem, ListItemSpacing, PopoverMenuHandle, TintColor, Tooltip, prelude::*,
DocumentationAside, DocumentationEdge, DocumentationSide, HighlightedLabel, KeyBinding,
LabelSize, ListItem, ListItemSpacing, PopoverMenuHandle, TintColor, Tooltip, prelude::*,
};
/// Trait for types that can provide and manage agent profiles
@@ -81,6 +81,7 @@ impl ProfileSelector {
self.provider.clone(),
self.profiles.clone(),
cx.background_executor().clone(),
self.focus_handle.clone(),
cx,
);
@@ -207,6 +208,7 @@ pub(crate) struct ProfilePickerDelegate {
selected_index: usize,
query: String,
cancel: Option<Arc<AtomicBool>>,
focus_handle: FocusHandle,
}
impl ProfilePickerDelegate {
@@ -215,6 +217,7 @@ impl ProfilePickerDelegate {
provider: Arc<dyn ProfileProvider>,
profiles: AvailableProfiles,
background: BackgroundExecutor,
focus_handle: FocusHandle,
cx: &mut Context<ProfileSelector>,
) -> Self {
let candidates = Self::candidates_from(profiles);
@@ -231,6 +234,7 @@ impl ProfilePickerDelegate {
selected_index: 0,
query: String::new(),
cancel: None,
focus_handle,
};
this.selected_index = this
@@ -594,20 +598,26 @@ impl PickerDelegate for ProfilePickerDelegate {
_: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Option<gpui::AnyElement> {
let focus_handle = self.focus_handle.clone();
Some(
h_flex()
.w_full()
.border_t_1()
.border_color(cx.theme().colors().border_variant)
.p_1()
.gap_4()
.justify_between()
.p_1p5()
.child(
Button::new("configure", "Configure")
.icon(IconName::Settings)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.full_width()
.style(ButtonStyle::Outlined)
.key_binding(
KeyBinding::for_action_in(
&ManageProfiles::default(),
&focus_handle,
cx,
)
.map(|kb| kb.size(rems_from_px(12.))),
)
.on_click(|_, window, cx| {
window.dispatch_action(ManageProfiles::default().boxed_clone(), cx);
}),
@@ -659,20 +669,25 @@ mod tests {
is_builtin: true,
}];
let delegate = ProfilePickerDelegate {
fs: FakeFs::new(cx.executor()),
provider: Arc::new(TestProfileProvider::new(AgentProfileId("write".into()))),
background: cx.executor(),
candidates,
string_candidates: Arc::new(Vec::new()),
filtered_entries: Vec::new(),
selected_index: 0,
query: String::new(),
cancel: None,
};
cx.update(|cx| {
let focus_handle = cx.focus_handle();
let matches = Vec::new(); // No matches
let _entries = delegate.entries_from_matches(matches);
let delegate = ProfilePickerDelegate {
fs: FakeFs::new(cx.background_executor().clone()),
provider: Arc::new(TestProfileProvider::new(AgentProfileId("write".into()))),
background: cx.background_executor().clone(),
candidates,
string_candidates: Arc::new(Vec::new()),
filtered_entries: Vec::new(),
selected_index: 0,
query: String::new(),
cancel: None,
focus_handle,
};
let matches = Vec::new(); // No matches
let _entries = delegate.entries_from_matches(matches);
});
}
#[gpui::test]
@@ -690,30 +705,35 @@ mod tests {
},
];
let delegate = ProfilePickerDelegate {
fs: FakeFs::new(cx.executor()),
provider: Arc::new(TestProfileProvider::new(AgentProfileId("write".into()))),
background: cx.executor(),
candidates,
string_candidates: Arc::new(Vec::new()),
filtered_entries: vec![
ProfilePickerEntry::Profile(ProfileMatchEntry {
candidate_index: 0,
positions: Vec::new(),
}),
ProfilePickerEntry::Profile(ProfileMatchEntry {
candidate_index: 1,
positions: Vec::new(),
}),
],
selected_index: 0,
query: String::new(),
cancel: None,
};
cx.update(|cx| {
let focus_handle = cx.focus_handle();
// Active profile should be found at index 0
let active_index = delegate.index_of_profile(&AgentProfileId("write".into()));
assert_eq!(active_index, Some(0));
let delegate = ProfilePickerDelegate {
fs: FakeFs::new(cx.background_executor().clone()),
provider: Arc::new(TestProfileProvider::new(AgentProfileId("write".into()))),
background: cx.background_executor().clone(),
candidates,
string_candidates: Arc::new(Vec::new()),
filtered_entries: vec![
ProfilePickerEntry::Profile(ProfileMatchEntry {
candidate_index: 0,
positions: Vec::new(),
}),
ProfilePickerEntry::Profile(ProfileMatchEntry {
candidate_index: 1,
positions: Vec::new(),
}),
],
selected_index: 0,
query: String::new(),
cancel: None,
focus_handle,
};
// Active profile should be found at index 0
let active_index = delegate.index_of_profile(&AgentProfileId("write".into()));
assert_eq!(active_index, Some(0));
});
}
struct TestProfileProvider {

View File

@@ -127,6 +127,8 @@ impl SlashCommandCompletionProvider {
new_text,
label: command.label(cx),
icon_path: None,
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm,
source: CompletionSource::Custom,
@@ -232,6 +234,8 @@ impl SlashCommandCompletionProvider {
icon_path: None,
new_text,
documentation: None,
match_start: None,
snippet_deduplication_key: None,
confirm,
insert_text_mode: None,
source: CompletionSource::Custom,

View File

@@ -314,6 +314,7 @@ impl TextThreadEditor {
)
});
},
true, // Use popover styles for picker
window,
cx,
)
@@ -477,7 +478,7 @@ impl TextThreadEditor {
editor.insert(&format!("/{name}"), window, cx);
if command.accepts_arguments() {
editor.insert(" ", window, cx);
editor.show_completions(&ShowCompletions::default(), window, cx);
editor.show_completions(&ShowCompletions, window, cx);
}
});
});

View File

@@ -33,4 +33,9 @@ workspace.workspace = true
which.workspace = true
[dev-dependencies]
ctor.workspace = true
clock = { workspace = true, features = ["test-support"] }
futures.workspace = true
gpui = { workspace = true, features = ["test-support"] }
parking_lot.workspace = true
zlog.workspace = true

View File

@@ -1,12 +1,11 @@
use anyhow::{Context as _, Result};
use client::{Client, TelemetrySettings};
use db::RELEASE_CHANNEL;
use client::Client;
use db::kvp::KEY_VALUE_STORE;
use gpui::{
App, AppContext as _, AsyncApp, BackgroundExecutor, Context, Entity, Global, SemanticVersion,
Task, Window, actions,
};
use http_client::{AsyncBody, HttpClient, HttpClientWithUrl};
use http_client::{HttpClient, HttpClientWithUrl};
use paths::remote_servers_dir;
use release_channel::{AppCommitSha, ReleaseChannel};
use serde::{Deserialize, Serialize};
@@ -41,22 +40,23 @@ actions!(
]
);
#[derive(Serialize)]
struct UpdateRequestBody {
installation_id: Option<Arc<str>>,
release_channel: Option<&'static str>,
telemetry: bool,
is_staff: Option<bool>,
destination: &'static str,
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum VersionCheckType {
Sha(AppCommitSha),
Semantic(SemanticVersion),
}
#[derive(Clone)]
#[derive(Serialize, Debug)]
pub struct AssetQuery<'a> {
asset: &'a str,
os: &'a str,
arch: &'a str,
metrics_id: Option<&'a str>,
system_id: Option<&'a str>,
is_staff: Option<bool>,
}
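The new `AssetQuery` above derives `Serialize` so it can be flattened into URL query parameters. A minimal std-only sketch of that flattening (the names `to_query` and the trimmed field set are illustrative, not the actual `build_zed_cloud_url_with_query` implementation):

```rust
// Hypothetical sketch: flatten an AssetQuery-like struct into a query string.
// Optional fields are simply omitted when None, mirroring how serde-based
// query serialization skips absent values.
struct AssetQuery<'a> {
    asset: &'a str,
    os: &'a str,
    arch: &'a str,
    metrics_id: Option<&'a str>,
}

fn to_query(q: &AssetQuery) -> String {
    let mut pairs = vec![
        format!("asset={}", q.asset),
        format!("os={}", q.os),
        format!("arch={}", q.arch),
    ];
    if let Some(id) = q.metrics_id {
        pairs.push(format!("metrics_id={id}"));
    }
    pairs.join("&")
}

fn main() {
    let q = AssetQuery { asset: "zed", os: "macos", arch: "aarch64", metrics_id: None };
    let url = format!("/releases/stable/latest/asset?{}", to_query(&q));
    assert_eq!(url, "/releases/stable/latest/asset?asset=zed&os=macos&arch=aarch64");
    println!("{url}");
}
```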
#[derive(Clone, Debug)]
pub enum AutoUpdateStatus {
Idle,
Checking,
@@ -66,6 +66,31 @@ pub enum AutoUpdateStatus {
Errored { error: Arc<anyhow::Error> },
}
impl PartialEq for AutoUpdateStatus {
fn eq(&self, other: &Self) -> bool {
match (self, other) {
(AutoUpdateStatus::Idle, AutoUpdateStatus::Idle) => true,
(AutoUpdateStatus::Checking, AutoUpdateStatus::Checking) => true,
(
AutoUpdateStatus::Downloading { version: v1 },
AutoUpdateStatus::Downloading { version: v2 },
) => v1 == v2,
(
AutoUpdateStatus::Installing { version: v1 },
AutoUpdateStatus::Installing { version: v2 },
) => v1 == v2,
(
AutoUpdateStatus::Updated { version: v1 },
AutoUpdateStatus::Updated { version: v2 },
) => v1 == v2,
(AutoUpdateStatus::Errored { error: e1 }, AutoUpdateStatus::Errored { error: e2 }) => {
e1.to_string() == e2.to_string()
}
_ => false,
}
}
}
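`PartialEq` is implemented by hand here because `Arc<anyhow::Error>` does not implement `PartialEq`, so the `Errored` arm falls back to comparing error messages. A compilable std-only sketch of the same pattern, using `std::io::Error` in place of `anyhow::Error`:

```rust
use std::sync::Arc;

#[derive(Clone, Debug)]
enum Status {
    Idle,
    Errored { error: Arc<std::io::Error> },
}

// Errors are not comparable directly, so equality is defined structurally,
// with the error arm compared via its Display output.
impl PartialEq for Status {
    fn eq(&self, other: &Self) -> bool {
        match (self, other) {
            (Status::Idle, Status::Idle) => true,
            (Status::Errored { error: a }, Status::Errored { error: b }) => {
                a.to_string() == b.to_string()
            }
            _ => false,
        }
    }
}

fn main() {
    let boom = || Arc::new(std::io::Error::new(std::io::ErrorKind::Other, "boom"));
    // Two distinct Arcs compare equal because their messages match.
    assert_eq!(Status::Errored { error: boom() }, Status::Errored { error: boom() });
    assert_ne!(Status::Idle, Status::Errored { error: boom() });
    println!("ok");
}
```

Comparing by `to_string()` is a pragmatic choice: it makes statuses like these usable in `assert_eq!` in tests, at the cost of treating differently-sourced errors with identical messages as equal.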
impl AutoUpdateStatus {
pub fn is_updated(&self) -> bool {
matches!(self, Self::Updated { .. })
@@ -75,13 +100,13 @@ impl AutoUpdateStatus {
pub struct AutoUpdater {
status: AutoUpdateStatus,
current_version: SemanticVersion,
http_client: Arc<HttpClientWithUrl>,
client: Arc<Client>,
pending_poll: Option<Task<Option<()>>>,
quit_subscription: Option<gpui::Subscription>,
}
#[derive(Deserialize, Clone, Debug)]
pub struct JsonRelease {
#[derive(Deserialize, Serialize, Clone, Debug)]
pub struct ReleaseAsset {
pub version: String,
pub url: String,
}
@@ -137,7 +162,7 @@ struct GlobalAutoUpdate(Option<Entity<AutoUpdater>>);
impl Global for GlobalAutoUpdate {}
pub fn init(http_client: Arc<HttpClientWithUrl>, cx: &mut App) {
pub fn init(client: Arc<Client>, cx: &mut App) {
cx.observe_new(|workspace: &mut Workspace, _window, _cx| {
workspace.register_action(|_, action, window, cx| check(action, window, cx));
@@ -149,7 +174,7 @@ pub fn init(http_client: Arc<HttpClientWithUrl>, cx: &mut App) {
let version = release_channel::AppVersion::global(cx);
let auto_updater = cx.new(|cx| {
let updater = AutoUpdater::new(version, http_client, cx);
let updater = AutoUpdater::new(version, client, cx);
let poll_for_updates = ReleaseChannel::try_global(cx)
.map(|channel| channel.poll_for_updates())
@@ -233,7 +258,7 @@ pub fn view_release_notes(_: &ViewReleaseNotes, cx: &mut App) -> Option<()> {
let current_version = auto_updater.current_version;
let release_channel = release_channel.dev_name();
let path = format!("/releases/{release_channel}/{current_version}");
let url = &auto_updater.http_client.build_url(&path);
let url = &auto_updater.client.http_client().build_url(&path);
cx.open_url(url);
}
ReleaseChannel::Nightly => {
@@ -296,11 +321,7 @@ impl AutoUpdater {
cx.default_global::<GlobalAutoUpdate>().0.clone()
}
fn new(
current_version: SemanticVersion,
http_client: Arc<HttpClientWithUrl>,
cx: &mut Context<Self>,
) -> Self {
fn new(current_version: SemanticVersion, client: Arc<Client>, cx: &mut Context<Self>) -> Self {
// On windows, executable files cannot be overwritten while they are
// running, so we must wait to overwrite the application until quitting
// or restarting. When quitting the app, we spawn the auto update helper
@@ -321,7 +342,7 @@ impl AutoUpdater {
Self {
status: AutoUpdateStatus::Idle,
current_version,
http_client,
client,
pending_poll: None,
quit_subscription,
}
@@ -329,8 +350,7 @@ impl AutoUpdater {
pub fn start_polling(&self, cx: &mut Context<Self>) -> Task<Result<()>> {
cx.spawn(async move |this, cx| {
#[cfg(target_os = "windows")]
{
if cfg!(target_os = "windows") {
use util::ResultExt;
cleanup_windows()
@@ -354,7 +374,7 @@ impl AutoUpdater {
cx.notify();
self.pending_poll = Some(cx.spawn(async move |this, cx| {
let result = Self::update(this.upgrade()?, cx.clone()).await;
let result = Self::update(this.upgrade()?, cx).await;
this.update(cx, |this, cx| {
this.pending_poll = None;
if let Err(error) = result {
@@ -400,10 +420,10 @@ impl AutoUpdater {
// you can override this function. You should also update get_remote_server_release_url to return
// Ok(None).
pub async fn download_remote_server_release(
os: &str,
arch: &str,
release_channel: ReleaseChannel,
version: Option<SemanticVersion>,
os: &str,
arch: &str,
set_status: impl Fn(&str, &mut AsyncApp) + Send + 'static,
cx: &mut AsyncApp,
) -> Result<PathBuf> {
@@ -415,13 +435,13 @@ impl AutoUpdater {
})??;
set_status("Fetching remote server release", cx);
let release = Self::get_release(
let release = Self::get_release_asset(
&this,
release_channel,
version,
"zed-remote-server",
os,
arch,
version,
Some(release_channel),
cx,
)
.await?;
@@ -432,7 +452,7 @@ impl AutoUpdater {
let version_path = platform_dir.join(format!("{}.gz", release.version));
smol::fs::create_dir_all(&platform_dir).await.ok();
let client = this.read_with(cx, |this, _| this.http_client.clone())?;
let client = this.read_with(cx, |this, _| this.client.http_client())?;
if smol::fs::metadata(&version_path).await.is_err() {
log::info!(
@@ -440,19 +460,19 @@ impl AutoUpdater {
release.version
);
set_status("Downloading remote server", cx);
download_remote_server_binary(&version_path, release, client, cx).await?;
download_remote_server_binary(&version_path, release, client).await?;
}
Ok(version_path)
}
pub async fn get_remote_server_release_url(
channel: ReleaseChannel,
version: Option<SemanticVersion>,
os: &str,
arch: &str,
release_channel: ReleaseChannel,
version: Option<SemanticVersion>,
cx: &mut AsyncApp,
) -> Result<Option<(String, String)>> {
) -> Result<Option<String>> {
let this = cx.update(|cx| {
cx.default_global::<GlobalAutoUpdate>()
.0
@@ -460,108 +480,99 @@ impl AutoUpdater {
.context("auto-update not initialized")
})??;
let release = Self::get_release(
&this,
"zed-remote-server",
os,
arch,
version,
Some(release_channel),
cx,
)
.await?;
let release =
Self::get_release_asset(&this, channel, version, "zed-remote-server", os, arch, cx)
.await?;
let update_request_body = build_remote_server_update_request_body(cx)?;
let body = serde_json::to_string(&update_request_body)?;
Ok(Some((release.url, body)))
Ok(Some(release.url))
}
async fn get_release(
async fn get_release_asset(
this: &Entity<Self>,
asset: &str,
os: &str,
arch: &str,
release_channel: ReleaseChannel,
version: Option<SemanticVersion>,
release_channel: Option<ReleaseChannel>,
cx: &mut AsyncApp,
) -> Result<JsonRelease> {
let client = this.read_with(cx, |this, _| this.http_client.clone())?;
if let Some(version) = version {
let channel = release_channel.map(|c| c.dev_name()).unwrap_or("stable");
let url = format!("/api/releases/{channel}/{version}/{asset}-{os}-{arch}.gz?update=1",);
Ok(JsonRelease {
version: version.to_string(),
url: client.build_url(&url),
})
} else {
let mut url_string = client.build_url(&format!(
"/api/releases/latest?asset={}&os={}&arch={}",
asset, os, arch
));
if let Some(param) = release_channel.and_then(|c| c.release_query_param()) {
url_string += "&";
url_string += param;
}
let mut response = client.get(&url_string, Default::default(), true).await?;
let mut body = Vec::new();
response.body_mut().read_to_end(&mut body).await?;
anyhow::ensure!(
response.status().is_success(),
"failed to fetch release: {:?}",
String::from_utf8_lossy(&body),
);
serde_json::from_slice(body.as_slice()).with_context(|| {
format!(
"error deserializing release {:?}",
String::from_utf8_lossy(&body),
)
})
}
}
async fn get_latest_release(
this: &Entity<Self>,
asset: &str,
os: &str,
arch: &str,
release_channel: Option<ReleaseChannel>,
cx: &mut AsyncApp,
) -> Result<JsonRelease> {
Self::get_release(this, asset, os, arch, None, release_channel, cx).await
) -> Result<ReleaseAsset> {
let client = this.read_with(cx, |this, _| this.client.clone())?;
let (system_id, metrics_id, is_staff) = if client.telemetry().metrics_enabled() {
(
client.telemetry().system_id(),
client.telemetry().metrics_id(),
client.telemetry().is_staff(),
)
} else {
(None, None, None)
};
let version = if let Some(version) = version {
version.to_string()
} else {
"latest".to_string()
};
let http_client = client.http_client();
let path = format!("/releases/{}/{}/asset", release_channel.dev_name(), version,);
let url = http_client.build_zed_cloud_url_with_query(
&path,
AssetQuery {
os,
arch,
asset,
metrics_id: metrics_id.as_deref(),
system_id: system_id.as_deref(),
is_staff: is_staff,
},
)?;
let mut response = http_client
.get(url.as_str(), Default::default(), true)
.await?;
let mut body = Vec::new();
response.body_mut().read_to_end(&mut body).await?;
anyhow::ensure!(
response.status().is_success(),
"failed to fetch release: {:?}",
String::from_utf8_lossy(&body),
);
serde_json::from_slice(body.as_slice()).with_context(|| {
format!(
"error deserializing release {:?}",
String::from_utf8_lossy(&body),
)
})
}
async fn update(this: Entity<Self>, mut cx: AsyncApp) -> Result<()> {
async fn update(this: Entity<Self>, cx: &mut AsyncApp) -> Result<()> {
let (client, installed_version, previous_status, release_channel) =
this.read_with(&cx, |this, cx| {
this.read_with(cx, |this, cx| {
(
this.http_client.clone(),
this.client.http_client(),
this.current_version,
this.status.clone(),
ReleaseChannel::try_global(cx),
ReleaseChannel::try_global(cx).unwrap_or(ReleaseChannel::Stable),
)
})?;
Self::check_dependencies()?;
this.update(&mut cx, |this, cx| {
this.update(cx, |this, cx| {
this.status = AutoUpdateStatus::Checking;
log::info!("Auto Update: checking for updates");
cx.notify();
})?;
let fetched_release_data =
Self::get_latest_release(&this, "zed", OS, ARCH, release_channel, &mut cx).await?;
Self::get_release_asset(&this, release_channel, None, "zed", OS, ARCH, cx).await?;
let fetched_version = fetched_release_data.clone().version;
let app_commit_sha = cx.update(|cx| AppCommitSha::try_global(cx).map(|sha| sha.full()));
let newer_version = Self::check_if_fetched_version_is_newer(
*RELEASE_CHANNEL,
release_channel,
app_commit_sha,
installed_version,
fetched_version,
@@ -569,7 +580,7 @@ impl AutoUpdater {
)?;
let Some(newer_version) = newer_version else {
return this.update(&mut cx, |this, cx| {
return this.update(cx, |this, cx| {
let status = match previous_status {
AutoUpdateStatus::Updated { .. } => previous_status,
_ => AutoUpdateStatus::Idle,
@@ -579,7 +590,7 @@ impl AutoUpdater {
});
};
this.update(&mut cx, |this, cx| {
this.update(cx, |this, cx| {
this.status = AutoUpdateStatus::Downloading {
version: newer_version.clone(),
};
@@ -588,21 +599,21 @@ impl AutoUpdater {
let installer_dir = InstallerDir::new().await?;
let target_path = Self::target_path(&installer_dir).await?;
download_release(&target_path, fetched_release_data, client, &cx).await?;
download_release(&target_path, fetched_release_data, client).await?;
this.update(&mut cx, |this, cx| {
this.update(cx, |this, cx| {
this.status = AutoUpdateStatus::Installing {
version: newer_version.clone(),
};
cx.notify();
})?;
let new_binary_path = Self::install_release(installer_dir, target_path, &cx).await?;
let new_binary_path = Self::install_release(installer_dir, target_path, cx).await?;
if let Some(new_binary_path) = new_binary_path {
cx.update(|cx| cx.set_restart_path(new_binary_path))?;
}
this.update(&mut cx, |this, cx| {
this.update(cx, |this, cx| {
this.set_should_show_update_notification(true, cx)
.detach_and_log_err(cx);
this.status = AutoUpdateStatus::Updated {
@@ -681,6 +692,12 @@ impl AutoUpdater {
target_path: PathBuf,
cx: &AsyncApp,
) -> Result<Option<PathBuf>> {
#[cfg(test)]
if let Some(test_install) =
cx.try_read_global::<tests::InstallOverride, _>(|g, _| g.0.clone())
{
return test_install(target_path, cx);
}
match OS {
"macos" => install_release_macos(&installer_dir, target_path, cx).await,
"linux" => install_release_linux(&installer_dir, target_path, cx).await,
@@ -731,16 +748,13 @@ impl AutoUpdater {
async fn download_remote_server_binary(
target_path: &PathBuf,
release: JsonRelease,
release: ReleaseAsset,
client: Arc<HttpClientWithUrl>,
cx: &AsyncApp,
) -> Result<()> {
let temp = tempfile::Builder::new().tempfile_in(remote_servers_dir())?;
let mut temp_file = File::create(&temp).await?;
let update_request_body = build_remote_server_update_request_body(cx)?;
let request_body = AsyncBody::from(serde_json::to_string(&update_request_body)?);
let mut response = client.get(&release.url, request_body, true).await?;
let mut response = client.get(&release.url, Default::default(), true).await?;
anyhow::ensure!(
response.status().is_success(),
"failed to download remote server release: {:?}",
@@ -752,65 +766,19 @@ async fn download_remote_server_binary(
Ok(())
}
fn build_remote_server_update_request_body(cx: &AsyncApp) -> Result<UpdateRequestBody> {
let (installation_id, release_channel, telemetry_enabled, is_staff) = cx.update(|cx| {
let telemetry = Client::global(cx).telemetry().clone();
let is_staff = telemetry.is_staff();
let installation_id = telemetry.installation_id();
let release_channel =
ReleaseChannel::try_global(cx).map(|release_channel| release_channel.display_name());
let telemetry_enabled = TelemetrySettings::get_global(cx).metrics;
(
installation_id,
release_channel,
telemetry_enabled,
is_staff,
)
})?;
Ok(UpdateRequestBody {
installation_id,
release_channel,
telemetry: telemetry_enabled,
is_staff,
destination: "remote",
})
}
async fn download_release(
target_path: &Path,
release: JsonRelease,
release: ReleaseAsset,
client: Arc<HttpClientWithUrl>,
cx: &AsyncApp,
) -> Result<()> {
let mut target_file = File::create(&target_path).await?;
let (installation_id, release_channel, telemetry_enabled, is_staff) = cx.update(|cx| {
let telemetry = Client::global(cx).telemetry().clone();
let is_staff = telemetry.is_staff();
let installation_id = telemetry.installation_id();
let release_channel =
ReleaseChannel::try_global(cx).map(|release_channel| release_channel.display_name());
let telemetry_enabled = TelemetrySettings::get_global(cx).metrics;
(
installation_id,
release_channel,
telemetry_enabled,
is_staff,
)
})?;
let request_body = AsyncBody::from(serde_json::to_string(&UpdateRequestBody {
installation_id,
release_channel,
telemetry: telemetry_enabled,
is_staff,
destination: "local",
})?);
let mut response = client.get(&release.url, request_body, true).await?;
let mut response = client.get(&release.url, Default::default(), true).await?;
anyhow::ensure!(
response.status().is_success(),
"failed to download update: {:?}",
response.status()
);
smol::io::copy(response.body_mut(), &mut target_file).await?;
log::info!("downloaded update. path:{:?}", target_path);
@@ -934,28 +902,16 @@ async fn install_release_macos(
Ok(None)
}
#[cfg(target_os = "windows")]
async fn cleanup_windows() -> Result<()> {
use util::ResultExt;
let parent = std::env::current_exe()?
.parent()
.context("No parent dir for Zed.exe")?
.to_owned();
// keep in sync with crates/auto_update_helper/src/updater.rs
smol::fs::remove_dir(parent.join("updates"))
.await
.context("failed to remove updates dir")
.log_err();
smol::fs::remove_dir(parent.join("install"))
.await
.context("failed to remove install dir")
.log_err();
smol::fs::remove_dir(parent.join("old"))
.await
.context("failed to remove old version dir")
.log_err();
_ = smol::fs::remove_dir(parent.join("updates")).await;
_ = smol::fs::remove_dir(parent.join("install")).await;
_ = smol::fs::remove_dir(parent.join("old")).await;
Ok(())
}
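The diff above also switches `start_polling` from an `#[cfg(target_os = "windows")]` block to `if cfg!(target_os = "windows")`. The difference can be sketched as:

```rust
fn main() {
    // cfg! is an ordinary boolean known at compile time; both branches are
    // still type-checked on every platform, unlike #[cfg], which strips the
    // code entirely. That lets rustc verify the cleanup path even in
    // non-Windows builds (dead branches are optimized away).
    let on_windows = cfg!(target_os = "windows");
    if on_windows {
        println!("would remove updates/install/old dirs");
    } else {
        println!("cleanup skipped on this platform");
    }
    assert_eq!(on_windows, !cfg!(not(target_os = "windows")));
}
```

The trade-off is that code inside a `cfg!` branch must compile everywhere, so it cannot call platform-only APIs directly; `cleanup_windows` itself still needs its own `#[cfg]` gate for that reason.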
@@ -1010,11 +966,33 @@ pub async fn finalize_auto_update_on_quit() {
#[cfg(test)]
mod tests {
use client::Client;
use clock::FakeSystemClock;
use futures::channel::oneshot;
use gpui::TestAppContext;
use http_client::{FakeHttpClient, Response};
use settings::default_settings;
use std::{
rc::Rc,
sync::{
Arc,
atomic::{self, AtomicBool},
},
};
use tempfile::tempdir;
#[ctor::ctor]
fn init_logger() {
zlog::init_test();
}
use super::*;
pub(super) struct InstallOverride(
pub Rc<dyn Fn(PathBuf, &AsyncApp) -> Result<Option<PathBuf>>>,
);
impl Global for InstallOverride {}
#[gpui::test]
fn test_auto_update_defaults_to_true(cx: &mut TestAppContext) {
cx.update(|cx| {
@@ -1030,6 +1008,115 @@ mod tests {
});
}
#[gpui::test]
async fn test_auto_update_downloads(cx: &mut TestAppContext) {
cx.background_executor.allow_parking();
zlog::init_test();
let release_available = Arc::new(AtomicBool::new(false));
let (dmg_tx, dmg_rx) = oneshot::channel::<String>();
cx.update(|cx| {
settings::init(cx);
let current_version = SemanticVersion::new(0, 100, 0);
release_channel::init_test(current_version, ReleaseChannel::Stable, cx);
let clock = Arc::new(FakeSystemClock::new());
let release_available = Arc::clone(&release_available);
let dmg_rx = Arc::new(parking_lot::Mutex::new(Some(dmg_rx)));
let fake_client_http = FakeHttpClient::create(move |req| {
let release_available = release_available.load(atomic::Ordering::Relaxed);
let dmg_rx = dmg_rx.clone();
async move {
if req.uri().path() == "/releases/stable/latest/asset" {
if release_available {
return Ok(Response::builder().status(200).body(
r#"{"version":"0.100.1","url":"https://test.example/new-download"}"#.into()
).unwrap());
} else {
return Ok(Response::builder().status(200).body(
r#"{"version":"0.100.0","url":"https://test.example/old-download"}"#.into()
).unwrap());
}
} else if req.uri().path() == "/new-download" {
return Ok(Response::builder().status(200).body({
let dmg_rx = dmg_rx.lock().take().unwrap();
dmg_rx.await.unwrap().into()
}).unwrap());
}
Ok(Response::builder().status(404).body("".into()).unwrap())
}
});
let client = Client::new(clock, fake_client_http, cx);
crate::init(client, cx);
});
let auto_updater = cx.update(|cx| AutoUpdater::get(cx).expect("auto updater should exist"));
cx.background_executor.run_until_parked();
auto_updater.read_with(cx, |updater, _| {
assert_eq!(updater.status(), AutoUpdateStatus::Idle);
assert_eq!(updater.current_version(), SemanticVersion::new(0, 100, 0));
});
release_available.store(true, atomic::Ordering::SeqCst);
cx.background_executor.advance_clock(POLL_INTERVAL);
cx.background_executor.run_until_parked();
loop {
cx.background_executor.timer(Duration::from_millis(0)).await;
cx.run_until_parked();
let status = auto_updater.read_with(cx, |updater, _| updater.status());
if !matches!(status, AutoUpdateStatus::Idle) {
break;
}
}
let status = auto_updater.read_with(cx, |updater, _| updater.status());
assert_eq!(
status,
AutoUpdateStatus::Downloading {
version: VersionCheckType::Semantic(SemanticVersion::new(0, 100, 1))
}
);
dmg_tx.send("<fake-zed-update>".to_owned()).unwrap();
let tmp_dir = Arc::new(tempdir().unwrap());
cx.update(|cx| {
let tmp_dir = tmp_dir.clone();
cx.set_global(InstallOverride(Rc::new(move |target_path, _cx| {
let tmp_dir = tmp_dir.clone();
let dest_path = tmp_dir.path().join("zed");
std::fs::copy(&target_path, &dest_path)?;
Ok(Some(dest_path))
})));
});
loop {
cx.background_executor.timer(Duration::from_millis(0)).await;
cx.run_until_parked();
let status = auto_updater.read_with(cx, |updater, _| updater.status());
if !matches!(status, AutoUpdateStatus::Downloading { .. }) {
break;
}
}
let status = auto_updater.read_with(cx, |updater, _| updater.status());
assert_eq!(
status,
AutoUpdateStatus::Updated {
version: VersionCheckType::Semantic(SemanticVersion::new(0, 100, 1))
}
);
let will_restart = cx.expect_restart();
cx.update(|cx| cx.restart());
let path = will_restart.await.unwrap().unwrap();
assert_eq!(path, tmp_dir.path().join("zed"));
assert_eq!(std::fs::read_to_string(path).unwrap(), "<fake-zed-update>");
}
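The test above gates the fake server's download response on a `oneshot` channel, so the updater observably sits in `Downloading` until the test "publishes" the body. The same gating pattern, sketched with std's `mpsc` instead of `futures::oneshot`:

```rust
// Sketch of the channel-gating pattern used by the test: the "server" thread
// blocks until the test decides to release the download body.
use std::sync::mpsc;
use std::thread;

fn main() {
    let (tx, rx) = mpsc::channel::<String>();
    let server = thread::spawn(move || {
        // Stands in for the fake HTTP handler awaiting dmg_rx.
        rx.recv().unwrap()
    });
    // The test controls exactly when the download completes.
    tx.send("<fake-zed-update>".to_owned()).unwrap();
    assert_eq!(server.join().unwrap(), "<fake-zed-update>");
    println!("download released");
}
```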
#[test]
fn test_stable_does_not_update_when_fetched_version_is_not_higher() {
let release_channel = ReleaseChannel::Stable;

View File

@@ -1,6 +1,6 @@
use std::{
cell::LazyCell,
path::Path,
sync::LazyLock,
time::{Duration, Instant},
};
@@ -13,8 +13,8 @@ use windows::Win32::{
use crate::windows_impl::WM_JOB_UPDATED;
pub(crate) struct Job {
pub apply: Box<dyn Fn(&Path) -> Result<()>>,
pub rollback: Box<dyn Fn(&Path) -> Result<()>>,
pub apply: Box<dyn Fn(&Path) -> Result<()> + Send + Sync>,
pub rollback: Box<dyn Fn(&Path) -> Result<()> + Send + Sync>,
}
impl Job {
@@ -154,10 +154,8 @@ impl Job {
}
}
// app is single threaded
#[cfg(not(test))]
#[allow(clippy::declare_interior_mutable_const)]
pub(crate) const JOBS: LazyCell<[Job; 22]> = LazyCell::new(|| {
pub(crate) static JOBS: LazyLock<[Job; 22]> = LazyLock::new(|| {
fn p(value: &str) -> &Path {
Path::new(value)
}
@@ -206,10 +204,8 @@ pub(crate) const JOBS: LazyCell<[Job; 22]> = LazyCell::new(|| {
]
});
// app is single threaded
#[cfg(test)]
#[allow(clippy::declare_interior_mutable_const)]
pub(crate) const JOBS: LazyCell<[Job; 9]> = LazyCell::new(|| {
pub(crate) static JOBS: LazyLock<[Job; 9]> = LazyLock::new(|| {
fn p(value: &str) -> &Path {
Path::new(value)
}
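The `LazyCell` → `LazyLock` change above matters because `LazyLock` is `Sync`, so it can back a true `static` shared across threads, whereas `LazyCell` is single-threaded (hence the old "app is single threaded" comment and the clippy allow). A minimal sketch:

```rust
use std::sync::LazyLock;

// Initialized at most once, on first access, from any thread.
// The job names here are illustrative stand-ins for the real JOBS array.
static JOBS: LazyLock<Vec<&'static str>> = LazyLock::new(|| {
    vec!["create updates dir", "copy new binary", "remove old binary"]
});

fn main() {
    // Safe to touch from another thread; LazyCell would not compile here.
    let handle = std::thread::spawn(|| JOBS.len());
    assert_eq!(handle.join().unwrap(), 3);
    println!("{} jobs", JOBS.len());
}
```

A `const LazyCell` also re-evaluates its initializer at every use site (interior-mutable `const` pitfall), while a `static LazyLock` guarantees one shared initialization.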

View File

@@ -1487,7 +1487,7 @@ impl Client {
let url = self
.http
.build_zed_cloud_url("/internal/users/impersonate", &[])?;
.build_zed_cloud_url("/internal/users/impersonate")?;
let request = Request::post(url.as_str())
.header("Content-Type", "application/json")
.header("Authorization", format!("Bearer {api_token}"))

View File

@@ -435,7 +435,7 @@ impl Telemetry {
Some(project_types)
}
fn report_event(self: &Arc<Self>, event: Event) {
fn report_event(self: &Arc<Self>, mut event: Event) {
let mut state = self.state.lock();
// RUST_LOG=telemetry=trace to debug telemetry events
log::trace!(target: "telemetry", "{:?}", event);
@@ -444,6 +444,12 @@ impl Telemetry {
return;
}
match &mut event {
Event::Flexible(event) => event
.event_properties
.insert("event_source".into(), "zed".into()),
};
if state.flush_events_task.is_none() {
let this = self.clone();
state.flush_events_task = Some(self.executor.spawn(async move {

View File

@@ -51,3 +51,11 @@ pub fn external_agents_docs(cx: &App) -> String {
server_url = server_url(cx)
)
}
/// Returns the URL to Zed agent servers documentation.
pub fn agent_server_docs(cx: &App) -> String {
format!(
"{server_url}/docs/extensions/agent-servers",
server_url = server_url(cx)
)
}

View File

@@ -62,7 +62,7 @@ impl CloudApiClient {
let request = self.build_request(
Request::builder().method(Method::GET).uri(
self.http_client
.build_zed_cloud_url("/client/users/me", &[])?
.build_zed_cloud_url("/client/users/me")?
.as_ref(),
),
AsyncBody::default(),
@@ -89,7 +89,7 @@ impl CloudApiClient {
pub fn connect(&self, cx: &App) -> Result<Task<Result<Connection>>> {
let mut connect_url = self
.http_client
.build_zed_cloud_url("/client/users/connect", &[])?;
.build_zed_cloud_url("/client/users/connect")?;
connect_url
.set_scheme(match connect_url.scheme() {
"https" => "wss",
@@ -123,7 +123,7 @@ impl CloudApiClient {
.method(Method::POST)
.uri(
self.http_client
.build_zed_cloud_url("/client/llm_tokens", &[])?
.build_zed_cloud_url("/client/llm_tokens")?
.as_ref(),
)
.when_some(system_id, |builder, system_id| {
@@ -154,7 +154,7 @@ impl CloudApiClient {
let request = build_request(
Request::builder().method(Method::GET).uri(
self.http_client
.build_zed_cloud_url("/client/users/me", &[])?
.build_zed_cloud_url("/client/users/me")?
.as_ref(),
),
AsyncBody::default(),

View File

@@ -73,6 +73,7 @@ pub enum PromptFormat {
MarkedExcerpt,
LabeledSections,
NumLinesUniDiff,
OldTextNewText,
/// Prompt format intended for use via zeta_cli
OnlySnippets,
}
@@ -100,6 +101,7 @@ impl std::fmt::Display for PromptFormat {
PromptFormat::LabeledSections => write!(f, "Labeled Sections"),
PromptFormat::OnlySnippets => write!(f, "Only Snippets"),
PromptFormat::NumLinesUniDiff => write!(f, "Numbered Lines / Unified Diff"),
PromptFormat::OldTextNewText => write!(f, "Old Text / New Text"),
}
}
}

View File

@@ -56,50 +56,98 @@ const LABELED_SECTIONS_INSTRUCTIONS: &str = indoc! {r#"
const NUMBERED_LINES_INSTRUCTIONS: &str = indoc! {r#"
# Instructions
You are a code completion assistant helping a programmer finish their work. Your task is to:
You are an edit prediction agent in a code editor.
Your job is to predict the next edit that the user will make,
based on their last few edits and their current cursor location.
1. Analyze the edit history to understand what the programmer is trying to achieve
2. Identify any incomplete refactoring or changes that need to be finished
3. Make the remaining edits that a human programmer would logically make next
4. Apply systematic changes consistently across the entire codebase - if you see a pattern starting, complete it everywhere.
## Output Format
Focus on:
- Understanding the intent behind the changes (e.g., improving error handling, refactoring APIs, fixing bugs)
- Completing any partially-applied changes across the codebase
- Ensuring consistency with the programming style and patterns already established
- Making edits that maintain or improve code quality
- If the programmer started refactoring one instance of a pattern, find and update ALL similar instances
- Don't write a lot of code if you're not sure what to do
Rules:
- Do not just mechanically apply patterns - reason about what changes make sense given the context and the programmer's apparent goals.
- Do not just fix syntax errors - look for the broader refactoring pattern and apply it systematically throughout the code.
- Write the edits in the unified diff format as shown in the example.
# Example output:
You must briefly explain your understanding of the user's goal, in one
or two sentences, and then specify their next edit in the form of a
unified diff, like this:
```
--- a/src/myapp/cli.py
+++ b/src/myapp/cli.py
@@ -1,3 +1,3 @@
-
-
-import sys
+import json
@@ ... @@
import os
import time
import sys
+from constants import LOG_LEVEL_WARNING
@@ ... @@
config.headless()
config.set_interactive(false)
-config.set_log_level(LOG_L)
+config.set_log_level(LOG_LEVEL_WARNING)
config.set_use_color(True)
```
# Edit History:
## Edit History
"#};
const UNIFIED_DIFF_REMINDER: &str = indoc! {"
---
Please analyze the edit history and the files, then provide the unified diff for your predicted edits.
Analyze the edit history and the files, then provide the unified diff for your predicted edits.
Do not include the cursor marker in your output.
If you're editing multiple files, be sure to reflect filename in the hunk's header.
Your diff should include edited file paths in its file headers (lines beginning with `---` and `+++`).
Do not include line numbers in the hunk headers, use `@@ ... @@`.
Removed lines begin with `-`.
Added lines begin with `+`.
Context lines begin with an extra space.
Context and removed lines are used to match the target edit location, so make sure to include enough of them
to uniquely identify it amongst all excerpts of code provided.
"};
const XML_TAGS_INSTRUCTIONS: &str = indoc! {r#"
# Instructions
You are an edit prediction agent in a code editor.
Your job is to predict the next edit that the user will make,
based on their last few edits and their current cursor location.
# Output Format
You must briefly explain your understanding of the user's goal, in one
or two sentences, and then specify their next edit, using the following
XML format:
<edits path="my-project/src/myapp/cli.py">
<old_text>
OLD TEXT 1 HERE
</old_text>
<new_text>
NEW TEXT 1 HERE
</new_text>
<old_text>
OLD TEXT 1 HERE
</old_text>
<new_text>
NEW TEXT 1 HERE
</new_text>
</edits>
- Specify the file to edit using the `path` attribute.
- Use `<old_text>` and `<new_text>` tags to replace content
- `<old_text>` must exactly match existing file content, including indentation
- `<old_text>` cannot be empty
- Do not escape quotes, newlines, or other characters within tags
- Always close all tags properly
- Don't include the <|user_cursor|> marker in your output.
# Edit History:
"#};
const OLD_TEXT_NEW_TEXT_REMINDER: &str = indoc! {r#"
---
Remember that the edits in the edit history have already been deployed.
The files are currently as shown in the Code Excerpts section.
"#};
pub fn build_prompt(
request: &predict_edits_v3::PredictEditsRequest,
) -> Result<(String, SectionLabels)> {
@@ -121,8 +169,9 @@ pub fn build_prompt(
EDITABLE_REGION_END_MARKER_WITH_NEWLINE,
),
],
PromptFormat::LabeledSections => vec![(request.cursor_point, CURSOR_MARKER)],
PromptFormat::NumLinesUniDiff => {
PromptFormat::LabeledSections
| PromptFormat::NumLinesUniDiff
| PromptFormat::OldTextNewText => {
vec![(request.cursor_point, CURSOR_MARKER)]
}
PromptFormat::OnlySnippets => vec![],
@@ -132,46 +181,32 @@ pub fn build_prompt(
PromptFormat::MarkedExcerpt => MARKED_EXCERPT_INSTRUCTIONS.to_string(),
PromptFormat::LabeledSections => LABELED_SECTIONS_INSTRUCTIONS.to_string(),
PromptFormat::NumLinesUniDiff => NUMBERED_LINES_INSTRUCTIONS.to_string(),
// only intended for use via zeta_cli
PromptFormat::OldTextNewText => XML_TAGS_INSTRUCTIONS.to_string(),
PromptFormat::OnlySnippets => String::new(),
};
if request.events.is_empty() {
prompt.push_str("(No edit history)\n\n");
} else {
prompt.push_str(
"The following are the latest edits made by the user, from earlier to later.\n\n",
);
prompt.push_str("Here are the latest edits made by the user, from earlier to later.\n\n");
push_events(&mut prompt, &request.events);
}
prompt.push_str(indoc! {"
# Code Excerpts
The cursor marker <|user_cursor|> indicates the current user cursor position.
The file is in current state, edits from edit history have been applied.
"});
if request.prompt_format == PromptFormat::NumLinesUniDiff {
if request.referenced_declarations.is_empty() {
prompt.push_str(indoc! {"
# File under the cursor:
The cursor marker <|user_cursor|> indicates the current user cursor position.
The file is in current state, edits from edit history have been applied.
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
"});
} else {
// Note: This hasn't been trained on yet
prompt.push_str(indoc! {"
# Code Excerpts:
The cursor marker <|user_cursor|> indicates the current user cursor position.
Other excerpts of code from the project have been included as context based on their similarity to the code under the cursor.
Context excerpts are not guaranteed to be relevant, so use your own judgement.
Files are in their current state, edits from edit history have been applied.
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
"});
}
} else {
prompt.push_str("\n## Code\n\n");
prompt.push_str(indoc! {"
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
"});
}
prompt.push('\n');
let mut section_labels = Default::default();
if !request.referenced_declarations.is_empty() || !request.signatures.is_empty() {
@@ -198,8 +233,14 @@ pub fn build_prompt(
}
}
if request.prompt_format == PromptFormat::NumLinesUniDiff {
prompt.push_str(UNIFIED_DIFF_REMINDER);
match request.prompt_format {
PromptFormat::NumLinesUniDiff => {
prompt.push_str(UNIFIED_DIFF_REMINDER);
}
PromptFormat::OldTextNewText => {
prompt.push_str(OLD_TEXT_NEW_TEXT_REMINDER);
}
_ => {}
}
Ok((prompt, section_labels))
@@ -624,6 +665,7 @@ impl<'a> SyntaxBasedPrompt<'a> {
match self.request.prompt_format {
PromptFormat::MarkedExcerpt
| PromptFormat::OnlySnippets
| PromptFormat::OldTextNewText
| PromptFormat::NumLinesUniDiff => {
if range.start.0 > 0 && !skipped_last_snippet {
output.push_str("\n");

View File

@@ -291,29 +291,6 @@ CREATE TABLE IF NOT EXISTS "channel_chat_participants" (
CREATE INDEX "index_channel_chat_participants_on_channel_id" ON "channel_chat_participants" ("channel_id");
CREATE TABLE IF NOT EXISTS "channel_messages" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
"channel_id" INTEGER NOT NULL REFERENCES channels (id) ON DELETE CASCADE,
"sender_id" INTEGER NOT NULL REFERENCES users (id),
"body" TEXT NOT NULL,
"sent_at" TIMESTAMP,
"edited_at" TIMESTAMP,
"nonce" BLOB NOT NULL,
"reply_to_message_id" INTEGER DEFAULT NULL
);
CREATE INDEX "index_channel_messages_on_channel_id" ON "channel_messages" ("channel_id");
CREATE UNIQUE INDEX "index_channel_messages_on_sender_id_nonce" ON "channel_messages" ("sender_id", "nonce");
CREATE TABLE "channel_message_mentions" (
"message_id" INTEGER NOT NULL REFERENCES channel_messages (id) ON DELETE CASCADE,
"start_offset" INTEGER NOT NULL,
"end_offset" INTEGER NOT NULL,
"user_id" INTEGER NOT NULL REFERENCES users (id) ON DELETE CASCADE,
PRIMARY KEY (message_id, start_offset)
);
CREATE TABLE "channel_members" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
"channel_id" INTEGER NOT NULL REFERENCES channels (id) ON DELETE CASCADE,
@@ -408,15 +385,6 @@ CREATE TABLE "observed_buffer_edits" (
CREATE UNIQUE INDEX "index_observed_buffers_user_and_buffer_id" ON "observed_buffer_edits" ("user_id", "buffer_id");
CREATE TABLE IF NOT EXISTS "observed_channel_messages" (
"user_id" INTEGER NOT NULL REFERENCES users (id) ON DELETE CASCADE,
"channel_id" INTEGER NOT NULL REFERENCES channels (id) ON DELETE CASCADE,
"channel_message_id" INTEGER NOT NULL,
PRIMARY KEY (user_id, channel_id)
);
CREATE UNIQUE INDEX "index_observed_channel_messages_user_and_channel_id" ON "observed_channel_messages" ("user_id", "channel_id");
CREATE TABLE "notification_kinds" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
"name" VARCHAR NOT NULL

View File

@@ -0,0 +1,3 @@
drop table observed_channel_messages;
drop table channel_message_mentions;
drop table channel_messages;

View File

@@ -0,0 +1 @@
drop table embeddings;

View File

@@ -5,7 +5,6 @@ pub mod buffers;
pub mod channels;
pub mod contacts;
pub mod contributors;
pub mod embeddings;
pub mod extensions;
pub mod notifications;
pub mod projects;

View File

@@ -1,94 +0,0 @@
use super::*;
use time::Duration;
use time::OffsetDateTime;
impl Database {
pub async fn get_embeddings(
&self,
model: &str,
digests: &[Vec<u8>],
) -> Result<HashMap<Vec<u8>, Vec<f32>>> {
self.transaction(|tx| async move {
let embeddings = {
let mut db_embeddings = embedding::Entity::find()
.filter(
embedding::Column::Model.eq(model).and(
embedding::Column::Digest
.is_in(digests.iter().map(|digest| digest.as_slice())),
),
)
.stream(&*tx)
.await?;
let mut embeddings = HashMap::default();
while let Some(db_embedding) = db_embeddings.next().await {
let db_embedding = db_embedding?;
embeddings.insert(db_embedding.digest, db_embedding.dimensions);
}
embeddings
};
if !embeddings.is_empty() {
let now = OffsetDateTime::now_utc();
let retrieved_at = PrimitiveDateTime::new(now.date(), now.time());
embedding::Entity::update_many()
.filter(
embedding::Column::Digest
.is_in(embeddings.keys().map(|digest| digest.as_slice())),
)
.col_expr(embedding::Column::RetrievedAt, Expr::value(retrieved_at))
.exec(&*tx)
.await?;
}
Ok(embeddings)
})
.await
}
pub async fn save_embeddings(
&self,
model: &str,
embeddings: &HashMap<Vec<u8>, Vec<f32>>,
) -> Result<()> {
self.transaction(|tx| async move {
embedding::Entity::insert_many(embeddings.iter().map(|(digest, dimensions)| {
let now_offset_datetime = OffsetDateTime::now_utc();
let retrieved_at =
PrimitiveDateTime::new(now_offset_datetime.date(), now_offset_datetime.time());
embedding::ActiveModel {
model: ActiveValue::set(model.to_string()),
digest: ActiveValue::set(digest.clone()),
dimensions: ActiveValue::set(dimensions.clone()),
retrieved_at: ActiveValue::set(retrieved_at),
}
}))
.on_conflict(
OnConflict::columns([embedding::Column::Model, embedding::Column::Digest])
.do_nothing()
.to_owned(),
)
.exec_without_returning(&*tx)
.await?;
Ok(())
})
.await
}
pub async fn purge_old_embeddings(&self) -> Result<()> {
self.transaction(|tx| async move {
embedding::Entity::delete_many()
.filter(
embedding::Column::RetrievedAt
.lte(OffsetDateTime::now_utc() - Duration::days(60)),
)
.exec(&*tx)
.await?;
Ok(())
})
.await
}
}
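`save_embeddings` above performs an idempotent bulk upsert: rows are keyed on `(model, digest)`, and `on_conflict(...).do_nothing()` makes re-inserting an existing digest a no-op rather than an error or an overwrite. The same semantics can be modeled in plain Rust with a `HashMap` keyed on the conflict columns (illustrative only; the real code goes through SeaORM and SQL `ON CONFLICT ... DO NOTHING`):

```rust
use std::collections::HashMap;
use std::collections::hash_map::Entry;

/// Insert-or-ignore keyed on (model, digest), mirroring the
/// `ON CONFLICT (model, digest) DO NOTHING` behavior used by
/// `save_embeddings`. Existing entries are left untouched.
/// Returns true if a new row was inserted.
fn insert_do_nothing(
    table: &mut HashMap<(String, Vec<u8>), Vec<f32>>,
    model: &str,
    digest: Vec<u8>,
    dimensions: Vec<f32>,
) -> bool {
    match table.entry((model.to_string(), digest)) {
        Entry::Occupied(_) => false, // conflict: do nothing
        Entry::Vacant(slot) => {
            slot.insert(dimensions);
            true
        }
    }
}

fn main() {
    let mut table = HashMap::new();
    assert!(insert_do_nothing(&mut table, "m", vec![1, 2], vec![0.1]));
    // Re-inserting the same (model, digest) is a no-op: the original
    // dimensions are kept, matching DO NOTHING semantics.
    assert!(!insert_do_nothing(&mut table, "m", vec![1, 2], vec![0.9]));
    assert_eq!(table[&("m".to_string(), vec![1, 2])], vec![0.1]);
}
```

This is why callers can safely retry `save_embeddings` with overlapping batches: duplicates are silently skipped instead of tripping a unique-constraint error.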

View File

@@ -66,40 +66,6 @@ impl Database {
.await
}
/// Returns all users flagged as staff.
pub async fn get_staff_users(&self) -> Result<Vec<user::Model>> {
self.transaction(|tx| async {
let tx = tx;
Ok(user::Entity::find()
.filter(user::Column::Admin.eq(true))
.all(&*tx)
.await?)
})
.await
}
/// Returns a user by email address. There are no access checks here, so this should only be used internally.
pub async fn get_user_by_email(&self, email: &str) -> Result<Option<User>> {
self.transaction(|tx| async move {
Ok(user::Entity::find()
.filter(user::Column::EmailAddress.eq(email))
.one(&*tx)
.await?)
})
.await
}
/// Returns a user by GitHub user ID. There are no access checks here, so this should only be used internally.
pub async fn get_user_by_github_user_id(&self, github_user_id: i32) -> Result<Option<User>> {
self.transaction(|tx| async move {
Ok(user::Entity::find()
.filter(user::Column::GithubUserId.eq(github_user_id))
.one(&*tx)
.await?)
})
.await
}
/// Returns a user by GitHub login. There are no access checks here, so this should only be used internally.
pub async fn get_user_by_github_login(&self, github_login: &str) -> Result<Option<User>> {
self.transaction(|tx| async move {
@@ -270,39 +236,6 @@ impl Database {
.await
}
/// Sets "accepted_tos_at" on the user to the given timestamp.
pub async fn set_user_accepted_tos_at(
&self,
id: UserId,
accepted_tos_at: Option<DateTime>,
) -> Result<()> {
self.transaction(|tx| async move {
user::Entity::update_many()
.filter(user::Column::Id.eq(id))
.set(user::ActiveModel {
accepted_tos_at: ActiveValue::set(accepted_tos_at),
..Default::default()
})
.exec(&*tx)
.await?;
Ok(())
})
.await
}
/// Hard-deletes the user.
pub async fn destroy_user(&self, id: UserId) -> Result<()> {
self.transaction(|tx| async move {
access_token::Entity::delete_many()
.filter(access_token::Column::UserId.eq(id))
.exec(&*tx)
.await?;
user::Entity::delete_by_id(id).exec(&*tx).await?;
Ok(())
})
.await
}
/// Find users where github_login ILIKE name_query.
pub async fn fuzzy_search_users(&self, name_query: &str, limit: u32) -> Result<Vec<User>> {
self.transaction(|tx| async {
@@ -341,14 +274,4 @@ impl Database {
result.push('%');
result
}
pub async fn get_users_missing_github_user_created_at(&self) -> Result<Vec<user::Model>> {
self.transaction(|tx| async move {
Ok(user::Entity::find()
.filter(user::Column::GithubUserCreatedAt.is_null())
.all(&*tx)
.await?)
})
.await
}
}

View File

@@ -6,11 +6,8 @@ pub mod channel;
pub mod channel_buffer_collaborator;
pub mod channel_chat_participant;
pub mod channel_member;
pub mod channel_message;
pub mod channel_message_mention;
pub mod contact;
pub mod contributor;
pub mod embedding;
pub mod extension;
pub mod extension_version;
pub mod follower;
@@ -18,7 +15,6 @@ pub mod language_server;
pub mod notification;
pub mod notification_kind;
pub mod observed_buffer_edits;
pub mod observed_channel_messages;
pub mod project;
pub mod project_collaborator;
pub mod project_repository;

View File

@@ -1,47 +0,0 @@
use crate::db::{ChannelId, MessageId, UserId};
use sea_orm::entity::prelude::*;
use time::PrimitiveDateTime;
#[derive(Clone, Debug, PartialEq, Eq, DeriveEntityModel)]
#[sea_orm(table_name = "channel_messages")]
pub struct Model {
#[sea_orm(primary_key)]
pub id: MessageId,
pub channel_id: ChannelId,
pub sender_id: UserId,
pub body: String,
pub sent_at: PrimitiveDateTime,
pub edited_at: Option<PrimitiveDateTime>,
pub nonce: Uuid,
pub reply_to_message_id: Option<MessageId>,
}
impl ActiveModelBehavior for ActiveModel {}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {
#[sea_orm(
belongs_to = "super::channel::Entity",
from = "Column::ChannelId",
to = "super::channel::Column::Id"
)]
Channel,
#[sea_orm(
belongs_to = "super::user::Entity",
from = "Column::SenderId",
to = "super::user::Column::Id"
)]
Sender,
}
impl Related<super::channel::Entity> for Entity {
fn to() -> RelationDef {
Relation::Channel.def()
}
}
impl Related<super::user::Entity> for Entity {
fn to() -> RelationDef {
Relation::Sender.def()
}
}

View File

@@ -1,43 +0,0 @@
use crate::db::{MessageId, UserId};
use sea_orm::entity::prelude::*;
#[derive(Clone, Debug, PartialEq, Eq, DeriveEntityModel)]
#[sea_orm(table_name = "channel_message_mentions")]
pub struct Model {
#[sea_orm(primary_key)]
pub message_id: MessageId,
#[sea_orm(primary_key)]
pub start_offset: i32,
pub end_offset: i32,
pub user_id: UserId,
}
impl ActiveModelBehavior for ActiveModel {}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {
#[sea_orm(
belongs_to = "super::channel_message::Entity",
from = "Column::MessageId",
to = "super::channel_message::Column::Id"
)]
Message,
#[sea_orm(
belongs_to = "super::user::Entity",
from = "Column::UserId",
to = "super::user::Column::Id"
)]
MentionedUser,
}
impl Related<super::channel::Entity> for Entity {
fn to() -> RelationDef {
Relation::Message.def()
}
}
impl Related<super::user::Entity> for Entity {
fn to() -> RelationDef {
Relation::MentionedUser.def()
}
}

View File

@@ -1,18 +0,0 @@
use sea_orm::entity::prelude::*;
use time::PrimitiveDateTime;
#[derive(Clone, Debug, PartialEq, DeriveEntityModel)]
#[sea_orm(table_name = "embeddings")]
pub struct Model {
#[sea_orm(primary_key)]
pub model: String,
#[sea_orm(primary_key)]
pub digest: Vec<u8>,
pub dimensions: Vec<f32>,
pub retrieved_at: PrimitiveDateTime,
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {}
impl ActiveModelBehavior for ActiveModel {}

View File

@@ -1,41 +0,0 @@
use crate::db::{ChannelId, MessageId, UserId};
use sea_orm::entity::prelude::*;
#[derive(Clone, Debug, PartialEq, Eq, DeriveEntityModel)]
#[sea_orm(table_name = "observed_channel_messages")]
pub struct Model {
#[sea_orm(primary_key)]
pub user_id: UserId,
pub channel_id: ChannelId,
pub channel_message_id: MessageId,
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {
#[sea_orm(
belongs_to = "super::channel::Entity",
from = "Column::ChannelId",
to = "super::channel::Column::Id"
)]
Channel,
#[sea_orm(
belongs_to = "super::user::Entity",
from = "Column::UserId",
to = "super::user::Column::Id"
)]
User,
}
impl Related<super::channel::Entity> for Entity {
fn to() -> RelationDef {
Relation::Channel.def()
}
}
impl Related<super::user::Entity> for Entity {
fn to() -> RelationDef {
Relation::User.def()
}
}
impl ActiveModelBehavior for ActiveModel {}

View File

@@ -2,11 +2,7 @@ mod buffer_tests;
mod channel_tests;
mod contributor_tests;
mod db_tests;
// we only run postgres tests on macos right now
#[cfg(target_os = "macos")]
mod embedding_tests;
mod extension_tests;
mod user_tests;
use crate::migrations::run_database_migrations;

View File

@@ -1,7 +1,7 @@
use super::*;
use crate::test_both_dbs;
use chrono::Utc;
use pretty_assertions::{assert_eq, assert_ne};
use pretty_assertions::assert_eq;
use std::sync::Arc;
test_both_dbs!(
@@ -457,53 +457,6 @@ async fn test_add_contacts(db: &Arc<Database>) {
);
}
test_both_dbs!(
test_metrics_id,
test_metrics_id_postgres,
test_metrics_id_sqlite
);
async fn test_metrics_id(db: &Arc<Database>) {
let NewUserResult {
user_id: user1,
metrics_id: metrics_id1,
..
} = db
.create_user(
"person1@example.com",
None,
false,
NewUserParams {
github_login: "person1".into(),
github_user_id: 101,
},
)
.await
.unwrap();
let NewUserResult {
user_id: user2,
metrics_id: metrics_id2,
..
} = db
.create_user(
"person2@example.com",
None,
false,
NewUserParams {
github_login: "person2".into(),
github_user_id: 102,
},
)
.await
.unwrap();
assert_eq!(db.get_user_metrics_id(user1).await.unwrap(), metrics_id1);
assert_eq!(db.get_user_metrics_id(user2).await.unwrap(), metrics_id2);
assert_eq!(metrics_id1.len(), 36);
assert_eq!(metrics_id2.len(), 36);
assert_ne!(metrics_id1, metrics_id2);
}
test_both_dbs!(
test_project_count,
test_project_count_postgres,

View File

@@ -1,87 +0,0 @@
use super::TestDb;
use crate::db::embedding;
use collections::HashMap;
use sea_orm::{ColumnTrait, EntityTrait, QueryFilter, sea_query::Expr};
use std::ops::Sub;
use time::{Duration, OffsetDateTime, PrimitiveDateTime};
// SQLite does not support array arguments, so we only test this against a real postgres instance
#[gpui::test]
async fn test_get_embeddings_postgres(cx: &mut gpui::TestAppContext) {
let test_db = TestDb::postgres(cx.executor());
let db = test_db.db();
let provider = "test_model";
let digest1 = vec![1, 2, 3];
let digest2 = vec![4, 5, 6];
let embeddings = HashMap::from_iter([
(digest1.clone(), vec![0.1, 0.2, 0.3]),
(digest2.clone(), vec![0.4, 0.5, 0.6]),
]);
// Save embeddings
db.save_embeddings(provider, &embeddings).await.unwrap();
// Retrieve embeddings
let retrieved_embeddings = db
.get_embeddings(provider, &[digest1.clone(), digest2.clone()])
.await
.unwrap();
assert_eq!(retrieved_embeddings.len(), 2);
assert!(retrieved_embeddings.contains_key(&digest1));
assert!(retrieved_embeddings.contains_key(&digest2));
// Check if the retrieved embeddings are correct
assert_eq!(retrieved_embeddings[&digest1], vec![0.1, 0.2, 0.3]);
assert_eq!(retrieved_embeddings[&digest2], vec![0.4, 0.5, 0.6]);
}
#[gpui::test]
async fn test_purge_old_embeddings(cx: &mut gpui::TestAppContext) {
let test_db = TestDb::postgres(cx.executor());
let db = test_db.db();
let model = "test_model";
let digest = vec![7, 8, 9];
let embeddings = HashMap::from_iter([(digest.clone(), vec![0.7, 0.8, 0.9])]);
// Save old embeddings
db.save_embeddings(model, &embeddings).await.unwrap();
// Reach into the DB and change the retrieved at to be > 60 days
db.transaction(|tx| {
let digest = digest.clone();
async move {
let sixty_days_ago = OffsetDateTime::now_utc().sub(Duration::days(61));
let retrieved_at = PrimitiveDateTime::new(sixty_days_ago.date(), sixty_days_ago.time());
embedding::Entity::update_many()
.filter(
embedding::Column::Model
.eq(model)
.and(embedding::Column::Digest.eq(digest)),
)
.col_expr(embedding::Column::RetrievedAt, Expr::value(retrieved_at))
.exec(&*tx)
.await
.unwrap();
Ok(())
}
})
.await
.unwrap();
// Purge old embeddings
db.purge_old_embeddings().await.unwrap();
// Try to retrieve the purged embeddings
let retrieved_embeddings = db
.get_embeddings(model, std::slice::from_ref(&digest))
.await
.unwrap();
assert!(
retrieved_embeddings.is_empty(),
"Old embeddings should have been purged"
);
}

View File

@@ -1,96 +0,0 @@
use chrono::Utc;
use crate::{
db::{Database, NewUserParams},
test_both_dbs,
};
use std::sync::Arc;
test_both_dbs!(
test_accepted_tos,
test_accepted_tos_postgres,
test_accepted_tos_sqlite
);
async fn test_accepted_tos(db: &Arc<Database>) {
let user_id = db
.create_user(
"user1@example.com",
None,
false,
NewUserParams {
github_login: "user1".to_string(),
github_user_id: 1,
},
)
.await
.unwrap()
.user_id;
let user = db.get_user_by_id(user_id).await.unwrap().unwrap();
assert!(user.accepted_tos_at.is_none());
let accepted_tos_at = Utc::now().naive_utc();
db.set_user_accepted_tos_at(user_id, Some(accepted_tos_at))
.await
.unwrap();
let user = db.get_user_by_id(user_id).await.unwrap().unwrap();
assert!(user.accepted_tos_at.is_some());
assert_eq!(user.accepted_tos_at, Some(accepted_tos_at));
db.set_user_accepted_tos_at(user_id, None).await.unwrap();
let user = db.get_user_by_id(user_id).await.unwrap().unwrap();
assert!(user.accepted_tos_at.is_none());
}
test_both_dbs!(
test_destroy_user_cascade_deletes_access_tokens,
test_destroy_user_cascade_deletes_access_tokens_postgres,
test_destroy_user_cascade_deletes_access_tokens_sqlite
);
async fn test_destroy_user_cascade_deletes_access_tokens(db: &Arc<Database>) {
let user_id = db
.create_user(
"user1@example.com",
Some("user1"),
false,
NewUserParams {
github_login: "user1".to_string(),
github_user_id: 12345,
},
)
.await
.unwrap()
.user_id;
let user = db.get_user_by_id(user_id).await.unwrap();
assert!(user.is_some());
let token_1_id = db
.create_access_token(user_id, None, "token-1", 10)
.await
.unwrap();
let token_2_id = db
.create_access_token(user_id, None, "token-2", 10)
.await
.unwrap();
let token_1 = db.get_access_token(token_1_id).await;
let token_2 = db.get_access_token(token_2_id).await;
assert!(token_1.is_ok());
assert!(token_2.is_ok());
db.destroy_user(user_id).await.unwrap();
let user = db.get_user_by_id(user_id).await.unwrap();
assert!(user.is_none());
let token_1 = db.get_access_token(token_1_id).await;
let token_2 = db.get_access_token(token_2_id).await;
assert!(token_1.is_err());
assert!(token_2.is_err());
}

View File

@@ -13,7 +13,7 @@ use collab::llm::db::LlmDatabase;
use collab::migrations::run_database_migrations;
use collab::{
AppState, Config, Result, api::fetch_extensions_from_blob_store_periodically, db, env,
executor::Executor, rpc::ResultExt,
executor::Executor,
};
use db::Database;
use std::{
@@ -95,8 +95,6 @@ async fn main() -> Result<()> {
let state = AppState::new(config, Executor::Production).await?;
if mode.is_collab() {
state.db.purge_old_embeddings().await.trace_err();
let epoch = state
.db
.create_server(&state.config.zed_environment)

View File

@@ -677,6 +677,8 @@ impl ConsoleQueryBarCompletionProvider {
),
new_text: string_match.string.clone(),
label: CodeLabel::plain(string_match.string.clone(), None),
match_start: None,
snippet_deduplication_key: None,
icon_path: None,
documentation: Some(CompletionDocumentation::MultiLineMarkdown(
variable_value.into(),
@@ -790,6 +792,8 @@ impl ConsoleQueryBarCompletionProvider {
documentation: completion.detail.map(|detail| {
CompletionDocumentation::MultiLineMarkdown(detail.into())
}),
match_start: None,
snippet_deduplication_key: None,
confirm: None,
source: project::CompletionSource::Dap { sort_text },
insert_text_mode: None,

View File

@@ -182,7 +182,6 @@ impl ProjectDiagnosticsEditor {
project::Event::DiskBasedDiagnosticsFinished { language_server_id } => {
log::debug!("disk based diagnostics finished for server {language_server_id}");
this.close_diagnosticless_buffers(
window,
cx,
this.editor.focus_handle(cx).contains_focused(window, cx)
|| this.focus_handle.contains_focused(window, cx),
@@ -247,10 +246,10 @@ impl ProjectDiagnosticsEditor {
window.focus(&this.focus_handle);
}
}
EditorEvent::Blurred => this.close_diagnosticless_buffers(window, cx, false),
EditorEvent::Saved => this.close_diagnosticless_buffers(window, cx, true),
EditorEvent::Blurred => this.close_diagnosticless_buffers(cx, false),
EditorEvent::Saved => this.close_diagnosticless_buffers(cx, true),
EditorEvent::SelectionsChanged { .. } => {
this.close_diagnosticless_buffers(window, cx, true)
this.close_diagnosticless_buffers(cx, true)
}
_ => {}
}
@@ -298,12 +297,7 @@ impl ProjectDiagnosticsEditor {
/// - have no diagnostics anymore
/// - are saved (not dirty)
/// - and, if `retain_selections` is true, do not have selections within them
fn close_diagnosticless_buffers(
&mut self,
_window: &mut Window,
cx: &mut Context<Self>,
retain_selections: bool,
) {
fn close_diagnosticless_buffers(&mut self, cx: &mut Context<Self>, retain_selections: bool) {
let snapshot = self
.editor
.update(cx, |editor, cx| editor.display_snapshot(cx));
@@ -447,7 +441,7 @@ impl ProjectDiagnosticsEditor {
fn focus_out(&mut self, _: FocusOutEvent, window: &mut Window, cx: &mut Context<Self>) {
if !self.focus_handle.is_focused(window) && !self.editor.focus_handle(cx).is_focused(window)
{
self.close_diagnosticless_buffers(window, cx, false);
self.close_diagnosticless_buffers(cx, false);
}
}
@@ -461,8 +455,7 @@ impl ProjectDiagnosticsEditor {
});
}
});
self.multibuffer
.update(cx, |multibuffer, cx| multibuffer.clear(cx));
self.close_diagnosticless_buffers(cx, false);
self.project.update(cx, |project, cx| {
self.paths_to_update = project
.diagnostic_summaries(false, cx)
@@ -564,6 +557,20 @@ impl ProjectDiagnosticsEditor {
blocks.extend(more);
}
let cmp_excerpts = |buffer_snapshot: &BufferSnapshot,
a: &ExcerptRange<text::Anchor>,
b: &ExcerptRange<text::Anchor>| {
let context_start = || a.context.start.cmp(&b.context.start, buffer_snapshot);
let context_end = || a.context.end.cmp(&b.context.end, buffer_snapshot);
let primary_start = || a.primary.start.cmp(&b.primary.start, buffer_snapshot);
let primary_end = || a.primary.end.cmp(&b.primary.end, buffer_snapshot);
context_start()
.then_with(context_end)
.then_with(primary_start)
.then_with(primary_end)
.then(cmp::Ordering::Greater)
};
let mut excerpt_ranges: Vec<ExcerptRange<_>> = this.update(cx, |this, cx| {
this.multibuffer.update(cx, |multi_buffer, cx| {
let is_dirty = multi_buffer
@@ -575,10 +582,12 @@ impl ProjectDiagnosticsEditor {
.excerpts_for_buffer(buffer_id, cx)
.into_iter()
.map(|(_, range)| range)
.sorted_by(|a, b| cmp_excerpts(&buffer_snapshot, a, b))
.collect(),
}
})
})?;
let mut result_blocks = vec![None; excerpt_ranges.len()];
let context_lines = cx.update(|_, cx| multibuffer_context_lines(cx))?;
for b in blocks {
@@ -592,40 +601,14 @@ impl ProjectDiagnosticsEditor {
buffer_snapshot = cx.update(|_, cx| buffer.read(cx).snapshot())?;
let initial_range = buffer_snapshot.anchor_after(b.initial_range.start)
..buffer_snapshot.anchor_before(b.initial_range.end);
let bin_search = |probe: &ExcerptRange<text::Anchor>| {
let context_start = || {
probe
.context
.start
.cmp(&excerpt_range.start, &buffer_snapshot)
};
let context_end =
|| probe.context.end.cmp(&excerpt_range.end, &buffer_snapshot);
let primary_start = || {
probe
.primary
.start
.cmp(&initial_range.start, &buffer_snapshot)
};
let primary_end =
|| probe.primary.end.cmp(&initial_range.end, &buffer_snapshot);
context_start()
.then_with(context_end)
.then_with(primary_start)
.then_with(primary_end)
.then(cmp::Ordering::Greater)
let excerpt_range = ExcerptRange {
context: excerpt_range,
primary: initial_range,
};
let i = excerpt_ranges
.binary_search_by(bin_search)
.binary_search_by(|probe| cmp_excerpts(&buffer_snapshot, probe, &excerpt_range))
.unwrap_or_else(|i| i);
excerpt_ranges.insert(
i,
ExcerptRange {
context: excerpt_range,
primary: initial_range,
},
);
excerpt_ranges.insert(i, excerpt_range);
result_blocks.insert(i, Some(b));
}
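The `cmp_excerpts` comparator above ends with `.then(cmp::Ordering::Greater)`, which guarantees `binary_search_by` never returns `Ok`: any tie is reported as "probe greater", so `unwrap_or_else(|i| i)` always yields a well-defined insertion index, placing the new excerpt just before any existing equal ones. A minimal sketch of the trick on plain integers (an assumption-free illustration, not the editor code itself):

```rust
use std::cmp::Ordering;

/// Find a deterministic insertion point in a sorted slice. Because the
/// comparator never returns `Equal` (ties are forced to `Greater`),
/// `binary_search_by` always returns `Err(i)`, where `i` is the index
/// of the first element >= `value`, i.e. just before any run of equal
/// elements.
fn insertion_index(sorted: &[i32], value: i32) -> usize {
    sorted
        .binary_search_by(|probe| probe.cmp(&value).then(Ordering::Greater))
        .unwrap_or_else(|i| i)
}

fn main() {
    let v = [1, 3, 3, 3, 5];
    // A new `3` lands before the existing run of 3s.
    assert_eq!(insertion_index(&v, 3), 1);
    assert_eq!(insertion_index(&v, 0), 0);
    assert_eq!(insertion_index(&v, 6), 5);
}
```

Hoisting the comparator into one `cmp_excerpts` closure (as this diff does) lets the same ordering drive both the initial `sorted_by` pass and the later binary-search inserts, so the two can never disagree.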

View File

@@ -43,7 +43,8 @@ actions!(
]
);
const COPILOT_SETTINGS_URL: &str = "https://github.com/settings/copilot";
const COPILOT_SETTINGS_PATH: &str = "/settings/copilot";
const COPILOT_SETTINGS_URL: &str = concat!("https://github.com", "/settings/copilot");
const PRIVACY_DOCS: &str = "https://zed.dev/docs/ai/privacy-and-security";
struct CopilotErrorToast;
@@ -128,20 +129,21 @@ impl Render for EditPredictionButton {
}),
);
}
let this = cx.entity();
let this = cx.weak_entity();
div().child(
PopoverMenu::new("copilot")
.menu(move |window, cx| {
let current_status = Copilot::global(cx)?.read(cx).status();
Some(match current_status {
match current_status {
Status::Authorized => this.update(cx, |this, cx| {
this.build_copilot_context_menu(window, cx)
}),
_ => this.update(cx, |this, cx| {
this.build_copilot_start_menu(window, cx)
}),
})
}
.ok()
})
.anchor(Corner::BottomRight)
.trigger_with_tooltip(
@@ -182,7 +184,7 @@ impl Render for EditPredictionButton {
let icon = status.to_icon();
let tooltip_text = status.to_tooltip();
let has_menu = status.has_menu();
let this = cx.entity();
let this = cx.weak_entity();
let fs = self.fs.clone();
div().child(
@@ -209,9 +211,11 @@ impl Render for EditPredictionButton {
)
}))
}
SupermavenButtonStatus::Ready => Some(this.update(cx, |this, cx| {
this.build_supermaven_context_menu(window, cx)
})),
SupermavenButtonStatus::Ready => this
.update(cx, |this, cx| {
this.build_supermaven_context_menu(window, cx)
})
.ok(),
_ => None,
})
.anchor(Corner::BottomRight)
@@ -233,15 +237,16 @@ impl Render for EditPredictionButton {
let enabled = self.editor_enabled.unwrap_or(true);
let has_api_key = CodestralCompletionProvider::has_api_key(cx);
let fs = self.fs.clone();
let this = cx.entity();
let this = cx.weak_entity();
div().child(
PopoverMenu::new("codestral")
.menu(move |window, cx| {
if has_api_key {
Some(this.update(cx, |this, cx| {
this.update(cx, |this, cx| {
this.build_codestral_context_menu(window, cx)
}))
})
.ok()
} else {
Some(ContextMenu::build(window, cx, |menu, _, _| {
let fs = fs.clone();
@@ -379,11 +384,12 @@ impl Render for EditPredictionButton {
})
});
let this = cx.entity();
let this = cx.weak_entity();
let mut popover_menu = PopoverMenu::new("zeta")
.menu(move |window, cx| {
Some(this.update(cx, |this, cx| this.build_zeta_context_menu(window, cx)))
this.update(cx, |this, cx| this.build_zeta_context_menu(window, cx))
.ok()
})
.anchor(Corner::BottomRight)
.with_handle(self.popover_menu_handle.clone());
@@ -831,6 +837,16 @@ impl EditPredictionButton {
window: &mut Window,
cx: &mut Context<Self>,
) -> Entity<ContextMenu> {
let all_language_settings = all_language_settings(None, cx);
let copilot_config = copilot::copilot_chat::CopilotChatConfiguration {
enterprise_uri: all_language_settings
.edit_predictions
.copilot
.enterprise_uri
.clone(),
};
let settings_url = copilot_settings_url(copilot_config.enterprise_uri.as_deref());
ContextMenu::build(window, cx, |menu, window, cx| {
let menu = self.build_language_settings_menu(menu, window, cx);
let menu =
@@ -839,10 +855,7 @@ impl EditPredictionButton {
menu.separator()
.link(
"Go to Copilot Settings",
OpenBrowser {
url: COPILOT_SETTINGS_URL.to_string(),
}
.boxed_clone(),
OpenBrowser { url: settings_url }.boxed_clone(),
)
.action("Sign Out", copilot::SignOut.boxed_clone())
})
@@ -1171,3 +1184,99 @@ fn toggle_edit_prediction_mode(fs: Arc<dyn Fs>, mode: EditPredictionsMode, cx: &
});
}
}
fn copilot_settings_url(enterprise_uri: Option<&str>) -> String {
match enterprise_uri {
Some(uri) => {
format!("{}{}", uri.trim_end_matches('/'), COPILOT_SETTINGS_PATH)
}
None => COPILOT_SETTINGS_URL.to_string(),
}
}
#[cfg(test)]
mod tests {
use super::*;
use gpui::TestAppContext;
#[gpui::test]
async fn test_copilot_settings_url_with_enterprise_uri(cx: &mut TestAppContext) {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
});
cx.update_global(|settings_store: &mut SettingsStore, cx| {
settings_store
.set_user_settings(
r#"{"edit_predictions":{"copilot":{"enterprise_uri":"https://my-company.ghe.com"}}}"#,
cx,
)
.unwrap();
});
let url = cx.update(|cx| {
let all_language_settings = all_language_settings(None, cx);
copilot_settings_url(
all_language_settings
.edit_predictions
.copilot
.enterprise_uri
.as_deref(),
)
});
assert_eq!(url, "https://my-company.ghe.com/settings/copilot");
}
#[gpui::test]
async fn test_copilot_settings_url_with_enterprise_uri_trailing_slash(cx: &mut TestAppContext) {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
});
cx.update_global(|settings_store: &mut SettingsStore, cx| {
settings_store
.set_user_settings(
r#"{"edit_predictions":{"copilot":{"enterprise_uri":"https://my-company.ghe.com/"}}}"#,
cx,
)
.unwrap();
});
let url = cx.update(|cx| {
let all_language_settings = all_language_settings(None, cx);
copilot_settings_url(
all_language_settings
.edit_predictions
.copilot
.enterprise_uri
.as_deref(),
)
});
assert_eq!(url, "https://my-company.ghe.com/settings/copilot");
}
#[gpui::test]
async fn test_copilot_settings_url_without_enterprise_uri(cx: &mut TestAppContext) {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
});
let url = cx.update(|cx| {
let all_language_settings = all_language_settings(None, cx);
copilot_settings_url(
all_language_settings
.edit_predictions
.copilot
.enterprise_uri
.as_deref(),
)
});
assert_eq!(url, "https://github.com/settings/copilot");
}
}

View File

@@ -213,15 +213,6 @@ pub struct ExpandExcerptsDown {
pub(super) lines: u32,
}
/// Shows code completion suggestions at the cursor position.
#[derive(PartialEq, Clone, Deserialize, Default, JsonSchema, Action)]
#[action(namespace = editor)]
#[serde(deny_unknown_fields)]
pub struct ShowCompletions {
#[serde(default)]
pub(super) trigger: Option<String>,
}
/// Handles text input in the editor.
#[derive(PartialEq, Clone, Deserialize, Default, JsonSchema, Action)]
#[action(namespace = editor)]
@@ -736,6 +727,8 @@ actions!(
SelectToStartOfParagraph,
/// Extends selection up.
SelectUp,
/// Shows code completion suggestions at the cursor position.
ShowCompletions,
/// Shows the system character palette.
ShowCharacterPalette,
/// Shows edit prediction at cursor.

View File

@@ -305,6 +305,8 @@ impl CompletionBuilder {
icon_path: None,
insert_text_mode: None,
confirm: None,
match_start: None,
snippet_deduplication_key: None,
}
}
}

View File

@@ -17,7 +17,6 @@ use project::{CompletionDisplayOptions, CompletionSource};
use task::DebugScenario;
use task::TaskContext;
use std::collections::VecDeque;
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use std::{
@@ -36,12 +35,13 @@ use util::ResultExt;
use crate::hover_popover::{hover_markdown_style, open_markdown_url};
use crate::{
CodeActionProvider, CompletionId, CompletionItemKind, CompletionProvider, DisplayRow, Editor,
EditorStyle, ResolvedTasks,
CodeActionProvider, CompletionId, CompletionProvider, DisplayRow, Editor, EditorStyle,
ResolvedTasks,
actions::{ConfirmCodeAction, ConfirmCompletion},
split_words, styled_runs_for_code_label,
};
use crate::{CodeActionSource, EditorSettings};
use collections::{HashSet, VecDeque};
use settings::{Settings, SnippetSortOrder};
pub const MENU_GAP: Pixels = px(4.);
@@ -220,7 +220,9 @@ pub struct CompletionsMenu {
pub is_incomplete: bool,
pub buffer: Entity<Buffer>,
pub completions: Rc<RefCell<Box<[Completion]>>>,
match_candidates: Arc<[StringMatchCandidate]>,
/// String match candidate for each completion, grouped by `match_start`.
match_candidates: Arc<[(Option<text::Anchor>, Vec<StringMatchCandidate>)]>,
/// Entries displayed in the menu, which form a filtered and sorted subset of `match_candidates`.
pub entries: Rc<RefCell<Box<[StringMatch]>>>,
pub selected_item: usize,
filter_task: Task<()>,
@@ -252,8 +254,17 @@ enum MarkdownCacheKey {
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
pub enum CompletionsMenuSource {
/// Show all completions (words, snippets, LSP)
Normal,
/// Show only snippets (not words or LSP)
///
/// Used after typing a non-word character
SnippetsOnly,
/// Tab stops within a snippet that have a predefined finite set of choices
SnippetChoices,
/// Show only words (not snippets or LSP)
///
/// Used when word completions are explicitly triggered
Words { ignore_threshold: bool },
}
@@ -299,6 +310,8 @@ impl CompletionsMenu {
.iter()
.enumerate()
.map(|(id, completion)| StringMatchCandidate::new(id, completion.label.filter_text()))
.into_group_map_by(|candidate| completions[candidate.id].match_start)
.into_iter()
.collect();
let completions_menu = Self {
@@ -346,6 +359,8 @@ impl CompletionsMenu {
replace_range: selection.start.text_anchor..selection.end.text_anchor,
new_text: choice.to_string(),
label: CodeLabel::plain(choice.to_string(), None),
match_start: None,
snippet_deduplication_key: None,
icon_path: None,
documentation: None,
confirm: None,
@@ -354,11 +369,14 @@ impl CompletionsMenu {
})
.collect();
let match_candidates = choices
.iter()
.enumerate()
.map(|(id, completion)| StringMatchCandidate::new(id, completion))
.collect();
let match_candidates = Arc::new([(
None,
choices
.iter()
.enumerate()
.map(|(id, completion)| StringMatchCandidate::new(id, completion))
.collect(),
)]);
let entries = choices
.iter()
.enumerate()
@@ -939,7 +957,7 @@ impl CompletionsMenu {
}
let mat = &self.entries.borrow()[self.selected_item];
let completions = self.completions.borrow_mut();
let completions = self.completions.borrow();
let multiline_docs = match completions[mat.candidate_id].documentation.as_ref() {
Some(CompletionDocumentation::MultiLinePlainText(text)) => div().child(text.clone()),
Some(CompletionDocumentation::SingleLineAndMultiLinePlainText {
@@ -1017,57 +1035,74 @@ impl CompletionsMenu {
pub fn filter(
&mut self,
query: Option<Arc<String>>,
query: Arc<String>,
query_end: text::Anchor,
buffer: &Entity<Buffer>,
provider: Option<Rc<dyn CompletionProvider>>,
window: &mut Window,
cx: &mut Context<Editor>,
) {
self.cancel_filter.store(true, Ordering::Relaxed);
if let Some(query) = query {
self.cancel_filter = Arc::new(AtomicBool::new(false));
let matches = self.do_async_filtering(query, cx);
let id = self.id;
self.filter_task = cx.spawn_in(window, async move |editor, cx| {
let matches = matches.await;
editor
.update_in(cx, |editor, window, cx| {
editor.with_completions_menu_matching_id(id, |this| {
if let Some(this) = this {
this.set_filter_results(matches, provider, window, cx);
}
});
})
.ok();
});
} else {
self.filter_task = Task::ready(());
let matches = self.unfiltered_matches();
self.set_filter_results(matches, provider, window, cx);
}
self.cancel_filter = Arc::new(AtomicBool::new(false));
let matches = self.do_async_filtering(query, query_end, buffer, cx);
let id = self.id;
self.filter_task = cx.spawn_in(window, async move |editor, cx| {
let matches = matches.await;
editor
.update_in(cx, |editor, window, cx| {
editor.with_completions_menu_matching_id(id, |this| {
if let Some(this) = this {
this.set_filter_results(matches, provider, window, cx);
}
});
})
.ok();
});
}
pub fn do_async_filtering(
&self,
query: Arc<String>,
query_end: text::Anchor,
buffer: &Entity<Buffer>,
cx: &Context<Editor>,
) -> Task<Vec<StringMatch>> {
let matches_task = cx.background_spawn({
let query = query.clone();
let match_candidates = self.match_candidates.clone();
let cancel_filter = self.cancel_filter.clone();
let background_executor = cx.background_executor().clone();
async move {
fuzzy::match_strings(
&match_candidates,
&query,
query.chars().any(|c| c.is_uppercase()),
false,
1000,
&cancel_filter,
background_executor,
)
.await
let buffer_snapshot = buffer.read(cx).snapshot();
let background_executor = cx.background_executor().clone();
let match_candidates = self.match_candidates.clone();
let cancel_filter = self.cancel_filter.clone();
let default_query = query.clone();
let matches_task = cx.background_spawn(async move {
let queries_and_candidates = match_candidates
.iter()
.map(|(query_start, candidates)| {
let query_for_batch = match query_start {
Some(start) => {
Arc::new(buffer_snapshot.text_for_range(*start..query_end).collect())
}
None => default_query.clone(),
};
(query_for_batch, candidates)
})
.collect_vec();
let mut results = vec![];
for (query, match_candidates) in queries_and_candidates {
results.extend(
fuzzy::match_strings(
&match_candidates,
&query,
query.chars().any(|c| c.is_uppercase()),
false,
1000,
&cancel_filter,
background_executor.clone(),
)
.await,
);
}
results
});
let completions = self.completions.clone();
@@ -1076,45 +1111,31 @@ impl CompletionsMenu {
cx.foreground_executor().spawn(async move {
let mut matches = matches_task.await;
let completions_ref = completions.borrow();
if sort_completions {
matches = Self::sort_string_matches(
matches,
Some(&query),
Some(&query), // used for non-snippets only
snippet_sort_order,
completions.borrow().as_ref(),
&completions_ref,
);
}
// Remove duplicate snippet prefixes (e.g., "cool code" will match
// the text "c c" in two places; we should only show the longer one)
let mut snippets_seen = HashSet::<(usize, usize)>::default();
matches.retain(|result| {
match completions_ref[result.candidate_id].snippet_deduplication_key {
Some(key) => snippets_seen.insert(key),
None => true,
}
});
matches
})
}
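The dedup step above keeps only the first match per `snippet_deduplication_key`, relying on the sort having already put the preferred (longer) match first. A minimal sketch of that `retain` + `HashSet::insert` pattern, with simplified types:

```rust
use std::collections::HashSet;

// Keep only the first entry per deduplication key; entries without a key
// are always kept. Assumes the input is already sorted so the preferred
// (e.g. longer) entry comes first, as in the completions sort above.
fn dedup_by_key(entries: &mut Vec<(Option<(usize, usize)>, &str)>) {
    let mut seen = HashSet::new();
    entries.retain(|(key, _)| match key {
        Some(key) => seen.insert(*key), // false (drop) once the key was seen
        None => true,
    });
}

fn main() {
    let mut entries = vec![
        (Some((0, 0)), "cool code"), // longer match, listed first
        (None, "plain word"),
        (Some((0, 0)), "code"), // duplicate key: dropped
    ];
    dedup_by_key(&mut entries);
    assert_eq!(entries.len(), 2);
    assert_eq!(entries[0].1, "cool code");
}
```

`HashSet::insert` returns `false` for an already-present key, which is exactly the "keep first occurrence" predicate `retain` needs.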
/// Like `do_async_filtering` but there is no filter query, so no need to spawn tasks.
pub fn unfiltered_matches(&self) -> Vec<StringMatch> {
let mut matches = self
.match_candidates
.iter()
.enumerate()
.map(|(candidate_id, candidate)| StringMatch {
candidate_id,
score: Default::default(),
positions: Default::default(),
string: candidate.string.clone(),
})
.collect();
if self.sort_completions {
matches = Self::sort_string_matches(
matches,
None,
self.snippet_sort_order,
self.completions.borrow().as_ref(),
);
}
matches
}
pub fn set_filter_results(
&mut self,
matches: Vec<StringMatch>,
@@ -1157,28 +1178,13 @@ impl CompletionsMenu {
.and_then(|c| c.to_lowercase().next());
if snippet_sort_order == SnippetSortOrder::None {
matches.retain(|string_match| {
let completion = &completions[string_match.candidate_id];
let is_snippet = matches!(
&completion.source,
CompletionSource::Lsp { lsp_completion, .. }
if lsp_completion.kind == Some(CompletionItemKind::SNIPPET)
);
!is_snippet
});
matches
.retain(|string_match| !completions[string_match.candidate_id].is_snippet_kind());
}
matches.sort_unstable_by_key(|string_match| {
let completion = &completions[string_match.candidate_id];
let is_snippet = matches!(
&completion.source,
CompletionSource::Lsp { lsp_completion, .. }
if lsp_completion.kind == Some(CompletionItemKind::SNIPPET)
);
let sort_text = match &completion.source {
CompletionSource::Lsp { lsp_completion, .. } => lsp_completion.sort_text.as_deref(),
CompletionSource::Dap { sort_text } => Some(sort_text.as_str()),
@@ -1190,14 +1196,17 @@ impl CompletionsMenu {
let score = string_match.score;
let sort_score = Reverse(OrderedFloat(score));
let query_start_doesnt_match_split_words = query_start_lower
.map(|query_char| {
!split_words(&string_match.string).any(|word| {
word.chars().next().and_then(|c| c.to_lowercase().next())
== Some(query_char)
// Snippets do their own first-letter matching logic elsewhere.
let is_snippet = completion.is_snippet_kind();
let query_start_doesnt_match_split_words = !is_snippet
&& query_start_lower
.map(|query_char| {
!split_words(&string_match.string).any(|word| {
word.chars().next().and_then(|c| c.to_lowercase().next())
== Some(query_char)
})
})
})
.unwrap_or(false);
.unwrap_or(false);
if query_start_doesnt_match_split_words {
MatchTier::OtherMatch { sort_score }
@@ -1209,6 +1218,7 @@ impl CompletionsMenu {
SnippetSortOrder::None => Reverse(0),
};
let sort_positions = string_match.positions.clone();
// This exact matching won't work for multi-word snippets, but it's fine
let sort_exact = Reverse(if Some(completion.label.filter_text()) == query {
1
} else {


@@ -1097,7 +1097,7 @@ impl DisplaySnapshot {
details: &TextLayoutDetails,
) -> u32 {
let layout_line = self.layout_row(display_row, details);
layout_line.index_for_x(x) as u32
layout_line.closest_index_for_x(x) as u32
}
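The one-line change above swaps an exact hit test (`index_for_x`) for a clamping one (`closest_index_for_x`): an x coordinate outside any glyph now snaps to the nearest boundary instead of misreporting. A rough sketch of that behavior, assuming per-glyph advance widths (names and types here are hypothetical, not Zed's layout API):

```rust
// Given per-glyph advance widths, return the index of the glyph boundary
// closest to `x`, clamping when `x` falls before or past the line — the
// behavioral difference versus an exact hit test.
fn closest_index_for_x(advances: &[f32], x: f32) -> usize {
    let mut left = 0.0_f32;
    for (i, advance) in advances.iter().enumerate() {
        let right = left + advance;
        if x < right {
            // Snap to whichever boundary of this glyph is nearer.
            return if x - left < right - x { i } else { i + 1 };
        }
        left = right;
    }
    advances.len() // x past the end of the line: clamp to the last index
}

fn main() {
    let advances = [10.0, 10.0, 10.0];
    assert_eq!(closest_index_for_x(&advances, -5.0), 0);
    assert_eq!(closest_index_for_x(&advances, 12.0), 1);
    assert_eq!(closest_index_for_x(&advances, 19.0), 2);
    assert_eq!(closest_index_for_x(&advances, 100.0), 3);
}
```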
pub fn grapheme_at(&self, mut point: DisplayPoint) -> Option<SharedString> {


@@ -19,7 +19,7 @@ use std::{
cell::RefCell,
cmp::{self, Ordering},
fmt::Debug,
ops::{Deref, DerefMut, Range, RangeBounds, RangeInclusive},
ops::{Deref, DerefMut, Not, Range, RangeBounds, RangeInclusive},
sync::{
Arc,
atomic::{AtomicUsize, Ordering::SeqCst},
@@ -1879,18 +1879,14 @@ impl Iterator for BlockRows<'_> {
}
let transform = self.transforms.item()?;
if let Some(block) = transform.block.as_ref() {
if block.is_replacement() && self.transforms.start().0 == self.output_row {
if matches!(block, Block::FoldedBuffer { .. }) {
Some(RowInfo::default())
} else {
Some(self.input_rows.next().unwrap())
}
} else {
Some(RowInfo::default())
}
if transform.block.as_ref().is_none_or(|block| {
block.is_replacement()
&& self.transforms.start().0 == self.output_row
&& matches!(block, Block::FoldedBuffer { .. }).not()
}) {
self.input_rows.next()
} else {
Some(self.input_rows.next().unwrap())
Some(RowInfo::default())
}
}
}


@@ -965,7 +965,7 @@ impl<'a> Iterator for WrapChunks<'a> {
}
if self.input_chunk.text.is_empty() {
self.input_chunk = self.input_chunks.next().unwrap();
self.input_chunk = self.input_chunks.next()?;
}
let mut input_len = 0;


@@ -74,7 +74,7 @@ use ::git::{
blame::{BlameEntry, ParsedCommitMessage},
status::FileStatus,
};
use aho_corasick::AhoCorasick;
use aho_corasick::{AhoCorasick, AhoCorasickBuilder, BuildError};
use anyhow::{Context as _, Result, anyhow};
use blink_manager::BlinkManager;
use buffer_diff::DiffHunkStatus;
@@ -117,8 +117,8 @@ use language::{
AutoindentMode, BlockCommentConfig, BracketMatch, BracketPair, Buffer, BufferRow,
BufferSnapshot, Capability, CharClassifier, CharKind, CharScopeContext, CodeLabel, CursorShape,
DiagnosticEntryRef, DiffOptions, EditPredictionsMode, EditPreview, HighlightedText, IndentKind,
IndentSize, Language, OffsetRangeExt, Point, Runnable, RunnableRange, Selection, SelectionGoal,
TextObject, TransactionId, TreeSitterOptions, WordsQuery,
IndentSize, Language, OffsetRangeExt, OutlineItem, Point, Runnable, RunnableRange, Selection,
SelectionGoal, TextObject, TransactionId, TreeSitterOptions, WordsQuery,
language_settings::{
self, LspInsertMode, RewrapBehavior, WordsCompletionMode, all_language_settings,
language_settings,
@@ -1183,12 +1183,14 @@ pub struct Editor {
hide_mouse_mode: HideMouseMode,
pub change_list: ChangeList,
inline_value_cache: InlineValueCache,
selection_drag_state: SelectionDragState,
colors: Option<LspColorData>,
post_scroll_update: Task<()>,
refresh_colors_task: Task<()>,
inlay_hints: Option<LspInlayHintData>,
folding_newlines: Task<()>,
select_next_is_case_sensitive: Option<bool>,
pub lookup_key: Option<Box<dyn Any + Send + Sync>>,
}
@@ -1764,6 +1766,51 @@ impl Editor {
Editor::new_internal(mode, buffer, project, None, window, cx)
}
pub fn sticky_headers(&self, cx: &App) -> Option<Vec<OutlineItem<Anchor>>> {
let multi_buffer = self.buffer().read(cx);
let multi_buffer_snapshot = multi_buffer.snapshot(cx);
let multi_buffer_visible_start = self
.scroll_manager
.anchor()
.anchor
.to_point(&multi_buffer_snapshot);
let max_row = multi_buffer_snapshot.max_point().row;
let start_row = (multi_buffer_visible_start.row).min(max_row);
let end_row = (multi_buffer_visible_start.row + 10).min(max_row);
if let Some((excerpt_id, buffer_id, buffer)) = multi_buffer.read(cx).as_singleton() {
let outline_items = buffer
.outline_items_containing(
Point::new(start_row, 0)..Point::new(end_row, 0),
true,
self.style().map(|style| style.syntax.as_ref()),
)
.into_iter()
.map(|outline_item| OutlineItem {
depth: outline_item.depth,
range: Anchor::range_in_buffer(*excerpt_id, buffer_id, outline_item.range),
source_range_for_text: Anchor::range_in_buffer(
*excerpt_id,
buffer_id,
outline_item.source_range_for_text,
),
text: outline_item.text,
highlight_ranges: outline_item.highlight_ranges,
name_ranges: outline_item.name_ranges,
body_range: outline_item
.body_range
.map(|range| Anchor::range_in_buffer(*excerpt_id, buffer_id, range)),
annotation_range: outline_item
.annotation_range
.map(|range| Anchor::range_in_buffer(*excerpt_id, buffer_id, range)),
});
return Some(outline_items.collect());
}
None
}
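`sticky_headers` above queries outline items over a small window of rows starting at the scroll anchor, clamping both ends to the buffer's last row. A minimal sketch of that window computation (the 10-row lookahead matches the code above; the function name is illustrative):

```rust
// Clamp the visible-row window used for sticky headers: start at the
// scroll row, look ahead a fixed number of rows, never exceed max_row.
fn visible_window(scroll_row: u32, max_row: u32) -> (u32, u32) {
    let start = scroll_row.min(max_row);
    let end = (scroll_row + 10).min(max_row);
    (start, end)
}

fn main() {
    assert_eq!(visible_window(5, 100), (5, 15));
    assert_eq!(visible_window(98, 100), (98, 100)); // lookahead clamped
    assert_eq!(visible_window(200, 100), (100, 100)); // start clamped too
}
```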
fn new_internal(
mode: EditorMode,
multi_buffer: Entity<MultiBuffer>,
@@ -2287,6 +2334,7 @@ impl Editor {
selection_drag_state: SelectionDragState::None,
folding_newlines: Task::ready(()),
lookup_key: None,
select_next_is_case_sensitive: None,
};
if is_minimap {
@@ -3216,7 +3264,7 @@ impl Editor {
};
if continue_showing {
self.show_completions(&ShowCompletions { trigger: None }, window, cx);
self.open_or_update_completions_menu(None, None, false, window, cx);
} else {
self.hide_context_menu(window, cx);
}
@@ -3401,6 +3449,21 @@ impl Editor {
Subscription::join(other_subscription, this_subscription)
}
fn unfold_buffers_with_selections(&mut self, cx: &mut Context<Self>) {
if self.buffer().read(cx).is_singleton() {
return;
}
let snapshot = self.buffer.read(cx).snapshot(cx);
let buffer_ids: HashSet<BufferId> = self
.selections
.disjoint_anchor_ranges()
.flat_map(|range| snapshot.buffer_ids_for_range(range))
.collect();
for buffer_id in buffer_ids {
self.unfold_buffer(buffer_id, cx);
}
}
/// Changes selections using the provided mutation function. Changes to `self.selections` occur
/// immediately, but when run within `transact` or `with_selection_effects_deferred` other
/// effects of selection change occur at the end of the transaction.
@@ -4013,17 +4076,24 @@ impl Editor {
self.selection_mark_mode = false;
self.selection_drag_state = SelectionDragState::None;
if self.dismiss_menus_and_popups(true, window, cx) {
cx.notify();
return;
}
if self.clear_expanded_diff_hunks(cx) {
cx.notify();
return;
}
if self.dismiss_menus_and_popups(true, window, cx) {
if self.show_git_blame_gutter {
self.show_git_blame_gutter = false;
cx.notify();
return;
}
if self.mode.is_full()
&& self.change_selections(Default::default(), window, cx, |s| s.try_cancel())
{
cx.notify();
return;
}
@@ -4142,6 +4212,8 @@ impl Editor {
self.hide_mouse_cursor(HideMouseCursorOrigin::TypingAction, cx);
self.unfold_buffers_with_selections(cx);
let selections = self.selections.all_adjusted(&self.display_snapshot(cx));
let mut bracket_inserted = false;
let mut edits = Vec::new();
@@ -5051,57 +5123,18 @@ impl Editor {
ignore_threshold: false,
}),
None,
trigger_in_words,
window,
cx,
);
}
Some(CompletionsMenuSource::Normal)
| Some(CompletionsMenuSource::SnippetChoices)
| None
if self.is_completion_trigger(
text,
trigger_in_words,
completions_source.is_some(),
cx,
) =>
{
self.show_completions(
&ShowCompletions {
trigger: Some(text.to_owned()).filter(|x| !x.is_empty()),
},
window,
cx,
)
}
_ => {
self.hide_context_menu(window, cx);
}
}
}
fn is_completion_trigger(
&self,
text: &str,
trigger_in_words: bool,
menu_is_open: bool,
cx: &mut Context<Self>,
) -> bool {
let position = self.selections.newest_anchor().head();
let Some(buffer) = self.buffer.read(cx).buffer_for_anchor(position, cx) else {
return false;
};
if let Some(completion_provider) = &self.completion_provider {
completion_provider.is_completion_trigger(
&buffer,
position.text_anchor,
text,
trigger_in_words,
menu_is_open,
_ => self.open_or_update_completions_menu(
None,
Some(text.to_owned()).filter(|x| !x.is_empty()),
true,
window,
cx,
)
} else {
false
),
}
}
@@ -5379,6 +5412,7 @@ impl Editor {
ignore_threshold: true,
}),
None,
false,
window,
cx,
);
@@ -5386,17 +5420,18 @@ impl Editor {
pub fn show_completions(
&mut self,
options: &ShowCompletions,
_: &ShowCompletions,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.open_or_update_completions_menu(None, options.trigger.as_deref(), window, cx);
self.open_or_update_completions_menu(None, None, false, window, cx);
}
fn open_or_update_completions_menu(
&mut self,
requested_source: Option<CompletionsMenuSource>,
trigger: Option<&str>,
trigger: Option<String>,
trigger_in_words: bool,
window: &mut Window,
cx: &mut Context<Self>,
) {
@@ -5404,6 +5439,15 @@ impl Editor {
return;
}
let completions_source = self
.context_menu
.borrow()
.as_ref()
.and_then(|menu| match menu {
CodeContextMenu::Completions(completions_menu) => Some(completions_menu.source),
CodeContextMenu::CodeActions(_) => None,
});
let multibuffer_snapshot = self.buffer.read(cx).read(cx);
// Typically `start` == `end`, but with snippet tabstop choices the default choice is
@@ -5451,7 +5495,8 @@ impl Editor {
ignore_word_threshold = ignore_threshold;
None
}
Some(CompletionsMenuSource::SnippetChoices) => {
Some(CompletionsMenuSource::SnippetChoices)
| Some(CompletionsMenuSource::SnippetsOnly) => {
log::error!("bug: SnippetChoices requested_source is not handled");
None
}
@@ -5465,17 +5510,30 @@ impl Editor {
.as_ref()
.is_none_or(|provider| provider.filter_completions());
let was_snippets_only = matches!(
completions_source,
Some(CompletionsMenuSource::SnippetsOnly)
);
if let Some(CodeContextMenu::Completions(menu)) = self.context_menu.borrow_mut().as_mut() {
if filter_completions {
menu.filter(query.clone(), provider.clone(), window, cx);
menu.filter(
query.clone().unwrap_or_default(),
buffer_position.text_anchor,
&buffer,
provider.clone(),
window,
cx,
);
}
// When `is_incomplete` is false, no need to re-query completions when the initial query
// is a prefix of the current query.
if !menu.is_incomplete {
let was_complete = !menu.is_incomplete;
if was_complete && !was_snippets_only {
// If the old query is a prefix of the new query (typing more characters) and
// the previous result was complete, the existing completions can be filtered.
//
// Note that this is always true for snippet completions.
// Note that snippet completions are always complete.
let query_matches = match (&menu.initial_query, &query) {
(Some(initial_query), Some(query)) => query.starts_with(initial_query.as_ref()),
(None, _) => true,
@@ -5495,23 +5553,6 @@ impl Editor {
}
};
let trigger_kind = match trigger {
Some(trigger) if buffer.read(cx).completion_triggers().contains(trigger) => {
CompletionTriggerKind::TRIGGER_CHARACTER
}
_ => CompletionTriggerKind::INVOKED,
};
let completion_context = CompletionContext {
trigger_character: trigger.and_then(|trigger| {
if trigger_kind == CompletionTriggerKind::TRIGGER_CHARACTER {
Some(String::from(trigger))
} else {
None
}
}),
trigger_kind,
};
let Anchor {
excerpt_id: buffer_excerpt_id,
text_anchor: buffer_position,
@@ -5564,54 +5605,94 @@ impl Editor {
.as_ref()
.is_none_or(|query| !query.chars().any(|c| c.is_digit(10)));
let omit_word_completions = !self.word_completions_enabled
|| (!ignore_word_threshold
&& match &query {
Some(query) => query.chars().count() < completion_settings.words_min_length,
None => completion_settings.words_min_length != 0,
});
let (mut words, provider_responses) = match &provider {
Some(provider) => {
let provider_responses = provider.completions(
buffer_excerpt_id,
let load_provider_completions = provider.as_ref().is_some_and(|provider| {
trigger.as_ref().is_none_or(|trigger| {
provider.is_completion_trigger(
&buffer,
buffer_position,
completion_context,
window,
position.text_anchor,
trigger,
trigger_in_words,
completions_source.is_some(),
cx,
);
)
})
});
let words = match (omit_word_completions, completion_settings.words) {
(true, _) | (_, WordsCompletionMode::Disabled) => {
Task::ready(BTreeMap::default())
}
(false, WordsCompletionMode::Enabled | WordsCompletionMode::Fallback) => cx
.background_spawn(async move {
buffer_snapshot.words_in_range(WordsQuery {
fuzzy_contents: None,
range: word_search_range,
skip_digits,
})
}),
};
let provider_responses = if let Some(provider) = &provider
&& load_provider_completions
{
let trigger_character =
trigger.filter(|trigger| buffer.read(cx).completion_triggers().contains(trigger));
let completion_context = CompletionContext {
trigger_kind: match &trigger_character {
Some(_) => CompletionTriggerKind::TRIGGER_CHARACTER,
None => CompletionTriggerKind::INVOKED,
},
trigger_character,
};
(words, provider_responses)
}
None => {
let words = if omit_word_completions {
Task::ready(BTreeMap::default())
} else {
cx.background_spawn(async move {
buffer_snapshot.words_in_range(WordsQuery {
fuzzy_contents: None,
range: word_search_range,
skip_digits,
})
provider.completions(
buffer_excerpt_id,
&buffer,
buffer_position,
completion_context,
window,
cx,
)
} else {
Task::ready(Ok(Vec::new()))
};
let load_word_completions = if !self.word_completions_enabled {
false
} else if requested_source
== Some(CompletionsMenuSource::Words {
ignore_threshold: true,
})
{
true
} else {
load_provider_completions
&& completion_settings.words != WordsCompletionMode::Disabled
&& (ignore_word_threshold || {
let words_min_length = completion_settings.words_min_length;
// Check whether the query has at least `words_min_length` characters
let query_chars = query.iter().flat_map(|q| q.chars());
query_chars.take(words_min_length).count() == words_min_length
})
};
let mut words = if load_word_completions {
cx.background_spawn({
let buffer_snapshot = buffer_snapshot.clone();
async move {
buffer_snapshot.words_in_range(WordsQuery {
fuzzy_contents: None,
range: word_search_range,
skip_digits,
})
};
(words, Task::ready(Ok(Vec::new())))
}
}
})
} else {
Task::ready(BTreeMap::default())
};
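The `words_min_length` gate above uses `chars().take(n).count() == n` rather than `chars().count() >= n`, so it never walks more than `n` characters of the query. A tiny self-contained sketch of that idiom:

```rust
// Test whether `query` has at least `n` characters without counting the
// whole string: take at most `n` chars and compare the count.
fn has_at_least_n_chars(query: &str, n: usize) -> bool {
    query.chars().take(n).count() == n
}

fn main() {
    assert!(has_at_least_n_chars("abc", 3));
    assert!(!has_at_least_n_chars("ab", 3));
    assert!(has_at_least_n_chars("", 0));
    // Counts characters, not bytes: "héllo" has 5 chars but 6 bytes.
    assert!(has_at_least_n_chars("héllo", 5));
}
```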
let snippets = if let Some(provider) = &provider
&& provider.show_snippets()
&& let Some(project) = self.project()
{
let char_classifier = buffer_snapshot
.char_classifier_at(buffer_position)
.scope_context(Some(CharScopeContext::Completion));
project.update(cx, |project, cx| {
snippet_completions(project, &buffer, buffer_position, char_classifier, cx)
})
} else {
Task::ready(Ok(CompletionResponse {
completions: Vec::new(),
display_options: Default::default(),
is_incomplete: false,
}))
};
let snippet_sort_order = EditorSettings::get_global(cx).snippet_sort_order;
@@ -5659,6 +5740,8 @@ impl Editor {
replace_range: word_replace_range.clone(),
new_text: word.clone(),
label: CodeLabel::plain(word, None),
match_start: None,
snippet_deduplication_key: None,
icon_path: None,
documentation: None,
source: CompletionSource::BufferWord {
@@ -5669,6 +5752,13 @@ impl Editor {
confirm: None,
}));
completions.extend(
snippets
.await
.into_iter()
.flat_map(|response| response.completions),
);
let menu = if completions.is_empty() {
None
} else {
@@ -5680,7 +5770,11 @@ impl Editor {
.map(|workspace| workspace.read(cx).app_state().languages.clone());
let menu = CompletionsMenu::new(
id,
requested_source.unwrap_or(CompletionsMenuSource::Normal),
requested_source.unwrap_or(if load_provider_completions {
CompletionsMenuSource::Normal
} else {
CompletionsMenuSource::SnippetsOnly
}),
sort_completions,
show_completion_documentation,
position,
@@ -5696,11 +5790,12 @@ impl Editor {
);
let query = if filter_completions { query } else { None };
let matches_task = if let Some(query) = query {
menu.do_async_filtering(query, cx)
} else {
Task::ready(menu.unfiltered_matches())
};
let matches_task = menu.do_async_filtering(
query.unwrap_or_default(),
buffer_position,
&buffer,
cx,
);
(menu, matches_task)
}) else {
return;
@@ -5717,7 +5812,7 @@ impl Editor {
return;
};
// Only valid to take prev_menu because it the new menu is immediately set
// Only valid to take prev_menu because either the new menu is immediately set
// below, or the menu is hidden.
if let Some(CodeContextMenu::Completions(prev_menu)) =
editor.context_menu.borrow_mut().take()
@@ -6010,7 +6105,7 @@ impl Editor {
.as_ref()
.is_some_and(|confirm| confirm(intent, window, cx));
if show_new_completions_on_confirm {
self.show_completions(&ShowCompletions { trigger: None }, window, cx);
self.open_or_update_completions_menu(None, None, false, window, cx);
}
let provider = self.completion_provider.as_ref()?;
@@ -12806,6 +12901,10 @@ impl Editor {
});
}
// Trigger completions on word characters when edit predictions are shown in
// the menu, or when there was no active edit prediction.
let trigger_in_words =
this.show_edit_predictions_in_menu() || !had_active_edit_prediction;
@@ -14588,7 +14687,7 @@ impl Editor {
.collect::<String>();
let is_empty = query.is_empty();
let select_state = SelectNextState {
query: AhoCorasick::new(&[query])?,
query: self.build_query(&[query], cx)?,
wordwise: true,
done: is_empty,
};
@@ -14598,7 +14697,7 @@ impl Editor {
}
} else if let Some(selected_text) = selected_text {
self.select_next_state = Some(SelectNextState {
query: AhoCorasick::new(&[selected_text])?,
query: self.build_query(&[selected_text], cx)?,
wordwise: false,
done: false,
});
@@ -14806,7 +14905,7 @@ impl Editor {
.collect::<String>();
let is_empty = query.is_empty();
let select_state = SelectNextState {
query: AhoCorasick::new(&[query.chars().rev().collect::<String>()])?,
query: self.build_query(&[query.chars().rev().collect::<String>()], cx)?,
wordwise: true,
done: is_empty,
};
@@ -14816,7 +14915,8 @@ impl Editor {
}
} else if let Some(selected_text) = selected_text {
self.select_prev_state = Some(SelectNextState {
query: AhoCorasick::new(&[selected_text.chars().rev().collect::<String>()])?,
query: self
.build_query(&[selected_text.chars().rev().collect::<String>()], cx)?,
wordwise: false,
done: false,
});
@@ -14826,6 +14926,25 @@ impl Editor {
Ok(())
}
/// Builds an `AhoCorasick` automaton from the provided patterns, while
/// setting the case sensitivity based on the global
/// `SelectNextCaseSensitive` setting, if set, otherwise based on the
/// editor's settings.
fn build_query<I, P>(&self, patterns: I, cx: &Context<Self>) -> Result<AhoCorasick, BuildError>
where
I: IntoIterator<Item = P>,
P: AsRef<[u8]>,
{
let case_sensitive = self.select_next_is_case_sensitive.map_or_else(
|| EditorSettings::get_global(cx).search.case_sensitive,
|value| value,
);
let mut builder = AhoCorasickBuilder::new();
builder.ascii_case_insensitive(!case_sensitive);
builder.build(patterns)
}
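`build_query` resolves case sensitivity by letting the per-editor `Option<bool>` override win over the global search setting, then feeds the result into `AhoCorasickBuilder::ascii_case_insensitive`. A std-only sketch of just the resolution step (the names are illustrative stand-ins; the real code reads the global from `EditorSettings` via `map_or_else`, which is equivalent to `unwrap_or` here):

```rust
// A per-editor Option<bool> override takes precedence; otherwise fall back
// to the global search setting.
fn resolve_case_sensitive(override_value: Option<bool>, global_setting: bool) -> bool {
    override_value.unwrap_or(global_setting)
}

fn main() {
    assert!(resolve_case_sensitive(Some(true), false)); // override wins
    assert!(!resolve_case_sensitive(Some(false), true));
    assert!(resolve_case_sensitive(None, true)); // falls back to global
}
```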
pub fn find_next_match(
&mut self,
_: &FindNextMatch,
@@ -18800,10 +18919,17 @@ impl Editor {
if self.buffer().read(cx).is_singleton() || self.is_buffer_folded(buffer_id, cx) {
return;
}
let folded_excerpts = self.buffer().read(cx).excerpts_for_buffer(buffer_id, cx);
self.display_map.update(cx, |display_map, cx| {
display_map.fold_buffers([buffer_id], cx)
});
let snapshot = self.display_snapshot(cx);
self.selections.change_with(&snapshot, |selections| {
selections.remove_selections_from_buffer(buffer_id);
});
cx.emit(EditorEvent::BufferFoldToggled {
ids: folded_excerpts.iter().map(|&(id, _)| id).collect(),
folded: true,
@@ -23013,6 +23139,10 @@ pub trait CompletionProvider {
fn filter_completions(&self) -> bool {
true
}
fn show_snippets(&self) -> bool {
false
}
}
pub trait CodeActionProvider {
@@ -23087,10 +23217,11 @@ impl CodeActionProvider for Entity<Project> {
fn snippet_completions(
project: &Project,
buffer: &Entity<Buffer>,
buffer_position: text::Anchor,
buffer_anchor: text::Anchor,
classifier: CharClassifier,
cx: &mut App,
) -> Task<Result<CompletionResponse>> {
let languages = buffer.read(cx).languages_at(buffer_position);
let languages = buffer.read(cx).languages_at(buffer_anchor);
let snippet_store = project.snippets().read(cx);
let scopes: Vec<_> = languages
@@ -23119,97 +23250,146 @@ fn snippet_completions(
let executor = cx.background_executor().clone();
cx.background_spawn(async move {
let is_word_char = |c| classifier.is_word(c);
let mut is_incomplete = false;
let mut completions: Vec<Completion> = Vec::new();
for (scope, snippets) in scopes.into_iter() {
let classifier =
CharClassifier::new(Some(scope)).scope_context(Some(CharScopeContext::Completion));
const MAX_WORD_PREFIX_LEN: usize = 128;
let last_word: String = snapshot
.reversed_chars_for_range(text::Anchor::MIN..buffer_position)
.take(MAX_WORD_PREFIX_LEN)
.take_while(|c| classifier.is_word(*c))
.collect::<String>()
.chars()
.rev()
.collect();
const MAX_PREFIX_LEN: usize = 128;
let buffer_offset = text::ToOffset::to_offset(&buffer_anchor, &snapshot);
let window_start = buffer_offset.saturating_sub(MAX_PREFIX_LEN);
let window_start = snapshot.clip_offset(window_start, Bias::Left);
if last_word.is_empty() {
return Ok(CompletionResponse {
completions: vec![],
display_options: CompletionDisplayOptions::default(),
is_incomplete: true,
});
let max_buffer_window: String = snapshot
.text_for_range(window_start..buffer_offset)
.collect();
if max_buffer_window.is_empty() {
return Ok(CompletionResponse {
completions: vec![],
display_options: CompletionDisplayOptions::default(),
is_incomplete: true,
});
}
for (_scope, snippets) in scopes.into_iter() {
// Sort snippets by word count to match longer snippet prefixes first.
let mut sorted_snippet_candidates = snippets
.iter()
.enumerate()
.flat_map(|(snippet_ix, snippet)| {
snippet
.prefix
.iter()
.enumerate()
.map(move |(prefix_ix, prefix)| {
let word_count =
snippet_candidate_suffixes(prefix, is_word_char).count();
((snippet_ix, prefix_ix), prefix, word_count)
})
})
.collect_vec();
sorted_snippet_candidates
.sort_unstable_by_key(|(_, _, word_count)| Reverse(*word_count));
// Each prefix may be matched multiple times; the completion menu must filter out duplicates.
let buffer_windows = snippet_candidate_suffixes(&max_buffer_window, is_word_char)
.take(
sorted_snippet_candidates
.first()
.map(|(_, _, word_count)| *word_count)
.unwrap_or_default(),
)
.collect_vec();
const MAX_RESULTS: usize = 100;
// Each match also remembers how many bytes of buffer text it consumed
let mut matches: Vec<(StringMatch, usize)> = vec![];
let mut snippet_list_cutoff_index = 0;
for (buffer_index, buffer_window) in buffer_windows.iter().enumerate().rev() {
let word_count = buffer_index + 1;
// Increase `snippet_list_cutoff_index` until we have all of the
// snippets with sufficiently many words.
while sorted_snippet_candidates
.get(snippet_list_cutoff_index)
.is_some_and(|(_ix, _prefix, snippet_word_count)| {
*snippet_word_count >= word_count
})
{
snippet_list_cutoff_index += 1;
}
// Take only the candidates with at least `word_count` many words
let snippet_candidates_at_word_len =
&sorted_snippet_candidates[..snippet_list_cutoff_index];
let candidates = snippet_candidates_at_word_len
.iter()
.map(|(_snippet_ix, prefix, _snippet_word_count)| prefix)
.enumerate() // index in `sorted_snippet_candidates`
// First char must match
.filter(|(_ix, prefix)| {
itertools::equal(
prefix
.chars()
.next()
.into_iter()
.flat_map(|c| c.to_lowercase()),
buffer_window
.chars()
.next()
.into_iter()
.flat_map(|c| c.to_lowercase()),
)
})
.map(|(ix, prefix)| StringMatchCandidate::new(ix, prefix))
.collect::<Vec<StringMatchCandidate>>();
matches.extend(
fuzzy::match_strings(
&candidates,
&buffer_window,
buffer_window.chars().any(|c| c.is_uppercase()),
true,
MAX_RESULTS - matches.len(), // always prioritize longer snippets
&Default::default(),
executor.clone(),
)
.await
.into_iter()
.map(|string_match| (string_match, buffer_window.len())),
);
if matches.len() >= MAX_RESULTS {
break;
}
}
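The loop above runs its match batches longest-suffix-first, caps each `fuzzy::match_strings` call at the remaining `MAX_RESULTS` budget, and stops once the budget is spent, so longer snippet prefixes always win slots. A simplified sketch of that budgeted-batch pattern with plain values standing in for matches:

```rust
// Collect results from priority-ordered batches under a shared budget:
// each batch may only fill the slots the earlier batches left, and later
// (lower-priority) batches never run once the budget is exhausted.
fn collect_with_budget(batches: &[Vec<i32>], max_results: usize) -> Vec<i32> {
    let mut results = Vec::new();
    for batch in batches {
        let remaining = max_results - results.len();
        results.extend(batch.iter().copied().take(remaining));
        if results.len() >= max_results {
            break; // budget exhausted
        }
    }
    results
}

fn main() {
    let batches = vec![vec![1, 2, 3], vec![4, 5], vec![6]];
    assert_eq!(collect_with_budget(&batches, 4), vec![1, 2, 3, 4]);
    assert_eq!(collect_with_budget(&batches, 10), vec![1, 2, 3, 4, 5, 6]);
}
```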
let as_offset = text::ToOffset::to_offset(&buffer_position, &snapshot);
let to_lsp = |point: &text::Anchor| {
let end = text::ToPointUtf16::to_point_utf16(point, &snapshot);
point_to_lsp(end)
};
let lsp_end = to_lsp(&buffer_position);
let candidates = snippets
.iter()
.enumerate()
.flat_map(|(ix, snippet)| {
snippet
.prefix
.iter()
.map(move |prefix| StringMatchCandidate::new(ix, prefix))
})
.collect::<Vec<StringMatchCandidate>>();
const MAX_RESULTS: usize = 100;
let mut matches = fuzzy::match_strings(
&candidates,
&last_word,
last_word.chars().any(|c| c.is_uppercase()),
true,
MAX_RESULTS,
&Default::default(),
executor.clone(),
)
.await;
let lsp_end = to_lsp(&buffer_anchor);
if matches.len() >= MAX_RESULTS {
is_incomplete = true;
}
// Remove all candidates where the query's start does not match the start of any word in the candidate
if let Some(query_start) = last_word.chars().next() {
matches.retain(|string_match| {
split_words(&string_match.string).any(|word| {
// Check that the first codepoint of the word as lowercase matches the first
// codepoint of the query as lowercase
word.chars()
.flat_map(|codepoint| codepoint.to_lowercase())
.zip(query_start.to_lowercase())
.all(|(word_cp, query_cp)| word_cp == query_cp)
})
});
}
let matched_strings = matches
.into_iter()
.map(|m| m.string)
.collect::<HashSet<_>>();
completions.extend(snippets.iter().filter_map(|snippet| {
let matching_prefix = snippet
.prefix
.iter()
.find(|prefix| matched_strings.contains(*prefix))?;
let start = as_offset - last_word.len();
completions.extend(matches.iter().map(|(string_match, buffer_window_len)| {
let ((snippet_index, prefix_index), matching_prefix, _snippet_word_count) =
sorted_snippet_candidates[string_match.candidate_id];
let snippet = &snippets[snippet_index];
let start = buffer_offset - buffer_window_len;
let start = snapshot.anchor_before(start);
let range = start..buffer_position;
let range = start..buffer_anchor;
let lsp_start = to_lsp(&start);
let lsp_range = lsp::Range {
start: lsp_start,
end: lsp_end,
};
Some(Completion {
Completion {
replace_range: range,
new_text: snippet.body.clone(),
source: CompletionSource::Lsp {
@@ -23239,7 +23419,11 @@ fn snippet_completions(
}),
lsp_defaults: None,
},
label: CodeLabel::plain(matching_prefix.clone(), None),
label: CodeLabel {
text: matching_prefix.clone(),
runs: Vec::new(),
filter_range: 0..matching_prefix.len(),
},
icon_path: None,
documentation: Some(CompletionDocumentation::SingleLineAndMultiLinePlainText {
single_line: snippet.name.clone().into(),
@@ -23250,8 +23434,10 @@ fn snippet_completions(
}),
insert_text_mode: None,
confirm: None,
})
}))
match_start: Some(start),
snippet_deduplication_key: Some((snippet_index, prefix_index)),
}
}));
}
Ok(CompletionResponse {
@@ -23273,16 +23459,8 @@ impl CompletionProvider for Entity<Project> {
cx: &mut Context<Editor>,
) -> Task<Result<Vec<CompletionResponse>>> {
self.update(cx, |project, cx| {
let snippets = snippet_completions(project, buffer, buffer_position, cx);
let project_completions = project.completions(buffer, buffer_position, options, cx);
cx.background_spawn(async move {
let mut responses = project_completions.await?;
let snippets = snippets.await?;
if !snippets.completions.is_empty() {
responses.push(snippets);
}
Ok(responses)
})
let task = project.completions(buffer, buffer_position, options, cx);
cx.background_spawn(task)
})
}
@@ -23354,6 +23532,10 @@ impl CompletionProvider for Entity<Project> {
buffer.completion_triggers().contains(text)
}
fn show_snippets(&self) -> bool {
true
}
}
impl SemanticsProvider for Entity<Project> {
@@ -24501,6 +24683,33 @@ pub(crate) fn split_words(text: &str) -> impl std::iter::Iterator<Item = &str> + '_ {
})
}
/// Given a string of text immediately before the cursor, iterates over possible
/// strings a snippet could match to. More precisely: returns an iterator over
/// suffixes of `text` created by splitting at word boundaries (before & after
/// every non-word character).
///
/// Shorter suffixes are returned first.
pub(crate) fn snippet_candidate_suffixes(
text: &str,
is_word_char: impl Fn(char) -> bool,
) -> impl std::iter::Iterator<Item = &str> {
let mut prev_index = text.len();
let mut prev_codepoint = None;
text.char_indices()
.rev()
.chain([(0, '\0')])
.filter_map(move |(index, codepoint)| {
let prev_index = std::mem::replace(&mut prev_index, index);
let prev_codepoint = prev_codepoint.replace(codepoint)?;
if is_word_char(prev_codepoint) && is_word_char(codepoint) {
None
} else {
let chunk = &text[prev_index..]; // go to end of string
Some(chunk)
}
})
}
pub trait RangeToAnchorExt: Sized {
fn to_anchors(self, snapshot: &MultiBufferSnapshot) -> Range<Anchor>;

View File

@@ -33,6 +33,7 @@ pub struct EditorSettings {
pub horizontal_scroll_margin: f32,
pub scroll_sensitivity: f32,
pub fast_scroll_sensitivity: f32,
pub sticky_scroll: StickyScroll,
pub relative_line_numbers: RelativeLineNumbers,
pub seed_search_query_from_cursor: SeedQuerySetting,
pub use_smartcase_search: bool,
@@ -65,6 +66,11 @@ pub struct Jupyter {
pub enabled: bool,
}
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
pub struct StickyScroll {
pub enabled: bool,
}
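Assuming the new `sticky_scroll` setting surfaces in `settings.json` under a key derived from the struct field names (the key name is not confirmed by this diff), it would be toggled roughly like:

```json
{
  "sticky_scroll": {
    "enabled": true
  }
}
```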
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct Toolbar {
pub breadcrumbs: bool,
@@ -156,10 +162,15 @@ pub struct DragAndDropSelection {
pub struct SearchSettings {
/// Whether to show the project search button in the status bar.
pub button: bool,
/// Whether to only match on whole words.
pub whole_word: bool,
/// Whether to match case sensitively.
pub case_sensitive: bool,
/// Whether to include gitignored files in search results.
pub include_ignored: bool,
/// Whether to interpret the search query as a regular expression.
pub regex: bool,
/// Whether to center the cursor on each search match when navigating.
pub center_on_match: bool,
}
@@ -185,6 +196,7 @@ impl Settings for EditorSettings {
let toolbar = editor.toolbar.unwrap();
let search = editor.search.unwrap();
let drag_and_drop_selection = editor.drag_and_drop_selection.unwrap();
let sticky_scroll = editor.sticky_scroll.unwrap();
Self {
cursor_blink: editor.cursor_blink.unwrap(),
cursor_shape: editor.cursor_shape.map(Into::into),
@@ -235,6 +247,9 @@ impl Settings for EditorSettings {
horizontal_scroll_margin: editor.horizontal_scroll_margin.unwrap(),
scroll_sensitivity: editor.scroll_sensitivity.unwrap(),
fast_scroll_sensitivity: editor.fast_scroll_sensitivity.unwrap(),
sticky_scroll: StickyScroll {
enabled: sticky_scroll.enabled.unwrap(),
},
relative_line_numbers: editor.relative_line_numbers.unwrap(),
seed_search_query_from_cursor: editor.seed_search_query_from_cursor.unwrap(),
use_smartcase_search: editor.use_smartcase_search.unwrap(),

File diff suppressed because it is too large

View File

@@ -6,10 +6,10 @@ use crate::{
EditDisplayMode, EditPrediction, Editor, EditorMode, EditorSettings, EditorSnapshot,
EditorStyle, FILE_HEADER_HEIGHT, FocusedBlock, GutterDimensions, HalfPageDown, HalfPageUp,
HandleInput, HoveredCursor, InlayHintRefreshReason, JumpData, LineDown, LineHighlight, LineUp,
MAX_LINE_LEN, MINIMAP_FONT_SIZE, MULTI_BUFFER_EXCERPT_HEADER_HEIGHT, OpenExcerpts,
OpenExcerptsSplit, PageDown, PageUp, PhantomBreakpointIndicator, Point, RowExt, RowRangeExt,
SelectPhase, SelectedTextHighlight, Selection, SelectionDragState, SizingBehavior, SoftWrap,
StickyHeaderExcerpt, ToPoint, ToggleFold, ToggleFoldAll,
MAX_LINE_LEN, MINIMAP_FONT_SIZE, MULTI_BUFFER_EXCERPT_HEADER_HEIGHT, OpenExcerpts, PageDown,
PageUp, PhantomBreakpointIndicator, Point, RowExt, RowRangeExt, SelectPhase,
SelectedTextHighlight, Selection, SelectionDragState, SelectionEffects, SizingBehavior,
SoftWrap, StickyHeaderExcerpt, ToPoint, ToggleFold, ToggleFoldAll,
code_context_menus::{CodeActionsMenu, MENU_ASIDE_MAX_WIDTH, MENU_ASIDE_MIN_WIDTH, MENU_GAP},
display_map::{
Block, BlockContext, BlockStyle, ChunkRendererId, DisplaySnapshot, EditorMargins,
@@ -29,7 +29,7 @@ use crate::{
items::BufferSearchHighlights,
mouse_context_menu::{self, MenuPosition},
scroll::{
ActiveScrollbarState, ScrollOffset, ScrollPixelOffset, ScrollbarThumbState,
ActiveScrollbarState, Autoscroll, ScrollOffset, ScrollPixelOffset, ScrollbarThumbState,
scroll_amount::ScrollAmount,
},
};
@@ -3255,11 +3255,9 @@ impl EditorElement {
(newest_selection_head, relative)
});
let relative_to = if relative.enabled() {
Some(newest_selection_head.row())
} else {
None
};
let relative_line_numbers_enabled = relative.enabled();
let relative_to = relative_line_numbers_enabled.then(|| newest_selection_head.row());
let relative_rows =
self.calculate_relative_line_numbers(snapshot, &rows, relative_to, relative.wrapped());
let mut line_number = String::new();
@@ -3271,17 +3269,18 @@ impl EditorElement {
} else {
row_info.buffer_row? + 1
};
let number = relative_rows
.get(&display_row)
.unwrap_or(&non_relative_number);
write!(&mut line_number, "{number}").unwrap();
if row_info
.diff_status
.is_some_and(|status| status.is_deleted())
let relative_number = relative_rows.get(&display_row);
if !(relative_line_numbers_enabled && relative_number.is_some())
&& row_info
.diff_status
.is_some_and(|status| status.is_deleted())
{
return None;
}
let number = relative_number.unwrap_or(&non_relative_number);
write!(&mut line_number, "{number}").unwrap();
let color = active_rows
.get(&display_row)
.map(|spec| {
@@ -4043,24 +4042,17 @@ impl EditorElement {
)
.group_hover("", |div| div.underline()),
)
.on_click({
let focus_handle = focus_handle.clone();
move |event, window, cx| {
if event.modifiers().secondary() {
focus_handle.dispatch_action(
&OpenExcerptsSplit,
window,
cx,
);
} else {
focus_handle.dispatch_action(
&OpenExcerpts,
window,
cx,
);
}
.on_click(window.listener_for(&self.editor, {
let jump_data = jump_data.clone();
move |editor, e: &ClickEvent, window, cx| {
editor.open_excerpts_common(
Some(jump_data.clone()),
e.modifiers().secondary(),
window,
cx,
);
}
}),
})),
)
.when_some(parent_path, |then, path| {
then.child(div().child(path).text_color(
@@ -4088,24 +4080,17 @@ impl EditorElement {
cx,
)),
)
.on_click({
let focus_handle = focus_handle.clone();
move |event, window, cx| {
if event.modifiers().secondary() {
focus_handle.dispatch_action(
&OpenExcerptsSplit,
window,
cx,
);
} else {
focus_handle.dispatch_action(
&OpenExcerpts,
window,
cx,
);
}
.on_click(window.listener_for(&self.editor, {
let jump_data = jump_data.clone();
move |editor, e: &ClickEvent, window, cx| {
editor.open_excerpts_common(
Some(jump_data.clone()),
e.modifiers().secondary(),
window,
cx,
);
}
}),
})),
)
},
)
@@ -4555,6 +4540,138 @@ impl EditorElement {
header
}
fn layout_sticky_headers(
&self,
snapshot: &EditorSnapshot,
editor_width: Pixels,
is_row_soft_wrapped: impl Copy + Fn(usize) -> bool,
line_height: Pixels,
scroll_pixel_position: gpui::Point<ScrollPixelOffset>,
content_origin: gpui::Point<Pixels>,
gutter_dimensions: &GutterDimensions,
gutter_hitbox: &Hitbox,
text_hitbox: &Hitbox,
window: &mut Window,
cx: &mut App,
) -> Option<StickyHeaders> {
let show_line_numbers = snapshot
.show_line_numbers
.unwrap_or_else(|| EditorSettings::get_global(cx).gutter.line_numbers);
let rows = Self::sticky_headers(self.editor.read(cx), snapshot, cx);
let mut lines = Vec::<StickyHeaderLine>::new();
for StickyHeader {
item,
sticky_row,
start_point,
offset,
} in rows.into_iter().rev()
{
let line = layout_line(
sticky_row,
snapshot,
&self.style,
editor_width,
is_row_soft_wrapped,
window,
cx,
);
let line_number = show_line_numbers.then(|| {
let number = (start_point.row + 1).to_string();
let color = cx.theme().colors().editor_line_number;
self.shape_line_number(SharedString::from(number), color, window)
});
lines.push(StickyHeaderLine::new(
sticky_row,
line_height * offset as f32,
line,
line_number,
item.range.start,
line_height,
scroll_pixel_position,
content_origin,
gutter_hitbox,
text_hitbox,
window,
cx,
));
}
lines.reverse();
if lines.is_empty() {
return None;
}
Some(StickyHeaders {
lines,
gutter_background: cx.theme().colors().editor_gutter_background,
content_background: self.style.background,
gutter_right_padding: gutter_dimensions.right_padding,
})
}
pub(crate) fn sticky_headers(
editor: &Editor,
snapshot: &EditorSnapshot,
cx: &App,
) -> Vec<StickyHeader> {
let scroll_top = snapshot.scroll_position().y;
let mut end_rows = Vec::<DisplayRow>::new();
let mut rows = Vec::<StickyHeader>::new();
let items = editor.sticky_headers(cx).unwrap_or_default();
for item in items {
let start_point = item.range.start.to_point(snapshot.buffer_snapshot());
let end_point = item.range.end.to_point(snapshot.buffer_snapshot());
let sticky_row = snapshot
.display_snapshot
.point_to_display_point(start_point, Bias::Left)
.row();
let end_row = snapshot
.display_snapshot
.point_to_display_point(end_point, Bias::Left)
.row();
let max_sticky_row = end_row.previous_row();
if max_sticky_row <= sticky_row {
continue;
}
while end_rows
.last()
.is_some_and(|&last_end| last_end < sticky_row)
{
end_rows.pop();
}
let depth = end_rows.len();
let adjusted_scroll_top = scroll_top + depth as f64;
if sticky_row.as_f64() >= adjusted_scroll_top || end_row.as_f64() <= adjusted_scroll_top
{
continue;
}
let max_scroll_offset = max_sticky_row.as_f64() - scroll_top;
let offset = (depth as f64).min(max_scroll_offset);
end_rows.push(end_row);
rows.push(StickyHeader {
item,
sticky_row,
start_point,
offset,
});
}
rows
}
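Stripped of the editor and display-map types, the walk in `sticky_headers` above reduces to the following sketch (assumed simplifications: rows are plain `usize`, items arrive sorted by start row, and soft wrapping is ignored):

```rust
struct OutlineItem {
    start_row: usize,
    end_row: usize,
}

/// Returns `(start_row, vertical_offset_in_lines)` for each header that
/// should currently stick, mirroring the stack-of-end-rows walk above.
fn sticky_rows(items: &[OutlineItem], scroll_top: f64) -> Vec<(usize, f64)> {
    let mut end_rows: Vec<usize> = Vec::new();
    let mut out = Vec::new();
    for item in items {
        // A header can stick at most until the row before its body ends.
        let max_sticky_row = item.end_row.saturating_sub(1);
        if max_sticky_row <= item.start_row {
            continue;
        }
        // Pop ancestors whose scope ended before this item starts.
        while end_rows.last().is_some_and(|&end| end < item.start_row) {
            end_rows.pop();
        }
        let depth = end_rows.len();
        // Each enclosing header already occupies one line at the top.
        let adjusted_scroll_top = scroll_top + depth as f64;
        // Skip items that are fully visible or already scrolled past.
        if item.start_row as f64 >= adjusted_scroll_top
            || item.end_row as f64 <= adjusted_scroll_top
        {
            continue;
        }
        // Let the header slide out of the way as its scope scrolls off.
        let offset = (depth as f64).min(max_sticky_row as f64 - scroll_top);
        end_rows.push(item.end_row);
        out.push((item.start_row, offset));
    }
    out
}

fn main() {
    // An outer scope spanning rows 0..20 containing an inner one at 5..15,
    // with the viewport scrolled to row 6: both headers stick, stacked.
    let items = [
        OutlineItem { start_row: 0, end_row: 20 },
        OutlineItem { start_row: 5, end_row: 15 },
    ];
    assert_eq!(sticky_rows(&items, 6.0), vec![(0, 0.0), (5, 1.0)]);
}
```

The stack of `end_rows` is what produces the nesting behavior: an inner header is offset one line per still-open ancestor, and the `min` term lets a header scroll away smoothly once its scope's last row approaches the top.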
fn layout_cursor_popovers(
&self,
line_height: Pixels,
@@ -6407,6 +6524,89 @@ impl EditorElement {
}
}
fn paint_sticky_headers(
&mut self,
layout: &mut EditorLayout,
window: &mut Window,
cx: &mut App,
) {
let Some(mut sticky_headers) = layout.sticky_headers.take() else {
return;
};
if sticky_headers.lines.is_empty() {
layout.sticky_headers = Some(sticky_headers);
return;
}
let whitespace_setting = self
.editor
.read(cx)
.buffer
.read(cx)
.language_settings(cx)
.show_whitespaces;
sticky_headers.paint(layout, whitespace_setting, window, cx);
let sticky_header_hitboxes: Vec<Hitbox> = sticky_headers
.lines
.iter()
.map(|line| line.hitbox.clone())
.collect();
let hovered_hitbox = sticky_header_hitboxes
.iter()
.find_map(|hitbox| hitbox.is_hovered(window).then_some(hitbox.id));
window.on_mouse_event(move |_: &MouseMoveEvent, phase, window, _cx| {
if !phase.bubble() {
return;
}
let current_hover = sticky_header_hitboxes
.iter()
.find_map(|hitbox| hitbox.is_hovered(window).then_some(hitbox.id));
if hovered_hitbox != current_hover {
window.refresh();
}
});
for (line_index, line) in sticky_headers.lines.iter().enumerate() {
let editor = self.editor.clone();
let hitbox = line.hitbox.clone();
let target_anchor = line.target_anchor;
window.on_mouse_event(move |event: &MouseDownEvent, phase, window, cx| {
if !phase.bubble() {
return;
}
if event.button == MouseButton::Left && hitbox.is_hovered(window) {
editor.update(cx, |editor, cx| {
editor.change_selections(
SelectionEffects::scroll(Autoscroll::top_relative(line_index)),
window,
cx,
|selections| selections.select_ranges([target_anchor..target_anchor]),
);
cx.stop_propagation();
});
}
});
}
let text_bounds = layout.position_map.text_hitbox.bounds;
let border_top = text_bounds.top()
+ sticky_headers.lines.last().unwrap().offset
+ layout.position_map.line_height;
let separator_height = px(1.);
let border_bounds = Bounds::from_corners(
point(layout.gutter_hitbox.bounds.left(), border_top),
point(text_bounds.right(), border_top + separator_height),
);
window.paint_quad(fill(border_bounds, cx.theme().colors().border_variant));
layout.sticky_headers = Some(sticky_headers);
}
fn paint_lines_background(
&mut self,
layout: &mut EditorLayout,
@@ -8107,6 +8307,27 @@ impl LineWithInvisibles {
cx: &mut App,
) {
let line_y = f32::from(line_height) * Pixels::from(row.as_f64() - scroll_position.y);
self.prepaint_with_custom_offset(
line_height,
scroll_pixel_position,
content_origin,
line_y,
line_elements,
window,
cx,
);
}
fn prepaint_with_custom_offset(
&mut self,
line_height: Pixels,
scroll_pixel_position: gpui::Point<ScrollPixelOffset>,
content_origin: gpui::Point<Pixels>,
line_y: Pixels,
line_elements: &mut SmallVec<[AnyElement; 1]>,
window: &mut Window,
cx: &mut App,
) {
let mut fragment_origin =
content_origin + gpui::point(Pixels::from(-scroll_pixel_position.x), line_y);
for fragment in &mut self.fragments {
@@ -8141,9 +8362,31 @@ impl LineWithInvisibles {
window: &mut Window,
cx: &mut App,
) {
let line_height = layout.position_map.line_height;
let line_y = line_height * (row.as_f64() - layout.position_map.scroll_position.y) as f32;
self.draw_with_custom_offset(
layout,
row,
content_origin,
layout.position_map.line_height
* (row.as_f64() - layout.position_map.scroll_position.y) as f32,
whitespace_setting,
selection_ranges,
window,
cx,
);
}
fn draw_with_custom_offset(
&self,
layout: &EditorLayout,
row: DisplayRow,
content_origin: gpui::Point<Pixels>,
line_y: Pixels,
whitespace_setting: ShowWhitespaceSetting,
selection_ranges: &[Range<DisplayPoint>],
window: &mut Window,
cx: &mut App,
) {
let line_height = layout.position_map.line_height;
let mut fragment_origin = content_origin
+ gpui::point(
Pixels::from(-layout.position_map.scroll_pixel_position.x),
@@ -8363,7 +8606,7 @@ impl LineWithInvisibles {
let fragment_end_x = fragment_start_x + shaped_line.width;
if x < fragment_end_x {
return Some(
fragment_start_index + shaped_line.index_for_x(x - fragment_start_x),
fragment_start_index + shaped_line.index_for_x(x - fragment_start_x)?,
);
}
fragment_start_x = fragment_end_x;
@@ -8582,6 +8825,7 @@ impl Element for EditorElement {
};
let is_minimap = self.editor.read(cx).mode.is_minimap();
let is_singleton = self.editor.read(cx).buffer_kind(cx) == ItemBufferKind::Singleton;
if !is_minimap {
let focus_handle = self.editor.focus_handle(cx);
@@ -9228,6 +9472,26 @@ impl Element for EditorElement {
scroll_position.x * f64::from(em_advance),
scroll_position.y * f64::from(line_height),
);
let sticky_headers = if !is_minimap
&& is_singleton
&& EditorSettings::get_global(cx).sticky_scroll.enabled
{
self.layout_sticky_headers(
&snapshot,
editor_width,
is_row_soft_wrapped,
line_height,
scroll_pixel_position,
content_origin,
&gutter_dimensions,
&gutter_hitbox,
&text_hitbox,
window,
cx,
)
} else {
None
};
let indent_guides = self.layout_indent_guides(
content_origin,
text_hitbox.origin,
@@ -9697,6 +9961,7 @@ impl Element for EditorElement {
tab_invisible,
space_invisible,
sticky_buffer_header,
sticky_headers,
expand_toggles,
}
})
@@ -9767,6 +10032,7 @@ impl Element for EditorElement {
}
});
self.paint_sticky_headers(layout, window, cx);
self.paint_minimap(layout, window, cx);
self.paint_scrollbars(layout, window, cx);
self.paint_edit_prediction_popover(layout, window, cx);
@@ -9875,15 +10141,180 @@ pub struct EditorLayout {
tab_invisible: ShapedLine,
space_invisible: ShapedLine,
sticky_buffer_header: Option<AnyElement>,
sticky_headers: Option<StickyHeaders>,
document_colors: Option<(DocumentColorsRenderMode, Vec<(Range<DisplayPoint>, Hsla)>)>,
}
struct StickyHeaders {
lines: Vec<StickyHeaderLine>,
gutter_background: Hsla,
content_background: Hsla,
gutter_right_padding: Pixels,
}
struct StickyHeaderLine {
row: DisplayRow,
offset: Pixels,
line: LineWithInvisibles,
line_number: Option<ShapedLine>,
elements: SmallVec<[AnyElement; 1]>,
available_text_width: Pixels,
target_anchor: Anchor,
hitbox: Hitbox,
}
impl EditorLayout {
fn line_end_overshoot(&self) -> Pixels {
0.15 * self.position_map.line_height
}
}
impl StickyHeaders {
fn paint(
&mut self,
layout: &mut EditorLayout,
whitespace_setting: ShowWhitespaceSetting,
window: &mut Window,
cx: &mut App,
) {
let line_height = layout.position_map.line_height;
for line in self.lines.iter_mut().rev() {
window.paint_layer(
Bounds::new(
layout.gutter_hitbox.origin + point(Pixels::ZERO, line.offset),
size(line.hitbox.size.width, line_height),
),
|window| {
let gutter_bounds = Bounds::new(
layout.gutter_hitbox.origin + point(Pixels::ZERO, line.offset),
size(layout.gutter_hitbox.size.width, line_height),
);
window.paint_quad(fill(gutter_bounds, self.gutter_background));
let text_bounds = Bounds::new(
layout.position_map.text_hitbox.origin + point(Pixels::ZERO, line.offset),
size(line.available_text_width, line_height),
);
window.paint_quad(fill(text_bounds, self.content_background));
if line.hitbox.is_hovered(window) {
let hover_overlay = cx.theme().colors().panel_overlay_hover;
window.paint_quad(fill(gutter_bounds, hover_overlay));
window.paint_quad(fill(text_bounds, hover_overlay));
}
line.paint(
layout,
self.gutter_right_padding,
line.available_text_width,
layout.content_origin,
line_height,
whitespace_setting,
window,
cx,
);
},
);
window.set_cursor_style(CursorStyle::PointingHand, &line.hitbox);
}
}
}
impl StickyHeaderLine {
fn new(
row: DisplayRow,
offset: Pixels,
mut line: LineWithInvisibles,
line_number: Option<ShapedLine>,
target_anchor: Anchor,
line_height: Pixels,
scroll_pixel_position: gpui::Point<ScrollPixelOffset>,
content_origin: gpui::Point<Pixels>,
gutter_hitbox: &Hitbox,
text_hitbox: &Hitbox,
window: &mut Window,
cx: &mut App,
) -> Self {
let mut elements = SmallVec::<[AnyElement; 1]>::new();
line.prepaint_with_custom_offset(
line_height,
scroll_pixel_position,
content_origin,
offset,
&mut elements,
window,
cx,
);
let hitbox_bounds = Bounds::new(
gutter_hitbox.origin + point(Pixels::ZERO, offset),
size(text_hitbox.right() - gutter_hitbox.left(), line_height),
);
let available_text_width =
(hitbox_bounds.size.width - gutter_hitbox.size.width).max(Pixels::ZERO);
Self {
row,
offset,
line,
line_number,
elements,
available_text_width,
target_anchor,
hitbox: window.insert_hitbox(hitbox_bounds, HitboxBehavior::BlockMouseExceptScroll),
}
}
fn paint(
&mut self,
layout: &EditorLayout,
gutter_right_padding: Pixels,
available_text_width: Pixels,
content_origin: gpui::Point<Pixels>,
line_height: Pixels,
whitespace_setting: ShowWhitespaceSetting,
window: &mut Window,
cx: &mut App,
) {
window.with_content_mask(
Some(ContentMask {
bounds: Bounds::new(
layout.position_map.text_hitbox.bounds.origin
+ point(Pixels::ZERO, self.offset),
size(available_text_width, line_height),
),
}),
|window| {
self.line.draw_with_custom_offset(
layout,
self.row,
content_origin,
self.offset,
whitespace_setting,
&[],
window,
cx,
);
for element in &mut self.elements {
element.paint(window, cx);
}
},
);
if let Some(line_number) = &self.line_number {
let gutter_origin = layout.gutter_hitbox.origin + point(Pixels::ZERO, self.offset);
let gutter_width = layout.gutter_hitbox.size.width;
let origin = point(
gutter_origin.x + gutter_width - gutter_right_padding - line_number.width,
gutter_origin.y,
);
line_number.paint(origin, line_height, window, cx).log_err();
}
}
}
#[derive(Debug)]
struct LineNumberSegment {
shaped_line: ShapedLine,
@@ -10730,6 +11161,13 @@ impl HighlightedRange {
}
}
pub(crate) struct StickyHeader {
pub item: language::OutlineItem<Anchor>,
pub sticky_row: DisplayRow,
pub start_point: Point,
pub offset: ScrollOffset,
}
enum CursorPopoverType {
CodeContextMenu,
EditPrediction,
@@ -11002,6 +11440,46 @@ mod tests {
assert_eq!(relative_rows[&DisplayRow(0)], 5);
assert_eq!(relative_rows[&DisplayRow(1)], 4);
assert_eq!(relative_rows[&DisplayRow(2)], 3);
const DELETED_LINE: u32 = 3;
let layouts = cx
.update_window(*window, |_, window, cx| {
element.layout_line_numbers(
None,
GutterDimensions {
left_padding: Pixels::ZERO,
right_padding: Pixels::ZERO,
width: px(30.0),
margin: Pixels::ZERO,
git_blame_entries_width: None,
},
line_height,
gpui::Point::default(),
DisplayRow(0)..DisplayRow(6),
&(0..6)
.map(|row| RowInfo {
buffer_row: Some(row),
diff_status: (row == DELETED_LINE).then(|| {
DiffHunkStatus::deleted(
buffer_diff::DiffHunkSecondaryStatus::NoSecondaryHunk,
)
}),
..Default::default()
})
.collect::<Vec<_>>(),
&BTreeMap::default(),
Some(DisplayPoint::new(DisplayRow(0), 0)),
&snapshot,
window,
cx,
)
})
.unwrap();
assert_eq!(layouts.len(), 5);
assert!(
layouts.get(&MultiBufferRow(DELETED_LINE)).is_none(),
"Deleted line should not have a line number"
);
}
#[gpui::test]
@@ -11077,6 +11555,62 @@ mod tests {
// current line has no relative number
assert_eq!(relative_rows[&DisplayRow(4)], 1);
assert_eq!(relative_rows[&DisplayRow(5)], 2);
let layouts = cx
.update_window(*window, |_, window, cx| {
element.layout_line_numbers(
None,
GutterDimensions {
left_padding: Pixels::ZERO,
right_padding: Pixels::ZERO,
width: px(30.0),
margin: Pixels::ZERO,
git_blame_entries_width: None,
},
line_height,
gpui::Point::default(),
DisplayRow(0)..DisplayRow(6),
&(0..6)
.map(|row| RowInfo {
buffer_row: Some(row),
diff_status: Some(DiffHunkStatus::deleted(
buffer_diff::DiffHunkSecondaryStatus::NoSecondaryHunk,
)),
..Default::default()
})
.collect::<Vec<_>>(),
&BTreeMap::from_iter([(DisplayRow(0), LineHighlightSpec::default())]),
Some(DisplayPoint::new(DisplayRow(0), 0)),
&snapshot,
window,
cx,
)
})
.unwrap();
assert!(
layouts.is_empty(),
"Deleted lines should have no line number"
);
let relative_rows = window
.update(cx, |editor, window, cx| {
let snapshot = editor.snapshot(window, cx);
element.calculate_relative_line_numbers(
&snapshot,
&(DisplayRow(0)..DisplayRow(6)),
Some(DisplayRow(3)),
true,
)
})
.unwrap();
// Deleted lines should still have relative numbers
assert_eq!(relative_rows[&DisplayRow(0)], 3);
assert_eq!(relative_rows[&DisplayRow(1)], 2);
assert_eq!(relative_rows[&DisplayRow(2)], 1);
// current line, even if deleted, has no relative number
assert_eq!(relative_rows[&DisplayRow(4)], 1);
assert_eq!(relative_rows[&DisplayRow(5)], 2);
}
#[gpui::test]

View File

@@ -1796,6 +1796,14 @@ impl SearchableItem for Editor {
fn search_bar_visibility_changed(&mut self, _: bool, _: &mut Window, _: &mut Context<Self>) {
self.expect_bounds_change = self.last_bounds;
}
fn set_search_is_case_sensitive(
&mut self,
case_sensitive: Option<bool>,
_cx: &mut Context<Self>,
) {
self.select_next_is_case_sensitive = case_sensitive;
}
}
pub fn active_match_index(

View File

@@ -372,7 +372,7 @@ impl SelectionsCollection {
let is_empty = positions.start == positions.end;
let line_len = display_map.line_len(row);
let line = display_map.layout_row(row, text_layout_details);
let start_col = line.index_for_x(positions.start) as u32;
let start_col = line.closest_index_for_x(positions.start) as u32;
let (start, end) = if is_empty {
let point = DisplayPoint::new(row, std::cmp::min(start_col, line_len));
@@ -382,7 +382,7 @@ impl SelectionsCollection {
return None;
}
let start = DisplayPoint::new(row, start_col);
let end_col = line.index_for_x(positions.end) as u32;
let end_col = line.closest_index_for_x(positions.end) as u32;
let end = DisplayPoint::new(row, end_col);
(start, end)
};
@@ -487,6 +487,43 @@ impl<'snap, 'a> MutableSelectionsCollection<'snap, 'a> {
self.selections_changed |= changed;
}
pub fn remove_selections_from_buffer(&mut self, buffer_id: language::BufferId) {
let mut changed = false;
let filtered_selections: Arc<[Selection<Anchor>]> = {
self.disjoint
.iter()
.filter(|selection| {
if let Some(selection_buffer_id) =
self.snapshot.buffer_id_for_anchor(selection.start)
{
let should_remove = selection_buffer_id == buffer_id;
changed |= should_remove;
!should_remove
} else {
true
}
})
.cloned()
.collect()
};
if filtered_selections.is_empty() {
let default_anchor = self.snapshot.anchor_before(0);
self.collection.disjoint = Arc::from([Selection {
id: post_inc(&mut self.collection.next_selection_id),
start: default_anchor,
end: default_anchor,
reversed: false,
goal: SelectionGoal::None,
}]);
} else {
self.collection.disjoint = filtered_selections;
}
self.selections_changed |= changed;
}
pub fn clear_pending(&mut self) {
if self.collection.pending.is_some() {
self.collection.pending = None;

View File

@@ -6,6 +6,7 @@ use buffer_diff::DiffHunkStatusKind;
use collections::BTreeMap;
use futures::Future;
use git::repository::RepoPath;
use gpui::{
AnyWindowHandle, App, Context, Entity, Focusable as _, Keystroke, Pixels, Point,
VisualTestContext, Window, WindowHandle, prelude::*,
@@ -58,6 +59,17 @@ impl EditorTestContext {
})
.await
.unwrap();
let language = project
.read_with(cx, |project, _cx| {
project.languages().language_for_name("Plain Text")
})
.await
.unwrap();
buffer.update(cx, |buffer, cx| {
buffer.set_language(Some(language), cx);
});
let editor = cx.add_window(|window, cx| {
let editor = build_editor_with_project(
project,
@@ -334,7 +346,10 @@ impl EditorTestContext {
let path = self.update_buffer(|buffer, _| buffer.file().unwrap().path().clone());
let mut found = None;
fs.with_git_state(&Self::root_path().join(".git"), false, |git_state| {
found = git_state.index_contents.get(&path.into()).cloned();
found = git_state
.index_contents
.get(&RepoPath::from_rel_path(&path))
.cloned();
})
.unwrap();
assert_eq!(expected, found.as_deref());

View File

@@ -463,8 +463,8 @@ pub fn find_model(
.ok_or_else(|| {
anyhow::anyhow!(
"No language model with ID {}/{} was available. Available models: {}",
selected.model.0,
selected.provider.0,
selected.model.0,
model_registry
.available_models(cx)
.map(|model| format!("{}/{}", model.provider_id().0, model.id().0))

View File

@@ -322,7 +322,7 @@ impl ExampleInstance {
thread.add_default_tools(Rc::new(EvalThreadEnvironment {
project: project.clone(),
}), cx);
thread.set_profile(meta.profile_id.clone());
thread.set_profile(meta.profile_id.clone(), cx);
thread.set_model(
LanguageModelInterceptor::new(
LanguageModelRegistry::read_global(cx).default_model().expect("Missing model").model.clone(),

View File

@@ -267,10 +267,9 @@ impl ExtensionManifest {
let mut extension_manifest_path = extension_dir.join("extension.json");
if fs.is_file(&extension_manifest_path).await {
let manifest_content = fs
.load(&extension_manifest_path)
.await
.with_context(|| format!("failed to load {extension_name} extension.json"))?;
let manifest_content = fs.load(&extension_manifest_path).await.with_context(|| {
format!("loading {extension_name} extension.json, {extension_manifest_path:?}")
})?;
let manifest_json = serde_json::from_str::<OldExtensionManifest>(&manifest_content)
.with_context(|| {
format!("invalid extension.json for extension {extension_name}")
@@ -279,10 +278,9 @@ impl ExtensionManifest {
Ok(manifest_from_old_manifest(manifest_json, extension_name))
} else {
extension_manifest_path.set_extension("toml");
let manifest_content = fs
.load(&extension_manifest_path)
.await
.with_context(|| format!("failed to load {extension_name} extension.toml"))?;
let manifest_content = fs.load(&extension_manifest_path).await.with_context(|| {
format!("loading {extension_name} extension.toml, {extension_manifest_path:?}")
})?;
toml::from_str(&manifest_content).map_err(|err| {
anyhow!("Invalid extension.toml for extension {extension_name}:\n{err}")
})

View File

@@ -31,8 +31,7 @@ use util::test::TempTree;
#[cfg(test)]
#[ctor::ctor]
fn init_logger() {
// show info logs while we debug the extension_store tests hanging.
zlog::init_test_with("info");
zlog::init_test();
}
#[gpui::test]
@@ -532,6 +531,7 @@ async fn test_extension_store(cx: &mut TestAppContext) {
#[gpui::test]
async fn test_extension_store_with_test_extension(cx: &mut TestAppContext) {
log::info!("Initializing test");
init_test(cx);
cx.executor().allow_parking();
@@ -556,6 +556,8 @@ async fn test_extension_store_with_test_extension(cx: &mut TestAppContext) {
let extensions_dir = extensions_tree.path().canonicalize().unwrap();
let project_dir = project_dir.path().canonicalize().unwrap();
log::info!("Setting up test");
let project = Project::test(fs.clone(), [project_dir.as_path()], cx).await;
let proxy = Arc::new(ExtensionHostProxy::new());
@@ -674,6 +676,8 @@ async fn test_extension_store_with_test_extension(cx: &mut TestAppContext) {
)
});
log::info!("Flushing events");
// Ensure that debounces fire.
let mut events = cx.events(&extension_store);
let executor = cx.executor();

View File

@@ -763,17 +763,17 @@ impl WasmExtension {
.fs
.open_sync(&path)
.await
.context("failed to open wasm file")?;
.context(format!("opening wasm file, path: {path:?}"))?;
let mut wasm_bytes = Vec::new();
wasm_file
.read_to_end(&mut wasm_bytes)
.context("failed to read wasm")?;
.context(format!("reading wasm file, path: {path:?}"))?;
wasm_host
.load_extension(wasm_bytes, manifest, cx)
.await
.with_context(|| format!("failed to load wasm extension {}", manifest.id))
.with_context(|| format!("loading wasm extension: {}", manifest.id))
}
pub async fn call<T, Fn>(&self, f: Fn) -> Result<T>

View File

@@ -75,6 +75,7 @@ const SUGGESTIONS_BY_EXTENSION_ID: &[(&str, &[&str])] = &[
("vue", &["vue"]),
("wgsl", &["wgsl"]),
("wit", &["wit"]),
("xml", &["xml"]),
("zig", &["zig"]),
];

View File

@@ -3452,3 +3452,99 @@ async fn test_paths_with_starting_slash(cx: &mut TestAppContext) {
assert_eq!(active_editor.read(cx).title(cx), "file1.txt");
});
}
#[gpui::test]
async fn test_clear_navigation_history(cx: &mut TestAppContext) {
let app_state = init_test(cx);
app_state
.fs
.as_fake()
.insert_tree(
path!("/src"),
json!({
"test": {
"first.rs": "// First file",
"second.rs": "// Second file",
"third.rs": "// Third file",
}
}),
)
.await;
let project = Project::test(app_state.fs.clone(), [path!("/src").as_ref()], cx).await;
let (workspace, cx) = cx.add_window_view(|window, cx| Workspace::test_new(project, window, cx));
workspace.update_in(cx, |_workspace, window, cx| window.focused(cx));
// Open some files to generate navigation history
open_close_queried_buffer("fir", 1, "first.rs", &workspace, cx).await;
open_close_queried_buffer("sec", 1, "second.rs", &workspace, cx).await;
let history_before_clear =
open_close_queried_buffer("thi", 1, "third.rs", &workspace, cx).await;
assert_eq!(
history_before_clear.len(),
2,
"Should have history items before clearing"
);
// Verify that file finder shows history items
let picker = open_file_picker(&workspace, cx);
cx.simulate_input("fir");
picker.update(cx, |finder, _| {
let matches = collect_search_matches(finder);
assert!(
!matches.history.is_empty(),
"File finder should show history items before clearing"
);
});
workspace.update_in(cx, |_, window, cx| {
window.dispatch_action(menu::Cancel.boxed_clone(), cx);
});
// Verify navigation state before clear
workspace.update(cx, |workspace, cx| {
let pane = workspace.active_pane();
pane.read(cx).can_navigate_backward()
});
// Clear navigation history
cx.dispatch_action(workspace::ClearNavigationHistory);
// Verify that navigation is disabled immediately after clear
workspace.update(cx, |workspace, cx| {
let pane = workspace.active_pane();
assert!(
!pane.read(cx).can_navigate_backward(),
"Should not be able to navigate backward after clearing history"
);
assert!(
!pane.read(cx).can_navigate_forward(),
"Should not be able to navigate forward after clearing history"
);
});
// Verify that file finder no longer shows history items
let picker = open_file_picker(&workspace, cx);
cx.simulate_input("fir");
picker.update(cx, |finder, _| {
let matches = collect_search_matches(finder);
assert!(
matches.history.is_empty(),
"File finder should not show history items after clearing"
);
});
workspace.update_in(cx, |_, window, cx| {
window.dispatch_action(menu::Cancel.boxed_clone(), cx);
});
// Verify history is empty by opening a new file
// (this should not show any previous history)
let history_after_clear =
open_close_queried_buffer("sec", 1, "second.rs", &workspace, cx).await;
assert_eq!(
history_after_clear.len(),
0,
"Should have no history items after clearing"
);
}


@@ -399,7 +399,12 @@ impl PickerDelegate for OpenPathDelegate {
}
})
.unwrap_or(false);
if should_prepend_with_current_dir {
let current_dir_in_new_entries = new_entries
.iter()
.any(|entry| &entry.path.string == current_dir);
if should_prepend_with_current_dir && !current_dir_in_new_entries {
new_entries.insert(
0,
CandidateInfo {


@@ -272,7 +272,7 @@ impl GitRepository for FakeGitRepository {
.ok()
.map(|content| String::from_utf8(content).unwrap())?;
let repo_path = RelPath::new(repo_path, PathStyle::local()).ok()?;
Some((repo_path.into(), (content, is_ignored)))
Some((RepoPath::from_rel_path(&repo_path), (content, is_ignored)))
})
.collect();
@@ -436,7 +436,7 @@ impl GitRepository for FakeGitRepository {
state
.blames
.get(&path)
.with_context(|| format!("failed to get blame for {:?}", path.0))
.with_context(|| format!("failed to get blame for {:?}", path))
.cloned()
})
}


@@ -4,6 +4,10 @@ mod mac_watcher;
#[cfg(not(target_os = "macos"))]
pub mod fs_watcher;
use parking_lot::Mutex;
use std::sync::atomic::{AtomicUsize, Ordering};
use std::time::Instant;
use anyhow::{Context as _, Result, anyhow};
#[cfg(any(target_os = "linux", target_os = "freebsd"))]
use ashpd::desktop::trash;
@@ -12,6 +16,7 @@ use gpui::App;
use gpui::BackgroundExecutor;
use gpui::Global;
use gpui::ReadGlobal as _;
use gpui::SharedString;
use std::borrow::Cow;
use util::command::new_smol_command;
@@ -51,8 +56,7 @@ use git::{
repository::{RepoPath, repo_path},
status::{FileStatus, StatusCode, TrackedStatus, UnmergedStatus},
};
#[cfg(any(test, feature = "test-support"))]
use parking_lot::Mutex;
#[cfg(any(test, feature = "test-support"))]
use smol::io::AsyncReadExt;
#[cfg(any(test, feature = "test-support"))]
@@ -148,6 +152,7 @@ pub trait Fs: Send + Sync {
async fn git_clone(&self, repo_url: &str, abs_work_directory: &Path) -> Result<()>;
fn is_fake(&self) -> bool;
async fn is_case_sensitive(&self) -> Result<bool>;
fn subscribe_to_jobs(&self) -> JobEventReceiver;
#[cfg(any(test, feature = "test-support"))]
fn as_fake(&self) -> Arc<FakeFs> {
@@ -215,6 +220,55 @@ pub struct Metadata {
#[serde(transparent)]
pub struct MTime(SystemTime);
pub type JobId = usize;
#[derive(Clone, Debug)]
pub struct JobInfo {
pub start: Instant,
pub message: SharedString,
pub id: JobId,
}
#[derive(Debug, Clone)]
pub enum JobEvent {
Started { info: JobInfo },
Completed { id: JobId },
}
pub type JobEventSender = futures::channel::mpsc::UnboundedSender<JobEvent>;
pub type JobEventReceiver = futures::channel::mpsc::UnboundedReceiver<JobEvent>;
struct JobTracker {
id: JobId,
subscribers: Arc<Mutex<Vec<JobEventSender>>>,
}
impl JobTracker {
fn new(info: JobInfo, subscribers: Arc<Mutex<Vec<JobEventSender>>>) -> Self {
let id = info.id;
{
let mut subs = subscribers.lock();
subs.retain(|sender| {
sender
.unbounded_send(JobEvent::Started { info: info.clone() })
.is_ok()
});
}
Self { id, subscribers }
}
}
impl Drop for JobTracker {
fn drop(&mut self) {
let mut subs = self.subscribers.lock();
subs.retain(|sender| {
sender
.unbounded_send(JobEvent::Completed { id: self.id })
.is_ok()
});
}
}
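The `JobTracker` above leans on RAII: `new` broadcasts `Started`, and `Drop` guarantees a matching `Completed` on every exit path. A minimal standalone sketch of the same pattern, using `std::sync::mpsc` in place of the futures unbounded channel (names simplified, not Zed's actual types):

```rust
use std::sync::mpsc::{channel, Sender};
use std::sync::{Arc, Mutex};

#[derive(Debug, PartialEq)]
enum JobEvent {
    Started { id: usize },
    Completed { id: usize },
}

struct JobTracker {
    id: usize,
    subscribers: Arc<Mutex<Vec<Sender<JobEvent>>>>,
}

impl JobTracker {
    fn new(id: usize, subscribers: Arc<Mutex<Vec<Sender<JobEvent>>>>) -> Self {
        // Broadcast Started, pruning subscribers whose receiver is gone.
        subscribers
            .lock()
            .unwrap()
            .retain(|tx| tx.send(JobEvent::Started { id }).is_ok());
        Self { id, subscribers }
    }
}

impl Drop for JobTracker {
    fn drop(&mut self) {
        // Runs on every exit path (early return, panic unwind, normal end),
        // so a Completed event can never be forgotten.
        let id = self.id;
        self.subscribers
            .lock()
            .unwrap()
            .retain(|tx| tx.send(JobEvent::Completed { id }).is_ok());
    }
}

fn main() {
    let subscribers = Arc::new(Mutex::new(Vec::new()));
    let (tx, rx) = channel();
    subscribers.lock().unwrap().push(tx);
    {
        let _job = JobTracker::new(7, subscribers.clone());
        // ... the tracked work (e.g. a `git clone`) would run here ...
    }
    assert_eq!(rx.recv().unwrap(), JobEvent::Started { id: 7 });
    assert_eq!(rx.recv().unwrap(), JobEvent::Completed { id: 7 });
}
```

Because the guard is bound to the scope of the work, callers never pair start/finish notifications by hand.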
impl MTime {
/// Conversion intended for persistence and testing.
pub fn from_seconds_and_nanos(secs: u64, nanos: u32) -> Self {
@@ -257,6 +311,8 @@ impl From<MTime> for proto::Timestamp {
pub struct RealFs {
bundled_git_binary_path: Option<PathBuf>,
executor: BackgroundExecutor,
next_job_id: Arc<AtomicUsize>,
job_event_subscribers: Arc<Mutex<Vec<JobEventSender>>>,
}
pub trait FileHandle: Send + Sync + std::fmt::Debug {
@@ -361,6 +417,8 @@ impl RealFs {
Self {
bundled_git_binary_path: git_binary_path,
executor,
next_job_id: Arc::new(AtomicUsize::new(0)),
job_event_subscribers: Arc::new(Mutex::new(Vec::new())),
}
}
}
@@ -719,9 +777,8 @@ impl Fs for RealFs {
{
Ok(metadata) => metadata,
Err(err) => {
return match (err.kind(), err.raw_os_error()) {
(io::ErrorKind::NotFound, _) => Ok(None),
(io::ErrorKind::Other, Some(libc::ENOTDIR)) => Ok(None),
return match err.kind() {
io::ErrorKind::NotFound | io::ErrorKind::NotADirectory => Ok(None),
_ => Err(anyhow::Error::new(err)),
};
}
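The rewritten match above folds the raw `ENOTDIR` check into the now-stable (Rust 1.83+) `io::ErrorKind::NotADirectory`. A small sketch of that error-classification idea (the `metadata_opt` helper is hypothetical):

```rust
use std::io;
use std::path::Path;

/// Classify metadata errors: both "not found" and "a path component is
/// not a directory" mean the entry is absent, with no need to match the
/// raw ENOTDIR errno.
fn metadata_opt(path: &Path) -> io::Result<Option<std::fs::Metadata>> {
    match std::fs::metadata(path) {
        Ok(m) => Ok(Some(m)),
        Err(e) => match e.kind() {
            io::ErrorKind::NotFound | io::ErrorKind::NotADirectory => Ok(None),
            _ => Err(e),
        },
    }
}

fn main() -> io::Result<()> {
    // A missing path yields Ok(None) rather than an error.
    assert!(metadata_opt(Path::new("definitely/missing/path"))?.is_none());
    Ok(())
}
```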
@@ -863,7 +920,6 @@ impl Fs for RealFs {
Pin<Box<dyn Send + Stream<Item = Vec<PathEvent>>>>,
Arc<dyn Watcher>,
) {
use parking_lot::Mutex;
use util::{ResultExt as _, paths::SanitizedPath};
let (tx, rx) = smol::channel::unbounded();
@@ -960,6 +1016,15 @@ impl Fs for RealFs {
}
async fn git_clone(&self, repo_url: &str, abs_work_directory: &Path) -> Result<()> {
let job_id = self.next_job_id.fetch_add(1, Ordering::SeqCst);
let job_info = JobInfo {
id: job_id,
start: Instant::now(),
message: SharedString::from(format!("Cloning {}", repo_url)),
};
let _job_tracker = JobTracker::new(job_info, self.job_event_subscribers.clone());
let output = new_smol_command("git")
.current_dir(abs_work_directory)
.args(&["clone", repo_url])
@@ -980,6 +1045,12 @@ impl Fs for RealFs {
false
}
fn subscribe_to_jobs(&self) -> JobEventReceiver {
let (sender, receiver) = futures::channel::mpsc::unbounded();
self.job_event_subscribers.lock().push(sender);
receiver
}
/// Checks whether the file system is case sensitive by attempting to create two files
/// that have the same name except for the casing.
///
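As a rough illustration of the probe this doc comment describes — create one file, then try to exclusively create its upper-cased twin — here is a standalone sketch (file names and cleanup strategy are assumptions, not Zed's implementation):

```rust
use std::fs;
use std::io::ErrorKind;
use std::path::Path;

/// Probe case sensitivity: create a file, then exclusively create its
/// upper-cased twin. On a case-insensitive file system the second create
/// collides with the first and fails with AlreadyExists.
fn is_case_sensitive(dir: &Path) -> std::io::Result<bool> {
    let lower = dir.join(format!("case-probe-{}", std::process::id()));
    let upper = dir.join(format!("CASE-PROBE-{}", std::process::id()));
    fs::File::create(&lower)?;
    let sensitive = match fs::OpenOptions::new()
        .write(true)
        .create_new(true) // fail instead of opening an existing file
        .open(&upper)
    {
        Ok(_) => true,
        Err(e) if e.kind() == ErrorKind::AlreadyExists => false,
        Err(e) => return Err(e),
    };
    // Best-effort cleanup; on a case-insensitive FS the second remove fails.
    let _ = fs::remove_file(&lower);
    let _ = fs::remove_file(&upper);
    Ok(sensitive)
}

fn main() -> std::io::Result<()> {
    println!("case sensitive: {}", is_case_sensitive(&std::env::temp_dir())?);
    Ok(())
}
```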
@@ -1050,6 +1121,7 @@ struct FakeFsState {
read_dir_call_count: usize,
path_write_counts: std::collections::HashMap<PathBuf, usize>,
moves: std::collections::HashMap<u64, PathBuf>,
job_event_subscribers: Arc<Mutex<Vec<JobEventSender>>>,
}
#[cfg(any(test, feature = "test-support"))]
@@ -1334,6 +1406,7 @@ impl FakeFs {
metadata_call_count: 0,
path_write_counts: Default::default(),
moves: Default::default(),
job_event_subscribers: Arc::new(Mutex::new(Vec::new())),
})),
});
@@ -1792,7 +1865,8 @@ impl FakeFs {
for (path, content) in workdir_contents {
use util::{paths::PathStyle, rel_path::RelPath};
let repo_path: RepoPath = RelPath::new(path.strip_prefix(&workdir_path).unwrap(), PathStyle::local()).unwrap().into();
let repo_path = RelPath::new(path.strip_prefix(&workdir_path).unwrap(), PathStyle::local()).unwrap();
let repo_path = RepoPath::from_rel_path(&repo_path);
let status = statuses
.iter()
.find_map(|(p, status)| (*p == repo_path.as_unix_str()).then_some(status));
@@ -2587,6 +2661,12 @@ impl Fs for FakeFs {
Ok(true)
}
fn subscribe_to_jobs(&self) -> JobEventReceiver {
let (sender, receiver) = futures::channel::mpsc::unbounded();
self.state.lock().job_event_subscribers.lock().push(sender);
receiver
}
#[cfg(any(test, feature = "test-support"))]
fn as_fake(&self) -> Arc<FakeFs> {
self.this.upgrade().unwrap()
@@ -3201,6 +3281,8 @@ mod tests {
let fs = RealFs {
bundled_git_binary_path: None,
executor,
next_job_id: Arc::new(AtomicUsize::new(0)),
job_event_subscribers: Arc::new(Mutex::new(Vec::new())),
};
let temp_dir = TempDir::new().unwrap();
let file_to_be_replaced = temp_dir.path().join("file.txt");
@@ -3219,6 +3301,8 @@ mod tests {
let fs = RealFs {
bundled_git_binary_path: None,
executor,
next_job_id: Arc::new(AtomicUsize::new(0)),
job_event_subscribers: Arc::new(Mutex::new(Vec::new())),
};
let temp_dir = TempDir::new().unwrap();
let file_to_be_replaced = temp_dir.path().join("file.txt");


@@ -14,7 +14,6 @@ use rope::Rope;
use schemars::JsonSchema;
use serde::Deserialize;
use smol::io::{AsyncBufReadExt, AsyncReadExt, BufReader};
use std::borrow::Cow;
use std::ffi::{OsStr, OsString};
use std::process::{ExitStatus, Stdio};
use std::{
@@ -848,7 +847,7 @@ impl GitRepository for RealGitRepository {
}
files.push(CommitFile {
path: rel_path.into(),
path: RepoPath(Arc::from(rel_path)),
old_text,
new_text,
})
@@ -2049,6 +2048,11 @@ fn git_status_args(path_prefixes: &[RepoPath]) -> Vec<OsString> {
OsString::from("--no-renames"),
OsString::from("-z"),
];
args.extend(
path_prefixes
.iter()
.map(|path_prefix| path_prefix.as_std_path().into()),
);
args.extend(path_prefixes.iter().map(|path_prefix| {
if path_prefix.is_empty() {
Path::new(".").into()
@@ -2304,23 +2308,43 @@ async fn run_askpass_command(
}
}
#[derive(Clone, Debug, Ord, Hash, PartialOrd, Eq, PartialEq)]
pub struct RepoPath(pub Arc<RelPath>);
#[derive(Clone, Ord, Hash, PartialOrd, Eq, PartialEq)]
pub struct RepoPath(Arc<RelPath>);
impl std::fmt::Debug for RepoPath {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
self.0.fmt(f)
}
}
impl RepoPath {
pub fn new<S: AsRef<str> + ?Sized>(s: &S) -> Result<Self> {
let rel_path = RelPath::unix(s.as_ref())?;
Ok(rel_path.into())
}
pub fn from_proto(proto: &str) -> Result<Self> {
let rel_path = RelPath::from_proto(proto)?;
Ok(rel_path.into())
Ok(Self::from_rel_path(rel_path))
}
pub fn from_std_path(path: &Path, path_style: PathStyle) -> Result<Self> {
let rel_path = RelPath::new(path, path_style)?;
Ok(Self(rel_path.as_ref().into()))
Ok(Self::from_rel_path(&rel_path))
}
pub fn from_proto(proto: &str) -> Result<Self> {
let rel_path = RelPath::from_proto(proto)?;
Ok(Self(rel_path))
}
pub fn from_rel_path(path: &RelPath) -> RepoPath {
Self(Arc::from(path))
}
pub fn as_std_path(&self) -> &Path {
// git2 does not like empty paths, and our RelPath infra turns `.` into ``,
// so undo that here.
if self.is_empty() {
Path::new(".")
} else {
self.0.as_std_path()
}
}
}
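The `.`-to-empty normalization that `as_std_path` undoes can be shown with a much-simplified stand-in for `RepoPath` (a plain `String` here rather than `Arc<RelPath>`):

```rust
use std::path::Path;

/// Simplified stand-in for RepoPath: a repo-relative path where the empty
/// string denotes the repository root (the `.` -> `` normalization).
#[derive(Clone, Debug, PartialEq, Eq)]
struct RepoPath(String);

impl RepoPath {
    fn new(s: &str) -> Self {
        RepoPath(if s == "." { String::new() } else { s.to_string() })
    }

    /// git2 rejects empty paths, so map the root back to ".".
    fn as_std_path(&self) -> &Path {
        if self.0.is_empty() {
            Path::new(".")
        } else {
            Path::new(&self.0)
        }
    }
}

fn main() {
    // The root round-trips through "" back to ".".
    assert_eq!(RepoPath::new(".").as_std_path(), Path::new("."));
    assert_eq!(RepoPath::new("src/lib.rs").as_std_path(), Path::new("src/lib.rs"));
}
```

Keeping the field private (as the commit does) forces all construction through the normalizing constructors.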
@@ -2329,27 +2353,9 @@ pub fn repo_path<S: AsRef<str> + ?Sized>(s: &S) -> RepoPath {
RepoPath(RelPath::unix(s.as_ref()).unwrap().into())
}
impl From<&RelPath> for RepoPath {
fn from(value: &RelPath) -> Self {
RepoPath(value.into())
}
}
impl<'a> From<Cow<'a, RelPath>> for RepoPath {
fn from(value: Cow<'a, RelPath>) -> Self {
value.as_ref().into()
}
}
impl From<Arc<RelPath>> for RepoPath {
fn from(value: Arc<RelPath>) -> Self {
RepoPath(value)
}
}
impl Default for RepoPath {
fn default() -> Self {
RepoPath(RelPath::empty().into())
impl AsRef<Arc<RelPath>> for RepoPath {
fn as_ref(&self) -> &Arc<RelPath> {
&self.0
}
}
@@ -2361,12 +2367,6 @@ impl std::ops::Deref for RepoPath {
}
}
// impl AsRef<Path> for RepoPath {
// fn as_ref(&self) -> &Path {
// RelPath::as_ref(&self.0)
// }
// }
#[derive(Debug)]
pub struct RepoPathDescendants<'a>(pub &'a RepoPath);


@@ -454,7 +454,7 @@ impl FromStr for GitStatus {
let status = entry.as_bytes()[0..2].try_into().unwrap();
let status = FileStatus::from_bytes(status).log_err()?;
// git-status outputs `/`-delimited repo paths, even on Windows.
let path = RepoPath(RelPath::unix(path).log_err()?.into());
let path = RepoPath::from_rel_path(RelPath::unix(path).log_err()?);
Some((path, status))
})
.collect::<Vec<_>>();
@@ -539,7 +539,7 @@ impl FromStr for TreeDiff {
let mut fields = s.split('\0');
let mut parsed = HashMap::default();
while let Some((status, path)) = fields.next().zip(fields.next()) {
let path = RepoPath(RelPath::unix(path)?.into());
let path = RepoPath::from_rel_path(RelPath::unix(path)?);
let mut fields = status.split(" ").skip(2);
let old_sha = fields


@@ -266,7 +266,7 @@ impl language::File for GitBlob {
}
fn path(&self) -> &Arc<RelPath> {
&self.path.0
self.path.as_ref()
}
fn full_path(&self, _: &App) -> PathBuf {


@@ -879,7 +879,7 @@ impl GitPanel {
let active_repository = self.active_repository.as_ref()?.downgrade();
cx.spawn(async move |_, cx| {
let file_path_str = repo_path.0.display(PathStyle::Posix);
let file_path_str = repo_path.as_ref().display(PathStyle::Posix);
let repo_root = active_repository.read_with(cx, |repository, _| {
repository.snapshot().work_directory_abs_path
@@ -1074,7 +1074,7 @@ impl GitPanel {
}
let mut details = entries
.iter()
.filter_map(|entry| entry.repo_path.0.file_name())
.filter_map(|entry| entry.repo_path.as_ref().file_name())
.map(|filename| filename.to_string())
.take(5)
.join("\n");
@@ -1129,7 +1129,7 @@ impl GitPanel {
.map(|entry| {
entry
.repo_path
.0
.as_ref()
.file_name()
.map(|f| f.to_string())
.unwrap_or_default()
@@ -3980,9 +3980,9 @@ impl GitPanel {
.map(|ops| ops.staging() || ops.staged())
.or_else(|| {
repo.status_for_path(&entry.repo_path)
.map(|status| status.status.staging().has_staged())
.and_then(|status| status.status.staging().as_bool())
})
.unwrap_or(entry.staging.has_staged());
.or_else(|| entry.staging.as_bool());
let mut is_staged: ToggleState = is_staging_or_staged.into();
if self.show_placeholders && !self.has_staged_changes() && !entry.status.is_created() {
is_staged = ToggleState::Selected;
@@ -4102,7 +4102,9 @@ impl GitPanel {
}
})
.tooltip(move |_window, cx| {
let action = if is_staging_or_staged {
// If is_staging_or_staged is None, this implies the file was partially staged, and so
// we allow the user to stage it in full by displaying `Stage` in the tooltip.
let action = if is_staging_or_staged.unwrap_or(false) {
"Unstage"
} else {
"Stage"
@@ -5647,7 +5649,7 @@ mod tests {
assert_eq!(
entry.status_entry().map(|status| status
.repo_path
.0
.as_ref()
.as_std_path()
.to_string_lossy()
.to_string()),
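The switch from `has_staged()` to `as_bool()` above turns staging into a tri-state: `Some(true)` fully staged, `Some(false)` unstaged, `None` partially staged. A sketch of the tooltip decision under that encoding (`tooltip_action` is a hypothetical helper, not Zed's API):

```rust
/// Tri-state staging flag: Some(true) = fully staged, Some(false) = not
/// staged, None = partially staged.
fn tooltip_action(is_staging_or_staged: Option<bool>) -> &'static str {
    // A partially staged file (None) can still be staged in full,
    // so it falls through to "Stage".
    if is_staging_or_staged.unwrap_or(false) {
        "Unstage"
    } else {
        "Stage"
    }
}

fn main() {
    assert_eq!(tooltip_action(Some(true)), "Unstage");
    assert_eq!(tooltip_action(Some(false)), "Stage");
    assert_eq!(tooltip_action(None), "Stage"); // partially staged
}
```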


@@ -37,7 +37,7 @@ use std::ops::Range;
use std::sync::Arc;
use theme::ActiveTheme;
use ui::{KeyBinding, Tooltip, prelude::*, vertical_divider};
use util::rel_path::RelPath;
use util::{ResultExt as _, rel_path::RelPath};
use workspace::{
CloseActiveItem, ItemNavHistory, SerializableItem, ToolbarItemEvent, ToolbarItemLocation,
ToolbarItemView, Workspace,
@@ -336,7 +336,7 @@ impl ProjectDiff {
};
let repo = git_repo.read(cx);
let sort_prefix = sort_prefix(repo, &entry.repo_path, entry.status, cx);
let path_key = PathKey::with_sort_prefix(sort_prefix, entry.repo_path.0);
let path_key = PathKey::with_sort_prefix(sort_prefix, entry.repo_path.as_ref().clone());
self.move_to_path(path_key, window, cx)
}
@@ -566,7 +566,7 @@ impl ProjectDiff {
for entry in buffers_to_load.iter() {
let sort_prefix = sort_prefix(&repo, &entry.repo_path, entry.file_status, cx);
let path_key =
PathKey::with_sort_prefix(sort_prefix, entry.repo_path.0.clone());
PathKey::with_sort_prefix(sort_prefix, entry.repo_path.as_ref().clone());
previous_paths.remove(&path_key);
path_keys.push(path_key)
}
@@ -582,17 +582,14 @@ impl ProjectDiff {
})?;
for (entry, path_key) in buffers_to_load.into_iter().zip(path_keys.into_iter()) {
let this = this.clone();
cx.spawn(async move |cx| {
let (buffer, diff) = entry.load.await?;
if let Some((buffer, diff)) = entry.load.await.log_err() {
cx.update(|window, cx| {
this.update(cx, |this, cx| {
this.register_buffer(path_key, entry.file_status, buffer, diff, window, cx);
cx.notify();
this.register_buffer(path_key, entry.file_status, buffer, diff, window, cx)
})
})
})
.detach();
.ok();
})?;
}
}
this.update(cx, |this, cx| {
this.pending_scroll.take();


@@ -138,6 +138,8 @@ waker-fn = "1.2.0"
lyon = "1.0"
libc.workspace = true
pin-project = "1.1.10"
circular-buffer.workspace = true
spin = "0.10.0"
[target.'cfg(target_os = "macos")'.dependencies]
block = "0.1"


@@ -178,7 +178,7 @@ impl TextInput {
if position.y > bounds.bottom() {
return self.content.len();
}
line.index_for_x(position.x - bounds.left())
line.closest_index_for_x(position.x - bounds.left())
}
fn select_to(&mut self, offset: usize, cx: &mut Context<Self>) {
@@ -380,7 +380,7 @@ impl EntityInputHandler for TextInput {
let last_layout = self.last_layout.as_ref()?;
assert_eq!(last_layout.text, self.content);
let utf8_index = last_layout.index_for_x(point.x - line_point.x);
let utf8_index = last_layout.index_for_x(point.x - line_point.x)?;
Some(self.offset_to_utf16(utf8_index))
}
}


@@ -169,6 +169,13 @@ impl Application {
self
}
/// Configures when the application should automatically quit.
/// By default, [`QuitMode::Default`] is used.
pub fn with_quit_mode(self, mode: QuitMode) -> Self {
self.0.borrow_mut().quit_mode = mode;
self
}
/// Start the application. The provided callback will be called once the
/// app is fully launched.
pub fn run<F>(self, on_finish_launching: F)
@@ -238,6 +245,18 @@ type WindowClosedHandler = Box<dyn FnMut(&mut App)>;
type ReleaseListener = Box<dyn FnOnce(&mut dyn Any, &mut App) + 'static>;
type NewEntityListener = Box<dyn FnMut(AnyEntity, &mut Option<&mut Window>, &mut App) + 'static>;
/// Defines when the application should automatically quit.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
pub enum QuitMode {
/// Use [`QuitMode::Explicit`] on macOS and [`QuitMode::LastWindowClosed`] on other platforms.
#[default]
Default,
/// Quit automatically when the last window is closed.
LastWindowClosed,
/// Quit only when requested via [`App::quit`].
Explicit,
}
#[doc(hidden)]
#[derive(Clone, PartialEq, Eq)]
pub struct SystemWindowTab {
@@ -588,6 +607,7 @@ pub struct App {
pub(crate) inspector_element_registry: InspectorElementRegistry,
#[cfg(any(test, feature = "test-support", debug_assertions))]
pub(crate) name: Option<&'static str>,
quit_mode: QuitMode,
quitting: bool,
}
@@ -659,6 +679,7 @@ impl App {
inspector_renderer: None,
#[cfg(any(feature = "inspector", debug_assertions))]
inspector_element_registry: InspectorElementRegistry::default(),
quit_mode: QuitMode::default(),
quitting: false,
#[cfg(any(test, feature = "test-support", debug_assertions))]
@@ -1172,6 +1193,12 @@ impl App {
self.http_client = new_client;
}
/// Configures when the application should automatically quit.
/// By default, [`QuitMode::Default`] is used.
pub fn set_quit_mode(&mut self, mode: QuitMode) {
self.quit_mode = mode;
}
/// Returns the SVG renderer used by the application.
pub fn svg_renderer(&self) -> SvgRenderer {
self.svg_renderer.clone()
@@ -1379,6 +1406,16 @@ impl App {
callback(cx);
true
});
let quit_on_empty = match cx.quit_mode {
QuitMode::Explicit => false,
QuitMode::LastWindowClosed => true,
QuitMode::Default => !cfg!(target_os = "macos"),
};
if quit_on_empty && cx.windows.is_empty() {
cx.quit();
}
} else {
cx.windows.get_mut(id)?.replace(window);
}
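The `quit_on_empty` computation above is a small decision table over `QuitMode`; a self-contained sketch of the same logic:

```rust
#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
enum QuitMode {
    /// Explicit on macOS, LastWindowClosed on other platforms.
    #[default]
    Default,
    LastWindowClosed,
    Explicit,
}

/// Should closing the last window quit the application?
fn quit_on_empty(mode: QuitMode) -> bool {
    match mode {
        QuitMode::Explicit => false,
        QuitMode::LastWindowClosed => true,
        // macOS apps conventionally keep running with no windows open.
        QuitMode::Default => !cfg!(target_os = "macos"),
    }
}

fn main() {
    assert!(!quit_on_empty(QuitMode::Explicit));
    assert!(quit_on_empty(QuitMode::LastWindowClosed));
    println!("default quits on empty: {}", quit_on_empty(QuitMode::Default));
}
```

`cfg!` resolves at compile time, so `Default` costs nothing at runtime on either platform.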


@@ -10,7 +10,9 @@ use crate::{
use anyhow::{anyhow, bail};
use futures::{Stream, StreamExt, channel::oneshot};
use rand::{SeedableRng, rngs::StdRng};
use std::{cell::RefCell, future::Future, ops::Deref, rc::Rc, sync::Arc, time::Duration};
use std::{
cell::RefCell, future::Future, ops::Deref, path::PathBuf, rc::Rc, sync::Arc, time::Duration,
};
/// A TestAppContext is provided to tests created with `#[gpui::test]`, it provides
/// an implementation of `Context` with additional methods that are useful in tests.
@@ -331,6 +333,13 @@ impl TestAppContext {
self.test_window(window_handle).simulate_resize(size);
}
/// Expects the application to restart, returning a receiver that resolves when it does.
pub fn expect_restart(&self) -> oneshot::Receiver<Option<PathBuf>> {
let (tx, rx) = futures::channel::oneshot::channel();
self.test_platform.expect_restart.borrow_mut().replace(tx);
rx
}
/// Causes the given sources to be returned if the application queries for screen
/// capture sources.
pub fn set_screen_capture_sources(&self, sources: Vec<TestScreenCaptureSource>) {


@@ -1,4 +1,4 @@
use crate::{App, PlatformDispatcher};
use crate::{App, PlatformDispatcher, RunnableMeta, RunnableVariant};
use async_task::Runnable;
use futures::channel::mpsc;
use smol::prelude::*;
@@ -62,7 +62,7 @@ enum TaskState<T> {
Ready(Option<T>),
/// A task that is currently running.
Spawned(async_task::Task<T>),
Spawned(async_task::Task<T, RunnableMeta>),
}
impl<T> Task<T> {
@@ -146,6 +146,7 @@ impl BackgroundExecutor {
}
/// Enqueues the given future to be run to completion on a background thread.
#[track_caller]
pub fn spawn<R>(&self, future: impl Future<Output = R> + Send + 'static) -> Task<R>
where
R: Send + 'static,
@@ -155,6 +156,7 @@ impl BackgroundExecutor {
/// Enqueues the given future to be run to completion on a background thread.
/// The given label can be used to control the priority of the task in tests.
#[track_caller]
pub fn spawn_labeled<R>(
&self,
label: TaskLabel,
@@ -166,14 +168,20 @@ impl BackgroundExecutor {
self.spawn_internal::<R>(Box::pin(future), Some(label))
}
#[track_caller]
fn spawn_internal<R: Send + 'static>(
&self,
future: AnyFuture<R>,
label: Option<TaskLabel>,
) -> Task<R> {
let dispatcher = self.dispatcher.clone();
let (runnable, task) =
async_task::spawn(future, move |runnable| dispatcher.dispatch(runnable, label));
let location = core::panic::Location::caller();
let (runnable, task) = async_task::Builder::new()
.metadata(RunnableMeta { location })
.spawn(
move |_| future,
move |runnable| dispatcher.dispatch(RunnableVariant::Meta(runnable), label),
);
runnable.schedule();
Task(TaskState::Spawned(task))
}
@@ -281,7 +289,11 @@ impl BackgroundExecutor {
});
let mut cx = std::task::Context::from_waker(&waker);
let duration = Duration::from_secs(180);
let duration = Duration::from_secs(
option_env!("GPUI_TEST_TIMEOUT")
.and_then(|s| s.parse::<u64>().ok())
.unwrap_or(180),
);
let mut test_should_end_by = Instant::now() + duration;
loop {
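Note that `option_env!` reads the variable at compile time, so `GPUI_TEST_TIMEOUT` must be set when the test binary is built, not when it runs. A sketch of the override pattern:

```rust
use std::time::Duration;

/// Test timeout with a compile-time override: GPUI_TEST_TIMEOUT (seconds)
/// is read by option_env! when the binary is built, falling back to 180
/// when unset or unparsable.
fn test_timeout() -> Duration {
    Duration::from_secs(
        option_env!("GPUI_TEST_TIMEOUT")
            .and_then(|s| s.parse::<u64>().ok())
            .unwrap_or(180),
    )
}

fn main() {
    println!("test timeout: {:?}", test_timeout());
    assert!(test_timeout().as_secs() >= 1);
}
```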
@@ -370,10 +382,13 @@ impl BackgroundExecutor {
if duration.is_zero() {
return Task::ready(());
}
let (runnable, task) = async_task::spawn(async move {}, {
let dispatcher = self.dispatcher.clone();
move |runnable| dispatcher.dispatch_after(duration, runnable)
});
let location = core::panic::Location::caller();
let (runnable, task) = async_task::Builder::new()
.metadata(RunnableMeta { location })
.spawn(move |_| async move {}, {
let dispatcher = self.dispatcher.clone();
move |runnable| dispatcher.dispatch_after(duration, RunnableVariant::Meta(runnable))
});
runnable.schedule();
Task(TaskState::Spawned(task))
}
@@ -479,24 +494,29 @@ impl ForegroundExecutor {
}
/// Enqueues the given Task to run on the main thread at some point in the future.
#[track_caller]
pub fn spawn<R>(&self, future: impl Future<Output = R> + 'static) -> Task<R>
where
R: 'static,
{
let dispatcher = self.dispatcher.clone();
let location = core::panic::Location::caller();
#[track_caller]
fn inner<R: 'static>(
dispatcher: Arc<dyn PlatformDispatcher>,
future: AnyLocalFuture<R>,
location: &'static core::panic::Location<'static>,
) -> Task<R> {
let (runnable, task) = spawn_local_with_source_location(future, move |runnable| {
dispatcher.dispatch_on_main_thread(runnable)
});
let (runnable, task) = spawn_local_with_source_location(
future,
move |runnable| dispatcher.dispatch_on_main_thread(RunnableVariant::Meta(runnable)),
RunnableMeta { location },
);
runnable.schedule();
Task(TaskState::Spawned(task))
}
inner::<R>(dispatcher, Box::pin(future))
inner::<R>(dispatcher, Box::pin(future), location)
}
}
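The `RunnableMeta` plumbing works because `#[track_caller]` makes `Location::caller()` report the spawn site rather than the executor internals. A minimal sketch of that mechanism (types simplified):

```rust
use std::panic::Location;

/// Simplified stand-in for RunnableMeta: just the spawn site.
#[derive(Debug)]
struct TaskMeta {
    location: &'static Location<'static>,
}

#[track_caller]
fn spawn_stub() -> TaskMeta {
    // Thanks to #[track_caller], this reports the *caller's* file and
    // line, not this function body -- the same trick lets a profiler
    // attribute a task to the code that spawned it.
    TaskMeta {
        location: Location::caller(),
    }
}

fn main() {
    let meta = spawn_stub();
    println!("spawned at {}:{}", meta.location.file(), meta.location.line());
    assert!(meta.location.line() > 0);
}
```

Each `spawn` wrapper in the chain must itself be `#[track_caller]`, which is why the attribute is added to `spawn`, `spawn_labeled`, and `spawn_internal` alike.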
@@ -505,14 +525,16 @@ impl ForegroundExecutor {
/// Copy-modified from:
/// <https://github.com/smol-rs/async-task/blob/ca9dbe1db9c422fd765847fa91306e30a6bb58a9/src/runnable.rs#L405>
#[track_caller]
fn spawn_local_with_source_location<Fut, S>(
fn spawn_local_with_source_location<Fut, S, M>(
future: Fut,
schedule: S,
) -> (Runnable<()>, async_task::Task<Fut::Output, ()>)
metadata: M,
) -> (Runnable<M>, async_task::Task<Fut::Output, M>)
where
Fut: Future + 'static,
Fut::Output: 'static,
S: async_task::Schedule<()> + Send + Sync + 'static,
S: async_task::Schedule<M> + Send + Sync + 'static,
M: 'static,
{
#[inline]
fn thread_id() -> ThreadId {
@@ -560,7 +582,11 @@ where
location: Location::caller(),
};
unsafe { async_task::spawn_unchecked(future, schedule) }
unsafe {
async_task::Builder::new()
.metadata(metadata)
.spawn_unchecked(move |_| future, schedule)
}
}
/// Scope manages a set of tasks that are enqueued and waited on together. See [`BackgroundExecutor::scoped`].
@@ -590,6 +616,7 @@ impl<'a> Scope<'a> {
}
/// Spawn a future into this scope.
#[track_caller]
pub fn spawn<F>(&mut self, f: F)
where
F: Future<Output = ()> + Send + 'a,

View File

@@ -30,6 +30,7 @@ mod keymap;
mod path_builder;
mod platform;
pub mod prelude;
mod profiler;
mod scene;
mod shared_string;
mod shared_uri;
@@ -87,6 +88,7 @@ use key_dispatch::*;
pub use keymap::*;
pub use path_builder::*;
pub use platform::*;
pub use profiler::*;
pub use refineable::*;
pub use scene::*;
pub use shared_string::*;


@@ -40,8 +40,8 @@ use crate::{
DEFAULT_WINDOW_SIZE, DevicePixels, DispatchEventResult, Font, FontId, FontMetrics, FontRun,
ForegroundExecutor, GlyphId, GpuSpecs, ImageSource, Keymap, LineLayout, Pixels, PlatformInput,
Point, RenderGlyphParams, RenderImage, RenderImageParams, RenderSvgParams, Scene, ShapedGlyph,
ShapedRun, SharedString, Size, SvgRenderer, SystemWindowTab, Task, TaskLabel, Window,
WindowControlArea, hash, point, px, size,
ShapedRun, SharedString, Size, SvgRenderer, SystemWindowTab, Task, TaskLabel, TaskTiming,
ThreadTaskTimings, Window, WindowControlArea, hash, point, px, size,
};
use anyhow::Result;
use async_task::Runnable;
@@ -559,14 +559,32 @@ pub(crate) trait PlatformWindow: HasWindowHandle + HasDisplayHandle {
}
}
/// This type is public so that our test macro can generate and use it, but it should not
/// be considered part of our public API.
#[doc(hidden)]
#[derive(Debug)]
pub struct RunnableMeta {
/// Location of the runnable
pub location: &'static core::panic::Location<'static>,
}
#[doc(hidden)]
pub enum RunnableVariant {
Meta(Runnable<RunnableMeta>),
Compat(Runnable),
}
/// This type is public so that our test macro can generate and use it, but it should not
/// be considered part of our public API.
#[doc(hidden)]
pub trait PlatformDispatcher: Send + Sync {
fn get_all_timings(&self) -> Vec<ThreadTaskTimings>;
fn get_current_thread_timings(&self) -> Vec<TaskTiming>;
fn is_main_thread(&self) -> bool;
fn dispatch(&self, runnable: Runnable, label: Option<TaskLabel>);
fn dispatch_on_main_thread(&self, runnable: Runnable);
fn dispatch_after(&self, duration: Duration, runnable: Runnable);
fn dispatch(&self, runnable: RunnableVariant, label: Option<TaskLabel>);
fn dispatch_on_main_thread(&self, runnable: RunnableVariant);
fn dispatch_after(&self, duration: Duration, runnable: RunnableVariant);
fn now(&self) -> Instant {
Instant::now()
}

Some files were not shown because too many files have changed in this diff.