Compare commits


155 Commits

Author SHA1 Message Date
Alvaro Parker
f90d0789fb git: Add notification to git clone (#41712)
Adds a simple notification when cloning a repo using the integrated git
clone in Zed. Before this, the user had no feedback after starting the
cloning action.

Demo:


https://github.com/user-attachments/assets/72fcdf1b-fc99-4fe5-8db2-7c30b170f12f

Not sure about that icon I'm using for the animation, but that can be
easily changed.

Release Notes:

- Added a notification when cloning a repo from Zed
2025-11-11 01:02:13 -05:00
Conrad Irwin
9e717c7711 Use cloud for auto-update (#42246)
We've had several outages whose proximate cause was "Vercel is
complicated", and auto-update is considered a critical feature, so let's
not use Vercel for that.

Release Notes:

- Auto Updates (and remote server binaries) are now downloaded via
https://cloud.zed.dev instead of https://zed.dev. As before, these URLs
redirect to the GitHub release for actual downloads.
2025-11-10 23:00:55 -07:00
CnsMaple
823844ef18 vim: Fix increment order (#42256)
before:


https://github.com/user-attachments/assets/d490573c-4c2b-4645-a685-d683f06c611f


after:


https://github.com/user-attachments/assets/a69067a1-6e68-4f05-ba56-18eadb1c54df

Release Notes:

- Fixed Vim increment order
2025-11-10 21:48:27 -07:00
Conrad Irwin
70bcf93355 Add an event_source to events (#42125)
Release Notes:

- N/A
2025-11-10 21:32:09 -07:00
Conrad Irwin
378b30eba5 Use cloud.zed.dev for install.sh (#42399)
Similar to #42246, we'd like to avoid having Vercel on the critical
path.

https://zed.dev/install.sh is served from Cloudflare by intercepting a
route on that page, so this makes the shell-based install flow Vercel-independent.

Release Notes:

- `./script/install.sh` will now fetch assets via `https://cloud.zed.dev/`
instead of `https://zed.dev`. As before, it will redirect to GitHub
releases to complete the download.
2025-11-10 23:55:19 +00:00
Marshall Bowers
83e7c21b2c collab: Remove unused user queries (#42400)
This PR removes queries on users that were no longer being used.

Release Notes:

- N/A
2025-11-10 23:47:39 +00:00
Finn Evers
e488b6cd0b agent_ui: Fix issue where MCP extension could not be uninstalled (#42384)
Closes https://github.com/zed-industries/zed/issues/42312

The issue here was that we assumed context servers provided by
extensions would always need a config to be present in the settings,
when actually the opposite is the case: context servers provided by
extensions are the only context servers that do not need a config to be
in place in order to be available in the UI.
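
The distinction described above can be boiled down to a predicate over where a context server came from. This is a minimal sketch with illustrative names (`ContextServerSource`, `needs_config`), not Zed's actual types:

```rust
// Hypothetical sketch of the fixed logic; names are illustrative,
// not Zed's real API.
#[derive(Debug, PartialEq)]
enum ContextServerSource {
    /// Declared directly in the user's settings.
    Settings,
    /// Provided by an installed extension.
    Extension,
}

/// Extension-provided context servers are available in the UI even
/// without an entry in settings; all others require one.
fn needs_config(source: &ContextServerSource) -> bool {
    *source != ContextServerSource::Extension
}

fn main() {
    assert!(needs_config(&ContextServerSource::Settings));
    assert!(!needs_config(&ContextServerSource::Extension));
}
```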

Release Notes:

- Fixed an issue where context servers provided by extensions could not
be uninstalled if they were previously unconfigured.
2025-11-11 00:25:27 +01:00
Kirill Bulatov
f52549c1c4 Small documentation fixes (#42397)
Release Notes:

- N/A

Co-authored-by: Ole Jørgen Brønner <olejorgenb@gmail.com>
2025-11-11 01:16:28 +02:00
Conrad Irwin
359521e91d Allow passing model_name to evals (#42395)
Release Notes:

- N/A
2025-11-10 23:00:52 +00:00
Max Brunsfeld
b607077c08 Add old_text/new_text as a zeta2 prompt format (#42171)
Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <agus@zed.dev>
Co-authored-by: Oleksiy Syvokon <oleksiy.syvokon@gmail.com>
Co-authored-by: Ben Kunkle <ben@zed.dev>
Co-authored-by: Michael Sloan <mgsloan@gmail.com>
2025-11-10 15:44:54 -07:00
Marshall Bowers
e5fce424b3 Update CI badge in README (#42394)
This PR updates the CI badge in the README, after the CI workflow
reorganization.

Release Notes:

- N/A
2025-11-10 22:05:46 +00:00
Andrew Farkas
a8b04369ae Refactor completions (#42122)
This is progress toward multi-word snippets (including snippets with
prefixes containing symbols)

Release Notes:

- Removed `trigger` argument in `ShowCompletions` command

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-10 17:00:59 -05:00
Marshall Bowers
11b38db3e3 collab: Drop channel_messages table and its dependents (#42392)
This PR drops the `channel_messages` table and its
dependents—`channel_message_mentions` and `observed_channel_messages`—as
they are no longer used.

Release Notes:

- N/A
2025-11-10 21:59:05 +00:00
John Tur
112b5c16b7 Add QuitMode policy to GPUI (#42391)
Applications can select a policy for when the app quits using the new
function `Application::with_quit_mode`:
- Only on explicit calls to `App::quit`
- When the last window is closed
- Platform default (former on macOS, latter everywhere else) 
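
The three policies above can be modeled as a small enum. This is an illustrative sketch — `QuitMode` exists in GPUI, but this standalone enum and the `should_quit_on_last_window_close` helper are assumptions for demonstration, not the real implementation:

```rust
// Sketch of the quit-policy decision described above; not GPUI's code.
#[derive(Clone, Copy, PartialEq)]
enum QuitMode {
    /// Quit only on explicit calls to `App::quit`.
    Explicit,
    /// Quit as soon as the last window closes.
    LastWindowClosed,
    /// macOS keeps running; other platforms quit on last window close.
    PlatformDefault,
}

fn should_quit_on_last_window_close(mode: QuitMode, is_macos: bool) -> bool {
    match mode {
        QuitMode::Explicit => false,
        QuitMode::LastWindowClosed => true,
        QuitMode::PlatformDefault => !is_macos,
    }
}

fn main() {
    // Platform default: keep running on macOS, quit elsewhere.
    assert!(!should_quit_on_last_window_close(QuitMode::PlatformDefault, true));
    assert!(should_quit_on_last_window_close(QuitMode::PlatformDefault, false));
}
```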

Release Notes:

- N/A
2025-11-10 16:45:43 -05:00
Marshall Bowers
32ec1037e1 collab: Remove unused models left over from chat (#42390)
This PR removes some database models that were left over from the chat
feature.

Release Notes:

- N/A
2025-11-10 21:39:44 +00:00
Connor Tsui
a44fc9a1de Rename ThemeMode to ThemeAppearanceMode (#42279)
There was a TODO in `crates/settings/src/settings_content/theme.rs` to
make this rename.

This PR is just splitting off this change from
https://github.com/zed-industries/zed/pull/40035 to make reviewing that
one a bit easier since that PR is a bit more involved than expected.

Release Notes:

- N/A

Signed-off-by: Connor Tsui <connor.tsui20@gmail.com>
2025-11-10 14:26:01 -07:00
Ole Jørgen Brønner
efcd7f7d10 Slightly improve completion in settings.json (for lsp.<language-server>.) (#42263)
Document "any-typed" (`serde_json::Value`) "lsp" keys to include them in
json-language-server completions.

The vscode-json-languageserver seems to skip generically typed keys when
offering completion.

For this schema

```
    "LspSettings": {
        "type": "object",
        "properties": {
            ...
            "initialization_options": true,
            ...
         }
     }
```

"initialization_options" is not offered in the completion.

The effect is easy to verify by triggering completion inside:

```
    "lsp": {
        "basedpyright": {
           COMPLETE HERE
```

<img width="797" height="215" alt="image"
src="https://github.com/user-attachments/assets/d1d1391c-d02c-4028-9888-8869f4d18b0f"
/>

By adding a documentation string the keys are offered even if they are
generically typed:

<img width="809" height="238" alt="image"
src="https://github.com/user-attachments/assets/9a072da9-961b-4e15-9aec-3d56933cbe67"
/>

---

Note: I did some cursory research into whether it's possible to make
vscode-json-languageserver change this behavior, without success. IMO,
not offering completions here is a bug (or at minimum should be
configurable).

---

Release Notes:

- N/A

---------

Co-authored-by: Kirill Bulatov <mail4score@gmail.com>
2025-11-10 23:08:40 +02:00
John Tur
aaf2f9d309 Ignore "Option as Meta" setting outside of macOS (#42367)
The "Option" key only exists on a Mac. On other operating systems, it is
always expected that the Alt key generates escaped characters.

Fixes https://github.com/zed-industries/zed/issues/40583
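
The "escaped characters" convention referenced above is the terminal Meta encoding: a keystroke with Meta held is sent as an ESC (0x1b) prefix followed by the character's bytes. A minimal sketch of that convention (the function name is an assumption, not Zed's code):

```rust
// Sketch of the terminal Alt-as-Meta convention: ESC prefix + the
// character's UTF-8 bytes. Illustrative only.
fn encode_keystroke(c: char, alt_is_meta: bool) -> Vec<u8> {
    let mut bytes = Vec::new();
    if alt_is_meta {
        bytes.push(0x1b); // ESC prefix marks the Meta modifier
    }
    let mut buf = [0u8; 4];
    bytes.extend_from_slice(c.encode_utf8(&mut buf).as_bytes());
    bytes
}

fn main() {
    assert_eq!(encode_keystroke('f', true), vec![0x1b, b'f']);
    assert_eq!(encode_keystroke('f', false), vec![b'f']);
}
```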

Release Notes:

- N/A
2025-11-10 15:11:18 -05:00
Finn Evers
62e3a49212 editor: Fix rare panic in wrap map (#39379)
Closes ZED-1SV
Closes ZED-TG
Closes ZED-22G
Closes ZED-22J

This seems to fix the reported error there, but ultimately, this might
benefit from a test to reproduce. Hence, marking as draft for now.

Release Notes:

- Fixed a rare panic whilst wrapping lines.
2025-11-10 20:08:48 +00:00
Finn Evers
87d0401e64 editor: Show relative line numbers for deleted rows (#42378)
Closes #42191

This PR adds support for relative line numbers in deleted hunks. Note
that this only applies in cases where there is a form of relative
numbering.
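
Relative numbering itself is simple: each row is labeled with its distance from the cursor row, and the fix extends that labeling to rows in deleted hunks. A minimal sketch (`relative_label` is an illustrative helper, not Zed's actual layout code):

```rust
// Sketch of relative line numbering: label each row with its distance
// from the cursor's row. Illustrative helper, not Zed's implementation.
fn relative_label(row: u32, cursor_row: u32) -> u32 {
    cursor_row.abs_diff(row)
}

fn main() {
    let labels: Vec<u32> = (0..5).map(|row| relative_label(row, 2)).collect();
    assert_eq!(labels, vec![2, 1, 0, 1, 2]);
}
```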

It also adds some tests for this functionality, as well as missing tests
for other line-layout cases that were previously untested.

Release Notes:

- Line numbers will now be shown in deleted git hunks if relative line
numbering is enabled
2025-11-10 21:00:50 +01:00
Danilo Leal
2c375e2e0a agent_ui: Ensure message editor placeholder text is accurate (#42375)
This PR creates a dedicated function for the agent panel message
editor's placeholder text so that we can wait for agent initialization
to learn whether the agent supports slash commands. On the one (nice)
hand, this allows us to stop matching agents by name and makes this a
bit more generic. On the other (bad) hand, the "/ for commands" bit may
take a moment to show up, because we can only know whether an agent
supports it after it is initialized.

This is particularly relevant now that we have agents coming from
extensions; for them, we would obviously not be able to match by name.

Release Notes:

- agent: Fixed agent panel message editor's placeholder text by making
it more accurate as to whether agents support slash commands,
particularly those coming from extensions.
2025-11-10 16:50:52 -03:00
Conrad Irwin
c24f9e47b4 Try to download wasi-sdk ahead of time (#42377)
This hopefully resolves the lingering test failures on linux,
but also adds some logging just in case this isn't the problem...

Release Notes:

- N/A

---------

Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-11-10 19:50:43 +00:00
Andrew Farkas
3fbfea491d Support relative paths in LSP & DAP binaries (#42135)
Closes #41214

Release Notes:

- Added support for relative paths in LSP and DAP binaries

---------

Co-authored-by: Cole Miller <cole@zed.dev>
Co-authored-by: Julia Ryan <juliaryan3.14@gmail.com>
2025-11-10 19:33:00 +00:00
Finn Evers
2b369d7532 rust: Explicitly capture lifetime identifier (#42372)
Closes #42030

This matches VS Code's behavior, and is essentially what this capture
already does. However, the identifier capture was overridden by other
captures, hence the need to be explicit here.

| Before | After | 
| - | - |
| <img width="930" height="346" alt="Bildschirmfoto 2025-11-10 um 17 56
28"
src="https://github.com/user-attachments/assets/e938c863-0981-4368-ab0a-a01dd04cfb24"
/> | <img width="930" height="346" alt="Bildschirmfoto 2025-11-10 um 17
54 35"
src="https://github.com/user-attachments/assets/f3b74011-c75c-448a-819e-80e7e8684e92"
/> |


Release Notes:

- Improved lifetime highlighting in Rust using the `lifetime` capture.
2025-11-10 19:41:14 +01:00
Danilo Leal
ed61a79cc5 agent_ui: Fix history view losing focus when empty (#42374)
Closes https://github.com/zed-industries/zed/issues/42356

This PR fixes the history view losing focus by simply always displaying
the search editor. I don't think it's too weird to not have it when it's
empty, and it also ends up matching how regular pickers work.

Release Notes:

- agent: Fixed a bug where navigating the agent panel with the keyboard
wouldn't work if you visited the history view and it was empty/had no
entries.
2025-11-10 15:29:55 -03:00
Tim Vermeulen
aa6270e658 editor: Add sticky scroll (#42242)
Closes #5344


https://github.com/user-attachments/assets/37ec58b0-7cf6-4eea-9b34-dccf03d3526b

Release Notes:

- Added a setting to stick scopes to the top of the editor

---------

Co-authored-by: KyleBarton <kjb@initialcapacity.io>
Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-10 11:24:30 -07:00
Dino
d896af2f15 git: Handle buffer file path changes (#41944)
Update `GitStore.on_buffer_store_event` so that, when a
`BufferStoreEvent::BufferChangedFilePath` event is received, we check if
there's any diff state for the buffer and, if so, update it according to
the new file path, in case the file exists in the repository.

Closes #40499

Release Notes:

- Fixed issue with git diff tracking when updating a buffer's file from
an untracked to a tracked file
2025-11-10 18:19:08 +00:00
Agus Zubiaga
c748b177c4 zeta2 cli: Cache at LLM request level (#42371)
We'll now cache LLM responses at the request level (by hash of
URL+contents) for both context and prediction. This way we don't need to
worry about mistakenly using the cache when we change the prompt or its
components.
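
The caching scheme described above — keying on a hash of the request URL plus its contents — can be sketched with std's `DefaultHasher`. This is an assumption-laden illustration (the real CLI presumably persists entries to disk, elided here; `LlmCache` is a made-up name):

```rust
// Sketch of request-level caching keyed by hash(URL + body).
use std::collections::HashMap;
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

struct LlmCache {
    entries: HashMap<u64, String>, // hash(url, body) -> cached response
}

impl LlmCache {
    fn new() -> Self {
        Self { entries: HashMap::new() }
    }

    fn key(url: &str, body: &str) -> u64 {
        let mut h = DefaultHasher::new();
        url.hash(&mut h);
        body.hash(&mut h);
        h.finish()
    }

    /// Returns the cached response, or computes and stores it.
    fn get_or_insert_with(&mut self, url: &str, body: &str, f: impl FnOnce() -> String) -> String {
        self.entries.entry(Self::key(url, body)).or_insert_with(f).clone()
    }
}

fn main() {
    let mut cache = LlmCache::new();
    let a = cache.get_or_insert_with("https://api", "prompt v1", || "response".into());
    // Changing the prompt changes the key, so a stale cache entry
    // can never be reused by mistake.
    let b = cache.get_or_insert_with("https://api", "prompt v2", || "fresh".into());
    assert_eq!(a, "response");
    assert_eq!(b, "fresh");
}
```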

Release Notes:

- N/A

---------

Co-authored-by: Oleksiy Syvokon <oleksiy.syvokon@gmail.com>
2025-11-10 14:23:52 -03:00
Tryanks
ddf5937899 gpui: Move 'app closing on last window closed' behavior to app-side (#41436)
This commit is a continuation of #36548. As per [mikayla-maki's
Comment](https://github.com/zed-industries/zed/pull/36548#issuecomment-3412140698),
I removed the process management behavior located in GPUI and
reimplemented it in Zed.

Release Notes:

- N/A

---------

Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
2025-11-10 17:19:35 +00:00
Piotr Osiewicz
6e1d86f311 fs: Handle io::ErrorKind::NotADirectory in fs::metadata (#42370)
New error variants were stabilized in 1.83, and this might've led to us
mis-handling not-a-directory errors.

Co-authored-by: Dino <dino@zed.dev>

Release Notes:

- N/A

Co-authored-by: Dino <dino@zed.dev>
2025-11-10 18:18:40 +01:00
Abul Hossain Khan
a3f04e8b36 agent_ui: Fix thread history item showing GMT time instead of local time on Windows (#42198)
Closes #42178
Now it's consistent with the DateAndTime path which already does
timezone conversion.

- **Future Work**
Happy to tackle the TODO in `time_format.rs` about implementing native
Windows APIs for proper localized formatting (similar to macOS's
`CFDateFormatter`) as a follow-up.

Release Notes:

- agent: Fixed the thread history item timestamp, which was being shown
in GMT instead of in the user's local timezone on Windows.
2025-11-10 13:34:59 -03:00
Danilo Leal
3c81ee6ba6 agent_ui: Allow to configure a default model for profiles through modal (#42359)
Follow-up to https://github.com/zed-industries/zed/pull/39220

This PR allows configuring a default model for a given profile through
the profile management modal.

| Option In Picker | Model Selector |
|--------|--------|
| <img width="1172" height="538" alt="Screenshot 2025-11-10 at 12  24
2@2x"
src="https://github.com/user-attachments/assets/33dfb6f1-f8fd-42f9-b824-3dab807094da"
/> | <img width="1172" height="1120" alt="Screenshot 2025-11-10 at 12 
24@2x"
src="https://github.com/user-attachments/assets/50360b0a-fbb1-455e-9cf7-9fa987345038"
/> |

Release Notes:

- N/A
2025-11-10 13:12:13 -03:00
Miguel Raz Guzmán Macedo
35ae2f5b2b typo: Use tips from proselint (#42362)
I ran [proselint](https://github.com/amperser/proselint) (recommended by
cURL author [Daniel
Stenberg](https://daniel.haxx.se/blog/2022/09/22/taking-curl-documentation-quality-up-one-more-notch/))
against all the `.md` files in the codebase to see if I could fix some
easy typos.

The tool is noisier than I would like and picking up the overrides to
the default config in a `.proselintrc.json` was much harder than I
expected.

There are many other small nits [1] that I believe are best left to your
docs czar whenever they want to consider incorporating a tool like this
into big releases or CI, but these seemed like small wins for now to
open a conversation about a tool like proselint.

---

[1]: Such nits include
- inconsistent 1 or 2 spaces
- "color" vs "colour"
- ab/use of `very`
- awkward or superfluous phrasing.

Release Notes:

- N/A

Signed-off-by: mrg <miguelraz@ciencias.unam.mx>
2025-11-10 17:51:44 +02:00
Agus Zubiaga
d420dd63ed zeta: Improve unified diff prompt (#42354)
Extracts some of the improvements to the unified diff prompt from
https://github.com/zed-industries/zed/pull/42171 and adds some others
about how context works, to improve the reliability of predictions.

We also now strip the `<|user_cursor|>` marker if it appears in the
output rather than failing.

Release Notes:

- N/A

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-11-10 14:58:42 +00:00
feeiyu
42ed032f12 Fix circular reference issue between EditPredictionButton and PopoverMenuHandle (#42351)

While working on issue #40906, I discovered that RemoteClient was not
being released after the remote project closed.
Analysis revealed a circular reference between EditPredictionButton and
PopoverMenuHandle.

Dependency Chain: RemoteClient → Project → ZetaEditPredictionProvider →
EditPredictionButton ↔ PopoverMenuHandle

<img width="400" height="300" alt="image"
src="https://github.com/user-attachments/assets/6b716c9b-6938-471a-b044-397314b729d4"
/>

a) EditPredictionButton holds a reference to PopoverMenuHandle

5f8226457e/crates/zed/src/zed.rs (L386-L394)

b) PopoverMenuHandle holds a reference to an Fn which captures
`Entity<EditPredictionButton>`

5fc54986c7/crates/edit_prediction_button/src/edit_prediction_button.rs (L382-L389)


a9bc890497/crates/ui/src/components/popover_menu.rs (L376-L384)
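
The leak pattern above is a classic strong-reference cycle. A minimal reproduction using plain `Rc`/`RefCell` in place of Zed's `Entity`/`PopoverMenuHandle` types (so this is a sketch of the pattern, not the actual fix): two objects holding strong references to each other never drop, while downgrading one edge to `Weak` breaks the cycle.

```rust
// Minimal model of the cycle: Button <-> Handle. The back-edge from
// Handle to Button is Weak, which is what prevents the leak.
use std::cell::RefCell;
use std::rc::{Rc, Weak};

struct Button {
    handle: RefCell<Option<Rc<Handle>>>,
}

struct Handle {
    // Weak back-edge: breaks the reference cycle.
    button: RefCell<Weak<Button>>,
}

fn main() {
    let button = Rc::new(Button { handle: RefCell::new(None) });
    let handle = Rc::new(Handle { button: RefCell::new(Rc::downgrade(&button)) });
    *button.handle.borrow_mut() = Some(handle.clone());

    let weak_button = Rc::downgrade(&button);
    drop(handle);
    drop(button);
    // With a Weak back-edge, both objects are freed once the last
    // external strong reference goes away; a strong back-edge would
    // keep both alive forever.
    assert!(weak_button.upgrade().is_none());
}
```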


Release Notes:

- N/A
2025-11-10 16:52:03 +02:00
David
2d84af91bf agent: Add ability to set a default_model per profile (#39220)
Split off from https://github.com/zed-industries/zed/pull/39175

Requires https://github.com/zed-industries/zed/pull/39219 to be merged
first

Adds support for `default_model` for profiles: 

```
      "my-profile": {
        "name": "Coding Agent",
        "tools": {},
        "enable_all_context_servers": false,
        "context_servers": {},
        "default_model": {
          "provider": "copilot_chat",
          "model": "grok-code-fast-1"
        }
      }
```

Which will then switch to the default model whenever the profile is
activated

![2025-09-30 17 09
06](https://github.com/user-attachments/assets/43f07b7b-85d9-4aff-82ce-25d6f5050d50)


Release Notes:

- Added `default_model` configuration to agent profile

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-10 11:11:24 -03:00
Abdugani Toshmukhamedov
7aacc7566c Add support for closing window tabs with middle mouse click (#41628)
This change adds support for closing system window tabs by pressing the
middle mouse button.
It improves tab-management UX by matching common tab behavior.

Release Notes:

- Added support for closing system window tabs with middle mouse click.
2025-11-10 15:09:37 +01:00
Caleb Van Dyke
8d632958db Add better labels for completions for ty lsp (#42233)
Verified that this works locally. I modeled it after how basedpyright
and pyright work. Here is a screenshot of what it looks like (issue has
screenshots of the old state):

<img width="593" height="258" alt="Screenshot 2025-11-07 at 2 40 50 PM"
src="https://github.com/user-attachments/assets/5d2371fc-360b-422f-ba59-0a95f2083c87"
/>

Closes #42232

Release Notes:

- python/ty: Code completion menu now shows packages that will be
imported when a given entry is accepted.
2025-11-10 13:28:12 +01:00
Lukas Wirth
0149de4b54 git: Fix panic in git2 due to empty repo paths (#42304)
Fixes ZED-1VR

Release Notes:

- Fixed sporadic panic in git features
2025-11-10 09:27:51 +00:00
Jakub Konka
359160c8b1 git: Add askpass delegate to git-commit handlers (#42239)
In my local setup, I always enforce git-commit signing with GPG/SSH
which automatically enforces `git commit -S` when committing. This
changeset will now show a modal to the user for them to specify the
passphrase (if any) so that they can unlock their private key for
signing when committing in Zed.

<img width="1086" height="948" alt="Screenshot 2025-11-07 at 11 09
09 PM"
src="https://github.com/user-attachments/assets/ac34b427-c833-41c7-b634-8781493f8a5e"
/>


Release Notes:

- Handle automatic git-commit signing by presenting the user with an
askpass modal
2025-11-10 07:57:50 +00:00
Max Brunsfeld
b8081ad7a6 Make it easy to point zeta2 at ollama (#42329)
I wanted to be able to work offline, so I made it a little bit more
convenient to point zeta2 at ollama.

* For zeta2, don't require that request ids be UUIDs
* Add an env var `ZED_ZETA2_OLLAMA` that sets the edit prediction URL
and model id to work w/ ollama.

Release Notes:

- N/A
2025-11-09 21:10:36 -08:00
ᴀᴍᴛᴏᴀᴇʀ
35c58151eb git: Fix support for self-hosted Bitbucket (#42002)
Closes #41995

Release Notes:

- Fixed support for self-hosted Bitbucket
2025-11-09 21:37:22 -05:00
Ayush Chandekar
e025ee6a11 git: Add base branch support to create_branch (#42151)
Closes [#41674](https://github.com/zed-industries/zed/issues/41674)

Description:
Creating a branch from a base previously required switching to the base
branch first, then creating the new branch and checking it out, which
takes multiple operations.

This adds a `base_branch` parameter to `create_branch` to allow creating
a new branch from a base branch in one operation, equivalent to the
command `git switch -c <new-branch> <base-branch>`.
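
As a rough sketch of the one-shot invocation described above, here is a helper that builds the argument list for `git switch` (`git_switch_args` is an illustrative name; Zed's actual `create_branch` plumbing differs):

```rust
// Build the argument list for `git switch -c <new-branch> [<base-branch>]`.
// Illustrative helper, not Zed's implementation.
fn git_switch_args(new_branch: &str, base_branch: Option<&str>) -> Vec<String> {
    let mut args = vec!["switch".to_string(), "-c".to_string(), new_branch.to_string()];
    if let Some(base) = base_branch {
        args.push(base.to_string()); // start point for the new branch
    }
    args
}

fn main() {
    assert_eq!(
        git_switch_args("new-branch-2", Some("master")),
        vec!["switch", "-c", "new-branch-2", "master"]
    );
}
```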

Below is the video after solving the issue: 

(`master` branch is the default branch here, and I create a branch
`new-branch-2` based off the `master` branch. I also show the error
which used to appear before the fix.)

[Screencast from 2025-11-07
05-14-32.webm](https://github.com/user-attachments/assets/d37d1b58-af5f-44e8-b867-2aa5d4ef3d90)

Release Notes:

- Fixed the branch-picking error by replacing multiple sequential switch
operations with just one switch operation.

Signed-off-by: ayu-ch <ayu.chandekar@gmail.com>
2025-11-09 21:35:29 -05:00
Mayank Verma
c60d31a726 git: Track worktree references to resolve stale repository state (#41592)
Closes #35997
Closes #38018
Closes #41516

Release Notes:

- Fixed stale git repositories persisting after removal
2025-11-09 21:24:20 -05:00
Bennet Bo Fenner
0bcf607a28 agent_ui: Always allow to include symbols (#42261)
We can always include symbols, since we either include a ResourceLink to
the symbol (when `PromptCapabilities::embedded_context = false`) or a
Resource (when `PromptCapabilities::embedded_context = true`)

Release Notes:

- Fixed an issue where symbols could not be included when using specific
ACP agents
2025-11-09 20:45:00 +01:00
Bennet Bo Fenner
431a195c32 acp: Fix issue with mentions when embedded_context is set to false (#42260)
Release Notes:

- acp: Fixed an issue where Zed would not respect
`PromptCapabilities::embedded_context`
2025-11-09 20:44:07 +01:00
Danilo Leal
6db6251484 agent_ui: Fix external agent icons in configuration view (#42313)
This PR makes the icons for external agents in the configuration view
use `from_external_svg` instead of `from_path`.

Release Notes:

- N/A
2025-11-09 14:32:19 -03:00
Danilo Leal
2fb3d593bc agent_ui: Add component to standardize the configured LLM card (#42314)
This PR adds a new component to the `language_models` crate called
`ConfiguredApiCard`:

<img width="500" height="420" alt="Screenshot 2025-11-09 at 2  07@2x"
src="https://github.com/user-attachments/assets/655ea941-2df8-4489-a4da-bba34acf33a9"
/>

We were previously recreating this component from scratch with regular
divs in all LLM providers render function, which was redundant as they
all essentially looked the same and didn't have any major variations
aside from labels. We can clean up a bunch of similar code with this
change, which is cool!

Release Notes:

- N/A
2025-11-09 14:32:05 -03:00
chenmi
cc1d66b530 agent_ui: Improve API key configuration UI display (#42306)
Improve the layout and text display of API key configuration in multiple
language model providers to ensure proper text wrapping and ellipsis
handling when API URLs are long.

Before:

<img width="320" alt="image"
src="https://github.com/user-attachments/assets/2f89182c-34a0-4f95-a43a-c2be98d34873"
/>

After:

<img width="320" alt="image"
src="https://github.com/user-attachments/assets/09bf5cc3-07f0-47bc-b21a-d84b8b1caa67"
/>

Changes include:
- Add proper flex layout with overflow handling
- Replace truncate_and_trailoff with CSS text ellipsis
- Ensure consistent UI behavior across all providers

Release Notes:

- Improved API key configuration display in language model settings
2025-11-09 13:00:31 -03:00
Lukas Wirth
5d08c1b35f Suppress more rust-analyzer error logs (#42299)
Release Notes:

- N/A
2025-11-09 11:27:16 +00:00
Lukas Wirth
b7d4d1791a diagnostics: Keep diagnostic excerpt ranges properly ordered (#42298)
Fixes ZED-2CQ

We were doing the binary search by buffer points, but due to await
points within this function we could end up mixing points of differing
buffer versions.

Release Notes:

- N/A
2025-11-09 10:55:56 +00:00
John Tur
81d38d9872 Additional Windows keyboard input fixes (#42294)
- Enable Alt+Numpad input
- For this to be effective, the default keybindings for Alt+{Number}
will need to be unbound. This won't be needed once we gain the ability
to differentiate numpad digit keys from alphanumeric digit keys.
  - Fixes https://github.com/zed-industries/zed/issues/40699
- Fix a number of edge cases with dead keys

Release Notes:

- N/A
2025-11-09 03:51:39 -05:00
Matt Miller
21f73d9c02 Use ButtonLike and add OpenExcerptsSplit and dispatches on click (#42283)
Closes #42099 

Release Notes:

- N/A
2025-11-08 17:47:01 -06:00
Danilo Leal
12857a7207 agent: Improve AddSelectionToThread action display (#42280)
Closes https://github.com/zed-industries/zed/issues/42276

This fixes the fact that the `AddSelectionToThread` action was visible
when `disable_ai` was true, as well as it improves its display by making
it either disabled or hidden when there are no selections in the editor.
I also ended up removing it from the app menu simply because making it
observe the `disable_ai` setting would be a bit more complex than I'd
like at the moment, so figured that, given I'm also now adding it to the
toolbar selection menu, we could do without it over there.

Release Notes:

- Fixed the `AddSelectionToThread` action showing up when `disable_ai`
is true
- Improved the `AddSelectionToThread` action display by only making it
available when there are selections in the editor
2025-11-08 14:41:08 -03:00
Danilo Leal
f6be16da3b docs: Update text threads page (#42273)
Adding a section clarifying the difference between regular threads vs.
text threads.

Release Notes:

- N/A
2025-11-08 12:55:31 -03:00
Danilo Leal
94aa643484 docs: Update agent tools page (#42271)
Release Notes:

- N/A
2025-11-08 12:54:42 -03:00
Jakub Konka
28a85158c7 shell_env: Wrap error context in format! where missing (#42267)
Release Notes:

- N/A
2025-11-08 15:46:01 +00:00
boris.zalman
a2c2c617b5 Add helix keymap to delete without yanking (#41988)
Added the previously missing Helix keymap for deleting without yanking.
The "editor::Delete" action is mapped to "alt-d" as in
https://docs.helix-editor.com/master/keymap.html#changes

Release Notes:

- N/A
2025-11-08 15:53:34 +01:00
Xipeng Jin
77667f4844 Remove Markdown CodeBlock metadata and Custom rendering (#42211)
Follow up #40736

Clean up `CodeBlockRenderer::Custom` related rendering per the previous
PR
[comment](https://github.com/zed-industries/zed/pull/40736#issuecomment-3503074893).
Additional notes here:
1. The `Custom` variant of the `CodeBlockRenderer` enum is no longer
useful now that all of the custom rendering logic has been removed.
2. The usage of the code block `metadata` field in the
`MarkdownTag::CodeBlock` enum needs further review.

I would like the team to review the notes above so that we can make sure
it is safe to clean this up without affecting any potential future
features that may be built on top of it. Thank you!

Release Notes:

- N/A
2025-11-08 14:00:53 +01:00
Hyeondong Lee
b01a6fbdea Fix missing highlight for macro_invocation bang (#41572)
(Not sure if this was left out on purpose, but this makes things feel a
bit more consistent since [VS Code parses bang mark as part of the macro
name](https://github.com/microsoft/vscode/blob/main/extensions/rust/syntaxes/rust.tmLanguage.json#L889-L905))


Release Notes:
- Added the missing highlight for the bang mark in macro invocations.


| **Before** | **After** |
| :---: | :---: |
| <img width="684" height="222" alt="before"
src="https://github.com/user-attachments/assets/ae71eda3-76b5-4547-b2df-4e437a07abf5"
/> | <img width="646" height="236" alt="fixed"
src="https://github.com/user-attachments/assets/500deda5-d6d8-439c-8824-65c2fb0a5daa"
/> |
2025-11-08 08:42:33 +00:00
Roland Rodriguez
44d91c1709 docs: Explain what scrollbar marks represent (#42130)
## Summary

Adds explanations for what each type of scrollbar indicator visually
represents in the editor.

## Description

This PR addresses the issue where users didn't understand what the
colored marks on the scrollbar mean. The existing documentation
explained how to toggle each type of mark on/off, but didn't explain
what they actually represent.

This adds a brief, clear explanation after each scrollbar indicator
setting describing what that indicator shows (e.g., "Git diff indicators
appear as colored marks showing lines that have been added, modified, or
deleted compared to the git HEAD").

## Fixes

Closes #31794

## Test Plan

- Documentation follows the existing style and format of
`docs/src/configuring-zed.md`
- Each explanation is concise and immediately follows the setting
description
- Language is clear and user-friendly

Release Notes:

- N/A
2025-11-08 10:40:18 +02:00
Donnie Adams
d187cbb188 Add comment injection support to remaining languages (#41710)
Release Notes:

- Added support for comment language injections for remaining built-in
languages and multi-line support for Rust
2025-11-08 10:34:53 +02:00
Agus Zubiaga
c241eadbc3 zeta2: Targeted retrieval search (#42240)
Since we removed the filtering step during context gathering, we want
the model to perform more targeted searches. This PR tweaks search tool
schema allowing the model to search within syntax nodes such as `impl`
blocks or methods.

This is what the query schema looks like now:

```rust
/// Search for relevant code by path, syntax hierarchy, and content.
#[derive(Debug, Clone, Serialize, Deserialize, JsonSchema)]
pub struct SearchToolQuery {
    /// 1. A glob pattern to match file paths in the codebase to search in.
    pub glob: String,
    /// 2. Regular expressions to match syntax nodes **by their first line** and hierarchy.
    ///
    /// Subsequent regexes match nodes within the full content of the nodes matched by the previous regexes.
    ///
    /// Example: Searching for a `User` class
    ///     ["class\s+User"]
    ///
    /// Example: Searching for a `get_full_name` method under a `User` class
    ///     ["class\s+User", "def\sget_full_name"]
    ///
    /// Skip this field to match on content alone.
    #[schemars(length(max = 3))]
    #[serde(default)]
    pub syntax_node: Vec<String>,
    /// 3. An optional regular expression to match the final content that should appear in the results.
    ///
    /// - Content will be matched within all lines of the matched syntax nodes.
    /// - If syntax node regexes are provided, this field can be skipped to include as much of the node itself as possible.
    /// - If no syntax node regexes are provided, the content will be matched within the entire file.
    pub content: Option<String>,
}
```
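
As a standalone illustration of the kind of query this schema describes, the struct is redefined below without the serde/schemars derives so the sketch compiles on its own (the regexes are examples, not output from the model):

```rust
// Local redefinition of the query shape for illustration only; the real
// struct carries Serialize/Deserialize/JsonSchema derives.
#[derive(Debug, Clone, PartialEq)]
pub struct SearchToolQuery {
    pub glob: String,
    pub syntax_node: Vec<String>,
    pub content: Option<String>,
}

fn main() {
    // "Find `get_full_name` under the `User` class in Python files."
    let query = SearchToolQuery {
        glob: "**/*.py".to_string(),
        syntax_node: vec![r"class\s+User".to_string(), r"def\s+get_full_name".to_string()],
        content: None,
    };
    // The schema caps syntax-node nesting at three levels.
    assert!(query.syntax_node.len() <= 3);
    assert!(query.content.is_none());
}
```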

We'll need to keep refining this, but the core implementation is ready.

Release Notes:

- N/A

---------

Co-authored-by: Ben <ben@zed.dev>
Co-authored-by: Max <max@zed.dev>
Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-11-08 01:06:12 +00:00
Mikayla Maki
5f8226457e Automate settings registration (#42238)
Release Notes:

- N/A

---------

Co-authored-by: Nia <nia@zed.dev>
2025-11-07 22:27:14 +00:00
Anthony Eid
309947aa53 editor: Allow clicking on excerpts with alt key to open file path (#42235)
#42021 made clicking an excerpt title toggle it. This PR brings back the
old behavior when the user holds the Alt key while clicking an excerpt
title.

Release Notes:

- N/A

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-11-07 16:23:55 -05:00
Lukas Wirth
0881e548de terminal: Spawn terminal process on main thread on unix (#42234)
Otherwise the terminal will not process the signals correctly 

Release Notes:

- Fixed ctrl+c and friends not working in the terminal on macOS and
Linux
2025-11-07 20:56:43 +00:00
claytonrcarter
4511d11a11 bundle: Skip sentry upload for local install on macOS (#42231)
This is a follow up to #41482. When running `script/bundle-mac`, it will
upload debug symbols to Sentry if you have a `$SENTRY_AUTH_TOKEN` set. I
happen to have one set, so this script was trying to generate and upload
those. Whoops! This change skips the upload entirely if you're running a
local install.

Release Notes:

- N/A
2025-11-07 12:38:01 -08:00
Conrad Irwin
19d2fdb6c6 Refresh releases page post deploy (#42218)
Release Notes:

- N/A
2025-11-07 13:34:47 -07:00
Conrad Irwin
20953ecb9d Make nightly bucket objects public (#42229)
Makes it easier to port updates to cloudflare

Release Notes:

- N/A
2025-11-07 19:43:08 +00:00
Anthony Eid
7475bdaf20 debugger: Truncate scope names to avoid text overlapping in variable list (#42230)
Closes #41969

This was caused by scope names not being truncated, unlike the other
types of variable list entries.

Release Notes:

- debugger: Fix bug where minimizing the width of the variable list
would cause scope names to overlap

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-11-07 19:42:14 +00:00
Dima
8e4c807c6a Fix duplicated 'the' typo (#42225)
Just found this while reading the docs.

Release Notes:

- N/A
2025-11-07 21:37:10 +02:00
John Tur
a66dac7b3a Fix crash during drag-and-drop on Windows (#42227)
The HGLOBAL is itself the HDROP. Do not dereference it.

Release Notes:

- windows: Fixed crashes during drag-and-drop operations
2025-11-07 19:18:47 +00:00
morgankrey
8a903f9c10 2025 11 07 update privacy docs (#42226)
Docs update

Release Notes:

- N/A
2025-11-07 13:16:13 -06:00
Lukas Wirth
9f9575d100 Silence rust-analyzer startup errors (#42222)
When rust-analyzer is still loading the cargo project it tends to error
out on most lsp requests with `content modified`. This pollutes our
logs.

Release Notes:

- N/A
2025-11-07 18:43:43 +00:00
Marshall Bowers
00898d46c0 docs: Add docs on extension capabilities (#42223)
This PR adds some initial docs on extension capabilities.

Release Notes:

- N/A
2025-11-07 18:38:20 +00:00
Joseph T. Lyons
bcc3307a7e Add optional Zed log field to all bug report templates (#42221)
Release Notes:

- N/A
2025-11-07 18:28:35 +00:00
Lukas Wirth
8ba33ad270 gpui: Do not unwrap in window_procedure (#42216)
Technically these should not be possible to hit, but sentry says
otherwise. Turning these into errors should give us more information
than the abort due to unwinding across ffi boundaries.

Fixes ZED-321

Release Notes:

- N/A
2025-11-07 18:10:30 +00:00
Marshall Bowers
1e6344899d docs: Update extension features language (#42215)
This PR updates the language around what features extensions can
provide.

Release Notes:

- N/A
2025-11-07 17:43:22 +00:00
Jakub Konka
93f9cff876 Remove invalid assertion in editor (#42210)
Release Notes:

- N/A
2025-11-07 17:36:10 +00:00
Lukas Wirth
6cafe4a9c5 gpui: Do not panic when unable to find the selected fonts (#42212)
Fixes ZED-329

Release Notes:

- N/A
2025-11-07 17:08:37 +00:00
Lukas Wirth
585c440e6e util: Support shell env fetching for git bash (#42208)
Release Notes:

- N/A
2025-11-07 16:28:41 +00:00
Lukas Wirth
083bd147ef util: Fall back to cmd if we can't find powershell on the system (#42204)
Closes https://github.com/zed-industries/zed/issues/42165

Release Notes:

- Fixed trying to use PowerShell for commands when it's not installed on
the system
2025-11-07 17:13:52 +01:00
Remco Smits
9591790d8d markdown: Add support for HTML styling attributes (#42143)
Second take on https://github.com/zed-industries/zed/pull/37765.

This PR adds support for styling elements (**b**, **strong**, **em**,
**i**, **ins**, **del**) and also lets the styled text be shown inline
with the surrounding text.
This is done by appending all of the following text into one text chunk
and merging the highlights from both into the already existing chunk. If
no text chunk exists yet, we create one, and the next iteration uses it
to store all the information.
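
The chunk-merging idea can be sketched like this (illustrative types only; `Chunk` and `append_styled` are hypothetical names, not Zed's actual markdown internals):

```rust
use std::ops::Range;

// Hypothetical stand-ins for the renderer's text chunk and highlights;
// each highlight is a byte range into `text` plus a style name.
#[derive(Default, Debug)]
struct Chunk {
    text: String,
    highlights: Vec<(Range<usize>, &'static str)>,
}

// Append `text` to the existing chunk, recording a highlight for the
// new range instead of creating a separate chunk per styled element.
fn append_styled(chunk: &mut Chunk, text: &str, style: Option<&'static str>) {
    let start = chunk.text.len();
    chunk.text.push_str(text);
    if let Some(style) = style {
        chunk.highlights.push((start..chunk.text.len(), style));
    }
}

fn main() {
    let mut chunk = Chunk::default();
    append_styled(&mut chunk, "some text ", None);
    append_styled(&mut chunk, "bold text", Some("bold"));
    assert_eq!(chunk.text, "some text bold text");
    assert_eq!(chunk.highlights, vec![(10..19, "bold")]);
    println!("{chunk:?}");
}
```

The key point is that consecutive styled runs land in one chunk with merged highlight ranges, rather than each `<b>`/`<em>` span producing its own chunk.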

**Before**
<img width="483" height="692" alt="Screenshot 2025-11-06 at 22 08 09"
src="https://github.com/user-attachments/assets/6158fd3b-066c-4abe-9f8e-bcafae85392e"
/>

**After**
<img width="868" height="300" alt="Screenshot 2025-11-06 at 22 08 21"
src="https://github.com/user-attachments/assets/4d5a7a33-d31c-4514-91c8-2b2a2ff43e0e"
/>

**Code example**
```html
<p>some text <b>bold text</b></p>
<p>some text <strong>strong text</strong></p>
<p>some text <i>italic text</i></p>
<p>some text <em>emphasized text</em></p>
<p>some text <del>delete text</del></p>
<p>some text <ins>insert text</ins></p>

<p>Some text <strong>strong text</strong> more text <b>bold text</b> more text <i>italic text</i> more text <em>emphasized text</em> more text <del>deleted text</del> more text <ins>inserted text</ins></p>

<p><a href="https://example.com">Link Text</a></p>

<p style="text-decoration: underline;">text styled from style attribute</p>
```

cc @bennetbo 

**TODO**
- [x] add tests for styling nested text that should result in one merge

Release Notes:

- Markdown Preview: Added support for `HTML` styling elements

---------

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-11-07 16:12:27 +00:00
Lukas Wirth
74bf1a170d recent_projects: Do not try to watch /etc/ssh/ssh_config on windows (#42200)
Release Notes:

- N/A
2025-11-07 15:46:37 +00:00
Bennet Bo Fenner
e72c3bf20d agent_ui: Remove message_editor module (#42195)
Artefact from agent1 removal

Release Notes:

- N/A
2025-11-07 14:36:58 +00:00
Maokaman1
160bf915aa language_models: Filter out whitespace-only text content parts for OpenAI and OpenAI compatible providers (#40316)
Closes #40097

When multiple files are added sequentially to the agent panel, the
request JSON incorrectly includes "text" elements containing only
spaces. These empty elements cause the Zhipu AI API to return a "text
cannot be empty" error.
The fix filters out any "text" elements that are empty or contain only
whitespace.
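
The filtering step can be sketched roughly as follows (`TextPart` and `filter_empty_parts` are illustrative stand-ins, not the provider's real content-part types):

```rust
// Illustrative sketch of dropping whitespace-only text parts before
// building an OpenAI-compatible request body.
#[derive(Debug, PartialEq)]
struct TextPart {
    text: String,
}

fn filter_empty_parts(parts: Vec<TextPart>) -> Vec<TextPart> {
    // Drop any part whose text is empty or whitespace-only, which some
    // providers (e.g. Zhipu AI) reject with "text cannot be empty".
    parts
        .into_iter()
        .filter(|p| !p.text.trim().is_empty())
        .collect()
}

fn main() {
    let parts = vec![
        TextPart { text: "[@1.txt](...)".into() },
        TextPart { text: " ".into() },          // whitespace-only: removed
        TextPart { text: " describe".into() },  // has content: kept
    ];
    let filtered = filter_empty_parts(parts);
    assert_eq!(filtered.len(), 2);
    println!("{} parts kept", filtered.len());
}
```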

UI state when the error occurs:
<img width="300" alt="Image"
src="https://github.com/user-attachments/assets/c55e5272-3f03-42c0-b412-fa24be2b0043"
/>

Request JSON (causing the error):
```
{
  "model": "glm-4.6",
  "messages": [
    {
      "role": "system",
      "content": "<<CUT>>"
    },
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "[@1.txt](zed:///agent/file?path=C%3A%5CTemp%5CTest%5C1.txt)"
        },
        { "type": "text", "text": " " },
        {
          "type": "text",
          "text": "[@2.txt](zed:///agent/file?path=C%3A%5CTemp%5CTest%5C2.txt)"
        },
        { "type": "text", "text": " describe" },
```

Release Notes:

- Fixed an issue when an OpenAI request contained whitespace-only text content

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-11-07 14:43:07 +01:00
Xipeng Jin
aff4c25a47 markdown: Restore horizontal scrollbars for codeblocks (#40736)
### Summary

Restore the agent pane’s code-block horizontal scrollbar for easier
scrolling without trackpad and preserve individual scroll state across
multiple code blocks.

### Motivation

Addresses https://github.com/zed-industries/zed/issues/34224, where
agent responses with wide code snippets couldn’t be scrolled
horizontally in the panel. Previously there was no visible scrollbar for
moving the code snippet, and it was not obvious that you could use the
trackpad or hold `shift` while scrolling. This PR lets the user drag a
horizontal scrollbar with the mouse alone to reveal the complete line
when the code overflows the width of the code block.

### Changes

- Support auto-hide horizontal scrollbar for rendering code block in
agent panel by adding scrollbar support in markdown.rs
- Add `code_block_scroll_handles` cache in
_crates/markdown/src/markdown.rs_ to give each code block a persistent
`ScrollHandle`.
- Wrap rendered code blocks with custom horizontal scrollbars that match
the vertical scrollbar styling and track hover visibility.
- Retain or clear scroll handles based on whether horizontal overflow is
enabled, preventing leaks when the markdown re-renders.

### How to Test

1. Open the agent panel, request code generation, and ensure wide
snippets show a horizontal scrollbar on hover.
2. Scroll horizontally, navigate away (e.g., change tabs or trigger a
re-render), and confirm the scroll position sticks when returning.
3. Toggle horizontal overflow styling off/on (if applicable) and verify
scrollbars appear or disappear appropriately.

### Screenshots / Demos (if UI change)


https://github.com/user-attachments/assets/e23f94d9-8fe3-42f5-8f77-81b1005a14c8

### Notes for Reviewers

- This is my first contribution to `zed`, sorry for any code pattern
inconsistency. Please let me know if you have any comments and
suggestions to make the code pattern consistent and easy to maintain.
- For now, the horizontal scrollbar is not configurable from the setting
and the style is fixed with the same design as the vertical one. I am
happy to readjust this setting to fit the needs.
- Please let me know if you think any behaviors or designs need to be
changed for the scrollbar.
- All changes live inside _crates/markdown/src/markdown.rs_; no API
surface changes.

Closes #34224 

### Release Notes:

- AI: Show horizontal scroll-bars in wide markdown elements
2025-11-07 13:35:59 +00:00
aohanhongzhi
146e754f73 URL-encode the image paths in Markdown so that images with filenames (#41788)
Closes https://github.com/zed-industries/zed/issues/41786

Release Notes:

- markdown preview: Fixed an issue where path urls would not be parsed
correctly when containing URL-encoded characters

<img width="1680" height="1126"
alt="569415cb-b3e8-4ad6-b31c-a1898ec32085"
src="https://github.com/user-attachments/assets/7de8a892-ff01-4e00-a28c-1c5e9206ce3a"
/>
2025-11-07 14:02:40 +01:00
Kirill Bulatov
278fe91a9a Skip buffer registration if lsp data should be ignored (#42190)
Release Notes:

- N/A
2025-11-07 12:57:54 +00:00
Lukas Wirth
39fb89e031 workspace: Do not panic when the database is corrupted (#42186)
Fixes ZED-1NK

Release Notes:

- Fixed zed not starting when the database cannot be loaded
2025-11-07 12:36:36 +00:00
Casper van Elteren
9d52b6c538 terminal: Allow configuring conda manager (#40577)
Closes #40576
This PR makes Conda activation configurable and transparent by adding a
`terminal.detect_venv.on.conda_manager` setting (`"auto" | "conda" |
"mamba" | "micromamba"`, default `"auto"`), updating Python environment
activation to honor this preference (or the detected manager executable)
and fall back to `conda` when necessary.

The preference is passed via `ZED_CONDA_MANAGER` from the terminal
settings, and the activation command is built accordingly (with proper
quoting for paths). Changes span
`zed/crates/terminal/src/terminal_settings.rs` (new `CondaManager` and
setting), `zed/crates/project/src/terminals.rs` (inject env var),
`zed/crates/languages/src/python.rs` (activation logic), and
`zed/assets/settings/default.json` (document the setting). Default
behavior remains unchanged for most users while enabling explicit
selection of `mamba` or `micromamba`.
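
Based on the setting path named above, an explicit selection in `settings.json` would look roughly like this (surrounding `detect_venv` keys omitted; shape inferred from the `terminal.detect_venv.on.conda_manager` path):

```json
{
  "terminal": {
    "detect_venv": {
      "on": {
        "conda_manager": "micromamba"
      }
    }
  }
}
```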

Release Notes:
- Added: terminal.detect_venv.on.conda_manager setting to choose the
Conda manager (auto, conda, mamba, micromamba). Default: auto.
- Changed: Python Conda environment activation now respects the
configured manager, otherwise uses the detected environment manager
executable, and falls back to conda.
- Reliability: Activation commands quote manager paths to handle spaces
across platforms.
- Compatibility: No breaking changes; non-Conda environments are
unaffected; remote terminals are supported.

---------

Co-authored-by: Lukas Wirth <me@lukaswirth.dev>
Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-07 12:35:06 +00:00
Mikayla Maki
082b80ec89 gpui: Unify the index_for_x methods (#42162)
Supersedes https://github.com/zed-industries/zed/pull/39910

At some point, these two (`index_for_x` and `closest_index_for_x`)
methods were separated out and some code paths used one, while other
code paths took the other. That said, their behavior is almost
identical:

- `index_for_x` computes the index behind the pixel offset, and returns
`None` if there's an overshoot
- `closest_index_for_x` computes the nearest index to the pixel offset,
taking into account whether the offset is over halfway through or not.
If there's an overshoot, it returns the length of the line.

Given these two behaviors, `closest_index_for_x` seems to be a more
useful API than `index_for_x`, and indeed the display map and other core
editor features use it extensively. So this PR is an experiment in
simply replacing one behavior with the other.
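
A toy model of the two behaviors (hypothetical function bodies over a list of per-character x boundaries; the real implementations operate on shaped glyph runs):

```rust
// `boundaries` holds the x offset (in pixels) of each character
// boundary in a line; the last entry is the line's total width.

/// Index of the character containing `x`, or `None` on overshoot.
fn index_for_x(boundaries: &[f32], x: f32) -> Option<usize> {
    if x < 0.0 || x >= *boundaries.last()? {
        return None;
    }
    boundaries.windows(2).position(|w| x >= w[0] && x < w[1])
}

/// Index nearest to `x`; overshoot clamps to the line length.
fn closest_index_for_x(boundaries: &[f32], x: f32) -> usize {
    let len = boundaries.len() - 1;
    if x >= *boundaries.last().unwrap() {
        return len; // overshoot: return the length of the line
    }
    match index_for_x(boundaries, x) {
        Some(i) => {
            // Past the halfway point of a character, snap to the next index.
            let mid = (boundaries[i] + boundaries[i + 1]) / 2.0;
            if x > mid { i + 1 } else { i }
        }
        None => 0,
    }
}

fn main() {
    let b: [f32; 4] = [0.0, 10.0, 20.0, 30.0]; // a 3-character line
    assert_eq!(index_for_x(&b, 35.0), None);   // overshoot -> None
    assert_eq!(closest_index_for_x(&b, 35.0), 3); // overshoot -> line length
    assert_eq!(closest_index_for_x(&b, 9.0), 1);  // past midpoint -> next index
    println!("ok");
}
```

This illustrates why the two APIs only diverge on overshoot and near character midpoints, which is what makes collapsing them into the `closest_index_for_x` behavior plausible.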

Release Notes:

- Improved the accuracy of mouse selections in Markdown
2025-11-07 14:23:43 +02:00
Bennet Bo Fenner
483e31e42a Fix telemetry (#42184)
Follow up to #41991 🤦🏻 

Release Notes:

- N/A
2025-11-07 11:36:05 +00:00
Bennet Bo Fenner
61c263fcf0 agent_ui: Allow opening thread as markdown in remote projects (#42182)
Release Notes:

- Added support for opening thread as markdown in remote projects
2025-11-07 11:32:16 +00:00
Lukas Wirth
3c19174f7b diagnostics: Fix diagnostics view not clearing blocks correctly (#42179)
Release Notes:

- N/A
2025-11-07 11:16:25 +00:00
Bennet Bo Fenner
7e93c171b5 action_log: Remove unused code (#42177)
Release Notes:

- N/A
2025-11-07 10:33:51 +00:00
Lukas Wirth
88a8e53696 editor: Remove buffer and display map fields from SelectionsCollection (#42175)
Release Notes:

- N/A
2025-11-07 11:21:14 +01:00
Bennet Bo Fenner
c2416d6bab Add agent metrics (#41991)
Release Notes:

- N/A
2025-11-07 10:07:57 +00:00
Jason Lee
29cc3d0e18 gpui: Add to support check state to MenuItem (#39876)
Release Notes:

- N/A

---


https://github.com/user-attachments/assets/d46b77ae-88ba-43da-93ad-3656a7fecaf9

The system menu is only supported on macOS, so this only modifies the
macOS platform-specific code.

Windows and Linux use `ApplicationMenu`; I have already added a
`checked` option to Zed's ContextMenu.

Once this PR is merged, we can improve the "View" menu to show check
state for panels (Project Panel, Outline Panel, ...).
2025-11-07 09:42:55 +00:00
Remco Smits
760747f127 markdown: Add support for HTML table captions (#41192)
Thanks to @Angelk90 for pointing out that we were missing this feature.
This PR implements caption support for HTML tables in the markdown
preview.

**Code example**
```html
<table>
    <caption>Revenue by Region</caption>
    <tr>
        <th rowspan="2">Region</th>
        <th colspan="2">Revenue</th>
        <th rowspan="2">Growth</th>
    </tr>
    <tr>
        <th>Q2 2024</th>
        <th>Q3 2024</th>
    </tr>
    <tr>
        <td>North America</td>
        <td>$2.8M</td>
        <td>$2.4B</td>
        <td>+85,614%</td>
    </tr>
    <tr>
        <td>Europe</td>
        <td>$1.2M</td>
        <td>$1.9B</td>
        <td>+158,233%</td>
    </tr>
    <tr>
        <td>Asia-Pacific</td>
        <td>$0.5M</td>
        <td>$1.4B</td>
        <td>+279,900%</td>
    </tr>
</table>
```

**Result**:
<img width="1201" height="774" alt="Screenshot 2025-10-25 at 21 18 01"
src="https://github.com/user-attachments/assets/c2a8c1c2-f861-40df-b5c9-549932818f6e"
/>

Release Notes:

- Markdown preview: Added support for `HTML` table captions
2025-11-07 09:47:12 +01:00
Lukas Wirth
d4ec55b183 editor: Correctly handle blocks spanning more than 128 rows (#42172)
Release Notes:

- Fixed block rendering for blocks spanning more than 128 rows
2025-11-07 08:46:57 +00:00
Max Brunsfeld
f89bb2f0d2 Small zeta cli fixes (#42170)
* Fix a panic that happened because we lost the
`ContextRetrievalStarted` debug message, so we didn't assign `t0`.
* Write the edit prediction response log file as a markdown file
containing the text, not a JSON file. We mostly always want the text
content.

Release Notes:

- N/A
2025-11-07 07:34:05 +00:00
Conrad Irwin
e43c436cb6 Create sentry releases in after_release (#42169)
This had been moved to auto-release preview, and was not running for
stable.

Release Notes:

- N/A
2025-11-07 07:30:10 +00:00
Lukas Wirth
de1bf64f41 project: Remove unnecessary panic (#42167)
If we are in a remote session with the remote dropped, this path is very
much reachable if the call to this function got queued up in a task.

Fixes ZED-124

Release Notes:

- N/A
2025-11-07 06:56:19 +00:00
Jakub Konka
00eafe63d9 git: Make long-running git staging snappy in git panel (#42149)
Previously, staging a large file in the git panel would block the UI
until that operation finished. This is because staging is a git op that
git locks globally (per repo), meaning only one op that modifies the git
index can run at any one time. To keep the UI snappy while letting any
pending git staging jobs finish in the background, we track their
progress via `PendingOps` indexed by git entry path. We already had a
concept of pending operations, but they existed at the UI layer in the
`GitPanel` abstraction. This PR moves and augments `PendingOps` into the
`Repository` model in `git_store`, which seems like a more natural place
for tracking running git jobs/operations. Thanks to this, pending ops
are now stored in a `SumTree` indexed by git entry path as part of the
`Repository` snapshot, which makes for efficient access from the UI.

Release Notes:

- Improved UI responsiveness when staging/unstaging large files in the
git panel
2025-11-07 07:34:06 +01:00
Max Brunsfeld
5044e6ac1d zeta2: Make eval example file format more expressive (#42156)
* Allow expressing alternative possible context fetches in `Expected
Context` section
* Allow marking a subset of lines as "required" in `Expected Context`.

We still need to improve how we display the results. I've removed the
context pass/fail pretty printing for now, because it would need to be
rethought to work with the new structure, but for now I think we should
focus on getting basic predictions to run. But this is progress toward a
better structure for eval examples.

Release Notes:

- N/A

---------

Co-authored-by: Oleksiy Syvokon <oleksiy.syvokon@gmail.com>
Co-authored-by: Ben Kunkle <ben@zed.dev>
Co-authored-by: Agus Zubiaga <agus@zed.dev>
2025-11-06 18:05:18 -08:00
Max Brunsfeld
784fdcaee3 zeta2: Build edit prediction prompt and process model output in client (#41870)
Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <agus@zed.dev>
Co-authored-by: Ben Kunkle <ben@zed.dev>
Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
2025-11-06 18:36:58 -05:00
Ben Kunkle
fb87972f44 settings_ui: Use any open workspace window when opening settings links (#42106)
Release Notes:

- N/A
2025-11-06 18:36:23 -05:00
Julia Ryan
8cccb5d4f5 Use windows runner for publishing winget package (#42144)
Our new linux runners don't have powershell installed which causes the
`release-winget` job to fail. This simply runs that step on windows
instead.

Release Notes:

- N/A
2025-11-06 14:13:03 -08:00
Andrew Farkas
2895d31d83 Fix tab switcher close item using wrong pane (#42138)
Closes #40646

Release Notes:

- Fixed `tab_switcher::CloseSelectedItem` doing nothing on tab in
inactive pane

Co-authored-by: Cole Miller <cole@zed.dev>
2025-11-06 20:38:52 +00:00
Cave Bats Of Ware
a112153a2e Enable image support in remote projects (#39158)
Adds support for opening and displaying images in remote projects. The
server streams image data to the client in chunks, where the client then
reconstructs the image and displays it. This change includes:

- Adding `image` crate as a dependency for remote_server
- Implementing `ImageStore` for remote access
- Creating proto definitions for image-related messages
- Adding handlers for creating images for peers
- Computing image metadata from bytes instead of reading from disk for
remote images

Closes #20430
Closes #39104
Closes #40445

Release Notes:

- Added support for image preview in remote sessions.
- Fixed #39104

<img width="982" height="551" alt="image"
src="https://github.com/user-attachments/assets/575428a3-9144-4c1f-b76f-952019ea14cc"
/>
<img width="978" height="547" alt="image"
src="https://github.com/user-attachments/assets/fb58243a-4856-4e73-bb30-8d5e188b3ac9"
/>

---------

Co-authored-by: Julia Ryan <juliaryan3.14@gmail.com>
2025-11-06 11:31:32 -08:00
Richard Feldman
d21184b1d3 Fix ACP extension root dir (#42131)
Agents running in extensions need to have a root directory of the
extension's dir for installation and authentication, but *not* for the
conversation itself - otherwise the agent is running things like
terminal commands in the wrong dir.

Release Notes:

- Fixed Agent Server extensions having the current working directory of
the extension rather than the project
2025-11-06 19:19:16 +00:00
Anthony Eid
ba136abf6c editor: Add action to move between snippet tabstop positions v2 (#42127)
Closes https://github.com/zed-industries/zed/issues/41407

This PR fixes the issues that caused #41407 to be reverted in #42008.
Namely that the action context didn't take into account if a snippet
could move backwards or forwards, and the action shared the same key
mapping as `editor::MoveToPreviousWordStart` and
`editor::MoveToNextWordEnd`.

I changed the default key mapping for the move to snippet tabstop to tab
and shift-tab to match the default behavior of other editors.

Release Notes:

- Editor: Add actions to move between snippet tabstop positions
2025-11-06 18:49:39 +00:00
Lukas Wirth
2d45c23fb0 remote: Flush to stdin when writing to sftp 2 (#42126)
https://github.com/zed-industries/zed/pull/42103#issuecomment-3498137130

Release Notes:

- N/A
2025-11-06 18:18:19 +00:00
Bo Lorentsen
b78f19982f Allow for using config specific gdb binaries in gdb adapter (#37193)
I really need this for embedded development with IDF and Zephyr, where
the chip vendors sometimes provide their own specialized versions of the
tools, and I need to direct Zed to use these.

The current GDB adapter only supports the gdb it finds in the normal
search path, and it also seems we were not able to pass gdb-specific
startup arguments (only `-i=dap` is set).

To fix this I (semi vibe-coding with GPT-4.1) expanded the GDB adapter
with 2 new config options:

* **gdb_path** holds a full path to the gdb executable; for now only a
full path or a path relative to cwd is supported
* **gdb_args** an array holding additional arguments given to gdb on
startup

It also seemed that the `env` config was not transferred to gdb, so this
is added too.

I have tested this locally, and it seems to work, not only compile :-)

Release Notes:

- debugger: Added gdb_path and gdb_args to gdb debug adapter options
- debugger: Fixed a bug where gdb debug sessions wouldn't inherit the
shell environment from Zed

---------

Co-authored-by: Anthony <anthony@zed.dev>
2025-11-06 12:53:59 -05:00
Lukas Wirth
a7fac65d62 gpui: Remove unneeded weak Rc cycle in WindowsWindowInner (#42119)
Release Notes:

- N/A
2025-11-06 17:33:09 +00:00
Lukas Wirth
bf8864d106 gpui: Remove all (unsound) ManuallyDrop usages, panic on device loss (#42114)
Given that when we lose our devices unrecoverably we will panic anyways,
might as well do so eagerly which makes it clearer.

Additionally this PR replaces all uses of `ManuallyDrop` with `Option`,
as otherwise we need to do manual bookkeeping of what is and isn't
initialized when we try to recover devices as we can bail out halfway
while recovering. In other words, the code prior to this was fairly
unsound due to freely using `ManuallyDrop::drop`.
 
Fixes ZED-1SS
 
Release Notes:

- N/A
2025-11-06 17:29:15 +00:00
Nia
08ee4f7966 notify: Bump to rebased 8.2.0 fork (#42113)
Hopefully makes progress towards #38109, #39266

Release Notes:

- N/A
2025-11-06 16:30:17 +00:00
Lukas Wirth
6f6f652cf2 zlog: Add env var to enable line number logging (#41905)
Release Notes:

- N/A
2025-11-06 15:31:26 +00:00
Lukas Wirth
9c8e37a156 clangd: Fix switch source header action on windows (#42105)
Fixes https://github.com/zed-industries/zed/issues/40935

Release Notes:

- Fixed clangd's switch source header action not working on windows
2025-11-06 15:12:34 +00:00
Kirill Bulatov
d54c64f35a Refresh outline panel on file renames (#42104)
Closes https://github.com/zed-industries/zed/issues/41877

Release Notes:

- Fixed outline panel not updating file headers on rename
2025-11-06 14:46:46 +00:00
Lukas Wirth
0b53da18d5 remote: Flush to stdin when writing to sftp (#42103)
https://github.com/zed-industries/zed/issues/42027#issuecomment-3497210172

Release Notes:

- Fixed ssh remoting potentially failing due to not flushing stdin to
sftp
2025-11-06 14:27:26 +00:00
Libon
5ced3ef0fd editor: Improve multibuffer header spacing (#42071)
BEFORE:
<img width="1751" height="629" alt="image"
src="https://github.com/user-attachments/assets/f88464d3-5daa-4d53-b394-f92db8b0fd8c"
/>

AFTER:
<img width="1714" height="493" alt="image"
src="https://github.com/user-attachments/assets/022c883b-b219-40a3-aa5f-0c16d23e8abf"
/>

Release Notes:

- N/A

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-06 14:09:06 +00:00
ʟᴜɴᴇx
7a37dd9433 Do not serialize buffers containing bundled files (#38102)
Fix bundled file persistence by introducing SerializationMode enum

Closes: #38094

What's the issue?

Opening bundled files like Default Key Bindings
(zed://settings/keymap-default.json) was causing SQLite foreign key
constraint errors. The editor was trying to save state for these
read-only assets just like regular files, but since bundled files don't
have database entries, the foreign key constraint would fail.

The fix

Replaced the boolean serialize_dirty_buffers flag with a type-safe
SerializationMode enum:

```rust
pub enum SerializationMode {
    Enabled,   // Regular files persist across sessions
    Disabled,  // Bundled files don't persist
}
```

This prevents serialization at the source: workspace_id() returns None
for disabled editors, serialize() bails early, and should_serialize()
returns false. When opening bundled files, we set the mode to Disabled
from the start, so they're treated as transient views that never
interact with the persistence layer.
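
A minimal sketch of the guard described above (field names, `WorkspaceId`, and method bodies are simplified stand-ins for the editor's real types):

```rust
// Simplified model: a Disabled editor reports no workspace, so nothing
// downstream ever tries to write it to the database.
#[derive(Clone, Copy)]
enum SerializationMode {
    Enabled,  // regular files persist across sessions
    Disabled, // bundled files don't persist
}

type WorkspaceId = u64; // stand-in for the real id type

struct Editor {
    mode: SerializationMode,
    workspace_id: Option<WorkspaceId>,
}

impl Editor {
    fn workspace_id(&self) -> Option<WorkspaceId> {
        match self.mode {
            SerializationMode::Enabled => self.workspace_id,
            SerializationMode::Disabled => None,
        }
    }

    fn should_serialize(&self) -> bool {
        self.workspace_id().is_some()
    }
}

fn main() {
    let bundled = Editor { mode: SerializationMode::Disabled, workspace_id: Some(1) };
    let regular = Editor { mode: SerializationMode::Enabled, workspace_id: Some(1) };
    assert!(!bundled.should_serialize()); // bundled file: never persisted
    assert!(regular.should_serialize());  // regular file: persisted
    println!("ok");
}
```

Gating on the mode at the `workspace_id()` level means every serialization path bails out for bundled files without each call site needing its own check.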

Changes

- editor.rs: Added SerializationMode enum and updated serialization
methods to respect it
- items.rs: Guarded should_serialize() to prevent disabled editors from
being serialized
- zed.rs: Set SerializationMode::Disabled in open_bundled_file()

Result

Bundled files open cleanly without SQLite errors and don't persist
across workspace reloads (expected behavior). Regular file persistence
remains unaffected.

Release Notes:

- Fixed SQLite foreign key constraint errors when opening bundled files
like Default Key Bindings.

---------

Co-authored-by: MrSubidubi <finn@zed.dev>
2025-11-06 13:30:03 +00:00
Danilo Leal
a7163623e7 agent_ui: Make "add more agents" menu item take to extensions (#42098)
Now that agent servers are a thing, this is the primary and easiest way
to quickly add more agents to Zed, without touching any settings JSON
file. :)

Release Notes:

- N/A
2025-11-06 09:55:06 -03:00
Lukas Wirth
f08068680d agent_ui: Do not show Codex wsl warning on wsl take 2 (#42096)
https://github.com/zed-industries/zed/pull/42079#discussion_r2498472887

Release Notes:

- N/A
2025-11-06 12:15:17 +00:00
Lukas Wirth
a951e414d8 util: Fix shell environment fetching with cmd (#42093)
Release Notes:

- Fixed shell environment fetching failing when having `cmd` configured
as terminal shell
2025-11-06 12:07:34 +00:00
Lukas Wirth
e75c6b1aa5 remote: Fix detect_can_exec detection (#42087)
Closes https://github.com/zed-industries/zed/issues/42036

Release Notes:

- Fixed an issue with wsl exec detection eagerly failing, breaking
remote connections
2025-11-06 12:06:32 +00:00
Kirill Bulatov
149eedb73d Fix scroll position restoration (#42088)
Follow-up of https://github.com/zed-industries/zed/pull/42035
Scroll position needs to be stored immediately, otherwise editor close
may not register that.

Release Notes:

- N/A
2025-11-06 11:37:27 +00:00
Lukas Wirth
fb46bae3ed remote: Add missing quotation in extract_server_binary (#42085)
Also respect the shell env for various commands again
Should close https://github.com/zed-industries/zed/issues/42027

Release Notes:

- Fixed remote server installation failing on some setups
2025-11-06 11:19:24 +00:00
Lukas Wirth
28d7c37b0d recent_projects: Improve user facing error messages on connection failure (#42083)
cc https://github.com/zed-industries/zed/issues/42004

Release Notes:

- N/A
2025-11-06 11:00:41 +00:00
Lukas Wirth
f6da987d4c agent_ui: Do not show Codex wsl warning on wsl (#42079)
Release Notes:

- Fixed the codex wsl warning being shown on wsl itself
2025-11-06 10:34:14 +00:00
Kirill Bulatov
efc71f35a5 Tone down extension errors (#42080)
Before:
<img width="2032" height="1161" alt="before"
src="https://github.com/user-attachments/assets/5c497b47-87e8-4167-bc28-93e34556ea4d"
/>

After:
<img width="2032" height="1161" alt="after"
src="https://github.com/user-attachments/assets/4a87803f-67df-4bf8-ade0-306f3c9ca81e"
/>

Release Notes:

- N/A
2025-11-06 10:12:04 +00:00
Lukas Wirth
3b7ee58cfa zed: Attach console to parent process before processing --printenv (#42075)
Otherwise the `--printenv` flag will simply not work on windows

Release Notes:

- N/A
2025-11-06 09:47:54 +00:00
Lukas Wirth
32047bef93 text: Improve panic messages with more information (#42072)
Release Notes:

- N/A
2025-11-06 09:16:45 +00:00
Lukas Wirth
4003287cc3 vim: Downgrade user config error from panic to log (#42070)
Fixes ZED-2W3

Release Notes:

- Fixed panic due to invalid vim keycap
2025-11-06 08:37:04 +00:00
Lukas Wirth
001a47c8b7 gpui: Inline some hot recursive scope functions (#42069)
This reduces stack usage in prepainting slightly as we stack less
references to window/app.

Release Notes:

- N/A
2025-11-06 08:31:39 +00:00
Lukas Wirth
113f0780b3 agent_ui: Fix string slicing panic in message editor (#42068)
Fixes ZED-302

Release Notes:

- Fixed a panic in agent message editor when using multibyte whitespace
characters
2025-11-06 07:58:20 +00:00
Lexi Mattick
273321608f gpui: Add debug assertion to Window::on_action and make docs consistent (#41202)
Improves formatting consistency across various docs, fixes some typos,
and adds a missing `debug_assert_paint` to `Window::on_action` and
`Window::on_action_when`.

Release Notes:

- N/A
2025-11-06 07:55:20 +00:00
Smit Barmase
92cfce568b language: Fix completion menu no longer prioritizes relevant items for Typescript and Python (#42065)
Closes #41672

Regressed in https://github.com/zed-industries/zed/pull/40242

Release Notes:

- Fixed issue where completion menu no longer prioritizes relevant items
for TypeScript and Python.
2025-11-06 13:19:46 +05:30
Coenen Benjamin
2b6cf31ace file_finder: Display duplicated file in file finder history (#41917)
Closes #41850

When digging into this I found that the file finder history doesn't
update the name of a duplicated file: duplicating a file automatically
names it `filename copy`, and that filename was added to the history but
never updated. So once you wanted to go back to that file, it was no
longer shown in the file finder history, because the file no longer
exists under that name even though the entity id remains the same.
I was also able to reproduce this bug by just renaming a file.

Release Notes:

- Fixed: Display duplicated file in file finder history

---------

Signed-off-by: Benjamin <5719034+bnjjj@users.noreply.github.com>
Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
Co-authored-by: Lukas Wirth <me@lukaswirth.dev>
2025-11-06 07:06:45 +00:00
Lemon
58cec41932 Adjust terminal decorative character ranges to include missing Powerline characters (#42043)
Closes #41975.

This change adjusts some of the ranges that were incorrectly labeled or
that excluded characters. The new ranges include three codepoints which
are not assigned in Nerd Fonts; however, one of these codepoints was
already included prior to this change. These codepoints are:

- U+E0C9, between the two ice separators
- U+E0D3, between the two trapezoid separators. The ranges prior to this
PR already included this one.
- U+E0D5, between the trapezoid separators and the inverted triangle
separators

I included these so as not to overcomplicate the ranges by
cherry-picking the defined codepoints. That being said, if we're okay
with this and with an additional unassigned codepoint (U+E0CB, between
the ice separators and the honeycomb separators) being included, then a
simple range from 0xE0B0 to 0xE0D7 nicely includes all of the Powerline characters.
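For reference, a minimal sketch (not Zed's actual code) of the membership check that simplified range would give; 0xE0B0..=0xE0D7 covers all Powerline separators at the cost of including a few codepoints unassigned in Nerd Fonts:

```rust
/// True if `c` falls in the simplified Powerline/decorative span
/// 0xE0B0..=0xE0D7 discussed above.
fn is_powerline_decoration(c: char) -> bool {
    matches!(c as u32, 0xE0B0..=0xE0D7)
}

fn main() {
    assert!(is_powerline_decoration('\u{E0B0}')); // solid triangle separator
    assert!(is_powerline_decoration('\u{E0D4}')); // inverted-triangle region
    assert!(!is_powerline_decoration('A'));
}
```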

I wasn't sure how to write tests for this, so I just added two
previously uncovered characters to the existing tests. All of the
Powerline characters can be seen
[here](https://www.nerdfonts.com/cheat-sheet) by searching `nf-pl`.

Release Notes:

- Fixed certain Powerline characters incorrectly having terminal
contrast adjustment applied.
2025-11-06 06:38:34 +00:00
Conrad Irwin
2ec5ca0e05 Fix generate release notes script on first stable (#42061)
Don't crash in generate-release-notes on the first stable
commit on a branch.

Release Notes:

- N/A
2025-11-05 23:25:30 -07:00
Conrad Irwin
f8da550867 Refresh zed.dev releases page after releases (#42060)
Release Notes:

- N/A
2025-11-05 23:25:13 -07:00
Mayank Verma
0b1d3d78a4 git: Fix pull failing when tracking remote with different branch name (#41768)
Closes #31430

Release Notes:

- Fixed git pull failing when tracking remote with different branch name

Here's a before/after comparison when `dev` branch has upstream set to
`origin/main`:


https://github.com/user-attachments/assets/3a47e736-c7b7-4634-8cd1-aca7300c3a73
2025-11-06 05:18:08 +00:00
Cole Miller
930b489d90 ci: Don't require protobuf and postgres checks for tests_pass for now (#42057)
For now, there are cases where we want to merge PRs (advisedly) even
though these checks fail.

Release Notes:

- N/A
2025-11-06 00:03:50 -05:00
Delvin
121cee8045 git: Add cursor pointer on last commit to check changes (#41960)
Release Notes:
-  Improved the visual cue in the git panel UI for checking previous commit changes

Before: 
<img width="1470" height="956" alt="Screenshot 2025-11-05 at 2 06 49 pm"
src="https://github.com/user-attachments/assets/b8c54bb6-c8b8-4d36-a14f-71d725ed68f2"
/>

After:
<img width="1470" height="956" alt="Screenshot 2025-11-05 at 2 06 24 pm"
src="https://github.com/user-attachments/assets/d8d96f9e-ceed-4c02-9f93-de9fd3dfcbf1"
/>
2025-11-05 23:27:38 -05:00
Viraj Bhartiya
5360dc1504 Refactor timestamp formatting in Git UI components to use chrono for local time calculations (#41005)
- Updated `blame_ui.rs`, `branch_picker.rs`, `commit_tooltip.rs`, and
`commit_view.rs` to replace the previous timestamp formatting with
`chrono` for better accuracy in local time representation.
- Introduced `chrono::Local::now().offset().local_minus_utc()` to obtain
the local offset for timestamp formatting.
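A sketch of the underlying idea, using plain std arithmetic rather than the chrono API itself (the names below are illustrative): apply the local UTC offset in seconds — what chrono's `offset().local_minus_utc()` returns — to a Unix timestamp before bucketing it into a calendar day.

```rust
const SECS_PER_DAY: i64 = 86_400;

/// True if two UTC timestamps fall on the same calendar day once the
/// local offset is applied -- the kind of check that goes wrong when
/// timestamps are bucketed in UTC instead of local time.
fn same_local_day(a_utc: i64, b_utc: i64, offset_secs: i32) -> bool {
    let local_day = |t: i64| (t + offset_secs as i64).div_euclid(SECS_PER_DAY);
    local_day(a_utc) == local_day(b_utc)
}

fn main() {
    // 23:30 UTC and 00:30 UTC the next day: different UTC days...
    let (a, b) = (84_600, 88_200);
    assert!(!same_local_day(a, b, 0));
    // ...but at UTC+1 both land on the same local day.
    assert!(same_local_day(a, b, 3_600));
}
```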

Closes #40878

Release Notes:
- Improved timestamp handling in various Git UI components for enhanced
user experience.
2025-11-05 23:23:22 -05:00
Danilo Leal
69862790cb docs: Improve content in /ai/agent-panel and /ai/rules (#42055)
- Clarify about first-party supported features in external agents
- Create new section for selection as context for higher visibility
- Add keybindings for agent profiles
- Fix outdated setting using `assistant` instead of `agent`
- Add keybinding for accessing the rules library

Release Notes:

- N/A
2025-11-06 01:18:55 -03:00
ᴀᴍᴛᴏᴀᴇʀ
284d8f790a Support Forgejo and Gitea avatars in git blame (#41813)
Part of #11043.

Codeberg is a public instance of Forgejo, as confirmed by the API
documentation at https://codeberg.org/api/swagger. Therefore, I renamed
the related component from codeberg to forgejo and added codeberg.org as
a public instance.

Furthermore, to optimize request speed for the commit API, I set
`stat=false&verification=false&files=false`.

<img width="1650" height="1268" alt="CleanShot 2025-11-03 at 19 57
06@2x"
src="https://github.com/user-attachments/assets/c1b4129e-f324-41c2-86dc-5e4f7403c046"
/>

<br/>
<br/>

Regarding Gitea Support: 

Forgejo is a fork of Gitea, and their APIs are currently identical
(e.g., for getting avatars). However, to future-proof against potential
API divergence, I decided to treat them as separate entities. The
current gitea implementation is essentially a copy of the forgejo file
with the relevant type names and the public instance URL updated.

Release Notes:

- Added support for Forgejo and Gitea avatars in git blame
2025-11-05 22:53:28 -05:00
Danilo Leal
f824e93eeb docs: Remove non-existing keybinding in /visual-customization (#42053)
Release Notes:

- N/A
2025-11-06 00:38:39 -03:00
Danilo Leal
e71bc4821c docs: Add section about agent servers in /external-agents (#42052)
Release Notes:

- N/A
2025-11-06 00:38:33 -03:00
Jason Lee
64c8c19e1b gpui: Impl Default for TextRun (#41084)
Release Notes:

- N/A

When I was implementing Input, I often used `TextRun`, but `background`,
`underline`, and `strikethrough` were often left unset.

So I made this change to simplify it.
2025-11-06 01:50:23 +00:00
Richard Feldman
622d626a29 Add agent-servers.md (#41609)
Release Notes:

- N/A

---------

Co-authored-by: Danilo Leal <67129314+danilo-leal@users.noreply.github.com>
2025-11-06 01:35:56 +00:00
Petros Amoiridis
714481073d Fix macOS new window stacking (#38683)
Closes #36206 

Disclaimer: I did use AI for help to end up with this proposed solution.
😅

## Observed behavior of native apps on macOS (like Safari)

I first did some quick research on how Safari behaves on macOS, and
here's what I found:

1. Safari seems to position new windows with an offset based on the
currently active window
2. It keeps opening new windows with an offset until the new window
cannot fit within the display bounds horizontally, vertically, or both.
3. When it cannot fit horizontally, the new window opens at x=0
(y=active window's y)
4. When it cannot fit vertically, the new window opens at y=0 (x=active
window's x)
5. When it cannot fit both horizontally and vertically, the new window
opens at x=0 and y=0 (top left).
6. At any moment, if I activate a different Safari window, the next new
window is offset from that one
7. If I resize the active window and open a new window, the new window
has the same size as the active window

So, I implemented the changes based on those observations.
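The observations above can be sketched as a small placement rule (all names and the 24px step are illustrative, not the actual implementation): offset the new window from the active one, snapping an axis back to 0 whenever the offset window would no longer fit the display on that axis.

```rust
/// Compute the origin of a new window cascaded from the active one,
/// following observations 1-5 above.
fn cascade_origin(
    active: (i32, i32),  // active window origin (x, y)
    size: (i32, i32),    // new window size (width, height)
    display: (i32, i32), // display bounds (width, height)
    step: i32,           // cascade offset, e.g. 24
) -> (i32, i32) {
    let (mut x, mut y) = (active.0 + step, active.1 + step);
    if x + size.0 > display.0 {
        x = 0; // observation 3: cannot fit horizontally
    }
    if y + size.1 > display.1 {
        y = 0; // observation 4: cannot fit vertically
    }
    (x, y)
}

fn main() {
    assert_eq!(cascade_origin((0, 0), (100, 100), (1000, 1000), 24), (24, 24));
    // Near the right edge: x snaps back to 0, y still cascades.
    assert_eq!(cascade_origin((950, 0), (100, 100), (1000, 1000), 24), (0, 24));
}
```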

I am not sure if touching `gpui/src/window.rs` is the way to go. I am
open to feedback and direction here.

I am also not sure if making my changes platform-specific (macOS) is
the right thing to do. I reckoned that Linux and Windows have different
default behaviors, and the original issue mentioned macOS. But,
likewise, I am open to taking a different approach.

## Tests

I haven't included tests for this change, as it seems a bit difficult
to test properly other than with a manual integration test. But if you
would want tests for such a change, I'm happy to try including them.

## Alternative approach

I also did some research on macOS native APIs that we could use instead
of trying to make the calculations ourselves, and I found
`NSWindow.cascadeTopLeftFromPoint` which seems to be doing exactly what
we want, and more. It probably takes more things into consideration and
thus it is more robust. We could go down that road, and add it to
`gpui/src/platform/mac/window.rs` and then use it for new window
creation. Again, if that's what you would do yourselves, let me know and
I can either change the implementation here, or open a new pull request
and let you decide which one you would like to pursue.

## Video showing the behavior


https://github.com/user-attachments/assets/f802a864-7504-47ee-8c6b-8d9b55474899

🙇‍♂️

Release Notes:

- Improved macOS new window stacking
2025-11-06 01:22:49 +00:00
428 changed files with 14158 additions and 8430 deletions

View File

@@ -39,3 +39,21 @@ body:
         Output of "zed: copy system specs into clipboard"
     validations:
       required: true
+  - type: textarea
+    attributes:
+      label: If applicable, attach your `Zed.log` file to this issue.
+      description: |
+        From the command palette, run `zed: open log` to see the last 1000 lines.
+        Or run `zed: reveal log in file manager` to reveal the log file itself.
+      value: |
+        <details><summary>Zed.log</summary>
+        <!-- Paste your log inside the code block. -->
+        ```log
+        ```
+        </details>
+    validations:
+      required: false

View File

@@ -33,3 +33,21 @@ body:
         Output of "zed: copy system specs into clipboard"
     validations:
       required: true
+  - type: textarea
+    attributes:
+      label: If applicable, attach your `Zed.log` file to this issue.
+      description: |
+        From the command palette, run `zed: open log` to see the last 1000 lines.
+        Or run `zed: reveal log in file manager` to reveal the log file itself.
+      value: |
+        <details><summary>Zed.log</summary>
+        <!-- Paste your log inside the code block. -->
+        ```log
+        ```
+        </details>
+    validations:
+      required: false

View File

@@ -33,3 +33,21 @@ body:
         Output of "zed: copy system specs into clipboard"
     validations:
       required: true
+  - type: textarea
+    attributes:
+      label: If applicable, attach your `Zed.log` file to this issue.
+      description: |
+        From the command palette, run `zed: open log` to see the last 1000 lines.
+        Or run `zed: reveal log in file manager` to reveal the log file itself.
+      value: |
+        <details><summary>Zed.log</summary>
+        <!-- Paste your log inside the code block. -->
+        ```log
+        ```
+        </details>
+    validations:
+      required: false

View File

@@ -33,3 +33,21 @@ body:
         Output of "zed: copy system specs into clipboard"
     validations:
       required: true
+  - type: textarea
+    attributes:
+      label: If applicable, attach your `Zed.log` file to this issue.
+      description: |
+        From the command palette, run `zed: open log` to see the last 1000 lines.
+        Or run `zed: reveal log in file manager` to reveal the log file itself.
+      value: |
+        <details><summary>Zed.log</summary>
+        <!-- Paste your log inside the code block. -->
+        ```log
+        ```
+        </details>
+    validations:
+      required: false

View File

@@ -56,3 +56,20 @@ body:
         Output of "zed: copy system specs into clipboard"
     validations:
       required: true
+  - type: textarea
+    attributes:
+      label: If applicable, attach your `Zed.log` file to this issue.
+      description: |
+        From the command palette, run `zed: open log` to see the last 1000 lines.
+        Or run `zed: reveal log in file manager` to reveal the log file itself.
+      value: |
+        <details><summary>Zed.log</summary>
+        <!-- Paste your log inside the code block. -->
+        ```log
+        ```
+        </details>
+    validations:
+      required: false

.github/workflows/after_release.yml vendored Normal file
View File

@@ -0,0 +1,88 @@
# Generated from xtask::workflows::after_release
# Rebuild with `cargo xtask workflows`.
name: after_release
on:
  release:
    types:
      - published
jobs:
  rebuild_releases_page:
    if: github.repository_owner == 'zed-industries'
    runs-on: namespace-profile-2x4-ubuntu-2404
    steps:
      - name: after_release::rebuild_releases_page::refresh_cloud_releases
        run: curl -fX POST https://cloud.zed.dev/releases/refresh?expect_tag=${{ github.event.release.tag_name }}
        shell: bash -euxo pipefail {0}
      - name: after_release::rebuild_releases_page::redeploy_zed_dev
        run: npm exec --yes -- vercel@37 --token="$VERCEL_TOKEN" --scope zed-industries redeploy https://zed.dev
        shell: bash -euxo pipefail {0}
        env:
          VERCEL_TOKEN: ${{ secrets.VERCEL_TOKEN }}
  post_to_discord:
    needs:
      - rebuild_releases_page
    if: github.repository_owner == 'zed-industries'
    runs-on: namespace-profile-2x4-ubuntu-2404
    steps:
      - id: get-release-url
        name: after_release::post_to_discord::get_release_url
        run: |
          if [ "${{ github.event.release.prerelease }}" == "true" ]; then
            URL="https://zed.dev/releases/preview"
          else
            URL="https://zed.dev/releases/stable"
          fi
          echo "URL=$URL" >> "$GITHUB_OUTPUT"
        shell: bash -euxo pipefail {0}
      - id: get-content
        name: after_release::post_to_discord::get_content
        uses: 2428392/gh-truncate-string-action@b3ff790d21cf42af3ca7579146eedb93c8fb0757
        with:
          stringToTruncate: |
            📣 Zed [${{ github.event.release.tag_name }}](<${{ steps.get-release-url.outputs.URL }}>) was just released!
            ${{ github.event.release.body }}
          maxLength: 2000
          truncationSymbol: '...'
      - name: after_release::post_to_discord::discord_webhook_action
        uses: tsickert/discord-webhook@c840d45a03a323fbc3f7507ac7769dbd91bfb164
        with:
          webhook-url: ${{ secrets.DISCORD_WEBHOOK_RELEASE_NOTES }}
          content: ${{ steps.get-content.outputs.string }}
  publish_winget:
    runs-on: self-32vcpu-windows-2022
    steps:
      - id: set-package-name
        name: after_release::publish_winget::set_package_name
        run: |
          if [ "${{ github.event.release.prerelease }}" == "true" ]; then
            PACKAGE_NAME=ZedIndustries.Zed.Preview
          else
            PACKAGE_NAME=ZedIndustries.Zed
          fi
          echo "PACKAGE_NAME=$PACKAGE_NAME" >> "$GITHUB_OUTPUT"
        shell: bash -euxo pipefail {0}
      - name: after_release::publish_winget::winget_releaser
        uses: vedantmgoyal9/winget-releaser@19e706d4c9121098010096f9c495a70a7518b30f
        with:
          identifier: ${{ steps.set-package-name.outputs.PACKAGE_NAME }}
          max-versions-to-keep: 5
          token: ${{ secrets.WINGET_TOKEN }}
  create_sentry_release:
    if: github.repository_owner == 'zed-industries'
    runs-on: namespace-profile-2x4-ubuntu-2404
    steps:
      - name: steps::checkout_repo
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
        with:
          clean: false
      - name: release::create_sentry_release
        uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c
        with:
          environment: production
        env:
          SENTRY_ORG: zed-dev
          SENTRY_PROJECT: zed
          SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}

View File

@@ -1,93 +0,0 @@
# IF YOU UPDATE THE NAME OF ANY GITHUB SECRET, YOU MUST CHERRY PICK THE COMMIT
# TO BOTH STABLE AND PREVIEW CHANNELS
name: Release Actions
on:
  release:
    types: [published]
jobs:
  discord_release:
    if: github.repository_owner == 'zed-industries'
    runs-on: ubuntu-latest
    steps:
      - name: Get release URL
        id: get-release-url
        run: |
          if [ "${{ github.event.release.prerelease }}" == "true" ]; then
            URL="https://zed.dev/releases/preview"
          else
            URL="https://zed.dev/releases/stable"
          fi
          echo "URL=$URL" >> "$GITHUB_OUTPUT"
      - name: Get content
        uses: 2428392/gh-truncate-string-action@b3ff790d21cf42af3ca7579146eedb93c8fb0757 # v1.4.1
        id: get-content
        with:
          stringToTruncate: |
            📣 Zed [${{ github.event.release.tag_name }}](<${{ steps.get-release-url.outputs.URL }}>) was just released!
            ${{ github.event.release.body }}
          maxLength: 2000
          truncationSymbol: "..."
      - name: Discord Webhook Action
        uses: tsickert/discord-webhook@c840d45a03a323fbc3f7507ac7769dbd91bfb164 # v5.3.0
        with:
          webhook-url: ${{ secrets.DISCORD_WEBHOOK_RELEASE_NOTES }}
          content: ${{ steps.get-content.outputs.string }}
  publish-winget:
    runs-on:
      - ubuntu-latest
    steps:
      - name: Set Package Name
        id: set-package-name
        run: |
          if [ "${{ github.event.release.prerelease }}" == "true" ]; then
            PACKAGE_NAME=ZedIndustries.Zed.Preview
          else
            PACKAGE_NAME=ZedIndustries.Zed
          fi
          echo "PACKAGE_NAME=$PACKAGE_NAME" >> "$GITHUB_OUTPUT"
      - uses: vedantmgoyal9/winget-releaser@19e706d4c9121098010096f9c495a70a7518b30f # v2
        with:
          identifier: ${{ steps.set-package-name.outputs.PACKAGE_NAME }}
          max-versions-to-keep: 5
          token: ${{ secrets.WINGET_TOKEN }}
  send_release_notes_email:
    if: false && github.repository_owner == 'zed-industries' && !github.event.release.prerelease
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
        with:
          fetch-depth: 0
      - name: Check if release was promoted from preview
        id: check-promotion-from-preview
        run: |
          VERSION="${{ github.event.release.tag_name }}"
          PREVIEW_TAG="${VERSION}-pre"
          if git rev-parse "$PREVIEW_TAG" > /dev/null 2>&1; then
            echo "was_promoted_from_preview=true" >> "$GITHUB_OUTPUT"
          else
            echo "was_promoted_from_preview=false" >> "$GITHUB_OUTPUT"
          fi
      - name: Send release notes email
        if: steps.check-promotion-from-preview.outputs.was_promoted_from_preview == 'true'
        run: |
          TAG="${{ github.event.release.tag_name }}"
          cat << 'EOF' > release_body.txt
          ${{ github.event.release.body }}
          EOF
          jq -n --arg tag "$TAG" --rawfile body release_body.txt '{version: $tag, markdown_body: $body}' \
            > release_data.json
          curl -X POST "https://zed.dev/api/send_release_notes_email" \
            -H "Authorization: Bearer ${{ secrets.RELEASE_NOTES_API_TOKEN }}" \
            -H "Content-Type: application/json" \
            -d @release_data.json

View File

@@ -35,6 +35,9 @@ jobs:
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: compare_perf::run_perf::install_hyperfine
   run: cargo install hyperfine
   shell: bash -euxo pipefail {0}

View File

@@ -57,16 +57,19 @@ jobs:
     mkdir -p ./../.cargo
     cp ./.cargo/ci-config.toml ./../.cargo/config.toml
   shell: bash -euxo pipefail {0}
+- name: steps::cache_rust_dependencies_namespace
+  uses: namespacelabs/nscloud-cache-action@v1
+  with:
+    cache: rust
 - name: steps::setup_linux
   run: ./script/linux
   shell: bash -euxo pipefail {0}
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
-- name: steps::cache_rust_dependencies_namespace
-  uses: namespacelabs/nscloud-cache-action@v1
-  with:
-    cache: rust
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: steps::setup_node
   uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
   with:
@@ -202,6 +205,9 @@ jobs:
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: ./script/bundle-linux
   run: ./script/bundle-linux
   shell: bash -euxo pipefail {0}
@@ -242,6 +248,9 @@ jobs:
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: ./script/bundle-linux
   run: ./script/bundle-linux
   shell: bash -euxo pipefail {0}
@@ -475,14 +484,6 @@ jobs:
   shell: bash -euxo pipefail {0}
   env:
     GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-- name: release::create_sentry_release
-  uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c
-  with:
-    environment: production
-  env:
-    SENTRY_ORG: zed-dev
-    SENTRY_PROJECT: zed
-    SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
 concurrency:
   group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
   cancel-in-progress: true

View File

@@ -93,6 +93,9 @@ jobs:
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: ./script/bundle-linux
   run: ./script/bundle-linux
   shell: bash -euxo pipefail {0}
@@ -140,6 +143,9 @@ jobs:
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: ./script/bundle-linux
   run: ./script/bundle-linux
   shell: bash -euxo pipefail {0}

View File

@@ -8,22 +8,16 @@ env:
   ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
   ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
   ZED_EVAL_TELEMETRY: '1'
+  MODEL_NAME: ${{ inputs.model_name }}
 on:
-  pull_request:
-    types:
-      - synchronize
-      - reopened
-      - labeled
-    branches:
-      - '**'
-  schedule:
-    - cron: 0 0 * * *
-  workflow_dispatch: {}
+  workflow_dispatch:
+    inputs:
+      model_name:
+        description: model_name
+        required: true
+        type: string
 jobs:
   agent_evals:
-    if: |
-      github.repository_owner == 'zed-industries' &&
-      (github.event_name != 'pull_request' || contains(github.event.pull_request.labels.*.name, 'run-eval'))
     runs-on: namespace-profile-16x32-ubuntu-2204
     steps:
       - name: steps::checkout_repo
@@ -40,6 +34,9 @@ jobs:
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: steps::setup_cargo_config
   run: |
     mkdir -p ./../.cargo
@@ -49,14 +46,14 @@ jobs:
   run: cargo build --package=eval
   shell: bash -euxo pipefail {0}
 - name: run_agent_evals::agent_evals::run_eval
-  run: cargo run --package=eval -- --repetitions=8 --concurrency=1
+  run: cargo run --package=eval -- --repetitions=8 --concurrency=1 --model "${MODEL_NAME}"
   shell: bash -euxo pipefail {0}
 - name: steps::cleanup_cargo_config
   if: always()
   run: |
     rm -rf ./../.cargo
   shell: bash -euxo pipefail {0}
-timeout-minutes: 60
+timeout-minutes: 600
 concurrency:
   group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
   cancel-in-progress: true

View File

@@ -34,6 +34,9 @@ jobs:
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: ./script/bundle-linux
   run: ./script/bundle-linux
   shell: bash -euxo pipefail {0}
@@ -74,6 +77,9 @@ jobs:
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: ./script/bundle-linux
   run: ./script/bundle-linux
   shell: bash -euxo pipefail {0}

View File

@@ -143,16 +143,19 @@ jobs:
     mkdir -p ./../.cargo
     cp ./.cargo/ci-config.toml ./../.cargo/config.toml
   shell: bash -euxo pipefail {0}
+- name: steps::cache_rust_dependencies_namespace
+  uses: namespacelabs/nscloud-cache-action@v1
+  with:
+    cache: rust
 - name: steps::setup_linux
   run: ./script/linux
   shell: bash -euxo pipefail {0}
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
-- name: steps::cache_rust_dependencies_namespace
-  uses: namespacelabs/nscloud-cache-action@v1
-  with:
-    cache: rust
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: steps::setup_node
   uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
   with:
@@ -232,6 +235,9 @@ jobs:
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: steps::setup_cargo_config
   run: |
     mkdir -p ./../.cargo
@@ -263,16 +269,19 @@ jobs:
     mkdir -p ./../.cargo
     cp ./.cargo/ci-config.toml ./../.cargo/config.toml
   shell: bash -euxo pipefail {0}
+- name: steps::cache_rust_dependencies_namespace
+  uses: namespacelabs/nscloud-cache-action@v1
+  with:
+    cache: rust
 - name: steps::setup_linux
   run: ./script/linux
   shell: bash -euxo pipefail {0}
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
-- name: steps::cache_rust_dependencies_namespace
-  uses: namespacelabs/nscloud-cache-action@v1
-  with:
-    cache: rust
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: cargo build -p collab
   run: cargo build -p collab
   shell: bash -euxo pipefail {0}
@@ -285,40 +294,6 @@ jobs:
     rm -rf ./../.cargo
   shell: bash -euxo pipefail {0}
   timeout-minutes: 60
-check_postgres_and_protobuf_migrations:
-  needs:
-    - orchestrate
-  if: needs.orchestrate.outputs.run_tests == 'true'
-  runs-on: self-mini-macos
-  steps:
-    - name: steps::checkout_repo
-      uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
-      with:
-        fetch-depth: 0
-    - name: run_tests::check_postgres_and_protobuf_migrations::remove_untracked_files
-      run: git clean -df
-      shell: bash -euxo pipefail {0}
-    - name: run_tests::check_postgres_and_protobuf_migrations::ensure_fresh_merge
-      run: |
-        if [ -z "$GITHUB_BASE_REF" ];
-        then
-          echo "BUF_BASE_BRANCH=$(git merge-base origin/main HEAD)" >> "$GITHUB_ENV"
-        else
-          git checkout -B temp
-          git merge -q "origin/$GITHUB_BASE_REF" -m "merge main into temp"
-          echo "BUF_BASE_BRANCH=$GITHUB_BASE_REF" >> "$GITHUB_ENV"
-        fi
-      shell: bash -euxo pipefail {0}
-    - name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_setup_action
-      uses: bufbuild/buf-setup-action@v1
-      with:
-        version: v1.29.0
-    - name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_breaking_action
-      uses: bufbuild/buf-breaking-action@v1
-      with:
-        input: crates/proto/proto/
-        against: https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/
-  timeout-minutes: 60
 check_dependencies:
   needs:
     - orchestrate
@@ -382,6 +357,9 @@ jobs:
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: run_tests::check_docs::install_mdbook
   uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08
   with:
@@ -518,6 +496,40 @@ jobs:
   shell: bash -euxo pipefail {0}
   timeout-minutes: 60
   continue-on-error: true
+check_postgres_and_protobuf_migrations:
+  needs:
+    - orchestrate
+  if: needs.orchestrate.outputs.run_tests == 'true'
+  runs-on: self-mini-macos
+  steps:
+    - name: steps::checkout_repo
+      uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+      with:
+        fetch-depth: 0
+    - name: run_tests::check_postgres_and_protobuf_migrations::remove_untracked_files
+      run: git clean -df
+      shell: bash -euxo pipefail {0}
+    - name: run_tests::check_postgres_and_protobuf_migrations::ensure_fresh_merge
+      run: |
+        if [ -z "$GITHUB_BASE_REF" ];
+        then
+          echo "BUF_BASE_BRANCH=$(git merge-base origin/main HEAD)" >> "$GITHUB_ENV"
+        else
+          git checkout -B temp
+          git merge -q "origin/$GITHUB_BASE_REF" -m "merge main into temp"
+          echo "BUF_BASE_BRANCH=$GITHUB_BASE_REF" >> "$GITHUB_ENV"
+        fi
+      shell: bash -euxo pipefail {0}
+    - name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_setup_action
+      uses: bufbuild/buf-setup-action@v1
+      with:
+        version: v1.29.0
+    - name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_breaking_action
+      uses: bufbuild/buf-breaking-action@v1
+      with:
+        input: crates/proto/proto/
+        against: https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/
+  timeout-minutes: 60
 tests_pass:
   needs:
     - orchestrate
@@ -527,7 +539,6 @@ jobs:
 - run_tests_mac
 - doctests
 - check_workspace_binaries
-- check_postgres_and_protobuf_migrations
 - check_dependencies
 - check_docs
 - check_licenses
@@ -554,7 +565,6 @@ jobs:
 check_result "run_tests_mac" "${{ needs.run_tests_mac.result }}"
 check_result "doctests" "${{ needs.doctests.result }}"
 check_result "check_workspace_binaries" "${{ needs.check_workspace_binaries.result }}"
-check_result "check_postgres_and_protobuf_migrations" "${{ needs.check_postgres_and_protobuf_migrations.result }}"
 check_result "check_dependencies" "${{ needs.check_dependencies.result }}"
 check_result "check_docs" "${{ needs.check_docs.result }}"
 check_result "check_licenses" "${{ needs.check_licenses.result }}"

View File

@@ -33,6 +33,9 @@ jobs:
 - name: steps::install_mold
   run: ./script/install-mold
   shell: bash -euxo pipefail {0}
+- name: steps::download_wasi_sdk
+  run: ./script/download-wasi-sdk
+  shell: bash -euxo pipefail {0}
 - name: steps::cargo_install_nextest
   run: cargo install cargo-nextest --locked
   shell: bash -euxo pipefail {0}

Cargo.lock (generated)

@@ -32,6 +32,7 @@ dependencies = [
"settings",
"smol",
"task",
+"telemetry",
"tempfile",
"terminal",
"ui",
@@ -39,6 +40,7 @@ dependencies = [
"util",
"uuid",
"watch",
+"zlog",
]
[[package]]
@@ -79,6 +81,7 @@ dependencies = [
"rand 0.9.2",
"serde_json",
"settings",
+"telemetry",
"text",
"util",
"watch",
@@ -93,6 +96,7 @@ dependencies = [
"auto_update",
"editor",
"extension_host",
+"fs",
"futures 0.3.31",
"gpui",
"language",
@@ -211,6 +215,8 @@ dependencies = [
[[package]]
name = "agent-client-protocol"
version = "0.7.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "525705e39c11cd73f7bc784e3681a9386aa30c8d0630808d3dc2237eb4f9cb1b"
dependencies = [
"agent-client-protocol-schema",
"anyhow",
@@ -226,7 +232,9 @@ dependencies = [
[[package]]
name = "agent-client-protocol-schema"
-version = "0.6.3"
+version = "0.6.2"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "ecf16c18fea41282d6bbadd1549a06be6836bddb1893f44a6235f340fa24e2af"
dependencies = [
"anyhow",
"derive_more 2.0.1",
@@ -243,7 +251,6 @@ dependencies = [
"acp_tools",
"action_log",
"agent-client-protocol",
-"agent_settings",
"anyhow",
"async-trait",
"client",
@@ -1324,10 +1331,14 @@ version = "0.1.0"
dependencies = [
"anyhow",
"client",
+"clock",
+"ctor",
"db",
+"futures 0.3.31",
"gpui",
"http_client",
"log",
+"parking_lot",
"paths",
"release_channel",
"serde",
@@ -1338,6 +1349,7 @@ dependencies = [
"util",
"which 6.0.3",
"workspace",
+"zlog",
]
[[package]]
@@ -3194,6 +3206,7 @@ dependencies = [
"indoc",
"ordered-float 2.10.1",
"rustc-hash 2.1.1",
+"schemars 1.0.4",
"serde",
"strum 0.27.2",
]
@@ -6400,7 +6413,7 @@ dependencies = [
"ignore",
"libc",
"log",
-"notify 8.0.0",
+"notify 8.2.0",
"objc",
"parking_lot",
"paths",
@@ -7087,7 +7100,6 @@ dependencies = [
"askpass",
"buffer_diff",
"call",
-"chrono",
"cloud_llm_client",
"collections",
"command_palette_hooks",
@@ -7793,6 +7805,7 @@ dependencies = [
"parking_lot",
"serde",
"serde_json",
+"serde_urlencoded",
"sha2",
"tempfile",
"url",
@@ -8860,6 +8873,7 @@ dependencies = [
"open_router",
"parking_lot",
"proto",
+"schemars 1.0.4",
"serde",
"serde_json",
"settings",
@@ -9024,6 +9038,7 @@ dependencies = [
"settings",
"smol",
"task",
+"terminal",
"text",
"theme",
"toml 0.8.23",
@@ -9671,6 +9686,7 @@ dependencies = [
"settings",
"theme",
"ui",
+"urlencoding",
"util",
"workspace",
]
@@ -10405,11 +10421,10 @@ dependencies = [
[[package]]
name = "notify"
-version = "8.0.0"
-source = "git+https://github.com/zed-industries/notify.git?rev=bbb9ea5ae52b253e095737847e367c30653a2e96#bbb9ea5ae52b253e095737847e367c30653a2e96"
+version = "8.2.0"
+source = "git+https://github.com/zed-industries/notify.git?rev=b4588b2e5aee68f4c0e100f140e808cbce7b1419#b4588b2e5aee68f4c0e100f140e808cbce7b1419"
dependencies = [
"bitflags 2.9.4",
-"filetime",
"fsevent-sys 4.1.0",
"inotify 0.11.0",
"kqueue",
@@ -10418,7 +10433,7 @@ dependencies = [
"mio 1.1.0",
"notify-types",
"walkdir",
-"windows-sys 0.59.0",
+"windows-sys 0.60.2",
]
[[package]]
@@ -10435,7 +10450,7 @@ dependencies = [
[[package]]
name = "notify-types"
version = "2.0.0"
-source = "git+https://github.com/zed-industries/notify.git?rev=bbb9ea5ae52b253e095737847e367c30653a2e96#bbb9ea5ae52b253e095737847e367c30653a2e96"
+source = "git+https://github.com/zed-industries/notify.git?rev=b4588b2e5aee68f4c0e100f140e808cbce7b1419#b4588b2e5aee68f4c0e100f140e808cbce7b1419"
[[package]]
name = "now"
@@ -13967,6 +13982,7 @@ dependencies = [
"gpui",
"gpui_tokio",
"http_client",
+"image",
"json_schema_store",
"language",
"language_extension",
@@ -16208,7 +16224,6 @@ dependencies = [
"log",
"menu",
"picker",
-"project",
"reqwest_client",
"rust-embed",
"settings",
@@ -16218,7 +16233,6 @@ dependencies = [
"theme",
"title_bar",
"ui",
-"workspace",
]
[[package]]
@@ -18804,7 +18818,6 @@ dependencies = [
name = "vim_mode_setting"
version = "0.1.0"
dependencies = [
-"gpui",
"settings",
]
@@ -20948,6 +20961,7 @@ dependencies = [
"gh-workflow",
"indexmap 2.11.4",
"indoc",
+"serde",
"toml 0.8.23",
"toml_edit 0.22.27",
]
@@ -21671,18 +21685,20 @@ dependencies = [
"language_model",
"log",
"lsp",
+"open_ai",
"pretty_assertions",
"project",
"release_channel",
+"schemars 1.0.4",
"serde",
"serde_json",
"settings",
+"smol",
"thiserror 2.0.17",
"util",
"uuid",
"workspace",
"worktree",
+"zlog",
]
[[package]]
@@ -21694,6 +21710,7 @@ dependencies = [
"clap",
"client",
"cloud_llm_client",
+"cloud_zeta2_prompt",
"collections",
"edit_prediction_context",
"editor",
@@ -21707,7 +21724,6 @@ dependencies = [
"ordered-float 2.10.1",
"pretty_assertions",
"project",
-"regex-syntax",
"serde",
"serde_json",
"settings",


@@ -440,7 +440,7 @@ zlog_settings = { path = "crates/zlog_settings" }
# External crates
#
-agent-client-protocol = { path = "../agent-client-protocol", features = ["unstable"] }
+agent-client-protocol = { version = "0.7.0", features = ["unstable"] }
aho-corasick = "1.1"
alacritty_terminal = "0.25.1-rc1"
any_vec = "0.14"
@@ -663,6 +663,7 @@ time = { version = "0.3", features = [
"serde",
"serde-well-known",
"formatting",
+"local-offset",
] }
tiny_http = "0.8"
tokio = { version = "1" }
@@ -772,8 +773,8 @@ features = [
]
[patch.crates-io]
-notify = { git = "https://github.com/zed-industries/notify.git", rev = "bbb9ea5ae52b253e095737847e367c30653a2e96" }
-notify-types = { git = "https://github.com/zed-industries/notify.git", rev = "bbb9ea5ae52b253e095737847e367c30653a2e96" }
+notify = { git = "https://github.com/zed-industries/notify.git", rev = "b4588b2e5aee68f4c0e100f140e808cbce7b1419" }
+notify-types = { git = "https://github.com/zed-industries/notify.git", rev = "b4588b2e5aee68f4c0e100f140e808cbce7b1419" }
windows-capture = { git = "https://github.com/zed-industries/windows-capture.git", rev = "f0d6c1b6691db75461b732f6d5ff56eed002eeb9" }
[profile.dev]


@@ -1,7 +1,7 @@
# Zed
[![Zed](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/zed-industries/zed/main/assets/badge/v0.json)](https://zed.dev)
-[![CI](https://github.com/zed-industries/zed/actions/workflows/ci.yml/badge.svg)](https://github.com/zed-industries/zed/actions/workflows/ci.yml)
+[![CI](https://github.com/zed-industries/zed/actions/workflows/run_tests.yml/badge.svg)](https://github.com/zed-industries/zed/actions/workflows/run_tests.yml)
Welcome to Zed, a high-performance, multiplayer code editor from the creators of [Atom](https://github.com/atom/atom) and [Tree-sitter](https://github.com/tree-sitter/tree-sitter).


@@ -735,6 +735,20 @@
"tab": "editor::ComposeCompletion"
}
},
+{
+"context": "Editor && in_snippet && has_next_tabstop && !showing_completions",
+"use_key_equivalents": true,
+"bindings": {
+"tab": "editor::NextSnippetTabstop"
+}
+},
+{
+"context": "Editor && in_snippet && has_previous_tabstop && !showing_completions",
+"use_key_equivalents": true,
+"bindings": {
+"shift-tab": "editor::PreviousSnippetTabstop"
+}
+},
// Bindings for accepting edit predictions
//
// alt-l is provided as an alternative to tab/alt-tab. and will be displayed in the UI. This is


@@ -805,6 +805,20 @@
"tab": "editor::ComposeCompletion"
}
},
+{
+"context": "Editor && in_snippet && has_next_tabstop && !showing_completions",
+"use_key_equivalents": true,
+"bindings": {
+"tab": "editor::NextSnippetTabstop"
+}
+},
+{
+"context": "Editor && in_snippet && has_previous_tabstop && !showing_completions",
+"use_key_equivalents": true,
+"bindings": {
+"shift-tab": "editor::PreviousSnippetTabstop"
+}
+},
{
"context": "Editor && edit_prediction",
"bindings": {


@@ -739,6 +739,20 @@
"tab": "editor::ComposeCompletion"
}
},
+{
+"context": "Editor && in_snippet && has_next_tabstop && !showing_completions",
+"use_key_equivalents": true,
+"bindings": {
+"tab": "editor::NextSnippetTabstop"
+}
+},
+{
+"context": "Editor && in_snippet && has_previous_tabstop && !showing_completions",
+"use_key_equivalents": true,
+"bindings": {
+"shift-tab": "editor::PreviousSnippetTabstop"
+}
+},
// Bindings for accepting edit predictions
//
// alt-l is provided as an alternative to tab/alt-tab. and will be displayed in the UI. This is


@@ -455,6 +455,7 @@
"<": "vim::Outdent",
"=": "vim::AutoIndent",
"d": "vim::HelixDelete",
+"alt-d": "editor::Delete", // Delete selection, without yanking
"c": "vim::HelixSubstitute",
"alt-c": "vim::HelixSubstituteNoYank",


@@ -605,6 +605,10 @@
// to both the horizontal and vertical delta values while scrolling. Fast scrolling
// happens when a user holds the alt or option key while scrolling.
"fast_scroll_sensitivity": 4.0,
+"sticky_scroll": {
+// Whether to stick scopes to the top of the editor.
+"enabled": false
+},
"relative_line_numbers": "disabled",
// If 'search_wrap' is disabled, search result do not wrap around the end of the file.
"search_wrap": true,
@@ -1487,7 +1491,11 @@
// in your project's settings, rather than globally.
"directories": [".env", "env", ".venv", "venv"],
// Can also be `csh`, `fish`, `nushell` and `power_shell`
-"activate_script": "default"
+"activate_script": "default",
+// Preferred Conda manager to use when activating Conda environments.
+// Values: "auto", "conda", "mamba", "micromamba"
+// Default: "auto"
+"conda_manager": "auto"
}
},
"toolbar": {


@@ -39,6 +39,7 @@ serde_json.workspace = true
settings.workspace = true
smol.workspace = true
task.workspace = true
+telemetry.workspace = true
terminal.workspace = true
ui.workspace = true
url.workspace = true
@@ -56,3 +57,4 @@ rand.workspace = true
tempfile.workspace = true
util.workspace = true
settings.workspace = true
+zlog.workspace = true


@@ -15,7 +15,7 @@ use settings::Settings as _;
use task::{Shell, ShellBuilder};
pub use terminal::*;
-use action_log::ActionLog;
+use action_log::{ActionLog, ActionLogTelemetry};
use agent_client_protocol::{self as acp};
use anyhow::{Context as _, Result, anyhow};
use editor::Bias;
@@ -38,10 +38,10 @@ use util::{ResultExt, get_default_system_shell_preferring_bash, paths::PathStyle
use uuid::Uuid;
#[derive(Debug)]
-pub struct UserMessage<T> {
+pub struct UserMessage {
pub id: Option<UserMessageId>,
pub content: ContentBlock,
-pub chunks: Vec<acp::ContentBlock<T>>,
+pub chunks: Vec<acp::ContentBlock>,
pub checkpoint: Option<Checkpoint>,
}
@@ -51,7 +51,7 @@ pub struct Checkpoint {
pub show: bool,
}
-impl<T> UserMessage<T> {
+impl UserMessage {
fn to_markdown(&self, cx: &App) -> String {
let mut markdown = String::new();
if self
@@ -116,13 +116,13 @@ impl AssistantMessageChunk {
}
#[derive(Debug)]
-pub enum AgentThreadEntry<T> {
-UserMessage(UserMessage<T>),
+pub enum AgentThreadEntry {
+UserMessage(UserMessage),
AssistantMessage(AssistantMessage),
ToolCall(ToolCall),
}
-impl<T> AgentThreadEntry<T> {
+impl AgentThreadEntry {
pub fn to_markdown(&self, cx: &App) -> String {
match self {
Self::UserMessage(message) => message.to_markdown(cx),
@@ -131,7 +131,7 @@ impl<T> AgentThreadEntry<T> {
}
}
-pub fn user_message(&self) -> Option<&UserMessage<T>> {
+pub fn user_message(&self) -> Option<&UserMessage> {
if let AgentThreadEntry::UserMessage(message) = self {
Some(message)
} else {
@@ -802,11 +802,9 @@ pub struct RetryStatus {
pub duration: Duration,
}
-pub struct AnchoredText;
-pub struct AcpThread<T = SharedString> {
-title: T,
-entries: Vec<AgentThreadEntry<AnchoredText>>,
+pub struct AcpThread {
+title: SharedString,
+entries: Vec<AgentThreadEntry>,
plan: Plan,
project: Entity<Project>,
action_log: Entity<ActionLog>,
@@ -822,6 +820,15 @@ pub struct AcpThread<T = SharedString> {
pending_terminal_exit: HashMap<acp::TerminalId, acp::TerminalExitStatus>,
}
+impl From<&AcpThread> for ActionLogTelemetry {
+fn from(value: &AcpThread) -> Self {
+Self {
+agent_telemetry_id: value.connection().telemetry_id(),
+session_id: value.session_id.0.clone(),
+}
+}
+}
#[derive(Debug)]
pub enum AcpThreadEvent {
NewEntry,
@@ -1004,7 +1011,7 @@ impl Display for LoadError {
impl Error for LoadError {}
-impl<T> AcpThread<T> {
+impl AcpThread {
pub fn new(
title: impl Into<SharedString>,
connection: Rc<dyn AgentConnection>,
@@ -1154,7 +1161,7 @@ impl<T> AcpThread<T> {
pub fn push_user_content_block(
&mut self,
message_id: Option<UserMessageId>,
-chunk: acp::ContentBlock<T>,
+chunk: acp::ContentBlock,
cx: &mut Context<Self>,
) {
let language_registry = self.project.read(cx).languages().clone();
@@ -1233,7 +1240,7 @@ impl<T> AcpThread<T> {
}
}
-fn push_entry(&mut self, entry: AgentThreadEntry<T>, cx: &mut Context<Self>) {
+fn push_entry(&mut self, entry: AgentThreadEntry, cx: &mut Context<Self>) {
self.entries.push(entry);
cx.emit(AcpThreadEvent::NewEntry);
}
@@ -1348,6 +1355,17 @@ impl<T> AcpThread<T> {
let path_style = self.project.read(cx).path_style(cx);
let id = update.id.clone();
+let agent = self.connection().telemetry_id();
+let session = self.session_id();
+if let ToolCallStatus::Completed | ToolCallStatus::Failed = status {
+let status = if matches!(status, ToolCallStatus::Completed) {
+"completed"
+} else {
+"failed"
+};
+telemetry::event!("Agent Tool Call Completed", agent, session, status);
+}
if let Some(ix) = self.index_for_tool_call(&id) {
let AgentThreadEntry::ToolCall(call) = &mut self.entries[ix] else {
unreachable!()
@@ -1871,6 +1889,7 @@ impl<T> AcpThread<T> {
return Task::ready(Err(anyhow!("not supported")));
};
+let telemetry = ActionLogTelemetry::from(&*self);
cx.spawn(async move |this, cx| {
cx.update(|cx| truncate.run(id.clone(), cx))?.await?;
this.update(cx, |this, cx| {
@@ -1879,8 +1898,9 @@ impl<T> AcpThread<T> {
this.entries.truncate(ix);
cx.emit(AcpThreadEvent::EntriesRemoved(range));
}
-this.action_log()
-.update(cx, |action_log, cx| action_log.reject_all_edits(cx))
+this.action_log().update(cx, |action_log, cx| {
+action_log.reject_all_edits(Some(telemetry), cx)
+})
})?
.await;
Ok(())
@@ -1926,7 +1946,7 @@ impl<T> AcpThread<T> {
})
}
-fn last_user_message(&mut self) -> Option<(usize, &mut UserMessage<T>)> {
+fn last_user_message(&mut self) -> Option<(usize, &mut UserMessage)> {
self.entries
.iter_mut()
.enumerate()
@@ -2357,8 +2377,6 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
-Project::init_settings(cx);
-language::init(cx);
});
}
@@ -3616,6 +3634,10 @@ mod tests {
}
impl AgentConnection for FakeAgentConnection {
+fn telemetry_id(&self) -> &'static str {
+"fake"
+}
fn auth_methods(&self) -> &[acp::AuthMethod] {
&self.auth_methods
}
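The hunk above only emits the "Agent Tool Call Completed" event once a tool call reaches a terminal status, mapping `Completed`/`Failed` to a string. A minimal, self-contained sketch of that mapping, using a stub `ToolCallStatus` (the real enum lives in `acp_thread` and has more variants, and the real code emits via Zed's `telemetry::event!` macro rather than returning a string):

```rust
// Stub of acp_thread's ToolCallStatus, reduced to the variants the
// telemetry branch cares about.
#[derive(Debug, PartialEq)]
enum ToolCallStatus {
    Pending,
    Completed,
    Failed,
}

// Returns the status string to attach to the telemetry event, or None
// when the status is not terminal (no event should be emitted).
fn telemetry_status(status: &ToolCallStatus) -> Option<&'static str> {
    match status {
        ToolCallStatus::Completed => Some("completed"),
        ToolCallStatus::Failed => Some("failed"),
        _ => None,
    }
}

fn main() {
    assert_eq!(telemetry_status(&ToolCallStatus::Completed), Some("completed"));
    assert_eq!(telemetry_status(&ToolCallStatus::Failed), Some("failed"));
    assert_eq!(telemetry_status(&ToolCallStatus::Pending), None);
}
```

Gating on terminal statuses keeps the event count at one per tool call, regardless of how many intermediate status updates arrive.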


@@ -20,6 +20,8 @@ impl UserMessageId {
}
pub trait AgentConnection {
+fn telemetry_id(&self) -> &'static str;
fn new_thread(
self: Rc<Self>,
project: Entity<Project>,
@@ -106,9 +108,6 @@ pub trait AgentSessionSetTitle {
}
pub trait AgentTelemetry {
-/// The name of the agent used for telemetry.
-fn agent_name(&self) -> String;
/// A representation of the current thread state that can be serialized for
/// storage with telemetry events.
fn thread_data(
@@ -318,6 +317,10 @@ mod test_support {
}
impl AgentConnection for StubAgentConnection {
+fn telemetry_id(&self) -> &'static str {
+"stub"
+}
fn auth_methods(&self) -> &[acp::AuthMethod] {
&[]
}
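The diff above replaces the old `AgentTelemetry::agent_name` method with a required `telemetry_id` on `AgentConnection` itself. A standalone sketch of that pattern, with the trait stripped down to just the new method and stub connection types (the real trait also requires `new_thread`, `auth_methods`, and more):

```rust
// Reduced AgentConnection: every connection reports a static identifier
// that telemetry events can use to tag which agent produced them.
trait AgentConnection {
    fn telemetry_id(&self) -> &'static str;
}

// Stub connections mirroring the test doubles in the diff.
struct StubAgentConnection;
struct FakeAgentConnection;

impl AgentConnection for StubAgentConnection {
    fn telemetry_id(&self) -> &'static str {
        "stub"
    }
}

impl AgentConnection for FakeAgentConnection {
    fn telemetry_id(&self) -> &'static str {
        "fake"
    }
}

fn main() {
    // The method is usable through a trait object, as in
    // `value.connection().telemetry_id()` in the AcpThread changes.
    let conns: Vec<Box<dyn AgentConnection>> =
        vec![Box::new(StubAgentConnection), Box::new(FakeAgentConnection)];
    let ids: Vec<&str> = conns.iter().map(|c| c.telemetry_id()).collect();
    assert_eq!(ids, ["stub", "fake"]);
}
```

Returning `&'static str` (rather than `String`, as the removed `agent_name` did) avoids an allocation per event and lets the id be embedded directly in event fields.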


@@ -20,6 +20,7 @@ futures.workspace = true
gpui.workspace = true
language.workspace = true
project.workspace = true
+telemetry.workspace = true
text.workspace = true
util.workspace = true
watch.workspace = true


@@ -3,7 +3,9 @@ use buffer_diff::BufferDiff;
use clock;
use collections::BTreeMap;
use futures::{FutureExt, StreamExt, channel::mpsc};
-use gpui::{App, AppContext, AsyncApp, Context, Entity, Subscription, Task, WeakEntity};
+use gpui::{
+App, AppContext, AsyncApp, Context, Entity, SharedString, Subscription, Task, WeakEntity,
+};
use language::{Anchor, Buffer, BufferEvent, DiskState, Point, ToPoint};
use project::{Project, ProjectItem, lsp_store::OpenLspBufferHandle};
use std::{cmp, ops::Range, sync::Arc};
@@ -31,71 +33,6 @@ impl ActionLog {
&self.project
}
-pub fn latest_snapshot(&self, buffer: &Entity<Buffer>) -> Option<text::BufferSnapshot> {
-Some(self.tracked_buffers.get(buffer)?.snapshot.clone())
-}
-/// Return a unified diff patch with user edits made since last read or notification
-pub fn unnotified_user_edits(&self, cx: &Context<Self>) -> Option<String> {
-let diffs = self
-.tracked_buffers
-.values()
-.filter_map(|tracked| {
-if !tracked.may_have_unnotified_user_edits {
-return None;
-}
-let text_with_latest_user_edits = tracked.diff_base.to_string();
-let text_with_last_seen_user_edits = tracked.last_seen_base.to_string();
-if text_with_latest_user_edits == text_with_last_seen_user_edits {
-return None;
-}
-let patch = language::unified_diff(
-&text_with_last_seen_user_edits,
-&text_with_latest_user_edits,
-);
-let buffer = tracked.buffer.clone();
-let file_path = buffer
-.read(cx)
-.file()
-.map(|file| {
-let mut path = file.full_path(cx).to_string_lossy().into_owned();
-if file.path_style(cx).is_windows() {
-path = path.replace('\\', "/");
-}
-path
-})
-.unwrap_or_else(|| format!("buffer_{}", buffer.entity_id()));
-let mut result = String::new();
-result.push_str(&format!("--- a/{}\n", file_path));
-result.push_str(&format!("+++ b/{}\n", file_path));
-result.push_str(&patch);
-Some(result)
-})
-.collect::<Vec<_>>();
-if diffs.is_empty() {
-return None;
-}
-let unified_diff = diffs.join("\n\n");
-Some(unified_diff)
-}
-/// Return a unified diff patch with user edits made since last read/notification
-/// and mark them as notified
-pub fn flush_unnotified_user_edits(&mut self, cx: &Context<Self>) -> Option<String> {
-let patch = self.unnotified_user_edits(cx);
-self.tracked_buffers.values_mut().for_each(|tracked| {
-tracked.may_have_unnotified_user_edits = false;
-tracked.last_seen_base = tracked.diff_base.clone();
-});
-patch
-}
fn track_buffer_internal(
&mut self,
buffer: Entity<Buffer>,
@@ -145,31 +82,26 @@ impl ActionLog {
let diff = cx.new(|cx| BufferDiff::new(&text_snapshot, cx));
let (diff_update_tx, diff_update_rx) = mpsc::unbounded();
let diff_base;
-let last_seen_base;
let unreviewed_edits;
if is_created {
diff_base = Rope::default();
-last_seen_base = Rope::default();
unreviewed_edits = Patch::new(vec![Edit {
old: 0..1,
new: 0..text_snapshot.max_point().row + 1,
}])
} else {
diff_base = buffer.read(cx).as_rope().clone();
-last_seen_base = diff_base.clone();
unreviewed_edits = Patch::default();
}
TrackedBuffer {
buffer: buffer.clone(),
diff_base,
-last_seen_base,
unreviewed_edits,
snapshot: text_snapshot,
status,
version: buffer.read(cx).version(),
diff,
diff_update: diff_update_tx,
-may_have_unnotified_user_edits: false,
_open_lsp_handle: open_lsp_handle,
_maintain_diff: cx.spawn({
let buffer = buffer.clone();
@@ -320,10 +252,9 @@ impl ActionLog {
let new_snapshot = buffer_snapshot.clone();
let unreviewed_edits = tracked_buffer.unreviewed_edits.clone();
let edits = diff_snapshots(&old_snapshot, &new_snapshot);
-let mut has_user_changes = false;
async move {
if let ChangeAuthor::User = author {
-has_user_changes = apply_non_conflicting_edits(
+apply_non_conflicting_edits(
&unreviewed_edits,
edits,
&mut base_text,
@@ -331,22 +262,13 @@ impl ActionLog {
);
}
-(Arc::new(base_text.to_string()), base_text, has_user_changes)
+(Arc::new(base_text.to_string()), base_text)
}
});
anyhow::Ok(rebase)
})??;
-let (new_base_text, new_diff_base, has_user_changes) = rebase.await;
+let (new_base_text, new_diff_base) = rebase.await;
-this.update(cx, |this, _| {
-let tracked_buffer = this
-.tracked_buffers
-.get_mut(buffer)
-.context("buffer not tracked")
-.unwrap();
-tracked_buffer.may_have_unnotified_user_edits |= has_user_changes;
-})?;
Self::update_diff(
this,
@@ -565,14 +487,17 @@ impl ActionLog {
&mut self,
buffer: Entity<Buffer>,
buffer_range: Range<impl language::ToPoint>,
+telemetry: Option<ActionLogTelemetry>,
cx: &mut Context<Self>,
) {
let Some(tracked_buffer) = self.tracked_buffers.get_mut(&buffer) else {
return;
};
+let mut metrics = ActionLogMetrics::for_buffer(buffer.read(cx));
match tracked_buffer.status {
TrackedBufferStatus::Deleted => {
+metrics.add_edits(tracked_buffer.unreviewed_edits.edits());
self.tracked_buffers.remove(&buffer);
cx.notify();
}
@@ -581,7 +506,6 @@ impl ActionLog {
let buffer_range =
buffer_range.start.to_point(buffer)..buffer_range.end.to_point(buffer);
let mut delta = 0i32;
tracked_buffer.unreviewed_edits.retain_mut(|edit| {
edit.old.start = (edit.old.start as i32 + delta) as u32;
edit.old.end = (edit.old.end as i32 + delta) as u32;
@@ -613,6 +537,7 @@ impl ActionLog {
.collect::<String>(),
);
delta += edit.new_len() as i32 - edit.old_len() as i32;
+metrics.add_edit(edit);
false
}
});
@@ -624,19 +549,24 @@ impl ActionLog {
tracked_buffer.schedule_diff_update(ChangeAuthor::User, cx);
}
}
+if let Some(telemetry) = telemetry {
+telemetry_report_accepted_edits(&telemetry, metrics);
+}
}
pub fn reject_edits_in_ranges(
&mut self,
buffer: Entity<Buffer>,
buffer_ranges: Vec<Range<impl language::ToPoint>>,
+telemetry: Option<ActionLogTelemetry>,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
let Some(tracked_buffer) = self.tracked_buffers.get_mut(&buffer) else {
return Task::ready(Ok(()));
};
-match &tracked_buffer.status {
+let mut metrics = ActionLogMetrics::for_buffer(buffer.read(cx));
+let task = match &tracked_buffer.status {
TrackedBufferStatus::Created {
existing_file_content,
} => {
@@ -686,6 +616,7 @@ impl ActionLog {
}
};
+metrics.add_edits(tracked_buffer.unreviewed_edits.edits());
self.tracked_buffers.remove(&buffer);
cx.notify();
task
@@ -699,6 +630,7 @@ impl ActionLog {
.update(cx, |project, cx| project.save_buffer(buffer.clone(), cx));
// Clear all tracked edits for this buffer and start over as if we just read it.
+metrics.add_edits(tracked_buffer.unreviewed_edits.edits());
self.tracked_buffers.remove(&buffer);
self.buffer_read(buffer.clone(), cx);
cx.notify();
@@ -738,6 +670,7 @@ impl ActionLog {
}
if revert {
+metrics.add_edit(edit);
let old_range = tracked_buffer
.diff_base
.point_to_offset(Point::new(edit.old.start, 0))
@@ -758,12 +691,25 @@ impl ActionLog {
self.project
.update(cx, |project, cx| project.save_buffer(buffer, cx))
}
+};
+if let Some(telemetry) = telemetry {
+telemetry_report_rejected_edits(&telemetry, metrics);
}
+task
}
-pub fn keep_all_edits(&mut self, cx: &mut Context<Self>) {
-self.tracked_buffers
-.retain(|_buffer, tracked_buffer| match tracked_buffer.status {
+pub fn keep_all_edits(
+&mut self,
+telemetry: Option<ActionLogTelemetry>,
+cx: &mut Context<Self>,
+) {
+self.tracked_buffers.retain(|buffer, tracked_buffer| {
+let mut metrics = ActionLogMetrics::for_buffer(buffer.read(cx));
+metrics.add_edits(tracked_buffer.unreviewed_edits.edits());
+if let Some(telemetry) = telemetry.as_ref() {
+telemetry_report_accepted_edits(telemetry, metrics);
+}
+match tracked_buffer.status {
TrackedBufferStatus::Deleted => false,
_ => {
if let TrackedBufferStatus::Created { .. } = &mut tracked_buffer.status {
@@ -774,13 +720,24 @@ impl ActionLog {
tracked_buffer.schedule_diff_update(ChangeAuthor::User, cx);
true
}
-});
+}
+});
cx.notify();
}
-pub fn reject_all_edits(&mut self, cx: &mut Context<Self>) -> Task<()> {
+pub fn reject_all_edits(
+&mut self,
+telemetry: Option<ActionLogTelemetry>,
+cx: &mut Context<Self>,
+) -> Task<()> {
let futures = self.changed_buffers(cx).into_keys().map(|buffer| {
-let reject = self.reject_edits_in_ranges(buffer, vec![Anchor::MIN..Anchor::MAX], cx);
+let reject = self.reject_edits_in_ranges(
+buffer,
+vec![Anchor::MIN..Anchor::MAX],
+telemetry.clone(),
+cx,
+);
async move {
reject.await.log_err();
@@ -788,8 +745,7 @@ impl ActionLog {
});
let task = futures::future::join_all(futures);
-cx.background_spawn(async move {
+cx.spawn(async move |_, _| {
task.await;
})
}
@@ -819,6 +775,61 @@ impl ActionLog {
}
}
#[derive(Clone)]
pub struct ActionLogTelemetry {
pub agent_telemetry_id: &'static str,
pub session_id: Arc<str>,
}
struct ActionLogMetrics {
lines_removed: u32,
lines_added: u32,
language: Option<SharedString>,
}
impl ActionLogMetrics {
fn for_buffer(buffer: &Buffer) -> Self {
Self {
language: buffer.language().map(|l| l.name().0),
lines_removed: 0,
lines_added: 0,
}
}
fn add_edits(&mut self, edits: &[Edit<u32>]) {
for edit in edits {
self.add_edit(edit);
}
}
fn add_edit(&mut self, edit: &Edit<u32>) {
self.lines_added += edit.new_len();
self.lines_removed += edit.old_len();
}
}
fn telemetry_report_accepted_edits(telemetry: &ActionLogTelemetry, metrics: ActionLogMetrics) {
telemetry::event!(
"Agent Edits Accepted",
agent = telemetry.agent_telemetry_id,
session = telemetry.session_id,
language = metrics.language,
lines_added = metrics.lines_added,
lines_removed = metrics.lines_removed
);
}
fn telemetry_report_rejected_edits(telemetry: &ActionLogTelemetry, metrics: ActionLogMetrics) {
telemetry::event!(
"Agent Edits Rejected",
agent = telemetry.agent_telemetry_id,
session = telemetry.session_id,
language = metrics.language,
lines_added = metrics.lines_added,
lines_removed = metrics.lines_removed
);
}
fn apply_non_conflicting_edits(
patch: &Patch<u32>,
edits: Vec<Edit<u32>>,
@@ -949,14 +960,12 @@ enum TrackedBufferStatus {
struct TrackedBuffer {
buffer: Entity<Buffer>,
diff_base: Rope,
-last_seen_base: Rope,
unreviewed_edits: Patch<u32>,
status: TrackedBufferStatus,
version: clock::Global,
diff: Entity<BufferDiff>,
snapshot: text::BufferSnapshot,
diff_update: mpsc::UnboundedSender<(ChangeAuthor, text::BufferSnapshot)>,
-may_have_unnotified_user_edits: bool,
_open_lsp_handle: OpenLspBufferHandle,
_maintain_diff: Task<()>,
_subscription: Subscription,
@@ -987,7 +996,6 @@ mod tests {
use super::*;
use buffer_diff::DiffHunkStatusKind;
use gpui::TestAppContext;
-use indoc::indoc;
use language::Point;
use project::{FakeFs, Fs, Project, RemoveOptions};
use rand::prelude::*;
@@ -1005,8 +1013,6 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
-language::init(cx);
-Project::init_settings(cx);
});
}
@@ -1066,7 +1072,7 @@ mod tests {
);
action_log.update(cx, |log, cx| {
-log.keep_edits_in_range(buffer.clone(), Point::new(3, 0)..Point::new(4, 3), cx)
+log.keep_edits_in_range(buffer.clone(), Point::new(3, 0)..Point::new(4, 3), None, cx)
});
cx.run_until_parked();
assert_eq!(
@@ -1082,7 +1088,7 @@ mod tests {
);
action_log.update(cx, |log, cx| {
-log.keep_edits_in_range(buffer.clone(), Point::new(0, 0)..Point::new(4, 3), cx)
+log.keep_edits_in_range(buffer.clone(), Point::new(0, 0)..Point::new(4, 3), None, cx)
});
cx.run_until_parked();
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]);
@@ -1167,7 +1173,7 @@ mod tests {
);
action_log.update(cx, |log, cx| {
-log.keep_edits_in_range(buffer.clone(), Point::new(1, 0)..Point::new(1, 0), cx)
+log.keep_edits_in_range(buffer.clone(), Point::new(1, 0)..Point::new(1, 0), None, cx)
});
cx.run_until_parked();
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]);
@@ -1264,111 +1270,7 @@ mod tests {
);
action_log.update(cx, |log, cx| {
-log.keep_edits_in_range(buffer.clone(), Point::new(0, 0)..Point::new(1, 0), cx)
+log.keep_edits_in_range(buffer.clone(), Point::new(0, 0)..Point::new(1, 0), None, cx)
});
cx.run_until_parked();
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]);
}
#[gpui::test(iterations = 10)]
async fn test_user_edits_notifications(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
path!("/dir"),
json!({"file": indoc! {"
abc
def
ghi
jkl
mno"}}),
)
.await;
let project = Project::test(fs.clone(), [path!("/dir").as_ref()], cx).await;
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let file_path = project
.read_with(cx, |project, cx| project.find_project_path("dir/file", cx))
.unwrap();
let buffer = project
.update(cx, |project, cx| project.open_buffer(file_path, cx))
.await
.unwrap();
// Agent edits
cx.update(|cx| {
action_log.update(cx, |log, cx| log.buffer_read(buffer.clone(), cx));
buffer.update(cx, |buffer, cx| {
buffer
.edit([(Point::new(1, 2)..Point::new(2, 3), "F\nGHI")], None, cx)
.unwrap()
});
action_log.update(cx, |log, cx| log.buffer_edited(buffer.clone(), cx));
});
cx.run_until_parked();
assert_eq!(
buffer.read_with(cx, |buffer, _| buffer.text()),
indoc! {"
abc
deF
GHI
jkl
mno"}
);
assert_eq!(
unreviewed_hunks(&action_log, cx),
vec![(
buffer.clone(),
vec![HunkStatus {
range: Point::new(1, 0)..Point::new(3, 0),
diff_status: DiffHunkStatusKind::Modified,
old_text: "def\nghi\n".into(),
}],
)]
);
// User edits
buffer.update(cx, |buffer, cx| {
buffer.edit(
[
(Point::new(0, 2)..Point::new(0, 2), "X"),
(Point::new(3, 0)..Point::new(3, 0), "Y"),
],
None,
cx,
)
});
cx.run_until_parked();
assert_eq!(
buffer.read_with(cx, |buffer, _| buffer.text()),
indoc! {"
abXc
deF
GHI
Yjkl
mno"}
);
// User edits should be stored separately from agent's
let user_edits = action_log.update(cx, |log, cx| log.unnotified_user_edits(cx));
assert_eq!(
user_edits.expect("should have some user edits"),
indoc! {"
--- a/dir/file
+++ b/dir/file
@@ -1,5 +1,5 @@
-abc
+abXc
def
ghi
-jkl
+Yjkl
mno
"}
);
action_log.update(cx, |log, cx| {
log.keep_edits_in_range(buffer.clone(), Point::new(0, 0)..Point::new(1, 0), cx)
});
cx.run_until_parked();
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]);
@@ -1427,7 +1329,7 @@ mod tests {
);
action_log.update(cx, |log, cx| {
-log.keep_edits_in_range(buffer.clone(), 0..5, cx)
+log.keep_edits_in_range(buffer.clone(), 0..5, None, cx)
});
cx.run_until_parked();
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]);
@@ -1479,7 +1381,7 @@ mod tests {
action_log
.update(cx, |log, cx| {
-log.reject_edits_in_ranges(buffer.clone(), vec![2..5], cx)
+log.reject_edits_in_ranges(buffer.clone(), vec![2..5], None, cx)
})
.await
.unwrap();
@@ -1559,7 +1461,7 @@ mod tests {
action_log
.update(cx, |log, cx| {
-log.reject_edits_in_ranges(buffer.clone(), vec![2..5], cx)
+log.reject_edits_in_ranges(buffer.clone(), vec![2..5], None, cx)
})
.await
.unwrap();
@@ -1742,6 +1644,7 @@ mod tests {
log.reject_edits_in_ranges(
buffer.clone(),
vec![Point::new(4, 0)..Point::new(4, 0)],
+None,
cx,
)
})
@@ -1776,6 +1679,7 @@ mod tests {
log.reject_edits_in_ranges(
buffer.clone(),
vec![Point::new(0, 0)..Point::new(1, 0)],
+None,
cx,
)
})
@@ -1803,6 +1707,7 @@ mod tests {
log.reject_edits_in_ranges(
buffer.clone(),
vec![Point::new(4, 0)..Point::new(4, 0)],
+None,
cx,
)
})
@@ -1877,7 +1782,7 @@ mod tests {
let range_2 = buffer.read(cx).anchor_before(Point::new(5, 0))
..buffer.read(cx).anchor_before(Point::new(5, 3));
-log.reject_edits_in_ranges(buffer.clone(), vec![range_1, range_2], cx)
+log.reject_edits_in_ranges(buffer.clone(), vec![range_1, range_2], None, cx)
.detach();
assert_eq!(
buffer.read_with(cx, |buffer, _| buffer.text()),
@@ -1938,6 +1843,7 @@ mod tests {
log.reject_edits_in_ranges(
buffer.clone(),
vec![Point::new(0, 0)..Point::new(0, 0)],
+None,
cx,
)
})
@@ -1993,6 +1899,7 @@ mod tests {
log.reject_edits_in_ranges(
buffer.clone(),
vec![Point::new(0, 0)..Point::new(0, 11)],
+None,
cx,
)
})
@@ -2055,6 +1962,7 @@ mod tests {
log.reject_edits_in_ranges(
buffer.clone(),
vec![Point::new(0, 0)..Point::new(100, 0)],
+None,
cx,
)
})
@@ -2102,7 +2010,7 @@ mod tests {
// User accepts the single hunk
action_log.update(cx, |log, cx| {
-log.keep_edits_in_range(buffer.clone(), Anchor::MIN..Anchor::MAX, cx)
+log.keep_edits_in_range(buffer.clone(), Anchor::MIN..Anchor::MAX, None, cx)
});
cx.run_until_parked();
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]);
@@ -2123,7 +2031,7 @@ mod tests {
// User rejects the hunk
action_log
.update(cx, |log, cx| {
-log.reject_edits_in_ranges(buffer.clone(), vec![Anchor::MIN..Anchor::MAX], cx)
+log.reject_edits_in_ranges(buffer.clone(), vec![Anchor::MIN..Anchor::MAX], None, cx)
})
.await
.unwrap();
@@ -2167,7 +2075,7 @@ mod tests {
cx.run_until_parked();
// User clicks "Accept All"
-action_log.update(cx, |log, cx| log.keep_all_edits(cx));
+action_log.update(cx, |log, cx| log.keep_all_edits(None, cx));
cx.run_until_parked();
assert!(fs.is_file(path!("/dir/new_file").as_ref()).await);
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]); // Hunks are cleared
@@ -2186,7 +2094,7 @@ mod tests {
// User clicks "Reject All"
action_log
-.update(cx, |log, cx| log.reject_all_edits(cx))
+.update(cx, |log, cx| log.reject_all_edits(None, cx))
.await;
cx.run_until_parked();
assert!(fs.is_file(path!("/dir/new_file").as_ref()).await);
@@ -2226,7 +2134,7 @@ mod tests {
action_log.update(cx, |log, cx| {
let range = buffer.read(cx).random_byte_range(0, &mut rng);
log::info!("keeping edits in range {:?}", range);
-log.keep_edits_in_range(buffer.clone(), range, cx)
+log.keep_edits_in_range(buffer.clone(), range, None, cx)
});
}
25..50 => {
@@ -2234,7 +2142,7 @@ mod tests {
.update(cx, |log, cx| {
let range = buffer.read(cx).random_byte_range(0, &mut rng);
log::info!("rejecting edits in range {:?}", range);
-log.reject_edits_in_ranges(buffer.clone(), vec![range], cx)
+log.reject_edits_in_ranges(buffer.clone(), vec![range], None, cx)
})
.await
.unwrap();
@@ -2488,61 +2396,4 @@ mod tests {
.collect()
})
}
#[gpui::test]
async fn test_format_patch(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
path!("/dir"),
json!({"test.txt": "line 1\nline 2\nline 3\n"}),
)
.await;
let project = Project::test(fs.clone(), [path!("/dir").as_ref()], cx).await;
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let file_path = project
.read_with(cx, |project, cx| {
project.find_project_path("dir/test.txt", cx)
})
.unwrap();
let buffer = project
.update(cx, |project, cx| project.open_buffer(file_path, cx))
.await
.unwrap();
cx.update(|cx| {
// Track the buffer and mark it as read first
action_log.update(cx, |log, cx| {
log.buffer_read(buffer.clone(), cx);
});
// Make some edits to create a patch
buffer.update(cx, |buffer, cx| {
buffer
.edit([(Point::new(1, 0)..Point::new(1, 6), "CHANGED")], None, cx)
.unwrap(); // Replace "line2" with "CHANGED"
});
});
cx.run_until_parked();
// Get the patch
let patch = action_log.update(cx, |log, cx| log.unnotified_user_edits(cx));
// Verify the patch format contains expected unified diff elements
assert_eq!(
patch.unwrap(),
indoc! {"
--- a/dir/test.txt
+++ b/dir/test.txt
@@ -1,3 +1,3 @@
line 1
-line 2
+CHANGED
line 3
"}
);
}
}

View File

@@ -17,6 +17,7 @@ anyhow.workspace = true
auto_update.workspace = true
editor.workspace = true
extension_host.workspace = true
+fs.workspace = true
futures.workspace = true
gpui.workspace = true
language.workspace = true

View File

@@ -51,6 +51,7 @@ pub struct ActivityIndicator {
project: Entity<Project>,
auto_updater: Option<Entity<AutoUpdater>>,
context_menu_handle: PopoverMenuHandle<ContextMenu>,
+fs_jobs: Vec<fs::JobInfo>,
}
#[derive(Debug)]
@@ -99,6 +100,27 @@ impl ActivityIndicator {
})
.detach();
let fs = project.read(cx).fs().clone();
let mut job_events = fs.subscribe_to_jobs();
cx.spawn(async move |this, cx| {
while let Some(job_event) = job_events.next().await {
this.update(cx, |this: &mut ActivityIndicator, cx| {
match job_event {
fs::JobEvent::Started { info } => {
this.fs_jobs.retain(|j| j.id != info.id);
this.fs_jobs.push(info);
}
fs::JobEvent::Completed { id } => {
this.fs_jobs.retain(|j| j.id != id);
}
}
cx.notify();
})?;
}
anyhow::Ok(())
})
.detach();
cx.subscribe(
&project.read(cx).lsp_store(),
|activity_indicator, _, event, cx| {
@@ -201,7 +223,8 @@ impl ActivityIndicator {
statuses: Vec::new(),
project: project.clone(),
auto_updater,
-context_menu_handle: Default::default(),
+context_menu_handle: PopoverMenuHandle::default(),
+fs_jobs: Vec::new(),
}
});
@@ -432,6 +455,23 @@ impl ActivityIndicator {
});
}
// Show any long-running fs command
for fs_job in &self.fs_jobs {
if Instant::now().duration_since(fs_job.start) >= GIT_OPERATION_DELAY {
return Some(Content {
icon: Some(
Icon::new(IconName::ArrowCircle)
.size(IconSize::Small)
.with_rotate_animation(2)
.into_any_element(),
),
message: fs_job.message.clone().into(),
on_click: None,
tooltip_message: None,
});
}
}
// Show any language server installation info.
let mut downloading = SmallVec::<[_; 3]>::new();
let mut checking_for_update = SmallVec::<[_; 3]>::new();

View File

@@ -63,7 +63,6 @@ streaming_diff.workspace = true
strsim.workspace = true
task.workspace = true
telemetry.workspace = true
-terminal.workspace = true
text.workspace = true
thiserror.workspace = true
ui.workspace = true

View File

@@ -6,7 +6,6 @@ mod native_agent_server;
pub mod outline;
mod templates;
mod thread;
-mod tool_schema;
mod tools;
#[cfg(test)]
@@ -218,7 +217,7 @@ impl LanguageModels {
}
_ => {
log::error!(
-"Failed to authenticate provider: {}: {err}",
+"Failed to authenticate provider: {}: {err:#}",
provider_name.0
);
}
@@ -967,6 +966,10 @@ impl acp_thread::AgentModelSelector for NativeAgentModelSelector {
}
impl acp_thread::AgentConnection for NativeAgentConnection {
+fn telemetry_id(&self) -> &'static str {
+"zed"
+}
fn new_thread(
self: Rc<Self>,
project: Entity<Project>,
@@ -1107,10 +1110,6 @@ impl acp_thread::AgentConnection for NativeAgentConnection {
}
impl acp_thread::AgentTelemetry for NativeAgentConnection {
-fn agent_name(&self) -> String {
-"Zed".into()
-}
fn thread_data(
&self,
session_id: &acp::SessionId,
@@ -1627,9 +1626,7 @@ mod internal_tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
-Project::init_settings(cx);
-agent_settings::init(cx);
-language::init(cx);
LanguageModelRegistry::test(cx);
});
}

View File

@@ -1394,7 +1394,7 @@ mod tests {
async fn init_test(cx: &mut TestAppContext) -> EditAgent {
cx.update(settings::init);
-cx.update(Project::init_settings);
let project = Project::test(FakeFs::new(cx.executor()), [], cx).await;
let model = Arc::new(FakeLanguageModel::default());
let action_log = cx.new(|_| ActionLog::new(project.clone()));

View File

@@ -1468,14 +1468,9 @@ impl EditAgentTest {
gpui_tokio::init(cx);
let http_client = Arc::new(ReqwestClient::user_agent("agent tests").unwrap());
cx.set_http_client(http_client);
-client::init_settings(cx);
let client = Client::production(cx);
let user_store = cx.new(|cx| UserStore::new(client.clone(), cx));
settings::init(cx);
-Project::init_settings(cx);
-language::init(cx);
language_model::init(client.clone(), cx);
language_models::init(user_store, client.clone(), cx);
});

View File

@@ -88,8 +88,6 @@ mod tests {
async |fs, project, cx| {
let auth = cx.update(|cx| {
prompt_store::init(cx);
-terminal::init(cx);
let registry = language_model::LanguageModelRegistry::read_global(cx);
let auth = registry
.provider(&language_model::ANTHROPIC_PROVIDER_ID)

View File

@@ -1,6 +1,6 @@
use anyhow::Result;
use gpui::{AsyncApp, Entity};
-use language::{Buffer, OutlineItem, ParseStatus};
+use language::{Buffer, OutlineItem};
use regex::Regex;
use std::fmt::Write;
use text::Point;
@@ -30,10 +30,9 @@ pub async fn get_buffer_content_or_outline(
if file_size > AUTO_OUTLINE_SIZE {
// For large files, use outline instead of full content
// Wait until the buffer has been fully parsed, so we can read its outline
-let mut parse_status = buffer.read_with(cx, |buffer, _| buffer.parse_status())?;
-while *parse_status.borrow() != ParseStatus::Idle {
-parse_status.changed().await?;
-}
+buffer
+.read_with(cx, |buffer, _| buffer.parsing_idle())?
+.await;
let outline_items = buffer.read_with(cx, |buffer, _| {
let snapshot = buffer.snapshot();

View File

@@ -933,7 +933,7 @@ async fn test_profiles(cx: &mut TestAppContext) {
// Test that test-1 profile (default) has echo and delay tools
thread
.update(cx, |thread, cx| {
-thread.set_profile(AgentProfileId("test-1".into()));
+thread.set_profile(AgentProfileId("test-1".into()), cx);
thread.send(UserMessageId::new(), ["test"], cx)
})
.unwrap();
@@ -953,7 +953,7 @@ async fn test_profiles(cx: &mut TestAppContext) {
// Switch to test-2 profile, and verify that it has only the infinite tool.
thread
.update(cx, |thread, cx| {
-thread.set_profile(AgentProfileId("test-2".into()));
+thread.set_profile(AgentProfileId("test-2".into()), cx);
thread.send(UserMessageId::new(), ["test2"], cx)
})
.unwrap();
@@ -1002,8 +1002,8 @@ async fn test_mcp_tools(cx: &mut TestAppContext) {
)
.await;
cx.run_until_parked();
-thread.update(cx, |thread, _| {
-thread.set_profile(AgentProfileId("test".into()))
+thread.update(cx, |thread, cx| {
+thread.set_profile(AgentProfileId("test".into()), cx)
});
let mut mcp_tool_calls = setup_context_server(
@@ -1169,8 +1169,8 @@ async fn test_mcp_tool_truncation(cx: &mut TestAppContext) {
.await;
cx.run_until_parked();
-thread.update(cx, |thread, _| {
-thread.set_profile(AgentProfileId("test".into()));
+thread.update(cx, |thread, cx| {
+thread.set_profile(AgentProfileId("test".into()), cx);
thread.add_tool(EchoTool);
thread.add_tool(DelayTool);
thread.add_tool(WordListTool);
@@ -1851,7 +1851,6 @@ async fn test_agent_connection(cx: &mut TestAppContext) {
// Initialize language model system with test provider
cx.update(|cx| {
gpui_tokio::init(cx);
-client::init_settings(cx);
let http_client = FakeHttpClient::with_404_response();
let clock = Arc::new(clock::FakeSystemClock::new());
@@ -1859,9 +1858,7 @@ async fn test_agent_connection(cx: &mut TestAppContext) {
let user_store = cx.new(|cx| UserStore::new(client.clone(), cx));
language_model::init(client.clone(), cx);
language_models::init(user_store, client.clone(), cx);
-Project::init_settings(cx);
LanguageModelRegistry::test(cx);
-agent_settings::init(cx);
});
cx.executor().forbid_parking();
@@ -2395,8 +2392,6 @@ async fn setup(cx: &mut TestAppContext, model: TestModel) -> ThreadTest {
cx.update(|cx| {
settings::init(cx);
-Project::init_settings(cx);
-agent_settings::init(cx);
match model {
TestModel::Fake => {}
@@ -2404,7 +2399,6 @@ async fn setup(cx: &mut TestAppContext, model: TestModel) -> ThreadTest {
gpui_tokio::init(cx);
let http_client = ReqwestClient::user_agent("agent tests").unwrap();
cx.set_http_client(Arc::new(http_client));
-client::init_settings(cx);
let client = Client::production(cx);
let user_store = cx.new(|cx| UserStore::new(client.clone(), cx));
language_model::init(client.clone(), cx);

View File

@@ -30,16 +30,17 @@ use gpui::{
};
use language_model::{
LanguageModel, LanguageModelCompletionError, LanguageModelCompletionEvent, LanguageModelExt,
-LanguageModelImage, LanguageModelProviderId, LanguageModelRegistry, LanguageModelRequest,
-LanguageModelRequestMessage, LanguageModelRequestTool, LanguageModelToolResult,
-LanguageModelToolResultContent, LanguageModelToolSchemaFormat, LanguageModelToolUse,
-LanguageModelToolUseId, Role, SelectedModel, StopReason, TokenUsage, ZED_CLOUD_PROVIDER_ID,
+LanguageModelId, LanguageModelImage, LanguageModelProviderId, LanguageModelRegistry,
+LanguageModelRequest, LanguageModelRequestMessage, LanguageModelRequestTool,
+LanguageModelToolResult, LanguageModelToolResultContent, LanguageModelToolSchemaFormat,
+LanguageModelToolUse, LanguageModelToolUseId, Role, SelectedModel, StopReason, TokenUsage,
+ZED_CLOUD_PROVIDER_ID,
};
use project::Project;
use prompt_store::ProjectContext;
use schemars::{JsonSchema, Schema};
use serde::{Deserialize, Serialize};
-use settings::{Settings, update_settings_file};
+use settings::{LanguageModelSelection, Settings, update_settings_file};
use smol::stream::StreamExt;
use std::{
collections::BTreeMap,
@@ -798,7 +799,8 @@ impl Thread {
let profile_id = db_thread
.profile
.unwrap_or_else(|| AgentSettings::get_global(cx).default_profile.clone());
-let model = LanguageModelRegistry::global(cx).update(cx, |registry, cx| {
+let mut model = LanguageModelRegistry::global(cx).update(cx, |registry, cx| {
db_thread
.model
.and_then(|model| {
@@ -811,6 +813,16 @@ impl Thread {
.or_else(|| registry.default_model())
.map(|model| model.model)
});
if model.is_none() {
model = Self::resolve_profile_model(&profile_id, cx);
}
if model.is_none() {
model = LanguageModelRegistry::global(cx).update(cx, |registry, _cx| {
registry.default_model().map(|model| model.model)
});
}
let (prompt_capabilities_tx, prompt_capabilities_rx) =
watch::channel(Self::prompt_capabilities(model.as_deref()));
@@ -1007,8 +1019,17 @@ impl Thread {
&self.profile_id
}
-pub fn set_profile(&mut self, profile_id: AgentProfileId) {
+pub fn set_profile(&mut self, profile_id: AgentProfileId, cx: &mut Context<Self>) {
+if self.profile_id == profile_id {
+return;
+}
self.profile_id = profile_id;
+// Swap to the profile's preferred model when available.
+if let Some(model) = Self::resolve_profile_model(&self.profile_id, cx) {
+self.set_model(model, cx);
+}
}
pub fn cancel(&mut self, cx: &mut Context<Self>) {
@@ -1065,6 +1086,35 @@ impl Thread {
})
}
/// Look up the active profile and resolve its preferred model if one is configured.
fn resolve_profile_model(
profile_id: &AgentProfileId,
cx: &mut Context<Self>,
) -> Option<Arc<dyn LanguageModel>> {
let selection = AgentSettings::get_global(cx)
.profiles
.get(profile_id)?
.default_model
.clone()?;
Self::resolve_model_from_selection(&selection, cx)
}
/// Translate a stored model selection into the configured model from the registry.
fn resolve_model_from_selection(
selection: &LanguageModelSelection,
cx: &mut Context<Self>,
) -> Option<Arc<dyn LanguageModel>> {
let selected = SelectedModel {
provider: LanguageModelProviderId::from(selection.provider.0.clone()),
model: LanguageModelId::from(selection.model.clone()),
};
LanguageModelRegistry::global(cx).update(cx, |registry, cx| {
registry
.select_model(&selected, cx)
.map(|configured| configured.model)
})
}
pub fn resume( pub fn resume(
&mut self, &mut self,
cx: &mut Context<Self>, cx: &mut Context<Self>,
@@ -2139,7 +2189,7 @@ where
/// Returns the JSON schema that describes the tool's input.
fn input_schema(format: LanguageModelToolSchemaFormat) -> Schema {
-crate::tool_schema::root_schema_for::<Self::Input>(format)
+language_model::tool_schema::root_schema_for::<Self::Input>(format)
}
/// Some tools rely on a provider for the underlying billing or other reasons.
@@ -2226,7 +2276,7 @@ where
fn input_schema(&self, format: LanguageModelToolSchemaFormat) -> Result<serde_json::Value> {
let mut json = serde_json::to_value(T::input_schema(format))?;
-crate::tool_schema::adapt_schema_to_format(&mut json, format)?;
+language_model::tool_schema::adapt_schema_to_format(&mut json, format)?;
Ok(json)
}

View File

@@ -165,7 +165,7 @@ impl AnyAgentTool for ContextServerTool {
format: language_model::LanguageModelToolSchemaFormat, format: language_model::LanguageModelToolSchemaFormat,
) -> Result<serde_json::Value> { ) -> Result<serde_json::Value> {
let mut schema = self.tool.input_schema.clone(); let mut schema = self.tool.input_schema.clone();
crate::tool_schema::adapt_schema_to_format(&mut schema, format)?; language_model::tool_schema::adapt_schema_to_format(&mut schema, format)?;
Ok(match schema { Ok(match schema {
serde_json::Value::Null => { serde_json::Value::Null => {
serde_json::json!({ "type": "object", "properties": [] }) serde_json::json!({ "type": "object", "properties": [] })

View File

@@ -562,7 +562,6 @@ fn resolve_path(
mod tests {
    use super::*;
    use crate::{ContextServerRegistry, Templates};
-    use client::TelemetrySettings;
    use fs::Fs;
    use gpui::{TestAppContext, UpdateGlobal};
    use language_model::fake_provider::FakeLanguageModel;
@@ -1753,10 +1752,6 @@ mod tests {
        cx.update(|cx| {
            let settings_store = SettingsStore::test(cx);
            cx.set_global(settings_store);
-            language::init(cx);
-            TelemetrySettings::register(cx);
-            agent_settings::AgentSettings::register(cx);
-            Project::init_settings(cx);
        });
    }
}

View File

@@ -246,8 +246,6 @@ mod test {
        cx.update(|cx| {
            let settings_store = SettingsStore::test(cx);
            cx.set_global(settings_store);
-            language::init(cx);
-            Project::init_settings(cx);
        });
    }
}

View File

@@ -778,8 +778,6 @@ mod tests {
        cx.update(|cx| {
            let settings_store = SettingsStore::test(cx);
            cx.set_global(settings_store);
-            language::init(cx);
-            Project::init_settings(cx);
        });
    }

View File

@@ -223,8 +223,6 @@ mod tests {
        cx.update(|cx| {
            let settings_store = SettingsStore::test(cx);
            cx.set_global(settings_store);
-            language::init(cx);
-            Project::init_settings(cx);
        });
    }

View File

@@ -163,8 +163,6 @@ mod tests {
        cx.update(|cx| {
            let settings_store = SettingsStore::test(cx);
            cx.set_global(settings_store);
-            language::init(cx);
-            Project::init_settings(cx);
        });
    }
}

View File

@@ -509,8 +509,6 @@ mod test {
        cx.update(|cx| {
            let settings_store = SettingsStore::test(cx);
            cx.set_global(settings_store);
-            language::init(cx);
-            Project::init_settings(cx);
        });
    }

View File

@@ -21,7 +21,6 @@ acp_tools.workspace = true
acp_thread.workspace = true
action_log.workspace = true
agent-client-protocol.workspace = true
-agent_settings.workspace = true
anyhow.workspace = true
async-trait.workspace = true
client.workspace = true
@@ -33,7 +32,6 @@ gpui.workspace = true
gpui_tokio = { workspace = true, optional = true }
http_client.workspace = true
indoc.workspace = true
-language.workspace = true
language_model.workspace = true
language_models.workspace = true
log.workspace = true

View File

@@ -29,6 +29,7 @@ pub struct UnsupportedVersion;
pub struct AcpConnection {
    server_name: SharedString,
+    telemetry_id: &'static str,
    connection: Rc<acp::ClientSideConnection>,
    sessions: Rc<RefCell<HashMap<acp::SessionId, AcpSession>>>,
    auth_methods: Vec<acp::AuthMethod>,
@@ -52,6 +53,7 @@ pub struct AcpSession {
pub async fn connect(
    server_name: SharedString,
+    telemetry_id: &'static str,
    command: AgentServerCommand,
    root_dir: &Path,
    default_mode: Option<acp::SessionModeId>,
@@ -60,6 +62,7 @@ pub async fn connect(
) -> Result<Rc<dyn AgentConnection>> {
    let conn = AcpConnection::stdio(
        server_name,
+        telemetry_id,
        command.clone(),
        root_dir,
        default_mode,
@@ -75,6 +78,7 @@ const MINIMUM_SUPPORTED_VERSION: acp::ProtocolVersion = acp::V1;
impl AcpConnection {
    pub async fn stdio(
        server_name: SharedString,
+        telemetry_id: &'static str,
        command: AgentServerCommand,
        root_dir: &Path,
        default_mode: Option<acp::SessionModeId>,
@@ -132,7 +136,7 @@ impl AcpConnection {
            while let Ok(n) = stderr.read_line(&mut line).await
                && n > 0
            {
-                log::warn!("agent stderr: {}", &line);
+                log::warn!("agent stderr: {}", line.trim());
                line.clear();
            }
            Ok(())
@@ -199,6 +203,7 @@ impl AcpConnection {
            root_dir: root_dir.to_owned(),
            connection,
            server_name,
+            telemetry_id,
            sessions,
            agent_capabilities: response.agent_capabilities,
            default_mode,
@@ -226,6 +231,10 @@ impl Drop for AcpConnection {
}

impl AgentConnection for AcpConnection {
+    fn telemetry_id(&self) -> &'static str {
+        self.telemetry_id
+    }
+
    fn new_thread(
        self: Rc<Self>,
        project: Entity<Project>,

View File

@@ -62,6 +62,7 @@ impl AgentServer for ClaudeCode {
        cx: &mut App,
    ) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
        let name = self.name();
+        let telemetry_id = self.telemetry_id();
        let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
        let is_remote = delegate.project.read(cx).is_via_remote_server();
        let store = delegate.store.downgrade();
@@ -85,6 +86,7 @@ impl AgentServer for ClaudeCode {
                .await?;
            let connection = crate::acp::connect(
                name,
+                telemetry_id,
                command,
                root_dir.as_ref(),
                default_mode,

View File

@@ -63,6 +63,7 @@ impl AgentServer for Codex {
        cx: &mut App,
    ) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
        let name = self.name();
+        let telemetry_id = self.telemetry_id();
        let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
        let is_remote = delegate.project.read(cx).is_via_remote_server();
        let store = delegate.store.downgrade();
@@ -87,6 +88,7 @@ impl AgentServer for Codex {
            let connection = crate::acp::connect(
                name,
+                telemetry_id,
                command,
                root_dir.as_ref(),
                default_mode,

View File

@@ -67,6 +67,7 @@ impl crate::AgentServer for CustomAgentServer {
        cx: &mut App,
    ) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
        let name = self.name();
+        let telemetry_id = self.telemetry_id();
        let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
        let is_remote = delegate.project.read(cx).is_via_remote_server();
        let default_mode = self.default_mode(cx);
@@ -92,6 +93,7 @@ impl crate::AgentServer for CustomAgentServer {
                .await?;
            let connection = crate::acp::connect(
                name,
+                telemetry_id,
                command,
                root_dir.as_ref(),
                default_mode,

View File

@@ -6,7 +6,9 @@ use gpui::{AppContext, Entity, TestAppContext};
use indoc::indoc;
#[cfg(test)]
use project::agent_server_store::BuiltinAgentServerSettings;
-use project::{FakeFs, Project, agent_server_store::AllAgentServersSettings};
+use project::{FakeFs, Project};
+#[cfg(test)]
+use settings::Settings;
use std::{
    path::{Path, PathBuf},
    sync::Arc,
@@ -452,29 +454,22 @@ pub use common_e2e_tests;
// Helpers

pub async fn init_test(cx: &mut TestAppContext) -> Arc<FakeFs> {
-    use settings::Settings;

    env_logger::try_init().ok();
    cx.update(|cx| {
        let settings_store = settings::SettingsStore::test(cx);
        cx.set_global(settings_store);
-        Project::init_settings(cx);
-        language::init(cx);
        gpui_tokio::init(cx);
        let http_client = reqwest_client::ReqwestClient::user_agent("agent tests").unwrap();
        cx.set_http_client(Arc::new(http_client));
-        client::init_settings(cx);
        let client = client::Client::production(cx);
        let user_store = cx.new(|cx| client::UserStore::new(client.clone(), cx));
        language_model::init(client.clone(), cx);
        language_models::init(user_store, client, cx);
-        agent_settings::init(cx);
-        AllAgentServersSettings::register(cx);
        #[cfg(test)]
-        AllAgentServersSettings::override_global(
-            AllAgentServersSettings {
+        project::agent_server_store::AllAgentServersSettings::override_global(
+            project::agent_server_store::AllAgentServersSettings {
                claude: Some(BuiltinAgentServerSettings {
                    path: Some("claude-code-acp".into()),
                    args: None,

View File

@@ -31,6 +31,7 @@ impl AgentServer for Gemini {
        cx: &mut App,
    ) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
        let name = self.name();
+        let telemetry_id = self.telemetry_id();
        let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
        let is_remote = delegate.project.read(cx).is_via_remote_server();
        let store = delegate.store.downgrade();
@@ -64,6 +65,7 @@ impl AgentServer for Gemini {
            let connection = crate::acp::connect(
                name,
+                telemetry_id,
                command,
                root_dir.as_ref(),
                default_mode,

View File

@@ -6,8 +6,8 @@ use convert_case::{Case, Casing as _};
use fs::Fs;
use gpui::{App, SharedString};
use settings::{
-    AgentProfileContent, ContextServerPresetContent, Settings as _, SettingsContent,
-    update_settings_file,
+    AgentProfileContent, ContextServerPresetContent, LanguageModelSelection, Settings as _,
+    SettingsContent, update_settings_file,
};
use util::ResultExt as _;
@@ -53,19 +53,30 @@ impl AgentProfile {
        let base_profile =
            base_profile_id.and_then(|id| AgentSettings::get_global(cx).profiles.get(&id).cloned());

+        // Copy toggles from the base profile so the new profile starts with familiar defaults.
+        let tools = base_profile
+            .as_ref()
+            .map(|profile| profile.tools.clone())
+            .unwrap_or_default();
+        let enable_all_context_servers = base_profile
+            .as_ref()
+            .map(|profile| profile.enable_all_context_servers)
+            .unwrap_or_default();
+        let context_servers = base_profile
+            .as_ref()
+            .map(|profile| profile.context_servers.clone())
+            .unwrap_or_default();
+        // Preserve the base profile's model preference when cloning into a new profile.
+        let default_model = base_profile
+            .as_ref()
+            .and_then(|profile| profile.default_model.clone());
+
        let profile_settings = AgentProfileSettings {
            name: name.into(),
-            tools: base_profile
-                .as_ref()
-                .map(|profile| profile.tools.clone())
-                .unwrap_or_default(),
-            enable_all_context_servers: base_profile
-                .as_ref()
-                .map(|profile| profile.enable_all_context_servers)
-                .unwrap_or_default(),
-            context_servers: base_profile
-                .map(|profile| profile.context_servers)
-                .unwrap_or_default(),
+            tools,
+            enable_all_context_servers,
+            context_servers,
+            default_model,
        };

        update_settings_file(fs, cx, {
@@ -96,6 +107,8 @@ pub struct AgentProfileSettings {
    pub tools: IndexMap<Arc<str>, bool>,
    pub enable_all_context_servers: bool,
    pub context_servers: IndexMap<Arc<str>, ContextServerPreset>,
+    /// Default language model to apply when this profile becomes active.
+    pub default_model: Option<LanguageModelSelection>,
}

impl AgentProfileSettings {
@@ -144,6 +157,7 @@ impl AgentProfileSettings {
                    )
                })
                .collect(),
+            default_model: self.default_model.clone(),
        },
    );
@@ -153,15 +167,23 @@ impl AgentProfileSettings {
impl From<AgentProfileContent> for AgentProfileSettings {
    fn from(content: AgentProfileContent) -> Self {
+        let AgentProfileContent {
+            name,
+            tools,
+            enable_all_context_servers,
+            context_servers,
+            default_model,
+        } = content;
        Self {
-            name: content.name.into(),
-            tools: content.tools,
-            enable_all_context_servers: content.enable_all_context_servers.unwrap_or_default(),
-            context_servers: content
-                .context_servers
+            name: name.into(),
+            tools,
+            enable_all_context_servers: enable_all_context_servers.unwrap_or_default(),
+            context_servers: context_servers
                .into_iter()
                .map(|(server_id, preset)| (server_id, preset.into()))
                .collect(),
+            default_model,
        }
    }
}

View File

@@ -10,7 +10,7 @@ use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{
    DefaultAgentView, DockPosition, LanguageModelParameters, LanguageModelSelection,
-    NotifyWhenAgentWaiting, Settings,
+    NotifyWhenAgentWaiting, RegisterSetting, Settings,
};

pub use crate::agent_profile::*;
@@ -19,11 +19,7 @@ pub const SUMMARIZE_THREAD_PROMPT: &str = include_str!("prompts/summarize_thread
pub const SUMMARIZE_THREAD_DETAILED_PROMPT: &str =
    include_str!("prompts/summarize_thread_detailed_prompt.txt");

-pub fn init(cx: &mut App) {
-    AgentSettings::register(cx);
-}
-
-#[derive(Clone, Debug)]
+#[derive(Clone, Debug, RegisterSetting)]
pub struct AgentSettings {
    pub enabled: bool,
    pub button: bool,

View File

@@ -4,7 +4,6 @@ mod message_editor;
mod mode_selector;
mod model_selector;
mod model_selector_popover;
-mod thread_editor;
mod thread_history;
mod thread_view;

View File

@@ -646,16 +646,14 @@ impl ContextPickerCompletionProvider {
        cx: &mut App,
    ) -> Vec<ContextPickerEntry> {
        let embedded_context = self.prompt_capabilities.borrow().embedded_context;
-        let mut entries = if embedded_context {
-            vec![
-                ContextPickerEntry::Mode(ContextPickerMode::File),
-                ContextPickerEntry::Mode(ContextPickerMode::Symbol),
-                ContextPickerEntry::Mode(ContextPickerMode::Thread),
-            ]
-        } else {
-            // File is always available, but we don't need a mode entry
-            vec![]
-        };
+        let mut entries = vec![
+            ContextPickerEntry::Mode(ContextPickerMode::File),
+            ContextPickerEntry::Mode(ContextPickerMode::Symbol),
+        ];
+
+        if embedded_context {
+            entries.push(ContextPickerEntry::Mode(ContextPickerMode::Thread));
+        }

        let has_selection = workspace
            .read(cx)

View File

@@ -401,10 +401,9 @@ mod tests {
    use acp_thread::{AgentConnection, StubAgentConnection};
    use agent::HistoryStore;
    use agent_client_protocol as acp;
-    use agent_settings::AgentSettings;
    use assistant_text_thread::TextThreadStore;
    use buffer_diff::{DiffHunkStatus, DiffHunkStatusKind};
-    use editor::{EditorSettings, RowInfo};
+    use editor::RowInfo;
    use fs::FakeFs;
    use gpui::{AppContext as _, SemanticVersion, TestAppContext};
@@ -413,7 +412,7 @@ mod tests {
    use pretty_assertions::assert_matches;
    use project::Project;
    use serde_json::json;
-    use settings::{Settings as _, SettingsStore};
+    use settings::SettingsStore;
    use util::path;
    use workspace::Workspace;
@@ -539,13 +538,8 @@ mod tests {
        cx.update(|cx| {
            let settings_store = SettingsStore::test(cx);
            cx.set_global(settings_store);
-            language::init(cx);
-            Project::init_settings(cx);
-            AgentSettings::register(cx);
-            workspace::init_settings(cx);
            theme::init(theme::LoadThemes::JustBase, cx);
            release_channel::init(SemanticVersion::default(), cx);
-            EditorSettings::register(cx);
        });
    }
}

View File

@@ -356,7 +356,7 @@ impl MessageEditor {
        let task = match mention_uri.clone() {
            MentionUri::Fetch { url } => self.confirm_mention_for_fetch(url, cx),
-            MentionUri::Directory { .. } => Task::ready(Ok(Mention::UriOnly)),
+            MentionUri::Directory { .. } => Task::ready(Ok(Mention::Link)),
            MentionUri::Thread { id, .. } => self.confirm_mention_for_thread(id, cx),
            MentionUri::TextThread { path, .. } => self.confirm_mention_for_text_thread(path, cx),
            MentionUri::File { abs_path } => self.confirm_mention_for_file(abs_path, cx),
@@ -373,7 +373,6 @@ impl MessageEditor {
                )))
            }
            MentionUri::Selection { .. } => {
-                // Handled elsewhere
                debug_panic!("unexpected selection URI");
                Task::ready(Err(anyhow!("unexpected selection URI")))
            }
@@ -704,20 +703,21 @@ impl MessageEditor {
            return Task::ready(Err(err));
        }

-        let contents = self.mention_set.contents(
-            &self.prompt_capabilities.borrow(),
-            full_mention_content,
-            self.project.clone(),
-            cx,
-        );
+        let contents = self
+            .mention_set
+            .contents(full_mention_content, self.project.clone(), cx);

        let editor = self.editor.clone();
+        let supports_embedded_context = self.prompt_capabilities.borrow().embedded_context;

        cx.spawn(async move |_, cx| {
            let contents = contents.await?;
            let mut all_tracked_buffers = Vec::new();

            let result = editor.update(cx, |editor, cx| {
-                let mut ix = text.chars().position(|c| !c.is_whitespace()).unwrap_or(0);
+                let (mut ix, _) = text
+                    .char_indices()
+                    .find(|(_, c)| !c.is_whitespace())
+                    .unwrap_or((0, '\0'));
                let mut chunks: Vec<acp::ContentBlock> = Vec::new();
                let text = editor.text(cx);
                editor.display_map.update(cx, |map, cx| {
@@ -738,18 +738,32 @@ impl MessageEditor {
                            tracked_buffers,
                        } => {
                            all_tracked_buffers.extend(tracked_buffers.iter().cloned());
-                            acp::ContentBlock::Resource(acp::EmbeddedResource {
-                                annotations: None,
-                                resource: acp::EmbeddedResourceResource::TextResourceContents(
-                                    acp::TextResourceContents {
-                                        mime_type: None,
-                                        text: content.clone(),
-                                        uri: uri.to_uri().to_string(),
-                                        meta: None,
-                                    },
-                                ),
-                                meta: None,
-                            })
+                            if supports_embedded_context {
+                                acp::ContentBlock::Resource(acp::EmbeddedResource {
+                                    annotations: None,
+                                    resource:
+                                        acp::EmbeddedResourceResource::TextResourceContents(
+                                            acp::TextResourceContents {
+                                                mime_type: None,
+                                                text: content.clone(),
+                                                uri: uri.to_uri().to_string(),
+                                                meta: None,
+                                            },
+                                        ),
+                                    meta: None,
+                                })
+                            } else {
+                                acp::ContentBlock::ResourceLink(acp::ResourceLink {
+                                    name: uri.name(),
+                                    uri: uri.to_uri().to_string(),
+                                    annotations: None,
+                                    description: None,
+                                    mime_type: None,
+                                    size: None,
+                                    title: None,
+                                    meta: None,
+                                })
+                            }
                        }
                        Mention::Image(mention_image) => {
                            let uri = match uri {
@@ -771,18 +785,16 @@ impl MessageEditor {
                                meta: None,
                            })
                        }
-                        Mention::UriOnly => {
-                            acp::ContentBlock::ResourceLink(acp::ResourceLink {
-                                name: uri.name(),
-                                uri: uri.to_uri().to_string(),
-                                annotations: None,
-                                description: None,
-                                mime_type: None,
-                                size: None,
-                                title: None,
-                                meta: None,
-                            })
-                        }
+                        Mention::Link => acp::ContentBlock::ResourceLink(acp::ResourceLink {
+                            name: uri.name(),
+                            uri: uri.to_uri().to_string(),
+                            annotations: None,
+                            description: None,
+                            mime_type: None,
+                            size: None,
+                            title: None,
+                            meta: None,
+                        }),
                    };
                    chunks.push(chunk);
                    ix = crease_range.end;
@@ -1111,7 +1123,7 @@ impl MessageEditor {
                    let start = text.len();
                    write!(&mut text, "{}", mention_uri.as_link()).ok();
                    let end = text.len();
-                    mentions.push((start..end, mention_uri, Mention::UriOnly));
+                    mentions.push((start..end, mention_uri, Mention::Link));
                }
            }
            acp::ContentBlock::Image(acp::ImageContent {
@@ -1183,6 +1195,17 @@ impl MessageEditor {
        self.editor.read(cx).text(cx)
    }

+    pub fn set_placeholder_text(
+        &mut self,
+        placeholder: &str,
+        window: &mut Window,
+        cx: &mut Context<Self>,
+    ) {
+        self.editor.update(cx, |editor, cx| {
+            editor.set_placeholder_text(placeholder, window, cx);
+        });
+    }
+
    #[cfg(test)]
    pub fn set_text(&mut self, text: &str, window: &mut Window, cx: &mut Context<Self>) {
        self.editor.update(cx, |editor, cx| {
@@ -1517,7 +1540,7 @@ pub enum Mention {
        tracked_buffers: Vec<Entity<Buffer>>,
    },
    Image(MentionImage),
-    UriOnly,
+    Link,
}

#[derive(Clone, Debug, Eq, PartialEq)]
@@ -1534,21 +1557,10 @@ pub struct MentionSet {
impl MentionSet {
    fn contents(
        &self,
-        prompt_capabilities: &acp::PromptCapabilities,
        full_mention_content: bool,
        project: Entity<Project>,
        cx: &mut App,
    ) -> Task<Result<HashMap<CreaseId, (MentionUri, Mention)>>> {
-        if !prompt_capabilities.embedded_context {
-            let mentions = self
-                .mentions
-                .iter()
-                .map(|(crease_id, (uri, _))| (*crease_id, (uri.clone(), Mention::UriOnly)))
-                .collect();
-            return Task::ready(Ok(mentions));
-        }
-
        let mentions = self.mentions.clone();
        cx.spawn(async move |cx| {
            let mut contents = HashMap::default();
@@ -1898,10 +1910,8 @@ mod tests {
        let app_state = cx.update(AppState::test);
        cx.update(|cx| {
-            language::init(cx);
            editor::init(cx);
            workspace::init(app_state.clone(), cx);
-            Project::init_settings(cx);
        });

        let project = Project::test(app_state.fs.clone(), [path!("/dir").as_ref()], cx).await;
@@ -2074,10 +2084,8 @@ mod tests {
        let app_state = cx.update(AppState::test);
        cx.update(|cx| {
-            language::init(cx);
            editor::init(cx);
            workspace::init(app_state.clone(), cx);
-            Project::init_settings(cx);
        });

        app_state
@@ -2202,6 +2210,8 @@ mod tests {
                format!("seven.txt b{slash}"),
                format!("six.txt b{slash}"),
                format!("five.txt b{slash}"),
+                "Files & Directories".into(),
+                "Symbols".into()
            ]
        );

        editor.set_text("", window, cx);
@@ -2286,21 +2296,11 @@ mod tests {
            assert_eq!(fold_ranges(editor, cx).len(), 1);
        });

-        let all_prompt_capabilities = acp::PromptCapabilities {
-            image: true,
-            audio: true,
-            embedded_context: true,
-            meta: None,
-        };
-
        let contents = message_editor
            .update(&mut cx, |message_editor, cx| {
-                message_editor.mention_set().contents(
-                    &all_prompt_capabilities,
-                    false,
-                    project.clone(),
-                    cx,
-                )
+                message_editor
+                    .mention_set()
+                    .contents(false, project.clone(), cx)
            })
            .await
            .unwrap()
@@ -2318,30 +2318,6 @@ mod tests {
            );
        }

-        let contents = message_editor
-            .update(&mut cx, |message_editor, cx| {
-                message_editor.mention_set().contents(
-                    &acp::PromptCapabilities::default(),
-                    false,
-                    project.clone(),
-                    cx,
-                )
-            })
-            .await
-            .unwrap()
-            .into_values()
-            .collect::<Vec<_>>();
-
-        {
-            let [(uri, Mention::UriOnly)] = contents.as_slice() else {
-                panic!("Unexpected mentions");
-            };
-            pretty_assertions::assert_eq!(
-                uri,
-                &MentionUri::parse(&url_one, PathStyle::local()).unwrap()
-            );
-        }
-
        cx.simulate_input(" ");

        editor.update(&mut cx, |editor, cx| {
@@ -2377,12 +2353,9 @@ mod tests {
        let contents = message_editor
            .update(&mut cx, |message_editor, cx| {
-                message_editor.mention_set().contents(
-                    &all_prompt_capabilities,
-                    false,
-                    project.clone(),
-                    cx,
-                )
+                message_editor
+                    .mention_set()
+                    .contents(false, project.clone(), cx)
            })
            .await
            .unwrap()
@@ -2503,12 +2476,9 @@ mod tests {
        let contents = message_editor
            .update(&mut cx, |message_editor, cx| {
-                message_editor.mention_set().contents(
-                    &all_prompt_capabilities,
-                    false,
-                    project.clone(),
-                    cx,
-                )
+                message_editor
+                    .mention_set()
+                    .contents(false, project.clone(), cx)
            })
            .await
            .unwrap()
@@ -2554,12 +2524,9 @@ mod tests {
        // Getting the message contents fails
        message_editor
            .update(&mut cx, |message_editor, cx| {
-                message_editor.mention_set().contents(
-                    &all_prompt_capabilities,
-                    false,
-                    project.clone(),
-                    cx,
-                )
+                message_editor
+                    .mention_set()
+                    .contents(false, project.clone(), cx)
            })
            .await
            .expect_err("Should fail to load x.png");
@@ -2610,12 +2577,9 @@ mod tests {
        // Now getting the contents succeeds, because the invalid mention was removed
        let contents = message_editor
            .update(&mut cx, |message_editor, cx| {
-                message_editor.mention_set().contents(
-                    &all_prompt_capabilities,
-                    false,
-                    project.clone(),
-                    cx,
-                )
+                message_editor
+                    .mention_set()
+                    .contents(false, project.clone(), cx)
            })
            .await
            .unwrap();
@@ -2879,7 +2843,7 @@ mod tests {
        cx.run_until_parked();

        editor.update_in(cx, |editor, window, cx| {
-            editor.set_text(" hello world ", window, cx);
+            editor.set_text(" \u{A0}してhello world ", window, cx);
        });

        let (content, _) = message_editor
@@ -2890,13 +2854,154 @@ mod tests {
        assert_eq!(
            content,
            vec![acp::ContentBlock::Text(acp::TextContent {
-                text: "hello world".into(),
+                text: "してhello world".into(),
                annotations: None,
                meta: None
            })]
        );
    }
#[gpui::test]
async fn test_editor_respects_embedded_context_capability(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let file_content = "fn main() { println!(\"Hello, world!\"); }\n";
fs.insert_tree(
"/project",
json!({
"src": {
"main.rs": file_content,
}
}),
)
.await;
let project = Project::test(fs, [Path::new(path!("/project"))], cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let text_thread_store = cx.new(|cx| TextThreadStore::fake(project.clone(), cx));
let history_store = cx.new(|cx| HistoryStore::new(text_thread_store, cx));
let (message_editor, editor) = workspace.update_in(cx, |workspace, window, cx| {
let workspace_handle = cx.weak_entity();
let message_editor = cx.new(|cx| {
MessageEditor::new(
workspace_handle,
project.clone(),
history_store.clone(),
None,
Default::default(),
Default::default(),
"Test Agent".into(),
"Test",
EditorMode::AutoHeight {
max_lines: None,
min_lines: 1,
},
window,
cx,
)
});
workspace.active_pane().update(cx, |pane, cx| {
pane.add_item(
Box::new(cx.new(|_| MessageEditorItem(message_editor.clone()))),
true,
true,
None,
window,
cx,
);
});
message_editor.read(cx).focus_handle(cx).focus(window);
let editor = message_editor.read(cx).editor().clone();
(message_editor, editor)
});
cx.simulate_input("What is in @file main");
editor.update_in(cx, |editor, window, cx| {
assert!(editor.has_visible_completions_menu());
assert_eq!(editor.text(cx), "What is in @file main");
editor.confirm_completion(&editor::actions::ConfirmCompletion::default(), window, cx);
});
let content = message_editor
.update(cx, |editor, cx| editor.contents(false, cx))
.await
.unwrap()
.0;
let main_rs_uri = if cfg!(windows) {
"file:///C:/project/src/main.rs".to_string()
} else {
"file:///project/src/main.rs".to_string()
};
// When embedded context is `false` we should get a resource link
pretty_assertions::assert_eq!(
content,
vec![
acp::ContentBlock::Text(acp::TextContent {
text: "What is in ".to_string(),
annotations: None,
meta: None
}),
acp::ContentBlock::ResourceLink(acp::ResourceLink {
uri: main_rs_uri.clone(),
name: "main.rs".to_string(),
annotations: None,
meta: None,
description: None,
mime_type: None,
size: None,
title: None,
})
]
);
message_editor.update(cx, |editor, _cx| {
editor.prompt_capabilities.replace(acp::PromptCapabilities {
embedded_context: true,
..Default::default()
})
});
let content = message_editor
.update(cx, |editor, cx| editor.contents(false, cx))
.await
.unwrap()
.0;
// When embedded context is `true` we should get a resource
pretty_assertions::assert_eq!(
content,
vec![
acp::ContentBlock::Text(acp::TextContent {
text: "What is in ".to_string(),
annotations: None,
meta: None
}),
acp::ContentBlock::Resource(acp::EmbeddedResource {
resource: acp::EmbeddedResourceResource::TextResourceContents(
acp::TextResourceContents {
text: file_content.to_string(),
uri: main_rs_uri,
mime_type: None,
meta: None
}
),
annotations: None,
meta: None
})
]
);
}
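The test above pivots on a single capability flag: without `embedded_context` a file mention is sent as a `ResourceLink`, and with it the file body is inlined as an embedded `Resource`. A toy model of that decision, with simplified stand-ins for the `agent_client_protocol` types (the real `ContentBlock` variants carry more fields):

```rust
// Simplified stand-ins for acp::ContentBlock variants.
#[derive(Debug, PartialEq)]
enum ContentBlock {
    ResourceLink { uri: String },
    Resource { uri: String, text: String },
}

fn mention_block(uri: &str, file_text: &str, embedded_context: bool) -> ContentBlock {
    if embedded_context {
        // Capable agents get the file contents inlined.
        ContentBlock::Resource {
            uri: uri.into(),
            text: file_text.into(),
        }
    } else {
        // Otherwise only a link is sent and the agent fetches it itself.
        ContentBlock::ResourceLink { uri: uri.into() }
    }
}

fn main() {
    println!("{:?}", mention_block("file:///project/src/main.rs", "fn main() {}", false));
    println!("{:?}", mention_block("file:///project/src/main.rs", "fn main() {}", true));
}
```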
#[gpui::test]
async fn test_autoscroll_after_insert_selections(cx: &mut TestAppContext) {
init_test(cx);
@@ -2904,10 +3009,8 @@ mod tests {
let app_state = cx.update(AppState::test);
cx.update(|cx| {
-language::init(cx);
editor::init(cx);
workspace::init(app_state.clone(), cx);
-Project::init_settings(cx);
});
app_state

View File

@@ -1,6 +1,6 @@
use std::rc::Rc;
-use acp_thread::AgentModelSelector;
+use acp_thread::{AgentModelInfo, AgentModelSelector};
use gpui::{Entity, FocusHandle};
use picker::popover_menu::PickerPopoverMenu;
use ui::{
@@ -36,12 +36,8 @@ impl AcpModelSelectorPopover {
self.menu_handle.toggle(window, cx);
}
-pub fn active_model_name(&self, cx: &App) -> Option<SharedString> {
-self.selector
-.read(cx)
-.delegate
-.active_model()
-.map(|model| model.name.clone())
+pub fn active_model<'a>(&self, cx: &'a App) -> Option<&'a AgentModelInfo> {
+self.selector.read(cx).delegate.active_model()
}
}
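The API change above swaps a cloned name for a borrowed `AgentModelInfo`, so callers can project whichever field they need. A minimal sketch of the pattern, with stand-in types for `AgentModelInfo` and the selector delegate (field names here are assumptions):

```rust
// Simplified stand-ins for acp_thread::AgentModelInfo and the selector delegate.
struct AgentModelInfo {
    id: String,
    name: String,
}

struct Selector {
    active: Option<AgentModelInfo>,
}

impl Selector {
    // Returning a borrow lets callers pick `id` or `name` without cloning
    // the whole struct, which is why the diff drops `active_model_name`.
    fn active_model(&self) -> Option<&AgentModelInfo> {
        self.active.as_ref()
    }
}

fn main() {
    let s = Selector {
        active: Some(AgentModelInfo {
            id: "claude-sonnet".into(),
            name: "Claude Sonnet".into(),
        }),
    };
    let id = s.active_model().map(|m| m.id.to_string());
    let name = s.active_model().map(|m| m.name.clone());
    println!("{id:?} {name:?}");
}
```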

View File

@@ -1,6 +0,0 @@
-use acp_thread::AcpThread;
-use gpui::Entity;
-pub struct ThreadEditor {
-thread: Entity<AcpThread>,
-}

View File

@@ -457,25 +457,23 @@ impl Render for AcpThreadHistory {
.on_action(cx.listener(Self::select_last))
.on_action(cx.listener(Self::confirm))
.on_action(cx.listener(Self::remove_selected_thread))
-.when(!self.history_store.read(cx).is_empty(cx), |parent| {
-parent.child(
+.child(
h_flex()
.h(px(41.)) // Match the toolbar perfectly
.w_full()
.py_1()
.px_2()
.gap_2()
.justify_between()
.border_b_1()
.border_color(cx.theme().colors().border)
.child(
Icon::new(IconName::MagnifyingGlass)
.color(Color::Muted)
.size(IconSize::Small),
)
.child(self.search_editor.clone()),
)
-})
.child({
let view = v_flex()
.id("list-container")
@@ -484,19 +482,15 @@ impl Render for AcpThreadHistory {
.flex_grow();
if self.history_store.read(cx).is_empty(cx) {
-view.justify_center()
-.child(
-h_flex().w_full().justify_center().child(
-Label::new("You don't have any past threads yet.")
-.size(LabelSize::Small),
-),
-)
-} else if self.search_produced_no_matches() {
-view.justify_center().child(
-h_flex().w_full().justify_center().child(
-Label::new("No threads match your search.").size(LabelSize::Small),
-),
+view.justify_center().items_center().child(
+Label::new("You don't have any past threads yet.")
+.size(LabelSize::Small)
+.color(Color::Muted),
)
+} else if self.search_produced_no_matches() {
+view.justify_center()
+.items_center()
+.child(Label::new("No threads match your search.").size(LabelSize::Small))
} else {
view.child(
uniform_list(
@@ -673,7 +667,7 @@ impl EntryTimeFormat {
timezone,
time_format::TimestampFormat::EnhancedAbsolute,
),
-EntryTimeFormat::TimeOnly => time_format::format_time(timestamp),
+EntryTimeFormat::TimeOnly => time_format::format_time(timestamp.to_offset(timezone)),
}
}
}

View File

@@ -4,12 +4,12 @@ use acp_thread::{
ToolCallStatus, UserMessageId,
};
use acp_thread::{AgentConnection, Plan};
-use action_log::ActionLog;
+use action_log::{ActionLog, ActionLogTelemetry};
use agent::{DbThreadMetadata, HistoryEntry, HistoryEntryId, HistoryStore, NativeAgentServer};
use agent_client_protocol::{self as acp, PromptCapabilities};
use agent_servers::{AgentServer, AgentServerDelegate};
use agent_settings::{AgentProfileId, AgentSettings, CompletionMode};
-use anyhow::{Result, anyhow, bail};
+use anyhow::{Result, anyhow};
use arrayvec::ArrayVec;
use audio::{Audio, Sound};
use buffer_diff::BufferDiff;
@@ -125,8 +125,9 @@ impl ProfileProvider for Entity<agent::Thread> {
}
fn set_profile(&self, profile_id: AgentProfileId, cx: &mut App) {
-self.update(cx, |thread, _cx| {
-thread.set_profile(profile_id);
+self.update(cx, |thread, cx| {
+// Apply the profile and let the thread swap to its default model.
+thread.set_profile(profile_id, cx);
});
}
@@ -169,7 +170,7 @@ impl ThreadFeedbackState {
}
}
let session_id = thread.read(cx).session_id().clone();
-let agent_name = telemetry.agent_name();
+let agent = thread.read(cx).connection().telemetry_id();
let task = telemetry.thread_data(&session_id, cx);
let rating = match feedback {
ThreadFeedback::Positive => "positive",
@@ -179,9 +180,9 @@ impl ThreadFeedbackState {
let thread = task.await?;
telemetry::event!(
"Agent Thread Rated",
+agent = agent,
session_id = session_id,
rating = rating,
-agent = agent_name,
thread = thread
);
anyhow::Ok(())
@@ -206,15 +207,15 @@ impl ThreadFeedbackState {
self.comments_editor.take();
let session_id = thread.read(cx).session_id().clone();
-let agent_name = telemetry.agent_name();
+let agent = thread.read(cx).connection().telemetry_id();
let task = telemetry.thread_data(&session_id, cx);
cx.background_spawn(async move {
let thread = task.await?;
telemetry::event!(
"Agent Thread Feedback Comments",
+agent = agent,
session_id = session_id,
comments = comments,
-agent = agent_name,
thread = thread
);
anyhow::Ok(())
@@ -294,7 +295,6 @@ pub struct AcpThreadView {
resume_thread_metadata: Option<DbThreadMetadata>,
_cancel_task: Option<Task<()>>,
_subscriptions: [Subscription; 5],
-#[cfg(target_os = "windows")]
show_codex_windows_warning: bool,
}
@@ -337,19 +337,7 @@ impl AcpThreadView {
let prompt_capabilities = Rc::new(RefCell::new(acp::PromptCapabilities::default()));
let available_commands = Rc::new(RefCell::new(vec![]));
-let placeholder = if agent.name() == "Zed Agent" {
-format!("Message the {} — @ to include context", agent.name())
-} else if agent.name() == "Claude Code"
-|| agent.name() == "Codex"
-|| !available_commands.borrow().is_empty()
-{
-format!(
-"Message {} — @ to include context, / for commands",
-agent.name()
-)
-} else {
-format!("Message {} — @ to include context", agent.name())
-};
+let placeholder = placeholder_text(agent.name().as_ref(), false);
let message_editor = cx.new(|cx| {
let mut editor = MessageEditor::new(
@@ -401,7 +389,6 @@ impl AcpThreadView {
),
];
-#[cfg(target_os = "windows")]
let show_codex_windows_warning = crate::ExternalAgent::parse_built_in(agent.as_ref())
== Some(crate::ExternalAgent::Codex);
@@ -447,7 +434,6 @@ impl AcpThreadView {
focus_handle: cx.focus_handle(),
new_server_version_available: None,
resume_thread_metadata: resume_thread,
-#[cfg(target_os = "windows")]
show_codex_windows_warning,
}
}
@@ -541,14 +527,7 @@ impl AcpThreadView {
})
.log_err()
} else {
-let root_dir = if let Some(acp_agent) = connection
-.clone()
-.downcast::<agent_servers::AcpConnection>()
-{
-acp_agent.root_dir().into()
-} else {
-root_dir.unwrap_or(paths::home_dir().as_path().into())
-};
+let root_dir = root_dir.unwrap_or(paths::home_dir().as_path().into());
cx.update(|_, cx| {
connection
.clone()
@@ -1133,8 +1112,6 @@ impl AcpThreadView {
message_editor.contents(full_mention_content, cx)
});
-let agent_telemetry_id = self.agent.telemetry_id();
self.thread_error.take();
self.editing_message.take();
self.thread_feedback.clear();
@@ -1142,6 +1119,8 @@ impl AcpThreadView {
let Some(thread) = self.thread() else {
return;
};
+let agent_telemetry_id = self.agent.telemetry_id();
+let session_id = thread.read(cx).session_id().clone();
let thread = thread.downgrade();
if self.should_be_following {
self.workspace
@@ -1152,6 +1131,7 @@ impl AcpThreadView {
}
self.is_loading_contents = true;
+let model_id = self.current_model_id(cx);
let guard = cx.new(|_| ());
cx.observe_release(&guard, |this, _guard, cx| {
this.is_loading_contents = false;
@@ -1173,6 +1153,7 @@ impl AcpThreadView {
message_editor.clear(window, cx);
});
})?;
+let turn_start_time = Instant::now();
let send = thread.update(cx, |thread, cx| {
thread.action_log().update(cx, |action_log, cx| {
for buffer in tracked_buffers {
@@ -1181,11 +1162,27 @@ impl AcpThreadView {
});
drop(guard);
-telemetry::event!("Agent Message Sent", agent = agent_telemetry_id);
+telemetry::event!(
+"Agent Message Sent",
+agent = agent_telemetry_id,
+session = session_id,
+model = model_id
+);
thread.send(contents, cx)
})?;
-send.await
+let res = send.await;
+let turn_time_ms = turn_start_time.elapsed().as_millis();
+let status = if res.is_ok() { "success" } else { "failure" };
+telemetry::event!(
+"Agent Turn Completed",
+agent = agent_telemetry_id,
+session = session_id,
+model = model_id,
+status,
+turn_time_ms,
+);
+res
});
cx.spawn(async move |this, cx| {
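The turn-timing logic added in the hunk above can be sketched in isolation: start an `Instant` before the turn, then derive the status string from the `Result` once it completes. Names mirror the diff, but the surrounding `telemetry::event!` macro is Zed-specific, so this sketch just returns the values it would report:

```rust
use std::time::Instant;

// Returns the (status, turn_time_ms) pair the telemetry event would carry.
fn complete_turn(res: Result<(), String>) -> (&'static str, u128) {
    // Start the clock just before the turn is sent, as the diff does.
    let turn_start_time = Instant::now();
    // ... the agent turn would run here ...
    let turn_time_ms = turn_start_time.elapsed().as_millis();
    // Map the Result onto the status values used by "Agent Turn Completed".
    let status = if res.is_ok() { "success" } else { "failure" };
    (status, turn_time_ms)
}

fn main() {
    let (status, ms) = complete_turn(Ok(()));
    println!("status={status} turn_time_ms={ms}");
    let (status, _) = complete_turn(Err("refused".into()));
    println!("status={status}");
}
```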
@@ -1387,7 +1384,7 @@ impl AcpThreadView {
AcpThreadEvent::Refusal => {
self.thread_retry_status.take();
self.thread_error = Some(ThreadError::Refusal);
-let model_or_agent_name = self.get_current_model_name(cx);
+let model_or_agent_name = self.current_model_name(cx);
let notification_message =
format!("{} refused to respond to this request", model_or_agent_name);
self.notify_with_sound(&notification_message, IconName::Warning, window, cx);
@@ -1447,7 +1444,14 @@ impl AcpThreadView {
});
}
+let has_commands = !available_commands.is_empty();
self.available_commands.replace(available_commands);
+let new_placeholder = placeholder_text(self.agent.name().as_ref(), has_commands);
+self.message_editor.update(cx, |editor, cx| {
+editor.set_placeholder_text(&new_placeholder, window, cx);
+});
}
AcpThreadEvent::ModeUpdated(_mode) => {
// The connection keeps track of the mode
@@ -1856,6 +1860,14 @@ impl AcpThreadView {
let Some(thread) = self.thread() else {
return;
};
+telemetry::event!(
+"Agent Tool Call Authorized",
+agent = self.agent.telemetry_id(),
+session = thread.read(cx).session_id(),
+option = option_kind
+);
thread.update(cx, |thread, cx| {
thread.authorize_tool_call(tool_call_id, option_id, option_kind, cx);
});
@@ -3588,6 +3600,7 @@ impl AcpThreadView {
) -> Option<AnyElement> {
let thread = thread_entity.read(cx);
let action_log = thread.action_log();
+let telemetry = ActionLogTelemetry::from(thread);
let changed_buffers = action_log.read(cx).changed_buffers(cx);
let plan = thread.plan();
@@ -3635,6 +3648,7 @@ impl AcpThreadView {
.when(self.edits_expanded, |parent| {
parent.child(self.render_edited_files(
action_log,
+telemetry,
&changed_buffers,
pending_edits,
cx,
@@ -3915,6 +3929,7 @@ impl AcpThreadView {
fn render_edited_files(
&self,
action_log: &Entity<ActionLog>,
+telemetry: ActionLogTelemetry,
changed_buffers: &BTreeMap<Entity<Buffer>, Entity<BufferDiff>>,
pending_edits: bool,
cx: &Context<Self>,
@@ -4034,12 +4049,14 @@ impl AcpThreadView {
.on_click({
let buffer = buffer.clone();
let action_log = action_log.clone();
+let telemetry = telemetry.clone();
move |_, _, cx| {
action_log.update(cx, |action_log, cx| {
action_log
.reject_edits_in_ranges(
buffer.clone(),
vec![Anchor::MIN..Anchor::MAX],
+Some(telemetry.clone()),
cx,
)
.detach_and_log_err(cx);
@@ -4054,11 +4071,13 @@ impl AcpThreadView {
.on_click({
let buffer = buffer.clone();
let action_log = action_log.clone();
+let telemetry = telemetry.clone();
move |_, _, cx| {
action_log.update(cx, |action_log, cx| {
action_log.keep_edits_in_range(
buffer.clone(),
Anchor::MIN..Anchor::MAX,
+Some(telemetry.clone()),
cx,
);
})
@@ -4274,17 +4293,23 @@ impl AcpThreadView {
let Some(thread) = self.thread() else {
return;
};
+let telemetry = ActionLogTelemetry::from(thread.read(cx));
let action_log = thread.read(cx).action_log().clone();
-action_log.update(cx, |action_log, cx| action_log.keep_all_edits(cx));
+action_log.update(cx, |action_log, cx| {
+action_log.keep_all_edits(Some(telemetry), cx)
+});
}
fn reject_all(&mut self, _: &RejectAll, _window: &mut Window, cx: &mut Context<Self>) {
let Some(thread) = self.thread() else {
return;
};
+let telemetry = ActionLogTelemetry::from(thread.read(cx));
let action_log = thread.read(cx).action_log().clone();
action_log
-.update(cx, |action_log, cx| action_log.reject_all_edits(cx))
+.update(cx, |action_log, cx| {
+action_log.reject_all_edits(Some(telemetry), cx)
+})
.detach();
}
@@ -4680,35 +4705,36 @@ impl AcpThreadView {
.languages
.language_for_name("Markdown");
-let (thread_summary, markdown) = if let Some(thread) = self.thread() {
+let (thread_title, markdown) = if let Some(thread) = self.thread() {
let thread = thread.read(cx);
(thread.title().to_string(), thread.to_markdown(cx))
} else {
return Task::ready(Ok(()));
};
+let project = workspace.read(cx).project().clone();
window.spawn(cx, async move |cx| {
let markdown_language = markdown_language_task.await?;
+let buffer = project
+.update(cx, |project, cx| project.create_buffer(false, cx))?
+.await?;
+buffer.update(cx, |buffer, cx| {
+buffer.set_text(markdown, cx);
+buffer.set_language(Some(markdown_language), cx);
+buffer.set_capability(language::Capability::ReadOnly, cx);
+})?;
workspace.update_in(cx, |workspace, window, cx| {
-let project = workspace.project().clone();
-if !project.read(cx).is_local() {
-bail!("failed to open active thread as markdown in remote project");
-}
-let buffer = project.update(cx, |project, cx| {
-project.create_local_buffer(&markdown, Some(markdown_language), true, cx)
-});
-let buffer = cx.new(|cx| {
-MultiBuffer::singleton(buffer, cx).with_title(thread_summary.clone())
-});
+let buffer = cx
+.new(|cx| MultiBuffer::singleton(buffer, cx).with_title(thread_title.clone()));
workspace.add_item_to_active_pane(
Box::new(cx.new(|cx| {
let mut editor =
Editor::for_multibuffer(buffer, Some(project.clone()), window, cx);
-editor.set_breadcrumb_header(thread_summary);
+editor.set_breadcrumb_header(thread_title);
editor
})),
None,
@@ -4716,9 +4742,7 @@ impl AcpThreadView {
window,
cx,
);
-anyhow::Ok(())
-})??;
+})?;
anyhow::Ok(())
})
}
@@ -5252,7 +5276,6 @@ impl AcpThreadView {
)
}
-#[cfg(target_os = "windows")]
fn render_codex_windows_warning(&self, cx: &mut Context<Self>) -> Option<Callout> {
if self.show_codex_windows_warning {
Some(
@@ -5268,8 +5291,9 @@ impl AcpThreadView {
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.on_click(cx.listener({
-move |_, _, window, cx| {
-window.dispatch_action(
+move |_, _, _window, cx| {
+#[cfg(windows)]
+_window.dispatch_action(
zed_actions::wsl_actions::OpenWsl::default().boxed_clone(),
cx,
);
@@ -5344,20 +5368,21 @@ impl AcpThreadView {
)
}
-fn get_current_model_name(&self, cx: &App) -> SharedString {
+fn current_model_id(&self, cx: &App) -> Option<String> {
+self.model_selector
+.as_ref()
+.and_then(|selector| selector.read(cx).active_model(cx).map(|m| m.id.to_string()))
+}
+fn current_model_name(&self, cx: &App) -> SharedString {
// For native agent (Zed Agent), use the specific model name (e.g., "Claude 3.5 Sonnet")
// For ACP agents, use the agent name (e.g., "Claude Code", "Gemini CLI")
// This provides better clarity about what refused the request
-if self
-.agent
-.clone()
-.downcast::<agent::NativeAgentServer>()
-.is_some()
-{
-// Native agent - use the model name
+if self.as_native_connection(cx).is_some() {
self.model_selector
.as_ref()
-.and_then(|selector| selector.read(cx).active_model_name(cx))
+.and_then(|selector| selector.read(cx).active_model(cx))
+.map(|model| model.name.clone())
.unwrap_or_else(|| SharedString::from("The model"))
} else {
// ACP agent - use the agent name (e.g., "Claude Code", "Gemini CLI")
@@ -5366,7 +5391,7 @@ impl AcpThreadView {
}
fn render_refusal_error(&self, cx: &mut Context<'_, Self>) -> Callout {
-let model_or_agent_name = self.get_current_model_name(cx);
+let model_or_agent_name = self.current_model_name(cx);
let refusal_message = format!(
"{} refused to respond to this prompt. This can happen when a model believes the prompt violates its content policy or safety guidelines, so rephrasing it can sometimes address the issue.",
model_or_agent_name
@@ -5678,6 +5703,19 @@ fn loading_contents_spinner(size: IconSize) -> AnyElement {
.into_any_element()
}
fn placeholder_text(agent_name: &str, has_commands: bool) -> String {
if agent_name == "Zed Agent" {
format!("Message the {} — @ to include context", agent_name)
} else if has_commands {
format!(
"Message {} — @ to include context, / for commands",
agent_name
)
} else {
format!("Message {} — @ to include context", agent_name)
}
}
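The `placeholder_text` helper above is a pure function, so it can be checked standalone. This sketch repeats its body verbatim from the diff (the em dash in the string literals is part of the original UI text) and exercises all three branches:

```rust
// Verbatim from the diff: the message-editor placeholder for a given agent.
fn placeholder_text(agent_name: &str, has_commands: bool) -> String {
    if agent_name == "Zed Agent" {
        format!("Message the {} — @ to include context", agent_name)
    } else if has_commands {
        format!(
            "Message {} — @ to include context, / for commands",
            agent_name
        )
    } else {
        format!("Message {} — @ to include context", agent_name)
    }
}

fn main() {
    // The native agent gets the definite article; others don't.
    assert!(placeholder_text("Zed Agent", false).starts_with("Message the"));
    // Agents with slash commands get the "/ for commands" hint.
    assert!(placeholder_text("Claude Code", true).contains("/ for commands"));
    assert!(!placeholder_text("Gemini CLI", false).contains("/ for commands"));
    println!("ok");
}
```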
impl Focusable for AcpThreadView {
fn focus_handle(&self, cx: &App) -> FocusHandle {
match self.thread_state {
@@ -5772,13 +5810,10 @@ impl Render for AcpThreadView {
})
.children(self.render_thread_retry_status_callout(window, cx))
.children({
-#[cfg(target_os = "windows")]
-{
+if cfg!(windows) && self.project.read(cx).is_local() {
self.render_codex_windows_warning(cx)
-}
-#[cfg(not(target_os = "windows"))]
-{
-Vec::<Empty>::new()
+} else {
+None
}
})
.children(self.render_thread_error(cx))
@@ -5967,7 +6002,6 @@ pub(crate) mod tests {
use acp_thread::StubAgentConnection;
use agent_client_protocol::SessionId;
use assistant_text_thread::TextThreadStore;
-use editor::EditorSettings;
use fs::FakeFs;
use gpui::{EventEmitter, SemanticVersion, TestAppContext, VisualTestContext};
use project::Project;
@@ -6355,6 +6389,10 @@ pub(crate) mod tests {
struct SaboteurAgentConnection;
impl AgentConnection for SaboteurAgentConnection {
+fn telemetry_id(&self) -> &'static str {
+"saboteur"
+}
fn new_thread(
self: Rc<Self>,
project: Entity<Project>,
@@ -6415,6 +6453,10 @@ pub(crate) mod tests {
struct RefusalAgentConnection;
impl AgentConnection for RefusalAgentConnection {
+fn telemetry_id(&self) -> &'static str {
+"refusal"
+}
fn new_thread(
self: Rc<Self>,
project: Entity<Project>,
@@ -6477,13 +6519,8 @@ pub(crate) mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
-language::init(cx);
-Project::init_settings(cx);
-AgentSettings::register(cx);
-workspace::init_settings(cx);
theme::init(theme::LoadThemes::JustBase, cx);
release_channel::init(SemanticVersion::default(), cx);
-EditorSettings::register(cx);
prompt_store::init(cx)
});
}
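The test stubs above now implement the `telemetry_id` method added to the connection trait. The pattern is a plain `&'static str` trait method, one stable identifier per connection type; the trait shape here is a guess at the relevant slice of `AgentConnection`:

```rust
// A guess at the relevant slice of the AgentConnection trait: each stub
// connection reports a stable identifier used in telemetry events.
trait AgentConnection {
    fn telemetry_id(&self) -> &'static str;
}

struct SaboteurAgentConnection;
impl AgentConnection for SaboteurAgentConnection {
    fn telemetry_id(&self) -> &'static str {
        "saboteur"
    }
}

struct RefusalAgentConnection;
impl AgentConnection for RefusalAgentConnection {
    fn telemetry_id(&self) -> &'static str {
        "refusal"
    }
}

fn main() {
    println!("{}", SaboteurAgentConnection.telemetry_id());
    println!("{}", RefusalAgentConnection.telemetry_id());
}
```

Using a `&'static str` (rather than a `String`) keeps the identifier allocation-free and makes clear it is a compile-time constant per connection type.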

View File

@@ -638,15 +638,13 @@ impl AgentConfiguration {
let is_running = matches!(server_status, ContextServerStatus::Running);
let item_id = SharedString::from(context_server_id.0.clone());
-let is_from_extension = server_configuration
-.as_ref()
-.map(|config| {
-matches!(
-config.as_ref(),
-ContextServerConfiguration::Extension { .. }
-)
-})
-.unwrap_or(false);
+// Servers without a configuration can only be provided by extensions.
+let provided_by_extension = server_configuration.is_none_or(|config| {
+matches!(
+config.as_ref(),
+ContextServerConfiguration::Extension { .. }
+)
+});
let error = if let ContextServerStatus::Error(error) = server_status.clone() {
Some(error)
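The refactor leans on `Option::is_none_or` (stable since Rust 1.82): `None` maps to `true`, so a server with no stored configuration is treated as extension-provided, which is the behavior change versus the old `.map(..).unwrap_or(false)`. A minimal sketch with a stand-in config enum (the variants here are simplified placeholders for `ContextServerConfiguration`):

```rust
// Simplified stand-in for ContextServerConfiguration.
enum ContextServerConfiguration {
    Extension,
    Custom,
}

fn provided_by_extension(config: Option<&ContextServerConfiguration>) -> bool {
    // None means no explicit configuration, which only extensions produce,
    // so is_none_or yields true for the None case.
    config.is_none_or(|c| matches!(c, ContextServerConfiguration::Extension))
}

fn main() {
    println!("{}", provided_by_extension(None));
    println!("{}", provided_by_extension(Some(&ContextServerConfiguration::Extension)));
    println!("{}", provided_by_extension(Some(&ContextServerConfiguration::Custom)));
}
```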
@@ -660,7 +658,7 @@ impl AgentConfiguration {
.tools_for_server(&context_server_id)
.count();
-let (source_icon, source_tooltip) = if is_from_extension {
+let (source_icon, source_tooltip) = if provided_by_extension {
(
IconName::ZedSrcExtension,
"This MCP server was installed from an extension.",
@@ -710,7 +708,6 @@ impl AgentConfiguration {
let fs = self.fs.clone();
let context_server_id = context_server_id.clone();
let language_registry = self.language_registry.clone();
-let context_server_store = self.context_server_store.clone();
let workspace = self.workspace.clone();
let context_server_registry = self.context_server_registry.clone();
@@ -752,23 +749,10 @@ impl AgentConfiguration {
.entry("Uninstall", None, {
let fs = fs.clone();
let context_server_id = context_server_id.clone();
-let context_server_store = context_server_store.clone();
let workspace = workspace.clone();
move |_, cx| {
-let is_provided_by_extension = context_server_store
-.read(cx)
-.configuration_for_server(&context_server_id)
-.as_ref()
-.map(|config| {
-matches!(
-config.as_ref(),
-ContextServerConfiguration::Extension { .. }
-)
-})
-.unwrap_or(false);
let uninstall_extension_task = match (
-is_provided_by_extension,
+provided_by_extension,
resolve_extension_for_context_server(&context_server_id, cx),
) {
(true, Some((id, manifest))) => {
@@ -1047,7 +1031,7 @@ impl AgentConfiguration {
AgentIcon::Name(icon_name) => Icon::new(icon_name)
.size(IconSize::Small)
.color(Color::Muted),
-AgentIcon::Path(icon_path) => Icon::from_path(icon_path)
+AgentIcon::Path(icon_path) => Icon::from_external_svg(icon_path)
.size(IconSize::Small)
.color(Color::Muted),
};

View File

@@ -515,16 +515,14 @@ impl Render for AddLlmProviderModal {
#[cfg(test)]
mod tests {
use super::*;
-use editor::EditorSettings;
use fs::FakeFs;
use gpui::{TestAppContext, VisualTestContext};
-use language::language_settings;
use language_model::{
LanguageModelProviderId, LanguageModelProviderName,
fake_provider::FakeLanguageModelProvider,
};
use project::Project;
-use settings::{Settings as _, SettingsStore};
+use settings::SettingsStore;
use util::path;
#[gpui::test]
@@ -730,13 +728,9 @@ mod tests {
cx.update(|cx| {
let store = SettingsStore::test(cx);
cx.set_global(store);
-workspace::init_settings(cx);
-Project::init_settings(cx);
theme::init(theme::LoadThemes::JustBase, cx);
-language_settings::init(cx);
-EditorSettings::register(cx);
language_model::init_settings(cx);
-language_models::init_settings(cx);
});
let fs = FakeFs::new(cx.executor());

View File

@@ -7,8 +7,10 @@ use agent_settings::{AgentProfile, AgentProfileId, AgentSettings, builtin_profil
use editor::Editor;
use fs::Fs;
use gpui::{DismissEvent, Entity, EventEmitter, FocusHandle, Focusable, Subscription, prelude::*};
-use language_model::LanguageModel;
+use language_model::{LanguageModel, LanguageModelRegistry};
-use settings::Settings as _;
+use settings::{
+LanguageModelProviderSetting, LanguageModelSelection, Settings as _, update_settings_file,
+};
use ui::{
KeyBinding, ListItem, ListItemSpacing, ListSeparator, Navigable, NavigableEntry, prelude::*,
};
@@ -16,6 +18,7 @@ use workspace::{ModalView, Workspace};
use crate::agent_configuration::manage_profiles_modal::profile_modal_header::ProfileModalHeader; use crate::agent_configuration::manage_profiles_modal::profile_modal_header::ProfileModalHeader;
use crate::agent_configuration::tool_picker::{ToolPicker, ToolPickerDelegate}; use crate::agent_configuration::tool_picker::{ToolPicker, ToolPickerDelegate};
use crate::language_model_selector::{LanguageModelSelector, language_model_selector};
use crate::{AgentPanel, ManageProfiles}; use crate::{AgentPanel, ManageProfiles};
enum Mode { enum Mode {
@@ -32,6 +35,11 @@ enum Mode {
tool_picker: Entity<ToolPicker>, tool_picker: Entity<ToolPicker>,
_subscription: Subscription, _subscription: Subscription,
}, },
ConfigureDefaultModel {
profile_id: AgentProfileId,
model_picker: Entity<LanguageModelSelector>,
_subscription: Subscription,
},
} }
impl Mode { impl Mode {
@@ -83,6 +91,7 @@ pub struct ChooseProfileMode {
pub struct ViewProfileMode { pub struct ViewProfileMode {
profile_id: AgentProfileId, profile_id: AgentProfileId,
fork_profile: NavigableEntry, fork_profile: NavigableEntry,
configure_default_model: NavigableEntry,
configure_tools: NavigableEntry, configure_tools: NavigableEntry,
configure_mcps: NavigableEntry, configure_mcps: NavigableEntry,
cancel_item: NavigableEntry, cancel_item: NavigableEntry,
@@ -180,6 +189,7 @@ impl ManageProfilesModal {
self.mode = Mode::ViewProfile(ViewProfileMode { self.mode = Mode::ViewProfile(ViewProfileMode {
profile_id, profile_id,
fork_profile: NavigableEntry::focusable(cx), fork_profile: NavigableEntry::focusable(cx),
configure_default_model: NavigableEntry::focusable(cx),
configure_tools: NavigableEntry::focusable(cx), configure_tools: NavigableEntry::focusable(cx),
configure_mcps: NavigableEntry::focusable(cx), configure_mcps: NavigableEntry::focusable(cx),
cancel_item: NavigableEntry::focusable(cx), cancel_item: NavigableEntry::focusable(cx),
@@ -187,6 +197,83 @@ impl ManageProfilesModal {
self.focus_handle(cx).focus(window); self.focus_handle(cx).focus(window);
} }
fn configure_default_model(
&mut self,
profile_id: AgentProfileId,
window: &mut Window,
cx: &mut Context<Self>,
) {
let fs = self.fs.clone();
let profile_id_for_closure = profile_id.clone();
let model_picker = cx.new(|cx| {
let fs = fs.clone();
let profile_id = profile_id_for_closure.clone();
language_model_selector(
{
let profile_id = profile_id.clone();
move |cx| {
let settings = AgentSettings::get_global(cx);
settings
.profiles
.get(&profile_id)
.and_then(|profile| profile.default_model.as_ref())
.and_then(|selection| {
let registry = LanguageModelRegistry::read_global(cx);
let provider_id = language_model::LanguageModelProviderId(
gpui::SharedString::from(selection.provider.0.clone()),
);
let provider = registry.provider(&provider_id)?;
let model = provider
.provided_models(cx)
.iter()
.find(|m| m.id().0 == selection.model.as_str())?
.clone();
Some(language_model::ConfiguredModel { provider, model })
})
}
},
move |model, cx| {
let provider = model.provider_id().0.to_string();
let model_id = model.id().0.to_string();
let profile_id = profile_id.clone();
update_settings_file(fs.clone(), cx, move |settings, _cx| {
let agent_settings = settings.agent.get_or_insert_default();
if let Some(profiles) = agent_settings.profiles.as_mut() {
if let Some(profile) = profiles.get_mut(profile_id.0.as_ref()) {
profile.default_model = Some(LanguageModelSelection {
provider: LanguageModelProviderSetting(provider.clone()),
model: model_id.clone(),
});
}
}
});
},
false, // Do not use popover styles for the model picker
window,
cx,
)
.modal(false)
});
let dismiss_subscription = cx.subscribe_in(&model_picker, window, {
let profile_id = profile_id.clone();
move |this, _picker, _: &DismissEvent, window, cx| {
this.view_profile(profile_id.clone(), window, cx);
}
});
self.mode = Mode::ConfigureDefaultModel {
profile_id,
model_picker,
_subscription: dismiss_subscription,
};
self.focus_handle(cx).focus(window);
}
fn configure_mcp_tools( fn configure_mcp_tools(
&mut self, &mut self,
profile_id: AgentProfileId, profile_id: AgentProfileId,
@@ -277,6 +364,7 @@ impl ManageProfilesModal {
Mode::ViewProfile(_) => {} Mode::ViewProfile(_) => {}
Mode::ConfigureTools { .. } => {} Mode::ConfigureTools { .. } => {}
Mode::ConfigureMcps { .. } => {} Mode::ConfigureMcps { .. } => {}
Mode::ConfigureDefaultModel { .. } => {}
} }
} }
@@ -299,6 +387,9 @@ impl ManageProfilesModal {
Mode::ConfigureMcps { profile_id, .. } => { Mode::ConfigureMcps { profile_id, .. } => {
self.view_profile(profile_id.clone(), window, cx) self.view_profile(profile_id.clone(), window, cx)
} }
Mode::ConfigureDefaultModel { profile_id, .. } => {
self.view_profile(profile_id.clone(), window, cx)
}
} }
} }
} }
@@ -313,6 +404,7 @@ impl Focusable for ManageProfilesModal {
Mode::ViewProfile(_) => self.focus_handle.clone(), Mode::ViewProfile(_) => self.focus_handle.clone(),
Mode::ConfigureTools { tool_picker, .. } => tool_picker.focus_handle(cx), Mode::ConfigureTools { tool_picker, .. } => tool_picker.focus_handle(cx),
Mode::ConfigureMcps { tool_picker, .. } => tool_picker.focus_handle(cx), Mode::ConfigureMcps { tool_picker, .. } => tool_picker.focus_handle(cx),
Mode::ConfigureDefaultModel { model_picker, .. } => model_picker.focus_handle(cx),
} }
} }
} }
@@ -544,6 +636,47 @@ impl ManageProfilesModal {
}), }),
), ),
) )
.child(
div()
.id("configure-default-model")
.track_focus(&mode.configure_default_model.focus_handle)
.on_action({
let profile_id = mode.profile_id.clone();
cx.listener(move |this, _: &menu::Confirm, window, cx| {
this.configure_default_model(
profile_id.clone(),
window,
cx,
);
})
})
.child(
ListItem::new("model-item")
.toggle_state(
mode.configure_default_model
.focus_handle
.contains_focused(window, cx),
)
.inset(true)
.spacing(ListItemSpacing::Sparse)
.start_slot(
Icon::new(IconName::ZedAssistant)
.size(IconSize::Small)
.color(Color::Muted),
)
.child(Label::new("Configure Default Model"))
.on_click({
let profile_id = mode.profile_id.clone();
cx.listener(move |this, _, window, cx| {
this.configure_default_model(
profile_id.clone(),
window,
cx,
);
})
}),
),
)
.child( .child(
div() div()
.id("configure-builtin-tools") .id("configure-builtin-tools")
@@ -668,6 +801,7 @@ impl ManageProfilesModal {
.into_any_element(), .into_any_element(),
) )
.entry(mode.fork_profile) .entry(mode.fork_profile)
.entry(mode.configure_default_model)
.entry(mode.configure_tools) .entry(mode.configure_tools)
.entry(mode.configure_mcps) .entry(mode.configure_mcps)
.entry(mode.cancel_item) .entry(mode.cancel_item)
@@ -753,6 +887,29 @@ impl Render for ManageProfilesModal {
.child(go_back_item) .child(go_back_item)
.into_any_element() .into_any_element()
} }
Mode::ConfigureDefaultModel {
profile_id,
model_picker,
..
} => {
let profile_name = settings
.profiles
.get(profile_id)
.map(|profile| profile.name.clone())
.unwrap_or_else(|| "Unknown".into());
v_flex()
.pb_1()
.child(ProfileModalHeader::new(
format!("{profile_name} — Configure Default Model"),
Some(IconName::Ai),
))
.child(ListSeparator)
.child(v_flex().w(rems(34.)).child(model_picker.clone()))
.child(ListSeparator)
.child(go_back_item)
.into_any_element()
}
Mode::ConfigureMcps { Mode::ConfigureMcps {
profile_id, profile_id,
tool_picker, tool_picker,

View File

@@ -314,6 +314,7 @@ impl PickerDelegate for ToolPickerDelegate {
 )
 })
 .collect(),
+default_model: default_profile.default_model.clone(),
 });
 if let Some(server_id) = server_id {

View File

@@ -1,6 +1,6 @@
 use crate::{Keep, KeepAll, OpenAgentDiff, Reject, RejectAll};
 use acp_thread::{AcpThread, AcpThreadEvent};
-use action_log::ActionLog;
+use action_log::ActionLogTelemetry;
 use agent_settings::AgentSettings;
 use anyhow::Result;
 use buffer_diff::DiffHunkStatus;
@@ -40,79 +40,16 @@ use zed_actions::assistant::ToggleFocus;
 pub struct AgentDiffPane {
 multibuffer: Entity<MultiBuffer>,
 editor: Entity<Editor>,
-thread: AgentDiffThread,
+thread: Entity<AcpThread>,
 focus_handle: FocusHandle,
 workspace: WeakEntity<Workspace>,
 title: SharedString,
 _subscriptions: Vec<Subscription>,
 }
-#[derive(PartialEq, Eq, Clone)]
-pub enum AgentDiffThread {
-AcpThread(Entity<AcpThread>),
-}
-impl AgentDiffThread {
-fn project(&self, cx: &App) -> Entity<Project> {
-match self {
-AgentDiffThread::AcpThread(thread) => thread.read(cx).project().clone(),
-}
-}
-fn action_log(&self, cx: &App) -> Entity<ActionLog> {
-match self {
-AgentDiffThread::AcpThread(thread) => thread.read(cx).action_log().clone(),
-}
-}
-fn title(&self, cx: &App) -> SharedString {
-match self {
-AgentDiffThread::AcpThread(thread) => thread.read(cx).title(),
-}
-}
-fn has_pending_edit_tool_uses(&self, cx: &App) -> bool {
-match self {
-AgentDiffThread::AcpThread(thread) => thread.read(cx).has_pending_edit_tool_calls(),
-}
-}
-fn downgrade(&self) -> WeakAgentDiffThread {
-match self {
-AgentDiffThread::AcpThread(thread) => {
-WeakAgentDiffThread::AcpThread(thread.downgrade())
-}
-}
-}
-}
-impl From<Entity<AcpThread>> for AgentDiffThread {
-fn from(entity: Entity<AcpThread>) -> Self {
-AgentDiffThread::AcpThread(entity)
-}
-}
-#[derive(PartialEq, Eq, Clone)]
-pub enum WeakAgentDiffThread {
-AcpThread(WeakEntity<AcpThread>),
-}
-impl WeakAgentDiffThread {
-pub fn upgrade(&self) -> Option<AgentDiffThread> {
-match self {
-WeakAgentDiffThread::AcpThread(weak) => weak.upgrade().map(AgentDiffThread::AcpThread),
-}
-}
-}
-impl From<WeakEntity<AcpThread>> for WeakAgentDiffThread {
-fn from(entity: WeakEntity<AcpThread>) -> Self {
-WeakAgentDiffThread::AcpThread(entity)
-}
-}
 impl AgentDiffPane {
 pub fn deploy(
-thread: impl Into<AgentDiffThread>,
+thread: Entity<AcpThread>,
 workspace: WeakEntity<Workspace>,
 window: &mut Window,
 cx: &mut App,
@@ -123,12 +60,11 @@ impl AgentDiffPane {
 }
 pub fn deploy_in_workspace(
-thread: impl Into<AgentDiffThread>,
+thread: Entity<AcpThread>,
 workspace: &mut Workspace,
 window: &mut Window,
 cx: &mut Context<Workspace>,
 ) -> Entity<Self> {
-let thread = thread.into();
 let existing_diff = workspace
 .items_of_type::<AgentDiffPane>(cx)
 .find(|diff| diff.read(cx).thread == thread);
@@ -145,7 +81,7 @@ impl AgentDiffPane {
 }
 pub fn new(
-thread: AgentDiffThread,
+thread: Entity<AcpThread>,
 workspace: WeakEntity<Workspace>,
 window: &mut Window,
 cx: &mut Context<Self>,
@@ -153,7 +89,7 @@ impl AgentDiffPane {
 let focus_handle = cx.focus_handle();
 let multibuffer = cx.new(|_| MultiBuffer::new(Capability::ReadWrite));
-let project = thread.project(cx);
+let project = thread.read(cx).project().clone();
 let editor = cx.new(|cx| {
 let mut editor =
 Editor::for_multibuffer(multibuffer.clone(), Some(project.clone()), window, cx);
@@ -164,19 +100,16 @@ impl AgentDiffPane {
 editor
 });
-let action_log = thread.action_log(cx);
+let action_log = thread.read(cx).action_log().clone();
 let mut this = Self {
 _subscriptions: vec![
 cx.observe_in(&action_log, window, |this, _action_log, window, cx| {
 this.update_excerpts(window, cx)
 }),
-match &thread {
-AgentDiffThread::AcpThread(thread) => cx
-.subscribe(thread, |this, _thread, event, cx| {
-this.handle_acp_thread_event(event, cx)
-}),
-},
+cx.subscribe(&thread, |this, _thread, event, cx| {
+this.handle_acp_thread_event(event, cx)
+}),
 ],
 title: SharedString::default(),
 multibuffer,
@@ -191,7 +124,12 @@ impl AgentDiffPane {
 }
 fn update_excerpts(&mut self, window: &mut Window, cx: &mut Context<Self>) {
-let changed_buffers = self.thread.action_log(cx).read(cx).changed_buffers(cx);
+let changed_buffers = self
+.thread
+.read(cx)
+.action_log()
+.read(cx)
+.changed_buffers(cx);
 let mut paths_to_delete = self.multibuffer.read(cx).paths().collect::<HashSet<_>>();
 for (buffer, diff_handle) in changed_buffers {
@@ -278,7 +216,7 @@ impl AgentDiffPane {
 }
 fn update_title(&mut self, cx: &mut Context<Self>) {
-let new_title = self.thread.title(cx);
+let new_title = self.thread.read(cx).title();
 if new_title != self.title {
 self.title = new_title;
 cx.emit(EditorEvent::TitleChanged);
@@ -340,16 +278,18 @@ impl AgentDiffPane {
 }
 fn keep_all(&mut self, _: &KeepAll, _window: &mut Window, cx: &mut Context<Self>) {
-self.thread
-.action_log(cx)
-.update(cx, |action_log, cx| action_log.keep_all_edits(cx))
+let telemetry = ActionLogTelemetry::from(self.thread.read(cx));
+let action_log = self.thread.read(cx).action_log().clone();
+action_log.update(cx, |action_log, cx| {
+action_log.keep_all_edits(Some(telemetry), cx)
+});
 }
 }
 fn keep_edits_in_selection(
 editor: &mut Editor,
 buffer_snapshot: &MultiBufferSnapshot,
-thread: &AgentDiffThread,
+thread: &Entity<AcpThread>,
 window: &mut Window,
 cx: &mut Context<Editor>,
 ) {
@@ -364,7 +304,7 @@ fn keep_edits_in_selection(
 fn reject_edits_in_selection(
 editor: &mut Editor,
 buffer_snapshot: &MultiBufferSnapshot,
-thread: &AgentDiffThread,
+thread: &Entity<AcpThread>,
 window: &mut Window,
 cx: &mut Context<Editor>,
 ) {
@@ -378,7 +318,7 @@ fn reject_edits_in_selection(
 fn keep_edits_in_ranges(
 editor: &mut Editor,
 buffer_snapshot: &MultiBufferSnapshot,
-thread: &AgentDiffThread,
+thread: &Entity<AcpThread>,
 ranges: Vec<Range<editor::Anchor>>,
 window: &mut Window,
 cx: &mut Context<Editor>,
@@ -393,8 +333,15 @@ fn keep_edits_in_ranges(
 for hunk in &diff_hunks_in_ranges {
 let buffer = multibuffer.read(cx).buffer(hunk.buffer_id);
 if let Some(buffer) = buffer {
-thread.action_log(cx).update(cx, |action_log, cx| {
-action_log.keep_edits_in_range(buffer, hunk.buffer_range.clone(), cx)
+let action_log = thread.read(cx).action_log().clone();
+let telemetry = ActionLogTelemetry::from(thread.read(cx));
+action_log.update(cx, |action_log, cx| {
+action_log.keep_edits_in_range(
+buffer,
+hunk.buffer_range.clone(),
+Some(telemetry),
+cx,
+)
 });
 }
 }
@@ -403,7 +350,7 @@ fn keep_edits_in_ranges(
 fn reject_edits_in_ranges(
 editor: &mut Editor,
 buffer_snapshot: &MultiBufferSnapshot,
-thread: &AgentDiffThread,
+thread: &Entity<AcpThread>,
 ranges: Vec<Range<editor::Anchor>>,
 window: &mut Window,
 cx: &mut Context<Editor>,
@@ -427,11 +374,12 @@ fn reject_edits_in_ranges(
 }
 }
+let action_log = thread.read(cx).action_log().clone();
+let telemetry = ActionLogTelemetry::from(thread.read(cx));
 for (buffer, ranges) in ranges_by_buffer {
-thread
-.action_log(cx)
+action_log
 .update(cx, |action_log, cx| {
-action_log.reject_edits_in_ranges(buffer, ranges, cx)
+action_log.reject_edits_in_ranges(buffer, ranges, Some(telemetry.clone()), cx)
 })
 .detach_and_log_err(cx);
 }
@@ -531,7 +479,7 @@ impl Item for AgentDiffPane {
 }
 fn tab_content(&self, params: TabContentParams, _window: &Window, cx: &App) -> AnyElement {
-let title = self.thread.title(cx);
+let title = self.thread.read(cx).title();
 Label::new(format!("Review: {}", title))
 .color(if params.selected {
 Color::Default
@@ -712,7 +660,7 @@ impl Render for AgentDiffPane {
 }
 }
-fn diff_hunk_controls(thread: &AgentDiffThread) -> editor::RenderDiffHunkControlsFn {
+fn diff_hunk_controls(thread: &Entity<AcpThread>) -> editor::RenderDiffHunkControlsFn {
 let thread = thread.clone();
 Arc::new(
@@ -739,7 +687,7 @@ fn render_diff_hunk_controls(
 hunk_range: Range<editor::Anchor>,
 is_created_file: bool,
 line_height: Pixels,
-thread: &AgentDiffThread,
+thread: &Entity<AcpThread>,
 editor: &Entity<Editor>,
 cx: &mut App,
 ) -> AnyElement {
@@ -1153,8 +1101,11 @@ impl Render for AgentDiffToolbar {
 return Empty.into_any();
 };
-let has_pending_edit_tool_use =
-agent_diff.read(cx).thread.has_pending_edit_tool_uses(cx);
+let has_pending_edit_tool_use = agent_diff
+.read(cx)
+.thread
+.read(cx)
+.has_pending_edit_tool_calls();
 if has_pending_edit_tool_use {
 return div().px_2().child(spinner_icon).into_any();
@@ -1214,7 +1165,7 @@ pub enum EditorState {
 }
 struct WorkspaceThread {
-thread: WeakAgentDiffThread,
+thread: WeakEntity<AcpThread>,
 _thread_subscriptions: (Subscription, Subscription),
 singleton_editors: HashMap<WeakEntity<Buffer>, HashMap<WeakEntity<Editor>, Subscription>>,
 _settings_subscription: Subscription,
@@ -1239,23 +1190,23 @@ impl AgentDiff {
 pub fn set_active_thread(
 workspace: &WeakEntity<Workspace>,
-thread: impl Into<AgentDiffThread>,
+thread: Entity<AcpThread>,
 window: &mut Window,
 cx: &mut App,
 ) {
 Self::global(cx).update(cx, |this, cx| {
-this.register_active_thread_impl(workspace, thread.into(), window, cx);
+this.register_active_thread_impl(workspace, thread, window, cx);
 });
 }
 fn register_active_thread_impl(
 &mut self,
 workspace: &WeakEntity<Workspace>,
-thread: AgentDiffThread,
+thread: Entity<AcpThread>,
 window: &mut Window,
 cx: &mut Context<Self>,
 ) {
-let action_log = thread.action_log(cx);
+let action_log = thread.read(cx).action_log().clone();
 let action_log_subscription = cx.observe_in(&action_log, window, {
 let workspace = workspace.clone();
@@ -1264,14 +1215,12 @@ impl AgentDiff {
 }
 });
-let thread_subscription = match &thread {
-AgentDiffThread::AcpThread(thread) => cx.subscribe_in(thread, window, {
-let workspace = workspace.clone();
-move |this, thread, event, window, cx| {
-this.handle_acp_thread_event(&workspace, thread, event, window, cx)
-}
-}),
-};
+let thread_subscription = cx.subscribe_in(&thread, window, {
+let workspace = workspace.clone();
+move |this, thread, event, window, cx| {
+this.handle_acp_thread_event(&workspace, thread, event, window, cx)
+}
+});
 if let Some(workspace_thread) = self.workspace_threads.get_mut(workspace) {
 // replace thread and action log subscription, but keep editors
@@ -1348,7 +1297,7 @@ impl AgentDiff {
 fn register_review_action<T: Action>(
 workspace: &mut Workspace,
-review: impl Fn(&Entity<Editor>, &AgentDiffThread, &mut Window, &mut App) -> PostReviewState
+review: impl Fn(&Entity<Editor>, &Entity<AcpThread>, &mut Window, &mut App) -> PostReviewState
 + 'static,
 this: &Entity<AgentDiff>,
 ) {
@@ -1508,7 +1457,7 @@ impl AgentDiff {
 return;
 };
-let action_log = thread.action_log(cx);
+let action_log = thread.read(cx).action_log();
 let changed_buffers = action_log.read(cx).changed_buffers(cx);
 let mut unaffected = self.reviewing_editors.clone();
@@ -1627,7 +1576,7 @@ impl AgentDiff {
 fn keep_all(
 editor: &Entity<Editor>,
-thread: &AgentDiffThread,
+thread: &Entity<AcpThread>,
 window: &mut Window,
 cx: &mut App,
 ) -> PostReviewState {
@@ -1647,7 +1596,7 @@ impl AgentDiff {
 fn reject_all(
 editor: &Entity<Editor>,
-thread: &AgentDiffThread,
+thread: &Entity<AcpThread>,
 window: &mut Window,
 cx: &mut App,
 ) -> PostReviewState {
@@ -1667,7 +1616,7 @@ impl AgentDiff {
 fn keep(
 editor: &Entity<Editor>,
-thread: &AgentDiffThread,
+thread: &Entity<AcpThread>,
 window: &mut Window,
 cx: &mut App,
 ) -> PostReviewState {
@@ -1680,7 +1629,7 @@ impl AgentDiff {
 fn reject(
 editor: &Entity<Editor>,
-thread: &AgentDiffThread,
+thread: &Entity<AcpThread>,
 window: &mut Window,
 cx: &mut App,
 ) -> PostReviewState {
@@ -1703,7 +1652,7 @@ impl AgentDiff {
 fn review_in_active_editor(
 &mut self,
 workspace: &mut Workspace,
-review: impl Fn(&Entity<Editor>, &AgentDiffThread, &mut Window, &mut App) -> PostReviewState,
+review: impl Fn(&Entity<Editor>, &Entity<AcpThread>, &mut Window, &mut App) -> PostReviewState,
 window: &mut Window,
 cx: &mut Context<Self>,
 ) -> Option<Task<Result<()>>> {
@@ -1725,7 +1674,7 @@ impl AgentDiff {
 if let PostReviewState::AllReviewed = review(&editor, &thread, window, cx)
 && let Some(curr_buffer) = editor.read(cx).buffer().read(cx).as_singleton()
 {
-let changed_buffers = thread.action_log(cx).read(cx).changed_buffers(cx);
+let changed_buffers = thread.read(cx).action_log().read(cx).changed_buffers(cx);
 let mut keys = changed_buffers.keys().cycle();
 keys.find(|k| *k == &curr_buffer);
@@ -1768,12 +1717,11 @@ mod tests {
 use super::*;
 use crate::Keep;
 use acp_thread::AgentConnection as _;
-use agent_settings::AgentSettings;
 use editor::EditorSettings;
 use gpui::{TestAppContext, UpdateGlobal, VisualTestContext};
 use project::{FakeFs, Project};
 use serde_json::json;
-use settings::{Settings, SettingsStore};
+use settings::SettingsStore;
 use std::{path::Path, rc::Rc};
 use util::path;
@@ -1782,13 +1730,8 @@ mod tests {
 cx.update(|cx| {
 let settings_store = SettingsStore::test(cx);
 cx.set_global(settings_store);
-language::init(cx);
-Project::init_settings(cx);
-AgentSettings::register(cx);
 prompt_store::init(cx);
-workspace::init_settings(cx);
 theme::init(theme::LoadThemes::JustBase, cx);
-EditorSettings::register(cx);
 language_model::init_settings(cx);
 });
@@ -1815,8 +1758,7 @@ mod tests {
 .await
 .unwrap();
-let thread = AgentDiffThread::AcpThread(thread);
-let action_log = cx.read(|cx| thread.action_log(cx));
+let action_log = cx.read(|cx| thread.read(cx).action_log().clone());
 let (workspace, cx) =
 cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
@@ -1942,13 +1884,8 @@ mod tests {
 cx.update(|cx| {
 let settings_store = SettingsStore::test(cx);
 cx.set_global(settings_store);
-language::init(cx);
-Project::init_settings(cx);
-AgentSettings::register(cx);
 prompt_store::init(cx);
-workspace::init_settings(cx);
 theme::init(theme::LoadThemes::JustBase, cx);
-EditorSettings::register(cx);
 language_model::init_settings(cx);
 workspace::register_project_item::<Editor>(cx);
 });
@@ -2004,7 +1941,6 @@ mod tests {
 let action_log = thread.read_with(cx, |thread, _| thread.action_log().clone());
 // Set the active thread
-let thread = AgentDiffThread::AcpThread(thread);
 cx.update(|window, cx| {
 AgentDiff::set_active_thread(&workspace.downgrade(), thread.clone(), window, cx)
 });

View File

@@ -47,6 +47,7 @@ impl AgentModelSelector {
 }
 }
 },
+true, // Use popover styles for picker
 window,
 cx,
 )

View File

@@ -16,7 +16,7 @@ use serde::{Deserialize, Serialize};
 use settings::{
 DefaultAgentView as DefaultView, LanguageModelProviderSetting, LanguageModelSelection,
 };
-use zed_actions::OpenBrowser;
 use zed_actions::agent::{OpenClaudeCodeOnboardingModal, ReauthenticateAgent};
 use crate::ui::{AcpOnboardingModal, ClaudeCodeOnboardingModal};
@@ -2131,12 +2131,20 @@ impl AgentPanel {
 menu
 })
 .separator()
-.link(
-"Add Other Agents",
-OpenBrowser {
-url: zed_urls::external_agents_docs(cx),
-}
-.boxed_clone(),
+.item(
+ContextMenuEntry::new("Add More Agents")
+.icon(IconName::Plus)
+.icon_color(Color::Muted)
+.handler({
+move |window, cx| {
+window.dispatch_action(Box::new(zed_actions::Extensions {
+category_filter: Some(
+zed_actions::ExtensionCategoryFilter::AgentServers,
+),
+id: None,
+}), cx)
+}
+}),
 )
 }))
 }

View File

@@ -12,7 +12,6 @@ mod context_strip;
 mod inline_assistant;
 mod inline_prompt_editor;
 mod language_model_selector;
-mod message_editor;
 mod profile_selector;
 mod slash_command;
 mod slash_command_picker;
@@ -248,8 +247,6 @@ pub fn init(
 is_eval: bool,
 cx: &mut App,
 ) {
-AgentSettings::register(cx);
 assistant_text_thread::init(client.clone(), cx);
 rules_library::init(cx);
 if !is_eval {

View File

@@ -1082,10 +1082,7 @@ mod tests {
 };
 use gpui::TestAppContext;
 use indoc::indoc;
-use language::{
-Buffer, Language, LanguageConfig, LanguageMatcher, Point, language_settings,
-tree_sitter_rust,
-};
+use language::{Buffer, Language, LanguageConfig, LanguageMatcher, Point, tree_sitter_rust};
 use language_model::{LanguageModelRegistry, TokenUsage};
 use rand::prelude::*;
 use settings::SettingsStore;
@@ -1465,8 +1462,6 @@ mod tests {
 fn init_test(cx: &mut TestAppContext) {
 cx.update(LanguageModelRegistry::test);
 cx.set_global(cx.update(SettingsStore::test));
-cx.update(Project::init_settings);
-cx.update(language_settings::init);
 }
 fn simulate_response_stream(

View File

@@ -1075,8 +1075,6 @@ mod tests {
 cx.update(|cx| {
 let settings_store = SettingsStore::test(cx);
 cx.set_global(settings_store);
-language::init(cx);
-Project::init_settings(cx);
 });
 }

View File

@@ -42,7 +42,7 @@ use super::{
 ContextPickerAction, ContextPickerEntry, ContextPickerMode, MentionLink, RecentEntry,
 available_context_picker_entries, recent_context_picker_entries_with_store, selection_ranges,
 };
-use crate::message_editor::ContextCreasesAddon;
+use crate::inline_prompt_editor::ContextCreasesAddon;
 pub(crate) enum Match {
 File(FileMatch),
@@ -1182,10 +1182,8 @@ mod tests {
 let app_state = cx.update(AppState::test);
 cx.update(|cx| {
-language::init(cx);
 editor::init(cx);
 workspace::init(app_state.clone(), cx);
-Project::init_settings(cx);
 });
 app_state
@@ -1486,10 +1484,8 @@ mod tests {
 let app_state = cx.update(AppState::test);
 cx.update(|cx| {
-language::init(cx);
 editor::init(cx);
 workspace::init(app_state.clone(), cx);
-Project::init_settings(cx);
 });
 app_state
@@ -1686,11 +1682,6 @@ mod tests {
 let store = SettingsStore::test(cx);
 cx.set_global(store);
 theme::init(theme::LoadThemes::JustBase, cx);
-client::init_settings(cx);
-language::init(cx);
-Project::init_settings(cx);
-workspace::init_settings(cx);
-editor::init_settings(cx);
 });
 }
 }

View File

@@ -1,8 +1,8 @@
-use crate::context_store::ContextStore;
 use agent::HistoryStore;
-use collections::VecDeque;
+use collections::{HashMap, VecDeque};
 use editor::actions::Paste;
-use editor::display_map::EditorMargins;
+use editor::display_map::{CreaseId, EditorMargins};
+use editor::{Addon, AnchorRangeExt as _};
 use editor::{
     ContextMenuOptions, Editor, EditorElement, EditorEvent, EditorMode, EditorStyle, MultiBuffer,
     actions::{MoveDown, MoveUp},
@@ -17,6 +17,7 @@ use parking_lot::Mutex;
 use prompt_store::PromptStore;
 use settings::Settings;
 use std::cmp;
+use std::ops::Range;
 use std::rc::Rc;
 use std::sync::Arc;
 use theme::ThemeSettings;
@@ -27,12 +28,15 @@ use zed_actions::agent::ToggleModelSelector;
 use crate::agent_model_selector::AgentModelSelector;
 use crate::buffer_codegen::BufferCodegen;
-use crate::context_picker::{ContextPicker, ContextPickerCompletionProvider};
+use crate::context::{AgentContextHandle, AgentContextKey};
+use crate::context_picker::{ContextPicker, ContextPickerCompletionProvider, crease_for_mention};
+use crate::context_store::{ContextStore, ContextStoreEvent};
 use crate::context_strip::{ContextStrip, ContextStripEvent, SuggestContextKind};
-use crate::message_editor::{ContextCreasesAddon, extract_message_creases, insert_message_creases};
 use crate::terminal_codegen::TerminalCodegen;
-use crate::{CycleNextInlineAssist, CyclePreviousInlineAssist, ModelUsageContext};
-use crate::{RemoveAllContext, ToggleContextPicker};
+use crate::{
+    CycleNextInlineAssist, CyclePreviousInlineAssist, ModelUsageContext, RemoveAllContext,
+    ToggleContextPicker,
+};

 pub struct PromptEditor<T> {
     pub editor: Entity<Editor>,
@@ -1157,3 +1161,156 @@ impl GenerationMode {
        }
    }
}
/// Stored information that can be used to resurrect a context crease when creating an editor for a past message.
#[derive(Clone, Debug)]
pub struct MessageCrease {
pub range: Range<usize>,
pub icon_path: SharedString,
pub label: SharedString,
/// None for a deserialized message, Some otherwise.
pub context: Option<AgentContextHandle>,
}
#[derive(Default)]
pub struct ContextCreasesAddon {
creases: HashMap<AgentContextKey, Vec<(CreaseId, SharedString)>>,
_subscription: Option<Subscription>,
}
impl Addon for ContextCreasesAddon {
fn to_any(&self) -> &dyn std::any::Any {
self
}
fn to_any_mut(&mut self) -> Option<&mut dyn std::any::Any> {
Some(self)
}
}
impl ContextCreasesAddon {
pub fn new() -> Self {
Self {
creases: HashMap::default(),
_subscription: None,
}
}
pub fn add_creases(
&mut self,
context_store: &Entity<ContextStore>,
key: AgentContextKey,
creases: impl IntoIterator<Item = (CreaseId, SharedString)>,
cx: &mut Context<Editor>,
) {
self.creases.entry(key).or_default().extend(creases);
self._subscription = Some(
cx.subscribe(context_store, |editor, _, event, cx| match event {
ContextStoreEvent::ContextRemoved(key) => {
let Some(this) = editor.addon_mut::<Self>() else {
return;
};
let (crease_ids, replacement_texts): (Vec<_>, Vec<_>) = this
.creases
.remove(key)
.unwrap_or_default()
.into_iter()
.unzip();
let ranges = editor
.remove_creases(crease_ids, cx)
.into_iter()
.map(|(_, range)| range)
.collect::<Vec<_>>();
editor.unfold_ranges(&ranges, false, false, cx);
editor.edit(ranges.into_iter().zip(replacement_texts), cx);
cx.notify();
}
}),
)
}
pub fn into_inner(self) -> HashMap<AgentContextKey, Vec<(CreaseId, SharedString)>> {
self.creases
}
}
pub fn extract_message_creases(
editor: &mut Editor,
cx: &mut Context<'_, Editor>,
) -> Vec<MessageCrease> {
let buffer_snapshot = editor.buffer().read(cx).snapshot(cx);
let mut contexts_by_crease_id = editor
.addon_mut::<ContextCreasesAddon>()
.map(std::mem::take)
.unwrap_or_default()
.into_inner()
.into_iter()
.flat_map(|(key, creases)| {
let context = key.0;
creases
.into_iter()
.map(move |(id, _)| (id, context.clone()))
})
.collect::<HashMap<_, _>>();
// Filter the addon's list of creases based on what the editor reports,
// since the addon might have removed creases in it.
editor.display_map.update(cx, |display_map, cx| {
display_map
.snapshot(cx)
.crease_snapshot
.creases()
.filter_map(|(id, crease)| {
Some((
id,
(
crease.range().to_offset(&buffer_snapshot),
crease.metadata()?.clone(),
),
))
})
.map(|(id, (range, metadata))| {
let context = contexts_by_crease_id.remove(&id);
MessageCrease {
range,
context,
label: metadata.label,
icon_path: metadata.icon_path,
}
})
.collect()
})
}
pub fn insert_message_creases(
editor: &mut Editor,
message_creases: &[MessageCrease],
context_store: &Entity<ContextStore>,
window: &mut Window,
cx: &mut Context<'_, Editor>,
) {
let buffer_snapshot = editor.buffer().read(cx).snapshot(cx);
let creases = message_creases
.iter()
.map(|crease| {
let start = buffer_snapshot.anchor_after(crease.range.start);
let end = buffer_snapshot.anchor_before(crease.range.end);
crease_for_mention(
crease.label.clone(),
crease.icon_path.clone(),
start..end,
cx.weak_entity(),
)
})
.collect::<Vec<_>>();
let ids = editor.insert_creases(creases.clone(), cx);
editor.fold_creases(creases, false, window, cx);
if let Some(addon) = editor.addon_mut::<ContextCreasesAddon>() {
for (crease, id) in message_creases.iter().zip(ids) {
if let Some(context) = crease.context.as_ref() {
let key = AgentContextKey(context.clone());
addon.add_creases(context_store, key, vec![(id, crease.label.clone())], cx);
}
}
}
}
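The `Addon` trait in the moved code above works through `std::any::Any` downcasting: the editor stores addons of different concrete types behind one trait object and recovers them later via `addon_mut::<T>()`. A minimal, self-contained sketch of that pattern (the `Host` type and its methods are hypothetical stand-ins, not Zed's actual `editor` API):

```rust
use std::any::{Any, TypeId};
use std::collections::HashMap;

// Stand-in for editor::Addon: each addon exposes itself as `Any` so the
// host can downcast back to the concrete type.
trait Addon: 'static {
    fn to_any(&self) -> &dyn Any;
    fn to_any_mut(&mut self) -> &mut dyn Any;
}

struct Host {
    // Keyed by TypeId, so at most one addon per concrete type.
    addons: HashMap<TypeId, Box<dyn Addon>>,
}

impl Host {
    fn new() -> Self {
        Self { addons: HashMap::new() }
    }

    fn register<A: Addon>(&mut self, addon: A) {
        self.addons.insert(TypeId::of::<A>(), Box::new(addon));
    }

    // Mirrors the shape of `Editor::addon_mut::<T>()`: look up by type,
    // then downcast the trait object back to `A`.
    fn addon_mut<A: Addon>(&mut self) -> Option<&mut A> {
        self.addons
            .get_mut(&TypeId::of::<A>())?
            .to_any_mut()
            .downcast_mut::<A>()
    }
}

struct Counter {
    count: usize,
}

impl Addon for Counter {
    fn to_any(&self) -> &dyn Any {
        self
    }
    fn to_any_mut(&mut self) -> &mut dyn Any {
        self
    }
}

fn main() {
    let mut host = Host::new();
    host.register(Counter { count: 0 });
    if let Some(counter) = host.addon_mut::<Counter>() {
        counter.count += 1;
    }
    println!("{}", host.addon_mut::<Counter>().unwrap().count); // prints 1
}
```

Zed's real trait returns `Option<&mut dyn Any>` from `to_any_mut`; the sketch simplifies that to the infallible form to keep the downcast mechanics in focus.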

View File

@@ -19,14 +19,26 @@ pub type LanguageModelSelector = Picker<LanguageModelPickerDelegate>;
 pub fn language_model_selector(
     get_active_model: impl Fn(&App) -> Option<ConfiguredModel> + 'static,
     on_model_changed: impl Fn(Arc<dyn LanguageModel>, &mut App) + 'static,
+    popover_styles: bool,
     window: &mut Window,
     cx: &mut Context<LanguageModelSelector>,
 ) -> LanguageModelSelector {
-    let delegate = LanguageModelPickerDelegate::new(get_active_model, on_model_changed, window, cx);
-    Picker::list(delegate, window, cx)
-        .show_scrollbar(true)
-        .width(rems(20.))
-        .max_height(Some(rems(20.).into()))
+    let delegate = LanguageModelPickerDelegate::new(
+        get_active_model,
+        on_model_changed,
+        popover_styles,
+        window,
+        cx,
+    );
+    if popover_styles {
+        Picker::list(delegate, window, cx)
+            .show_scrollbar(true)
+            .width(rems(20.))
+            .max_height(Some(rems(20.).into()))
+    } else {
+        Picker::list(delegate, window, cx).show_scrollbar(true)
+    }
 }

 fn all_models(cx: &App) -> GroupedModels {
@@ -75,12 +87,14 @@ pub struct LanguageModelPickerDelegate {
     selected_index: usize,
     _authenticate_all_providers_task: Task<()>,
     _subscriptions: Vec<Subscription>,
+    popover_styles: bool,
 }

 impl LanguageModelPickerDelegate {
     fn new(
         get_active_model: impl Fn(&App) -> Option<ConfiguredModel> + 'static,
         on_model_changed: impl Fn(Arc<dyn LanguageModel>, &mut App) + 'static,
+        popover_styles: bool,
         window: &mut Window,
         cx: &mut Context<Picker<Self>>,
     ) -> Self {
@@ -113,6 +127,7 @@ impl LanguageModelPickerDelegate {
                     }
                 },
             )],
+            popover_styles,
         }
     }
@@ -177,7 +192,7 @@ impl LanguageModelPickerDelegate {
             }
             _ => {
                 log::error!(
-                    "Failed to authenticate provider: {}: {err}",
+                    "Failed to authenticate provider: {}: {err:#}",
                     provider_name.0
                 );
             }
@@ -530,6 +545,10 @@ impl PickerDelegate for LanguageModelPickerDelegate {
         _window: &mut Window,
         cx: &mut Context<Picker<Self>>,
     ) -> Option<gpui::AnyElement> {
+        if !self.popover_styles {
+            return None;
+        }
         Some(
             h_flex()
                 .w_full()

View File

@@ -1,166 +0,0 @@
use std::ops::Range;
use collections::HashMap;
use editor::display_map::CreaseId;
use editor::{Addon, AnchorRangeExt, Editor};
use gpui::{Entity, Subscription};
use ui::prelude::*;
use crate::{
context::{AgentContextHandle, AgentContextKey},
context_picker::crease_for_mention,
context_store::{ContextStore, ContextStoreEvent},
};
/// Stored information that can be used to resurrect a context crease when creating an editor for a past message.
#[derive(Clone, Debug)]
pub struct MessageCrease {
pub range: Range<usize>,
pub icon_path: SharedString,
pub label: SharedString,
/// None for a deserialized message, Some otherwise.
pub context: Option<AgentContextHandle>,
}
#[derive(Default)]
pub struct ContextCreasesAddon {
creases: HashMap<AgentContextKey, Vec<(CreaseId, SharedString)>>,
_subscription: Option<Subscription>,
}
impl Addon for ContextCreasesAddon {
fn to_any(&self) -> &dyn std::any::Any {
self
}
fn to_any_mut(&mut self) -> Option<&mut dyn std::any::Any> {
Some(self)
}
}
impl ContextCreasesAddon {
pub fn new() -> Self {
Self {
creases: HashMap::default(),
_subscription: None,
}
}
pub fn add_creases(
&mut self,
context_store: &Entity<ContextStore>,
key: AgentContextKey,
creases: impl IntoIterator<Item = (CreaseId, SharedString)>,
cx: &mut Context<Editor>,
) {
self.creases.entry(key).or_default().extend(creases);
self._subscription = Some(
cx.subscribe(context_store, |editor, _, event, cx| match event {
ContextStoreEvent::ContextRemoved(key) => {
let Some(this) = editor.addon_mut::<Self>() else {
return;
};
let (crease_ids, replacement_texts): (Vec<_>, Vec<_>) = this
.creases
.remove(key)
.unwrap_or_default()
.into_iter()
.unzip();
let ranges = editor
.remove_creases(crease_ids, cx)
.into_iter()
.map(|(_, range)| range)
.collect::<Vec<_>>();
editor.unfold_ranges(&ranges, false, false, cx);
editor.edit(ranges.into_iter().zip(replacement_texts), cx);
cx.notify();
}
}),
)
}
pub fn into_inner(self) -> HashMap<AgentContextKey, Vec<(CreaseId, SharedString)>> {
self.creases
}
}
pub fn extract_message_creases(
editor: &mut Editor,
cx: &mut Context<'_, Editor>,
) -> Vec<MessageCrease> {
let buffer_snapshot = editor.buffer().read(cx).snapshot(cx);
let mut contexts_by_crease_id = editor
.addon_mut::<ContextCreasesAddon>()
.map(std::mem::take)
.unwrap_or_default()
.into_inner()
.into_iter()
.flat_map(|(key, creases)| {
let context = key.0;
creases
.into_iter()
.map(move |(id, _)| (id, context.clone()))
})
.collect::<HashMap<_, _>>();
// Filter the addon's list of creases based on what the editor reports,
// since the addon might have removed creases in it.
editor.display_map.update(cx, |display_map, cx| {
display_map
.snapshot(cx)
.crease_snapshot
.creases()
.filter_map(|(id, crease)| {
Some((
id,
(
crease.range().to_offset(&buffer_snapshot),
crease.metadata()?.clone(),
),
))
})
.map(|(id, (range, metadata))| {
let context = contexts_by_crease_id.remove(&id);
MessageCrease {
range,
context,
label: metadata.label,
icon_path: metadata.icon_path,
}
})
.collect()
})
}
pub fn insert_message_creases(
editor: &mut Editor,
message_creases: &[MessageCrease],
context_store: &Entity<ContextStore>,
window: &mut Window,
cx: &mut Context<'_, Editor>,
) {
let buffer_snapshot = editor.buffer().read(cx).snapshot(cx);
let creases = message_creases
.iter()
.map(|crease| {
let start = buffer_snapshot.anchor_after(crease.range.start);
let end = buffer_snapshot.anchor_before(crease.range.end);
crease_for_mention(
crease.label.clone(),
crease.icon_path.clone(),
start..end,
cx.weak_entity(),
)
})
.collect::<Vec<_>>();
let ids = editor.insert_creases(creases.clone(), cx);
editor.fold_creases(creases, false, window, cx);
if let Some(addon) = editor.addon_mut::<ContextCreasesAddon>() {
for (crease, id) in message_creases.iter().zip(ids) {
if let Some(context) = crease.context.as_ref() {
let key = AgentContextKey(context.clone());
addon.add_creases(context_store, key, vec![(id, crease.label.clone())], cx);
}
}
}
}

View File

@@ -15,8 +15,8 @@ use std::{
     sync::{Arc, atomic::AtomicBool},
 };
 use ui::{
-    DocumentationAside, DocumentationEdge, DocumentationSide, HighlightedLabel, LabelSize,
-    ListItem, ListItemSpacing, PopoverMenuHandle, TintColor, Tooltip, prelude::*,
+    DocumentationAside, DocumentationEdge, DocumentationSide, HighlightedLabel, KeyBinding,
+    LabelSize, ListItem, ListItemSpacing, PopoverMenuHandle, TintColor, Tooltip, prelude::*,
 };

 /// Trait for types that can provide and manage agent profiles
@@ -81,6 +81,7 @@ impl ProfileSelector {
             self.provider.clone(),
             self.profiles.clone(),
             cx.background_executor().clone(),
+            self.focus_handle.clone(),
             cx,
         );
@@ -207,6 +208,7 @@ pub(crate) struct ProfilePickerDelegate {
     selected_index: usize,
     query: String,
     cancel: Option<Arc<AtomicBool>>,
+    focus_handle: FocusHandle,
 }

 impl ProfilePickerDelegate {
@@ -215,6 +217,7 @@ impl ProfilePickerDelegate {
         provider: Arc<dyn ProfileProvider>,
         profiles: AvailableProfiles,
         background: BackgroundExecutor,
+        focus_handle: FocusHandle,
         cx: &mut Context<ProfileSelector>,
     ) -> Self {
         let candidates = Self::candidates_from(profiles);
@@ -231,6 +234,7 @@ impl ProfilePickerDelegate {
             selected_index: 0,
             query: String::new(),
             cancel: None,
+            focus_handle,
         };

         this.selected_index = this
@@ -594,20 +598,26 @@ impl PickerDelegate for ProfilePickerDelegate {
         _: &mut Window,
         cx: &mut Context<Picker<Self>>,
     ) -> Option<gpui::AnyElement> {
+        let focus_handle = self.focus_handle.clone();
         Some(
             h_flex()
                 .w_full()
                 .border_t_1()
                 .border_color(cx.theme().colors().border_variant)
-                .p_1()
+                .p_1p5()
+                .gap_4()
+                .justify_between()
                 .child(
                     Button::new("configure", "Configure")
-                        .icon(IconName::Settings)
-                        .icon_size(IconSize::Small)
-                        .icon_color(Color::Muted)
-                        .icon_position(IconPosition::Start)
+                        .full_width()
+                        .style(ButtonStyle::Outlined)
+                        .key_binding(
+                            KeyBinding::for_action_in(
+                                &ManageProfiles::default(),
+                                &focus_handle,
+                                cx,
+                            )
+                            .map(|kb| kb.size(rems_from_px(12.))),
+                        )
                         .on_click(|_, window, cx| {
                             window.dispatch_action(ManageProfiles::default().boxed_clone(), cx);
                         }),
@@ -659,20 +669,25 @@ mod tests {
             is_builtin: true,
         }];

-        let delegate = ProfilePickerDelegate {
-            fs: FakeFs::new(cx.executor()),
-            provider: Arc::new(TestProfileProvider::new(AgentProfileId("write".into()))),
-            background: cx.executor(),
-            candidates,
-            string_candidates: Arc::new(Vec::new()),
-            filtered_entries: Vec::new(),
-            selected_index: 0,
-            query: String::new(),
-            cancel: None,
-        };
-
-        let matches = Vec::new(); // No matches
-        let _entries = delegate.entries_from_matches(matches);
+        cx.update(|cx| {
+            let focus_handle = cx.focus_handle();
+
+            let delegate = ProfilePickerDelegate {
+                fs: FakeFs::new(cx.background_executor().clone()),
+                provider: Arc::new(TestProfileProvider::new(AgentProfileId("write".into()))),
+                background: cx.background_executor().clone(),
+                candidates,
+                string_candidates: Arc::new(Vec::new()),
+                filtered_entries: Vec::new(),
+                selected_index: 0,
+                query: String::new(),
+                cancel: None,
+                focus_handle,
+            };
+
+            let matches = Vec::new(); // No matches
+            let _entries = delegate.entries_from_matches(matches);
+        });
     }

     #[gpui::test]
@@ -690,30 +705,35 @@ mod tests {
             },
         ];

-        let delegate = ProfilePickerDelegate {
-            fs: FakeFs::new(cx.executor()),
-            provider: Arc::new(TestProfileProvider::new(AgentProfileId("write".into()))),
-            background: cx.executor(),
-            candidates,
-            string_candidates: Arc::new(Vec::new()),
-            filtered_entries: vec![
-                ProfilePickerEntry::Profile(ProfileMatchEntry {
-                    candidate_index: 0,
-                    positions: Vec::new(),
-                }),
-                ProfilePickerEntry::Profile(ProfileMatchEntry {
-                    candidate_index: 1,
-                    positions: Vec::new(),
-                }),
-            ],
-            selected_index: 0,
-            query: String::new(),
-            cancel: None,
-        };
-
-        // Active profile should be found at index 0
-        let active_index = delegate.index_of_profile(&AgentProfileId("write".into()));
-        assert_eq!(active_index, Some(0));
+        cx.update(|cx| {
+            let focus_handle = cx.focus_handle();
+
+            let delegate = ProfilePickerDelegate {
+                fs: FakeFs::new(cx.background_executor().clone()),
+                provider: Arc::new(TestProfileProvider::new(AgentProfileId("write".into()))),
+                background: cx.background_executor().clone(),
+                candidates,
+                string_candidates: Arc::new(Vec::new()),
+                filtered_entries: vec![
+                    ProfilePickerEntry::Profile(ProfileMatchEntry {
+                        candidate_index: 0,
+                        positions: Vec::new(),
+                    }),
+                    ProfilePickerEntry::Profile(ProfileMatchEntry {
+                        candidate_index: 1,
+                        positions: Vec::new(),
+                    }),
+                ],
+                selected_index: 0,
+                query: String::new(),
+                cancel: None,
+                focus_handle,
+            };
+
+            // Active profile should be found at index 0
+            let active_index = delegate.index_of_profile(&AgentProfileId("write".into()));
+            assert_eq!(active_index, Some(0));
+        });
     }

     struct TestProfileProvider {

View File

@@ -314,6 +314,7 @@ impl TextThreadEditor {
                     )
                 });
             },
+            true, // Use popover styles for picker
             window,
             cx,
         )
@@ -477,7 +478,7 @@ impl TextThreadEditor {
             editor.insert(&format!("/{name}"), window, cx);
             if command.accepts_arguments() {
                 editor.insert(" ", window, cx);
-                editor.show_completions(&ShowCompletions::default(), window, cx);
+                editor.show_completions(&ShowCompletions, window, cx);
             }
         });
     });
@@ -3223,11 +3224,7 @@ mod tests {
         prompt_store::init(cx);
         LanguageModelRegistry::test(cx);
         cx.set_global(settings_store);
-        language::init(cx);
-        agent_settings::init(cx);
-        Project::init_settings(cx);
         theme::init(theme::LoadThemes::JustBase, cx);
-        workspace::init_settings(cx);
-        editor::init_settings(cx);
     }
 }

View File

@@ -577,8 +577,6 @@ mod test {
         let settings_store = SettingsStore::test(cx);
         cx.set_global(settings_store);
         // release_channel::init(SemanticVersion::default(), cx);
-        language::init(cx);
-        Project::init_settings(cx);
     });
 }

View File

@@ -22,7 +22,6 @@ use language_model::{
 };
 use parking_lot::Mutex;
 use pretty_assertions::assert_eq;
-use project::Project;
 use prompt_store::PromptBuilder;
 use rand::prelude::*;
 use serde_json::json;
@@ -1411,9 +1410,6 @@ fn init_test(cx: &mut App) {
     prompt_store::init(cx);
     LanguageModelRegistry::test(cx);
     cx.set_global(settings_store);
-    language::init(cx);
-    agent_settings::init(cx);
-    Project::init_settings(cx);
 }

 #[derive(Clone)]

View File

@@ -48,7 +48,6 @@ pub const LEGACY_CHANNEL_COUNT: NonZero<u16> = nz!(2);
 pub const REPLAY_DURATION: Duration = Duration::from_secs(30);

 pub fn init(cx: &mut App) {
-    AudioSettings::register(cx);
     LIVE_SETTINGS.initialize(cx);
 }

View File

@@ -1,9 +1,9 @@
 use std::sync::atomic::{AtomicBool, Ordering};

 use gpui::App;
-use settings::{Settings, SettingsStore};
+use settings::{RegisterSetting, Settings, SettingsStore};

-#[derive(Clone, Debug)]
+#[derive(Clone, Debug, RegisterSetting)]
 pub struct AudioSettings {
     /// Opt into the new audio system.
     ///

View File

@@ -33,4 +33,9 @@ workspace.workspace = true
 which.workspace = true

 [dev-dependencies]
+ctor.workspace = true
+clock = { workspace = true, "features" = ["test-support"] }
+futures.workspace = true
 gpui = { workspace = true, "features" = ["test-support"] }
+parking_lot.workspace = true
+zlog.workspace = true

View File

@@ -1,16 +1,15 @@
 use anyhow::{Context as _, Result};
-use client::{Client, TelemetrySettings};
-use db::RELEASE_CHANNEL;
+use client::Client;
 use db::kvp::KEY_VALUE_STORE;
 use gpui::{
     App, AppContext as _, AsyncApp, BackgroundExecutor, Context, Entity, Global, SemanticVersion,
     Task, Window, actions,
 };
-use http_client::{AsyncBody, HttpClient, HttpClientWithUrl};
+use http_client::{HttpClient, HttpClientWithUrl};
 use paths::remote_servers_dir;
 use release_channel::{AppCommitSha, ReleaseChannel};
 use serde::{Deserialize, Serialize};
-use settings::{Settings, SettingsStore};
+use settings::{RegisterSetting, Settings, SettingsStore};
 use smol::{fs, io::AsyncReadExt};
 use smol::{fs::File, process::Command};
 use std::mem;
@@ -41,22 +40,23 @@ actions!(
     ]
 );

-#[derive(Serialize)]
-struct UpdateRequestBody {
-    installation_id: Option<Arc<str>>,
-    release_channel: Option<&'static str>,
-    telemetry: bool,
-    is_staff: Option<bool>,
-    destination: &'static str,
-}
-
 #[derive(Clone, Debug, PartialEq, Eq)]
 pub enum VersionCheckType {
     Sha(AppCommitSha),
     Semantic(SemanticVersion),
 }

-#[derive(Clone)]
+#[derive(Serialize, Debug)]
+pub struct AssetQuery<'a> {
+    asset: &'a str,
+    os: &'a str,
+    arch: &'a str,
+    metrics_id: Option<&'a str>,
+    system_id: Option<&'a str>,
+    is_staff: Option<bool>,
+}
+
+#[derive(Clone, Debug)]
 pub enum AutoUpdateStatus {
     Idle,
     Checking,
@@ -66,6 +66,31 @@ pub enum AutoUpdateStatus {
    Errored { error: Arc<anyhow::Error> },
}
impl PartialEq for AutoUpdateStatus {
fn eq(&self, other: &Self) -> bool {
match (self, other) {
(AutoUpdateStatus::Idle, AutoUpdateStatus::Idle) => true,
(AutoUpdateStatus::Checking, AutoUpdateStatus::Checking) => true,
(
AutoUpdateStatus::Downloading { version: v1 },
AutoUpdateStatus::Downloading { version: v2 },
) => v1 == v2,
(
AutoUpdateStatus::Installing { version: v1 },
AutoUpdateStatus::Installing { version: v2 },
) => v1 == v2,
(
AutoUpdateStatus::Updated { version: v1 },
AutoUpdateStatus::Updated { version: v2 },
) => v1 == v2,
(AutoUpdateStatus::Errored { error: e1 }, AutoUpdateStatus::Errored { error: e2 }) => {
e1.to_string() == e2.to_string()
}
_ => false,
}
}
}
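The hand-written `PartialEq` above exists because `anyhow::Error` (even behind an `Arc`) does not implement `PartialEq`, so the derive cannot be used; the `Errored` arm falls back to comparing the errors' `Display` output. A self-contained sketch of the same pattern, with `std::io::Error` standing in for `Arc<anyhow::Error>`:

```rust
use std::sync::Arc;

// Minimal stand-in for AutoUpdateStatus: the error type has no PartialEq,
// so equality is implemented manually.
#[derive(Clone, Debug)]
enum Status {
    Idle,
    Errored { error: Arc<std::io::Error> },
}

impl PartialEq for Status {
    fn eq(&self, other: &Self) -> bool {
        match (self, other) {
            (Status::Idle, Status::Idle) => true,
            (Status::Errored { error: e1 }, Status::Errored { error: e2 }) => {
                // Display output is the best available proxy for equality here.
                e1.to_string() == e2.to_string()
            }
            _ => false,
        }
    }
}

fn main() {
    let a = Status::Errored { error: Arc::new(std::io::Error::other("boom")) };
    let b = Status::Errored { error: Arc::new(std::io::Error::other("boom")) };
    assert_eq!(a, b); // distinct Arcs, equal messages
    assert_ne!(a, Status::Idle);
    println!("ok");
}
```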
impl AutoUpdateStatus {
    pub fn is_updated(&self) -> bool {
        matches!(self, Self::Updated { .. })
@@ -75,13 +100,13 @@ impl AutoUpdateStatus {
 pub struct AutoUpdater {
     status: AutoUpdateStatus,
     current_version: SemanticVersion,
-    http_client: Arc<HttpClientWithUrl>,
+    client: Arc<Client>,
     pending_poll: Option<Task<Option<()>>>,
     quit_subscription: Option<gpui::Subscription>,
 }

-#[derive(Deserialize, Clone, Debug)]
-pub struct JsonRelease {
+#[derive(Deserialize, Serialize, Clone, Debug)]
+pub struct ReleaseAsset {
     pub version: String,
     pub url: String,
 }
@@ -120,7 +145,7 @@ impl Drop for MacOsUnmounter<'_> {
     }
 }

-#[derive(Clone, Copy, Debug)]
+#[derive(Clone, Copy, Debug, RegisterSetting)]
 struct AutoUpdateSetting(bool);

 /// Whether or not to automatically check for updates.
@@ -137,9 +162,7 @@ struct GlobalAutoUpdate(Option<Entity<AutoUpdater>>);
 impl Global for GlobalAutoUpdate {}

-pub fn init(http_client: Arc<HttpClientWithUrl>, cx: &mut App) {
-    AutoUpdateSetting::register(cx);
+pub fn init(client: Arc<Client>, cx: &mut App) {

     cx.observe_new(|workspace: &mut Workspace, _window, _cx| {
         workspace.register_action(|_, action, window, cx| check(action, window, cx));
@@ -151,7 +174,7 @@ pub fn init(http_client: Arc<HttpClientWithUrl>, cx: &mut App) {
     let version = release_channel::AppVersion::global(cx);
     let auto_updater = cx.new(|cx| {
-        let updater = AutoUpdater::new(version, http_client, cx);
+        let updater = AutoUpdater::new(version, client, cx);

         let poll_for_updates = ReleaseChannel::try_global(cx)
             .map(|channel| channel.poll_for_updates())
@@ -235,7 +258,7 @@ pub fn view_release_notes(_: &ViewReleaseNotes, cx: &mut App) -> Option<()> {
             let current_version = auto_updater.current_version;
             let release_channel = release_channel.dev_name();
             let path = format!("/releases/{release_channel}/{current_version}");
-            let url = &auto_updater.http_client.build_url(&path);
+            let url = &auto_updater.client.http_client().build_url(&path);
             cx.open_url(url);
         }
         ReleaseChannel::Nightly => {
@@ -298,11 +321,7 @@ impl AutoUpdater {
         cx.default_global::<GlobalAutoUpdate>().0.clone()
     }

-    fn new(
-        current_version: SemanticVersion,
-        http_client: Arc<HttpClientWithUrl>,
-        cx: &mut Context<Self>,
-    ) -> Self {
+    fn new(current_version: SemanticVersion, client: Arc<Client>, cx: &mut Context<Self>) -> Self {
         // On windows, executable files cannot be overwritten while they are
         // running, so we must wait to overwrite the application until quitting
         // or restarting. When quitting the app, we spawn the auto update helper
@@ -323,7 +342,7 @@ impl AutoUpdater {
         Self {
             status: AutoUpdateStatus::Idle,
             current_version,
-            http_client,
+            client,
             pending_poll: None,
             quit_subscription,
         }
@@ -356,7 +375,7 @@ impl AutoUpdater {
         cx.notify();

         self.pending_poll = Some(cx.spawn(async move |this, cx| {
-            let result = Self::update(this.upgrade()?, cx.clone()).await;
+            let result = Self::update(this.upgrade()?, cx).await;
             this.update(cx, |this, cx| {
                 this.pending_poll = None;
                 if let Err(error) = result {
@@ -402,10 +421,11 @@ impl AutoUpdater {
     // you can override this function. You should also update get_remote_server_release_url to return
     // Ok(None).
     pub async fn download_remote_server_release(
-        os: &str,
-        arch: &str,
         release_channel: ReleaseChannel,
         version: Option<SemanticVersion>,
+        os: &str,
+        arch: &str,
+        set_status: impl Fn(&str, &mut AsyncApp) + Send + 'static,
         cx: &mut AsyncApp,
     ) -> Result<PathBuf> {
         let this = cx.update(|cx| {
@@ -415,13 +435,14 @@ impl AutoUpdater {
                 .context("auto-update not initialized")
         })??;

-        let release = Self::get_release(
+        set_status("Fetching remote server release", cx);
+        let release = Self::get_release_asset(
             &this,
+            release_channel,
+            version,
             "zed-remote-server",
             os,
             arch,
-            version,
-            Some(release_channel),
             cx,
         )
         .await?;
@@ -432,26 +453,27 @@ impl AutoUpdater {
         let version_path = platform_dir.join(format!("{}.gz", release.version));
         smol::fs::create_dir_all(&platform_dir).await.ok();

-        let client = this.read_with(cx, |this, _| this.http_client.clone())?;
+        let client = this.read_with(cx, |this, _| this.client.http_client())?;

         if smol::fs::metadata(&version_path).await.is_err() {
             log::info!(
                 "downloading zed-remote-server {os} {arch} version {}",
                 release.version
             );
-            download_remote_server_binary(&version_path, release, client, cx).await?;
+            set_status("Downloading remote server", cx);
+            download_remote_server_binary(&version_path, release, client).await?;
         }

         Ok(version_path)
     }
     pub async fn get_remote_server_release_url(
+        channel: ReleaseChannel,
+        version: Option<SemanticVersion>,
         os: &str,
         arch: &str,
-        release_channel: ReleaseChannel,
-        version: Option<SemanticVersion>,
         cx: &mut AsyncApp,
-    ) -> Result<Option<(String, String)>> {
+    ) -> Result<Option<String>> {
         let this = cx.update(|cx| {
             cx.default_global::<GlobalAutoUpdate>()
                 .0
@@ -459,108 +481,99 @@
                 .context("auto-update not initialized")
         })??;
-        let release = Self::get_release(
-            &this,
-            "zed-remote-server",
-            os,
-            arch,
-            version,
-            Some(release_channel),
-            cx,
-        )
-        .await?;
-
-        let update_request_body = build_remote_server_update_request_body(cx)?;
-        let body = serde_json::to_string(&update_request_body)?;
-        Ok(Some((release.url, body)))
+        let release =
+            Self::get_release_asset(&this, channel, version, "zed-remote-server", os, arch, cx)
+                .await?;
+
+        Ok(Some(release.url))
     }
-    async fn get_release(
+    async fn get_release_asset(
         this: &Entity<Self>,
-        asset: &str,
-        os: &str,
-        arch: &str,
+        release_channel: ReleaseChannel,
         version: Option<SemanticVersion>,
-        release_channel: Option<ReleaseChannel>,
-        cx: &mut AsyncApp,
-    ) -> Result<JsonRelease> {
-        let client = this.read_with(cx, |this, _| this.http_client.clone())?;
-        if let Some(version) = version {
-            let channel = release_channel.map(|c| c.dev_name()).unwrap_or("stable");
-            let url = format!("/api/releases/{channel}/{version}/{asset}-{os}-{arch}.gz?update=1",);
-            Ok(JsonRelease {
-                version: version.to_string(),
-                url: client.build_url(&url),
-            })
-        } else {
-            let mut url_string = client.build_url(&format!(
-                "/api/releases/latest?asset={}&os={}&arch={}",
-                asset, os, arch
-            ));
-            if let Some(param) = release_channel.and_then(|c| c.release_query_param()) {
-                url_string += "&";
-                url_string += param;
-            }
-            let mut response = client.get(&url_string, Default::default(), true).await?;
-            let mut body = Vec::new();
-            response.body_mut().read_to_end(&mut body).await?;
-            anyhow::ensure!(
-                response.status().is_success(),
-                "failed to fetch release: {:?}",
-                String::from_utf8_lossy(&body),
-            );
-            serde_json::from_slice(body.as_slice()).with_context(|| {
-                format!(
-                    "error deserializing release {:?}",
-                    String::from_utf8_lossy(&body),
-                )
-            })
-        }
-    }
-    async fn get_latest_release(
-        this: &Entity<Self>,
         asset: &str,
         os: &str,
         arch: &str,
-        release_channel: Option<ReleaseChannel>,
         cx: &mut AsyncApp,
-    ) -> Result<JsonRelease> {
-        Self::get_release(this, asset, os, arch, None, release_channel, cx).await
+    ) -> Result<ReleaseAsset> {
+        let client = this.read_with(cx, |this, _| this.client.clone())?;
+        let (system_id, metrics_id, is_staff) = if client.telemetry().metrics_enabled() {
+            (
+                client.telemetry().system_id(),
+                client.telemetry().metrics_id(),
+                client.telemetry().is_staff(),
+            )
+        } else {
+            (None, None, None)
+        };
+        let version = if let Some(version) = version {
+            version.to_string()
+        } else {
+            "latest".to_string()
+        };
+        let http_client = client.http_client();
+        let path = format!("/releases/{}/{}/asset", release_channel.dev_name(), version,);
+        let url = http_client.build_zed_cloud_url_with_query(
+            &path,
+            AssetQuery {
+                os,
+                arch,
+                asset,
+                metrics_id: metrics_id.as_deref(),
+                system_id: system_id.as_deref(),
+                is_staff: is_staff,
+            },
+        )?;
+        let mut response = http_client
+            .get(url.as_str(), Default::default(), true)
+            .await?;
+        let mut body = Vec::new();
+        response.body_mut().read_to_end(&mut body).await?;
+        anyhow::ensure!(
+            response.status().is_success(),
+            "failed to fetch release: {:?}",
+            String::from_utf8_lossy(&body),
+        );
+        serde_json::from_slice(body.as_slice()).with_context(|| {
+            format!(
+                "error deserializing release {:?}",
+                String::from_utf8_lossy(&body),
+            )
+        })
     }
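The new `get_release_asset` collapses release lookup into a single cloud endpoint, with the platform triplet and optional telemetry identifiers carried as query parameters. A rough sketch of that query assembly, using only the standard library — the `/releases/{channel}/{version}/asset` path shape comes from the diff, but `build_asset_url` and its exact parameter handling (including the lack of percent-encoding here) are illustrative assumptions, not Zed's actual `build_zed_cloud_url_with_query`:

```rust
// Hypothetical sketch of the asset-URL assembly; names are assumptions.
fn build_asset_url(
    base: &str,
    channel: &str,
    version: &str,
    os: &str,
    arch: &str,
    asset: &str,
    metrics_id: Option<&str>,
) -> String {
    let mut url =
        format!("{base}/releases/{channel}/{version}/asset?os={os}&arch={arch}&asset={asset}");
    if let Some(id) = metrics_id {
        // Telemetry identifiers are only attached when metrics are enabled.
        url.push_str("&metrics_id=");
        url.push_str(id);
    }
    url
}

fn main() {
    let url = build_asset_url(
        "https://cloud.zed.dev",
        "stable",
        "latest",
        "macos",
        "aarch64",
        "zed",
        None,
    );
    assert_eq!(
        url,
        "https://cloud.zed.dev/releases/stable/latest/asset?os=macos&arch=aarch64&asset=zed"
    );
}
```

Passing `None` for a version resolves to `latest`, which is how the periodic update check asks for the newest build without knowing its version in advance.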
-    async fn update(this: Entity<Self>, mut cx: AsyncApp) -> Result<()> {
+    async fn update(this: Entity<Self>, cx: &mut AsyncApp) -> Result<()> {
         let (client, installed_version, previous_status, release_channel) =
-            this.read_with(&cx, |this, cx| {
+            this.read_with(cx, |this, cx| {
                 (
-                    this.http_client.clone(),
+                    this.client.http_client(),
                     this.current_version,
                     this.status.clone(),
-                    ReleaseChannel::try_global(cx),
+                    ReleaseChannel::try_global(cx).unwrap_or(ReleaseChannel::Stable),
                 )
             })?;
         Self::check_dependencies()?;
-        this.update(&mut cx, |this, cx| {
+        this.update(cx, |this, cx| {
             this.status = AutoUpdateStatus::Checking;
             log::info!("Auto Update: checking for updates");
             cx.notify();
         })?;
         let fetched_release_data =
-            Self::get_latest_release(&this, "zed", OS, ARCH, release_channel, &mut cx).await?;
+            Self::get_release_asset(&this, release_channel, None, "zed", OS, ARCH, cx).await?;
         let fetched_version = fetched_release_data.clone().version;
         let app_commit_sha = cx.update(|cx| AppCommitSha::try_global(cx).map(|sha| sha.full()));
         let newer_version = Self::check_if_fetched_version_is_newer(
-            *RELEASE_CHANNEL,
+            release_channel,
             app_commit_sha,
             installed_version,
             fetched_version,
@@ -568,7 +581,7 @@ impl AutoUpdater {
         )?;
         let Some(newer_version) = newer_version else {
-            return this.update(&mut cx, |this, cx| {
+            return this.update(cx, |this, cx| {
                 let status = match previous_status {
                     AutoUpdateStatus::Updated { .. } => previous_status,
                     _ => AutoUpdateStatus::Idle,
@@ -578,7 +591,7 @@ impl AutoUpdater {
             });
         };
-        this.update(&mut cx, |this, cx| {
+        this.update(cx, |this, cx| {
             this.status = AutoUpdateStatus::Downloading {
                 version: newer_version.clone(),
             };
@@ -587,21 +600,21 @@ impl AutoUpdater {
         let installer_dir = InstallerDir::new().await?;
         let target_path = Self::target_path(&installer_dir).await?;
-        download_release(&target_path, fetched_release_data, client, &cx).await?;
+        download_release(&target_path, fetched_release_data, client).await?;
-        this.update(&mut cx, |this, cx| {
+        this.update(cx, |this, cx| {
             this.status = AutoUpdateStatus::Installing {
                 version: newer_version.clone(),
             };
             cx.notify();
         })?;
-        let new_binary_path = Self::install_release(installer_dir, target_path, &cx).await?;
+        let new_binary_path = Self::install_release(installer_dir, target_path, cx).await?;
         if let Some(new_binary_path) = new_binary_path {
             cx.update(|cx| cx.set_restart_path(new_binary_path))?;
         }
-        this.update(&mut cx, |this, cx| {
+        this.update(cx, |this, cx| {
             this.set_should_show_update_notification(true, cx)
                 .detach_and_log_err(cx);
             this.status = AutoUpdateStatus::Updated {
@@ -680,6 +693,12 @@ impl AutoUpdater {
         target_path: PathBuf,
         cx: &AsyncApp,
     ) -> Result<Option<PathBuf>> {
+        #[cfg(test)]
+        if let Some(test_install) =
+            cx.try_read_global::<tests::InstallOverride, _>(|g, _| g.0.clone())
+        {
+            return test_install(target_path, cx);
+        }
         match OS {
             "macos" => install_release_macos(&installer_dir, target_path, cx).await,
             "linux" => install_release_linux(&installer_dir, target_path, cx).await,
@@ -730,16 +749,13 @@ impl AutoUpdater {
 async fn download_remote_server_binary(
     target_path: &PathBuf,
-    release: JsonRelease,
+    release: ReleaseAsset,
     client: Arc<HttpClientWithUrl>,
-    cx: &AsyncApp,
 ) -> Result<()> {
     let temp = tempfile::Builder::new().tempfile_in(remote_servers_dir())?;
     let mut temp_file = File::create(&temp).await?;
-    let update_request_body = build_remote_server_update_request_body(cx)?;
-    let request_body = AsyncBody::from(serde_json::to_string(&update_request_body)?);
-    let mut response = client.get(&release.url, request_body, true).await?;
+    let mut response = client.get(&release.url, Default::default(), true).await?;
     anyhow::ensure!(
         response.status().is_success(),
         "failed to download remote server release: {:?}",
@@ -751,65 +767,19 @@ async fn download_remote_server_binary(
     Ok(())
 }
-fn build_remote_server_update_request_body(cx: &AsyncApp) -> Result<UpdateRequestBody> {
-    let (installation_id, release_channel, telemetry_enabled, is_staff) = cx.update(|cx| {
-        let telemetry = Client::global(cx).telemetry().clone();
-        let is_staff = telemetry.is_staff();
-        let installation_id = telemetry.installation_id();
-        let release_channel =
-            ReleaseChannel::try_global(cx).map(|release_channel| release_channel.display_name());
-        let telemetry_enabled = TelemetrySettings::get_global(cx).metrics;
-        (
-            installation_id,
-            release_channel,
-            telemetry_enabled,
-            is_staff,
-        )
-    })?;
-    Ok(UpdateRequestBody {
-        installation_id,
-        release_channel,
-        telemetry: telemetry_enabled,
-        is_staff,
-        destination: "remote",
-    })
-}
 async fn download_release(
     target_path: &Path,
-    release: JsonRelease,
+    release: ReleaseAsset,
     client: Arc<HttpClientWithUrl>,
-    cx: &AsyncApp,
 ) -> Result<()> {
     let mut target_file = File::create(&target_path).await?;
-    let (installation_id, release_channel, telemetry_enabled, is_staff) = cx.update(|cx| {
-        let telemetry = Client::global(cx).telemetry().clone();
-        let is_staff = telemetry.is_staff();
-        let installation_id = telemetry.installation_id();
-        let release_channel =
-            ReleaseChannel::try_global(cx).map(|release_channel| release_channel.display_name());
-        let telemetry_enabled = TelemetrySettings::get_global(cx).metrics;
-        (
-            installation_id,
-            release_channel,
-            telemetry_enabled,
-            is_staff,
-        )
-    })?;
-    let request_body = AsyncBody::from(serde_json::to_string(&UpdateRequestBody {
-        installation_id,
-        release_channel,
-        telemetry: telemetry_enabled,
-        is_staff,
-        destination: "local",
-    })?);
-    let mut response = client.get(&release.url, request_body, true).await?;
+    let mut response = client.get(&release.url, Default::default(), true).await?;
+    anyhow::ensure!(
+        response.status().is_success(),
+        "failed to download update: {:?}",
+        response.status()
+    );
     smol::io::copy(response.body_mut(), &mut target_file).await?;
     log::info!("downloaded update. path:{:?}", target_path);
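After this change the download is a plain GET (the telemetry request body moved into the asset URL's query string), leaving only the stream-to-file step. The write-then-persist shape of that step can be sketched with synchronous std I/O — the real code uses `smol` and `tempfile`, and `save_download` here is an illustrative stand-in, not the actual function:

```rust
use std::fs;
use std::io::Write;
use std::path::Path;

// Sketch: stream bytes into a temporary file and only move it into place
// once the whole write finished, so a crashed download never leaves a
// half-written binary at the target path.
fn save_download(target: &Path, bytes: &[u8]) -> std::io::Result<()> {
    let tmp = target.with_extension("partial");
    let mut file = fs::File::create(&tmp)?;
    file.write_all(bytes)?;
    file.sync_all()?;
    // Rename is atomic when source and target share a filesystem.
    fs::rename(&tmp, target)?;
    Ok(())
}

fn main() -> std::io::Result<()> {
    let target = std::env::temp_dir().join("zed-download-sketch");
    save_download(&target, b"payload")?;
    assert_eq!(fs::read(&target)?, b"payload");
    fs::remove_file(&target)
}
```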
@@ -1009,11 +979,33 @@ pub async fn finalize_auto_update_on_quit() {
 #[cfg(test)]
 mod tests {
+    use client::Client;
+    use clock::FakeSystemClock;
+    use futures::channel::oneshot;
     use gpui::TestAppContext;
+    use http_client::{FakeHttpClient, Response};
     use settings::default_settings;
+    use std::{
+        rc::Rc,
+        sync::{
+            Arc,
+            atomic::{self, AtomicBool},
+        },
+    };
+    use tempfile::tempdir;
+    #[ctor::ctor]
+    fn init_logger() {
+        zlog::init_test();
+    }
     use super::*;
+    pub(super) struct InstallOverride(
+        pub Rc<dyn Fn(PathBuf, &AsyncApp) -> Result<Option<PathBuf>>>,
+    );
+    impl Global for InstallOverride {}
     #[gpui::test]
     fn test_auto_update_defaults_to_true(cx: &mut TestAppContext) {
         cx.update(|cx| {
@@ -1025,11 +1017,119 @@ mod tests {
                 .set_user_settings("{}", cx)
                 .expect("Unable to set user settings");
             cx.set_global(store);
-            AutoUpdateSetting::register(cx);
             assert!(AutoUpdateSetting::get_global(cx).0);
         });
     }
+    #[gpui::test]
+    async fn test_auto_update_downloads(cx: &mut TestAppContext) {
+        cx.background_executor.allow_parking();
+        zlog::init_test();
+        let release_available = Arc::new(AtomicBool::new(false));
+        let (dmg_tx, dmg_rx) = oneshot::channel::<String>();
+        cx.update(|cx| {
+            settings::init(cx);
+            let current_version = SemanticVersion::new(0, 100, 0);
+            release_channel::init_test(current_version, ReleaseChannel::Stable, cx);
+            let clock = Arc::new(FakeSystemClock::new());
+            let release_available = Arc::clone(&release_available);
+            let dmg_rx = Arc::new(parking_lot::Mutex::new(Some(dmg_rx)));
+            let fake_client_http = FakeHttpClient::create(move |req| {
+                let release_available = release_available.load(atomic::Ordering::Relaxed);
+                let dmg_rx = dmg_rx.clone();
+                async move {
+                    if req.uri().path() == "/releases/stable/latest/asset" {
+                        if release_available {
+                            return Ok(Response::builder().status(200).body(
+                                r#"{"version":"0.100.1","url":"https://test.example/new-download"}"#.into()
+                            ).unwrap());
+                        } else {
+                            return Ok(Response::builder().status(200).body(
+                                r#"{"version":"0.100.0","url":"https://test.example/old-download"}"#.into()
+                            ).unwrap());
+                        }
+                    } else if req.uri().path() == "/new-download" {
+                        return Ok(Response::builder().status(200).body({
+                            let dmg_rx = dmg_rx.lock().take().unwrap();
+                            dmg_rx.await.unwrap().into()
+                        }).unwrap());
+                    }
+                    Ok(Response::builder().status(404).body("".into()).unwrap())
+                }
+            });
+            let client = Client::new(clock, fake_client_http, cx);
+            crate::init(client, cx);
+        });
+        let auto_updater = cx.update(|cx| AutoUpdater::get(cx).expect("auto updater should exist"));
+        cx.background_executor.run_until_parked();
+        auto_updater.read_with(cx, |updater, _| {
+            assert_eq!(updater.status(), AutoUpdateStatus::Idle);
+            assert_eq!(updater.current_version(), SemanticVersion::new(0, 100, 0));
+        });
+        release_available.store(true, atomic::Ordering::SeqCst);
+        cx.background_executor.advance_clock(POLL_INTERVAL);
+        cx.background_executor.run_until_parked();
+        loop {
+            cx.background_executor.timer(Duration::from_millis(0)).await;
+            cx.run_until_parked();
+            let status = auto_updater.read_with(cx, |updater, _| updater.status());
+            if !matches!(status, AutoUpdateStatus::Idle) {
+                break;
+            }
+        }
+        let status = auto_updater.read_with(cx, |updater, _| updater.status());
+        assert_eq!(
+            status,
+            AutoUpdateStatus::Downloading {
+                version: VersionCheckType::Semantic(SemanticVersion::new(0, 100, 1))
+            }
+        );
+        dmg_tx.send("<fake-zed-update>".to_owned()).unwrap();
+        let tmp_dir = Arc::new(tempdir().unwrap());
+        cx.update(|cx| {
+            let tmp_dir = tmp_dir.clone();
+            cx.set_global(InstallOverride(Rc::new(move |target_path, _cx| {
+                let tmp_dir = tmp_dir.clone();
+                let dest_path = tmp_dir.path().join("zed");
+                std::fs::copy(&target_path, &dest_path)?;
+                Ok(Some(dest_path))
+            })));
+        });
+        loop {
+            cx.background_executor.timer(Duration::from_millis(0)).await;
+            cx.run_until_parked();
+            let status = auto_updater.read_with(cx, |updater, _| updater.status());
+            if !matches!(status, AutoUpdateStatus::Downloading { .. }) {
+                break;
+            }
+        }
+        let status = auto_updater.read_with(cx, |updater, _| updater.status());
+        assert_eq!(
+            status,
+            AutoUpdateStatus::Updated {
+                version: VersionCheckType::Semantic(SemanticVersion::new(0, 100, 1))
+            }
+        );
+        let will_restart = cx.expect_restart();
+        cx.update(|cx| cx.restart());
+        let path = will_restart.await.unwrap().unwrap();
+        assert_eq!(path, tmp_dir.path().join("zed"));
+        assert_eq!(std::fs::read_to_string(path).unwrap(), "<fake-zed-update>");
+    }
     #[test]
     fn test_stable_does_not_update_when_fetched_version_is_not_higher() {
         let release_channel = ReleaseChannel::Stable;


@@ -1,7 +1,6 @@
 pub mod participant;
 pub mod room;
-use crate::call_settings::CallSettings;
 use anyhow::{Context as _, Result, anyhow};
 use audio::Audio;
 use client::{ChannelId, Client, TypedEnvelope, User, UserStore, ZED_ALWAYS_ACTIVE, proto};
@@ -14,7 +13,6 @@ use gpui::{
 use postage::watch;
 use project::Project;
 use room::Event;
-use settings::Settings;
 use std::sync::Arc;
 pub use livekit_client::{RemoteVideoTrack, RemoteVideoTrackView, RemoteVideoTrackViewEvent};
@@ -26,8 +24,6 @@ struct GlobalActiveCall(Entity<ActiveCall>);
 impl Global for GlobalActiveCall {}
 pub fn init(client: Arc<Client>, user_store: Entity<UserStore>, cx: &mut App) {
-    CallSettings::register(cx);
     let active_call = cx.new(|cx| ActiveCall::new(client, user_store, cx));
     cx.set_global(GlobalActiveCall(active_call));
 }


@@ -1,6 +1,6 @@
-use settings::Settings;
+use settings::{RegisterSetting, Settings};
-#[derive(Debug)]
+#[derive(Debug, RegisterSetting)]
 pub struct CallSettings {
     pub mute_on_join: bool,
     pub share_on_join: bool,


@@ -237,7 +237,6 @@ fn init_test(cx: &mut App) -> Entity<ChannelStore> {
     let settings_store = SettingsStore::test(cx);
     cx.set_global(settings_store);
     release_channel::init(SemanticVersion::default(), cx);
-    client::init_settings(cx);
     let clock = Arc::new(FakeSystemClock::new());
     let http = FakeHttpClient::with_404_response();


@@ -30,7 +30,7 @@ use rand::prelude::*;
 use release_channel::{AppVersion, ReleaseChannel};
 use rpc::proto::{AnyTypedEnvelope, EnvelopedMessage, PeerId, RequestMessage};
 use serde::{Deserialize, Serialize};
-use settings::{Settings, SettingsContent};
+use settings::{RegisterSetting, Settings, SettingsContent};
 use std::{
     any::TypeId,
     convert::TryFrom,
@@ -95,7 +95,7 @@
     ]
 );
-#[derive(Deserialize)]
+#[derive(Deserialize, RegisterSetting)]
 pub struct ClientSettings {
     pub server_url: String,
 }
@@ -113,7 +113,7 @@ impl Settings for ClientSettings {
     }
 }
-#[derive(Deserialize, Default)]
+#[derive(Deserialize, Default, RegisterSetting)]
 pub struct ProxySettings {
     pub proxy: Option<String>,
 }
@@ -140,12 +140,6 @@ impl Settings for ProxySettings {
     }
 }
-pub fn init_settings(cx: &mut App) {
-    TelemetrySettings::register(cx);
-    ClientSettings::register(cx);
-    ProxySettings::register(cx);
-}
 pub fn init(client: &Arc<Client>, cx: &mut App) {
     let client = Arc::downgrade(client);
     cx.on_action({
@@ -508,7 +502,7 @@ impl<T: 'static> Drop for PendingEntitySubscription<T> {
     }
 }
-#[derive(Copy, Clone, Deserialize, Debug)]
+#[derive(Copy, Clone, Deserialize, Debug, RegisterSetting)]
 pub struct TelemetrySettings {
     pub diagnostics: bool,
     pub metrics: bool,
@@ -1493,7 +1487,7 @@ impl Client {
         let url = self
             .http
-            .build_zed_cloud_url("/internal/users/impersonate", &[])?;
+            .build_zed_cloud_url("/internal/users/impersonate")?;
         let request = Request::post(url.as_str())
             .header("Content-Type", "application/json")
             .header("Authorization", format!("Bearer {api_token}"))
@@ -2177,7 +2171,6 @@ mod tests {
         cx.update(|cx| {
             let settings_store = SettingsStore::test(cx);
             cx.set_global(settings_store);
-            init_settings(cx);
         });
     }
 }


@@ -179,8 +179,6 @@ impl Telemetry {
         let release_channel =
             ReleaseChannel::try_global(cx).map(|release_channel| release_channel.display_name());
-        TelemetrySettings::register(cx);
         let state = Arc::new(Mutex::new(TelemetryState {
             settings: *TelemetrySettings::get_global(cx),
             architecture: env::consts::ARCH,
@@ -437,7 +435,7 @@ impl Telemetry {
         Some(project_types)
     }
-    fn report_event(self: &Arc<Self>, event: Event) {
+    fn report_event(self: &Arc<Self>, mut event: Event) {
         let mut state = self.state.lock();
         // RUST_LOG=telemetry=trace to debug telemetry events
         log::trace!(target: "telemetry", "{:?}", event);
@@ -446,6 +444,12 @@
             return;
         }
+        match &mut event {
+            Event::Flexible(event) => event
+                .event_properties
+                .insert("event_source".into(), "zed".into()),
+        };
         if state.flush_events_task.is_none() {
             let this = self.clone();
             state.flush_events_task = Some(self.executor.spawn(async move {
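The stamping above mutates each event before it is queued for flushing, so every flexible event carries an `event_source: "zed"` property. Conceptually, stripped of the real telemetry types (`FlexibleEvent` here is a simplified stand-in, not Zed's actual struct):

```rust
use std::collections::HashMap;

// Simplified stand-in for the telemetry event type.
struct FlexibleEvent {
    event_properties: HashMap<String, String>,
}

// Tag the event with its source before it is queued, so the backend can
// distinguish editor-originated events from other producers.
fn stamp_event_source(event: &mut FlexibleEvent) {
    event
        .event_properties
        .insert("event_source".into(), "zed".into());
}

fn main() {
    let mut event = FlexibleEvent {
        event_properties: HashMap::new(),
    };
    stamp_event_source(&mut event);
    assert_eq!(event.event_properties["event_source"], "zed");
}
```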


@@ -260,7 +260,7 @@ impl fmt::Debug for Lamport {
 impl fmt::Debug for Global {
     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
         write!(f, "Global {{")?;
-        for timestamp in self.iter() {
+        for timestamp in self.iter().filter(|t| t.value > 0) {
             if timestamp.replica_id.0 > 0 {
                 write!(f, ", ")?;
             }
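The effect of adding that `.filter(|t| t.value > 0)` is that the vector clock's `Debug` output only lists replicas that have actually ticked. A self-contained sketch of the idea — `Entry` and `Clock` are simplified stand-ins, and the separator logic here uses `enumerate` rather than the real implementation's replica-id check:

```rust
use std::fmt;

// Simplified stand-ins for the vector-clock types.
struct Entry {
    replica_id: u16,
    value: u32,
}
struct Clock(Vec<Entry>);

impl fmt::Debug for Clock {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "Global {{")?;
        // Skip entries that never ticked so they don't clutter the output.
        for (i, t) in self.0.iter().filter(|t| t.value > 0).enumerate() {
            if i > 0 {
                write!(f, ", ")?;
            }
            write!(f, "{}: {}", t.replica_id, t.value)?;
        }
        write!(f, "}}")
    }
}

fn main() {
    let clock = Clock(vec![
        Entry { replica_id: 1, value: 3 },
        Entry { replica_id: 2, value: 0 },
    ]);
    assert_eq!(format!("{:?}", clock), "Global {1: 3}");
}
```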


@@ -62,7 +62,7 @@ impl CloudApiClient {
         let request = self.build_request(
             Request::builder().method(Method::GET).uri(
                 self.http_client
-                    .build_zed_cloud_url("/client/users/me", &[])?
+                    .build_zed_cloud_url("/client/users/me")?
                     .as_ref(),
             ),
             AsyncBody::default(),
@@ -89,7 +89,7 @@
     pub fn connect(&self, cx: &App) -> Result<Task<Result<Connection>>> {
         let mut connect_url = self
             .http_client
-            .build_zed_cloud_url("/client/users/connect", &[])?;
+            .build_zed_cloud_url("/client/users/connect")?;
         connect_url
             .set_scheme(match connect_url.scheme() {
                 "https" => "wss",
@@ -123,7 +123,7 @@
             .method(Method::POST)
             .uri(
                 self.http_client
-                    .build_zed_cloud_url("/client/llm_tokens", &[])?
+                    .build_zed_cloud_url("/client/llm_tokens")?
                     .as_ref(),
             )
             .when_some(system_id, |builder, system_id| {
@@ -154,7 +154,7 @@
         let request = build_request(
             Request::builder().method(Method::GET).uri(
                 self.http_client
-                    .build_zed_cloud_url("/client/users/me", &[])?
+                    .build_zed_cloud_url("/client/users/me")?
                     .as_ref(),
             ),
             AsyncBody::default(),


@@ -1,5 +1,4 @@
 pub mod predict_edits_v3;
-pub mod udiff;
 use std::str::FromStr;
 use std::sync::Arc;
@@ -184,13 +183,13 @@ pub struct PredictEditsGitInfo {
 #[derive(Debug, Clone, Serialize, Deserialize)]
 pub struct PredictEditsResponse {
-    pub request_id: Uuid,
+    pub request_id: String,
     pub output_excerpt: String,
 }
 #[derive(Debug, Clone, Serialize, Deserialize)]
 pub struct AcceptEditPredictionBody {
-    pub request_id: Uuid,
+    pub request_id: String,
 }
 #[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone, Copy, Serialize, Deserialize)]


@@ -1,7 +1,7 @@
 use chrono::Duration;
 use serde::{Deserialize, Serialize};
 use std::{
-    fmt::Display,
+    fmt::{Display, Write as _},
     ops::{Add, Range, Sub},
     path::{Path, PathBuf},
     sync::Arc,
@@ -11,7 +11,14 @@ use uuid::Uuid;
 use crate::PredictEditsGitInfo;
-// TODO: snippet ordering within file / relative to excerpt
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct PlanContextRetrievalRequest {
+    pub excerpt: String,
+    pub excerpt_path: Arc<Path>,
+    pub excerpt_line_range: Range<Line>,
+    pub cursor_file_max_row: Line,
+    pub events: Vec<Event>,
+}
 #[derive(Debug, Clone, Serialize, Deserialize)]
 pub struct PredictEditsRequest {
@@ -66,6 +73,7 @@ pub enum PromptFormat {
     MarkedExcerpt,
     LabeledSections,
     NumLinesUniDiff,
+    OldTextNewText,
     /// Prompt format intended for use via zeta_cli
     OnlySnippets,
 }
@@ -93,6 +101,7 @@ impl std::fmt::Display for PromptFormat {
             PromptFormat::LabeledSections => write!(f, "Labeled Sections"),
             PromptFormat::OnlySnippets => write!(f, "Only Snippets"),
             PromptFormat::NumLinesUniDiff => write!(f, "Numbered Lines / Unified Diff"),
+            PromptFormat::OldTextNewText => write!(f, "Old Text / New Text"),
         }
     }
 }
@@ -125,15 +134,15 @@ impl Display for Event {
                 write!(
                     f,
                     "// User accepted prediction:\n--- a/{}\n+++ b/{}\n{diff}",
-                    old_path.display(),
-                    new_path.display()
+                    DiffPathFmt(old_path),
+                    DiffPathFmt(new_path)
                 )
             } else {
                 write!(
                     f,
                     "--- a/{}\n+++ b/{}\n{diff}",
-                    old_path.display(),
-                    new_path.display()
+                    DiffPathFmt(old_path),
+                    DiffPathFmt(new_path)
                 )
             }
         }
@@ -141,6 +150,24 @@
     }
 }
+/// always format the Path as a unix path with `/` as the path sep in Diffs
+pub struct DiffPathFmt<'a>(pub &'a Path);
+impl<'a> std::fmt::Display for DiffPathFmt<'a> {
+    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
+        let mut is_first = true;
+        for component in self.0.components() {
+            if !is_first {
+                f.write_char('/')?;
+            } else {
+                is_first = false;
+            }
+            write!(f, "{}", component.as_os_str().display())?;
+        }
+        Ok(())
+    }
+}
 #[derive(Debug, Clone, Serialize, Deserialize)]
 pub struct Signature {
     pub text: String,
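The `DiffPathFmt` wrapper added above exists so diff headers render identically on every platform: a `Path` is printed component by component with `/` as the separator, instead of relying on `Path::display`, which would emit `\` on Windows. A self-contained re-sketch of the same idea — `UnixPath` is an illustrative name, and this version uses `to_string_lossy` instead of the `OsStr::display` call in the diff:

```rust
use std::fmt::{self, Write as _};
use std::path::Path;

// Render a Path with `/` separators regardless of host platform, so that
// `--- a/{path}` / `+++ b/{path}` diff headers stay stable across OSes.
struct UnixPath<'a>(&'a Path);

impl fmt::Display for UnixPath<'_> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        for (i, component) in self.0.components().enumerate() {
            if i > 0 {
                f.write_char('/')?;
            }
            write!(f, "{}", component.as_os_str().to_string_lossy())?;
        }
        Ok(())
    }
}

fn main() {
    let path = Path::new("foo/bar/baz.txt");
    assert_eq!(format!("{}", UnixPath(path)), "foo/bar/baz.txt");
}
```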


@@ -1,294 +0,0 @@
use std::{borrow::Cow, fmt::Display};
#[derive(Debug, PartialEq)]
pub enum DiffLine<'a> {
OldPath { path: Cow<'a, str> },
NewPath { path: Cow<'a, str> },
HunkHeader(Option<HunkLocation>),
Context(&'a str),
Deletion(&'a str),
Addition(&'a str),
Garbage(&'a str),
}
#[derive(Debug, PartialEq)]
pub struct HunkLocation {
start_line_old: u32,
count_old: u32,
start_line_new: u32,
count_new: u32,
}
impl<'a> DiffLine<'a> {
pub fn parse(line: &'a str) -> Self {
Self::try_parse(line).unwrap_or(Self::Garbage(line))
}
fn try_parse(line: &'a str) -> Option<Self> {
if let Some(header) = line.strip_prefix("---").and_then(eat_required_whitespace) {
let path = parse_header_path("a/", header);
Some(Self::OldPath { path })
} else if let Some(header) = line.strip_prefix("+++").and_then(eat_required_whitespace) {
Some(Self::NewPath {
path: parse_header_path("b/", header),
})
} else if let Some(header) = line.strip_prefix("@@").and_then(eat_required_whitespace) {
if header.starts_with("...") {
return Some(Self::HunkHeader(None));
}
let (start_line_old, header) = header.strip_prefix('-')?.split_once(',')?;
let mut parts = header.split_ascii_whitespace();
let count_old = parts.next()?;
let (start_line_new, count_new) = parts.next()?.strip_prefix('+')?.split_once(',')?;
Some(Self::HunkHeader(Some(HunkLocation {
start_line_old: start_line_old.parse::<u32>().ok()?.saturating_sub(1),
count_old: count_old.parse().ok()?,
start_line_new: start_line_new.parse::<u32>().ok()?.saturating_sub(1),
count_new: count_new.parse().ok()?,
})))
} else if let Some(deleted_header) = line.strip_prefix("-") {
Some(Self::Deletion(deleted_header))
} else if line.is_empty() {
Some(Self::Context(""))
} else if let Some(context) = line.strip_prefix(" ") {
Some(Self::Context(context))
} else {
Some(Self::Addition(line.strip_prefix("+")?))
}
}
}
impl<'a> Display for DiffLine<'a> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
DiffLine::OldPath { path } => write!(f, "--- {path}"),
DiffLine::NewPath { path } => write!(f, "+++ {path}"),
DiffLine::HunkHeader(Some(hunk_location)) => {
write!(
f,
"@@ -{},{} +{},{} @@",
hunk_location.start_line_old + 1,
hunk_location.count_old,
hunk_location.start_line_new + 1,
hunk_location.count_new
)
}
DiffLine::HunkHeader(None) => write!(f, "@@ ... @@"),
DiffLine::Context(content) => write!(f, " {content}"),
DiffLine::Deletion(content) => write!(f, "-{content}"),
DiffLine::Addition(content) => write!(f, "+{content}"),
DiffLine::Garbage(line) => write!(f, "{line}"),
}
}
}
fn parse_header_path<'a>(strip_prefix: &'static str, header: &'a str) -> Cow<'a, str> {
if !header.contains(['"', '\\']) {
let path = header.split_ascii_whitespace().next().unwrap_or(header);
return Cow::Borrowed(path.strip_prefix(strip_prefix).unwrap_or(path));
}
let mut path = String::with_capacity(header.len());
let mut in_quote = false;
let mut chars = header.chars().peekable();
let mut strip_prefix = Some(strip_prefix);
while let Some(char) = chars.next() {
if char == '"' {
in_quote = !in_quote;
} else if char == '\\' {
let Some(&next_char) = chars.peek() else {
break;
};
chars.next();
path.push(next_char);
} else if char.is_ascii_whitespace() && !in_quote {
break;
} else {
path.push(char);
}
if let Some(prefix) = strip_prefix
&& path == prefix
{
strip_prefix.take();
path.clear();
}
}
Cow::Owned(path)
}
fn eat_required_whitespace(header: &str) -> Option<&str> {
let trimmed = header.trim_ascii_start();
if trimmed.len() == header.len() {
None
} else {
Some(trimmed)
}
}
#[cfg(test)]
mod tests {
use super::*;
use indoc::indoc;
#[test]
fn parse_lines_simple() {
let input = indoc! {"
diff --git a/text.txt b/text.txt
index 86c770d..a1fd855 100644
--- a/file.txt
+++ b/file.txt
@@ -1,2 +1,3 @@
context
-deleted
+inserted
garbage
--- b/file.txt
+++ a/file.txt
"};
let lines = input.lines().map(DiffLine::parse).collect::<Vec<_>>();
pretty_assertions::assert_eq!(
lines,
&[
DiffLine::Garbage("diff --git a/text.txt b/text.txt"),
DiffLine::Garbage("index 86c770d..a1fd855 100644"),
DiffLine::OldPath {
path: "file.txt".into()
},
DiffLine::NewPath {
path: "file.txt".into()
},
DiffLine::HunkHeader(Some(HunkLocation {
start_line_old: 0,
count_old: 2,
start_line_new: 0,
count_new: 3
})),
DiffLine::Context("context"),
DiffLine::Deletion("deleted"),
DiffLine::Addition("inserted"),
DiffLine::Garbage("garbage"),
DiffLine::Context(""),
DiffLine::OldPath {
path: "b/file.txt".into()
},
DiffLine::NewPath {
path: "a/file.txt".into()
},
]
);
}
#[test]
fn file_header_extra_space() {
let options = ["--- file", "--- file", "---\tfile"];
for option in options {
pretty_assertions::assert_eq!(
DiffLine::parse(option),
DiffLine::OldPath {
path: "file".into()
},
"{option}",
);
}
}
#[test]
fn hunk_header_extra_space() {
let options = [
"@@ -1,2 +1,3 @@",
"@@ -1,2 +1,3 @@",
"@@\t-1,2\t+1,3\t@@",
"@@ -1,2 +1,3 @@",
"@@ -1,2 +1,3 @@",
"@@ -1,2 +1,3 @@",
"@@ -1,2 +1,3 @@ garbage",
];
for option in options {
pretty_assertions::assert_eq!(
DiffLine::parse(option),
DiffLine::HunkHeader(Some(HunkLocation {
start_line_old: 0,
count_old: 2,
start_line_new: 0,
count_new: 3
})),
"{option}",
);
}
}
#[test]
fn hunk_header_without_location() {
pretty_assertions::assert_eq!(DiffLine::parse("@@ ... @@"), DiffLine::HunkHeader(None));
}
#[test]
fn test_parse_path() {
assert_eq!(parse_header_path("a/", "foo.txt"), "foo.txt");
assert_eq!(
parse_header_path("a/", "foo/bar/baz.txt"),
"foo/bar/baz.txt"
);
assert_eq!(parse_header_path("a/", "a/foo.txt"), "foo.txt");
assert_eq!(
parse_header_path("a/", "a/foo/bar/baz.txt"),
"foo/bar/baz.txt"
);
// Extra
assert_eq!(
parse_header_path("a/", "a/foo/bar/baz.txt 2025"),
"foo/bar/baz.txt"
);
assert_eq!(
parse_header_path("a/", "a/foo/bar/baz.txt\t2025"),
"foo/bar/baz.txt"
);
assert_eq!(
parse_header_path("a/", "a/foo/bar/baz.txt \""),
"foo/bar/baz.txt"
);
// Quoted
assert_eq!(
parse_header_path("a/", "a/foo/bar/\"baz quox.txt\""),
"foo/bar/baz quox.txt"
);
assert_eq!(
parse_header_path("a/", "\"a/foo/bar/baz quox.txt\""),
"foo/bar/baz quox.txt"
);
assert_eq!(
parse_header_path("a/", "\"foo/bar/baz quox.txt\""),
"foo/bar/baz quox.txt"
);
assert_eq!(parse_header_path("a/", "\"whatever 🤷\""), "whatever 🤷");
assert_eq!(
parse_header_path("a/", "\"foo/bar/baz quox.txt\" 2025"),
"foo/bar/baz quox.txt"
);
// unescaped quotes are dropped
assert_eq!(parse_header_path("a/", "foo/\"bar\""), "foo/bar");
// Escaped
assert_eq!(
parse_header_path("a/", "\"foo/\\\"bar\\\"/baz.txt\""),
"foo/\"bar\"/baz.txt"
);
assert_eq!(
parse_header_path("a/", "\"C:\\\\Projects\\\\My App\\\\old file.txt\""),
"C:\\Projects\\My App\\old file.txt"
);
}
}
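The hunk-header tests above rely on one conversion: diff headers are 1-based, while `HunkLocation` stores 0-based start lines (so `-1,2` becomes `start_line_old: 0, count_old: 2`). A minimal, hypothetical re-implementation of that arithmetic using only the standard library (not the parser under test, which also tolerates extra whitespace and trailing garbage):

```rust
// Sketch: parse "@@ -1,2 +1,3 @@" into (start_old, count_old, start_new, count_new),
// converting the 1-based header coordinates to 0-based start lines.
fn parse_hunk_location(header: &str) -> Option<(u32, u32, u32, u32)> {
    let rest = header.strip_prefix("@@")?.trim_start();
    let mut parts = rest.split_whitespace();
    let old = parts.next()?.strip_prefix('-')?;
    let new = parts.next()?.strip_prefix('+')?;
    // "N,M" gives start N and count M; a bare "N" means a count of 1.
    let parse_pair = |s: &str| -> Option<(u32, u32)> {
        let (start, count) = s.split_once(',').unwrap_or((s, "1"));
        Some((start.parse::<u32>().ok()?.saturating_sub(1), count.parse().ok()?))
    };
    let (start_old, count_old) = parse_pair(old)?;
    let (start_new, count_new) = parse_pair(new)?;
    Some((start_old, count_old, start_new, count_new))
}

fn main() {
    assert_eq!(parse_hunk_location("@@ -1,2 +1,3 @@"), Some((0, 2, 0, 3)));
    // Location-free headers like "@@ ... @@" carry no coordinates.
    assert_eq!(parse_hunk_location("@@ ... @@"), None);
}
```

This mirrors why `hunk_header_without_location` expects `DiffLine::HunkHeader(None)`: the `@@ ... @@` form the model is instructed to emit has no line numbers to parse.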

View File

@@ -17,5 +17,6 @@ cloud_llm_client.workspace = true
indoc.workspace = true
ordered-float.workspace = true
rustc-hash.workspace = true
+schemars.workspace = true
serde.workspace = true
strum.workspace = true

View File

@@ -1,8 +1,9 @@
//! Zeta2 prompt planning and generation code shared with cloud.
+pub mod retrieval_prompt;
use anyhow::{Context as _, Result, anyhow};
use cloud_llm_client::predict_edits_v3::{
-self, Excerpt, Line, Point, PromptFormat, ReferencedDeclaration,
+self, DiffPathFmt, Excerpt, Line, Point, PromptFormat, ReferencedDeclaration,
};
use indoc::indoc;
use ordered_float::OrderedFloat;
@@ -55,50 +56,98 @@ const LABELED_SECTIONS_INSTRUCTIONS: &str = indoc! {r#"
const NUMBERED_LINES_INSTRUCTIONS: &str = indoc! {r#"
# Instructions
-You are a code completion assistant helping a programmer finish their work. Your task is to:
-1. Analyze the edit history to understand what the programmer is trying to achieve
-2. Identify any incomplete refactoring or changes that need to be finished
-3. Make the remaining edits that a human programmer would logically make next
-4. Apply systematic changes consistently across the entire codebase - if you see a pattern starting, complete it everywhere.
-Focus on:
-- Understanding the intent behind the changes (e.g., improving error handling, refactoring APIs, fixing bugs)
-- Completing any partially-applied changes across the codebase
-- Ensuring consistency with the programming style and patterns already established
-- Making edits that maintain or improve code quality
-- If the programmer started refactoring one instance of a pattern, find and update ALL similar instances
-- Don't write a lot of code if you're not sure what to do
-Rules:
-- Do not just mechanically apply patterns - reason about what changes make sense given the context and the programmer's apparent goals.
-- Do not just fix syntax errors - look for the broader refactoring pattern and apply it systematically throughout the code.
-- Write the edits in the unified diff format as shown in the example.
-# Example output:
+You are an edit prediction agent in a code editor.
+Your job is to predict the next edit that the user will make,
+based on their last few edits and their current cursor location.
+## Output Format
+You must briefly explain your understanding of the user's goal, in one
+or two sentences, and then specify their next edit in the form of a
+unified diff, like this:
```
--- a/src/myapp/cli.py
+++ b/src/myapp/cli.py
-@@ -1,3 +1,3 @@
- import os
- import time
--import sys
-+import json
+@@ ... @@
+ import sys
++from constants import LOG_LEVEL_WARNING
+@@ ... @@
+ config.headless()
+ config.set_interactive(false)
+-config.set_log_level(LOG_L)
++config.set_log_level(LOG_LEVEL_WARNING)
+ config.set_use_color(True)
```
-# Edit History:
+## Edit History
"#};
const UNIFIED_DIFF_REMINDER: &str = indoc! {"
---
-Please analyze the edit history and the files, then provide the unified diff for your predicted edits.
+Analyze the edit history and the files, then provide the unified diff for your predicted edits.
Do not include the cursor marker in your output.
-If you're editing multiple files, be sure to reflect filename in the hunk's header.
+Your diff should include edited file paths in its file headers (lines beginning with `---` and `+++`).
+Do not include line numbers in the hunk headers, use `@@ ... @@`.
+Removed lines begin with `-`.
+Added lines begin with `+`.
+Context lines begin with an extra space.
+Context and removed lines are used to match the target edit location, so make sure to include enough of them
+to uniquely identify it amongst all excerpts of code provided.
"};
const XML_TAGS_INSTRUCTIONS: &str = indoc! {r#"
# Instructions
You are an edit prediction agent in a code editor.
Your job is to predict the next edit that the user will make,
based on their last few edits and their current cursor location.
# Output Format
You must briefly explain your understanding of the user's goal, in one
or two sentences, and then specify their next edit, using the following
XML format:
<edits path="my-project/src/myapp/cli.py">
<old_text>
OLD TEXT 1 HERE
</old_text>
<new_text>
NEW TEXT 1 HERE
</new_text>
<old_text>
OLD TEXT 2 HERE
</old_text>
<new_text>
NEW TEXT 2 HERE
</new_text>
</edits>
- Specify the file to edit using the `path` attribute.
- Use `<old_text>` and `<new_text>` tags to replace content
- `<old_text>` must exactly match existing file content, including indentation
- `<old_text>` cannot be empty
- Do not escape quotes, newlines, or other characters within tags
- Always close all tags properly
- Don't include the <|user_cursor|> marker in your output.
# Edit History:
"#};
const OLD_TEXT_NEW_TEXT_REMINDER: &str = indoc! {r#"
---
Remember that the edits in the edit history have already been deployed.
The files are currently as shown in the Code Excerpts section.
"#};
pub fn build_prompt(
request: &predict_edits_v3::PredictEditsRequest,
) -> Result<(String, SectionLabels)> {
@@ -120,8 +169,9 @@ pub fn build_prompt(
EDITABLE_REGION_END_MARKER_WITH_NEWLINE,
),
],
-PromptFormat::LabeledSections => vec![(request.cursor_point, CURSOR_MARKER)],
-PromptFormat::NumLinesUniDiff => {
+PromptFormat::LabeledSections
+| PromptFormat::NumLinesUniDiff
+| PromptFormat::OldTextNewText => {
vec![(request.cursor_point, CURSOR_MARKER)]
}
PromptFormat::OnlySnippets => vec![],
@@ -131,46 +181,32 @@ pub fn build_prompt(
PromptFormat::MarkedExcerpt => MARKED_EXCERPT_INSTRUCTIONS.to_string(),
PromptFormat::LabeledSections => LABELED_SECTIONS_INSTRUCTIONS.to_string(),
PromptFormat::NumLinesUniDiff => NUMBERED_LINES_INSTRUCTIONS.to_string(),
-// only intended for use via zeta_cli
+PromptFormat::OldTextNewText => XML_TAGS_INSTRUCTIONS.to_string(),
PromptFormat::OnlySnippets => String::new(),
};
if request.events.is_empty() {
prompt.push_str("(No edit history)\n\n");
} else {
-prompt.push_str(
-"The following are the latest edits made by the user, from earlier to later.\n\n",
-);
+prompt.push_str("Here are the latest edits made by the user, from earlier to later.\n\n");
push_events(&mut prompt, &request.events);
}
+prompt.push_str(indoc! {"
+# Code Excerpts
+The cursor marker <|user_cursor|> indicates the current user cursor position.
+The file is in current state, edits from edit history have been applied.
+"});
if request.prompt_format == PromptFormat::NumLinesUniDiff {
-if request.referenced_declarations.is_empty() {
-prompt.push_str(indoc! {"
-# File under the cursor:
-The cursor marker <|user_cursor|> indicates the current user cursor position.
-The file is in current state, edits from edit history have been applied.
-We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
-"});
-} else {
-// Note: This hasn't been trained on yet
-prompt.push_str(indoc! {"
-# Code Excerpts:
-The cursor marker <|user_cursor|> indicates the current user cursor position.
-Other excerpts of code from the project have been included as context based on their similarity to the code under the cursor.
-Context excerpts are not guaranteed to be relevant, so use your own judgement.
-Files are in their current state, edits from edit history have been applied.
-We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
-"});
-}
-} else {
-prompt.push_str("\n## Code\n\n");
+prompt.push_str(indoc! {"
+We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
+"});
}
+prompt.push('\n');
let mut section_labels = Default::default();
if !request.referenced_declarations.is_empty() || !request.signatures.is_empty() {
@@ -197,8 +233,14 @@ pub fn build_prompt(
}
}
-if request.prompt_format == PromptFormat::NumLinesUniDiff {
-prompt.push_str(UNIFIED_DIFF_REMINDER);
+match request.prompt_format {
+PromptFormat::NumLinesUniDiff => {
+prompt.push_str(UNIFIED_DIFF_REMINDER);
+}
+PromptFormat::OldTextNewText => {
+prompt.push_str(OLD_TEXT_NEW_TEXT_REMINDER);
+}
+_ => {}
}
Ok((prompt, section_labels))
@@ -212,7 +254,7 @@ pub fn write_codeblock<'a>(
include_line_numbers: bool,
output: &'a mut String,
) {
-writeln!(output, "`````{}", path.display()).unwrap();
+writeln!(output, "`````{}", DiffPathFmt(path)).unwrap();
write_excerpts(
excerpts,
sorted_insertions,
@@ -275,7 +317,7 @@ pub fn write_excerpts<'a>(
}
}
-fn push_events(output: &mut String, events: &[predict_edits_v3::Event]) {
+pub fn push_events(output: &mut String, events: &[predict_edits_v3::Event]) {
if events.is_empty() {
return;
};
@@ -623,6 +665,7 @@ impl<'a> SyntaxBasedPrompt<'a> {
match self.request.prompt_format {
PromptFormat::MarkedExcerpt
| PromptFormat::OnlySnippets
+| PromptFormat::OldTextNewText
| PromptFormat::NumLinesUniDiff => {
if range.start.0 > 0 && !skipped_last_snippet {
output.push_str("\n");

View File

@@ -0,0 +1,94 @@
use anyhow::Result;
use cloud_llm_client::predict_edits_v3::{self, Excerpt};
use indoc::indoc;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use std::fmt::Write;
use crate::{push_events, write_codeblock};
pub fn build_prompt(request: predict_edits_v3::PlanContextRetrievalRequest) -> Result<String> {
let mut prompt = SEARCH_INSTRUCTIONS.to_string();
if !request.events.is_empty() {
writeln!(&mut prompt, "## User Edits\n")?;
push_events(&mut prompt, &request.events);
}
writeln!(&mut prompt, "## Cursor context")?;
write_codeblock(
&request.excerpt_path,
&[Excerpt {
start_line: request.excerpt_line_range.start,
text: request.excerpt.into(),
}],
&[],
request.cursor_file_max_row,
true,
&mut prompt,
);
writeln!(&mut prompt, "{TOOL_USE_REMINDER}")?;
Ok(prompt)
}
/// Search for relevant code
///
/// For the best results, run multiple queries at once with a single invocation of this tool.
#[derive(Clone, Deserialize, Serialize, JsonSchema)]
pub struct SearchToolInput {
/// An array of queries to run for gathering context relevant to the next prediction
#[schemars(length(max = 3))]
pub queries: Box<[SearchToolQuery]>,
}
/// Search for relevant code by path, syntax hierarchy, and content.
#[derive(Debug, Clone, Serialize, Deserialize, JsonSchema)]
pub struct SearchToolQuery {
/// 1. A glob pattern to match file paths in the codebase to search in.
pub glob: String,
/// 2. Regular expressions to match syntax nodes **by their first line** and hierarchy.
///
/// Subsequent regexes match nodes within the full content of the nodes matched by the previous regexes.
///
/// Example: Searching for a `User` class
/// ["class\s+User"]
///
/// Example: Searching for a `get_full_name` method under a `User` class
/// ["class\s+User", "def\sget_full_name"]
///
/// Skip this field to match on content alone.
#[schemars(length(max = 3))]
#[serde(default)]
pub syntax_node: Vec<String>,
/// 3. An optional regular expression to match the final content that should appear in the results.
///
/// - Content will be matched within all lines of the matched syntax nodes.
/// - If syntax node regexes are provided, this field can be skipped to include as much of the node itself as possible.
/// - If no syntax node regexes are provided, the content will be matched within the entire file.
pub content: Option<String>,
}
pub const TOOL_NAME: &str = "search";
const SEARCH_INSTRUCTIONS: &str = indoc! {r#"
You are part of an edit prediction system in a code editor.
Your role is to search for code that will serve as context for predicting the next edit.
- Analyze the user's recent edits and current cursor context
- Use the `search` tool to find code that is relevant for predicting the next edit
- Focus on finding:
- Code patterns that might need similar changes based on the recent edits
- Functions, variables, types, and constants referenced in the current cursor context
- Related implementations, usages, or dependencies that may require consistent updates
- How items defined in the cursor excerpt are used or altered
- You will not be able to filter results or perform subsequent queries, so keep searches as targeted as possible
- Use `syntax_node` parameter whenever you're looking for a particular type, class, or function
- Avoid using wildcard globs if you already know the file path of the content you're looking for
"#};
const TOOL_USE_REMINDER: &str = indoc! {"
--
Analyze the user's intent in one to two sentences, then call the `search` tool.
"};

View File

@@ -34,7 +34,7 @@ struct CurrentCompletion {
snapshot: BufferSnapshot,
/// The edits that should be applied to transform the original text into the predicted text.
/// Each edit is a range in the buffer and the text to replace it with.
-edits: Arc<[(Range<Anchor>, String)]>,
+edits: Arc<[(Range<Anchor>, Arc<str>)]>,
/// Preview of how the buffer will look after applying the edits.
edit_preview: EditPreview,
}
@@ -42,7 +42,7 @@ struct CurrentCompletion {
impl CurrentCompletion {
/// Attempts to adjust the edits based on changes made to the buffer since the completion was generated.
/// Returns None if the user's edits conflict with the predicted edits.
-fn interpolate(&self, new_snapshot: &BufferSnapshot) -> Option<Vec<(Range<Anchor>, String)>> {
+fn interpolate(&self, new_snapshot: &BufferSnapshot) -> Option<Vec<(Range<Anchor>, Arc<str>)>> {
edit_prediction::interpolate_edits(&self.snapshot, new_snapshot, &self.edits)
}
}
@@ -281,8 +281,8 @@ impl EditPredictionProvider for CodestralCompletionProvider {
return Ok(());
}
-let edits: Arc<[(Range<Anchor>, String)]> =
-vec![(cursor_position..cursor_position, completion_text)].into();
+let edits: Arc<[(Range<Anchor>, Arc<str>)]> =
+vec![(cursor_position..cursor_position, completion_text.into())].into();
let edit_preview = buffer
.read_with(cx, |buffer, cx| buffer.preview_edits(edits.clone(), cx))?
.await;

View File

@@ -291,29 +291,6 @@ CREATE TABLE IF NOT EXISTS "channel_chat_participants" (
CREATE INDEX "index_channel_chat_participants_on_channel_id" ON "channel_chat_participants" ("channel_id");
CREATE TABLE IF NOT EXISTS "channel_messages" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
"channel_id" INTEGER NOT NULL REFERENCES channels (id) ON DELETE CASCADE,
"sender_id" INTEGER NOT NULL REFERENCES users (id),
"body" TEXT NOT NULL,
"sent_at" TIMESTAMP,
"edited_at" TIMESTAMP,
"nonce" BLOB NOT NULL,
"reply_to_message_id" INTEGER DEFAULT NULL
);
CREATE INDEX "index_channel_messages_on_channel_id" ON "channel_messages" ("channel_id");
CREATE UNIQUE INDEX "index_channel_messages_on_sender_id_nonce" ON "channel_messages" ("sender_id", "nonce");
CREATE TABLE "channel_message_mentions" (
"message_id" INTEGER NOT NULL REFERENCES channel_messages (id) ON DELETE CASCADE,
"start_offset" INTEGER NOT NULL,
"end_offset" INTEGER NOT NULL,
"user_id" INTEGER NOT NULL REFERENCES users (id) ON DELETE CASCADE,
PRIMARY KEY (message_id, start_offset)
);
CREATE TABLE "channel_members" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
"channel_id" INTEGER NOT NULL REFERENCES channels (id) ON DELETE CASCADE,
@@ -408,15 +385,6 @@ CREATE TABLE "observed_buffer_edits" (
CREATE UNIQUE INDEX "index_observed_buffers_user_and_buffer_id" ON "observed_buffer_edits" ("user_id", "buffer_id");
CREATE TABLE IF NOT EXISTS "observed_channel_messages" (
"user_id" INTEGER NOT NULL REFERENCES users (id) ON DELETE CASCADE,
"channel_id" INTEGER NOT NULL REFERENCES channels (id) ON DELETE CASCADE,
"channel_message_id" INTEGER NOT NULL,
PRIMARY KEY (user_id, channel_id)
);
CREATE UNIQUE INDEX "index_observed_channel_messages_user_and_channel_id" ON "observed_channel_messages" ("user_id", "channel_id");
CREATE TABLE "notification_kinds" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
"name" VARCHAR NOT NULL

View File

@@ -0,0 +1,3 @@
drop table observed_channel_messages;
drop table channel_message_mentions;
drop table channel_messages;

View File

@@ -66,40 +66,6 @@ impl Database {
.await
}
/// Returns all users flagged as staff.
pub async fn get_staff_users(&self) -> Result<Vec<user::Model>> {
self.transaction(|tx| async {
let tx = tx;
Ok(user::Entity::find()
.filter(user::Column::Admin.eq(true))
.all(&*tx)
.await?)
})
.await
}
/// Returns a user by email address. There are no access checks here, so this should only be used internally.
pub async fn get_user_by_email(&self, email: &str) -> Result<Option<User>> {
self.transaction(|tx| async move {
Ok(user::Entity::find()
.filter(user::Column::EmailAddress.eq(email))
.one(&*tx)
.await?)
})
.await
}
/// Returns a user by GitHub user ID. There are no access checks here, so this should only be used internally.
pub async fn get_user_by_github_user_id(&self, github_user_id: i32) -> Result<Option<User>> {
self.transaction(|tx| async move {
Ok(user::Entity::find()
.filter(user::Column::GithubUserId.eq(github_user_id))
.one(&*tx)
.await?)
})
.await
}
/// Returns a user by GitHub login. There are no access checks here, so this should only be used internally.
pub async fn get_user_by_github_login(&self, github_login: &str) -> Result<Option<User>> {
self.transaction(|tx| async move {
@@ -270,39 +236,6 @@ impl Database {
.await
}
/// Sets "accepted_tos_at" on the user to the given timestamp.
pub async fn set_user_accepted_tos_at(
&self,
id: UserId,
accepted_tos_at: Option<DateTime>,
) -> Result<()> {
self.transaction(|tx| async move {
user::Entity::update_many()
.filter(user::Column::Id.eq(id))
.set(user::ActiveModel {
accepted_tos_at: ActiveValue::set(accepted_tos_at),
..Default::default()
})
.exec(&*tx)
.await?;
Ok(())
})
.await
}
/// hard delete the user.
pub async fn destroy_user(&self, id: UserId) -> Result<()> {
self.transaction(|tx| async move {
access_token::Entity::delete_many()
.filter(access_token::Column::UserId.eq(id))
.exec(&*tx)
.await?;
user::Entity::delete_by_id(id).exec(&*tx).await?;
Ok(())
})
.await
}
/// Find users where github_login ILIKE name_query.
pub async fn fuzzy_search_users(&self, name_query: &str, limit: u32) -> Result<Vec<User>> {
self.transaction(|tx| async {
@@ -341,14 +274,4 @@ impl Database {
result.push('%');
result
}
pub async fn get_users_missing_github_user_created_at(&self) -> Result<Vec<user::Model>> {
self.transaction(|tx| async move {
Ok(user::Entity::find()
.filter(user::Column::GithubUserCreatedAt.is_null())
.all(&*tx)
.await?)
})
.await
}
}

Some files were not shown because too many files have changed in this diff.