Compare commits

..

50 Commits

Author SHA1 Message Date
David Kleingeld
b5d1eeff60 Add @dvdsk to git reviewers list 2025-11-03 18:50:00 +01:00
Nia
45b78482f5 perf: Fixup Hyperfine finding (#41837)
Release Notes:

- N/A
2025-11-03 17:36:33 +00:00
Antal Szabó
71f1f3728d windows: Remove null terminator from keyboard ID (#41785)
Closes #41486, closes #35862 

It is unnecessary, and it broke the `uses_altgr` function.

Also add Slovenian layout as using AltGr.

This should fix:
-
https://github.com/zed-industries/zed/pull/40536#issuecomment-3477121224
- https://github.com/zed-industries/zed/issues/41486
- https://github.com/zed-industries/zed/issues/35862

As the current strategy relies on manually adding layouts that have
AltGr, it's brittle and not very elegant. It also has other issues (it
requests the current layout on every keystroke and mouse movement).

**A potentially better and more comprehensive solution is at
https://github.com/zed-industries/zed/pull/41259**
This is just to fix the immediate issues while that gets reviewed.
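As a rough illustration (hypothetical layout IDs and function shape, not Zed's actual code), the bug amounts to a trailing NUL defeating an exact-match lookup:

```python
# Illustrative sketch only: a trailing NUL in the keyboard layout ID
# makes set membership checks fail until it is stripped.
ALTGR_LAYOUTS = {"00000407", "00000424"}  # hypothetical IDs (German, Slovenian)

def uses_altgr(raw_layout_id: str) -> bool:
    layout_id = raw_layout_id.rstrip("\x00")  # the fix: drop the null terminator
    return layout_id in ALTGR_LAYOUTS
```

With the terminator kept, the raw ID never equals any entry in the set, which is how the real `uses_altgr` broke.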

Release Notes:

- windows: Fix AltGr handling on non-US layouts again.
2025-11-03 18:29:28 +01:00
Richard Feldman
8b560cd8aa Fix bug with uninstalled agent extensions (#41836)
Previously, uninstalled agent extensions didn't immediately disappear
from the menu. Now, they do!

Release Notes:

- N/A
2025-11-03 12:09:26 -05:00
Lukas Wirth
38e1e3f498 project: Use user configured shells for project env fetching (#41288)
Closes https://github.com/zed-industries/zed/issues/40464

Release Notes:

- Fixed shell environment sourcing not respecting users' remote shells
2025-11-03 16:29:07 +00:00
Karl-Erik Enkelmann
a6b177d806 Update open buffers with newly registered completion trigger characters (#41243)
Closes https://github.com/zed-extensions/java/issues/108

Previously, when a language server dynamically registered completion
capabilities with trigger characters (hello JDTLS), buffers that were
already open for that language server would not get updated. This change
finds the open buffers for the language server and updates the trigger
characters in each of them when the new capability is registered.
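The idea can be sketched minimally (names are hypothetical, not the actual Zed/LSP types):

```python
# Sketch: when a server dynamically registers completion trigger
# characters, push them into every buffer already open for that server.
class Buffer:
    def __init__(self, server_ids):
        self.server_ids = set(server_ids)
        self.completion_triggers = {}  # server_id -> set of trigger chars

def register_completion_triggers(server_id, trigger_chars, open_buffers):
    for buffer in open_buffers:
        if server_id in buffer.server_ids:
            buffer.completion_triggers[server_id] = set(trigger_chars)
```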

Release Notes:

- N/A
2025-11-03 17:28:30 +01:00
Lukas Wirth
73366bef62 diagnostics: Live update diagnostics view on edits while focused (#41829)
Previously we only updated the diagnostics pane when it was unfocused,
on save, or when a disk-based diagnostic run finished (e.g. cargo
check). The reason for this is simple: we do not want to take away the
excerpt under the user's cursor while they are typing if they manage to
fix the diagnostic. Additionally, we need to prevent dropping the
changed buffer before it is saved.

Delaying updates was a simple way to work around these kinds of issues,
but it comes at a huge annoyance: the diagnostics pane does not reflect
the current state of the world, only a snapshot of it, making it less
than ideal to work in for languages that do not leverage disk-based
diagnostics (that is, everything but rust-analyzer; and even with
rust-analyzer it's annoying).

This PR changes this. We now always live update the view but take care
to retain unsaved buffers as well as buffers that contain a cursor in
them (as well as some other "checkpoint" properties).
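The retention rule can be sketched as a simple predicate (a hypothetical shape, not the actual implementation, which also tracks other "checkpoint" properties):

```python
# Sketch: keep an excerpt in the diagnostics view on live updates if
# dropping it would lose work or yank the view out from under the user.
def should_retain_excerpt(buffer_is_dirty: bool, cursor_in_excerpt: bool) -> bool:
    # dirty (unsaved) buffers must not be dropped before they are saved;
    # excerpts containing the cursor must not disappear mid-edit
    return buffer_is_dirty or cursor_in_excerpt
```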

Release Notes:

- Improved diagnostics pane to live update when editing within its
editor
2025-11-03 16:16:05 +00:00
David Kleingeld
48bd253358 Adds instructions on how to use Perf & Flamegraph without debug symbols (#41831)
Release Notes:

- N/A
2025-11-03 16:11:30 +00:00
Bob Mannino
2131d88e48 Add center_on_match option for search (#40523)
[Closes discussion
#28943](https://github.com/zed-industries/zed/discussions/28943)

Release Notes:

- Added `center_on_match` option to center matched text in view during buffer or project search.

---------

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
2025-11-03 21:17:09 +05:30
Kirill Bulatov
42149df0f2 Show modal hover for one-off tasks (#41824)
Before, one-off commands did not show anything on hover at all; now
they show their command:

<img width="657" height="527" alt="after"
src="https://github.com/user-attachments/assets/d43292ca-9101-4a6a-b689-828277e8cfeb"
/>

Release Notes:

- Show modal hover for one-off tasks
2025-11-03 15:25:44 +00:00
Bing Wang
379bdb227a languages: Add ignore keyword for gomod (#41520)
Go 1.25 introduces an `ignore` directive in go.mod to specify
directories the go command should ignore.

ref: https://tip.golang.org/doc/go1.25#go-command


Release Notes:

- Added syntax highlighting support for the new [`ignore`
directive](https://tip.golang.org/doc/go1.25#go-command) in `go.mod`
files
2025-11-03 09:11:50 -06:00
Dijana Pavlovic
04e53bff3d Move @punctuation.delimiter before @operator capture (#41663)
Closes #41593

From what I understand, the order of captures inside tree-sitter query
files matters, and the last capture wins. `?` and `:` are captured by
both `@operator` and `@punctuation.delimiter`. So for the ternary
operator symbols to win as operators, `@operator` should live after
`@punctuation.delimiter`.
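The precedence rule can be modeled as "last matching capture in the query file wins" (a simplified model, not tree-sitter's full behavior):

```python
# Simplified model: a token highlighted by several captures takes the
# one that appears last in the query file.
def winning_capture(captures_in_file_order):
    return captures_in_file_order[-1]

# Before the fix, @punctuation.delimiter came after @operator, so `?`
# and `:` lost their operator highlighting; swapping the order flips it.
```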

Before:
<img width="298" height="32" alt="Screenshot 2025-10-31 at 17 41 21"
src="https://github.com/user-attachments/assets/af376e52-88be-4f62-9e2b-a106731f8145"
/>


After:
<img width="303" height="39" alt="Screenshot 2025-10-31 at 17 41 33"
src="https://github.com/user-attachments/assets/9a754ae9-0521-4c70-9adb-90a562404ce8"
/>


Release Notes:

- Fixed an issue where the ternary operator symbols in TypeScript would
not be highlighted as operators.
2025-11-03 08:51:22 -06:00
Kirill Bulatov
28f30fc851 Fix racy inlay hints queries (#41816)
Follow-up of https://github.com/zed-industries/zed/pull/40183

Release Notes:

- (Preview only) Fixed inlay hints duplicating when multiple editors are
open for the same buffer

---------

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-03 13:54:53 +00:00
Lukas Wirth
f8b414c22c zed: Reduce number of rayon threads, spawn with bigger stacks (#41812)
We already do this for the CLI and remote server but forgot to do so for
the main binary.

Release Notes:

- N/A
2025-11-03 12:28:32 +00:00
Lukas Wirth
50504793e6 file_finder: Fix highlighting panic in open path prompt (#41808)
Closes https://github.com/zed-industries/zed/issues/41249

Couldn't quite come up with a test case here but verified it works.

Release Notes:

- Fixed a panic in file finder when deleting characters
2025-11-03 12:07:13 +00:00
kallyaleksiev
b625263989 remote: Close window when SSH connection fails (#41782)
## Context 

This PR closes issue https://github.com/zed-industries/zed/issues/41781

It essentially `matches` the result of opening the connection here
f7153bbe8a/crates/recent_projects/src/remote_connections.rs (L650)

and adds a Close / Retry alert that, upon 'Close', closes the new
window if the result is an error.
2025-11-03 11:13:20 +00:00
Lukas Wirth
c8f9db2e24 remote: Fix more quoting issues with nushell (#41547)
https://github.com/zed-industries/zed/pull/40084#issuecomment-3464159871
Closes https://github.com/zed-industries/zed/pull/41547

Release Notes:

- Fixed remoting not working when the remote has nu set as its shell
2025-11-03 10:50:05 +00:00
Lukas Wirth
bc3c88e737 Revert "windows: Don't flood windows message queue with gpui messages" (#41803)
Reverts zed-industries/zed#41595

Closes #41704
2025-11-03 10:41:53 +00:00
Oleksiy Syvokon
3a058138c1 Fix Sonnet's regression with inserting </parameter></invoke> (#41800)
Sometimes, inside the edit agent, Sonnet thinks that it's doing a tool
call and closes its response with `</parameter></invoke>` instead of
properly closing `</new_text>`.

A better but more labor-intensive way of fixing this would be switching
to streaming tool calls for LLMs that support it.
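A hedged sketch of the workaround (hypothetical function name, not the edit agent's actual code): tolerate the stray tool-call suffix by rewriting it to the expected closing tag:

```python
# Sketch: if the model ends its edit with tool-call markup instead of
# closing </new_text>, rewrite the suffix.
def normalize_edit_response(text: str) -> str:
    bad_suffix = "</parameter></invoke>"
    if text.endswith(bad_suffix):
        return text[: -len(bad_suffix)] + "</new_text>"
    return text
```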

Closes #39921

Release Notes:

- Fixed Sonnet's regression with inserting `</parameter></invoke>`
sometimes
2025-11-03 12:22:51 +02:00
Lukas Wirth
f2b539598e sum_tree: Spawn less tasks in SumTree::from_iter_async (#41793)
Release Notes:

- N/A
2025-11-03 10:02:31 +00:00
ᴀᴍᴛᴏᴀᴇʀ
dc503e9975 Support GitLab and self-hosted GitLab avatars in git blame (#41747)
Part of #11043.

Release Notes:

- Added support for showing GitLab and self-hosted GitLab avatars in git
blame
2025-11-03 07:51:04 +01:00
ᴀᴍᴛᴏᴀᴇʀ
73b75a7765 Support Gitee avatars in git blame (#41783)
Part of https://github.com/zed-industries/zed/issues/11043.

<img width="3596" height="1894" alt="CleanShot 2025-11-03 at 10 39
08@2x"
src="https://github.com/user-attachments/assets/68d16c32-fd23-4f54-9973-6cbda9685a8f"
/>


Release Notes:

- Added support for showing Gitee avatars in git blame
2025-11-03 07:47:20 +01:00
Danilo Leal
deacd3e922 extension_ui: Fix card label truncation (#41784)
Closes https://github.com/zed-industries/zed/issues/41763

Release Notes:

- N/A
2025-11-03 03:54:47 +00:00
Aero
f7153bbe8a agent_ui: Add delete button for compatible API-based LLM providers (#41739)
Discussion: https://github.com/zed-industries/zed/discussions/41736

Release Notes:

- agent panel: Added the ability to remove OpenAI-compatible LLM
providers directly from the UI.

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-02 16:05:29 -03:00
Xiaobo Liu
4e7ba8e680 acp_tools: Add vertical scrollbar to ACP logs (#41740)
Release Notes:

- N/A

---------

Signed-off-by: Xiaobo Liu <cppcoffee@gmail.com>
Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-02 18:11:21 +00:00
Danilo Leal
9909b59bd0 agent_ui: Improve the "go to file" affordance in the edit bar (#41762)
This PR makes it clearer that you can click on the file path to open the
corresponding file in the agent panel's "edit bar", which is the element
that shows up in the panel as soon as agent-made edits happen.

Release Notes:

- agent panel: Improved the "go to file" affordance in the edit bar.
2025-11-02 14:56:23 -03:00
Danilo Leal
00ff89f00f agent_ui: Make single file review actions match panel (#41718)
When we introduced the ACP-based agent panel, the condition governing
when the "review" | "reject" | "keep" buttons are displayed got
mismatched between the panel and the pane (in the single file review
scenario). In the panel, the buttons appear as soon as there are changed
buffers, whereas in the pane, they appear only once response generation
is done.

I believe that making them appear at the same time, observing the same
condition, is the desired behavior, and that the panel behavior is more
correct: there are loads of times when agent response generation isn't
technically done (e.g., when a command is waiting for permission to be
run) but the _file edit_ has already been performed and is in a good
state to be accepted or rejected.

So, this is what this PR is doing; effectively removing the "generating"
state from the agent diff, and switching to `EditorState::Reviewing`
when there are changed buffers.

Release Notes:

- Improved agent edit single file reviews by making the "reject" and
"accept" buttons appear at the same time.
2025-11-02 14:30:50 -03:00
Danilo Leal
12fe12b5ac docs: Update theme, icon theme, and visual customization pages (#41761)
Some housekeeping updates:

- Update hardcoded actions/keybindings so they're pulled from the repo
- Mention settings window when useful
- Add more info about agent panel's font size
- Break sentences in individual lines

Release Notes:

- N/A
2025-11-02 14:30:37 -03:00
Mayank Verma
a9bc890497 ui: Fix popover menu not restoring focus to the previously focused element (#41751)
Closes #26548

Here's a before/after comparison:


https://github.com/user-attachments/assets/21d49db7-28bb-4fe2-bdaf-e86b6400ae7a

Release Notes:

- Fixed popover menus not restoring focus to the previously focused
element
2025-11-02 16:56:58 +01:00
Xiaobo Liu
d887e2050f windows: Hide background helpers behind CREATE_NO_WINDOW (#41737)
Close https://github.com/zed-industries/zed/issues/41538

Release Notes:

- Fixed some processes on windows not spawning with CREATE_NO_WINDOW

---------

Signed-off-by: Xiaobo Liu <cppcoffee@gmail.com>
2025-11-02 09:15:01 +01:00
Anthony Eid
d5421ba1a8 windows: Fix click bleeding through collab follow (#41726)
On Windows, clicking on a collab user icon in the title bar would
minimize/expand Zed because the click would bleed through to the title
bar. This PR fixes this by stopping propagation.

#### Before (On MacOS with double clicks to mimic the same behavior)

https://github.com/user-attachments/assets/5a91f7ff-265a-4575-aa23-00b8d30daeed
#### After (On MacOS with double clicks to mimic the same behavior)

https://github.com/user-attachments/assets/e9fcb98f-4855-4f21-8926-2d306d256f1c

Release Notes:

- Windows: Fixed clicks on a collaborator's icon in the title bar
bleeding through and minimizing/expanding Zed

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-11-02 07:35:11 +00:00
Joseph T. Lyons
548cdfde3a Delete release process docs (#41733)
These have been migrated to the README.md
[here](https://github.com/zed-industries/release_notes). These don't
need to be public. Putting them in the same repo where we draft
(`release_notes`) means less jumping around and allows us to include
additional information we might not want to make public.

Release Notes:

- N/A
2025-11-02 00:37:02 -04:00
Ben Kunkle
2408f767f4 gh-workflow unit evals (#41637)
Closes #ISSUE

Release Notes:

- N/A
2025-11-01 22:45:44 -04:00
Haojian Wu
df15d2d2fe Fix doc typos (#41727)
Release Notes:

- N/A
2025-11-01 20:52:32 -03:00
Anthony Eid
07dcb8f2bb debugger: Add program and module path fallbacks for debugpy toolchain (#40975)
Fixes the Debugpy toolchain detection bug in #40324 

When detecting which toolchain (venv) to use in the Debugpy
configuration stage, we used to base it solely on the current working
directory argument passed to the config. This is wrong for cases like
monorepos, where the correct virtual environment is nested in another
folder.

This PR fixes the issue by adding the program and module fields as
fallbacks when checking for virtual environments. We also added support
for program/module paths relative to cwd when cwd is not None.
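The fallback order might be sketched like this (hypothetical helper; real detection involves more than probing for a `.venv` directory):

```python
from pathlib import Path

# Sketch: directories to probe for a virtual environment, in order:
# the program's directory, the module's directory, then cwd itself.
def venv_candidates(cwd=None, program=None, module_dir=None):
    roots = []
    if program is not None:
        path = Path(program)
        if cwd is not None and not path.is_absolute():
            path = Path(cwd) / path  # relative program paths resolve against cwd
        roots.append(path.parent)
    if module_dir is not None:
        path = Path(module_dir)
        if cwd is not None and not path.is_absolute():
            path = Path(cwd) / path
        roots.append(path)
    if cwd is not None:
        roots.append(Path(cwd))
    return [root / ".venv" for root in roots]
```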

Release Notes:

- debugger: Improve monorepo virtual environment detection with Debugpy

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-11-01 17:30:13 -04:00
Agus Zubiaga
06bdb28517 zeta cli: Add convert-example command (#41608)
Adds a `convert-example` subcommand to the zeta cli that converts eval
examples from/to `json`, `toml`, and `md` formats.

Release Notes:

- N/A

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-11-01 19:35:04 +00:00
Mark Christiansen
d6b58bb948 agent_ui: Use agent font size tokens for thread markdown rendering (#41610)
Release Notes:

- N/A

--- 

Previously, agent markdown rendering used hardcoded font sizes
(TextSize::Default and TextSize::Small) which ignored the
agent_ui_font_size and agent_buffer_font_size settings. This updates the
markdown style to respect these settings.

This pull request adds support for customizing the font size of code
blocks in agent responses, making it possible to set a distinct font
size for code within the agent panel. The changes ensure that if the new
setting is not specified, the font size will fall back to the agent UI
font size, maintaining consistent appearance.

(I am a frontend developer without any Rust knowledge so this is
co-authored with Claude Code)


**Theme settings extension:**

* Added a new `agent_buffer_code_font_size` setting to
`ThemeSettingsContent`, `ThemeSettings`, and the default settings JSON,
allowing users to specify the font size for code blocks in agent
responses.
[[1]](diffhunk://#diff-a3bba02a485aba48e8e9a9d85485332378aa4fe29a0c50d11ae801ecfa0a56a4R69-R72)
[[2]](diffhunk://#diff-aed3a9217587d27844c57ac8aff4a749f1fb1fc5d54926ef5065bf85f8fd633aR118-R119)
[[3]](diffhunk://#diff-42e01d7aacb60673842554e30970b4ddbbaee7a2ec2c6f2be1c0b08b0dd89631R82-R83)
* Updated the VSCode import logic to recognize and import the new
`agent_buffer_code_font_size` setting.

**Font size application in agent UI:**

* Modified the agent UI rendering logic in `thread_view.rs` to use the
new `agent_buffer_code_font_size` for code blocks, and to fall back to
the agent UI font size if unset.
[[1]](diffhunk://#diff-f73942e8d4f8c4d4d173d57d7c58bb653c4bb6ae7079533ee501750cdca27d98L5584-R5584)
[[2]](diffhunk://#diff-f73942e8d4f8c4d4d173d57d7c58bb653c4bb6ae7079533ee501750cdca27d98L5596-R5598)
* Implemented a helper method in `ThemeSettings` to retrieve the code
block font size, with fallback logic to ensure a value is always used.
* Updated the settings application logic to propagate the new code block
font size setting throughout the theme system.


### Example Screenshots
![Screenshot 2025-10-31 at 12 38
28](https://github.com/user-attachments/assets/cbc34232-ab1f-40bf-a006-689678380e47)
![Screenshot 2025-10-31 at 12 37
45](https://github.com/user-attachments/assets/372b5cf8-2df8-425a-b052-12136de7c6bd)

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-01 11:14:21 -03:00
Charles McLaughlin
03e0581ee8 agent_ui: Show notifications also when the panel is hidden (#40942)
Currently, Zed only displays agent notifications (e.g. when the agent
completes a task) if the user has switched apps and Zed is not in the
foreground. This PR adds support for the scenario where the agent
finishes a long-running task while the user is busy coding on something
else within Zed.

Release Notes:

- If agent notifications are turned on, they will now also be displayed
when the agent panel is hidden, in complement to them showing when the
Zed window is in the background.

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-01 11:14:12 -03:00
Conrad Irwin
1552e13799 Fix telemetry in release builds (#41695)
This was inadvertently broken in v0.211.1-pre when we rewrote the
release build

Release Notes:

- N/A
2025-10-31 22:55:12 -06:00
Dijana Pavlovic
ade0f1342c agent_ui: Prevent mode selector tooltip from going off-screen (#41589)
Closes #41458 

Dynamically position mode selector tooltip to prevent clipping.

Position tooltip on the right when panel is docked left, otherwise on
the left. This ensures the tooltip remains visible regardless of panel
position.

**Note:** The tooltip currently vertically aligns with the bottom of the
menu rather than with individual items. It would be great if it could be
aligned with the option it explains, but that doesn't seem trivial to
implement, and I'm not sure it's important enough at the moment.

Before: 
<img width="431" height="248" alt="Screenshot 2025-10-30 at 22 21 09"
src="https://github.com/user-attachments/assets/073f5440-b1bf-420b-b12f-558928b627f1"
/>

After:
<img width="632" height="158" alt="Screenshot 2025-10-30 at 17 26 52"
src="https://github.com/user-attachments/assets/e999e390-bf23-435e-9df0-3126dbc14ecb"
/>
<img width="685" height="175" alt="Screenshot 2025-10-30 at 17 27 15"
src="https://github.com/user-attachments/assets/84efca94-7920-474b-bcf8-062c7b59a812"
/>


Release Notes:

- Improved the agent panel's mode selector by preventing it from going
off-screen when the panel is docked to the left.
2025-11-01 04:29:58 +00:00
Joe Innes
04f7b08ab9 Give visual feedback when an operation is pending (#41686)
Currently, if a commit operation takes some time, there's no visual
feedback in the UI that anything's happening.

This PR changes the colour of the text on the button to the
`Color::Disabled` colour when a commit operation is pending.

Release Notes:

- Improved UI feedback when a commit is in progress

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-01 04:24:59 +00:00
Anthony Eid
ecbdffc84f debugger: Fix Debugpy attach with connect session startup (#41690)
Closes #38345, #34882, #33280

Debugpy has four distinct configuration scenarios, which are:
1. launch
2. attach with process id
3. attach with listen
4. attach with connect

Spawning Debugpy directly works with the first three scenarios but not
with "attach with connect", which requires host/port arguments to be
passed both with the attach request and when starting up Debugpy. This
PR passes the right arguments when spawning Debugpy in an
attach-with-connect scenario, thus fixing the bug.

The VsCode extension comment that explains this:
98f5b93ee4/src/extension/debugger/adapter/factory.ts (L43-L51)

Release Notes:

- debugger: Fix Python attach-based sessions not working with `connect`
or `port` arguments
2025-10-31 19:02:51 -04:00
Jakub Konka
aa61f25795 git: Make GitPanel more responsive to long-running staging ops (#41667)
Currently, this only applies to long-running staging operations on
individually selected unstaged files in the git panel. Next up I would
like to make this work for `Stage All`/`Unstage All`; however, this will
most likely require
pushing `PendingOperation` into `GitStore` (from the `GitPanel`).

Release Notes:

- N/A
2025-10-31 22:47:49 +01:00
Danilo Leal
d406409b72 Fix categorization of agent server extensions (#41689)
We missed making extensions that provide agent servers fill the
`provides` field with `agent-servers`, and thus, filtering for this type
of extension in both the app and site wouldn't return anything.

Release Notes:

- N/A
2025-10-31 21:18:22 +00:00
Danilo Leal
bf79592465 git_ui: Adjust stash picker (#41688)
Just tidying it up by removing the unnecessary eye icon buttons in all
list items and adding that action in the form of a button in the footer,
closer to all other actions. Also reordering the footer buttons so that
the likely most common action is on the far right.

Release Notes:

- N/A
2025-10-31 18:15:52 -03:00
Ben Kunkle
d3d7199507 Fix release.yml workflow (#41675)
Closes #ISSUE

Release Notes:

- N/A
2025-10-31 16:29:13 -04:00
Danilo Leal
743a9cf258 Add included agents in extensions search (#41679)
Given agent servers will soon be a thing, I'm adding Claude Code, Gemini
CLI, and Codex CLI as included agents, in case anyone searches for them
as extensions before looking in the agent panel.

Release Notes:

- N/A
2025-10-31 16:45:39 -03:00
Conrad Irwin
a05358f47f Delete old ci.yml (#41668)
The new one is much better

Release Notes:

- N/A
2025-10-31 12:58:47 -06:00
Ben Kunkle
3a4aba1df2 gh-workflow release (#41502)
Closes #ISSUE

Rewrite our release pipeline to be generated by `gh-workflow`

Release Notes:

- N/A
2025-10-31 14:23:25 -04:00
tidely
12d71b37bb ollama: Add button for refreshing available models (#38181)
Closes #17524

This PR adds a button to the bottom right corner of the Ollama settings
UI. It resets the available Ollama models and also resets the
"Connected" state in the process. This means it can be used to check
whether the connection is still valid as well. It's an open question
whether we should clear the available models on ALL `fetch_models`
calls, since these only happen during auth anyway.

Ollama is a local model provider, which means clicking the refresh
button often only briefly flashes the "not connected" state because the
latency of the request is so low. This accentuates changes in the UI;
however, I don't think there's a way around this without adding some
rather cumbersome deferred UI updates.

I've attached the refresh button to the "Connected" `ButtonLike`, since
I don't think automatic UI spacing should separate these elements. I
think this is okay because the "Connected" indicator isn't actually
something that the user can interact with.

Before: 
<img width="211" height="245" alt="image"
src="https://github.com/user-attachments/assets/ea90e24a-b603-4ee2-9212-2917e1695774"
/>

After: 
<img width="211" height="250" alt="image"
src="https://github.com/user-attachments/assets/be9af950-86a2-4067-87a0-52034a80a823"
/>


Alternative approach: There was also a suggestion to simply add an
entry to the command palette; however, none of the other providers
currently have this ability either, so I went with this approach. The
current approach also makes the feature more discoverable to the user.

Release Notes:

- Added a button for refreshing available ollama models

---------

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-10-31 18:12:02 +00:00
116 changed files with 4150 additions and 2571 deletions


@@ -1,841 +0,0 @@
name: CI
on:
push:
tags:
- "v*"
concurrency:
# Allow only one workflow per any non-`main` branch.
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
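# Note on the group key above: a push to `main` gets a unique group per
# commit (suffix = commit SHA), so runs on main are never cancelled;
# every other ref shares the literal suffix "anysha", so a newer push
# cancels that branch's in-flight run.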
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: 0
RUST_BACKTRACE: 1
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
jobs:
job_spec:
name: Decide which jobs to run
if: github.repository_owner == 'zed-industries'
outputs:
run_tests: ${{ steps.filter.outputs.run_tests }}
run_license: ${{ steps.filter.outputs.run_license }}
run_docs: ${{ steps.filter.outputs.run_docs }}
run_nix: ${{ steps.filter.outputs.run_nix }}
run_actionlint: ${{ steps.filter.outputs.run_actionlint }}
runs-on:
- namespace-profile-2x4-ubuntu-2404
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
# 350 is arbitrary; ~10days of history on main (5secs); full history is ~25secs
fetch-depth: ${{ github.ref == 'refs/heads/main' && 2 || 350 }}
- name: Fetch git history and generate output filters
id: filter
run: |
if [ -z "$GITHUB_BASE_REF" ]; then
echo "Not in a PR context (i.e., push to main/stable/preview)"
COMPARE_REV="$(git rev-parse HEAD~1)"
else
echo "In a PR context comparing to pull_request.base.ref"
git fetch origin "$GITHUB_BASE_REF" --depth=350
COMPARE_REV="$(git merge-base "origin/${GITHUB_BASE_REF}" HEAD)"
fi
CHANGED_FILES="$(git diff --name-only "$COMPARE_REV" ${{ github.sha }})"
# Specify anything which should potentially skip full test suite in this regex:
# - docs/
# - script/update_top_ranking_issues/
# - .github/ISSUE_TEMPLATE/
# - .github/workflows/ (except .github/workflows/ci.yml)
SKIP_REGEX='^(docs/|script/update_top_ranking_issues/|\.github/(ISSUE_TEMPLATE|workflows/(?!ci)))'
echo "$CHANGED_FILES" | grep -qvP "$SKIP_REGEX" && \
echo "run_tests=true" >> "$GITHUB_OUTPUT" || \
echo "run_tests=false" >> "$GITHUB_OUTPUT"
echo "$CHANGED_FILES" | grep -qP '^docs/' && \
echo "run_docs=true" >> "$GITHUB_OUTPUT" || \
echo "run_docs=false" >> "$GITHUB_OUTPUT"
echo "$CHANGED_FILES" | grep -qP '^\.github/(workflows/|actions/|actionlint.yml)' && \
echo "run_actionlint=true" >> "$GITHUB_OUTPUT" || \
echo "run_actionlint=false" >> "$GITHUB_OUTPUT"
echo "$CHANGED_FILES" | grep -qP '^(Cargo.lock|script/.*licenses)' && \
echo "run_license=true" >> "$GITHUB_OUTPUT" || \
echo "run_license=false" >> "$GITHUB_OUTPUT"
echo "$CHANGED_FILES" | grep -qP '^(nix/|flake\.|Cargo\.|rust-toolchain.toml|\.cargo/config.toml)' && \
echo "$GITHUB_REF_NAME" | grep -qvP '^v[0-9]+\.[0-9]+\.[0-9x](-pre)?$' && \
echo "run_nix=true" >> "$GITHUB_OUTPUT" || \
echo "run_nix=false" >> "$GITHUB_OUTPUT"
migration_checks:
name: Check Postgres and Protobuf migrations, mergability
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
timeout-minutes: 60
runs-on:
- self-mini-macos
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
fetch-depth: 0 # fetch full history
- name: Remove untracked files
run: git clean -df
- name: Find modified migrations
shell: bash -euxo pipefail {0}
run: |
export SQUAWK_GITHUB_TOKEN=${{ github.token }}
. ./script/squawk
- name: Ensure fresh merge
shell: bash -euxo pipefail {0}
run: |
if [ -z "$GITHUB_BASE_REF" ];
then
echo "BUF_BASE_BRANCH=$(git merge-base origin/main HEAD)" >> "$GITHUB_ENV"
else
git checkout -B temp
git merge -q "origin/$GITHUB_BASE_REF" -m "merge main into temp"
echo "BUF_BASE_BRANCH=$GITHUB_BASE_REF" >> "$GITHUB_ENV"
fi
- uses: bufbuild/buf-setup-action@v1
with:
version: v1.29.0
- uses: bufbuild/buf-breaking-action@v1
with:
input: "crates/proto/proto/"
against: "https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/"
style:
timeout-minutes: 60
name: Check formatting and spelling
needs: [job_spec]
if: github.repository_owner == 'zed-industries'
runs-on:
- namespace-profile-4x8-ubuntu-2204
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2 # v4.0.0
with:
version: 9
- name: Prettier Check on /docs
working-directory: ./docs
run: |
pnpm dlx "prettier@${PRETTIER_VERSION}" . --check || {
echo "To fix, run from the root of the Zed repo:"
echo " cd docs && pnpm dlx prettier@${PRETTIER_VERSION} . --write && cd .."
false
}
env:
PRETTIER_VERSION: 3.5.0
- name: Prettier Check on default.json
run: |
pnpm dlx "prettier@${PRETTIER_VERSION}" assets/settings/default.json --check || {
echo "To fix, run from the root of the Zed repo:"
echo " pnpm dlx prettier@${PRETTIER_VERSION} assets/settings/default.json --write"
false
}
env:
PRETTIER_VERSION: 3.5.0
# To support writing comments that will certainly be revisited.
- name: Check for todo! and FIXME comments
run: script/check-todos
- name: Check modifier use in keymaps
run: script/check-keymaps
- name: Run style checks
uses: ./.github/actions/check_style
- name: Check for typos
uses: crate-ci/typos@80c8a4945eec0f6d464eaf9e65ed98ef085283d1 # v1.38.1
with:
config: ./typos.toml
check_docs:
timeout-minutes: 60
name: Check docs
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
(needs.job_spec.outputs.run_tests == 'true' || needs.job_spec.outputs.run_docs == 'true')
runs-on:
- namespace-profile-8x16-ubuntu-2204
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Build docs
uses: ./.github/actions/build_docs
actionlint:
runs-on: namespace-profile-2x4-ubuntu-2404
if: github.repository_owner == 'zed-industries' && needs.job_spec.outputs.run_actionlint == 'true'
needs: [job_spec]
steps:
- uses: actions/checkout@v4
- name: Download actionlint
id: get_actionlint
run: bash <(curl https://raw.githubusercontent.com/rhysd/actionlint/main/scripts/download-actionlint.bash)
shell: bash
- name: Check workflow files
run: ${{ steps.get_actionlint.outputs.executable }} -color
shell: bash
macos_tests:
timeout-minutes: 60
name: (macOS) Run Clippy and tests
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- self-mini-macos
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Check that Cargo.lock is up to date
run: |
cargo update --locked --workspace
- name: cargo clippy
run: ./script/clippy
- name: Install cargo-machete
uses: clechasseur/rs-cargo@8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386 # v2
with:
command: install
args: cargo-machete@0.7.0
- name: Check unused dependencies
uses: clechasseur/rs-cargo@8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386 # v2
with:
command: machete
- name: Check licenses
run: |
script/check-licenses
if [[ "${{ needs.job_spec.outputs.run_license }}" == "true" ]]; then
script/generate-licenses /tmp/zed_licenses_output
fi
- name: Check for new vulnerable dependencies
if: github.event_name == 'pull_request'
uses: actions/dependency-review-action@67d4f4bd7a9b17a0db54d2a7519187c65e339de8 # v4
with:
license-check: false
- name: Run tests
uses: ./.github/actions/run_tests
- name: Build collab
# we should do this on a linux x86 machine
run: cargo build -p collab
- name: Build other binaries and features
run: |
cargo build --workspace --bins --examples
# The macOS runners are stateful, so we need to remove the config file to prevent potential bugs.
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
linux_tests:
timeout-minutes: 60
name: (Linux) Run Clippy and tests
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Linux dependencies
run: ./script/linux
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: cargo clippy
run: ./script/clippy
- name: Run tests
uses: ./.github/actions/run_tests
- name: Build other binaries and features
run: |
cargo build -p zed
cargo check -p workspace
cargo check -p gpui --examples
# Even though the Linux runner is not stateful, in theory there is no need to do this cleanup.
# But, to avoid potential issues in the future if we choose to use a stateful Linux runner and forget to add code
# to clean up the config file, I've included the cleanup code here as a precaution.
# While it's not strictly necessary at this moment, I believe it's better to err on the side of caution.
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
doctests:
# Nextest currently doesn't support doctests, so run them separately and in parallel.
timeout-minutes: 60
name: (Linux) Run doctests
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Linux dependencies
run: ./script/linux
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Run doctests
run: cargo test --workspace --doc --no-fail-fast
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
build_remote_server:
timeout-minutes: 60
name: (Linux) Build Remote Server
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Clang & Mold
run: ./script/remote-server && ./script/install-mold 2.34.0
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Build Remote Server
run: cargo build -p remote_server
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
windows_tests:
timeout-minutes: 60
name: (Windows) Run Clippy and tests
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on: [self-32vcpu-windows-2022]
steps:
- name: Environment Setup
run: |
$RunnerDir = Split-Path -Parent $env:RUNNER_WORKSPACE
Write-Output `
"RUSTUP_HOME=$RunnerDir\.rustup" `
"CARGO_HOME=$RunnerDir\.cargo" `
"PATH=$RunnerDir\.cargo\bin;$env:PATH" `
>> $env:GITHUB_ENV
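# Each `KEY=value` line appended to $env:GITHUB_ENV above becomes an environment
# variable for all subsequent steps of this job. Writing RUSTUP_HOME/CARGO_HOME to
# the parent of RUNNER_WORKSPACE presumably lets toolchain state persist across
# checkouts on this stateful self-hosted runner.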
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Configure CI
run: |
New-Item -ItemType Directory -Path "./../.cargo" -Force
Copy-Item -Path "./.cargo/ci-config.toml" -Destination "./../.cargo/config.toml"
- name: cargo clippy
run: |
.\script\clippy.ps1
- name: Run tests
uses: ./.github/actions/run_tests_windows
- name: Build Zed
run: cargo build
- name: Limit target directory size
run: ./script/clear-target-dir-if-larger-than.ps1 250
- name: Clean CI config file
if: always()
run: Remove-Item -Recurse -Path "./../.cargo" -Force -ErrorAction SilentlyContinue
tests_pass:
name: Tests Pass
runs-on: namespace-profile-2x4-ubuntu-2404
needs:
- job_spec
- style
- check_docs
- actionlint
- migration_checks
# run_tests: If adding required tests, add them here and to script below.
- linux_tests
- build_remote_server
- macos_tests
- windows_tests
if: |
github.repository_owner == 'zed-industries' &&
always()
steps:
- name: Check all tests passed
run: |
# Check dependent jobs...
RET_CODE=0
# Always check style
[[ "${{ needs.style.result }}" != 'success' ]] && { RET_CODE=1; echo "style tests failed"; }
if [[ "${{ needs.job_spec.outputs.run_docs }}" == "true" ]]; then
[[ "${{ needs.check_docs.result }}" != 'success' ]] && { RET_CODE=1; echo "docs checks failed"; }
fi
if [[ "${{ needs.job_spec.outputs.run_actionlint }}" == "true" ]]; then
[[ "${{ needs.actionlint.result }}" != 'success' ]] && { RET_CODE=1; echo "actionlint checks failed"; }
fi
# Only check test jobs if they were supposed to run
if [[ "${{ needs.job_spec.outputs.run_tests }}" == "true" ]]; then
[[ "${{ needs.macos_tests.result }}" != 'success' ]] && { RET_CODE=1; echo "macOS tests failed"; }
[[ "${{ needs.linux_tests.result }}" != 'success' ]] && { RET_CODE=1; echo "Linux tests failed"; }
[[ "${{ needs.windows_tests.result }}" != 'success' ]] && { RET_CODE=1; echo "Windows tests failed"; }
[[ "${{ needs.build_remote_server.result }}" != 'success' ]] && { RET_CODE=1; echo "Remote server build failed"; }
# This check is intentionally disabled. See: https://github.com/zed-industries/zed/pull/28431
# [[ "${{ needs.migration_checks.result }}" != 'success' ]] && { RET_CODE=1; echo "Migration Checks failed"; }
fi
if [[ "$RET_CODE" -eq 0 ]]; then
echo "All tests passed successfully!"
fi
exit $RET_CODE
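# This "join job" pattern gives branch protection a single required status
# (`tests_pass`) over a conditional matrix: the job-level `always()` keeps it
# running even when a dependency fails or is skipped, and each
# `needs.<job>.result` is only checked when job_spec opted that job in.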
bundle-mac:
timeout-minutes: 120
name: Create a macOS bundle
runs-on:
- self-mini-macos
if: startsWith(github.ref, 'refs/tags/v')
needs: [macos_tests]
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: Install Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
with:
node-version: "18"
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
# We need to fetch more than one commit so that `script/draft-release-notes`
# is able to diff between the current and previous tag.
#
# 25 was chosen arbitrarily.
fetch-depth: 25
clean: false
ref: ${{ github.ref }}
- name: Limit target directory size
run: script/clear-target-dir-if-larger-than 300
- name: Determine version and release channel
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
- name: Draft release notes
run: |
mkdir -p target/
# Ignore any errors that occur while drafting release notes to not fail the build.
script/draft-release-notes "$RELEASE_VERSION" "$RELEASE_CHANNEL" > target/release-notes.md || true
script/create-draft-release target/release-notes.md
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Create macOS app bundle (aarch64)
run: script/bundle-mac aarch64-apple-darwin
- name: Create macOS app bundle (x64)
run: script/bundle-mac x86_64-apple-darwin
- name: Rename binaries
run: |
mv target/aarch64-apple-darwin/release/Zed.dmg target/aarch64-apple-darwin/release/Zed-aarch64.dmg
mv target/x86_64-apple-darwin/release/Zed.dmg target/x86_64-apple-darwin/release/Zed-x86_64.dmg
- uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
name: Upload app bundle to release
if: ${{ env.RELEASE_CHANNEL == 'preview' || env.RELEASE_CHANNEL == 'stable' }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
target/zed-remote-server-macos-x86_64.gz
target/zed-remote-server-macos-aarch64.gz
target/aarch64-apple-darwin/release/Zed-aarch64.dmg
target/x86_64-apple-darwin/release/Zed-x86_64.dmg
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
bundle-linux-x86_x64:
timeout-minutes: 60
name: Linux x86_x64 release bundle
runs-on:
- namespace-profile-16x32-ubuntu-2004 # ubuntu 20.04 for minimal glibc
if: |
( startsWith(github.ref, 'refs/tags/v') )
needs: [linux_tests]
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Install Linux dependencies
run: ./script/linux && ./script/install-mold 2.34.0
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
- name: Create Linux .tar.gz bundle
run: script/bundle-linux
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
target/zed-remote-server-linux-x86_64.gz
target/release/zed-linux-x86_64.tar.gz
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
bundle-linux-aarch64: # this runs on ubuntu 20.04 (arm)
timeout-minutes: 60
name: Linux arm64 release bundle
runs-on:
- namespace-profile-8x32-ubuntu-2004-arm-m4 # ubuntu 20.04 for minimal glibc
if: |
startsWith(github.ref, 'refs/tags/v')
needs: [linux_tests]
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Install Linux dependencies
run: ./script/linux
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
- name: Create and upload Linux .tar.gz bundles
run: script/bundle-linux
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
target/zed-remote-server-linux-aarch64.gz
target/release/zed-linux-aarch64.tar.gz
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
freebsd:
timeout-minutes: 60
runs-on: github-8vcpu-ubuntu-2404
if: |
false && ( startsWith(github.ref, 'refs/tags/v') )
needs: [linux_tests]
name: Build Zed on FreeBSD
steps:
- uses: actions/checkout@v4
- name: Build FreeBSD remote-server
id: freebsd-build
uses: vmactions/freebsd-vm@c3ae29a132c8ef1924775414107a97cac042aad5 # v1.2.0
with:
usesh: true
release: 13.5
copyback: true
prepare: |
pkg install -y \
bash curl jq git \
rustup-init cmake-core llvm-devel-lite pkgconf protobuf # libx11 alsa-lib rust-bindgen-cli
run: |
freebsd-version
sysctl hw.model
sysctl hw.ncpu
sysctl hw.physmem
sysctl hw.usermem
git config --global --add safe.directory /home/runner/work/zed/zed
rustup-init --profile minimal --default-toolchain none -y
. "$HOME/.cargo/env"
./script/bundle-freebsd
mkdir -p out/
mv "target/zed-remote-server-freebsd-x86_64.gz" out/
rm -rf target/
cargo clean
- name: Upload Artifact to Workflow - zed-remote-server (run-bundling)
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4
if: contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-freebsd.gz
path: out/zed-remote-server-freebsd-x86_64.gz
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
if: ${{ !(contains(github.event.pull_request.labels.*.name, 'run-bundling')) }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
out/zed-remote-server-freebsd-x86_64.gz
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
nix-build:
name: Build with Nix
uses: ./.github/workflows/nix_build.yml
needs: [job_spec]
if: github.repository_owner == 'zed-industries' &&
(contains(github.event.pull_request.labels.*.name, 'run-nix') ||
needs.job_spec.outputs.run_nix == 'true')
secrets: inherit
bundle-windows-x64:
timeout-minutes: 120
name: Create a Windows installer for x86_64
runs-on: [self-32vcpu-windows-2022]
if: |
( startsWith(github.ref, 'refs/tags/v') )
needs: [windows_tests]
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: "http://timestamp.acs.microsoft.com"
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
working-directory: ${{ env.ZED_WORKSPACE }}
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel.ps1
- name: Build Zed installer
working-directory: ${{ env.ZED_WORKSPACE }}
run: script/bundle-windows.ps1
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: ${{ env.SETUP_PATH }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
bundle-windows-aarch64:
timeout-minutes: 120
name: Create a Windows installer for aarch64
runs-on: [self-32vcpu-windows-2022]
if: |
( startsWith(github.ref, 'refs/tags/v') )
needs: [windows_tests]
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: "http://timestamp.acs.microsoft.com"
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
working-directory: ${{ env.ZED_WORKSPACE }}
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel.ps1
- name: Build Zed installer
working-directory: ${{ env.ZED_WORKSPACE }}
run: script/bundle-windows.ps1 -Architecture aarch64
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: ${{ env.SETUP_PATH }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
auto-release-preview:
name: Auto release preview
if: |
false
&& startsWith(github.ref, 'refs/tags/v')
&& endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre')
needs: [bundle-mac, bundle-linux-x86_x64, bundle-linux-aarch64, bundle-windows-x64, bundle-windows-aarch64]
runs-on:
- self-mini-macos
steps:
- name: gh release
run: gh release edit "$GITHUB_REF_NAME" --draft=false
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Create Sentry release
uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c # v3
env:
SENTRY_ORG: zed-dev
SENTRY_PROJECT: zed
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
with:
environment: production

@@ -29,10 +29,10 @@ jobs:
node-version: '20'
cache: pnpm
cache-dependency-path: script/danger/pnpm-lock.yaml
- name: danger::install_deps
- name: danger::danger_job::install_deps
run: pnpm install --dir script/danger
shell: bash -euxo pipefail {0}
- name: danger::run
- name: danger::danger_job::run
run: pnpm run --dir script/danger danger ci
shell: bash -euxo pipefail {0}
env:

@@ -1,71 +0,0 @@
name: Run Agent Eval
on:
schedule:
- cron: "0 0 * * *"
pull_request:
branches:
- "**"
types: [synchronize, reopened, labeled]
workflow_dispatch:
concurrency:
# Allow only one in-progress workflow per non-`main` branch: on `main` the group
# includes the commit SHA (every push runs), while other refs share the constant
# 'anysha', so a new push cancels the previous run for that branch.
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: 0
RUST_BACKTRACE: 1
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_EVAL_TELEMETRY: 1
jobs:
run_eval:
timeout-minutes: 60
name: Run Agent Eval
if: >
github.repository_owner == 'zed-industries' &&
(github.event_name != 'pull_request' || contains(github.event.pull_request.labels.*.name, 'run-eval'))
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Linux dependencies
run: ./script/linux
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Compile eval
run: cargo build --package=eval
- name: Run eval
run: cargo run --package=eval -- --repetitions=8 --concurrency=1
# Even though the Linux runner is not stateful, in theory there is no need to do this cleanup.
# But, to avoid potential issues in the future if we choose to use a stateful Linux runner and forget to add code
# to clean up the config file, I've included the cleanup code here as a precaution.
# While it's not strictly necessary at this moment, I believe it's better to err on the side of caution.
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo

@@ -1,97 +0,0 @@
# Generated from xtask::workflows::nix_build
# Rebuild with `cargo xtask workflows`.
name: nix_build
env:
CARGO_TERM_COLOR: always
RUST_BACKTRACE: '1'
CARGO_INCREMENTAL: '0'
on:
pull_request:
branches:
- '**'
paths:
- nix/**
- flake.*
- Cargo.*
- rust-toolchain.toml
- .cargo/config.toml
push:
branches:
- main
- v[0-9]+.[0-9]+.x
paths:
- nix/**
- flake.*
- Cargo.*
- rust-toolchain.toml
- .cargo/config.toml
workflow_call: {}
jobs:
build_nix_linux_x86_64:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-32x64-ubuntu-2004
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::install_nix
uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- name: nix_build::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
build_nix_mac_aarch64:
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::set_path
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: nix_build::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
- name: nix_build::limit_store
run: |-
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
fi
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true

.github/workflows/release.yml vendored Normal file
@@ -0,0 +1,486 @@
# Generated from xtask::workflows::release
# Rebuild with `cargo xtask workflows`.
name: release
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
on:
push:
tags:
- v*
jobs:
run_tests_mac:
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_linux:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 100
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_windows:
if: github.repository_owner == 'zed-industries'
runs-on: self-32vcpu-windows-2022
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
New-Item -ItemType Directory -Path "./../.cargo" -Force
Copy-Item -Path "./.cargo/ci-config.toml" -Destination "./../.cargo/config.toml"
shell: pwsh
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy.ps1
shell: pwsh
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: pwsh
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than.ps1 250
shell: pwsh
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: pwsh
- name: steps::cleanup_cargo_config
if: always()
run: |
Remove-Item -Recurse -Path "./../.cargo" -Force -ErrorAction SilentlyContinue
shell: pwsh
timeout-minutes: 60
check_scripts:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_tests::check_scripts::run_shellcheck
run: ./script/shellcheck-scripts error
shell: bash -euxo pipefail {0}
- id: get_actionlint
name: run_tests::check_scripts::download_actionlint
run: bash <(curl https://raw.githubusercontent.com/rhysd/actionlint/main/scripts/download-actionlint.bash)
shell: bash -euxo pipefail {0}
- name: run_tests::check_scripts::run_actionlint
run: |
${{ steps.get_actionlint.outputs.executable }} -color
shell: bash -euxo pipefail {0}
- name: run_tests::check_scripts::check_xtask_workflows
run: |
cargo xtask workflows
if ! git diff --exit-code .github; then
echo "Error: .github directory has uncommitted changes after running 'cargo xtask workflows'"
echo "Please run 'cargo xtask workflows' locally and commit the changes"
exit 1
fi
shell: bash -euxo pipefail {0}
timeout-minutes: 60
create_draft_release:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
fetch-depth: 25
ref: ${{ github.ref }}
- name: script/determine-release-channel
run: script/determine-release-channel
shell: bash -euxo pipefail {0}
- name: mkdir -p target/
run: mkdir -p target/
shell: bash -euxo pipefail {0}
- name: release::create_draft_release::generate_release_notes
run: node --redirect-warnings=/dev/null ./script/draft-release-notes "$RELEASE_VERSION" "$RELEASE_CHANNEL" > target/release-notes.md
shell: bash -euxo pipefail {0}
- name: release::create_draft_release::create_release
run: script/create-draft-release target/release-notes.md
shell: bash -euxo pipefail {0}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
timeout-minutes: 60
bundle_linux_arm64:
needs:
- run_tests_linux
- check_scripts
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/zed-remote-server-*.gz
if-no-files-found: error
outputs:
zed: zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
remote-server: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
timeout-minutes: 60
bundle_linux_x86_64:
needs:
- run_tests_linux
- check_scripts
runs-on: namespace-profile-32x64-ubuntu-2004
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/zed-remote-server-*.gz
if-no-files-found: error
outputs:
zed: zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
remote-server: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
timeout-minutes: 60
bundle_mac_arm64:
needs:
- run_tests_mac
- check_scripts
runs-on: self-mini-macos
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-aarch64.gz
path: target/zed-remote-server-macos-aarch64.gz
if-no-files-found: error
outputs:
zed: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg
remote-server: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-aarch64.gz
timeout-minutes: 60
bundle_mac_x86_64:
needs:
- run_tests_mac
- check_scripts
runs-on: self-mini-macos
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg
path: target/x86_64-apple-darwin/release/Zed.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-x86_64.gz
path: target/zed-remote-server-macos-x86_64.gz
if-no-files-found: error
outputs:
zed: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg
remote-server: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-x86_64.gz
timeout-minutes: 60
bundle_windows_arm64:
needs:
- run_tests_windows
- check_scripts
runs-on: self-32vcpu-windows-2022
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows
run: script/bundle-windows.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.exe
path: ${{ env.SETUP_PATH }}
if-no-files-found: error
outputs:
zed: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.exe
timeout-minutes: 60
bundle_windows_x86_64:
needs:
- run_tests_windows
- check_scripts
runs-on: self-32vcpu-windows-2022
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows
run: script/bundle-windows.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.exe
path: ${{ env.SETUP_PATH }}
if-no-files-found: error
outputs:
zed: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.exe
timeout-minutes: 60
upload_release_assets:
needs:
- create_draft_release
- bundle_linux_arm64
- bundle_linux_x86_64
- bundle_mac_arm64
- bundle_mac_x86_64
- bundle_windows_arm64
- bundle_windows_x86_64
runs-on: namespace-profile-4x8-ubuntu-2204
steps:
- name: release::upload_release_assets::download_workflow_artifacts
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53
with:
path: ./artifacts/
- name: ls -lR ./artifacts
run: ls -lR ./artifacts
shell: bash -euxo pipefail {0}
- name: release::upload_release_assets::prep_release_artifacts
run: |-
mkdir -p release-artifacts/
mv ./artifacts/${{ needs.bundle_mac_x86_64.outputs.zed }}/* release-artifacts/Zed-x86_64.dmg
mv ./artifacts/${{ needs.bundle_mac_arm64.outputs.zed }}/* release-artifacts/Zed-aarch64.dmg
mv ./artifacts/${{ needs.bundle_windows_x86_64.outputs.zed }}/* release-artifacts/Zed-x86_64.exe
mv ./artifacts/${{ needs.bundle_windows_arm64.outputs.zed }}/* release-artifacts/Zed-aarch64.exe
mv ./artifacts/${{ needs.bundle_linux_arm64.outputs.zed }}/* release-artifacts/zed-linux-aarch64.tar.gz
mv ./artifacts/${{ needs.bundle_linux_x86_64.outputs.zed }}/* release-artifacts/zed-linux-x86_64.tar.gz
mv ./artifacts/${{ needs.bundle_linux_x86_64.outputs.remote-server }}/* release-artifacts/zed-remote-server-linux-x86_64.gz
mv ./artifacts/${{ needs.bundle_linux_arm64.outputs.remote-server }}/* release-artifacts/zed-remote-server-linux-aarch64.gz
mv ./artifacts/${{ needs.bundle_mac_x86_64.outputs.remote-server }}/* release-artifacts/zed-remote-server-macos-x86_64.gz
mv ./artifacts/${{ needs.bundle_mac_arm64.outputs.remote-server }}/* release-artifacts/zed-remote-server-macos-aarch64.gz
shell: bash -euxo pipefail {0}
- name: gh release upload "$GITHUB_REF_NAME" --repo=zed-industries/zed release-artifacts/*
run: gh release upload "$GITHUB_REF_NAME" --repo=zed-industries/zed release-artifacts/*
shell: bash -euxo pipefail {0}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
auto_release_preview:
needs:
- upload_release_assets
if: |
false
&& startsWith(github.ref, 'refs/tags/v')
&& endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre')
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: gh release edit "$GITHUB_REF_NAME" --repo=zed-industries/zed --draft=false
run: gh release edit "$GITHUB_REF_NAME" --repo=zed-industries/zed --draft=false
shell: bash -euxo pipefail {0}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: release::auto_release_preview::create_sentry_release
uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c
with:
environment: production
env:
SENTRY_ORG: zed-dev
SENTRY_PROJECT: zed
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
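The concurrency group above uses the expression-language ternary idiom `cond && a || b`: on `main` each push gets its own group keyed by the commit SHA (so later pushes do not cancel earlier runs), while every other ref shares the fixed key `'anysha'`, so a new run cancels the in-progress one. A minimal standalone sketch (hypothetical workflow, same idiom):

```yaml
name: concurrency-sketch
on: push
concurrency:
  # `a && b || c` acts as a ternary here because `github.sha` is always truthy.
  group: ${{ github.workflow }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
  cancel-in-progress: true
jobs:
  noop:
    runs-on: ubuntu-latest
    steps:
      - run: echo ok
```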


@@ -201,9 +201,6 @@ jobs:
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::add_rust_to_path
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: ./script/linux
run: ./script/linux
shell: bash -euxo pipefail {0}
@@ -242,9 +239,6 @@ jobs:
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::add_rust_to_path
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: ./script/linux
run: ./script/linux
shell: bash -euxo pipefail {0}
@@ -298,11 +292,11 @@ jobs:
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::build_zed_installer
- name: run_bundling::bundle_windows
run: script/bundle-windows.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::upload_zed_nightly_windows
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
@@ -340,11 +334,11 @@ jobs:
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::build_zed_installer
- name: run_bundling::bundle_windows
run: script/bundle-windows.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::upload_zed_nightly_windows
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
@@ -365,17 +359,17 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::install_nix
- name: nix_build::build_nix::install_nix
uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- name: nix_build::cachix_action
- name: nix_build::build_nix::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
- name: nix_build::build
- name: nix_build::build_nix::build
run: nix build .#default -L --accept-flake-config
shell: bash -euxo pipefail {0}
timeout-minutes: 60
@@ -396,21 +390,21 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::set_path
- name: nix_build::build_nix::set_path
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: nix_build::cachix_action
- name: nix_build::build_nix::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
- name: nix_build::build
- name: nix_build::build_nix::build
run: nix build .#default -L --accept-flake-config
shell: bash -euxo pipefail {0}
- name: nix_build::limit_store
- name: nix_build::build_nix::limit_store
run: |-
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
@@ -434,7 +428,7 @@ jobs:
with:
clean: false
fetch-depth: 0
- name: release_nightly::update_nightly_tag
- name: release_nightly::update_nightly_tag_job::update_nightly_tag
run: |
if [ "$(git rev-parse nightly)" = "$(git rev-parse HEAD)" ]; then
echo "Nightly tag already points to current commit. Skipping tagging."
@@ -445,7 +439,7 @@ jobs:
git tag -f nightly
git push origin nightly --force
shell: bash -euxo pipefail {0}
- name: release_nightly::create_sentry_release
- name: release_nightly::update_nightly_tag_job::create_sentry_release
uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c
with:
environment: production

.github/workflows/run_agent_evals.yml (new file)

@@ -0,0 +1,62 @@
# Generated from xtask::workflows::run_agent_evals
# Rebuild with `cargo xtask workflows`.
name: run_agent_evals
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_EVAL_TELEMETRY: '1'
on:
pull_request:
types:
- synchronize
- reopened
- labeled
branches:
- '**'
schedule:
- cron: 0 0 * * *
workflow_dispatch: {}
jobs:
agent_evals:
if: |
github.repository_owner == 'zed-industries' &&
(github.event_name != 'pull_request' || contains(github.event.pull_request.labels.*.name, 'run-eval'))
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: cargo build --package=eval
run: cargo build --package=eval
shell: bash -euxo pipefail {0}
- name: run_agent_evals::agent_evals::run_eval
run: cargo run --package=eval -- --repetitions=8 --concurrency=1
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
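The `if:` condition on `agent_evals` above gates PR-triggered runs behind a label while letting scheduled and manual runs through unconditionally. A minimal standalone sketch (hypothetical workflow) of the same gate:

```yaml
name: label-gate-sketch
on:
  pull_request:
    types: [labeled, synchronize, reopened]
  schedule:
    - cron: 0 0 * * *
  workflow_dispatch: {}
jobs:
  gated:
    # Non-PR events (schedule, workflow_dispatch) always pass; PRs need the label.
    if: >-
      github.event_name != 'pull_request' ||
      contains(github.event.pull_request.labels.*.name, 'run-eval')
    runs-on: ubuntu-latest
    steps:
      - run: echo "label gate passed"
```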


@@ -48,11 +48,16 @@ jobs:
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg
path: target/x86_64-apple-darwin/release/Zed.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-x86_64.gz
path: target/zed-remote-server-macos-x86_64.gz
if-no-files-found: error
outputs:
zed: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg
remote-server: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-x86_64.gz
timeout-minutes: 60
bundle_mac_arm64:
if: |-
@@ -89,11 +94,16 @@ jobs:
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-aarch64.gz
path: target/zed-remote-server-macos-aarch64.gz
if-no-files-found: error
outputs:
zed: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg
remote-server: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-aarch64.gz
timeout-minutes: 60
bundle_linux_x86_64:
if: |-
@@ -123,11 +133,16 @@ jobs:
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/release/zed-remote-server-*.tar.gz
path: target/zed-remote-server-*.gz
if-no-files-found: error
outputs:
zed: zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
remote-server: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
timeout-minutes: 60
bundle_linux_arm64:
if: |-
@@ -157,11 +172,16 @@ jobs:
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/release/zed-remote-server-*.tar.gz
path: target/zed-remote-server-*.gz
if-no-files-found: error
outputs:
zed: zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
remote-server: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
timeout-minutes: 60
bundle_windows_x86_64:
if: |-
@@ -196,6 +216,9 @@ jobs:
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.exe
path: ${{ env.SETUP_PATH }}
if-no-files-found: error
outputs:
zed: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.exe
timeout-minutes: 60
bundle_windows_arm64:
if: |-
@@ -230,6 +253,9 @@ jobs:
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.exe
path: ${{ env.SETUP_PATH }}
if-no-files-found: error
outputs:
zed: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.exe
timeout-minutes: 60
concurrency:
group: ${{ github.workflow }}-${{ github.head_ref || github.ref }}


@@ -444,18 +444,18 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::install_nix
- name: nix_build::build_nix::install_nix
uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- name: nix_build::cachix_action
- name: nix_build::build_nix::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build
- name: nix_build::build_nix::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
timeout-minutes: 60
@@ -475,22 +475,22 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::set_path
- name: nix_build::build_nix::set_path
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: nix_build::cachix_action
- name: nix_build::build_nix::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build
- name: nix_build::build_nix::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
- name: nix_build::limit_store
- name: nix_build::build_nix::limit_store
run: |-
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true

.github/workflows/run_unit_evals.yml (new file)

@@ -0,0 +1,63 @@
# Generated from xtask::workflows::run_agent_evals
# Rebuild with `cargo xtask workflows`.
name: run_agent_evals
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
on:
schedule:
- cron: 47 1 * * 2
workflow_dispatch: {}
jobs:
unit_evals:
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 100
shell: bash -euxo pipefail {0}
- name: ./script/run-unit-evals
run: ./script/run-unit-evals
shell: bash -euxo pipefail {0}
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
- name: run_agent_evals::unit_evals::send_failure_to_slack
if: ${{ failure() }}
uses: slackapi/slack-github-action@b0fa283ad8fea605de13dc3f449259339835fc52
with:
method: chat.postMessage
token: ${{ secrets.SLACK_APP_ZED_UNIT_EVALS_BOT_TOKEN }}
payload: |
channel: C04UDRNNJFQ
text: "Unit Evals Failed: https://github.com/zed-industries/zed/actions/runs/${{ github.run_id }}"
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true


@@ -1,86 +0,0 @@
name: Run Unit Evals
on:
schedule:
# GitHub might drop jobs at busy times, so we choose a random time in the middle of the night.
- cron: "47 1 * * 2"
workflow_dispatch:
concurrency:
# Allow only one workflow per any non-`main` branch.
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: 0
RUST_BACKTRACE: 1
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
jobs:
unit_evals:
if: github.repository_owner == 'zed-industries'
timeout-minutes: 60
name: Run unit evals
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Linux dependencies
run: ./script/linux
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Install Rust
shell: bash -euxo pipefail {0}
run: |
cargo install cargo-nextest --locked
- name: Install Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
with:
node-version: "18"
- name: Limit target directory size
shell: bash -euxo pipefail {0}
run: script/clear-target-dir-if-larger-than 100
- name: Run unit evals
shell: bash -euxo pipefail {0}
run: cargo nextest run --workspace --no-fail-fast --features unit-eval --no-capture -E 'test(::eval_)'
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
- name: Send failure message to Slack channel if needed
if: ${{ failure() }}
uses: slackapi/slack-github-action@b0fa283ad8fea605de13dc3f449259339835fc52
with:
method: chat.postMessage
token: ${{ secrets.SLACK_APP_ZED_UNIT_EVALS_BOT_TOKEN }}
payload: |
channel: C04UDRNNJFQ
text: "Unit Evals Failed: https://github.com/zed-industries/zed/actions/runs/${{ github.run_id }}"
# Even though the Linux runner is not stateful, and in theory there is no need to do this cleanup,
# to avoid potential issues in the future if we choose to use a stateful Linux runner and forget to add code
# to clean up the config file, I've included the cleanup code here as a precaution.
# While it's not strictly necessary at this moment, I believe it's better to err on the side of caution.
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
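The `Configure CI` / `Clean CI config file` pair works because Cargo also reads `.cargo/config.toml` from ancestor directories: copying the CI config one level above the checkout applies it to the build, and the `always()` cleanup removes it afterwards. A rough local sketch of the same sequence (paths are illustrative, not the runner's real layout):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Stand-in for the runner's work area: a checkout at $tmp/repo.
tmp=$(mktemp -d)
mkdir -p "$tmp/repo/.cargo"
cd "$tmp/repo"
printf '[build]\njobs = 4\n' > .cargo/ci-config.toml

# "Configure CI": place the config where cargo's ancestor lookup finds it.
mkdir -p ../.cargo
cp .cargo/ci-config.toml ../.cargo/config.toml
test -f ../.cargo/config.toml && echo "config active"

# "Clean CI config file" (runs under `if: always()` in the workflow).
rm -rf ../.cargo
test ! -e ../.cargo && echo "cleaned"
```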

Cargo.lock (generated)

@@ -1339,6 +1339,7 @@ dependencies = [
"settings",
"smol",
"tempfile",
"util",
"which 6.0.3",
"workspace",
]
@@ -4528,12 +4529,15 @@ dependencies = [
"fs",
"futures 0.3.31",
"gpui",
"http_client",
"json_dotpath",
"language",
"log",
"node_runtime",
"paths",
"serde",
"serde_json",
"settings",
"smol",
"task",
"util",
@@ -4932,6 +4936,7 @@ dependencies = [
"editor",
"gpui",
"indoc",
"itertools 0.14.0",
"language",
"log",
"lsp",
@@ -7074,6 +7079,7 @@ dependencies = [
"serde_json",
"settings",
"url",
"urlencoding",
"util",
]
@@ -16362,7 +16368,7 @@ dependencies = [
"arrayvec",
"ctor",
"futures 0.3.31",
"itertools 0.14.0",
"futures-lite 1.13.0",
"log",
"pollster 0.4.0",
"rand 0.9.2",
@@ -18050,7 +18056,7 @@ dependencies = [
[[package]]
name = "tree-sitter-gomod"
version = "1.1.1"
source = "git+https://github.com/camdencheek/tree-sitter-go-mod?rev=6efb59652d30e0e9cd5f3b3a669afd6f1a926d3c#6efb59652d30e0e9cd5f3b3a669afd6f1a926d3c"
source = "git+https://github.com/camdencheek/tree-sitter-go-mod?rev=2e886870578eeba1927a2dc4bd2e2b3f598c5f9a#2e886870578eeba1927a2dc4bd2e2b3f598c5f9a"
dependencies = [
"cc",
"tree-sitter-language",
@@ -21228,6 +21234,7 @@ dependencies = [
"project_symbols",
"prompt_store",
"proto",
"rayon",
"recent_projects",
"release_channel",
"remote",
@@ -21754,6 +21761,7 @@ dependencies = [
"polars",
"project",
"prompt_store",
"pulldown-cmark 0.12.2",
"release_channel",
"reqwest_client",
"serde",
@@ -21763,6 +21771,7 @@ dependencies = [
"smol",
"soa-rs",
"terminal_view",
"toml 0.8.23",
"util",
"watch",
"zeta",


@@ -680,7 +680,7 @@ tree-sitter-elixir = "0.3"
tree-sitter-embedded-template = "0.23.0"
tree-sitter-gitcommit = { git = "https://github.com/zed-industries/tree-sitter-git-commit", rev = "88309716a69dd13ab83443721ba6e0b491d37ee9" }
tree-sitter-go = "0.23"
tree-sitter-go-mod = { git = "https://github.com/camdencheek/tree-sitter-go-mod", rev = "6efb59652d30e0e9cd5f3b3a669afd6f1a926d3c", package = "tree-sitter-gomod" }
tree-sitter-go-mod = { git = "https://github.com/camdencheek/tree-sitter-go-mod", rev = "2e886870578eeba1927a2dc4bd2e2b3f598c5f9a", package = "tree-sitter-gomod" }
tree-sitter-gowork = { git = "https://github.com/zed-industries/tree-sitter-go-work", rev = "acb0617bf7f4fda02c6217676cc64acb89536dc7" }
tree-sitter-heex = { git = "https://github.com/zed-industries/tree-sitter-heex", rev = "1dd45142fbb05562e35b2040c6129c9bca346592" }
tree-sitter-html = "0.23"


@@ -33,6 +33,7 @@ gpui
git
= @cole-miller
= @danilo-leal
= @dvdsk
linux
= @dvdsk


@@ -602,7 +602,9 @@
"whole_word": false,
"case_sensitive": false,
"include_ignored": false,
"regex": false
"regex": false,
// Whether to center the cursor on each search match when navigating.
"center_on_match": false
},
// When to populate a new search's query based on the text under the cursor.
// This setting can take the following three values:


@@ -3,7 +3,6 @@ mod diff;
mod mention;
mod terminal;
use ::terminal::terminal_settings::TerminalSettings;
use agent_settings::AgentSettings;
use collections::HashSet;
pub use connection::*;
@@ -12,7 +11,7 @@ use language::language_settings::FormatOnSave;
pub use mention::*;
use project::lsp_store::{FormatTrigger, LspFormatTarget};
use serde::{Deserialize, Serialize};
use settings::{Settings as _, SettingsLocation};
use settings::Settings as _;
use task::{Shell, ShellBuilder};
pub use terminal::*;
@@ -2141,17 +2140,9 @@ impl AcpThread {
) -> Task<Result<Entity<Terminal>>> {
let env = match &cwd {
Some(dir) => self.project.update(cx, |project, cx| {
let worktree = project.find_worktree(dir.as_path(), cx);
let shell = TerminalSettings::get(
worktree.as_ref().map(|(worktree, path)| SettingsLocation {
worktree_id: worktree.read(cx).id(),
path: &path,
}),
cx,
)
.shell
.clone();
project.directory_environment(&shell, dir.as_path().into(), cx)
project.environment().update(cx, |env, cx| {
env.directory_environment(dir.as_path().into(), cx)
})
}),
None => Task::ready(None).shared(),
};


@@ -5,10 +5,8 @@ use gpui::{App, AppContext, AsyncApp, Context, Entity, Task};
use language::LanguageRegistry;
use markdown::Markdown;
use project::Project;
use settings::{Settings as _, SettingsLocation};
use std::{path::PathBuf, process::ExitStatus, sync::Arc, time::Instant};
use task::Shell;
use terminal::terminal_settings::TerminalSettings;
use util::get_default_system_shell_preferring_bash;
pub struct Terminal {
@@ -187,17 +185,9 @@ pub async fn create_terminal_entity(
let mut env = if let Some(dir) = &cwd {
project
.update(cx, |project, cx| {
let worktree = project.find_worktree(dir.as_path(), cx);
let shell = TerminalSettings::get(
worktree.as_ref().map(|(worktree, path)| SettingsLocation {
worktree_id: worktree.read(cx).id(),
path: &path,
}),
cx,
)
.shell
.clone();
project.directory_environment(&shell, dir.clone().into(), cx)
project.environment().update(cx, |env, cx| {
env.directory_environment(dir.clone().into(), cx)
})
})?
.await
.unwrap_or_default()


@@ -19,7 +19,7 @@ use markdown::{CodeBlockRenderer, Markdown, MarkdownElement, MarkdownStyle};
use project::Project;
use settings::Settings;
use theme::ThemeSettings;
use ui::{Tooltip, prelude::*};
use ui::{Tooltip, WithScrollbar, prelude::*};
use util::ResultExt as _;
use workspace::{
Item, ItemHandle, ToolbarItemEvent, ToolbarItemLocation, ToolbarItemView, Workspace,
@@ -291,17 +291,19 @@ impl AcpTools {
let expanded = self.expanded.contains(&index);
v_flex()
.w_full()
.px_4()
.py_3()
.border_color(colors.border)
.border_b_1()
.gap_2()
.items_start()
.font_buffer(cx)
.text_size(base_size)
.id(index)
.group("message")
.cursor_pointer()
.font_buffer(cx)
.w_full()
.py_3()
.pl_4()
.pr_5()
.gap_2()
.items_start()
.text_size(base_size)
.border_color(colors.border)
.border_b_1()
.hover(|this| this.bg(colors.element_background.opacity(0.5)))
.on_click(cx.listener(move |this, _, _, cx| {
if this.expanded.contains(&index) {
@@ -323,15 +325,14 @@ impl AcpTools {
h_flex()
.w_full()
.gap_2()
.items_center()
.flex_shrink_0()
.child(match message.direction {
acp::StreamMessageDirection::Incoming => {
ui::Icon::new(ui::IconName::ArrowDown).color(Color::Error)
}
acp::StreamMessageDirection::Outgoing => {
ui::Icon::new(ui::IconName::ArrowUp).color(Color::Success)
}
acp::StreamMessageDirection::Incoming => Icon::new(IconName::ArrowDown)
.color(Color::Error)
.size(IconSize::Small),
acp::StreamMessageDirection::Outgoing => Icon::new(IconName::ArrowUp)
.color(Color::Success)
.size(IconSize::Small),
})
.child(
Label::new(message.name.clone())
@@ -501,7 +502,7 @@ impl Focusable for AcpTools {
}
impl Render for AcpTools {
fn render(&mut self, _window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
v_flex()
.track_focus(&self.focus_handle)
.size_full()
@@ -516,13 +517,19 @@ impl Render for AcpTools {
.child("No messages recorded yet")
.into_any()
} else {
list(
connection.list_state.clone(),
cx.processor(Self::render_message),
)
.with_sizing_behavior(gpui::ListSizingBehavior::Auto)
.flex_grow()
.into_any()
div()
.size_full()
.flex_grow()
.child(
list(
connection.list_state.clone(),
cx.processor(Self::render_message),
)
.with_sizing_behavior(gpui::ListSizingBehavior::Auto)
.size_full(),
)
.vertical_scrollbar_for(connection.list_state.clone(), window, cx)
.into_any()
}
}
None => h_flex()


@@ -13,7 +13,15 @@ const EDITS_END_TAG: &str = "</edits>";
const SEARCH_MARKER: &str = "<<<<<<< SEARCH";
const SEPARATOR_MARKER: &str = "=======";
const REPLACE_MARKER: &str = ">>>>>>> REPLACE";
const END_TAGS: [&str; 3] = [OLD_TEXT_END_TAG, NEW_TEXT_END_TAG, EDITS_END_TAG];
const SONNET_PARAMETER_INVOKE_1: &str = "</parameter>\n</invoke>";
const SONNET_PARAMETER_INVOKE_2: &str = "</parameter></invoke>";
const END_TAGS: [&str; 5] = [
OLD_TEXT_END_TAG,
NEW_TEXT_END_TAG,
EDITS_END_TAG,
SONNET_PARAMETER_INVOKE_1, // Remove this after switching to streaming tool calls
SONNET_PARAMETER_INVOKE_2,
];
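The extended `END_TAGS` list works because the parser treats the earliest occurrence of any entry as the terminator of the pending section, which is why a stray `</parameter></invoke>` still closes an unfinished `<new_text>`. A minimal standalone sketch of that scan (a hypothetical helper, not the actual `EditParser`):

```rust
// Hypothetical sketch (not the actual EditParser): scan a buffer for the
// earliest occurrence of any known end tag and treat it as the terminator.
const END_TAGS: [&str; 3] = ["</new_text>", "</parameter>\n</invoke>", "</parameter></invoke>"];

fn find_end_tag(buffer: &str) -> Option<(usize, &'static str)> {
    END_TAGS
        .iter()
        .filter_map(|tag| buffer.find(tag).map(|ix| (ix, *tag)))
        .min_by_key(|&(ix, _)| ix)
}

fn main() {
    // Sonnet sometimes ends "<new_text>..." with "</parameter></invoke>"
    // instead of "</new_text>"; both terminate the pending edit here.
    let chunk = "updated text</parameter></invoke>";
    let (ix, tag) = find_end_tag(chunk).unwrap();
    assert_eq!(&chunk[..ix], "updated text");
    assert_eq!(tag, "</parameter></invoke>");
    println!("terminated by {tag}");
}
```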
#[derive(Debug)]
pub enum EditParserEvent {
@@ -547,6 +555,37 @@ mod tests {
);
}
#[gpui::test(iterations = 1000)]
fn test_xml_edits_with_closing_parameter_invoke(mut rng: StdRng) {
// This case is a regression with Claude Sonnet 4.5.
// Sometimes Sonnet thinks it is making a tool call
// and closes its response with '</parameter></invoke>'
// instead of properly closing with </new_text>.
let mut parser = EditParser::new(EditFormat::XmlTags);
assert_eq!(
parse_random_chunks(
indoc! {"
<old_text>some text</old_text><new_text>updated text</parameter></invoke>
"},
&mut parser,
&mut rng
),
vec![Edit {
old_text: "some text".to_string(),
new_text: "updated text".to_string(),
line_hint: None,
},]
);
assert_eq!(
parser.finish(),
EditParserMetrics {
tags: 2,
mismatched_tags: 1
}
);
}
#[gpui::test(iterations = 1000)]
fn test_xml_nested_tags(mut rng: StdRng) {
let mut parser = EditParser::new(EditFormat::XmlTags);
@@ -1035,6 +1074,11 @@ mod tests {
last_ix = chunk_ix;
}
if new_text.is_some() {
pending_edit.new_text = new_text.take().unwrap();
edits.push(pending_edit);
}
edits
}
}

View File

@@ -1,8 +1,10 @@
use acp_thread::AgentSessionModes;
use agent_client_protocol as acp;
use agent_servers::AgentServer;
use agent_settings::AgentSettings;
use fs::Fs;
use gpui::{Context, Entity, FocusHandle, WeakEntity, Window, prelude::*};
use settings::Settings as _;
use std::{rc::Rc, sync::Arc};
use ui::{
Button, ContextMenu, ContextMenuEntry, DocumentationEdge, DocumentationSide, KeyBinding,
@@ -84,6 +86,14 @@ impl ModeSelector {
let current_mode = self.connection.current_mode();
let default_mode = self.agent_server.default_mode(cx);
let settings = AgentSettings::get_global(cx);
let side = match settings.dock {
settings::DockPosition::Left => DocumentationSide::Right,
settings::DockPosition::Bottom | settings::DockPosition::Right => {
DocumentationSide::Left
}
};
for mode in all_modes {
let is_selected = &mode.id == &current_mode;
let is_default = Some(&mode.id) == default_mode.as_ref();
@@ -91,7 +101,7 @@ impl ModeSelector {
.toggleable(IconPosition::End, is_selected);
let entry = if let Some(description) = &mode.description {
entry.documentation_aside(DocumentationSide::Left, DocumentationEdge::Bottom, {
entry.documentation_aside(side, DocumentationEdge::Bottom, {
let description = description.clone();
move |cx| {

View File

@@ -3631,6 +3631,7 @@ impl AcpThreadView {
.child(
h_flex()
.id("edits-container")
.cursor_pointer()
.gap_1()
.child(Disclosure::new("edits-disclosure", expanded))
.map(|this| {
@@ -3770,6 +3771,7 @@ impl AcpThreadView {
Label::new(name.to_string())
.size(LabelSize::XSmall)
.buffer_font(cx)
.ml_1p5()
});
let file_icon = FileIcons::get_icon(path.as_std_path(), cx)
@@ -3801,14 +3803,30 @@ impl AcpThreadView {
})
.child(
h_flex()
.id(("file-name-row", index))
.relative()
.id(("file-name", index))
.pr_8()
.gap_1p5()
.w_full()
.overflow_x_scroll()
.child(file_icon)
.child(h_flex().gap_0p5().children(file_name).children(file_path))
.child(
h_flex()
.id(("file-name-path", index))
.cursor_pointer()
.pr_0p5()
.gap_0p5()
.hover(|s| s.bg(cx.theme().colors().element_hover))
.rounded_xs()
.child(file_icon)
.children(file_name)
.children(file_path)
.tooltip(Tooltip::text("Go to File"))
.on_click({
let buffer = buffer.clone();
cx.listener(move |this, _, window, cx| {
this.open_edited_buffer(&buffer, window, cx);
})
}),
)
.child(
div()
.absolute()
@@ -3818,13 +3836,7 @@ impl AcpThreadView {
.bottom_0()
.right_0()
.bg(overlay_gradient),
)
.on_click({
let buffer = buffer.clone();
cx.listener(move |this, _, window, cx| {
this.open_edited_buffer(&buffer, window, cx);
})
}),
),
)
.child(
h_flex()
@@ -4571,14 +4583,29 @@ impl AcpThreadView {
window: &mut Window,
cx: &mut Context<Self>,
) {
if window.is_window_active() || !self.notifications.is_empty() {
if !self.notifications.is_empty() {
return;
}
let settings = AgentSettings::get_global(cx);
let window_is_inactive = !window.is_window_active();
let panel_is_hidden = self
.workspace
.upgrade()
.map(|workspace| AgentPanel::is_hidden(&workspace, cx))
.unwrap_or(true);
let should_notify = window_is_inactive || panel_is_hidden;
if !should_notify {
return;
}
// TODO: Change this once we have title summarization for external agents.
let title = self.agent.name();
match AgentSettings::get_global(cx).notify_when_agent_waiting {
match settings.notify_when_agent_waiting {
NotifyWhenAgentWaiting::PrimaryScreen => {
if let Some(primary) = cx.primary_display() {
self.pop_up(icon, caption.into(), title, window, primary, cx);
@@ -5581,7 +5608,7 @@ fn default_markdown_style(
let theme_settings = ThemeSettings::get_global(cx);
let colors = cx.theme().colors();
let buffer_font_size = TextSize::Small.rems(cx);
let buffer_font_size = theme_settings.agent_buffer_font_size(cx);
let mut text_style = window.text_style();
let line_height = buffer_font_size * 1.75;
@@ -5593,9 +5620,9 @@ fn default_markdown_style(
};
let font_size = if buffer_font {
TextSize::Small.rems(cx)
theme_settings.agent_buffer_font_size(cx)
} else {
TextSize::Default.rems(cx)
theme_settings.agent_ui_font_size(cx)
};
let text_color = if muted_text {
@@ -5892,6 +5919,107 @@ pub(crate) mod tests {
);
}
#[gpui::test]
async fn test_notification_when_panel_hidden(cx: &mut TestAppContext) {
init_test(cx);
let (thread_view, cx) = setup_thread_view(StubAgentServer::default_response(), cx).await;
add_to_workspace(thread_view.clone(), cx);
let message_editor = cx.read(|cx| thread_view.read(cx).message_editor.clone());
message_editor.update_in(cx, |editor, window, cx| {
editor.set_text("Hello", window, cx);
});
// Window is active (don't deactivate), but panel will be hidden
// Note: In the test environment, the panel is not actually added to the dock,
// so AgentPanel::is_hidden will return true.
thread_view.update_in(cx, |thread_view, window, cx| {
thread_view.send(window, cx);
});
cx.run_until_parked();
// Should show notification because window is active but panel is hidden
assert!(
cx.windows()
.iter()
.any(|window| window.downcast::<AgentNotification>().is_some()),
"Expected notification when panel is hidden"
);
}
#[gpui::test]
async fn test_notification_still_works_when_window_inactive(cx: &mut TestAppContext) {
init_test(cx);
let (thread_view, cx) = setup_thread_view(StubAgentServer::default_response(), cx).await;
let message_editor = cx.read(|cx| thread_view.read(cx).message_editor.clone());
message_editor.update_in(cx, |editor, window, cx| {
editor.set_text("Hello", window, cx);
});
// Deactivate window - should show notification regardless of setting
cx.deactivate_window();
thread_view.update_in(cx, |thread_view, window, cx| {
thread_view.send(window, cx);
});
cx.run_until_parked();
// Should still show notification when window is inactive (existing behavior)
assert!(
cx.windows()
.iter()
.any(|window| window.downcast::<AgentNotification>().is_some()),
"Expected notification when window is inactive"
);
}
#[gpui::test]
async fn test_notification_respects_never_setting(cx: &mut TestAppContext) {
init_test(cx);
// Set notify_when_agent_waiting to Never
cx.update(|cx| {
AgentSettings::override_global(
AgentSettings {
notify_when_agent_waiting: NotifyWhenAgentWaiting::Never,
..AgentSettings::get_global(cx).clone()
},
cx,
);
});
let (thread_view, cx) = setup_thread_view(StubAgentServer::default_response(), cx).await;
let message_editor = cx.read(|cx| thread_view.read(cx).message_editor.clone());
message_editor.update_in(cx, |editor, window, cx| {
editor.set_text("Hello", window, cx);
});
// Window is active
thread_view.update_in(cx, |thread_view, window, cx| {
thread_view.send(window, cx);
});
cx.run_until_parked();
// Should NOT show notification because notify_when_agent_waiting is Never
assert!(
!cx.windows()
.iter()
.any(|window| window.downcast::<AgentNotification>().is_some()),
"Expected no notification when notify_when_agent_waiting is Never"
);
}
async fn setup_thread_view(
agent: impl AgentServer + 'static,
cx: &mut TestAppContext,

View File

@@ -23,16 +23,18 @@ use language::LanguageRegistry;
use language_model::{
LanguageModelProvider, LanguageModelProviderId, LanguageModelRegistry, ZED_CLOUD_PROVIDER_ID,
};
use language_models::AllLanguageModelSettings;
use notifications::status_toast::{StatusToast, ToastIcon};
use project::{
agent_server_store::{AgentServerStore, CLAUDE_CODE_NAME, CODEX_NAME, GEMINI_NAME},
context_server_store::{ContextServerConfiguration, ContextServerStatus, ContextServerStore},
};
use rope::Rope;
use settings::{SettingsStore, update_settings_file};
use settings::{Settings, SettingsStore, update_settings_file};
use ui::{
Chip, CommonAnimationExt, ContextMenu, Disclosure, Divider, DividerColor, ElevationIndex,
Indicator, PopoverMenu, Switch, SwitchColor, Tooltip, WithScrollbar, prelude::*,
Button, ButtonStyle, Chip, CommonAnimationExt, ContextMenu, Disclosure, Divider, DividerColor,
ElevationIndex, IconName, IconPosition, IconSize, Indicator, LabelSize, PopoverMenu, Switch,
SwitchColor, Tooltip, WithScrollbar, prelude::*,
};
use util::ResultExt as _;
use workspace::{Workspace, create_and_open_local_file};
@@ -304,10 +306,76 @@ impl AgentConfiguration {
}
})),
)
}),
})
.when(
is_expanded && is_removable_provider(&provider.id(), cx),
|this| {
this.child(
Button::new(
SharedString::from(format!("delete-provider-{provider_id}")),
"Remove Provider",
)
.full_width()
.style(ButtonStyle::Outlined)
.icon_position(IconPosition::Start)
.icon(IconName::Trash)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.label_size(LabelSize::Small)
.on_click(cx.listener({
let provider = provider.clone();
move |this, _event, window, cx| {
this.delete_provider(provider.clone(), window, cx);
}
})),
)
},
),
)
}
fn delete_provider(
&mut self,
provider: Arc<dyn LanguageModelProvider>,
window: &mut Window,
cx: &mut Context<Self>,
) {
let fs = self.fs.clone();
let provider_id = provider.id();
cx.spawn_in(window, async move |_, cx| {
cx.update(|_window, cx| {
update_settings_file(fs.clone(), cx, {
let provider_id = provider_id.clone();
move |settings, _| {
if let Some(ref mut openai_compatible) = settings
.language_models
.as_mut()
.and_then(|lm| lm.openai_compatible.as_mut())
{
let key_to_remove: Arc<str> = Arc::from(provider_id.0.as_ref());
openai_compatible.remove(&key_to_remove);
}
}
});
})
.log_err();
cx.update(|_window, cx| {
LanguageModelRegistry::global(cx).update(cx, {
let provider_id = provider_id.clone();
move |registry, cx| {
registry.unregister_provider(provider_id, cx);
}
})
})
.log_err();
anyhow::Ok(())
})
.detach_and_log_err(cx);
}
fn render_provider_configuration_section(
&mut self,
cx: &mut Context<Self>,
@@ -1225,3 +1293,14 @@ fn find_text_in_buffer(
None
}
}
// OpenAI-compatible providers are user-configured and can be removed,
// whereas built-in providers (like Anthropic, OpenAI, Google, etc.) can't.
//
// If we add more API-compatible provider types in the future,
// they should be included here as removable providers.
fn is_removable_provider(provider_id: &LanguageModelProviderId, cx: &App) -> bool {
AllLanguageModelSettings::get_global(cx)
.openai_compatible
.contains_key(provider_id.0.as_ref())
}

View File

@@ -70,14 +70,6 @@ impl AgentDiffThread {
}
}
fn is_generating(&self, cx: &App) -> bool {
match self {
AgentDiffThread::AcpThread(thread) => {
thread.read(cx).status() == acp_thread::ThreadStatus::Generating
}
}
}
fn has_pending_edit_tool_uses(&self, cx: &App) -> bool {
match self {
AgentDiffThread::AcpThread(thread) => thread.read(cx).has_pending_edit_tool_calls(),
@@ -970,9 +962,7 @@ impl AgentDiffToolbar {
None => ToolbarItemLocation::Hidden,
Some(AgentDiffToolbarItem::Pane(_)) => ToolbarItemLocation::PrimaryRight,
Some(AgentDiffToolbarItem::Editor { state, .. }) => match state {
EditorState::Generating | EditorState::Reviewing => {
ToolbarItemLocation::PrimaryRight
}
EditorState::Reviewing => ToolbarItemLocation::PrimaryRight,
EditorState::Idle => ToolbarItemLocation::Hidden,
},
}
@@ -1050,7 +1040,6 @@ impl Render for AgentDiffToolbar {
let content = match state {
EditorState::Idle => return Empty.into_any(),
EditorState::Generating => vec![spinner_icon],
EditorState::Reviewing => vec![
h_flex()
.child(
@@ -1222,7 +1211,6 @@ pub struct AgentDiff {
pub enum EditorState {
Idle,
Reviewing,
Generating,
}
struct WorkspaceThread {
@@ -1545,15 +1533,11 @@ impl AgentDiff {
multibuffer.add_diff(diff_handle.clone(), cx);
});
let new_state = if thread.is_generating(cx) {
EditorState::Generating
} else {
EditorState::Reviewing
};
let reviewing_state = EditorState::Reviewing;
let previous_state = self
.reviewing_editors
.insert(weak_editor.clone(), new_state.clone());
.insert(weak_editor.clone(), reviewing_state.clone());
if previous_state.is_none() {
editor.update(cx, |editor, cx| {
@@ -1566,7 +1550,9 @@ impl AgentDiff {
unaffected.remove(weak_editor);
}
if new_state == EditorState::Reviewing && previous_state != Some(new_state) {
if reviewing_state == EditorState::Reviewing
&& previous_state != Some(reviewing_state)
{
// Jump to first hunk when we enter review mode
editor.update(cx, |editor, cx| {
let snapshot = multibuffer.read(cx).snapshot(cx);

View File

@@ -729,6 +729,25 @@ impl AgentPanel {
&self.context_server_registry
}
pub fn is_hidden(workspace: &Entity<Workspace>, cx: &App) -> bool {
let workspace_read = workspace.read(cx);
workspace_read
.panel::<AgentPanel>(cx)
.map(|panel| {
let panel_id = Entity::entity_id(&panel);
let is_visible = workspace_read.all_docks().iter().any(|dock| {
dock.read(cx)
.visible_panel()
.is_some_and(|visible_panel| visible_panel.panel_id() == panel_id)
});
!is_visible
})
.unwrap_or(true)
}
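This helper feeds the notification gate in `AcpThreadView`: together with the window-activity check, the decision reduces to a small predicate. A hedged, standalone sketch (illustrative names, not Zed's actual API or full settings enum):

```rust
// Illustrative sketch of the notification gate: notify when the window
// is inactive or the agent panel is hidden, unless the user disabled
// notifications entirely via the Never setting.
#[derive(PartialEq)]
enum NotifyWhenAgentWaiting {
    Never,
    PrimaryScreen,
}

fn should_notify(window_active: bool, panel_hidden: bool, setting: &NotifyWhenAgentWaiting) -> bool {
    if *setting == NotifyWhenAgentWaiting::Never {
        return false;
    }
    !window_active || panel_hidden
}

fn main() {
    // Inactive window notifies; active window notifies only if the panel is hidden.
    assert!(should_notify(false, false, &NotifyWhenAgentWaiting::PrimaryScreen));
    assert!(should_notify(true, true, &NotifyWhenAgentWaiting::PrimaryScreen));
    assert!(!should_notify(true, false, &NotifyWhenAgentWaiting::PrimaryScreen));
    assert!(!should_notify(false, true, &NotifyWhenAgentWaiting::Never));
    println!("ok");
}
```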
fn active_thread_view(&self) -> Option<&Entity<AcpThreadView>> {
match &self.active_view {
ActiveView::ExternalAgentThread { thread_view, .. } => Some(thread_view),

View File

@@ -26,6 +26,7 @@ serde_json.workspace = true
settings.workspace = true
smol.workspace = true
tempfile.workspace = true
util.workspace = true
workspace.workspace = true
[target.'cfg(not(target_os = "windows"))'.dependencies]

View File

@@ -962,7 +962,7 @@ pub async fn finalize_auto_update_on_quit() {
.parent()
.map(|p| p.join("tools").join("auto_update_helper.exe"))
{
let mut command = smol::process::Command::new(helper);
let mut command = util::command::new_smol_command(helper);
command.arg("--launch");
command.arg("false");
if let Ok(mut cmd) = command.spawn() {

View File

@@ -212,7 +212,7 @@ pub fn write_codeblock<'a>(
include_line_numbers: bool,
output: &'a mut String,
) {
writeln!(output, "`````path={}", path.display()).unwrap();
writeln!(output, "`````{}", path.display()).unwrap();
write_excerpts(
excerpts,
sorted_insertions,

View File

@@ -39,6 +39,7 @@ use std::{
Arc,
atomic::{self, AtomicBool, AtomicUsize},
},
time::Duration,
};
use text::Point;
use util::{path, rel_path::rel_path, uri};
@@ -1817,14 +1818,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
settings.project.all_languages.defaults.inlay_hints =
Some(InlayHintSettingsContent {
enabled: Some(true),
show_value_hints: Some(true),
edit_debounce_ms: Some(0),
scroll_debounce_ms: Some(0),
show_type_hints: Some(true),
show_parameter_hints: Some(false),
show_other_hints: Some(true),
show_background: Some(false),
toggle_on_modifiers_press: None,
..InlayHintSettingsContent::default()
})
});
});
@@ -1834,15 +1828,8 @@ async fn test_mutual_editor_inlay_hint_cache_update(
store.update_user_settings(cx, |settings| {
settings.project.all_languages.defaults.inlay_hints =
Some(InlayHintSettingsContent {
show_value_hints: Some(true),
enabled: Some(true),
edit_debounce_ms: Some(0),
scroll_debounce_ms: Some(0),
show_type_hints: Some(true),
show_parameter_hints: Some(false),
show_other_hints: Some(true),
show_background: Some(false),
toggle_on_modifiers_press: None,
..InlayHintSettingsContent::default()
})
});
});
@@ -1935,6 +1922,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
});
let fake_language_server = fake_language_servers.next().await.unwrap();
let editor_a = file_a.await.unwrap().downcast::<Editor>().unwrap();
executor.advance_clock(Duration::from_millis(100));
executor.run_until_parked();
let initial_edit = edits_made.load(atomic::Ordering::Acquire);
@@ -1955,6 +1943,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
.downcast::<Editor>()
.unwrap();
executor.advance_clock(Duration::from_millis(100));
executor.run_until_parked();
editor_b.update(cx_b, |editor, cx| {
assert_eq!(
@@ -1973,6 +1962,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
});
cx_b.focus(&editor_b);
executor.advance_clock(Duration::from_secs(1));
executor.run_until_parked();
editor_a.update(cx_a, |editor, cx| {
assert_eq!(
@@ -1996,6 +1986,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
});
cx_a.focus(&editor_a);
executor.advance_clock(Duration::from_secs(1));
executor.run_until_parked();
editor_a.update(cx_a, |editor, cx| {
assert_eq!(
@@ -2017,6 +2008,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
.into_response()
.expect("inlay refresh request failed");
executor.advance_clock(Duration::from_secs(1));
executor.run_until_parked();
editor_a.update(cx_a, |editor, cx| {
assert_eq!(

View File

@@ -41,6 +41,10 @@ util.workspace = true
[dev-dependencies]
dap = { workspace = true, features = ["test-support"] }
fs = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, features = ["test-support"] }
http_client.workspace = true
node_runtime.workspace = true
settings = { workspace = true, features = ["test-support"] }
task = { workspace = true, features = ["test-support"] }
util = { workspace = true, features = ["test-support"] }

View File

@@ -4,6 +4,8 @@ mod go;
mod javascript;
mod python;
#[cfg(test)]
use std::path::PathBuf;
use std::sync::Arc;
use anyhow::Result;
@@ -38,3 +40,65 @@ pub fn init(cx: &mut App) {
}
})
}
#[cfg(test)]
mod test_mocks {
use super::*;
pub(crate) struct MockDelegate {
worktree_root: PathBuf,
}
impl MockDelegate {
pub(crate) fn new() -> Arc<dyn adapters::DapDelegate> {
Arc::new(Self {
worktree_root: PathBuf::from("/tmp/test"),
})
}
}
#[async_trait::async_trait]
impl adapters::DapDelegate for MockDelegate {
fn worktree_id(&self) -> settings::WorktreeId {
settings::WorktreeId::from_usize(0)
}
fn worktree_root_path(&self) -> &std::path::Path {
&self.worktree_root
}
fn http_client(&self) -> Arc<dyn http_client::HttpClient> {
unimplemented!("Not needed for tests")
}
fn node_runtime(&self) -> node_runtime::NodeRuntime {
unimplemented!("Not needed for tests")
}
fn toolchain_store(&self) -> Arc<dyn language::LanguageToolchainStore> {
unimplemented!("Not needed for tests")
}
fn fs(&self) -> Arc<dyn fs::Fs> {
unimplemented!("Not needed for tests")
}
fn output_to_console(&self, _msg: String) {}
async fn which(&self, _command: &std::ffi::OsStr) -> Option<PathBuf> {
None
}
async fn read_text_file(&self, _path: &util::rel_path::RelPath) -> Result<String> {
Ok(String::new())
}
async fn shell_env(&self) -> collections::HashMap<String, String> {
collections::HashMap::default()
}
fn is_headless(&self) -> bool {
false
}
}
}

View File

@@ -23,6 +23,11 @@ use std::{
use util::command::new_smol_command;
use util::{ResultExt, paths::PathStyle, rel_path::RelPath};
enum DebugpyLaunchMode<'a> {
Normal,
AttachWithConnect { host: Option<&'a str> },
}
#[derive(Default)]
pub(crate) struct PythonDebugAdapter {
base_venv_path: OnceCell<Result<Arc<Path>, String>>,
@@ -36,10 +41,11 @@ impl PythonDebugAdapter {
const LANGUAGE_NAME: &'static str = "Python";
async fn generate_debugpy_arguments(
host: &Ipv4Addr,
async fn generate_debugpy_arguments<'a>(
host: &'a Ipv4Addr,
port: u16,
user_installed_path: Option<&Path>,
launch_mode: DebugpyLaunchMode<'a>,
user_installed_path: Option<&'a Path>,
user_args: Option<Vec<String>>,
) -> Result<Vec<String>> {
let mut args = if let Some(user_installed_path) = user_installed_path {
@@ -62,7 +68,20 @@ impl PythonDebugAdapter {
args.extend(if let Some(args) = user_args {
args
} else {
vec![format!("--host={}", host), format!("--port={}", port)]
match launch_mode {
DebugpyLaunchMode::Normal => {
vec![format!("--host={}", host), format!("--port={}", port)]
}
DebugpyLaunchMode::AttachWithConnect { host } => {
let mut args = vec!["connect".to_string()];
if let Some(host) = host {
args.push(format!("{host}:"));
}
args.push(format!("{port}"));
args
}
}
});
Ok(args)
}
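Under the `AttachWithConnect` branch above, the argument tail becomes `connect`, an optional `host:` entry, then the port, matching the assertions in `test_attach_with_connect_mode_generates_correct_arguments` further down. A minimal sketch of just that branch (standalone, mirroring the diff's logic):

```rust
// Sketch of the argument tail built for debugpy attach-with-connect,
// mirroring the AttachWithConnect branch: "connect", optional "host:",
// then the port as its own argument.
fn connect_args(host: Option<&str>, port: u16) -> Vec<String> {
    let mut args = vec!["connect".to_string()];
    if let Some(host) = host {
        args.push(format!("{host}:"));
    }
    args.push(format!("{port}"));
    args
}

fn main() {
    assert_eq!(connect_args(None, 5678), vec!["connect", "5678"]);
    assert_eq!(
        connect_args(Some("192.168.1.100"), 5678),
        vec!["connect", "192.168.1.100:", "5678"]
    );
    println!("ok");
}
```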
@@ -315,7 +334,46 @@ impl PythonDebugAdapter {
user_env: Option<HashMap<String, String>>,
python_from_toolchain: Option<String>,
) -> Result<DebugAdapterBinary> {
let tcp_connection = config.tcp_connection.clone().unwrap_or_default();
let mut tcp_connection = config.tcp_connection.clone().unwrap_or_default();
let (config_port, config_host) = config
.config
.get("connect")
.map(|value| {
(
value
.get("port")
.and_then(|val| val.as_u64().map(|p| p as u16)),
value.get("host").and_then(|val| val.as_str()),
)
})
.unwrap_or_else(|| {
(
config
.config
.get("port")
.and_then(|port| port.as_u64().map(|p| p as u16)),
config.config.get("host").and_then(|host| host.as_str()),
)
});
let is_attach_with_connect = if config
.config
.get("request")
.is_some_and(|val| val.as_str().is_some_and(|request| request == "attach"))
{
if tcp_connection.host.is_some() && config_host.is_some() {
bail!("Cannot have two different hosts in debug configuration")
} else if tcp_connection.port.is_some() && config_port.is_some() {
bail!("Cannot have two different ports in debug configuration")
}
tcp_connection.port = config_port;
DebugpyLaunchMode::AttachWithConnect { host: config_host }
} else {
DebugpyLaunchMode::Normal
};
let (host, port, timeout) = crate::configure_tcp_connection(tcp_connection).await?;
let python_path = if let Some(toolchain) = python_from_toolchain {
@@ -330,6 +388,7 @@ impl PythonDebugAdapter {
let arguments = Self::generate_debugpy_arguments(
&host,
port,
is_attach_with_connect,
user_installed_path.as_deref(),
user_args,
)
@@ -765,29 +824,58 @@ impl DebugAdapter for PythonDebugAdapter {
.await;
}
let base_path = config
.config
.get("cwd")
.and_then(|cwd| {
RelPath::new(
cwd.as_str()
.map(Path::new)?
.strip_prefix(delegate.worktree_root_path())
.ok()?,
PathStyle::local(),
)
.ok()
let base_paths = ["cwd", "program", "module"]
.into_iter()
.filter_map(|key| {
config.config.get(key).and_then(|cwd| {
RelPath::new(
cwd.as_str()
.map(Path::new)?
.strip_prefix(delegate.worktree_root_path())
.ok()?,
PathStyle::local(),
)
.ok()
})
})
.unwrap_or_else(|| RelPath::empty().into());
let toolchain = delegate
.toolchain_store()
.active_toolchain(
delegate.worktree_id(),
base_path.into_arc(),
language::LanguageName::new(Self::LANGUAGE_NAME),
cx,
.chain(
// While Debugpy's wiki says absolute paths are required, it actually supports relative paths when cwd is passed in
// (which should always be the case, because Zed defaults cwd to the worktree root).
// So we want to check that these relative paths find toolchains as well; otherwise they won't be checked,
// because the strip_prefix in the iteration above will return an error.
config
.config
.get("cwd")
.map(|_| {
["program", "module"].into_iter().filter_map(|key| {
config.config.get(key).and_then(|value| {
let path = Path::new(value.as_str()?);
RelPath::new(path, PathStyle::local()).ok()
})
})
})
.into_iter()
.flatten(),
)
.await;
.chain([RelPath::empty().into()]);
let mut toolchain = None;
for base_path in base_paths {
if let Some(found_toolchain) = delegate
.toolchain_store()
.active_toolchain(
delegate.worktree_id(),
base_path.into_arc(),
language::LanguageName::new(Self::LANGUAGE_NAME),
cx,
)
.await
{
toolchain = Some(found_toolchain);
break;
}
}
self.fetch_debugpy_whl(toolchain.clone(), delegate)
.await
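The loop above is a first-match probe over candidate base paths (`cwd`, `program`, `module`, their relative forms, then the worktree root). Abstracted away from the async toolchain store, it is just `find_map` over ordered candidates; a hedged sketch with a hypothetical lookup closure:

```rust
// Sketch: probe candidate base paths in order and keep the first toolchain hit,
// analogous to the for-loop over base_paths above (lookup is hypothetical).
fn first_toolchain(candidates: &[&str], lookup: impl Fn(&str) -> Option<String>) -> Option<String> {
    candidates.iter().find_map(|path| lookup(path))
}

fn main() {
    // Only "src" has an active toolchain in this toy lookup.
    let lookup = |path: &str| (path == "src").then(|| format!("venv-for-{path}"));
    assert_eq!(
        first_toolchain(&["cwd", "src", ""], lookup),
        Some("venv-for-src".to_string())
    );
    assert_eq!(first_toolchain(&["a", "b"], lookup), None);
    println!("ok");
}
```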
@@ -824,7 +912,148 @@ mod tests {
use util::path;
use super::*;
use std::{net::Ipv4Addr, path::PathBuf};
use task::TcpArgumentsTemplate;
#[gpui::test]
async fn test_tcp_connection_conflict_with_connect_args() {
let adapter = PythonDebugAdapter {
base_venv_path: OnceCell::new(),
debugpy_whl_base_path: OnceCell::new(),
};
let config_with_port_conflict = json!({
"request": "attach",
"connect": {
"port": 5679
}
});
let tcp_connection = TcpArgumentsTemplate {
host: None,
port: Some(5678),
timeout: None,
};
let task_def = DebugTaskDefinition {
label: "test".into(),
adapter: PythonDebugAdapter::ADAPTER_NAME.into(),
config: config_with_port_conflict,
tcp_connection: Some(tcp_connection.clone()),
};
let result = adapter
.get_installed_binary(
&test_mocks::MockDelegate::new(),
&task_def,
None,
None,
None,
Some("python3".to_string()),
)
.await;
assert!(result.is_err());
assert!(
result
.unwrap_err()
.to_string()
.contains("Cannot have two different ports")
);
let host = Ipv4Addr::new(127, 0, 0, 1);
let config_with_host_conflict = json!({
"request": "attach",
"connect": {
"host": "192.168.1.1",
"port": 5678
}
});
let tcp_connection_with_host = TcpArgumentsTemplate {
host: Some(host),
port: None,
timeout: None,
};
let task_def_host = DebugTaskDefinition {
label: "test".into(),
adapter: PythonDebugAdapter::ADAPTER_NAME.into(),
config: config_with_host_conflict,
tcp_connection: Some(tcp_connection_with_host),
};
let result_host = adapter
.get_installed_binary(
&test_mocks::MockDelegate::new(),
&task_def_host,
None,
None,
None,
Some("python3".to_string()),
)
.await;
assert!(result_host.is_err());
assert!(
result_host
.unwrap_err()
.to_string()
.contains("Cannot have two different hosts")
);
}
#[gpui::test]
async fn test_attach_with_connect_mode_generates_correct_arguments() {
let host = Ipv4Addr::new(127, 0, 0, 1);
let port = 5678;
let args_without_host = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::AttachWithConnect { host: None },
None,
None,
)
.await
.unwrap();
let expected_suffix = path!("debug_adapters/Debugpy/debugpy/adapter");
assert!(args_without_host[0].ends_with(expected_suffix));
assert_eq!(args_without_host[1], "connect");
assert_eq!(args_without_host[2], "5678");
let args_with_host = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::AttachWithConnect {
host: Some("192.168.1.100"),
},
None,
None,
)
.await
.unwrap();
assert!(args_with_host[0].ends_with(expected_suffix));
assert_eq!(args_with_host[1], "connect");
assert_eq!(args_with_host[2], "192.168.1.100:");
assert_eq!(args_with_host[3], "5678");
let args_normal = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::Normal,
None,
None,
)
.await
.unwrap();
assert!(args_normal[0].ends_with(expected_suffix));
assert_eq!(args_normal[1], "--host=127.0.0.1");
assert_eq!(args_normal[2], "--port=5678");
assert!(!args_normal.contains(&"connect".to_string()));
}
#[gpui::test]
async fn test_debugpy_install_path_cases() {
@@ -833,15 +1062,25 @@ mod tests {
// Case 1: User-defined debugpy path (highest precedence)
let user_path = PathBuf::from("/custom/path/to/debugpy/src/debugpy/adapter");
let user_args =
PythonDebugAdapter::generate_debugpy_arguments(&host, port, Some(&user_path), None)
.await
.unwrap();
let user_args = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::Normal,
Some(&user_path),
None,
)
.await
.unwrap();
// Case 2: Venv-installed debugpy (uses -m debugpy.adapter)
let venv_args = PythonDebugAdapter::generate_debugpy_arguments(&host, port, None, None)
.await
.unwrap();
let venv_args = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::Normal,
None,
None,
)
.await
.unwrap();
assert_eq!(user_args[0], "/custom/path/to/debugpy/src/debugpy/adapter");
assert_eq!(user_args[1], "--host=127.0.0.1");
@@ -856,6 +1095,7 @@ mod tests {
let user_args = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::Normal,
Some(&user_path),
Some(vec!["foo".into()]),
)
@@ -864,6 +1104,7 @@ mod tests {
let venv_args = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::Normal,
None,
Some(vec!["foo".into()]),
)

View File

@@ -34,6 +34,7 @@ theme.workspace = true
ui.workspace = true
util.workspace = true
workspace.workspace = true
itertools.workspace = true
[dev-dependencies]
client = { workspace = true, features = ["test-support"] }

View File

@@ -1,5 +1,5 @@
use crate::{
DIAGNOSTICS_UPDATE_DELAY, IncludeWarnings, ToggleWarnings, context_range_for_entry,
DIAGNOSTICS_UPDATE_DEBOUNCE, IncludeWarnings, ToggleWarnings, context_range_for_entry,
diagnostic_renderer::{DiagnosticBlock, DiagnosticRenderer},
toolbar_controls::DiagnosticsToolbarEditor,
};
@@ -283,7 +283,7 @@ impl BufferDiagnosticsEditor {
self.update_excerpts_task = Some(cx.spawn_in(window, async move |editor, cx| {
cx.background_executor()
.timer(DIAGNOSTICS_UPDATE_DELAY)
.timer(DIAGNOSTICS_UPDATE_DEBOUNCE)
.await;
if let Some(buffer) = buffer {
@@ -938,10 +938,6 @@ impl DiagnosticsToolbarEditor for WeakEntity<BufferDiagnosticsEditor> {
.unwrap_or(false)
}
fn has_stale_excerpts(&self, _cx: &App) -> bool {
false
}
fn is_updating(&self, cx: &App) -> bool {
self.read_with(cx, |buffer_diagnostics_editor, cx| {
buffer_diagnostics_editor.update_excerpts_task.is_some()

View File

@@ -9,7 +9,7 @@ mod diagnostics_tests;
use anyhow::Result;
use buffer_diagnostics::BufferDiagnosticsEditor;
use collections::{BTreeSet, HashMap};
use collections::{BTreeSet, HashMap, HashSet};
use diagnostic_renderer::DiagnosticBlock;
use editor::{
Editor, EditorEvent, ExcerptRange, MultiBuffer, PathKey,
@@ -17,10 +17,11 @@ use editor::{
multibuffer_context_lines,
};
use gpui::{
AnyElement, AnyView, App, AsyncApp, Context, Entity, EventEmitter, FocusHandle, Focusable,
Global, InteractiveElement, IntoElement, ParentElement, Render, SharedString, Styled,
Subscription, Task, WeakEntity, Window, actions, div,
AnyElement, AnyView, App, AsyncApp, Context, Entity, EventEmitter, FocusHandle, FocusOutEvent,
Focusable, Global, InteractiveElement, IntoElement, ParentElement, Render, SharedString,
Styled, Subscription, Task, WeakEntity, Window, actions, div,
};
use itertools::Itertools as _;
use language::{
Bias, Buffer, BufferRow, BufferSnapshot, DiagnosticEntry, DiagnosticEntryRef, Point,
ToTreeSitterPoint,
@@ -32,7 +33,7 @@ use project::{
use settings::Settings;
use std::{
any::{Any, TypeId},
cmp::{self, Ordering},
cmp,
ops::{Range, RangeInclusive},
sync::Arc,
time::Duration,
@@ -89,8 +90,8 @@ pub(crate) struct ProjectDiagnosticsEditor {
impl EventEmitter<EditorEvent> for ProjectDiagnosticsEditor {}
const DIAGNOSTICS_UPDATE_DELAY: Duration = Duration::from_millis(50);
const DIAGNOSTICS_SUMMARY_UPDATE_DELAY: Duration = Duration::from_millis(30);
const DIAGNOSTICS_UPDATE_DEBOUNCE: Duration = Duration::from_millis(50);
const DIAGNOSTICS_SUMMARY_UPDATE_DEBOUNCE: Duration = Duration::from_millis(30);
impl Render for ProjectDiagnosticsEditor {
fn render(&mut self, _: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
@@ -149,6 +150,12 @@ impl Render for ProjectDiagnosticsEditor {
}
}
#[derive(PartialEq, Eq, Copy, Clone, Debug)]
enum RetainExcerpts {
Yes,
No,
}
impl ProjectDiagnosticsEditor {
pub fn register(
workspace: &mut Workspace,
@@ -165,14 +172,21 @@ impl ProjectDiagnosticsEditor {
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let project_event_subscription =
cx.subscribe_in(&project_handle, window, |this, _project, event, window, cx| match event {
let project_event_subscription = cx.subscribe_in(
&project_handle,
window,
|this, _project, event, window, cx| match event {
project::Event::DiskBasedDiagnosticsStarted { .. } => {
cx.notify();
}
project::Event::DiskBasedDiagnosticsFinished { language_server_id } => {
log::debug!("disk based diagnostics finished for server {language_server_id}");
this.update_stale_excerpts(window, cx);
this.close_diagnosticless_buffers(
window,
cx,
this.editor.focus_handle(cx).contains_focused(window, cx)
|| this.focus_handle.contains_focused(window, cx),
);
}
project::Event::DiagnosticsUpdated {
language_server_id,
@@ -181,34 +195,39 @@ impl ProjectDiagnosticsEditor {
this.paths_to_update.extend(paths.clone());
this.diagnostic_summary_update = cx.spawn(async move |this, cx| {
cx.background_executor()
.timer(DIAGNOSTICS_SUMMARY_UPDATE_DELAY)
.timer(DIAGNOSTICS_SUMMARY_UPDATE_DEBOUNCE)
.await;
this.update(cx, |this, cx| {
this.update_diagnostic_summary(cx);
})
.log_err();
});
cx.emit(EditorEvent::TitleChanged);
if this.editor.focus_handle(cx).contains_focused(window, cx) || this.focus_handle.contains_focused(window, cx) {
log::debug!("diagnostics updated for server {language_server_id}, paths {paths:?}. recording change");
} else {
log::debug!("diagnostics updated for server {language_server_id}, paths {paths:?}. updating excerpts");
this.update_stale_excerpts(window, cx);
}
log::debug!(
"diagnostics updated for server {language_server_id}, \
paths {paths:?}. updating excerpts"
);
let focused = this.editor.focus_handle(cx).contains_focused(window, cx)
|| this.focus_handle.contains_focused(window, cx);
this.update_stale_excerpts(
if focused {
RetainExcerpts::Yes
} else {
RetainExcerpts::No
},
window,
cx,
);
}
_ => {}
});
},
);
let focus_handle = cx.focus_handle();
cx.on_focus_in(&focus_handle, window, |this, window, cx| {
this.focus_in(window, cx)
})
.detach();
cx.on_focus_out(&focus_handle, window, |this, _event, window, cx| {
this.focus_out(window, cx)
})
.detach();
cx.on_focus_in(&focus_handle, window, Self::focus_in)
.detach();
cx.on_focus_out(&focus_handle, window, Self::focus_out)
.detach();
let excerpts = cx.new(|cx| MultiBuffer::new(project_handle.read(cx).capability()));
let editor = cx.new(|cx| {
@@ -238,8 +257,11 @@ impl ProjectDiagnosticsEditor {
window.focus(&this.focus_handle);
}
}
EditorEvent::Blurred => this.update_stale_excerpts(window, cx),
EditorEvent::Saved => this.update_stale_excerpts(window, cx),
EditorEvent::Blurred => this.close_diagnosticless_buffers(window, cx, false),
EditorEvent::Saved => this.close_diagnosticless_buffers(window, cx, true),
EditorEvent::SelectionsChanged { .. } => {
this.close_diagnosticless_buffers(window, cx, true)
}
_ => {}
}
},
@@ -283,15 +305,67 @@ impl ProjectDiagnosticsEditor {
this
}
fn update_stale_excerpts(&mut self, window: &mut Window, cx: &mut Context<Self>) {
if self.update_excerpts_task.is_some() || self.multibuffer.read(cx).is_dirty(cx) {
/// Closes all excerpts of buffers that:
/// - have no diagnostics anymore
/// - are saved (not dirty)
/// - and, if `retain_selections` is true, do not have selections within them
fn close_diagnosticless_buffers(
&mut self,
_window: &mut Window,
cx: &mut Context<Self>,
retain_selections: bool,
) {
let buffer_ids = self.multibuffer.read(cx).all_buffer_ids();
let selected_buffers = self.editor.update(cx, |editor, cx| {
editor
.selections
.all_anchors(cx)
.iter()
.filter_map(|anchor| anchor.start.buffer_id)
.collect::<HashSet<_>>()
});
for buffer_id in buffer_ids {
if retain_selections && selected_buffers.contains(&buffer_id) {
continue;
}
let has_blocks = self
.blocks
.get(&buffer_id)
.is_none_or(|blocks| blocks.is_empty());
if !has_blocks {
continue;
}
let is_dirty = self
.multibuffer
.read(cx)
.buffer(buffer_id)
.is_some_and(|buffer| buffer.read(cx).is_dirty());
if !is_dirty {
continue;
}
self.multibuffer.update(cx, |b, cx| {
b.remove_excerpts_for_buffer(buffer_id, cx);
});
}
}
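The documented criteria above can be sketched independently of the editor types: a buffer's excerpts are closed when it has no diagnostic blocks left, is saved, and (when selections are being retained) contains no selection. All names below are hypothetical stand-ins for the state the diagnostics editor tracks:

```rust
use std::collections::{HashMap, HashSet};

// Sketch of the close_diagnosticless_buffers criteria, over plain data.
fn buffers_to_close(
    buffer_ids: &[u64],
    blocks: &HashMap<u64, Vec<&'static str>>, // diagnostic blocks per buffer
    dirty: &HashSet<u64>,                     // buffers with unsaved edits
    selected: &HashSet<u64>,                  // buffers containing a selection
    retain_selections: bool,
) -> Vec<u64> {
    buffer_ids
        .iter()
        .copied()
        // keep buffers with selections when asked to
        .filter(|id| !(retain_selections && selected.contains(id)))
        // keep buffers that still have diagnostic blocks
        .filter(|id| blocks.get(id).map_or(true, |b| b.is_empty()))
        // keep buffers with unsaved edits
        .filter(|id| !dirty.contains(id))
        .collect()
}
```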
fn update_stale_excerpts(
&mut self,
mut retain_excerpts: RetainExcerpts,
window: &mut Window,
cx: &mut Context<Self>,
) {
if self.update_excerpts_task.is_some() {
return;
}
if self.multibuffer.read(cx).is_dirty(cx) {
retain_excerpts = RetainExcerpts::Yes;
}
let project_handle = self.project.clone();
self.update_excerpts_task = Some(cx.spawn_in(window, async move |this, cx| {
cx.background_executor()
.timer(DIAGNOSTICS_UPDATE_DELAY)
.timer(DIAGNOSTICS_UPDATE_DEBOUNCE)
.await;
loop {
let Some(path) = this.update(cx, |this, cx| {
@@ -312,7 +386,7 @@ impl ProjectDiagnosticsEditor {
.log_err()
{
this.update_in(cx, |this, window, cx| {
this.update_excerpts(buffer, window, cx)
this.update_excerpts(buffer, retain_excerpts, window, cx)
})?
.await?;
}
@@ -378,10 +452,10 @@ impl ProjectDiagnosticsEditor {
}
}
fn focus_out(&mut self, window: &mut Window, cx: &mut Context<Self>) {
fn focus_out(&mut self, _: FocusOutEvent, window: &mut Window, cx: &mut Context<Self>) {
if !self.focus_handle.is_focused(window) && !self.editor.focus_handle(cx).is_focused(window)
{
self.update_stale_excerpts(window, cx);
self.close_diagnosticless_buffers(window, cx, false);
}
}
@@ -403,12 +477,13 @@ impl ProjectDiagnosticsEditor {
});
}
}
multibuffer.clear(cx);
});
self.paths_to_update = project_paths;
});
self.update_stale_excerpts(window, cx);
self.update_stale_excerpts(RetainExcerpts::No, window, cx);
}
fn diagnostics_are_unchanged(
@@ -431,6 +506,7 @@ impl ProjectDiagnosticsEditor {
fn update_excerpts(
&mut self,
buffer: Entity<Buffer>,
retain_excerpts: RetainExcerpts,
window: &mut Window,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
@@ -497,24 +573,27 @@ impl ProjectDiagnosticsEditor {
)
})?;
for item in more {
let i = blocks
.binary_search_by(|probe| {
probe
.initial_range
.start
.cmp(&item.initial_range.start)
.then(probe.initial_range.end.cmp(&item.initial_range.end))
.then(Ordering::Greater)
})
.unwrap_or_else(|i| i);
blocks.insert(i, item);
}
blocks.extend(more);
}
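The deleted insertion loop relied on a common trick: by ending the comparator with `.then(Ordering::Greater)`, it never returns `Equal`, so `binary_search_by` always returns `Err(i)`, and `i` is the index of the first element with an equal key — the new item lands before existing equal keys. A standalone sketch on hypothetical `(key, tag)` pairs:

```rust
use std::cmp::Ordering;

// Ties yield `Greater`, so the search always returns Err(i) with i at the
// lower bound: the new item is inserted before any existing equal keys,
// and `v` stays sorted by key.
fn insert_sorted(v: &mut Vec<(u32, &'static str)>, item: (u32, &'static str)) {
    let i = v
        .binary_search_by(|probe| probe.0.cmp(&item.0).then(Ordering::Greater))
        .unwrap_or_else(|i| i);
    v.insert(i, item);
}
```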
let mut excerpt_ranges: Vec<ExcerptRange<Point>> = Vec::new();
let mut excerpt_ranges: Vec<ExcerptRange<Point>> = match retain_excerpts {
RetainExcerpts::No => Vec::new(),
RetainExcerpts::Yes => this.update(cx, |this, cx| {
this.multibuffer.update(cx, |multi_buffer, cx| {
multi_buffer
.excerpts_for_buffer(buffer_id, cx)
.into_iter()
.map(|(_, range)| ExcerptRange {
context: range.context.to_point(&buffer_snapshot),
primary: range.primary.to_point(&buffer_snapshot),
})
.collect()
})
})?,
};
let mut result_blocks = vec![None; excerpt_ranges.len()];
let context_lines = cx.update(|_, cx| multibuffer_context_lines(cx))?;
for b in blocks.iter() {
for b in blocks {
let excerpt_range = context_range_for_entry(
b.initial_range.clone(),
context_lines,
@@ -541,7 +620,8 @@ impl ProjectDiagnosticsEditor {
context: excerpt_range,
primary: b.initial_range.clone(),
},
)
);
result_blocks.insert(i, Some(b));
}
this.update_in(cx, |this, window, cx| {
@@ -562,7 +642,7 @@ impl ProjectDiagnosticsEditor {
)
});
#[cfg(test)]
let cloned_blocks = blocks.clone();
let cloned_blocks = result_blocks.clone();
if was_empty && let Some(anchor_range) = anchor_ranges.first() {
let range_to_select = anchor_range.start..anchor_range.start;
@@ -576,22 +656,20 @@ impl ProjectDiagnosticsEditor {
}
}
let editor_blocks =
anchor_ranges
.into_iter()
.zip(blocks.into_iter())
.map(|(anchor, block)| {
let editor = this.editor.downgrade();
BlockProperties {
placement: BlockPlacement::Near(anchor.start),
height: Some(1),
style: BlockStyle::Flex,
render: Arc::new(move |bcx| {
block.render_block(editor.clone(), bcx)
}),
priority: 1,
}
});
let editor_blocks = anchor_ranges
.into_iter()
.zip_eq(result_blocks.into_iter())
.filter_map(|(anchor, block)| {
let block = block?;
let editor = this.editor.downgrade();
Some(BlockProperties {
placement: BlockPlacement::Near(anchor.start),
height: Some(1),
style: BlockStyle::Flex,
render: Arc::new(move |bcx| block.render_block(editor.clone(), bcx)),
priority: 1,
})
});
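Switching from `zip` to itertools' `zip_eq` makes a length mismatch between anchors and result blocks a loud panic instead of a silent truncation. A dependency-free sketch of the same idea (hypothetical helper; `zip_eq` itself panics rather than returning a `Result`):

```rust
// Pair items one-to-one, surfacing a length mismatch that a plain `zip`
// would silently swallow by dropping the surplus items.
fn zip_exact<A, B>(a: Vec<A>, b: Vec<B>) -> Result<Vec<(A, B)>, &'static str> {
    if a.len() != b.len() {
        return Err("length mismatch");
    }
    Ok(a.into_iter().zip(b).collect())
}
```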
let block_ids = this.editor.update(cx, |editor, cx| {
editor.display_map.update(cx, |display_map, cx| {
@@ -601,7 +679,9 @@ impl ProjectDiagnosticsEditor {
#[cfg(test)]
{
for (block_id, block) in block_ids.iter().zip(cloned_blocks.iter()) {
for (block_id, block) in
block_ids.iter().zip(cloned_blocks.into_iter().flatten())
{
let markdown = block.markdown.clone();
editor::test::set_block_content_for_tests(
&this.editor,
@@ -626,6 +706,7 @@ impl ProjectDiagnosticsEditor {
fn update_diagnostic_summary(&mut self, cx: &mut Context<Self>) {
self.summary = self.project.read(cx).diagnostic_summary(false, cx);
cx.emit(EditorEvent::TitleChanged);
}
}
@@ -843,13 +924,6 @@ impl DiagnosticsToolbarEditor for WeakEntity<ProjectDiagnosticsEditor> {
.unwrap_or(false)
}
fn has_stale_excerpts(&self, cx: &App) -> bool {
self.read_with(cx, |project_diagnostics_editor, _cx| {
!project_diagnostics_editor.paths_to_update.is_empty()
})
.unwrap_or(false)
}
fn is_updating(&self, cx: &App) -> bool {
self.read_with(cx, |project_diagnostics_editor, cx| {
project_diagnostics_editor.update_excerpts_task.is_some()
@@ -1010,12 +1084,6 @@ async fn heuristic_syntactic_expand(
return;
}
}
log::info!(
"Expanding to ancestor started on {} node\
exceeding row limit of {max_row_count}.",
node.grammar_name()
);
*ancestor_range = Some(None);
}
})


@@ -119,7 +119,7 @@ async fn test_diagnostics(cx: &mut TestAppContext) {
let editor = diagnostics.update(cx, |diagnostics, _| diagnostics.editor.clone());
diagnostics
.next_notification(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10), cx)
.next_notification(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10), cx)
.await;
pretty_assertions::assert_eq!(
@@ -190,7 +190,7 @@ async fn test_diagnostics(cx: &mut TestAppContext) {
});
diagnostics
.next_notification(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10), cx)
.next_notification(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10), cx)
.await;
pretty_assertions::assert_eq!(
@@ -277,7 +277,7 @@ async fn test_diagnostics(cx: &mut TestAppContext) {
});
diagnostics
.next_notification(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10), cx)
.next_notification(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10), cx)
.await;
pretty_assertions::assert_eq!(
@@ -391,7 +391,7 @@ async fn test_diagnostics_with_folds(cx: &mut TestAppContext) {
// Only the first language server's diagnostics are shown.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.executor().run_until_parked();
editor.update_in(cx, |editor, window, cx| {
editor.fold_ranges(vec![Point::new(0, 0)..Point::new(3, 0)], false, window, cx);
@@ -490,7 +490,7 @@ async fn test_diagnostics_multiple_servers(cx: &mut TestAppContext) {
// Only the first language server's diagnostics are shown.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.executor().run_until_parked();
pretty_assertions::assert_eq!(
@@ -530,7 +530,7 @@ async fn test_diagnostics_multiple_servers(cx: &mut TestAppContext) {
// Both language server's diagnostics are shown.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.executor().run_until_parked();
pretty_assertions::assert_eq!(
@@ -587,7 +587,7 @@ async fn test_diagnostics_multiple_servers(cx: &mut TestAppContext) {
// Only the first language server's diagnostics are updated.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.executor().run_until_parked();
pretty_assertions::assert_eq!(
@@ -629,7 +629,7 @@ async fn test_diagnostics_multiple_servers(cx: &mut TestAppContext) {
// Both language servers' diagnostics are updated.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.executor().run_until_parked();
pretty_assertions::assert_eq!(
@@ -760,7 +760,7 @@ async fn test_random_diagnostics_blocks(cx: &mut TestAppContext, mut rng: StdRng
.unwrap()
});
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.run_until_parked();
}
@@ -769,7 +769,7 @@ async fn test_random_diagnostics_blocks(cx: &mut TestAppContext, mut rng: StdRng
log::info!("updating mutated diagnostics view");
mutated_diagnostics.update_in(cx, |diagnostics, window, cx| {
diagnostics.update_stale_excerpts(window, cx)
diagnostics.update_stale_excerpts(RetainExcerpts::No, window, cx)
});
log::info!("constructing reference diagnostics view");
@@ -777,7 +777,7 @@ async fn test_random_diagnostics_blocks(cx: &mut TestAppContext, mut rng: StdRng
ProjectDiagnosticsEditor::new(true, project.clone(), workspace.downgrade(), window, cx)
});
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.run_until_parked();
let mutated_excerpts =
@@ -789,7 +789,12 @@ async fn test_random_diagnostics_blocks(cx: &mut TestAppContext, mut rng: StdRng
// The mutated view may contain more excerpts than the reference view,
// as we don't currently shrink excerpts when diagnostics are removed.
let mut ref_iter = reference_excerpts.lines().filter(|line| *line != "§ -----");
let mut ref_iter = reference_excerpts.lines().filter(|line| {
// ignore § ----- and § <file>.rs lines
!line.starts_with('§')
|| line.starts_with("§ diagnostic")
|| line.starts_with("§ related info")
});
let mut next_ref_line = ref_iter.next();
let mut skipped_block = false;
@@ -797,7 +802,12 @@ async fn test_random_diagnostics_blocks(cx: &mut TestAppContext, mut rng: StdRng
if let Some(ref_line) = next_ref_line {
if mut_line == ref_line {
next_ref_line = ref_iter.next();
} else if mut_line.contains('§') && mut_line != "§ -----" {
} else if mut_line.contains('§')
// ignore § ----- and § <file>.rs lines
&& (!mut_line.starts_with('§')
|| mut_line.starts_with("§ diagnostic")
|| mut_line.starts_with("§ related info"))
{
skipped_block = true;
}
}
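The filter used in both places above keeps ordinary content lines plus the `§ diagnostic` and `§ related info` block headers, while dropping `§ -----` separators and `§ <file>.rs` headers. The predicate in isolation:

```rust
// Keep content lines and diagnostic/related-info headers; drop other
// "§ ..." lines (excerpt separators and file headers).
fn keep_line(line: &str) -> bool {
    !line.starts_with('§')
        || line.starts_with("§ diagnostic")
        || line.starts_with("§ related info")
}
```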
@@ -949,7 +959,7 @@ async fn test_random_diagnostics_with_inlays(cx: &mut TestAppContext, mut rng: S
.unwrap()
});
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.run_until_parked();
}
@@ -958,11 +968,11 @@ async fn test_random_diagnostics_with_inlays(cx: &mut TestAppContext, mut rng: S
log::info!("updating mutated diagnostics view");
mutated_diagnostics.update_in(cx, |diagnostics, window, cx| {
diagnostics.update_stale_excerpts(window, cx)
diagnostics.update_stale_excerpts(RetainExcerpts::No, window, cx)
});
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.run_until_parked();
}
@@ -1427,7 +1437,7 @@ async fn test_diagnostics_with_code(cx: &mut TestAppContext) {
let editor = diagnostics.update(cx, |diagnostics, _| diagnostics.editor.clone());
diagnostics
.next_notification(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10), cx)
.next_notification(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10), cx)
.await;
// Verify that the diagnostic codes are displayed correctly
@@ -1704,7 +1714,7 @@ async fn test_buffer_diagnostics(cx: &mut TestAppContext) {
// wait a little bit to ensure that the buffer diagnostic's editor content
// is rendered.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
pretty_assertions::assert_eq!(
editor_content_with_blocks(&editor, cx),
@@ -1837,7 +1847,7 @@ async fn test_buffer_diagnostics_without_warnings(cx: &mut TestAppContext) {
// wait a little bit to ensure that the buffer diagnostic's editor content
// is rendered.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
pretty_assertions::assert_eq!(
editor_content_with_blocks(&editor, cx),
@@ -1971,7 +1981,7 @@ async fn test_buffer_diagnostics_multiple_servers(cx: &mut TestAppContext) {
// wait a little bit to ensure that the buffer diagnostic's editor content
// is rendered.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
pretty_assertions::assert_eq!(
editor_content_with_blocks(&editor, cx),


@@ -16,9 +16,6 @@ pub(crate) trait DiagnosticsToolbarEditor: Send + Sync {
/// Toggles whether warning diagnostics should be displayed by the
/// diagnostics editor.
fn toggle_warnings(&self, window: &mut Window, cx: &mut App);
/// Indicates whether any of the excerpts displayed by the diagnostics
/// editor are stale.
fn has_stale_excerpts(&self, cx: &App) -> bool;
/// Indicates whether the diagnostics editor is currently updating the
/// diagnostics.
fn is_updating(&self, cx: &App) -> bool;
@@ -37,14 +34,12 @@ pub(crate) trait DiagnosticsToolbarEditor: Send + Sync {
impl Render for ToolbarControls {
fn render(&mut self, _: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let mut has_stale_excerpts = false;
let mut include_warnings = false;
let mut is_updating = false;
match &self.editor {
Some(editor) => {
include_warnings = editor.include_warnings(cx);
has_stale_excerpts = editor.has_stale_excerpts(cx);
is_updating = editor.is_updating(cx);
}
None => {}
@@ -86,7 +81,6 @@ impl Render for ToolbarControls {
IconButton::new("refresh-diagnostics", IconName::ArrowCircle)
.icon_color(Color::Info)
.shape(IconButtonShape::Square)
.disabled(!has_stale_excerpts)
.tooltip(Tooltip::for_action_title(
"Refresh diagnostics",
&ToggleDiagnosticsRefresh,


@@ -1832,9 +1832,15 @@ impl Editor {
project::Event::RefreshCodeLens => {
// code lenses are always queried together with their actions and never stored, so each request refreshes them
}
project::Event::RefreshInlayHints(server_id) => {
project::Event::RefreshInlayHints {
server_id,
request_id,
} => {
editor.refresh_inlay_hints(
InlayHintRefreshReason::RefreshRequested(*server_id),
InlayHintRefreshReason::RefreshRequested {
server_id: *server_id,
request_id: *request_id,
},
cx,
);
}


@@ -159,6 +159,7 @@ pub struct SearchSettings {
pub case_sensitive: bool,
pub include_ignored: bool,
pub regex: bool,
pub center_on_match: bool,
}
impl EditorSettings {
@@ -249,6 +250,7 @@ impl Settings for EditorSettings {
case_sensitive: search.case_sensitive.unwrap(),
include_ignored: search.include_ignored.unwrap(),
regex: search.regex.unwrap(),
center_on_match: search.center_on_match.unwrap(),
},
auto_signature_help: editor.auto_signature_help.unwrap(),
show_signature_help_after_edits: editor.show_signature_help_after_edits.unwrap(),


@@ -1,5 +1,4 @@
use std::{
collections::hash_map,
ops::{ControlFlow, Range},
time::Duration,
};
@@ -49,8 +48,8 @@ pub struct LspInlayHintData {
allowed_hint_kinds: HashSet<Option<InlayHintKind>>,
invalidate_debounce: Option<Duration>,
append_debounce: Option<Duration>,
hint_refresh_tasks: HashMap<BufferId, HashMap<Vec<Range<BufferRow>>, Vec<Task<()>>>>,
hint_chunk_fetched: HashMap<BufferId, (Global, HashSet<Range<BufferRow>>)>,
hint_refresh_tasks: HashMap<BufferId, Vec<Task<()>>>,
hint_chunk_fetching: HashMap<BufferId, (Global, HashSet<Range<BufferRow>>)>,
invalidate_hints_for_buffers: HashSet<BufferId>,
pub added_hints: HashMap<InlayId, Option<InlayHintKind>>,
}
@@ -63,7 +62,7 @@ impl LspInlayHintData {
enabled_in_settings: settings.enabled,
hint_refresh_tasks: HashMap::default(),
added_hints: HashMap::default(),
hint_chunk_fetched: HashMap::default(),
hint_chunk_fetching: HashMap::default(),
invalidate_hints_for_buffers: HashSet::default(),
invalidate_debounce: debounce_value(settings.edit_debounce_ms),
append_debounce: debounce_value(settings.scroll_debounce_ms),
@@ -99,9 +98,8 @@ impl LspInlayHintData {
pub fn clear(&mut self) {
self.hint_refresh_tasks.clear();
self.hint_chunk_fetched.clear();
self.hint_chunk_fetching.clear();
self.added_hints.clear();
self.invalidate_hints_for_buffers.clear();
}
/// Checks inlay hint settings for enabled hint kinds and general enabled state.
@@ -199,7 +197,7 @@ impl LspInlayHintData {
) {
for buffer_id in removed_buffer_ids {
self.hint_refresh_tasks.remove(buffer_id);
self.hint_chunk_fetched.remove(buffer_id);
self.hint_chunk_fetching.remove(buffer_id);
}
}
}
@@ -211,7 +209,10 @@ pub enum InlayHintRefreshReason {
SettingsChange(InlayHintSettings),
NewLinesShown,
BufferEdited(BufferId),
RefreshRequested(LanguageServerId),
RefreshRequested {
server_id: LanguageServerId,
request_id: Option<usize>,
},
ExcerptsRemoved(Vec<ExcerptId>),
}
@@ -296,7 +297,7 @@ impl Editor {
| InlayHintRefreshReason::Toggle(_)
| InlayHintRefreshReason::SettingsChange(_) => true,
InlayHintRefreshReason::NewLinesShown
| InlayHintRefreshReason::RefreshRequested(_)
| InlayHintRefreshReason::RefreshRequested { .. }
| InlayHintRefreshReason::ExcerptsRemoved(_) => false,
InlayHintRefreshReason::BufferEdited(buffer_id) => {
let Some(affected_language) = self
@@ -370,48 +371,45 @@ impl Editor {
let Some(buffer) = multi_buffer.read(cx).buffer(buffer_id) else {
continue;
};
let fetched_tasks = inlay_hints.hint_chunk_fetched.entry(buffer_id).or_default();
let (fetched_for_version, fetched_chunks) = inlay_hints
.hint_chunk_fetching
.entry(buffer_id)
.or_default();
if visible_excerpts
.buffer_version
.changed_since(&fetched_tasks.0)
.changed_since(fetched_for_version)
{
fetched_tasks.1.clear();
fetched_tasks.0 = visible_excerpts.buffer_version.clone();
*fetched_for_version = visible_excerpts.buffer_version.clone();
fetched_chunks.clear();
inlay_hints.hint_refresh_tasks.remove(&buffer_id);
}
let applicable_chunks =
semantics_provider.applicable_inlay_chunks(&buffer, &visible_excerpts.ranges, cx);
let known_chunks = if ignore_previous_fetches {
None
} else {
Some((fetched_for_version.clone(), fetched_chunks.clone()))
};
match inlay_hints
let mut applicable_chunks =
semantics_provider.applicable_inlay_chunks(&buffer, &visible_excerpts.ranges, cx);
applicable_chunks.retain(|chunk| fetched_chunks.insert(chunk.clone()));
if applicable_chunks.is_empty() && !ignore_previous_fetches {
continue;
}
inlay_hints
.hint_refresh_tasks
.entry(buffer_id)
.or_default()
.entry(applicable_chunks)
{
hash_map::Entry::Occupied(mut o) => {
if invalidate_cache.should_invalidate() || ignore_previous_fetches {
o.get_mut().push(spawn_editor_hints_refresh(
buffer_id,
invalidate_cache,
ignore_previous_fetches,
debounce,
visible_excerpts,
cx,
));
}
}
hash_map::Entry::Vacant(v) => {
v.insert(Vec::new()).push(spawn_editor_hints_refresh(
buffer_id,
invalidate_cache,
ignore_previous_fetches,
debounce,
visible_excerpts,
cx,
));
}
}
.push(spawn_editor_hints_refresh(
buffer_id,
invalidate_cache,
debounce,
visible_excerpts,
known_chunks,
applicable_chunks,
cx,
));
}
}
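The rewritten hunk above collapses the previous per-chunk task map into one compact idiom: `retain(|chunk| fetched_chunks.insert(chunk.clone()))` keeps only the chunks not already fetched while recording them, in one pass, because `HashSet::insert` returns `false` for values already present. The idiom in isolation, with hypothetical row-range chunks:

```rust
use std::collections::HashSet;
use std::ops::Range;

// Keep only chunks not yet fetched, marking them as fetched in the same
// pass; `HashSet::insert` returns true only for values not already present.
fn take_unfetched(chunks: &mut Vec<Range<u32>>, fetched: &mut HashSet<Range<u32>>) {
    chunks.retain(|chunk| fetched.insert(chunk.clone()));
}
```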
@@ -506,9 +504,13 @@ impl Editor {
}
InlayHintRefreshReason::NewLinesShown => InvalidationStrategy::None,
InlayHintRefreshReason::BufferEdited(_) => InvalidationStrategy::BufferEdited,
InlayHintRefreshReason::RefreshRequested(server_id) => {
InvalidationStrategy::RefreshRequested(*server_id)
}
InlayHintRefreshReason::RefreshRequested {
server_id,
request_id,
} => InvalidationStrategy::RefreshRequested {
server_id: *server_id,
request_id: *request_id,
},
};
match &mut self.inlay_hints {
@@ -718,44 +720,29 @@ impl Editor {
fn inlay_hints_for_buffer(
&mut self,
invalidate_cache: InvalidationStrategy,
ignore_previous_fetches: bool,
buffer_excerpts: VisibleExcerpts,
known_chunks: Option<(Global, HashSet<Range<BufferRow>>)>,
cx: &mut Context<Self>,
) -> Option<Vec<Task<(Range<BufferRow>, anyhow::Result<CacheInlayHints>)>>> {
let semantics_provider = self.semantics_provider()?;
let inlay_hints = self.inlay_hints.as_mut()?;
let buffer_id = buffer_excerpts.buffer.read(cx).remote_id();
let new_hint_tasks = semantics_provider
.inlay_hints(
invalidate_cache,
buffer_excerpts.buffer,
buffer_excerpts.ranges,
inlay_hints
.hint_chunk_fetched
.get(&buffer_id)
.filter(|_| !ignore_previous_fetches && !invalidate_cache.should_invalidate())
.cloned(),
known_chunks,
cx,
)
.unwrap_or_default();
let (known_version, known_chunks) =
inlay_hints.hint_chunk_fetched.entry(buffer_id).or_default();
if buffer_excerpts.buffer_version.changed_since(known_version) {
known_chunks.clear();
*known_version = buffer_excerpts.buffer_version;
}
let mut hint_tasks = Vec::new();
let mut hint_tasks = None;
for (row_range, new_hints_task) in new_hint_tasks {
let inserted = known_chunks.insert(row_range.clone());
if inserted || ignore_previous_fetches || invalidate_cache.should_invalidate() {
hint_tasks.push(cx.spawn(async move |_, _| (row_range, new_hints_task.await)));
}
hint_tasks
.get_or_insert_with(Vec::new)
.push(cx.spawn(async move |_, _| (row_range, new_hints_task.await)));
}
Some(hint_tasks)
hint_tasks
}
fn apply_fetched_hints(
@@ -793,20 +780,28 @@ impl Editor {
let excerpts = self.buffer.read(cx).excerpt_ids();
let hints_to_insert = new_hints
.into_iter()
.filter_map(|(chunk_range, hints_result)| match hints_result {
Ok(new_hints) => Some(new_hints),
Err(e) => {
log::error!(
"Failed to query inlays for buffer row range {chunk_range:?}, {e:#}"
);
if let Some((for_version, chunks_fetched)) =
inlay_hints.hint_chunk_fetched.get_mut(&buffer_id)
{
if for_version == &query_version {
chunks_fetched.remove(&chunk_range);
.filter_map(|(chunk_range, hints_result)| {
let chunks_fetched = inlay_hints.hint_chunk_fetching.get_mut(&buffer_id);
match hints_result {
Ok(new_hints) => {
if new_hints.is_empty() {
if let Some((_, chunks_fetched)) = chunks_fetched {
chunks_fetched.remove(&chunk_range);
}
}
Some(new_hints)
}
Err(e) => {
log::error!(
"Failed to query inlays for buffer row range {chunk_range:?}, {e:#}"
);
if let Some((for_version, chunks_fetched)) = chunks_fetched {
if for_version == &query_version {
chunks_fetched.remove(&chunk_range);
}
}
None
}
None
}
})
.flat_map(|hints| hints.into_values())
@@ -856,9 +851,10 @@ struct VisibleExcerpts {
fn spawn_editor_hints_refresh(
buffer_id: BufferId,
invalidate_cache: InvalidationStrategy,
ignore_previous_fetches: bool,
debounce: Option<Duration>,
buffer_excerpts: VisibleExcerpts,
known_chunks: Option<(Global, HashSet<Range<BufferRow>>)>,
applicable_chunks: Vec<Range<BufferRow>>,
cx: &mut Context<'_, Editor>,
) -> Task<()> {
cx.spawn(async move |editor, cx| {
@@ -869,12 +865,7 @@ fn spawn_editor_hints_refresh(
let query_version = buffer_excerpts.buffer_version.clone();
let Some(hint_tasks) = editor
.update(cx, |editor, cx| {
editor.inlay_hints_for_buffer(
invalidate_cache,
ignore_previous_fetches,
buffer_excerpts,
cx,
)
editor.inlay_hints_for_buffer(invalidate_cache, buffer_excerpts, known_chunks, cx)
})
.ok()
else {
@@ -882,6 +873,19 @@ fn spawn_editor_hints_refresh(
};
let hint_tasks = hint_tasks.unwrap_or_default();
if hint_tasks.is_empty() {
editor
.update(cx, |editor, _| {
if let Some((_, hint_chunk_fetching)) = editor
.inlay_hints
.as_mut()
.and_then(|inlay_hints| inlay_hints.hint_chunk_fetching.get_mut(&buffer_id))
{
for applicable_chunks in &applicable_chunks {
hint_chunk_fetching.remove(applicable_chunks);
}
}
})
.ok();
return;
}
let new_hints = join_all(hint_tasks).await;
@@ -1102,7 +1106,10 @@ pub mod tests {
editor
.update(cx, |editor, _window, cx| {
editor.refresh_inlay_hints(
InlayHintRefreshReason::RefreshRequested(fake_server.server.server_id()),
InlayHintRefreshReason::RefreshRequested {
server_id: fake_server.server.server_id(),
request_id: Some(1),
},
cx,
);
})
@@ -1958,15 +1965,8 @@ pub mod tests {
async fn test_large_buffer_inlay_requests_split(cx: &mut gpui::TestAppContext) {
init_test(cx, |settings| {
settings.defaults.inlay_hints = Some(InlayHintSettingsContent {
show_value_hints: Some(true),
enabled: Some(true),
edit_debounce_ms: Some(0),
scroll_debounce_ms: Some(0),
show_type_hints: Some(true),
show_parameter_hints: Some(true),
show_other_hints: Some(true),
show_background: Some(false),
toggle_on_modifiers_press: None,
..InlayHintSettingsContent::default()
})
});
@@ -2044,6 +2044,7 @@ pub mod tests {
cx.add_window(|window, cx| Editor::for_buffer(buffer, Some(project), window, cx));
cx.executor().run_until_parked();
let _fake_server = fake_servers.next().await.unwrap();
cx.executor().advance_clock(Duration::from_millis(100));
cx.executor().run_until_parked();
let ranges = lsp_request_ranges
@@ -2129,6 +2130,7 @@ pub mod tests {
);
})
.unwrap();
cx.executor().advance_clock(Duration::from_millis(100));
cx.executor().run_until_parked();
editor.update(cx, |_, _, _| {
let ranges = lsp_request_ranges
@@ -2145,6 +2147,7 @@ pub mod tests {
editor.handle_input("++++more text++++", window, cx);
})
.unwrap();
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
editor.update(cx, |editor, _window, cx| {
let mut ranges = lsp_request_ranges.lock().drain(..).collect::<Vec<_>>();
@@ -3887,7 +3890,10 @@ let c = 3;"#
editor
.update(cx, |editor, _, cx| {
editor.refresh_inlay_hints(
InlayHintRefreshReason::RefreshRequested(fake_server.server.server_id()),
InlayHintRefreshReason::RefreshRequested {
server_id: fake_server.server.server_id(),
request_id: Some(1),
},
cx,
);
})
@@ -4022,7 +4028,7 @@ let c = 3;"#
let mut all_fetched_hints = Vec::new();
for buffer in editor.buffer.read(cx).all_buffers() {
lsp_store.update(cx, |lsp_store, cx| {
let hints = &lsp_store.latest_lsp_data(&buffer, cx).inlay_hints();
let hints = lsp_store.latest_lsp_data(&buffer, cx).inlay_hints();
all_cached_labels.extend(hints.all_cached_hints().into_iter().map(|hint| {
let mut label = hint.text().to_string();
if hint.padding_left {


@@ -1593,7 +1593,12 @@ impl SearchableItem for Editor {
) {
self.unfold_ranges(&[matches[index].clone()], false, true, cx);
let range = self.range_for_match(&matches[index], collapse);
self.change_selections(Default::default(), window, cx, |s| {
let autoscroll = if EditorSettings::get_global(cx).search.center_on_match {
Autoscroll::center()
} else {
Autoscroll::fit()
};
self.change_selections(SelectionEffects::scroll(autoscroll), window, cx, |s| {
s.select_ranges([range]);
})
}


@@ -60,8 +60,10 @@ async fn lsp_task_context(
buffer: &Entity<Buffer>,
cx: &mut AsyncApp,
) -> Option<TaskContext> {
let worktree_store = project
.read_with(cx, |project, _| project.worktree_store())
let (worktree_store, environment) = project
.read_with(cx, |project, _| {
(project.worktree_store(), project.environment().clone())
})
.ok()?;
let worktree_abs_path = cx
@@ -74,9 +76,9 @@ async fn lsp_task_context(
})
.ok()?;
let project_env = project
.update(cx, |project, cx| {
project.buffer_environment(buffer, &worktree_store, cx)
let project_env = environment
.update(cx, |environment, cx| {
environment.buffer_environment(buffer, &worktree_store, cx)
})
.ok()?
.await;


@@ -145,6 +145,10 @@ fn extension_provides(manifest: &ExtensionManifest) -> BTreeSet<ExtensionProvide
provides.insert(ExtensionProvides::ContextServers);
}
if !manifest.agent_servers.is_empty() {
provides.insert(ExtensionProvides::AgentServers);
}
if manifest.snippets.is_some() {
provides.insert(ExtensionProvides::Snippets);
}


@@ -225,6 +225,9 @@ impl ExtensionFilter {
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone, Copy)]
enum Feature {
AgentClaude,
AgentCodex,
AgentGemini,
ExtensionRuff,
ExtensionTailwind,
Git,
@@ -244,6 +247,9 @@ fn keywords_by_feature() -> &'static BTreeMap<Feature, Vec<&'static str>> {
static KEYWORDS_BY_FEATURE: OnceLock<BTreeMap<Feature, Vec<&'static str>>> = OnceLock::new();
KEYWORDS_BY_FEATURE.get_or_init(|| {
BTreeMap::from_iter([
(Feature::AgentClaude, vec!["claude", "claude code"]),
(Feature::AgentCodex, vec!["codex", "codex cli"]),
(Feature::AgentGemini, vec!["gemini", "gemini cli"]),
(Feature::ExtensionRuff, vec!["ruff"]),
(Feature::ExtensionTailwind, vec!["tail", "tailwind"]),
(Feature::Git, vec!["git"]),
@@ -799,25 +805,22 @@ impl ExtensionsPage {
)
.child(
h_flex()
.gap_2()
.gap_1()
.justify_between()
.child(
h_flex()
.gap_1()
.child(
Icon::new(IconName::Person)
.size(IconSize::XSmall)
.color(Color::Muted),
)
.child(
Label::new(extension.manifest.authors.join(", "))
.size(LabelSize::Small)
.color(Color::Muted)
.truncate(),
),
Icon::new(IconName::Person)
.size(IconSize::XSmall)
.color(Color::Muted),
)
.child(
Label::new(extension.manifest.authors.join(", "))
.size(LabelSize::Small)
.color(Color::Muted)
.truncate(),
)
.child(
h_flex()
.ml_auto()
.gap_1()
.child(
IconButton::new(
@@ -1422,6 +1425,24 @@ impl ExtensionsPage {
for feature in &self.upsells {
let banner = match feature {
Feature::AgentClaude => self.render_feature_upsell_banner(
"Claude Code support is built-in to Zed!".into(),
"https://zed.dev/docs/ai/external-agents#claude-code".into(),
false,
cx,
),
Feature::AgentCodex => self.render_feature_upsell_banner(
"Codex CLI support is built-in to Zed!".into(),
"https://zed.dev/docs/ai/external-agents#codex-cli".into(),
false,
cx,
),
Feature::AgentGemini => self.render_feature_upsell_banner(
"Gemini CLI support is built-in to Zed!".into(),
"https://zed.dev/docs/ai/external-agents#gemini-cli".into(),
false,
cx,
),
Feature::ExtensionRuff => self.render_feature_upsell_banner(
"Ruff (linter for Python) support is built-in to Zed!".into(),
"https://zed.dev/docs/languages/python#code-formatting--linting".into(),

View File

@@ -711,7 +711,9 @@ impl PickerDelegate for OpenPathDelegate {
match &self.directory_state {
DirectoryState::List { parent_path, .. } => {
let (label, indices) = if *parent_path == self.prompt_root {
let (label, indices) = if is_current_dir_candidate {
("open this directory".to_string(), vec![])
} else if *parent_path == self.prompt_root {
match_positions.iter_mut().for_each(|position| {
*position += self.prompt_root.len();
});
@@ -719,8 +721,6 @@ impl PickerDelegate for OpenPathDelegate {
format!("{}{}", self.prompt_root, candidate.path.string),
match_positions,
)
} else if is_current_dir_candidate {
("open this directory".to_string(), vec![])
} else {
(candidate.path.string, match_positions)
};

View File

@@ -377,7 +377,7 @@ impl Fs for RealFs {
#[cfg(windows)]
if smol::fs::metadata(&target).await?.is_dir() {
let status = smol::process::Command::new("cmd")
let status = new_smol_command("cmd")
.args(["/C", "mklink", "/J"])
.args([path, target.as_path()])
.status()

View File

@@ -23,6 +23,7 @@ serde.workspace = true
serde_json.workspace = true
settings.workspace = true
url.workspace = true
urlencoding.workspace = true
util.workspace = true
[dev-dependencies]

View File

@@ -1,5 +1,11 @@
use std::str::FromStr;
use std::{str::FromStr, sync::Arc};
use anyhow::{Context as _, Result, bail};
use async_trait::async_trait;
use futures::AsyncReadExt;
use gpui::SharedString;
use http_client::{AsyncBody, HttpClient, HttpRequestExt, Request};
use serde::Deserialize;
use url::Url;
use git::{
@@ -9,6 +15,55 @@ use git::{
pub struct Gitee;
#[derive(Debug, Deserialize)]
struct CommitDetails {
author: Option<Author>,
}
#[derive(Debug, Deserialize)]
struct Author {
avatar_url: String,
}
impl Gitee {
async fn fetch_gitee_commit_author(
&self,
repo_owner: &str,
repo: &str,
commit: &str,
client: &Arc<dyn HttpClient>,
) -> Result<Option<Author>> {
let url = format!("https://gitee.com/api/v5/repos/{repo_owner}/{repo}/commits/{commit}");
let request = Request::get(&url)
.header("Content-Type", "application/json")
.follow_redirects(http_client::RedirectPolicy::FollowAll);
let mut response = client
.send(request.body(AsyncBody::default())?)
.await
.with_context(|| format!("error fetching Gitee commit details at {:?}", url))?;
let mut body = Vec::new();
response.body_mut().read_to_end(&mut body).await?;
if response.status().is_client_error() {
let text = String::from_utf8_lossy(body.as_slice());
bail!(
"status error {}, response: {text:?}",
response.status().as_u16()
);
}
let body_str = std::str::from_utf8(&body)?;
serde_json::from_str::<CommitDetails>(body_str)
.map(|commit| commit.author)
.context("failed to deserialize Gitee commit details")
}
}
#[async_trait]
impl GitHostingProvider for Gitee {
fn name(&self) -> String {
"Gitee".to_string()
@@ -19,7 +74,7 @@ impl GitHostingProvider for Gitee {
}
fn supports_avatars(&self) -> bool {
false
true
}
fn format_line_number(&self, line: u32) -> String {
@@ -80,6 +135,26 @@ impl GitHostingProvider for Gitee {
);
permalink
}
async fn commit_author_avatar_url(
&self,
repo_owner: &str,
repo: &str,
commit: SharedString,
http_client: Arc<dyn HttpClient>,
) -> Result<Option<Url>> {
let commit = commit.to_string();
let avatar_url = self
.fetch_gitee_commit_author(repo_owner, repo, &commit, &http_client)
.await?
.map(|author| -> Result<Url, url::ParseError> {
let mut url = Url::parse(&author.avatar_url)?;
url.set_query(Some("width=128"));
Ok(url)
})
.transpose()?;
Ok(avatar_url)
}
}
#[cfg(test)]
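The Gitee change above requests a fixed-size avatar by replacing the URL's query with `width=128` via `Url::set_query`. A minimal std-only sketch of that rewrite (no `url` crate; `with_width_query` is a hypothetical helper, not part of the diff):

```rust
// Hypothetical std-only sketch of what `url.set_query(Some("width=128"))`
// does to the avatar URL: drop any existing query, append the fixed size.
fn with_width_query(avatar_url: &str) -> String {
    let base = avatar_url.split('?').next().unwrap_or(avatar_url);
    format!("{base}?width=128")
}

fn main() {
    let url = with_width_query("https://gitee.com/assets/avatar.png?foo=1");
    assert_eq!(url, "https://gitee.com/assets/avatar.png?width=128");
}
```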

View File

@@ -1,6 +1,11 @@
use std::str::FromStr;
use std::{str::FromStr, sync::Arc};
use anyhow::{Result, bail};
use anyhow::{Context as _, Result, bail};
use async_trait::async_trait;
use futures::AsyncReadExt;
use gpui::SharedString;
use http_client::{AsyncBody, HttpClient, HttpRequestExt, Request};
use serde::Deserialize;
use url::Url;
use git::{
@@ -10,6 +15,16 @@ use git::{
use crate::get_host_from_git_remote_url;
#[derive(Debug, Deserialize)]
struct CommitDetails {
author_email: String,
}
#[derive(Debug, Deserialize)]
struct AvatarInfo {
avatar_url: String,
}
#[derive(Debug)]
pub struct Gitlab {
name: String,
@@ -46,8 +61,79 @@ impl Gitlab {
Url::parse(&format!("https://{}", host))?,
))
}
async fn fetch_gitlab_commit_author(
&self,
repo_owner: &str,
repo: &str,
commit: &str,
client: &Arc<dyn HttpClient>,
) -> Result<Option<AvatarInfo>> {
let Some(host) = self.base_url.host_str() else {
bail!("failed to get host from gitlab base url");
};
let project_path = format!("{}/{}", repo_owner, repo);
let project_path_encoded = urlencoding::encode(&project_path);
let url = format!(
"https://{host}/api/v4/projects/{project_path_encoded}/repository/commits/{commit}"
);
let request = Request::get(&url)
.header("Content-Type", "application/json")
.follow_redirects(http_client::RedirectPolicy::FollowAll);
let mut response = client
.send(request.body(AsyncBody::default())?)
.await
.with_context(|| format!("error fetching GitLab commit details at {:?}", url))?;
let mut body = Vec::new();
response.body_mut().read_to_end(&mut body).await?;
if response.status().is_client_error() {
let text = String::from_utf8_lossy(body.as_slice());
bail!(
"status error {}, response: {text:?}",
response.status().as_u16()
);
}
let body_str = std::str::from_utf8(&body)?;
let author_email = serde_json::from_str::<CommitDetails>(body_str)
.map(|commit| commit.author_email)
.context("failed to deserialize GitLab commit details")?;
let avatar_info_url = format!("https://{host}/api/v4/avatar?email={author_email}");
let request = Request::get(&avatar_info_url)
.header("Content-Type", "application/json")
.follow_redirects(http_client::RedirectPolicy::FollowAll);
let mut response = client
.send(request.body(AsyncBody::default())?)
.await
.with_context(|| format!("error fetching GitLab avatar info at {:?}", avatar_info_url))?;
let mut body = Vec::new();
response.body_mut().read_to_end(&mut body).await?;
if response.status().is_client_error() {
let text = String::from_utf8_lossy(body.as_slice());
bail!(
"status error {}, response: {text:?}",
response.status().as_u16()
);
}
let body_str = std::str::from_utf8(&body)?;
serde_json::from_str::<Option<AvatarInfo>>(body_str)
.context("failed to deserialize GitLab avatar info")
}
}
#[async_trait]
impl GitHostingProvider for Gitlab {
fn name(&self) -> String {
self.name.clone()
@@ -58,7 +144,7 @@ impl GitHostingProvider for Gitlab {
}
fn supports_avatars(&self) -> bool {
false
true
}
fn format_line_number(&self, line: u32) -> String {
@@ -122,6 +208,39 @@ impl GitHostingProvider for Gitlab {
);
permalink
}
async fn commit_author_avatar_url(
&self,
repo_owner: &str,
repo: &str,
commit: SharedString,
http_client: Arc<dyn HttpClient>,
) -> Result<Option<Url>> {
let commit = commit.to_string();
let avatar_url = self
.fetch_gitlab_commit_author(repo_owner, repo, &commit, &http_client)
.await?
.map(|author| -> Result<Url, url::ParseError> {
let mut url = Url::parse(&author.avatar_url)?;
if let Some(host) = url.host_str() {
let size_query = if host.contains("gravatar") || host.contains("libravatar") {
Some("s=128")
} else if self
.base_url
.host_str()
.is_some_and(|base_host| host.contains(base_host))
{
Some("width=128")
} else {
None
};
url.set_query(size_query);
}
Ok(url)
})
.transpose()?;
Ok(avatar_url)
}
}
#[cfg(test)]
@@ -134,8 +253,8 @@ mod tests {
#[test]
fn test_invalid_self_hosted_remote_url() {
let remote_url = "https://gitlab.com/zed-industries/zed.git";
let github = Gitlab::from_remote_url(remote_url);
assert!(github.is_err());
let gitlab = Gitlab::from_remote_url(remote_url);
assert!(gitlab.is_err());
}
#[test]
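The GitLab hunk URL-encodes `owner/repo` with `urlencoding::encode` because the `/projects/:id` endpoint takes the project path as a single path segment, so the `/` must become `%2F`. A minimal sketch of just that step, under the assumption that `/` is the only character needing escaping in typical owner/repo paths (`encode_project_path` is a hypothetical helper):

```rust
// Minimal stand-in for the `urlencoding::encode` call on "owner/repo":
// GitLab's API accepts the project path only with '/' percent-encoded.
fn encode_project_path(owner: &str, repo: &str) -> String {
    format!("{owner}/{repo}").replace('/', "%2F")
}

fn main() {
    let id = encode_project_path("zed-industries", "zed");
    assert_eq!(id, "zed-industries%2Fzed");
}
```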

View File

@@ -58,8 +58,8 @@ use std::{collections::HashSet, sync::Arc, time::Duration, usize};
use strum::{IntoEnumIterator, VariantNames};
use time::OffsetDateTime;
use ui::{
Checkbox, CommonAnimationExt, ContextMenu, ElevationIndex, IconPosition, Label, LabelSize,
PopoverMenu, ScrollAxes, Scrollbars, SplitButton, Tooltip, WithScrollbar, prelude::*,
ButtonLike, Checkbox, CommonAnimationExt, ContextMenu, ElevationIndex, PopoverMenu, ScrollAxes,
Scrollbars, SplitButton, Tooltip, WithScrollbar, prelude::*,
};
use util::paths::PathStyle;
use util::{ResultExt, TryFutureExt, maybe};
@@ -286,6 +286,12 @@ struct PendingOperation {
op_id: usize,
}
impl PendingOperation {
fn contains_path(&self, path: &RepoPath) -> bool {
self.entries.iter().any(|p| &p.repo_path == path)
}
}
pub struct GitPanel {
pub(crate) active_repository: Option<Entity<Repository>>,
pub(crate) commit_editor: Entity<Editor>,
@@ -1240,19 +1246,21 @@ impl GitPanel {
};
let (stage, repo_paths) = match entry {
GitListEntry::Status(status_entry) => {
if status_entry.status.staging().is_fully_staged() {
let repo_paths = vec![status_entry.clone()];
let stage = if let Some(status) = self.entry_staging(&status_entry) {
!status.is_fully_staged()
} else if status_entry.status.staging().is_fully_staged() {
if let Some(op) = self.bulk_staging.clone()
&& op.anchor == status_entry.repo_path
{
self.bulk_staging = None;
}
(false, vec![status_entry.clone()])
false
} else {
self.set_bulk_staging_anchor(status_entry.repo_path.clone(), cx);
(true, vec![status_entry.clone()])
}
true
};
(stage, repo_paths)
}
GitListEntry::Header(section) => {
let goal_staged_state = !self.header_state(section.header).selected();
@@ -2677,10 +2685,7 @@ impl GitPanel {
if self.pending.iter().any(|pending| {
pending.target_status == TargetStatus::Reverted
&& !pending.finished
&& pending
.entries
.iter()
.any(|pending| pending.repo_path == entry.repo_path)
&& pending.contains_path(&entry.repo_path)
}) {
continue;
}
@@ -2731,10 +2736,7 @@ impl GitPanel {
last_pending_staged = pending.entries.first().cloned();
}
if let Some(single_staged) = &single_staged_entry
&& pending
.entries
.iter()
.any(|entry| entry.repo_path == single_staged.repo_path)
&& pending.contains_path(&single_staged.repo_path)
{
pending_status_for_single_staged = Some(pending.target_status);
}
@@ -2797,7 +2799,7 @@ impl GitPanel {
&& let Some(index) = bulk_staging_anchor_new_index
&& let Some(entry) = self.entries.get(index)
&& let Some(entry) = entry.status_entry()
&& self.entry_staging(entry) == StageStatus::Staged
&& self.entry_staging(entry).unwrap_or(entry.staging) == StageStatus::Staged
{
self.bulk_staging = bulk_staging;
}
@@ -2845,39 +2847,47 @@ impl GitPanel {
self.entry_count += 1;
if repo.had_conflict_on_last_merge_head_change(&status_entry.repo_path) {
self.conflicted_count += 1;
if self.entry_staging(status_entry).has_staged() {
if self
.entry_staging(status_entry)
.unwrap_or(status_entry.staging)
.has_staged()
{
self.conflicted_staged_count += 1;
}
} else if status_entry.status.is_created() {
self.new_count += 1;
if self.entry_staging(status_entry).has_staged() {
if self
.entry_staging(status_entry)
.unwrap_or(status_entry.staging)
.has_staged()
{
self.new_staged_count += 1;
}
} else {
self.tracked_count += 1;
if self.entry_staging(status_entry).has_staged() {
if self
.entry_staging(status_entry)
.unwrap_or(status_entry.staging)
.has_staged()
{
self.tracked_staged_count += 1;
}
}
}
}
fn entry_staging(&self, entry: &GitStatusEntry) -> StageStatus {
fn entry_staging(&self, entry: &GitStatusEntry) -> Option<StageStatus> {
for pending in self.pending.iter().rev() {
if pending
.entries
.iter()
.any(|pending_entry| pending_entry.repo_path == entry.repo_path)
{
if pending.contains_path(&entry.repo_path) {
match pending.target_status {
TargetStatus::Staged => return StageStatus::Staged,
TargetStatus::Unstaged => return StageStatus::Unstaged,
TargetStatus::Staged => return Some(StageStatus::Staged),
TargetStatus::Unstaged => return Some(StageStatus::Unstaged),
TargetStatus::Reverted => continue,
TargetStatus::Unchanged => continue,
}
}
}
entry.staging
None
}
pub(crate) fn has_staged_changes(&self) -> bool {
@@ -3495,6 +3505,12 @@ impl GitPanel {
let amend = self.amend_pending();
let signoff = self.signoff_enabled;
let label_color = if self.pending_commit.is_some() {
Color::Disabled
} else {
Color::Default
};
div()
.id("commit-wrapper")
.on_hover(cx.listener(move |this, hovered, _, cx| {
@@ -3503,14 +3519,15 @@ impl GitPanel {
cx.notify()
}))
.child(SplitButton::new(
ui::ButtonLike::new_rounded_left(ElementId::Name(
ButtonLike::new_rounded_left(ElementId::Name(
format!("split-button-left-{}", title).into(),
))
.layer(ui::ElevationIndex::ModalSurface)
.size(ui::ButtonSize::Compact)
.layer(ElevationIndex::ModalSurface)
.size(ButtonSize::Compact)
.child(
div()
.child(Label::new(title).size(LabelSize::Small))
Label::new(title)
.size(LabelSize::Small)
.color(label_color)
.mr_0p5(),
)
.on_click({
@@ -3710,7 +3727,8 @@ impl GitPanel {
let ix = self.entry_by_path(&repo_path, cx)?;
let entry = self.entries.get(ix)?;
let entry_staging = self.entry_staging(entry.status_entry()?);
let status = entry.status_entry()?;
let entry_staging = self.entry_staging(status).unwrap_or(status.staging);
let checkbox = Checkbox::new("stage-file", entry_staging.as_bool().into())
.disabled(!self.has_write_access(cx))
@@ -4004,8 +4022,8 @@ impl GitPanel {
let checkbox_id: ElementId =
ElementId::Name(format!("entry_{}_{}_checkbox", display_name, ix).into());
let entry_staging = self.entry_staging(entry);
let mut is_staged: ToggleState = self.entry_staging(entry).as_bool().into();
let entry_staging = self.entry_staging(entry).unwrap_or(entry.staging);
let mut is_staged: ToggleState = entry_staging.as_bool().into();
if self.show_placeholders && !self.has_staged_changes() && !entry.status.is_created() {
is_staged = ToggleState::Selected;
}
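The `entry_staging` change in this file returns `Option<StageStatus>` so callers can distinguish "a pending operation overrides the state" from "no override, fall back to the entry's real staging" via `unwrap_or(entry.staging)`. A simplified sketch of that newest-wins overlay lookup (types and fields here are stand-ins, not the real `GitPanel` structures):

```rust
#[derive(Clone, Copy, PartialEq, Debug)]
enum StageStatus { Staged, Unstaged }

struct PendingOperation { paths: Vec<String>, target: StageStatus }

// Scan pending operations newest-first; `None` means "no pending override,
// use the entry's on-disk staging state" (the `unwrap_or(entry.staging)` callers).
fn pending_staging(pending: &[PendingOperation], path: &str) -> Option<StageStatus> {
    for op in pending.iter().rev() {
        if op.paths.iter().any(|p| p == path) {
            return Some(op.target);
        }
    }
    None
}

fn main() {
    let pending = vec![PendingOperation {
        paths: vec!["a.rs".into()],
        target: StageStatus::Staged,
    }];
    assert_eq!(pending_staging(&pending, "a.rs"), Some(StageStatus::Staged));
    // No override recorded: fall back to the entry's actual state.
    let effective = pending_staging(&pending, "b.rs").unwrap_or(StageStatus::Unstaged);
    assert_eq!(effective, StageStatus::Unstaged);
}
```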

View File

@@ -5,16 +5,14 @@ use git::stash::StashEntry;
use gpui::{
Action, AnyElement, App, Context, DismissEvent, Entity, EventEmitter, FocusHandle, Focusable,
InteractiveElement, IntoElement, Modifiers, ModifiersChangedEvent, ParentElement, Render,
SharedString, Styled, Subscription, Task, WeakEntity, Window, actions, rems, svg,
SharedString, Styled, Subscription, Task, WeakEntity, Window, actions, rems,
};
use picker::{Picker, PickerDelegate};
use project::git_store::{Repository, RepositoryEvent};
use std::sync::Arc;
use time::{OffsetDateTime, UtcOffset};
use time_format;
use ui::{
ButtonLike, HighlightedLabel, KeyBinding, ListItem, ListItemSpacing, Tooltip, prelude::*,
};
use ui::{HighlightedLabel, KeyBinding, ListItem, ListItemSpacing, Tooltip, prelude::*};
use util::ResultExt;
use workspace::notifications::DetachAndPromptErr;
use workspace::{ModalView, Workspace};
@@ -434,7 +432,7 @@ impl PickerDelegate for StashListDelegate {
ix: usize,
selected: bool,
_window: &mut Window,
cx: &mut Context<Picker<Self>>,
_cx: &mut Context<Picker<Self>>,
) -> Option<Self::ListItem> {
let entry_match = &self.matches[ix];
@@ -446,23 +444,14 @@ impl PickerDelegate for StashListDelegate {
.into_any_element();
let branch_name = entry_match.entry.branch.clone().unwrap_or_default();
let branch_label = h_flex()
let branch_info = h_flex()
.gap_1p5()
.w_full()
.child(
h_flex()
.gap_0p5()
.child(
Icon::new(IconName::GitBranch)
.color(Color::Muted)
.size(IconSize::Small),
)
.child(
Label::new(branch_name)
.truncate()
.color(Color::Muted)
.size(LabelSize::Small),
),
Label::new(branch_name)
.truncate()
.color(Color::Muted)
.size(LabelSize::Small),
)
.child(
Label::new("")
@@ -476,42 +465,12 @@ impl PickerDelegate for StashListDelegate {
.size(LabelSize::Small),
);
let show_button = div()
.group("show-button-hover")
.child(
ButtonLike::new("show-button")
.child(
svg()
.size(IconSize::Medium.rems())
.flex_none()
.path(IconName::Eye.path())
.text_color(Color::Default.color(cx))
.group_hover("show-button-hover", |this| {
this.text_color(Color::Accent.color(cx))
})
.hover(|this| this.text_color(Color::Accent.color(cx))),
)
.tooltip(Tooltip::for_action_title("Show Stash", &ShowStashItem))
.on_click(cx.listener(move |picker, _, window, cx| {
cx.stop_propagation();
picker.delegate.show_stash_at(ix, window, cx);
})),
)
.into_any_element();
Some(
ListItem::new(SharedString::from(format!("stash-{ix}")))
.inset(true)
.spacing(ListItemSpacing::Sparse)
.toggle_state(selected)
.end_slot(show_button)
.child(
v_flex()
.w_full()
.overflow_hidden()
.child(stash_label)
.child(branch_label.into_element()),
)
.child(v_flex().w_full().child(stash_label).child(branch_info))
.tooltip(Tooltip::text(format!(
"stash@{{{}}}",
entry_match.entry.index
@@ -534,26 +493,6 @@ impl PickerDelegate for StashListDelegate {
.justify_end()
.border_t_1()
.border_color(cx.theme().colors().border_variant)
.child(
Button::new("apply-stash", "Apply")
.key_binding(
KeyBinding::for_action_in(&menu::Confirm, &focus_handle, cx)
.map(|kb| kb.size(rems_from_px(12.))),
)
.on_click(|_, window, cx| {
window.dispatch_action(menu::Confirm.boxed_clone(), cx)
}),
)
.child(
Button::new("pop-stash", "Pop")
.key_binding(
KeyBinding::for_action_in(&menu::SecondaryConfirm, &focus_handle, cx)
.map(|kb| kb.size(rems_from_px(12.))),
)
.on_click(|_, window, cx| {
window.dispatch_action(menu::SecondaryConfirm.boxed_clone(), cx)
}),
)
.child(
Button::new("drop-stash", "Drop")
.key_binding(
@@ -568,6 +507,42 @@ impl PickerDelegate for StashListDelegate {
window.dispatch_action(stash_picker::DropStashItem.boxed_clone(), cx)
}),
)
.child(
Button::new("view-stash", "View")
.key_binding(
KeyBinding::for_action_in(
&stash_picker::ShowStashItem,
&focus_handle,
cx,
)
.map(|kb| kb.size(rems_from_px(12.))),
)
.on_click(cx.listener(move |picker, _, window, cx| {
cx.stop_propagation();
let selected_ix = picker.delegate.selected_index();
picker.delegate.show_stash_at(selected_ix, window, cx);
})),
)
.child(
Button::new("pop-stash", "Pop")
.key_binding(
KeyBinding::for_action_in(&menu::SecondaryConfirm, &focus_handle, cx)
.map(|kb| kb.size(rems_from_px(12.))),
)
.on_click(|_, window, cx| {
window.dispatch_action(menu::SecondaryConfirm.boxed_clone(), cx)
}),
)
.child(
Button::new("apply-stash", "Apply")
.key_binding(
KeyBinding::for_action_in(&menu::Confirm, &focus_handle, cx)
.map(|kb| kb.size(rems_from_px(12.))),
)
.on_click(|_, window, cx| {
window.dispatch_action(menu::Confirm.boxed_clone(), cx)
}),
)
.into_any(),
)
}

View File

@@ -80,27 +80,15 @@ impl PlatformDispatcher for WindowsDispatcher {
}
fn dispatch_on_main_thread(&self, runnable: Runnable) {
let was_empty = self.main_sender.is_empty();
match self.main_sender.send(runnable) {
Ok(_) => unsafe {
// Only send a `WM_GPUI_TASK_DISPATCHED_ON_MAIN_THREAD` to the
// queue if we have no runnables queued up yet, otherwise we
// risk filling the message queue with gpui messages causing us
// to starve the message loop of system messages, resulting in a
// process hang.
//
// When the message loop receives a
// `WM_GPUI_TASK_DISPATCHED_ON_MAIN_THREAD` message we drain the
// runnable queue entirely.
if was_empty {
PostMessageW(
Some(self.platform_window_handle.as_raw()),
WM_GPUI_TASK_DISPATCHED_ON_MAIN_THREAD,
WPARAM(self.validation_number),
LPARAM(0),
)
.log_err();
}
PostMessageW(
Some(self.platform_window_handle.as_raw()),
WM_GPUI_TASK_DISPATCHED_ON_MAIN_THREAD,
WPARAM(self.validation_number),
LPARAM(0),
)
.log_err();
},
Err(runnable) => {
// NOTE: Runnable may wrap a Future that is !Send.
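Both sides of this dispatcher change rely on the message loop draining the entire runnable queue per wake message; the revert just posts a wake per enqueued task instead of only when the queue was empty. The drain contract can be sketched platform-independently with a std channel (this is an illustration of the pattern, not the GPUI implementation):

```rust
use std::sync::mpsc;

// One wake handles everything queued so far: drain with try_recv until empty,
// mirroring how a single WM_GPUI_TASK_DISPATCHED_ON_MAIN_THREAD message is
// handled by the Windows message loop.
fn drain_all(rx: &mpsc::Receiver<Box<dyn FnOnce()>>) -> usize {
    let mut ran = 0;
    while let Ok(runnable) = rx.try_recv() {
        runnable();
        ran += 1;
    }
    ran
}

fn main() {
    let (tx, rx) = mpsc::channel::<Box<dyn FnOnce()>>();
    for _ in 0..3 {
        tx.send(Box::new(|| {})).unwrap();
    }
    // Three tasks queued, one "wake" processes them all.
    assert_eq!(drain_all(&rx), 3);
}
```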

View File

@@ -9,7 +9,6 @@ use windows::Win32::UI::{
},
WindowsAndMessaging::KL_NAMELENGTH,
};
use windows_core::HSTRING;
use crate::{
KeybindingKeystroke, Keystroke, Modifiers, PlatformKeyboardLayout, PlatformKeyboardMapper,
@@ -93,14 +92,13 @@ impl PlatformKeyboardMapper for WindowsKeyboardMapper {
impl WindowsKeyboardLayout {
pub(crate) fn new() -> Result<Self> {
let mut buffer = [0u16; KL_NAMELENGTH as usize];
let mut buffer = [0u16; KL_NAMELENGTH as usize]; // KL_NAMELENGTH includes the null terminator
unsafe { GetKeyboardLayoutNameW(&mut buffer)? };
let id = HSTRING::from_wide(&buffer).to_string();
let id = String::from_utf16_lossy(&buffer[..buffer.len() - 1]); // Remove the null terminator
let entry = windows_registry::LOCAL_MACHINE.open(format!(
"System\\CurrentControlSet\\Control\\Keyboard Layouts\\{}",
id
"System\\CurrentControlSet\\Control\\Keyboard Layouts\\{id}"
))?;
let name = entry.get_hstring("Layout Text")?.to_string();
let name = entry.get_string("Layout Text")?;
Ok(Self { id, name })
}
@@ -135,6 +133,7 @@ impl WindowsKeyboardLayout {
b"0405" | // Czech
b"040E" | // Hungarian
b"0424" | // Slovenian
b"041A" | // Croatian
b"041B" | // Slovak
b"0418" // Romanian
)
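The fix above works because `GetKeyboardLayoutNameW` fills a `KL_NAMELENGTH`-sized UTF-16 buffer whose last element is the NUL terminator; keeping it in the decoded string broke comparisons like the `uses_altgr` lookup. A sketch of the decode step (trimming at the first NUL, a slight generalization of the diff's `buffer.len() - 1` slice):

```rust
// Decode a NUL-terminated UTF-16 buffer as returned by GetKeyboardLayoutNameW,
// stopping at the terminator so the layout ID compares equal to registry keys.
fn decode_layout_id(buffer: &[u16]) -> String {
    let len = buffer.iter().position(|&c| c == 0).unwrap_or(buffer.len());
    String::from_utf16_lossy(&buffer[..len])
}

fn main() {
    // "00000424" (Slovenian) followed by the terminator, as the API returns it.
    let raw: Vec<u16> = "00000424\0".encode_utf16().collect();
    assert_eq!(decode_layout_id(&raw), "00000424");
}
```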

View File

@@ -906,6 +906,16 @@ impl Render for ConfigurationView {
.child(Icon::new(IconName::Check).color(Color::Success))
.child(Label::new("Connected"))
.into_any_element(),
)
.child(
IconButton::new("refresh-models", IconName::RotateCcw)
.tooltip(Tooltip::text("Refresh models"))
.on_click(cx.listener(|this, _, _, cx| {
this.state.update(cx, |state, _| {
state.fetched_models.clear();
});
this.retry_connection(cx);
})),
),
)
} else {

View File

@@ -7,6 +7,7 @@
"exclude"
"retract"
"module"
"ignore"
] @keyword
"=>" @operator

View File

@@ -27,3 +27,9 @@
("(") @structure.open
(")") @structure.close
)
(ignore_directive
"ignore" @structure.anchor
("(") @structure.open
(")") @structure.close
)

View File

@@ -1211,7 +1211,7 @@ impl ToolchainLister for PythonToolchainProvider {
activation_script.extend(match shell {
ShellKind::Fish => Some(format!("\"{pyenv}\" shell - fish {version}")),
ShellKind::Posix => Some(format!("\"{pyenv}\" shell - sh {version}")),
ShellKind::Nushell => Some(format!("\"{pyenv}\" shell - nu {version}")),
ShellKind::Nushell => Some(format!("^\"{pyenv}\" shell - nu {version}")),
ShellKind::PowerShell => None,
ShellKind::Csh => None,
ShellKind::Tcsh => None,
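The one-character pyenv fix matters because Nushell parses a bare quoted string as a value rather than a command; the `^` prefix forces it to run as an external command. A simplified sketch of the per-shell activation lines (string-keyed shells here stand in for the real `ShellKind` enum; `pyenv` and `version` are placeholder inputs):

```rust
// Build the pyenv activation line per shell; note the '^' for Nushell,
// which runs the quoted path as an external command instead of a value.
fn pyenv_activation(shell: &str, pyenv: &str, version: &str) -> Option<String> {
    match shell {
        "fish" => Some(format!("\"{pyenv}\" shell - fish {version}")),
        "posix" => Some(format!("\"{pyenv}\" shell - sh {version}")),
        "nushell" => Some(format!("^\"{pyenv}\" shell - nu {version}")),
        _ => None, // PowerShell, csh, tcsh: no activation line generated
    }
}

fn main() {
    let line = pyenv_activation("nushell", "/usr/bin/pyenv", "3.12").unwrap();
    assert!(line.starts_with("^\""));
}
```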

View File

@@ -121,6 +121,15 @@
; Tokens
[
";"
"?."
"."
","
":"
"?"
] @punctuation.delimiter
[
"..."
"-"
@@ -179,15 +188,6 @@
] @operator
)
[
";"
"?."
"."
","
":"
"?"
] @punctuation.delimiter
[
"("
")"

View File

@@ -5,7 +5,7 @@ use gpui::{App, AppContext, Context, Entity};
use itertools::Itertools;
use language::{Buffer, BufferSnapshot};
use rope::Point;
use text::{Bias, OffsetRangeExt, locator::Locator};
use text::{Bias, BufferId, OffsetRangeExt, locator::Locator};
use util::{post_inc, rel_path::RelPath};
use crate::{
@@ -152,6 +152,15 @@ impl MultiBuffer {
}
}
pub fn remove_excerpts_for_buffer(&mut self, buffer: BufferId, cx: &mut Context<Self>) {
self.remove_excerpts(
self.excerpts_for_buffer(buffer, cx)
.into_iter()
.map(|(excerpt, _)| excerpt),
cx,
);
}
pub(super) fn expand_excerpts_with_paths(
&mut self,
ids: impl IntoIterator<Item = ExcerptId>,

View File

@@ -1,7 +1,6 @@
use std::{
any::Any,
borrow::Borrow,
collections::HashSet,
path::{Path, PathBuf},
str::FromStr as _,
sync::Arc,
@@ -137,7 +136,7 @@ impl EventEmitter<AgentServersUpdated> for AgentServerStore {}
#[cfg(test)]
mod ext_agent_tests {
use super::*;
use std::fmt::Write as _;
use std::{collections::HashSet, fmt::Write as _};
// Helper to build a store in Collab mode so we can mutate internal maps without
// needing to spin up a full project environment.
@@ -244,25 +243,18 @@ impl AgentServerStore {
// Collect manifests first so we can iterate twice
let manifests: Vec<_> = manifests.into_iter().collect();
// Remove existing extension-provided agents by tracking which ones we're about to add
let extension_agent_names: HashSet<_> = manifests
.iter()
.flat_map(|(_, manifest)| manifest.agent_servers.keys().map(|k| k.to_string()))
.collect();
let keys_to_remove: Vec<_> = self
.external_agents
.keys()
.filter(|name| {
// Remove if it matches an extension agent name from any extension
extension_agent_names.contains(name.0.as_ref())
})
.cloned()
.collect();
for key in &keys_to_remove {
self.external_agents.remove(key);
self.agent_icons.remove(key);
}
// Remove all extension-provided agents
// (They will be re-added below if they're in the currently installed extensions)
self.external_agents.retain(|name, agent| {
if agent.downcast_mut::<LocalExtensionArchiveAgent>().is_some() {
self.agent_icons.remove(name);
false
} else {
// Keep the hardcoded external agents that don't come from extensions
// (In the future we may move these over to being extensions too.)
true
}
});
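The simplification above replaces a collect-keys-then-remove pass with a single `retain` that drops extension-provided agents and clears their icons in one sweep. A sketch of that pattern with plain maps (`is_extension` stands in for the `downcast_mut::<LocalExtensionArchiveAgent>()` check; these are not the real store types):

```rust
use std::collections::HashMap;

// Drop every extension-provided agent in place, clearing its icon as we go;
// hardcoded external agents (value false) are kept.
fn remove_extension_agents(
    agents: &mut HashMap<String, bool>, // true = extension-provided
    icons: &mut HashMap<String, String>,
) {
    agents.retain(|name, &mut is_extension| {
        if is_extension {
            icons.remove(name);
            false
        } else {
            true
        }
    });
}

fn main() {
    let mut agents = HashMap::from([("ext-agent".to_string(), true), ("claude".to_string(), false)]);
    let mut icons = HashMap::from([("ext-agent".to_string(), "icon.svg".to_string())]);
    remove_extension_agents(&mut agents, &mut icons);
    assert_eq!(agents.len(), 1);
    assert!(agents.contains_key("claude"));
    assert!(icons.is_empty());
}
```

`retain` can mutate the second map freely because the two maps are separate bindings, which is what makes the one-pass version borrow-check cleanly.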
// Insert agent servers from extension manifests
match &self.state {
@@ -1037,7 +1029,7 @@ impl ExternalAgentServer for LocalGemini {
cx.spawn(async move |cx| {
let mut env = project_environment
.update(cx, |project_environment, cx| {
project_environment.get_local_directory_environment(
project_environment.local_directory_environment(
&Shell::System,
root_dir.clone(),
cx,
@@ -1133,7 +1125,7 @@ impl ExternalAgentServer for LocalClaudeCode {
cx.spawn(async move |cx| {
let mut env = project_environment
.update(cx, |project_environment, cx| {
project_environment.get_local_directory_environment(
project_environment.local_directory_environment(
&Shell::System,
root_dir.clone(),
cx,
@@ -1227,7 +1219,7 @@ impl ExternalAgentServer for LocalCodex {
cx.spawn(async move |cx| {
let mut env = project_environment
.update(cx, |project_environment, cx| {
project_environment.get_local_directory_environment(
project_environment.local_directory_environment(
&Shell::System,
root_dir.clone(),
cx,
@@ -1402,7 +1394,7 @@ impl ExternalAgentServer for LocalExtensionArchiveAgent {
// Get project environment
let mut env = project_environment
.update(cx, |project_environment, cx| {
project_environment.get_local_directory_environment(
project_environment.local_directory_environment(
&Shell::System,
root_dir.clone(),
cx,
@@ -1585,7 +1577,7 @@ impl ExternalAgentServer for LocalCustomAgent {
cx.spawn(async move |cx| {
let mut env = project_environment
.update(cx, |project_environment, cx| {
project_environment.get_local_directory_environment(
project_environment.local_directory_environment(
&Shell::System,
root_dir.clone(),
cx,
@@ -1702,6 +1694,8 @@ impl settings::Settings for AllAgentServersSettings {
#[cfg(test)]
mod extension_agent_tests {
use crate::worktree_store::WorktreeStore;
use super::*;
use gpui::TestAppContext;
use std::sync::Arc;
@@ -1826,7 +1820,9 @@ mod extension_agent_tests {
async fn archive_agent_uses_extension_and_agent_id_for_cache_key(cx: &mut TestAppContext) {
let fs = fs::FakeFs::new(cx.background_executor.clone());
let http_client = http_client::FakeHttpClient::with_404_response();
let project_environment = cx.new(|cx| crate::ProjectEnvironment::new(None, cx));
let worktree_store = cx.new(|_| WorktreeStore::local(false, fs.clone()));
let project_environment =
cx.new(|cx| crate::ProjectEnvironment::new(None, worktree_store.downgrade(), None, cx));
let agent = LocalExtensionArchiveAgent {
fs,

View File

@@ -49,7 +49,7 @@ use std::{
path::{Path, PathBuf},
sync::{Arc, Once},
};
use task::{DebugScenario, Shell, SpawnInTerminal, TaskContext, TaskTemplate};
use task::{DebugScenario, SpawnInTerminal, TaskContext, TaskTemplate};
use util::{ResultExt as _, rel_path::RelPath};
use worktree::Worktree;
@@ -267,8 +267,8 @@ impl DapStore {
let user_env = dap_settings.map(|s| s.env.clone());
let delegate = self.delegate(worktree, console, cx);
let cwd: Arc<Path> = worktree.read(cx).abs_path().as_ref().into();
let worktree = worktree.clone();
cx.spawn(async move |this, cx| {
let mut binary = adapter
.get_binary(
@@ -287,11 +287,7 @@ impl DapStore {
.unwrap()
.environment
.update(cx, |environment, cx| {
environment.get_local_directory_environment(
&Shell::System,
cwd,
cx,
)
environment.worktree_environment(worktree, cx)
})
})?
.await;
@@ -607,9 +603,9 @@ impl DapStore {
local_store.node_runtime.clone(),
local_store.http_client.clone(),
local_store.toolchain_store.clone(),
local_store.environment.update(cx, |env, cx| {
env.get_worktree_environment(worktree.clone(), cx)
}),
local_store
.environment
.update(cx, |env, cx| env.worktree_environment(worktree.clone(), cx)),
local_store.is_headless,
))
}

View File

@@ -5,11 +5,12 @@ use remote::RemoteClient;
use rpc::proto::{self, REMOTE_SERVER_PROJECT_ID};
use std::{collections::VecDeque, path::Path, sync::Arc};
use task::{Shell, shell_to_proto};
use util::ResultExt;
use terminal::terminal_settings::TerminalSettings;
use util::{ResultExt, rel_path::RelPath};
use worktree::Worktree;
use collections::HashMap;
use gpui::{AppContext as _, Context, Entity, EventEmitter, Task};
use gpui::{App, AppContext as _, Context, Entity, EventEmitter, Task, WeakEntity};
use settings::Settings as _;
use crate::{
@@ -23,6 +24,8 @@ pub struct ProjectEnvironment {
remote_environments: HashMap<(Shell, Arc<Path>), Shared<Task<Option<HashMap<String, String>>>>>,
environment_error_messages: VecDeque<String>,
environment_error_messages_tx: mpsc::UnboundedSender<String>,
worktree_store: WeakEntity<WorktreeStore>,
remote_client: Option<WeakEntity<RemoteClient>>,
_tasks: Vec<Task<()>>,
}
@@ -33,7 +36,12 @@ pub enum ProjectEnvironmentEvent {
impl EventEmitter<ProjectEnvironmentEvent> for ProjectEnvironment {}
impl ProjectEnvironment {
pub fn new(cli_environment: Option<HashMap<String, String>>, cx: &mut Context<Self>) -> Self {
pub fn new(
cli_environment: Option<HashMap<String, String>>,
worktree_store: WeakEntity<WorktreeStore>,
remote_client: Option<WeakEntity<RemoteClient>>,
cx: &mut Context<Self>,
) -> Self {
let (tx, mut rx) = mpsc::unbounded();
let task = cx.spawn(async move |this, cx| {
while let Some(message) = rx.next().await {
@@ -50,12 +58,17 @@ impl ProjectEnvironment {
remote_environments: Default::default(),
environment_error_messages: Default::default(),
environment_error_messages_tx: tx,
worktree_store,
remote_client,
_tasks: vec![task],
}
}
/// Returns the inherited CLI environment, if this project was opened from the Zed CLI.
pub(crate) fn get_cli_environment(&self) -> Option<HashMap<String, String>> {
if cfg!(any(test, feature = "test-support")) {
return Some(HashMap::default());
}
if let Some(mut env) = self.cli_environment.clone() {
set_origin_marker(&mut env, EnvironmentOrigin::Cli);
Some(env)
@@ -64,16 +77,12 @@ impl ProjectEnvironment {
}
}
pub(crate) fn get_buffer_environment(
pub fn buffer_environment(
&mut self,
buffer: &Entity<Buffer>,
worktree_store: &Entity<WorktreeStore>,
cx: &mut Context<Self>,
) -> Shared<Task<Option<HashMap<String, String>>>> {
if cfg!(any(test, feature = "test-support")) {
return Task::ready(Some(HashMap::default())).shared();
}
if let Some(cli_environment) = self.get_cli_environment() {
log::debug!("using project environment variables from CLI");
return Task::ready(Some(cli_environment)).shared();
@@ -87,54 +96,105 @@ impl ProjectEnvironment {
else {
return Task::ready(None).shared();
};
self.get_worktree_environment(worktree, cx)
self.worktree_environment(worktree, cx)
}
pub fn get_worktree_environment(
pub fn worktree_environment(
&mut self,
worktree: Entity<Worktree>,
cx: &mut Context<Self>,
cx: &mut App,
) -> Shared<Task<Option<HashMap<String, String>>>> {
if cfg!(any(test, feature = "test-support")) {
return Task::ready(Some(HashMap::default())).shared();
}
if let Some(cli_environment) = self.get_cli_environment() {
log::debug!("using project environment variables from CLI");
return Task::ready(Some(cli_environment)).shared();
}
let mut abs_path = worktree.read(cx).abs_path();
if !worktree.read(cx).is_local() {
log::error!(
"attempted to get project environment for a non-local worktree at {abs_path:?}"
);
return Task::ready(None).shared();
} else if worktree.read(cx).is_single_file() {
let worktree = worktree.read(cx);
let mut abs_path = worktree.abs_path();
if worktree.is_single_file() {
let Some(parent) = abs_path.parent() else {
return Task::ready(None).shared();
};
abs_path = parent.into();
}
self.get_local_directory_environment(&Shell::System, abs_path, cx)
let remote_client = self.remote_client.as_ref().and_then(|it| it.upgrade());
match remote_client {
Some(remote_client) => remote_client.clone().read(cx).shell().map(|shell| {
self.remote_directory_environment(
&Shell::Program(shell),
abs_path,
remote_client,
cx,
)
}),
None => Some({
let shell = TerminalSettings::get(
Some(settings::SettingsLocation {
worktree_id: worktree.id(),
path: RelPath::empty(),
}),
cx,
)
.shell
.clone();
self.local_directory_environment(&shell, abs_path, cx)
}),
}
.unwrap_or_else(|| Task::ready(None).shared())
}
pub fn directory_environment(
&mut self,
abs_path: Arc<Path>,
cx: &mut App,
) -> Shared<Task<Option<HashMap<String, String>>>> {
let remote_client = self.remote_client.as_ref().and_then(|it| it.upgrade());
match remote_client {
Some(remote_client) => remote_client.clone().read(cx).shell().map(|shell| {
self.remote_directory_environment(
&Shell::Program(shell),
abs_path,
remote_client,
cx,
)
}),
None => self
.worktree_store
.read_with(cx, |worktree_store, cx| {
worktree_store.find_worktree(&abs_path, cx)
})
.ok()
.map(|worktree| {
let shell = terminal::terminal_settings::TerminalSettings::get(
worktree
.as_ref()
.map(|(worktree, path)| settings::SettingsLocation {
worktree_id: worktree.read(cx).id(),
path: &path,
}),
cx,
)
.shell
.clone();
self.local_directory_environment(&shell, abs_path, cx)
}),
}
.unwrap_or_else(|| Task::ready(None).shared())
}
/// Returns the project environment, if possible.
/// If the project was opened from the CLI, then the inherited CLI environment is returned.
/// If it wasn't opened from the CLI, and an absolute path is given, then a shell is spawned in
/// that directory, to get environment variables as if the user has `cd`'d there.
pub fn get_local_directory_environment(
pub fn local_directory_environment(
&mut self,
shell: &Shell,
abs_path: Arc<Path>,
cx: &mut Context<Self>,
cx: &mut App,
) -> Shared<Task<Option<HashMap<String, String>>>> {
if cfg!(any(test, feature = "test-support")) {
return Task::ready(Some(HashMap::default())).shared();
}
if let Some(cli_environment) = self.get_cli_environment() {
log::debug!("using project environment variables from CLI");
return Task::ready(Some(cli_environment)).shared();
@@ -146,7 +206,7 @@ impl ProjectEnvironment {
let load_direnv = ProjectSettings::get_global(cx).load_direnv.clone();
let shell = shell.clone();
let tx = self.environment_error_messages_tx.clone();
cx.spawn(async move |_, cx| {
cx.spawn(async move |cx| {
let mut shell_env = cx
.background_spawn(load_directory_shell_environment(
shell,
@@ -178,12 +238,12 @@ impl ProjectEnvironment {
.clone()
}
pub fn get_remote_directory_environment(
pub fn remote_directory_environment(
&mut self,
shell: &Shell,
abs_path: Arc<Path>,
remote_client: Entity<RemoteClient>,
cx: &mut Context<Self>,
cx: &mut App,
) -> Shared<Task<Option<HashMap<String, String>>>> {
if cfg!(any(test, feature = "test-support")) {
return Task::ready(Some(HashMap::default())).shared();
@@ -201,7 +261,7 @@ impl ProjectEnvironment {
shell: Some(shell_to_proto(shell.clone())),
directory: abs_path.to_string_lossy().to_string(),
});
cx.spawn(async move |_, _| {
cx.background_spawn(async move {
let environment = response.await.log_err()?;
Some(environment.environment.into_iter().collect())
})
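The `local_directory_environment` path above ultimately spawns the user's shell in the target directory and reads back the environment it would have there. A minimal, hedged sketch of that idea (assuming a POSIX `sh` and plain `KEY=VALUE` output; multi-line values, login-shell flags, and direnv hooks, which the real implementation deals with, are ignored here):

```rust
use std::collections::HashMap;
use std::path::Path;
use std::process::Command;

// Capture the environment a shell would have after `cd`-ing into `dir`,
// by running the shell there and parsing its `env` output.
fn directory_environment(dir: &Path) -> Option<HashMap<String, String>> {
    let output = Command::new("sh")
        .arg("-c")
        .arg("env")
        .current_dir(dir) // the shell starts in `dir`, as if the user had cd'd there
        .output()
        .ok()?;
    if !output.status.success() {
        return None;
    }
    let stdout = String::from_utf8_lossy(&output.stdout);
    Some(
        stdout
            .lines()
            .filter_map(|line| {
                let (key, value) = line.split_once('=')?;
                Some((key.to_string(), value.to_string()))
            })
            .collect(),
    )
}

fn main() {
    let env = directory_environment(Path::new(".")).expect("shell should run");
    // PATH is present in virtually every shell environment.
    assert!(env.contains_key("PATH"));
    println!("captured {} variables", env.len());
}
```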


@@ -3717,20 +3717,15 @@ impl Repository {
Some(self.git_store.upgrade()?.read(cx).buffer_store.clone())
}
pub fn stage_entries(
fn save_buffers<'a>(
&self,
entries: Vec<RepoPath>,
entries: impl IntoIterator<Item = &'a RepoPath>,
cx: &mut Context<Self>,
) -> Task<anyhow::Result<()>> {
if entries.is_empty() {
return Task::ready(Ok(()));
}
let id = self.id;
) -> Vec<Task<anyhow::Result<()>>> {
let mut save_futures = Vec::new();
if let Some(buffer_store) = self.buffer_store(cx) {
buffer_store.update(cx, |buffer_store, cx| {
for path in &entries {
for path in entries {
let Some(project_path) = self.repo_path_to_project_path(path, cx) else {
continue;
};
@@ -3746,37 +3741,64 @@ impl Repository {
}
})
}
save_futures
}
pub fn stage_entries(
&self,
entries: Vec<RepoPath>,
cx: &mut Context<Self>,
) -> Task<anyhow::Result<()>> {
if entries.is_empty() {
return Task::ready(Ok(()));
}
let id = self.id;
let save_tasks = self.save_buffers(&entries, cx);
let paths = entries
.iter()
.map(|p| p.as_unix_str())
.collect::<Vec<_>>()
.join(" ");
let status = format!("git add {paths}");
let job_key = match entries.len() {
1 => Some(GitJobKey::WriteIndex(entries[0].clone())),
_ => None,
};
cx.spawn(async move |this, cx| {
for save_future in save_futures {
save_future.await?;
for save_task in save_tasks {
save_task.await?;
}
this.update(cx, |this, _| {
this.send_job(None, move |git_repo, _cx| async move {
match git_repo {
RepositoryState::Local {
backend,
environment,
..
} => backend.stage_paths(entries, environment.clone()).await,
RepositoryState::Remote { project_id, client } => {
client
.request(proto::Stage {
project_id: project_id.0,
repository_id: id.to_proto(),
paths: entries
.into_iter()
.map(|repo_path| repo_path.to_proto())
.collect(),
})
.await
.context("sending stage request")?;
this.send_keyed_job(
job_key,
Some(status.into()),
move |git_repo, _cx| async move {
match git_repo {
RepositoryState::Local {
backend,
environment,
..
} => backend.stage_paths(entries, environment.clone()).await,
RepositoryState::Remote { project_id, client } => {
client
.request(proto::Stage {
project_id: project_id.0,
repository_id: id.to_proto(),
paths: entries
.into_iter()
.map(|repo_path| repo_path.to_proto())
.collect(),
})
.await
.context("sending stage request")?;
Ok(())
Ok(())
}
}
}
})
},
)
})?
.await??;
@@ -3793,57 +3815,52 @@ impl Repository {
return Task::ready(Ok(()));
}
let id = self.id;
let mut save_futures = Vec::new();
if let Some(buffer_store) = self.buffer_store(cx) {
buffer_store.update(cx, |buffer_store, cx| {
for path in &entries {
let Some(project_path) = self.repo_path_to_project_path(path, cx) else {
continue;
};
if let Some(buffer) = buffer_store.get_by_path(&project_path)
&& buffer
.read(cx)
.file()
.is_some_and(|file| file.disk_state().exists())
&& buffer.read(cx).has_unsaved_edits()
{
save_futures.push(buffer_store.save_buffer(buffer, cx));
}
}
})
}
let save_tasks = self.save_buffers(&entries, cx);
let paths = entries
.iter()
.map(|p| p.as_unix_str())
.collect::<Vec<_>>()
.join(" ");
let status = format!("git reset {paths}");
let job_key = match entries.len() {
1 => Some(GitJobKey::WriteIndex(entries[0].clone())),
_ => None,
};
cx.spawn(async move |this, cx| {
for save_future in save_futures {
save_future.await?;
for save_task in save_tasks {
save_task.await?;
}
this.update(cx, |this, _| {
this.send_job(None, move |git_repo, _cx| async move {
match git_repo {
RepositoryState::Local {
backend,
environment,
..
} => backend.unstage_paths(entries, environment).await,
RepositoryState::Remote { project_id, client } => {
client
.request(proto::Unstage {
project_id: project_id.0,
repository_id: id.to_proto(),
paths: entries
.into_iter()
.map(|repo_path| repo_path.to_proto())
.collect(),
})
.await
.context("sending unstage request")?;
this.send_keyed_job(
job_key,
Some(status.into()),
move |git_repo, _cx| async move {
match git_repo {
RepositoryState::Local {
backend,
environment,
..
} => backend.unstage_paths(entries, environment).await,
RepositoryState::Remote { project_id, client } => {
client
.request(proto::Unstage {
project_id: project_id.0,
repository_id: id.to_proto(),
paths: entries
.into_iter()
.map(|repo_path| repo_path.to_proto())
.collect(),
})
.await
.context("sending unstage request")?;
Ok(())
Ok(())
}
}
}
})
},
)
})?
.await??;
@@ -4787,7 +4804,7 @@ impl Repository {
.upgrade()
.context("missing project environment")?
.update(cx, |project_environment, cx| {
project_environment.get_local_directory_environment(&Shell::System, work_directory_abs_path.clone(), cx)
project_environment.local_directory_environment(&Shell::System, work_directory_abs_path.clone(), cx)
})?
.await
.unwrap_or_else(|| {


@@ -75,14 +75,14 @@ use language::{
range_from_lsp, range_to_lsp,
};
use lsp::{
AdapterServerCapabilities, CodeActionKind, CompletionContext, DiagnosticServerCapabilities,
DiagnosticSeverity, DiagnosticTag, DidChangeWatchedFilesRegistrationOptions, Edit,
FileOperationFilter, FileOperationPatternKind, FileOperationRegistrationOptions, FileRename,
FileSystemWatcher, LSP_REQUEST_TIMEOUT, LanguageServer, LanguageServerBinary,
LanguageServerBinaryOptions, LanguageServerId, LanguageServerName, LanguageServerSelector,
LspRequestFuture, MessageActionItem, MessageType, OneOf, RenameFilesParams, SymbolKind,
TextDocumentSyncSaveOptions, TextEdit, Uri, WillRenameFiles, WorkDoneProgressCancelParams,
WorkspaceFolder, notification::DidRenameFiles,
AdapterServerCapabilities, CodeActionKind, CompletionContext, CompletionOptions,
DiagnosticServerCapabilities, DiagnosticSeverity, DiagnosticTag,
DidChangeWatchedFilesRegistrationOptions, Edit, FileOperationFilter, FileOperationPatternKind,
FileOperationRegistrationOptions, FileRename, FileSystemWatcher, LSP_REQUEST_TIMEOUT,
LanguageServer, LanguageServerBinary, LanguageServerBinaryOptions, LanguageServerId,
LanguageServerName, LanguageServerSelector, LspRequestFuture, MessageActionItem, MessageType,
OneOf, RenameFilesParams, SymbolKind, TextDocumentSyncSaveOptions, TextEdit, Uri,
WillRenameFiles, WorkDoneProgressCancelParams, WorkspaceFolder, notification::DidRenameFiles,
};
use node_runtime::read_package_installed_version;
use parking_lot::Mutex;
@@ -853,23 +853,32 @@ impl LocalLspStore {
language_server
.on_request::<lsp::request::InlayHintRefreshRequest, _, _>({
let lsp_store = lsp_store.clone();
let request_id = Arc::new(AtomicUsize::new(0));
move |(), cx| {
let this = lsp_store.clone();
let lsp_store = lsp_store.clone();
let request_id = request_id.clone();
let mut cx = cx.clone();
async move {
this.update(&mut cx, |lsp_store, cx| {
cx.emit(LspStoreEvent::RefreshInlayHints(server_id));
lsp_store
.downstream_client
.as_ref()
.map(|(client, project_id)| {
client.send(proto::RefreshInlayHints {
project_id: *project_id,
server_id: server_id.to_proto(),
lsp_store
.update(&mut cx, |lsp_store, cx| {
let request_id =
Some(request_id.fetch_add(1, atomic::Ordering::AcqRel));
cx.emit(LspStoreEvent::RefreshInlayHints {
server_id,
request_id,
});
lsp_store
.downstream_client
.as_ref()
.map(|(client, project_id)| {
client.send(proto::RefreshInlayHints {
project_id: *project_id,
server_id: server_id.to_proto(),
request_id: request_id.map(|id| id as u64),
})
})
})
})?
.transpose()?;
})?
.transpose()?;
Ok(())
}
}
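The `request_id` counter threaded through the handler above boils down to one shared `AtomicUsize`: every refresh draws a strictly increasing id, which later lets stale invalidations be recognized and dropped. A small sketch of that scheme (names are illustrative, not Zed's actual API):

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// Issue monotonically increasing request ids from one shared counter.
fn next_request_id(counter: &AtomicUsize) -> usize {
    // fetch_add returns the value *before* the increment, so the first
    // caller observes 0, the next 1, and so on.
    counter.fetch_add(1, Ordering::AcqRel)
}

fn main() {
    let counter = AtomicUsize::new(0);
    let ids: Vec<usize> = (0..3).map(|_| next_request_id(&counter)).collect();
    assert_eq!(ids, vec![0, 1, 2]);
    println!("issued ids: {ids:?}");
}
```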
@@ -3659,7 +3668,10 @@ pub enum LspStoreEvent {
new_language: Option<Arc<Language>>,
},
Notification(String),
RefreshInlayHints(LanguageServerId),
RefreshInlayHints {
server_id: LanguageServerId,
request_id: Option<usize>,
},
RefreshCodeLens,
DiagnosticsUpdated {
server_id: LanguageServerId,
@@ -5329,8 +5341,8 @@ impl LspStore {
request.to_proto(project_id, buffer.read(cx)),
);
let buffer = buffer.clone();
cx.spawn(async move |weak_project, cx| {
let Some(project) = weak_project.upgrade() else {
cx.spawn(async move |weak_lsp_store, cx| {
let Some(lsp_store) = weak_lsp_store.upgrade() else {
return Ok(None);
};
let Some(responses) = request_task.await? else {
@@ -5339,7 +5351,7 @@ impl LspStore {
let actions = join_all(responses.payload.into_iter().map(|response| {
GetDefinitions { position }.response_from_proto(
response.response,
project.clone(),
lsp_store.clone(),
buffer.clone(),
cx.clone(),
)
@@ -5395,8 +5407,8 @@ impl LspStore {
request.to_proto(project_id, buffer.read(cx)),
);
let buffer = buffer.clone();
cx.spawn(async move |weak_project, cx| {
let Some(project) = weak_project.upgrade() else {
cx.spawn(async move |weak_lsp_store, cx| {
let Some(lsp_store) = weak_lsp_store.upgrade() else {
return Ok(None);
};
let Some(responses) = request_task.await? else {
@@ -5405,7 +5417,7 @@ impl LspStore {
let actions = join_all(responses.payload.into_iter().map(|response| {
GetDeclarations { position }.response_from_proto(
response.response,
project.clone(),
lsp_store.clone(),
buffer.clone(),
cx.clone(),
)
@@ -5461,8 +5473,8 @@ impl LspStore {
request.to_proto(project_id, buffer.read(cx)),
);
let buffer = buffer.clone();
cx.spawn(async move |weak_project, cx| {
let Some(project) = weak_project.upgrade() else {
cx.spawn(async move |weak_lsp_store, cx| {
let Some(lsp_store) = weak_lsp_store.upgrade() else {
return Ok(None);
};
let Some(responses) = request_task.await? else {
@@ -5471,7 +5483,7 @@ impl LspStore {
let actions = join_all(responses.payload.into_iter().map(|response| {
GetTypeDefinitions { position }.response_from_proto(
response.response,
project.clone(),
lsp_store.clone(),
buffer.clone(),
cx.clone(),
)
@@ -5527,8 +5539,8 @@ impl LspStore {
request.to_proto(project_id, buffer.read(cx)),
);
let buffer = buffer.clone();
cx.spawn(async move |weak_project, cx| {
let Some(project) = weak_project.upgrade() else {
cx.spawn(async move |weak_lsp_store, cx| {
let Some(lsp_store) = weak_lsp_store.upgrade() else {
return Ok(None);
};
let Some(responses) = request_task.await? else {
@@ -5537,7 +5549,7 @@ impl LspStore {
let actions = join_all(responses.payload.into_iter().map(|response| {
GetImplementations { position }.response_from_proto(
response.response,
project.clone(),
lsp_store.clone(),
buffer.clone(),
cx.clone(),
)
@@ -5594,8 +5606,8 @@ impl LspStore {
request.to_proto(project_id, buffer.read(cx)),
);
let buffer = buffer.clone();
cx.spawn(async move |weak_project, cx| {
let Some(project) = weak_project.upgrade() else {
cx.spawn(async move |weak_lsp_store, cx| {
let Some(lsp_store) = weak_lsp_store.upgrade() else {
return Ok(None);
};
let Some(responses) = request_task.await? else {
@@ -5605,7 +5617,7 @@ impl LspStore {
let locations = join_all(responses.payload.into_iter().map(|lsp_response| {
GetReferences { position }.response_from_proto(
lsp_response.response,
project.clone(),
lsp_store.clone(),
buffer.clone(),
cx.clone(),
)
@@ -5662,8 +5674,8 @@ impl LspStore {
request.to_proto(project_id, buffer.read(cx)),
);
let buffer = buffer.clone();
cx.spawn(async move |weak_project, cx| {
let Some(project) = weak_project.upgrade() else {
cx.spawn(async move |weak_lsp_store, cx| {
let Some(lsp_store) = weak_lsp_store.upgrade() else {
return Ok(None);
};
let Some(responses) = request_task.await? else {
@@ -5676,7 +5688,7 @@ impl LspStore {
}
.response_from_proto(
response.response,
project.clone(),
lsp_store.clone(),
buffer.clone(),
cx.clone(),
)
@@ -6636,14 +6648,22 @@ impl LspStore {
cx: &mut Context<Self>,
) -> HashMap<Range<BufferRow>, Task<Result<CacheInlayHints>>> {
let buffer_snapshot = buffer.read(cx).snapshot();
let for_server = if let InvalidationStrategy::RefreshRequested(server_id) = invalidate {
let next_hint_id = self.next_hint_id.clone();
let lsp_data = self.latest_lsp_data(&buffer, cx);
let mut lsp_refresh_requested = false;
let for_server = if let InvalidationStrategy::RefreshRequested {
server_id,
request_id,
} = invalidate
{
let invalidated = lsp_data
.inlay_hints
.invalidate_for_server_refresh(server_id, request_id);
lsp_refresh_requested = invalidated;
Some(server_id)
} else {
None
};
let invalidate_cache = invalidate.should_invalidate();
let next_hint_id = self.next_hint_id.clone();
let lsp_data = self.latest_lsp_data(&buffer, cx);
let existing_inlay_hints = &mut lsp_data.inlay_hints;
let known_chunks = known_chunks
.filter(|(known_version, _)| !lsp_data.buffer_version.changed_since(known_version))
@@ -6651,8 +6671,8 @@ impl LspStore {
.unwrap_or_default();
let mut hint_fetch_tasks = Vec::new();
let mut cached_inlay_hints = HashMap::default();
let mut ranges_to_query = Vec::new();
let mut cached_inlay_hints = None;
let mut ranges_to_query = None;
let applicable_chunks = existing_inlay_hints
.applicable_chunks(ranges.as_slice())
.filter(|chunk| !known_chunks.contains(&(chunk.start..chunk.end)))
@@ -6667,12 +6687,12 @@ impl LspStore {
match (
existing_inlay_hints
.cached_hints(&row_chunk)
.filter(|_| !invalidate_cache)
.filter(|_| !lsp_refresh_requested)
.cloned(),
existing_inlay_hints
.fetched_hints(&row_chunk)
.as_ref()
.filter(|_| !invalidate_cache)
.filter(|_| !lsp_refresh_requested)
.cloned(),
) {
(None, None) => {
@@ -6681,19 +6701,18 @@ impl LspStore {
} else {
Point::new(row_chunk.end, 0)
};
ranges_to_query.push((
ranges_to_query.get_or_insert_with(Vec::new).push((
row_chunk,
buffer_snapshot.anchor_before(Point::new(row_chunk.start, 0))
..buffer_snapshot.anchor_after(end),
));
}
(None, Some(fetched_hints)) => {
hint_fetch_tasks.push((row_chunk, fetched_hints.clone()))
}
(None, Some(fetched_hints)) => hint_fetch_tasks.push((row_chunk, fetched_hints)),
(Some(cached_hints), None) => {
for (server_id, cached_hints) in cached_hints {
if for_server.is_none_or(|for_server| for_server == server_id) {
cached_inlay_hints
.get_or_insert_with(HashMap::default)
.entry(row_chunk.start..row_chunk.end)
.or_insert_with(HashMap::default)
.entry(server_id)
@@ -6703,10 +6722,11 @@ impl LspStore {
}
}
(Some(cached_hints), Some(fetched_hints)) => {
hint_fetch_tasks.push((row_chunk, fetched_hints.clone()));
hint_fetch_tasks.push((row_chunk, fetched_hints));
for (server_id, cached_hints) in cached_hints {
if for_server.is_none_or(|for_server| for_server == server_id) {
cached_inlay_hints
.get_or_insert_with(HashMap::default)
.entry(row_chunk.start..row_chunk.end)
.or_insert_with(HashMap::default)
.entry(server_id)
@@ -6718,18 +6738,18 @@ impl LspStore {
}
}
let cached_chunk_data = cached_inlay_hints
.into_iter()
.map(|(row_chunk, hints)| (row_chunk, Task::ready(Ok(hints))))
.collect();
if hint_fetch_tasks.is_empty() && ranges_to_query.is_empty() {
cached_chunk_data
if hint_fetch_tasks.is_empty()
&& ranges_to_query
.as_ref()
.is_none_or(|ranges| ranges.is_empty())
&& let Some(cached_inlay_hints) = cached_inlay_hints
{
cached_inlay_hints
.into_iter()
.map(|(row_chunk, hints)| (row_chunk, Task::ready(Ok(hints))))
.collect()
} else {
if invalidate_cache {
lsp_data.inlay_hints.clear();
}
for (chunk, range_to_query) in ranges_to_query {
for (chunk, range_to_query) in ranges_to_query.into_iter().flatten() {
let next_hint_id = next_hint_id.clone();
let buffer = buffer.clone();
let new_inlay_hints = cx
@@ -6745,31 +6765,38 @@ impl LspStore {
let update_cache = !lsp_data
.buffer_version
.changed_since(&buffer.read(cx).version());
new_hints_by_server
.into_iter()
.map(|(server_id, new_hints)| {
let new_hints = new_hints
.into_iter()
.map(|new_hint| {
(
InlayId::Hint(next_hint_id.fetch_add(
1,
atomic::Ordering::AcqRel,
)),
new_hint,
)
})
.collect::<Vec<_>>();
if update_cache {
lsp_data.inlay_hints.insert_new_hints(
chunk,
server_id,
new_hints.clone(),
);
}
(server_id, new_hints)
})
.collect()
if new_hints_by_server.is_empty() {
if update_cache {
lsp_data.inlay_hints.invalidate_for_chunk(chunk);
}
HashMap::default()
} else {
new_hints_by_server
.into_iter()
.map(|(server_id, new_hints)| {
let new_hints = new_hints
.into_iter()
.map(|new_hint| {
(
InlayId::Hint(next_hint_id.fetch_add(
1,
atomic::Ordering::AcqRel,
)),
new_hint,
)
})
.collect::<Vec<_>>();
if update_cache {
lsp_data.inlay_hints.insert_new_hints(
chunk,
server_id,
new_hints.clone(),
);
}
(server_id, new_hints)
})
.collect()
}
})
})
.map_err(Arc::new)
@@ -6781,22 +6808,25 @@ impl LspStore {
hint_fetch_tasks.push((chunk, new_inlay_hints));
}
let mut combined_data = cached_chunk_data;
combined_data.extend(hint_fetch_tasks.into_iter().map(|(chunk, hints_fetch)| {
(
chunk.start..chunk.end,
cx.spawn(async move |_, _| {
hints_fetch.await.map_err(|e| {
if e.error_code() != ErrorCode::Internal {
anyhow!(e.error_code())
} else {
anyhow!("{e:#}")
}
})
}),
)
}));
combined_data
cached_inlay_hints
.unwrap_or_default()
.into_iter()
.map(|(row_chunk, hints)| (row_chunk, Task::ready(Ok(hints))))
.chain(hint_fetch_tasks.into_iter().map(|(chunk, hints_fetch)| {
(
chunk.start..chunk.end,
cx.spawn(async move |_, _| {
hints_fetch.await.map_err(|e| {
if e.error_code() != ErrorCode::Internal {
anyhow!(e.error_code())
} else {
anyhow!("{e:#}")
}
})
}),
)
}))
.collect()
}
}
@@ -7157,7 +7187,7 @@ impl LspStore {
);
let buffer = buffer.clone();
cx.spawn(async move |lsp_store, cx| {
let Some(project) = lsp_store.upgrade() else {
let Some(lsp_store) = lsp_store.upgrade() else {
return Ok(None);
};
let colors = join_all(
@@ -7171,7 +7201,7 @@ impl LspStore {
.map(|color_response| {
let response = request.response_from_proto(
color_response.response,
project.clone(),
lsp_store.clone(),
buffer.clone(),
cx.clone(),
);
@@ -7235,8 +7265,8 @@ impl LspStore {
request.to_proto(upstream_project_id, buffer.read(cx)),
);
let buffer = buffer.clone();
cx.spawn(async move |weak_project, cx| {
let project = weak_project.upgrade()?;
cx.spawn(async move |weak_lsp_store, cx| {
let lsp_store = weak_lsp_store.upgrade()?;
let signatures = join_all(
request_task
.await
@@ -7248,7 +7278,7 @@ impl LspStore {
.map(|response| {
let response = GetSignatureHelp { position }.response_from_proto(
response.response,
project.clone(),
lsp_store.clone(),
buffer.clone(),
cx.clone(),
);
@@ -7299,8 +7329,8 @@ impl LspStore {
request.to_proto(upstream_project_id, buffer.read(cx)),
);
let buffer = buffer.clone();
cx.spawn(async move |weak_project, cx| {
let project = weak_project.upgrade()?;
cx.spawn(async move |weak_lsp_store, cx| {
let lsp_store = weak_lsp_store.upgrade()?;
let hovers = join_all(
request_task
.await
@@ -7312,7 +7342,7 @@ impl LspStore {
.map(|response| {
let response = GetHover { position }.response_from_proto(
response.response,
project.clone(),
lsp_store.clone(),
buffer.clone(),
cx.clone(),
);
@@ -9604,7 +9634,10 @@ impl LspStore {
if let Some(work) = status.pending_work.remove(&token)
&& !work.is_disk_based_diagnostics_progress
{
cx.emit(LspStoreEvent::RefreshInlayHints(language_server_id));
cx.emit(LspStoreEvent::RefreshInlayHints {
server_id: language_server_id,
request_id: None,
});
}
cx.notify();
}
@@ -9743,9 +9776,10 @@ impl LspStore {
mut cx: AsyncApp,
) -> Result<proto::Ack> {
lsp_store.update(&mut cx, |_, cx| {
cx.emit(LspStoreEvent::RefreshInlayHints(
LanguageServerId::from_proto(envelope.payload.server_id),
));
cx.emit(LspStoreEvent::RefreshInlayHints {
server_id: LanguageServerId::from_proto(envelope.payload.server_id),
request_id: envelope.payload.request_id.map(|id| id as usize),
});
})?;
Ok(proto::Ack {})
}
@@ -10130,7 +10164,7 @@ impl LspStore {
) -> Shared<Task<Option<HashMap<String, String>>>> {
if let Some(environment) = &self.as_local().map(|local| local.environment.clone()) {
environment.update(cx, |env, cx| {
env.get_buffer_environment(buffer, &self.worktree_store, cx)
env.buffer_environment(buffer, &self.worktree_store, cx)
})
} else {
Task::ready(None).shared()
@@ -10972,7 +11006,6 @@ impl LspStore {
language_server.name(),
Some(key.worktree_id),
));
cx.emit(LspStoreEvent::RefreshInlayHints(server_id));
let server_capabilities = language_server.capabilities();
if let Some((downstream_client, project_id)) = self.downstream_client.as_ref() {
@@ -11898,12 +11931,38 @@ impl LspStore {
"textDocument/completion" => {
if let Some(caps) = reg
.register_options
.map(serde_json::from_value)
.map(serde_json::from_value::<CompletionOptions>)
.transpose()?
{
server.update_capabilities(|capabilities| {
capabilities.completion_provider = Some(caps);
capabilities.completion_provider = Some(caps.clone());
});
if let Some(local) = self.as_local() {
let mut buffers_with_language_server = Vec::new();
for handle in self.buffer_store.read(cx).buffers() {
let buffer_id = handle.read(cx).remote_id();
if local
.buffers_opened_in_servers
.get(&buffer_id)
.filter(|s| s.contains(&server_id))
.is_some()
{
buffers_with_language_server.push(handle);
}
}
let triggers = caps
.trigger_characters
.unwrap_or_default()
.into_iter()
.collect::<BTreeSet<_>>();
for handle in buffers_with_language_server {
let triggers = triggers.clone();
let _ = handle.update(cx, move |buffer, cx| {
buffer.set_completion_triggers(server_id, triggers, cx);
});
}
}
notify_server_capabilities_updated(&server, cx);
}
}
@@ -12890,7 +12949,7 @@ impl LanguageServerWatchedPathsBuilder {
language_server_id: LanguageServerId,
cx: &mut Context<LspStore>,
) -> LanguageServerWatchedPaths {
let project = cx.weak_entity();
let lsp_store = cx.weak_entity();
const LSP_ABS_PATH_OBSERVE: Duration = Duration::from_millis(100);
let abs_paths = self
@@ -12901,7 +12960,7 @@ impl LanguageServerWatchedPathsBuilder {
let abs_path = abs_path.clone();
let fs = fs.clone();
let lsp_store = project.clone();
let lsp_store = lsp_store.clone();
async move |_, cx| {
maybe!(async move {
let mut push_updates = fs.watch(&abs_path, LSP_ABS_PATH_OBSERVE).await;
@@ -13369,9 +13428,8 @@ impl LocalLspAdapterDelegate {
fs: Arc<dyn Fs>,
cx: &mut App,
) -> Arc<Self> {
let load_shell_env_task = environment.update(cx, |env, cx| {
env.get_worktree_environment(worktree.clone(), cx)
});
let load_shell_env_task =
environment.update(cx, |env, cx| env.worktree_environment(worktree.clone(), cx));
Arc::new(Self {
lsp_store,


@@ -19,7 +19,10 @@ pub enum InvalidationStrategy {
/// Demands that all needed inlay hints be re-queried and all cached entries invalidated, but does not require the invalidation to produce an instant update.
///
/// Although nothing forbids a language server from sending this request on every edit, it is expected to be sent only on certain internal server state updates that are otherwise invisible to the editor.
RefreshRequested(LanguageServerId),
RefreshRequested {
server_id: LanguageServerId,
request_id: Option<usize>,
},
/// Multibuffer excerpt(s) and/or singleton buffer(s) were edited in at least one place.
/// Neither the editor nor the LSP can tell which open files' hints are unaffected, so all of them have to be invalidated and re-queried: fast enough to not feel slow, but debounced to avoid reloading hints on every rapid keystroke.
BufferEdited,
@@ -36,7 +39,7 @@ impl InvalidationStrategy {
pub fn should_invalidate(&self) -> bool {
matches!(
self,
InvalidationStrategy::RefreshRequested(_) | InvalidationStrategy::BufferEdited
InvalidationStrategy::RefreshRequested { .. } | InvalidationStrategy::BufferEdited
)
}
}
@@ -47,6 +50,7 @@ pub struct BufferInlayHints {
hints_by_chunks: Vec<Option<CacheInlayHints>>,
fetches_by_chunks: Vec<Option<CacheInlayHintsTask>>,
hints_by_id: HashMap<InlayId, HintForId>,
latest_invalidation_requests: HashMap<LanguageServerId, Option<usize>>,
pub(super) hint_resolves: HashMap<InlayId, Shared<Task<()>>>,
}
@@ -104,6 +108,7 @@ impl BufferInlayHints {
Self {
hints_by_chunks: vec![None; buffer_chunks.len()],
fetches_by_chunks: vec![None; buffer_chunks.len()],
latest_invalidation_requests: HashMap::default(),
hints_by_id: HashMap::default(),
hint_resolves: HashMap::default(),
snapshot,
@@ -176,6 +181,7 @@ impl BufferInlayHints {
self.fetches_by_chunks = vec![None; self.buffer_chunks.len()];
self.hints_by_id.clear();
self.hint_resolves.clear();
self.latest_invalidation_requests.clear();
}
pub fn insert_new_hints(
@@ -222,4 +228,48 @@ impl BufferInlayHints {
pub fn buffer_chunks_len(&self) -> usize {
self.buffer_chunks.len()
}
pub(crate) fn invalidate_for_server_refresh(
&mut self,
for_server: LanguageServerId,
request_id: Option<usize>,
) -> bool {
match self.latest_invalidation_requests.entry(for_server) {
hash_map::Entry::Occupied(mut o) => {
if request_id > *o.get() {
o.insert(request_id);
} else {
return false;
}
}
hash_map::Entry::Vacant(v) => {
v.insert(request_id);
}
}
for (chunk_id, chunk_data) in self.hints_by_chunks.iter_mut().enumerate() {
if let Some(removed_hints) = chunk_data
.as_mut()
.and_then(|chunk_data| chunk_data.remove(&for_server))
{
for (id, _) in removed_hints {
self.hints_by_id.remove(&id);
self.hint_resolves.remove(&id);
}
self.fetches_by_chunks[chunk_id] = None;
}
}
true
}
pub(crate) fn invalidate_for_chunk(&mut self, chunk: BufferChunk) {
self.fetches_by_chunks[chunk.id] = None;
if let Some(hints_by_server) = self.hints_by_chunks[chunk.id].take() {
for (hint_id, _) in hints_by_server.into_values().flatten() {
self.hints_by_id.remove(&hint_id);
self.hint_resolves.remove(&hint_id);
}
}
}
}
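The `invalidate_for_server_refresh` bookkeeping above leans on `Option<usize>`'s derived ordering (`None < Some(0) < Some(1) < …`) to decide whether an incoming refresh supersedes the last one recorded for a server, so an id-less refresh never clobbers a numbered one. A self-contained sketch of that comparison (illustrative names, not Zed's actual types):

```rust
use std::collections::HashMap;
use std::collections::hash_map::Entry;

// Remember the latest refresh request id seen per server; invalidate only
// when a strictly newer id arrives (or when the server is seen first).
fn should_invalidate(
    latest: &mut HashMap<u64, Option<usize>>,
    server: u64,
    request_id: Option<usize>,
) -> bool {
    match latest.entry(server) {
        Entry::Occupied(mut o) => {
            if request_id > *o.get() {
                o.insert(request_id);
                true
            } else {
                false
            }
        }
        Entry::Vacant(v) => {
            v.insert(request_id);
            true
        }
    }
}

fn main() {
    let mut latest = HashMap::new();
    assert!(should_invalidate(&mut latest, 1, Some(0))); // first request for this server
    assert!(should_invalidate(&mut latest, 1, Some(1))); // strictly newer id
    assert!(!should_invalidate(&mut latest, 1, Some(1))); // duplicate is dropped
    assert!(!should_invalidate(&mut latest, 1, None)); // None < Some(1): stale
    println!("latest: {latest:?}");
}
```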


@@ -33,7 +33,6 @@ pub mod search_history;
mod yarn;
use dap::inline_value::{InlineValueLocation, VariableLookupKind, VariableScope};
use task::Shell;
use crate::{
agent_server_store::AllAgentServersSettings,
@@ -68,7 +67,7 @@ use futures::future::join_all;
use futures::{
StreamExt,
channel::mpsc::{self, UnboundedReceiver},
future::{Shared, try_join_all},
future::try_join_all,
};
pub use image_store::{ImageItem, ImageStore};
use image_store::{ImageItemEvent, ImageStoreEvent};
@@ -337,7 +336,10 @@ pub enum Event {
HostReshared,
Reshared,
Rejoined,
RefreshInlayHints(LanguageServerId),
RefreshInlayHints {
server_id: LanguageServerId,
request_id: Option<usize>,
},
RefreshCodeLens,
RevealInProjectPanel(ProjectEntryId),
SnippetEdit(BufferId, Vec<(lsp::Range, Snippet)>),
@@ -1070,9 +1072,10 @@ impl Project {
let weak_self = cx.weak_entity();
let context_server_store =
cx.new(|cx| ContextServerStore::new(worktree_store.clone(), weak_self, cx));
cx.new(|cx| ContextServerStore::new(worktree_store.clone(), weak_self.clone(), cx));
let environment = cx.new(|cx| ProjectEnvironment::new(env, cx));
let environment =
cx.new(|cx| ProjectEnvironment::new(env, worktree_store.downgrade(), None, cx));
let manifest_tree = ManifestTree::new(worktree_store.clone(), cx);
let toolchain_store = cx.new(|cx| {
ToolchainStore::local(
@@ -1261,7 +1264,7 @@ impl Project {
let weak_self = cx.weak_entity();
let context_server_store =
cx.new(|cx| ContextServerStore::new(worktree_store.clone(), weak_self, cx));
cx.new(|cx| ContextServerStore::new(worktree_store.clone(), weak_self.clone(), cx));
let buffer_store = cx.new(|cx| {
BufferStore::remote(
@@ -1307,7 +1310,14 @@ impl Project {
cx.subscribe(&settings_observer, Self::on_settings_observer_event)
.detach();
let environment = cx.new(|cx| ProjectEnvironment::new(None, cx));
let environment = cx.new(|cx| {
ProjectEnvironment::new(
None,
worktree_store.downgrade(),
Some(remote.downgrade()),
cx,
)
});
let lsp_store = cx.new(|cx| {
LspStore::new_remote(
@@ -1520,8 +1530,8 @@ impl Project {
ImageStore::remote(worktree_store.clone(), client.clone().into(), remote_id, cx)
})?;
let environment = cx.new(|cx| ProjectEnvironment::new(None, cx))?;
let environment =
cx.new(|cx| ProjectEnvironment::new(None, worktree_store.downgrade(), None, cx))?;
let breakpoint_store =
cx.new(|_| BreakpointStore::remote(remote_id, client.clone().into()))?;
let dap_store = cx.new(|cx| {
@@ -1925,32 +1935,6 @@ impl Project {
self.environment.read(cx).get_cli_environment()
}
pub fn buffer_environment<'a>(
&'a self,
buffer: &Entity<Buffer>,
worktree_store: &Entity<WorktreeStore>,
cx: &'a mut App,
) -> Shared<Task<Option<HashMap<String, String>>>> {
self.environment.update(cx, |environment, cx| {
environment.get_buffer_environment(buffer, worktree_store, cx)
})
}
pub fn directory_environment(
&self,
shell: &Shell,
abs_path: Arc<Path>,
cx: &mut App,
) -> Shared<Task<Option<HashMap<String, String>>>> {
self.environment.update(cx, |environment, cx| {
if let Some(remote_client) = self.remote_client.clone() {
environment.get_remote_directory_environment(shell, abs_path, remote_client, cx)
} else {
environment.get_local_directory_environment(shell, abs_path, cx)
}
})
}
#[inline]
pub fn peek_environment_error<'a>(&'a self, cx: &'a App) -> Option<&'a String> {
self.environment.read(cx).peek_environment_error()
@@ -3076,9 +3060,13 @@ impl Project {
return;
};
}
LspStoreEvent::RefreshInlayHints(server_id) => {
cx.emit(Event::RefreshInlayHints(*server_id))
}
LspStoreEvent::RefreshInlayHints {
server_id,
request_id,
} => cx.emit(Event::RefreshInlayHints {
server_id: *server_id,
request_id: *request_id,
}),
LspStoreEvent::RefreshCodeLens => cx.emit(Event::RefreshCodeLens),
LspStoreEvent::LanguageServerPrompt(prompt) => {
cx.emit(Event::LanguageServerPrompt(prompt.clone()))

View File

@@ -1815,10 +1815,6 @@ async fn test_disk_based_diagnostics_progress(cx: &mut gpui::TestAppContext) {
fake_server
.start_progress(format!("{}/0", progress_token))
.await;
assert_eq!(
events.next().await.unwrap(),
Event::RefreshInlayHints(fake_server.server.server_id())
);
assert_eq!(
events.next().await.unwrap(),
Event::DiskBasedDiagnosticsStarted {
@@ -1957,10 +1953,6 @@ async fn test_restarting_server_with_diagnostics_running(cx: &mut gpui::TestAppC
Some(worktree_id)
)
);
assert_eq!(
events.next().await.unwrap(),
Event::RefreshInlayHints(fake_server.server.server_id())
);
fake_server.start_progress(progress_token).await;
assert_eq!(
events.next().await.unwrap(),

View File

@@ -317,7 +317,7 @@ fn local_task_context_for_location(
cx.spawn(async move |cx| {
let project_env = environment
.update(cx, |environment, cx| {
environment.get_buffer_environment(&location.buffer, &worktree_store, cx)
environment.buffer_environment(&location.buffer, &worktree_store, cx)
})
.ok()?
.await;

View File

@@ -8,7 +8,6 @@ use remote::RemoteClient;
use settings::{Settings, SettingsLocation};
use smol::channel::bounded;
use std::{
borrow::Cow,
path::{Path, PathBuf},
sync::Arc,
};
@@ -122,6 +121,7 @@ impl Project {
let lang_registry = self.languages.clone();
cx.spawn(async move |project, cx| {
let shell_kind = ShellKind::new(&shell, is_windows);
let activation_script = maybe!(async {
for toolchain in toolchains {
let Some(toolchain) = toolchain.await else {
@@ -143,14 +143,8 @@ impl Project {
.update(cx, move |_, cx| {
let format_to_run = || {
if let Some(command) = &spawn_task.command {
let mut command: Option<Cow<str>> = shell_kind.try_quote(command);
if let Some(command) = &mut command
&& command.starts_with('"')
&& let Some(prefix) = shell_kind.command_prefix()
{
*command = Cow::Owned(format!("{prefix}{command}"));
}
let command = shell_kind.prepend_command_prefix(command);
let command = shell_kind.try_quote_prefix_aware(&command);
let args = spawn_task
.args
.iter()
@@ -172,12 +166,13 @@ impl Project {
let activation_script =
activation_script.join(&format!("{separator} "));
let to_run = format_to_run();
let arg = format!("{activation_script}{separator} {to_run}");
let args = shell_kind.args_for_shell(false, arg);
let shell = remote_client
.read(cx)
.shell()
.unwrap_or_else(get_default_system_shell);
let arg = format!("{activation_script}{separator} {to_run}");
let args = shell_kind.args_for_shell(false, arg);
create_remote_shell(
Some((&shell, &args)),

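The prefix-aware quoting that replaces the manual `starts_with('"')` check above can be sketched in isolation. This is a simplified model, not Zed's actual `ShellKind`: the whitespace-based single-quote fallback stands in for real shell quoting, `^` plays the Nushell command prefix, and the pre-quoted-literal handling from the real `try_quote_prefix_aware` is omitted.

```rust
// Simplified stand-in for shell quoting: wrap in single quotes when the
// argument contains whitespace (real code uses shlex-style quoting).
fn try_quote(arg: &str) -> Option<String> {
    if arg.chars().any(char::is_whitespace) {
        Some(format!("'{arg}'"))
    } else {
        Some(arg.to_string())
    }
}

// Add the command prefix unless it is already present.
fn prepend_command_prefix(command: &str, prefix: char) -> String {
    if command.starts_with(prefix) {
        command.to_string()
    } else {
        format!("{prefix}{command}")
    }
}

// Quote the command body, but keep the prefix outside the quotes so the
// shell still treats the result as an external-command invocation.
fn try_quote_prefix_aware(arg: &str, prefix: char) -> Option<String> {
    if let Some(rest) = arg.strip_prefix(prefix) {
        return try_quote(rest).map(|q| prepend_command_prefix(&q, prefix));
    }
    try_quote(arg)
}

fn main() {
    // A prefixed command without spaces passes through unchanged.
    assert_eq!(try_quote_prefix_aware("^uname", '^').unwrap(), "^uname");
    // A prefixed command with spaces is quoted, prefix left outside.
    assert_eq!(try_quote_prefix_aware("^my tool", '^').unwrap(), "^'my tool'");
    // An unprefixed argument falls back to plain quoting.
    assert_eq!(try_quote_prefix_aware("plain", '^').unwrap(), "plain");
}
```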
View File

@@ -527,7 +527,7 @@ impl LocalToolchainStore {
let project_env = environment
.update(cx, |environment, cx| {
environment.get_local_directory_environment(
environment.local_directory_environment(
&Shell::System,
abs_path.as_path().into(),
cx,
@@ -590,7 +590,7 @@ impl LocalToolchainStore {
let project_env = environment
.update(cx, |environment, cx| {
environment.get_local_directory_environment(
environment.local_directory_environment(
&Shell::System,
path.as_path().into(),
cx,

View File

@@ -466,6 +466,7 @@ message ResolveInlayHintResponse {
message RefreshInlayHints {
uint64 project_id = 1;
uint64 server_id = 2;
optional uint64 request_id = 3;
}
message CodeLens {

View File

@@ -574,6 +574,7 @@ pub async fn open_remote_project(
open_options: workspace::OpenOptions,
cx: &mut AsyncApp,
) -> Result<()> {
let created_new_window = open_options.replace_window.is_none();
let window = if let Some(window) = open_options.replace_window {
window
} else {
@@ -648,7 +649,45 @@ pub async fn open_remote_project(
let Some(delegate) = delegate else { break };
let remote_connection =
remote::connect(connection_options.clone(), delegate.clone(), cx).await?;
match remote::connect(connection_options.clone(), delegate.clone(), cx).await {
Ok(connection) => connection,
Err(e) => {
window
.update(cx, |workspace, _, cx| {
if let Some(ui) = workspace.active_modal::<RemoteConnectionModal>(cx) {
ui.update(cx, |modal, cx| modal.finished(cx))
}
})
.ok();
log::error!("Failed to open project: {e:?}");
let response = window
.update(cx, |_, window, cx| {
window.prompt(
PromptLevel::Critical,
match connection_options {
RemoteConnectionOptions::Ssh(_) => "Failed to connect over SSH",
RemoteConnectionOptions::Wsl(_) => "Failed to connect to WSL",
},
Some(&e.to_string()),
&["Retry", "Cancel"],
cx,
)
})?
.await;
if response == Ok(0) {
continue;
}
if created_new_window {
window
.update(cx, |_, window, _| window.remove_window())
.ok();
}
break;
}
};
let (paths, paths_with_positions) =
determine_paths_with_positions(&remote_connection, paths.clone()).await;
@@ -686,7 +725,7 @@ pub async fn open_remote_project(
RemoteConnectionOptions::Wsl(_) => "Failed to connect to WSL",
},
Some(&e.to_string()),
&["Retry", "Ok"],
&["Retry", "Cancel"],
cx,
)
})?
@@ -694,7 +733,14 @@ pub async fn open_remote_project(
if response == Ok(0) {
continue;
}
if created_new_window {
window
.update(cx, |_, window, _| window.remove_window())
.ok();
}
}
Ok(items) => {
for (item, path) in items.into_iter().zip(paths_with_positions) {
let Some(item) = item else {

View File

@@ -39,6 +39,7 @@ pub(crate) struct SshRemoteConnection {
ssh_platform: RemotePlatform,
ssh_path_style: PathStyle,
ssh_shell: String,
ssh_shell_kind: ShellKind,
ssh_default_system_shell: String,
_temp_dir: TempDir,
}
@@ -241,6 +242,7 @@ impl RemoteConnection for SshRemoteConnection {
let Self {
ssh_path_style,
socket,
ssh_shell_kind,
ssh_shell,
..
} = self;
@@ -254,6 +256,7 @@ impl RemoteConnection for SshRemoteConnection {
env,
*ssh_path_style,
ssh_shell,
*ssh_shell_kind,
socket.ssh_args(),
)
}
@@ -367,7 +370,7 @@ impl RemoteConnection for SshRemoteConnection {
let ssh_proxy_process = match self
.socket
.ssh_command("env", &proxy_args)
.ssh_command(self.ssh_shell_kind, "env", &proxy_args)
// IMPORTANT: we kill this process when we drop the task that uses it.
.kill_on_drop(true)
.spawn()
@@ -490,6 +493,13 @@ impl SshRemoteConnection {
_ => PathStyle::Posix,
};
let ssh_default_system_shell = String::from("/bin/sh");
let ssh_shell_kind = ShellKind::new(
&ssh_shell,
ssh_platform.os == "windows",
);
let mut this = Self {
socket,
@@ -499,6 +509,7 @@ impl SshRemoteConnection {
ssh_path_style,
ssh_platform,
ssh_shell,
ssh_shell_kind,
ssh_default_system_shell,
};
@@ -563,7 +574,11 @@ impl SshRemoteConnection {
if self
.socket
.run_command(&dst_path.display(self.path_style()), &["version"])
.run_command(
self.ssh_shell_kind,
&dst_path.display(self.path_style()),
&["version"],
)
.await
.is_ok()
{
@@ -632,7 +647,11 @@ impl SshRemoteConnection {
) -> Result<()> {
if let Some(parent) = tmp_path_gz.parent() {
self.socket
.run_command("mkdir", &["-p", parent.display(self.path_style()).as_ref()])
.run_command(
self.ssh_shell_kind,
"mkdir",
&["-p", parent.display(self.path_style()).as_ref()],
)
.await?;
}
@@ -641,6 +660,7 @@ impl SshRemoteConnection {
match self
.socket
.run_command(
self.ssh_shell_kind,
"curl",
&[
"-f",
@@ -660,13 +680,19 @@ impl SshRemoteConnection {
{
Ok(_) => {}
Err(e) => {
if self.socket.run_command("which", &["curl"]).await.is_ok() {
if self
.socket
.run_command(self.ssh_shell_kind, "which", &["curl"])
.await
.is_ok()
{
return Err(e);
}
match self
.socket
.run_command(
self.ssh_shell_kind,
"wget",
&[
"--header=Content-Type: application/json",
@@ -681,7 +707,12 @@ impl SshRemoteConnection {
{
Ok(_) => {}
Err(e) => {
if self.socket.run_command("which", &["wget"]).await.is_ok() {
if self
.socket
.run_command(self.ssh_shell_kind, "which", &["wget"])
.await
.is_ok()
{
return Err(e);
} else {
anyhow::bail!("Neither curl nor wget is available");
@@ -703,7 +734,11 @@ impl SshRemoteConnection {
) -> Result<()> {
if let Some(parent) = tmp_path_gz.parent() {
self.socket
.run_command("mkdir", &["-p", parent.display(self.path_style()).as_ref()])
.run_command(
self.ssh_shell_kind,
"mkdir",
&["-p", parent.display(self.path_style()).as_ref()],
)
.await?;
}
@@ -750,7 +785,7 @@ impl SshRemoteConnection {
format!("chmod {server_mode} {orig_tmp_path} && mv {orig_tmp_path} {dst_path}",)
};
let args = shell_kind.args_for_shell(false, script.to_string());
self.socket.run_command("sh", &args).await?;
self.socket.run_command(shell_kind, "sh", &args).await?;
Ok(())
}
@@ -894,11 +929,16 @@ impl SshSocket {
// Furthermore, some setups (e.g. Coder) will change directory when SSH'ing
// into a machine. You must use `cd` to get back to $HOME.
// You need to do it like this: $ ssh host "cd; sh -c 'ls -l /tmp'"
fn ssh_command(&self, program: &str, args: &[impl AsRef<str>]) -> process::Command {
let shell_kind = ShellKind::Posix;
fn ssh_command(
&self,
shell_kind: ShellKind,
program: &str,
args: &[impl AsRef<str>],
) -> process::Command {
let mut command = util::command::new_smol_command("ssh");
let program = shell_kind.prepend_command_prefix(program);
let mut to_run = shell_kind
.try_quote(program)
.try_quote_prefix_aware(&program)
.expect("shell quoting")
.into_owned();
for arg in args {
@@ -920,8 +960,13 @@ impl SshSocket {
command
}
async fn run_command(&self, program: &str, args: &[impl AsRef<str>]) -> Result<String> {
let output = self.ssh_command(program, args).output().await?;
async fn run_command(
&self,
shell_kind: ShellKind,
program: &str,
args: &[impl AsRef<str>],
) -> Result<String> {
let output = self.ssh_command(shell_kind, program, args).output().await?;
anyhow::ensure!(
output.status.success(),
"failed to run command: {}",
@@ -994,12 +1039,7 @@ impl SshSocket {
}
async fn platform(&self, shell: ShellKind) -> Result<RemotePlatform> {
let program = if shell == ShellKind::Nushell {
"^uname"
} else {
"uname"
};
let uname = self.run_command(program, &["-sm"]).await?;
let uname = self.run_command(shell, "uname", &["-sm"]).await?;
let Some((os, arch)) = uname.split_once(" ") else {
anyhow::bail!("unknown uname: {uname:?}")
};
@@ -1030,7 +1070,10 @@ impl SshSocket {
}
async fn shell(&self) -> String {
match self.run_command("sh", &["-c", "echo $SHELL"]).await {
match self
.run_command(ShellKind::Posix, "sh", &["-c", "echo $SHELL"])
.await
{
Ok(shell) => shell.trim().to_owned(),
Err(e) => {
log::error!("Failed to get shell: {e}");
@@ -1256,11 +1299,11 @@ fn build_command(
ssh_env: HashMap<String, String>,
ssh_path_style: PathStyle,
ssh_shell: &str,
ssh_shell_kind: ShellKind,
ssh_args: Vec<String>,
) -> Result<CommandTemplate> {
use std::fmt::Write as _;
let shell_kind = ShellKind::new(ssh_shell, false);
let mut exec = String::new();
if let Some(working_dir) = working_dir {
let working_dir = RemotePathBuf::new(working_dir, ssh_path_style).to_string();
@@ -1270,12 +1313,24 @@ fn build_command(
const TILDE_PREFIX: &'static str = "~/";
if working_dir.starts_with(TILDE_PREFIX) {
let working_dir = working_dir.trim_start_matches("~").trim_start_matches("/");
write!(exec, "cd \"$HOME/{working_dir}\" && ",)?;
write!(
exec,
"cd \"$HOME/{working_dir}\" {} ",
ssh_shell_kind.sequential_and_commands_separator()
)?;
} else {
write!(exec, "cd \"{working_dir}\" && ",)?;
write!(
exec,
"cd \"{working_dir}\" {} ",
ssh_shell_kind.sequential_and_commands_separator()
)?;
}
} else {
write!(exec, "cd && ")?;
write!(
exec,
"cd {} ",
ssh_shell_kind.sequential_and_commands_separator()
)?;
};
write!(exec, "exec env ")?;
@@ -1284,7 +1339,7 @@ fn build_command(
exec,
"{}={} ",
k,
shell_kind.try_quote(v).context("shell quoting")?
ssh_shell_kind.try_quote(v).context("shell quoting")?
)?;
}
@@ -1292,12 +1347,12 @@ fn build_command(
write!(
exec,
"{}",
shell_kind
.try_quote(&input_program)
ssh_shell_kind
.try_quote_prefix_aware(&input_program)
.context("shell quoting")?
)?;
for arg in input_args {
let arg = shell_kind.try_quote(&arg).context("shell quoting")?;
let arg = ssh_shell_kind.try_quote(&arg).context("shell quoting")?;
write!(exec, " {}", &arg)?;
}
} else {
@@ -1341,6 +1396,7 @@ mod tests {
env.clone(),
PathStyle::Posix,
"/bin/fish",
ShellKind::Fish,
vec!["-p".to_string(), "2222".to_string()],
)?;
@@ -1370,6 +1426,7 @@ mod tests {
env.clone(),
PathStyle::Posix,
"/bin/fish",
ShellKind::Fish,
vec!["-p".to_string(), "2222".to_string()],
)?;

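The `cd … &&` assembly in `build_command` now goes through `sequential_and_commands_separator`, which the diff defines as `;` for Nushell and `&&` for everything else. A minimal standalone sketch of that prefix-building step (the three-variant `ShellKind` here is an assumption for illustration, not the real enum):

```rust
// Trimmed-down ShellKind with only the separator logic from the diff.
#[derive(Clone, Copy)]
enum ShellKind {
    Posix,
    Fish,
    Nushell,
}

impl ShellKind {
    // Nushell sequences commands with `;` rather than `&&`, per the diff.
    fn sequential_and_commands_separator(self) -> &'static str {
        match self {
            ShellKind::Nushell => ";",
            _ => "&&",
        }
    }
}

// Build the `cd … <sep> ` prefix, expanding a `~/` working dir via $HOME.
fn exec_prefix(working_dir: Option<&str>, kind: ShellKind) -> String {
    let sep = kind.sequential_and_commands_separator();
    match working_dir {
        Some(dir) if dir.starts_with("~/") => {
            let dir = dir.trim_start_matches('~').trim_start_matches('/');
            format!("cd \"$HOME/{dir}\" {sep} ")
        }
        Some(dir) => format!("cd \"{dir}\" {sep} "),
        None => format!("cd {sep} "),
    }
}

fn main() {
    assert_eq!(exec_prefix(Some("~/src"), ShellKind::Fish), "cd \"$HOME/src\" && ");
    assert_eq!(exec_prefix(Some("/tmp"), ShellKind::Posix), "cd \"/tmp\" && ");
    assert_eq!(exec_prefix(None, ShellKind::Nushell), "cd ; ");
}
```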
View File

@@ -44,6 +44,7 @@ pub(crate) struct WslRemoteConnection {
remote_binary_path: Option<Arc<RelPath>>,
platform: RemotePlatform,
shell: String,
shell_kind: ShellKind,
default_system_shell: String,
connection_options: WslConnectionOptions,
can_exec: bool,
@@ -73,16 +74,17 @@ impl WslRemoteConnection {
remote_binary_path: None,
platform: RemotePlatform { os: "", arch: "" },
shell: String::new(),
shell_kind: ShellKind::Posix,
default_system_shell: String::from("/bin/sh"),
can_exec: true,
};
delegate.set_status(Some("Detecting WSL environment"), cx);
this.shell = this.detect_shell().await?;
let shell = ShellKind::new(&this.shell, false);
this.can_exec = this.detect_can_exec(shell).await?;
this.platform = this.detect_platform(shell).await?;
this.shell_kind = ShellKind::new(&this.shell, false);
this.can_exec = this.detect_can_exec().await?;
this.platform = this.detect_platform().await?;
this.remote_binary_path = Some(
this.ensure_server_binary(&delegate, release_channel, version, commit, shell, cx)
this.ensure_server_binary(&delegate, release_channel, version, commit, cx)
.await?,
);
log::debug!("Detected WSL environment: {this:#?}");
@@ -90,20 +92,16 @@ impl WslRemoteConnection {
Ok(this)
}
async fn detect_can_exec(&self, shell: ShellKind) -> Result<bool> {
async fn detect_can_exec(&self) -> Result<bool> {
let options = &self.connection_options;
let program = if shell == ShellKind::Nushell {
"^uname"
} else {
"uname"
};
let program = self.shell_kind.prepend_command_prefix("uname");
let args = &["-m"];
let output = wsl_command_impl(options, program, args, true)
let output = wsl_command_impl(options, &program, args, true)
.output()
.await?;
if !output.status.success() {
let output = wsl_command_impl(options, program, args, false)
let output = wsl_command_impl(options, &program, args, false)
.output()
.await?;
@@ -120,14 +118,9 @@ impl WslRemoteConnection {
Ok(true)
}
}
async fn detect_platform(&self, shell: ShellKind) -> Result<RemotePlatform> {
let arch_str = if shell == ShellKind::Nushell {
// https://github.com/nushell/nushell/issues/12570
self.run_wsl_command("sh", &["-c", "uname -m"])
} else {
self.run_wsl_command("uname", &["-m"])
}
.await?;
async fn detect_platform(&self) -> Result<RemotePlatform> {
let program = self.shell_kind.prepend_command_prefix("uname");
let arch_str = self.run_wsl_command(&program, &["-m"]).await?;
let arch_str = arch_str.trim().to_string();
let arch = match arch_str.as_str() {
"x86_64" => "x86_64",
@@ -163,7 +156,6 @@ impl WslRemoteConnection {
release_channel: ReleaseChannel,
version: SemanticVersion,
commit: Option<AppCommitSha>,
shell: ShellKind,
cx: &mut AsyncApp,
) -> Result<Arc<RelPath>> {
let version_str = match release_channel {
@@ -186,12 +178,9 @@ impl WslRemoteConnection {
if let Some(parent) = dst_path.parent() {
let parent = parent.display(PathStyle::Posix);
if shell == ShellKind::Nushell {
self.run_wsl_command("mkdir", &[&parent]).await
} else {
self.run_wsl_command("mkdir", &["-p", &parent]).await
}
.map_err(|e| anyhow!("Failed to create directory: {}", e))?;
self.run_wsl_command("mkdir", &["-p", &parent])
.await
.map_err(|e| anyhow!("Failed to create directory: {}", e))?;
}
#[cfg(debug_assertions)]
@@ -206,7 +195,7 @@ impl WslRemoteConnection {
))
.unwrap(),
);
self.upload_file(&remote_server_path, &tmp_path, delegate, &shell, cx)
self.upload_file(&remote_server_path, &tmp_path, delegate, cx)
.await?;
self.extract_and_install(&tmp_path, &dst_path, delegate, cx)
.await?;
@@ -239,8 +228,7 @@ impl WslRemoteConnection {
);
let tmp_path = RelPath::unix(&tmp_path).unwrap();
self.upload_file(&src_path, &tmp_path, delegate, &shell, cx)
.await?;
self.upload_file(&src_path, &tmp_path, delegate, cx).await?;
self.extract_and_install(&tmp_path, &dst_path, delegate, cx)
.await?;
@@ -252,19 +240,15 @@ impl WslRemoteConnection {
src_path: &Path,
dst_path: &RelPath,
delegate: &Arc<dyn RemoteClientDelegate>,
shell: &ShellKind,
cx: &mut AsyncApp,
) -> Result<()> {
delegate.set_status(Some("Uploading remote server to WSL"), cx);
if let Some(parent) = dst_path.parent() {
let parent = parent.display(PathStyle::Posix);
if *shell == ShellKind::Nushell {
self.run_wsl_command("mkdir", &[&parent]).await
} else {
self.run_wsl_command("mkdir", &["-p", &parent]).await
}
.map_err(|e| anyhow!("Failed to create directory when uploading file: {}", e))?;
self.run_wsl_command("mkdir", &["-p", &parent])
.await
.map_err(|e| anyhow!("Failed to create directory when uploading file: {}", e))?;
}
let t0 = Instant::now();
@@ -441,7 +425,7 @@ impl RemoteConnection for WslRemoteConnection {
bail!("WSL shares the network interface with the host system");
}
let shell_kind = ShellKind::new(&self.shell, false);
let shell_kind = self.shell_kind;
let working_dir = working_dir
.map(|working_dir| RemotePathBuf::new(working_dir, PathStyle::Posix).to_string())
.unwrap_or("~".to_string());
@@ -461,7 +445,9 @@ impl RemoteConnection for WslRemoteConnection {
write!(
exec,
"{}",
shell_kind.try_quote(&program).context("shell quoting")?
shell_kind
.try_quote_prefix_aware(&program)
.context("shell quoting")?
)?;
for arg in args {
let arg = shell_kind.try_quote(&arg).context("shell quoting")?;

View File

@@ -94,7 +94,8 @@ impl HeadlessProject {
store
});
let environment = cx.new(|cx| ProjectEnvironment::new(None, cx));
let environment =
cx.new(|cx| ProjectEnvironment::new(None, worktree_store.downgrade(), None, cx));
let manifest_tree = ManifestTree::new(worktree_store.clone(), cx);
let toolchain_store = cx.new(|cx| {
ToolchainStore::local(
@@ -786,7 +787,7 @@ impl HeadlessProject {
let environment = this
.update(&mut cx, |this, cx| {
this.environment.update(cx, |environment, cx| {
environment.get_local_directory_environment(&shell, directory.into(), cx)
environment.local_directory_environment(&shell, directory.into(), cx)
})
})?
.await

View File

@@ -323,21 +323,13 @@ impl Rope {
const PARALLEL_THRESHOLD: usize = 4 * (2 * sum_tree::TREE_BASE);
if new_chunks.len() >= PARALLEL_THRESHOLD {
let cx2 = executor.clone();
executor
.scoped(|scope| {
// SAFETY: transmuting to 'static is safe because the future is scoped
// and the underlying string data cannot go out of scope because dropping the scope
// will wait for the task to finish
let new_chunks =
unsafe { std::mem::transmute::<Vec<&str>, Vec<&'static str>>(new_chunks) };
let async_extend = self
.chunks
.async_extend(new_chunks.into_iter().map(Chunk::new), cx2);
scope.spawn(async_extend);
})
// SAFETY: transmuting to 'static is sound here. We block on the future that uses these
// references, and the result of the computation does not retain the 'static references
// beyond that await.
let new_chunks =
unsafe { std::mem::transmute::<Vec<&str>, Vec<&'static str>>(new_chunks) };
self.chunks
.async_extend(new_chunks.into_iter().map(Chunk::new), executor)
.await;
} else {
self.chunks

View File

@@ -2813,6 +2813,7 @@ mod tests {
case_sensitive: false,
include_ignored: false,
regex: false,
center_on_match: false,
},
cx,
);
@@ -2875,6 +2876,7 @@ mod tests {
case_sensitive: true,
include_ignored: false,
regex: false,
center_on_match: false,
},
cx,
);
@@ -2912,6 +2914,7 @@ mod tests {
case_sensitive: true,
include_ignored: false,
regex: false,
center_on_match: false,
},
cx,
);
@@ -2938,6 +2941,7 @@ mod tests {
case_sensitive: Some(search_settings.case_sensitive),
include_ignored: Some(search_settings.include_ignored),
regex: Some(search_settings.regex),
center_on_match: Some(search_settings.center_on_match),
});
});
});

View File

@@ -12,7 +12,9 @@ use editor::{
SelectionEffects, VimFlavor,
actions::{Backtab, SelectAll, Tab},
items::active_match_index,
multibuffer_context_lines, vim_flavor,
multibuffer_context_lines,
scroll::Autoscroll,
vim_flavor,
};
use futures::{StreamExt, stream::FuturesOrdered};
use gpui::{
@@ -1346,8 +1348,13 @@ impl ProjectSearchView {
self.results_editor.update(cx, |editor, cx| {
let collapse = vim_flavor(cx) == Some(VimFlavor::Vim);
let range_to_select = editor.range_for_match(&range_to_select, collapse);
let autoscroll = if EditorSettings::get_global(cx).search.center_on_match {
Autoscroll::center()
} else {
Autoscroll::fit()
};
editor.unfold_ranges(std::slice::from_ref(&range_to_select), false, true, cx);
editor.change_selections(Default::default(), window, cx, |s| {
editor.change_selections(SelectionEffects::scroll(autoscroll), window, cx, |s| {
s.select_ranges([range_to_select])
});
});
@@ -2346,7 +2353,15 @@ pub fn perform_project_search(
#[cfg(test)]
pub mod tests {
use std::{ops::Deref as _, sync::Arc, time::Duration};
use std::{
ops::Deref as _,
path::PathBuf,
sync::{
Arc,
atomic::{self, AtomicUsize},
},
time::Duration,
};
use super::*;
use editor::{DisplayPoint, display_map::DisplayRow};
@@ -4247,6 +4262,8 @@ pub mod tests {
)
.await;
let requests_count = Arc::new(AtomicUsize::new(0));
let closure_requests_count = requests_count.clone();
let project = Project::test(fs.clone(), [path!("/dir").as_ref()], cx).await;
let language_registry = project.read_with(cx, |project, _| project.languages().clone());
let language = rust_lang();
@@ -4258,21 +4275,26 @@ pub mod tests {
inlay_hint_provider: Some(lsp::OneOf::Left(true)),
..lsp::ServerCapabilities::default()
},
initializer: Some(Box::new(|fake_server| {
fake_server.set_request_handler::<lsp::request::InlayHintRequest, _, _>(
move |_, _| async move {
Ok(Some(vec![lsp::InlayHint {
position: lsp::Position::new(0, 17),
label: lsp::InlayHintLabel::String(": i32".to_owned()),
kind: Some(lsp::InlayHintKind::TYPE),
text_edits: None,
tooltip: None,
padding_left: None,
padding_right: None,
data: None,
}]))
},
);
initializer: Some(Box::new(move |fake_server| {
let requests_count = closure_requests_count.clone();
fake_server.set_request_handler::<lsp::request::InlayHintRequest, _, _>({
move |_, _| {
let requests_count = requests_count.clone();
async move {
requests_count.fetch_add(1, atomic::Ordering::Release);
Ok(Some(vec![lsp::InlayHint {
position: lsp::Position::new(0, 17),
label: lsp::InlayHintLabel::String(": i32".to_owned()),
kind: Some(lsp::InlayHintKind::TYPE),
text_edits: None,
tooltip: None,
padding_left: None,
padding_right: None,
data: None,
}]))
}
}
});
})),
..FakeLspAdapter::default()
},
@@ -4286,7 +4308,7 @@ pub mod tests {
});
perform_search(search_view, "let ", cx);
let _fake_server = fake_servers.next().await.unwrap();
let fake_server = fake_servers.next().await.unwrap();
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
search_view
@@ -4299,11 +4321,127 @@ pub mod tests {
);
})
.unwrap();
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
1,
"New hints should have been queried",
);
// Can do the 2nd search without any panics
perform_search(search_view, "let ", cx);
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
search_view
.update(cx, |search_view, _, cx| {
assert_eq!(
search_view
.results_editor
.update(cx, |editor, cx| editor.display_text(cx)),
"\n\nfn main() { let a: i32 = 2; }\n"
);
})
.unwrap();
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
2,
"We dropped the previous buffer when clearing the old project search results, hence another query was made",
);
let singleton_editor = window
.update(cx, |workspace, window, cx| {
workspace.open_abs_path(
PathBuf::from(path!("/dir/main.rs")),
workspace::OpenOptions::default(),
window,
cx,
)
})
.unwrap()
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
cx.executor().advance_clock(Duration::from_millis(100));
cx.executor().run_until_parked();
singleton_editor.update(cx, |editor, cx| {
assert_eq!(
editor.display_text(cx),
"fn main() { let a: i32 = 2; }\n",
"Newly opened editor should have the correct text with hints",
);
});
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
2,
"Opening the same buffer again should reuse the cached hints",
);
window
.update(cx, |_, window, cx| {
singleton_editor.update(cx, |editor, cx| {
editor.handle_input("test", window, cx);
});
})
.unwrap();
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
singleton_editor.update(cx, |editor, cx| {
assert_eq!(
editor.display_text(cx),
"testfn main() { l: i32et a = 2; }\n",
"Editor should show the edited text with refreshed hints",
);
});
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
3,
"We have edited the buffer and should send a new request",
);
window
.update(cx, |_, window, cx| {
singleton_editor.update(cx, |editor, cx| {
editor.undo(&editor::actions::Undo, window, cx);
});
})
.unwrap();
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
4,
"We have edited the buffer again and should send a new request again",
);
singleton_editor.update(cx, |editor, cx| {
assert_eq!(
editor.display_text(cx),
"fn main() { let a: i32 = 2; }\n",
"Editor should show the original text with hints after undo",
);
});
project.update(cx, |_, cx| {
cx.emit(project::Event::RefreshInlayHints {
server_id: fake_server.server.server_id(),
request_id: Some(1),
});
});
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
5,
"After a simulated server refresh request, we should have sent another request",
);
perform_search(search_view, "let ", cx);
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
5,
"New project search should reuse the cached hints",
);
search_view
.update(cx, |search_view, _, cx| {
assert_eq!(

View File

@@ -699,6 +699,8 @@ pub struct SearchSettingsContent {
pub case_sensitive: Option<bool>,
pub include_ignored: Option<bool>,
pub regex: Option<bool>,
/// Whether to center the cursor on each search match when navigating.
pub center_on_match: Option<bool>,
}
#[skip_serializing_none]

View File

@@ -2450,6 +2450,29 @@ pub(crate) fn settings_data(cx: &App) -> Vec<SettingsPage> {
metadata: None,
files: USER,
}),
SettingsPageItem::SettingItem(SettingItem {
title: "Center on Match",
description: "Whether to center the current match in the editor",
field: Box::new(SettingField {
json_path: Some("editor.search.center_on_match"),
pick: |settings_content| {
settings_content
.editor
.search
.as_ref()
.and_then(|search| search.center_on_match.as_ref())
},
write: |settings_content, value| {
settings_content
.editor
.search
.get_or_insert_default()
.center_on_match = value;
},
}),
metadata: None,
files: USER,
}),
SettingsPageItem::SettingItem(SettingItem {
title: "Seed Search Query From Cursor",
description: "When to populate a new search's query based on the text under the cursor.",

View File

@@ -17,7 +17,7 @@ doctest = false
arrayvec = "0.7.1"
log.workspace = true
futures.workspace = true
itertools.workspace = true
futures-lite.workspace = true
[dev-dependencies]
ctor.workspace = true

View File

@@ -4,15 +4,15 @@ mod tree_map;
use arrayvec::ArrayVec;
pub use cursor::{Cursor, FilterCursor, Iter};
use futures::{StreamExt, stream};
use itertools::Itertools as _;
use futures_lite::future::yield_now;
use std::marker::PhantomData;
use std::mem;
use std::{cmp::Ordering, fmt, iter::FromIterator, sync::Arc};
pub use tree_map::{MapSeekTarget, TreeMap, TreeSet};
#[cfg(test)]
#[cfg(all(test, not(rust_analyzer)))]
pub const TREE_BASE: usize = 2;
#[cfg(not(test))]
#[cfg(not(all(test, not(rust_analyzer))))]
pub const TREE_BASE: usize = 6;
pub trait BackgroundSpawn {
@@ -316,30 +316,44 @@ impl<T: Item> SumTree<T> {
T: 'static + Send + Sync,
for<'a> T::Summary: Summary<Context<'a> = ()> + Send + Sync,
S: BackgroundSpawn,
I: IntoIterator<Item = T>,
I: IntoIterator<Item = T, IntoIter: ExactSizeIterator>,
{
let mut futures = vec![];
let chunks = iter.into_iter().chunks(2 * TREE_BASE);
for chunk in chunks.into_iter() {
let items: ArrayVec<T, { 2 * TREE_BASE }> = chunk.into_iter().collect();
futures.push(async move {
let item_summaries: ArrayVec<T::Summary, { 2 * TREE_BASE }> =
items.iter().map(|item| item.summary(())).collect();
let mut summary = item_summaries[0].clone();
for item_summary in &item_summaries[1..] {
<T::Summary as Summary>::add_summary(&mut summary, item_summary, ());
}
SumTree(Arc::new(Node::Leaf {
summary,
items,
item_summaries,
}))
});
let iter = iter.into_iter();
let num_leaves = iter.len().div_ceil(2 * TREE_BASE);
if num_leaves == 0 {
return Self::new(());
}
let mut nodes = futures::stream::iter(futures)
let mut nodes = stream::iter(iter)
.chunks(num_leaves.div_ceil(4))
.map(|chunk| async move {
let mut chunk = chunk.into_iter();
let mut leaves = vec![];
loop {
let items: ArrayVec<T, { 2 * TREE_BASE }> =
chunk.by_ref().take(2 * TREE_BASE).collect();
if items.is_empty() {
break;
}
let item_summaries: ArrayVec<T::Summary, { 2 * TREE_BASE }> =
items.iter().map(|item| item.summary(())).collect();
let mut summary = item_summaries[0].clone();
for item_summary in &item_summaries[1..] {
<T::Summary as Summary>::add_summary(&mut summary, item_summary, ());
}
leaves.push(SumTree(Arc::new(Node::Leaf {
summary,
items,
item_summaries,
})));
yield_now().await;
}
leaves
})
.map(|future| spawn.background_spawn(future))
.buffered(4)
.flat_map(|it| stream::iter(it.into_iter()))
.collect::<Vec<_>>()
.await;
@@ -622,7 +636,7 @@ impl<T: Item> SumTree<T> {
pub async fn async_extend<S, I>(&mut self, iter: I, spawn: S)
where
S: BackgroundSpawn,
I: IntoIterator<Item = T> + 'static,
I: IntoIterator<Item = T, IntoIter: ExactSizeIterator>,
T: 'static + Send + Sync,
for<'b> T::Summary: Summary<Context<'b> = ()> + Send + Sync,
{
@@ -1126,7 +1140,7 @@ mod tests {
let rng = &mut rng;
let mut tree = SumTree::<u8>::default();
let count = rng.random_range(0..10);
let count = rng.random_range(0..128);
if rng.random() {
tree.extend(rng.sample_iter(StandardUniform).take(count), ());
} else {
@@ -1140,7 +1154,7 @@ mod tests {
for _ in 0..num_operations {
let splice_end = rng.random_range(0..tree.extent::<Count>(()).0 + 1);
let splice_start = rng.random_range(0..splice_end + 1);
let count = rng.random_range(0..10);
let count = rng.random_range(0..128);
let tree_end = tree.extent::<Count>(());
let new_items = rng
.sample_iter(StandardUniform)

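The parallel `from_iter` path above sizes its work from the iterator length: each leaf holds up to `2 * TREE_BASE` items, and the item stream is chunked by `num_leaves.div_ceil(4)` before being spread across buffered background tasks. A standalone sketch of just that arithmetic, assuming the non-test `TREE_BASE = 6` (this is not the tree code itself):

```rust
// Non-test tree base, matching `pub const TREE_BASE: usize = 6` above.
const TREE_BASE: usize = 6;

// Returns (number of leaves, stream chunk size as written in the diff).
fn chunk_plan(len: usize) -> (usize, usize) {
    // Each leaf holds at most 2 * TREE_BASE items.
    let num_leaves = len.div_ceil(2 * TREE_BASE);
    // The diff chunks the item stream by num_leaves.div_ceil(4).
    let chunk_size = num_leaves.div_ceil(4);
    (num_leaves, chunk_size)
}

fn main() {
    // 100 items -> ceil(100 / 12) = 9 leaves, ceil(9 / 4) = 3.
    assert_eq!(chunk_plan(100), (9, 3));
    // An empty iterator short-circuits to Self::new(()) in the real code.
    assert_eq!(chunk_plan(0), (0, 0));
}
```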
View File

@@ -448,11 +448,12 @@ impl PickerDelegate for TasksModalDelegate {
let template = resolved_task.original_task();
let display_label = resolved_task.display_label();
let mut tooltip_label_text = if display_label != &template.label {
resolved_task.resolved_label.clone()
} else {
String::new()
};
let mut tooltip_label_text =
if display_label != &template.label || source_kind == &TaskSourceKind::UserInput {
resolved_task.resolved_label.clone()
} else {
String::new()
};
if resolved_task.resolved.command_label != resolved_task.resolved_label {
if !tooltip_label_text.trim().is_empty() {

View File

@@ -220,6 +220,8 @@ impl TitleBar {
.on_click({
let peer_id = collaborator.peer_id;
cx.listener(move |this, _, window, cx| {
cx.stop_propagation();
this.workspace
.update(cx, |workspace, cx| {
if is_following {

View File

@@ -270,11 +270,11 @@ fn show_menu<M: ManagedView>(
window: &mut Window,
cx: &mut App,
) {
let previous_focus_handle = window.focused(cx);
let Some(new_menu) = (builder)(window, cx) else {
return;
};
let menu2 = menu.clone();
let previous_focus_handle = window.focused(cx);
window
.subscribe(&new_menu, cx, move |modal, _: &DismissEvent, window, cx| {

View File

@@ -408,6 +408,15 @@ impl ShellKind {
}
}
pub fn prepend_command_prefix<'a>(&self, command: &'a str) -> Cow<'a, str> {
match self.command_prefix() {
Some(prefix) if !command.starts_with(prefix) => {
Cow::Owned(format!("{prefix}{command}"))
}
_ => Cow::Borrowed(command),
}
}
pub const fn sequential_commands_separator(&self) -> char {
match self {
ShellKind::Cmd => '&',
@@ -422,6 +431,20 @@ impl ShellKind {
}
}
pub const fn sequential_and_commands_separator(&self) -> &'static str {
match self {
ShellKind::Cmd
| ShellKind::Posix
| ShellKind::Csh
| ShellKind::Tcsh
| ShellKind::Rc
| ShellKind::Fish
| ShellKind::PowerShell
| ShellKind::Xonsh => "&&",
ShellKind::Nushell => ";",
}
}
pub fn try_quote<'a>(&self, arg: &'a str) -> Option<Cow<'a, str>> {
shlex::try_quote(arg).ok().map(|arg| match self {
// If we are running in PowerShell, we want to take extra care when escaping strings.
@@ -438,6 +461,42 @@ impl ShellKind {
})
}
/// Quotes the given argument if necessary, taking into account the command prefix.
///
/// In other words, this will consider quoting arg without its command prefix to not break the command.
/// You should use this over `try_quote` when you want to quote a shell command.
pub fn try_quote_prefix_aware<'a>(&self, arg: &'a str) -> Option<Cow<'a, str>> {
if let Some(char) = self.command_prefix() {
if let Some(arg) = arg.strip_prefix(char) {
// we have a command that is prefixed
for quote in ['\'', '"'] {
if let Some(arg) = arg
.strip_prefix(quote)
.and_then(|arg| arg.strip_suffix(quote))
{
// and the command itself is wrapped as a literal, that
// means the prefix exists to interpret a literal as a
// command. So strip the quotes, quote the command, and
// re-add the quotes if they are missing after requoting
let quoted = self.try_quote(arg)?;
return Some(if quoted.starts_with(['\'', '"']) {
Cow::Owned(self.prepend_command_prefix(&quoted).into_owned())
} else {
Cow::Owned(
self.prepend_command_prefix(&format!("{quote}{quoted}{quote}"))
.into_owned(),
)
});
}
}
return self
.try_quote(arg)
.map(|quoted| Cow::Owned(self.prepend_command_prefix(&quoted).into_owned()));
}
}
self.try_quote(arg)
}
pub fn split(&self, input: &str) -> Option<Vec<String>> {
shlex::split(input)
}
@@ -525,4 +584,75 @@ mod tests {
"\"C:\\Users\\johndoe\\dev\\python\\39007\\tests\\.venv\\Scripts\\python.exe -m pytest \\\"test_foo.py::test_foo\\\"\"".to_string()
);
}
#[test]
fn test_try_quote_nu_command() {
let shell_kind = ShellKind::Nushell;
assert_eq!(
shell_kind.try_quote("'uname'").unwrap().into_owned(),
"\"'uname'\"".to_string()
);
assert_eq!(
shell_kind
.try_quote_prefix_aware("'uname'")
.unwrap()
.into_owned(),
"\"'uname'\"".to_string()
);
assert_eq!(
shell_kind.try_quote("^uname").unwrap().into_owned(),
"'^uname'".to_string()
);
assert_eq!(
shell_kind
.try_quote_prefix_aware("^uname")
.unwrap()
.into_owned(),
"^uname".to_string()
);
assert_eq!(
shell_kind.try_quote("^'uname'").unwrap().into_owned(),
"'^'\"'uname\'\"".to_string()
);
assert_eq!(
shell_kind
.try_quote_prefix_aware("^'uname'")
.unwrap()
.into_owned(),
"^'uname'".to_string()
);
assert_eq!(
shell_kind.try_quote("'uname a'").unwrap().into_owned(),
"\"'uname a'\"".to_string()
);
assert_eq!(
shell_kind
.try_quote_prefix_aware("'uname a'")
.unwrap()
.into_owned(),
"\"'uname a'\"".to_string()
);
assert_eq!(
shell_kind.try_quote("^'uname a'").unwrap().into_owned(),
"'^'\"'uname a'\"".to_string()
);
assert_eq!(
shell_kind
.try_quote_prefix_aware("^'uname a'")
.unwrap()
.into_owned(),
"^'uname a'".to_string()
);
assert_eq!(
shell_kind.try_quote("uname").unwrap().into_owned(),
"uname".to_string()
);
assert_eq!(
shell_kind
.try_quote_prefix_aware("uname")
.unwrap()
.into_owned(),
"uname".to_string()
);
}
}

View File

@@ -124,6 +124,7 @@ fn cover_or_next<I: Iterator<Item = (Range<usize>, Range<usize>)>>(
candidates: Option<I>,
caret: DisplayPoint,
map: &DisplaySnapshot,
range_filter: Option<&dyn Fn(Range<usize>, Range<usize>) -> bool>,
) -> Option<CandidateWithRanges> {
let caret_offset = caret.to_offset(map, Bias::Left);
let mut covering = vec![];
@@ -134,6 +135,11 @@ fn cover_or_next<I: Iterator<Item = (Range<usize>, Range<usize>)>>(
for (open_range, close_range) in ranges {
let start_off = open_range.start;
let end_off = close_range.end;
if let Some(range_filter) = range_filter
&& !range_filter(open_range.clone(), close_range.clone())
{
continue;
}
let candidate = CandidateWithRanges {
candidate: CandidateRange {
start: start_off.to_display_point(map),
@@ -208,35 +214,16 @@ fn find_mini_delimiters(
let visible_line_range = get_visible_line_range(&line_range);
let snapshot = &map.buffer_snapshot();
let mut excerpt = snapshot.excerpt_containing(offset..offset)?;
let excerpt = snapshot.excerpt_containing(offset..offset)?;
let buffer = excerpt.buffer();
let buffer_offset = excerpt.map_offset_to_buffer(offset);
let bracket_filter = |open: Range<usize>, close: Range<usize>| {
is_valid_delimiter(buffer, open.start, close.start)
};
// Try to find delimiters in visible range first
let ranges = map
.buffer_snapshot()
.bracket_ranges(visible_line_range)
.map(|ranges| {
ranges.filter_map(move |(open, close)| {
// Convert the ranges from multibuffer space to buffer space as
// that is what `is_valid_delimiter` expects, otherwise it might
// panic as the values might be out of bounds.
let buffer_open = excerpt.map_range_to_buffer(open.clone());
let buffer_close = excerpt.map_range_to_buffer(close.clone());
if is_valid_delimiter(buffer, buffer_open.start, buffer_close.start) {
Some((open, close))
} else {
None
}
})
});
if let Some(candidate) = cover_or_next(ranges, display_point, map) {
let ranges = map.buffer_snapshot().bracket_ranges(visible_line_range);
if let Some(candidate) = cover_or_next(ranges, display_point, map, Some(&bracket_filter)) {
return Some(
DelimiterRange {
open: candidate.open_range,
@@ -247,8 +234,8 @@ fn find_mini_delimiters(
}
// Fall back to innermost enclosing brackets
let (open_bracket, close_bracket) = buffer
.innermost_enclosing_bracket_ranges(buffer_offset..buffer_offset, Some(&bracket_filter))?;
let (open_bracket, close_bracket) =
buffer.innermost_enclosing_bracket_ranges(offset..offset, Some(&bracket_filter))?;
Some(
DelimiterRange {
@@ -1749,10 +1736,8 @@ pub fn surrounding_markers(
#[cfg(test)]
mod test {
use editor::{Editor, EditorMode, MultiBuffer, test::editor_test_context::EditorTestContext};
use gpui::KeyBinding;
use indoc::indoc;
use text::Point;
use crate::{
object::{AnyBrackets, AnyQuotes, MiniBrackets},
@@ -3200,80 +3185,6 @@ mod test {
}
}
#[gpui::test]
async fn test_minibrackets_multibuffer(cx: &mut gpui::TestAppContext) {
// Initialize test context with the TypeScript language loaded, so we
// can actually get brackets definition.
let mut cx = VimTestContext::new(cx, true).await;
// Update `b` to `MiniBrackets` so we can later use it when simulating
// keystrokes.
cx.update(|_, cx| {
cx.bind_keys([KeyBinding::new("b", MiniBrackets, None)]);
});
// Setup MultiBuffer with 3 different excerpts, only showing the first
// two rows for each buffer.
let (editor, cx) = cx.add_window_view(|window, cx| {
let multi_buffer = MultiBuffer::build_multi(
[
("111\n222\n333\n444\n", vec![Point::row_range(0..2)]),
("111\na {bracket} example\n", vec![Point::row_range(0..2)]),
],
cx,
);
// In order for the brackets to actually be found, we need to update
// the language used for the second buffer. This is something that
// is handled automatically when simply using `VimTestContext::new`
// but, since this is being set manually, the language isn't
// automatically set.
let editor = Editor::new(EditorMode::full(), multi_buffer.clone(), None, window, cx);
let buffer_ids = multi_buffer.read(cx).excerpt_buffer_ids();
if let Some(buffer) = multi_buffer.read(cx).buffer(buffer_ids[1]) {
buffer.update(cx, |buffer, cx| {
buffer.set_language(Some(language::rust_lang()), cx);
})
};
editor
});
let mut cx = EditorTestContext::for_editor_in(editor.clone(), cx).await;
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
ˇ111
222
[EXCERPT]
111
a {bracket} example
"
});
cx.simulate_keystrokes("j j j j f r");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
111
222
[EXCERPT]
111
a {bˇracket} example
"
});
cx.simulate_keystrokes("d i b");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
111
222
[EXCERPT]
111
a {ˇ} example
"
});
}
#[gpui::test]
async fn test_minibrackets_trailing_space(cx: &mut gpui::TestAppContext) {
let mut cx = NeovimBackedTestContext::new(cx).await;

View File

@@ -73,6 +73,7 @@ gpui = { workspace = true, features = [
"windows-manifest",
] }
gpui_tokio.workspace = true
rayon.workspace = true
edit_prediction_button.workspace = true
http_client.workspace = true

View File

@@ -257,6 +257,13 @@ pub fn main() {
return;
}
rayon::ThreadPoolBuilder::new()
.num_threads(4)
.stack_size(10 * 1024 * 1024)
.thread_name(|ix| format!("RayonWorker{}", ix))
.build_global()
.unwrap();
log::info!(
"========== starting zed version {}, sha {} ==========",
app_version,

View File

@@ -64,7 +64,7 @@ const SEARCH_PROMPT: &str = indoc! {r#"
## Current cursor context
`````path={current_file_path}
`````{current_file_path}
{cursor_excerpt}
`````

View File

@@ -39,8 +39,10 @@ paths.workspace = true
polars = { version = "0.51", features = ["lazy", "dtype-struct", "parquet"] }
project.workspace = true
prompt_store.workspace = true
pulldown-cmark.workspace = true
release_channel.workspace = true
reqwest_client.workspace = true
toml.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true

View File

@@ -0,0 +1,355 @@
use std::{
borrow::Cow,
env,
fmt::{self, Display},
fs,
io::Write,
mem,
path::{Path, PathBuf},
};
use anyhow::{Context as _, Result};
use clap::ValueEnum;
use gpui::http_client::Url;
use pulldown_cmark::CowStr;
use serde::{Deserialize, Serialize};
const CURSOR_POSITION_HEADING: &str = "Cursor Position";
const EDIT_HISTORY_HEADING: &str = "Edit History";
const EXPECTED_PATCH_HEADING: &str = "Expected Patch";
const EXPECTED_EXCERPTS_HEADING: &str = "Expected Excerpts";
const REPOSITORY_URL_FIELD: &str = "repository_url";
const REVISION_FIELD: &str = "revision";
#[derive(Debug)]
pub struct NamedExample {
pub name: String,
pub example: Example,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct Example {
pub repository_url: String,
pub revision: String,
pub cursor_path: PathBuf,
pub cursor_position: String,
pub edit_history: Vec<String>,
pub expected_patch: String,
pub expected_excerpts: Vec<ExpectedExcerpt>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct ExpectedExcerpt {
path: PathBuf,
text: String,
}
#[derive(ValueEnum, Debug, Clone)]
pub enum ExampleFormat {
Json,
Toml,
Md,
}
impl NamedExample {
pub fn load(path: impl AsRef<Path>) -> Result<Self> {
let path = path.as_ref();
let content = std::fs::read_to_string(path)?;
let ext = path.extension();
match ext.and_then(|s| s.to_str()) {
Some("json") => Ok(Self {
name: path.file_name().unwrap_or_default().display().to_string(),
example: serde_json::from_str(&content)?,
}),
Some("toml") => Ok(Self {
name: path.file_name().unwrap_or_default().display().to_string(),
example: toml::from_str(&content)?,
}),
Some("md") => Self::parse_md(&content),
Some(_) => {
anyhow::bail!("Unrecognized example extension: {}", ext.unwrap().display());
}
None => {
anyhow::bail!(
"Failed to determine example type since the file does not have an extension."
);
}
}
}
pub fn parse_md(input: &str) -> Result<Self> {
use pulldown_cmark::{CodeBlockKind, Event, HeadingLevel, Parser, Tag, TagEnd};
let parser = Parser::new(input);
let mut named = NamedExample {
name: String::new(),
example: Example {
repository_url: String::new(),
revision: String::new(),
cursor_path: PathBuf::new(),
cursor_position: String::new(),
edit_history: Vec::new(),
expected_patch: String::new(),
expected_excerpts: Vec::new(),
},
};
let mut text = String::new();
let mut current_section = String::new();
let mut block_info: CowStr = "".into();
for event in parser {
match event {
Event::Text(line) => {
text.push_str(&line);
if !named.name.is_empty()
&& current_section.is_empty()
// in h1 section
&& let Some((field, value)) = line.split_once('=')
{
match field.trim() {
REPOSITORY_URL_FIELD => {
named.example.repository_url = value.trim().to_string();
}
REVISION_FIELD => {
named.example.revision = value.trim().to_string();
}
_ => {
eprintln!("Warning: Unrecognized field `{field}`");
}
}
}
}
Event::End(TagEnd::Heading(HeadingLevel::H1)) => {
if !named.name.is_empty() {
anyhow::bail!(
"Found multiple H1 headings. There should only be one with the name of the example."
);
}
named.name = mem::take(&mut text);
}
Event::End(TagEnd::Heading(HeadingLevel::H2)) => {
current_section = mem::take(&mut text);
}
Event::End(TagEnd::Heading(level)) => {
anyhow::bail!("Unexpected heading level: {level}");
}
Event::Start(Tag::CodeBlock(kind)) => {
match kind {
CodeBlockKind::Fenced(info) => {
block_info = info;
}
CodeBlockKind::Indented => {
anyhow::bail!("Unexpected indented codeblock");
}
};
}
Event::Start(_) => {
text.clear();
block_info = "".into();
}
Event::End(TagEnd::CodeBlock) => {
if current_section.eq_ignore_ascii_case(EDIT_HISTORY_HEADING) {
named.example.edit_history.push(mem::take(&mut text));
} else if current_section.eq_ignore_ascii_case(CURSOR_POSITION_HEADING) {
let path = PathBuf::from(block_info.trim());
named.example.cursor_path = path;
named.example.cursor_position = mem::take(&mut text);
} else if current_section.eq_ignore_ascii_case(EXPECTED_PATCH_HEADING) {
named.example.expected_patch = mem::take(&mut text);
} else if current_section.eq_ignore_ascii_case(EXPECTED_EXCERPTS_HEADING) {
let path = PathBuf::from(block_info.trim());
named.example.expected_excerpts.push(ExpectedExcerpt {
path,
text: mem::take(&mut text),
});
} else {
eprintln!("Warning: Unrecognized section `{current_section:?}`")
}
}
_ => {}
}
}
if named.example.cursor_path.as_path() == Path::new("")
|| named.example.cursor_position.is_empty()
{
anyhow::bail!("Missing cursor position codeblock");
}
Ok(named)
}
pub fn write(&self, format: ExampleFormat, mut out: impl Write) -> Result<()> {
match format {
ExampleFormat::Json => Ok(serde_json::to_writer(out, &self.example)?),
ExampleFormat::Toml => {
Ok(out.write_all(toml::to_string_pretty(&self.example)?.as_bytes())?)
}
ExampleFormat::Md => Ok(write!(out, "{}", self)?),
}
}
#[allow(unused)]
pub async fn setup_worktree(&self) -> Result<PathBuf> {
let worktrees_dir = env::current_dir()?.join("target").join("zeta-worktrees");
let repos_dir = env::current_dir()?.join("target").join("zeta-repos");
fs::create_dir_all(&repos_dir)?;
fs::create_dir_all(&worktrees_dir)?;
let (repo_owner, repo_name) = self.repo_name()?;
let repo_dir = repos_dir.join(repo_owner.as_ref()).join(repo_name.as_ref());
if !repo_dir.is_dir() {
fs::create_dir_all(&repo_dir)?;
run_git(&repo_dir, &["init"]).await?;
run_git(
&repo_dir,
&["remote", "add", "origin", &self.example.repository_url],
)
.await?;
}
run_git(
&repo_dir,
&["fetch", "--depth", "1", "origin", &self.example.revision],
)
.await?;
let worktree_path = worktrees_dir.join(&self.name);
if worktree_path.is_dir() {
run_git(&worktree_path, &["clean", "--force", "-d"]).await?;
run_git(&worktree_path, &["reset", "--hard", "HEAD"]).await?;
run_git(&worktree_path, &["checkout", &self.example.revision]).await?;
} else {
let worktree_path_string = worktree_path.to_string_lossy();
run_git(
&repo_dir,
&[
"worktree",
"add",
"-f",
&worktree_path_string,
&self.example.revision,
],
)
.await?;
}
Ok(worktree_path)
}
#[allow(unused)]
fn repo_name(&self) -> Result<(Cow<'_, str>, Cow<'_, str>)> {
// git@github.com:owner/repo.git
if self.example.repository_url.contains('@') {
let (owner, repo) = self
.example
.repository_url
.split_once(':')
.context("expected : in git url")?
.1
.split_once('/')
.context("expected / in git url")?;
Ok((
Cow::Borrowed(owner),
Cow::Borrowed(repo.trim_end_matches(".git")),
))
// http://github.com/owner/repo.git
} else {
let url = Url::parse(&self.example.repository_url)?;
let mut segments = url.path_segments().context("empty http url")?;
let owner = segments
.next()
.context("expected owner path segment")?
.to_string();
let repo = segments
.next()
.context("expected repo path segment")?
.trim_end_matches(".git")
.to_string();
assert!(segments.next().is_none());
Ok((owner.into(), repo.into()))
}
}
}
async fn run_git(repo_path: &Path, args: &[&str]) -> Result<String> {
let output = smol::process::Command::new("git")
.current_dir(repo_path)
.args(args)
.output()
.await?;
anyhow::ensure!(
output.status.success(),
"`git {}` within `{}` failed with status: {}\nstderr:\n{}\nstdout:\n{}",
args.join(" "),
repo_path.display(),
output.status,
String::from_utf8_lossy(&output.stderr),
String::from_utf8_lossy(&output.stdout),
);
Ok(String::from_utf8(output.stdout)?.trim().to_string())
}
impl Display for NamedExample {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f, "# {}\n\n", self.name)?;
write!(
f,
"{REPOSITORY_URL_FIELD} = {}\n",
self.example.repository_url
)?;
write!(f, "{REVISION_FIELD} = {}\n\n", self.example.revision)?;
write!(
f,
"## {CURSOR_POSITION_HEADING}\n\n`````{}\n{}`````\n",
self.example.cursor_path.display(),
self.example.cursor_position
)?;
write!(f, "## {EDIT_HISTORY_HEADING}\n\n")?;
if !self.example.edit_history.is_empty() {
write!(f, "`````diff\n")?;
for item in &self.example.edit_history {
write!(f, "{item}")?;
}
write!(f, "`````\n")?;
}
if !self.example.expected_patch.is_empty() {
write!(
f,
"\n## {EXPECTED_PATCH_HEADING}\n\n`````diff\n{}`````\n",
self.example.expected_patch
)?;
}
if !self.example.expected_excerpts.is_empty() {
write!(f, "\n## {EXPECTED_EXCERPTS_HEADING}\n\n")?;
for excerpt in &self.example.expected_excerpts {
write!(
f,
"`````{}{}\n{}`````\n\n",
excerpt
.path
.extension()
.map(|ext| format!("{} ", ext.to_string_lossy()))
.unwrap_or_default(),
excerpt.path.display(),
excerpt.text
)?;
}
}
Ok(())
}
}
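Based on the `parse_md` logic above, an example file would be shaped roughly like the following. This is a hypothetical sketch (repository, revision, paths, and diff contents are placeholders, and the cursor-marker syntax inside the excerpt is not shown), illustrating the H1 name, the `field = value` lines, and the fenced sections whose info strings carry the file paths:

``````md
# fix-unwrap

repository_url = https://github.com/example/repo.git
revision = 0123abc

## Cursor Position

`````src/main.rs
fn main() {
    let x = parse()
}
`````

## Edit History

`````diff
@@ -1,3 +1,3 @@
-fn main() {}
+fn main() { let x = parse() }
`````

## Expected Patch

`````diff
@@ -2,2 +2,2 @@
-    let x = parse()
+    let x = parse().unwrap()
`````
``````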

View File

@@ -1,8 +1,10 @@
mod example;
mod headless;
mod source_location;
mod syntax_retrieval_stats;
mod util;
use crate::example::{ExampleFormat, NamedExample};
use crate::syntax_retrieval_stats::retrieval_stats;
use ::serde::Serialize;
use ::util::paths::PathStyle;
@@ -22,6 +24,7 @@ use language_model::LanguageModelRegistry;
use project::{Project, Worktree};
use reqwest_client::ReqwestClient;
use serde_json::json;
use std::io;
use std::{collections::HashSet, path::PathBuf, process::exit, str::FromStr, sync::Arc};
use zeta2::{ContextMode, LlmContextOptions, SearchToolQuery};
@@ -48,6 +51,11 @@ enum Command {
#[command(subcommand)]
command: Zeta2Command,
},
ConvertExample {
path: PathBuf,
#[arg(long, value_enum, default_value_t = ExampleFormat::Md)]
output_format: ExampleFormat,
},
}
#[derive(Subcommand, Debug)]
@@ -641,6 +649,15 @@ fn main() {
}
},
},
Command::ConvertExample {
path,
output_format,
} => {
let example = NamedExample::load(path).unwrap();
example.write(output_format, io::stdout()).unwrap();
let _ = cx.update(|cx| cx.quit());
return;
}
};
match result {

View File

@@ -165,6 +165,5 @@
- [Local Collaboration](./development/local-collaboration.md)
- [Using Debuggers](./development/debuggers.md)
- [Glossary](./development/glossary.md)
- [Release Process](./development/releases.md)
- [Release Notes](./development/release-notes.md)
- [Debugging Crashes](./development/debugging-crashes.md)

View File

@@ -3163,6 +3163,12 @@ Non-negative `integer` values
- Setting: `search_wrap`
- Default: `true`
## Center on Match
- Description: If `center_on_match` is enabled, the editor will center the cursor on the current match when searching.
- Setting: `center_on_match`
- Default: `false`
## Seed Search Query From Cursor
- Description: When to populate a new search's query based on the text under the cursor.

View File

@@ -88,7 +88,6 @@ in-depth examples and explanations.
## Contributor links
- [CONTRIBUTING.md](https://github.com/zed-industries/zed/blob/main/CONTRIBUTING.md)
- [Releases](./development/releases.md)
- [Debugging Crashes](./development/debugging-crashes.md)
- [Code of Conduct](https://zed.dev/code-of-conduct)
- [Zed Contributor License](https://zed.dev/cla)

View File

@@ -165,6 +165,58 @@ $ cargo heaptrack -b zed
When this zed instance is exited, terminal output will include a command to run `heaptrack_interpret` to convert the `*.raw.zst` profile to a `*.zst` file which can be passed to `heaptrack_gui` for viewing.
## Perf recording
How to get a flamegraph with resolved symbols from a running Zed instance. Use
this when Zed is using a lot of CPU; it is not useful for hangs.
### During the incident
- Find the PID (process ID) using:
`ps -eo size,pid,comm | grep zed | sort -nr | head -n 1 | awk '{print $2}'`
Or find the PID of the `zed-editor` command with the most RAM usage in something
like htop/btop/top.
- Install perf:
On Ubuntu and derivatives, run `sudo apt install linux-tools-common linux-tools-generic`.
- Perf Record:
run `sudo perf record -p <pid you just found>`, wait a few seconds to gather data, then press Ctrl+C. You should now have a `perf.data` file.
- Make the output file user owned:
run `sudo chown $USER:$USER perf.data`
- Get build info:
Run Zed again and type `zed: about` in the command palette to get the exact commit.
The `perf.data` file can be sent to Zed together with the exact commit.
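The pid lookup above can be made more robust. Here is a hedged sketch (not part of the repository's scripts) that uses awk instead of cut, since `ps` pads its columns with variable whitespace, and sorts numerically in reverse so the largest process comes first:

```shell
# Hypothetical helper: print the PID of the zed process using the most memory.
zed_pid() {
  ps -eo size,pid,comm \
    | awk '$3 ~ /zed/ {print $1, $2}' \
    | sort -nr \
    | head -n 1 \
    | awk '{print $2}'
}
# usage: sudo perf record -p "$(zed_pid)"
```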
### Later
This can be done by Zed staff.
- Build Zed with symbols:
Check out the commit found previously and modify `Cargo.toml`.
Apply the following diff then make a release build.
```diff
[profile.release]
-debug = "limited"
+debug = "full"
```
- Add the symbols to the perf database:
`perf buildid-cache -v -a <path to release zed binary>`
- Resolve the symbols from the db:
`perf inject -i perf.data -o perf_with_symbols.data`
- Install flamegraph:
`cargo install cargo-flamegraph`
- Render the flamegraph:
`flamegraph --perfdata perf_with_symbols.data`
## Troubleshooting
### Cargo errors claiming that a dependency is using unstable features

View File

@@ -1,147 +0,0 @@
# Zed Releases
Read about Zed's [release channels here](https://zed.dev/faq#what-are-the-release-channels).
## Wednesday Release Process
You will need write access to the Zed repository to do this.
Credentials for various services used in this process can be found in 1Password.
Use the `releases` Slack channel to notify the team that releases will be starting.
This is mostly a formality on Wednesday's minor update releases, but can be beneficial when doing patch releases, as other devs may have landed fixes they'd like to cherry pick.
### Starting the Builds
1. Checkout `main` and ensure your working copy is clean.
1. Run `git fetch && git pull` to ensure you have the latest commits locally.
1. Run `git fetch --tags --force` to forcibly ensure your local tags are in sync with the remote.
1. Run `./script/get-stable-channel-release-notes` and store output locally.
1. Run `./script/bump-zed-minor-versions`.
- Push the tags and branches as instructed.
1. Run `./script/get-preview-channel-changes` and store output locally.
> **Note:** Always prioritize the stable release.
> If you've completed aggregating stable release notes, you can move on to working on aggregating preview release notes, but once the stable build has finished, work through the rest of the stable steps to fully publish.
> Preview can be finished up after.
### Stable Release
1. Aggregate stable release notes.
- Follow the instructions at the end of the script and aggregate the release notes into one structure.
1. Once the stable release draft is up on [GitHub Releases](https://github.com/zed-industries/zed/releases), paste the stable release notes into it and **save**.
- **Do not publish the draft!**
1. Check the stable release assets.
- Ensure the stable release job has finished without error.
- Ensure the draft has the proper number of assets—releases currently have 12 assets each (as of v0.211).
- Download the artifacts for the stable release draft and test that you can run them locally.
1. Publish the stable draft on [GitHub Releases](https://github.com/zed-industries/zed/releases).
- Use [Vercel](https://vercel.com/zed-industries/zed-dev) to check the progress of the website rebuild.
The release will be public once the rebuild has completed.
1. Post the stable release notes to social media.
- Bluesky and X posts will already be built as drafts in [Buffer](https://buffer.com).
- Double-check links.
- Publish both, one at a time, ensuring both are posted to each respective platform.
1. Send the stable release notes email.
- The email broadcast will already be built as a draft in [Kit](https://kit.com).
- Double-check links.
- Publish the email.
### Preview Release
1. Aggregate preview release notes.
- Take the script's output and build release notes by organizing each release note line into a category.
- Use a prior release for the initial outline.
- Make sure to append the `Credit` line, if present, to the end of each release note line.
1. Once the preview release draft is up on [GitHub Releases](https://github.com/zed-industries/zed/releases), paste the preview release notes into it and **save**.
- **Do not publish the draft!**
1. Check the preview release assets.
- Ensure the preview release job has finished without error.
- Ensure the draft has the proper number of assets—releases currently have 12 assets each (as of v0.211).
- Download the artifacts for the preview release draft and test that you can run them locally.
1. Publish the preview draft on [GitHub Releases](https://github.com/zed-industries/zed/releases).
- Use [Vercel](https://vercel.com/zed-industries/zed-dev) to check the progress of the website rebuild.
The release will be public once the rebuild has completed.
### Prep Content for Next Week's Stable Release
1. Build social media posts based on the popular items in preview.
- Draft the copy in the [tweets](https://zed.dev/channel/tweets-23331) channel.
- Create the preview media (videos, screenshots).
- For features that you film videos around, try to create alternative photo-only versions to be used in the email, as videos and GIFs aren't great for email.
- Store all created media in `Feature Media` in our Google Drive.
- Build X and Bluesky post drafts (copy and media) in [Buffer](https://buffer.com), to be sent for next week's stable release.
**Note: These are preview items and you may discover bugs.**
**This is a very good time to report these findings to the team!**
1. Build email based on the popular items in preview.
- You can reuse the copy and photo media from the preview social media posts.
- Create a draft email in [Kit](https://kit.com), to be sent for next week's stable release.
## Patch Release Process
If your PR fixes a panic or a crash, you should cherry-pick it to the current stable and preview branches.
If your PR fixes a regression in recently released code, you should cherry-pick it to preview.
You will need write access to the Zed repository to do this:
---
1. Send a PR containing your change to `main` as normal.
1. Once it is merged, cherry-pick the commit locally to either of the release branches (`v0.XXX.x`).
- In some cases, you may have to handle a merge conflict.
More often than not, this will happen when cherry-picking to stable, as the stable branch is more "stale" than the preview branch.
1. After the commit is cherry-picked, run `./script/trigger-release {preview|stable}`.
This will bump the version numbers, create a new release tag, and kick off a release build.
- This can also be run from the [GitHub Actions UI](https://github.com/zed-industries/zed/actions/workflows/bump_patch_version.yml):
![](https://github.com/zed-industries/zed/assets/1486634/9e31ae95-09e1-4c7f-9591-944f4f5b63ea)
1. Once release drafts are up on [GitHub Releases](https://github.com/zed-industries/zed/releases), proofread and edit the release notes as needed and **save**.
- **Do not publish the drafts, yet.**
1. Check the release assets.
- Ensure the stable / preview release jobs have finished without error.
- Ensure each draft has the proper number of assets—releases currently have 10 assets each.
- Download the artifacts for each release draft and test that you can run them locally.
1. Publish stable / preview drafts, one at a time.
- Use [Vercel](https://vercel.com/zed-industries/zed-dev) to check the progress of the website rebuild.
The release will be public once the rebuild has completed.
## Nightly release process
In addition to the public releases, we also have a nightly build that we encourage employees to use.
Nightly is released by cron once a day, and can be shipped as often as you'd like.
There are no release notes or announcements, so you can just merge your changes to main and run `./script/trigger-release nightly`.

View File

@@ -11,7 +11,7 @@ The [Material Icon Theme](https://github.com/zed-extensions/material-icon-theme)
There are two important directories for an icon theme extension:
- `icon_themes`: This directory will contain one or more JSON files containing the icon theme definitions.
- `icons`: This directory contains the icons assets that will be distributed with the extension. You can created subdirectories in this directory, if so desired.
- `icons`: This directory contains the icon assets that will be distributed with the extension. You can create subdirectories in this directory, if so desired.
Each icon theme file should adhere to the JSON schema specified at [`https://zed.dev/schema/icon_themes/v0.3.0.json`](https://zed.dev/schema/icon_themes/v0.3.0.json).

View File

@@ -324,7 +324,7 @@ This query marks number and string values in key-value pairs and arrays for reda
The `runnables.scm` file defines rules for detecting runnable code.
Here's an example from an `runnables.scm` file for JSON:
Here's an example from a `runnables.scm` file for JSON:
```scheme
(

View File

@@ -4,19 +4,21 @@ Zed comes with a built-in icon theme, with more icon themes available as extensi
## Selecting an Icon Theme
See what icon themes are installed and preview them via the Icon Theme Selector, which you can open from the command palette with "icon theme selector: toggle".
See what icon themes are installed and preview them via the Icon Theme Selector, which you can open from the command palette with `icon theme selector: toggle`.
Navigating through the icon theme list by moving up and down will change the icon theme in real time and hitting enter will save it to your settings file.
## Installing more Icon Themes
More icon themes are available from the Extensions page, which you can access via the command palette with `zed: extensions` or the [Zed website](https://zed.dev/extensions).
More icon themes are available from the Extensions page, which you can access via the command palette with `zed: extensions` or the [Zed website](https://zed.dev/extensions?filter=icon-themes).
## Configuring Icon Themes
Your selected icon theme is stored in your settings file. You can open your settings file from the command palette with `zed: open settings file` (bound to `cmd-alt-,` on macOS and `ctrl-alt-,` on Linux).
Your selected icon theme is stored in your settings file.
You can open your settings file from the command palette with {#action zed::OpenSettingsFile} (bound to {#kb zed::OpenSettingsFile}).
Just like with themes, Zed allows for configuring different icon themes for light and dark mode. You can set the mode to `"light"` or `"dark"` to ignore the current system mode.
Just like with themes, Zed allows for configuring different icon themes for light and dark mode.
You can set the mode to `"light"` or `"dark"` to ignore the current system mode.
```json [settings]
{

Some files were not shown because too many files have changed in this diff.