Compare commits

...

193 Commits

Author SHA1 Message Date
Julia Ryan
1caafceac0 Format 2025-11-06 14:16:15 -08:00
Julia Ryan
dc5e3fafc9 Fix a bunch of things
Co-authored-by: David Kleingeld <davidsk@zed.dev>
2025-11-06 14:10:56 -08:00
Julia Ryan
2b1e2571d0 Fix smart_case 2025-11-06 11:41:52 -08:00
Julia Ryan
9cbc6d174e Weight path matches differently
Co-authored-by: David Kleingeld <davidsk@zed.dev>
2025-11-06 11:41:52 -08:00
Julia Ryan
509686c428 wip 2025-11-06 11:41:52 -08:00
Julia Ryan
ef1624738d Add fuzzy path matching
Add path match test

Fix segment splitting

Co-authored-by: David Kleingeld <davidsk@zed.dev>

Add tests and tweak ordering

Co-authored-by: David Kleingeld <davidsk@zed.dev>

Fix penalize_length

Fix all the Ord impls

Co-authored-by: David Kleingeld <davidsk@zed.dev>

Fix file_finder ordering

Co-authored-by: David Kleingeld <davidsk@zed.dev>

Add tests
2025-11-06 11:41:51 -08:00
Julia Ryan
0f3993b92a Use nucleo for fuzzy string matching
Co-authored-by: David Kleingeld <davidsk@zed.dev>
2025-11-06 11:41:51 -08:00
Cave Bats Of Ware
a112153a2e Enable image support in remote projects (#39158)
Adds support for opening and displaying images in remote projects. The
server streams image data to the client in chunks, and the client then
reconstructs the image and displays it (a rough sketch of that reassembly
follows the list below). This change includes:

- Adding `image` crate as a dependency for remote_server
- Implementing `ImageStore` for remote access
- Creating proto definitions for image-related messages
- Adding handlers for creating images for peers
- Computing image metadata from bytes instead of reading from disk for
remote images
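
A rough sketch of the client-side reassembly, assuming a hypothetical chunk message with `data` and `done` fields (the actual proto definitions differ); metadata is computed from the received bytes with the `image` crate rather than read from disk:

```rust
// Hypothetical sketch only: `ImageChunk` and its fields are illustrative,
// not the actual proto messages added in this change.
use image::GenericImageView;

struct ImageChunk {
    data: Vec<u8>,
    done: bool,
}

fn reassemble_image(chunks: impl IntoIterator<Item = ImageChunk>) -> image::ImageResult<(u32, u32)> {
    let mut bytes = Vec::new();
    for chunk in chunks {
        // Append each streamed chunk until the server signals completion.
        bytes.extend_from_slice(&chunk.data);
        if chunk.done {
            break;
        }
    }
    // Decode from memory and compute metadata from the bytes themselves.
    let decoded = image::load_from_memory(&bytes)?;
    Ok(decoded.dimensions())
}
```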

Closes #20430
Closes #39104
Closes #40445

Release Notes:

- Added support for image preview in remote sessions.
- Fixed #39104

<img width="982" height="551" alt="image"
src="https://github.com/user-attachments/assets/575428a3-9144-4c1f-b76f-952019ea14cc"
/>
<img width="978" height="547" alt="image"
src="https://github.com/user-attachments/assets/fb58243a-4856-4e73-bb30-8d5e188b3ac9"
/>

---------

Co-authored-by: Julia Ryan <juliaryan3.14@gmail.com>
2025-11-06 11:31:32 -08:00
Richard Feldman
d21184b1d3 Fix ACP extension root dir (#42131)
Agents running in extensions need to have a root directory of the
extension's dir for installation and authentication, but *not* for the
conversation itself - otherwise the agent is running things like
terminal commands in the wrong dir.

Release Notes:

- Fixed Agent Server extensions having the current working directory of
the extension rather than the project
2025-11-06 19:19:16 +00:00
Anthony Eid
ba136abf6c editor: Add action to move between snippet tabstop positions v2 (#42127)
Closes https://github.com/zed-industries/zed/issues/41407

This PR fixes the issues that caused #41407 to be reverted in #42008.
Namely that the action context didn't take into account if a snippet
could move backwards or forwards, and the action shared the same key
mapping as `editor::MoveToPreviousWordStart` and
`editor::MoveToNextWordEnd`.

I changed the default key mapping for the move to snippet tabstop to tab
and shift-tab to match the default behavior of other editors.

Release Notes:

- Editor: Add actions to move between snippet tabstop positions
2025-11-06 18:49:39 +00:00
Lukas Wirth
2d45c23fb0 remote: Flush to stdin when writing to sftp 2 (#42126)
https://github.com/zed-industries/zed/pull/42103#issuecomment-3498137130

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-06 18:18:19 +00:00
Bo Lorentsen
b78f19982f Allow for using config specific gdb binaries in gdb adapter (#37193)
I really need this for embedded development with IDF and Zephyr, where the
chip vendors sometimes provide their own specialized versions of the tools,
and I need to direct Zed to use these.

The current GDB adapter only supports the gdb it finds on the normal search
path, and it also seems like we were not able to pass gdb-specific startup
arguments (only `-i=dap` is set).

To fix this I (with some help from GPT-4.1) expanded the GDB adapter with
two new config options:

* **gdb_path** holds a full path to the gdb executable; for now only a full
path or a path relative to the cwd
* **gdb_args** an array holding additional arguments given to gdb on
startup

It also seemed to me that the `env` config was not transferred to gdb, so
that is added too.

I have tested this locally, and it seems to work, not just compile :-)
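
A minimal sketch of how an adapter might consume these options; the struct, field handling, and command construction here are illustrative assumptions, not the actual Zed adapter code:

```rust
use std::collections::HashMap;
use std::path::{Path, PathBuf};
use std::process::Command;

// Hypothetical config shape for illustration only.
struct GdbConfig {
    gdb_path: Option<PathBuf>,
    gdb_args: Vec<String>,
    env: HashMap<String, String>,
}

fn build_gdb_command(config: &GdbConfig) -> Command {
    // Fall back to whatever `gdb` is on PATH when no explicit path is given.
    let program = config
        .gdb_path
        .as_deref()
        .unwrap_or_else(|| Path::new("gdb"));
    let mut cmd = Command::new(program);
    cmd.arg("-i=dap"); // required for DAP mode
    cmd.args(&config.gdb_args); // extra startup arguments from the config
    cmd.envs(&config.env); // pass along the configured environment
    cmd
}
```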

Release Notes:

- debugger: Adds gdb_path and gdb_args to gdb debug adapter options
- debugger: Fix bug where gdb debug sessions wouldn't inherit the shell
environment from Zed

---------

Co-authored-by: Anthony <anthony@zed.dev>
2025-11-06 12:53:59 -05:00
Lukas Wirth
a7fac65d62 gpui: Remove unneeded weak Rc cycle in WindowsWindowInner (#42119)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-06 17:33:09 +00:00
Lukas Wirth
bf8864d106 gpui: Remove all (unsound) ManuallyDrop usages, panic on device loss (#42114)
Given that we will panic anyway when we lose our devices unrecoverably, we
might as well do so eagerly, which makes it clearer.

Additionally this PR replaces all uses of `ManuallyDrop` with `Option`,
as otherwise we need to do manual bookkeeping of what is and isn't
initialized when we try to recover devices as we can bail out halfway
while recovering. In other words, the code prior to this was fairly
unsound due to freely using `ManuallyDrop::drop`.
 
Fixes ZED-1SS
 
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-06 17:29:15 +00:00
Nia
08ee4f7966 notify: Bump to rebased 8.2.0 fork (#42113)
Hopefully makes progress towards #38109, #39266

Release Notes:

- N/A
2025-11-06 16:30:17 +00:00
Lukas Wirth
6f6f652cf2 zlog: Add env var to enable line number logging (#41905)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-06 15:31:26 +00:00
Lukas Wirth
9c8e37a156 clangd: Fix switch source header action on windows (#42105)
Fixes https://github.com/zed-industries/zed/issues/40935

Release Notes:

- Fixed clangd's switch source header action not working on windows
2025-11-06 15:12:34 +00:00
Kirill Bulatov
d54c64f35a Refresh outline panel on file renames (#42104)
Closes https://github.com/zed-industries/zed/issues/41877

Release Notes:

- Fixed outline panel not updating file headers on rename
2025-11-06 14:46:46 +00:00
Lukas Wirth
0b53da18d5 remote: Flush to stdin when writing to sftp (#42103)
https://github.com/zed-industries/zed/issues/42027#issuecomment-3497210172

Release Notes:

- Fixed ssh remoting potentially failing due to not flushing stdin to
sftp
2025-11-06 14:27:26 +00:00
Libon
5ced3ef0fd editor: Improve multibuffer header spacing (#42071)
BEFORE:
<img width="1751" height="629" alt="image"
src="https://github.com/user-attachments/assets/f88464d3-5daa-4d53-b394-f92db8b0fd8c"
/>

AFTER:
<img width="1714" height="493" alt="image"
src="https://github.com/user-attachments/assets/022c883b-b219-40a3-aa5f-0c16d23e8abf"
/>

Release Notes:

- N/A

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-06 14:09:06 +00:00
ʟᴜɴᴇx
7a37dd9433 Do not serialize buffers containing bundled files (#38102)
Fix bundled file persistence by introducing SerializationMode enum

Closes: #38094

What's the issue?

Opening bundled files like Default Key Bindings
(zed://settings/keymap-default.json) was causing SQLite foreign key
constraint errors. The editor was trying to save state for these
read-only assets just like regular files, but since bundled files don't
have database entries, the foreign key constraint would fail.

The fix

Replaced the boolean serialize_dirty_buffers flag with a type-safe
SerializationMode enum:

```rust
pub enum SerializationMode {
    Enabled,   // Regular files persist across sessions
    Disabled,  // Bundled files don't persist
}
```

This prevents serialization at the source: workspace_id() returns None
for disabled editors, serialize() bails early, and should_serialize()
returns false. When opening bundled files, we set the mode to Disabled
from the start, so they're treated as transient views that never
interact with the persistence layer. A simplified sketch of that guard
follows (names and types are illustrative, not the exact editor internals):
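
```rust
// Illustrative sketch of the guard described above; field and type names
// are simplified and not the actual editor internals.
pub enum SerializationMode {
    Enabled,
    Disabled,
}

struct Editor {
    serialization_mode: SerializationMode,
    workspace_id: Option<u64>, // hypothetical id type
}

impl Editor {
    fn workspace_id(&self) -> Option<u64> {
        match self.serialization_mode {
            SerializationMode::Enabled => self.workspace_id,
            // Bundled (read-only) files never touch the persistence layer.
            SerializationMode::Disabled => None,
        }
    }

    fn should_serialize(&self) -> bool {
        matches!(self.serialization_mode, SerializationMode::Enabled)
            && self.workspace_id().is_some()
    }
}
```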

Changes

- editor.rs: Added SerializationMode enum and updated serialization
methods to respect it
- items.rs: Guarded should_serialize() to prevent disabled editors from
being serialized
- zed.rs: Set SerializationMode::Disabled in open_bundled_file()

Result

Bundled files open cleanly without SQLite errors and don't persist
across workspace reloads (expected behavior). Regular file persistence
remains unaffected.

Release Notes:

- Fixed SQLite foreign key constraint errors when opening bundled files like
Default Key Bindings.

---------

Co-authored-by: MrSubidubi <finn@zed.dev>
2025-11-06 13:30:03 +00:00
Danilo Leal
a7163623e7 agent_ui: Make "add more agents" menu item take to extensions (#42098)
Now that agent servers are a thing, this is the primary and easiest way
to quickly add more agents to Zed, without touching any settings JSON
file. :)

Release Notes:

- N/A
2025-11-06 09:55:06 -03:00
Lukas Wirth
f08068680d agent_ui: Do not show Codex wsl warning on wsl take 2 (#42096)
https://github.com/zed-industries/zed/pull/42079#discussion_r2498472887

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-06 12:15:17 +00:00
Lukas Wirth
a951e414d8 util: Fix shell environment fetching with cmd (#42093)
Release Notes:

- Fixed shell environment fetching failing when having `cmd` configured
as terminal shell
2025-11-06 12:07:34 +00:00
Lukas Wirth
e75c6b1aa5 remote: Fix detect_can_exec detection (#42087)
Closes https://github.com/zed-industries/zed/issues/42036

Release Notes:

- Fixed an issue with wsl exec detection eagerly failing, breaking
remote connections
2025-11-06 12:06:32 +00:00
Kirill Bulatov
149eedb73d Fix scroll position restoration (#42088)
Follow-up of https://github.com/zed-industries/zed/pull/42035
Scroll position needs to be stored immediately, otherwise editor close
may not register that.

Release Notes:

- N/A
2025-11-06 11:37:27 +00:00
Lukas Wirth
fb46bae3ed remote: Add missing quotation in extract_server_binary (#42085)
Also respect the shell env for various commands again
Should close https://github.com/zed-industries/zed/issues/42027

Release Notes:

- Fixed remote server installation failing on some setups
2025-11-06 11:19:24 +00:00
Lukas Wirth
28d7c37b0d recent_projects: Improve user facing error messages on connection failure (#42083)
cc https://github.com/zed-industries/zed/issues/42004

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-06 11:00:41 +00:00
Lukas Wirth
f6da987d4c agent_ui: Do not show Codex wsl warning on wsl (#42079)
Release Notes:

- Fixed the codex wsl warning being shown on wsl itself
2025-11-06 10:34:14 +00:00
Kirill Bulatov
efc71f35a5 Tone down extension errors (#42080)
Before:
<img width="2032" height="1161" alt="before"
src="https://github.com/user-attachments/assets/5c497b47-87e8-4167-bc28-93e34556ea4d"
/>

After:
<img width="2032" height="1161" alt="after"
src="https://github.com/user-attachments/assets/4a87803f-67df-4bf8-ade0-306f3c9ca81e"
/>

Release Notes:

- N/A
2025-11-06 10:12:04 +00:00
Lukas Wirth
3b7ee58cfa zed: Attach console to parent process before processing --printenv (#42075)
Otherwise the `--printenv` flag will simply not work on windows

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-06 09:47:54 +00:00
Lukas Wirth
32047bef93 text: Improve panic messages with more information (#42072)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-06 09:16:45 +00:00
Lukas Wirth
4003287cc3 vim: Downgrade user config error from panic to log (#42070)
Fixes ZED-2W3

Release Notes:

- Fixed panic due to invalid vim keycap
2025-11-06 08:37:04 +00:00
Lukas Wirth
001a47c8b7 gpui: Inline some hot recursive scope functions (#42069)
This reduces stack usage in prepainting slightly as we stack less
references to window/app.

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-06 08:31:39 +00:00
Lukas Wirth
113f0780b3 agent_ui: Fix string slicing panic in message editor (#42068)
Fixes ZED-302

Release Notes:

- Fixed a panic in agent message editor when using multibyte whitespace
characters
2025-11-06 07:58:20 +00:00
Lexi Mattick
273321608f gpui: Add debug assertion to Window::on_action and make docs consistent (#41202)
Improves formatting consistency across various docs, fixes some typos,
and adds a missing `debug_assert_paint` to `Window::on_action` and
`Window::on_action_when`.

Release Notes:

- N/A
2025-11-06 07:55:20 +00:00
Smit Barmase
92cfce568b language: Fix completion menu no longer prioritizes relevant items for Typescript and Python (#42065)
Closes #41672

Regressed in https://github.com/zed-industries/zed/pull/40242

Release Notes:

- Fixed issue where completion menu no longer prioritizes relevant items
for TypeScript and Python.
2025-11-06 13:19:46 +05:30
Coenen Benjamin
2b6cf31ace file_finder: Display duplicated file in file finder history (#41917)
Closes #41850

When digging into this I figured out what was going on: the file finder
history doesn't update the name of a duplicated file. When you duplicate a
file it is automatically named `filename copy`, and that filename was added
to the history but never updated, so once you wanted to go back to this file
it was no longer part of the file finder's displayed history, because the
file doesn't exist anymore even though the entity id remains the same.
I was also able to reproduce this bug by just renaming a file.

Release Notes:

- Fixed: Display duplicated file in file finder history

---------

Signed-off-by: Benjamin <5719034+bnjjj@users.noreply.github.com>
Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
Co-authored-by: Lukas Wirth <me@lukaswirth.dev>
2025-11-06 07:06:45 +00:00
Lemon
58cec41932 Adjust terminal decorative character ranges to include missing Powerline characters (#42043)
Closes #41975.

This change adjusts some of the ranges which were incorrectly labeled or
excluded characters. The new ranges include three codepoints which are
not assigned in Nerd Fonts. However, one of these codepoints was
already included prior to this change. These codepoints are:

- U+E0C9, between the two ice separators
- U+E0D3, between the two trapezoid separators. The ranges prior to this
PR already included this one.
- U+E0D5, between the trapezoid separators and the inverted triangle
separators

I included these so as to not overcomplicate the ranges by
cherry-picking the defined codepoints. That being said, if we're okay
with this and an additional unassigned codepoint (U+E0CB, between ice
separators and honeycomb separators) being included then a simple range
from 0xE0B0 to 0xE0D7 nicely includes all of the Powerline characters.

I wasn't sure how to write tests for this so I just added two characters
to the existing tests which were previously not covered lol. All of the
Powerline characters can be seen
[here](https://www.nerdfonts.com/cheat-sheet) by searching `nf-pl`.

Release Notes:

- Fixed certain Powerline characters incorrectly having terminal
contrast adjustment applied.
2025-11-06 06:38:34 +00:00
Conrad Irwin
2ec5ca0e05 Fix generate release notes script on first stable (#42061)
Don't crash in generate-release-notes on the first stable
commit on a branch.

Release Notes:

- N/A
2025-11-05 23:25:30 -07:00
Conrad Irwin
f8da550867 Refresh zed.dev releases page after releases (#42060)
Release Notes:

- N/A
2025-11-05 23:25:13 -07:00
Mayank Verma
0b1d3d78a4 git: Fix pull failing when tracking remote with different branch name (#41768)
Closes #31430

Release Notes:

- Fixed git pull failing when tracking remote with different branch name

Here's a before/after comparison when `dev` branch has upstream set to
`origin/main`:


https://github.com/user-attachments/assets/3a47e736-c7b7-4634-8cd1-aca7300c3a73
2025-11-06 05:18:08 +00:00
Cole Miller
930b489d90 ci: Don't require protobuf and postgres checks for tests_pass for now (#42057)
For now, there are cases where we want to merge PRs (advisedly) even
though these checks fail.

Release Notes:

- N/A
2025-11-06 00:03:50 -05:00
Delvin
121cee8045 git: Add cursor pointer on last commit to check changes (#41960)
Release Notes:

- Improved visual cue on the git panel UI to check previous commit changes

Before: 
<img width="1470" height="956" alt="Screenshot 2025-11-05 at 2 06 49 pm"
src="https://github.com/user-attachments/assets/b8c54bb6-c8b8-4d36-a14f-71d725ed68f2"
/>

After:
<img width="1470" height="956" alt="Screenshot 2025-11-05 at 2 06 24 pm"
src="https://github.com/user-attachments/assets/d8d96f9e-ceed-4c02-9f93-de9fd3dfcbf1"
/>
2025-11-05 23:27:38 -05:00
Viraj Bhartiya
5360dc1504 Refactor timestamp formatting in Git UI components to use chrono for local time calculations (#41005)
- Updated `blame_ui.rs`, `branch_picker.rs`, `commit_tooltip.rs`, and
`commit_view.rs` to replace the previous timestamp formatting with
`chrono` for better accuracy in local time representation.
- Introduced `chrono::Local::now().offset().local_minus_utc()` to obtain
the local offset for timestamp formatting.
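
For illustration, a minimal sketch of converting a commit's unix timestamp using the local offset mentioned above (the actual UI formatting differs):

```rust
use chrono::{DateTime, FixedOffset, Local};

fn format_commit_time(unix_seconds: i64) -> Option<String> {
    // Local UTC offset in seconds, obtained as described above.
    let offset = FixedOffset::east_opt(Local::now().offset().local_minus_utc())?;
    // Convert the commit's unix timestamp into local time for display.
    let time: DateTime<FixedOffset> =
        DateTime::from_timestamp(unix_seconds, 0)?.with_timezone(&offset);
    Some(time.format("%Y-%m-%d %H:%M").to_string())
}
```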

Closes #40878

Release Notes:
- Improved timestamp handling in various Git UI components for enhanced
user experience.
2025-11-05 23:23:22 -05:00
Danilo Leal
69862790cb docs: Improve content in /ai/agent-panel and /ai/rules (#42055)
- Clarify about first-party supported features in external agents
- Create new section for selection as context for higher visibility
- Add keybindings for agent profiles
- Fix outdated setting using `assistant` instead of `agent`
- Add keybinding for accessing the rules library

Release Notes:

- N/A
2025-11-06 01:18:55 -03:00
ᴀᴍᴛᴏᴀᴇʀ
284d8f790a Support Forgejo and Gitea avatars in git blame (#41813)
Part of #11043.

Codeberg is a public instance of Forgejo, as confirmed by the API
documentation at https://codeberg.org/api/swagger. Therefore, I renamed
the related component from codeberg to forgejo and added codeberg.org as
a public instance.

Furthermore, to optimize request speed for the commit API, I set
`stat=false&verification=false&files=false`.

<img width="1650" height="1268" alt="CleanShot 2025-11-03 at 19 57
06@2x"
src="https://github.com/user-attachments/assets/c1b4129e-f324-41c2-86dc-5e4f7403c046"
/>

<br/>
<br/>

Regarding Gitea Support: 

Forgejo is a fork of Gitea, and their APIs are currently identical
(e.g., for getting avatars). However, to future-proof against potential
API divergence, I decided to treat them as separate entities. The
current gitea implementation is essentially a copy of the forgejo file
with the relevant type names and the public instance URL updated.

Release Notes:

- Added Support for Forgejo and Gitea avatars in git blame
2025-11-05 22:53:28 -05:00
Danilo Leal
f824e93eeb docs: Remove non-existing keybinding in /visual-customization (#42053)
Release Notes:

- N/A
2025-11-06 00:38:39 -03:00
Danilo Leal
e71bc4821c docs: Add section about agent servers in /external-agents (#42052)
Release Notes:

- N/A
2025-11-06 00:38:33 -03:00
Jason Lee
64c8c19e1b gpui: Impl Default for TextRun (#41084)
Release Notes:

- N/A

When I was implementing Input, I often used `TextRun`, but `background`,
`underline` and `strikethrough` were often not used.

So make change to simplify it.
2025-11-06 01:50:23 +00:00
Richard Feldman
622d626a29 Add agent-servers.md (#41609)
Release Notes:

- N/A

---------

Co-authored-by: Danilo Leal <67129314+danilo-leal@users.noreply.github.com>
2025-11-06 01:35:56 +00:00
Petros Amoiridis
714481073d Fix macOS new window stacking (#38683)
Closes #36206 

Disclaimer: I did use AI for help to end up with this proposed solution.
😅

## Observed behavior of native apps on macOS (like Safari)

I first did a quick research on how Safari behaves on macOS, and here's
what I have found:

1. Safari seems to position new windows with an offset based on the
currently active window
2. It keeps opening new windows with an offset until the new window
cannot fit the display bounds horizontally, vertically or both.
3. When it cannot fit horizontally, the new window opens at x=0
(y=active window's y)
4. When it cannot fit vertically, the new window opens at y=0 (x=active
window's x)
5. When it cannot fit both horizontally and vertically, the new window
opens at x=0 and y=0 (top left).
6. At any moment if I activate a different Safari window, the next new
window is offset off of that
7. If I resize the active window and open a new window, the new window
has the same size as the active window

So, I implemented the changes based on those observations.

I am not sure if touching `gpui/src/window.rs` is the way to go. I am
open to feedback and direction here.

I am also not sure if making my changes platform (macOS) specific, is
the right thing to do. I reckoned that Linux and Windows have different
default behaviors, and the original issue mentioned macOS. But,
likewise, I am open to take a different approach.

## Tests

I haven't included tests for such change, as it seems to me a bit
difficult to properly test this, other than just doing a manual
integration test. But if you would want them for such a change, happy to
try including them.

## Alternative approach

I also did some research on macOS native APIs that we could use instead
of trying to make the calculations ourselves, and I found
`NSWindow.cascadeTopLeftFromPoint` which seems to be doing exactly what
we want, and more. It probably takes more things into consideration and
thus it is more robust. We could go down that road, and add it to
`gpui/src/platform/mac/window.rs` and then use it for new window
creation. Again, if that's what you would do yourselves, let me know and
I can either change the implementation here, or open a new pull request
and let you decide which one would you would like to pursue.

## Video showing the behavior


https://github.com/user-attachments/assets/f802a864-7504-47ee-8c6b-8d9b55474899

🙇‍♂️

Release Notes:

- Improved macOS new window stacking
2025-11-06 01:22:49 +00:00
Sean Timm
eccdfed32b gpui: Convert macOS clipboard file URLs to paths for paste (#36848)
- On macOS, pasting now inserts the actual file path when the clipboard
contains a file URL (public.file-url/public.url); see the sketch after this
list
- Terminal paste remains text-only; no temp files or data URLs are
created. If only raw image bytes exist on the clipboard, paste is a
no-op.
- Scope: macOS only; no dependency changes.
- Added a test (test_file_url_converts_to_path) that verifies URL→path
conversion using a unique pasteboard.
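
A sketch of the URL-to-path conversion, using the `url` crate purely for illustration (the real implementation reads the macOS pasteboard types directly):

```rust
use std::path::PathBuf;
use url::Url;

fn file_url_to_path(clipboard_text: &str) -> Option<PathBuf> {
    let url = Url::parse(clipboard_text).ok()?;
    if url.scheme() != "file" {
        return None;
    }
    // e.g. "file:///Users/me/Pictures/cat.png" -> "/Users/me/Pictures/cat.png"
    url.to_file_path().ok()
}
```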

Release Notes:

- Improved pasting on macOS: now inserts the actual file path when the
clipboard contains a file URL (enables image paste support for Claude
Code)
2025-11-06 00:35:52 +00:00
Cyandev
2664596a34 gpui: Fix incorrect handling of Function key modifier on macOS (#38518)
On macOS, the Function key is reserved for system use and should not be
used in application code.

This commit updated keystroke matching and key event handling to ignore
the Function key modifier while users are typing or pressing
keybindings.

For some keyboards with a compact layout (like my 65% keyboard), there is
no separate backtick key; Esc and backtick share the same physical key. To
input a backtick, users may press `Fn-Esc`. However, macOS will still
deliver events with the Fn key modifier to applications. The Cocoa framework
handles this correctly, typically ignoring Fn directly. GPUI should follow
the same rule, otherwise the backtick key on those keyboards won't work.

Release Notes:

- Fixed a bug where typing fn-\` on macOS would not insert a `.
2025-11-05 23:19:32 +00:00
Richard Feldman
23f2fb6089 Run ACP login from same cwd as agent server (#42038)
This makes it possible to do login via things like `cmd: "node", args:
["my-node-file.js", "login"]`

Also, that command will now use Zed's managed `node` instance.

Release Notes:

- ACP extensions can now run terminal login commands using relative
paths
2025-11-05 18:17:50 -05:00
Julia Ryan
fb2c2c55dc Fix windows crash handler (#42039)
Closes #41471

We were killing the crash handler when it received a second copy of any
of the messages, but this GPU specs one is sent on each new window
rather than once at startup. We could gate the sending to only happen
once, but it's simpler to just allow multiple gpu specs messages.

Release Notes:

- N/A
2025-11-05 22:34:05 +00:00
Rémi Kalbe
8315fde1ff Fix LSP spawning by resetting exception ports in child processes (#40716)
## Summary

Fixes #36754

This PR fixes an issue where LSPs fail to spawn after the crash handler
is initialized.

## Problem

After PR #35263 added minidump crash reporting, some users experienced
LSP spawn failures. The issue manifests as:
- LSPs fail to spawn with no clear error messages
- The problem only occurs after crash handler initialization
- LSPs work when a debugger is attached, revealing a timing issue

### Root Cause

The crash handler installs Mach exception ports for minidump generation.
Due to a timing issue, child processes inherit these exception ports
before they're fully stabilized, which can block child process spawning.

## Solution

Reset exception ports in child processes using the `pre_exec()` hook,
which runs after `fork()` but before `exec()`. This prevents children
from inheriting the parent's crash handler exception ports.

### Implementation

- Adds macOS-specific implementation of `new_smol_command()` that resets
exception ports before exec
- Calls `task_set_exception_ports` to reset all exception ports to
`MACH_PORT_NULL`
- Graceful error handling: logs warnings but doesn't fail process
spawning if port reset fails
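
A sketch of the `pre_exec()` shape described above; the Mach call is stubbed out here rather than reproducing the actual unsafe bindings:

```rust
// Sketch only: the real code invokes task_set_exception_ports with
// MACH_PORT_NULL; here that call is replaced by a stub.
use std::io;
use std::os::unix::process::CommandExt;
use std::process::Command;

fn reset_exception_ports() -> io::Result<()> {
    // Stub standing in for the unsafe Mach API call described above.
    Ok(())
}

fn new_child_command(program: &str) -> Command {
    let mut command = Command::new(program);
    unsafe {
        // Runs after fork() but before exec() in the child.
        command.pre_exec(|| {
            // Log-and-continue semantics: a failed reset must not block spawning.
            if let Err(err) = reset_exception_ports() {
                eprintln!("failed to reset exception ports: {err}");
            }
            Ok(())
        });
    }
    command
}
```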

Release Notes:

- Fixed LSPs failing to spawn on some macOS systems

---------

Co-authored-by: Julia Ryan <juliaryan3.14@gmail.com>
2025-11-05 22:05:34 +00:00
John Tur
fc87440682 Update Windows docs (#41423)
- Document Arm64 support
- Document minimum Windows version requirements

Release Notes:

- N/A
2025-11-05 21:23:48 +00:00
Kirill Bulatov
c996eadaf5 Update editor data only after real scroll reports (#42035)
Release Notes:

- N/A
2025-11-05 21:22:29 +00:00
Danilo Leal
e8c6c1ba04 agent_ui: Fix how icons from external agents are displayed (#42034)
Release Notes:

- N/A
2025-11-05 21:14:16 +00:00
versecafe
b8364d7c33 node: Move managed runtime to v24 LTS (#41956)
Release Notes:

- Moved managed Node runtime to v24 LTS
2025-11-05 14:10:42 -07:00
John Tur
7c23ef89ec Fix corrupted characters being inserted when Alt is pressed (#42033)
The Alt+Numpad buffer that's maintained by the input stack is getting
corrupted, leading to garbage characters being inserted on keystrokes
like Alt+Up. Disable the automatic handling of Alt+Numpad for now until
the cause of this corruption is understood. The Alt+Numpad input did not
work anyway, so this does not regress anything.

Release Notes:

- windows: Fixed corrupted characters being inserted when Alt is pressed
(preview only)
2025-11-05 21:03:36 +00:00
Matt Miller
2f463370cc Refactor buffer headers to collapse on click (#42021)
Release Notes:

- Updated how clicking on multi-buffer headers works to provide better
control and prevent unexpected navigation:
  - Clicking the header now collapses/expands the file section instead of
opening the file.
  - Opening files can be done by clicking the filename or the "Open file"
button on the right side of the header.
  - Existing shortcuts continue to work: use the left chevron to collapse or
your keyboard shortcut to jump to the file.

**Demo:**

https://github.com/user-attachments/assets/dca9ccc5-bd98-416c-97af-43b4e4b2f903
2025-11-05 14:59:50 -06:00
Danilo Leal
feed34cafe gpui: Add support for rendering SVG from external files (#42024)
Release Notes:

- N/A

---------

Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
2025-11-05 16:12:30 -03:00
Joseph T. Lyons
4724aa5cb8 Bump Zed to v0.213 (#42018)
Release Notes:

- N/A
2025-11-05 17:31:18 +00:00
Cameron Mcloughlin
366a5db2c0 collab_ui: Show parents when searching channels (#42005)
Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-05 16:51:10 +00:00
Danilo Leal
81e87c4cd6 settings ui: Fix divider in items that doesn't have subfields (#42016)
Release Notes:

- N/A

Co-authored-by: cameron <cameron.studdstreet@gmail.com>
2025-11-05 16:49:19 +00:00
Andrew Farkas
b8ba663c20 Revert "Refactor completions" (#42014)
Reverts zed-industries/zed#41939
2025-11-05 16:48:28 +00:00
Anthony Eid
27fb1098fa Revert "editor: Add action to move between snippet tabstop positions" (#42008)
Reverts zed-industries/zed#41466

This PR would add "in_snippet" context when there wasn't a completion
menu visible, causing some actions to not be hit.

Release Notes:

- N/A
2025-11-05 11:16:39 -05:00
Danilo Leal
0f5a63a9b0 agent_ui: Make "waiting confirmation" state more apparent (#41998)
This PR changes the loading/generating indicator when in the "waiting
for tool call confirmation" state so that's a bit more visible and
discernible as needing your attention, as opposed to a regular
generating state.

<img width="400" alt="Screenshot 2025-11-05 at 10  46@2x"
src="https://github.com/user-attachments/assets/88adbf97-20fb-49c4-9c77-b0a3a22aa14e"
/>

Release Notes:

- agent: Improved the "waiting for confirmation" state visibility so
that you more rapidly know the agent is waiting for you to act.
2025-11-05 11:45:44 -03:00
Danilo Leal
c8ada5b1ae agent_ui: Reduce label repetitiveness on new thread menu (#42001)
Mostly just removing "thread" from all external agent menu items; I
think we can do without it and it already becomes much better/cleaner.

Release Notes:

- N/A
2025-11-05 11:45:31 -03:00
Techy
27a18843d4 open_ai: Make the deltas optional (#39142)
I am using an Azure OpenAI instance since that is what is provided at
work, and with how they have it set up, not all responses contain a delta,
which led to errors and truncated responses. This is related to how they
filter potentially offensive requests and responses. I don't believe this
filter was made in-house; instead I believe it is provided by
Microsoft/Azure, so I suspect this fix may help other users.
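
A sketch of what tolerating a missing delta looks like with serde; the type and field names are illustrative assumptions, not the crate's actual definitions:

```rust
use serde::Deserialize;

#[derive(Deserialize)]
struct StreamChoice {
    // Azure's content filter can emit chunks without a delta; default to None
    // instead of failing to deserialize the whole response.
    #[serde(default)]
    delta: Option<Delta>,
}

#[derive(Deserialize, Default)]
struct Delta {
    #[serde(default)]
    content: Option<String>,
}

fn text_of(choice: &StreamChoice) -> &str {
    choice
        .delta
        .as_ref()
        .and_then(|delta| delta.content.as_deref())
        .unwrap_or("")
}
```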

Release Notes:

- N/A

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-11-05 13:47:14 +01:00
Smit Barmase
2bc1d60c52 remote: Fix open terminal fails when $SHELL is not set (#41990)
Closes #41644

Release Notes:

- Fixed issue where it failed to spawn terminal on systems such as
Alpine.
2025-11-05 18:15:15 +05:30
Karl-Erik Enkelmann
17933f1222 Update documentation on Java support (#41758)
This brings the documentation on Java in line with the much changed
reality of the Java extension.
Note that the correctness of this is contingent on
https://github.com/zed-industries/extensions/pull/3745 being merged.

Release Notes:

- N/A
2025-11-05 12:24:29 +01:00
Vitaly Slobodin
cd87307289 language: Fix language detection for injected syntax layers (#41111)
Closes #40632

**TL;DR:** The `wrap selections in tag` action was unavailable in ERB
files, even when the cursor was positioned in HTML content (outside of
Ruby code blocks). This happened because `syntax_layer_at()` incorrectly
returned the Ruby language for positions that were actually in HTML.
**NOTE:** I am not familiar with that part of Zed so it could be that
the fix here is completely incorrect.

Previously, `syntax_layer_at` incorrectly reported injected languages
(e.g., Ruby in ERB files) even when the cursor was in the base language
content (HTML). This broke actions like `wrap selections in tag` that
depend on language-specific configuration.

The issue had two parts:
1. Missing start boundary check: The filter only checked if a layer's
end was after the cursor (`end_byte() > offset`), not if it started
before, causing layers outside the cursor position to be included. See
the `BEFORE` video: when I click on the HTML part it reports `Ruby`
language instead of `HTML`.
2. Wrong boundary reference for injections: For injected layers with
`included_sub_ranges` (like Ruby code blocks in ERB), checking the root
node boundaries returned the entire file range instead of the actual
injection ranges.

This fix:
- Adds the containment check using half-open range semantics [start,
end) for root node boundaries. That ensures proper reporting of the
detected language when a cursor (`|`) is located right after the
injection:

   ```
   <body>
    <%= yield %>|
  </body>
   ```

- Checks `included_sub_ranges` for injected layers to determine if the
cursor is actually within an injection
- Falls back to root node boundaries for base layers without sub-ranges.
This is the original behavior.

Fixes ERB language support where actions should be available based on
the cursor's actual language context. I think that also applies to some
other template languages like HEEX (Phoenix) and `*.pug`. On short
videos below you can see how I navigate through the ERB template and the
terminal on the right outputs the detected language if you apply the
following patch:

```diff
diff --git i/crates/editor/src/editor.rs w/crates/editor/src/editor.rs
index 15af61f5d2..54a8e0ae37 100644
--- i/crates/editor/src/editor.rs
+++ w/crates/editor/src/editor.rs
@@ -10671,6 +10671,7 @@ impl Editor {
         for selection in self.selections.disjoint_anchors_arc().iter() {
             if snapshot
                 .language_at(selection.start)
+                .inspect(|language| println!("Detected language: {:?}", language))
                 .and_then(|lang| lang.config().wrap_characters.as_ref())
                 .is_some()
             {
```

**Before:**


https://github.com/user-attachments/assets/3f8358f4-d343-462e-b6b1-3f1f2e8c533d


**After:**



https://github.com/user-attachments/assets/c1b9f065-1b44-45a2-8a24-76b7d812130d



Here is the ERB template:

```
<!DOCTYPE html>
<html>
  <head>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
    <style>
      /* Email styles need to be inline */
    </style>
  </head>
  <body>
    <%= yield %>
  </body>
</html>
```

Release Notes:

- N/A
2025-11-05 12:16:28 +01:00
Vitaly Slobodin
11b29d693f ruby: Add note about enabling Ruby LSP for ERB files (#41851)
Hi, this is a follow-up change to
https://github.com/zed-industries/zed/pull/41754. I think it's important to
keep existing things working, so this adds notes to the Ruby extension docs
about enabling Ruby LSP for ERB files as well. Thanks!

Release Notes:

- N/A
2025-11-05 11:00:12 +01:00
Lukas Wirth
c061698229 project: Fetch latest lsp data in deduplicate_range_based_lsp_requests (#41971)
Fixes ZED-2MK

Release Notes:

- Fixed a panic in inlay hints
2025-11-05 08:30:22 +00:00
John Tur
b4f7af066e Work around codegen bug with GetKeyboardState (#41970)
Works around an issue, which can be reproduced in the following program:
```rs
use windows::Win32::UI::Input::KeyboardAndMouse::{GetKeyboardState, VK_CONTROL};

fn main() {
    let mut keyboard_state = [0u8; 256];
    unsafe {
        GetKeyboardState(&mut keyboard_state).unwrap();
    }

    let ctrl_down = (keyboard_state[VK_CONTROL.0 as usize] & 0x80) != 0;
    println!("Is Ctrl down: {ctrl_down}");
}
```

In debug mode, this program prints the correct answer. In release mode,
it always prints false. The optimizer appears to think that
`keyboard_state` isn't mutated and remains zeroed, and folds the
`modifier_down` comparisons to `false`.

Release Notes:

- N/A
2025-11-05 08:06:14 +00:00
Smit Barmase
c83621fa1f editor: Fix setting multi_cursor_modifier opens implementation in new pane instead of new tab (#41963)
Closes #41014

Release Notes:

- Fixed an issue where `multi_cursor_modifier` set to `cmd_or_ctrl`
opens implementation in new pane instead of new tab.
2025-11-05 11:31:56 +05:30
Richard Feldman
0da52d6774 Add ACP terminal-login via _meta field (#41954)
As discussed with @benbrandt and @mikayla-maki:

* We now tell ACP clients we support the nonstandard `terminal-auth`
`_meta` field for terminal-based authentication
* In the future, we anticipate ACP itself supporting *some* form of
terminal-based authentication, but that hasn't been designed yet or gone
through the RFD process
* For now, this unblocks terminal-based auth

Release Notes:

- Added experimental terminal-based authentication to ACP support
2025-11-04 21:40:35 -05:00
Richard Feldman
60ee0dd19b Use our node runtime for ACP extensions (#41955)
Release Notes:

- Now ACP extensions use Zed's managed Node.js runtime

---------

Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
2025-11-04 21:40:23 -05:00
Conrad Irwin
9fc4abd8de Re-enable preview auto-release (#41952)
Patch releases to preview will now automatically release

Release Notes:

- N/A
2025-11-04 19:26:33 -07:00
John Tur
2ead8c42fb Improve Windows text input for international keyboard layouts and IMEs (#41259)
- Custom handling of dead keys has been removed. UX for dead keys is now
the same as other applications on Windows.
- We could bring back some kind of custom UI, but only if UX is fully
compatible with expected Windows behavior (e.g. ability to move the
cursor after typing a dead key).
  - Fixes https://github.com/zed-industries/zed/issues/38838
- Character input via AltGr shift state now always has priority over
keybindings. This applies regardless of whether the keystroke used the
AltGr key or Ctrl+Alt to enter the shift state.
- In particular, we use the following heuristic to determine whether a
keystroke should trigger character input first or trigger keybindings
first (a rough sketch of this decision appears after the list):
- If the keystroke does not have any of Ctrl/Alt/Win down, trigger
keybindings first.
- Otherwise, determine the character that would be entered by the
keystroke. If it is a control character, or no character at all, trigger
keybindings first.
- Otherwise, the keystroke has _any_ of Ctrl/Alt/Win down and generates
a printable character. Compare this character against the character that
would be generated if the keystroke had _none_ of Ctrl/Alt/Win down:
- If the character is the same, the modifiers are not significant;
trigger keybindings first.
- If there is no active input handler, or the active input handler
indicates that it isn't accepting text input (e.g. when an operator is
pending in Vim mode), character entry is not useful; trigger keybindings
first.
- Otherwise, assume the modifiers enable access to an otherwise
difficult-to-enter key; trigger character entry first.
  - Fixes https://github.com/zed-industries/zed/issues/35862
- Fixes
https://github.com/zed-industries/zed/issues/40054#issuecomment-3447833349
  - Fixes https://github.com/zed-industries/zed/issues/41486
- TranslateMessage calls are no longer skipped for unhandled keystrokes.
This fixes language input keys on Japanese and Korean keyboards (and
surely other cases as well).
- To avoid any other missing-TranslateMessage headaches in the future,
the message loop has been rewritten in a "traditional" Win32 style,
where accelerators are handled in the message loop and TranslateMessage
is called in the intended manner.
  - Fixes https://github.com/zed-industries/zed/issues/39971
  - Fixes https://github.com/zed-industries/zed/issues/40300
  - Fixes https://github.com/zed-industries/zed/issues/40321
  - Fixes https://github.com/zed-industries/zed/issues/40335
  - Fixes https://github.com/zed-industries/zed/issues/40592
  - Fixes https://github.com/zed-industries/zed/issues/40638
- As a bonus, Alt+Space now opens the system menu, since it is triggered
by the WM_SYSCHAR generated by TranslateMessage.
- VK_PROCESSKEYs are now ignored rather than being unwrapped and matched
against keybindings. This ensures that IMEs will reliably receive
keystrokes that they express interest in. This matches the behavior of
native Windows applications.
  - Fixes https://github.com/zed-industries/zed/issues/36736
  - Fixes https://github.com/zed-industries/zed/issues/39608
  - Fixes https://github.com/zed-industries/zed/issues/39991
  - Fixes https://github.com/zed-industries/zed/issues/41223
  - Fixes https://github.com/zed-industries/zed/issues/41656
  - Fixes https://github.com/zed-industries/zed/issues/34180
  - Fixes https://github.com/zed-industries/zed/issues/41766
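
A rough sketch of the decision described in the list above (illustrative only; the real implementation lives in gpui's Windows platform layer and differs in detail):

```rust
// Hypothetical types and names; only the decision logic follows the
// heuristic spelled out above.
struct Keystroke {
    ctrl: bool,
    alt: bool,
    win: bool,
    // Character produced with the current modifiers, if any.
    char_with_modifiers: Option<char>,
    // Character the same key would produce with none of Ctrl/Alt/Win held.
    char_without_modifiers: Option<char>,
}

fn keybindings_first(key: &Keystroke, input_handler_accepts_text: bool) -> bool {
    // No Ctrl/Alt/Win held: plain typing, keybindings get the first look.
    if !(key.ctrl || key.alt || key.win) {
        return true;
    }
    match key.char_with_modifiers {
        // No character, or a control character: nothing useful to type.
        None => true,
        Some(c) if c.is_control() => true,
        Some(c) => {
            if key.char_without_modifiers == Some(c) {
                // Modifiers are not significant for the produced character.
                true
            } else if !input_handler_accepts_text {
                // e.g. a pending Vim operator: character entry isn't useful.
                true
            } else {
                // Modifiers unlock an otherwise hard-to-enter character (AltGr).
                false
            }
        }
    }
}
```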


Release Notes:

- windows: Improved keyboard input handling for international keyboard
layouts and IMEs
2025-11-04 20:28:12 -05:00
Danilo Leal
0a4b1ac696 inline assistant: Mention ability to add context with @ in the placeholder (#41950)
This has been possible in the inline assistant for ages now and maybe
you didn't know because we didn't say anything about it! This PR fixes
that by including that you can @-mention context on it the same you can
in the agent panel.

Release Notes:

- N/A
2025-11-04 21:18:49 -03:00
Conrad Irwin
f9fb855990 Fetch (just) enough refs in script/cherry-pick (#41949)
Before this change we'd download all the tagged commits, but none of
their ancestors,
this was slow and made cherry-picking fail.

Release Notes:

- N/A
2025-11-04 17:09:43 -07:00
Conrad Irwin
b587a62ac3 No-op commit to test cherry-picking (#41948)
Closes #ISSUE

Release Notes:

- N/A
2025-11-04 23:35:19 +00:00
Conrad Irwin
1b2e38bb33 More tweaks to CI pipeline (#41941)
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-04 16:28:29 -07:00
Kirill Bulatov
4339c772e4 Tidy up Edit in json footer entries (#41890)
Before:
<img width="350" height="109" alt="before"
src="https://github.com/user-attachments/assets/d5d3e6bd-3a65-4d7d-8585-1e4d8f72997f"
/>

After:
<img width="310" height="103" alt="after"
src="https://github.com/user-attachments/assets/40137084-7323-4a79-b95b-a020c418646b"
/>

* All items got a keybinding label
* All items were made of a non-small size, to match the text size in the
right button
* Keybindings are rendered as disabled for disabled buttons

Release Notes:

- N/A

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-05 00:49:47 +02:00
tidely
ba7ea71c00 node_runtime: Improve proxy mapping (#41807)
Closes #ISSUE

More accurately map localhost to `127.0.0.1`. Previously we would
lowercase and simply replace all instances of localhost inside of the
URL string, meaning query parameters, username, password etc. could not
contain the string `localhost` or contain uppercase letters without
getting modified. Added a test ensuring the mapping logic works. The
previous implementation would fail this new test.
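
A sketch of host-only mapping with the `url` crate (assumed here for illustration), which leaves `localhost` in query parameters, usernames, and passwords untouched:

```rust
use url::Url;

fn map_localhost_proxy(proxy: &str) -> Result<String, url::ParseError> {
    let mut url = Url::parse(proxy)?;
    if url
        .host_str()
        .is_some_and(|host| host.eq_ignore_ascii_case("localhost"))
    {
        // Only the host component is rewritten; the rest of the URL is untouched.
        url.set_host(Some("127.0.0.1"))?;
    }
    Ok(url.to_string())
}
```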

Release Notes:

- Improved the behavior of mapping `localhost` to `127.0.0.1` when
passing configured proxy urls to `node`
2025-11-04 17:11:54 -05:00
Andrew Farkas
cd04450273 Refactor completions (#41939)
This is progress toward multi-word snippets (including snippets with
prefixes containing symbols)

Release Notes:

- Removed `trigger` argument in `ShowCompletions` command

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-04 20:18:03 +00:00
Sergey Cherepanoff
769a8a650e windows: Automatically find Windows SDK when building gpui (#38711)
### Summary

This PR changes `gpui/build.rs` to look up the Windows SDK directory in
the registry instead of falling back to a hard-coded path.

---

### Problem

Currently, building `gpui` on Windows requires `fxc.exe` to be in `PATH`
or at a predefined location (unless `GPUI_FXC_PATH` is set). This
requires to maintain a certain build environment with proper paths/vars
or to install the specific SDK version.

It is possible to find the SDK automatically using the registry keys it
creates upon installation. Specifically in
`SOFTWARE\\WOW6432Node\\Microsoft\\Microsoft SDKs\\Windows\\v10.0`
branch there are:
* `InstallationFolder` telling the SDK installation location;
* `ProductVersion` telling the SDK version in use.

These keys provide enough information to locate the SDK binaries, with
added robustness:
* handles non-standard SDK installation path;
* deterministically selects the latest SDK when multiple versions are
present.

---

### Changes Made

* **Updated `crates/gpui/build.rs`**:

  * added dependency on `winreg`
  * introduced `find_latest_windows_sdk_binary()` helper
  * updated fallback logic to use registry lookup

This PR only changes the fallback location, and does not touch the
established environment-based workflow.

Release Notes:

- N/A

---

### Impact

Reduces manual configuration needed to build GPUI on Windows.

---------

Co-authored-by: John Tur <john-tur@outlook.com>
2025-11-04 20:14:54 +00:00
Anthony Eid
94ff4aa4b2 AI: Fix Github Copilot edit predictions failing to start (#41934)
Closes #41457 #41806 #41801

Copilot started using the `node:sqlite` module, which is an experimental
feature in node v22–v23 (stable in v24). The fix was passing in the
experimental flag when Zed starts the copilot LSP.

I tested this with v20.19.5 and v24.11.0. The fix got v20.19 working and
didn't affect v24.11 which was already working.

Release Notes:

- AI: Fix Github Copilot edit predictions failing to start
2025-11-04 14:31:51 -05:00
Ignasius
1e1480405a Fix integer underflow in autosave mode after delay in the settings (#41898)
Closes #41774 

Release Notes:

- settings_ui: Fixed an integer underflow panic when attempting to hit
the `-` sign on settings item that take delays in milliseconds
2025-11-04 14:12:05 -05:00
scuzqy
cdd7d4b2fb windows: Skip AppendCategory call when there are no shell links (#37926)
`ICustomDestinationList::AppendCategory` rejects an empty `IObjectArray`
and returns an `E_INVALIDARG` error. Error propagated and caused an
early-return from `update_jump_list()`.
<img width="1628" height="540" alt="image"
src="https://github.com/user-attachments/assets/f8143297-c71e-42a1-a505-66cd77dfa599"
/>

Release Notes:

- N/A
2025-11-04 18:48:18 +00:00
Piotr Osiewicz
4fd2b3f374 editor: Jumping to diagnostics unfolds target locations (#41932)
Release Notes:

- Jumping to diagnostics no longer skips over folded regions. The folded
region that contains a target diagnostic is now unfolded.
2025-11-04 18:43:24 +00:00
Conrad Irwin
43a7f96462 Improve compare_perf.yml, cherry_pick.yml (#41606)
Release Notes:

- N/A

---------

Co-authored-by: Nia Espera <nia@zed.dev>
2025-11-04 11:29:35 -07:00
Joseph T. Lyons
2a2e04bb5c Add docs for settings profiles (#41931)
Release Notes:

- N/A
2025-11-04 18:27:39 +00:00
Piotr Osiewicz
9bf212bd1e lsp: Fix dynamic registration of document diagnostics (#41929)
- lsp: Fix dynamic registration of diagnostic capabilities not taking
effect when an initial capability is not specified.
The gist of the issue lies in the use of `.get_mut` instead of `.entry`: if
we had not created any dynamic capability beforehand, we'd essentially miss
the registration (a minimal illustration follows the list below).

- **Determine whether to update remote caps in a smarter manner**
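
A minimal illustration of the `.get_mut` vs `.entry` difference (not the actual lsp-store code):

```rust
use std::collections::HashMap;

fn main() {
    let mut caps: HashMap<&str, Vec<&str>> = HashMap::new();

    // Buggy pattern: no initial capability was registered, so this is a no-op.
    if let Some(registrations) = caps.get_mut("textDocument/diagnostic") {
        registrations.push("dynamic-registration-1");
    }
    assert!(caps.is_empty());

    // Fixed pattern: the entry is created on demand and the registration sticks.
    caps.entry("textDocument/diagnostic")
        .or_default()
        .push("dynamic-registration-1");
    assert_eq!(caps["textDocument/diagnostic"].len(), 1);
}
```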

Release Notes:

- Fixed document diagnostics with Ty language server.
2025-11-04 18:23:14 +00:00
localcc
38cd16aad9 Fix save_last_workspace (#41907)
Closes #37348

Release Notes:

- Fixed last workspace window restoration on linux/windows
2025-11-04 18:47:49 +01:00
Ben Kunkle
d8655f0656 settings_ui: Fix dropdowns after #41036 (#41920)
Closes #41533

Both of the issues in the release notes that are fixed in this PR were
caused by incorrect usage of the `window.use_state` API.
The first issue was caused by calling `window.use_state` in a render
helper, so the element ID used to share state was the same across different
pages, and the state was re-used when it should have been re-created. The
fix for this was to move the `window.use_state` call (and rendering logic)
into an `impl RenderOnce` component, so that the IDs are resolved during the
render, avoiding the state conflicts.

The second issue is caused by using a `move` closure in the
`window.use_state` call, resulting in stale closure values when the
window state is re-used.

Release Notes:

- settings_ui: Fixed an issue where some dropdown menus would show
options from a different dropdown when clicked
- settings_ui: Fixed an issue where attempting to change a setting in a
dropdown back to its original value after changing it would do nothing
2025-11-04 12:46:16 -05:00
Oleksiy Syvokon
91d631c229 Evaluate zeta2 context retrieval and edit predictions (#41921)
This PR implements the `zeta-cli eval` command. It will:

- Run the edit prediction model if there are no cached results
- Compute precision/recall/F1 for context retrieval at the line level:
every retrieved line of context is counted as a true positive (correct
retrieval), false positive (retrieved something that was not expected),
or false negative (didn't retrieve an expected line)
- Compute similar metrics for edit predictions
- Pretty-print results, highlighting the difference between actual and
expected when printing to tty

Other changes:
- `zeta-cli predict` accepts a `--format` argument with options `md`,
`json`, `diff`
- Code restructure

Release Notes:

- N/A

---------

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
Co-authored-by: Agus Zubiaga <agus@zed.dev>
2025-11-04 17:36:50 +00:00
Joseph T. Lyons
054d2e1524 Add Ben K to gpui PR reviewers (#41919)
Release Notes:

- N/A
2025-11-04 17:06:16 +00:00
Sathiyaraman M
982f2418f4 git: Add support for git pull with rebase (#41117)
- Adds a new action `git::PullRebase` which adds `--rebase` in the final
command invoked by existing Git-Pull implementation.
- Includes the new action in "Fetch/Push" button in the Git Panel
(screenshot below)
- Adds key-binding for `git::PullRebase` in all three platforms,
following the existing key-binding patterns (`ctrl-g shift-down`)
- Update git docs to include the new action.

Sidenote: This is my first ever OSS contribution

Screenshot:

<img width="234" height="215" alt="image"
src="https://github.com/user-attachments/assets/713d068f-5ea5-444f-8d66-444ca65affc8"
/>

---

Release Notes:

- Git: Added `git: pull rebase` for running `git pull --rebase`.
2025-11-04 16:41:06 +00:00
Joseph T. Lyons
9f580464f0 Add Anthony and Cameron to gpui PR reviewers (#41914)
Release Notes:

- N/A
2025-11-04 16:20:41 +00:00
ᴀᴍᴛᴏᴀᴇʀ
fc3e503cfe remote: Fix incorrect default repository selection when using remote (#41698)
If I understand this correctly: The `active_repo_id` uses
`get_or_insert_with`, which makes it dependent on the `RepositoryAdded`
event sequence. To ensure correct initialization of the `active_repo_id`
on the remote side, the first local `RepositoryAdded` event must
synchronously send an `UpdateRepository` to `updates_tx`.

Closes #30694

Release Notes:

- Fixed incorrect default repository selection when using remote
2025-11-04 11:15:35 -05:00
Joseph T. Lyons
52c49b86b2 Sort PR reviewer categories (#41912)
Release Notes:

- N/A
2025-11-04 16:07:46 +00:00
Joseph T. Lyons
07b707153d Sort PR reviewers per category (#41909)
Release Notes:

- N/A
2025-11-04 15:44:17 +00:00
Lukas Wirth
75cef88ea2 agent_ui: Render error context when failing to spawn agent thread (#41908)
Release Notes:

- Fixed not telling the user what went wrong when spawning ACP agents
2025-11-04 15:35:33 +00:00
Pranav Joglekar
46b39f0077 editor: Clean up edit prediction preview when edit prediction is disabled (#40159)
Update the `editor::Editor.handle_modifiers_changed` method to ensure
that the `editor::Editor.update_edit_prediction_preview` method is
called even if edit prediction preview is disabled, if there's an active
edit prediction preview.

Without this change it was possible for users to get into a state where
the modifiers held to show the prediction were part of the modifiers used
to disable edit prediction. When that keybinding was used, edit prediction
would be disabled, but the edit prediction preview would remain active, so
the context menu for the editor would never be shown again, as the editor
would assume it was still showing the edit prediction preview.

Closes #40056 

Release Notes:

- Fixed a bug that could cause the completions menu to stop being shown
when edit predictions were disabled

---------

Co-authored-by: dino <dinojoaocosta@gmail.com>
2025-11-04 15:30:37 +00:00
Thomas Heartman
fb410ab3ae Support relative line number on wrapped lines (rework) (#41805)
## Add relative line numbers on wrapped lines, take 2

This is a rework of https://github.com/zed-industries/zed/pull/39268
that excludes
e7096d27a6.
This commit introduced some line number rendering issues as described in
https://github.com/zed-industries/zed/issues/41422.

While @ConradIrwin suggested we try to pass in the buffer rows from the
calling method instead of the snapshot, that
appears to have had unintended consequences and I don't think the two
calculations were intended to do the same thing. Hence, this PR has
removed those changes.

This PR also includes the migration fix originally done by @MrSubidubi
in https://github.com/zed-industries/zed/pull/41351.

## Original PR description and release notes.

**Problem:** Current relative line numbering creates a mismatch with
vim-style navigation when soft wrap is enabled. Users must mentally
calculate whether target lines are wrapped segments or logical lines,
making `<n>j/k` navigation unreliable and cognitively demanding.

**How things work today:**
- Real line navigation (`j/k` moves by logical lines): Requires
determining if visible lines are wrapped segments before jumping. Can't
jump to wrapped lines directly.
- Display line navigation (`j/k` moves by display rows): Line numbers
don't correspond to actual row distances for multi-line jumps.

**Proposed solution:** Count and number each display line (including
wrapped segments) for relative numbering. This creates direct
visual-to-navigational correspondence, where the relative number shown
always matches the `<n>j/k` distance needed.

**Benefits:**
- Eliminates mental overhead of distinguishing wrapped vs. logical lines
- Makes relative line numbers consistently actionable regardless of wrap
state
- Preserves intuitive "what you see is what you navigate" principle
- Maintains vim workflow efficiency in narrow window scenarios

Also explained and discussed in
https://github.com/zed-industries/zed/discussions/25733.

Release Notes:

- Added support for counting wrapped lines as relative lines and for
displaying line numbers for wrapped segments. Changes
`relative_line_numbers` from a boolean to an enum: `enabled`,
`disabled`, or `wrapped`.
2025-11-04 08:24:17 -07:00
B. Collier Jones
1b93242351 project_panel: Add hidden files glob patterns and action toggle hidden files visibility (#41532)
This PR adds the ability to configure which files are considered
"hidden" in the project panel and toggle their visibility with a
keyboard shortcut. Previously, the editor hardcoded dotfiles as hidden -
now users can customize the pattern and quickly show/hide them.

### Release Notes

- Added `project_panel::ToggleHideHidden` action with keyboard shortcuts
to toggle visibility of hidden files
- Added configurable `hidden_files` setting to customize which files are
marked as hidden (defaults to `**/.*` for dotfiles)

### Motivation

This change allows users to:
1. Quickly toggle hidden file visibility with a keyboard shortcut
2. Customize which files are considered "hidden" beyond just dotfiles
3. Better organize their project panel by hiding build artifacts, logs,
or other generated files

### Usage

**Toggle hidden files:**
- **macOS:** `cmd-alt-.`
- **Linux:** `ctrl-alt-.`
- **Windows:** `ctrl-alt-.`

**Customize patterns in settings:**
```json
{
  "hidden_files": ["**/.*", "**/*.tmp", "**/build/**"]
}
```

### Changes

**Core Implementation:**
- Added `hidden_files` setting (defaults to `**/.*` to match current
dotfile behavior)
- Replaced hardcoded `name.starts_with('.')` logic with configurable
pattern matching using `PathMatcher`
- Hidden status propagates through directory hierarchies (if a directory
is hidden, all children inherit that status)

**User-Facing:**
- Added `ToggleHideHidden` action in the project panel
- Added keyboard shortcuts for all platforms
- Added settings UI entry for configuring `hidden_files` patterns

**Testing:**
- Added comprehensive test coverage validating default behavior, custom
patterns, propagation, and settings changes

### Implementation Notes

- Uses `PathMatcher` for efficient glob matching
- Settings changes automatically trigger worktree re-indexing
- No breaking changes - defaults maintain current behavior (hiding
dotfiles)

---

**Disclaimer:** This was implemented with a fair amount of copy/paste
(particularly the gitignore handling), trial and error, and a healthy
dose of Claude.

### Screenshots

**Project Panel with hidden files visible:**
<img width="1368" height="935" alt="Screenshot 2025-10-30 at 3 15 53 AM"
src="https://github.com/user-attachments/assets/1cbe90ce-504c-4f9b-bca8-bef02ab961be"
/>

**Project Panel with hidden files hidden:**
<img width="1363" height="917" alt="Screenshot 2025-10-30 at 3 16 07 AM"
src="https://github.com/user-attachments/assets/9297f43e-98c7-4b19-be8f-3934589d6451"
/>

**Toggle action in command palette:**
<img width="565" height="161" alt="Screenshot 2025-10-30 at 3 17 26 AM"
src="https://github.com/user-attachments/assets/4dc9e7b6-9c29-4972-b886-88d8018905da"
/>

Release Notes:

- Added the ability to configure glob patterns for files treated as
hidden in the project panel using the `hidden_files` setting.
- Added an action `project panel: toggle hidden files` to quickly show
or hide hidden files in the project panel.

---------

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
2025-11-04 20:35:37 +05:30
Lukas Wirth
fb6e41d51e remote_server: Fix panic due to invalid settings access (#41904)
Closes https://github.com/zed-industries/zed/issues/41860

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-04 14:56:24 +00:00
Lukas Wirth
0ec31db398 util: Add missing quotes in shell env capturing on windows (#41902)
Release Notes:

- Fixed shell env capturing failing if zed is installed on a path with
whitespace in it
2025-11-04 14:31:20 +00:00
Jakub Konka
a262ca1cd5 util: Log JSON with envs if failed to deserialize (#41894)
Release Notes:

- N/A
2025-11-04 15:30:26 +01:00
localcc
6a38d699dc Fix autoupdate nuking the app on Windows (#41571)
Closes #41477

Release Notes:

- N/A
2025-11-04 15:06:30 +01:00
Smit Barmase
95feefc1cf remote: Fix terminal crash on Elvish shell (#41893) 2025-11-04 17:48:42 +05:30
Coenen Benjamin
a827f25d00 file_finder: Respect .gitignore and file_scan_inclusions with ** in glob (#40654)
Closes #39037 

Previously, the code split the `**/.env` glob in `file_scan_inclusions`
into two sources for the `PathMatcher`: `["**", "**/.env"]`. This
approach works for directories, but including `**` will match all
directories and their files. To address this, I now select the
appropriate `PathMatcher` using only `**/.env` when specifically
targeting a file to determine whether to include it in the file finder.
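
For context, the setting in question looks roughly like this in
`settings.json` (the glob is the one from the description above):

```json
{
  "file_scan_inclusions": ["**/.env"]
}
```

With this fix, the file finder checks such an entry against the full
`**/.env` pattern rather than the overly broad `**` source.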

Release Notes:

- Fixed: respect `.gitignore` and `file_scan_inclusions` settings with
`**` in glob for file finder

---------

Signed-off-by: Benjamin <5719034+bnjjj@users.noreply.github.com>
Co-authored-by: Julia Ryan <juliaryan3.14@gmail.com>
2025-11-04 12:11:17 +00:00
Kirill Bulatov
cc6208b17f Show inlay label parts' tooltips if those are present and hovered (#41889)
Part of https://github.com/zed-industries/zed/issues/33715


https://github.com/user-attachments/assets/d2d6f47d-3974-4c8c-aab9-9046891186bf

Unlike VSCode, which does more advanced hovering, this one only works
for inlays that have LSP tooltip data in them.

Release Notes:

- Started to show inlay label parts' tooltips when they are hovered

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-04 11:10:51 +00:00
Dino
8d15ec7f99 vim: Fix mini delimiters in multibuffer (#41834)
- Update `vim::object::find_mini_delimiters` in order to filter out the
  ranges before calling `vim::object::cover_or_next`, ensuring that the
  provided ranges are converted from multibuffer space into buffer
  space.
- Remove the `range_filter` from `vim::object::cover_or_next`, as the
  `find_mini_delimiters` function is the only caller and no longer uses
  it

Closes #41346 

Release Notes:

- Fixed a crash that could occur when using `vim::MiniQuotes` and
`vim::MiniBrackets` in a multibuffer
2025-11-04 11:08:51 +00:00
Jakub Konka
ca5a4dcffa terminal: Resolve env based on the project dir on the target (#41867)
Prior to this change we would always resolve envs when spawning a new
terminal window based on the inherited CLI environment. This works fine
as long as we open a new Zed instance from a terminal and use it locally
only. When using Zed connected to a remote server, however, it is not
meaningful. With this change, we correctly ping the remote for the
project-local envs and use those instead. This change should also fix a
pesky issue when updating Zed - after Zed restarts, opening a new
terminal window would not run `direnv`, for example.

Release Notes:

- N/A
2025-11-04 09:52:28 +01:00
Lukas Wirth
3e7b8efb98 editor: Newtype WrapRow (#41843)
Makes a better distinction between `WrapRow` and `BlockRow`

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-04 08:48:38 +00:00
John Tur
4002b32ad4 Coalesce dispatcher window messages on Windows (#41861)
Take 2 at https://github.com/zed-industries/zed/pull/41595

Release Notes:

- N/A
2025-11-04 09:43:06 +01:00
Coenen Benjamin
3ef8163357 yaml: Fix indentation with dictionary when editing (#40913)
Closes #39570 

Release Notes:

- Fixed indentation with dictionary when editing YAML file.

---------

Signed-off-by: Benjamin <5719034+bnjjj@users.noreply.github.com>
2025-11-04 12:23:34 +05:30
Conrad Irwin
b9524837bb Fix branch diff hunk expansion (#41873)
Closes #ISSUE

Release Notes:

- (preview only) Fixes a bug where hunks were not expanded when viewing
branch diff
2025-11-04 03:46:02 +00:00
Alvaro Parker
e5660d25f1 git: Add git worktree picker (#38719)
Related discussions #26084 

Worktree creations are implemented similar to how branch creations are
handled on the branch picker (the user types a new name that's not on
the list and a new entry option appears to create a new branch with that
name).


https://github.com/user-attachments/assets/39e58983-740c-4a91-be88-57ef95aed85b

With this picker you have a few workflows: 

- Open the picker and type the name of a branch that's checked out on an
existing worktree:
    - Press enter to open the worktree in a new window
    - Press ctrl-enter to open the worktree and replace the current
      window
- Open the picker and type the name of a new branch or an existing one
that's not checked out in another worktree:
    - Press enter to create the worktree and open it in a new window. If
      the branch doesn't exist, we will create a new one based on the
      branch you currently have checked out. If the branch does exist,
      then we create a worktree with that branch checked out.
    - Press ctrl-enter to do everything in the previous point but,
      instead, replace the current window with the new worktree.
- Open the picker and type the name of a new branch or an existing one
that's not checked out in another worktree:
    - If a default branch is detected on the repo, you can create a new
      worktree based on that branch by pressing ctrl-enter or
      ctrl-shift-enter. The first one will open a new window and the
      last one will replace the current one.


Note: If you prefer not to use the system prompt for choosing a
directory, you can set `"use_system_path_prompts": false` in Zed
settings.

Release Notes:

- Added git worktree picker to open a git worktree on a new window or
replace the current one
- Added git worktree creation action

---------

Co-authored-by: Cole Miller <cole@zed.dev>
2025-11-03 21:38:00 -05:00
Danilo Leal
57adf42492 agent_ui: Add profiles item in the panel menu (#41871)
Making the profile management modal accessible through the agent
panel's additional options menu as well. And in the process, adjusting
the menu keybinding that was conflicting with something else.

Release Notes:

- N/A
2025-11-03 22:54:28 -03:00
Danilo Leal
4cdcb0c15e agent_ui: Improve display of external agents in configuration view (#41869)
This PR makes the agent panel's configuration view use the icon from an
external agent that comes directly from the extension, as well as some
other clean ups.

Release Notes:

- N/A
2025-11-03 22:22:50 -03:00
Conrad Irwin
9113a20b8b Shell out to real tar in extension builder (#41856)
We sometimes see `test_extension_store_with_test_extension` hang while
untarring the WASI SDK.

In lieu of trying to debug the problem, let's try shelling out for now
in the hope that the test becomes more reliable.

There's a bit of risk here because we're using async-tar for other
things (but probably not 300Mb tar files...)

Assisted-By: Zed AI

Closes #ISSUE

Release Notes:

- N/A
2025-11-03 16:13:03 -07:00
Max Brunsfeld
1631cec15a Add zeta-cli subcommand for running zeta2 predictions (#41722)
This PR adds a `zeta zeta2 predict` subcommand that takes an edit
prediction example markdown file as an argument, and performs zeta2's
prediction, showing the retrieved context and the predicted edit.

* [x] Apply uncommitted diff to get repo into the right state.
* [x] Apply edits in edit history
* [x] Display predicted edits as unified diff, regardless of model
output format

Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <agus@zed.dev>
Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
Co-authored-by: Ben Kunkle <ben.kunkle@gmail.com>
2025-11-03 15:12:08 -08:00
Kirill Bulatov
5e41ce17e3 Do not pull diagnostics when those are disabled (#41865)
Based on 

[hang.log](https://github.com/user-attachments/files/23319081/hang.log)


Release Notes:

- N/A
2025-11-03 22:42:26 +00:00
KyleBarton
5ed458497e Copy outline improvements from typescript over to tsx as well (#41862)
Closes #4483 

Release Notes:

- Interprets outline of tsx files with the same grammar as typescript,
including improvements from #39797
2025-11-03 14:38:05 -08:00
Kirill Bulatov
9ecf257502 Fix incorrect search ranges when rendering search matches in the outline panel (#41859)
Closes https://github.com/zed-industries/zed/issues/41792

Release Notes:

- Fixed outline panel panicking when rendering certain search matches
2025-11-03 22:01:52 +00:00
Anthony Eid
2eeb02305c editor: Add a setting to show a scrollbar in completion menu (#41849)
A user on Discord requested this feature:
https://discord.com/channels/869392257814519848/1434188637389717556/1434188637389717556

I added a scrollbar setting called `completion_menu_scrollbar` to the
completion menu and defaulted it to "Never" to match past behavior.
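
A minimal sketch of opting in, assuming the key sits at the top level of
`settings.json` and accepts the same values as Zed's other scrollbar
`show` options (the setting name and its "Never" default come from this
PR; the exact accepted values and key placement are assumptions):

```json
{
  "completion_menu_scrollbar": "always"
}
```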

Release Notes:

- editor: Add `editor.completion_menu_scrollbar` setting to show a
scrollbar in the completion menu
2025-11-03 21:03:18 +00:00
Katie Geer
c2b3e60f6d settings: Change "remove trailing whitespace on save" to default false for Markdown (#41658)
Closes #ISSUE: reported on X by user. 

Release Notes:

- Made the default value for the "remove trailing whitespace on save"
setting false for Markdown, to fix cases where the removed trailing
whitespace had syntactic meaning (see the sketch below)
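
A minimal per-language override restoring the previous behavior,
assuming the standard `languages` block and the
`remove_trailing_whitespace_on_save` key that backs this setting:

```json
{
  "languages": {
    "Markdown": {
      "remove_trailing_whitespace_on_save": true
    }
  }
}
```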
2025-11-03 13:00:30 -08:00
Conrad Irwin
d075a56ee7 Fix merge conflict (#41853)
Closes #ISSUE

Release Notes:

- N/A
2025-11-03 13:41:39 -07:00
Piotr Osiewicz
8217e57a2d ci: Enable namespace caching for Linux workers (#41652)
Release Notes:

- N/A
2025-11-03 21:07:36 +01:00
Finn Evers
08daedd014 editor: Add support for no scroll margin in full mode (#41838)
Noticed this whilst testing the Docker debugger. I randomly scrolled the
console off screen and was confused briefly as to why this was the case.

Release Notes:

- The debugger query console will no longer needlessly overscroll.
2025-11-03 20:47:39 +01:00
Conrad Irwin
4da5675920 Re-use the existing bundle steps for nightly too (#41699)
One of the reasons we didn't spot that we were missing the telemetry env
vars for the production builds was that nightly (which was working) had
its own set of build steps. This re-uses those and pushes the env vars
down from the workflow to the job.

It also fixes nightly releases to upload in one go so that all
platforms update in sync.

Closes #41655

Release Notes:

- N/A
2025-11-03 12:29:22 -07:00
Lukas Wirth
5fc54986c7 Revert "sum_tree: Replace rayon with futures (#41586)" (#41846)
This causes the background executor to hang

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-03 19:25:15 +00:00
Finn Evers
cb5055aaec agent_ui: Fix expand message editor button not always working (#41845)
The button could not be clicked whenever the editor was currently not
focused. This PR fixes this and also registers the action on a more
global level, similar to how this is done for all the other agent
actions.

Release Notes:

- Fixed an issue where the `Expand message editor` button would not work
in agent threads if the message editor was not focused.
2025-11-03 19:08:27 +00:00
warrenjokinen
454d649b6e docs: Mark macOS 26.x as being supported (#41777)
Add "Tahoe" to list of supported macOS versions.

Closes #ISSUE

Release Notes:

- N/A
2025-11-03 13:43:42 -05:00
Vitaly Slobodin
222767e69b ruby: Disable Ruby LSP for ERB files (#41754)
The Ruby extension uses the `solargraph` language server by default for
Ruby files. However, when a user opens any ERB file, the extension
automatically starts the Ruby LSP. This affects developers because they
do not expect the Ruby LSP to be running.

Closes https://github.com/zed-extensions/ruby/issues/172

Release Notes:

- N/A
2025-11-03 19:35:36 +01:00
Xiaobo Liu
d7b7fa3ee2 agent: Add XML escaping for TextThreadContext title attribute (#39734)
Escape special characters (&, <, >, ", ') in the title attribute of
TextThreadContext's XML output to prevent malformed XML when titles
contain these characters.

Resolves TODO at context.rs:629

Release Notes:

- N/A

Signed-off-by: Xiaobo Liu <cppcoffee@gmail.com>
Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-11-03 18:16:36 +00:00
Dylan
7cfce60570 project_search: Add button to collapse/expand all excerpts (#41654)
<img width="500" height="834" alt="Screenshot 2025-11-03 at 12  59@2x"
src="https://github.com/user-attachments/assets/15c5e1fc-2291-41b4-9eec-a8cfa5a446c7"
/>

Release Notes:

- Added a button that allows expanding/collapsing all project search
excerpts at once.

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-03 14:50:34 -03:00
Nia
45b78482f5 perf: Fixup Hyperfine finding (#41837)
Release Notes:

- N/A
2025-11-03 17:36:33 +00:00
Antal Szabó
71f1f3728d windows: Remove null terminator from keyboard ID (#41785)
Closes #41486, closes #35862 

It is unnecessary, and it broke the `uses_altgr` function.

Also add Slovenian layout as using AltGr.

This should fix:
-
https://github.com/zed-industries/zed/pull/40536#issuecomment-3477121224
- https://github.com/zed-industries/zed/issues/41486
- https://github.com/zed-industries/zed/issues/35862

As the current strategy relies on manually adding layouts that have
AltGr, it's brittle and not very elegant. It also has other issues (it
requests the current layout on every keystroke and mouse movement).

**A potentially better and more comprehensive solution is at
https://github.com/zed-industries/zed/pull/41259**
This is just to fix the immediate issues while that gets reviewed.

Release Notes:

- windows: Fix AltGr handling on non-US layouts again.
2025-11-03 18:29:28 +01:00
Richard Feldman
8b560cd8aa Fix bug with uninstalled agent extensions (#41836)
Previously, uninstalled agent extensions didn't immediately disappear
from the menu. Now, they do!

Release Notes:

- N/A
2025-11-03 12:09:26 -05:00
Lukas Wirth
38e1e3f498 project: Use user configured shells for project env fetching (#41288)
Closes https://github.com/zed-industries/zed/issues/40464

Release Notes:

- Fixed shell environment sourcing not respecting users' remote shells
2025-11-03 16:29:07 +00:00
Karl-Erik Enkelmann
a6b177d806 Update open buffers with newly registered completion trigger characters (#41243)
Closes https://github.com/zed-extensions/java/issues/108

Previously, when a language server dynamically registered completion
capabilities with trigger characters (hello JDTLS), buffers for that
language server that were already open would not get updated. This
change finds the open buffers for the language server and updates the
trigger characters in each of them when the new capability is
registered.

Release Notes:

- N/A
2025-11-03 17:28:30 +01:00
Lukas Wirth
73366bef62 diagnostics: Live update diagnostics view on edits while focused (#41829)
Previously we were only updating the diagnostics pane when it was
unfocused or saved, or when a disk-based diagnostic run finished (aka
cargo check). The reason for this is simple: we do not want to take away
the excerpt under the user's cursor while they are typing if they manage
to fix the diagnostic. Additionally we need to prevent dropping the
changed buffer before it is saved.

Delaying updates was a simple way to work around these kinds of issues,
but comes at the huge annoyance that the diagnostics pane does not
actually reflect the current state of the world but some snapshot of it,
making it less than ideal to work within for languages that do not
leverage disk-based diagnostics (that is, anything other than
rust-analyzer, and even for rust-analyzer it's annoying).

This PR changes this. We now always live-update the view but take care
to retain unsaved buffers as well as buffers that contain a cursor
(as well as some other "checkpoint" properties).

Release Notes:

- Improved diagnostics pane to live update when editing within its
editor
2025-11-03 16:16:05 +00:00
David Kleingeld
48bd253358 Adds instructions on how to use Perf & Flamegraph without debug symbols (#41831)
Release Notes:

- N/A
2025-11-03 16:11:30 +00:00
Bob Mannino
2131d88e48 Add center_on_match option for search (#40523)
[Closes discussion
#28943](https://github.com/zed-industries/zed/discussions/28943)
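
A minimal sketch of enabling the new option, assuming `center_on_match`
lives under the `search` block of `settings.json` (only the option name
is given in the release note below; its exact location is an
assumption):

```json
{
  "search": {
    "center_on_match": true
  }
}
```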

Release Notes:

- Added `center_on_match` option to center matched text in view during buffer or project search.

---------

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
2025-11-03 21:17:09 +05:30
Kirill Bulatov
42149df0f2 Show modal hover for one-off tasks (#41824)
Before, one-off commands did not show anything on hover at all; now
they show their command:

<img width="657" height="527" alt="after"
src="https://github.com/user-attachments/assets/d43292ca-9101-4a6a-b689-828277e8cfeb"
/>

Release Notes:

- Show modal hover for one-off tasks
2025-11-03 15:25:44 +00:00
Bing Wang
379bdb227a languages: Add ignore keyword for gomod (#41520)
Go 1.25 introduces an `ignore` directive in go.mod to specify
directories the go command should ignore.

ref: https://tip.golang.org/doc/go1.25#go-command


Release Notes:

- Added syntax highlighting support for the new [`ignore`
directive](https://tip.golang.org/doc/go1.25#go-command) in `go.mod`
files
2025-11-03 09:11:50 -06:00
Dijana Pavlovic
04e53bff3d Move @punctuation.delimiter before @operator capture (#41663)
Closes #41593

From what I understand, the order of captures inside tree-sitter query
files matters, and the last capture wins. `?` and `:` are captured by
both `@operator` and `@punctuation.delimiter`. So in order for the
ternary operator to win, `@operator` should live after
`@punctuation.delimiter`.

Before:
<img width="298" height="32" alt="Screenshot 2025-10-31 at 17 41 21"
src="https://github.com/user-attachments/assets/af376e52-88be-4f62-9e2b-a106731f8145"
/>


After:
<img width="303" height="39" alt="Screenshot 2025-10-31 at 17 41 33"
src="https://github.com/user-attachments/assets/9a754ae9-0521-4c70-9adb-90a562404ce8"
/>


Release Notes:

- Fixed an issue where the ternary operator symbols in TypeScript would
not be highlighted as operators.
2025-11-03 08:51:22 -06:00
Kirill Bulatov
28f30fc851 Fix racy inlay hints queries (#41816)
Follow-up of https://github.com/zed-industries/zed/pull/40183

Release Notes:

- (Preview only) Fixed inlay hints duplicating when multiple editors are
open for the same buffer

---------

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-03 13:54:53 +00:00
Lukas Wirth
f8b414c22c zed: Reduce number of rayon threads, spawn with bigger stacks (#41812)
We already do this for the cli and remote server but forgot to do so for
the main binary

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-03 12:28:32 +00:00
Lukas Wirth
50504793e6 file_finder: Fix highlighting panic in open path prompt (#41808)
Closes https://github.com/zed-industries/zed/issues/41249

Couldn't quite come up with a test case here but verified it works.

Release Notes:

- Fixed a panic in file finder when deleting characters
2025-11-03 12:07:13 +00:00
kallyaleksiev
b625263989 remote: Close window when SSH connection fails (#41782)
## Context 

This PR closes issue https://github.com/zed-industries/zed/issues/41781

It essentially `matches` the result of opening the connection here
f7153bbe8a/crates/recent_projects/src/remote_connections.rs (L650)

and adds a Close / Retry alert that, upon 'Close', closes the new
window if the result is an error.
2025-11-03 11:13:20 +00:00
Lukas Wirth
c8f9db2e24 remote: Fix more quoting issues with nushell (#41547)
https://github.com/zed-industries/zed/pull/40084#issuecomment-3464159871
Closes https://github.com/zed-industries/zed/pull/41547

Release Notes:

- Fixed remoting not working when the remote has nu set as its shell
2025-11-03 10:50:05 +00:00
Lukas Wirth
bc3c88e737 Revert "windows: Don't flood windows message queue with gpui messages" (#41803)
Reverts zed-industries/zed#41595

Closes #41704
2025-11-03 10:41:53 +00:00
Oleksiy Syvokon
3a058138c1 Fix Sonnet's regression with inserting </parameter></invoke> (#41800)
Sometimes, inside the edit agent, Sonnet thinks that it's doing a tool
call and closes its response with `</parameter></invoke>` instead of
properly closing `</new_text>`.

A better but more labor-intensive way of fixing this would be switching
to streaming tool calls for LLMs that support it.

Closes #39921

Release Notes:

- Fixed Sonnet's regression with inserting `</parameter></invoke>`
sometimes
2025-11-03 12:22:51 +02:00
Lukas Wirth
f2b539598e sum_tree: Spawn less tasks in SumTree::from_iter_async (#41793)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-03 10:02:31 +00:00
ᴀᴍᴛᴏᴀᴇʀ
dc503e9975 Support GitLab and self-hosted GitLab avatars in git blame (#41747)
Part of #11043.

Release Notes:

- Added support for showing GitLab and self-hosted GitLab avatars in git
blame
2025-11-03 07:51:04 +01:00
ᴀᴍᴛᴏᴀᴇʀ
73b75a7765 Support Gitee avatars in git blame (#41783)
Part of https://github.com/zed-industries/zed/issues/11043.

<img width="3596" height="1894" alt="CleanShot 2025-11-03 at 10 39
08@2x"
src="https://github.com/user-attachments/assets/68d16c32-fd23-4f54-9973-6cbda9685a8f"
/>


Release Notes:

- Added support for showing Gitee avatars in git blame
2025-11-03 07:47:20 +01:00
Danilo Leal
deacd3e922 extension_ui: Fix card label truncation (#41784)
Closes https://github.com/zed-industries/zed/issues/41763

Release Notes:

- N/A
2025-11-03 03:54:47 +00:00
Aero
f7153bbe8a agent_ui: Add delete button for compatible API-based LLM providers (#41739)
Discussion: https://github.com/zed-industries/zed/discussions/41736

Release Notes:

- agent panel: Added the ability to remove OpenAI-compatible LLM
providers directly from the UI.

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-02 16:05:29 -03:00
Xiaobo Liu
4e7ba8e680 acp_tools: Add vertical scrollbar to ACP logs (#41740)
Release Notes:

- N/A

---------

Signed-off-by: Xiaobo Liu <cppcoffee@gmail.com>
Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-02 18:11:21 +00:00
Danilo Leal
9909b59bd0 agent_ui: Improve the "go to file" affordance in the edit bar (#41762)
This PR makes it clearer that you can click on the file path to open the
corresponding file in the agent panel's "edit bar", which is the element
that shows up in the panel as soon as agent-made edits happen.

Release Notes:

- agent panel: Improved the "go to file" affordance in the edit bar.
2025-11-02 14:56:23 -03:00
Danilo Leal
00ff89f00f agent_ui: Make single file review actions match panel (#41718)
When we introduced the ACP-based agent panel, the condition governing
when the "review" | "reject" | "keep" buttons are displayed got
mismatched between the panel and the pane (in the single file review
scenario). In the panel, the buttons appear as soon as there are changed
buffers, whereas in the pane, they appear when response generation is
done.

I believe that making them appear at the same time, observing the same
condition, is the desired behavior. Thus, I think the panel behavior is
more correct, because there are loads of times where agent response
generation isn't technically done (e.g., when there's a command waiting
for permission to be run) but the _file edit_ has already been performed
and is in a good state to be already accepted or rejected.

So, this is what this PR is doing; effectively removing the "generating"
state from the agent diff, and switching to `EditorState::Reviewing`
when there are changed buffers.

Release Notes:

- Improved agent edit single file reviews by making the "reject" and
"accept" buttons appear at the same time.
2025-11-02 14:30:50 -03:00
Danilo Leal
12fe12b5ac docs: Update theme, icon theme, and visual customization pages (#41761)
Some housekeeping updates:

- Update hardcoded actions/keybindings so they're pulled from the repo
- Mention settings window when useful
- Add more info about agent panel's font size
- Break sentences in individual lines

Release Notes:

- N/A
2025-11-02 14:30:37 -03:00
Mayank Verma
a9bc890497 ui: Fix popover menu not restoring focus to the previously focused element (#41751)
Closes #26548

Here's a before/after comparison:


https://github.com/user-attachments/assets/21d49db7-28bb-4fe2-bdaf-e86b6400ae7a

Release Notes:

- Fixed popover menus not restoring focus to the previously focused
element
2025-11-02 16:56:58 +01:00
Xiaobo Liu
d887e2050f windows: Hide background helpers behind CREATE_NO_WINDOW (#41737)
Close https://github.com/zed-industries/zed/issues/41538

Release Notes:

- Fixed some processes on windows not spawning with CREATE_NO_WINDOW

---------

Signed-off-by: Xiaobo Liu <cppcoffee@gmail.com>
2025-11-02 09:15:01 +01:00
Anthony Eid
d5421ba1a8 windows: Fix click bleeding through collab follow (#41726)
On Windows, clicking on a collab user icon in the title bar would
minimize/expand Zed because the click would bleed through to the title
bar. This PR fixes this by stopping propagation.

#### Before (On MacOS with double clicks to mimic the same behavior)

https://github.com/user-attachments/assets/5a91f7ff-265a-4575-aa23-00b8d30daeed
#### After (On MacOS with double clicks to mimic the same behavior)

https://github.com/user-attachments/assets/e9fcb98f-4855-4f21-8926-2d306d256f1c

Release Notes:

- Windows: Fixed a bug where clicking on a user's icon in the title bar
to follow them would also minimize/expand Zed

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-11-02 07:35:11 +00:00
Joseph T. Lyons
548cdfde3a Delete release process docs (#41733)
These have been migrated to the README.md
[here](https://github.com/zed-industries/release_notes). These don't
need to be public. Putting them in the same repo where we draft
(`release_notes`) means less jumping around and allows us to include
additional information we might not want to make public.

Release Notes:

- N/A
2025-11-02 00:37:02 -04:00
Ben Kunkle
2408f767f4 gh-workflow unit evals (#41637)
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-01 22:45:44 -04:00
Haojian Wu
df15d2d2fe Fix doc typos (#41727)
Release Notes:

- N/A
2025-11-01 20:52:32 -03:00
Anthony Eid
07dcb8f2bb debugger: Add program and module path fallbacks for debugpy toolchain (#40975)
Fixes the Debugpy toolchain detection bug in #40324 

When detecting what toolchain (venv) to use in the Debugpy configuration
stage, we used to only base it off of the current working directory
argument passed to the config. This is wrong behavior for cases like
mono repos, where the correct virtual environment to use is nested in
another folder.

This PR fixes this issue by adding the program and module fields as
fallbacks to check for virtual environments. We also added support for
program/module paths relative to cwd when cwd is not None.

Release Notes:

- debugger: Improve mono repo virtual environment detection with Debugpy

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-11-01 17:30:13 -04:00
Agus Zubiaga
06bdb28517 zeta cli: Add convert-example command (#41608)
Adds a `convert-example` subcommand to the zeta cli that converts eval
examples from/to `json`, `toml`, and `md` formats.

Release Notes:

- N/A

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-11-01 19:35:04 +00:00
Mark Christiansen
d6b58bb948 agent_ui: Use agent font size tokens for thread markdown rendering (#41610)
Release Notes:

- N/A

--- 

Previously, agent markdown rendering used hardcoded font sizes
(TextSize::Default and TextSize::Small) which ignored the
agent_ui_font_size and agent_buffer_font_size settings. This updates the
markdown style to respect these settings.

This pull request adds support for customizing the font size of code
blocks in agent responses, making it possible to set a distinct font
size for code within the agent panel. The changes ensure that if the new
setting is not specified, the font size will fall back to the agent UI
font size, maintaining consistent appearance.
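
A minimal settings sketch combining the existing agent font sizes with
the new code-block override (the key names are taken from this
description; the numeric values are placeholders):

```json
{
  "agent_ui_font_size": 16,
  "agent_buffer_font_size": 15,
  "agent_buffer_code_font_size": 13
}
```

If `agent_buffer_code_font_size` is unset, code blocks fall back to the
agent UI font size, as described above.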

(I am a frontend developer without any Rust knowledge so this is
co-authored with Claude Code)


**Theme settings extension:**

* Added a new `agent_buffer_code_font_size` setting to
`ThemeSettingsContent`, `ThemeSettings`, and the default settings JSON,
allowing users to specify the font size for code blocks in agent
responses.
[[1]](diffhunk://#diff-a3bba02a485aba48e8e9a9d85485332378aa4fe29a0c50d11ae801ecfa0a56a4R69-R72)
[[2]](diffhunk://#diff-aed3a9217587d27844c57ac8aff4a749f1fb1fc5d54926ef5065bf85f8fd633aR118-R119)
[[3]](diffhunk://#diff-42e01d7aacb60673842554e30970b4ddbbaee7a2ec2c6f2be1c0b08b0dd89631R82-R83)
* Updated the VSCode import logic to recognize and import the new
`agent_buffer_code_font_size` setting.

**Font size application in agent UI:**

* Modified the agent UI rendering logic in `thread_view.rs` to use the
new `agent_buffer_code_font_size` for code blocks, and to fall back to
the agent UI font size if unset.
[[1]](diffhunk://#diff-f73942e8d4f8c4d4d173d57d7c58bb653c4bb6ae7079533ee501750cdca27d98L5584-R5584)
[[2]](diffhunk://#diff-f73942e8d4f8c4d4d173d57d7c58bb653c4bb6ae7079533ee501750cdca27d98L5596-R5598)
* Implemented a helper method in `ThemeSettings` to retrieve the code
block font size, with fallback logic to ensure a value is always used.
* Updated the settings application logic to propagate the new code block
font size setting throughout the theme system.


### Example Screenshots
![Screenshot 2025-10-31 at 12 38
28](https://github.com/user-attachments/assets/cbc34232-ab1f-40bf-a006-689678380e47)
![Screenshot 2025-10-31 at 12 37
45](https://github.com/user-attachments/assets/372b5cf8-2df8-425a-b052-12136de7c6bd)

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-01 11:14:21 -03:00
Charles McLaughlin
03e0581ee8 agent_ui: Show notifications also when the panel is hidden (#40942)
Currently Zed only displays agent notifications (e.g. when the agent
completes a task) if the user has switched apps and Zed is not in the
foreground. This PR adds support for the scenario where the agent
finishes a long-running task and the user is busy coding within Zed on
something else.

Release Notes:

- If agent notifications are turned on, they will now also be displayed
when the agent panel is hidden, in addition to showing when the Zed
window is in the background.

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-01 11:14:12 -03:00
Conrad Irwin
1552e13799 Fix telemetry in release builds (#41695)
This was inadvertently broken in v0.211.1-pre when we rewrote the
release build

Release Notes:

- N/A
2025-10-31 22:55:12 -06:00
Dijana Pavlovic
ade0f1342c agent_ui: Prevent mode selector tooltip from going off-screen (#41589)
Closes #41458 

Dynamically position mode selector tooltip to prevent clipping.

Position tooltip on the right when panel is docked left, otherwise on
the left. This ensures the tooltip remains visible regardless of panel
position.

**Note:** The tooltip currently vertically aligns with the bottom of the
menu rather than individual items. Would be great if it can be aligned
with the option it explains. But this doesn't seem trivial to me to
implement and not sure if it's important enough atm?

Before: 
<img width="431" height="248" alt="Screenshot 2025-10-30 at 22 21 09"
src="https://github.com/user-attachments/assets/073f5440-b1bf-420b-b12f-558928b627f1"
/>

After:
<img width="632" height="158" alt="Screenshot 2025-10-30 at 17 26 52"
src="https://github.com/user-attachments/assets/e999e390-bf23-435e-9df0-3126dbc14ecb"
/>
<img width="685" height="175" alt="Screenshot 2025-10-30 at 17 27 15"
src="https://github.com/user-attachments/assets/84efca94-7920-474b-bcf8-062c7b59a812"
/>


Release Notes:

- Improved the agent panel's mode selector by preventing it from going
off-screen when the panel is docked to the left.
2025-11-01 04:29:58 +00:00
Joe Innes
04f7b08ab9 Give visual feedback when an operation is pending (#41686)
Currently, if a commit operation takes some time, there's no visual
feedback in the UI that anything's happening.

This PR changes the colour of the text on the button to the
`Color::Disabled` colour when a commit operation is pending.

Release Notes:

- Improved UI feedback when a commit is in progress

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-11-01 04:24:59 +00:00
Anthony Eid
ecbdffc84f debugger: Fix Debugpy attach with connect session startup (#41690)
Closes #38345, #34882, #33280

Debugpy has four distinct configuration scenarios, which are:
1. launch
2. attach with process id
3. attach with listen
4. attach with connect

Spawning Debugpy directly works with the first three scenarios but not
with "attach with connect", which requires host/port arguments to be
passed both with the attach request and when starting up Debugpy. This
PR passes in the right arguments when spawning Debugpy in an attach with
connect scenario, thus fixing the bug.
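
As a rough illustration only, an attach-with-connect configuration might
look something like the sketch below. The overall shape of the debug
configuration and the field names other than `connect` are assumptions;
the `connect` host/port pair mirrors debugpy's own attach schema, with
placeholder values:

```json
{
  "label": "Attach to running debugpy server",
  "adapter": "Debugpy",
  "request": "attach",
  "connect": {
    "host": "localhost",
    "port": 5678
  }
}
```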

The VsCode extension comment that explains this:
98f5b93ee4/src/extension/debugger/adapter/factory.ts (L43-L51)

Release Notes:

- debugger: Fix Python attach-based sessions not working with `connect`
or `port` arguments
2025-10-31 19:02:51 -04:00
Jakub Konka
aa61f25795 git: Make GitPanel more responsive to long-running staging ops (#41667)
Currently, this only applies to long-running staging operations on
individually selected unstaged files in the git panel. Next up I would
like to make this work for `Stage All`/`Unstage All`, however this will
most likely require pushing `PendingOperation` into `GitStore` (from the
`GitPanel`).

Release Notes:

- N/A
2025-10-31 22:47:49 +01:00
Danilo Leal
d406409b72 Fix categorization of agent server extensions (#41689)
We missed making extensions that provide agent servers fill the
`provides` field with `agent-servers`, and thus, filtering for this type
of extension in both the app and site wouldn't return anything.

Release Notes:

- N/A
2025-10-31 21:18:22 +00:00
Danilo Leal
bf79592465 git_ui: Adjust stash picker (#41688)
Just tidying it up by removing the unnecessary eye icon buttons in all
list items and adding that action in the form of a button in the footer,
closer to all other actions. Also reordering the footer buttons so that
the likely most common action is on the far right.

Release Notes:

- N/A
2025-10-31 18:15:52 -03:00
Ben Kunkle
d3d7199507 Fix release.yml workflow (#41675)
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-10-31 16:29:13 -04:00
Danilo Leal
743a9cf258 Add included agents in extensions search (#41679)
Given agent servers will soon be a thing, I'm adding Claude Code, Gemini
CLI, and Codex CLI as included agents in case anyone searches for them
as extensions before looking in the agent panel.

Release Notes:

- N/A
2025-10-31 16:45:39 -03:00
Conrad Irwin
a05358f47f Delete old ci.yml (#41668)
The new one is much better

Release Notes:

- N/A
2025-10-31 12:58:47 -06:00
Ben Kunkle
3a4aba1df2 gh-workflow release (#41502)
Closes #ISSUE

Rewrite our release pipeline to be generated by `gh-workflow`

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-10-31 14:23:25 -04:00
tidely
12d71b37bb ollama: Add button for refreshing available models (#38181)
Closes #17524

This PR adds a button to the bottom right corner of the ollama settings
UI. It resets the available ollama models, and also resets the
"Connected" state in the process.
connection is still valid as well. It's a question whether we should
clear the available models on ALL `fetch_models` calls, since these only
happen during auth anyway.

Ollama is a local model provider which means clicking the refresh button
often only flashes the "not connected" state because the latency of the
request is so low. This accentuates changes in the UI, however I don't
think there's a way around this without adding some rather cumbersome
deferred ui updates.

I've attached the refresh button to the "Connected" `ButtonLike`, since
I don't think automatic UI spacing should separate these elements. I
think this is okay because the "Connected" isn't actually something that
the user can interact with.

Before: 
<img width="211" height="245" alt="image"
src="https://github.com/user-attachments/assets/ea90e24a-b603-4ee2-9212-2917e1695774"
/>

After: 
<img width="211" height="250" alt="image"
src="https://github.com/user-attachments/assets/be9af950-86a2-4067-87a0-52034a80a823"
/>


Alternative approach: There was also a suggestion to simply add an
entry to the command palette, however none of the other providers have
this ability currently either, so I went with this approach. The current
approach also makes it more discoverable to the user.

Release Notes:

- Added a button for refreshing available ollama models

---------

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-10-31 18:12:02 +00:00
347 changed files with 16758 additions and 8631 deletions

.github/workflows/after_release.yml (new file)

@@ -0,0 +1,69 @@
# Generated from xtask::workflows::after_release
# Rebuild with `cargo xtask workflows`.
name: after_release
on:
release:
types:
- published
jobs:
rebuild_releases_page:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: after_release::rebuild_releases_page
run: 'curl https://zed.dev/api/revalidate-releases -H "Authorization: Bearer ${RELEASE_NOTES_API_TOKEN}"'
shell: bash -euxo pipefail {0}
env:
RELEASE_NOTES_API_TOKEN: ${{ secrets.RELEASE_NOTES_API_TOKEN }}
post_to_discord:
needs:
- rebuild_releases_page
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- id: get-release-url
name: after_release::post_to_discord::get_release_url
run: |
if [ "${{ github.event.release.prerelease }}" == "true" ]; then
URL="https://zed.dev/releases/preview"
else
URL="https://zed.dev/releases/stable"
fi
echo "URL=$URL" >> "$GITHUB_OUTPUT"
shell: bash -euxo pipefail {0}
- id: get-content
name: after_release::post_to_discord::get_content
uses: 2428392/gh-truncate-string-action@b3ff790d21cf42af3ca7579146eedb93c8fb0757
with:
stringToTruncate: |
📣 Zed [${{ github.event.release.tag_name }}](<${{ steps.get-release-url.outputs.URL }}>) was just released!
${{ github.event.release.body }}
maxLength: 2000
truncationSymbol: '...'
- name: after_release::post_to_discord::discord_webhook_action
uses: tsickert/discord-webhook@c840d45a03a323fbc3f7507ac7769dbd91bfb164
with:
webhook-url: ${{ secrets.DISCORD_WEBHOOK_RELEASE_NOTES }}
content: ${{ steps.get-content.outputs.string }}
publish_winget:
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- id: set-package-name
name: after_release::publish_winget::set_package_name
run: |
if [ "${{ github.event.release.prerelease }}" == "true" ]; then
PACKAGE_NAME=ZedIndustries.Zed.Preview
else
PACKAGE_NAME=ZedIndustries.Zed
fi
echo "PACKAGE_NAME=$PACKAGE_NAME" >> "$GITHUB_OUTPUT"
shell: bash -euxo pipefail {0}
- name: after_release::publish_winget::winget_releaser
uses: vedantmgoyal9/winget-releaser@19e706d4c9121098010096f9c495a70a7518b30f
with:
identifier: ${{ steps.set-package-name.outputs.PACKAGE_NAME }}
max-versions-to-keep: 5
token: ${{ secrets.WINGET_TOKEN }}

.github/workflows/cherry_pick.yml (new file)

@@ -0,0 +1,39 @@
# Generated from xtask::workflows::cherry_pick
# Rebuild with `cargo xtask workflows`.
name: cherry_pick
on:
workflow_dispatch:
inputs:
commit:
description: commit
required: true
type: string
branch:
description: branch
required: true
type: string
channel:
description: channel
required: true
type: string
jobs:
run_cherry_pick:
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- id: get-app-token
name: cherry_pick::run_cherry_pick::authenticate_as_zippy
uses: actions/create-github-app-token@bef1eaf1c0ac2b148ee2a0a74c65fbe6db0631f1
with:
app-id: ${{ secrets.ZED_ZIPPY_APP_ID }}
private-key: ${{ secrets.ZED_ZIPPY_APP_PRIVATE_KEY }}
- name: cherry_pick::run_cherry_pick::cherry_pick
run: ./script/cherry-pick ${{ inputs.branch }} ${{ inputs.commit }} ${{ inputs.channel }}
shell: bash -euxo pipefail {0}
env:
GIT_COMMITTER_NAME: Zed Zippy
GIT_COMMITTER_EMAIL: hi@zed.dev
GITHUB_TOKEN: ${{ steps.get-app-token.outputs.token }}

.github/workflows/ci.yml (deleted)

@@ -1,841 +0,0 @@
name: CI
on:
push:
tags:
- "v*"
concurrency:
# Allow only one workflow per any non-`main` branch.
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: 0
RUST_BACKTRACE: 1
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
jobs:
job_spec:
name: Decide which jobs to run
if: github.repository_owner == 'zed-industries'
outputs:
run_tests: ${{ steps.filter.outputs.run_tests }}
run_license: ${{ steps.filter.outputs.run_license }}
run_docs: ${{ steps.filter.outputs.run_docs }}
run_nix: ${{ steps.filter.outputs.run_nix }}
run_actionlint: ${{ steps.filter.outputs.run_actionlint }}
runs-on:
- namespace-profile-2x4-ubuntu-2404
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
# 350 is arbitrary; ~10days of history on main (5secs); full history is ~25secs
fetch-depth: ${{ github.ref == 'refs/heads/main' && 2 || 350 }}
- name: Fetch git history and generate output filters
id: filter
run: |
if [ -z "$GITHUB_BASE_REF" ]; then
echo "Not in a PR context (i.e., push to main/stable/preview)"
COMPARE_REV="$(git rev-parse HEAD~1)"
else
echo "In a PR context comparing to pull_request.base.ref"
git fetch origin "$GITHUB_BASE_REF" --depth=350
COMPARE_REV="$(git merge-base "origin/${GITHUB_BASE_REF}" HEAD)"
fi
CHANGED_FILES="$(git diff --name-only "$COMPARE_REV" ${{ github.sha }})"
# Specify anything which should potentially skip full test suite in this regex:
# - docs/
# - script/update_top_ranking_issues/
# - .github/ISSUE_TEMPLATE/
# - .github/workflows/ (except .github/workflows/ci.yml)
SKIP_REGEX='^(docs/|script/update_top_ranking_issues/|\.github/(ISSUE_TEMPLATE|workflows/(?!ci)))'
echo "$CHANGED_FILES" | grep -qvP "$SKIP_REGEX" && \
echo "run_tests=true" >> "$GITHUB_OUTPUT" || \
echo "run_tests=false" >> "$GITHUB_OUTPUT"
echo "$CHANGED_FILES" | grep -qP '^docs/' && \
echo "run_docs=true" >> "$GITHUB_OUTPUT" || \
echo "run_docs=false" >> "$GITHUB_OUTPUT"
echo "$CHANGED_FILES" | grep -qP '^\.github/(workflows/|actions/|actionlint.yml)' && \
echo "run_actionlint=true" >> "$GITHUB_OUTPUT" || \
echo "run_actionlint=false" >> "$GITHUB_OUTPUT"
echo "$CHANGED_FILES" | grep -qP '^(Cargo.lock|script/.*licenses)' && \
echo "run_license=true" >> "$GITHUB_OUTPUT" || \
echo "run_license=false" >> "$GITHUB_OUTPUT"
echo "$CHANGED_FILES" | grep -qP '^(nix/|flake\.|Cargo\.|rust-toolchain.toml|\.cargo/config.toml)' && \
echo "$GITHUB_REF_NAME" | grep -qvP '^v[0-9]+\.[0-9]+\.[0-9x](-pre)?$' && \
echo "run_nix=true" >> "$GITHUB_OUTPUT" || \
echo "run_nix=false" >> "$GITHUB_OUTPUT"
migration_checks:
name: Check Postgres and Protobuf migrations, mergability
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
timeout-minutes: 60
runs-on:
- self-mini-macos
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
fetch-depth: 0 # fetch full history
- name: Remove untracked files
run: git clean -df
- name: Find modified migrations
shell: bash -euxo pipefail {0}
run: |
export SQUAWK_GITHUB_TOKEN=${{ github.token }}
. ./script/squawk
- name: Ensure fresh merge
shell: bash -euxo pipefail {0}
run: |
if [ -z "$GITHUB_BASE_REF" ];
then
echo "BUF_BASE_BRANCH=$(git merge-base origin/main HEAD)" >> "$GITHUB_ENV"
else
git checkout -B temp
git merge -q "origin/$GITHUB_BASE_REF" -m "merge main into temp"
echo "BUF_BASE_BRANCH=$GITHUB_BASE_REF" >> "$GITHUB_ENV"
fi
- uses: bufbuild/buf-setup-action@v1
with:
version: v1.29.0
- uses: bufbuild/buf-breaking-action@v1
with:
input: "crates/proto/proto/"
against: "https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/"
style:
timeout-minutes: 60
name: Check formatting and spelling
needs: [job_spec]
if: github.repository_owner == 'zed-industries'
runs-on:
- namespace-profile-4x8-ubuntu-2204
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2 # v4.0.0
with:
version: 9
- name: Prettier Check on /docs
working-directory: ./docs
run: |
pnpm dlx "prettier@${PRETTIER_VERSION}" . --check || {
echo "To fix, run from the root of the Zed repo:"
echo " cd docs && pnpm dlx prettier@${PRETTIER_VERSION} . --write && cd .."
false
}
env:
PRETTIER_VERSION: 3.5.0
- name: Prettier Check on default.json
run: |
pnpm dlx "prettier@${PRETTIER_VERSION}" assets/settings/default.json --check || {
echo "To fix, run from the root of the Zed repo:"
echo " pnpm dlx prettier@${PRETTIER_VERSION} assets/settings/default.json --write"
false
}
env:
PRETTIER_VERSION: 3.5.0
# To support writing comments that they will certainly be revisited.
- name: Check for todo! and FIXME comments
run: script/check-todos
- name: Check modifier use in keymaps
run: script/check-keymaps
- name: Run style checks
uses: ./.github/actions/check_style
- name: Check for typos
uses: crate-ci/typos@80c8a4945eec0f6d464eaf9e65ed98ef085283d1 # v1.38.1
with:
config: ./typos.toml
check_docs:
timeout-minutes: 60
name: Check docs
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
(needs.job_spec.outputs.run_tests == 'true' || needs.job_spec.outputs.run_docs == 'true')
runs-on:
- namespace-profile-8x16-ubuntu-2204
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Build docs
uses: ./.github/actions/build_docs
actionlint:
runs-on: namespace-profile-2x4-ubuntu-2404
if: github.repository_owner == 'zed-industries' && needs.job_spec.outputs.run_actionlint == 'true'
needs: [job_spec]
steps:
- uses: actions/checkout@v4
- name: Download actionlint
id: get_actionlint
run: bash <(curl https://raw.githubusercontent.com/rhysd/actionlint/main/scripts/download-actionlint.bash)
shell: bash
- name: Check workflow files
run: ${{ steps.get_actionlint.outputs.executable }} -color
shell: bash
macos_tests:
timeout-minutes: 60
name: (macOS) Run Clippy and tests
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- self-mini-macos
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Check that Cargo.lock is up to date
run: |
cargo update --locked --workspace
- name: cargo clippy
run: ./script/clippy
- name: Install cargo-machete
uses: clechasseur/rs-cargo@8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386 # v2
with:
command: install
args: cargo-machete@0.7.0
- name: Check unused dependencies
uses: clechasseur/rs-cargo@8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386 # v2
with:
command: machete
- name: Check licenses
run: |
script/check-licenses
if [[ "${{ needs.job_spec.outputs.run_license }}" == "true" ]]; then
script/generate-licenses /tmp/zed_licenses_output
fi
- name: Check for new vulnerable dependencies
if: github.event_name == 'pull_request'
uses: actions/dependency-review-action@67d4f4bd7a9b17a0db54d2a7519187c65e339de8 # v4
with:
license-check: false
- name: Run tests
uses: ./.github/actions/run_tests
- name: Build collab
# we should do this on a linux x86 machine
run: cargo build -p collab
- name: Build other binaries and features
run: |
cargo build --workspace --bins --examples
# Since the macOS runners are stateful, we need to remove the config file to prevent potential bugs.
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
linux_tests:
timeout-minutes: 60
name: (Linux) Run Clippy and tests
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Linux dependencies
run: ./script/linux
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: cargo clippy
run: ./script/clippy
- name: Run tests
uses: ./.github/actions/run_tests
- name: Build other binaries and features
run: |
cargo build -p zed
cargo check -p workspace
cargo check -p gpui --examples
# Even though the Linux runner is not stateful, in theory there is no need to do this cleanup.
# But, to avoid potential issues in the future if we choose to use a stateful Linux runner and forget to add code
# to clean up the config file, I've included the cleanup code here as a precaution.
# While it's not strictly necessary at this moment, I believe it's better to err on the side of caution.
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
doctests:
# Nextest currently doesn't support doctests, so run them separately and in parallel.
timeout-minutes: 60
name: (Linux) Run doctests
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Linux dependencies
run: ./script/linux
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Run doctests
run: cargo test --workspace --doc --no-fail-fast
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
build_remote_server:
timeout-minutes: 60
name: (Linux) Build Remote Server
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Clang & Mold
run: ./script/remote-server && ./script/install-mold 2.34.0
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Build Remote Server
run: cargo build -p remote_server
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
windows_tests:
timeout-minutes: 60
name: (Windows) Run Clippy and tests
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on: [self-32vcpu-windows-2022]
steps:
- name: Environment Setup
run: |
$RunnerDir = Split-Path -Parent $env:RUNNER_WORKSPACE
Write-Output `
"RUSTUP_HOME=$RunnerDir\.rustup" `
"CARGO_HOME=$RunnerDir\.cargo" `
"PATH=$RunnerDir\.cargo\bin;$env:PATH" `
>> $env:GITHUB_ENV
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Configure CI
run: |
New-Item -ItemType Directory -Path "./../.cargo" -Force
Copy-Item -Path "./.cargo/ci-config.toml" -Destination "./../.cargo/config.toml"
- name: cargo clippy
run: |
.\script\clippy.ps1
- name: Run tests
uses: ./.github/actions/run_tests_windows
- name: Build Zed
run: cargo build
- name: Limit target directory size
run: ./script/clear-target-dir-if-larger-than.ps1 250
- name: Clean CI config file
if: always()
run: Remove-Item -Recurse -Path "./../.cargo" -Force -ErrorAction SilentlyContinue
tests_pass:
name: Tests Pass
runs-on: namespace-profile-2x4-ubuntu-2404
needs:
- job_spec
- style
- check_docs
- actionlint
- migration_checks
# run_tests: If adding required tests, add them here and to script below.
- linux_tests
- build_remote_server
- macos_tests
- windows_tests
if: |
github.repository_owner == 'zed-industries' &&
always()
steps:
- name: Check all tests passed
run: |
# Check dependent jobs...
RET_CODE=0
# Always check style
[[ "${{ needs.style.result }}" != 'success' ]] && { RET_CODE=1; echo "style tests failed"; }
if [[ "${{ needs.job_spec.outputs.run_docs }}" == "true" ]]; then
[[ "${{ needs.check_docs.result }}" != 'success' ]] && { RET_CODE=1; echo "docs checks failed"; }
fi
if [[ "${{ needs.job_spec.outputs.run_actionlint }}" == "true" ]]; then
[[ "${{ needs.actionlint.result }}" != 'success' ]] && { RET_CODE=1; echo "actionlint checks failed"; }
fi
# Only check test jobs if they were supposed to run
if [[ "${{ needs.job_spec.outputs.run_tests }}" == "true" ]]; then
[[ "${{ needs.macos_tests.result }}" != 'success' ]] && { RET_CODE=1; echo "macOS tests failed"; }
[[ "${{ needs.linux_tests.result }}" != 'success' ]] && { RET_CODE=1; echo "Linux tests failed"; }
[[ "${{ needs.windows_tests.result }}" != 'success' ]] && { RET_CODE=1; echo "Windows tests failed"; }
[[ "${{ needs.build_remote_server.result }}" != 'success' ]] && { RET_CODE=1; echo "Remote server build failed"; }
# This check is intentionally disabled. See: https://github.com/zed-industries/zed/pull/28431
# [[ "${{ needs.migration_checks.result }}" != 'success' ]] && { RET_CODE=1; echo "Migration Checks failed"; }
fi
if [[ "$RET_CODE" -eq 0 ]]; then
echo "All tests passed successfully!"
fi
exit $RET_CODE
bundle-mac:
timeout-minutes: 120
name: Create a macOS bundle
runs-on:
- self-mini-macos
if: startsWith(github.ref, 'refs/tags/v')
needs: [macos_tests]
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: Install Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
with:
node-version: "18"
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ SECRETS.SENTRY_AUTH_TOKEN }}
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
# We need to fetch more than one commit so that `script/draft-release-notes`
# is able to diff between the current and previous tag.
#
# 25 was chosen arbitrarily.
fetch-depth: 25
clean: false
ref: ${{ github.ref }}
- name: Limit target directory size
run: script/clear-target-dir-if-larger-than 300
- name: Determine version and release channel
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
- name: Draft release notes
run: |
mkdir -p target/
# Ignore any errors that occur while drafting release notes to not fail the build.
script/draft-release-notes "$RELEASE_VERSION" "$RELEASE_CHANNEL" > target/release-notes.md || true
script/create-draft-release target/release-notes.md
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Create macOS app bundle (aarch64)
run: script/bundle-mac aarch64-apple-darwin
- name: Create macOS app bundle (x64)
run: script/bundle-mac x86_64-apple-darwin
- name: Rename binaries
run: |
mv target/aarch64-apple-darwin/release/Zed.dmg target/aarch64-apple-darwin/release/Zed-aarch64.dmg
mv target/x86_64-apple-darwin/release/Zed.dmg target/x86_64-apple-darwin/release/Zed-x86_64.dmg
- uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
name: Upload app bundle to release
if: ${{ env.RELEASE_CHANNEL == 'preview' || env.RELEASE_CHANNEL == 'stable' }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
target/zed-remote-server-macos-x86_64.gz
target/zed-remote-server-macos-aarch64.gz
target/aarch64-apple-darwin/release/Zed-aarch64.dmg
target/x86_64-apple-darwin/release/Zed-x86_64.dmg
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
bundle-linux-x86_x64:
timeout-minutes: 60
name: Linux x86_x64 release bundle
runs-on:
- namespace-profile-16x32-ubuntu-2004 # ubuntu 20.04 for minimal glibc
if: |
( startsWith(github.ref, 'refs/tags/v') )
needs: [linux_tests]
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Install Linux dependencies
run: ./script/linux && ./script/install-mold 2.34.0
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
- name: Create Linux .tar.gz bundle
run: script/bundle-linux
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
target/zed-remote-server-linux-x86_64.gz
target/release/zed-linux-x86_64.tar.gz
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
bundle-linux-aarch64:
timeout-minutes: 60
name: Linux arm64 release bundle
runs-on:
- namespace-profile-8x32-ubuntu-2004-arm-m4 # ubuntu 20.04 for minimal glibc
if: |
startsWith(github.ref, 'refs/tags/v')
needs: [linux_tests]
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Install Linux dependencies
run: ./script/linux
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
- name: Create and upload Linux .tar.gz bundles
run: script/bundle-linux
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
target/zed-remote-server-linux-aarch64.gz
target/release/zed-linux-aarch64.tar.gz
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
freebsd:
timeout-minutes: 60
runs-on: github-8vcpu-ubuntu-2404
if: |
false && ( startsWith(github.ref, 'refs/tags/v') )
needs: [linux_tests]
name: Build Zed on FreeBSD
steps:
- uses: actions/checkout@v4
- name: Build FreeBSD remote-server
id: freebsd-build
uses: vmactions/freebsd-vm@c3ae29a132c8ef1924775414107a97cac042aad5 # v1.2.0
with:
usesh: true
release: 13.5
copyback: true
prepare: |
pkg install -y \
bash curl jq git \
rustup-init cmake-core llvm-devel-lite pkgconf protobuf # libx11 alsa-lib rust-bindgen-cli
run: |
freebsd-version
sysctl hw.model
sysctl hw.ncpu
sysctl hw.physmem
sysctl hw.usermem
git config --global --add safe.directory /home/runner/work/zed/zed
rustup-init --profile minimal --default-toolchain none -y
. "$HOME/.cargo/env"
./script/bundle-freebsd
mkdir -p out/
mv "target/zed-remote-server-freebsd-x86_64.gz" out/
rm -rf target/
cargo clean
- name: Upload Artifact to Workflow - zed-remote-server (run-bundling)
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4
if: contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-freebsd.gz
path: out/zed-remote-server-freebsd-x86_64.gz
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
if: ${{ !(contains(github.event.pull_request.labels.*.name, 'run-bundling')) }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
out/zed-remote-server-freebsd-x86_64.gz
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
nix-build:
name: Build with Nix
uses: ./.github/workflows/nix_build.yml
needs: [job_spec]
if: github.repository_owner == 'zed-industries' &&
(contains(github.event.pull_request.labels.*.name, 'run-nix') ||
needs.job_spec.outputs.run_nix == 'true')
secrets: inherit
bundle-windows-x64:
timeout-minutes: 120
name: Create a Windows installer for x86_64
runs-on: [self-32vcpu-windows-2022]
if: |
( startsWith(github.ref, 'refs/tags/v') )
needs: [windows_tests]
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: "http://timestamp.acs.microsoft.com"
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
working-directory: ${{ env.ZED_WORKSPACE }}
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel.ps1
- name: Build Zed installer
working-directory: ${{ env.ZED_WORKSPACE }}
run: script/bundle-windows.ps1
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: ${{ env.SETUP_PATH }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
bundle-windows-aarch64:
timeout-minutes: 120
name: Create a Windows installer for aarch64
runs-on: [self-32vcpu-windows-2022]
if: |
( startsWith(github.ref, 'refs/tags/v') )
needs: [windows_tests]
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: "http://timestamp.acs.microsoft.com"
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
working-directory: ${{ env.ZED_WORKSPACE }}
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel.ps1
- name: Build Zed installer
working-directory: ${{ env.ZED_WORKSPACE }}
run: script/bundle-windows.ps1 -Architecture aarch64
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: ${{ env.SETUP_PATH }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
auto-release-preview:
name: Auto release preview
if: |
false
&& startsWith(github.ref, 'refs/tags/v')
&& endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre')
needs: [bundle-mac, bundle-linux-x86_x64, bundle-linux-aarch64, bundle-windows-x64, bundle-windows-aarch64]
runs-on:
- self-mini-macos
steps:
- name: gh release
run: gh release edit "$GITHUB_REF_NAME" --draft=false
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Create Sentry release
uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c # v3
env:
SENTRY_ORG: zed-dev
SENTRY_PROJECT: zed
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
with:
environment: production
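
The gate job whose dependency list and "Check all tests passed" step open this excerpt is ci.yml's single required status check: it depends on every other job, runs with `if: always()` so it still executes when upstream jobs fail or are skipped, and inspects `needs.<job>.result` to decide the outcome. A minimal sketch of that gate pattern, with illustrative job and runner names rather than the full list above:

tests_pass:
  runs-on: ubuntu-latest          # illustrative runner, not the one ci.yml uses
  needs: [style, linux_tests]
  if: always()                    # run even when a dependency failed or was skipped
  steps:
    - name: Check all tests passed
      run: |
        RET_CODE=0
        # needs.<job>.result is one of: success, failure, cancelled, skipped
        [[ "${{ needs.style.result }}" != 'success' ]] && { RET_CODE=1; echo "style failed"; }
        [[ "${{ needs.linux_tests.result }}" != 'success' ]] && { RET_CODE=1; echo "linux tests failed"; }
        exit $RET_CODE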


@@ -1,93 +0,0 @@
# IF YOU UPDATE THE NAME OF ANY GITHUB SECRET, YOU MUST CHERRY PICK THE COMMIT
# TO BOTH STABLE AND PREVIEW CHANNELS
name: Release Actions
on:
release:
types: [published]
jobs:
discord_release:
if: github.repository_owner == 'zed-industries'
runs-on: ubuntu-latest
steps:
- name: Get release URL
id: get-release-url
run: |
if [ "${{ github.event.release.prerelease }}" == "true" ]; then
URL="https://zed.dev/releases/preview"
else
URL="https://zed.dev/releases/stable"
fi
echo "URL=$URL" >> "$GITHUB_OUTPUT"
- name: Get content
uses: 2428392/gh-truncate-string-action@b3ff790d21cf42af3ca7579146eedb93c8fb0757 # v1.4.1
id: get-content
with:
stringToTruncate: |
📣 Zed [${{ github.event.release.tag_name }}](<${{ steps.get-release-url.outputs.URL }}>) was just released!
${{ github.event.release.body }}
maxLength: 2000
truncationSymbol: "..."
- name: Discord Webhook Action
uses: tsickert/discord-webhook@c840d45a03a323fbc3f7507ac7769dbd91bfb164 # v5.3.0
with:
webhook-url: ${{ secrets.DISCORD_WEBHOOK_RELEASE_NOTES }}
content: ${{ steps.get-content.outputs.string }}
publish-winget:
runs-on:
- ubuntu-latest
steps:
- name: Set Package Name
id: set-package-name
run: |
if [ "${{ github.event.release.prerelease }}" == "true" ]; then
PACKAGE_NAME=ZedIndustries.Zed.Preview
else
PACKAGE_NAME=ZedIndustries.Zed
fi
echo "PACKAGE_NAME=$PACKAGE_NAME" >> "$GITHUB_OUTPUT"
- uses: vedantmgoyal9/winget-releaser@19e706d4c9121098010096f9c495a70a7518b30f # v2
with:
identifier: ${{ steps.set-package-name.outputs.PACKAGE_NAME }}
max-versions-to-keep: 5
token: ${{ secrets.WINGET_TOKEN }}
send_release_notes_email:
if: false && github.repository_owner == 'zed-industries' && !github.event.release.prerelease
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
fetch-depth: 0
- name: Check if release was promoted from preview
id: check-promotion-from-preview
run: |
VERSION="${{ github.event.release.tag_name }}"
PREVIEW_TAG="${VERSION}-pre"
if git rev-parse "$PREVIEW_TAG" > /dev/null 2>&1; then
echo "was_promoted_from_preview=true" >> "$GITHUB_OUTPUT"
else
echo "was_promoted_from_preview=false" >> "$GITHUB_OUTPUT"
fi
- name: Send release notes email
if: steps.check-promotion-from-preview.outputs.was_promoted_from_preview == 'true'
run: |
TAG="${{ github.event.release.tag_name }}"
cat << 'EOF' > release_body.txt
${{ github.event.release.body }}
EOF
jq -n --arg tag "$TAG" --rawfile body release_body.txt '{version: $tag, markdown_body: $body}' \
> release_data.json
curl -X POST "https://zed.dev/api/send_release_notes_email" \
-H "Authorization: Bearer ${{ secrets.RELEASE_NOTES_API_TOKEN }}" \
-H "Content-Type: application/json" \
-d @release_data.json
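
One detail worth noting in the deleted `send_release_notes_email` job: the release body is written to a file through a quoted heredoc and loaded with `jq --rawfile`, which keeps arbitrary markdown (quotes, backticks, newlines) out of shell re-expansion and JSON-escapes it in one step. A condensed sketch of just that payload construction, reusing the endpoint and secret shown above:

- name: Build and send release-notes payload (sketch)
  run: |
    # The quoted 'EOF' delimiter stops the shell from re-expanding $ or backticks
    # inside the body that the runner already substituted via ${{ ... }}.
    cat << 'EOF' > release_body.txt
    ${{ github.event.release.body }}
    EOF
    # --arg passes a plain string; --rawfile reads the whole file as one JSON string.
    jq -n --arg tag "${{ github.event.release.tag_name }}" \
          --rawfile body release_body.txt \
          '{version: $tag, markdown_body: $body}' > release_data.json
    curl -X POST "https://zed.dev/api/send_release_notes_email" \
      -H "Authorization: Bearer ${{ secrets.RELEASE_NOTES_API_TOKEN }}" \
      -H "Content-Type: application/json" \
      -d @release_data.json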


@@ -2,12 +2,77 @@
# Rebuild with `cargo xtask workflows`.
name: compare_perf
on:
workflow_dispatch: {}
workflow_dispatch:
inputs:
head:
description: head
required: true
type: string
base:
description: base
required: true
type: string
crate_name:
description: crate_name
type: string
default: ''
jobs:
run_perf:
runs-on: namespace-profile-2x4-ubuntu-2404
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: compare_perf::run_perf::install_hyperfine
run: cargo install hyperfine
shell: bash -euxo pipefail {0}
- name: steps::git_checkout
run: git fetch origin ${{ inputs.base }} && git checkout ${{ inputs.base }}
shell: bash -euxo pipefail {0}
- name: compare_perf::run_perf::cargo_perf_test
run: |2-
if [ -n "${{ inputs.crate_name }}" ]; then
cargo perf-test -p ${{ inputs.crate_name }} -- --json=${{ inputs.base }};
else
cargo perf-test -p vim -- --json=${{ inputs.base }};
fi
shell: bash -euxo pipefail {0}
- name: steps::git_checkout
run: git fetch origin ${{ inputs.head }} && git checkout ${{ inputs.head }}
shell: bash -euxo pipefail {0}
- name: compare_perf::run_perf::cargo_perf_test
run: |2-
if [ -n "${{ inputs.crate_name }}" ]; then
cargo perf-test -p ${{ inputs.crate_name }} -- --json=${{ inputs.head }};
else
cargo perf-test -p vim -- --json=${{ inputs.head }};
fi
shell: bash -euxo pipefail {0}
- name: compare_perf::run_perf::compare_runs
run: cargo perf-compare --save=results.md ${{ inputs.base }} ${{ inputs.head }}
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact results.md'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: results.md
path: results.md
if-no-files-found: error
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
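
Condensed, the `run_perf` job above benchmarks the base revision with the repo's `cargo perf-test` alias, benchmarks the head revision the same way, then diffs the two JSON result sets with `cargo perf-compare` and uploads the markdown report. A single-step sketch of that sequence (the revisions and crate come from the workflow inputs; `vim` is the fallback crate, as above):

- name: Benchmark base and head, then compare (sketch)
  run: |
    BASE="${{ inputs.base }}"
    HEAD="${{ inputs.head }}"
    CRATE="${{ inputs.crate_name }}"
    CRATE="${CRATE:-vim}"   # mirror the workflow's fallback when no crate is given
    git fetch origin "$BASE" "$HEAD"
    git checkout "$BASE" && cargo perf-test -p "$CRATE" -- --json="$BASE"
    git checkout "$HEAD" && cargo perf-test -p "$CRATE" -- --json="$HEAD"
    cargo perf-compare --save=results.md "$BASE" "$HEAD"
  shell: bash -euxo pipefail {0}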


@@ -29,10 +29,10 @@ jobs:
node-version: '20'
cache: pnpm
cache-dependency-path: script/danger/pnpm-lock.yaml
- name: danger::install_deps
- name: danger::danger_job::install_deps
run: pnpm install --dir script/danger
shell: bash -euxo pipefail {0}
- name: danger::run
- name: danger::danger_job::run
run: pnpm run --dir script/danger danger ci
shell: bash -euxo pipefail {0}
env:


@@ -1,71 +0,0 @@
name: Run Agent Eval
on:
schedule:
- cron: "0 0 * * *"
pull_request:
branches:
- "**"
types: [synchronize, reopened, labeled]
workflow_dispatch:
concurrency:
# Allow only one workflow per any non-`main` branch.
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: 0
RUST_BACKTRACE: 1
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_EVAL_TELEMETRY: 1
jobs:
run_eval:
timeout-minutes: 60
name: Run Agent Eval
if: >
github.repository_owner == 'zed-industries' &&
(github.event_name != 'pull_request' || contains(github.event.pull_request.labels.*.name, 'run-eval'))
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Linux dependencies
run: ./script/linux
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Compile eval
run: cargo build --package=eval
- name: Run eval
run: cargo run --package=eval -- --repetitions=8 --concurrency=1
# Since the Linux runner is not stateful, in theory there is no need to do this cleanup.
# But to avoid potential issues in the future, should we switch to a stateful Linux runner and forget to add code
# to clean up the config file, I've included the cleanup code here as a precaution.
# While it's not strictly necessary at this moment, I believe it's better to err on the side of caution.
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo


@@ -1,97 +0,0 @@
# Generated from xtask::workflows::nix_build
# Rebuild with `cargo xtask workflows`.
name: nix_build
env:
CARGO_TERM_COLOR: always
RUST_BACKTRACE: '1'
CARGO_INCREMENTAL: '0'
on:
pull_request:
branches:
- '**'
paths:
- nix/**
- flake.*
- Cargo.*
- rust-toolchain.toml
- .cargo/config.toml
push:
branches:
- main
- v[0-9]+.[0-9]+.x
paths:
- nix/**
- flake.*
- Cargo.*
- rust-toolchain.toml
- .cargo/config.toml
workflow_call: {}
jobs:
build_nix_linux_x86_64:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-32x64-ubuntu-2004
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::install_nix
uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- name: nix_build::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
build_nix_mac_aarch64:
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::set_path
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: nix_build::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
- name: nix_build::limit_store
run: |-
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
fi
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
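
nix_build.yml declares a bare `workflow_call:` trigger, which is what lets ci.yml's `nix-build` job (shown earlier in this diff) invoke it as a reusable workflow and forward repository secrets. The two halves of that wiring, reduced to a sketch:

# Caller side (ci.yml): the job points `uses:` at the workflow file instead of
# defining steps, and passes repository secrets through.
nix-build:
  uses: ./.github/workflows/nix_build.yml
  needs: [job_spec]
  if: needs.job_spec.outputs.run_nix == 'true'   # ci.yml also accepts the run-nix label
  secrets: inherit

# Callee side (nix_build.yml): the workflow opts in to being called.
on:
  workflow_call: {}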

.github/workflows/release.yml (new file)

@@ -0,0 +1,488 @@
# Generated from xtask::workflows::release
# Rebuild with `cargo xtask workflows`.
name: release
env:
CARGO_TERM_COLOR: always
RUST_BACKTRACE: '1'
on:
push:
tags:
- v*
jobs:
run_tests_mac:
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_linux:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 250
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_windows:
if: github.repository_owner == 'zed-industries'
runs-on: self-32vcpu-windows-2022
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
New-Item -ItemType Directory -Path "./../.cargo" -Force
Copy-Item -Path "./.cargo/ci-config.toml" -Destination "./../.cargo/config.toml"
shell: pwsh
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy.ps1
shell: pwsh
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: pwsh
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than.ps1 250
shell: pwsh
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: pwsh
- name: steps::cleanup_cargo_config
if: always()
run: |
Remove-Item -Recurse -Path "./../.cargo" -Force -ErrorAction SilentlyContinue
shell: pwsh
timeout-minutes: 60
check_scripts:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_tests::check_scripts::run_shellcheck
run: ./script/shellcheck-scripts error
shell: bash -euxo pipefail {0}
- id: get_actionlint
name: run_tests::check_scripts::download_actionlint
run: bash <(curl https://raw.githubusercontent.com/rhysd/actionlint/main/scripts/download-actionlint.bash)
shell: bash -euxo pipefail {0}
- name: run_tests::check_scripts::run_actionlint
run: |
${{ steps.get_actionlint.outputs.executable }} -color
shell: bash -euxo pipefail {0}
- name: run_tests::check_scripts::check_xtask_workflows
run: |
cargo xtask workflows
if ! git diff --exit-code .github; then
echo "Error: .github directory has uncommitted changes after running 'cargo xtask workflows'"
echo "Please run 'cargo xtask workflows' locally and commit the changes"
exit 1
fi
shell: bash -euxo pipefail {0}
timeout-minutes: 60
create_draft_release:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
fetch-depth: 25
ref: ${{ github.ref }}
- name: script/determine-release-channel
run: script/determine-release-channel
shell: bash -euxo pipefail {0}
- name: mkdir -p target/
run: mkdir -p target/
shell: bash -euxo pipefail {0}
- name: release::create_draft_release::generate_release_notes
run: node --redirect-warnings=/dev/null ./script/draft-release-notes "$RELEASE_VERSION" "$RELEASE_CHANNEL" > target/release-notes.md
shell: bash -euxo pipefail {0}
- name: release::create_draft_release::create_release
run: script/create-draft-release target/release-notes.md
shell: bash -euxo pipefail {0}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
timeout-minutes: 60
bundle_linux_aarch64:
needs:
- run_tests_linux
- check_scripts
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-linux-aarch64.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-linux-aarch64.tar.gz
path: target/release/zed-linux-aarch64.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-linux-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-linux-aarch64.gz
path: target/zed-remote-server-linux-aarch64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_linux_x86_64:
needs:
- run_tests_linux
- check_scripts
runs-on: namespace-profile-32x64-ubuntu-2004
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-linux-x86_64.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-linux-x86_64.tar.gz
path: target/release/zed-linux-x86_64.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-linux-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-linux-x86_64.gz
path: target/zed-remote-server-linux-x86_64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_mac_aarch64:
needs:
- run_tests_mac
- check_scripts
runs-on: self-mini-macos
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed-aarch64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed-aarch64.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-macos-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-macos-aarch64.gz
path: target/zed-remote-server-macos-aarch64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_mac_x86_64:
needs:
- run_tests_mac
- check_scripts
runs-on: self-mini-macos
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed-x86_64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-x86_64.dmg
path: target/x86_64-apple-darwin/release/Zed-x86_64.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-macos-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-macos-x86_64.gz
path: target/zed-remote-server-macos-x86_64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_windows_aarch64:
needs:
- run_tests_windows
- check_scripts
runs-on: self-32vcpu-windows-2022
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows::bundle_windows
run: script/bundle-windows.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed-aarch64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-aarch64.exe
path: target/Zed-aarch64.exe
if-no-files-found: error
timeout-minutes: 60
bundle_windows_x86_64:
needs:
- run_tests_windows
- check_scripts
runs-on: self-32vcpu-windows-2022
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows::bundle_windows
run: script/bundle-windows.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed-x86_64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-x86_64.exe
path: target/Zed-x86_64.exe
if-no-files-found: error
timeout-minutes: 60
upload_release_assets:
needs:
- create_draft_release
- bundle_linux_aarch64
- bundle_linux_x86_64
- bundle_mac_aarch64
- bundle_mac_x86_64
- bundle_windows_aarch64
- bundle_windows_x86_64
runs-on: namespace-profile-4x8-ubuntu-2204
steps:
- name: release::download_workflow_artifacts
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53
with:
path: ./artifacts/
- name: ls -lR ./artifacts
run: ls -lR ./artifacts
shell: bash -euxo pipefail {0}
- name: release::prep_release_artifacts
run: |-
mkdir -p release-artifacts/
mv ./artifacts/Zed-aarch64.dmg/Zed-aarch64.dmg release-artifacts/Zed-aarch64.dmg
mv ./artifacts/Zed-x86_64.dmg/Zed-x86_64.dmg release-artifacts/Zed-x86_64.dmg
mv ./artifacts/zed-linux-aarch64.tar.gz/zed-linux-aarch64.tar.gz release-artifacts/zed-linux-aarch64.tar.gz
mv ./artifacts/zed-linux-x86_64.tar.gz/zed-linux-x86_64.tar.gz release-artifacts/zed-linux-x86_64.tar.gz
mv ./artifacts/Zed-x86_64.exe/Zed-x86_64.exe release-artifacts/Zed-x86_64.exe
mv ./artifacts/Zed-aarch64.exe/Zed-aarch64.exe release-artifacts/Zed-aarch64.exe
mv ./artifacts/zed-remote-server-macos-aarch64.gz/zed-remote-server-macos-aarch64.gz release-artifacts/zed-remote-server-macos-aarch64.gz
mv ./artifacts/zed-remote-server-macos-x86_64.gz/zed-remote-server-macos-x86_64.gz release-artifacts/zed-remote-server-macos-x86_64.gz
mv ./artifacts/zed-remote-server-linux-aarch64.gz/zed-remote-server-linux-aarch64.gz release-artifacts/zed-remote-server-linux-aarch64.gz
mv ./artifacts/zed-remote-server-linux-x86_64.gz/zed-remote-server-linux-x86_64.gz release-artifacts/zed-remote-server-linux-x86_64.gz
shell: bash -euxo pipefail {0}
- name: gh release upload "$GITHUB_REF_NAME" --repo=zed-industries/zed release-artifacts/*
run: gh release upload "$GITHUB_REF_NAME" --repo=zed-industries/zed release-artifacts/*
shell: bash -euxo pipefail {0}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
auto_release_preview:
needs:
- upload_release_assets
if: startsWith(github.ref, 'refs/tags/v') && endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre')
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: gh release edit "$GITHUB_REF_NAME" --repo=zed-industries/zed --draft=false
run: gh release edit "$GITHUB_REF_NAME" --repo=zed-industries/zed --draft=false
shell: bash -euxo pipefail {0}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: release::create_sentry_release
uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c
with:
environment: production
env:
SENTRY_ORG: zed-dev
SENTRY_PROJECT: zed
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
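
Because every workflow in this new file is generated from `xtask::workflows`, the `check_scripts` job guards against drift: it regenerates the YAML and fails when `.github` no longer matches the committed output. The same check is cheap to run before pushing; a sketch of the core of that step:

- name: Verify generated workflows are committed (sketch)
  run: |
    cargo xtask workflows
    # `git diff --exit-code` returns 1 when .github has uncommitted changes,
    # which is exactly the failure check_scripts reports.
    git diff --exit-code .github
  shell: bash -euxo pipefail {0}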


@@ -3,12 +3,7 @@
name: release_nightly
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
on:
push:
tags:
@@ -32,41 +27,6 @@ jobs:
run: ./script/clippy
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_mac:
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_windows:
if: github.repository_owner == 'zed-industries'
runs-on: self-32vcpu-windows-2022
@@ -102,176 +62,215 @@ jobs:
Remove-Item -Recurse -Path "./../.cargo" -Force -ErrorAction SilentlyContinue
shell: pwsh
timeout-minutes: 60
bundle_mac_nightly_x86_64:
bundle_linux_aarch64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
- run_tests_windows
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: release_nightly::set_release_channel_to_nightly
- name: run_bundling::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
shell: bash -euxo pipefail {0}
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly macos x86_64
shell: bash -euxo pipefail {0}
timeout-minutes: 60
bundle_mac_nightly_aarch64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: release_nightly::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly macos aarch64
shell: bash -euxo pipefail {0}
timeout-minutes: 60
bundle_linux_nightly_x86_64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-32x64-ubuntu-2004
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::add_rust_to_path
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: ./script/linux
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: ./script/install-mold
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 100
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: release_nightly::set_release_channel_to_nightly
- name: '@actions/upload-artifact zed-linux-aarch64.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-linux-aarch64.tar.gz
path: target/release/zed-linux-aarch64.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-linux-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-linux-aarch64.gz
path: target/zed-remote-server-linux-aarch64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_linux_x86_64:
needs:
- check_style
- run_tests_windows
runs-on: namespace-profile-32x64-ubuntu-2004
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_bundling::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly linux-targz x86_64
shell: bash -euxo pipefail {0}
timeout-minutes: 60
bundle_linux_nightly_aarch64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::add_rust_to_path
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: ./script/linux
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 100
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: release_nightly::set_release_channel_to_nightly
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-linux-x86_64.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-linux-x86_64.tar.gz
path: target/release/zed-linux-x86_64.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-linux-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-linux-x86_64.gz
path: target/zed-remote-server-linux-x86_64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_mac_aarch64:
needs:
- check_style
- run_tests_windows
runs-on: self-mini-macos
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_bundling::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly linux-targz aarch64
- name: run_bundling::bundle_mac::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed-aarch64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed-aarch64.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-macos-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-macos-aarch64.gz
path: target/zed-remote-server-macos-aarch64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_windows_nightly_x86_64:
bundle_mac_x86_64:
needs:
- check_style
- run_tests_windows
runs-on: self-mini-macos
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_bundling::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed-x86_64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-x86_64.dmg
path: target/x86_64-apple-darwin/release/Zed-x86_64.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-macos-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-macos-x86_64.gz
path: target/zed-remote-server-macos-x86_64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_windows_aarch64:
needs:
- check_style
- run_tests_windows
if: github.repository_owner == 'zed-industries'
runs-on: self-32vcpu-windows-2022
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
@@ -286,11 +285,7 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::set_release_channel_to_nightly
- name: run_bundling::set_release_channel_to_nightly
run: |
$ErrorActionPreference = "Stop"
$version = git rev-parse --short HEAD
@@ -298,61 +293,71 @@ jobs:
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::build_zed_installer
run: script/bundle-windows.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::upload_zed_nightly_windows
run: script/upload-nightly.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
timeout-minutes: 60
bundle_windows_nightly_aarch64:
needs:
- check_style
- run_tests_windows
if: github.repository_owner == 'zed-industries'
runs-on: self-32vcpu-windows-2022
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::set_release_channel_to_nightly
run: |
$ErrorActionPreference = "Stop"
$version = git rev-parse --short HEAD
Write-Host "Publishing version: $version on release channel nightly"
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::build_zed_installer
- name: run_bundling::bundle_windows::bundle_windows
run: script/bundle-windows.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::upload_zed_nightly_windows
run: script/upload-nightly.ps1 -Architecture aarch64
- name: '@actions/upload-artifact Zed-aarch64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-aarch64.exe
path: target/Zed-aarch64.exe
if-no-files-found: error
timeout-minutes: 60
bundle_windows_x86_64:
needs:
- check_style
- run_tests_windows
runs-on: self-32vcpu-windows-2022
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_bundling::set_release_channel_to_nightly
run: |
$ErrorActionPreference = "Stop"
$version = git rev-parse --short HEAD
Write-Host "Publishing version: $version on release channel nightly"
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows::bundle_windows
run: script/bundle-windows.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed-x86_64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-x86_64.exe
path: target/Zed-x86_64.exe
if-no-files-found: error
timeout-minutes: 60
build_nix_linux_x86_64:
needs:
- check_style
- run_tests_mac
- run_tests_windows
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-32x64-ubuntu-2004
env:
@@ -365,17 +370,17 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::install_nix
- name: nix_build::build_nix::install_nix
uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- name: nix_build::cachix_action
- name: nix_build::build_nix::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
- name: nix_build::build
- name: nix_build::build_nix::build
run: nix build .#default -L --accept-flake-config
shell: bash -euxo pipefail {0}
timeout-minutes: 60
@@ -383,7 +388,7 @@ jobs:
build_nix_mac_aarch64:
needs:
- check_style
- run_tests_mac
- run_tests_windows
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
env:
@@ -396,21 +401,21 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::set_path
- name: nix_build::build_nix::set_path
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: nix_build::cachix_action
- name: nix_build::build_nix::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
- name: nix_build::build
- name: nix_build::build_nix::build
run: nix build .#default -L --accept-flake-config
shell: bash -euxo pipefail {0}
- name: nix_build::limit_store
- name: nix_build::build_nix::limit_store
run: |-
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
@@ -420,21 +425,49 @@ jobs:
continue-on-error: true
update_nightly_tag:
needs:
- bundle_mac_nightly_x86_64
- bundle_mac_nightly_aarch64
- bundle_linux_nightly_x86_64
- bundle_linux_nightly_aarch64
- bundle_windows_nightly_x86_64
- bundle_windows_nightly_aarch64
- bundle_linux_aarch64
- bundle_linux_x86_64
- bundle_mac_aarch64
- bundle_mac_x86_64
- bundle_windows_aarch64
- bundle_windows_x86_64
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
runs-on: namespace-profile-4x8-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
fetch-depth: 0
- name: release_nightly::update_nightly_tag
- name: release::download_workflow_artifacts
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53
with:
path: ./artifacts/
- name: ls -lR ./artifacts
run: ls -lR ./artifacts
shell: bash -euxo pipefail {0}
- name: release::prep_release_artifacts
run: |-
mkdir -p release-artifacts/
mv ./artifacts/Zed-aarch64.dmg/Zed-aarch64.dmg release-artifacts/Zed-aarch64.dmg
mv ./artifacts/Zed-x86_64.dmg/Zed-x86_64.dmg release-artifacts/Zed-x86_64.dmg
mv ./artifacts/zed-linux-aarch64.tar.gz/zed-linux-aarch64.tar.gz release-artifacts/zed-linux-aarch64.tar.gz
mv ./artifacts/zed-linux-x86_64.tar.gz/zed-linux-x86_64.tar.gz release-artifacts/zed-linux-x86_64.tar.gz
mv ./artifacts/Zed-x86_64.exe/Zed-x86_64.exe release-artifacts/Zed-x86_64.exe
mv ./artifacts/Zed-aarch64.exe/Zed-aarch64.exe release-artifacts/Zed-aarch64.exe
mv ./artifacts/zed-remote-server-macos-aarch64.gz/zed-remote-server-macos-aarch64.gz release-artifacts/zed-remote-server-macos-aarch64.gz
mv ./artifacts/zed-remote-server-macos-x86_64.gz/zed-remote-server-macos-x86_64.gz release-artifacts/zed-remote-server-macos-x86_64.gz
mv ./artifacts/zed-remote-server-linux-aarch64.gz/zed-remote-server-linux-aarch64.gz release-artifacts/zed-remote-server-linux-aarch64.gz
mv ./artifacts/zed-remote-server-linux-x86_64.gz/zed-remote-server-linux-x86_64.gz release-artifacts/zed-remote-server-linux-x86_64.gz
shell: bash -euxo pipefail {0}
- name: ./script/upload-nightly
run: ./script/upload-nightly
shell: bash -euxo pipefail {0}
env:
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
- name: release_nightly::update_nightly_tag_job::update_nightly_tag
run: |
if [ "$(git rev-parse nightly)" = "$(git rev-parse HEAD)" ]; then
echo "Nightly tag already points to current commit. Skipping tagging."
@@ -445,7 +478,7 @@ jobs:
git tag -f nightly
git push origin nightly --force
shell: bash -euxo pipefail {0}
- name: release_nightly::create_sentry_release
- name: release::create_sentry_release
uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c
with:
environment: production

.github/workflows/run_agent_evals.yml (new file)

@@ -0,0 +1,62 @@
# Generated from xtask::workflows::run_agent_evals
# Rebuild with `cargo xtask workflows`.
name: run_agent_evals
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_EVAL_TELEMETRY: '1'
on:
pull_request:
types:
- synchronize
- reopened
- labeled
branches:
- '**'
schedule:
- cron: 0 0 * * *
workflow_dispatch: {}
jobs:
agent_evals:
if: |
github.repository_owner == 'zed-industries' &&
(github.event_name != 'pull_request' || contains(github.event.pull_request.labels.*.name, 'run-eval'))
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: cargo build --package=eval
run: cargo build --package=eval
shell: bash -euxo pipefail {0}
- name: run_agent_evals::agent_evals::run_eval
run: cargo run --package=eval -- --repetitions=8 --concurrency=1
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
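
The workflow above is generated by the repository's `cargo xtask workflows` command rather than edited by hand, as its header comment states. As a rough, hypothetical sketch of that cargo-xtask pattern only — the real generator lives in the `xtask` crate and builds typed workflows with the pinned `gh-workflow` fork, so everything below besides the subcommand name and output path is made up for illustration:

```rust
// Minimal sketch of the cargo-xtask pattern; not Zed's actual generator.
use std::{env, fs, process::ExitCode};

fn main() -> ExitCode {
    match env::args().nth(1).as_deref() {
        Some("workflows") => {
            // The real xtask builds each workflow as a typed value and serializes it;
            // this stub only shows the "regenerate checked-in YAML" flow.
            let yaml = "# Generated from xtask::workflows::run_agent_evals\n\
                        # Rebuild with `cargo xtask workflows`.\n\
                        name: run_agent_evals\n";
            fs::create_dir_all(".github/workflows").expect("create workflows dir");
            fs::write(".github/workflows/run_agent_evals.yml", yaml).expect("write workflow");
            ExitCode::SUCCESS
        }
        _ => {
            eprintln!("usage: cargo xtask workflows");
            ExitCode::FAILURE
        }
    }
}
```

Checking the generated YAML into the repo keeps CI reviewable in diffs like this one while the Rust side stays the single source of truth.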

View File

@@ -3,103 +3,62 @@
name: run_bundling
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
on:
pull_request:
types:
- labeled
- synchronize
jobs:
bundle_mac_x86_64:
bundle_linux_aarch64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-mini-macos
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg'
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-linux-aarch64.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg
path: target/x86_64-apple-darwin/release/Zed.dmg
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-x86_64.gz'
name: zed-linux-aarch64.tar.gz
path: target/release/zed-linux-aarch64.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-linux-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-x86_64.gz
path: target/zed-remote-server-macos-x86_64.gz
timeout-minutes: 60
bundle_mac_arm64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-mini-macos
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed.dmg
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-aarch64.gz
path: target/zed-remote-server-macos-aarch64.gz
name: zed-remote-server-linux-aarch64.gz
path: target/zed-remote-server-linux-aarch64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_linux_x86_64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: namespace-profile-32x64-ubuntu-2004
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
@@ -118,22 +77,129 @@ jobs:
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz'
- name: '@actions/upload-artifact zed-linux-x86_64.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz'
name: zed-linux-x86_64.tar.gz
path: target/release/zed-linux-x86_64.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-linux-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/release/zed-remote-server-*.tar.gz
name: zed-remote-server-linux-x86_64.gz
path: target/zed-remote-server-linux-x86_64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_linux_arm64:
bundle_mac_aarch64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
runs-on: self-mini-macos
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed-aarch64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed-aarch64.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-macos-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-macos-aarch64.gz
path: target/zed-remote-server-macos-aarch64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_mac_x86_64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-mini-macos
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed-x86_64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-x86_64.dmg
path: target/x86_64-apple-darwin/release/Zed-x86_64.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-macos-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-macos-x86_64.gz
path: target/zed-remote-server-macos-x86_64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_windows_aarch64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-32vcpu-windows-2022
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
@@ -143,25 +209,16 @@ jobs:
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz'
- name: run_bundling::bundle_windows::bundle_windows
run: script/bundle-windows.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed-aarch64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/release/zed-remote-server-*.tar.gz
name: Zed-aarch64.exe
path: target/Zed-aarch64.exe
if-no-files-found: error
timeout-minutes: 60
bundle_windows_x86_64:
if: |-
@@ -169,6 +226,9 @@ jobs:
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-32vcpu-windows-2022
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
@@ -187,49 +247,16 @@ jobs:
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows
- name: run_bundling::bundle_windows::bundle_windows
run: script/bundle-windows.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.exe'
- name: '@actions/upload-artifact Zed-x86_64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.exe
path: ${{ env.SETUP_PATH }}
timeout-minutes: 60
bundle_windows_arm64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-32vcpu-windows-2022
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows
run: script/bundle-windows.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.exe
path: ${{ env.SETUP_PATH }}
name: Zed-x86_64.exe
path: target/Zed-x86_64.exe
if-no-files-found: error
timeout-minutes: 60
concurrency:
group: ${{ github.workflow }}-${{ github.head_ref || github.ref }}

View File

@@ -66,6 +66,10 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_pnpm
uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2
with:
@@ -145,6 +149,10 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
@@ -156,7 +164,7 @@ jobs:
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 100
run: ./script/clear-target-dir-if-larger-than 250
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
@@ -214,10 +222,10 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
@@ -261,6 +269,10 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: cargo build -p collab
run: cargo build -p collab
shell: bash -euxo pipefail {0}
@@ -273,40 +285,6 @@ jobs:
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
check_postgres_and_protobuf_migrations:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- name: run_tests::check_postgres_and_protobuf_migrations::remove_untracked_files
run: git clean -df
shell: bash -euxo pipefail {0}
- name: run_tests::check_postgres_and_protobuf_migrations::ensure_fresh_merge
run: |
if [ -z "$GITHUB_BASE_REF" ];
then
echo "BUF_BASE_BRANCH=$(git merge-base origin/main HEAD)" >> "$GITHUB_ENV"
else
git checkout -B temp
git merge -q "origin/$GITHUB_BASE_REF" -m "merge main into temp"
echo "BUF_BASE_BRANCH=$GITHUB_BASE_REF" >> "$GITHUB_ENV"
fi
shell: bash -euxo pipefail {0}
- name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_setup_action
uses: bufbuild/buf-setup-action@v1
with:
version: v1.29.0
- name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_breaking_action
uses: bufbuild/buf-breaking-action@v1
with:
input: crates/proto/proto/
against: https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/
timeout-minutes: 60
check_dependencies:
needs:
- orchestrate
@@ -317,6 +295,10 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: run_tests::check_dependencies::install_cargo_machete
uses: clechasseur/rs-cargo@8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386
with:
@@ -350,10 +332,10 @@ jobs:
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
cache: rust
- name: run_tests::check_docs::lychee_link_check
uses: lycheeverse/lychee-action@82202e5e9c2f4ef1a55a3d02563e1cb6041e5332
with:
@@ -392,6 +374,10 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: ./script/check-licenses
run: ./script/check-licenses
shell: bash -euxo pipefail {0}
@@ -444,18 +430,18 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::install_nix
- name: nix_build::build_nix::install_nix
uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- name: nix_build::cachix_action
- name: nix_build::build_nix::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build
- name: nix_build::build_nix::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
timeout-minutes: 60
@@ -475,22 +461,22 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::set_path
- name: nix_build::build_nix::set_path
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: nix_build::cachix_action
- name: nix_build::build_nix::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build
- name: nix_build::build_nix::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
- name: nix_build::limit_store
- name: nix_build::build_nix::limit_store
run: |-
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
@@ -498,6 +484,40 @@ jobs:
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
check_postgres_and_protobuf_migrations:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- name: run_tests::check_postgres_and_protobuf_migrations::remove_untracked_files
run: git clean -df
shell: bash -euxo pipefail {0}
- name: run_tests::check_postgres_and_protobuf_migrations::ensure_fresh_merge
run: |
if [ -z "$GITHUB_BASE_REF" ];
then
echo "BUF_BASE_BRANCH=$(git merge-base origin/main HEAD)" >> "$GITHUB_ENV"
else
git checkout -B temp
git merge -q "origin/$GITHUB_BASE_REF" -m "merge main into temp"
echo "BUF_BASE_BRANCH=$GITHUB_BASE_REF" >> "$GITHUB_ENV"
fi
shell: bash -euxo pipefail {0}
- name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_setup_action
uses: bufbuild/buf-setup-action@v1
with:
version: v1.29.0
- name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_breaking_action
uses: bufbuild/buf-breaking-action@v1
with:
input: crates/proto/proto/
against: https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/
timeout-minutes: 60
tests_pass:
needs:
- orchestrate
@@ -507,7 +527,6 @@ jobs:
- run_tests_mac
- doctests
- check_workspace_binaries
- check_postgres_and_protobuf_migrations
- check_dependencies
- check_docs
- check_licenses
@@ -534,7 +553,6 @@ jobs:
check_result "run_tests_mac" "${{ needs.run_tests_mac.result }}"
check_result "doctests" "${{ needs.doctests.result }}"
check_result "check_workspace_binaries" "${{ needs.check_workspace_binaries.result }}"
check_result "check_postgres_and_protobuf_migrations" "${{ needs.check_postgres_and_protobuf_migrations.result }}"
check_result "check_dependencies" "${{ needs.check_dependencies.result }}"
check_result "check_docs" "${{ needs.check_docs.result }}"
check_result "check_licenses" "${{ needs.check_licenses.result }}"

.github/workflows/run_unit_evals.yml (new file)

@@ -0,0 +1,63 @@
# Generated from xtask::workflows::run_agent_evals
# Rebuild with `cargo xtask workflows`.
name: run_agent_evals
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
on:
schedule:
- cron: 47 1 * * 2
workflow_dispatch: {}
jobs:
unit_evals:
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 250
shell: bash -euxo pipefail {0}
- name: ./script/run-unit-evals
run: ./script/run-unit-evals
shell: bash -euxo pipefail {0}
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
- name: run_agent_evals::unit_evals::send_failure_to_slack
if: ${{ failure() }}
uses: slackapi/slack-github-action@b0fa283ad8fea605de13dc3f449259339835fc52
with:
method: chat.postMessage
token: ${{ secrets.SLACK_APP_ZED_UNIT_EVALS_BOT_TOKEN }}
payload: |
channel: C04UDRNNJFQ
text: "Unit Evals Failed: https://github.com/zed-industries/zed/actions/runs/${{ github.run_id }}"
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true

View File

@@ -1,86 +0,0 @@
name: Run Unit Evals
on:
schedule:
# GitHub might drop jobs at busy times, so we choose a random time in the middle of the night.
- cron: "47 1 * * 2"
workflow_dispatch:
concurrency:
# Allow only one workflow per any non-`main` branch.
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: 0
RUST_BACKTRACE: 1
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
jobs:
unit_evals:
if: github.repository_owner == 'zed-industries'
timeout-minutes: 60
name: Run unit evals
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Linux dependencies
run: ./script/linux
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Install Rust
shell: bash -euxo pipefail {0}
run: |
cargo install cargo-nextest --locked
- name: Install Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
with:
node-version: "18"
- name: Limit target directory size
shell: bash -euxo pipefail {0}
run: script/clear-target-dir-if-larger-than 100
- name: Run unit evals
shell: bash -euxo pipefail {0}
run: cargo nextest run --workspace --no-fail-fast --features unit-eval --no-capture -E 'test(::eval_)'
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
- name: Send failure message to Slack channel if needed
if: ${{ failure() }}
uses: slackapi/slack-github-action@b0fa283ad8fea605de13dc3f449259339835fc52
with:
method: chat.postMessage
token: ${{ secrets.SLACK_APP_ZED_UNIT_EVALS_BOT_TOKEN }}
payload: |
channel: C04UDRNNJFQ
text: "Unit Evals Failed: https://github.com/zed-industries/zed/actions/runs/${{ github.run_id }}"
# Even though the Linux runner is not stateful, in theory there is no need to do this cleanup.
# But, to avoid potential issues in the future if we choose to use a stateful Linux runner and forget to add code
# to clean up the config file, I've included the cleanup code here as a precaution.
# While it's not strictly necessary at this moment, I believe it's better to err on the side of caution.
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo

Cargo.lock (generated)

@@ -1339,6 +1339,7 @@ dependencies = [
"settings",
"smol",
"tempfile",
"util",
"which 6.0.3",
"workspace",
]
@@ -1350,6 +1351,7 @@ dependencies = [
"anyhow",
"log",
"simplelog",
"tempfile",
"windows 0.61.3",
"winresource",
]
@@ -4528,12 +4530,15 @@ dependencies = [
"fs",
"futures 0.3.31",
"gpui",
"http_client",
"json_dotpath",
"language",
"log",
"node_runtime",
"paths",
"serde",
"serde_json",
"settings",
"smol",
"task",
"util",
@@ -4932,6 +4937,7 @@ dependencies = [
"editor",
"gpui",
"indoc",
"itertools 0.14.0",
"language",
"log",
"lsp",
@@ -5832,8 +5838,6 @@ name = "extension"
version = "0.1.0"
dependencies = [
"anyhow",
"async-compression",
"async-tar",
"async-trait",
"collections",
"dap",
@@ -6400,7 +6404,7 @@ dependencies = [
"ignore",
"libc",
"log",
"notify 8.0.0",
"notify 8.2.0",
"objc",
"parking_lot",
"paths",
@@ -6649,7 +6653,9 @@ version = "0.1.0"
dependencies = [
"gpui",
"log",
"nucleo",
"util",
"util_macros",
]
[[package]]
@@ -6957,7 +6963,7 @@ dependencies = [
[[package]]
name = "gh-workflow"
version = "0.8.0"
source = "git+https://github.com/zed-industries/gh-workflow?rev=0090c6b6ef82fff02bc8616645953e778d1acc08#0090c6b6ef82fff02bc8616645953e778d1acc08"
source = "git+https://github.com/zed-industries/gh-workflow?rev=3eaa84abca0778eb54272f45a312cb24f9a0b435#3eaa84abca0778eb54272f45a312cb24f9a0b435"
dependencies = [
"async-trait",
"derive_more 2.0.1",
@@ -6974,7 +6980,7 @@ dependencies = [
[[package]]
name = "gh-workflow-macros"
version = "0.8.0"
source = "git+https://github.com/zed-industries/gh-workflow?rev=0090c6b6ef82fff02bc8616645953e778d1acc08#0090c6b6ef82fff02bc8616645953e778d1acc08"
source = "git+https://github.com/zed-industries/gh-workflow?rev=3eaa84abca0778eb54272f45a312cb24f9a0b435#3eaa84abca0778eb54272f45a312cb24f9a0b435"
dependencies = [
"heck 0.5.0",
"quote",
@@ -7074,6 +7080,7 @@ dependencies = [
"serde_json",
"settings",
"url",
"urlencoding",
"util",
]
@@ -7086,7 +7093,6 @@ dependencies = [
"askpass",
"buffer_diff",
"call",
"chrono",
"cloud_llm_client",
"collections",
"command_palette_hooks",
@@ -7112,6 +7118,8 @@ dependencies = [
"picker",
"pretty_assertions",
"project",
"recent_projects",
"remote",
"schemars 1.0.4",
"serde",
"serde_json",
@@ -10402,11 +10410,10 @@ dependencies = [
[[package]]
name = "notify"
version = "8.0.0"
source = "git+https://github.com/zed-industries/notify.git?rev=bbb9ea5ae52b253e095737847e367c30653a2e96#bbb9ea5ae52b253e095737847e367c30653a2e96"
version = "8.2.0"
source = "git+https://github.com/zed-industries/notify.git?rev=b4588b2e5aee68f4c0e100f140e808cbce7b1419#b4588b2e5aee68f4c0e100f140e808cbce7b1419"
dependencies = [
"bitflags 2.9.4",
"filetime",
"fsevent-sys 4.1.0",
"inotify 0.11.0",
"kqueue",
@@ -10415,7 +10422,7 @@ dependencies = [
"mio 1.1.0",
"notify-types",
"walkdir",
"windows-sys 0.59.0",
"windows-sys 0.60.2",
]
[[package]]
@@ -10432,7 +10439,7 @@ dependencies = [
[[package]]
name = "notify-types"
version = "2.0.0"
source = "git+https://github.com/zed-industries/notify.git?rev=bbb9ea5ae52b253e095737847e367c30653a2e96#bbb9ea5ae52b253e095737847e367c30653a2e96"
source = "git+https://github.com/zed-industries/notify.git?rev=b4588b2e5aee68f4c0e100f140e808cbce7b1419#b4588b2e5aee68f4c0e100f140e808cbce7b1419"
[[package]]
name = "now"
@@ -10461,6 +10468,27 @@ dependencies = [
"windows-sys 0.61.2",
]
[[package]]
name = "nucleo"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5262af4c94921c2646c5ac6ff7900c2af9cbb08dc26a797e18130a7019c039d4"
dependencies = [
"nucleo-matcher",
"parking_lot",
"rayon",
]
[[package]]
name = "nucleo-matcher"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bf33f538733d1a5a3494b836ba913207f14d9d4a1d3cd67030c5061bdd2cac85"
dependencies = [
"memchr",
"unicode-segmentation",
]
[[package]]
name = "num"
version = "0.4.3"
@@ -12711,12 +12739,6 @@ version = "0.2.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5da3b0203fd7ee5720aa0b5e790b591aa5d3f41c3ed2c34a3a393382198af2f7"
[[package]]
name = "pollster"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2f3a9f18d041e6d0e102a0a46750538147e5e8992d3b4873aaafee2520b00ce3"
[[package]]
name = "portable-atomic"
version = "1.11.1"
@@ -12765,7 +12787,7 @@ dependencies = [
"log",
"parking_lot",
"pin-project",
"pollster 0.2.5",
"pollster",
"static_assertions",
"thiserror 1.0.69",
]
@@ -13970,6 +13992,7 @@ dependencies = [
"gpui",
"gpui_tokio",
"http_client",
"image",
"json_schema_store",
"language",
"language_extension",
@@ -14317,6 +14340,7 @@ dependencies = [
"gpui",
"log",
"rand 0.9.2",
"rayon",
"sum_tree",
"unicode-segmentation",
"util",
@@ -16242,7 +16266,6 @@ checksum = "2b2231b7c3057d5e4ad0156fb3dc807d900806020c5ffa3ee6ff2c8c76fb8520"
name = "streaming_diff"
version = "0.1.0"
dependencies = [
"gpui",
"ordered-float 2.10.1",
"rand 0.9.2",
"rope",
@@ -16361,11 +16384,9 @@ version = "0.1.0"
dependencies = [
"arrayvec",
"ctor",
"futures 0.3.31",
"itertools 0.14.0",
"log",
"pollster 0.4.0",
"rand 0.9.2",
"rayon",
"zlog",
]
@@ -18050,7 +18071,7 @@ dependencies = [
[[package]]
name = "tree-sitter-gomod"
version = "1.1.1"
source = "git+https://github.com/camdencheek/tree-sitter-go-mod?rev=6efb59652d30e0e9cd5f3b3a669afd6f1a926d3c#6efb59652d30e0e9cd5f3b3a669afd6f1a926d3c"
source = "git+https://github.com/camdencheek/tree-sitter-go-mod?rev=2e886870578eeba1927a2dc4bd2e2b3f598c5f9a#2e886870578eeba1927a2dc4bd2e2b3f598c5f9a"
dependencies = [
"cc",
"tree-sitter-language",
@@ -18619,6 +18640,7 @@ dependencies = [
"itertools 0.14.0",
"libc",
"log",
"mach2 0.5.0",
"nix 0.29.0",
"pretty_assertions",
"rand 0.9.2",
@@ -20952,6 +20974,7 @@ dependencies = [
"gh-workflow",
"indexmap 2.11.4",
"indoc",
"serde",
"toml 0.8.23",
"toml_edit 0.22.27",
]
@@ -21136,7 +21159,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.212.0"
version = "0.213.0"
dependencies = [
"acp_tools",
"activity_indicator",
@@ -21228,6 +21251,7 @@ dependencies = [
"project_symbols",
"prompt_store",
"proto",
"rayon",
"recent_projects",
"release_channel",
"remote",
@@ -21742,6 +21766,7 @@ dependencies = [
"futures 0.3.31",
"gpui",
"gpui_tokio",
"indoc",
"language",
"language_extension",
"language_model",
@@ -21752,8 +21777,10 @@ dependencies = [
"ordered-float 2.10.1",
"paths",
"polars",
"pretty_assertions",
"project",
"prompt_store",
"pulldown-cmark 0.12.2",
"release_channel",
"reqwest_client",
"serde",
@@ -21763,6 +21790,7 @@ dependencies = [
"smol",
"soa-rs",
"terminal_view",
"toml 0.8.23",
"util",
"watch",
"zeta",

View File

@@ -508,7 +508,7 @@ fork = "0.2.0"
futures = "0.3"
futures-batch = "0.6.1"
futures-lite = "1.13"
gh-workflow = { git = "https://github.com/zed-industries/gh-workflow", rev = "0090c6b6ef82fff02bc8616645953e778d1acc08" }
gh-workflow = { git = "https://github.com/zed-industries/gh-workflow", rev = "3eaa84abca0778eb54272f45a312cb24f9a0b435" }
git2 = { version = "0.20.1", default-features = false }
globset = "0.4"
handlebars = "4.3"
@@ -547,6 +547,7 @@ naga = { version = "25.0", features = ["wgsl-in"] }
nanoid = "0.4"
nbformat = { git = "https://github.com/ConradIrwin/runtimed", rev = "7130c804216b6914355d15d0b91ea91f6babd734" }
nix = "0.29"
nucleo = "0.5"
num-format = "0.4.4"
num-traits = "0.2"
objc = "0.2"
@@ -663,6 +664,7 @@ time = { version = "0.3", features = [
"serde",
"serde-well-known",
"formatting",
"local-offset",
] }
tiny_http = "0.8"
tokio = { version = "1" }
@@ -680,7 +682,7 @@ tree-sitter-elixir = "0.3"
tree-sitter-embedded-template = "0.23.0"
tree-sitter-gitcommit = { git = "https://github.com/zed-industries/tree-sitter-git-commit", rev = "88309716a69dd13ab83443721ba6e0b491d37ee9" }
tree-sitter-go = "0.23"
tree-sitter-go-mod = { git = "https://github.com/camdencheek/tree-sitter-go-mod", rev = "6efb59652d30e0e9cd5f3b3a669afd6f1a926d3c", package = "tree-sitter-gomod" }
tree-sitter-go-mod = { git = "https://github.com/camdencheek/tree-sitter-go-mod", rev = "2e886870578eeba1927a2dc4bd2e2b3f598c5f9a", package = "tree-sitter-gomod" }
tree-sitter-gowork = { git = "https://github.com/zed-industries/tree-sitter-go-work", rev = "acb0617bf7f4fda02c6217676cc64acb89536dc7" }
tree-sitter-heex = { git = "https://github.com/zed-industries/tree-sitter-heex", rev = "1dd45142fbb05562e35b2040c6129c9bca346592" }
tree-sitter-html = "0.23"
@@ -772,8 +774,8 @@ features = [
]
[patch.crates-io]
notify = { git = "https://github.com/zed-industries/notify.git", rev = "bbb9ea5ae52b253e095737847e367c30653a2e96" }
notify-types = { git = "https://github.com/zed-industries/notify.git", rev = "bbb9ea5ae52b253e095737847e367c30653a2e96" }
notify = { git = "https://github.com/zed-industries/notify.git", rev = "b4588b2e5aee68f4c0e100f140e808cbce7b1419" }
notify-types = { git = "https://github.com/zed-industries/notify.git", rev = "b4588b2e5aee68f4c0e100f140e808cbce7b1419" }
windows-capture = { git = "https://github.com/zed-industries/windows-capture.git", rev = "f0d6c1b6691db75461b732f6d5ff56eed002eeb9" }
[profile.dev]
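
The new `nucleo = "0.5"` workspace dependency (and the `nucleo-matcher` crate it pulls in, per the Cargo.lock changes above) backs the fuzzy-matching work elsewhere in this diff. A minimal sketch of the matcher API only, assuming the stock `nucleo_matcher` 0.3 surface and made-up candidate paths rather than Zed's actual `fuzzy` crate integration:

```rust
// Illustrative use of nucleo_matcher; not how the `fuzzy` crate wires it up.
use nucleo_matcher::pattern::{CaseMatching, Normalization, Pattern};
use nucleo_matcher::{Config, Matcher};

fn main() {
    // Hypothetical candidates standing in for project paths.
    let candidates = [
        "crates/fuzzy/src/lib.rs",
        "crates/file_finder/src/file_finder.rs",
        "docs/src/fuzzy-finder.md",
    ];
    let mut matcher = Matcher::new(Config::DEFAULT);
    // Smart case: an all-lowercase query matches case-insensitively.
    let pattern = Pattern::parse("fuzzy lib", CaseMatching::Smart, Normalization::Smart);
    // match_list returns (candidate, score) pairs with the best matches first.
    for (path, score) in pattern.match_list(candidates, &mut matcher) {
        println!("{score:>4}  {path}");
    }
}
```

The scores are only meaningful relative to one another, so callers rank by score and layer their own tie-breaking on top.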

View File

@@ -8,107 +8,110 @@
; to other areas too.
<all>
= @ConradIrwin
= @maxdeviant
= @SomeoneToIgnore
= @probably-neb
= @danilo-leal
= @Veykril
= @kubkon
= @p1n3appl3
= @dinocosta
= @smitbarmase
= @cole-miller
= @ConradIrwin
= @danilo-leal
= @dinocosta
= @HactarCE
vim
= @ConradIrwin
= @probably-neb
= @kubkon
= @maxdeviant
= @p1n3appl3
= @dinocosta
gpui
= @mikayla-maki
git
= @cole-miller
= @danilo-leal
linux
= @dvdsk
= @probably-neb
= @smitbarmase
= @p1n3appl3
= @cole-miller
= @probably-neb
windows
= @reflectronic
= @localcc
pickers
= @p1n3appl3
= @dvdsk
= @SomeoneToIgnore
= @Veykril
ai
= @benbrandt
= @bennetbo
= @danilo-leal
= @rtfeldman
audio
= @dvdsk
helix
= @kubkon
terminal
= @kubkon
= @Veykril
debugger
= @kubkon
= @osiewicz
= @Anthony-Eid
extension
= @kubkon
settings_ui
= @probably-neb
= @danilo-leal
= @Anthony-Eid
crashes
= @p1n3appl3
= @Veykril
ai
= @rtfeldman
= @danilo-leal
= @benbrandt
= @bennetbo
debugger
= @Anthony-Eid
= @kubkon
= @osiewicz
design
= @danilo-leal
docs
= @probably-neb
extension
= @kubkon
git
= @cole-miller
= @danilo-leal
gpui
= @Anthony-Eid
= @cameron1024
= @mikayla-maki
= @probably-neb
helix
= @kubkon
languages
= @osiewicz
= @probably-neb
= @smitbarmase
= @SomeoneToIgnore
= @Veykril
linux
= @cole-miller
= @dvdsk
= @p1n3appl3
= @probably-neb
= @smitbarmase
lsp
= @osiewicz
= @smitbarmase
= @SomeoneToIgnore
= @Veykril
multi_buffer
= @Veykril
= @SomeoneToIgnore
lsp
= @osiewicz
= @Veykril
= @smitbarmase
pickers
= @dvdsk
= @p1n3appl3
= @SomeoneToIgnore
languages
= @osiewicz
= @Veykril
= @smitbarmase
= @SomeoneToIgnore
= @probably-neb
project_panel
= @smitbarmase
settings_ui
= @Anthony-Eid
= @danilo-leal
= @probably-neb
tasks
= @SomeoneToIgnore
= @Veykril
docs
terminal
= @kubkon
= @Veykril
vim
= @ConradIrwin
= @dinocosta
= @p1n3appl3
= @probably-neb
windows
= @localcc
= @reflectronic

View File

@@ -0,0 +1,4 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M11.3335 13.3333L8.00017 10L4.66685 13.3333" stroke="black" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M11.3335 2.66669L8.00017 6.00002L4.66685 2.66669" stroke="black" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
</svg>

New icon added (382 B, SVG shown above)

View File

Image file changed (5.5 KiB before and after)

View File

Image file changed (3.9 KiB before and after)

View File

@@ -43,7 +43,8 @@
"f11": "zed::ToggleFullScreen",
"ctrl-alt-z": "edit_prediction::RateCompletions",
"ctrl-alt-shift-i": "edit_prediction::ToggleMenu",
"ctrl-alt-l": "lsp_tool::ToggleMenu"
"ctrl-alt-l": "lsp_tool::ToggleMenu",
"ctrl-alt-.": "project_panel::ToggleHideHidden"
}
},
{
@@ -235,12 +236,13 @@
"ctrl-alt-n": "agent::NewTextThread",
"ctrl-shift-h": "agent::OpenHistory",
"ctrl-alt-c": "agent::OpenSettings",
"ctrl-alt-p": "agent::OpenRulesLibrary",
"ctrl-alt-p": "agent::ManageProfiles",
"ctrl-alt-l": "agent::OpenRulesLibrary",
"ctrl-i": "agent::ToggleProfileSelector",
"ctrl-alt-/": "agent::ToggleModelSelector",
"ctrl-shift-a": "agent::ToggleContextPicker",
"ctrl-shift-j": "agent::ToggleNavigationMenu",
"ctrl-shift-i": "agent::ToggleOptionsMenu",
"ctrl-alt-i": "agent::ToggleOptionsMenu",
"ctrl-alt-shift-n": "agent::ToggleNewThreadMenu",
"shift-alt-escape": "agent::ExpandMessageEditor",
"ctrl->": "agent::AddSelectionToThread",
@@ -407,6 +409,7 @@
"bindings": {
"escape": "project_search::ToggleFocus",
"shift-find": "search::FocusSearch",
"shift-enter": "project_search::ToggleAllSearchResults",
"ctrl-shift-f": "search::FocusSearch",
"ctrl-shift-h": "search::ToggleReplace",
"alt-ctrl-g": "search::ToggleRegex",
@@ -479,6 +482,7 @@
"alt-w": "search::ToggleWholeWord",
"alt-find": "project_search::ToggleFilters",
"alt-ctrl-f": "project_search::ToggleFilters",
"shift-enter": "project_search::ToggleAllSearchResults",
"ctrl-alt-shift-r": "search::ToggleRegex",
"ctrl-alt-shift-x": "search::ToggleRegex",
"alt-r": "search::ToggleRegex",
@@ -732,11 +736,17 @@
}
},
{
"context": "Editor && in_snippet",
"context": "Editor && in_snippet && has_next_tabstop && !showing_completions",
"use_key_equivalents": true,
"bindings": {
"alt-right": "editor::NextSnippetTabstop",
"alt-left": "editor::PreviousSnippetTabstop"
"tab": "editor::NextSnippetTabstop"
}
},
{
"context": "Editor && in_snippet && has_previous_tabstop && !showing_completions",
"use_key_equivalents": true,
"bindings": {
"shift-tab": "editor::PreviousSnippetTabstop"
}
},
// Bindings for accepting edit predictions
@@ -941,6 +951,7 @@
"ctrl-g ctrl-g": "git::Fetch",
"ctrl-g up": "git::Push",
"ctrl-g down": "git::Pull",
"ctrl-g shift-down": "git::PullRebase",
"ctrl-g shift-up": "git::ForcePush",
"ctrl-g d": "git::Diff",
"ctrl-g backspace": "git::RestoreTrackedFiles",
@@ -1252,6 +1263,14 @@
"ctrl-shift-enter": "workspace::OpenWithSystem"
}
},
{
"context": "GitWorktreeSelector || (GitWorktreeSelector > Picker > Editor)",
"use_key_equivalents": true,
"bindings": {
"ctrl-shift-space": "git::WorktreeFromDefaultOnWindow",
"ctrl-space": "git::WorktreeFromDefault"
}
},
{
"context": "SettingsWindow",
"use_key_equivalents": true,

View File

@@ -49,7 +49,8 @@
"ctrl-cmd-f": "zed::ToggleFullScreen",
"ctrl-cmd-z": "edit_prediction::RateCompletions",
"ctrl-cmd-i": "edit_prediction::ToggleMenu",
"ctrl-cmd-l": "lsp_tool::ToggleMenu"
"ctrl-cmd-l": "lsp_tool::ToggleMenu",
"cmd-alt-.": "project_panel::ToggleHideHidden"
}
},
{
@@ -274,12 +275,13 @@
"cmd-alt-n": "agent::NewTextThread",
"cmd-shift-h": "agent::OpenHistory",
"cmd-alt-c": "agent::OpenSettings",
"cmd-alt-p": "agent::OpenRulesLibrary",
"cmd-alt-l": "agent::OpenRulesLibrary",
"cmd-alt-p": "agent::ManageProfiles",
"cmd-i": "agent::ToggleProfileSelector",
"cmd-alt-/": "agent::ToggleModelSelector",
"cmd-shift-a": "agent::ToggleContextPicker",
"cmd-shift-j": "agent::ToggleNavigationMenu",
"cmd-shift-i": "agent::ToggleOptionsMenu",
"cmd-alt-m": "agent::ToggleOptionsMenu",
"cmd-alt-shift-n": "agent::ToggleNewThreadMenu",
"shift-alt-escape": "agent::ExpandMessageEditor",
"cmd->": "agent::AddSelectionToThread",
@@ -468,6 +470,7 @@
"bindings": {
"escape": "project_search::ToggleFocus",
"cmd-shift-j": "project_search::ToggleFilters",
"shift-enter": "project_search::ToggleAllSearchResults",
"cmd-shift-f": "search::FocusSearch",
"cmd-shift-h": "search::ToggleReplace",
"alt-cmd-g": "search::ToggleRegex",
@@ -496,6 +499,7 @@
"bindings": {
"escape": "project_search::ToggleFocus",
"cmd-shift-j": "project_search::ToggleFilters",
"shift-enter": "project_search::ToggleAllSearchResults",
"cmd-shift-h": "search::ToggleReplace",
"alt-cmd-g": "search::ToggleRegex",
"alt-cmd-x": "search::ToggleRegex"
@@ -802,11 +806,17 @@
}
},
{
"context": "Editor && in_snippet",
"context": "Editor && in_snippet && has_next_tabstop && !showing_completions",
"use_key_equivalents": true,
"bindings": {
"alt-right": "editor::NextSnippetTabstop",
"alt-left": "editor::PreviousSnippetTabstop"
"tab": "editor::NextSnippetTabstop"
}
},
{
"context": "Editor && in_snippet && has_previous_tabstop && !showing_completions",
"use_key_equivalents": true,
"bindings": {
"shift-tab": "editor::PreviousSnippetTabstop"
}
},
{
@@ -1034,6 +1044,7 @@
"ctrl-g ctrl-g": "git::Fetch",
"ctrl-g up": "git::Push",
"ctrl-g down": "git::Pull",
"ctrl-g shift-down": "git::PullRebase",
"ctrl-g shift-up": "git::ForcePush",
"ctrl-g d": "git::Diff",
"ctrl-g backspace": "git::RestoreTrackedFiles",
@@ -1357,6 +1368,14 @@
"ctrl-shift-enter": "workspace::OpenWithSystem"
}
},
{
"context": "GitWorktreeSelector || (GitWorktreeSelector > Picker > Editor)",
"use_key_equivalents": true,
"bindings": {
"ctrl-shift-space": "git::WorktreeFromDefaultOnWindow",
"ctrl-space": "git::WorktreeFromDefault"
}
},
{
"context": "SettingsWindow",
"use_key_equivalents": true,

View File

@@ -41,7 +41,8 @@
"shift-f11": "debugger::StepOut",
"f11": "zed::ToggleFullScreen",
"ctrl-shift-i": "edit_prediction::ToggleMenu",
"shift-alt-l": "lsp_tool::ToggleMenu"
"shift-alt-l": "lsp_tool::ToggleMenu",
"ctrl-alt-.": "project_panel::ToggleHideHidden"
}
},
{
@@ -236,12 +237,13 @@
"shift-alt-n": "agent::NewTextThread",
"ctrl-shift-h": "agent::OpenHistory",
"shift-alt-c": "agent::OpenSettings",
"shift-alt-p": "agent::OpenRulesLibrary",
"shift-alt-l": "agent::OpenRulesLibrary",
"shift-alt-p": "agent::ManageProfiles",
"ctrl-i": "agent::ToggleProfileSelector",
"shift-alt-/": "agent::ToggleModelSelector",
"ctrl-shift-a": "agent::ToggleContextPicker",
"ctrl-shift-j": "agent::ToggleNavigationMenu",
"ctrl-shift-i": "agent::ToggleOptionsMenu",
"ctrl-alt-i": "agent::ToggleOptionsMenu",
// "ctrl-shift-alt-n": "agent::ToggleNewThreadMenu",
"shift-alt-escape": "agent::ExpandMessageEditor",
"ctrl-shift-.": "agent::AddSelectionToThread",
@@ -488,6 +490,7 @@
"alt-c": "search::ToggleCaseSensitive",
"alt-w": "search::ToggleWholeWord",
"alt-f": "project_search::ToggleFilters",
"shift-enter": "project_search::ToggleAllSearchResults",
"alt-r": "search::ToggleRegex",
// "ctrl-shift-alt-x": "search::ToggleRegex",
"ctrl-k shift-enter": "pane::TogglePinTab"
@@ -737,11 +740,17 @@
}
},
{
"context": "Editor && in_snippet",
"context": "Editor && in_snippet && has_next_tabstop && !showing_completions",
"use_key_equivalents": true,
"bindings": {
"alt-right": "editor::NextSnippetTabstop",
"alt-left": "editor::PreviousSnippetTabstop"
"tab": "editor::NextSnippetTabstop"
}
},
{
"context": "Editor && in_snippet && has_previous_tabstop && !showing_completions",
"use_key_equivalents": true,
"bindings": {
"shift-tab": "editor::PreviousSnippetTabstop"
}
},
// Bindings for accepting edit predictions
@@ -951,6 +960,7 @@
"ctrl-g ctrl-g": "git::Fetch",
"ctrl-g up": "git::Push",
"ctrl-g down": "git::Pull",
"ctrl-g shift-down": "git::PullRebase",
"ctrl-g shift-up": "git::ForcePush",
"ctrl-g d": "git::Diff",
"ctrl-g backspace": "git::RestoreTrackedFiles",
@@ -1280,6 +1290,14 @@
"shift-alt-a": "onboarding::OpenAccount"
}
},
{
"context": "GitWorktreeSelector || (GitWorktreeSelector > Picker > Editor)",
"use_key_equivalents": true,
"bindings": {
"ctrl-shift-space": "git::WorktreeFromDefaultOnWindow",
"ctrl-space": "git::WorktreeFromDefault"
}
},
{
"context": "SettingsWindow",
"use_key_equivalents": true,

View File

@@ -255,6 +255,19 @@
// Whether to display inline and alongside documentation for items in the
// completions menu
"show_completion_documentation": true,
// When to show the scrollbar in the completion menu.
// This setting can take four values:
//
// 1. Show the scrollbar if there's important information or
// follow the system's configured behavior
// "auto"
// 2. Match the system's configured behavior:
// "system"
// 3. Always show the scrollbar:
// "always"
// 4. Never show the scrollbar:
// "never" (default)
"completion_menu_scrollbar": "never",
// Show method signatures in the editor, when inside parentheses.
"auto_signature_help": false,
// Whether to show the signature help after completion or a bracket pair inserted.
@@ -592,7 +605,7 @@
// to both the horizontal and vertical delta values while scrolling. Fast scrolling
// happens when a user holds the alt or option key while scrolling.
"fast_scroll_sensitivity": 4.0,
"relative_line_numbers": false,
"relative_line_numbers": "disabled",
// If 'search_wrap' is disabled, search results do not wrap around the end of the file.
"search_wrap": true,
// Search options to enable by default when opening new project and buffer searches.
@@ -602,7 +615,9 @@
"whole_word": false,
"case_sensitive": false,
"include_ignored": false,
"regex": false
"regex": false,
// Whether to center the cursor on each search match when navigating.
"center_on_match": false
},
// When to populate a new search's query based on the text under the cursor.
// This setting can take the following three values:
@@ -1232,6 +1247,9 @@
// that are overly broad can slow down Zed's file scanning. `file_scan_exclusions` takes
// precedence over these inclusions.
"file_scan_inclusions": [".env*"],
// Globs to match files that will be considered "hidden". These files can be hidden from the
// project panel by toggling the "hide_hidden" setting.
"hidden_files": ["**/.*"],
// Git gutter behavior configuration.
"git": {
// Control whether the git gutter is shown. May take 2 values:
@@ -1719,6 +1737,9 @@
"allowed": true
}
},
"HTML+ERB": {
"language_servers": ["herb", "!ruby-lsp", "..."]
},
"Java": {
"prettier": {
"allowed": true,
@@ -1741,6 +1762,9 @@
"allowed": true
}
},
"JS+ERB": {
"language_servers": ["!ruby-lsp", "..."]
},
"Kotlin": {
"language_servers": ["!kotlin-language-server", "kotlin-lsp", "..."]
},
@@ -1755,6 +1779,7 @@
"Markdown": {
"format_on_save": "off",
"use_on_type_format": false,
"remove_trailing_whitespace_on_save": false,
"allow_rewrap": "anywhere",
"soft_wrap": "editor_width",
"prettier": {
@@ -1845,6 +1870,9 @@
"allowed": true
}
},
"YAML+ERB": {
"language_servers": ["!ruby-lsp", "..."]
},
"Zig": {
"language_servers": ["zls", "..."]
}

View File

@@ -3,7 +3,6 @@ mod diff;
mod mention;
mod terminal;
use ::terminal::terminal_settings::TerminalSettings;
use agent_settings::AgentSettings;
use collections::HashSet;
pub use connection::*;
@@ -12,7 +11,7 @@ use language::language_settings::FormatOnSave;
pub use mention::*;
use project::lsp_store::{FormatTrigger, LspFormatTarget};
use serde::{Deserialize, Serialize};
use settings::{Settings as _, SettingsLocation};
use settings::Settings as _;
use task::{Shell, ShellBuilder};
pub use terminal::*;
@@ -2141,17 +2140,9 @@ impl AcpThread {
) -> Task<Result<Entity<Terminal>>> {
let env = match &cwd {
Some(dir) => self.project.update(cx, |project, cx| {
let worktree = project.find_worktree(dir.as_path(), cx);
let shell = TerminalSettings::get(
worktree.as_ref().map(|(worktree, path)| SettingsLocation {
worktree_id: worktree.read(cx).id(),
path: &path,
}),
cx,
)
.shell
.clone();
project.directory_environment(&shell, dir.as_path().into(), cx)
project.environment().update(cx, |env, cx| {
env.directory_environment(dir.as_path().into(), cx)
})
}),
None => Task::ready(None).shared(),
};

View File

@@ -361,12 +361,10 @@ async fn build_buffer_diff(
) -> Result<Entity<BufferDiff>> {
let buffer = cx.update(|cx| buffer.read(cx).snapshot())?;
let executor = cx.background_executor().clone();
let old_text_rope = cx
.background_spawn({
let old_text = old_text.clone();
let executor = executor.clone();
async move { Rope::from_str(old_text.as_str(), &executor) }
async move { Rope::from(old_text.as_str()) }
})
.await;
let base_buffer = cx

View File

@@ -5,10 +5,8 @@ use gpui::{App, AppContext, AsyncApp, Context, Entity, Task};
use language::LanguageRegistry;
use markdown::Markdown;
use project::Project;
use settings::{Settings as _, SettingsLocation};
use std::{path::PathBuf, process::ExitStatus, sync::Arc, time::Instant};
use task::Shell;
use terminal::terminal_settings::TerminalSettings;
use util::get_default_system_shell_preferring_bash;
pub struct Terminal {
@@ -187,17 +185,9 @@ pub async fn create_terminal_entity(
let mut env = if let Some(dir) = &cwd {
project
.update(cx, |project, cx| {
let worktree = project.find_worktree(dir.as_path(), cx);
let shell = TerminalSettings::get(
worktree.as_ref().map(|(worktree, path)| SettingsLocation {
worktree_id: worktree.read(cx).id(),
path: &path,
}),
cx,
)
.shell
.clone();
project.directory_environment(&shell, dir.clone().into(), cx)
project.environment().update(cx, |env, cx| {
env.directory_environment(dir.clone().into(), cx)
})
})?
.await
.unwrap_or_default()

View File

@@ -19,7 +19,7 @@ use markdown::{CodeBlockRenderer, Markdown, MarkdownElement, MarkdownStyle};
use project::Project;
use settings::Settings;
use theme::ThemeSettings;
use ui::{Tooltip, prelude::*};
use ui::{Tooltip, WithScrollbar, prelude::*};
use util::ResultExt as _;
use workspace::{
Item, ItemHandle, ToolbarItemEvent, ToolbarItemLocation, ToolbarItemView, Workspace,
@@ -291,17 +291,19 @@ impl AcpTools {
let expanded = self.expanded.contains(&index);
v_flex()
.w_full()
.px_4()
.py_3()
.border_color(colors.border)
.border_b_1()
.gap_2()
.items_start()
.font_buffer(cx)
.text_size(base_size)
.id(index)
.group("message")
.cursor_pointer()
.font_buffer(cx)
.w_full()
.py_3()
.pl_4()
.pr_5()
.gap_2()
.items_start()
.text_size(base_size)
.border_color(colors.border)
.border_b_1()
.hover(|this| this.bg(colors.element_background.opacity(0.5)))
.on_click(cx.listener(move |this, _, _, cx| {
if this.expanded.contains(&index) {
@@ -323,15 +325,14 @@ impl AcpTools {
h_flex()
.w_full()
.gap_2()
.items_center()
.flex_shrink_0()
.child(match message.direction {
acp::StreamMessageDirection::Incoming => {
ui::Icon::new(ui::IconName::ArrowDown).color(Color::Error)
}
acp::StreamMessageDirection::Outgoing => {
ui::Icon::new(ui::IconName::ArrowUp).color(Color::Success)
}
acp::StreamMessageDirection::Incoming => Icon::new(IconName::ArrowDown)
.color(Color::Error)
.size(IconSize::Small),
acp::StreamMessageDirection::Outgoing => Icon::new(IconName::ArrowUp)
.color(Color::Success)
.size(IconSize::Small),
})
.child(
Label::new(message.name.clone())
@@ -501,7 +502,7 @@ impl Focusable for AcpTools {
}
impl Render for AcpTools {
fn render(&mut self, _window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
v_flex()
.track_focus(&self.focus_handle)
.size_full()
@@ -516,13 +517,19 @@ impl Render for AcpTools {
.child("No messages recorded yet")
.into_any()
} else {
list(
connection.list_state.clone(),
cx.processor(Self::render_message),
)
.with_sizing_behavior(gpui::ListSizingBehavior::Auto)
.flex_grow()
.into_any()
div()
.size_full()
.flex_grow()
.child(
list(
connection.list_state.clone(),
cx.processor(Self::render_message),
)
.with_sizing_behavior(gpui::ListSizingBehavior::Auto)
.size_full(),
)
.vertical_scrollbar_for(connection.list_state.clone(), window, cx)
.into_any()
}
}
None => h_flex()

View File

@@ -3,9 +3,7 @@ use buffer_diff::BufferDiff;
use clock;
use collections::BTreeMap;
use futures::{FutureExt, StreamExt, channel::mpsc};
use gpui::{
App, AppContext, AsyncApp, BackgroundExecutor, Context, Entity, Subscription, Task, WeakEntity,
};
use gpui::{App, AppContext, AsyncApp, Context, Entity, Subscription, Task, WeakEntity};
use language::{Anchor, Buffer, BufferEvent, DiskState, Point, ToPoint};
use project::{Project, ProjectItem, lsp_store::OpenLspBufferHandle};
use std::{cmp, ops::Range, sync::Arc};
@@ -323,7 +321,6 @@ impl ActionLog {
let unreviewed_edits = tracked_buffer.unreviewed_edits.clone();
let edits = diff_snapshots(&old_snapshot, &new_snapshot);
let mut has_user_changes = false;
let executor = cx.background_executor().clone();
async move {
if let ChangeAuthor::User = author {
has_user_changes = apply_non_conflicting_edits(
@@ -331,7 +328,6 @@ impl ActionLog {
edits,
&mut base_text,
new_snapshot.as_rope(),
&executor,
);
}
@@ -386,7 +382,6 @@ impl ActionLog {
let agent_diff_base = tracked_buffer.diff_base.clone();
let git_diff_base = git_diff.read(cx).base_text().as_rope().clone();
let buffer_text = tracked_buffer.snapshot.as_rope().clone();
let executor = cx.background_executor().clone();
anyhow::Ok(cx.background_spawn(async move {
let mut old_unreviewed_edits = old_unreviewed_edits.into_iter().peekable();
let committed_edits = language::line_diff(
@@ -421,11 +416,8 @@ impl ActionLog {
),
new_agent_diff_base.max_point(),
));
new_agent_diff_base.replace(
old_byte_start..old_byte_end,
&unreviewed_new,
&executor,
);
new_agent_diff_base
.replace(old_byte_start..old_byte_end, &unreviewed_new);
row_delta +=
unreviewed.new_len() as i32 - unreviewed.old_len() as i32;
}
@@ -619,7 +611,6 @@ impl ActionLog {
.snapshot
.text_for_range(new_range)
.collect::<String>(),
cx.background_executor(),
);
delta += edit.new_len() as i32 - edit.old_len() as i32;
false
@@ -833,7 +824,6 @@ fn apply_non_conflicting_edits(
edits: Vec<Edit<u32>>,
old_text: &mut Rope,
new_text: &Rope,
executor: &BackgroundExecutor,
) -> bool {
let mut old_edits = patch.edits().iter().cloned().peekable();
let mut new_edits = edits.into_iter().peekable();
@@ -887,7 +877,6 @@ fn apply_non_conflicting_edits(
old_text.replace(
old_bytes,
&new_text.chunks_in_range(new_bytes).collect::<String>(),
executor,
);
applied_delta += new_edit.new_len() as i32 - new_edit.old_len() as i32;
has_made_changes = true;
@@ -2293,7 +2282,6 @@ mod tests {
old_text.replace(
old_start..old_end,
&new_text.slice_rows(edit.new.clone()).to_string(),
cx.background_executor(),
);
}
pretty_assertions::assert_eq!(old_text.to_string(), new_text.to_string());

View File

@@ -13,7 +13,15 @@ const EDITS_END_TAG: &str = "</edits>";
const SEARCH_MARKER: &str = "<<<<<<< SEARCH";
const SEPARATOR_MARKER: &str = "=======";
const REPLACE_MARKER: &str = ">>>>>>> REPLACE";
const END_TAGS: [&str; 3] = [OLD_TEXT_END_TAG, NEW_TEXT_END_TAG, EDITS_END_TAG];
const SONNET_PARAMETER_INVOKE_1: &str = "</parameter>\n</invoke>";
const SONNET_PARAMETER_INVOKE_2: &str = "</parameter></invoke>";
const END_TAGS: [&str; 5] = [
OLD_TEXT_END_TAG,
NEW_TEXT_END_TAG,
EDITS_END_TAG,
SONNET_PARAMETER_INVOKE_1, // Remove this after switching to streaming tool call
SONNET_PARAMETER_INVOKE_2,
];
#[derive(Debug)]
pub enum EditParserEvent {
@@ -547,6 +555,37 @@ mod tests {
);
}
#[gpui::test(iterations = 1000)]
fn test_xml_edits_with_closing_parameter_invoke(mut rng: StdRng) {
// This case is a regression with Claude Sonnet 4.5.
// Sometimes Sonnet thinks that it's doing a tool call
// and closes its response with '</parameter></invoke>'
// instead of properly closing </new_text>
let mut parser = EditParser::new(EditFormat::XmlTags);
assert_eq!(
parse_random_chunks(
indoc! {"
<old_text>some text</old_text><new_text>updated text</parameter></invoke>
"},
&mut parser,
&mut rng
),
vec![Edit {
old_text: "some text".to_string(),
new_text: "updated text".to_string(),
line_hint: None,
},]
);
assert_eq!(
parser.finish(),
EditParserMetrics {
tags: 2,
mismatched_tags: 1
}
);
}
#[gpui::test(iterations = 1000)]
fn test_xml_nested_tags(mut rng: StdRng) {
let mut parser = EditParser::new(EditFormat::XmlTags);
@@ -1035,6 +1074,11 @@ mod tests {
last_ix = chunk_ix;
}
if new_text.is_some() {
pending_edit.new_text = new_text.take().unwrap();
edits.push(pending_edit);
}
edits
}
}

View File

@@ -305,20 +305,18 @@ impl SearchMatrix {
#[cfg(test)]
mod tests {
use super::*;
use gpui::TestAppContext;
use indoc::indoc;
use language::{BufferId, TextBuffer};
use rand::prelude::*;
use text::ReplicaId;
use util::test::{generate_marked_text, marked_text_ranges};
#[gpui::test]
fn test_empty_query(cx: &mut gpui::TestAppContext) {
#[test]
fn test_empty_query() {
let buffer = TextBuffer::new(
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
"Hello world\nThis is a test\nFoo bar baz",
cx.background_executor(),
);
let snapshot = buffer.snapshot();
@@ -327,13 +325,12 @@ mod tests {
assert_eq!(finish(finder), None);
}
#[gpui::test]
fn test_streaming_exact_match(cx: &mut gpui::TestAppContext) {
#[test]
fn test_streaming_exact_match() {
let buffer = TextBuffer::new(
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
"Hello world\nThis is a test\nFoo bar baz",
cx.background_executor(),
);
let snapshot = buffer.snapshot();
@@ -352,8 +349,8 @@ mod tests {
assert_eq!(finish(finder), Some("This is a test".to_string()));
}
#[gpui::test]
fn test_streaming_fuzzy_match(cx: &mut gpui::TestAppContext) {
#[test]
fn test_streaming_fuzzy_match() {
let buffer = TextBuffer::new(
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
@@ -366,7 +363,6 @@ mod tests {
return x * y;
}
"},
cx.background_executor(),
);
let snapshot = buffer.snapshot();
@@ -387,13 +383,12 @@ mod tests {
);
}
#[gpui::test]
fn test_incremental_improvement(cx: &mut gpui::TestAppContext) {
#[test]
fn test_incremental_improvement() {
let buffer = TextBuffer::new(
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
"Line 1\nLine 2\nLine 3\nLine 4\nLine 5",
cx.background_executor(),
);
let snapshot = buffer.snapshot();
@@ -413,8 +408,8 @@ mod tests {
assert_eq!(finish(finder), Some("Line 3\nLine 4".to_string()));
}
#[gpui::test]
fn test_incomplete_lines_buffering(cx: &mut gpui::TestAppContext) {
#[test]
fn test_incomplete_lines_buffering() {
let buffer = TextBuffer::new(
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
@@ -423,7 +418,6 @@ mod tests {
jumps over the lazy dog
Pack my box with five dozen liquor jugs
"},
cx.background_executor(),
);
let snapshot = buffer.snapshot();
@@ -441,8 +435,8 @@ mod tests {
);
}
#[gpui::test]
fn test_multiline_fuzzy_match(cx: &mut gpui::TestAppContext) {
#[test]
fn test_multiline_fuzzy_match() {
let buffer = TextBuffer::new(
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
@@ -462,7 +456,6 @@ mod tests {
}
}
"#},
cx.background_executor(),
);
let snapshot = buffer.snapshot();
@@ -516,7 +509,7 @@ mod tests {
}
#[gpui::test(iterations = 100)]
fn test_resolve_location_single_line(mut rng: StdRng, cx: &mut TestAppContext) {
fn test_resolve_location_single_line(mut rng: StdRng) {
assert_location_resolution(
concat!(
" Lorem\n",
@@ -526,12 +519,11 @@ mod tests {
),
"ipsum",
&mut rng,
cx,
);
}
#[gpui::test(iterations = 100)]
fn test_resolve_location_multiline(mut rng: StdRng, cx: &mut TestAppContext) {
fn test_resolve_location_multiline(mut rng: StdRng) {
assert_location_resolution(
concat!(
" Lorem\n",
@@ -541,12 +533,11 @@ mod tests {
),
"ipsum\ndolor sit amet",
&mut rng,
cx,
);
}
#[gpui::test(iterations = 100)]
fn test_resolve_location_function_with_typo(mut rng: StdRng, cx: &mut TestAppContext) {
fn test_resolve_location_function_with_typo(mut rng: StdRng) {
assert_location_resolution(
indoc! {"
«fn foo1(a: usize) -> usize {
@@ -559,12 +550,11 @@ mod tests {
"},
"fn foo1(a: usize) -> u32 {\n40\n}",
&mut rng,
cx,
);
}
#[gpui::test(iterations = 100)]
fn test_resolve_location_class_methods(mut rng: StdRng, cx: &mut TestAppContext) {
fn test_resolve_location_class_methods(mut rng: StdRng) {
assert_location_resolution(
indoc! {"
class Something {
@@ -585,12 +575,11 @@ mod tests {
six() { return 6666; }
"},
&mut rng,
cx,
);
}
#[gpui::test(iterations = 100)]
fn test_resolve_location_imports_no_match(mut rng: StdRng, cx: &mut TestAppContext) {
fn test_resolve_location_imports_no_match(mut rng: StdRng) {
assert_location_resolution(
indoc! {"
use std::ops::Range;
@@ -620,12 +609,11 @@ mod tests {
use std::sync::Arc;
"},
&mut rng,
cx,
);
}
#[gpui::test(iterations = 100)]
fn test_resolve_location_nested_closure(mut rng: StdRng, cx: &mut TestAppContext) {
fn test_resolve_location_nested_closure(mut rng: StdRng) {
assert_location_resolution(
indoc! {"
impl Foo {
@@ -653,12 +641,11 @@ mod tests {
" });",
),
&mut rng,
cx,
);
}
#[gpui::test(iterations = 100)]
fn test_resolve_location_tool_invocation(mut rng: StdRng, cx: &mut TestAppContext) {
fn test_resolve_location_tool_invocation(mut rng: StdRng) {
assert_location_resolution(
indoc! {r#"
let tool = cx
@@ -686,12 +673,11 @@ mod tests {
" .output;",
),
&mut rng,
cx,
);
}
#[gpui::test]
fn test_line_hint_selection(cx: &mut TestAppContext) {
fn test_line_hint_selection() {
let text = indoc! {r#"
fn first_function() {
return 42;
@@ -710,7 +696,6 @@ mod tests {
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
text.to_string(),
cx.background_executor(),
);
let snapshot = buffer.snapshot();
let mut matcher = StreamingFuzzyMatcher::new(snapshot.clone());
@@ -742,19 +727,9 @@ mod tests {
}
#[track_caller]
fn assert_location_resolution(
text_with_expected_range: &str,
query: &str,
rng: &mut StdRng,
cx: &mut TestAppContext,
) {
fn assert_location_resolution(text_with_expected_range: &str, query: &str, rng: &mut StdRng) {
let (text, expected_ranges) = marked_text_ranges(text_with_expected_range, false);
let buffer = TextBuffer::new(
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
text.clone(),
cx.background_executor(),
);
let buffer = TextBuffer::new(ReplicaId::LOCAL, BufferId::new(1).unwrap(), text.clone());
let snapshot = buffer.snapshot();
let mut matcher = StreamingFuzzyMatcher::new(snapshot);

View File

@@ -569,7 +569,6 @@ mod tests {
use prompt_store::ProjectContext;
use serde_json::json;
use settings::SettingsStore;
use text::Rope;
use util::{path, rel_path::rel_path};
#[gpui::test]
@@ -742,7 +741,7 @@ mod tests {
// Create the file
fs.save(
path!("/root/src/main.rs").as_ref(),
&Rope::from_str_small("initial content"),
&"initial content".into(),
language::LineEnding::Unix,
)
.await
@@ -909,7 +908,7 @@ mod tests {
// Create a simple file with trailing whitespace
fs.save(
path!("/root/src/main.rs").as_ref(),
&Rope::from_str_small("initial content"),
&"initial content".into(),
language::LineEnding::Unix,
)
.await

View File

@@ -178,6 +178,7 @@ impl AcpConnection {
meta: Some(serde_json::json!({
// Experimental: Allow for rendering terminal output from the agents
"terminal_output": true,
"terminal-auth": true,
})),
},
client_info: Some(acp::Implementation {

View File

@@ -4,7 +4,7 @@ use acp_thread::{AcpThread, AgentThreadEntry};
use agent::HistoryStore;
use agent_client_protocol::{self as acp, ToolCallId};
use collections::HashMap;
use editor::{Editor, EditorMode, MinimapVisibility};
use editor::{Editor, EditorMode, MinimapVisibility, SizingBehavior};
use gpui::{
AnyEntity, App, AppContext as _, Entity, EntityId, EventEmitter, FocusHandle, Focusable,
ScrollHandle, SharedString, TextStyleRefinement, WeakEntity, Window,
@@ -357,7 +357,7 @@ fn create_editor_diff(
EditorMode::Full {
scale_ui_elements_with_buffer_font_size: false,
show_active_line_background: false,
sized_by_content: true,
sizing_behavior: SizingBehavior::SizeByContent,
},
diff.read(cx).multibuffer().clone(),
None,

View File

@@ -717,7 +717,10 @@ impl MessageEditor {
let mut all_tracked_buffers = Vec::new();
let result = editor.update(cx, |editor, cx| {
let mut ix = text.chars().position(|c| !c.is_whitespace()).unwrap_or(0);
let (mut ix, _) = text
.char_indices()
.find(|(_, c)| !c.is_whitespace())
.unwrap_or((0, '\0'));
let mut chunks: Vec<acp::ContentBlock> = Vec::new();
let text = editor.text(cx);
editor.display_map.update(cx, |map, cx| {
@@ -2879,7 +2882,7 @@ mod tests {
cx.run_until_parked();
editor.update_in(cx, |editor, window, cx| {
editor.set_text(" hello world ", window, cx);
editor.set_text(" \u{A0}してhello world ", window, cx);
});
let (content, _) = message_editor
@@ -2890,7 +2893,7 @@ mod tests {
assert_eq!(
content,
vec![acp::ContentBlock::Text(acp::TextContent {
text: "hello world".into(),
text: "してhello world".into(),
annotations: None,
meta: None
})]

View File

@@ -1,8 +1,10 @@
use acp_thread::AgentSessionModes;
use agent_client_protocol as acp;
use agent_servers::AgentServer;
use agent_settings::AgentSettings;
use fs::Fs;
use gpui::{Context, Entity, FocusHandle, WeakEntity, Window, prelude::*};
use settings::Settings as _;
use std::{rc::Rc, sync::Arc};
use ui::{
Button, ContextMenu, ContextMenuEntry, DocumentationEdge, DocumentationSide, KeyBinding,
@@ -84,6 +86,14 @@ impl ModeSelector {
let current_mode = self.connection.current_mode();
let default_mode = self.agent_server.default_mode(cx);
let settings = AgentSettings::get_global(cx);
let side = match settings.dock {
settings::DockPosition::Left => DocumentationSide::Right,
settings::DockPosition::Bottom | settings::DockPosition::Right => {
DocumentationSide::Left
}
};
for mode in all_modes {
let is_selected = &mode.id == &current_mode;
let is_default = Some(&mode.id) == default_mode.as_ref();
@@ -91,7 +101,7 @@ impl ModeSelector {
.toggleable(IconPosition::End, is_selected);
let entry = if let Some(description) = &mode.description {
entry.documentation_aside(DocumentationSide::Left, DocumentationEdge::Bottom, {
entry.documentation_aside(side, DocumentationEdge::Bottom, {
let description = description.clone();
move |cx| {

View File

@@ -17,7 +17,9 @@ use client::zed_urls;
use cloud_llm_client::PlanV1;
use collections::{HashMap, HashSet};
use editor::scroll::Autoscroll;
use editor::{Editor, EditorEvent, EditorMode, MultiBuffer, PathKey, SelectionEffects};
use editor::{
Editor, EditorEvent, EditorMode, MultiBuffer, PathKey, SelectionEffects, SizingBehavior,
};
use file_icons::FileIcons;
use fs::Fs;
use futures::FutureExt as _;
@@ -102,7 +104,7 @@ impl ThreadError {
{
Self::AuthenticationRequired(acp_error.message.clone().into())
} else {
let string = error.to_string();
let string = format!("{:#}", error);
// TODO: we should have Gemini return better errors here.
if agent.clone().downcast::<agent_servers::Gemini>().is_some()
&& string.contains("Could not load the default credentials")
@@ -111,7 +113,7 @@ impl ThreadError {
{
Self::AuthenticationRequired(string.into())
} else {
Self::Other(error.to_string().into())
Self::Other(string.into())
}
}
}
@@ -292,7 +294,6 @@ pub struct AcpThreadView {
resume_thread_metadata: Option<DbThreadMetadata>,
_cancel_task: Option<Task<()>>,
_subscriptions: [Subscription; 5],
#[cfg(target_os = "windows")]
show_codex_windows_warning: bool,
}
@@ -399,7 +400,6 @@ impl AcpThreadView {
),
];
#[cfg(target_os = "windows")]
let show_codex_windows_warning = crate::ExternalAgent::parse_built_in(agent.as_ref())
== Some(crate::ExternalAgent::Codex);
@@ -445,7 +445,6 @@ impl AcpThreadView {
focus_handle: cx.focus_handle(),
new_server_version_available: None,
resume_thread_metadata: resume_thread,
#[cfg(target_os = "windows")]
show_codex_windows_warning,
}
}
@@ -539,14 +538,7 @@ impl AcpThreadView {
})
.log_err()
} else {
let root_dir = if let Some(acp_agent) = connection
.clone()
.downcast::<agent_servers::AcpConnection>()
{
acp_agent.root_dir().into()
} else {
root_dir.unwrap_or(paths::home_dir().as_path().into())
};
let root_dir = root_dir.unwrap_or(paths::home_dir().as_path().into());
cx.update(|_, cx| {
connection
.clone()
@@ -793,7 +785,8 @@ impl AcpThreadView {
if let Some(load_err) = err.downcast_ref::<LoadError>() {
self.thread_state = ThreadState::LoadError(load_err.clone());
} else {
self.thread_state = ThreadState::LoadError(LoadError::Other(err.to_string().into()))
self.thread_state =
ThreadState::LoadError(LoadError::Other(format!("{:#}", err).into()))
}
if self.message_editor.focus_handle(cx).is_focused(window) {
self.focus_handle.focus(window)
@@ -881,6 +874,7 @@ impl AcpThreadView {
cx: &mut Context<Self>,
) {
self.set_editor_is_expanded(!self.editor_expanded, cx);
cx.stop_propagation();
cx.notify();
}
@@ -892,7 +886,7 @@ impl AcpThreadView {
EditorMode::Full {
scale_ui_elements_with_buffer_font_size: false,
show_active_line_background: false,
sized_by_content: false,
sizing_behavior: SizingBehavior::ExcludeOverscrollMargin,
},
cx,
)
@@ -1469,6 +1463,114 @@ impl AcpThreadView {
return;
};
// Check for the experimental "terminal-auth" _meta field
let auth_method = connection.auth_methods().iter().find(|m| m.id == method);
if let Some(auth_method) = auth_method {
if let Some(meta) = &auth_method.meta {
if let Some(terminal_auth) = meta.get("terminal-auth") {
// Extract terminal auth details from meta
if let (Some(command), Some(label)) = (
terminal_auth.get("command").and_then(|v| v.as_str()),
terminal_auth.get("label").and_then(|v| v.as_str()),
) {
let args = terminal_auth
.get("args")
.and_then(|v| v.as_array())
.map(|arr| {
arr.iter()
.filter_map(|v| v.as_str().map(String::from))
.collect()
})
.unwrap_or_default();
let env = terminal_auth
.get("env")
.and_then(|v| v.as_object())
.map(|obj| {
obj.iter()
.filter_map(|(k, v)| {
v.as_str().map(|val| (k.clone(), val.to_string()))
})
.collect::<HashMap<String, String>>()
})
.unwrap_or_default();
// Run SpawnInTerminal in the same dir as the ACP server
let cwd = connection
.clone()
.downcast::<agent_servers::AcpConnection>()
.map(|acp_conn| acp_conn.root_dir().to_path_buf());
// Build SpawnInTerminal from _meta
let login = task::SpawnInTerminal {
id: task::TaskId(format!("external-agent-{}-login", label)),
full_label: label.to_string(),
label: label.to_string(),
command: Some(command.to_string()),
args,
command_label: label.to_string(),
cwd,
env,
use_new_terminal: true,
allow_concurrent_runs: true,
hide: task::HideStrategy::Always,
..Default::default()
};
self.thread_error.take();
configuration_view.take();
pending_auth_method.replace(method.clone());
if let Some(workspace) = self.workspace.upgrade() {
let project = self.project.clone();
let authenticate = Self::spawn_external_agent_login(
login, workspace, project, false, true, window, cx,
);
cx.notify();
self.auth_task = Some(cx.spawn_in(window, {
let agent = self.agent.clone();
async move |this, cx| {
let result = authenticate.await;
match &result {
Ok(_) => telemetry::event!(
"Authenticate Agent Succeeded",
agent = agent.telemetry_id()
),
Err(_) => {
telemetry::event!(
"Authenticate Agent Failed",
agent = agent.telemetry_id(),
)
}
}
this.update_in(cx, |this, window, cx| {
if let Err(err) = result {
if let ThreadState::Unauthenticated {
pending_auth_method,
..
} = &mut this.thread_state
{
pending_auth_method.take();
}
this.handle_thread_error(err, cx);
} else {
this.reset(window, cx);
}
this.auth_task.take()
})
.ok();
}
}));
}
return;
}
}
}
}
if method.0.as_ref() == "gemini-api-key" {
let registry = LanguageModelRegistry::global(cx);
let provider = registry
@@ -1567,7 +1669,10 @@ impl AcpThreadView {
&& let Some(login) = self.login.clone()
{
if let Some(workspace) = self.workspace.upgrade() {
Self::spawn_external_agent_login(login, workspace, false, window, cx)
let project = self.project.clone();
Self::spawn_external_agent_login(
login, workspace, project, false, false, window, cx,
)
} else {
Task::ready(Ok(()))
}
@@ -1617,17 +1722,40 @@ impl AcpThreadView {
fn spawn_external_agent_login(
login: task::SpawnInTerminal,
workspace: Entity<Workspace>,
project: Entity<Project>,
previous_attempt: bool,
check_exit_code: bool,
window: &mut Window,
cx: &mut App,
) -> Task<Result<()>> {
let Some(terminal_panel) = workspace.read(cx).panel::<TerminalPanel>(cx) else {
return Task::ready(Ok(()));
};
let project = workspace.read(cx).project().clone();
window.spawn(cx, async move |cx| {
let mut task = login.clone();
if let Some(cmd) = &task.command {
// Have "node" command use Zed's managed Node runtime by default
if cmd == "node" {
let resolved_node_runtime = project
.update(cx, |project, cx| {
let agent_server_store = project.agent_server_store().clone();
agent_server_store.update(cx, |store, cx| {
store.node_runtime().map(|node_runtime| {
cx.background_spawn(async move {
node_runtime.binary_path().await
})
})
})
});
if let Ok(Some(resolve_task)) = resolved_node_runtime {
if let Ok(node_path) = resolve_task.await {
task.command = Some(node_path.to_string_lossy().to_string());
}
}
}
}
task.shell = task::Shell::WithArguments {
program: task.command.take().expect("login command should be set"),
args: std::mem::take(&mut task.args),
@@ -1645,44 +1773,65 @@ impl AcpThreadView {
})?;
let terminal = terminal.await?;
let mut exit_status = terminal
.read_with(cx, |terminal, cx| terminal.wait_for_completed_task(cx))?
.fuse();
let logged_in = cx
.spawn({
let terminal = terminal.clone();
async move |cx| {
loop {
cx.background_executor().timer(Duration::from_secs(1)).await;
let content =
terminal.update(cx, |terminal, _cx| terminal.get_content())?;
if content.contains("Login successful")
|| content.contains("Type your message")
{
return anyhow::Ok(());
if check_exit_code {
// For extension-based auth, wait for the process to exit and check exit code
let exit_status = terminal
.read_with(cx, |terminal, cx| terminal.wait_for_completed_task(cx))?
.await;
match exit_status {
Some(status) if status.success() => {
Ok(())
}
Some(status) => {
Err(anyhow!("Login command failed with exit code: {:?}", status.code()))
}
None => {
Err(anyhow!("Login command terminated without exit status"))
}
}
} else {
// For hardcoded agents (claude-login, gemini-cli): look for specific output
let mut exit_status = terminal
.read_with(cx, |terminal, cx| terminal.wait_for_completed_task(cx))?
.fuse();
let logged_in = cx
.spawn({
let terminal = terminal.clone();
async move |cx| {
loop {
cx.background_executor().timer(Duration::from_secs(1)).await;
let content =
terminal.update(cx, |terminal, _cx| terminal.get_content())?;
if content.contains("Login successful")
|| content.contains("Type your message")
{
return anyhow::Ok(());
}
}
}
})
.fuse();
futures::pin_mut!(logged_in);
futures::select_biased! {
result = logged_in => {
if let Err(e) = result {
log::error!("{e}");
return Err(anyhow!("exited before logging in"));
}
}
})
.fuse();
futures::pin_mut!(logged_in);
futures::select_biased! {
result = logged_in => {
if let Err(e) = result {
log::error!("{e}");
_ = exit_status => {
if !previous_attempt && project.read_with(cx, |project, _| project.is_via_remote_server())? && login.label.contains("gemini") {
return cx.update(|window, cx| Self::spawn_external_agent_login(login, workspace, project.clone(), true, false, window, cx))?.await
}
return Err(anyhow!("exited before logging in"));
}
}
_ = exit_status => {
if !previous_attempt && project.read_with(cx, |project, _| project.is_via_remote_server())? && login.label.contains("gemini") {
return cx.update(|window, cx| Self::spawn_external_agent_login(login, workspace, true, window, cx))?.await
}
return Err(anyhow!("exited before logging in"));
}
terminal.update(cx, |terminal, _| terminal.kill_active_task())?;
Ok(())
}
terminal.update(cx, |terminal, _| terminal.kill_active_task())?;
Ok(())
})
}
@@ -1947,6 +2096,15 @@ impl AcpThreadView {
.into_any(),
};
let needs_confirmation = if let AgentThreadEntry::ToolCall(tool_call) = entry {
matches!(
tool_call.status,
ToolCallStatus::WaitingForConfirmation { .. }
)
} else {
false
};
let Some(thread) = self.thread() else {
return primary;
};
@@ -1955,7 +2113,13 @@ impl AcpThreadView {
v_flex()
.w_full()
.child(primary)
.child(self.render_thread_controls(&thread, cx))
.map(|this| {
if needs_confirmation {
this.child(self.render_generating(true))
} else {
this.child(self.render_thread_controls(&thread, cx))
}
})
.when_some(
self.thread_feedback.comments_editor.clone(),
|this, editor| this.child(Self::render_feedback_feedback_editor(editor, cx)),
@@ -3631,6 +3795,7 @@ impl AcpThreadView {
.child(
h_flex()
.id("edits-container")
.cursor_pointer()
.gap_1()
.child(Disclosure::new("edits-disclosure", expanded))
.map(|this| {
@@ -3770,6 +3935,7 @@ impl AcpThreadView {
Label::new(name.to_string())
.size(LabelSize::XSmall)
.buffer_font(cx)
.ml_1p5()
});
let file_icon = FileIcons::get_icon(path.as_std_path(), cx)
@@ -3801,14 +3967,30 @@ impl AcpThreadView {
})
.child(
h_flex()
.id(("file-name-row", index))
.relative()
.id(("file-name", index))
.pr_8()
.gap_1p5()
.w_full()
.overflow_x_scroll()
.child(file_icon)
.child(h_flex().gap_0p5().children(file_name).children(file_path))
.child(
h_flex()
.id(("file-name-path", index))
.cursor_pointer()
.pr_0p5()
.gap_0p5()
.hover(|s| s.bg(cx.theme().colors().element_hover))
.rounded_xs()
.child(file_icon)
.children(file_name)
.children(file_path)
.tooltip(Tooltip::text("Go to File"))
.on_click({
let buffer = buffer.clone();
cx.listener(move |this, _, window, cx| {
this.open_edited_buffer(&buffer, window, cx);
})
}),
)
.child(
div()
.absolute()
@@ -3818,13 +4000,7 @@ impl AcpThreadView {
.bottom_0()
.right_0()
.bg(overlay_gradient),
)
.on_click({
let buffer = buffer.clone();
cx.listener(move |this, _, window, cx| {
this.open_edited_buffer(&buffer, window, cx);
})
}),
),
)
.child(
h_flex()
@@ -3966,8 +4142,12 @@ impl AcpThreadView {
)
}
})
.on_click(cx.listener(|_, _, window, cx| {
window.dispatch_action(Box::new(ExpandMessageEditor), cx);
.on_click(cx.listener(|this, _, window, cx| {
this.expand_message_editor(
&ExpandMessageEditor,
window,
cx,
);
})),
),
),
@@ -4571,14 +4751,29 @@ impl AcpThreadView {
window: &mut Window,
cx: &mut Context<Self>,
) {
if window.is_window_active() || !self.notifications.is_empty() {
if !self.notifications.is_empty() {
return;
}
let settings = AgentSettings::get_global(cx);
let window_is_inactive = !window.is_window_active();
let panel_is_hidden = self
.workspace
.upgrade()
.map(|workspace| AgentPanel::is_hidden(&workspace, cx))
.unwrap_or(true);
let should_notify = window_is_inactive || panel_is_hidden;
if !should_notify {
return;
}
// TODO: Change this once we have title summarization for external agents.
let title = self.agent.name();
match AgentSettings::get_global(cx).notify_when_agent_waiting {
match settings.notify_when_agent_waiting {
NotifyWhenAgentWaiting::PrimaryScreen => {
if let Some(primary) = cx.primary_display() {
self.pop_up(icon, caption.into(), title, window, primary, cx);
@@ -4694,6 +4889,31 @@ impl AcpThreadView {
}
}
fn render_generating(&self, confirmation: bool) -> impl IntoElement {
h_flex()
.id("generating-spinner")
.py_2()
.px(rems_from_px(22.))
.map(|this| {
if confirmation {
this.gap_2()
.child(
h_flex()
.w_2()
.child(SpinnerLabel::sand().size(LabelSize::Small)),
)
.child(
LoadingLabel::new("Waiting Confirmation")
.size(LabelSize::Small)
.color(Color::Muted),
)
} else {
this.child(SpinnerLabel::new().size(LabelSize::Small))
}
})
.into_any_element()
}
fn render_thread_controls(
&self,
thread: &Entity<AcpThread>,
@@ -4701,12 +4921,7 @@ impl AcpThreadView {
) -> impl IntoElement {
let is_generating = matches!(thread.read(cx).status(), ThreadStatus::Generating);
if is_generating {
return h_flex().id("thread-controls-container").child(
div()
.py_2()
.px(rems_from_px(22.))
.child(SpinnerLabel::new().size(LabelSize::Small)),
);
return self.render_generating(false).into_any_element();
}
let open_as_markdown = IconButton::new("open-as-markdown", IconName::FileMarkdown)
@@ -4794,7 +5009,10 @@ impl AcpThreadView {
);
}
container.child(open_as_markdown).child(scroll_to_top)
container
.child(open_as_markdown)
.child(scroll_to_top)
.into_any_element()
}
fn render_feedback_feedback_editor(editor: Entity<Editor>, cx: &Context<Self>) -> Div {
@@ -5024,7 +5242,6 @@ impl AcpThreadView {
)
}
#[cfg(target_os = "windows")]
fn render_codex_windows_warning(&self, cx: &mut Context<Self>) -> Option<Callout> {
if self.show_codex_windows_warning {
Some(
@@ -5040,8 +5257,9 @@ impl AcpThreadView {
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.on_click(cx.listener({
move |_, _, window, cx| {
window.dispatch_action(
move |_, _, _window, cx| {
#[cfg(windows)]
_window.dispatch_action(
zed_actions::wsl_actions::OpenWsl::default().boxed_clone(),
cx,
);
@@ -5544,13 +5762,10 @@ impl Render for AcpThreadView {
})
.children(self.render_thread_retry_status_callout(window, cx))
.children({
#[cfg(target_os = "windows")]
{
if cfg!(windows) && self.project.read(cx).is_local() {
self.render_codex_windows_warning(cx)
}
#[cfg(not(target_os = "windows"))]
{
Vec::<Empty>::new()
} else {
None
}
})
.children(self.render_thread_error(cx))
@@ -5581,7 +5796,7 @@ fn default_markdown_style(
let theme_settings = ThemeSettings::get_global(cx);
let colors = cx.theme().colors();
let buffer_font_size = TextSize::Small.rems(cx);
let buffer_font_size = theme_settings.agent_buffer_font_size(cx);
let mut text_style = window.text_style();
let line_height = buffer_font_size * 1.75;
@@ -5593,9 +5808,9 @@ fn default_markdown_style(
};
let font_size = if buffer_font {
TextSize::Small.rems(cx)
theme_settings.agent_buffer_font_size(cx)
} else {
TextSize::Default.rems(cx)
theme_settings.agent_ui_font_size(cx)
};
let text_color = if muted_text {
@@ -5892,6 +6107,107 @@ pub(crate) mod tests {
);
}
#[gpui::test]
async fn test_notification_when_panel_hidden(cx: &mut TestAppContext) {
init_test(cx);
let (thread_view, cx) = setup_thread_view(StubAgentServer::default_response(), cx).await;
add_to_workspace(thread_view.clone(), cx);
let message_editor = cx.read(|cx| thread_view.read(cx).message_editor.clone());
message_editor.update_in(cx, |editor, window, cx| {
editor.set_text("Hello", window, cx);
});
// Window is active (don't deactivate), but panel will be hidden
// Note: In the test environment, the panel is not actually added to the dock,
// so is_agent_panel_hidden will return true
thread_view.update_in(cx, |thread_view, window, cx| {
thread_view.send(window, cx);
});
cx.run_until_parked();
// Should show notification because window is active but panel is hidden
assert!(
cx.windows()
.iter()
.any(|window| window.downcast::<AgentNotification>().is_some()),
"Expected notification when panel is hidden"
);
}
#[gpui::test]
async fn test_notification_still_works_when_window_inactive(cx: &mut TestAppContext) {
init_test(cx);
let (thread_view, cx) = setup_thread_view(StubAgentServer::default_response(), cx).await;
let message_editor = cx.read(|cx| thread_view.read(cx).message_editor.clone());
message_editor.update_in(cx, |editor, window, cx| {
editor.set_text("Hello", window, cx);
});
// Deactivate window - should show notification regardless of setting
cx.deactivate_window();
thread_view.update_in(cx, |thread_view, window, cx| {
thread_view.send(window, cx);
});
cx.run_until_parked();
// Should still show notification when window is inactive (existing behavior)
assert!(
cx.windows()
.iter()
.any(|window| window.downcast::<AgentNotification>().is_some()),
"Expected notification when window is inactive"
);
}
#[gpui::test]
async fn test_notification_respects_never_setting(cx: &mut TestAppContext) {
init_test(cx);
// Set notify_when_agent_waiting to Never
cx.update(|cx| {
AgentSettings::override_global(
AgentSettings {
notify_when_agent_waiting: NotifyWhenAgentWaiting::Never,
..AgentSettings::get_global(cx).clone()
},
cx,
);
});
let (thread_view, cx) = setup_thread_view(StubAgentServer::default_response(), cx).await;
let message_editor = cx.read(|cx| thread_view.read(cx).message_editor.clone());
message_editor.update_in(cx, |editor, window, cx| {
editor.set_text("Hello", window, cx);
});
// Window is active
thread_view.update_in(cx, |thread_view, window, cx| {
thread_view.send(window, cx);
});
cx.run_until_parked();
// Should NOT show notification because notify_when_agent_waiting is Never
assert!(
!cx.windows()
.iter()
.any(|window| window.downcast::<AgentNotification>().is_some()),
"Expected no notification when notify_when_agent_waiting is Never"
);
}
async fn setup_thread_view(
agent: impl AgentServer + 'static,
cx: &mut TestAppContext,

View File

@@ -23,16 +23,17 @@ use language::LanguageRegistry;
use language_model::{
LanguageModelProvider, LanguageModelProviderId, LanguageModelRegistry, ZED_CLOUD_PROVIDER_ID,
};
use language_models::AllLanguageModelSettings;
use notifications::status_toast::{StatusToast, ToastIcon};
use project::{
agent_server_store::{AgentServerStore, CLAUDE_CODE_NAME, CODEX_NAME, GEMINI_NAME},
context_server_store::{ContextServerConfiguration, ContextServerStatus, ContextServerStore},
};
use rope::Rope;
use settings::{SettingsStore, update_settings_file};
use settings::{Settings, SettingsStore, update_settings_file};
use ui::{
Chip, CommonAnimationExt, ContextMenu, Disclosure, Divider, DividerColor, ElevationIndex,
Indicator, PopoverMenu, Switch, SwitchColor, Tooltip, WithScrollbar, prelude::*,
Button, ButtonStyle, Chip, CommonAnimationExt, ContextMenu, Disclosure, Divider, DividerColor,
ElevationIndex, IconName, IconPosition, IconSize, Indicator, LabelSize, PopoverMenu, Switch,
SwitchColor, Tooltip, WithScrollbar, prelude::*,
};
use util::ResultExt as _;
use workspace::{Workspace, create_and_open_local_file};
@@ -153,7 +154,42 @@ pub enum AssistantConfigurationEvent {
impl EventEmitter<AssistantConfigurationEvent> for AgentConfiguration {}
enum AgentIcon {
Name(IconName),
Path(SharedString),
}
impl AgentConfiguration {
fn render_section_title(
&mut self,
title: impl Into<SharedString>,
description: impl Into<SharedString>,
menu: AnyElement,
) -> impl IntoElement {
h_flex()
.p_4()
.pb_0()
.mb_2p5()
.items_start()
.justify_between()
.child(
v_flex()
.w_full()
.gap_0p5()
.child(
h_flex()
.pr_1()
.w_full()
.gap_2()
.justify_between()
.flex_wrap()
.child(Headline::new(title.into()))
.child(menu),
)
.child(Label::new(description.into()).color(Color::Muted)),
)
}
fn render_provider_configuration_block(
&mut self,
provider: &Arc<dyn LanguageModelProvider>,
@@ -288,7 +324,7 @@ impl AgentConfiguration {
"Start New Thread",
)
.full_width()
.style(ButtonStyle::Filled)
.style(ButtonStyle::Outlined)
.layer(ElevationIndex::ModalSurface)
.icon_position(IconPosition::Start)
.icon(IconName::Thread)
@@ -304,89 +340,122 @@ impl AgentConfiguration {
}
})),
)
}),
})
.when(
is_expanded && is_removable_provider(&provider.id(), cx),
|this| {
this.child(
Button::new(
SharedString::from(format!("delete-provider-{provider_id}")),
"Remove Provider",
)
.full_width()
.style(ButtonStyle::Outlined)
.icon_position(IconPosition::Start)
.icon(IconName::Trash)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.label_size(LabelSize::Small)
.on_click(cx.listener({
let provider = provider.clone();
move |this, _event, window, cx| {
this.delete_provider(provider.clone(), window, cx);
}
})),
)
},
),
)
}
fn delete_provider(
&mut self,
provider: Arc<dyn LanguageModelProvider>,
window: &mut Window,
cx: &mut Context<Self>,
) {
let fs = self.fs.clone();
let provider_id = provider.id();
cx.spawn_in(window, async move |_, cx| {
cx.update(|_window, cx| {
update_settings_file(fs.clone(), cx, {
let provider_id = provider_id.clone();
move |settings, _| {
if let Some(ref mut openai_compatible) = settings
.language_models
.as_mut()
.and_then(|lm| lm.openai_compatible.as_mut())
{
let key_to_remove: Arc<str> = Arc::from(provider_id.0.as_ref());
openai_compatible.remove(&key_to_remove);
}
}
});
})
.log_err();
cx.update(|_window, cx| {
LanguageModelRegistry::global(cx).update(cx, {
let provider_id = provider_id.clone();
move |registry, cx| {
registry.unregister_provider(provider_id, cx);
}
})
})
.log_err();
anyhow::Ok(())
})
.detach_and_log_err(cx);
}
fn render_provider_configuration_section(
&mut self,
cx: &mut Context<Self>,
) -> impl IntoElement {
let providers = LanguageModelRegistry::read_global(cx).providers();
let popover_menu = PopoverMenu::new("add-provider-popover")
.trigger(
Button::new("add-provider", "Add Provider")
.style(ButtonStyle::Outlined)
.icon_position(IconPosition::Start)
.icon(IconName::Plus)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.label_size(LabelSize::Small),
)
.anchor(gpui::Corner::TopRight)
.menu({
let workspace = self.workspace.clone();
move |window, cx| {
Some(ContextMenu::build(window, cx, |menu, _window, _cx| {
menu.header("Compatible APIs").entry("OpenAI", None, {
let workspace = workspace.clone();
move |window, cx| {
workspace
.update(cx, |workspace, cx| {
AddLlmProviderModal::toggle(
LlmCompatibleProvider::OpenAi,
workspace,
window,
cx,
);
})
.log_err();
}
})
}))
}
});
v_flex()
.w_full()
.child(
h_flex()
.p(DynamicSpacing::Base16.rems(cx))
.pr(DynamicSpacing::Base20.rems(cx))
.pb_0()
.mb_2p5()
.items_start()
.justify_between()
.child(
v_flex()
.w_full()
.gap_0p5()
.child(
h_flex()
.pr_1()
.w_full()
.gap_2()
.justify_between()
.child(Headline::new("LLM Providers"))
.child(
PopoverMenu::new("add-provider-popover")
.trigger(
Button::new("add-provider", "Add Provider")
.style(ButtonStyle::Filled)
.layer(ElevationIndex::ModalSurface)
.icon_position(IconPosition::Start)
.icon(IconName::Plus)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.label_size(LabelSize::Small),
)
.anchor(gpui::Corner::TopRight)
.menu({
let workspace = self.workspace.clone();
move |window, cx| {
Some(ContextMenu::build(
window,
cx,
|menu, _window, _cx| {
menu.header("Compatible APIs").entry(
"OpenAI",
None,
{
let workspace =
workspace.clone();
move |window, cx| {
workspace
.update(cx, |workspace, cx| {
AddLlmProviderModal::toggle(
LlmCompatibleProvider::OpenAi,
workspace,
window,
cx,
);
})
.log_err();
}
},
)
},
))
}
}),
),
)
.child(
Label::new("Add at least one provider to use AI-powered features with Zed's native agent.")
.color(Color::Muted),
),
),
)
.child(self.render_section_title(
"LLM Providers",
"Add at least one provider to use AI-powered features with Zed's native agent.",
popover_menu.into_any_element(),
))
.child(
div()
.w_full()
@@ -465,8 +534,7 @@ impl AgentConfiguration {
let add_server_popover = PopoverMenu::new("add-server-popover")
.trigger(
Button::new("add-server", "Add Server")
.style(ButtonStyle::Filled)
.layer(ElevationIndex::ModalSurface)
.style(ButtonStyle::Outlined)
.icon_position(IconPosition::Start)
.icon(IconName::Plus)
.icon_size(IconSize::Small)
@@ -499,61 +567,57 @@ impl AgentConfiguration {
});
v_flex()
.p(DynamicSpacing::Base16.rems(cx))
.pr(DynamicSpacing::Base20.rems(cx))
.gap_2()
.border_b_1()
.border_color(cx.theme().colors().border)
.child(self.render_section_title(
"Model Context Protocol (MCP) Servers",
"All MCP servers connected directly or via a Zed extension.",
add_server_popover.into_any_element(),
))
.child(
h_flex()
v_flex()
.pl_4()
.pb_4()
.pr_5()
.w_full()
.items_start()
.justify_between()
.gap_1()
.child(
v_flex()
.gap_0p5()
.child(Headline::new("Model Context Protocol (MCP) Servers"))
.child(
Label::new(
"All MCP servers connected directly or via a Zed extension.",
)
.color(Color::Muted),
),
)
.child(add_server_popover),
)
.child(v_flex().w_full().gap_1().map(|mut parent| {
if context_server_ids.is_empty() {
parent.child(
h_flex()
.p_4()
.justify_center()
.border_1()
.border_dashed()
.border_color(cx.theme().colors().border.opacity(0.6))
.rounded_sm()
.child(
Label::new("No MCP servers added yet.")
.color(Color::Muted)
.size(LabelSize::Small),
),
)
} else {
for (index, context_server_id) in context_server_ids.into_iter().enumerate() {
if index > 0 {
parent = parent.child(
Divider::horizontal()
.color(DividerColor::BorderFaded)
.into_any_element(),
);
.map(|mut parent| {
if context_server_ids.is_empty() {
parent.child(
h_flex()
.p_4()
.justify_center()
.border_1()
.border_dashed()
.border_color(cx.theme().colors().border.opacity(0.6))
.rounded_sm()
.child(
Label::new("No MCP servers added yet.")
.color(Color::Muted)
.size(LabelSize::Small),
),
)
} else {
for (index, context_server_id) in
context_server_ids.into_iter().enumerate()
{
if index > 0 {
parent = parent.child(
Divider::horizontal()
.color(DividerColor::BorderFaded)
.into_any_element(),
);
}
parent = parent.child(self.render_context_server(
context_server_id,
window,
cx,
));
}
parent
}
parent =
parent.child(self.render_context_server(context_server_id, window, cx));
}
parent
}
}))
}),
)
}
fn render_context_server(
@@ -598,12 +662,12 @@ impl AgentConfiguration {
let (source_icon, source_tooltip) = if is_from_extension {
(
IconName::ZedMcpExtension,
IconName::ZedSrcExtension,
"This MCP server was installed from an extension.",
)
} else {
(
IconName::ZedMcpCustom,
IconName::ZedSrcCustom,
"This custom MCP server was installed directly.",
)
};
@@ -885,9 +949,9 @@ impl AgentConfiguration {
}
fn render_agent_servers_section(&mut self, cx: &mut Context<Self>) -> impl IntoElement {
let user_defined_agents = self
.agent_server_store
.read(cx)
let agent_server_store = self.agent_server_store.read(cx);
let user_defined_agents = agent_server_store
.external_agents()
.filter(|name| {
name.0 != GEMINI_NAME && name.0 != CLAUDE_CODE_NAME && name.0 != CODEX_NAME
@@ -898,102 +962,121 @@ impl AgentConfiguration {
let user_defined_agents = user_defined_agents
.into_iter()
.map(|name| {
self.render_agent_server(IconName::Ai, name)
let icon = if let Some(icon_path) = agent_server_store.agent_icon(&name) {
AgentIcon::Path(icon_path)
} else {
AgentIcon::Name(IconName::Ai)
};
self.render_agent_server(icon, name, true)
.into_any_element()
})
.collect::<Vec<_>>();
let add_agens_button = Button::new("add-agent", "Add Agent")
.style(ButtonStyle::Outlined)
.icon_position(IconPosition::Start)
.icon(IconName::Plus)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.label_size(LabelSize::Small)
.on_click(move |_, window, cx| {
if let Some(workspace) = window.root().flatten() {
let workspace = workspace.downgrade();
window
.spawn(cx, async |cx| {
open_new_agent_servers_entry_in_settings_editor(workspace, cx).await
})
.detach_and_log_err(cx);
}
});
v_flex()
.border_b_1()
.border_color(cx.theme().colors().border)
.child(
v_flex()
.p(DynamicSpacing::Base16.rems(cx))
.pr(DynamicSpacing::Base20.rems(cx))
.gap_2()
.child(self.render_section_title(
"External Agents",
"All agents connected through the Agent Client Protocol.",
add_agens_button.into_any_element(),
))
.child(
v_flex()
.gap_0p5()
.child(
h_flex()
.pr_1()
.w_full()
.gap_2()
.justify_between()
.child(Headline::new("External Agents"))
.child(
Button::new("add-agent", "Add Agent")
.style(ButtonStyle::Filled)
.layer(ElevationIndex::ModalSurface)
.icon_position(IconPosition::Start)
.icon(IconName::Plus)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.label_size(LabelSize::Small)
.on_click(
move |_, window, cx| {
if let Some(workspace) = window.root().flatten() {
let workspace = workspace.downgrade();
window
.spawn(cx, async |cx| {
open_new_agent_servers_entry_in_settings_editor(
workspace,
cx,
).await
})
.detach_and_log_err(cx);
}
}
),
)
)
.child(
Label::new(
"All agents connected through the Agent Client Protocol.",
)
.color(Color::Muted),
),
)
.child(self.render_agent_server(
IconName::AiClaude,
"Claude Code",
))
.child(Divider::horizontal().color(DividerColor::BorderFaded))
.child(self.render_agent_server(
IconName::AiOpenAi,
"Codex",
))
.child(Divider::horizontal().color(DividerColor::BorderFaded))
.child(self.render_agent_server(
IconName::AiGemini,
"Gemini CLI",
))
.map(|mut parent| {
for agent in user_defined_agents {
parent = parent.child(Divider::horizontal().color(DividerColor::BorderFaded))
.child(agent);
}
parent
})
.p_4()
.pt_0()
.gap_2()
.child(self.render_agent_server(
AgentIcon::Name(IconName::AiClaude),
"Claude Code",
false,
))
.child(Divider::horizontal().color(DividerColor::BorderFaded))
.child(self.render_agent_server(
AgentIcon::Name(IconName::AiOpenAi),
"Codex CLI",
false,
))
.child(Divider::horizontal().color(DividerColor::BorderFaded))
.child(self.render_agent_server(
AgentIcon::Name(IconName::AiGemini),
"Gemini CLI",
false,
))
.map(|mut parent| {
for agent in user_defined_agents {
parent = parent
.child(
Divider::horizontal().color(DividerColor::BorderFaded),
)
.child(agent);
}
parent
}),
),
)
}
fn render_agent_server(
&self,
icon: IconName,
icon: AgentIcon,
name: impl Into<SharedString>,
external: bool,
) -> impl IntoElement {
h_flex().gap_1p5().justify_between().child(
h_flex()
.gap_1p5()
.child(Icon::new(icon).size(IconSize::Small).color(Color::Muted))
.child(Label::new(name.into()))
.child(
Icon::new(IconName::Check)
.size(IconSize::Small)
.color(Color::Success),
),
)
let name = name.into();
let icon = match icon {
AgentIcon::Name(icon_name) => Icon::new(icon_name)
.size(IconSize::Small)
.color(Color::Muted),
AgentIcon::Path(icon_path) => Icon::from_path(icon_path)
.size(IconSize::Small)
.color(Color::Muted),
};
let tooltip_id = SharedString::new(format!("agent-source-{}", name));
let tooltip_message = format!("The {} agent was installed from an extension.", name);
h_flex()
.gap_1p5()
.child(icon)
.child(Label::new(name))
.when(external, |this| {
this.child(
div()
.id(tooltip_id)
.flex_none()
.tooltip(Tooltip::text(tooltip_message))
.child(
Icon::new(IconName::ZedSrcExtension)
.size(IconSize::Small)
.color(Color::Muted),
),
)
})
.child(
Icon::new(IconName::Check)
.color(Color::Success)
.size(IconSize::Small),
)
}
}
@@ -1115,11 +1198,8 @@ async fn open_new_agent_servers_entry_in_settings_editor(
) -> Result<()> {
let settings_editor = workspace
.update_in(cx, |_, window, cx| {
create_and_open_local_file(paths::settings_file(), window, cx, |cx| {
Rope::from_str(
&settings::initial_user_settings_content(),
cx.background_executor(),
)
create_and_open_local_file(paths::settings_file(), window, cx, || {
settings::initial_user_settings_content().as_ref().into()
})
})?
.await?
@@ -1225,3 +1305,14 @@ fn find_text_in_buffer(
None
}
}
// OpenAI-compatible providers are user-configured and can be removed,
// whereas built-in providers (like Anthropic, OpenAI, Google, etc.) can't.
//
// If in the future we have more "API-compatible-type" of providers,
// they should be included here as removable providers.
fn is_removable_provider(provider_id: &LanguageModelProviderId, cx: &App) -> bool {
AllLanguageModelSettings::get_global(cx)
.openai_compatible
.contains_key(provider_id.0.as_ref())
}

View File

@@ -70,14 +70,6 @@ impl AgentDiffThread {
}
}
fn is_generating(&self, cx: &App) -> bool {
match self {
AgentDiffThread::AcpThread(thread) => {
thread.read(cx).status() == acp_thread::ThreadStatus::Generating
}
}
}
fn has_pending_edit_tool_uses(&self, cx: &App) -> bool {
match self {
AgentDiffThread::AcpThread(thread) => thread.read(cx).has_pending_edit_tool_calls(),
@@ -970,9 +962,7 @@ impl AgentDiffToolbar {
None => ToolbarItemLocation::Hidden,
Some(AgentDiffToolbarItem::Pane(_)) => ToolbarItemLocation::PrimaryRight,
Some(AgentDiffToolbarItem::Editor { state, .. }) => match state {
EditorState::Generating | EditorState::Reviewing => {
ToolbarItemLocation::PrimaryRight
}
EditorState::Reviewing => ToolbarItemLocation::PrimaryRight,
EditorState::Idle => ToolbarItemLocation::Hidden,
},
}
@@ -1050,7 +1040,6 @@ impl Render for AgentDiffToolbar {
let content = match state {
EditorState::Idle => return Empty.into_any(),
EditorState::Generating => vec![spinner_icon],
EditorState::Reviewing => vec![
h_flex()
.child(
@@ -1222,7 +1211,6 @@ pub struct AgentDiff {
pub enum EditorState {
Idle,
Reviewing,
Generating,
}
struct WorkspaceThread {
@@ -1545,15 +1533,11 @@ impl AgentDiff {
multibuffer.add_diff(diff_handle.clone(), cx);
});
let new_state = if thread.is_generating(cx) {
EditorState::Generating
} else {
EditorState::Reviewing
};
let reviewing_state = EditorState::Reviewing;
let previous_state = self
.reviewing_editors
.insert(weak_editor.clone(), new_state.clone());
.insert(weak_editor.clone(), reviewing_state.clone());
if previous_state.is_none() {
editor.update(cx, |editor, cx| {
@@ -1566,7 +1550,9 @@ impl AgentDiff {
unaffected.remove(weak_editor);
}
if new_state == EditorState::Reviewing && previous_state != Some(new_state) {
if reviewing_state == EditorState::Reviewing
&& previous_state != Some(reviewing_state)
{
// Jump to first hunk when we enter review mode
editor.update(cx, |editor, cx| {
let snapshot = multibuffer.read(cx).snapshot(cx);

View File

@@ -16,11 +16,9 @@ use serde::{Deserialize, Serialize};
use settings::{
DefaultAgentView as DefaultView, LanguageModelProviderSetting, LanguageModelSelection,
};
use zed_actions::OpenBrowser;
use zed_actions::agent::{OpenClaudeCodeOnboardingModal, ReauthenticateAgent};
use crate::acp::{AcpThreadHistory, ThreadHistoryEvent};
use crate::context_store::ContextStore;
use crate::ui::{AcpOnboardingModal, ClaudeCodeOnboardingModal};
use crate::{
AddContextServer, AgentDiffPane, DeleteRecentlyOpenThread, Follow, InlineAssistant,
@@ -33,9 +31,14 @@ use crate::{
text_thread_editor::{AgentPanelDelegate, TextThreadEditor, make_lsp_adapter_delegate},
ui::{AgentOnboardingModal, EndTrialUpsell},
};
use crate::{
ExpandMessageEditor,
acp::{AcpThreadHistory, ThreadHistoryEvent},
};
use crate::{
ExternalAgent, NewExternalAgentThread, NewNativeAgentThreadFromSummary, placeholder_command,
};
use crate::{ManageProfiles, context_store::ContextStore};
use agent_settings::AgentSettings;
use ai_onboarding::AgentPanelOnboarding;
use anyhow::{Result, anyhow};
@@ -106,6 +109,12 @@ pub fn init(cx: &mut App) {
}
},
)
.register_action(|workspace, _: &ExpandMessageEditor, window, cx| {
if let Some(panel) = workspace.panel::<AgentPanel>(cx) {
workspace.focus_panel::<AgentPanel>(window, cx);
panel.update(cx, |panel, cx| panel.expand_message_editor(window, cx));
}
})
.register_action(|workspace, _: &OpenHistory, window, cx| {
if let Some(panel) = workspace.panel::<AgentPanel>(cx) {
workspace.focus_panel::<AgentPanel>(window, cx);
@@ -729,6 +738,25 @@ impl AgentPanel {
&self.context_server_registry
}
pub fn is_hidden(workspace: &Entity<Workspace>, cx: &App) -> bool {
let workspace_read = workspace.read(cx);
workspace_read
.panel::<AgentPanel>(cx)
.map(|panel| {
let panel_id = Entity::entity_id(&panel);
let is_visible = workspace_read.all_docks().iter().any(|dock| {
dock.read(cx)
.visible_panel()
.is_some_and(|visible_panel| visible_panel.panel_id() == panel_id)
});
!is_visible
})
.unwrap_or(true)
}
fn active_thread_view(&self) -> Option<&Entity<AcpThreadView>> {
match &self.active_view {
ActiveView::ExternalAgentThread { thread_view, .. } => Some(thread_view),
@@ -925,6 +953,15 @@ impl AgentPanel {
.detach_and_log_err(cx);
}
fn expand_message_editor(&mut self, window: &mut Window, cx: &mut Context<Self>) {
if let Some(thread_view) = self.active_thread_view() {
thread_view.update(cx, |view, cx| {
view.expand_message_editor(&ExpandMessageEditor, window, cx);
view.focus_handle(cx).focus(window);
});
}
}
fn open_history(&mut self, window: &mut Window, cx: &mut Context<Self>) {
if matches!(self.active_view, ActiveView::History) {
if let Some(previous_view) = self.previous_view.take() {
@@ -1743,10 +1780,9 @@ impl AgentPanel {
}),
)
.action("Add Custom Server…", Box::new(AddContextServer))
.separator();
menu = menu
.separator()
.action("Rules", Box::new(OpenRulesLibrary::default()))
.action("Profiles", Box::new(ManageProfiles::default()))
.action("Settings", Box::new(OpenSettings))
.separator()
.action(full_screen_label, Box::new(ToggleZoom));
@@ -1844,7 +1880,12 @@ impl AgentPanel {
{
let focus_handle = focus_handle.clone();
move |_window, cx| {
Tooltip::for_action_in("New…", &ToggleNewThreadMenu, &focus_handle, cx)
Tooltip::for_action_in(
"New Thread…",
&ToggleNewThreadMenu,
&focus_handle,
cx,
)
}
},
)
@@ -1942,7 +1983,7 @@ impl AgentPanel {
.separator()
.header("External Agents")
.item(
ContextMenuEntry::new("New Claude Code Thread")
ContextMenuEntry::new("New Claude Code")
.icon(IconName::AiClaude)
.disabled(is_via_collab)
.icon_color(Color::Muted)
@@ -1968,7 +2009,7 @@ impl AgentPanel {
}),
)
.item(
ContextMenuEntry::new("New Codex Thread")
ContextMenuEntry::new("New Codex CLI")
.icon(IconName::AiOpenAi)
.disabled(is_via_collab)
.icon_color(Color::Muted)
@@ -1994,7 +2035,7 @@ impl AgentPanel {
}),
)
.item(
ContextMenuEntry::new("New Gemini CLI Thread")
ContextMenuEntry::new("New Gemini CLI")
.icon(IconName::AiGemini)
.icon_color(Color::Muted)
.disabled(is_via_collab)
@@ -2038,9 +2079,9 @@ impl AgentPanel {
for agent_name in agent_names {
let icon_path = agent_server_store_read.agent_icon(&agent_name);
let mut entry =
ContextMenuEntry::new(format!("New {} Thread", agent_name));
ContextMenuEntry::new(format!("New {}", agent_name));
if let Some(icon_path) = icon_path {
entry = entry.custom_icon_path(icon_path);
entry = entry.custom_icon_svg(icon_path);
} else {
entry = entry.icon(IconName::Terminal);
}
@@ -2090,12 +2131,20 @@ impl AgentPanel {
menu
})
.separator()
.link(
"Add Other Agents",
OpenBrowser {
url: zed_urls::external_agents_docs(cx),
}
.boxed_clone(),
.item(
ContextMenuEntry::new("Add More Agents")
.icon(IconName::Plus)
.icon_color(Color::Muted)
.handler({
move |window, cx| {
window.dispatch_action(Box::new(zed_actions::Extensions {
category_filter: Some(
zed_actions::ExtensionCategoryFilter::AgentServers,
),
id: None,
}), cx)
}
}),
)
}))
}
@@ -2109,7 +2158,7 @@ impl AgentPanel {
.when_some(selected_agent_custom_icon, |this, icon_path| {
let label = selected_agent_label.clone();
this.px(DynamicSpacing::Base02.rems(cx))
.child(Icon::from_path(icon_path).color(Color::Muted))
.child(Icon::from_external_svg(icon_path).color(Color::Muted))
.tooltip(move |_window, cx| {
Tooltip::with_meta(label.clone(), None, "Selected Agent", cx)
})

View File

@@ -487,10 +487,9 @@ impl CodegenAlternative {
) {
let start_time = Instant::now();
let snapshot = self.snapshot.clone();
let selected_text = Rope::from_iter(
snapshot.text_for_range(self.range.start..self.range.end),
cx.background_executor(),
);
let selected_text = snapshot
.text_for_range(self.range.start..self.range.end)
.collect::<Rope>();
let selection_start = self.range.start.to_point(&snapshot);

View File

@@ -620,8 +620,18 @@ impl TextThreadContextHandle {
impl Display for TextThreadContext {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
// TODO: escape title?
writeln!(f, "<text_thread title=\"{}\">", self.title)?;
write!(f, "<text_thread title=\"")?;
for c in self.title.chars() {
match c {
'&' => write!(f, "&amp;")?,
'<' => write!(f, "&lt;")?,
'>' => write!(f, "&gt;")?,
'"' => write!(f, "&quot;")?,
'\'' => write!(f, "&apos;")?,
_ => write!(f, "{}", c)?,
}
}
writeln!(f, "\">")?;
write!(f, "{}", self.text.trim())?;
write!(f, "\n</text_thread>")
}

View File

@@ -260,10 +260,10 @@ impl<T: 'static> PromptEditor<T> {
let agent_panel_keybinding =
ui::text_for_action(&zed_actions::assistant::ToggleFocus, window, cx)
.map(|keybinding| format!("{keybinding} to chat"))
.map(|keybinding| format!("{keybinding} to chat"))
.unwrap_or_default();
format!("{action}… ({agent_panel_keybinding}↓↑ for history)")
format!("{action}… ({agent_panel_keybinding}↓↑ for history — @ to include context)")
}
pub fn prompt(&self, cx: &App) -> String {

View File

@@ -744,13 +744,12 @@ impl TextThread {
telemetry: Option<Arc<Telemetry>>,
cx: &mut Context<Self>,
) -> Self {
let buffer = cx.new(|cx| {
let buffer = cx.new(|_cx| {
let buffer = Buffer::remote(
language::BufferId::new(1).unwrap(),
replica_id,
capability,
"",
cx.background_executor(),
);
buffer.set_language_registry(language_registry.clone());
buffer

View File

@@ -26,6 +26,7 @@ serde_json.workspace = true
settings.workspace = true
smol.workspace = true
tempfile.workspace = true
util.workspace = true
workspace.workspace = true
[target.'cfg(not(target_os = "windows"))'.dependencies]

View File

@@ -331,6 +331,16 @@ impl AutoUpdater {
pub fn start_polling(&self, cx: &mut Context<Self>) -> Task<Result<()>> {
cx.spawn(async move |this, cx| {
#[cfg(target_os = "windows")]
{
use util::ResultExt;
cleanup_windows()
.await
.context("failed to cleanup old directories")
.log_err();
}
loop {
this.update(cx, |this, cx| this.poll(UpdateCheckType::Automatic, cx))?;
cx.background_executor().timer(POLL_INTERVAL).await;
@@ -396,6 +406,7 @@ impl AutoUpdater {
arch: &str,
release_channel: ReleaseChannel,
version: Option<SemanticVersion>,
set_status: impl Fn(&str, &mut AsyncApp) + Send + 'static,
cx: &mut AsyncApp,
) -> Result<PathBuf> {
let this = cx.update(|cx| {
@@ -405,6 +416,7 @@ impl AutoUpdater {
.context("auto-update not initialized")
})??;
set_status("Fetching remote server release", cx);
let release = Self::get_release(
&this,
"zed-remote-server",
@@ -429,6 +441,7 @@ impl AutoUpdater {
"downloading zed-remote-server {os} {arch} version {}",
release.version
);
set_status("Downloading remote server", cx);
download_remote_server_binary(&version_path, release, client, cx).await?;
}
@@ -923,6 +936,32 @@ async fn install_release_macos(
Ok(None)
}
#[cfg(target_os = "windows")]
async fn cleanup_windows() -> Result<()> {
use util::ResultExt;
let parent = std::env::current_exe()?
.parent()
.context("No parent dir for Zed.exe")?
.to_owned();
// keep in sync with crates/auto_update_helper/src/updater.rs
smol::fs::remove_dir(parent.join("updates"))
.await
.context("failed to remove updates dir")
.log_err();
smol::fs::remove_dir(parent.join("install"))
.await
.context("failed to remove install dir")
.log_err();
smol::fs::remove_dir(parent.join("old"))
.await
.context("failed to remove old version dir")
.log_err();
Ok(())
}
async fn install_release_windows(downloaded_installer: PathBuf) -> Result<Option<PathBuf>> {
let output = Command::new(downloaded_installer)
.arg("/verysilent")
@@ -962,7 +1001,7 @@ pub async fn finalize_auto_update_on_quit() {
.parent()
.map(|p| p.join("tools").join("auto_update_helper.exe"))
{
let mut command = smol::process::Command::new(helper);
let mut command = util::command::new_smol_command(helper);
command.arg("--launch");
command.arg("false");
if let Ok(mut cmd) = command.spawn() {

View File

@@ -21,6 +21,9 @@ simplelog.workspace = true
[target.'cfg(target_os = "windows")'.dependencies]
windows.workspace = true
[target.'cfg(target_os = "windows")'.dev-dependencies]
tempfile.workspace = true
[target.'cfg(target_os = "windows")'.build-dependencies]
winresource = "0.1"

View File

@@ -1,4 +1,5 @@
use std::{
cell::LazyCell,
path::Path,
time::{Duration, Instant},
};
@@ -11,210 +12,274 @@ use windows::Win32::{
use crate::windows_impl::WM_JOB_UPDATED;
type Job = fn(&Path) -> Result<()>;
pub(crate) struct Job {
pub apply: Box<dyn Fn(&Path) -> Result<()>>,
pub rollback: Box<dyn Fn(&Path) -> Result<()>>,
}
#[cfg(not(test))]
pub(crate) const JOBS: &[Job] = &[
// Delete old files
|app_dir| {
let zed_executable = app_dir.join("Zed.exe");
log::info!("Removing old file: {}", zed_executable.display());
std::fs::remove_file(&zed_executable).context(format!(
"Failed to remove old file {}",
zed_executable.display()
))
},
|app_dir| {
let zed_cli = app_dir.join("bin\\zed.exe");
log::info!("Removing old file: {}", zed_cli.display());
std::fs::remove_file(&zed_cli)
.context(format!("Failed to remove old file {}", zed_cli.display()))
},
|app_dir| {
let zed_wsl = app_dir.join("bin\\zed");
log::info!("Removing old file: {}", zed_wsl.display());
std::fs::remove_file(&zed_wsl)
.context(format!("Failed to remove old file {}", zed_wsl.display()))
},
// TODO: remove after a few weeks once everyone is on the new version and this file never exists
|app_dir| {
let open_console = app_dir.join("OpenConsole.exe");
if open_console.exists() {
log::info!("Removing old file: {}", open_console.display());
std::fs::remove_file(&open_console).context(format!(
"Failed to remove old file {}",
open_console.display()
))?
impl Job {
pub fn mkdir(name: &'static Path) -> Self {
Job {
apply: Box::new(move |app_dir| {
let dir = app_dir.join(name);
std::fs::create_dir_all(&dir)
.context(format!("Failed to create directory {}", dir.display()))
}),
rollback: Box::new(move |app_dir| {
let dir = app_dir.join(name);
std::fs::remove_dir_all(&dir)
.context(format!("Failed to remove directory {}", dir.display()))
}),
}
Ok(())
},
|app_dir| {
let archs = ["x64", "arm64"];
for arch in archs {
let open_console = app_dir.join(format!("{arch}\\OpenConsole.exe"));
if open_console.exists() {
log::info!("Removing old file: {}", open_console.display());
std::fs::remove_file(&open_console).context(format!(
"Failed to remove old file {}",
open_console.display()
))?
}
}
pub fn mkdir_if_exists(name: &'static Path, check: &'static Path) -> Self {
Job {
apply: Box::new(move |app_dir| {
let dir = app_dir.join(name);
let check = app_dir.join(check);
if check.exists() {
std::fs::create_dir_all(&dir)
.context(format!("Failed to create directory {}", dir.display()))?
}
Ok(())
}),
rollback: Box::new(move |app_dir| {
let dir = app_dir.join(name);
if dir.exists() {
std::fs::remove_dir_all(&dir)
.context(format!("Failed to remove directory {}", dir.display()))?
}
Ok(())
}),
}
Ok(())
},
|app_dir| {
let conpty = app_dir.join("conpty.dll");
log::info!("Removing old file: {}", conpty.display());
std::fs::remove_file(&conpty)
.context(format!("Failed to remove old file {}", conpty.display()))
},
// Copy new files
|app_dir| {
let zed_executable_source = app_dir.join("install\\Zed.exe");
let zed_executable_dest = app_dir.join("Zed.exe");
log::info!(
"Copying new file {} to {}",
zed_executable_source.display(),
zed_executable_dest.display()
);
std::fs::copy(&zed_executable_source, &zed_executable_dest)
.map(|_| ())
.context(format!(
"Failed to copy new file {} to {}",
zed_executable_source.display(),
zed_executable_dest.display()
))
},
|app_dir| {
let zed_cli_source = app_dir.join("install\\bin\\zed.exe");
let zed_cli_dest = app_dir.join("bin\\zed.exe");
log::info!(
"Copying new file {} to {}",
zed_cli_source.display(),
zed_cli_dest.display()
);
std::fs::copy(&zed_cli_source, &zed_cli_dest)
.map(|_| ())
.context(format!(
"Failed to copy new file {} to {}",
zed_cli_source.display(),
zed_cli_dest.display()
))
},
|app_dir| {
let zed_wsl_source = app_dir.join("install\\bin\\zed");
let zed_wsl_dest = app_dir.join("bin\\zed");
log::info!(
"Copying new file {} to {}",
zed_wsl_source.display(),
zed_wsl_dest.display()
);
std::fs::copy(&zed_wsl_source, &zed_wsl_dest)
.map(|_| ())
.context(format!(
"Failed to copy new file {} to {}",
zed_wsl_source.display(),
zed_wsl_dest.display()
))
},
|app_dir| {
let archs = ["x64", "arm64"];
for arch in archs {
let open_console_source = app_dir.join(format!("install\\{arch}\\OpenConsole.exe"));
let open_console_dest = app_dir.join(format!("{arch}\\OpenConsole.exe"));
if open_console_source.exists() {
}
pub fn move_file(filename: &'static Path, new_filename: &'static Path) -> Self {
Job {
apply: Box::new(move |app_dir| {
let old_file = app_dir.join(filename);
let new_file = app_dir.join(new_filename);
log::info!(
"Copying new file {} to {}",
open_console_source.display(),
open_console_dest.display()
"Moving file: {}->{}",
old_file.display(),
new_file.display()
);
let parent = open_console_dest.parent().context(format!(
"Failed to get parent directory of {}",
open_console_dest.display()
))?;
std::fs::create_dir_all(parent)
.context(format!("Failed to create directory {}", parent.display()))?;
std::fs::copy(&open_console_source, &open_console_dest)
.map(|_| ())
.context(format!(
"Failed to copy new file {} to {}",
open_console_source.display(),
open_console_dest.display()
))?
}
}
Ok(())
},
|app_dir| {
let conpty_source = app_dir.join("install\\conpty.dll");
let conpty_dest = app_dir.join("conpty.dll");
log::info!(
"Copying new file {} to {}",
conpty_source.display(),
conpty_dest.display()
);
std::fs::copy(&conpty_source, &conpty_dest)
.map(|_| ())
.context(format!(
"Failed to copy new file {} to {}",
conpty_source.display(),
conpty_dest.display()
))
},
// Clean up installer folder and updates folder
|app_dir| {
let updates_folder = app_dir.join("updates");
log::info!("Cleaning up: {}", updates_folder.display());
std::fs::remove_dir_all(&updates_folder).context(format!(
"Failed to remove updates folder {}",
updates_folder.display()
))
},
|app_dir| {
let installer_folder = app_dir.join("install");
log::info!("Cleaning up: {}", installer_folder.display());
std::fs::remove_dir_all(&installer_folder).context(format!(
"Failed to remove installer folder {}",
installer_folder.display()
))
},
];
std::fs::rename(&old_file, new_file)
.context(format!("Failed to move file {}", old_file.display()))
}),
rollback: Box::new(move |app_dir| {
let old_file = app_dir.join(filename);
let new_file = app_dir.join(new_filename);
log::info!(
"Rolling back file move: {}->{}",
old_file.display(),
new_file.display()
);
std::fs::rename(&new_file, &old_file).context(format!(
"Failed to rollback file move {}->{}",
new_file.display(),
old_file.display()
))
}),
}
}
pub fn move_if_exists(filename: &'static Path, new_filename: &'static Path) -> Self {
Job {
apply: Box::new(move |app_dir| {
let old_file = app_dir.join(filename);
let new_file = app_dir.join(new_filename);
if old_file.exists() {
log::info!(
"Moving file: {}->{}",
old_file.display(),
new_file.display()
);
std::fs::rename(&old_file, new_file)
.context(format!("Failed to move file {}", old_file.display()))?;
}
Ok(())
}),
rollback: Box::new(move |app_dir| {
let old_file = app_dir.join(filename);
let new_file = app_dir.join(new_filename);
if new_file.exists() {
log::info!(
"Rolling back file move: {}->{}",
old_file.display(),
new_file.display()
);
std::fs::rename(&new_file, &old_file).context(format!(
"Failed to rollback file move {}->{}",
new_file.display(),
old_file.display()
))?
}
Ok(())
}),
}
}
pub fn rmdir_nofail(filename: &'static Path) -> Self {
Job {
apply: Box::new(move |app_dir| {
let filename = app_dir.join(filename);
log::info!("Removing file: {}", filename.display());
if let Err(e) = std::fs::remove_dir_all(&filename) {
log::warn!("Failed to remove directory: {}", e);
}
Ok(())
}),
rollback: Box::new(move |app_dir| {
let filename = app_dir.join(filename);
anyhow::bail!(
"Delete operations cannot be rolled back, file: {}",
filename.display()
)
}),
}
}
}
// app is single threaded
#[cfg(not(test))]
#[allow(clippy::declare_interior_mutable_const)]
pub(crate) const JOBS: LazyCell<[Job; 22]> = LazyCell::new(|| {
fn p(value: &str) -> &Path {
Path::new(value)
}
[
// Move old files
// Not deleting because installing new files can fail
Job::mkdir(p("old")),
Job::move_file(p("Zed.exe"), p("old\\Zed.exe")),
Job::mkdir(p("old\\bin")),
Job::move_file(p("bin\\Zed.exe"), p("old\\bin\\Zed.exe")),
Job::move_file(p("bin\\zed"), p("old\\bin\\zed")),
//
// TODO: remove after a few weeks once everyone is on the new version and this file never exists
Job::move_if_exists(p("OpenConsole.exe"), p("old\\OpenConsole.exe")),
Job::mkdir(p("old\\x64")),
Job::mkdir(p("old\\arm64")),
Job::move_if_exists(p("x64\\OpenConsole.exe"), p("old\\x64\\OpenConsole.exe")),
Job::move_if_exists(
p("arm64\\OpenConsole.exe"),
p("old\\arm64\\OpenConsole.exe"),
),
//
Job::move_file(p("conpty.dll"), p("old\\conpty.dll")),
// Copy new files
Job::move_file(p("install\\Zed.exe"), p("Zed.exe")),
Job::move_file(p("install\\bin\\Zed.exe"), p("bin\\Zed.exe")),
Job::move_file(p("install\\bin\\zed"), p("bin\\zed")),
//
Job::mkdir_if_exists(p("x64"), p("install\\x64")),
Job::mkdir_if_exists(p("arm64"), p("install\\arm64")),
Job::move_if_exists(
p("install\\x64\\OpenConsole.exe"),
p("x64\\OpenConsole.exe"),
),
Job::move_if_exists(
p("install\\arm64\\OpenConsole.exe"),
p("arm64\\OpenConsole.exe"),
),
//
Job::move_file(p("install\\conpty.dll"), p("conpty.dll")),
// Cleanup installer and updates folder
Job::rmdir_nofail(p("updates")),
Job::rmdir_nofail(p("install")),
// Cleanup old installation
Job::rmdir_nofail(p("old")),
]
});
// app is single threaded
#[cfg(test)]
pub(crate) const JOBS: &[Job] = &[
|_| {
std::thread::sleep(Duration::from_millis(1000));
if let Ok(config) = std::env::var("ZED_AUTO_UPDATE") {
match config.as_str() {
"err" => Err(std::io::Error::other("Simulated error")).context("Anyhow!"),
_ => panic!("Unknown ZED_AUTO_UPDATE value: {}", config),
}
} else {
Ok(())
}
},
|_| {
std::thread::sleep(Duration::from_millis(1000));
if let Ok(config) = std::env::var("ZED_AUTO_UPDATE") {
match config.as_str() {
"err" => Err(std::io::Error::other("Simulated error")).context("Anyhow!"),
_ => panic!("Unknown ZED_AUTO_UPDATE value: {}", config),
}
} else {
Ok(())
}
},
];
#[allow(clippy::declare_interior_mutable_const)]
pub(crate) const JOBS: LazyCell<[Job; 9]> = LazyCell::new(|| {
fn p(value: &str) -> &Path {
Path::new(value)
}
[
Job {
apply: Box::new(|_| {
std::thread::sleep(Duration::from_millis(1000));
if let Ok(config) = std::env::var("ZED_AUTO_UPDATE") {
match config.as_str() {
"err1" => Err(std::io::Error::other("Simulated error")).context("Anyhow!"),
"err2" => Ok(()),
_ => panic!("Unknown ZED_AUTO_UPDATE value: {}", config),
}
} else {
Ok(())
}
}),
rollback: Box::new(|_| {
unsafe { std::env::set_var("ZED_AUTO_UPDATE_RB", "rollback1") };
Ok(())
}),
},
Job::mkdir(p("test1")),
Job::mkdir_if_exists(p("test_exists"), p("test1")),
Job::mkdir_if_exists(p("test_missing"), p("dont")),
Job {
apply: Box::new(|folder| {
std::fs::write(folder.join("test1/test"), "test")?;
Ok(())
}),
rollback: Box::new(|folder| {
std::fs::remove_file(folder.join("test1/test"))?;
Ok(())
}),
},
Job::move_file(p("test1/test"), p("test1/moved")),
Job::move_if_exists(p("test1/test"), p("test1/noop")),
Job {
apply: Box::new(|_| {
std::thread::sleep(Duration::from_millis(1000));
if let Ok(config) = std::env::var("ZED_AUTO_UPDATE") {
match config.as_str() {
"err1" => Ok(()),
"err2" => Err(std::io::Error::other("Simulated error")).context("Anyhow!"),
_ => panic!("Unknown ZED_AUTO_UPDATE value: {}", config),
}
} else {
Ok(())
}
}),
rollback: Box::new(|_| Ok(())),
},
Job::rmdir_nofail(p("test1/nofolder")),
]
});
pub(crate) fn perform_update(app_dir: &Path, hwnd: Option<isize>, launch: bool) -> Result<()> {
let hwnd = hwnd.map(|ptr| HWND(ptr as _));
for job in JOBS.iter() {
let mut last_successful_job = None;
'outer: for (i, job) in JOBS.iter().enumerate() {
let start = Instant::now();
loop {
anyhow::ensure!(start.elapsed().as_secs() <= 2, "Timed out");
match (*job)(app_dir) {
if start.elapsed().as_secs() > 2 {
log::error!("Timed out, rolling back");
break 'outer;
}
match (job.apply)(app_dir) {
Ok(_) => {
last_successful_job = Some(i);
unsafe { PostMessageW(hwnd, WM_JOB_UPDATED, WPARAM(0), LPARAM(0))? };
break;
}
@@ -223,6 +288,7 @@ pub(crate) fn perform_update(app_dir: &Path, hwnd: Option<isize>, launch: bool)
let io_err = err.downcast_ref::<std::io::Error>().unwrap();
if io_err.kind() == std::io::ErrorKind::NotFound {
log::warn!("File or folder not found.");
last_successful_job = Some(i);
unsafe { PostMessageW(hwnd, WM_JOB_UPDATED, WPARAM(0), LPARAM(0))? };
break;
}
@@ -233,6 +299,28 @@ pub(crate) fn perform_update(app_dir: &Path, hwnd: Option<isize>, launch: bool)
}
}
}
if last_successful_job
.map(|job| job != JOBS.len() - 1)
.unwrap_or(true)
{
let Some(last_successful_job) = last_successful_job else {
anyhow::bail!("Autoupdate failed, nothing to rollback");
};
for job in (0..=last_successful_job).rev() {
let job = &JOBS[job];
if let Err(e) = (job.rollback)(app_dir) {
anyhow::bail!(
"Job rollback failed, the app might be left in an inconsistent state: ({:?})",
e
);
}
}
anyhow::bail!("Autoupdate failed, rollback successful");
}
if launch {
#[allow(clippy::disallowed_methods, reason = "doesn't run in the main binary")]
let _ = std::process::Command::new(app_dir.join("Zed.exe")).spawn();
@@ -247,12 +335,27 @@ mod test {
#[test]
fn test_perform_update() {
let app_dir = std::path::Path::new("C:/");
let app_dir = tempfile::tempdir().unwrap();
let app_dir = app_dir.path();
assert!(perform_update(app_dir, None, false).is_ok());
let app_dir = tempfile::tempdir().unwrap();
let app_dir = app_dir.path();
// Simulate a failure on the very first job, so there is nothing to roll back
unsafe { std::env::set_var("ZED_AUTO_UPDATE", "err") };
unsafe { std::env::set_var("ZED_AUTO_UPDATE", "err1") };
let ret = perform_update(app_dir, None, false);
assert!(ret.is_err_and(|e| e.to_string().as_str() == "Timed out"));
assert!(
ret.is_err_and(|e| e.to_string().as_str() == "Autoupdate failed, nothing to rollback")
);
let app_dir = tempfile::tempdir().unwrap();
let app_dir = app_dir.path();
// Simulate a failure partway through, so the already-applied jobs are rolled back
unsafe { std::env::set_var("ZED_AUTO_UPDATE", "err2") };
let ret = perform_update(app_dir, None, false);
assert!(
ret.is_err_and(|e| e.to_string().as_str() == "Autoupdate failed, rollback successful")
);
assert!(std::env::var("ZED_AUTO_UPDATE_RB").is_ok_and(|e| e == "rollback1"));
}
}
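
For reference, a minimal, self-contained model of the apply-then-rollback flow that `perform_update` now follows: every successfully applied step is remembered, and on failure the successful prefix is rolled back in reverse order. `Step` and `run` below are simplified stand-ins, not the real `Job` type or `perform_update`.

```rust
// Hedged, simplified model of the apply/rollback flow shown above.
struct Step {
    apply: fn() -> Result<(), String>,
    rollback: fn() -> Result<(), String>,
}

fn run(steps: &[Step]) -> Result<(), String> {
    let mut last_successful = None;
    for (i, step) in steps.iter().enumerate() {
        if (step.apply)().is_err() {
            // Roll back the successfully applied prefix in reverse order.
            if let Some(last) = last_successful {
                for j in (0..=last).rev() {
                    (steps[j].rollback)()
                        .map_err(|e| format!("rollback failed, state may be inconsistent: {e}"))?;
                }
                return Err("update failed, rollback successful".to_string());
            }
            return Err("update failed, nothing to roll back".to_string());
        }
        last_successful = Some(i);
    }
    Ok(())
}

fn ok_step() -> Result<(), String> {
    Ok(())
}

fn failing_step() -> Result<(), String> {
    Err("boom".to_string())
}

fn main() {
    // The second step fails, so the first step's rollback runs and the error reports it.
    let steps = [
        Step { apply: ok_step, rollback: ok_step },
        Step { apply: failing_step, rollback: ok_step },
    ];
    assert_eq!(
        run(&steps),
        Err("update failed, rollback successful".to_string())
    );
}
```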

View File

@@ -100,13 +100,21 @@ impl Render for Breadcrumbs {
let breadcrumbs_stack = h_flex().gap_1().children(breadcrumbs);
let prefix_element = active_item.breadcrumb_prefix(window, cx);
let breadcrumbs = if let Some(prefix) = prefix_element {
h_flex().gap_1p5().child(prefix).child(breadcrumbs_stack)
} else {
breadcrumbs_stack
};
match active_item
.downcast::<Editor>()
.map(|editor| editor.downgrade())
{
Some(editor) => element.child(
ButtonLike::new("toggle outline view")
.child(breadcrumbs_stack)
.child(breadcrumbs)
.style(ButtonStyle::Transparent)
.on_click({
let editor = editor.clone();
@@ -141,7 +149,7 @@ impl Render for Breadcrumbs {
// Match the height and padding of the `ButtonLike` in the other arm.
.h(rems_from_px(22.))
.pl_1()
.child(breadcrumbs_stack),
.child(breadcrumbs),
}
}
}

View File

@@ -1,9 +1,6 @@
use futures::channel::oneshot;
use git2::{DiffLineType as GitDiffLineType, DiffOptions as GitOptions, Patch as GitPatch};
use gpui::{
App, AppContext as _, AsyncApp, BackgroundExecutor, Context, Entity, EventEmitter, Task,
TaskLabel,
};
use gpui::{App, AppContext as _, AsyncApp, Context, Entity, EventEmitter, Task, TaskLabel};
use language::{Language, LanguageRegistry};
use rope::Rope;
use std::{
@@ -194,7 +191,7 @@ impl BufferDiffSnapshot {
let base_text_exists;
let base_text_snapshot;
if let Some(text) = &base_text {
let base_text_rope = Rope::from_str(text.as_str(), cx.background_executor());
let base_text_rope = Rope::from(text.as_str());
base_text_pair = Some((text.clone(), base_text_rope.clone()));
let snapshot =
language::Buffer::build_snapshot(base_text_rope, language, language_registry, cx);
@@ -314,7 +311,6 @@ impl BufferDiffInner {
hunks: &[DiffHunk],
buffer: &text::BufferSnapshot,
file_exists: bool,
cx: &BackgroundExecutor,
) -> Option<Rope> {
let head_text = self
.base_text_exists
@@ -509,7 +505,7 @@ impl BufferDiffInner {
for (old_range, replacement_text) in edits {
new_index_text.append(index_cursor.slice(old_range.start));
index_cursor.seek_forward(old_range.end);
new_index_text.push(&replacement_text, cx);
new_index_text.push(&replacement_text);
}
new_index_text.append(index_cursor.suffix());
Some(new_index_text)
@@ -966,7 +962,6 @@ impl BufferDiff {
hunks,
buffer,
file_exists,
cx.background_executor(),
);
cx.emit(BufferDiffEvent::HunksStagedOrUnstaged(
@@ -1390,12 +1385,7 @@ mod tests {
"
.unindent();
let mut buffer = Buffer::new(
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
buffer_text,
cx.background_executor(),
);
let mut buffer = Buffer::new(ReplicaId::LOCAL, BufferId::new(1).unwrap(), buffer_text);
let mut diff = BufferDiffSnapshot::new_sync(buffer.clone(), diff_base.clone(), cx);
assert_hunks(
diff.hunks_intersecting_range(Anchor::MIN..Anchor::MAX, &buffer),
@@ -1404,7 +1394,7 @@ mod tests {
&[(1..2, "two\n", "HELLO\n", DiffHunkStatus::modified_none())],
);
buffer.edit([(0..0, "point five\n")], cx.background_executor());
buffer.edit([(0..0, "point five\n")]);
diff = BufferDiffSnapshot::new_sync(buffer.clone(), diff_base.clone(), cx);
assert_hunks(
diff.hunks_intersecting_range(Anchor::MIN..Anchor::MAX, &buffer),
@@ -1469,12 +1459,7 @@ mod tests {
"
.unindent();
let buffer = Buffer::new(
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
buffer_text,
cx.background_executor(),
);
let buffer = Buffer::new(ReplicaId::LOCAL, BufferId::new(1).unwrap(), buffer_text);
let unstaged_diff = BufferDiffSnapshot::new_sync(buffer.clone(), index_text, cx);
let mut uncommitted_diff =
BufferDiffSnapshot::new_sync(buffer.clone(), head_text.clone(), cx);
@@ -1543,12 +1528,7 @@ mod tests {
"
.unindent();
let buffer = Buffer::new(
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
buffer_text,
cx.background_executor(),
);
let buffer = Buffer::new(ReplicaId::LOCAL, BufferId::new(1).unwrap(), buffer_text);
let diff = cx
.update(|cx| {
BufferDiffSnapshot::new_with_base_text(
@@ -1811,12 +1791,7 @@ mod tests {
for example in table {
let (buffer_text, ranges) = marked_text_ranges(&example.buffer_marked_text, false);
let buffer = Buffer::new(
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
buffer_text,
cx.background_executor(),
);
let buffer = Buffer::new(ReplicaId::LOCAL, BufferId::new(1).unwrap(), buffer_text);
let hunk_range =
buffer.anchor_before(ranges[0].start)..buffer.anchor_before(ranges[0].end);
@@ -1893,7 +1868,6 @@ mod tests {
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
buffer_text.clone(),
cx.background_executor(),
);
let unstaged = BufferDiffSnapshot::new_sync(buffer.clone(), index_text, cx);
let uncommitted = BufferDiffSnapshot::new_sync(buffer.clone(), head_text.clone(), cx);
@@ -1967,12 +1941,7 @@ mod tests {
"
.unindent();
let mut buffer = Buffer::new(
ReplicaId::LOCAL,
BufferId::new(1).unwrap(),
buffer_text_1,
cx.background_executor(),
);
let mut buffer = Buffer::new(ReplicaId::LOCAL, BufferId::new(1).unwrap(), buffer_text_1);
let empty_diff = cx.update(|cx| BufferDiffSnapshot::empty(&buffer, cx));
let diff_1 = BufferDiffSnapshot::new_sync(buffer.clone(), base_text.clone(), cx);
@@ -1992,7 +1961,6 @@ mod tests {
NINE
"
.unindent(),
cx.background_executor(),
);
let diff_2 = BufferDiffSnapshot::new_sync(buffer.clone(), base_text.clone(), cx);
assert_eq!(None, diff_2.inner.compare(&diff_1.inner, &buffer));
@@ -2010,7 +1978,6 @@ mod tests {
NINE
"
.unindent(),
cx.background_executor(),
);
let diff_3 = BufferDiffSnapshot::new_sync(buffer.clone(), base_text.clone(), cx);
let range = diff_3.inner.compare(&diff_2.inner, &buffer).unwrap();
@@ -2028,7 +1995,6 @@ mod tests {
NINE
"
.unindent(),
cx.background_executor(),
);
let diff_4 = BufferDiffSnapshot::new_sync(buffer.clone(), base_text.clone(), cx);
let range = diff_4.inner.compare(&diff_3.inner, &buffer).unwrap();
@@ -2047,7 +2013,6 @@ mod tests {
NINE
"
.unindent(),
cx.background_executor(),
);
let diff_5 = BufferDiffSnapshot::new_sync(buffer.snapshot(), base_text.clone(), cx);
let range = diff_5.inner.compare(&diff_4.inner, &buffer).unwrap();
@@ -2066,7 +2031,6 @@ mod tests {
«nine»
"
.unindent(),
cx.background_executor(),
);
let diff_6 = BufferDiffSnapshot::new_sync(buffer.snapshot(), base_text, cx);
let range = diff_6.inner.compare(&diff_5.inner, &buffer).unwrap();
@@ -2176,14 +2140,14 @@ mod tests {
let working_copy = gen_working_copy(rng, &head_text);
let working_copy = cx.new(|cx| {
language::Buffer::local_normalized(
Rope::from_str(working_copy.as_str(), cx.background_executor()),
Rope::from(working_copy.as_str()),
text::LineEnding::default(),
cx,
)
});
let working_copy = working_copy.read_with(cx, |working_copy, _| working_copy.snapshot());
let mut index_text = if rng.random() {
Rope::from_str(head_text.as_str(), cx.background_executor())
Rope::from(head_text.as_str())
} else {
working_copy.as_rope().clone()
};

View File

@@ -70,7 +70,6 @@ impl ChannelBuffer {
ReplicaId::new(response.replica_id as u16),
capability,
base_text,
cx.background_executor(),
)
})?;
buffer.update(cx, |buffer, cx| buffer.apply_ops(operations, cx))?;

View File

@@ -260,7 +260,7 @@ impl fmt::Debug for Lamport {
impl fmt::Debug for Global {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f, "Global {{")?;
for timestamp in self.iter() {
for timestamp in self.iter().filter(|t| t.value > 0) {
if timestamp.replica_id.0 > 0 {
write!(f, ", ")?;
}

View File

@@ -1,4 +1,5 @@
pub mod predict_edits_v3;
pub mod udiff;
use std::str::FromStr;
use std::sync::Arc;

View File

@@ -0,0 +1,294 @@
use std::{borrow::Cow, fmt::Display};
#[derive(Debug, PartialEq)]
pub enum DiffLine<'a> {
OldPath { path: Cow<'a, str> },
NewPath { path: Cow<'a, str> },
HunkHeader(Option<HunkLocation>),
Context(&'a str),
Deletion(&'a str),
Addition(&'a str),
Garbage(&'a str),
}
#[derive(Debug, PartialEq)]
pub struct HunkLocation {
start_line_old: u32,
count_old: u32,
start_line_new: u32,
count_new: u32,
}
impl<'a> DiffLine<'a> {
pub fn parse(line: &'a str) -> Self {
Self::try_parse(line).unwrap_or(Self::Garbage(line))
}
fn try_parse(line: &'a str) -> Option<Self> {
if let Some(header) = line.strip_prefix("---").and_then(eat_required_whitespace) {
let path = parse_header_path("a/", header);
Some(Self::OldPath { path })
} else if let Some(header) = line.strip_prefix("+++").and_then(eat_required_whitespace) {
Some(Self::NewPath {
path: parse_header_path("b/", header),
})
} else if let Some(header) = line.strip_prefix("@@").and_then(eat_required_whitespace) {
if header.starts_with("...") {
return Some(Self::HunkHeader(None));
}
let (start_line_old, header) = header.strip_prefix('-')?.split_once(',')?;
let mut parts = header.split_ascii_whitespace();
let count_old = parts.next()?;
let (start_line_new, count_new) = parts.next()?.strip_prefix('+')?.split_once(',')?;
Some(Self::HunkHeader(Some(HunkLocation {
start_line_old: start_line_old.parse::<u32>().ok()?.saturating_sub(1),
count_old: count_old.parse().ok()?,
start_line_new: start_line_new.parse::<u32>().ok()?.saturating_sub(1),
count_new: count_new.parse().ok()?,
})))
} else if let Some(deleted_header) = line.strip_prefix("-") {
Some(Self::Deletion(deleted_header))
} else if line.is_empty() {
Some(Self::Context(""))
} else if let Some(context) = line.strip_prefix(" ") {
Some(Self::Context(context))
} else {
Some(Self::Addition(line.strip_prefix("+")?))
}
}
}
impl<'a> Display for DiffLine<'a> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
DiffLine::OldPath { path } => write!(f, "--- {path}"),
DiffLine::NewPath { path } => write!(f, "+++ {path}"),
DiffLine::HunkHeader(Some(hunk_location)) => {
write!(
f,
"@@ -{},{} +{},{} @@",
hunk_location.start_line_old + 1,
hunk_location.count_old,
hunk_location.start_line_new + 1,
hunk_location.count_new
)
}
DiffLine::HunkHeader(None) => write!(f, "@@ ... @@"),
DiffLine::Context(content) => write!(f, " {content}"),
DiffLine::Deletion(content) => write!(f, "-{content}"),
DiffLine::Addition(content) => write!(f, "+{content}"),
DiffLine::Garbage(line) => write!(f, "{line}"),
}
}
}
fn parse_header_path<'a>(strip_prefix: &'static str, header: &'a str) -> Cow<'a, str> {
if !header.contains(['"', '\\']) {
let path = header.split_ascii_whitespace().next().unwrap_or(header);
return Cow::Borrowed(path.strip_prefix(strip_prefix).unwrap_or(path));
}
let mut path = String::with_capacity(header.len());
let mut in_quote = false;
let mut chars = header.chars().peekable();
let mut strip_prefix = Some(strip_prefix);
while let Some(char) = chars.next() {
if char == '"' {
in_quote = !in_quote;
} else if char == '\\' {
let Some(&next_char) = chars.peek() else {
break;
};
chars.next();
path.push(next_char);
} else if char.is_ascii_whitespace() && !in_quote {
break;
} else {
path.push(char);
}
if let Some(prefix) = strip_prefix
&& path == prefix
{
strip_prefix.take();
path.clear();
}
}
Cow::Owned(path)
}
fn eat_required_whitespace(header: &str) -> Option<&str> {
let trimmed = header.trim_ascii_start();
if trimmed.len() == header.len() {
None
} else {
Some(trimmed)
}
}
#[cfg(test)]
mod tests {
use super::*;
use indoc::indoc;
#[test]
fn parse_lines_simple() {
let input = indoc! {"
diff --git a/text.txt b/text.txt
index 86c770d..a1fd855 100644
--- a/file.txt
+++ b/file.txt
@@ -1,2 +1,3 @@
context
-deleted
+inserted
garbage
--- b/file.txt
+++ a/file.txt
"};
let lines = input.lines().map(DiffLine::parse).collect::<Vec<_>>();
pretty_assertions::assert_eq!(
lines,
&[
DiffLine::Garbage("diff --git a/text.txt b/text.txt"),
DiffLine::Garbage("index 86c770d..a1fd855 100644"),
DiffLine::OldPath {
path: "file.txt".into()
},
DiffLine::NewPath {
path: "file.txt".into()
},
DiffLine::HunkHeader(Some(HunkLocation {
start_line_old: 0,
count_old: 2,
start_line_new: 0,
count_new: 3
})),
DiffLine::Context("context"),
DiffLine::Deletion("deleted"),
DiffLine::Addition("inserted"),
DiffLine::Garbage("garbage"),
DiffLine::Context(""),
DiffLine::OldPath {
path: "b/file.txt".into()
},
DiffLine::NewPath {
path: "a/file.txt".into()
},
]
);
}
#[test]
fn file_header_extra_space() {
let options = ["--- file", "--- file", "---\tfile"];
for option in options {
pretty_assertions::assert_eq!(
DiffLine::parse(option),
DiffLine::OldPath {
path: "file".into()
},
"{option}",
);
}
}
#[test]
fn hunk_header_extra_space() {
let options = [
"@@ -1,2 +1,3 @@",
"@@ -1,2 +1,3 @@",
"@@\t-1,2\t+1,3\t@@",
"@@ -1,2 +1,3 @@",
"@@ -1,2 +1,3 @@",
"@@ -1,2 +1,3 @@",
"@@ -1,2 +1,3 @@ garbage",
];
for option in options {
pretty_assertions::assert_eq!(
DiffLine::parse(option),
DiffLine::HunkHeader(Some(HunkLocation {
start_line_old: 0,
count_old: 2,
start_line_new: 0,
count_new: 3
})),
"{option}",
);
}
}
#[test]
fn hunk_header_without_location() {
pretty_assertions::assert_eq!(DiffLine::parse("@@ ... @@"), DiffLine::HunkHeader(None));
}
#[test]
fn test_parse_path() {
assert_eq!(parse_header_path("a/", "foo.txt"), "foo.txt");
assert_eq!(
parse_header_path("a/", "foo/bar/baz.txt"),
"foo/bar/baz.txt"
);
assert_eq!(parse_header_path("a/", "a/foo.txt"), "foo.txt");
assert_eq!(
parse_header_path("a/", "a/foo/bar/baz.txt"),
"foo/bar/baz.txt"
);
// Extra
assert_eq!(
parse_header_path("a/", "a/foo/bar/baz.txt 2025"),
"foo/bar/baz.txt"
);
assert_eq!(
parse_header_path("a/", "a/foo/bar/baz.txt\t2025"),
"foo/bar/baz.txt"
);
assert_eq!(
parse_header_path("a/", "a/foo/bar/baz.txt \""),
"foo/bar/baz.txt"
);
// Quoted
assert_eq!(
parse_header_path("a/", "a/foo/bar/\"baz quox.txt\""),
"foo/bar/baz quox.txt"
);
assert_eq!(
parse_header_path("a/", "\"a/foo/bar/baz quox.txt\""),
"foo/bar/baz quox.txt"
);
assert_eq!(
parse_header_path("a/", "\"foo/bar/baz quox.txt\""),
"foo/bar/baz quox.txt"
);
assert_eq!(parse_header_path("a/", "\"whatever 🤷\""), "whatever 🤷");
assert_eq!(
parse_header_path("a/", "\"foo/bar/baz quox.txt\" 2025"),
"foo/bar/baz quox.txt"
);
// unescaped quotes are dropped
assert_eq!(parse_header_path("a/", "foo/\"bar\""), "foo/bar");
// Escaped
assert_eq!(
parse_header_path("a/", "\"foo/\\\"bar\\\"/baz.txt\""),
"foo/\"bar\"/baz.txt"
);
assert_eq!(
parse_header_path("a/", "\"C:\\\\Projects\\\\My App\\\\old file.txt\""),
"C:\\Projects\\My App\\old file.txt"
);
}
}

View File

@@ -212,7 +212,7 @@ pub fn write_codeblock<'a>(
include_line_numbers: bool,
output: &'a mut String,
) {
writeln!(output, "`````path={}", path.display()).unwrap();
writeln!(output, "`````{}", path.display()).unwrap();
write_excerpts(
excerpts,
sorted_insertions,

View File

@@ -701,12 +701,12 @@ impl Database {
return Ok(());
}
let mut text_buffer = text::Buffer::new_slow(
let mut text_buffer = text::Buffer::new(
clock::ReplicaId::LOCAL,
text::BufferId::new(1).unwrap(),
base_text,
);
text_buffer.apply_ops(operations.into_iter().filter_map(operation_from_wire), None);
text_buffer.apply_ops(operations.into_iter().filter_map(operation_from_wire));
let base_text = text_buffer.text();
let epoch = buffer.epoch + 1;

View File

@@ -74,21 +74,11 @@ async fn test_channel_buffers(db: &Arc<Database>) {
ReplicaId::new(0),
text::BufferId::new(1).unwrap(),
"".to_string(),
&db.test_options.as_ref().unwrap().executor,
);
let operations = vec![
buffer_a.edit(
[(0..0, "hello world")],
&db.test_options.as_ref().unwrap().executor,
),
buffer_a.edit(
[(5..5, ", cruel")],
&db.test_options.as_ref().unwrap().executor,
),
buffer_a.edit(
[(0..5, "goodbye")],
&db.test_options.as_ref().unwrap().executor,
),
buffer_a.edit([(0..0, "hello world")]),
buffer_a.edit([(5..5, ", cruel")]),
buffer_a.edit([(0..5, "goodbye")]),
buffer_a.undo().unwrap().1,
];
assert_eq!(buffer_a.text(), "hello, cruel world");
@@ -112,19 +102,15 @@ async fn test_channel_buffers(db: &Arc<Database>) {
ReplicaId::new(0),
text::BufferId::new(1).unwrap(),
buffer_response_b.base_text,
&db.test_options.as_ref().unwrap().executor,
);
buffer_b.apply_ops(
buffer_response_b.operations.into_iter().map(|operation| {
let operation = proto::deserialize_operation(operation).unwrap();
if let language::Operation::Buffer(operation) = operation {
operation
} else {
unreachable!()
}
}),
None,
);
buffer_b.apply_ops(buffer_response_b.operations.into_iter().map(|operation| {
let operation = proto::deserialize_operation(operation).unwrap();
if let language::Operation::Buffer(operation) = operation {
operation
} else {
unreachable!()
}
}));
assert_eq!(buffer_b.text(), "hello, cruel world");
@@ -261,7 +247,6 @@ async fn test_channel_buffers_last_operations(db: &Database) {
ReplicaId::new(res.replica_id as u16),
text::BufferId::new(1).unwrap(),
"".to_string(),
&db.test_options.as_ref().unwrap().executor,
));
}
@@ -270,9 +255,9 @@ async fn test_channel_buffers_last_operations(db: &Database) {
user_id,
db,
vec![
text_buffers[0].edit([(0..0, "a")], &db.test_options.as_ref().unwrap().executor),
text_buffers[0].edit([(0..0, "b")], &db.test_options.as_ref().unwrap().executor),
text_buffers[0].edit([(0..0, "c")], &db.test_options.as_ref().unwrap().executor),
text_buffers[0].edit([(0..0, "a")]),
text_buffers[0].edit([(0..0, "b")]),
text_buffers[0].edit([(0..0, "c")]),
],
)
.await;
@@ -282,9 +267,9 @@ async fn test_channel_buffers_last_operations(db: &Database) {
user_id,
db,
vec![
text_buffers[1].edit([(0..0, "d")], &db.test_options.as_ref().unwrap().executor),
text_buffers[1].edit([(1..1, "e")], &db.test_options.as_ref().unwrap().executor),
text_buffers[1].edit([(2..2, "f")], &db.test_options.as_ref().unwrap().executor),
text_buffers[1].edit([(0..0, "d")]),
text_buffers[1].edit([(1..1, "e")]),
text_buffers[1].edit([(2..2, "f")]),
],
)
.await;
@@ -301,15 +286,14 @@ async fn test_channel_buffers_last_operations(db: &Database) {
replica_id,
text::BufferId::new(1).unwrap(),
"def".to_string(),
&db.test_options.as_ref().unwrap().executor,
);
update_buffer(
buffers[1].channel_id,
user_id,
db,
vec![
text_buffers[1].edit([(0..0, "g")], &db.test_options.as_ref().unwrap().executor),
text_buffers[1].edit([(0..0, "h")], &db.test_options.as_ref().unwrap().executor),
text_buffers[1].edit([(0..0, "g")]),
text_buffers[1].edit([(0..0, "h")]),
],
)
.await;
@@ -318,7 +302,7 @@ async fn test_channel_buffers_last_operations(db: &Database) {
buffers[2].channel_id,
user_id,
db,
vec![text_buffers[2].edit([(0..0, "i")], &db.test_options.as_ref().unwrap().executor)],
vec![text_buffers[2].edit([(0..0, "i")])],
)
.await;

View File

@@ -346,6 +346,7 @@ impl Server {
.add_request_handler(forward_read_only_project_request::<proto::ResolveInlayHint>)
.add_request_handler(forward_read_only_project_request::<proto::GetColorPresentation>)
.add_request_handler(forward_read_only_project_request::<proto::OpenBufferByPath>)
.add_request_handler(forward_read_only_project_request::<proto::OpenImageByPath>)
.add_request_handler(forward_read_only_project_request::<proto::GitGetBranches>)
.add_request_handler(forward_read_only_project_request::<proto::GetDefaultBranch>)
.add_request_handler(forward_read_only_project_request::<proto::OpenUnstagedDiff>)
@@ -395,6 +396,7 @@ impl Server {
.add_request_handler(forward_mutating_project_request::<proto::StopLanguageServers>)
.add_request_handler(forward_mutating_project_request::<proto::LinkedEditingRange>)
.add_message_handler(create_buffer_for_peer)
.add_message_handler(create_image_for_peer)
.add_request_handler(update_buffer)
.add_message_handler(broadcast_project_message_from_host::<proto::RefreshInlayHints>)
.add_message_handler(broadcast_project_message_from_host::<proto::RefreshCodeLens>)
@@ -2389,6 +2391,26 @@ async fn create_buffer_for_peer(
Ok(())
}
/// Notify other participants that a new image has been created
async fn create_image_for_peer(
request: proto::CreateImageForPeer,
session: MessageContext,
) -> Result<()> {
session
.db()
.await
.check_user_is_project_host(
ProjectId::from_proto(request.project_id),
session.connection_id,
)
.await?;
let peer_id = request.peer_id.context("invalid peer id")?;
session
.peer
.forward_send(session.connection_id, peer_id.into(), request)?;
Ok(())
}
/// Notify other participants that a buffer has been updated. This is
/// allowed for guests as long as the update is limited to selections.
async fn update_buffer(

View File

@@ -39,6 +39,7 @@ use std::{
Arc,
atomic::{self, AtomicBool, AtomicUsize},
},
time::Duration,
};
use text::Point;
use util::{path, rel_path::rel_path, uri};
@@ -1817,14 +1818,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
settings.project.all_languages.defaults.inlay_hints =
Some(InlayHintSettingsContent {
enabled: Some(true),
show_value_hints: Some(true),
edit_debounce_ms: Some(0),
scroll_debounce_ms: Some(0),
show_type_hints: Some(true),
show_parameter_hints: Some(false),
show_other_hints: Some(true),
show_background: Some(false),
toggle_on_modifiers_press: None,
..InlayHintSettingsContent::default()
})
});
});
@@ -1834,15 +1828,8 @@ async fn test_mutual_editor_inlay_hint_cache_update(
store.update_user_settings(cx, |settings| {
settings.project.all_languages.defaults.inlay_hints =
Some(InlayHintSettingsContent {
show_value_hints: Some(true),
enabled: Some(true),
edit_debounce_ms: Some(0),
scroll_debounce_ms: Some(0),
show_type_hints: Some(true),
show_parameter_hints: Some(false),
show_other_hints: Some(true),
show_background: Some(false),
toggle_on_modifiers_press: None,
..InlayHintSettingsContent::default()
})
});
});
@@ -1935,6 +1922,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
});
let fake_language_server = fake_language_servers.next().await.unwrap();
let editor_a = file_a.await.unwrap().downcast::<Editor>().unwrap();
executor.advance_clock(Duration::from_millis(100));
executor.run_until_parked();
let initial_edit = edits_made.load(atomic::Ordering::Acquire);
@@ -1955,6 +1943,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
.downcast::<Editor>()
.unwrap();
executor.advance_clock(Duration::from_millis(100));
executor.run_until_parked();
editor_b.update(cx_b, |editor, cx| {
assert_eq!(
@@ -1973,6 +1962,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
});
cx_b.focus(&editor_b);
executor.advance_clock(Duration::from_secs(1));
executor.run_until_parked();
editor_a.update(cx_a, |editor, cx| {
assert_eq!(
@@ -1996,6 +1986,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
});
cx_a.focus(&editor_a);
executor.advance_clock(Duration::from_secs(1));
executor.run_until_parked();
editor_a.update(cx_a, |editor, cx| {
assert_eq!(
@@ -2017,6 +2008,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
.into_response()
.expect("inlay refresh request failed");
executor.advance_clock(Duration::from_secs(1));
executor.run_until_parked();
editor_a.update(cx_a, |editor, cx| {
assert_eq!(

View File

@@ -3694,7 +3694,7 @@ async fn test_buffer_reloading(
assert_eq!(buf.line_ending(), LineEnding::Unix);
});
let new_contents = Rope::from_str_small("d\ne\nf");
let new_contents = Rope::from("d\ne\nf");
client_a
.fs()
.save(
@@ -4479,7 +4479,7 @@ async fn test_reloading_buffer_manually(
.fs()
.save(
path!("/a/a.rs").as_ref(),
&Rope::from_str_small("let seven = 7;"),
&Rope::from("let seven = 7;"),
LineEnding::Unix,
)
.await

View File

@@ -27,7 +27,6 @@ use std::{
rc::Rc,
sync::Arc,
};
use text::Rope;
use util::{
ResultExt, path,
paths::PathStyle,
@@ -939,11 +938,7 @@ impl RandomizedTest for ProjectCollaborationTest {
client
.fs()
.save(
&path,
&Rope::from_str_small(content.as_str()),
text::LineEnding::Unix,
)
.save(&path, &content.as_str().into(), text::LineEnding::Unix)
.await
.unwrap();
}

View File

@@ -7,10 +7,11 @@ use anyhow::Context as _;
use call::ActiveCall;
use channel::{Channel, ChannelEvent, ChannelStore};
use client::{ChannelId, Client, Contact, User, UserStore};
use collections::{HashMap, HashSet};
use contact_finder::ContactFinder;
use db::kvp::KEY_VALUE_STORE;
use editor::{Editor, EditorElement, EditorStyle};
use fuzzy::{StringMatchCandidate, match_strings};
use fuzzy::{StringMatch, StringMatchCandidate, match_strings};
use gpui::{
AnyElement, App, AsyncWindowContext, Bounds, ClickEvent, ClipboardItem, Context, DismissEvent,
Div, Entity, EventEmitter, FocusHandle, Focusable, FontStyle, InteractiveElement, IntoElement,
@@ -30,9 +31,9 @@ use smallvec::SmallVec;
use std::{mem, sync::Arc};
use theme::{ActiveTheme, ThemeSettings};
use ui::{
Avatar, AvatarAvailabilityIndicator, Button, Color, ContextMenu, Facepile, Icon, IconButton,
IconName, IconSize, Indicator, Label, ListHeader, ListItem, Tooltip, prelude::*,
tooltip_container,
Avatar, AvatarAvailabilityIndicator, Button, Color, ContextMenu, Facepile, HighlightedLabel,
Icon, IconButton, IconName, IconSize, Indicator, Label, ListHeader, ListItem, Tooltip,
prelude::*, tooltip_container,
};
use util::{ResultExt, TryFutureExt, maybe};
use workspace::{
@@ -261,6 +262,8 @@ enum ListEntry {
channel: Arc<Channel>,
depth: usize,
has_children: bool,
// `None` when the channel is only included as a parent of a matched channel.
string_match: Option<StringMatch>,
},
ChannelNotes {
channel_id: ChannelId,
@@ -630,6 +633,10 @@ impl CollabPanel {
.enumerate()
.map(|(ix, (_, channel))| StringMatchCandidate::new(ix, &channel.name)),
);
let mut channels = channel_store
.ordered_channels()
.map(|(_, chan)| chan)
.collect::<Vec<_>>();
let matches = executor.block(match_strings(
&self.match_candidates,
&query,
@@ -639,14 +646,34 @@ impl CollabPanel {
&Default::default(),
executor.clone(),
));
let matches_by_id: HashMap<_, _> = matches
.iter()
.map(|mat| (channels[mat.candidate_id].id, mat.clone()))
.collect();
let channel_ids_of_matches_or_parents: HashSet<_> = matches
.iter()
.flat_map(|mat| {
let match_channel = channels[mat.candidate_id];
match_channel
.parent_path
.iter()
.copied()
.chain(Some(match_channel.id))
})
.collect();
channels.retain(|chan| channel_ids_of_matches_or_parents.contains(&chan.id));
if let Some(state) = &self.channel_editing_state
&& matches!(state, ChannelEditingState::Create { location: None, .. })
{
self.entries.push(ListEntry::ChannelEditor { depth: 0 });
}
let mut collapse_depth = None;
for mat in matches {
let channel = channel_store.channel_at_index(mat.candidate_id).unwrap();
for (idx, channel) in channels.into_iter().enumerate() {
let depth = channel.parent_path.len();
if collapse_depth.is_none() && self.is_channel_collapsed(channel.id) {
@@ -663,7 +690,7 @@ impl CollabPanel {
}
let has_children = channel_store
.channel_at_index(mat.candidate_id + 1)
.channel_at_index(idx + 1)
.is_some_and(|next_channel| next_channel.parent_path.ends_with(&[channel.id]));
match &self.channel_editing_state {
@@ -675,6 +702,7 @@ impl CollabPanel {
channel: channel.clone(),
depth,
has_children: false,
string_match: matches_by_id.get(&channel.id).map(|mat| (*mat).clone()),
});
self.entries
.push(ListEntry::ChannelEditor { depth: depth + 1 });
@@ -690,6 +718,7 @@ impl CollabPanel {
channel: channel.clone(),
depth,
has_children,
string_match: matches_by_id.get(&channel.id).map(|mat| (*mat).clone()),
});
}
}
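
As an aside, a small, self-contained model of the filtering introduced above: a channel is retained when it matches the query or is an ancestor of a match. The `Channel` struct below is a hypothetical stand-in, not the real `channel::Channel`.

```rust
use std::collections::HashSet;

// Hypothetical, simplified stand-in for the real channel type.
struct Channel {
    id: u64,
    parent_path: Vec<u64>,
}

fn retain_matches_and_parents(channels: &mut Vec<Channel>, matched_ids: &[u64]) {
    // Keep every matched channel plus all of its ancestors, mirroring
    // `channel_ids_of_matches_or_parents` in the hunk above.
    let keep: HashSet<u64> = channels
        .iter()
        .filter(|c| matched_ids.contains(&c.id))
        .flat_map(|c| c.parent_path.iter().copied().chain(Some(c.id)))
        .collect();
    channels.retain(|c| keep.contains(&c.id));
}

fn main() {
    let mut channels = vec![
        Channel { id: 1, parent_path: vec![] },     // root
        Channel { id: 2, parent_path: vec![1] },    // parent of the match
        Channel { id: 3, parent_path: vec![1, 2] }, // the match itself
        Channel { id: 4, parent_path: vec![1] },    // unmatched sibling
    ];
    retain_matches_and_parents(&mut channels, &[3]);
    let kept: Vec<u64> = channels.iter().map(|c| c.id).collect();
    assert_eq!(kept, vec![1, 2, 3]); // the match and its ancestors survive, the sibling is dropped
}
```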
@@ -2321,8 +2350,17 @@ impl CollabPanel {
channel,
depth,
has_children,
string_match,
} => self
.render_channel(channel, *depth, *has_children, is_selected, ix, cx)
.render_channel(
channel,
*depth,
*has_children,
is_selected,
ix,
string_match.as_ref(),
cx,
)
.into_any_element(),
ListEntry::ChannelEditor { depth } => self
.render_channel_editor(*depth, window, cx)
@@ -2719,6 +2757,7 @@ impl CollabPanel {
has_children: bool,
is_selected: bool,
ix: usize,
string_match: Option<&StringMatch>,
cx: &mut Context<Self>,
) -> impl IntoElement {
let channel_id = channel.id;
@@ -2855,7 +2894,14 @@ impl CollabPanel {
.child(
h_flex()
.id(channel_id.0 as usize)
.child(Label::new(channel.name.clone()))
.child(match string_match {
None => Label::new(channel.name.clone()).into_any_element(),
Some(string_match) => HighlightedLabel::new(
channel.name.clone(),
string_match.positions.clone(),
)
.into_any_element(),
})
.children(face_pile.map(|face_pile| face_pile.p_1())),
),
)

View File

@@ -489,7 +489,11 @@ impl Copilot {
let node_path = node_runtime.binary_path().await?;
ensure_node_version_for_copilot(&node_path).await?;
let arguments: Vec<OsString> = vec![server_path.into(), "--stdio".into()];
let arguments: Vec<OsString> = vec![
"--experimental-sqlite".into(),
server_path.into(),
"--stdio".into(),
];
let binary = LanguageServerBinary {
path: node_path,
arguments,

View File

@@ -265,9 +265,10 @@ impl minidumper::ServerHandler for CrashServer {
3 => {
let gpu_specs: system_specs::GpuSpecs =
bincode::deserialize(&buffer).expect("gpu specs");
self.active_gpu
.set(gpu_specs)
.expect("already set active gpu");
// we ignore the case where it was already set because this message is sent
// on each new window. in theory all zed windows should be using the same
// GPU so this is fine.
self.active_gpu.set(gpu_specs).ok();
}
_ => {
panic!("invalid message kind");

View File

@@ -41,6 +41,10 @@ util.workspace = true
[dev-dependencies]
dap = { workspace = true, features = ["test-support"] }
fs = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, features = ["test-support"] }
http_client.workspace = true
node_runtime.workspace = true
settings = { workspace = true, features = ["test-support"] }
task = { workspace = true, features = ["test-support"] }
util = { workspace = true, features = ["test-support"] }

View File

@@ -4,6 +4,8 @@ mod go;
mod javascript;
mod python;
#[cfg(test)]
use std::path::PathBuf;
use std::sync::Arc;
use anyhow::Result;
@@ -38,3 +40,65 @@ pub fn init(cx: &mut App) {
}
})
}
#[cfg(test)]
mod test_mocks {
use super::*;
pub(crate) struct MockDelegate {
worktree_root: PathBuf,
}
impl MockDelegate {
pub(crate) fn new() -> Arc<dyn adapters::DapDelegate> {
Arc::new(Self {
worktree_root: PathBuf::from("/tmp/test"),
})
}
}
#[async_trait::async_trait]
impl adapters::DapDelegate for MockDelegate {
fn worktree_id(&self) -> settings::WorktreeId {
settings::WorktreeId::from_usize(0)
}
fn worktree_root_path(&self) -> &std::path::Path {
&self.worktree_root
}
fn http_client(&self) -> Arc<dyn http_client::HttpClient> {
unimplemented!("Not needed for tests")
}
fn node_runtime(&self) -> node_runtime::NodeRuntime {
unimplemented!("Not needed for tests")
}
fn toolchain_store(&self) -> Arc<dyn language::LanguageToolchainStore> {
unimplemented!("Not needed for tests")
}
fn fs(&self) -> Arc<dyn fs::Fs> {
unimplemented!("Not needed for tests")
}
fn output_to_console(&self, _msg: String) {}
async fn which(&self, _command: &std::ffi::OsStr) -> Option<PathBuf> {
None
}
async fn read_text_file(&self, _path: &util::rel_path::RelPath) -> Result<String> {
Ok(String::new())
}
async fn shell_env(&self) -> collections::HashMap<String, String> {
collections::HashMap::default()
}
fn is_headless(&self) -> bool {
false
}
}
}

View File

@@ -1,10 +1,9 @@
use std::ffi::OsStr;
use anyhow::{Context as _, Result, bail};
use async_trait::async_trait;
use collections::HashMap;
use dap::{StartDebuggingRequestArguments, adapters::DebugTaskDefinition};
use gpui::AsyncApp;
use std::ffi::OsStr;
use task::{DebugScenario, ZedDebugConfig};
use crate::*;
@@ -16,6 +15,14 @@ impl GdbDebugAdapter {
const ADAPTER_NAME: &'static str = "GDB";
}
/// Ensures that "-i=dap" is present in the GDB argument list.
fn ensure_dap_interface(mut gdb_args: Vec<String>) -> Vec<String> {
if !gdb_args.iter().any(|arg| arg.trim() == "-i=dap") {
gdb_args.insert(0, "-i=dap".to_string());
}
gdb_args
}
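
For illustration, a standalone copy of `ensure_dap_interface` (duplicated here only so the sketch runs on its own), showing that the DAP flag is guaranteed for any config-supplied `gdb_args`:

```rust
// Standalone copy of ensure_dap_interface above, for illustration only.
fn ensure_dap_interface(mut gdb_args: Vec<String>) -> Vec<String> {
    if !gdb_args.iter().any(|arg| arg.trim() == "-i=dap") {
        gdb_args.insert(0, "-i=dap".to_string());
    }
    gdb_args
}

fn main() {
    // Args from the config that lack the DAP flag get it prepended...
    assert_eq!(
        ensure_dap_interface(vec!["--quiet".to_string()]),
        vec!["-i=dap".to_string(), "--quiet".to_string()]
    );
    // ...while args that already contain it pass through unchanged.
    assert_eq!(
        ensure_dap_interface(vec!["-i=dap".to_string(), "--quiet".to_string()]),
        vec!["-i=dap".to_string(), "--quiet".to_string()]
    );
}
```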
#[async_trait(?Send)]
impl DebugAdapter for GdbDebugAdapter {
fn name(&self) -> DebugAdapterName {
@@ -99,6 +106,18 @@ impl DebugAdapter for GdbDebugAdapter {
"type": "string",
"description": "Working directory for the debugged program. GDB will change its working directory to this directory."
},
"gdb_path": {
"type": "string",
"description": "Alternative path to the GDB executable, if the one in standard path is not desirable"
},
"gdb_args": {
"type": "array",
"items": {
"type":"string"
},
"description": "additional arguments given to GDB at startup, not the program debugged",
"default": []
},
"env": {
"type": "object",
"description": "Environment variables for the debugged program. Each key is the name of an environment variable; each value is the value of that variable."
@@ -164,21 +183,49 @@ impl DebugAdapter for GdbDebugAdapter {
user_env: Option<HashMap<String, String>>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
let user_setting_path = user_installed_path
.filter(|p| p.exists())
.and_then(|p| p.to_str().map(|s| s.to_string()));
// Try to get gdb_path from config
let gdb_path_from_config = config
.config
.get("gdb_path")
.and_then(|v| v.as_str())
.map(|s| s.to_string());
let gdb_path = delegate
.which(OsStr::new("gdb"))
.await
.and_then(|p| p.to_str().map(|s| s.to_string()))
.context("Could not find gdb in path");
let gdb_path = if let Some(path) = gdb_path_from_config {
path
} else {
// Original logic: use user_installed_path or search in system path
let user_setting_path = user_installed_path
.filter(|p| p.exists())
.and_then(|p| p.to_str().map(|s| s.to_string()));
if gdb_path.is_err() && user_setting_path.is_none() {
bail!("Could not find gdb path or it's not installed");
}
let gdb_path_result = delegate
.which(OsStr::new("gdb"))
.await
.and_then(|p| p.to_str().map(|s| s.to_string()))
.context("Could not find gdb in path");
let gdb_path = user_setting_path.unwrap_or(gdb_path?);
if gdb_path_result.is_err() && user_setting_path.is_none() {
bail!("Could not find gdb path or it's not installed");
}
user_setting_path.unwrap_or_else(|| gdb_path_result.unwrap())
};
// Arguments: use gdb_args from config if present, else user_args, else default
let gdb_args = {
let args = config
.config
.get("gdb_args")
.and_then(|v| v.as_array())
.map(|arr| {
arr.iter()
.filter_map(|v| v.as_str().map(|s| s.to_string()))
.collect::<Vec<_>>()
})
.or(user_args.clone())
.unwrap_or_else(|| vec!["-i=dap".into()]);
ensure_dap_interface(args)
};
let mut configuration = config.config.clone();
if let Some(configuration) = configuration.as_object_mut() {
@@ -187,10 +234,26 @@ impl DebugAdapter for GdbDebugAdapter {
.or_insert_with(|| delegate.worktree_root_path().to_string_lossy().into());
}
let mut base_env = delegate.shell_env().await;
base_env.extend(user_env.unwrap_or_default());
let config_env: HashMap<String, String> = config
.config
.get("env")
.and_then(|v| v.as_object())
.map(|obj| {
obj.iter()
.filter_map(|(k, v)| v.as_str().map(|s| (k.clone(), s.to_string())))
.collect::<HashMap<String, String>>()
})
.unwrap_or_else(HashMap::default);
base_env.extend(config_env);
Ok(DebugAdapterBinary {
command: Some(gdb_path),
arguments: user_args.unwrap_or_else(|| vec!["-i=dap".into()]),
envs: user_env.unwrap_or_default(),
arguments: gdb_args,
envs: base_env,
cwd: Some(delegate.worktree_root_path().to_path_buf()),
connection: None,
request_args: StartDebuggingRequestArguments {

View File

@@ -23,6 +23,11 @@ use std::{
use util::command::new_smol_command;
use util::{ResultExt, paths::PathStyle, rel_path::RelPath};
enum DebugpyLaunchMode<'a> {
Normal,
AttachWithConnect { host: Option<&'a str> },
}
#[derive(Default)]
pub(crate) struct PythonDebugAdapter {
base_venv_path: OnceCell<Result<Arc<Path>, String>>,
@@ -36,10 +41,11 @@ impl PythonDebugAdapter {
const LANGUAGE_NAME: &'static str = "Python";
async fn generate_debugpy_arguments(
host: &Ipv4Addr,
async fn generate_debugpy_arguments<'a>(
host: &'a Ipv4Addr,
port: u16,
user_installed_path: Option<&Path>,
launch_mode: DebugpyLaunchMode<'a>,
user_installed_path: Option<&'a Path>,
user_args: Option<Vec<String>>,
) -> Result<Vec<String>> {
let mut args = if let Some(user_installed_path) = user_installed_path {
@@ -62,7 +68,20 @@ impl PythonDebugAdapter {
args.extend(if let Some(args) = user_args {
args
} else {
vec![format!("--host={}", host), format!("--port={}", port)]
match launch_mode {
DebugpyLaunchMode::Normal => {
vec![format!("--host={}", host), format!("--port={}", port)]
}
DebugpyLaunchMode::AttachWithConnect { host } => {
let mut args = vec!["connect".to_string()];
if let Some(host) = host {
args.push(format!("{host}:"));
}
args.push(format!("{port}"));
args
}
}
});
Ok(args)
}
@@ -315,7 +334,46 @@ impl PythonDebugAdapter {
user_env: Option<HashMap<String, String>>,
python_from_toolchain: Option<String>,
) -> Result<DebugAdapterBinary> {
let tcp_connection = config.tcp_connection.clone().unwrap_or_default();
let mut tcp_connection = config.tcp_connection.clone().unwrap_or_default();
let (config_port, config_host) = config
.config
.get("connect")
.map(|value| {
(
value
.get("port")
.and_then(|val| val.as_u64().map(|p| p as u16)),
value.get("host").and_then(|val| val.as_str()),
)
})
.unwrap_or_else(|| {
(
config
.config
.get("port")
.and_then(|port| port.as_u64().map(|p| p as u16)),
config.config.get("host").and_then(|host| host.as_str()),
)
});
let is_attach_with_connect = if config
.config
.get("request")
.is_some_and(|val| val.as_str().is_some_and(|request| request == "attach"))
{
if tcp_connection.host.is_some() && config_host.is_some() {
bail!("Cannot have two different hosts in debug configuration")
} else if tcp_connection.port.is_some() && config_port.is_some() {
bail!("Cannot have two different ports in debug configuration")
}
tcp_connection.port = config_port;
DebugpyLaunchMode::AttachWithConnect { host: config_host }
} else {
DebugpyLaunchMode::Normal
};
let (host, port, timeout) = crate::configure_tcp_connection(tcp_connection).await?;
let python_path = if let Some(toolchain) = python_from_toolchain {
@@ -330,6 +388,7 @@ impl PythonDebugAdapter {
let arguments = Self::generate_debugpy_arguments(
&host,
port,
is_attach_with_connect,
user_installed_path.as_deref(),
user_args,
)
@@ -765,29 +824,58 @@ impl DebugAdapter for PythonDebugAdapter {
.await;
}
let base_path = config
.config
.get("cwd")
.and_then(|cwd| {
RelPath::new(
cwd.as_str()
.map(Path::new)?
.strip_prefix(delegate.worktree_root_path())
.ok()?,
PathStyle::local(),
)
.ok()
let base_paths = ["cwd", "program", "module"]
.into_iter()
.filter_map(|key| {
config.config.get(key).and_then(|cwd| {
RelPath::new(
cwd.as_str()
.map(Path::new)?
.strip_prefix(delegate.worktree_root_path())
.ok()?,
PathStyle::local(),
)
.ok()
})
})
.unwrap_or_else(|| RelPath::empty().into());
let toolchain = delegate
.toolchain_store()
.active_toolchain(
delegate.worktree_id(),
base_path.into_arc(),
language::LanguageName::new(Self::LANGUAGE_NAME),
cx,
.chain(
// While Debugpy's wiki says absolute paths are required, it actually supports relative paths when cwd is passed in
// (which should always be the case, because Zed defaults the cwd to the worktree root).
// So we want to check that these relative paths find toolchains as well; otherwise they won't be checked,
// because the strip_prefix in the iteration above will return an error.
config
.config
.get("cwd")
.map(|_| {
["program", "module"].into_iter().filter_map(|key| {
config.config.get(key).and_then(|value| {
let path = Path::new(value.as_str()?);
RelPath::new(path, PathStyle::local()).ok()
})
})
})
.into_iter()
.flatten(),
)
.await;
.chain([RelPath::empty().into()]);
let mut toolchain = None;
for base_path in base_paths {
if let Some(found_toolchain) = delegate
.toolchain_store()
.active_toolchain(
delegate.worktree_id(),
base_path.into_arc(),
language::LanguageName::new(Self::LANGUAGE_NAME),
cx,
)
.await
{
toolchain = Some(found_toolchain);
break;
}
}
self.fetch_debugpy_whl(toolchain.clone(), delegate)
.await
@@ -824,7 +912,148 @@ mod tests {
use util::path;
use super::*;
use std::{net::Ipv4Addr, path::PathBuf};
use task::TcpArgumentsTemplate;
#[gpui::test]
async fn test_tcp_connection_conflict_with_connect_args() {
let adapter = PythonDebugAdapter {
base_venv_path: OnceCell::new(),
debugpy_whl_base_path: OnceCell::new(),
};
let config_with_port_conflict = json!({
"request": "attach",
"connect": {
"port": 5679
}
});
let tcp_connection = TcpArgumentsTemplate {
host: None,
port: Some(5678),
timeout: None,
};
let task_def = DebugTaskDefinition {
label: "test".into(),
adapter: PythonDebugAdapter::ADAPTER_NAME.into(),
config: config_with_port_conflict,
tcp_connection: Some(tcp_connection.clone()),
};
let result = adapter
.get_installed_binary(
&test_mocks::MockDelegate::new(),
&task_def,
None,
None,
None,
Some("python3".to_string()),
)
.await;
assert!(result.is_err());
assert!(
result
.unwrap_err()
.to_string()
.contains("Cannot have two different ports")
);
let host = Ipv4Addr::new(127, 0, 0, 1);
let config_with_host_conflict = json!({
"request": "attach",
"connect": {
"host": "192.168.1.1",
"port": 5678
}
});
let tcp_connection_with_host = TcpArgumentsTemplate {
host: Some(host),
port: None,
timeout: None,
};
let task_def_host = DebugTaskDefinition {
label: "test".into(),
adapter: PythonDebugAdapter::ADAPTER_NAME.into(),
config: config_with_host_conflict,
tcp_connection: Some(tcp_connection_with_host),
};
let result_host = adapter
.get_installed_binary(
&test_mocks::MockDelegate::new(),
&task_def_host,
None,
None,
None,
Some("python3".to_string()),
)
.await;
assert!(result_host.is_err());
assert!(
result_host
.unwrap_err()
.to_string()
.contains("Cannot have two different hosts")
);
}
#[gpui::test]
async fn test_attach_with_connect_mode_generates_correct_arguments() {
let host = Ipv4Addr::new(127, 0, 0, 1);
let port = 5678;
let args_without_host = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::AttachWithConnect { host: None },
None,
None,
)
.await
.unwrap();
let expected_suffix = path!("debug_adapters/Debugpy/debugpy/adapter");
assert!(args_without_host[0].ends_with(expected_suffix));
assert_eq!(args_without_host[1], "connect");
assert_eq!(args_without_host[2], "5678");
let args_with_host = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::AttachWithConnect {
host: Some("192.168.1.100"),
},
None,
None,
)
.await
.unwrap();
assert!(args_with_host[0].ends_with(expected_suffix));
assert_eq!(args_with_host[1], "connect");
assert_eq!(args_with_host[2], "192.168.1.100:");
assert_eq!(args_with_host[3], "5678");
let args_normal = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::Normal,
None,
None,
)
.await
.unwrap();
assert!(args_normal[0].ends_with(expected_suffix));
assert_eq!(args_normal[1], "--host=127.0.0.1");
assert_eq!(args_normal[2], "--port=5678");
assert!(!args_normal.contains(&"connect".to_string()));
}
#[gpui::test]
async fn test_debugpy_install_path_cases() {
@@ -833,15 +1062,25 @@ mod tests {
// Case 1: User-defined debugpy path (highest precedence)
let user_path = PathBuf::from("/custom/path/to/debugpy/src/debugpy/adapter");
let user_args =
PythonDebugAdapter::generate_debugpy_arguments(&host, port, Some(&user_path), None)
.await
.unwrap();
let user_args = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::Normal,
Some(&user_path),
None,
)
.await
.unwrap();
// Case 2: Venv-installed debugpy (uses -m debugpy.adapter)
let venv_args = PythonDebugAdapter::generate_debugpy_arguments(&host, port, None, None)
.await
.unwrap();
let venv_args = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::Normal,
None,
None,
)
.await
.unwrap();
assert_eq!(user_args[0], "/custom/path/to/debugpy/src/debugpy/adapter");
assert_eq!(user_args[1], "--host=127.0.0.1");
@@ -856,6 +1095,7 @@ mod tests {
let user_args = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::Normal,
Some(&user_path),
Some(vec!["foo".into()]),
)
@@ -864,6 +1104,7 @@ mod tests {
let venv_args = PythonDebugAdapter::generate_debugpy_arguments(
&host,
port,
DebugpyLaunchMode::Normal,
None,
Some(vec!["foo".into()]),
)
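
For readability, here is a self-contained sketch of the argument shapes the tests above assert for each launch mode. The names `LaunchMode` and `adapter_args` are illustrative stand-ins, not the adapter's real API; the actual `generate_debugpy_arguments` also resolves the debugpy adapter path and appends user-supplied arguments.

enum LaunchMode<'a> {
    Normal,
    AttachWithConnect { host: Option<&'a str> },
}

// Mirrors only the shapes asserted above: normal mode passes --host/--port,
// connect mode passes "connect" followed by an optional host and the port.
fn adapter_args(adapter_path: &str, host: &str, port: u16, mode: LaunchMode) -> Vec<String> {
    let mut args = vec![adapter_path.to_string()];
    match mode {
        LaunchMode::Normal => {
            args.push(format!("--host={host}"));
            args.push(format!("--port={port}"));
        }
        LaunchMode::AttachWithConnect { host: Some(remote) } => {
            args.push("connect".to_string());
            args.push(format!("{remote}:"));
            args.push(port.to_string());
        }
        LaunchMode::AttachWithConnect { host: None } => {
            args.push("connect".to_string());
            args.push(port.to_string());
        }
    }
    args
}

fn main() {
    let args = adapter_args("debug_adapters/Debugpy/debugpy/adapter", "127.0.0.1", 5678, LaunchMode::Normal);
    assert_eq!(args[1], "--host=127.0.0.1");
    assert_eq!(args[2], "--port=5678");
}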

View File

@@ -697,6 +697,7 @@ impl Render for NewProcessModal {
.justify_between()
.border_t_1()
.border_color(cx.theme().colors().border_variant);
let secondary_action = menu::SecondaryConfirm.boxed_clone();
match self.mode {
NewProcessMode::Launch => el.child(
container
@@ -706,6 +707,7 @@ impl Render for NewProcessModal {
.on_click(cx.listener(|this, _, window, cx| {
this.save_debug_scenario(window, cx);
}))
.key_binding(KeyBinding::for_action(&*secondary_action, cx))
.disabled(
self.debugger.is_none()
|| self
@@ -749,7 +751,6 @@ impl Render for NewProcessModal {
container
.child(div().child({
Button::new("edit-attach-task", "Edit in debug.json")
.label_size(LabelSize::Small)
.key_binding(KeyBinding::for_action(&*secondary_action, cx))
.on_click(move |_, window, cx| {
window.dispatch_action(secondary_action.boxed_clone(), cx)
@@ -1192,7 +1193,7 @@ impl PickerDelegate for DebugDelegate {
}
fn placeholder_text(&self, _window: &mut Window, _cx: &mut App) -> std::sync::Arc<str> {
"Find a debug task, or debug a command.".into()
"Find a debug task, or debug a command".into()
}
fn update_matches(
@@ -1453,18 +1454,17 @@ impl PickerDelegate for DebugDelegate {
.child({
let action = menu::SecondaryConfirm.boxed_clone();
if self.matches.is_empty() {
Button::new("edit-debug-json", "Edit debug.json")
.label_size(LabelSize::Small)
.on_click(cx.listener(|_picker, _, window, cx| {
Button::new("edit-debug-json", "Edit debug.json").on_click(cx.listener(
|_picker, _, window, cx| {
window.dispatch_action(
zed_actions::OpenProjectDebugTasks.boxed_clone(),
cx,
);
cx.emit(DismissEvent);
}))
},
))
} else {
Button::new("edit-debug-task", "Edit in debug.json")
.label_size(LabelSize::Small)
.key_binding(KeyBinding::for_action(&*action, cx))
.on_click(move |_, window, cx| {
window.dispatch_action(action.boxed_clone(), cx)

View File

@@ -6,7 +6,10 @@ use alacritty_terminal::vte::ansi;
use anyhow::Result;
use collections::HashMap;
use dap::{CompletionItem, CompletionItemType, OutputEvent};
use editor::{Bias, CompletionProvider, Editor, EditorElement, EditorStyle, ExcerptId};
use editor::{
Bias, CompletionProvider, Editor, EditorElement, EditorMode, EditorStyle, ExcerptId,
SizingBehavior,
};
use fuzzy::StringMatchCandidate;
use gpui::{
Action as _, AppContext, Context, Corner, Entity, FocusHandle, Focusable, HighlightStyle, Hsla,
@@ -59,6 +62,11 @@ impl Console {
) -> Self {
let console = cx.new(|cx| {
let mut editor = Editor::multi_line(window, cx);
editor.set_mode(EditorMode::Full {
scale_ui_elements_with_buffer_font_size: true,
show_active_line_background: true,
sizing_behavior: SizingBehavior::ExcludeOverscrollMargin,
});
editor.move_to_end(&editor::actions::MoveToEnd, window, cx);
editor.set_read_only(true);
editor.disable_scrollbars_and_minimap(window, cx);

View File

@@ -34,6 +34,7 @@ theme.workspace = true
ui.workspace = true
util.workspace = true
workspace.workspace = true
itertools.workspace = true
[dev-dependencies]
client = { workspace = true, features = ["test-support"] }

View File

@@ -1,5 +1,5 @@
use crate::{
DIAGNOSTICS_UPDATE_DELAY, IncludeWarnings, ToggleWarnings, context_range_for_entry,
DIAGNOSTICS_UPDATE_DEBOUNCE, IncludeWarnings, ToggleWarnings, context_range_for_entry,
diagnostic_renderer::{DiagnosticBlock, DiagnosticRenderer},
toolbar_controls::DiagnosticsToolbarEditor,
};
@@ -283,7 +283,7 @@ impl BufferDiagnosticsEditor {
self.update_excerpts_task = Some(cx.spawn_in(window, async move |editor, cx| {
cx.background_executor()
.timer(DIAGNOSTICS_UPDATE_DELAY)
.timer(DIAGNOSTICS_UPDATE_DEBOUNCE)
.await;
if let Some(buffer) = buffer {
@@ -938,10 +938,6 @@ impl DiagnosticsToolbarEditor for WeakEntity<BufferDiagnosticsEditor> {
.unwrap_or(false)
}
fn has_stale_excerpts(&self, _cx: &App) -> bool {
false
}
fn is_updating(&self, cx: &App) -> bool {
self.read_with(cx, |buffer_diagnostics_editor, cx| {
buffer_diagnostics_editor.update_excerpts_task.is_some()

View File

@@ -9,7 +9,7 @@ mod diagnostics_tests;
use anyhow::Result;
use buffer_diagnostics::BufferDiagnosticsEditor;
use collections::{BTreeSet, HashMap};
use collections::{BTreeSet, HashMap, HashSet};
use diagnostic_renderer::DiagnosticBlock;
use editor::{
Editor, EditorEvent, ExcerptRange, MultiBuffer, PathKey,
@@ -17,10 +17,11 @@ use editor::{
multibuffer_context_lines,
};
use gpui::{
AnyElement, AnyView, App, AsyncApp, Context, Entity, EventEmitter, FocusHandle, Focusable,
Global, InteractiveElement, IntoElement, ParentElement, Render, SharedString, Styled,
Subscription, Task, WeakEntity, Window, actions, div,
AnyElement, AnyView, App, AsyncApp, Context, Entity, EventEmitter, FocusHandle, FocusOutEvent,
Focusable, Global, InteractiveElement, IntoElement, ParentElement, Render, SharedString,
Styled, Subscription, Task, WeakEntity, Window, actions, div,
};
use itertools::Itertools as _;
use language::{
Bias, Buffer, BufferRow, BufferSnapshot, DiagnosticEntry, DiagnosticEntryRef, Point,
ToTreeSitterPoint,
@@ -32,7 +33,7 @@ use project::{
use settings::Settings;
use std::{
any::{Any, TypeId},
cmp::{self, Ordering},
cmp,
ops::{Range, RangeInclusive},
sync::Arc,
time::Duration,
@@ -89,8 +90,8 @@ pub(crate) struct ProjectDiagnosticsEditor {
impl EventEmitter<EditorEvent> for ProjectDiagnosticsEditor {}
const DIAGNOSTICS_UPDATE_DELAY: Duration = Duration::from_millis(50);
const DIAGNOSTICS_SUMMARY_UPDATE_DELAY: Duration = Duration::from_millis(30);
const DIAGNOSTICS_UPDATE_DEBOUNCE: Duration = Duration::from_millis(50);
const DIAGNOSTICS_SUMMARY_UPDATE_DEBOUNCE: Duration = Duration::from_millis(30);
impl Render for ProjectDiagnosticsEditor {
fn render(&mut self, _: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
@@ -149,6 +150,12 @@ impl Render for ProjectDiagnosticsEditor {
}
}
#[derive(PartialEq, Eq, Copy, Clone, Debug)]
enum RetainExcerpts {
Yes,
No,
}
impl ProjectDiagnosticsEditor {
pub fn register(
workspace: &mut Workspace,
@@ -165,14 +172,21 @@ impl ProjectDiagnosticsEditor {
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let project_event_subscription =
cx.subscribe_in(&project_handle, window, |this, _project, event, window, cx| match event {
let project_event_subscription = cx.subscribe_in(
&project_handle,
window,
|this, _project, event, window, cx| match event {
project::Event::DiskBasedDiagnosticsStarted { .. } => {
cx.notify();
}
project::Event::DiskBasedDiagnosticsFinished { language_server_id } => {
log::debug!("disk based diagnostics finished for server {language_server_id}");
this.update_stale_excerpts(window, cx);
this.close_diagnosticless_buffers(
window,
cx,
this.editor.focus_handle(cx).contains_focused(window, cx)
|| this.focus_handle.contains_focused(window, cx),
);
}
project::Event::DiagnosticsUpdated {
language_server_id,
@@ -181,34 +195,39 @@ impl ProjectDiagnosticsEditor {
this.paths_to_update.extend(paths.clone());
this.diagnostic_summary_update = cx.spawn(async move |this, cx| {
cx.background_executor()
.timer(DIAGNOSTICS_SUMMARY_UPDATE_DELAY)
.timer(DIAGNOSTICS_SUMMARY_UPDATE_DEBOUNCE)
.await;
this.update(cx, |this, cx| {
this.update_diagnostic_summary(cx);
})
.log_err();
});
cx.emit(EditorEvent::TitleChanged);
if this.editor.focus_handle(cx).contains_focused(window, cx) || this.focus_handle.contains_focused(window, cx) {
log::debug!("diagnostics updated for server {language_server_id}, paths {paths:?}. recording change");
} else {
log::debug!("diagnostics updated for server {language_server_id}, paths {paths:?}. updating excerpts");
this.update_stale_excerpts(window, cx);
}
log::debug!(
"diagnostics updated for server {language_server_id}, \
paths {paths:?}. updating excerpts"
);
let focused = this.editor.focus_handle(cx).contains_focused(window, cx)
|| this.focus_handle.contains_focused(window, cx);
this.update_stale_excerpts(
if focused {
RetainExcerpts::Yes
} else {
RetainExcerpts::No
},
window,
cx,
);
}
_ => {}
});
},
);
let focus_handle = cx.focus_handle();
cx.on_focus_in(&focus_handle, window, |this, window, cx| {
this.focus_in(window, cx)
})
.detach();
cx.on_focus_out(&focus_handle, window, |this, _event, window, cx| {
this.focus_out(window, cx)
})
.detach();
cx.on_focus_in(&focus_handle, window, Self::focus_in)
.detach();
cx.on_focus_out(&focus_handle, window, Self::focus_out)
.detach();
let excerpts = cx.new(|cx| MultiBuffer::new(project_handle.read(cx).capability()));
let editor = cx.new(|cx| {
@@ -238,8 +257,11 @@ impl ProjectDiagnosticsEditor {
window.focus(&this.focus_handle);
}
}
EditorEvent::Blurred => this.update_stale_excerpts(window, cx),
EditorEvent::Saved => this.update_stale_excerpts(window, cx),
EditorEvent::Blurred => this.close_diagnosticless_buffers(window, cx, false),
EditorEvent::Saved => this.close_diagnosticless_buffers(window, cx, true),
EditorEvent::SelectionsChanged { .. } => {
this.close_diagnosticless_buffers(window, cx, true)
}
_ => {}
}
},
@@ -283,15 +305,67 @@ impl ProjectDiagnosticsEditor {
this
}
fn update_stale_excerpts(&mut self, window: &mut Window, cx: &mut Context<Self>) {
if self.update_excerpts_task.is_some() || self.multibuffer.read(cx).is_dirty(cx) {
/// Closes all excerpts of buffers that:
/// - have no diagnostics anymore
/// - are saved (not dirty)
/// - and, if `retain_selections` is true, do not have selections within them
fn close_diagnosticless_buffers(
&mut self,
_window: &mut Window,
cx: &mut Context<Self>,
retain_selections: bool,
) {
let buffer_ids = self.multibuffer.read(cx).all_buffer_ids();
let selected_buffers = self.editor.update(cx, |editor, cx| {
editor
.selections
.all_anchors(cx)
.iter()
.filter_map(|anchor| anchor.start.buffer_id)
.collect::<HashSet<_>>()
});
for buffer_id in buffer_ids {
if retain_selections && selected_buffers.contains(&buffer_id) {
continue;
}
let has_diagnostic_blocks = self
.blocks
.get(&buffer_id)
.is_some_and(|blocks| !blocks.is_empty());
if has_diagnostic_blocks {
continue;
}
let is_dirty = self
.multibuffer
.read(cx)
.buffer(buffer_id)
.is_some_and(|buffer| buffer.read(cx).is_dirty());
if is_dirty {
continue;
}
self.multibuffer.update(cx, |b, cx| {
b.remove_excerpts_for_buffer(buffer_id, cx);
});
}
}
fn update_stale_excerpts(
&mut self,
mut retain_excerpts: RetainExcerpts,
window: &mut Window,
cx: &mut Context<Self>,
) {
if self.update_excerpts_task.is_some() {
return;
}
if self.multibuffer.read(cx).is_dirty(cx) {
retain_excerpts = RetainExcerpts::Yes;
}
let project_handle = self.project.clone();
self.update_excerpts_task = Some(cx.spawn_in(window, async move |this, cx| {
cx.background_executor()
.timer(DIAGNOSTICS_UPDATE_DELAY)
.timer(DIAGNOSTICS_UPDATE_DEBOUNCE)
.await;
loop {
let Some(path) = this.update(cx, |this, cx| {
@@ -312,7 +386,7 @@ impl ProjectDiagnosticsEditor {
.log_err()
{
this.update_in(cx, |this, window, cx| {
this.update_excerpts(buffer, window, cx)
this.update_excerpts(buffer, retain_excerpts, window, cx)
})?
.await?;
}
@@ -378,10 +452,10 @@ impl ProjectDiagnosticsEditor {
}
}
fn focus_out(&mut self, window: &mut Window, cx: &mut Context<Self>) {
fn focus_out(&mut self, _: FocusOutEvent, window: &mut Window, cx: &mut Context<Self>) {
if !self.focus_handle.is_focused(window) && !self.editor.focus_handle(cx).is_focused(window)
{
self.update_stale_excerpts(window, cx);
self.close_diagnosticless_buffers(window, cx, false);
}
}
@@ -403,12 +477,13 @@ impl ProjectDiagnosticsEditor {
});
}
}
multibuffer.clear(cx);
});
self.paths_to_update = project_paths;
});
self.update_stale_excerpts(window, cx);
self.update_stale_excerpts(RetainExcerpts::No, window, cx);
}
fn diagnostics_are_unchanged(
@@ -431,6 +506,7 @@ impl ProjectDiagnosticsEditor {
fn update_excerpts(
&mut self,
buffer: Entity<Buffer>,
retain_excerpts: RetainExcerpts,
window: &mut Window,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
@@ -497,24 +573,27 @@ impl ProjectDiagnosticsEditor {
)
})?;
for item in more {
let i = blocks
.binary_search_by(|probe| {
probe
.initial_range
.start
.cmp(&item.initial_range.start)
.then(probe.initial_range.end.cmp(&item.initial_range.end))
.then(Ordering::Greater)
})
.unwrap_or_else(|i| i);
blocks.insert(i, item);
}
blocks.extend(more);
}
let mut excerpt_ranges: Vec<ExcerptRange<Point>> = Vec::new();
let mut excerpt_ranges: Vec<ExcerptRange<Point>> = match retain_excerpts {
RetainExcerpts::No => Vec::new(),
RetainExcerpts::Yes => this.update(cx, |this, cx| {
this.multibuffer.update(cx, |multi_buffer, cx| {
multi_buffer
.excerpts_for_buffer(buffer_id, cx)
.into_iter()
.map(|(_, range)| ExcerptRange {
context: range.context.to_point(&buffer_snapshot),
primary: range.primary.to_point(&buffer_snapshot),
})
.collect()
})
})?,
};
let mut result_blocks = vec![None; excerpt_ranges.len()];
let context_lines = cx.update(|_, cx| multibuffer_context_lines(cx))?;
for b in blocks.iter() {
for b in blocks {
let excerpt_range = context_range_for_entry(
b.initial_range.clone(),
context_lines,
@@ -541,7 +620,8 @@ impl ProjectDiagnosticsEditor {
context: excerpt_range,
primary: b.initial_range.clone(),
},
)
);
result_blocks.insert(i, Some(b));
}
this.update_in(cx, |this, window, cx| {
@@ -562,7 +642,7 @@ impl ProjectDiagnosticsEditor {
)
});
#[cfg(test)]
let cloned_blocks = blocks.clone();
let cloned_blocks = result_blocks.clone();
if was_empty && let Some(anchor_range) = anchor_ranges.first() {
let range_to_select = anchor_range.start..anchor_range.start;
@@ -576,22 +656,20 @@ impl ProjectDiagnosticsEditor {
}
}
let editor_blocks =
anchor_ranges
.into_iter()
.zip(blocks.into_iter())
.map(|(anchor, block)| {
let editor = this.editor.downgrade();
BlockProperties {
placement: BlockPlacement::Near(anchor.start),
height: Some(1),
style: BlockStyle::Flex,
render: Arc::new(move |bcx| {
block.render_block(editor.clone(), bcx)
}),
priority: 1,
}
});
let editor_blocks = anchor_ranges
.into_iter()
.zip_eq(result_blocks.into_iter())
.filter_map(|(anchor, block)| {
let block = block?;
let editor = this.editor.downgrade();
Some(BlockProperties {
placement: BlockPlacement::Near(anchor.start),
height: Some(1),
style: BlockStyle::Flex,
render: Arc::new(move |bcx| block.render_block(editor.clone(), bcx)),
priority: 1,
})
});
let block_ids = this.editor.update(cx, |editor, cx| {
editor.display_map.update(cx, |display_map, cx| {
@@ -601,7 +679,9 @@ impl ProjectDiagnosticsEditor {
#[cfg(test)]
{
for (block_id, block) in block_ids.iter().zip(cloned_blocks.iter()) {
for (block_id, block) in
block_ids.iter().zip(cloned_blocks.into_iter().flatten())
{
let markdown = block.markdown.clone();
editor::test::set_block_content_for_tests(
&this.editor,
@@ -626,6 +706,7 @@ impl ProjectDiagnosticsEditor {
fn update_diagnostic_summary(&mut self, cx: &mut Context<Self>) {
self.summary = self.project.read(cx).diagnostic_summary(false, cx);
cx.emit(EditorEvent::TitleChanged);
}
}
@@ -843,13 +924,6 @@ impl DiagnosticsToolbarEditor for WeakEntity<ProjectDiagnosticsEditor> {
.unwrap_or(false)
}
fn has_stale_excerpts(&self, cx: &App) -> bool {
self.read_with(cx, |project_diagnostics_editor, _cx| {
!project_diagnostics_editor.paths_to_update.is_empty()
})
.unwrap_or(false)
}
fn is_updating(&self, cx: &App) -> bool {
self.read_with(cx, |project_diagnostics_editor, cx| {
project_diagnostics_editor.update_excerpts_task.is_some()
@@ -1010,12 +1084,6 @@ async fn heuristic_syntactic_expand(
return;
}
}
log::info!(
"Expanding to ancestor started on {} node\
exceeding row limit of {max_row_count}.",
node.grammar_name()
);
*ancestor_range = Some(None);
}
})
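
To make the new retention behavior easier to follow, here is a compact, hedged restatement of when existing excerpts are kept (a hypothetical helper, not code from this diff): the diagnostics-updated handler passes `RetainExcerpts::Yes` while the view is focused, and `update_stale_excerpts` upgrades `No` to `Yes` whenever the multibuffer still has unsaved edits.

// Hedged sketch of the combined effect; `focused` and `multibuffer_is_dirty`
// stand in for the checks made in the event handler and in update_stale_excerpts.
fn effective_retain(focused: bool, multibuffer_is_dirty: bool) -> RetainExcerpts {
    if focused || multibuffer_is_dirty {
        RetainExcerpts::Yes
    } else {
        RetainExcerpts::No
    }
}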

View File

@@ -119,7 +119,7 @@ async fn test_diagnostics(cx: &mut TestAppContext) {
let editor = diagnostics.update(cx, |diagnostics, _| diagnostics.editor.clone());
diagnostics
.next_notification(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10), cx)
.next_notification(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10), cx)
.await;
pretty_assertions::assert_eq!(
@@ -190,7 +190,7 @@ async fn test_diagnostics(cx: &mut TestAppContext) {
});
diagnostics
.next_notification(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10), cx)
.next_notification(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10), cx)
.await;
pretty_assertions::assert_eq!(
@@ -277,7 +277,7 @@ async fn test_diagnostics(cx: &mut TestAppContext) {
});
diagnostics
.next_notification(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10), cx)
.next_notification(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10), cx)
.await;
pretty_assertions::assert_eq!(
@@ -391,7 +391,7 @@ async fn test_diagnostics_with_folds(cx: &mut TestAppContext) {
// Only the first language server's diagnostics are shown.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.executor().run_until_parked();
editor.update_in(cx, |editor, window, cx| {
editor.fold_ranges(vec![Point::new(0, 0)..Point::new(3, 0)], false, window, cx);
@@ -490,7 +490,7 @@ async fn test_diagnostics_multiple_servers(cx: &mut TestAppContext) {
// Only the first language server's diagnostics are shown.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.executor().run_until_parked();
pretty_assertions::assert_eq!(
@@ -530,7 +530,7 @@ async fn test_diagnostics_multiple_servers(cx: &mut TestAppContext) {
// Both language server's diagnostics are shown.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.executor().run_until_parked();
pretty_assertions::assert_eq!(
@@ -587,7 +587,7 @@ async fn test_diagnostics_multiple_servers(cx: &mut TestAppContext) {
// Only the first language server's diagnostics are updated.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.executor().run_until_parked();
pretty_assertions::assert_eq!(
@@ -629,7 +629,7 @@ async fn test_diagnostics_multiple_servers(cx: &mut TestAppContext) {
// Both language servers' diagnostics are updated.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.executor().run_until_parked();
pretty_assertions::assert_eq!(
@@ -760,7 +760,7 @@ async fn test_random_diagnostics_blocks(cx: &mut TestAppContext, mut rng: StdRng
.unwrap()
});
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.run_until_parked();
}
@@ -769,7 +769,7 @@ async fn test_random_diagnostics_blocks(cx: &mut TestAppContext, mut rng: StdRng
log::info!("updating mutated diagnostics view");
mutated_diagnostics.update_in(cx, |diagnostics, window, cx| {
diagnostics.update_stale_excerpts(window, cx)
diagnostics.update_stale_excerpts(RetainExcerpts::No, window, cx)
});
log::info!("constructing reference diagnostics view");
@@ -777,7 +777,7 @@ async fn test_random_diagnostics_blocks(cx: &mut TestAppContext, mut rng: StdRng
ProjectDiagnosticsEditor::new(true, project.clone(), workspace.downgrade(), window, cx)
});
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.run_until_parked();
let mutated_excerpts =
@@ -789,7 +789,12 @@ async fn test_random_diagnostics_blocks(cx: &mut TestAppContext, mut rng: StdRng
// The mutated view may contain more than the reference view as
// we don't currently shrink excerpts when diagnostics were removed.
let mut ref_iter = reference_excerpts.lines().filter(|line| *line != "§ -----");
let mut ref_iter = reference_excerpts.lines().filter(|line| {
// ignore § ----- and § <file>.rs
!line.starts_with('§')
|| line.starts_with("§ diagnostic")
|| line.starts_with("§ related info")
});
let mut next_ref_line = ref_iter.next();
let mut skipped_block = false;
@@ -797,7 +802,12 @@ async fn test_random_diagnostics_blocks(cx: &mut TestAppContext, mut rng: StdRng
if let Some(ref_line) = next_ref_line {
if mut_line == ref_line {
next_ref_line = ref_iter.next();
} else if mut_line.contains('§') && mut_line != "§ -----" {
} else if mut_line.contains('§')
// ignore § ----- and § <file>.rs
&& (!mut_line.starts_with('§')
|| mut_line.starts_with("§ diagnostic")
|| mut_line.starts_with("§ related info"))
{
skipped_block = true;
}
}
@@ -877,7 +887,7 @@ async fn test_random_diagnostics_with_inlays(cx: &mut TestAppContext, mut rng: S
vec![Inlay::edit_prediction(
post_inc(&mut next_inlay_id),
snapshot.buffer_snapshot().anchor_before(position),
Rope::from_iter_small(["Test inlay ", "next_inlay_id"]),
Rope::from_iter(["Test inlay ", "next_inlay_id"]),
)],
cx,
);
@@ -949,7 +959,7 @@ async fn test_random_diagnostics_with_inlays(cx: &mut TestAppContext, mut rng: S
.unwrap()
});
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.run_until_parked();
}
@@ -958,11 +968,11 @@ async fn test_random_diagnostics_with_inlays(cx: &mut TestAppContext, mut rng: S
log::info!("updating mutated diagnostics view");
mutated_diagnostics.update_in(cx, |diagnostics, window, cx| {
diagnostics.update_stale_excerpts(window, cx)
diagnostics.update_stale_excerpts(RetainExcerpts::No, window, cx)
});
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
cx.run_until_parked();
}
@@ -1427,7 +1437,7 @@ async fn test_diagnostics_with_code(cx: &mut TestAppContext) {
let editor = diagnostics.update(cx, |diagnostics, _| diagnostics.editor.clone());
diagnostics
.next_notification(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10), cx)
.next_notification(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10), cx)
.await;
// Verify that the diagnostic codes are displayed correctly
@@ -1704,7 +1714,7 @@ async fn test_buffer_diagnostics(cx: &mut TestAppContext) {
// wait a little bit to ensure that the buffer diagnostic's editor content
// is rendered.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
pretty_assertions::assert_eq!(
editor_content_with_blocks(&editor, cx),
@@ -1837,7 +1847,7 @@ async fn test_buffer_diagnostics_without_warnings(cx: &mut TestAppContext) {
// wait a little bit to ensure that the buffer diagnostic's editor content
// is rendered.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
pretty_assertions::assert_eq!(
editor_content_with_blocks(&editor, cx),
@@ -1971,7 +1981,7 @@ async fn test_buffer_diagnostics_multiple_servers(cx: &mut TestAppContext) {
// wait a little bit to ensure that the buffer diagnostic's editor content
// is rendered.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
.advance_clock(DIAGNOSTICS_UPDATE_DEBOUNCE + Duration::from_millis(10));
pretty_assertions::assert_eq!(
editor_content_with_blocks(&editor, cx),
@@ -2070,7 +2080,7 @@ fn random_lsp_diagnostic(
const ERROR_MARGIN: usize = 10;
let file_content = fs.read_file_sync(path).unwrap();
let file_text = Rope::from_str_small(String::from_utf8_lossy(&file_content).as_ref());
let file_text = Rope::from(String::from_utf8_lossy(&file_content).as_ref());
let start = rng.random_range(0..file_text.len().saturating_add(ERROR_MARGIN));
let end = rng.random_range(start..file_text.len().saturating_add(ERROR_MARGIN));

View File

@@ -16,9 +16,6 @@ pub(crate) trait DiagnosticsToolbarEditor: Send + Sync {
/// Toggles whether warning diagnostics should be displayed by the
/// diagnostics editor.
fn toggle_warnings(&self, window: &mut Window, cx: &mut App);
/// Indicates whether any of the excerpts displayed by the diagnostics
/// editor are stale.
fn has_stale_excerpts(&self, cx: &App) -> bool;
/// Indicates whether the diagnostics editor is currently updating the
/// diagnostics.
fn is_updating(&self, cx: &App) -> bool;
@@ -37,14 +34,12 @@ pub(crate) trait DiagnosticsToolbarEditor: Send + Sync {
impl Render for ToolbarControls {
fn render(&mut self, _: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let mut has_stale_excerpts = false;
let mut include_warnings = false;
let mut is_updating = false;
match &self.editor {
Some(editor) => {
include_warnings = editor.include_warnings(cx);
has_stale_excerpts = editor.has_stale_excerpts(cx);
is_updating = editor.is_updating(cx);
}
None => {}
@@ -86,7 +81,6 @@ impl Render for ToolbarControls {
IconButton::new("refresh-diagnostics", IconName::ArrowCircle)
.icon_color(Color::Info)
.shape(IconButtonShape::Square)
.disabled(!has_stale_excerpts)
.tooltip(Tooltip::for_action_title(
"Refresh diagnostics",
&ToggleDiagnosticsRefresh,

View File

@@ -13,7 +13,7 @@ use gpui::{
};
use indoc::indoc;
use language::{
EditPredictionsMode, File, Language, Rope,
EditPredictionsMode, File, Language,
language_settings::{self, AllLanguageSettings, EditPredictionProvider, all_language_settings},
};
use project::DisableAiSettings;
@@ -1056,11 +1056,8 @@ async fn open_disabled_globs_setting_in_editor(
) -> Result<()> {
let settings_editor = workspace
.update_in(cx, |_, window, cx| {
create_and_open_local_file(paths::settings_file(), window, cx, |cx| {
Rope::from_str(
settings::initial_user_settings_content().as_ref(),
cx.background_executor(),
)
create_and_open_local_file(paths::settings_file(), window, cx, || {
settings::initial_user_settings_content().as_ref().into()
})
})?
.await?

View File

@@ -88,10 +88,14 @@ pub fn switch_source_header(
)
})?;
let path = PathBuf::from(goto);
workspace
.update_in(cx, |workspace, window, cx| {
let goto = if workspace.path_style(cx).is_windows() {
goto.strip_prefix('/').unwrap_or(goto)
} else {
goto
};
let path = PathBuf::from(goto);
workspace.open_abs_path(
path,
OpenOptions {
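
For context, a hedged illustration of why the Windows branch above strips the leading slash: when the target location comes from a `file:///C:/...` style URI, its path component begins with `/C:/...`, which is not a valid absolute Windows path. The values below are illustrative only.

fn main() {
    let goto = "/C:/project/src/foo.h"; // illustrative value, not from clangd
    let goto = if cfg!(windows) {
        goto.strip_prefix('/').unwrap_or(goto)
    } else {
        goto
    };
    // On Windows this yields "C:/project/src/foo.h"; elsewhere it is unchanged.
    let path = std::path::PathBuf::from(goto);
    println!("{}", path.display());
}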

View File

@@ -42,15 +42,14 @@ async fn test_fuzzy_score(cx: &mut TestAppContext) {
CompletionBuilder::constant("ElementType", None, "7fffffff"),
];
let matches =
filter_and_sort_matches("Elem", &completions, SnippetSortOrder::default(), cx).await;
filter_and_sort_matches("elem", &completions, SnippetSortOrder::default(), cx).await;
assert_eq!(
matches
.iter()
.map(|m| m.string.as_str())
.collect::<Vec<_>>(),
vec!["ElementType", "element_type"]
vec!["element_type", "ElementType"]
);
assert!(matches[0].score > matches[1].score);
}
// fuzzy takes over sort_text and sort_kind
@@ -135,9 +134,6 @@ async fn test_sort_text(cx: &mut TestAppContext) {
// for each prefix, first item should always be one with lower sort_text
assert_eq!(matches[0].string, "unreachable!(…)");
assert_eq!(matches[1].string, "unreachable");
// fuzzy score should match for first two items as query is common prefix
assert_eq!(matches[0].score, matches[1].score);
})
.await;
@@ -146,9 +142,6 @@ async fn test_sort_text(cx: &mut TestAppContext) {
// exact match comes first
assert_eq!(matches[0].string, "unreachable");
assert_eq!(matches[1].string, "unreachable!(…)");
// fuzzy score should match for first two items as query is common prefix
assert_eq!(matches[0].score, matches[1].score);
}
}
@@ -325,12 +318,17 @@ async fn filter_and_sort_matches(
let matches = fuzzy::match_strings(
&candidates,
query,
query.chars().any(|c| c.is_uppercase()),
true,
false,
100,
&cancel_flag,
background_executor,
)
.await;
CompletionsMenu::sort_string_matches(matches, Some(query), snippet_sort_order, completions)
}

View File

@@ -28,10 +28,12 @@ use std::{
rc::Rc,
};
use task::ResolvedTask;
use ui::{Color, IntoElement, ListItem, Pixels, Popover, Styled, prelude::*};
use ui::{
Color, IntoElement, ListItem, Pixels, Popover, ScrollAxes, Scrollbars, Styled, WithScrollbar,
prelude::*,
};
use util::ResultExt;
use crate::CodeActionSource;
use crate::hover_popover::{hover_markdown_style, open_markdown_url};
use crate::{
CodeActionProvider, CompletionId, CompletionItemKind, CompletionProvider, DisplayRow, Editor,
@@ -39,7 +41,8 @@ use crate::{
actions::{ConfirmCodeAction, ConfirmCompletion},
split_words, styled_runs_for_code_label,
};
use settings::SnippetSortOrder;
use crate::{CodeActionSource, EditorSettings};
use settings::{Settings, SnippetSortOrder};
pub const MENU_GAP: Pixels = px(4.);
pub const MENU_ASIDE_X_PADDING: Pixels = px(16.);
@@ -261,6 +264,20 @@ impl Drop for CompletionsMenu {
}
}
struct CompletionMenuScrollBarSetting;
impl ui::scrollbars::GlobalSetting for CompletionMenuScrollBarSetting {
fn get_value(_cx: &App) -> &Self {
&Self
}
}
impl ui::scrollbars::ScrollbarVisibility for CompletionMenuScrollBarSetting {
fn visibility(&self, cx: &App) -> ui::scrollbars::ShowScrollbar {
EditorSettings::get_global(cx).completion_menu_scrollbar
}
}
impl CompletionsMenu {
pub fn new(
id: CompletionId,
@@ -898,7 +915,17 @@ impl CompletionsMenu {
}
});
Popover::new().child(list).into_any_element()
Popover::new()
.child(
div().child(list).custom_scrollbars(
Scrollbars::for_settings::<CompletionMenuScrollBarSetting>()
.show_along(ScrollAxes::Vertical)
.tracked_scroll_handle(self.scroll_handle.clone()),
window,
cx,
),
)
.into_any_element()
}
fn render_aside(
@@ -1033,7 +1060,7 @@ impl CompletionsMenu {
fuzzy::match_strings(
&match_candidates,
&query,
query.chars().any(|c| c.is_uppercase()),
true,
false,
1000,
&cancel_filter,
@@ -1143,7 +1170,7 @@ impl CompletionsMenu {
});
}
matches.sort_unstable_by_key(|string_match| {
let match_tier = |string_match: &StringMatch| {
let completion = &completions[string_match.candidate_id];
let is_snippet = matches!(
@@ -1159,9 +1186,7 @@ impl CompletionsMenu {
};
let (sort_kind, sort_label) = completion.sort_key();
let score = string_match.score;
let sort_score = Reverse(OrderedFloat(score));
let sort_score = Reverse(OrderedFloat(string_match.score.floor()));
let query_start_doesnt_match_split_words = query_start_lower
.map(|query_char| {
@@ -1198,7 +1223,11 @@ impl CompletionsMenu {
sort_label,
}
}
});
};
matches.sort_unstable_by_key(|string_match| match_tier(string_match));
matches
}
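
A small, self-contained illustration of what flooring the fuzzy score before wrapping it in `Reverse` does to the ordering (assuming the `ordered_float` crate that the surrounding code already uses for `OrderedFloat`): scores that land in the same integer bucket tie on the primary key and fall through to the later tie-breakers, while a higher bucket still sorts first.

use ordered_float::OrderedFloat;
use std::cmp::Reverse;

fn primary_key(score: f64) -> Reverse<OrderedFloat<f64>> {
    Reverse(OrderedFloat(score.floor()))
}

fn main() {
    // 2.9 and 2.4 floor to the same bucket, so the remaining fields of the
    // sort key decide their relative order.
    assert_eq!(primary_key(2.9), primary_key(2.4));
    // Reverse makes larger floored scores sort first.
    assert!(primary_key(2.4) < primary_key(1.9));
}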

View File

@@ -17,6 +17,9 @@
//! [Editor]: crate::Editor
//! [EditorElement]: crate::element::EditorElement
#[macro_use]
mod dimensions;
mod block_map;
mod crease_map;
mod custom_highlights;
@@ -167,11 +170,12 @@ impl DisplayMap {
}
pub fn snapshot(&mut self, cx: &mut Context<Self>) -> DisplaySnapshot {
let tab_size = Self::tab_size(&self.buffer, cx);
let buffer_snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let (inlay_snapshot, edits) = self.inlay_map.sync(buffer_snapshot, edits);
let (fold_snapshot, edits) = self.fold_map.read(inlay_snapshot, edits);
let tab_size = Self::tab_size(&self.buffer, cx);
let (tab_snapshot, edits) = self.tab_map.sync(fold_snapshot, edits, tab_size);
let (wrap_snapshot, edits) = self
.wrap_map
@@ -915,7 +919,7 @@ impl DisplaySnapshot {
pub fn text_chunks(&self, display_row: DisplayRow) -> impl Iterator<Item = &str> {
self.block_snapshot
.chunks(
display_row.0..self.max_point().row().next_row().0,
BlockRow(display_row.0)..BlockRow(self.max_point().row().next_row().0),
false,
self.masked,
Highlights::default(),
@@ -927,7 +931,12 @@ impl DisplaySnapshot {
pub fn reverse_text_chunks(&self, display_row: DisplayRow) -> impl Iterator<Item = &str> {
(0..=display_row.0).rev().flat_map(move |row| {
self.block_snapshot
.chunks(row..row + 1, false, self.masked, Highlights::default())
.chunks(
BlockRow(row)..BlockRow(row + 1),
false,
self.masked,
Highlights::default(),
)
.map(|h| h.text)
.collect::<Vec<_>>()
.into_iter()
@@ -942,7 +951,7 @@ impl DisplaySnapshot {
highlight_styles: HighlightStyles,
) -> DisplayChunks<'_> {
self.block_snapshot.chunks(
display_rows.start.0..display_rows.end.0,
BlockRow(display_rows.start.0)..BlockRow(display_rows.end.0),
language_aware,
self.masked,
Highlights {
@@ -1178,8 +1187,8 @@ impl DisplaySnapshot {
rows: Range<DisplayRow>,
) -> impl Iterator<Item = (DisplayRow, &Block)> {
self.block_snapshot
.blocks_in_range(rows.start.0..rows.end.0)
.map(|(row, block)| (DisplayRow(row), block))
.blocks_in_range(BlockRow(rows.start.0)..BlockRow(rows.end.0))
.map(|(row, block)| (DisplayRow(row.0), block))
}
pub fn sticky_header_excerpt(&self, row: f64) -> Option<StickyHeaderExcerpt<'_>> {
@@ -1211,7 +1220,7 @@ impl DisplaySnapshot {
pub fn soft_wrap_indent(&self, display_row: DisplayRow) -> Option<u32> {
let wrap_row = self
.block_snapshot
.to_wrap_point(BlockPoint::new(display_row.0, 0), Bias::Left)
.to_wrap_point(BlockPoint::new(BlockRow(display_row.0), 0), Bias::Left)
.row();
self.wrap_snapshot().soft_wrap_indent(wrap_row)
}
@@ -1242,7 +1251,7 @@ impl DisplaySnapshot {
}
pub fn longest_row(&self) -> DisplayRow {
DisplayRow(self.block_snapshot.longest_row())
DisplayRow(self.block_snapshot.longest_row().0)
}
pub fn longest_row_in_range(&self, range: Range<DisplayRow>) -> DisplayRow {
@@ -1569,7 +1578,6 @@ pub mod tests {
use lsp::LanguageServerId;
use project::Project;
use rand::{Rng, prelude::*};
use rope::Rope;
use settings::{SettingsContent, SettingsStore};
use smol::stream::StreamExt;
use std::{env, sync::Arc};
@@ -2075,7 +2083,7 @@ pub mod tests {
vec![Inlay::edit_prediction(
0,
buffer_snapshot.anchor_after(0),
Rope::from_str_small("\n"),
"\n",
)],
cx,
);

File diff suppressed because it is too large

View File

@@ -0,0 +1,96 @@
#[derive(Copy, Clone, Debug, Default, Eq, Ord, PartialOrd, PartialEq)]
pub struct RowDelta(pub u32);
impl RowDelta {
pub fn saturating_sub(self, other: RowDelta) -> RowDelta {
RowDelta(self.0.saturating_sub(other.0))
}
}
impl ::std::ops::Add for RowDelta {
type Output = RowDelta;
fn add(self, rhs: RowDelta) -> Self::Output {
RowDelta(self.0 + rhs.0)
}
}
impl ::std::ops::Sub for RowDelta {
type Output = RowDelta;
fn sub(self, rhs: RowDelta) -> Self::Output {
RowDelta(self.0 - rhs.0)
}
}
impl ::std::ops::AddAssign for RowDelta {
fn add_assign(&mut self, rhs: RowDelta) {
self.0 += rhs.0;
}
}
impl ::std::ops::SubAssign for RowDelta {
fn sub_assign(&mut self, rhs: RowDelta) {
self.0 -= rhs.0;
}
}
macro_rules! impl_for_row_types {
($row:ident => $row_delta:ident) => {
impl $row {
pub fn saturating_sub(self, other: $row_delta) -> $row {
$row(self.0.saturating_sub(other.0))
}
}
impl ::std::ops::Add for $row {
type Output = Self;
fn add(self, rhs: Self) -> Self::Output {
Self(self.0 + rhs.0)
}
}
impl ::std::ops::Add<$row_delta> for $row {
type Output = Self;
fn add(self, rhs: $row_delta) -> Self::Output {
Self(self.0 + rhs.0)
}
}
impl ::std::ops::Sub for $row {
type Output = $row_delta;
fn sub(self, rhs: Self) -> Self::Output {
$row_delta(self.0 - rhs.0)
}
}
impl ::std::ops::Sub<$row_delta> for $row {
type Output = $row;
fn sub(self, rhs: $row_delta) -> Self::Output {
$row(self.0 - rhs.0)
}
}
impl ::std::ops::AddAssign for $row {
fn add_assign(&mut self, rhs: Self) {
self.0 += rhs.0;
}
}
impl ::std::ops::AddAssign<$row_delta> for $row {
fn add_assign(&mut self, rhs: $row_delta) {
self.0 += rhs.0;
}
}
impl ::std::ops::SubAssign<$row_delta> for $row {
fn sub_assign(&mut self, rhs: $row_delta) {
self.0 -= rhs.0;
}
}
};
}
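
A minimal sketch of how the macro above is meant to be used (the `DemoRow` type is hypothetical; `RowDelta` and `impl_for_row_types!` are the definitions above). The wrap map later in this diff does the same thing for `WrapRow` via `impl_for_row_types! { WrapRow => RowDelta }`.

#[derive(Copy, Clone, Debug, Default, Eq, Ord, PartialOrd, PartialEq)]
pub struct DemoRow(pub u32);

impl_for_row_types! {
    DemoRow => RowDelta
}

fn demo() {
    let a = DemoRow(10);
    let b = DemoRow(4);
    let delta: RowDelta = a - b; // Sub on two rows yields a RowDelta
    assert_eq!(b + delta, a); // Add<RowDelta> turns it back into a row
    assert_eq!(a.saturating_sub(RowDelta(20)), DemoRow(0));
}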

View File

@@ -700,20 +700,16 @@ impl InlayMap {
.collect::<String>();
let next_inlay = if i % 2 == 0 {
use rope::Rope;
Inlay::mock_hint(
post_inc(next_inlay_id),
snapshot.buffer.anchor_at(position, bias),
Rope::from_str_small(&text),
&text,
)
} else {
use rope::Rope;
Inlay::edit_prediction(
post_inc(next_inlay_id),
snapshot.buffer.anchor_at(position, bias),
Rope::from_str_small(&text),
&text,
)
};
let inlay_id = next_inlay.id;
@@ -1305,7 +1301,7 @@ mod tests {
vec![Inlay::mock_hint(
post_inc(&mut next_inlay_id),
buffer.read(cx).snapshot(cx).anchor_after(3),
Rope::from_str_small("|123|"),
"|123|",
)],
);
assert_eq!(inlay_snapshot.text(), "abc|123|defghi");
@@ -1382,12 +1378,12 @@ mod tests {
Inlay::mock_hint(
post_inc(&mut next_inlay_id),
buffer.read(cx).snapshot(cx).anchor_before(3),
Rope::from_str_small("|123|"),
"|123|",
),
Inlay::edit_prediction(
post_inc(&mut next_inlay_id),
buffer.read(cx).snapshot(cx).anchor_after(3),
Rope::from_str_small("|456|"),
"|456|",
),
],
);
@@ -1597,17 +1593,17 @@ mod tests {
Inlay::mock_hint(
post_inc(&mut next_inlay_id),
buffer.read(cx).snapshot(cx).anchor_before(0),
Rope::from_str_small("|123|\n"),
"|123|\n",
),
Inlay::mock_hint(
post_inc(&mut next_inlay_id),
buffer.read(cx).snapshot(cx).anchor_before(4),
Rope::from_str_small("|456|"),
"|456|",
),
Inlay::edit_prediction(
post_inc(&mut next_inlay_id),
buffer.read(cx).snapshot(cx).anchor_before(7),
Rope::from_str_small("\n|567|\n"),
"\n|567|\n",
),
],
);
@@ -1681,14 +1677,9 @@ mod tests {
(offset, inlay.clone())
})
.collect::<Vec<_>>();
let mut expected_text =
Rope::from_str(&buffer_snapshot.text(), cx.background_executor());
let mut expected_text = Rope::from(&buffer_snapshot.text());
for (offset, inlay) in inlays.iter().rev() {
expected_text.replace(
*offset..*offset,
&inlay.text().to_string(),
cx.background_executor(),
);
expected_text.replace(*offset..*offset, &inlay.text().to_string());
}
assert_eq!(inlay_snapshot.text(), expected_text.to_string());
@@ -2076,7 +2067,7 @@ mod tests {
let inlay = Inlay {
id: InlayId::Hint(0),
position,
content: InlayContent::Text(text::Rope::from_str(inlay_text, cx.background_executor())),
content: InlayContent::Text(text::Rope::from(inlay_text)),
};
let (inlay_snapshot, _) = inlay_map.splice(&[], vec![inlay]);
@@ -2190,10 +2181,7 @@ mod tests {
let inlay = Inlay {
id: InlayId::Hint(0),
position,
content: InlayContent::Text(text::Rope::from_str(
test_case.inlay_text,
cx.background_executor(),
)),
content: InlayContent::Text(text::Rope::from(test_case.inlay_text)),
};
let (inlay_snapshot, _) = inlay_map.splice(&[], vec![inlay]);

View File

@@ -1042,7 +1042,7 @@ mod tests {
let (mut tab_map, _) = TabMap::new(fold_snapshot, tab_size);
let tabs_snapshot = tab_map.set_max_expansion_column(32);
let text = text::Rope::from_str(tabs_snapshot.text().as_str(), cx.background_executor());
let text = text::Rope::from(tabs_snapshot.text().as_str());
log::info!(
"TabMap text (tab size: {}): {:?}",
tab_size,

View File

@@ -1,5 +1,6 @@
use super::{
Highlights,
dimensions::RowDelta,
fold_map::{Chunk, FoldRows},
tab_map::{self, TabEdit, TabPoint, TabSnapshot},
};
@@ -7,13 +8,20 @@ use gpui::{App, AppContext as _, Context, Entity, Font, LineWrapper, Pixels, Tas
use language::Point;
use multi_buffer::{MultiBufferSnapshot, RowInfo};
use smol::future::yield_now;
use std::sync::LazyLock;
use std::{cmp, collections::VecDeque, mem, ops::Range, time::Duration};
use std::{cmp, collections::VecDeque, mem, ops::Range, sync::LazyLock, time::Duration};
use sum_tree::{Bias, Cursor, Dimensions, SumTree};
use text::Patch;
pub use super::tab_map::TextSummary;
pub type WrapEdit = text::Edit<u32>;
pub type WrapEdit = text::Edit<WrapRow>;
pub type WrapPatch = text::Patch<WrapRow>;
#[derive(Copy, Clone, Debug, Default, Eq, Ord, PartialOrd, PartialEq)]
pub struct WrapRow(pub u32);
impl_for_row_types! {
WrapRow => RowDelta
}
/// Handles soft wrapping of text.
///
@@ -21,8 +29,8 @@ pub type WrapEdit = text::Edit<u32>;
pub struct WrapMap {
snapshot: WrapSnapshot,
pending_edits: VecDeque<(TabSnapshot, Vec<TabEdit>)>,
interpolated_edits: Patch<u32>,
edits_since_sync: Patch<u32>,
interpolated_edits: WrapPatch,
edits_since_sync: WrapPatch,
wrap_width: Option<Pixels>,
background_task: Option<Task<()>>,
font_with_size: (Font, Pixels),
@@ -54,7 +62,7 @@ pub struct WrapChunks<'a> {
input_chunks: tab_map::TabChunks<'a>,
input_chunk: Chunk<'a>,
output_position: WrapPoint,
max_output_row: u32,
max_output_row: WrapRow,
transforms: Cursor<'a, 'static, Transform, Dimensions<WrapPoint, TabPoint>>,
snapshot: &'a WrapSnapshot,
}
@@ -63,19 +71,19 @@ pub struct WrapChunks<'a> {
pub struct WrapRows<'a> {
input_buffer_rows: FoldRows<'a>,
input_buffer_row: RowInfo,
output_row: u32,
output_row: WrapRow,
soft_wrapped: bool,
max_output_row: u32,
max_output_row: WrapRow,
transforms: Cursor<'a, 'static, Transform, Dimensions<WrapPoint, TabPoint>>,
}
impl WrapRows<'_> {
pub(crate) fn seek(&mut self, start_row: u32) {
pub(crate) fn seek(&mut self, start_row: WrapRow) {
self.transforms
.seek(&WrapPoint::new(start_row, 0), Bias::Left);
let mut input_row = self.transforms.start().1.row();
if self.transforms.item().is_some_and(|t| t.is_isomorphic()) {
input_row += start_row - self.transforms.start().0.row();
input_row += (start_row - self.transforms.start().0.row()).0;
}
self.soft_wrapped = self.transforms.item().is_some_and(|t| !t.is_isomorphic());
self.input_buffer_rows.seek(input_row);
@@ -120,7 +128,7 @@ impl WrapMap {
tab_snapshot: TabSnapshot,
edits: Vec<TabEdit>,
cx: &mut Context<Self>,
) -> (WrapSnapshot, Patch<u32>) {
) -> (WrapSnapshot, WrapPatch) {
if self.wrap_width.is_some() {
self.pending_edits.push_back((tab_snapshot, edits));
self.flush_edits(cx);
@@ -226,8 +234,8 @@ impl WrapMap {
let new_rows = self.snapshot.transforms.summary().output.lines.row + 1;
self.snapshot.interpolated = false;
self.edits_since_sync = self.edits_since_sync.compose(Patch::new(vec![WrapEdit {
old: 0..old_rows,
new: 0..new_rows,
old: WrapRow(0)..WrapRow(old_rows),
new: WrapRow(0)..WrapRow(new_rows),
}]));
}
}
@@ -331,7 +339,7 @@ impl WrapSnapshot {
self.tab_snapshot.buffer_snapshot()
}
fn interpolate(&mut self, new_tab_snapshot: TabSnapshot, tab_edits: &[TabEdit]) -> Patch<u32> {
fn interpolate(&mut self, new_tab_snapshot: TabSnapshot, tab_edits: &[TabEdit]) -> WrapPatch {
let mut new_transforms;
if tab_edits.is_empty() {
new_transforms = self.transforms.clone();
@@ -401,7 +409,7 @@ impl WrapSnapshot {
tab_edits: &[TabEdit],
wrap_width: Pixels,
line_wrapper: &mut LineWrapper,
) -> Patch<u32> {
) -> WrapPatch {
#[derive(Debug)]
struct RowEdit {
old_rows: Range<u32>,
@@ -554,7 +562,7 @@ impl WrapSnapshot {
old_snapshot.compute_edits(tab_edits, self)
}
fn compute_edits(&self, tab_edits: &[TabEdit], new_snapshot: &WrapSnapshot) -> Patch<u32> {
fn compute_edits(&self, tab_edits: &[TabEdit], new_snapshot: &WrapSnapshot) -> WrapPatch {
let mut wrap_edits = Vec::with_capacity(tab_edits.len());
let mut old_cursor = self.transforms.cursor::<TransformSummary>(());
let mut new_cursor = new_snapshot.transforms.cursor::<TransformSummary>(());
@@ -581,8 +589,8 @@ impl WrapSnapshot {
new_end += tab_edit.new.end.0 - new_cursor.start().input.lines;
wrap_edits.push(WrapEdit {
old: old_start.row..old_end.row,
new: new_start.row..new_end.row,
old: WrapRow(old_start.row)..WrapRow(old_end.row),
new: WrapRow(new_start.row)..WrapRow(new_end.row),
});
}
@@ -592,7 +600,7 @@ impl WrapSnapshot {
pub(crate) fn chunks<'a>(
&'a self,
rows: Range<u32>,
rows: Range<WrapRow>,
language_aware: bool,
highlights: Highlights<'a>,
) -> WrapChunks<'a> {
@@ -627,17 +635,17 @@ impl WrapSnapshot {
WrapPoint(self.transforms.summary().output.lines)
}
pub fn line_len(&self, row: u32) -> u32 {
pub fn line_len(&self, row: WrapRow) -> u32 {
let (start, _, item) = self.transforms.find::<Dimensions<WrapPoint, TabPoint>, _>(
(),
&WrapPoint::new(row + 1, 0),
&WrapPoint::new(row + WrapRow(1), 0),
Bias::Left,
);
if item.is_some_and(|transform| transform.is_isomorphic()) {
let overshoot = row - start.0.row();
let tab_row = start.1.row() + overshoot;
let tab_row = start.1.row() + overshoot.0;
let tab_line_len = self.tab_snapshot.line_len(tab_row);
if overshoot == 0 {
if overshoot.0 == 0 {
start.0.column() + (tab_line_len - start.1.column())
} else {
tab_line_len
@@ -647,7 +655,7 @@ impl WrapSnapshot {
}
}
pub fn text_summary_for_range(&self, rows: Range<u32>) -> TextSummary {
pub fn text_summary_for_range(&self, rows: Range<WrapRow>) -> TextSummary {
let mut summary = TextSummary::default();
let start = WrapPoint::new(rows.start, 0);
@@ -708,10 +716,12 @@ impl WrapSnapshot {
summary
}
pub fn soft_wrap_indent(&self, row: u32) -> Option<u32> {
let (.., item) =
self.transforms
.find::<WrapPoint, _>((), &WrapPoint::new(row + 1, 0), Bias::Right);
pub fn soft_wrap_indent(&self, row: WrapRow) -> Option<u32> {
let (.., item) = self.transforms.find::<WrapPoint, _>(
(),
&WrapPoint::new(row + WrapRow(1), 0),
Bias::Right,
);
item.and_then(|transform| {
if transform.is_isomorphic() {
None
@@ -725,14 +735,14 @@ impl WrapSnapshot {
self.transforms.summary().output.longest_row
}
pub fn row_infos(&self, start_row: u32) -> WrapRows<'_> {
pub fn row_infos(&self, start_row: WrapRow) -> WrapRows<'_> {
let mut transforms = self
.transforms
.cursor::<Dimensions<WrapPoint, TabPoint>>(());
transforms.seek(&WrapPoint::new(start_row, 0), Bias::Left);
let mut input_row = transforms.start().1.row();
if transforms.item().is_some_and(|t| t.is_isomorphic()) {
input_row += start_row - transforms.start().0.row();
input_row += (start_row - transforms.start().0.row()).0;
}
let soft_wrapped = transforms.item().is_some_and(|t| !t.is_isomorphic());
let mut input_buffer_rows = self.tab_snapshot.rows(input_row);
@@ -787,9 +797,9 @@ impl WrapSnapshot {
self.tab_point_to_wrap_point(self.tab_snapshot.clip_point(self.to_tab_point(point), bias))
}
pub fn prev_row_boundary(&self, mut point: WrapPoint) -> u32 {
pub fn prev_row_boundary(&self, mut point: WrapPoint) -> WrapRow {
if self.transforms.is_empty() {
return 0;
return WrapRow(0);
}
*point.column_mut() = 0;
@@ -813,7 +823,7 @@ impl WrapSnapshot {
unreachable!()
}
pub fn next_row_boundary(&self, mut point: WrapPoint) -> Option<u32> {
pub fn next_row_boundary(&self, mut point: WrapPoint) -> Option<WrapRow> {
point.0 += Point::new(1, 0);
let mut cursor = self
@@ -833,13 +843,13 @@ impl WrapSnapshot {
#[cfg(test)]
pub fn text(&self) -> String {
self.text_chunks(0).collect()
self.text_chunks(WrapRow(0)).collect()
}
#[cfg(test)]
pub fn text_chunks(&self, wrap_row: u32) -> impl Iterator<Item = &str> {
pub fn text_chunks(&self, wrap_row: WrapRow) -> impl Iterator<Item = &str> {
self.chunks(
wrap_row..self.max_point().row() + 1,
wrap_row..self.max_point().row() + WrapRow(1),
false,
Highlights::default(),
)
@@ -863,25 +873,26 @@ impl WrapSnapshot {
}
}
let text = language::Rope::from_str_small(self.text().as_str());
let text = language::Rope::from(self.text().as_str());
let mut input_buffer_rows = self.tab_snapshot.rows(0);
let mut expected_buffer_rows = Vec::new();
let mut prev_tab_row = 0;
for display_row in 0..=self.max_point().row() {
for display_row in 0..=self.max_point().row().0 {
let display_row = WrapRow(display_row);
let tab_point = self.to_tab_point(WrapPoint::new(display_row, 0));
if tab_point.row() == prev_tab_row && display_row != 0 {
if tab_point.row() == prev_tab_row && display_row != WrapRow(0) {
expected_buffer_rows.push(None);
} else {
expected_buffer_rows.push(input_buffer_rows.next().unwrap().buffer_row);
}
prev_tab_row = tab_point.row();
assert_eq!(self.line_len(display_row), text.line_len(display_row));
assert_eq!(self.line_len(display_row), text.line_len(display_row.0));
}
for start_display_row in 0..expected_buffer_rows.len() {
assert_eq!(
self.row_infos(start_display_row as u32)
self.row_infos(WrapRow(start_display_row as u32))
.map(|row_info| row_info.buffer_row)
.collect::<Vec<_>>(),
&expected_buffer_rows[start_display_row..],
@@ -894,7 +905,7 @@ impl WrapSnapshot {
}
impl WrapChunks<'_> {
pub(crate) fn seek(&mut self, rows: Range<u32>) {
pub(crate) fn seek(&mut self, rows: Range<WrapRow>) {
let output_start = WrapPoint::new(rows.start, 0);
let output_end = WrapPoint::new(rows.end, 0);
self.transforms.seek(&output_start, Bias::Right);
@@ -931,7 +942,7 @@ impl<'a> Iterator for WrapChunks<'a> {
// Exclude newline starting prior to the desired row.
start_ix = 1;
summary.row = 0;
} else if self.output_position.row() + 1 >= self.max_output_row {
} else if self.output_position.row() + WrapRow(1) >= self.max_output_row {
// Exclude soft indentation ending after the desired row.
end_ix = 1;
summary.column = 0;
@@ -997,7 +1008,7 @@ impl Iterator for WrapRows<'_> {
let soft_wrapped = self.soft_wrapped;
let diff_status = self.input_buffer_row.diff_status;
self.output_row += 1;
self.output_row += WrapRow(1);
self.transforms
.seek_forward(&WrapPoint::new(self.output_row, 0), Bias::Left);
if self.transforms.item().is_some_and(|t| t.is_isomorphic()) {
@@ -1014,6 +1025,7 @@ impl Iterator for WrapRows<'_> {
multibuffer_row: None,
diff_status,
expand_info: None,
wrapped_buffer_row: buffer_row.buffer_row,
}
} else {
buffer_row
@@ -1107,12 +1119,12 @@ impl SumTreeExt for SumTree<Transform> {
}
impl WrapPoint {
pub fn new(row: u32, column: u32) -> Self {
Self(Point::new(row, column))
pub fn new(row: WrapRow, column: u32) -> Self {
Self(Point::new(row.0, column))
}
pub fn row(self) -> u32 {
self.0.row
pub fn row(self) -> WrapRow {
WrapRow(self.0.row)
}
pub fn row_mut(&mut self) -> &mut u32 {
@@ -1413,26 +1425,25 @@ mod tests {
}
}
let mut initial_text =
Rope::from_str(initial_snapshot.text().as_str(), cx.background_executor());
let mut initial_text = Rope::from(initial_snapshot.text().as_str());
for (snapshot, patch) in edits {
let snapshot_text = Rope::from_str(snapshot.text().as_str(), cx.background_executor());
let snapshot_text = Rope::from(snapshot.text().as_str());
for edit in &patch {
let old_start = initial_text.point_to_offset(Point::new(edit.new.start, 0));
let old_start = initial_text.point_to_offset(Point::new(edit.new.start.0, 0));
let old_end = initial_text.point_to_offset(cmp::min(
Point::new(edit.new.start + edit.old.len() as u32, 0),
Point::new(edit.new.start.0 + (edit.old.end - edit.old.start).0, 0),
initial_text.max_point(),
));
let new_start = snapshot_text.point_to_offset(Point::new(edit.new.start, 0));
let new_start = snapshot_text.point_to_offset(Point::new(edit.new.start.0, 0));
let new_end = snapshot_text.point_to_offset(cmp::min(
Point::new(edit.new.end, 0),
Point::new(edit.new.end.0, 0),
snapshot_text.max_point(),
));
let new_text = snapshot_text
.chunks_in_range(new_start..new_end)
.collect::<String>();
initial_text.replace(old_start..old_end, &new_text, cx.background_executor());
initial_text.replace(old_start..old_end, &new_text);
}
assert_eq!(initial_text.to_string(), snapshot_text.to_string());
}
@@ -1485,11 +1496,11 @@ mod tests {
impl WrapSnapshot {
fn verify_chunks(&mut self, rng: &mut impl Rng) {
for _ in 0..5 {
let mut end_row = rng.random_range(0..=self.max_point().row());
let mut end_row = rng.random_range(0..=self.max_point().row().0);
let start_row = rng.random_range(0..=end_row);
end_row += 1;
let mut expected_text = self.text_chunks(start_row).collect::<String>();
let mut expected_text = self.text_chunks(WrapRow(start_row)).collect::<String>();
if expected_text.ends_with('\n') {
expected_text.push('\n');
}
@@ -1498,12 +1509,16 @@ mod tests {
.take((end_row - start_row) as usize)
.collect::<Vec<_>>()
.join("\n");
if end_row <= self.max_point().row() {
if end_row <= self.max_point().row().0 {
expected_text.push('\n');
}
let actual_text = self
.chunks(start_row..end_row, true, Highlights::default())
.chunks(
WrapRow(start_row)..WrapRow(end_row),
true,
Highlights::default(),
)
.map(|c| c.text)
.collect::<String>();
assert_eq!(

View File

@@ -1,12 +1,13 @@
use edit_prediction::EditPredictionProvider;
use gpui::{Entity, prelude::*};
use gpui::{Entity, KeyBinding, Modifiers, prelude::*};
use indoc::indoc;
use multi_buffer::{Anchor, MultiBufferSnapshot, ToPoint};
use std::ops::Range;
use text::{Point, ToOffset};
use crate::{
EditPrediction, editor_tests::init_test, test::editor_test_context::EditorTestContext,
AcceptEditPrediction, EditPrediction, MenuEditPredictionsPolicy, editor_tests::init_test,
test::editor_test_context::EditorTestContext,
};
#[gpui::test]
@@ -270,6 +271,63 @@ async fn test_edit_prediction_jump_disabled_for_non_zed_providers(cx: &mut gpui:
});
}
#[gpui::test]
async fn test_edit_prediction_preview_cleanup_on_toggle_off(cx: &mut gpui::TestAppContext) {
init_test(cx, |_| {});
// Bind `ctrl-shift-a` to accept the provided edit prediction. The actual key
binding here doesn't matter; we simply need to confirm that holding the
// binding's modifiers triggers the edit prediction preview.
cx.update(|cx| cx.bind_keys([KeyBinding::new("ctrl-shift-a", AcceptEditPrediction, None)]));
let mut cx = EditorTestContext::new(cx).await;
let provider = cx.new(|_| FakeEditPredictionProvider::default());
assign_editor_completion_provider(provider.clone(), &mut cx);
cx.set_state("let x = ˇ;");
propose_edits(&provider, vec![(8..8, "42")], &mut cx);
cx.update_editor(|editor, window, cx| {
editor.set_menu_edit_predictions_policy(MenuEditPredictionsPolicy::ByProvider);
editor.update_visible_edit_prediction(window, cx)
});
cx.editor(|editor, _, _| {
assert!(editor.has_active_edit_prediction());
});
// Simulate pressing the modifiers for `AcceptEditPrediction`, namely
// `ctrl-shift`, so that we can confirm that the edit prediction preview is
// activated.
let modifiers = Modifiers::control_shift();
cx.simulate_modifiers_change(modifiers);
cx.run_until_parked();
cx.editor(|editor, _, _| {
assert!(editor.edit_prediction_preview_is_active());
});
// Disable showing edit predictions without issuing a new modifiers changed
// event, to confirm that the edit prediction preview is still active.
cx.update_editor(|editor, window, cx| {
editor.set_show_edit_predictions(Some(false), window, cx);
});
cx.editor(|editor, _, _| {
assert!(!editor.has_active_edit_prediction());
assert!(editor.edit_prediction_preview_is_active());
});
// Now simulate releasing all modifiers, ensuring that even with edit
// predictions disabled, the edit prediction preview is cleaned up.
cx.simulate_modifiers_change(Modifiers::none());
cx.run_until_parked();
cx.editor(|editor, _, _| {
assert!(!editor.edit_prediction_preview_is_active());
});
}
fn assert_editor_active_edit_completion(
cx: &mut EditorTestContext,
assert: impl FnOnce(MultiBufferSnapshot, &Vec<(Range<Anchor>, String)>),
@@ -395,7 +453,7 @@ impl EditPredictionProvider for FakeEditPredictionProvider {
}
fn show_completions_in_menu() -> bool {
false
true
}
fn supports_jump_to_edit() -> bool {

View File

@@ -163,7 +163,10 @@ use rpc::{ErrorCode, ErrorExt, proto::PeerId};
use scroll::{Autoscroll, OngoingScroll, ScrollAnchor, ScrollManager};
use selections_collection::{MutableSelectionsCollection, SelectionsCollection};
use serde::{Deserialize, Serialize};
use settings::{GitGutterSetting, Settings, SettingsLocation, SettingsStore, update_settings_file};
use settings::{
GitGutterSetting, RelativeLineNumbers, Settings, SettingsLocation, SettingsStore,
update_settings_file,
};
use smallvec::{SmallVec, smallvec};
use snippet::Snippet;
use std::{
@@ -452,6 +455,20 @@ pub enum SelectMode {
All,
}
#[derive(Copy, Clone, Default, PartialEq, Eq, Debug)]
pub enum SizingBehavior {
/// The editor will lay itself out using `size_full` and will include the vertical
/// scroll margin as requested by user settings.
#[default]
Default,
/// The editor will lay itself out using `size_full`, but will not have any
/// vertical overscroll.
ExcludeOverscrollMargin,
/// The editor will request a vertical size according to its content and will be
/// laid out without a vertical scroll margin.
SizeByContent,
}
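// --- Illustrative sketch (not part of the diff): a pared-down, standalone copy
// of the `SizingBehavior` enum above, showing how the three variants are meant
// to be distinguished during layout. The `uses_full_size` helper is hypothetical.
#[derive(Copy, Clone, Default, PartialEq, Eq, Debug)]
enum SizingBehavior {
    #[default]
    Default,
    ExcludeOverscrollMargin,
    SizeByContent,
}

// Only `SizeByContent` sizes the editor by its content; the other two variants
// keep the full-size layout and differ only in whether overscroll margin applies.
fn uses_full_size(behavior: SizingBehavior) -> bool {
    behavior != SizingBehavior::SizeByContent
}

fn main() {
    assert!(uses_full_size(SizingBehavior::Default));
    assert!(uses_full_size(SizingBehavior::ExcludeOverscrollMargin));
    assert!(!uses_full_size(SizingBehavior::SizeByContent));
}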
#[derive(Clone, PartialEq, Eq, Debug)]
pub enum EditorMode {
SingleLine,
@@ -464,8 +481,8 @@ pub enum EditorMode {
scale_ui_elements_with_buffer_font_size: bool,
/// When set to `true`, the editor will render a background for the active line.
show_active_line_background: bool,
/// When set to `true`, the editor's height will be determined by its content.
sized_by_content: bool,
/// Determines the sizing behavior for this editor.
sizing_behavior: SizingBehavior,
},
Minimap {
parent: WeakEntity<Editor>,
@@ -477,7 +494,7 @@ impl EditorMode {
Self::Full {
scale_ui_elements_with_buffer_font_size: true,
show_active_line_background: true,
sized_by_content: false,
sizing_behavior: SizingBehavior::Default,
}
}
@@ -795,6 +812,22 @@ impl MinimapVisibility {
}
}
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum BufferSerialization {
All,
NonDirtyBuffers,
}
impl BufferSerialization {
fn new(restore_unsaved_buffers: bool) -> Self {
if restore_unsaved_buffers {
Self::All
} else {
Self::NonDirtyBuffers
}
}
}
#[derive(Clone, Debug)]
struct RunnableTasks {
templates: Vec<(TaskSourceKind, TaskTemplate)>,
@@ -1112,7 +1145,7 @@ pub struct Editor {
show_git_blame_inline_delay_task: Option<Task<()>>,
git_blame_inline_enabled: bool,
render_diff_hunk_controls: RenderDiffHunkControlsFn,
serialize_dirty_buffers: bool,
buffer_serialization: Option<BufferSerialization>,
show_selection_menu: Option<bool>,
blame: Option<Entity<GitBlame>>,
blame_subscription: Option<Subscription>,
@@ -1832,9 +1865,15 @@ impl Editor {
project::Event::RefreshCodeLens => {
// We always query code lenses together with their actions, never storing them and always refreshing them.
}
project::Event::RefreshInlayHints(server_id) => {
project::Event::RefreshInlayHints {
server_id,
request_id,
} => {
editor.refresh_inlay_hints(
InlayHintRefreshReason::RefreshRequested(*server_id),
InlayHintRefreshReason::RefreshRequested {
server_id: *server_id,
request_id: *request_id,
},
cx,
);
}
@@ -1882,7 +1921,7 @@ impl Editor {
}
}
project::Event::EntryRenamed(transaction) => {
project::Event::EntryRenamed(transaction, project_path, abs_path) => {
let Some(workspace) = editor.workspace() else {
return;
};
@@ -1890,7 +1929,23 @@ impl Editor {
else {
return;
};
if active_editor.entity_id() == cx.entity_id() {
let entity_id = cx.entity_id();
workspace.update(cx, |this, cx| {
this.panes_mut()
.iter_mut()
.filter(|pane| pane.entity_id() != entity_id)
.for_each(|p| {
p.update(cx, |pane, _| {
pane.nav_history_mut().rename_item(
entity_id,
project_path.clone(),
abs_path.clone().into(),
);
})
});
});
let edited_buffers_already_open = {
let other_editors: Vec<Entity<Editor>> = workspace
.read(cx)
@@ -1913,7 +1968,6 @@ impl Editor {
})
})
};
if !edited_buffers_already_open {
let workspace = workspace.downgrade();
let transaction = transaction.clone();
@@ -2167,10 +2221,13 @@ impl Editor {
git_blame_inline_enabled: full_mode
&& ProjectSettings::get_global(cx).git.inline_blame.enabled,
render_diff_hunk_controls: Arc::new(render_diff_hunk_controls),
serialize_dirty_buffers: !is_minimap
&& ProjectSettings::get_global(cx)
.session
.restore_unsaved_buffers,
buffer_serialization: is_minimap.not().then(|| {
BufferSerialization::new(
ProjectSettings::get_global(cx)
.session
.restore_unsaved_buffers,
)
}),
blame: None,
blame_subscription: None,
tasks: BTreeMap::default(),
@@ -2258,6 +2315,8 @@ impl Editor {
|editor, _, e: &EditorEvent, window, cx| match e {
EditorEvent::ScrollPositionChanged { local, .. } => {
if *local {
editor.hide_signature_help(cx, SignatureHelpHiddenBy::Escape);
editor.inline_blame_popover.take();
let new_anchor = editor.scroll_manager.anchor();
let snapshot = editor.snapshot(window, cx);
editor.update_restoration_data(cx, move |data| {
@@ -2266,8 +2325,22 @@ impl Editor {
new_anchor.offset,
);
});
editor.hide_signature_help(cx, SignatureHelpHiddenBy::Escape);
editor.inline_blame_popover.take();
editor.post_scroll_update = cx.spawn_in(window, async move |editor, cx| {
cx.background_executor()
.timer(Duration::from_millis(50))
.await;
editor
.update_in(cx, |editor, window, cx| {
editor.register_visible_buffers(cx);
editor.refresh_colors_for_visible_range(None, window, cx);
editor.refresh_inlay_hints(
InlayHintRefreshReason::NewLinesShown,
cx,
);
})
.ok();
});
}
}
EditorEvent::Edited { .. } => {
@@ -2438,8 +2511,16 @@ impl Editor {
key_context.add("renaming");
}
if !self.snippet_stack.is_empty() {
if let Some(snippet_stack) = self.snippet_stack.last() {
key_context.add("in_snippet");
if snippet_stack.active_index > 0 {
key_context.add("has_previous_tabstop");
}
if snippet_stack.active_index < snippet_stack.ranges.len().saturating_sub(1) {
key_context.add("has_next_tabstop");
}
}
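// --- Illustrative sketch (not part of the diff): the tabstop-boundary checks
// added above, extracted into a standalone helper. `active_index` and the
// tabstop range count mirror the `snippet_stack` fields used in the diff; the
// helper itself is hypothetical.
fn tabstop_context(active_index: usize, range_count: usize) -> (bool, bool) {
    let has_previous_tabstop = active_index > 0;
    let has_next_tabstop = active_index < range_count.saturating_sub(1);
    (has_previous_tabstop, has_next_tabstop)
}

fn main() {
    assert_eq!(tabstop_context(0, 3), (false, true)); // first tabstop
    assert_eq!(tabstop_context(1, 3), (true, true)); // middle tabstop
    assert_eq!(tabstop_context(2, 3), (true, false)); // last tabstop
    assert_eq!(tabstop_context(0, 0), (false, false)); // no ranges; saturating_sub avoids underflow
}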
match self.context_menu.borrow().as_ref() {
@@ -2702,6 +2783,14 @@ impl Editor {
self.workspace.as_ref()?.0.upgrade()
}
/// Returns the workspace serialization ID if this editor should be serialized.
fn workspace_serialization_id(&self, _cx: &App) -> Option<WorkspaceId> {
self.workspace
.as_ref()
.filter(|_| self.should_serialize_buffer())
.and_then(|workspace| workspace.1)
}
pub fn title<'a>(&self, cx: &'a App) -> Cow<'a, str> {
self.buffer().read(cx).title(cx)
}
@@ -2950,6 +3039,20 @@ impl Editor {
self.auto_replace_emoji_shortcode = auto_replace;
}
pub fn set_should_serialize(&mut self, should_serialize: bool, cx: &App) {
self.buffer_serialization = should_serialize.then(|| {
BufferSerialization::new(
ProjectSettings::get_global(cx)
.session
.restore_unsaved_buffers,
)
})
}
fn should_serialize_buffer(&self) -> bool {
self.buffer_serialization.is_some()
}
pub fn toggle_edit_predictions(
&mut self,
_: &ToggleEditPrediction,
@@ -3166,8 +3269,7 @@ impl Editor {
});
if WorkspaceSettings::get(None, cx).restore_on_startup != RestoreOnStartupBehavior::None
&& let Some(workspace_id) =
self.workspace.as_ref().and_then(|workspace| workspace.1)
&& let Some(workspace_id) = self.workspace_serialization_id(cx)
{
let snapshot = self.buffer().read(cx).snapshot(cx);
let selections = selections.clone();
@@ -3232,7 +3334,7 @@ impl Editor {
data.folds = inmemory_folds;
});
let Some(workspace_id) = self.workspace.as_ref().and_then(|workspace| workspace.1) else {
let Some(workspace_id) = self.workspace_serialization_id(cx) else {
return;
};
let background_executor = cx.background_executor().clone();
@@ -7553,7 +7655,14 @@ impl Editor {
window: &mut Window,
cx: &mut Context<Self>,
) {
if self.show_edit_predictions_in_menu() {
// Ensure that the edit prediction preview is updated, even when edit
// predictions are not shown in the menu, as long as a preview is active.
if self.show_edit_predictions_in_menu()
|| matches!(
self.edit_prediction_preview,
EditPredictionPreview::Active { .. }
)
{
self.update_edit_prediction_preview(&modifiers, window, cx);
}
@@ -7573,18 +7682,17 @@ impl Editor {
)
}
fn multi_cursor_modifier(invert: bool, modifiers: &Modifiers, cx: &mut Context<Self>) -> bool {
let multi_cursor_setting = EditorSettings::get_global(cx).multi_cursor_modifier;
if invert {
match multi_cursor_setting {
MultiCursorModifier::Alt => modifiers.alt,
MultiCursorModifier::CmdOrCtrl => modifiers.secondary(),
}
} else {
match multi_cursor_setting {
MultiCursorModifier::Alt => modifiers.secondary(),
MultiCursorModifier::CmdOrCtrl => modifiers.alt,
}
fn is_cmd_or_ctrl_pressed(modifiers: &Modifiers, cx: &mut Context<Self>) -> bool {
match EditorSettings::get_global(cx).multi_cursor_modifier {
MultiCursorModifier::Alt => modifiers.secondary(),
MultiCursorModifier::CmdOrCtrl => modifiers.alt,
}
}
fn is_alt_pressed(modifiers: &Modifiers, cx: &mut Context<Self>) -> bool {
match EditorSettings::get_global(cx).multi_cursor_modifier {
MultiCursorModifier::Alt => modifiers.alt,
MultiCursorModifier::CmdOrCtrl => modifiers.secondary(),
}
}
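// --- Illustrative sketch (not part of the diff): how the `multi_cursor_modifier`
// setting decides which physical modifier plays the cmd/ctrl role and which
// plays the alt role, mirroring `is_cmd_or_ctrl_pressed` / `is_alt_pressed`
// above. The `Modifiers` struct is a stand-in for gpui's type.
#[derive(Copy, Clone)]
enum MultiCursorModifier {
    Alt,
    CmdOrCtrl,
}

#[derive(Copy, Clone)]
struct Modifiers {
    alt: bool,
    secondary: bool, // cmd on macOS, ctrl elsewhere
}

fn is_cmd_or_ctrl_pressed(setting: MultiCursorModifier, modifiers: Modifiers) -> bool {
    match setting {
        MultiCursorModifier::Alt => modifiers.secondary,
        MultiCursorModifier::CmdOrCtrl => modifiers.alt,
    }
}

fn is_alt_pressed(setting: MultiCursorModifier, modifiers: Modifiers) -> bool {
    match setting {
        MultiCursorModifier::Alt => modifiers.alt,
        MultiCursorModifier::CmdOrCtrl => modifiers.secondary,
    }
}

fn main() {
    let alt_held = Modifiers { alt: true, secondary: false };
    // When the multi-cursor modifier is `Alt`, the alt key fills the alt role and
    // the secondary key fills the cmd/ctrl role; `CmdOrCtrl` swaps the two roles.
    assert!(is_alt_pressed(MultiCursorModifier::Alt, alt_held));
    assert!(is_cmd_or_ctrl_pressed(MultiCursorModifier::CmdOrCtrl, alt_held));
}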
@@ -7593,9 +7701,9 @@ impl Editor {
cx: &mut Context<Self>,
) -> Option<ColumnarMode> {
if modifiers.shift && modifiers.number_of_modifiers() == 2 {
if Self::multi_cursor_modifier(false, modifiers, cx) {
if Self::is_cmd_or_ctrl_pressed(modifiers, cx) {
Some(ColumnarMode::FromMouse)
} else if Self::multi_cursor_modifier(true, modifiers, cx) {
} else if Self::is_alt_pressed(modifiers, cx) {
Some(ColumnarMode::FromSelection)
} else {
None
@@ -7852,7 +7960,7 @@ impl Editor {
let inlay = Inlay::edit_prediction(
post_inc(&mut self.next_inlay_id),
range.start,
Rope::from_str_small(new_text.as_str()),
new_text.as_str(),
);
inlay_ids.push(inlay.id);
inlays.push(inlay);
@@ -9957,6 +10065,7 @@ impl Editor {
cx: &mut Context<Self>,
) {
if self.mode.is_single_line() || self.snippet_stack.is_empty() {
cx.propagate();
return;
}
@@ -9964,6 +10073,7 @@ impl Editor {
self.hide_mouse_cursor(HideMouseCursorOrigin::TypingAction, cx);
return;
}
cx.propagate();
}
pub fn previous_snippet_tabstop(
@@ -9973,6 +10083,7 @@ impl Editor {
cx: &mut Context<Self>,
) {
if self.mode.is_single_line() || self.snippet_stack.is_empty() {
cx.propagate();
return;
}
@@ -9980,6 +10091,7 @@ impl Editor {
self.hide_mouse_cursor(HideMouseCursorOrigin::TypingAction, cx);
return;
}
cx.propagate();
}
pub fn tab(&mut self, _: &Tab, window: &mut Window, cx: &mut Context<Self>) {
@@ -15995,7 +16107,6 @@ impl Editor {
}
fn filtered<'a>(
snapshot: EditorSnapshot,
severity: GoToDiagnosticSeverityFilter,
diagnostics: impl Iterator<Item = DiagnosticEntryRef<'a, usize>>,
) -> impl Iterator<Item = DiagnosticEntryRef<'a, usize>> {
@@ -16003,19 +16114,15 @@ impl Editor {
.filter(move |entry| severity.matches(entry.diagnostic.severity))
.filter(|entry| entry.range.start != entry.range.end)
.filter(|entry| !entry.diagnostic.is_unnecessary)
.filter(move |entry| !snapshot.intersects_fold(entry.range.start))
}
let snapshot = self.snapshot(window, cx);
let before = filtered(
snapshot.clone(),
severity,
buffer
.diagnostics_in_range(0..selection.start)
.filter(|entry| entry.range.start <= selection.start),
);
let after = filtered(
snapshot,
severity,
buffer
.diagnostics_in_range(selection.start..buffer.len())
@@ -16054,6 +16161,15 @@ impl Editor {
let Some(buffer_id) = buffer.buffer_id_for_anchor(next_diagnostic_start) else {
return;
};
let snapshot = self.snapshot(window, cx);
if snapshot.intersects_fold(next_diagnostic.range.start) {
self.unfold_ranges(
std::slice::from_ref(&next_diagnostic.range),
true,
false,
cx,
);
}
self.change_selections(Default::default(), window, cx, |s| {
s.select_ranges(vec![
next_diagnostic.range.start..next_diagnostic.range.start,
@@ -17827,6 +17943,7 @@ impl Editor {
.unwrap_or(self.diagnostics_max_severity);
if !self.inline_diagnostics_enabled()
|| !self.diagnostics_enabled()
|| !self.show_inline_diagnostics
|| max_severity == DiagnosticSeverity::Off
{
@@ -17905,7 +18022,7 @@ impl Editor {
window: &Window,
cx: &mut Context<Self>,
) -> Option<()> {
if self.ignore_lsp_data() {
if self.ignore_lsp_data() || !self.diagnostics_enabled() {
return None;
}
let pull_diagnostics_settings = ProjectSettings::get_global(cx)
@@ -19521,9 +19638,16 @@ impl Editor {
EditorSettings::get_global(cx).gutter.line_numbers
}
pub fn should_use_relative_line_numbers(&self, cx: &mut App) -> bool {
self.use_relative_line_numbers
.unwrap_or(EditorSettings::get_global(cx).relative_line_numbers)
pub fn relative_line_numbers(&self, cx: &mut App) -> RelativeLineNumbers {
match (
self.use_relative_line_numbers,
EditorSettings::get_global(cx).relative_line_numbers,
) {
(None, setting) => setting,
(Some(false), _) => RelativeLineNumbers::Disabled,
(Some(true), RelativeLineNumbers::Wrapped) => RelativeLineNumbers::Wrapped,
(Some(true), _) => RelativeLineNumbers::Enabled,
}
}
pub fn toggle_relative_line_numbers(
@@ -19532,8 +19656,8 @@ impl Editor {
_: &mut Window,
cx: &mut Context<Self>,
) {
let is_relative = self.should_use_relative_line_numbers(cx);
self.set_relative_line_number(Some(!is_relative), cx)
let is_relative = self.relative_line_numbers(cx);
self.set_relative_line_number(Some(!is_relative.enabled()), cx)
}
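// --- Illustrative sketch (not part of the diff): a standalone copy of the
// override-vs-setting resolution in `relative_line_numbers` above, with a
// pared-down `RelativeLineNumbers` enum; the `resolve` helper is hypothetical.
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
enum RelativeLineNumbers {
    Disabled,
    Enabled,
    Wrapped,
}

fn resolve(per_editor: Option<bool>, setting: RelativeLineNumbers) -> RelativeLineNumbers {
    match (per_editor, setting) {
        (None, setting) => setting,
        (Some(false), _) => RelativeLineNumbers::Disabled,
        (Some(true), RelativeLineNumbers::Wrapped) => RelativeLineNumbers::Wrapped,
        (Some(true), _) => RelativeLineNumbers::Enabled,
    }
}

fn main() {
    // Without a per-editor override, the global setting wins unchanged.
    assert_eq!(resolve(None, RelativeLineNumbers::Wrapped), RelativeLineNumbers::Wrapped);
    // A `Some(true)` override never downgrades a `Wrapped` setting to `Enabled`.
    assert_eq!(resolve(Some(true), RelativeLineNumbers::Wrapped), RelativeLineNumbers::Wrapped);
    // A `Some(false)` override always disables relative numbers.
    assert_eq!(resolve(Some(false), RelativeLineNumbers::Wrapped), RelativeLineNumbers::Disabled);
}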
pub fn set_relative_line_number(&mut self, is_relative: Option<bool>, cx: &mut Context<Self>) {
@@ -21143,8 +21267,9 @@ impl Editor {
}
let project_settings = ProjectSettings::get_global(cx);
self.serialize_dirty_buffers =
!self.mode.is_minimap() && project_settings.session.restore_unsaved_buffers;
self.buffer_serialization = self
.should_serialize_buffer()
.then(|| BufferSerialization::new(project_settings.session.restore_unsaved_buffers));
if self.mode.is_full() {
let show_inline_diagnostics = project_settings.diagnostics.inline.enabled;
@@ -24152,6 +24277,10 @@ impl EntityInputHandler for Editor {
let utf16_offset = anchor.to_offset_utf16(&position_map.snapshot.buffer_snapshot());
Some(utf16_offset.0)
}
fn accepts_text_input(&self, _window: &mut Window, _cx: &mut Context<Self>) -> bool {
self.input_enabled
}
}
trait SelectionExt {

View File

@@ -3,12 +3,12 @@ use core::num;
use gpui::App;
use language::CursorShape;
use project::project_settings::DiagnosticSeverity;
use settings::Settings;
pub use settings::{
CurrentLineHighlight, DelayMs, DisplayIn, DocumentColorsRenderMode, DoubleClickInMultibuffer,
GoToDefinitionFallback, HideMouseMode, MinimapThumb, MinimapThumbBorder, MultiCursorModifier,
ScrollBeyondLastLine, ScrollbarDiagnostics, SeedQuerySetting, ShowMinimap, SnippetSortOrder,
};
use settings::{RelativeLineNumbers, Settings};
use ui::scrollbars::{ScrollbarVisibility, ShowScrollbar};
/// Imports from the VSCode settings at
@@ -33,7 +33,7 @@ pub struct EditorSettings {
pub horizontal_scroll_margin: f32,
pub scroll_sensitivity: f32,
pub fast_scroll_sensitivity: f32,
pub relative_line_numbers: bool,
pub relative_line_numbers: RelativeLineNumbers,
pub seed_search_query_from_cursor: SeedQuerySetting,
pub use_smartcase_search: bool,
pub multi_cursor_modifier: MultiCursorModifier,
@@ -55,6 +55,7 @@ pub struct EditorSettings {
pub drag_and_drop_selection: DragAndDropSelection,
pub lsp_document_colors: DocumentColorsRenderMode,
pub minimum_contrast_for_highlights: f32,
pub completion_menu_scrollbar: ShowScrollbar,
}
#[derive(Debug, Clone)]
pub struct Jupyter {
@@ -159,6 +160,7 @@ pub struct SearchSettings {
pub case_sensitive: bool,
pub include_ignored: bool,
pub regex: bool,
pub center_on_match: bool,
}
impl EditorSettings {
@@ -249,6 +251,7 @@ impl Settings for EditorSettings {
case_sensitive: search.case_sensitive.unwrap(),
include_ignored: search.include_ignored.unwrap(),
regex: search.regex.unwrap(),
center_on_match: search.center_on_match.unwrap(),
},
auto_signature_help: editor.auto_signature_help.unwrap(),
show_signature_help_after_edits: editor.show_signature_help_after_edits.unwrap(),
@@ -266,6 +269,7 @@ impl Settings for EditorSettings {
},
lsp_document_colors: editor.lsp_document_colors.unwrap(),
minimum_contrast_for_highlights: editor.minimum_contrast_for_highlights.unwrap().0,
completion_menu_scrollbar: editor.completion_menu_scrollbar.map(Into::into).unwrap(),
}
}
}

View File

@@ -3136,6 +3136,77 @@ fn test_newline(cx: &mut TestAppContext) {
});
}
#[gpui::test]
async fn test_newline_yaml(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let mut cx = EditorTestContext::new(cx).await;
let yaml_language = languages::language("yaml", tree_sitter_yaml::LANGUAGE.into());
cx.update_buffer(|buffer, cx| buffer.set_language(Some(yaml_language), cx));
// Object (between 2 fields)
cx.set_state(indoc! {"
test:ˇ
hello: bye"});
cx.update_editor(|e, window, cx| e.newline(&Newline, window, cx));
cx.assert_editor_state(indoc! {"
test:
ˇ
hello: bye"});
// Object (first and single line)
cx.set_state(indoc! {"
test:ˇ"});
cx.update_editor(|e, window, cx| e.newline(&Newline, window, cx));
cx.assert_editor_state(indoc! {"
test:
ˇ"});
// Array with objects (after first element)
cx.set_state(indoc! {"
test:
- foo: barˇ"});
cx.update_editor(|e, window, cx| e.newline(&Newline, window, cx));
cx.assert_editor_state(indoc! {"
test:
- foo: bar
ˇ"});
// Array with objects and comment
cx.set_state(indoc! {"
test:
- foo: bar
- bar: # testˇ"});
cx.update_editor(|e, window, cx| e.newline(&Newline, window, cx));
cx.assert_editor_state(indoc! {"
test:
- foo: bar
- bar: # test
ˇ"});
// Array with objects (after second element)
cx.set_state(indoc! {"
test:
- foo: bar
- bar: fooˇ"});
cx.update_editor(|e, window, cx| e.newline(&Newline, window, cx));
cx.assert_editor_state(indoc! {"
test:
- foo: bar
- bar: foo
ˇ"});
// Array with strings (after first element)
cx.set_state(indoc! {"
test:
- fooˇ"});
cx.update_editor(|e, window, cx| e.newline(&Newline, window, cx));
cx.assert_editor_state(indoc! {"
test:
- foo
ˇ"});
}
#[gpui::test]
fn test_newline_with_old_selections(cx: &mut TestAppContext) {
init_test(cx, |_| {});
@@ -14217,7 +14288,7 @@ async fn test_completion_in_multibuffer_with_replace_range(cx: &mut TestAppConte
EditorMode::Full {
scale_ui_elements_with_buffer_font_size: false,
show_active_line_background: false,
sized_by_content: false,
sizing_behavior: SizingBehavior::Default,
},
multi_buffer.clone(),
Some(project.clone()),

View File

@@ -8,8 +8,8 @@ use crate::{
HandleInput, HoveredCursor, InlayHintRefreshReason, JumpData, LineDown, LineHighlight, LineUp,
MAX_LINE_LEN, MINIMAP_FONT_SIZE, MULTI_BUFFER_EXCERPT_HEADER_HEIGHT, OpenExcerpts, PageDown,
PageUp, PhantomBreakpointIndicator, Point, RowExt, RowRangeExt, SelectPhase,
SelectedTextHighlight, Selection, SelectionDragState, SoftWrap, StickyHeaderExcerpt, ToPoint,
ToggleFold, ToggleFoldAll,
SelectedTextHighlight, Selection, SelectionDragState, SizingBehavior, SoftWrap,
StickyHeaderExcerpt, ToPoint, ToggleFold, ToggleFoldAll,
code_context_menus::{CodeActionsMenu, MENU_ASIDE_MAX_WIDTH, MENU_ASIDE_MIN_WIDTH, MENU_GAP},
display_map::{
Block, BlockContext, BlockStyle, ChunkRendererId, DisplaySnapshot, EditorMargins,
@@ -766,8 +766,14 @@ impl EditorElement {
.row;
if line_numbers
.get(&MultiBufferRow(multi_buffer_row))
.and_then(|line_number| line_number.hitbox.as_ref())
.is_some_and(|hitbox| hitbox.contains(&event.position))
.is_some_and(|line_layout| {
line_layout.segments.iter().any(|segment| {
segment
.hitbox
.as_ref()
.is_some_and(|hitbox| hitbox.contains(&event.position))
})
})
{
let line_offset_from_top = display_row - position_map.scroll_position.y as u32;
@@ -814,7 +820,7 @@ impl EditorElement {
editor.select(
SelectPhase::Begin {
position,
add: Editor::multi_cursor_modifier(true, &modifiers, cx),
add: Editor::is_alt_pressed(&modifiers, cx),
click_count,
},
window,
@@ -998,7 +1004,7 @@ impl EditorElement {
let text_hitbox = &position_map.text_hitbox;
let pending_nonempty_selections = editor.has_pending_nonempty_selection();
let hovered_link_modifier = Editor::multi_cursor_modifier(false, &event.modifiers(), cx);
let hovered_link_modifier = Editor::is_cmd_or_ctrl_pressed(&event.modifiers(), cx);
if let Some(mouse_position) = event.mouse_position()
&& !pending_nonempty_selections
@@ -1310,7 +1316,14 @@ impl EditorElement {
hover_at(editor, Some(anchor), window, cx);
Self::update_visible_cursor(editor, point, position_map, window, cx);
} else {
hover_at(editor, None, window, cx);
editor.update_inlay_link_and_hover_points(
&position_map.snapshot,
point_for_position,
modifiers.secondary(),
modifiers.shift,
window,
cx,
);
}
} else {
editor.hide_hovered_link(cx);
@@ -1690,9 +1703,7 @@ impl EditorElement {
len,
font,
color,
background_color: None,
strikethrough: None,
underline: None,
..Default::default()
}],
None,
)
@@ -3150,6 +3161,7 @@ impl EditorElement {
snapshot: &EditorSnapshot,
rows: &Range<DisplayRow>,
relative_to: Option<DisplayRow>,
count_wrapped_lines: bool,
) -> HashMap<DisplayRow, DisplayRowDelta> {
let mut relative_rows: HashMap<DisplayRow, DisplayRowDelta> = Default::default();
let Some(relative_to) = relative_to else {
@@ -3167,8 +3179,15 @@ impl EditorElement {
let head_idx = relative_to.minus(start);
let mut delta = 1;
let mut i = head_idx + 1;
let should_count_line = |row_info: &RowInfo| {
if count_wrapped_lines {
row_info.buffer_row.is_some() || row_info.wrapped_buffer_row.is_some()
} else {
row_info.buffer_row.is_some()
}
};
while i < buffer_rows.len() as u32 {
if buffer_rows[i as usize].buffer_row.is_some() {
if should_count_line(&buffer_rows[i as usize]) {
if rows.contains(&DisplayRow(i + start.0)) {
relative_rows.insert(DisplayRow(i + start.0), delta);
}
@@ -3178,13 +3197,13 @@ impl EditorElement {
}
delta = 1;
i = head_idx.min(buffer_rows.len().saturating_sub(1) as u32);
while i > 0 && buffer_rows[i as usize].buffer_row.is_none() {
while i > 0 && buffer_rows[i as usize].buffer_row.is_none() && !count_wrapped_lines {
i -= 1;
}
while i > 0 {
i -= 1;
if buffer_rows[i as usize].buffer_row.is_some() {
if should_count_line(&buffer_rows[i as usize]) {
if rows.contains(&DisplayRow(i + start.0)) {
relative_rows.insert(DisplayRow(i + start.0), delta);
}
@@ -3216,7 +3235,7 @@ impl EditorElement {
return Arc::default();
}
let (newest_selection_head, is_relative) = self.editor.update(cx, |editor, cx| {
let (newest_selection_head, relative) = self.editor.update(cx, |editor, cx| {
let newest_selection_head = newest_selection_head.unwrap_or_else(|| {
let newest = editor
.selections
@@ -3232,79 +3251,93 @@ impl EditorElement {
)
.head
});
let is_relative = editor.should_use_relative_line_numbers(cx);
(newest_selection_head, is_relative)
let relative = editor.relative_line_numbers(cx);
(newest_selection_head, relative)
});
let relative_to = if is_relative {
let relative_to = if relative.enabled() {
Some(newest_selection_head.row())
} else {
None
};
let relative_rows = self.calculate_relative_line_numbers(snapshot, &rows, relative_to);
let relative_rows =
self.calculate_relative_line_numbers(snapshot, &rows, relative_to, relative.wrapped());
let mut line_number = String::new();
let line_numbers = buffer_rows
.iter()
.enumerate()
.flat_map(|(ix, row_info)| {
let display_row = DisplayRow(rows.start.0 + ix as u32);
line_number.clear();
let non_relative_number = row_info.buffer_row? + 1;
let number = relative_rows
.get(&display_row)
.unwrap_or(&non_relative_number);
write!(&mut line_number, "{number}").unwrap();
if row_info
.diff_status
.is_some_and(|status| status.is_deleted())
{
return None;
}
let segments = buffer_rows.iter().enumerate().flat_map(|(ix, row_info)| {
let display_row = DisplayRow(rows.start.0 + ix as u32);
line_number.clear();
let non_relative_number = if relative.wrapped() {
row_info.buffer_row.or(row_info.wrapped_buffer_row)? + 1
} else {
row_info.buffer_row? + 1
};
let number = relative_rows
.get(&display_row)
.unwrap_or(&non_relative_number);
write!(&mut line_number, "{number}").unwrap();
if row_info
.diff_status
.is_some_and(|status| status.is_deleted())
{
return None;
}
let color = active_rows
.get(&display_row)
.map(|spec| {
if spec.breakpoint {
cx.theme().colors().debugger_accent
} else {
cx.theme().colors().editor_active_line_number
}
})
.unwrap_or_else(|| cx.theme().colors().editor_line_number);
let shaped_line =
self.shape_line_number(SharedString::from(&line_number), color, window);
let scroll_top = scroll_position.y * ScrollPixelOffset::from(line_height);
let line_origin = gutter_hitbox.map(|hitbox| {
hitbox.origin
+ point(
hitbox.size.width - shaped_line.width - gutter_dimensions.right_padding,
ix as f32 * line_height
- Pixels::from(scroll_top % ScrollPixelOffset::from(line_height)),
)
});
#[cfg(not(test))]
let hitbox = line_origin.map(|line_origin| {
window.insert_hitbox(
Bounds::new(line_origin, size(shaped_line.width, line_height)),
HitboxBehavior::Normal,
let color = active_rows
.get(&display_row)
.map(|spec| {
if spec.breakpoint {
cx.theme().colors().debugger_accent
} else {
cx.theme().colors().editor_active_line_number
}
})
.unwrap_or_else(|| cx.theme().colors().editor_line_number);
let shaped_line =
self.shape_line_number(SharedString::from(&line_number), color, window);
let scroll_top = scroll_position.y * ScrollPixelOffset::from(line_height);
let line_origin = gutter_hitbox.map(|hitbox| {
hitbox.origin
+ point(
hitbox.size.width - shaped_line.width - gutter_dimensions.right_padding,
ix as f32 * line_height
- Pixels::from(scroll_top % ScrollPixelOffset::from(line_height)),
)
});
#[cfg(test)]
let hitbox = {
let _ = line_origin;
None
};
});
let multi_buffer_row = DisplayPoint::new(display_row, 0).to_point(snapshot).row;
let multi_buffer_row = MultiBufferRow(multi_buffer_row);
let line_number = LineNumberLayout {
shaped_line,
hitbox,
};
Some((multi_buffer_row, line_number))
})
.collect();
#[cfg(not(test))]
let hitbox = line_origin.map(|line_origin| {
window.insert_hitbox(
Bounds::new(line_origin, size(shaped_line.width, line_height)),
HitboxBehavior::Normal,
)
});
#[cfg(test)]
let hitbox = {
let _ = line_origin;
None
};
let segment = LineNumberSegment {
shaped_line,
hitbox,
};
let buffer_row = DisplayPoint::new(display_row, 0).to_point(snapshot).row;
let multi_buffer_row = MultiBufferRow(buffer_row);
Some((multi_buffer_row, segment))
});
let mut line_numbers: HashMap<MultiBufferRow, LineNumberLayout> = HashMap::default();
for (buffer_row, segment) in segments {
line_numbers
.entry(buffer_row)
.or_insert_with(|| LineNumberLayout {
segments: Default::default(),
})
.segments
.push(segment);
}
Arc::new(line_numbers)
}
@@ -3550,9 +3583,7 @@ impl EditorElement {
len: line.len(),
font: style.text.font(),
color: placeholder_color,
background_color: None,
underline: None,
strikethrough: None,
..Default::default()
};
let line = window.text_system().shape_line(
line.to_string().into(),
@@ -3874,11 +3905,11 @@ impl EditorElement {
.child(
h_flex()
.size_full()
.gap_2()
.flex_basis(Length::Definite(DefiniteLength::Fraction(0.667)))
.pl_0p5()
.pr_5()
.pl_1()
.pr_2()
.rounded_sm()
.gap_1p5()
.when(is_sticky, |el| el.shadow_md())
.border_1()
.map(|div| {
@@ -3899,15 +3930,17 @@ impl EditorElement {
let buffer_id = for_excerpt.buffer_id;
let toggle_chevron_icon =
FileIcons::get_chevron_icon(!is_folded, cx).map(Icon::from_path);
let button_size = rems_from_px(28.);
header.child(
div()
.hover(|style| style.bg(colors.element_selected))
.rounded_xs()
.child(
ButtonLike::new("toggle-buffer-fold")
.style(ui::ButtonStyle::Transparent)
.height(px(28.).into())
.width(px(28.))
.style(ButtonStyle::Transparent)
.height(button_size.into())
.width(button_size)
.children(toggle_chevron_icon)
.tooltip({
let focus_handle = focus_handle.clone();
@@ -3963,7 +3996,13 @@ impl EditorElement {
})
.take(1),
)
.child(h_flex().size(px(12.0)).justify_center().children(indicator))
.child(
h_flex()
.size(rems_from_px(12.0))
.justify_center()
.flex_shrink_0()
.children(indicator),
)
.child(
h_flex()
.cursor_pointer()
@@ -3971,41 +4010,51 @@ impl EditorElement {
.size_full()
.justify_between()
.overflow_hidden()
.child(
h_flex()
.gap_2()
.map(|path_header| {
let filename = filename
.map(SharedString::from)
.unwrap_or_else(|| "untitled".into());
.child(h_flex().gap_0p5().map(|path_header| {
let filename = filename
.map(SharedString::from)
.unwrap_or_else(|| "untitled".into());
path_header
.when(ItemSettings::get_global(cx).file_icons, |el| {
let path = path::Path::new(filename.as_str());
let icon = FileIcons::get_icon(path, cx)
.unwrap_or_default();
let icon =
Icon::from_path(icon).color(Color::Muted);
el.child(icon)
})
.child(Label::new(filename).single_line().when_some(
file_status,
|el, status| {
el.color(if status.is_conflicted() {
Color::Conflict
} else if status.is_modified() {
Color::Modified
} else if status.is_deleted() {
Color::Disabled
} else {
Color::Created
})
.when(status.is_deleted(), |el| {
el.strikethrough()
})
},
))
path_header
.when(ItemSettings::get_global(cx).file_icons, |el| {
let path = path::Path::new(filename.as_str());
let icon =
FileIcons::get_icon(path, cx).unwrap_or_default();
let icon = Icon::from_path(icon).color(Color::Muted);
el.child(icon)
})
.child(
ButtonLike::new("filename-button")
.style(ButtonStyle::Subtle)
.child(
div()
.child(
Label::new(filename)
.single_line()
.color(file_status_label_color(
file_status,
))
.when(
file_status.is_some_and(|s| {
s.is_deleted()
}),
|label| label.strikethrough(),
),
)
.group_hover("", |div| div.underline()),
)
.on_click(window.listener_for(&self.editor, {
let jump_data = jump_data.clone();
move |editor, e: &ClickEvent, window, cx| {
editor.open_excerpts_common(
Some(jump_data.clone()),
e.modifiers().secondary(),
window,
cx,
);
}
})),
)
.when_some(parent_path, |then, path| {
then.child(div().child(path).text_color(
if file_status.is_some_and(FileStatus::is_deleted) {
@@ -4014,33 +4063,42 @@ impl EditorElement {
colors.text_muted
},
))
}),
)
})
}))
.when(
can_open_excerpts && is_selected && relative_path.is_some(),
|el| {
el.child(
h_flex()
.id("jump-to-file-button")
.gap_2p5()
.child(Label::new("Jump To File"))
.child(KeyBinding::for_action_in(
Button::new("open-file", "Open File")
.style(ButtonStyle::OutlinedGhost)
.key_binding(KeyBinding::for_action_in(
&OpenExcerpts,
&focus_handle,
cx,
)),
))
.on_click(window.listener_for(&self.editor, {
let jump_data = jump_data.clone();
move |editor, e: &ClickEvent, window, cx| {
editor.open_excerpts_common(
Some(jump_data.clone()),
e.modifiers().secondary(),
window,
cx,
);
}
})),
)
},
)
.on_mouse_down(MouseButton::Left, |_, _, cx| cx.stop_propagation())
.on_click(window.listener_for(&self.editor, {
move |editor, e: &ClickEvent, window, cx| {
editor.open_excerpts_common(
Some(jump_data.clone()),
e.modifiers().secondary(),
window,
cx,
);
let buffer_id = for_excerpt.buffer_id;
move |editor, _e: &ClickEvent, _window, cx| {
if is_folded {
editor.unfold_buffer(buffer_id, cx);
} else {
editor.fold_buffer(buffer_id, cx);
}
}
})),
),
@@ -4048,6 +4106,7 @@ impl EditorElement {
let file = for_excerpt.buffer.file().cloned();
let editor = self.editor.clone();
right_click_menu("buffer-header-context-menu")
.trigger(move |_, _, _| header)
.menu(move |window, cx| {
@@ -5845,34 +5904,36 @@ impl EditorElement {
let line_height = layout.position_map.line_height;
window.set_cursor_style(CursorStyle::Arrow, &layout.gutter_hitbox);
for LineNumberLayout {
shaped_line,
hitbox,
} in layout.line_numbers.values()
{
let Some(hitbox) = hitbox else {
continue;
};
for line_layout in layout.line_numbers.values() {
for LineNumberSegment {
shaped_line,
hitbox,
} in &line_layout.segments
{
let Some(hitbox) = hitbox else {
continue;
};
let Some(()) = (if !is_singleton && hitbox.is_hovered(window) {
let color = cx.theme().colors().editor_hover_line_number;
let Some(()) = (if !is_singleton && hitbox.is_hovered(window) {
let color = cx.theme().colors().editor_hover_line_number;
let line = self.shape_line_number(shaped_line.text.clone(), color, window);
line.paint(hitbox.origin, line_height, window, cx).log_err()
} else {
shaped_line
.paint(hitbox.origin, line_height, window, cx)
.log_err()
}) else {
continue;
};
let line = self.shape_line_number(shaped_line.text.clone(), color, window);
line.paint(hitbox.origin, line_height, window, cx).log_err()
} else {
shaped_line
.paint(hitbox.origin, line_height, window, cx)
.log_err()
}) else {
continue;
};
// In singleton buffers, we select corresponding lines on the line number click, so use | -like cursor.
// In multi buffers, we open file at the line number clicked, so use a pointing hand cursor.
if is_singleton {
window.set_cursor_style(CursorStyle::IBeam, hitbox);
} else {
window.set_cursor_style(CursorStyle::PointingHand, hitbox);
// In singleton buffers, we select corresponding lines on the line number click, so use | -like cursor.
// In multi buffers, we open file at the line number clicked, so use a pointing hand cursor.
if is_singleton {
window.set_cursor_style(CursorStyle::IBeam, hitbox);
} else {
window.set_cursor_style(CursorStyle::PointingHand, hitbox);
}
}
}
}
@@ -7381,9 +7442,7 @@ impl EditorElement {
len: column,
font: style.text.font(),
color: Hsla::default(),
background_color: None,
underline: None,
strikethrough: None,
..Default::default()
}],
None,
);
@@ -7406,9 +7465,7 @@ impl EditorElement {
len: text.len(),
font: self.style.text.font(),
color,
background_color: None,
underline: None,
strikethrough: None,
..Default::default()
};
window.text_system().shape_line(
text,
@@ -7477,6 +7534,22 @@ impl EditorElement {
}
}
fn file_status_label_color(file_status: Option<FileStatus>) -> Color {
file_status.map_or(Color::Default, |status| {
if status.is_conflicted() {
Color::Conflict
} else if status.is_modified() {
Color::Modified
} else if status.is_deleted() {
Color::Disabled
} else if status.is_created() {
Color::Created
} else {
Color::Default
}
})
}
fn header_jump_data(
snapshot: &EditorSnapshot,
block_row_start: DisplayRow,
@@ -8441,11 +8514,11 @@ impl Element for EditorElement {
window.request_layout(style, None, cx)
}
EditorMode::Full {
sized_by_content, ..
sizing_behavior, ..
} => {
let mut style = Style::default();
style.size.width = relative(1.).into();
if sized_by_content {
if sizing_behavior == SizingBehavior::SizeByContent {
let snapshot = editor.snapshot(window, cx);
let line_height =
self.style.text.line_height_in_pixels(window.rem_size());
@@ -8609,7 +8682,8 @@ impl Element for EditorElement {
EditorMode::SingleLine
| EditorMode::AutoHeight { .. }
| EditorMode::Full {
sized_by_content: true,
sizing_behavior: SizingBehavior::ExcludeOverscrollMargin
| SizingBehavior::SizeByContent,
..
}
) {
@@ -9492,9 +9566,7 @@ impl Element for EditorElement {
len: tab_len,
font: self.style.text.font(),
color: cx.theme().colors().editor_invisible,
background_color: None,
underline: None,
strikethrough: None,
..Default::default()
}],
None,
);
@@ -9508,9 +9580,7 @@ impl Element for EditorElement {
len: space_len,
font: self.style.text.font(),
color: cx.theme().colors().editor_invisible,
background_color: None,
underline: None,
strikethrough: None,
..Default::default()
}],
None,
);
@@ -9785,11 +9855,17 @@ impl EditorLayout {
}
}
struct LineNumberLayout {
#[derive(Debug)]
struct LineNumberSegment {
shaped_line: ShapedLine,
hitbox: Option<Hitbox>,
}
#[derive(Debug)]
struct LineNumberLayout {
segments: SmallVec<[LineNumberSegment; 1]>,
}
struct ColoredRange<T> {
start: T,
end: T,
@@ -10853,6 +10929,7 @@ mod tests {
&snapshot,
&(DisplayRow(0)..DisplayRow(6)),
Some(DisplayRow(3)),
false,
)
})
.unwrap();
@@ -10871,6 +10948,7 @@ mod tests {
&snapshot,
&(DisplayRow(3)..DisplayRow(6)),
Some(DisplayRow(1)),
false,
)
})
.unwrap();
@@ -10887,6 +10965,7 @@ mod tests {
&snapshot,
&(DisplayRow(0)..DisplayRow(3)),
Some(DisplayRow(6)),
false,
)
})
.unwrap();
@@ -10896,6 +10975,81 @@ mod tests {
assert_eq!(relative_rows[&DisplayRow(2)], 3);
}
#[gpui::test]
fn test_shape_line_numbers_wrapping(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let window = cx.add_window(|window, cx| {
let buffer = MultiBuffer::build_simple(&sample_text(6, 6, 'a'), cx);
Editor::new(EditorMode::full(), buffer, None, window, cx)
});
update_test_language_settings(cx, |s| {
s.defaults.preferred_line_length = Some(5_u32);
s.defaults.soft_wrap = Some(language_settings::SoftWrap::PreferredLineLength);
});
let editor = window.root(cx).unwrap();
let style = cx.update(|cx| editor.read(cx).style().unwrap().clone());
let line_height = window
.update(cx, |_, window, _| {
style.text.line_height_in_pixels(window.rem_size())
})
.unwrap();
let element = EditorElement::new(&editor, style);
let snapshot = window
.update(cx, |editor, window, cx| editor.snapshot(window, cx))
.unwrap();
let layouts = cx
.update_window(*window, |_, window, cx| {
element.layout_line_numbers(
None,
GutterDimensions {
left_padding: Pixels::ZERO,
right_padding: Pixels::ZERO,
width: px(30.0),
margin: Pixels::ZERO,
git_blame_entries_width: None,
},
line_height,
gpui::Point::default(),
DisplayRow(0)..DisplayRow(6),
&(0..6)
.map(|row| RowInfo {
buffer_row: Some(row),
..Default::default()
})
.collect::<Vec<_>>(),
&BTreeMap::default(),
Some(DisplayPoint::new(DisplayRow(0), 0)),
&snapshot,
window,
cx,
)
})
.unwrap();
assert_eq!(layouts.len(), 3);
let relative_rows = window
.update(cx, |editor, window, cx| {
let snapshot = editor.snapshot(window, cx);
element.calculate_relative_line_numbers(
&snapshot,
&(DisplayRow(0)..DisplayRow(6)),
Some(DisplayRow(3)),
true,
)
})
.unwrap();
assert_eq!(relative_rows[&DisplayRow(0)], 3);
assert_eq!(relative_rows[&DisplayRow(1)], 2);
assert_eq!(relative_rows[&DisplayRow(2)], 1);
// current line has no relative number
assert_eq!(relative_rows[&DisplayRow(4)], 1);
assert_eq!(relative_rows[&DisplayRow(5)], 2);
}
#[gpui::test]
async fn test_vim_visual_selections(cx: &mut TestAppContext) {
init_test(cx, |_| {});
@@ -11009,7 +11163,13 @@ mod tests {
state
.line_numbers
.get(&MultiBufferRow(0))
.map(|line_number| line_number.shaped_line.text.as_ref()),
.map(|line_number| line_number
.segments
.first()
.unwrap()
.shaped_line
.text
.as_ref()),
Some("1")
);
}
@@ -11407,11 +11567,8 @@ mod tests {
fn generate_test_run(len: usize, color: Hsla) -> TextRun {
TextRun {
len,
font: gpui::font(".SystemUIFont"),
color,
background_color: None,
underline: None,
strikethrough: None,
..Default::default()
}
}

View File

@@ -1115,19 +1115,18 @@ mod tests {
let fs = FakeFs::new(cx.executor());
let buffer_initial_text_len = rng.random_range(5..15);
let mut buffer_initial_text = Rope::from_str(
let mut buffer_initial_text = Rope::from(
RandomCharIter::new(&mut rng)
.take(buffer_initial_text_len)
.collect::<String>()
.as_str(),
cx.background_executor(),
);
let mut newline_ixs = (0..buffer_initial_text_len).choose_multiple(&mut rng, 5);
newline_ixs.sort_unstable();
for newline_ix in newline_ixs.into_iter().rev() {
let newline_ix = buffer_initial_text.clip_offset(newline_ix, Bias::Right);
buffer_initial_text.replace(newline_ix..newline_ix, "\n", cx.background_executor());
buffer_initial_text.replace(newline_ix..newline_ix, "\n");
}
log::info!("initial buffer text: {:?}", buffer_initial_text);

View File

@@ -116,7 +116,7 @@ impl Editor {
window: &mut Window,
cx: &mut Context<Self>,
) {
let hovered_link_modifier = Editor::multi_cursor_modifier(false, &modifiers, cx);
let hovered_link_modifier = Editor::is_cmd_or_ctrl_pressed(&modifiers, cx);
if !hovered_link_modifier || self.has_pending_selection() {
self.hide_hovered_link(cx);
return;
@@ -241,8 +241,8 @@ impl Editor {
}
})
.collect();
let navigate_task =
self.navigate_to_hover_links(None, links, modifiers.alt, window, cx);
let split = Self::is_alt_pressed(&modifiers, cx);
let navigate_task = self.navigate_to_hover_links(None, links, split, window, cx);
self.select(SelectPhase::End, window, cx);
return navigate_task;
}
@@ -261,7 +261,8 @@ impl Editor {
);
let navigate_task = if point.as_valid().is_some() {
match (modifiers.shift, modifiers.alt) {
let split = Self::is_alt_pressed(&modifiers, cx);
match (modifiers.shift, split) {
(true, true) => {
self.go_to_type_definition_split(&GoToTypeDefinitionSplit, window, cx)
}

View File

@@ -59,10 +59,10 @@ impl Inlay {
pub fn hint(id: InlayId, position: Anchor, hint: &InlayHint) -> Self {
let mut text = hint.text();
if hint.padding_right && text.reversed_chars_at(text.len()).next() != Some(' ') {
text.push_small(" ");
text.push(" ");
}
if hint.padding_left && text.chars_at(0).next() != Some(' ') {
text.push_front_small(" ");
text.push_front(" ");
}
Self {
id,
@@ -72,11 +72,11 @@ impl Inlay {
}
#[cfg(any(test, feature = "test-support"))]
pub fn mock_hint(id: usize, position: Anchor, text: Rope) -> Self {
pub fn mock_hint(id: usize, position: Anchor, text: impl Into<Rope>) -> Self {
Self {
id: InlayId::Hint(id),
position,
content: InlayContent::Text(text),
content: InlayContent::Text(text.into()),
}
}
@@ -88,19 +88,19 @@ impl Inlay {
}
}
pub fn edit_prediction(id: usize, position: Anchor, text: Rope) -> Self {
pub fn edit_prediction<T: Into<Rope>>(id: usize, position: Anchor, text: T) -> Self {
Self {
id: InlayId::EditPrediction(id),
position,
content: InlayContent::Text(text),
content: InlayContent::Text(text.into()),
}
}
pub fn debugger(id: usize, position: Anchor, text: Rope) -> Self {
pub fn debugger<T: Into<Rope>>(id: usize, position: Anchor, text: T) -> Self {
Self {
id: InlayId::DebuggerValue(id),
position,
content: InlayContent::Text(text),
content: InlayContent::Text(text.into()),
}
}
@@ -108,7 +108,7 @@ impl Inlay {
static COLOR_TEXT: OnceLock<Rope> = OnceLock::new();
match &self.content {
InlayContent::Text(text) => text,
InlayContent::Color(_) => COLOR_TEXT.get_or_init(|| Rope::from_str_small("")),
InlayContent::Color(_) => COLOR_TEXT.get_or_init(|| Rope::from("")),
}
}

View File

@@ -1,5 +1,4 @@
use std::{
collections::hash_map,
ops::{ControlFlow, Range},
time::Duration,
};
@@ -49,8 +48,8 @@ pub struct LspInlayHintData {
allowed_hint_kinds: HashSet<Option<InlayHintKind>>,
invalidate_debounce: Option<Duration>,
append_debounce: Option<Duration>,
hint_refresh_tasks: HashMap<BufferId, HashMap<Vec<Range<BufferRow>>, Vec<Task<()>>>>,
hint_chunk_fetched: HashMap<BufferId, (Global, HashSet<Range<BufferRow>>)>,
hint_refresh_tasks: HashMap<BufferId, Vec<Task<()>>>,
hint_chunk_fetching: HashMap<BufferId, (Global, HashSet<Range<BufferRow>>)>,
invalidate_hints_for_buffers: HashSet<BufferId>,
pub added_hints: HashMap<InlayId, Option<InlayHintKind>>,
}
@@ -63,7 +62,7 @@ impl LspInlayHintData {
enabled_in_settings: settings.enabled,
hint_refresh_tasks: HashMap::default(),
added_hints: HashMap::default(),
hint_chunk_fetched: HashMap::default(),
hint_chunk_fetching: HashMap::default(),
invalidate_hints_for_buffers: HashSet::default(),
invalidate_debounce: debounce_value(settings.edit_debounce_ms),
append_debounce: debounce_value(settings.scroll_debounce_ms),
@@ -99,9 +98,8 @@ impl LspInlayHintData {
pub fn clear(&mut self) {
self.hint_refresh_tasks.clear();
self.hint_chunk_fetched.clear();
self.hint_chunk_fetching.clear();
self.added_hints.clear();
self.invalidate_hints_for_buffers.clear();
}
/// Checks inlay hint settings for enabled hint kinds and general enabled state.
@@ -199,7 +197,7 @@ impl LspInlayHintData {
) {
for buffer_id in removed_buffer_ids {
self.hint_refresh_tasks.remove(buffer_id);
self.hint_chunk_fetched.remove(buffer_id);
self.hint_chunk_fetching.remove(buffer_id);
}
}
}
@@ -211,7 +209,10 @@ pub enum InlayHintRefreshReason {
SettingsChange(InlayHintSettings),
NewLinesShown,
BufferEdited(BufferId),
RefreshRequested(LanguageServerId),
RefreshRequested {
server_id: LanguageServerId,
request_id: Option<usize>,
},
ExcerptsRemoved(Vec<ExcerptId>),
}
@@ -296,7 +297,7 @@ impl Editor {
| InlayHintRefreshReason::Toggle(_)
| InlayHintRefreshReason::SettingsChange(_) => true,
InlayHintRefreshReason::NewLinesShown
| InlayHintRefreshReason::RefreshRequested(_)
| InlayHintRefreshReason::RefreshRequested { .. }
| InlayHintRefreshReason::ExcerptsRemoved(_) => false,
InlayHintRefreshReason::BufferEdited(buffer_id) => {
let Some(affected_language) = self
@@ -370,48 +371,45 @@ impl Editor {
let Some(buffer) = multi_buffer.read(cx).buffer(buffer_id) else {
continue;
};
let fetched_tasks = inlay_hints.hint_chunk_fetched.entry(buffer_id).or_default();
let (fetched_for_version, fetched_chunks) = inlay_hints
.hint_chunk_fetching
.entry(buffer_id)
.or_default();
if visible_excerpts
.buffer_version
.changed_since(&fetched_tasks.0)
.changed_since(fetched_for_version)
{
fetched_tasks.1.clear();
fetched_tasks.0 = visible_excerpts.buffer_version.clone();
*fetched_for_version = visible_excerpts.buffer_version.clone();
fetched_chunks.clear();
inlay_hints.hint_refresh_tasks.remove(&buffer_id);
}
let applicable_chunks =
semantics_provider.applicable_inlay_chunks(&buffer, &visible_excerpts.ranges, cx);
let known_chunks = if ignore_previous_fetches {
None
} else {
Some((fetched_for_version.clone(), fetched_chunks.clone()))
};
match inlay_hints
let mut applicable_chunks =
semantics_provider.applicable_inlay_chunks(&buffer, &visible_excerpts.ranges, cx);
applicable_chunks.retain(|chunk| fetched_chunks.insert(chunk.clone()));
if applicable_chunks.is_empty() && !ignore_previous_fetches {
continue;
}
inlay_hints
.hint_refresh_tasks
.entry(buffer_id)
.or_default()
.entry(applicable_chunks)
{
hash_map::Entry::Occupied(mut o) => {
if invalidate_cache.should_invalidate() || ignore_previous_fetches {
o.get_mut().push(spawn_editor_hints_refresh(
buffer_id,
invalidate_cache,
ignore_previous_fetches,
debounce,
visible_excerpts,
cx,
));
}
}
hash_map::Entry::Vacant(v) => {
v.insert(Vec::new()).push(spawn_editor_hints_refresh(
buffer_id,
invalidate_cache,
ignore_previous_fetches,
debounce,
visible_excerpts,
cx,
));
}
}
.push(spawn_editor_hints_refresh(
buffer_id,
invalidate_cache,
debounce,
visible_excerpts,
known_chunks,
applicable_chunks,
cx,
));
}
}
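// --- Illustrative sketch (not part of the diff): the `retain` + `HashSet::insert`
// idiom used above to keep only the chunks that have not been fetched yet while
// recording them as fetched in the same pass. The chunk values are made up.
use std::collections::HashSet;
use std::ops::Range;

fn main() {
    let mut fetched_chunks: HashSet<Range<u32>> = HashSet::from([0..50]);
    let mut applicable_chunks = vec![0..50, 50..100, 100..150];

    // `insert` returns true only for values not already present, so this retains
    // exactly the new chunks and marks them as fetched at the same time.
    applicable_chunks.retain(|chunk| fetched_chunks.insert(chunk.clone()));

    assert_eq!(applicable_chunks, vec![50..100, 100..150]);
    assert_eq!(fetched_chunks.len(), 3);
}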
@@ -506,9 +504,13 @@ impl Editor {
}
InlayHintRefreshReason::NewLinesShown => InvalidationStrategy::None,
InlayHintRefreshReason::BufferEdited(_) => InvalidationStrategy::BufferEdited,
InlayHintRefreshReason::RefreshRequested(server_id) => {
InvalidationStrategy::RefreshRequested(*server_id)
}
InlayHintRefreshReason::RefreshRequested {
server_id,
request_id,
} => InvalidationStrategy::RefreshRequested {
server_id: *server_id,
request_id: *request_id,
},
};
match &mut self.inlay_hints {
@@ -718,44 +720,29 @@ impl Editor {
fn inlay_hints_for_buffer(
&mut self,
invalidate_cache: InvalidationStrategy,
ignore_previous_fetches: bool,
buffer_excerpts: VisibleExcerpts,
known_chunks: Option<(Global, HashSet<Range<BufferRow>>)>,
cx: &mut Context<Self>,
) -> Option<Vec<Task<(Range<BufferRow>, anyhow::Result<CacheInlayHints>)>>> {
let semantics_provider = self.semantics_provider()?;
let inlay_hints = self.inlay_hints.as_mut()?;
let buffer_id = buffer_excerpts.buffer.read(cx).remote_id();
let new_hint_tasks = semantics_provider
.inlay_hints(
invalidate_cache,
buffer_excerpts.buffer,
buffer_excerpts.ranges,
inlay_hints
.hint_chunk_fetched
.get(&buffer_id)
.filter(|_| !ignore_previous_fetches && !invalidate_cache.should_invalidate())
.cloned(),
known_chunks,
cx,
)
.unwrap_or_default();
let (known_version, known_chunks) =
inlay_hints.hint_chunk_fetched.entry(buffer_id).or_default();
if buffer_excerpts.buffer_version.changed_since(known_version) {
known_chunks.clear();
*known_version = buffer_excerpts.buffer_version;
}
let mut hint_tasks = Vec::new();
let mut hint_tasks = None;
for (row_range, new_hints_task) in new_hint_tasks {
let inserted = known_chunks.insert(row_range.clone());
if inserted || ignore_previous_fetches || invalidate_cache.should_invalidate() {
hint_tasks.push(cx.spawn(async move |_, _| (row_range, new_hints_task.await)));
}
hint_tasks
.get_or_insert_with(Vec::new)
.push(cx.spawn(async move |_, _| (row_range, new_hints_task.await)));
}
Some(hint_tasks)
hint_tasks
}
fn apply_fetched_hints(
@@ -793,20 +780,28 @@ impl Editor {
let excerpts = self.buffer.read(cx).excerpt_ids();
let hints_to_insert = new_hints
.into_iter()
.filter_map(|(chunk_range, hints_result)| match hints_result {
Ok(new_hints) => Some(new_hints),
Err(e) => {
log::error!(
"Failed to query inlays for buffer row range {chunk_range:?}, {e:#}"
);
if let Some((for_version, chunks_fetched)) =
inlay_hints.hint_chunk_fetched.get_mut(&buffer_id)
{
if for_version == &query_version {
chunks_fetched.remove(&chunk_range);
.filter_map(|(chunk_range, hints_result)| {
let chunks_fetched = inlay_hints.hint_chunk_fetching.get_mut(&buffer_id);
match hints_result {
Ok(new_hints) => {
if new_hints.is_empty() {
if let Some((_, chunks_fetched)) = chunks_fetched {
chunks_fetched.remove(&chunk_range);
}
}
Some(new_hints)
}
Err(e) => {
log::error!(
"Failed to query inlays for buffer row range {chunk_range:?}, {e:#}"
);
if let Some((for_version, chunks_fetched)) = chunks_fetched {
if for_version == &query_version {
chunks_fetched.remove(&chunk_range);
}
}
None
}
None
}
})
.flat_map(|hints| hints.into_values())
@@ -856,9 +851,10 @@ struct VisibleExcerpts {
fn spawn_editor_hints_refresh(
buffer_id: BufferId,
invalidate_cache: InvalidationStrategy,
ignore_previous_fetches: bool,
debounce: Option<Duration>,
buffer_excerpts: VisibleExcerpts,
known_chunks: Option<(Global, HashSet<Range<BufferRow>>)>,
applicable_chunks: Vec<Range<BufferRow>>,
cx: &mut Context<'_, Editor>,
) -> Task<()> {
cx.spawn(async move |editor, cx| {
@@ -869,12 +865,7 @@ fn spawn_editor_hints_refresh(
let query_version = buffer_excerpts.buffer_version.clone();
let Some(hint_tasks) = editor
.update(cx, |editor, cx| {
editor.inlay_hints_for_buffer(
invalidate_cache,
ignore_previous_fetches,
buffer_excerpts,
cx,
)
editor.inlay_hints_for_buffer(invalidate_cache, buffer_excerpts, known_chunks, cx)
})
.ok()
else {
@@ -882,6 +873,19 @@ fn spawn_editor_hints_refresh(
};
let hint_tasks = hint_tasks.unwrap_or_default();
if hint_tasks.is_empty() {
editor
.update(cx, |editor, _| {
if let Some((_, hint_chunk_fetching)) = editor
.inlay_hints
.as_mut()
.and_then(|inlay_hints| inlay_hints.hint_chunk_fetching.get_mut(&buffer_id))
{
for applicable_chunks in &applicable_chunks {
hint_chunk_fetching.remove(applicable_chunks);
}
}
})
.ok();
return;
}
let new_hints = join_all(hint_tasks).await;
@@ -1102,7 +1106,10 @@ pub mod tests {
editor
.update(cx, |editor, _window, cx| {
editor.refresh_inlay_hints(
InlayHintRefreshReason::RefreshRequested(fake_server.server.server_id()),
InlayHintRefreshReason::RefreshRequested {
server_id: fake_server.server.server_id(),
request_id: Some(1),
},
cx,
);
})
@@ -1958,15 +1965,8 @@ pub mod tests {
async fn test_large_buffer_inlay_requests_split(cx: &mut gpui::TestAppContext) {
init_test(cx, |settings| {
settings.defaults.inlay_hints = Some(InlayHintSettingsContent {
show_value_hints: Some(true),
enabled: Some(true),
edit_debounce_ms: Some(0),
scroll_debounce_ms: Some(0),
show_type_hints: Some(true),
show_parameter_hints: Some(true),
show_other_hints: Some(true),
show_background: Some(false),
toggle_on_modifiers_press: None,
..InlayHintSettingsContent::default()
})
});
@@ -2044,6 +2044,7 @@ pub mod tests {
cx.add_window(|window, cx| Editor::for_buffer(buffer, Some(project), window, cx));
cx.executor().run_until_parked();
let _fake_server = fake_servers.next().await.unwrap();
cx.executor().advance_clock(Duration::from_millis(100));
cx.executor().run_until_parked();
let ranges = lsp_request_ranges
@@ -2129,6 +2130,7 @@ pub mod tests {
);
})
.unwrap();
cx.executor().advance_clock(Duration::from_millis(100));
cx.executor().run_until_parked();
editor.update(cx, |_, _, _| {
let ranges = lsp_request_ranges
@@ -2145,6 +2147,7 @@ pub mod tests {
editor.handle_input("++++more text++++", window, cx);
})
.unwrap();
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
editor.update(cx, |editor, _window, cx| {
let mut ranges = lsp_request_ranges.lock().drain(..).collect::<Vec<_>>();
@@ -3887,7 +3890,10 @@ let c = 3;"#
editor
.update(cx, |editor, _, cx| {
editor.refresh_inlay_hints(
InlayHintRefreshReason::RefreshRequested(fake_server.server.server_id()),
InlayHintRefreshReason::RefreshRequested {
server_id: fake_server.server.server_id(),
request_id: Some(1),
},
cx,
);
})
@@ -4022,7 +4028,7 @@ let c = 3;"#
let mut all_fetched_hints = Vec::new();
for buffer in editor.buffer.read(cx).all_buffers() {
lsp_store.update(cx, |lsp_store, cx| {
let hints = &lsp_store.latest_lsp_data(&buffer, cx).inlay_hints();
let hints = lsp_store.latest_lsp_data(&buffer, cx).inlay_hints();
all_cached_labels.extend(hints.all_cached_hints().into_iter().map(|hint| {
let mut label = hint.text().to_string();
if hint.padding_left {

View File

@@ -1,7 +1,7 @@
use crate::{
Anchor, Autoscroll, Editor, EditorEvent, EditorSettings, ExcerptId, ExcerptRange, FormatTarget,
MultiBuffer, MultiBufferSnapshot, NavigationData, ReportEditorEvent, SearchWithinRange,
SelectionEffects, ToPoint as _,
Anchor, Autoscroll, BufferSerialization, Editor, EditorEvent, EditorSettings, ExcerptId,
ExcerptRange, FormatTarget, MultiBuffer, MultiBufferSnapshot, NavigationData,
ReportEditorEvent, SearchWithinRange, SelectionEffects, ToPoint as _,
display_map::HighlightKey,
editor_settings::SeedQuerySetting,
persistence::{DB, SerializedEditor},
@@ -1256,17 +1256,15 @@ impl SerializableItem for Editor {
window: &mut Window,
cx: &mut Context<Self>,
) -> Option<Task<Result<()>>> {
if self.mode.is_minimap() {
return None;
}
let mut serialize_dirty_buffers = self.serialize_dirty_buffers;
let buffer_serialization = self.buffer_serialization?;
let project = self.project.clone()?;
if project.read(cx).visible_worktrees(cx).next().is_none() {
let serialize_dirty_buffers = match buffer_serialization {
// If we don't have a worktree, we don't serialize, because
// projects without worktrees aren't deserialized.
serialize_dirty_buffers = false;
}
BufferSerialization::All => project.read(cx).visible_worktrees(cx).next().is_some(),
BufferSerialization::NonDirtyBuffers => false,
};
if closing && !serialize_dirty_buffers {
return None;
@@ -1323,10 +1321,11 @@ impl SerializableItem for Editor {
}
fn should_serialize(&self, event: &Self::Event) -> bool {
matches!(
event,
EditorEvent::Saved | EditorEvent::DirtyChanged | EditorEvent::BufferEdited
)
self.should_serialize_buffer()
&& matches!(
event,
EditorEvent::Saved | EditorEvent::DirtyChanged | EditorEvent::BufferEdited
)
}
}
@@ -1593,7 +1592,12 @@ impl SearchableItem for Editor {
) {
self.unfold_ranges(&[matches[index].clone()], false, true, cx);
let range = self.range_for_match(&matches[index], collapse);
self.change_selections(Default::default(), window, cx, |s| {
let autoscroll = if EditorSettings::get_global(cx).search.center_on_match {
Autoscroll::center()
} else {
Autoscroll::fit()
};
self.change_selections(SelectionEffects::scroll(autoscroll), window, cx, |s| {
s.select_ranges([range]);
})
}

View File

@@ -60,8 +60,10 @@ async fn lsp_task_context(
buffer: &Entity<Buffer>,
cx: &mut AsyncApp,
) -> Option<TaskContext> {
let worktree_store = project
.read_with(cx, |project, _| project.worktree_store())
let (worktree_store, environment) = project
.read_with(cx, |project, _| {
(project.worktree_store(), project.environment().clone())
})
.ok()?;
let worktree_abs_path = cx
@@ -74,9 +76,9 @@ async fn lsp_task_context(
})
.ok()?;
let project_env = project
.update(cx, |project, cx| {
project.buffer_environment(buffer, &worktree_store, cx)
let project_env = environment
.update(cx, |environment, cx| {
environment.buffer_environment(buffer, &worktree_store, cx)
})
.ok()?
.await;

View File

@@ -878,7 +878,6 @@ mod tests {
use gpui::{AppContext as _, font, px};
use language::Capability;
use project::{Project, project_settings::DiagnosticSeverity};
use rope::Rope;
use settings::SettingsStore;
use util::post_inc;
@@ -1025,22 +1024,22 @@ mod tests {
Inlay::edit_prediction(
post_inc(&mut id),
buffer_snapshot.anchor_before(offset),
Rope::from_str_small("test"),
"test",
),
Inlay::edit_prediction(
post_inc(&mut id),
buffer_snapshot.anchor_after(offset),
Rope::from_str_small("test"),
"test",
),
Inlay::mock_hint(
post_inc(&mut id),
buffer_snapshot.anchor_before(offset),
Rope::from_str_small("test"),
"test",
),
Inlay::mock_hint(
post_inc(&mut id),
buffer_snapshot.anchor_after(offset),
Rope::from_str_small("test"),
"test",
),
]
})

View File

@@ -603,7 +603,7 @@ impl Editor {
scroll_position
};
let editor_was_scrolled = self.scroll_manager.set_scroll_position(
self.scroll_manager.set_scroll_position(
adjusted_position,
&display_map,
local,
@@ -611,22 +611,7 @@ impl Editor {
workspace_id,
window,
cx,
);
self.post_scroll_update = cx.spawn_in(window, async move |editor, cx| {
cx.background_executor()
.timer(Duration::from_millis(50))
.await;
editor
.update_in(cx, |editor, window, cx| {
editor.register_visible_buffers(cx);
editor.refresh_colors_for_visible_range(None, window, cx);
editor.refresh_inlay_hints(InlayHintRefreshReason::NewLinesShown, cx);
})
.ok();
});
editor_was_scrolled
)
}
pub fn scroll_position(&self, cx: &mut Context<Self>) -> gpui::Point<ScrollOffset> {

View File

@@ -193,7 +193,7 @@ impl Editor {
if let Some(language) = language {
for signature in &mut signature_help.signatures {
let text = Rope::from_str_small(signature.label.as_ref());
let text = Rope::from(signature.label.as_ref());
let highlights = language
.highlight_text(&text, 0..signature.label.len())
.into_iter()

Some files were not shown because too many files have changed in this diff.