Compare commits

...

34 Commits

Author SHA1 Message Date
Joseph T. Lyons
e486f3e80a v0.190.x stable 2025-06-11 11:18:53 -04:00
Kirill Bulatov
abcec9f33e Remove previous multi buffer hardcode from the outline panel (#32321)
Closes https://github.com/zed-industries/zed/issues/32316

The multi buffer design was changed so that control buttons no longer
occupy extra lines; the hardcoded logic for that is now obsolete and has
been removed.

Release Notes:

- Fixed incorrect offsets during outline panel navigation in singleton
buffers
2025-06-11 12:38:53 +03:00
gcp-cherry-pick-bot[bot]
8b18939f2f Log error instead of panics in InlineAssistant::scroll_to_assist (cherry-pick #32519) (#32520)
Cherry-picked Log error instead of panics in
`InlineAssistant::scroll_to_assist` (#32519)

Leaving release notes blank as it's not very actionable to know that a
rare crash might be fixed.

Release Notes:

- N/A

Co-authored-by: Michael Sloan <michael@zed.dev>
2025-06-11 00:44:58 -06:00
gcp-cherry-pick-bot[bot]
ac967be2f7 Add missing #[track_caller] meant to be in #32433 (cherry-pick #32434) (#32518)
Cherry-picked Add missing `#[track_caller]` meant to be in #32433
(#32434)

Release Notes:

- N/A

Co-authored-by: Michael Sloan <michael@zed.dev>
2025-06-10 22:57:41 -06:00
gcp-cherry-pick-bot[bot]
0591076054 Don't push to selection history if selections are empty (cherry-pick #32433) (#32517)
Cherry-picked Don't push to selection history if selections are empty
(#32433)

I got a panic during undo but haven't been able to repro it. Potentially
a consequence of my changes in #31731

> Thread "main" panicked with "There must be at least one selection" at
crates/editor/src/selections_collection.rs

Leaving release notes blank as I'm not sure this actually fixes the
panic

Release Notes:

- N/A

Co-authored-by: Michael Sloan <michael@zed.dev>
2025-06-10 22:56:42 -06:00
Zed Bot
8adf3c0462 Bump to 0.190.4 for @ConradIrwin 2025-06-11 04:31:27 +00:00
Anthony Eid
8fa21f7793 debugger beta: Fix inline value provider panic (#32502)
Closes #32143

Release Notes:

- debugger beta: Fix panic that could occur when generating inline
values
2025-06-10 22:24:43 -06:00
Conrad Irwin
2fd32ff568 Fix cherry-pick 2025-06-10 22:23:48 -06:00
gcp-cherry-pick-bot[bot]
cb660c5f8c debugger: Fix panic when handling invalid RunInTerminal request (cherry-pick #32500) (#32513)
Cherry-picked debugger: Fix panic when handling invalid `RunInTerminal`
request (#32500)

The new dap-types version provides a default `cwd` for the
`RunInTerminal` request.

Closes #31695

Release Notes:

- debugger beta: Fix panic that occurred when a debug adapter sent an
invalid `RunInTerminal` request

Co-authored-by: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2025-06-10 22:22:51 -06:00
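For illustration only (hypothetical names, not the dap-types API): the kind of fallback described above, using the process working directory when a `RunInTerminal` request carries no usable `cwd`, rather than panicking.

```rust
use std::path::PathBuf;

// Hypothetical helper: pick a usable working directory for the terminal.
fn effective_cwd(requested: Option<PathBuf>) -> PathBuf {
    requested
        .filter(|path| path.is_dir())
        .unwrap_or_else(|| std::env::current_dir().unwrap_or_else(|_| PathBuf::from(".")))
}

fn main() {
    // An adapter that sent no cwd (or a bogus one) falls back gracefully.
    println!("{}", effective_cwd(None).display());
}
```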
Antonio Scandurra
7be10b974b Replace async-watch with a custom watch (#32245)
The `async-watch` crate doesn't seem to be maintained and we noticed
several panics coming from it, such as:

```
[bug] failed to observe change after notificaton.
zed::reliability::init_panic_hook::{{closure}}::hea8cdcb6299fad6b+154543526
std::panicking::rust_panic_with_hook::h33b18b24045abff4+127578547
std::panicking::begin_panic_handler::{{closure}}::hf8313cc2fd0126bc+127577770
std::sys::backtrace::__rust_end_short_backtrace::h57fe07c8aea5c98a+127571385
__rustc[95feac21a9532783]::rust_begin_unwind+127576909
core::panicking::panic_fmt::hd54fb667be51beea+9433328
core::option::expect_failed::h8456634a3dada3e4+9433291
assistant_tools::edit_agent::EditAgent::apply_edit_chunks::{{closure}}::habe2e1a32b267fd4+26921553
gpui::app::async_context::AsyncApp::spawn::{{closure}}::h12f5f25757f572ea+25923441
async_task::raw::RawTask<F,T,S,M>::run::h3cca0d402690ccba+25186815
<gpui::platform::linux::x11::client::X11Client as gpui::platform::linux::platform::LinuxClient>::run::h26264aefbcfbc14b+73961666
gpui::platform::linux::platform::<impl gpui::platform::Platform for P>::run::hb12dcd4abad715b5+73562509
gpui::app::Application::run::h0f936a5f855a3f9f+150676820
zed::main::ha17f9a25fe257d35+154788471
std::sys::backtrace::__rust_begin_short_backtrace::h1edd02429370b2bd+154624579
std::rt::lang_start::{{closure}}::h3d2e300f10059b0a+154264777
std::rt::lang_start_internal::h418648f91f5be3a1+127502049
main+154806636
__libc_start_main+46051972301573
_start+12358494
```

I didn't find an executor-agnostic watch crate that was well maintained
(we already tried postage and async-watch), so we decided to implement
our own version.

Release Notes:

- Fixed a panic that could sometimes occur when the agent performed
edits.
2025-06-10 22:13:26 -06:00
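As background for the commit above, a minimal blocking sketch of the watch-channel idea: the latest value plus a version counter, so a receiver only wakes for changes it hasn't seen. The actual `crates/watch` added in this diff is async and GPUI-integrated; this is illustrative only.

```rust
use std::sync::{Arc, Condvar, Mutex};

// Shared state: the latest value plus a version counter so a receiver can
// tell whether it has already observed the current value.
struct Shared<T> {
    state: Mutex<(T, u64)>,
    changed: Condvar,
}

pub struct Sender<T>(Arc<Shared<T>>);

pub struct Receiver<T> {
    shared: Arc<Shared<T>>,
    seen: u64,
}

pub fn channel<T>(initial: T) -> (Sender<T>, Receiver<T>) {
    let shared = Arc::new(Shared {
        state: Mutex::new((initial, 0)),
        changed: Condvar::new(),
    });
    (Sender(shared.clone()), Receiver { shared, seen: 0 })
}

impl<T> Sender<T> {
    pub fn send(&self, value: T) {
        let mut state = self.0.state.lock().unwrap();
        state.0 = value;
        state.1 += 1;
        self.0.changed.notify_all();
    }
}

impl<T: Clone> Receiver<T> {
    /// Block until the value changes since the last call, then return a copy.
    pub fn changed(&mut self) -> T {
        let mut state = self.shared.state.lock().unwrap();
        while state.1 == self.seen {
            state = self.shared.changed.wait(state).unwrap();
        }
        self.seen = state.1;
        state.0.clone()
    }
}
```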
Max Brunsfeld
9744fd5965 Add a test demonstrating ERB language loading bug (#32278)
Fixes https://github.com/zed-industries/zed/issues/12174

Release Notes:

- Fixed a bug where ERB files were not parsed correctly when the
languages were initially loaded.
2025-06-10 21:06:12 -07:00
gcp-cherry-pick-bot[bot]
5a6def7150 Add ANSI C quoting to export env parsing (cherry-pick #32404) (#32507)
Cherry-picked Add ANSI C quoting to export env parsing (#32404)

Follow-up to #31799 to support ANSI-C quoting. This is used by
nix/direnv.

Release Notes:

- N/A

Co-authored-by: Stanislav Alekseev <43210583+WeetHet@users.noreply.github.com>
2025-06-10 22:01:39 -06:00
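A hedged mini-example of what ANSI-C (`$'...'`) quoting looks like to a parser; this is not the code added in the PR and only handles a few common escapes.

```rust
// Hypothetical helper: unquote a shell ANSI-C string like $'a\nb'.
fn unquote_ansi_c(s: &str) -> Option<String> {
    let inner = s.strip_prefix("$'")?.strip_suffix('\'')?;
    let mut out = String::new();
    let mut chars = inner.chars();
    while let Some(c) = chars.next() {
        if c == '\\' {
            match chars.next()? {
                'n' => out.push('\n'),
                't' => out.push('\t'),
                '\\' => out.push('\\'),
                '\'' => out.push('\''),
                other => {
                    // Unknown escapes are passed through untouched here.
                    out.push('\\');
                    out.push(other);
                }
            }
        } else {
            out.push(c);
        }
    }
    Some(out)
}

fn main() {
    assert_eq!(unquote_ansi_c("$'a\\nb'").as_deref(), Some("a\nb"));
}
```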
gcp-cherry-pick-bot[bot]
b908614d82 Filter language server completions even when is_incomplete: true (cherry-pick #32491) (#32494)
Cherry-picked Filter language server completions even when
`is_incomplete: true` (#32491)

In #31872 I changed the behavior of completions to filter instead of
re-querying completions when `is_incomplete: false`. Unfortunately this
also stopped filtering completions when `is_incomplete: true` - we still
want to filter the incomplete completions so that the menu updates
quickly even when completions are slow. This does mean that the
completions menu will display partial results, hopefully only briefly
while waiting for fresh completions.

Thanks to @mikayla-maki for noticing the regression, thankfully just in
time to fix it before it makes it into a stable release. Leaving off
release notes since I will cherry-pick this to the current preview
version, 190.x, and there probably won't be a preview release before the
next stable.

Release Notes:

- N/A

Co-authored-by: Michael Sloan <michael@zed.dev>
2025-06-10 16:33:04 -06:00
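A rough sketch of the intended decision, with hypothetical names rather than the editor's API: always filter the visible menu against the current query, and only skip a fresh language-server request when the previous response was complete and the new query merely extends the old one.

```rust
// Hypothetical helper mirroring the behavior described above.
fn should_requery(is_incomplete: bool, old_query: &str, new_query: &str) -> bool {
    // Incomplete responses must be re-requested; complete ones can be reused
    // as long as the user is only appending characters to the old query.
    is_incomplete || !new_query.starts_with(old_query)
}

fn main() {
    assert!(!should_requery(false, "fo", "foo")); // complete + extended query: just filter
    assert!(should_requery(true, "fo", "foo")); // incomplete: filter now, but also re-query
}
```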
Joseph T. Lyons
48c6c377a6 Revert "Preserve selection direction when running editor: open selections in multibuffer" (#32483)
Reverts zed-industries/zed#31399

I found that in some cases, Zed will panic when using `editor: open
selections in multibuffer` if the selection is reversed. It doesn't
happen in most cases that I've tested, but in some strange edge cases
(that I don't fully understand at the moment), it does. I'm reverting
for now, as the previous behavior is better than a panic, but will
re-implement this fix to preserve selection directions in a new PR with
comprehensive testing.

Release Notes:

- N/A
2025-06-10 16:41:45 -04:00
gcp-cherry-pick-bot[bot]
4338967ef5 Use unix pipe to capture environment variables (cherry-pick #32136) (#32428)
Cherry-picked Use unix pipe to capture environment variables (#32136)

The use of `NamedTempFile` in #31799 was not secure and could
potentially cause write permission issues (see [this
comment](https://github.com/zed-industries/zed/issues/29528#issuecomment-2939672433)).
Therefore, it has been replaced with a Unix pipe.

Release Notes:
- N/A

Co-authored-by: Haru Kim <haruleekim@icloud.com>
2025-06-09 20:56:25 -06:00
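A rough illustration of the pipe-based approach, heavily simplified: the sketch below reads the shell's environment over the child's piped stdout, whereas the actual change passes a dedicated file descriptor (via `command-fds`); it is not Zed's code.

```rust
use std::io::Read;
use std::process::{Command, Stdio};

// Simplified sketch: run a login shell, have it print its environment, and
// read it back over an OS pipe instead of going through a temp file on disk.
fn capture_shell_env(shell: &str) -> std::io::Result<String> {
    let mut child = Command::new(shell)
        .args(["-l", "-c", "env -0"]) // NUL-separated to tolerate newlines in values
        .stdout(Stdio::piped())
        .spawn()?;
    let mut out = String::new();
    child.stdout.take().unwrap().read_to_string(&mut out)?;
    child.wait()?;
    Ok(out)
}

fn main() -> std::io::Result<()> {
    let env = capture_shell_env("/bin/sh")?;
    println!("captured {} bytes of environment", env.len());
    Ok(())
}
```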
gcp-cherry-pick-bot[bot]
1afd06d52f Fix buffer rendering on every mouse move (cherry-pick #32408) (#32412)
Cherry-picked Fix buffer rendering on every mouse move (#32408)

Closes #32210

This notify was added in #13433. The solution is to only notify when the
breakpoint indicator state has changed.

Also improves the logic for enqueuing the task that delays showing the
indicator: it is now only enqueued if the indicator isn't already
visible, and the delay task now only notifies if the mouse is still
hovering.

Release Notes:

- Fixed a bug where buffers were re-rendered on every mouse move.

Co-authored-by: Michael Sloan <michael@zed.dev>
2025-06-09 14:27:25 -06:00
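The pattern behind the fix, sketched with hypothetical types (the real change compares the new `PhantomBreakpointIndicator` against the stored one before calling `cx.notify()`):

```rust
#[derive(Clone, Copy, PartialEq)]
struct Indicator {
    row: u32,
    is_active: bool,
}

// Only request a re-render when the indicator state actually changed.
fn set_indicator(
    current: &mut Option<Indicator>,
    new: Option<Indicator>,
    request_render: impl FnOnce(),
) {
    if *current != new {
        *current = new;
        request_render();
    }
}

fn main() {
    let mut current = Some(Indicator { row: 3, is_active: false });
    // Mouse moved within the same line: no state change, no re-render.
    set_indicator(&mut current, Some(Indicator { row: 3, is_active: false }), || {
        unreachable!()
    });
    // Mouse moved to another line: state changed, re-render once.
    set_indicator(&mut current, Some(Indicator { row: 4, is_active: false }), || {
        println!("notify")
    });
}
```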
gcp-cherry-pick-bot[bot]
0bcb68ee29 No longer instantiate recently opened agent threads on startup (cherry-pick #32285) (#32351)
Cherry-picked No longer instantiate recently opened agent threads on
startup (#32285)

This was causing a lot of work on startup, particularly due to
instantiating edit tool cards. The minor downside is that now these
threads don't open quite as fast.

Includes a few other improvements:

* On text thread rename, now immediately updates the metadata for
display in the UI instead of waiting for reload.

* On text thread rename, the file is now renamed before writing.
Previously, if removing the old file failed, you'd end up with a
duplicate.

* Now only stores text thread file names instead of full paths. This is
more concise and allows the app data dir to change location.

* Renames `ThreadStore::unordered_threads` to
`ThreadStore::reverse_chronological_threads` (and removes the old one
that sorted), since the recent change to use a SQL database queries them
in that order.

* Removes `ContextStore::reverse_chronological_contexts` since it was
only used in one location where it does sorting anyway - no need to sort
twice.

* `SavedContextMetadata::title` is now `SharedString` instead of
`String`.

Release Notes:

- Fixed regression in startup performance by not deserializing and
instantiating recently opened agent threads.

Co-authored-by: Michael Sloan <michael@zed.dev>
2025-06-08 13:46:19 -06:00
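A much-simplified sketch of the shift this commit describes (names trimmed, not the real `HistoryStore`): keep only lightweight identifiers in the recently-opened list and resolve them to full entries lazily, instead of instantiating threads up front.

```rust
use std::path::PathBuf;

// Lightweight identifiers are cheap to persist and to hold at startup.
#[derive(Clone, PartialEq)]
enum HistoryEntryId {
    Thread(String),
    Context(PathBuf),
}

struct HistoryStore {
    recently_opened: Vec<HistoryEntryId>,
}

impl HistoryStore {
    fn push(&mut self, id: HistoryEntryId) {
        self.recently_opened.retain(|old| old != &id); // de-duplicate
        self.recently_opened.insert(0, id); // most recent first
    }
}

fn main() {
    let mut store = HistoryStore { recently_opened: Vec::new() };
    store.push(HistoryEntryId::Thread("t1".into()));
    store.push(HistoryEntryId::Context(PathBuf::from("a.json")));
    store.push(HistoryEntryId::Thread("t1".into())); // moves t1 back to the front
    assert_eq!(store.recently_opened.len(), 2);
}
```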
Joseph T. Lyons
857897ee10 zed 0.190.3 2025-06-07 13:58:18 -04:00
gcp-cherry-pick-bot[bot]
e6de7933bb Fix panic dragging tabs multiple positions to the right (cherry-pick #32305) (#32307)
Cherry-picked Fix panic dragging tabs multiple positions to the right
(#32305)

Closes https://github.com/zed-industries/zed/issues/32303

Release Notes:

- Fixed a panic that occurred when dragging tabs multiple positions to
the right (preview only)

Co-authored-by: Joseph T. Lyons <JosephTLyons@gmail.com>
2025-06-07 13:46:15 -04:00
Peter Tripp
fc077d194c zed 0.190.2 2025-06-06 18:59:21 -04:00
Michael Sloan
4cf76207aa Fix anchor biases for completion replacement ranges (esp slash commands) (cherry-pick #32262) (#32271)
Closes #32205

The issue was that in some places the end of the replacement range used
anchors with `Bias::Left` instead of `Bias::Right`. Before #31872,
completions were recomputed on every change, so the anchor bias didn't
matter. After that change, the end anchor didn't move as the user typed.
Changing it to `Bias::Right`, so it "sticks" to the character to the
right of the cursor, fixes this.

Also fixes slash command arguments to now have `is_incomplete: true`,
since some of the argument providers do limited fuzzy matching.

Release Notes:

- Fixed slash command completions in agent text threads to replace the
user's query text (Preview Only)
2025-06-06 15:57:54 -06:00
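A toy model of the bias behavior described above (not Zed's `Anchor` type): when text is inserted exactly at an anchor's position, a left-biased anchor stays put, while a right-biased anchor follows the character to its right, so a right-biased range end grows with the user's typing.

```rust
#[derive(Clone, Copy, PartialEq)]
enum Bias {
    Left,
    Right,
}

#[derive(Clone, Copy)]
struct Anchor {
    offset: usize,
    bias: Bias,
}

// When `len` characters are inserted at `at`, anchors past the insertion point
// shift right; an anchor exactly at the insertion point shifts only if it is
// right-biased (it "sticks" to the character on its right).
fn adjust(anchor: Anchor, at: usize, len: usize) -> Anchor {
    let moves = anchor.offset > at || (anchor.offset == at && anchor.bias == Bias::Right);
    Anchor {
        offset: if moves { anchor.offset + len } else { anchor.offset },
        ..anchor
    }
}

fn main() {
    // A replacement range ends at offset 5; the user types 2 characters there.
    let left = Anchor { offset: 5, bias: Bias::Left };
    let right = Anchor { offset: 5, bias: Bias::Right };
    assert_eq!(adjust(left, 5, 2).offset, 5); // range end does not grow
    assert_eq!(adjust(right, 5, 2).offset, 7); // range end follows the typing
}
```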
Joseph T. Lyons
1b45289efa Fix panic when dragging pinned item left in pinned region (#32263)
This was a regression from my recent fixes to pinned tabs. Dragging a
pinned tab left within the pinned region would still update the pinned
tab count, which would later cause an out-of-bounds panic when that
value was used to index into a vec.

https://zed-industries.slack.com/archives/C04S6T1T7TQ/p1749220447796559

Release Notes:

- Fixed a panic caused by dragging a pinned item to the left in the
pinned region
2025-06-06 16:03:20 -04:00
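Sketching the invariant behind the fix with a hypothetical helper: the pinned count should only change when a tab actually crosses the pinned/unpinned boundary, not when it is reordered within the pinned region.

```rust
// Hypothetical helper: compute the pinned-tab count after moving a tab from
// `from_ix` to `to_ix`, given the current `pinned_count`.
fn new_pinned_count(pinned_count: usize, from_ix: usize, to_ix: usize) -> usize {
    let from_pinned = from_ix < pinned_count;
    let to_pinned = to_ix < pinned_count;
    match (from_pinned, to_pinned) {
        (true, false) => pinned_count - 1, // tab left the pinned region
        (false, true) => pinned_count + 1, // tab entered the pinned region
        _ => pinned_count,                 // moved within one region: unchanged
    }
}

fn main() {
    // Moving a pinned tab left within the pinned region: count unchanged.
    assert_eq!(new_pinned_count(3, 2, 0), 3);
    // Dragging a pinned tab past the boundary unpins it.
    assert_eq!(new_pinned_count(3, 2, 5), 2);
}
```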
Joseph T. Lyons
8eb12a3c3d Fixed more bugs around moving pinned tabs (#32228)
Closes https://github.com/zed-industries/zed/issues/32199
https://github.com/zed-industries/zed/issues/32229
https://github.com/zed-industries/zed/issues/32230
https://github.com/zed-industries/zed/issues/32232

Release Notes:

- Fixed a bug where if the last tab was a pinned tab and it was dragged
to the right, resulting in a no-op, it would become unpinned
- Fixed a bug where a pinned tab dragged just to the right of the end of
the pinned tab region would become unpinned
- Fixed a bug where dragging a pinned tab from one pane to another
pane's pinned region could result in an existing pinned tab becoming
unpinned when `max_tabs` was reached
- Fixed a bug where moving an unpinned tab to the left, just to the end
of the pinned region, would cause the pinned tabs to become unpinned.
2025-06-06 16:03:10 -04:00
Joseph T. Lyons
7b4f37353a Refactor some logic in handle_tab_drop (#32213)
Tiny little clean-up PR after #32184.

Release Notes:

- N/A
2025-06-06 16:02:55 -04:00
Joseph T. Lyons
9b8035e8c2 Fix bugs around tab state loss when moving pinned tabs across panes (#32184)
Closes https://github.com/zed-industries/zed/issues/32181,
https://github.com/zed-industries/zed/issues/32179

In the screenshots: left = nightly, right = dev

Fix 1: A pinned tab dragged into a new split should remain pinned


https://github.com/user-attachments/assets/608a7e10-4ccb-4219-ba81-624298c960b0

Fix 2: Moving a pinned tab from one pane to another should not cause
other pinned tabs to be unpinned


https://github.com/user-attachments/assets/ccc05913-591d-4a43-85bb-3a7164a4d6a8

I also added tests for moving both pinned tabs and unpinned tabs into
existing panes, both into the "pinned" region and the "unpinned" region.

Release Notes:

- Fixed a bug where dragging a pinned tab into a new split would lose
its pinned tab state.
- Fixed a bug where pinned tabs in one pane could be lost when moving
one of the pinned tabs to another pane.
2025-06-06 16:02:42 -04:00
Peter Tripp
f6fdb7094b ci: Auto-release release prefix hotfixes again (#32265)
Undo:
- https://github.com/zed-industries/zed/pull/24894

CC: @JosephTLyons

Release Notes:

- N/A
2025-06-06 15:30:00 -04:00
Peter Tripp
0c90b19fa1 Build zed-remote-server on FreeBSD (#29561)
Builds the FreeBSD remote server under QEMU on Linux runners.

Release Notes:

- Initial support for ssh remotes running FreeBSD x86_64
2025-06-06 14:00:06 -04:00
Zed Bot
cb2885410d Bump to 0.190.1 for @ConradIrwin 2025-06-06 03:11:19 +00:00
gcp-cherry-pick-bot[bot]
6535a148b5 Fix innermost brackets panic (cherry-pick #32120) (#32189)
Cherry-picked Fix innermost brackets panic (#32120)

Release Notes:

- Fixed a few rare panics that could happen when a multibuffer excerpt
started with expanded deleted content.

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-06-05 16:00:22 -06:00
gcp-cherry-pick-bot[bot]
fc3a5a799e Fix a panic in merge conflict parsing (cherry-pick #32119) (#32123)
Cherry-picked Fix a panic in merge conflict parsing (#32119)

Release Notes:

- Fixed a panic that could occur when editing files containing merge
conflicts.

Co-authored-by: Cole Miller <cole@zed.dev>
2025-06-04 20:32:12 -04:00
Max Brunsfeld
393382c448 Bump tree-sitter-bash to 0.25 (#32091)
Closes https://github.com/zed-industries/zed/issues/23703

Release Notes:

- Fixed a crash that could occur when editing bash files
2025-06-04 13:23:22 -04:00
Max Brunsfeld
2ad24ef799 Bump Tree-sitter to 0.25.6 (#32090)
Fixes #31810 

Release Notes:

- Fixed a crash that could occur when editing YAML files.
2025-06-04 12:55:46 -04:00
Vitaly Slobodin
185e0696ae ruby: Add sorbet and steep to the list of available language servers (#32008)
Hi, this pull request adds `sorbet` and `steep` to the list of available
language servers for the Ruby language, in order to prepare default Ruby
language settings for these language servers. Both are disabled by
default. We plan to add both in #104 and #102. Thanks!

Release Notes:

- ruby: Added `sorbet` and `steep` to the list of available language servers.
2025-06-04 10:20:51 -04:00
Joseph T. Lyons
750c6356e7 v0.190.x preview 2025-06-04 10:12:33 -04:00
53 changed files with 2203 additions and 501 deletions

View File

@@ -715,6 +715,64 @@ jobs:
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
freebsd:
timeout-minutes: 60
runs-on: github-8vcpu-ubuntu-2404
if: |
startsWith(github.ref, 'refs/tags/v')
|| contains(github.event.pull_request.labels.*.name, 'run-bundling')
needs: [linux_tests]
name: Build Zed on FreeBSD
# env:
# MYTOKEN : ${{ secrets.MYTOKEN }}
# MYTOKEN2: "value2"
steps:
- uses: actions/checkout@v4
- name: Build FreeBSD remote-server
id: freebsd-build
uses: vmactions/freebsd-vm@c3ae29a132c8ef1924775414107a97cac042aad5 # v1.2.0
with:
# envs: "MYTOKEN MYTOKEN2"
usesh: true
release: 13.5
copyback: true
prepare: |
pkg install -y \
bash curl jq git \
rustup-init cmake-core llvm-devel-lite pkgconf protobuf # ibx11 alsa-lib rust-bindgen-cli
run: |
freebsd-version
sysctl hw.model
sysctl hw.ncpu
sysctl hw.physmem
sysctl hw.usermem
git config --global --add safe.directory /home/runner/work/zed/zed
rustup-init --profile minimal --default-toolchain none -y
. "$HOME/.cargo/env"
./script/bundle-freebsd
mkdir -p out/
mv "target/zed-remote-server-freebsd-x86_64.gz" out/
rm -rf target/
cargo clean
- name: Upload Artifact to Workflow - zed-remote-server (run-bundling)
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4
if: contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-freebsd.gz
path: out/zed-remote-server-freebsd-x86_64.gz
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
if: ${{ !(contains(github.event.pull_request.labels.*.name, 'run-bundling')) }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
out/zed-remote-server-freebsd-x86_64.gz
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
nix-build:
name: Build with Nix
uses: ./.github/workflows/nix.yml
@@ -729,12 +787,12 @@ jobs:
if: |
startsWith(github.ref, 'refs/tags/v')
&& endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre')
needs: [bundle-mac, bundle-linux-x86_x64, bundle-linux-aarch64]
needs: [bundle-mac, bundle-linux-x86_x64, bundle-linux-aarch64, freebsd]
runs-on:
- self-hosted
- bundle
steps:
- name: gh release
run: gh release edit $GITHUB_REF_NAME --draft=true
run: gh release edit $GITHUB_REF_NAME --draft=false
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

View File

@@ -167,6 +167,62 @@ jobs:
- name: Upload Zed Nightly
run: script/upload-nightly linux-targz
freebsd:
timeout-minutes: 60
if: github.repository_owner == 'zed-industries'
runs-on: github-8vcpu-ubuntu-2404
needs: tests
name: Build Zed on FreeBSD
# env:
# MYTOKEN : ${{ secrets.MYTOKEN }}
# MYTOKEN2: "value2"
steps:
- uses: actions/checkout@v4
- name: Build FreeBSD remote-server
id: freebsd-build
uses: vmactions/freebsd-vm@c3ae29a132c8ef1924775414107a97cac042aad5 # v1.2.0
with:
# envs: "MYTOKEN MYTOKEN2"
usesh: true
release: 13.5
copyback: true
prepare: |
pkg install -y \
bash curl jq git \
rustup-init cmake-core llvm-devel-lite pkgconf protobuf # ibx11 alsa-lib rust-bindgen-cli
run: |
freebsd-version
sysctl hw.model
sysctl hw.ncpu
sysctl hw.physmem
sysctl hw.usermem
git config --global --add safe.directory /home/runner/work/zed/zed
rustup-init --profile minimal --default-toolchain none -y
. "$HOME/.cargo/env"
./script/bundle-freebsd
mkdir -p out/
mv "target/zed-remote-server-freebsd-x86_64.gz" out/
rm -rf target/
cargo clean
- name: Upload Artifact to Workflow - zed-remote-server (run-bundling)
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4
if: contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-freebsd.gz
path: out/zed-remote-server-freebsd-x86_64.gz
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
if: ${{ !(contains(github.event.pull_request.labels.*.name, 'run-bundling')) }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
out/zed-remote-server-freebsd-x86_64.gz
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
bundle-nix:
name: Build and cache Nix package
needs: tests

Cargo.lock (generated, 71 changes)
View File

@@ -59,7 +59,6 @@ dependencies = [
"assistant_slash_command",
"assistant_slash_commands",
"assistant_tool",
"async-watch",
"audio",
"buffer_diff",
"chrono",
@@ -130,6 +129,7 @@ dependencies = [
"urlencoding",
"util",
"uuid",
"watch",
"workspace",
"workspace-hack",
"zed_actions",
@@ -664,7 +664,6 @@ dependencies = [
"agent_settings",
"anyhow",
"assistant_tool",
"async-watch",
"buffer_diff",
"chrono",
"client",
@@ -715,6 +714,7 @@ dependencies = [
"ui",
"unindent",
"util",
"watch",
"web_search",
"which 6.0.3",
"workspace",
@@ -1073,15 +1073,6 @@ dependencies = [
"tungstenite 0.26.2",
]
[[package]]
name = "async-watch"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a078faf4e27c0c6cc0efb20e5da59dcccc04968ebf2801d8e0b2195124cdcdb2"
dependencies = [
"event-listener 2.5.3",
]
[[package]]
name = "async_zip"
version = "0.0.17"
@@ -3168,6 +3159,16 @@ dependencies = [
"memchr",
]
[[package]]
name = "command-fds"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2ec1052629a80c28594777d1252efc8a6b005d13f9edfd8c3fc0f44d5b32489a"
dependencies = [
"nix 0.30.1",
"thiserror 2.0.12",
]
[[package]]
name = "command_palette"
version = "0.1.0"
@@ -4047,7 +4048,7 @@ dependencies = [
[[package]]
name = "dap-types"
version = "0.0.1"
source = "git+https://github.com/zed-industries/dap-types?rev=68516de327fa1be15214133a0a2e52a12982ce75#68516de327fa1be15214133a0a2e52a12982ce75"
source = "git+https://github.com/zed-industries/dap-types?rev=cef124a5109d6fd44a3f986882d78ce40b8d4fb5#cef124a5109d6fd44a3f986882d78ce40b8d4fb5"
dependencies = [
"schemars",
"serde",
@@ -5009,7 +5010,6 @@ dependencies = [
"assistant_tool",
"assistant_tools",
"async-trait",
"async-watch",
"buffer_diff",
"chrono",
"clap",
@@ -5051,6 +5051,7 @@ dependencies = [
"unindent",
"util",
"uuid",
"watch",
"workspace-hack",
"zed_llm_client",
]
@@ -8735,7 +8736,6 @@ version = "0.1.0"
dependencies = [
"anyhow",
"async-trait",
"async-watch",
"clock",
"collections",
"ctor",
@@ -8785,6 +8785,7 @@ dependencies = [
"unicase",
"unindent",
"util",
"watch",
"workspace-hack",
"zlog",
]
@@ -10134,6 +10135,18 @@ dependencies = [
"memoffset",
]
[[package]]
name = "nix"
version = "0.30.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "74523f3a35e05aba87a1d978330aef40f67b0304ac79c1c00b294c9830543db6"
dependencies = [
"bitflags 2.9.0",
"cfg-if",
"cfg_aliases 0.2.1",
"libc",
]
[[package]]
name = "node_runtime"
version = "0.1.0"
@@ -10143,7 +10156,6 @@ dependencies = [
"async-std",
"async-tar",
"async-trait",
"async-watch",
"futures 0.3.31",
"http_client",
"log",
@@ -10153,6 +10165,7 @@ dependencies = [
"serde_json",
"smol",
"util",
"watch",
"which 6.0.3",
"workspace-hack",
]
@@ -13002,7 +13015,6 @@ dependencies = [
"askpass",
"assistant_tool",
"assistant_tools",
"async-watch",
"backtrace",
"cargo_toml",
"chrono",
@@ -13049,6 +13061,7 @@ dependencies = [
"toml 0.8.20",
"unindent",
"util",
"watch",
"worktree",
"zlog",
]
@@ -16509,9 +16522,9 @@ dependencies = [
[[package]]
name = "tree-sitter"
version = "0.25.5"
version = "0.25.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ac5fff5c47490dfdf473b5228039bfacad9d765d9b6939d26bf7cc064c1c7822"
checksum = "a7cf18d43cbf0bfca51f657132cc616a5097edc4424d538bae6fa60142eaf9f0"
dependencies = [
"cc",
"regex",
@@ -16524,9 +16537,9 @@ dependencies = [
[[package]]
name = "tree-sitter-bash"
version = "0.23.3"
version = "0.25.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "329a4d48623ac337d42b1df84e81a1c9dbb2946907c102ca72db158c1964a52e"
checksum = "871b0606e667e98a1237ebdc1b0d7056e0aebfdc3141d12b399865d4cb6ed8a6"
dependencies = [
"cc",
"tree-sitter-language",
@@ -17124,6 +17137,7 @@ dependencies = [
"async-fs",
"async_zip",
"collections",
"command-fds",
"dirs 4.0.0",
"dunce",
"futures 0.3.31",
@@ -17909,6 +17923,19 @@ dependencies = [
"leb128",
]
[[package]]
name = "watch"
version = "0.1.0"
dependencies = [
"ctor",
"futures 0.3.31",
"gpui",
"parking_lot",
"rand 0.8.5",
"workspace-hack",
"zlog",
]
[[package]]
name = "wayland-backend"
version = "0.3.8"
@@ -19708,7 +19735,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.190.0"
version = "0.190.4"
dependencies = [
"activity_indicator",
"agent",
@@ -19720,7 +19747,6 @@ dependencies = [
"assistant_context_editor",
"assistant_tool",
"assistant_tools",
"async-watch",
"audio",
"auto_update",
"auto_update_ui",
@@ -19837,6 +19863,7 @@ dependencies = [
"uuid",
"vim",
"vim_mode_setting",
"watch",
"web_search",
"web_search_providers",
"welcome",

View File

@@ -165,6 +165,7 @@ members = [
"crates/util_macros",
"crates/vim",
"crates/vim_mode_setting",
"crates/watch",
"crates/web_search",
"crates/web_search_providers",
"crates/welcome",
@@ -373,6 +374,7 @@ util = { path = "crates/util" }
util_macros = { path = "crates/util_macros" }
vim = { path = "crates/vim" }
vim_mode_setting = { path = "crates/vim_mode_setting" }
watch = { path = "crates/watch" }
web_search = { path = "crates/web_search" }
web_search_providers = { path = "crates/web_search_providers" }
welcome = { path = "crates/welcome" }
@@ -403,7 +405,6 @@ async-recursion = "1.0.0"
async-tar = "0.5.0"
async-trait = "0.1"
async-tungstenite = "0.29.1"
async-watch = "0.3.1"
async_zip = { version = "0.0.17", features = ["deflate", "deflate64"] }
aws-config = { version = "1.6.1", features = ["behavior-version-latest"] }
aws-credential-types = { version = "1.2.2", features = [
@@ -434,7 +435,7 @@ core-foundation-sys = "0.8.6"
core-video = { version = "0.4.3", features = ["metal"] }
criterion = { version = "0.5", features = ["html_reports"] }
ctor = "0.4.0"
dap-types = { git = "https://github.com/zed-industries/dap-types", rev = "68516de327fa1be15214133a0a2e52a12982ce75" }
dap-types = { git = "https://github.com/zed-industries/dap-types", rev = "cef124a5109d6fd44a3f986882d78ce40b8d4fb5" }
dashmap = "6.0"
derive_more = "0.99.17"
dirs = "4.0"
@@ -574,8 +575,8 @@ tokio = { version = "1" }
tokio-tungstenite = { version = "0.26", features = ["__rustls-tls"] }
toml = "0.8"
tower-http = "0.4.4"
tree-sitter = { version = "0.25.5", features = ["wasm"] }
tree-sitter-bash = "0.23"
tree-sitter = { version = "0.25.6", features = ["wasm"] }
tree-sitter-bash = "0.25.0"
tree-sitter-c = "0.23"
tree-sitter-cpp = "0.23"
tree-sitter-css = "0.23"
@@ -697,6 +698,8 @@ codegen-units = 16
[profile.dev.package]
taffy = { opt-level = 3 }
cranelift-codegen = { opt-level = 3 }
cranelift-codegen-meta = { opt-level = 3 }
cranelift-codegen-shared = { opt-level = 3 }
resvg = { opt-level = 3 }
rustybuzz = { opt-level = 3 }
ttf-parser = { opt-level = 3 }

View File

@@ -1525,7 +1525,7 @@
"allow_rewrap": "anywhere"
},
"Ruby": {
"language_servers": ["solargraph", "!ruby-lsp", "!rubocop", "..."]
"language_servers": ["solargraph", "!ruby-lsp", "!rubocop", "!sorbet", "!steep", "..."]
},
"SCSS": {
"prettier": {

View File

@@ -25,7 +25,6 @@ assistant_context_editor.workspace = true
assistant_slash_command.workspace = true
assistant_slash_commands.workspace = true
assistant_tool.workspace = true
async-watch.workspace = true
audio.workspace = true
buffer_diff.workspace = true
chrono.workspace = true
@@ -95,6 +94,7 @@ ui_input.workspace = true
urlencoding.workspace = true
util.workspace = true
uuid.workspace = true
watch.workspace = true
workspace-hack.workspace = true
workspace.workspace = true
zed_actions.workspace = true

View File

@@ -57,7 +57,7 @@ use zed_llm_client::{CompletionIntent, UsageLimit};
use crate::active_thread::{self, ActiveThread, ActiveThreadEvent};
use crate::agent_configuration::{AgentConfiguration, AssistantConfigurationEvent};
use crate::agent_diff::AgentDiff;
use crate::history_store::{HistoryStore, RecentEntry};
use crate::history_store::{HistoryEntryId, HistoryStore};
use crate::message_editor::{MessageEditor, MessageEditorEvent};
use crate::thread::{Thread, ThreadError, ThreadId, ThreadSummary, TokenUsageRatio};
use crate::thread_history::{HistoryEntryElement, ThreadHistory};
@@ -257,6 +257,7 @@ impl ActiveView {
pub fn prompt_editor(
context_editor: Entity<ContextEditor>,
history_store: Entity<HistoryStore>,
language_registry: Arc<LanguageRegistry>,
window: &mut Window,
cx: &mut App,
@@ -322,6 +323,19 @@ impl ActiveView {
editor.set_text(summary, window, cx);
})
}
ContextEvent::PathChanged { old_path, new_path } => {
history_store.update(cx, |history_store, cx| {
if let Some(old_path) = old_path {
history_store
.replace_recently_opened_text_thread(old_path, new_path, cx);
} else {
history_store.push_recently_opened_entry(
HistoryEntryId::Context(new_path.clone()),
cx,
);
}
});
}
_ => {}
}
}),
@@ -516,8 +530,7 @@ impl AgentPanel {
HistoryStore::new(
thread_store.clone(),
context_store.clone(),
[RecentEntry::Thread(thread_id, thread.clone())],
window,
[HistoryEntryId::Thread(thread_id)],
cx,
)
});
@@ -544,7 +557,13 @@ impl AgentPanel {
editor.insert_default_prompt(window, cx);
editor
});
ActiveView::prompt_editor(context_editor, language_registry.clone(), window, cx)
ActiveView::prompt_editor(
context_editor,
history_store.clone(),
language_registry.clone(),
window,
cx,
)
}
};
@@ -581,86 +600,9 @@ impl AgentPanel {
let panel = weak_panel.clone();
let assistant_navigation_menu =
ContextMenu::build_persistent(window, cx, move |mut menu, _window, cx| {
let recently_opened = panel
.update(cx, |this, cx| {
this.history_store.update(cx, |history_store, cx| {
history_store.recently_opened_entries(cx)
})
})
.unwrap_or_default();
if !recently_opened.is_empty() {
menu = menu.header("Recently Opened");
for entry in recently_opened.iter() {
if let RecentEntry::Context(context) = entry {
if context.read(cx).path().is_none() {
log::error!(
"bug: text thread in recent history list was never saved"
);
continue;
}
}
let summary = entry.summary(cx);
menu = menu.entry_with_end_slot_on_hover(
summary,
None,
{
let panel = panel.clone();
let entry = entry.clone();
move |window, cx| {
panel
.update(cx, {
let entry = entry.clone();
move |this, cx| match entry {
RecentEntry::Thread(_, thread) => {
this.open_thread(thread, window, cx)
}
RecentEntry::Context(context) => {
let Some(path) = context.read(cx).path()
else {
return;
};
this.open_saved_prompt_editor(
path.clone(),
window,
cx,
)
.detach_and_log_err(cx)
}
}
})
.ok();
}
},
IconName::Close,
"Close Entry".into(),
{
let panel = panel.clone();
let entry = entry.clone();
move |_window, cx| {
panel
.update(cx, |this, cx| {
this.history_store.update(
cx,
|history_store, cx| {
history_store.remove_recently_opened_entry(
&entry, cx,
);
},
);
})
.ok();
}
},
);
}
menu = menu.separator();
if let Some(panel) = panel.upgrade() {
menu = Self::populate_recently_opened_menu_section(menu, panel, cx);
}
menu.action("View All", Box::new(OpenHistory))
.end_slot_action(DeleteRecentlyOpenThread.boxed_clone())
.fixed_width(px(320.).into())
@@ -898,6 +840,7 @@ impl AgentPanel {
self.set_active_view(
ActiveView::prompt_editor(
context_editor.clone(),
self.history_store.clone(),
self.language_registry.clone(),
window,
cx,
@@ -984,7 +927,13 @@ impl AgentPanel {
)
});
self.set_active_view(
ActiveView::prompt_editor(editor.clone(), self.language_registry.clone(), window, cx),
ActiveView::prompt_editor(
editor.clone(),
self.history_store.clone(),
self.language_registry.clone(),
window,
cx,
),
window,
cx,
);
@@ -1383,16 +1332,6 @@ impl AgentPanel {
}
}
}
ActiveView::TextThread { context_editor, .. } => {
let context = context_editor.read(cx).context();
// When switching away from an unsaved text thread, delete its entry.
if context.read(cx).path().is_none() {
let context = context.clone();
self.history_store.update(cx, |store, cx| {
store.remove_recently_opened_entry(&RecentEntry::Context(context), cx);
});
}
}
_ => {}
}
@@ -1400,13 +1339,14 @@ impl AgentPanel {
ActiveView::Thread { thread, .. } => self.history_store.update(cx, |store, cx| {
if let Some(thread) = thread.upgrade() {
let id = thread.read(cx).id().clone();
store.push_recently_opened_entry(RecentEntry::Thread(id, thread), cx);
store.push_recently_opened_entry(HistoryEntryId::Thread(id), cx);
}
}),
ActiveView::TextThread { context_editor, .. } => {
self.history_store.update(cx, |store, cx| {
let context = context_editor.read(cx).context().clone();
store.push_recently_opened_entry(RecentEntry::Context(context), cx)
if let Some(path) = context_editor.read(cx).context().read(cx).path() {
store.push_recently_opened_entry(HistoryEntryId::Context(path.clone()), cx)
}
})
}
_ => {}
@@ -1425,6 +1365,70 @@ impl AgentPanel {
self.focus_handle(cx).focus(window);
}
fn populate_recently_opened_menu_section(
mut menu: ContextMenu,
panel: Entity<Self>,
cx: &mut Context<ContextMenu>,
) -> ContextMenu {
let entries = panel
.read(cx)
.history_store
.read(cx)
.recently_opened_entries(cx);
if entries.is_empty() {
return menu;
}
menu = menu.header("Recently Opened");
for entry in entries {
let title = entry.title().clone();
let id = entry.id();
menu = menu.entry_with_end_slot_on_hover(
title,
None,
{
let panel = panel.downgrade();
let id = id.clone();
move |window, cx| {
let id = id.clone();
panel
.update(cx, move |this, cx| match id {
HistoryEntryId::Thread(id) => this
.open_thread_by_id(&id, window, cx)
.detach_and_log_err(cx),
HistoryEntryId::Context(path) => this
.open_saved_prompt_editor(path.clone(), window, cx)
.detach_and_log_err(cx),
})
.ok();
}
},
IconName::Close,
"Close Entry".into(),
{
let panel = panel.downgrade();
let id = id.clone();
move |_window, cx| {
panel
.update(cx, |this, cx| {
this.history_store.update(cx, |history_store, cx| {
history_store.remove_recently_opened_entry(&id, cx);
});
})
.ok();
}
},
);
}
menu = menu.separator();
menu
}
}
impl Focusable for AgentPanel {

View File

@@ -767,7 +767,7 @@ impl CompletionProvider for ContextPickerCompletionProvider {
let snapshot = buffer.read(cx).snapshot();
let source_range = snapshot.anchor_before(state.source_range.start)
..snapshot.anchor_before(state.source_range.end);
..snapshot.anchor_after(state.source_range.end);
let thread_store = self.thread_store.clone();
let text_thread_store = self.text_thread_store.clone();

View File

@@ -282,15 +282,18 @@ pub fn unordered_thread_entries(
text_thread_store: Entity<TextThreadStore>,
cx: &App,
) -> impl Iterator<Item = (DateTime<Utc>, ThreadContextEntry)> {
let threads = thread_store.read(cx).unordered_threads().map(|thread| {
(
thread.updated_at,
ThreadContextEntry::Thread {
id: thread.id.clone(),
title: thread.summary.clone(),
},
)
});
let threads = thread_store
.read(cx)
.reverse_chronological_threads()
.map(|thread| {
(
thread.updated_at,
ThreadContextEntry::Thread {
id: thread.id.clone(),
title: thread.summary.clone(),
},
)
});
let text_threads = text_thread_store
.read(cx)
@@ -300,7 +303,7 @@ pub fn unordered_thread_entries(
context.mtime.to_utc(),
ThreadContextEntry::Context {
path: context.path.clone(),
title: context.title.clone().into(),
title: context.title.clone(),
},
)
});

View File

@@ -1,18 +1,17 @@
use std::{collections::VecDeque, path::Path, sync::Arc};
use anyhow::Context as _;
use assistant_context_editor::{AssistantContext, SavedContextMetadata};
use anyhow::{Context as _, Result};
use assistant_context_editor::SavedContextMetadata;
use chrono::{DateTime, Utc};
use futures::future::{TryFutureExt as _, join_all};
use gpui::{Entity, Task, prelude::*};
use gpui::{AsyncApp, Entity, SharedString, Task, prelude::*};
use itertools::Itertools;
use paths::contexts_dir;
use serde::{Deserialize, Serialize};
use smol::future::FutureExt;
use std::time::Duration;
use ui::{App, SharedString, Window};
use ui::App;
use util::ResultExt as _;
use crate::{
Thread,
thread::ThreadId,
thread_store::{SerializedThreadMetadata, ThreadStore},
};
@@ -41,52 +40,34 @@ impl HistoryEntry {
HistoryEntry::Context(context) => HistoryEntryId::Context(context.path.clone()),
}
}
pub fn title(&self) -> &SharedString {
match self {
HistoryEntry::Thread(thread) => &thread.summary,
HistoryEntry::Context(context) => &context.title,
}
}
}
/// Generic identifier for a history entry.
#[derive(Clone, PartialEq, Eq)]
#[derive(Clone, PartialEq, Eq, Debug)]
pub enum HistoryEntryId {
Thread(ThreadId),
Context(Arc<Path>),
}
#[derive(Clone, Debug)]
pub(crate) enum RecentEntry {
Thread(ThreadId, Entity<Thread>),
Context(Entity<AssistantContext>),
}
impl PartialEq for RecentEntry {
fn eq(&self, other: &Self) -> bool {
match (self, other) {
(Self::Thread(l0, _), Self::Thread(r0, _)) => l0 == r0,
(Self::Context(l0), Self::Context(r0)) => l0 == r0,
_ => false,
}
}
}
impl Eq for RecentEntry {}
impl RecentEntry {
pub(crate) fn summary(&self, cx: &App) -> SharedString {
match self {
RecentEntry::Thread(_, thread) => thread.read(cx).summary().or_default(),
RecentEntry::Context(context) => context.read(cx).summary().or_default(),
}
}
}
#[derive(Serialize, Deserialize)]
enum SerializedRecentEntry {
enum SerializedRecentOpen {
Thread(String),
ContextName(String),
/// Old format which stores the full path
Context(String),
}
pub struct HistoryStore {
thread_store: Entity<ThreadStore>,
context_store: Entity<assistant_context_editor::ContextStore>,
recently_opened_entries: VecDeque<RecentEntry>,
recently_opened_entries: VecDeque<HistoryEntryId>,
_subscriptions: Vec<gpui::Subscription>,
_save_recently_opened_entries_task: Task<()>,
}
@@ -95,8 +76,7 @@ impl HistoryStore {
pub fn new(
thread_store: Entity<ThreadStore>,
context_store: Entity<assistant_context_editor::ContextStore>,
initial_recent_entries: impl IntoIterator<Item = RecentEntry>,
window: &mut Window,
initial_recent_entries: impl IntoIterator<Item = HistoryEntryId>,
cx: &mut Context<Self>,
) -> Self {
let subscriptions = vec![
@@ -104,68 +84,20 @@ impl HistoryStore {
cx.observe(&context_store, |_, _, cx| cx.notify()),
];
window
.spawn(cx, {
let thread_store = thread_store.downgrade();
let context_store = context_store.downgrade();
let this = cx.weak_entity();
async move |cx| {
let path = paths::data_dir().join(NAVIGATION_HISTORY_PATH);
let contents = cx
.background_spawn(async move { std::fs::read_to_string(path) })
.await
.ok()?;
let entries = serde_json::from_str::<Vec<SerializedRecentEntry>>(&contents)
.context("deserializing persisted agent panel navigation history")
.log_err()?
.into_iter()
.take(MAX_RECENTLY_OPENED_ENTRIES)
.map(|serialized| match serialized {
SerializedRecentEntry::Thread(id) => thread_store
.update_in(cx, |thread_store, window, cx| {
let thread_id = ThreadId::from(id.as_str());
thread_store
.open_thread(&thread_id, window, cx)
.map_ok(|thread| RecentEntry::Thread(thread_id, thread))
.boxed()
})
.unwrap_or_else(|_| {
async {
anyhow::bail!("no thread store");
}
.boxed()
}),
SerializedRecentEntry::Context(id) => context_store
.update(cx, |context_store, cx| {
context_store
.open_local_context(Path::new(&id).into(), cx)
.map_ok(RecentEntry::Context)
.boxed()
})
.unwrap_or_else(|_| {
async {
anyhow::bail!("no context store");
}
.boxed()
}),
});
let entries = join_all(entries)
.await
.into_iter()
.filter_map(|result| result.log_with_level(log::Level::Debug))
.collect::<VecDeque<_>>();
this.update(cx, |this, _| {
this.recently_opened_entries.extend(entries);
this.recently_opened_entries
.truncate(MAX_RECENTLY_OPENED_ENTRIES);
})
.ok();
Some(())
}
cx.spawn(async move |this, cx| {
let entries = Self::load_recently_opened_entries(cx).await.log_err()?;
this.update(cx, |this, _| {
this.recently_opened_entries
.extend(
entries.into_iter().take(
MAX_RECENTLY_OPENED_ENTRIES
.saturating_sub(this.recently_opened_entries.len()),
),
);
})
.detach();
.ok()
})
.detach();
Self {
thread_store,
@@ -184,19 +116,20 @@ impl HistoryStore {
return history_entries;
}
for thread in self
.thread_store
.update(cx, |this, _cx| this.reverse_chronological_threads())
{
history_entries.push(HistoryEntry::Thread(thread));
}
for context in self
.context_store
.update(cx, |this, _cx| this.reverse_chronological_contexts())
{
history_entries.push(HistoryEntry::Context(context));
}
history_entries.extend(
self.thread_store
.read(cx)
.reverse_chronological_threads()
.cloned()
.map(HistoryEntry::Thread),
);
history_entries.extend(
self.context_store
.read(cx)
.unordered_contexts()
.cloned()
.map(HistoryEntry::Context),
);
history_entries.sort_unstable_by_key(|entry| std::cmp::Reverse(entry.updated_at()));
history_entries
@@ -206,15 +139,62 @@ impl HistoryStore {
self.entries(cx).into_iter().take(limit).collect()
}
pub fn recently_opened_entries(&self, cx: &App) -> Vec<HistoryEntry> {
#[cfg(debug_assertions)]
if std::env::var("ZED_SIMULATE_NO_THREAD_HISTORY").is_ok() {
return Vec::new();
}
let thread_entries = self
.thread_store
.read(cx)
.reverse_chronological_threads()
.flat_map(|thread| {
self.recently_opened_entries
.iter()
.enumerate()
.flat_map(|(index, entry)| match entry {
HistoryEntryId::Thread(id) if &thread.id == id => {
Some((index, HistoryEntry::Thread(thread.clone())))
}
_ => None,
})
});
let context_entries =
self.context_store
.read(cx)
.unordered_contexts()
.flat_map(|context| {
self.recently_opened_entries
.iter()
.enumerate()
.flat_map(|(index, entry)| match entry {
HistoryEntryId::Context(path) if &context.path == path => {
Some((index, HistoryEntry::Context(context.clone())))
}
_ => None,
})
});
thread_entries
.chain(context_entries)
// optimization to halt iteration early
.take(self.recently_opened_entries.len())
.sorted_unstable_by_key(|(index, _)| *index)
.map(|(_, entry)| entry)
.collect()
}
fn save_recently_opened_entries(&mut self, cx: &mut Context<Self>) {
let serialized_entries = self
.recently_opened_entries
.iter()
.filter_map(|entry| match entry {
RecentEntry::Context(context) => Some(SerializedRecentEntry::Context(
context.read(cx).path()?.to_str()?.to_owned(),
)),
RecentEntry::Thread(id, _) => Some(SerializedRecentEntry::Thread(id.to_string())),
HistoryEntryId::Context(path) => path.file_name().map(|file| {
SerializedRecentOpen::ContextName(file.to_string_lossy().to_string())
}),
HistoryEntryId::Thread(id) => Some(SerializedRecentOpen::Thread(id.to_string())),
})
.collect::<Vec<_>>();
@@ -233,7 +213,33 @@ impl HistoryStore {
});
}
pub fn push_recently_opened_entry(&mut self, entry: RecentEntry, cx: &mut Context<Self>) {
fn load_recently_opened_entries(cx: &AsyncApp) -> Task<Result<Vec<HistoryEntryId>>> {
cx.background_spawn(async move {
let path = paths::data_dir().join(NAVIGATION_HISTORY_PATH);
let contents = smol::fs::read_to_string(path).await?;
let entries = serde_json::from_str::<Vec<SerializedRecentOpen>>(&contents)
.context("deserializing persisted agent panel navigation history")?
.into_iter()
.take(MAX_RECENTLY_OPENED_ENTRIES)
.flat_map(|entry| match entry {
SerializedRecentOpen::Thread(id) => {
Some(HistoryEntryId::Thread(id.as_str().into()))
}
SerializedRecentOpen::ContextName(file_name) => Some(HistoryEntryId::Context(
contexts_dir().join(file_name).into(),
)),
SerializedRecentOpen::Context(path) => {
Path::new(&path).file_name().map(|file_name| {
HistoryEntryId::Context(contexts_dir().join(file_name).into())
})
}
})
.collect::<Vec<_>>();
Ok(entries)
})
}
pub fn push_recently_opened_entry(&mut self, entry: HistoryEntryId, cx: &mut Context<Self>) {
self.recently_opened_entries
.retain(|old_entry| old_entry != &entry);
self.recently_opened_entries.push_front(entry);
@@ -244,24 +250,33 @@ impl HistoryStore {
pub fn remove_recently_opened_thread(&mut self, id: ThreadId, cx: &mut Context<Self>) {
self.recently_opened_entries.retain(|entry| match entry {
RecentEntry::Thread(thread_id, _) if thread_id == &id => false,
HistoryEntryId::Thread(thread_id) if thread_id == &id => false,
_ => true,
});
self.save_recently_opened_entries(cx);
}
pub fn remove_recently_opened_entry(&mut self, entry: &RecentEntry, cx: &mut Context<Self>) {
pub fn replace_recently_opened_text_thread(
&mut self,
old_path: &Path,
new_path: &Arc<Path>,
cx: &mut Context<Self>,
) {
for entry in &mut self.recently_opened_entries {
match entry {
HistoryEntryId::Context(path) if path.as_ref() == old_path => {
*entry = HistoryEntryId::Context(new_path.clone());
break;
}
_ => {}
}
}
self.save_recently_opened_entries(cx);
}
pub fn remove_recently_opened_entry(&mut self, entry: &HistoryEntryId, cx: &mut Context<Self>) {
self.recently_opened_entries
.retain(|old_entry| old_entry != entry);
self.save_recently_opened_entries(cx);
}
pub fn recently_opened_entries(&self, _cx: &mut Context<Self>) -> VecDeque<RecentEntry> {
#[cfg(debug_assertions)]
if std::env::var("ZED_SIMULATE_NO_THREAD_HISTORY").is_ok() {
return VecDeque::new();
}
self.recently_opened_entries.clone()
}
}

View File

@@ -38,8 +38,7 @@ use telemetry_events::{AssistantEventData, AssistantKind, AssistantPhase};
use terminal_view::{TerminalView, terminal_panel::TerminalPanel};
use text::{OffsetRangeExt, ToPoint as _};
use ui::prelude::*;
use util::RangeExt;
use util::ResultExt;
use util::{RangeExt, ResultExt, maybe};
use workspace::{ItemHandle, Toast, Workspace, dock::Panel, notifications::NotificationId};
use zed_actions::agent::OpenConfiguration;
@@ -1011,7 +1010,7 @@ impl InlineAssistant {
self.update_editor_highlights(&editor, cx);
}
} else {
entry.get().highlight_updates.send(()).ok();
entry.get_mut().highlight_updates.send(()).ok();
}
}
@@ -1171,27 +1170,31 @@ impl InlineAssistant {
selections.select_anchor_ranges([position..position])
});
let mut scroll_target_top;
let mut scroll_target_bottom;
let mut scroll_target_range = None;
if let Some(decorations) = assist.decorations.as_ref() {
scroll_target_top = editor
.row_for_block(decorations.prompt_block_id, cx)
.unwrap()
.0 as f32;
scroll_target_bottom = editor
.row_for_block(decorations.end_block_id, cx)
.unwrap()
.0 as f32;
} else {
scroll_target_range = maybe!({
let top = editor.row_for_block(decorations.prompt_block_id, cx)?.0 as f32;
let bottom = editor.row_for_block(decorations.end_block_id, cx)?.0 as f32;
Some((top, bottom))
});
if scroll_target_range.is_none() {
log::error!("bug: failed to find blocks for scrolling to inline assist");
}
}
let scroll_target_range = scroll_target_range.unwrap_or_else(|| {
let snapshot = editor.snapshot(window, cx);
let start_row = assist
.range
.start
.to_display_point(&snapshot.display_snapshot)
.row();
scroll_target_top = start_row.0 as f32;
scroll_target_bottom = scroll_target_top + 1.;
}
let top = start_row.0 as f32;
let bottom = top + 1.0;
(top, bottom)
});
let mut scroll_target_top = scroll_target_range.0;
let mut scroll_target_bottom = scroll_target_range.1;
scroll_target_top -= editor.vertical_scroll_margin() as f32;
scroll_target_bottom += editor.vertical_scroll_margin() as f32;
@@ -1519,7 +1522,7 @@ impl InlineAssistant {
struct EditorInlineAssists {
assist_ids: Vec<InlineAssistId>,
scroll_lock: Option<InlineAssistScrollLock>,
highlight_updates: async_watch::Sender<()>,
highlight_updates: watch::Sender<()>,
_update_highlights: Task<Result<()>>,
_subscriptions: Vec<gpui::Subscription>,
}
@@ -1531,7 +1534,7 @@ struct InlineAssistScrollLock {
impl EditorInlineAssists {
fn new(editor: &Entity<Editor>, window: &mut Window, cx: &mut App) -> Self {
let (highlight_updates_tx, mut highlight_updates_rx) = async_watch::channel(());
let (highlight_updates_tx, mut highlight_updates_rx) = watch::channel(());
Self {
assist_ids: Vec::new(),
scroll_lock: None,
@@ -1689,7 +1692,7 @@ impl InlineAssist {
if let Some(editor) = editor.upgrade() {
InlineAssistant::update_global(cx, |this, cx| {
if let Some(editor_assists) =
this.assists_by_editor.get(&editor.downgrade())
this.assists_by_editor.get_mut(&editor.downgrade())
{
editor_assists.highlight_updates.send(()).ok();
}

View File

@@ -671,7 +671,7 @@ impl RenderOnce for HistoryEntryElement {
),
HistoryEntry::Context(context) => (
context.path.to_string_lossy().to_string(),
context.title.clone().into(),
context.title.clone(),
context.mtime.timestamp(),
),
};

View File

@@ -400,16 +400,11 @@ impl ThreadStore {
self.threads.len()
}
pub fn unordered_threads(&self) -> impl Iterator<Item = &SerializedThreadMetadata> {
pub fn reverse_chronological_threads(&self) -> impl Iterator<Item = &SerializedThreadMetadata> {
// ordering is from "ORDER BY" in `list_threads`
self.threads.iter()
}
pub fn reverse_chronological_threads(&self) -> Vec<SerializedThreadMetadata> {
let mut threads = self.threads.iter().cloned().collect::<Vec<_>>();
threads.sort_unstable_by_key(|thread| std::cmp::Reverse(thread.updated_at));
threads
}
pub fn create_thread(&mut self, cx: &mut Context<Self>) -> Entity<Thread> {
cx.new(|cx| {
Thread::new(

View File

@@ -11,7 +11,7 @@ use assistant_slash_commands::FileCommandMetadata;
use client::{self, proto, telemetry::Telemetry};
use clock::ReplicaId;
use collections::{HashMap, HashSet};
use fs::{Fs, RemoveOptions};
use fs::{Fs, RenameOptions};
use futures::{FutureExt, StreamExt, future::Shared};
use gpui::{
App, AppContext as _, Context, Entity, EventEmitter, RenderImage, SharedString, Subscription,
@@ -452,6 +452,10 @@ pub enum ContextEvent {
MessagesEdited,
SummaryChanged,
SummaryGenerated,
PathChanged {
old_path: Option<Arc<Path>>,
new_path: Arc<Path>,
},
StreamedCompletion,
StartedThoughtProcess(Range<language::Anchor>),
EndedThoughtProcess(language::Anchor),
@@ -2894,22 +2898,34 @@ impl AssistantContext {
}
fs.create_dir(contexts_dir().as_ref()).await?;
fs.atomic_write(new_path.clone(), serde_json::to_string(&context).unwrap())
.await?;
if let Some(old_path) = old_path {
// rename before write ensures that only one file exists
if let Some(old_path) = old_path.as_ref() {
if new_path.as_path() != old_path.as_ref() {
fs.remove_file(
fs.rename(
&old_path,
RemoveOptions {
recursive: false,
ignore_if_not_exists: true,
&new_path,
RenameOptions {
overwrite: true,
ignore_if_exists: true,
},
)
.await?;
}
}
this.update(cx, |this, _| this.path = Some(new_path.into()))?;
// update path before write in case it fails
this.update(cx, {
let new_path: Arc<Path> = new_path.clone().into();
move |this, cx| {
this.path = Some(new_path.clone());
cx.emit(ContextEvent::PathChanged { old_path, new_path });
}
})
.ok();
fs.atomic_write(new_path, serde_json::to_string(&context).unwrap())
.await?;
}
Ok(())
@@ -3277,7 +3293,7 @@ impl SavedContextV0_1_0 {
#[derive(Debug, Clone)]
pub struct SavedContextMetadata {
pub title: String,
pub title: SharedString,
pub path: Arc<Path>,
pub mtime: chrono::DateTime<chrono::Local>,
}

View File

@@ -580,6 +580,7 @@ impl ContextEditor {
});
}
ContextEvent::SummaryGenerated => {}
ContextEvent::PathChanged { .. } => {}
ContextEvent::StartedThoughtProcess(range) => {
let creases = self.insert_thought_process_output_sections(
[(

View File

@@ -347,12 +347,6 @@ impl ContextStore {
self.contexts_metadata.iter()
}
pub fn reverse_chronological_contexts(&self) -> Vec<SavedContextMetadata> {
let mut contexts = self.contexts_metadata.iter().cloned().collect::<Vec<_>>();
contexts.sort_unstable_by_key(|thread| std::cmp::Reverse(thread.mtime));
contexts
}
pub fn create(&mut self, cx: &mut Context<Self>) -> Entity<AssistantContext> {
let context = cx.new(|cx| {
AssistantContext::local(
@@ -618,6 +612,16 @@ impl ContextStore {
ContextEvent::SummaryChanged => {
self.advertise_contexts(cx);
}
ContextEvent::PathChanged { old_path, new_path } => {
if let Some(old_path) = old_path.as_ref() {
for metadata in &mut self.contexts_metadata {
if &metadata.path == old_path {
metadata.path = new_path.clone();
break;
}
}
}
}
ContextEvent::Operation(operation) => {
let context_id = context.read(cx).id().to_proto();
let operation = operation.to_proto();
@@ -792,7 +796,7 @@ impl ContextStore {
.next()
{
contexts.push(SavedContextMetadata {
title: title.to_string(),
title: title.to_string().into(),
path: path.into(),
mtime: metadata.mtime.timestamp_for_user().into(),
});

View File

@@ -240,13 +240,14 @@ impl SlashCommandCompletionProvider {
Ok(vec![project::CompletionResponse {
completions,
is_incomplete: false,
// TODO: Could have slash commands indicate whether their completions are incomplete.
is_incomplete: true,
}])
})
} else {
Task::ready(Ok(vec![project::CompletionResponse {
completions: Vec::new(),
is_incomplete: false,
is_incomplete: true,
}]))
}
}
@@ -275,17 +276,17 @@ impl CompletionProvider for SlashCommandCompletionProvider {
position.row,
call.arguments.last().map_or(call.name.end, |arg| arg.end) as u32,
);
let command_range = buffer.anchor_after(command_range_start)
let command_range = buffer.anchor_before(command_range_start)
..buffer.anchor_after(command_range_end);
let name = line[call.name.clone()].to_string();
let (arguments, last_argument_range) = if let Some(argument) = call.arguments.last()
{
let last_arg_start =
buffer.anchor_after(Point::new(position.row, argument.start as u32));
buffer.anchor_before(Point::new(position.row, argument.start as u32));
let first_arg_start = call.arguments.first().expect("we have the last element");
let first_arg_start =
buffer.anchor_after(Point::new(position.row, first_arg_start.start as u32));
let first_arg_start = buffer
.anchor_before(Point::new(position.row, first_arg_start.start as u32));
let arguments = call
.arguments
.into_iter()
@@ -298,7 +299,7 @@ impl CompletionProvider for SlashCommandCompletionProvider {
)
} else {
let start =
buffer.anchor_after(Point::new(position.row, call.name.start as u32));
buffer.anchor_before(Point::new(position.row, call.name.start as u32));
(None, start..buffer_position)
};

View File

@@ -18,7 +18,6 @@ eval = []
agent_settings.workspace = true
anyhow.workspace = true
assistant_tool.workspace = true
async-watch.workspace = true
buffer_diff.workspace = true
chrono.workspace = true
collections.workspace = true
@@ -58,6 +57,7 @@ terminal_view.workspace = true
theme.workspace = true
ui.workspace = true
util.workspace = true
watch.workspace = true
web_search.workspace = true
which.workspace = true
workspace-hack.workspace = true

View File

@@ -420,12 +420,12 @@ impl EditAgent {
cx: &mut AsyncApp,
) -> (
Task<Result<(T, Vec<ResolvedOldText>)>>,
async_watch::Receiver<Option<Range<usize>>>,
watch::Receiver<Option<Range<usize>>>,
)
where
T: 'static + Send + Unpin + Stream<Item = Result<EditParserEvent>>,
{
let (old_range_tx, old_range_rx) = async_watch::channel(None);
let (mut old_range_tx, old_range_rx) = watch::channel(None);
let task = cx.background_spawn(async move {
let mut matcher = StreamingFuzzyMatcher::new(snapshot);
while let Some(edit_event) = edit_events.next().await {

View File

@@ -910,7 +910,7 @@ struct InlineBlamePopover {
/// Represents a breakpoint indicator that shows up when hovering over lines in the gutter that don't have
/// a breakpoint on them.
#[derive(Clone, Copy, Debug)]
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
struct PhantomBreakpointIndicator {
display_row: DisplayRow,
/// There's a small debounce between hovering over the line and showing the indicator.
@@ -918,6 +918,7 @@ struct PhantomBreakpointIndicator {
is_active: bool,
collides_with_existing_breakpoint: bool,
}
/// Zed's primary implementation of text input, allowing users to edit a [`MultiBuffer`].
///
/// See the [module level documentation](self) for more information.
@@ -1220,11 +1221,19 @@ struct SelectionHistory {
}
impl SelectionHistory {
#[track_caller]
fn insert_transaction(
&mut self,
transaction_id: TransactionId,
selections: Arc<[Selection<Anchor>]>,
) {
if selections.is_empty() {
log::error!(
"SelectionHistory::insert_transaction called with empty selections. Caller: {}",
std::panic::Location::caller()
);
return;
}
self.selections_by_transaction
.insert(transaction_id, (selections, None));
}
@@ -5046,7 +5055,13 @@ impl Editor {
return;
}
let position = self.selections.newest_anchor().head();
let multibuffer_snapshot = self.buffer.read(cx).read(cx);
let position = self
.selections
.newest_anchor()
.head()
.bias_right(&multibuffer_snapshot);
if position.diff_base_anchor.is_some() {
return;
}
@@ -5059,8 +5074,9 @@ impl Editor {
let buffer_snapshot = buffer.read(cx).snapshot();
let query: Option<Arc<String>> =
Self::completion_query(&self.buffer.read(cx).read(cx), position)
.map(|query| query.into());
Self::completion_query(&multibuffer_snapshot, position).map(|query| query.into());
drop(multibuffer_snapshot);
let provider = match requested_source {
Some(CompletionsMenuSource::Normal) | None => self.completion_provider.clone(),
@@ -5079,10 +5095,13 @@ impl Editor {
.as_ref()
.map_or(true, |provider| provider.filter_completions());
// When `is_incomplete` is false, can filter completions instead of re-querying when the
// current query is a suffix of the initial query.
if let Some(CodeContextMenu::Completions(menu)) = self.context_menu.borrow_mut().as_mut() {
if !menu.is_incomplete && filter_completions {
if filter_completions {
menu.filter(query.clone(), provider.clone(), window, cx);
}
// When `is_incomplete` is false, no need to re-query completions when the current query
// is a suffix of the initial query.
if !menu.is_incomplete {
// If the new query is a suffix of the old query (typing more characters) and
// the previous result was complete, the existing completions can be filtered.
//
@@ -5100,7 +5119,6 @@ impl Editor {
menu.initial_position.to_offset(&snapshot) == position.to_offset(&snapshot)
};
if position_matches {
menu.filter(query.clone(), provider.clone(), window, cx);
return;
}
}
@@ -17854,16 +17872,9 @@ impl Editor {
.selections
.disjoint_anchors()
.iter()
.map(|selection| {
let range = if selection.reversed {
selection.end.text_anchor..selection.start.text_anchor
} else {
selection.start.text_anchor..selection.end.text_anchor
};
Location {
buffer: buffer.clone(),
range,
}
.map(|range| Location {
buffer: buffer.clone(),
range: range.start.text_anchor..range.end.text_anchor,
})
.collect::<Vec<_>>();

View File

@@ -950,7 +950,7 @@ impl EditorElement {
editor.set_gutter_hovered(gutter_hovered, cx);
editor.mouse_cursor_hidden = false;
if gutter_hovered {
let breakpoint_indicator = if gutter_hovered {
let new_point = position_map
.point_for_position(event.position)
.previous_valid;
@@ -964,7 +964,6 @@ impl EditorElement {
.buffer_for_excerpt(buffer_anchor.excerpt_id)
.and_then(|buffer| buffer.file().map(|file| (buffer, file)))
{
let was_hovered = editor.gutter_breakpoint_indicator.0.is_some();
let as_point = text::ToPoint::to_point(&buffer_anchor.text_anchor, buffer_snapshot);
let is_visible = editor
@@ -992,38 +991,43 @@ impl EditorElement {
.is_some()
});
editor.gutter_breakpoint_indicator.0 = Some(PhantomBreakpointIndicator {
display_row: new_point.row(),
is_active: is_visible,
collides_with_existing_breakpoint: has_existing_breakpoint,
});
editor.gutter_breakpoint_indicator.1.get_or_insert_with(|| {
cx.spawn(async move |this, cx| {
if !was_hovered {
if !is_visible {
editor.gutter_breakpoint_indicator.1.get_or_insert_with(|| {
cx.spawn(async move |this, cx| {
cx.background_executor()
.timer(Duration::from_millis(200))
.await;
}
this.update(cx, |this, cx| {
if let Some(indicator) = this.gutter_breakpoint_indicator.0.as_mut() {
indicator.is_active = true;
}
cx.notify();
this.update(cx, |this, cx| {
if let Some(indicator) = this.gutter_breakpoint_indicator.0.as_mut()
{
indicator.is_active = true;
cx.notify();
}
})
.ok();
})
.ok();
})
});
});
}
Some(PhantomBreakpointIndicator {
display_row: new_point.row(),
is_active: is_visible,
collides_with_existing_breakpoint: has_existing_breakpoint,
})
} else {
editor.gutter_breakpoint_indicator = (None, None);
editor.gutter_breakpoint_indicator.1 = None;
None
}
} else {
editor.gutter_breakpoint_indicator = (None, None);
}
editor.gutter_breakpoint_indicator.1 = None;
None
};
cx.notify();
if &breakpoint_indicator != &editor.gutter_breakpoint_indicator.0 {
editor.gutter_breakpoint_indicator.0 = breakpoint_indicator;
cx.notify();
}
// Don't trigger hover popover if mouse is hovering over context menu
if text_hitbox.is_hovered(window) {
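
The net effect of the hunk above is that the indicator is computed first and the editor state is only mutated, and `cx.notify()` only called, when the value actually changed. A self-contained sketch of that compare-before-notify shape, with a hypothetical `Gutter` struct standing in for the gpui state:

```rust
// Illustrative only; not the editor's actual types.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
struct Indicator {
    row: u32,
    active: bool,
}

struct Gutter {
    indicator: Option<Indicator>,
    notify_count: u32,
}

impl Gutter {
    fn set_indicator(&mut self, new: Option<Indicator>) {
        // Only store and "notify" (re-render) when the computed value actually changed.
        if new != self.indicator {
            self.indicator = new;
            self.notify_count += 1;
        }
    }
}

fn main() {
    let mut gutter = Gutter { indicator: None, notify_count: 0 };
    gutter.set_indicator(Some(Indicator { row: 3, active: false }));
    gutter.set_indicator(Some(Indicator { row: 3, active: false })); // unchanged: no notify
    gutter.set_indicator(Some(Indicator { row: 3, active: true }));
    assert_eq!(gutter.notify_count, 2);
}
```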

View File

@@ -24,7 +24,6 @@ anyhow.workspace = true
assistant_tool.workspace = true
assistant_tools.workspace = true
async-trait.workspace = true
async-watch.workspace = true
buffer_diff.workspace = true
chrono.workspace = true
clap.workspace = true
@@ -66,5 +65,6 @@ toml.workspace = true
unindent.workspace = true
util.workspace = true
uuid.workspace = true
watch.workspace = true
workspace-hack.workspace = true
zed_llm_client.workspace = true

View File

@@ -385,7 +385,7 @@ pub fn init(cx: &mut App) -> Arc<AgentAppState> {
extension::init(cx);
let (tx, rx) = async_watch::channel(None);
let (mut tx, rx) = watch::channel(None);
cx.observe_global::<SettingsStore>(move |cx| {
let settings = &ProjectSettings::get_global(cx).node;
let options = NodeBinaryOptions {

View File

@@ -726,7 +726,7 @@ fn completion_replace_range(snapshot: &BufferSnapshot, anchor: &Anchor) -> Optio
if end_in_line > start_in_line {
let replace_start = snapshot.anchor_before(line_start + start_in_line);
let replace_end = snapshot.anchor_before(line_start + end_in_line);
let replace_end = snapshot.anchor_after(line_start + end_in_line);
Some(replace_start..replace_end)
} else {
None

View File

@@ -28,7 +28,6 @@ test-support = [
[dependencies]
anyhow.workspace = true
async-trait.workspace = true
async-watch.workspace = true
clock.workspace = true
collections.workspace = true
ec4rs.workspace = true
@@ -66,6 +65,7 @@ tree-sitter-typescript = { workspace = true, optional = true }
tree-sitter.workspace = true
unicase = "2.6"
util.workspace = true
watch.workspace = true
workspace-hack.workspace = true
diffy = "0.4.2"

View File

@@ -18,7 +18,6 @@ use crate::{
text_diff::text_diff,
};
use anyhow::{Context as _, Result};
use async_watch as watch;
pub use clock::ReplicaId;
use clock::{AGENT_REPLICA_ID, Lamport};
use collections::HashMap;
@@ -932,7 +931,7 @@ impl Buffer {
reparse: None,
non_text_state_update_count: 0,
sync_parse_timeout: Duration::from_millis(1),
parse_status: async_watch::channel(ParseStatus::Idle),
parse_status: watch::channel(ParseStatus::Idle),
autoindent_requests: Default::default(),
pending_autoindent: Default::default(),
language: None,

View File

@@ -39,7 +39,7 @@ use util::{ResultExt, maybe, post_inc};
#[derive(
Debug, Clone, Hash, PartialEq, Eq, PartialOrd, Ord, Serialize, Deserialize, JsonSchema,
)]
pub struct LanguageName(SharedString);
pub struct LanguageName(pub SharedString);
impl LanguageName {
pub fn new(s: &str) -> Self {
@@ -1000,6 +1000,7 @@ impl LanguageRegistry {
txs.push(tx);
}
AvailableGrammar::Unloaded(wasm_path) => {
log::trace!("start loading grammar {name:?}");
let this = self.clone();
let wasm_path = wasm_path.clone();
*grammar = AvailableGrammar::Loading(wasm_path.clone(), vec![tx]);
@@ -1025,6 +1026,7 @@ impl LanguageRegistry {
Err(error) => AvailableGrammar::LoadFailed(error.clone()),
};
log::trace!("finish loading grammar {name:?}");
let old_value = this.state.write().grammars.insert(name, value);
if let Some(AvailableGrammar::Loading(_, txs)) = old_value {
for tx in txs {

View File

@@ -7,6 +7,7 @@ use crate::{
use anyhow::Context as _;
use collections::HashMap;
use futures::FutureExt;
use gpui::SharedString;
use std::{
borrow::Cow,
cmp::{self, Ordering, Reverse},
@@ -181,6 +182,13 @@ enum ParseStepLanguage {
}
impl ParseStepLanguage {
fn name(&self) -> SharedString {
match self {
ParseStepLanguage::Loaded { language } => language.name().0,
ParseStepLanguage::Pending { name } => name.into(),
}
}
fn id(&self) -> Option<LanguageId> {
match self {
ParseStepLanguage::Loaded { language } => Some(language.id),
@@ -413,7 +421,9 @@ impl SyntaxSnapshot {
.and_then(|language| language.ok())
.is_some()
{
resolved_injection_ranges.push(layer.range.to_offset(text));
let range = layer.range.to_offset(text);
log::trace!("reparse range {range:?} for language {language_name:?}");
resolved_injection_ranges.push(range);
}
cursor.next(text);
@@ -440,7 +450,10 @@ impl SyntaxSnapshot {
invalidated_ranges: Vec<Range<usize>>,
registry: Option<&Arc<LanguageRegistry>>,
) {
log::trace!("reparse. invalidated ranges:{:?}", invalidated_ranges);
log::trace!(
"reparse. invalidated ranges:{:?}",
LogOffsetRanges(&invalidated_ranges, text),
);
let max_depth = self.layers.summary().max_depth;
let mut cursor = self.layers.cursor::<SyntaxLayerSummary>(text);
@@ -468,6 +481,13 @@ impl SyntaxSnapshot {
loop {
let step = queue.pop();
let position = if let Some(step) = &step {
log::trace!(
"parse step depth:{}, range:{:?}, language:{} ({:?})",
step.depth,
LogAnchorRange(&step.range, text),
step.language.name(),
step.language.id(),
);
SyntaxLayerPosition {
depth: step.depth,
range: step.range.clone(),
@@ -566,13 +586,13 @@ impl SyntaxSnapshot {
.to_ts_point();
}
if let Some((SyntaxLayerContent::Parsed { tree: old_tree, .. }, layer_start)) =
old_layer.map(|layer| (&layer.content, layer.range.start))
if let Some((SyntaxLayerContent::Parsed { tree: old_tree, .. }, layer_range)) =
old_layer.map(|layer| (&layer.content, layer.range.clone()))
{
log::trace!(
"existing layer. language:{}, start:{:?}, ranges:{:?}",
"existing layer. language:{}, range:{:?}, included_ranges:{:?}",
language.name(),
LogPoint(layer_start.to_point(text)),
LogAnchorRange(&layer_range, text),
LogIncludedRanges(&old_tree.included_ranges())
);
@@ -611,7 +631,7 @@ impl SyntaxSnapshot {
}
log::trace!(
"update layer. language:{}, start:{:?}, included_ranges:{:?}",
"update layer. language:{}, range:{:?}, included_ranges:{:?}",
language.name(),
LogAnchorRange(&step.range, text),
LogIncludedRanges(&included_ranges),
@@ -745,28 +765,36 @@ impl SyntaxSnapshot {
#[cfg(debug_assertions)]
fn check_invariants(&self, text: &BufferSnapshot) {
let mut max_depth = 0;
let mut prev_range: Option<Range<Anchor>> = None;
let mut prev_layer: Option<(Range<Anchor>, Option<LanguageId>)> = None;
for layer in self.layers.iter() {
match Ord::cmp(&layer.depth, &max_depth) {
Ordering::Less => {
panic!("layers out of order")
}
Ordering::Equal => {
if let Some(prev_range) = prev_range {
if let Some((prev_range, prev_language_id)) = prev_layer {
match layer.range.start.cmp(&prev_range.start, text) {
Ordering::Less => panic!("layers out of order"),
Ordering::Equal => {
assert!(layer.range.end.cmp(&prev_range.end, text).is_ge())
}
Ordering::Equal => match layer.range.end.cmp(&prev_range.end, text) {
Ordering::Less => panic!("layers out of order"),
Ordering::Equal => {
if layer.content.language_id() < prev_language_id {
panic!("layers out of order")
}
}
Ordering::Greater => {}
},
Ordering::Greater => {}
}
}
prev_layer = Some((layer.range.clone(), layer.content.language_id()));
}
Ordering::Greater => {
prev_layer = None;
}
Ordering::Greater => {}
}
max_depth = layer.depth;
prev_range = Some(layer.range.clone());
}
}
@@ -1619,7 +1647,7 @@ impl Ord for ParseStep {
Ord::cmp(&other.depth, &self.depth)
.then_with(|| Ord::cmp(&range_b.start, &range_a.start))
.then_with(|| Ord::cmp(&range_a.end, &range_b.end))
.then_with(|| self.language.id().cmp(&other.language.id()))
.then_with(|| other.language.id().cmp(&self.language.id()))
}
}
@@ -1865,6 +1893,7 @@ impl ToTreeSitterPoint for Point {
struct LogIncludedRanges<'a>(&'a [tree_sitter::Range]);
struct LogPoint(Point);
struct LogAnchorRange<'a>(&'a Range<Anchor>, &'a text::BufferSnapshot);
struct LogOffsetRanges<'a>(&'a [Range<usize>], &'a text::BufferSnapshot);
struct LogChangedRegions<'a>(&'a ChangeRegionSet, &'a text::BufferSnapshot);
impl fmt::Debug for LogIncludedRanges<'_> {
@@ -1886,6 +1915,16 @@ impl fmt::Debug for LogAnchorRange<'_> {
}
}
impl fmt::Debug for LogOffsetRanges<'_> {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.debug_list()
.entries(self.0.iter().map(|range| {
LogPoint(range.start.to_point(self.1))..LogPoint(range.end.to_point(self.1))
}))
.finish()
}
}
impl fmt::Debug for LogChangedRegions<'_> {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.debug_list()

View File

@@ -788,15 +788,99 @@ fn test_empty_combined_injections_inside_injections(cx: &mut App) {
"(template...",
// Markdown inline content
"(inline)",
// HTML within the ERB
"(document (text))",
// The ruby syntax tree should be empty, since there are
// no interpolations in the ERB template.
"(program)",
// HTML within the ERB
"(document (text))",
],
);
}
#[gpui::test]
fn test_syntax_map_languages_loading_with_erb(cx: &mut App) {
let text = r#"
<body>
<% if @one %>
<div class=one>
<% else %>
<div class=two>
<% end %>
</div>
</body>
"#
.unindent();
let registry = Arc::new(LanguageRegistry::test(cx.background_executor().clone()));
let mut buffer = Buffer::new(0, BufferId::new(1).unwrap(), text);
let mut syntax_map = SyntaxMap::new(&buffer);
syntax_map.set_language_registry(registry.clone());
let language = Arc::new(erb_lang());
log::info!("parsing");
registry.add(language.clone());
syntax_map.reparse(language.clone(), &buffer);
log::info!("loading html");
registry.add(Arc::new(html_lang()));
syntax_map.reparse(language.clone(), &buffer);
log::info!("loading ruby");
registry.add(Arc::new(ruby_lang()));
syntax_map.reparse(language.clone(), &buffer);
assert_capture_ranges(
&syntax_map,
&buffer,
&["tag", "ivar"],
"
<«body»>
<% if «@one» %>
<«div» class=one>
<% else %>
<«div» class=two>
<% end %>
</«div»>
</«body»>
",
);
let text = r#"
<body>
<% if @one«_hundred» %>
<div class=one>
<% else %>
<div class=two>
<% end %>
</div>
</body>
"#
.unindent();
log::info!("editing");
buffer.edit_via_marked_text(&text);
syntax_map.interpolate(&buffer);
syntax_map.reparse(language.clone(), &buffer);
assert_capture_ranges(
&syntax_map,
&buffer,
&["tag", "ivar"],
"
<«body»>
<% if «@one_hundred» %>
<«div» class=one>
<% else %>
<«div» class=two>
<% end %>
</«div»>
</«body»>
",
);
}
#[gpui::test(iterations = 50)]
fn test_random_syntax_map_edits_rust_macros(rng: StdRng, cx: &mut App) {
let text = r#"

View File

@@ -6213,7 +6213,7 @@ impl MultiBufferSnapshot {
cursor.seek_to_start_of_current_excerpt();
let region = cursor.region()?;
let offset = region.range.start;
let buffer_offset = region.buffer_range.start;
let buffer_offset = start_excerpt.buffer_start_offset();
let excerpt_offset = cursor.excerpts.start().clone();
Some(MultiBufferExcerpt {
diff_transforms: cursor.diff_transforms,

View File

@@ -2842,6 +2842,22 @@ async fn test_random_multibuffer(cx: &mut TestAppContext, mut rng: StdRng) {
.unwrap()
+ 1
);
let reference_ranges = cx.update(|cx| {
reference
.excerpts
.iter()
.map(|excerpt| {
(
excerpt.id,
excerpt.range.to_offset(&excerpt.buffer.read(cx).snapshot()),
)
})
.collect::<HashMap<_, _>>()
});
for i in 0..snapshot.len() {
let excerpt = snapshot.excerpt_containing(i..i).unwrap();
assert_eq!(excerpt.buffer_range(), reference_ranges[&excerpt.id()]);
}
assert_consistent_line_numbers(&snapshot);
assert_position_translation(&snapshot);

View File

@@ -18,7 +18,6 @@ test-support = []
[dependencies]
anyhow.workspace = true
async-compression.workspace = true
async-watch.workspace = true
async-tar.workspace = true
async-trait.workspace = true
futures.workspace = true
@@ -30,6 +29,7 @@ serde.workspace = true
serde_json.workspace = true
smol.workspace = true
util.workspace = true
watch.workspace = true
which.workspace = true
workspace-hack.workspace = true

View File

@@ -34,7 +34,7 @@ struct NodeRuntimeState {
http: Arc<dyn HttpClient>,
instance: Option<Box<dyn NodeRuntimeTrait>>,
last_options: Option<NodeBinaryOptions>,
options: async_watch::Receiver<Option<NodeBinaryOptions>>,
options: watch::Receiver<Option<NodeBinaryOptions>>,
shell_env_loaded: Shared<oneshot::Receiver<()>>,
}
@@ -42,7 +42,7 @@ impl NodeRuntime {
pub fn new(
http: Arc<dyn HttpClient>,
shell_env_loaded: Option<oneshot::Receiver<()>>,
options: async_watch::Receiver<Option<NodeBinaryOptions>>,
options: watch::Receiver<Option<NodeBinaryOptions>>,
) -> Self {
NodeRuntime(Arc::new(Mutex::new(NodeRuntimeState {
http,
@@ -58,7 +58,7 @@ impl NodeRuntime {
http: Arc::new(http_client::BlockedHttpClient),
instance: None,
last_options: None,
options: async_watch::channel(Some(NodeBinaryOptions::default())).1,
options: watch::channel(Some(NodeBinaryOptions::default())).1,
shell_env_loaded: oneshot::channel().1.shared(),
})))
}

View File

@@ -1107,38 +1107,14 @@ impl OutlinePanel {
});
} else {
let mut offset = Point::default();
let expand_excerpt_control_height = 1.0;
if let Some(buffer_id) = scroll_to_buffer {
let current_folded = active_editor.read(cx).is_buffer_folded(buffer_id, cx);
if current_folded {
let previous_buffer_id = self
.fs_entries
.iter()
.rev()
.filter_map(|entry| match entry {
FsEntry::File(file) => Some(file.buffer_id),
FsEntry::ExternalFile(external_file) => {
Some(external_file.buffer_id)
}
FsEntry::Directory(..) => None,
})
.skip_while(|id| *id != buffer_id)
.nth(1);
if let Some(previous_buffer_id) = previous_buffer_id {
if !active_editor
.read(cx)
.is_buffer_folded(previous_buffer_id, cx)
{
offset.y += expand_excerpt_control_height;
}
}
} else {
if multi_buffer_snapshot.as_singleton().is_none() {
offset.y = -(active_editor.read(cx).file_header_size() as f32);
}
offset.y -= expand_excerpt_control_height;
if multi_buffer_snapshot.as_singleton().is_none()
&& !active_editor.read(cx).is_buffer_folded(buffer_id, cx)
{
offset.y = -(active_editor.read(cx).file_header_size() as f32);
}
}
active_editor.update(cx, |editor, cx| {
editor.set_scroll_anchor(ScrollAnchor { offset, anchor }, window, cx);
});

View File

@@ -576,7 +576,12 @@ impl DapStore {
const LIMIT: usize = 100;
if value.len() > LIMIT {
value.truncate(LIMIT);
let mut index = LIMIT;
// If `index` isn't a char boundary, `truncate` will cause a panic
while !value.is_char_boundary(index) {
index -= 1;
}
value.truncate(index);
value.push_str("...");
}

View File

@@ -990,10 +990,41 @@ impl Session {
request: dap::messages::Request,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
let request_args = serde_json::from_value::<RunInTerminalRequestArguments>(
let request_args = match serde_json::from_value::<RunInTerminalRequestArguments>(
request.arguments.unwrap_or_default(),
)
.expect("To parse StartDebuggingRequestArguments");
) {
Ok(args) => args,
Err(error) => {
return cx.spawn(async move |session, cx| {
let error = serde_json::to_value(dap::ErrorResponse {
error: Some(dap::Message {
id: request.seq,
format: error.to_string(),
variables: None,
send_telemetry: None,
show_user: None,
url: None,
url_label: None,
}),
})
.ok();
session
.update(cx, |this, cx| {
this.respond_to_client(
request.seq,
false,
StartDebugging::COMMAND.to_string(),
error,
cx,
)
})?
.await?;
Err(anyhow!("Failed to parse RunInTerminalRequestArguments"))
});
}
};
let seq = request.seq;
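
The shape of the fix, parse the arguments and turn a failure into an error response instead of unwrapping, can be sketched with plain `serde_json` and a hypothetical `RunInTerminalArgs` struct; the real `dap` request and response types are richer than this:

```rust
use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct RunInTerminalArgs {
    #[serde(default)]
    cwd: String,
    args: Vec<String>,
}

fn handle_run_in_terminal(raw: serde_json::Value, respond: impl FnOnce(bool, String)) {
    match serde_json::from_value::<RunInTerminalArgs>(raw) {
        Ok(parsed) => respond(true, format!("spawn {:?} in {:?}", parsed.args, parsed.cwd)),
        // Malformed arguments become a failed response to the client, not a crash.
        Err(error) => respond(false, error.to_string()),
    }
}

fn main() {
    handle_run_in_terminal(serde_json::json!({ "args": ["bash", "-l"] }), |ok, msg| {
        assert!(ok);
        println!("{msg}");
    });
    handle_run_in_terminal(serde_json::json!({ "cwd": 42 }), |ok, msg| {
        assert!(!ok);
        println!("rejected: {msg}");
    });
}
```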

View File

@@ -249,7 +249,7 @@ async fn load_shell_environment(
use util::shell_env;
let dir_ = dir.to_owned();
let mut envs = match smol::unblock(move || shell_env::capture(Some(dir_))).await {
let mut envs = match smol::unblock(move || shell_env::capture(&dir_)).await {
Ok(envs) => envs,
Err(err) => {
util::log_err(&err);

View File

@@ -171,7 +171,8 @@ impl ConflictSet {
let mut conflicts = Vec::new();
let mut line_pos = 0;
let mut lines = buffer.text_for_range(0..buffer.len()).lines();
let buffer_len = buffer.len();
let mut lines = buffer.text_for_range(0..buffer_len).lines();
let mut conflict_start: Option<usize> = None;
let mut ours_start: Option<usize> = None;
@@ -212,7 +213,7 @@ impl ConflictSet {
&& theirs_start.is_some()
{
let theirs_end = line_pos;
let conflict_end = line_end + 1;
let conflict_end = (line_end + 1).min(buffer_len);
let range = buffer.anchor_after(conflict_start.unwrap())
..buffer.anchor_before(conflict_end);
@@ -390,6 +391,22 @@ mod tests {
assert_eq!(their_text, "This is their version in a nested conflict\n");
}
#[test]
fn test_conflict_markers_at_eof() {
let test_content = r#"
<<<<<<< ours
=======
This is their version
>>>>>>> "#
.unindent();
let buffer_id = BufferId::new(1).unwrap();
let buffer = Buffer::new(0, buffer_id, test_content.to_string());
let snapshot = buffer.snapshot();
let conflict_snapshot = ConflictSet::parse(&snapshot);
assert_eq!(conflict_snapshot.conflicts.len(), 1);
}
#[test]
fn test_conflicts_in_range() {
// Create a buffer with conflict markers

View File

@@ -24,7 +24,6 @@ test-support = ["fs/test-support"]
[dependencies]
anyhow.workspace = true
askpass.workspace = true
async-watch.workspace = true
backtrace = "0.3"
chrono.workspace = true
clap.workspace = true
@@ -63,6 +62,7 @@ smol.workspace = true
sysinfo.workspace = true
telemetry_events.workspace = true
util.workspace = true
watch.workspace = true
worktree.workspace = true
[target.'cfg(not(windows))'.dependencies]

View File

@@ -756,7 +756,7 @@ fn initialize_settings(
session: Arc<ChannelClient>,
fs: Arc<dyn Fs>,
cx: &mut App,
) -> async_watch::Receiver<Option<NodeBinaryOptions>> {
) -> watch::Receiver<Option<NodeBinaryOptions>> {
let user_settings_file_rx = watch_config_file(
&cx.background_executor(),
fs,
@@ -791,7 +791,7 @@ fn initialize_settings(
}
});
let (tx, rx) = async_watch::channel(None);
let (mut tx, rx) = watch::channel(None);
cx.observe_global::<SettingsStore>(move |cx| {
let settings = &ProjectSettings::get_global(cx).node;
log::info!("Got new node settings: {:?}", settings);

View File

@@ -42,6 +42,7 @@ walkdir.workspace = true
workspace-hack.workspace = true
[target.'cfg(unix)'.dependencies]
command-fds = "0.3.1"
libc.workspace = true
[target.'cfg(windows)'.dependencies]

View File

@@ -1,16 +1,21 @@
#![cfg_attr(not(unix), allow(unused))]
use anyhow::{Context as _, Result};
use collections::HashMap;
use std::borrow::Cow;
use std::ffi::OsStr;
use std::io::Read;
use std::path::{Path, PathBuf};
use std::process::Command;
use tempfile::NamedTempFile;
/// Capture all environment variables from the login shell.
pub fn capture(change_dir: Option<impl AsRef<Path>>) -> Result<HashMap<String, String>> {
let shell_path = std::env::var("SHELL").map(PathBuf::from)?;
let shell_name = shell_path.file_name().and_then(OsStr::to_str);
#[cfg(unix)]
pub fn capture(directory: &std::path::Path) -> Result<collections::HashMap<String, String>> {
use std::os::unix::process::CommandExt;
use std::process::Stdio;
let shell_path = std::env::var("SHELL").map(std::path::PathBuf::from)?;
let shell_name = shell_path.file_name().and_then(std::ffi::OsStr::to_str);
let mut command = std::process::Command::new(&shell_path);
command.stdin(Stdio::null());
command.stdout(Stdio::piped());
command.stderr(Stdio::piped());
let mut command_string = String::new();
@@ -18,10 +23,7 @@ pub fn capture(change_dir: Option<impl AsRef<Path>>) -> Result<HashMap<String, S
// the project directory to get the env in there as if the user
// `cd`'d into it. We do that because tools like direnv, asdf, ...
// hook into `cd` and only set up the env after that.
if let Some(dir) = change_dir {
let dir_str = dir.as_ref().to_string_lossy();
command_string.push_str(&format!("cd '{dir_str}';"));
}
command_string.push_str(&format!("cd '{}';", directory.display()));
// In certain shells we need to execute additional_command in order to
// trigger the behavior of direnv, etc.
@@ -30,26 +32,26 @@ pub fn capture(change_dir: Option<impl AsRef<Path>>) -> Result<HashMap<String, S
_ => "",
});
let mut env_output_file = NamedTempFile::new()?;
command_string.push_str(&format!(
"sh -c 'export -p' > '{}';",
env_output_file.path().to_string_lossy(),
));
let mut command = Command::new(&shell_path);
// In some shells, file descriptors greater than 2 cannot be used in interactive mode,
// so file descriptor 0 is used instead.
const ENV_OUTPUT_FD: std::os::fd::RawFd = 0;
command_string.push_str(&format!("sh -c 'export -p >&{ENV_OUTPUT_FD}';"));
// For csh/tcsh, the login shell option is set by passing `-` as
// the 0th argument instead of using `-l`.
if let Some("tcsh" | "csh") = shell_name {
#[cfg(unix)]
std::os::unix::process::CommandExt::arg0(&mut command, "-");
command.arg0("-");
} else {
command.arg("-l");
}
command.args(["-i", "-c", &command_string]);
let process_output = super::set_pre_exec_to_start_new_session(&mut command).output()?;
super::set_pre_exec_to_start_new_session(&mut command);
let (env_output, process_output) = spawn_and_read_fd(command, ENV_OUTPUT_FD)?;
let env_output = String::from_utf8_lossy(&env_output);
anyhow::ensure!(
process_output.status.success(),
"login shell exited with {}. stdout: {:?}, stderr: {:?}",
@@ -58,15 +60,36 @@ pub fn capture(change_dir: Option<impl AsRef<Path>>) -> Result<HashMap<String, S
String::from_utf8_lossy(&process_output.stderr),
);
let mut env_output = String::new();
env_output_file.read_to_string(&mut env_output)?;
parse(&env_output)
.filter_map(|entry| match entry {
Ok((name, value)) => Some(Ok((name.into(), value?.into()))),
Err(err) => Some(Err(err)),
})
.collect::<Result<HashMap<String, String>>>()
.collect::<Result<_>>()
}
#[cfg(unix)]
fn spawn_and_read_fd(
mut command: std::process::Command,
child_fd: std::os::fd::RawFd,
) -> anyhow::Result<(Vec<u8>, std::process::Output)> {
use command_fds::{CommandFdExt, FdMapping};
use std::io::Read;
let (mut reader, writer) = std::io::pipe()?;
command.fd_mappings(vec![FdMapping {
parent_fd: writer.into(),
child_fd,
}])?;
let process = command.spawn()?;
drop(command);
let mut buffer = Vec::new();
reader.read_to_end(&mut buffer)?;
Ok((buffer, process.wait_with_output()?))
}
/// Parse the result of calling `sh -c 'export -p'`.
@@ -107,7 +130,10 @@ fn parse_name_and_terminator(input: &str, terminator: char) -> Option<(Cow<'_, s
}
fn parse_literal_and_terminator(input: &str, terminator: char) -> Option<(Cow<'_, str>, &str)> {
if let Some((literal, rest)) = parse_literal_single_quoted(input) {
if let Some((literal, rest)) = parse_literal_ansi_c_quoted(input) {
let rest = rest.strip_prefix(terminator)?;
Some((Cow::Owned(literal), rest))
} else if let Some((literal, rest)) = parse_literal_single_quoted(input) {
let rest = rest.strip_prefix(terminator)?;
Some((Cow::Borrowed(literal), rest))
} else if let Some((literal, rest)) = parse_literal_double_quoted(input) {
@@ -120,6 +146,52 @@ fn parse_literal_and_terminator(input: &str, terminator: char) -> Option<(Cow<'_
}
}
/// https://www.gnu.org/software/bash/manual/html_node/ANSI_002dC-Quoting.html
fn parse_literal_ansi_c_quoted(input: &str) -> Option<(String, &str)> {
let rest = input.strip_prefix("$'")?;
let mut char_indices = rest.char_indices();
let mut escaping = false;
let (literal, rest) = loop {
let (index, char) = char_indices.next()?;
if char == '\'' && !escaping {
break (&rest[..index], &rest[index + 1..]);
} else {
escaping = !escaping && char == '\\';
}
};
let mut result = String::new();
let mut chars = literal.chars();
while let Some(ch) = chars.next() {
if ch == '\\' {
match chars.next() {
Some('n') => result.push('\n'),
Some('t') => result.push('\t'),
Some('r') => result.push('\r'),
Some('\\') => result.push('\\'),
Some('\'') => result.push('\''),
Some('"') => result.push('"'),
Some('a') => result.push('\x07'), // bell
Some('b') => result.push('\x08'), // backspace
Some('f') => result.push('\x0C'), // form feed
Some('v') => result.push('\x0B'), // vertical tab
Some('0') => result.push('\0'), // null
Some(other) => {
// For unknown escape sequences, keep the backslash and character
result.push('\\');
result.push(other);
}
None => result.push('\\'), // trailing backslash
}
} else {
result.push(ch);
}
}
Some((result, rest))
}
/// https://www.gnu.org/software/bash/manual/html_node/Single-Quotes.html
fn parse_literal_single_quoted(input: &str) -> Option<(&str, &str)> {
input.strip_prefix('\'')?.split_once('\'')
@@ -154,6 +226,17 @@ fn parse_literal_double_quoted(input: &str) -> Option<(String, &str)> {
mod tests {
use super::*;
#[cfg(unix)]
#[test]
fn test_spawn_and_read_fd() -> anyhow::Result<()> {
let mut command = std::process::Command::new("sh");
super::super::set_pre_exec_to_start_new_session(&mut command);
command.args(["-lic", "printf 'abc%.0s' $(seq 1 65536) >&0"]);
let (bytes, _) = spawn_and_read_fd(command, 0)?;
assert_eq!(bytes.len(), 65536 * 3);
Ok(())
}
#[test]
fn test_parse() {
let input = indoc::indoc! {r#"
@@ -186,6 +269,7 @@ mod tests {
\"wo\
rld\"\n!\\
!"
export foo=$'hello\nworld'
"#};
let expected_values = [
@@ -210,6 +294,7 @@ mod tests {
Some(indoc::indoc! {r#"
`Hello`
"world"\n!\!"#}),
Some("hello\nworld"),
];
let expected = expected_values
.into_iter()
@@ -270,4 +355,43 @@ mod tests {
assert_eq!(expected, actual);
assert_eq!(rest, "\nrest");
}
#[test]
fn test_parse_literal_ansi_c_quoted() {
let (actual, rest) = parse_literal_ansi_c_quoted("$'hello\\nworld'\nrest").unwrap();
assert_eq!(actual, "hello\nworld");
assert_eq!(rest, "\nrest");
let (actual, rest) = parse_literal_ansi_c_quoted("$'tab\\there'\nrest").unwrap();
assert_eq!(actual, "tab\there");
assert_eq!(rest, "\nrest");
let (actual, rest) = parse_literal_ansi_c_quoted("$'quote\\'\\'end'\nrest").unwrap();
assert_eq!(actual, "quote''end");
assert_eq!(rest, "\nrest");
let (actual, rest) = parse_literal_ansi_c_quoted("$'backslash\\\\end'\nrest").unwrap();
assert_eq!(actual, "backslash\\end");
assert_eq!(rest, "\nrest");
}
#[test]
fn test_parse_buildphase_export() {
let input = r#"export buildPhase=$'{ echo "------------------------------------------------------------";\n echo " WARNING: the existence of this path is not guaranteed.";\n echo " It is an internal implementation detail for pkgs.mkShell.";\n echo "------------------------------------------------------------";\n echo;\n # Record all build inputs as runtime dependencies\n export;\n} >> "$out"\n'
"#;
let expected_value = r#"{ echo "------------------------------------------------------------";
echo " WARNING: the existence of this path is not guaranteed.";
echo " It is an internal implementation detail for pkgs.mkShell.";
echo "------------------------------------------------------------";
echo;
# Record all build inputs as runtime dependencies
export;
} >> "$out"
"#;
let ((name, value), _rest) = parse_declaration(input).unwrap();
assert_eq!(name, "buildPhase");
assert_eq!(value.as_deref(), Some(expected_value));
}
}

View File

@@ -315,7 +315,7 @@ pub fn load_login_shell_environment() -> Result<()> {
// into shell's `cd` command (and hooks) to manipulate env.
// We do this so that we get the env a user would have when spawning a shell
// in home directory.
for (name, value) in shell_env::capture(Some(paths::home_dir()))? {
for (name, value) in shell_env::capture(paths::home_dir())? {
unsafe { env::set_var(&name, &value) };
}

24
crates/watch/Cargo.toml Normal file
View File

@@ -0,0 +1,24 @@
[package]
name = "watch"
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "Apache-2.0"
[lints]
workspace = true
[lib]
path = "src/watch.rs"
doctest = true
[dependencies]
parking_lot.workspace = true
workspace-hack.workspace = true
[dev-dependencies]
ctor.workspace = true
futures.workspace = true
gpui = { workspace = true, features = ["test-support"] }
rand.workspace = true
zlog.workspace = true

1
crates/watch/LICENSE-APACHE Symbolic link
View File

@@ -0,0 +1 @@
../../LICENSE-APACHE

25
crates/watch/src/error.rs Normal file
View File

@@ -0,0 +1,25 @@
//! Watch error types.
use std::fmt;
#[derive(Debug, Eq, PartialEq)]
pub struct NoReceiverError;
impl fmt::Display for NoReceiverError {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(fmt, "all receivers were dropped")
}
}
impl std::error::Error for NoReceiverError {}
#[derive(Debug, Eq, PartialEq)]
pub struct NoSenderError;
impl fmt::Display for NoSenderError {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(fmt, "sender was dropped")
}
}
impl std::error::Error for NoSenderError {}

279
crates/watch/src/watch.rs Normal file
View File

@@ -0,0 +1,279 @@
mod error;
pub use error::*;
use parking_lot::{RwLock, RwLockReadGuard, RwLockUpgradableReadGuard};
use std::{
collections::BTreeMap,
mem,
pin::Pin,
sync::Arc,
task::{Context, Poll, Waker},
};
pub fn channel<T>(value: T) -> (Sender<T>, Receiver<T>) {
let state = Arc::new(RwLock::new(State {
value,
wakers: BTreeMap::new(),
next_waker_id: WakerId::default(),
version: 0,
closed: false,
}));
(
Sender {
state: state.clone(),
},
Receiver { state, version: 0 },
)
}
#[derive(Default, Debug, Copy, Clone, PartialEq, Eq, PartialOrd, Ord)]
struct WakerId(usize);
impl WakerId {
fn post_inc(&mut self) -> Self {
let id = *self;
self.0 = id.0.wrapping_add(1);
*self
}
}
struct State<T> {
value: T,
wakers: BTreeMap<WakerId, Waker>,
next_waker_id: WakerId,
version: usize,
closed: bool,
}
pub struct Sender<T> {
state: Arc<RwLock<State<T>>>,
}
impl<T> Sender<T> {
pub fn receiver(&self) -> Receiver<T> {
let version = self.state.read().version;
Receiver {
state: self.state.clone(),
version,
}
}
pub fn send(&mut self, value: T) -> Result<(), NoReceiverError> {
if let Some(state) = Arc::get_mut(&mut self.state) {
let state = state.get_mut();
state.value = value;
debug_assert_eq!(state.wakers.len(), 0);
Err(NoReceiverError)
} else {
let mut state = self.state.write();
state.value = value;
state.version = state.version.wrapping_add(1);
let wakers = mem::take(&mut state.wakers);
drop(state);
for (_, waker) in wakers {
waker.wake();
}
Ok(())
}
}
}
impl<T> Drop for Sender<T> {
fn drop(&mut self) {
let mut state = self.state.write();
state.closed = true;
for (_, waker) in mem::take(&mut state.wakers) {
waker.wake();
}
}
}
#[derive(Clone)]
pub struct Receiver<T> {
state: Arc<RwLock<State<T>>>,
version: usize,
}
struct Changed<'a, T> {
receiver: &'a mut Receiver<T>,
pending_waker_id: Option<WakerId>,
}
impl<T> Future for Changed<'_, T> {
type Output = Result<(), NoSenderError>;
fn poll(mut self: Pin<&mut Self>, cx: &mut Context) -> Poll<Self::Output> {
let this = &mut *self;
let state = this.receiver.state.upgradable_read();
if state.version != this.receiver.version {
// The sender produced a new value. Avoid unregistering the pending
// waker, because the sender has already done so.
this.pending_waker_id = None;
this.receiver.version = state.version;
Poll::Ready(Ok(()))
} else if state.closed {
Poll::Ready(Err(NoSenderError))
} else {
let mut state = RwLockUpgradableReadGuard::upgrade(state);
// Unregister the pending waker. This should happen automatically
// when the waker gets awoken by the sender, but if this future was
// polled again without an explicit call to `wake` (e.g., a spurious
// wake by the executor), we need to remove it manually.
if let Some(pending_waker_id) = this.pending_waker_id.take() {
state.wakers.remove(&pending_waker_id);
}
// Register the waker for this future.
let waker_id = state.next_waker_id.post_inc();
state.wakers.insert(waker_id, cx.waker().clone());
this.pending_waker_id = Some(waker_id);
Poll::Pending
}
}
}
impl<T> Drop for Changed<'_, T> {
fn drop(&mut self) {
// If this future gets dropped before the waker has a chance of being
// awoken, we need to clear it to avoid a memory leak.
if let Some(waker_id) = self.pending_waker_id {
let mut state = self.receiver.state.write();
state.wakers.remove(&waker_id);
}
}
}
impl<T> Receiver<T> {
pub fn borrow(&mut self) -> parking_lot::MappedRwLockReadGuard<T> {
let state = self.state.read();
self.version = state.version;
RwLockReadGuard::map(state, |state| &state.value)
}
pub fn changed(&mut self) -> impl Future<Output = Result<(), NoSenderError>> {
Changed {
receiver: self,
pending_waker_id: None,
}
}
}
impl<T: Clone> Receiver<T> {
pub async fn recv(&mut self) -> Result<T, NoSenderError> {
self.changed().await?;
Ok(self.borrow().clone())
}
}
#[cfg(test)]
mod tests {
use super::*;
use futures::{FutureExt, select_biased};
use gpui::{AppContext, TestAppContext};
use std::{
pin::pin,
sync::atomic::{AtomicBool, AtomicUsize, Ordering::SeqCst},
};
#[gpui::test]
async fn test_basic_watch() {
let (mut sender, mut receiver) = channel(0);
assert_eq!(sender.send(1), Ok(()));
assert_eq!(receiver.recv().await, Ok(1));
assert_eq!(sender.send(2), Ok(()));
assert_eq!(sender.send(3), Ok(()));
assert_eq!(receiver.recv().await, Ok(3));
drop(receiver);
assert_eq!(sender.send(4), Err(NoReceiverError));
let mut receiver = sender.receiver();
assert_eq!(sender.send(5), Ok(()));
assert_eq!(receiver.recv().await, Ok(5));
// Ensure `changed` doesn't resolve if we just read the latest value
// using `borrow`.
assert_eq!(sender.send(6), Ok(()));
assert_eq!(*receiver.borrow(), 6);
assert_eq!(receiver.changed().now_or_never(), None);
assert_eq!(sender.send(7), Ok(()));
drop(sender);
assert_eq!(receiver.recv().await, Ok(7));
assert_eq!(receiver.recv().await, Err(NoSenderError));
}
#[gpui::test(iterations = 1000)]
async fn test_watch_random(cx: &mut TestAppContext) {
let next_id = Arc::new(AtomicUsize::new(1));
let closed = Arc::new(AtomicBool::new(false));
let (mut tx, rx) = channel(0);
let mut tasks = Vec::new();
tasks.push(cx.background_spawn({
let executor = cx.executor().clone();
let next_id = next_id.clone();
let closed = closed.clone();
async move {
for _ in 0..16 {
executor.simulate_random_delay().await;
let id = next_id.fetch_add(1, SeqCst);
zlog::info!("sending {}", id);
tx.send(id).ok();
}
closed.store(true, SeqCst);
}
}));
for receiver_id in 0..16 {
let executor = cx.executor().clone();
let next_id = next_id.clone();
let closed = closed.clone();
let mut rx = rx.clone();
let mut prev_observed_value = *rx.borrow();
tasks.push(cx.background_spawn(async move {
for _ in 0..16 {
executor.simulate_random_delay().await;
zlog::info!("{}: receiving", receiver_id);
let mut timeout = executor.simulate_random_delay().fuse();
let mut recv = pin!(rx.recv().fuse());
select_biased! {
_ = timeout => {
zlog::info!("{}: dropping recv future", receiver_id);
}
result = recv => {
match result {
Ok(value) => {
zlog::info!("{}: received {}", receiver_id, value);
assert_eq!(value, next_id.load(SeqCst) - 1);
assert_ne!(value, prev_observed_value);
prev_observed_value = value;
}
Err(NoSenderError) => {
zlog::info!("{}: closed", receiver_id);
assert!(closed.load(SeqCst));
break;
}
}
}
}
}
}));
}
futures::future::join_all(tasks).await;
}
#[ctor::ctor]
fn init_logger() {
zlog::init_test();
}
}
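
A usage sketch only, assuming the crate above is available as `watch` and using `futures::executor::block_on` to drive the future; the crate's own tests remain the authoritative examples:

```rust
use futures::executor::block_on;

fn main() {
    let (mut tx, mut rx) = watch::channel(0u32);

    // Sends are coalesced: a receiver that catches up later only sees the latest value.
    tx.send(1).expect("receiver still alive");
    tx.send(2).expect("receiver still alive");
    block_on(async {
        assert_eq!(rx.recv().await, Ok(2));
    });

    // Once every receiver is gone, `send` reports `NoReceiverError`.
    drop(rx);
    assert!(tx.send(3).is_err());
}
```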

View File

@@ -2174,10 +2174,6 @@ impl Pane {
self.pinned_tab_count > ix
}
fn has_pinned_tabs(&self) -> bool {
self.pinned_tab_count != 0
}
fn has_unpinned_tabs(&self) -> bool {
self.pinned_tab_count < self.items.len()
}
@@ -2893,6 +2889,7 @@ impl Pane {
|| cfg!(not(target_os = "macos")) && window.modifiers().control;
let from_pane = dragged_tab.pane.clone();
let from_ix = dragged_tab.ix;
self.workspace
.update(cx, |_, cx| {
cx.defer_in(window, move |workspace, window, cx| {
@@ -2900,8 +2897,11 @@ impl Pane {
to_pane = workspace.split_pane(to_pane, split_direction, window, cx);
}
let database_id = workspace.database_id();
let old_ix = from_pane.read(cx).index_for_item_id(item_id);
let old_len = to_pane.read(cx).items.len();
let was_pinned_in_from_pane = from_pane.read_with(cx, |pane, _| {
pane.index_for_item_id(item_id)
.is_some_and(|ix| pane.is_tab_pinned(ix))
});
let to_pane_old_length = to_pane.read(cx).items.len();
if is_clone {
let Some(item) = from_pane
.read(cx)
@@ -2919,38 +2919,36 @@ impl Pane {
} else {
move_item(&from_pane, &to_pane, item_id, ix, true, window, cx);
}
if to_pane == from_pane {
if let Some(old_index) = old_ix {
to_pane.update(cx, |this, _| {
if old_index < this.pinned_tab_count
&& (ix == this.items.len() || ix > this.pinned_tab_count)
{
this.pinned_tab_count -= 1;
} else if this.has_pinned_tabs()
&& old_index >= this.pinned_tab_count
&& ix < this.pinned_tab_count
{
this.pinned_tab_count += 1;
}
});
}
} else {
to_pane.update(cx, |this, _| {
if this.items.len() > old_len // Did we not deduplicate on drag?
&& this.has_pinned_tabs()
&& ix < this.pinned_tab_count
to_pane.update(cx, |this, _| {
if to_pane == from_pane {
let moved_right = ix > from_ix;
let ix = if moved_right { ix - 1 } else { ix };
let is_pinned_in_to_pane = this.is_tab_pinned(ix);
if (was_pinned_in_from_pane && is_pinned_in_to_pane)
|| (!was_pinned_in_from_pane && !is_pinned_in_to_pane)
{
return;
}
if is_pinned_in_to_pane {
this.pinned_tab_count += 1;
} else {
this.pinned_tab_count -= 1;
}
} else if this.items.len() >= to_pane_old_length {
let is_pinned_in_to_pane = this.is_tab_pinned(ix);
let item_created_pane = to_pane_old_length == 0;
let is_first_position = ix == 0;
let was_dropped_at_beginning = item_created_pane || is_first_position;
let should_remain_pinned = is_pinned_in_to_pane
|| (was_pinned_in_from_pane && was_dropped_at_beginning);
if should_remain_pinned {
this.pinned_tab_count += 1;
}
});
from_pane.update(cx, |this, _| {
if let Some(index) = old_ix {
if this.pinned_tab_count > index {
this.pinned_tab_count -= 1;
}
}
})
}
}
});
});
})
.log_err();
@@ -4325,6 +4323,725 @@ mod tests {
assert_item_labels(&pane, ["C", "A", "B*"], cx);
}
#[gpui::test]
async fn test_drag_unpinned_tab_to_split_creates_pane_with_unpinned_tab(
cx: &mut TestAppContext,
) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A, B. Pin B. Activate A
let item_a = add_labeled_item(&pane_a, "A", false, cx);
let item_b = add_labeled_item(&pane_a, "B", false, cx);
pane_a.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_b.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.activate_item(ix, true, true, window, cx);
});
// Drag A to create new split
pane_a.update_in(cx, |pane, window, cx| {
pane.drag_split_direction = Some(SplitDirection::Right);
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 0, window, cx);
});
// A should be moved to new pane. B should remain pinned, A should not be pinned
let (pane_a, pane_b) = workspace.read_with(cx, |workspace, _| {
let panes = workspace.panes();
(panes[0].clone(), panes[1].clone())
});
assert_item_labels(&pane_a, ["B*!"], cx);
assert_item_labels(&pane_b, ["A*"], cx);
}
#[gpui::test]
async fn test_drag_pinned_tab_to_split_creates_pane_with_pinned_tab(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A, B. Pin both. Activate A
let item_a = add_labeled_item(&pane_a, "A", false, cx);
let item_b = add_labeled_item(&pane_a, "B", false, cx);
pane_a.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
let ix = pane.index_for_item_id(item_b.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.activate_item(ix, true, true, window, cx);
});
assert_item_labels(&pane_a, ["A*!", "B!"], cx);
// Drag A to create new split
pane_a.update_in(cx, |pane, window, cx| {
pane.drag_split_direction = Some(SplitDirection::Right);
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 0, window, cx);
});
// A should be moved to new pane. Both A and B should still be pinned
let (pane_a, pane_b) = workspace.read_with(cx, |workspace, _| {
let panes = workspace.panes();
(panes[0].clone(), panes[1].clone())
});
assert_item_labels(&pane_a, ["B*!"], cx);
assert_item_labels(&pane_b, ["A*!"], cx);
}
#[gpui::test]
async fn test_drag_pinned_tab_into_existing_panes_pinned_region(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A to pane A and pin
let item_a = add_labeled_item(&pane_a, "A", false, cx);
pane_a.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_a, ["A*!"], cx);
// Add B to pane B and pin
let pane_b = workspace.update_in(cx, |workspace, window, cx| {
workspace.split_pane(pane_a.clone(), SplitDirection::Right, window, cx)
});
let item_b = add_labeled_item(&pane_b, "B", false, cx);
pane_b.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_b.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_b, ["B*!"], cx);
// Move A from pane A to pane B's pinned region
pane_b.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 0, window, cx);
});
// A should stay pinned
assert_item_labels(&pane_a, [], cx);
assert_item_labels(&pane_b, ["A*!", "B!"], cx);
}
#[gpui::test]
async fn test_drag_pinned_tab_into_existing_panes_unpinned_region(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A to pane A and pin
let item_a = add_labeled_item(&pane_a, "A", false, cx);
pane_a.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_a, ["A*!"], cx);
// Create pane B with pinned item B
let pane_b = workspace.update_in(cx, |workspace, window, cx| {
workspace.split_pane(pane_a.clone(), SplitDirection::Right, window, cx)
});
let item_b = add_labeled_item(&pane_b, "B", false, cx);
assert_item_labels(&pane_b, ["B*"], cx);
pane_b.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_b.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_b, ["B*!"], cx);
// Move A from pane A to pane B's unpinned region
pane_b.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 1, window, cx);
});
// A should become unpinned since it was dropped in the unpinned region
assert_item_labels(&pane_a, [], cx);
assert_item_labels(&pane_b, ["B!", "A*"], cx);
}
#[gpui::test]
async fn test_drag_pinned_tab_into_existing_panes_first_position_with_no_pinned_tabs(
cx: &mut TestAppContext,
) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A to pane A and pin
let item_a = add_labeled_item(&pane_a, "A", false, cx);
pane_a.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_a, ["A*!"], cx);
// Add B to pane B
let pane_b = workspace.update_in(cx, |workspace, window, cx| {
workspace.split_pane(pane_a.clone(), SplitDirection::Right, window, cx)
});
add_labeled_item(&pane_b, "B", false, cx);
assert_item_labels(&pane_b, ["B*"], cx);
// Move A from pane A to position 0 in pane B, indicating it should stay pinned
pane_b.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 0, window, cx);
});
// A should stay pinned
assert_item_labels(&pane_a, [], cx);
assert_item_labels(&pane_b, ["A*!", "B"], cx);
}
#[gpui::test]
async fn test_drag_pinned_tab_into_existing_pane_at_max_capacity_closes_unpinned_tabs(
cx: &mut TestAppContext,
) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
set_max_tabs(cx, Some(2));
// Add A, B to pane A. Pin both
let item_a = add_labeled_item(&pane_a, "A", false, cx);
let item_b = add_labeled_item(&pane_a, "B", false, cx);
pane_a.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
let ix = pane.index_for_item_id(item_b.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_a, ["A!", "B*!"], cx);
// Add C, D to pane B. Pin both
let pane_b = workspace.update_in(cx, |workspace, window, cx| {
workspace.split_pane(pane_a.clone(), SplitDirection::Right, window, cx)
});
let item_c = add_labeled_item(&pane_b, "C", false, cx);
let item_d = add_labeled_item(&pane_b, "D", false, cx);
pane_b.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_c.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
let ix = pane.index_for_item_id(item_d.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_b, ["C!", "D*!"], cx);
// Add a third unpinned item to pane B; this exceeds max tabs but is allowed,
// since we permit 1 tab over the max when the others are pinned or dirty
add_labeled_item(&pane_b, "E", false, cx);
assert_item_labels(&pane_b, ["C!", "D!", "E*"], cx);
// Drag pinned A from pane A to position 0 in pane B
pane_b.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 0, window, cx);
});
// E (unpinned) should be closed, leaving 3 pinned items
assert_item_labels(&pane_a, ["B*!"], cx);
assert_item_labels(&pane_b, ["A*!", "C!", "D!"], cx);
}
#[gpui::test]
async fn test_drag_last_pinned_tab_to_same_position_stays_pinned(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A to pane A and pin it
let item_a = add_labeled_item(&pane_a, "A", false, cx);
pane_a.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_a, ["A*!"], cx);
// Drag pinned A to position 1 (directly to the right) in the same pane
pane_a.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 1, window, cx);
});
// A should still be pinned and active
assert_item_labels(&pane_a, ["A*!"], cx);
}
#[gpui::test]
async fn test_drag_pinned_tab_beyond_last_pinned_tab_in_same_pane_stays_pinned(
cx: &mut TestAppContext,
) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A, B to pane A and pin both
let item_a = add_labeled_item(&pane_a, "A", false, cx);
let item_b = add_labeled_item(&pane_a, "B", false, cx);
pane_a.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
let ix = pane.index_for_item_id(item_b.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_a, ["A!", "B*!"], cx);
// Drag pinned A right of B in the same pane
pane_a.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 2, window, cx);
});
// A stays pinned
assert_item_labels(&pane_a, ["B!", "A*!"], cx);
}
#[gpui::test]
async fn test_drag_pinned_tab_beyond_unpinned_tab_in_same_pane_becomes_unpinned(
cx: &mut TestAppContext,
) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A, B to pane A and pin A
let item_a = add_labeled_item(&pane_a, "A", false, cx);
add_labeled_item(&pane_a, "B", false, cx);
pane_a.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_a, ["A!", "B*"], cx);
// Drag pinned A right of B in the same pane
pane_a.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 2, window, cx);
});
// A becomes unpinned
assert_item_labels(&pane_a, ["B", "A*"], cx);
}
#[gpui::test]
async fn test_drag_unpinned_tab_in_front_of_pinned_tab_in_same_pane_becomes_pinned(
cx: &mut TestAppContext,
) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A, B to pane A and pin A
let item_a = add_labeled_item(&pane_a, "A", false, cx);
let item_b = add_labeled_item(&pane_a, "B", false, cx);
pane_a.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_a, ["A!", "B*"], cx);
// Drag unpinned B left of A in the same pane
pane_a.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_b.boxed_clone(),
ix: 1,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 0, window, cx);
});
// B becomes pinned; A stays pinned
assert_item_labels(&pane_a, ["B*!", "A!"], cx);
}
#[gpui::test]
async fn test_drag_unpinned_tab_to_the_pinned_region_stays_pinned(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A, B, C to pane A and pin A
let item_a = add_labeled_item(&pane_a, "A", false, cx);
add_labeled_item(&pane_a, "B", false, cx);
let item_c = add_labeled_item(&pane_a, "C", false, cx);
pane_a.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_a, ["A!", "B", "C*"], cx);
// Drag unpinned C left of B in the same pane
pane_a.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_c.boxed_clone(),
ix: 2,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 1, window, cx);
});
// A stays pinned, B and C remain unpinned
assert_item_labels(&pane_a, ["A!", "C*", "B"], cx);
}
#[gpui::test]
async fn test_drag_unpinned_tab_into_existing_panes_pinned_region(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add unpinned item A to pane A
let item_a = add_labeled_item(&pane_a, "A", false, cx);
assert_item_labels(&pane_a, ["A*"], cx);
// Create pane B with pinned item B
let pane_b = workspace.update_in(cx, |workspace, window, cx| {
workspace.split_pane(pane_a.clone(), SplitDirection::Right, window, cx)
});
let item_b = add_labeled_item(&pane_b, "B", false, cx);
pane_b.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_b.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_b, ["B*!"], cx);
// Move A from pane A to pane B's pinned region
pane_b.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 0, window, cx);
});
// A should become pinned since it was dropped in the pinned region
assert_item_labels(&pane_a, [], cx);
assert_item_labels(&pane_b, ["A*!", "B!"], cx);
}
#[gpui::test]
async fn test_drag_unpinned_tab_into_existing_panes_unpinned_region(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add unpinned item A to pane A
let item_a = add_labeled_item(&pane_a, "A", false, cx);
assert_item_labels(&pane_a, ["A*"], cx);
// Create pane B with one pinned item B
let pane_b = workspace.update_in(cx, |workspace, window, cx| {
workspace.split_pane(pane_a.clone(), SplitDirection::Right, window, cx)
});
let item_b = add_labeled_item(&pane_b, "B", false, cx);
pane_b.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_b.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_b, ["B*!"], cx);
// Move A from pane A to pane B's unpinned region
pane_b.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 1, window, cx);
});
// A should remain unpinned since it was dropped outside the pinned region
assert_item_labels(&pane_a, [], cx);
assert_item_labels(&pane_b, ["B!", "A*"], cx);
}
#[gpui::test]
async fn test_drag_pinned_tab_throughout_entire_range_of_pinned_tabs_both_directions(
cx: &mut TestAppContext,
) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A, B, C and pin all
let item_a = add_labeled_item(&pane_a, "A", false, cx);
let item_b = add_labeled_item(&pane_a, "B", false, cx);
let item_c = add_labeled_item(&pane_a, "C", false, cx);
assert_item_labels(&pane_a, ["A", "B", "C*"], cx);
pane_a.update_in(cx, |pane, window, cx| {
let ix = pane.index_for_item_id(item_a.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
let ix = pane.index_for_item_id(item_b.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
let ix = pane.index_for_item_id(item_c.item_id()).unwrap();
pane.pin_tab_at(ix, window, cx);
});
assert_item_labels(&pane_a, ["A!", "B!", "C*!"], cx);
// Move A to right of B
pane_a.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 1, window, cx);
});
// A should be after B and all are pinned
assert_item_labels(&pane_a, ["B!", "A*!", "C!"], cx);
// Move A to right of C
pane_a.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 1,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 2, window, cx);
});
// A should be after C and all are pinned
assert_item_labels(&pane_a, ["B!", "C!", "A*!"], cx);
// Move A to left of C
pane_a.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 2,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 1, window, cx);
});
// A should be before C and all are pinned
assert_item_labels(&pane_a, ["B!", "A*!", "C!"], cx);
// Move A to left of B
pane_a.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 1,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 0, window, cx);
});
// A should be before B and all are pinned
assert_item_labels(&pane_a, ["A*!", "B!", "C!"], cx);
}
#[gpui::test]
async fn test_drag_first_tab_to_last_position(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A, B, C
let item_a = add_labeled_item(&pane_a, "A", false, cx);
add_labeled_item(&pane_a, "B", false, cx);
add_labeled_item(&pane_a, "C", false, cx);
assert_item_labels(&pane_a, ["A", "B", "C*"], cx);
// Move A to the end
pane_a.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_a.boxed_clone(),
ix: 0,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 2, window, cx);
});
// A should be at the end
assert_item_labels(&pane_a, ["B", "C", "A*"], cx);
}
#[gpui::test]
async fn test_drag_last_tab_to_first_position(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, None, cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let pane_a = workspace.read_with(cx, |workspace, _| workspace.active_pane().clone());
// Add A, B, C
add_labeled_item(&pane_a, "A", false, cx);
add_labeled_item(&pane_a, "B", false, cx);
let item_c = add_labeled_item(&pane_a, "C", false, cx);
assert_item_labels(&pane_a, ["A", "B", "C*"], cx);
// Move C to the beginning
pane_a.update_in(cx, |pane, window, cx| {
let dragged_tab = DraggedTab {
pane: pane_a.clone(),
item: item_c.boxed_clone(),
ix: 2,
detail: 0,
is_active: true,
};
pane.handle_tab_drop(&dragged_tab, 0, window, cx);
});
// C should be at the beginning
assert_item_labels(&pane_a, ["C*", "A", "B"], cx);
}
#[gpui::test]
async fn test_add_item_with_new_item(cx: &mut TestAppContext) {
init_test(cx);

View File

@@ -2,7 +2,7 @@
description = "The fast, collaborative code editor."
edition.workspace = true
name = "zed"
version = "0.190.0"
version = "0.190.4"
publish.workspace = true
license = "GPL-3.0-or-later"
authors = ["Zed Team <hi@zed.dev>"]
@@ -24,7 +24,6 @@ assets.workspace = true
assistant_context_editor.workspace = true
assistant_tool.workspace = true
assistant_tools.workspace = true
-async-watch.workspace = true
audio.workspace = true
auto_update.workspace = true
auto_update_ui.workspace = true
@@ -138,6 +137,7 @@ util.workspace = true
uuid.workspace = true
vim.workspace = true
vim_mode_setting.workspace = true
+watch.workspace = true
web_search.workspace = true
web_search_providers.workspace = true
welcome.workspace = true

View File

@@ -1 +1 @@
-dev
+stable

View File

@@ -410,7 +410,7 @@ Error: Running Zed as root or via sudo is unsupported.
let mut languages = LanguageRegistry::new(cx.background_executor().clone());
languages.set_language_server_download_dir(paths::languages_dir().clone());
let languages = Arc::new(languages);
-let (tx, rx) = async_watch::channel(None);
+let (mut tx, rx) = watch::channel(None);
cx.observe_global::<SettingsStore>(move |cx| {
let settings = &ProjectSettings::get_global(cx).node;
let options = NodeBinaryOptions {

View File

@@ -5,6 +5,7 @@ components = [ "rustfmt", "clippy" ]
targets = [
"x86_64-apple-darwin",
"aarch64-apple-darwin",
"x86_64-unknown-freebsd",
"x86_64-unknown-linux-gnu",
"x86_64-pc-windows-msvc",
"wasm32-wasip2", # extensions

script/bundle-freebsd (new executable file, 160 lines)
View File

@@ -0,0 +1,160 @@
#!/usr/bin/env bash
set -euxo pipefail
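# Exit on errors and unset variables, echo each command, and fail on pipeline errors.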
source script/lib/blob-store.sh
# Function for displaying help info
help_info() {
echo "
Usage: ${0##*/} [options]
Build a release .tar.gz for FreeBSD.
Options:
-h Display this help and exit.
"
}
while getopts 'h' flag; do
case "${flag}" in
h)
help_info
exit 0
;;
esac
done
export ZED_BUNDLE=true
channel=$(<crates/zed/RELEASE_CHANNEL)
target_dir="${CARGO_TARGET_DIR:-target}"
version="$(script/get-crate-version zed)"
# Set RELEASE_VERSION so it is compiled into GPUI, which needs to know the release version.
export RELEASE_VERSION="${version}"
commit=$(git rev-parse HEAD | cut -c 1-7)
version_info=$(rustc --version --verbose)
host_line=$(echo "$version_info" | grep host)
target_triple=${host_line#*: }
remote_server_triple=${REMOTE_SERVER_TARGET:-"${target_triple}"}
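# REMOTE_SERVER_TARGET may override the remote_server build target; it defaults to the host triple.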
# musl_triple=${target_triple%-gnu}-musl
# rustup_installed=false
# if command -v rustup >/dev/null 2>&1; then
# rustup_installed=true
# fi
# Generate the licenses first, so they can be baked into the binaries
# script/generate-licenses
# if "$rustup_installed"; then
# rustup target add "$remote_server_triple"
# fi
# export CC=$(which clang)
# Build binary in release mode
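# The rpath flags below let the bundled binaries locate shared libraries in ../lib relative to the executable.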
export RUSTFLAGS="${RUSTFLAGS:-} -C link-args=-Wl,--disable-new-dtags,-rpath,\$ORIGIN/../lib"
# cargo build --release --target "${target_triple}" --package zed --package cli
# Build remote_server in separate invocation to prevent feature unification from other crates
# from influencing dynamic libraries required by it.
# if [[ "$remote_server_triple" == "$musl_triple" ]]; then
# export RUSTFLAGS="${RUSTFLAGS:-} -C target-feature=+crt-static"
# fi
cargo build --release --target "${remote_server_triple}" --package remote_server
# Strip debug symbols and save them for upload to DigitalOcean
# objcopy --only-keep-debug "${target_dir}/${target_triple}/release/zed" "${target_dir}/${target_triple}/release/zed.dbg"
# objcopy --only-keep-debug "${target_dir}/${remote_server_triple}/release/remote_server" "${target_dir}/${remote_server_triple}/release/remote_server.dbg"
# objcopy --strip-debug "${target_dir}/${target_triple}/release/zed"
# objcopy --strip-debug "${target_dir}/${target_triple}/release/cli"
# objcopy --strip-debug "${target_dir}/${remote_server_triple}/release/remote_server"
# gzip -f "${target_dir}/${target_triple}/release/zed.dbg"
# gzip -f "${target_dir}/${remote_server_triple}/release/remote_server.dbg"
# if [[ -n "${DIGITALOCEAN_SPACES_SECRET_KEY:-}" && -n "${DIGITALOCEAN_SPACES_ACCESS_KEY:-}" ]]; then
# upload_to_blob_store_public \
# "zed-debug-symbols" \
# "${target_dir}/${target_triple}/release/zed.dbg.gz" \
# "$channel/zed-$version-${target_triple}.dbg.gz"
# upload_to_blob_store_public \
# "zed-debug-symbols" \
# "${target_dir}/${remote_server_triple}/release/remote_server.dbg.gz" \
# "$channel/remote_server-$version-${remote_server_triple}.dbg.gz"
# fi
# Ensure that remote_server does not depend on libssl or libcrypto, since we removed those dependencies.
if ldd "${target_dir}/${remote_server_triple}/release/remote_server" | grep -q 'libcrypto\|libssl'; then
echo "Error: remote_server still depends on libssl or libcrypto" && exit 1
fi
suffix=""
if [ "$channel" != "stable" ]; then
suffix="-$channel"
fi
# Move everything that should end up in the final package
# into a temp directory.
# temp_dir=$(mktemp -d)
# zed_dir="${temp_dir}/zed$suffix.app"
# Binary
# mkdir -p "${zed_dir}/bin" "${zed_dir}/libexec"
# cp "${target_dir}/${target_triple}/release/zed" "${zed_dir}/libexec/zed-editor"
# cp "${target_dir}/${target_triple}/release/cli" "${zed_dir}/bin/zed"
# Libs
# find_libs() {
# ldd ${target_dir}/${target_triple}/release/zed |
# cut -d' ' -f3 |
# grep -v '\<\(libstdc++.so\|libc.so\|libgcc_s.so\|libm.so\|libpthread.so\|libdl.so\|libasound.so\)'
# }
# mkdir -p "${zed_dir}/lib"
# rm -rf "${zed_dir}/lib/*"
# cp $(find_libs) "${zed_dir}/lib"
# Icons
# mkdir -p "${zed_dir}/share/icons/hicolor/512x512/apps"
# cp "crates/zed/resources/app-icon$suffix.png" "${zed_dir}/share/icons/hicolor/512x512/apps/zed.png"
# mkdir -p "${zed_dir}/share/icons/hicolor/1024x1024/apps"
# cp "crates/zed/resources/app-icon$suffix@2x.png" "${zed_dir}/share/icons/hicolor/1024x1024/apps/zed.png"
# .desktop
# export DO_STARTUP_NOTIFY="true"
# export APP_CLI="zed"
# export APP_ICON="zed"
# export APP_ARGS="%U"
# if [[ "$channel" == "preview" ]]; then
# export APP_NAME="Zed Preview"
# elif [[ "$channel" == "nightly" ]]; then
# export APP_NAME="Zed Nightly"
# elif [[ "$channel" == "dev" ]]; then
# export APP_NAME="Zed Devel"
# else
# export APP_NAME="Zed"
# fi
# mkdir -p "${zed_dir}/share/applications"
# envsubst <"crates/zed/resources/zed.desktop.in" >"${zed_dir}/share/applications/zed$suffix.desktop"
# Copy generated licenses so they'll end up in archive too
# cp "assets/licenses.md" "${zed_dir}/licenses.md"
# Create archive out of everything that's in the temp directory
arch=$(uname -m)
# target="freebsd-${arch}"
# if [[ "$channel" == "dev" ]]; then
# archive="zed-${commit}-${target}.tar.gz"
# else
# archive="zed-${target}.tar.gz"
# fi
# rm -rf "${archive}"
# remove_match="zed(-[a-zA-Z0-9]+)?-linux-$(uname -m)\.tar\.gz"
# ls "${target_dir}/release" | grep -E ${remove_match} | xargs -d "\n" -I {} rm -f "${target_dir}/release/{}" || true
# tar -czvf "${target_dir}/release/$archive" -C ${temp_dir} "zed$suffix.app"
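# Compress the FreeBSD remote_server binary for distribution.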
gzip -f --stdout --best "${target_dir}/${remote_server_triple}/release/remote_server" \
> "${target_dir}/zed-remote-server-freebsd-x86_64.gz"