Compare commits

..

39 Commits

Author SHA1 Message Date
Jakub Konka
637296ce6e git_panel: Removing offending notify 2025-11-13 17:15:12 +01:00
Jakub Konka
90710b91c5 git_panel: Clean up a little 2025-11-12 23:14:18 +01:00
Jakub Konka
53f1fd5a27 git_panel: Debugging flickers 2025-11-12 19:21:05 +01:00
Joseph T. Lyons
e98de8c7ca v0.212.x stable 2025-11-12 10:15:09 -05:00
zed-zippy[bot]
fbd98c1fdc agent_servers: Fix panic when setting default mode (#42452) (cherry-pick to preview) (#42455)
Cherry-pick of #42452 to preview

----
Closes ZED-35A

Release Notes:

- Fixed an issue where Zed would panic when trying to set the default
mode for ACP agents

Co-authored-by: Bennet Bo Fenner <bennet@zed.dev>
2025-11-11 17:12:44 +01:00
zed-zippy[bot]
50b491b360 gpui: Do not panic when unable to find the selected fonts (#42212) (cherry-pick to preview) (#42456)
Cherry-pick of #42212 to preview

----
Fixes ZED-329

Release Notes:

- N/A *or* Added/Fixed/Improved ...

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-11 16:44:51 +01:00
zed-zippy[bot]
9dab0e5f9b remote: Add more context to error logging in wsl (#42450) (cherry-pick to preview) (#42451)
Cherry-pick of #42450 to preview

----
cc https://github.com/zed-industries/zed/issues/40892

Release Notes:

- N/A *or* Added/Fixed/Improved ...

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-11 15:20:50 +00:00
zed-zippy[bot]
b0c8952b0c gpui: Fix invalid unwrap in windows window creation (#42426) (cherry-pick to preview) (#42429)
Cherry-pick of #42426 to preview

----
Fixes ZED-34M

Release Notes:

- N/A *or* Added/Fixed/Improved ...

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-11 12:18:48 +01:00
zed-zippy[bot]
47bdf9f026 terminal: Spawn terminal process on main thread on macos again (#42411) (cherry-pick to preview) (#42413)
Cherry-pick of #42411 to preview

----
Closes https://github.com/zed-industries/zed/issues/42365, follow up to
https://github.com/zed-industries/zed/pull/42234

Release Notes:

- N/A *or* Added/Fixed/Improved ...

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-11 11:57:24 +01:00
zed-zippy[bot]
02cc7765f9 diagnostics: Fix panic due to non-sorted diagnostics excerpt ranges (#42416) (cherry-pick to preview) (#42419)
Cherry-pick of #42416 to preview

----
Fixes ZED-356

Release Notes:

- N/A *or* Added/Fixed/Improved ...

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>

Co-authored-by: Lukas Wirth <lukas@zed.dev>
Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
2025-11-11 10:36:06 +00:00
zed-zippy[bot]
07ffb17991 remote: Flush to stdin when writing to sftp (#42103) (cherry-pick to preview) (#42361)
Cherry-pick of #42103 to preview

----

https://github.com/zed-industries/zed/issues/42027#issuecomment-3497210172

Release Notes:

- Fixed ssh remoting potentially failing due to not flushing stdin to
sftp

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-10 21:34:08 +00:00
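
A minimal sketch of the fix described in the cherry-pick above (#42103), assuming a generic buffered writer standing in for the sftp process's stdin; names are illustrative, not Zed's actual API:

```rust
use std::io::Write;

// Write a payload to the sftp child's stdin and flush immediately, so the
// bytes actually reach the remote process instead of sitting in a buffer --
// the missing flush was the failure mode described above.
fn send_to_sftp<W: Write>(stdin: &mut W, payload: &[u8]) -> std::io::Result<()> {
    stdin.write_all(payload)?;
    stdin.flush()
}
```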
Conrad Irwin
b8c644a12e Create sentry releases in after_release (#42169)
This had been moved to auto-release preview, and was not running for
stable.

Closes #ISSUE

Release Notes:

- N/A
2025-11-10 10:17:13 -07:00
Zed Bot
f7dac91680 Bump to 0.212.3 for @bennetbo 2025-11-10 16:35:26 +00:00
zed-zippy[bot]
5a8a80a956 diagnostics: Keep diagnostic excerpt ranges properly ordered (#42298) (cherry-pick to preview) (#42300)
Cherry-pick of #42298 to preview

----
Fixes ZED-2CQ

We were doing the binary search by buffer points, but due to await
points within this function we could end up mixing points of differing
buffer versions.

Release Notes:

- N/A *or* Added/Fixed/Improved ...

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-10 14:37:03 +00:00
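
A minimal sketch of the pitfall described in the commit above (#42298), using hypothetical stand-in types: offsets computed against one buffer snapshot must not be reused after an await point, because the buffer may change in the meantime; re-resolve everything against a single post-await snapshot instead.

```rust
// Hypothetical stand-ins for a text buffer snapshot and an async boundary.
#[derive(Clone)]
struct Snapshot {
    text: String,
}

impl Snapshot {
    // Offsets are only meaningful for the snapshot they were computed from.
    fn offset_for_line(&self, line: usize) -> usize {
        self.text.lines().take(line).map(|l| l.len() + 1).sum()
    }
}

async fn compute_excerpt_range(
    fetch_snapshot: impl Fn() -> Snapshot,
    start_line: usize,
    end_line: usize,
) -> std::ops::Range<usize> {
    // BAD (the bug): take an offset from one snapshot here...
    // let start = fetch_snapshot().offset_for_line(start_line);
    // ...then await, letting the buffer change underneath us.
    resolve_diagnostics().await;
    // GOOD (the fix): compute both ends from one snapshot taken after the
    // await, so the resulting ranges stay consistent and sorted.
    let snapshot = fetch_snapshot();
    snapshot.offset_for_line(start_line)..snapshot.offset_for_line(end_line)
}

async fn resolve_diagnostics() {}
```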
zed-zippy[bot]
0c64f5185f acp: Fix issue with mentions when embedded_context is set to false (#42260) (cherry-pick to preview) (#42320)
Cherry-pick of #42260 to preview

----
Release Notes:

- acp: Fixed an issue where Zed would not respect
`PromptCapabilities::embedded_context`

Co-authored-by: Bennet Bo Fenner <bennet@zed.dev>
2025-11-10 09:14:19 +00:00
zed-zippy[bot]
56bf1e09f7 agent_ui: Always allow to include symbols (#42261) (cherry-pick to preview) (#42321)
Cherry-pick of #42261 to preview

----
We can always include symbols, since we either include a ResourceLink to
the symbol (when `PromptCapabilities::embedded_context = false`) or a
Resource (when `PromptCapabilities::embedded_context = true`)

Release Notes:

- Fixed an issue where symbols could not be included when using specific
ACP agents

Co-authored-by: Bennet Bo Fenner <bennet@zed.dev>
2025-11-10 08:16:21 +00:00
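
A minimal sketch of the branching the commit above (#42261) describes, with hypothetical content types; the real agent-client-protocol types differ:

```rust
// Hypothetical stand-ins for the ACP content kinds mentioned above.
enum MentionContent {
    // When embedded_context is unsupported, send only a link to the symbol.
    ResourceLink { uri: String },
    // When embedded_context is supported, embed the symbol's text directly.
    Resource { uri: String, text: String },
}

fn symbol_mention(uri: String, text: String, embedded_context: bool) -> MentionContent {
    if embedded_context {
        MentionContent::Resource { uri, text }
    } else {
        MentionContent::ResourceLink { uri }
    }
}
```

Either way a symbol can be represented, which is why the mention is always allowed.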
zed-zippy[bot]
1ce5974917 Fix crash during drag-and-drop on Windows (#42227) (cherry-pick to preview) (#42301)
Cherry-pick of #42227 to preview

----
The HGLOBAL is itself the HDROP. Do not dereference it.

Release Notes:

- windows: Fixed crashes during drag-and-drop operations

Co-authored-by: John Tur <john-tur@outlook.com>
2025-11-09 11:29:39 +00:00
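
A minimal sketch of the handle confusion described above, using hypothetical newtype wrappers rather than the real Win32 bindings: the global memory handle carried by a CF_HDROP payload already is the drop handle, so it should be reinterpreted, not read through.

```rust
// Hypothetical handle newtypes mirroring the Win32 ones referenced above.
#[derive(Clone, Copy)]
struct HGlobal(*mut core::ffi::c_void);
#[derive(Clone, Copy)]
struct HDrop(*mut core::ffi::c_void);

fn hdrop_from_hglobal(hglobal: HGlobal) -> HDrop {
    // BAD (the crash): unsafe { HDrop(*(hglobal.0 as *const *mut core::ffi::c_void)) }
    // GOOD: the HGLOBAL value itself is the HDROP.
    HDrop(hglobal.0)
}
```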
zed-zippy[bot]
db7d11e879 terminal: Spawn terminal process on main thread on unix (#42234) (cherry-pick to preview) (#42236)
Cherry-pick of #42234 to preview

----
Otherwise the terminal will not process the signals correctly 

Release Notes:

- Fixed ctrl+c and friends not working in the terminal on macOS and
linux

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-07 21:08:56 +00:00
Joseph T. Lyons
c70617c69e zed 0.212.2 2025-11-07 11:25:35 -05:00
zed-zippy[bot]
17afd49adb settings_ui: Use any open workspace window when opening settings links (#42106) (cherry-pick to preview) (#42196)
Cherry-pick of #42106 to preview

----
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...

Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-11-07 09:49:54 -05:00
zed-zippy[bot]
6984c9b459 workspace: Do not panic when the database is corrupted (#42186) (cherry-pick to preview) (#42189)
Cherry-pick of #42186 to preview

----
Fixes ZED-1NK

Release Notes:

- Fixed zed not starting when the database cannot be loaded

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-07 13:22:52 +00:00
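
One plausible shape of the change described above (#42186), with hypothetical names; the actual fix may differ, but the idea is to degrade gracefully instead of panicking when the on-disk state cannot be loaded:

```rust
use std::path::Path;

// Hypothetical database handle; the real type lives in Zed's `db` crate.
struct WorkspaceDb;

impl WorkspaceDb {
    fn open(path: &Path) -> std::io::Result<Self> {
        std::fs::metadata(path)?; // placeholder for the real open/validate logic
        Ok(WorkspaceDb)
    }

    fn ephemeral() -> Self {
        WorkspaceDb
    }
}

// Fall back to an empty in-memory store so the editor can still start.
fn open_workspace_db(path: &Path) -> WorkspaceDb {
    match WorkspaceDb::open(path) {
        Ok(db) => db,
        Err(err) => {
            eprintln!("failed to load workspace db at {}: {err}", path.display());
            WorkspaceDb::ephemeral()
        }
    }
}
```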
Lukas Wirth
be8038eb81 project: Fetch latest lsp data in deduplicate_range_based_lsp_requests (#41971)
Fixes ZED-2MK

Release Notes:

- Fixed a panic in inlay hints
2025-11-07 13:31:12 +01:00
Lukas Wirth
577467bed8 agent_ui: Do not show Codex wsl warning on wsl take 2 (#42096)
https://github.com/zed-industries/zed/pull/42079#discussion_r2498472887

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-07 13:30:38 +01:00
Lukas Wirth
18ddbc044f agent_ui: Do not show Codex wsl warning on wsl (#42079)
Release Notes:

- Fixed the codex wsl warning being shown on wsl itself
2025-11-07 13:30:30 +01:00
Lukas Wirth
5e1da57f25 remote: Fix detect_can_exec detection (#42087)
Closes https://github.com/zed-industries/zed/issues/42036

Release Notes:

- Fixed an issue with wsl exec detection eagerly failing, breaking
remote connections
2025-11-07 13:30:21 +01:00
Lukas Wirth
4bee58f70e util: Fix shell environment fetching with cmd (#42093)
Release Notes:

- Fixed shell environment fetching failing when having `cmd` configured
as terminal shell
2025-11-07 13:30:11 +01:00
Lukas Wirth
7eda0cd996 zlog: Add env var to enable line number logging (#41905)
Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-11-07 13:29:37 +01:00
zed-zippy[bot]
c21c47e8ef remote: Flush to stdin when writing to sftp 2 (#42126) (cherry-pick to preview) (#42187)
Cherry-pick of #42126 to preview

----
https://github.com/zed-industries/zed/pull/42103#issuecomment-3498137130

Release Notes:

- N/A *or* Added/Fixed/Improved ...

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-07 13:28:41 +01:00
zed-zippy[bot]
92142fc2b2 diagnostics: Fix diagnostics view not clearing blocks correctly (#42179) (cherry-pick to preview) (#42181)
Cherry-pick of #42179 to preview

----
Release Notes:

- N/A *or* Added/Fixed/Improved ...

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-07 11:25:39 +00:00
zed-zippy[bot]
ed0121dc3a project: Remove unnecessary panic (#42167) (cherry-pick to preview) (#42168)
Cherry-pick of #42167 to preview

----
If we are in a remote session with the remote dropped, this path is very
much reachable if the call to this function got queued up in a task.

Fixes ZED-124

Release Notes:

- N/A *or* Added/Fixed/Improved ...

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-07 07:07:06 +00:00
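
A minimal sketch of the pattern behind the fix above (#42167), with hypothetical names: a task queued before the remote session was dropped can legitimately run afterwards, so return an error instead of asserting the connection still exists.

```rust
use std::sync::Weak;

// Hypothetical handle to a remote session.
struct RemoteSession;

fn lookup_remote_buffer(remote: &Weak<RemoteSession>) -> Result<(), String> {
    // BAD (the removed panic): remote.upgrade().expect("remote dropped");
    // GOOD: the remote can be gone by the time a queued task runs.
    let _session = remote
        .upgrade()
        .ok_or_else(|| "remote session was dropped".to_string())?;
    Ok(())
}
```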
Zed Bot
3aeeed0b30 Bump to 0.212.1 for @smitbarmase 2025-11-06 15:03:25 +00:00
Smit Barmase
7e6cdabb26 language: Fix completion menu no longer prioritizing relevant items for TypeScript and Python (#42065)
Closes #41672

Regressed in https://github.com/zed-industries/zed/pull/40242

Release Notes:

- Fixed issue where completion menu no longer prioritizes relevant items
for TypeScript and Python.
2025-11-06 13:37:37 +05:30
Richard Feldman
f91f3f24ef Run ACP login from same cwd as agent server (#42038)
This makes it possible to do login via things like `cmd: "node", args:
["my-node-file.js", "login"]`

Also, that command will now use Zed's managed `node` instance.

Release Notes:

- ACP extensions can now run terminal login commands using relative
paths
2025-11-06 08:08:30 +01:00
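
A minimal sketch of what the change above (#42038) enables, using std::process directly; the agent cwd and node path are illustrative:

```rust
use std::path::Path;
use std::process::Command;

// Run the agent's login command from the agent server's own working directory
// so relative paths like "my-node-file.js" resolve the same way they do for
// the agent itself.
fn run_acp_login(agent_cwd: &Path, node_binary: &Path) -> std::io::Result<std::process::ExitStatus> {
    Command::new(node_binary) // e.g. Zed's managed node instance
        .args(["my-node-file.js", "login"])
        .current_dir(agent_cwd)
        .status()
}
```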
Danilo Leal
4ebc20b30c agent_ui: Fix how icons from external agents are displayed (#42034)
Release Notes:

- N/A
2025-11-06 08:08:09 +01:00
Danilo Leal
ac6aa735e4 gpui: Add support for rendering SVG from external files (#42024)
Release Notes:

- N/A

---------

Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
2025-11-06 08:07:31 +01:00
zed-zippy[bot]
d2cb9a13b8 Refresh zed.dev releases page after releases (#42060) (cherry-pick to preview) (#42063)
Cherry-pick of #42060 to preview

----
Release Notes:

- N/A

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-06 06:47:58 +00:00
zed-zippy[bot]
5956489a47 Fix generate release notes script on first stable (#42061) (cherry-pick to preview) (#42062)
Cherry-pick of #42061 to preview

----
Don't crash in generate-release-notes on the first stable
commit on a branch.

Release Notes:

- N/A

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-06 06:37:43 +00:00
John Tur
418d850375 Fix corrupted characters being inserted when Alt is pressed (#42033)
The Alt+Numpad buffer that's maintained by the input stack is getting
corrupted, leading to garbage characters being inserted on keystrokes
like Alt+Up. Disable the automatic handling of Alt+Numpad for now until
the cause of this corruption is understood. The Alt+Numpad input did not
work anyway, so this does not regress anything.

Release Notes:

- windows: Fixed corrupted characters being inserted when Alt is pressed
(preview only)
2025-11-05 16:11:55 -05:00
Joseph T. Lyons
7ad3cf4387 v0.212.x preview 2025-11-05 12:07:10 -05:00
492 changed files with 8973 additions and 19248 deletions

View File

@@ -39,21 +39,3 @@ body:
Output of "zed: copy system specs into clipboard"
validations:
required: true
- type: textarea
attributes:
label: If applicable, attach your `Zed.log` file to this issue.
description: |
From the command palette, run `zed: open log` to see the last 1000 lines.
Or run `zed: reveal log in file manager` to reveal the log file itself.
value: |
<details><summary>Zed.log</summary>
<!-- Paste your log inside the code block. -->
```log
```
</details>
validations:
required: false

View File

@@ -33,21 +33,3 @@ body:
Output of "zed: copy system specs into clipboard"
validations:
required: true
- type: textarea
attributes:
label: If applicable, attach your `Zed.log` file to this issue.
description: |
From the command palette, run `zed: open log` to see the last 1000 lines.
Or run `zed: reveal log in file manager` to reveal the log file itself.
value: |
<details><summary>Zed.log</summary>
<!-- Paste your log inside the code block. -->
```log
```
</details>
validations:
required: false

View File

@@ -33,21 +33,3 @@ body:
Output of "zed: copy system specs into clipboard"
validations:
required: true
- type: textarea
attributes:
label: If applicable, attach your `Zed.log` file to this issue.
description: |
From the command palette, run `zed: open log` to see the last 1000 lines.
Or run `zed: reveal log in file manager` to reveal the log file itself.
value: |
<details><summary>Zed.log</summary>
<!-- Paste your log inside the code block. -->
```log
```
</details>
validations:
required: false

View File

@@ -33,21 +33,3 @@ body:
Output of "zed: copy system specs into clipboard"
validations:
required: true
- type: textarea
attributes:
label: If applicable, attach your `Zed.log` file to this issue.
description: |
From the command palette, run `zed: open log` to see the last 1000 lines.
Or run `zed: reveal log in file manager` to reveal the log file itself.
value: |
<details><summary>Zed.log</summary>
<!-- Paste your log inside the code block. -->
```log
```
</details>
validations:
required: false

View File

@@ -56,20 +56,3 @@ body:
Output of "zed: copy system specs into clipboard"
validations:
required: true
- type: textarea
attributes:
label: If applicable, attach your `Zed.log` file to this issue.
description: |
From the command palette, run `zed: open log` to see the last 1000 lines.
Or run `zed: reveal log in file manager` to reveal the log file itself.
value: |
<details><summary>Zed.log</summary>
<!-- Paste your log inside the code block. -->
```log
```
</details>
validations:
required: false

View File

@@ -1,4 +1,4 @@
# yaml-language-server: $schema=https://www.schemastore.org/github-issue-config.json
# yaml-language-server: $schema=https://json.schemastore.org/github-issue-config.json
blank_issues_enabled: false
contact_links:
- name: Feature Request

View File

@@ -6,21 +6,7 @@ on:
types:
- published
jobs:
rebuild_releases_page:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: after_release::rebuild_releases_page::refresh_cloud_releases
run: curl -fX POST https://cloud.zed.dev/releases/refresh?expect_tag=${{ github.event.release.tag_name }}
shell: bash -euxo pipefail {0}
- name: after_release::rebuild_releases_page::redeploy_zed_dev
run: npm exec --yes -- vercel@37 --token="$VERCEL_TOKEN" --scope zed-industries redeploy https://zed.dev
shell: bash -euxo pipefail {0}
env:
VERCEL_TOKEN: ${{ secrets.VERCEL_TOKEN }}
post_to_discord:
needs:
- rebuild_releases_page
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
@@ -51,7 +37,7 @@ jobs:
webhook-url: ${{ secrets.DISCORD_WEBHOOK_RELEASE_NOTES }}
content: ${{ steps.get-content.outputs.string }}
publish_winget:
runs-on: self-32vcpu-windows-2022
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- id: set-package-name
name: after_release::publish_winget::set_package_name

View File

@@ -15,13 +15,13 @@ jobs:
stale-issue-message: >
Hi there! 👋
We're working to clean up our issue tracker by closing older bugs that might not be relevant anymore. If you are able to reproduce this issue in the latest version of Zed, please let us know by commenting on this issue, and it will be kept open. If you can't reproduce it, feel free to close the issue yourself. Otherwise, it will close automatically in 14 days.
We're working to clean up our issue tracker by closing older issues that might not be relevant anymore. If you are able to reproduce this issue in the latest version of Zed, please let us know by commenting on this issue, and we will keep it open. If you can't reproduce it, feel free to close the issue yourself. Otherwise, we'll close it in 7 days.
Thanks for your help!
close-issue-message: "This issue was closed due to inactivity. If you're still experiencing this problem, please open a new issue with a link to this issue."
days-before-stale: 60
days-before-close: 14
only-issue-types: "Bug,Crash"
days-before-stale: 120
days-before-close: 7
any-of-issue-labels: "bug,panic / crash"
operations-per-run: 1000
ascending: true
enable-statistics: true

View File

@@ -35,9 +35,6 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: compare_perf::run_perf::install_hyperfine
run: cargo install hyperfine
shell: bash -euxo pipefail {0}

View File

@@ -57,19 +57,16 @@ jobs:
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
@@ -205,9 +202,6 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
@@ -248,9 +242,6 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}

View File

@@ -93,9 +93,6 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
@@ -143,9 +140,6 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}

View File

@@ -6,21 +6,24 @@ env:
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
GOOGLE_AI_API_KEY: ${{ secrets.GOOGLE_AI_API_KEY }}
GOOGLE_CLOUD_PROJECT: ${{ secrets.GOOGLE_CLOUD_PROJECT }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_EVAL_TELEMETRY: '1'
MODEL_NAME: ${{ inputs.model_name }}
on:
workflow_dispatch:
inputs:
model_name:
description: model_name
required: true
type: string
pull_request:
types:
- synchronize
- reopened
- labeled
branches:
- '**'
schedule:
- cron: 0 0 * * *
workflow_dispatch: {}
jobs:
agent_evals:
if: |
github.repository_owner == 'zed-industries' &&
(github.event_name != 'pull_request' || contains(github.event.pull_request.labels.*.name, 'run-eval'))
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
@@ -37,9 +40,6 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
@@ -49,19 +49,14 @@ jobs:
run: cargo build --package=eval
shell: bash -euxo pipefail {0}
- name: run_agent_evals::agent_evals::run_eval
run: cargo run --package=eval -- --repetitions=8 --concurrency=1 --model "${MODEL_NAME}"
run: cargo run --package=eval -- --repetitions=8 --concurrency=1
shell: bash -euxo pipefail {0}
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
GOOGLE_AI_API_KEY: ${{ secrets.GOOGLE_AI_API_KEY }}
GOOGLE_CLOUD_PROJECT: ${{ secrets.GOOGLE_CLOUD_PROJECT }}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 600
timeout-minutes: 60
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true

View File

@@ -34,9 +34,6 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
@@ -77,9 +74,6 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}

View File

@@ -1,69 +0,0 @@
# Generated from xtask::workflows::run_cron_unit_evals
# Rebuild with `cargo xtask workflows`.
name: run_cron_unit_evals
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
on:
schedule:
- cron: 47 1 * * 2
workflow_dispatch: {}
jobs:
cron_unit_evals:
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 250
shell: bash -euxo pipefail {0}
- name: ./script/run-unit-evals
run: ./script/run-unit-evals
shell: bash -euxo pipefail {0}
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
GOOGLE_AI_API_KEY: ${{ secrets.GOOGLE_AI_API_KEY }}
GOOGLE_CLOUD_PROJECT: ${{ secrets.GOOGLE_CLOUD_PROJECT }}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
- name: run_agent_evals::cron_unit_evals::send_failure_to_slack
if: ${{ failure() }}
uses: slackapi/slack-github-action@b0fa283ad8fea605de13dc3f449259339835fc52
with:
method: chat.postMessage
token: ${{ secrets.SLACK_APP_ZED_UNIT_EVALS_BOT_TOKEN }}
payload: |
channel: C04UDRNNJFQ
text: "Unit Evals Failed: https://github.com/zed-industries/zed/actions/runs/${{ github.run_id }}"
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true

View File

@@ -143,19 +143,16 @@ jobs:
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
@@ -235,9 +232,6 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
@@ -269,19 +263,16 @@ jobs:
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: cargo build -p collab
run: cargo build -p collab
shell: bash -euxo pipefail {0}
@@ -294,6 +285,40 @@ jobs:
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
check_postgres_and_protobuf_migrations:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- name: run_tests::check_postgres_and_protobuf_migrations::remove_untracked_files
run: git clean -df
shell: bash -euxo pipefail {0}
- name: run_tests::check_postgres_and_protobuf_migrations::ensure_fresh_merge
run: |
if [ -z "$GITHUB_BASE_REF" ];
then
echo "BUF_BASE_BRANCH=$(git merge-base origin/main HEAD)" >> "$GITHUB_ENV"
else
git checkout -B temp
git merge -q "origin/$GITHUB_BASE_REF" -m "merge main into temp"
echo "BUF_BASE_BRANCH=$GITHUB_BASE_REF" >> "$GITHUB_ENV"
fi
shell: bash -euxo pipefail {0}
- name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_setup_action
uses: bufbuild/buf-setup-action@v1
with:
version: v1.29.0
- name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_breaking_action
uses: bufbuild/buf-breaking-action@v1
with:
input: crates/proto/proto/
against: https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/
timeout-minutes: 60
check_dependencies:
needs:
- orchestrate
@@ -357,9 +382,6 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: run_tests::check_docs::install_mdbook
uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08
with:
@@ -496,40 +518,6 @@ jobs:
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
check_postgres_and_protobuf_migrations:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- name: run_tests::check_postgres_and_protobuf_migrations::remove_untracked_files
run: git clean -df
shell: bash -euxo pipefail {0}
- name: run_tests::check_postgres_and_protobuf_migrations::ensure_fresh_merge
run: |
if [ -z "$GITHUB_BASE_REF" ];
then
echo "BUF_BASE_BRANCH=$(git merge-base origin/main HEAD)" >> "$GITHUB_ENV"
else
git checkout -B temp
git merge -q "origin/$GITHUB_BASE_REF" -m "merge main into temp"
echo "BUF_BASE_BRANCH=$GITHUB_BASE_REF" >> "$GITHUB_ENV"
fi
shell: bash -euxo pipefail {0}
- name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_setup_action
uses: bufbuild/buf-setup-action@v1
with:
version: v1.29.0
- name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_breaking_action
uses: bufbuild/buf-breaking-action@v1
with:
input: crates/proto/proto/
against: https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/
timeout-minutes: 60
tests_pass:
needs:
- orchestrate
@@ -539,6 +527,7 @@ jobs:
- run_tests_mac
- doctests
- check_workspace_binaries
- check_postgres_and_protobuf_migrations
- check_dependencies
- check_docs
- check_licenses
@@ -565,6 +554,7 @@ jobs:
check_result "run_tests_mac" "${{ needs.run_tests_mac.result }}"
check_result "doctests" "${{ needs.doctests.result }}"
check_result "check_workspace_binaries" "${{ needs.check_workspace_binaries.result }}"
check_result "check_postgres_and_protobuf_migrations" "${{ needs.check_postgres_and_protobuf_migrations.result }}"
check_result "check_dependencies" "${{ needs.check_dependencies.result }}"
check_result "check_docs" "${{ needs.check_docs.result }}"
check_result "check_licenses" "${{ needs.check_licenses.result }}"

View File

@@ -1,26 +1,17 @@
# Generated from xtask::workflows::run_unit_evals
# Generated from xtask::workflows::run_agent_evals
# Rebuild with `cargo xtask workflows`.
name: run_unit_evals
name: run_agent_evals
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_EVAL_TELEMETRY: '1'
MODEL_NAME: ${{ inputs.model_name }}
on:
workflow_dispatch:
inputs:
model_name:
description: model_name
required: true
type: string
commit_sha:
description: commit_sha
required: true
type: string
schedule:
- cron: 47 1 * * 2
workflow_dispatch: {}
jobs:
run_unit_evals:
unit_evals:
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
@@ -42,9 +33,6 @@ jobs:
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::download_wasi_sdk
run: ./script/download-wasi-sdk
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
@@ -56,10 +44,15 @@ jobs:
shell: bash -euxo pipefail {0}
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
GOOGLE_AI_API_KEY: ${{ secrets.GOOGLE_AI_API_KEY }}
GOOGLE_CLOUD_PROJECT: ${{ secrets.GOOGLE_CLOUD_PROJECT }}
UNIT_EVAL_COMMIT: ${{ inputs.commit_sha }}
- name: run_agent_evals::unit_evals::send_failure_to_slack
if: ${{ failure() }}
uses: slackapi/slack-github-action@b0fa283ad8fea605de13dc3f449259339835fc52
with:
method: chat.postMessage
token: ${{ secrets.SLACK_APP_ZED_UNIT_EVALS_BOT_TOKEN }}
payload: |
channel: C04UDRNNJFQ
text: "Unit Evals Failed: https://github.com/zed-industries/zed/actions/runs/${{ github.run_id }}"
- name: steps::cleanup_cargo_config
if: always()
run: |

Cargo.lock generated
View File

@@ -32,7 +32,6 @@ dependencies = [
"settings",
"smol",
"task",
"telemetry",
"tempfile",
"terminal",
"ui",
@@ -40,7 +39,6 @@ dependencies = [
"util",
"uuid",
"watch",
"zlog",
]
[[package]]
@@ -81,7 +79,6 @@ dependencies = [
"rand 0.9.2",
"serde_json",
"settings",
"telemetry",
"text",
"util",
"watch",
@@ -96,7 +93,6 @@ dependencies = [
"auto_update",
"editor",
"extension_host",
"fs",
"futures 0.3.31",
"gpui",
"language",
@@ -251,6 +247,7 @@ dependencies = [
"acp_tools",
"action_log",
"agent-client-protocol",
"agent_settings",
"anyhow",
"async-trait",
"client",
@@ -1331,14 +1328,10 @@ version = "0.1.0"
dependencies = [
"anyhow",
"client",
"clock",
"ctor",
"db",
"futures 0.3.31",
"gpui",
"http_client",
"log",
"parking_lot",
"paths",
"release_channel",
"serde",
@@ -1349,7 +1342,6 @@ dependencies = [
"util",
"which 6.0.3",
"workspace",
"zlog",
]
[[package]]
@@ -3206,7 +3198,6 @@ dependencies = [
"indoc",
"ordered-float 2.10.1",
"rustc-hash 2.1.1",
"schemars 1.0.4",
"serde",
"strum 0.27.2",
]
@@ -6248,7 +6239,7 @@ dependencies = [
"futures-core",
"futures-sink",
"nanorand",
"spin 0.9.8",
"spin",
]
[[package]]
@@ -6359,9 +6350,9 @@ checksum = "aa9a19cbb55df58761df49b23516a86d432839add4af60fc256da840f66ed35b"
[[package]]
name = "fork"
version = "0.4.0"
version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "30268f1eefccc9d72f43692e8b89e659aeb52e84016c3b32b6e7e9f1c8f38f94"
checksum = "05dc8b302e04a1c27f4fe694439ef0f29779ca4edc205b7b58f00db04e29656d"
dependencies = [
"libc",
]
@@ -6413,7 +6404,7 @@ dependencies = [
"ignore",
"libc",
"log",
"notify 8.2.0",
"notify 8.0.0",
"objc",
"parking_lot",
"paths",
@@ -7100,6 +7091,7 @@ dependencies = [
"askpass",
"buffer_diff",
"call",
"chrono",
"cloud_llm_client",
"collections",
"command_palette_hooks",
@@ -7287,7 +7279,6 @@ dependencies = [
"calloop",
"calloop-wayland-source",
"cbindgen",
"circular-buffer",
"cocoa 0.26.0",
"cocoa-foundation 0.2.0",
"collections",
@@ -7343,7 +7334,6 @@ dependencies = [
"slotmap",
"smallvec",
"smol",
"spin 0.10.0",
"stacksafe",
"strum 0.27.2",
"sum_tree",
@@ -7807,7 +7797,6 @@ dependencies = [
"parking_lot",
"serde",
"serde_json",
"serde_urlencoded",
"sha2",
"tempfile",
"url",
@@ -8875,7 +8864,6 @@ dependencies = [
"open_router",
"parking_lot",
"proto",
"schemars 1.0.4",
"serde",
"serde_json",
"settings",
@@ -9040,7 +9028,6 @@ dependencies = [
"settings",
"smol",
"task",
"terminal",
"text",
"theme",
"toml 0.8.23",
@@ -9074,7 +9061,7 @@ version = "1.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bbd2bcb4c963f2ddae06a2efc7e9f3591312473c50c6685e1f298068316e66fe"
dependencies = [
"spin 0.9.8",
"spin",
]
[[package]]
@@ -9688,7 +9675,6 @@ dependencies = [
"settings",
"theme",
"ui",
"urlencoding",
"util",
"workspace",
]
@@ -10016,18 +10002,6 @@ version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "68354c5c6bd36d73ff3feceb05efa59b6acb7626617f4962be322a825e61f79a"
[[package]]
name = "miniprofiler_ui"
version = "0.1.0"
dependencies = [
"gpui",
"serde_json",
"smol",
"util",
"workspace",
"zed_actions",
]
[[package]]
name = "miniz_oxide"
version = "0.8.9"
@@ -10435,10 +10409,11 @@ dependencies = [
[[package]]
name = "notify"
version = "8.2.0"
source = "git+https://github.com/zed-industries/notify.git?rev=b4588b2e5aee68f4c0e100f140e808cbce7b1419#b4588b2e5aee68f4c0e100f140e808cbce7b1419"
version = "8.0.0"
source = "git+https://github.com/zed-industries/notify.git?rev=bbb9ea5ae52b253e095737847e367c30653a2e96#bbb9ea5ae52b253e095737847e367c30653a2e96"
dependencies = [
"bitflags 2.9.4",
"filetime",
"fsevent-sys 4.1.0",
"inotify 0.11.0",
"kqueue",
@@ -10447,7 +10422,7 @@ dependencies = [
"mio 1.1.0",
"notify-types",
"walkdir",
"windows-sys 0.60.2",
"windows-sys 0.59.0",
]
[[package]]
@@ -10464,7 +10439,7 @@ dependencies = [
[[package]]
name = "notify-types"
version = "2.0.0"
source = "git+https://github.com/zed-industries/notify.git?rev=b4588b2e5aee68f4c0e100f140e808cbce7b1419#b4588b2e5aee68f4c0e100f140e808cbce7b1419"
source = "git+https://github.com/zed-industries/notify.git?rev=bbb9ea5ae52b253e095737847e367c30653a2e96#bbb9ea5ae52b253e095737847e367c30653a2e96"
[[package]]
name = "now"
@@ -13092,7 +13067,6 @@ dependencies = [
"settings",
"smallvec",
"telemetry",
"tempfile",
"theme",
"ui",
"util",
@@ -13997,7 +13971,6 @@ dependencies = [
"gpui",
"gpui_tokio",
"http_client",
"image",
"json_schema_store",
"language",
"language_extension",
@@ -15868,15 +15841,6 @@ dependencies = [
"lock_api",
]
[[package]]
name = "spin"
version = "0.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d5fe4ccb98d9c292d56fec89a5e07da7fc4cf0dc11e156b41793132775d3e591"
dependencies = [
"lock_api",
]
[[package]]
name = "spirv"
version = "0.3.0+sdk-1.3.268.0"
@@ -16248,6 +16212,7 @@ dependencies = [
"log",
"menu",
"picker",
"project",
"reqwest_client",
"rust-embed",
"settings",
@@ -16257,6 +16222,7 @@ dependencies = [
"theme",
"title_bar",
"ui",
"workspace",
]
[[package]]
@@ -18652,7 +18618,6 @@ dependencies = [
"itertools 0.14.0",
"libc",
"log",
"mach2 0.5.0",
"nix 0.29.0",
"pretty_assertions",
"rand 0.9.2",
@@ -18842,6 +18807,7 @@ dependencies = [
name = "vim_mode_setting"
version = "0.1.0"
dependencies = [
"gpui",
"settings",
]
@@ -21170,7 +21136,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.214.0"
version = "0.212.3"
dependencies = [
"acp_tools",
"activity_indicator",
@@ -21188,7 +21154,6 @@ dependencies = [
"breadcrumbs",
"call",
"channel",
"chrono",
"clap",
"cli",
"client",
@@ -21246,7 +21211,6 @@ dependencies = [
"menu",
"migrator",
"mimalloc",
"miniprofiler_ui",
"nc",
"nix 0.29.0",
"node_runtime",
@@ -21711,21 +21675,18 @@ dependencies = [
"language_model",
"log",
"lsp",
"open_ai",
"pretty_assertions",
"project",
"release_channel",
"schemars 1.0.4",
"serde",
"serde_json",
"settings",
"smol",
"strsim",
"thiserror 2.0.17",
"util",
"uuid",
"workspace",
"worktree",
"zlog",
]
[[package]]
@@ -21737,7 +21698,6 @@ dependencies = [
"clap",
"client",
"cloud_llm_client",
"cloud_zeta2_prompt",
"collections",
"edit_prediction_context",
"editor",
@@ -21751,6 +21711,7 @@ dependencies = [
"ordered-float 2.10.1",
"pretty_assertions",
"project",
"regex-syntax",
"serde",
"serde_json",
"settings",

View File

@@ -110,7 +110,6 @@ members = [
"crates/menu",
"crates/migrator",
"crates/mistral",
"crates/miniprofiler_ui",
"crates/multi_buffer",
"crates/nc",
"crates/net",
@@ -342,7 +341,6 @@ menu = { path = "crates/menu" }
migrator = { path = "crates/migrator" }
mistral = { path = "crates/mistral" }
multi_buffer = { path = "crates/multi_buffer" }
miniprofiler_ui = { path = "crates/miniprofiler_ui" }
nc = { path = "crates/nc" }
net = { path = "crates/net" }
node_runtime = { path = "crates/node_runtime" }
@@ -506,7 +504,7 @@ emojis = "0.6.1"
env_logger = "0.11"
exec = "0.3.1"
fancy-regex = "0.14.0"
fork = "0.4.0"
fork = "0.2.0"
futures = "0.3"
futures-batch = "0.6.1"
futures-lite = "1.13"
@@ -665,7 +663,6 @@ time = { version = "0.3", features = [
"serde",
"serde-well-known",
"formatting",
"local-offset",
] }
tiny_http = "0.8"
tokio = { version = "1" }
@@ -775,8 +772,8 @@ features = [
]
[patch.crates-io]
notify = { git = "https://github.com/zed-industries/notify.git", rev = "b4588b2e5aee68f4c0e100f140e808cbce7b1419" }
notify-types = { git = "https://github.com/zed-industries/notify.git", rev = "b4588b2e5aee68f4c0e100f140e808cbce7b1419" }
notify = { git = "https://github.com/zed-industries/notify.git", rev = "bbb9ea5ae52b253e095737847e367c30653a2e96" }
notify-types = { git = "https://github.com/zed-industries/notify.git", rev = "bbb9ea5ae52b253e095737847e367c30653a2e96" }
windows-capture = { git = "https://github.com/zed-industries/windows-capture.git", rev = "f0d6c1b6691db75461b732f6d5ff56eed002eeb9" }
[profile.dev]

View File

@@ -1,7 +1,7 @@
# Zed
[![Zed](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/zed-industries/zed/main/assets/badge/v0.json)](https://zed.dev)
[![CI](https://github.com/zed-industries/zed/actions/workflows/run_tests.yml/badge.svg)](https://github.com/zed-industries/zed/actions/workflows/run_tests.yml)
[![CI](https://github.com/zed-industries/zed/actions/workflows/ci.yml/badge.svg)](https://github.com/zed-industries/zed/actions/workflows/ci.yml)
Welcome to Zed, a high-performance, multiplayer code editor from the creators of [Atom](https://github.com/atom/atom) and [Tree-sitter](https://github.com/tree-sitter/tree-sitter).

View File

@@ -1,4 +0,0 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M8.00156 10.3996C9.32705 10.3996 10.4016 9.32509 10.4016 7.99961C10.4016 6.67413 9.32705 5.59961 8.00156 5.59961C6.67608 5.59961 5.60156 6.67413 5.60156 7.99961C5.60156 9.32509 6.67608 10.3996 8.00156 10.3996Z" stroke="black" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M10.4 5.6V8.6C10.4 9.07739 10.5896 9.53523 10.9272 9.8728C11.2648 10.2104 11.7226 10.4 12.2 10.4C12.6774 10.4 13.1352 10.2104 13.4728 9.8728C13.8104 9.53523 14 9.07739 14 8.6V8C14 6.64839 13.5436 5.33636 12.7048 4.27651C11.8661 3.21665 10.694 2.47105 9.37852 2.16051C8.06306 1.84997 6.68129 1.99269 5.45707 2.56554C4.23285 3.13838 3.23791 4.1078 2.63344 5.31672C2.02898 6.52565 1.85041 7.90325 2.12667 9.22633C2.40292 10.5494 3.11782 11.7405 4.15552 12.6065C5.19323 13.4726 6.49295 13.9629 7.84411 13.998C9.19527 14.0331 10.5187 13.611 11.6 12.8" stroke="black" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


View File

@@ -735,20 +735,6 @@
"tab": "editor::ComposeCompletion"
}
},
{
"context": "Editor && in_snippet && has_next_tabstop && !showing_completions",
"use_key_equivalents": true,
"bindings": {
"tab": "editor::NextSnippetTabstop"
}
},
{
"context": "Editor && in_snippet && has_previous_tabstop && !showing_completions",
"use_key_equivalents": true,
"bindings": {
"shift-tab": "editor::PreviousSnippetTabstop"
}
},
// Bindings for accepting edit predictions
//
// alt-l is provided as an alternative to tab/alt-tab. and will be displayed in the UI. This is

View File

@@ -805,20 +805,6 @@
"tab": "editor::ComposeCompletion"
}
},
{
"context": "Editor && in_snippet && has_next_tabstop && !showing_completions",
"use_key_equivalents": true,
"bindings": {
"tab": "editor::NextSnippetTabstop"
}
},
{
"context": "Editor && in_snippet && has_previous_tabstop && !showing_completions",
"use_key_equivalents": true,
"bindings": {
"shift-tab": "editor::PreviousSnippetTabstop"
}
},
{
"context": "Editor && edit_prediction",
"bindings": {

View File

@@ -739,20 +739,6 @@
"tab": "editor::ComposeCompletion"
}
},
{
"context": "Editor && in_snippet && has_next_tabstop && !showing_completions",
"use_key_equivalents": true,
"bindings": {
"tab": "editor::NextSnippetTabstop"
}
},
{
"context": "Editor && in_snippet && has_previous_tabstop && !showing_completions",
"use_key_equivalents": true,
"bindings": {
"shift-tab": "editor::PreviousSnippetTabstop"
}
},
// Bindings for accepting edit predictions
//
// alt-l is provided as an alternative to tab/alt-tab. and will be displayed in the UI. This is

View File

@@ -455,7 +455,6 @@
"<": "vim::Outdent",
"=": "vim::AutoIndent",
"d": "vim::HelixDelete",
"alt-d": "editor::Delete", // Delete selection, without yanking
"c": "vim::HelixSubstitute",
"alt-c": "vim::HelixSubstituteNoYank",

View File

@@ -605,10 +605,6 @@
// to both the horizontal and vertical delta values while scrolling. Fast scrolling
// happens when a user holds the alt or option key while scrolling.
"fast_scroll_sensitivity": 4.0,
"sticky_scroll": {
// Whether to stick scopes to the top of the editor.
"enabled": false
},
"relative_line_numbers": "disabled",
// If 'search_wrap' is disabled, search result do not wrap around the end of the file.
"search_wrap": true,
@@ -616,13 +612,9 @@
"search": {
// Whether to show the project search button in the status bar.
"button": true,
// Whether to only match on whole words.
"whole_word": false,
// Whether to match case sensitively.
"case_sensitive": false,
// Whether to include gitignored files in search results.
"include_ignored": false,
// Whether to interpret the search query as a regular expression.
"regex": false,
// Whether to center the cursor on each search match when navigating.
"center_on_match": false
@@ -748,15 +740,8 @@
"hide_root": false,
// Whether to hide the hidden entries in the project panel.
"hide_hidden": false,
// Settings for automatically opening files.
"auto_open": {
// Whether to automatically open newly created files in the editor.
"on_create": true,
// Whether to automatically open files after pasting or duplicating them.
"on_paste": true,
// Whether to automatically open files dropped from external sources.
"on_drop": true
}
// Whether to automatically open files when pasting them in the project panel.
"open_file_on_paste": true
},
"outline_panel": {
// Whether to show the outline panel button in the status bar
@@ -1502,11 +1487,7 @@
// in your project's settings, rather than globally.
"directories": [".env", "env", ".venv", "venv"],
// Can also be `csh`, `fish`, `nushell` and `power_shell`
"activate_script": "default",
// Preferred Conda manager to use when activating Conda environments.
// Values: "auto", "conda", "mamba", "micromamba"
// Default: "auto"
"conda_manager": "auto"
"activate_script": "default"
}
},
"toolbar": {
@@ -1550,8 +1531,6 @@
// Default: 10_000, maximum: 100_000 (all bigger values set will be treated as 100_000), 0 disables the scrolling.
// Existing terminals will not pick up this change until they are recreated.
"max_scroll_history_lines": 10000,
// The multiplier for scrolling speed in the terminal.
"scroll_multiplier": 1.0,
// The minimum APCA perceptual contrast between foreground and background colors.
// APCA (Accessible Perceptual Contrast Algorithm) is more accurate than WCAG 2.x,
// especially for dark mode. Values range from 0 to 106.

View File

@@ -39,7 +39,6 @@ serde_json.workspace = true
settings.workspace = true
smol.workspace = true
task.workspace = true
telemetry.workspace = true
terminal.workspace = true
ui.workspace = true
url.workspace = true
@@ -57,4 +56,3 @@ rand.workspace = true
tempfile.workspace = true
util.workspace = true
settings.workspace = true
zlog.workspace = true

View File

@@ -15,7 +15,7 @@ use settings::Settings as _;
use task::{Shell, ShellBuilder};
pub use terminal::*;
use action_log::{ActionLog, ActionLogTelemetry};
use action_log::ActionLog;
use agent_client_protocol::{self as acp};
use anyhow::{Context as _, Result, anyhow};
use editor::Bias;
@@ -820,15 +820,6 @@ pub struct AcpThread {
pending_terminal_exit: HashMap<acp::TerminalId, acp::TerminalExitStatus>,
}
impl From<&AcpThread> for ActionLogTelemetry {
fn from(value: &AcpThread) -> Self {
Self {
agent_telemetry_id: value.connection().telemetry_id(),
session_id: value.session_id.0.clone(),
}
}
}
#[derive(Debug)]
pub enum AcpThreadEvent {
NewEntry,
@@ -1355,17 +1346,6 @@ impl AcpThread {
let path_style = self.project.read(cx).path_style(cx);
let id = update.id.clone();
let agent = self.connection().telemetry_id();
let session = self.session_id();
if let ToolCallStatus::Completed | ToolCallStatus::Failed = status {
let status = if matches!(status, ToolCallStatus::Completed) {
"completed"
} else {
"failed"
};
telemetry::event!("Agent Tool Call Completed", agent, session, status);
}
if let Some(ix) = self.index_for_tool_call(&id) {
let AgentThreadEntry::ToolCall(call) = &mut self.entries[ix] else {
unreachable!()
@@ -1866,14 +1846,10 @@ impl AcpThread {
.checkpoint
.as_ref()
.map(|c| c.git_checkpoint.clone());
// Cancel any in-progress generation before restoring
let cancel_task = self.cancel(cx);
let rewind = self.rewind(id.clone(), cx);
let git_store = self.project.read(cx).git_store().clone();
cx.spawn(async move |_, cx| {
cancel_task.await;
rewind.await?;
if let Some(checkpoint) = checkpoint {
git_store
@@ -1893,34 +1869,16 @@ impl AcpThread {
return Task::ready(Err(anyhow!("not supported")));
};
let telemetry = ActionLogTelemetry::from(&*self);
cx.spawn(async move |this, cx| {
cx.update(|cx| truncate.run(id.clone(), cx))?.await?;
this.update(cx, |this, cx| {
if let Some((ix, _)) = this.user_message_mut(&id) {
// Collect all terminals from entries that will be removed
let terminals_to_remove: Vec<acp::TerminalId> = this.entries[ix..]
.iter()
.flat_map(|entry| entry.terminals())
.filter_map(|terminal| terminal.read(cx).id().clone().into())
.collect();
let range = ix..this.entries.len();
this.entries.truncate(ix);
cx.emit(AcpThreadEvent::EntriesRemoved(range));
// Kill and remove the terminals
for terminal_id in terminals_to_remove {
if let Some(terminal) = this.terminals.remove(&terminal_id) {
terminal.update(cx, |terminal, cx| {
terminal.kill(cx);
});
}
}
}
this.action_log().update(cx, |action_log, cx| {
action_log.reject_all_edits(Some(telemetry), cx)
})
this.action_log()
.update(cx, |action_log, cx| action_log.reject_all_edits(cx))
})?
.await;
Ok(())
@@ -2397,6 +2355,8 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
Project::init_settings(cx);
language::init(cx);
});
}
@@ -3654,10 +3614,6 @@ mod tests {
}
impl AgentConnection for FakeAgentConnection {
fn telemetry_id(&self) -> &'static str {
"fake"
}
fn auth_methods(&self) -> &[acp::AuthMethod] {
&self.auth_methods
}
@@ -3823,314 +3779,4 @@ mod tests {
}
});
}
/// Tests that restoring a checkpoint properly cleans up terminals that were
/// created after that checkpoint, and cancels any in-progress generation.
///
/// Reproduces issue #35142: When a checkpoint is restored, any terminal processes
/// that were started after that checkpoint should be terminated, and any in-progress
/// AI generation should be canceled.
#[gpui::test]
async fn test_restore_checkpoint_kills_terminal(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, [], cx).await;
let connection = Rc::new(FakeAgentConnection::new());
let thread = cx
.update(|cx| connection.new_thread(project, Path::new(path!("/test")), cx))
.await
.unwrap();
// Send first user message to create a checkpoint
cx.update(|cx| {
thread.update(cx, |thread, cx| {
thread.send(vec!["first message".into()], cx)
})
})
.await
.unwrap();
// Send second message (creates another checkpoint) - we'll restore to this one
cx.update(|cx| {
thread.update(cx, |thread, cx| {
thread.send(vec!["second message".into()], cx)
})
})
.await
.unwrap();
// Create 2 terminals BEFORE the checkpoint that have completed running
let terminal_id_1 = acp::TerminalId(uuid::Uuid::new_v4().to_string().into());
let mock_terminal_1 = cx.new(|cx| {
let builder = ::terminal::TerminalBuilder::new_display_only(
::terminal::terminal_settings::CursorShape::default(),
::terminal::terminal_settings::AlternateScroll::On,
None,
0,
)
.unwrap();
builder.subscribe(cx)
});
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Created {
terminal_id: terminal_id_1.clone(),
label: "echo 'first'".to_string(),
cwd: Some(PathBuf::from("/test")),
output_byte_limit: None,
terminal: mock_terminal_1.clone(),
},
cx,
);
});
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Output {
terminal_id: terminal_id_1.clone(),
data: b"first\n".to_vec(),
},
cx,
);
});
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Exit {
terminal_id: terminal_id_1.clone(),
status: acp::TerminalExitStatus {
exit_code: Some(0),
signal: None,
meta: None,
},
},
cx,
);
});
let terminal_id_2 = acp::TerminalId(uuid::Uuid::new_v4().to_string().into());
let mock_terminal_2 = cx.new(|cx| {
let builder = ::terminal::TerminalBuilder::new_display_only(
::terminal::terminal_settings::CursorShape::default(),
::terminal::terminal_settings::AlternateScroll::On,
None,
0,
)
.unwrap();
builder.subscribe(cx)
});
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Created {
terminal_id: terminal_id_2.clone(),
label: "echo 'second'".to_string(),
cwd: Some(PathBuf::from("/test")),
output_byte_limit: None,
terminal: mock_terminal_2.clone(),
},
cx,
);
});
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Output {
terminal_id: terminal_id_2.clone(),
data: b"second\n".to_vec(),
},
cx,
);
});
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Exit {
terminal_id: terminal_id_2.clone(),
status: acp::TerminalExitStatus {
exit_code: Some(0),
signal: None,
meta: None,
},
},
cx,
);
});
// Get the second message ID to restore to
let second_message_id = thread.read_with(cx, |thread, _| {
// At this point we have:
// - Index 0: First user message (with checkpoint)
// - Index 1: Second user message (with checkpoint)
// No assistant responses because FakeAgentConnection just returns EndTurn
let AgentThreadEntry::UserMessage(message) = &thread.entries[1] else {
panic!("expected user message at index 1");
};
message.id.clone().unwrap()
});
// Create a terminal AFTER the checkpoint we'll restore to.
// This simulates the AI agent starting a long-running terminal command.
let terminal_id = acp::TerminalId(uuid::Uuid::new_v4().to_string().into());
let mock_terminal = cx.new(|cx| {
let builder = ::terminal::TerminalBuilder::new_display_only(
::terminal::terminal_settings::CursorShape::default(),
::terminal::terminal_settings::AlternateScroll::On,
None,
0,
)
.unwrap();
builder.subscribe(cx)
});
// Register the terminal as created
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Created {
terminal_id: terminal_id.clone(),
label: "sleep 1000".to_string(),
cwd: Some(PathBuf::from("/test")),
output_byte_limit: None,
terminal: mock_terminal.clone(),
},
cx,
);
});
// Simulate the terminal producing output (still running)
thread.update(cx, |thread, cx| {
thread.on_terminal_provider_event(
TerminalProviderEvent::Output {
terminal_id: terminal_id.clone(),
data: b"terminal is running...\n".to_vec(),
},
cx,
);
});
// Create a tool call entry that references this terminal
// This represents the agent requesting a terminal command
thread.update(cx, |thread, cx| {
thread
.handle_session_update(
acp::SessionUpdate::ToolCall(acp::ToolCall {
id: acp::ToolCallId("terminal-tool-1".into()),
title: "Running command".into(),
kind: acp::ToolKind::Execute,
status: acp::ToolCallStatus::InProgress,
content: vec![acp::ToolCallContent::Terminal {
terminal_id: terminal_id.clone(),
}],
locations: vec![],
raw_input: Some(
serde_json::json!({"command": "sleep 1000", "cd": "/test"}),
),
raw_output: None,
meta: None,
}),
cx,
)
.unwrap();
});
// Verify terminal exists and is in the thread
let terminal_exists_before =
thread.read_with(cx, |thread, _| thread.terminals.contains_key(&terminal_id));
assert!(
terminal_exists_before,
"Terminal should exist before checkpoint restore"
);
// Verify the terminal's underlying task is still running (not completed)
let terminal_running_before = thread.read_with(cx, |thread, _cx| {
let terminal_entity = thread.terminals.get(&terminal_id).unwrap();
terminal_entity.read_with(cx, |term, _cx| {
term.output().is_none() // output is None means it's still running
})
});
assert!(
terminal_running_before,
"Terminal should be running before checkpoint restore"
);
// Verify we have the expected entries before restore
let entry_count_before = thread.read_with(cx, |thread, _| thread.entries.len());
assert!(
entry_count_before > 1,
"Should have multiple entries before restore"
);
// Restore the checkpoint to the second message.
// This should:
// 1. Cancel any in-progress generation (via the cancel() call)
// 2. Remove the terminal that was created after that point
thread
.update(cx, |thread, cx| {
thread.restore_checkpoint(second_message_id, cx)
})
.await
.unwrap();
// Verify that no send_task is in progress after restore
// (cancel() clears the send_task)
let has_send_task_after = thread.read_with(cx, |thread, _| thread.send_task.is_some());
assert!(
!has_send_task_after,
"Should not have a send_task after restore (cancel should have cleared it)"
);
// Verify the entries were truncated (restoring to index 1 truncates at 1, keeping only index 0)
let entry_count = thread.read_with(cx, |thread, _| thread.entries.len());
assert_eq!(
entry_count, 1,
"Should have 1 entry after restore (only the first user message)"
);
// Verify the 2 completed terminals from before the checkpoint still exist
let terminal_1_exists = thread.read_with(cx, |thread, _| {
thread.terminals.contains_key(&terminal_id_1)
});
assert!(
terminal_1_exists,
"Terminal 1 (from before checkpoint) should still exist"
);
let terminal_2_exists = thread.read_with(cx, |thread, _| {
thread.terminals.contains_key(&terminal_id_2)
});
assert!(
terminal_2_exists,
"Terminal 2 (from before checkpoint) should still exist"
);
// Verify they're still in completed state
let terminal_1_completed = thread.read_with(cx, |thread, _cx| {
let terminal_entity = thread.terminals.get(&terminal_id_1).unwrap();
terminal_entity.read_with(cx, |term, _cx| term.output().is_some())
});
assert!(terminal_1_completed, "Terminal 1 should still be completed");
let terminal_2_completed = thread.read_with(cx, |thread, _cx| {
let terminal_entity = thread.terminals.get(&terminal_id_2).unwrap();
terminal_entity.read_with(cx, |term, _cx| term.output().is_some())
});
assert!(terminal_2_completed, "Terminal 2 should still be completed");
// Verify the running terminal (created after checkpoint) was removed
let terminal_3_exists =
thread.read_with(cx, |thread, _| thread.terminals.contains_key(&terminal_id));
assert!(
!terminal_3_exists,
"Terminal 3 (created after checkpoint) should have been removed"
);
// Verify total count is 2 (the two from before the checkpoint)
let terminal_count = thread.read_with(cx, |thread, _| thread.terminals.len());
assert_eq!(
terminal_count, 2,
"Should have exactly 2 terminals (the completed ones from before checkpoint)"
);
}
}

View File

@@ -20,8 +20,6 @@ impl UserMessageId {
}
pub trait AgentConnection {
fn telemetry_id(&self) -> &'static str;
fn new_thread(
self: Rc<Self>,
project: Entity<Project>,
@@ -108,6 +106,9 @@ pub trait AgentSessionSetTitle {
}
pub trait AgentTelemetry {
/// The name of the agent used for telemetry.
fn agent_name(&self) -> String;
/// A representation of the current thread state that can be serialized for
/// storage with telemetry events.
fn thread_data(
@@ -317,10 +318,6 @@ mod test_support {
}
impl AgentConnection for StubAgentConnection {
fn telemetry_id(&self) -> &'static str {
"stub"
}
fn auth_methods(&self) -> &[acp::AuthMethod] {
&[]
}

View File

@@ -20,7 +20,6 @@ futures.workspace = true
gpui.workspace = true
language.workspace = true
project.workspace = true
telemetry.workspace = true
text.workspace = true
util.workspace = true
watch.workspace = true

View File

@@ -3,9 +3,7 @@ use buffer_diff::BufferDiff;
use clock;
use collections::BTreeMap;
use futures::{FutureExt, StreamExt, channel::mpsc};
use gpui::{
App, AppContext, AsyncApp, Context, Entity, SharedString, Subscription, Task, WeakEntity,
};
use gpui::{App, AppContext, AsyncApp, Context, Entity, Subscription, Task, WeakEntity};
use language::{Anchor, Buffer, BufferEvent, DiskState, Point, ToPoint};
use project::{Project, ProjectItem, lsp_store::OpenLspBufferHandle};
use std::{cmp, ops::Range, sync::Arc};
@@ -33,6 +31,71 @@ impl ActionLog {
&self.project
}
pub fn latest_snapshot(&self, buffer: &Entity<Buffer>) -> Option<text::BufferSnapshot> {
Some(self.tracked_buffers.get(buffer)?.snapshot.clone())
}
/// Returns a unified diff patch with user edits made since the last read or notification.
pub fn unnotified_user_edits(&self, cx: &Context<Self>) -> Option<String> {
let diffs = self
.tracked_buffers
.values()
.filter_map(|tracked| {
if !tracked.may_have_unnotified_user_edits {
return None;
}
let text_with_latest_user_edits = tracked.diff_base.to_string();
let text_with_last_seen_user_edits = tracked.last_seen_base.to_string();
if text_with_latest_user_edits == text_with_last_seen_user_edits {
return None;
}
let patch = language::unified_diff(
&text_with_last_seen_user_edits,
&text_with_latest_user_edits,
);
let buffer = tracked.buffer.clone();
let file_path = buffer
.read(cx)
.file()
.map(|file| {
let mut path = file.full_path(cx).to_string_lossy().into_owned();
if file.path_style(cx).is_windows() {
path = path.replace('\\', "/");
}
path
})
.unwrap_or_else(|| format!("buffer_{}", buffer.entity_id()));
let mut result = String::new();
result.push_str(&format!("--- a/{}\n", file_path));
result.push_str(&format!("+++ b/{}\n", file_path));
result.push_str(&patch);
Some(result)
})
.collect::<Vec<_>>();
if diffs.is_empty() {
return None;
}
let unified_diff = diffs.join("\n\n");
Some(unified_diff)
}
/// Returns a unified diff patch with user edits made since the last read/notification
/// and marks them as notified.
pub fn flush_unnotified_user_edits(&mut self, cx: &Context<Self>) -> Option<String> {
let patch = self.unnotified_user_edits(cx);
self.tracked_buffers.values_mut().for_each(|tracked| {
tracked.may_have_unnotified_user_edits = false;
tracked.last_seen_base = tracked.diff_base.clone();
});
patch
}
fn track_buffer_internal(
&mut self,
buffer: Entity<Buffer>,
@@ -82,26 +145,31 @@ impl ActionLog {
let diff = cx.new(|cx| BufferDiff::new(&text_snapshot, cx));
let (diff_update_tx, diff_update_rx) = mpsc::unbounded();
let diff_base;
let last_seen_base;
let unreviewed_edits;
if is_created {
diff_base = Rope::default();
last_seen_base = Rope::default();
unreviewed_edits = Patch::new(vec![Edit {
old: 0..1,
new: 0..text_snapshot.max_point().row + 1,
}])
} else {
diff_base = buffer.read(cx).as_rope().clone();
last_seen_base = diff_base.clone();
unreviewed_edits = Patch::default();
}
TrackedBuffer {
buffer: buffer.clone(),
diff_base,
last_seen_base,
unreviewed_edits,
snapshot: text_snapshot,
status,
version: buffer.read(cx).version(),
diff,
diff_update: diff_update_tx,
may_have_unnotified_user_edits: false,
_open_lsp_handle: open_lsp_handle,
_maintain_diff: cx.spawn({
let buffer = buffer.clone();
@@ -252,9 +320,10 @@ impl ActionLog {
let new_snapshot = buffer_snapshot.clone();
let unreviewed_edits = tracked_buffer.unreviewed_edits.clone();
let edits = diff_snapshots(&old_snapshot, &new_snapshot);
let mut has_user_changes = false;
async move {
if let ChangeAuthor::User = author {
apply_non_conflicting_edits(
has_user_changes = apply_non_conflicting_edits(
&unreviewed_edits,
edits,
&mut base_text,
@@ -262,13 +331,22 @@ impl ActionLog {
);
}
(Arc::new(base_text.to_string()), base_text)
(Arc::new(base_text.to_string()), base_text, has_user_changes)
}
});
anyhow::Ok(rebase)
})??;
let (new_base_text, new_diff_base) = rebase.await;
let (new_base_text, new_diff_base, has_user_changes) = rebase.await;
this.update(cx, |this, _| {
let tracked_buffer = this
.tracked_buffers
.get_mut(buffer)
.context("buffer not tracked")
.unwrap();
tracked_buffer.may_have_unnotified_user_edits |= has_user_changes;
})?;
Self::update_diff(
this,
@@ -487,17 +565,14 @@ impl ActionLog {
&mut self,
buffer: Entity<Buffer>,
buffer_range: Range<impl language::ToPoint>,
telemetry: Option<ActionLogTelemetry>,
cx: &mut Context<Self>,
) {
let Some(tracked_buffer) = self.tracked_buffers.get_mut(&buffer) else {
return;
};
let mut metrics = ActionLogMetrics::for_buffer(buffer.read(cx));
match tracked_buffer.status {
TrackedBufferStatus::Deleted => {
metrics.add_edits(tracked_buffer.unreviewed_edits.edits());
self.tracked_buffers.remove(&buffer);
cx.notify();
}
@@ -506,6 +581,7 @@ impl ActionLog {
let buffer_range =
buffer_range.start.to_point(buffer)..buffer_range.end.to_point(buffer);
let mut delta = 0i32;
tracked_buffer.unreviewed_edits.retain_mut(|edit| {
edit.old.start = (edit.old.start as i32 + delta) as u32;
edit.old.end = (edit.old.end as i32 + delta) as u32;
@@ -537,7 +613,6 @@ impl ActionLog {
.collect::<String>(),
);
delta += edit.new_len() as i32 - edit.old_len() as i32;
metrics.add_edit(edit);
false
}
});
@@ -549,24 +624,19 @@ impl ActionLog {
tracked_buffer.schedule_diff_update(ChangeAuthor::User, cx);
}
}
if let Some(telemetry) = telemetry {
telemetry_report_accepted_edits(&telemetry, metrics);
}
}
pub fn reject_edits_in_ranges(
&mut self,
buffer: Entity<Buffer>,
buffer_ranges: Vec<Range<impl language::ToPoint>>,
telemetry: Option<ActionLogTelemetry>,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
let Some(tracked_buffer) = self.tracked_buffers.get_mut(&buffer) else {
return Task::ready(Ok(()));
};
let mut metrics = ActionLogMetrics::for_buffer(buffer.read(cx));
let task = match &tracked_buffer.status {
match &tracked_buffer.status {
TrackedBufferStatus::Created {
existing_file_content,
} => {
@@ -616,7 +686,6 @@ impl ActionLog {
}
};
metrics.add_edits(tracked_buffer.unreviewed_edits.edits());
self.tracked_buffers.remove(&buffer);
cx.notify();
task
@@ -630,7 +699,6 @@ impl ActionLog {
.update(cx, |project, cx| project.save_buffer(buffer.clone(), cx));
// Clear all tracked edits for this buffer and start over as if we just read it.
metrics.add_edits(tracked_buffer.unreviewed_edits.edits());
self.tracked_buffers.remove(&buffer);
self.buffer_read(buffer.clone(), cx);
cx.notify();
@@ -670,7 +738,6 @@ impl ActionLog {
}
if revert {
metrics.add_edit(edit);
let old_range = tracked_buffer
.diff_base
.point_to_offset(Point::new(edit.old.start, 0))
@@ -691,25 +758,12 @@ impl ActionLog {
self.project
.update(cx, |project, cx| project.save_buffer(buffer, cx))
}
};
if let Some(telemetry) = telemetry {
telemetry_report_rejected_edits(&telemetry, metrics);
}
task
}
pub fn keep_all_edits(
&mut self,
telemetry: Option<ActionLogTelemetry>,
cx: &mut Context<Self>,
) {
self.tracked_buffers.retain(|buffer, tracked_buffer| {
let mut metrics = ActionLogMetrics::for_buffer(buffer.read(cx));
metrics.add_edits(tracked_buffer.unreviewed_edits.edits());
if let Some(telemetry) = telemetry.as_ref() {
telemetry_report_accepted_edits(telemetry, metrics);
}
match tracked_buffer.status {
pub fn keep_all_edits(&mut self, cx: &mut Context<Self>) {
self.tracked_buffers
.retain(|_buffer, tracked_buffer| match tracked_buffer.status {
TrackedBufferStatus::Deleted => false,
_ => {
if let TrackedBufferStatus::Created { .. } = &mut tracked_buffer.status {
@@ -720,24 +774,13 @@ impl ActionLog {
tracked_buffer.schedule_diff_update(ChangeAuthor::User, cx);
true
}
}
});
});
cx.notify();
}
pub fn reject_all_edits(
&mut self,
telemetry: Option<ActionLogTelemetry>,
cx: &mut Context<Self>,
) -> Task<()> {
pub fn reject_all_edits(&mut self, cx: &mut Context<Self>) -> Task<()> {
let futures = self.changed_buffers(cx).into_keys().map(|buffer| {
let reject = self.reject_edits_in_ranges(
buffer,
vec![Anchor::MIN..Anchor::MAX],
telemetry.clone(),
cx,
);
let reject = self.reject_edits_in_ranges(buffer, vec![Anchor::MIN..Anchor::MAX], cx);
async move {
reject.await.log_err();
@@ -745,7 +788,8 @@ impl ActionLog {
});
let task = futures::future::join_all(futures);
cx.background_spawn(async move {
cx.spawn(async move |_, _| {
task.await;
})
}
@@ -775,61 +819,6 @@ impl ActionLog {
}
}
#[derive(Clone)]
pub struct ActionLogTelemetry {
pub agent_telemetry_id: &'static str,
pub session_id: Arc<str>,
}
struct ActionLogMetrics {
lines_removed: u32,
lines_added: u32,
language: Option<SharedString>,
}
impl ActionLogMetrics {
fn for_buffer(buffer: &Buffer) -> Self {
Self {
language: buffer.language().map(|l| l.name().0),
lines_removed: 0,
lines_added: 0,
}
}
fn add_edits(&mut self, edits: &[Edit<u32>]) {
for edit in edits {
self.add_edit(edit);
}
}
fn add_edit(&mut self, edit: &Edit<u32>) {
self.lines_added += edit.new_len();
self.lines_removed += edit.old_len();
}
}
fn telemetry_report_accepted_edits(telemetry: &ActionLogTelemetry, metrics: ActionLogMetrics) {
telemetry::event!(
"Agent Edits Accepted",
agent = telemetry.agent_telemetry_id,
session = telemetry.session_id,
language = metrics.language,
lines_added = metrics.lines_added,
lines_removed = metrics.lines_removed
);
}
fn telemetry_report_rejected_edits(telemetry: &ActionLogTelemetry, metrics: ActionLogMetrics) {
telemetry::event!(
"Agent Edits Rejected",
agent = telemetry.agent_telemetry_id,
session = telemetry.session_id,
language = metrics.language,
lines_added = metrics.lines_added,
lines_removed = metrics.lines_removed
);
}
fn apply_non_conflicting_edits(
patch: &Patch<u32>,
edits: Vec<Edit<u32>>,
@@ -960,12 +949,14 @@ enum TrackedBufferStatus {
struct TrackedBuffer {
buffer: Entity<Buffer>,
diff_base: Rope,
last_seen_base: Rope,
unreviewed_edits: Patch<u32>,
status: TrackedBufferStatus,
version: clock::Global,
diff: Entity<BufferDiff>,
snapshot: text::BufferSnapshot,
diff_update: mpsc::UnboundedSender<(ChangeAuthor, text::BufferSnapshot)>,
may_have_unnotified_user_edits: bool,
_open_lsp_handle: OpenLspBufferHandle,
_maintain_diff: Task<()>,
_subscription: Subscription,
@@ -996,6 +987,7 @@ mod tests {
use super::*;
use buffer_diff::DiffHunkStatusKind;
use gpui::TestAppContext;
use indoc::indoc;
use language::Point;
use project::{FakeFs, Fs, Project, RemoveOptions};
use rand::prelude::*;
@@ -1013,6 +1005,8 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
});
}
@@ -1072,7 +1066,7 @@ mod tests {
);
action_log.update(cx, |log, cx| {
log.keep_edits_in_range(buffer.clone(), Point::new(3, 0)..Point::new(4, 3), None, cx)
log.keep_edits_in_range(buffer.clone(), Point::new(3, 0)..Point::new(4, 3), cx)
});
cx.run_until_parked();
assert_eq!(
@@ -1088,7 +1082,7 @@ mod tests {
);
action_log.update(cx, |log, cx| {
log.keep_edits_in_range(buffer.clone(), Point::new(0, 0)..Point::new(4, 3), None, cx)
log.keep_edits_in_range(buffer.clone(), Point::new(0, 0)..Point::new(4, 3), cx)
});
cx.run_until_parked();
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]);
@@ -1173,7 +1167,7 @@ mod tests {
);
action_log.update(cx, |log, cx| {
log.keep_edits_in_range(buffer.clone(), Point::new(1, 0)..Point::new(1, 0), None, cx)
log.keep_edits_in_range(buffer.clone(), Point::new(1, 0)..Point::new(1, 0), cx)
});
cx.run_until_parked();
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]);
@@ -1270,7 +1264,111 @@ mod tests {
);
action_log.update(cx, |log, cx| {
log.keep_edits_in_range(buffer.clone(), Point::new(0, 0)..Point::new(1, 0), None, cx)
log.keep_edits_in_range(buffer.clone(), Point::new(0, 0)..Point::new(1, 0), cx)
});
cx.run_until_parked();
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]);
}
#[gpui::test(iterations = 10)]
async fn test_user_edits_notifications(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
path!("/dir"),
json!({"file": indoc! {"
abc
def
ghi
jkl
mno"}}),
)
.await;
let project = Project::test(fs.clone(), [path!("/dir").as_ref()], cx).await;
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let file_path = project
.read_with(cx, |project, cx| project.find_project_path("dir/file", cx))
.unwrap();
let buffer = project
.update(cx, |project, cx| project.open_buffer(file_path, cx))
.await
.unwrap();
// Agent edits
cx.update(|cx| {
action_log.update(cx, |log, cx| log.buffer_read(buffer.clone(), cx));
buffer.update(cx, |buffer, cx| {
buffer
.edit([(Point::new(1, 2)..Point::new(2, 3), "F\nGHI")], None, cx)
.unwrap()
});
action_log.update(cx, |log, cx| log.buffer_edited(buffer.clone(), cx));
});
cx.run_until_parked();
assert_eq!(
buffer.read_with(cx, |buffer, _| buffer.text()),
indoc! {"
abc
deF
GHI
jkl
mno"}
);
assert_eq!(
unreviewed_hunks(&action_log, cx),
vec![(
buffer.clone(),
vec![HunkStatus {
range: Point::new(1, 0)..Point::new(3, 0),
diff_status: DiffHunkStatusKind::Modified,
old_text: "def\nghi\n".into(),
}],
)]
);
// User edits
buffer.update(cx, |buffer, cx| {
buffer.edit(
[
(Point::new(0, 2)..Point::new(0, 2), "X"),
(Point::new(3, 0)..Point::new(3, 0), "Y"),
],
None,
cx,
)
});
cx.run_until_parked();
assert_eq!(
buffer.read_with(cx, |buffer, _| buffer.text()),
indoc! {"
abXc
deF
GHI
Yjkl
mno"}
);
// User edits should be stored separately from the agent's
let user_edits = action_log.update(cx, |log, cx| log.unnotified_user_edits(cx));
assert_eq!(
user_edits.expect("should have some user edits"),
indoc! {"
--- a/dir/file
+++ b/dir/file
@@ -1,5 +1,5 @@
-abc
+abXc
def
ghi
-jkl
+Yjkl
mno
"}
);
action_log.update(cx, |log, cx| {
log.keep_edits_in_range(buffer.clone(), Point::new(0, 0)..Point::new(1, 0), cx)
});
cx.run_until_parked();
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]);
@@ -1329,7 +1427,7 @@ mod tests {
);
action_log.update(cx, |log, cx| {
log.keep_edits_in_range(buffer.clone(), 0..5, None, cx)
log.keep_edits_in_range(buffer.clone(), 0..5, cx)
});
cx.run_until_parked();
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]);
@@ -1381,7 +1479,7 @@ mod tests {
action_log
.update(cx, |log, cx| {
log.reject_edits_in_ranges(buffer.clone(), vec![2..5], None, cx)
log.reject_edits_in_ranges(buffer.clone(), vec![2..5], cx)
})
.await
.unwrap();
@@ -1461,7 +1559,7 @@ mod tests {
action_log
.update(cx, |log, cx| {
log.reject_edits_in_ranges(buffer.clone(), vec![2..5], None, cx)
log.reject_edits_in_ranges(buffer.clone(), vec![2..5], cx)
})
.await
.unwrap();
@@ -1644,7 +1742,6 @@ mod tests {
log.reject_edits_in_ranges(
buffer.clone(),
vec![Point::new(4, 0)..Point::new(4, 0)],
None,
cx,
)
})
@@ -1679,7 +1776,6 @@ mod tests {
log.reject_edits_in_ranges(
buffer.clone(),
vec![Point::new(0, 0)..Point::new(1, 0)],
None,
cx,
)
})
@@ -1707,7 +1803,6 @@ mod tests {
log.reject_edits_in_ranges(
buffer.clone(),
vec![Point::new(4, 0)..Point::new(4, 0)],
None,
cx,
)
})
@@ -1782,7 +1877,7 @@ mod tests {
let range_2 = buffer.read(cx).anchor_before(Point::new(5, 0))
..buffer.read(cx).anchor_before(Point::new(5, 3));
log.reject_edits_in_ranges(buffer.clone(), vec![range_1, range_2], None, cx)
log.reject_edits_in_ranges(buffer.clone(), vec![range_1, range_2], cx)
.detach();
assert_eq!(
buffer.read_with(cx, |buffer, _| buffer.text()),
@@ -1843,7 +1938,6 @@ mod tests {
log.reject_edits_in_ranges(
buffer.clone(),
vec![Point::new(0, 0)..Point::new(0, 0)],
None,
cx,
)
})
@@ -1899,7 +1993,6 @@ mod tests {
log.reject_edits_in_ranges(
buffer.clone(),
vec![Point::new(0, 0)..Point::new(0, 11)],
None,
cx,
)
})
@@ -1962,7 +2055,6 @@ mod tests {
log.reject_edits_in_ranges(
buffer.clone(),
vec![Point::new(0, 0)..Point::new(100, 0)],
None,
cx,
)
})
@@ -2010,7 +2102,7 @@ mod tests {
// User accepts the single hunk
action_log.update(cx, |log, cx| {
log.keep_edits_in_range(buffer.clone(), Anchor::MIN..Anchor::MAX, None, cx)
log.keep_edits_in_range(buffer.clone(), Anchor::MIN..Anchor::MAX, cx)
});
cx.run_until_parked();
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]);
@@ -2031,7 +2123,7 @@ mod tests {
// User rejects the hunk
action_log
.update(cx, |log, cx| {
log.reject_edits_in_ranges(buffer.clone(), vec![Anchor::MIN..Anchor::MAX], None, cx)
log.reject_edits_in_ranges(buffer.clone(), vec![Anchor::MIN..Anchor::MAX], cx)
})
.await
.unwrap();
@@ -2075,7 +2167,7 @@ mod tests {
cx.run_until_parked();
// User clicks "Accept All"
action_log.update(cx, |log, cx| log.keep_all_edits(None, cx));
action_log.update(cx, |log, cx| log.keep_all_edits(cx));
cx.run_until_parked();
assert!(fs.is_file(path!("/dir/new_file").as_ref()).await);
assert_eq!(unreviewed_hunks(&action_log, cx), vec![]); // Hunks are cleared
@@ -2094,7 +2186,7 @@ mod tests {
// User clicks "Reject All"
action_log
.update(cx, |log, cx| log.reject_all_edits(None, cx))
.update(cx, |log, cx| log.reject_all_edits(cx))
.await;
cx.run_until_parked();
assert!(fs.is_file(path!("/dir/new_file").as_ref()).await);
@@ -2134,7 +2226,7 @@ mod tests {
action_log.update(cx, |log, cx| {
let range = buffer.read(cx).random_byte_range(0, &mut rng);
log::info!("keeping edits in range {:?}", range);
log.keep_edits_in_range(buffer.clone(), range, None, cx)
log.keep_edits_in_range(buffer.clone(), range, cx)
});
}
25..50 => {
@@ -2142,7 +2234,7 @@ mod tests {
.update(cx, |log, cx| {
let range = buffer.read(cx).random_byte_range(0, &mut rng);
log::info!("rejecting edits in range {:?}", range);
log.reject_edits_in_ranges(buffer.clone(), vec![range], None, cx)
log.reject_edits_in_ranges(buffer.clone(), vec![range], cx)
})
.await
.unwrap();
@@ -2396,4 +2488,61 @@ mod tests {
.collect()
})
}
#[gpui::test]
async fn test_format_patch(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
path!("/dir"),
json!({"test.txt": "line 1\nline 2\nline 3\n"}),
)
.await;
let project = Project::test(fs.clone(), [path!("/dir").as_ref()], cx).await;
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let file_path = project
.read_with(cx, |project, cx| {
project.find_project_path("dir/test.txt", cx)
})
.unwrap();
let buffer = project
.update(cx, |project, cx| project.open_buffer(file_path, cx))
.await
.unwrap();
cx.update(|cx| {
// Track the buffer and mark it as read first
action_log.update(cx, |log, cx| {
log.buffer_read(buffer.clone(), cx);
});
// Make some edits to create a patch
buffer.update(cx, |buffer, cx| {
buffer
.edit([(Point::new(1, 0)..Point::new(1, 6), "CHANGED")], None, cx)
.unwrap(); // Replace "line 2" with "CHANGED"
});
});
cx.run_until_parked();
// Get the patch
let patch = action_log.update(cx, |log, cx| log.unnotified_user_edits(cx));
// Verify the patch format contains expected unified diff elements
assert_eq!(
patch.unwrap(),
indoc! {"
--- a/dir/test.txt
+++ b/dir/test.txt
@@ -1,3 +1,3 @@
line 1
-line 2
+CHANGED
line 3
"}
);
}
}
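
Note: the unnotified-edits tracking added in this file comes down to two ropes per tracked buffer — `diff_base`, which absorbs rebased user edits, and `last_seen_base`, which lags behind until the next flush — plus a dirty flag. A minimal sketch of that bookkeeping with plain `String`s follows; `MiniTrackedBuffer` is a hypothetical stand-in for illustration, not the real `TrackedBuffer`:

// Illustrative only: models the last_seen_base / diff_base bookkeeping that
// unnotified_user_edits and flush_unnotified_user_edits rely on.
struct MiniTrackedBuffer {
    diff_base: String,      // follows the buffer as user edits are rebased in
    last_seen_base: String, // lags behind until the next flush
    may_have_unnotified_user_edits: bool,
}

impl MiniTrackedBuffer {
    // Returns the (old, new) pair to diff, if there is anything left to report.
    fn unnotified_span(&self) -> Option<(&str, &str)> {
        if !self.may_have_unnotified_user_edits || self.diff_base == self.last_seen_base {
            return None;
        }
        Some((self.last_seen_base.as_str(), self.diff_base.as_str()))
    }

    // Mirrors flush_unnotified_user_edits: report the delta, then mark everything as seen.
    fn flush(&mut self) -> Option<(String, String)> {
        let span = self
            .unnotified_span()
            .map(|(old, new)| (old.to_string(), new.to_string()));
        self.last_seen_base = self.diff_base.clone();
        self.may_have_unnotified_user_edits = false;
        span
    }
}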

View File

@@ -17,7 +17,6 @@ anyhow.workspace = true
auto_update.workspace = true
editor.workspace = true
extension_host.workspace = true
fs.workspace = true
futures.workspace = true
gpui.workspace = true
language.workspace = true

View File

@@ -51,7 +51,6 @@ pub struct ActivityIndicator {
project: Entity<Project>,
auto_updater: Option<Entity<AutoUpdater>>,
context_menu_handle: PopoverMenuHandle<ContextMenu>,
fs_jobs: Vec<fs::JobInfo>,
}
#[derive(Debug)]
@@ -100,27 +99,6 @@ impl ActivityIndicator {
})
.detach();
let fs = project.read(cx).fs().clone();
let mut job_events = fs.subscribe_to_jobs();
cx.spawn(async move |this, cx| {
while let Some(job_event) = job_events.next().await {
this.update(cx, |this: &mut ActivityIndicator, cx| {
match job_event {
fs::JobEvent::Started { info } => {
this.fs_jobs.retain(|j| j.id != info.id);
this.fs_jobs.push(info);
}
fs::JobEvent::Completed { id } => {
this.fs_jobs.retain(|j| j.id != id);
}
}
cx.notify();
})?;
}
anyhow::Ok(())
})
.detach();
cx.subscribe(
&project.read(cx).lsp_store(),
|activity_indicator, _, event, cx| {
@@ -223,8 +201,7 @@ impl ActivityIndicator {
statuses: Vec::new(),
project: project.clone(),
auto_updater,
context_menu_handle: PopoverMenuHandle::default(),
fs_jobs: Vec::new(),
context_menu_handle: Default::default(),
}
});
@@ -455,23 +432,6 @@ impl ActivityIndicator {
});
}
// Show any long-running fs command
for fs_job in &self.fs_jobs {
if Instant::now().duration_since(fs_job.start) >= GIT_OPERATION_DELAY {
return Some(Content {
icon: Some(
Icon::new(IconName::ArrowCircle)
.size(IconSize::Small)
.with_rotate_animation(2)
.into_any_element(),
),
message: fs_job.message.clone().into(),
on_click: None,
tooltip_message: None,
});
}
}
// Show any language server installation info.
let mut downloading = SmallVec::<[_; 3]>::new();
let mut checking_for_update = SmallVec::<[_; 3]>::new();

View File

@@ -63,6 +63,7 @@ streaming_diff.workspace = true
strsim.workspace = true
task.workspace = true
telemetry.workspace = true
terminal.workspace = true
text.workspace = true
thiserror.workspace = true
ui.workspace = true

View File

@@ -6,6 +6,7 @@ mod native_agent_server;
pub mod outline;
mod templates;
mod thread;
mod tool_schema;
mod tools;
#[cfg(test)]
@@ -133,7 +134,9 @@ impl LanguageModels {
for model in provider.provided_models(cx) {
let model_info = Self::map_language_model_to_info(&model, &provider);
let model_id = model_info.id.clone();
provider_models.push(model_info);
if !recommended_models.contains(&(model.provider_id(), model.id())) {
provider_models.push(model_info);
}
models.insert(model_id, model);
}
if !provider_models.is_empty() {
@@ -215,7 +218,7 @@ impl LanguageModels {
}
_ => {
log::error!(
"Failed to authenticate provider: {}: {err:#}",
"Failed to authenticate provider: {}: {err}",
provider_name.0
);
}
@@ -964,10 +967,6 @@ impl acp_thread::AgentModelSelector for NativeAgentModelSelector {
}
impl acp_thread::AgentConnection for NativeAgentConnection {
fn telemetry_id(&self) -> &'static str {
"zed"
}
fn new_thread(
self: Rc<Self>,
project: Entity<Project>,
@@ -1108,6 +1107,10 @@ impl acp_thread::AgentConnection for NativeAgentConnection {
}
impl acp_thread::AgentTelemetry for NativeAgentConnection {
fn agent_name(&self) -> String {
"Zed".into()
}
fn thread_data(
&self,
session_id: &acp::SessionId,
@@ -1624,7 +1627,9 @@ mod internal_tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
Project::init_settings(cx);
agent_settings::init(cx);
language::init(cx);
LanguageModelRegistry::test(cx);
});
}

View File

@@ -1394,7 +1394,7 @@ mod tests {
async fn init_test(cx: &mut TestAppContext) -> EditAgent {
cx.update(settings::init);
cx.update(Project::init_settings);
let project = Project::test(FakeFs::new(cx.executor()), [], cx).await;
let model = Arc::new(FakeLanguageModel::default());
let action_log = cx.new(|_| ActionLog::new(project.clone()));

View File

@@ -1468,9 +1468,14 @@ impl EditAgentTest {
gpui_tokio::init(cx);
let http_client = Arc::new(ReqwestClient::user_agent("agent tests").unwrap());
cx.set_http_client(http_client);
client::init_settings(cx);
let client = Client::production(cx);
let user_store = cx.new(|cx| UserStore::new(client.clone(), cx));
settings::init(cx);
Project::init_settings(cx);
language::init(cx);
language_model::init(client.clone(), cx);
language_models::init(user_store, client.clone(), cx);
});

View File

@@ -88,6 +88,8 @@ mod tests {
async |fs, project, cx| {
let auth = cx.update(|cx| {
prompt_store::init(cx);
terminal::init(cx);
let registry = language_model::LanguageModelRegistry::read_global(cx);
let auth = registry
.provider(&language_model::ANTHROPIC_PROVIDER_ID)

View File

@@ -1,6 +1,6 @@
use anyhow::Result;
use gpui::{AsyncApp, Entity};
use language::{Buffer, OutlineItem};
use language::{Buffer, OutlineItem, ParseStatus};
use regex::Regex;
use std::fmt::Write;
use text::Point;
@@ -30,9 +30,10 @@ pub async fn get_buffer_content_or_outline(
if file_size > AUTO_OUTLINE_SIZE {
// For large files, use outline instead of full content
// Wait until the buffer has been fully parsed, so we can read its outline
buffer
.read_with(cx, |buffer, _| buffer.parsing_idle())?
.await;
let mut parse_status = buffer.read_with(cx, |buffer, _| buffer.parse_status())?;
while *parse_status.borrow() != ParseStatus::Idle {
parse_status.changed().await?;
}
let outline_items = buffer.read_with(cx, |buffer, _| {
let snapshot = buffer.snapshot();
@@ -44,25 +45,6 @@ pub async fn get_buffer_content_or_outline(
.collect::<Vec<_>>()
})?;
// If no outline exists, fall back to first 1KB so the agent has some context
if outline_items.is_empty() {
let text = buffer.read_with(cx, |buffer, _| {
let snapshot = buffer.snapshot();
let len = snapshot.len().min(1024);
let content = snapshot.text_for_range(0..len).collect::<String>();
if let Some(path) = path {
format!("# First 1KB of {path} (file too large to show full content, and no outline available)\n\n{content}")
} else {
format!("# First 1KB of file (file too large to show full content, and no outline available)\n\n{content}")
}
})?;
return Ok(BufferContent {
text,
is_outline: false,
});
}
let outline_text = render_outline(outline_items, None, 0, usize::MAX).await?;
let text = if let Some(path) = path {
@@ -159,62 +141,3 @@ fn render_entries(
entries_rendered
}
#[cfg(test)]
mod tests {
use super::*;
use fs::FakeFs;
use gpui::TestAppContext;
use project::Project;
use settings::SettingsStore;
#[gpui::test]
async fn test_large_file_fallback_to_subset(cx: &mut TestAppContext) {
cx.update(|cx| {
let settings = SettingsStore::test(cx);
cx.set_global(settings);
});
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, [], cx).await;
let content = "A".repeat(100 * 1024); // 100KB
let content_len = content.len();
let buffer = project
.update(cx, |project, cx| project.create_buffer(true, cx))
.await
.expect("failed to create buffer");
buffer.update(cx, |buffer, cx| buffer.set_text(content, cx));
let result = cx
.spawn(|cx| async move { get_buffer_content_or_outline(buffer, None, &cx).await })
.await
.unwrap();
// Should contain some of the actual file content
assert!(
result.text.contains("AAAAAAAAAA"),
"Result did not contain content subset"
);
// Should be marked as not an outline (it's truncated content)
assert!(
!result.is_outline,
"Large file without outline should not be marked as outline"
);
// Should be reasonably sized (much smaller than original)
assert!(
result.text.len() < 50 * 1024,
"Result size {} should be smaller than 50KB",
result.text.len()
);
// Should be significantly smaller than the original content
assert!(
result.text.len() < content_len / 10,
"Result should be much smaller than original content"
);
}
}

View File

@@ -933,7 +933,7 @@ async fn test_profiles(cx: &mut TestAppContext) {
// Test that test-1 profile (default) has echo and delay tools
thread
.update(cx, |thread, cx| {
thread.set_profile(AgentProfileId("test-1".into()), cx);
thread.set_profile(AgentProfileId("test-1".into()));
thread.send(UserMessageId::new(), ["test"], cx)
})
.unwrap();
@@ -953,7 +953,7 @@ async fn test_profiles(cx: &mut TestAppContext) {
// Switch to test-2 profile, and verify that it has only the infinite tool.
thread
.update(cx, |thread, cx| {
thread.set_profile(AgentProfileId("test-2".into()), cx);
thread.set_profile(AgentProfileId("test-2".into()));
thread.send(UserMessageId::new(), ["test2"], cx)
})
.unwrap();
@@ -1002,8 +1002,8 @@ async fn test_mcp_tools(cx: &mut TestAppContext) {
)
.await;
cx.run_until_parked();
thread.update(cx, |thread, cx| {
thread.set_profile(AgentProfileId("test".into()), cx)
thread.update(cx, |thread, _| {
thread.set_profile(AgentProfileId("test".into()))
});
let mut mcp_tool_calls = setup_context_server(
@@ -1169,8 +1169,8 @@ async fn test_mcp_tool_truncation(cx: &mut TestAppContext) {
.await;
cx.run_until_parked();
thread.update(cx, |thread, cx| {
thread.set_profile(AgentProfileId("test".into()), cx);
thread.update(cx, |thread, _| {
thread.set_profile(AgentProfileId("test".into()));
thread.add_tool(EchoTool);
thread.add_tool(DelayTool);
thread.add_tool(WordListTool);
@@ -1851,6 +1851,7 @@ async fn test_agent_connection(cx: &mut TestAppContext) {
// Initialize language model system with test provider
cx.update(|cx| {
gpui_tokio::init(cx);
client::init_settings(cx);
let http_client = FakeHttpClient::with_404_response();
let clock = Arc::new(clock::FakeSystemClock::new());
@@ -1858,7 +1859,9 @@ async fn test_agent_connection(cx: &mut TestAppContext) {
let user_store = cx.new(|cx| UserStore::new(client.clone(), cx));
language_model::init(client.clone(), cx);
language_models::init(user_store, client.clone(), cx);
Project::init_settings(cx);
LanguageModelRegistry::test(cx);
agent_settings::init(cx);
});
cx.executor().forbid_parking();
@@ -2392,6 +2395,8 @@ async fn setup(cx: &mut TestAppContext, model: TestModel) -> ThreadTest {
cx.update(|cx| {
settings::init(cx);
Project::init_settings(cx);
agent_settings::init(cx);
match model {
TestModel::Fake => {}
@@ -2399,6 +2404,7 @@ async fn setup(cx: &mut TestAppContext, model: TestModel) -> ThreadTest {
gpui_tokio::init(cx);
let http_client = ReqwestClient::user_agent("agent tests").unwrap();
cx.set_http_client(Arc::new(http_client));
client::init_settings(cx);
let client = Client::production(cx);
let user_store = cx.new(|cx| UserStore::new(client.clone(), cx));
language_model::init(client.clone(), cx);

View File

@@ -30,17 +30,16 @@ use gpui::{
};
use language_model::{
LanguageModel, LanguageModelCompletionError, LanguageModelCompletionEvent, LanguageModelExt,
LanguageModelId, LanguageModelImage, LanguageModelProviderId, LanguageModelRegistry,
LanguageModelRequest, LanguageModelRequestMessage, LanguageModelRequestTool,
LanguageModelToolResult, LanguageModelToolResultContent, LanguageModelToolSchemaFormat,
LanguageModelToolUse, LanguageModelToolUseId, Role, SelectedModel, StopReason, TokenUsage,
ZED_CLOUD_PROVIDER_ID,
LanguageModelImage, LanguageModelProviderId, LanguageModelRegistry, LanguageModelRequest,
LanguageModelRequestMessage, LanguageModelRequestTool, LanguageModelToolResult,
LanguageModelToolResultContent, LanguageModelToolSchemaFormat, LanguageModelToolUse,
LanguageModelToolUseId, Role, SelectedModel, StopReason, TokenUsage, ZED_CLOUD_PROVIDER_ID,
};
use project::Project;
use prompt_store::ProjectContext;
use schemars::{JsonSchema, Schema};
use serde::{Deserialize, Serialize};
use settings::{LanguageModelSelection, Settings, update_settings_file};
use settings::{Settings, update_settings_file};
use smol::stream::StreamExt;
use std::{
collections::BTreeMap,
@@ -799,8 +798,7 @@ impl Thread {
let profile_id = db_thread
.profile
.unwrap_or_else(|| AgentSettings::get_global(cx).default_profile.clone());
let mut model = LanguageModelRegistry::global(cx).update(cx, |registry, cx| {
let model = LanguageModelRegistry::global(cx).update(cx, |registry, cx| {
db_thread
.model
.and_then(|model| {
@@ -813,16 +811,6 @@ impl Thread {
.or_else(|| registry.default_model())
.map(|model| model.model)
});
if model.is_none() {
model = Self::resolve_profile_model(&profile_id, cx);
}
if model.is_none() {
model = LanguageModelRegistry::global(cx).update(cx, |registry, _cx| {
registry.default_model().map(|model| model.model)
});
}
let (prompt_capabilities_tx, prompt_capabilities_rx) =
watch::channel(Self::prompt_capabilities(model.as_deref()));
@@ -1019,17 +1007,8 @@ impl Thread {
&self.profile_id
}
pub fn set_profile(&mut self, profile_id: AgentProfileId, cx: &mut Context<Self>) {
if self.profile_id == profile_id {
return;
}
pub fn set_profile(&mut self, profile_id: AgentProfileId) {
self.profile_id = profile_id;
// Swap to the profile's preferred model when available.
if let Some(model) = Self::resolve_profile_model(&self.profile_id, cx) {
self.set_model(model, cx);
}
}
pub fn cancel(&mut self, cx: &mut Context<Self>) {
@@ -1086,35 +1065,6 @@ impl Thread {
})
}
/// Look up the active profile and resolve its preferred model if one is configured.
fn resolve_profile_model(
profile_id: &AgentProfileId,
cx: &mut Context<Self>,
) -> Option<Arc<dyn LanguageModel>> {
let selection = AgentSettings::get_global(cx)
.profiles
.get(profile_id)?
.default_model
.clone()?;
Self::resolve_model_from_selection(&selection, cx)
}
/// Translate a stored model selection into the configured model from the registry.
fn resolve_model_from_selection(
selection: &LanguageModelSelection,
cx: &mut Context<Self>,
) -> Option<Arc<dyn LanguageModel>> {
let selected = SelectedModel {
provider: LanguageModelProviderId::from(selection.provider.0.clone()),
model: LanguageModelId::from(selection.model.clone()),
};
LanguageModelRegistry::global(cx).update(cx, |registry, cx| {
registry
.select_model(&selected, cx)
.map(|configured| configured.model)
})
}
pub fn resume(
&mut self,
cx: &mut Context<Self>,
@@ -2189,7 +2139,7 @@ where
/// Returns the JSON schema that describes the tool's input.
fn input_schema(format: LanguageModelToolSchemaFormat) -> Schema {
language_model::tool_schema::root_schema_for::<Self::Input>(format)
crate::tool_schema::root_schema_for::<Self::Input>(format)
}
/// Some tools rely on a provider for the underlying billing or other reasons.
@@ -2276,7 +2226,7 @@ where
fn input_schema(&self, format: LanguageModelToolSchemaFormat) -> Result<serde_json::Value> {
let mut json = serde_json::to_value(T::input_schema(format))?;
language_model::tool_schema::adapt_schema_to_format(&mut json, format)?;
crate::tool_schema::adapt_schema_to_format(&mut json, format)?;
Ok(json)
}

View File

@@ -1,4 +1,5 @@
use anyhow::Result;
use language_model::LanguageModelToolSchemaFormat;
use schemars::{
JsonSchema, Schema,
generate::SchemaSettings,
@@ -6,16 +7,7 @@ use schemars::{
};
use serde_json::Value;
/// Indicates the format used to define the input schema for a language model tool.
#[derive(Debug, PartialEq, Eq, Clone, Copy, Hash)]
pub enum LanguageModelToolSchemaFormat {
/// A JSON schema, see https://json-schema.org
JsonSchema,
/// A subset of an OpenAPI 3.0 schema object supported by Google AI, see https://ai.google.dev/api/caching#Schema
JsonSchemaSubset,
}
pub fn root_schema_for<T: JsonSchema>(format: LanguageModelToolSchemaFormat) -> Schema {
pub(crate) fn root_schema_for<T: JsonSchema>(format: LanguageModelToolSchemaFormat) -> Schema {
let mut generator = match format {
LanguageModelToolSchemaFormat::JsonSchema => SchemaSettings::draft07().into_generator(),
LanguageModelToolSchemaFormat::JsonSchemaSubset => SchemaSettings::openapi3()

View File

@@ -165,7 +165,7 @@ impl AnyAgentTool for ContextServerTool {
format: language_model::LanguageModelToolSchemaFormat,
) -> Result<serde_json::Value> {
let mut schema = self.tool.input_schema.clone();
language_model::tool_schema::adapt_schema_to_format(&mut schema, format)?;
crate::tool_schema::adapt_schema_to_format(&mut schema, format)?;
Ok(match schema {
serde_json::Value::Null => {
serde_json::json!({ "type": "object", "properties": [] })

View File

@@ -562,6 +562,7 @@ fn resolve_path(
mod tests {
use super::*;
use crate::{ContextServerRegistry, Templates};
use client::TelemetrySettings;
use fs::Fs;
use gpui::{TestAppContext, UpdateGlobal};
use language_model::fake_provider::FakeLanguageModel;
@@ -1752,6 +1753,10 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
TelemetrySettings::register(cx);
agent_settings::AgentSettings::register(cx);
Project::init_settings(cx);
});
}
}

View File

@@ -246,6 +246,8 @@ mod test {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
});
}
}

View File

@@ -778,6 +778,8 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
});
}

View File

@@ -223,6 +223,8 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
});
}

View File

@@ -163,6 +163,8 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
});
}
}

View File

@@ -509,6 +509,8 @@ mod test {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
});
}

View File

@@ -21,6 +21,7 @@ acp_tools.workspace = true
acp_thread.workspace = true
action_log.workspace = true
agent-client-protocol.workspace = true
agent_settings.workspace = true
anyhow.workspace = true
async-trait.workspace = true
client.workspace = true
@@ -32,6 +33,7 @@ gpui.workspace = true
gpui_tokio = { workspace = true, optional = true }
http_client.workspace = true
indoc.workspace = true
language.workspace = true
language_model.workspace = true
language_models.workspace = true
log.workspace = true

View File

@@ -29,7 +29,6 @@ pub struct UnsupportedVersion;
pub struct AcpConnection {
server_name: SharedString,
telemetry_id: &'static str,
connection: Rc<acp::ClientSideConnection>,
sessions: Rc<RefCell<HashMap<acp::SessionId, AcpSession>>>,
auth_methods: Vec<acp::AuthMethod>,
@@ -53,7 +52,6 @@ pub struct AcpSession {
pub async fn connect(
server_name: SharedString,
telemetry_id: &'static str,
command: AgentServerCommand,
root_dir: &Path,
default_mode: Option<acp::SessionModeId>,
@@ -62,7 +60,6 @@ pub async fn connect(
) -> Result<Rc<dyn AgentConnection>> {
let conn = AcpConnection::stdio(
server_name,
telemetry_id,
command.clone(),
root_dir,
default_mode,
@@ -78,7 +75,6 @@ const MINIMUM_SUPPORTED_VERSION: acp::ProtocolVersion = acp::V1;
impl AcpConnection {
pub async fn stdio(
server_name: SharedString,
telemetry_id: &'static str,
command: AgentServerCommand,
root_dir: &Path,
default_mode: Option<acp::SessionModeId>,
@@ -136,7 +132,7 @@ impl AcpConnection {
while let Ok(n) = stderr.read_line(&mut line).await
&& n > 0
{
log::warn!("agent stderr: {}", line.trim());
log::warn!("agent stderr: {}", &line);
line.clear();
}
Ok(())
@@ -203,7 +199,6 @@ impl AcpConnection {
root_dir: root_dir.to_owned(),
connection,
server_name,
telemetry_id,
sessions,
agent_capabilities: response.agent_capabilities,
default_mode,
@@ -231,10 +226,6 @@ impl Drop for AcpConnection {
}
impl AgentConnection for AcpConnection {
fn telemetry_id(&self) -> &'static str {
self.telemetry_id
}
fn new_thread(
self: Rc<Self>,
project: Entity<Project>,

View File

@@ -62,7 +62,6 @@ impl AgentServer for ClaudeCode {
cx: &mut App,
) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
let name = self.name();
let telemetry_id = self.telemetry_id();
let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
let is_remote = delegate.project.read(cx).is_via_remote_server();
let store = delegate.store.downgrade();
@@ -86,7 +85,6 @@ impl AgentServer for ClaudeCode {
.await?;
let connection = crate::acp::connect(
name,
telemetry_id,
command,
root_dir.as_ref(),
default_mode,

View File

@@ -63,7 +63,6 @@ impl AgentServer for Codex {
cx: &mut App,
) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
let name = self.name();
let telemetry_id = self.telemetry_id();
let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
let is_remote = delegate.project.read(cx).is_via_remote_server();
let store = delegate.store.downgrade();
@@ -88,7 +87,6 @@ impl AgentServer for Codex {
let connection = crate::acp::connect(
name,
telemetry_id,
command,
root_dir.as_ref(),
default_mode,

View File

@@ -68,7 +68,6 @@ impl crate::AgentServer for CustomAgentServer {
cx: &mut App,
) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
let name = self.name();
let telemetry_id = self.telemetry_id();
let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
let is_remote = delegate.project.read(cx).is_via_remote_server();
let default_mode = self.default_mode(cx);
@@ -94,7 +93,6 @@ impl crate::AgentServer for CustomAgentServer {
.await?;
let connection = crate::acp::connect(
name,
telemetry_id,
command,
root_dir.as_ref(),
default_mode,

View File

@@ -6,9 +6,7 @@ use gpui::{AppContext, Entity, TestAppContext};
use indoc::indoc;
#[cfg(test)]
use project::agent_server_store::BuiltinAgentServerSettings;
use project::{FakeFs, Project};
#[cfg(test)]
use settings::Settings;
use project::{FakeFs, Project, agent_server_store::AllAgentServersSettings};
use std::{
path::{Path, PathBuf},
sync::Arc,
@@ -454,22 +452,29 @@ pub use common_e2e_tests;
// Helpers
pub async fn init_test(cx: &mut TestAppContext) -> Arc<FakeFs> {
use settings::Settings;
env_logger::try_init().ok();
cx.update(|cx| {
let settings_store = settings::SettingsStore::test(cx);
cx.set_global(settings_store);
Project::init_settings(cx);
language::init(cx);
gpui_tokio::init(cx);
let http_client = reqwest_client::ReqwestClient::user_agent("agent tests").unwrap();
cx.set_http_client(Arc::new(http_client));
client::init_settings(cx);
let client = client::Client::production(cx);
let user_store = cx.new(|cx| client::UserStore::new(client.clone(), cx));
language_model::init(client.clone(), cx);
language_models::init(user_store, client, cx);
agent_settings::init(cx);
AllAgentServersSettings::register(cx);
#[cfg(test)]
project::agent_server_store::AllAgentServersSettings::override_global(
project::agent_server_store::AllAgentServersSettings {
AllAgentServersSettings::override_global(
AllAgentServersSettings {
claude: Some(BuiltinAgentServerSettings {
path: Some("claude-code-acp".into()),
args: None,

View File

@@ -31,7 +31,6 @@ impl AgentServer for Gemini {
cx: &mut App,
) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
let name = self.name();
let telemetry_id = self.telemetry_id();
let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
let is_remote = delegate.project.read(cx).is_via_remote_server();
let store = delegate.store.downgrade();
@@ -65,7 +64,6 @@ impl AgentServer for Gemini {
let connection = crate::acp::connect(
name,
telemetry_id,
command,
root_dir.as_ref(),
default_mode,

View File

@@ -6,8 +6,8 @@ use convert_case::{Case, Casing as _};
use fs::Fs;
use gpui::{App, SharedString};
use settings::{
AgentProfileContent, ContextServerPresetContent, LanguageModelSelection, Settings as _,
SettingsContent, update_settings_file,
AgentProfileContent, ContextServerPresetContent, Settings as _, SettingsContent,
update_settings_file,
};
use util::ResultExt as _;
@@ -53,30 +53,19 @@ impl AgentProfile {
let base_profile =
base_profile_id.and_then(|id| AgentSettings::get_global(cx).profiles.get(&id).cloned());
// Copy toggles from the base profile so the new profile starts with familiar defaults.
let tools = base_profile
.as_ref()
.map(|profile| profile.tools.clone())
.unwrap_or_default();
let enable_all_context_servers = base_profile
.as_ref()
.map(|profile| profile.enable_all_context_servers)
.unwrap_or_default();
let context_servers = base_profile
.as_ref()
.map(|profile| profile.context_servers.clone())
.unwrap_or_default();
// Preserve the base profile's model preference when cloning into a new profile.
let default_model = base_profile
.as_ref()
.and_then(|profile| profile.default_model.clone());
let profile_settings = AgentProfileSettings {
name: name.into(),
tools,
enable_all_context_servers,
context_servers,
default_model,
tools: base_profile
.as_ref()
.map(|profile| profile.tools.clone())
.unwrap_or_default(),
enable_all_context_servers: base_profile
.as_ref()
.map(|profile| profile.enable_all_context_servers)
.unwrap_or_default(),
context_servers: base_profile
.map(|profile| profile.context_servers)
.unwrap_or_default(),
};
update_settings_file(fs, cx, {
@@ -107,8 +96,6 @@ pub struct AgentProfileSettings {
pub tools: IndexMap<Arc<str>, bool>,
pub enable_all_context_servers: bool,
pub context_servers: IndexMap<Arc<str>, ContextServerPreset>,
/// Default language model to apply when this profile becomes active.
pub default_model: Option<LanguageModelSelection>,
}
impl AgentProfileSettings {
@@ -157,7 +144,6 @@ impl AgentProfileSettings {
)
})
.collect(),
default_model: self.default_model.clone(),
},
);
@@ -167,23 +153,15 @@ impl AgentProfileSettings {
impl From<AgentProfileContent> for AgentProfileSettings {
fn from(content: AgentProfileContent) -> Self {
let AgentProfileContent {
name,
tools,
enable_all_context_servers,
context_servers,
default_model,
} = content;
Self {
name: name.into(),
tools,
enable_all_context_servers: enable_all_context_servers.unwrap_or_default(),
context_servers: context_servers
name: content.name.into(),
tools: content.tools,
enable_all_context_servers: content.enable_all_context_servers.unwrap_or_default(),
context_servers: content
.context_servers
.into_iter()
.map(|(server_id, preset)| (server_id, preset.into()))
.collect(),
default_model,
}
}
}

View File

@@ -10,7 +10,7 @@ use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{
DefaultAgentView, DockPosition, LanguageModelParameters, LanguageModelSelection,
NotifyWhenAgentWaiting, RegisterSetting, Settings,
NotifyWhenAgentWaiting, Settings,
};
pub use crate::agent_profile::*;
@@ -19,7 +19,11 @@ pub const SUMMARIZE_THREAD_PROMPT: &str = include_str!("prompts/summarize_thread
pub const SUMMARIZE_THREAD_DETAILED_PROMPT: &str =
include_str!("prompts/summarize_thread_detailed_prompt.txt");
#[derive(Clone, Debug, RegisterSetting)]
pub fn init(cx: &mut App) {
AgentSettings::register(cx);
}
#[derive(Clone, Debug)]
pub struct AgentSettings {
pub enabled: bool,
pub button: bool,

View File

@@ -109,8 +109,6 @@ impl ContextPickerCompletionProvider {
icon_path: Some(mode.icon().path().into()),
documentation: None,
source: project::CompletionSource::Custom,
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
// This ensures that when a user accepts this completion, the
// completion menu will still be shown after "@category " is
@@ -148,8 +146,6 @@ impl ContextPickerCompletionProvider {
documentation: None,
insert_text_mode: None,
source: project::CompletionSource::Custom,
match_start: None,
snippet_deduplication_key: None,
icon_path: Some(icon_for_completion),
confirm: Some(confirm_completion_callback(
thread_entry.title().clone(),
@@ -181,8 +177,6 @@ impl ContextPickerCompletionProvider {
documentation: None,
insert_text_mode: None,
source: project::CompletionSource::Custom,
match_start: None,
snippet_deduplication_key: None,
icon_path: Some(icon_path),
confirm: Some(confirm_completion_callback(
rule.title,
@@ -239,8 +233,6 @@ impl ContextPickerCompletionProvider {
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(completion_icon_path),
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
file_name,
@@ -292,8 +284,6 @@ impl ContextPickerCompletionProvider {
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(icon_path),
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
symbol.name.into(),
@@ -326,8 +316,6 @@ impl ContextPickerCompletionProvider {
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(icon_path),
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
url_to_fetch.to_string().into(),
@@ -396,8 +384,6 @@ impl ContextPickerCompletionProvider {
icon_path: Some(action.icon().path().into()),
documentation: None,
source: project::CompletionSource::Custom,
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
// This ensures that when a user accepts this completion, the
// completion menu will still be shown after "@category " is
@@ -708,18 +694,14 @@ fn build_symbol_label(symbol_name: &str, file_name: &str, line: u32, cx: &App) -
}
fn build_code_label_for_full_path(file_name: &str, directory: Option<&str>, cx: &App) -> CodeLabel {
let path = cx
.theme()
.syntax()
.highlight_id("variable")
.map(HighlightId);
let comment_id = cx.theme().syntax().highlight_id("comment").map(HighlightId);
let mut label = CodeLabelBuilder::default();
label.push_str(file_name, None);
label.push_str(" ", None);
if let Some(directory) = directory {
label.push_str(directory, path);
label.push_str(directory, comment_id);
}
label.build()
@@ -788,8 +770,6 @@ impl CompletionProvider for ContextPickerCompletionProvider {
)),
source: project::CompletionSource::Custom,
icon_path: None,
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(Arc::new({
let editor = editor.clone();

View File

@@ -401,9 +401,10 @@ mod tests {
use acp_thread::{AgentConnection, StubAgentConnection};
use agent::HistoryStore;
use agent_client_protocol as acp;
use agent_settings::AgentSettings;
use assistant_text_thread::TextThreadStore;
use buffer_diff::{DiffHunkStatus, DiffHunkStatusKind};
use editor::RowInfo;
use editor::{EditorSettings, RowInfo};
use fs::FakeFs;
use gpui::{AppContext as _, SemanticVersion, TestAppContext};
@@ -412,7 +413,7 @@ mod tests {
use pretty_assertions::assert_matches;
use project::Project;
use serde_json::json;
use settings::SettingsStore;
use settings::{Settings as _, SettingsStore};
use util::path;
use workspace::Workspace;
@@ -538,8 +539,13 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
AgentSettings::register(cx);
workspace::init_settings(cx);
theme::init(theme::LoadThemes::JustBase, cx);
release_channel::init(SemanticVersion::default(), cx);
EditorSettings::register(cx);
});
}
}

View File

@@ -15,7 +15,6 @@ use editor::{
EditorEvent, EditorMode, EditorSnapshot, EditorStyle, ExcerptId, FoldPlaceholder, Inlay,
MultiBuffer, ToOffset,
actions::Paste,
code_context_menus::CodeContextMenu,
display_map::{Crease, CreaseId, FoldId},
scroll::Autoscroll,
};
@@ -273,15 +272,6 @@ impl MessageEditor {
self.editor.read(cx).is_empty(cx)
}
pub fn is_completions_menu_visible(&self, cx: &App) -> bool {
self.editor
.read(cx)
.context_menu()
.borrow()
.as_ref()
.is_some_and(|menu| matches!(menu, CodeContextMenu::Completions(_)) && menu.visible())
}
pub fn mentions(&self) -> HashSet<MentionUri> {
self.mention_set
.mentions
@@ -724,10 +714,7 @@ impl MessageEditor {
let mut all_tracked_buffers = Vec::new();
let result = editor.update(cx, |editor, cx| {
let (mut ix, _) = text
.char_indices()
.find(|(_, c)| !c.is_whitespace())
.unwrap_or((0, '\0'));
let mut ix = text.chars().position(|c| !c.is_whitespace()).unwrap_or(0);
let mut chunks: Vec<acp::ContentBlock> = Vec::new();
let text = editor.text(cx);
editor.display_map.update(cx, |map, cx| {
@@ -846,45 +833,6 @@ impl MessageEditor {
cx.emit(MessageEditorEvent::Send)
}
pub fn trigger_completion_menu(&mut self, window: &mut Window, cx: &mut Context<Self>) {
let editor = self.editor.clone();
cx.spawn_in(window, async move |_, cx| {
editor
.update_in(cx, |editor, window, cx| {
let menu_is_open =
editor.context_menu().borrow().as_ref().is_some_and(|menu| {
matches!(menu, CodeContextMenu::Completions(_)) && menu.visible()
});
let has_at_sign = {
let snapshot = editor.display_snapshot(cx);
let cursor = editor.selections.newest::<text::Point>(&snapshot).head();
let offset = cursor.to_offset(&snapshot);
if offset > 0 {
snapshot
.buffer_snapshot()
.reversed_chars_at(offset)
.next()
.map(|sign| sign == '@')
.unwrap_or(false)
} else {
false
}
};
if menu_is_open && has_at_sign {
return;
}
editor.insert("@", window, cx);
editor.show_completions(&editor::actions::ShowCompletions, window, cx);
})
.log_err();
})
.detach();
}
fn chat(&mut self, _: &Chat, _: &mut Window, cx: &mut Context<Self>) {
self.send(cx);
}
@@ -1244,17 +1192,6 @@ impl MessageEditor {
self.editor.read(cx).text(cx)
}
pub fn set_placeholder_text(
&mut self,
placeholder: &str,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.editor.update(cx, |editor, cx| {
editor.set_placeholder_text(placeholder, window, cx);
});
}
#[cfg(test)]
pub fn set_text(&mut self, text: &str, window: &mut Window, cx: &mut Context<Self>) {
self.editor.update(cx, |editor, cx| {
@@ -1959,8 +1896,10 @@ mod tests {
let app_state = cx.update(AppState::test);
cx.update(|cx| {
language::init(cx);
editor::init(cx);
workspace::init(app_state.clone(), cx);
Project::init_settings(cx);
});
let project = Project::test(app_state.fs.clone(), [path!("/dir").as_ref()], cx).await;
@@ -2133,8 +2072,10 @@ mod tests {
let app_state = cx.update(AppState::test);
cx.update(|cx| {
language::init(cx);
editor::init(cx);
workspace::init(app_state.clone(), cx);
Project::init_settings(cx);
});
app_state
@@ -2671,14 +2612,13 @@ mod tests {
}
#[gpui::test]
async fn test_large_file_mention_fallback(cx: &mut TestAppContext) {
async fn test_large_file_mention_uses_outline(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
// Create a large file that exceeds AUTO_OUTLINE_SIZE
// Using plain text without a configured language, so no outline is available
const LINE: &str = "This is a line of text in the file\n";
const LINE: &str = "fn example_function() { /* some code */ }\n";
let large_content = LINE.repeat(2 * (outline::AUTO_OUTLINE_SIZE / LINE.len()));
assert!(large_content.len() > outline::AUTO_OUTLINE_SIZE);
@@ -2689,8 +2629,8 @@ mod tests {
fs.insert_tree(
"/project",
json!({
"large_file.txt": large_content.clone(),
"small_file.txt": small_content,
"large_file.rs": large_content.clone(),
"small_file.rs": small_content,
}),
)
.await;
@@ -2736,7 +2676,7 @@ mod tests {
let large_file_abs_path = project.read_with(cx, |project, cx| {
let worktree = project.worktrees(cx).next().unwrap();
let worktree_root = worktree.read(cx).abs_path();
worktree_root.join("large_file.txt")
worktree_root.join("large_file.rs")
});
let large_file_task = message_editor.update(cx, |editor, cx| {
editor.confirm_mention_for_file(large_file_abs_path, cx)
@@ -2745,20 +2685,11 @@ mod tests {
let large_file_mention = large_file_task.await.unwrap();
match large_file_mention {
Mention::Text { content, .. } => {
// Should contain some of the content but not all of it
assert!(
content.contains(LINE),
"Should contain some of the file content"
);
assert!(
!content.contains(&LINE.repeat(100)),
"Should not contain the full file"
);
// Should be much smaller than original
assert!(
content.len() < large_content.len() / 10,
"Should be significantly truncated"
);
// Should contain outline header for large files
assert!(content.contains("File outline for"));
assert!(content.contains("file too large to show full content"));
// Should not contain the full repeated content
assert!(!content.contains(&LINE.repeat(100)));
}
_ => panic!("Expected Text mention for large file"),
}
@@ -2768,7 +2699,7 @@ mod tests {
let small_file_abs_path = project.read_with(cx, |project, cx| {
let worktree = project.worktrees(cx).next().unwrap();
let worktree_root = worktree.read(cx).abs_path();
worktree_root.join("small_file.txt")
worktree_root.join("small_file.rs")
});
let small_file_task = message_editor.update(cx, |editor, cx| {
editor.confirm_mention_for_file(small_file_abs_path, cx)
@@ -2777,8 +2708,10 @@ mod tests {
let small_file_mention = small_file_task.await.unwrap();
match small_file_mention {
Mention::Text { content, .. } => {
// Should contain the full actual content
// Should contain the actual content
assert_eq!(content, small_content);
// Should not contain outline header
assert!(!content.contains("File outline for"));
}
_ => panic!("Expected Text mention for small file"),
}
@@ -2900,7 +2833,7 @@ mod tests {
cx.run_until_parked();
editor.update_in(cx, |editor, window, cx| {
editor.set_text(" \u{A0}してhello world ", window, cx);
editor.set_text(" hello world ", window, cx);
});
let (content, _) = message_editor
@@ -2911,7 +2844,7 @@ mod tests {
assert_eq!(
content,
vec![acp::ContentBlock::Text(acp::TextContent {
text: "してhello world".into(),
text: "hello world".into(),
annotations: None,
meta: None
})]
@@ -3066,8 +2999,10 @@ mod tests {
let app_state = cx.update(AppState::test);
cx.update(|cx| {
language::init(cx);
editor::init(cx);
workspace::init(app_state.clone(), cx);
Project::init_settings(cx);
});
app_state

View File

@@ -1,6 +1,6 @@
use std::rc::Rc;
use acp_thread::{AgentModelInfo, AgentModelSelector};
use acp_thread::AgentModelSelector;
use gpui::{Entity, FocusHandle};
use picker::popover_menu::PickerPopoverMenu;
use ui::{
@@ -36,8 +36,12 @@ impl AcpModelSelectorPopover {
self.menu_handle.toggle(window, cx);
}
pub fn active_model<'a>(&self, cx: &'a App) -> Option<&'a AgentModelInfo> {
self.selector.read(cx).delegate.active_model()
pub fn active_model_name(&self, cx: &App) -> Option<SharedString> {
self.selector
.read(cx)
.delegate
.active_model()
.map(|model| model.name.clone())
}
}

View File

@@ -457,23 +457,25 @@ impl Render for AcpThreadHistory {
.on_action(cx.listener(Self::select_last))
.on_action(cx.listener(Self::confirm))
.on_action(cx.listener(Self::remove_selected_thread))
.child(
h_flex()
.h(px(41.)) // Match the toolbar perfectly
.w_full()
.py_1()
.px_2()
.gap_2()
.justify_between()
.border_b_1()
.border_color(cx.theme().colors().border)
.child(
Icon::new(IconName::MagnifyingGlass)
.color(Color::Muted)
.size(IconSize::Small),
)
.child(self.search_editor.clone()),
)
.when(!self.history_store.read(cx).is_empty(cx), |parent| {
parent.child(
h_flex()
.h(px(41.)) // Match the toolbar perfectly
.w_full()
.py_1()
.px_2()
.gap_2()
.justify_between()
.border_b_1()
.border_color(cx.theme().colors().border)
.child(
Icon::new(IconName::MagnifyingGlass)
.color(Color::Muted)
.size(IconSize::Small),
)
.child(self.search_editor.clone()),
)
})
.child({
let view = v_flex()
.id("list-container")
@@ -482,15 +484,19 @@ impl Render for AcpThreadHistory {
.flex_grow();
if self.history_store.read(cx).is_empty(cx) {
view.justify_center().items_center().child(
Label::new("You don't have any past threads yet.")
.size(LabelSize::Small)
.color(Color::Muted),
)
} else if self.search_produced_no_matches() {
view.justify_center()
.items_center()
.child(Label::new("No threads match your search.").size(LabelSize::Small))
.child(
h_flex().w_full().justify_center().child(
Label::new("You don't have any past threads yet.")
.size(LabelSize::Small),
),
)
} else if self.search_produced_no_matches() {
view.justify_center().child(
h_flex().w_full().justify_center().child(
Label::new("No threads match your search.").size(LabelSize::Small),
),
)
} else {
view.child(
uniform_list(
@@ -667,7 +673,7 @@ impl EntryTimeFormat {
timezone,
time_format::TimestampFormat::EnhancedAbsolute,
),
EntryTimeFormat::TimeOnly => time_format::format_time(timestamp.to_offset(timezone)),
EntryTimeFormat::TimeOnly => time_format::format_time(timestamp),
}
}
}
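
The search-bar change above relies on gpui's `.when` builder: the child is only attached while the history store is non-empty. A tiny hedged sketch of the idiom, where `cond` is a stand-in boolean rather than a name from this diff:
// Illustrative only: attach a child element conditionally with `.when`.
let row = h_flex()
    .gap_2()
    .when(cond, |this| {
        this.child(Label::new("Rendered only when `cond` is true").size(LabelSize::Small))
    });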

View File

@@ -4,12 +4,12 @@ use acp_thread::{
ToolCallStatus, UserMessageId,
};
use acp_thread::{AgentConnection, Plan};
use action_log::{ActionLog, ActionLogTelemetry};
use action_log::ActionLog;
use agent::{DbThreadMetadata, HistoryEntry, HistoryEntryId, HistoryStore, NativeAgentServer};
use agent_client_protocol::{self as acp, PromptCapabilities};
use agent_servers::{AgentServer, AgentServerDelegate};
use agent_settings::{AgentProfileId, AgentSettings, CompletionMode};
use anyhow::{Result, anyhow};
use anyhow::{Result, anyhow, bail};
use arrayvec::ArrayVec;
use audio::{Audio, Sound};
use buffer_diff::BufferDiff;
@@ -125,9 +125,8 @@ impl ProfileProvider for Entity<agent::Thread> {
}
fn set_profile(&self, profile_id: AgentProfileId, cx: &mut App) {
self.update(cx, |thread, cx| {
// Apply the profile and let the thread swap to its default model.
thread.set_profile(profile_id, cx);
self.update(cx, |thread, _cx| {
thread.set_profile(profile_id);
});
}
@@ -170,7 +169,7 @@ impl ThreadFeedbackState {
}
}
let session_id = thread.read(cx).session_id().clone();
let agent = thread.read(cx).connection().telemetry_id();
let agent_name = telemetry.agent_name();
let task = telemetry.thread_data(&session_id, cx);
let rating = match feedback {
ThreadFeedback::Positive => "positive",
@@ -180,9 +179,9 @@ impl ThreadFeedbackState {
let thread = task.await?;
telemetry::event!(
"Agent Thread Rated",
agent = agent,
session_id = session_id,
rating = rating,
agent = agent_name,
thread = thread
);
anyhow::Ok(())
@@ -207,15 +206,15 @@ impl ThreadFeedbackState {
self.comments_editor.take();
let session_id = thread.read(cx).session_id().clone();
let agent = thread.read(cx).connection().telemetry_id();
let agent_name = telemetry.agent_name();
let task = telemetry.thread_data(&session_id, cx);
cx.background_spawn(async move {
let thread = task.await?;
telemetry::event!(
"Agent Thread Feedback Comments",
agent = agent,
session_id = session_id,
comments = comments,
agent = agent_name,
thread = thread
);
anyhow::Ok(())
@@ -337,7 +336,19 @@ impl AcpThreadView {
let prompt_capabilities = Rc::new(RefCell::new(acp::PromptCapabilities::default()));
let available_commands = Rc::new(RefCell::new(vec![]));
let placeholder = placeholder_text(agent.name().as_ref(), false);
let placeholder = if agent.name() == "Zed Agent" {
format!("Message the {} — @ to include context", agent.name())
} else if agent.name() == "Claude Code"
|| agent.name() == "Codex"
|| !available_commands.borrow().is_empty()
{
format!(
"Message {} — @ to include context, / for commands",
agent.name()
)
} else {
format!("Message {} — @ to include context", agent.name())
};
let message_editor = cx.new(|cx| {
let mut editor = MessageEditor::new(
@@ -527,7 +538,14 @@ impl AcpThreadView {
})
.log_err()
} else {
let root_dir = root_dir.unwrap_or(paths::home_dir().as_path().into());
let root_dir = if let Some(acp_agent) = connection
.clone()
.downcast::<agent_servers::AcpConnection>()
{
acp_agent.root_dir().into()
} else {
root_dir.unwrap_or(paths::home_dir().as_path().into())
};
cx.update(|_, cx| {
connection
.clone()
@@ -1112,6 +1130,8 @@ impl AcpThreadView {
message_editor.contents(full_mention_content, cx)
});
let agent_telemetry_id = self.agent.telemetry_id();
self.thread_error.take();
self.editing_message.take();
self.thread_feedback.clear();
@@ -1119,8 +1139,6 @@ impl AcpThreadView {
let Some(thread) = self.thread() else {
return;
};
let agent_telemetry_id = self.agent.telemetry_id();
let session_id = thread.read(cx).session_id().clone();
let thread = thread.downgrade();
if self.should_be_following {
self.workspace
@@ -1131,7 +1149,6 @@ impl AcpThreadView {
}
self.is_loading_contents = true;
let model_id = self.current_model_id(cx);
let guard = cx.new(|_| ());
cx.observe_release(&guard, |this, _guard, cx| {
this.is_loading_contents = false;
@@ -1153,7 +1170,6 @@ impl AcpThreadView {
message_editor.clear(window, cx);
});
})?;
let turn_start_time = Instant::now();
let send = thread.update(cx, |thread, cx| {
thread.action_log().update(cx, |action_log, cx| {
for buffer in tracked_buffers {
@@ -1162,27 +1178,11 @@ impl AcpThreadView {
});
drop(guard);
telemetry::event!(
"Agent Message Sent",
agent = agent_telemetry_id,
session = session_id,
model = model_id
);
telemetry::event!("Agent Message Sent", agent = agent_telemetry_id);
thread.send(contents, cx)
})?;
let res = send.await;
let turn_time_ms = turn_start_time.elapsed().as_millis();
let status = if res.is_ok() { "success" } else { "failure" };
telemetry::event!(
"Agent Turn Completed",
agent = agent_telemetry_id,
session = session_id,
model = model_id,
status,
turn_time_ms,
);
res
send.await
});
cx.spawn(async move |this, cx| {
@@ -1384,7 +1384,7 @@ impl AcpThreadView {
AcpThreadEvent::Refusal => {
self.thread_retry_status.take();
self.thread_error = Some(ThreadError::Refusal);
let model_or_agent_name = self.current_model_name(cx);
let model_or_agent_name = self.get_current_model_name(cx);
let notification_message =
format!("{} refused to respond to this request", model_or_agent_name);
self.notify_with_sound(&notification_message, IconName::Warning, window, cx);
@@ -1444,14 +1444,7 @@ impl AcpThreadView {
});
}
let has_commands = !available_commands.is_empty();
self.available_commands.replace(available_commands);
let new_placeholder = placeholder_text(self.agent.name().as_ref(), has_commands);
self.message_editor.update(cx, |editor, cx| {
editor.set_placeholder_text(&new_placeholder, window, cx);
});
}
AcpThreadEvent::ModeUpdated(_mode) => {
// The connection keeps track of the mode
@@ -1860,14 +1853,6 @@ impl AcpThreadView {
let Some(thread) = self.thread() else {
return;
};
telemetry::event!(
"Agent Tool Call Authorized",
agent = self.agent.telemetry_id(),
session = thread.read(cx).session_id(),
option = option_kind
);
thread.update(cx, |thread, cx| {
thread.authorize_tool_call(tool_call_id, option_id, option_kind, cx);
});
@@ -3600,7 +3585,6 @@ impl AcpThreadView {
) -> Option<AnyElement> {
let thread = thread_entity.read(cx);
let action_log = thread.action_log();
let telemetry = ActionLogTelemetry::from(thread);
let changed_buffers = action_log.read(cx).changed_buffers(cx);
let plan = thread.plan();
@@ -3648,7 +3632,6 @@ impl AcpThreadView {
.when(self.edits_expanded, |parent| {
parent.child(self.render_edited_files(
action_log,
telemetry,
&changed_buffers,
pending_edits,
cx,
@@ -3929,7 +3912,6 @@ impl AcpThreadView {
fn render_edited_files(
&self,
action_log: &Entity<ActionLog>,
telemetry: ActionLogTelemetry,
changed_buffers: &BTreeMap<Entity<Buffer>, Entity<BufferDiff>>,
pending_edits: bool,
cx: &Context<Self>,
@@ -4049,14 +4031,12 @@ impl AcpThreadView {
.on_click({
let buffer = buffer.clone();
let action_log = action_log.clone();
let telemetry = telemetry.clone();
move |_, _, cx| {
action_log.update(cx, |action_log, cx| {
action_log
.reject_edits_in_ranges(
buffer.clone(),
vec![Anchor::MIN..Anchor::MAX],
Some(telemetry.clone()),
cx,
)
.detach_and_log_err(cx);
@@ -4071,13 +4051,11 @@ impl AcpThreadView {
.on_click({
let buffer = buffer.clone();
let action_log = action_log.clone();
let telemetry = telemetry.clone();
move |_, _, cx| {
action_log.update(cx, |action_log, cx| {
action_log.keep_edits_in_range(
buffer.clone(),
Anchor::MIN..Anchor::MAX,
Some(telemetry.clone()),
cx,
);
})
@@ -4188,8 +4166,6 @@ impl AcpThreadView {
.justify_between()
.child(
h_flex()
.gap_0p5()
.child(self.render_add_context_button(cx))
.child(self.render_follow_toggle(cx))
.children(self.render_burn_mode_toggle(cx)),
)
@@ -4295,23 +4271,17 @@ impl AcpThreadView {
let Some(thread) = self.thread() else {
return;
};
let telemetry = ActionLogTelemetry::from(thread.read(cx));
let action_log = thread.read(cx).action_log().clone();
action_log.update(cx, |action_log, cx| {
action_log.keep_all_edits(Some(telemetry), cx)
});
action_log.update(cx, |action_log, cx| action_log.keep_all_edits(cx));
}
fn reject_all(&mut self, _: &RejectAll, _window: &mut Window, cx: &mut Context<Self>) {
let Some(thread) = self.thread() else {
return;
};
let telemetry = ActionLogTelemetry::from(thread.read(cx));
let action_log = thread.read(cx).action_log().clone();
action_log
.update(cx, |action_log, cx| {
action_log.reject_all_edits(Some(telemetry), cx)
})
.update(cx, |action_log, cx| action_log.reject_all_edits(cx))
.detach();
}
@@ -4504,29 +4474,6 @@ impl AcpThreadView {
}))
}
fn render_add_context_button(&self, cx: &mut Context<Self>) -> impl IntoElement {
let message_editor = self.message_editor.clone();
let menu_visible = message_editor.read(cx).is_completions_menu_visible(cx);
IconButton::new("add-context", IconName::AtSign)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.when(!menu_visible, |this| {
this.tooltip(move |_window, cx| {
Tooltip::with_meta("Add Context", None, "Or type @ to include context", cx)
})
})
.on_click(cx.listener(move |_this, _, window, cx| {
let message_editor_clone = message_editor.clone();
window.defer(cx, move |window, cx| {
message_editor_clone.update(cx, |message_editor, cx| {
message_editor.trigger_completion_menu(window, cx);
});
});
}))
}
fn render_markdown(&self, markdown: Entity<Markdown>, style: MarkdownStyle) -> MarkdownElement {
let workspace = self.workspace.clone();
MarkdownElement::new(markdown, style).on_url_click(move |text, window, cx| {
@@ -4730,36 +4677,35 @@ impl AcpThreadView {
.languages
.language_for_name("Markdown");
let (thread_title, markdown) = if let Some(thread) = self.thread() {
let (thread_summary, markdown) = if let Some(thread) = self.thread() {
let thread = thread.read(cx);
(thread.title().to_string(), thread.to_markdown(cx))
} else {
return Task::ready(Ok(()));
};
let project = workspace.read(cx).project().clone();
window.spawn(cx, async move |cx| {
let markdown_language = markdown_language_task.await?;
let buffer = project
.update(cx, |project, cx| project.create_buffer(false, cx))?
.await?;
buffer.update(cx, |buffer, cx| {
buffer.set_text(markdown, cx);
buffer.set_language(Some(markdown_language), cx);
buffer.set_capability(language::Capability::ReadOnly, cx);
})?;
workspace.update_in(cx, |workspace, window, cx| {
let buffer = cx
.new(|cx| MultiBuffer::singleton(buffer, cx).with_title(thread_title.clone()));
let project = workspace.project().clone();
if !project.read(cx).is_local() {
bail!("failed to open active thread as markdown in remote project");
}
let buffer = project.update(cx, |project, cx| {
project.create_local_buffer(&markdown, Some(markdown_language), true, cx)
});
let buffer = cx.new(|cx| {
MultiBuffer::singleton(buffer, cx).with_title(thread_summary.clone())
});
workspace.add_item_to_active_pane(
Box::new(cx.new(|cx| {
let mut editor =
Editor::for_multibuffer(buffer, Some(project.clone()), window, cx);
editor.set_breadcrumb_header(thread_title);
editor.set_breadcrumb_header(thread_summary);
editor
})),
None,
@@ -4767,7 +4713,9 @@ impl AcpThreadView {
window,
cx,
);
})?;
anyhow::Ok(())
})??;
anyhow::Ok(())
})
}
@@ -5393,21 +5341,20 @@ impl AcpThreadView {
)
}
fn current_model_id(&self, cx: &App) -> Option<String> {
self.model_selector
.as_ref()
.and_then(|selector| selector.read(cx).active_model(cx).map(|m| m.id.to_string()))
}
fn current_model_name(&self, cx: &App) -> SharedString {
fn get_current_model_name(&self, cx: &App) -> SharedString {
// For native agent (Zed Agent), use the specific model name (e.g., "Claude 3.5 Sonnet")
// For ACP agents, use the agent name (e.g., "Claude Code", "Gemini CLI")
// This provides better clarity about what refused the request
if self.as_native_connection(cx).is_some() {
if self
.agent
.clone()
.downcast::<agent::NativeAgentServer>()
.is_some()
{
// Native agent - use the model name
self.model_selector
.as_ref()
.and_then(|selector| selector.read(cx).active_model(cx))
.map(|model| model.name.clone())
.and_then(|selector| selector.read(cx).active_model_name(cx))
.unwrap_or_else(|| SharedString::from("The model"))
} else {
// ACP agent - use the agent name (e.g., "Claude Code", "Gemini CLI")
@@ -5416,7 +5363,7 @@ impl AcpThreadView {
}
fn render_refusal_error(&self, cx: &mut Context<'_, Self>) -> Callout {
let model_or_agent_name = self.current_model_name(cx);
let model_or_agent_name = self.get_current_model_name(cx);
let refusal_message = format!(
"{} refused to respond to this prompt. This can happen when a model believes the prompt violates its content policy or safety guidelines, so rephrasing it can sometimes address the issue.",
model_or_agent_name
@@ -5728,19 +5675,6 @@ fn loading_contents_spinner(size: IconSize) -> AnyElement {
.into_any_element()
}
fn placeholder_text(agent_name: &str, has_commands: bool) -> String {
if agent_name == "Zed Agent" {
format!("Message the {} — @ to include context", agent_name)
} else if has_commands {
format!(
"Message {} — @ to include context, / for commands",
agent_name
)
} else {
format!("Message {} — @ to include context", agent_name)
}
}
impl Focusable for AcpThreadView {
fn focus_handle(&self, cx: &App) -> FocusHandle {
match self.thread_state {
@@ -6027,6 +5961,7 @@ pub(crate) mod tests {
use acp_thread::StubAgentConnection;
use agent_client_protocol::SessionId;
use assistant_text_thread::TextThreadStore;
use editor::EditorSettings;
use fs::FakeFs;
use gpui::{EventEmitter, SemanticVersion, TestAppContext, VisualTestContext};
use project::Project;
@@ -6414,10 +6349,6 @@ pub(crate) mod tests {
struct SaboteurAgentConnection;
impl AgentConnection for SaboteurAgentConnection {
fn telemetry_id(&self) -> &'static str {
"saboteur"
}
fn new_thread(
self: Rc<Self>,
project: Entity<Project>,
@@ -6478,10 +6409,6 @@ pub(crate) mod tests {
struct RefusalAgentConnection;
impl AgentConnection for RefusalAgentConnection {
fn telemetry_id(&self) -> &'static str {
"refusal"
}
fn new_thread(
self: Rc<Self>,
project: Entity<Project>,
@@ -6544,8 +6471,13 @@ pub(crate) mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
AgentSettings::register(cx);
workspace::init_settings(cx);
theme::init(theme::LoadThemes::JustBase, cx);
release_channel::init(SemanticVersion::default(), cx);
EditorSettings::register(cx);
prompt_store::init(cx)
});
}

View File

@@ -8,7 +8,6 @@ use std::{ops::Range, sync::Arc};
use agent::ContextServerRegistry;
use anyhow::Result;
use client::zed_urls;
use cloud_llm_client::{Plan, PlanV1, PlanV2};
use collections::HashMap;
use context_server::ContextServerId;
@@ -27,20 +26,18 @@ use language_model::{
use language_models::AllLanguageModelSettings;
use notifications::status_toast::{StatusToast, ToastIcon};
use project::{
agent_server_store::{
AgentServerStore, CLAUDE_CODE_NAME, CODEX_NAME, ExternalAgentServerName, GEMINI_NAME,
},
agent_server_store::{AgentServerStore, CLAUDE_CODE_NAME, CODEX_NAME, GEMINI_NAME},
context_server_store::{ContextServerConfiguration, ContextServerStatus, ContextServerStore},
};
use settings::{Settings, SettingsStore, update_settings_file};
use ui::{
Button, ButtonStyle, Chip, CommonAnimationExt, ContextMenu, ContextMenuEntry, Disclosure,
Divider, DividerColor, ElevationIndex, IconName, IconPosition, IconSize, Indicator, LabelSize,
PopoverMenu, Switch, SwitchColor, Tooltip, WithScrollbar, prelude::*,
Button, ButtonStyle, Chip, CommonAnimationExt, ContextMenu, Disclosure, Divider, DividerColor,
ElevationIndex, IconName, IconPosition, IconSize, Indicator, LabelSize, PopoverMenu, Switch,
SwitchColor, Tooltip, WithScrollbar, prelude::*,
};
use util::ResultExt as _;
use workspace::{Workspace, create_and_open_local_file};
use zed_actions::{ExtensionCategoryFilter, OpenBrowser};
use zed_actions::ExtensionCategoryFilter;
pub(crate) use configure_context_server_modal::ConfigureContextServerModal;
pub(crate) use configure_context_server_tools_modal::ConfigureContextServerToolsModal;
@@ -418,7 +415,6 @@ impl AgentConfiguration {
cx: &mut Context<Self>,
) -> impl IntoElement {
let providers = LanguageModelRegistry::read_global(cx).providers();
let popover_menu = PopoverMenu::new("add-provider-popover")
.trigger(
Button::new("add-provider", "Add Provider")
@@ -429,6 +425,7 @@ impl AgentConfiguration {
.icon_color(Color::Muted)
.label_size(LabelSize::Small),
)
.anchor(gpui::Corner::TopRight)
.menu({
let workspace = self.workspace.clone();
move |window, cx| {
@@ -450,11 +447,6 @@ impl AgentConfiguration {
})
}))
}
})
.anchor(gpui::Corner::TopRight)
.offset(gpui::Point {
x: px(0.0),
y: px(2.0),
});
v_flex()
@@ -549,6 +541,7 @@ impl AgentConfiguration {
.icon_color(Color::Muted)
.label_size(LabelSize::Small),
)
.anchor(gpui::Corner::TopRight)
.menu({
move |window, cx| {
Some(ContextMenu::build(window, cx, |menu, _window, _cx| {
@@ -571,11 +564,6 @@ impl AgentConfiguration {
})
}))
}
})
.anchor(gpui::Corner::TopRight)
.offset(gpui::Point {
x: px(0.0),
y: px(2.0),
});
v_flex()
@@ -650,13 +638,15 @@ impl AgentConfiguration {
let is_running = matches!(server_status, ContextServerStatus::Running);
let item_id = SharedString::from(context_server_id.0.clone());
// Servers without a configuration can only be provided by extensions.
let provided_by_extension = server_configuration.is_none_or(|config| {
matches!(
config.as_ref(),
ContextServerConfiguration::Extension { .. }
)
});
let is_from_extension = server_configuration
.as_ref()
.map(|config| {
matches!(
config.as_ref(),
ContextServerConfiguration::Extension { .. }
)
})
.unwrap_or(false);
let error = if let ContextServerStatus::Error(error) = server_status.clone() {
Some(error)
@@ -670,7 +660,7 @@ impl AgentConfiguration {
.tools_for_server(&context_server_id)
.count();
let (source_icon, source_tooltip) = if provided_by_extension {
let (source_icon, source_tooltip) = if is_from_extension {
(
IconName::ZedSrcExtension,
"This MCP server was installed from an extension.",
@@ -720,6 +710,7 @@ impl AgentConfiguration {
let fs = self.fs.clone();
let context_server_id = context_server_id.clone();
let language_registry = self.language_registry.clone();
let context_server_store = self.context_server_store.clone();
let workspace = self.workspace.clone();
let context_server_registry = self.context_server_registry.clone();
@@ -761,10 +752,23 @@ impl AgentConfiguration {
.entry("Uninstall", None, {
let fs = fs.clone();
let context_server_id = context_server_id.clone();
let context_server_store = context_server_store.clone();
let workspace = workspace.clone();
move |_, cx| {
let is_provided_by_extension = context_server_store
.read(cx)
.configuration_for_server(&context_server_id)
.as_ref()
.map(|config| {
matches!(
config.as_ref(),
ContextServerConfiguration::Extension { .. }
)
})
.unwrap_or(false);
let uninstall_extension_task = match (
provided_by_extension,
is_provided_by_extension,
resolve_extension_for_context_server(&context_server_id, cx),
) {
(true, Some((id, manifest))) => {
@@ -955,7 +959,7 @@ impl AgentConfiguration {
.cloned()
.collect::<Vec<_>>();
let user_defined_agents: Vec<_> = user_defined_agents
let user_defined_agents = user_defined_agents
.into_iter()
.map(|name| {
let icon = if let Some(icon_path) = agent_server_store.agent_icon(&name) {
@@ -963,93 +967,27 @@ impl AgentConfiguration {
} else {
AgentIcon::Name(IconName::Ai)
};
(name, icon)
self.render_agent_server(icon, name, true)
.into_any_element()
})
.collect();
.collect::<Vec<_>>();
let add_agent_popover = PopoverMenu::new("add-agent-server-popover")
.trigger(
Button::new("add-agent", "Add Agent")
.style(ButtonStyle::Outlined)
.icon_position(IconPosition::Start)
.icon(IconName::Plus)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.label_size(LabelSize::Small),
)
.menu({
move |window, cx| {
Some(ContextMenu::build(window, cx, |menu, _window, _cx| {
menu.entry("Install from Extensions", None, {
|window, cx| {
window.dispatch_action(
zed_actions::Extensions {
category_filter: Some(
ExtensionCategoryFilter::AgentServers,
),
id: None,
}
.boxed_clone(),
cx,
)
}
let add_agents_button = Button::new("add-agent", "Add Agent")
.style(ButtonStyle::Outlined)
.icon_position(IconPosition::Start)
.icon(IconName::Plus)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.label_size(LabelSize::Small)
.on_click(move |_, window, cx| {
if let Some(workspace) = window.root().flatten() {
let workspace = workspace.downgrade();
window
.spawn(cx, async |cx| {
open_new_agent_servers_entry_in_settings_editor(workspace, cx).await
})
.entry("Add Custom Agent", None, {
move |window, cx| {
if let Some(workspace) = window.root().flatten() {
let workspace = workspace.downgrade();
window
.spawn(cx, async |cx| {
open_new_agent_servers_entry_in_settings_editor(
workspace, cx,
)
.await
})
.detach_and_log_err(cx);
}
}
})
.separator()
.header("Learn More")
.item(
ContextMenuEntry::new("Agent Servers Docs")
.icon(IconName::ArrowUpRight)
.icon_color(Color::Muted)
.icon_position(IconPosition::End)
.handler({
move |window, cx| {
window.dispatch_action(
Box::new(OpenBrowser {
url: zed_urls::agent_server_docs(cx),
}),
cx,
);
}
}),
)
.item(
ContextMenuEntry::new("ACP Docs")
.icon(IconName::ArrowUpRight)
.icon_color(Color::Muted)
.icon_position(IconPosition::End)
.handler({
move |window, cx| {
window.dispatch_action(
Box::new(OpenBrowser {
url: "https://agentclientprotocol.com/".into(),
}),
cx,
);
}
}),
)
}))
.detach_and_log_err(cx);
}
})
.anchor(gpui::Corner::TopRight)
.offset(gpui::Point {
x: px(0.0),
y: px(2.0),
});
v_flex()
@@ -1060,7 +998,7 @@ impl AgentConfiguration {
.child(self.render_section_title(
"External Agents",
"All agents connected through the Agent Client Protocol.",
add_agent_popover.into_any_element(),
add_agents_button.into_any_element(),
))
.child(
v_flex()
@@ -1071,29 +1009,26 @@ impl AgentConfiguration {
AgentIcon::Name(IconName::AiClaude),
"Claude Code",
false,
cx,
))
.child(Divider::horizontal().color(DividerColor::BorderFaded))
.child(self.render_agent_server(
AgentIcon::Name(IconName::AiOpenAi),
"Codex CLI",
false,
cx,
))
.child(Divider::horizontal().color(DividerColor::BorderFaded))
.child(self.render_agent_server(
AgentIcon::Name(IconName::AiGemini),
"Gemini CLI",
false,
cx,
))
.map(|mut parent| {
for (name, icon) in user_defined_agents {
for agent in user_defined_agents {
parent = parent
.child(
Divider::horizontal().color(DividerColor::BorderFaded),
)
.child(self.render_agent_server(icon, name, true, cx));
.child(agent);
}
parent
}),
@@ -1106,14 +1041,13 @@ impl AgentConfiguration {
icon: AgentIcon,
name: impl Into<SharedString>,
external: bool,
cx: &mut Context<Self>,
) -> impl IntoElement {
let name = name.into();
let icon = match icon {
AgentIcon::Name(icon_name) => Icon::new(icon_name)
.size(IconSize::Small)
.color(Color::Muted),
AgentIcon::Path(icon_path) => Icon::from_external_svg(icon_path)
AgentIcon::Path(icon_path) => Icon::from_path(icon_path)
.size(IconSize::Small)
.color(Color::Muted),
};
@@ -1121,53 +1055,28 @@ impl AgentConfiguration {
let tooltip_id = SharedString::new(format!("agent-source-{}", name));
let tooltip_message = format!("The {} agent was installed from an extension.", name);
let agent_server_name = ExternalAgentServerName(name.clone());
let uninstall_btn_id = SharedString::from(format!("uninstall-{}", name));
let uninstall_button = IconButton::new(uninstall_btn_id, IconName::Trash)
.icon_color(Color::Muted)
.icon_size(IconSize::Small)
.tooltip(Tooltip::text("Uninstall Agent Extension"))
.on_click(cx.listener(move |this, _, _window, cx| {
let agent_name = agent_server_name.clone();
if let Some(ext_id) = this.agent_server_store.update(cx, |store, _cx| {
store.get_extension_id_for_agent(&agent_name)
}) {
ExtensionStore::global(cx)
.update(cx, |store, cx| store.uninstall_extension(ext_id, cx))
.detach_and_log_err(cx);
}
}));
h_flex()
.gap_1()
.justify_between()
.gap_1p5()
.child(icon)
.child(Label::new(name))
.when(external, |this| {
this.child(
div()
.id(tooltip_id)
.flex_none()
.tooltip(Tooltip::text(tooltip_message))
.child(
Icon::new(IconName::ZedSrcExtension)
.size(IconSize::Small)
.color(Color::Muted),
),
)
})
.child(
h_flex()
.gap_1p5()
.child(icon)
.child(Label::new(name))
.when(external, |this| {
this.child(
div()
.id(tooltip_id)
.flex_none()
.tooltip(Tooltip::text(tooltip_message))
.child(
Icon::new(IconName::ZedSrcExtension)
.size(IconSize::Small)
.color(Color::Muted),
),
)
})
.child(
Icon::new(IconName::Check)
.color(Color::Success)
.size(IconSize::Small),
),
Icon::new(IconName::Check)
.color(Color::Success)
.size(IconSize::Small),
)
.when(external, |this| this.child(uninstall_button))
}
}
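
One note on the extension check introduced above: the `.map(...).unwrap_or(false)` chain is behaviorally identical to `Option::is_some_and` from the standard library. A hedged alternative, shown only for comparison (the diff itself keeps the `map`/`unwrap_or` form):
// Equivalent to the is_from_extension computation above; illustration only.
let is_from_extension = server_configuration
    .as_ref()
    .is_some_and(|config| matches!(config.as_ref(), ContextServerConfiguration::Extension { .. }));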

View File

@@ -515,14 +515,16 @@ impl Render for AddLlmProviderModal {
#[cfg(test)]
mod tests {
use super::*;
use editor::EditorSettings;
use fs::FakeFs;
use gpui::{TestAppContext, VisualTestContext};
use language::language_settings;
use language_model::{
LanguageModelProviderId, LanguageModelProviderName,
fake_provider::FakeLanguageModelProvider,
};
use project::Project;
use settings::SettingsStore;
use settings::{Settings as _, SettingsStore};
use util::path;
#[gpui::test]
@@ -728,9 +730,13 @@ mod tests {
cx.update(|cx| {
let store = SettingsStore::test(cx);
cx.set_global(store);
workspace::init_settings(cx);
Project::init_settings(cx);
theme::init(theme::LoadThemes::JustBase, cx);
language_settings::init(cx);
EditorSettings::register(cx);
language_model::init_settings(cx);
language_models::init_settings(cx);
});
let fs = FakeFs::new(cx.executor());

View File

@@ -7,10 +7,8 @@ use agent_settings::{AgentProfile, AgentProfileId, AgentSettings, builtin_profil
use editor::Editor;
use fs::Fs;
use gpui::{DismissEvent, Entity, EventEmitter, FocusHandle, Focusable, Subscription, prelude::*};
use language_model::{LanguageModel, LanguageModelRegistry};
use settings::{
LanguageModelProviderSetting, LanguageModelSelection, Settings as _, update_settings_file,
};
use language_model::LanguageModel;
use settings::Settings as _;
use ui::{
KeyBinding, ListItem, ListItemSpacing, ListSeparator, Navigable, NavigableEntry, prelude::*,
};
@@ -18,7 +16,6 @@ use workspace::{ModalView, Workspace};
use crate::agent_configuration::manage_profiles_modal::profile_modal_header::ProfileModalHeader;
use crate::agent_configuration::tool_picker::{ToolPicker, ToolPickerDelegate};
use crate::language_model_selector::{LanguageModelSelector, language_model_selector};
use crate::{AgentPanel, ManageProfiles};
enum Mode {
@@ -35,11 +32,6 @@ enum Mode {
tool_picker: Entity<ToolPicker>,
_subscription: Subscription,
},
ConfigureDefaultModel {
profile_id: AgentProfileId,
model_picker: Entity<LanguageModelSelector>,
_subscription: Subscription,
},
}
impl Mode {
@@ -91,7 +83,6 @@ pub struct ChooseProfileMode {
pub struct ViewProfileMode {
profile_id: AgentProfileId,
fork_profile: NavigableEntry,
configure_default_model: NavigableEntry,
configure_tools: NavigableEntry,
configure_mcps: NavigableEntry,
cancel_item: NavigableEntry,
@@ -189,7 +180,6 @@ impl ManageProfilesModal {
self.mode = Mode::ViewProfile(ViewProfileMode {
profile_id,
fork_profile: NavigableEntry::focusable(cx),
configure_default_model: NavigableEntry::focusable(cx),
configure_tools: NavigableEntry::focusable(cx),
configure_mcps: NavigableEntry::focusable(cx),
cancel_item: NavigableEntry::focusable(cx),
@@ -197,83 +187,6 @@ impl ManageProfilesModal {
self.focus_handle(cx).focus(window);
}
fn configure_default_model(
&mut self,
profile_id: AgentProfileId,
window: &mut Window,
cx: &mut Context<Self>,
) {
let fs = self.fs.clone();
let profile_id_for_closure = profile_id.clone();
let model_picker = cx.new(|cx| {
let fs = fs.clone();
let profile_id = profile_id_for_closure.clone();
language_model_selector(
{
let profile_id = profile_id.clone();
move |cx| {
let settings = AgentSettings::get_global(cx);
settings
.profiles
.get(&profile_id)
.and_then(|profile| profile.default_model.as_ref())
.and_then(|selection| {
let registry = LanguageModelRegistry::read_global(cx);
let provider_id = language_model::LanguageModelProviderId(
gpui::SharedString::from(selection.provider.0.clone()),
);
let provider = registry.provider(&provider_id)?;
let model = provider
.provided_models(cx)
.iter()
.find(|m| m.id().0 == selection.model.as_str())?
.clone();
Some(language_model::ConfiguredModel { provider, model })
})
}
},
move |model, cx| {
let provider = model.provider_id().0.to_string();
let model_id = model.id().0.to_string();
let profile_id = profile_id.clone();
update_settings_file(fs.clone(), cx, move |settings, _cx| {
let agent_settings = settings.agent.get_or_insert_default();
if let Some(profiles) = agent_settings.profiles.as_mut() {
if let Some(profile) = profiles.get_mut(profile_id.0.as_ref()) {
profile.default_model = Some(LanguageModelSelection {
provider: LanguageModelProviderSetting(provider.clone()),
model: model_id.clone(),
});
}
}
});
},
false, // Do not use popover styles for the model picker
window,
cx,
)
.modal(false)
});
let dismiss_subscription = cx.subscribe_in(&model_picker, window, {
let profile_id = profile_id.clone();
move |this, _picker, _: &DismissEvent, window, cx| {
this.view_profile(profile_id.clone(), window, cx);
}
});
self.mode = Mode::ConfigureDefaultModel {
profile_id,
model_picker,
_subscription: dismiss_subscription,
};
self.focus_handle(cx).focus(window);
}
fn configure_mcp_tools(
&mut self,
profile_id: AgentProfileId,
@@ -364,7 +277,6 @@ impl ManageProfilesModal {
Mode::ViewProfile(_) => {}
Mode::ConfigureTools { .. } => {}
Mode::ConfigureMcps { .. } => {}
Mode::ConfigureDefaultModel { .. } => {}
}
}
@@ -387,9 +299,6 @@ impl ManageProfilesModal {
Mode::ConfigureMcps { profile_id, .. } => {
self.view_profile(profile_id.clone(), window, cx)
}
Mode::ConfigureDefaultModel { profile_id, .. } => {
self.view_profile(profile_id.clone(), window, cx)
}
}
}
}
@@ -404,7 +313,6 @@ impl Focusable for ManageProfilesModal {
Mode::ViewProfile(_) => self.focus_handle.clone(),
Mode::ConfigureTools { tool_picker, .. } => tool_picker.focus_handle(cx),
Mode::ConfigureMcps { tool_picker, .. } => tool_picker.focus_handle(cx),
Mode::ConfigureDefaultModel { model_picker, .. } => model_picker.focus_handle(cx),
}
}
}
@@ -636,47 +544,6 @@ impl ManageProfilesModal {
}),
),
)
.child(
div()
.id("configure-default-model")
.track_focus(&mode.configure_default_model.focus_handle)
.on_action({
let profile_id = mode.profile_id.clone();
cx.listener(move |this, _: &menu::Confirm, window, cx| {
this.configure_default_model(
profile_id.clone(),
window,
cx,
);
})
})
.child(
ListItem::new("model-item")
.toggle_state(
mode.configure_default_model
.focus_handle
.contains_focused(window, cx),
)
.inset(true)
.spacing(ListItemSpacing::Sparse)
.start_slot(
Icon::new(IconName::ZedAssistant)
.size(IconSize::Small)
.color(Color::Muted),
)
.child(Label::new("Configure Default Model"))
.on_click({
let profile_id = mode.profile_id.clone();
cx.listener(move |this, _, window, cx| {
this.configure_default_model(
profile_id.clone(),
window,
cx,
);
})
}),
),
)
.child(
div()
.id("configure-builtin-tools")
@@ -801,7 +668,6 @@ impl ManageProfilesModal {
.into_any_element(),
)
.entry(mode.fork_profile)
.entry(mode.configure_default_model)
.entry(mode.configure_tools)
.entry(mode.configure_mcps)
.entry(mode.cancel_item)
@@ -887,29 +753,6 @@ impl Render for ManageProfilesModal {
.child(go_back_item)
.into_any_element()
}
Mode::ConfigureDefaultModel {
profile_id,
model_picker,
..
} => {
let profile_name = settings
.profiles
.get(profile_id)
.map(|profile| profile.name.clone())
.unwrap_or_else(|| "Unknown".into());
v_flex()
.pb_1()
.child(ProfileModalHeader::new(
format!("{profile_name} — Configure Default Model"),
Some(IconName::Ai),
))
.child(ListSeparator)
.child(v_flex().w(rems(34.)).child(model_picker.clone()))
.child(ListSeparator)
.child(go_back_item)
.into_any_element()
}
Mode::ConfigureMcps {
profile_id,
tool_picker,

View File

@@ -314,7 +314,6 @@ impl PickerDelegate for ToolPickerDelegate {
)
})
.collect(),
default_model: default_profile.default_model.clone(),
});
if let Some(server_id) = server_id {

View File

@@ -1,6 +1,6 @@
use crate::{Keep, KeepAll, OpenAgentDiff, Reject, RejectAll};
use acp_thread::{AcpThread, AcpThreadEvent};
use action_log::ActionLogTelemetry;
use action_log::ActionLog;
use agent_settings::AgentSettings;
use anyhow::Result;
use buffer_diff::DiffHunkStatus;
@@ -40,16 +40,79 @@ use zed_actions::assistant::ToggleFocus;
pub struct AgentDiffPane {
multibuffer: Entity<MultiBuffer>,
editor: Entity<Editor>,
thread: Entity<AcpThread>,
thread: AgentDiffThread,
focus_handle: FocusHandle,
workspace: WeakEntity<Workspace>,
title: SharedString,
_subscriptions: Vec<Subscription>,
}
#[derive(PartialEq, Eq, Clone)]
pub enum AgentDiffThread {
AcpThread(Entity<AcpThread>),
}
impl AgentDiffThread {
fn project(&self, cx: &App) -> Entity<Project> {
match self {
AgentDiffThread::AcpThread(thread) => thread.read(cx).project().clone(),
}
}
fn action_log(&self, cx: &App) -> Entity<ActionLog> {
match self {
AgentDiffThread::AcpThread(thread) => thread.read(cx).action_log().clone(),
}
}
fn title(&self, cx: &App) -> SharedString {
match self {
AgentDiffThread::AcpThread(thread) => thread.read(cx).title(),
}
}
fn has_pending_edit_tool_uses(&self, cx: &App) -> bool {
match self {
AgentDiffThread::AcpThread(thread) => thread.read(cx).has_pending_edit_tool_calls(),
}
}
fn downgrade(&self) -> WeakAgentDiffThread {
match self {
AgentDiffThread::AcpThread(thread) => {
WeakAgentDiffThread::AcpThread(thread.downgrade())
}
}
}
}
impl From<Entity<AcpThread>> for AgentDiffThread {
fn from(entity: Entity<AcpThread>) -> Self {
AgentDiffThread::AcpThread(entity)
}
}
#[derive(PartialEq, Eq, Clone)]
pub enum WeakAgentDiffThread {
AcpThread(WeakEntity<AcpThread>),
}
impl WeakAgentDiffThread {
pub fn upgrade(&self) -> Option<AgentDiffThread> {
match self {
WeakAgentDiffThread::AcpThread(weak) => weak.upgrade().map(AgentDiffThread::AcpThread),
}
}
}
impl From<WeakEntity<AcpThread>> for WeakAgentDiffThread {
fn from(entity: WeakEntity<AcpThread>) -> Self {
WeakAgentDiffThread::AcpThread(entity)
}
}
impl AgentDiffPane {
pub fn deploy(
thread: Entity<AcpThread>,
thread: impl Into<AgentDiffThread>,
workspace: WeakEntity<Workspace>,
window: &mut Window,
cx: &mut App,
@@ -60,11 +123,12 @@ impl AgentDiffPane {
}
pub fn deploy_in_workspace(
thread: Entity<AcpThread>,
thread: impl Into<AgentDiffThread>,
workspace: &mut Workspace,
window: &mut Window,
cx: &mut Context<Workspace>,
) -> Entity<Self> {
let thread = thread.into();
let existing_diff = workspace
.items_of_type::<AgentDiffPane>(cx)
.find(|diff| diff.read(cx).thread == thread);
@@ -81,7 +145,7 @@ impl AgentDiffPane {
}
pub fn new(
thread: Entity<AcpThread>,
thread: AgentDiffThread,
workspace: WeakEntity<Workspace>,
window: &mut Window,
cx: &mut Context<Self>,
@@ -89,7 +153,7 @@ impl AgentDiffPane {
let focus_handle = cx.focus_handle();
let multibuffer = cx.new(|_| MultiBuffer::new(Capability::ReadWrite));
let project = thread.read(cx).project().clone();
let project = thread.project(cx);
let editor = cx.new(|cx| {
let mut editor =
Editor::for_multibuffer(multibuffer.clone(), Some(project.clone()), window, cx);
@@ -100,16 +164,19 @@ impl AgentDiffPane {
editor
});
let action_log = thread.read(cx).action_log().clone();
let action_log = thread.action_log(cx);
let mut this = Self {
_subscriptions: vec![
cx.observe_in(&action_log, window, |this, _action_log, window, cx| {
this.update_excerpts(window, cx)
}),
cx.subscribe(&thread, |this, _thread, event, cx| {
this.handle_acp_thread_event(event, cx)
}),
match &thread {
AgentDiffThread::AcpThread(thread) => cx
.subscribe(thread, |this, _thread, event, cx| {
this.handle_acp_thread_event(event, cx)
}),
},
],
title: SharedString::default(),
multibuffer,
@@ -124,12 +191,7 @@ impl AgentDiffPane {
}
fn update_excerpts(&mut self, window: &mut Window, cx: &mut Context<Self>) {
let changed_buffers = self
.thread
.read(cx)
.action_log()
.read(cx)
.changed_buffers(cx);
let changed_buffers = self.thread.action_log(cx).read(cx).changed_buffers(cx);
let mut paths_to_delete = self.multibuffer.read(cx).paths().collect::<HashSet<_>>();
for (buffer, diff_handle) in changed_buffers {
@@ -216,7 +278,7 @@ impl AgentDiffPane {
}
fn update_title(&mut self, cx: &mut Context<Self>) {
let new_title = self.thread.read(cx).title();
let new_title = self.thread.title(cx);
if new_title != self.title {
self.title = new_title;
cx.emit(EditorEvent::TitleChanged);
@@ -278,18 +340,16 @@ impl AgentDiffPane {
}
fn keep_all(&mut self, _: &KeepAll, _window: &mut Window, cx: &mut Context<Self>) {
let telemetry = ActionLogTelemetry::from(self.thread.read(cx));
let action_log = self.thread.read(cx).action_log().clone();
action_log.update(cx, |action_log, cx| {
action_log.keep_all_edits(Some(telemetry), cx)
});
self.thread
.action_log(cx)
.update(cx, |action_log, cx| action_log.keep_all_edits(cx))
}
}
fn keep_edits_in_selection(
editor: &mut Editor,
buffer_snapshot: &MultiBufferSnapshot,
thread: &Entity<AcpThread>,
thread: &AgentDiffThread,
window: &mut Window,
cx: &mut Context<Editor>,
) {
@@ -304,7 +364,7 @@ fn keep_edits_in_selection(
fn reject_edits_in_selection(
editor: &mut Editor,
buffer_snapshot: &MultiBufferSnapshot,
thread: &Entity<AcpThread>,
thread: &AgentDiffThread,
window: &mut Window,
cx: &mut Context<Editor>,
) {
@@ -318,7 +378,7 @@ fn reject_edits_in_selection(
fn keep_edits_in_ranges(
editor: &mut Editor,
buffer_snapshot: &MultiBufferSnapshot,
thread: &Entity<AcpThread>,
thread: &AgentDiffThread,
ranges: Vec<Range<editor::Anchor>>,
window: &mut Window,
cx: &mut Context<Editor>,
@@ -333,15 +393,8 @@ fn keep_edits_in_ranges(
for hunk in &diff_hunks_in_ranges {
let buffer = multibuffer.read(cx).buffer(hunk.buffer_id);
if let Some(buffer) = buffer {
let action_log = thread.read(cx).action_log().clone();
let telemetry = ActionLogTelemetry::from(thread.read(cx));
action_log.update(cx, |action_log, cx| {
action_log.keep_edits_in_range(
buffer,
hunk.buffer_range.clone(),
Some(telemetry),
cx,
)
thread.action_log(cx).update(cx, |action_log, cx| {
action_log.keep_edits_in_range(buffer, hunk.buffer_range.clone(), cx)
});
}
}
@@ -350,7 +403,7 @@ fn keep_edits_in_ranges(
fn reject_edits_in_ranges(
editor: &mut Editor,
buffer_snapshot: &MultiBufferSnapshot,
thread: &Entity<AcpThread>,
thread: &AgentDiffThread,
ranges: Vec<Range<editor::Anchor>>,
window: &mut Window,
cx: &mut Context<Editor>,
@@ -374,12 +427,11 @@ fn reject_edits_in_ranges(
}
}
let action_log = thread.read(cx).action_log().clone();
let telemetry = ActionLogTelemetry::from(thread.read(cx));
for (buffer, ranges) in ranges_by_buffer {
action_log
thread
.action_log(cx)
.update(cx, |action_log, cx| {
action_log.reject_edits_in_ranges(buffer, ranges, Some(telemetry.clone()), cx)
action_log.reject_edits_in_ranges(buffer, ranges, cx)
})
.detach_and_log_err(cx);
}
@@ -479,7 +531,7 @@ impl Item for AgentDiffPane {
}
fn tab_content(&self, params: TabContentParams, _window: &Window, cx: &App) -> AnyElement {
let title = self.thread.read(cx).title();
let title = self.thread.title(cx);
Label::new(format!("Review: {}", title))
.color(if params.selected {
Color::Default
@@ -660,7 +712,7 @@ impl Render for AgentDiffPane {
}
}
fn diff_hunk_controls(thread: &Entity<AcpThread>) -> editor::RenderDiffHunkControlsFn {
fn diff_hunk_controls(thread: &AgentDiffThread) -> editor::RenderDiffHunkControlsFn {
let thread = thread.clone();
Arc::new(
@@ -687,7 +739,7 @@ fn render_diff_hunk_controls(
hunk_range: Range<editor::Anchor>,
is_created_file: bool,
line_height: Pixels,
thread: &Entity<AcpThread>,
thread: &AgentDiffThread,
editor: &Entity<Editor>,
cx: &mut App,
) -> AnyElement {
@@ -1101,11 +1153,8 @@ impl Render for AgentDiffToolbar {
return Empty.into_any();
};
let has_pending_edit_tool_use = agent_diff
.read(cx)
.thread
.read(cx)
.has_pending_edit_tool_calls();
let has_pending_edit_tool_use =
agent_diff.read(cx).thread.has_pending_edit_tool_uses(cx);
if has_pending_edit_tool_use {
return div().px_2().child(spinner_icon).into_any();
@@ -1165,7 +1214,7 @@ pub enum EditorState {
}
struct WorkspaceThread {
thread: WeakEntity<AcpThread>,
thread: WeakAgentDiffThread,
_thread_subscriptions: (Subscription, Subscription),
singleton_editors: HashMap<WeakEntity<Buffer>, HashMap<WeakEntity<Editor>, Subscription>>,
_settings_subscription: Subscription,
@@ -1190,23 +1239,23 @@ impl AgentDiff {
pub fn set_active_thread(
workspace: &WeakEntity<Workspace>,
thread: Entity<AcpThread>,
thread: impl Into<AgentDiffThread>,
window: &mut Window,
cx: &mut App,
) {
Self::global(cx).update(cx, |this, cx| {
this.register_active_thread_impl(workspace, thread, window, cx);
this.register_active_thread_impl(workspace, thread.into(), window, cx);
});
}
fn register_active_thread_impl(
&mut self,
workspace: &WeakEntity<Workspace>,
thread: Entity<AcpThread>,
thread: AgentDiffThread,
window: &mut Window,
cx: &mut Context<Self>,
) {
let action_log = thread.read(cx).action_log().clone();
let action_log = thread.action_log(cx);
let action_log_subscription = cx.observe_in(&action_log, window, {
let workspace = workspace.clone();
@@ -1215,12 +1264,14 @@ impl AgentDiff {
}
});
let thread_subscription = cx.subscribe_in(&thread, window, {
let workspace = workspace.clone();
move |this, thread, event, window, cx| {
this.handle_acp_thread_event(&workspace, thread, event, window, cx)
}
});
let thread_subscription = match &thread {
AgentDiffThread::AcpThread(thread) => cx.subscribe_in(thread, window, {
let workspace = workspace.clone();
move |this, thread, event, window, cx| {
this.handle_acp_thread_event(&workspace, thread, event, window, cx)
}
}),
};
if let Some(workspace_thread) = self.workspace_threads.get_mut(workspace) {
// replace thread and action log subscription, but keep editors
@@ -1297,7 +1348,7 @@ impl AgentDiff {
fn register_review_action<T: Action>(
workspace: &mut Workspace,
review: impl Fn(&Entity<Editor>, &Entity<AcpThread>, &mut Window, &mut App) -> PostReviewState
review: impl Fn(&Entity<Editor>, &AgentDiffThread, &mut Window, &mut App) -> PostReviewState
+ 'static,
this: &Entity<AgentDiff>,
) {
@@ -1457,7 +1508,7 @@ impl AgentDiff {
return;
};
let action_log = thread.read(cx).action_log();
let action_log = thread.action_log(cx);
let changed_buffers = action_log.read(cx).changed_buffers(cx);
let mut unaffected = self.reviewing_editors.clone();
@@ -1576,7 +1627,7 @@ impl AgentDiff {
fn keep_all(
editor: &Entity<Editor>,
thread: &Entity<AcpThread>,
thread: &AgentDiffThread,
window: &mut Window,
cx: &mut App,
) -> PostReviewState {
@@ -1596,7 +1647,7 @@ impl AgentDiff {
fn reject_all(
editor: &Entity<Editor>,
thread: &Entity<AcpThread>,
thread: &AgentDiffThread,
window: &mut Window,
cx: &mut App,
) -> PostReviewState {
@@ -1616,7 +1667,7 @@ impl AgentDiff {
fn keep(
editor: &Entity<Editor>,
thread: &Entity<AcpThread>,
thread: &AgentDiffThread,
window: &mut Window,
cx: &mut App,
) -> PostReviewState {
@@ -1629,7 +1680,7 @@ impl AgentDiff {
fn reject(
editor: &Entity<Editor>,
thread: &Entity<AcpThread>,
thread: &AgentDiffThread,
window: &mut Window,
cx: &mut App,
) -> PostReviewState {
@@ -1652,7 +1703,7 @@ impl AgentDiff {
fn review_in_active_editor(
&mut self,
workspace: &mut Workspace,
review: impl Fn(&Entity<Editor>, &Entity<AcpThread>, &mut Window, &mut App) -> PostReviewState,
review: impl Fn(&Entity<Editor>, &AgentDiffThread, &mut Window, &mut App) -> PostReviewState,
window: &mut Window,
cx: &mut Context<Self>,
) -> Option<Task<Result<()>>> {
@@ -1674,7 +1725,7 @@ impl AgentDiff {
if let PostReviewState::AllReviewed = review(&editor, &thread, window, cx)
&& let Some(curr_buffer) = editor.read(cx).buffer().read(cx).as_singleton()
{
let changed_buffers = thread.read(cx).action_log().read(cx).changed_buffers(cx);
let changed_buffers = thread.action_log(cx).read(cx).changed_buffers(cx);
let mut keys = changed_buffers.keys().cycle();
keys.find(|k| *k == &curr_buffer);
@@ -1717,11 +1768,12 @@ mod tests {
use super::*;
use crate::Keep;
use acp_thread::AgentConnection as _;
use agent_settings::AgentSettings;
use editor::EditorSettings;
use gpui::{TestAppContext, UpdateGlobal, VisualTestContext};
use project::{FakeFs, Project};
use serde_json::json;
use settings::SettingsStore;
use settings::{Settings, SettingsStore};
use std::{path::Path, rc::Rc};
use util::path;
@@ -1730,8 +1782,13 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
AgentSettings::register(cx);
prompt_store::init(cx);
workspace::init_settings(cx);
theme::init(theme::LoadThemes::JustBase, cx);
EditorSettings::register(cx);
language_model::init_settings(cx);
});
@@ -1758,7 +1815,8 @@ mod tests {
.await
.unwrap();
let action_log = cx.read(|cx| thread.read(cx).action_log().clone());
let thread = AgentDiffThread::AcpThread(thread);
let action_log = cx.read(|cx| thread.action_log(cx));
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
@@ -1884,8 +1942,13 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
AgentSettings::register(cx);
prompt_store::init(cx);
workspace::init_settings(cx);
theme::init(theme::LoadThemes::JustBase, cx);
EditorSettings::register(cx);
language_model::init_settings(cx);
workspace::register_project_item::<Editor>(cx);
});
@@ -1941,6 +2004,7 @@ mod tests {
let action_log = thread.read_with(cx, |thread, _| thread.action_log().clone());
// Set the active thread
let thread = AgentDiffThread::AcpThread(thread);
cx.update(|window, cx| {
AgentDiff::set_active_thread(&workspace.downgrade(), thread.clone(), window, cx)
});
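
A hedged usage sketch for the `AgentDiffThread` wrapper introduced in this file (illustrative; `acp_thread` stands for an `Entity<AcpThread>` and `cx` for an `&App`). The `From` impl lets call sites keep passing the concrete entity while the pane and toolbar go through the enum's accessors:
// Illustrative only: wrap a concrete thread and use the enum's accessors.
let thread: AgentDiffThread = acp_thread.into();
let action_log = thread.action_log(cx); // Entity<ActionLog>
let title = thread.title(cx);           // SharedString
let weak: WeakAgentDiffThread = thread.downgrade();
assert!(weak.upgrade().is_some());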

View File

@@ -47,7 +47,6 @@ impl AgentModelSelector {
}
}
},
true, // Use popover styles for picker
window,
cx,
)

View File

@@ -16,7 +16,7 @@ use serde::{Deserialize, Serialize};
use settings::{
DefaultAgentView as DefaultView, LanguageModelProviderSetting, LanguageModelSelection,
};
use zed_actions::OpenBrowser;
use zed_actions::agent::{OpenClaudeCodeOnboardingModal, ReauthenticateAgent};
use crate::ui::{AcpOnboardingModal, ClaudeCodeOnboardingModal};
@@ -2131,20 +2131,12 @@ impl AgentPanel {
menu
})
.separator()
.item(
ContextMenuEntry::new("Add More Agents")
.icon(IconName::Plus)
.icon_color(Color::Muted)
.handler({
move |window, cx| {
window.dispatch_action(Box::new(zed_actions::Extensions {
category_filter: Some(
zed_actions::ExtensionCategoryFilter::AgentServers,
),
id: None,
}), cx)
}
}),
.link(
"Add Other Agents",
OpenBrowser {
url: zed_urls::external_agents_docs(cx),
}
.boxed_clone(),
)
}))
}

View File

@@ -12,6 +12,7 @@ mod context_strip;
mod inline_assistant;
mod inline_prompt_editor;
mod language_model_selector;
mod message_editor;
mod profile_selector;
mod slash_command;
mod slash_command_picker;
@@ -247,6 +248,8 @@ pub fn init(
is_eval: bool,
cx: &mut App,
) {
AgentSettings::register(cx);
assistant_text_thread::init(client.clone(), cx);
rules_library::init(cx);
if !is_eval {

View File

@@ -1082,7 +1082,10 @@ mod tests {
};
use gpui::TestAppContext;
use indoc::indoc;
use language::{Buffer, Language, LanguageConfig, LanguageMatcher, Point, tree_sitter_rust};
use language::{
Buffer, Language, LanguageConfig, LanguageMatcher, Point, language_settings,
tree_sitter_rust,
};
use language_model::{LanguageModelRegistry, TokenUsage};
use rand::prelude::*;
use settings::SettingsStore;
@@ -1462,6 +1465,8 @@ mod tests {
fn init_test(cx: &mut TestAppContext) {
cx.update(LanguageModelRegistry::test);
cx.set_global(cx.update(SettingsStore::test));
cx.update(Project::init_settings);
cx.update(language_settings::init);
}
fn simulate_response_stream(

View File

@@ -1075,6 +1075,8 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
});
}
@@ -1089,7 +1091,7 @@ mod tests {
}
#[gpui::test]
async fn test_large_file_uses_fallback(cx: &mut TestAppContext) {
async fn test_large_file_uses_outline(cx: &mut TestAppContext) {
init_test_settings(cx);
// Create a large file that exceeds AUTO_OUTLINE_SIZE
@@ -1101,16 +1103,16 @@ mod tests {
let file_context = load_context_for("file.txt", large_content, cx).await;
// Should contain some of the actual file content
assert!(
file_context.text.contains(LINE),
"Should contain some of the file content"
file_context
.text
.contains(&format!("# File outline for {}", path!("test/file.txt"))),
"Large files should not get an outline"
);
// Should be much smaller than original
assert!(
file_context.text.len() < content_len / 10,
"Should be significantly smaller than original content"
file_context.text.len() < content_len,
"Outline should be smaller than original content"
);
}

View File

@@ -42,7 +42,7 @@ use super::{
ContextPickerAction, ContextPickerEntry, ContextPickerMode, MentionLink, RecentEntry,
available_context_picker_entries, recent_context_picker_entries_with_store, selection_ranges,
};
use crate::inline_prompt_editor::ContextCreasesAddon;
use crate::message_editor::ContextCreasesAddon;
pub(crate) enum Match {
File(FileMatch),
@@ -278,8 +278,6 @@ impl ContextPickerCompletionProvider {
icon_path: Some(mode.icon().path().into()),
documentation: None,
source: project::CompletionSource::Custom,
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
// This ensures that when a user accepts this completion, the
// completion menu will still be shown after "@category " is
@@ -388,8 +386,6 @@ impl ContextPickerCompletionProvider {
icon_path: Some(action.icon().path().into()),
documentation: None,
source: project::CompletionSource::Custom,
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
// This ensures that when a user accepts this completion, the
// completion menu will still be shown after "@category " is
@@ -421,8 +417,6 @@ impl ContextPickerCompletionProvider {
replace_range: source_range.clone(),
new_text,
label: CodeLabel::plain(thread_entry.title().to_string(), None),
match_start: None,
snippet_deduplication_key: None,
documentation: None,
insert_text_mode: None,
source: project::CompletionSource::Custom,
@@ -490,8 +484,6 @@ impl ContextPickerCompletionProvider {
replace_range: source_range.clone(),
new_text,
label: CodeLabel::plain(rules.title.to_string(), None),
match_start: None,
snippet_deduplication_key: None,
documentation: None,
insert_text_mode: None,
source: project::CompletionSource::Custom,
@@ -532,8 +524,6 @@ impl ContextPickerCompletionProvider {
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(IconName::ToolWeb.path().into()),
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
IconName::ToolWeb.path().into(),
@@ -622,8 +612,6 @@ impl ContextPickerCompletionProvider {
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(completion_icon_path),
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
crease_icon_path,
@@ -701,8 +689,6 @@ impl ContextPickerCompletionProvider {
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(IconName::Code.path().into()),
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
IconName::Code.path().into(),
@@ -1196,8 +1182,10 @@ mod tests {
let app_state = cx.update(AppState::test);
cx.update(|cx| {
language::init(cx);
editor::init(cx);
workspace::init(app_state.clone(), cx);
Project::init_settings(cx);
});
app_state
@@ -1498,8 +1486,10 @@ mod tests {
let app_state = cx.update(AppState::test);
cx.update(|cx| {
language::init(cx);
editor::init(cx);
workspace::init(app_state.clone(), cx);
Project::init_settings(cx);
});
app_state
@@ -1696,6 +1686,11 @@ mod tests {
let store = SettingsStore::test(cx);
cx.set_global(store);
theme::init(theme::LoadThemes::JustBase, cx);
client::init_settings(cx);
language::init(cx);
Project::init_settings(cx);
workspace::init_settings(cx);
editor::init_settings(cx);
});
}
}

View File

@@ -1,8 +1,8 @@
use crate::context_store::ContextStore;
use agent::HistoryStore;
use collections::{HashMap, VecDeque};
use collections::VecDeque;
use editor::actions::Paste;
use editor::display_map::{CreaseId, EditorMargins};
use editor::{Addon, AnchorRangeExt as _};
use editor::display_map::EditorMargins;
use editor::{
ContextMenuOptions, Editor, EditorElement, EditorEvent, EditorMode, EditorStyle, MultiBuffer,
actions::{MoveDown, MoveUp},
@@ -17,7 +17,6 @@ use parking_lot::Mutex;
use prompt_store::PromptStore;
use settings::Settings;
use std::cmp;
use std::ops::Range;
use std::rc::Rc;
use std::sync::Arc;
use theme::ThemeSettings;
@@ -28,15 +27,12 @@ use zed_actions::agent::ToggleModelSelector;
use crate::agent_model_selector::AgentModelSelector;
use crate::buffer_codegen::BufferCodegen;
use crate::context::{AgentContextHandle, AgentContextKey};
use crate::context_picker::{ContextPicker, ContextPickerCompletionProvider, crease_for_mention};
use crate::context_store::{ContextStore, ContextStoreEvent};
use crate::context_picker::{ContextPicker, ContextPickerCompletionProvider};
use crate::context_strip::{ContextStrip, ContextStripEvent, SuggestContextKind};
use crate::message_editor::{ContextCreasesAddon, extract_message_creases, insert_message_creases};
use crate::terminal_codegen::TerminalCodegen;
use crate::{
CycleNextInlineAssist, CyclePreviousInlineAssist, ModelUsageContext, RemoveAllContext,
ToggleContextPicker,
};
use crate::{CycleNextInlineAssist, CyclePreviousInlineAssist, ModelUsageContext};
use crate::{RemoveAllContext, ToggleContextPicker};
pub struct PromptEditor<T> {
pub editor: Entity<Editor>,
@@ -1161,156 +1157,3 @@ impl GenerationMode {
}
}
}
/// Stored information that can be used to resurrect a context crease when creating an editor for a past message.
#[derive(Clone, Debug)]
pub struct MessageCrease {
pub range: Range<usize>,
pub icon_path: SharedString,
pub label: SharedString,
/// None for a deserialized message, Some otherwise.
pub context: Option<AgentContextHandle>,
}
#[derive(Default)]
pub struct ContextCreasesAddon {
creases: HashMap<AgentContextKey, Vec<(CreaseId, SharedString)>>,
_subscription: Option<Subscription>,
}
impl Addon for ContextCreasesAddon {
fn to_any(&self) -> &dyn std::any::Any {
self
}
fn to_any_mut(&mut self) -> Option<&mut dyn std::any::Any> {
Some(self)
}
}
impl ContextCreasesAddon {
pub fn new() -> Self {
Self {
creases: HashMap::default(),
_subscription: None,
}
}
pub fn add_creases(
&mut self,
context_store: &Entity<ContextStore>,
key: AgentContextKey,
creases: impl IntoIterator<Item = (CreaseId, SharedString)>,
cx: &mut Context<Editor>,
) {
self.creases.entry(key).or_default().extend(creases);
self._subscription = Some(
cx.subscribe(context_store, |editor, _, event, cx| match event {
ContextStoreEvent::ContextRemoved(key) => {
let Some(this) = editor.addon_mut::<Self>() else {
return;
};
let (crease_ids, replacement_texts): (Vec<_>, Vec<_>) = this
.creases
.remove(key)
.unwrap_or_default()
.into_iter()
.unzip();
let ranges = editor
.remove_creases(crease_ids, cx)
.into_iter()
.map(|(_, range)| range)
.collect::<Vec<_>>();
editor.unfold_ranges(&ranges, false, false, cx);
editor.edit(ranges.into_iter().zip(replacement_texts), cx);
cx.notify();
}
}),
)
}
pub fn into_inner(self) -> HashMap<AgentContextKey, Vec<(CreaseId, SharedString)>> {
self.creases
}
}
pub fn extract_message_creases(
editor: &mut Editor,
cx: &mut Context<'_, Editor>,
) -> Vec<MessageCrease> {
let buffer_snapshot = editor.buffer().read(cx).snapshot(cx);
let mut contexts_by_crease_id = editor
.addon_mut::<ContextCreasesAddon>()
.map(std::mem::take)
.unwrap_or_default()
.into_inner()
.into_iter()
.flat_map(|(key, creases)| {
let context = key.0;
creases
.into_iter()
.map(move |(id, _)| (id, context.clone()))
})
.collect::<HashMap<_, _>>();
// Filter the addon's list of creases based on what the editor reports,
// since the addon might have removed creases in it.
editor.display_map.update(cx, |display_map, cx| {
display_map
.snapshot(cx)
.crease_snapshot
.creases()
.filter_map(|(id, crease)| {
Some((
id,
(
crease.range().to_offset(&buffer_snapshot),
crease.metadata()?.clone(),
),
))
})
.map(|(id, (range, metadata))| {
let context = contexts_by_crease_id.remove(&id);
MessageCrease {
range,
context,
label: metadata.label,
icon_path: metadata.icon_path,
}
})
.collect()
})
}
pub fn insert_message_creases(
editor: &mut Editor,
message_creases: &[MessageCrease],
context_store: &Entity<ContextStore>,
window: &mut Window,
cx: &mut Context<'_, Editor>,
) {
let buffer_snapshot = editor.buffer().read(cx).snapshot(cx);
let creases = message_creases
.iter()
.map(|crease| {
let start = buffer_snapshot.anchor_after(crease.range.start);
let end = buffer_snapshot.anchor_before(crease.range.end);
crease_for_mention(
crease.label.clone(),
crease.icon_path.clone(),
start..end,
cx.weak_entity(),
)
})
.collect::<Vec<_>>();
let ids = editor.insert_creases(creases.clone(), cx);
editor.fold_creases(creases, false, window, cx);
if let Some(addon) = editor.addon_mut::<ContextCreasesAddon>() {
for (crease, id) in message_creases.iter().zip(ids) {
if let Some(context) = crease.context.as_ref() {
let key = AgentContextKey(context.clone());
addon.add_creases(context_store, key, vec![(id, crease.label.clone())], cx);
}
}
}
}


@@ -1,6 +1,6 @@
use std::{cmp::Reverse, sync::Arc};
use collections::IndexMap;
use collections::{HashSet, IndexMap};
use fuzzy::{StringMatch, StringMatchCandidate, match_strings};
use gpui::{Action, AnyElement, App, BackgroundExecutor, DismissEvent, Subscription, Task};
use language_model::{
@@ -19,26 +19,14 @@ pub type LanguageModelSelector = Picker<LanguageModelPickerDelegate>;
pub fn language_model_selector(
get_active_model: impl Fn(&App) -> Option<ConfiguredModel> + 'static,
on_model_changed: impl Fn(Arc<dyn LanguageModel>, &mut App) + 'static,
popover_styles: bool,
window: &mut Window,
cx: &mut Context<LanguageModelSelector>,
) -> LanguageModelSelector {
let delegate = LanguageModelPickerDelegate::new(
get_active_model,
on_model_changed,
popover_styles,
window,
cx,
);
if popover_styles {
Picker::list(delegate, window, cx)
.show_scrollbar(true)
.width(rems(20.))
.max_height(Some(rems(20.).into()))
} else {
Picker::list(delegate, window, cx).show_scrollbar(true)
}
let delegate = LanguageModelPickerDelegate::new(get_active_model, on_model_changed, window, cx);
Picker::list(delegate, window, cx)
.show_scrollbar(true)
.width(rems(20.))
.max_height(Some(rems(20.).into()))
}
fn all_models(cx: &App) -> GroupedModels {
@@ -57,7 +45,7 @@ fn all_models(cx: &App) -> GroupedModels {
})
.collect();
let all = providers
let other = providers
.iter()
.flat_map(|provider| {
provider
@@ -70,7 +58,7 @@ fn all_models(cx: &App) -> GroupedModels {
})
.collect();
GroupedModels::new(all, recommended)
GroupedModels::new(other, recommended)
}
#[derive(Clone)]
@@ -87,14 +75,12 @@ pub struct LanguageModelPickerDelegate {
selected_index: usize,
_authenticate_all_providers_task: Task<()>,
_subscriptions: Vec<Subscription>,
popover_styles: bool,
}
impl LanguageModelPickerDelegate {
fn new(
get_active_model: impl Fn(&App) -> Option<ConfiguredModel> + 'static,
on_model_changed: impl Fn(Arc<dyn LanguageModel>, &mut App) + 'static,
popover_styles: bool,
window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Self {
@@ -127,7 +113,6 @@ impl LanguageModelPickerDelegate {
}
},
)],
popover_styles,
}
}
@@ -192,7 +177,7 @@ impl LanguageModelPickerDelegate {
}
_ => {
log::error!(
"Failed to authenticate provider: {}: {err:#}",
"Failed to authenticate provider: {}: {err}",
provider_name.0
);
}
@@ -210,24 +195,33 @@ impl LanguageModelPickerDelegate {
struct GroupedModels {
recommended: Vec<ModelInfo>,
all: IndexMap<LanguageModelProviderId, Vec<ModelInfo>>,
other: IndexMap<LanguageModelProviderId, Vec<ModelInfo>>,
}
impl GroupedModels {
pub fn new(all: Vec<ModelInfo>, recommended: Vec<ModelInfo>) -> Self {
let mut all_by_provider: IndexMap<_, Vec<ModelInfo>> = IndexMap::default();
for model in all {
pub fn new(other: Vec<ModelInfo>, recommended: Vec<ModelInfo>) -> Self {
let recommended_ids = recommended
.iter()
.map(|info| (info.model.provider_id(), info.model.id()))
.collect::<HashSet<_>>();
let mut other_by_provider: IndexMap<_, Vec<ModelInfo>> = IndexMap::default();
for model in other {
if recommended_ids.contains(&(model.model.provider_id(), model.model.id())) {
continue;
}
let provider = model.model.provider_id();
if let Some(models) = all_by_provider.get_mut(&provider) {
if let Some(models) = other_by_provider.get_mut(&provider) {
models.push(model);
} else {
all_by_provider.insert(provider, vec![model]);
other_by_provider.insert(provider, vec![model]);
}
}
Self {
recommended,
all: all_by_provider,
other: other_by_provider,
}
}
@@ -243,7 +237,7 @@ impl GroupedModels {
);
}
for models in self.all.values() {
for models in self.other.values() {
if models.is_empty() {
continue;
}
@@ -258,6 +252,20 @@ impl GroupedModels {
}
entries
}
fn model_infos(&self) -> Vec<ModelInfo> {
let other = self
.other
.values()
.flat_map(|model| model.iter())
.cloned()
.collect::<Vec<_>>();
self.recommended
.iter()
.chain(&other)
.cloned()
.collect::<Vec<_>>()
}
}
enum LanguageModelPickerEntry {
@@ -402,9 +410,8 @@ impl PickerDelegate for LanguageModelPickerDelegate {
.collect::<Vec<_>>();
let available_models = all_models
.all
.values()
.flat_map(|models| models.iter())
.model_infos()
.iter()
.filter(|m| configured_provider_ids.contains(&m.model.provider_id()))
.cloned()
.collect::<Vec<_>>();
@@ -523,10 +530,6 @@ impl PickerDelegate for LanguageModelPickerDelegate {
_window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Option<gpui::AnyElement> {
if !self.popover_styles {
return None;
}
Some(
h_flex()
.w_full()
@@ -742,52 +745,46 @@ mod tests {
}
#[gpui::test]
fn test_recommended_models_also_appear_in_other(_cx: &mut TestAppContext) {
fn test_exclude_recommended_models(_cx: &mut TestAppContext) {
let recommended_models = create_models(vec![("zed", "claude")]);
let all_models = create_models(vec![
("zed", "claude"), // Should also appear in "other"
("zed", "claude"), // Should be filtered out from "other"
("zed", "gemini"),
("copilot", "o3"),
]);
let grouped_models = GroupedModels::new(all_models, recommended_models);
let actual_all_models = grouped_models
.all
let actual_other_models = grouped_models
.other
.values()
.flatten()
.cloned()
.collect::<Vec<_>>();
// Recommended models should also appear in "all"
assert_models_eq(
actual_all_models,
vec!["zed/claude", "zed/gemini", "copilot/o3"],
);
// Recommended models should not appear in "other"
assert_models_eq(actual_other_models, vec!["zed/gemini", "copilot/o3"]);
}
#[gpui::test]
fn test_models_from_different_providers(_cx: &mut TestAppContext) {
fn test_dont_exclude_models_from_other_providers(_cx: &mut TestAppContext) {
let recommended_models = create_models(vec![("zed", "claude")]);
let all_models = create_models(vec![
("zed", "claude"), // Should also appear in "other"
("zed", "claude"), // Should be filtered out from "other"
("zed", "gemini"),
("copilot", "claude"), // Different provider, should appear in "other"
("copilot", "claude"), // Should not be filtered out from "other"
]);
let grouped_models = GroupedModels::new(all_models, recommended_models);
let actual_all_models = grouped_models
.all
let actual_other_models = grouped_models
.other
.values()
.flatten()
.cloned()
.collect::<Vec<_>>();
// All models should appear in "all" regardless of recommended status
assert_models_eq(
actual_all_models,
vec!["zed/claude", "zed/gemini", "copilot/claude"],
);
// Recommended models should not appear in "other"
assert_models_eq(actual_other_models, vec!["zed/gemini", "copilot/claude"]);
}
}
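
Editorial note (not part of the diff above): the sketch below restates, with hypothetical simplified types, the grouping rule the new GroupedModels::new applies — any (provider, model id) pair already present in the recommended list is skipped when building the per-provider "other" groups, while the same model id under a different provider is kept.

use std::collections::{HashMap, HashSet};

fn group_other(
    other: Vec<(&'static str, &'static str)>, // (provider, model id)
    recommended: &[(&'static str, &'static str)],
) -> HashMap<&'static str, Vec<&'static str>> {
    let recommended_ids: HashSet<_> = recommended.iter().copied().collect();
    let mut by_provider: HashMap<&str, Vec<&str>> = HashMap::new();
    for (provider, id) in other {
        if recommended_ids.contains(&(provider, id)) {
            continue; // already listed under "Recommended"
        }
        by_provider.entry(provider).or_default().push(id);
    }
    by_provider
}

fn main() {
    let grouped = group_other(
        vec![("zed", "claude"), ("zed", "gemini"), ("copilot", "claude")],
        &[("zed", "claude")],
    );
    assert_eq!(grouped["zed"], vec!["gemini"]); // recommended duplicate dropped
    assert_eq!(grouped["copilot"], vec!["claude"]); // other provider kept
}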


@@ -0,0 +1,166 @@
use std::ops::Range;
use collections::HashMap;
use editor::display_map::CreaseId;
use editor::{Addon, AnchorRangeExt, Editor};
use gpui::{Entity, Subscription};
use ui::prelude::*;
use crate::{
context::{AgentContextHandle, AgentContextKey},
context_picker::crease_for_mention,
context_store::{ContextStore, ContextStoreEvent},
};
/// Stored information that can be used to resurrect a context crease when creating an editor for a past message.
#[derive(Clone, Debug)]
pub struct MessageCrease {
pub range: Range<usize>,
pub icon_path: SharedString,
pub label: SharedString,
/// None for a deserialized message, Some otherwise.
pub context: Option<AgentContextHandle>,
}
#[derive(Default)]
pub struct ContextCreasesAddon {
creases: HashMap<AgentContextKey, Vec<(CreaseId, SharedString)>>,
_subscription: Option<Subscription>,
}
impl Addon for ContextCreasesAddon {
fn to_any(&self) -> &dyn std::any::Any {
self
}
fn to_any_mut(&mut self) -> Option<&mut dyn std::any::Any> {
Some(self)
}
}
impl ContextCreasesAddon {
pub fn new() -> Self {
Self {
creases: HashMap::default(),
_subscription: None,
}
}
pub fn add_creases(
&mut self,
context_store: &Entity<ContextStore>,
key: AgentContextKey,
creases: impl IntoIterator<Item = (CreaseId, SharedString)>,
cx: &mut Context<Editor>,
) {
self.creases.entry(key).or_default().extend(creases);
self._subscription = Some(
cx.subscribe(context_store, |editor, _, event, cx| match event {
ContextStoreEvent::ContextRemoved(key) => {
let Some(this) = editor.addon_mut::<Self>() else {
return;
};
let (crease_ids, replacement_texts): (Vec<_>, Vec<_>) = this
.creases
.remove(key)
.unwrap_or_default()
.into_iter()
.unzip();
let ranges = editor
.remove_creases(crease_ids, cx)
.into_iter()
.map(|(_, range)| range)
.collect::<Vec<_>>();
editor.unfold_ranges(&ranges, false, false, cx);
editor.edit(ranges.into_iter().zip(replacement_texts), cx);
cx.notify();
}
}),
)
}
pub fn into_inner(self) -> HashMap<AgentContextKey, Vec<(CreaseId, SharedString)>> {
self.creases
}
}
pub fn extract_message_creases(
editor: &mut Editor,
cx: &mut Context<'_, Editor>,
) -> Vec<MessageCrease> {
let buffer_snapshot = editor.buffer().read(cx).snapshot(cx);
let mut contexts_by_crease_id = editor
.addon_mut::<ContextCreasesAddon>()
.map(std::mem::take)
.unwrap_or_default()
.into_inner()
.into_iter()
.flat_map(|(key, creases)| {
let context = key.0;
creases
.into_iter()
.map(move |(id, _)| (id, context.clone()))
})
.collect::<HashMap<_, _>>();
// Filter the addon's list of creases based on what the editor reports,
// since the addon might have removed creases in it.
editor.display_map.update(cx, |display_map, cx| {
display_map
.snapshot(cx)
.crease_snapshot
.creases()
.filter_map(|(id, crease)| {
Some((
id,
(
crease.range().to_offset(&buffer_snapshot),
crease.metadata()?.clone(),
),
))
})
.map(|(id, (range, metadata))| {
let context = contexts_by_crease_id.remove(&id);
MessageCrease {
range,
context,
label: metadata.label,
icon_path: metadata.icon_path,
}
})
.collect()
})
}
pub fn insert_message_creases(
editor: &mut Editor,
message_creases: &[MessageCrease],
context_store: &Entity<ContextStore>,
window: &mut Window,
cx: &mut Context<'_, Editor>,
) {
let buffer_snapshot = editor.buffer().read(cx).snapshot(cx);
let creases = message_creases
.iter()
.map(|crease| {
let start = buffer_snapshot.anchor_after(crease.range.start);
let end = buffer_snapshot.anchor_before(crease.range.end);
crease_for_mention(
crease.label.clone(),
crease.icon_path.clone(),
start..end,
cx.weak_entity(),
)
})
.collect::<Vec<_>>();
let ids = editor.insert_creases(creases.clone(), cx);
editor.fold_creases(creases, false, window, cx);
if let Some(addon) = editor.addon_mut::<ContextCreasesAddon>() {
for (crease, id) in message_creases.iter().zip(ids) {
if let Some(context) = crease.context.as_ref() {
let key = AgentContextKey(context.clone());
addon.add_creases(context_store, key, vec![(id, crease.label.clone())], cx);
}
}
}
}


@@ -15,8 +15,8 @@ use std::{
sync::{Arc, atomic::AtomicBool},
};
use ui::{
DocumentationAside, DocumentationEdge, DocumentationSide, HighlightedLabel, KeyBinding,
LabelSize, ListItem, ListItemSpacing, PopoverMenuHandle, TintColor, Tooltip, prelude::*,
DocumentationAside, DocumentationEdge, DocumentationSide, HighlightedLabel, LabelSize,
ListItem, ListItemSpacing, PopoverMenuHandle, TintColor, Tooltip, prelude::*,
};
/// Trait for types that can provide and manage agent profiles
@@ -81,7 +81,6 @@ impl ProfileSelector {
self.provider.clone(),
self.profiles.clone(),
cx.background_executor().clone(),
self.focus_handle.clone(),
cx,
);
@@ -208,7 +207,6 @@ pub(crate) struct ProfilePickerDelegate {
selected_index: usize,
query: String,
cancel: Option<Arc<AtomicBool>>,
focus_handle: FocusHandle,
}
impl ProfilePickerDelegate {
@@ -217,7 +215,6 @@ impl ProfilePickerDelegate {
provider: Arc<dyn ProfileProvider>,
profiles: AvailableProfiles,
background: BackgroundExecutor,
focus_handle: FocusHandle,
cx: &mut Context<ProfileSelector>,
) -> Self {
let candidates = Self::candidates_from(profiles);
@@ -234,7 +231,6 @@ impl ProfilePickerDelegate {
selected_index: 0,
query: String::new(),
cancel: None,
focus_handle,
};
this.selected_index = this
@@ -598,26 +594,20 @@ impl PickerDelegate for ProfilePickerDelegate {
_: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Option<gpui::AnyElement> {
let focus_handle = self.focus_handle.clone();
Some(
h_flex()
.w_full()
.border_t_1()
.border_color(cx.theme().colors().border_variant)
.p_1p5()
.p_1()
.gap_4()
.justify_between()
.child(
Button::new("configure", "Configure")
.full_width()
.style(ButtonStyle::Outlined)
.key_binding(
KeyBinding::for_action_in(
&ManageProfiles::default(),
&focus_handle,
cx,
)
.map(|kb| kb.size(rems_from_px(12.))),
)
.icon(IconName::Settings)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(|_, window, cx| {
window.dispatch_action(ManageProfiles::default().boxed_clone(), cx);
}),
@@ -669,25 +659,20 @@ mod tests {
is_builtin: true,
}];
cx.update(|cx| {
let focus_handle = cx.focus_handle();
let delegate = ProfilePickerDelegate {
fs: FakeFs::new(cx.executor()),
provider: Arc::new(TestProfileProvider::new(AgentProfileId("write".into()))),
background: cx.executor(),
candidates,
string_candidates: Arc::new(Vec::new()),
filtered_entries: Vec::new(),
selected_index: 0,
query: String::new(),
cancel: None,
};
let delegate = ProfilePickerDelegate {
fs: FakeFs::new(cx.background_executor().clone()),
provider: Arc::new(TestProfileProvider::new(AgentProfileId("write".into()))),
background: cx.background_executor().clone(),
candidates,
string_candidates: Arc::new(Vec::new()),
filtered_entries: Vec::new(),
selected_index: 0,
query: String::new(),
cancel: None,
focus_handle,
};
let matches = Vec::new(); // No matches
let _entries = delegate.entries_from_matches(matches);
});
let matches = Vec::new(); // No matches
let _entries = delegate.entries_from_matches(matches);
}
#[gpui::test]
@@ -705,35 +690,30 @@ mod tests {
},
];
cx.update(|cx| {
let focus_handle = cx.focus_handle();
let delegate = ProfilePickerDelegate {
fs: FakeFs::new(cx.executor()),
provider: Arc::new(TestProfileProvider::new(AgentProfileId("write".into()))),
background: cx.executor(),
candidates,
string_candidates: Arc::new(Vec::new()),
filtered_entries: vec![
ProfilePickerEntry::Profile(ProfileMatchEntry {
candidate_index: 0,
positions: Vec::new(),
}),
ProfilePickerEntry::Profile(ProfileMatchEntry {
candidate_index: 1,
positions: Vec::new(),
}),
],
selected_index: 0,
query: String::new(),
cancel: None,
};
let delegate = ProfilePickerDelegate {
fs: FakeFs::new(cx.background_executor().clone()),
provider: Arc::new(TestProfileProvider::new(AgentProfileId("write".into()))),
background: cx.background_executor().clone(),
candidates,
string_candidates: Arc::new(Vec::new()),
filtered_entries: vec![
ProfilePickerEntry::Profile(ProfileMatchEntry {
candidate_index: 0,
positions: Vec::new(),
}),
ProfilePickerEntry::Profile(ProfileMatchEntry {
candidate_index: 1,
positions: Vec::new(),
}),
],
selected_index: 0,
query: String::new(),
cancel: None,
focus_handle,
};
// Active profile should be found at index 0
let active_index = delegate.index_of_profile(&AgentProfileId("write".into()));
assert_eq!(active_index, Some(0));
});
// Active profile should be found at index 0
let active_index = delegate.index_of_profile(&AgentProfileId("write".into()));
assert_eq!(active_index, Some(0));
}
struct TestProfileProvider {


@@ -127,8 +127,6 @@ impl SlashCommandCompletionProvider {
new_text,
label: command.label(cx),
icon_path: None,
match_start: None,
snippet_deduplication_key: None,
insert_text_mode: None,
confirm,
source: CompletionSource::Custom,
@@ -234,8 +232,6 @@ impl SlashCommandCompletionProvider {
icon_path: None,
new_text,
documentation: None,
match_start: None,
snippet_deduplication_key: None,
confirm,
insert_text_mode: None,
source: CompletionSource::Custom,


@@ -314,7 +314,6 @@ impl TextThreadEditor {
)
});
},
true, // Use popover styles for picker
window,
cx,
)
@@ -478,7 +477,7 @@ impl TextThreadEditor {
editor.insert(&format!("/{name}"), window, cx);
if command.accepts_arguments() {
editor.insert(" ", window, cx);
editor.show_completions(&ShowCompletions, window, cx);
editor.show_completions(&ShowCompletions::default(), window, cx);
}
});
});
@@ -3224,7 +3223,11 @@ mod tests {
prompt_store::init(cx);
LanguageModelRegistry::test(cx);
cx.set_global(settings_store);
language::init(cx);
agent_settings::init(cx);
Project::init_settings(cx);
theme::init(theme::LoadThemes::JustBase, cx);
workspace::init_settings(cx);
editor::init_settings(cx);
}
}


@@ -577,6 +577,8 @@ mod test {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
// release_channel::init(SemanticVersion::default(), cx);
language::init(cx);
Project::init_settings(cx);
});
}


@@ -22,6 +22,7 @@ use language_model::{
};
use parking_lot::Mutex;
use pretty_assertions::assert_eq;
use project::Project;
use prompt_store::PromptBuilder;
use rand::prelude::*;
use serde_json::json;
@@ -1410,6 +1411,9 @@ fn init_test(cx: &mut App) {
prompt_store::init(cx);
LanguageModelRegistry::test(cx);
cx.set_global(settings_store);
language::init(cx);
agent_settings::init(cx);
Project::init_settings(cx);
}
#[derive(Clone)]


@@ -48,6 +48,7 @@ pub const LEGACY_CHANNEL_COUNT: NonZero<u16> = nz!(2);
pub const REPLAY_DURATION: Duration = Duration::from_secs(30);
pub fn init(cx: &mut App) {
AudioSettings::register(cx);
LIVE_SETTINGS.initialize(cx);
}


@@ -1,9 +1,9 @@
use std::sync::atomic::{AtomicBool, Ordering};
use gpui::App;
use settings::{RegisterSetting, Settings, SettingsStore};
use settings::{Settings, SettingsStore};
#[derive(Clone, Debug, RegisterSetting)]
#[derive(Clone, Debug)]
pub struct AudioSettings {
/// Opt into the new audio system.
///


@@ -33,9 +33,4 @@ workspace.workspace = true
which.workspace = true
[dev-dependencies]
ctor.workspace = true
clock= { workspace = true, "features" = ["test-support"] }
futures.workspace = true
gpui = { workspace = true, "features" = ["test-support"] }
parking_lot.workspace = true
zlog.workspace = true


@@ -1,15 +1,16 @@
use anyhow::{Context as _, Result};
use client::Client;
use client::{Client, TelemetrySettings};
use db::RELEASE_CHANNEL;
use db::kvp::KEY_VALUE_STORE;
use gpui::{
App, AppContext as _, AsyncApp, BackgroundExecutor, Context, Entity, Global, SemanticVersion,
Task, Window, actions,
};
use http_client::{HttpClient, HttpClientWithUrl};
use http_client::{AsyncBody, HttpClient, HttpClientWithUrl};
use paths::remote_servers_dir;
use release_channel::{AppCommitSha, ReleaseChannel};
use serde::{Deserialize, Serialize};
use settings::{RegisterSetting, Settings, SettingsStore};
use settings::{Settings, SettingsStore};
use smol::{fs, io::AsyncReadExt};
use smol::{fs::File, process::Command};
use std::mem;
@@ -40,23 +41,22 @@ actions!(
]
);
#[derive(Serialize)]
struct UpdateRequestBody {
installation_id: Option<Arc<str>>,
release_channel: Option<&'static str>,
telemetry: bool,
is_staff: Option<bool>,
destination: &'static str,
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum VersionCheckType {
Sha(AppCommitSha),
Semantic(SemanticVersion),
}
#[derive(Serialize, Debug)]
pub struct AssetQuery<'a> {
asset: &'a str,
os: &'a str,
arch: &'a str,
metrics_id: Option<&'a str>,
system_id: Option<&'a str>,
is_staff: Option<bool>,
}
#[derive(Clone, Debug)]
#[derive(Clone)]
pub enum AutoUpdateStatus {
Idle,
Checking,
@@ -66,31 +66,6 @@ pub enum AutoUpdateStatus {
Errored { error: Arc<anyhow::Error> },
}
impl PartialEq for AutoUpdateStatus {
fn eq(&self, other: &Self) -> bool {
match (self, other) {
(AutoUpdateStatus::Idle, AutoUpdateStatus::Idle) => true,
(AutoUpdateStatus::Checking, AutoUpdateStatus::Checking) => true,
(
AutoUpdateStatus::Downloading { version: v1 },
AutoUpdateStatus::Downloading { version: v2 },
) => v1 == v2,
(
AutoUpdateStatus::Installing { version: v1 },
AutoUpdateStatus::Installing { version: v2 },
) => v1 == v2,
(
AutoUpdateStatus::Updated { version: v1 },
AutoUpdateStatus::Updated { version: v2 },
) => v1 == v2,
(AutoUpdateStatus::Errored { error: e1 }, AutoUpdateStatus::Errored { error: e2 }) => {
e1.to_string() == e2.to_string()
}
_ => false,
}
}
}
impl AutoUpdateStatus {
pub fn is_updated(&self) -> bool {
matches!(self, Self::Updated { .. })
@@ -100,13 +75,13 @@ impl AutoUpdateStatus {
pub struct AutoUpdater {
status: AutoUpdateStatus,
current_version: SemanticVersion,
client: Arc<Client>,
http_client: Arc<HttpClientWithUrl>,
pending_poll: Option<Task<Option<()>>>,
quit_subscription: Option<gpui::Subscription>,
}
#[derive(Deserialize, Serialize, Clone, Debug)]
pub struct ReleaseAsset {
#[derive(Deserialize, Clone, Debug)]
pub struct JsonRelease {
pub version: String,
pub url: String,
}
@@ -145,7 +120,7 @@ impl Drop for MacOsUnmounter<'_> {
}
}
#[derive(Clone, Copy, Debug, RegisterSetting)]
#[derive(Clone, Copy, Debug)]
struct AutoUpdateSetting(bool);
/// Whether or not to automatically check for updates.
@@ -162,7 +137,9 @@ struct GlobalAutoUpdate(Option<Entity<AutoUpdater>>);
impl Global for GlobalAutoUpdate {}
pub fn init(client: Arc<Client>, cx: &mut App) {
pub fn init(http_client: Arc<HttpClientWithUrl>, cx: &mut App) {
AutoUpdateSetting::register(cx);
cx.observe_new(|workspace: &mut Workspace, _window, _cx| {
workspace.register_action(|_, action, window, cx| check(action, window, cx));
@@ -174,7 +151,7 @@ pub fn init(client: Arc<Client>, cx: &mut App) {
let version = release_channel::AppVersion::global(cx);
let auto_updater = cx.new(|cx| {
let updater = AutoUpdater::new(version, client, cx);
let updater = AutoUpdater::new(version, http_client, cx);
let poll_for_updates = ReleaseChannel::try_global(cx)
.map(|channel| channel.poll_for_updates())
@@ -258,7 +235,7 @@ pub fn view_release_notes(_: &ViewReleaseNotes, cx: &mut App) -> Option<()> {
let current_version = auto_updater.current_version;
let release_channel = release_channel.dev_name();
let path = format!("/releases/{release_channel}/{current_version}");
let url = &auto_updater.client.http_client().build_url(&path);
let url = &auto_updater.http_client.build_url(&path);
cx.open_url(url);
}
ReleaseChannel::Nightly => {
@@ -321,7 +298,11 @@ impl AutoUpdater {
cx.default_global::<GlobalAutoUpdate>().0.clone()
}
fn new(current_version: SemanticVersion, client: Arc<Client>, cx: &mut Context<Self>) -> Self {
fn new(
current_version: SemanticVersion,
http_client: Arc<HttpClientWithUrl>,
cx: &mut Context<Self>,
) -> Self {
// On windows, executable files cannot be overwritten while they are
// running, so we must wait to overwrite the application until quitting
// or restarting. When quitting the app, we spawn the auto update helper
@@ -342,7 +323,7 @@ impl AutoUpdater {
Self {
status: AutoUpdateStatus::Idle,
current_version,
client,
http_client,
pending_poll: None,
quit_subscription,
}
@@ -350,7 +331,8 @@ impl AutoUpdater {
pub fn start_polling(&self, cx: &mut Context<Self>) -> Task<Result<()>> {
cx.spawn(async move |this, cx| {
if cfg!(target_os = "windows") {
#[cfg(target_os = "windows")]
{
use util::ResultExt;
cleanup_windows()
@@ -374,7 +356,7 @@ impl AutoUpdater {
cx.notify();
self.pending_poll = Some(cx.spawn(async move |this, cx| {
let result = Self::update(this.upgrade()?, cx).await;
let result = Self::update(this.upgrade()?, cx.clone()).await;
this.update(cx, |this, cx| {
this.pending_poll = None;
if let Err(error) = result {
@@ -420,11 +402,10 @@ impl AutoUpdater {
// you can override this function. You should also update get_remote_server_release_url to return
// Ok(None).
pub async fn download_remote_server_release(
release_channel: ReleaseChannel,
version: Option<SemanticVersion>,
os: &str,
arch: &str,
set_status: impl Fn(&str, &mut AsyncApp) + Send + 'static,
release_channel: ReleaseChannel,
version: Option<SemanticVersion>,
cx: &mut AsyncApp,
) -> Result<PathBuf> {
let this = cx.update(|cx| {
@@ -434,14 +415,13 @@ impl AutoUpdater {
.context("auto-update not initialized")
})??;
set_status("Fetching remote server release", cx);
let release = Self::get_release_asset(
let release = Self::get_release(
&this,
release_channel,
version,
"zed-remote-server",
os,
arch,
version,
Some(release_channel),
cx,
)
.await?;
@@ -452,27 +432,26 @@ impl AutoUpdater {
let version_path = platform_dir.join(format!("{}.gz", release.version));
smol::fs::create_dir_all(&platform_dir).await.ok();
let client = this.read_with(cx, |this, _| this.client.http_client())?;
let client = this.read_with(cx, |this, _| this.http_client.clone())?;
if smol::fs::metadata(&version_path).await.is_err() {
log::info!(
"downloading zed-remote-server {os} {arch} version {}",
release.version
);
set_status("Downloading remote server", cx);
download_remote_server_binary(&version_path, release, client).await?;
download_remote_server_binary(&version_path, release, client, cx).await?;
}
Ok(version_path)
}
pub async fn get_remote_server_release_url(
channel: ReleaseChannel,
version: Option<SemanticVersion>,
os: &str,
arch: &str,
release_channel: ReleaseChannel,
version: Option<SemanticVersion>,
cx: &mut AsyncApp,
) -> Result<Option<String>> {
) -> Result<Option<(String, String)>> {
let this = cx.update(|cx| {
cx.default_global::<GlobalAutoUpdate>()
.0
@@ -480,99 +459,108 @@ impl AutoUpdater {
.context("auto-update not initialized")
})??;
let release =
Self::get_release_asset(&this, channel, version, "zed-remote-server", os, arch, cx)
.await?;
let release = Self::get_release(
&this,
"zed-remote-server",
os,
arch,
version,
Some(release_channel),
cx,
)
.await?;
Ok(Some(release.url))
let update_request_body = build_remote_server_update_request_body(cx)?;
let body = serde_json::to_string(&update_request_body)?;
Ok(Some((release.url, body)))
}
async fn get_release_asset(
async fn get_release(
this: &Entity<Self>,
release_channel: ReleaseChannel,
version: Option<SemanticVersion>,
asset: &str,
os: &str,
arch: &str,
version: Option<SemanticVersion>,
release_channel: Option<ReleaseChannel>,
cx: &mut AsyncApp,
) -> Result<ReleaseAsset> {
let client = this.read_with(cx, |this, _| this.client.clone())?;
) -> Result<JsonRelease> {
let client = this.read_with(cx, |this, _| this.http_client.clone())?;
let (system_id, metrics_id, is_staff) = if client.telemetry().metrics_enabled() {
(
client.telemetry().system_id(),
client.telemetry().metrics_id(),
client.telemetry().is_staff(),
)
if let Some(version) = version {
let channel = release_channel.map(|c| c.dev_name()).unwrap_or("stable");
let url = format!("/api/releases/{channel}/{version}/{asset}-{os}-{arch}.gz?update=1",);
Ok(JsonRelease {
version: version.to_string(),
url: client.build_url(&url),
})
} else {
(None, None, None)
};
let mut url_string = client.build_url(&format!(
"/api/releases/latest?asset={}&os={}&arch={}",
asset, os, arch
));
if let Some(param) = release_channel.and_then(|c| c.release_query_param()) {
url_string += "&";
url_string += param;
}
let version = if let Some(version) = version {
version.to_string()
} else {
"latest".to_string()
};
let http_client = client.http_client();
let mut response = client.get(&url_string, Default::default(), true).await?;
let mut body = Vec::new();
response.body_mut().read_to_end(&mut body).await?;
let path = format!("/releases/{}/{}/asset", release_channel.dev_name(), version,);
let url = http_client.build_zed_cloud_url_with_query(
&path,
AssetQuery {
os,
arch,
asset,
metrics_id: metrics_id.as_deref(),
system_id: system_id.as_deref(),
is_staff: is_staff,
},
)?;
let mut response = http_client
.get(url.as_str(), Default::default(), true)
.await?;
let mut body = Vec::new();
response.body_mut().read_to_end(&mut body).await?;
anyhow::ensure!(
response.status().is_success(),
"failed to fetch release: {:?}",
String::from_utf8_lossy(&body),
);
serde_json::from_slice(body.as_slice()).with_context(|| {
format!(
"error deserializing release {:?}",
anyhow::ensure!(
response.status().is_success(),
"failed to fetch release: {:?}",
String::from_utf8_lossy(&body),
)
})
);
serde_json::from_slice(body.as_slice()).with_context(|| {
format!(
"error deserializing release {:?}",
String::from_utf8_lossy(&body),
)
})
}
}
async fn update(this: Entity<Self>, cx: &mut AsyncApp) -> Result<()> {
async fn get_latest_release(
this: &Entity<Self>,
asset: &str,
os: &str,
arch: &str,
release_channel: Option<ReleaseChannel>,
cx: &mut AsyncApp,
) -> Result<JsonRelease> {
Self::get_release(this, asset, os, arch, None, release_channel, cx).await
}
async fn update(this: Entity<Self>, mut cx: AsyncApp) -> Result<()> {
let (client, installed_version, previous_status, release_channel) =
this.read_with(cx, |this, cx| {
this.read_with(&cx, |this, cx| {
(
this.client.http_client(),
this.http_client.clone(),
this.current_version,
this.status.clone(),
ReleaseChannel::try_global(cx).unwrap_or(ReleaseChannel::Stable),
ReleaseChannel::try_global(cx),
)
})?;
Self::check_dependencies()?;
this.update(cx, |this, cx| {
this.update(&mut cx, |this, cx| {
this.status = AutoUpdateStatus::Checking;
log::info!("Auto Update: checking for updates");
cx.notify();
})?;
let fetched_release_data =
Self::get_release_asset(&this, release_channel, None, "zed", OS, ARCH, cx).await?;
Self::get_latest_release(&this, "zed", OS, ARCH, release_channel, &mut cx).await?;
let fetched_version = fetched_release_data.clone().version;
let app_commit_sha = cx.update(|cx| AppCommitSha::try_global(cx).map(|sha| sha.full()));
let newer_version = Self::check_if_fetched_version_is_newer(
release_channel,
*RELEASE_CHANNEL,
app_commit_sha,
installed_version,
fetched_version,
@@ -580,7 +568,7 @@ impl AutoUpdater {
)?;
let Some(newer_version) = newer_version else {
return this.update(cx, |this, cx| {
return this.update(&mut cx, |this, cx| {
let status = match previous_status {
AutoUpdateStatus::Updated { .. } => previous_status,
_ => AutoUpdateStatus::Idle,
@@ -590,7 +578,7 @@ impl AutoUpdater {
});
};
this.update(cx, |this, cx| {
this.update(&mut cx, |this, cx| {
this.status = AutoUpdateStatus::Downloading {
version: newer_version.clone(),
};
@@ -599,21 +587,21 @@ impl AutoUpdater {
let installer_dir = InstallerDir::new().await?;
let target_path = Self::target_path(&installer_dir).await?;
download_release(&target_path, fetched_release_data, client).await?;
download_release(&target_path, fetched_release_data, client, &cx).await?;
this.update(cx, |this, cx| {
this.update(&mut cx, |this, cx| {
this.status = AutoUpdateStatus::Installing {
version: newer_version.clone(),
};
cx.notify();
})?;
let new_binary_path = Self::install_release(installer_dir, target_path, cx).await?;
let new_binary_path = Self::install_release(installer_dir, target_path, &cx).await?;
if let Some(new_binary_path) = new_binary_path {
cx.update(|cx| cx.set_restart_path(new_binary_path))?;
}
this.update(cx, |this, cx| {
this.update(&mut cx, |this, cx| {
this.set_should_show_update_notification(true, cx)
.detach_and_log_err(cx);
this.status = AutoUpdateStatus::Updated {
@@ -692,12 +680,6 @@ impl AutoUpdater {
target_path: PathBuf,
cx: &AsyncApp,
) -> Result<Option<PathBuf>> {
#[cfg(test)]
if let Some(test_install) =
cx.try_read_global::<tests::InstallOverride, _>(|g, _| g.0.clone())
{
return test_install(target_path, cx);
}
match OS {
"macos" => install_release_macos(&installer_dir, target_path, cx).await,
"linux" => install_release_linux(&installer_dir, target_path, cx).await,
@@ -748,13 +730,16 @@ impl AutoUpdater {
async fn download_remote_server_binary(
target_path: &PathBuf,
release: ReleaseAsset,
release: JsonRelease,
client: Arc<HttpClientWithUrl>,
cx: &AsyncApp,
) -> Result<()> {
let temp = tempfile::Builder::new().tempfile_in(remote_servers_dir())?;
let mut temp_file = File::create(&temp).await?;
let update_request_body = build_remote_server_update_request_body(cx)?;
let request_body = AsyncBody::from(serde_json::to_string(&update_request_body)?);
let mut response = client.get(&release.url, Default::default(), true).await?;
let mut response = client.get(&release.url, request_body, true).await?;
anyhow::ensure!(
response.status().is_success(),
"failed to download remote server release: {:?}",
@@ -766,19 +751,65 @@ async fn download_remote_server_binary(
Ok(())
}
fn build_remote_server_update_request_body(cx: &AsyncApp) -> Result<UpdateRequestBody> {
let (installation_id, release_channel, telemetry_enabled, is_staff) = cx.update(|cx| {
let telemetry = Client::global(cx).telemetry().clone();
let is_staff = telemetry.is_staff();
let installation_id = telemetry.installation_id();
let release_channel =
ReleaseChannel::try_global(cx).map(|release_channel| release_channel.display_name());
let telemetry_enabled = TelemetrySettings::get_global(cx).metrics;
(
installation_id,
release_channel,
telemetry_enabled,
is_staff,
)
})?;
Ok(UpdateRequestBody {
installation_id,
release_channel,
telemetry: telemetry_enabled,
is_staff,
destination: "remote",
})
}
async fn download_release(
target_path: &Path,
release: ReleaseAsset,
release: JsonRelease,
client: Arc<HttpClientWithUrl>,
cx: &AsyncApp,
) -> Result<()> {
let mut target_file = File::create(&target_path).await?;
let mut response = client.get(&release.url, Default::default(), true).await?;
anyhow::ensure!(
response.status().is_success(),
"failed to download update: {:?}",
response.status()
);
let (installation_id, release_channel, telemetry_enabled, is_staff) = cx.update(|cx| {
let telemetry = Client::global(cx).telemetry().clone();
let is_staff = telemetry.is_staff();
let installation_id = telemetry.installation_id();
let release_channel =
ReleaseChannel::try_global(cx).map(|release_channel| release_channel.display_name());
let telemetry_enabled = TelemetrySettings::get_global(cx).metrics;
(
installation_id,
release_channel,
telemetry_enabled,
is_staff,
)
})?;
let request_body = AsyncBody::from(serde_json::to_string(&UpdateRequestBody {
installation_id,
release_channel,
telemetry: telemetry_enabled,
is_staff,
destination: "local",
})?);
let mut response = client.get(&release.url, request_body, true).await?;
smol::io::copy(response.body_mut(), &mut target_file).await?;
log::info!("downloaded update. path:{:?}", target_path);
@@ -902,16 +933,28 @@ async fn install_release_macos(
Ok(None)
}
#[cfg(target_os = "windows")]
async fn cleanup_windows() -> Result<()> {
use util::ResultExt;
let parent = std::env::current_exe()?
.parent()
.context("No parent dir for Zed.exe")?
.to_owned();
// keep in sync with crates/auto_update_helper/src/updater.rs
_ = smol::fs::remove_dir(parent.join("updates")).await;
_ = smol::fs::remove_dir(parent.join("install")).await;
_ = smol::fs::remove_dir(parent.join("old")).await;
smol::fs::remove_dir(parent.join("updates"))
.await
.context("failed to remove updates dir")
.log_err();
smol::fs::remove_dir(parent.join("install"))
.await
.context("failed to remove install dir")
.log_err();
smol::fs::remove_dir(parent.join("old"))
.await
.context("failed to remove old version dir")
.log_err();
Ok(())
}
@@ -966,33 +1009,11 @@ pub async fn finalize_auto_update_on_quit() {
#[cfg(test)]
mod tests {
use client::Client;
use clock::FakeSystemClock;
use futures::channel::oneshot;
use gpui::TestAppContext;
use http_client::{FakeHttpClient, Response};
use settings::default_settings;
use std::{
rc::Rc,
sync::{
Arc,
atomic::{self, AtomicBool},
},
};
use tempfile::tempdir;
#[ctor::ctor]
fn init_logger() {
zlog::init_test();
}
use super::*;
pub(super) struct InstallOverride(
pub Rc<dyn Fn(PathBuf, &AsyncApp) -> Result<Option<PathBuf>>>,
);
impl Global for InstallOverride {}
#[gpui::test]
fn test_auto_update_defaults_to_true(cx: &mut TestAppContext) {
cx.update(|cx| {
@@ -1004,119 +1025,11 @@ mod tests {
.set_user_settings("{}", cx)
.expect("Unable to set user settings");
cx.set_global(store);
AutoUpdateSetting::register(cx);
assert!(AutoUpdateSetting::get_global(cx).0);
});
}
#[gpui::test]
async fn test_auto_update_downloads(cx: &mut TestAppContext) {
cx.background_executor.allow_parking();
zlog::init_test();
let release_available = Arc::new(AtomicBool::new(false));
let (dmg_tx, dmg_rx) = oneshot::channel::<String>();
cx.update(|cx| {
settings::init(cx);
let current_version = SemanticVersion::new(0, 100, 0);
release_channel::init_test(current_version, ReleaseChannel::Stable, cx);
let clock = Arc::new(FakeSystemClock::new());
let release_available = Arc::clone(&release_available);
let dmg_rx = Arc::new(parking_lot::Mutex::new(Some(dmg_rx)));
let fake_client_http = FakeHttpClient::create(move |req| {
let release_available = release_available.load(atomic::Ordering::Relaxed);
let dmg_rx = dmg_rx.clone();
async move {
if req.uri().path() == "/releases/stable/latest/asset" {
if release_available {
return Ok(Response::builder().status(200).body(
r#"{"version":"0.100.1","url":"https://test.example/new-download"}"#.into()
).unwrap());
} else {
return Ok(Response::builder().status(200).body(
r#"{"version":"0.100.0","url":"https://test.example/old-download"}"#.into()
).unwrap());
}
} else if req.uri().path() == "/new-download" {
return Ok(Response::builder().status(200).body({
let dmg_rx = dmg_rx.lock().take().unwrap();
dmg_rx.await.unwrap().into()
}).unwrap());
}
Ok(Response::builder().status(404).body("".into()).unwrap())
}
});
let client = Client::new(clock, fake_client_http, cx);
crate::init(client, cx);
});
let auto_updater = cx.update(|cx| AutoUpdater::get(cx).expect("auto updater should exist"));
cx.background_executor.run_until_parked();
auto_updater.read_with(cx, |updater, _| {
assert_eq!(updater.status(), AutoUpdateStatus::Idle);
assert_eq!(updater.current_version(), SemanticVersion::new(0, 100, 0));
});
release_available.store(true, atomic::Ordering::SeqCst);
cx.background_executor.advance_clock(POLL_INTERVAL);
cx.background_executor.run_until_parked();
loop {
cx.background_executor.timer(Duration::from_millis(0)).await;
cx.run_until_parked();
let status = auto_updater.read_with(cx, |updater, _| updater.status());
if !matches!(status, AutoUpdateStatus::Idle) {
break;
}
}
let status = auto_updater.read_with(cx, |updater, _| updater.status());
assert_eq!(
status,
AutoUpdateStatus::Downloading {
version: VersionCheckType::Semantic(SemanticVersion::new(0, 100, 1))
}
);
dmg_tx.send("<fake-zed-update>".to_owned()).unwrap();
let tmp_dir = Arc::new(tempdir().unwrap());
cx.update(|cx| {
let tmp_dir = tmp_dir.clone();
cx.set_global(InstallOverride(Rc::new(move |target_path, _cx| {
let tmp_dir = tmp_dir.clone();
let dest_path = tmp_dir.path().join("zed");
std::fs::copy(&target_path, &dest_path)?;
Ok(Some(dest_path))
})));
});
loop {
cx.background_executor.timer(Duration::from_millis(0)).await;
cx.run_until_parked();
let status = auto_updater.read_with(cx, |updater, _| updater.status());
if !matches!(status, AutoUpdateStatus::Downloading { .. }) {
break;
}
}
let status = auto_updater.read_with(cx, |updater, _| updater.status());
assert_eq!(
status,
AutoUpdateStatus::Updated {
version: VersionCheckType::Semantic(SemanticVersion::new(0, 100, 1))
}
);
let will_restart = cx.expect_restart();
cx.update(|cx| cx.restart());
let path = will_restart.await.unwrap().unwrap();
assert_eq!(path, tmp_dir.path().join("zed"));
assert_eq!(std::fs::read_to_string(path).unwrap(), "<fake-zed-update>");
}
#[test]
fn test_stable_does_not_update_when_fetched_version_is_not_higher() {
let release_channel = ReleaseChannel::Stable;


@@ -1,6 +1,6 @@
use std::{
cell::LazyCell,
path::Path,
sync::LazyLock,
time::{Duration, Instant},
};
@@ -13,8 +13,8 @@ use windows::Win32::{
use crate::windows_impl::WM_JOB_UPDATED;
pub(crate) struct Job {
pub apply: Box<dyn Fn(&Path) -> Result<()> + Send + Sync>,
pub rollback: Box<dyn Fn(&Path) -> Result<()> + Send + Sync>,
pub apply: Box<dyn Fn(&Path) -> Result<()>>,
pub rollback: Box<dyn Fn(&Path) -> Result<()>>,
}
impl Job {
@@ -154,8 +154,10 @@ impl Job {
}
}
// app is single threaded
#[cfg(not(test))]
pub(crate) static JOBS: LazyLock<[Job; 22]> = LazyLock::new(|| {
#[allow(clippy::declare_interior_mutable_const)]
pub(crate) const JOBS: LazyCell<[Job; 22]> = LazyCell::new(|| {
fn p(value: &str) -> &Path {
Path::new(value)
}
@@ -204,8 +206,10 @@ pub(crate) static JOBS: LazyLock<[Job; 22]> = LazyLock::new(|| {
]
});
// app is single threaded
#[cfg(test)]
pub(crate) static JOBS: LazyLock<[Job; 9]> = LazyLock::new(|| {
#[allow(clippy::declare_interior_mutable_const)]
pub(crate) const JOBS: LazyCell<[Job; 9]> = LazyCell::new(|| {
fn p(value: &str) -> &Path {
Path::new(value)
}


@@ -1,6 +1,7 @@
pub mod participant;
pub mod room;
use crate::call_settings::CallSettings;
use anyhow::{Context as _, Result, anyhow};
use audio::Audio;
use client::{ChannelId, Client, TypedEnvelope, User, UserStore, ZED_ALWAYS_ACTIVE, proto};
@@ -13,6 +14,7 @@ use gpui::{
use postage::watch;
use project::Project;
use room::Event;
use settings::Settings;
use std::sync::Arc;
pub use livekit_client::{RemoteVideoTrack, RemoteVideoTrackView, RemoteVideoTrackViewEvent};
@@ -24,6 +26,8 @@ struct GlobalActiveCall(Entity<ActiveCall>);
impl Global for GlobalActiveCall {}
pub fn init(client: Arc<Client>, user_store: Entity<UserStore>, cx: &mut App) {
CallSettings::register(cx);
let active_call = cx.new(|cx| ActiveCall::new(client, user_store, cx));
cx.set_global(GlobalActiveCall(active_call));
}


@@ -1,6 +1,6 @@
use settings::{RegisterSetting, Settings};
use settings::Settings;
#[derive(Debug, RegisterSetting)]
#[derive(Debug)]
pub struct CallSettings {
pub mute_on_join: bool,
pub share_on_join: bool,


@@ -237,6 +237,7 @@ fn init_test(cx: &mut App) -> Entity<ChannelStore> {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
release_channel::init(SemanticVersion::default(), cx);
client::init_settings(cx);
let clock = Arc::new(FakeSystemClock::new());
let http = FakeHttpClient::with_404_response();


@@ -30,7 +30,7 @@ use rand::prelude::*;
use release_channel::{AppVersion, ReleaseChannel};
use rpc::proto::{AnyTypedEnvelope, EnvelopedMessage, PeerId, RequestMessage};
use serde::{Deserialize, Serialize};
use settings::{RegisterSetting, Settings, SettingsContent};
use settings::{Settings, SettingsContent};
use std::{
any::TypeId,
convert::TryFrom,
@@ -95,7 +95,7 @@ actions!(
]
);
#[derive(Deserialize, RegisterSetting)]
#[derive(Deserialize)]
pub struct ClientSettings {
pub server_url: String,
}
@@ -113,7 +113,7 @@ impl Settings for ClientSettings {
}
}
#[derive(Deserialize, Default, RegisterSetting)]
#[derive(Deserialize, Default)]
pub struct ProxySettings {
pub proxy: Option<String>,
}
@@ -140,6 +140,12 @@ impl Settings for ProxySettings {
}
}
pub fn init_settings(cx: &mut App) {
TelemetrySettings::register(cx);
ClientSettings::register(cx);
ProxySettings::register(cx);
}
pub fn init(client: &Arc<Client>, cx: &mut App) {
let client = Arc::downgrade(client);
cx.on_action({
@@ -502,7 +508,7 @@ impl<T: 'static> Drop for PendingEntitySubscription<T> {
}
}
#[derive(Copy, Clone, Deserialize, Debug, RegisterSetting)]
#[derive(Copy, Clone, Deserialize, Debug)]
pub struct TelemetrySettings {
pub diagnostics: bool,
pub metrics: bool,
@@ -1487,7 +1493,7 @@ impl Client {
let url = self
.http
.build_zed_cloud_url("/internal/users/impersonate")?;
.build_zed_cloud_url("/internal/users/impersonate", &[])?;
let request = Request::post(url.as_str())
.header("Content-Type", "application/json")
.header("Authorization", format!("Bearer {api_token}"))
@@ -2171,6 +2177,7 @@ mod tests {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
init_settings(cx);
});
}
}

View File

@@ -179,6 +179,8 @@ impl Telemetry {
let release_channel =
ReleaseChannel::try_global(cx).map(|release_channel| release_channel.display_name());
TelemetrySettings::register(cx);
let state = Arc::new(Mutex::new(TelemetryState {
settings: *TelemetrySettings::get_global(cx),
architecture: env::consts::ARCH,
@@ -435,7 +437,7 @@ impl Telemetry {
Some(project_types)
}
fn report_event(self: &Arc<Self>, mut event: Event) {
fn report_event(self: &Arc<Self>, event: Event) {
let mut state = self.state.lock();
// RUST_LOG=telemetry=trace to debug telemetry events
log::trace!(target: "telemetry", "{:?}", event);
@@ -444,12 +446,6 @@ impl Telemetry {
return;
}
match &mut event {
Event::Flexible(event) => event
.event_properties
.insert("event_source".into(), "zed".into()),
};
if state.flush_events_task.is_none() {
let this = self.clone();
state.flush_events_task = Some(self.executor.spawn(async move {


@@ -51,11 +51,3 @@ pub fn external_agents_docs(cx: &App) -> String {
server_url = server_url(cx)
)
}
/// Returns the URL to Zed agent servers documentation.
pub fn agent_server_docs(cx: &App) -> String {
format!(
"{server_url}/docs/extensions/agent-servers",
server_url = server_url(cx)
)
}


@@ -260,7 +260,7 @@ impl fmt::Debug for Lamport {
impl fmt::Debug for Global {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f, "Global {{")?;
for timestamp in self.iter().filter(|t| t.value > 0) {
for timestamp in self.iter() {
if timestamp.replica_id.0 > 0 {
write!(f, ", ")?;
}


@@ -62,7 +62,7 @@ impl CloudApiClient {
let request = self.build_request(
Request::builder().method(Method::GET).uri(
self.http_client
.build_zed_cloud_url("/client/users/me")?
.build_zed_cloud_url("/client/users/me", &[])?
.as_ref(),
),
AsyncBody::default(),
@@ -89,7 +89,7 @@ impl CloudApiClient {
pub fn connect(&self, cx: &App) -> Result<Task<Result<Connection>>> {
let mut connect_url = self
.http_client
.build_zed_cloud_url("/client/users/connect")?;
.build_zed_cloud_url("/client/users/connect", &[])?;
connect_url
.set_scheme(match connect_url.scheme() {
"https" => "wss",
@@ -123,7 +123,7 @@ impl CloudApiClient {
.method(Method::POST)
.uri(
self.http_client
.build_zed_cloud_url("/client/llm_tokens")?
.build_zed_cloud_url("/client/llm_tokens", &[])?
.as_ref(),
)
.when_some(system_id, |builder, system_id| {
@@ -154,7 +154,7 @@ impl CloudApiClient {
let request = build_request(
Request::builder().method(Method::GET).uri(
self.http_client
.build_zed_cloud_url("/client/users/me")?
.build_zed_cloud_url("/client/users/me", &[])?
.as_ref(),
),
AsyncBody::default(),


@@ -1,4 +1,5 @@
pub mod predict_edits_v3;
pub mod udiff;
use std::str::FromStr;
use std::sync::Arc;
@@ -183,13 +184,13 @@ pub struct PredictEditsGitInfo {
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PredictEditsResponse {
pub request_id: String,
pub request_id: Uuid,
pub output_excerpt: String,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct AcceptEditPredictionBody {
pub request_id: String,
pub request_id: Uuid,
}
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone, Copy, Serialize, Deserialize)]


@@ -1,7 +1,7 @@
use chrono::Duration;
use serde::{Deserialize, Serialize};
use std::{
fmt::{Display, Write as _},
fmt::Display,
ops::{Add, Range, Sub},
path::{Path, PathBuf},
sync::Arc,
@@ -11,14 +11,7 @@ use uuid::Uuid;
use crate::PredictEditsGitInfo;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PlanContextRetrievalRequest {
pub excerpt: String,
pub excerpt_path: Arc<Path>,
pub excerpt_line_range: Range<Line>,
pub cursor_file_max_row: Line,
pub events: Vec<Event>,
}
// TODO: snippet ordering within file / relative to excerpt
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PredictEditsRequest {
@@ -73,7 +66,6 @@ pub enum PromptFormat {
MarkedExcerpt,
LabeledSections,
NumLinesUniDiff,
OldTextNewText,
/// Prompt format intended for use via zeta_cli
OnlySnippets,
}
@@ -101,7 +93,6 @@ impl std::fmt::Display for PromptFormat {
PromptFormat::LabeledSections => write!(f, "Labeled Sections"),
PromptFormat::OnlySnippets => write!(f, "Only Snippets"),
PromptFormat::NumLinesUniDiff => write!(f, "Numbered Lines / Unified Diff"),
PromptFormat::OldTextNewText => write!(f, "Old Text / New Text"),
}
}
}
@@ -134,15 +125,15 @@ impl Display for Event {
write!(
f,
"// User accepted prediction:\n--- a/{}\n+++ b/{}\n{diff}",
DiffPathFmt(old_path),
DiffPathFmt(new_path)
old_path.display(),
new_path.display()
)
} else {
write!(
f,
"--- a/{}\n+++ b/{}\n{diff}",
DiffPathFmt(old_path),
DiffPathFmt(new_path)
old_path.display(),
new_path.display()
)
}
}
@@ -150,24 +141,6 @@ impl Display for Event {
}
}
/// always format the Path as a unix path with `/` as the path sep in Diffs
pub struct DiffPathFmt<'a>(pub &'a Path);
impl<'a> std::fmt::Display for DiffPathFmt<'a> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let mut is_first = true;
for component in self.0.components() {
if !is_first {
f.write_char('/')?;
} else {
is_first = false;
}
write!(f, "{}", component.as_os_str().display())?;
}
Ok(())
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Signature {
pub text: String,


@@ -0,0 +1,294 @@
use std::{borrow::Cow, fmt::Display};
#[derive(Debug, PartialEq)]
pub enum DiffLine<'a> {
OldPath { path: Cow<'a, str> },
NewPath { path: Cow<'a, str> },
HunkHeader(Option<HunkLocation>),
Context(&'a str),
Deletion(&'a str),
Addition(&'a str),
Garbage(&'a str),
}
#[derive(Debug, PartialEq)]
pub struct HunkLocation {
start_line_old: u32,
count_old: u32,
start_line_new: u32,
count_new: u32,
}
impl<'a> DiffLine<'a> {
pub fn parse(line: &'a str) -> Self {
Self::try_parse(line).unwrap_or(Self::Garbage(line))
}
fn try_parse(line: &'a str) -> Option<Self> {
if let Some(header) = line.strip_prefix("---").and_then(eat_required_whitespace) {
let path = parse_header_path("a/", header);
Some(Self::OldPath { path })
} else if let Some(header) = line.strip_prefix("+++").and_then(eat_required_whitespace) {
Some(Self::NewPath {
path: parse_header_path("b/", header),
})
} else if let Some(header) = line.strip_prefix("@@").and_then(eat_required_whitespace) {
if header.starts_with("...") {
return Some(Self::HunkHeader(None));
}
let (start_line_old, header) = header.strip_prefix('-')?.split_once(',')?;
let mut parts = header.split_ascii_whitespace();
let count_old = parts.next()?;
let (start_line_new, count_new) = parts.next()?.strip_prefix('+')?.split_once(',')?;
Some(Self::HunkHeader(Some(HunkLocation {
start_line_old: start_line_old.parse::<u32>().ok()?.saturating_sub(1),
count_old: count_old.parse().ok()?,
start_line_new: start_line_new.parse::<u32>().ok()?.saturating_sub(1),
count_new: count_new.parse().ok()?,
})))
} else if let Some(deleted_header) = line.strip_prefix("-") {
Some(Self::Deletion(deleted_header))
} else if line.is_empty() {
Some(Self::Context(""))
} else if let Some(context) = line.strip_prefix(" ") {
Some(Self::Context(context))
} else {
Some(Self::Addition(line.strip_prefix("+")?))
}
}
}
impl<'a> Display for DiffLine<'a> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
DiffLine::OldPath { path } => write!(f, "--- {path}"),
DiffLine::NewPath { path } => write!(f, "+++ {path}"),
DiffLine::HunkHeader(Some(hunk_location)) => {
write!(
f,
"@@ -{},{} +{},{} @@",
hunk_location.start_line_old + 1,
hunk_location.count_old,
hunk_location.start_line_new + 1,
hunk_location.count_new
)
}
DiffLine::HunkHeader(None) => write!(f, "@@ ... @@"),
DiffLine::Context(content) => write!(f, " {content}"),
DiffLine::Deletion(content) => write!(f, "-{content}"),
DiffLine::Addition(content) => write!(f, "+{content}"),
DiffLine::Garbage(line) => write!(f, "{line}"),
}
}
}
fn parse_header_path<'a>(strip_prefix: &'static str, header: &'a str) -> Cow<'a, str> {
if !header.contains(['"', '\\']) {
let path = header.split_ascii_whitespace().next().unwrap_or(header);
return Cow::Borrowed(path.strip_prefix(strip_prefix).unwrap_or(path));
}
let mut path = String::with_capacity(header.len());
let mut in_quote = false;
let mut chars = header.chars().peekable();
let mut strip_prefix = Some(strip_prefix);
while let Some(char) = chars.next() {
if char == '"' {
in_quote = !in_quote;
} else if char == '\\' {
let Some(&next_char) = chars.peek() else {
break;
};
chars.next();
path.push(next_char);
} else if char.is_ascii_whitespace() && !in_quote {
break;
} else {
path.push(char);
}
if let Some(prefix) = strip_prefix
&& path == prefix
{
strip_prefix.take();
path.clear();
}
}
Cow::Owned(path)
}
fn eat_required_whitespace(header: &str) -> Option<&str> {
let trimmed = header.trim_ascii_start();
if trimmed.len() == header.len() {
None
} else {
Some(trimmed)
}
}
#[cfg(test)]
mod tests {
use super::*;
use indoc::indoc;
#[test]
fn parse_lines_simple() {
let input = indoc! {"
diff --git a/text.txt b/text.txt
index 86c770d..a1fd855 100644
--- a/file.txt
+++ b/file.txt
@@ -1,2 +1,3 @@
context
-deleted
+inserted
garbage
--- b/file.txt
+++ a/file.txt
"};
let lines = input.lines().map(DiffLine::parse).collect::<Vec<_>>();
pretty_assertions::assert_eq!(
lines,
&[
DiffLine::Garbage("diff --git a/text.txt b/text.txt"),
DiffLine::Garbage("index 86c770d..a1fd855 100644"),
DiffLine::OldPath {
path: "file.txt".into()
},
DiffLine::NewPath {
path: "file.txt".into()
},
DiffLine::HunkHeader(Some(HunkLocation {
start_line_old: 0,
count_old: 2,
start_line_new: 0,
count_new: 3
})),
DiffLine::Context("context"),
DiffLine::Deletion("deleted"),
DiffLine::Addition("inserted"),
DiffLine::Garbage("garbage"),
DiffLine::Context(""),
DiffLine::OldPath {
path: "b/file.txt".into()
},
DiffLine::NewPath {
path: "a/file.txt".into()
},
]
);
}
#[test]
fn file_header_extra_space() {
let options = ["--- file", "--- file", "---\tfile"];
for option in options {
pretty_assertions::assert_eq!(
DiffLine::parse(option),
DiffLine::OldPath {
path: "file".into()
},
"{option}",
);
}
}
#[test]
fn hunk_header_extra_space() {
let options = [
"@@ -1,2 +1,3 @@",
"@@ -1,2 +1,3 @@",
"@@\t-1,2\t+1,3\t@@",
"@@ -1,2 +1,3 @@",
"@@ -1,2 +1,3 @@",
"@@ -1,2 +1,3 @@",
"@@ -1,2 +1,3 @@ garbage",
];
for option in options {
pretty_assertions::assert_eq!(
DiffLine::parse(option),
DiffLine::HunkHeader(Some(HunkLocation {
start_line_old: 0,
count_old: 2,
start_line_new: 0,
count_new: 3
})),
"{option}",
);
}
}
#[test]
fn hunk_header_without_location() {
pretty_assertions::assert_eq!(DiffLine::parse("@@ ... @@"), DiffLine::HunkHeader(None));
}
#[test]
fn test_parse_path() {
assert_eq!(parse_header_path("a/", "foo.txt"), "foo.txt");
assert_eq!(
parse_header_path("a/", "foo/bar/baz.txt"),
"foo/bar/baz.txt"
);
assert_eq!(parse_header_path("a/", "a/foo.txt"), "foo.txt");
assert_eq!(
parse_header_path("a/", "a/foo/bar/baz.txt"),
"foo/bar/baz.txt"
);
// Extra
assert_eq!(
parse_header_path("a/", "a/foo/bar/baz.txt 2025"),
"foo/bar/baz.txt"
);
assert_eq!(
parse_header_path("a/", "a/foo/bar/baz.txt\t2025"),
"foo/bar/baz.txt"
);
assert_eq!(
parse_header_path("a/", "a/foo/bar/baz.txt \""),
"foo/bar/baz.txt"
);
// Quoted
assert_eq!(
parse_header_path("a/", "a/foo/bar/\"baz quox.txt\""),
"foo/bar/baz quox.txt"
);
assert_eq!(
parse_header_path("a/", "\"a/foo/bar/baz quox.txt\""),
"foo/bar/baz quox.txt"
);
assert_eq!(
parse_header_path("a/", "\"foo/bar/baz quox.txt\""),
"foo/bar/baz quox.txt"
);
assert_eq!(parse_header_path("a/", "\"whatever 🤷\""), "whatever 🤷");
assert_eq!(
parse_header_path("a/", "\"foo/bar/baz quox.txt\" 2025"),
"foo/bar/baz quox.txt"
);
// unescaped quotes are dropped
assert_eq!(parse_header_path("a/", "foo/\"bar\""), "foo/bar");
// Escaped
assert_eq!(
parse_header_path("a/", "\"foo/\\\"bar\\\"/baz.txt\""),
"foo/\"bar\"/baz.txt"
);
assert_eq!(
parse_header_path("a/", "\"C:\\\\Projects\\\\My App\\\\old file.txt\""),
"C:\\Projects\\My App\\old file.txt"
);
}
}

View File

@@ -17,6 +17,5 @@ cloud_llm_client.workspace = true
indoc.workspace = true
ordered-float.workspace = true
rustc-hash.workspace = true
schemars.workspace = true
serde.workspace = true
strum.workspace = true

View File

@@ -1,9 +1,8 @@
//! Zeta2 prompt planning and generation code shared with cloud.
pub mod retrieval_prompt;
use anyhow::{Context as _, Result, anyhow};
use cloud_llm_client::predict_edits_v3::{
self, DiffPathFmt, Excerpt, Line, Point, PromptFormat, ReferencedDeclaration,
self, Excerpt, Line, Point, PromptFormat, ReferencedDeclaration,
};
use indoc::indoc;
use ordered_float::OrderedFloat;
@@ -56,98 +55,50 @@ const LABELED_SECTIONS_INSTRUCTIONS: &str = indoc! {r#"
const NUMBERED_LINES_INSTRUCTIONS: &str = indoc! {r#"
# Instructions
You are an edit prediction agent in a code editor.
Your job is to predict the next edit that the user will make,
based on their last few edits and their current cursor location.
You are a code completion assistant helping a programmer finish their work. Your task is to:
## Output Format
1. Analyze the edit history to understand what the programmer is trying to achieve
2. Identify any incomplete refactoring or changes that need to be finished
3. Make the remaining edits that a human programmer would logically make next
4. Apply systematic changes consistently across the entire codebase - if you see a pattern starting, complete it everywhere.
You must briefly explain your understanding of the user's goal, in one
or two sentences, and then specify their next edit in the form of a
unified diff, like this:
Focus on:
- Understanding the intent behind the changes (e.g., improving error handling, refactoring APIs, fixing bugs)
- Completing any partially-applied changes across the codebase
- Ensuring consistency with the programming style and patterns already established
- Making edits that maintain or improve code quality
- If the programmer started refactoring one instance of a pattern, find and update ALL similar instances
- Don't write a lot of code if you're not sure what to do
Rules:
- Do not just mechanically apply patterns - reason about what changes make sense given the context and the programmer's apparent goals.
- Do not just fix syntax errors - look for the broader refactoring pattern and apply it systematically throughout the code.
- Write the edits in the unified diff format as shown in the example.
# Example output:
```
--- a/src/myapp/cli.py
+++ b/src/myapp/cli.py
@@ ... @@
import os
import time
import sys
+from constants import LOG_LEVEL_WARNING
@@ ... @@
config.headless()
config.set_interactive(false)
-config.set_log_level(LOG_L)
+config.set_log_level(LOG_LEVEL_WARNING)
config.set_use_color(True)
@@ -1,3 +1,3 @@
-
-
-import sys
+import json
```
## Edit History
# Edit History:
"#};
const UNIFIED_DIFF_REMINDER: &str = indoc! {"
---
Analyze the edit history and the files, then provide the unified diff for your predicted edits.
Please analyze the edit history and the files, then provide the unified diff for your predicted edits.
Do not include the cursor marker in your output.
Your diff should include edited file paths in its file headers (lines beginning with `---` and `+++`).
Do not include line numbers in the hunk headers, use `@@ ... @@`.
Removed lines begin with `-`.
Added lines begin with `+`.
Context lines begin with an extra space.
Context and removed lines are used to match the target edit location, so make sure to include enough of them
to uniquely identify it amongst all excerpts of code provided.
If you're editing multiple files, be sure to reflect the filename in the hunk's header.
"};
const XML_TAGS_INSTRUCTIONS: &str = indoc! {r#"
# Instructions
You are an edit prediction agent in a code editor.
Your job is to predict the next edit that the user will make,
based on their last few edits and their current cursor location.
# Output Format
You must briefly explain your understanding of the user's goal, in one
or two sentences, and then specify their next edit, using the following
XML format:
<edits path="my-project/src/myapp/cli.py">
<old_text>
OLD TEXT 1 HERE
</old_text>
<new_text>
NEW TEXT 1 HERE
</new_text>
<old_text>
OLD TEXT 2 HERE
</old_text>
<new_text>
NEW TEXT 2 HERE
</new_text>
</edits>
- Specify the file to edit using the `path` attribute.
- Use `<old_text>` and `<new_text>` tags to replace content
- `<old_text>` must exactly match existing file content, including indentation
- `<old_text>` cannot be empty
- Do not escape quotes, newlines, or other characters within tags
- Always close all tags properly
- Don't include the <|user_cursor|> marker in your output.
# Edit History:
"#};
const OLD_TEXT_NEW_TEXT_REMINDER: &str = indoc! {r#"
---
Remember that the edits in the edit history have already been deployed.
The files are currently as shown in the Code Excerpts section.
"#};
pub fn build_prompt(
request: &predict_edits_v3::PredictEditsRequest,
) -> Result<(String, SectionLabels)> {
@@ -169,9 +120,8 @@ pub fn build_prompt(
EDITABLE_REGION_END_MARKER_WITH_NEWLINE,
),
],
PromptFormat::LabeledSections
| PromptFormat::NumLinesUniDiff
| PromptFormat::OldTextNewText => {
PromptFormat::LabeledSections => vec![(request.cursor_point, CURSOR_MARKER)],
PromptFormat::NumLinesUniDiff => {
vec![(request.cursor_point, CURSOR_MARKER)]
}
PromptFormat::OnlySnippets => vec![],
@@ -181,31 +131,45 @@ pub fn build_prompt(
PromptFormat::MarkedExcerpt => MARKED_EXCERPT_INSTRUCTIONS.to_string(),
PromptFormat::LabeledSections => LABELED_SECTIONS_INSTRUCTIONS.to_string(),
PromptFormat::NumLinesUniDiff => NUMBERED_LINES_INSTRUCTIONS.to_string(),
PromptFormat::OldTextNewText => XML_TAGS_INSTRUCTIONS.to_string(),
// only intended for use via zeta_cli
PromptFormat::OnlySnippets => String::new(),
};
if request.events.is_empty() {
prompt.push_str("(No edit history)\n\n");
} else {
prompt.push_str("Here are the latest edits made by the user, from earlier to later.\n\n");
prompt.push_str(
"The following are the latest edits made by the user, from earlier to later.\n\n",
);
push_events(&mut prompt, &request.events);
}
prompt.push_str(indoc! {"
# Code Excerpts
The cursor marker <|user_cursor|> indicates the current user cursor position.
The file is in current state, edits from edit history have been applied.
"});
if request.prompt_format == PromptFormat::NumLinesUniDiff {
prompt.push_str(indoc! {"
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
"});
}
if request.referenced_declarations.is_empty() {
prompt.push_str(indoc! {"
# File under the cursor:
prompt.push('\n');
The cursor marker <|user_cursor|> indicates the current user cursor position.
The file is in current state, edits from edit history have been applied.
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
"});
} else {
// Note: This hasn't been trained on yet
prompt.push_str(indoc! {"
# Code Excerpts:
The cursor marker <|user_cursor|> indicates the current user cursor position.
Other excerpts of code from the project have been included as context based on their similarity to the code under the cursor.
Context excerpts are not guaranteed to be relevant, so use your own judgement.
Files are in their current state, edits from edit history have been applied.
We prepend line numbers (e.g., `123|<actual line>`); they are not part of the file.
"});
}
} else {
prompt.push_str("\n## Code\n\n");
}
let mut section_labels = Default::default();
@@ -233,14 +197,8 @@ pub fn build_prompt(
}
}
match request.prompt_format {
PromptFormat::NumLinesUniDiff => {
prompt.push_str(UNIFIED_DIFF_REMINDER);
}
PromptFormat::OldTextNewText => {
prompt.push_str(OLD_TEXT_NEW_TEXT_REMINDER);
}
_ => {}
if request.prompt_format == PromptFormat::NumLinesUniDiff {
prompt.push_str(UNIFIED_DIFF_REMINDER);
}
Ok((prompt, section_labels))
@@ -254,7 +212,7 @@ pub fn write_codeblock<'a>(
include_line_numbers: bool,
output: &'a mut String,
) {
writeln!(output, "`````{}", DiffPathFmt(path)).unwrap();
writeln!(output, "`````{}", path.display()).unwrap();
write_excerpts(
excerpts,
sorted_insertions,
@@ -317,7 +275,7 @@ pub fn write_excerpts<'a>(
}
}
pub fn push_events(output: &mut String, events: &[predict_edits_v3::Event]) {
fn push_events(output: &mut String, events: &[predict_edits_v3::Event]) {
if events.is_empty() {
return;
};
@@ -665,7 +623,6 @@ impl<'a> SyntaxBasedPrompt<'a> {
match self.request.prompt_format {
PromptFormat::MarkedExcerpt
| PromptFormat::OnlySnippets
| PromptFormat::OldTextNewText
| PromptFormat::NumLinesUniDiff => {
if range.start.0 > 0 && !skipped_last_snippet {
output.push_str("\n");

Some files were not shown because too many files have changed in this diff.