Compare commits

..

73 Commits

Author SHA1 Message Date
Peter Tripp
044594da78 Merge branch 'main' into faster_arm_linux 2025-08-20 15:25:38 -04:00
Ben Brandt
b0bef3a9a2 agent2: Clean up tool descriptions (#36619)
schemars was passing along the newlines from the doc comments. This
should make these closer to the markdown file versions we had in the old
agent.
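
For context, a minimal sketch of the behavior being cleaned up (the tool and field names below are hypothetical, not Zed's actual tool inputs): schemars copies doc comments into the generated JSON Schema `description`, preserving their line breaks.

```rs
use schemars::{JsonSchema, schema_for};

/// Reads a file from the project.
/// Relative paths are resolved against the worktree root.
#[derive(JsonSchema)]
struct ReadFileToolInput {
    /// The path of the file to read.
    path: String,
}

fn main() {
    let schema = schema_for!(ReadFileToolInput);
    // The struct-level `description` keeps the doc comment's newline:
    // "Reads a file from the project.\nRelative paths are resolved against the worktree root."
    println!("{}", serde_json::to_string_pretty(&schema).unwrap());
}
```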

Release Notes:

- N/A
2025-08-20 19:17:07 +00:00
Agus Zubiaga
2813073d7b message editor: Only allow types of content the agent can handle (#36616)
Uses the new
[`acp::PromptCapabilities`](a39b7f635d/rust/agent.rs (L194-L215))
to disable non-file mentions and images for agents that don't support
them.

Release Notes:

- N/A
2025-08-20 19:04:10 +00:00
Piotr Osiewicz
74ce543d8b clippy: println_empty_string & non_minimal_cfg (#36614)
- **clippy: Fix println-empty-string**
- **clippy: non-minimal-cfg**

Related to #36577

Release Notes:
- N/A
2025-08-20 18:45:40 +00:00
Cole Miller
b6722ca3c8 Remove special case for singleton buffers from MultiBufferSnapshot::anchor_at (#36524)
This may be responsible for a panic that we've been seeing with
increased frequency in agent2 threads.

Release Notes:

- N/A

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-08-20 18:43:29 +00:00
Umesh Yadav
ec8106d1db Fix clippy::println_empty_string, clippy::while_let_on_iterator lint style violations (#36613)
Related: #36577

Release Notes:

- N/A
2025-08-20 20:14:30 +02:00
Cole Miller
41e28a7185 Add tracked buffers for agent2 mentions (#36608)
Release Notes:

- N/A
2025-08-20 14:01:18 -04:00
Bennet Bo Fenner
8334cdb358 agent2: Port feedback (#36603)
Release Notes:

- N/A

---------

Co-authored-by: Ben Brandt <benjamin.j.brandt@gmail.com>
2025-08-20 19:10:43 +02:00
Antonio Scandurra
d0fb6120d9 Fix scrollbar flicker when streaming agent2 response (#36606)
This was caused by calling `list_state.splice` on updated entries. We
don't need to splice the entry, as we'll recompute its measurements
automatically when we render it.

Release Notes:

- N/A
2025-08-20 16:39:46 +00:00
Peter Tripp
1b8ca17dad ci: Larger Linux ARM runners 2025-08-20 12:15:48 -04:00
Antonio Scandurra
699f58aeba Capture telemetry when requesting completions in agent2 (#36600)
Release Notes:

- N/A
2025-08-20 16:04:32 +00:00
Umesh Yadav
1e6cefaa56 Fix clippy::len_zero lint style violations (#36589)
Related: #36577

Release Notes:

- N/A

---------

Signed-off-by: Umesh Yadav <git@umesh.dev>
2025-08-20 14:35:59 +00:00
tidely
92352f97ad Fix clippy::map_clone lint violations (#36585)
#36577

Release Notes:

- N/A
2025-08-20 16:34:52 +02:00
Antonio Scandurra
eaf6b56163 Miscellaneous UX fixes for agent2 (#36591)
Release Notes:

- N/A
2025-08-20 13:56:39 +00:00
Bennet Bo Fenner
85865fc950 agent2: New thread from summary (#36578)
Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <agus@zed.dev>
Co-authored-by: Cole Miller <cole@zed.dev>
2025-08-20 13:54:00 +00:00
Lukas Wirth
c5040bd0a4 remote: Do not leave client hanging on unhandled proto message (#36590)
Otherwise the client will wait for a response that never arrives,
causing the task to lock up

Release Notes:

- N/A
2025-08-20 13:41:58 +00:00
tidely
bc79076ad3 Fix clippy::manual_map lint violations (#36584)
#36577

Release Notes:

- N/A
2025-08-20 15:17:28 +02:00
Antonio Scandurra
de12633591 Wait for agent2 feature flag before loading panel (#36583)
Release Notes:

- N/A
2025-08-20 15:02:40 +02:00
tidely
6ed29fbc34 Enforce style lints which do not have violations (#36580)
Release Notes:

- N/A
2025-08-20 14:07:37 +02:00
Antonio Scandurra
4ee565cd39 Fix mentions roundtrip from/to database and other history bugs (#36575)
Release Notes:

- N/A
2025-08-20 12:03:20 +00:00
tidely
f80a0ba056 Move clippy lints which aren't a part of the style category (#36579)
Move lints which aren't a part of the style category.

Motivation: they might accidentally get reverted when we turn the
style category on again and remove the manual lint enforcements.

Release Notes:

- N/A
2025-08-20 11:26:45 +00:00
tidely
7bdc99abc1 Fix clippy::redundant_clone lint violations (#36558)
This removes around 900 unnecessary clones, ranging from cloning a few
ints all the way to large data structures and images.
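
For illustration, a minimal (hypothetical) example of the pattern `clippy::redundant_clone` flags: the original value is never used after the clone, so the clone can be dropped and the value moved instead.

```rs
fn greet(name: String) {
    println!("hello, {name}");
}

fn main() {
    let name = String::from("Zed");
    // `name` is never used after this call, so clippy::redundant_clone
    // reports the clone; passing `name` by value is enough.
    greet(name.clone());
}
```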

A lot of these were fixed using `cargo clippy --fix --workspace
--all-targets`, however it often breaks other lints and needs to be run
again. This was then followed up with some manual fixing.

I understand this is a large diff, but all the changes are pretty
trivial. Rust is doing some heavy lifting here for us. Once I get it up
to speed with main, I'd appreciate this getting merged sooner rather
than later.

Release Notes:

- N/A
2025-08-20 12:20:13 +02:00
Piotr Osiewicz
cf7c64d77f lints: A bunch of extra style lint fixes (#36568)
- **lints: Fix 'doc_lazy_continuation'**
- **lints: Fix 'doc_overindented_list_items'**
- **inherent_to_string and io_other_error**
- **Some more lint fixes**
- **lints: enable bool_assert_comparison, match_like_matches_macro and
wrong_self_convention**


Release Notes:

- N/A
2025-08-20 12:05:58 +02:00
Bennet Bo Fenner
a32a264508 agent2: Use correct completion intent when generating summary (#36573)
Release Notes:

- N/A
2025-08-20 10:03:35 +00:00
Bennet Bo Fenner
0a80209c5e agent2: Fix remaining update_model_request_usage todos (#36570)
Release Notes:

- N/A
2025-08-20 09:54:26 +00:00
Finn Evers
83d361ba69 Add more string and comment overrides (#36566)
Follow-up to #36469

Part of the issue was that we hadn't defined comment and string
overrides for some languages. Hence, even after the fix, edit
predictions would still show up in comments for me in e.g. JSONC files.

This PR adds some more overrides where possible for this repo to ensure
this happens less frequently.

Release Notes:

- N/A
2025-08-20 09:29:53 +00:00
Bennet Bo Fenner
4290f043cd agent2: Fix token count not updating when changing model/toggling burn mode (#36562)
Release Notes:

- N/A

---------

Co-authored-by: Antonio Scandurra <me@as-cii.com>
2025-08-20 09:29:05 +00:00
tidely
44941b5dfe Fix clippy::for_kv_map lint violations (#36493)
Release Notes:

- N/A
2025-08-20 11:22:19 +02:00
Bennet Bo Fenner
d4d049d7b9 agent2: Port more Zed AI features (#36559)
Release Notes:

- N/A

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-08-20 08:45:03 +00:00
Smit Barmase
4c85a0dc71 project: Register dynamic capabilities even when registerOptions doesn't exist (#36554)
Closes #36482

Looks like we accidentally referenced
[common/formatting.ts#L67-L70](d90a87f955/client/src/common/formatting.ts (L67-L70))
instead of
[common/client.ts#L2133](d90a87f955/client/src/common/client.ts (L2133)).

Release Notes:

- Fixed code not formatting on save in language servers like Biome.
(Preview Only)
2025-08-20 12:20:09 +05:30
Conrad Irwin
5d2bb2466e ACP history mentions (#36551)
- **TEMP**
- **Update @-mentions to use new history**

Closes #ISSUE

Release Notes:

- N/A
2025-08-20 06:25:07 +00:00
Danilo Leal
159b5e9fb5 agent2: Port user_modifier_to_send setting (#36550)
Release Notes:

- N/A
2025-08-20 05:30:43 +00:00
Danilo Leal
1e1110ee8c thread_view: Increase click area of the user rules links (#36549)
Release Notes:

- N/A
2025-08-20 05:20:58 +00:00
Danilo Leal
60960409f7 thread view: Refine the UI a bit (#36504)
Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <agus@zed.dev>
2025-08-20 01:47:28 -03:00
zumbalogy
fbba6addfd docs: Document global_lsp_settings.button and remove duplicate docs for lsp_highlight_debounce (#36547)
Follow up to this discussion:
https://github.com/zed-industries/zed/pull/36337

Release Notes:

- N/A

This will (gracefully) break links to
https://zed.dev/docs/configuring-zed#lsp-highlight-debounce-1. I don't
see anything show up for that on Google or GitHub search, and I don't
think it's load-bearing.

---------

Co-authored-by: zumbalogy <3770982+zumbalogy@users.noreply.github.com>
2025-08-20 07:39:51 +03:00
Ben Brandt
d273aca1c1 agent_ui: Add check to prevent sending empty messages in MessageEditor (#36545)
Release Notes:

- N/A
2025-08-20 04:06:24 +00:00
Ben Brandt
ceec258bf3 Some clippy fixes (#36544)
These showed up today, so I just applied the simplifications, which
were mostly switching `match` expressions to `if let`.

Release Notes:

- N/A
2025-08-20 03:40:39 +00:00
Conrad Irwin
cac80e2ebd Silence a bucketload of logs (#36534)
Closes #ISSUE

Release Notes:

- Silenced a bunch of logs that were on by default
2025-08-19 20:26:56 -06:00
Agus Zubiaga
b12d862236 Rename acp flag (#36541)
Release Notes:

- N/A
2025-08-20 02:11:17 +00:00
Cole Miller
3996587c0b Add version detection for CC (#36502)
- Render a helpful message when the installed CC version is too old
- Show the full path for agent binaries when the version is not recent
enough (helps in cases where multiple binaries are installed in
different places)
- Add UI for the case where a server binary is not installed at all
- Refresh thread view after installing/updating server binary

Release Notes:

- N/A
2025-08-20 01:59:14 +00:00
Agus Zubiaga
7c7043947b Improve claude tools (#36538)
- Return unified diff from `Edit` tool so model can see the final state
- Format on save if enabled
- Provide `Write` tool
- Disable `MultiEdit` tool
- Better prompting

Release Notes:

- N/A
2025-08-20 01:42:11 +00:00
Agus Zubiaga
714c36fa7b claude: Include all mentions and images in user message (#36539)
User messages sent to Claude Code will now include the content of all
mentions, as well as any attached images.

Release Notes:

- N/A
2025-08-19 22:30:26 -03:00
Max Brunsfeld
ce216432be Refactor ssh remoting - make ChannelClient type private (#36514)
This PR is one step in a series of refactors to prepare for having
"remote" projects that do not use SSH. The main use cases for this are
WSL and dev containers.

Release Notes:

- N/A
2025-08-19 17:33:56 -07:00
Marshall Bowers
82ac8a8aaa collab: Make stripe_subscription_id and stripe_subscription_status nullable on billing_subscriptions (#36536)
This PR makes the `stripe_subscription_id` and
`stripe_subscription_status` columns nullable on the
`billing_subscriptions` table.

Release Notes:

- N/A
2025-08-19 20:25:07 -04:00
Conrad Irwin
757b37fd41 Hide old Agent UI when ACP flag set (#36533)
- **Use key value store instead of JSON**
- **Default NewThread to the native agent when flagged**

Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-08-19 16:42:52 -06:00
Julia Ryan
ecee6746ec Attach minidump errors to uploaded crash events (#36527)
We see a bunch of crash events with truncated minidumps where they have
a valid header but no events. We think this is due to an issue
generating them, so we're attaching the relevant result to the uploaded
tags.

Release Notes:

- N/A

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-08-19 15:37:39 -07:00
Conrad Irwin
88754a70f7 Rebuild recently opened threads for ACP (#36531)
Closes #ISSUE

Release Notes:

- N/A
2025-08-19 16:26:30 -06:00
Julia Ryan
88c4a5ca49 Suspend macOS threads during crashes (#36520)
This should improve our detection of which thread crashed, since they
won't be able to resume while the minidump is being generated.

Release Notes:

- N/A
2025-08-19 14:31:13 -07:00
Bennet Bo Fenner
5fb68cb8be agent2: Token count (#36496)
Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <agus@zed.dev>
2025-08-19 20:40:31 +00:00
Piotr Osiewicz
6825715503 Another batch of lint fixes (#36521)
- **Enable a bunch of extra lints**
- **First batch of fixes**
- **More fixes**

Release Notes:

- N/A
2025-08-19 20:33:44 +00:00
Peter Tripp
69b1c6d6f5 Fix workspace::SendKeystrokes example in docs (#36515)
Closes: https://github.com/zed-industries/zed/issues/25683

Remove two bad examples from the key binding docs.
`cmd-shift-p` (command palette) and `cmd-p` (file finder) are async
operations and thus do not work properly with
`workspace::SendKeystrokes`.

Originally reported in
https://github.com/zed-industries/zed/issues/25683#issuecomment-3145830534

Release Notes:

- N/A
2025-08-19 19:26:40 +00:00
Piotr Osiewicz
05fc0c432c Fix a bunch of other low-hanging style lints (#36498)
- **Fix a bunch of low hanging style lints like unnecessary-return**
- **Fix single worktree violation**
- **And the rest**

Release Notes:

- N/A
2025-08-19 21:26:17 +02:00
Cole Miller
df9c2aefb1 thread_view: Fix issues with images (#36509)
- Clean up failed load tasks for mentions that require async processing
- When dragging and dropping files, hold onto added worktrees until any
async processing has completed; this fixes a bug when dragging items
from outside the project

Release Notes:

- N/A
2025-08-19 15:14:43 -04:00
Umesh Yadav
a91acb5f41 onboarding: Fix theme selection in system mode (#36484)
Previously, selecting the "System" theme during onboarding would
hardcode the theme based on the device's current mode (e.g., Light or
Dark). This change ensures the "System" setting is saved correctly,
allowing the app to dynamically follow the OS theme by inserting the
correct theme in the config for both light and dark mode.

Release Notes:

- N/A

Signed-off-by: Umesh Yadav <git@umesh.dev>
2025-08-19 14:43:54 -04:00
Conrad Irwin
6ba52a3a42 Re-add history entries for native agent threads (#36500)
Closes #ISSUE

Release Notes:

- N/A

---------

Co-authored-by: Antonio Scandurra <me@as-cii.com>
2025-08-19 12:08:11 -06:00
Bennet Bo Fenner
6b6eb11643 agent2: Fix tool schemas for Gemini (#36507)
Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <agus@zed.dev>
2025-08-19 18:06:09 +00:00
fantacell
1af47a563f helix: Uncomment one test (#36328)
There are two tests commented out in the helix file, but one of them
works again. I don't know if this is too little a change to be merged,
but I wanted to suggest it.
The other test might be more complicated though, so I didn't touch it.

Release Notes:

- N/A
2025-08-19 11:52:29 -06:00
Agus Zubiaga
e092aed253 Split external agent flags (#36499)
Release Notes:

- N/A
2025-08-19 14:25:25 -03:00
Lukas Wirth
d1cabef2bf editor: Fix inline diagnostics min column inaccuracy (#36501)
Closes https://github.com/zed-industries/zed/issues/33346

Release Notes:

- Fixed `diagnostic.inline.min_column` being inaccurate
2025-08-19 16:53:45 +00:00
Lukas Wirth
013eaaeadd editor: Render dirty and conflict markers in multibuffer headers (#36489)
Release Notes:

- Added rendering of status indicators for multi buffer headers
2025-08-19 18:43:42 +02:00
Smit Barmase
43b4363b34 lsp: Enable dynamic registration for TextDocumentSyncClientCapabilities post revert (#36494)
Follow up: https://github.com/zed-industries/zed/pull/36485

Release Notes:

- N/A
2025-08-19 20:30:25 +05:30
Cole Miller
1444cd9839 Fix Windows test failures not being detected in CI (#36446)
Bug introduced in #35926 

Release Notes:

- N/A
2025-08-19 14:53:10 +00:00
Antonio Scandurra
6c255c1973 Lay the groundwork to support history in agent2 (#36483)
This pull request introduces title generation and history replaying. We
still need to wire up the rest of the history but this gets us very
close. I extracted a lot of this code from `agent2-history` because that
branch was starting to get long-lived and there were lots of changes
since we started.

Release Notes:

- N/A
2025-08-19 14:24:23 +00:00
Piotr Osiewicz
c4083b9b63 Fix unnecessary-mut-passed lint (#36490)
Release Notes:

- N/A
2025-08-19 14:20:01 +00:00
Smit Barmase
e3b593efbd project: Take 2 on Handle textDocument/didSave and textDocument/didChange (un)registration and usage correctly (#36485)
Relands https://github.com/zed-industries/zed/pull/36441 with a
deserialization fix.

Previously, deserializing `"includeText"` into
`lsp::TextDocumentSyncSaveOptions` resulted in a `Supported(false)` type
instead of `SaveOptions(SaveOptions { include_text: Option<bool> })`.

```rs
impl From<bool> for TextDocumentSyncSaveOptions {
    fn from(from: bool) -> Self {
        Self::Supported(from)
    }
}
```

It looks like during dynamic registration we only get the `SaveOptions`
type and never the `Supported` type.
(https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#textDocumentSaveRegistrationOptions)
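
For illustration, a minimal sketch of the expected mapping, assuming the upstream `lsp-types` API (which Zed accesses through its `lsp` crate):

```rs
use lsp_types::{SaveOptions, TextDocumentSyncSaveOptions};

fn main() {
    // Dynamic registration sends the save options as an object, e.g.
    // `{"includeText": true}`, which should deserialize into the
    // `SaveOptions` variant rather than `Supported(false)`.
    let value = serde_json::json!({ "includeText": true });
    let parsed: TextDocumentSyncSaveOptions = serde_json::from_value(value).unwrap();
    assert!(matches!(
        parsed,
        TextDocumentSyncSaveOptions::SaveOptions(SaveOptions {
            include_text: Some(true),
            ..
        })
    ));
}
```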

Release Notes:

- N/A

---------

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-08-19 19:04:48 +05:30
Piotr Osiewicz
8f567383e4 Auto-fix clippy::collapsible_if violations (#36428)
Release Notes:

- N/A
2025-08-19 13:27:24 +00:00
Lukas Wirth
9e8ec72bd5 Revert "project: Handle textDocument/didSave and textDocument/didChange (un)registration and usage correctly (#36441)" (#36480)
This reverts commit c5991e74bb.

This PR broke rust-analyzer's check-on-save feature, so we are
reverting it for now.

Release Notes:

- N/A
2025-08-19 14:32:26 +02:00
Vincent Durewski
2fb89c9b3e chore: Default settings: Comments: dock option (#36476)
Minor tweak in the wording of the comments for the default settings
regarding the `dock` option of the panels, in order to make them
congruent across all panels.

Release Notes:

- N/A
2025-08-19 11:08:10 +00:00
Bennet Bo Fenner
e6d5a6a4fd agent: Remove thread-auto-capture feature (#36474)
We never ended up using this in practice (the feature flag is not
enabled for anyone, not even staff)

Release Notes:

- N/A
2025-08-19 10:59:34 +00:00
Bennet Bo Fenner
790a2a0cfa agent2: Support preferred_completion_mode setting (#36473)
Release Notes:

- N/A
2025-08-19 10:40:02 +00:00
Bennet Bo Fenner
97a31c59c9 agent2: Fix agent location still being present after thread stopped (#36471)
Release Notes:

- N/A
2025-08-19 10:22:17 +00:00
Lukas Wirth
5df9c7c1c2 search: Fix project search query flickering (#36470)
Release Notes:

- N/A

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
2025-08-19 10:16:49 +00:00
Bennet Bo Fenner
0ea0d466d2 agent2: Port retry logic (#36421)
Release Notes:

- N/A
2025-08-19 09:41:55 +00:00
678 changed files with 17393 additions and 14425 deletions

View File

@@ -27,6 +27,7 @@ self-hosted-runner:
# Namespace Limited Preview
- namespace-profile-8x16-ubuntu-2004-arm-m4
- namespace-profile-8x32-ubuntu-2004-arm-m4
- namespace-profile-12x32-ubuntu-2004-arm-m4
# Self Hosted Runners
- self-mini-macos
- self-32vcpu-windows-2022

View File

@@ -56,7 +56,6 @@ runs:
$env:COMPlus_CreateDumpDiagnostics = "1"
cargo nextest run --workspace --no-fail-fast
continue-on-error: true
- name: Analyze crash dumps
if: always()

View File

@@ -660,7 +660,7 @@ jobs:
timeout-minutes: 60
name: Linux arm64 release bundle
runs-on:
- namespace-profile-8x32-ubuntu-2004-arm-m4 # ubuntu 20.04 for minimal glibc
- namespace-profile-12x32-ubuntu-2004-arm-m4 # ubuntu 20.04 for minimal glibc
if: |
startsWith(github.ref, 'refs/tags/v')
|| contains(github.event.pull_request.labels.*.name, 'run-bundling')

View File

@@ -168,7 +168,7 @@ jobs:
name: Create a Linux *.tar.gz bundle for ARM
if: github.repository_owner == 'zed-industries'
runs-on:
- namespace-profile-8x32-ubuntu-2004-arm-m4 # ubuntu 20.04 for minimal glibc
- namespace-profile-12x32-ubuntu-2004-arm-m4 # ubuntu 20.04 for minimal glibc
needs: tests
steps:
- name: Checkout repo

36
Cargo.lock generated
View File

@@ -7,7 +7,6 @@ name = "acp_thread"
version = "0.1.0"
dependencies = [
"action_log",
"agent",
"agent-client-protocol",
"anyhow",
"buffer_diff",
@@ -131,7 +130,6 @@ dependencies = [
"component",
"context_server",
"convert_case 0.8.0",
"feature_flags",
"fs",
"futures 0.3.31",
"git",
@@ -173,9 +171,9 @@ dependencies = [
[[package]]
name = "agent-client-protocol"
version = "0.0.26"
version = "0.0.28"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "160971bb53ca0b2e70ebc857c21e24eb448745f1396371015f4c59e9a9e51ed0"
checksum = "4c887e795097665ab95119580534e7cc1335b59e1a7fec296943e534b970f4ed"
dependencies = [
"anyhow",
"futures 0.3.31",
@@ -192,10 +190,12 @@ version = "0.1.0"
dependencies = [
"acp_thread",
"action_log",
"agent",
"agent-client-protocol",
"agent_servers",
"agent_settings",
"anyhow",
"assistant_context",
"assistant_tool",
"assistant_tools",
"chrono",
@@ -205,10 +205,12 @@ dependencies = [
"collections",
"context_server",
"ctor",
"db",
"editor",
"env_logger 0.11.8",
"fs",
"futures 0.3.31",
"git",
"gpui",
"gpui_tokio",
"handlebars 4.5.0",
@@ -222,6 +224,7 @@ dependencies = [
"log",
"lsp",
"open",
"parking_lot",
"paths",
"portable-pty",
"pretty_assertions",
@@ -234,7 +237,9 @@ dependencies = [
"serde_json",
"settings",
"smol",
"sqlez",
"task",
"telemetry",
"tempfile",
"terminal",
"text",
@@ -250,6 +255,7 @@ dependencies = [
"workspace-hack",
"worktree",
"zlog",
"zstd",
]
[[package]]
@@ -257,6 +263,7 @@ name = "agent_servers"
version = "0.1.0"
dependencies = [
"acp_thread",
"action_log",
"agent-client-protocol",
"agent_settings",
"agentic-coding-protocol",
@@ -278,6 +285,7 @@ dependencies = [
"project",
"rand 0.8.5",
"schemars",
"semver",
"serde",
"serde_json",
"settings",
@@ -3866,7 +3874,7 @@ dependencies = [
"jni",
"js-sys",
"libc",
"mach2",
"mach2 0.4.2",
"ndk",
"ndk-context",
"num-derive",
@@ -4016,7 +4024,7 @@ checksum = "031ed29858d90cfdf27fe49fae28028a1f20466db97962fa2f4ea34809aeebf3"
dependencies = [
"cfg-if",
"libc",
"mach2",
"mach2 0.4.2",
]
[[package]]
@@ -4028,7 +4036,7 @@ dependencies = [
"cfg-if",
"crash-context",
"libc",
"mach2",
"mach2 0.4.2",
"parking_lot",
]
@@ -4038,6 +4046,7 @@ version = "0.1.0"
dependencies = [
"crash-handler",
"log",
"mach2 0.5.0",
"minidumper",
"paths",
"release_channel",
@@ -9860,6 +9869,15 @@ dependencies = [
"libc",
]
[[package]]
name = "mach2"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6a1b95cd5421ec55b445b5ae102f5ea0e768de1f82bd3001e11f426c269c3aea"
dependencies = [
"libc",
]
[[package]]
name = "malloc_buf"
version = "0.0.6"
@@ -10196,7 +10214,7 @@ dependencies = [
"goblin",
"libc",
"log",
"mach2",
"mach2 0.4.2",
"memmap2",
"memoffset",
"minidump-common",
@@ -18286,7 +18304,7 @@ dependencies = [
"indexmap",
"libc",
"log",
"mach2",
"mach2 0.4.2",
"memfd",
"object",
"once_cell",

View File

@@ -423,7 +423,7 @@ zlog_settings = { path = "crates/zlog_settings" }
#
agentic-coding-protocol = "0.0.10"
agent-client-protocol = "0.0.26"
agent-client-protocol = "0.0.28"
aho-corasick = "1.1"
alacritty_terminal = { git = "https://github.com/zed-industries/alacritty.git", branch = "add-hush-login-flag" }
any_vec = "0.14"
@@ -515,6 +515,7 @@ libsqlite3-sys = { version = "0.30.1", features = ["bundled"] }
linkify = "0.10.0"
log = { version = "0.4.16", features = ["kv_unstable_serde", "serde"] }
lsp-types = { git = "https://github.com/zed-industries/lsp-types", rev = "39f629bdd03d59abd786ed9fc27e8bca02c0c0ec" }
mach2 = "0.5"
markup5ever_rcdom = "0.3.0"
metal = "0.29"
minidumper = "0.8"
@@ -805,6 +806,9 @@ todo = "deny"
# warning on this rule produces a lot of noise.
single_range_in_vec_init = "allow"
redundant_clone = "warn"
declare_interior_mutable_const = "deny"
# These are all of the rules that currently have violations in the Zed
# codebase.
#
@@ -821,16 +825,114 @@ single_range_in_vec_init = "allow"
style = { level = "allow", priority = -1 }
# Temporary list of style lints that we've fixed so far.
# Progress is being tracked in #36577
blocks_in_conditions = "warn"
bool_assert_comparison = "warn"
borrow_interior_mutable_const = "warn"
box_default = "warn"
builtin_type_shadow = "warn"
bytes_nth = "warn"
chars_next_cmp = "warn"
cmp_null = "warn"
collapsible_else_if = "warn"
collapsible_if = "warn"
comparison_to_empty = "warn"
default_instead_of_iter_empty = "warn"
disallowed_macros = "warn"
disallowed_methods = "warn"
disallowed_names = "warn"
disallowed_types = "warn"
doc_lazy_continuation = "warn"
doc_overindented_list_items = "warn"
duplicate_underscore_argument = "warn"
err_expect = "warn"
fn_to_numeric_cast = "warn"
fn_to_numeric_cast_with_truncation = "warn"
for_kv_map = "warn"
implicit_saturating_add = "warn"
implicit_saturating_sub = "warn"
inconsistent_digit_grouping = "warn"
infallible_destructuring_match = "warn"
inherent_to_string = "warn"
init_numbered_fields = "warn"
into_iter_on_ref = "warn"
io_other_error = "warn"
items_after_test_module = "warn"
iter_cloned_collect = "warn"
iter_next_slice = "warn"
iter_nth = "warn"
iter_nth_zero = "warn"
iter_skip_next = "warn"
module_inception = { level = "deny" }
question_mark = { level = "deny" }
redundant_closure = { level = "deny" }
declare_interior_mutable_const = { level = "deny" }
needless_borrow = { level = "warn"}
just_underscores_and_digits = "warn"
len_zero = "warn"
let_and_return = "warn"
main_recursion = "warn"
manual_bits = "warn"
manual_dangling_ptr = "warn"
manual_is_ascii_check = "warn"
manual_is_finite = "warn"
manual_is_infinite = "warn"
manual_map = "warn"
manual_next_back = "warn"
manual_non_exhaustive = "warn"
manual_ok_or = "warn"
manual_pattern_char_comparison = "warn"
manual_rotate = "warn"
manual_slice_fill = "warn"
manual_while_let_some = "warn"
map_clone = "warn"
map_collect_result_unit = "warn"
match_like_matches_macro = "warn"
match_overlapping_arm = "warn"
mem_replace_option_with_none = "warn"
mem_replace_option_with_some = "warn"
missing_enforced_import_renames = "warn"
missing_safety_doc = "warn"
mixed_attributes_style = "warn"
mixed_case_hex_literals = "warn"
module_inception = "warn"
must_use_unit = "warn"
mut_mutex_lock = "warn"
needless_borrow = "warn"
needless_doctest_main = "warn"
needless_else = "warn"
needless_parens_on_range_literals = "warn"
needless_pub_self = "warn"
needless_return = "warn"
needless_return_with_question_mark = "warn"
non_minimal_cfg = "warn"
ok_expect = "warn"
owned_cow = "warn"
print_literal = "warn"
print_with_newline = "warn"
println_empty_string = "warn"
ptr_eq = "warn"
question_mark = "warn"
redundant_closure = "warn"
redundant_field_names = "warn"
redundant_pattern_matching = "warn"
redundant_static_lifetimes = "warn"
result_map_or_into_option = "warn"
self_named_constructors = "warn"
single_match = "warn"
tabs_in_doc_comments = "warn"
to_digit_is_some = "warn"
toplevel_ref_arg = "warn"
unnecessary_fold = "warn"
unnecessary_map_or = "warn"
unnecessary_mut_passed = "warn"
unnecessary_owned_empty_strings = "warn"
unneeded_struct_pattern = "warn"
unsafe_removed_from_name = "warn"
unused_unit = "warn"
unusual_byte_groupings = "warn"
while_let_on_iterator = "warn"
write_literal = "warn"
write_with_newline = "warn"
writeln_empty_string = "warn"
wrong_self_convention = "warn"
zero_ptr = "warn"
# Individual rules that have violations in the codebase:
type_complexity = "allow"
# We often return trait objects from `new` functions.
@@ -849,6 +951,10 @@ too_many_arguments = "allow"
# We often have large enum variants yet we rarely actually bother with splitting them up.
large_enum_variant = "allow"
# `enum_variant_names` fires for all enums, even when they derive serde traits.
# Adhering to this lint would be a breaking change.
enum_variant_names = "allow"
[workspace.metadata.cargo-machete]
ignored = [
"bindgen",

View File

@@ -1 +1,3 @@
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" fill="none"><path stroke="#000" stroke-linecap="round" stroke-linejoin="round" stroke-width="1.2" d="M2.667 8h8M2.667 4h10.666M2.667 12H8"/></svg>
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M13.333 10H8M13.333 6H2.66701" stroke="black" stroke-width="1.25" stroke-linecap="round" stroke-linejoin="round"/>
</svg>

Before: 210 B → After: 227 B

View File

@@ -0,0 +1,3 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M13.333 10H8M13.333 6H2.66701" stroke="black" stroke-width="1.25" stroke-linecap="round" stroke-linejoin="round"/>
</svg>

After: 227 B

View File

@@ -0,0 +1,3 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M8 2C11.3137 2 14 4.68629 14 8C14 11.3137 11.3137 14 8 14C4.68629 14 2 11.3137 2 8C2 4.68629 4.68629 2 8 2ZM10.4238 5.57617C10.1895 5.34187 9.81049 5.3419 9.57617 5.57617L8 7.15234L6.42383 5.57617C6.18953 5.34187 5.81049 5.3419 5.57617 5.57617C5.34186 5.81049 5.34186 6.18951 5.57617 6.42383L7.15234 8L5.57617 9.57617C5.34186 9.81049 5.34186 10.1895 5.57617 10.4238C5.81049 10.6581 6.18954 10.6581 6.42383 10.4238L8 8.84766L9.57617 10.4238C9.81049 10.6581 10.1895 10.6581 10.4238 10.4238C10.6581 10.1895 10.658 9.81048 10.4238 9.57617L8.84766 8L10.4238 6.42383C10.6581 6.18954 10.658 5.81048 10.4238 5.57617Z" fill="black"/>
</svg>

After: 737 B

View File

@@ -0,0 +1,27 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M11 8.75V10.5C8.93097 10.5 8.06903 10.5 6 10.5V10L11 6V5.5H6V7.25" stroke="black" stroke-width="1.2"/>
<path d="M2 8.5C2.27614 8.5 2.5 8.27614 2.5 8C2.5 7.72386 2.27614 7.5 2 7.5C1.72386 7.5 1.5 7.72386 1.5 8C1.5 8.27614 1.72386 8.5 2 8.5Z" fill="black"/>
<path d="M2.99976 6.33002C3.2759 6.33002 3.49976 6.10616 3.49976 5.83002C3.49976 5.55387 3.2759 5.33002 2.99976 5.33002C2.72361 5.33002 2.49976 5.55387 2.49976 5.83002C2.49976 6.10616 2.72361 6.33002 2.99976 6.33002Z" fill="black"/>
<path d="M2.99976 10.66C3.2759 10.66 3.49976 10.4361 3.49976 10.16C3.49976 9.88383 3.2759 9.65997 2.99976 9.65997C2.72361 9.65997 2.49976 9.88383 2.49976 10.16C2.49976 10.4361 2.72361 10.66 2.99976 10.66Z" fill="black"/>
<path d="M15 8.5C15.2761 8.5 15.5 8.27614 15.5 8C15.5 7.72386 15.2761 7.5 15 7.5C14.7239 7.5 14.5 7.72386 14.5 8C14.5 8.27614 14.7239 8.5 15 8.5Z" fill="black"/>
<path d="M14 6.33002C14.2761 6.33002 14.5 6.10616 14.5 5.83002C14.5 5.55387 14.2761 5.33002 14 5.33002C13.7239 5.33002 13.5 5.55387 13.5 5.83002C13.5 6.10616 13.7239 6.33002 14 6.33002Z" fill="black"/>
<path d="M14 10.66C14.2761 10.66 14.5 10.4361 14.5 10.16C14.5 9.88383 14.2761 9.65997 14 9.65997C13.7239 9.65997 13.5 9.88383 13.5 10.16C13.5 10.4361 13.7239 10.66 14 10.66Z" fill="black"/>
<path d="M8.49219 2C8.76833 2 8.99219 1.77614 8.99219 1.5C8.99219 1.22386 8.76833 1 8.49219 1C8.21605 1 7.99219 1.22386 7.99219 1.5C7.99219 1.77614 8.21605 2 8.49219 2Z" fill="black"/>
<path d="M6 3C6.27614 3 6.5 2.77614 6.5 2.5C6.5 2.22386 6.27614 2 6 2C5.72386 2 5.5 2.22386 5.5 2.5C5.5 2.77614 5.72386 3 6 3Z" fill="black"/>
<path d="M4 4C4.27614 4 4.5 3.77614 4.5 3.5C4.5 3.22386 4.27614 3 4 3C3.72386 3 3.5 3.22386 3.5 3.5C3.5 3.77614 3.72386 4 4 4Z" fill="black"/>
<path d="M3.99976 13C4.2759 13 4.49976 12.7761 4.49976 12.5C4.49976 12.2239 4.2759 12 3.99976 12C3.72361 12 3.49976 12.2239 3.49976 12.5C3.49976 12.7761 3.72361 13 3.99976 13Z" fill="black"/>
<path d="M2 12.5C2.27614 12.5 2.5 12.2761 2.5 12C2.5 11.7239 2.27614 11.5 2 11.5C1.72386 11.5 1.5 11.7239 1.5 12C1.5 12.2761 1.72386 12.5 2 12.5Z" fill="black"/>
<path d="M2 4.5C2.27614 4.5 2.5 4.27614 2.5 4C2.5 3.72386 2.27614 3.5 2 3.5C1.72386 3.5 1.5 3.72386 1.5 4C1.5 4.27614 1.72386 4.5 2 4.5Z" fill="black"/>
<path d="M15 12.5C15.2761 12.5 15.5 12.2761 15.5 12C15.5 11.7239 15.2761 11.5 15 11.5C14.7239 11.5 14.5 11.7239 14.5 12C14.5 12.2761 14.7239 12.5 15 12.5Z" fill="black"/>
<path d="M15 4.5C15.2761 4.5 15.5 4.27614 15.5 4C15.5 3.72386 15.2761 3.5 15 3.5C14.7239 3.5 14.5 3.72386 14.5 4C14.5 4.27614 14.7239 4.5 15 4.5Z" fill="black"/>
<path d="M3.99976 15C4.2759 15 4.49976 14.7761 4.49976 14.5C4.49976 14.2239 4.2759 14 3.99976 14C3.72361 14 3.49976 14.2239 3.49976 14.5C3.49976 14.7761 3.72361 15 3.99976 15Z" fill="black"/>
<path d="M4 2C4.27614 2 4.5 1.77614 4.5 1.5C4.5 1.22386 4.27614 1 4 1C3.72386 1 3.5 1.22386 3.5 1.5C3.5 1.77614 3.72386 2 4 2Z" fill="black"/>
<path d="M13 15C13.2761 15 13.5 14.7761 13.5 14.5C13.5 14.2239 13.2761 14 13 14C12.7239 14 12.5 14.2239 12.5 14.5C12.5 14.7761 12.7239 15 13 15Z" fill="black"/>
<path d="M13 2C13.2761 2 13.5 1.77614 13.5 1.5C13.5 1.22386 13.2761 1 13 1C12.7239 1 12.5 1.22386 12.5 1.5C12.5 1.77614 12.7239 2 13 2Z" fill="black"/>
<path d="M13 4C13.2761 4 13.5 3.77614 13.5 3.5C13.5 3.22386 13.2761 3 13 3C12.7239 3 12.5 3.22386 12.5 3.5C12.5 3.77614 12.7239 4 13 4Z" fill="black"/>
<path d="M13 13C13.2761 13 13.5 12.7761 13.5 12.5C13.5 12.2239 13.2761 12 13 12C12.7239 12 12.5 12.2239 12.5 12.5C12.5 12.7761 12.7239 13 13 13Z" fill="black"/>
<path d="M11 3C11.2761 3 11.5 2.77614 11.5 2.5C11.5 2.22386 11.2761 2 11 2C10.7239 2 10.5 2.22386 10.5 2.5C10.5 2.77614 10.7239 3 11 3Z" fill="black"/>
<path d="M8.5 15C8.77614 15 9 14.7761 9 14.5C9 14.2239 8.77614 14 8.5 14C8.22386 14 8 14.2239 8 14.5C8 14.7761 8.22386 15 8.5 15Z" fill="black"/>
<path d="M6 14C6.27614 14 6.5 13.7761 6.5 13.5C6.5 13.2239 6.27614 13 6 13C5.72386 13 5.5 13.2239 5.5 13.5C5.5 13.7761 5.72386 14 6 14Z" fill="black"/>
<path d="M11 14C11.2761 14 11.5 13.7761 11.5 13.5C11.5 13.2239 11.2761 13 11 13C10.7239 13 10.5 13.2239 10.5 13.5C10.5 13.7761 10.7239 14 11 14Z" fill="black"/>
</svg>

After: 4.2 KiB

View File

@@ -327,7 +327,7 @@
}
},
{
"context": "AcpThread > Editor",
"context": "AcpThread > Editor && !use_modifier_to_send",
"use_key_equivalents": true,
"bindings": {
"enter": "agent::Chat",
@@ -336,6 +336,16 @@
"ctrl-shift-n": "agent::RejectAll"
}
},
{
"context": "AcpThread > Editor && use_modifier_to_send",
"use_key_equivalents": true,
"bindings": {
"ctrl-enter": "agent::Chat",
"shift-ctrl-r": "agent::OpenAgentDiff",
"ctrl-shift-y": "agent::KeepAll",
"ctrl-shift-n": "agent::RejectAll"
}
},
{
"context": "ThreadHistory",
"bindings": {

View File

@@ -379,7 +379,7 @@
}
},
{
"context": "AcpThread > Editor",
"context": "AcpThread > Editor && !use_modifier_to_send",
"use_key_equivalents": true,
"bindings": {
"enter": "agent::Chat",
@@ -388,6 +388,16 @@
"cmd-shift-n": "agent::RejectAll"
}
},
{
"context": "AcpThread > Editor && use_modifier_to_send",
"use_key_equivalents": true,
"bindings": {
"cmd-enter": "agent::Chat",
"shift-ctrl-r": "agent::OpenAgentDiff",
"cmd-shift-y": "agent::KeepAll",
"cmd-shift-n": "agent::RejectAll"
}
},
{
"context": "ThreadHistory",
"bindings": {

File diff suppressed because it is too large

View File

@@ -717,7 +717,7 @@
// Can be 'never', 'always', or 'when_in_call',
// or a boolean (interpreted as 'never'/'always').
"button": "when_in_call",
// Where to the chat panel. Can be 'left' or 'right'.
// Where to dock the chat panel. Can be 'left' or 'right'.
"dock": "right",
// Default width of the chat panel.
"default_width": 240
@@ -725,7 +725,7 @@
"git_panel": {
// Whether to show the git panel button in the status bar.
"button": true,
// Where to show the git panel. Can be 'left' or 'right'.
// Where to dock the git panel. Can be 'left' or 'right'.
"dock": "left",
// Default width of the git panel.
"default_width": 360,

View File

@@ -18,7 +18,6 @@ test-support = ["gpui/test-support", "project/test-support", "dep:parking_lot"]
[dependencies]
action_log.workspace = true
agent-client-protocol.workspace = true
agent.workspace = true
anyhow.workspace = true
buffer_diff.workspace = true
collections.workspace = true

View File

@@ -3,9 +3,13 @@ mod diff;
mod mention;
mod terminal;
use collections::HashSet;
pub use connection::*;
pub use diff::*;
use language::language_settings::FormatOnSave;
pub use mention::*;
use project::lsp_store::{FormatTrigger, LspFormatTarget};
use serde::{Deserialize, Serialize};
pub use terminal::*;
use action_log::ActionLog;
@@ -24,6 +28,7 @@ use std::fmt::{Formatter, Write};
use std::ops::Range;
use std::process::ExitStatus;
use std::rc::Rc;
use std::time::{Duration, Instant};
use std::{fmt::Display, mem, path::PathBuf, sync::Arc};
use ui::App;
use util::ResultExt;
@@ -48,7 +53,7 @@ impl UserMessage {
if self
.checkpoint
.as_ref()
.map_or(false, |checkpoint| checkpoint.show)
.is_some_and(|checkpoint| checkpoint.show)
{
writeln!(markdown, "## User (checkpoint)").unwrap();
} else {
@@ -248,14 +253,13 @@ impl ToolCall {
}
if let Some(raw_output) = raw_output {
if self.content.is_empty() {
if let Some(markdown) = markdown_for_raw_output(&raw_output, &language_registry, cx)
{
self.content
.push(ToolCallContent::ContentBlock(ContentBlock::Markdown {
markdown,
}));
}
if self.content.is_empty()
&& let Some(markdown) = markdown_for_raw_output(&raw_output, &language_registry, cx)
{
self.content
.push(ToolCallContent::ContentBlock(ContentBlock::Markdown {
markdown,
}));
}
self.raw_output = Some(raw_output);
}
@@ -297,11 +301,9 @@ impl ToolCall {
) -> Option<AgentLocation> {
let buffer = project
.update(cx, |project, cx| {
if let Some(path) = project.project_path_for_absolute_path(&location.path, cx) {
Some(project.open_buffer(path, cx))
} else {
None
}
project
.project_path_for_absolute_path(&location.path, cx)
.map(|path| project.open_buffer(path, cx))
})
.ok()??;
let buffer = buffer.await.log_err()?;
@@ -429,11 +431,11 @@ impl ContentBlock {
language_registry: &Arc<LanguageRegistry>,
cx: &mut App,
) {
if matches!(self, ContentBlock::Empty) {
if let acp::ContentBlock::ResourceLink(resource_link) = block {
*self = ContentBlock::ResourceLink { resource_link };
return;
}
if matches!(self, ContentBlock::Empty)
&& let acp::ContentBlock::ResourceLink(resource_link) = block
{
*self = ContentBlock::ResourceLink { resource_link };
return;
}
let new_content = self.block_string_contents(block);
@@ -467,7 +469,7 @@ impl ContentBlock {
fn block_string_contents(&self, block: acp::ContentBlock) -> String {
match block {
acp::ContentBlock::Text(text_content) => text_content.text.clone(),
acp::ContentBlock::Text(text_content) => text_content.text,
acp::ContentBlock::ResourceLink(resource_link) => {
Self::resource_link_md(&resource_link.uri)
}
@@ -537,9 +539,15 @@ impl ToolCallContent {
acp::ToolCallContent::Content { content } => {
Self::ContentBlock(ContentBlock::new(content, &language_registry, cx))
}
acp::ToolCallContent::Diff { diff } => {
Self::Diff(cx.new(|cx| Diff::from_acp(diff, language_registry, cx)))
}
acp::ToolCallContent::Diff { diff } => Self::Diff(cx.new(|cx| {
Diff::finalized(
diff.path,
diff.old_text,
diff.new_text,
language_registry,
cx,
)
})),
}
}
@@ -658,6 +666,52 @@ impl PlanEntry {
}
}
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub struct TokenUsage {
pub max_tokens: u64,
pub used_tokens: u64,
}
impl TokenUsage {
pub fn ratio(&self) -> TokenUsageRatio {
#[cfg(debug_assertions)]
let warning_threshold: f32 = std::env::var("ZED_THREAD_WARNING_THRESHOLD")
.unwrap_or("0.8".to_string())
.parse()
.unwrap();
#[cfg(not(debug_assertions))]
let warning_threshold: f32 = 0.8;
// When the maximum is unknown because there is no selected model,
// avoid showing the token limit warning.
if self.max_tokens == 0 {
TokenUsageRatio::Normal
} else if self.used_tokens >= self.max_tokens {
TokenUsageRatio::Exceeded
} else if self.used_tokens as f32 / self.max_tokens as f32 >= warning_threshold {
TokenUsageRatio::Warning
} else {
TokenUsageRatio::Normal
}
}
}
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum TokenUsageRatio {
Normal,
Warning,
Exceeded,
}
#[derive(Debug, Clone)]
pub struct RetryStatus {
pub last_error: SharedString,
pub attempt: usize,
pub max_attempts: usize,
pub started_at: Instant,
pub duration: Duration,
}
pub struct AcpThread {
title: SharedString,
entries: Vec<AgentThreadEntry>,
@@ -668,17 +722,21 @@ pub struct AcpThread {
send_task: Option<Task<()>>,
connection: Rc<dyn AgentConnection>,
session_id: acp::SessionId,
token_usage: Option<TokenUsage>,
}
#[derive(Debug)]
pub enum AcpThreadEvent {
NewEntry,
TitleUpdated,
TokenUsageUpdated,
EntryUpdated(usize),
EntriesRemoved(Range<usize>),
ToolAuthorizationRequired,
Retry(RetryStatus),
Stopped,
Error,
ServerExited(ExitStatus),
LoadError(LoadError),
}
impl EventEmitter<AcpThreadEvent> for AcpThread {}
@@ -692,20 +750,30 @@ pub enum ThreadStatus {
#[derive(Debug, Clone)]
pub enum LoadError {
NotInstalled {
error_message: SharedString,
install_message: SharedString,
install_command: String,
},
Unsupported {
error_message: SharedString,
upgrade_message: SharedString,
upgrade_command: String,
},
Exited(i32),
Exited {
status: ExitStatus,
},
Other(SharedString),
}
impl Display for LoadError {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
match self {
LoadError::Unsupported { error_message, .. } => write!(f, "{}", error_message),
LoadError::Exited(status) => write!(f, "Server exited with status {}", status),
LoadError::NotInstalled { error_message, .. }
| LoadError::Unsupported { error_message, .. } => {
write!(f, "{error_message}")
}
LoadError::Exited { status } => write!(f, "Server exited with status {status}"),
LoadError::Other(msg) => write!(f, "{}", msg),
}
}
@@ -718,11 +786,9 @@ impl AcpThread {
title: impl Into<SharedString>,
connection: Rc<dyn AgentConnection>,
project: Entity<Project>,
action_log: Entity<ActionLog>,
session_id: acp::SessionId,
cx: &mut Context<Self>,
) -> Self {
let action_log = cx.new(|_| ActionLog::new(project.clone()));
Self {
action_log,
shared_buffers: Default::default(),
@@ -733,6 +799,7 @@ impl AcpThread {
send_task: None,
connection,
session_id,
token_usage: None,
}
}
@@ -772,6 +839,10 @@ impl AcpThread {
}
}
pub fn token_usage(&self) -> Option<&TokenUsage> {
self.token_usage.as_ref()
}
pub fn has_pending_edit_tool_calls(&self) -> bool {
for entry in self.entries.iter().rev() {
match entry {
@@ -916,6 +987,21 @@ impl AcpThread {
cx.emit(AcpThreadEvent::NewEntry);
}
pub fn update_title(&mut self, title: SharedString, cx: &mut Context<Self>) -> Result<()> {
self.title = title;
cx.emit(AcpThreadEvent::TitleUpdated);
Ok(())
}
pub fn update_token_usage(&mut self, usage: Option<TokenUsage>, cx: &mut Context<Self>) {
self.token_usage = usage;
cx.emit(AcpThreadEvent::TokenUsageUpdated);
}
pub fn update_retry_status(&mut self, status: RetryStatus, cx: &mut Context<Self>) {
cx.emit(AcpThreadEvent::Retry(status));
}
pub fn update_tool_call(
&mut self,
update: impl Into<ToolCallUpdate>,
@@ -932,7 +1018,7 @@ impl AcpThread {
let location_updated = update.fields.locations.is_some();
current_call.update_fields(update.fields, languages, cx);
if location_updated {
self.resolve_locations(update.id.clone(), cx);
self.resolve_locations(update.id, cx);
}
}
ToolCallUpdate::UpdateDiff(update) => {
@@ -1007,6 +1093,22 @@ impl AcpThread {
})
}
pub fn tool_call(&mut self, id: &acp::ToolCallId) -> Option<(usize, &ToolCall)> {
self.entries
.iter()
.enumerate()
.rev()
.find_map(|(index, tool_call)| {
if let AgentThreadEntry::ToolCall(tool_call) = tool_call
&& &tool_call.id == id
{
Some((index, tool_call))
} else {
None
}
})
}
pub fn resolve_locations(&mut self, id: acp::ToolCallId, cx: &mut Context<Self>) {
let project = self.project.clone();
let Some((_, tool_call)) = self.tool_call_mut(&id) else {
@@ -1267,6 +1369,8 @@ impl AcpThread {
.await?;
this.update(cx, |this, cx| {
this.project
.update(cx, |project, cx| project.set_agent_location(None, cx));
match response {
Ok(Err(e)) => {
this.send_task.take();
@@ -1290,6 +1394,17 @@ impl AcpThread {
this.send_task.take();
}
// Truncate entries if the last prompt was refused.
if let Ok(Ok(acp::PromptResponse {
stop_reason: acp::StopReason::Refusal,
})) = result
&& let Some((ix, _)) = this.last_user_message()
{
let range = ix..this.entries.len();
this.entries.truncate(ix);
cx.emit(AcpThreadEvent::EntriesRemoved(range));
}
cx.emit(AcpThreadEvent::Stopped);
Ok(())
}
@@ -1555,30 +1670,59 @@ impl AcpThread {
.collect::<Vec<_>>()
})
.await;
cx.update(|cx| {
project.update(cx, |project, cx| {
project.set_agent_location(
Some(AgentLocation {
buffer: buffer.downgrade(),
position: edits
.last()
.map(|(range, _)| range.end)
.unwrap_or(Anchor::MIN),
}),
cx,
);
});
project.update(cx, |project, cx| {
project.set_agent_location(
Some(AgentLocation {
buffer: buffer.downgrade(),
position: edits
.last()
.map(|(range, _)| range.end)
.unwrap_or(Anchor::MIN),
}),
cx,
);
})?;
let format_on_save = cx.update(|cx| {
action_log.update(cx, |action_log, cx| {
action_log.buffer_read(buffer.clone(), cx);
});
buffer.update(cx, |buffer, cx| {
let format_on_save = buffer.update(cx, |buffer, cx| {
buffer.edit(edits, None, cx);
let settings = language::language_settings::language_settings(
buffer.language().map(|l| l.name()),
buffer.file(),
cx,
);
settings.format_on_save != FormatOnSave::Off
});
action_log.update(cx, |action_log, cx| {
action_log.buffer_edited(buffer.clone(), cx);
});
format_on_save
})?;
if format_on_save {
let format_task = project.update(cx, |project, cx| {
project.format(
HashSet::from_iter([buffer.clone()]),
LspFormatTarget::Buffers,
false,
FormatTrigger::Save,
cx,
)
})?;
format_task.await.log_err();
action_log.update(cx, |action_log, cx| {
action_log.buffer_edited(buffer.clone(), cx);
})?;
}
project
.update(cx, |project, cx| project.save_buffer(buffer, cx))?
.await
@@ -1589,8 +1733,8 @@ impl AcpThread {
self.entries.iter().map(|e| e.to_markdown(cx)).collect()
}
pub fn emit_server_exited(&mut self, status: ExitStatus, cx: &mut Context<Self>) {
cx.emit(AcpThreadEvent::ServerExited(status));
pub fn emit_load_error(&mut self, error: LoadError, cx: &mut Context<Self>) {
cx.emit(AcpThreadEvent::LoadError(error));
}
}
@@ -1641,7 +1785,7 @@ mod tests {
use super::*;
use anyhow::anyhow;
use futures::{channel::mpsc, future::LocalBoxFuture, select};
use gpui::{AsyncApp, TestAppContext, WeakEntity};
use gpui::{App, AsyncApp, TestAppContext, WeakEntity};
use indoc::indoc;
use project::{FakeFs, Fs};
use rand::Rng as _;
@@ -2128,7 +2272,7 @@ mod tests {
"}
);
});
assert_eq!(fs.files(), vec![Path::new("/test/file-0")]);
assert_eq!(fs.files(), vec![Path::new(path!("/test/file-0"))]);
cx.update(|cx| thread.update(cx, |thread, cx| thread.send(vec!["ipsum".into()], cx)))
.await
@@ -2158,7 +2302,10 @@ mod tests {
});
assert_eq!(
fs.files(),
vec![Path::new("/test/file-0"), Path::new("/test/file-1")]
vec![
Path::new(path!("/test/file-0")),
Path::new(path!("/test/file-1"))
]
);
// Checkpoint isn't stored when there are no changes.
@@ -2199,7 +2346,10 @@ mod tests {
});
assert_eq!(
fs.files(),
vec![Path::new("/test/file-0"), Path::new("/test/file-1")]
vec![
Path::new(path!("/test/file-0")),
Path::new(path!("/test/file-1"))
]
);
// Rewinding the conversation truncates the history and restores the checkpoint.
@@ -2227,7 +2377,93 @@ mod tests {
"}
);
});
assert_eq!(fs.files(), vec![Path::new("/test/file-0")]);
assert_eq!(fs.files(), vec![Path::new(path!("/test/file-0"))]);
}
#[gpui::test]
async fn test_refusal(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.background_executor.clone());
fs.insert_tree(path!("/"), json!({})).await;
let project = Project::test(fs.clone(), [path!("/").as_ref()], cx).await;
let refuse_next = Arc::new(AtomicBool::new(false));
let connection = Rc::new(FakeAgentConnection::new().on_user_message({
let refuse_next = refuse_next.clone();
move |request, thread, mut cx| {
let refuse_next = refuse_next.clone();
async move {
if refuse_next.load(SeqCst) {
return Ok(acp::PromptResponse {
stop_reason: acp::StopReason::Refusal,
});
}
let acp::ContentBlock::Text(content) = &request.prompt[0] else {
panic!("expected text content block");
};
thread.update(&mut cx, |thread, cx| {
thread
.handle_session_update(
acp::SessionUpdate::AgentMessageChunk {
content: content.text.to_uppercase().into(),
},
cx,
)
.unwrap();
})?;
Ok(acp::PromptResponse {
stop_reason: acp::StopReason::EndTurn,
})
}
.boxed_local()
}
}));
let thread = cx
.update(|cx| connection.new_thread(project, Path::new(path!("/test")), cx))
.await
.unwrap();
cx.update(|cx| thread.update(cx, |thread, cx| thread.send(vec!["hello".into()], cx)))
.await
.unwrap();
thread.read_with(cx, |thread, cx| {
assert_eq!(
thread.to_markdown(cx),
indoc! {"
## User
hello
## Assistant
HELLO
"}
);
});
// Simulate refusing the second message, ensuring the conversation gets
// truncated to before sending it.
refuse_next.store(true, SeqCst);
cx.update(|cx| thread.update(cx, |thread, cx| thread.send(vec!["world".into()], cx)))
.await
.unwrap();
thread.read_with(cx, |thread, cx| {
assert_eq!(
thread.to_markdown(cx),
indoc! {"
## User
hello
## Assistant
HELLO
"}
);
});
}
async fn run_until_first_tool_call(
@@ -2311,7 +2547,7 @@ mod tests {
self: Rc<Self>,
project: Entity<Project>,
_cwd: &Path,
cx: &mut gpui::App,
cx: &mut App,
) -> Task<gpui::Result<Entity<AcpThread>>> {
let session_id = acp::SessionId(
rand::thread_rng()
@@ -2321,8 +2557,16 @@ mod tests {
.collect::<String>()
.into(),
);
let thread =
cx.new(|cx| AcpThread::new("Test", self.clone(), project, session_id.clone(), cx));
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let thread = cx.new(|_cx| {
AcpThread::new(
"Test",
self.clone(),
project,
action_log,
session_id.clone(),
)
});
self.sessions.lock().insert(session_id, thread.downgrade());
Task::ready(Ok(thread))
}
@@ -2354,6 +2598,14 @@ mod tests {
}
}
fn prompt_capabilities(&self) -> acp::PromptCapabilities {
acp::PromptCapabilities {
image: true,
audio: true,
embedded_context: true,
}
}
fn cancel(&self, session_id: &acp::SessionId, cx: &mut App) {
let sessions = self.sessions.lock();
let thread = sessions.get(session_id).unwrap().clone();

View File

@@ -5,11 +5,12 @@ use collections::IndexMap;
use gpui::{Entity, SharedString, Task};
use language_model::LanguageModelProviderId;
use project::Project;
use serde::{Deserialize, Serialize};
use std::{any::Any, error::Error, fmt, path::Path, rc::Rc, sync::Arc};
use ui::{App, IconName};
use uuid::Uuid;
#[derive(Clone, Debug, Eq, PartialEq)]
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize, Hash)]
pub struct UserMessageId(Arc<str>);
impl UserMessageId {
@@ -37,6 +38,8 @@ pub trait AgentConnection {
cx: &mut App,
) -> Task<Result<acp::PromptResponse>>;
fn prompt_capabilities(&self) -> acp::PromptCapabilities;
fn resume(
&self,
_session_id: &acp::SessionId,
@@ -63,6 +66,10 @@ pub trait AgentConnection {
None
}
fn telemetry(&self) -> Option<Rc<dyn AgentTelemetry>> {
None
}
fn into_any(self: Rc<Self>) -> Rc<dyn Any>;
}
@@ -80,6 +87,19 @@ pub trait AgentSessionResume {
fn run(&self, cx: &mut App) -> Task<Result<acp::PromptResponse>>;
}
pub trait AgentTelemetry {
/// The name of the agent used for telemetry.
fn agent_name(&self) -> String;
/// A representation of the current thread state that can be serialized for
/// storage with telemetry events.
fn thread_data(
&self,
session_id: &acp::SessionId,
cx: &mut App,
) -> Task<Result<serde_json::Value>>;
}
#[derive(Debug)]
pub struct AuthRequired {
pub description: Option<String>,
@@ -208,6 +228,7 @@ impl AgentModelList {
mod test_support {
use std::sync::Arc;
use action_log::ActionLog;
use collections::HashMap;
use futures::{channel::oneshot, future::try_join_all};
use gpui::{AppContext as _, WeakEntity};
@@ -295,8 +316,16 @@ mod test_support {
cx: &mut gpui::App,
) -> Task<gpui::Result<Entity<AcpThread>>> {
let session_id = acp::SessionId(self.sessions.lock().len().to_string().into());
let thread =
cx.new(|cx| AcpThread::new("Test", self.clone(), project, session_id.clone(), cx));
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let thread = cx.new(|_cx| {
AcpThread::new(
"Test",
self.clone(),
project,
action_log,
session_id.clone(),
)
});
self.sessions.lock().insert(
session_id,
Session {
@@ -307,6 +336,14 @@ mod test_support {
Task::ready(Ok(thread))
}
fn prompt_capabilities(&self) -> acp::PromptCapabilities {
acp::PromptCapabilities {
image: true,
audio: true,
embedded_context: true,
}
}
fn authenticate(
&self,
_method_id: acp::AuthMethodId,

View File

@@ -1,4 +1,3 @@
use agent_client_protocol as acp;
use anyhow::Result;
use buffer_diff::{BufferDiff, BufferDiffSnapshot};
use editor::{MultiBuffer, PathKey};
@@ -21,17 +20,13 @@ pub enum Diff {
}
impl Diff {
pub fn from_acp(
diff: acp::Diff,
pub fn finalized(
path: PathBuf,
old_text: Option<String>,
new_text: String,
language_registry: Arc<LanguageRegistry>,
cx: &mut Context<Self>,
) -> Self {
let acp::Diff {
path,
old_text,
new_text,
} = diff;
let multibuffer = cx.new(|_cx| MultiBuffer::without_headers(Capability::ReadOnly));
let new_buffer = cx.new(|cx| Buffer::local(new_text, cx));
@@ -227,7 +222,7 @@ impl PendingDiff {
fn finalize(&self, cx: &mut Context<Diff>) -> FinalizedDiff {
let ranges = self.excerpt_ranges(cx);
let base_text = self.base_text.clone();
let language_registry = self.buffer.read(cx).language_registry().clone();
let language_registry = self.buffer.read(cx).language_registry();
let path = self
.buffer
@@ -253,7 +248,6 @@ impl PendingDiff {
let buffer_diff = cx.spawn({
let buffer = buffer.clone();
let language_registry = language_registry.clone();
async move |_this, cx| {
build_buffer_diff(base_text, &buffer, language_registry, cx).await
}

View File

@@ -1,7 +1,8 @@
use agent::ThreadId;
use agent_client_protocol as acp;
use anyhow::{Context as _, Result, bail};
use file_icons::FileIcons;
use prompt_store::{PromptId, UserPromptId};
use serde::{Deserialize, Serialize};
use std::{
fmt,
ops::Range,
@@ -11,7 +12,7 @@ use std::{
use ui::{App, IconName, SharedString};
use url::Url;
#[derive(Clone, Debug, PartialEq, Eq)]
#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize, Hash)]
pub enum MentionUri {
File {
abs_path: PathBuf,
@@ -25,7 +26,7 @@ pub enum MentionUri {
line_range: Range<u32>,
},
Thread {
id: ThreadId,
id: acp::SessionId,
name: String,
},
TextThread {
@@ -51,6 +52,7 @@ impl MentionUri {
let path = url.path();
match url.scheme() {
"file" => {
let path = url.to_file_path().ok().context("Extracting file path")?;
if let Some(fragment) = url.fragment() {
let range = fragment
.strip_prefix("L")
@@ -71,31 +73,23 @@ impl MentionUri {
if let Some(name) = single_query_param(&url, "symbol")? {
Ok(Self::Symbol {
name,
path: path.into(),
path,
line_range,
})
} else {
Ok(Self::Selection {
path: path.into(),
line_range,
})
Ok(Self::Selection { path, line_range })
}
} else if input.ends_with("/") {
Ok(Self::Directory { abs_path: path })
} else {
let abs_path =
PathBuf::from(format!("{}{}", url.host_str().unwrap_or(""), path));
if input.ends_with("/") {
Ok(Self::Directory { abs_path })
} else {
Ok(Self::File { abs_path })
}
Ok(Self::File { abs_path: path })
}
}
"zed" => {
if let Some(thread_id) = path.strip_prefix("/agent/thread/") {
let name = single_query_param(&url, "name")?.context("Missing thread name")?;
Ok(Self::Thread {
id: thread_id.into(),
id: acp::SessionId(thread_id.into()),
name,
})
} else if let Some(path) = path.strip_prefix("/agent/text-thread/") {
@@ -161,27 +155,17 @@ impl MentionUri {
pub fn to_uri(&self) -> Url {
match self {
MentionUri::File { abs_path } => {
let mut url = Url::parse("file:///").unwrap();
let path = abs_path.to_string_lossy();
url.set_path(&path);
url
Url::from_file_path(abs_path).expect("mention path should be absolute")
}
MentionUri::Directory { abs_path } => {
let mut url = Url::parse("file:///").unwrap();
let mut path = abs_path.to_string_lossy().to_string();
if !path.ends_with("/") {
path.push_str("/");
}
url.set_path(&path);
url
Url::from_directory_path(abs_path).expect("mention path should be absolute")
}
MentionUri::Symbol {
path,
name,
line_range,
} => {
let mut url = Url::parse("file:///").unwrap();
url.set_path(&path.to_string_lossy());
let mut url = Url::from_file_path(path).expect("mention path should be absolute");
url.query_pairs_mut().append_pair("symbol", name);
url.set_fragment(Some(&format!(
"L{}:{}",
@@ -191,8 +175,7 @@ impl MentionUri {
url
}
MentionUri::Selection { path, line_range } => {
let mut url = Url::parse("file:///").unwrap();
url.set_path(&path.to_string_lossy());
let mut url = Url::from_file_path(path).expect("mention path should be absolute");
url.set_fragment(Some(&format!(
"L{}:{}",
line_range.start + 1,
@@ -265,15 +248,17 @@ pub fn selection_name(path: &Path, line_range: &Range<u32>) -> String {
#[cfg(test)]
mod tests {
use util::{path, uri};
use super::*;
#[test]
fn test_parse_file_uri() {
let file_uri = "file:///path/to/file.rs";
let file_uri = uri!("file:///path/to/file.rs");
let parsed = MentionUri::parse(file_uri).unwrap();
match &parsed {
MentionUri::File { abs_path } => {
assert_eq!(abs_path.to_str().unwrap(), "/path/to/file.rs");
assert_eq!(abs_path.to_str().unwrap(), path!("/path/to/file.rs"));
}
_ => panic!("Expected File variant"),
}
@@ -282,11 +267,11 @@ mod tests {
#[test]
fn test_parse_directory_uri() {
let file_uri = "file:///path/to/dir/";
let file_uri = uri!("file:///path/to/dir/");
let parsed = MentionUri::parse(file_uri).unwrap();
match &parsed {
MentionUri::Directory { abs_path } => {
assert_eq!(abs_path.to_str().unwrap(), "/path/to/dir/");
assert_eq!(abs_path.to_str().unwrap(), path!("/path/to/dir/"));
}
_ => panic!("Expected Directory variant"),
}
@@ -296,22 +281,24 @@ mod tests {
#[test]
fn test_to_directory_uri_with_slash() {
let uri = MentionUri::Directory {
abs_path: PathBuf::from("/path/to/dir/"),
abs_path: PathBuf::from(path!("/path/to/dir/")),
};
assert_eq!(uri.to_uri().to_string(), "file:///path/to/dir/");
let expected = uri!("file:///path/to/dir/");
assert_eq!(uri.to_uri().to_string(), expected);
}
#[test]
fn test_to_directory_uri_without_slash() {
let uri = MentionUri::Directory {
abs_path: PathBuf::from("/path/to/dir"),
abs_path: PathBuf::from(path!("/path/to/dir")),
};
assert_eq!(uri.to_uri().to_string(), "file:///path/to/dir/");
let expected = uri!("file:///path/to/dir/");
assert_eq!(uri.to_uri().to_string(), expected);
}
#[test]
fn test_parse_symbol_uri() {
let symbol_uri = "file:///path/to/file.rs?symbol=MySymbol#L10:20";
let symbol_uri = uri!("file:///path/to/file.rs?symbol=MySymbol#L10:20");
let parsed = MentionUri::parse(symbol_uri).unwrap();
match &parsed {
MentionUri::Symbol {
@@ -319,7 +306,7 @@ mod tests {
name,
line_range,
} => {
assert_eq!(path.to_str().unwrap(), "/path/to/file.rs");
assert_eq!(path.to_str().unwrap(), path!("/path/to/file.rs"));
assert_eq!(name, "MySymbol");
assert_eq!(line_range.start, 9);
assert_eq!(line_range.end, 19);
@@ -331,11 +318,11 @@ mod tests {
#[test]
fn test_parse_selection_uri() {
let selection_uri = "file:///path/to/file.rs#L5:15";
let selection_uri = uri!("file:///path/to/file.rs#L5:15");
let parsed = MentionUri::parse(selection_uri).unwrap();
match &parsed {
MentionUri::Selection { path, line_range } => {
assert_eq!(path.to_str().unwrap(), "/path/to/file.rs");
assert_eq!(path.to_str().unwrap(), path!("/path/to/file.rs"));
assert_eq!(line_range.start, 4);
assert_eq!(line_range.end, 14);
}
@@ -417,32 +404,35 @@ mod tests {
#[test]
fn test_invalid_line_range_format() {
// Missing L prefix
assert!(MentionUri::parse("file:///path/to/file.rs#10:20").is_err());
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#10:20")).is_err());
// Missing colon separator
assert!(MentionUri::parse("file:///path/to/file.rs#L1020").is_err());
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#L1020")).is_err());
// Invalid numbers
assert!(MentionUri::parse("file:///path/to/file.rs#L10:abc").is_err());
assert!(MentionUri::parse("file:///path/to/file.rs#Labc:20").is_err());
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#L10:abc")).is_err());
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#Labc:20")).is_err());
}
#[test]
fn test_invalid_query_parameters() {
// Invalid query parameter name
assert!(MentionUri::parse("file:///path/to/file.rs#L10:20?invalid=test").is_err());
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#L10:20?invalid=test")).is_err());
// Too many query parameters
assert!(
MentionUri::parse("file:///path/to/file.rs#L10:20?symbol=test&another=param").is_err()
MentionUri::parse(uri!(
"file:///path/to/file.rs#L10:20?symbol=test&another=param"
))
.is_err()
);
}
#[test]
fn test_zero_based_line_numbers() {
// Test that 0-based line numbers are rejected (should be 1-based)
assert!(MentionUri::parse("file:///path/to/file.rs#L0:10").is_err());
assert!(MentionUri::parse("file:///path/to/file.rs#L1:0").is_err());
assert!(MentionUri::parse("file:///path/to/file.rs#L0:0").is_err());
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#L0:10")).is_err());
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#L1:0")).is_err());
assert!(MentionUri::parse(uri!("file:///path/to/file.rs#L0:0")).is_err());
}
}
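The MentionUri changes above replace hand-assembled `file://` URLs (`Url::parse("file:///")` plus `set_path`) with the `url` crate's dedicated constructors, and wrap test literals in `path!`/`uri!` so the expectations also hold on Windows. A minimal sketch of the constructor behavior, assuming only the `url` crate and Unix-style paths:

```rust
use std::path::Path;
use url::Url;

fn main() {
    // from_file_path rejects relative paths and percent-encodes as needed,
    // which is what the expect("mention path should be absolute") calls rely on.
    let file = Url::from_file_path(Path::new("/path/to/file.rs")).expect("absolute path");
    assert_eq!(file.as_str(), "file:///path/to/file.rs");

    // from_directory_path appends the trailing slash the old code added by hand.
    let dir = Url::from_directory_path(Path::new("/path/to/dir")).expect("absolute path");
    assert_eq!(dir.as_str(), "file:///path/to/dir/");

    // Relative paths are refused outright.
    assert!(Url::from_file_path(Path::new("relative/file.rs")).is_err());
}
```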

View File

@@ -116,7 +116,7 @@ impl ActionLog {
} else if buffer
.read(cx)
.file()
.map_or(false, |file| file.disk_state().exists())
.is_some_and(|file| file.disk_state().exists())
{
TrackedBufferStatus::Created {
existing_file_content: Some(buffer.read(cx).as_rope().clone()),
@@ -161,7 +161,7 @@ impl ActionLog {
diff_base,
last_seen_base,
unreviewed_edits,
snapshot: text_snapshot.clone(),
snapshot: text_snapshot,
status,
version: buffer.read(cx).version(),
diff,
@@ -190,7 +190,7 @@ impl ActionLog {
cx: &mut Context<Self>,
) {
match event {
BufferEvent::Edited { .. } => self.handle_buffer_edited(buffer, cx),
BufferEvent::Edited => self.handle_buffer_edited(buffer, cx),
BufferEvent::FileHandleChanged => {
self.handle_buffer_file_changed(buffer, cx);
}
@@ -215,7 +215,7 @@ impl ActionLog {
if buffer
.read(cx)
.file()
.map_or(false, |file| file.disk_state() == DiskState::Deleted)
.is_some_and(|file| file.disk_state() == DiskState::Deleted)
{
// If the buffer had been edited by a tool, but it got
// deleted externally, we want to stop tracking it.
@@ -227,7 +227,7 @@ impl ActionLog {
if buffer
.read(cx)
.file()
.map_or(false, |file| file.disk_state() != DiskState::Deleted)
.is_some_and(|file| file.disk_state() != DiskState::Deleted)
{
// If the buffer had been deleted by a tool, but it got
// resurrected externally, we want to clear the edits we
@@ -264,15 +264,14 @@ impl ActionLog {
if let Some((git_diff, (buffer_repo, _))) = git_diff.as_ref().zip(buffer_repo) {
cx.update(|cx| {
let mut old_head = buffer_repo.read(cx).head_commit.clone();
Some(cx.subscribe(git_diff, move |_, event, cx| match event {
buffer_diff::BufferDiffEvent::DiffChanged { .. } => {
Some(cx.subscribe(git_diff, move |_, event, cx| {
if let buffer_diff::BufferDiffEvent::DiffChanged { .. } = event {
let new_head = buffer_repo.read(cx).head_commit.clone();
if new_head != old_head {
old_head = new_head;
git_diff_updates_tx.send(()).ok();
}
}
_ => {}
}))
})?
} else {
@@ -462,7 +461,7 @@ impl ActionLog {
anyhow::Ok((
tracked_buffer.diff.clone(),
buffer.read(cx).language().cloned(),
buffer.read(cx).language_registry().clone(),
buffer.read(cx).language_registry(),
))
})??;
let diff_snapshot = BufferDiff::update_diff(
@@ -530,12 +529,12 @@ impl ActionLog {
/// Mark a buffer as created by agent, so we can refresh it in the context
pub fn buffer_created(&mut self, buffer: Entity<Buffer>, cx: &mut Context<Self>) {
self.track_buffer_internal(buffer.clone(), true, cx);
self.track_buffer_internal(buffer, true, cx);
}
/// Mark a buffer as edited by agent, so we can refresh it in the context
pub fn buffer_edited(&mut self, buffer: Entity<Buffer>, cx: &mut Context<Self>) {
let tracked_buffer = self.track_buffer_internal(buffer.clone(), false, cx);
let tracked_buffer = self.track_buffer_internal(buffer, false, cx);
if let TrackedBufferStatus::Deleted = tracked_buffer.status {
tracked_buffer.status = TrackedBufferStatus::Modified;
}
@@ -614,10 +613,10 @@ impl ActionLog {
false
}
});
if tracked_buffer.unreviewed_edits.is_empty() {
if let TrackedBufferStatus::Created { .. } = &mut tracked_buffer.status {
tracked_buffer.status = TrackedBufferStatus::Modified;
}
if tracked_buffer.unreviewed_edits.is_empty()
&& let TrackedBufferStatus::Created { .. } = &mut tracked_buffer.status
{
tracked_buffer.status = TrackedBufferStatus::Modified;
}
tracked_buffer.schedule_diff_update(ChangeAuthor::User, cx);
}
@@ -811,7 +810,7 @@ impl ActionLog {
tracked.version != buffer.version
&& buffer
.file()
.map_or(false, |file| file.disk_state() != DiskState::Deleted)
.is_some_and(|file| file.disk_state() != DiskState::Deleted)
})
.map(|(buffer, _)| buffer)
}
@@ -847,7 +846,7 @@ fn apply_non_conflicting_edits(
conflict = true;
if new_edits
.peek()
.map_or(false, |next_edit| next_edit.old.overlaps(&old_edit.new))
.is_some_and(|next_edit| next_edit.old.overlaps(&old_edit.new))
{
new_edit = new_edits.next().unwrap();
} else {
@@ -2426,7 +2425,7 @@ mod tests {
assert_eq!(
unreviewed_hunks(&action_log, cx),
vec![(
buffer.clone(),
buffer,
vec![
HunkStatus {
range: Point::new(6, 0)..Point::new(7, 0),
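Most of the ActionLog hunks are the same mechanical cleanup: `Option::map_or(false, f)` becomes `Option::is_some_and(f)`, which reads as the predicate it is (stable since Rust 1.70). A sketch with a stand-in predicate:

```rust
fn exists_on_disk(disk_state: Option<&str>) -> bool {
    // Before: disk_state.map_or(false, |state| state == "present")
    disk_state.is_some_and(|state| state == "present")
}

fn main() {
    assert!(exists_on_disk(Some("present")));
    assert!(!exists_on_disk(Some("deleted")));
    assert!(!exists_on_disk(None));
}
```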

View File

@@ -103,26 +103,21 @@ impl ActivityIndicator {
cx.subscribe_in(
&workspace_handle,
window,
|activity_indicator, _, event, window, cx| match event {
workspace::Event::ClearActivityIndicator { .. } => {
if activity_indicator.statuses.pop().is_some() {
activity_indicator.dismiss_error_message(
&DismissErrorMessage,
window,
cx,
);
cx.notify();
}
|activity_indicator, _, event, window, cx| {
if let workspace::Event::ClearActivityIndicator = event
&& activity_indicator.statuses.pop().is_some()
{
activity_indicator.dismiss_error_message(&DismissErrorMessage, window, cx);
cx.notify();
}
_ => {}
},
)
.detach();
cx.subscribe(
&project.read(cx).lsp_store(),
|activity_indicator, _, event, cx| match event {
LspStoreEvent::LanguageServerUpdate { name, message, .. } => {
|activity_indicator, _, event, cx| {
if let LspStoreEvent::LanguageServerUpdate { name, message, .. } = event {
if let proto::update_language_server::Variant::StatusUpdate(status_update) =
message
{
@@ -191,7 +186,6 @@ impl ActivityIndicator {
}
cx.notify()
}
_ => {}
},
)
.detach();
@@ -206,9 +200,10 @@ impl ActivityIndicator {
cx.subscribe(
&project.read(cx).git_store().clone(),
|_, _, event: &GitStoreEvent, cx| match event {
project::git_store::GitStoreEvent::JobsUpdated => cx.notify(),
_ => {}
|_, _, event: &GitStoreEvent, cx| {
if let project::git_store::GitStoreEvent::JobsUpdated = event {
cx.notify()
}
},
)
.detach();
@@ -458,26 +453,24 @@ impl ActivityIndicator {
.map(|r| r.read(cx))
.and_then(Repository::current_job);
// Show any long-running git command
if let Some(job_info) = current_job {
if Instant::now() - job_info.start >= GIT_OPERATION_DELAY {
return Some(Content {
icon: Some(
Icon::new(IconName::ArrowCircle)
.size(IconSize::Small)
.with_animation(
"arrow-circle",
Animation::new(Duration::from_secs(2)).repeat(),
|icon, delta| {
icon.transform(Transformation::rotate(percentage(delta)))
},
)
.into_any_element(),
),
message: job_info.message.into(),
on_click: None,
tooltip_message: None,
});
}
if let Some(job_info) = current_job
&& Instant::now() - job_info.start >= GIT_OPERATION_DELAY
{
return Some(Content {
icon: Some(
Icon::new(IconName::ArrowCircle)
.size(IconSize::Small)
.with_animation(
"arrow-circle",
Animation::new(Duration::from_secs(2)).repeat(),
|icon, delta| icon.transform(Transformation::rotate(percentage(delta))),
)
.into_any_element(),
),
message: job_info.message.into(),
on_click: None,
tooltip_message: None,
});
}
// Show any language server installation info.
@@ -740,21 +733,20 @@ impl ActivityIndicator {
if let Some(extension_store) =
ExtensionStore::try_global(cx).map(|extension_store| extension_store.read(cx))
&& let Some(extension_id) = extension_store.outstanding_operations().keys().next()
{
if let Some(extension_id) = extension_store.outstanding_operations().keys().next() {
return Some(Content {
icon: Some(
Icon::new(IconName::Download)
.size(IconSize::Small)
.into_any_element(),
),
message: format!("Updating {extension_id} extension…"),
on_click: Some(Arc::new(|this, window, cx| {
this.dismiss_error_message(&DismissErrorMessage, window, cx)
})),
tooltip_message: None,
});
}
return Some(Content {
icon: Some(
Icon::new(IconName::Download)
.size(IconSize::Small)
.into_any_element(),
),
message: format!("Updating {extension_id} extension…"),
on_click: Some(Arc::new(|this, window, cx| {
this.dismiss_error_message(&DismissErrorMessage, window, cx)
})),
tooltip_message: None,
});
}
None
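The ActivityIndicator rewrites collapse single-arm `match` statements into `if let`, and fold nested conditions into let-chains (`if let … && …`). Let-chains require the 2024 edition and a recent rustc, which the rest of this diff appears to assume. A small stand-alone illustration with stand-in types:

```rust
enum Event {
    ClearActivityIndicator,
    Other,
}

fn handle(event: &Event, statuses: &mut Vec<&'static str>) -> bool {
    // Before: match event { Event::ClearActivityIndicator => { if statuses.pop()... }, _ => {} }
    if let Event::ClearActivityIndicator = event
        && statuses.pop().is_some()
    {
        return true;
    }
    false
}

fn main() {
    let mut statuses = vec!["downloading language server"];
    assert!(handle(&Event::ClearActivityIndicator, &mut statuses));
    assert!(!handle(&Event::Other, &mut statuses));
}
```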

View File

@@ -31,7 +31,6 @@ collections.workspace = true
component.workspace = true
context_server.workspace = true
convert_case.workspace = true
feature_flags.workspace = true
fs.workspace = true
futures.workspace = true
git.workspace = true

View File

@@ -90,7 +90,7 @@ impl AgentProfile {
return false;
};
return Self::is_enabled(settings, source, tool_name);
Self::is_enabled(settings, source, tool_name)
}
fn is_enabled(settings: &AgentProfileSettings, source: ToolSource, name: String) -> bool {
@@ -132,7 +132,7 @@ mod tests {
});
let tool_set = default_tool_set(cx);
let profile = AgentProfile::new(id.clone(), tool_set);
let profile = AgentProfile::new(id, tool_set);
let mut enabled_tools = cx
.read(|cx| profile.enabled_tools(cx))
@@ -169,7 +169,7 @@ mod tests {
});
let tool_set = default_tool_set(cx);
let profile = AgentProfile::new(id.clone(), tool_set);
let profile = AgentProfile::new(id, tool_set);
let mut enabled_tools = cx
.read(|cx| profile.enabled_tools(cx))
@@ -202,7 +202,7 @@ mod tests {
});
let tool_set = default_tool_set(cx);
let profile = AgentProfile::new(id.clone(), tool_set);
let profile = AgentProfile::new(id, tool_set);
let mut enabled_tools = cx
.read(|cx| profile.enabled_tools(cx))
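Two more routine cleanups in the AgentProfile hunks: the trailing `return` goes away (clippy's `needless_return`), and `id` is no longer cloned on its final use. In miniature, with stand-in types:

```rust
struct ProfileId(String);

fn is_enabled(id: &ProfileId) -> bool {
    !id.0.is_empty()
}

fn make_profile(id: ProfileId) -> (ProfileId, bool) {
    let enabled = is_enabled(&id);
    // Before: `return (id.clone(), enabled);` — the clone and the `return` are both unnecessary.
    (id, enabled)
}

fn main() {
    let (id, enabled) = make_profile(ProfileId("write".into()));
    assert!(enabled);
    assert_eq!(id.0, "write");
}
```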

View File

@@ -201,24 +201,24 @@ impl FileContextHandle {
parse_status.changed().await.log_err();
}
if let Ok(snapshot) = buffer.read_with(cx, |buffer, _| buffer.snapshot()) {
if let Some(outline) = snapshot.outline(None) {
let items = outline
.items
.into_iter()
.map(|item| item.to_point(&snapshot));
if let Ok(snapshot) = buffer.read_with(cx, |buffer, _| buffer.snapshot())
&& let Some(outline) = snapshot.outline(None)
{
let items = outline
.items
.into_iter()
.map(|item| item.to_point(&snapshot));
if let Ok(outline_text) =
outline::render_outline(items, None, 0, usize::MAX).await
{
let context = AgentContext::File(FileContext {
handle: self,
full_path,
text: outline_text.into(),
is_outline: true,
});
return Some((context, vec![buffer]));
}
if let Ok(outline_text) =
outline::render_outline(items, None, 0, usize::MAX).await
{
let context = AgentContext::File(FileContext {
handle: self,
full_path,
text: outline_text.into(),
is_outline: true,
});
return Some((context, vec![buffer]));
}
}
}
@@ -362,7 +362,7 @@ impl Display for DirectoryContext {
let mut is_first = true;
for descendant in &self.descendants {
if !is_first {
write!(f, "\n")?;
writeln!(f)?;
} else {
is_first = false;
}
@@ -650,7 +650,7 @@ impl TextThreadContextHandle {
impl Display for TextThreadContext {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
// TODO: escape title?
write!(f, "<text_thread title=\"{}\">\n", self.title)?;
writeln!(f, "<text_thread title=\"{}\">", self.title)?;
write!(f, "{}", self.text.trim())?;
write!(f, "\n</text_thread>")
}
@@ -716,7 +716,7 @@ impl RulesContextHandle {
impl Display for RulesContext {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
if let Some(title) = &self.title {
write!(f, "Rules title: {}\n", title)?;
writeln!(f, "Rules title: {}", title)?;
}
let code_block = MarkdownCodeBlock {
tag: "",

View File

@@ -86,15 +86,13 @@ impl Tool for ContextServerTool {
) -> ToolResult {
if let Some(server) = self.store.read(cx).get_running_server(&self.server_id) {
let tool_name = self.tool.name.clone();
let server_clone = server.clone();
let input_clone = input.clone();
cx.spawn(async move |_cx| {
let Some(protocol) = server_clone.client() else {
let Some(protocol) = server.client() else {
bail!("Context server not initialized");
};
let arguments = if let serde_json::Value::Object(map) = input_clone {
let arguments = if let serde_json::Value::Object(map) = input {
Some(map.into_iter().collect())
} else {
None

View File

@@ -338,11 +338,9 @@ impl ContextStore {
image_task,
context_id: self.next_context_id.post_inc(),
});
if self.has_context(&context) {
if remove_if_exists {
self.remove_context(&context, cx);
return None;
}
if self.has_context(&context) && remove_if_exists {
self.remove_context(&context, cx);
return None;
}
self.insert_context(context.clone(), cx);

View File

@@ -254,10 +254,9 @@ impl HistoryStore {
}
pub fn remove_recently_opened_thread(&mut self, id: ThreadId, cx: &mut Context<Self>) {
self.recently_opened_entries.retain(|entry| match entry {
HistoryEntryId::Thread(thread_id) if thread_id == &id => false,
_ => true,
});
self.recently_opened_entries.retain(
|entry| !matches!(entry, HistoryEntryId::Thread(thread_id) if thread_id == &id),
);
self.save_recently_opened_entries(cx);
}
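The `retain` closure that matched on every variant just to return a boolean becomes a single `!matches!` with a pattern guard. With stand-in types:

```rust
#[derive(Debug, PartialEq)]
enum HistoryEntryId {
    Thread(u64),
    TextThread(u64),
}

fn remove_recently_opened_thread(entries: &mut Vec<HistoryEntryId>, id: u64) {
    entries.retain(|entry| !matches!(entry, HistoryEntryId::Thread(thread_id) if *thread_id == id));
}

fn main() {
    let mut entries = vec![HistoryEntryId::Thread(1), HistoryEntryId::TextThread(1)];
    remove_recently_opened_thread(&mut entries, 1);
    assert_eq!(entries, vec![HistoryEntryId::TextThread(1)]);
}
```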

View File

@@ -9,14 +9,16 @@ use crate::{
tool_use::{PendingToolUse, ToolUse, ToolUseMetadata, ToolUseState},
};
use action_log::ActionLog;
use agent_settings::{AgentProfileId, AgentSettings, CompletionMode, SUMMARIZE_THREAD_PROMPT};
use agent_settings::{
AgentProfileId, AgentSettings, CompletionMode, SUMMARIZE_THREAD_DETAILED_PROMPT,
SUMMARIZE_THREAD_PROMPT,
};
use anyhow::{Result, anyhow};
use assistant_tool::{AnyToolCard, Tool, ToolWorkingSet};
use chrono::{DateTime, Utc};
use client::{ModelRequestUsage, RequestUsage};
use cloud_llm_client::{CompletionIntent, CompletionRequestStatus, Plan, UsageLimit};
use collections::HashMap;
use feature_flags::{self, FeatureFlagAppExt};
use futures::{FutureExt, StreamExt as _, future::Shared};
use git::repository::DiffType;
use gpui::{
@@ -108,7 +110,7 @@ impl std::fmt::Display for PromptId {
}
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone, Copy, Serialize, Deserialize)]
pub struct MessageId(pub(crate) usize);
pub struct MessageId(pub usize);
impl MessageId {
fn post_inc(&mut self) -> Self {
@@ -179,7 +181,7 @@ impl Message {
}
}
pub fn to_string(&self) -> String {
pub fn to_message_content(&self) -> String {
let mut result = String::new();
if !self.loaded_context.text.is_empty() {
@@ -385,10 +387,8 @@ pub struct Thread {
cumulative_token_usage: TokenUsage,
exceeded_window_error: Option<ExceededWindowError>,
tool_use_limit_reached: bool,
feedback: Option<ThreadFeedback>,
retry_state: Option<RetryState>,
message_feedback: HashMap<MessageId, ThreadFeedback>,
last_auto_capture_at: Option<Instant>,
last_received_chunk_at: Option<Instant>,
request_callback: Option<
Box<dyn FnMut(&LanguageModelRequest, &[Result<LanguageModelCompletionEvent, String>])>,
@@ -486,15 +486,13 @@ impl Thread {
cumulative_token_usage: TokenUsage::default(),
exceeded_window_error: None,
tool_use_limit_reached: false,
feedback: None,
retry_state: None,
message_feedback: HashMap::default(),
last_auto_capture_at: None,
last_error_context: None,
last_received_chunk_at: None,
request_callback: None,
remaining_turns: u32::MAX,
configured_model: configured_model.clone(),
configured_model,
profile: AgentProfile::new(profile_id, tools),
}
}
@@ -532,7 +530,7 @@ impl Thread {
.and_then(|model| {
let model = SelectedModel {
provider: model.provider.clone().into(),
model: model.model.clone().into(),
model: model.model.into(),
};
registry.select_model(&model, cx)
})
@@ -612,9 +610,7 @@ impl Thread {
cumulative_token_usage: serialized.cumulative_token_usage,
exceeded_window_error: None,
tool_use_limit_reached: serialized.tool_use_limit_reached,
feedback: None,
message_feedback: HashMap::default(),
last_auto_capture_at: None,
last_error_context: None,
last_received_chunk_at: None,
request_callback: None,
@@ -1033,8 +1029,6 @@ impl Thread {
});
}
self.auto_capture_telemetry(cx);
message_id
}
@@ -1649,17 +1643,15 @@ impl Thread {
};
self.tool_use
.request_tool_use(tool_message_id, tool_use, tool_use_metadata.clone(), cx);
.request_tool_use(tool_message_id, tool_use, tool_use_metadata, cx);
let pending_tool_use = self.tool_use.insert_tool_output(
tool_use_id.clone(),
self.tool_use.insert_tool_output(
tool_use_id,
tool_name,
tool_output,
self.configured_model.as_ref(),
self.completion_mode,
);
pending_tool_use
)
}
pub fn stream_completion(
@@ -1906,7 +1898,6 @@ impl Thread {
cx.emit(ThreadEvent::StreamedCompletion);
cx.notify();
thread.auto_capture_telemetry(cx);
Ok(())
})??;
@@ -1974,11 +1965,9 @@ impl Thread {
if let Some(prev_message) =
thread.messages.get(ix - 1)
{
if prev_message.role == Role::Assistant {
&& prev_message.role == Role::Assistant {
break;
}
}
}
}
@@ -2081,8 +2070,6 @@ impl Thread {
request_callback(request, response_events);
}
thread.auto_capture_telemetry(cx);
if let Ok(initial_usage) = initial_token_usage {
let usage = thread.cumulative_token_usage - initial_usage;
@@ -2438,12 +2425,10 @@ impl Thread {
return;
}
let added_user_message = include_str!("./prompts/summarize_thread_detailed_prompt.txt");
let request = self.to_summarize_request(
&model,
CompletionIntent::ThreadContextSummarization,
added_user_message.into(),
SUMMARIZE_THREAD_DETAILED_PROMPT.into(),
cx,
);
@@ -2485,13 +2470,13 @@ impl Thread {
.ok()?;
// Save thread so its summary can be reused later
if let Some(thread) = thread.upgrade() {
if let Ok(Ok(save_task)) = cx.update(|cx| {
if let Some(thread) = thread.upgrade()
&& let Ok(Ok(save_task)) = cx.update(|cx| {
thread_store
.update(cx, |thread_store, cx| thread_store.save_thread(&thread, cx))
}) {
save_task.await.log_err();
}
})
{
save_task.await.log_err();
}
Some(())
@@ -2536,7 +2521,6 @@ impl Thread {
model: Arc<dyn LanguageModel>,
cx: &mut Context<Self>,
) -> Vec<PendingToolUse> {
self.auto_capture_telemetry(cx);
let request =
Arc::new(self.to_completion_request(model.clone(), CompletionIntent::ToolResults, cx));
let pending_tool_uses = self
@@ -2740,13 +2724,11 @@ impl Thread {
window: Option<AnyWindowHandle>,
cx: &mut Context<Self>,
) {
if self.all_tools_finished() {
if let Some(ConfiguredModel { model, .. }) = self.configured_model.as_ref() {
if !canceled {
self.send_to_model(model.clone(), CompletionIntent::ToolResults, window, cx);
}
self.auto_capture_telemetry(cx);
}
if self.all_tools_finished()
&& let Some(ConfiguredModel { model, .. }) = self.configured_model.as_ref()
&& !canceled
{
self.send_to_model(model.clone(), CompletionIntent::ToolResults, window, cx);
}
cx.emit(ThreadEvent::ToolFinished {
@@ -2802,10 +2784,6 @@ impl Thread {
cx.emit(ThreadEvent::CancelEditing);
}
pub fn feedback(&self) -> Option<ThreadFeedback> {
self.feedback
}
pub fn message_feedback(&self, message_id: MessageId) -> Option<ThreadFeedback> {
self.message_feedback.get(&message_id).copied()
}
@@ -2838,7 +2816,7 @@ impl Thread {
let message_content = self
.message(message_id)
.map(|msg| msg.to_string())
.map(|msg| msg.to_message_content())
.unwrap_or_default();
cx.background_spawn(async move {
@@ -2867,52 +2845,6 @@ impl Thread {
})
}
pub fn report_feedback(
&mut self,
feedback: ThreadFeedback,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
let last_assistant_message_id = self
.messages
.iter()
.rev()
.find(|msg| msg.role == Role::Assistant)
.map(|msg| msg.id);
if let Some(message_id) = last_assistant_message_id {
self.report_message_feedback(message_id, feedback, cx)
} else {
let final_project_snapshot = Self::project_snapshot(self.project.clone(), cx);
let serialized_thread = self.serialize(cx);
let thread_id = self.id().clone();
let client = self.project.read(cx).client();
self.feedback = Some(feedback);
cx.notify();
cx.background_spawn(async move {
let final_project_snapshot = final_project_snapshot.await;
let serialized_thread = serialized_thread.await?;
let thread_data = serde_json::to_value(serialized_thread)
.unwrap_or_else(|_| serde_json::Value::Null);
let rating = match feedback {
ThreadFeedback::Positive => "positive",
ThreadFeedback::Negative => "negative",
};
telemetry::event!(
"Assistant Thread Rated",
rating,
thread_id,
thread_data,
final_project_snapshot
);
client.telemetry().flush_events().await;
Ok(())
})
}
}
/// Create a snapshot of the current project state including git information and unsaved buffers.
fn project_snapshot(
project: Entity<Project>,
@@ -2933,11 +2865,11 @@ impl Thread {
let buffer_store = project.read(app_cx).buffer_store();
for buffer_handle in buffer_store.read(app_cx).buffers() {
let buffer = buffer_handle.read(app_cx);
if buffer.is_dirty() {
if let Some(file) = buffer.file() {
let path = file.path().to_string_lossy().to_string();
unsaved_buffers.push(path);
}
if buffer.is_dirty()
&& let Some(file) = buffer.file()
{
let path = file.path().to_string_lossy().to_string();
unsaved_buffers.push(path);
}
}
})
@@ -3147,50 +3079,6 @@ impl Thread {
&self.project
}
pub fn auto_capture_telemetry(&mut self, cx: &mut Context<Self>) {
if !cx.has_flag::<feature_flags::ThreadAutoCaptureFeatureFlag>() {
return;
}
let now = Instant::now();
if let Some(last) = self.last_auto_capture_at {
if now.duration_since(last).as_secs() < 10 {
return;
}
}
self.last_auto_capture_at = Some(now);
let thread_id = self.id().clone();
let github_login = self
.project
.read(cx)
.user_store()
.read(cx)
.current_user()
.map(|user| user.github_login.clone());
let client = self.project.read(cx).client();
let serialize_task = self.serialize(cx);
cx.background_executor()
.spawn(async move {
if let Ok(serialized_thread) = serialize_task.await {
if let Ok(thread_data) = serde_json::to_value(serialized_thread) {
telemetry::event!(
"Agent Thread Auto-Captured",
thread_id = thread_id.to_string(),
thread_data = thread_data,
auto_capture_reason = "tracked_user",
github_login = github_login
);
client.telemetry().flush_events().await;
}
}
})
.detach();
}
pub fn cumulative_token_usage(&self) -> TokenUsage {
self.cumulative_token_usage
}
@@ -3233,13 +3121,13 @@ impl Thread {
.model
.max_token_count_for_mode(self.completion_mode().into());
if let Some(exceeded_error) = &self.exceeded_window_error {
if model.model.id() == exceeded_error.model_id {
return Some(TotalTokenUsage {
total: exceeded_error.token_count,
max,
});
}
if let Some(exceeded_error) = &self.exceeded_window_error
&& model.model.id() == exceeded_error.model_id
{
return Some(TotalTokenUsage {
total: exceeded_error.token_count,
max,
});
}
let total = self
@@ -3300,7 +3188,7 @@ impl Thread {
self.configured_model.as_ref(),
self.completion_mode,
);
self.tool_finished(tool_use_id.clone(), None, true, window, cx);
self.tool_finished(tool_use_id, None, true, window, cx);
}
}
@@ -3932,7 +3820,7 @@ fn main() {{
AgentSettings {
model_parameters: vec![LanguageModelParameters {
provider: Some(model.provider_id().0.to_string().into()),
model: Some(model.id().0.clone()),
model: Some(model.id().0),
temperature: Some(0.66),
}],
..AgentSettings::get_global(cx).clone()
@@ -3952,7 +3840,7 @@ fn main() {{
AgentSettings {
model_parameters: vec![LanguageModelParameters {
provider: None,
model: Some(model.id().0.clone()),
model: Some(model.id().0),
temperature: Some(0.66),
}],
..AgentSettings::get_global(cx).clone()
@@ -3992,7 +3880,7 @@ fn main() {{
AgentSettings {
model_parameters: vec![LanguageModelParameters {
provider: Some("anthropic".into()),
model: Some(model.id().0.clone()),
model: Some(model.id().0),
temperature: Some(0.66),
}],
..AgentSettings::get_global(cx).clone()
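Among the larger Thread changes, note the rename of the inherent `to_string` method to `to_message_content`, with its call site updated in the same file. An inherent `to_string` is easy to confuse with the `ToString::to_string` that a `Display` impl provides, and clippy flags it via `inherent_to_string`; the new name says what the string actually is. A stand-in illustration (not Zed's `Message` type):

```rust
struct Message {
    loaded_context: String,
    text: String,
}

impl Message {
    // Previously `to_string`, which clippy's inherent_to_string lint discourages.
    fn to_message_content(&self) -> String {
        let mut result = String::new();
        if !self.loaded_context.is_empty() {
            result.push_str(&self.loaded_context);
        }
        result.push_str(&self.text);
        result
    }
}

fn main() {
    let message = Message {
        loaded_context: "<context>rules</context>\n".into(),
        text: "What does b.md mean?".into(),
    };
    assert!(message.to_message_content().starts_with("<context>"));
}
```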

View File

@@ -42,7 +42,7 @@ use std::{
use util::ResultExt as _;
pub static ZED_STATELESS: std::sync::LazyLock<bool> =
std::sync::LazyLock::new(|| std::env::var("ZED_STATELESS").map_or(false, |v| !v.is_empty()));
std::sync::LazyLock::new(|| std::env::var("ZED_STATELESS").is_ok_and(|v| !v.is_empty()));
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub enum DataType {
@@ -74,7 +74,7 @@ impl Column for DataType {
}
}
const RULES_FILE_NAMES: [&'static str; 9] = [
const RULES_FILE_NAMES: [&str; 9] = [
".rules",
".cursorrules",
".windsurfrules",
@@ -581,33 +581,32 @@ impl ThreadStore {
return;
};
if protocol.capable(context_server::protocol::ServerCapability::Tools) {
if let Some(response) = protocol
if protocol.capable(context_server::protocol::ServerCapability::Tools)
&& let Some(response) = protocol
.request::<context_server::types::requests::ListTools>(())
.await
.log_err()
{
let tool_ids = tool_working_set
.update(cx, |tool_working_set, cx| {
tool_working_set.extend(
response.tools.into_iter().map(|tool| {
Arc::new(ContextServerTool::new(
context_server_store.clone(),
server.id(),
tool,
)) as Arc<dyn Tool>
}),
cx,
)
})
.log_err();
{
let tool_ids = tool_working_set
.update(cx, |tool_working_set, cx| {
tool_working_set.extend(
response.tools.into_iter().map(|tool| {
Arc::new(ContextServerTool::new(
context_server_store.clone(),
server.id(),
tool,
)) as Arc<dyn Tool>
}),
cx,
)
})
.log_err();
if let Some(tool_ids) = tool_ids {
this.update(cx, |this, _| {
this.context_server_tool_ids.insert(server_id, tool_ids);
})
.log_err();
}
if let Some(tool_ids) = tool_ids {
this.update(cx, |this, _| {
this.context_server_tool_ids.insert(server_id, tool_ids);
})
.log_err();
}
}
})
@@ -697,13 +696,14 @@ impl SerializedThreadV0_1_0 {
let mut messages: Vec<SerializedMessage> = Vec::with_capacity(self.0.messages.len());
for message in self.0.messages {
if message.role == Role::User && !message.tool_results.is_empty() {
if let Some(last_message) = messages.last_mut() {
debug_assert!(last_message.role == Role::Assistant);
if message.role == Role::User
&& !message.tool_results.is_empty()
&& let Some(last_message) = messages.last_mut()
{
debug_assert!(last_message.role == Role::Assistant);
last_message.tool_results = message.tool_results;
continue;
}
last_message.tool_results = message.tool_results;
continue;
}
messages.push(message);
@@ -893,7 +893,7 @@ impl ThreadsDatabase {
let needs_migration_from_heed = mdb_path.exists();
let connection = if *ZED_STATELESS {
let connection = if *ZED_STATELESS || cfg!(any(feature = "test-support", test)) {
Connection::open_memory(Some("THREAD_FALLBACK_DB"))
} else {
Connection::open_file(&sqlite_path.to_string_lossy())
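Two small cleanups at the top of this file: `Result::map_or(false, …)` becomes `is_ok_and`, and the `'static` in the `RULES_FILE_NAMES` const is dropped because const items already imply it (clippy's `redundant_static_lifetimes`). A compilable distillation:

```rust
// `'static` is implied for references in a const item.
const RULES_FILE_NAMES: [&str; 3] = [".rules", ".cursorrules", ".windsurfrules"];

fn zed_stateless() -> bool {
    // Before: std::env::var("ZED_STATELESS").map_or(false, |v| !v.is_empty())
    std::env::var("ZED_STATELESS").is_ok_and(|v| !v.is_empty())
}

fn main() {
    assert_eq!(RULES_FILE_NAMES.len(), 3);
    // Whether this prints true depends on the environment the sketch runs in.
    println!("ZED_STATELESS set: {}", zed_stateless());
}
```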

View File

@@ -112,19 +112,13 @@ impl ToolUseState {
},
);
if let Some(window) = &mut window {
if let Some(tool) = this.tools.read(cx).tool(tool_use, cx) {
if let Some(output) = tool_result.output.clone() {
if let Some(card) = tool.deserialize_card(
output,
project.clone(),
window,
cx,
) {
this.tool_result_cards.insert(tool_use_id, card);
}
}
}
if let Some(window) = &mut window
&& let Some(tool) = this.tools.read(cx).tool(tool_use, cx)
&& let Some(output) = tool_result.output.clone()
&& let Some(card) =
tool.deserialize_card(output, project.clone(), window, cx)
{
this.tool_result_cards.insert(tool_use_id, card);
}
}
}
@@ -281,7 +275,7 @@ impl ToolUseState {
pub fn message_has_tool_results(&self, assistant_message_id: MessageId) -> bool {
self.tool_uses_by_assistant_message
.get(&assistant_message_id)
.map_or(false, |results| !results.is_empty())
.is_some_and(|results| !results.is_empty())
}
pub fn tool_result(

View File

@@ -8,24 +8,32 @@ license = "GPL-3.0-or-later"
[lib]
path = "src/agent2.rs"
[features]
test-support = ["db/test-support"]
[lints]
workspace = true
[dependencies]
acp_thread.workspace = true
action_log.workspace = true
agent.workspace = true
agent-client-protocol.workspace = true
agent_servers.workspace = true
agent_settings.workspace = true
anyhow.workspace = true
assistant_context.workspace = true
assistant_tool.workspace = true
assistant_tools.workspace = true
chrono.workspace = true
client.workspace = true
cloud_llm_client.workspace = true
collections.workspace = true
context_server.workspace = true
db.workspace = true
fs.workspace = true
futures.workspace = true
git.workspace = true
gpui.workspace = true
handlebars = { workspace = true, features = ["rust-embed"] }
html_to_markdown.workspace = true
@@ -37,6 +45,7 @@ language_model.workspace = true
language_models.workspace = true
log.workspace = true
open.workspace = true
parking_lot.workspace = true
paths.workspace = true
portable-pty.workspace = true
project.workspace = true
@@ -47,7 +56,9 @@ serde.workspace = true
serde_json.workspace = true
settings.workspace = true
smol.workspace = true
sqlez.workspace = true
task.workspace = true
telemetry.workspace = true
terminal.workspace = true
text.workspace = true
ui.workspace = true
@@ -57,15 +68,20 @@ watch.workspace = true
web_search.workspace = true
which.workspace = true
workspace-hack.workspace = true
zstd.workspace = true
[dev-dependencies]
agent = { workspace = true, "features" = ["test-support"] }
assistant_context = { workspace = true, "features" = ["test-support"] }
ctor.workspace = true
client = { workspace = true, "features" = ["test-support"] }
clock = { workspace = true, "features" = ["test-support"] }
context_server = { workspace = true, "features" = ["test-support"] }
db = { workspace = true, "features" = ["test-support"] }
editor = { workspace = true, "features" = ["test-support"] }
env_logger.workspace = true
fs = { workspace = true, "features" = ["test-support"] }
git = { workspace = true, "features" = ["test-support"] }
gpui = { workspace = true, "features" = ["test-support"] }
gpui_tokio.workspace = true
language = { workspace = true, "features" = ["test-support"] }

View File

@@ -1,10 +1,10 @@
use crate::{
AgentResponseEvent, ContextServerRegistry, CopyPathTool, CreateDirectoryTool, DeletePathTool,
DiagnosticsTool, EditFileTool, FetchTool, FindPathTool, GrepTool, ListDirectoryTool,
MovePathTool, NowTool, OpenTool, ReadFileTool, TerminalTool, ThinkingTool, Thread,
ToolCallAuthorization, UserMessageContent, WebSearchTool, templates::Templates,
ContextServerRegistry, Thread, ThreadEvent, ThreadsDatabase, ToolCallAuthorization,
UserMessageContent, templates::Templates,
};
use acp_thread::AgentModelSelector;
use crate::{HistoryStore, TokenUsageUpdated};
use acp_thread::{AcpThread, AgentModelSelector};
use action_log::ActionLog;
use agent_client_protocol as acp;
use agent_settings::AgentSettings;
use anyhow::{Context as _, Result, anyhow};
@@ -28,7 +28,7 @@ use std::rc::Rc;
use std::sync::Arc;
use util::ResultExt;
const RULES_FILE_NAMES: [&'static str; 9] = [
const RULES_FILE_NAMES: [&str; 9] = [
".rules",
".cursorrules",
".windsurfrules",
@@ -50,7 +50,8 @@ struct Session {
thread: Entity<Thread>,
/// The ACP thread that handles protocol communication
acp_thread: WeakEntity<acp_thread::AcpThread>,
_subscription: Subscription,
pending_save: Task<()>,
_subscriptions: Vec<Subscription>,
}
pub struct LanguageModels {
@@ -154,6 +155,7 @@ impl LanguageModels {
pub struct NativeAgent {
/// Session ID -> Session mapping
sessions: HashMap<acp::SessionId, Session>,
history: Entity<HistoryStore>,
/// Shared project context for all threads
project_context: Entity<ProjectContext>,
project_context_needs_refresh: watch::Sender<()>,
@@ -172,6 +174,7 @@ pub struct NativeAgent {
impl NativeAgent {
pub async fn new(
project: Entity<Project>,
history: Entity<HistoryStore>,
templates: Arc<Templates>,
prompt_store: Option<Entity<PromptStore>>,
fs: Arc<dyn Fs>,
@@ -199,6 +202,7 @@ impl NativeAgent {
watch::channel(());
Self {
sessions: HashMap::new(),
history,
project_context: cx.new(|_| project_context),
project_context_needs_refresh: project_context_needs_refresh_tx,
_maintain_project_context: cx.spawn(async move |this, cx| {
@@ -217,6 +221,56 @@ impl NativeAgent {
})
}
fn register_session(
&mut self,
thread_handle: Entity<Thread>,
cx: &mut Context<Self>,
) -> Entity<AcpThread> {
let connection = Rc::new(NativeAgentConnection(cx.entity()));
let registry = LanguageModelRegistry::read_global(cx);
let summarization_model = registry.thread_summary_model().map(|c| c.model);
thread_handle.update(cx, |thread, cx| {
thread.set_summarization_model(summarization_model, cx);
thread.add_default_tools(cx)
});
let thread = thread_handle.read(cx);
let session_id = thread.id().clone();
let title = thread.title();
let project = thread.project.clone();
let action_log = thread.action_log.clone();
let acp_thread = cx.new(|_cx| {
acp_thread::AcpThread::new(
title,
connection,
project.clone(),
action_log.clone(),
session_id.clone(),
)
});
let subscriptions = vec![
cx.observe_release(&acp_thread, |this, acp_thread, _cx| {
this.sessions.remove(acp_thread.session_id());
}),
cx.subscribe(&thread_handle, Self::handle_thread_token_usage_updated),
cx.observe(&thread_handle, move |this, thread, cx| {
this.save_thread(thread, cx)
}),
];
self.sessions.insert(
session_id,
Session {
thread: thread_handle,
acp_thread: acp_thread.downgrade(),
_subscriptions: subscriptions,
pending_save: Task::ready(()),
},
);
acp_thread
}
pub fn models(&self) -> &LanguageModels {
&self.models
}
@@ -387,6 +441,23 @@ impl NativeAgent {
})
}
fn handle_thread_token_usage_updated(
&mut self,
thread: Entity<Thread>,
usage: &TokenUsageUpdated,
cx: &mut Context<Self>,
) {
let Some(session) = self.sessions.get(thread.read(cx).id()) else {
return;
};
session
.acp_thread
.update(cx, |acp_thread, cx| {
acp_thread.update_token_usage(usage.0.clone(), cx);
})
.ok();
}
fn handle_project_event(
&mut self,
_project: Entity<Project>,
@@ -427,21 +498,105 @@ impl NativeAgent {
) {
self.models.refresh_list(cx);
let default_model = LanguageModelRegistry::read_global(cx)
.default_model()
.map(|m| m.model.clone());
let registry = LanguageModelRegistry::read_global(cx);
let default_model = registry.default_model().map(|m| m.model);
let summarization_model = registry.thread_summary_model().map(|m| m.model);
for session in self.sessions.values_mut() {
session.thread.update(cx, |thread, cx| {
if thread.model().is_none()
&& let Some(model) = default_model.clone()
{
thread.set_model(model);
thread.set_model(model, cx);
cx.notify();
}
thread.set_summarization_model(summarization_model.clone(), cx);
});
}
}
pub fn open_thread(
&mut self,
id: acp::SessionId,
cx: &mut Context<Self>,
) -> Task<Result<Entity<AcpThread>>> {
let database_future = ThreadsDatabase::connect(cx);
cx.spawn(async move |this, cx| {
let database = database_future.await.map_err(|err| anyhow!(err))?;
let db_thread = database
.load_thread(id.clone())
.await?
.with_context(|| format!("no thread found with ID: {id:?}"))?;
let thread = this.update(cx, |this, cx| {
let action_log = cx.new(|_cx| ActionLog::new(this.project.clone()));
cx.new(|cx| {
Thread::from_db(
id.clone(),
db_thread,
this.project.clone(),
this.project_context.clone(),
this.context_server_registry.clone(),
action_log.clone(),
this.templates.clone(),
cx,
)
})
})?;
let acp_thread =
this.update(cx, |this, cx| this.register_session(thread.clone(), cx))?;
let events = thread.update(cx, |thread, cx| thread.replay(cx))?;
cx.update(|cx| {
NativeAgentConnection::handle_thread_events(events, acp_thread.downgrade(), cx)
})?
.await?;
Ok(acp_thread)
})
}
pub fn thread_summary(
&mut self,
id: acp::SessionId,
cx: &mut Context<Self>,
) -> Task<Result<SharedString>> {
let thread = self.open_thread(id.clone(), cx);
cx.spawn(async move |this, cx| {
let acp_thread = thread.await?;
let result = this
.update(cx, |this, cx| {
this.sessions
.get(&id)
.unwrap()
.thread
.update(cx, |thread, cx| thread.summary(cx))
})?
.await?;
drop(acp_thread);
Ok(result)
})
}
fn save_thread(&mut self, thread: Entity<Thread>, cx: &mut Context<Self>) {
if thread.read(cx).is_empty() {
return;
}
let database_future = ThreadsDatabase::connect(cx);
let (id, db_thread) =
thread.update(cx, |thread, cx| (thread.id().clone(), thread.to_db(cx)));
let Some(session) = self.sessions.get_mut(&id) else {
return;
};
let history = self.history.clone();
session.pending_save = cx.spawn(async move |_, cx| {
let Some(database) = database_future.await.map_err(|err| anyhow!(err)).log_err() else {
return;
};
let db_thread = db_thread.await;
database.save_thread(id, db_thread).await.log_err();
history.update(cx, |history, cx| history.reload(cx)).ok();
});
}
}
/// Wrapper struct that implements the AgentConnection trait
@@ -462,10 +617,7 @@ impl NativeAgentConnection {
session_id: acp::SessionId,
cx: &mut App,
f: impl 'static
+ FnOnce(
Entity<Thread>,
&mut App,
) -> Result<mpsc::UnboundedReceiver<Result<AgentResponseEvent>>>,
+ FnOnce(Entity<Thread>, &mut App) -> Result<mpsc::UnboundedReceiver<Result<ThreadEvent>>>,
) -> Task<Result<acp::PromptResponse>> {
let Some((thread, acp_thread)) = self.0.update(cx, |agent, _cx| {
agent
@@ -477,19 +629,38 @@ impl NativeAgentConnection {
};
log::debug!("Found session for: {}", session_id);
let mut response_stream = match f(thread, cx) {
let response_stream = match f(thread, cx) {
Ok(stream) => stream,
Err(err) => return Task::ready(Err(err)),
};
Self::handle_thread_events(response_stream, acp_thread, cx)
}
fn handle_thread_events(
mut events: mpsc::UnboundedReceiver<Result<ThreadEvent>>,
acp_thread: WeakEntity<AcpThread>,
cx: &App,
) -> Task<Result<acp::PromptResponse>> {
cx.spawn(async move |cx| {
// Handle response stream and forward to session.acp_thread
while let Some(result) = response_stream.next().await {
while let Some(result) = events.next().await {
match result {
Ok(event) => {
log::trace!("Received completion event: {:?}", event);
match event {
AgentResponseEvent::Text(text) => {
ThreadEvent::UserMessage(message) => {
acp_thread.update(cx, |thread, cx| {
for content in message.content {
thread.push_user_content_block(
Some(message.id.clone()),
content.into(),
cx,
);
}
})?;
}
ThreadEvent::AgentText(text) => {
acp_thread.update(cx, |thread, cx| {
thread.push_assistant_content_block(
acp::ContentBlock::Text(acp::TextContent {
@@ -501,7 +672,7 @@ impl NativeAgentConnection {
)
})?;
}
AgentResponseEvent::Thinking(text) => {
ThreadEvent::AgentThinking(text) => {
acp_thread.update(cx, |thread, cx| {
thread.push_assistant_content_block(
acp::ContentBlock::Text(acp::TextContent {
@@ -513,7 +684,7 @@ impl NativeAgentConnection {
)
})?;
}
AgentResponseEvent::ToolCallAuthorization(ToolCallAuthorization {
ThreadEvent::ToolCallAuthorization(ToolCallAuthorization {
tool_call,
options,
response,
@@ -536,17 +707,26 @@ impl NativeAgentConnection {
})
.detach();
}
AgentResponseEvent::ToolCall(tool_call) => {
ThreadEvent::ToolCall(tool_call) => {
acp_thread.update(cx, |thread, cx| {
thread.upsert_tool_call(tool_call, cx)
})??;
}
AgentResponseEvent::ToolCallUpdate(update) => {
ThreadEvent::ToolCallUpdate(update) => {
acp_thread.update(cx, |thread, cx| {
thread.update_tool_call(update, cx)
})??;
}
AgentResponseEvent::Stop(stop_reason) => {
ThreadEvent::TitleUpdate(title) => {
acp_thread
.update(cx, |thread, cx| thread.update_title(title, cx))??;
}
ThreadEvent::Retry(status) => {
acp_thread.update(cx, |thread, cx| {
thread.update_retry_status(status, cx)
})?;
}
ThreadEvent::Stop(stop_reason) => {
log::debug!("Assistant message complete: {:?}", stop_reason);
return Ok(acp::PromptResponse { stop_reason });
}
@@ -599,8 +779,8 @@ impl AgentModelSelector for NativeAgentConnection {
return Task::ready(Err(anyhow!("Invalid model ID {}", model_id)));
};
thread.update(cx, |thread, _cx| {
thread.set_model(model.clone());
thread.update(cx, |thread, cx| {
thread.set_model(model.clone(), cx);
});
update_settings_file::<AgentSettings>(
@@ -660,31 +840,13 @@ impl acp_thread::AgentConnection for NativeAgentConnection {
cx.spawn(async move |cx| {
log::debug!("Starting thread creation in async context");
// Generate session ID
let session_id = acp::SessionId(uuid::Uuid::new_v4().to_string().into());
log::info!("Created session with ID: {}", session_id);
// Create AcpThread
let acp_thread = cx.update(|cx| {
cx.new(|cx| {
acp_thread::AcpThread::new(
"agent2",
self.clone(),
project.clone(),
session_id.clone(),
cx,
)
})
})?;
let action_log = cx.update(|cx| acp_thread.read(cx).action_log().clone())?;
let action_log = cx.new(|_cx| ActionLog::new(project.clone()))?;
// Create Thread
let thread = agent.update(
cx,
|agent, cx: &mut gpui::Context<NativeAgent>| -> Result<_> {
// Fetch default model from registry settings
let registry = LanguageModelRegistry::read_global(cx);
// Log available models for debugging
let available_count = registry.available_models(cx).count();
log::debug!("Total available models: {}", available_count);
@@ -696,7 +858,7 @@ impl acp_thread::AgentConnection for NativeAgentConnection {
});
let thread = cx.new(|cx| {
let mut thread = Thread::new(
Thread::new(
project.clone(),
agent.project_context.clone(),
agent.context_server_registry.clone(),
@@ -704,45 +866,13 @@ impl acp_thread::AgentConnection for NativeAgentConnection {
agent.templates.clone(),
default_model,
cx,
);
thread.add_tool(CopyPathTool::new(project.clone()));
thread.add_tool(CreateDirectoryTool::new(project.clone()));
thread.add_tool(DeletePathTool::new(project.clone(), action_log.clone()));
thread.add_tool(DiagnosticsTool::new(project.clone()));
thread.add_tool(EditFileTool::new(cx.entity()));
thread.add_tool(FetchTool::new(project.read(cx).client().http_client()));
thread.add_tool(FindPathTool::new(project.clone()));
thread.add_tool(GrepTool::new(project.clone()));
thread.add_tool(ListDirectoryTool::new(project.clone()));
thread.add_tool(MovePathTool::new(project.clone()));
thread.add_tool(NowTool);
thread.add_tool(OpenTool::new(project.clone()));
thread.add_tool(ReadFileTool::new(project.clone(), action_log));
thread.add_tool(TerminalTool::new(project.clone(), cx));
thread.add_tool(ThinkingTool);
thread.add_tool(WebSearchTool); // TODO: Enable this only if it's a zed model.
thread
)
});
Ok(thread)
},
)??;
// Store the session
agent.update(cx, |agent, cx| {
agent.sessions.insert(
session_id,
Session {
thread,
acp_thread: acp_thread.downgrade(),
_subscription: cx.observe_release(&acp_thread, |this, acp_thread, _cx| {
this.sessions.remove(acp_thread.session_id());
}),
},
);
})?;
Ok(acp_thread)
agent.update(cx, |agent, cx| agent.register_session(thread, cx))
})
}
@@ -783,6 +913,14 @@ impl acp_thread::AgentConnection for NativeAgentConnection {
})
}
fn prompt_capabilities(&self) -> acp::PromptCapabilities {
acp::PromptCapabilities {
image: true,
audio: false,
embedded_context: true,
}
}
fn resume(
&self,
session_id: &acp::SessionId,
@@ -798,7 +936,7 @@ impl acp_thread::AgentConnection for NativeAgentConnection {
log::info!("Cancelling on session: {}", session_id);
self.0.update(cx, |agent, cx| {
if let Some(agent) = agent.sessions.get(session_id) {
agent.thread.update(cx, |thread, _cx| thread.cancel());
agent.thread.update(cx, |thread, cx| thread.cancel(cx));
}
});
}
@@ -809,23 +947,66 @@ impl acp_thread::AgentConnection for NativeAgentConnection {
cx: &mut App,
) -> Option<Rc<dyn acp_thread::AgentSessionEditor>> {
self.0.update(cx, |agent, _cx| {
agent
.sessions
.get(session_id)
.map(|session| Rc::new(NativeAgentSessionEditor(session.thread.clone())) as _)
agent.sessions.get(session_id).map(|session| {
Rc::new(NativeAgentSessionEditor {
thread: session.thread.clone(),
acp_thread: session.acp_thread.clone(),
}) as _
})
})
}
fn telemetry(&self) -> Option<Rc<dyn acp_thread::AgentTelemetry>> {
Some(Rc::new(self.clone()) as Rc<dyn acp_thread::AgentTelemetry>)
}
fn into_any(self: Rc<Self>) -> Rc<dyn Any> {
self
}
}
struct NativeAgentSessionEditor(Entity<Thread>);
impl acp_thread::AgentTelemetry for NativeAgentConnection {
fn agent_name(&self) -> String {
"Zed".into()
}
fn thread_data(
&self,
session_id: &acp::SessionId,
cx: &mut App,
) -> Task<Result<serde_json::Value>> {
let Some(session) = self.0.read(cx).sessions.get(session_id) else {
return Task::ready(Err(anyhow!("Session not found")));
};
let task = session.thread.read(cx).to_db(cx);
cx.background_spawn(async move {
serde_json::to_value(task.await).context("Failed to serialize thread")
})
}
}
struct NativeAgentSessionEditor {
thread: Entity<Thread>,
acp_thread: WeakEntity<AcpThread>,
}
impl acp_thread::AgentSessionEditor for NativeAgentSessionEditor {
fn truncate(&self, message_id: acp_thread::UserMessageId, cx: &mut App) -> Task<Result<()>> {
Task::ready(self.0.update(cx, |thread, _cx| thread.truncate(message_id)))
match self.thread.update(cx, |thread, cx| {
thread.truncate(message_id.clone(), cx)?;
Ok(thread.latest_token_usage())
}) {
Ok(usage) => {
self.acp_thread
.update(cx, |thread, cx| {
thread.update_token_usage(usage, cx);
})
.ok();
Task::ready(Ok(()))
}
Err(error) => Task::ready(Err(error)),
}
}
}
@@ -845,12 +1026,19 @@ impl acp_thread::AgentSessionResume for NativeAgentSessionResume {
#[cfg(test)]
mod tests {
use crate::HistoryEntryId;
use super::*;
use acp_thread::{AgentConnection, AgentModelGroupName, AgentModelId, AgentModelInfo};
use acp_thread::{
AgentConnection, AgentModelGroupName, AgentModelId, AgentModelInfo, MentionUri,
};
use fs::FakeFs;
use gpui::TestAppContext;
use indoc::indoc;
use language_model::fake_provider::FakeLanguageModel;
use serde_json::json;
use settings::SettingsStore;
use util::path;
#[gpui::test]
async fn test_maintaining_project_context(cx: &mut TestAppContext) {
@@ -864,8 +1052,11 @@ mod tests {
)
.await;
let project = Project::test(fs.clone(), [], cx).await;
let context_store = cx.new(|cx| assistant_context::ContextStore::fake(project.clone(), cx));
let history_store = cx.new(|cx| HistoryStore::new(context_store, cx));
let agent = NativeAgent::new(
project.clone(),
history_store,
Templates::new(),
None,
fs.clone(),
@@ -919,9 +1110,12 @@ mod tests {
let fs = FakeFs::new(cx.executor());
fs.insert_tree("/", json!({ "a": {} })).await;
let project = Project::test(fs.clone(), [], cx).await;
let context_store = cx.new(|cx| assistant_context::ContextStore::fake(project.clone(), cx));
let history_store = cx.new(|cx| HistoryStore::new(context_store, cx));
let connection = NativeAgentConnection(
NativeAgent::new(
project.clone(),
history_store,
Templates::new(),
None,
fs.clone(),
@@ -972,9 +1166,13 @@ mod tests {
.await;
let project = Project::test(fs.clone(), [], cx).await;
let context_store = cx.new(|cx| assistant_context::ContextStore::fake(project.clone(), cx));
let history_store = cx.new(|cx| HistoryStore::new(context_store, cx));
// Create the agent and connection
let agent = NativeAgent::new(
project.clone(),
history_store,
Templates::new(),
None,
fs.clone(),
@@ -1025,6 +1223,163 @@ mod tests {
);
}
#[gpui::test]
#[cfg_attr(target_os = "windows", ignore)] // TODO: Fix this test on Windows
async fn test_save_load_thread(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
"/",
json!({
"a": {
"b.md": "Lorem"
}
}),
)
.await;
let project = Project::test(fs.clone(), [path!("/a").as_ref()], cx).await;
let context_store = cx.new(|cx| assistant_context::ContextStore::fake(project.clone(), cx));
let history_store = cx.new(|cx| HistoryStore::new(context_store, cx));
let agent = NativeAgent::new(
project.clone(),
history_store.clone(),
Templates::new(),
None,
fs.clone(),
&mut cx.to_async(),
)
.await
.unwrap();
let connection = Rc::new(NativeAgentConnection(agent.clone()));
let acp_thread = cx
.update(|cx| {
connection
.clone()
.new_thread(project.clone(), Path::new(""), cx)
})
.await
.unwrap();
let session_id = acp_thread.read_with(cx, |thread, _| thread.session_id().clone());
let thread = agent.read_with(cx, |agent, _| {
agent.sessions.get(&session_id).unwrap().thread.clone()
});
// Ensure empty threads are not saved, even if they get mutated.
let model = Arc::new(FakeLanguageModel::default());
let summary_model = Arc::new(FakeLanguageModel::default());
thread.update(cx, |thread, cx| {
thread.set_model(model, cx);
thread.set_summarization_model(Some(summary_model), cx);
});
cx.run_until_parked();
assert_eq!(history_entries(&history_store, cx), vec![]);
let model = thread.read_with(cx, |thread, _| thread.model().unwrap().clone());
let model = model.as_fake();
let summary_model = thread.read_with(cx, |thread, _| {
thread.summarization_model().unwrap().clone()
});
let summary_model = summary_model.as_fake();
let send = acp_thread.update(cx, |thread, cx| {
thread.send(
vec![
"What does ".into(),
acp::ContentBlock::ResourceLink(acp::ResourceLink {
name: "b.md".into(),
uri: MentionUri::File {
abs_path: path!("/a/b.md").into(),
}
.to_uri()
.to_string(),
annotations: None,
description: None,
mime_type: None,
size: None,
title: None,
}),
" mean?".into(),
],
cx,
)
});
let send = cx.foreground_executor().spawn(send);
cx.run_until_parked();
model.send_last_completion_stream_text_chunk("Lorem.");
model.end_last_completion_stream();
cx.run_until_parked();
summary_model.send_last_completion_stream_text_chunk("Explaining /a/b.md");
summary_model.end_last_completion_stream();
send.await.unwrap();
acp_thread.read_with(cx, |thread, cx| {
assert_eq!(
thread.to_markdown(cx),
indoc! {"
## User
What does [@b.md](file:///a/b.md) mean?
## Assistant
Lorem.
"}
)
});
// Drop the ACP thread, which should cause the session to be dropped as well.
cx.update(|_| {
drop(thread);
drop(acp_thread);
});
agent.read_with(cx, |agent, _| {
assert_eq!(agent.sessions.keys().cloned().collect::<Vec<_>>(), []);
});
// Ensure the thread can be reloaded from disk.
assert_eq!(
history_entries(&history_store, cx),
vec![(
HistoryEntryId::AcpThread(session_id.clone()),
"Explaining /a/b.md".into()
)]
);
let acp_thread = agent
.update(cx, |agent, cx| agent.open_thread(session_id.clone(), cx))
.await
.unwrap();
acp_thread.read_with(cx, |thread, cx| {
assert_eq!(
thread.to_markdown(cx),
indoc! {"
## User
What does [@b.md](file:///a/b.md) mean?
## Assistant
Lorem.
"}
)
});
}
fn history_entries(
history: &Entity<HistoryStore>,
cx: &mut TestAppContext,
) -> Vec<(HistoryEntryId, String)> {
history.read_with(cx, |history, cx| {
history
.entries(cx)
.iter()
.map(|e| (e.id(), e.title().to_string()))
.collect::<Vec<_>>()
})
}
fn init_test(cx: &mut TestAppContext) {
env_logger::try_init().ok();
cx.update(|cx| {

View File

@@ -1,13 +1,18 @@
mod agent;
mod db;
mod history_store;
mod native_agent_server;
mod templates;
mod thread;
mod tool_schema;
mod tools;
#[cfg(test)]
mod tests;
pub use agent::*;
pub use db::*;
pub use history_store::*;
pub use native_agent_server::NativeAgentServer;
pub use templates::*;
pub use thread::*;

crates/agent2/src/db.rs (new file, 488 lines)
View File

@@ -0,0 +1,488 @@
use crate::{AgentMessage, AgentMessageContent, UserMessage, UserMessageContent};
use acp_thread::UserMessageId;
use agent::{thread::DetailedSummaryState, thread_store};
use agent_client_protocol as acp;
use agent_settings::{AgentProfileId, CompletionMode};
use anyhow::{Result, anyhow};
use chrono::{DateTime, Utc};
use collections::{HashMap, IndexMap};
use futures::{FutureExt, future::Shared};
use gpui::{BackgroundExecutor, Global, Task};
use indoc::indoc;
use parking_lot::Mutex;
use serde::{Deserialize, Serialize};
use sqlez::{
bindable::{Bind, Column},
connection::Connection,
statement::Statement,
};
use std::sync::Arc;
use ui::{App, SharedString};
pub type DbMessage = crate::Message;
pub type DbSummary = DetailedSummaryState;
pub type DbLanguageModel = thread_store::SerializedLanguageModel;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct DbThreadMetadata {
pub id: acp::SessionId,
#[serde(alias = "summary")]
pub title: SharedString,
pub updated_at: DateTime<Utc>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct DbThread {
pub title: SharedString,
pub messages: Vec<DbMessage>,
pub updated_at: DateTime<Utc>,
#[serde(default)]
pub detailed_summary: Option<SharedString>,
#[serde(default)]
pub initial_project_snapshot: Option<Arc<agent::thread::ProjectSnapshot>>,
#[serde(default)]
pub cumulative_token_usage: language_model::TokenUsage,
#[serde(default)]
pub request_token_usage: HashMap<acp_thread::UserMessageId, language_model::TokenUsage>,
#[serde(default)]
pub model: Option<DbLanguageModel>,
#[serde(default)]
pub completion_mode: Option<CompletionMode>,
#[serde(default)]
pub profile: Option<AgentProfileId>,
}
impl DbThread {
pub const VERSION: &'static str = "0.3.0";
pub fn from_json(json: &[u8]) -> Result<Self> {
let saved_thread_json = serde_json::from_slice::<serde_json::Value>(json)?;
match saved_thread_json.get("version") {
Some(serde_json::Value::String(version)) => match version.as_str() {
Self::VERSION => Ok(serde_json::from_value(saved_thread_json)?),
_ => Self::upgrade_from_agent_1(agent::SerializedThread::from_json(json)?),
},
_ => Self::upgrade_from_agent_1(agent::SerializedThread::from_json(json)?),
}
}
fn upgrade_from_agent_1(thread: agent::SerializedThread) -> Result<Self> {
let mut messages = Vec::new();
let mut request_token_usage = HashMap::default();
let mut last_user_message_id = None;
for (ix, msg) in thread.messages.into_iter().enumerate() {
let message = match msg.role {
language_model::Role::User => {
let mut content = Vec::new();
// Convert segments to content
for segment in msg.segments {
match segment {
thread_store::SerializedMessageSegment::Text { text } => {
content.push(UserMessageContent::Text(text));
}
thread_store::SerializedMessageSegment::Thinking { text, .. } => {
// User messages don't have thinking segments, but handle gracefully
content.push(UserMessageContent::Text(text));
}
thread_store::SerializedMessageSegment::RedactedThinking { .. } => {
// User messages don't have redacted thinking, skip.
}
}
}
// If no content was added, add context as text if available
if content.is_empty() && !msg.context.is_empty() {
content.push(UserMessageContent::Text(msg.context));
}
let id = UserMessageId::new();
last_user_message_id = Some(id.clone());
crate::Message::User(UserMessage {
// MessageId from old format can't be meaningfully converted, so generate a new one
id,
content,
})
}
language_model::Role::Assistant => {
let mut content = Vec::new();
// Convert segments to content
for segment in msg.segments {
match segment {
thread_store::SerializedMessageSegment::Text { text } => {
content.push(AgentMessageContent::Text(text));
}
thread_store::SerializedMessageSegment::Thinking {
text,
signature,
} => {
content.push(AgentMessageContent::Thinking { text, signature });
}
thread_store::SerializedMessageSegment::RedactedThinking { data } => {
content.push(AgentMessageContent::RedactedThinking(data));
}
}
}
// Convert tool uses
let mut tool_names_by_id = HashMap::default();
for tool_use in msg.tool_uses {
tool_names_by_id.insert(tool_use.id.clone(), tool_use.name.clone());
content.push(AgentMessageContent::ToolUse(
language_model::LanguageModelToolUse {
id: tool_use.id,
name: tool_use.name.into(),
raw_input: serde_json::to_string(&tool_use.input)
.unwrap_or_default(),
input: tool_use.input,
is_input_complete: true,
},
));
}
// Convert tool results
let mut tool_results = IndexMap::default();
for tool_result in msg.tool_results {
let name = tool_names_by_id
.remove(&tool_result.tool_use_id)
.unwrap_or_else(|| SharedString::from("unknown"));
tool_results.insert(
tool_result.tool_use_id.clone(),
language_model::LanguageModelToolResult {
tool_use_id: tool_result.tool_use_id,
tool_name: name.into(),
is_error: tool_result.is_error,
content: tool_result.content,
output: tool_result.output,
},
);
}
if let Some(last_user_message_id) = &last_user_message_id
&& let Some(token_usage) = thread.request_token_usage.get(ix).copied()
{
request_token_usage.insert(last_user_message_id.clone(), token_usage);
}
crate::Message::Agent(AgentMessage {
content,
tool_results,
})
}
language_model::Role::System => {
// Skip system messages as they're not supported in the new format
continue;
}
};
messages.push(message);
}
Ok(Self {
title: thread.summary,
messages,
updated_at: thread.updated_at,
detailed_summary: match thread.detailed_summary_state {
DetailedSummaryState::NotGenerated | DetailedSummaryState::Generating { .. } => {
None
}
DetailedSummaryState::Generated { text, .. } => Some(text),
},
initial_project_snapshot: thread.initial_project_snapshot,
cumulative_token_usage: thread.cumulative_token_usage,
request_token_usage,
model: thread.model,
completion_mode: thread.completion_mode,
profile: thread.profile,
})
}
}
/// When the `ZED_STATELESS` environment variable is set to a non-empty value, threads are kept in an in-memory database instead of being persisted to disk.
pub static ZED_STATELESS: std::sync::LazyLock<bool> =
std::sync::LazyLock::new(|| std::env::var("ZED_STATELESS").is_ok_and(|v| !v.is_empty()));
/// Encoding of the serialized thread blob stored in the `data` column.
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub enum DataType {
#[serde(rename = "json")]
Json,
#[serde(rename = "zstd")]
Zstd,
}
impl Bind for DataType {
fn bind(&self, statement: &Statement, start_index: i32) -> Result<i32> {
let value = match self {
DataType::Json => "json",
DataType::Zstd => "zstd",
};
value.bind(statement, start_index)
}
}
impl Column for DataType {
fn column(statement: &mut Statement, start_index: i32) -> Result<(Self, i32)> {
let (value, next_index) = String::column(statement, start_index)?;
let data_type = match value.as_str() {
"json" => DataType::Json,
"zstd" => DataType::Zstd,
_ => anyhow::bail!("Unknown data type: {}", value),
};
Ok((data_type, next_index))
}
}
/// SQLite-backed store for agent threads; all queries run on the background executor.
pub(crate) struct ThreadsDatabase {
executor: BackgroundExecutor,
connection: Arc<Mutex<Connection>>,
}
struct GlobalThreadsDatabase(Shared<Task<Result<Arc<ThreadsDatabase>, Arc<anyhow::Error>>>>);
impl Global for GlobalThreadsDatabase {}
impl ThreadsDatabase {
pub fn connect(cx: &mut App) -> Shared<Task<Result<Arc<ThreadsDatabase>, Arc<anyhow::Error>>>> {
if cx.has_global::<GlobalThreadsDatabase>() {
return cx.global::<GlobalThreadsDatabase>().0.clone();
}
let executor = cx.background_executor().clone();
let task = executor
.spawn({
let executor = executor.clone();
async move {
match ThreadsDatabase::new(executor) {
Ok(db) => Ok(Arc::new(db)),
Err(err) => Err(Arc::new(err)),
}
}
})
.shared();
cx.set_global(GlobalThreadsDatabase(task.clone()));
task
}
pub fn new(executor: BackgroundExecutor) -> Result<Self> {
let connection = if *ZED_STATELESS || cfg!(any(feature = "test-support", test)) {
Connection::open_memory(Some("THREAD_FALLBACK_DB"))
} else {
let threads_dir = paths::data_dir().join("threads");
std::fs::create_dir_all(&threads_dir)?;
let sqlite_path = threads_dir.join("threads.db");
Connection::open_file(&sqlite_path.to_string_lossy())
};
connection.exec(indoc! {"
CREATE TABLE IF NOT EXISTS threads (
id TEXT PRIMARY KEY,
summary TEXT NOT NULL,
updated_at TEXT NOT NULL,
data_type TEXT NOT NULL,
data BLOB NOT NULL
)
"})?()
.map_err(|e| anyhow!("Failed to create threads table: {}", e))?;
let db = Self {
executor,
connection: Arc::new(Mutex::new(connection)),
};
Ok(db)
}
fn save_thread_sync(
connection: &Arc<Mutex<Connection>>,
id: acp::SessionId,
thread: DbThread,
) -> Result<()> {
const COMPRESSION_LEVEL: i32 = 3;
#[derive(Serialize)]
struct SerializedThread {
#[serde(flatten)]
thread: DbThread,
version: &'static str,
}
let title = thread.title.to_string();
let updated_at = thread.updated_at.to_rfc3339();
let json_data = serde_json::to_string(&SerializedThread {
thread,
version: DbThread::VERSION,
})?;
let connection = connection.lock();
let compressed = zstd::encode_all(json_data.as_bytes(), COMPRESSION_LEVEL)?;
let data_type = DataType::Zstd;
let data = compressed;
let mut insert = connection.exec_bound::<(Arc<str>, String, String, DataType, Vec<u8>)>(indoc! {"
INSERT OR REPLACE INTO threads (id, summary, updated_at, data_type, data) VALUES (?, ?, ?, ?, ?)
"})?;
insert((id.0, title, updated_at, data_type, data))?;
Ok(())
}
pub fn list_threads(&self) -> Task<Result<Vec<DbThreadMetadata>>> {
let connection = self.connection.clone();
self.executor.spawn(async move {
let connection = connection.lock();
let mut select =
connection.select_bound::<(), (Arc<str>, String, String)>(indoc! {"
SELECT id, summary, updated_at FROM threads ORDER BY updated_at DESC
"})?;
let rows = select(())?;
let mut threads = Vec::new();
for (id, summary, updated_at) in rows {
threads.push(DbThreadMetadata {
id: acp::SessionId(id),
title: summary.into(),
updated_at: DateTime::parse_from_rfc3339(&updated_at)?.with_timezone(&Utc),
});
}
Ok(threads)
})
}
pub fn load_thread(&self, id: acp::SessionId) -> Task<Result<Option<DbThread>>> {
let connection = self.connection.clone();
self.executor.spawn(async move {
let connection = connection.lock();
let mut select = connection.select_bound::<Arc<str>, (DataType, Vec<u8>)>(indoc! {"
SELECT data_type, data FROM threads WHERE id = ? LIMIT 1
"})?;
let rows = select(id.0)?;
if let Some((data_type, data)) = rows.into_iter().next() {
let json_data = match data_type {
DataType::Zstd => {
let decompressed = zstd::decode_all(&data[..])?;
String::from_utf8(decompressed)?
}
DataType::Json => String::from_utf8(data)?,
};
let thread = DbThread::from_json(json_data.as_bytes())?;
Ok(Some(thread))
} else {
Ok(None)
}
})
}
pub fn save_thread(&self, id: acp::SessionId, thread: DbThread) -> Task<Result<()>> {
let connection = self.connection.clone();
self.executor
.spawn(async move { Self::save_thread_sync(&connection, id, thread) })
}
pub fn delete_thread(&self, id: acp::SessionId) -> Task<Result<()>> {
let connection = self.connection.clone();
self.executor.spawn(async move {
let connection = connection.lock();
let mut delete = connection.exec_bound::<Arc<str>>(indoc! {"
DELETE FROM threads WHERE id = ?
"})?;
delete(id.0)?;
Ok(())
})
}
}
#[cfg(test)]
mod tests {
use super::*;
use agent::MessageSegment;
use agent::context::LoadedContext;
use client::Client;
use fs::FakeFs;
use gpui::AppContext;
use gpui::TestAppContext;
use http_client::FakeHttpClient;
use language_model::Role;
use project::Project;
use settings::SettingsStore;
fn init_test(cx: &mut TestAppContext) {
env_logger::try_init().ok();
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
Project::init_settings(cx);
language::init(cx);
let http_client = FakeHttpClient::with_404_response();
let clock = Arc::new(clock::FakeSystemClock::new());
let client = Client::new(clock, http_client, cx);
agent::init(cx);
agent_settings::init(cx);
language_model::init(client, cx);
});
}
#[gpui::test]
async fn test_retrieving_old_thread(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, [], cx).await;
// Save a thread using the old agent.
let thread_store = cx.new(|cx| agent::ThreadStore::fake(project, cx));
let thread = thread_store.update(cx, |thread_store, cx| thread_store.create_thread(cx));
thread.update(cx, |thread, cx| {
thread.insert_message(
Role::User,
vec![MessageSegment::Text("Hey!".into())],
LoadedContext::default(),
vec![],
false,
cx,
);
thread.insert_message(
Role::Assistant,
vec![MessageSegment::Text("How're you doing?".into())],
LoadedContext::default(),
vec![],
false,
cx,
)
});
thread_store
.update(cx, |thread_store, cx| thread_store.save_thread(&thread, cx))
.await
.unwrap();
// Open that same thread using the new agent.
let db = cx.update(ThreadsDatabase::connect).await.unwrap();
let threads = db.list_threads().await.unwrap();
assert_eq!(threads.len(), 1);
let thread = db
.load_thread(threads[0].id.clone())
.await
.unwrap()
.unwrap();
assert_eq!(thread.messages[0].to_markdown(), "## User\n\nHey!\n");
assert_eq!(
thread.messages[1].to_markdown(),
"## Assistant\n\nHow're you doing?\n"
);
}
}
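A rough usage sketch, separate from the file above: assuming a gpui `App` handle and the crate imports already in scope here (`anyhow!`, `acp`), a caller reaches the shared database via `connect` and drives the returned background tasks roughly like this (the helper name is made up).
fn example_thread_roundtrip(id: acp::SessionId, thread: DbThread, cx: &mut App) {
    let connect = ThreadsDatabase::connect(cx);
    cx.background_executor()
        .spawn(async move {
            let db = connect.await.map_err(|err| anyhow!(err))?;
            // Threads are persisted as a zstd-compressed JSON envelope that also
            // carries `DbThread::VERSION`, so older payloads can be upgraded on load.
            db.save_thread(id.clone(), thread).await?;
            let threads = db.list_threads().await?; // most recently updated first
            anyhow::ensure!(!threads.is_empty());
            let reloaded = db.load_thread(threads[0].id.clone()).await?;
            anyhow::ensure!(reloaded.is_some());
            db.delete_thread(id).await?;
            anyhow::Ok(())
        })
        .detach();
}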

View File

@@ -0,0 +1,348 @@
use crate::{DbThreadMetadata, ThreadsDatabase};
use acp_thread::MentionUri;
use agent_client_protocol as acp;
use anyhow::{Context as _, Result, anyhow};
use assistant_context::{AssistantContext, SavedContextMetadata};
use chrono::{DateTime, Utc};
use db::kvp::KEY_VALUE_STORE;
use gpui::{App, AsyncApp, Entity, SharedString, Task, prelude::*};
use itertools::Itertools;
use paths::contexts_dir;
use serde::{Deserialize, Serialize};
use std::{collections::VecDeque, path::Path, sync::Arc, time::Duration};
use util::ResultExt as _;
const MAX_RECENTLY_OPENED_ENTRIES: usize = 6;
const RECENTLY_OPENED_THREADS_KEY: &str = "recent-agent-threads";
const SAVE_RECENTLY_OPENED_ENTRIES_DEBOUNCE: Duration = Duration::from_millis(50);
const DEFAULT_TITLE: &SharedString = &SharedString::new_static("New Thread");
#[derive(Clone, Debug)]
pub enum HistoryEntry {
AcpThread(DbThreadMetadata),
TextThread(SavedContextMetadata),
}
impl HistoryEntry {
pub fn updated_at(&self) -> DateTime<Utc> {
match self {
HistoryEntry::AcpThread(thread) => thread.updated_at,
HistoryEntry::TextThread(context) => context.mtime.to_utc(),
}
}
pub fn id(&self) -> HistoryEntryId {
match self {
HistoryEntry::AcpThread(thread) => HistoryEntryId::AcpThread(thread.id.clone()),
HistoryEntry::TextThread(context) => HistoryEntryId::TextThread(context.path.clone()),
}
}
pub fn mention_uri(&self) -> MentionUri {
match self {
HistoryEntry::AcpThread(thread) => MentionUri::Thread {
id: thread.id.clone(),
name: thread.title.to_string(),
},
HistoryEntry::TextThread(context) => MentionUri::TextThread {
path: context.path.as_ref().to_owned(),
name: context.title.to_string(),
},
}
}
pub fn title(&self) -> &SharedString {
match self {
HistoryEntry::AcpThread(thread) if thread.title.is_empty() => DEFAULT_TITLE,
HistoryEntry::AcpThread(thread) => &thread.title,
HistoryEntry::TextThread(context) => &context.title,
}
}
}
/// Generic identifier for a history entry.
#[derive(Clone, PartialEq, Eq, Debug, Hash)]
pub enum HistoryEntryId {
AcpThread(acp::SessionId),
TextThread(Arc<Path>),
}
#[derive(Serialize, Deserialize, Debug)]
enum SerializedRecentOpen {
AcpThread(String),
TextThread(String),
}
/// Tracks saved agent threads and text threads, plus the recently opened entries persisted for the agent panel's navigation history.
pub struct HistoryStore {
threads: Vec<DbThreadMetadata>,
context_store: Entity<assistant_context::ContextStore>,
recently_opened_entries: VecDeque<HistoryEntryId>,
_subscriptions: Vec<gpui::Subscription>,
_save_recently_opened_entries_task: Task<()>,
}
impl HistoryStore {
pub fn new(
context_store: Entity<assistant_context::ContextStore>,
cx: &mut Context<Self>,
) -> Self {
let subscriptions = vec![cx.observe(&context_store, |_, _, cx| cx.notify())];
cx.spawn(async move |this, cx| {
let entries = Self::load_recently_opened_entries(cx).await;
this.update(cx, |this, cx| {
if let Some(entries) = entries.log_err() {
this.recently_opened_entries = entries;
}
this.reload(cx);
})
.ok();
})
.detach();
Self {
context_store,
recently_opened_entries: VecDeque::default(),
threads: Vec::default(),
_subscriptions: subscriptions,
_save_recently_opened_entries_task: Task::ready(()),
}
}
pub fn thread_from_session_id(&self, session_id: &acp::SessionId) -> Option<&DbThreadMetadata> {
self.threads.iter().find(|thread| &thread.id == session_id)
}
pub fn delete_thread(
&mut self,
id: acp::SessionId,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
let database_future = ThreadsDatabase::connect(cx);
cx.spawn(async move |this, cx| {
let database = database_future.await.map_err(|err| anyhow!(err))?;
database.delete_thread(id.clone()).await?;
this.update(cx, |this, cx| this.reload(cx))
})
}
pub fn delete_text_thread(
&mut self,
path: Arc<Path>,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
self.context_store.update(cx, |context_store, cx| {
context_store.delete_local_context(path, cx)
})
}
pub fn load_text_thread(
&self,
path: Arc<Path>,
cx: &mut Context<Self>,
) -> Task<Result<Entity<AssistantContext>>> {
self.context_store.update(cx, |context_store, cx| {
context_store.open_local_context(path, cx)
})
}
pub fn reload(&self, cx: &mut Context<Self>) {
let database_future = ThreadsDatabase::connect(cx);
cx.spawn(async move |this, cx| {
let threads = database_future
.await
.map_err(|err| anyhow!(err))?
.list_threads()
.await?;
this.update(cx, |this, cx| {
if this.recently_opened_entries.len() < MAX_RECENTLY_OPENED_ENTRIES {
for thread in threads
.iter()
.take(MAX_RECENTLY_OPENED_ENTRIES - this.recently_opened_entries.len())
.rev()
{
this.push_recently_opened_entry(
HistoryEntryId::AcpThread(thread.id.clone()),
cx,
)
}
}
this.threads = threads;
cx.notify();
})
})
.detach_and_log_err(cx);
}
pub fn entries(&self, cx: &App) -> Vec<HistoryEntry> {
let mut history_entries = Vec::new();
#[cfg(debug_assertions)]
if std::env::var("ZED_SIMULATE_NO_THREAD_HISTORY").is_ok() {
return history_entries;
}
history_entries.extend(self.threads.iter().cloned().map(HistoryEntry::AcpThread));
history_entries.extend(
self.context_store
.read(cx)
.unordered_contexts()
.cloned()
.map(HistoryEntry::TextThread),
);
history_entries.sort_unstable_by_key(|entry| std::cmp::Reverse(entry.updated_at()));
history_entries
}
pub fn is_empty(&self, cx: &App) -> bool {
self.threads.is_empty()
&& self
.context_store
.read(cx)
.unordered_contexts()
.next()
.is_none()
}
pub fn recently_opened_entries(&self, cx: &App) -> Vec<HistoryEntry> {
#[cfg(debug_assertions)]
if std::env::var("ZED_SIMULATE_NO_THREAD_HISTORY").is_ok() {
return Vec::new();
}
let thread_entries = self.threads.iter().flat_map(|thread| {
self.recently_opened_entries
.iter()
.enumerate()
.flat_map(|(index, entry)| match entry {
HistoryEntryId::AcpThread(id) if &thread.id == id => {
Some((index, HistoryEntry::AcpThread(thread.clone())))
}
_ => None,
})
});
let context_entries =
self.context_store
.read(cx)
.unordered_contexts()
.flat_map(|context| {
self.recently_opened_entries
.iter()
.enumerate()
.flat_map(|(index, entry)| match entry {
HistoryEntryId::TextThread(path) if &context.path == path => {
Some((index, HistoryEntry::TextThread(context.clone())))
}
_ => None,
})
});
thread_entries
.chain(context_entries)
// optimization to halt iteration early
.take(self.recently_opened_entries.len())
.sorted_unstable_by_key(|(index, _)| *index)
.map(|(_, entry)| entry)
.collect()
}
fn save_recently_opened_entries(&mut self, cx: &mut Context<Self>) {
let serialized_entries = self
.recently_opened_entries
.iter()
.filter_map(|entry| match entry {
HistoryEntryId::TextThread(path) => path.file_name().map(|file| {
SerializedRecentOpen::TextThread(file.to_string_lossy().to_string())
}),
HistoryEntryId::AcpThread(id) => {
Some(SerializedRecentOpen::AcpThread(id.to_string()))
}
})
.collect::<Vec<_>>();
self._save_recently_opened_entries_task = cx.spawn(async move |_, cx| {
let content = serde_json::to_string(&serialized_entries).unwrap();
cx.background_executor()
.timer(SAVE_RECENTLY_OPENED_ENTRIES_DEBOUNCE)
.await;
if cfg!(any(feature = "test-support", test)) {
return;
}
KEY_VALUE_STORE
.write_kvp(RECENTLY_OPENED_THREADS_KEY.to_owned(), content)
.await
.log_err();
});
}
fn load_recently_opened_entries(cx: &AsyncApp) -> Task<Result<VecDeque<HistoryEntryId>>> {
cx.background_spawn(async move {
if cfg!(any(feature = "test-support", test)) {
anyhow::bail!("history store does not persist in tests");
}
let json = KEY_VALUE_STORE
.read_kvp(RECENTLY_OPENED_THREADS_KEY)?
.unwrap_or("[]".to_string());
let entries = serde_json::from_str::<Vec<SerializedRecentOpen>>(&json)
.context("deserializing persisted agent panel navigation history")?
.into_iter()
.take(MAX_RECENTLY_OPENED_ENTRIES)
.flat_map(|entry| match entry {
SerializedRecentOpen::AcpThread(id) => Some(HistoryEntryId::AcpThread(
acp::SessionId(id.as_str().into()),
)),
SerializedRecentOpen::TextThread(file_name) => Some(
HistoryEntryId::TextThread(contexts_dir().join(file_name).into()),
),
})
.collect();
Ok(entries)
})
}
pub fn push_recently_opened_entry(&mut self, entry: HistoryEntryId, cx: &mut Context<Self>) {
self.recently_opened_entries
.retain(|old_entry| old_entry != &entry);
self.recently_opened_entries.push_front(entry);
self.recently_opened_entries
.truncate(MAX_RECENTLY_OPENED_ENTRIES);
self.save_recently_opened_entries(cx);
}
pub fn remove_recently_opened_thread(&mut self, id: acp::SessionId, cx: &mut Context<Self>) {
self.recently_opened_entries.retain(
|entry| !matches!(entry, HistoryEntryId::AcpThread(thread_id) if thread_id == &id),
);
self.save_recently_opened_entries(cx);
}
pub fn replace_recently_opened_text_thread(
&mut self,
old_path: &Path,
new_path: &Arc<Path>,
cx: &mut Context<Self>,
) {
for entry in &mut self.recently_opened_entries {
match entry {
HistoryEntryId::TextThread(path) if path.as_ref() == old_path => {
*entry = HistoryEntryId::TextThread(new_path.clone());
break;
}
_ => {}
}
}
self.save_recently_opened_entries(cx);
}
pub fn remove_recently_opened_entry(&mut self, entry: &HistoryEntryId, cx: &mut Context<Self>) {
self.recently_opened_entries
.retain(|old_entry| old_entry != entry);
self.save_recently_opened_entries(cx);
}
}
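As a standalone sketch (not part of the file above; the session id and file name are invented), this is the JSON shape `save_recently_opened_entries` writes under the `recent-agent-threads` key, using serde's default externally tagged encoding for the `SerializedRecentOpen` variants.
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Debug, PartialEq)]
enum SerializedRecentOpen {
    AcpThread(String),
    TextThread(String),
}

fn main() -> serde_json::Result<()> {
    let entries = vec![
        SerializedRecentOpen::AcpThread("session-123".into()),
        SerializedRecentOpen::TextThread("notes-1.json".into()),
    ];
    // Each newtype variant becomes a one-field object keyed by the variant name.
    let json = serde_json::to_string(&entries)?;
    assert_eq!(
        json,
        r#"[{"AcpThread":"session-123"},{"TextThread":"notes-1.json"}]"#
    );
    // `load_recently_opened_entries` reverses this: thread ids become session ids
    // and text-thread file names are re-rooted under `contexts_dir()`.
    let restored: Vec<SerializedRecentOpen> = serde_json::from_str(&json)?;
    assert_eq!(restored, entries);
    Ok(())
}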

View File

@@ -7,16 +7,17 @@ use gpui::{App, Entity, Task};
use project::Project;
use prompt_store::PromptStore;
use crate::{NativeAgent, NativeAgentConnection, templates::Templates};
use crate::{HistoryStore, NativeAgent, NativeAgentConnection, templates::Templates};
#[derive(Clone)]
pub struct NativeAgentServer {
fs: Arc<dyn Fs>,
history: Entity<HistoryStore>,
}
impl NativeAgentServer {
pub fn new(fs: Arc<dyn Fs>) -> Self {
Self { fs }
pub fn new(fs: Arc<dyn Fs>, history: Entity<HistoryStore>) -> Self {
Self { fs, history }
}
}
@@ -26,16 +27,15 @@ impl AgentServer for NativeAgentServer {
}
fn empty_state_headline(&self) -> &'static str {
"Native Agent"
""
}
fn empty_state_message(&self) -> &'static str {
"How can I help you today?"
""
}
fn logo(&self) -> ui::IconName {
// Using the ZedAssistant icon as it's the native built-in agent
ui::IconName::ZedAssistant
ui::IconName::ZedAgent
}
fn connect(
@@ -50,6 +50,7 @@ impl AgentServer for NativeAgentServer {
);
let project = project.clone();
let fs = self.fs.clone();
let history = self.history.clone();
let prompt_store = PromptStore::global(cx);
cx.spawn(async move |cx| {
log::debug!("Creating templates for native agent");
@@ -57,7 +58,8 @@ impl AgentServer for NativeAgentServer {
let prompt_store = prompt_store.await?;
log::debug!("Creating native agent entity");
let agent = NativeAgent::new(project, templates, Some(prompt_store), fs, cx).await?;
let agent =
NativeAgent::new(project, history, templates, Some(prompt_store), fs, cx).await?;
// Create the connection wrapper
let connection = NativeAgentConnection(agent);
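A hypothetical call site for the updated constructor (the helper function and its arguments are illustrative, not taken from the diff): the server now receives the shared `HistoryStore` entity alongside the filesystem handle.
fn native_agent_server(
    fs: Arc<dyn Fs>,
    context_store: Entity<assistant_context::ContextStore>,
    cx: &mut App,
) -> NativeAgentServer {
    // The history store is created once and handed to the server, which threads
    // it through to `NativeAgent::new` in `connect` above.
    let history = cx.new(|cx| HistoryStore::new(context_store, cx));
    NativeAgentServer::new(fs, history)
}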

View File

@@ -6,15 +6,16 @@ use agent_settings::AgentProfileId;
use anyhow::Result;
use client::{Client, UserStore};
use fs::{FakeFs, Fs};
use futures::channel::mpsc::UnboundedReceiver;
use futures::{StreamExt, channel::mpsc::UnboundedReceiver};
use gpui::{
App, AppContext, Entity, Task, TestAppContext, UpdateGlobal, http_client::FakeHttpClient,
};
use indoc::indoc;
use language_model::{
LanguageModel, LanguageModelCompletionEvent, LanguageModelId, LanguageModelRegistry,
LanguageModelRequestMessage, LanguageModelToolResult, LanguageModelToolUse, MessageContent,
Role, StopReason, fake_provider::FakeLanguageModel,
LanguageModel, LanguageModelCompletionError, LanguageModelCompletionEvent, LanguageModelId,
LanguageModelProviderName, LanguageModelRegistry, LanguageModelRequestMessage,
LanguageModelToolResult, LanguageModelToolUse, MessageContent, Role, StopReason,
fake_provider::FakeLanguageModel,
};
use pretty_assertions::assert_eq;
use project::Project;
@@ -24,7 +25,6 @@ use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use serde_json::json;
use settings::SettingsStore;
use smol::stream::StreamExt;
use std::{path::Path, rc::Rc, sync::Arc, time::Duration};
use util::path;
@@ -345,7 +345,7 @@ async fn test_streaming_tool_calls(cx: &mut TestAppContext) {
let mut saw_partial_tool_use = false;
while let Some(event) = events.next().await {
if let Ok(AgentResponseEvent::ToolCall(tool_call)) = event {
if let Ok(ThreadEvent::ToolCall(tool_call)) = event {
thread.update(cx, |thread, _cx| {
// Look for a tool use in the thread's last message
let message = thread.last_message().unwrap();
@@ -735,16 +735,14 @@ async fn test_send_after_tool_use_limit(cx: &mut TestAppContext) {
);
}
async fn expect_tool_call(
events: &mut UnboundedReceiver<Result<AgentResponseEvent>>,
) -> acp::ToolCall {
async fn expect_tool_call(events: &mut UnboundedReceiver<Result<ThreadEvent>>) -> acp::ToolCall {
let event = events
.next()
.await
.expect("no tool call authorization event received")
.unwrap();
match event {
AgentResponseEvent::ToolCall(tool_call) => return tool_call,
ThreadEvent::ToolCall(tool_call) => tool_call,
event => {
panic!("Unexpected event {event:?}");
}
@@ -752,7 +750,7 @@ async fn expect_tool_call(
}
async fn expect_tool_call_update_fields(
events: &mut UnboundedReceiver<Result<AgentResponseEvent>>,
events: &mut UnboundedReceiver<Result<ThreadEvent>>,
) -> acp::ToolCallUpdate {
let event = events
.next()
@@ -760,9 +758,7 @@ async fn expect_tool_call_update_fields(
.expect("no tool call authorization event received")
.unwrap();
match event {
AgentResponseEvent::ToolCallUpdate(acp_thread::ToolCallUpdate::UpdateFields(update)) => {
return update;
}
ThreadEvent::ToolCallUpdate(acp_thread::ToolCallUpdate::UpdateFields(update)) => update,
event => {
panic!("Unexpected event {event:?}");
}
@@ -770,7 +766,7 @@ async fn expect_tool_call_update_fields(
}
async fn next_tool_call_authorization(
events: &mut UnboundedReceiver<Result<AgentResponseEvent>>,
events: &mut UnboundedReceiver<Result<ThreadEvent>>,
) -> ToolCallAuthorization {
loop {
let event = events
@@ -778,7 +774,7 @@ async fn next_tool_call_authorization(
.await
.expect("no tool call authorization event received")
.unwrap();
if let AgentResponseEvent::ToolCallAuthorization(tool_call_authorization) = event {
if let ThreadEvent::ToolCallAuthorization(tool_call_authorization) = event {
let permission_kinds = tool_call_authorization
.options
.iter()
@@ -945,13 +941,13 @@ async fn test_cancellation(cx: &mut TestAppContext) {
let mut echo_completed = false;
while let Some(event) = events.next().await {
match event.unwrap() {
AgentResponseEvent::ToolCall(tool_call) => {
ThreadEvent::ToolCall(tool_call) => {
assert_eq!(tool_call.title, expected_tools.remove(0));
if tool_call.title == "Echo" {
echo_id = Some(tool_call.id);
}
}
AgentResponseEvent::ToolCallUpdate(acp_thread::ToolCallUpdate::UpdateFields(
ThreadEvent::ToolCallUpdate(acp_thread::ToolCallUpdate::UpdateFields(
acp::ToolCallUpdate {
id,
fields:
@@ -973,13 +969,13 @@ async fn test_cancellation(cx: &mut TestAppContext) {
// Cancel the current send and ensure that the event stream is closed, even
// if one of the tools is still running.
thread.update(cx, |thread, _cx| thread.cancel());
thread.update(cx, |thread, cx| thread.cancel(cx));
let events = events.collect::<Vec<_>>().await;
let last_event = events.last();
assert!(
matches!(
last_event,
Some(Ok(AgentResponseEvent::Stop(acp::StopReason::Canceled)))
Some(Ok(ThreadEvent::Stop(acp::StopReason::Canceled)))
),
"unexpected event {last_event:?}"
);
@@ -1121,7 +1117,7 @@ async fn test_refusal(cx: &mut TestAppContext) {
}
#[gpui::test]
async fn test_truncate(cx: &mut TestAppContext) {
async fn test_truncate_first_message(cx: &mut TestAppContext) {
let ThreadTest { model, thread, .. } = setup(cx, TestModel::Fake).await;
let fake_model = model.as_fake();
@@ -1141,9 +1137,18 @@ async fn test_truncate(cx: &mut TestAppContext) {
Hello
"}
);
assert_eq!(thread.latest_token_usage(), None);
});
fake_model.send_last_completion_stream_text_chunk("Hey!");
fake_model.send_last_completion_stream_event(LanguageModelCompletionEvent::UsageUpdate(
language_model::TokenUsage {
input_tokens: 32_000,
output_tokens: 16_000,
cache_creation_input_tokens: 0,
cache_read_input_tokens: 0,
},
));
cx.run_until_parked();
thread.read_with(cx, |thread, _| {
assert_eq!(
@@ -1158,14 +1163,22 @@ async fn test_truncate(cx: &mut TestAppContext) {
Hey!
"}
);
assert_eq!(
thread.latest_token_usage(),
Some(acp_thread::TokenUsage {
used_tokens: 32_000 + 16_000,
max_tokens: 1_000_000,
})
);
});
thread
.update(cx, |thread, _cx| thread.truncate(message_id))
.update(cx, |thread, cx| thread.truncate(message_id, cx))
.unwrap();
cx.run_until_parked();
thread.read_with(cx, |thread, _| {
assert_eq!(thread.to_markdown(), "");
assert_eq!(thread.latest_token_usage(), None);
});
// Ensure we can still send a new message after truncation.
@@ -1186,6 +1199,14 @@ async fn test_truncate(cx: &mut TestAppContext) {
});
cx.run_until_parked();
fake_model.send_last_completion_stream_text_chunk("Ahoy!");
fake_model.send_last_completion_stream_event(LanguageModelCompletionEvent::UsageUpdate(
language_model::TokenUsage {
input_tokens: 40_000,
output_tokens: 20_000,
cache_creation_input_tokens: 0,
cache_read_input_tokens: 0,
},
));
cx.run_until_parked();
thread.read_with(cx, |thread, _| {
assert_eq!(
@@ -1200,9 +1221,171 @@ async fn test_truncate(cx: &mut TestAppContext) {
Ahoy!
"}
);
assert_eq!(
thread.latest_token_usage(),
Some(acp_thread::TokenUsage {
used_tokens: 40_000 + 20_000,
max_tokens: 1_000_000,
})
);
});
}
#[gpui::test]
async fn test_truncate_second_message(cx: &mut TestAppContext) {
let ThreadTest { model, thread, .. } = setup(cx, TestModel::Fake).await;
let fake_model = model.as_fake();
thread
.update(cx, |thread, cx| {
thread.send(UserMessageId::new(), ["Message 1"], cx)
})
.unwrap();
cx.run_until_parked();
fake_model.send_last_completion_stream_text_chunk("Message 1 response");
fake_model.send_last_completion_stream_event(LanguageModelCompletionEvent::UsageUpdate(
language_model::TokenUsage {
input_tokens: 32_000,
output_tokens: 16_000,
cache_creation_input_tokens: 0,
cache_read_input_tokens: 0,
},
));
fake_model.end_last_completion_stream();
cx.run_until_parked();
let assert_first_message_state = |cx: &mut TestAppContext| {
thread.clone().read_with(cx, |thread, _| {
assert_eq!(
thread.to_markdown(),
indoc! {"
## User
Message 1
## Assistant
Message 1 response
"}
);
assert_eq!(
thread.latest_token_usage(),
Some(acp_thread::TokenUsage {
used_tokens: 32_000 + 16_000,
max_tokens: 1_000_000,
})
);
});
};
assert_first_message_state(cx);
let second_message_id = UserMessageId::new();
thread
.update(cx, |thread, cx| {
thread.send(second_message_id.clone(), ["Message 2"], cx)
})
.unwrap();
cx.run_until_parked();
fake_model.send_last_completion_stream_text_chunk("Message 2 response");
fake_model.send_last_completion_stream_event(LanguageModelCompletionEvent::UsageUpdate(
language_model::TokenUsage {
input_tokens: 40_000,
output_tokens: 20_000,
cache_creation_input_tokens: 0,
cache_read_input_tokens: 0,
},
));
fake_model.end_last_completion_stream();
cx.run_until_parked();
thread.read_with(cx, |thread, _| {
assert_eq!(
thread.to_markdown(),
indoc! {"
## User
Message 1
## Assistant
Message 1 response
## User
Message 2
## Assistant
Message 2 response
"}
);
assert_eq!(
thread.latest_token_usage(),
Some(acp_thread::TokenUsage {
used_tokens: 40_000 + 20_000,
max_tokens: 1_000_000,
})
);
});
thread
.update(cx, |thread, cx| thread.truncate(second_message_id, cx))
.unwrap();
cx.run_until_parked();
assert_first_message_state(cx);
}
#[gpui::test]
async fn test_title_generation(cx: &mut TestAppContext) {
let ThreadTest { model, thread, .. } = setup(cx, TestModel::Fake).await;
let fake_model = model.as_fake();
let summary_model = Arc::new(FakeLanguageModel::default());
thread.update(cx, |thread, cx| {
thread.set_summarization_model(Some(summary_model.clone()), cx)
});
let send = thread
.update(cx, |thread, cx| {
thread.send(UserMessageId::new(), ["Hello"], cx)
})
.unwrap();
cx.run_until_parked();
fake_model.send_last_completion_stream_text_chunk("Hey!");
fake_model.end_last_completion_stream();
cx.run_until_parked();
thread.read_with(cx, |thread, _| assert_eq!(thread.title(), "New Thread"));
// Simulate the summary model streaming back a title.
summary_model.send_last_completion_stream_text_chunk("Hello ");
summary_model.send_last_completion_stream_text_chunk("world\nG");
summary_model.send_last_completion_stream_text_chunk("oodnight Moon");
summary_model.end_last_completion_stream();
send.collect::<Vec<_>>().await;
thread.read_with(cx, |thread, _| assert_eq!(thread.title(), "Hello world"));
// Send another message, ensuring no title is generated this time.
let send = thread
.update(cx, |thread, cx| {
thread.send(UserMessageId::new(), ["Hello again"], cx)
})
.unwrap();
cx.run_until_parked();
fake_model.send_last_completion_stream_text_chunk("Hey again!");
fake_model.end_last_completion_stream();
cx.run_until_parked();
assert_eq!(summary_model.pending_completions(), Vec::new());
send.collect::<Vec<_>>().await;
thread.read_with(cx, |thread, _| assert_eq!(thread.title(), "Hello world"));
}
#[gpui::test]
async fn test_agent_connection(cx: &mut TestAppContext) {
cx.update(settings::init);
@@ -1218,7 +1401,7 @@ async fn test_agent_connection(cx: &mut TestAppContext) {
let client = Client::new(clock, http_client, cx);
let user_store = cx.new(|cx| UserStore::new(client.clone(), cx));
language_model::init(client.clone(), cx);
language_models::init(user_store.clone(), client.clone(), cx);
language_models::init(user_store, client.clone(), cx);
Project::init_settings(cx);
LanguageModelRegistry::test(cx);
agent_settings::init(cx);
@@ -1230,10 +1413,13 @@ async fn test_agent_connection(cx: &mut TestAppContext) {
fake_fs.insert_tree(path!("/test"), json!({})).await;
let project = Project::test(fake_fs.clone(), [Path::new("/test")], cx).await;
let cwd = Path::new("/test");
let context_store = cx.new(|cx| assistant_context::ContextStore::fake(project.clone(), cx));
let history_store = cx.new(|cx| HistoryStore::new(context_store, cx));
// Create agent and connection
let agent = NativeAgent::new(
project.clone(),
history_store,
templates.clone(),
None,
fake_fs.clone(),
@@ -1435,12 +1621,168 @@ async fn test_tool_updates_to_completion(cx: &mut TestAppContext) {
);
}
#[gpui::test]
async fn test_send_no_retry_on_success(cx: &mut TestAppContext) {
let ThreadTest { thread, model, .. } = setup(cx, TestModel::Fake).await;
let fake_model = model.as_fake();
let mut events = thread
.update(cx, |thread, cx| {
thread.set_completion_mode(agent_settings::CompletionMode::Burn, cx);
thread.send(UserMessageId::new(), ["Hello!"], cx)
})
.unwrap();
cx.run_until_parked();
fake_model.send_last_completion_stream_text_chunk("Hey!");
fake_model.end_last_completion_stream();
let mut retry_events = Vec::new();
while let Some(Ok(event)) = events.next().await {
match event {
ThreadEvent::Retry(retry_status) => {
retry_events.push(retry_status);
}
ThreadEvent::Stop(..) => break,
_ => {}
}
}
assert_eq!(retry_events.len(), 0);
thread.read_with(cx, |thread, _cx| {
assert_eq!(
thread.to_markdown(),
indoc! {"
## User
Hello!
## Assistant
Hey!
"}
)
});
}
#[gpui::test]
async fn test_send_retry_on_error(cx: &mut TestAppContext) {
let ThreadTest { thread, model, .. } = setup(cx, TestModel::Fake).await;
let fake_model = model.as_fake();
let mut events = thread
.update(cx, |thread, cx| {
thread.set_completion_mode(agent_settings::CompletionMode::Burn, cx);
thread.send(UserMessageId::new(), ["Hello!"], cx)
})
.unwrap();
cx.run_until_parked();
fake_model.send_last_completion_stream_error(LanguageModelCompletionError::ServerOverloaded {
provider: LanguageModelProviderName::new("Anthropic"),
retry_after: Some(Duration::from_secs(3)),
});
fake_model.end_last_completion_stream();
cx.executor().advance_clock(Duration::from_secs(3));
cx.run_until_parked();
fake_model.send_last_completion_stream_text_chunk("Hey!");
fake_model.end_last_completion_stream();
let mut retry_events = Vec::new();
while let Some(Ok(event)) = events.next().await {
match event {
ThreadEvent::Retry(retry_status) => {
retry_events.push(retry_status);
}
ThreadEvent::Stop(..) => break,
_ => {}
}
}
assert_eq!(retry_events.len(), 1);
assert!(matches!(
retry_events[0],
acp_thread::RetryStatus { attempt: 1, .. }
));
thread.read_with(cx, |thread, _cx| {
assert_eq!(
thread.to_markdown(),
indoc! {"
## User
Hello!
## Assistant
Hey!
"}
)
});
}
#[gpui::test]
async fn test_send_max_retries_exceeded(cx: &mut TestAppContext) {
let ThreadTest { thread, model, .. } = setup(cx, TestModel::Fake).await;
let fake_model = model.as_fake();
let mut events = thread
.update(cx, |thread, cx| {
thread.set_completion_mode(agent_settings::CompletionMode::Burn, cx);
thread.send(UserMessageId::new(), ["Hello!"], cx)
})
.unwrap();
cx.run_until_parked();
for _ in 0..crate::thread::MAX_RETRY_ATTEMPTS + 1 {
fake_model.send_last_completion_stream_error(
LanguageModelCompletionError::ServerOverloaded {
provider: LanguageModelProviderName::new("Anthropic"),
retry_after: Some(Duration::from_secs(3)),
},
);
fake_model.end_last_completion_stream();
cx.executor().advance_clock(Duration::from_secs(3));
cx.run_until_parked();
}
let mut errors = Vec::new();
let mut retry_events = Vec::new();
while let Some(event) = events.next().await {
match event {
Ok(ThreadEvent::Retry(retry_status)) => {
retry_events.push(retry_status);
}
Ok(ThreadEvent::Stop(..)) => break,
Err(error) => errors.push(error),
_ => {}
}
}
assert_eq!(
retry_events.len(),
crate::thread::MAX_RETRY_ATTEMPTS as usize
);
for i in 0..crate::thread::MAX_RETRY_ATTEMPTS as usize {
assert_eq!(retry_events[i].attempt, i + 1);
}
assert_eq!(errors.len(), 1);
let error = errors[0]
.downcast_ref::<LanguageModelCompletionError>()
.unwrap();
assert!(matches!(
error,
LanguageModelCompletionError::ServerOverloaded { .. }
));
}
/// Keeps only the stop events for asserting against in tests
fn stop_events(result_events: Vec<Result<AgentResponseEvent>>) -> Vec<acp::StopReason> {
fn stop_events(result_events: Vec<Result<ThreadEvent>>) -> Vec<acp::StopReason> {
result_events
.into_iter()
.filter_map(|event| match event.unwrap() {
AgentResponseEvent::Stop(stop_reason) => Some(stop_reason),
ThreadEvent::Stop(stop_reason) => Some(stop_reason),
_ => None,
})
.collect()
@@ -1512,7 +1854,7 @@ async fn setup(cx: &mut TestAppContext, model: TestModel) -> ThreadTest {
let client = Client::production(cx);
let user_store = cx.new(|cx| UserStore::new(client.clone(), cx));
language_model::init(client.clone(), cx);
language_models::init(user_store.clone(), client.clone(), cx);
language_models::init(user_store, client.clone(), cx);
watch_settings(fs.clone(), cx);
});

File diff suppressed because it is too large

View File

@@ -0,0 +1,43 @@
use language_model::LanguageModelToolSchemaFormat;
use schemars::{
JsonSchema, Schema,
generate::SchemaSettings,
transform::{Transform, transform_subschemas},
};
pub(crate) fn root_schema_for<T: JsonSchema>(format: LanguageModelToolSchemaFormat) -> Schema {
let mut generator = match format {
LanguageModelToolSchemaFormat::JsonSchema => SchemaSettings::draft07().into_generator(),
LanguageModelToolSchemaFormat::JsonSchemaSubset => SchemaSettings::openapi3()
.with(|settings| {
settings.meta_schema = None;
settings.inline_subschemas = true;
})
.with_transform(ToJsonSchemaSubsetTransform)
.into_generator(),
};
generator.root_schema_for::<T>()
}
#[derive(Debug, Clone)]
struct ToJsonSchemaSubsetTransform;
impl Transform for ToJsonSchemaSubsetTransform {
fn transform(&mut self, schema: &mut Schema) {
// Ensure that the type field is not an array; this happens when we use
// Option<T>, where the type becomes [T, "null"].
if let Some(type_field) = schema.get_mut("type")
&& let Some(types) = type_field.as_array()
&& let Some(first_type) = types.first()
{
*type_field = first_type.clone();
}
// oneOf is not supported, use anyOf instead
if let Some(one_of) = schema.remove("oneOf") {
schema.insert("anyOf".to_string(), one_of);
}
transform_subschemas(self, schema);
}
}
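A brief sketch of how this generator might be exercised elsewhere in the crate (`ExampleToolInput` is invented for illustration):
use schemars::JsonSchema;

#[derive(JsonSchema)]
struct ExampleToolInput {
    /// Optional fields would otherwise generate `"type": ["string", "null"]`;
    /// the subset transform collapses the array to its first entry.
    note: Option<String>,
    /// Plain fields pass through unchanged.
    count: u32,
}

fn example_schema() -> schemars::Schema {
    // `JsonSchemaSubset` is the stricter format that some providers require.
    root_schema_for::<ExampleToolInput>(
        language_model::LanguageModelToolSchemaFormat::JsonSchemaSubset,
    )
}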

View File

@@ -176,15 +176,13 @@ impl AnyAgentTool for ContextServerTool {
return Task::ready(Err(anyhow!("Context server not found")));
};
let tool_name = self.tool.name.clone();
let server_clone = server.clone();
let input_clone = input.clone();
cx.spawn(async move |_cx| {
let Some(protocol) = server_clone.client() else {
let Some(protocol) = server.client() else {
bail!("Context server not initialized");
};
let arguments = if let serde_json::Value::Object(map) = input_clone {
let arguments = if let serde_json::Value::Object(map) = input {
Some(map.into_iter().collect())
} else {
None
@@ -228,4 +226,14 @@ impl AnyAgentTool for ContextServerTool {
})
})
}
fn replay(
&self,
_input: serde_json::Value,
_output: serde_json::Value,
_event_stream: ToolCallEventStream,
_cx: &mut App,
) -> Result<()> {
Ok(())
}
}

View File

@@ -8,16 +8,11 @@ use serde::{Deserialize, Serialize};
use std::sync::Arc;
use util::markdown::MarkdownInlineCode;
/// Copies a file or directory in the project, and returns confirmation that the
/// copy succeeded.
///
/// Copies a file or directory in the project, and returns confirmation that the copy succeeded.
/// Directory contents will be copied recursively (like `cp -r`).
///
/// This tool should be used when it's desirable to create a copy of a file or
/// directory without modifying the original. It's much more efficient than
/// doing this by separately reading and then writing the file or directory's
/// contents, so this tool should be preferred over that approach whenever
/// copying is the goal.
/// This tool should be used when it's desirable to create a copy of a file or directory without modifying the original.
/// It's much more efficient than doing this by separately reading and then writing the file or directory's contents, so this tool should be preferred over that approach whenever copying is the goal.
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct CopyPathToolInput {
/// The source path of the file or directory to copy.
@@ -33,12 +28,10 @@ pub struct CopyPathToolInput {
/// You can copy the first file by providing a source_path of "directory1/a/something.txt"
/// </example>
pub source_path: String,
/// The destination path where the file or directory should be copied to.
///
/// <example>
/// To copy "directory1/a/something.txt" to "directory2/b/copy.txt",
/// provide a destination_path of "directory2/b/copy.txt"
/// To copy "directory1/a/something.txt" to "directory2/b/copy.txt", provide a destination_path of "directory2/b/copy.txt"
/// </example>
pub destination_path: String,
}

View File

@@ -9,12 +9,9 @@ use util::markdown::MarkdownInlineCode;
use crate::{AgentTool, ToolCallEventStream};
/// Creates a new directory at the specified path within the project. Returns
/// confirmation that the directory was created.
/// Creates a new directory at the specified path within the project. Returns confirmation that the directory was created.
///
/// This tool creates a directory and all necessary parent directories (similar
/// to `mkdir -p`). It should be used whenever you need to create new
/// directories within the project.
/// This tool creates a directory and all necessary parent directories (similar to `mkdir -p`). It should be used whenever you need to create new directories within the project.
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct CreateDirectoryToolInput {
/// The path of the new directory.

View File

@@ -9,8 +9,7 @@ use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use std::sync::Arc;
/// Deletes the file or directory (and the directory's contents, recursively) at
/// the specified path in the project, and returns confirmation of the deletion.
/// Deletes the file or directory (and the directory's contents, recursively) at the specified path in the project, and returns confirmation of the deletion.
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct DeletePathToolInput {
/// The path of the file or directory to delete.

View File

@@ -5,10 +5,10 @@ use anyhow::{Context as _, Result, anyhow};
use assistant_tools::edit_agent::{EditAgent, EditAgentOutput, EditAgentOutputEvent, EditFormat};
use cloud_llm_client::CompletionIntent;
use collections::HashSet;
use gpui::{App, AppContext, AsyncApp, Entity, Task};
use gpui::{App, AppContext, AsyncApp, Entity, Task, WeakEntity};
use indoc::formatdoc;
use language::ToPoint;
use language::language_settings::{self, FormatOnSave};
use language::{LanguageRegistry, ToPoint};
use language_model::LanguageModelToolResultContent;
use paths;
use project::lsp_store::{FormatTrigger, LspFormatTarget};
@@ -34,25 +34,21 @@ const DEFAULT_UI_TEXT: &str = "Editing file";
/// - Use the `list_directory` tool to verify the parent directory exists and is the correct location
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct EditFileToolInput {
/// A one-line, user-friendly markdown description of the edit. This will be
/// shown in the UI and also passed to another model to perform the edit.
/// A one-line, user-friendly markdown description of the edit. This will be shown in the UI and also passed to another model to perform the edit.
///
/// Be terse, but also descriptive in what you want to achieve with this
/// edit. Avoid generic instructions.
/// Be terse, but also descriptive in what you want to achieve with this edit. Avoid generic instructions.
///
/// NEVER mention the file path in this description.
///
/// <example>Fix API endpoint URLs</example>
/// <example>Update copyright year in `page_footer`</example>
///
/// Make sure to include this field before all the others in the input object
/// so that we can display it immediately.
/// Make sure to include this field before all the others in the input object so that we can display it immediately.
pub display_description: String,
/// The full path of the file to create or modify in the project.
///
/// WARNING: When specifying which file path needs changing, you MUST
/// start each path with one of the project's root directories.
/// WARNING: When specifying which file path needs changing, you MUST start each path with one of the project's root directories.
///
/// The following examples assume we have two root directories in the project:
/// - /a/b/backend
@@ -61,22 +57,19 @@ pub struct EditFileToolInput {
/// <example>
/// `backend/src/main.rs`
///
/// Notice how the file path starts with `backend`. Without that, the path
/// would be ambiguous and the call would fail!
/// Notice how the file path starts with `backend`. Without that, the path would be ambiguous and the call would fail!
/// </example>
///
/// <example>
/// `frontend/db.js`
/// </example>
pub path: PathBuf,
/// The mode of operation on the file. Possible values:
/// - 'edit': Make granular edits to an existing file.
/// - 'create': Create a new file if it doesn't exist.
/// - 'overwrite': Replace the entire contents of an existing file.
///
/// When a file already exists or you just created it, prefer editing
/// it as opposed to recreating it from scratch.
/// When a file already exists or you just created it, prefer editing it as opposed to recreating it from scratch.
pub mode: EditFileMode,
}
@@ -98,11 +91,13 @@ pub enum EditFileMode {
#[derive(Debug, Serialize, Deserialize)]
pub struct EditFileToolOutput {
#[serde(alias = "original_path")]
input_path: PathBuf,
project_path: PathBuf,
new_text: String,
old_text: Arc<String>,
#[serde(default)]
diff: String,
#[serde(alias = "raw_output")]
edit_agent_output: EditAgentOutput,
}
@@ -122,12 +117,16 @@ impl From<EditFileToolOutput> for LanguageModelToolResultContent {
}
pub struct EditFileTool {
thread: Entity<Thread>,
thread: WeakEntity<Thread>,
language_registry: Arc<LanguageRegistry>,
}
impl EditFileTool {
pub fn new(thread: Entity<Thread>) -> Self {
Self { thread }
pub fn new(thread: WeakEntity<Thread>, language_registry: Arc<LanguageRegistry>) -> Self {
Self {
thread,
language_registry,
}
}
fn authorize(
@@ -156,19 +155,22 @@ impl EditFileTool {
// It's also possible that the global config dir is configured to be inside the project,
// so check for that edge case too.
if let Ok(canonical_path) = std::fs::canonicalize(&input.path) {
if canonical_path.starts_with(paths::config_dir()) {
return event_stream.authorize(
format!("{} (global settings)", input.display_description),
cx,
);
}
if let Ok(canonical_path) = std::fs::canonicalize(&input.path)
&& canonical_path.starts_with(paths::config_dir())
{
return event_stream.authorize(
format!("{} (global settings)", input.display_description),
cx,
);
}
// Check if path is inside the global config directory
// First check if it's already inside project - if not, try to canonicalize
let thread = self.thread.read(cx);
let project_path = thread.project().read(cx).find_project_path(&input.path, cx);
let Ok(project_path) = self.thread.read_with(cx, |thread, cx| {
thread.project().read(cx).find_project_path(&input.path, cx)
}) else {
return Task::ready(Err(anyhow!("thread was dropped")));
};
// If the path is inside the project, and it's not one of the above edge cases,
// then no confirmation is necessary. Otherwise, confirmation is necessary.
@@ -221,7 +223,12 @@ impl AgentTool for EditFileTool {
event_stream: ToolCallEventStream,
cx: &mut App,
) -> Task<Result<Self::Output>> {
let project = self.thread.read(cx).project().clone();
let Ok(project) = self
.thread
.read_with(cx, |thread, _cx| thread.project().clone())
else {
return Task::ready(Err(anyhow!("thread was dropped")));
};
let project_path = match resolve_path(&input, project.clone(), cx) {
Ok(path) => path,
Err(err) => return Task::ready(Err(anyhow!(err))),
@@ -237,23 +244,17 @@ impl AgentTool for EditFileTool {
});
}
let Some(request) = self.thread.update(cx, |thread, cx| {
thread
.build_completion_request(CompletionIntent::ToolResults, cx)
.ok()
}) else {
return Task::ready(Err(anyhow!("Failed to build completion request")));
};
let thread = self.thread.read(cx);
let Some(model) = thread.model().cloned() else {
return Task::ready(Err(anyhow!("No language model configured")));
};
let action_log = thread.action_log().clone();
let authorize = self.authorize(&input, &event_stream, cx);
cx.spawn(async move |cx: &mut AsyncApp| {
authorize.await?;
let (request, model, action_log) = self.thread.update(cx, |thread, cx| {
let request = thread.build_completion_request(CompletionIntent::ToolResults, cx);
(request, thread.model().cloned(), thread.action_log().clone())
})?;
let request = request?;
let model = model.context("No language model configured")?;
let edit_format = EditFormat::from_model(model.clone())?;
let edit_agent = EditAgent::new(
model,
@@ -419,14 +420,32 @@ impl AgentTool for EditFileTool {
Ok(EditFileToolOutput {
input_path: input.path,
project_path: project_path.path.to_path_buf(),
new_text: new_text.clone(),
new_text,
old_text,
diff: unified_diff,
edit_agent_output,
})
})
}
fn replay(
&self,
_input: Self::Input,
output: Self::Output,
event_stream: ToolCallEventStream,
cx: &mut App,
) -> Result<()> {
event_stream.update_diff(cx.new(|cx| {
Diff::finalized(
output.input_path,
Some(output.old_text.to_string()),
output.new_text,
self.language_registry.clone(),
cx,
)
}));
Ok(())
}
}
/// Validate that the file path is valid, meaning:
@@ -515,6 +534,7 @@ mod tests {
let fs = project::FakeFs::new(cx.executor());
fs.insert_tree("/root", json!({})).await;
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let language_registry = project.read_with(cx, |project, _cx| project.languages().clone());
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
@@ -537,7 +557,11 @@ mod tests {
path: "root/nonexistent_file.txt".into(),
mode: EditFileMode::Edit,
};
Arc::new(EditFileTool { thread }).run(input, ToolCallEventStream::test().0, cx)
Arc::new(EditFileTool::new(thread.downgrade(), language_registry)).run(
input,
ToolCallEventStream::test().0,
cx,
)
})
.await;
assert_eq!(
@@ -624,8 +648,7 @@ mod tests {
mode: mode.clone(),
};
let result = cx.update(|cx| resolve_path(&input, project, cx));
result
cx.update(|cx| resolve_path(&input, project, cx))
}
fn assert_resolved_path_eq(path: anyhow::Result<ProjectPath>, expected: &str) {
@@ -750,9 +773,10 @@ mod tests {
path: "root/src/main.rs".into(),
mode: EditFileMode::Overwrite,
};
Arc::new(EditFileTool {
thread: thread.clone(),
})
Arc::new(EditFileTool::new(
thread.downgrade(),
language_registry.clone(),
))
.run(input, ToolCallEventStream::test().0, cx)
});
@@ -806,7 +830,11 @@ mod tests {
path: "root/src/main.rs".into(),
mode: EditFileMode::Overwrite,
};
Arc::new(EditFileTool { thread }).run(input, ToolCallEventStream::test().0, cx)
Arc::new(EditFileTool::new(thread.downgrade(), language_registry)).run(
input,
ToolCallEventStream::test().0,
cx,
)
});
// Stream the unformatted content
@@ -850,6 +878,7 @@ mod tests {
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let language_registry = project.read_with(cx, |project, _cx| project.languages().clone());
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
@@ -887,9 +916,10 @@ mod tests {
path: "root/src/main.rs".into(),
mode: EditFileMode::Overwrite,
};
Arc::new(EditFileTool {
thread: thread.clone(),
})
Arc::new(EditFileTool::new(
thread.downgrade(),
language_registry.clone(),
))
.run(input, ToolCallEventStream::test().0, cx)
});
@@ -938,10 +968,11 @@ mod tests {
path: "root/src/main.rs".into(),
mode: EditFileMode::Overwrite,
};
Arc::new(EditFileTool {
thread: thread.clone(),
})
.run(input, ToolCallEventStream::test().0, cx)
Arc::new(EditFileTool::new(thread.downgrade(), language_registry)).run(
input,
ToolCallEventStream::test().0,
cx,
)
});
// Stream the content with trailing whitespace
@@ -976,6 +1007,7 @@ mod tests {
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let language_registry = project.read_with(cx, |project, _cx| project.languages().clone());
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
@@ -989,7 +1021,7 @@ mod tests {
cx,
)
});
let tool = Arc::new(EditFileTool { thread });
let tool = Arc::new(EditFileTool::new(thread.downgrade(), language_registry));
fs.insert_tree("/root", json!({})).await;
// Test 1: Path with .zed component should require confirmation
@@ -1111,6 +1143,7 @@ mod tests {
let fs = project::FakeFs::new(cx.executor());
fs.insert_tree("/project", json!({})).await;
let project = Project::test(fs.clone(), [path!("/project").as_ref()], cx).await;
let language_registry = project.read_with(cx, |project, _cx| project.languages().clone());
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
let action_log = cx.new(|_| ActionLog::new(project.clone()));
@@ -1126,7 +1159,7 @@ mod tests {
cx,
)
});
let tool = Arc::new(EditFileTool { thread });
let tool = Arc::new(EditFileTool::new(thread.downgrade(), language_registry));
// Test global config paths - these should require confirmation if they exist and are outside the project
let test_cases = vec![
@@ -1220,7 +1253,7 @@ mod tests {
cx,
)
.await;
let language_registry = project.read_with(cx, |project, _cx| project.languages().clone());
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
@@ -1236,7 +1269,7 @@ mod tests {
cx,
)
});
let tool = Arc::new(EditFileTool { thread });
let tool = Arc::new(EditFileTool::new(thread.downgrade(), language_registry));
// Test files in different worktrees
let test_cases = vec![
@@ -1302,6 +1335,7 @@ mod tests {
)
.await;
let project = Project::test(fs.clone(), [path!("/project").as_ref()], cx).await;
let language_registry = project.read_with(cx, |project, _cx| project.languages().clone());
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
@@ -1317,7 +1351,7 @@ mod tests {
cx,
)
});
let tool = Arc::new(EditFileTool { thread });
let tool = Arc::new(EditFileTool::new(thread.downgrade(), language_registry));
// Test edge cases
let test_cases = vec![
@@ -1386,6 +1420,7 @@ mod tests {
)
.await;
let project = Project::test(fs.clone(), [path!("/project").as_ref()], cx).await;
let language_registry = project.read_with(cx, |project, _cx| project.languages().clone());
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
@@ -1401,7 +1436,7 @@ mod tests {
cx,
)
});
let tool = Arc::new(EditFileTool { thread });
let tool = Arc::new(EditFileTool::new(thread.downgrade(), language_registry));
// Test different EditFileMode values
let modes = vec![
@@ -1467,6 +1502,7 @@ mod tests {
init_test(cx);
let fs = project::FakeFs::new(cx.executor());
let project = Project::test(fs.clone(), [path!("/project").as_ref()], cx).await;
let language_registry = project.read_with(cx, |project, _cx| project.languages().clone());
let action_log = cx.new(|_| ActionLog::new(project.clone()));
let context_server_registry =
cx.new(|cx| ContextServerRegistry::new(project.read(cx).context_server_store(), cx));
@@ -1482,7 +1518,7 @@ mod tests {
cx,
)
});
let tool = Arc::new(EditFileTool { thread });
let tool = Arc::new(EditFileTool::new(thread.downgrade(), language_registry));
assert_eq!(
tool.initial_title(Err(json!({

View File

@@ -31,7 +31,6 @@ pub struct FindPathToolInput {
/// You can get back the first two paths by providing a glob of "*thing*.txt"
/// </example>
pub glob: String,
/// Optional starting position for paginated results (0-based).
/// When not provided, starts from the beginning.
#[serde(default)]
@@ -116,7 +115,7 @@ impl AgentTool for FindPathTool {
..cmp::min(input.offset + RESULTS_PER_PAGE, matches.len())];
event_stream.update_fields(acp::ToolCallUpdateFields {
title: Some(if paginated_matches.len() == 0 {
title: Some(if paginated_matches.is_empty() {
"No matches".into()
} else if paginated_matches.len() == 1 {
"1 match".into()

View File

@@ -27,8 +27,7 @@ use util::paths::PathMatcher;
/// - DO NOT use HTML entities solely to escape characters in the tool parameters.
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct GrepToolInput {
/// A regex pattern to search for in the entire project. Note that the regex
/// will be parsed by the Rust `regex` crate.
/// A regex pattern to search for in the entire project. Note that the regex will be parsed by the Rust `regex` crate.
///
/// Do NOT specify a path here! This will only be matched against the code **content**.
pub regex: String,
@@ -179,15 +178,14 @@ impl AgentTool for GrepTool {
// Check if this file should be excluded based on its worktree settings
if let Ok(Some(project_path)) = project.read_with(cx, |project, cx| {
project.find_project_path(&path, cx)
}) {
if cx.update(|cx| {
})
&& cx.update(|cx| {
let worktree_settings = WorktreeSettings::get(Some((&project_path).into()), cx);
worktree_settings.is_path_excluded(&project_path.path)
|| worktree_settings.is_path_private(&project_path.path)
}).unwrap_or(false) {
continue;
}
}
while *parse_status.borrow() != ParseStatus::Idle {
parse_status.changed().await?;
@@ -275,12 +273,11 @@ impl AgentTool for GrepTool {
output.extend(snapshot.text_for_range(range));
output.push_str("\n```\n");
if let Some(ancestor_range) = ancestor_range {
if end_row < ancestor_range.end.row {
if let Some(ancestor_range) = ancestor_range
&& end_row < ancestor_range.end.row {
let remaining_lines = ancestor_range.end.row - end_row;
writeln!(output, "\n{} lines remaining in ancestor node. Read the file to see all.", remaining_lines)?;
}
}
matches_found += 1;
}
@@ -320,7 +317,7 @@ mod tests {
init_test(cx);
cx.executor().allow_parking();
let fs = FakeFs::new(cx.executor().clone());
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
path!("/root"),
serde_json::json!({
@@ -405,7 +402,7 @@ mod tests {
init_test(cx);
cx.executor().allow_parking();
let fs = FakeFs::new(cx.executor().clone());
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
path!("/root"),
serde_json::json!({
@@ -480,7 +477,7 @@ mod tests {
init_test(cx);
cx.executor().allow_parking();
let fs = FakeFs::new(cx.executor().clone());
let fs = FakeFs::new(cx.executor());
// Create test file with syntax structures
fs.insert_tree(
@@ -765,7 +762,7 @@ mod tests {
if cfg!(windows) {
result.replace("root\\", "root/")
} else {
result.to_string()
result
}
}
Err(e) => panic!("Failed to run grep tool: {}", e),

View File

@@ -10,14 +10,12 @@ use std::fmt::Write;
use std::{path::Path, sync::Arc};
use util::markdown::MarkdownInlineCode;
/// Lists files and directories in a given path. Prefer the `grep` or
/// `find_path` tools when searching the codebase.
/// Lists files and directories in a given path. Prefer the `grep` or `find_path` tools when searching the codebase.
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct ListDirectoryToolInput {
/// The fully-qualified path of the directory to list in the project.
///
/// This path should never be absolute, and the first component
/// of the path should always be a root directory in a project.
/// This path should never be absolute, and the first component of the path should always be a root directory in a project.
///
/// <example>
/// If the project has the following root directories:

View File

@@ -8,14 +8,11 @@ use serde::{Deserialize, Serialize};
use std::{path::Path, sync::Arc};
use util::markdown::MarkdownInlineCode;
/// Moves or rename a file or directory in the project, and returns confirmation
/// that the move succeeded.
/// Moves or renames a file or directory in the project, and returns confirmation that the move succeeded.
///
/// If the source and destination directories are the same, but the filename is
/// different, this performs a rename. Otherwise, it performs a move.
/// If the source and destination directories are the same, but the filename is different, this performs a rename. Otherwise, it performs a move.
///
/// This tool should be used when it's desirable to move or rename a file or
/// directory without changing its contents at all.
/// This tool should be used when it's desirable to move or rename a file or directory without changing its contents at all.
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct MovePathToolInput {
/// The source path of the file or directory to move/rename.

View File

@@ -8,19 +8,15 @@ use serde::{Deserialize, Serialize};
use std::{path::PathBuf, sync::Arc};
use util::markdown::MarkdownEscaped;
/// This tool opens a file or URL with the default application associated with
/// it on the user's operating system:
/// This tool opens a file or URL with the default application associated with it on the user's operating system:
///
/// - On macOS, it's equivalent to the `open` command
/// - On Windows, it's equivalent to `start`
/// - On Linux, it uses something like `xdg-open`, `gio open`, `gnome-open`, `kde-open`, `wslview` as appropriate
///
/// For example, it can open a web browser with a URL, open a PDF file with the
/// default PDF viewer, etc.
/// For example, it can open a web browser with a URL, open a PDF file with the default PDF viewer, etc.
///
/// You MUST ONLY use this tool when the user has explicitly requested opening
/// something. You MUST NEVER assume that the user would like for you to use
/// this tool.
/// You MUST ONLY use this tool when the user has explicitly requested opening something. You MUST NEVER assume that the user would like for you to use this tool.
#[derive(Clone, Debug, Serialize, Deserialize, JsonSchema)]
pub struct OpenToolInput {
/// The path or URL to open with the default application.

View File

@@ -21,8 +21,7 @@ use crate::{AgentTool, ToolCallEventStream};
pub struct ReadFileToolInput {
/// The relative path of the file to read.
///
/// This path should never be absolute, and the first component
/// of the path should always be a root directory in a project.
/// This path should never be absolute, and the first component of the path should always be a root directory in a project.
///
/// <example>
/// If the project has the following root directories:
@@ -34,11 +33,9 @@ pub struct ReadFileToolInput {
/// If you want to access `file.txt` in `directory2`, you should use the path `directory2/file.txt`.
/// </example>
pub path: String,
/// Optional line number to start reading on (1-based index)
#[serde(default)]
pub start_line: Option<u32>,
/// Optional line number to end reading on (1-based index, inclusive)
#[serde(default)]
pub end_line: Option<u32>,
@@ -175,7 +172,7 @@ impl AgentTool for ReadFileTool {
buffer
.file()
.as_ref()
.map_or(true, |file| !file.disk_state().exists())
.is_none_or(|file| !file.disk_state().exists())
})? {
anyhow::bail!("{file_path} not found");
}
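
Note: `is_none_or` is the drop-in replacement for the `map_or(true, ...)` pattern flagged by clippy. A one-line illustration:

```rust
// `is_none_or` reads as "no file on disk, or the predicate holds"; a bare
// Option<bool> stands in for the file's disk state here.
fn file_missing_on_disk(disk_exists: Option<bool>) -> bool {
    disk_exists.is_none_or(|exists| !exists)
}
```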

View File

@@ -47,12 +47,9 @@ impl TerminalTool {
}
if which::which("bash").is_ok() {
log::info!("agent selected bash for terminal tool");
"bash".into()
} else {
let shell = get_system_shell();
log::info!("agent selected {shell} for terminal tool");
shell
get_system_shell()
}
});
Self {
@@ -237,7 +234,7 @@ fn process_content(
if is_empty {
"Command executed successfully.".to_string()
} else {
content.to_string()
content
}
}
Some(exit_status) => {
@@ -271,7 +268,7 @@ fn working_dir(
let project = project.read(cx);
let cd = &input.cd;
if cd == "." || cd == "" {
if cd == "." || cd.is_empty() {
// Accept "." or "" as meaning "the one worktree" if we only have one worktree.
let mut worktrees = project.worktrees(cx);
@@ -296,10 +293,8 @@ fn working_dir(
{
return Ok(Some(input_path.into()));
}
} else {
if let Some(worktree) = project.worktree_for_root_name(cd, cx) {
return Ok(Some(worktree.read(cx).abs_path().to_path_buf()));
}
} else if let Some(worktree) = project.worktree_for_root_name(cd, cx) {
return Ok(Some(worktree.read(cx).abs_path().to_path_buf()));
}
anyhow::bail!("`cd` directory {cd:?} was not in any of the project's worktrees.");
@@ -319,7 +314,7 @@ mod tests {
use theme::ThemeSettings;
use util::test::TempTree;
use crate::AgentResponseEvent;
use crate::ThreadEvent;
use super::*;
@@ -396,7 +391,7 @@ mod tests {
});
cx.run_until_parked();
let event = stream_rx.try_next();
if let Ok(Some(Ok(AgentResponseEvent::ToolCallAuthorization(auth)))) = event {
if let Ok(Some(Ok(ThreadEvent::ToolCallAuthorization(auth)))) = event {
auth.response.send(auth.options[0].id.clone()).unwrap();
}
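
Note: `working_dir` now treats `"."` and the empty string identically. A reduced sketch of that normalization, assuming worktrees are plain root-name strings; the real multi-worktree handling has more cases than shown here:

```rust
// Reduced sketch of the `cd` handling in `working_dir` above; worktrees are
// modeled as root-name strings and the multi-worktree branch is simplified.
fn resolve_cd(cd: &str, worktree_roots: &[&str]) -> anyhow::Result<Option<String>> {
    if cd == "." || cd.is_empty() {
        // Accept "." or "" as "the one worktree" only when there is exactly one.
        return match worktree_roots {
            [only] => Ok(Some((*only).to_string())),
            _ => Ok(None),
        };
    }
    if let Some(root) = worktree_roots.iter().find(|root| **root == cd) {
        return Ok(Some((*root).to_string()));
    }
    anyhow::bail!("`cd` directory {cd:?} was not in any of the project's worktrees.")
}
```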

View File

@@ -11,8 +11,7 @@ use crate::{AgentTool, ToolCallEventStream};
/// Use this tool when you need to work through complex problems, develop strategies, or outline approaches before taking action.
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct ThinkingToolInput {
/// Content to think about. This should be a description of what to think about or
/// a problem to solve.
/// Content to think about. This should be a description of what to think about or a problem to solve.
content: String,
}

View File

@@ -14,7 +14,7 @@ use ui::prelude::*;
use web_search::WebSearchRegistry;
/// Search the web for information using your query.
/// Use this when you need real-time information, facts, or data that might not be in your training. \
/// Use this when you need real-time information, facts, or data that might not be in your training.
/// Results will include snippets and links from relevant web pages.
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct WebSearchToolInput {
@@ -80,33 +80,48 @@ impl AgentTool for WebSearchTool {
}
};
let result_text = if response.results.len() == 1 {
"1 result".to_string()
} else {
format!("{} results", response.results.len())
};
event_stream.update_fields(acp::ToolCallUpdateFields {
title: Some(format!("Searched the web: {result_text}")),
content: Some(
response
.results
.iter()
.map(|result| acp::ToolCallContent::Content {
content: acp::ContentBlock::ResourceLink(acp::ResourceLink {
name: result.title.clone(),
uri: result.url.clone(),
title: Some(result.title.clone()),
description: Some(result.text.clone()),
mime_type: None,
annotations: None,
size: None,
}),
})
.collect(),
),
..Default::default()
});
emit_update(&response, &event_stream);
Ok(WebSearchToolOutput(response))
})
}
fn replay(
&self,
_input: Self::Input,
output: Self::Output,
event_stream: ToolCallEventStream,
_cx: &mut App,
) -> Result<()> {
emit_update(&output.0, &event_stream);
Ok(())
}
}
fn emit_update(response: &WebSearchResponse, event_stream: &ToolCallEventStream) {
let result_text = if response.results.len() == 1 {
"1 result".to_string()
} else {
format!("{} results", response.results.len())
};
event_stream.update_fields(acp::ToolCallUpdateFields {
title: Some(format!("Searched the web: {result_text}")),
content: Some(
response
.results
.iter()
.map(|result| acp::ToolCallContent::Content {
content: acp::ContentBlock::ResourceLink(acp::ResourceLink {
name: result.title.clone(),
uri: result.url.clone(),
title: Some(result.title.clone()),
description: Some(result.text.clone()),
mime_type: None,
annotations: None,
size: None,
}),
})
.collect(),
),
..Default::default()
});
}
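
Note: pulling the update into `emit_update` keeps `run` and `replay` rendering identical tool-call fields. A stripped-down sketch of the pattern with stand-in types; only the title is modeled:

```rust
// Stand-in for ToolCallEventStream; only a title update is modeled here.
struct EventStream;

impl EventStream {
    fn update_title(&self, title: String) {
        println!("{title}");
    }
}

fn emit_update(result_count: usize, events: &EventStream) {
    let text = if result_count == 1 {
        "1 result".to_string()
    } else {
        format!("{result_count} results")
    };
    events.update_title(format!("Searched the web: {text}"));
}

// Both the live request path and replay funnel through the same helper.
fn run(result_count: usize, events: &EventStream) {
    emit_update(result_count, events);
}

fn replay(result_count: usize, events: &EventStream) {
    emit_update(result_count, events);
}
```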

View File

@@ -18,6 +18,7 @@ doctest = false
[dependencies]
acp_thread.workspace = true
action_log.workspace = true
agent-client-protocol.workspace = true
agent_settings.workspace = true
agentic-coding-protocol.workspace = true
@@ -28,6 +29,7 @@ futures.workspace = true
gpui.workspace = true
indoc.workspace = true
itertools.workspace = true
language.workspace = true
language_model.workspace = true
language_models.workspace = true
log.workspace = true
@@ -35,6 +37,7 @@ paths.workspace = true
project.workspace = true
rand.workspace = true
schemars.workspace = true
semver.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true

View File

@@ -1,4 +1,5 @@
// Translates old acp agents into the new schema
use action_log::ActionLog;
use agent_client_protocol as acp;
use agentic_coding_protocol::{self as acp_old, AgentRequest as _};
use anyhow::{Context as _, Result, anyhow};
@@ -148,7 +149,7 @@ impl acp_old::Client for OldAcpClientDelegate {
Ok(acp_old::RequestToolCallConfirmationResponse {
id: acp_old::ToolCallId(old_acp_id),
outcome: outcome,
outcome,
})
}
@@ -265,7 +266,7 @@ impl acp_old::Client for OldAcpClientDelegate {
fn into_new_tool_call(id: acp::ToolCallId, request: acp_old::PushToolCallParams) -> acp::ToolCall {
acp::ToolCall {
id: id,
id,
title: request.label,
kind: acp_kind_from_old_icon(request.icon),
status: acp::ToolCallStatus::InProgress,
@@ -443,7 +444,8 @@ impl AgentConnection for AcpConnection {
cx.update(|cx| {
let thread = cx.new(|cx| {
let session_id = acp::SessionId("acp-old-no-id".into());
AcpThread::new(self.name, self.clone(), project, session_id, cx)
let action_log = cx.new(|_| ActionLog::new(project.clone()));
AcpThread::new(self.name, self.clone(), project, action_log, session_id)
});
current_thread.replace(thread.downgrade());
thread
@@ -496,6 +498,14 @@ impl AgentConnection for AcpConnection {
})
}
fn prompt_capabilities(&self) -> acp::PromptCapabilities {
acp::PromptCapabilities {
image: false,
audio: false,
embedded_context: false,
}
}
fn cancel(&self, _session_id: &acp::SessionId, cx: &mut App) {
let task = self
.connection

View File

@@ -1,3 +1,4 @@
use action_log::ActionLog;
use agent_client_protocol::{self as acp, Agent as _};
use anyhow::anyhow;
use collections::HashMap;
@@ -13,13 +14,14 @@ use anyhow::{Context as _, Result};
use gpui::{App, AppContext as _, AsyncApp, Entity, Task, WeakEntity};
use crate::{AgentServerCommand, acp::UnsupportedVersion};
use acp_thread::{AcpThread, AgentConnection, AuthRequired};
use acp_thread::{AcpThread, AgentConnection, AuthRequired, LoadError};
pub struct AcpConnection {
server_name: &'static str,
connection: Rc<acp::ClientSideConnection>,
sessions: Rc<RefCell<HashMap<acp::SessionId, AcpSession>>>,
auth_methods: Vec<acp::AuthMethod>,
prompt_capabilities: acp::PromptCapabilities,
_io_task: Task<Result<()>>,
}
@@ -86,7 +88,9 @@ impl AcpConnection {
for session in sessions.borrow().values() {
session
.thread
.update(cx, |thread, cx| thread.emit_server_exited(status, cx))
.update(cx, |thread, cx| {
thread.emit_load_error(LoadError::Exited { status }, cx)
})
.ok();
}
@@ -116,6 +120,7 @@ impl AcpConnection {
connection: connection.into(),
server_name,
sessions,
prompt_capabilities: response.agent_capabilities.prompt_capabilities,
_io_task: io_task,
})
}
@@ -153,14 +158,14 @@ impl AgentConnection for AcpConnection {
})?;
let session_id = response.session_id;
let thread = cx.new(|cx| {
let action_log = cx.new(|_| ActionLog::new(project.clone()))?;
let thread = cx.new(|_cx| {
AcpThread::new(
self.server_name,
self.clone(),
project,
action_log,
session_id.clone(),
cx,
)
})?;
@@ -203,6 +208,10 @@ impl AgentConnection for AcpConnection {
})
}
fn prompt_capabilities(&self) -> acp::PromptCapabilities {
self.prompt_capabilities
}
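
Note: the capabilities negotiated during `initialize` are now cached on the connection and surfaced through `prompt_capabilities()`. A hedged sketch of how a caller might gate attachment kinds on those flags; the struct below is a simplified stand-in mirroring the three fields in this diff, and "text is always supported" is an assumption of the sketch:

```rust
// Simplified stand-in for agent_client_protocol::PromptCapabilities.
struct PromptCapabilities {
    image: bool,
    audio: bool,
    embedded_context: bool,
}

// Hypothetical helper: which content kinds a message editor could offer for
// this agent. Plain text is assumed to always be supported.
fn allowed_kinds(caps: &PromptCapabilities) -> Vec<&'static str> {
    let mut kinds = vec!["text"];
    if caps.image {
        kinds.push("image");
    }
    if caps.audio {
        kinds.push("audio");
    }
    if caps.embedded_context {
        kinds.push("embedded context");
    }
    kinds
}
```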
fn cancel(&self, session_id: &acp::SessionId, cx: &mut App) {
let conn = self.connection.clone();
let params = acp::CancelNotification {

View File

@@ -104,7 +104,7 @@ impl AgentServerCommand {
cx: &mut AsyncApp,
) -> Option<Self> {
if let Some(agent_settings) = settings {
return Some(Self {
Some(Self {
path: agent_settings.command.path,
args: agent_settings
.command
@@ -113,7 +113,7 @@ impl AgentServerCommand {
.chain(extra_args.iter().map(|arg| arg.to_string()))
.collect(),
env: agent_settings.command.env,
});
})
} else {
match find_bin_in_path(path_bin_name, project, cx).await {
Some(path) => Some(Self {

View File

@@ -1,6 +1,11 @@
mod edit_tool;
mod mcp_server;
mod permission_tool;
mod read_tool;
pub mod tools;
mod write_tool;
use action_log::ActionLog;
use collections::HashMap;
use context_server::listener::McpServerTool;
use language_models::provider::anthropic::AnthropicLanguageModelProvider;
@@ -10,8 +15,9 @@ use smol::process::Child;
use std::any::Any;
use std::cell::RefCell;
use std::fmt::Display;
use std::path::Path;
use std::path::{Path, PathBuf};
use std::rc::Rc;
use util::command::new_smol_command;
use uuid::Uuid;
use agent_client_protocol as acp;
@@ -31,7 +37,7 @@ use util::{ResultExt, debug_panic};
use crate::claude::mcp_server::{ClaudeZedMcpServer, McpConfig};
use crate::claude::tools::ClaudeTool;
use crate::{AgentServer, AgentServerCommand, AllAgentServersSettings};
use acp_thread::{AcpThread, AgentConnection, AuthRequired};
use acp_thread::{AcpThread, AgentConnection, AuthRequired, LoadError, MentionUri};
#[derive(Clone)]
pub struct ClaudeCode;
@@ -98,7 +104,11 @@ impl AgentConnection for ClaudeAgentConnection {
)
.await
else {
anyhow::bail!("Failed to find claude binary");
return Err(LoadError::NotInstalled {
error_message: "Failed to find Claude Code binary".into(),
install_message: "Install Claude Code".into(),
install_command: "npm install -g @anthropic-ai/claude-code@latest".into(),
}.into());
};
let api_key =
@@ -203,20 +213,50 @@ impl AgentConnection for ClaudeAgentConnection {
.await
}
if let Some(status) = child.status().await.log_err() {
if let Some(thread) = thread_rx.recv().await.ok() {
thread
.update(cx, |thread, cx| {
thread.emit_server_exited(status, cx);
})
.ok();
}
if let Some(status) = child.status().await.log_err()
&& let Some(thread) = thread_rx.recv().await.ok()
{
let version = claude_version(command.path.clone(), cx).await.log_err();
let help = claude_help(command.path.clone(), cx).await.log_err();
thread
.update(cx, |thread, cx| {
let error = if let Some(version) = version
&& let Some(help) = help
&& (!help.contains("--input-format")
|| !help.contains("--session-id"))
{
LoadError::Unsupported {
error_message: format!(
"Your installed version of Claude Code ({}, version {}) does not have required features for use with Zed.",
command.path.to_string_lossy(),
version,
)
.into(),
upgrade_message: "Upgrade Claude Code to latest".into(),
upgrade_command: format!(
"{} update",
command.path.to_string_lossy()
),
}
} else {
LoadError::Exited { status }
};
thread.emit_load_error(error, cx);
})
.ok();
}
}
});
let thread = cx.new(|cx| {
AcpThread::new("Claude Code", self.clone(), project, session_id.clone(), cx)
let action_log = cx.new(|_| ActionLog::new(project.clone()))?;
let thread = cx.new(|_cx| {
AcpThread::new(
"Claude Code",
self.clone(),
project,
action_log,
session_id.clone(),
)
})?;
thread_tx.send(thread.downgrade())?;
@@ -259,27 +299,12 @@ impl AgentConnection for ClaudeAgentConnection {
let (end_tx, end_rx) = oneshot::channel();
session.turn_state.replace(TurnState::InProgress { end_tx });
let mut content = String::new();
for chunk in params.prompt {
match chunk {
acp::ContentBlock::Text(text_content) => {
content.push_str(&text_content.text);
}
acp::ContentBlock::ResourceLink(resource_link) => {
content.push_str(&format!("@{}", resource_link.uri));
}
acp::ContentBlock::Audio(_)
| acp::ContentBlock::Image(_)
| acp::ContentBlock::Resource(_) => {
// TODO
}
}
}
let content = acp_content_to_claude(params.prompt);
if let Err(err) = session.outgoing_tx.unbounded_send(SdkMessage::User {
message: Message {
role: Role::User,
content: Content::UntaggedText(content),
content: Content::Chunks(content),
id: None,
model: None,
stop_reason: None,
@@ -294,6 +319,14 @@ impl AgentConnection for ClaudeAgentConnection {
cx.foreground_executor().spawn(async move { end_rx.await? })
}
fn prompt_capabilities(&self) -> acp::PromptCapabilities {
acp::PromptCapabilities {
image: true,
audio: false,
embedded_context: true,
}
}
fn cancel(&self, session_id: &acp::SessionId, _cx: &mut App) {
let sessions = self.sessions.borrow();
let Some(session) = sessions.get(session_id) else {
@@ -358,18 +391,16 @@ fn spawn_claude(
&format!(
"mcp__{}__{}",
mcp_server::SERVER_NAME,
mcp_server::PermissionTool::NAME,
permission_tool::PermissionTool::NAME,
),
"--allowedTools",
&format!(
"mcp__{}__{},mcp__{}__{}",
"mcp__{}__{}",
mcp_server::SERVER_NAME,
mcp_server::EditTool::NAME,
mcp_server::SERVER_NAME,
mcp_server::ReadTool::NAME
read_tool::ReadTool::NAME
),
"--disallowedTools",
"Read,Edit",
"Read,Write,Edit,MultiEdit",
])
.args(match mode {
ClaudeSessionMode::Start => ["--session-id".to_string(), session_id.to_string()],
@@ -388,6 +419,27 @@ fn spawn_claude(
Ok(child)
}
fn claude_version(path: PathBuf, cx: &mut AsyncApp) -> Task<Result<semver::Version>> {
cx.background_spawn(async move {
let output = new_smol_command(path).arg("--version").output().await?;
let output = String::from_utf8(output.stdout)?;
let version = output
.trim()
.strip_suffix(" (Claude Code)")
.context("parsing Claude version")?;
let version = semver::Version::parse(version)?;
anyhow::Ok(version)
})
}
fn claude_help(path: PathBuf, cx: &mut AsyncApp) -> Task<Result<String>> {
cx.background_spawn(async move {
let output = new_smol_command(path).arg("--help").output().await?;
let output = String::from_utf8(output.stdout)?;
anyhow::Ok(output)
})
}
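
Note: `claude_version` and `claude_help` feed the unsupported-version check above; when `--help` lacks `--input-format` or `--session-id`, the exit is reported as `LoadError::Unsupported` rather than a bare exit. A hedged sketch of the version parsing, assuming the CLI prints something like `1.2.3 (Claude Code)`:

```rust
// Hedged sketch of the --version parsing in claude_version above; the output
// shape ("<semver> (Claude Code)") is an assumption taken from the suffix
// stripped in the diff.
fn parse_claude_version(stdout: &str) -> Option<semver::Version> {
    let version = stdout.trim().strip_suffix(" (Claude Code)")?;
    semver::Version::parse(version).ok()
}
```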
struct ClaudeAgentSession {
outgoing_tx: UnboundedSender<SdkMessage>,
turn_state: Rc<RefCell<TurnState>>,
@@ -477,9 +529,16 @@ impl ClaudeAgentSession {
let content = content.to_string();
thread
.update(cx, |thread, cx| {
let id = acp::ToolCallId(tool_use_id.into());
let set_new_content = !content.is_empty()
&& thread.tool_call(&id).is_none_or(|(_, tool_call)| {
// preserve rich diff if we have one
tool_call.diffs().next().is_none()
});
thread.update_tool_call(
acp::ToolCallUpdate {
id: acp::ToolCallId(tool_use_id.into()),
id,
fields: acp::ToolCallUpdateFields {
status: if turn_state.borrow().is_canceled() {
// Do not set to completed if turn was canceled
@@ -487,7 +546,7 @@ impl ClaudeAgentSession {
} else {
Some(acp::ToolCallStatus::Completed)
},
content: (!content.is_empty())
content: set_new_content
.then(|| vec![content.into()]),
..Default::default()
},
@@ -505,10 +564,17 @@ impl ClaudeAgentSession {
chunk
);
}
ContentChunk::Image { source } => {
if !turn_state.borrow().is_canceled() {
thread
.update(cx, |thread, cx| {
thread.push_user_content_block(None, source.into(), cx)
})
.log_err();
}
}
ContentChunk::Image
| ContentChunk::Document
| ContentChunk::WebSearchToolResult => {
ContentChunk::Document | ContentChunk::WebSearchToolResult => {
thread
.update(cx, |thread, cx| {
thread.push_assistant_content_block(
@@ -594,7 +660,14 @@ impl ClaudeAgentSession {
"Should not get tool results with role: assistant. should we handle this?"
);
}
ContentChunk::Image | ContentChunk::Document => {
ContentChunk::Image { source } => {
thread
.update(cx, |thread, cx| {
thread.push_assistant_content_block(source.into(), false, cx)
})
.log_err();
}
ContentChunk::Document => {
thread
.update(cx, |thread, cx| {
thread.push_assistant_content_block(
@@ -722,7 +795,7 @@ impl Content {
pub fn chunks(self) -> impl Iterator<Item = ContentChunk> {
match self {
Self::Chunks(chunks) => chunks.into_iter(),
Self::UntaggedText(text) => vec![ContentChunk::Text { text: text.clone() }].into_iter(),
Self::UntaggedText(text) => vec![ContentChunk::Text { text }].into_iter(),
}
}
}
@@ -760,14 +833,44 @@ enum ContentChunk {
thinking: String,
},
RedactedThinking,
Image {
source: ImageSource,
},
// TODO
Image,
Document,
WebSearchToolResult,
#[serde(untagged)]
UntaggedText(String),
}
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(tag = "type", rename_all = "snake_case")]
enum ImageSource {
Base64 { data: String, media_type: String },
Url { url: String },
}
impl Into<acp::ContentBlock> for ImageSource {
fn into(self) -> acp::ContentBlock {
match self {
ImageSource::Base64 { data, media_type } => {
acp::ContentBlock::Image(acp::ImageContent {
annotations: None,
data,
mime_type: media_type,
uri: None,
})
}
ImageSource::Url { url } => acp::ContentBlock::Image(acp::ImageContent {
annotations: None,
data: "".to_string(),
mime_type: "".to_string(),
uri: Some(url),
}),
}
}
}
impl Display for ContentChunk {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
@@ -776,7 +879,7 @@ impl Display for ContentChunk {
ContentChunk::RedactedThinking => write!(f, "Thinking: [REDACTED]"),
ContentChunk::UntaggedText(text) => write!(f, "{}", text),
ContentChunk::ToolResult { content, .. } => write!(f, "{}", content),
ContentChunk::Image
ContentChunk::Image { .. }
| ContentChunk::Document
| ContentChunk::ToolUse { .. }
| ContentChunk::WebSearchToolResult => {
@@ -888,6 +991,75 @@ impl Display for ResultErrorType {
}
}
fn acp_content_to_claude(prompt: Vec<acp::ContentBlock>) -> Vec<ContentChunk> {
let mut content = Vec::with_capacity(prompt.len());
let mut context = Vec::with_capacity(prompt.len());
for chunk in prompt {
match chunk {
acp::ContentBlock::Text(text_content) => {
content.push(ContentChunk::Text {
text: text_content.text,
});
}
acp::ContentBlock::ResourceLink(resource_link) => {
match MentionUri::parse(&resource_link.uri) {
Ok(uri) => {
content.push(ContentChunk::Text {
text: format!("{}", uri.as_link()),
});
}
Err(_) => {
content.push(ContentChunk::Text {
text: resource_link.uri,
});
}
}
}
acp::ContentBlock::Resource(resource) => match resource.resource {
acp::EmbeddedResourceResource::TextResourceContents(resource) => {
match MentionUri::parse(&resource.uri) {
Ok(uri) => {
content.push(ContentChunk::Text {
text: format!("{}", uri.as_link()),
});
}
Err(_) => {
content.push(ContentChunk::Text {
text: resource.uri.clone(),
});
}
}
context.push(ContentChunk::Text {
text: format!(
"\n<context ref=\"{}\">\n{}\n</context>",
resource.uri, resource.text
),
});
}
acp::EmbeddedResourceResource::BlobResourceContents(_) => {
// Unsupported by SDK
}
},
acp::ContentBlock::Image(acp::ImageContent {
data, mime_type, ..
}) => content.push(ContentChunk::Image {
source: ImageSource::Base64 {
data,
media_type: mime_type,
},
}),
acp::ContentBlock::Audio(_) => {
// Unsupported by SDK
}
}
}
content.extend(context);
content
}
fn new_request_id() -> String {
use rand::Rng;
// In the Claude Code TS SDK they just generate a random 12 character string,
@@ -953,7 +1125,7 @@ pub(crate) mod tests {
thread.read_with(cx, |thread, _| {
entries_len = thread.plan().entries.len();
assert!(thread.plan().entries.len() > 0, "Empty plan");
assert!(!thread.plan().entries.is_empty(), "Empty plan");
});
thread
@@ -1104,4 +1276,100 @@ pub(crate) mod tests {
_ => panic!("Expected ToolResult variant"),
}
}
#[test]
fn test_acp_content_to_claude() {
let acp_content = vec![
acp::ContentBlock::Text(acp::TextContent {
text: "Hello world".to_string(),
annotations: None,
}),
acp::ContentBlock::Image(acp::ImageContent {
data: "base64data".to_string(),
mime_type: "image/png".to_string(),
annotations: None,
uri: None,
}),
acp::ContentBlock::ResourceLink(acp::ResourceLink {
uri: "file:///path/to/example.rs".to_string(),
name: "example.rs".to_string(),
annotations: None,
description: None,
mime_type: None,
size: None,
title: None,
}),
acp::ContentBlock::Resource(acp::EmbeddedResource {
annotations: None,
resource: acp::EmbeddedResourceResource::TextResourceContents(
acp::TextResourceContents {
mime_type: None,
text: "fn main() { println!(\"Hello!\"); }".to_string(),
uri: "file:///path/to/code.rs".to_string(),
},
),
}),
acp::ContentBlock::ResourceLink(acp::ResourceLink {
uri: "invalid_uri_format".to_string(),
name: "invalid.txt".to_string(),
annotations: None,
description: None,
mime_type: None,
size: None,
title: None,
}),
];
let claude_content = acp_content_to_claude(acp_content);
assert_eq!(claude_content.len(), 6);
match &claude_content[0] {
ContentChunk::Text { text } => assert_eq!(text, "Hello world"),
_ => panic!("Expected Text chunk"),
}
match &claude_content[1] {
ContentChunk::Image { source } => match source {
ImageSource::Base64 { data, media_type } => {
assert_eq!(data, "base64data");
assert_eq!(media_type, "image/png");
}
_ => panic!("Expected Base64 image source"),
},
_ => panic!("Expected Image chunk"),
}
match &claude_content[2] {
ContentChunk::Text { text } => {
assert!(text.contains("example.rs"));
assert!(text.contains("file:///path/to/example.rs"));
}
_ => panic!("Expected Text chunk for ResourceLink"),
}
match &claude_content[3] {
ContentChunk::Text { text } => {
assert!(text.contains("code.rs"));
assert!(text.contains("file:///path/to/code.rs"));
}
_ => panic!("Expected Text chunk for Resource"),
}
match &claude_content[4] {
ContentChunk::Text { text } => {
assert_eq!(text, "invalid_uri_format");
}
_ => panic!("Expected Text chunk for invalid URI"),
}
match &claude_content[5] {
ContentChunk::Text { text } => {
assert!(text.contains("<context ref=\"file:///path/to/code.rs\">"));
assert!(text.contains("fn main() { println!(\"Hello!\"); }"));
assert!(text.contains("</context>"));
}
_ => panic!("Expected Text chunk for context"),
}
}
}
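
Note: a reduced sketch of the ordering exercised by `test_acp_content_to_claude` above: inline chunks keep their prompt order, and embedded text resources additionally append a `<context>` block after everything else. `Chunk` and the tuple input are simplified stand-ins for the SDK types:

```rust
// Reduced model of acp_content_to_claude's ordering; Chunk stands in for the
// SDK's ContentChunk, and each input is (inline text or uri, optional embedded
// resource body).
#[derive(Debug, PartialEq)]
enum Chunk {
    Text(String),
}

fn to_claude(prompt: Vec<(String, Option<String>)>) -> Vec<Chunk> {
    let mut content = Vec::with_capacity(prompt.len());
    let mut context = Vec::new();
    for (text, resource) in prompt {
        if let Some(body) = &resource {
            context.push(Chunk::Text(format!(
                "\n<context ref=\"{text}\">\n{body}\n</context>"
            )));
        }
        content.push(Chunk::Text(text));
    }
    content.extend(context);
    content
}
```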

View File

@@ -0,0 +1,178 @@
use acp_thread::AcpThread;
use anyhow::Result;
use context_server::{
listener::{McpServerTool, ToolResponse},
types::{ToolAnnotations, ToolResponseContent},
};
use gpui::{AsyncApp, WeakEntity};
use language::unified_diff;
use util::markdown::MarkdownCodeBlock;
use crate::tools::EditToolParams;
#[derive(Clone)]
pub struct EditTool {
thread_rx: watch::Receiver<WeakEntity<AcpThread>>,
}
impl EditTool {
pub fn new(thread_rx: watch::Receiver<WeakEntity<AcpThread>>) -> Self {
Self { thread_rx }
}
}
impl McpServerTool for EditTool {
type Input = EditToolParams;
type Output = ();
const NAME: &'static str = "Edit";
fn annotations(&self) -> ToolAnnotations {
ToolAnnotations {
title: Some("Edit file".to_string()),
read_only_hint: Some(false),
destructive_hint: Some(false),
open_world_hint: Some(false),
idempotent_hint: Some(false),
}
}
async fn run(
&self,
input: Self::Input,
cx: &mut AsyncApp,
) -> Result<ToolResponse<Self::Output>> {
let mut thread_rx = self.thread_rx.clone();
let Some(thread) = thread_rx.recv().await?.upgrade() else {
anyhow::bail!("Thread closed");
};
let content = thread
.update(cx, |thread, cx| {
thread.read_text_file(input.abs_path.clone(), None, None, true, cx)
})?
.await?;
let (new_content, diff) = cx
.background_executor()
.spawn(async move {
let new_content = content.replace(&input.old_text, &input.new_text);
if new_content == content {
return Err(anyhow::anyhow!("Failed to find `old_text`",));
}
let diff = unified_diff(&content, &new_content);
Ok((new_content, diff))
})
.await?;
thread
.update(cx, |thread, cx| {
thread.write_text_file(input.abs_path, new_content, cx)
})?
.await?;
Ok(ToolResponse {
content: vec![ToolResponseContent::Text {
text: MarkdownCodeBlock {
tag: "diff",
text: diff.as_str().trim_end_matches('\n'),
}
.to_string(),
}],
structured_content: (),
})
}
}
#[cfg(test)]
mod tests {
use std::rc::Rc;
use acp_thread::{AgentConnection, StubAgentConnection};
use gpui::{Entity, TestAppContext};
use indoc::indoc;
use project::{FakeFs, Project};
use serde_json::json;
use settings::SettingsStore;
use util::path;
use super::*;
#[gpui::test]
async fn old_text_not_found(cx: &mut TestAppContext) {
let (_thread, tool) = init_test(cx).await;
let result = tool
.run(
EditToolParams {
abs_path: path!("/root/file.txt").into(),
old_text: "hi".into(),
new_text: "bye".into(),
},
&mut cx.to_async(),
)
.await;
assert_eq!(result.unwrap_err().to_string(), "Failed to find `old_text`");
}
#[gpui::test]
async fn found_and_replaced(cx: &mut TestAppContext) {
let (_thread, tool) = init_test(cx).await;
let result = tool
.run(
EditToolParams {
abs_path: path!("/root/file.txt").into(),
old_text: "hello".into(),
new_text: "hi".into(),
},
&mut cx.to_async(),
)
.await;
assert_eq!(
result.unwrap().content[0].text().unwrap(),
indoc! {
r"
```diff
@@ -1,1 +1,1 @@
-hello
+hi
```
"
}
);
}
async fn init_test(cx: &mut TestAppContext) -> (Entity<AcpThread>, EditTool) {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
});
let connection = Rc::new(StubAgentConnection::new());
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
path!("/root"),
json!({
"file.txt": "hello"
}),
)
.await;
let project = Project::test(fs, [path!("/root").as_ref()], cx).await;
let (mut thread_tx, thread_rx) = watch::channel(WeakEntity::new_invalid());
let thread = cx
.update(|cx| connection.new_thread(project, path!("/test").as_ref(), cx))
.await
.unwrap();
thread_tx.send(thread.downgrade()).unwrap();
(thread, EditTool::new(thread_rx))
}
}
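
Note: the heart of `EditTool::run` above is a plain string replacement that must actually change something. A minimal sketch of that step, with the `unified_diff` rendering left out:

```rust
// Minimal sketch of the edit step in EditTool::run above: apply the
// replacement and fail if old_text was not found; diff rendering is omitted.
fn apply_edit(content: &str, old_text: &str, new_text: &str) -> anyhow::Result<String> {
    let new_content = content.replace(old_text, new_text);
    anyhow::ensure!(new_content != content, "Failed to find `old_text`");
    Ok(new_content)
}
```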

View File

@@ -1,23 +1,22 @@
use std::path::PathBuf;
use std::sync::Arc;
use crate::claude::tools::{ClaudeTool, EditToolParams, ReadToolParams};
use crate::claude::edit_tool::EditTool;
use crate::claude::permission_tool::PermissionTool;
use crate::claude::read_tool::ReadTool;
use crate::claude::write_tool::WriteTool;
use acp_thread::AcpThread;
use agent_client_protocol as acp;
use agent_settings::AgentSettings;
use anyhow::{Context, Result};
#[cfg(not(test))]
use anyhow::Context as _;
use anyhow::Result;
use collections::HashMap;
use context_server::listener::{McpServerTool, ToolResponse};
use context_server::types::{
Implementation, InitializeParams, InitializeResponse, ProtocolVersion, ServerCapabilities,
ToolAnnotations, ToolResponseContent, ToolsCapabilities, requests,
ToolsCapabilities, requests,
};
use gpui::{App, AsyncApp, Task, WeakEntity};
use project::Fs;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings as _, update_settings_file};
use util::debug_panic;
use serde::Serialize;
pub struct ClaudeZedMcpServer {
server: context_server::listener::McpServer,
@@ -34,16 +33,10 @@ impl ClaudeZedMcpServer {
let mut mcp_server = context_server::listener::McpServer::new(cx).await?;
mcp_server.handle_request::<requests::Initialize>(Self::handle_initialize);
mcp_server.add_tool(PermissionTool {
thread_rx: thread_rx.clone(),
fs: fs.clone(),
});
mcp_server.add_tool(ReadTool {
thread_rx: thread_rx.clone(),
});
mcp_server.add_tool(EditTool {
thread_rx: thread_rx.clone(),
});
mcp_server.add_tool(PermissionTool::new(fs.clone(), thread_rx.clone()));
mcp_server.add_tool(ReadTool::new(thread_rx.clone()));
mcp_server.add_tool(EditTool::new(thread_rx.clone()));
mcp_server.add_tool(WriteTool::new(thread_rx.clone()));
Ok(Self { server: mcp_server })
}
@@ -104,249 +97,3 @@ pub struct McpServerConfig {
#[serde(skip_serializing_if = "Option::is_none")]
pub env: Option<HashMap<String, String>>,
}
// Tools
#[derive(Clone)]
pub struct PermissionTool {
fs: Arc<dyn Fs>,
thread_rx: watch::Receiver<WeakEntity<AcpThread>>,
}
#[derive(Deserialize, JsonSchema, Debug)]
pub struct PermissionToolParams {
tool_name: String,
input: serde_json::Value,
tool_use_id: Option<String>,
}
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
pub struct PermissionToolResponse {
behavior: PermissionToolBehavior,
updated_input: serde_json::Value,
}
#[derive(Serialize)]
#[serde(rename_all = "snake_case")]
enum PermissionToolBehavior {
Allow,
Deny,
}
impl McpServerTool for PermissionTool {
type Input = PermissionToolParams;
type Output = ();
const NAME: &'static str = "Confirmation";
fn description(&self) -> &'static str {
"Request permission for tool calls"
}
async fn run(
&self,
input: Self::Input,
cx: &mut AsyncApp,
) -> Result<ToolResponse<Self::Output>> {
if agent_settings::AgentSettings::try_read_global(cx, |settings| {
settings.always_allow_tool_actions
})
.unwrap_or(false)
{
let response = PermissionToolResponse {
behavior: PermissionToolBehavior::Allow,
updated_input: input.input,
};
return Ok(ToolResponse {
content: vec![ToolResponseContent::Text {
text: serde_json::to_string(&response)?,
}],
structured_content: (),
});
}
let mut thread_rx = self.thread_rx.clone();
let Some(thread) = thread_rx.recv().await?.upgrade() else {
anyhow::bail!("Thread closed");
};
let claude_tool = ClaudeTool::infer(&input.tool_name, input.input.clone());
let tool_call_id = acp::ToolCallId(input.tool_use_id.context("Tool ID required")?.into());
const ALWAYS_ALLOW: &'static str = "always_allow";
const ALLOW: &'static str = "allow";
const REJECT: &'static str = "reject";
let chosen_option = thread
.update(cx, |thread, cx| {
thread.request_tool_call_authorization(
claude_tool.as_acp(tool_call_id).into(),
vec![
acp::PermissionOption {
id: acp::PermissionOptionId(ALWAYS_ALLOW.into()),
name: "Always Allow".into(),
kind: acp::PermissionOptionKind::AllowAlways,
},
acp::PermissionOption {
id: acp::PermissionOptionId(ALLOW.into()),
name: "Allow".into(),
kind: acp::PermissionOptionKind::AllowOnce,
},
acp::PermissionOption {
id: acp::PermissionOptionId(REJECT.into()),
name: "Reject".into(),
kind: acp::PermissionOptionKind::RejectOnce,
},
],
cx,
)
})??
.await?;
let response = match chosen_option.0.as_ref() {
ALWAYS_ALLOW => {
cx.update(|cx| {
update_settings_file::<AgentSettings>(self.fs.clone(), cx, |settings, _| {
settings.set_always_allow_tool_actions(true);
});
})?;
PermissionToolResponse {
behavior: PermissionToolBehavior::Allow,
updated_input: input.input,
}
}
ALLOW => PermissionToolResponse {
behavior: PermissionToolBehavior::Allow,
updated_input: input.input,
},
REJECT => PermissionToolResponse {
behavior: PermissionToolBehavior::Deny,
updated_input: input.input,
},
opt => {
debug_panic!("Unexpected option: {}", opt);
PermissionToolResponse {
behavior: PermissionToolBehavior::Deny,
updated_input: input.input,
}
}
};
Ok(ToolResponse {
content: vec![ToolResponseContent::Text {
text: serde_json::to_string(&response)?,
}],
structured_content: (),
})
}
}
#[derive(Clone)]
pub struct ReadTool {
thread_rx: watch::Receiver<WeakEntity<AcpThread>>,
}
impl McpServerTool for ReadTool {
type Input = ReadToolParams;
type Output = ();
const NAME: &'static str = "Read";
fn description(&self) -> &'static str {
"Read the contents of a file. In sessions with mcp__zed__Read always use it instead of Read as it contains the most up-to-date contents."
}
fn annotations(&self) -> ToolAnnotations {
ToolAnnotations {
title: Some("Read file".to_string()),
read_only_hint: Some(true),
destructive_hint: Some(false),
open_world_hint: Some(false),
idempotent_hint: None,
}
}
async fn run(
&self,
input: Self::Input,
cx: &mut AsyncApp,
) -> Result<ToolResponse<Self::Output>> {
let mut thread_rx = self.thread_rx.clone();
let Some(thread) = thread_rx.recv().await?.upgrade() else {
anyhow::bail!("Thread closed");
};
let content = thread
.update(cx, |thread, cx| {
thread.read_text_file(input.abs_path, input.offset, input.limit, false, cx)
})?
.await?;
Ok(ToolResponse {
content: vec![ToolResponseContent::Text { text: content }],
structured_content: (),
})
}
}
#[derive(Clone)]
pub struct EditTool {
thread_rx: watch::Receiver<WeakEntity<AcpThread>>,
}
impl McpServerTool for EditTool {
type Input = EditToolParams;
type Output = ();
const NAME: &'static str = "Edit";
fn description(&self) -> &'static str {
"Edits a file. In sessions with mcp__zed__Edit always use it instead of Edit as it will show the diff to the user better."
}
fn annotations(&self) -> ToolAnnotations {
ToolAnnotations {
title: Some("Edit file".to_string()),
read_only_hint: Some(false),
destructive_hint: Some(false),
open_world_hint: Some(false),
idempotent_hint: Some(false),
}
}
async fn run(
&self,
input: Self::Input,
cx: &mut AsyncApp,
) -> Result<ToolResponse<Self::Output>> {
let mut thread_rx = self.thread_rx.clone();
let Some(thread) = thread_rx.recv().await?.upgrade() else {
anyhow::bail!("Thread closed");
};
let content = thread
.update(cx, |thread, cx| {
thread.read_text_file(input.abs_path.clone(), None, None, true, cx)
})?
.await?;
let new_content = content.replace(&input.old_text, &input.new_text);
if new_content == content {
return Err(anyhow::anyhow!("The old_text was not found in the content"));
}
thread
.update(cx, |thread, cx| {
thread.write_text_file(input.abs_path, new_content, cx)
})?
.await?;
Ok(ToolResponse {
content: vec![],
structured_content: (),
})
}
}

View File

@@ -0,0 +1,158 @@
use std::sync::Arc;
use acp_thread::AcpThread;
use agent_client_protocol as acp;
use agent_settings::AgentSettings;
use anyhow::{Context as _, Result};
use context_server::{
listener::{McpServerTool, ToolResponse},
types::ToolResponseContent,
};
use gpui::{AsyncApp, WeakEntity};
use project::Fs;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings as _, update_settings_file};
use util::debug_panic;
use crate::tools::ClaudeTool;
#[derive(Clone)]
pub struct PermissionTool {
fs: Arc<dyn Fs>,
thread_rx: watch::Receiver<WeakEntity<AcpThread>>,
}
/// Request permission for tool calls
#[derive(Deserialize, JsonSchema, Debug)]
pub struct PermissionToolParams {
tool_name: String,
input: serde_json::Value,
tool_use_id: Option<String>,
}
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
pub struct PermissionToolResponse {
behavior: PermissionToolBehavior,
updated_input: serde_json::Value,
}
#[derive(Serialize)]
#[serde(rename_all = "snake_case")]
enum PermissionToolBehavior {
Allow,
Deny,
}
impl PermissionTool {
pub fn new(fs: Arc<dyn Fs>, thread_rx: watch::Receiver<WeakEntity<AcpThread>>) -> Self {
Self { fs, thread_rx }
}
}
impl McpServerTool for PermissionTool {
type Input = PermissionToolParams;
type Output = ();
const NAME: &'static str = "Confirmation";
async fn run(
&self,
input: Self::Input,
cx: &mut AsyncApp,
) -> Result<ToolResponse<Self::Output>> {
if agent_settings::AgentSettings::try_read_global(cx, |settings| {
settings.always_allow_tool_actions
})
.unwrap_or(false)
{
let response = PermissionToolResponse {
behavior: PermissionToolBehavior::Allow,
updated_input: input.input,
};
return Ok(ToolResponse {
content: vec![ToolResponseContent::Text {
text: serde_json::to_string(&response)?,
}],
structured_content: (),
});
}
let mut thread_rx = self.thread_rx.clone();
let Some(thread) = thread_rx.recv().await?.upgrade() else {
anyhow::bail!("Thread closed");
};
let claude_tool = ClaudeTool::infer(&input.tool_name, input.input.clone());
let tool_call_id = acp::ToolCallId(input.tool_use_id.context("Tool ID required")?.into());
const ALWAYS_ALLOW: &str = "always_allow";
const ALLOW: &str = "allow";
const REJECT: &str = "reject";
let chosen_option = thread
.update(cx, |thread, cx| {
thread.request_tool_call_authorization(
claude_tool.as_acp(tool_call_id).into(),
vec![
acp::PermissionOption {
id: acp::PermissionOptionId(ALWAYS_ALLOW.into()),
name: "Always Allow".into(),
kind: acp::PermissionOptionKind::AllowAlways,
},
acp::PermissionOption {
id: acp::PermissionOptionId(ALLOW.into()),
name: "Allow".into(),
kind: acp::PermissionOptionKind::AllowOnce,
},
acp::PermissionOption {
id: acp::PermissionOptionId(REJECT.into()),
name: "Reject".into(),
kind: acp::PermissionOptionKind::RejectOnce,
},
],
cx,
)
})??
.await?;
let response = match chosen_option.0.as_ref() {
ALWAYS_ALLOW => {
cx.update(|cx| {
update_settings_file::<AgentSettings>(self.fs.clone(), cx, |settings, _| {
settings.set_always_allow_tool_actions(true);
});
})?;
PermissionToolResponse {
behavior: PermissionToolBehavior::Allow,
updated_input: input.input,
}
}
ALLOW => PermissionToolResponse {
behavior: PermissionToolBehavior::Allow,
updated_input: input.input,
},
REJECT => PermissionToolResponse {
behavior: PermissionToolBehavior::Deny,
updated_input: input.input,
},
opt => {
debug_panic!("Unexpected option: {}", opt);
PermissionToolResponse {
behavior: PermissionToolBehavior::Deny,
updated_input: input.input,
}
}
};
Ok(ToolResponse {
content: vec![ToolResponseContent::Text {
text: serde_json::to_string(&response)?,
}],
structured_content: (),
})
}
}
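
Note: the permission flow boils down to mapping the chosen option id onto an allow/deny response, plus a settings write for "always allow". A hedged sketch of that mapping with the settings update left out:

```rust
// Hedged sketch of the option-id to behavior mapping in PermissionTool::run
// above; the "always_allow" settings write is omitted.
enum Behavior {
    Allow,
    Deny,
}

fn behavior_for(option_id: &str) -> Behavior {
    match option_id {
        "always_allow" | "allow" => Behavior::Allow,
        "reject" => Behavior::Deny,
        // The real code debug-panics on an unknown id and then denies.
        _ => Behavior::Deny,
    }
}
```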

View File

@@ -0,0 +1,59 @@
use acp_thread::AcpThread;
use anyhow::Result;
use context_server::{
listener::{McpServerTool, ToolResponse},
types::{ToolAnnotations, ToolResponseContent},
};
use gpui::{AsyncApp, WeakEntity};
use crate::tools::ReadToolParams;
#[derive(Clone)]
pub struct ReadTool {
thread_rx: watch::Receiver<WeakEntity<AcpThread>>,
}
impl ReadTool {
pub fn new(thread_rx: watch::Receiver<WeakEntity<AcpThread>>) -> Self {
Self { thread_rx }
}
}
impl McpServerTool for ReadTool {
type Input = ReadToolParams;
type Output = ();
const NAME: &'static str = "Read";
fn annotations(&self) -> ToolAnnotations {
ToolAnnotations {
title: Some("Read file".to_string()),
read_only_hint: Some(true),
destructive_hint: Some(false),
open_world_hint: Some(false),
idempotent_hint: None,
}
}
async fn run(
&self,
input: Self::Input,
cx: &mut AsyncApp,
) -> Result<ToolResponse<Self::Output>> {
let mut thread_rx = self.thread_rx.clone();
let Some(thread) = thread_rx.recv().await?.upgrade() else {
anyhow::bail!("Thread closed");
};
let content = thread
.update(cx, |thread, cx| {
thread.read_text_file(input.abs_path, input.offset, input.limit, false, cx)
})?
.await?;
Ok(ToolResponse {
content: vec![ToolResponseContent::Text { text: content }],
structured_content: (),
})
}
}

View File

@@ -34,6 +34,7 @@ impl ClaudeTool {
// Known tools
"mcp__zed__Read" => Self::ReadFile(serde_json::from_value(input).log_err()),
"mcp__zed__Edit" => Self::Edit(serde_json::from_value(input).log_err()),
"mcp__zed__Write" => Self::Write(serde_json::from_value(input).log_err()),
"MultiEdit" => Self::MultiEdit(serde_json::from_value(input).log_err()),
"Write" => Self::Write(serde_json::from_value(input).log_err()),
"LS" => Self::Ls(serde_json::from_value(input).log_err()),
@@ -57,7 +58,7 @@ impl ClaudeTool {
Self::Terminal(None)
} else {
Self::Other {
name: tool_name.to_string(),
name: tool_name,
input,
}
}
@@ -93,7 +94,7 @@ impl ClaudeTool {
}
Self::MultiEdit(None) => "Multi Edit".into(),
Self::Write(Some(params)) => {
format!("Write {}", params.file_path.display())
format!("Write {}", params.abs_path.display())
}
Self::Write(None) => "Write".into(),
Self::Glob(Some(params)) => {
@@ -153,7 +154,7 @@ impl ClaudeTool {
}],
Self::Write(Some(params)) => vec![acp::ToolCallContent::Diff {
diff: acp::Diff {
path: params.file_path.clone(),
path: params.abs_path.clone(),
old_text: None,
new_text: params.content.clone(),
},
@@ -229,7 +230,10 @@ impl ClaudeTool {
line: None,
}]
}
Self::Write(Some(WriteToolParams { file_path, .. })) => {
Self::Write(Some(WriteToolParams {
abs_path: file_path,
..
})) => {
vec![acp::ToolCallLocation {
path: file_path.clone(),
line: None,
@@ -302,6 +306,20 @@ impl ClaudeTool {
}
}
/// Edit a file.
///
/// In sessions with mcp__zed__Edit always use it instead of Edit as it will
/// allow the user to conveniently review changes.
///
/// File editing instructions:
/// - The `old_text` param must match existing file content, including indentation.
/// - The `old_text` param must come from the actual file, not an outline.
/// - The `old_text` section must not be empty.
/// - Be minimal with replacements:
/// - For unique lines, include only those lines.
/// - For non-unique lines, include enough context to identify them.
/// - Do not escape quotes, newlines, or other characters.
/// - Only edit the specified file.
#[derive(Deserialize, JsonSchema, Debug)]
pub struct EditToolParams {
/// The absolute path to the file to read.
@@ -312,6 +330,11 @@ pub struct EditToolParams {
pub new_text: String,
}
/// Reads the content of the given file in the project.
///
/// Never attempt to read a path that hasn't been previously mentioned.
///
/// In sessions with mcp__zed__Read always use it instead of Read as it contains the most up-to-date contents.
#[derive(Deserialize, JsonSchema, Debug)]
pub struct ReadToolParams {
/// The absolute path to the file to read.
@@ -324,11 +347,15 @@ pub struct ReadToolParams {
pub limit: Option<u32>,
}
/// Writes content to the specified file in the project.
///
/// In sessions with mcp__zed__Write always use it instead of Write as it will
/// allow the user to conveniently review changes.
#[derive(Deserialize, JsonSchema, Debug)]
pub struct WriteToolParams {
/// Absolute path for new file
pub file_path: PathBuf,
/// File content
/// The absolute path of the file to write.
pub abs_path: PathBuf,
/// The full content to write.
pub content: String,
}

View File

@@ -0,0 +1,59 @@
use acp_thread::AcpThread;
use anyhow::Result;
use context_server::{
listener::{McpServerTool, ToolResponse},
types::ToolAnnotations,
};
use gpui::{AsyncApp, WeakEntity};
use crate::tools::WriteToolParams;
#[derive(Clone)]
pub struct WriteTool {
thread_rx: watch::Receiver<WeakEntity<AcpThread>>,
}
impl WriteTool {
pub fn new(thread_rx: watch::Receiver<WeakEntity<AcpThread>>) -> Self {
Self { thread_rx }
}
}
impl McpServerTool for WriteTool {
type Input = WriteToolParams;
type Output = ();
const NAME: &'static str = "Write";
fn annotations(&self) -> ToolAnnotations {
ToolAnnotations {
title: Some("Write file".to_string()),
read_only_hint: Some(false),
destructive_hint: Some(false),
open_world_hint: Some(false),
idempotent_hint: Some(false),
}
}
async fn run(
&self,
input: Self::Input,
cx: &mut AsyncApp,
) -> Result<ToolResponse<Self::Output>> {
let mut thread_rx = self.thread_rx.clone();
let Some(thread) = thread_rx.recv().await?.upgrade() else {
anyhow::bail!("Thread closed");
};
thread
.update(cx, |thread, cx| {
thread.write_text_file(input.abs_path, input.content, cx)
})?
.await?;
Ok(ToolResponse {
content: vec![],
structured_content: (),
})
}
}

View File

@@ -428,12 +428,9 @@ pub async fn new_test_thread(
.await
.unwrap();
let thread = cx
.update(|cx| connection.new_thread(project.clone(), current_dir.as_ref(), cx))
cx.update(|cx| connection.new_thread(project.clone(), current_dir.as_ref(), cx))
.await
.unwrap();
thread
.unwrap()
}
pub async fn run_until_first_tool_call(
@@ -471,7 +468,7 @@ pub fn get_zed_path() -> PathBuf {
while zed_path
.file_name()
.map_or(true, |name| name.to_string_lossy() != "debug")
.is_none_or(|name| name.to_string_lossy() != "debug")
{
if !zed_path.pop() {
panic!("Could not find target directory");

View File

@@ -26,7 +26,7 @@ impl AgentServer for Gemini {
}
fn empty_state_message(&self) -> &'static str {
"Ask questions, edit files, run commands.\nBe specific for the best results."
"Ask questions, edit files, run commands"
}
fn logo(&self) -> ui::IconName {
@@ -50,7 +50,11 @@ impl AgentServer for Gemini {
let Some(command) =
AgentServerCommand::resolve("gemini", &[ACP_ARG], None, settings, &project, cx).await
else {
anyhow::bail!("Failed to find gemini binary");
return Err(LoadError::NotInstalled {
error_message: "Failed to find Gemini CLI binary".into(),
install_message: "Install Gemini CLI".into(),
install_command: "npm install -g @google/gemini-cli@latest".into()
}.into());
};
let result = crate::acp::connect(server_name, command.clone(), &root_dir, cx).await;
@@ -75,10 +79,11 @@ impl AgentServer for Gemini {
if !supported {
return Err(LoadError::Unsupported {
error_message: format!(
"Your installed version of Gemini {} doesn't support the Agentic Coding Protocol (ACP).",
"Your installed version of Gemini CLI ({}, version {}) doesn't support the Agentic Coding Protocol (ACP).",
command.path.to_string_lossy(),
current_version
).into(),
upgrade_message: "Upgrade Gemini to Latest".into(),
upgrade_message: "Upgrade Gemini CLI to latest".into(),
upgrade_command: "npm install -g @google/gemini-cli@latest".into(),
}.into())
}
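
Note: a missing binary is now surfaced as a structured `LoadError::NotInstalled` carrying an install command, instead of a bare `bail!`. A hedged sketch of that shape; the enum is reduced to the one variant and the binary lookup is a plain `Option`:

```rust
// Reduced sketch of the structured "not installed" error introduced above; the
// variant fields follow the diff, everything else is simplified.
use std::path::PathBuf;

enum LoadError {
    NotInstalled {
        error_message: String,
        install_message: String,
        install_command: String,
    },
}

fn require_gemini(found: Option<PathBuf>) -> Result<PathBuf, LoadError> {
    found.ok_or_else(|| LoadError::NotInstalled {
        error_message: "Failed to find Gemini CLI binary".into(),
        install_message: "Install Gemini CLI".into(),
        install_command: "npm install -g @google/gemini-cli@latest".into(),
    })
}
```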

View File

@@ -58,7 +58,7 @@ impl AgentProfileSettings {
|| self
.context_servers
.get(server_id)
.map_or(false, |preset| preset.tools.get(tool_name) == Some(&true))
.is_some_and(|preset| preset.tools.get(tool_name) == Some(&true))
}
}
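
Note: `is_some_and` replaces the `map_or(false, ...)` pattern here. A one-line illustration with a plain map standing in for the context-server preset:

```rust
// is_some_and reads as "there is a preset and the tool is enabled in it".
use std::collections::HashMap;

fn tool_enabled(preset: Option<&HashMap<String, bool>>, tool_name: &str) -> bool {
    preset.is_some_and(|tools| tools.get(tool_name) == Some(&true))
}
```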

View File

@@ -15,6 +15,8 @@ pub use crate::agent_profile::*;
pub const SUMMARIZE_THREAD_PROMPT: &str =
include_str!("../../agent/src/prompts/summarize_thread_prompt.txt");
pub const SUMMARIZE_THREAD_DETAILED_PROMPT: &str =
include_str!("../../agent/src/prompts/summarize_thread_detailed_prompt.txt");
pub fn init(cx: &mut App) {
AgentSettings::register(cx);
@@ -116,15 +118,15 @@ pub struct LanguageModelParameters {
impl LanguageModelParameters {
pub fn matches(&self, model: &Arc<dyn LanguageModel>) -> bool {
if let Some(provider) = &self.provider {
if provider.0 != model.provider_id().0 {
return false;
}
if let Some(provider) = &self.provider
&& provider.0 != model.provider_id().0
{
return false;
}
if let Some(setting_model) = &self.model {
if *setting_model != model.id().0 {
return false;
}
if let Some(setting_model) = &self.model
&& *setting_model != model.id().0
{
return false;
}
true
}
@@ -503,9 +505,8 @@ impl Settings for AgentSettings {
}
}
debug_assert_eq!(
sources.default.always_allow_tool_actions.unwrap_or(false),
false,
debug_assert!(
!sources.default.always_allow_tool_actions.unwrap_or(false),
"For security, agent.always_allow_tool_actions should always be false in default.json. If it's true, that is a bug that should be fixed!"
);

View File

@@ -104,9 +104,11 @@ zed_actions.workspace = true
[dev-dependencies]
acp_thread = { workspace = true, features = ["test-support"] }
agent = { workspace = true, features = ["test-support"] }
agent2 = { workspace = true, features = ["test-support"] }
assistant_context = { workspace = true, features = ["test-support"] }
assistant_tools.workspace = true
buffer_diff = { workspace = true, features = ["test-support"] }
db = { workspace = true, features = ["test-support"] }
editor = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, "features" = ["test-support"] }
indoc.workspace = true

View File

@@ -3,8 +3,10 @@ mod entry_view_state;
mod message_editor;
mod model_selector;
mod model_selector_popover;
mod thread_history;
mod thread_view;
pub use model_selector::AcpModelSelector;
pub use model_selector_popover::AcpModelSelectorPopover;
pub use thread_history::*;
pub use thread_view::AcpThreadView;

View File

@@ -1,8 +1,12 @@
use std::cell::Cell;
use std::ops::Range;
use std::rc::Rc;
use std::sync::Arc;
use std::sync::atomic::AtomicBool;
use acp_thread::MentionUri;
use agent_client_protocol as acp;
use agent2::{HistoryEntry, HistoryStore};
use anyhow::Result;
use editor::{CompletionProvider, Editor, ExcerptId};
use fuzzy::{StringMatch, StringMatchCandidate};
@@ -18,25 +22,21 @@ use text::{Anchor, ToPoint as _};
use ui::prelude::*;
use workspace::Workspace;
use agent::thread_store::{TextThreadStore, ThreadStore};
use crate::AgentPanel;
use crate::acp::message_editor::MessageEditor;
use crate::context_picker::file_context_picker::{FileMatch, search_files};
use crate::context_picker::rules_context_picker::{RulesContextEntry, search_rules};
use crate::context_picker::symbol_context_picker::SymbolMatch;
use crate::context_picker::symbol_context_picker::search_symbols;
use crate::context_picker::thread_context_picker::{
ThreadContextEntry, ThreadMatch, search_threads,
};
use crate::context_picker::{
ContextPickerAction, ContextPickerEntry, ContextPickerMode, RecentEntry,
available_context_picker_entries, recent_context_picker_entries, selection_ranges,
ContextPickerAction, ContextPickerEntry, ContextPickerMode, selection_ranges,
};
pub(crate) enum Match {
File(FileMatch),
Symbol(SymbolMatch),
Thread(ThreadMatch),
Thread(HistoryEntry),
RecentThread(HistoryEntry),
Fetch(SharedString),
Rules(RulesContextEntry),
Entry(EntryMatch),
@@ -53,6 +53,7 @@ impl Match {
Match::File(file) => file.mat.score,
Match::Entry(mode) => mode.mat.as_ref().map(|mat| mat.score).unwrap_or(1.),
Match::Thread(_) => 1.,
Match::RecentThread(_) => 1.,
Match::Symbol(_) => 1.,
Match::Rules(_) => 1.,
Match::Fetch(_) => 1.,
@@ -60,209 +61,28 @@ impl Match {
}
}
fn search(
mode: Option<ContextPickerMode>,
query: String,
cancellation_flag: Arc<AtomicBool>,
recent_entries: Vec<RecentEntry>,
prompt_store: Option<Entity<PromptStore>>,
thread_store: WeakEntity<ThreadStore>,
text_thread_context_store: WeakEntity<assistant_context::ContextStore>,
workspace: Entity<Workspace>,
cx: &mut App,
) -> Task<Vec<Match>> {
match mode {
Some(ContextPickerMode::File) => {
let search_files_task =
search_files(query.clone(), cancellation_flag.clone(), &workspace, cx);
cx.background_spawn(async move {
search_files_task
.await
.into_iter()
.map(Match::File)
.collect()
})
}
Some(ContextPickerMode::Symbol) => {
let search_symbols_task =
search_symbols(query.clone(), cancellation_flag.clone(), &workspace, cx);
cx.background_spawn(async move {
search_symbols_task
.await
.into_iter()
.map(Match::Symbol)
.collect()
})
}
Some(ContextPickerMode::Thread) => {
if let Some((thread_store, context_store)) = thread_store
.upgrade()
.zip(text_thread_context_store.upgrade())
{
let search_threads_task = search_threads(
query.clone(),
cancellation_flag.clone(),
thread_store,
context_store,
cx,
);
cx.background_spawn(async move {
search_threads_task
.await
.into_iter()
.map(Match::Thread)
.collect()
})
} else {
Task::ready(Vec::new())
}
}
Some(ContextPickerMode::Fetch) => {
if !query.is_empty() {
Task::ready(vec![Match::Fetch(query.into())])
} else {
Task::ready(Vec::new())
}
}
Some(ContextPickerMode::Rules) => {
if let Some(prompt_store) = prompt_store.as_ref() {
let search_rules_task =
search_rules(query.clone(), cancellation_flag.clone(), prompt_store, cx);
cx.background_spawn(async move {
search_rules_task
.await
.into_iter()
.map(Match::Rules)
.collect::<Vec<_>>()
})
} else {
Task::ready(Vec::new())
}
}
None => {
if query.is_empty() {
let mut matches = recent_entries
.into_iter()
.map(|entry| match entry {
RecentEntry::File {
project_path,
path_prefix,
} => Match::File(FileMatch {
mat: fuzzy::PathMatch {
score: 1.,
positions: Vec::new(),
worktree_id: project_path.worktree_id.to_usize(),
path: project_path.path,
path_prefix,
is_dir: false,
distance_to_relative_ancestor: 0,
},
is_recent: true,
}),
RecentEntry::Thread(thread_context_entry) => Match::Thread(ThreadMatch {
thread: thread_context_entry,
is_recent: true,
}),
})
.collect::<Vec<_>>();
matches.extend(
available_context_picker_entries(
&prompt_store,
&Some(thread_store.clone()),
&workspace,
cx,
)
.into_iter()
.map(|mode| {
Match::Entry(EntryMatch {
entry: mode,
mat: None,
})
}),
);
Task::ready(matches)
} else {
let executor = cx.background_executor().clone();
let search_files_task =
search_files(query.clone(), cancellation_flag.clone(), &workspace, cx);
let entries = available_context_picker_entries(
&prompt_store,
&Some(thread_store.clone()),
&workspace,
cx,
);
let entry_candidates = entries
.iter()
.enumerate()
.map(|(ix, entry)| StringMatchCandidate::new(ix, entry.keyword()))
.collect::<Vec<_>>();
cx.background_spawn(async move {
let mut matches = search_files_task
.await
.into_iter()
.map(Match::File)
.collect::<Vec<_>>();
let entry_matches = fuzzy::match_strings(
&entry_candidates,
&query,
false,
true,
100,
&Arc::new(AtomicBool::default()),
executor,
)
.await;
matches.extend(entry_matches.into_iter().map(|mat| {
Match::Entry(EntryMatch {
entry: entries[mat.candidate_id],
mat: Some(mat),
})
}));
matches.sort_by(|a, b| {
b.score()
.partial_cmp(&a.score())
.unwrap_or(std::cmp::Ordering::Equal)
});
matches
})
}
}
}
}
pub struct ContextPickerCompletionProvider {
workspace: WeakEntity<Workspace>,
thread_store: WeakEntity<ThreadStore>,
text_thread_store: WeakEntity<TextThreadStore>,
message_editor: WeakEntity<MessageEditor>,
workspace: WeakEntity<Workspace>,
history_store: Entity<HistoryStore>,
prompt_store: Option<Entity<PromptStore>>,
prompt_capabilities: Rc<Cell<acp::PromptCapabilities>>,
}
impl ContextPickerCompletionProvider {
pub fn new(
workspace: WeakEntity<Workspace>,
thread_store: WeakEntity<ThreadStore>,
text_thread_store: WeakEntity<TextThreadStore>,
message_editor: WeakEntity<MessageEditor>,
workspace: WeakEntity<Workspace>,
history_store: Entity<HistoryStore>,
prompt_store: Option<Entity<PromptStore>>,
prompt_capabilities: Rc<Cell<acp::PromptCapabilities>>,
) -> Self {
Self {
workspace,
thread_store,
text_thread_store,
message_editor,
workspace,
history_store,
prompt_store,
prompt_capabilities,
}
}
@@ -275,7 +95,7 @@ impl ContextPickerCompletionProvider {
) -> Option<Completion> {
match entry {
ContextPickerEntry::Mode(mode) => Some(Completion {
replace_range: source_range.clone(),
replace_range: source_range,
new_text: format!("@{} ", mode.keyword()),
label: CodeLabel::plain(mode.label().to_string(), None),
icon_path: Some(mode.icon().path().into()),
@@ -332,7 +152,7 @@ impl ContextPickerCompletionProvider {
};
Some(Completion {
replace_range: source_range.clone(),
replace_range: source_range,
new_text,
label: CodeLabel::plain(action.label().to_string(), None),
icon_path: Some(action.icon().path().into()),
@@ -349,22 +169,13 @@ impl ContextPickerCompletionProvider {
}
fn completion_for_thread(
thread_entry: ThreadContextEntry,
thread_entry: HistoryEntry,
source_range: Range<Anchor>,
recent: bool,
editor: WeakEntity<MessageEditor>,
cx: &mut App,
) -> Completion {
let uri = match &thread_entry {
ThreadContextEntry::Thread { id, title } => MentionUri::Thread {
id: id.clone(),
name: title.to_string(),
},
ThreadContextEntry::Context { path, title } => MentionUri::TextThread {
path: path.to_path_buf(),
name: title.to_string(),
},
};
let uri = thread_entry.mention_uri();
let icon_for_completion = if recent {
IconName::HistoryRerun.path().into()
@@ -382,7 +193,7 @@ impl ContextPickerCompletionProvider {
documentation: None,
insert_text_mode: None,
source: project::CompletionSource::Custom,
icon_path: Some(icon_for_completion.clone()),
icon_path: Some(icon_for_completion),
confirm: Some(confirm_completion_callback(
thread_entry.title().clone(),
source_range.start,
@@ -413,9 +224,9 @@ impl ContextPickerCompletionProvider {
documentation: None,
insert_text_mode: None,
source: project::CompletionSource::Custom,
icon_path: Some(icon_path.clone()),
icon_path: Some(icon_path),
confirm: Some(confirm_completion_callback(
rule.title.clone(),
rule.title,
source_range.start,
new_text_len - 1,
editor,
@@ -455,7 +266,7 @@ impl ContextPickerCompletionProvider {
let completion_icon_path = if is_recent {
IconName::HistoryRerun.path().into()
} else {
crease_icon_path.clone()
crease_icon_path
};
let new_text = format!("{} ", uri.as_link());
@@ -504,10 +315,10 @@ impl ContextPickerCompletionProvider {
label,
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(icon_path.clone()),
icon_path: Some(icon_path),
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
symbol.name.clone().into(),
symbol.name.into(),
source_range.start,
new_text_len - 1,
message_editor,
@@ -522,7 +333,7 @@ impl ContextPickerCompletionProvider {
message_editor: WeakEntity<MessageEditor>,
cx: &mut App,
) -> Option<Completion> {
let new_text = format!("@fetch {} ", url_to_fetch.clone());
let new_text = format!("@fetch {} ", url_to_fetch);
let url_to_fetch = url::Url::parse(url_to_fetch.as_ref())
.or_else(|_| url::Url::parse(&format!("https://{url_to_fetch}")))
.ok()?;
@@ -536,7 +347,7 @@ impl ContextPickerCompletionProvider {
label: CodeLabel::plain(url_to_fetch.to_string(), None),
documentation: None,
source: project::CompletionSource::Custom,
icon_path: Some(icon_path.clone()),
icon_path: Some(icon_path),
insert_text_mode: None,
confirm: Some(confirm_completion_callback(
url_to_fetch.to_string().into(),
@@ -547,6 +358,255 @@ impl ContextPickerCompletionProvider {
)),
})
}
fn search(
&self,
mode: Option<ContextPickerMode>,
query: String,
cancellation_flag: Arc<AtomicBool>,
cx: &mut App,
) -> Task<Vec<Match>> {
let Some(workspace) = self.workspace.upgrade() else {
return Task::ready(Vec::default());
};
match mode {
Some(ContextPickerMode::File) => {
let search_files_task = search_files(query, cancellation_flag, &workspace, cx);
cx.background_spawn(async move {
search_files_task
.await
.into_iter()
.map(Match::File)
.collect()
})
}
Some(ContextPickerMode::Symbol) => {
let search_symbols_task = search_symbols(query, cancellation_flag, &workspace, cx);
cx.background_spawn(async move {
search_symbols_task
.await
.into_iter()
.map(Match::Symbol)
.collect()
})
}
Some(ContextPickerMode::Thread) => {
let search_threads_task =
search_threads(query, cancellation_flag, &self.history_store, cx);
cx.background_spawn(async move {
search_threads_task
.await
.into_iter()
.map(Match::Thread)
.collect()
})
}
Some(ContextPickerMode::Fetch) => {
if !query.is_empty() {
Task::ready(vec![Match::Fetch(query.into())])
} else {
Task::ready(Vec::new())
}
}
Some(ContextPickerMode::Rules) => {
if let Some(prompt_store) = self.prompt_store.as_ref() {
let search_rules_task =
search_rules(query, cancellation_flag, prompt_store, cx);
cx.background_spawn(async move {
search_rules_task
.await
.into_iter()
.map(Match::Rules)
.collect::<Vec<_>>()
})
} else {
Task::ready(Vec::new())
}
}
None if query.is_empty() => {
let mut matches = self.recent_context_picker_entries(&workspace, cx);
matches.extend(
self.available_context_picker_entries(&workspace, cx)
.into_iter()
.map(|mode| {
Match::Entry(EntryMatch {
entry: mode,
mat: None,
})
}),
);
Task::ready(matches)
}
None => {
let executor = cx.background_executor().clone();
let search_files_task =
search_files(query.clone(), cancellation_flag, &workspace, cx);
let entries = self.available_context_picker_entries(&workspace, cx);
let entry_candidates = entries
.iter()
.enumerate()
.map(|(ix, entry)| StringMatchCandidate::new(ix, entry.keyword()))
.collect::<Vec<_>>();
cx.background_spawn(async move {
let mut matches = search_files_task
.await
.into_iter()
.map(Match::File)
.collect::<Vec<_>>();
let entry_matches = fuzzy::match_strings(
&entry_candidates,
&query,
false,
true,
100,
&Arc::new(AtomicBool::default()),
executor,
)
.await;
matches.extend(entry_matches.into_iter().map(|mat| {
Match::Entry(EntryMatch {
entry: entries[mat.candidate_id],
mat: Some(mat),
})
}));
matches.sort_by(|a, b| {
b.score()
.partial_cmp(&a.score())
.unwrap_or(std::cmp::Ordering::Equal)
});
matches
})
}
}
}
fn recent_context_picker_entries(
&self,
workspace: &Entity<Workspace>,
cx: &mut App,
) -> Vec<Match> {
let mut recent = Vec::with_capacity(6);
let mut mentions = self
.message_editor
.read_with(cx, |message_editor, _cx| message_editor.mentions())
.unwrap_or_default();
let workspace = workspace.read(cx);
let project = workspace.project().read(cx);
if let Some(agent_panel) = workspace.panel::<AgentPanel>(cx)
&& let Some(thread) = agent_panel.read(cx).active_agent_thread(cx)
{
let thread = thread.read(cx);
mentions.insert(MentionUri::Thread {
id: thread.session_id().clone(),
name: thread.title().into(),
});
}
recent.extend(
workspace
.recent_navigation_history_iter(cx)
.filter(|(_, abs_path)| {
abs_path.as_ref().is_none_or(|path| {
!mentions.contains(&MentionUri::File {
abs_path: path.clone(),
})
})
})
.take(4)
.filter_map(|(project_path, _)| {
project
.worktree_for_id(project_path.worktree_id, cx)
.map(|worktree| {
let path_prefix = worktree.read(cx).root_name().into();
Match::File(FileMatch {
mat: fuzzy::PathMatch {
score: 1.,
positions: Vec::new(),
worktree_id: project_path.worktree_id.to_usize(),
path: project_path.path,
path_prefix,
is_dir: false,
distance_to_relative_ancestor: 0,
},
is_recent: true,
})
})
}),
);
if self.prompt_capabilities.get().embedded_context {
const RECENT_COUNT: usize = 2;
let threads = self
.history_store
.read(cx)
.recently_opened_entries(cx)
.into_iter()
.filter(|thread| !mentions.contains(&thread.mention_uri()))
.take(RECENT_COUNT)
.collect::<Vec<_>>();
recent.extend(threads.into_iter().map(Match::RecentThread));
}
recent
}
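A minimal, standalone sketch of the selection logic in recent_context_picker_entries above, assuming plain strings in place of Zed's project paths and mention URIs, with a single `mentioned` set standing in for both kinds of mention; the `RecentMatch` type is hypothetical and exists only for this illustration:

use std::collections::HashSet;

// Hypothetical stand-in for the real Match variants; illustration only.
#[derive(Debug)]
enum RecentMatch {
    File(String),
    Thread(String),
}

fn recent_entries(
    recent_files: &[&str],
    recent_threads: &[&str],
    mentioned: &HashSet<&str>,
    embedded_context: bool,
) -> Vec<RecentMatch> {
    // Recently opened files, minus anything already mentioned, capped at four.
    let mut recent: Vec<RecentMatch> = recent_files
        .iter()
        .filter(|path| !mentioned.contains(*path))
        .take(4)
        .map(|path| RecentMatch::File((*path).to_string()))
        .collect();
    // Recent threads are only offered when the agent supports embedded context.
    if embedded_context {
        const RECENT_COUNT: usize = 2;
        recent.extend(
            recent_threads
                .iter()
                .filter(|uri| !mentioned.contains(*uri))
                .take(RECENT_COUNT)
                .map(|uri| RecentMatch::Thread((*uri).to_string())),
        );
    }
    recent
}

fn main() {
    let mentioned: HashSet<&str> = ["src/lib.rs"].into_iter().collect();
    let matches = recent_entries(
        &["src/lib.rs", "src/main.rs", "Cargo.toml"],
        &["thread-1", "thread-2", "thread-3"],
        &mentioned,
        true,
    );
    // Two files (lib.rs is filtered out as already mentioned) followed by two threads.
    println!("{matches:?}");
}

Already-mentioned items are skipped before the take limits apply, so the recent list never duplicates context the user has already attached.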
fn available_context_picker_entries(
&self,
workspace: &Entity<Workspace>,
cx: &mut App,
) -> Vec<ContextPickerEntry> {
let embedded_context = self.prompt_capabilities.get().embedded_context;
let mut entries = if embedded_context {
vec![
ContextPickerEntry::Mode(ContextPickerMode::File),
ContextPickerEntry::Mode(ContextPickerMode::Symbol),
ContextPickerEntry::Mode(ContextPickerMode::Thread),
]
} else {
// File mentions are always available, so no explicit mode entry is needed (see the sketch after this function)
vec![]
};
let has_selection = workspace
.read(cx)
.active_item(cx)
.and_then(|item| item.downcast::<Editor>())
.is_some_and(|editor| {
editor.update(cx, |editor, cx| editor.has_non_empty_selection(cx))
});
if has_selection {
entries.push(ContextPickerEntry::Action(
ContextPickerAction::AddSelections,
));
}
if embedded_context {
if self.prompt_store.is_some() {
entries.push(ContextPickerEntry::Mode(ContextPickerMode::Rules));
}
entries.push(ContextPickerEntry::Mode(ContextPickerMode::Fetch));
}
entries
}
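A standalone sketch of the gating in available_context_picker_entries, with plain booleans replacing the prompt-capability, prompt-store, and editor-selection checks; `PickerEntry` is a hypothetical stand-in for the real entry types:

// Illustrative stand-in for ContextPickerEntry / ContextPickerMode / ContextPickerAction.
#[derive(Debug)]
enum PickerEntry {
    FileMode,
    SymbolMode,
    ThreadMode,
    RulesMode,
    FetchMode,
    AddSelections,
}

fn available_entries(
    embedded_context: bool,
    has_prompt_store: bool,
    has_selection: bool,
) -> Vec<PickerEntry> {
    let mut entries = if embedded_context {
        vec![
            PickerEntry::FileMode,
            PickerEntry::SymbolMode,
            PickerEntry::ThreadMode,
        ]
    } else {
        // File mentions still work; only the explicit mode entry is omitted.
        Vec::new()
    };
    if has_selection {
        entries.push(PickerEntry::AddSelections);
    }
    if embedded_context {
        if has_prompt_store {
            entries.push(PickerEntry::RulesMode);
        }
        entries.push(PickerEntry::FetchMode);
    }
    entries
}

fn main() {
    // An agent without embedded-context support only ever sees AddSelections.
    println!("{:?}", available_entries(false, true, true));
    // A full-capability agent with a prompt store and an active selection.
    println!("{:?}", available_entries(true, true, true));
}

When the agent does not support embedded context, only the AddSelections action can appear; @file mentions keep working, they just get no dedicated mode entry.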
}
fn build_code_label_for_full_path(file_name: &str, directory: Option<&str>, cx: &App) -> CodeLabel {
@@ -581,7 +641,11 @@ impl CompletionProvider for ContextPickerCompletionProvider {
let offset_to_line = buffer.point_to_offset(line_start);
let mut lines = buffer.text_for_range(line_start..position).lines();
let line = lines.next()?;
MentionCompletion::try_parse(line, offset_to_line)
MentionCompletion::try_parse(
self.prompt_capabilities.get().embedded_context,
line,
offset_to_line,
)
});
let Some(state) = state else {
return Task::ready(Ok(Vec::new()));
@@ -596,45 +660,12 @@ impl CompletionProvider for ContextPickerCompletionProvider {
let source_range = snapshot.anchor_before(state.source_range.start)
..snapshot.anchor_after(state.source_range.end);
let thread_store = self.thread_store.clone();
let text_thread_store = self.text_thread_store.clone();
let editor = self.message_editor.clone();
let Ok((exclude_paths, exclude_threads)) =
self.message_editor.update(cx, |message_editor, _cx| {
message_editor.mentioned_path_and_threads()
})
else {
return Task::ready(Ok(Vec::new()));
};
let MentionCompletion { mode, argument, .. } = state;
let query = argument.unwrap_or_else(|| "".to_string());
let recent_entries = recent_context_picker_entries(
Some(thread_store.clone()),
Some(text_thread_store.clone()),
workspace.clone(),
&exclude_paths,
&exclude_threads,
cx,
);
let prompt_store = thread_store
.read_with(cx, |thread_store, _cx| thread_store.prompt_store().clone())
.ok()
.flatten();
let search_task = search(
mode,
query,
Arc::<AtomicBool>::default(),
recent_entries,
prompt_store,
thread_store.clone(),
text_thread_store.clone(),
workspace.clone(),
cx,
);
let search_task = self.search(mode, query, Arc::<AtomicBool>::default(), cx);
cx.spawn(async move |_, cx| {
let matches = search_task.await;
@@ -669,12 +700,18 @@ impl CompletionProvider for ContextPickerCompletionProvider {
cx,
),
Match::Thread(ThreadMatch {
thread, is_recent, ..
}) => Some(Self::completion_for_thread(
Match::Thread(thread) => Some(Self::completion_for_thread(
thread,
source_range.clone(),
is_recent,
false,
editor.clone(),
cx,
)),
Match::RecentThread(thread) => Some(Self::completion_for_thread(
thread,
source_range.clone(),
true,
editor.clone(),
cx,
)),
@@ -728,12 +765,16 @@ impl CompletionProvider for ContextPickerCompletionProvider {
let offset_to_line = buffer.point_to_offset(line_start);
let mut lines = buffer.text_for_range(line_start..position).lines();
if let Some(line) = lines.next() {
MentionCompletion::try_parse(line, offset_to_line)
.map(|completion| {
completion.source_range.start <= offset_to_line + position.column as usize
&& completion.source_range.end >= offset_to_line + position.column as usize
})
.unwrap_or(false)
MentionCompletion::try_parse(
self.prompt_capabilities.get().embedded_context,
line,
offset_to_line,
)
.map(|completion| {
completion.source_range.start <= offset_to_line + position.column as usize
&& completion.source_range.end >= offset_to_line + position.column as usize
})
.unwrap_or(false)
} else {
false
}
@@ -748,6 +789,42 @@ impl CompletionProvider for ContextPickerCompletionProvider {
}
}
pub(crate) fn search_threads(
query: String,
cancellation_flag: Arc<AtomicBool>,
history_store: &Entity<HistoryStore>,
cx: &mut App,
) -> Task<Vec<HistoryEntry>> {
let threads = history_store.read(cx).entries(cx);
if query.is_empty() {
return Task::ready(threads);
}
let executor = cx.background_executor().clone();
cx.background_spawn(async move {
let candidates = threads
.iter()
.enumerate()
.map(|(id, thread)| StringMatchCandidate::new(id, thread.title()))
.collect::<Vec<_>>();
let matches = fuzzy::match_strings(
&candidates,
&query,
false,
true,
100,
&cancellation_flag,
executor,
)
.await;
matches
.into_iter()
.map(|mat| threads[mat.candidate_id].clone())
.collect()
})
}
fn confirm_completion_callback(
crease_text: SharedString,
start: Anchor,
@@ -763,14 +840,16 @@ fn confirm_completion_callback(
message_editor
.clone()
.update(cx, |message_editor, cx| {
message_editor.confirm_completion(
crease_text,
start,
content_len,
mention_uri,
window,
cx,
)
message_editor
.confirm_completion(
crease_text,
start,
content_len,
mention_uri,
window,
cx,
)
.detach();
})
.ok();
});
@@ -786,7 +865,7 @@ struct MentionCompletion {
}
impl MentionCompletion {
fn try_parse(line: &str, offset_to_line: usize) -> Option<Self> {
fn try_parse(allow_non_file_mentions: bool, line: &str, offset_to_line: usize) -> Option<Self> {
let last_mention_start = line.rfind('@')?;
if last_mention_start >= line.len() {
return Some(Self::default());
@@ -795,7 +874,7 @@ impl MentionCompletion {
&& line
.chars()
.nth(last_mention_start - 1)
.map_or(false, |c| !c.is_whitespace())
.is_some_and(|c| !c.is_whitespace())
{
return None;
}
@@ -810,7 +889,9 @@ impl MentionCompletion {
if let Some(mode_text) = parts.next() {
end += mode_text.len();
if let Some(parsed_mode) = ContextPickerMode::try_from(mode_text).ok() {
if let Some(parsed_mode) = ContextPickerMode::try_from(mode_text).ok()
&& (allow_non_file_mentions || matches!(parsed_mode, ContextPickerMode::File))
{
mode = Some(parsed_mode);
} else {
argument = Some(mode_text.to_string());
@@ -843,10 +924,10 @@ mod tests {
#[test]
fn test_mention_completion_parse() {
assert_eq!(MentionCompletion::try_parse("Lorem Ipsum", 0), None);
assert_eq!(MentionCompletion::try_parse(true, "Lorem Ipsum", 0), None);
assert_eq!(
MentionCompletion::try_parse("Lorem @", 0),
MentionCompletion::try_parse(true, "Lorem @", 0),
Some(MentionCompletion {
source_range: 6..7,
mode: None,
@@ -855,7 +936,7 @@ mod tests {
);
assert_eq!(
MentionCompletion::try_parse("Lorem @file", 0),
MentionCompletion::try_parse(true, "Lorem @file", 0),
Some(MentionCompletion {
source_range: 6..11,
mode: Some(ContextPickerMode::File),
@@ -864,7 +945,7 @@ mod tests {
);
assert_eq!(
MentionCompletion::try_parse("Lorem @file ", 0),
MentionCompletion::try_parse(true, "Lorem @file ", 0),
Some(MentionCompletion {
source_range: 6..12,
mode: Some(ContextPickerMode::File),
@@ -873,7 +954,7 @@ mod tests {
);
assert_eq!(
MentionCompletion::try_parse("Lorem @file main.rs", 0),
MentionCompletion::try_parse(true, "Lorem @file main.rs", 0),
Some(MentionCompletion {
source_range: 6..19,
mode: Some(ContextPickerMode::File),
@@ -882,7 +963,7 @@ mod tests {
);
assert_eq!(
MentionCompletion::try_parse("Lorem @file main.rs ", 0),
MentionCompletion::try_parse(true, "Lorem @file main.rs ", 0),
Some(MentionCompletion {
source_range: 6..19,
mode: Some(ContextPickerMode::File),
@@ -891,7 +972,7 @@ mod tests {
);
assert_eq!(
MentionCompletion::try_parse("Lorem @file main.rs Ipsum", 0),
MentionCompletion::try_parse(true, "Lorem @file main.rs Ipsum", 0),
Some(MentionCompletion {
source_range: 6..19,
mode: Some(ContextPickerMode::File),
@@ -900,7 +981,7 @@ mod tests {
);
assert_eq!(
MentionCompletion::try_parse("Lorem @main", 0),
MentionCompletion::try_parse(true, "Lorem @main", 0),
Some(MentionCompletion {
source_range: 6..11,
mode: None,
@@ -908,6 +989,28 @@ mod tests {
})
);
assert_eq!(MentionCompletion::try_parse("test@", 0), None);
assert_eq!(MentionCompletion::try_parse(true, "test@", 0), None);
// Allowed non-file mentions
assert_eq!(
MentionCompletion::try_parse(true, "Lorem @symbol main", 0),
Some(MentionCompletion {
source_range: 6..18,
mode: Some(ContextPickerMode::Symbol),
argument: Some("main".to_string()),
})
);
// Disallowed non-file mentions
assert_eq!(
MentionCompletion::try_parse(false, "Lorem @symbol main", 0),
Some(MentionCompletion {
source_range: 6..18,
mode: None,
argument: Some("main".to_string()),
})
);
}
}


@@ -1,7 +1,7 @@
use std::ops::Range;
use acp_thread::{AcpThread, AgentThreadEntry};
use agent::{TextThreadStore, ThreadStore};
use agent2::HistoryStore;
use collections::HashMap;
use editor::{Editor, EditorMode, MinimapVisibility};
use gpui::{
@@ -10,6 +10,7 @@ use gpui::{
};
use language::language_settings::SoftWrap;
use project::Project;
use prompt_store::PromptStore;
use settings::Settings as _;
use terminal_view::TerminalView;
use theme::ThemeSettings;
@@ -21,8 +22,8 @@ use crate::acp::message_editor::{MessageEditor, MessageEditorEvent};
pub struct EntryViewState {
workspace: WeakEntity<Workspace>,
project: Entity<Project>,
thread_store: Entity<ThreadStore>,
text_thread_store: Entity<TextThreadStore>,
history_store: Entity<HistoryStore>,
prompt_store: Option<Entity<PromptStore>>,
entries: Vec<Entry>,
prevent_slash_commands: bool,
}
@@ -31,15 +32,15 @@ impl EntryViewState {
pub fn new(
workspace: WeakEntity<Workspace>,
project: Entity<Project>,
thread_store: Entity<ThreadStore>,
text_thread_store: Entity<TextThreadStore>,
history_store: Entity<HistoryStore>,
prompt_store: Option<Entity<PromptStore>>,
prevent_slash_commands: bool,
) -> Self {
Self {
workspace,
project,
thread_store,
text_thread_store,
history_store,
prompt_store,
entries: Vec::new(),
prevent_slash_commands,
}
@@ -77,8 +78,8 @@ impl EntryViewState {
let mut editor = MessageEditor::new(
self.workspace.clone(),
self.project.clone(),
self.thread_store.clone(),
self.text_thread_store.clone(),
self.history_store.clone(),
self.prompt_store.clone(),
"Edit message @ to include context",
self.prevent_slash_commands,
editor::EditorMode::AutoHeight {
@@ -189,6 +190,7 @@ pub enum ViewEvent {
MessageEditorEvent(Entity<MessageEditor>, MessageEditorEvent),
}
#[derive(Debug)]
pub enum Entry {
UserMessage(Entity<MessageEditor>),
Content(HashMap<EntityId, AnyEntity>),
@@ -312,9 +314,10 @@ mod tests {
use std::{path::Path, rc::Rc};
use acp_thread::{AgentConnection, StubAgentConnection};
use agent::{TextThreadStore, ThreadStore};
use agent_client_protocol as acp;
use agent_settings::AgentSettings;
use agent2::HistoryStore;
use assistant_context::ContextStore;
use buffer_diff::{DiffHunkStatus, DiffHunkStatusKind};
use editor::{EditorSettings, RowInfo};
use fs::FakeFs;
@@ -377,15 +380,15 @@ mod tests {
connection.send_update(session_id, acp::SessionUpdate::ToolCall(tool_call), cx)
});
let thread_store = cx.new(|cx| ThreadStore::fake(project.clone(), cx));
let text_thread_store = cx.new(|cx| TextThreadStore::fake(project.clone(), cx));
let context_store = cx.new(|cx| ContextStore::fake(project.clone(), cx));
let history_store = cx.new(|cx| HistoryStore::new(context_store, cx));
let view_state = cx.new(|_cx| {
EntryViewState::new(
workspace.downgrade(),
project.clone(),
thread_store,
text_thread_store,
history_store,
None,
false,
)
});

File diff suppressed because it is too large

@@ -0,0 +1,766 @@
use crate::RemoveSelectedThread;
use agent2::{HistoryEntry, HistoryStore};
use chrono::{Datelike as _, Local, NaiveDate, TimeDelta};
use editor::{Editor, EditorEvent};
use fuzzy::{StringMatch, StringMatchCandidate};
use gpui::{
App, Empty, Entity, EventEmitter, FocusHandle, Focusable, ScrollStrategy, Stateful, Task,
UniformListScrollHandle, Window, uniform_list,
};
use std::{fmt::Display, ops::Range, sync::Arc};
use time::{OffsetDateTime, UtcOffset};
use ui::{
HighlightedLabel, IconButtonShape, ListItem, ListItemSpacing, Scrollbar, ScrollbarState,
Tooltip, prelude::*,
};
use util::ResultExt;
pub struct AcpThreadHistory {
pub(crate) history_store: Entity<HistoryStore>,
scroll_handle: UniformListScrollHandle,
selected_index: usize,
hovered_index: Option<usize>,
search_editor: Entity<Editor>,
all_entries: Arc<Vec<HistoryEntry>>,
// When the search is empty, we display date separators between history entries
// This vector holds either a separator or an actual entry (see the worked example after this struct)
separated_items: Vec<ListItemType>,
// Maps entry indexes to list item indexes
separated_item_indexes: Vec<u32>,
_separated_items_task: Option<Task<()>>,
search_state: SearchState,
scrollbar_visibility: bool,
scrollbar_state: ScrollbarState,
local_timezone: UtcOffset,
_subscriptions: Vec<gpui::Subscription>,
}
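A worked example of how separated_items and separated_item_indexes line up, assuming simplified day-based buckets in place of the real calendar bucketing; the `Bucket` and `Item` types here are illustrative stand-ins:

// Simplified bucket labels; the real code derives these from calendar dates.
#[derive(Debug, PartialEq, Clone, Copy)]
enum Bucket {
    Today,
    Yesterday,
    Older,
}

#[derive(Debug)]
enum Item {
    Separator(Bucket),
    Entry(usize), // index into the flat entry list
}

fn bucket_for(days_ago: u32) -> Bucket {
    match days_ago {
        0 => Bucket::Today,
        1 => Bucket::Yesterday,
        _ => Bucket::Older,
    }
}

fn separate(entry_ages: &[u32]) -> (Vec<Item>, Vec<u32>) {
    let mut items = Vec::new();
    let mut indexes = Vec::new();
    let mut current = None;
    for (index, days_ago) in entry_ages.iter().enumerate() {
        let bucket = bucket_for(*days_ago);
        if Some(bucket) != current {
            current = Some(bucket);
            items.push(Item::Separator(bucket));
        }
        // Record where this entry landed in the separated list.
        indexes.push(items.len() as u32);
        items.push(Item::Entry(index));
    }
    (items, indexes)
}

fn main() {
    // Entries sorted newest-first: two from today, one from yesterday.
    let (items, indexes) = separate(&[0, 0, 1]);
    // items:   [Separator(Today), Entry(0), Entry(1), Separator(Yesterday), Entry(2)]
    // indexes: [1, 2, 4]
    println!("{items:?}");
    println!("{indexes:?}");
}

set_selected_entry_index later uses this index map to translate an entry index into the position of the corresponding list item when scrolling.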
enum SearchState {
Empty,
Searching {
query: SharedString,
_task: Task<()>,
},
Searched {
query: SharedString,
matches: Vec<StringMatch>,
},
}
enum ListItemType {
BucketSeparator(TimeBucket),
Entry {
index: usize,
format: EntryTimeFormat,
},
}
pub enum ThreadHistoryEvent {
Open(HistoryEntry),
}
impl EventEmitter<ThreadHistoryEvent> for AcpThreadHistory {}
impl AcpThreadHistory {
pub(crate) fn new(
history_store: Entity<agent2::HistoryStore>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let search_editor = cx.new(|cx| {
let mut editor = Editor::single_line(window, cx);
editor.set_placeholder_text("Search threads...", cx);
editor
});
let search_editor_subscription =
cx.subscribe(&search_editor, |this, search_editor, event, cx| {
if let EditorEvent::BufferEdited = event {
let query = search_editor.read(cx).text(cx);
this.search(query.into(), cx);
}
});
let history_store_subscription = cx.observe(&history_store, |this, _, cx| {
this.update_all_entries(cx);
});
let scroll_handle = UniformListScrollHandle::default();
let scrollbar_state = ScrollbarState::new(scroll_handle.clone());
let mut this = Self {
history_store,
scroll_handle,
selected_index: 0,
hovered_index: None,
search_state: SearchState::Empty,
all_entries: Default::default(),
separated_items: Default::default(),
separated_item_indexes: Default::default(),
search_editor,
scrollbar_visibility: true,
scrollbar_state,
local_timezone: UtcOffset::from_whole_seconds(
chrono::Local::now().offset().local_minus_utc(),
)
.unwrap(),
_subscriptions: vec![search_editor_subscription, history_store_subscription],
_separated_items_task: None,
};
this.update_all_entries(cx);
this
}
fn update_all_entries(&mut self, cx: &mut Context<Self>) {
let new_entries: Arc<Vec<HistoryEntry>> = self
.history_store
.update(cx, |store, cx| store.entries(cx))
.into();
self._separated_items_task.take();
let mut items = Vec::with_capacity(new_entries.len() + 1);
let mut indexes = Vec::with_capacity(new_entries.len() + 1);
let bg_task = cx.background_spawn(async move {
let mut bucket = None;
let today = Local::now().naive_local().date();
for (index, entry) in new_entries.iter().enumerate() {
let entry_date = entry
.updated_at()
.with_timezone(&Local)
.naive_local()
.date();
let entry_bucket = TimeBucket::from_dates(today, entry_date);
if Some(entry_bucket) != bucket {
bucket = Some(entry_bucket);
items.push(ListItemType::BucketSeparator(entry_bucket));
}
indexes.push(items.len() as u32);
items.push(ListItemType::Entry {
index,
format: entry_bucket.into(),
});
}
(new_entries, items, indexes)
});
let task = cx.spawn(async move |this, cx| {
let (new_entries, items, indexes) = bg_task.await;
this.update(cx, |this, cx| {
let previously_selected_entry =
this.all_entries.get(this.selected_index).map(|e| e.id());
this.all_entries = new_entries;
this.separated_items = items;
this.separated_item_indexes = indexes;
match &this.search_state {
SearchState::Empty => {
if this.selected_index >= this.all_entries.len() {
this.set_selected_entry_index(
this.all_entries.len().saturating_sub(1),
cx,
);
} else if let Some(prev_id) = previously_selected_entry
&& let Some(new_ix) = this
.all_entries
.iter()
.position(|probe| probe.id() == prev_id)
{
this.set_selected_entry_index(new_ix, cx);
}
}
SearchState::Searching { query, .. } | SearchState::Searched { query, .. } => {
this.search(query.clone(), cx);
}
}
cx.notify();
})
.log_err();
});
self._separated_items_task = Some(task);
}
fn search(&mut self, query: SharedString, cx: &mut Context<Self>) {
if query.is_empty() {
self.search_state = SearchState::Empty;
cx.notify();
return;
}
let all_entries = self.all_entries.clone();
let fuzzy_search_task = cx.background_spawn({
let query = query.clone();
let executor = cx.background_executor().clone();
async move {
let mut candidates = Vec::with_capacity(all_entries.len());
for (idx, entry) in all_entries.iter().enumerate() {
candidates.push(StringMatchCandidate::new(idx, entry.title()));
}
const MAX_MATCHES: usize = 100;
fuzzy::match_strings(
&candidates,
&query,
false,
true,
MAX_MATCHES,
&Default::default(),
executor,
)
.await
}
});
let task = cx.spawn({
let query = query.clone();
async move |this, cx| {
let matches = fuzzy_search_task.await;
this.update(cx, |this, cx| {
let SearchState::Searching {
query: current_query,
_task,
} = &this.search_state
else {
return;
};
if &query == current_query {
this.search_state = SearchState::Searched {
query: query.clone(),
matches,
};
this.set_selected_entry_index(0, cx);
cx.notify();
};
})
.log_err();
}
});
self.search_state = SearchState::Searching { query, _task: task };
cx.notify();
}
fn matched_count(&self) -> usize {
match &self.search_state {
SearchState::Empty => self.all_entries.len(),
SearchState::Searching { .. } => 0,
SearchState::Searched { matches, .. } => matches.len(),
}
}
fn list_item_count(&self) -> usize {
match &self.search_state {
SearchState::Empty => self.separated_items.len(),
SearchState::Searching { .. } => 0,
SearchState::Searched { matches, .. } => matches.len(),
}
}
fn search_produced_no_matches(&self) -> bool {
match &self.search_state {
SearchState::Empty => false,
SearchState::Searching { .. } => false,
SearchState::Searched { matches, .. } => matches.is_empty(),
}
}
fn get_match(&self, ix: usize) -> Option<&HistoryEntry> {
match &self.search_state {
SearchState::Empty => self.all_entries.get(ix),
SearchState::Searching { .. } => None,
SearchState::Searched { matches, .. } => matches
.get(ix)
.and_then(|m| self.all_entries.get(m.candidate_id)),
}
}
pub fn select_previous(
&mut self,
_: &menu::SelectPrevious,
_window: &mut Window,
cx: &mut Context<Self>,
) {
let count = self.matched_count();
if count > 0 {
if self.selected_index == 0 {
self.set_selected_entry_index(count - 1, cx);
} else {
self.set_selected_entry_index(self.selected_index - 1, cx);
}
}
}
pub fn select_next(
&mut self,
_: &menu::SelectNext,
_window: &mut Window,
cx: &mut Context<Self>,
) {
let count = self.matched_count();
if count > 0 {
if self.selected_index == count - 1 {
self.set_selected_entry_index(0, cx);
} else {
self.set_selected_entry_index(self.selected_index + 1, cx);
}
}
}
fn select_first(
&mut self,
_: &menu::SelectFirst,
_window: &mut Window,
cx: &mut Context<Self>,
) {
let count = self.matched_count();
if count > 0 {
self.set_selected_entry_index(0, cx);
}
}
fn select_last(&mut self, _: &menu::SelectLast, _window: &mut Window, cx: &mut Context<Self>) {
let count = self.matched_count();
if count > 0 {
self.set_selected_entry_index(count - 1, cx);
}
}
fn set_selected_entry_index(&mut self, entry_index: usize, cx: &mut Context<Self>) {
self.selected_index = entry_index;
let scroll_ix = match self.search_state {
SearchState::Empty | SearchState::Searching { .. } => self
.separated_item_indexes
.get(entry_index)
.map(|ix| *ix as usize)
.unwrap_or(entry_index + 1),
SearchState::Searched { .. } => entry_index,
};
self.scroll_handle
.scroll_to_item(scroll_ix, ScrollStrategy::Top);
cx.notify();
}
fn render_scrollbar(&self, cx: &mut Context<Self>) -> Option<Stateful<Div>> {
if !(self.scrollbar_visibility || self.scrollbar_state.is_dragging()) {
return None;
}
Some(
div()
.occlude()
.id("thread-history-scroll")
.h_full()
.bg(cx.theme().colors().panel_background.opacity(0.8))
.border_l_1()
.border_color(cx.theme().colors().border_variant)
.absolute()
.right_1()
.top_0()
.bottom_0()
.w_4()
.pl_1()
.cursor_default()
.on_mouse_move(cx.listener(|_, _, _window, cx| {
cx.notify();
cx.stop_propagation()
}))
.on_hover(|_, _window, cx| {
cx.stop_propagation();
})
.on_any_mouse_down(|_, _window, cx| {
cx.stop_propagation();
})
.on_scroll_wheel(cx.listener(|_, _, _window, cx| {
cx.notify();
}))
.children(Scrollbar::vertical(self.scrollbar_state.clone())),
)
}
fn confirm(&mut self, _: &menu::Confirm, _window: &mut Window, cx: &mut Context<Self>) {
self.confirm_entry(self.selected_index, cx);
}
fn confirm_entry(&mut self, ix: usize, cx: &mut Context<Self>) {
let Some(entry) = self.get_match(ix) else {
return;
};
cx.emit(ThreadHistoryEvent::Open(entry.clone()));
}
fn remove_selected_thread(
&mut self,
_: &RemoveSelectedThread,
_window: &mut Window,
cx: &mut Context<Self>,
) {
self.remove_thread(self.selected_index, cx)
}
fn remove_thread(&mut self, ix: usize, cx: &mut Context<Self>) {
let Some(entry) = self.get_match(ix) else {
return;
};
let task = match entry {
HistoryEntry::AcpThread(thread) => self
.history_store
.update(cx, |this, cx| this.delete_thread(thread.id.clone(), cx)),
HistoryEntry::TextThread(context) => self.history_store.update(cx, |this, cx| {
this.delete_text_thread(context.path.clone(), cx)
}),
};
task.detach_and_log_err(cx);
}
fn list_items(
&mut self,
range: Range<usize>,
_window: &mut Window,
cx: &mut Context<Self>,
) -> Vec<AnyElement> {
match &self.search_state {
SearchState::Empty => self
.separated_items
.get(range)
.iter()
.flat_map(|items| {
items
.iter()
.map(|item| self.render_list_item(item, vec![], cx))
})
.collect(),
SearchState::Searched { matches, .. } => matches[range]
.iter()
.filter_map(|m| {
let entry = self.all_entries.get(m.candidate_id)?;
Some(self.render_history_entry(
entry,
EntryTimeFormat::DateAndTime,
m.candidate_id,
m.positions.clone(),
cx,
))
})
.collect(),
SearchState::Searching { .. } => {
vec![]
}
}
}
fn render_list_item(
&self,
item: &ListItemType,
highlight_positions: Vec<usize>,
cx: &Context<Self>,
) -> AnyElement {
match item {
ListItemType::Entry { index, format } => match self.all_entries.get(*index) {
Some(entry) => self
.render_history_entry(entry, *format, *index, highlight_positions, cx)
.into_any(),
None => Empty.into_any_element(),
},
ListItemType::BucketSeparator(bucket) => div()
.px(DynamicSpacing::Base06.rems(cx))
.pt_2()
.pb_1()
.child(
Label::new(bucket.to_string())
.size(LabelSize::XSmall)
.color(Color::Muted),
)
.into_any_element(),
}
}
fn render_history_entry(
&self,
entry: &HistoryEntry,
format: EntryTimeFormat,
list_entry_ix: usize,
highlight_positions: Vec<usize>,
cx: &Context<Self>,
) -> AnyElement {
let selected = list_entry_ix == self.selected_index;
let hovered = Some(list_entry_ix) == self.hovered_index;
let timestamp = entry.updated_at().timestamp();
let thread_timestamp = format.format_timestamp(timestamp, self.local_timezone);
h_flex()
.w_full()
.pb_1()
.child(
ListItem::new(list_entry_ix)
.rounded()
.toggle_state(selected)
.spacing(ListItemSpacing::Sparse)
.start_slot(
h_flex()
.w_full()
.gap_2()
.justify_between()
.child(
HighlightedLabel::new(entry.title(), highlight_positions)
.size(LabelSize::Small)
.truncate(),
)
.child(
Label::new(thread_timestamp)
.color(Color::Muted)
.size(LabelSize::XSmall),
),
)
.on_hover(cx.listener(move |this, is_hovered, _window, cx| {
if *is_hovered {
this.hovered_index = Some(list_entry_ix);
} else if this.hovered_index == Some(list_entry_ix) {
this.hovered_index = None;
}
cx.notify();
}))
.end_slot::<IconButton>(if hovered || selected {
Some(
IconButton::new("delete", IconName::Trash)
.shape(IconButtonShape::Square)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.tooltip(move |window, cx| {
Tooltip::for_action("Delete", &RemoveSelectedThread, window, cx)
})
.on_click(cx.listener(move |this, _, _, cx| {
this.remove_thread(list_entry_ix, cx)
})),
)
} else {
None
})
.on_click(
cx.listener(move |this, _, _, cx| this.confirm_entry(list_entry_ix, cx)),
),
)
.into_any_element()
}
}
impl Focusable for AcpThreadHistory {
fn focus_handle(&self, cx: &App) -> FocusHandle {
self.search_editor.focus_handle(cx)
}
}
impl Render for AcpThreadHistory {
fn render(&mut self, _window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
v_flex()
.key_context("ThreadHistory")
.size_full()
.on_action(cx.listener(Self::select_previous))
.on_action(cx.listener(Self::select_next))
.on_action(cx.listener(Self::select_first))
.on_action(cx.listener(Self::select_last))
.on_action(cx.listener(Self::confirm))
.on_action(cx.listener(Self::remove_selected_thread))
.when(!self.all_entries.is_empty(), |parent| {
parent.child(
h_flex()
.h(px(41.)) // Match the toolbar perfectly
.w_full()
.py_1()
.px_2()
.gap_2()
.justify_between()
.border_b_1()
.border_color(cx.theme().colors().border)
.child(
Icon::new(IconName::MagnifyingGlass)
.color(Color::Muted)
.size(IconSize::Small),
)
.child(self.search_editor.clone()),
)
})
.child({
let view = v_flex()
.id("list-container")
.relative()
.overflow_hidden()
.flex_grow();
if self.all_entries.is_empty() {
view.justify_center()
.child(
h_flex().w_full().justify_center().child(
Label::new("You don't have any past threads yet.")
.size(LabelSize::Small),
),
)
} else if self.search_produced_no_matches() {
view.justify_center().child(
h_flex().w_full().justify_center().child(
Label::new("No threads match your search.").size(LabelSize::Small),
),
)
} else {
view.pr_5()
.child(
uniform_list(
"thread-history",
self.list_item_count(),
cx.processor(|this, range: Range<usize>, window, cx| {
this.list_items(range, window, cx)
}),
)
.p_1()
.track_scroll(self.scroll_handle.clone())
.flex_grow(),
)
.when_some(self.render_scrollbar(cx), |div, scrollbar| {
div.child(scrollbar)
})
}
})
}
}
#[derive(Clone, Copy)]
pub enum EntryTimeFormat {
DateAndTime,
TimeOnly,
}
impl EntryTimeFormat {
fn format_timestamp(&self, timestamp: i64, timezone: UtcOffset) -> String {
let timestamp = OffsetDateTime::from_unix_timestamp(timestamp).unwrap();
match self {
EntryTimeFormat::DateAndTime => time_format::format_localized_timestamp(
timestamp,
OffsetDateTime::now_utc(),
timezone,
time_format::TimestampFormat::EnhancedAbsolute,
),
EntryTimeFormat::TimeOnly => time_format::format_time(timestamp),
}
}
}
impl From<TimeBucket> for EntryTimeFormat {
fn from(bucket: TimeBucket) -> Self {
match bucket {
TimeBucket::Today => EntryTimeFormat::TimeOnly,
TimeBucket::Yesterday => EntryTimeFormat::TimeOnly,
TimeBucket::ThisWeek => EntryTimeFormat::DateAndTime,
TimeBucket::PastWeek => EntryTimeFormat::DateAndTime,
TimeBucket::All => EntryTimeFormat::DateAndTime,
}
}
}
#[derive(PartialEq, Eq, Clone, Copy, Debug)]
enum TimeBucket {
Today,
Yesterday,
ThisWeek,
PastWeek,
All,
}
impl TimeBucket {
fn from_dates(reference: NaiveDate, date: NaiveDate) -> Self {
if date == reference {
return TimeBucket::Today;
}
if date == reference - TimeDelta::days(1) {
return TimeBucket::Yesterday;
}
let week = date.iso_week();
if reference.iso_week() == week {
return TimeBucket::ThisWeek;
}
let last_week = (reference - TimeDelta::days(7)).iso_week();
if week == last_week {
return TimeBucket::PastWeek;
}
TimeBucket::All
}
}
impl Display for TimeBucket {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
TimeBucket::Today => write!(f, "Today"),
TimeBucket::Yesterday => write!(f, "Yesterday"),
TimeBucket::ThisWeek => write!(f, "This Week"),
TimeBucket::PastWeek => write!(f, "Past Week"),
TimeBucket::All => write!(f, "All"),
}
}
}
#[cfg(test)]
mod tests {
use super::*;
use chrono::NaiveDate;
#[test]
fn test_time_bucket_from_dates() {
let today = NaiveDate::from_ymd_opt(2023, 1, 15).unwrap();
let date = today;
assert_eq!(TimeBucket::from_dates(today, date), TimeBucket::Today);
let date = NaiveDate::from_ymd_opt(2023, 1, 14).unwrap();
assert_eq!(TimeBucket::from_dates(today, date), TimeBucket::Yesterday);
let date = NaiveDate::from_ymd_opt(2023, 1, 13).unwrap();
assert_eq!(TimeBucket::from_dates(today, date), TimeBucket::ThisWeek);
let date = NaiveDate::from_ymd_opt(2023, 1, 11).unwrap();
assert_eq!(TimeBucket::from_dates(today, date), TimeBucket::ThisWeek);
let date = NaiveDate::from_ymd_opt(2023, 1, 8).unwrap();
assert_eq!(TimeBucket::from_dates(today, date), TimeBucket::PastWeek);
let date = NaiveDate::from_ymd_opt(2023, 1, 5).unwrap();
assert_eq!(TimeBucket::from_dates(today, date), TimeBucket::PastWeek);
// All: not in this week or last week
let date = NaiveDate::from_ymd_opt(2023, 1, 1).unwrap();
assert_eq!(TimeBucket::from_dates(today, date), TimeBucket::All);
// Test year boundary cases
let new_year = NaiveDate::from_ymd_opt(2023, 1, 1).unwrap();
let date = NaiveDate::from_ymd_opt(2022, 12, 31).unwrap();
assert_eq!(
TimeBucket::from_dates(new_year, date),
TimeBucket::Yesterday
);
let date = NaiveDate::from_ymd_opt(2022, 12, 28).unwrap();
assert_eq!(TimeBucket::from_dates(new_year, date), TimeBucket::ThisWeek);
}
}

File diff suppressed because it is too large

@@ -491,7 +491,7 @@ fn render_markdown_code_block(
.on_click({
let active_thread = active_thread.clone();
let parsed_markdown = parsed_markdown.clone();
let code_block_range = metadata.content_range.clone();
let code_block_range = metadata.content_range;
move |_event, _window, cx| {
active_thread.update(cx, |this, cx| {
this.copied_code_block_ids.insert((message_id, ix));
@@ -532,7 +532,6 @@ fn render_markdown_code_block(
"Expand Code"
}))
.on_click({
let active_thread = active_thread.clone();
move |_event, _window, cx| {
active_thread.update(cx, |this, cx| {
this.toggle_codeblock_expanded(message_id, ix);
@@ -780,13 +779,11 @@ impl ActiveThread {
let list_state = ListState::new(0, ListAlignment::Bottom, px(2048.));
let workspace_subscription = if let Some(workspace) = workspace.upgrade() {
Some(cx.observe_release(&workspace, |this, _, cx| {
let workspace_subscription = workspace.upgrade().map(|workspace| {
cx.observe_release(&workspace, |this, _, cx| {
this.dismiss_notifications(cx);
}))
} else {
None
};
})
});
let mut this = Self {
language_registry,
@@ -916,7 +913,7 @@ impl ActiveThread {
) {
let rendered = self
.rendered_tool_uses
.entry(tool_use_id.clone())
.entry(tool_use_id)
.or_insert_with(|| RenderedToolUse {
label: cx.new(|cx| {
Markdown::new("".into(), Some(self.language_registry.clone()), None, cx)
@@ -1072,8 +1069,8 @@ impl ActiveThread {
}
ThreadEvent::MessageEdited(message_id) => {
self.clear_last_error();
if let Some(index) = self.messages.iter().position(|id| id == message_id) {
if let Some(rendered_message) = self.thread.update(cx, |thread, cx| {
if let Some(index) = self.messages.iter().position(|id| id == message_id)
&& let Some(rendered_message) = self.thread.update(cx, |thread, cx| {
thread.message(*message_id).map(|message| {
let mut rendered_message = RenderedMessage {
language_registry: self.language_registry.clone(),
@@ -1084,14 +1081,14 @@ impl ActiveThread {
}
rendered_message
})
}) {
self.list_state.splice(index..index + 1, 1);
self.rendered_messages_by_id
.insert(*message_id, rendered_message);
self.scroll_to_bottom(cx);
self.save_thread(cx);
cx.notify();
}
})
{
self.list_state.splice(index..index + 1, 1);
self.rendered_messages_by_id
.insert(*message_id, rendered_message);
self.scroll_to_bottom(cx);
self.save_thread(cx);
cx.notify();
}
}
ThreadEvent::MessageDeleted(message_id) => {
@@ -1218,7 +1215,7 @@ impl ActiveThread {
match AgentSettings::get_global(cx).notify_when_agent_waiting {
NotifyWhenAgentWaiting::PrimaryScreen => {
if let Some(primary) = cx.primary_display() {
self.pop_up(icon, caption.into(), title.clone(), window, primary, cx);
self.pop_up(icon, caption.into(), title, window, primary, cx);
}
}
NotifyWhenAgentWaiting::AllScreens => {
@@ -1272,62 +1269,61 @@ impl ActiveThread {
})
})
.log_err()
&& let Some(pop_up) = screen_window.entity(cx).log_err()
{
if let Some(pop_up) = screen_window.entity(cx).log_err() {
self.notification_subscriptions
.entry(screen_window)
.or_insert_with(Vec::new)
.push(cx.subscribe_in(&pop_up, window, {
|this, _, event, window, cx| match event {
AgentNotificationEvent::Accepted => {
let handle = window.window_handle();
cx.activate(true);
self.notification_subscriptions
.entry(screen_window)
.or_insert_with(Vec::new)
.push(cx.subscribe_in(&pop_up, window, {
|this, _, event, window, cx| match event {
AgentNotificationEvent::Accepted => {
let handle = window.window_handle();
cx.activate(true);
let workspace_handle = this.workspace.clone();
let workspace_handle = this.workspace.clone();
// If there are multiple Zed windows, activate the correct one.
cx.defer(move |cx| {
handle
.update(cx, |_view, window, _cx| {
window.activate_window();
// If there are multiple Zed windows, activate the correct one.
cx.defer(move |cx| {
handle
.update(cx, |_view, window, _cx| {
window.activate_window();
if let Some(workspace) = workspace_handle.upgrade() {
workspace.update(_cx, |workspace, cx| {
workspace.focus_panel::<AgentPanel>(window, cx);
});
}
})
.log_err();
});
if let Some(workspace) = workspace_handle.upgrade() {
workspace.update(_cx, |workspace, cx| {
workspace.focus_panel::<AgentPanel>(window, cx);
});
}
})
.log_err();
});
this.dismiss_notifications(cx);
}
AgentNotificationEvent::Dismissed => {
this.dismiss_notifications(cx);
}
this.dismiss_notifications(cx);
}
}));
AgentNotificationEvent::Dismissed => {
this.dismiss_notifications(cx);
}
}
}));
self.notifications.push(screen_window);
self.notifications.push(screen_window);
// If the user manually refocuses the original window, dismiss the popup.
self.notification_subscriptions
.entry(screen_window)
.or_insert_with(Vec::new)
.push({
let pop_up_weak = pop_up.downgrade();
// If the user manually refocuses the original window, dismiss the popup.
self.notification_subscriptions
.entry(screen_window)
.or_insert_with(Vec::new)
.push({
let pop_up_weak = pop_up.downgrade();
cx.observe_window_activation(window, move |_, window, cx| {
if window.is_window_active() {
if let Some(pop_up) = pop_up_weak.upgrade() {
pop_up.update(cx, |_, cx| {
cx.emit(AgentNotificationEvent::Dismissed);
});
}
}
})
});
}
cx.observe_window_activation(window, move |_, window, cx| {
if window.is_window_active()
&& let Some(pop_up) = pop_up_weak.upgrade()
{
pop_up.update(cx, |_, cx| {
cx.emit(AgentNotificationEvent::Dismissed);
});
}
})
});
}
}
@@ -1374,12 +1370,12 @@ impl ActiveThread {
editor.focus_handle(cx).focus(window);
editor.move_to_end(&editor::actions::MoveToEnd, window, cx);
});
let buffer_edited_subscription = cx.subscribe(&editor, |this, _, event, cx| match event {
EditorEvent::BufferEdited => {
this.update_editing_message_token_count(true, cx);
}
_ => {}
});
let buffer_edited_subscription =
cx.subscribe(&editor, |this, _, event: &EditorEvent, cx| {
if event == &EditorEvent::BufferEdited {
this.update_editing_message_token_count(true, cx);
}
});
let context_picker_menu_handle = PopoverMenuHandle::default();
let context_strip = cx.new(|cx| {
@@ -1766,7 +1762,7 @@ impl ActiveThread {
.thread
.read(cx)
.message(message_id)
.map(|msg| msg.to_string())
.map(|msg| msg.to_message_content())
.unwrap_or_default();
telemetry::event!(
@@ -2113,7 +2109,7 @@ impl ActiveThread {
.gap_1()
.children(message_content)
.when_some(editing_message_state, |this, state| {
let focus_handle = state.editor.focus_handle(cx).clone();
let focus_handle = state.editor.focus_handle(cx);
this.child(
h_flex()
@@ -2174,7 +2170,6 @@ impl ActiveThread {
.icon_color(Color::Muted)
.icon_size(IconSize::Small)
.tooltip({
let focus_handle = focus_handle.clone();
move |window, cx| {
Tooltip::for_action_in(
"Regenerate",
@@ -2247,9 +2242,7 @@ impl ActiveThread {
let after_editing_message = self
.editing_message
.as_ref()
.map_or(false, |(editing_message_id, _)| {
message_id > *editing_message_id
});
.is_some_and(|(editing_message_id, _)| message_id > *editing_message_id);
let backdrop = div()
.id(("backdrop", ix))
@@ -2269,13 +2262,12 @@ impl ActiveThread {
let mut error = None;
if let Some(last_restore_checkpoint) =
self.thread.read(cx).last_restore_checkpoint()
&& last_restore_checkpoint.message_id() == message_id
{
if last_restore_checkpoint.message_id() == message_id {
match last_restore_checkpoint {
LastRestoreCheckpoint::Pending { .. } => is_pending = true,
LastRestoreCheckpoint::Error { error: err, .. } => {
error = Some(err.clone());
}
match last_restore_checkpoint {
LastRestoreCheckpoint::Pending { .. } => is_pending = true,
LastRestoreCheckpoint::Error { error: err, .. } => {
error = Some(err.clone());
}
}
}
@@ -2316,7 +2308,7 @@ impl ActiveThread {
.into_any_element()
} else if let Some(error) = error {
restore_checkpoint_button
.tooltip(Tooltip::text(error.to_string()))
.tooltip(Tooltip::text(error))
.into_any_element()
} else {
restore_checkpoint_button.into_any_element()
@@ -2357,7 +2349,6 @@ impl ActiveThread {
this.submit_feedback_message(message_id, cx);
cx.notify();
}))
.on_action(cx.listener(Self::confirm_editing_message))
.mb_2()
.mx_4()
.p_2()


@@ -96,7 +96,7 @@ impl AgentConfiguration {
let mut expanded_provider_configurations = HashMap::default();
if LanguageModelRegistry::read_global(cx)
.provider(&ZED_CLOUD_PROVIDER_ID)
.map_or(false, |cloud_provider| cloud_provider.must_accept_terms(cx))
.is_some_and(|cloud_provider| cloud_provider.must_accept_terms(cx))
{
expanded_provider_configurations.insert(ZED_CLOUD_PROVIDER_ID, true);
}
@@ -165,8 +165,8 @@ impl AgentConfiguration {
provider: &Arc<dyn LanguageModelProvider>,
cx: &mut Context<Self>,
) -> impl IntoElement + use<> {
let provider_id = provider.id().0.clone();
let provider_name = provider.name().0.clone();
let provider_id = provider.id().0;
let provider_name = provider.name().0;
let provider_id_string = SharedString::from(format!("provider-disclosure-{provider_id}"));
let configuration_view = self
@@ -269,7 +269,7 @@ impl AgentConfiguration {
.closed_icon(IconName::ChevronDown),
)
.on_click(cx.listener({
let provider_id = provider.id().clone();
let provider_id = provider.id();
move |this, _event, _window, _cx| {
let is_expanded = this
.expanded_provider_configurations
@@ -665,7 +665,7 @@ impl AgentConfiguration {
.size(IconSize::XSmall)
.color(Color::Accent)
.with_animation(
SharedString::from(format!("{}-starting", context_server_id.0.clone(),)),
SharedString::from(format!("{}-starting", context_server_id.0,)),
Animation::new(Duration::from_secs(3)).repeat(),
|icon, delta| icon.transform(Transformation::rotate(percentage(delta))),
)
@@ -865,7 +865,6 @@ impl AgentConfiguration {
.on_click({
let context_server_manager =
self.context_server_store.clone();
let context_server_id = context_server_id.clone();
let fs = self.fs.clone();
move |state, _window, cx| {
@@ -958,7 +957,7 @@ impl AgentConfiguration {
}
parent.child(v_flex().py_1p5().px_1().gap_1().children(
tools.into_iter().enumerate().map(|(ix, tool)| {
tools.iter().enumerate().map(|(ix, tool)| {
h_flex()
.id(("tool-item", ix))
.px_1()
@@ -1075,7 +1074,6 @@ fn show_unable_to_uninstall_extension_with_context_server(
cx,
move |this, _cx| {
let workspace_handle = workspace_handle.clone();
let context_server_id = context_server_id.clone();
this.icon(ToastIcon::new(IconName::Warning).color(Color::Warning))
.dismiss_button(true)


@@ -668,10 +668,10 @@ mod tests {
);
let parsed_model = model_input.parse(cx).unwrap();
assert_eq!(parsed_model.capabilities.tools, true);
assert_eq!(parsed_model.capabilities.images, false);
assert_eq!(parsed_model.capabilities.parallel_tool_calls, false);
assert_eq!(parsed_model.capabilities.prompt_cache_key, false);
assert!(parsed_model.capabilities.tools);
assert!(!parsed_model.capabilities.images);
assert!(!parsed_model.capabilities.parallel_tool_calls);
assert!(!parsed_model.capabilities.prompt_cache_key);
});
}
@@ -693,10 +693,10 @@ mod tests {
model_input.capabilities.supports_prompt_cache_key = ToggleState::Unselected;
let parsed_model = model_input.parse(cx).unwrap();
assert_eq!(parsed_model.capabilities.tools, false);
assert_eq!(parsed_model.capabilities.images, false);
assert_eq!(parsed_model.capabilities.parallel_tool_calls, false);
assert_eq!(parsed_model.capabilities.prompt_cache_key, false);
assert!(!parsed_model.capabilities.tools);
assert!(!parsed_model.capabilities.images);
assert!(!parsed_model.capabilities.parallel_tool_calls);
assert!(!parsed_model.capabilities.prompt_cache_key);
});
}
@@ -719,10 +719,10 @@ mod tests {
let parsed_model = model_input.parse(cx).unwrap();
assert_eq!(parsed_model.name, "somemodel");
assert_eq!(parsed_model.capabilities.tools, true);
assert_eq!(parsed_model.capabilities.images, false);
assert_eq!(parsed_model.capabilities.parallel_tool_calls, true);
assert_eq!(parsed_model.capabilities.prompt_cache_key, false);
assert!(parsed_model.capabilities.tools);
assert!(!parsed_model.capabilities.images);
assert!(parsed_model.capabilities.parallel_tool_calls);
assert!(!parsed_model.capabilities.prompt_cache_key);
});
}


@@ -163,10 +163,10 @@ impl ConfigurationSource {
.read(cx)
.text(cx);
let settings = serde_json_lenient::from_str::<serde_json::Value>(&text)?;
if let Some(settings_validator) = settings_validator {
if let Err(error) = settings_validator.validate(&settings) {
return Err(anyhow::anyhow!(error.to_string()));
}
if let Some(settings_validator) = settings_validator
&& let Err(error) = settings_validator.validate(&settings)
{
return Err(anyhow::anyhow!(error.to_string()));
}
Ok((
id.clone(),
@@ -261,7 +261,6 @@ impl ConfigureContextServerModal {
_cx: &mut Context<Workspace>,
) {
workspace.register_action({
let language_registry = language_registry.clone();
move |_workspace, _: &AddContextServer, window, cx| {
let workspace_handle = cx.weak_entity();
let language_registry = language_registry.clone();
@@ -487,7 +486,7 @@ impl ConfigureContextServerModal {
}
fn render_modal_description(&self, window: &mut Window, cx: &mut Context<Self>) -> AnyElement {
const MODAL_DESCRIPTION: &'static str = "Visit the MCP server configuration docs to find all necessary arguments and environment variables.";
const MODAL_DESCRIPTION: &str = "Visit the MCP server configuration docs to find all necessary arguments and environment variables.";
if let ConfigurationSource::Extension {
installation_instructions: Some(installation_instructions),
@@ -716,24 +715,24 @@ fn wait_for_context_server(
project::context_server_store::Event::ServerStatusChanged { server_id, status } => {
match status {
ContextServerStatus::Running => {
if server_id == &context_server_id {
if let Some(tx) = tx.lock().unwrap().take() {
let _ = tx.send(Ok(()));
}
if server_id == &context_server_id
&& let Some(tx) = tx.lock().unwrap().take()
{
let _ = tx.send(Ok(()));
}
}
ContextServerStatus::Stopped => {
if server_id == &context_server_id {
if let Some(tx) = tx.lock().unwrap().take() {
let _ = tx.send(Err("Context server stopped running".into()));
}
if server_id == &context_server_id
&& let Some(tx) = tx.lock().unwrap().take()
{
let _ = tx.send(Err("Context server stopped running".into()));
}
}
ContextServerStatus::Error(error) => {
if server_id == &context_server_id {
if let Some(tx) = tx.lock().unwrap().take() {
let _ = tx.send(Err(error.clone()));
}
if server_id == &context_server_id
&& let Some(tx) = tx.lock().unwrap().take()
{
let _ = tx.send(Err(error.clone()));
}
}
_ => {}


@@ -464,7 +464,7 @@ impl ManageProfilesModal {
},
))
.child(ListSeparator)
.child(h_flex().p_2().child(mode.name_editor.clone()))
.child(h_flex().p_2().child(mode.name_editor))
}
fn render_view_profile(


@@ -191,10 +191,10 @@ impl PickerDelegate for ToolPickerDelegate {
BTreeMap::default();
for item in all_items.iter() {
if let PickerItem::Tool { server_id, name } = item.clone() {
if name.contains(&query) {
tools_by_provider.entry(server_id).or_default().push(name);
}
if let PickerItem::Tool { server_id, name } = item.clone()
&& name.contains(&query)
{
tools_by_provider.entry(server_id).or_default().push(name);
}
}


@@ -185,7 +185,7 @@ impl AgentDiffPane {
let focus_handle = cx.focus_handle();
let multibuffer = cx.new(|_| MultiBuffer::new(Capability::ReadWrite));
let project = thread.project(cx).clone();
let project = thread.project(cx);
let editor = cx.new(|cx| {
let mut editor =
Editor::for_multibuffer(multibuffer.clone(), Some(project.clone()), window, cx);
@@ -196,27 +196,24 @@ impl AgentDiffPane {
editor
});
let action_log = thread.action_log(cx).clone();
let action_log = thread.action_log(cx);
let mut this = Self {
_subscriptions: [
Some(
cx.observe_in(&action_log, window, |this, _action_log, window, cx| {
this.update_excerpts(window, cx)
}),
),
_subscriptions: vec![
cx.observe_in(&action_log, window, |this, _action_log, window, cx| {
this.update_excerpts(window, cx)
}),
match &thread {
AgentDiffThread::Native(thread) => {
Some(cx.subscribe(thread, |this, _thread, event, cx| {
this.handle_thread_event(event, cx)
}))
}
AgentDiffThread::AcpThread(_) => None,
AgentDiffThread::Native(thread) => cx
.subscribe(thread, |this, _thread, event, cx| {
this.handle_native_thread_event(event, cx)
}),
AgentDiffThread::AcpThread(thread) => cx
.subscribe(thread, |this, _thread, event, cx| {
this.handle_acp_thread_event(event, cx)
}),
},
]
.into_iter()
.flatten()
.collect(),
],
title: SharedString::default(),
multibuffer,
editor,
@@ -288,7 +285,7 @@ impl AgentDiffPane {
&& buffer
.read(cx)
.file()
.map_or(false, |file| file.disk_state() == DiskState::Deleted)
.is_some_and(|file| file.disk_state() == DiskState::Deleted)
{
editor.fold_buffer(snapshot.text.remote_id(), cx)
}
@@ -324,10 +321,15 @@ impl AgentDiffPane {
}
}
fn handle_thread_event(&mut self, event: &ThreadEvent, cx: &mut Context<Self>) {
match event {
ThreadEvent::SummaryGenerated => self.update_title(cx),
_ => {}
fn handle_native_thread_event(&mut self, event: &ThreadEvent, cx: &mut Context<Self>) {
if let ThreadEvent::SummaryGenerated = event {
self.update_title(cx)
}
}
fn handle_acp_thread_event(&mut self, event: &AcpThreadEvent, cx: &mut Context<Self>) {
if let AcpThreadEvent::TitleUpdated = event {
self.update_title(cx)
}
}
@@ -1043,23 +1045,23 @@ impl ToolbarItemView for AgentDiffToolbar {
return self.location(cx);
}
if let Some(editor) = item.act_as::<Editor>(cx) {
if editor.read(cx).mode().is_full() {
let agent_diff = AgentDiff::global(cx);
if let Some(editor) = item.act_as::<Editor>(cx)
&& editor.read(cx).mode().is_full()
{
let agent_diff = AgentDiff::global(cx);
self.active_item = Some(AgentDiffToolbarItem::Editor {
editor: editor.downgrade(),
state: agent_diff.read(cx).editor_state(&editor.downgrade()),
_diff_subscription: cx.observe(&agent_diff, Self::handle_diff_notify),
});
self.active_item = Some(AgentDiffToolbarItem::Editor {
editor: editor.downgrade(),
state: agent_diff.read(cx).editor_state(&editor.downgrade()),
_diff_subscription: cx.observe(&agent_diff, Self::handle_diff_notify),
});
return self.location(cx);
}
return self.location(cx);
}
}
self.active_item = None;
return self.location(cx);
self.location(cx)
}
fn pane_focus_update(
@@ -1310,7 +1312,7 @@ impl AgentDiff {
let entity = cx.new(|_cx| Self::default());
let global = AgentDiffGlobal(entity.clone());
cx.set_global(global);
entity.clone()
entity
})
}
@@ -1332,7 +1334,7 @@ impl AgentDiff {
window: &mut Window,
cx: &mut Context<Self>,
) {
let action_log = thread.action_log(cx).clone();
let action_log = thread.action_log(cx);
let action_log_subscription = cx.observe_in(&action_log, window, {
let workspace = workspace.clone();
@@ -1505,7 +1507,7 @@ impl AgentDiff {
.read(cx)
.entries()
.last()
.map_or(false, |entry| entry.diffs().next().is_some())
.is_some_and(|entry| entry.diffs().next().is_some())
{
self.update_reviewing_editors(workspace, window, cx);
}
@@ -1515,16 +1517,19 @@ impl AgentDiff {
.read(cx)
.entries()
.get(*ix)
.map_or(false, |entry| entry.diffs().next().is_some())
.is_some_and(|entry| entry.diffs().next().is_some())
{
self.update_reviewing_editors(workspace, window, cx);
}
}
AcpThreadEvent::EntriesRemoved(_)
| AcpThreadEvent::Stopped
AcpThreadEvent::Stopped | AcpThreadEvent::Error | AcpThreadEvent::LoadError(_) => {
self.update_reviewing_editors(workspace, window, cx);
}
AcpThreadEvent::TitleUpdated
| AcpThreadEvent::TokenUsageUpdated
| AcpThreadEvent::EntriesRemoved(_)
| AcpThreadEvent::ToolAuthorizationRequired
| AcpThreadEvent::Error
| AcpThreadEvent::ServerExited(_) => {}
| AcpThreadEvent::Retry(_) => {}
}
}
@@ -1535,21 +1540,11 @@ impl AgentDiff {
window: &mut Window,
cx: &mut Context<Self>,
) {
match event {
workspace::Event::ItemAdded { item } => {
if let Some(editor) = item.downcast::<Editor>() {
if let Some(buffer) = Self::full_editor_buffer(editor.read(cx), cx) {
self.register_editor(
workspace.downgrade(),
buffer.clone(),
editor,
window,
cx,
);
}
}
}
_ => {}
if let workspace::Event::ItemAdded { item } = event
&& let Some(editor) = item.downcast::<Editor>()
&& let Some(buffer) = Self::full_editor_buffer(editor.read(cx), cx)
{
self.register_editor(workspace.downgrade(), buffer, editor, window, cx);
}
}
@@ -1648,7 +1643,7 @@ impl AgentDiff {
continue;
};
for (weak_editor, _) in buffer_editors {
for weak_editor in buffer_editors.keys() {
let Some(editor) = weak_editor.upgrade() else {
continue;
};
@@ -1709,7 +1704,7 @@ impl AgentDiff {
.read_with(cx, |editor, _cx| editor.workspace())
.ok()
.flatten()
.map_or(false, |editor_workspace| {
.is_some_and(|editor_workspace| {
editor_workspace.entity_id() == workspace.entity_id()
});
@@ -1849,26 +1844,26 @@ impl AgentDiff {
let thread = thread.upgrade()?;
if let PostReviewState::AllReviewed = review(&editor, &thread, window, cx) {
if let Some(curr_buffer) = editor.read(cx).buffer().read(cx).as_singleton() {
let changed_buffers = thread.action_log(cx).read(cx).changed_buffers(cx);
if let PostReviewState::AllReviewed = review(&editor, &thread, window, cx)
&& let Some(curr_buffer) = editor.read(cx).buffer().read(cx).as_singleton()
{
let changed_buffers = thread.action_log(cx).read(cx).changed_buffers(cx);
let mut keys = changed_buffers.keys().cycle();
keys.find(|k| *k == &curr_buffer);
let next_project_path = keys
.next()
.filter(|k| *k != &curr_buffer)
.and_then(|after| after.read(cx).project_path(cx));
let mut keys = changed_buffers.keys().cycle();
keys.find(|k| *k == &curr_buffer);
let next_project_path = keys
.next()
.filter(|k| *k != &curr_buffer)
.and_then(|after| after.read(cx).project_path(cx));
if let Some(path) = next_project_path {
let task = workspace.open_path(path, None, true, window, cx);
let task = cx.spawn(async move |_, _cx| task.await.map(|_| ()));
return Some(task);
}
if let Some(path) = next_project_path {
let task = workspace.open_path(path, None, true, window, cx);
let task = cx.spawn(async move |_, _cx| task.await.map(|_| ()));
return Some(task);
}
}
return Some(Task::ready(Ok(())));
Some(Task::ready(Ok(())))
}
}
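
Many hunks in this file (and in the context-picker and inline-assist files further down) swap `Option::map_or(false, …)` for `is_some_and(…)` and `map_or(true, …)` for `is_none_or(…)`. A minimal, standalone sketch of the equivalence, using made-up data rather than Zed's types (`is_none_or` needs Rust 1.82 or newer):

```rust
fn main() {
    let entries: Vec<&str> = vec!["diff-1"];

    // Older style: a boolean default plus a closure.
    let has_diff_old = entries.last().map_or(false, |entry| entry.starts_with("diff"));

    // Newer style: reads as "is Some and the predicate holds".
    let has_diff_new = entries.last().is_some_and(|entry| entry.starts_with("diff"));
    assert_eq!(has_diff_old, has_diff_new);

    // map_or(true, …) has the dual helper is_none_or: true when None, else the predicate.
    let exclude_path: Option<&str> = None;
    let keep_old = exclude_path.map_or(true, |path| path != "a.rs");
    let keep_new = exclude_path.is_none_or(|path| path != "a.rs");
    assert_eq!(keep_old, keep_new);
}
```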

View File

@@ -66,10 +66,8 @@ impl AgentModelSelector {
fs.clone(),
cx,
move |settings, _cx| {
settings.set_inline_assistant_model(
provider.clone(),
model_id.clone(),
);
settings
.set_inline_assistant_model(provider.clone(), model_id);
},
);
}

File diff suppressed because it is too large

View File

@@ -146,6 +146,13 @@ pub struct NewExternalAgentThread {
agent: Option<ExternalAgent>,
}
#[derive(Clone, PartialEq, Deserialize, JsonSchema, Action)]
#[action(namespace = agent)]
#[serde(deny_unknown_fields)]
pub struct NewNativeAgentThreadFromSummary {
from_session_id: agent_client_protocol::SessionId,
}
#[derive(Default, Debug, Clone, Copy, PartialEq, Serialize, Deserialize, JsonSchema)]
#[serde(rename_all = "snake_case")]
enum ExternalAgent {
@@ -156,11 +163,15 @@ enum ExternalAgent {
}
impl ExternalAgent {
pub fn server(&self, fs: Arc<dyn fs::Fs>) -> Rc<dyn agent_servers::AgentServer> {
pub fn server(
&self,
fs: Arc<dyn fs::Fs>,
history: Entity<agent2::HistoryStore>,
) -> Rc<dyn agent_servers::AgentServer> {
match self {
ExternalAgent::Gemini => Rc::new(agent_servers::Gemini),
ExternalAgent::ClaudeCode => Rc::new(agent_servers::ClaudeCode),
ExternalAgent::NativeAgent => Rc::new(agent2::NativeAgentServer::new(fs)),
ExternalAgent::NativeAgent => Rc::new(agent2::NativeAgentServer::new(fs, history)),
}
}
}
@@ -236,12 +247,7 @@ pub fn init(
client.telemetry().clone(),
cx,
);
terminal_inline_assistant::init(
fs.clone(),
prompt_builder.clone(),
client.telemetry().clone(),
cx,
);
terminal_inline_assistant::init(fs.clone(), prompt_builder, client.telemetry().clone(), cx);
cx.observe_new(move |workspace, window, cx| {
ConfigureContextServerModal::register(workspace, language_registry.clone(), window, cx)
})
@@ -387,7 +393,6 @@ fn register_slash_commands(cx: &mut App) {
slash_command_registry.register_command(assistant_slash_commands::FetchSlashCommand, true);
cx.observe_flag::<assistant_slash_commands::StreamingExampleSlashCommandFeatureFlag, _>({
let slash_command_registry = slash_command_registry.clone();
move |is_enabled, _cx| {
if is_enabled {
slash_command_registry.register_command(

View File

@@ -352,12 +352,12 @@ impl CodegenAlternative {
event: &multi_buffer::Event,
cx: &mut Context<Self>,
) {
if let multi_buffer::Event::TransactionUndone { transaction_id } = event {
if self.transformation_transaction_id == Some(*transaction_id) {
self.transformation_transaction_id = None;
self.generation = Task::ready(());
cx.emit(CodegenEvent::Undone);
}
if let multi_buffer::Event::TransactionUndone { transaction_id } = event
&& self.transformation_transaction_id == Some(*transaction_id)
{
self.transformation_transaction_id = None;
self.generation = Task::ready(());
cx.emit(CodegenEvent::Undone);
}
}
@@ -576,38 +576,34 @@ impl CodegenAlternative {
let mut lines = chunk.split('\n').peekable();
while let Some(line) = lines.next() {
new_text.push_str(line);
if line_indent.is_none() {
if let Some(non_whitespace_ch_ix) =
if line_indent.is_none()
&& let Some(non_whitespace_ch_ix) =
new_text.find(|ch: char| !ch.is_whitespace())
{
line_indent = Some(non_whitespace_ch_ix);
base_indent = base_indent.or(line_indent);
{
line_indent = Some(non_whitespace_ch_ix);
base_indent = base_indent.or(line_indent);
let line_indent = line_indent.unwrap();
let base_indent = base_indent.unwrap();
let indent_delta =
line_indent as i32 - base_indent as i32;
let mut corrected_indent_len = cmp::max(
0,
suggested_line_indent.len as i32 + indent_delta,
)
as usize;
if first_line {
corrected_indent_len = corrected_indent_len
.saturating_sub(
selection_start.column as usize,
);
}
let indent_char = suggested_line_indent.char();
let mut indent_buffer = [0; 4];
let indent_str =
indent_char.encode_utf8(&mut indent_buffer);
new_text.replace_range(
..line_indent,
&indent_str.repeat(corrected_indent_len),
);
let line_indent = line_indent.unwrap();
let base_indent = base_indent.unwrap();
let indent_delta = line_indent as i32 - base_indent as i32;
let mut corrected_indent_len = cmp::max(
0,
suggested_line_indent.len as i32 + indent_delta,
)
as usize;
if first_line {
corrected_indent_len = corrected_indent_len
.saturating_sub(selection_start.column as usize);
}
let indent_char = suggested_line_indent.char();
let mut indent_buffer = [0; 4];
let indent_str =
indent_char.encode_utf8(&mut indent_buffer);
new_text.replace_range(
..line_indent,
&indent_str.repeat(corrected_indent_len),
);
}
if line_indent.is_some() {
@@ -1133,7 +1129,7 @@ mod tests {
)
});
let chunks_tx = simulate_response_stream(codegen.clone(), cx);
let chunks_tx = simulate_response_stream(&codegen, cx);
let mut new_text = concat!(
" let mut x = 0;\n",
@@ -1200,7 +1196,7 @@ mod tests {
)
});
let chunks_tx = simulate_response_stream(codegen.clone(), cx);
let chunks_tx = simulate_response_stream(&codegen, cx);
cx.background_executor.run_until_parked();
@@ -1269,7 +1265,7 @@ mod tests {
)
});
let chunks_tx = simulate_response_stream(codegen.clone(), cx);
let chunks_tx = simulate_response_stream(&codegen, cx);
cx.background_executor.run_until_parked();
@@ -1338,7 +1334,7 @@ mod tests {
)
});
let chunks_tx = simulate_response_stream(codegen.clone(), cx);
let chunks_tx = simulate_response_stream(&codegen, cx);
let new_text = concat!(
"func main() {\n",
"\tx := 0\n",
@@ -1395,7 +1391,7 @@ mod tests {
)
});
let chunks_tx = simulate_response_stream(codegen.clone(), cx);
let chunks_tx = simulate_response_stream(&codegen, cx);
chunks_tx
.unbounded_send("let mut x = 0;\nx += 1;".to_string())
.unwrap();
@@ -1477,7 +1473,7 @@ mod tests {
}
fn simulate_response_stream(
codegen: Entity<CodegenAlternative>,
codegen: &Entity<CodegenAlternative>,
cx: &mut TestAppContext,
) -> mpsc::UnboundedSender<String> {
let (chunks_tx, chunks_rx) = mpsc::unbounded();
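
The test-helper change above (`simulate_response_stream` now takes `&Entity<CodegenAlternative>`) lets every call site drop its `codegen.clone()`. A rough sketch of the same idea, with `Rc` standing in for gpui's entity handle (names here are illustrative, not Zed's API):

```rust
use std::rc::Rc;

struct Codegen {
    buffer: String,
}

// Before: the helper took ownership, so every call site wrote `simulate_stream(codegen.clone())`.
// fn simulate_stream(codegen: Rc<Codegen>) -> usize { codegen.buffer.len() }

// After: the helper only reads, so borrowing is enough and the call-site clones disappear.
fn simulate_stream(codegen: &Rc<Codegen>) -> usize {
    codegen.buffer.len()
}

fn main() {
    let codegen = Rc::new(Codegen {
        buffer: "let mut x = 0;".into(),
    });
    let len = simulate_stream(&codegen);
    assert_eq!(len, 14);
    assert_eq!(Rc::strong_count(&codegen), 1); // nothing was cloned
}
```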

View File

@@ -385,12 +385,11 @@ impl ContextPicker {
}
pub fn select_first(&mut self, window: &mut Window, cx: &mut Context<Self>) {
match &self.mode {
ContextPickerState::Default(entity) => entity.update(cx, |entity, cx| {
// Other variants already select their first entry on open automatically
if let ContextPickerState::Default(entity) = &self.mode {
entity.update(cx, |entity, cx| {
entity.select_first(&Default::default(), window, cx)
}),
// Other variants already select their first entry on open automatically
_ => {}
})
}
}
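
The `select_first` hunk above collapses a `match` whose only non-empty arm is one variant into an `if let`. A small self-contained sketch of that rewrite, using a hypothetical picker enum rather than Zed's:

```rust
enum PickerState {
    Default(String),
    File,
    Symbol,
}

fn select_first(state: &PickerState) {
    // Before: a match whose only non-empty arm is one variant.
    // match state {
    //     PickerState::Default(name) => println!("selecting first entry in {name}"),
    //     _ => {}
    // }

    // After: `if let` drops the empty catch-all arm.
    if let PickerState::Default(name) = state {
        println!("selecting first entry in {name}");
    }
}

fn main() {
    select_first(&PickerState::Default("default picker".into()));
    select_first(&PickerState::File);
    select_first(&PickerState::Symbol);
}
```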
@@ -610,9 +609,7 @@ pub(crate) fn available_context_picker_entries(
.read(cx)
.active_item(cx)
.and_then(|item| item.downcast::<Editor>())
.map_or(false, |editor| {
editor.update(cx, |editor, cx| editor.has_non_empty_selection(cx))
});
.is_some_and(|editor| editor.update(cx, |editor, cx| editor.has_non_empty_selection(cx)));
if has_selection {
entries.push(ContextPickerEntry::Action(
ContextPickerAction::AddSelections,
@@ -680,7 +677,7 @@ pub(crate) fn recent_context_picker_entries(
.filter(|(_, abs_path)| {
abs_path
.as_ref()
.map_or(true, |path| !exclude_paths.contains(path.as_path()))
.is_none_or(|path| !exclude_paths.contains(path.as_path()))
})
.take(4)
.filter_map(|(project_path, _)| {
@@ -821,13 +818,8 @@ pub fn crease_for_mention(
let render_trailer = move |_row, _unfold, _window: &mut Window, _cx: &mut App| Empty.into_any();
Crease::inline(
range,
placeholder.clone(),
fold_toggle("mention"),
render_trailer,
)
.with_metadata(CreaseMetadata { icon_path, label })
Crease::inline(range, placeholder, fold_toggle("mention"), render_trailer)
.with_metadata(CreaseMetadata { icon_path, label })
}
fn render_fold_icon_button(

View File

@@ -79,8 +79,7 @@ fn search(
) -> Task<Vec<Match>> {
match mode {
Some(ContextPickerMode::File) => {
let search_files_task =
search_files(query.clone(), cancellation_flag.clone(), &workspace, cx);
let search_files_task = search_files(query, cancellation_flag, &workspace, cx);
cx.background_spawn(async move {
search_files_task
.await
@@ -91,8 +90,7 @@ fn search(
}
Some(ContextPickerMode::Symbol) => {
let search_symbols_task =
search_symbols(query.clone(), cancellation_flag.clone(), &workspace, cx);
let search_symbols_task = search_symbols(query, cancellation_flag, &workspace, cx);
cx.background_spawn(async move {
search_symbols_task
.await
@@ -108,13 +106,8 @@ fn search(
.and_then(|t| t.upgrade())
.zip(text_thread_context_store.as_ref().and_then(|t| t.upgrade()))
{
let search_threads_task = search_threads(
query.clone(),
cancellation_flag.clone(),
thread_store,
context_store,
cx,
);
let search_threads_task =
search_threads(query, cancellation_flag, thread_store, context_store, cx);
cx.background_spawn(async move {
search_threads_task
.await
@@ -137,8 +130,7 @@ fn search(
Some(ContextPickerMode::Rules) => {
if let Some(prompt_store) = prompt_store.as_ref() {
let search_rules_task =
search_rules(query.clone(), cancellation_flag.clone(), prompt_store, cx);
let search_rules_task = search_rules(query, cancellation_flag, prompt_store, cx);
cx.background_spawn(async move {
search_rules_task
.await
@@ -196,7 +188,7 @@ fn search(
let executor = cx.background_executor().clone();
let search_files_task =
search_files(query.clone(), cancellation_flag.clone(), &workspace, cx);
search_files(query.clone(), cancellation_flag, &workspace, cx);
let entries =
available_context_picker_entries(&prompt_store, &thread_store, &workspace, cx);
@@ -283,7 +275,7 @@ impl ContextPickerCompletionProvider {
) -> Option<Completion> {
match entry {
ContextPickerEntry::Mode(mode) => Some(Completion {
replace_range: source_range.clone(),
replace_range: source_range,
new_text: format!("@{} ", mode.keyword()),
label: CodeLabel::plain(mode.label().to_string(), None),
icon_path: Some(mode.icon().path().into()),
@@ -330,9 +322,6 @@ impl ContextPickerCompletionProvider {
);
let callback = Arc::new({
let context_store = context_store.clone();
let selections = selections.clone();
let selection_infos = selection_infos.clone();
move |_, window: &mut Window, cx: &mut App| {
context_store.update(cx, |context_store, cx| {
for (buffer, range) in &selections {
@@ -441,7 +430,7 @@ impl ContextPickerCompletionProvider {
excerpt_id,
source_range.start,
new_text_len - 1,
editor.clone(),
editor,
context_store.clone(),
move |window, cx| match &thread_entry {
ThreadContextEntry::Thread { id, .. } => {
@@ -510,7 +499,7 @@ impl ContextPickerCompletionProvider {
excerpt_id,
source_range.start,
new_text_len - 1,
editor.clone(),
editor,
context_store.clone(),
move |_, cx| {
let user_prompt_id = rules.prompt_id;
@@ -547,7 +536,7 @@ impl ContextPickerCompletionProvider {
excerpt_id,
source_range.start,
new_text_len - 1,
editor.clone(),
editor,
context_store.clone(),
move |_, cx| {
let context_store = context_store.clone();
@@ -704,16 +693,16 @@ impl ContextPickerCompletionProvider {
excerpt_id,
source_range.start,
new_text_len - 1,
editor.clone(),
editor,
context_store.clone(),
move |_, cx| {
let symbol = symbol.clone();
let context_store = context_store.clone();
let workspace = workspace.clone();
let result = super::symbol_context_picker::add_symbol(
symbol.clone(),
symbol,
false,
workspace.clone(),
workspace,
context_store.downgrade(),
cx,
);
@@ -1020,7 +1009,7 @@ impl MentionCompletion {
&& line
.chars()
.nth(last_mention_start - 1)
.map_or(false, |c| !c.is_whitespace())
.is_some_and(|c| !c.is_whitespace())
{
return None;
}
@@ -1162,7 +1151,7 @@ mod tests {
impl Focusable for AtMentionEditor {
fn focus_handle(&self, cx: &App) -> FocusHandle {
self.0.read(cx).focus_handle(cx).clone()
self.0.read(cx).focus_handle(cx)
}
}
@@ -1480,7 +1469,7 @@ mod tests {
let completions = editor.current_completions().expect("Missing completions");
completions
.into_iter()
.map(|completion| completion.label.text.to_string())
.map(|completion| completion.label.text)
.collect::<Vec<_>>()
}

View File

@@ -226,9 +226,10 @@ impl PickerDelegate for FetchContextPickerDelegate {
_window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Option<Self::ListItem> {
let added = self.context_store.upgrade().map_or(false, |context_store| {
context_store.read(cx).includes_url(&self.url)
});
let added = self
.context_store
.upgrade()
.is_some_and(|context_store| context_store.read(cx).includes_url(&self.url));
Some(
ListItem::new(ix)

View File

@@ -239,9 +239,7 @@ pub(crate) fn search_files(
PathMatchCandidateSet {
snapshot: worktree.snapshot(),
include_ignored: worktree
.root_entry()
.map_or(false, |entry| entry.is_ignored),
include_ignored: worktree.root_entry().is_some_and(|entry| entry.is_ignored),
include_root_name: true,
candidates: project::Candidates::Entries,
}

View File

@@ -159,7 +159,7 @@ pub fn render_thread_context_entry(
context_store: WeakEntity<ContextStore>,
cx: &mut App,
) -> Div {
let added = context_store.upgrade().map_or(false, |context_store| {
let added = context_store.upgrade().is_some_and(|context_store| {
context_store
.read(cx)
.includes_user_rules(user_rules.prompt_id)

View File

@@ -294,7 +294,7 @@ pub(crate) fn search_symbols(
.partition(|candidate| {
project
.entry_for_path(&symbols[candidate.id].path, cx)
.map_or(false, |e| !e.is_ignored)
.is_some_and(|e| !e.is_ignored)
})
})
.log_err()

View File

@@ -236,12 +236,10 @@ pub fn render_thread_context_entry(
let is_added = match entry {
ThreadContextEntry::Thread { id, .. } => context_store
.upgrade()
.map_or(false, |ctx_store| ctx_store.read(cx).includes_thread(id)),
ThreadContextEntry::Context { path, .. } => {
context_store.upgrade().map_or(false, |ctx_store| {
ctx_store.read(cx).includes_text_thread(path)
})
}
.is_some_and(|ctx_store| ctx_store.read(cx).includes_thread(id)),
ThreadContextEntry::Context { path, .. } => context_store
.upgrade()
.is_some_and(|ctx_store| ctx_store.read(cx).includes_text_thread(path)),
};
h_flex()

View File

@@ -368,10 +368,10 @@ impl ContextStrip {
_window: &mut Window,
cx: &mut Context<Self>,
) {
if let Some(suggested) = self.suggested_context(cx) {
if self.is_suggested_focused(&self.added_contexts(cx)) {
self.add_suggested_context(&suggested, cx);
}
if let Some(suggested) = self.suggested_context(cx)
&& self.is_suggested_focused(&self.added_contexts(cx))
{
self.add_suggested_context(&suggested, cx);
}
}
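
This hunk, like several others in the diff, folds nested `if let`/`if` blocks into a single let-chain. A standalone sketch of the pattern (let-chains require the Rust 2024 edition):

```rust
// Let-chains: multiple `let` bindings and boolean conditions joined with && in one `if`.
fn first_alpha_char(input: Option<&str>) -> Option<char> {
    // Before: each condition added a level of nesting.
    // if let Some(text) = input {
    //     if let Some(ch) = text.chars().next() {
    //         if ch.is_alphabetic() {
    //             return Some(ch);
    //         }
    //     }
    // }
    // None

    // After: one let-chain expresses the same fallthrough.
    if let Some(text) = input
        && let Some(ch) = text.chars().next()
        && ch.is_alphabetic()
    {
        Some(ch)
    } else {
        None
    }
}

fn main() {
    assert_eq!(first_alpha_char(Some("zed")), Some('z'));
    assert_eq!(first_alpha_char(Some("42")), None);
    assert_eq!(first_alpha_char(None), None);
}
```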

View File

@@ -182,13 +182,13 @@ impl InlineAssistant {
match event {
workspace::Event::UserSavedItem { item, .. } => {
// When the user manually saves an editor, automatically accepts all finished transformations.
if let Some(editor) = item.upgrade().and_then(|item| item.act_as::<Editor>(cx)) {
if let Some(editor_assists) = self.assists_by_editor.get(&editor.downgrade()) {
for assist_id in editor_assists.assist_ids.clone() {
let assist = &self.assists[&assist_id];
if let CodegenStatus::Done = assist.codegen.read(cx).status(cx) {
self.finish_assist(assist_id, false, window, cx)
}
if let Some(editor) = item.upgrade().and_then(|item| item.act_as::<Editor>(cx))
&& let Some(editor_assists) = self.assists_by_editor.get(&editor.downgrade())
{
for assist_id in editor_assists.assist_ids.clone() {
let assist = &self.assists[&assist_id];
if let CodegenStatus::Done = assist.codegen.read(cx).status(cx) {
self.finish_assist(assist_id, false, window, cx)
}
}
}
@@ -342,13 +342,11 @@ impl InlineAssistant {
)
.await
.ok();
if let Some(answer) = answer {
if answer == 0 {
cx.update(|window, cx| {
window.dispatch_action(Box::new(OpenSettings), cx)
})
if let Some(answer) = answer
&& answer == 0
{
cx.update(|window, cx| window.dispatch_action(Box::new(OpenSettings), cx))
.ok();
}
}
anyhow::Ok(())
})
@@ -435,11 +433,11 @@ impl InlineAssistant {
}
}
if let Some(prev_selection) = selections.last_mut() {
if selection.start <= prev_selection.end {
prev_selection.end = selection.end;
continue;
}
if let Some(prev_selection) = selections.last_mut()
&& selection.start <= prev_selection.end
{
prev_selection.end = selection.end;
continue;
}
let latest_selection = newest_selection.get_or_insert_with(|| selection.clone());
@@ -985,14 +983,13 @@ impl InlineAssistant {
EditorEvent::SelectionsChanged { .. } => {
for assist_id in editor_assists.assist_ids.clone() {
let assist = &self.assists[&assist_id];
if let Some(decorations) = assist.decorations.as_ref() {
if decorations
if let Some(decorations) = assist.decorations.as_ref()
&& decorations
.prompt_editor
.focus_handle(cx)
.is_focused(window)
{
return;
}
{
return;
}
}
@@ -1123,7 +1120,7 @@ impl InlineAssistant {
if editor_assists
.scroll_lock
.as_ref()
.map_or(false, |lock| lock.assist_id == assist_id)
.is_some_and(|lock| lock.assist_id == assist_id)
{
editor_assists.scroll_lock = None;
}
@@ -1503,20 +1500,18 @@ impl InlineAssistant {
window: &mut Window,
cx: &mut App,
) -> Option<InlineAssistTarget> {
if let Some(terminal_panel) = workspace.panel::<TerminalPanel>(cx) {
if terminal_panel
if let Some(terminal_panel) = workspace.panel::<TerminalPanel>(cx)
&& terminal_panel
.read(cx)
.focus_handle(cx)
.contains_focused(window, cx)
{
if let Some(terminal_view) = terminal_panel.read(cx).pane().and_then(|pane| {
pane.read(cx)
.active_item()
.and_then(|t| t.downcast::<TerminalView>())
}) {
return Some(InlineAssistTarget::Terminal(terminal_view));
}
}
&& let Some(terminal_view) = terminal_panel.read(cx).pane().and_then(|pane| {
pane.read(cx)
.active_item()
.and_then(|t| t.downcast::<TerminalView>())
})
{
return Some(InlineAssistTarget::Terminal(terminal_view));
}
let context_editor = agent_panel
@@ -1537,13 +1532,11 @@ impl InlineAssistant {
.and_then(|item| item.act_as::<Editor>(cx))
{
Some(InlineAssistTarget::Editor(workspace_editor))
} else if let Some(terminal_view) = workspace
.active_item(cx)
.and_then(|item| item.act_as::<TerminalView>(cx))
{
Some(InlineAssistTarget::Terminal(terminal_view))
} else {
None
workspace
.active_item(cx)
.and_then(|item| item.act_as::<TerminalView>(cx))
.map(InlineAssistTarget::Terminal)
}
}
}
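
The hunk above replaces an `if let … { Some(…) } else { None }` tail with `.map(InlineAssistTarget::Terminal)`, relying on the fact that tuple-variant constructors are ordinary functions. A minimal sketch with a made-up enum:

```rust
#[derive(Debug)]
enum Target {
    Editor(String),
    Terminal(String),
}

fn active_terminal() -> Option<String> {
    Some("terminal-1".into())
}

fn main() {
    // Before:
    // let target = if let Some(view) = active_terminal() {
    //     Some(Target::Terminal(view))
    // } else {
    //     None
    // };

    // After: a tuple-variant constructor is a plain fn, so it slots straight into `map`.
    let target = active_terminal().map(Target::Terminal);
    println!("{target:?}");

    let _editor = Target::Editor("editor-1".into());
}
```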
@@ -1698,7 +1691,7 @@ impl InlineAssist {
}),
range,
codegen: codegen.clone(),
workspace: workspace.clone(),
workspace,
_subscriptions: vec![
window.on_focus_in(&prompt_editor_focus_handle, cx, move |_, cx| {
InlineAssistant::update_global(cx, |this, cx| {
@@ -1741,22 +1734,20 @@ impl InlineAssist {
return;
};
if let CodegenStatus::Error(error) = codegen.read(cx).status(cx) {
if assist.decorations.is_none() {
if let Some(workspace) = assist.workspace.upgrade() {
let error = format!("Inline assistant error: {}", error);
workspace.update(cx, |workspace, cx| {
struct InlineAssistantError;
if let CodegenStatus::Error(error) = codegen.read(cx).status(cx)
&& assist.decorations.is_none()
&& let Some(workspace) = assist.workspace.upgrade()
{
let error = format!("Inline assistant error: {}", error);
workspace.update(cx, |workspace, cx| {
struct InlineAssistantError;
let id =
NotificationId::composite::<InlineAssistantError>(
assist_id.0,
);
let id = NotificationId::composite::<InlineAssistantError>(
assist_id.0,
);
workspace.show_toast(Toast::new(id, error), cx);
})
}
}
workspace.show_toast(Toast::new(id, error), cx);
})
}
if assist.decorations.is_none() {
@@ -1821,18 +1812,18 @@ impl CodeActionProvider for AssistantCodeActionProvider {
has_diagnostics = true;
}
if has_diagnostics {
if let Some(symbols_containing_start) = snapshot.symbols_containing(range.start, None) {
if let Some(symbol) = symbols_containing_start.last() {
range.start = cmp::min(range.start, symbol.range.start.to_point(&snapshot));
range.end = cmp::max(range.end, symbol.range.end.to_point(&snapshot));
}
if let Some(symbols_containing_start) = snapshot.symbols_containing(range.start, None)
&& let Some(symbol) = symbols_containing_start.last()
{
range.start = cmp::min(range.start, symbol.range.start.to_point(&snapshot));
range.end = cmp::max(range.end, symbol.range.end.to_point(&snapshot));
}
if let Some(symbols_containing_end) = snapshot.symbols_containing(range.end, None) {
if let Some(symbol) = symbols_containing_end.last() {
range.start = cmp::min(range.start, symbol.range.start.to_point(&snapshot));
range.end = cmp::max(range.end, symbol.range.end.to_point(&snapshot));
}
if let Some(symbols_containing_end) = snapshot.symbols_containing(range.end, None)
&& let Some(symbol) = symbols_containing_end.last()
{
range.start = cmp::min(range.start, symbol.range.start.to_point(&snapshot));
range.end = cmp::max(range.end, symbol.range.end.to_point(&snapshot));
}
Task::ready(Ok(vec![CodeAction {

View File

@@ -345,7 +345,7 @@ impl<T: 'static> PromptEditor<T> {
let prompt = self.editor.read(cx).text(cx);
if self
.prompt_history_ix
.map_or(true, |ix| self.prompt_history[ix] != prompt)
.is_none_or(|ix| self.prompt_history[ix] != prompt)
{
self.prompt_history_ix.take();
self.pending_prompt = prompt;
@@ -1229,27 +1229,27 @@ pub enum GenerationMode {
impl GenerationMode {
fn start_label(self) -> &'static str {
match self {
GenerationMode::Generate { .. } => "Generate",
GenerationMode::Generate => "Generate",
GenerationMode::Transform => "Transform",
}
}
fn tooltip_interrupt(self) -> &'static str {
match self {
GenerationMode::Generate { .. } => "Interrupt Generation",
GenerationMode::Generate => "Interrupt Generation",
GenerationMode::Transform => "Interrupt Transform",
}
}
fn tooltip_restart(self) -> &'static str {
match self {
GenerationMode::Generate { .. } => "Restart Generation",
GenerationMode::Generate => "Restart Generation",
GenerationMode::Transform => "Restart Transform",
}
}
fn tooltip_accept(self) -> &'static str {
match self {
GenerationMode::Generate { .. } => "Accept Generation",
GenerationMode::Generate => "Accept Generation",
GenerationMode::Transform => "Accept Transform",
}
}

View File

@@ -93,7 +93,7 @@ impl LanguageModelPickerDelegate {
let entries = models.entries();
Self {
on_model_changed: on_model_changed.clone(),
on_model_changed,
all_models: Arc::new(models),
selected_index: Self::get_active_model_index(&entries, get_active_model(cx)),
filtered_entries: entries,
@@ -514,7 +514,7 @@ impl PickerDelegate for LanguageModelPickerDelegate {
.pl_0p5()
.gap_1p5()
.w(px(240.))
.child(Label::new(model_info.model.name().0.clone()).truncate()),
.child(Label::new(model_info.model.name().0).truncate()),
)
.end_slot(div().pr_3().when(is_selected, |this| {
this.child(

View File

@@ -117,7 +117,7 @@ pub(crate) fn create_editor(
let mut editor = Editor::new(
editor::EditorMode::AutoHeight {
min_lines,
max_lines: max_lines,
max_lines,
},
buffer,
None,
@@ -156,7 +156,7 @@ impl ProfileProvider for Entity<Thread> {
fn profiles_supported(&self, cx: &App) -> bool {
self.read(cx)
.configured_model()
.map_or(false, |model| model.model.supports_tools())
.is_some_and(|model| model.model.supports_tools())
}
fn profile_id(&self, cx: &App) -> AgentProfileId {
@@ -215,9 +215,10 @@ impl MessageEditor {
let subscriptions = vec![
cx.subscribe_in(&context_strip, window, Self::handle_context_strip_event),
cx.subscribe(&editor, |this, _, event, cx| match event {
EditorEvent::BufferEdited => this.handle_message_changed(cx),
_ => {}
cx.subscribe(&editor, |this, _, event: &EditorEvent, cx| {
if event == &EditorEvent::BufferEdited {
this.handle_message_changed(cx)
}
}),
cx.observe(&context_store, |this, _, cx| {
// When context changes, reload it for token counting.
@@ -247,7 +248,7 @@ impl MessageEditor {
editor: editor.clone(),
project: thread.read(cx).project().clone(),
thread,
incompatible_tools_state: incompatible_tools.clone(),
incompatible_tools_state: incompatible_tools,
workspace,
context_store,
prompt_store,
@@ -838,7 +839,6 @@ impl MessageEditor {
.child(self.profile_selector.clone())
.child(self.model_selector.clone())
.map({
let focus_handle = focus_handle.clone();
move |parent| {
if is_generating {
parent
@@ -1132,7 +1132,7 @@ impl MessageEditor {
)
.when(is_edit_changes_expanded, |parent| {
parent.child(
v_flex().children(changed_buffers.into_iter().enumerate().flat_map(
v_flex().children(changed_buffers.iter().enumerate().flat_map(
|(index, (buffer, _diff))| {
let file = buffer.read(cx).file()?;
let path = file.path();
@@ -1289,7 +1289,7 @@ impl MessageEditor {
self.thread
.read(cx)
.configured_model()
.map_or(false, |model| model.provider.id() == ZED_CLOUD_PROVIDER_ID)
.is_some_and(|model| model.provider.id() == ZED_CLOUD_PROVIDER_ID)
}
fn render_usage_callout(&self, line_height: Pixels, cx: &mut Context<Self>) -> Option<Div> {
@@ -1442,7 +1442,7 @@ impl MessageEditor {
let message_text = editor.read(cx).text(cx);
if message_text.is_empty()
&& loaded_context.map_or(true, |loaded_context| loaded_context.is_empty())
&& loaded_context.is_none_or(|loaded_context| loaded_context.is_empty())
{
return None;
}
@@ -1605,7 +1605,8 @@ pub fn extract_message_creases(
.collect::<HashMap<_, _>>();
// Filter the addon's list of creases based on what the editor reports,
// since the addon might have removed creases in it.
let creases = editor.display_map.update(cx, |display_map, cx| {
editor.display_map.update(cx, |display_map, cx| {
display_map
.snapshot(cx)
.crease_snapshot
@@ -1629,8 +1630,7 @@ pub fn extract_message_creases(
}
})
.collect()
});
creases
})
}
impl EventEmitter<MessageEditorEvent> for MessageEditor {}
@@ -1682,7 +1682,7 @@ impl Render for MessageEditor {
let has_history = self
.history_store
.as_ref()
.and_then(|hs| hs.update(cx, |hs, cx| hs.entries(cx).len() > 0).ok())
.and_then(|hs| hs.update(cx, |hs, cx| !hs.entries(cx).is_empty()).ok())
.unwrap_or(false)
|| self
.thread
@@ -1695,7 +1695,7 @@ impl Render for MessageEditor {
!has_history && is_signed_out && has_configured_providers,
|this| this.child(cx.new(ApiKeysWithProviders::new)),
)
.when(changed_buffers.len() > 0, |parent| {
.when(!changed_buffers.is_empty(), |parent| {
parent.child(self.render_edits_bar(&changed_buffers, window, cx))
})
.child(self.render_editor(window, cx))
@@ -1800,7 +1800,7 @@ impl AgentPreview for MessageEditor {
.bg(cx.theme().colors().panel_background)
.border_1()
.border_color(cx.theme().colors().border)
.child(default_message_editor.clone())
.child(default_message_editor)
.into_any_element(),
)])
.into_any_element(),
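
Two of the smaller cleanups in this file are the redundant `max_lines: max_lines` initializer and the `.len() > 0` checks. A tiny sketch of both, with a made-up struct mirroring the shape of the hunks:

```rust
#[derive(Debug)]
struct AutoHeight {
    min_lines: usize,
    max_lines: usize,
}

fn main() {
    let max_lines = 8;

    // Field-init shorthand: writing `max_lines: max_lines` is redundant.
    let mode = AutoHeight { min_lines: 1, max_lines };

    let changed_buffers: Vec<&str> = vec!["message_editor.rs"];

    // Comparing len() with zero is better written with is_empty()/!is_empty().
    assert_eq!(changed_buffers.len() > 0, !changed_buffers.is_empty());

    println!("{mode:?}, {} changed buffer(s)", changed_buffers.len());
}
```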

View File

@@ -137,12 +137,11 @@ impl ProfileSelector {
entry.handler({
let fs = self.fs.clone();
let provider = self.provider.clone();
let profile_id = profile_id.clone();
move |_window, cx| {
update_settings_file::<AgentSettings>(fs.clone(), cx, {
let profile_id = profile_id.clone();
move |settings, _cx| {
settings.set_profile(profile_id.clone());
settings.set_profile(profile_id);
}
});
@@ -175,7 +174,6 @@ impl Render for ProfileSelector {
PopoverMenu::new("profile-selector")
.trigger_with_tooltip(trigger_button, {
let focus_handle = focus_handle.clone();
move |window, cx| {
Tooltip::for_action_in(
"Toggle Profile Menu",

View File

@@ -88,8 +88,6 @@ impl SlashCommandCompletionProvider {
.map(|(editor, workspace)| {
let command_name = mat.string.clone();
let command_range = command_range.clone();
let editor = editor.clone();
let workspace = workspace.clone();
Arc::new(
move |intent: CompletionIntent,
window: &mut Window,
@@ -158,7 +156,7 @@ impl SlashCommandCompletionProvider {
if let Some(command) = self.slash_commands.command(command_name, cx) {
let completions = command.complete_argument(
arguments,
new_cancel_flag.clone(),
new_cancel_flag,
self.workspace.clone(),
window,
cx,

Some files were not shown because too many files have changed in this diff