Compare commits


20 Commits

Author SHA1 Message Date
Michael Sloan
cfab93eb15 Merge branch 'main' into run-command-on-selection-change 2025-09-08 13:48:34 -06:00
Cole Miller
fa0df6da1c python: Replace pyright with basedpyright (#35362)
Follow-up to #35250. Let's experiment with having this by default on
nightly.

Release Notes:

- Added built-in support for the basedpyright language server for Python
code. basedpyright is now enabled by default, and pyright (previously
the primary Python language server) remains available but is disabled by
default. This supersedes the basedpyright extension, which can be
uninstalled. Advantages of basedpyright over pyright include support for
inlay hints, semantic highlighting, auto-import code actions, and
stricter type checking. To switch back to pyright, add the following
configuration to settings.json:

```json
{
  "languages": {
    "Python": {
      "language_servers": ["pyright", "pylsp", "!basedpyright"]
    }
  }
}
```

---------

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-09-08 19:15:17 +00:00
Cole Miller
99102a84fa ACP over SSH (#37725)
This PR adds support for using external agents in SSH projects via ACP,
including automatic installation of Gemini CLI and Claude Code,
authentication with API keys (for Gemini) and CLI login, and custom
agents from user configuration.

Co-authored-by: maan2003 <manmeetmann2003@gmail.com>

Release Notes:

- agent: Gemini CLI, Claude Code, and custom external agents can now be
used in SSH projects.

---------

Co-authored-by: maan2003 <manmeetmann2003@gmail.com>
2025-09-08 14:19:41 -04:00
Cole Miller
5f01f6d75f agent: Make read_file and edit_file tool call titles more specific (#37639)
For read_file and edit_file, show the worktree-relative path if there's
only one visible worktree, and the "full path" otherwise. Also restores
the display of line numbers for read_file calls.

Release Notes:

- N/A
2025-09-08 12:57:22 -04:00
Dave Waggoner
a66cd820b3 Fix line endings in terminal_hyperlinks.rs (#37654)
Fixes Windows line endings in `terminal_hyperlinks.rs`, which was
originally added with them by accident.

Release Notes:

- N/A
2025-09-08 12:21:47 -04:00
hong jihwan
f07da9d9f2 Correctly parse backslash character on replacement (#37014)
When a keybind contains a backslash character (\\), it is parsed
incorrectly, which results in an invalid keybind configuration.


This patch fixes the issue by ensuring that backslashes are properly
escaped during the parsing process. This allows them to be used as
intended in keybind definitions.

Release Notes:

- Fixed an issue where keybinds containing a backslash character (\\)
failed to be replaced correctly
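
As a sketch of the scenario (the specific binding and action here are made up, and Zed's keymap files accept comments), a keymap entry that exercises the fix might look like:

```json
{
  "context": "Editor",
  "bindings": {
    // A literal backslash must be written as \\ in JSON; before this
    // fix the replacement step mangled it into an invalid keybind.
    "ctrl-\\": "workspace::ToggleLeftDock"
  }
}
```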


## Screenshots
<img width="912" height="530" alt="SCR-20250828-borp"
src="https://github.com/user-attachments/assets/561a040f-575b-4222-ac75-17ab4fa71d07"
/>
<img width="912" height="530" alt="SCR-20250828-bosx"
src="https://github.com/user-attachments/assets/b8e0fb99-549e-4fc9-8609-9b9aa2004656"
/>
2025-09-08 12:17:48 -04:00
Iha Shin (신의하)
8d05bb090c Add injections for Isograph function calls in JavaScript and TypeScript (#36320)
Required for https://github.com/isographlabs/isograph/pull/568 to work
properly. Tested with a local build and made sure everything's working
great!

Release Notes:

- JavaScript/TypeScript/JSX: Added support for injecting Isograph language support into `iso`
function calls
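
For illustration, a call whose string argument now receives Isograph highlighting; the `iso` helper below is a stand-in (the real one comes from the Isograph runtime, which compiles the literal):

```typescript
// Stand-in for the Isograph runtime's `iso` entrypoint; identity is
// enough to show the shape of the call the injection targets.
const iso = (doc: string): string => doc;

// Zed now injects Isograph syntax highlighting into this argument.
const homePageField = iso(`
  field Query.HomePage @component {
    Header
  }
`);
```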
2025-09-08 16:04:37 +00:00
Dino
2325f14713 diagnostics: Current file diagnostics view (#34430)
These changes introduce a new command to the Diagnostics panel,
`diagnostics: deploy current file`, which allows the user to view the
diagnostics only for the currently opened file.

Here's a screen recording showing these changes in action 🔽 

[diagnostics: deploy current
file](https://github.com/user-attachments/assets/b0e87eea-3b3a-4888-95f8-9e21aff8ea97)

Closes #4739 

Release Notes:

- Added new `diagnostics: deploy current file` command to view
diagnostics for the currently open file

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-09-08 09:14:24 -06:00
Ben Kunkle
fe2aa3f4cb onboarding: Fix font loading frame delay (#37668)
Closes #ISSUE

Fixed an issue where the first frame of the `Editing` page in onboarding
would have a slight delay before rendering the first time it was
navigated to. This was caused by listing the OS fonts on the main
thread, blocking rendering. This PR fixes the issue by adding a new
method to the font family cache to prefill the cache on a background
thread.
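
The technique described above (enumerating fonts off the main thread and prefilling a cache) can be sketched generically; the names below are illustrative, not Zed's actual font-cache API:

```rust
use std::sync::OnceLock;
use std::thread;

// One-time cache of OS font family names.
static FONT_FAMILIES: OnceLock<Vec<String>> = OnceLock::new();

// Hypothetical stand-in for the OS font enumeration that was
// blocking the main thread.
fn list_os_fonts() -> Vec<String> {
    vec!["Menlo".into(), "SF Pro".into()]
}

// Kick off enumeration on a background thread so the first frame
// that needs the list doesn't pay for it.
fn prefill_font_cache() -> thread::JoinHandle<()> {
    thread::spawn(|| {
        let _ = FONT_FAMILIES.set(list_os_fonts());
    })
}
```

Callers that later need the list read `FONT_FAMILIES.get()`, falling back to a synchronous enumeration only if the prefill has not finished.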

Release Notes:

- N/A

---------

Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
Co-authored-by: Anthony Eid <hello@anthonyeid.me>
Co-authored-by: Anthony <anthony@zed.dev>
2025-09-08 11:09:54 -04:00
fantacell
10989c702c helix: Add match operator (#34060)
This is an implementation of matching like "m i (", as well as "] (" and
"[ (" in `helix_mode` with a few supported objects and a basis for more.

Release Notes:

- Added helix operators for selecting text objects

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-09-08 08:48:47 -06:00
张小白
3f80ac0127 macos: Fix menu bar flickering (#37707)
Closes #37526

Release Notes:

- Fixed menu bar flickering when using some IMEs on macOS.
2025-09-08 10:44:19 -04:00
Bennet Bo Fenner
4f1634f95c Remove unused semantic_index crate (#37780)
Release Notes:

- N/A
2025-09-08 13:38:31 +00:00
Eduardo Alba
40eec32cb8 markdown_preview: Fix trimming of leading whitespace in Markdown lists (#35750)
Closes #35712

Release Notes:

- Fixed white-space trimming leading to disconnect between list items
and content in markdown previews.

---------

Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-09-08 12:37:11 +00:00
张小白
17499453f6 windows: Check required GPU/driver feature StructuredBuffer (#37776)
Check whether the GPU/driver supports the StructuredBuffer feature
required by our shaders. If it doesn’t, log an error and skip that
GPU/driver, so Windows can fall back to the software renderer.
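
The control flow described above can be sketched abstractly; this is not Zed's renderer code, just the probe-log-skip-fallback pattern with made-up types:

```rust
// Illustrative adapter descriptor; a real implementation would query
// the driver (e.g. a D3D feature-support check) instead of a flag.
struct Adapter {
    name: &'static str,
    supports_structured_buffers: bool,
}

// Pick the first adapter with the required feature, logging and
// skipping any that lack it; `None` means fall back to software.
fn pick_adapter(adapters: &[Adapter]) -> Option<&Adapter> {
    adapters.iter().find(|a| {
        if a.supports_structured_buffers {
            true
        } else {
            eprintln!("skipping {}: StructuredBuffer unsupported", a.name);
            false
        }
    })
}
```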

Release Notes:

- N/A
2025-09-08 12:22:57 +00:00
Lukas Wirth
80a4746a46 project: Be explicit about project-searchability for buffers (#37773)
Closes https://github.com/zed-industries/zed/issues/28830

Release Notes:

- Fixed built-in buffers and log views showing up in project search
2025-09-08 11:22:36 +00:00
Jakub Konka
01f5b73e3b cargo: Remove unused -fuse-ld=lld flag from Win config (#37769)
It is unused and generates a warning:

```
 LINK : warning LNK4044: unrecognized option '/fuse-ld=lld'; ignored
```

If in the future we want to give `lld-link.exe` a try, we can set

```toml
linker = "lld-link.exe"
```

instead. At the time of writing, my tests have shown that there is no
real difference between `lld-link` and `link` in terms of linking speed.

Release Notes:

- N/A
2025-09-08 10:43:56 +00:00
Michael Sloan
a1a6031c6a Close stdin 2025-08-19 16:08:12 -06:00
Michael Sloan
2d20b5d850 Log command that is run 2025-08-19 15:53:09 -06:00
Michael Sloan
11ad0b5793 Rerun command if it is a file and the file changes 2025-08-19 15:47:14 -06:00
Michael Sloan
2755cd8ec7 Add ZED_SELECTION_CHANGE_CMD to run a command on selection change 2025-08-19 14:46:16 -06:00
128 changed files with 6898 additions and 6927 deletions


@@ -19,8 +19,6 @@ rustflags = [
"windows_slim_errors", # This cfg will reduce the size of `windows::core::Error` from 16 bytes to 4 bytes
"-C",
"target-feature=+crt-static", # This fixes the linking issue when compiling livekit on Windows
"-C",
"link-arg=-fuse-ld=lld",
]
[env]

107
Cargo.lock generated

@@ -308,22 +308,18 @@ dependencies = [
"libc",
"log",
"nix 0.29.0",
"node_runtime",
"paths",
"project",
"reqwest_client",
"schemars",
"semver",
"serde",
"serde_json",
"settings",
"smol",
"task",
"tempfile",
"thiserror 2.0.12",
"ui",
"util",
"watch",
"which 6.0.3",
"workspace-hack",
]
@@ -2351,19 +2347,6 @@ dependencies = [
"digest",
]
[[package]]
name = "blake3"
version = "1.8.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3888aaa89e4b2a40fca9848e400f6a658a5a3978de7be858e209cafa8be9a4a0"
dependencies = [
"arrayref",
"arrayvec",
"cc",
"cfg-if",
"constant_time_eq 0.3.1",
]
[[package]]
name = "block"
version = "0.1.6"
@@ -3578,12 +3561,6 @@ version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "245097e9a4535ee1e3e3931fcfcd55a796a44c643e8596ff6566d68f09b87bbc"
[[package]]
name = "constant_time_eq"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7c74b8349d32d297c9134b8c88677813a227df8f779daa29bfc29c183fe3dca6"
[[package]]
name = "context_server"
version = "0.1.0"
@@ -5067,6 +5044,7 @@ dependencies = [
"multi_buffer",
"ordered-float 2.10.1",
"parking_lot",
"postage",
"pretty_assertions",
"project",
"rand 0.9.1",
@@ -6164,17 +6142,6 @@ dependencies = [
"futures-util",
]
[[package]]
name = "futures-batch"
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6f444c45a1cb86f2a7e301469fd50a82084a60dadc25d94529a8312276ecb71a"
dependencies = [
"futures 0.3.31",
"futures-timer",
"pin-utils",
]
[[package]]
name = "futures-channel"
version = "0.3.31"
@@ -6270,12 +6237,6 @@ version = "0.3.31"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f90f7dce0722e95104fcb095585910c0977252f286e354b5e3bd38902cd99988"
[[package]]
name = "futures-timer"
version = "3.0.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f288b0a4f20f9a56b5d1da57e2227c661b7b16168e2f72365f57b63326e29b24"
[[package]]
name = "futures-util"
version = "0.3.31"
@@ -9221,6 +9182,19 @@ dependencies = [
"x_ai",
]
[[package]]
name = "language_onboarding"
version = "0.1.0"
dependencies = [
"db",
"editor",
"gpui",
"project",
"ui",
"workspace",
"workspace-hack",
]
[[package]]
name = "language_selector"
version = "0.1.0"
@@ -9283,7 +9257,6 @@ dependencies = [
"chrono",
"collections",
"dap",
"feature_flags",
"futures 0.3.31",
"gpui",
"http_client",
@@ -12641,6 +12614,7 @@ dependencies = [
"remote",
"rpc",
"schemars",
"semver",
"serde",
"serde_json",
"settings",
@@ -12660,6 +12634,7 @@ dependencies = [
"unindent",
"url",
"util",
"watch",
"which 6.0.3",
"workspace-hack",
"worktree",
@@ -14688,49 +14663,6 @@ version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0f7d95a54511e0c7be3f51e8867aa8cf35148d7b9445d44de2f943e2b206e749"
[[package]]
name = "semantic_index"
version = "0.1.0"
dependencies = [
"anyhow",
"arrayvec",
"blake3",
"client",
"clock",
"collections",
"feature_flags",
"fs",
"futures 0.3.31",
"futures-batch",
"gpui",
"heed",
"http_client",
"language",
"language_model",
"languages",
"log",
"open_ai",
"parking_lot",
"project",
"reqwest_client",
"serde",
"serde_json",
"settings",
"sha2",
"smol",
"streaming-iterator",
"tempfile",
"theme",
"tree-sitter",
"ui",
"unindent",
"util",
"workspace",
"workspace-hack",
"worktree",
"zlog",
]
[[package]]
name = "semantic_version"
version = "0.1.0"
@@ -20446,7 +20378,6 @@ dependencies = [
"acp_tools",
"activity_indicator",
"agent",
"agent_servers",
"agent_settings",
"agent_ui",
"anyhow",
@@ -20510,6 +20441,7 @@ dependencies = [
"language_extension",
"language_model",
"language_models",
"language_onboarding",
"language_selector",
"language_tools",
"languages",
@@ -20532,6 +20464,7 @@ dependencies = [
"parking_lot",
"paths",
"picker",
"postage",
"pretty_assertions",
"profiling",
"project",
@@ -20915,7 +20848,7 @@ dependencies = [
"aes",
"byteorder",
"bzip2",
"constant_time_eq 0.1.5",
"constant_time_eq",
"crc32fast",
"crossbeam-utils",
"flate2",


@@ -94,6 +94,7 @@ members = [
"crates/language_extension",
"crates/language_model",
"crates/language_models",
"crates/language_onboarding",
"crates/language_selector",
"crates/language_tools",
"crates/languages",
@@ -143,7 +144,6 @@ members = [
"crates/rules_library",
"crates/schema_generator",
"crates/search",
"crates/semantic_index",
"crates/semantic_version",
"crates/session",
"crates/settings",
@@ -321,6 +321,7 @@ language = { path = "crates/language" }
language_extension = { path = "crates/language_extension" }
language_model = { path = "crates/language_model" }
language_models = { path = "crates/language_models" }
language_onboarding = { path = "crates/language_onboarding" }
language_selector = { path = "crates/language_selector" }
language_tools = { path = "crates/language_tools" }
languages = { path = "crates/languages" }
@@ -373,7 +374,6 @@ rope = { path = "crates/rope" }
rpc = { path = "crates/rpc" }
rules_library = { path = "crates/rules_library" }
search = { path = "crates/search" }
semantic_index = { path = "crates/semantic_index" }
semantic_version = { path = "crates/semantic_version" }
session = { path = "crates/session" }
settings = { path = "crates/settings" }


@@ -32,34 +32,6 @@
"(": "vim::SentenceBackward",
")": "vim::SentenceForward",
"|": "vim::GoToColumn",
"] ]": "vim::NextSectionStart",
"] [": "vim::NextSectionEnd",
"[ [": "vim::PreviousSectionStart",
"[ ]": "vim::PreviousSectionEnd",
"] m": "vim::NextMethodStart",
"] shift-m": "vim::NextMethodEnd",
"[ m": "vim::PreviousMethodStart",
"[ shift-m": "vim::PreviousMethodEnd",
"[ *": "vim::PreviousComment",
"[ /": "vim::PreviousComment",
"] *": "vim::NextComment",
"] /": "vim::NextComment",
"[ -": "vim::PreviousLesserIndent",
"[ +": "vim::PreviousGreaterIndent",
"[ =": "vim::PreviousSameIndent",
"] -": "vim::NextLesserIndent",
"] +": "vim::NextGreaterIndent",
"] =": "vim::NextSameIndent",
"] b": "pane::ActivateNextItem",
"[ b": "pane::ActivatePreviousItem",
"] shift-b": "pane::ActivateLastItem",
"[ shift-b": ["pane::ActivateItem", 0],
"] space": "vim::InsertEmptyLineBelow",
"[ space": "vim::InsertEmptyLineAbove",
"[ e": "editor::MoveLineUp",
"] e": "editor::MoveLineDown",
"[ f": "workspace::FollowNextCollaborator",
"] f": "workspace::FollowNextCollaborator",
// Word motions
"w": "vim::NextWordStart",
@@ -83,10 +55,6 @@
"n": "vim::MoveToNextMatch",
"shift-n": "vim::MoveToPreviousMatch",
"%": "vim::Matching",
"] }": ["vim::UnmatchedForward", { "char": "}" }],
"[ {": ["vim::UnmatchedBackward", { "char": "{" }],
"] )": ["vim::UnmatchedForward", { "char": ")" }],
"[ (": ["vim::UnmatchedBackward", { "char": "(" }],
"f": ["vim::PushFindForward", { "before": false, "multiline": false }],
"t": ["vim::PushFindForward", { "before": true, "multiline": false }],
"shift-f": ["vim::PushFindBackward", { "after": false, "multiline": false }],
@@ -219,6 +187,46 @@
".": "vim::Repeat"
}
},
{
"context": "vim_mode == normal || vim_mode == visual || vim_mode == operator",
"bindings": {
"] ]": "vim::NextSectionStart",
"] [": "vim::NextSectionEnd",
"[ [": "vim::PreviousSectionStart",
"[ ]": "vim::PreviousSectionEnd",
"] m": "vim::NextMethodStart",
"] shift-m": "vim::NextMethodEnd",
"[ m": "vim::PreviousMethodStart",
"[ shift-m": "vim::PreviousMethodEnd",
"[ *": "vim::PreviousComment",
"[ /": "vim::PreviousComment",
"] *": "vim::NextComment",
"] /": "vim::NextComment",
"[ -": "vim::PreviousLesserIndent",
"[ +": "vim::PreviousGreaterIndent",
"[ =": "vim::PreviousSameIndent",
"] -": "vim::NextLesserIndent",
"] +": "vim::NextGreaterIndent",
"] =": "vim::NextSameIndent",
"] b": "pane::ActivateNextItem",
"[ b": "pane::ActivatePreviousItem",
"] shift-b": "pane::ActivateLastItem",
"[ shift-b": ["pane::ActivateItem", 0],
"] space": "vim::InsertEmptyLineBelow",
"[ space": "vim::InsertEmptyLineAbove",
"[ e": "editor::MoveLineUp",
"] e": "editor::MoveLineDown",
"[ f": "workspace::FollowNextCollaborator",
"] f": "workspace::FollowNextCollaborator",
"] }": ["vim::UnmatchedForward", { "char": "}" }],
"[ {": ["vim::UnmatchedBackward", { "char": "{" }],
"] )": ["vim::UnmatchedForward", { "char": ")" }],
"[ (": ["vim::UnmatchedBackward", { "char": "(" }],
// tree-sitter related commands
"[ x": "vim::SelectLargerSyntaxNode",
"] x": "vim::SelectSmallerSyntaxNode"
}
},
{
"context": "vim_mode == normal",
"bindings": {
@@ -249,9 +257,6 @@
"g w": "vim::PushRewrap",
"g q": "vim::PushRewrap",
"insert": "vim::InsertBefore",
// tree-sitter related commands
"[ x": "vim::SelectLargerSyntaxNode",
"] x": "vim::SelectSmallerSyntaxNode",
"] d": "editor::GoToDiagnostic",
"[ d": "editor::GoToPreviousDiagnostic",
"] c": "editor::GoToHunk",
@@ -317,10 +322,7 @@
"g w": "vim::Rewrap",
"g ?": "vim::ConvertToRot13",
// "g ?": "vim::ConvertToRot47",
"\"": "vim::PushRegister",
// tree-sitter related commands
"[ x": "editor::SelectLargerSyntaxNode",
"] x": "editor::SelectSmallerSyntaxNode"
"\"": "vim::PushRegister"
}
},
{
@@ -397,6 +399,9 @@
"ctrl-[": "editor::Cancel",
";": "vim::HelixCollapseSelection",
":": "command_palette::Toggle",
"m": "vim::PushHelixMatch",
"]": ["vim::PushHelixNext", { "around": true }],
"[": ["vim::PushHelixPrevious", { "around": true }],
"left": "vim::WrappingLeft",
"right": "vim::WrappingRight",
"h": "vim::WrappingLeft",
@@ -419,13 +424,6 @@
"insert": "vim::InsertBefore",
"alt-.": "vim::RepeatFind",
"alt-s": ["editor::SplitSelectionIntoLines", { "keep_selections": true }],
// tree-sitter related commands
"[ x": "editor::SelectLargerSyntaxNode",
"] x": "editor::SelectSmallerSyntaxNode",
"] d": "editor::GoToDiagnostic",
"[ d": "editor::GoToPreviousDiagnostic",
"] c": "editor::GoToHunk",
"[ c": "editor::GoToPreviousHunk",
// Goto mode
"g n": "pane::ActivateNextItem",
"g p": "pane::ActivatePreviousItem",
@@ -469,9 +467,6 @@
"space c": "editor::ToggleComments",
"space y": "editor::Copy",
"space p": "editor::Paste",
// Match mode
"m m": "vim::Matching",
"m i w": ["workspace::SendKeystrokes", "v i w"],
"shift-u": "editor::Redo",
"ctrl-c": "editor::ToggleComments",
"d": "vim::HelixDelete",
@@ -540,7 +535,7 @@
}
},
{
"context": "vim_operator == a || vim_operator == i || vim_operator == cs",
"context": "vim_operator == a || vim_operator == i || vim_operator == cs || vim_operator == helix_next || vim_operator == helix_previous",
"bindings": {
"w": "vim::Word",
"shift-w": ["vim::Word", { "ignore_punctuation": true }],
@@ -577,6 +572,48 @@
"e": "vim::EntireFile"
}
},
{
"context": "vim_operator == helix_m",
"bindings": {
"m": "vim::Matching"
}
},
{
"context": "vim_operator == helix_next",
"bindings": {
"z": "vim::NextSectionStart",
"shift-z": "vim::NextSectionEnd",
"*": "vim::NextComment",
"/": "vim::NextComment",
"-": "vim::NextLesserIndent",
"+": "vim::NextGreaterIndent",
"=": "vim::NextSameIndent",
"b": "pane::ActivateNextItem",
"shift-b": "pane::ActivateLastItem",
"x": "editor::SelectSmallerSyntaxNode",
"d": "editor::GoToDiagnostic",
"c": "editor::GoToHunk",
"space": "vim::InsertEmptyLineBelow"
}
},
{
"context": "vim_operator == helix_previous",
"bindings": {
"z": "vim::PreviousSectionStart",
"shift-z": "vim::PreviousSectionEnd",
"*": "vim::PreviousComment",
"/": "vim::PreviousComment",
"-": "vim::PreviousLesserIndent",
"+": "vim::PreviousGreaterIndent",
"=": "vim::PreviousSameIndent",
"b": "pane::ActivatePreviousItem",
"shift-b": ["pane::ActivateItem", 0],
"x": "editor::SelectLargerSyntaxNode",
"d": "editor::GoToPreviousDiagnostic",
"c": "editor::GoToPreviousHunk",
"space": "vim::InsertEmptyLineAbove"
}
},
{
"context": "vim_operator == c",
"bindings": {


@@ -2758,7 +2758,7 @@ mod tests {
}));
let thread = cx
.update(|cx| connection.new_thread(project, Path::new("/test"), cx))
.update(|cx| connection.new_thread(project, Path::new(path!("/test")), cx))
.await
.unwrap();


@@ -212,7 +212,8 @@ impl ActivityIndicator {
server_name,
status,
} => {
let create_buffer = project.update(cx, |project, cx| project.create_buffer(cx));
let create_buffer =
project.update(cx, |project, cx| project.create_buffer(false, cx));
let status = status.clone();
let server_name = server_name.clone();
cx.spawn_in(window, async move |workspace, cx| {


@@ -35,10 +35,15 @@ impl AgentServer for NativeAgentServer {
fn connect(
&self,
_root_dir: &Path,
_root_dir: Option<&Path>,
delegate: AgentServerDelegate,
cx: &mut App,
) -> Task<Result<Rc<dyn acp_thread::AgentConnection>>> {
) -> Task<
Result<(
Rc<dyn acp_thread::AgentConnection>,
Option<task::SpawnInTerminal>,
)>,
> {
log::debug!(
"NativeAgentServer::connect called for path: {:?}",
_root_dir
@@ -60,7 +65,10 @@ impl AgentServer for NativeAgentServer {
let connection = NativeAgentConnection(agent);
log::debug!("NativeAgentServer connection established successfully");
Ok(Rc::new(connection) as Rc<dyn acp_thread::AgentConnection>)
Ok((
Rc::new(connection) as Rc<dyn acp_thread::AgentConnection>,
None,
))
})
}


@@ -24,7 +24,11 @@ impl AgentTool for EchoTool {
acp::ToolKind::Other
}
fn initial_title(&self, _input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
_input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
"Echo".into()
}
@@ -55,7 +59,11 @@ impl AgentTool for DelayTool {
"delay"
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
if let Ok(input) = input {
format!("Delay {}ms", input.ms).into()
} else {
@@ -100,7 +108,11 @@ impl AgentTool for ToolRequiringPermission {
acp::ToolKind::Other
}
fn initial_title(&self, _input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
_input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
"This tool requires permission".into()
}
@@ -135,7 +147,11 @@ impl AgentTool for InfiniteTool {
acp::ToolKind::Other
}
fn initial_title(&self, _input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
_input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
"Infinite Tool".into()
}
@@ -186,7 +202,11 @@ impl AgentTool for WordListTool {
acp::ToolKind::Other
}
fn initial_title(&self, _input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
_input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
"List of random words".into()
}


@@ -741,7 +741,7 @@ impl Thread {
return;
};
let title = tool.initial_title(tool_use.input.clone());
let title = tool.initial_title(tool_use.input.clone(), cx);
let kind = tool.kind();
stream.send_tool_call(&tool_use.id, title, kind, tool_use.input.clone());
@@ -1062,7 +1062,11 @@ impl Thread {
self.action_log.clone(),
));
self.add_tool(DiagnosticsTool::new(self.project.clone()));
self.add_tool(EditFileTool::new(cx.weak_entity(), language_registry));
self.add_tool(EditFileTool::new(
self.project.clone(),
cx.weak_entity(),
language_registry,
));
self.add_tool(FetchTool::new(self.project.read(cx).client().http_client()));
self.add_tool(FindPathTool::new(self.project.clone()));
self.add_tool(GrepTool::new(self.project.clone()));
@@ -1514,7 +1518,7 @@ impl Thread {
let mut title = SharedString::from(&tool_use.name);
let mut kind = acp::ToolKind::Other;
if let Some(tool) = tool.as_ref() {
title = tool.initial_title(tool_use.input.clone());
title = tool.initial_title(tool_use.input.clone(), cx);
kind = tool.kind();
}
@@ -2148,7 +2152,11 @@ where
fn kind() -> acp::ToolKind;
/// The initial tool title to display. Can be updated during the tool run.
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString;
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
cx: &mut App,
) -> SharedString;
/// Returns the JSON schema that describes the tool's input.
fn input_schema(&self, format: LanguageModelToolSchemaFormat) -> Schema {
@@ -2196,7 +2204,7 @@ pub trait AnyAgentTool {
fn name(&self) -> SharedString;
fn description(&self) -> SharedString;
fn kind(&self) -> acp::ToolKind;
fn initial_title(&self, input: serde_json::Value) -> SharedString;
fn initial_title(&self, input: serde_json::Value, _cx: &mut App) -> SharedString;
fn input_schema(&self, format: LanguageModelToolSchemaFormat) -> Result<serde_json::Value>;
fn supported_provider(&self, _provider: &LanguageModelProviderId) -> bool {
true
@@ -2232,9 +2240,9 @@ where
T::kind()
}
fn initial_title(&self, input: serde_json::Value) -> SharedString {
fn initial_title(&self, input: serde_json::Value, _cx: &mut App) -> SharedString {
let parsed_input = serde_json::from_value(input.clone()).map_err(|_| input);
self.0.initial_title(parsed_input)
self.0.initial_title(parsed_input, _cx)
}
fn input_schema(&self, format: LanguageModelToolSchemaFormat) -> Result<serde_json::Value> {


@@ -145,7 +145,7 @@ impl AnyAgentTool for ContextServerTool {
ToolKind::Other
}
fn initial_title(&self, _input: serde_json::Value) -> SharedString {
fn initial_title(&self, _input: serde_json::Value, _cx: &mut App) -> SharedString {
format!("Run MCP tool `{}`", self.tool.name).into()
}
@@ -176,7 +176,7 @@ impl AnyAgentTool for ContextServerTool {
return Task::ready(Err(anyhow!("Context server not found")));
};
let tool_name = self.tool.name.clone();
let authorize = event_stream.authorize(self.initial_title(input.clone()), cx);
let authorize = event_stream.authorize(self.initial_title(input.clone(), cx), cx);
cx.spawn(async move |_cx| {
authorize.await?;


@@ -58,7 +58,11 @@ impl AgentTool for CopyPathTool {
ToolKind::Move
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> ui::SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> ui::SharedString {
if let Ok(input) = input {
let src = MarkdownInlineCode(&input.source_path);
let dest = MarkdownInlineCode(&input.destination_path);


@@ -49,7 +49,11 @@ impl AgentTool for CreateDirectoryTool {
ToolKind::Read
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
if let Ok(input) = input {
format!("Create directory {}", MarkdownInlineCode(&input.path)).into()
} else {


@@ -52,7 +52,11 @@ impl AgentTool for DeletePathTool {
ToolKind::Delete
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
if let Ok(input) = input {
format!("Delete “`{}`”", input.path).into()
} else {


@@ -71,7 +71,11 @@ impl AgentTool for DiagnosticsTool {
acp::ToolKind::Read
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
if let Some(path) = input.ok().and_then(|input| match input.path {
Some(path) if !path.is_empty() => Some(path),
_ => None,


@@ -120,11 +120,17 @@ impl From<EditFileToolOutput> for LanguageModelToolResultContent {
pub struct EditFileTool {
thread: WeakEntity<Thread>,
language_registry: Arc<LanguageRegistry>,
project: Entity<Project>,
}
impl EditFileTool {
pub fn new(thread: WeakEntity<Thread>, language_registry: Arc<LanguageRegistry>) -> Self {
pub fn new(
project: Entity<Project>,
thread: WeakEntity<Thread>,
language_registry: Arc<LanguageRegistry>,
) -> Self {
Self {
project,
thread,
language_registry,
}
@@ -195,22 +201,50 @@ impl AgentTool for EditFileTool {
acp::ToolKind::Edit
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
cx: &mut App,
) -> SharedString {
match input {
Ok(input) => input.display_description.into(),
Ok(input) => self
.project
.read(cx)
.find_project_path(&input.path, cx)
.and_then(|project_path| {
self.project
.read(cx)
.short_full_path_for_project_path(&project_path, cx)
})
.unwrap_or(Path::new(&input.path).into())
.to_string_lossy()
.to_string()
.into(),
Err(raw_input) => {
if let Some(input) =
serde_json::from_value::<EditFileToolPartialInput>(raw_input).ok()
{
let path = input.path.trim();
if !path.is_empty() {
return self
.project
.read(cx)
.find_project_path(&input.path, cx)
.and_then(|project_path| {
self.project
.read(cx)
.short_full_path_for_project_path(&project_path, cx)
})
.unwrap_or(Path::new(&input.path).into())
.to_string_lossy()
.to_string()
.into();
}
let description = input.display_description.trim();
if !description.is_empty() {
return description.to_string().into();
}
let path = input.path.trim().to_string();
if !path.is_empty() {
return path.into();
}
}
DEFAULT_UI_TEXT.into()
@@ -545,7 +579,7 @@ mod tests {
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project,
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
@@ -560,11 +594,12 @@ mod tests {
path: "root/nonexistent_file.txt".into(),
mode: EditFileMode::Edit,
};
Arc::new(EditFileTool::new(thread.downgrade(), language_registry)).run(
input,
ToolCallEventStream::test().0,
cx,
)
Arc::new(EditFileTool::new(
project,
thread.downgrade(),
language_registry,
))
.run(input, ToolCallEventStream::test().0, cx)
})
.await;
assert_eq!(
@@ -743,7 +778,7 @@ mod tests {
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project,
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
@@ -775,6 +810,7 @@ mod tests {
mode: EditFileMode::Overwrite,
};
Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
language_registry.clone(),
))
@@ -833,11 +869,12 @@ mod tests {
path: "root/src/main.rs".into(),
mode: EditFileMode::Overwrite,
};
Arc::new(EditFileTool::new(thread.downgrade(), language_registry)).run(
input,
ToolCallEventStream::test().0,
cx,
)
Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
language_registry,
))
.run(input, ToolCallEventStream::test().0, cx)
});
// Stream the unformatted content
@@ -885,7 +922,7 @@ mod tests {
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project,
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
@@ -918,6 +955,7 @@ mod tests {
mode: EditFileMode::Overwrite,
};
Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
language_registry.clone(),
))
@@ -969,11 +1007,12 @@ mod tests {
path: "root/src/main.rs".into(),
mode: EditFileMode::Overwrite,
};
Arc::new(EditFileTool::new(thread.downgrade(), language_registry)).run(
input,
ToolCallEventStream::test().0,
cx,
)
Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
language_registry,
))
.run(input, ToolCallEventStream::test().0, cx)
});
// Stream the content with trailing whitespace
@@ -1012,7 +1051,7 @@ mod tests {
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project,
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
@@ -1020,7 +1059,11 @@ mod tests {
cx,
)
});
let tool = Arc::new(EditFileTool::new(thread.downgrade(), language_registry));
let tool = Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
language_registry,
));
fs.insert_tree("/root", json!({})).await;
// Test 1: Path with .zed component should require confirmation
@@ -1148,7 +1191,7 @@ mod tests {
let model = Arc::new(FakeLanguageModel::default());
let thread = cx.new(|cx| {
Thread::new(
project,
project.clone(),
cx.new(|_cx| ProjectContext::default()),
context_server_registry,
Templates::new(),
@@ -1156,7 +1199,11 @@ mod tests {
cx,
)
});
let tool = Arc::new(EditFileTool::new(thread.downgrade(), language_registry));
let tool = Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
language_registry,
));
// Test global config paths - these should require confirmation if they exist and are outside the project
let test_cases = vec![
@@ -1264,7 +1311,11 @@ mod tests {
cx,
)
});
let tool = Arc::new(EditFileTool::new(thread.downgrade(), language_registry));
let tool = Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
language_registry,
));
// Test files in different worktrees
let test_cases = vec![
@@ -1344,7 +1395,11 @@ mod tests {
cx,
)
});
let tool = Arc::new(EditFileTool::new(thread.downgrade(), language_registry));
let tool = Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
language_registry,
));
// Test edge cases
let test_cases = vec![
@@ -1427,7 +1482,11 @@ mod tests {
cx,
)
});
let tool = Arc::new(EditFileTool::new(thread.downgrade(), language_registry));
let tool = Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
language_registry,
));
// Test different EditFileMode values
let modes = vec![
@@ -1507,48 +1566,67 @@ mod tests {
cx,
)
});
let tool = Arc::new(EditFileTool::new(thread.downgrade(), language_registry));
let tool = Arc::new(EditFileTool::new(
project,
thread.downgrade(),
language_registry,
));
assert_eq!(
tool.initial_title(Err(json!({
"path": "src/main.rs",
"display_description": "",
"old_string": "old code",
"new_string": "new code"
}))),
"src/main.rs"
);
assert_eq!(
tool.initial_title(Err(json!({
"path": "",
"display_description": "Fix error handling",
"old_string": "old code",
"new_string": "new code"
}))),
"Fix error handling"
);
assert_eq!(
tool.initial_title(Err(json!({
"path": "src/main.rs",
"display_description": "Fix error handling",
"old_string": "old code",
"new_string": "new code"
}))),
"Fix error handling"
);
assert_eq!(
tool.initial_title(Err(json!({
"path": "",
"display_description": "",
"old_string": "old code",
"new_string": "new code"
}))),
DEFAULT_UI_TEXT
);
assert_eq!(
tool.initial_title(Err(serde_json::Value::Null)),
DEFAULT_UI_TEXT
);
cx.update(|cx| {
// ...
assert_eq!(
tool.initial_title(
Err(json!({
"path": "src/main.rs",
"display_description": "",
"old_string": "old code",
"new_string": "new code"
})),
cx
),
"src/main.rs"
);
assert_eq!(
tool.initial_title(
Err(json!({
"path": "",
"display_description": "Fix error handling",
"old_string": "old code",
"new_string": "new code"
})),
cx
),
"Fix error handling"
);
assert_eq!(
tool.initial_title(
Err(json!({
"path": "src/main.rs",
"display_description": "Fix error handling",
"old_string": "old code",
"new_string": "new code"
})),
cx
),
"src/main.rs"
);
assert_eq!(
tool.initial_title(
Err(json!({
"path": "",
"display_description": "",
"old_string": "old code",
"new_string": "new code"
})),
cx
),
DEFAULT_UI_TEXT
);
assert_eq!(
tool.initial_title(Err(serde_json::Value::Null), cx),
DEFAULT_UI_TEXT
);
});
}
#[gpui::test]
@@ -1575,7 +1653,11 @@ mod tests {
// Ensure the diff is finalized after the edit completes.
{
let tool = Arc::new(EditFileTool::new(thread.downgrade(), languages.clone()));
let tool = Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
languages.clone(),
));
let (stream_tx, mut stream_rx) = ToolCallEventStream::test();
let edit = cx.update(|cx| {
tool.run(
@@ -1600,7 +1682,11 @@ mod tests {
// Ensure the diff is finalized if an error occurs while editing.
{
model.forbid_requests();
let tool = Arc::new(EditFileTool::new(thread.downgrade(), languages.clone()));
let tool = Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
languages.clone(),
));
let (stream_tx, mut stream_rx) = ToolCallEventStream::test();
let edit = cx.update(|cx| {
tool.run(
@@ -1623,7 +1709,11 @@ mod tests {
// Ensure the diff is finalized if the tool call gets dropped.
{
let tool = Arc::new(EditFileTool::new(thread.downgrade(), languages.clone()));
let tool = Arc::new(EditFileTool::new(
project.clone(),
thread.downgrade(),
languages.clone(),
));
let (stream_tx, mut stream_rx) = ToolCallEventStream::test();
let edit = cx.update(|cx| {
tool.run(

View File

@@ -126,7 +126,11 @@ impl AgentTool for FetchTool {
acp::ToolKind::Fetch
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
match input {
Ok(input) => format!("Fetch {}", MarkdownEscaped(&input.url)).into(),
Err(_) => "Fetch URL".into(),

View File

@@ -93,7 +93,11 @@ impl AgentTool for FindPathTool {
acp::ToolKind::Search
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
let mut title = "Find paths".to_string();
if let Ok(input) = input {
title.push_str(&format!(" matching “`{}`”", input.glob));

View File

@@ -75,7 +75,11 @@ impl AgentTool for GrepTool {
acp::ToolKind::Search
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
match input {
Ok(input) => {
let page = input.page();

View File

@@ -59,7 +59,11 @@ impl AgentTool for ListDirectoryTool {
ToolKind::Read
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
if let Ok(input) = input {
let path = MarkdownInlineCode(&input.path);
format!("List the {path} directory's contents").into()

View File

@@ -60,7 +60,11 @@ impl AgentTool for MovePathTool {
ToolKind::Move
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
if let Ok(input) = input {
let src = MarkdownInlineCode(&input.source_path);
let dest = MarkdownInlineCode(&input.destination_path);

View File

@@ -41,7 +41,11 @@ impl AgentTool for NowTool {
acp::ToolKind::Other
}
fn initial_title(&self, _input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
_input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
"Get current time".into()
}

View File

@@ -45,7 +45,11 @@ impl AgentTool for OpenTool {
ToolKind::Execute
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
if let Ok(input) = input {
format!("Open `{}`", MarkdownEscaped(&input.path_or_url)).into()
} else {
@@ -61,7 +65,7 @@ impl AgentTool for OpenTool {
) -> Task<Result<Self::Output>> {
// If path_or_url turns out to be a path in the project, make it absolute.
let abs_path = to_absolute_path(&input.path_or_url, self.project.clone(), cx);
let authorize = event_stream.authorize(self.initial_title(Ok(input.clone())), cx);
let authorize = event_stream.authorize(self.initial_title(Ok(input.clone()), cx), cx);
cx.background_spawn(async move {
authorize.await?;

View File

@@ -10,7 +10,7 @@ use project::{AgentLocation, ImageItem, Project, WorktreeSettings, image_store};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::Settings;
use std::{path::Path, sync::Arc};
use std::sync::Arc;
use util::markdown::MarkdownCodeBlock;
use crate::{AgentTool, ToolCallEventStream};
@@ -68,13 +68,31 @@ impl AgentTool for ReadFileTool {
acp::ToolKind::Read
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
input
.ok()
.as_ref()
.and_then(|input| Path::new(&input.path).file_name())
.map(|file_name| file_name.to_string_lossy().to_string().into())
.unwrap_or_default()
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
cx: &mut App,
) -> SharedString {
if let Ok(input) = input
&& let Some(project_path) = self.project.read(cx).find_project_path(&input.path, cx)
&& let Some(path) = self
.project
.read(cx)
.short_full_path_for_project_path(&project_path, cx)
{
match (input.start_line, input.end_line) {
(Some(start), Some(end)) => {
format!("Read file `{}` (lines {}-{})", path.display(), start, end)
}
(Some(start), None) => {
format!("Read file `{}` (from line {})", path.display(), start)
}
_ => format!("Read file `{}`", path.display()),
}
.into()
} else {
"Read file".into()
}
}
fn run(
@@ -86,6 +104,12 @@ impl AgentTool for ReadFileTool {
let Some(project_path) = self.project.read(cx).find_project_path(&input.path, cx) else {
return Task::ready(Err(anyhow!("Path {} not found in project", &input.path)));
};
let Some(abs_path) = self.project.read(cx).absolute_path(&project_path, cx) else {
return Task::ready(Err(anyhow!(
"Failed to convert {} to absolute path",
&input.path
)));
};
// Error out if this path is either excluded or private in global settings
let global_settings = WorktreeSettings::get_global(cx);
@@ -121,6 +145,14 @@ impl AgentTool for ReadFileTool {
let file_path = input.path.clone();
event_stream.update_fields(ToolCallUpdateFields {
locations: Some(vec![acp::ToolCallLocation {
path: abs_path,
line: input.start_line.map(|line| line.saturating_sub(1)),
}]),
..Default::default()
});
if image_store::is_image_file(&self.project, &project_path, cx) {
return cx.spawn(async move |cx| {
let image_entity: Entity<ImageItem> = cx
@@ -229,34 +261,25 @@ impl AgentTool for ReadFileTool {
};
project.update(cx, |project, cx| {
if let Some(abs_path) = project.absolute_path(&project_path, cx) {
project.set_agent_location(
Some(AgentLocation {
buffer: buffer.downgrade(),
position: anchor.unwrap_or(text::Anchor::MIN),
}),
cx,
);
project.set_agent_location(
Some(AgentLocation {
buffer: buffer.downgrade(),
position: anchor.unwrap_or(text::Anchor::MIN),
}),
cx,
);
if let Ok(LanguageModelToolResultContent::Text(text)) = &result {
let markdown = MarkdownCodeBlock {
tag: &input.path,
text,
}
.to_string();
event_stream.update_fields(ToolCallUpdateFields {
locations: Some(vec![acp::ToolCallLocation {
path: abs_path,
line: input.start_line.map(|line| line.saturating_sub(1)),
content: Some(vec![acp::ToolCallContent::Content {
content: markdown.into(),
}]),
..Default::default()
});
if let Ok(LanguageModelToolResultContent::Text(text)) = &result {
let markdown = MarkdownCodeBlock {
tag: &input.path,
text,
}
.to_string();
event_stream.update_fields(ToolCallUpdateFields {
content: Some(vec![acp::ToolCallContent::Content {
content: markdown.into(),
}]),
..Default::default()
})
}
})
}
})?;
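
The `initial_title` formatting introduced above (plain path, single start line, or full line range) can be sketched as a standalone pure function. This is a hypothetical rendition for illustration; the real method also resolves a worktree-relative path through the `Project` entity first:

```rust
// Hypothetical standalone version of the title formatting above; the real
// method first resolves a short worktree-relative path via the project.
fn read_file_title(path: &str, start_line: Option<u32>, end_line: Option<u32>) -> String {
    match (start_line, end_line) {
        (Some(start), Some(end)) => format!("Read file `{path}` (lines {start}-{end})"),
        (Some(start), None) => format!("Read file `{path}` (from line {start})"),
        _ => format!("Read file `{path}`"),
    }
}
```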

View File

@@ -60,7 +60,11 @@ impl AgentTool for TerminalTool {
acp::ToolKind::Execute
}
fn initial_title(&self, input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
if let Ok(input) = input {
let mut lines = input.command.lines();
let first_line = lines.next().unwrap_or_default();
@@ -93,7 +97,7 @@ impl AgentTool for TerminalTool {
Err(err) => return Task::ready(Err(err)),
};
let authorize = event_stream.authorize(self.initial_title(Ok(input.clone())), cx);
let authorize = event_stream.authorize(self.initial_title(Ok(input.clone()), cx), cx);
cx.spawn(async move |cx| {
authorize.await?;

View File

@@ -29,7 +29,11 @@ impl AgentTool for ThinkingTool {
acp::ToolKind::Think
}
fn initial_title(&self, _input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
_input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
"Thinking".into()
}

View File

@@ -48,7 +48,11 @@ impl AgentTool for WebSearchTool {
acp::ToolKind::Fetch
}
fn initial_title(&self, _input: Result<Self::Input, serde_json::Value>) -> SharedString {
fn initial_title(
&self,
_input: Result<Self::Input, serde_json::Value>,
_cx: &mut App,
) -> SharedString {
"Searching the Web".into()
}

View File

@@ -35,22 +35,18 @@ language.workspace = true
language_model.workspace = true
language_models.workspace = true
log.workspace = true
node_runtime.workspace = true
paths.workspace = true
project.workspace = true
reqwest_client = { workspace = true, optional = true }
schemars.workspace = true
semver.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
smol.workspace = true
task.workspace = true
tempfile.workspace = true
thiserror.workspace = true
ui.workspace = true
util.workspace = true
watch.workspace = true
which.workspace = true
workspace-hack.workspace = true
[target.'cfg(unix)'.dependencies]

View File

@@ -1,4 +1,3 @@
use crate::AgentServerCommand;
use acp_thread::AgentConnection;
use acp_tools::AcpConnectionRegistry;
use action_log::ActionLog;
@@ -8,8 +7,10 @@ use collections::HashMap;
use futures::AsyncBufReadExt as _;
use futures::io::BufReader;
use project::Project;
use project::agent_server_store::AgentServerCommand;
use serde::Deserialize;
use std::path::PathBuf;
use std::{any::Any, cell::RefCell};
use std::{path::Path, rc::Rc};
use thiserror::Error;
@@ -29,6 +30,7 @@ pub struct AcpConnection {
sessions: Rc<RefCell<HashMap<acp::SessionId, AcpSession>>>,
auth_methods: Vec<acp::AuthMethod>,
agent_capabilities: acp::AgentCapabilities,
root_dir: PathBuf,
_io_task: Task<Result<()>>,
_wait_task: Task<Result<()>>,
_stderr_task: Task<Result<()>>,
@@ -43,9 +45,10 @@ pub async fn connect(
server_name: SharedString,
command: AgentServerCommand,
root_dir: &Path,
is_remote: bool,
cx: &mut AsyncApp,
) -> Result<Rc<dyn AgentConnection>> {
let conn = AcpConnection::stdio(server_name, command.clone(), root_dir, cx).await?;
let conn = AcpConnection::stdio(server_name, command.clone(), root_dir, is_remote, cx).await?;
Ok(Rc::new(conn) as _)
}
@@ -56,17 +59,21 @@ impl AcpConnection {
server_name: SharedString,
command: AgentServerCommand,
root_dir: &Path,
is_remote: bool,
cx: &mut AsyncApp,
) -> Result<Self> {
let mut child = util::command::new_smol_command(command.path)
let mut child = util::command::new_smol_command(command.path);
child
.args(command.args.iter().map(|arg| arg.as_str()))
.envs(command.env.iter().flatten())
.current_dir(root_dir)
.stdin(std::process::Stdio::piped())
.stdout(std::process::Stdio::piped())
.stderr(std::process::Stdio::piped())
.kill_on_drop(true)
.spawn()?;
.kill_on_drop(true);
if !is_remote {
child.current_dir(root_dir);
}
let mut child = child.spawn()?;
let stdout = child.stdout.take().context("Failed to take stdout")?;
let stdin = child.stdin.take().context("Failed to take stdin")?;
@@ -145,6 +152,7 @@ impl AcpConnection {
Ok(Self {
auth_methods: response.auth_methods,
root_dir: root_dir.to_owned(),
connection,
server_name,
sessions,
@@ -158,6 +166,10 @@ impl AcpConnection {
pub fn prompt_capabilities(&self) -> &acp::PromptCapabilities {
&self.agent_capabilities.prompt_capabilities
}
pub fn root_dir(&self) -> &Path {
&self.root_dir
}
}
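
The conditional `current_dir` handling in `AcpConnection::stdio` — skip setting a working directory when the agent runs on a remote machine, since the project path is not valid locally — can be illustrated with std's `Command`. This is a minimal sketch, not the smol-based command builder used above:

```rust
use std::process::Command;

// Minimal sketch: only set a working directory for local agents; a root_dir
// from an SSH project does not exist on the local filesystem.
fn build_agent_command(program: &str, root_dir: &str, is_remote: bool) -> Command {
    let mut cmd = Command::new(program);
    if !is_remote {
        cmd.current_dir(root_dir);
    }
    cmd
}
```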
impl AgentConnection for AcpConnection {
@@ -171,29 +183,36 @@ impl AgentConnection for AcpConnection {
let sessions = self.sessions.clone();
let cwd = cwd.to_path_buf();
let context_server_store = project.read(cx).context_server_store().read(cx);
let mcp_servers = context_server_store
.configured_server_ids()
.iter()
.filter_map(|id| {
let configuration = context_server_store.configuration_for_server(id)?;
let command = configuration.command();
Some(acp::McpServer {
name: id.0.to_string(),
command: command.path.clone(),
args: command.args.clone(),
env: if let Some(env) = command.env.as_ref() {
env.iter()
.map(|(name, value)| acp::EnvVariable {
name: name.clone(),
value: value.clone(),
})
.collect()
} else {
vec![]
},
let mcp_servers = if project.read(cx).is_local() {
context_server_store
.configured_server_ids()
.iter()
.filter_map(|id| {
let configuration = context_server_store.configuration_for_server(id)?;
let command = configuration.command();
Some(acp::McpServer {
name: id.0.to_string(),
command: command.path.clone(),
args: command.args.clone(),
env: if let Some(env) = command.env.as_ref() {
env.iter()
.map(|(name, value)| acp::EnvVariable {
name: name.clone(),
value: value.clone(),
})
.collect()
} else {
vec![]
},
})
})
})
.collect();
.collect()
} else {
// In SSH projects, the external agent is running on the remote
// machine, and currently we only run MCP servers on the local
// machine. So don't pass any MCP servers to the agent in that case.
Vec::new()
};
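
The policy above — MCP servers currently only run on the local machine, so remote (SSH) projects get none — reduces to a simple guard. A hypothetical sketch with plain strings standing in for `acp::McpServer` configs:

```rust
// Minimal sketch of the local-only MCP policy: pass the configured servers
// through for local projects, and nothing for remote (SSH) projects.
fn mcp_servers_for(project_is_local: bool, configured: Vec<String>) -> Vec<String> {
    if project_is_local { configured } else { Vec::new() }
}
```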
cx.spawn(async move |cx| {
let response = conn

View File

@@ -2,47 +2,25 @@ mod acp;
mod claude;
mod custom;
mod gemini;
mod settings;
#[cfg(any(test, feature = "test-support"))]
pub mod e2e_tests;
use anyhow::Context as _;
pub use claude::*;
pub use custom::*;
use fs::Fs;
use fs::RemoveOptions;
use fs::RenameOptions;
use futures::StreamExt as _;
pub use gemini::*;
use gpui::AppContext;
use node_runtime::NodeRuntime;
pub use settings::*;
use project::agent_server_store::AgentServerStore;
use acp_thread::AgentConnection;
use acp_thread::LoadError;
use anyhow::Result;
use anyhow::anyhow;
use collections::HashMap;
use gpui::{App, AsyncApp, Entity, SharedString, Task};
use gpui::{App, Entity, SharedString, Task};
use project::Project;
use schemars::JsonSchema;
use semver::Version;
use serde::{Deserialize, Serialize};
use std::str::FromStr as _;
use std::{
any::Any,
path::{Path, PathBuf},
rc::Rc,
sync::Arc,
};
use util::ResultExt as _;
use std::{any::Any, path::Path, rc::Rc};
pub fn init(cx: &mut App) {
settings::init(cx);
}
pub use acp::AcpConnection;
pub struct AgentServerDelegate {
store: Entity<AgentServerStore>,
project: Entity<Project>,
status_tx: Option<watch::Sender<SharedString>>,
new_version_available: Option<watch::Sender<Option<String>>>,
@@ -50,11 +28,13 @@ pub struct AgentServerDelegate {
impl AgentServerDelegate {
pub fn new(
store: Entity<AgentServerStore>,
project: Entity<Project>,
status_tx: Option<watch::Sender<SharedString>>,
new_version_tx: Option<watch::Sender<Option<String>>>,
) -> Self {
Self {
store,
project,
status_tx,
new_version_available: new_version_tx,
@@ -64,188 +44,6 @@ impl AgentServerDelegate {
pub fn project(&self) -> &Entity<Project> {
&self.project
}
fn get_or_npm_install_builtin_agent(
self,
binary_name: SharedString,
package_name: SharedString,
entrypoint_path: PathBuf,
ignore_system_version: bool,
minimum_version: Option<Version>,
cx: &mut App,
) -> Task<Result<AgentServerCommand>> {
let project = self.project;
let fs = project.read(cx).fs().clone();
let Some(node_runtime) = project.read(cx).node_runtime().cloned() else {
return Task::ready(Err(anyhow!(
"External agents are not yet available in remote projects."
)));
};
let status_tx = self.status_tx;
let new_version_available = self.new_version_available;
cx.spawn(async move |cx| {
if !ignore_system_version {
if let Some(bin) = find_bin_in_path(binary_name.clone(), &project, cx).await {
return Ok(AgentServerCommand {
path: bin,
args: Vec::new(),
env: Default::default(),
});
}
}
cx.spawn(async move |cx| {
let node_path = node_runtime.binary_path().await?;
let dir = paths::data_dir()
.join("external_agents")
.join(binary_name.as_str());
fs.create_dir(&dir).await?;
let mut stream = fs.read_dir(&dir).await?;
let mut versions = Vec::new();
let mut to_delete = Vec::new();
while let Some(entry) = stream.next().await {
let Ok(entry) = entry else { continue };
let Some(file_name) = entry.file_name() else {
continue;
};
if let Some(name) = file_name.to_str()
&& let Ok(version) = semver::Version::from_str(name)
&& fs
.is_file(&dir.join(file_name).join(&entrypoint_path))
.await
{
versions.push((version, file_name.to_owned()));
} else {
to_delete.push(file_name.to_owned())
}
}
versions.sort();
let newest_version = if let Some((version, file_name)) = versions.last().cloned()
&& minimum_version.is_none_or(|minimum_version| version >= minimum_version)
{
versions.pop();
Some(file_name)
} else {
None
};
log::debug!("existing version of {package_name}: {newest_version:?}");
to_delete.extend(versions.into_iter().map(|(_, file_name)| file_name));
cx.background_spawn({
let fs = fs.clone();
let dir = dir.clone();
async move {
for file_name in to_delete {
fs.remove_dir(
&dir.join(file_name),
RemoveOptions {
recursive: true,
ignore_if_not_exists: false,
},
)
.await
.ok();
}
}
})
.detach();
let version = if let Some(file_name) = newest_version {
cx.background_spawn({
let file_name = file_name.clone();
let dir = dir.clone();
let fs = fs.clone();
async move {
let latest_version =
node_runtime.npm_package_latest_version(&package_name).await;
if let Ok(latest_version) = latest_version
&& &latest_version != &file_name.to_string_lossy()
{
Self::download_latest_version(
fs,
dir.clone(),
node_runtime,
package_name,
)
.await
.log_err();
if let Some(mut new_version_available) = new_version_available {
new_version_available.send(Some(latest_version)).ok();
}
}
}
})
.detach();
file_name
} else {
if let Some(mut status_tx) = status_tx {
status_tx.send("Installing…".into()).ok();
}
let dir = dir.clone();
cx.background_spawn(Self::download_latest_version(
fs.clone(),
dir.clone(),
node_runtime,
package_name,
))
.await?
.into()
};
let agent_server_path = dir.join(version).join(entrypoint_path);
let agent_server_path_exists = fs.is_file(&agent_server_path).await;
anyhow::ensure!(
agent_server_path_exists,
"Missing entrypoint path {} after installation",
agent_server_path.to_string_lossy()
);
anyhow::Ok(AgentServerCommand {
path: node_path,
args: vec![agent_server_path.to_string_lossy().to_string()],
env: Default::default(),
})
})
.await
.map_err(|e| LoadError::FailedToInstall(e.to_string().into()).into())
})
}
async fn download_latest_version(
fs: Arc<dyn Fs>,
dir: PathBuf,
node_runtime: NodeRuntime,
package_name: SharedString,
) -> Result<String> {
log::debug!("downloading latest version of {package_name}");
let tmp_dir = tempfile::tempdir_in(&dir)?;
node_runtime
.npm_install_packages(tmp_dir.path(), &[(&package_name, "latest")])
.await?;
let version = node_runtime
.npm_package_installed_version(tmp_dir.path(), &package_name)
.await?
.context("expected package to be installed")?;
fs.rename(
&tmp_dir.keep(),
&dir.join(&version),
RenameOptions {
ignore_if_exists: true,
overwrite: false,
},
)
.await?;
anyhow::Ok(version)
}
}
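
The version-directory housekeeping removed here (keep the newest installed version that meets the minimum, queue everything else for deletion) can be sketched without the semver crate. Hypothetical standalone function; the real code parses full semver and also verifies that the entrypoint file exists before counting a directory as a version:

```rust
// Hypothetical sketch of the pruning logic: entries that parse as x.y.z
// versions are candidates; the newest one meeting the minimum is kept and
// everything else (older versions, unparseable names) is queued for deletion.
fn partition_versions(
    names: &[&str],
    minimum: Option<(u64, u64, u64)>,
) -> (Option<String>, Vec<String>) {
    fn parse(name: &str) -> Option<(u64, u64, u64)> {
        let mut it = name.split('.').map(|part| part.parse::<u64>().ok());
        Some((it.next()??, it.next()??, it.next()??))
    }
    let mut versions = Vec::new();
    let mut to_delete = Vec::new();
    for &name in names {
        match parse(name) {
            Some(version) => versions.push((version, name.to_string())),
            None => to_delete.push(name.to_string()),
        }
    }
    versions.sort();
    let newest = match versions.last().cloned() {
        Some((version, name)) if minimum.map_or(true, |min| version >= min) => {
            versions.pop();
            Some(name)
        }
        _ => None,
    };
    to_delete.extend(versions.into_iter().map(|(_, name)| name));
    (newest, to_delete)
}
```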
pub trait AgentServer: Send {
@@ -255,10 +53,10 @@ pub trait AgentServer: Send {
fn connect(
&self,
root_dir: &Path,
root_dir: Option<&Path>,
delegate: AgentServerDelegate,
cx: &mut App,
) -> Task<Result<Rc<dyn AgentConnection>>>;
) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>>;
fn into_any(self: Rc<Self>) -> Rc<dyn Any>;
}
@@ -268,120 +66,3 @@ impl dyn AgentServer {
self.into_any().downcast().ok()
}
}
impl std::fmt::Debug for AgentServerCommand {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let filtered_env = self.env.as_ref().map(|env| {
env.iter()
.map(|(k, v)| {
(
k,
if util::redact::should_redact(k) {
"[REDACTED]"
} else {
v
},
)
})
.collect::<Vec<_>>()
});
f.debug_struct("AgentServerCommand")
.field("path", &self.path)
.field("args", &self.args)
.field("env", &filtered_env)
.finish()
}
}
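
The `Debug` impl above redacts sensitive environment values before printing. The predicate can be sketched as a substring check over credential-like words — a hypothetical stand-in; the actual `util::redact::should_redact` may use a different rule:

```rust
// Hypothetical stand-in for util::redact::should_redact: treat any env key
// mentioning a credential-like word as sensitive.
fn should_redact(env_key: &str) -> bool {
    let key = env_key.to_ascii_lowercase();
    ["key", "token", "secret", "password"]
        .iter()
        .any(|needle| key.contains(needle))
}
```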
#[derive(Deserialize, Serialize, Clone, PartialEq, Eq, JsonSchema)]
pub struct AgentServerCommand {
#[serde(rename = "command")]
pub path: PathBuf,
#[serde(default)]
pub args: Vec<String>,
pub env: Option<HashMap<String, String>>,
}
impl AgentServerCommand {
pub async fn resolve(
path_bin_name: &'static str,
extra_args: &[&'static str],
fallback_path: Option<&Path>,
settings: Option<BuiltinAgentServerSettings>,
project: &Entity<Project>,
cx: &mut AsyncApp,
) -> Option<Self> {
if let Some(settings) = settings
&& let Some(command) = settings.custom_command()
{
Some(command)
} else {
match find_bin_in_path(path_bin_name.into(), project, cx).await {
Some(path) => Some(Self {
path,
args: extra_args.iter().map(|arg| arg.to_string()).collect(),
env: None,
}),
None => fallback_path.and_then(|path| {
if path.exists() {
Some(Self {
path: path.to_path_buf(),
args: extra_args.iter().map(|arg| arg.to_string()).collect(),
env: None,
})
} else {
None
}
}),
}
}
}
}
async fn find_bin_in_path(
bin_name: SharedString,
project: &Entity<Project>,
cx: &mut AsyncApp,
) -> Option<PathBuf> {
let (env_task, root_dir) = project
.update(cx, |project, cx| {
let worktree = project.visible_worktrees(cx).next();
match worktree {
Some(worktree) => {
let env_task = project.environment().update(cx, |env, cx| {
env.get_worktree_environment(worktree.clone(), cx)
});
let path = worktree.read(cx).abs_path();
(env_task, path)
}
None => {
let path: Arc<Path> = paths::home_dir().as_path().into();
let env_task = project.environment().update(cx, |env, cx| {
env.get_directory_environment(path.clone(), cx)
});
(env_task, path)
}
}
})
.log_err()?;
cx.background_executor()
.spawn(async move {
let which_result = if cfg!(windows) {
which::which(bin_name.as_str())
} else {
let env = env_task.await.unwrap_or_default();
let shell_path = env.get("PATH").cloned();
which::which_in(bin_name.as_str(), shell_path.as_ref(), root_dir.as_ref())
};
if let Err(which::Error::CannotFindBinaryPath) = which_result {
return None;
}
which_result.log_err()
})
.await
}

View File

@@ -1,60 +1,22 @@
use settings::SettingsStore;
use std::path::Path;
use std::rc::Rc;
use std::{any::Any, path::PathBuf};
use anyhow::Result;
use gpui::{App, AppContext as _, SharedString, Task};
use anyhow::{Context as _, Result};
use gpui::{App, SharedString, Task};
use project::agent_server_store::CLAUDE_CODE_NAME;
use crate::{AgentServer, AgentServerDelegate, AllAgentServersSettings};
use crate::{AgentServer, AgentServerDelegate};
use acp_thread::AgentConnection;
#[derive(Clone)]
pub struct ClaudeCode;
pub struct ClaudeCodeLoginCommand {
pub struct AgentServerLoginCommand {
pub path: PathBuf,
pub arguments: Vec<String>,
}
impl ClaudeCode {
const BINARY_NAME: &'static str = "claude-code-acp";
const PACKAGE_NAME: &'static str = "@zed-industries/claude-code-acp";
pub fn login_command(
delegate: AgentServerDelegate,
cx: &mut App,
) -> Task<Result<ClaudeCodeLoginCommand>> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).claude.clone()
});
cx.spawn(async move |cx| {
let mut command = if let Some(settings) = settings {
settings.command
} else {
cx.update(|cx| {
delegate.get_or_npm_install_builtin_agent(
Self::BINARY_NAME.into(),
Self::PACKAGE_NAME.into(),
"node_modules/@anthropic-ai/claude-code/cli.js".into(),
true,
Some("0.2.5".parse().unwrap()),
cx,
)
})?
.await?
};
command.args.push("/login".into());
Ok(ClaudeCodeLoginCommand {
path: command.path,
arguments: command.args,
})
})
}
}
impl AgentServer for ClaudeCode {
fn telemetry_id(&self) -> &'static str {
"claude-code"
@@ -70,56 +32,33 @@ impl AgentServer for ClaudeCode {
fn connect(
&self,
root_dir: &Path,
root_dir: Option<&Path>,
delegate: AgentServerDelegate,
cx: &mut App,
) -> Task<Result<Rc<dyn AgentConnection>>> {
let root_dir = root_dir.to_path_buf();
let fs = delegate.project().read(cx).fs().clone();
let server_name = self.name();
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).claude.clone()
});
let project = delegate.project().clone();
) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
let name = self.name();
let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().to_string());
let is_remote = delegate.project.read(cx).is_via_remote_server();
let store = delegate.store.downgrade();
cx.spawn(async move |cx| {
let mut project_env = project
.update(cx, |project, cx| {
project.directory_environment(root_dir.as_path().into(), cx)
})?
.await
.unwrap_or_default();
let mut command = if let Some(settings) = settings {
settings.command
} else {
cx.update(|cx| {
delegate.get_or_npm_install_builtin_agent(
Self::BINARY_NAME.into(),
Self::PACKAGE_NAME.into(),
format!("node_modules/{}/dist/index.js", Self::PACKAGE_NAME).into(),
true,
None,
cx,
)
})?
.await?
};
project_env.extend(command.env.take().unwrap_or_default());
command.env = Some(project_env);
command
.env
.get_or_insert_default()
.insert("ANTHROPIC_API_KEY".to_owned(), "".to_owned());
let root_dir_exists = fs.is_dir(&root_dir).await;
anyhow::ensure!(
root_dir_exists,
"Session root {} does not exist or is not a directory",
root_dir.to_string_lossy()
);
crate::acp::connect(server_name, command.clone(), &root_dir, cx).await
let (command, root_dir, login) = store
.update(cx, |store, cx| {
let agent = store
.get_external_agent(&CLAUDE_CODE_NAME.into())
.context("Claude Code is not registered")?;
anyhow::Ok(agent.get_command(
root_dir.as_deref(),
Default::default(),
delegate.status_tx,
delegate.new_version_available,
&mut cx.to_async(),
))
})??
.await?;
let connection =
crate::acp::connect(name, command, root_dir.as_ref(), is_remote, cx).await?;
Ok((connection, login))
})
}

View File

@@ -1,19 +1,19 @@
use crate::{AgentServerCommand, AgentServerDelegate};
use crate::AgentServerDelegate;
use acp_thread::AgentConnection;
use anyhow::Result;
use anyhow::{Context as _, Result};
use gpui::{App, SharedString, Task};
use project::agent_server_store::ExternalAgentServerName;
use std::{path::Path, rc::Rc};
use ui::IconName;
/// A generic agent server implementation for custom user-defined agents
pub struct CustomAgentServer {
name: SharedString,
command: AgentServerCommand,
}
impl CustomAgentServer {
pub fn new(name: SharedString, command: AgentServerCommand) -> Self {
Self { name, command }
pub fn new(name: SharedString) -> Self {
Self { name }
}
}
@@ -32,14 +32,36 @@ impl crate::AgentServer for CustomAgentServer {
fn connect(
&self,
root_dir: &Path,
_delegate: AgentServerDelegate,
root_dir: Option<&Path>,
delegate: AgentServerDelegate,
cx: &mut App,
) -> Task<Result<Rc<dyn AgentConnection>>> {
let server_name = self.name();
let command = self.command.clone();
let root_dir = root_dir.to_path_buf();
cx.spawn(async move |cx| crate::acp::connect(server_name, command, &root_dir, cx).await)
) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
let name = self.name();
let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().to_string());
let is_remote = delegate.project.read(cx).is_via_remote_server();
let store = delegate.store.downgrade();
cx.spawn(async move |cx| {
let (command, root_dir, login) = store
.update(cx, |store, cx| {
let agent = store
.get_external_agent(&ExternalAgentServerName(name.clone()))
.with_context(|| {
format!("Custom agent server `{}` is not registered", name)
})?;
anyhow::Ok(agent.get_command(
root_dir.as_deref(),
Default::default(),
delegate.status_tx,
delegate.new_version_available,
&mut cx.to_async(),
))
})??
.await?;
let connection =
crate::acp::connect(name, command, root_dir.as_ref(), is_remote, cx).await?;
Ok((connection, login))
})
}
fn into_any(self: Rc<Self>) -> Rc<dyn std::any::Any> {

View File

@@ -1,12 +1,12 @@
use crate::{AgentServer, AgentServerDelegate};
#[cfg(test)]
use crate::{AgentServerCommand, CustomAgentServerSettings};
use acp_thread::{AcpThread, AgentThreadEntry, ToolCall, ToolCallStatus};
use agent_client_protocol as acp;
use futures::{FutureExt, StreamExt, channel::mpsc, select};
use gpui::{AppContext, Entity, TestAppContext};
use indoc::indoc;
use project::{FakeFs, Project};
#[cfg(test)]
use project::agent_server_store::{AgentServerCommand, CustomAgentServerSettings};
use project::{FakeFs, Project, agent_server_store::AllAgentServersSettings};
use std::{
path::{Path, PathBuf},
sync::Arc,
@@ -449,7 +449,6 @@ pub use common_e2e_tests;
// Helpers
pub async fn init_test(cx: &mut TestAppContext) -> Arc<FakeFs> {
#[cfg(test)]
use settings::Settings;
env_logger::try_init().ok();
@@ -468,11 +467,11 @@ pub async fn init_test(cx: &mut TestAppContext) -> Arc<FakeFs> {
language_model::init(client.clone(), cx);
language_models::init(user_store, client, cx);
agent_settings::init(cx);
crate::settings::init(cx);
AllAgentServersSettings::register(cx);
#[cfg(test)]
crate::AllAgentServersSettings::override_global(
crate::AllAgentServersSettings {
AllAgentServersSettings::override_global(
AllAgentServersSettings {
claude: Some(CustomAgentServerSettings {
command: AgentServerCommand {
path: "claude-code-acp".into(),
@@ -498,10 +497,11 @@ pub async fn new_test_thread(
current_dir: impl AsRef<Path>,
cx: &mut TestAppContext,
) -> Entity<AcpThread> {
let delegate = AgentServerDelegate::new(project.clone(), None, None);
let store = project.read_with(cx, |project, _| project.agent_server_store().clone());
let delegate = AgentServerDelegate::new(store, project.clone(), None, None);
let connection = cx
.update(|cx| server.connect(current_dir.as_ref(), delegate, cx))
let (connection, _) = cx
.update(|cx| server.connect(Some(current_dir.as_ref()), delegate, cx))
.await
.unwrap();

View File

@@ -1,21 +1,17 @@
use std::rc::Rc;
use std::{any::Any, path::Path};
use crate::acp::AcpConnection;
use crate::{AgentServer, AgentServerDelegate};
use acp_thread::{AgentConnection, LoadError};
use anyhow::Result;
use gpui::{App, AppContext as _, SharedString, Task};
use acp_thread::AgentConnection;
use anyhow::{Context as _, Result};
use collections::HashMap;
use gpui::{App, SharedString, Task};
use language_models::provider::google::GoogleLanguageModelProvider;
use settings::SettingsStore;
use crate::AllAgentServersSettings;
use project::agent_server_store::GEMINI_NAME;
#[derive(Clone)]
pub struct Gemini;
const ACP_ARG: &str = "--experimental-acp";
impl AgentServer for Gemini {
fn telemetry_id(&self) -> &'static str {
"gemini-cli"
@@ -31,126 +27,37 @@ impl AgentServer for Gemini {
fn connect(
&self,
root_dir: &Path,
root_dir: Option<&Path>,
delegate: AgentServerDelegate,
cx: &mut App,
) -> Task<Result<Rc<dyn AgentConnection>>> {
let root_dir = root_dir.to_path_buf();
let fs = delegate.project().read(cx).fs().clone();
let server_name = self.name();
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).gemini.clone()
});
let project = delegate.project().clone();
) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
let name = self.name();
let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().to_string());
let is_remote = delegate.project.read(cx).is_via_remote_server();
let store = delegate.store.downgrade();
cx.spawn(async move |cx| {
let ignore_system_version = settings
.as_ref()
.and_then(|settings| settings.ignore_system_version)
.unwrap_or(true);
let mut project_env = project
.update(cx, |project, cx| {
project.directory_environment(root_dir.as_path().into(), cx)
})?
.await
.unwrap_or_default();
let mut command = if let Some(settings) = settings
&& let Some(command) = settings.custom_command()
{
command
} else {
cx.update(|cx| {
delegate.get_or_npm_install_builtin_agent(
Self::BINARY_NAME.into(),
Self::PACKAGE_NAME.into(),
format!("node_modules/{}/dist/index.js", Self::PACKAGE_NAME).into(),
ignore_system_version,
Some(Self::MINIMUM_VERSION.parse().unwrap()),
cx,
)
})?
.await?
};
if !command.args.contains(&ACP_ARG.into()) {
command.args.push(ACP_ARG.into());
}
let mut extra_env = HashMap::default();
if let Some(api_key) = cx.update(GoogleLanguageModelProvider::api_key)?.await.ok() {
project_env
.insert("GEMINI_API_KEY".to_owned(), api_key.key);
extra_env.insert("GEMINI_API_KEY".into(), api_key.key);
}
project_env.extend(command.env.take().unwrap_or_default());
command.env = Some(project_env);
let root_dir_exists = fs.is_dir(&root_dir).await;
anyhow::ensure!(
root_dir_exists,
"Session root {} does not exist or is not a directory",
root_dir.to_string_lossy()
);
let result = crate::acp::connect(server_name, command.clone(), &root_dir, cx).await;
match &result {
Ok(connection) => {
if let Some(connection) = connection.clone().downcast::<AcpConnection>()
&& !connection.prompt_capabilities().image
{
let version_output = util::command::new_smol_command(&command.path)
.args(command.args.iter())
.arg("--version")
.kill_on_drop(true)
.output()
.await;
let current_version =
String::from_utf8(version_output?.stdout)?.trim().to_owned();
log::error!("connected to gemini, but missing prompt_capabilities.image (version is {current_version})");
return Err(LoadError::Unsupported {
current_version: current_version.into(),
command: (command.path.to_string_lossy().to_string() + " " + &command.args.join(" ")).into(),
minimum_version: Self::MINIMUM_VERSION.into(),
}
.into());
}
}
Err(e) => {
let version_fut = util::command::new_smol_command(&command.path)
.args(command.args.iter())
.arg("--version")
.kill_on_drop(true)
.output();
let help_fut = util::command::new_smol_command(&command.path)
.args(command.args.iter())
.arg("--help")
.kill_on_drop(true)
.output();
let (version_output, help_output) =
futures::future::join(version_fut, help_fut).await;
let Some(version_output) = version_output.ok().and_then(|output| String::from_utf8(output.stdout).ok()) else {
return result;
};
let Some((help_stdout, help_stderr)) = help_output.ok().and_then(|output| String::from_utf8(output.stdout).ok().zip(String::from_utf8(output.stderr).ok())) else {
return result;
};
let current_version = version_output.trim().to_string();
let supported = help_stdout.contains(ACP_ARG) || current_version.parse::<semver::Version>().is_ok_and(|version| version >= Self::MINIMUM_VERSION.parse::<semver::Version>().unwrap());
log::error!("failed to create ACP connection to gemini (version is {current_version}, supported: {supported}): {e}");
log::debug!("gemini --help stdout: {help_stdout:?}");
log::debug!("gemini --help stderr: {help_stderr:?}");
if !supported {
return Err(LoadError::Unsupported {
current_version: current_version.into(),
command: (command.path.to_string_lossy().to_string() + " " + &command.args.join(" ")).into(),
minimum_version: Self::MINIMUM_VERSION.into(),
}
.into());
}
}
}
result
let (command, root_dir, login) = store
.update(cx, |store, cx| {
let agent = store
.get_external_agent(&GEMINI_NAME.into())
.context("Gemini CLI is not registered")?;
anyhow::Ok(agent.get_command(
root_dir.as_deref(),
extra_env,
delegate.status_tx,
delegate.new_version_available,
&mut cx.to_async(),
))
})??
.await?;
let connection =
crate::acp::connect(name, command, root_dir.as_ref(), is_remote, cx).await?;
Ok((connection, login))
})
}
@@ -159,18 +66,11 @@ impl AgentServer for Gemini {
}
}
impl Gemini {
const PACKAGE_NAME: &str = "@google/gemini-cli";
const MINIMUM_VERSION: &str = "0.2.1";
const BINARY_NAME: &str = "gemini";
}
#[cfg(test)]
pub(crate) mod tests {
use project::agent_server_store::AgentServerCommand;
use super::*;
use crate::AgentServerCommand;
use std::path::Path;
crate::common_e2e_tests!(async |_, _, _| Gemini, allow_option_id = "proceed_once");

View File

@@ -1,110 +0,0 @@
use std::path::PathBuf;
use crate::AgentServerCommand;
use anyhow::Result;
use collections::HashMap;
use gpui::{App, SharedString};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsKey, SettingsSources, SettingsUi};
pub fn init(cx: &mut App) {
AllAgentServersSettings::register(cx);
}
#[derive(Default, Deserialize, Serialize, Clone, JsonSchema, Debug, SettingsUi, SettingsKey)]
#[settings_key(key = "agent_servers")]
pub struct AllAgentServersSettings {
pub gemini: Option<BuiltinAgentServerSettings>,
pub claude: Option<CustomAgentServerSettings>,
/// Custom agent servers configured by the user.
#[serde(flatten)]
pub custom: HashMap<SharedString, CustomAgentServerSettings>,
}
#[derive(Default, Deserialize, Serialize, Clone, JsonSchema, Debug, PartialEq)]
pub struct BuiltinAgentServerSettings {
/// Absolute path to a binary to be used when launching this agent.
///
/// This can be used to run a specific binary without automatic downloads or searching `$PATH`.
#[serde(rename = "command")]
pub path: Option<PathBuf>,
/// If a binary is specified in `command`, it will be passed these arguments.
pub args: Option<Vec<String>>,
/// If a binary is specified in `command`, it will be passed these environment variables.
pub env: Option<HashMap<String, String>>,
/// Whether to skip searching `$PATH` for an agent server binary when
/// launching this agent.
///
/// This has no effect if a `command` is specified. Otherwise, when this is
/// `false`, Zed will search `$PATH` for an agent server binary and, if one
/// is found, use it for threads with this agent. If no agent binary is
/// found on `$PATH`, Zed will automatically install and use its own binary.
/// When this is `true`, Zed will not search `$PATH`, and will always use
/// its own binary.
///
/// Default: true
pub ignore_system_version: Option<bool>,
}
impl BuiltinAgentServerSettings {
pub(crate) fn custom_command(self) -> Option<AgentServerCommand> {
self.path.map(|path| AgentServerCommand {
path,
args: self.args.unwrap_or_default(),
env: self.env,
})
}
}
impl From<AgentServerCommand> for BuiltinAgentServerSettings {
fn from(value: AgentServerCommand) -> Self {
BuiltinAgentServerSettings {
path: Some(value.path),
args: Some(value.args),
env: value.env,
..Default::default()
}
}
}
#[derive(Deserialize, Serialize, Clone, JsonSchema, Debug, PartialEq)]
pub struct CustomAgentServerSettings {
#[serde(flatten)]
pub command: AgentServerCommand,
}
impl settings::Settings for AllAgentServersSettings {
type FileContent = Self;
fn load(sources: SettingsSources<Self::FileContent>, _: &mut App) -> Result<Self> {
let mut settings = AllAgentServersSettings::default();
for AllAgentServersSettings {
gemini,
claude,
custom,
} in sources.defaults_and_customizations()
{
if gemini.is_some() {
settings.gemini = gemini.clone();
}
if claude.is_some() {
settings.claude = claude.clone();
}
// Merge custom agents
for (name, config) in custom {
// Skip built-in agent names to avoid conflicts
if name != "gemini" && name != "claude" {
settings.custom.insert(name.clone(), config.clone());
}
}
}
Ok(settings)
}
fn import_from_vscode(_vscode: &settings::VsCodeSettings, _current: &mut Self::FileContent) {}
}
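
Taken together, these types deserialize from the `agent_servers` key in the user's `settings.json`: the `gemini` and `claude` fields map to their named entries, and any other entry falls into the flattened `custom` map. A minimal sketch of what such a configuration could look like — the agent name, paths, and env values are hypothetical, and the exact key names for the flattened `AgentServerCommand` in custom entries are an assumption, since that type is defined elsewhere:

```json
{
  "agent_servers": {
    "gemini": {
      "command": "/usr/local/bin/gemini",
      "args": ["--experimental-acp"],
      "ignore_system_version": false
    },
    "my-custom-agent": {
      "command": "/path/to/agent-binary",
      "args": ["--acp"],
      "env": { "MY_AGENT_API_KEY": "..." }
    }
  }
}
```

Per the merge logic in `AllAgentServersSettings::load`, a custom entry named `gemini` or `claude` would be skipped rather than overriding the built-in agents.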

View File

@@ -699,10 +699,15 @@ impl MessageEditor {
self.project.read(cx).fs().clone(),
self.history_store.clone(),
));
let delegate = AgentServerDelegate::new(self.project.clone(), None, None);
let connection = server.connect(Path::new(""), delegate, cx);
let delegate = AgentServerDelegate::new(
self.project.read(cx).agent_server_store().clone(),
self.project.clone(),
None,
None,
);
let connection = server.connect(None, delegate, cx);
cx.spawn(async move |_, cx| {
let agent = connection.await?;
let (agent, _) = connection.await?;
let agent = agent.downcast::<agent2::NativeAgentConnection>().unwrap();
let summary = agent
.0

View File

@@ -6,7 +6,7 @@ use acp_thread::{
use acp_thread::{AgentConnection, Plan};
use action_log::ActionLog;
use agent_client_protocol::{self as acp, PromptCapabilities};
use agent_servers::{AgentServer, AgentServerDelegate, ClaudeCode};
use agent_servers::{AgentServer, AgentServerDelegate};
use agent_settings::{AgentProfileId, AgentSettings, CompletionMode, NotifyWhenAgentWaiting};
use agent2::{DbThreadMetadata, HistoryEntry, HistoryEntryId, HistoryStore, NativeAgentServer};
use anyhow::{Context as _, Result, anyhow, bail};
@@ -40,7 +40,6 @@ use std::path::Path;
use std::sync::Arc;
use std::time::Instant;
use std::{collections::BTreeMap, rc::Rc, time::Duration};
use task::SpawnInTerminal;
use terminal_view::terminal_panel::TerminalPanel;
use text::Anchor;
use theme::{AgentFontSize, ThemeSettings};
@@ -263,6 +262,7 @@ pub struct AcpThreadView {
workspace: WeakEntity<Workspace>,
project: Entity<Project>,
thread_state: ThreadState,
login: Option<task::SpawnInTerminal>,
history_store: Entity<HistoryStore>,
hovered_recent_history_item: Option<usize>,
entry_view_state: Entity<EntryViewState>,
@@ -392,6 +392,7 @@ impl AcpThreadView {
project: project.clone(),
entry_view_state,
thread_state: Self::initial_state(agent, resume_thread, workspace, project, window, cx),
login: None,
message_editor,
model_selector: None,
profile_selector: None,
@@ -444,9 +445,11 @@ impl AcpThreadView {
window: &mut Window,
cx: &mut Context<Self>,
) -> ThreadState {
if !project.read(cx).is_local() && agent.clone().downcast::<NativeAgentServer>().is_none() {
if project.read(cx).is_via_collab()
&& agent.clone().downcast::<NativeAgentServer>().is_none()
{
return ThreadState::LoadError(LoadError::Other(
"External agents are not yet supported for remote projects.".into(),
"External agents are not yet supported in shared projects.".into(),
));
}
let mut worktrees = project.read(cx).visible_worktrees(cx).collect::<Vec<_>>();
@@ -466,20 +469,23 @@ impl AcpThreadView {
Some(worktree.read(cx).abs_path())
}
})
.next()
.unwrap_or_else(|| paths::home_dir().as_path().into());
.next();
let (status_tx, mut status_rx) = watch::channel("Loading…".into());
let (new_version_available_tx, mut new_version_available_rx) = watch::channel(None);
let delegate = AgentServerDelegate::new(
project.read(cx).agent_server_store().clone(),
project.clone(),
Some(status_tx),
Some(new_version_available_tx),
);
let connect_task = agent.connect(&root_dir, delegate, cx);
let connect_task = agent.connect(root_dir.as_deref(), delegate, cx);
let load_task = cx.spawn_in(window, async move |this, cx| {
let connection = match connect_task.await {
Ok(connection) => connection,
Ok((connection, login)) => {
this.update(cx, |this, _| this.login = login).ok();
connection
}
Err(err) => {
this.update_in(cx, |this, window, cx| {
if err.downcast_ref::<LoadError>().is_some() {
@@ -506,6 +512,14 @@ impl AcpThreadView {
})
.log_err()
} else {
let root_dir = if let Some(acp_agent) = connection
.clone()
.downcast::<agent_servers::AcpConnection>()
{
acp_agent.root_dir().into()
} else {
root_dir.unwrap_or(paths::home_dir().as_path().into())
};
cx.update(|_, cx| {
connection
.clone()
@@ -1462,9 +1476,12 @@ impl AcpThreadView {
self.thread_error.take();
configuration_view.take();
pending_auth_method.replace(method.clone());
let authenticate = if method.0.as_ref() == "claude-login" {
let authenticate = if (method.0.as_ref() == "claude-login"
|| method.0.as_ref() == "spawn-gemini-cli")
&& let Some(login) = self.login.clone()
{
if let Some(workspace) = self.workspace.upgrade() {
Self::spawn_claude_login(&workspace, window, cx)
Self::spawn_external_agent_login(login, workspace, false, window, cx)
} else {
Task::ready(Ok(()))
}
@@ -1511,31 +1528,28 @@ impl AcpThreadView {
}));
}
fn spawn_claude_login(
workspace: &Entity<Workspace>,
fn spawn_external_agent_login(
login: task::SpawnInTerminal,
workspace: Entity<Workspace>,
previous_attempt: bool,
window: &mut Window,
cx: &mut App,
) -> Task<Result<()>> {
let Some(terminal_panel) = workspace.read(cx).panel::<TerminalPanel>(cx) else {
return Task::ready(Ok(()));
};
let project_entity = workspace.read(cx).project();
let project = project_entity.read(cx);
let cwd = project.first_project_directory(cx);
let shell = project.terminal_settings(&cwd, cx).shell.clone();
let delegate = AgentServerDelegate::new(project_entity.clone(), None, None);
let command = ClaudeCode::login_command(delegate, cx);
let project = workspace.read(cx).project().clone();
let cwd = project.read(cx).first_project_directory(cx);
let shell = project.read(cx).terminal_settings(&cwd, cx).shell.clone();
window.spawn(cx, async move |cx| {
let login_command = command.await?;
let command = login_command
.path
.to_str()
.with_context(|| format!("invalid login command: {:?}", login_command.path))?;
let command = shlex::try_quote(command)?;
let args = login_command
.arguments
let mut task = login.clone();
task.command = task
.command
.map(|command| anyhow::Ok(shlex::try_quote(&command)?.to_string()))
.transpose()?;
task.args = task
.args
.iter()
.map(|arg| {
Ok(shlex::try_quote(arg)
@@ -1543,26 +1557,16 @@ impl AcpThreadView {
.to_string())
})
.collect::<Result<Vec<_>>>()?;
task.full_label = task.label.clone();
task.id = task::TaskId(format!("external-agent-{}-login", task.label));
task.command_label = task.label.clone();
task.use_new_terminal = true;
task.allow_concurrent_runs = true;
task.hide = task::HideStrategy::Always;
task.shell = shell;
let terminal = terminal_panel.update_in(cx, |terminal_panel, window, cx| {
terminal_panel.spawn_task(
&SpawnInTerminal {
id: task::TaskId("claude-login".into()),
full_label: "claude /login".to_owned(),
label: "claude /login".to_owned(),
command: Some(command.into()),
args,
command_label: "claude /login".to_owned(),
cwd,
use_new_terminal: true,
allow_concurrent_runs: true,
hide: task::HideStrategy::Always,
shell,
..Default::default()
},
window,
cx,
)
terminal_panel.spawn_task(&login, window, cx)
})?;
let terminal = terminal.await?;
@@ -1578,7 +1582,9 @@ impl AcpThreadView {
cx.background_executor().timer(Duration::from_secs(1)).await;
let content =
terminal.update(cx, |terminal, _cx| terminal.get_content())?;
if content.contains("Login successful") {
if content.contains("Login successful")
|| content.contains("Type your message")
{
return anyhow::Ok(());
}
}
@@ -1594,6 +1600,9 @@ impl AcpThreadView {
}
}
_ = exit_status => {
if !previous_attempt && project.read_with(cx, |project, _| project.is_via_remote_server())? && login.label.contains("gemini") {
return cx.update(|window, cx| Self::spawn_external_agent_login(login, workspace, true, window, cx))?.await
}
return Err(anyhow!("exited before logging in"));
}
}
@@ -2024,35 +2033,34 @@ impl AcpThreadView {
window: &Window,
cx: &Context<Self>,
) -> Div {
let has_location = tool_call.locations.len() == 1;
let card_header_id = SharedString::from("inner-tool-call-header");
let tool_icon =
if tool_call.kind == acp::ToolKind::Edit && tool_call.locations.len() == 1 {
FileIcons::get_icon(&tool_call.locations[0].path, cx)
.map(Icon::from_path)
.unwrap_or(Icon::new(IconName::ToolPencil))
} else {
Icon::new(match tool_call.kind {
acp::ToolKind::Read => IconName::ToolSearch,
acp::ToolKind::Edit => IconName::ToolPencil,
acp::ToolKind::Delete => IconName::ToolDeleteFile,
acp::ToolKind::Move => IconName::ArrowRightLeft,
acp::ToolKind::Search => IconName::ToolSearch,
acp::ToolKind::Execute => IconName::ToolTerminal,
acp::ToolKind::Think => IconName::ToolThink,
acp::ToolKind::Fetch => IconName::ToolWeb,
acp::ToolKind::Other => IconName::ToolHammer,
})
}
.size(IconSize::Small)
.color(Color::Muted);
let tool_icon = if tool_call.kind == acp::ToolKind::Edit && has_location {
FileIcons::get_icon(&tool_call.locations[0].path, cx)
.map(Icon::from_path)
.unwrap_or(Icon::new(IconName::ToolPencil))
} else {
Icon::new(match tool_call.kind {
acp::ToolKind::Read => IconName::ToolSearch,
acp::ToolKind::Edit => IconName::ToolPencil,
acp::ToolKind::Delete => IconName::ToolDeleteFile,
acp::ToolKind::Move => IconName::ArrowRightLeft,
acp::ToolKind::Search => IconName::ToolSearch,
acp::ToolKind::Execute => IconName::ToolTerminal,
acp::ToolKind::Think => IconName::ToolThink,
acp::ToolKind::Fetch => IconName::ToolWeb,
acp::ToolKind::Other => IconName::ToolHammer,
})
}
.size(IconSize::Small)
.color(Color::Muted);
let failed_or_canceled = match &tool_call.status {
ToolCallStatus::Rejected | ToolCallStatus::Canceled | ToolCallStatus::Failed => true,
_ => false,
};
let has_location = tool_call.locations.len() == 1;
let needs_confirmation = matches!(
tool_call.status,
ToolCallStatus::WaitingForConfirmation { .. }
@@ -2195,13 +2203,6 @@ impl AcpThreadView {
.overflow_hidden()
.child(tool_icon)
.child(if has_location {
let name = tool_call.locations[0]
.path
.file_name()
.unwrap_or_default()
.display()
.to_string();
h_flex()
.id(("open-tool-call-location", entry_ix))
.w_full()
@@ -2212,7 +2213,13 @@ impl AcpThreadView {
this.text_color(cx.theme().colors().text_muted)
}
})
.child(name)
.child(self.render_markdown(
tool_call.label.clone(),
MarkdownStyle {
prevent_mouse_interaction: true,
..default_markdown_style(false, true, window, cx)
},
))
.tooltip(Tooltip::text("Jump to File"))
.on_click(cx.listener(move |this, _, window, cx| {
this.open_tool_call_location(entry_ix, 0, window, cx);
@@ -3090,26 +3097,38 @@ impl AcpThreadView {
})
.children(connection.auth_methods().iter().enumerate().rev().map(
|(ix, method)| {
Button::new(
SharedString::from(method.id.0.clone()),
method.name.clone(),
)
.when(ix == 0, |el| {
el.style(ButtonStyle::Tinted(ui::TintColor::Warning))
})
.label_size(LabelSize::Small)
.on_click({
let method_id = method.id.clone();
cx.listener(move |this, _, window, cx| {
telemetry::event!(
"Authenticate Agent Started",
agent = this.agent.telemetry_id(),
method = method_id
);
let (method_id, name) = if self
.project
.read(cx)
.is_via_remote_server()
&& method.id.0.as_ref() == "oauth-personal"
&& method.name == "Log in with Google"
{
("spawn-gemini-cli".into(), "Log in with Gemini CLI".into())
} else {
(method.id.0.clone(), method.name.clone())
};
this.authenticate(method_id.clone(), window, cx)
Button::new(SharedString::from(method_id.clone()), name)
.when(ix == 0, |el| {
el.style(ButtonStyle::Tinted(ui::TintColor::Warning))
})
.label_size(LabelSize::Small)
.on_click({
cx.listener(move |this, _, window, cx| {
telemetry::event!(
"Authenticate Agent Started",
agent = this.agent.telemetry_id(),
method = method_id
);
this.authenticate(
acp::AuthMethodId(method_id.clone()),
window,
cx,
)
})
})
})
},
)),
)
@@ -4255,7 +4274,7 @@ impl AcpThreadView {
}
let buffer = project.update(cx, |project, cx| {
project.create_local_buffer(&markdown, Some(markdown_language), cx)
project.create_local_buffer(&markdown, Some(markdown_language), true, cx)
});
let buffer = cx.new(|cx| {
MultiBuffer::singleton(buffer, cx).with_title(thread_summary.clone())
@@ -5712,11 +5731,11 @@ pub(crate) mod tests {
fn connect(
&self,
_root_dir: &Path,
_root_dir: Option<&Path>,
_delegate: AgentServerDelegate,
_cx: &mut App,
) -> Task<gpui::Result<Rc<dyn AgentConnection>>> {
Task::ready(Ok(Rc::new(self.connection.clone())))
) -> Task<gpui::Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
Task::ready(Ok((Rc::new(self.connection.clone()), None)))
}
fn into_any(self: Rc<Self>) -> Rc<dyn Any> {

View File

@@ -3585,7 +3585,7 @@ pub(crate) fn open_active_thread_as_markdown(
}
let buffer = project.update(cx, |project, cx| {
project.create_local_buffer(&markdown, Some(markdown_language), cx)
project.create_local_buffer(&markdown, Some(markdown_language), true, cx)
});
let buffer =
cx.new(|cx| MultiBuffer::singleton(buffer, cx).with_title(thread_summary.clone()));

View File

@@ -5,7 +5,6 @@ mod tool_picker;
use std::{ops::Range, sync::Arc};
use agent_servers::{AgentServerCommand, AllAgentServersSettings, CustomAgentServerSettings};
use agent_settings::AgentSettings;
use anyhow::Result;
use assistant_tool::{ToolSource, ToolWorkingSet};
@@ -26,6 +25,10 @@ use language_model::{
};
use notifications::status_toast::{StatusToast, ToastIcon};
use project::{
agent_server_store::{
AgentServerCommand, AgentServerStore, AllAgentServersSettings, CLAUDE_CODE_NAME,
CustomAgentServerSettings, GEMINI_NAME,
},
context_server_store::{ContextServerConfiguration, ContextServerStatus, ContextServerStore},
project_settings::{ContextServerSettings, ProjectSettings},
};
@@ -45,11 +48,13 @@ pub(crate) use manage_profiles_modal::ManageProfilesModal;
use crate::{
AddContextServer, ExternalAgent, NewExternalAgentThread,
agent_configuration::add_llm_provider_modal::{AddLlmProviderModal, LlmCompatibleProvider},
placeholder_command,
};
pub struct AgentConfiguration {
fs: Arc<dyn Fs>,
language_registry: Arc<LanguageRegistry>,
agent_server_store: Entity<AgentServerStore>,
workspace: WeakEntity<Workspace>,
focus_handle: FocusHandle,
configuration_views_by_provider: HashMap<LanguageModelProviderId, AnyView>,
@@ -66,6 +71,7 @@ pub struct AgentConfiguration {
impl AgentConfiguration {
pub fn new(
fs: Arc<dyn Fs>,
agent_server_store: Entity<AgentServerStore>,
context_server_store: Entity<ContextServerStore>,
tools: Entity<ToolWorkingSet>,
language_registry: Arc<LanguageRegistry>,
@@ -104,6 +110,7 @@ impl AgentConfiguration {
workspace,
focus_handle,
configuration_views_by_provider: HashMap::default(),
agent_server_store,
context_server_store,
expanded_context_server_tools: HashMap::default(),
expanded_provider_configurations: HashMap::default(),
@@ -991,17 +998,30 @@ impl AgentConfiguration {
}
fn render_agent_servers_section(&mut self, cx: &mut Context<Self>) -> impl IntoElement {
let settings = AllAgentServersSettings::get_global(cx).clone();
let user_defined_agents = settings
let custom_settings = cx
.global::<SettingsStore>()
.get::<AllAgentServersSettings>(None)
.custom
.iter()
.map(|(name, settings)| {
.clone();
let user_defined_agents = self
.agent_server_store
.read(cx)
.external_agents()
.filter(|name| name.0 != GEMINI_NAME && name.0 != CLAUDE_CODE_NAME)
.cloned()
.collect::<Vec<_>>();
let user_defined_agents = user_defined_agents
.into_iter()
.map(|name| {
self.render_agent_server(
IconName::Ai,
name.clone(),
ExternalAgent::Custom {
name: name.clone(),
command: settings.command.clone(),
name: name.clone().into(),
command: custom_settings
.get(&name.0)
.map(|settings| settings.command.clone())
.unwrap_or(placeholder_command()),
},
cx,
)

View File

@@ -5,9 +5,11 @@ use std::sync::Arc;
use std::time::Duration;
use acp_thread::AcpThread;
use agent_servers::AgentServerCommand;
use agent2::{DbThreadMetadata, HistoryEntry};
use db::kvp::{Dismissable, KEY_VALUE_STORE};
use project::agent_server_store::{
AgentServerCommand, AllAgentServersSettings, CLAUDE_CODE_NAME, GEMINI_NAME,
};
use serde::{Deserialize, Serialize};
use zed_actions::OpenBrowser;
use zed_actions::agent::{OpenClaudeCodeOnboardingModal, ReauthenticateAgent};
@@ -33,7 +35,9 @@ use crate::{
thread_history::{HistoryEntryElement, ThreadHistory},
ui::{AgentOnboardingModal, EndTrialUpsell},
};
use crate::{ExternalAgent, NewExternalAgentThread, NewNativeAgentThreadFromSummary};
use crate::{
ExternalAgent, NewExternalAgentThread, NewNativeAgentThreadFromSummary, placeholder_command,
};
use agent::{
Thread, ThreadError, ThreadEvent, ThreadId, ThreadSummary, TokenUsageRatio,
context_store::ContextStore,
@@ -62,7 +66,7 @@ use project::{DisableAiSettings, Project, ProjectPath, Worktree};
use prompt_store::{PromptBuilder, PromptStore, UserPromptId};
use rules_library::{RulesLibrary, open_rules_library};
use search::{BufferSearchBar, buffer_search};
use settings::{Settings, update_settings_file};
use settings::{Settings, SettingsStore, update_settings_file};
use theme::ThemeSettings;
use time::UtcOffset;
use ui::utils::WithRemSize;
@@ -1094,7 +1098,7 @@ impl AgentPanel {
let workspace = self.workspace.clone();
let project = self.project.clone();
let fs = self.fs.clone();
let is_not_local = !self.project.read(cx).is_local();
let is_via_collab = self.project.read(cx).is_via_collab();
const LAST_USED_EXTERNAL_AGENT_KEY: &str = "agent_panel__last_used_external_agent";
@@ -1126,7 +1130,7 @@ impl AgentPanel {
agent
}
None => {
if is_not_local {
if is_via_collab {
ExternalAgent::NativeAgent
} else {
cx.background_spawn(async move {
@@ -1503,6 +1507,7 @@ impl AgentPanel {
}
pub(crate) fn open_configuration(&mut self, window: &mut Window, cx: &mut Context<Self>) {
let agent_server_store = self.project.read(cx).agent_server_store().clone();
let context_server_store = self.project.read(cx).context_server_store();
let tools = self.thread_store.read(cx).tools();
let fs = self.fs.clone();
@@ -1511,6 +1516,7 @@ impl AgentPanel {
self.configuration = Some(cx.new(|cx| {
AgentConfiguration::new(
fs,
agent_server_store,
context_server_store,
tools,
self.language_registry.clone(),
@@ -2503,6 +2509,7 @@ impl AgentPanel {
}
fn render_toolbar_new(&self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let agent_server_store = self.project.read(cx).agent_server_store().clone();
let focus_handle = self.focus_handle(cx);
let active_thread = match &self.active_view {
@@ -2535,8 +2542,10 @@ impl AgentPanel {
.with_handle(self.new_thread_menu_handle.clone())
.menu({
let workspace = self.workspace.clone();
let is_not_local = workspace
.update(cx, |workspace, cx| !workspace.project().read(cx).is_local())
let is_via_collab = workspace
.update(cx, |workspace, cx| {
workspace.project().read(cx).is_via_collab()
})
.unwrap_or_default();
move |window, cx| {
@@ -2628,7 +2637,7 @@ impl AgentPanel {
ContextMenuEntry::new("New Gemini CLI Thread")
.icon(IconName::AiGemini)
.icon_color(Color::Muted)
.disabled(is_not_local)
.disabled(is_via_collab)
.handler({
let workspace = workspace.clone();
move |window, cx| {
@@ -2655,7 +2664,7 @@ impl AgentPanel {
menu.item(
ContextMenuEntry::new("New Claude Code Thread")
.icon(IconName::AiClaude)
.disabled(is_not_local)
.disabled(is_via_collab)
.icon_color(Color::Muted)
.handler({
let workspace = workspace.clone();
@@ -2680,19 +2689,25 @@ impl AgentPanel {
)
})
.when(cx.has_flag::<GeminiAndNativeFeatureFlag>(), |mut menu| {
// Add custom agents from settings
let settings =
agent_servers::AllAgentServersSettings::get_global(cx);
for (agent_name, agent_settings) in &settings.custom {
let agent_names = agent_server_store
.read(cx)
.external_agents()
.filter(|name| {
name.0 != GEMINI_NAME && name.0 != CLAUDE_CODE_NAME
})
.cloned()
.collect::<Vec<_>>();
let custom_settings = cx.global::<SettingsStore>().get::<AllAgentServersSettings>(None).custom.clone();
for agent_name in agent_names {
menu = menu.item(
ContextMenuEntry::new(format!("New {} Thread", agent_name))
.icon(IconName::Terminal)
.icon_color(Color::Muted)
.disabled(is_not_local)
.disabled(is_via_collab)
.handler({
let workspace = workspace.clone();
let agent_name = agent_name.clone();
let agent_settings = agent_settings.clone();
let custom_settings = custom_settings.clone();
move |window, cx| {
if let Some(workspace) = workspace.upgrade() {
workspace.update(cx, |workspace, cx| {
@@ -2703,10 +2718,9 @@ impl AgentPanel {
panel.new_agent_thread(
AgentType::Custom {
name: agent_name
.clone(),
command: agent_settings
.command
.clone(),
.clone()
.into(),
command: custom_settings.get(&agent_name.0).map(|settings| settings.command.clone()).unwrap_or(placeholder_command())
},
window,
cx,

View File

@@ -28,7 +28,6 @@ use std::rc::Rc;
use std::sync::Arc;
use agent::{Thread, ThreadId};
use agent_servers::AgentServerCommand;
use agent_settings::{AgentProfileId, AgentSettings, LanguageModelSelection};
use assistant_slash_command::SlashCommandRegistry;
use client::Client;
@@ -41,6 +40,7 @@ use language_model::{
ConfiguredModel, LanguageModel, LanguageModelId, LanguageModelProviderId, LanguageModelRegistry,
};
use project::DisableAiSettings;
use project::agent_server_store::AgentServerCommand;
use prompt_store::PromptBuilder;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
@@ -174,6 +174,14 @@ enum ExternalAgent {
},
}
fn placeholder_command() -> AgentServerCommand {
AgentServerCommand {
path: "/placeholder".into(),
args: vec![],
env: None,
}
}
impl ExternalAgent {
fn name(&self) -> &'static str {
match self {
@@ -193,10 +201,9 @@ impl ExternalAgent {
Self::Gemini => Rc::new(agent_servers::Gemini),
Self::ClaudeCode => Rc::new(agent_servers::ClaudeCode),
Self::NativeAgent => Rc::new(agent2::NativeAgentServer::new(fs, history)),
Self::Custom { name, command } => Rc::new(agent_servers::CustomAgentServer::new(
name.clone(),
command.clone(),
)),
Self::Custom { name, command: _ } => {
Rc::new(agent_servers::CustomAgentServer::new(name.clone()))
}
}
}
}

View File

@@ -88,10 +88,7 @@ fn view_release_notes_locally(
.update_in(cx, |workspace, window, cx| {
let project = workspace.project().clone();
let buffer = project.update(cx, |project, cx| {
let buffer = project.create_local_buffer("", markdown, cx);
project
.mark_buffer_as_non_searchable(buffer.read(cx).remote_id(), cx);
buffer
project.create_local_buffer("", markdown, false, cx)
});
buffer.update(cx, |buffer, cx| {
buffer.edit([(0..0, body.release_notes)], None, cx)

View File

@@ -2098,7 +2098,7 @@ async fn test_following_after_replacement(cx_a: &mut TestAppContext, cx_b: &mut
share_workspace(&workspace, cx_a).await.unwrap();
let buffer = workspace.update(cx_a, |workspace, cx| {
workspace.project().update(cx, |project, cx| {
project.create_local_buffer(&sample_text(26, 5, 'a'), None, cx)
project.create_local_buffer(&sample_text(26, 5, 'a'), None, false, cx)
})
});
let multibuffer = cx_a.new(|cx| {

View File

@@ -2506,7 +2506,7 @@ async fn test_propagate_saves_and_fs_changes(
});
let new_buffer_a = project_a
.update(cx_a, |p, cx| p.create_buffer(cx))
.update(cx_a, |p, cx| p.create_buffer(false, cx))
.await
.unwrap();

View File

@@ -0,0 +1,982 @@
use crate::{
DIAGNOSTICS_UPDATE_DELAY, IncludeWarnings, ToggleWarnings, context_range_for_entry,
diagnostic_renderer::{DiagnosticBlock, DiagnosticRenderer},
toolbar_controls::DiagnosticsToolbarEditor,
};
use anyhow::Result;
use collections::HashMap;
use editor::{
Editor, EditorEvent, ExcerptRange, MultiBuffer, PathKey,
display_map::{BlockPlacement, BlockProperties, BlockStyle, CustomBlockId},
multibuffer_context_lines,
};
use gpui::{
AnyElement, App, AppContext, Context, Entity, EntityId, EventEmitter, FocusHandle, Focusable,
InteractiveElement, IntoElement, ParentElement, Render, SharedString, Styled, Subscription,
Task, WeakEntity, Window, actions, div,
};
use language::{Buffer, DiagnosticEntry, Point};
use project::{
DiagnosticSummary, Event, Project, ProjectItem, ProjectPath,
project_settings::{DiagnosticSeverity, ProjectSettings},
};
use settings::Settings;
use std::{
any::{Any, TypeId},
cmp::Ordering,
sync::Arc,
};
use text::{Anchor, BufferSnapshot, OffsetRangeExt};
use ui::{Button, ButtonStyle, Icon, IconName, Label, Tooltip, h_flex, prelude::*};
use util::paths::PathExt;
use workspace::{
ItemHandle, ItemNavHistory, ToolbarItemLocation, Workspace,
item::{BreadcrumbText, Item, ItemEvent, TabContentParams},
};
actions!(
diagnostics,
[
/// Opens the project diagnostics view for the currently focused file.
DeployCurrentFile,
]
);
/// The `BufferDiagnosticsEditor` is meant for dealing with diagnostics for
/// a single buffer: only the excerpts of the buffer where diagnostics are
/// present are displayed.
pub(crate) struct BufferDiagnosticsEditor {
pub project: Entity<Project>,
focus_handle: FocusHandle,
editor: Entity<Editor>,
/// The current diagnostic entries in the `BufferDiagnosticsEditor`. Used to
/// allow quick comparison of updated diagnostics, to confirm if anything
/// has changed.
pub(crate) diagnostics: Vec<DiagnosticEntry<Anchor>>,
/// The blocks used to display the diagnostics' content in the editor, next
/// to the excerpts where the diagnostic originated.
blocks: Vec<CustomBlockId>,
/// Multibuffer to contain all excerpts that contain diagnostics, which are
/// to be rendered in the editor.
multibuffer: Entity<MultiBuffer>,
/// The buffer for which the editor displays diagnostics and excerpts.
buffer: Option<Entity<Buffer>>,
/// The path for which the editor displays diagnostics.
project_path: ProjectPath,
/// Summary of the number of warnings and errors for the path. Used to
/// display the number of warnings and errors in the tab's content.
summary: DiagnosticSummary,
/// Whether to include warnings in the list of diagnostics shown in the
/// editor.
pub(crate) include_warnings: bool,
/// Keeps track of whether there's a background task already running to
/// update the excerpts, in order to avoid firing multiple tasks for this purpose.
pub(crate) update_excerpts_task: Option<Task<Result<()>>>,
/// The project's subscription, responsible for processing events related to
/// diagnostics.
_subscription: Subscription,
}
impl BufferDiagnosticsEditor {
/// Creates a new instance of the `BufferDiagnosticsEditor`, which can then be
/// displayed by adding it to a pane.
pub fn new(
project_path: ProjectPath,
project_handle: Entity<Project>,
buffer: Option<Entity<Buffer>>,
include_warnings: bool,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
// Subscribe to project events related to diagnostics so the
// `BufferDiagnosticsEditor` can update its state accordingly.
let project_event_subscription = cx.subscribe_in(
&project_handle,
window,
|buffer_diagnostics_editor, _project, event, window, cx| match event {
Event::DiskBasedDiagnosticsStarted { .. } => {
cx.notify();
}
Event::DiskBasedDiagnosticsFinished { .. } => {
buffer_diagnostics_editor.update_all_excerpts(window, cx);
}
Event::DiagnosticsUpdated {
paths,
language_server_id,
} => {
// When diagnostics have been updated, the
// `BufferDiagnosticsEditor` should update its state only if
// one of the paths matches its `project_path`; otherwise the
// event is ignored.
if paths.contains(&buffer_diagnostics_editor.project_path) {
buffer_diagnostics_editor.update_diagnostic_summary(cx);
if buffer_diagnostics_editor.editor.focus_handle(cx).contains_focused(window, cx)
    || buffer_diagnostics_editor.focus_handle.contains_focused(window, cx)
{
log::debug!("diagnostics updated for server {language_server_id}. recording change");
} else {
log::debug!("diagnostics updated for server {language_server_id}. updating excerpts");
buffer_diagnostics_editor.update_all_excerpts(window, cx);
}
}
}
_ => {}
},
);
let focus_handle = cx.focus_handle();
cx.on_focus_in(
&focus_handle,
window,
|buffer_diagnostics_editor, window, cx| buffer_diagnostics_editor.focus_in(window, cx),
)
.detach();
cx.on_focus_out(
&focus_handle,
window,
|buffer_diagnostics_editor, _event, window, cx| {
buffer_diagnostics_editor.focus_out(window, cx)
},
)
.detach();
let summary = project_handle
.read(cx)
.diagnostic_summary_for_path(&project_path, cx);
let multibuffer = cx.new(|cx| MultiBuffer::new(project_handle.read(cx).capability()));
let max_severity = Self::max_diagnostics_severity(include_warnings);
let editor = cx.new(|cx| {
let mut editor = Editor::for_multibuffer(
multibuffer.clone(),
Some(project_handle.clone()),
window,
cx,
);
editor.set_vertical_scroll_margin(5, cx);
editor.disable_inline_diagnostics();
editor.set_max_diagnostics_severity(max_severity, cx);
editor.set_all_diagnostics_active(cx);
editor
});
// Subscribe to events triggered by the editor in order to correctly
// update the buffer's excerpts.
cx.subscribe_in(
&editor,
window,
|buffer_diagnostics_editor, _editor, event: &EditorEvent, window, cx| {
cx.emit(event.clone());
match event {
// If the user tries to focus on the editor but there are actually
// no excerpts for the buffer, focus back on the
// `BufferDiagnosticsEditor` instance.
EditorEvent::Focused => {
if buffer_diagnostics_editor.multibuffer.read(cx).is_empty() {
window.focus(&buffer_diagnostics_editor.focus_handle);
}
}
EditorEvent::Blurred => {
buffer_diagnostics_editor.update_all_excerpts(window, cx)
}
_ => {}
}
},
)
.detach();
let diagnostics = vec![];
let update_excerpts_task = None;
let mut buffer_diagnostics_editor = Self {
project: project_handle,
focus_handle,
editor,
diagnostics,
blocks: Default::default(),
multibuffer,
buffer,
project_path,
summary,
include_warnings,
update_excerpts_task,
_subscription: project_event_subscription,
};
buffer_diagnostics_editor.update_all_diagnostics(window, cx);
buffer_diagnostics_editor
}
fn deploy(
workspace: &mut Workspace,
_: &DeployCurrentFile,
window: &mut Window,
cx: &mut Context<Workspace>,
) {
// Determine the currently opened path by finding the active editor
// and reading the project path for its buffer.
// If there's no active editor with a project path, avoid deploying
// the buffer diagnostics view.
if let Some(editor) = workspace.active_item_as::<Editor>(cx)
&& let Some(project_path) = editor.project_path(cx)
{
// Check if there's already a `BufferDiagnosticsEditor` tab for this
// same path, and if so, focus on that one instead of creating a new
// one.
let existing_editor = workspace
.items_of_type::<BufferDiagnosticsEditor>(cx)
.find(|editor| editor.read(cx).project_path == project_path);
if let Some(editor) = existing_editor {
workspace.activate_item(&editor, true, true, window, cx);
} else {
let include_warnings = match cx.try_global::<IncludeWarnings>() {
Some(include_warnings) => include_warnings.0,
None => ProjectSettings::get_global(cx).diagnostics.include_warnings,
};
let item = cx.new(|cx| {
Self::new(
project_path,
workspace.project().clone(),
editor.read(cx).buffer().read(cx).as_singleton(),
include_warnings,
window,
cx,
)
});
workspace.add_item_to_active_pane(Box::new(item), None, true, window, cx);
}
}
}
pub fn register(
workspace: &mut Workspace,
_window: Option<&mut Window>,
_: &mut Context<Workspace>,
) {
workspace.register_action(Self::deploy);
}
fn update_all_diagnostics(&mut self, window: &mut Window, cx: &mut Context<Self>) {
self.update_all_excerpts(window, cx);
}
fn update_diagnostic_summary(&mut self, cx: &mut Context<Self>) {
let project = self.project.read(cx);
self.summary = project.diagnostic_summary_for_path(&self.project_path, cx);
}
/// Enqueue an update to the excerpts and diagnostic blocks being shown in
/// the editor.
pub(crate) fn update_all_excerpts(&mut self, window: &mut Window, cx: &mut Context<Self>) {
// If there's already a task updating the excerpts, return early and
// let the existing task finish.
if self.update_excerpts_task.is_some() {
return;
}
let buffer = self.buffer.clone();
self.update_excerpts_task = Some(cx.spawn_in(window, async move |editor, cx| {
cx.background_executor()
.timer(DIAGNOSTICS_UPDATE_DELAY)
.await;
if let Some(buffer) = buffer {
editor
.update_in(cx, |editor, window, cx| {
editor.update_excerpts(buffer, window, cx)
})?
.await?;
};
let _ = editor.update(cx, |editor, cx| {
editor.update_excerpts_task = None;
cx.notify();
});
Ok(())
}));
}
/// Updates the excerpts in the `BufferDiagnosticsEditor` for a single
/// buffer.
fn update_excerpts(
&mut self,
buffer: Entity<Buffer>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
let was_empty = self.multibuffer.read(cx).is_empty();
let multibuffer_context = multibuffer_context_lines(cx);
let buffer_snapshot = buffer.read(cx).snapshot();
let buffer_snapshot_max = buffer_snapshot.max_point();
let max_severity = Self::max_diagnostics_severity(self.include_warnings)
.into_lsp()
.unwrap_or(lsp::DiagnosticSeverity::WARNING);
cx.spawn_in(window, async move |buffer_diagnostics_editor, mut cx| {
// Fetch the diagnostics for the whole buffer
// (`Point::zero()..buffer_snapshot.max_point()`) so we can check
// whether the diagnostics changed; if they haven't, return early as
// there's nothing to update.
let diagnostics = buffer_snapshot
.diagnostics_in_range::<_, Anchor>(Point::zero()..buffer_snapshot_max, false)
.collect::<Vec<_>>();
let unchanged =
buffer_diagnostics_editor.update(cx, |buffer_diagnostics_editor, _cx| {
if buffer_diagnostics_editor
.diagnostics_are_unchanged(&diagnostics, &buffer_snapshot)
{
return true;
}
buffer_diagnostics_editor.set_diagnostics(&diagnostics);
return false;
})?;
if unchanged {
return Ok(());
}
// Mapping between the Group ID and a vector of DiagnosticEntry.
let mut grouped: HashMap<usize, Vec<_>> = HashMap::default();
for entry in diagnostics {
grouped
.entry(entry.diagnostic.group_id)
.or_default()
.push(DiagnosticEntry {
range: entry.range.to_point(&buffer_snapshot),
diagnostic: entry.diagnostic,
})
}
let mut blocks: Vec<DiagnosticBlock> = Vec::new();
for (_, group) in grouped {
// Skip this group if its most severe diagnostic is less severe
// than the configured maximum (LSP severity values are numeric,
// with lower values being more severe), or has no severity at all.
if group
.iter()
.map(|d| d.diagnostic.severity)
.min()
.is_none_or(|severity| severity > max_severity)
{
continue;
}
let diagnostic_blocks = cx.update(|_window, cx| {
DiagnosticRenderer::diagnostic_blocks_for_group(
group,
buffer_snapshot.remote_id(),
Some(Arc::new(buffer_diagnostics_editor.clone())),
cx,
)
})?;
// For each of the diagnostic blocks to be displayed in the
// editor, figure out its index in the list of blocks.
//
// The following rules are used to determine the order:
// 1. Blocks with a lower start position should come first.
// 2. If two blocks have the same start position, the one with
// the higher end position should come first.
for diagnostic_block in diagnostic_blocks {
let index = blocks.partition_point(|probe| {
match probe
.initial_range
.start
.cmp(&diagnostic_block.initial_range.start)
{
Ordering::Less => true,
Ordering::Greater => false,
Ordering::Equal => {
probe.initial_range.end > diagnostic_block.initial_range.end
}
}
});
blocks.insert(index, diagnostic_block);
}
}
// Build the excerpt ranges for this buffer's diagnostics, so they
// can later be used to update the excerpts shown in the editor.
// This is done by iterating over the diagnostic blocks and
// determining the range each block spans.
let mut excerpt_ranges: Vec<ExcerptRange<Point>> = Vec::new();
for diagnostic_block in blocks.iter() {
let excerpt_range = context_range_for_entry(
diagnostic_block.initial_range.clone(),
multibuffer_context,
buffer_snapshot.clone(),
&mut cx,
)
.await;
let index = excerpt_ranges
.binary_search_by(|probe| {
probe
.context
.start
.cmp(&excerpt_range.start)
.then(probe.context.end.cmp(&excerpt_range.end))
.then(
probe
.primary
.start
.cmp(&diagnostic_block.initial_range.start),
)
.then(probe.primary.end.cmp(&diagnostic_block.initial_range.end))
.then(Ordering::Greater)
})
.unwrap_or_else(|index| index);
excerpt_ranges.insert(
index,
ExcerptRange {
context: excerpt_range,
primary: diagnostic_block.initial_range.clone(),
},
)
}
// Finally, update the editor's content with the new excerpt ranges
// for this editor, as well as the diagnostic blocks.
buffer_diagnostics_editor.update_in(cx, |buffer_diagnostics_editor, window, cx| {
// Remove the list of `CustomBlockId`s from the editor's display
// map, ensuring that if any diagnostics have been resolved, the
// associated blocks stop being shown.
let block_ids = buffer_diagnostics_editor.blocks.clone();
buffer_diagnostics_editor.editor.update(cx, |editor, cx| {
editor.display_map.update(cx, |display_map, cx| {
display_map.remove_blocks(block_ids.into_iter().collect(), cx);
})
});
let (anchor_ranges, _) =
buffer_diagnostics_editor
.multibuffer
.update(cx, |multibuffer, cx| {
multibuffer.set_excerpt_ranges_for_path(
PathKey::for_buffer(&buffer, cx),
buffer.clone(),
&buffer_snapshot,
excerpt_ranges,
cx,
)
});
if was_empty {
if let Some(anchor_range) = anchor_ranges.first() {
let range_to_select = anchor_range.start..anchor_range.start;
buffer_diagnostics_editor.editor.update(cx, |editor, cx| {
editor.change_selections(Default::default(), window, cx, |selection| {
selection.select_anchor_ranges([range_to_select])
})
});
// If the `BufferDiagnosticsEditor` is currently
// focused, move focus to its editor.
if buffer_diagnostics_editor.focus_handle.is_focused(window) {
buffer_diagnostics_editor
.editor
.read(cx)
.focus_handle(cx)
.focus(window);
}
}
}
// Clone the blocks before ownership is moved so they can later be
// used to set the block contents for testing purposes.
#[cfg(test)]
let cloned_blocks = blocks.clone();
// Build new diagnostic blocks to be added to the editor's
// display map for the new diagnostics. Update the `blocks`
// property before finishing, to ensure the blocks are removed
// on the next execution.
let editor_blocks =
anchor_ranges
.into_iter()
.zip(blocks.into_iter())
.map(|(anchor, block)| {
let editor = buffer_diagnostics_editor.editor.downgrade();
BlockProperties {
placement: BlockPlacement::Near(anchor.start),
height: Some(1),
style: BlockStyle::Flex,
render: Arc::new(move |block_context| {
block.render_block(editor.clone(), block_context)
}),
priority: 1,
}
});
let block_ids = buffer_diagnostics_editor.editor.update(cx, |editor, cx| {
editor.display_map.update(cx, |display_map, cx| {
display_map.insert_blocks(editor_blocks, cx)
})
});
// In order to be able to verify which diagnostic blocks are
// rendered in the editor, the `set_block_content_for_tests`
// function must be used, so that the
// `editor::test::editor_content_with_blocks` function can then
// be called to fetch these blocks.
#[cfg(test)]
{
for (block_id, block) in block_ids.iter().zip(cloned_blocks.iter()) {
let markdown = block.markdown.clone();
editor::test::set_block_content_for_tests(
&buffer_diagnostics_editor.editor,
*block_id,
cx,
move |cx| {
markdown::MarkdownElement::rendered_text(
markdown.clone(),
cx,
editor::hover_popover::diagnostics_markdown_style,
)
},
);
}
}
buffer_diagnostics_editor.blocks = block_ids;
cx.notify()
})
})
}
fn set_diagnostics(&mut self, diagnostics: &Vec<DiagnosticEntry<Anchor>>) {
self.diagnostics = diagnostics.clone();
}
fn diagnostics_are_unchanged(
&self,
diagnostics: &Vec<DiagnosticEntry<Anchor>>,
snapshot: &BufferSnapshot,
) -> bool {
if self.diagnostics.len() != diagnostics.len() {
return false;
}
self.diagnostics
.iter()
.zip(diagnostics.iter())
.all(|(existing, new)| {
existing.diagnostic.message == new.diagnostic.message
&& existing.diagnostic.severity == new.diagnostic.severity
&& existing.diagnostic.is_primary == new.diagnostic.is_primary
&& existing.range.to_offset(snapshot) == new.range.to_offset(snapshot)
})
}
fn focus_in(&mut self, window: &mut Window, cx: &mut Context<Self>) {
// If the `BufferDiagnosticsEditor` is focused and the multibuffer is
// not empty, focus on the editor instead, which will allow the user to
// start interacting and editing the buffer's contents.
if self.focus_handle.is_focused(window) && !self.multibuffer.read(cx).is_empty() {
self.editor.focus_handle(cx).focus(window)
}
}
fn focus_out(&mut self, window: &mut Window, cx: &mut Context<Self>) {
if !self.focus_handle.is_focused(window) && !self.editor.focus_handle(cx).is_focused(window)
{
self.update_all_excerpts(window, cx);
}
}
pub fn toggle_warnings(
&mut self,
_: &ToggleWarnings,
window: &mut Window,
cx: &mut Context<Self>,
) {
let include_warnings = !self.include_warnings;
let max_severity = Self::max_diagnostics_severity(include_warnings);
self.editor.update(cx, |editor, cx| {
editor.set_max_diagnostics_severity(max_severity, cx);
});
self.include_warnings = include_warnings;
self.diagnostics.clear();
self.update_all_diagnostics(window, cx);
}
fn max_diagnostics_severity(include_warnings: bool) -> DiagnosticSeverity {
match include_warnings {
true => DiagnosticSeverity::Warning,
false => DiagnosticSeverity::Error,
}
}
#[cfg(test)]
pub fn editor(&self) -> &Entity<Editor> {
&self.editor
}
#[cfg(test)]
pub fn summary(&self) -> &DiagnosticSummary {
&self.summary
}
}
impl Focusable for BufferDiagnosticsEditor {
fn focus_handle(&self, _: &App) -> FocusHandle {
self.focus_handle.clone()
}
}
impl EventEmitter<EditorEvent> for BufferDiagnosticsEditor {}
impl Item for BufferDiagnosticsEditor {
type Event = EditorEvent;
fn act_as_type<'a>(
&'a self,
type_id: std::any::TypeId,
self_handle: &'a Entity<Self>,
_: &'a App,
) -> Option<gpui::AnyView> {
if type_id == TypeId::of::<Self>() {
Some(self_handle.to_any())
} else if type_id == TypeId::of::<Editor>() {
Some(self.editor.to_any())
} else {
None
}
}
fn added_to_workspace(
&mut self,
workspace: &mut Workspace,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.editor.update(cx, |editor, cx| {
editor.added_to_workspace(workspace, window, cx)
});
}
fn breadcrumb_location(&self, _: &App) -> ToolbarItemLocation {
ToolbarItemLocation::PrimaryLeft
}
fn breadcrumbs(&self, theme: &theme::Theme, cx: &App) -> Option<Vec<BreadcrumbText>> {
self.editor.breadcrumbs(theme, cx)
}
fn can_save(&self, _cx: &App) -> bool {
true
}
fn clone_on_split(
&self,
_workspace_id: Option<workspace::WorkspaceId>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Option<Entity<Self>>
where
Self: Sized,
{
Some(cx.new(|cx| {
BufferDiagnosticsEditor::new(
self.project_path.clone(),
self.project.clone(),
self.buffer.clone(),
self.include_warnings,
window,
cx,
)
}))
}
fn deactivated(&mut self, window: &mut Window, cx: &mut Context<Self>) {
self.editor
.update(cx, |editor, cx| editor.deactivated(window, cx));
}
fn for_each_project_item(&self, cx: &App, f: &mut dyn FnMut(EntityId, &dyn ProjectItem)) {
self.editor.for_each_project_item(cx, f);
}
fn has_conflict(&self, cx: &App) -> bool {
self.multibuffer.read(cx).has_conflict(cx)
}
fn has_deleted_file(&self, cx: &App) -> bool {
self.multibuffer.read(cx).has_deleted_file(cx)
}
fn is_dirty(&self, cx: &App) -> bool {
self.multibuffer.read(cx).is_dirty(cx)
}
fn is_singleton(&self, _cx: &App) -> bool {
false
}
fn navigate(
&mut self,
data: Box<dyn Any>,
window: &mut Window,
cx: &mut Context<Self>,
) -> bool {
self.editor
.update(cx, |editor, cx| editor.navigate(data, window, cx))
}
fn reload(
&mut self,
project: Entity<Project>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
self.editor.reload(project, window, cx)
}
fn save(
&mut self,
options: workspace::item::SaveOptions,
project: Entity<Project>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
self.editor.save(options, project, window, cx)
}
fn save_as(
&mut self,
_project: Entity<Project>,
_path: ProjectPath,
_window: &mut Window,
_cx: &mut Context<Self>,
) -> Task<Result<()>> {
unreachable!()
}
fn set_nav_history(
&mut self,
nav_history: ItemNavHistory,
_window: &mut Window,
cx: &mut Context<Self>,
) {
self.editor.update(cx, |editor, _| {
editor.set_nav_history(Some(nav_history));
})
}
// Builds the content to be displayed in the tab.
fn tab_content(&self, params: TabContentParams, _window: &Window, _cx: &App) -> AnyElement {
let error_count = self.summary.error_count;
let warning_count = self.summary.warning_count;
let label = Label::new(
self.project_path
.path
.file_name()
.map(|f| f.to_sanitized_string())
.unwrap_or_else(|| self.project_path.path.to_sanitized_string()),
);
h_flex()
.gap_1()
.child(label)
.when(error_count == 0 && warning_count == 0, |parent| {
parent.child(
h_flex()
.gap_1()
.child(Icon::new(IconName::Check).color(Color::Success)),
)
})
.when(error_count > 0, |parent| {
parent.child(
h_flex()
.gap_1()
.child(Icon::new(IconName::XCircle).color(Color::Error))
.child(Label::new(error_count.to_string()).color(params.text_color())),
)
})
.when(warning_count > 0, |parent| {
parent.child(
h_flex()
.gap_1()
.child(Icon::new(IconName::Warning).color(Color::Warning))
.child(Label::new(warning_count.to_string()).color(params.text_color())),
)
})
.into_any_element()
}
fn tab_content_text(&self, _detail: usize, _app: &App) -> SharedString {
"Buffer Diagnostics".into()
}
fn tab_tooltip_text(&self, _: &App) -> Option<SharedString> {
Some(
format!(
"Buffer Diagnostics - {}",
self.project_path.path.to_sanitized_string()
)
.into(),
)
}
fn telemetry_event_text(&self) -> Option<&'static str> {
Some("Buffer Diagnostics Opened")
}
fn to_item_events(event: &EditorEvent, f: impl FnMut(ItemEvent)) {
Editor::to_item_events(event, f)
}
}
impl Render for BufferDiagnosticsEditor {
fn render(&mut self, _: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let filename = self.project_path.path.to_sanitized_string();
let error_count = self.summary.error_count;
let warning_count = match self.include_warnings {
true => self.summary.warning_count,
false => 0,
};
let child = if error_count + warning_count == 0 {
let label = match warning_count {
0 => "No problems in",
_ => "No errors in",
};
v_flex()
.key_context("EmptyPane")
.size_full()
.gap_1()
.justify_center()
.items_center()
.text_center()
.bg(cx.theme().colors().editor_background)
.child(
div()
.h_flex()
.child(Label::new(label).color(Color::Muted))
.child(
Button::new("open-file", filename)
.style(ButtonStyle::Transparent)
.tooltip(Tooltip::text("Open File"))
.on_click(cx.listener(|buffer_diagnostics, _, window, cx| {
if let Some(workspace) = window.root::<Workspace>().flatten() {
workspace.update(cx, |workspace, cx| {
workspace
.open_path(
buffer_diagnostics.project_path.clone(),
None,
true,
window,
cx,
)
.detach_and_log_err(cx);
})
}
})),
),
)
.when(self.summary.warning_count > 0, |div| {
let label = match self.summary.warning_count {
1 => "Show 1 warning".into(),
warning_count => format!("Show {} warnings", warning_count),
};
div.child(
Button::new("diagnostics-show-warning-label", label).on_click(cx.listener(
|buffer_diagnostics_editor, _, window, cx| {
buffer_diagnostics_editor.toggle_warnings(
&Default::default(),
window,
cx,
);
cx.notify();
},
)),
)
})
} else {
div().size_full().child(self.editor.clone())
};
div()
.key_context("Diagnostics")
.track_focus(&self.focus_handle(cx))
.size_full()
.child(child)
}
}
impl DiagnosticsToolbarEditor for WeakEntity<BufferDiagnosticsEditor> {
fn include_warnings(&self, cx: &App) -> bool {
self.read_with(cx, |buffer_diagnostics_editor, _cx| {
buffer_diagnostics_editor.include_warnings
})
.unwrap_or(false)
}
fn has_stale_excerpts(&self, _cx: &App) -> bool {
false
}
fn is_updating(&self, cx: &App) -> bool {
self.read_with(cx, |buffer_diagnostics_editor, cx| {
buffer_diagnostics_editor.update_excerpts_task.is_some()
|| buffer_diagnostics_editor
.project
.read(cx)
.language_servers_running_disk_based_diagnostics(cx)
.next()
.is_some()
})
.unwrap_or(false)
}
fn stop_updating(&self, cx: &mut App) {
let _ = self.update(cx, |buffer_diagnostics_editor, cx| {
buffer_diagnostics_editor.update_excerpts_task = None;
cx.notify();
});
}
fn refresh_diagnostics(&self, window: &mut Window, cx: &mut App) {
let _ = self.update(cx, |buffer_diagnostics_editor, cx| {
buffer_diagnostics_editor.update_all_excerpts(window, cx);
});
}
fn toggle_warnings(&self, window: &mut Window, cx: &mut App) {
let _ = self.update(cx, |buffer_diagnostics_editor, cx| {
buffer_diagnostics_editor.toggle_warnings(&Default::default(), window, cx);
});
}
fn get_diagnostics_for_buffer(
&self,
_buffer_id: text::BufferId,
cx: &App,
) -> Vec<language::DiagnosticEntry<text::Anchor>> {
self.read_with(cx, |buffer_diagnostics_editor, _cx| {
buffer_diagnostics_editor.diagnostics.clone()
})
.unwrap_or_default()
}
}
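The block-insertion loop in `update_excerpts` above keeps diagnostic blocks ordered by two rules: a lower start position comes first, and at equal start positions the higher end position comes first. A minimal, self-contained sketch of that `partition_point` ordering, using plain `(start, end)` pairs as stand-ins for `DiagnosticBlock` ranges (the helper name here is invented for illustration):

```rust
use std::cmp::Ordering;

/// Insert `new` into `blocks`, keeping the list sorted by ascending start
/// position and, for equal starts, by descending end position.
fn insert_block(blocks: &mut Vec<(u32, u32)>, new: (u32, u32)) {
    let index = blocks.partition_point(|probe| match probe.0.cmp(&new.0) {
        Ordering::Less => true,
        Ordering::Greater => false,
        Ordering::Equal => probe.1 > new.1,
    });
    blocks.insert(index, new);
}

fn main() {
    let mut blocks = Vec::new();
    for block in [(5, 6), (2, 4), (5, 8), (9, 10)] {
        insert_block(&mut blocks, block);
    }
    // Equal starts (5, _) end up ordered by descending end position.
    assert_eq!(blocks, vec![(2, 4), (5, 8), (5, 6), (9, 10)]);
}
```

`partition_point` counts the leading run of blocks that must stay before the new one, which gives the same index a linear scan over the ordering rules would.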

View File

@@ -18,7 +18,7 @@ use ui::{
};
use util::maybe;
-use crate::ProjectDiagnosticsEditor;
+use crate::toolbar_controls::DiagnosticsToolbarEditor;
pub struct DiagnosticRenderer;
@@ -26,7 +26,7 @@ impl DiagnosticRenderer {
pub fn diagnostic_blocks_for_group(
diagnostic_group: Vec<DiagnosticEntry<Point>>,
buffer_id: BufferId,
-diagnostics_editor: Option<WeakEntity<ProjectDiagnosticsEditor>>,
+diagnostics_editor: Option<Arc<dyn DiagnosticsToolbarEditor>>,
cx: &mut App,
) -> Vec<DiagnosticBlock> {
let Some(primary_ix) = diagnostic_group
@@ -130,6 +130,7 @@ impl editor::DiagnosticRenderer for DiagnosticRenderer {
cx: &mut App,
) -> Vec<BlockProperties<Anchor>> {
let blocks = Self::diagnostic_blocks_for_group(diagnostic_group, buffer_id, None, cx);
blocks
.into_iter()
.map(|block| {
@@ -182,7 +183,7 @@ pub(crate) struct DiagnosticBlock {
pub(crate) initial_range: Range<Point>,
pub(crate) severity: DiagnosticSeverity,
pub(crate) markdown: Entity<Markdown>,
-pub(crate) diagnostics_editor: Option<WeakEntity<ProjectDiagnosticsEditor>>,
+pub(crate) diagnostics_editor: Option<Arc<dyn DiagnosticsToolbarEditor>>,
}
impl DiagnosticBlock {
@@ -233,7 +234,7 @@ impl DiagnosticBlock {
pub fn open_link(
editor: &mut Editor,
-diagnostics_editor: &Option<WeakEntity<ProjectDiagnosticsEditor>>,
+diagnostics_editor: &Option<Arc<dyn DiagnosticsToolbarEditor>>,
link: SharedString,
window: &mut Window,
cx: &mut Context<Editor>,
@@ -254,18 +255,10 @@ impl DiagnosticBlock {
if let Some(diagnostics_editor) = diagnostics_editor {
if let Some(diagnostic) = diagnostics_editor
-.read_with(cx, |diagnostics, _| {
-    diagnostics
-        .diagnostics
-        .get(&buffer_id)
-        .cloned()
-        .unwrap_or_default()
-        .into_iter()
-        .filter(|d| d.diagnostic.group_id == group_id)
-        .nth(ix)
-})
-.ok()
-.flatten()
+.get_diagnostics_for_buffer(buffer_id, cx)
+.into_iter()
+.filter(|d| d.diagnostic.group_id == group_id)
+.nth(ix)
{
let multibuffer = editor.buffer().read(cx);
let Some(snapshot) = multibuffer
@@ -297,9 +290,9 @@ impl DiagnosticBlock {
};
}
-fn jump_to<T: ToOffset>(
+fn jump_to<I: ToOffset>(
editor: &mut Editor,
-range: Range<T>,
+range: Range<I>,
window: &mut Window,
cx: &mut Context<Editor>,
) {
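The change in this file swaps the concrete `WeakEntity<ProjectDiagnosticsEditor>` for an `Arc<dyn DiagnosticsToolbarEditor>`, so both the project-wide and the single-buffer diagnostics editors can drive the same renderer. A hedged sketch of the pattern with invented stand-in types (not Zed's real API):

```rust
use std::sync::Arc;

// Stand-in for the DiagnosticsToolbarEditor trait: the renderer only needs
// this narrow interface, not a concrete editor type.
trait DiagnosticsSource {
    fn include_warnings(&self) -> bool;
}

struct ProjectView;
struct BufferView;

impl DiagnosticsSource for ProjectView {
    fn include_warnings(&self) -> bool {
        true
    }
}

impl DiagnosticsSource for BufferView {
    fn include_warnings(&self) -> bool {
        false
    }
}

// The renderer holds Option<Arc<dyn Trait>>, so either editor (or none,
// for the plain editor path) can be plugged in.
fn render(source: Option<Arc<dyn DiagnosticsSource>>) -> bool {
    source.map_or(false, |s| s.include_warnings())
}

fn main() {
    assert!(render(Some(Arc::new(ProjectView) as Arc<dyn DiagnosticsSource>)));
    assert!(!render(Some(Arc::new(BufferView) as Arc<dyn DiagnosticsSource>)));
    assert!(!render(None));
}
```

Holding `Arc<dyn Trait>` rather than a weak entity handle also moves the "is the editor still alive?" question behind the trait, which is why each `read_with` in the real implementations falls back to a default via `unwrap_or`.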

View File

@@ -1,12 +1,14 @@
pub mod items;
mod toolbar_controls;
mod buffer_diagnostics;
mod diagnostic_renderer;
#[cfg(test)]
mod diagnostics_tests;
use anyhow::Result;
use buffer_diagnostics::BufferDiagnosticsEditor;
use collections::{BTreeSet, HashMap};
use diagnostic_renderer::DiagnosticBlock;
use editor::{
@@ -36,6 +38,7 @@ use std::{
};
use text::{BufferId, OffsetRangeExt};
use theme::ActiveTheme;
use toolbar_controls::DiagnosticsToolbarEditor;
pub use toolbar_controls::ToolbarControls;
use ui::{Icon, IconName, Label, h_flex, prelude::*};
use util::ResultExt;
@@ -64,6 +67,7 @@ impl Global for IncludeWarnings {}
pub fn init(cx: &mut App) {
editor::set_diagnostic_renderer(diagnostic_renderer::DiagnosticRenderer {}, cx);
cx.observe_new(ProjectDiagnosticsEditor::register).detach();
cx.observe_new(BufferDiagnosticsEditor::register).detach();
}
pub(crate) struct ProjectDiagnosticsEditor {
@@ -85,6 +89,7 @@ pub(crate) struct ProjectDiagnosticsEditor {
impl EventEmitter<EditorEvent> for ProjectDiagnosticsEditor {}
const DIAGNOSTICS_UPDATE_DELAY: Duration = Duration::from_millis(50);
const DIAGNOSTICS_SUMMARY_UPDATE_DELAY: Duration = Duration::from_millis(30);
impl Render for ProjectDiagnosticsEditor {
fn render(&mut self, _: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
@@ -144,7 +149,7 @@ impl Render for ProjectDiagnosticsEditor {
}
impl ProjectDiagnosticsEditor {
-fn register(
+pub fn register(
workspace: &mut Workspace,
_window: Option<&mut Window>,
_: &mut Context<Workspace>,
@@ -160,7 +165,7 @@ impl ProjectDiagnosticsEditor {
cx: &mut Context<Self>,
) -> Self {
let project_event_subscription =
-cx.subscribe_in(&project_handle, window, |this, project, event, window, cx| match event {
+cx.subscribe_in(&project_handle, window, |this, _project, event, window, cx| match event {
project::Event::DiskBasedDiagnosticsStarted { .. } => {
cx.notify();
}
@@ -173,13 +178,12 @@ impl ProjectDiagnosticsEditor {
paths,
} => {
this.paths_to_update.extend(paths.clone());
-let project = project.clone();
this.diagnostic_summary_update = cx.spawn(async move |this, cx| {
cx.background_executor()
-.timer(Duration::from_millis(30))
+.timer(DIAGNOSTICS_SUMMARY_UPDATE_DELAY)
.await;
this.update(cx, |this, cx| {
-this.summary = project.read(cx).diagnostic_summary(false, cx);
+this.update_diagnostic_summary(cx);
})
.log_err();
});
@@ -326,6 +330,7 @@ impl ProjectDiagnosticsEditor {
let is_active = workspace
.active_item(cx)
.is_some_and(|item| item.item_id() == existing.item_id());
workspace.activate_item(&existing, true, !is_active, window, cx);
} else {
let workspace_handle = cx.entity().downgrade();
@@ -383,22 +388,25 @@ impl ProjectDiagnosticsEditor {
/// currently have diagnostics or are currently present in this view.
fn update_all_excerpts(&mut self, window: &mut Window, cx: &mut Context<Self>) {
self.project.update(cx, |project, cx| {
-let mut paths = project
+let mut project_paths = project
.diagnostic_summaries(false, cx)
-.map(|(path, _, _)| path)
+.map(|(project_path, _, _)| project_path)
.collect::<BTreeSet<_>>();
self.multibuffer.update(cx, |multibuffer, cx| {
for buffer in multibuffer.all_buffers() {
if let Some(file) = buffer.read(cx).file() {
-paths.insert(ProjectPath {
+project_paths.insert(ProjectPath {
path: file.path().clone(),
worktree_id: file.worktree_id(cx),
});
}
}
});
-self.paths_to_update = paths;
+self.paths_to_update = project_paths;
});
self.update_stale_excerpts(window, cx);
}
@@ -428,6 +436,7 @@ impl ProjectDiagnosticsEditor {
let was_empty = self.multibuffer.read(cx).is_empty();
let buffer_snapshot = buffer.read(cx).snapshot();
let buffer_id = buffer_snapshot.remote_id();
let max_severity = if self.include_warnings {
lsp::DiagnosticSeverity::WARNING
} else {
@@ -441,6 +450,7 @@ impl ProjectDiagnosticsEditor {
false,
)
.collect::<Vec<_>>();
let unchanged = this.update(cx, |this, _| {
if this.diagnostics.get(&buffer_id).is_some_and(|existing| {
this.diagnostics_are_unchanged(existing, &diagnostics, &buffer_snapshot)
@@ -475,7 +485,7 @@ impl ProjectDiagnosticsEditor {
crate::diagnostic_renderer::DiagnosticRenderer::diagnostic_blocks_for_group(
group,
buffer_snapshot.remote_id(),
-Some(this.clone()),
+Some(Arc::new(this.clone())),
cx,
)
})?;
@@ -505,6 +515,7 @@ impl ProjectDiagnosticsEditor {
cx,
)
.await;
let i = excerpt_ranges
.binary_search_by(|probe| {
probe
@@ -574,6 +585,7 @@ impl ProjectDiagnosticsEditor {
priority: 1,
}
});
let block_ids = this.editor.update(cx, |editor, cx| {
editor.display_map.update(cx, |display_map, cx| {
display_map.insert_blocks(editor_blocks, cx)
@@ -604,6 +616,10 @@ impl ProjectDiagnosticsEditor {
})
})
}
fn update_diagnostic_summary(&mut self, cx: &mut Context<Self>) {
self.summary = self.project.read(cx).diagnostic_summary(false, cx);
}
}
impl Focusable for ProjectDiagnosticsEditor {
@@ -812,6 +828,68 @@ impl Item for ProjectDiagnosticsEditor {
}
}
impl DiagnosticsToolbarEditor for WeakEntity<ProjectDiagnosticsEditor> {
fn include_warnings(&self, cx: &App) -> bool {
self.read_with(cx, |project_diagnostics_editor, _cx| {
project_diagnostics_editor.include_warnings
})
.unwrap_or(false)
}
fn has_stale_excerpts(&self, cx: &App) -> bool {
self.read_with(cx, |project_diagnostics_editor, _cx| {
!project_diagnostics_editor.paths_to_update.is_empty()
})
.unwrap_or(false)
}
fn is_updating(&self, cx: &App) -> bool {
self.read_with(cx, |project_diagnostics_editor, cx| {
project_diagnostics_editor.update_excerpts_task.is_some()
|| project_diagnostics_editor
.project
.read(cx)
.language_servers_running_disk_based_diagnostics(cx)
.next()
.is_some()
})
.unwrap_or(false)
}
fn stop_updating(&self, cx: &mut App) {
let _ = self.update(cx, |project_diagnostics_editor, cx| {
project_diagnostics_editor.update_excerpts_task = None;
cx.notify();
});
}
fn refresh_diagnostics(&self, window: &mut Window, cx: &mut App) {
let _ = self.update(cx, |project_diagnostics_editor, cx| {
project_diagnostics_editor.update_all_excerpts(window, cx);
});
}
fn toggle_warnings(&self, window: &mut Window, cx: &mut App) {
let _ = self.update(cx, |project_diagnostics_editor, cx| {
project_diagnostics_editor.toggle_warnings(&Default::default(), window, cx);
});
}
fn get_diagnostics_for_buffer(
&self,
buffer_id: text::BufferId,
cx: &App,
) -> Vec<language::DiagnosticEntry<text::Anchor>> {
self.read_with(cx, |project_diagnostics_editor, _cx| {
project_diagnostics_editor
.diagnostics
.get(&buffer_id)
.cloned()
.unwrap_or_default()
})
.unwrap_or_default()
}
}
const DIAGNOSTIC_EXPANSION_ROW_LIMIT: u32 = 32;
async fn context_range_for_entry(


@@ -1567,6 +1567,440 @@ async fn go_to_diagnostic_with_severity(cx: &mut TestAppContext) {
cx.assert_editor_state(indoc! {"error ˇwarning info hint"});
}
#[gpui::test]
async fn test_buffer_diagnostics(cx: &mut TestAppContext) {
init_test(cx);
// Create two different files, both with diagnostics, so we can later
// verify that `BufferDiagnosticsEditor` only shows diagnostics for the
// provided path; unlike with `ProjectDiagnosticsEditor`, the other
// file's diagnostics are not shown.
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
path!("/test"),
json!({
"main.rs": "
fn main() {
let x = vec![];
let y = vec![];
a(x);
b(y);
c(y);
d(x);
}
"
.unindent(),
"other.rs": "
fn other() {
let unused = 42;
undefined_function();
}
"
.unindent(),
}),
)
.await;
let project = Project::test(fs.clone(), [path!("/test").as_ref()], cx).await;
let window = cx.add_window(|window, cx| Workspace::test_new(project.clone(), window, cx));
let cx = &mut VisualTestContext::from_window(*window, cx);
let project_path = project::ProjectPath {
worktree_id: project.read_with(cx, |project, cx| {
project.worktrees(cx).next().unwrap().read(cx).id()
}),
path: Arc::from(Path::new("main.rs")),
};
let buffer = project
.update(cx, |project, cx| {
project.open_buffer(project_path.clone(), cx)
})
.await
.ok();
// Create the diagnostics for `main.rs`.
let language_server_id = LanguageServerId(0);
let uri = lsp::Uri::from_file_path(path!("/test/main.rs")).unwrap();
let lsp_store = project.read_with(cx, |project, _| project.lsp_store());
lsp_store.update(cx, |lsp_store, cx| {
lsp_store.update_diagnostics(language_server_id, lsp::PublishDiagnosticsParams {
uri: uri.clone(),
diagnostics: vec![
lsp::Diagnostic{
range: lsp::Range::new(lsp::Position::new(5, 6), lsp::Position::new(5, 7)),
severity: Some(lsp::DiagnosticSeverity::WARNING),
message: "use of moved value\nvalue used here after move".to_string(),
related_information: Some(vec![
lsp::DiagnosticRelatedInformation {
location: lsp::Location::new(uri.clone(), lsp::Range::new(lsp::Position::new(2, 8), lsp::Position::new(2, 9))),
message: "move occurs because `y` has type `Vec<char>`, which does not implement the `Copy` trait".to_string()
},
lsp::DiagnosticRelatedInformation {
location: lsp::Location::new(uri.clone(), lsp::Range::new(lsp::Position::new(4, 6), lsp::Position::new(4, 7))),
message: "value moved here".to_string()
},
]),
..Default::default()
},
lsp::Diagnostic{
range: lsp::Range::new(lsp::Position::new(6, 6), lsp::Position::new(6, 7)),
severity: Some(lsp::DiagnosticSeverity::ERROR),
message: "use of moved value\nvalue used here after move".to_string(),
related_information: Some(vec![
lsp::DiagnosticRelatedInformation {
location: lsp::Location::new(uri.clone(), lsp::Range::new(lsp::Position::new(1, 8), lsp::Position::new(1, 9))),
message: "move occurs because `x` has type `Vec<char>`, which does not implement the `Copy` trait".to_string()
},
lsp::DiagnosticRelatedInformation {
location: lsp::Location::new(uri.clone(), lsp::Range::new(lsp::Position::new(3, 6), lsp::Position::new(3, 7))),
message: "value moved here".to_string()
},
]),
..Default::default()
}
],
version: None
}, None, DiagnosticSourceKind::Pushed, &[], cx).unwrap();
// Create diagnostics for other.rs to ensure that the file and
// diagnostics are not included in `BufferDiagnosticsEditor` when it is
// deployed for main.rs.
lsp_store.update_diagnostics(language_server_id, lsp::PublishDiagnosticsParams {
uri: lsp::Uri::from_file_path(path!("/test/other.rs")).unwrap(),
diagnostics: vec![
lsp::Diagnostic{
range: lsp::Range::new(lsp::Position::new(1, 8), lsp::Position::new(1, 14)),
severity: Some(lsp::DiagnosticSeverity::WARNING),
message: "unused variable: `unused`".to_string(),
..Default::default()
},
lsp::Diagnostic{
range: lsp::Range::new(lsp::Position::new(2, 4), lsp::Position::new(2, 22)),
severity: Some(lsp::DiagnosticSeverity::ERROR),
message: "cannot find function `undefined_function` in this scope".to_string(),
..Default::default()
}
],
version: None
}, None, DiagnosticSourceKind::Pushed, &[], cx).unwrap();
});
let buffer_diagnostics = window.build_entity(cx, |window, cx| {
BufferDiagnosticsEditor::new(
project_path.clone(),
project.clone(),
buffer,
true,
window,
cx,
)
});
let editor = buffer_diagnostics.update(cx, |buffer_diagnostics, _| {
buffer_diagnostics.editor().clone()
});
// Since excerpt updates are handled by a background task, we need to
// wait a bit to ensure that the buffer diagnostics editor's content
// has been rendered.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
pretty_assertions::assert_eq!(
editor_content_with_blocks(&editor, cx),
indoc::indoc! {
"§ main.rs
§ -----
fn main() {
let x = vec![];
§ move occurs because `x` has type `Vec<char>`, which does not implement
§ the `Copy` trait (back)
let y = vec![];
§ move occurs because `y` has type `Vec<char>`, which does not implement
§ the `Copy` trait
a(x); § value moved here
b(y); § value moved here
c(y);
§ use of moved value
§ value used here after move
d(x);
§ use of moved value
§ value used here after move
§ hint: move occurs because `x` has type `Vec<char>`, which does not
§ implement the `Copy` trait
}"
}
);
}
#[gpui::test]
async fn test_buffer_diagnostics_without_warnings(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
path!("/test"),
json!({
"main.rs": "
fn main() {
let x = vec![];
let y = vec![];
a(x);
b(y);
c(y);
d(x);
}
"
.unindent(),
}),
)
.await;
let project = Project::test(fs.clone(), [path!("/test").as_ref()], cx).await;
let window = cx.add_window(|window, cx| Workspace::test_new(project.clone(), window, cx));
let cx = &mut VisualTestContext::from_window(*window, cx);
let project_path = project::ProjectPath {
worktree_id: project.read_with(cx, |project, cx| {
project.worktrees(cx).next().unwrap().read(cx).id()
}),
path: Arc::from(Path::new("main.rs")),
};
let buffer = project
.update(cx, |project, cx| {
project.open_buffer(project_path.clone(), cx)
})
.await
.ok();
let language_server_id = LanguageServerId(0);
let uri = lsp::Uri::from_file_path(path!("/test/main.rs")).unwrap();
let lsp_store = project.read_with(cx, |project, _| project.lsp_store());
lsp_store.update(cx, |lsp_store, cx| {
lsp_store.update_diagnostics(language_server_id, lsp::PublishDiagnosticsParams {
uri: uri.clone(),
diagnostics: vec![
lsp::Diagnostic{
range: lsp::Range::new(lsp::Position::new(5, 6), lsp::Position::new(5, 7)),
severity: Some(lsp::DiagnosticSeverity::WARNING),
message: "use of moved value\nvalue used here after move".to_string(),
related_information: Some(vec![
lsp::DiagnosticRelatedInformation {
location: lsp::Location::new(uri.clone(), lsp::Range::new(lsp::Position::new(2, 8), lsp::Position::new(2, 9))),
message: "move occurs because `y` has type `Vec<char>`, which does not implement the `Copy` trait".to_string()
},
lsp::DiagnosticRelatedInformation {
location: lsp::Location::new(uri.clone(), lsp::Range::new(lsp::Position::new(4, 6), lsp::Position::new(4, 7))),
message: "value moved here".to_string()
},
]),
..Default::default()
},
lsp::Diagnostic{
range: lsp::Range::new(lsp::Position::new(6, 6), lsp::Position::new(6, 7)),
severity: Some(lsp::DiagnosticSeverity::ERROR),
message: "use of moved value\nvalue used here after move".to_string(),
related_information: Some(vec![
lsp::DiagnosticRelatedInformation {
location: lsp::Location::new(uri.clone(), lsp::Range::new(lsp::Position::new(1, 8), lsp::Position::new(1, 9))),
message: "move occurs because `x` has type `Vec<char>`, which does not implement the `Copy` trait".to_string()
},
lsp::DiagnosticRelatedInformation {
location: lsp::Location::new(uri.clone(), lsp::Range::new(lsp::Position::new(3, 6), lsp::Position::new(3, 7))),
message: "value moved here".to_string()
},
]),
..Default::default()
}
],
version: None
}, None, DiagnosticSourceKind::Pushed, &[], cx).unwrap();
});
let include_warnings = false;
let buffer_diagnostics = window.build_entity(cx, |window, cx| {
BufferDiagnosticsEditor::new(
project_path.clone(),
project.clone(),
buffer,
include_warnings,
window,
cx,
)
});
let editor = buffer_diagnostics.update(cx, |buffer_diagnostics, _cx| {
buffer_diagnostics.editor().clone()
});
// Since excerpt updates are handled by a background task, we need to
// wait a bit to ensure that the buffer diagnostics editor's content
// has been rendered.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
pretty_assertions::assert_eq!(
editor_content_with_blocks(&editor, cx),
indoc::indoc! {
"§ main.rs
§ -----
fn main() {
let x = vec![];
§ move occurs because `x` has type `Vec<char>`, which does not implement
§ the `Copy` trait (back)
let y = vec![];
a(x); § value moved here
b(y);
c(y);
d(x);
§ use of moved value
§ value used here after move
§ hint: move occurs because `x` has type `Vec<char>`, which does not
§ implement the `Copy` trait
}"
}
);
}
#[gpui::test]
async fn test_buffer_diagnostics_multiple_servers(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
path!("/test"),
json!({
"main.rs": "
fn main() {
let x = vec![];
let y = vec![];
a(x);
b(y);
c(y);
d(x);
}
"
.unindent(),
}),
)
.await;
let project = Project::test(fs.clone(), [path!("/test").as_ref()], cx).await;
let window = cx.add_window(|window, cx| Workspace::test_new(project.clone(), window, cx));
let cx = &mut VisualTestContext::from_window(*window, cx);
let project_path = project::ProjectPath {
worktree_id: project.read_with(cx, |project, cx| {
project.worktrees(cx).next().unwrap().read(cx).id()
}),
path: Arc::from(Path::new("main.rs")),
};
let buffer = project
.update(cx, |project, cx| {
project.open_buffer(project_path.clone(), cx)
})
.await
.ok();
// Create the diagnostics for `main.rs`.
// Two warnings are being created, one for each language server, in order to
// assert that both warnings are rendered in the editor.
let language_server_id_a = LanguageServerId(0);
let language_server_id_b = LanguageServerId(1);
let uri = lsp::Uri::from_file_path(path!("/test/main.rs")).unwrap();
let lsp_store = project.read_with(cx, |project, _| project.lsp_store());
lsp_store.update(cx, |lsp_store, cx| {
lsp_store
.update_diagnostics(
language_server_id_a,
lsp::PublishDiagnosticsParams {
uri: uri.clone(),
diagnostics: vec![lsp::Diagnostic {
range: lsp::Range::new(lsp::Position::new(5, 6), lsp::Position::new(5, 7)),
severity: Some(lsp::DiagnosticSeverity::WARNING),
message: "use of moved value\nvalue used here after move".to_string(),
related_information: None,
..Default::default()
}],
version: None,
},
None,
DiagnosticSourceKind::Pushed,
&[],
cx,
)
.unwrap();
lsp_store
.update_diagnostics(
language_server_id_b,
lsp::PublishDiagnosticsParams {
uri: uri.clone(),
diagnostics: vec![lsp::Diagnostic {
range: lsp::Range::new(lsp::Position::new(6, 6), lsp::Position::new(6, 7)),
severity: Some(lsp::DiagnosticSeverity::WARNING),
message: "use of moved value\nvalue used here after move".to_string(),
related_information: None,
..Default::default()
}],
version: None,
},
None,
DiagnosticSourceKind::Pushed,
&[],
cx,
)
.unwrap();
});
let buffer_diagnostics = window.build_entity(cx, |window, cx| {
BufferDiagnosticsEditor::new(
project_path.clone(),
project.clone(),
buffer,
true,
window,
cx,
)
});
let editor = buffer_diagnostics.update(cx, |buffer_diagnostics, _| {
buffer_diagnostics.editor().clone()
});
// Since excerpt updates are handled by a background task, we need to
// wait a bit to ensure that the buffer diagnostics editor's content
// has been rendered.
cx.executor()
.advance_clock(DIAGNOSTICS_UPDATE_DELAY + Duration::from_millis(10));
pretty_assertions::assert_eq!(
editor_content_with_blocks(&editor, cx),
indoc::indoc! {
"§ main.rs
§ -----
a(x);
b(y);
c(y);
§ use of moved value
§ value used here after move
d(x);
§ use of moved value
§ value used here after move
}"
}
);
buffer_diagnostics.update(cx, |buffer_diagnostics, _cx| {
assert_eq!(
*buffer_diagnostics.summary(),
DiagnosticSummary {
warning_count: 2,
error_count: 0
}
);
})
}
fn init_test(cx: &mut TestAppContext) {
cx.update(|cx| {
zlog::init_test();


@@ -1,33 +1,56 @@
-use crate::{ProjectDiagnosticsEditor, ToggleDiagnosticsRefresh};
-use gpui::{Context, Entity, EventEmitter, ParentElement, Render, WeakEntity, Window};
+use crate::{BufferDiagnosticsEditor, ProjectDiagnosticsEditor, ToggleDiagnosticsRefresh};
+use gpui::{Context, EventEmitter, ParentElement, Render, Window};
use language::DiagnosticEntry;
use text::{Anchor, BufferId};
use ui::prelude::*;
use ui::{IconButton, IconButtonShape, IconName, Tooltip};
use workspace::{ToolbarItemEvent, ToolbarItemLocation, ToolbarItemView, item::ItemHandle};
pub struct ToolbarControls {
-editor: Option<WeakEntity<ProjectDiagnosticsEditor>>,
+editor: Option<Box<dyn DiagnosticsToolbarEditor>>,
}
pub(crate) trait DiagnosticsToolbarEditor: Send + Sync {
/// Informs the toolbar whether warnings are included in the diagnostics.
fn include_warnings(&self, cx: &App) -> bool;
/// Toggles whether warning diagnostics should be displayed by the
/// diagnostics editor.
fn toggle_warnings(&self, window: &mut Window, cx: &mut App);
/// Indicates whether any of the excerpts displayed by the diagnostics
/// editor are stale.
fn has_stale_excerpts(&self, cx: &App) -> bool;
/// Indicates whether the diagnostics editor is currently updating the
/// diagnostics.
fn is_updating(&self, cx: &App) -> bool;
/// Requests that the diagnostics editor stop updating the diagnostics.
fn stop_updating(&self, cx: &mut App);
/// Requests that the diagnostics editor updates the displayed diagnostics
/// with the latest information.
fn refresh_diagnostics(&self, window: &mut Window, cx: &mut App);
/// Returns a list of diagnostics for the provided buffer id.
fn get_diagnostics_for_buffer(
&self,
buffer_id: BufferId,
cx: &App,
) -> Vec<DiagnosticEntry<Anchor>>;
}
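The trait above lets the toolbar drive either diagnostics editor through a type-erased handle. A minimal standalone sketch of the same pattern, using `std::rc::Weak` in place of gpui's `WeakEntity` (all names here are illustrative, not Zed's API):

```rust
use std::rc::{Rc, Weak};

// Stand-in for the editor-agnostic toolbar interface.
trait DiagnosticsToolbarEditor {
    fn include_warnings(&self) -> bool;
}

struct ProjectEditor {
    include_warnings: bool,
}

// The weak handle implements the trait, so the toolbar never keeps its
// editor alive; a dropped editor degrades gracefully to a default.
impl DiagnosticsToolbarEditor for Weak<ProjectEditor> {
    fn include_warnings(&self) -> bool {
        self.upgrade().map(|e| e.include_warnings).unwrap_or(false)
    }
}

struct ToolbarControls {
    editor: Option<Box<dyn DiagnosticsToolbarEditor>>,
}

fn main() {
    let project = Rc::new(ProjectEditor { include_warnings: true });
    let toolbar = ToolbarControls {
        editor: Some(Box::new(Rc::downgrade(&project))),
    };
    assert!(toolbar.editor.as_ref().unwrap().include_warnings());
    drop(project);
    // After the editor is dropped, upgrade() fails and we fall back to false.
    assert!(!toolbar.editor.as_ref().unwrap().include_warnings());
}
```

Any editor type whose weak handle implements the trait can then be boxed into the same `editor` slot, which is exactly what the `set_active_pane_item` change below relies on.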
impl Render for ToolbarControls {
fn render(&mut self, _: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
-let mut include_warnings = false;
let mut has_stale_excerpts = false;
+let mut include_warnings = false;
let mut is_updating = false;
-if let Some(editor) = self.diagnostics() {
-let diagnostics = editor.read(cx);
-include_warnings = diagnostics.include_warnings;
-has_stale_excerpts = !diagnostics.paths_to_update.is_empty();
-is_updating = diagnostics.update_excerpts_task.is_some()
-|| diagnostics
-.project
-.read(cx)
-.language_servers_running_disk_based_diagnostics(cx)
-.next()
-.is_some();
+match &self.editor {
+Some(editor) => {
+include_warnings = editor.include_warnings(cx);
+has_stale_excerpts = editor.has_stale_excerpts(cx);
+is_updating = editor.is_updating(cx);
+}
+None => {}
+}
-let tooltip = if include_warnings {
+let warning_tooltip = if include_warnings {
"Exclude Warnings"
} else {
"Include Warnings"
@@ -52,11 +75,12 @@ impl Render for ToolbarControls {
&ToggleDiagnosticsRefresh,
))
.on_click(cx.listener(move |toolbar_controls, _, _, cx| {
-if let Some(diagnostics) = toolbar_controls.diagnostics() {
-diagnostics.update(cx, |diagnostics, cx| {
-diagnostics.update_excerpts_task = None;
+match toolbar_controls.editor() {
+Some(editor) => {
+editor.stop_updating(cx);
cx.notify();
-});
-}
+}
+None => {}
}
})),
)
@@ -71,12 +95,11 @@ impl Render for ToolbarControls {
&ToggleDiagnosticsRefresh,
))
.on_click(cx.listener({
-move |toolbar_controls, _, window, cx| {
-if let Some(diagnostics) = toolbar_controls.diagnostics() {
-diagnostics.update(cx, move |diagnostics, cx| {
-diagnostics.update_all_excerpts(window, cx);
-});
-}
+move |toolbar_controls, _, window, cx| match toolbar_controls
+.editor()
+{
+Some(editor) => editor.refresh_diagnostics(window, cx),
+None => {}
}
})),
)
@@ -86,13 +109,10 @@ impl Render for ToolbarControls {
IconButton::new("toggle-warnings", IconName::Warning)
.icon_color(warning_color)
.shape(IconButtonShape::Square)
-.tooltip(Tooltip::text(tooltip))
-.on_click(cx.listener(|this, _, window, cx| {
-if let Some(editor) = this.diagnostics() {
-editor.update(cx, |editor, cx| {
-editor.toggle_warnings(&Default::default(), window, cx);
-});
-}
+.tooltip(Tooltip::text(warning_tooltip))
+.on_click(cx.listener(|this, _, window, cx| match &this.editor {
+Some(editor) => editor.toggle_warnings(window, cx),
+None => {}
})),
)
}
@@ -109,7 +129,10 @@ impl ToolbarItemView for ToolbarControls {
) -> ToolbarItemLocation {
if let Some(pane_item) = active_pane_item.as_ref() {
if let Some(editor) = pane_item.downcast::<ProjectDiagnosticsEditor>() {
-self.editor = Some(editor.downgrade());
+self.editor = Some(Box::new(editor.downgrade()));
ToolbarItemLocation::PrimaryRight
} else if let Some(editor) = pane_item.downcast::<BufferDiagnosticsEditor>() {
self.editor = Some(Box::new(editor.downgrade()));
ToolbarItemLocation::PrimaryRight
} else {
ToolbarItemLocation::Hidden
@@ -131,7 +154,7 @@ impl ToolbarControls {
ToolbarControls { editor: None }
}
-fn diagnostics(&self) -> Option<Entity<ProjectDiagnosticsEditor>> {
-self.editor.as_ref()?.upgrade()
+fn editor(&self) -> Option<&dyn DiagnosticsToolbarEditor> {
+self.editor.as_deref()
}
}


@@ -92,6 +92,7 @@ uuid.workspace = true
workspace.workspace = true
zed_actions.workspace = true
workspace-hack.workspace = true
postage.workspace = true
[dev-dependencies]
ctor.workspace = true


@@ -177,17 +177,15 @@ use snippet::Snippet;
use std::{
any::TypeId,
borrow::Cow,
-cell::OnceCell,
-cell::RefCell,
+cell::{OnceCell, RefCell},
cmp::{self, Ordering, Reverse},
iter::Peekable,
mem,
num::NonZeroU32,
-ops::Not,
-ops::{ControlFlow, Deref, DerefMut, Range, RangeInclusive},
+ops::{ControlFlow, Deref, DerefMut, Not, Range, RangeInclusive},
path::{Path, PathBuf},
rc::Rc,
-sync::Arc,
+sync::{Arc, LazyLock},
time::{Duration, Instant},
};
use task::{ResolvedTask, RunnableTag, TaskTemplate, TaskVariables};
@@ -236,6 +234,21 @@ pub(crate) const EDIT_PREDICTION_KEY_CONTEXT: &str = "edit_prediction";
pub(crate) const EDIT_PREDICTION_CONFLICT_KEY_CONTEXT: &str = "edit_prediction_conflict";
pub(crate) const MINIMAP_FONT_SIZE: AbsoluteLength = AbsoluteLength::Pixels(px(2.));
#[derive(Clone, Debug, Eq, PartialEq)]
pub struct LastCursorPosition {
pub path: PathBuf,
pub worktree_path: Arc<Path>,
pub point: Point,
}
pub static LAST_CURSOR_POSITION_WATCH: LazyLock<(
Mutex<postage::watch::Sender<Option<LastCursorPosition>>>,
postage::watch::Receiver<Option<LastCursorPosition>>,
)> = LazyLock::new(|| {
let (sender, receiver) = postage::watch::channel();
(Mutex::new(sender), receiver)
});
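`LAST_CURSOR_POSITION_WATCH` publishes the most recent cursor position through a global watch channel. A simplified, std-only sketch of the same "latest value" idea (postage's watch channel additionally lets subscribers await changes; the struct fields here are illustrative, not the editor's):

```rust
use std::sync::{LazyLock, Mutex};

// Simplified stand-in for the editor's LastCursorPosition.
#[derive(Clone, Debug, PartialEq)]
struct LastCursorPosition {
    path: String,
    row: u32,
    column: u32,
}

// Global slot that always holds the most recently published value,
// analogous to LAST_CURSOR_POSITION_WATCH but without change notification.
static LAST_CURSOR_POSITION: LazyLock<Mutex<Option<LastCursorPosition>>> =
    LazyLock::new(|| Mutex::new(None));

// Called from the cursor-movement path: overwrite the latest value.
fn publish(pos: LastCursorPosition) {
    *LAST_CURSOR_POSITION.lock().unwrap() = Some(pos);
}

// Called from any consumer: snapshot the latest value, if one exists.
fn latest() -> Option<LastCursorPosition> {
    LAST_CURSOR_POSITION.lock().unwrap().clone()
}

fn main() {
    publish(LastCursorPosition { path: "main.rs".into(), row: 3, column: 7 });
    assert_eq!(latest().unwrap().row, 3);
    // Later publishes simply replace the previous value.
    publish(LastCursorPosition { path: "main.rs".into(), row: 9, column: 0 });
    assert_eq!(latest().unwrap().row, 9);
}
```

A watch channel is a good fit here because only the most recent position matters: slow consumers observe the latest state rather than a backlog of intermediate moves.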
pub type RenderDiffHunkControlsFn = Arc<
dyn Fn(
u32,
@@ -2619,7 +2632,7 @@ impl Editor {
cx: &mut Context<Workspace>,
) -> Task<Result<Entity<Editor>>> {
let project = workspace.project().clone();
-let create = project.update(cx, |project, cx| project.create_buffer(cx));
+let create = project.update(cx, |project, cx| project.create_buffer(true, cx));
cx.spawn_in(window, async move |workspace, cx| {
let buffer = create.await?;
@@ -2657,7 +2670,7 @@ impl Editor {
cx: &mut Context<Workspace>,
) {
let project = workspace.project().clone();
-let create = project.update(cx, |project, cx| project.create_buffer(cx));
+let create = project.update(cx, |project, cx| project.create_buffer(true, cx));
cx.spawn_in(window, async move |workspace, cx| {
let buffer = create.await?;
@@ -3064,10 +3077,28 @@ impl Editor {
let new_cursor_position = newest_selection.head();
let selection_start = newest_selection.start;
let new_cursor_point = new_cursor_position.to_point(buffer);
if let Some(project) = self.project()
&& let Some((path, worktree_path)) =
self.file_at(new_cursor_point, cx).and_then(|file| {
file.as_local().and_then(|file| {
let worktree =
project.read(cx).worktree_for_id(file.worktree_id(cx), cx)?;
Some((file.abs_path(cx), worktree.read(cx).abs_path()))
})
})
{
*LAST_CURSOR_POSITION_WATCH.0.lock().borrow_mut() = Some(LastCursorPosition {
path,
worktree_path,
point: new_cursor_point,
});
}
if effects.nav_history.is_none() || effects.nav_history == Some(true) {
self.push_to_nav_history(
*old_cursor_position,
-Some(new_cursor_position.to_point(buffer)),
+Some(new_cursor_point),
false,
effects.nav_history == Some(true),
cx,
@@ -18998,6 +19029,8 @@ impl Editor {
}
}
/// Returns the project path for the editor's buffer, if any buffer is
/// opened in the editor.
pub fn project_path(&self, cx: &App) -> Option<ProjectPath> {
if let Some(buffer) = self.buffer.read(cx).as_singleton() {
buffer.read(cx).project_path(cx)


@@ -15913,7 +15913,7 @@ async fn test_following(cx: &mut TestAppContext) {
let project = Project::test(fs, ["/file.rs".as_ref()], cx).await;
let buffer = project.update(cx, |project, cx| {
-let buffer = project.create_local_buffer(&sample_text(16, 8, 'a'), None, cx);
+let buffer = project.create_local_buffer(&sample_text(16, 8, 'a'), None, false, cx);
cx.new(|cx| MultiBuffer::singleton(buffer, cx))
});
let leader = cx.add_window(|window, cx| build_editor(buffer.clone(), window, cx));
@@ -16165,8 +16165,8 @@ async fn test_following_with_multiple_excerpts(cx: &mut TestAppContext) {
let (buffer_1, buffer_2) = project.update(cx, |project, cx| {
(
-project.create_local_buffer("abc\ndef\nghi\njkl\n", None, cx),
-project.create_local_buffer("mno\npqr\nstu\nvwx\n", None, cx),
+project.create_local_buffer("abc\ndef\nghi\njkl\n", None, false, cx),
+project.create_local_buffer("mno\npqr\nstu\nvwx\n", None, false, cx),
)
});


@@ -1130,7 +1130,7 @@ impl SerializableItem for Editor {
// First create the empty buffer
let buffer = project
-.update(cx, |project, cx| project.create_buffer(cx))?
+.update(cx, |project, cx| project.create_buffer(true, cx))?
.await?;
// Then set the text so that the dirty bit is set correctly
@@ -1238,7 +1238,7 @@ impl SerializableItem for Editor {
..
} => window.spawn(cx, async move |cx| {
let buffer = project
-.update(cx, |project, cx| project.create_buffer(cx))?
+.update(cx, |project, cx| project.create_buffer(true, cx))?
.await?;
cx.update(|window, cx| {


@@ -4,7 +4,7 @@
use super::{Bias, DisplayPoint, DisplaySnapshot, SelectionGoal, ToDisplayPoint};
use crate::{DisplayRow, EditorStyle, ToOffset, ToPoint, scroll::ScrollAnchor};
use gpui::{Pixels, WindowTextSystem};
-use language::Point;
+use language::{CharClassifier, Point};
use multi_buffer::{MultiBufferRow, MultiBufferSnapshot};
use serde::Deserialize;
use workspace::searchable::Direction;
@@ -405,15 +405,18 @@ pub fn previous_subword_start(map: &DisplaySnapshot, point: DisplayPoint) -> Dis
let classifier = map.buffer_snapshot.char_classifier_at(raw_point);
find_preceding_boundary_display_point(map, point, FindRange::MultiLine, |left, right| {
-let is_word_start =
-classifier.kind(left) != classifier.kind(right) && !right.is_whitespace();
-let is_subword_start = classifier.is_word('-') && left == '-' && right != '-'
-|| left == '_' && right != '_'
-|| left.is_lowercase() && right.is_uppercase();
-is_word_start || is_subword_start || left == '\n'
+is_subword_start(left, right, &classifier) || left == '\n'
})
}
pub fn is_subword_start(left: char, right: char, classifier: &CharClassifier) -> bool {
let is_word_start = classifier.kind(left) != classifier.kind(right) && !right.is_whitespace();
let is_subword_start = classifier.is_word('-') && left == '-' && right != '-'
|| left == '_' && right != '_'
|| left.is_lowercase() && right.is_uppercase();
is_word_start || is_subword_start
}
/// Returns a position of the next word boundary, where a word character is defined as either
/// uppercase letter, lowercase letter, '_' character or language-specific word character (like '-' in CSS).
pub fn next_word_end(map: &DisplaySnapshot, point: DisplayPoint) -> DisplayPoint {
@@ -463,15 +466,19 @@ pub fn next_subword_end(map: &DisplaySnapshot, point: DisplayPoint) -> DisplayPo
let classifier = map.buffer_snapshot.char_classifier_at(raw_point);
find_boundary(map, point, FindRange::MultiLine, |left, right| {
-let is_word_end =
-(classifier.kind(left) != classifier.kind(right)) && !classifier.is_whitespace(left);
-let is_subword_end = classifier.is_word('-') && left != '-' && right == '-'
-|| left != '_' && right == '_'
-|| left.is_lowercase() && right.is_uppercase();
-is_word_end || is_subword_end || right == '\n'
+is_subword_end(left, right, &classifier) || right == '\n'
})
}
pub fn is_subword_end(left: char, right: char, classifier: &CharClassifier) -> bool {
let is_word_end =
(classifier.kind(left) != classifier.kind(right)) && !classifier.is_whitespace(left);
let is_subword_end = classifier.is_word('-') && left != '-' && right == '-'
|| left != '_' && right == '_'
|| left.is_lowercase() && right.is_uppercase();
is_word_end || is_subword_end
}
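Extracting `is_subword_start` and `is_subword_end` lets other call sites reuse the boundary predicates directly. A simplified, self-contained sketch of the start predicate, with the `CharClassifier` reduced to an alphanumeric check (an assumption for illustration; the real classifier is language- and scope-aware):

```rust
// Simplified subword-start predicate: a boundary is either a word start
// (character class changes and the right side isn't whitespace) or a
// subword start (snake_case underscore or camelCase case change).
fn is_subword_start(left: char, right: char) -> bool {
    let kind_changed = left.is_alphanumeric() != right.is_alphanumeric();
    let is_word_start = kind_changed && !right.is_whitespace();
    let is_subword_start = (left == '_' && right != '_')
        || (left.is_lowercase() && right.is_uppercase());
    is_word_start || is_subword_start
}

fn main() {
    assert!(is_subword_start('o', 'B'));  // camelCase boundary: fooBar
    assert!(is_subword_start('_', 'b'));  // snake_case boundary: foo_bar
    assert!(is_subword_start(' ', 'x'));  // ordinary word start
    assert!(!is_subword_start('a', 'b')); // interior of a word
}
```

The end predicate mirrors this with the roles of `left` and `right` swapped, which is why the two helpers are extracted as a symmetric pair.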
/// Returns a position of the start of the current paragraph, where a paragraph
/// is defined as a run of non-blank lines.
pub fn start_of_paragraph(


@@ -200,7 +200,7 @@ pub fn expand_macro_recursively(
}
let buffer = project
-.update(cx, |project, cx| project.create_buffer(cx))?
+.update(cx, |project, cx| project.create_buffer(false, cx))?
.await?;
workspace.update_in(cx, |workspace, window, cx| {
buffer.update(cx, |buffer, cx| {


@@ -388,9 +388,6 @@ pub(crate) fn commit_message_editor(
window: &mut Window,
cx: &mut Context<Editor>,
) -> Editor {
-project.update(cx, |this, cx| {
-this.mark_buffer_as_non_searchable(commit_message_buffer.read(cx).remote_id(), cx);
-});
let buffer = cx.new(|cx| MultiBuffer::singleton(commit_message_buffer, cx));
let max_lines = if in_panel { MAX_PANEL_EDITOR_LINES } else { 18 };
let mut commit_editor = Editor::new(


@@ -8,8 +8,9 @@ use windows::Win32::{
D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
},
Direct3D11::{
-D3D11_CREATE_DEVICE_BGRA_SUPPORT, D3D11_CREATE_DEVICE_DEBUG, D3D11_SDK_VERSION,
-D3D11CreateDevice, ID3D11Device, ID3D11DeviceContext,
+D3D11_CREATE_DEVICE_BGRA_SUPPORT, D3D11_CREATE_DEVICE_DEBUG,
+D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS, D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS,
+D3D11_SDK_VERSION, D3D11CreateDevice, ID3D11Device, ID3D11DeviceContext,
},
Dxgi::{
CreateDXGIFactory2, DXGI_CREATE_FACTORY_DEBUG, DXGI_CREATE_FACTORY_FLAGS,
@@ -54,12 +55,10 @@ impl DirectXDevices {
let adapter =
get_adapter(&dxgi_factory, debug_layer_available).context("Getting DXGI adapter")?;
let (device, device_context) = {
-let mut device: Option<ID3D11Device> = None;
let mut context: Option<ID3D11DeviceContext> = None;
let mut feature_level = D3D_FEATURE_LEVEL::default();
-get_device(
+let device = get_device(
&adapter,
-Some(&mut device),
Some(&mut context),
Some(&mut feature_level),
debug_layer_available,
@@ -77,7 +76,7 @@ impl DirectXDevices {
}
_ => unreachable!(),
}
-(device.unwrap(), context.unwrap())
+(device, context.unwrap())
};
Ok(Self {
@@ -134,7 +133,7 @@ fn get_adapter(dxgi_factory: &IDXGIFactory6, debug_layer_available: bool) -> Res
}
// Check to see whether the adapter supports Direct3D 11, but don't
// create the actual device yet.
-if get_device(&adapter, None, None, None, debug_layer_available)
+if get_device(&adapter, None, None, debug_layer_available)
.log_err()
.is_some()
{
@@ -148,11 +147,11 @@ fn get_adapter(dxgi_factory: &IDXGIFactory6, debug_layer_available: bool) -> Res
#[inline]
fn get_device(
adapter: &IDXGIAdapter1,
-device: Option<*mut Option<ID3D11Device>>,
context: Option<*mut Option<ID3D11DeviceContext>>,
feature_level: Option<*mut D3D_FEATURE_LEVEL>,
debug_layer_available: bool,
-) -> Result<()> {
+) -> Result<ID3D11Device> {
+let mut device: Option<ID3D11Device> = None;
let device_flags = if debug_layer_available {
D3D11_CREATE_DEVICE_BGRA_SUPPORT | D3D11_CREATE_DEVICE_DEBUG
} else {
@@ -171,10 +170,30 @@ fn get_device(
D3D_FEATURE_LEVEL_10_1,
]),
D3D11_SDK_VERSION,
-device,
+Some(&mut device),
feature_level,
context,
)?;
}
-Ok(())
let device = device.unwrap();
let mut data = D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS::default();
unsafe {
device
.CheckFeatureSupport(
D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS,
&mut data as *mut _ as _,
std::mem::size_of::<D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS>() as u32,
)
.context("Checking GPU device feature support")?;
}
if data
.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x
.as_bool()
{
Ok(device)
} else {
Err(anyhow::anyhow!(
"Required feature StructuredBuffer is not supported by GPU/driver"
))
}
}


@@ -590,6 +590,11 @@ pub trait LspAdapter: 'static + Send + Sync {
"Not implemented for this adapter. This method should only be called on the default JSON language server adapter"
);
}
/// Returns `true` if this adapter is provided by an extension, `false` otherwise.
fn is_extension(&self) -> bool {
false
}
}
async fn try_fetch_server_binary<L: LspAdapter + 'static + Send + Sync + ?Sized>(
@@ -2270,6 +2275,10 @@ impl LspAdapter for FakeLspAdapter {
let label_for_completion = self.label_for_completion.as_ref()?;
label_for_completion(item, language)
}
fn is_extension(&self) -> bool {
false
}
}
fn get_capture_indices(query: &Query, captures: &mut [(&str, &mut Option<u32>)]) {


@@ -374,14 +374,23 @@ impl LanguageRegistry {
pub fn register_available_lsp_adapter(
&self,
name: LanguageServerName,
-load: impl Fn() -> Arc<dyn LspAdapter> + 'static + Send + Sync,
+adapter: Arc<dyn LspAdapter>,
) {
-self.state.write().available_lsp_adapters.insert(
+let mut state = self.state.write();
if adapter.is_extension()
&& let Some(existing_adapter) = state.all_lsp_adapters.get(&name)
&& !existing_adapter.adapter.is_extension()
{
log::warn!(
"not registering extension-provided language server {name:?}, since a builtin language server exists with that name",
);
return;
}
state.available_lsp_adapters.insert(
name,
-Arc::new(move || {
-let lsp_adapter = load();
-CachedLspAdapter::new(lsp_adapter)
-}),
+Arc::new(move || CachedLspAdapter::new(adapter.clone())),
);
}
@@ -396,13 +405,21 @@ impl LanguageRegistry {
Some(load_lsp_adapter())
}
-pub fn register_lsp_adapter(
-&self,
-language_name: LanguageName,
-adapter: Arc<dyn LspAdapter>,
-) -> Arc<CachedLspAdapter> {
-let cached = CachedLspAdapter::new(adapter);
+pub fn register_lsp_adapter(&self, language_name: LanguageName, adapter: Arc<dyn LspAdapter>) {
let mut state = self.state.write();
if adapter.is_extension()
&& let Some(existing_adapter) = state.all_lsp_adapters.get(&adapter.name())
&& !existing_adapter.adapter.is_extension()
{
log::warn!(
"not registering extension-provided language server {:?} for language {language_name:?}, since a builtin language server exists with that name",
adapter.name(),
);
return;
}
let cached = CachedLspAdapter::new(adapter);
state
.lsp_adapters
.entry(language_name)
@@ -411,8 +428,6 @@ impl LanguageRegistry {
state
.all_lsp_adapters
.insert(cached.name.clone(), cached.clone());
cached
}
/// Register a fake language server and adapter

View File
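The registry change in the hunk above makes extension-provided language servers yield to builtin ones of the same name. That precedence rule can be sketched standalone; `Registry` and `Adapter` here are simplified stand-ins, not the actual Zed types:

```rust
use std::collections::HashMap;

// Minimal stand-ins for CachedLspAdapter / LspAdapter.
#[derive(Clone)]
struct Adapter {
    name: String,
    is_extension: bool,
}

#[derive(Default)]
struct Registry {
    adapters: HashMap<String, Adapter>,
}

impl Registry {
    // Mirrors the diff's guard: an extension adapter must not shadow a
    // builtin adapter already registered under the same name.
    fn register(&mut self, adapter: Adapter) -> bool {
        if adapter.is_extension
            && self
                .adapters
                .get(&adapter.name)
                .is_some_and(|existing| !existing.is_extension)
        {
            eprintln!(
                "not registering extension-provided language server {:?}",
                adapter.name
            );
            return false;
        }
        self.adapters.insert(adapter.name.clone(), adapter);
        true
    }
}

fn main() {
    let mut registry = Registry::default();
    let builtin = Adapter { name: "basedpyright".into(), is_extension: false };
    let extension = Adapter { name: "basedpyright".into(), is_extension: true };
    assert!(registry.register(builtin));
    assert!(!registry.register(extension)); // the builtin wins
    assert!(!registry.adapters["basedpyright"].is_extension);
}
```

This is why the superseded basedpyright extension can stay installed without clobbering the new builtin server.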

@@ -398,6 +398,10 @@ impl LspAdapter for ExtensionLspAdapter {
Ok(labels_from_extension(labels, language))
}
fn is_extension(&self) -> bool {
true
}
}
fn labels_from_extension(

View File

@@ -0,0 +1,30 @@
[package]
name = "language_onboarding"
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "GPL-3.0-or-later"
[lints]
workspace = true
[lib]
path = "src/python.rs"
[features]
default = []
[dependencies]
db.workspace = true
editor.workspace = true
gpui.workspace = true
project.workspace = true
ui.workspace = true
workspace.workspace = true
workspace-hack.workspace = true
# Uncomment other workspace dependencies as needed
# assistant.workspace = true
# client.workspace = true
# project.workspace = true
# settings.workspace = true

View File

@@ -0,0 +1,95 @@
use db::kvp::Dismissable;
use editor::Editor;
use gpui::{Context, EventEmitter, Subscription};
use ui::{
Banner, Button, Clickable, Color, FluentBuilder as _, IconButton, IconName,
InteractiveElement as _, IntoElement, Label, LabelCommon, LabelSize, ParentElement as _,
Render, Styled as _, Window, div, h_flex, v_flex,
};
use workspace::{ToolbarItemEvent, ToolbarItemLocation, ToolbarItemView, Workspace};
pub struct BasedPyrightBanner {
dismissed: bool,
have_basedpyright: bool,
_subscriptions: [Subscription; 1],
}
impl Dismissable for BasedPyrightBanner {
const KEY: &str = "basedpyright-banner";
}
impl BasedPyrightBanner {
pub fn new(workspace: &Workspace, cx: &mut Context<Self>) -> Self {
let subscription = cx.subscribe(workspace.project(), |this, _, event, _| {
if let project::Event::LanguageServerAdded(_, name, _) = event
&& name == "basedpyright"
{
this.have_basedpyright = true;
}
});
let dismissed = Self::dismissed();
Self {
dismissed,
have_basedpyright: false,
_subscriptions: [subscription],
}
}
}
impl EventEmitter<ToolbarItemEvent> for BasedPyrightBanner {}
impl Render for BasedPyrightBanner {
fn render(&mut self, _window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
div()
.id("basedpyright-banner")
.when(!self.dismissed && self.have_basedpyright, |el| {
el.child(
Banner::new()
.severity(ui::Severity::Info)
.child(
h_flex()
.gap_2()
.child(v_flex()
.child("Basedpyright is now the only default language server for Python")
.child(Label::new("We have disabled Pyright and pylsp by default. They can be re-enabled in your settings.").size(LabelSize::XSmall).color(Color::Muted))
)
.child(
Button::new("learn-more", "Learn More")
.icon(IconName::ArrowUpRight)
.on_click(|_, _, cx| {
cx.open_url("https://zed.dev/docs/languages/python")
}),
),
)
.action_slot(IconButton::new("dismiss", IconName::Close).on_click(
cx.listener(|this, _, _, cx| {
this.dismissed = true;
Self::set_dismissed(true, cx);
cx.notify();
}),
))
.into_any_element(),
)
})
}
}
impl ToolbarItemView for BasedPyrightBanner {
fn set_active_pane_item(
&mut self,
active_pane_item: Option<&dyn workspace::ItemHandle>,
_window: &mut ui::Window,
cx: &mut Context<Self>,
) -> ToolbarItemLocation {
if let Some(item) = active_pane_item
&& let Some(editor) = item.act_as::<Editor>(cx)
&& let Some(path) = editor.update(cx, |editor, cx| editor.target_file_abs_path(cx))
&& let Some(file_name) = path.file_name()
&& file_name.as_encoded_bytes().ends_with(".py".as_bytes())
{
return ToolbarItemLocation::Secondary;
}
ToolbarItemLocation::Hidden
}
}

View File
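The banner above combines three checks: the basedpyright server has attached, the banner was not dismissed, and the active item is a Python file. That visibility rule can be sketched as a pure function (a simplification; `banner_visible` is hypothetical, not a name from the diff):

```rust
// Visibility rule for the basedpyright banner: show it only when the
// server is present, the user has not dismissed it, and the active
// file looks like Python.
fn banner_visible(have_basedpyright: bool, dismissed: bool, file_name: Option<&str>) -> bool {
    have_basedpyright
        && !dismissed
        && file_name.is_some_and(|name| name.ends_with(".py"))
}

fn main() {
    assert!(banner_visible(true, false, Some("main.py")));
    assert!(!banner_visible(true, true, Some("main.py"))); // dismissed
    assert!(!banner_visible(true, false, Some("main.rs"))); // not Python
    assert!(!banner_visible(false, false, Some("main.py"))); // server absent
}
```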

@@ -296,7 +296,7 @@ impl LanguageServerState {
.update(cx, |workspace, cx| {
workspace
.project()
.update(cx, |project, cx| project.create_buffer(cx))
.update(cx, |project, cx| project.create_buffer(false, cx))
})
.ok()
else {

View File

@@ -42,7 +42,6 @@ async-trait.workspace = true
chrono.workspace = true
collections.workspace = true
dap.workspace = true
feature_flags.workspace = true
futures.workspace = true
gpui.workspace = true
http_client.workspace = true

View File

@@ -73,3 +73,9 @@
arguments: (arguments (template_string (string_fragment) @injection.content
(#set! injection.language "graphql")))
)
(call_expression
function: (identifier) @_name(#match? @_name "^iso$")
arguments: (arguments (template_string (string_fragment) @injection.content
(#set! injection.language "isograph")))
)

View File

@@ -1,5 +1,4 @@
use anyhow::Context as _;
use feature_flags::{FeatureFlag, FeatureFlagAppExt as _};
use gpui::{App, SharedString, UpdateGlobal};
use node_runtime::NodeRuntime;
use python::PyprojectTomlManifestProvider;
@@ -54,12 +53,6 @@ pub static LANGUAGE_GIT_COMMIT: std::sync::LazyLock<Arc<Language>> =
))
});
struct BasedPyrightFeatureFlag;
impl FeatureFlag for BasedPyrightFeatureFlag {
const NAME: &'static str = "basedpyright";
}
pub fn init(languages: Arc<LanguageRegistry>, node: NodeRuntime, cx: &mut App) {
#[cfg(feature = "load-grammars")]
languages.register_native_grammars([
@@ -174,7 +167,7 @@ pub fn init(languages: Arc<LanguageRegistry>, node: NodeRuntime, cx: &mut App) {
},
LanguageInfo {
name: "python",
adapters: vec![python_lsp_adapter, py_lsp_adapter],
adapters: vec![basedpyright_lsp_adapter],
context: Some(python_context_provider),
toolchain: Some(python_toolchain_provider),
manifest_name: Some(SharedString::new_static("pyproject.toml").into()),
@@ -240,17 +233,6 @@ pub fn init(languages: Arc<LanguageRegistry>, node: NodeRuntime, cx: &mut App) {
);
}
let mut basedpyright_lsp_adapter = Some(basedpyright_lsp_adapter);
cx.observe_flag::<BasedPyrightFeatureFlag, _>({
let languages = languages.clone();
move |enabled, _| {
if enabled && let Some(adapter) = basedpyright_lsp_adapter.take() {
languages.register_available_lsp_adapter(adapter.name(), move || adapter.clone());
}
}
})
.detach();
// Register globally available language servers.
//
// This will allow users to add support for a built-in language server (e.g., Tailwind)
@@ -267,27 +249,19 @@ pub fn init(languages: Arc<LanguageRegistry>, node: NodeRuntime, cx: &mut App) {
// ```
languages.register_available_lsp_adapter(
LanguageServerName("tailwindcss-language-server".into()),
{
let adapter = tailwind_adapter.clone();
move || adapter.clone()
},
tailwind_adapter.clone(),
);
languages.register_available_lsp_adapter(LanguageServerName("eslint".into()), {
let adapter = eslint_adapter.clone();
move || adapter.clone()
});
languages.register_available_lsp_adapter(LanguageServerName("vtsls".into()), {
let adapter = vtsls_adapter;
move || adapter.clone()
});
languages.register_available_lsp_adapter(
LanguageServerName("eslint".into()),
eslint_adapter.clone(),
);
languages.register_available_lsp_adapter(LanguageServerName("vtsls".into()), vtsls_adapter);
languages.register_available_lsp_adapter(
LanguageServerName("typescript-language-server".into()),
{
let adapter = typescript_lsp_adapter;
move || adapter.clone()
},
typescript_lsp_adapter,
);
languages.register_available_lsp_adapter(python_lsp_adapter.name(), python_lsp_adapter);
languages.register_available_lsp_adapter(py_lsp_adapter.name(), py_lsp_adapter);
// Register Tailwind for the existing languages that should have it by default.
//
// This can be driven by the `language_servers` setting once we have a way for

View File

@@ -35,7 +35,7 @@ use std::{
sync::Arc,
};
use task::{ShellKind, TaskTemplate, TaskTemplates, VariableName};
use util::ResultExt;
use util::{ResultExt, maybe};
pub(crate) struct PyprojectTomlManifestProvider;
@@ -1619,23 +1619,37 @@ impl LspAdapter for BasedPyrightLspAdapter {
}
}
// Always set the python interpreter path
// Get or create the python section
let python = object
// Set both pythonPath and defaultInterpreterPath for compatibility
if let Some(python) = object
.entry("python")
.or_insert(Value::Object(serde_json::Map::default()))
.as_object_mut()
.unwrap();
// Set both pythonPath and defaultInterpreterPath for compatibility
python.insert(
"pythonPath".to_owned(),
Value::String(interpreter_path.clone()),
);
python.insert(
"defaultInterpreterPath".to_owned(),
Value::String(interpreter_path),
);
{
python.insert(
"pythonPath".to_owned(),
Value::String(interpreter_path.clone()),
);
python.insert(
"defaultInterpreterPath".to_owned(),
Value::String(interpreter_path),
);
}
// Basedpyright defaults to `strict` type checking; we tone it down so as not to surprise users
maybe!({
let basedpyright = object
.entry("basedpyright")
.or_insert(Value::Object(serde_json::Map::default()));
let analysis = basedpyright
.as_object_mut()?
.entry("analysis")
.or_insert(Value::Object(serde_json::Map::default()));
if let serde_json::map::Entry::Vacant(v) =
analysis.as_object_mut()?.entry("typeCheckingMode")
{
v.insert(Value::String("standard".to_owned()));
}
Some(())
});
}
user_settings

View File
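The `typeCheckingMode` change above only supplies `"standard"` when the user has not set the key, via serde_json's vacant-entry check. The same "default only if unset" pattern can be shown with the stdlib map (a sketch under simplified types, not the actual settings code):

```rust
use std::collections::HashMap;

// Insert a default only when the user has not set the key, mirroring
// the serde_json::map::Entry::Vacant check in the diff.
fn apply_default(settings: &mut HashMap<String, String>, key: &str, default: &str) {
    settings
        .entry(key.to_string())
        .or_insert_with(|| default.to_string());
}

fn main() {
    let mut user = HashMap::new();
    user.insert("typeCheckingMode".to_string(), "strict".to_string());
    apply_default(&mut user, "typeCheckingMode", "standard");
    // A user-provided value is preserved...
    assert_eq!(user["typeCheckingMode"], "strict");

    let mut unset: HashMap<String, String> = HashMap::new();
    apply_default(&mut unset, "typeCheckingMode", "standard");
    // ...while an unset key receives the toned-down default.
    assert_eq!(unset["typeCheckingMode"], "standard");
}
```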

@@ -73,3 +73,9 @@
arguments: (arguments (template_string (string_fragment) @injection.content
(#set! injection.language "graphql")))
)
(call_expression
function: (identifier) @_name(#match? @_name "^iso$")
arguments: (arguments (template_string (string_fragment) @injection.content
(#set! injection.language "isograph")))
)

View File

@@ -78,6 +78,12 @@
(#set! injection.language "graphql")))
)
(call_expression
function: (identifier) @_name(#match? @_name "^iso$")
arguments: (arguments (template_string (string_fragment) @injection.content
(#set! injection.language "isograph")))
)
;; Angular Component template injection
(call_expression
function: [

View File

@@ -166,6 +166,12 @@ impl<'a> From<&'a str> for LanguageServerName {
}
}
impl PartialEq<str> for LanguageServerName {
fn eq(&self, other: &str) -> bool {
self.0 == other
}
}
/// Handle to a language server RPC activity subscription.
pub enum Subscription {
Notification {

View File

@@ -69,6 +69,7 @@ pub struct MarkdownStyle {
pub heading_level_styles: Option<HeadingLevelStyles>,
pub table_overflow_x_scroll: bool,
pub height_is_multiple_of_line_height: bool,
pub prevent_mouse_interaction: bool,
}
impl Default for MarkdownStyle {
@@ -89,6 +90,7 @@ impl Default for MarkdownStyle {
heading_level_styles: None,
table_overflow_x_scroll: false,
height_is_multiple_of_line_height: false,
prevent_mouse_interaction: false,
}
}
}
@@ -575,16 +577,22 @@ impl MarkdownElement {
window: &mut Window,
cx: &mut App,
) {
if self.style.prevent_mouse_interaction {
return;
}
let is_hovering_link = hitbox.is_hovered(window)
&& !self.markdown.read(cx).selection.pending
&& rendered_text
.link_for_position(window.mouse_position())
.is_some();
if is_hovering_link {
window.set_cursor_style(CursorStyle::PointingHand, hitbox);
} else {
window.set_cursor_style(CursorStyle::IBeam, hitbox);
if !self.style.prevent_mouse_interaction {
if is_hovering_link {
window.set_cursor_style(CursorStyle::PointingHand, hitbox);
} else {
window.set_cursor_style(CursorStyle::IBeam, hitbox);
}
}
let on_open_url = self.on_url_click.take();

View File

@@ -277,7 +277,11 @@ fn render_markdown_list_item(
.items_start()
.children(vec![
bullet,
div().children(contents).pr(cx.scaled_rems(1.0)).w_full(),
v_flex()
.children(contents)
.gap(cx.scaled_rems(1.0))
.pr(cx.scaled_rems(1.0))
.w_full(),
]);
cx.with_common_p(item).into_any()

View File

@@ -19,7 +19,6 @@ use std::{
};
use util::ResultExt;
use util::archive::extract_zip;
use util::paths::{SanitizedPath, SanitizedPathBuf};
const NODE_CA_CERTS_ENV_VAR: &str = "NODE_EXTRA_CA_CERTS";
@@ -372,7 +371,7 @@ trait NodeRuntimeTrait: Send + Sync {
#[derive(Clone)]
struct ManagedNodeRuntime {
installation_path: SanitizedPathBuf,
installation_path: PathBuf,
}
impl ManagedNodeRuntime {
@@ -484,13 +483,9 @@ impl ManagedNodeRuntime {
ArchiveType::TarGz => {
let decompressed_bytes = GzipDecoder::new(BufReader::new(response.body_mut()));
let archive = Archive::new(decompressed_bytes);
archive
.unpack(node_containing_dir.as_path().as_path())
.await?;
}
ArchiveType::Zip => {
extract_zip(node_containing_dir.as_path().as_path(), body).await?
archive.unpack(&node_containing_dir).await?;
}
ArchiveType::Zip => extract_zip(&node_containing_dir, body).await?,
}
log::info!("Extracted Node.js to {}", node_containing_dir.display())
}
@@ -506,7 +501,7 @@ impl ManagedNodeRuntime {
}
}
fn path_with_node_binary_prepended(node_binary: &SanitizedPath) -> Option<OsString> {
fn path_with_node_binary_prepended(node_binary: &Path) -> Option<OsString> {
let existing_path = env::var_os("PATH");
let node_bin_dir = node_binary.parent().map(|dir| dir.as_os_str());
match (existing_path, node_bin_dir) {
@@ -611,15 +606,15 @@ impl NodeRuntimeTrait for ManagedNodeRuntime {
#[derive(Debug, Clone)]
pub struct SystemNodeRuntime {
node: SanitizedPathBuf,
npm: SanitizedPathBuf,
global_node_modules: SanitizedPathBuf,
scratch_dir: SanitizedPathBuf,
node: PathBuf,
npm: PathBuf,
global_node_modules: PathBuf,
scratch_dir: PathBuf,
}
impl SystemNodeRuntime {
const MIN_VERSION: semver::Version = Version::new(20, 0, 0);
async fn new(node: SanitizedPathBuf, npm: SanitizedPathBuf) -> Result<Self> {
async fn new(node: PathBuf, npm: PathBuf) -> Result<Self> {
let output = util::command::new_smol_command(&node)
.arg("--version")
.output()
@@ -650,19 +645,19 @@ impl SystemNodeRuntime {
let mut this = Self {
node,
npm,
global_node_modules: SanitizedPathBuf::default(),
global_node_modules: PathBuf::default(),
scratch_dir,
};
let output = this.run_npm_subcommand(None, None, "root", &["-g"]).await?;
this.global_node_modules =
PathBuf::from(String::from_utf8_lossy(&output.stdout).to_string()).into();
PathBuf::from(String::from_utf8_lossy(&output.stdout).to_string());
Ok(this)
}
async fn detect() -> std::result::Result<Self, DetectError> {
let node = which::which("node").map_err(DetectError::NotInPath)?.into();
let npm = which::which("npm").map_err(DetectError::NotInPath)?.into();
let node = which::which("node").map_err(DetectError::NotInPath)?;
let npm = which::which("npm").map_err(DetectError::NotInPath)?;
Self::new(node, npm).await.map_err(DetectError::Other)
}
}

View File

@@ -449,28 +449,28 @@ impl FontPickerDelegate {
) -> Self {
let font_family_cache = FontFamilyCache::global(cx);
let fonts: Vec<SharedString> = font_family_cache
.list_font_families(cx)
.into_iter()
.collect();
let fonts = font_family_cache
.try_list_font_families()
.unwrap_or_else(|| vec![current_font.clone()]);
let selected_index = fonts
.iter()
.position(|font| *font == current_font)
.unwrap_or(0);
let filtered_fonts = fonts
.iter()
.enumerate()
.map(|(index, font)| StringMatch {
candidate_id: index,
string: font.to_string(),
positions: Vec::new(),
score: 0.0,
})
.collect();
Self {
fonts: fonts.clone(),
filtered_fonts: fonts
.iter()
.enumerate()
.map(|(index, font)| StringMatch {
candidate_id: index,
string: font.to_string(),
positions: Vec::new(),
score: 0.0,
})
.collect(),
fonts,
filtered_fonts,
selected_index,
current_font,
on_font_changed: Arc::new(on_font_changed),

View File

@@ -242,12 +242,25 @@ struct Onboarding {
impl Onboarding {
fn new(workspace: &Workspace, cx: &mut App) -> Entity<Self> {
cx.new(|cx| Self {
workspace: workspace.weak_handle(),
focus_handle: cx.focus_handle(),
selected_page: SelectedPage::Basics,
user_store: workspace.user_store().clone(),
_settings_subscription: cx.observe_global::<SettingsStore>(move |_, cx| cx.notify()),
let font_family_cache = theme::FontFamilyCache::global(cx);
cx.new(|cx| {
cx.spawn(async move |this, cx| {
font_family_cache.prefetch(cx).await;
this.update(cx, |_, cx| {
cx.notify();
})
})
.detach();
Self {
workspace: workspace.weak_handle(),
focus_handle: cx.focus_handle(),
selected_page: SelectedPage::Basics,
user_store: workspace.user_store().clone(),
_settings_subscription: cx
.observe_global::<SettingsStore>(move |_, cx| cx.notify()),
}
})
}

View File

@@ -5,7 +5,6 @@ use std::path::{Path, PathBuf};
use std::sync::OnceLock;
pub use util::paths::home_dir;
use util::paths::{SanitizedPath, SanitizedPathBuf};
/// A default editorconfig file name to use when resolving project settings.
pub const EDITORCONFIG_NAME: &str = ".editorconfig";
@@ -13,30 +12,30 @@ pub const EDITORCONFIG_NAME: &str = ".editorconfig";
/// A custom data directory override, set only by `set_custom_data_dir`.
/// This is used to override the default data directory location.
/// The directory will be created if it doesn't exist when set.
static CUSTOM_DATA_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
static CUSTOM_DATA_DIR: OnceLock<PathBuf> = OnceLock::new();
/// The resolved data directory, combining custom override or platform defaults.
/// This is set once and cached for subsequent calls.
/// On macOS, this is `~/Library/Application Support/Zed`.
/// On Linux/FreeBSD, this is `$XDG_DATA_HOME/zed`.
/// On Windows, this is `%LOCALAPPDATA%\Zed`.
static CURRENT_DATA_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
static CURRENT_DATA_DIR: OnceLock<PathBuf> = OnceLock::new();
/// The resolved config directory, combining custom override or platform defaults.
/// This is set once and cached for subsequent calls.
/// On macOS, this is `~/.config/zed`.
/// On Linux/FreeBSD, this is `$XDG_CONFIG_HOME/zed`.
/// On Windows, this is `%APPDATA%\Zed`.
static CONFIG_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
static CONFIG_DIR: OnceLock<PathBuf> = OnceLock::new();
/// Returns the relative path to the zed_server directory on the ssh host.
pub fn remote_server_dir_relative() -> &'static SanitizedPath {
SanitizedPath::new(".zed_server")
pub fn remote_server_dir_relative() -> &'static Path {
Path::new(".zed_server")
}
/// Returns the relative path to the zed_wsl_server directory on the wsl host.
pub fn remote_wsl_server_dir_relative() -> &'static SanitizedPath {
SanitizedPath::new(".zed_wsl_server")
pub fn remote_wsl_server_dir_relative() -> &'static Path {
Path::new(".zed_wsl_server")
}
/// Sets a custom directory for all user data, overriding the default data directory.
@@ -59,12 +58,12 @@ pub fn remote_wsl_server_dir_relative() -> &'static SanitizedPath {
/// * Called after the data directory has been initialized (e.g., via `data_dir` or `config_dir`)
/// * The directory's path cannot be canonicalized to an absolute path
/// * The directory cannot be created
pub fn set_custom_data_dir(dir: &str) -> &'static SanitizedPathBuf {
pub fn set_custom_data_dir(dir: &str) -> &'static PathBuf {
if CURRENT_DATA_DIR.get().is_some() || CONFIG_DIR.get().is_some() {
panic!("set_custom_data_dir called after data_dir or config_dir was initialized");
}
CUSTOM_DATA_DIR.get_or_init(|| {
let mut path = SanitizedPathBuf::from(dir);
let mut path = PathBuf::from(dir);
if path.is_relative() {
let abs_path = path
.canonicalize()
@@ -77,7 +76,7 @@ pub fn set_custom_data_dir(dir: &str) -> &'static SanitizedPathBuf {
}
/// Returns the path to the configuration directory used by Zed.
pub fn config_dir() -> &'static SanitizedPathBuf {
pub fn config_dir() -> &'static PathBuf {
CONFIG_DIR.get_or_init(|| {
if let Some(custom_dir) = CUSTOM_DATA_DIR.get() {
custom_dir.join("config")
@@ -85,7 +84,6 @@ pub fn config_dir() -> &'static SanitizedPathBuf {
dirs::config_dir()
.expect("failed to determine RoamingAppData directory")
.join("Zed")
.into()
} else if cfg!(any(target_os = "linux", target_os = "freebsd")) {
if let Ok(flatpak_xdg_config) = std::env::var("FLATPAK_XDG_CONFIG_HOME") {
flatpak_xdg_config.into()
@@ -93,7 +91,6 @@ pub fn config_dir() -> &'static SanitizedPathBuf {
dirs::config_dir().expect("failed to determine XDG_CONFIG_HOME directory")
}
.join("zed")
.into()
} else {
home_dir().join(".config").join("zed")
}
@@ -101,7 +98,7 @@ pub fn config_dir() -> &'static SanitizedPathBuf {
}
/// Returns the path to the data directory used by Zed.
pub fn data_dir() -> &'static SanitizedPathBuf {
pub fn data_dir() -> &'static PathBuf {
CURRENT_DATA_DIR.get_or_init(|| {
if let Some(custom_dir) = CUSTOM_DATA_DIR.get() {
custom_dir.clone()
@@ -114,12 +111,10 @@ pub fn data_dir() -> &'static SanitizedPathBuf {
dirs::data_local_dir().expect("failed to determine XDG_DATA_HOME directory")
}
.join("zed")
.into()
} else if cfg!(target_os = "windows") {
dirs::data_local_dir()
.expect("failed to determine LocalAppData directory")
.join("Zed")
.into()
} else {
config_dir().clone() // Fallback
}
@@ -127,21 +122,19 @@ pub fn data_dir() -> &'static SanitizedPathBuf {
}
/// Returns the path to the temp directory used by Zed.
pub fn temp_dir() -> &'static SanitizedPathBuf {
static TEMP_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn temp_dir() -> &'static PathBuf {
static TEMP_DIR: OnceLock<PathBuf> = OnceLock::new();
TEMP_DIR.get_or_init(|| {
if cfg!(target_os = "macos") {
return dirs::cache_dir()
.expect("failed to determine cachesDirectory directory")
.join("Zed")
.into();
.join("Zed");
}
if cfg!(target_os = "windows") {
return dirs::cache_dir()
.expect("failed to determine LocalAppData directory")
.join("Zed")
.into();
.join("Zed");
}
if cfg!(any(target_os = "linux", target_os = "freebsd")) {
@@ -150,8 +143,7 @@ pub fn temp_dir() -> &'static SanitizedPathBuf {
} else {
dirs::cache_dir().expect("failed to determine XDG_CACHE_HOME directory")
}
.join("zed")
.into();
.join("zed");
}
home_dir().join(".cache").join("zed")
@@ -159,8 +151,8 @@ pub fn temp_dir() -> &'static SanitizedPathBuf {
}
/// Returns the path to the logs directory.
pub fn logs_dir() -> &'static SanitizedPathBuf {
static LOGS_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn logs_dir() -> &'static PathBuf {
static LOGS_DIR: OnceLock<PathBuf> = OnceLock::new();
LOGS_DIR.get_or_init(|| {
if cfg!(target_os = "macos") {
home_dir().join("Library/Logs/Zed")
@@ -171,128 +163,128 @@ pub fn logs_dir() -> &'static SanitizedPathBuf {
}
/// Returns the path to the Zed server directory on this SSH host.
pub fn remote_server_state_dir() -> &'static SanitizedPathBuf {
static REMOTE_SERVER_STATE: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn remote_server_state_dir() -> &'static PathBuf {
static REMOTE_SERVER_STATE: OnceLock<PathBuf> = OnceLock::new();
REMOTE_SERVER_STATE.get_or_init(|| data_dir().join("server_state"))
}
/// Returns the path to the `Zed.log` file.
pub fn log_file() -> &'static SanitizedPathBuf {
static LOG_FILE: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn log_file() -> &'static PathBuf {
static LOG_FILE: OnceLock<PathBuf> = OnceLock::new();
LOG_FILE.get_or_init(|| logs_dir().join("Zed.log"))
}
/// Returns the path to the `Zed.log.old` file.
pub fn old_log_file() -> &'static SanitizedPathBuf {
static OLD_LOG_FILE: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn old_log_file() -> &'static PathBuf {
static OLD_LOG_FILE: OnceLock<PathBuf> = OnceLock::new();
OLD_LOG_FILE.get_or_init(|| logs_dir().join("Zed.log.old"))
}
/// Returns the path to the database directory.
pub fn database_dir() -> &'static SanitizedPathBuf {
static DATABASE_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn database_dir() -> &'static PathBuf {
static DATABASE_DIR: OnceLock<PathBuf> = OnceLock::new();
DATABASE_DIR.get_or_init(|| data_dir().join("db"))
}
/// Returns the path to the crashes directory, if it exists for the current platform.
pub fn crashes_dir() -> &'static Option<SanitizedPathBuf> {
static CRASHES_DIR: OnceLock<Option<SanitizedPathBuf>> = OnceLock::new();
pub fn crashes_dir() -> &'static Option<PathBuf> {
static CRASHES_DIR: OnceLock<Option<PathBuf>> = OnceLock::new();
CRASHES_DIR.get_or_init(|| {
cfg!(target_os = "macos").then_some(home_dir().join("Library/Logs/DiagnosticReports"))
})
}
/// Returns the path to the retired crashes directory, if it exists for the current platform.
pub fn crashes_retired_dir() -> &'static Option<SanitizedPathBuf> {
static CRASHES_RETIRED_DIR: OnceLock<Option<SanitizedPathBuf>> = OnceLock::new();
pub fn crashes_retired_dir() -> &'static Option<PathBuf> {
static CRASHES_RETIRED_DIR: OnceLock<Option<PathBuf>> = OnceLock::new();
CRASHES_RETIRED_DIR.get_or_init(|| crashes_dir().as_ref().map(|dir| dir.join("Retired")))
}
/// Returns the path to the `settings.json` file.
pub fn settings_file() -> &'static SanitizedPathBuf {
static SETTINGS_FILE: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn settings_file() -> &'static PathBuf {
static SETTINGS_FILE: OnceLock<PathBuf> = OnceLock::new();
SETTINGS_FILE.get_or_init(|| config_dir().join("settings.json"))
}
/// Returns the path to the global settings file.
pub fn global_settings_file() -> &'static SanitizedPathBuf {
static GLOBAL_SETTINGS_FILE: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn global_settings_file() -> &'static PathBuf {
static GLOBAL_SETTINGS_FILE: OnceLock<PathBuf> = OnceLock::new();
GLOBAL_SETTINGS_FILE.get_or_init(|| config_dir().join("global_settings.json"))
}
/// Returns the path to the `settings_backup.json` file.
pub fn settings_backup_file() -> &'static SanitizedPathBuf {
static SETTINGS_FILE: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn settings_backup_file() -> &'static PathBuf {
static SETTINGS_FILE: OnceLock<PathBuf> = OnceLock::new();
SETTINGS_FILE.get_or_init(|| config_dir().join("settings_backup.json"))
}
/// Returns the path to the `keymap.json` file.
pub fn keymap_file() -> &'static SanitizedPathBuf {
static KEYMAP_FILE: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn keymap_file() -> &'static PathBuf {
static KEYMAP_FILE: OnceLock<PathBuf> = OnceLock::new();
KEYMAP_FILE.get_or_init(|| config_dir().join("keymap.json"))
}
/// Returns the path to the `keymap_backup.json` file.
pub fn keymap_backup_file() -> &'static SanitizedPathBuf {
static KEYMAP_FILE: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn keymap_backup_file() -> &'static PathBuf {
static KEYMAP_FILE: OnceLock<PathBuf> = OnceLock::new();
KEYMAP_FILE.get_or_init(|| config_dir().join("keymap_backup.json"))
}
/// Returns the path to the `tasks.json` file.
pub fn tasks_file() -> &'static SanitizedPathBuf {
static TASKS_FILE: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn tasks_file() -> &'static PathBuf {
static TASKS_FILE: OnceLock<PathBuf> = OnceLock::new();
TASKS_FILE.get_or_init(|| config_dir().join("tasks.json"))
}
/// Returns the path to the `debug.json` file.
pub fn debug_scenarios_file() -> &'static SanitizedPathBuf {
static DEBUG_SCENARIOS_FILE: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn debug_scenarios_file() -> &'static PathBuf {
static DEBUG_SCENARIOS_FILE: OnceLock<PathBuf> = OnceLock::new();
DEBUG_SCENARIOS_FILE.get_or_init(|| config_dir().join("debug.json"))
}
/// Returns the path to the extensions directory.
///
/// This is where installed extensions are stored.
pub fn extensions_dir() -> &'static SanitizedPathBuf {
static EXTENSIONS_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn extensions_dir() -> &'static PathBuf {
static EXTENSIONS_DIR: OnceLock<PathBuf> = OnceLock::new();
EXTENSIONS_DIR.get_or_init(|| data_dir().join("extensions"))
}
/// Returns the path to the extensions directory.
///
/// This is where installed extensions are stored on a remote.
pub fn remote_extensions_dir() -> &'static SanitizedPathBuf {
static EXTENSIONS_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn remote_extensions_dir() -> &'static PathBuf {
static EXTENSIONS_DIR: OnceLock<PathBuf> = OnceLock::new();
EXTENSIONS_DIR.get_or_init(|| data_dir().join("remote_extensions"))
}
/// Returns the path to the extensions directory.
///
/// This is where installed extensions are stored on a remote.
pub fn remote_extensions_uploads_dir() -> &'static SanitizedPathBuf {
static UPLOAD_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn remote_extensions_uploads_dir() -> &'static PathBuf {
static UPLOAD_DIR: OnceLock<PathBuf> = OnceLock::new();
UPLOAD_DIR.get_or_init(|| remote_extensions_dir().join("uploads"))
}
/// Returns the path to the themes directory.
///
/// This is where themes that are not provided by extensions are stored.
pub fn themes_dir() -> &'static SanitizedPathBuf {
static THEMES_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn themes_dir() -> &'static PathBuf {
static THEMES_DIR: OnceLock<PathBuf> = OnceLock::new();
THEMES_DIR.get_or_init(|| config_dir().join("themes"))
}
/// Returns the path to the snippets directory.
pub fn snippets_dir() -> &'static SanitizedPathBuf {
static SNIPPETS_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn snippets_dir() -> &'static PathBuf {
static SNIPPETS_DIR: OnceLock<PathBuf> = OnceLock::new();
SNIPPETS_DIR.get_or_init(|| config_dir().join("snippets"))
}
/// Returns the path to the contexts directory.
///
/// This is where the saved contexts from the Assistant are stored.
pub fn contexts_dir() -> &'static SanitizedPathBuf {
static CONTEXTS_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn contexts_dir() -> &'static PathBuf {
static CONTEXTS_DIR: OnceLock<PathBuf> = OnceLock::new();
CONTEXTS_DIR.get_or_init(|| {
if cfg!(target_os = "macos") {
config_dir().join("conversations")
@@ -305,8 +297,8 @@ pub fn contexts_dir() -> &'static SanitizedPathBuf {
/// Returns the path to the contexts directory.
///
/// This is where the prompts for use with the Assistant are stored.
pub fn prompts_dir() -> &'static SanitizedPathBuf {
static PROMPTS_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn prompts_dir() -> &'static PathBuf {
static PROMPTS_DIR: OnceLock<PathBuf> = OnceLock::new();
PROMPTS_DIR.get_or_init(|| {
if cfg!(target_os = "macos") {
config_dir().join("prompts")
@@ -323,15 +315,15 @@ pub fn prompts_dir() -> &'static SanitizedPathBuf {
/// # Arguments
///
/// * `dev_mode` - If true, assumes the current working directory is the Zed repository.
pub fn prompt_overrides_dir(repo_path: Option<&SanitizedPath>) -> SanitizedPathBuf {
pub fn prompt_overrides_dir(repo_path: Option<&Path>) -> PathBuf {
if let Some(path) = repo_path {
let dev_path = path.join("assets").join("prompts");
if dev_path.exists() {
return dev_path.into();
return dev_path;
}
}
static PROMPT_TEMPLATES_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
static PROMPT_TEMPLATES_DIR: OnceLock<PathBuf> = OnceLock::new();
PROMPT_TEMPLATES_DIR
.get_or_init(|| {
if cfg!(target_os = "macos") {
@@ -346,8 +338,8 @@ pub fn prompt_overrides_dir(repo_path: Option<&SanitizedPath>) -> SanitizedPathB
/// Returns the path to the semantic search's embeddings directory.
///
/// This is where the embeddings used to power semantic search are stored.
pub fn embeddings_dir() -> &'static SanitizedPathBuf {
static EMBEDDINGS_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn embeddings_dir() -> &'static PathBuf {
static EMBEDDINGS_DIR: OnceLock<PathBuf> = OnceLock::new();
EMBEDDINGS_DIR.get_or_init(|| {
if cfg!(target_os = "macos") {
config_dir().join("embeddings")
@@ -360,74 +352,74 @@ pub fn embeddings_dir() -> &'static SanitizedPathBuf {
/// Returns the path to the languages directory.
///
/// This is where language servers are downloaded to for languages built-in to Zed.
pub fn languages_dir() -> &'static SanitizedPathBuf {
static LANGUAGES_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn languages_dir() -> &'static PathBuf {
static LANGUAGES_DIR: OnceLock<PathBuf> = OnceLock::new();
LANGUAGES_DIR.get_or_init(|| data_dir().join("languages"))
}
/// Returns the path to the debug adapters directory
///
/// This is where debug adapters are downloaded to for DAPs that are built-in to Zed.
pub fn debug_adapters_dir() -> &'static SanitizedPathBuf {
static DEBUG_ADAPTERS_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn debug_adapters_dir() -> &'static PathBuf {
static DEBUG_ADAPTERS_DIR: OnceLock<PathBuf> = OnceLock::new();
DEBUG_ADAPTERS_DIR.get_or_init(|| data_dir().join("debug_adapters"))
}
/// Returns the path to the agent servers directory
///
/// This is where agent servers are downloaded to
pub fn agent_servers_dir() -> &'static SanitizedPathBuf {
static AGENT_SERVERS_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn agent_servers_dir() -> &'static PathBuf {
static AGENT_SERVERS_DIR: OnceLock<PathBuf> = OnceLock::new();
AGENT_SERVERS_DIR.get_or_init(|| data_dir().join("agent_servers"))
}
/// Returns the path to the Copilot directory.
pub fn copilot_dir() -> &'static SanitizedPathBuf {
static COPILOT_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn copilot_dir() -> &'static PathBuf {
static COPILOT_DIR: OnceLock<PathBuf> = OnceLock::new();
COPILOT_DIR.get_or_init(|| data_dir().join("copilot"))
}
/// Returns the path to the Supermaven directory.
pub fn supermaven_dir() -> &'static SanitizedPathBuf {
static SUPERMAVEN_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn supermaven_dir() -> &'static PathBuf {
static SUPERMAVEN_DIR: OnceLock<PathBuf> = OnceLock::new();
SUPERMAVEN_DIR.get_or_init(|| data_dir().join("supermaven"))
}
/// Returns the path to the default Prettier directory.
pub fn default_prettier_dir() -> &'static SanitizedPathBuf {
static DEFAULT_PRETTIER_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn default_prettier_dir() -> &'static PathBuf {
static DEFAULT_PRETTIER_DIR: OnceLock<PathBuf> = OnceLock::new();
DEFAULT_PRETTIER_DIR.get_or_init(|| data_dir().join("prettier"))
}
/// Returns the path to the remote server binaries directory.
pub fn remote_servers_dir() -> &'static SanitizedPathBuf {
static REMOTE_SERVERS_DIR: OnceLock<SanitizedPathBuf> = OnceLock::new();
pub fn remote_servers_dir() -> &'static PathBuf {
static REMOTE_SERVERS_DIR: OnceLock<PathBuf> = OnceLock::new();
REMOTE_SERVERS_DIR.get_or_init(|| data_dir().join("remote_servers"))
}
/// Returns the relative path to a `.zed` folder within a project.
pub fn local_settings_folder_relative_path() -> &'static SanitizedPath {
SanitizedPath::new(".zed")
pub fn local_settings_folder_relative_path() -> &'static Path {
Path::new(".zed")
}
/// Returns the relative path to a `.vscode` folder within a project.
pub fn local_vscode_folder_relative_path() -> &'static SanitizedPath {
SanitizedPath::new(".vscode")
pub fn local_vscode_folder_relative_path() -> &'static Path {
Path::new(".vscode")
}
/// Returns the relative path to a `settings.json` file within a project.
pub fn local_settings_file_relative_path() -> &'static SanitizedPath {
SanitizedPath::new(".zed/settings.json")
pub fn local_settings_file_relative_path() -> &'static Path {
Path::new(".zed/settings.json")
}
/// Returns the relative path to a `tasks.json` file within a project.
pub fn local_tasks_file_relative_path() -> &'static SanitizedPath {
SanitizedPath::new(".zed/tasks.json")
pub fn local_tasks_file_relative_path() -> &'static Path {
Path::new(".zed/tasks.json")
}
/// Returns the relative path to a `.vscode/tasks.json` file within a project.
pub fn local_vscode_tasks_file_relative_path() -> &'static SanitizedPath {
SanitizedPath::new(".vscode/tasks.json")
pub fn local_vscode_tasks_file_relative_path() -> &'static Path {
Path::new(".vscode/tasks.json")
}
pub fn debug_task_file_name() -> &'static str {
@@ -440,25 +432,25 @@ pub fn task_file_name() -> &'static str {
/// Returns the relative path to a `debug.json` file within a project.
/// .zed/debug.json
pub fn local_debug_file_relative_path() -> &'static SanitizedPath {
SanitizedPath::new(".zed/debug.json")
pub fn local_debug_file_relative_path() -> &'static Path {
Path::new(".zed/debug.json")
}
/// Returns the relative path to a `.vscode/launch.json` file within a project.
pub fn local_vscode_launch_file_relative_path() -> &'static SanitizedPath {
SanitizedPath::new(".vscode/launch.json")
pub fn local_vscode_launch_file_relative_path() -> &'static Path {
Path::new(".vscode/launch.json")
}
pub fn user_ssh_config_file() -> SanitizedPathBuf {
pub fn user_ssh_config_file() -> PathBuf {
home_dir().join(".ssh/config")
}
pub fn global_ssh_config_file() -> &'static SanitizedPath {
SanitizedPath::new("/etc/ssh/ssh_config")
pub fn global_ssh_config_file() -> &'static Path {
Path::new("/etc/ssh/ssh_config")
}
/// Returns candidate paths for the vscode user settings file
pub fn vscode_settings_file_paths() -> Vec<SanitizedPathBuf> {
pub fn vscode_settings_file_paths() -> Vec<PathBuf> {
let mut paths = vscode_user_data_paths();
for path in paths.iter_mut() {
path.push("User/settings.json");
@@ -467,7 +459,7 @@ pub fn vscode_settings_file_paths() -> Vec<SanitizedPathBuf> {
}
/// Returns candidate paths for the cursor user settings file
pub fn cursor_settings_file_paths() -> Vec<SanitizedPathBuf> {
pub fn cursor_settings_file_paths() -> Vec<PathBuf> {
let mut paths = cursor_user_data_paths();
for path in paths.iter_mut() {
path.push("User/settings.json");
@@ -475,7 +467,7 @@ pub fn cursor_settings_file_paths() -> Vec<SanitizedPathBuf> {
paths
}
fn vscode_user_data_paths() -> Vec<SanitizedPathBuf> {
fn vscode_user_data_paths() -> Vec<PathBuf> {
// https://github.com/microsoft/vscode/blob/23e7148cdb6d8a27f0109ff77e5b1e019f8da051/src/vs/platform/environment/node/userDataPath.ts#L45
const VSCODE_PRODUCT_NAMES: &[&str] = &[
"Code",
@@ -487,11 +479,11 @@ fn vscode_user_data_paths() -> Vec<SanitizedPathBuf> {
];
let mut paths = Vec::new();
if let Ok(portable_path) = env::var("VSCODE_PORTABLE") {
paths.push(SanitizedPath::new(&portable_path).join("user-data"));
paths.push(Path::new(&portable_path).join("user-data"));
}
if let Ok(vscode_appdata) = env::var("VSCODE_APPDATA") {
for product_name in VSCODE_PRODUCT_NAMES {
paths.push(SanitizedPath::new(&vscode_appdata).join(product_name));
paths.push(Path::new(&vscode_appdata).join(product_name));
}
}
for product_name in VSCODE_PRODUCT_NAMES {
@@ -500,13 +492,13 @@ fn vscode_user_data_paths() -> Vec<SanitizedPathBuf> {
paths
}
fn cursor_user_data_paths() -> Vec<SanitizedPathBuf> {
fn cursor_user_data_paths() -> Vec<PathBuf> {
let mut paths = Vec::new();
add_vscode_user_data_paths(&mut paths, "Cursor");
paths
}
fn add_vscode_user_data_paths(paths: &mut Vec<SanitizedPathBuf>, product_name: &str) {
fn add_vscode_user_data_paths(paths: &mut Vec<PathBuf>, product_name: &str) {
if cfg!(target_os = "macos") {
paths.push(
home_dir()
@@ -515,15 +507,14 @@ fn add_vscode_user_data_paths(paths: &mut Vec<SanitizedPathBuf>, product_name: &
);
} else if cfg!(target_os = "windows") {
if let Some(data_local_dir) = dirs::data_local_dir() {
paths.push(data_local_dir.join(product_name).into());
paths.push(data_local_dir.join(product_name));
}
if let Some(data_dir) = dirs::data_dir() {
paths.push(data_dir.join(product_name).into());
paths.push(data_dir.join(product_name));
}
} else {
paths.push(
dirs::config_dir()
.map(|e| e.into())
.unwrap_or(home_dir().join(".config"))
.join(product_name),
);
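The pattern repeated through these path helpers — a `OnceLock`-guarded static computed once on first access, now holding a plain `PathBuf` instead of `SanitizedPathBuf` — can be sketched standalone. This is a minimal illustration under stated assumptions, not Zed's actual helpers: `data_dir` here is a hypothetical stand-in returning a fixed path.

```rust
use std::path::{Path, PathBuf};
use std::sync::OnceLock;

// Hypothetical stand-in for Zed's platform-specific data_dir().
fn data_dir() -> &'static PathBuf {
    static DATA_DIR: OnceLock<PathBuf> = OnceLock::new();
    DATA_DIR.get_or_init(|| PathBuf::from("/tmp/zed-example-data"))
}

// Same shape as languages_dir()/copilot_dir() after the migration:
// the join is computed once, then the cached allocation is returned.
fn languages_dir() -> &'static PathBuf {
    static LANGUAGES_DIR: OnceLock<PathBuf> = OnceLock::new();
    LANGUAGES_DIR.get_or_init(|| data_dir().join("languages"))
}

// Relative-path helpers can return &'static Path directly, because
// Path::new on a string literal borrows from static data.
fn local_settings_file_relative_path() -> &'static Path {
    Path::new(".zed/settings.json")
}

fn main() {
    assert_eq!(
        languages_dir(),
        &PathBuf::from("/tmp/zed-example-data/languages")
    );
    assert_eq!(
        local_settings_file_relative_path(),
        Path::new(".zed/settings.json")
    );
    // Repeated calls hand back the exact same cached value.
    assert!(std::ptr::eq(languages_dir(), languages_dir()));
}
```

Because `get_or_init` only runs its closure once, later callers pay no allocation cost and all share one `'static` reference.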


@@ -67,6 +67,7 @@ regex.workspace = true
remote.workspace = true
rpc.workspace = true
schemars.workspace = true
semver.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
@@ -85,6 +86,7 @@ text.workspace = true
toml.workspace = true
url.workspace = true
util.workspace = true
watch.workspace = true
which.workspace = true
worktree.workspace = true
zlog.workspace = true

File diff suppressed because it is too large


@@ -319,7 +319,11 @@ impl RemoteBufferStore {
})
}
fn create_buffer(&self, cx: &mut Context<BufferStore>) -> Task<Result<Entity<Buffer>>> {
fn create_buffer(
&self,
project_searchable: bool,
cx: &mut Context<BufferStore>,
) -> Task<Result<Entity<Buffer>>> {
let create = self.upstream_client.request(proto::OpenNewBuffer {
project_id: self.project_id,
});
@@ -327,8 +331,13 @@ impl RemoteBufferStore {
let response = create.await?;
let buffer_id = BufferId::new(response.buffer_id)?;
this.update(cx, |this, cx| this.wait_for_remote_buffer(buffer_id, cx))?
.await
this.update(cx, |this, cx| {
if !project_searchable {
this.non_searchable_buffers.insert(buffer_id);
}
this.wait_for_remote_buffer(buffer_id, cx)
})?
.await
})
}
@@ -670,12 +679,21 @@ impl LocalBufferStore {
})
}
fn create_buffer(&self, cx: &mut Context<BufferStore>) -> Task<Result<Entity<Buffer>>> {
fn create_buffer(
&self,
project_searchable: bool,
cx: &mut Context<BufferStore>,
) -> Task<Result<Entity<Buffer>>> {
cx.spawn(async move |buffer_store, cx| {
let buffer =
cx.new(|cx| Buffer::local("", cx).with_language(language::PLAIN_TEXT.clone(), cx))?;
buffer_store.update(cx, |buffer_store, cx| {
buffer_store.add_buffer(buffer.clone(), cx).log_err();
if !project_searchable {
buffer_store
.non_searchable_buffers
.insert(buffer.read(cx).remote_id());
}
})?;
Ok(buffer)
})
@@ -848,10 +866,14 @@ impl BufferStore {
})
}
pub fn create_buffer(&mut self, cx: &mut Context<Self>) -> Task<Result<Entity<Buffer>>> {
pub fn create_buffer(
&mut self,
project_searchable: bool,
cx: &mut Context<Self>,
) -> Task<Result<Entity<Buffer>>> {
match &self.state {
BufferStoreState::Local(this) => this.create_buffer(cx),
BufferStoreState::Remote(this) => this.create_buffer(cx),
BufferStoreState::Local(this) => this.create_buffer(project_searchable, cx),
BufferStoreState::Remote(this) => this.create_buffer(project_searchable, cx),
}
}
@@ -1610,6 +1632,7 @@ impl BufferStore {
&mut self,
text: &str,
language: Option<Arc<Language>>,
project_searchable: bool,
cx: &mut Context<Self>,
) -> Entity<Buffer> {
let buffer = cx.new(|cx| {
@@ -1619,6 +1642,9 @@ impl BufferStore {
self.add_buffer(buffer.clone(), cx).log_err();
let buffer_id = buffer.read(cx).remote_id();
if !project_searchable {
self.non_searchable_buffers.insert(buffer_id);
}
if let Some(file) = File::from_dyn(buffer.read(cx).file()) {
self.path_to_buffer_id.insert(
@@ -1688,10 +1714,6 @@ impl BufferStore {
}
serialized_transaction
}
pub(crate) fn mark_buffer_as_non_searchable(&mut self, buffer_id: BufferId) {
self.non_searchable_buffers.insert(buffer_id);
}
}
impl OpenBuffer {


@@ -3317,7 +3317,7 @@ impl Repository {
) -> Task<Result<Entity<Buffer>>> {
cx.spawn(async move |repository, cx| {
let buffer = buffer_store
.update(cx, |buffer_store, cx| buffer_store.create_buffer(cx))?
.update(cx, |buffer_store, cx| buffer_store.create_buffer(false, cx))?
.await?;
if let Some(language_registry) = language_registry {


@@ -86,7 +86,6 @@ use node_runtime::read_package_installed_version;
use parking_lot::Mutex;
use postage::{mpsc, sink::Sink, stream::Stream, watch};
use rand::prelude::*;
use rpc::{
AnyProtoClient,
proto::{FromProto, LspRequestId, LspRequestMessage as _, ToProto},
@@ -7124,6 +7123,36 @@ impl LspStore {
summary
}
/// Returns the diagnostic summary for a specific project path.
pub fn diagnostic_summary_for_path(
&self,
project_path: &ProjectPath,
_: &App,
) -> DiagnosticSummary {
if let Some(summaries) = self
.diagnostic_summaries
.get(&project_path.worktree_id)
.and_then(|map| map.get(&project_path.path))
{
let (error_count, warning_count) = summaries.iter().fold(
(0, 0),
|(error_count, warning_count), (_language_server_id, summary)| {
(
error_count + summary.error_count,
warning_count + summary.warning_count,
)
},
);
DiagnosticSummary {
error_count,
warning_count,
}
} else {
DiagnosticSummary::default()
}
}
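The fold in the new `diagnostic_summary_for_path` simply sums per-language-server error and warning counts into one total. A standalone sketch, using simplified stand-in types (the real `DiagnosticSummary` and language-server-id types live in Zed's crates):

```rust
// Simplified stand-in for Zed's DiagnosticSummary.
#[derive(Debug, Default, PartialEq)]
struct DiagnosticSummary {
    error_count: usize,
    warning_count: usize,
}

// Sum the counts reported by each language server for one path,
// mirroring the fold in diagnostic_summary_for_path. The u64 key
// stands in for a language server id.
fn summarize(summaries: &[(u64, DiagnosticSummary)]) -> DiagnosticSummary {
    let (error_count, warning_count) = summaries.iter().fold(
        (0, 0),
        |(errors, warnings), (_server_id, summary)| {
            (errors + summary.error_count, warnings + summary.warning_count)
        },
    );
    DiagnosticSummary { error_count, warning_count }
}

fn main() {
    let per_server = vec![
        (1, DiagnosticSummary { error_count: 2, warning_count: 1 }),
        (2, DiagnosticSummary { error_count: 0, warning_count: 3 }),
    ];
    assert_eq!(
        summarize(&per_server),
        DiagnosticSummary { error_count: 2, warning_count: 4 }
    );
    // No summaries for a path collapses to the default (all zeros),
    // matching the else branch in the method.
    assert_eq!(summarize(&[]), DiagnosticSummary::default());
}
```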
pub fn diagnostic_summaries<'a>(
&'a self,
include_ignored: bool,


@@ -1,3 +1,4 @@
pub mod agent_server_store;
pub mod buffer_store;
mod color_extractor;
pub mod connection_manager;
@@ -34,7 +35,11 @@ mod yarn;
use dap::inline_value::{InlineValueLocation, VariableLookupKind, VariableScope};
use crate::{git_store::GitStore, lsp_store::log_store::LogKind};
use crate::{
agent_server_store::{AgentServerStore, AllAgentServersSettings},
git_store::GitStore,
lsp_store::log_store::LogKind,
};
pub use git_store::{
ConflictRegion, ConflictSet, ConflictSetSnapshot, ConflictSetUpdate,
git_traversal::{ChildEntriesGitIter, GitEntry, GitEntryRef, GitTraversal},
@@ -179,6 +184,7 @@ pub struct Project {
buffer_ordered_messages_tx: mpsc::UnboundedSender<BufferOrderedMessage>,
languages: Arc<LanguageRegistry>,
dap_store: Entity<DapStore>,
agent_server_store: Entity<AgentServerStore>,
breakpoint_store: Entity<BreakpointStore>,
collab_client: Arc<client::Client>,
@@ -1019,6 +1025,7 @@ impl Project {
WorktreeSettings::register(cx);
ProjectSettings::register(cx);
DisableAiSettings::register(cx);
AllAgentServersSettings::register(cx);
}
pub fn init(client: &Arc<Client>, cx: &mut App) {
@@ -1174,6 +1181,10 @@ impl Project {
)
});
let agent_server_store = cx.new(|cx| {
AgentServerStore::local(node.clone(), fs.clone(), environment.clone(), cx)
});
cx.subscribe(&lsp_store, Self::on_lsp_store_event).detach();
Self {
@@ -1200,6 +1211,7 @@ impl Project {
remote_client: None,
breakpoint_store,
dap_store,
agent_server_store,
buffers_needing_diff: Default::default(),
git_diff_debouncer: DebouncedDelay::new(),
@@ -1338,6 +1350,9 @@ impl Project {
)
});
let agent_server_store =
cx.new(|cx| AgentServerStore::remote(REMOTE_SERVER_PROJECT_ID, remote.clone(), cx));
cx.subscribe(&remote, Self::on_remote_client_event).detach();
let this = Self {
@@ -1353,6 +1368,7 @@ impl Project {
join_project_response_message_id: 0,
client_state: ProjectClientState::Local,
git_store,
agent_server_store,
client_subscriptions: Vec::new(),
_subscriptions: vec![
cx.on_release(Self::release),
@@ -1407,6 +1423,7 @@ impl Project {
remote_proto.subscribe_to_entity(REMOTE_SERVER_PROJECT_ID, &this.dap_store);
remote_proto.subscribe_to_entity(REMOTE_SERVER_PROJECT_ID, &this.settings_observer);
remote_proto.subscribe_to_entity(REMOTE_SERVER_PROJECT_ID, &this.git_store);
remote_proto.subscribe_to_entity(REMOTE_SERVER_PROJECT_ID, &this.agent_server_store);
remote_proto.add_entity_message_handler(Self::handle_create_buffer_for_peer);
remote_proto.add_entity_message_handler(Self::handle_update_worktree);
@@ -1422,6 +1439,7 @@ impl Project {
ToolchainStore::init(&remote_proto);
DapStore::init(&remote_proto, cx);
GitStore::init(&remote_proto);
AgentServerStore::init_remote(&remote_proto);
this
})
@@ -1564,6 +1582,8 @@ impl Project {
)
})?;
let agent_server_store = cx.new(|cx| AgentServerStore::collab(cx))?;
let project = cx.new(|cx| {
let replica_id = response.payload.replica_id as ReplicaId;
@@ -1624,6 +1644,7 @@ impl Project {
breakpoint_store,
dap_store: dap_store.clone(),
git_store: git_store.clone(),
agent_server_store,
buffers_needing_diff: Default::default(),
git_diff_debouncer: DebouncedDelay::new(),
terminals: Terminals {
@@ -2540,22 +2561,28 @@ impl Project {
}
}
pub fn create_buffer(&mut self, cx: &mut Context<Self>) -> Task<Result<Entity<Buffer>>> {
self.buffer_store
.update(cx, |buffer_store, cx| buffer_store.create_buffer(cx))
pub fn create_buffer(
&mut self,
searchable: bool,
cx: &mut Context<Self>,
) -> Task<Result<Entity<Buffer>>> {
self.buffer_store.update(cx, |buffer_store, cx| {
buffer_store.create_buffer(searchable, cx)
})
}
pub fn create_local_buffer(
&mut self,
text: &str,
language: Option<Arc<Language>>,
project_searchable: bool,
cx: &mut Context<Self>,
) -> Entity<Buffer> {
if self.is_via_collab() || self.is_via_remote_server() {
panic!("called create_local_buffer on a remote project")
}
self.buffer_store.update(cx, |buffer_store, cx| {
buffer_store.create_local_buffer(text, language, cx)
buffer_store.create_local_buffer(text, language, project_searchable, cx)
})
}
@@ -4394,6 +4421,13 @@ impl Project {
.diagnostic_summary(include_ignored, cx)
}
/// Returns a summary of the diagnostics for the provided project path only.
pub fn diagnostic_summary_for_path(&self, path: &ProjectPath, cx: &App) -> DiagnosticSummary {
self.lsp_store
.read(cx)
.diagnostic_summary_for_path(path, cx)
}
pub fn diagnostic_summaries<'a>(
&'a self,
include_ignored: bool,
@@ -4484,6 +4518,23 @@ impl Project {
None
}
/// If there's only one visible worktree, returns the given worktree-relative path with no prefix.
///
/// Otherwise, returns the full path for the project path (obtained by prefixing the worktree-relative path with the name of the worktree).
pub fn short_full_path_for_project_path(
&self,
project_path: &ProjectPath,
cx: &App,
) -> Option<PathBuf> {
if self.visible_worktrees(cx).take(2).count() < 2 {
return Some(project_path.path.to_path_buf());
}
self.worktree_for_id(project_path.worktree_id, cx)
.and_then(|worktree| {
Some(Path::new(worktree.read(cx).abs_path().file_name()?).join(&project_path.path))
})
}
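The logic of `short_full_path_for_project_path` can be sketched with plain paths. Note the `.take(2).count() < 2` trick from the original: it answers "is there more than one visible worktree?" without counting them all. The `short_full_path` function below is a hypothetical simplification for illustration, not the method's real signature:

```rust
use std::path::{Path, PathBuf};

// With one visible worktree, return the worktree-relative path as-is;
// with several, prefix it with the worktree directory's name so the
// result is unambiguous across worktrees.
fn short_full_path(
    visible_worktree_roots: &[&Path],
    worktree_root: &Path,
    relative_path: &Path,
) -> Option<PathBuf> {
    // take(2).count() inspects at most two items instead of the whole list.
    if visible_worktree_roots.iter().take(2).count() < 2 {
        return Some(relative_path.to_path_buf());
    }
    // file_name() is None for paths like "/", in which case we bail out.
    Some(Path::new(worktree_root.file_name()?).join(relative_path))
}

fn main() {
    let zed = Path::new("/home/me/zed");
    let docs = Path::new("/home/me/docs");
    // Single visible worktree: no prefix.
    assert_eq!(
        short_full_path(&[zed], zed, Path::new("src/main.rs")),
        Some(PathBuf::from("src/main.rs"))
    );
    // Multiple visible worktrees: prefixed with the worktree name.
    assert_eq!(
        short_full_path(&[zed, docs], zed, Path::new("src/main.rs")),
        Some(PathBuf::from("zed/src/main.rs"))
    );
}
```

This is what lets the read_file/edit_file tool-call titles mentioned in the commit message stay short in single-worktree projects while remaining unambiguous in multi-worktree ones.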
pub fn project_path_for_absolute_path(&self, abs_path: &Path, cx: &App) -> Option<ProjectPath> {
self.find_worktree(abs_path, cx)
.map(|(worktree, relative_path)| ProjectPath {
@@ -4913,7 +4964,7 @@ impl Project {
mut cx: AsyncApp,
) -> Result<proto::OpenBufferResponse> {
let buffer = this
.update(&mut cx, |this, cx| this.create_buffer(cx))?
.update(&mut cx, |this, cx| this.create_buffer(true, cx))?
.await?;
let peer_id = envelope.original_sender_id()?;
@@ -5169,6 +5220,10 @@ impl Project {
&self.git_store
}
pub fn agent_server_store(&self) -> &Entity<AgentServerStore> {
&self.agent_server_store
}
#[cfg(test)]
fn git_scans_complete(&self, cx: &Context<Self>) -> Task<()> {
cx.spawn(async move |this, cx| {
@@ -5244,12 +5299,6 @@ impl Project {
pub fn agent_location(&self) -> Option<AgentLocation> {
self.agent_location.clone()
}
pub fn mark_buffer_as_non_searchable(&self, buffer_id: BufferId, cx: &mut Context<Project>) {
self.buffer_store.update(cx, |buffer_store, _| {
buffer_store.mark_buffer_as_non_searchable(buffer_id)
});
}
}
pub struct PathMatchCandidateSet {


@@ -3876,7 +3876,7 @@ async fn test_save_file_spawns_language_server(cx: &mut gpui::TestAppContext) {
);
let buffer = project
.update(cx, |this, cx| this.create_buffer(cx))
.update(cx, |this, cx| this.create_buffer(false, cx))
.unwrap()
.await;
project.update(cx, |this, cx| {
@@ -4088,7 +4088,9 @@ async fn test_save_as(cx: &mut gpui::TestAppContext) {
let languages = project.update(cx, |project, _| project.languages().clone());
languages.add(rust_lang());
let buffer = project.update(cx, |project, cx| project.create_local_buffer("", None, cx));
let buffer = project.update(cx, |project, cx| {
project.create_local_buffer("", None, false, cx)
});
buffer.update(cx, |buffer, cx| {
buffer.edit([(0..0, "abc")], None, cx);
assert!(buffer.is_dirty());
@@ -5585,9 +5587,7 @@ async fn test_search_with_buffer_exclusions(cx: &mut gpui::TestAppContext) {
let project = Project::test(fs.clone(), [path!("/dir").as_ref()], cx).await;
let _buffer = project.update(cx, |project, cx| {
let buffer = project.create_local_buffer("file", None, cx);
project.mark_buffer_as_non_searchable(buffer.read(cx).remote_id(), cx);
buffer
project.create_local_buffer("file", None, false, cx)
});
assert_eq!(


@@ -2,6 +2,7 @@ syntax = "proto3";
package zed.messages;
import "buffer.proto";
import "task.proto";
message Context {
repeated ContextOperation operations = 1;
@@ -164,3 +165,35 @@ enum LanguageModelRole {
LanguageModelSystem = 2;
reserved 3;
}
message GetAgentServerCommand {
uint64 project_id = 1;
string name = 2;
optional string root_dir = 3;
}
message AgentServerCommand {
string path = 1;
repeated string args = 2;
map<string, string> env = 3;
string root_dir = 4;
optional SpawnInTerminal login = 5;
}
message ExternalAgentsUpdated {
uint64 project_id = 1;
repeated string names = 2;
}
message ExternalAgentLoadingStatusUpdated {
uint64 project_id = 1;
string name = 2;
string status = 3;
}
message NewExternalAgentVersionAvailable {
uint64 project_id = 1;
string name = 2;
string version = 3;
}


@@ -3,6 +3,7 @@ package zed.messages;
import "core.proto";
import "buffer.proto";
import "task.proto";
enum BreakpointState {
Enabled = 0;
@@ -533,14 +534,6 @@ message DebugScenario {
optional string configuration = 7;
}
message SpawnInTerminal {
string label = 1;
optional string command = 2;
repeated string args = 3;
map<string, string> env = 4;
optional string cwd = 5;
}
message LogToDebugConsole {
uint64 project_id = 1;
uint64 session_id = 2;


@@ -40,3 +40,11 @@ enum HideStrategy {
HideNever = 1;
HideOnSuccess = 2;
}
message SpawnInTerminal {
string label = 1;
optional string command = 2;
repeated string args = 3;
map<string, string> env = 4;
optional string cwd = 5;
}


@@ -405,7 +405,15 @@ message Envelope {
GetProcessesResponse get_processes_response = 370;
ResolveToolchain resolve_toolchain = 371;
ResolveToolchainResponse resolve_toolchain_response = 372; // current max
ResolveToolchainResponse resolve_toolchain_response = 372;
GetAgentServerCommand get_agent_server_command = 373;
AgentServerCommand agent_server_command = 374;
ExternalAgentsUpdated external_agents_updated = 375;
ExternalAgentLoadingStatusUpdated external_agent_loading_status_updated = 376;
NewExternalAgentVersionAvailable new_external_agent_version_available = 377; // current max
}
reserved 87 to 88;


@@ -319,6 +319,11 @@ messages!(
(GitClone, Background),
(GitCloneResponse, Background),
(ToggleLspLogs, Background),
(GetAgentServerCommand, Background),
(AgentServerCommand, Background),
(ExternalAgentsUpdated, Background),
(ExternalAgentLoadingStatusUpdated, Background),
(NewExternalAgentVersionAvailable, Background),
);
request_messages!(
@@ -491,6 +496,7 @@ request_messages!(
(GitClone, GitCloneResponse),
(ToggleLspLogs, Ack),
(GetProcesses, GetProcessesResponse),
(GetAgentServerCommand, AgentServerCommand)
);
lsp_messages!(
@@ -644,7 +650,11 @@ entity_messages!(
GetDocumentDiagnostics,
PullWorkspaceDiagnostics,
GetDefaultBranch,
GitClone
GitClone,
GetAgentServerCommand,
ExternalAgentsUpdated,
ExternalAgentLoadingStatusUpdated,
NewExternalAgentVersionAvailable,
);
entity_messages!(


@@ -210,12 +210,15 @@ impl FromProto for Arc<Path> {
}
}
impl<T> ToProto for T
where
T: AsRef<Path>,
{
impl ToProto for PathBuf {
fn to_proto(self) -> String {
to_proto_path(self.as_ref())
to_proto_path(&self)
}
}
impl ToProto for &Path {
fn to_proto(self) -> String {
to_proto_path(self)
}
}
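The diff replaces the blanket `impl<T> ToProto for T where T: AsRef<Path>` with concrete impls for `PathBuf` and `&Path`. This narrows which types get the conversion — a blanket impl over `AsRef<Path>` would also cover `String`, `&str`, and `OsString`, and by coherence it blocks any other `ToProto` impl in the crate. A minimal sketch with a stand-in `to_proto_path` helper (an assumption; the real one handles path casing and separators per Zed's conventions):

```rust
use std::path::{Path, PathBuf};

trait ToProto {
    fn to_proto(self) -> String;
}

// Stand-in for the crate's to_proto_path: normalize separators for the wire.
fn to_proto_path(path: &Path) -> String {
    path.to_string_lossy().replace('\\', "/")
}

// Concrete impls, as in the diff, instead of one blanket impl.
impl ToProto for PathBuf {
    fn to_proto(self) -> String {
        to_proto_path(&self)
    }
}

impl ToProto for &Path {
    fn to_proto(self) -> String {
        to_proto_path(self)
    }
}

fn main() {
    assert_eq!(PathBuf::from("a/b.rs").to_proto(), "a/b.rs");
    assert_eq!(Path::new("a/b.rs").to_proto(), "a/b.rs");
    // With the concrete impls, String no longer implements ToProto,
    // so other conversions stay free to define their own impls.
}
```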


@@ -12,6 +12,7 @@ use node_runtime::NodeRuntime;
use project::{
LspStore, LspStoreEvent, ManifestTree, PrettierStore, ProjectEnvironment, ProjectPath,
ToolchainStore, WorktreeId,
agent_server_store::AgentServerStore,
buffer_store::{BufferStore, BufferStoreEvent},
debugger::{breakpoint_store::BreakpointStore, dap_store::DapStore},
git_store::GitStore,
@@ -44,6 +45,7 @@ pub struct HeadlessProject {
pub lsp_store: Entity<LspStore>,
pub task_store: Entity<TaskStore>,
pub dap_store: Entity<DapStore>,
pub agent_server_store: Entity<AgentServerStore>,
pub settings_observer: Entity<SettingsObserver>,
pub next_entry_id: Arc<AtomicUsize>,
pub languages: Arc<LanguageRegistry>,
@@ -182,7 +184,7 @@ impl HeadlessProject {
.as_local_store()
.expect("Toolchain store to be local")
.clone(),
environment,
environment.clone(),
manifest_tree,
languages.clone(),
http_client.clone(),
@@ -193,6 +195,13 @@ impl HeadlessProject {
lsp_store
});
let agent_server_store = cx.new(|cx| {
let mut agent_server_store =
AgentServerStore::local(node_runtime.clone(), fs.clone(), environment, cx);
agent_server_store.shared(REMOTE_SERVER_PROJECT_ID, session.clone());
agent_server_store
});
cx.subscribe(&lsp_store, Self::on_lsp_store_event).detach();
language_extension::init(
language_extension::LspAccess::ViaLspStore(lsp_store.clone()),
@@ -226,6 +235,7 @@ impl HeadlessProject {
session.subscribe_to_entity(REMOTE_SERVER_PROJECT_ID, &dap_store);
session.subscribe_to_entity(REMOTE_SERVER_PROJECT_ID, &settings_observer);
session.subscribe_to_entity(REMOTE_SERVER_PROJECT_ID, &git_store);
session.subscribe_to_entity(REMOTE_SERVER_PROJECT_ID, &agent_server_store);
session.add_request_handler(cx.weak_entity(), Self::handle_list_remote_directory);
session.add_request_handler(cx.weak_entity(), Self::handle_get_path_metadata);
@@ -264,6 +274,7 @@ impl HeadlessProject {
// todo(debugger): Re init breakpoint store when we set it up for collab
// BreakpointStore::init(&client);
GitStore::init(&session);
AgentServerStore::init_headless(&session);
HeadlessProject {
next_entry_id: Default::default(),
@@ -275,6 +286,7 @@ impl HeadlessProject {
lsp_store,
task_store,
dap_store,
agent_server_store,
languages,
extensions,
git_store,
@@ -517,7 +529,7 @@ impl HeadlessProject {
let buffer_store = this.buffer_store.clone();
let buffer = this
.buffer_store
.update(cx, |buffer_store, cx| buffer_store.create_buffer(cx));
.update(cx, |buffer_store, cx| buffer_store.create_buffer(true, cx));
anyhow::Ok((buffer_store, buffer))
})??;


@@ -1,69 +0,0 @@
[package]
name = "semantic_index"
description = "Process, chunk, and embed text as vectors for semantic search."
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "GPL-3.0-or-later"
[lints]
workspace = true
[lib]
path = "src/semantic_index.rs"
[[example]]
name = "index"
path = "examples/index.rs"
crate-type = ["bin"]
[dependencies]
anyhow.workspace = true
arrayvec.workspace = true
blake3.workspace = true
client.workspace = true
clock.workspace = true
collections.workspace = true
feature_flags.workspace = true
fs.workspace = true
futures-batch.workspace = true
futures.workspace = true
gpui.workspace = true
heed.workspace = true
http_client.workspace = true
language.workspace = true
language_model.workspace = true
log.workspace = true
open_ai.workspace = true
parking_lot.workspace = true
project.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
sha2.workspace = true
smol.workspace = true
streaming-iterator.workspace = true
theme.workspace = true
tree-sitter.workspace = true
ui.workspace = true
unindent.workspace = true
util.workspace = true
workspace.workspace = true
worktree.workspace = true
workspace-hack.workspace = true
[dev-dependencies]
client = { workspace = true, features = ["test-support"] }
fs = { workspace = true, features = ["test-support"] }
futures.workspace = true
gpui = { workspace = true, features = ["test-support"] }
http_client = { workspace = true, features = ["test-support"] }
language = { workspace = true, features = ["test-support"] }
languages.workspace = true
project = { workspace = true, features = ["test-support"] }
tempfile.workspace = true
reqwest_client.workspace = true
util = { workspace = true, features = ["test-support"] }
workspace = { workspace = true, features = ["test-support"] }
worktree = { workspace = true, features = ["test-support"] }
zlog.workspace = true


@@ -1,140 +0,0 @@
use client::Client;
use futures::channel::oneshot;
use gpui::Application;
use http_client::HttpClientWithUrl;
use language::language_settings::AllLanguageSettings;
use project::Project;
use semantic_index::{OpenAiEmbeddingModel, OpenAiEmbeddingProvider, SemanticDb};
use settings::SettingsStore;
use std::{
path::{Path, PathBuf},
sync::Arc,
};
fn main() {
zlog::init();
use clock::FakeSystemClock;
Application::new().run(|cx| {
let store = SettingsStore::test(cx);
cx.set_global(store);
language::init(cx);
Project::init_settings(cx);
SettingsStore::update(cx, |store, cx| {
store.update_user_settings::<AllLanguageSettings>(cx, |_| {});
});
let clock = Arc::new(FakeSystemClock::new());
let http = Arc::new(HttpClientWithUrl::new(
Arc::new(
reqwest_client::ReqwestClient::user_agent("Zed semantic index example").unwrap(),
),
"http://localhost:11434",
None,
));
let client = client::Client::new(clock, http.clone(), cx);
Client::set_global(client, cx);
let args: Vec<String> = std::env::args().collect();
if args.len() < 2 {
eprintln!("Usage: cargo run --example index -p semantic_index -- <project_path>");
cx.quit();
return;
}
// let embedding_provider = semantic_index::FakeEmbeddingProvider;
let api_key = std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY not set");
let embedding_provider = Arc::new(OpenAiEmbeddingProvider::new(
http,
OpenAiEmbeddingModel::TextEmbedding3Small,
open_ai::OPEN_AI_API_URL.to_string(),
api_key,
));
cx.spawn(async move |cx| {
let semantic_index = SemanticDb::new(
PathBuf::from("/tmp/semantic-index-db.mdb"),
embedding_provider,
cx,
);
let mut semantic_index = semantic_index.await.unwrap();
let project_path = Path::new(&args[1]);
let project = Project::example([project_path], cx).await;
cx.update(|cx| {
let language_registry = project.read(cx).languages().clone();
let node_runtime = project.read(cx).node_runtime().unwrap().clone();
languages::init(language_registry, node_runtime, cx);
})
.unwrap();
let project_index = cx
.update(|cx| semantic_index.project_index(project.clone(), cx))
.unwrap()
.unwrap();
let (tx, rx) = oneshot::channel();
let mut tx = Some(tx);
let subscription = cx.update(|cx| {
cx.subscribe(&project_index, move |_, event, _| {
if let Some(tx) = tx.take() {
_ = tx.send(*event);
}
})
});
let index_start = std::time::Instant::now();
rx.await.expect("no event emitted");
drop(subscription);
println!("Index time: {:?}", index_start.elapsed());
let results = cx
.update(|cx| {
let project_index = project_index.read(cx);
let query = "converting an anchor to a point";
project_index.search(vec![query.into()], 4, cx)
})
.unwrap()
.await
.unwrap();
for search_result in results {
let path = search_result.path.clone();
let content = cx
.update(|cx| {
let worktree = search_result.worktree.read(cx);
let entry_abs_path = worktree.abs_path().join(search_result.path.clone());
let fs = project.read(cx).fs().clone();
cx.spawn(async move |_| fs.load(&entry_abs_path).await.unwrap())
})
.unwrap()
.await;
let range = search_result.range.clone();
let content = content[search_result.range].to_owned();
println!(
"✄✄✄✄✄✄✄✄✄✄✄✄✄✄ {:?} @ {} ✄✄✄✄✄✄✄✄✄✄✄✄✄✄",
path, search_result.score
);
println!("{:?}:{:?}:{:?}", path, range.start, range.end);
println!("{}", content);
}
cx.background_executor()
.timer(std::time::Duration::from_secs(100000))
.await;
cx.update(|cx| cx.quit()).unwrap();
})
.detach();
});
}


@@ -1,3 +0,0 @@
fn main() {
println!("Hello Indexer!");
}


@@ -1,43 +0,0 @@
# Searching for a needle in a haystack
When you have a large amount of text, it can be useful to search for a specific word or phrase. This is often referred to as "finding a needle in a haystack." In this markdown document, we're "hiding" a key phrase for our text search to find. Can you find it?
## Instructions
1. Use the search functionality in your text editor or markdown viewer to find the hidden phrase in this document.
2. Once you've found the **phrase**, write it down and proceed to the next step.
Honestly, I just want to fill up plenty of characters so that we chunk this markdown into several chunks.
## Tips
- Relax
- Take a deep breath
- Focus on the task at hand
- Don't get distracted by other text
- Use the search functionality to your advantage
## Example code
```python
def search_for_needle(haystack, needle):
if needle in haystack:
return True
else:
return False
```
```javascript
function searchForNeedle(haystack, needle) {
return haystack.includes(needle);
}
```
## Background
When creating an index for a book or searching for a specific term in a large document, the ability to quickly find a specific word or phrase is essential. This is where search functionality comes in handy. However, one should _remember_ that the search is only as good as the index that was built. As they say, garbage in, garbage out!
## Conclusion
Searching for a needle in a haystack can be a challenging task, but with the right tools and techniques, it becomes much easier. Whether you're looking for a specific word in a document or trying to find a key piece of information in a large dataset, the ability to search efficiently is a valuable skill to have.


@@ -1,415 +0,0 @@
use language::{Language, with_parser, with_query_cursor};
use serde::{Deserialize, Serialize};
use sha2::{Digest, Sha256};
use std::{
cmp::{self, Reverse},
ops::Range,
path::Path,
sync::Arc,
};
use streaming_iterator::StreamingIterator;
use tree_sitter::QueryCapture;
use util::ResultExt as _;
#[derive(Copy, Clone)]
struct ChunkSizeRange {
min: usize,
max: usize,
}
const CHUNK_SIZE_RANGE: ChunkSizeRange = ChunkSizeRange {
min: 1024,
max: 8192,
};
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Chunk {
pub range: Range<usize>,
pub digest: [u8; 32],
}
pub fn chunk_text(text: &str, language: Option<&Arc<Language>>, path: &Path) -> Vec<Chunk> {
chunk_text_with_size_range(text, language, path, CHUNK_SIZE_RANGE)
}
fn chunk_text_with_size_range(
text: &str,
language: Option<&Arc<Language>>,
path: &Path,
size_config: ChunkSizeRange,
) -> Vec<Chunk> {
let ranges = syntactic_ranges(text, language, path).unwrap_or_default();
chunk_text_with_syntactic_ranges(text, &ranges, size_config)
}
fn syntactic_ranges(
text: &str,
language: Option<&Arc<Language>>,
path: &Path,
) -> Option<Vec<Range<usize>>> {
let language = language?;
let grammar = language.grammar()?;
let outline = grammar.outline_config.as_ref()?;
let tree = with_parser(|parser| {
parser.set_language(&grammar.ts_language).log_err()?;
parser.parse(text, None)
});
let Some(tree) = tree else {
log::error!("failed to parse file {path:?} for chunking");
return None;
};
struct RowInfo {
offset: usize,
is_comment: bool,
}
let scope = language.default_scope();
let line_comment_prefixes = scope.line_comment_prefixes();
let row_infos = text
.split('\n')
.map({
let mut offset = 0;
move |line| {
let line = line.trim_start();
let is_comment = line_comment_prefixes
.iter()
.any(|prefix| line.starts_with(prefix.as_ref()));
let result = RowInfo { offset, is_comment };
offset += line.len() + 1;
result
}
})
.collect::<Vec<_>>();
// Retrieve a list of ranges of outline items (types, functions, etc) in the document.
// Omit single-line outline items (e.g. struct fields, constant declarations), because
// we'll already be attempting to split on lines.
let mut ranges = with_query_cursor(|cursor| {
cursor
.matches(&outline.query, tree.root_node(), text.as_bytes())
.filter_map_deref(|mat| {
mat.captures
.iter()
.find_map(|QueryCapture { node, index }| {
if *index == outline.item_capture_ix {
let mut start_offset = node.start_byte();
let mut start_row = node.start_position().row;
let end_offset = node.end_byte();
let end_row = node.end_position().row;
// Expand the range to include any preceding comments.
while start_row > 0 && row_infos[start_row - 1].is_comment {
start_offset = row_infos[start_row - 1].offset;
start_row -= 1;
}
if end_row > start_row {
return Some(start_offset..end_offset);
}
}
None
})
})
.collect::<Vec<_>>()
});
ranges.sort_unstable_by_key(|range| (range.start, Reverse(range.end)));
Some(ranges)
}
fn chunk_text_with_syntactic_ranges(
text: &str,
mut syntactic_ranges: &[Range<usize>],
size_config: ChunkSizeRange,
) -> Vec<Chunk> {
let mut chunks = Vec::new();
let mut range = 0..0;
let mut range_end_nesting_depth = 0;
// Try to split the text at line boundaries.
let mut line_ixs = text
.match_indices('\n')
.map(|(ix, _)| ix + 1)
.chain(if text.ends_with('\n') {
None
} else {
Some(text.len())
})
.peekable();
while let Some(&line_ix) = line_ixs.peek() {
// If the current position is beyond the maximum chunk size, then
// start a new chunk.
if line_ix - range.start > size_config.max {
if range.is_empty() {
range.end = cmp::min(range.start + size_config.max, line_ix);
while !text.is_char_boundary(range.end) {
range.end -= 1;
}
}
chunks.push(Chunk {
range: range.clone(),
digest: Sha256::digest(&text[range.clone()]).into(),
});
range_end_nesting_depth = 0;
range.start = range.end;
continue;
}
// Discard any syntactic ranges that end before the current position.
while let Some(first_item) = syntactic_ranges.first() {
if first_item.end < line_ix {
syntactic_ranges = &syntactic_ranges[1..];
continue;
} else {
break;
}
}
// Count how many syntactic ranges contain the current position.
let mut nesting_depth = 0;
for range in syntactic_ranges {
if range.start > line_ix {
break;
}
if range.start < line_ix && range.end > line_ix {
nesting_depth += 1;
}
}
// Extend the current range to this position, unless an earlier candidate
// end position was less nested syntactically.
if range.len() < size_config.min || nesting_depth <= range_end_nesting_depth {
range.end = line_ix;
range_end_nesting_depth = nesting_depth;
}
line_ixs.next();
}
if !range.is_empty() {
chunks.push(Chunk {
range: range.clone(),
digest: Sha256::digest(&text[range]).into(),
});
}
chunks
}
#[cfg(test)]
mod tests {
use super::*;
use language::{Language, LanguageConfig, LanguageMatcher, tree_sitter_rust};
use unindent::Unindent as _;
#[test]
fn test_chunk_text_with_syntax() {
let language = rust_language();
let text = "
struct Person {
first_name: String,
last_name: String,
age: u32,
}
impl Person {
fn new(first_name: String, last_name: String, age: u32) -> Self {
Self { first_name, last_name, age }
}
/// Returns the first name
/// something something something
fn first_name(&self) -> &str {
&self.first_name
}
fn last_name(&self) -> &str {
&self.last_name
}
fn age(&self) -> u32 {
self.age
}
}
"
.unindent();
let chunks = chunk_text_with_size_range(
&text,
Some(&language),
Path::new("lib.rs"),
ChunkSizeRange {
min: text.find('}').unwrap(),
max: text.find("Self {").unwrap(),
},
);
// The entire impl cannot fit in a chunk, so it is split.
// Within the impl, two methods can fit in a chunk.
assert_chunks(
&text,
&chunks,
&[
"struct Person {", // ...
"impl Person {",
" /// Returns the first name",
" fn last_name",
],
);
let text = "
struct T {}
struct U {}
struct V {}
struct W {
a: T,
b: U,
}
"
.unindent();
let chunks = chunk_text_with_size_range(
&text,
Some(&language),
Path::new("lib.rs"),
ChunkSizeRange {
min: text.find('{').unwrap(),
max: text.find('V').unwrap(),
},
);
// Two single-line structs can fit in a chunk.
// The last struct cannot fit in a chunk together
// with the previous single-line struct.
assert_chunks(
&text,
&chunks,
&[
"struct T", // ...
"struct V", // ...
"struct W", // ...
"}",
],
);
}
#[test]
fn test_chunk_with_long_lines() {
let language = rust_language();
let text = "
struct S { a: u32 }
struct T { a: u64 }
struct U { a: u64, b: u64, c: u64, d: u64, e: u64, f: u64, g: u64, h: u64, i: u64, j: u64 }
struct W { a: u64, b: u64, c: u64, d: u64, e: u64, f: u64, g: u64, h: u64, i: u64, j: u64 }
"
.unindent();
let chunks = chunk_text_with_size_range(
&text,
Some(&language),
Path::new("lib.rs"),
ChunkSizeRange { min: 32, max: 64 },
);
// The line is too long to fit in one chunk
assert_chunks(
&text,
&chunks,
&[
"struct S {", // ...
"struct U",
"4, h: u64, i: u64", // ...
"struct W",
"4, h: u64, i: u64", // ...
],
);
}
#[track_caller]
fn assert_chunks(text: &str, chunks: &[Chunk], expected_chunk_text_prefixes: &[&str]) {
check_chunk_invariants(text, chunks);
assert_eq!(
chunks.len(),
expected_chunk_text_prefixes.len(),
"unexpected number of chunks: {chunks:?}",
);
let mut prev_chunk_end = 0;
for (ix, chunk) in chunks.iter().enumerate() {
let expected_prefix = expected_chunk_text_prefixes[ix];
let chunk_text = &text[chunk.range.clone()];
if !chunk_text.starts_with(expected_prefix) {
let chunk_prefix_offset = text[prev_chunk_end..].find(expected_prefix);
if let Some(chunk_prefix_offset) = chunk_prefix_offset {
panic!(
"chunk {ix} starts at unexpected offset {}. expected {}",
chunk.range.start,
chunk_prefix_offset + prev_chunk_end
);
} else {
panic!("invalid expected chunk prefix {ix}: {expected_prefix:?}");
}
}
prev_chunk_end = chunk.range.end;
}
}
#[track_caller]
fn check_chunk_invariants(text: &str, chunks: &[Chunk]) {
for (ix, chunk) in chunks.iter().enumerate() {
if ix > 0 && chunk.range.start != chunks[ix - 1].range.end {
panic!("chunk ranges are not contiguous: {:?}", chunks);
}
}
if text.is_empty() {
assert!(chunks.is_empty())
} else if chunks.first().unwrap().range.start != 0
|| chunks.last().unwrap().range.end != text.len()
{
panic!("chunks don't cover entire text {:?}", chunks);
}
}
#[test]
fn test_chunk_text() {
let text = "a\n".repeat(1000);
let chunks = chunk_text(&text, None, Path::new("lib.rs"));
assert_eq!(
chunks.len(),
((2000_f64) / (CHUNK_SIZE_RANGE.max as f64)).ceil() as usize
);
}
fn rust_language() -> Arc<Language> {
Arc::new(
Language::new(
LanguageConfig {
name: "Rust".into(),
matcher: LanguageMatcher {
path_suffixes: vec!["rs".to_string()],
..Default::default()
},
..Default::default()
},
Some(tree_sitter_rust::LANGUAGE.into()),
)
.with_outline_query(
"
(function_item name: (_) @name) @item
(impl_item type: (_) @name) @item
(struct_item name: (_) @name) @item
(field_declaration name: (_) @name) @item
",
)
.unwrap(),
)
}
}
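Setting aside the tree-sitter outline ranges and the min-size/nesting preference, the splitting loop in `chunk_text_with_syntactic_ranges` reduces to: walk the line-end offsets, extend the current chunk while it stays under the maximum, emit it otherwise, and hard-split any single line longer than the maximum. A simplified Python sketch of just that size logic (`chunk_text_simple` is a hypothetical helper, not the Rust API, and it works on character offsets rather than bytes):

```python
def chunk_text_simple(text, max_size):
    """Split text into (start, end) ranges of at most max_size characters,
    preferring to cut at line boundaries."""
    # Candidate split points: one past each newline, plus the text end.
    line_ends = [i + 1 for i, c in enumerate(text) if c == "\n"]
    if not text.endswith("\n"):
        line_ends.append(len(text))
    chunks = []
    start, end = 0, 0
    i = 0
    while i < len(line_ends):
        line_end = line_ends[i]
        if line_end - start > max_size:
            if end == start:  # a single line exceeds max_size: hard split
                end = min(start + max_size, line_end)
            chunks.append((start, end))
            start = end
            continue  # re-test the same line end against the new start
        end = line_end
        i += 1
    if end > start:
        chunks.append((start, end))
    return chunks
```

Like the Rust loop, the `continue` after emitting a chunk re-examines the same line end against the fresh chunk start, which is what makes over-long lines fall through to the hard-split branch repeatedly.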


@@ -1,134 +0,0 @@
mod lmstudio;
mod ollama;
mod open_ai;
pub use lmstudio::*;
pub use ollama::*;
pub use open_ai::*;
use sha2::{Digest, Sha256};
use anyhow::Result;
use futures::{FutureExt, future::BoxFuture};
use serde::{Deserialize, Serialize};
use std::{fmt, future};
/// Trait for embedding providers. Texts in, vectors out.
pub trait EmbeddingProvider: Sync + Send {
fn embed<'a>(&'a self, texts: &'a [TextToEmbed<'a>]) -> BoxFuture<'a, Result<Vec<Embedding>>>;
fn batch_size(&self) -> usize;
}
#[derive(Debug, Default, Clone, PartialEq, Serialize, Deserialize)]
pub struct Embedding(Vec<f32>);
impl Embedding {
pub fn new(mut embedding: Vec<f32>) -> Self {
let len = embedding.len();
let mut norm = 0f32;
for i in 0..len {
norm += embedding[i] * embedding[i];
}
norm = norm.sqrt();
for dimension in &mut embedding {
*dimension /= norm;
}
Self(embedding)
}
fn len(&self) -> usize {
self.0.len()
}
pub fn similarity(&self, others: &[Embedding]) -> (f32, usize) {
debug_assert!(others.iter().all(|other| self.0.len() == other.0.len()));
others
.iter()
.enumerate()
.map(|(index, other)| {
let dot_product: f32 = self
.0
.iter()
.copied()
.zip(other.0.iter().copied())
.map(|(a, b)| a * b)
.sum();
(dot_product, index)
})
.max_by(|a, b| a.0.partial_cmp(&b.0).unwrap_or(std::cmp::Ordering::Equal))
.unwrap_or((0.0, 0))
}
}
impl fmt::Display for Embedding {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
let digits_to_display = 3;
// Start the Embedding display format
write!(f, "Embedding(sized: {}; values: [", self.len())?;
for (index, value) in self.0.iter().enumerate().take(digits_to_display) {
// Lead with comma if not the first element
if index != 0 {
write!(f, ", ")?;
}
write!(f, "{:.3}", value)?;
}
if self.len() > digits_to_display {
write!(f, "...")?;
}
write!(f, "])")
}
}
#[derive(Debug)]
pub struct TextToEmbed<'a> {
pub text: &'a str,
pub digest: [u8; 32],
}
impl<'a> TextToEmbed<'a> {
pub fn new(text: &'a str) -> Self {
let digest = Sha256::digest(text.as_bytes());
Self {
text,
digest: digest.into(),
}
}
}
pub struct FakeEmbeddingProvider;
impl EmbeddingProvider for FakeEmbeddingProvider {
fn embed<'a>(&'a self, texts: &'a [TextToEmbed<'a>]) -> BoxFuture<'a, Result<Vec<Embedding>>> {
let embeddings = texts
.iter()
.map(|_text| {
let mut embedding = vec![0f32; 1536];
for i in 0..embedding.len() {
embedding[i] = i as f32;
}
Embedding::new(embedding)
})
.collect();
future::ready(Ok(embeddings)).boxed()
}
fn batch_size(&self) -> usize {
16
}
}
#[cfg(test)]
mod test {
use super::*;
#[gpui::test]
fn test_normalize_embedding() {
let normalized = Embedding::new(vec![1.0, 1.0, 1.0]);
let value: f32 = 1.0 / 3.0_f32.sqrt();
assert_eq!(normalized, Embedding(vec![value; 3]));
}
}
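Because `Embedding::new` L2-normalizes every vector, the plain dot product in `similarity` is exactly cosine similarity. The same math in a short Python sketch (`normalize` and `most_similar` are illustrative names mirroring the Rust methods):

```python
import math

def normalize(v):
    """Scale v to unit length (L2 norm), as Embedding::new does."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def most_similar(query, others):
    """Return (best_score, best_index) by dot product over unit vectors."""
    scores = [sum(a * b for a, b in zip(query, other)) for other in others]
    best = max(range(len(scores)), key=lambda i: scores[i])
    return scores[best], best

q = normalize([1.0, 1.0, 1.0])          # each component becomes 1/sqrt(3)
candidates = [normalize([1.0, 1.0, 1.0]), normalize([1.0, 0.0, 0.0])]
score, index = most_similar(q, candidates)
```

An identical vector scores 1.0, which is why the Rust test asserts that normalizing `[1.0, 1.0, 1.0]` yields `1/sqrt(3)` in every dimension.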


@@ -1,70 +0,0 @@
use anyhow::{Context as _, Result};
use futures::{AsyncReadExt as _, FutureExt, future::BoxFuture};
use http_client::HttpClient;
use serde::{Deserialize, Serialize};
use std::sync::Arc;
use crate::{Embedding, EmbeddingProvider, TextToEmbed};
pub enum LmStudioEmbeddingModel {
NomicEmbedText,
}
pub struct LmStudioEmbeddingProvider {
client: Arc<dyn HttpClient>,
model: LmStudioEmbeddingModel,
}
#[derive(Serialize)]
struct LmStudioEmbeddingRequest {
model: String,
prompt: String,
}
#[derive(Deserialize)]
struct LmStudioEmbeddingResponse {
embedding: Vec<f32>,
}
impl LmStudioEmbeddingProvider {
pub fn new(client: Arc<dyn HttpClient>, model: LmStudioEmbeddingModel) -> Self {
Self { client, model }
}
}
impl EmbeddingProvider for LmStudioEmbeddingProvider {
fn embed<'a>(&'a self, texts: &'a [TextToEmbed<'a>]) -> BoxFuture<'a, Result<Vec<Embedding>>> {
let model = match self.model {
LmStudioEmbeddingModel::NomicEmbedText => "nomic-embed-text",
};
futures::future::try_join_all(texts.iter().map(|to_embed| {
let request = LmStudioEmbeddingRequest {
model: model.to_string(),
prompt: to_embed.text.to_string(),
};
let request = serde_json::to_string(&request).unwrap();
async {
let response = self
.client
.post_json("http://localhost:1234/api/v0/embeddings", request.into())
.await?;
let mut body = String::new();
response.into_body().read_to_string(&mut body).await?;
let response: LmStudioEmbeddingResponse =
serde_json::from_str(&body).context("Unable to parse response")?;
Ok(Embedding::new(response.embedding))
}
}))
.boxed()
}
fn batch_size(&self) -> usize {
256
}
}
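The wire format here is one prompt per request: the provider serializes a `{model, prompt}` body and reads back a single `embedding` array. A Python sketch of just the serialization and parsing (the URL and model name are the ones hard-coded above; `build_request`/`parse_response` are hypothetical helpers, and no network call or error handling is shown):

```python
import json

def build_request(model, prompt):
    """Body for POST http://localhost:1234/api/v0/embeddings."""
    return json.dumps({"model": model, "prompt": prompt})

def parse_response(body):
    """Extract the embedding vector from the response JSON."""
    return json.loads(body)["embedding"]
```

Since each text becomes its own request, the provider fans out with `try_join_all`; the `batch_size` of 256 only caps how many of these requests are in flight per batch.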


@@ -1,74 +0,0 @@
use anyhow::{Context as _, Result};
use futures::{AsyncReadExt as _, FutureExt, future::BoxFuture};
use http_client::HttpClient;
use serde::{Deserialize, Serialize};
use std::sync::Arc;
use crate::{Embedding, EmbeddingProvider, TextToEmbed};
pub enum OllamaEmbeddingModel {
NomicEmbedText,
MxbaiEmbedLarge,
}
pub struct OllamaEmbeddingProvider {
client: Arc<dyn HttpClient>,
model: OllamaEmbeddingModel,
}
#[derive(Serialize)]
struct OllamaEmbeddingRequest {
model: String,
prompt: String,
}
#[derive(Deserialize)]
struct OllamaEmbeddingResponse {
embedding: Vec<f32>,
}
impl OllamaEmbeddingProvider {
pub fn new(client: Arc<dyn HttpClient>, model: OllamaEmbeddingModel) -> Self {
Self { client, model }
}
}
impl EmbeddingProvider for OllamaEmbeddingProvider {
fn embed<'a>(&'a self, texts: &'a [TextToEmbed<'a>]) -> BoxFuture<'a, Result<Vec<Embedding>>> {
//
let model = match self.model {
OllamaEmbeddingModel::NomicEmbedText => "nomic-embed-text",
OllamaEmbeddingModel::MxbaiEmbedLarge => "mxbai-embed-large",
};
futures::future::try_join_all(texts.iter().map(|to_embed| {
let request = OllamaEmbeddingRequest {
model: model.to_string(),
prompt: to_embed.text.to_string(),
};
let request = serde_json::to_string(&request).unwrap();
async {
let response = self
.client
.post_json("http://localhost:11434/api/embeddings", request.into())
.await?;
let mut body = String::new();
response.into_body().read_to_string(&mut body).await?;
let response: OllamaEmbeddingResponse =
serde_json::from_str(&body).context("Unable to parse response")?;
Ok(Embedding::new(response.embedding))
}
}))
.boxed()
}
fn batch_size(&self) -> usize {
// TODO: Figure out decent value
10
}
}


@@ -1,55 +0,0 @@
use crate::{Embedding, EmbeddingProvider, TextToEmbed};
use anyhow::Result;
use futures::{FutureExt, future::BoxFuture};
use http_client::HttpClient;
pub use open_ai::OpenAiEmbeddingModel;
use std::sync::Arc;
pub struct OpenAiEmbeddingProvider {
client: Arc<dyn HttpClient>,
model: OpenAiEmbeddingModel,
api_url: String,
api_key: String,
}
impl OpenAiEmbeddingProvider {
pub fn new(
client: Arc<dyn HttpClient>,
model: OpenAiEmbeddingModel,
api_url: String,
api_key: String,
) -> Self {
Self {
client,
model,
api_url,
api_key,
}
}
}
impl EmbeddingProvider for OpenAiEmbeddingProvider {
fn embed<'a>(&'a self, texts: &'a [TextToEmbed<'a>]) -> BoxFuture<'a, Result<Vec<Embedding>>> {
let embed = open_ai::embed(
self.client.as_ref(),
&self.api_url,
&self.api_key,
self.model,
texts.iter().map(|to_embed| to_embed.text),
);
async move {
let response = embed.await?;
Ok(response
.data
.into_iter()
.map(|data| Embedding::new(data.embedding))
.collect())
}
.boxed()
}
fn batch_size(&self) -> usize {
// From https://platform.openai.com/docs/api-reference/embeddings/create
2048
}
}


@@ -1,470 +0,0 @@
use crate::{
chunking::{self, Chunk},
embedding::{Embedding, EmbeddingProvider, TextToEmbed},
indexing::{IndexingEntryHandle, IndexingEntrySet},
};
use anyhow::{Context as _, Result};
use collections::Bound;
use feature_flags::FeatureFlagAppExt;
use fs::Fs;
use fs::MTime;
use futures::{FutureExt as _, stream::StreamExt};
use futures_batch::ChunksTimeoutStreamExt;
use gpui::{App, AppContext as _, Entity, Task};
use heed::types::{SerdeBincode, Str};
use language::LanguageRegistry;
use log;
use project::{Entry, UpdatedEntriesSet, Worktree};
use serde::{Deserialize, Serialize};
use smol::channel;
use std::{cmp::Ordering, future::Future, iter, path::Path, pin::pin, sync::Arc, time::Duration};
use util::ResultExt;
use worktree::Snapshot;
pub struct EmbeddingIndex {
worktree: Entity<Worktree>,
db_connection: heed::Env,
db: heed::Database<Str, SerdeBincode<EmbeddedFile>>,
fs: Arc<dyn Fs>,
language_registry: Arc<LanguageRegistry>,
embedding_provider: Arc<dyn EmbeddingProvider>,
entry_ids_being_indexed: Arc<IndexingEntrySet>,
}
impl EmbeddingIndex {
pub fn new(
worktree: Entity<Worktree>,
fs: Arc<dyn Fs>,
db_connection: heed::Env,
embedding_db: heed::Database<Str, SerdeBincode<EmbeddedFile>>,
language_registry: Arc<LanguageRegistry>,
embedding_provider: Arc<dyn EmbeddingProvider>,
entry_ids_being_indexed: Arc<IndexingEntrySet>,
) -> Self {
Self {
worktree,
fs,
db_connection,
db: embedding_db,
language_registry,
embedding_provider,
entry_ids_being_indexed,
}
}
pub fn db(&self) -> &heed::Database<Str, SerdeBincode<EmbeddedFile>> {
&self.db
}
pub fn index_entries_changed_on_disk(
&self,
cx: &App,
) -> impl Future<Output = Result<()>> + use<> {
if !cx.is_staff() {
return async move { Ok(()) }.boxed();
}
let worktree = self.worktree.read(cx).snapshot();
let worktree_abs_path = worktree.abs_path().clone();
let scan = self.scan_entries(worktree, cx);
let chunk = self.chunk_files(worktree_abs_path, scan.updated_entries, cx);
let embed = Self::embed_files(self.embedding_provider.clone(), chunk.files, cx);
let persist = self.persist_embeddings(scan.deleted_entry_ranges, embed.files, cx);
async move {
futures::try_join!(scan.task, chunk.task, embed.task, persist)?;
Ok(())
}
.boxed()
}
pub fn index_updated_entries(
&self,
updated_entries: UpdatedEntriesSet,
cx: &App,
) -> impl Future<Output = Result<()>> + use<> {
if !cx.is_staff() {
return async move { Ok(()) }.boxed();
}
let worktree = self.worktree.read(cx).snapshot();
let worktree_abs_path = worktree.abs_path().clone();
let scan = self.scan_updated_entries(worktree, updated_entries, cx);
let chunk = self.chunk_files(worktree_abs_path, scan.updated_entries, cx);
let embed = Self::embed_files(self.embedding_provider.clone(), chunk.files, cx);
let persist = self.persist_embeddings(scan.deleted_entry_ranges, embed.files, cx);
async move {
futures::try_join!(scan.task, chunk.task, embed.task, persist)?;
Ok(())
}
.boxed()
}
fn scan_entries(&self, worktree: Snapshot, cx: &App) -> ScanEntries {
let (updated_entries_tx, updated_entries_rx) = channel::bounded(512);
let (deleted_entry_ranges_tx, deleted_entry_ranges_rx) = channel::bounded(128);
let db_connection = self.db_connection.clone();
let db = self.db;
let entries_being_indexed = self.entry_ids_being_indexed.clone();
let task = cx.background_spawn(async move {
let txn = db_connection
.read_txn()
.context("failed to create read transaction")?;
let mut db_entries = db
.iter(&txn)
.context("failed to create iterator")?
.move_between_keys()
.peekable();
let mut deletion_range: Option<(Bound<&str>, Bound<&str>)> = None;
for entry in worktree.files(false, 0) {
log::trace!("scanning for embedding index: {:?}", &entry.path);
let entry_db_key = db_key_for_path(&entry.path);
let mut saved_mtime = None;
while let Some(db_entry) = db_entries.peek() {
match db_entry {
Ok((db_path, db_embedded_file)) => match (*db_path).cmp(&entry_db_key) {
Ordering::Less => {
if let Some(deletion_range) = deletion_range.as_mut() {
deletion_range.1 = Bound::Included(db_path);
} else {
deletion_range =
Some((Bound::Included(db_path), Bound::Included(db_path)));
}
db_entries.next();
}
Ordering::Equal => {
if let Some(deletion_range) = deletion_range.take() {
deleted_entry_ranges_tx
.send((
deletion_range.0.map(ToString::to_string),
deletion_range.1.map(ToString::to_string),
))
.await?;
}
saved_mtime = db_embedded_file.mtime;
db_entries.next();
break;
}
Ordering::Greater => {
break;
}
},
Err(_) => return Err(db_entries.next().unwrap().unwrap_err())?,
}
}
if entry.mtime != saved_mtime {
let handle = entries_being_indexed.insert(entry.id);
updated_entries_tx.send((entry.clone(), handle)).await?;
}
}
if let Some(db_entry) = db_entries.next() {
let (db_path, _) = db_entry?;
deleted_entry_ranges_tx
.send((Bound::Included(db_path.to_string()), Bound::Unbounded))
.await?;
}
Ok(())
});
ScanEntries {
updated_entries: updated_entries_rx,
deleted_entry_ranges: deleted_entry_ranges_rx,
task,
}
}
fn scan_updated_entries(
&self,
worktree: Snapshot,
updated_entries: UpdatedEntriesSet,
cx: &App,
) -> ScanEntries {
let (updated_entries_tx, updated_entries_rx) = channel::bounded(512);
let (deleted_entry_ranges_tx, deleted_entry_ranges_rx) = channel::bounded(128);
let entries_being_indexed = self.entry_ids_being_indexed.clone();
let task = cx.background_spawn(async move {
for (path, entry_id, status) in updated_entries.iter() {
match status {
project::PathChange::Added
| project::PathChange::Updated
| project::PathChange::AddedOrUpdated => {
if let Some(entry) = worktree.entry_for_id(*entry_id)
&& entry.is_file()
{
let handle = entries_being_indexed.insert(entry.id);
updated_entries_tx.send((entry.clone(), handle)).await?;
}
}
project::PathChange::Removed => {
let db_path = db_key_for_path(path);
deleted_entry_ranges_tx
.send((Bound::Included(db_path.clone()), Bound::Included(db_path)))
.await?;
}
project::PathChange::Loaded => {
// Do nothing.
}
}
}
Ok(())
});
ScanEntries {
updated_entries: updated_entries_rx,
deleted_entry_ranges: deleted_entry_ranges_rx,
task,
}
}
fn chunk_files(
&self,
worktree_abs_path: Arc<Path>,
entries: channel::Receiver<(Entry, IndexingEntryHandle)>,
cx: &App,
) -> ChunkFiles {
let language_registry = self.language_registry.clone();
let fs = self.fs.clone();
let (chunked_files_tx, chunked_files_rx) = channel::bounded(2048);
let task = cx.spawn(async move |cx| {
cx.background_executor()
.scoped(|cx| {
for _ in 0..cx.num_cpus() {
cx.spawn(async {
while let Ok((entry, handle)) = entries.recv().await {
let entry_abs_path = worktree_abs_path.join(&entry.path);
if let Some(text) = fs.load(&entry_abs_path).await.ok() {
let language = language_registry
.language_for_file_path(&entry.path)
.await
.ok();
let chunked_file = ChunkedFile {
chunks: chunking::chunk_text(
&text,
language.as_ref(),
&entry.path,
),
handle,
path: entry.path,
mtime: entry.mtime,
text,
};
if chunked_files_tx.send(chunked_file).await.is_err() {
return;
}
}
}
});
}
})
.await;
Ok(())
});
ChunkFiles {
files: chunked_files_rx,
task,
}
}
pub fn embed_files(
embedding_provider: Arc<dyn EmbeddingProvider>,
chunked_files: channel::Receiver<ChunkedFile>,
cx: &App,
) -> EmbedFiles {
let embedding_provider = embedding_provider.clone();
let (embedded_files_tx, embedded_files_rx) = channel::bounded(512);
let task = cx.background_spawn(async move {
let mut chunked_file_batches =
pin!(chunked_files.chunks_timeout(512, Duration::from_secs(2)));
while let Some(chunked_files) = chunked_file_batches.next().await {
// View the batch of files as a vec of chunks
// Flatten out to a vec of chunks that we can subdivide into batch sized pieces
// Once those are done, reassemble them back into the files in which they belong
// If any embeddings fail for a file, the entire file is discarded
let chunks: Vec<TextToEmbed> = chunked_files
.iter()
.flat_map(|file| {
file.chunks.iter().map(|chunk| TextToEmbed {
text: &file.text[chunk.range.clone()],
digest: chunk.digest,
})
})
.collect::<Vec<_>>();
let mut embeddings: Vec<Option<Embedding>> = Vec::new();
for embedding_batch in chunks.chunks(embedding_provider.batch_size()) {
if let Some(batch_embeddings) =
embedding_provider.embed(embedding_batch).await.log_err()
{
if batch_embeddings.len() == embedding_batch.len() {
embeddings.extend(batch_embeddings.into_iter().map(Some));
continue;
}
log::error!(
"embedding provider returned unexpected embedding count {}, expected {}",
batch_embeddings.len(), embedding_batch.len()
);
}
embeddings.extend(iter::repeat(None).take(embedding_batch.len()));
}
let mut embeddings = embeddings.into_iter();
for chunked_file in chunked_files {
let mut embedded_file = EmbeddedFile {
path: chunked_file.path,
mtime: chunked_file.mtime,
chunks: Vec::new(),
};
let mut embedded_all_chunks = true;
for (chunk, embedding) in
chunked_file.chunks.into_iter().zip(embeddings.by_ref())
{
if let Some(embedding) = embedding {
embedded_file
.chunks
.push(EmbeddedChunk { chunk, embedding });
} else {
embedded_all_chunks = false;
}
}
if embedded_all_chunks {
embedded_files_tx
.send((embedded_file, chunked_file.handle))
.await?;
}
}
}
Ok(())
});
EmbedFiles {
files: embedded_files_rx,
task,
}
}
fn persist_embeddings(
&self,
deleted_entry_ranges: channel::Receiver<(Bound<String>, Bound<String>)>,
embedded_files: channel::Receiver<(EmbeddedFile, IndexingEntryHandle)>,
cx: &App,
) -> Task<Result<()>> {
let db_connection = self.db_connection.clone();
let db = self.db;
cx.background_spawn(async move {
let mut deleted_entry_ranges = pin!(deleted_entry_ranges);
let mut embedded_files = pin!(embedded_files);
loop {
// Interleave deletions and persists of embedded files
futures::select_biased! {
deletion_range = deleted_entry_ranges.next() => {
if let Some(deletion_range) = deletion_range {
let mut txn = db_connection.write_txn()?;
let start = deletion_range.0.as_ref().map(|start| start.as_str());
let end = deletion_range.1.as_ref().map(|end| end.as_str());
log::debug!("deleting embeddings in range {:?}", &(start, end));
db.delete_range(&mut txn, &(start, end))?;
txn.commit()?;
}
},
file = embedded_files.next() => {
if let Some((file, _)) = file {
let mut txn = db_connection.write_txn()?;
log::debug!("saving embedding for file {:?}", file.path);
let key = db_key_for_path(&file.path);
db.put(&mut txn, &key, &file)?;
txn.commit()?;
}
},
complete => break,
}
}
Ok(())
})
}
pub fn paths(&self, cx: &App) -> Task<Result<Vec<Arc<Path>>>> {
let connection = self.db_connection.clone();
let db = self.db;
cx.background_spawn(async move {
let tx = connection
.read_txn()
.context("failed to create read transaction")?;
let result = db
.iter(&tx)?
.map(|entry| Ok(entry?.1.path))
.collect::<Result<Vec<Arc<Path>>>>();
drop(tx);
result
})
}
pub fn chunks_for_path(&self, path: Arc<Path>, cx: &App) -> Task<Result<Vec<EmbeddedChunk>>> {
let connection = self.db_connection.clone();
let db = self.db;
cx.background_spawn(async move {
let tx = connection
.read_txn()
.context("failed to create read transaction")?;
Ok(db
.get(&tx, &db_key_for_path(&path))?
.context("no such path")?
.chunks)
})
}
}
struct ScanEntries {
updated_entries: channel::Receiver<(Entry, IndexingEntryHandle)>,
deleted_entry_ranges: channel::Receiver<(Bound<String>, Bound<String>)>,
task: Task<Result<()>>,
}
struct ChunkFiles {
files: channel::Receiver<ChunkedFile>,
task: Task<Result<()>>,
}
pub struct ChunkedFile {
pub path: Arc<Path>,
pub mtime: Option<MTime>,
pub handle: IndexingEntryHandle,
pub text: String,
pub chunks: Vec<Chunk>,
}
pub struct EmbedFiles {
pub files: channel::Receiver<(EmbeddedFile, IndexingEntryHandle)>,
pub task: Task<Result<()>>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct EmbeddedFile {
pub path: Arc<Path>,
pub mtime: Option<MTime>,
pub chunks: Vec<EmbeddedChunk>,
}
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct EmbeddedChunk {
pub chunk: Chunk,
pub embedding: Embedding,
}
fn db_key_for_path(path: &Arc<Path>) -> String {
path.to_string_lossy().replace('/', "\0")
}
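Replacing `/` with NUL in `db_key_for_path` matters for the lexicographic key order the scan relies on: because `\0` sorts before every printable character, all keys under a directory form a contiguous range that no sibling path sharing the prefix (like `src.bak` next to `src/`) can interleave with, so directory deletions can be expressed as the range deletes seen in `persist_embeddings`. A quick Python illustration with hypothetical paths:

```python
def db_key_for_path(path):
    """Mirror of the Rust helper: replace '/' with NUL in the key."""
    return path.replace("/", "\0")

paths = ["src.bak", "src/main.rs", "src/lib.rs", "src-old/x.rs"]
keyed = sorted(paths, key=db_key_for_path)
# The two files under src/ now sort together, ahead of prefix-sharing siblings.
```

With raw path strings, `src.bak` and `src-old/...` would sort among `src/`'s children, because `-` and `.` compare below `/` in ASCII.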

Some files were not shown because too many files have changed in this diff.