Compare commits


36 Commits

Author SHA1 Message Date
Richard Feldman
97c1abbeae Improve docs for max image sizes 2025-07-02 16:23:20 -04:00
Richard Feldman
fe905f1fd2 wip 2025-07-02 11:06:07 -04:00
Richard Feldman
4c442a66b0 Reject images that are too big 2025-06-30 18:27:17 -04:00
Richard Feldman
f38c6e9bd0 Introduce max_image_size 2025-06-30 16:44:02 -04:00
Conrad Irwin
a2e786e0f9 Allow repeat in visual mode (#33569)
Release Notes:

- vim: Allow `.` in visual mode.
2025-06-30 14:04:28 -06:00
Alejandro Fernández Gómez
b0086b472f Fix an interaction between vim's linewise yank and editor's paste (#33555)
Closes #32397

This PR fixes an issue when pasting text with the `editor::Paste`
command that was copied with `vim::Yank`'s linewise selection.

The change stops setting the `is_entire_line` flag when copying with
vim linewise selections (<kbd>⇧v</kbd>) and motions (e.g.
<kbd>y2j</kbd>).

This flag is set when cutting/copying text without a selection (i.e.
place the cursor on a line without selecting anything and press
<kbd>⌘X</kbd>). When cutting/copying text in this manner, [the editor
pastes the text above the
cursor](36941253ee/crates/editor/src/editor.rs (L11936-L11947)).
However, this behaviour is not needed when cutting/copying with vim
motions.

Pasting with vim operations is not affected by this change. [They are
handled
elsewhere](36941253ee/crates/vim/src/normal/paste.rs)
and they don't consider the `is_entire_line` flag at all.
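The rule this PR establishes can be sketched as follows (the enum and function here are illustrative, not Zed's actual clipboard types): only a plain editor cut/copy with no selection marks the text as an entire line.

```rust
// Illustrative sketch of the fix: the `is_entire_line` flag is only set
// by plain editor cuts/copies with no selection, never by vim linewise
// yanks (these types are hypothetical, not Zed's real representation).
#[derive(Clone, Copy)]
enum CopySource {
    /// e.g. ⌘X with just a cursor on a line, no selection
    EditorNoSelection,
    /// e.g. ⇧V then y, or a motion like y2j
    VimLinewiseYank,
}

fn is_entire_line(source: CopySource) -> bool {
    match source {
        // `editor::Paste` inserts such text above the cursor's line.
        CopySource::EditorNoSelection => true,
        // Vim paste commands handle linewise pastes themselves and
        // never read this flag, so it stays unset.
        CopySource::VimLinewiseYank => false,
    }
}
```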

Note for maintainers: I'm not familiar with this codebase 🙃. This change
fixes the issue. I don't see anything breaking... but let me know if
it's not the case and a more thorough change is needed.

**Before:**

The text is pasted above the first line, before the cursor.


https://github.com/user-attachments/assets/0c2f111a-5da0-4775-a7a0-2e4fb6f78bfc


**After:**
The text is pasted at the cursor location:


https://github.com/user-attachments/assets/60a17985-fe8b-4149-a77b-d72bf531bf85


Release Notes:

- Fixed an issue when pasting text that was yanked with vim's linewise
selections.
2025-06-30 14:03:55 -06:00
fantacell
d10cc13924 helix: Add more tests (#33582)
These tests cover more edge cases

Release Notes:

- N/A
2025-06-30 13:57:20 -06:00
Alvaro Parker
2680a78f9c Support vim-mode in git commit editor (#33222)
Release Notes:

- Added support for vim-mode in the git commit editor (modal included)

Side note:
- Maybe in the future (or even in this PR) a config could be added to
let the user choose whether to enable vim-mode in this editor, and in
the agent message editor as well.
2025-06-30 13:55:45 -06:00
Kirill Bulatov
197828980c Properly register initialized default prettier (#33669)
Stop doing useless prettier-related work when doing a project search.

Before, project search might cause

<img width="1728" alt="not_pretty"
src="https://github.com/user-attachments/assets/5f8b935f-962d-488e-984f-50dfbaee97ba"
/>

but now we debounce the prettier-related task first and actually set
the "installed" state for the default prettier when no install is
needed.

Release Notes:

- N/A
2025-06-30 19:08:50 +00:00
Conrad Taylor
7c4da37322 emmet: Fix expansion for HEEx and H sigil files (#32208)
Closes #14149

Release Notes:

- Added support for the Emmet LSP in Elixir heex files
2025-06-30 12:45:10 -04:00
Julia Ryan
ce164f5e65 Remove ruby debug adapter (#33541)
Now that the extension version has been bumped we can remove our in-tree
one to avoid having duplicate debug adapters.

Release Notes:

- The ruby debug adapter has been moved to the [ruby
extension](https://github.com/zed-extensions/ruby), if you have any
saved debug scenarios you'll need to change `"adapter": "Ruby"` to
`"adapter": "rdbg"`.
2025-06-30 09:15:56 -07:00
Piotr Osiewicz
42c59014a9 debugger: Fix global debug tasks not being picked up (#33664)
Release Notes:

- Fixed a bug which caused global debug scenarios (from global
.zed/debug.json) to not be picked up.
2025-06-30 15:53:34 +00:00
Danilo Leal
3db452eec7 agent: Use a banner for the auto-retry message (#33661)
Follow-up to https://github.com/zed-industries/zed/pull/33275 so we use
the Banner component to display the auto-retry messages in the thread.

Release Notes:

- N/A
2025-06-30 15:34:28 +00:00
Kirill Bulatov
6e77e8405b Revert "languages: Bump ESLint LSP server to version 3.0.10 (#32717)" (#33659)
This reverts commit 1edaeebae5.

Given the elevated number of ESLint-related issues, we are reverting the
upgrade.
Many people upvoted the issues without sharing repro details, so we
cannot be certain what exactly is broken; the breakage seems relatively
generic and related to *.ts ESLint configs.

Checked the revert on 2 projects from the issues below:

Closes https://github.com/zed-industries/zed/issues/33425

With https://github.com/adamhl8/zed-33425 as an example repo: there,
both eslint configurations worked for me when I stopped Zed and opened a
project.
Somehow, after switching between Zed builds with different vscode-eslint
package versions, I eventually get an error like
`Error: Cannot find module
'~/.local/share/zed/languages/eslint/vscode-eslint-3.0.10/vscode-eslint/server/out/eslintServer.js'`.

This is not closely related to issues with the newer vscode-eslint
integration, but it is worth mentioning since it involves the package updates.


Closes https://github.com/zed-industries/zed/issues/33648

https://github.com/florian-lackner365/zed-eslint-bug is a good example
monorepo project.
The monorepo part seems unrelated, but `eslint.config.js` is somehow
involved, as the newer vscode-eslint fails to find a config.
It works well with the older vscode-eslint.

Release Notes:

- Downgraded to vscode-eslint-2.4.4 as the ESLint language server
2025-06-30 15:19:00 +00:00
Alejandro Fernández Gómez
465f64da7e Make the preview button the same as the other buttons (#33658)
This fixes a tiny visual defect I noticed today. The "Preview" button is
slightly smaller and has less padding than the other buttons in the
quick action bar.

**Before:**

Note how there is a small gap between the black guides and the button.


https://github.com/user-attachments/assets/04d3d83a-9193-47b1-80d8-94a5d1fbd750

**After:**


https://github.com/user-attachments/assets/98f878cc-c5e3-491c-abe9-9ef0d5cf678a



Release Notes:

- N/A
2025-06-30 15:16:01 +00:00
Piotr Osiewicz
e5a8cc7aab debugger: Fix DAP Logs mangling sessions across multiple Zed windows (#33656)
Release Notes:

- Fixed an issue with Debug Adapter log showing sessions from other Zed
windows in the dropdown.
2025-06-30 15:01:54 +00:00
Peter Tripp
bdf29bf76f Allow disabling tools when 'enable_all_context_servers = true' (#33536)
Closes https://github.com/zed-industries/zed/issues/33519

Release Notes:

- agent: Improved support for explicitly disabling individual tools when
`enable_all_context_servers` is true (e.g. enable all tools except
XYZ).
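The lookup order this gives can be sketched like so (a minimal model with hypothetical names, not Zed's real settings types): an explicit per-tool entry always wins, and only when no entry exists does the global flag decide.

```rust
use std::collections::HashMap;

// Illustrative precedence sketch: a per-tool setting overrides
// `enable_all_context_servers`; the global flag is only the fallback.
fn tool_enabled(
    tools: &HashMap<String, bool>,
    name: &str,
    enable_all_context_servers: bool,
) -> bool {
    tools
        .get(name)
        .copied()
        // No explicit entry for this tool: defer to the global flag.
        .unwrap_or(enable_all_context_servers)
}
```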
2025-06-30 10:56:25 -04:00
Danilo Leal
402c61c00d Add small UI tweak to the inline color preview square (#33655)
Follow-up to https://github.com/zed-industries/zed/pull/33605 so it is
just a bit more subtle and smaller.

Release Notes:

- N/A
2025-06-30 11:19:58 -03:00
Mikal Sande
59e88ce82b Show regex query error under the search bar (#33638)
Closes #17223

Release Notes:

- Show regex parsing errors under the search bar for buffer and project
search.

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-06-30 11:05:33 -03:00
Peter Tripp
22ab4c53d1 R docs: Remove non-working configuration (#33654)
This config was meant to be commented out in #33594 because it does not
work.

Release Notes:

- N/A
2025-06-30 14:03:09 +00:00
Danilo Leal
f106ea7641 docs: Update custom MCP format template (#33649)
To match the new format added in
https://github.com/zed-industries/zed/pull/33539.

Release Notes:

- N/A
2025-06-30 10:42:38 -03:00
Kirill Bulatov
e37ef2a991 Use more generic error messages in gpui (#33651)
Follow-up of https://github.com/zed-industries/zed/pull/32537

Release Notes:

- N/A
2025-06-30 13:40:31 +00:00
Danilo Leal
1c05062482 agent: Always focus the active model in the picker (#33567)
Release Notes:

- agent: Improved the model selector by ensuring the active model is
always focused on open.
2025-06-30 10:32:27 -03:00
Piotr Osiewicz
8c04f12499 debugger: Tighten up breakpoint list (#33645)
Release Notes:

- N/A
2025-06-30 14:49:09 +02:00
Bennet Bo Fenner
aa7ccecc49 agent: Reduce log spam for context servers (#33644)
Previously we would always run `maintain_servers` even if the settings
did not change. While this would not cause any MCP servers to restart,
we would still go through all configured servers and call the
`command(...)` function on each installed MCP extension. This can cause
lots of logs to show up when an MCP server is not configured correctly.
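The idea of the fix can be sketched minimally (types here are hypothetical): compare the old and new settings and only run server maintenance when they actually differ.

```rust
// Illustrative sketch of the log-spam fix: skip `maintain_servers`-style
// work when the observed settings are unchanged (hypothetical types,
// not Zed's real context-server settings).
#[derive(PartialEq, Clone)]
struct ContextServerSettings {
    servers: Vec<String>,
}

fn should_maintain(old: &ContextServerSettings, new: &ContextServerSettings) -> bool {
    // Previously this was effectively `true` unconditionally, which
    // invoked `command(...)` on every installed MCP extension each time.
    old != new
}
```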

Release Notes:

- N/A
2025-06-30 10:26:14 +00:00
Umesh Yadav
f4aeeda2d9 script: Fix license symlink and path in new-crate.sh (#33620)
While creating a new crate, I realised the license symlink and path were
broken: the LICENSE-GPL symlink pointed to the wrong target, and the
file created in the new crate did not use the file name expected by the
check-license script, which therefore failed. I fixed that as well.

Release Notes:

- N/A

Signed-off-by: Umesh Yadav <git@umesh.dev>
2025-06-30 09:51:58 +00:00
Bennet Bo Fenner
ca0bd53bed agent: Fix an issue with messages containing trailing whitespace (#33643)
Seeing this come up in our server logs when sending requests to
Anthropic: `final assistant content cannot end with trailing
whitespace`.
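A minimal sketch of the sanitization, assuming the fix trims trailing whitespace from each text segment and drops segments that become empty (the function name is illustrative, not Zed's actual API):

```rust
// Illustrative sketch: Anthropic rejects a final assistant message that
// ends with whitespace, so trim each text segment's trailing whitespace
// before adding it to the request, skipping segments left empty.
fn sanitize_assistant_text(text: &str) -> Option<&str> {
    let trimmed = text.trim_end();
    if trimmed.is_empty() {
        None // nothing left to send for this segment
    } else {
        Some(trimmed)
    }
}
```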


Release Notes:

- agent: Fixed an issue where Anthropic requests would sometimes fail
because of malformed assistant messages
2025-06-30 09:31:40 +00:00
Kirill Bulatov
ae6237178c Further improve color inlay hints in multi buffers (#33642)
Follow-up of https://github.com/zed-industries/zed/pull/33605

Release Notes:

- N/A
2025-06-30 09:18:43 +00:00
Bennet Bo Fenner
ac3328adb6 agent: Fix issue where web search could return 401 (#33639)
Closes #33524

Release Notes:

- agent: Fixed an issue where performing a web search request would
sometimes fail
2025-06-30 11:12:12 +02:00
Bennet Bo Fenner
d63909c598 agent: Use standardized MCP configuration format in settings (#33539)
Changes our MCP settings from:

```json
{
  "context_servers": {
    "some-mcp-server": {
      "source": "custom",
      "command": {
        "path": "npx",
        "args": [
          "-y",
          "@supabase/mcp-server-supabase@latest",
          "--read-only",
          "--project-ref=<project-ref>",
        ],
        "env": {
          "SUPABASE_ACCESS_TOKEN": "<personal-access-token>",
        },
      },
    },
  },
}

```

to:
```json
{
  "context_servers": {
    "some-mcp-server": {
      "source": "custom",
      "command": "npx",
      "args": [
        "-y",
        "@supabase/mcp-server-supabase@latest",
        "--read-only",
        "--project-ref=<project-ref>",
      ],
      "env": {
        "SUPABASE_ACCESS_TOKEN": "<personal-access-token>",
      },
    },
  },
}
```


This format seems to be somewhat of a standard now (VSCode, Cursor,
Windsurf, ...).

Release Notes:

- agent: Use standardized format for configuring MCP servers
2025-06-30 08:05:52 +00:00
Danilo Leal
c3d0230f89 docs: Adjust heading sizes (#33628)
Just fine-tuning some heading sizes that were off, particularly h4s and
h5s.

Release Notes:

- N/A
2025-06-29 20:49:28 -03:00
Piotr Osiewicz
bc5927d5af debugger: Fix spec violation with threads request being issued before debug session is initialized (#33627)
Follow-up to #32852. This time we'll check if the debug session is
initialized before querying threads.

Release Notes:

- Fixed Zed's debugger issuing a `threads` request before it is allowed
to do so per the DAP specification.
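The ordering rule can be sketched as a small guard (these types are hypothetical, not Zed's real DAP client): `threads` requests made before the adapter reports `initialized` are deferred and flushed once the event arrives.

```rust
// Illustrative sketch of the spec-compliance guard: per the DAP spec,
// `threads` may only be sent after the `initialized` event, so earlier
// queries are counted and flushed later (hypothetical types).
struct DapSession {
    initialized: bool,
    pending_thread_queries: u32,
}

impl DapSession {
    /// Returns true if the request may be issued now.
    fn request_threads(&mut self) -> bool {
        if self.initialized {
            true
        } else {
            // Defer until the `initialized` event arrives.
            self.pending_thread_queries += 1;
            false
        }
    }

    /// Marks the session initialized; returns how many queries to flush.
    fn on_initialized(&mut self) -> u32 {
        self.initialized = true;
        std::mem::take(&mut self.pending_thread_queries)
    }
}
```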
2025-06-29 23:38:16 +00:00
Piotr Osiewicz
d2cf995e27 debugger: Tweak layout of debug landing page in vertical dock position (#33625)
Release Notes:

- Reorganized the layout of the debug panel without any sessions for a
vertical dock position.
- Moved parent directories of source breakpoints into a tooltip.
2025-06-30 00:48:14 +02:00
Michael Sloan
86161aa427 Use refs to deduplicate settings JSON schema (~1.7mb to ~0.26mb) (#33618)
Release Notes:

- N/A
2025-06-29 18:33:05 +00:00
Peter Tripp
a602b4b305 Improve R documentation (#33594)
Release Notes:

- N/A
2025-06-29 12:21:10 -04:00
Kirill Bulatov
047d515abf Rework color indicators visual representation (#33605)
Use div-based rendering code instead of rendering the indicator as text

Closes https://github.com/zed-industries/zed/discussions/33507

Before:
<img width="410" alt="before_dark"
src="https://github.com/user-attachments/assets/66ad63ae-7836-4dc7-8176-a2ff5a38bcd4"
/>
After:
<img width="407" alt="after_dark"
src="https://github.com/user-attachments/assets/0b627da8-461b-4f19-b236-4a69bf5952a0"
/>


Before:
<img width="409" alt="before_light"
src="https://github.com/user-attachments/assets/ebcfabec-fcda-4b63-aee6-c702888f0db4"
/>
After:
<img width="410" alt="after_light"
src="https://github.com/user-attachments/assets/c0da42a1-d6b3-4e08-a56c-9966c07e442d"
/>

The border is not as high-contrast as in the VSCode examples in the
issue, but I believe I'm using the right color in

1e11de48ee/crates/editor/src/display_map/inlay_map.rs (L357)

based on 


41583fb066/crates/theme/src/styles/colors.rs (L16-L17)

Another oddity is that the border starts to shrink on `cmd-=`
(`zed::IncreaseBufferFontSize`):

<img width="1244" alt="image"
src="https://github.com/user-attachments/assets/f424edc0-ca0c-4b02-96d4-6da7bf70449a"
/>

but that requires adjusting a different part of the code, so it is skipped here.

Tailwind CSS example:

<img width="1108" alt="image"
src="https://github.com/user-attachments/assets/10ada4dc-ea8c-46d3-b285-d895bbd6a619"
/>


Release Notes:

- Reworked the visual representation of color indicators
2025-06-29 09:43:56 +00:00
73 changed files with 3107 additions and 1514 deletions

Cargo.lock generated

@@ -4314,14 +4314,11 @@ dependencies = [
"client",
"collections",
"command_palette_hooks",
"component",
"dap",
"dap_adapters",
"db",
"debugger_tools",
"editor",
"env_logger 0.11.8",
"feature_flags",
"file_icons",
"futures 0.3.31",
"fuzzy",


@@ -210,7 +210,8 @@
"ctrl-w space": "editor::OpenExcerptsSplit",
"ctrl-w g space": "editor::OpenExcerptsSplit",
"ctrl-6": "pane::AlternateFile",
"ctrl-^": "pane::AlternateFile"
"ctrl-^": "pane::AlternateFile",
".": "vim::Repeat"
}
},
{
@@ -219,7 +220,6 @@
"ctrl-[": "editor::Cancel",
"escape": "editor::Cancel",
":": "command_palette::Toggle",
- ".": "vim::Repeat",
"c": "vim::PushChange",
"shift-c": "vim::ChangeToEndOfLine",
"d": "vim::PushDelete",
@@ -849,6 +849,25 @@
"shift-u": "git::UnstageAll"
}
},
{
"context": "Editor && mode == auto_height && VimControl",
"bindings": {
// TODO: Implement search
"/": null,
"?": null,
"#": null,
"*": null,
"n": null,
"shift-n": null
}
},
{
"context": "GitCommit > Editor && VimControl && vim_mode == normal",
"bindings": {
"ctrl-c": "menu::Cancel",
"escape": "menu::Cancel"
}
},
{
"context": "Editor && edit_prediction",
"bindings": {
@@ -860,14 +879,7 @@
{
"context": "MessageEditor > Editor && VimControl",
"bindings": {
- "enter": "agent::Chat",
- // TODO: Implement search
- "/": null,
- "?": null,
- "#": null,
- "*": null,
- "n": null,
- "shift-n": null
+ "enter": "agent::Chat"
}
},
{


@@ -96,16 +96,11 @@ impl AgentProfile {
fn is_enabled(settings: &AgentProfileSettings, source: ToolSource, name: String) -> bool {
match source {
ToolSource::Native => *settings.tools.get(name.as_str()).unwrap_or(&false),
- ToolSource::ContextServer { id } => {
-     if settings.enable_all_context_servers {
-         return true;
-     }
-     let Some(preset) = settings.context_servers.get(id.as_ref()) else {
-         return false;
-     };
-     *preset.tools.get(name.as_str()).unwrap_or(&false)
- }
+ ToolSource::ContextServer { id } => settings
+     .context_servers
+     .get(id.as_ref())
+     .and_then(|preset| preset.tools.get(name.as_str()).copied())
+     .unwrap_or(settings.enable_all_context_servers),
}
}
}


@@ -819,6 +819,134 @@ impl LoadedContext {
}
}
}
pub fn add_to_request_message_with_model(
&self,
request_message: &mut LanguageModelRequestMessage,
model: &Arc<dyn language_model::LanguageModel>,
) {
if !self.text.is_empty() {
request_message
.content
.push(MessageContent::Text(self.text.to_string()));
}
if !self.images.is_empty() {
let max_image_size = model.max_image_size();
let mut images_added = false;
for image in &self.images {
let image_size = image.len() as u64;
if image_size > max_image_size {
if max_image_size == 0 {
log::warn!(
"Skipping image attachment: model {:?} does not support images",
model.name()
);
} else {
log::warn!(
"Skipping image attachment: size {} bytes exceeds model {:?} limit of {} bytes",
image_size,
model.name(),
max_image_size
);
}
continue;
}
// Some providers only support image parts after an initial text part
if !images_added && request_message.content.is_empty() {
request_message
.content
.push(MessageContent::Text("Images attached by user:".to_string()));
}
request_message
.content
.push(MessageContent::Image(image.clone()));
images_added = true;
}
}
}
/// Checks images against model size limits and returns information about rejected images
pub fn check_image_size_limits(
&self,
model: &Arc<dyn language_model::LanguageModel>,
) -> Vec<RejectedImage> {
let mut rejected_images = Vec::new();
if !self.images.is_empty() {
let max_image_size = model.max_image_size();
for image in &self.images {
let image_size = image.len() as u64;
if image_size > max_image_size {
rejected_images.push(RejectedImage {
size: image_size,
max_size: max_image_size,
model_name: model.name().0.to_string(),
});
}
}
}
rejected_images
}
pub fn add_to_request_message_with_validation<F>(
&self,
request_message: &mut LanguageModelRequestMessage,
model: &Arc<dyn language_model::LanguageModel>,
mut on_image_rejected: F,
) where
F: FnMut(u64, u64, &str),
{
if !self.text.is_empty() {
request_message
.content
.push(MessageContent::Text(self.text.to_string()));
}
if !self.images.is_empty() {
let max_image_size = model.max_image_size();
let mut images_added = false;
for image in &self.images {
let image_size = image.len() as u64;
if image_size > max_image_size {
on_image_rejected(image_size, max_image_size, &model.name().0);
if max_image_size == 0 {
log::warn!(
"Skipping image attachment: model {:?} does not support images",
model.name()
);
} else {
log::warn!(
"Skipping image attachment: size {} bytes exceeds model {:?} limit of {} bytes",
image_size,
model.name(),
max_image_size
);
}
continue;
}
// Some providers only support image parts after an initial text part
if !images_added && request_message.content.is_empty() {
request_message
.content
.push(MessageContent::Text("Images attached by user:".to_string()));
}
request_message
.content
.push(MessageContent::Image(image.clone()));
images_added = true;
}
}
}
}
/// Loads and formats a collection of contexts.
@@ -1112,10 +1240,18 @@ impl Hash for AgentContextKey {
}
}
#[derive(Debug, Clone)]
pub struct RejectedImage {
pub size: u64,
pub max_size: u64,
pub model_name: String,
}
#[cfg(test)]
mod tests {
use super::*;
- use gpui::TestAppContext;
+ use gpui::{AsyncApp, TestAppContext};
use language_model::{LanguageModelCacheConfiguration, LanguageModelId, LanguageModelName};
use project::{FakeFs, Project};
use serde_json::json;
use settings::SettingsStore;
@@ -1222,4 +1358,484 @@ mod tests {
})
.expect("Should have found a file context")
}
#[gpui::test]
async fn test_image_size_limit_filtering(_cx: &mut TestAppContext) {
use futures::stream::BoxStream;
use gpui::{AsyncApp, DevicePixels, SharedString};
use language_model::{
LanguageModelId, LanguageModelImage, LanguageModelName, LanguageModelProviderId,
LanguageModelProviderName, Role,
};
use std::sync::Arc;
// Create a mock image that's 10 bytes
let small_image = LanguageModelImage {
source: "small_data".into(),
size: gpui::size(DevicePixels(10), DevicePixels(10)),
};
// Create a mock image that's 1MB
let large_image_source = "x".repeat(1_048_576);
let large_image = LanguageModelImage {
source: large_image_source.into(),
size: gpui::size(DevicePixels(1024), DevicePixels(1024)),
};
let loaded_context = LoadedContext {
contexts: vec![],
text: "Some text".to_string(),
images: vec![small_image.clone(), large_image.clone()],
};
// Test with a model that supports images with 500KB limit
struct TestModel500KB;
impl language_model::LanguageModel for TestModel500KB {
fn id(&self) -> LanguageModelId {
LanguageModelId(SharedString::from("test-500kb"))
}
fn name(&self) -> LanguageModelName {
LanguageModelName(SharedString::from("Test Model 500KB"))
}
fn provider_id(&self) -> LanguageModelProviderId {
LanguageModelProviderId(SharedString::from("test"))
}
fn provider_name(&self) -> LanguageModelProviderName {
LanguageModelProviderName(SharedString::from("Test Provider"))
}
fn supports_tools(&self) -> bool {
false
}
fn supports_tool_choice(&self, _: language_model::LanguageModelToolChoice) -> bool {
false
}
fn max_image_size(&self) -> u64 {
512_000
} // 500KB
fn telemetry_id(&self) -> String {
"test-500kb".to_string()
}
fn max_token_count(&self) -> u64 {
100_000
}
fn count_tokens(
&self,
_request: language_model::LanguageModelRequest,
_cx: &App,
) -> futures::future::BoxFuture<'static, anyhow::Result<u64>> {
Box::pin(async { Ok(0) })
}
fn stream_completion(
&self,
_request: language_model::LanguageModelRequest,
_cx: &AsyncApp,
) -> futures::future::BoxFuture<
'static,
Result<
BoxStream<
'static,
Result<
language_model::LanguageModelCompletionEvent,
language_model::LanguageModelCompletionError,
>,
>,
language_model::LanguageModelCompletionError,
>,
> {
use language_model::LanguageModelCompletionError;
Box::pin(async {
Err(LanguageModelCompletionError::Other(anyhow::anyhow!(
"Not implemented"
)))
})
}
}
let model_500kb: Arc<dyn language_model::LanguageModel> = Arc::new(TestModel500KB);
let mut request_message = LanguageModelRequestMessage {
role: Role::User,
content: vec![],
cache: false,
};
loaded_context.add_to_request_message_with_model(&mut request_message, &model_500kb);
// Should have text and only the small image
assert_eq!(request_message.content.len(), 2); // text + small image
assert!(
matches!(&request_message.content[0], MessageContent::Text(text) if text == "Some text")
);
assert!(matches!(
&request_message.content[1],
MessageContent::Image(_)
));
// Test with a model that doesn't support images
struct TestModelNoImages;
impl language_model::LanguageModel for TestModelNoImages {
fn id(&self) -> LanguageModelId {
LanguageModelId(SharedString::from("test-no-images"))
}
fn name(&self) -> LanguageModelName {
LanguageModelName(SharedString::from("Test Model No Images"))
}
fn provider_id(&self) -> LanguageModelProviderId {
LanguageModelProviderId(SharedString::from("test"))
}
fn provider_name(&self) -> LanguageModelProviderName {
LanguageModelProviderName(SharedString::from("Test Provider"))
}
fn supports_tools(&self) -> bool {
false
}
fn supports_tool_choice(&self, _: language_model::LanguageModelToolChoice) -> bool {
false
}
fn max_image_size(&self) -> u64 {
0
} // No image support
fn telemetry_id(&self) -> String {
"test-no-images".to_string()
}
fn max_token_count(&self) -> u64 {
100_000
}
fn count_tokens(
&self,
_request: language_model::LanguageModelRequest,
_cx: &App,
) -> futures::future::BoxFuture<'static, anyhow::Result<u64>> {
Box::pin(async { Ok(0) })
}
fn stream_completion(
&self,
_request: language_model::LanguageModelRequest,
_cx: &AsyncApp,
) -> futures::future::BoxFuture<
'static,
Result<
BoxStream<
'static,
Result<
language_model::LanguageModelCompletionEvent,
language_model::LanguageModelCompletionError,
>,
>,
language_model::LanguageModelCompletionError,
>,
> {
use language_model::LanguageModelCompletionError;
Box::pin(async {
Err(LanguageModelCompletionError::Other(anyhow::anyhow!(
"Not implemented"
)))
})
}
}
let model_no_images: Arc<dyn language_model::LanguageModel> = Arc::new(TestModelNoImages);
let mut request_message_no_images = LanguageModelRequestMessage {
role: Role::User,
content: vec![],
cache: false,
};
loaded_context
.add_to_request_message_with_model(&mut request_message_no_images, &model_no_images);
// Should have only text, no images
assert_eq!(request_message_no_images.content.len(), 1);
assert!(
matches!(&request_message_no_images.content[0], MessageContent::Text(text) if text == "Some text")
);
}
#[gpui::test]
async fn test_check_image_size_limits() {
use gpui::DevicePixels;
use language_model::LanguageModelImage;
// Create test images of various sizes
let tiny_image = LanguageModelImage {
source: "tiny".into(),
size: gpui::size(DevicePixels(10), DevicePixels(10)),
};
let small_image = LanguageModelImage {
source: "x".repeat(100_000).into(), // 100KB
size: gpui::size(DevicePixels(100), DevicePixels(100)),
};
let medium_image = LanguageModelImage {
source: "x".repeat(500_000).into(), // 500KB
size: gpui::size(DevicePixels(500), DevicePixels(500)),
};
let large_image = LanguageModelImage {
source: "x".repeat(1_048_576).into(), // 1MB
size: gpui::size(DevicePixels(1024), DevicePixels(1024)),
};
let huge_image = LanguageModelImage {
source: "x".repeat(5_242_880).into(), // 5MB
size: gpui::size(DevicePixels(2048), DevicePixels(2048)),
};
// Test with model that has 1MB limit
let model_1mb = Arc::new(TestModel1MB);
let loaded_context = LoadedContext {
contexts: vec![],
text: String::new(),
images: vec![
tiny_image.clone(),
small_image.clone(),
medium_image.clone(),
large_image.clone(),
huge_image.clone(),
],
};
let rejected = loaded_context.check_image_size_limits(
&(model_1mb.clone() as Arc<dyn language_model::LanguageModel>),
);
assert_eq!(rejected.len(), 1);
assert_eq!(rejected[0].size, 5_242_880);
assert_eq!(rejected[0].max_size, 1_048_576);
assert_eq!(rejected[0].model_name, "Test Model 1MB");
// Test with model that doesn't support images
let model_no_images = Arc::new(TestModelNoImages);
let rejected = loaded_context.check_image_size_limits(
&(model_no_images.clone() as Arc<dyn language_model::LanguageModel>),
);
assert_eq!(rejected.len(), 5); // All images rejected
for (_i, rejected_image) in rejected.iter().enumerate() {
assert_eq!(rejected_image.max_size, 0);
assert_eq!(rejected_image.model_name, "Test Model No Images");
}
// Test with empty image list
let empty_context = LoadedContext {
contexts: vec![],
text: String::new(),
images: vec![],
};
let rejected = empty_context.check_image_size_limits(
&(model_1mb.clone() as Arc<dyn language_model::LanguageModel>),
);
assert!(rejected.is_empty());
// Test with all images within limit
let small_context = LoadedContext {
contexts: vec![],
text: String::new(),
images: vec![tiny_image.clone(), small_image.clone()],
};
let rejected = small_context
.check_image_size_limits(&(model_1mb as Arc<dyn language_model::LanguageModel>));
assert!(rejected.is_empty());
}
#[gpui::test]
async fn test_add_to_request_message_with_validation() {
use gpui::DevicePixels;
use language_model::{LanguageModelImage, MessageContent, Role};
let small_image = LanguageModelImage {
source: "small".into(),
size: gpui::size(DevicePixels(10), DevicePixels(10)),
};
let large_image = LanguageModelImage {
source: "x".repeat(2_097_152).into(), // 2MB
size: gpui::size(DevicePixels(1024), DevicePixels(1024)),
};
let loaded_context = LoadedContext {
contexts: vec![],
text: "Test message".to_string(),
images: vec![small_image.clone(), large_image.clone()],
};
let model = Arc::new(TestModel1MB);
let mut request_message = LanguageModelRequestMessage {
role: Role::User,
content: Vec::new(),
cache: false,
};
let mut rejected_count = 0;
let mut rejected_sizes = Vec::new();
let mut rejected_model_names = Vec::new();
loaded_context.add_to_request_message_with_validation(
&mut request_message,
&(model.clone() as Arc<dyn language_model::LanguageModel>),
|size, max_size, model_name| {
rejected_count += 1;
rejected_sizes.push((size, max_size));
rejected_model_names.push(model_name.to_string());
},
);
// Verify callback was called for the large image
assert_eq!(rejected_count, 1);
assert_eq!(rejected_sizes[0], (2_097_152, 1_048_576));
assert_eq!(rejected_model_names[0], "Test Model 1MB");
// Verify the request message contains text and only the small image
assert_eq!(request_message.content.len(), 2); // text + small image
assert!(
matches!(&request_message.content[0], MessageContent::Text(text) if text == "Test message")
);
assert!(matches!(
&request_message.content[1],
MessageContent::Image(_)
));
}
// Helper test models
struct TestModel1MB;
impl language_model::LanguageModel for TestModel1MB {
fn id(&self) -> LanguageModelId {
LanguageModelId(SharedString::from("test-1mb"))
}
fn name(&self) -> LanguageModelName {
LanguageModelName(SharedString::from("Test Model 1MB"))
}
fn provider_id(&self) -> language_model::LanguageModelProviderId {
language_model::LanguageModelProviderId(SharedString::from("test"))
}
fn provider_name(&self) -> language_model::LanguageModelProviderName {
language_model::LanguageModelProviderName(SharedString::from("Test Provider"))
}
fn supports_tools(&self) -> bool {
false
}
fn supports_tool_choice(&self, _: language_model::LanguageModelToolChoice) -> bool {
false
}
fn max_image_size(&self) -> u64 {
1_048_576 // 1MB
}
fn telemetry_id(&self) -> String {
"test-1mb".to_string()
}
fn max_token_count(&self) -> u64 {
100_000
}
fn max_output_tokens(&self) -> Option<u64> {
Some(4096)
}
fn cache_configuration(&self) -> Option<LanguageModelCacheConfiguration> {
Some(LanguageModelCacheConfiguration {
max_cache_anchors: 0,
should_speculate: false,
min_total_token: 1024,
})
}
fn count_tokens(
&self,
_request: language_model::LanguageModelRequest,
_cx: &App,
) -> futures::future::BoxFuture<'static, anyhow::Result<u64>> {
Box::pin(async { Ok(0) })
}
fn stream_completion(
&self,
_request: language_model::LanguageModelRequest,
_cx: &AsyncApp,
) -> futures::future::BoxFuture<
'static,
Result<
futures::stream::BoxStream<
'static,
Result<
language_model::LanguageModelCompletionEvent,
language_model::LanguageModelCompletionError,
>,
>,
language_model::LanguageModelCompletionError,
>,
> {
use language_model::LanguageModelCompletionError;
Box::pin(async {
Err(LanguageModelCompletionError::Other(anyhow::anyhow!(
"Not implemented"
)))
})
}
}
struct TestModelNoImages;
impl language_model::LanguageModel for TestModelNoImages {
fn id(&self) -> LanguageModelId {
LanguageModelId(SharedString::from("test-no-images"))
}
fn name(&self) -> LanguageModelName {
LanguageModelName(SharedString::from("Test Model No Images"))
}
fn provider_id(&self) -> language_model::LanguageModelProviderId {
language_model::LanguageModelProviderId(SharedString::from("test"))
}
fn provider_name(&self) -> language_model::LanguageModelProviderName {
language_model::LanguageModelProviderName(SharedString::from("Test Provider"))
}
fn supports_tools(&self) -> bool {
false
}
fn supports_tool_choice(&self, _: language_model::LanguageModelToolChoice) -> bool {
false
}
fn max_image_size(&self) -> u64 {
0 // No image support
}
fn telemetry_id(&self) -> String {
"test-no-images".to_string()
}
fn max_token_count(&self) -> u64 {
100_000
}
fn max_output_tokens(&self) -> Option<u64> {
Some(4096)
}
fn cache_configuration(&self) -> Option<LanguageModelCacheConfiguration> {
Some(LanguageModelCacheConfiguration {
max_cache_anchors: 0,
should_speculate: false,
min_total_token: 1024,
})
}
fn count_tokens(
&self,
_request: language_model::LanguageModelRequest,
_cx: &App,
) -> futures::future::BoxFuture<'static, anyhow::Result<u64>> {
Box::pin(async { Ok(0) })
}
fn stream_completion(
&self,
_request: language_model::LanguageModelRequest,
_cx: &AsyncApp,
) -> futures::future::BoxFuture<
'static,
Result<
futures::stream::BoxStream<
'static,
Result<
language_model::LanguageModelCompletionEvent,
language_model::LanguageModelCompletionError,
>,
>,
language_model::LanguageModelCompletionError,
>,
> {
use language_model::LanguageModelCompletionError;
Box::pin(async {
Err(LanguageModelCompletionError::Other(anyhow::anyhow!(
"Not implemented"
)))
})
}
}
}


@@ -1338,11 +1338,12 @@ impl Thread {
message
.loaded_context
.add_to_request_message(&mut request_message);
.add_to_request_message_with_model(&mut request_message, &model);
for segment in &message.segments {
match segment {
MessageSegment::Text(text) => {
let text = text.trim_end();
if !text.is_empty() {
request_message
.content
@@ -4107,6 +4108,10 @@ fn main() {{
self.inner.supports_images()
}
fn max_image_size(&self) -> u64 {
self.inner.max_image_size()
}
fn telemetry_id(&self) -> String {
self.inner.telemetry_id()
}
@@ -4616,6 +4621,10 @@ fn main() {{
self.inner.supports_images()
}
fn max_image_size(&self) -> u64 {
self.inner.max_image_size()
}
fn telemetry_id(&self) -> String {
self.inner.telemetry_id()
}
@@ -4781,6 +4790,10 @@ fn main() {{
self.inner.supports_images()
}
fn max_image_size(&self) -> u64 {
self.inner.max_image_size()
}
fn telemetry_id(&self) -> String {
self.inner.telemetry_id()
}
@@ -4938,6 +4951,10 @@ fn main() {{
self.inner.supports_images()
}
fn max_image_size(&self) -> u64 {
self.inner.max_image_size()
}
fn telemetry_id(&self) -> String {
self.inner.telemetry_id()
}
@@ -5381,4 +5398,192 @@ fn main() {{
Ok(buffer)
}
#[gpui::test]
async fn test_image_size_limit_in_thread(cx: &mut TestAppContext) {
use gpui::DevicePixels;
use language_model::{
LanguageModelImage,
fake_provider::{FakeLanguageModel, FakeLanguageModelProvider},
};
init_test_settings(cx);
let project = create_test_project(cx, serde_json::json!({})).await;
let (_, _, thread, _, _) = setup_test_environment(cx, project).await;
// Create a small image that's under the limit
let small_image = LanguageModelImage {
source: "small_data".into(),
size: gpui::size(DevicePixels(10), DevicePixels(10)),
};
// Create a large image that exceeds typical limits (10MB)
let large_image_source = "x".repeat(10_485_760); // 10MB
let large_image = LanguageModelImage {
source: large_image_source.into(),
size: gpui::size(DevicePixels(1024), DevicePixels(1024)),
};
// Create a loaded context with both images
let loaded_context = ContextLoadResult {
loaded_context: LoadedContext {
contexts: vec![],
text: "Test message".to_string(),
images: vec![small_image.clone(), large_image.clone()],
},
referenced_buffers: HashSet::default(),
};
// Insert a user message with the loaded context
thread.update(cx, |thread, cx| {
thread.insert_user_message("Test with images", loaded_context, None, vec![], cx);
});
        // Create a fake provider and model (FakeLanguageModel reports max_image_size() == 0)
let _provider = Arc::new(FakeLanguageModelProvider);
let model = Arc::new(FakeLanguageModel::default());
// Note: FakeLanguageModel doesn't support images by default (max_image_size returns 0)
// so we'll test that images are excluded when the model doesn't support them
// Generate the completion request
let request = thread.update(cx, |thread, cx| {
thread.to_completion_request(model.clone(), CompletionIntent::UserPrompt, cx)
});
// Verify that no images were included (because FakeLanguageModel doesn't support images)
let mut image_count = 0;
let mut has_text = false;
for message in &request.messages {
for content in &message.content {
match content {
MessageContent::Text(text) => {
if text.contains("Test message") {
has_text = true;
}
}
MessageContent::Image(_) => {
image_count += 1;
}
_ => {}
}
}
}
assert!(has_text, "Text content should be included");
assert_eq!(
image_count, 0,
"No images should be included when model doesn't support them"
);
}
#[gpui::test]
async fn test_image_size_limit_with_anthropic_model(_cx: &mut TestAppContext) {
use gpui::{DevicePixels, SharedString};
use language_model::{
LanguageModelId, LanguageModelImage, LanguageModelName, LanguageModelProviderId,
LanguageModelProviderName,
};
// Test with a model that has specific size limits (like Anthropic's 5MB limit)
// We'll create a simple test to verify the logic works correctly
// Create test images
let small_image = LanguageModelImage {
source: "small".into(),
size: gpui::size(DevicePixels(100), DevicePixels(100)),
};
let large_image_source = "x".repeat(6_000_000); // 6MB - over Anthropic's 5MB limit
let large_image = LanguageModelImage {
source: large_image_source.into(),
size: gpui::size(DevicePixels(2000), DevicePixels(2000)),
};
let loaded_context = LoadedContext {
contexts: vec![],
text: "Test".to_string(),
images: vec![small_image.clone(), large_image.clone()],
};
// Test the add_to_request_message_with_model method directly
let mut request_message = LanguageModelRequestMessage {
role: Role::User,
content: vec![],
cache: false,
};
        // Create a mock model with a 5 MB image limit (modeled on the test in context.rs)
struct TestModel5MB;
impl language_model::LanguageModel for TestModel5MB {
fn id(&self) -> LanguageModelId {
LanguageModelId(SharedString::from("test"))
}
fn name(&self) -> LanguageModelName {
LanguageModelName(SharedString::from("Test 5MB"))
}
fn provider_id(&self) -> LanguageModelProviderId {
LanguageModelProviderId(SharedString::from("test"))
}
fn provider_name(&self) -> LanguageModelProviderName {
LanguageModelProviderName(SharedString::from("Test"))
}
fn supports_tools(&self) -> bool {
false
}
fn supports_tool_choice(&self, _: language_model::LanguageModelToolChoice) -> bool {
false
}
fn max_image_size(&self) -> u64 {
5_242_880 // 5MB like Anthropic
}
fn telemetry_id(&self) -> String {
"test".to_string()
}
fn max_token_count(&self) -> u64 {
100_000
}
fn count_tokens(
&self,
_request: language_model::LanguageModelRequest,
_cx: &App,
) -> futures::future::BoxFuture<'static, anyhow::Result<u64>> {
Box::pin(async { Ok(0) })
}
fn stream_completion(
&self,
_request: language_model::LanguageModelRequest,
_cx: &gpui::AsyncApp,
) -> futures::future::BoxFuture<
'static,
Result<
futures::stream::BoxStream<
'static,
Result<
language_model::LanguageModelCompletionEvent,
language_model::LanguageModelCompletionError,
>,
>,
language_model::LanguageModelCompletionError,
>,
> {
Box::pin(async {
Err(language_model::LanguageModelCompletionError::Other(
anyhow::anyhow!("Not implemented"),
))
})
}
}
let model: Arc<dyn language_model::LanguageModel> = Arc::new(TestModel5MB);
loaded_context.add_to_request_message_with_model(&mut request_message, &model);
// Should have text and only the small image
let mut image_count = 0;
for content in &request_message.content {
if matches!(content, MessageContent::Image(_)) {
image_count += 1;
}
}
assert_eq!(image_count, 1, "Only the small image should be included");
}
}
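Editorial aside: the behavior these tests exercise reduces to a small size gate. The sketch below is illustrative, not code from the diff; `ImageAttachment` is a hypothetical stand-in for the real image type, and the gating rule (limit of 0 means no image support, oversized images are dropped) is the assumed behavior of `add_to_request_message_with_model`.

```rust
// Sketch of the size-gating the tests above exercise: images whose
// encoded size exceeds the model's `max_image_size` are dropped from
// the request, and a limit of 0 means the model accepts no images.
struct ImageAttachment {
    bytes: u64,
}

fn images_for_model(images: Vec<ImageAttachment>, max_image_size: u64) -> Vec<ImageAttachment> {
    images
        .into_iter()
        .filter(|image| max_image_size > 0 && image.bytes <= max_image_size)
        .collect()
}

fn main() {
    let images = vec![
        ImageAttachment { bytes: 100 },       // small, kept
        ImageAttachment { bytes: 6_000_000 }, // over a 5 MB limit, dropped
    ];
    let kept = images_for_model(images, 5_242_880);
    assert_eq!(kept.len(), 1);
    // A limit of 0 (model without image support) drops everything.
    let none = images_for_model(vec![ImageAttachment { bytes: 1 }], 0);
    assert!(none.is_empty());
}
```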


@@ -47,8 +47,8 @@ use std::time::Duration;
use text::ToPoint;
use theme::ThemeSettings;
use ui::{
Disclosure, KeyBinding, PopoverMenuHandle, Scrollbar, ScrollbarState, TextSize, Tooltip,
prelude::*,
Banner, Disclosure, KeyBinding, PopoverMenuHandle, Scrollbar, ScrollbarState, TextSize,
Tooltip, prelude::*,
};
use util::ResultExt as _;
use util::markdown::MarkdownCodeBlock;
@@ -58,6 +58,7 @@ use zed_llm_client::CompletionIntent;
const CODEBLOCK_CONTAINER_GROUP: &str = "codeblock_container";
const EDIT_PREVIOUS_MESSAGE_MIN_LINES: usize = 1;
const RESPONSE_PADDING_X: Pixels = px(19.);
pub struct ActiveThread {
context_store: Entity<ContextStore>,
@@ -888,6 +889,46 @@ impl ActiveThread {
&self.text_thread_store
}
pub fn validate_image(&self, image: &Arc<gpui::Image>, cx: &App) -> Result<(), String> {
let image_size = image.bytes().len() as u64;
if let Some(model) = self.thread.read(cx).configured_model() {
let max_size = model.model.max_image_size();
if image_size > max_size {
if max_size == 0 {
Err(format!(
"{} does not support image attachments",
model.model.name().0
))
} else {
let size_mb = image_size as f64 / 1_048_576.0;
let max_size_mb = max_size as f64 / 1_048_576.0;
Err(format!(
"Image ({:.1} MB) exceeds {}'s {:.1} MB size limit",
size_mb,
model.model.name().0,
max_size_mb
))
}
} else {
Ok(())
}
} else {
// No model configured, use default 10MB limit
const DEFAULT_MAX_SIZE: u64 = 10 * 1024 * 1024;
if image_size > DEFAULT_MAX_SIZE {
let size_mb = image_size as f64 / 1_048_576.0;
Err(format!(
"Image ({:.1} MB) exceeds the 10 MB size limit",
size_mb
))
} else {
Ok(())
}
}
}
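The validation rule above condenses into a model-agnostic sketch. This is an editorial illustration, not part of the diff; `max_size` is `None` when no model is configured (falling back to the 10 MB default), and a limit of 0 signals that the model takes no images at all.

```rust
// Condensed sketch of `validate_image` above, decoupled from gpui/model
// types. Mirrors the branch order of the real code: under-limit images
// pass, then a 0 limit reports "no image support", otherwise "too big".
fn validate_image_size(image_size: u64, max_size: Option<u64>) -> Result<(), String> {
    const DEFAULT_MAX_SIZE: u64 = 10 * 1024 * 1024; // 10 MB fallback when no model is configured
    let limit = max_size.unwrap_or(DEFAULT_MAX_SIZE);
    if image_size <= limit {
        return Ok(());
    }
    if limit == 0 {
        Err("model does not support image attachments".to_string())
    } else {
        Err(format!(
            "Image ({:.1} MB) exceeds the {:.1} MB size limit",
            image_size as f64 / 1_048_576.0,
            limit as f64 / 1_048_576.0
        ))
    }
}

fn main() {
    assert!(validate_image_size(1_048_576, Some(5_242_880)).is_ok());
    assert!(validate_image_size(6_000_000, Some(5_242_880)).is_err());
    assert!(validate_image_size(11 * 1024 * 1024, None).is_err());
    assert!(validate_image_size(1, Some(0)).is_err());
}
```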
fn push_rendered_message(&mut self, id: MessageId, rendered_message: RenderedMessage) {
let old_len = self.messages.len();
self.messages.push(id);
@@ -1521,7 +1562,7 @@ impl ActiveThread {
}
fn paste(&mut self, _: &Paste, _window: &mut Window, cx: &mut Context<Self>) {
attach_pasted_images_as_context(&self.context_store, cx);
attach_pasted_images_as_context_with_validation(&self.context_store, Some(self), cx);
}
fn cancel_editing_message(
@@ -1874,9 +1915,6 @@ impl ActiveThread {
this.scroll_to_top(cx);
}));
// For all items that should be aligned with the LLM's response.
const RESPONSE_PADDING_X: Pixels = px(19.);
let show_feedback = thread.is_turn_end(ix);
let feedback_container = h_flex()
.group("feedback_container")
@@ -2537,34 +2575,18 @@ impl ActiveThread {
ix: usize,
cx: &mut Context<Self>,
) -> Stateful<Div> {
let colors = cx.theme().colors();
div().id(("message-container", ix)).py_1().px_2().child(
v_flex()
.w_full()
.bg(colors.editor_background)
.rounded_sm()
.child(
h_flex()
.w_full()
.p_2()
.gap_2()
.child(
div().flex_none().child(
Icon::new(IconName::Warning)
.size(IconSize::Small)
.color(Color::Warning),
),
)
.child(
v_flex()
.flex_1()
.min_w_0()
.text_size(TextSize::Small.rems(cx))
.text_color(cx.theme().colors().text_muted)
.children(message_content),
),
),
)
let message = div()
.flex_1()
.min_w_0()
.text_size(TextSize::XSmall.rems(cx))
.text_color(cx.theme().colors().text_muted)
.children(message_content);
div()
.id(("message-container", ix))
.py_1()
.px_2p5()
.child(Banner::new().severity(ui::Severity::Warning).child(message))
}
fn render_message_thinking_segment(
@@ -3721,6 +3743,14 @@ pub(crate) fn open_context(
pub(crate) fn attach_pasted_images_as_context(
context_store: &Entity<ContextStore>,
cx: &mut App,
) -> bool {
attach_pasted_images_as_context_with_validation(context_store, None, cx)
}
pub(crate) fn attach_pasted_images_as_context_with_validation(
context_store: &Entity<ContextStore>,
active_thread: Option<&ActiveThread>,
cx: &mut App,
) -> bool {
let images = cx
.read_from_clipboard()
@@ -3742,9 +3772,67 @@ pub(crate) fn attach_pasted_images_as_context(
}
cx.stop_propagation();
// Try to find the workspace for showing toasts
let workspace = cx
.active_window()
.and_then(|window| window.downcast::<Workspace>());
context_store.update(cx, |store, cx| {
for image in images {
store.add_image_instance(Arc::new(image), cx);
let image_arc = Arc::new(image);
// Validate image if we have an active thread
let should_add = if let Some(thread) = active_thread {
match thread.validate_image(&image_arc, cx) {
Ok(()) => true,
Err(err) => {
// Show error toast if we have a workspace
if let Some(workspace) = workspace {
let _ = workspace.update(cx, |workspace, _, cx| {
use workspace::{Toast, notifications::NotificationId};
struct ImageRejectionToast;
workspace.show_toast(
Toast::new(
NotificationId::unique::<ImageRejectionToast>(),
err,
),
cx,
);
});
}
false
}
}
} else {
// No active thread, check against default limit
let image_size = image_arc.bytes().len() as u64;
const DEFAULT_MAX_SIZE: u64 = 10 * 1024 * 1024; // 10MB
if image_size > DEFAULT_MAX_SIZE {
let size_mb = image_size as f64 / 1_048_576.0;
let err = format!("Image ({:.1} MB) exceeds the 10 MB size limit", size_mb);
if let Some(workspace) = workspace {
let _ = workspace.update(cx, |workspace, _, cx| {
use workspace::{Toast, notifications::NotificationId};
struct ImageRejectionToast;
workspace.show_toast(
Toast::new(NotificationId::unique::<ImageRejectionToast>(), err),
cx,
);
});
}
false
} else {
true
}
};
if should_add {
store.add_image_instance(image_arc, cx);
}
}
});
true


@@ -180,7 +180,7 @@ impl ConfigurationSource {
}
fn context_server_input(existing: Option<(ContextServerId, ContextServerCommand)>) -> String {
let (name, path, args, env) = match existing {
let (name, command, args, env) = match existing {
Some((id, cmd)) => {
let args = serde_json::to_string(&cmd.args).unwrap();
let env = serde_json::to_string(&cmd.env.unwrap_or_default()).unwrap();
@@ -198,14 +198,12 @@ fn context_server_input(existing: Option<(ContextServerId, ContextServerCommand)
r#"{{
/// The name of your MCP server
"{name}": {{
"command": {{
/// The path to the executable
"path": "{path}",
/// The arguments to pass to the executable
"args": {args},
/// The environment variables to set for the executable
"env": {env}
}}
/// The command which runs the MCP server
"command": "{command}",
/// The arguments to pass to the MCP server
"args": {args},
/// The environment variables to set
"env": {env}
}}
}}"#
)
@@ -439,8 +437,7 @@ fn parse_input(text: &str) -> Result<(ContextServerId, ContextServerCommand)> {
let object = value.as_object().context("Expected object")?;
anyhow::ensure!(object.len() == 1, "Expected exactly one key-value pair");
let (context_server_name, value) = object.into_iter().next().unwrap();
let command = value.get("command").context("Expected command")?;
let command: ContextServerCommand = serde_json::from_value(command.clone())?;
let command: ContextServerCommand = serde_json::from_value(value.clone())?;
Ok((ContextServerId(context_server_name.clone().into()), command))
}


@@ -4,6 +4,8 @@ use std::rc::Rc;
use std::sync::Arc;
use std::time::Duration;
use gpui::{Image, ImageFormat};
use db::kvp::{Dismissable, KEY_VALUE_STORE};
use serde::{Deserialize, Serialize};
@@ -2932,29 +2934,215 @@ impl AgentPanel {
}),
)
.on_drop(cx.listener(move |this, paths: &ExternalPaths, window, cx| {
let tasks = paths
.paths()
.into_iter()
.map(|path| {
Workspace::project_path_for_path(this.project.clone(), &path, false, cx)
})
.collect::<Vec<_>>();
cx.spawn_in(window, async move |this, cx| {
let mut paths = vec![];
let mut added_worktrees = vec![];
let opened_paths = futures::future::join_all(tasks).await;
for entry in opened_paths {
if let Some((worktree, project_path)) = entry.log_err() {
added_worktrees.push(worktree);
paths.push(project_path);
eprintln!("=== ON_DROP EXTERNAL_PATHS HANDLER ===");
eprintln!("Number of external paths: {}", paths.paths().len());
for (i, path) in paths.paths().iter().enumerate() {
eprintln!("External path {}: {:?}", i, path);
}
match &this.active_view {
ActiveView::Thread { thread, .. } => {
eprintln!("In ActiveView::Thread branch");
let thread = thread.clone();
let paths = paths.paths();
let workspace = this.workspace.clone();
for path in paths {
eprintln!("Processing path: {:?}", path);
// Check if it's an image file by extension
let is_image = path.extension()
.and_then(|ext| ext.to_str())
.map(|ext| {
matches!(
ext.to_lowercase().as_str(),
"jpg" | "jpeg" | "png" | "gif" | "webp" | "bmp" | "ico" | "svg" | "tiff" | "tif"
)
})
.unwrap_or(false);
eprintln!("Is image: {}", is_image);
if is_image {
let path = path.to_path_buf();
let thread = thread.clone();
let workspace = workspace.clone();
eprintln!("Spawning async task for image: {:?}", path);
cx.spawn_in(window, async move |_, cx| {
eprintln!("=== INSIDE ASYNC IMAGE TASK ===");
eprintln!("Image path: {:?}", path);
// Get file metadata first
let metadata = smol::fs::metadata(&path).await;
eprintln!("Metadata result: {:?}", metadata.is_ok());
if let Ok(metadata) = metadata {
let file_size = metadata.len();
eprintln!("File size: {} bytes", file_size);
// Get model limits
let (max_image_size, model_name) = thread
.update_in(cx, |thread, _window, cx| {
let model = thread.thread().read(cx).configured_model();
let max_size = model
.as_ref()
.map(|m| m.model.max_image_size())
.unwrap_or(10 * 1024 * 1024);
let name = model.as_ref().map(|m| m.model.name().0.to_string());
(max_size, name)
})
.ok()
.unwrap_or((10 * 1024 * 1024, None));
eprintln!("Max image size: {}, Model: {:?}", max_image_size, model_name);
eprintln!("File size: {:.2} MB, Limit: {:.2} MB",
file_size as f64 / 1_048_576.0,
max_image_size as f64 / 1_048_576.0);
if file_size > max_image_size {
eprintln!("FILE SIZE EXCEEDS LIMIT!");
let error_message = if let Some(model_name) = &model_name {
if max_image_size == 0 {
format!("{} does not support image attachments", model_name)
} else {
let size_mb = file_size as f64 / 1_048_576.0;
let max_size_mb = max_image_size as f64 / 1_048_576.0;
format!(
"Image ({:.1} MB) exceeds {}'s {:.1} MB size limit",
size_mb, model_name, max_size_mb
)
}
} else {
let size_mb = file_size as f64 / 1_048_576.0;
format!("Image ({:.1} MB) exceeds the 10 MB size limit", size_mb)
};
eprintln!("Showing error toast: {}", error_message);
cx.update(|_, cx| {
eprintln!("Inside cx.update for toast");
if let Some(workspace) = workspace.upgrade() {
eprintln!("Got workspace, showing toast!");
let _ = workspace.update(cx, |workspace, cx| {
use workspace::{Toast, notifications::NotificationId};
struct ImageRejectionToast;
workspace.show_toast(
Toast::new(
NotificationId::unique::<ImageRejectionToast>(),
error_message,
),
cx,
);
});
eprintln!("Toast command issued!");
} else {
eprintln!("FAILED to upgrade workspace!");
}
})
.log_err();
} else {
eprintln!("Image within size limits, loading file");
// Load the image file
match smol::fs::read(&path).await {
Ok(data) => {
eprintln!("Successfully read {} bytes", data.len());
// Determine image format from extension
let format = path.extension()
.and_then(|ext| ext.to_str())
.and_then(|ext| {
match ext.to_lowercase().as_str() {
"png" => Some(ImageFormat::Png),
"jpg" | "jpeg" => Some(ImageFormat::Jpeg),
"gif" => Some(ImageFormat::Gif),
"webp" => Some(ImageFormat::Webp),
"bmp" => Some(ImageFormat::Bmp),
"svg" => Some(ImageFormat::Svg),
"tiff" | "tif" => Some(ImageFormat::Tiff),
_ => None
}
})
.unwrap_or(ImageFormat::Png); // Default to PNG if unknown
// Create image from data
let image = Image::from_bytes(format, data);
let image_arc = Arc::new(image);
// Add to context store
thread
.update_in(cx, |thread, _window, cx| {
thread.context_store().update(cx, |store, cx| {
store.add_image_instance(image_arc, cx);
});
})
.log_err();
eprintln!("Image added to context store!");
}
Err(e) => {
log::error!("Failed to read image file: {}", e);
}
}
}
} else {
eprintln!("Failed to get file metadata!");
}
})
.detach();
eprintln!("Image task detached");
} else {
eprintln!("Not an image, using project path logic");
// For non-image files, use the existing project path logic
let project = this.project.clone();
let context_store = thread.read(cx).context_store().clone();
let path = path.to_path_buf();
cx.spawn_in(window, async move |_, cx| {
if let Some(task) = cx.update(|_, cx| {
Workspace::project_path_for_path(project.clone(), &path, false, cx)
}).ok() {
if let Some((_, project_path)) = task.await.log_err() {
context_store
.update(cx, |store, cx| {
store.add_file_from_path(project_path, false, cx).detach();
})
.ok();
}
}
})
.detach();
}
}
}
this.update_in(cx, |this, window, cx| {
this.handle_drop(paths, added_worktrees, window, cx);
})
.ok();
})
.detach();
ActiveView::TextThread { .. } => {
eprintln!("In ActiveView::TextThread branch");
// Keep existing behavior for text threads
let tasks = paths
.paths()
.into_iter()
.map(|path| {
Workspace::project_path_for_path(this.project.clone(), &path, false, cx)
})
.collect::<Vec<_>>();
cx.spawn_in(window, async move |this, cx| {
let mut paths = vec![];
let mut added_worktrees = vec![];
let opened_paths = futures::future::join_all(tasks).await;
for entry in opened_paths {
if let Some((worktree, project_path)) = entry.log_err() {
added_worktrees.push(worktree);
paths.push(project_path);
}
}
this.update_in(cx, |this, window, cx| {
this.handle_drop(paths, added_worktrees, window, cx);
})
.ok();
})
.detach();
}
_ => {
eprintln!("In unknown ActiveView branch");
}
}
}))
}
@@ -2965,20 +3153,47 @@ impl AgentPanel {
window: &mut Window,
cx: &mut Context<Self>,
) {
// This method is now only used for non-image files and text threads
match &self.active_view {
ActiveView::Thread { thread, .. } => {
let context_store = thread.read(cx).context_store().clone();
// All paths here should be non-image files
context_store.update(cx, move |context_store, cx| {
let mut tasks = Vec::new();
for project_path in &paths {
tasks.push(context_store.add_file_from_path(
project_path.clone(),
false,
cx,
));
for path in paths {
tasks.push(context_store.add_file_from_path(path, false, cx));
}
cx.background_spawn(async move {
futures::future::join_all(tasks).await;
cx.spawn(async move |_, cx| {
let results = futures::future::join_all(tasks).await;
// Show error toasts for any file errors
for result in results {
if let Err(err) = result {
cx.update(|cx| {
if let Some(workspace) = cx
.active_window()
.and_then(|window| window.downcast::<Workspace>())
{
let _ = workspace.update(cx, |workspace, _, cx| {
use workspace::{Toast, notifications::NotificationId};
struct FileLoadErrorToast;
workspace.show_toast(
Toast::new(
NotificationId::unique::<FileLoadErrorToast>(),
err.to_string(),
),
cx,
);
});
}
})
.log_err();
}
}
// Need to hold onto the worktrees until they have already been used when
// opening the buffers.
drop(added_worktrees);


@@ -9,6 +9,7 @@ mod context_picker;
mod context_server_configuration;
mod context_strip;
mod debug;
mod inline_assistant;
mod inline_prompt_editor;
mod language_model_selector;


@@ -399,7 +399,7 @@ impl PickerDelegate for LanguageModelPickerDelegate {
cx: &mut Context<Picker<Self>>,
) -> Task<()> {
let all_models = self.all_models.clone();
let current_index = self.selected_index;
let active_model = (self.get_active_model)(cx);
let bg_executor = cx.background_executor();
let language_model_registry = LanguageModelRegistry::global(cx);
@@ -441,12 +441,9 @@ impl PickerDelegate for LanguageModelPickerDelegate {
cx.spawn_in(window, async move |this, cx| {
this.update_in(cx, |this, window, cx| {
this.delegate.filtered_entries = filtered_models.entries();
// Preserve selection focus
let new_index = if current_index >= this.delegate.filtered_entries.len() {
0
} else {
current_index
};
// Finds the currently selected model in the list
let new_index =
Self::get_active_model_index(&this.delegate.filtered_entries, active_model);
this.set_selected_index(new_index, Some(picker::Direction::Down), true, window, cx);
cx.notify();
})
@@ -659,6 +656,10 @@ mod tests {
false
}
fn max_image_size(&self) -> u64 {
0
}
fn telemetry_id(&self) -> String {
format!("{}/{}", self.provider_id.0, self.name.0)
}


@@ -27,6 +27,7 @@ use file_icons::FileIcons;
use fs::Fs;
use futures::future::Shared;
use futures::{FutureExt as _, future};
use gpui::AsyncApp;
use gpui::{
Animation, AnimationExt, App, Entity, EventEmitter, Focusable, Subscription, Task, TextStyle,
WeakEntity, linear_color_stop, linear_gradient, point, pulsating_between,
@@ -62,6 +63,7 @@ use agent::{
context_store::ContextStore,
thread_store::{TextThreadStore, ThreadStore},
};
use workspace::{Toast, notifications::NotificationId};
#[derive(RegisterComponent)]
pub struct MessageEditor {
@@ -380,11 +382,15 @@ impl MessageEditor {
let checkpoint = git_store.update(cx, |git_store, cx| git_store.checkpoint(cx));
let context_task = self.reload_context(cx);
let window_handle = window.window_handle();
let workspace = self.workspace.clone();
cx.spawn(async move |_this, cx| {
let (checkpoint, loaded_context) = future::join(checkpoint, context_task).await;
let loaded_context = loaded_context.unwrap_or_default();
// Check for rejected images and show notifications
Self::notify_rejected_images(&loaded_context, &model, &workspace, &cx);
thread
.update(cx, |thread, cx| {
thread.insert_user_message(
@@ -412,6 +418,80 @@ impl MessageEditor {
.detach();
}
fn notify_rejected_images(
loaded_context: &agent::context::ContextLoadResult,
model: &Arc<dyn language_model::LanguageModel>,
workspace: &WeakEntity<Workspace>,
cx: &AsyncApp,
) {
let rejected_images = loaded_context.loaded_context.check_image_size_limits(model);
if rejected_images.is_empty() {
return;
}
let workspace = workspace.clone();
let model_name = rejected_images[0].model_name.clone();
let max_size = model.max_image_size();
let count = rejected_images.len();
let rejected_images = rejected_images.clone();
cx.update(|cx| {
if let Some(workspace) = workspace.upgrade() {
workspace.update(cx, |workspace, cx| {
let message = if max_size == 0 {
Self::format_unsupported_images_message(&model_name, count)
} else {
Self::format_size_limit_message(
&model_name,
count,
max_size,
&rejected_images,
)
};
struct ImageRejectionToast;
workspace.show_toast(
Toast::new(NotificationId::unique::<ImageRejectionToast>(), message),
cx,
);
});
}
})
.log_err();
}
fn format_unsupported_images_message(model_name: &str, count: usize) -> String {
let plural = if count > 1 { "s" } else { "" };
format!(
"{} does not support image attachments. {} image{} will be excluded from your message.",
model_name, count, plural
)
}
fn format_size_limit_message(
model_name: &str,
count: usize,
max_size: u64,
rejected_images: &[agent::context::RejectedImage],
) -> String {
let plural = if count > 1 { "s" } else { "" };
let max_size_mb = max_size as f64 / 1_048_576.0;
// If only one image, show its specific size
if count == 1 {
let image_size_mb = rejected_images[0].size as f64 / 1_048_576.0;
format!(
"Image ({:.1} MB) exceeds {}'s {:.1} MB size limit and will be excluded.",
image_size_mb, model_name, max_size_mb
)
} else {
format!(
"{} image{} exceeded {}'s {:.1} MB size limit and will be excluded.",
count, plural, model_name, max_size_mb
)
}
}
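The two formatters above reduce to one pluralization/format rule. The helper below is an illustrative re-derivation for this aside, not part of the diff; the wording mirrors the formatters, but the single-function shape and the `sizes` slice parameter are editorial choices.

```rust
// Illustrative re-derivation of the rejection messages: a 0 limit means
// the model takes no images, one rejected image reports its exact size,
// and multiple rejected images report a count instead.
fn rejection_message(model: &str, sizes: &[u64], max_size: u64) -> String {
    let count = sizes.len();
    let plural = if count > 1 { "s" } else { "" };
    if max_size == 0 {
        return format!(
            "{} does not support image attachments. {} image{} will be excluded from your message.",
            model, count, plural
        );
    }
    let max_mb = max_size as f64 / 1_048_576.0;
    if count == 1 {
        format!(
            "Image ({:.1} MB) exceeds {}'s {:.1} MB size limit and will be excluded.",
            sizes[0] as f64 / 1_048_576.0,
            model,
            max_mb
        )
    } else {
        format!(
            "{} image{} exceeded {}'s {:.1} MB size limit and will be excluded.",
            count, plural, model, max_mb
        )
    }
}

fn main() {
    let msg = rejection_message("Claude", &[6_291_456], 5_242_880);
    assert_eq!(
        msg,
        "Image (6.0 MB) exceeds Claude's 5.0 MB size limit and will be excluded."
    );
    assert!(rejection_message("Claude", &[1, 2], 5_242_880).starts_with("2 images"));
    assert!(rejection_message("M", &[1], 0).contains("does not support"));
}
```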
fn stop_current_and_send_new_message(&mut self, window: &mut Window, cx: &mut Context<Self>) {
self.thread.update(cx, |thread, cx| {
thread.cancel_editing(cx);


@@ -303,7 +303,6 @@ pub enum ComponentScope {
Collaboration,
#[strum(serialize = "Data Display")]
DataDisplay,
Debugger,
Editor,
#[strum(serialize = "Images & Icons")]
Images,


@@ -29,6 +29,7 @@ impl Display for ContextServerId {
#[derive(Deserialize, Serialize, Clone, PartialEq, Eq, JsonSchema)]
pub struct ContextServerCommand {
#[serde(rename = "command")]
pub path: String,
pub args: Vec<String>,
pub env: Option<HashMap<String, String>>,


@@ -4,7 +4,6 @@ mod go;
mod javascript;
mod php;
mod python;
mod ruby;
use std::sync::Arc;
@@ -25,7 +24,6 @@ use gpui::{App, BorrowAppContext};
use javascript::JsDebugAdapter;
use php::PhpDebugAdapter;
use python::PythonDebugAdapter;
use ruby::RubyDebugAdapter;
use serde_json::json;
use task::{DebugScenario, ZedDebugConfig};
@@ -35,7 +33,6 @@ pub fn init(cx: &mut App) {
registry.add_adapter(Arc::from(PythonDebugAdapter::default()));
registry.add_adapter(Arc::from(PhpDebugAdapter::default()));
registry.add_adapter(Arc::from(JsDebugAdapter::default()));
registry.add_adapter(Arc::from(RubyDebugAdapter));
registry.add_adapter(Arc::from(GoDebugAdapter::default()));
registry.add_adapter(Arc::from(GdbDebugAdapter));


@@ -1,208 +0,0 @@
use anyhow::{Result, bail};
use async_trait::async_trait;
use collections::FxHashMap;
use dap::{
DebugRequest, StartDebuggingRequestArguments, StartDebuggingRequestArgumentsRequest,
adapters::{
DapDelegate, DebugAdapter, DebugAdapterBinary, DebugAdapterName, DebugTaskDefinition,
},
};
use gpui::{AsyncApp, SharedString};
use language::LanguageName;
use serde::{Deserialize, Serialize};
use serde_json::json;
use std::path::PathBuf;
use std::{ffi::OsStr, sync::Arc};
use task::{DebugScenario, ZedDebugConfig};
use util::command::new_smol_command;
#[derive(Default)]
pub(crate) struct RubyDebugAdapter;
impl RubyDebugAdapter {
const ADAPTER_NAME: &'static str = "Ruby";
}
#[derive(Serialize, Deserialize)]
struct RubyDebugConfig {
script_or_command: Option<String>,
script: Option<String>,
command: Option<String>,
#[serde(default)]
args: Vec<String>,
#[serde(default)]
env: FxHashMap<String, String>,
cwd: Option<PathBuf>,
}
#[async_trait(?Send)]
impl DebugAdapter for RubyDebugAdapter {
fn name(&self) -> DebugAdapterName {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
fn adapter_language_name(&self) -> Option<LanguageName> {
Some(SharedString::new_static("Ruby").into())
}
async fn request_kind(
&self,
_: &serde_json::Value,
) -> Result<StartDebuggingRequestArgumentsRequest> {
Ok(StartDebuggingRequestArgumentsRequest::Launch)
}
fn dap_schema(&self) -> serde_json::Value {
json!({
"type": "object",
"properties": {
"command": {
"type": "string",
"description": "Command name (ruby, rake, bin/rails, bundle exec ruby, etc)",
},
"script": {
"type": "string",
"description": "Absolute path to a Ruby file."
},
"cwd": {
"type": "string",
"description": "Directory to execute the program in",
"default": "${ZED_WORKTREE_ROOT}"
},
"args": {
"type": "array",
"description": "Command line arguments passed to the program",
"items": {
"type": "string"
},
"default": []
},
"env": {
"type": "object",
"description": "Additional environment variables to pass to the debugging (and debugged) process",
"default": {}
},
}
})
}
async fn config_from_zed_format(&self, zed_scenario: ZedDebugConfig) -> Result<DebugScenario> {
match zed_scenario.request {
DebugRequest::Launch(launch) => {
let config = RubyDebugConfig {
script_or_command: Some(launch.program),
script: None,
command: None,
args: launch.args,
env: launch.env,
cwd: launch.cwd.clone(),
};
let config = serde_json::to_value(config)?;
Ok(DebugScenario {
adapter: zed_scenario.adapter,
label: zed_scenario.label,
config,
tcp_connection: None,
build: None,
})
}
DebugRequest::Attach(_) => {
anyhow::bail!("Attach requests are unsupported");
}
}
}
async fn get_binary(
&self,
delegate: &Arc<dyn DapDelegate>,
definition: &DebugTaskDefinition,
_user_installed_path: Option<PathBuf>,
_user_args: Option<Vec<String>>,
_cx: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
let adapter_path = paths::debug_adapters_dir().join(self.name().as_ref());
let mut rdbg_path = adapter_path.join("rdbg");
if !delegate.fs().is_file(&rdbg_path).await {
match delegate.which("rdbg".as_ref()).await {
Some(path) => rdbg_path = path,
None => {
delegate.output_to_console(
"rdbg not found on path, trying `gem install debug`".to_string(),
);
let output = new_smol_command("gem")
.arg("install")
.arg("--no-document")
.arg("--bindir")
.arg(adapter_path)
.arg("debug")
.output()
.await?;
anyhow::ensure!(
output.status.success(),
"Failed to install rdbg:\n{}",
String::from_utf8_lossy(&output.stderr).to_string()
);
}
}
}
let tcp_connection = definition.tcp_connection.clone().unwrap_or_default();
let (host, port, timeout) = crate::configure_tcp_connection(tcp_connection).await?;
let ruby_config = serde_json::from_value::<RubyDebugConfig>(definition.config.clone())?;
let mut arguments = vec![
"--open".to_string(),
format!("--port={}", port),
format!("--host={}", host),
];
if let Some(script) = &ruby_config.script {
arguments.push(script.clone());
} else if let Some(command) = &ruby_config.command {
arguments.push("--command".to_string());
arguments.push(command.clone());
} else if let Some(command_or_script) = &ruby_config.script_or_command {
if delegate
.which(OsStr::new(&command_or_script))
.await
.is_some()
{
arguments.push("--command".to_string());
}
arguments.push(command_or_script.clone());
} else {
bail!("Ruby debug config must have 'script' or 'command' args");
}
arguments.extend(ruby_config.args);
let mut configuration = definition.config.clone();
if let Some(configuration) = configuration.as_object_mut() {
configuration
.entry("cwd")
.or_insert_with(|| delegate.worktree_root_path().to_string_lossy().into());
}
Ok(DebugAdapterBinary {
command: Some(rdbg_path.to_string_lossy().to_string()),
arguments,
connection: Some(dap::adapters::TcpArguments {
host,
port,
timeout,
}),
cwd: Some(
ruby_config
.cwd
.unwrap_or(delegate.worktree_root_path().to_owned()),
),
envs: ruby_config.env.into_iter().collect(),
request_args: StartDebuggingRequestArguments {
request: self.request_kind(&definition.config).await?,
configuration,
},
})
}
}


@@ -21,7 +21,7 @@ use project::{
use settings::Settings as _;
use std::{
borrow::Cow,
collections::{HashMap, VecDeque},
collections::{BTreeMap, HashMap, VecDeque},
sync::Arc,
};
use util::maybe;
@@ -32,13 +32,6 @@ use workspace::{
ui::{Button, Clickable, ContextMenu, Label, LabelCommon, PopoverMenu, h_flex},
};
// TODO:
// - [x] stop sorting by session ID
// - [x] pick the most recent session by default (logs if available, RPC messages otherwise)
// - [ ] dump the launch/attach request somewhere (logs?)
const MAX_SESSIONS: usize = 10;
struct DapLogView {
editor: Entity<Editor>,
focus_handle: FocusHandle,
@@ -49,14 +42,34 @@ struct DapLogView {
_subscriptions: Vec<Subscription>,
}
struct LogStoreEntryIdentifier<'a> {
session_id: SessionId,
project: Cow<'a, WeakEntity<Project>>,
}
impl LogStoreEntryIdentifier<'_> {
fn to_owned(&self) -> LogStoreEntryIdentifier<'static> {
LogStoreEntryIdentifier {
session_id: self.session_id,
project: Cow::Owned(self.project.as_ref().clone()),
}
}
}
struct LogStoreMessage {
id: LogStoreEntryIdentifier<'static>,
kind: IoKind,
command: Option<SharedString>,
message: SharedString,
}
pub struct LogStore {
projects: HashMap<WeakEntity<Project>, ProjectState>,
debug_sessions: VecDeque<DebugAdapterState>,
rpc_tx: UnboundedSender<(SessionId, IoKind, Option<SharedString>, SharedString)>,
adapter_log_tx: UnboundedSender<(SessionId, IoKind, Option<SharedString>, SharedString)>,
rpc_tx: UnboundedSender<LogStoreMessage>,
adapter_log_tx: UnboundedSender<LogStoreMessage>,
}
struct ProjectState {
debug_sessions: BTreeMap<SessionId, DebugAdapterState>,
_subscriptions: [gpui::Subscription; 2],
}
@@ -122,13 +135,12 @@ impl DebugAdapterState {
impl LogStore {
pub fn new(cx: &Context<Self>) -> Self {
let (rpc_tx, mut rpc_rx) =
unbounded::<(SessionId, IoKind, Option<SharedString>, SharedString)>();
let (rpc_tx, mut rpc_rx) = unbounded::<LogStoreMessage>();
cx.spawn(async move |this, cx| {
while let Some((session_id, io_kind, command, message)) = rpc_rx.next().await {
while let Some(message) = rpc_rx.next().await {
if let Some(this) = this.upgrade() {
this.update(cx, |this, cx| {
this.add_debug_adapter_message(session_id, io_kind, command, message, cx);
this.add_debug_adapter_message(message, cx);
})?;
}
@@ -138,13 +150,12 @@ impl LogStore {
})
.detach_and_log_err(cx);
let (adapter_log_tx, mut adapter_log_rx) =
unbounded::<(SessionId, IoKind, Option<SharedString>, SharedString)>();
let (adapter_log_tx, mut adapter_log_rx) = unbounded::<LogStoreMessage>();
cx.spawn(async move |this, cx| {
while let Some((session_id, io_kind, _, message)) = adapter_log_rx.next().await {
while let Some(message) = adapter_log_rx.next().await {
if let Some(this) = this.upgrade() {
this.update(cx, |this, cx| {
this.add_debug_adapter_log(session_id, io_kind, message, cx);
this.add_debug_adapter_log(message, cx);
})?;
}
@@ -157,57 +168,76 @@ impl LogStore {
rpc_tx,
adapter_log_tx,
projects: HashMap::new(),
debug_sessions: Default::default(),
}
}
pub fn add_project(&mut self, project: &Entity<Project>, cx: &mut Context<Self>) {
let weak_project = project.downgrade();
self.projects.insert(
project.downgrade(),
ProjectState {
_subscriptions: [
cx.observe_release(project, move |this, _, _| {
this.projects.remove(&weak_project);
cx.observe_release(project, {
let weak_project = project.downgrade();
move |this, _, _| {
this.projects.remove(&weak_project);
}
}),
cx.subscribe(
&project.read(cx).dap_store(),
|this, dap_store, event, cx| match event {
cx.subscribe(&project.read(cx).dap_store(), {
let weak_project = project.downgrade();
move |this, dap_store, event, cx| match event {
dap_store::DapStoreEvent::DebugClientStarted(session_id) => {
let session = dap_store.read(cx).session_by_id(session_id);
if let Some(session) = session {
this.add_debug_session(*session_id, session, cx);
this.add_debug_session(
LogStoreEntryIdentifier {
project: Cow::Owned(weak_project.clone()),
session_id: *session_id,
},
session,
cx,
);
}
}
dap_store::DapStoreEvent::DebugClientShutdown(session_id) => {
this.get_debug_adapter_state(*session_id)
.iter_mut()
.for_each(|state| state.is_terminated = true);
let id = LogStoreEntryIdentifier {
project: Cow::Borrowed(&weak_project),
session_id: *session_id,
};
if let Some(state) = this.get_debug_adapter_state(&id) {
state.is_terminated = true;
}
this.clean_sessions(cx);
}
_ => {}
},
),
}
}),
],
debug_sessions: Default::default(),
},
);
}
fn get_debug_adapter_state(&mut self, id: SessionId) -> Option<&mut DebugAdapterState> {
self.debug_sessions
.iter_mut()
.find(|adapter_state| adapter_state.id == id)
fn get_debug_adapter_state(
&mut self,
id: &LogStoreEntryIdentifier<'_>,
) -> Option<&mut DebugAdapterState> {
self.projects
.get_mut(&id.project)
.and_then(|state| state.debug_sessions.get_mut(&id.session_id))
}
fn add_debug_adapter_message(
&mut self,
id: SessionId,
io_kind: IoKind,
command: Option<SharedString>,
message: SharedString,
LogStoreMessage {
id,
kind: io_kind,
command,
message,
}: LogStoreMessage,
cx: &mut Context<Self>,
) {
let Some(debug_client_state) = self.get_debug_adapter_state(id) else {
let Some(debug_client_state) = self.get_debug_adapter_state(&id) else {
return;
};
@@ -229,7 +259,7 @@ impl LogStore {
if rpc_messages.last_message_kind != Some(kind) {
Self::get_debug_adapter_entry(
&mut rpc_messages.messages,
id,
id.to_owned(),
kind.label().into(),
LogKind::Rpc,
cx,
@@ -239,7 +269,7 @@ impl LogStore {
let entry = Self::get_debug_adapter_entry(
&mut rpc_messages.messages,
id,
id.to_owned(),
message,
LogKind::Rpc,
cx,
@@ -260,12 +290,15 @@ impl LogStore {
fn add_debug_adapter_log(
&mut self,
id: SessionId,
io_kind: IoKind,
message: SharedString,
LogStoreMessage {
id,
kind: io_kind,
message,
..
}: LogStoreMessage,
cx: &mut Context<Self>,
) {
let Some(debug_adapter_state) = self.get_debug_adapter_state(id) else {
let Some(debug_adapter_state) = self.get_debug_adapter_state(&id) else {
return;
};
@@ -276,7 +309,7 @@ impl LogStore {
Self::get_debug_adapter_entry(
&mut debug_adapter_state.log_messages,
id,
id.to_owned(),
message,
LogKind::Adapter,
cx,
@@ -286,13 +319,17 @@ impl LogStore {
fn get_debug_adapter_entry(
log_lines: &mut VecDeque<SharedString>,
id: SessionId,
id: LogStoreEntryIdentifier<'static>,
message: SharedString,
kind: LogKind,
cx: &mut Context<Self>,
) -> SharedString {
while log_lines.len() >= RpcMessages::MESSAGE_QUEUE_LIMIT {
log_lines.pop_front();
if let Some(excess) = log_lines
.len()
.checked_sub(RpcMessages::MESSAGE_QUEUE_LIMIT)
&& excess > 0
{
log_lines.drain(..excess);
}
let format_messages = DebuggerSettings::get_global(cx).format_dap_log_messages;
@@ -322,118 +359,111 @@ impl LogStore {
fn add_debug_session(
&mut self,
session_id: SessionId,
id: LogStoreEntryIdentifier<'static>,
session: Entity<Session>,
cx: &mut Context<Self>,
) {
if self
.debug_sessions
.iter_mut()
.any(|adapter_state| adapter_state.id == session_id)
{
return;
}
maybe!({
let project_entry = self.projects.get_mut(&id.project)?;
let std::collections::btree_map::Entry::Vacant(state) =
project_entry.debug_sessions.entry(id.session_id)
else {
return None;
};
let (adapter_name, has_adapter_logs) = session.read_with(cx, |session, _| {
(
session.adapter(),
session
.adapter_client()
.map(|client| client.has_adapter_logs())
.unwrap_or(false),
)
let (adapter_name, has_adapter_logs) = session.read_with(cx, |session, _| {
(
session.adapter(),
session
.adapter_client()
.map_or(false, |client| client.has_adapter_logs()),
)
});
state.insert(DebugAdapterState::new(
id.session_id,
adapter_name,
has_adapter_logs,
));
self.clean_sessions(cx);
let io_tx = self.rpc_tx.clone();
let client = session.read(cx).adapter_client()?;
let project = id.project.clone();
let session_id = id.session_id;
client.add_log_handler(
move |kind, command, message| {
io_tx
.unbounded_send(LogStoreMessage {
id: LogStoreEntryIdentifier {
session_id,
project: project.clone(),
},
kind,
command: command.map(|command| command.to_owned().into()),
message: message.to_owned().into(),
})
.ok();
},
LogKind::Rpc,
);
let log_io_tx = self.adapter_log_tx.clone();
let project = id.project;
client.add_log_handler(
move |kind, command, message| {
log_io_tx
.unbounded_send(LogStoreMessage {
id: LogStoreEntryIdentifier {
session_id,
project: project.clone(),
},
kind,
command: command.map(|command| command.to_owned().into()),
message: message.to_owned().into(),
})
.ok();
},
LogKind::Adapter,
);
Some(())
});
self.debug_sessions.push_back(DebugAdapterState::new(
session_id,
adapter_name,
has_adapter_logs,
));
self.clean_sessions(cx);
let io_tx = self.rpc_tx.clone();
let Some(client) = session.read(cx).adapter_client() else {
return;
};
client.add_log_handler(
move |io_kind, command, message| {
io_tx
.unbounded_send((
session_id,
io_kind,
command.map(|command| command.to_owned().into()),
message.to_owned().into(),
))
.ok();
},
LogKind::Rpc,
);
let log_io_tx = self.adapter_log_tx.clone();
client.add_log_handler(
move |io_kind, command, message| {
log_io_tx
.unbounded_send((
session_id,
io_kind,
command.map(|command| command.to_owned().into()),
message.to_owned().into(),
))
.ok();
},
LogKind::Adapter,
);
}
fn clean_sessions(&mut self, cx: &mut Context<Self>) {
let mut to_remove = self.debug_sessions.len().saturating_sub(MAX_SESSIONS);
self.debug_sessions.retain(|session| {
if to_remove > 0 && session.is_terminated {
to_remove -= 1;
return false;
}
true
self.projects.values_mut().for_each(|project| {
project
.debug_sessions
.retain(|_, session| !session.is_terminated);
});
cx.notify();
}
fn log_messages_for_session(
&mut self,
session_id: SessionId,
id: &LogStoreEntryIdentifier<'_>,
) -> Option<&mut VecDeque<SharedString>> {
self.debug_sessions
.iter_mut()
.find(|session| session.id == session_id)
self.get_debug_adapter_state(id)
.map(|state| &mut state.log_messages)
}
fn rpc_messages_for_session(
&mut self,
session_id: SessionId,
id: &LogStoreEntryIdentifier<'_>,
) -> Option<&mut VecDeque<SharedString>> {
self.debug_sessions.iter_mut().find_map(|state| {
if state.id == session_id {
Some(&mut state.rpc_messages.messages)
} else {
None
}
})
self.get_debug_adapter_state(id)
.map(|state| &mut state.rpc_messages.messages)
}
fn initialization_sequence_for_session(
&mut self,
session_id: SessionId,
) -> Option<&mut Vec<SharedString>> {
self.debug_sessions.iter_mut().find_map(|state| {
if state.id == session_id {
Some(&mut state.rpc_messages.initialization_sequence)
} else {
None
}
})
id: &LogStoreEntryIdentifier<'_>,
) -> Option<&Vec<SharedString>> {
self.get_debug_adapter_state(id)
.map(|state| &state.rpc_messages.initialization_sequence)
}
}
@@ -453,10 +483,11 @@ impl Render for DapLogToolbarItemView {
return Empty.into_any_element();
};
let (menu_rows, current_session_id) = log_view.update(cx, |log_view, cx| {
let (menu_rows, current_session_id, project) = log_view.update(cx, |log_view, cx| {
(
log_view.menu_items(cx),
log_view.current_view.map(|(session_id, _)| session_id),
log_view.project.downgrade(),
)
});
@@ -484,6 +515,7 @@ impl Render for DapLogToolbarItemView {
.menu(move |mut window, cx| {
let log_view = log_view.clone();
let menu_rows = menu_rows.clone();
let project = project.clone();
ContextMenu::build(&mut window, cx, move |mut menu, window, _cx| {
for row in menu_rows.into_iter() {
menu = menu.custom_row(move |_window, _cx| {
@@ -509,8 +541,15 @@ impl Render for DapLogToolbarItemView {
.child(Label::new(ADAPTER_LOGS))
.into_any_element()
},
window.handler_for(&log_view, move |view, window, cx| {
view.show_log_messages_for_adapter(row.session_id, window, cx);
window.handler_for(&log_view, {
let project = project.clone();
let id = LogStoreEntryIdentifier {
project: Cow::Owned(project),
session_id: row.session_id,
};
move |view, window, cx| {
view.show_log_messages_for_adapter(&id, window, cx);
}
}),
);
}
@@ -524,8 +563,15 @@ impl Render for DapLogToolbarItemView {
.child(Label::new(RPC_MESSAGES))
.into_any_element()
},
window.handler_for(&log_view, move |view, window, cx| {
view.show_rpc_trace_for_server(row.session_id, window, cx);
window.handler_for(&log_view, {
let project = project.clone();
let id = LogStoreEntryIdentifier {
project: Cow::Owned(project),
session_id: row.session_id,
};
move |view, window, cx| {
view.show_rpc_trace_for_server(&id, window, cx);
}
}),
)
.custom_entry(
@@ -536,12 +582,17 @@ impl Render for DapLogToolbarItemView {
.child(Label::new(INITIALIZATION_SEQUENCE))
.into_any_element()
},
window.handler_for(&log_view, move |view, window, cx| {
view.show_initialization_sequence_for_server(
row.session_id,
window,
cx,
);
window.handler_for(&log_view, {
let project = project.clone();
let id = LogStoreEntryIdentifier {
project: Cow::Owned(project),
session_id: row.session_id,
};
move |view, window, cx| {
view.show_initialization_sequence_for_server(
&id, window, cx,
);
}
}),
);
}
@@ -613,7 +664,9 @@ impl DapLogView {
let events_subscriptions = cx.subscribe(&log_store, |log_view, _, event, cx| match event {
Event::NewLogEntry { id, entry, kind } => {
if log_view.current_view == Some((*id, *kind)) {
if log_view.current_view == Some((id.session_id, *kind))
&& log_view.project == *id.project
{
log_view.editor.update(cx, |editor, cx| {
editor.set_read_only(false);
let last_point = editor.buffer().read(cx).len(cx);
@@ -629,12 +682,18 @@ impl DapLogView {
}
}
});
let weak_project = project.downgrade();
let state_info = log_store
.read(cx)
.debug_sessions
.back()
.map(|session| (session.id, session.has_adapter_logs));
.projects
.get(&weak_project)
.and_then(|project| {
project
.debug_sessions
.values()
.next_back()
.map(|session| (session.id, session.has_adapter_logs))
});
let mut this = Self {
editor,
@@ -647,10 +706,14 @@ impl DapLogView {
};
if let Some((session_id, have_adapter_logs)) = state_info {
let id = LogStoreEntryIdentifier {
session_id,
project: Cow::Owned(weak_project),
};
if have_adapter_logs {
this.show_log_messages_for_adapter(session_id, window, cx);
this.show_log_messages_for_adapter(&id, window, cx);
} else {
this.show_rpc_trace_for_server(session_id, window, cx);
this.show_rpc_trace_for_server(&id, window, cx);
}
}
@@ -690,31 +753,38 @@ impl DapLogView {
fn menu_items(&self, cx: &App) -> Vec<DapMenuItem> {
self.log_store
.read(cx)
.debug_sessions
.iter()
.rev()
.map(|state| DapMenuItem {
session_id: state.id,
adapter_name: state.adapter_name.clone(),
has_adapter_logs: state.has_adapter_logs,
selected_entry: self.current_view.map_or(LogKind::Adapter, |(_, kind)| kind),
.projects
.get(&self.project.downgrade())
.map_or_else(Vec::new, |state| {
state
.debug_sessions
.values()
.rev()
.map(|state| DapMenuItem {
session_id: state.id,
adapter_name: state.adapter_name.clone(),
has_adapter_logs: state.has_adapter_logs,
selected_entry: self
.current_view
.map_or(LogKind::Adapter, |(_, kind)| kind),
})
.collect::<Vec<_>>()
})
.collect::<Vec<_>>()
}
fn show_rpc_trace_for_server(
&mut self,
session_id: SessionId,
id: &LogStoreEntryIdentifier<'_>,
window: &mut Window,
cx: &mut Context<Self>,
) {
let rpc_log = self.log_store.update(cx, |log_store, _| {
log_store
.rpc_messages_for_session(session_id)
.rpc_messages_for_session(id)
.map(|state| log_contents(state.iter().cloned()))
});
if let Some(rpc_log) = rpc_log {
self.current_view = Some((session_id, LogKind::Rpc));
self.current_view = Some((id.session_id, LogKind::Rpc));
let (editor, editor_subscriptions) = Self::editor_for_logs(rpc_log, window, cx);
let language = self.project.read(cx).languages().language_for_name("JSON");
editor
@@ -725,8 +795,7 @@ impl DapLogView {
.expect("log buffer should be a singleton")
.update(cx, |_, cx| {
cx.spawn({
let buffer = cx.entity();
async move |_, cx| {
async move |buffer, cx| {
let language = language.await.ok();
buffer.update(cx, |buffer, cx| {
buffer.set_language(language, cx);
@@ -746,17 +815,17 @@ impl DapLogView {
fn show_log_messages_for_adapter(
&mut self,
session_id: SessionId,
id: &LogStoreEntryIdentifier<'_>,
window: &mut Window,
cx: &mut Context<Self>,
) {
let message_log = self.log_store.update(cx, |log_store, _| {
log_store
.log_messages_for_session(session_id)
.log_messages_for_session(id)
.map(|state| log_contents(state.iter().cloned()))
});
if let Some(message_log) = message_log {
self.current_view = Some((session_id, LogKind::Adapter));
self.current_view = Some((id.session_id, LogKind::Adapter));
let (editor, editor_subscriptions) = Self::editor_for_logs(message_log, window, cx);
editor
.read(cx)
@@ -775,17 +844,17 @@ impl DapLogView {
fn show_initialization_sequence_for_server(
&mut self,
session_id: SessionId,
id: &LogStoreEntryIdentifier<'_>,
window: &mut Window,
cx: &mut Context<Self>,
) {
let rpc_log = self.log_store.update(cx, |log_store, _| {
log_store
.initialization_sequence_for_session(session_id)
.initialization_sequence_for_session(id)
.map(|state| log_contents(state.iter().cloned()))
});
if let Some(rpc_log) = rpc_log {
self.current_view = Some((session_id, LogKind::Rpc));
self.current_view = Some((id.session_id, LogKind::Rpc));
let (editor, editor_subscriptions) = Self::editor_for_logs(rpc_log, window, cx);
let language = self.project.read(cx).languages().language_for_name("JSON");
editor
@@ -993,9 +1062,9 @@ impl Focusable for DapLogView {
}
}
pub enum Event {
enum Event {
NewLogEntry {
id: SessionId,
id: LogStoreEntryIdentifier<'static>,
entry: SharedString,
kind: LogKind,
},
@@ -1008,31 +1077,30 @@ impl EventEmitter<SearchEvent> for DapLogView {}
#[cfg(any(test, feature = "test-support"))]
impl LogStore {
pub fn contained_session_ids(&self) -> Vec<SessionId> {
self.debug_sessions
.iter()
.map(|session| session.id)
.collect()
pub fn has_projects(&self) -> bool {
!self.projects.is_empty()
}
pub fn rpc_messages_for_session_id(&self, session_id: SessionId) -> Vec<SharedString> {
self.debug_sessions
.iter()
.find(|adapter_state| adapter_state.id == session_id)
.expect("This session should exist if a test is calling")
.rpc_messages
.messages
.clone()
.into()
pub fn contained_session_ids(&self, project: &WeakEntity<Project>) -> Vec<SessionId> {
self.projects.get(project).map_or(vec![], |state| {
state.debug_sessions.keys().copied().collect()
})
}
pub fn log_messages_for_session_id(&self, session_id: SessionId) -> Vec<SharedString> {
self.debug_sessions
.iter()
.find(|adapter_state| adapter_state.id == session_id)
.expect("This session should exist if a test is calling")
.log_messages
.clone()
.into()
pub fn rpc_messages_for_session_id(
&self,
project: &WeakEntity<Project>,
session_id: SessionId,
) -> Vec<SharedString> {
self.projects.get(&project).map_or(vec![], |state| {
state
.debug_sessions
.get(&session_id)
.expect("This session should exist if a test is calling")
.rpc_messages
.messages
.clone()
.into()
})
}
}
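The queue-trimming change in `get_debug_adapter_entry` above replaces a pop-one-per-iteration loop with a single `drain` of the computed excess. A minimal sketch of that pattern, with `LIMIT` standing in for `RpcMessages::MESSAGE_QUEUE_LIMIT`:

```rust
use std::collections::VecDeque;

// Stand-in for RpcMessages::MESSAGE_QUEUE_LIMIT.
const LIMIT: usize = 4;

// Trim the front of the queue so at most LIMIT entries remain,
// draining the whole excess in one call instead of popping in a loop.
fn trim_to_limit<T>(log_lines: &mut VecDeque<T>) {
    if let Some(excess) = log_lines.len().checked_sub(LIMIT) {
        if excess > 0 {
            log_lines.drain(..excess);
        }
    }
}

fn main() {
    let mut queue: VecDeque<u32> = (0..7).collect();
    trim_to_limit(&mut queue);
    // The three oldest entries (0, 1, 2) are dropped from the front.
    assert_eq!(queue, VecDeque::from(vec![3, 4, 5, 6]));
}
```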

@@ -32,14 +32,10 @@ bitflags.workspace = true
client.workspace = true
collections.workspace = true
command_palette_hooks.workspace = true
component.workspace = true
dap.workspace = true
dap_adapters = { workspace = true, optional = true }
db.workspace = true
debugger_tools = { workspace = true, optional = true }
editor.workspace = true
env_logger = { workspace = true, optional = true }
feature_flags.workspace = true
file_icons.workspace = true
futures.workspace = true
fuzzy.workspace = true
@@ -68,10 +64,11 @@ theme.workspace = true
tree-sitter.workspace = true
tree-sitter-json.workspace = true
ui.workspace = true
unindent = { workspace = true, optional = true }
util.workspace = true
workspace-hack.workspace = true
workspace.workspace = true
workspace-hack.workspace = true
debugger_tools = { workspace = true, optional = true }
unindent = { workspace = true, optional = true }
zed_actions.workspace = true
[dev-dependencies]

@@ -868,7 +868,7 @@ impl DebugPanel {
let threads =
running_state.update(cx, |running_state, cx| {
let session = running_state.session();
session.read(cx).is_running().then(|| {
session.read(cx).is_started().then(|| {
session.update(cx, |session, cx| {
session.threads(cx)
})
@@ -1298,6 +1298,11 @@ impl Render for DebugPanel {
}
v_flex()
.when_else(
self.position(window, cx) == DockPosition::Bottom,
|this| this.max_h(self.size),
|this| this.max_w(self.size),
)
.size_full()
.key_context("DebugPanel")
.child(h_flex().children(self.top_controls_strip(window, cx)))
@@ -1468,6 +1473,94 @@ impl Render for DebugPanel {
if has_sessions {
this.children(self.active_session.clone())
} else {
let docked_to_bottom = self.position(window, cx) == DockPosition::Bottom;
let welcome_experience = v_flex()
.when_else(
docked_to_bottom,
|this| this.w_2_3().h_full().pr_8(),
|this| this.w_full().h_1_3(),
)
.items_center()
.justify_center()
.gap_2()
.child(
Button::new("spawn-new-session-empty-state", "New Session")
.icon(IconName::Plus)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(|_, window, cx| {
window.dispatch_action(crate::Start.boxed_clone(), cx);
}),
)
.child(
Button::new("edit-debug-settings", "Edit debug.json")
.icon(IconName::Code)
.icon_size(IconSize::XSmall)
.color(Color::Muted)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(|_, window, cx| {
window.dispatch_action(
zed_actions::OpenProjectDebugTasks.boxed_clone(),
cx,
);
}),
)
.child(
Button::new("open-debugger-docs", "Debugger Docs")
.icon(IconName::Book)
.color(Color::Muted)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(|_, _, cx| cx.open_url("https://zed.dev/docs/debugger")),
)
.child(
Button::new(
"spawn-new-session-install-extensions",
"Debugger Extensions",
)
.icon(IconName::Blocks)
.color(Color::Muted)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(|_, window, cx| {
window.dispatch_action(
zed_actions::Extensions {
category_filter: Some(
zed_actions::ExtensionCategoryFilter::DebugAdapters,
),
}
.boxed_clone(),
cx,
);
}),
);
let breakpoint_list =
v_flex()
.group("base-breakpoint-list")
.items_start()
.when_else(
docked_to_bottom,
|this| this.min_w_1_3().h_full(),
|this| this.w_full().h_2_3(),
)
.p_1()
.child(
h_flex()
.pl_1()
.w_full()
.justify_between()
.child(Label::new("Breakpoints").size(LabelSize::Small))
.child(h_flex().visible_on_hover("base-breakpoint-list").child(
self.breakpoint_list.read(cx).render_control_strip(),
))
.track_focus(&self.breakpoint_list.focus_handle(cx)),
)
.child(Divider::horizontal())
.child(self.breakpoint_list.clone());
this.child(
v_flex()
.h_full()
@@ -1475,65 +1568,23 @@ impl Render for DebugPanel {
.items_center()
.justify_center()
.child(
h_flex().size_full()
.items_start()
.child(v_flex().group("base-breakpoint-list").items_start().min_w_1_3().h_full().p_1()
.child(h_flex().pl_1().w_full().justify_between()
.child(Label::new("Breakpoints").size(LabelSize::Small))
.child(h_flex().visible_on_hover("base-breakpoint-list").child(self.breakpoint_list.read(cx).render_control_strip())))
.child(Divider::horizontal())
.child(self.breakpoint_list.clone()))
.child(Divider::vertical())
.child(
v_flex().w_2_3().h_full().items_center().justify_center()
.gap_2()
.pr_8()
.child(
Button::new("spawn-new-session-empty-state", "New Session")
.icon(IconName::Plus)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(|_, window, cx| {
window.dispatch_action(crate::Start.boxed_clone(), cx);
})
)
.child(
Button::new("edit-debug-settings", "Edit debug.json")
.icon(IconName::Code)
.icon_size(IconSize::XSmall)
.color(Color::Muted)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(|_, window, cx| {
window.dispatch_action(zed_actions::OpenProjectDebugTasks.boxed_clone(), cx);
})
)
.child(
Button::new("open-debugger-docs", "Debugger Docs")
.icon(IconName::Book)
.color(Color::Muted)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(|_, _, cx| {
cx.open_url("https://zed.dev/docs/debugger")
})
)
.child(
Button::new("spawn-new-session-install-extensions", "Debugger Extensions")
.icon(IconName::Blocks)
.color(Color::Muted)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(|_, window, cx| {
window.dispatch_action(zed_actions::Extensions { category_filter: Some(zed_actions::ExtensionCategoryFilter::DebugAdapters)}.boxed_clone(), cx);
})
)
)
)
div()
.when_else(docked_to_bottom, Div::h_flex, Div::v_flex)
.size_full()
.map(|this| {
if docked_to_bottom {
this.items_start()
.child(breakpoint_list)
.child(Divider::vertical())
.child(welcome_experience)
} else {
this.items_end()
.child(welcome_experience)
.child(Divider::horizontal())
.child(breakpoint_list)
}
}),
),
)
}
})

@@ -878,19 +878,30 @@ impl LineBreakpoint {
.cursor_pointer()
.child(
h_flex()
.gap_1()
.gap_0p5()
.child(
Label::new(format!("{}:{}", self.name, self.line))
.size(LabelSize::Small)
.line_height_style(ui::LineHeightStyle::UiLabel),
)
.children(self.dir.clone().map(|dir| {
Label::new(dir)
.color(Color::Muted)
.size(LabelSize::Small)
.line_height_style(ui::LineHeightStyle::UiLabel)
.children(self.dir.as_ref().and_then(|dir| {
let path_without_root = Path::new(dir.as_ref())
.components()
.skip(1)
.collect::<PathBuf>();
path_without_root.components().next()?;
Some(
Label::new(path_without_root.to_string_lossy().into_owned())
.color(Color::Muted)
.size(LabelSize::Small)
.line_height_style(ui::LineHeightStyle::UiLabel)
.truncate(),
)
})),
)
.when_some(self.dir.as_ref(), |this, parent_dir| {
this.tooltip(Tooltip::text(format!("Worktree parent path: {parent_dir}")))
})
.child(BreakpointOptionsStrip {
props,
breakpoint: BreakpointEntry {
@@ -1234,14 +1245,15 @@ impl RenderOnce for BreakpointOptionsStrip {
};
h_flex()
.gap_2()
.gap_1()
.child(
div() .map(self.add_border(ActiveBreakpointStripMode::Log, supports_logs, window, cx))
div().map(self.add_border(ActiveBreakpointStripMode::Log, supports_logs, window, cx))
.child(
IconButton::new(
SharedString::from(format!("{id}-log-toggle")),
IconName::ScrollText,
)
.icon_size(IconSize::XSmall)
.style(style_for_toggle(ActiveBreakpointStripMode::Log, has_logs))
.icon_color(color_for_toggle(has_logs))
.disabled(!supports_logs)
@@ -1261,6 +1273,7 @@ impl RenderOnce for BreakpointOptionsStrip {
SharedString::from(format!("{id}-condition-toggle")),
IconName::SplitAlt,
)
.icon_size(IconSize::XSmall)
.style(style_for_toggle(
ActiveBreakpointStripMode::Condition,
has_condition
@@ -1274,7 +1287,7 @@ impl RenderOnce for BreakpointOptionsStrip {
.when(!has_condition && !self.is_selected, |this| this.invisible()),
)
.child(
div() .map(self.add_border(
div().map(self.add_border(
ActiveBreakpointStripMode::HitCondition,
supports_hit_condition, window, cx
))
@@ -1283,6 +1296,7 @@ impl RenderOnce for BreakpointOptionsStrip {
SharedString::from(format!("{id}-hit-condition-toggle")),
IconName::ArrowDown10,
)
.icon_size(IconSize::XSmall)
.style(style_for_toggle(
ActiveBreakpointStripMode::HitCondition,
has_hit_condition,

@@ -114,7 +114,7 @@ impl Console {
}
fn is_running(&self, cx: &Context<Self>) -> bool {
self.session.read(cx).is_running()
self.session.read(cx).is_started()
}
fn handle_stack_frame_list_events(

@@ -37,15 +37,23 @@ async fn test_dap_logger_captures_all_session_rpc_messages(
.await;
assert!(
log_store.read_with(cx, |log_store, _| log_store
.contained_session_ids()
.is_empty()),
"log_store shouldn't contain any session IDs before any sessions were created"
log_store.read_with(cx, |log_store, _| !log_store.has_projects()),
"log_store shouldn't contain any projects before any projects were created"
);
let project = Project::test(fs, [path!("/project").as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
assert!(
log_store.read_with(cx, |log_store, _| log_store.has_projects()),
"log_store should contain the project after it was created"
);
assert!(
log_store.read_with(cx, |log_store, _| log_store
.contained_session_ids(&project.downgrade())
.is_empty()),
"log_store shouldn't contain any session IDs before any sessions were created"
);
let cx = &mut VisualTestContext::from_window(*workspace, cx);
// Start a debug session
@@ -54,20 +62,22 @@ async fn test_dap_logger_captures_all_session_rpc_messages(
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
assert_eq!(
log_store.read_with(cx, |log_store, _| log_store.contained_session_ids().len()),
log_store.read_with(cx, |log_store, _| log_store
.contained_session_ids(&project.downgrade())
.len()),
1,
);
assert!(
log_store.read_with(cx, |log_store, _| log_store
.contained_session_ids()
.contained_session_ids(&project.downgrade())
.contains(&session_id)),
"log_store should contain the session IDs of the started session"
);
assert!(
!log_store.read_with(cx, |log_store, _| log_store
.rpc_messages_for_session_id(session_id)
.rpc_messages_for_session_id(&project.downgrade(), session_id)
.is_empty()),
"We should have the initialization sequence in the log store"
);

@@ -267,7 +267,6 @@ async fn test_dap_adapter_config_conversion_and_validation(cx: &mut TestAppConte
"Debugpy",
"PHP",
"JavaScript",
"Ruby",
"Delve",
"GDB",
"fake-adapter",

@@ -37,7 +37,9 @@ pub use block_map::{
use block_map::{BlockRow, BlockSnapshot};
use collections::{HashMap, HashSet};
pub use crease_map::*;
pub use fold_map::{ChunkRenderer, ChunkRendererContext, Fold, FoldId, FoldPlaceholder, FoldPoint};
pub use fold_map::{
ChunkRenderer, ChunkRendererContext, ChunkRendererId, Fold, FoldId, FoldPlaceholder, FoldPoint,
};
use fold_map::{FoldMap, FoldSnapshot};
use gpui::{App, Context, Entity, Font, HighlightStyle, LineLayout, Pixels, UnderlineStyle};
pub use inlay_map::Inlay;
@@ -538,7 +540,7 @@ impl DisplayMap {
pub fn update_fold_widths(
&mut self,
widths: impl IntoIterator<Item = (FoldId, Pixels)>,
widths: impl IntoIterator<Item = (ChunkRendererId, Pixels)>,
cx: &mut Context<Self>,
) -> bool {
let snapshot = self.buffer.read(cx).snapshot(cx);

@@ -1,3 +1,5 @@
use crate::{InlayId, display_map::inlay_map::InlayChunk};
use super::{
Highlights,
inlay_map::{InlayBufferRows, InlayChunks, InlayEdit, InlayOffset, InlayPoint, InlaySnapshot},
@@ -275,13 +277,16 @@ impl FoldMapWriter<'_> {
pub(crate) fn update_fold_widths(
&mut self,
new_widths: impl IntoIterator<Item = (FoldId, Pixels)>,
new_widths: impl IntoIterator<Item = (ChunkRendererId, Pixels)>,
) -> (FoldSnapshot, Vec<FoldEdit>) {
let mut edits = Vec::new();
let inlay_snapshot = self.0.snapshot.inlay_snapshot.clone();
let buffer = &inlay_snapshot.buffer;
for (id, new_width) in new_widths {
let ChunkRendererId::Fold(id) = id else {
continue;
};
if let Some(metadata) = self.0.snapshot.fold_metadata_by_id.get(&id).cloned() {
if Some(new_width) != metadata.width {
let buffer_start = metadata.range.start.to_offset(buffer);
@@ -527,7 +532,7 @@ impl FoldMap {
placeholder: Some(TransformPlaceholder {
text: ELLIPSIS,
renderer: ChunkRenderer {
id: fold.id,
id: ChunkRendererId::Fold(fold.id),
render: Arc::new(move |cx| {
(fold.placeholder.render)(
fold_id,
@@ -1060,7 +1065,7 @@ impl sum_tree::Summary for TransformSummary {
}
#[derive(Copy, Clone, Eq, PartialEq, Debug, Default, Ord, PartialOrd, Hash)]
pub struct FoldId(usize);
pub struct FoldId(pub(super) usize);
impl From<FoldId> for ElementId {
fn from(val: FoldId) -> Self {
@@ -1265,11 +1270,17 @@ pub struct Chunk<'a> {
pub renderer: Option<ChunkRenderer>,
}
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
pub enum ChunkRendererId {
Fold(FoldId),
Inlay(InlayId),
}
/// A recipe for how the chunk should be presented.
#[derive(Clone)]
pub struct ChunkRenderer {
/// The id of the fold associated with this chunk.
pub id: FoldId,
/// The id of the renderer associated with this chunk.
pub id: ChunkRendererId,
/// Creates a custom element to represent this chunk.
pub render: Arc<dyn Send + Sync + Fn(&mut ChunkRendererContext) -> AnyElement>,
/// If true, the element is constrained to the shaped width of the text.
@@ -1311,7 +1322,7 @@ impl DerefMut for ChunkRendererContext<'_, '_> {
pub struct FoldChunks<'a> {
transform_cursor: Cursor<'a, Transform, (FoldOffset, InlayOffset)>,
inlay_chunks: InlayChunks<'a>,
inlay_chunk: Option<(InlayOffset, language::Chunk<'a>)>,
inlay_chunk: Option<(InlayOffset, InlayChunk<'a>)>,
inlay_offset: InlayOffset,
output_offset: FoldOffset,
max_output_offset: FoldOffset,
@@ -1403,7 +1414,8 @@ impl<'a> Iterator for FoldChunks<'a> {
}
// Otherwise, take a chunk from the buffer's text.
if let Some((buffer_chunk_start, mut chunk)) = self.inlay_chunk.clone() {
if let Some((buffer_chunk_start, mut inlay_chunk)) = self.inlay_chunk.clone() {
let chunk = &mut inlay_chunk.chunk;
let buffer_chunk_end = buffer_chunk_start + InlayOffset(chunk.text.len());
let transform_end = self.transform_cursor.end(&()).1;
let chunk_end = buffer_chunk_end.min(transform_end);
@@ -1428,7 +1440,7 @@ impl<'a> Iterator for FoldChunks<'a> {
is_tab: chunk.is_tab,
is_inlay: chunk.is_inlay,
underline: chunk.underline,
renderer: None,
renderer: inlay_chunk.renderer,
});
}

View File

@@ -1,4 +1,4 @@
use crate::{HighlightStyles, InlayId};
use crate::{ChunkRenderer, HighlightStyles, InlayId};
use collections::BTreeSet;
use gpui::{Hsla, Rgba};
use language::{Chunk, Edit, Point, TextSummary};
@@ -8,11 +8,13 @@ use multi_buffer::{
use std::{
cmp,
ops::{Add, AddAssign, Range, Sub, SubAssign},
sync::Arc,
};
use sum_tree::{Bias, Cursor, SumTree};
use text::{Patch, Rope};
use ui::{ActiveTheme, IntoElement as _, ParentElement as _, Styled as _, div};
use super::{Highlights, custom_highlights::CustomHighlightsChunks};
use super::{Highlights, custom_highlights::CustomHighlightsChunks, fold_map::ChunkRendererId};
/// Decides where the [`Inlay`]s should be displayed.
///
@@ -252,6 +254,13 @@ pub struct InlayChunks<'a> {
snapshot: &'a InlaySnapshot,
}
#[derive(Clone)]
pub struct InlayChunk<'a> {
pub chunk: Chunk<'a>,
/// Whether the inlay should be rendered with a custom renderer.
pub renderer: Option<ChunkRenderer>,
}
impl InlayChunks<'_> {
pub fn seek(&mut self, new_range: Range<InlayOffset>) {
self.transforms.seek(&new_range.start, Bias::Right, &());
@@ -271,7 +280,7 @@ impl InlayChunks<'_> {
}
impl<'a> Iterator for InlayChunks<'a> {
type Item = Chunk<'a>;
type Item = InlayChunk<'a>;
fn next(&mut self) -> Option<Self::Item> {
if self.output_offset == self.max_output_offset {
@@ -296,9 +305,12 @@ impl<'a> Iterator for InlayChunks<'a> {
chunk.text = suffix;
self.output_offset.0 += prefix.len();
Chunk {
text: prefix,
..chunk.clone()
InlayChunk {
chunk: Chunk {
text: prefix,
..chunk.clone()
},
renderer: None,
}
}
Transform::Inlay(inlay) => {
@@ -313,6 +325,7 @@ impl<'a> Iterator for InlayChunks<'a> {
}
}
let mut renderer = None;
let mut highlight_style = match inlay.id {
InlayId::InlineCompletion(_) => {
self.highlight_styles.inline_completion.map(|s| {
@@ -325,14 +338,31 @@ impl<'a> Iterator for InlayChunks<'a> {
}
InlayId::Hint(_) => self.highlight_styles.inlay_hint,
InlayId::DebuggerValue(_) => self.highlight_styles.inlay_hint,
InlayId::Color(_) => match inlay.color {
Some(color) => {
let mut style = self.highlight_styles.inlay_hint.unwrap_or_default();
style.color = Some(color);
Some(style)
InlayId::Color(_) => {
if let Some(color) = inlay.color {
renderer = Some(ChunkRenderer {
id: ChunkRendererId::Inlay(inlay.id),
render: Arc::new(move |cx| {
div()
.relative()
.size_3p5()
.child(
div()
.absolute()
.right_1()
.size_3()
.border_1()
.border_color(cx.theme().colors().border)
.bg(color),
)
.into_any_element()
}),
constrain_width: false,
measured_width: None,
});
}
None => self.highlight_styles.inlay_hint,
},
self.highlight_styles.inlay_hint
}
};
let next_inlay_highlight_endpoint;
let offset_in_inlay = self.output_offset - self.transforms.start().0;
@@ -370,11 +400,14 @@ impl<'a> Iterator for InlayChunks<'a> {
self.output_offset.0 += chunk.len();
Chunk {
text: chunk,
highlight_style,
is_inlay: true,
..Default::default()
InlayChunk {
chunk: Chunk {
text: chunk,
highlight_style,
is_inlay: true,
..Chunk::default()
},
renderer,
}
}
};
@@ -1066,7 +1099,7 @@ impl InlaySnapshot {
#[cfg(test)]
pub fn text(&self) -> String {
self.chunks(Default::default()..self.len(), false, Highlights::default())
.map(|chunk| chunk.text)
.map(|chunk| chunk.chunk.text)
.collect()
}
@@ -1704,7 +1737,7 @@ mod tests {
..Highlights::default()
},
)
.map(|chunk| chunk.text)
.map(|chunk| chunk.chunk.text)
.collect::<String>();
assert_eq!(
actual_text,

View File

@@ -21,7 +21,6 @@ mod editor_settings;
mod editor_settings_controls;
mod element;
mod git;
mod gutter;
mod highlight_matching_bracket;
mod hover_links;
pub mod hover_popover;
@@ -202,8 +201,8 @@ use theme::{
observe_buffer_font_size_adjustment,
};
use ui::{
ButtonSize, ContextMenu, Disclosure, IconButton, IconButtonShape, IconName, IconSize,
Indicator, Key, Tooltip, h_flex, prelude::*,
ButtonSize, ButtonStyle, ContextMenu, Disclosure, IconButton, IconButtonShape, IconName,
IconSize, Indicator, Key, Tooltip, h_flex, prelude::*,
};
use util::{RangeExt, ResultExt, TryFutureExt, maybe, post_inc};
use workspace::{
@@ -548,6 +547,7 @@ pub enum SoftWrap {
#[derive(Clone)]
pub struct EditorStyle {
pub background: Hsla,
pub border: Hsla,
pub local_player: PlayerColor,
pub text: TextStyle,
pub scrollbar_width: Pixels,
@@ -563,6 +563,7 @@ impl Default for EditorStyle {
fn default() -> Self {
Self {
background: Hsla::default(),
border: Hsla::default(),
local_player: PlayerColor::default(),
text: TextStyle::default(),
scrollbar_width: Pixels::default(),
@@ -7978,6 +7979,121 @@ impl Editor {
})
}
fn render_breakpoint(
&self,
position: Anchor,
row: DisplayRow,
breakpoint: &Breakpoint,
state: Option<BreakpointSessionState>,
cx: &mut Context<Self>,
) -> IconButton {
let is_rejected = state.is_some_and(|s| !s.verified);
// Is this the phantom breakpoint shown when hovering over the gutter?
let (is_phantom, collides_with_existing) = self.gutter_breakpoint_indicator.0.map_or(
(false, false),
|PhantomBreakpointIndicator {
is_active,
display_row,
collides_with_existing_breakpoint,
}| {
(
is_active && display_row == row,
collides_with_existing_breakpoint,
)
},
);
let (color, icon) = {
let icon = match (&breakpoint.message.is_some(), breakpoint.is_disabled()) {
(false, false) => ui::IconName::DebugBreakpoint,
(true, false) => ui::IconName::DebugLogBreakpoint,
(false, true) => ui::IconName::DebugDisabledBreakpoint,
(true, true) => ui::IconName::DebugDisabledLogBreakpoint,
};
let color = if is_phantom {
Color::Hint
} else if is_rejected {
Color::Disabled
} else {
Color::Debugger
};
(color, icon)
};
let breakpoint = Arc::from(breakpoint.clone());
let alt_as_text = gpui::Keystroke {
modifiers: Modifiers::secondary_key(),
..Default::default()
};
let primary_action_text = if breakpoint.is_disabled() {
"Enable breakpoint"
} else if is_phantom && !collides_with_existing {
"Set breakpoint"
} else {
"Unset breakpoint"
};
let focus_handle = self.focus_handle.clone();
let meta = if is_rejected {
SharedString::from("No executable code is associated with this line.")
} else if collides_with_existing && !breakpoint.is_disabled() {
SharedString::from(format!(
"{alt_as_text}-click to disable,\nright-click for more options."
))
} else {
SharedString::from("Right-click for more options.")
};
IconButton::new(("breakpoint_indicator", row.0 as usize), icon)
.icon_size(IconSize::XSmall)
.size(ui::ButtonSize::None)
.when(is_rejected, |this| {
this.indicator(Indicator::icon(Icon::new(IconName::Warning)).color(Color::Warning))
})
.icon_color(color)
.style(ButtonStyle::Transparent)
.on_click(cx.listener({
let breakpoint = breakpoint.clone();
move |editor, event: &ClickEvent, window, cx| {
let edit_action = if event.modifiers().platform || breakpoint.is_disabled() {
BreakpointEditAction::InvertState
} else {
BreakpointEditAction::Toggle
};
window.focus(&editor.focus_handle(cx));
editor.edit_breakpoint_at_anchor(
position,
breakpoint.as_ref().clone(),
edit_action,
cx,
);
}
}))
.on_right_click(cx.listener(move |editor, event: &ClickEvent, window, cx| {
editor.set_breakpoint_context_menu(
row,
Some(position),
event.down.position,
window,
cx,
);
}))
.tooltip(move |window, cx| {
Tooltip::with_meta_in(
primary_action_text,
Some(&ToggleBreakpoint),
meta.clone(),
&focus_handle,
window,
cx,
)
})
}
fn build_tasks_context(
project: &Entity<Project>,
buffer: &Entity<Buffer>,
@@ -17217,9 +17333,9 @@ impl Editor {
self.active_indent_guides_state.dirty = true;
}
pub fn update_fold_widths(
pub fn update_renderer_widths(
&mut self,
widths: impl IntoIterator<Item = (FoldId, Pixels)>,
widths: impl IntoIterator<Item = (ChunkRendererId, Pixels)>,
cx: &mut Context<Self>,
) -> bool {
self.display_map
@@ -22291,6 +22407,7 @@ impl Render for Editor {
&cx.entity(),
EditorStyle {
background,
border: cx.theme().colors().border,
local_player: cx.theme().players().local(),
text: text_style,
scrollbar_width: EditorElement::SCROLLBAR_WIDTH,

View File

@@ -12,8 +12,8 @@ use crate::{
ToggleFold,
code_context_menus::{CodeActionsMenu, MENU_ASIDE_MAX_WIDTH, MENU_ASIDE_MIN_WIDTH, MENU_GAP},
display_map::{
Block, BlockContext, BlockStyle, DisplaySnapshot, EditorMargins, FoldId, HighlightKey,
HighlightedChunk, ToDisplayPoint,
Block, BlockContext, BlockStyle, ChunkRendererId, DisplaySnapshot, EditorMargins,
HighlightKey, HighlightedChunk, ToDisplayPoint,
},
editor_settings::{
CurrentLineHighlight, DocumentColorsRenderMode, DoubleClickInMultibuffer, Minimap,
@@ -21,7 +21,6 @@ use crate::{
ScrollbarDiagnostics, ShowMinimap, ShowScrollbar,
},
git::blame::{BlameRenderer, GitBlame, GlobalBlameRenderer},
gutter::breakpoint_indicator::breakpoint_indicator_path,
hover_popover::{
self, HOVER_POPOVER_GAP, MIN_POPOVER_CHARACTER_WIDTH, MIN_POPOVER_LINE_HEIGHT,
POPOVER_RIGHT_OFFSET, hover_at,
@@ -31,7 +30,6 @@ use crate::{
mouse_context_menu::{self, MenuPosition},
scroll::{ActiveScrollbarState, ScrollbarThumbState, scroll_amount::ScrollAmount},
};
use buffer_diff::{DiffHunkStatus, DiffHunkStatusKind};
use collections::{BTreeMap, HashMap};
use file_icons::FileIcons;
@@ -44,12 +42,12 @@ use gpui::{
Action, Along, AnyElement, App, AppContext, AvailableSpace, Axis as ScrollbarAxis, BorderStyle,
Bounds, ClickEvent, ContentMask, Context, Corner, Corners, CursorStyle, DispatchPhase, Edges,
Element, ElementInputHandler, Entity, Focusable as _, FontId, GlobalElementId, Hitbox,
HitboxBehavior, Hsla, InteractiveElement, IntoElement, IsZero, Keystroke, Length, Modifiers,
HitboxBehavior, Hsla, InteractiveElement, IntoElement, IsZero, Keystroke, Length,
ModifiersChangedEvent, MouseButton, MouseDownEvent, MouseMoveEvent, MouseUpEvent, PaintQuad,
ParentElement, Pixels, ScrollDelta, ScrollHandle, ScrollWheelEvent, ShapedLine, SharedString,
Size, StatefulInteractiveElement, Style, Styled, TextRun, TextStyleRefinement, WeakEntity,
Window, anchored, canvas, deferred, div, fill, linear_color_stop, linear_gradient, outline,
point, px, quad, relative, size, solid_background, transparent_black,
Window, anchored, deferred, div, fill, linear_color_stop, linear_gradient, outline, point, px,
quad, relative, size, solid_background, transparent_black,
};
use itertools::Itertools;
use language::language_settings::{
@@ -63,7 +61,7 @@ use multi_buffer::{
use project::{
ProjectPath,
debugger::breakpoint_store::{Breakpoint, BreakpointEditAction, BreakpointSessionState},
debugger::breakpoint_store::{Breakpoint, BreakpointSessionState},
project_settings::{GitGutterSetting, GitHunkStyleSetting, ProjectSettings},
};
use settings::Settings;
@@ -2759,16 +2757,7 @@ impl EditorElement {
return None;
}
let button = self.render_breakpoint(
editor,
snapshot,
text_anchor,
display_row,
row,
&bp,
window,
cx,
);
let button = editor.render_breakpoint(text_anchor, display_row, &bp, state, cx);
let button = prepaint_gutter_button(
button,
@@ -2787,184 +2776,6 @@ impl EditorElement {
})
}
fn render_breakpoint(
&self,
editor: &Editor,
snapshot: &EditorSnapshot,
position: Anchor,
row: DisplayRow,
multibuffer_row: MultiBufferRow,
breakpoint: &Breakpoint,
window: &mut Window,
cx: &mut App,
) -> AnyElement {
let element_id =
ElementId::Name(format!("breakpoint_indicator_{}", multibuffer_row.0).into());
// === Extract text style and calculate dimensions ===
let text_style = self.style.text.clone();
let font_size = text_style.font_size;
let font_size_px = font_size.to_pixels(window.rem_size());
let rem_size = window.rem_size();
// Calculate font scale relative to a baseline font size
const BASELINE_FONT_SIZE: f32 = 14.0; // Default editor font size
let font_scale = font_size_px / px(BASELINE_FONT_SIZE);
let line_height: Pixels = text_style.line_height.to_pixels(font_size, rem_size);
// Debug font metrics
dbg!(font_size);
dbg!(font_size_px);
dbg!(rem_size);
dbg!(BASELINE_FONT_SIZE);
dbg!(font_scale);
dbg!(line_height);
// Helper to scale pixel values based on font size
let scale_px = |value: f32| px(value) * font_scale;
const HORIZONTAL_OFFSET: f32 = 40.0;
const VERTICAL_OFFSET: f32 = 4.0;
let horizontal_offset = scale_px(HORIZONTAL_OFFSET);
let vertical_offset = px(VERTICAL_OFFSET);
let indicator_height = line_height - vertical_offset;
let line_number_width = self.max_line_number_width(snapshot, window, cx);
let indicator_width = dbg!(line_number_width) + scale_px(HORIZONTAL_OFFSET);
// Debug indicator dimensions
dbg!(horizontal_offset);
dbg!(vertical_offset);
dbg!(indicator_height);
dbg!(indicator_width);
let is_disabled = breakpoint.is_disabled();
let breakpoint_arc = Arc::from(breakpoint.clone());
let (is_hovered, collides_with_existing) = editor.gutter_breakpoint_indicator.0.map_or(
(false, false),
|PhantomBreakpointIndicator {
is_active,
display_row,
collides_with_existing_breakpoint,
}| {
(
is_active && display_row == row,
collides_with_existing_breakpoint,
)
},
);
let indicator_color = if is_hovered {
cx.theme().colors().ghost_element_hover
} else if is_disabled {
cx.theme().status().info.alpha(0.64)
} else {
cx.theme().status().info.alpha(0.48)
};
let primary_action = if is_disabled {
"enable"
} else if is_hovered && !collides_with_existing {
"set"
} else {
"unset"
};
let mut tooltip_text = format!("Click to {primary_action}");
if collides_with_existing && !is_disabled {
use std::fmt::Write;
let modifier_key = gpui::Keystroke {
modifiers: Modifiers::secondary_key(),
..Default::default()
};
write!(tooltip_text, ", {modifier_key}-click to disable").ok();
}
div()
.id(element_id)
.cursor_pointer()
.absolute()
.left_0()
.w(indicator_width)
.h(indicator_height)
.child(
canvas(
|_bounds, _cx, _style| {},
move |bounds, _cx, window, _style| {
// Debug canvas bounds
dbg!(&bounds);
// Adjust bounds to account for horizontal offset
let adjusted_bounds = Bounds {
origin: point(bounds.origin.x + horizontal_offset, bounds.origin.y),
size: size(bounds.size.width, indicator_height),
};
// Debug adjusted bounds
dbg!(&adjusted_bounds);
dbg!(font_scale);
// Generate the breakpoint indicator path
let path =
breakpoint_indicator_path(adjusted_bounds, font_scale, is_disabled);
// Paint the path with the calculated color
window.paint_path(path, indicator_color);
},
)
.size_full(),
)
.on_click({
let editor_weak = self.editor.downgrade();
let breakpoint = breakpoint_arc.clone();
move |event, window, cx| {
let action = if event.modifiers().platform || breakpoint.is_disabled() {
BreakpointEditAction::InvertState
} else {
BreakpointEditAction::Toggle
};
let Some(editor_strong) = editor_weak.upgrade() else {
return;
};
window.focus(&editor_strong.focus_handle(cx));
editor_strong.update(cx, |editor, cx| {
editor.edit_breakpoint_at_anchor(
position,
breakpoint.as_ref().clone(),
action,
cx,
);
});
}
})
.on_mouse_down(gpui::MouseButton::Right, {
let editor_weak = self.editor.downgrade();
let anchor_position = position.clone();
move |event, window, cx| {
let Some(editor_strong) = editor_weak.upgrade() else {
return;
};
editor_strong.update(cx, |editor, cx| {
editor.set_breakpoint_context_menu(
row,
Some(anchor_position.clone()),
event.position,
window,
cx,
);
});
}
})
.into_any_element()
}
#[allow(clippy::too_many_arguments)]
fn layout_run_indicators(
&self,
@@ -3223,7 +3034,7 @@ impl EditorElement {
scroll_position: gpui::Point<f32>,
rows: Range<DisplayRow>,
buffer_rows: &[RowInfo],
_active_rows: &BTreeMap<DisplayRow, LineHighlightSpec>,
active_rows: &BTreeMap<DisplayRow, LineHighlightSpec>,
newest_selection_head: Option<DisplayPoint>,
snapshot: &EditorSnapshot,
window: &mut Window,
@@ -3279,7 +3090,16 @@ impl EditorElement {
return None;
}
let color = cx.theme().colors().editor_active_line_number;
let color = active_rows
.get(&display_row)
.map(|spec| {
if spec.breakpoint {
cx.theme().colors().debugger_accent
} else {
cx.theme().colors().editor_active_line_number
}
})
.unwrap_or_else(|| cx.theme().colors().editor_line_number);
let shaped_line =
self.shape_line_number(SharedString::from(&line_number), color, window);
let scroll_top = scroll_position.y * line_height;
@@ -5757,9 +5577,6 @@ impl EditorElement {
cx: &mut App,
) {
window.paint_layer(layout.gutter_hitbox.bounds, |window| {
for breakpoint in layout.breakpoints.iter_mut() {
breakpoint.paint(window, cx);
}
window.with_element_namespace("crease_toggles", |window| {
for crease_toggle in layout.crease_toggles.iter_mut().flatten() {
crease_toggle.paint(window, cx);
@@ -5772,23 +5589,12 @@ impl EditorElement {
}
});
for test_indicator in layout.test_indicators.iter_mut() {
test_indicator.paint(window, cx);
for breakpoint in layout.breakpoints.iter_mut() {
breakpoint.paint(window, cx);
}
let show_git_gutter = layout
.position_map
.snapshot
.show_git_diff_gutter
.unwrap_or_else(|| {
matches!(
ProjectSettings::get_global(cx).git.git_gutter,
Some(GitGutterSetting::TrackedFiles)
)
});
if show_git_gutter {
Self::paint_gutter_diff_hunks(layout, window, cx)
for test_indicator in layout.test_indicators.iter_mut() {
test_indicator.paint(window, cx);
}
});
}
@@ -5813,6 +5619,20 @@ impl EditorElement {
}
}
let show_git_gutter = layout
.position_map
.snapshot
.show_git_diff_gutter
.unwrap_or_else(|| {
matches!(
ProjectSettings::get_global(cx).git.git_gutter,
Some(GitGutterSetting::TrackedFiles)
)
});
if show_git_gutter {
Self::paint_gutter_diff_hunks(layout, window, cx)
}
let highlight_width = 0.275 * layout.position_map.line_height;
let highlight_corner_radii = Corners::all(0.05 * layout.position_map.line_height);
window.paint_layer(layout.gutter_hitbox.bounds, |window| {
@@ -7059,8 +6879,7 @@ impl EditorElement {
layout.width
}
/// Get the width of the longest line number in the current editor, in `Pixels`.
pub(crate) fn max_line_number_width(
fn max_line_number_width(
&self,
snapshot: &EditorSnapshot,
window: &mut Window,
@@ -7155,7 +6974,7 @@ impl AcceptEditPredictionBinding {
}
fn prepaint_gutter_button(
button: impl IntoElement,
button: IconButton,
row: DisplayRow,
line_height: Pixels,
gutter_dimensions: &GutterDimensions,
@@ -7300,7 +7119,7 @@ pub(crate) struct LineWithInvisibles {
enum LineFragment {
Text(ShapedLine),
Element {
id: FoldId,
id: ChunkRendererId,
element: Option<AnyElement>,
size: Size<Pixels>,
len: usize,
@@ -8478,7 +8297,7 @@ impl Element for EditorElement {
window,
cx,
);
let new_fold_widths = line_layouts
let new_renderer_widths = line_layouts
.iter()
.flat_map(|layout| &layout.fragments)
.filter_map(|fragment| {
@@ -8489,7 +8308,7 @@ impl Element for EditorElement {
}
});
if self.editor.update(cx, |editor, cx| {
editor.update_fold_widths(new_fold_widths, cx)
editor.update_renderer_widths(new_renderer_widths, cx)
}) {
// If the fold widths have changed, we need to prepaint
// the element again to account for any changes in
@@ -9140,10 +8959,6 @@ impl Element for EditorElement {
self.paint_background(layout, window, cx);
self.paint_indent_guides(layout, window, cx);
if layout.gutter_hitbox.size.width > Pixels::ZERO {
self.paint_gutter_highlights(layout, window, cx);
self.paint_gutter_indicators(layout, window, cx);
}
if layout.gutter_hitbox.size.width > Pixels::ZERO {
self.paint_blamed_display_rows(layout, window, cx);
self.paint_line_numbers(layout, window, cx);
@@ -9151,6 +8966,11 @@ impl Element for EditorElement {
self.paint_text(layout, window, cx);
if layout.gutter_hitbox.size.width > Pixels::ZERO {
self.paint_gutter_highlights(layout, window, cx);
self.paint_gutter_indicators(layout, window, cx);
}
if !layout.blocks.is_empty() {
window.with_element_namespace("blocks", |window| {
self.paint_blocks(layout, window, cx);

View File

@@ -1,208 +0,0 @@
use gpui::{Bounds, Path, PathBuilder, PathStyle, StrokeOptions, point};
use ui::{Pixels, px};
/// Draw the path for the breakpoint indicator.
///
/// Note: The indicator needs to be a minimum of `SHAPE_MIN_WIDTH` px
/// wide to draw without graphical issues, so narrower widths are ignored.
pub(crate) fn breakpoint_indicator_path(
bounds: Bounds<Pixels>,
scale: f32,
stroke: bool,
) -> Path<Pixels> {
// Constants for the breakpoint shape dimensions
// The shape is designed based on a 50px wide by 15px high template
// and uses 9-slice style scaling to allow the shape to be stretched
// vertically and horizontally.
const SHAPE_BASE_HEIGHT: f32 = 15.0;
const SHAPE_FIXED_WIDTH: f32 = 32.0; // Width of non-stretchable parts (corners)
const SHAPE_MIN_WIDTH: f32 = 34.0; // Minimum width to render properly
const PIXEL_ROUNDING_FACTOR: f32 = 8.0; // Round to nearest 1/8 pixel
// Key points in the shape (in base coordinates)
const CORNER_RADIUS: f32 = 5.0;
const CENTER_Y: f32 = 7.5;
const TOP_Y: f32 = 0.0;
const BOTTOM_Y: f32 = 15.0;
const CURVE_CONTROL_OFFSET: f32 = 1.5;
const RIGHT_CORNER_START: f32 = 4.0;
const RIGHT_CORNER_WIDTH: f32 = 13.0;
// Helper function to round pixels to nearest 1/8
let round_to_pixel_grid = |value: Pixels| -> Pixels {
let value_f32: f32 = value.into();
px((value_f32 * PIXEL_ROUNDING_FACTOR).round() / PIXEL_ROUNDING_FACTOR)
};
// Calculate actual dimensions with scaling
let min_allowed_width = px(SHAPE_MIN_WIDTH * scale);
let actual_width = if bounds.size.width < min_allowed_width {
min_allowed_width
} else {
bounds.size.width
};
let actual_height = bounds.size.height;
// Debug input parameters and initial calculations
dbg!(&bounds);
dbg!(scale);
dbg!(stroke);
dbg!(min_allowed_width);
dbg!(actual_width);
dbg!(actual_height);
// Origin point for positioning
let origin_x = bounds.origin.x;
let origin_y = bounds.origin.y;
// Calculate the scale factor based on height and user scale
let shape_scale = (actual_height / px(SHAPE_BASE_HEIGHT)) * scale;
// Calculate the width of fixed and stretchable sections
let fixed_sections_width = px(SHAPE_FIXED_WIDTH) * shape_scale;
let stretchable_middle_width = actual_width - fixed_sections_width;
// Debug scaling calculations
dbg!(shape_scale);
dbg!(fixed_sections_width);
dbg!(stretchable_middle_width);
// Pre-calculate all the key x-coordinates
let left_edge_x = round_to_pixel_grid(origin_x);
let left_corner_end_x = round_to_pixel_grid(origin_x + px(CORNER_RADIUS) * shape_scale);
let middle_section_end_x =
round_to_pixel_grid(origin_x + px(CORNER_RADIUS) * shape_scale + stretchable_middle_width);
let right_corner_start_x = round_to_pixel_grid(
origin_x
+ px(CORNER_RADIUS) * shape_scale
+ stretchable_middle_width
+ px(RIGHT_CORNER_START) * shape_scale,
);
let right_edge_x = round_to_pixel_grid(
origin_x
+ px(CORNER_RADIUS) * shape_scale
+ stretchable_middle_width
+ px(RIGHT_CORNER_WIDTH) * shape_scale,
);
// Debug x-coordinates
dbg!(origin_x);
dbg!(left_edge_x);
dbg!(left_corner_end_x);
dbg!(middle_section_end_x);
dbg!(right_corner_start_x);
dbg!(right_edge_x);
// Pre-calculate all the key y-coordinates
let top_edge_y = round_to_pixel_grid(origin_y);
let center_y = round_to_pixel_grid(origin_y + px(CENTER_Y) * shape_scale);
let bottom_edge_y = round_to_pixel_grid(origin_y + px(BOTTOM_Y) * shape_scale);
// Y-coordinates for the left side curves
let left_upper_curve_start_y = round_to_pixel_grid(origin_y + px(CORNER_RADIUS) * shape_scale);
let left_lower_curve_end_y = round_to_pixel_grid(origin_y + px(10.0) * shape_scale);
// Y-coordinates for the right side curves
let right_upper_curve_control_y = round_to_pixel_grid(origin_y + px(6.0) * shape_scale);
let right_lower_curve_control_y = round_to_pixel_grid(origin_y + px(9.0) * shape_scale);
// Control point offsets
let control_offset = px(CURVE_CONTROL_OFFSET) * shape_scale;
let right_control_offset = px(9.0) * shape_scale;
// Debug y-coordinates
dbg!(origin_y);
dbg!(top_edge_y);
dbg!(center_y);
dbg!(bottom_edge_y);
dbg!(left_upper_curve_start_y);
dbg!(left_lower_curve_end_y);
dbg!(right_upper_curve_control_y);
dbg!(right_lower_curve_control_y);
// Create the path builder
let mut builder = if stroke {
let stroke_width = px(1.0 * scale);
let options = StrokeOptions::default().with_line_width(stroke_width.0);
PathBuilder::stroke(stroke_width).with_style(PathStyle::Stroke(options))
} else {
PathBuilder::fill()
};
// Build the path - starting from left center
builder.move_to(point(left_edge_x, center_y));
// === Upper half of the shape ===
// Move up to start of left upper curve
builder.line_to(point(left_edge_x, left_upper_curve_start_y));
// Top-left corner curve
builder.cubic_bezier_to(
point(left_corner_end_x, top_edge_y),
point(left_edge_x, round_to_pixel_grid(origin_y + control_offset)),
point(round_to_pixel_grid(origin_x + control_offset), top_edge_y),
);
// Top edge - stretchable middle section
builder.line_to(point(middle_section_end_x, top_edge_y));
// Top edge - right corner start
builder.line_to(point(right_corner_start_x, top_edge_y));
// Top-right corner curve
builder.cubic_bezier_to(
point(right_edge_x, center_y),
point(
round_to_pixel_grid(
origin_x
+ px(CORNER_RADIUS) * shape_scale
+ stretchable_middle_width
+ right_control_offset,
),
top_edge_y,
),
point(right_edge_x, right_upper_curve_control_y),
);
// === Lower half of the shape (mirrored) ===
// Bottom-right corner curve
builder.cubic_bezier_to(
point(right_corner_start_x, bottom_edge_y),
point(right_edge_x, right_lower_curve_control_y),
point(
round_to_pixel_grid(
origin_x
+ px(CORNER_RADIUS) * shape_scale
+ stretchable_middle_width
+ right_control_offset,
),
bottom_edge_y,
),
);
// Bottom edge - right corner to middle
builder.line_to(point(middle_section_end_x, bottom_edge_y));
// Bottom edge - stretchable middle section
builder.line_to(point(left_corner_end_x, bottom_edge_y));
// Bottom-left corner curve
builder.cubic_bezier_to(
point(left_edge_x, left_lower_curve_end_y),
point(
round_to_pixel_grid(origin_x + control_offset),
bottom_edge_y,
),
point(
left_edge_x,
round_to_pixel_grid(origin_y + px(13.5) * shape_scale),
),
);
// Close the path by returning to start
builder.line_to(point(left_edge_x, center_y));
builder.build().unwrap()
}

View File

@@ -1 +0,0 @@
pub mod breakpoint_indicator;

View File

@@ -19,18 +19,21 @@ use crate::{
#[derive(Debug)]
pub(super) struct LspColorData {
cache_version_used: usize,
buffer_colors: HashMap<BufferId, BufferColors>,
render_mode: DocumentColorsRenderMode,
}
#[derive(Debug, Default)]
struct BufferColors {
colors: Vec<(Range<Anchor>, DocumentColor, InlayId)>,
inlay_colors: HashMap<InlayId, usize>,
render_mode: DocumentColorsRenderMode,
cache_version_used: usize,
}
impl LspColorData {
pub fn new(cx: &App) -> Self {
Self {
cache_version_used: 0,
colors: Vec::new(),
inlay_colors: HashMap::default(),
buffer_colors: HashMap::default(),
render_mode: EditorSettings::get_global(cx).lsp_document_colors,
}
}
@@ -47,8 +50,9 @@ impl LspColorData {
DocumentColorsRenderMode::Inlay => Some(InlaySplice {
to_remove: Vec::new(),
to_insert: self
.colors
.buffer_colors
.iter()
.flat_map(|(_, buffer_colors)| buffer_colors.colors.iter())
.map(|(range, color, id)| {
Inlay::color(
id.id(),
@@ -63,33 +67,49 @@ impl LspColorData {
})
.collect(),
}),
DocumentColorsRenderMode::None => {
self.colors.clear();
Some(InlaySplice {
to_remove: self.inlay_colors.drain().map(|(id, _)| id).collect(),
to_insert: Vec::new(),
})
}
DocumentColorsRenderMode::None => Some(InlaySplice {
to_remove: self
.buffer_colors
.drain()
.flat_map(|(_, buffer_colors)| buffer_colors.inlay_colors)
.map(|(id, _)| id)
.collect(),
to_insert: Vec::new(),
}),
DocumentColorsRenderMode::Border | DocumentColorsRenderMode::Background => {
Some(InlaySplice {
to_remove: self.inlay_colors.drain().map(|(id, _)| id).collect(),
to_remove: self
.buffer_colors
.iter_mut()
.flat_map(|(_, buffer_colors)| buffer_colors.inlay_colors.drain())
.map(|(id, _)| id)
.collect(),
to_insert: Vec::new(),
})
}
}
}
fn set_colors(&mut self, colors: Vec<(Range<Anchor>, DocumentColor, InlayId)>) -> bool {
if self.colors == colors {
fn set_colors(
&mut self,
buffer_id: BufferId,
colors: Vec<(Range<Anchor>, DocumentColor, InlayId)>,
cache_version: Option<usize>,
) -> bool {
let buffer_colors = self.buffer_colors.entry(buffer_id).or_default();
if let Some(cache_version) = cache_version {
buffer_colors.cache_version_used = cache_version;
}
if buffer_colors.colors == colors {
return false;
}
self.inlay_colors = colors
buffer_colors.inlay_colors = colors
.iter()
.enumerate()
.map(|(i, (_, _, id))| (*id, i))
.collect();
self.colors = colors;
buffer_colors.colors = colors;
true
}
@@ -103,8 +123,9 @@ impl LspColorData {
{
Vec::new()
} else {
self.colors
self.buffer_colors
.iter()
.flat_map(|(_, buffer_colors)| &buffer_colors.colors)
.map(|(range, color, _)| {
let display_range = range.clone().to_display_points(snapshot);
let color = Hsla::from(Rgba {
@@ -162,10 +183,9 @@ impl Editor {
ColorFetchStrategy::IgnoreCache
} else {
ColorFetchStrategy::UseCache {
known_cache_version: self
.colors
.as_ref()
.map(|colors| colors.cache_version_used),
known_cache_version: self.colors.as_ref().and_then(|colors| {
Some(colors.buffer_colors.get(&buffer_id)?.cache_version_used)
}),
}
};
let colors_task = lsp_store.document_colors(fetch_strategy, buffer, cx)?;
@@ -201,15 +221,13 @@ impl Editor {
return;
};
let mut cache_version = None;
let mut new_editor_colors = Vec::<(Range<Anchor>, DocumentColor)>::new();
let mut new_editor_colors = HashMap::default();
for (buffer_id, colors) in all_colors {
let Some(excerpts) = editor_excerpts.get(&buffer_id) else {
continue;
};
match colors {
Ok(colors) => {
cache_version = colors.cache_version;
for color in colors.colors {
let color_start = point_from_lsp(color.lsp_range.start);
let color_end = point_from_lsp(color.lsp_range.end);
@@ -243,8 +261,15 @@ impl Editor {
continue;
};
let new_entry =
new_editor_colors.entry(buffer_id).or_insert_with(|| {
(Vec::<(Range<Anchor>, DocumentColor)>::new(), None)
});
new_entry.1 = colors.cache_version;
let new_buffer_colors = &mut new_entry.0;
let (Ok(i) | Err(i)) =
new_editor_colors.binary_search_by(|(probe, _)| {
new_buffer_colors.binary_search_by(|(probe, _)| {
probe
.start
.cmp(&color_start_anchor, &multi_buffer_snapshot)
@@ -254,7 +279,7 @@ impl Editor {
.cmp(&color_end_anchor, &multi_buffer_snapshot)
})
});
new_editor_colors
new_buffer_colors
.insert(i, (color_start_anchor..color_end_anchor, color));
break;
}
@@ -267,45 +292,70 @@ impl Editor {
editor
.update(cx, |editor, cx| {
let mut colors_splice = InlaySplice::default();
let mut new_color_inlays = Vec::with_capacity(new_editor_colors.len());
let Some(colors) = &mut editor.colors else {
return;
};
let mut existing_colors = colors.colors.iter().peekable();
for (new_range, new_color) in new_editor_colors {
let rgba_color = Rgba {
r: new_color.color.red,
g: new_color.color.green,
b: new_color.color.blue,
a: new_color.color.alpha,
};
let mut updated = false;
for (buffer_id, (new_buffer_colors, new_cache_version)) in new_editor_colors {
let mut new_buffer_color_inlays =
Vec::with_capacity(new_buffer_colors.len());
let mut existing_buffer_colors = colors
.buffer_colors
.entry(buffer_id)
.or_default()
.colors
.iter()
.peekable();
for (new_range, new_color) in new_buffer_colors {
let rgba_color = Rgba {
r: new_color.color.red,
g: new_color.color.green,
b: new_color.color.blue,
a: new_color.color.alpha,
};
loop {
match existing_colors.peek() {
Some((existing_range, existing_color, existing_inlay_id)) => {
match existing_range
.start
.cmp(&new_range.start, &multi_buffer_snapshot)
.then_with(|| {
existing_range
.end
.cmp(&new_range.end, &multi_buffer_snapshot)
}) {
cmp::Ordering::Less => {
colors_splice.to_remove.push(*existing_inlay_id);
existing_colors.next();
continue;
}
cmp::Ordering::Equal => {
if existing_color == &new_color {
new_color_inlays.push((
new_range,
new_color,
*existing_inlay_id,
));
} else {
loop {
match existing_buffer_colors.peek() {
Some((existing_range, existing_color, existing_inlay_id)) => {
match existing_range
.start
.cmp(&new_range.start, &multi_buffer_snapshot)
.then_with(|| {
existing_range
.end
.cmp(&new_range.end, &multi_buffer_snapshot)
}) {
cmp::Ordering::Less => {
colors_splice.to_remove.push(*existing_inlay_id);
existing_buffer_colors.next();
continue;
}
cmp::Ordering::Equal => {
if existing_color == &new_color {
new_buffer_color_inlays.push((
new_range,
new_color,
*existing_inlay_id,
));
} else {
colors_splice
.to_remove
.push(*existing_inlay_id);
let inlay = Inlay::color(
post_inc(&mut editor.next_color_inlay_id),
new_range.start,
rgba_color,
);
let inlay_id = inlay.id;
colors_splice.to_insert.push(inlay);
new_buffer_color_inlays
.push((new_range, new_color, inlay_id));
}
existing_buffer_colors.next();
break;
}
cmp::Ordering::Greater => {
let inlay = Inlay::color(
post_inc(&mut editor.next_color_inlay_id),
new_range.start,
@@ -313,49 +363,40 @@ impl Editor {
);
let inlay_id = inlay.id;
colors_splice.to_insert.push(inlay);
new_color_inlays
new_buffer_color_inlays
.push((new_range, new_color, inlay_id));
break;
}
existing_colors.next();
break;
}
cmp::Ordering::Greater => {
let inlay = Inlay::color(
post_inc(&mut editor.next_color_inlay_id),
new_range.start,
rgba_color,
);
let inlay_id = inlay.id;
colors_splice.to_insert.push(inlay);
new_color_inlays.push((new_range, new_color, inlay_id));
break;
}
}
}
None => {
let inlay = Inlay::color(
post_inc(&mut editor.next_color_inlay_id),
new_range.start,
rgba_color,
);
let inlay_id = inlay.id;
colors_splice.to_insert.push(inlay);
new_color_inlays.push((new_range, new_color, inlay_id));
break;
None => {
let inlay = Inlay::color(
post_inc(&mut editor.next_color_inlay_id),
new_range.start,
rgba_color,
);
let inlay_id = inlay.id;
colors_splice.to_insert.push(inlay);
new_buffer_color_inlays
.push((new_range, new_color, inlay_id));
break;
}
}
}
}
}
if existing_colors.peek().is_some() {
colors_splice
.to_remove
.extend(existing_colors.map(|(_, _, id)| *id));
if existing_buffer_colors.peek().is_some() {
colors_splice
.to_remove
.extend(existing_buffer_colors.map(|(_, _, id)| *id));
}
updated |= colors.set_colors(
buffer_id,
new_buffer_color_inlays,
new_cache_version,
);
}
let mut updated = colors.set_colors(new_color_inlays);
if let Some(cache_version) = cache_version {
colors.cache_version_used = cache_version;
}
if colors.render_mode == DocumentColorsRenderMode::Inlay
&& (!colors_splice.to_insert.is_empty()
|| !colors_splice.to_remove.is_empty())


@@ -388,6 +388,7 @@ pub(crate) fn commit_message_editor(
commit_editor.set_collaboration_hub(Box::new(project));
commit_editor.set_use_autoclose(false);
commit_editor.set_show_gutter(false, cx);
commit_editor.set_use_modal_editing(true);
commit_editor.set_show_wrap_guides(false, cx);
commit_editor.set_show_indent_guides(false, cx);
let placeholder = placeholder.unwrap_or("Enter commit message".into());


@@ -1,7 +1,7 @@
use gpui::{
Application, Background, Bounds, ColorSpace, Context, MouseDownEvent, Path, PathBuilder,
PathStyle, Pixels, Point, Render, SharedString, StrokeOptions, Window, WindowOptions, bounds,
canvas, div, linear_color_stop, linear_gradient, point, prelude::*, px, rgb, size,
PathStyle, Pixels, Point, Render, SharedString, StrokeOptions, Window, WindowOptions, canvas,
div, linear_color_stop, linear_gradient, point, prelude::*, px, rgb, size,
};
struct PaintingViewer {
@@ -150,14 +150,6 @@ impl PaintingViewer {
let path = builder.build().unwrap();
lines.push((path, gpui::green().into()));
// draw the indicators (aligned and unaligned versions)
let aligned_indicator = breakpoint_indicator_path(
bounds(point(px(50.), px(250.)), size(px(60.), px(16.))),
1.0,
false,
);
lines.push((aligned_indicator, rgb(0x1e88e5).into()));
Self {
default_lines: lines.clone(),
lines: vec![],
@@ -314,137 +306,3 @@ fn main() {
cx.activate(true);
});
}
/// Draw the path for the breakpoint indicator.
///
/// Note: the indicator must be at least MIN_WIDTH px wide to draw
/// without graphical issues, so narrower widths are ignored.
fn breakpoint_indicator_path(bounds: Bounds<Pixels>, scale: f32, stroke: bool) -> Path<Pixels> {
static MIN_WIDTH: f32 = 31.;
// Apply user scale to the minimum width
let min_width = MIN_WIDTH * scale;
let width = if bounds.size.width.0 < min_width {
px(min_width)
} else {
bounds.size.width
};
let height = bounds.size.height;
// Position the indicator on the canvas
let base_x = bounds.origin.x;
let base_y = bounds.origin.y;
// Calculate the scaling factor for the height (SVG is 15px tall), incorporating user scale
let scale_factor = (height / px(15.0)) * scale;
// Calculate how much width to allocate to the stretchable middle section
// SVG has 32px of fixed elements (corners), so the rest is for the middle
let fixed_width = px(32.0) * scale_factor;
let middle_width = width - fixed_width;
// Helper function to round to nearest quarter pixel
let round_to_quarter = |value: Pixels| -> Pixels {
let value_f32: f32 = value.into();
px((value_f32 * 4.0).round() / 4.0)
};
// Create a new path - either fill or stroke based on the flag
let mut builder = if stroke {
// For stroke, we need to set appropriate line width and options
let stroke_width = px(1.0 * scale); // Apply scale to stroke width
let options = StrokeOptions::default().with_line_width(stroke_width.0);
PathBuilder::stroke(stroke_width).with_style(PathStyle::Stroke(options))
} else {
// For fill, use the original implementation
PathBuilder::fill()
};
// Upper half of the shape - Based on the provided SVG
// Start at bottom left (0, 8)
let start_x = round_to_quarter(base_x);
let start_y = round_to_quarter(base_y + px(7.5) * scale_factor);
builder.move_to(point(start_x, start_y));
// Vertical line to (0, 5)
let vert_y = round_to_quarter(base_y + px(5.0) * scale_factor);
builder.line_to(point(start_x, vert_y));
// Curve to (5, 0) - using cubic Bezier
let curve1_end_x = round_to_quarter(base_x + px(5.0) * scale_factor);
let curve1_end_y = round_to_quarter(base_y);
let curve1_ctrl1_x = round_to_quarter(base_x);
let curve1_ctrl1_y = round_to_quarter(base_y + px(1.5) * scale_factor);
let curve1_ctrl2_x = round_to_quarter(base_x + px(1.5) * scale_factor);
let curve1_ctrl2_y = round_to_quarter(base_y);
builder.cubic_bezier_to(
point(curve1_end_x, curve1_end_y),
point(curve1_ctrl1_x, curve1_ctrl1_y),
point(curve1_ctrl2_x, curve1_ctrl2_y),
);
// Horizontal line through the middle section to (37, 0)
let middle_end_x = round_to_quarter(base_x + px(5.0) * scale_factor + middle_width);
builder.line_to(point(middle_end_x, curve1_end_y));
// Horizontal line to (41, 0)
let right_section_x =
round_to_quarter(base_x + px(5.0) * scale_factor + middle_width + px(4.0) * scale_factor);
builder.line_to(point(right_section_x, curve1_end_y));
// Curve to (50, 7.5) - using cubic Bezier
let curve2_end_x =
round_to_quarter(base_x + px(5.0) * scale_factor + middle_width + px(13.0) * scale_factor);
let curve2_end_y = round_to_quarter(base_y + px(7.5) * scale_factor);
let curve2_ctrl1_x =
round_to_quarter(base_x + px(5.0) * scale_factor + middle_width + px(9.0) * scale_factor);
let curve2_ctrl1_y = round_to_quarter(base_y);
let curve2_ctrl2_x =
round_to_quarter(base_x + px(5.0) * scale_factor + middle_width + px(13.0) * scale_factor);
let curve2_ctrl2_y = round_to_quarter(base_y + px(6.0) * scale_factor);
builder.cubic_bezier_to(
point(curve2_end_x, curve2_end_y),
point(curve2_ctrl1_x, curve2_ctrl1_y),
point(curve2_ctrl2_x, curve2_ctrl2_y),
);
// Lower half of the shape - mirrored vertically
// Curve from (50, 7.5) to (41, 15)
let curve3_end_y = round_to_quarter(base_y + px(15.0) * scale_factor);
let curve3_ctrl1_x =
round_to_quarter(base_x + px(5.0) * scale_factor + middle_width + px(13.0) * scale_factor);
let curve3_ctrl1_y = round_to_quarter(base_y + px(9.0) * scale_factor);
let curve3_ctrl2_x =
round_to_quarter(base_x + px(5.0) * scale_factor + middle_width + px(9.0) * scale_factor);
let curve3_ctrl2_y = round_to_quarter(base_y + px(15.0) * scale_factor);
builder.cubic_bezier_to(
point(right_section_x, curve3_end_y),
point(curve3_ctrl1_x, curve3_ctrl1_y),
point(curve3_ctrl2_x, curve3_ctrl2_y),
);
// Horizontal line to (37, 15)
builder.line_to(point(middle_end_x, curve3_end_y));
// Horizontal line through the middle section to (5, 15)
builder.line_to(point(curve1_end_x, curve3_end_y));
// Curve to (0, 10)
let curve4_end_y = round_to_quarter(base_y + px(10.0) * scale_factor);
let curve4_ctrl1_x = round_to_quarter(base_x + px(1.5) * scale_factor);
let curve4_ctrl1_y = round_to_quarter(base_y + px(15.0) * scale_factor);
let curve4_ctrl2_x = round_to_quarter(base_x);
let curve4_ctrl2_y = round_to_quarter(base_y + px(13.5) * scale_factor);
builder.cubic_bezier_to(
point(start_x, curve4_end_y),
point(curve4_ctrl1_x, curve4_ctrl1_y),
point(curve4_ctrl2_x, curve4_ctrl2_y),
);
// Close the path
builder.line_to(point(start_x, start_y));
builder.build().unwrap()
}
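The `round_to_quarter` closure above snaps every coordinate to the nearest quarter pixel so the path renders crisply regardless of scale. A minimal f32-only sketch of the same arithmetic (a hypothetical free function, not gpui's `Pixels` type):

```rust
// Quarter-pixel rounding: scale to quarter units, round, scale back.
fn round_to_quarter(value: f32) -> f32 {
    (value * 4.0).round() / 4.0
}
```

The results are exact because quarters are representable in binary floating point.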


@@ -151,7 +151,7 @@ pub fn guess_compositor() -> &'static str {
pub(crate) fn current_platform(_headless: bool) -> Rc<dyn Platform> {
Rc::new(
WindowsPlatform::new()
.inspect_err(|err| show_error("Error: Zed failed to launch", err.to_string()))
.inspect_err(|err| show_error("Failed to launch", err.to_string()))
.unwrap(),
)
}


@@ -1299,12 +1299,8 @@ mod windows_renderer {
size: Default::default(),
transparent,
};
BladeRenderer::new(context, &raw, config).inspect_err(|err| {
show_error(
"Error: Zed failed to initialize BladeRenderer",
err.to_string(),
)
})
BladeRenderer::new(context, &raw, config)
.inspect_err(|err| show_error("Failed to initialize BladeRenderer", err.to_string()))
}
struct RawWindow {


@@ -165,6 +165,10 @@ impl LanguageModel for FakeLanguageModel {
false
}
fn max_image_size(&self) -> u64 {
0 // No image support
}
fn telemetry_id(&self) -> String {
"fake".to_string()
}


@@ -284,8 +284,13 @@ pub trait LanguageModel: Send + Sync {
None
}
/// Whether this model supports images
fn supports_images(&self) -> bool;
/// Whether this model supports images. This is determined by whether self.max_image_size() is positive.
fn supports_images(&self) -> bool {
self.max_image_size() > 0
}
/// The maximum image size the model accepts, in bytes. (Zero means images are unsupported.)
fn max_image_size(&self) -> u64;
/// Whether this model supports tools.
fn supports_tools(&self) -> bool;
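The default `supports_images` above derives image support from `max_image_size`, so providers only need to report a byte limit (zero means unsupported). A hedged sketch of that trait-default pattern with a stand-in trait (not the real `LanguageModel` trait):

```rust
// Stand-in trait illustrating the "capability derived from a limit" default.
trait ImageCapable {
    /// Maximum image size in bytes; 0 means images are unsupported.
    fn max_image_size(&self) -> u64;

    /// Derived: a model supports images iff it accepts a positive size.
    fn supports_images(&self) -> bool {
        self.max_image_size() > 0
    }
}

struct TextOnlyModel;
impl ImageCapable for TextOnlyModel {
    fn max_image_size(&self) -> u64 {
        0
    }
}

struct VisionModel;
impl ImageCapable for VisionModel {
    fn max_image_size(&self) -> u64 {
        5_242_880 // 5 MiB
    }
}
```

Implementors override only `max_image_size`; the boolean capability check stays in one place.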


@@ -437,6 +437,13 @@ impl LanguageModel for AnthropicModel {
true
}
fn max_image_size(&self) -> u64 {
// Anthropic documentation: https://docs.anthropic.com/en/docs/build-with-claude/vision#faq
// FAQ section: "Is there a limit to the image file size I can upload?"
// "API: Maximum 5MB per image"
5_242_880 // 5 MiB - Anthropic's stated maximum
}
fn supports_tool_choice(&self, choice: LanguageModelToolChoice) -> bool {
match choice {
LanguageModelToolChoice::Auto
@@ -528,6 +535,11 @@ pub fn into_anthropic(
.into_iter()
.filter_map(|content| match content {
MessageContent::Text(text) => {
let text = if text.chars().last().map_or(false, |c| c.is_whitespace()) {
text.trim_end().to_string()
} else {
text
};
if !text.is_empty() {
Some(anthropic::RequestContent::Text {
text,


@@ -504,6 +504,10 @@ impl LanguageModel for BedrockModel {
false
}
fn max_image_size(&self) -> u64 {
0 // Bedrock models don't currently support images in this implementation
}
fn supports_tool_choice(&self, choice: LanguageModelToolChoice) -> bool {
match choice {
LanguageModelToolChoice::Auto | LanguageModelToolChoice::Any => {


@@ -699,6 +699,18 @@ impl LanguageModel for CloudLanguageModel {
self.model.supports_max_mode
}
fn max_image_size(&self) -> u64 {
if self.model.supports_images {
// Use a conservative limit that works across all providers
// Anthropic has the smallest limit at 5 MiB
// Anthropic documentation: https://docs.anthropic.com/en/docs/build-with-claude/vision#faq
// "API: Maximum 5MB per image"
5_242_880 // 5 MiB
} else {
0
}
}
fn telemetry_id(&self) -> String {
format!("zed.dev/{}", self.model.id)
}


@@ -216,6 +216,17 @@ impl LanguageModel for CopilotChatLanguageModel {
self.model.supports_vision()
}
fn max_image_size(&self) -> u64 {
if self.model.supports_vision() {
// OpenAI documentation: https://help.openai.com/en/articles/8983719-what-are-the-file-upload-size-restrictions
// "For images, there's a limit of 20MB per image."
// GitHub Copilot uses OpenAI models under the hood
20_971_520 // 20 MiB - matches OpenAI's documented limit
} else {
0
}
}
fn tool_input_format(&self) -> LanguageModelToolSchemaFormat {
match self.model.vendor() {
ModelVendor::OpenAI | ModelVendor::Anthropic => {


@@ -302,6 +302,10 @@ impl LanguageModel for DeepSeekLanguageModel {
false
}
fn max_image_size(&self) -> u64 {
0 // DeepSeek models don't currently support images
}
fn telemetry_id(&self) -> String {
format!("deepseek/{}", self.model.id())
}


@@ -349,6 +349,17 @@ impl LanguageModel for GoogleLanguageModel {
self.model.supports_images()
}
fn max_image_size(&self) -> u64 {
if self.model.supports_images() {
// Google Gemini documentation: https://ai.google.dev/gemini-api/docs/image-understanding
// "Note: Inline image data limits your total request size (text prompts, system instructions, and inline bytes) to 20MB."
// "For larger requests, upload image files using the File API."
20_971_520 // 20 MiB - Gemini's inline request size limit
} else {
0
}
}
fn supports_tool_choice(&self, choice: LanguageModelToolChoice) -> bool {
match choice {
LanguageModelToolChoice::Auto


@@ -410,6 +410,18 @@ impl LanguageModel for LmStudioLanguageModel {
self.model.supports_images
}
fn max_image_size(&self) -> u64 {
if self.model.supports_images {
// LM Studio documentation: https://lmstudio.ai/docs/typescript/llm-prediction/image-input
// While not explicitly documented, we assume a standard 20MB limit,
// matching OpenAI's documented limit: https://help.openai.com/en/articles/8983719-what-are-the-file-upload-size-restrictions
// "For images, there's a limit of 20MB per image."
20_971_520 // 20 MB - Default limit for local models
} else {
0
}
}
fn telemetry_id(&self) -> String {
format!("lmstudio/{}", self.model.id())
}


@@ -317,6 +317,18 @@ impl LanguageModel for MistralLanguageModel {
self.model.supports_images()
}
fn max_image_size(&self) -> u64 {
if self.model.supports_images() {
// Mistral documentation: https://www.infoq.com/news/2025/03/mistral-ai-ocr-api/
// "The API is currently limited to files that do not exceed 50MB in size or 1,000 pages"
// Also confirmed in https://github.com/everaldo/mcp-mistral-ocr/blob/master/README.md
// "Maximum file size: 50MB (enforced by Mistral API)"
52_428_800 // 50 MB - Mistral's OCR API limit
} else {
0
}
}
fn telemetry_id(&self) -> String {
format!("mistral/{}", self.model.id())
}


@@ -365,6 +365,16 @@ impl LanguageModel for OllamaLanguageModel {
self.model.supports_vision.unwrap_or(false)
}
fn max_image_size(&self) -> u64 {
if self.model.supports_vision.unwrap_or(false) {
// Ollama documentation: https://github.com/ollama/ollama/releases/tag/v0.1.15
// "Images up to 100MB in size are supported."
104_857_600 // 100 MB - Ollama's documented API limit
} else {
0
}
}
fn supports_tool_choice(&self, choice: LanguageModelToolChoice) -> bool {
match choice {
LanguageModelToolChoice::Auto => false,


@@ -302,6 +302,14 @@ impl LanguageModel for OpenAiLanguageModel {
false
}
fn max_image_size(&self) -> u64 {
// OpenAI documentation: https://help.openai.com/en/articles/8983719-what-are-the-file-upload-size-restrictions
// "For images, there's a limit of 20MB per image."
// Note: OpenAI models don't currently support images in this implementation
// When enabled, OpenAI supports up to 20MB (20_971_520 bytes)
0
}
fn supports_tool_choice(&self, choice: LanguageModelToolChoice) -> bool {
match choice {
LanguageModelToolChoice::Auto => true,


@@ -407,6 +407,18 @@ impl LanguageModel for OpenRouterLanguageModel {
self.model.supports_images.unwrap_or(false)
}
fn max_image_size(&self) -> u64 {
if self.model.supports_images.unwrap_or(false) {
// OpenRouter documentation: https://openrouter.ai/docs/features/images-and-pdfs
// While not explicitly stated, OpenRouter appears to follow OpenAI's standard
// which is documented at: https://help.openai.com/en/articles/8983719-what-are-the-file-upload-size-restrictions
// "For images, there's a limit of 20MB per image."
20_971_520 // 20 MB - OpenRouter's default limit
} else {
0
}
}
fn count_tokens(
&self,
request: LanguageModelRequest,


@@ -305,6 +305,14 @@ impl LanguageModel for VercelLanguageModel {
true
}
fn max_image_size(&self) -> u64 {
// Vercel AI SDK uses standard provider limits. Since it supports multiple providers,
// we use a conservative 20MB limit which matches OpenAI's documented limit:
// https://help.openai.com/en/articles/8983719-what-are-the-file-upload-size-restrictions
// "For images, there's a limit of 20MB per image."
20_971_520 // 20 MB - Default limit for Vercel AI SDK
}
fn supports_tool_choice(&self, choice: LanguageModelToolChoice) -> bool {
match choice {
LanguageModelToolChoice::Auto


@@ -767,8 +767,8 @@ pub struct EsLintLspAdapter {
}
impl EsLintLspAdapter {
const CURRENT_VERSION: &'static str = "3.0.10";
const CURRENT_VERSION_TAG_NAME: &'static str = "release/3.0.10";
const CURRENT_VERSION: &'static str = "2.4.4";
const CURRENT_VERSION_TAG_NAME: &'static str = "release/2.4.4";
#[cfg(not(windows))]
const GITHUB_ASSET_KIND: AssetKind = AssetKind::TarGz;
@@ -846,7 +846,9 @@ impl LspAdapter for EsLintLspAdapter {
"enable": true
}
},
"useFlatConfig": use_flat_config,
"experimental": {
"useFlatConfig": use_flat_config,
},
});
let override_options = cx.update(|cx| {


@@ -87,3 +87,9 @@ pub(crate) mod m_2025_06_25 {
pub(crate) use settings::SETTINGS_PATTERNS;
}
pub(crate) mod m_2025_06_27 {
mod settings;
pub(crate) use settings::SETTINGS_PATTERNS;
}


@@ -0,0 +1,133 @@
use std::ops::Range;
use tree_sitter::{Query, QueryMatch};
use crate::MigrationPatterns;
pub const SETTINGS_PATTERNS: MigrationPatterns = &[(
SETTINGS_CONTEXT_SERVER_PATTERN,
flatten_context_server_command,
)];
const SETTINGS_CONTEXT_SERVER_PATTERN: &str = r#"(document
(object
(pair
key: (string (string_content) @context-servers)
value: (object
(pair
key: (string (string_content) @server-name)
value: (object
(pair
key: (string (string_content) @source-key)
value: (string (string_content) @source-value)
)
(pair
key: (string (string_content) @command-key)
value: (object) @command-object
) @command-pair
) @server-settings
)
)
)
)
(#eq? @context-servers "context_servers")
(#eq? @source-key "source")
(#eq? @source-value "custom")
(#eq? @command-key "command")
)"#;
fn flatten_context_server_command(
contents: &str,
mat: &QueryMatch,
query: &Query,
) -> Option<(Range<usize>, String)> {
let command_pair_index = query.capture_index_for_name("command-pair")?;
let command_pair = mat.nodes_for_capture_index(command_pair_index).next()?;
let command_object_index = query.capture_index_for_name("command-object")?;
let command_object = mat.nodes_for_capture_index(command_object_index).next()?;
let server_settings_index = query.capture_index_for_name("server-settings")?;
let _server_settings = mat.nodes_for_capture_index(server_settings_index).next()?;
// Parse the command object to extract path, args, and env
let mut path_value = None;
let mut args_value = None;
let mut env_value = None;
let mut cursor = command_object.walk();
for child in command_object.children(&mut cursor) {
if child.kind() == "pair" {
if let Some(key_node) = child.child_by_field_name("key") {
if let Some(string_content) = key_node.child(1) {
let key = &contents[string_content.byte_range()];
if let Some(value_node) = child.child_by_field_name("value") {
let value_range = value_node.byte_range();
match key {
"path" => path_value = Some(&contents[value_range]),
"args" => args_value = Some(&contents[value_range]),
"env" => env_value = Some(&contents[value_range]),
_ => {}
}
}
}
}
}
}
let path = path_value?;
// Get the proper indentation from the command pair
let command_pair_start = command_pair.start_byte();
let line_start = contents[..command_pair_start]
.rfind('\n')
.map(|pos| pos + 1)
.unwrap_or(0);
let indent = &contents[line_start..command_pair_start];
// Build the replacement string
let mut replacement = format!("\"command\": {}", path);
// Add args if present - need to reduce indentation
if let Some(args) = args_value {
replacement.push_str(",\n");
replacement.push_str(indent);
replacement.push_str("\"args\": ");
let reduced_args = reduce_indentation(args, 4);
replacement.push_str(&reduced_args);
}
// Add env if present - need to reduce indentation
if let Some(env) = env_value {
replacement.push_str(",\n");
replacement.push_str(indent);
replacement.push_str("\"env\": ");
replacement.push_str(&reduce_indentation(env, 4));
}
let range_to_replace = command_pair.byte_range();
Some((range_to_replace, replacement))
}
fn reduce_indentation(text: &str, spaces: usize) -> String {
let lines: Vec<&str> = text.lines().collect();
let mut result = String::new();
for (i, line) in lines.iter().enumerate() {
if i > 0 {
result.push('\n');
}
// Count leading spaces
let leading_spaces = line.chars().take_while(|&c| c == ' ').count();
if leading_spaces >= spaces {
// Reduce indentation
result.push_str(&line[spaces..]);
} else {
// Keep line as is if it doesn't have enough indentation
result.push_str(line);
}
}
result
}
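The `reduce_indentation` helper above dedents every line of a nested JSON fragment by a fixed number of spaces, leaving shallower lines untouched, so the `args`/`env` values read correctly once hoisted out of the old `command` object. A self-contained check of the behavior (the function is copied verbatim; the example input is ours):

```rust
// Verbatim copy of the reduce_indentation helper from the migration above.
fn reduce_indentation(text: &str, spaces: usize) -> String {
    let lines: Vec<&str> = text.lines().collect();
    let mut result = String::new();
    for (i, line) in lines.iter().enumerate() {
        if i > 0 {
            result.push('\n');
        }
        // Count leading spaces and strip `spaces` of them when possible.
        let leading_spaces = line.chars().take_while(|&c| c == ' ').count();
        if leading_spaces >= spaces {
            result.push_str(&line[spaces..]);
        } else {
            result.push_str(line);
        }
    }
    result
}
```

For example, an `args` array nested one level under `command` dedents cleanly when flattened.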


@@ -156,6 +156,10 @@ pub fn migrate_settings(text: &str) -> Result<Option<String>> {
migrations::m_2025_06_25::SETTINGS_PATTERNS,
&SETTINGS_QUERY_2025_06_25,
),
(
migrations::m_2025_06_27::SETTINGS_PATTERNS,
&SETTINGS_QUERY_2025_06_27,
),
];
run_migrations(text, migrations)
}
@@ -262,6 +266,10 @@ define_query!(
SETTINGS_QUERY_2025_06_25,
migrations::m_2025_06_25::SETTINGS_PATTERNS
);
define_query!(
SETTINGS_QUERY_2025_06_27,
migrations::m_2025_06_27::SETTINGS_PATTERNS
);
// custom query
static EDIT_PREDICTION_SETTINGS_MIGRATION_QUERY: LazyLock<Query> = LazyLock::new(|| {
@@ -286,6 +294,15 @@ mod tests {
pretty_assertions::assert_eq!(migrated.as_deref(), output);
}
fn assert_migrate_settings_with_migrations(
migrations: &[(MigrationPatterns, &Query)],
input: &str,
output: Option<&str>,
) {
let migrated = run_migrations(input, migrations).unwrap();
pretty_assertions::assert_eq!(migrated.as_deref(), output);
}
#[test]
fn test_replace_array_with_single_string() {
assert_migrate_keymap(
@@ -873,7 +890,11 @@ mod tests {
#[test]
fn test_mcp_settings_migration() {
assert_migrate_settings(
assert_migrate_settings_with_migrations(
&[(
migrations::m_2025_06_16::SETTINGS_PATTERNS,
&SETTINGS_QUERY_2025_06_16,
)],
r#"{
"context_servers": {
"empty_server": {},
@@ -1058,7 +1079,14 @@ mod tests {
}
}
}"#;
assert_migrate_settings(settings, None);
assert_migrate_settings_with_migrations(
&[(
migrations::m_2025_06_16::SETTINGS_PATTERNS,
&SETTINGS_QUERY_2025_06_16,
)],
settings,
None,
);
}
#[test]
@@ -1131,4 +1159,100 @@ mod tests {
None,
);
}
#[test]
fn test_flatten_context_server_command() {
assert_migrate_settings(
r#"{
"context_servers": {
"some-mcp-server": {
"source": "custom",
"command": {
"path": "npx",
"args": [
"-y",
"@supabase/mcp-server-supabase@latest",
"--read-only",
"--project-ref=<project-ref>"
],
"env": {
"SUPABASE_ACCESS_TOKEN": "<personal-access-token>"
}
}
}
}
}"#,
Some(
r#"{
"context_servers": {
"some-mcp-server": {
"source": "custom",
"command": "npx",
"args": [
"-y",
"@supabase/mcp-server-supabase@latest",
"--read-only",
"--project-ref=<project-ref>"
],
"env": {
"SUPABASE_ACCESS_TOKEN": "<personal-access-token>"
}
}
}
}"#,
),
);
// Test with additional keys in server object
assert_migrate_settings(
r#"{
"context_servers": {
"server-with-extras": {
"source": "custom",
"command": {
"path": "/usr/bin/node",
"args": ["server.js"]
},
"settings": {}
}
}
}"#,
Some(
r#"{
"context_servers": {
"server-with-extras": {
"source": "custom",
"command": "/usr/bin/node",
"args": ["server.js"],
"settings": {}
}
}
}"#,
),
);
// Test command without args or env
assert_migrate_settings(
r#"{
"context_servers": {
"simple-server": {
"source": "custom",
"command": {
"path": "simple-mcp-server"
}
}
}
}"#,
Some(
r#"{
"context_servers": {
"simple-server": {
"source": "custom",
"command": "simple-mcp-server"
}
}
}"#,
),
);
}
}


@@ -135,6 +135,7 @@ pub type ContextServerFactory =
Box<dyn Fn(ContextServerId, Arc<ContextServerConfiguration>) -> Arc<ContextServer>>;
pub struct ContextServerStore {
context_server_settings: HashMap<Arc<str>, ContextServerSettings>,
servers: HashMap<ContextServerId, ContextServerState>,
worktree_store: Entity<WorktreeStore>,
registry: Entity<ContextServerDescriptorRegistry>,
@@ -202,6 +203,11 @@ impl ContextServerStore {
this.available_context_servers_changed(cx);
}),
cx.observe_global::<SettingsStore>(|this, cx| {
let settings = Self::resolve_context_server_settings(&this.worktree_store, cx);
if &this.context_server_settings == settings {
return;
}
this.context_server_settings = settings.clone();
this.available_context_servers_changed(cx);
}),
]
@@ -211,6 +217,8 @@ impl ContextServerStore {
let mut this = Self {
_subscriptions: subscriptions,
context_server_settings: Self::resolve_context_server_settings(&worktree_store, cx)
.clone(),
worktree_store,
registry,
needs_server_update: false,
@@ -268,10 +276,8 @@ impl ContextServerStore {
cx.spawn(async move |this, cx| {
let this = this.upgrade().context("Context server store dropped")?;
let settings = this
.update(cx, |this, cx| {
this.context_server_settings(cx)
.get(&server.id().0)
.cloned()
.update(cx, |this, _| {
this.context_server_settings.get(&server.id().0).cloned()
})
.ok()
.flatten()
@@ -439,12 +445,11 @@ impl ContextServerStore {
}
}
fn context_server_settings<'a>(
&'a self,
fn resolve_context_server_settings<'a>(
worktree_store: &'a Entity<WorktreeStore>,
cx: &'a App,
) -> &'a HashMap<Arc<str>, ContextServerSettings> {
let location = self
.worktree_store
let location = worktree_store
.read(cx)
.visible_worktrees(cx)
.next()
@@ -492,9 +497,9 @@ impl ContextServerStore {
}
async fn maintain_servers(this: WeakEntity<Self>, cx: &mut AsyncApp) -> Result<()> {
let (mut configured_servers, registry, worktree_store) = this.update(cx, |this, cx| {
let (mut configured_servers, registry, worktree_store) = this.update(cx, |this, _| {
(
this.context_server_settings(cx).clone(),
this.context_server_settings.clone(),
this.registry.clone(),
this.worktree_store.clone(),
)
@@ -990,6 +995,33 @@ mod tests {
assert_eq!(store.read(cx).status_for_server(&server_2_id), None);
});
}
// Ensure that nothing happens if the settings do not change
{
let _server_events = assert_server_events(&store, vec![], cx);
set_context_server_configuration(
vec![(
server_1_id.0.clone(),
ContextServerSettings::Extension {
enabled: true,
settings: json!({
"somevalue": false
}),
},
)],
cx,
);
cx.run_until_parked();
cx.update(|cx| {
assert_eq!(
store.read(cx).status_for_server(&server_1_id),
Some(ContextServerStatus::Running)
);
assert_eq!(store.read(cx).status_for_server(&server_2_id), None);
});
}
}
#[gpui::test]


@@ -1037,10 +1037,6 @@ impl Session {
matches!(self.mode, Mode::Building)
}
pub fn is_running(&self) -> bool {
matches!(self.mode, Mode::Running(_))
}
pub fn as_running_mut(&mut self) -> Option<&mut RunningMode> {
match &mut self.mode {
Mode::Running(local_mode) => Some(local_mode),


@@ -170,6 +170,7 @@ pub struct LocalLspStore {
_subscription: gpui::Subscription,
lsp_tree: Entity<LanguageServerTree>,
registered_buffers: HashMap<BufferId, usize>,
buffers_opened_in_servers: HashMap<BufferId, HashSet<LanguageServerId>>,
buffer_pull_diagnostics_result_ids: HashMap<LanguageServerId, HashMap<PathBuf, Option<String>>>,
}
@@ -2546,6 +2547,10 @@ impl LocalLspStore {
vec![snapshot]
});
self.buffers_opened_in_servers
.entry(buffer_id)
.or_default()
.insert(server.server_id());
cx.emit(LspStoreEvent::LanguageServerUpdate {
language_server_id: server.server_id(),
name: None,
@@ -3208,6 +3213,9 @@ impl LocalLspStore {
self.language_servers.remove(server_id_to_remove);
self.buffer_pull_diagnostics_result_ids
.remove(server_id_to_remove);
for buffer_servers in self.buffers_opened_in_servers.values_mut() {
buffer_servers.remove(server_id_to_remove);
}
cx.emit(LspStoreEvent::LanguageServerRemoved(*server_id_to_remove));
}
servers_to_remove.into_keys().collect()
@@ -3787,6 +3795,7 @@ impl LspStore {
}),
lsp_tree: LanguageServerTree::new(manifest_tree, languages.clone(), cx),
registered_buffers: HashMap::default(),
buffers_opened_in_servers: HashMap::default(),
buffer_pull_diagnostics_result_ids: HashMap::default(),
}),
last_formatting_failure: None,
@@ -4159,6 +4168,7 @@ impl LspStore {
lsp_store.lsp_data.remove(&buffer_id);
let local = lsp_store.as_local_mut().unwrap();
local.registered_buffers.remove(&buffer_id);
local.buffers_opened_in_servers.remove(&buffer_id);
if let Some(file) = File::from_dyn(buffer.read(cx).file()).cloned() {
local.unregister_old_buffer_from_language_servers(&buffer, &file, cx);
}
@@ -6235,21 +6245,31 @@ impl LspStore {
} => {
if let Some(cached_data) = self.lsp_data.get(&buffer_id) {
if !version_queried_for.changed_since(&cached_data.colors_for_version) {
if Some(cached_data.cache_version) == known_cache_version {
return None;
} else {
return Some(
Task::ready(Ok(DocumentColors {
colors: cached_data
.colors
.values()
.flatten()
.cloned()
.collect(),
cache_version: Some(cached_data.cache_version),
}))
.shared(),
);
let has_different_servers = self.as_local().is_some_and(|local| {
local
.buffers_opened_in_servers
.get(&buffer_id)
.cloned()
.unwrap_or_default()
!= cached_data.colors.keys().copied().collect()
});
if !has_different_servers {
if Some(cached_data.cache_version) == known_cache_version {
return None;
} else {
return Some(
Task::ready(Ok(DocumentColors {
colors: cached_data
.colors
.values()
.flatten()
.cloned()
.collect(),
cache_version: Some(cached_data.cache_version),
}))
.shared(),
);
}
}
}
}
@@ -7522,6 +7542,14 @@ impl LspStore {
.unwrap_or(true)
})
.map(|(_, server)| server.server_id())
.filter(|server_id| {
self.as_local().is_none_or(|local| {
local
.buffers_opened_in_servers
.get(&snapshot.remote_id())
.is_some_and(|servers| servers.contains(server_id))
})
})
.collect::<Vec<_>>()
});
@@ -10084,6 +10112,7 @@ impl LspStore {
}
// Tell the language server about every open buffer in the worktree that matches the language.
let mut buffer_paths_registered = Vec::new();
self.buffer_store.clone().update(cx, |buffer_store, cx| {
for buffer_handle in buffer_store.buffers() {
let buffer = buffer_handle.read(cx);
@@ -10142,6 +10171,12 @@ impl LspStore {
version,
initial_snapshot.text(),
);
buffer_paths_registered.push(file.abs_path(cx));
local
.buffers_opened_in_servers
.entry(buffer.remote_id())
.or_default()
.insert(server_id);
}
buffer_handle.update(cx, |buffer, cx| {
buffer.set_completion_triggers(
@@ -10163,6 +10198,18 @@ impl LspStore {
}
});
for abs_path in buffer_paths_registered {
cx.emit(LspStoreEvent::LanguageServerUpdate {
language_server_id: server_id,
name: Some(adapter.name()),
message: proto::update_language_server::Variant::RegisteredForBuffer(
proto::RegisteredForBuffer {
buffer_abs_path: abs_path.to_string_lossy().to_string(),
},
),
});
}
cx.notify();
}
@@ -10612,6 +10659,9 @@ impl LspStore {
}
if let Some(local) = self.as_local_mut() {
local.buffer_pull_diagnostics_result_ids.remove(&for_server);
for buffer_servers in local.buffers_opened_in_servers.values_mut() {
buffer_servers.remove(&for_server);
}
}
}


@@ -2,6 +2,7 @@ use std::{
ops::ControlFlow,
path::{Path, PathBuf},
sync::Arc,
time::Duration,
};
use anyhow::{Context as _, Result, anyhow};
@@ -527,26 +528,6 @@ impl PrettierStore {
let mut new_plugins = plugins.collect::<HashSet<_>>();
let node = self.node.clone();
let fs = Arc::clone(&self.fs);
let locate_prettier_installation = match worktree.and_then(|worktree_id| {
self.worktree_store
.read(cx)
.worktree_for_id(worktree_id, cx)
.map(|worktree| worktree.read(cx).abs_path())
}) {
Some(locate_from) => {
let installed_prettiers = self.prettier_instances.keys().cloned().collect();
cx.background_spawn(async move {
Prettier::locate_prettier_installation(
fs.as_ref(),
&installed_prettiers,
locate_from.as_ref(),
)
.await
})
}
None => Task::ready(Ok(ControlFlow::Continue(None))),
};
new_plugins.retain(|plugin| !self.default_prettier.installed_plugins.contains(plugin));
let mut installation_attempt = 0;
let previous_installation_task = match &mut self.default_prettier.prettier {
@@ -574,15 +555,34 @@ impl PrettierStore {
}
};
log::info!("Initializing default prettier with plugins {new_plugins:?}");
let plugins_to_install = new_plugins.clone();
let fs = Arc::clone(&self.fs);
let new_installation_task = cx
.spawn(async move |project, cx| {
match locate_prettier_installation
.spawn(async move |prettier_store, cx| {
cx.background_executor().timer(Duration::from_millis(30)).await;
let location_data = prettier_store.update(cx, |prettier_store, cx| {
worktree.and_then(|worktree_id| {
prettier_store.worktree_store
.read(cx)
.worktree_for_id(worktree_id, cx)
.map(|worktree| worktree.read(cx).abs_path())
}).map(|locate_from| {
let installed_prettiers = prettier_store.prettier_instances.keys().cloned().collect();
(locate_from, installed_prettiers)
})
})?;
let locate_prettier_installation = match location_data {
Some((locate_from, installed_prettiers)) => Prettier::locate_prettier_installation(
fs.as_ref(),
&installed_prettiers,
locate_from.as_ref(),
)
.await
.context("locate prettier installation")
.map_err(Arc::new)?
.context("locate prettier installation").map_err(Arc::new)?,
None => ControlFlow::Continue(None),
};
match locate_prettier_installation
{
ControlFlow::Break(()) => return Ok(()),
ControlFlow::Continue(prettier_path) => {
@@ -593,8 +593,8 @@ impl PrettierStore {
if let Some(previous_installation_task) = previous_installation_task {
if let Err(e) = previous_installation_task.await {
log::error!("Failed to install default prettier: {e:#}");
project.update(cx, |project, _| {
if let PrettierInstallation::NotInstalled { attempts, not_installed_plugins, .. } = &mut project.default_prettier.prettier {
prettier_store.update(cx, |prettier_store, _| {
if let PrettierInstallation::NotInstalled { attempts, not_installed_plugins, .. } = &mut prettier_store.default_prettier.prettier {
*attempts += 1;
new_plugins.extend(not_installed_plugins.iter().cloned());
installation_attempt = *attempts;
@@ -604,8 +604,8 @@ impl PrettierStore {
}
};
if installation_attempt > prettier::FAIL_THRESHOLD {
project.update(cx, |project, _| {
if let PrettierInstallation::NotInstalled { installation_task, .. } = &mut project.default_prettier.prettier {
prettier_store.update(cx, |prettier_store, _| {
if let PrettierInstallation::NotInstalled { installation_task, .. } = &mut prettier_store.default_prettier.prettier {
*installation_task = None;
};
})?;
@@ -614,19 +614,20 @@ impl PrettierStore {
);
return Ok(());
}
project.update(cx, |project, _| {
prettier_store.update(cx, |prettier_store, _| {
new_plugins.retain(|plugin| {
!project.default_prettier.installed_plugins.contains(plugin)
!prettier_store.default_prettier.installed_plugins.contains(plugin)
});
if let PrettierInstallation::NotInstalled { not_installed_plugins, .. } = &mut project.default_prettier.prettier {
if let PrettierInstallation::NotInstalled { not_installed_plugins, .. } = &mut prettier_store.default_prettier.prettier {
not_installed_plugins.retain(|plugin| {
!project.default_prettier.installed_plugins.contains(plugin)
!prettier_store.default_prettier.installed_plugins.contains(plugin)
});
not_installed_plugins.extend(new_plugins.iter().cloned());
}
needs_install |= !new_plugins.is_empty();
})?;
if needs_install {
log::info!("Initializing default prettier with plugins {new_plugins:?}");
let installed_plugins = new_plugins.clone();
cx.background_spawn(async move {
install_prettier_packages(fs.as_ref(), new_plugins, node).await?;
@@ -637,17 +638,27 @@ impl PrettierStore {
.await
.context("prettier & plugins install")
.map_err(Arc::new)?;
log::info!("Initialized prettier with plugins: {installed_plugins:?}");
project.update(cx, |project, _| {
project.default_prettier.prettier =
log::info!("Initialized default prettier with plugins: {installed_plugins:?}");
prettier_store.update(cx, |prettier_store, _| {
prettier_store.default_prettier.prettier =
PrettierInstallation::Installed(PrettierInstance {
attempt: 0,
prettier: None,
});
project.default_prettier
prettier_store.default_prettier
.installed_plugins
.extend(installed_plugins);
})?;
} else {
prettier_store.update(cx, |prettier_store, _| {
if let PrettierInstallation::NotInstalled { .. } = &mut prettier_store.default_prettier.prettier {
prettier_store.default_prettier.prettier =
PrettierInstallation::Installed(PrettierInstance {
attempt: 0,
prettier: None,
});
}
})?;
}
}
}
@@ -767,6 +778,7 @@ pub(super) async fn format_with_prettier(
}
}
#[derive(Debug)]
pub struct DefaultPrettier {
prettier: PrettierInstallation,
installed_plugins: HashSet<Arc<str>>,

View File

@@ -2975,6 +2975,20 @@ impl Project {
}),
Err(_) => {}
},
SettingsObserverEvent::LocalDebugScenariosUpdated(result) => match result {
Err(InvalidSettingsError::Debug { message, path }) => {
let message =
format!("Failed to set local debug scenarios in {path:?}:\n{message}");
cx.emit(Event::Toast {
notification_id: format!("local-debug-scenarios-{path:?}").into(),
message,
});
}
Ok(path) => cx.emit(Event::HideToast {
notification_id: format!("local-debug-scenarios-{path:?}").into(),
}),
Err(_) => {}
},
}
}

View File

@@ -97,9 +97,8 @@ pub enum ContextServerSettings {
/// Whether the context server is enabled.
#[serde(default = "default_true")]
enabled: bool,
/// The command to run this context server.
///
/// This will override the command set by an extension.
#[serde(flatten)]
command: ContextServerCommand,
},
Extension {
@@ -555,6 +554,7 @@ pub enum SettingsObserverMode {
pub enum SettingsObserverEvent {
LocalSettingsUpdated(Result<PathBuf, InvalidSettingsError>),
LocalTasksUpdated(Result<PathBuf, InvalidSettingsError>),
LocalDebugScenariosUpdated(Result<PathBuf, InvalidSettingsError>),
}
impl EventEmitter<SettingsObserverEvent> for SettingsObserver {}
@@ -566,6 +566,7 @@ pub struct SettingsObserver {
project_id: u64,
task_store: Entity<TaskStore>,
_global_task_config_watcher: Task<()>,
_global_debug_config_watcher: Task<()>,
}
/// SettingsObserver observes changes to .zed/{settings, task}.json files in local worktrees
@@ -598,6 +599,11 @@ impl SettingsObserver {
paths::tasks_file().clone(),
cx,
),
_global_debug_config_watcher: Self::subscribe_to_global_debug_scenarios_changes(
fs.clone(),
paths::debug_scenarios_file().clone(),
cx,
),
}
}
@@ -618,6 +624,11 @@ impl SettingsObserver {
paths::tasks_file().clone(),
cx,
),
_global_debug_config_watcher: Self::subscribe_to_global_debug_scenarios_changes(
fs.clone(),
paths::debug_scenarios_file().clone(),
cx,
),
}
}
@@ -1048,6 +1059,61 @@ impl SettingsObserver {
}
})
}
fn subscribe_to_global_debug_scenarios_changes(
fs: Arc<dyn Fs>,
file_path: PathBuf,
cx: &mut Context<Self>,
) -> Task<()> {
let mut user_tasks_file_rx =
watch_config_file(&cx.background_executor(), fs, file_path.clone());
let user_tasks_content = cx.background_executor().block(user_tasks_file_rx.next());
let weak_entry = cx.weak_entity();
cx.spawn(async move |settings_observer, cx| {
let Ok(task_store) = settings_observer.read_with(cx, |settings_observer, _| {
settings_observer.task_store.clone()
}) else {
return;
};
if let Some(user_tasks_content) = user_tasks_content {
let Ok(()) = task_store.update(cx, |task_store, cx| {
task_store
.update_user_debug_scenarios(
TaskSettingsLocation::Global(&file_path),
Some(&user_tasks_content),
cx,
)
.log_err();
}) else {
return;
};
}
while let Some(user_tasks_content) = user_tasks_file_rx.next().await {
let Ok(result) = task_store.update(cx, |task_store, cx| {
task_store.update_user_debug_scenarios(
TaskSettingsLocation::Global(&file_path),
Some(&user_tasks_content),
cx,
)
}) else {
break;
};
weak_entry
.update(cx, |_, cx| match result {
Ok(()) => cx.emit(SettingsObserverEvent::LocalDebugScenariosUpdated(Ok(
file_path.clone(),
))),
Err(err) => cx.emit(SettingsObserverEvent::LocalDebugScenariosUpdated(
Err(InvalidSettingsError::Tasks {
path: file_path.clone(),
message: err.to_string(),
}),
)),
})
.ok();
}
})
}
}
pub fn local_settings_kind_from_proto(kind: proto::LocalSettingsKind) -> LocalSettingsKind {

View File

@@ -101,7 +101,7 @@ pub struct BufferSearchBar {
search_options: SearchOptions,
default_options: SearchOptions,
configured_options: SearchOptions,
query_contains_error: bool,
query_error: Option<String>,
dismissed: bool,
search_history: SearchHistory,
search_history_cursor: SearchHistoryCursor,
@@ -217,7 +217,7 @@ impl Render for BufferSearchBar {
if in_replace {
key_context.add("in_replace");
}
let editor_border = if self.query_contains_error {
let editor_border = if self.query_error.is_some() {
Color::Error.color(cx)
} else {
cx.theme().colors().border
@@ -469,6 +469,14 @@ impl Render for BufferSearchBar {
)
});
let query_error_line = self.query_error.as_ref().map(|error| {
Label::new(error)
.size(LabelSize::Small)
.color(Color::Error)
.mt_neg_1()
.ml_2()
});
v_flex()
.id("buffer_search")
.gap_2()
@@ -524,6 +532,7 @@ impl Render for BufferSearchBar {
.w_full()
},
))
.children(query_error_line)
.children(replace_line)
}
}
@@ -728,7 +737,7 @@ impl BufferSearchBar {
configured_options: search_options,
search_options,
pending_search: None,
query_contains_error: false,
query_error: None,
dismissed: true,
search_history: SearchHistory::new(
Some(MAX_BUFFER_SEARCH_HISTORY_SIZE),
@@ -1230,7 +1239,7 @@ impl BufferSearchBar {
self.pending_search.take();
if let Some(active_searchable_item) = self.active_searchable_item.as_ref() {
self.query_contains_error = false;
self.query_error = None;
if query.is_empty() {
self.clear_active_searchable_item_matches(window, cx);
let _ = done_tx.send(());
@@ -1255,8 +1264,8 @@ impl BufferSearchBar {
None,
) {
Ok(query) => query.with_replacement(self.replacement(cx)),
Err(_) => {
self.query_contains_error = true;
Err(e) => {
self.query_error = Some(e.to_string());
self.clear_active_searchable_item_matches(window, cx);
cx.notify();
return done_rx;
@@ -1274,8 +1283,8 @@ impl BufferSearchBar {
None,
) {
Ok(query) => query.with_replacement(self.replacement(cx)),
Err(_) => {
self.query_contains_error = true;
Err(e) => {
self.query_error = Some(e.to_string());
self.clear_active_searchable_item_matches(window, cx);
cx.notify();
return done_rx;

View File

@@ -208,6 +208,7 @@ pub struct ProjectSearchView {
included_opened_only: bool,
regex_language: Option<Arc<Language>>,
_subscriptions: Vec<Subscription>,
query_error: Option<String>,
}
#[derive(Debug, Clone)]
@@ -876,6 +877,7 @@ impl ProjectSearchView {
included_opened_only: false,
regex_language: None,
_subscriptions: subscriptions,
query_error: None,
};
this.entity_changed(window, cx);
this
@@ -1209,14 +1211,16 @@ impl ProjectSearchView {
if should_unmark_error {
cx.notify();
}
self.query_error = None;
Some(query)
}
Err(_e) => {
Err(e) => {
let should_mark_error = self.panels_with_errors.insert(InputPanel::Query);
if should_mark_error {
cx.notify();
}
self.query_error = Some(e.to_string());
None
}
@@ -2291,6 +2295,14 @@ impl Render for ProjectSearchBar {
key_context.add("in_replace");
}
let query_error_line = search.query_error.as_ref().map(|error| {
Label::new(error)
.size(LabelSize::Small)
.color(Color::Error)
.mt_neg_1()
.ml_2()
});
v_flex()
.py(px(1.0))
.key_context(key_context)
@@ -2342,6 +2354,7 @@ impl Render for ProjectSearchBar {
.gap_2()
.w_full()
.child(search_line)
.children(query_error_line)
.children(replace_line)
.children(filter_line)
}

View File

@@ -8,7 +8,7 @@ use gpui::{App, AsyncApp, BorrowAppContext, Global, Task, UpdateGlobal};
use paths::{EDITORCONFIG_NAME, local_settings_file_relative_path, task_file_name};
use schemars::{JsonSchema, r#gen::SchemaGenerator, schema::RootSchema};
use serde::{Deserialize, Serialize, de::DeserializeOwned};
use serde_json::Value;
use serde_json::{Value, json};
use smallvec::SmallVec;
use std::{
any::{Any, TypeId, type_name},
@@ -967,16 +967,38 @@ impl SettingsStore {
}
}
for release_stage in ["dev", "nightly", "stable", "preview"] {
let schema = combined_schema.schema.clone();
combined_schema
.schema
.object()
.properties
.insert(release_stage.to_string(), schema.into());
const ZED_SETTINGS: &str = "ZedSettings";
let RootSchema {
meta_schema,
schema: zed_settings_schema,
mut definitions,
} = combined_schema;
definitions.insert(ZED_SETTINGS.to_string(), zed_settings_schema.into());
let zed_settings_ref = Schema::new_ref(format!("#/definitions/{ZED_SETTINGS}"));
// settings file contents match ZedSettings, plus a ZedSettings override per release stage
let mut root_schema = json!({
"allOf": [
zed_settings_ref,
{
"properties": {
"dev": zed_settings_ref,
"nightly": zed_settings_ref,
"stable": zed_settings_ref,
"preview": zed_settings_ref,
}
}
],
"definitions": definitions,
});
if let Some(meta_schema) = meta_schema {
if let Some(root_schema_object) = root_schema.as_object_mut() {
root_schema_object.insert("$schema".to_string(), meta_schema.into());
}
}
serde_json::to_value(&combined_schema).unwrap()
root_schema
}
fn recompute_values(

View File

@@ -93,7 +93,7 @@ fn task_type_to_adapter_name(task_type: &str) -> String {
"php" => "PHP",
"cppdbg" | "lldb" => "CodeLLDB",
"debugpy" => "Debugpy",
"rdbg" => "Ruby",
"rdbg" => "rdbg",
_ => task_type,
}
.to_owned()

View File

@@ -302,14 +302,14 @@ mod test {
use crate::{state::Mode, test::VimTestContext};
#[gpui::test]
async fn test_next_word_start(cx: &mut gpui::TestAppContext) {
async fn test_word_motions(cx: &mut gpui::TestAppContext) {
let mut cx = VimTestContext::new(cx, true).await;
// «
// ˇ
// »
cx.set_state(
indoc! {"
The quˇick brown
Th«e quiˇ»ck brown
fox jumps over
the lazy dog."},
Mode::HelixNormal,
@@ -334,6 +334,32 @@ mod test {
the lazy dog."},
Mode::HelixNormal,
);
cx.simulate_keystrokes("2 b");
cx.assert_state(
indoc! {"
The «ˇquick »brown
fox jumps over
the lazy dog."},
Mode::HelixNormal,
);
cx.simulate_keystrokes("down e up");
cx.assert_state(
indoc! {"
The quicˇk brown
fox jumps over
the lazy dog."},
Mode::HelixNormal,
);
cx.set_state("aa\n «ˇbb»", Mode::HelixNormal);
cx.simulate_keystroke("b");
cx.assert_state("aa\n«ˇ »bb", Mode::HelixNormal);
}
// #[gpui::test]
@@ -448,4 +474,21 @@ mod test {
Mode::HelixNormal,
);
}
#[gpui::test]
async fn test_newline_char(cx: &mut gpui::TestAppContext) {
let mut cx = VimTestContext::new(cx, true).await;
cx.set_state("aa«\nˇ»bb cc", Mode::HelixNormal);
cx.simulate_keystroke("w");
cx.assert_state("aa\n«bb ˇ»cc", Mode::HelixNormal);
cx.set_state("aa«\nˇ»", Mode::HelixNormal);
cx.simulate_keystroke("b");
cx.assert_state("«ˇaa»\n", Mode::HelixNormal);
}
}

View File

@@ -245,61 +245,63 @@ impl Vim {
}) else {
return;
};
if let Some(mode) = mode {
self.switch_mode(mode, false, window, cx)
}
if mode != Some(self.mode) {
if let Some(mode) = mode {
self.switch_mode(mode, false, window, cx)
}
match selection {
RecordedSelection::SingleLine { cols } => {
if cols > 1 {
self.visual_motion(Motion::Right, Some(cols as usize - 1), window, cx)
match selection {
RecordedSelection::SingleLine { cols } => {
if cols > 1 {
self.visual_motion(Motion::Right, Some(cols as usize - 1), window, cx)
}
}
}
RecordedSelection::Visual { rows, cols } => {
self.visual_motion(
Motion::Down {
display_lines: false,
},
Some(rows as usize),
window,
cx,
);
self.visual_motion(
Motion::StartOfLine {
display_lines: false,
},
None,
window,
cx,
);
if cols > 1 {
self.visual_motion(Motion::Right, Some(cols as usize - 1), window, cx)
RecordedSelection::Visual { rows, cols } => {
self.visual_motion(
Motion::Down {
display_lines: false,
},
Some(rows as usize),
window,
cx,
);
self.visual_motion(
Motion::StartOfLine {
display_lines: false,
},
None,
window,
cx,
);
if cols > 1 {
self.visual_motion(Motion::Right, Some(cols as usize - 1), window, cx)
}
}
}
RecordedSelection::VisualBlock { rows, cols } => {
self.visual_motion(
Motion::Down {
display_lines: false,
},
Some(rows as usize),
window,
cx,
);
if cols > 1 {
self.visual_motion(Motion::Right, Some(cols as usize - 1), window, cx);
RecordedSelection::VisualBlock { rows, cols } => {
self.visual_motion(
Motion::Down {
display_lines: false,
},
Some(rows as usize),
window,
cx,
);
if cols > 1 {
self.visual_motion(Motion::Right, Some(cols as usize - 1), window, cx);
}
}
RecordedSelection::VisualLine { rows } => {
self.visual_motion(
Motion::Down {
display_lines: false,
},
Some(rows as usize),
window,
cx,
);
}
RecordedSelection::None => {}
}
RecordedSelection::VisualLine { rows } => {
self.visual_motion(
Motion::Down {
display_lines: false,
},
Some(rows as usize),
window,
cx,
);
}
RecordedSelection::None => {}
}
// insert internally uses repeat to handle counts

View File

@@ -196,7 +196,7 @@ impl Vim {
}
clipboard_selections.push(ClipboardSelection {
len: text.len() - initial_len,
is_entire_line: kind.linewise(),
is_entire_line: false,
first_line_indent: buffer.indent_size_for_line(MultiBufferRow(start.row)).len,
});
}

View File

@@ -2071,3 +2071,42 @@ async fn test_paragraph_multi_delete(cx: &mut gpui::TestAppContext) {
cx.simulate_shared_keystrokes("4 d a p").await;
cx.shared_state().await.assert_eq(indoc! {"ˇ"});
}
#[gpui::test]
async fn test_multi_cursor_replay(cx: &mut gpui::TestAppContext) {
let mut cx = VimTestContext::new(cx, true).await;
cx.set_state(
indoc! {
"
oˇne one one
two two two
"
},
Mode::Normal,
);
cx.simulate_keystrokes("3 g l s wow escape escape");
cx.assert_state(
indoc! {
"
woˇw wow wow
two two two
"
},
Mode::Normal,
);
cx.simulate_keystrokes("2 j 3 g l .");
cx.assert_state(
indoc! {
"
wow wow wow
woˇw woˇw woˇw
"
},
Mode::Normal,
);
}

View File

@@ -8,7 +8,8 @@ use http_client::{HttpClient, Method};
use language_model::{LlmApiToken, RefreshLlmTokenListener};
use web_search::{WebSearchProvider, WebSearchProviderId};
use zed_llm_client::{
CLIENT_SUPPORTS_EXA_WEB_SEARCH_PROVIDER_HEADER_NAME, WebSearchBody, WebSearchResponse,
CLIENT_SUPPORTS_EXA_WEB_SEARCH_PROVIDER_HEADER_NAME, EXPIRED_LLM_TOKEN_HEADER_NAME,
WebSearchBody, WebSearchResponse,
};
pub struct CloudWebSearchProvider {
@@ -73,32 +74,51 @@ async fn perform_web_search(
llm_api_token: LlmApiToken,
body: WebSearchBody,
) -> Result<WebSearchResponse> {
const MAX_RETRIES: usize = 3;
let http_client = &client.http_client();
let mut retries_remaining = MAX_RETRIES;
let mut token = llm_api_token.acquire(&client).await?;
let token = llm_api_token.acquire(&client).await?;
loop {
if retries_remaining == 0 {
return Err(anyhow::anyhow!(
"error performing web search, max retries exceeded"
));
}
let request = http_client::Request::builder()
.method(Method::POST)
.uri(http_client.build_zed_llm_url("/web_search", &[])?.as_ref())
.header("Content-Type", "application/json")
.header("Authorization", format!("Bearer {token}"))
.header(CLIENT_SUPPORTS_EXA_WEB_SEARCH_PROVIDER_HEADER_NAME, "true")
.body(serde_json::to_string(&body)?.into())?;
let mut response = http_client
.send(request)
.await
.context("failed to send web search request")?;
let request = http_client::Request::builder()
.method(Method::POST)
.uri(http_client.build_zed_llm_url("/web_search", &[])?.as_ref())
.header("Content-Type", "application/json")
.header("Authorization", format!("Bearer {token}"))
.header(CLIENT_SUPPORTS_EXA_WEB_SEARCH_PROVIDER_HEADER_NAME, "true")
.body(serde_json::to_string(&body)?.into())?;
let mut response = http_client
.send(request)
.await
.context("failed to send web search request")?;
if response.status().is_success() {
let mut body = String::new();
response.body_mut().read_to_string(&mut body).await?;
return Ok(serde_json::from_str(&body)?);
} else {
let mut body = String::new();
response.body_mut().read_to_string(&mut body).await?;
anyhow::bail!(
"error performing web search.\nStatus: {:?}\nBody: {body}",
response.status(),
);
if response.status().is_success() {
let mut body = String::new();
response.body_mut().read_to_string(&mut body).await?;
return Ok(serde_json::from_str(&body)?);
} else if response
.headers()
.get(EXPIRED_LLM_TOKEN_HEADER_NAME)
.is_some()
{
token = llm_api_token.refresh(&client).await?;
retries_remaining -= 1;
} else {
// For now we will only retry if the LLM token is expired,
// not if the request failed for any other reason.
let mut body = String::new();
response.body_mut().read_to_string(&mut body).await?;
anyhow::bail!(
"error performing web search.\nStatus: {:?}\nBody: {body}",
response.status(),
);
}
}
}

View File

@@ -7,7 +7,7 @@ use svg_preview::{
OpenPreview as SvgOpenPreview, OpenPreviewToTheSide as SvgOpenPreviewToTheSide,
svg_preview_view::SvgPreviewView,
};
use ui::{IconButtonShape, Tooltip, prelude::*, text_for_keystroke};
use ui::{Tooltip, prelude::*, text_for_keystroke};
use workspace::Workspace;
use super::QuickActionBar;
@@ -66,7 +66,6 @@ impl QuickActionBar {
};
let button = IconButton::new(button_id, IconName::Eye)
.shape(IconButtonShape::Square)
.icon_size(IconSize::Small)
.style(ButtonStyle::Subtle)
.tooltip(move |window, cx| {

View File

@@ -40,13 +40,11 @@ You can connect them by adding their commands directly to your `settings.json`,
```json
{
"context_servers": {
"some-context-server": {
"your-mcp-server": {
"source": "custom",
"command": {
"path": "some-command",
"args": ["arg-1", "arg-2"],
"env": {}
}
"command": "some-command",
"args": ["arg-1", "arg-2"],
"env": {}
}
}
}

View File

@@ -1,9 +1,14 @@
# R
R support is available through the [R extension](https://github.com/ocsmit/zed-r).
R support is available via multiple R Zed extensions:
- Tree-sitter: [r-lib/tree-sitter-r](https://github.com/r-lib/tree-sitter-r)
- Language-Server: [REditorSupport/languageserver](https://github.com/REditorSupport/languageserver)
- [ocsmit/zed-r](https://github.com/ocsmit/zed-r)
- Tree-sitter: [r-lib/tree-sitter-r](https://github.com/r-lib/tree-sitter-r)
- Language-Server: [REditorSupport/languageserver](https://github.com/REditorSupport/languageserver)
- [posit-dev/air](https://github.com/posit-dev/air/tree/main/editors/zed)
- Language-Server: [posit-dev/air](https://github.com/posit-dev/air)
## Installation
@@ -15,7 +20,7 @@ install.packages("languageserver")
install.packages("lintr")
```
3. Install the [R Zed extension](https://github.com/ocsmit/zed-r) through Zed's extensions manager.
3. Install the [ocsmit/zed-r](https://github.com/ocsmit/zed-r) extension through Zed's extensions manager.
For example on macOS:
@@ -28,7 +33,70 @@ Rscript -e 'packageVersion("languageserver")'
Rscript -e 'packageVersion("lintr")'
```
## Ark Installation
## Configuration
### Linting
`REditorSupport/languageserver` bundles support for [r-lib/lintr](https://github.com/r-lib/lintr) as a linter. It can be configured with a `.lintr` file inside your project (or in your home directory for global defaults).
```r
linters: linters_with_defaults(
line_length_linter(120),
commented_code_linter = NULL
)
exclusions: list(
"inst/doc/creating_linters.R" = 1,
"inst/example/bad.R",
"tests/testthat/exclusions-test"
)
```
Or, to exclude the entire project from linting:
```r
exclusions: list(".")
```
See [Using lintr](https://lintr.r-lib.org/articles/lintr.html) for a complete list of options.
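To verify that your `.lintr` configuration behaves as intended, you can run the linter directly from an R session (a minimal sketch, assuming the `lintr` package is installed; the file path is hypothetical):

```r
# Runs the configured linters against a single file and prints any findings.
lintr::lint("R/my_script.R")
```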
### Formatting
`REditorSupport/languageserver` bundles support for [r-lib/styler](https://github.com/r-lib/styler) as a formatter. See [Customizing Styler](https://cran.r-project.org/web/packages/styler/vignettes/customizing_styler.html) for more information on how to customize its behavior.
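As a quick illustration (a sketch, assuming the `styler` package is installed; the file path and indent value are hypothetical), you can restyle a file with a tweaked tidyverse style from an R session:

```r
# Reformats the file in place; extra arguments such as indent_by are
# forwarded to the default tidyverse_style() transformer.
styler::style_file("R/my_script.R", indent_by = 4)
```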
<!--
TBD: Get this working
### REditorSupport/languageserver Configuration
You can configure the [R languageserver settings](https://github.com/REditorSupport/languageserver#settings) via Zed Project Settings `.zed/settings.json` or Zed User Settings `~/.config/zed/settings.json`:
For example to disable Lintr linting and suppress code snippet suggestions (both enabled by default):
```json
{
"lsp": {
"r_language_server": {
"settings": {
"r": {
"lsp": {
"diagnostics": false,
"snippet_support": false
}
}
}
}
}
}
```
-->
<!--
TBD: R REPL Docs
## REPL
### Ark Installation
To use the Zed REPL with R you need to install [Ark](https://github.com/posit-dev/ark), an R Kernel for Jupyter applications.
You can download the latest version from the [Ark GitHub Releases](https://github.com/posit-dev/ark/releases) and then extract the `ark` binary to a directory in your `PATH`.
@@ -56,6 +124,4 @@ unzip ark-latest-linux.zip ark
sudo mv /tmp/ark /usr/local/bin/
```
<!--
TBD: R REPL Docs
-->

View File

@@ -79,20 +79,34 @@ h6 code {
display: none !important;
}
h1 {
font-size: 3.4rem;
}
h2 {
padding-bottom: 1rem;
border-bottom: 1px solid;
border-color: var(--border-light);
}
h2,
h3 {
margin-block-start: 1.5em;
margin-block-end: 0;
font-size: 2rem;
}
h4 {
font-size: 1.8rem;
}
h5 {
font-size: 1.6rem;
}
h2,
h3,
h4,
h5 {
margin-block-start: 2em;
margin-block-start: 1.5em;
margin-block-end: 0;
}
.header + .header h3,

View File

@@ -1,7 +1,7 @@
id = "emmet"
name = "Emmet"
description = "Emmet support"
version = "0.0.3"
version = "0.0.4"
schema_version = 1
authors = ["Piotr Osiewicz <piotr@zed.dev>"]
repository = "https://github.com/zed-industries/zed"
@@ -9,7 +9,7 @@ repository = "https://github.com/zed-industries/zed"
[language_servers.emmet-language-server]
name = "Emmet Language Server"
language = "HTML"
languages = ["HTML", "PHP", "ERB", "JavaScript", "TSX", "CSS"]
languages = ["HTML", "PHP", "ERB", "JavaScript", "TSX", "CSS", "HEEX", "Elixir"]
[language_servers.emmet-language-server.language_ids]
"HTML" = "html"
@@ -18,3 +18,5 @@ languages = ["HTML", "PHP", "ERB", "JavaScript", "TSX", "CSS"]
"JavaScript" = "javascriptreact"
"TSX" = "typescriptreact"
"CSS" = "css"
"HEEX" = "heex"
"Elixir" = "heex"

View File

@@ -27,7 +27,7 @@ elif [[ "$LICENSE_FLAG" == *"agpl"* ]]; then
LICENSE_FILE="LICENSE-AGPL"
else
LICENSE_MODE="GPL-3.0-or-later"
LICENSE_FILE="LICENSE"
LICENSE_FILE="LICENSE-GPL"
fi
if [[ ! "$CRATE_NAME" =~ ^[a-z0-9_]+$ ]]; then
@@ -39,7 +39,7 @@ CRATE_PATH="crates/$CRATE_NAME"
mkdir -p "$CRATE_PATH/src"
# Symlink the license
ln -sf "../../../$LICENSE_FILE" "$CRATE_PATH/LICENSE"
ln -sf "../../../$LICENSE_FILE" "$CRATE_PATH/$LICENSE_FILE"
CARGO_TOML_TEMPLATE=$(cat << 'EOF'
[package]