Compare commits

..

34 Commits

Author SHA1 Message Date
Smit Barmase
514121d6d4 fix indents when multicursor 2025-04-16 05:09:34 +05:30
Agus Zubiaga
0182e09e33 eval: Do not create run files for skipped examples (#28800)
Release Notes:

- N/A
2025-04-15 18:00:04 +00:00
Smit Barmase
6f6e207eb5 editor: Move mouse context menu code actions to the bottom (#28799)
Release Notes:

- N/A
2025-04-15 23:27:32 +05:30
Marshall Bowers
149cdeca29 collab: Add kind and period start/end timestamps to billing_subscriptions (#28796)
This PR updates the `billing_subscriptions` table with some new columns

- `kind` - The kind of the subscription (used to denote Zed Pro vs
existing)
- `stripe_current_period_start` - The Stripe timestamp of when the
subscription's current period starts
- `stripe_current_period_end` - The Stripe timestamp of when the
subscription's current period ends
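
A hedged sketch, in Rust, of how the new columns might map onto an application-side model. The column names come from the PR; `SubscriptionKind`'s variants, the field types, and the helper are assumptions, not the actual collab schema.

```rust
// Hypothetical model for the new billing_subscriptions columns.
#[derive(Debug, PartialEq)]
enum SubscriptionKind {
    ZedPro,
    Existing,
}

struct BillingSubscription {
    kind: SubscriptionKind,
    // Stripe reports period boundaries as Unix timestamps (seconds).
    stripe_current_period_start: i64,
    stripe_current_period_end: i64,
}

impl BillingSubscription {
    // Length of the current billing period in seconds.
    fn period_len_secs(&self) -> i64 {
        self.stripe_current_period_end - self.stripe_current_period_start
    }
}
```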

Release Notes:

- N/A

Co-authored-by: Mikayla <mikayla@zed.dev>
2025-04-15 13:48:03 -04:00
Smit Barmase
92dc812aea git_ui: Fix commit/amend telemetry and amend click from commit modal (#28795)
Release Notes:

- N/A
2025-04-15 23:17:04 +05:30
Bennet Bo Fenner
c7e80c80c6 gemini: Pass system prompt as system instructions (#28793)
https://ai.google.dev/gemini-api/docs/text-generation#system-instructions
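
Per the linked docs, the REST `generateContent` request carries the system prompt in a dedicated field rather than as a user turn. A minimal request body looks roughly like this (field casing per the docs' curl examples; the prompt text is illustrative):

```json
{
  "system_instruction": {
    "parts": [{ "text": "You are a helpful coding assistant." }]
  },
  "contents": [
    { "role": "user", "parts": [{ "text": "Explain borrowing in Rust." }] }
  ]
}
```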

Release Notes:

- agent: Improve performance of Gemini models
2025-04-15 19:45:47 +02:00
Bennet Bo Fenner
c381a500f8 agent: Show a warning when some tools are incompatible with the selected model (#28755)
WIP

<img width="644" alt="image"
src="https://github.com/user-attachments/assets/b24e1a57-f82e-457c-b788-1b314ade7c84"
/>


<img width="644" alt="image"
src="https://github.com/user-attachments/assets/b158953c-2015-4cc8-b8ed-35c6fcbe162d"
/>


Release Notes:

- agent: Improve compatibility with Gemini Tool Calling APIs. When a
tool is incompatible with the Gemini APIs a warning indicator will be
displayed. Incompatible tools will be automatically excluded from the
conversation

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-04-15 16:58:11 +00:00
Agus Zubiaga
ff4334efc7 eval: Fix stalling on tool confirmation (#28786)
The `always_allow_tool_actions` setting would get overridden with the
default when we loaded each example project, leading to examples
stalling when they run a tool that needed confirmation. There's now a
separate `runner_settings.json` file where we can configure the
environment for the eval.
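
A minimal sketch of what such a file could contain; only `always_allow_tool_actions` is named in the PR, and the surrounding nesting is an assumption:

```json
{
  "assistant": {
    "always_allow_tool_actions": true
  }
}
```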

Release Notes:

- N/A

---------

Co-authored-by: Oleksiy <oleksiy@zed.dev>
2025-04-15 16:53:45 +00:00
Thomas Mickley-Doyle
b1e4e6048a agent: Add more Rust code examples, update TODO check (#28737)
Release Notes:

- N/A
2025-04-15 16:52:08 +00:00
Jason Lee
d0f806456c gpui: Fix snap_to_window_with_margin when window has client inset (#27330)
Release Notes:

- Fixed popup menu snapping to the window so that it leaves a margin on Linux.

This change continues #17159, fixing the same issue on Linux.

| Before | After |
| -- | -- |
| ![Pasted
image](https://github.com/user-attachments/assets/3129d42c-7253-4a3f-a428-86e2a3df38ff)
|
![image](https://github.com/user-attachments/assets/8dc83377-9df7-45ba-805b-1cfdea612ae0)
|
2025-04-15 18:47:00 +02:00
Marshall Bowers
b6cce1ed91 collab: Add support for launching a general-purpose billing portal session (#28785)
This PR adds a new `ManageSubscriptionIntent` that allows users to launch
a general-purpose billing portal session to manage their subscription.

Release Notes:

- N/A
2025-04-15 16:40:22 +00:00
Piotr Osiewicz
05fc9ee396 call: Fix crash when screensharing on MacOS (#28784)
Closes #ISSUE

Release Notes:

- Fixed a crash when screensharing on macOS

Co-authored-by: Conrad <conrad@zed.dev>
Co-authored-by: Anthony Eid <hello@anthonyeid.me>
2025-04-15 16:36:08 +00:00
Danilo Leal
8f52bb92b6 agent: Add ability to interrupt current generation with a new message (#28762)
If you wanted to interrupt the current LLM response that's generating to
send a follow up message, you'd need to stop it first, type your new
message, and then send it. Now, you can just type your new message while
there's a response generating and send it. This will interrupt the
previous response generation and kick off a new one.

Release Notes:

- agent: Allow sending a new message while a response is generating,
interrupting the LLM so it focuses on the most recent prompt.
2025-04-15 13:34:35 -03:00
Cole Miller
144fd0b00d Fix the git panel's commit button sometimes opening the modal (#28767)
Release Notes:

- N/A
2025-04-15 12:16:24 -04:00
Cole Miller
cd4a3fd679 debugger: Skip out-of-bounds breakpoints when deserializing (#28781)
Previously we'd crash when deserializing a breakpoint whose row number
was out of bounds (could happen if the file was externally modified).
This PR fixes that code to skip such breakpoints.

An alternative would be to clip the deserialized `PointUtf16`, but I
think that would mostly result in nonsensical breakpoints.
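
The skip-don't-clip choice can be sketched in a few lines; this is a minimal illustration with hypothetical names (`buffer_row_count` standing in for the buffer's actual line count), not the PR's code:

```rust
// Hypothetical sketch: drop serialized breakpoints whose row no longer
// exists in the buffer, instead of clipping them to the last line.
fn deserialize_breakpoint_rows(serialized_rows: &[u32], buffer_row_count: u32) -> Vec<u32> {
    serialized_rows
        .iter()
        .copied()
        .filter(|&row| row < buffer_row_count)
        .collect()
}
```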

Release Notes:

- N/A
2025-04-15 16:14:01 +00:00
Cole Miller
42c3f4e7cf debugger_ui: Preview thread state when using the dropdown (#28778)
This PR changes the thread list dropdown menu in the debugger UI to
eagerly preview the state of a thread when selecting it, instead of
waiting until confirming the selection.

Release Notes:

- N/A
2025-04-15 12:10:32 -04:00
Marshall Bowers
90dec1d451 collab: Add Zed Pro checkout flow (#28776)
This PR adds support for initiating a checkout flow for Zed Pro.

Release Notes:

- N/A
2025-04-15 15:45:51 +00:00
Conrad Irwin
afabcd1547 Update block diagnostics (#28006)
Release Notes:

- "Block" diagnostics (that show up in the diagnostics view, or when
using `f8`/`shift-f8`) are rendered more clearly
- `f8`/`shift-f8` now always go to the "next" or "prev" diagnostic,
regardless of the state of the editor

![Screenshot 2025-04-09 at 16 42
09](https://github.com/user-attachments/assets/ae6d2ff6-5183-4b74-89d0-fefee1aa11e3)

---------

Co-authored-by: Kirill Bulatov <mail4score@gmail.com>
Co-authored-by: Julia Ryan <juliaryan3.14@gmail.com>
2025-04-15 09:35:13 -06:00
Piotr Osiewicz
ccf9aef767 debugger: Remove LLDB adapter, switch Rust tasks to CodeLLDB (#28773)
Closes #ISSUE

Release Notes:

- N/A
2025-04-15 15:29:43 +00:00
Conrad Irwin
aef78dcffd Tidy up DAP initialization (#28730)
To make DAP work over SSH we want to create the binary
at the project level (so we can wrap it in an `ssh` invocation
transparently).

This means not pushing the adapter down into the session, and resolving
more information ahead-of-time.

Co-authored-by: Anthony Eid <hello@anthonyeid.me>
Co-authored-by: Piotr <piotr@zed.dev>

Release Notes:

- N/A

---------

Co-authored-by: Anthony Eid <hello@anthonyeid.me>
Co-authored-by: Piotr <piotr@zed.dev>
Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
Co-authored-by: Anthony <anthony@zed.dev>
2025-04-15 17:11:29 +02:00
Marshall Bowers
6f0951ff77 debugger_ui: Move DEBUGGER_PANEL_PREFIX out of db (#28768)
This PR moves the `DEBUGGER_PANEL_PREFIX` constant out of the `db` crate
and into `debugger_ui`, since it is specific to that crate.

Release Notes:

- N/A
2025-04-15 14:59:42 +00:00
Bennet Bo Fenner
5e094553fa agent: Return ToolResult from run inside Tool (#28763)
This is just a refactor which adds no functionality.
We now return a `ToolResult` from `Tool::run(...)`. For now this just
wraps the output task in a struct. We'll use this to implement custom
rendering of tools; see #28621.

Release Notes:

- N/A
2025-04-15 14:28:09 +00:00
Kirill Bulatov
32829d9f12 Use proper codenames for macOS versions (#28766)
Closes https://github.com/zed-industries/zed/issues/28765

Release Notes:

- N/A
2025-04-15 14:18:40 +00:00
Agus Zubiaga
e4cf7fe8f5 eval: Improve readability with colors and alignment (#28761)
![CleanShot 2025-04-15 at 10 35
39@2x](https://github.com/user-attachments/assets/495d96fb-fe2f-478b-a9d6-678c1184db9a)


Release Notes:

- N/A
2025-04-15 13:50:01 +00:00
Danilo Leal
2b89b97cd1 agent: Adjust markdown heading sizes (#28759)
Adjust the heading sizes for the Agent Panel so they're not aggressively
huge.

Release Notes:

- N/A
2025-04-15 10:20:35 -03:00
Bennet Bo Fenner
e26f0a331f agent: Make ToolWorkingSet an Entity (#28757)
The motivation is to emit events when the set of enabled tools changes;
we want to use this in #28755.

Release Notes:

- N/A
2025-04-15 14:42:31 +02:00
Danilo Leal
7e1b419243 markdown: Add ability to customize individual heading level (#28733)
This PR adds a new field to the `MarkdownStyle` struct,
`heading_level_styles`, which, via the newly added
`apply_heading_style` function and `HeadingLevelStyles` struct, allows
customizing each individual heading level when rendering/styling
Markdown.
Things like this should now be possible:

```rust
    MarkdownStyle {
        heading_level_styles: Some(HeadingLevelStyles {
            h1: Some(TextStyleRefinement {
                font_size: Some(rems(1.15).into()),
                ..Default::default()
            }),
        }),
        ..Default::default()
    }
```

Release Notes:

- N/A
2025-04-15 09:35:24 -03:00
Piotr Osiewicz
98d001bad5 debugger: Always show process list in attach (#28685)
Closes #ISSUE

Release Notes:

- N/A
2025-04-15 14:13:19 +02:00
François Mockers
d4a985a6e3 Case Insensitive Unicode Text Search: Fallback To Regex (#28752)
Closes #9980

Release Notes:

- Fixed: case insensitive text search with unicode characters
2025-04-15 13:12:37 +02:00
Smit Barmase
616d17f517 git_ui: Force commit modal mode from command palette (#28745)
Depending on whether the `git::commit` or `git::amend` action is
triggered, the commit modal opens in the appropriate mode, handling edge
cases such as already being in amend mode.

Release Notes:

- N/A
2025-04-15 13:30:02 +05:30
Bennet Bo Fenner
e1c42315dc gemini: Fix "invalid argument" error when request contains no tools (#28747)
When we do not have any tools, we want to set the `tools` field to
`None`.
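
The fix boils down to mapping an empty tool list to an absent field. A hedged sketch of that mapping (the helper name is hypothetical; in a serialized request, `None` corresponds to omitting the `tools` key entirely):

```rust
// Hypothetical helper: Gemini rejects an empty `tools` array, so an empty
// list must become `None`, i.e. the field is left out of the request.
fn request_tools(tools: Vec<String>) -> Option<Vec<String>> {
    if tools.is_empty() { None } else { Some(tools) }
}
```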

Release Notes:

- Fixed an issue where Gemini requests would sometimes return a Bad
Request ("Invalid argument...")
2025-04-15 07:57:54 +00:00
Smit Barmase
cfc848d24b git_ui: Fix commit modal dismiss on commit menu click (#28744)
Release Notes:

- N/A
2025-04-15 13:05:10 +05:30
Anthony Eid
d4761cea47 debugger: Remember pane layout from previous debugger session (#28692)
This PR makes the debugger's pane layout persistent across sessions that
use the same debug adapter.

Release Notes:

- N/A

---------

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
Co-authored-by: Cole Miller <m@cole-miller.net>
2025-04-15 06:32:28 +00:00
Richard Feldman
b794919842 Add contents_tool (#28738)
This is a combination of the "read file" and "list directory contents"
tools as part of a push to reduce our quantity of builtin tools by
combining some of them.

The functionality is all there for this tool, although there's room for
improvement on the visuals side: it currently always shows the same icon
and always says "Read" - so you can't tell at a glance when it's reading
a directory vs an individual file. Changing this will require a change
to the `Tool` trait, which can be in a separate PR. (FYI @danilo-leal!)
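
A rough sketch of the dispatch a combined tool implies — read a file or list a directory depending on what the path points at. All names here are hypothetical; this is not the PR's implementation:

```rust
use std::fs;
use std::io;
use std::path::Path;

// Hypothetical sketch of a combined "contents" tool: one entry point that
// lists a directory's entries or reads a file, based on the path's kind.
fn contents(path: &Path) -> io::Result<String> {
    if path.is_dir() {
        let mut names: Vec<String> = fs::read_dir(path)?
            .filter_map(|entry| {
                entry
                    .ok()
                    .map(|e| e.file_name().to_string_lossy().into_owned())
            })
            .collect();
        names.sort();
        Ok(names.join("\n"))
    } else {
        fs::read_to_string(path)
    }
}
```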

<img width="606" alt="Screenshot 2025-04-14 at 11 56 27 PM"
src="https://github.com/user-attachments/assets/bded72af-6476-4469-97c6-2f344629b0e4"
/>

Release Notes:

- Added `contents` tool
2025-04-15 00:54:25 -04:00
170 changed files with 5633 additions and 4816 deletions

9
Cargo.lock generated
View File

@@ -4001,7 +4001,6 @@ dependencies = [
"node_runtime",
"parking_lot",
"paths",
"regex",
"schemars",
"serde",
"serde_json",
@@ -4033,7 +4032,6 @@ dependencies = [
"gpui",
"language",
"paths",
"regex",
"serde",
"serde_json",
"task",
@@ -4177,6 +4175,7 @@ dependencies = [
"collections",
"command_palette_hooks",
"dap",
"db",
"editor",
"env_logger 0.11.8",
"feature_flags",
@@ -4316,19 +4315,24 @@ dependencies = [
"anyhow",
"client",
"collections",
"component",
"ctor",
"editor",
"env_logger 0.11.8",
"gpui",
"indoc",
"language",
"linkme",
"log",
"lsp",
"markdown",
"pretty_assertions",
"project",
"rand 0.8.5",
"serde",
"serde_json",
"settings",
"text",
"theme",
"ui",
"unindent",
@@ -4873,7 +4877,6 @@ version = "0.1.0"
dependencies = [
"agent",
"anyhow",
"assistant_settings",
"assistant_tool",
"assistant_tools",
"async-watch",

View File

@@ -644,6 +644,7 @@
// We don't know which of the context server tools are safe for the "Ask" profile, so we don't enable them by default.
// "enable_all_context_servers": true,
"tools": {
"contents": true,
"diagnostics": true,
"fetch": true,
"list_directory": false,
@@ -662,6 +663,7 @@
"batch_tool": true,
"code_actions": true,
"code_symbols": true,
"contents": true,
"copy_path": false,
"create_file": true,
"delete_path": false,

View File

@@ -23,7 +23,7 @@ use gpui::{
use language::{Buffer, LanguageRegistry};
use language_model::{LanguageModelRegistry, LanguageModelToolUseId, Role, StopReason};
use markdown::parser::{CodeBlockKind, CodeBlockMetadata};
use markdown::{Markdown, MarkdownElement, MarkdownStyle, ParsedMarkdown};
use markdown::{HeadingLevelStyles, Markdown, MarkdownElement, MarkdownStyle, ParsedMarkdown};
use project::ProjectItem as _;
use rope::Point;
use settings::{Settings as _, update_settings_file};
@@ -175,11 +175,37 @@ fn default_markdown_style(window: &Window, cx: &App) -> MarkdownStyle {
});
MarkdownStyle {
base_text_style: text_style,
base_text_style: text_style.clone(),
syntax: cx.theme().syntax().clone(),
selection_background_color: cx.theme().players().local().selection,
code_block_overflow_x_scroll: true,
table_overflow_x_scroll: true,
heading_level_styles: Some(HeadingLevelStyles {
h1: Some(TextStyleRefinement {
font_size: Some(rems(1.15).into()),
..Default::default()
}),
h2: Some(TextStyleRefinement {
font_size: Some(rems(1.1).into()),
..Default::default()
}),
h3: Some(TextStyleRefinement {
font_size: Some(rems(1.05).into()),
..Default::default()
}),
h4: Some(TextStyleRefinement {
font_size: Some(rems(1.).into()),
..Default::default()
}),
h5: Some(TextStyleRefinement {
font_size: Some(rems(0.95).into()),
..Default::default()
}),
h6: Some(TextStyleRefinement {
font_size: Some(rems(0.875).into()),
..Default::default()
}),
}),
code_block: StyleRefinement {
padding: EdgesRefinement {
top: Some(DefiniteLength::Absolute(AbsoluteLength::Pixels(Pixels(8.)))),

View File

@@ -894,6 +894,7 @@ mod tests {
use super::*;
use crate::{ThreadStore, thread_store};
use assistant_settings::AssistantSettings;
use assistant_tool::ToolWorkingSet;
use context_server::ContextServerSettings;
use editor::EditorSettings;
use gpui::TestAppContext;
@@ -937,7 +938,7 @@ mod tests {
.update(|cx| {
ThreadStore::load(
project.clone(),
Arc::default(),
cx.new(|_| ToolWorkingSet::default()),
Arc::new(PromptBuilder::new(None).unwrap()),
cx,
)

View File

@@ -18,6 +18,7 @@ mod terminal_inline_assistant;
mod thread;
mod thread_history;
mod thread_store;
mod tool_compatibility;
mod tool_use;
mod ui;

View File

@@ -29,7 +29,7 @@ pub struct AssistantConfiguration {
configuration_views_by_provider: HashMap<LanguageModelProviderId, AnyView>,
context_server_manager: Entity<ContextServerManager>,
expanded_context_server_tools: HashMap<Arc<str>, bool>,
tools: Arc<ToolWorkingSet>,
tools: Entity<ToolWorkingSet>,
_registry_subscription: Subscription,
}
@@ -37,7 +37,7 @@ impl AssistantConfiguration {
pub fn new(
fs: Arc<dyn Fs>,
context_server_manager: Entity<ContextServerManager>,
tools: Arc<ToolWorkingSet>,
tools: Entity<ToolWorkingSet>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
@@ -226,7 +226,7 @@ impl AssistantConfiguration {
fn render_context_servers_section(&mut self, cx: &mut Context<Self>) -> impl IntoElement {
let context_servers = self.context_server_manager.read(cx).all_servers().clone();
let tools_by_source = self.tools.tools_by_source(cx);
let tools_by_source = self.tools.read(cx).tools_by_source(cx);
let empty = Vec::new();
const SUBHEADING: &str = "Connect to context servers via the Model Context Protocol either via Zed extensions or directly.";

View File

@@ -84,7 +84,7 @@ pub struct NewProfileMode {
pub struct ManageProfilesModal {
fs: Arc<dyn Fs>,
tools: Arc<ToolWorkingSet>,
tools: Entity<ToolWorkingSet>,
thread_store: WeakEntity<ThreadStore>,
focus_handle: FocusHandle,
mode: Mode,
@@ -117,7 +117,7 @@ impl ManageProfilesModal {
pub fn new(
fs: Arc<dyn Fs>,
tools: Arc<ToolWorkingSet>,
tools: Entity<ToolWorkingSet>,
thread_store: WeakEntity<ThreadStore>,
window: &mut Window,
cx: &mut Context<Self>,

View File

@@ -60,7 +60,7 @@ pub struct ToolPickerDelegate {
impl ToolPickerDelegate {
pub fn new(
fs: Arc<dyn Fs>,
tool_set: Arc<ToolWorkingSet>,
tool_set: Entity<ToolWorkingSet>,
thread_store: WeakEntity<ThreadStore>,
profile_id: AgentProfileId,
profile: AgentProfile,
@@ -68,7 +68,7 @@ impl ToolPickerDelegate {
) -> Self {
let mut tool_entries = Vec::new();
for (source, tools) in tool_set.tools_by_source(cx) {
for (source, tools) in tool_set.read(cx).tools_by_source(cx) {
tool_entries.extend(tools.into_iter().map(|tool| ToolEntry {
name: tool.name().into(),
source: source.clone(),
@@ -192,7 +192,7 @@ impl PickerDelegate for ToolPickerDelegate {
if active_profile_id == &self.profile_id {
self.thread_store
.update(cx, |this, cx| {
this.load_profile(&self.profile, cx);
this.load_profile(self.profile.clone(), cx);
})
.log_err();
}

View File

@@ -203,7 +203,7 @@ impl AssistantPanel {
cx: AsyncWindowContext,
) -> Task<Result<Entity<Self>>> {
cx.spawn(async move |cx| {
let tools = Arc::new(ToolWorkingSet::default());
let tools = cx.new(|_| ToolWorkingSet::default())?;
let thread_store = workspace
.update(cx, |workspace, cx| {
let project = workspace.project().clone();

View File

@@ -2,6 +2,7 @@ use std::collections::BTreeMap;
use std::sync::Arc;
use crate::assistant_model_selector::ModelType;
use crate::tool_compatibility::{IncompatibleToolsState, IncompatibleToolsTooltip};
use buffer_diff::BufferDiff;
use collections::HashSet;
use editor::actions::MoveUp;
@@ -41,6 +42,7 @@ use crate::{
pub struct MessageEditor {
thread: Entity<Thread>,
incompatible_tools_state: Entity<IncompatibleToolsState>,
editor: Entity<Editor>,
#[allow(dead_code)]
workspace: WeakEntity<Workspace>,
@@ -124,6 +126,9 @@ impl MessageEditor {
)
});
let incompatible_tools =
cx.new(|cx| IncompatibleToolsState::new(thread.read(cx).tools().clone(), cx));
let subscriptions =
vec![cx.subscribe_in(&context_strip, window, Self::handle_context_strip_event)];
@@ -131,6 +136,7 @@ impl MessageEditor {
editor: editor.clone(),
project: thread.read(cx).project().clone(),
thread,
incompatible_tools_state: incompatible_tools.clone(),
workspace,
context_store,
context_strip,
@@ -208,6 +214,7 @@ impl MessageEditor {
}
if self.thread.read(cx).is_generating() {
self.stop_current_and_send_new_message(window, cx);
return;
}
@@ -297,6 +304,17 @@ impl MessageEditor {
.detach();
}
fn stop_current_and_send_new_message(&mut self, window: &mut Window, cx: &mut Context<Self>) {
let cancelled = self
.thread
.update(cx, |thread, cx| thread.cancel_last_completion(cx));
if cancelled {
self.set_editor_is_expanded(false, cx);
self.send_to_model(RequestKind::Chat, window, cx);
}
}
fn handle_context_strip_event(
&mut self,
_context_strip: &Entity<ContextStrip>,
@@ -356,6 +374,23 @@ impl MessageEditor {
let is_model_selected = self.is_model_selected(cx);
let is_editor_empty = self.is_editor_empty(cx);
let model = LanguageModelRegistry::read_global(cx)
.default_model()
.map(|default| default.model.clone());
let incompatible_tools = model
.as_ref()
.map(|model| {
self.incompatible_tools_state.update(cx, |state, cx| {
state
.incompatible_tools(model, cx)
.iter()
.cloned()
.collect::<Vec<_>>()
})
})
.unwrap_or_default();
let is_editor_expanded = self.editor_is_expanded;
let expand_icon = if is_editor_expanded {
IconName::Minimize
@@ -460,84 +495,150 @@ impl MessageEditor {
.flex_none()
.justify_between()
.child(h_flex().gap_2().child(self.profile_selector.clone()))
.child(h_flex().gap_1().child(self.model_selector.clone()).map({
let focus_handle = focus_handle.clone();
move |parent| {
if is_generating {
parent.child(
.child(
h_flex()
.gap_1()
.when(!incompatible_tools.is_empty(), |this| {
this.child(
IconButton::new(
"stop-generation",
IconName::StopFilled,
"tools-incompatible-warning",
IconName::Warning,
)
.icon_color(Color::Error)
.style(ButtonStyle::Tinted(ui::TintColor::Error))
.tooltip(move |window, cx| {
Tooltip::for_action(
"Stop Generation",
&editor::actions::Cancel,
window,
cx,
)
})
.on_click({
let focus_handle = focus_handle.clone();
move |_event, window, cx| {
focus_handle.dispatch_action(
&editor::actions::Cancel,
window,
cx,
);
.icon_color(Color::Warning)
.icon_size(IconSize::Small)
.tooltip({
move |_, cx| {
cx.new(|_| IncompatibleToolsTooltip {
incompatible_tools: incompatible_tools
.clone(),
})
.into()
}
})
.with_animation(
"pulsating-label",
Animation::new(Duration::from_secs(2))
.repeat()
.with_easing(pulsating_between(0.4, 1.0)),
|icon_button, delta| icon_button.alpha(delta),
),
}),
)
} else {
parent.child(
IconButton::new("send-message", IconName::Send)
.icon_color(Color::Accent)
.style(ButtonStyle::Filled)
.disabled(
is_editor_empty
|| !is_model_selected
|| self.waiting_for_summaries_to_send,
)
.on_click({
let focus_handle = focus_handle.clone();
move |_event, window, cx| {
focus_handle
.dispatch_action(&Chat, window, cx);
}
})
.when(
!is_editor_empty && is_model_selected,
|button| {
button.tooltip(move |window, cx| {
Tooltip::for_action(
"Send", &Chat, window, cx,
})
.child(self.model_selector.clone())
.map({
let focus_handle = focus_handle.clone();
move |parent| {
if is_generating {
parent
.when(is_editor_empty, |parent| {
parent.child(
IconButton::new(
"stop-generation",
IconName::StopFilled,
)
.icon_color(Color::Error)
.style(ButtonStyle::Tinted(
ui::TintColor::Error,
))
.tooltip(move |window, cx| {
Tooltip::for_action(
"Stop Generation",
&editor::actions::Cancel,
window,
cx,
)
})
.on_click({
let focus_handle =
focus_handle.clone();
move |_event, window, cx| {
focus_handle.dispatch_action(
&editor::actions::Cancel,
window,
cx,
);
}
})
.with_animation(
"pulsating-label",
Animation::new(
Duration::from_secs(2),
)
.repeat()
.with_easing(pulsating_between(
0.4, 1.0,
)),
|icon_button, delta| {
icon_button.alpha(delta)
},
),
)
})
.when(!is_editor_empty, |parent| {
parent.child(
IconButton::new("send-message", IconName::Send)
.icon_color(Color::Accent)
.style(ButtonStyle::Filled)
.disabled(
!is_model_selected
|| self
.waiting_for_summaries_to_send,
)
.on_click({
let focus_handle = focus_handle.clone();
move |_event, window, cx| {
focus_handle.dispatch_action(
&Chat, window, cx,
);
}
})
},
.tooltip(move |window, cx| {
Tooltip::for_action(
"Stop and Send New Message",
&Chat,
window,
cx,
)
}),
)
.when(is_editor_empty, |button| {
button.tooltip(Tooltip::text(
"Type a message to submit",
))
})
.when(!is_model_selected, |button| {
button.tooltip(Tooltip::text(
"Select a model to continue",
))
}),
)
}
}
})),
})
} else {
parent.child(
IconButton::new("send-message", IconName::Send)
.icon_color(Color::Accent)
.style(ButtonStyle::Filled)
.disabled(
is_editor_empty
|| !is_model_selected
|| self
.waiting_for_summaries_to_send,
)
.on_click({
let focus_handle = focus_handle.clone();
move |_event, window, cx| {
focus_handle.dispatch_action(
&Chat, window, cx,
);
}
})
.when(
!is_editor_empty && is_model_selected,
|button| {
button.tooltip(move |window, cx| {
Tooltip::for_action(
"Send", &Chat, window, cx,
)
})
},
)
.when(is_editor_empty, |button| {
button.tooltip(Tooltip::text(
"Type a message to submit",
))
})
.when(!is_model_selected, |button| {
button.tooltip(Tooltip::text(
"Select a model to continue",
))
}),
)
}
}
}),
),
),
)
}

View File

@@ -86,7 +86,7 @@ impl ProfileSelector {
thread_store
.update(cx, |this, cx| {
this.load_profile_by_id(&profile_id, cx);
this.load_profile_by_id(profile_id.clone(), cx);
})
.log_err();
}

View File

@@ -254,7 +254,7 @@ pub struct Thread {
pending_completions: Vec<PendingCompletion>,
project: Entity<Project>,
prompt_builder: Arc<PromptBuilder>,
tools: Arc<ToolWorkingSet>,
tools: Entity<ToolWorkingSet>,
tool_use: ToolUseState,
action_log: Entity<ActionLog>,
last_restore_checkpoint: Option<LastRestoreCheckpoint>,
@@ -278,7 +278,7 @@ pub struct ExceededWindowError {
impl Thread {
pub fn new(
project: Entity<Project>,
tools: Arc<ToolWorkingSet>,
tools: Entity<ToolWorkingSet>,
prompt_builder: Arc<PromptBuilder>,
system_prompt: SharedProjectContext,
cx: &mut Context<Self>,
@@ -322,7 +322,7 @@ impl Thread {
id: ThreadId,
serialized: SerializedThread,
project: Entity<Project>,
tools: Arc<ToolWorkingSet>,
tools: Entity<ToolWorkingSet>,
prompt_builder: Arc<PromptBuilder>,
project_context: SharedProjectContext,
cx: &mut Context<Self>,
@@ -458,7 +458,7 @@ impl Thread {
!self.pending_completions.is_empty() || !self.all_tools_finished()
}
pub fn tools(&self) -> &Arc<ToolWorkingSet> {
pub fn tools(&self) -> &Entity<ToolWorkingSet> {
&self.tools
}
@@ -846,6 +846,7 @@ impl Thread {
let mut tools = Vec::new();
tools.extend(
self.tools()
.read(cx)
.enabled_tools(cx)
.into_iter()
.filter_map(|tool| {
@@ -1354,7 +1355,7 @@ impl Thread {
.collect::<Vec<_>>();
for tool_use in pending_tool_uses.iter() {
if let Some(tool) = self.tools.tool(&tool_use.name, cx) {
if let Some(tool) = self.tools.read(cx).tool(&tool_use.name, cx) {
if tool.needs_confirmation(&tool_use.input, cx)
&& !AssistantSettings::get_global(cx).always_allow_tool_actions
{
@@ -1406,8 +1407,8 @@ impl Thread {
) -> Task<()> {
let tool_name: Arc<str> = tool.name().into();
let run_tool = if self.tools.is_disabled(&tool.source(), &tool_name) {
Task::ready(Err(anyhow!("tool is disabled: {tool_name}")))
let tool_result = if self.tools.read(cx).is_disabled(&tool.source(), &tool_name) {
Task::ready(Err(anyhow!("tool is disabled: {tool_name}"))).into()
} else {
tool.run(
input,
@@ -1420,7 +1421,7 @@ impl Thread {
cx.spawn({
async move |thread: WeakEntity<Thread>, cx| {
let output = run_tool.await;
let output = tool_result.output.await;
thread
.update(cx, |thread, cx| {
@@ -1521,6 +1522,7 @@ impl Thread {
let enabled_tool_names: Vec<String> = self
.tools()
.read(cx)
.enabled_tools(cx)
.iter()
.map(|tool| tool.name().to_string())
@@ -2341,7 +2343,7 @@ fn main() {{
.update(|_, cx| {
ThreadStore::load(
project.clone(),
Arc::default(),
cx.new(|_| ToolWorkingSet::default()),
Arc::new(PromptBuilder::new(None).unwrap()),
cx,
)

View File

@@ -56,7 +56,7 @@ impl SharedProjectContext {
pub struct ThreadStore {
project: Entity<Project>,
tools: Arc<ToolWorkingSet>,
tools: Entity<ToolWorkingSet>,
prompt_builder: Arc<PromptBuilder>,
context_server_manager: Entity<ContextServerManager>,
context_server_tool_ids: HashMap<Arc<str>, Vec<ToolId>>,
@@ -74,7 +74,7 @@ impl EventEmitter<RulesLoadingError> for ThreadStore {}
impl ThreadStore {
pub fn load(
project: Entity<Project>,
tools: Arc<ToolWorkingSet>,
tools: Entity<ToolWorkingSet>,
prompt_builder: Arc<PromptBuilder>,
cx: &mut App,
) -> Task<Entity<Self>> {
@@ -88,7 +88,7 @@ impl ThreadStore {
fn new(
project: Entity<Project>,
tools: Arc<ToolWorkingSet>,
tools: Entity<ToolWorkingSet>,
prompt_builder: Arc<PromptBuilder>,
cx: &mut Context<Self>,
) -> Self {
@@ -248,7 +248,7 @@ impl ThreadStore {
self.context_server_manager.clone()
}
pub fn tools(&self) -> Arc<ToolWorkingSet> {
pub fn tools(&self) -> Entity<ToolWorkingSet> {
self.tools.clone()
}
@@ -355,52 +355,60 @@ impl ThreadStore {
})
}
fn load_default_profile(&self, cx: &Context<Self>) {
fn load_default_profile(&self, cx: &mut Context<Self>) {
let assistant_settings = AssistantSettings::get_global(cx);
self.load_profile_by_id(&assistant_settings.default_profile, cx);
self.load_profile_by_id(assistant_settings.default_profile.clone(), cx);
}
pub fn load_profile_by_id(&self, profile_id: &AgentProfileId, cx: &Context<Self>) {
pub fn load_profile_by_id(&self, profile_id: AgentProfileId, cx: &mut Context<Self>) {
let assistant_settings = AssistantSettings::get_global(cx);
if let Some(profile) = assistant_settings.profiles.get(profile_id) {
self.load_profile(profile, cx);
if let Some(profile) = assistant_settings.profiles.get(&profile_id) {
self.load_profile(profile.clone(), cx);
}
}
pub fn load_profile(&self, profile: &AgentProfile, cx: &Context<Self>) {
self.tools.disable_all_tools();
self.tools.enable(
ToolSource::Native,
&profile
.tools
.iter()
.filter_map(|(tool, enabled)| enabled.then(|| tool.clone()))
.collect::<Vec<_>>(),
);
pub fn load_profile(&self, profile: AgentProfile, cx: &mut Context<Self>) {
self.tools.update(cx, |tools, cx| {
tools.disable_all_tools(cx);
tools.enable(
ToolSource::Native,
&profile
.tools
.iter()
.filter_map(|(tool, enabled)| enabled.then(|| tool.clone()))
.collect::<Vec<_>>(),
cx,
);
});
if profile.enable_all_context_servers {
for context_server in self.context_server_manager.read(cx).all_servers() {
self.tools.enable_source(
ToolSource::ContextServer {
id: context_server.id().into(),
},
cx,
);
self.tools.update(cx, |tools, cx| {
tools.enable_source(
ToolSource::ContextServer {
id: context_server.id().into(),
},
cx,
);
});
}
} else {
for (context_server_id, preset) in &profile.context_servers {
self.tools.enable(
ToolSource::ContextServer {
id: context_server_id.clone().into(),
},
&preset
.tools
.iter()
.filter_map(|(tool, enabled)| enabled.then(|| tool.clone()))
.collect::<Vec<_>>(),
)
self.tools.update(cx, |tools, cx| {
tools.enable(
ToolSource::ContextServer {
id: context_server_id.clone().into(),
},
&preset
.tools
.iter()
.filter_map(|(tool, enabled)| enabled.then(|| tool.clone()))
.collect::<Vec<_>>(),
cx,
)
})
}
}
}
@@ -434,29 +442,36 @@ impl ThreadStore {
if protocol.capable(context_server::protocol::ServerCapability::Tools) {
if let Some(tools) = protocol.list_tools().await.log_err() {
let tool_ids = tools
.tools
.into_iter()
.map(|tool| {
log::info!(
"registering context server tool: {:?}",
tool.name
);
tool_working_set.insert(Arc::new(
ContextServerTool::new(
context_server_manager.clone(),
server.id(),
tool,
),
))
let tool_ids = tool_working_set
.update(cx, |tool_working_set, _| {
tools
.tools
.into_iter()
.map(|tool| {
log::info!(
"registering context server tool: {:?}",
tool.name
);
tool_working_set.insert(Arc::new(
ContextServerTool::new(
context_server_manager.clone(),
server.id(),
tool,
),
))
})
.collect::<Vec<_>>()
})
.collect::<Vec<_>>();
.log_err();
this.update(cx, |this, cx| {
this.context_server_tool_ids.insert(server_id, tool_ids);
this.load_default_profile(cx);
})
.log_err();
if let Some(tool_ids) = tool_ids {
this.update(cx, |this, cx| {
this.context_server_tool_ids
.insert(server_id, tool_ids);
this.load_default_profile(cx);
})
.log_err();
}
}
}
}
@@ -466,7 +481,9 @@ impl ThreadStore {
}
context_server::manager::Event::ServerStopped { server_id } => {
if let Some(tool_ids) = self.context_server_tool_ids.remove(server_id) {
tool_working_set.remove(&tool_ids);
tool_working_set.update(cx, |tool_working_set, _| {
tool_working_set.remove(&tool_ids);
});
self.load_default_profile(cx);
}
}

View File

@@ -0,0 +1,89 @@
use std::sync::Arc;
use assistant_tool::{Tool, ToolWorkingSet, ToolWorkingSetEvent};
use collections::HashMap;
use gpui::{App, Context, Entity, IntoElement, Render, Subscription, Window};
use language_model::{LanguageModel, LanguageModelToolSchemaFormat};
use ui::prelude::*;
pub struct IncompatibleToolsState {
cache: HashMap<LanguageModelToolSchemaFormat, Vec<Arc<dyn Tool>>>,
tool_working_set: Entity<ToolWorkingSet>,
_tool_working_set_subscription: Subscription,
}
impl IncompatibleToolsState {
pub fn new(tool_working_set: Entity<ToolWorkingSet>, cx: &mut Context<Self>) -> Self {
let _tool_working_set_subscription =
cx.subscribe(&tool_working_set, |this, _, event, _| match event {
ToolWorkingSetEvent::EnabledToolsChanged => {
this.cache.clear();
}
});
Self {
cache: HashMap::default(),
tool_working_set,
_tool_working_set_subscription,
}
}
pub fn incompatible_tools(
&mut self,
model: &Arc<dyn LanguageModel>,
cx: &App,
) -> &[Arc<dyn Tool>] {
self.cache
.entry(model.tool_input_format())
.or_insert_with(|| {
self.tool_working_set
.read(cx)
.enabled_tools(cx)
.iter()
.filter(|tool| tool.input_schema(model.tool_input_format()).is_err())
.cloned()
.collect()
})
}
}
pub struct IncompatibleToolsTooltip {
pub incompatible_tools: Vec<Arc<dyn Tool>>,
}
impl Render for IncompatibleToolsTooltip {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
ui::tooltip_container(window, cx, |container, _, cx| {
container
.w_72()
.child(Label::new("Incompatible Tools").size(LabelSize::Small))
.child(
Label::new(
"This model is incompatible with the following tools from your MCPs:",
)
.size(LabelSize::Small)
.color(Color::Muted),
)
.child(
v_flex()
.my_1p5()
.py_0p5()
.border_b_1()
.border_color(cx.theme().colors().border_variant)
.children(
self.incompatible_tools
.iter()
.map(|tool| Label::new(tool.name()).size(LabelSize::Small).buffer_font(cx)),
),
)
.child(Label::new("What To Do Instead").size(LabelSize::Small))
.child(
Label::new(
"Every other tool continues to work with this model, but to specifically use those, switch to another model.",
)
.size(LabelSize::Small)
.color(Color::Muted),
)
})
}
}
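The new `IncompatibleToolsState` above memoizes the incompatible-tool list per schema format and clears the memo whenever the working set emits `EnabledToolsChanged`. The pattern can be illustrated standalone; this is a hedged sketch in plain Rust (no gpui), where `SchemaFormat`, `Tool`, and the `supports_subset` flag are hypothetical stand-ins for `LanguageModelToolSchemaFormat`, `Arc<dyn Tool>`, and the real `input_schema(..).is_err()` check:

```rust
use std::collections::HashMap;

// Hypothetical stand-ins for LanguageModelToolSchemaFormat and Arc<dyn Tool>.
#[derive(PartialEq, Eq, Hash, Clone, Copy)]
enum SchemaFormat {
    JsonSchema,
    JsonSchemaSubset, // e.g. the stricter subset some providers accept
}

struct Tool {
    name: &'static str,
    supports_subset: bool,
}

struct IncompatibleToolsCache {
    cache: HashMap<SchemaFormat, Vec<&'static str>>,
}

impl IncompatibleToolsCache {
    fn new() -> Self {
        Self { cache: HashMap::new() }
    }

    // Computed lazily per format, then served from the cache — mirroring
    // `incompatible_tools` with its `entry(..).or_insert_with(..)` body.
    fn incompatible(&mut self, format: SchemaFormat, tools: &[Tool]) -> &[&'static str] {
        self.cache.entry(format).or_insert_with(|| {
            tools
                .iter()
                .filter(|t| format == SchemaFormat::JsonSchemaSubset && !t.supports_subset)
                .map(|t| t.name)
                .collect()
        })
    }

    // Mirrors clearing the cache on ToolWorkingSetEvent::EnabledToolsChanged.
    fn on_enabled_tools_changed(&mut self) {
        self.cache.clear();
    }
}

fn main() {
    let tools = [
        Tool { name: "fetch", supports_subset: true },
        Tool { name: "batch", supports_subset: false },
    ];
    let mut cache = IncompatibleToolsCache::new();
    assert_eq!(cache.incompatible(SchemaFormat::JsonSchemaSubset, &tools), ["batch"]);
    cache.on_enabled_tools_changed();
    assert!(cache.cache.is_empty());
}
```

The invalidation hook corresponds to the subscription set up in `IncompatibleToolsState::new`; the real state also holds the `Entity<ToolWorkingSet>` so it can recompute from `enabled_tools(cx)` on the next query.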

View File

@@ -5,7 +5,7 @@ use assistant_tool::{Tool, ToolWorkingSet};
use collections::HashMap;
use futures::FutureExt as _;
use futures::future::Shared;
use gpui::{App, SharedString, Task};
use gpui::{App, Entity, SharedString, Task};
use language_model::{
LanguageModelRegistry, LanguageModelRequestMessage, LanguageModelToolResult,
LanguageModelToolUse, LanguageModelToolUseId, MessageContent, Role,
@@ -49,7 +49,7 @@ impl ToolUseStatus {
}
pub struct ToolUseState {
tools: Arc<ToolWorkingSet>,
tools: Entity<ToolWorkingSet>,
tool_uses_by_assistant_message: HashMap<MessageId, Vec<LanguageModelToolUse>>,
tool_uses_by_user_message: HashMap<MessageId, Vec<LanguageModelToolUseId>>,
tool_results: HashMap<LanguageModelToolUseId, LanguageModelToolResult>,
@@ -59,7 +59,7 @@ pub struct ToolUseState {
pub const USING_TOOL_MARKER: &str = "<using_tool>";
impl ToolUseState {
pub fn new(tools: Arc<ToolWorkingSet>) -> Self {
pub fn new(tools: Entity<ToolWorkingSet>) -> Self {
Self {
tools,
tool_uses_by_assistant_message: HashMap::default(),
@@ -73,7 +73,7 @@ impl ToolUseState {
///
/// Accepts a function to filter the tools that should be used to populate the state.
pub fn from_serialized_messages(
tools: Arc<ToolWorkingSet>,
tools: Entity<ToolWorkingSet>,
messages: &[SerializedMessage],
mut filter_by_tool_name: impl FnMut(&str) -> bool,
) -> Self {
@@ -199,12 +199,12 @@ impl ToolUseState {
}
})();
let (icon, needs_confirmation) = if let Some(tool) = self.tools.tool(&tool_use.name, cx)
{
(tool.icon(), tool.needs_confirmation(&tool_use.input, cx))
} else {
(IconName::Cog, false)
};
let (icon, needs_confirmation) =
if let Some(tool) = self.tools.read(cx).tool(&tool_use.name, cx) {
(tool.icon(), tool.needs_confirmation(&tool_use.input, cx))
} else {
(IconName::Cog, false)
};
tool_uses.push(ToolUse {
id: tool_use.id.clone(),
@@ -226,7 +226,7 @@ impl ToolUseState {
input: &serde_json::Value,
cx: &App,
) -> SharedString {
if let Some(tool) = self.tools.tool(tool_name, cx) {
if let Some(tool) = self.tools.read(cx).tool(tool_name, cx) {
tool.ui_text(input).into()
} else {
format!("Unknown tool {tool_name:?}").into()

View File

@@ -24,6 +24,19 @@ pub fn init(cx: &mut App) {
ToolRegistry::default_global(cx);
}
/// The result of running a tool
pub struct ToolResult {
/// The asynchronous task that will eventually resolve to the tool's output
pub output: Task<Result<String>>,
}
impl From<Task<Result<String>>> for ToolResult {
/// Convert from a task to a ToolResult
fn from(output: Task<Result<String>>) -> Self {
Self { output }
}
}
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone)]
pub enum ToolSource {
/// A native tool built-in to Zed.
@@ -68,7 +81,7 @@ pub trait Tool: 'static + Send + Sync {
project: Entity<Project>,
action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>>;
) -> ToolResult;
}
impl Debug for dyn Tool {

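The `ToolResult` newtype and its `From<Task<Result<String>>>` impl are what make the many `.into()` calls in the per-tool diffs below compile: each tool's early error return stays a `Task::ready(Err(..))` and converts at the return site. A minimal sketch of the pattern, with gpui's `Task` replaced by a boxed thunk purely for illustration:

```rust
// Stand-in for gpui::Task<Result<String>>: a boxed deferred computation.
type Task = Box<dyn FnOnce() -> Result<String, String>>;

/// The result of running a tool (sketch of assistant_tool::ToolResult).
struct ToolResult {
    output: Task,
}

impl From<Task> for ToolResult {
    fn from(output: Task) -> Self {
        Self { output }
    }
}

// Because the trait now returns ToolResult, every return site — success or
// early error — becomes `<task>.into()`, as seen throughout the diffs.
fn run(input: Result<String, String>) -> ToolResult {
    let task: Task = match input {
        Ok(text) => Box::new(move || Ok(format!("ran with {text}"))),
        Err(err) => Box::new(move || Err(err)),
    };
    task.into() // From<Task> makes the conversion a one-liner
}

fn main() {
    let ok = run(Ok("x".into()));
    assert_eq!((ok.output)(), Ok("ran with x".to_string()));
    let err = run(Err("bad input".into()));
    assert!((err.output)().is_err());
}
```

Wrapping the task in a struct (rather than a type alias) leaves room to attach more fields to a tool run later without touching every `Tool` implementation again.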
View File

@@ -1,8 +1,7 @@
use std::sync::Arc;
use collections::{HashMap, HashSet, IndexMap};
use gpui::App;
use parking_lot::Mutex;
use gpui::{App, Context, EventEmitter};
use crate::{Tool, ToolRegistry, ToolSource};
@@ -12,11 +11,6 @@ pub struct ToolId(usize);
/// A working set of tools for use in one instance of the Assistant Panel.
#[derive(Default)]
pub struct ToolWorkingSet {
state: Mutex<WorkingSetState>,
}
#[derive(Default)]
struct WorkingSetState {
context_server_tools_by_id: HashMap<ToolId, Arc<dyn Tool>>,
context_server_tools_by_name: HashMap<String, Arc<dyn Tool>>,
enabled_sources: HashSet<ToolSource>,
@@ -24,99 +18,27 @@ struct WorkingSetState {
next_tool_id: ToolId,
}
pub enum ToolWorkingSetEvent {
EnabledToolsChanged,
}
impl EventEmitter<ToolWorkingSetEvent> for ToolWorkingSet {}
impl ToolWorkingSet {
pub fn tool(&self, name: &str, cx: &App) -> Option<Arc<dyn Tool>> {
self.state
.lock()
.context_server_tools_by_name
self.context_server_tools_by_name
.get(name)
.cloned()
.or_else(|| ToolRegistry::global(cx).tool(name))
}
pub fn tools(&self, cx: &App) -> Vec<Arc<dyn Tool>> {
self.state.lock().tools(cx)
}
pub fn tools_by_source(&self, cx: &App) -> IndexMap<ToolSource, Vec<Arc<dyn Tool>>> {
self.state.lock().tools_by_source(cx)
}
pub fn enabled_tools(&self, cx: &App) -> Vec<Arc<dyn Tool>> {
self.state.lock().enabled_tools(cx)
}
pub fn disable_all_tools(&self) {
let mut state = self.state.lock();
state.disable_all_tools();
}
pub fn enable_source(&self, source: ToolSource, cx: &App) {
let mut state = self.state.lock();
state.enable_source(source, cx);
}
pub fn disable_source(&self, source: &ToolSource) {
let mut state = self.state.lock();
state.disable_source(source);
}
pub fn insert(&self, tool: Arc<dyn Tool>) -> ToolId {
let mut state = self.state.lock();
let tool_id = state.next_tool_id;
state.next_tool_id.0 += 1;
state
.context_server_tools_by_id
.insert(tool_id, tool.clone());
state.tools_changed();
tool_id
}
pub fn is_enabled(&self, source: &ToolSource, name: &Arc<str>) -> bool {
self.state.lock().is_enabled(source, name)
}
pub fn is_disabled(&self, source: &ToolSource, name: &Arc<str>) -> bool {
self.state.lock().is_disabled(source, name)
}
pub fn enable(&self, source: ToolSource, tools_to_enable: &[Arc<str>]) {
let mut state = self.state.lock();
state.enable(source, tools_to_enable);
}
pub fn disable(&self, source: ToolSource, tools_to_disable: &[Arc<str>]) {
let mut state = self.state.lock();
state.disable(source, tools_to_disable);
}
pub fn remove(&self, tool_ids_to_remove: &[ToolId]) {
let mut state = self.state.lock();
state
.context_server_tools_by_id
.retain(|id, _| !tool_ids_to_remove.contains(id));
state.tools_changed();
}
}
impl WorkingSetState {
fn tools_changed(&mut self) {
self.context_server_tools_by_name.clear();
self.context_server_tools_by_name.extend(
self.context_server_tools_by_id
.values()
.map(|tool| (tool.name(), tool.clone())),
);
}
fn tools(&self, cx: &App) -> Vec<Arc<dyn Tool>> {
let mut tools = ToolRegistry::global(cx).tools();
tools.extend(self.context_server_tools_by_id.values().cloned());
tools
}
fn tools_by_source(&self, cx: &App) -> IndexMap<ToolSource, Vec<Arc<dyn Tool>>> {
pub fn tools_by_source(&self, cx: &App) -> IndexMap<ToolSource, Vec<Arc<dyn Tool>>> {
let mut tools_by_source = IndexMap::default();
for tool in self.tools(cx) {
@@ -135,7 +57,7 @@ impl WorkingSetState {
tools_by_source
}
fn enabled_tools(&self, cx: &App) -> Vec<Arc<dyn Tool>> {
pub fn enabled_tools(&self, cx: &App) -> Vec<Arc<dyn Tool>> {
let all_tools = self.tools(cx);
all_tools
@@ -144,31 +66,12 @@ impl WorkingSetState {
.collect()
}
fn is_enabled(&self, source: &ToolSource, name: &Arc<str>) -> bool {
self.enabled_tools_by_source
.get(source)
.map_or(false, |enabled_tools| enabled_tools.contains(name))
pub fn disable_all_tools(&mut self, cx: &mut Context<Self>) {
self.enabled_tools_by_source.clear();
cx.emit(ToolWorkingSetEvent::EnabledToolsChanged);
}
fn is_disabled(&self, source: &ToolSource, name: &Arc<str>) -> bool {
!self.is_enabled(source, name)
}
fn enable(&mut self, source: ToolSource, tools_to_enable: &[Arc<str>]) {
self.enabled_tools_by_source
.entry(source)
.or_default()
.extend(tools_to_enable.into_iter().cloned());
}
fn disable(&mut self, source: ToolSource, tools_to_disable: &[Arc<str>]) {
self.enabled_tools_by_source
.entry(source)
.or_default()
.retain(|name| !tools_to_disable.contains(name));
}
fn enable_source(&mut self, source: ToolSource, cx: &App) {
pub fn enable_source(&mut self, source: ToolSource, cx: &mut Context<Self>) {
self.enabled_sources.insert(source.clone());
let tools_by_source = self.tools_by_source(cx);
@@ -181,14 +84,72 @@ impl WorkingSetState {
.collect::<HashSet<_>>(),
);
}
cx.emit(ToolWorkingSetEvent::EnabledToolsChanged);
}
fn disable_source(&mut self, source: &ToolSource) {
pub fn disable_source(&mut self, source: &ToolSource, cx: &mut Context<Self>) {
self.enabled_sources.remove(source);
self.enabled_tools_by_source.remove(source);
cx.emit(ToolWorkingSetEvent::EnabledToolsChanged);
}
fn disable_all_tools(&mut self) {
self.enabled_tools_by_source.clear();
pub fn insert(&mut self, tool: Arc<dyn Tool>) -> ToolId {
let tool_id = self.next_tool_id;
self.next_tool_id.0 += 1;
self.context_server_tools_by_id
.insert(tool_id, tool.clone());
self.tools_changed();
tool_id
}
pub fn is_enabled(&self, source: &ToolSource, name: &Arc<str>) -> bool {
self.enabled_tools_by_source
.get(source)
.map_or(false, |enabled_tools| enabled_tools.contains(name))
}
pub fn is_disabled(&self, source: &ToolSource, name: &Arc<str>) -> bool {
!self.is_enabled(source, name)
}
pub fn enable(
&mut self,
source: ToolSource,
tools_to_enable: &[Arc<str>],
cx: &mut Context<Self>,
) {
self.enabled_tools_by_source
.entry(source)
.or_default()
.extend(tools_to_enable.into_iter().cloned());
cx.emit(ToolWorkingSetEvent::EnabledToolsChanged);
}
pub fn disable(
&mut self,
source: ToolSource,
tools_to_disable: &[Arc<str>],
cx: &mut Context<Self>,
) {
self.enabled_tools_by_source
.entry(source)
.or_default()
.retain(|name| !tools_to_disable.contains(name));
cx.emit(ToolWorkingSetEvent::EnabledToolsChanged);
}
pub fn remove(&mut self, tool_ids_to_remove: &[ToolId]) {
self.context_server_tools_by_id
.retain(|id, _| !tool_ids_to_remove.contains(id));
self.tools_changed();
}
fn tools_changed(&mut self) {
self.context_server_tools_by_name.clear();
self.context_server_tools_by_name.extend(
self.context_server_tools_by_id
.values()
.map(|tool| (tool.name(), tool.clone())),
);
}
}

View File

@@ -1,6 +1,7 @@
mod batch_tool;
mod code_action_tool;
mod code_symbols_tool;
mod contents_tool;
mod copy_path_tool;
mod create_directory_tool;
mod create_file_tool;
@@ -33,6 +34,7 @@ use move_path_tool::MovePathTool;
use crate::batch_tool::BatchTool;
use crate::code_action_tool::CodeActionTool;
use crate::code_symbols_tool::CodeSymbolsTool;
use crate::contents_tool::ContentsTool;
use crate::create_directory_tool::CreateDirectoryTool;
use crate::create_file_tool::CreateFileTool;
use crate::delete_path_tool::DeletePathTool;
@@ -69,6 +71,7 @@ pub fn init(http_client: Arc<HttpClientWithUrl>, cx: &mut App) {
registry.register_tool(NowTool);
registry.register_tool(OpenTool);
registry.register_tool(CodeSymbolsTool);
registry.register_tool(ContentsTool);
registry.register_tool(PathSearchTool);
registry.register_tool(ReadFileTool);
registry.register_tool(RegexSearchTool);

View File

@@ -1,6 +1,6 @@
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool, ToolWorkingSet};
use assistant_tool::{ActionLog, Tool, ToolResult, ToolWorkingSet};
use futures::future::join_all;
use gpui::{App, AppContext, Entity, Task};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
@@ -219,14 +219,14 @@ impl Tool for BatchTool {
project: Entity<Project>,
action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<BatchToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
if input.invocations.is_empty() {
return Task::ready(Err(anyhow!("No tool invocations provided")));
return Task::ready(Err(anyhow!("No tool invocations provided"))).into();
}
let run_tools_concurrently = input.run_tools_concurrently;
@@ -257,11 +257,11 @@ impl Tool for BatchTool {
let project = project.clone();
let action_log = action_log.clone();
let messages = messages.clone();
let task = cx
let tool_result = cx
.update(|cx| tool.run(invocation.input, &messages, project, action_log, cx))
.map_err(|err| anyhow!("Failed to start tool '{}': {}", tool_name, err))?;
tasks.push(task);
tasks.push(tool_result.output);
}
Ok((tasks, tool_names))
@@ -306,5 +306,6 @@ impl Tool for BatchTool {
Ok(formatted_results.trim().to_string())
})
.into()
}
}

View File

@@ -1,5 +1,5 @@
use anyhow::{Context as _, Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, Entity, Task};
use language::{self, Anchor, Buffer, ToPointUtf16};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
@@ -141,10 +141,10 @@ impl Tool for CodeActionTool {
project: Entity<Project>,
action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<CodeActionToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
cx.spawn(async move |cx| {
@@ -319,7 +319,7 @@ impl Tool for CodeActionTool {
Ok(response)
}
})
}).into()
}
}

View File

@@ -4,7 +4,7 @@ use std::sync::Arc;
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use collections::IndexMap;
use gpui::{App, AsyncApp, Entity, Task};
use language::{OutlineItem, ParseStatus, Point};
@@ -129,10 +129,10 @@ impl Tool for CodeSymbolsTool {
project: Entity<Project>,
action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<CodeSymbolsInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
let regex = match input.regex {
@@ -141,7 +141,7 @@ impl Tool for CodeSymbolsTool {
.build()
{
Ok(regex) => Some(regex),
Err(err) => return Task::ready(Err(anyhow!("Invalid regex: {err}"))),
Err(err) => return Task::ready(Err(anyhow!("Invalid regex: {err}"))).into(),
},
None => None,
};
@@ -150,6 +150,7 @@ impl Tool for CodeSymbolsTool {
Some(path) => file_outline(project, path, action_log, regex, input.offset, cx).await,
None => project_symbols(project, regex, input.offset, cx).await,
})
.into()
}
}

View File

@@ -0,0 +1,239 @@
use std::sync::Arc;
use crate::{code_symbols_tool::file_outline, schema::json_schema_for};
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, Entity, Task};
use itertools::Itertools;
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
use project::Project;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use std::{fmt::Write, path::Path};
use ui::IconName;
use util::markdown::MarkdownString;
/// If the model requests to read a file whose size exceeds this, then
/// the tool will return the file's symbol outline instead of its contents,
/// and suggest trying again using line ranges from the outline.
const MAX_FILE_SIZE_TO_READ: usize = 16384;
/// If the model requests to list the entries in a directory with more
/// entries than this, then the tool will return a subset of the entries
/// and suggest trying again.
const MAX_DIR_ENTRIES: usize = 1024;
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct ContentsToolInput {
/// The relative path of the file or directory to access.
///
/// This path should never be absolute, and the first component
/// of the path should always be a root directory in a project.
///
/// <example>
/// If the project has the following root directories:
///
/// - directory1
/// - directory2
///
/// If you want to access `file.txt` in `directory1`, you should use the path `directory1/file.txt`.
/// If you want to list contents in the directory `directory2/subfolder`, you should use the path `directory2/subfolder`.
/// </example>
pub path: String,
/// Optional position (1-based index) to start reading on, if you want to read a subset of the contents.
/// When reading a file, this refers to a line number in the file (e.g. 1 is the first line).
/// When reading a directory, this refers to the number of the directory entry (e.g. 1 is the first entry).
///
/// Defaults to 1.
pub start: Option<u32>,
/// Optional position (1-based index) to end reading on, if you want to read a subset of the contents.
/// When reading a file, this refers to a line number in the file (e.g. 1 is the first line).
/// When reading a directory, this refers to the number of the directory entry (e.g. 1 is the first entry).
///
/// Defaults to reading until the end of the file or directory.
pub end: Option<u32>,
}
pub struct ContentsTool;
impl Tool for ContentsTool {
fn name(&self) -> String {
"contents".into()
}
fn needs_confirmation(&self, _: &serde_json::Value, _: &App) -> bool {
false
}
fn description(&self) -> String {
include_str!("./contents_tool/description.md").into()
}
fn icon(&self) -> IconName {
IconName::FileSearch
}
fn input_schema(&self, format: LanguageModelToolSchemaFormat) -> Result<serde_json::Value> {
json_schema_for::<ContentsToolInput>(format)
}
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<ContentsToolInput>(input.clone()) {
Ok(input) => {
let path = MarkdownString::inline_code(&input.path);
match (input.start, input.end) {
(Some(start), None) => format!("Read {path} (from line {start})"),
(Some(start), Some(end)) => {
format!("Read {path} (lines {start}-{end})")
}
_ => format!("Read {path}"),
}
}
Err(_) => "Read file or directory".to_string(),
}
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
_messages: &[LanguageModelRequestMessage],
project: Entity<Project>,
action_log: Entity<ActionLog>,
cx: &mut App,
) -> ToolResult {
let input = match serde_json::from_value::<ContentsToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
// Sometimes models will return these even though we tell it to give a path and not a glob.
// When this happens, just list the root worktree directories.
if matches!(input.path.as_str(), "." | "" | "./" | "*") {
let output = project
.read(cx)
.worktrees(cx)
.filter_map(|worktree| {
worktree.read(cx).root_entry().and_then(|entry| {
if entry.is_dir() {
entry.path.to_str()
} else {
None
}
})
})
.collect::<Vec<_>>()
.join("\n");
return Task::ready(Ok(output)).into();
}
let Some(project_path) = project.read(cx).find_project_path(&input.path, cx) else {
return Task::ready(Err(anyhow!("Path {} not found in project", &input.path))).into();
};
let Some(worktree) = project
.read(cx)
.worktree_for_id(project_path.worktree_id, cx)
else {
return Task::ready(Err(anyhow!("Worktree not found"))).into();
};
let worktree = worktree.read(cx);
let Some(entry) = worktree.entry_for_path(&project_path.path) else {
return Task::ready(Err(anyhow!("Path not found: {}", input.path))).into();
};
// If it's a directory, list its contents
if entry.is_dir() {
let mut output = String::new();
let start_index = input
.start
.map(|line| (line as usize).saturating_sub(1))
.unwrap_or(0);
let end_index = input
.end
.map(|line| (line as usize).saturating_sub(1))
.unwrap_or(MAX_DIR_ENTRIES);
let mut skipped = 0;
for (index, entry) in worktree.child_entries(&project_path.path).enumerate() {
if index >= start_index && index <= end_index {
writeln!(
output,
"{}",
Path::new(worktree.root_name()).join(&entry.path).display(),
)
.unwrap();
} else {
skipped += 1;
}
}
if output.is_empty() {
output.push_str(&input.path);
output.push_str(" is empty.");
}
if skipped > 0 {
write!(
output,
"\n\nNote: Skipped {skipped} entries. Adjust start and end to see other entries.",
).ok();
}
Task::ready(Ok(output)).into()
} else {
// It's a file, so read its contents
let file_path = input.path.clone();
cx.spawn(async move |cx| {
let buffer = cx
.update(|cx| {
project.update(cx, |project, cx| project.open_buffer(project_path, cx))
})?
.await?;
if input.start.is_some() || input.end.is_some() {
let result = buffer.read_with(cx, |buffer, _cx| {
let text = buffer.text();
let start = input.start.unwrap_or(1);
let lines = text.split('\n').skip(start as usize - 1);
if let Some(end) = input.end {
let count = end.saturating_sub(start).max(1); // Ensure at least 1 line
Itertools::intersperse(lines.take(count as usize), "\n").collect()
} else {
Itertools::intersperse(lines, "\n").collect()
}
})?;
action_log.update(cx, |log, cx| {
log.buffer_read(buffer, cx);
})?;
Ok(result)
} else {
// No line ranges specified, so check file size to see if it's too big.
let file_size = buffer.read_with(cx, |buffer, _cx| buffer.text().len())?;
if file_size <= MAX_FILE_SIZE_TO_READ {
let result = buffer.read_with(cx, |buffer, _cx| buffer.text())?;
action_log.update(cx, |log, cx| {
log.buffer_read(buffer, cx);
})?;
Ok(result)
} else {
// File is too big, so return its outline and a suggestion to
// read again with a line number range specified.
let outline = file_outline(project, file_path, action_log, None, 0, cx).await?;
Ok(format!("This file was too big to read all at once. Here is an outline of its symbols:\n\n{outline}\n\nUsing the line numbers in this outline, you can call this tool again while specifying the start and end fields to see the implementations of symbols in the outline."))
}
}
}).into()
}
}
}

View File

@@ -0,0 +1,9 @@
Reads the contents of a path on the filesystem.
If the path is a directory, this lists all files and directories within that path.
If the path is a file, this returns the file's contents.
When reading a file, if the file is too big and no line range is specified, an outline of the file's code symbols is listed instead, which can be used to request specific line ranges in a subsequent call.
Similarly, if a directory has too many entries to show at once, a subset of entries is shown, and subsequent requests can use starting and ending entry numbers to get other subsets.
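The `start`/`end` pagination this description refers to is 1-based and inclusive, applied uniformly to file lines and directory entries. A hedged sketch of the windowing logic (modeled on the `start_index`/`end_index`/`skipped` handling in the directory branch of `contents_tool.rs`, not the actual implementation):

```rust
/// Select the 1-based inclusive window [start, end] from a list of entries,
/// reporting how many entries fell outside it (the tool surfaces this count
/// so the model knows to page through the rest).
fn window<'a>(
    entries: &[&'a str],
    start: Option<u32>,
    end: Option<u32>,
    max_entries: usize, // stand-in for MAX_DIR_ENTRIES
) -> (Vec<&'a str>, usize) {
    // 1-based inputs become 0-based indices; missing bounds default to
    // "from the beginning" and "up to the cap".
    let start_index = start.map(|n| (n as usize).saturating_sub(1)).unwrap_or(0);
    let end_index = end.map(|n| (n as usize).saturating_sub(1)).unwrap_or(max_entries);
    let mut shown = Vec::new();
    let mut skipped = 0;
    for (index, entry) in entries.iter().enumerate() {
        if index >= start_index && index <= end_index {
            shown.push(*entry);
        } else {
            skipped += 1;
        }
    }
    (shown, skipped)
}

fn main() {
    let entries = ["a", "b", "c", "d"];
    // Entries 2..=3 (1-based): "b" and "c" are shown, two are skipped.
    let (shown, skipped) = window(&entries, Some(2), Some(3), 1024);
    assert_eq!(shown, ["b", "c"]);
    assert_eq!(skipped, 2);
    // No bounds: everything up to the cap is shown.
    let (all, none) = window(&entries, None, None, 1024);
    assert_eq!(all.len(), 4);
    assert_eq!(none, 0);
}
```

Reporting `skipped` instead of silently truncating is what lets the follow-up note ("Skipped {skipped} entries. Adjust start and end…") guide the model's next call.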

View File

@@ -1,6 +1,6 @@
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, AppContext, Entity, Task};
use language_model::LanguageModelRequestMessage;
use language_model::LanguageModelToolSchemaFormat;
@@ -77,10 +77,10 @@ impl Tool for CopyPathTool {
project: Entity<Project>,
_action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<CopyPathToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
let copy_task = project.update(cx, |project, cx| {
match project
@@ -117,5 +117,6 @@ impl Tool for CopyPathTool {
)),
}
})
.into()
}
}

View File

@@ -1,6 +1,6 @@
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, Entity, Task};
use language_model::LanguageModelRequestMessage;
use language_model::LanguageModelToolSchemaFormat;
@@ -68,14 +68,16 @@ impl Tool for CreateDirectoryTool {
project: Entity<Project>,
_action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<CreateDirectoryToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
let project_path = match project.read(cx).find_project_path(&input.path, cx) {
Some(project_path) => project_path,
None => return Task::ready(Err(anyhow!("Path to create was outside the project"))),
None => {
return Task::ready(Err(anyhow!("Path to create was outside the project"))).into();
}
};
let destination_path: Arc<str> = input.path.as_str().into();
@@ -89,5 +91,6 @@ impl Tool for CreateDirectoryTool {
Ok(format!("Created directory {destination_path}"))
})
.into()
}
}

View File

@@ -1,6 +1,6 @@
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, Entity, Task};
use language_model::LanguageModelRequestMessage;
use language_model::LanguageModelToolSchemaFormat;
@@ -73,14 +73,16 @@ impl Tool for CreateFileTool {
project: Entity<Project>,
action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<CreateFileToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
let project_path = match project.read(cx).find_project_path(&input.path, cx) {
Some(project_path) => project_path,
None => return Task::ready(Err(anyhow!("Path to create was outside the project"))),
None => {
return Task::ready(Err(anyhow!("Path to create was outside the project"))).into();
}
};
let contents: Arc<str> = input.contents.as_str().into();
let destination_path: Arc<str> = input.path.as_str().into();
@@ -106,5 +108,6 @@ impl Tool for CreateFileTool {
Ok(format!("Created file {destination_path}"))
})
.into()
}
}

View File

@@ -1,6 +1,6 @@
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use futures::{SinkExt, StreamExt, channel::mpsc};
use gpui::{App, AppContext, Entity, Task};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
@@ -63,15 +63,16 @@ impl Tool for DeletePathTool {
project: Entity<Project>,
action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let path_str = match serde_json::from_value::<DeletePathToolInput>(input) {
Ok(input) => input.path,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
let Some(project_path) = project.read(cx).find_project_path(&path_str, cx) else {
return Task::ready(Err(anyhow!(
"Couldn't delete {path_str} because that path isn't in this project."
)));
)))
.into();
};
let Some(worktree) = project
@@ -80,7 +81,8 @@ impl Tool for DeletePathTool {
else {
return Task::ready(Err(anyhow!(
"Couldn't delete {path_str} because that path isn't in this project."
)));
)))
.into();
};
let worktree_snapshot = worktree.read(cx).snapshot();
@@ -132,5 +134,6 @@ impl Tool for DeletePathTool {
)),
}
})
.into()
}
}

View File

@@ -1,6 +1,6 @@
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, Entity, Task};
use language::{DiagnosticSeverity, OffsetRangeExt};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
@@ -83,14 +83,15 @@ impl Tool for DiagnosticsTool {
project: Entity<Project>,
action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
match serde_json::from_value::<DiagnosticsToolInput>(input)
.ok()
.and_then(|input| input.path)
{
Some(path) if !path.is_empty() => {
let Some(project_path) = project.read(cx).find_project_path(&path, cx) else {
return Task::ready(Err(anyhow!("Could not find path {path} in project",)));
return Task::ready(Err(anyhow!("Could not find path {path} in project",)))
.into();
};
let buffer =
@@ -125,6 +126,7 @@ impl Tool for DiagnosticsTool {
Ok(output)
}
})
.into()
}
_ => {
let project = project.read(cx);
@@ -155,9 +157,10 @@ impl Tool for DiagnosticsTool {
});
if has_diagnostics {
Task::ready(Ok(output))
Task::ready(Ok(output)).into()
} else {
Task::ready(Ok("No errors or warnings found in the project.".to_string()))
.into()
}
}
}

View File

@@ -4,7 +4,7 @@ use std::sync::Arc;
use crate::schema::json_schema_for;
use anyhow::{Context as _, Result, anyhow, bail};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use futures::AsyncReadExt as _;
use gpui::{App, AppContext as _, Entity, Task};
use html_to_markdown::{TagHandler, convert_html_to_markdown, markdown};
@@ -146,10 +146,10 @@ impl Tool for FetchTool {
_project: Entity<Project>,
_action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<FetchToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
let text = cx.background_spawn({
@@ -158,13 +158,15 @@ impl Tool for FetchTool {
async move { Self::build_message(http_client, &url).await }
});
cx.foreground_executor().spawn(async move {
let text = text.await?;
if text.trim().is_empty() {
bail!("no textual content found");
}
cx.foreground_executor()
.spawn(async move {
let text = text.await?;
if text.trim().is_empty() {
bail!("no textual content found");
}
Ok(text)
})
Ok(text)
})
.into()
}
}

View File

@@ -1,6 +1,6 @@
use crate::{replace::replace_with_flexible_indent, schema::json_schema_for};
use anyhow::{Context as _, Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, AppContext, AsyncApp, Entity, Task};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
use project::Project;
@@ -169,10 +169,10 @@ impl Tool for FindReplaceFileTool {
project: Entity<Project>,
action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<FindReplaceFileToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
cx.spawn(async move |cx: &mut AsyncApp| {
@@ -263,6 +263,6 @@ impl Tool for FindReplaceFileTool {
Ok(format!("Edited {}:\n\n```diff\n{}\n```", input.path.display(), diff_str))
})
}).into()
}
}

View File

@@ -1,6 +1,6 @@
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, Entity, Task};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
use project::Project;
@@ -77,10 +77,10 @@ impl Tool for ListDirectoryTool {
project: Entity<Project>,
_action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<ListDirectoryToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
// Sometimes models will return these even though we tell it to give a path and not a glob.
@@ -101,26 +101,26 @@ impl Tool for ListDirectoryTool {
.collect::<Vec<_>>()
.join("\n");
return Task::ready(Ok(output));
return Task::ready(Ok(output)).into();
}
let Some(project_path) = project.read(cx).find_project_path(&input.path, cx) else {
return Task::ready(Err(anyhow!("Path {} not found in project", input.path)));
return Task::ready(Err(anyhow!("Path {} not found in project", input.path))).into();
};
let Some(worktree) = project
.read(cx)
.worktree_for_id(project_path.worktree_id, cx)
else {
return Task::ready(Err(anyhow!("Worktree not found")));
return Task::ready(Err(anyhow!("Worktree not found"))).into();
};
let worktree = worktree.read(cx);
let Some(entry) = worktree.entry_for_path(&project_path.path) else {
return Task::ready(Err(anyhow!("Path not found: {}", input.path)));
return Task::ready(Err(anyhow!("Path not found: {}", input.path))).into();
};
if !entry.is_dir() {
return Task::ready(Err(anyhow!("{} is not a directory.", input.path)));
return Task::ready(Err(anyhow!("{} is not a directory.", input.path))).into();
}
let mut output = String::new();
@@ -133,8 +133,8 @@ impl Tool for ListDirectoryTool {
.unwrap();
}
if output.is_empty() {
return Task::ready(Ok(format!("{} is empty.", input.path)));
return Task::ready(Ok(format!("{} is empty.", input.path))).into();
}
Task::ready(Ok(output))
Task::ready(Ok(output)).into()
}
}

View File

@@ -1,6 +1,6 @@
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, AppContext, Entity, Task};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
use project::Project;
@@ -90,10 +90,10 @@ impl Tool for MovePathTool {
project: Entity<Project>,
_action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<MovePathToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
let rename_task = project.update(cx, |project, cx| {
match project
@@ -128,5 +128,6 @@ impl Tool for MovePathTool {
)),
}
})
.into()
}
}

View File

@@ -2,7 +2,7 @@ use std::sync::Arc;
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use chrono::{Local, Utc};
use gpui::{App, Entity, Task};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
@@ -60,10 +60,10 @@ impl Tool for NowTool {
_project: Entity<Project>,
_action_log: Entity<ActionLog>,
_cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input: NowToolInput = match serde_json::from_value(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
let now = match input.timezone {
@@ -72,6 +72,6 @@ impl Tool for NowTool {
};
let text = format!("The current datetime is {now}.");
Task::ready(Ok(text))
Task::ready(Ok(text)).into()
}
}

View File

@@ -1,6 +1,6 @@
use crate::schema::json_schema_for;
use anyhow::{Context as _, Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, AppContext, Entity, Task};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
use project::Project;
@@ -53,10 +53,10 @@ impl Tool for OpenTool {
_project: Entity<Project>,
_action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input: OpenToolInput = match serde_json::from_value(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
cx.background_spawn(async move {
@@ -64,5 +64,6 @@ impl Tool for OpenTool {
Ok(format!("Successfully opened {}", input.path_or_url))
})
.into()
}
}

View File

@@ -1,6 +1,6 @@
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, AppContext, Entity, Task};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
use project::Project;
@@ -71,10 +71,10 @@ impl Tool for PathSearchTool {
project: Entity<Project>,
_action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let (offset, glob) = match serde_json::from_value::<PathSearchToolInput>(input) {
Ok(input) => (input.offset, input.glob),
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
let path_matcher = match PathMatcher::new([
@@ -82,7 +82,7 @@ impl Tool for PathSearchTool {
if glob.is_empty() { "*" } else { &glob },
]) {
Ok(matcher) => matcher,
Err(err) => return Task::ready(Err(anyhow!("Invalid glob: {err}"))),
Err(err) => return Task::ready(Err(anyhow!("Invalid glob: {err}"))).into(),
};
let snapshots: Vec<Snapshot> = project
.read(cx)
@@ -136,6 +136,6 @@ impl Tool for PathSearchTool {
Ok(response)
}
})
}).into()
}
}

View File

@@ -2,7 +2,7 @@ use std::sync::Arc;
use crate::{code_symbols_tool::file_outline, schema::json_schema_for};
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, Entity, Task};
use itertools::Itertools;
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
@@ -88,14 +88,14 @@ impl Tool for ReadFileTool {
project: Entity<Project>,
action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<ReadFileToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
let Some(project_path) = project.read(cx).find_project_path(&input.path, cx) else {
return Task::ready(Err(anyhow!("Path {} not found in project", &input.path,)));
return Task::ready(Err(anyhow!("Path {} not found in project", &input.path,))).into();
};
let file_path = input.path.clone();
@@ -146,6 +146,6 @@ impl Tool for ReadFileTool {
Ok(format!("This file was too big to read all at once. Here is an outline of its symbols:\n\n{outline}\n\nUsing the line numbers in this outline, you can call this tool again while specifying the start_line and end_line fields to see the implementations of symbols in the outline."))
}
}
})
}).into()
}
}

View File

@@ -1,6 +1,6 @@
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use futures::StreamExt;
use gpui::{App, Entity, Task};
use language::OffsetRangeExt;
@@ -92,13 +92,13 @@ impl Tool for RegexSearchTool {
project: Entity<Project>,
_action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
const CONTEXT_LINES: u32 = 2;
let (offset, regex, case_sensitive) =
match serde_json::from_value::<RegexSearchToolInput>(input) {
Ok(input) => (input.offset, input.regex, input.case_sensitive),
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
let query = match SearchQuery::regex(
@@ -112,7 +112,7 @@ impl Tool for RegexSearchTool {
None,
) {
Ok(query) => query,
Err(error) => return Task::ready(Err(error)),
Err(error) => return Task::ready(Err(error)).into(),
};
let results = project.update(cx, |project, cx| project.search(query, cx));
@@ -201,6 +201,6 @@ impl Tool for RegexSearchTool {
} else {
Ok(format!("Found {matches_found} matches:\n{output}"))
}
})
}).into()
}
}

View File

@@ -1,5 +1,5 @@
use anyhow::{Context as _, Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, Entity, Task};
use language::{self, Buffer, ToPointUtf16};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
@@ -88,10 +88,10 @@ impl Tool for RenameTool {
project: Entity<Project>,
action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<RenameToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
cx.spawn(async move |cx| {
@@ -138,7 +138,7 @@ impl Tool for RenameTool {
})?;
Ok(format!("Renamed '{}' to '{}'", input.symbol, input.new_name))
})
}).into()
}
}

View File

@@ -1,5 +1,5 @@
use anyhow::{Context as _, Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, AsyncApp, Entity, Task};
use language::{self, Anchor, Buffer, BufferSnapshot, Location, Point, ToPoint, ToPointUtf16};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
@@ -122,10 +122,10 @@ impl Tool for SymbolInfoTool {
project: Entity<Project>,
action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input = match serde_json::from_value::<SymbolInfoToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
cx.spawn(async move |cx| {
@@ -205,7 +205,7 @@ impl Tool for SymbolInfoTool {
} else {
Ok(output)
}
})
}).into()
}
}

View File

@@ -1,6 +1,6 @@
use crate::schema::json_schema_for;
use anyhow::{Context as _, Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use futures::io::BufReader;
use futures::{AsyncBufReadExt, AsyncReadExt, FutureExt};
use gpui::{App, AppContext, Entity, Task};
@@ -79,10 +79,10 @@ impl Tool for TerminalTool {
project: Entity<Project>,
_action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
let input: TerminalToolInput = match serde_json::from_value(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
let project = project.read(cx);
@@ -93,13 +93,15 @@ impl Tool for TerminalTool {
let only_worktree = match worktrees.next() {
Some(worktree) => worktree,
None => return Task::ready(Err(anyhow!("No worktrees found in the project"))),
None => {
return Task::ready(Err(anyhow!("No worktrees found in the project"))).into();
}
};
if worktrees.next().is_some() {
return Task::ready(Err(anyhow!(
"'.' is ambiguous in multi-root workspaces. Please specify a root directory explicitly."
)));
))).into();
}
only_worktree.read(cx).abs_path()
@@ -111,7 +113,8 @@ impl Tool for TerminalTool {
{
return Task::ready(Err(anyhow!(
"The absolute path must be within one of the project's worktrees"
)));
)))
.into();
}
input_path.into()
@@ -120,13 +123,15 @@ impl Tool for TerminalTool {
return Task::ready(Err(anyhow!(
"`cd` directory {} not found in the project",
&input.cd
)));
)))
.into();
};
worktree.read(cx).abs_path()
};
cx.background_spawn(run_command_limited(working_dir, input.command))
.into()
}
}

View File

@@ -2,7 +2,7 @@ use std::sync::Arc;
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool};
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{App, Entity, Task};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
use project::Project;
@@ -51,11 +51,12 @@ impl Tool for ThinkingTool {
_project: Entity<Project>,
_action_log: Entity<ActionLog>,
_cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
// This tool just "thinks out loud" and doesn't perform any actions.
Task::ready(match serde_json::from_value::<ThinkingToolInput>(input) {
Ok(_input) => Ok("Finished thinking.".to_string()),
Err(err) => Err(anyhow!(err)),
})
.into()
}
}

Binary file not shown (image diff: 577 KiB before, 0 B after — the image was removed).

View File

@@ -505,7 +505,10 @@ CREATE TABLE IF NOT EXISTS billing_subscriptions (
stripe_subscription_id TEXT NOT NULL,
stripe_subscription_status TEXT NOT NULL,
stripe_cancel_at TIMESTAMP,
stripe_cancellation_reason TEXT
stripe_cancellation_reason TEXT,
kind TEXT,
stripe_current_period_start BIGINT,
stripe_current_period_end BIGINT
);
CREATE INDEX "ix_billing_subscriptions_on_billing_customer_id" ON billing_subscriptions (billing_customer_id);

View File

@@ -0,0 +1,4 @@
alter table billing_subscriptions
add column kind text,
add column stripe_current_period_start bigint,
add column stripe_current_period_end bigint;

View File

@@ -1 +0,0 @@
ALTER TABLE monthly_usages ADD requests INTEGER NOT NULL DEFAULT 0;

View File

@@ -21,7 +21,9 @@ use stripe::{
use util::ResultExt;
use crate::api::events::SnowflakeRow;
use crate::db::billing_subscription::{StripeCancellationReason, StripeSubscriptionStatus};
use crate::db::billing_subscription::{
StripeCancellationReason, StripeSubscriptionStatus, SubscriptionKind,
};
use crate::llm::{DEFAULT_MAX_MONTHLY_SPEND, FREE_TIER_MONTHLY_SPENDING_LIMIT};
use crate::rpc::{ResultExt as _, Server};
use crate::{AppState, Cents, Error, Result};
@@ -184,7 +186,10 @@ async fn list_billing_subscriptions(
.into_iter()
.map(|subscription| BillingSubscriptionJson {
id: subscription.id,
name: "Zed LLM Usage".to_string(),
name: match subscription.kind {
Some(SubscriptionKind::ZedPro) => "Zed Pro".to_string(),
None => "Zed LLM Usage".to_string(),
},
status: subscription.stripe_subscription_status,
cancel_at: subscription.stripe_cancel_at.map(|cancel_at| {
cancel_at
@@ -198,9 +203,16 @@ async fn list_billing_subscriptions(
}))
}
#[derive(Debug, Clone, Copy, Deserialize)]
#[serde(rename_all = "snake_case")]
enum ProductCode {
ZedPro,
}
#[derive(Debug, Deserialize)]
struct CreateBillingSubscriptionBody {
github_user_id: i32,
product: Option<ProductCode>,
}
#[derive(Debug, Serialize)]
@@ -274,15 +286,30 @@ async fn create_billing_subscription(
customer.id
};
let default_model = llm_db.model(rpc::LanguageModelProvider::Anthropic, "claude-3-7-sonnet")?;
let stripe_model = stripe_billing.register_model(default_model).await?;
let success_url = format!(
"{}/account?checkout_complete=1",
app.config.zed_dot_dev_url()
);
let checkout_session_url = stripe_billing
.checkout(customer_id, &user.github_login, &stripe_model, &success_url)
.await?;
let checkout_session_url = match body.product {
Some(ProductCode::ZedPro) => {
let success_url = format!(
"{}/account?checkout_complete=1",
app.config.zed_dot_dev_url()
);
stripe_billing
.checkout_with_zed_pro(customer_id, &user.github_login, &success_url)
.await?
}
None => {
let default_model =
llm_db.model(rpc::LanguageModelProvider::Anthropic, "claude-3-7-sonnet")?;
let stripe_model = stripe_billing.register_model(default_model).await?;
let success_url = format!(
"{}/account?checkout_complete=1",
app.config.zed_dot_dev_url()
);
stripe_billing
.checkout(customer_id, &user.github_login, &stripe_model, &success_url)
.await?
}
};
Ok(Json(CreateBillingSubscriptionResponse {
checkout_session_url,
}))
@@ -291,6 +318,10 @@ async fn create_billing_subscription(
#[derive(Debug, PartialEq, Deserialize)]
#[serde(rename_all = "snake_case")]
enum ManageSubscriptionIntent {
/// The user intends to manage their subscription.
///
/// This will open the Stripe billing portal without putting the user in a specific flow.
ManageSubscription,
/// The user intends to cancel their subscription.
Cancel,
/// The user intends to stop the cancellation of their subscription.
@@ -378,7 +409,8 @@ async fn manage_billing_subscription(
}
let flow = match body.intent {
ManageSubscriptionIntent::Cancel => CreateBillingPortalSessionFlowData {
ManageSubscriptionIntent::ManageSubscription => None,
ManageSubscriptionIntent::Cancel => Some(CreateBillingPortalSessionFlowData {
type_: CreateBillingPortalSessionFlowDataType::SubscriptionCancel,
after_completion: Some(CreateBillingPortalSessionFlowDataAfterCompletion {
type_: stripe::CreateBillingPortalSessionFlowDataAfterCompletionType::Redirect,
@@ -394,12 +426,12 @@ async fn manage_billing_subscription(
},
),
..Default::default()
},
}),
ManageSubscriptionIntent::StopCancellation => unreachable!(),
};
let mut params = CreateBillingPortalSession::new(customer_id);
params.flow_data = Some(flow);
params.flow_data = flow;
let return_url = format!("{}/account", app.config.zed_dot_dev_url());
params.return_url = Some(&return_url);
@@ -664,6 +696,23 @@ async fn handle_customer_subscription_event(
log::info!("handling Stripe {} event: {}", event.type_, event.id);
let subscription_kind =
if let Some(zed_pro_price_id) = app.config.stripe_zed_pro_price_id.as_deref() {
let has_zed_pro_price = subscription.items.data.iter().any(|item| {
item.price
.as_ref()
.map_or(false, |price| price.id.as_str() == zed_pro_price_id)
});
if has_zed_pro_price {
Some(SubscriptionKind::ZedPro)
} else {
None
}
} else {
None
};
let billing_customer =
find_or_create_billing_customer(app, stripe_client, subscription.customer)
.await?
@@ -700,6 +749,7 @@ async fn handle_customer_subscription_event(
existing_subscription.id,
&UpdateBillingSubscriptionParams {
billing_customer_id: ActiveValue::set(billing_customer.id),
kind: ActiveValue::set(subscription_kind),
stripe_subscription_id: ActiveValue::set(subscription.id.to_string()),
stripe_subscription_status: ActiveValue::set(subscription.status.into()),
stripe_cancel_at: ActiveValue::set(
@@ -714,6 +764,12 @@ async fn handle_customer_subscription_event(
.and_then(|details| details.reason)
.map(|reason| reason.into()),
),
stripe_current_period_start: ActiveValue::set(Some(
subscription.current_period_start,
)),
stripe_current_period_end: ActiveValue::set(Some(
subscription.current_period_end,
)),
},
)
.await?;
@@ -748,12 +804,15 @@ async fn handle_customer_subscription_event(
app.db
.create_billing_subscription(&CreateBillingSubscriptionParams {
billing_customer_id: billing_customer.id,
kind: subscription_kind,
stripe_subscription_id: subscription.id.to_string(),
stripe_subscription_status: subscription.status.into(),
stripe_cancellation_reason: subscription
.cancellation_details
.and_then(|details| details.reason)
.map(|reason| reason.into()),
stripe_current_period_start: Some(subscription.current_period_start),
stripe_current_period_end: Some(subscription.current_period_end),
})
.await?;
}
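
The kind-detection logic in `handle_customer_subscription_event` above reduces to: a subscription is Zed Pro iff a Zed Pro price ID is configured and any of the subscription's line items carries that price. A self-contained sketch of that decision, with the Stripe types flattened to plain strings (the field names in the real code come from the `stripe` crate):

```rust
#[derive(Debug, PartialEq)]
enum SubscriptionKind {
    ZedPro,
}

// Returns Some(ZedPro) only when a price ID is configured AND one of the
// subscription's item price IDs matches it; otherwise None (legacy usage-based).
fn detect_kind(
    zed_pro_price_id: Option<&str>,
    item_price_ids: &[&str],
) -> Option<SubscriptionKind> {
    let target = zed_pro_price_id?;
    if item_price_ids.iter().any(|id| *id == target) {
        Some(SubscriptionKind::ZedPro)
    } else {
        None
    }
}
```

Note the two distinct `None` outcomes collapse to the same value: no configured price ID (deployment without Zed Pro) and a subscription whose items don't match it both map to the legacy "Zed LLM Usage" kind, matching the `name` fallback in `list_billing_subscriptions`.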

View File

@@ -1,22 +1,30 @@
use crate::db::billing_subscription::{StripeCancellationReason, StripeSubscriptionStatus};
use crate::db::billing_subscription::{
StripeCancellationReason, StripeSubscriptionStatus, SubscriptionKind,
};
use super::*;
#[derive(Debug)]
pub struct CreateBillingSubscriptionParams {
pub billing_customer_id: BillingCustomerId,
pub kind: Option<SubscriptionKind>,
pub stripe_subscription_id: String,
pub stripe_subscription_status: StripeSubscriptionStatus,
pub stripe_cancellation_reason: Option<StripeCancellationReason>,
pub stripe_current_period_start: Option<i64>,
pub stripe_current_period_end: Option<i64>,
}
#[derive(Debug, Default)]
pub struct UpdateBillingSubscriptionParams {
pub billing_customer_id: ActiveValue<BillingCustomerId>,
pub kind: ActiveValue<Option<SubscriptionKind>>,
pub stripe_subscription_id: ActiveValue<String>,
pub stripe_subscription_status: ActiveValue<StripeSubscriptionStatus>,
pub stripe_cancel_at: ActiveValue<Option<DateTime>>,
pub stripe_cancellation_reason: ActiveValue<Option<StripeCancellationReason>>,
pub stripe_current_period_start: ActiveValue<Option<i64>>,
pub stripe_current_period_end: ActiveValue<Option<i64>>,
}
impl Database {
@@ -28,9 +36,12 @@ impl Database {
self.transaction(|tx| async move {
billing_subscription::Entity::insert(billing_subscription::ActiveModel {
billing_customer_id: ActiveValue::set(params.billing_customer_id),
kind: ActiveValue::set(params.kind),
stripe_subscription_id: ActiveValue::set(params.stripe_subscription_id.clone()),
stripe_subscription_status: ActiveValue::set(params.stripe_subscription_status),
stripe_cancellation_reason: ActiveValue::set(params.stripe_cancellation_reason),
stripe_current_period_start: ActiveValue::set(params.stripe_current_period_start),
stripe_current_period_end: ActiveValue::set(params.stripe_current_period_end),
..Default::default()
})
.exec_without_returning(&*tx)

View File

@@ -9,10 +9,13 @@ pub struct Model {
#[sea_orm(primary_key)]
pub id: BillingSubscriptionId,
pub billing_customer_id: BillingCustomerId,
pub kind: Option<SubscriptionKind>,
pub stripe_subscription_id: String,
pub stripe_subscription_status: StripeSubscriptionStatus,
pub stripe_cancel_at: Option<DateTime>,
pub stripe_cancellation_reason: Option<StripeCancellationReason>,
pub stripe_current_period_start: Option<i64>,
pub stripe_current_period_end: Option<i64>,
pub created_at: DateTime,
}
@@ -34,6 +37,14 @@ impl Related<super::billing_customer::Entity> for Entity {
impl ActiveModelBehavior for ActiveModel {}
#[derive(Eq, PartialEq, Copy, Clone, Debug, EnumIter, DeriveActiveEnum, Hash, Serialize)]
#[sea_orm(rs_type = "String", db_type = "String(StringLen::None)")]
#[serde(rename_all = "snake_case")]
pub enum SubscriptionKind {
#[sea_orm(string_value = "zed_pro")]
ZedPro,
}
/// The status of a Stripe subscription.
///
/// [Stripe docs](https://docs.stripe.com/api/subscriptions/object#subscription_object-status)

View File

@@ -39,9 +39,12 @@ async fn test_get_active_billing_subscriptions(db: &Arc<Database>) {
db.create_billing_subscription(&CreateBillingSubscriptionParams {
billing_customer_id: customer.id,
kind: None,
stripe_subscription_id: "sub_active_user".into(),
stripe_subscription_status: StripeSubscriptionStatus::Active,
stripe_cancellation_reason: None,
stripe_current_period_start: None,
stripe_current_period_end: None,
})
.await
.unwrap();
@@ -74,9 +77,12 @@ async fn test_get_active_billing_subscriptions(db: &Arc<Database>) {
db.create_billing_subscription(&CreateBillingSubscriptionParams {
billing_customer_id: customer.id,
kind: None,
stripe_subscription_id: "sub_past_due_user".into(),
stripe_subscription_status: StripeSubscriptionStatus::PastDue,
stripe_cancellation_reason: None,
stripe_current_period_start: None,
stripe_current_period_end: None,
})
.await
.unwrap();

View File

@@ -182,6 +182,7 @@ pub struct Config {
pub slack_panics_webhook: Option<String>,
pub auto_join_channel_id: Option<ChannelId>,
pub stripe_api_key: Option<String>,
pub stripe_zed_pro_price_id: Option<String>,
pub supermaven_admin_api_key: Option<Arc<str>>,
pub user_backfiller_github_access_token: Option<Arc<str>>,
}
@@ -237,6 +238,7 @@ impl Config {
migrations_path: None,
seed_path: None,
stripe_api_key: None,
stripe_zed_pro_price_id: None,
supermaven_admin_api_key: None,
user_backfiller_github_access_token: None,
kinesis_region: None,
@@ -322,9 +324,12 @@ impl AppState {
llm_db,
livekit_client,
blob_store_client: build_blob_store_client(&config).await.log_err(),
stripe_billing: stripe_client
.clone()
.map(|stripe_client| Arc::new(StripeBilling::new(stripe_client))),
stripe_billing: stripe_client.clone().map(|stripe_client| {
Arc::new(StripeBilling::new(
stripe_client,
config.stripe_zed_pro_price_id.clone(),
))
}),
stripe_client,
rate_limiter: Arc::new(RateLimiter::new(db)),
executor,

View File

@@ -1,7 +1,7 @@
use std::sync::Arc;
use crate::{Cents, Result, llm};
use anyhow::Context as _;
use anyhow::{Context as _, anyhow};
use chrono::{Datelike, Utc};
use collections::HashMap;
use serde::{Deserialize, Serialize};
@@ -10,6 +10,7 @@ use tokio::sync::RwLock;
pub struct StripeBilling {
state: RwLock<StripeBillingState>,
client: Arc<stripe::Client>,
zed_pro_price_id: Option<String>,
}
#[derive(Default)]
@@ -31,10 +32,11 @@ struct StripeBillingPrice {
}
impl StripeBilling {
pub fn new(client: Arc<stripe::Client>) -> Self {
pub fn new(client: Arc<stripe::Client>, zed_pro_price_id: Option<String>) -> Self {
Self {
client,
state: RwLock::default(),
zed_pro_price_id,
}
}
@@ -382,6 +384,32 @@ impl StripeBilling {
let session = stripe::CheckoutSession::create(&self.client, params).await?;
Ok(session.url.context("no checkout session URL")?)
}
pub async fn checkout_with_zed_pro(
&self,
customer_id: stripe::CustomerId,
github_login: &str,
success_url: &str,
) -> Result<String> {
let zed_pro_price_id = self
.zed_pro_price_id
.as_ref()
.ok_or_else(|| anyhow!("Zed Pro price ID not set"))?;
let mut params = stripe::CreateCheckoutSession::new();
params.mode = Some(stripe::CheckoutSessionMode::Subscription);
params.customer = Some(customer_id);
params.client_reference_id = Some(github_login);
params.line_items = Some(vec![stripe::CreateCheckoutSessionLineItems {
price: Some(zed_pro_price_id.clone()),
quantity: Some(1),
..Default::default()
}]);
params.success_url = Some(success_url);
let session = stripe::CheckoutSession::create(&self.client, params).await?;
Ok(session.url.context("no checkout session URL")?)
}
}
#[derive(Serialize)]

View File

@@ -557,6 +557,7 @@ impl TestServer {
migrations_path: None,
seed_path: None,
stripe_api_key: None,
stripe_zed_pro_price_id: None,
supermaven_admin_api_key: None,
user_backfiller_github_access_token: None,
kinesis_region: None,

View File

@@ -1,7 +1,7 @@
use std::sync::Arc;
use anyhow::{Result, anyhow, bail};
use assistant_tool::{ActionLog, Tool, ToolSource};
use assistant_tool::{ActionLog, Tool, ToolResult, ToolSource};
use gpui::{App, Entity, Task};
use icons::IconName;
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
@@ -78,7 +78,7 @@ impl Tool for ContextServerTool {
_project: Entity<Project>,
_action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
) -> ToolResult {
if let Some(server) = self.server_manager.read(cx).get_server(&self.server_id) {
let tool_name = self.tool.name.clone();
let server_clone = server.clone();
@@ -118,8 +118,9 @@ impl Tool for ContextServerTool {
}
Ok(result)
})
.into()
} else {
Task::ready(Err(anyhow!("Context server not found")))
Task::ready(Err(anyhow!("Context server not found"))).into()
}
}
}

View File

@@ -39,7 +39,6 @@ log.workspace = true
node_runtime.workspace = true
parking_lot.workspace = true
paths.workspace = true
regex.workspace = true
schemars.workspace = true
serde.workspace = true
serde_json.workspace = true

View File

@@ -3,13 +3,13 @@ use anyhow::{Context as _, Ok, Result, anyhow};
use async_compression::futures::bufread::GzipDecoder;
use async_tar::Archive;
use async_trait::async_trait;
use dap_types::StartDebuggingRequestArguments;
use futures::io::BufReader;
use gpui::{AsyncApp, SharedString};
pub use http_client::{HttpClient, github::latest_github_release};
use language::LanguageToolchainStore;
use node_runtime::NodeRuntime;
use serde::{Deserialize, Serialize};
use serde_json::Value;
use settings::WorktreeId;
use smol::{self, fs::File, lock::Mutex};
use std::{
@@ -20,9 +20,9 @@ use std::{
net::Ipv4Addr,
ops::Deref,
path::PathBuf,
sync::{Arc, LazyLock},
sync::Arc,
};
use task::{DebugAdapterConfig, DebugTaskDefinition};
use task::DebugTaskDefinition;
use util::ResultExt;
#[derive(Clone, Debug, PartialEq, Eq)]
@@ -93,13 +93,15 @@ pub struct TcpArguments {
pub port: u16,
pub timeout: Option<u64>,
}
#[derive(Default, Debug, Clone)]
#[derive(Debug, Clone)]
pub struct DebugAdapterBinary {
pub adapter_name: DebugAdapterName,
pub command: String,
pub arguments: Option<Vec<OsString>>,
pub envs: Option<HashMap<String, String>>,
pub cwd: Option<PathBuf>,
pub connection: Option<TcpArguments>,
pub request_args: StartDebuggingRequestArguments,
}
#[derive(Debug)]
@@ -220,7 +222,7 @@ pub trait DebugAdapter: 'static + Send + Sync {
async fn get_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
config: &DebugTaskDefinition,
user_installed_path: Option<PathBuf>,
cx: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
@@ -284,21 +286,11 @@ pub trait DebugAdapter: 'static + Send + Sync {
async fn get_installed_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
config: &DebugTaskDefinition,
user_installed_path: Option<PathBuf>,
cx: &mut AsyncApp,
) -> Result<DebugAdapterBinary>;
/// Should return base configuration to make the debug adapter work
fn request_args(&self, config: &DebugTaskDefinition) -> Value;
fn attach_processes_filter(&self) -> regex::Regex {
EMPTY_REGEX.clone()
}
}
static EMPTY_REGEX: LazyLock<regex::Regex> =
LazyLock::new(|| regex::Regex::new("").expect("Regex compilation to succeed"));
#[cfg(any(test, feature = "test-support"))]
pub struct FakeAdapter {}
@@ -309,6 +301,31 @@ impl FakeAdapter {
pub fn new() -> Self {
Self {}
}
fn request_args(&self, config: &DebugTaskDefinition) -> StartDebuggingRequestArguments {
use serde_json::json;
use task::DebugRequestType;
let value = json!({
"request": match config.request {
DebugRequestType::Launch(_) => "launch",
DebugRequestType::Attach(_) => "attach",
},
"process_id": if let DebugRequestType::Attach(attach_config) = &config.request {
attach_config.process_id
} else {
None
},
});
let request = match config.request {
DebugRequestType::Launch(_) => dap_types::StartDebuggingRequestArgumentsRequest::Launch,
DebugRequestType::Attach(_) => dap_types::StartDebuggingRequestArgumentsRequest::Attach,
};
StartDebuggingRequestArguments {
configuration: value,
request,
}
}
}
#[cfg(any(test, feature = "test-support"))]
@@ -321,16 +338,18 @@ impl DebugAdapter for FakeAdapter {
async fn get_binary(
&self,
_: &dyn DapDelegate,
_: &DebugAdapterConfig,
config: &DebugTaskDefinition,
_: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
Ok(DebugAdapterBinary {
adapter_name: Self::ADAPTER_NAME.into(),
command: "command".into(),
arguments: None,
connection: None,
envs: None,
cwd: None,
request_args: self.request_args(config),
})
}
@@ -352,33 +371,10 @@ impl DebugAdapter for FakeAdapter {
async fn get_installed_binary(
&self,
_: &dyn DapDelegate,
_: &DebugAdapterConfig,
_: &DebugTaskDefinition,
_: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
unimplemented!("get installed binary");
}
fn request_args(&self, config: &DebugTaskDefinition) -> Value {
use serde_json::json;
use task::DebugRequestType;
json!({
"request": match config.request {
DebugRequestType::Launch(_) => "launch",
DebugRequestType::Attach(_) => "attach",
},
"process_id": if let DebugRequestType::Attach(attach_config) = &config.request {
attach_config.process_id
} else {
None
},
})
}
fn attach_processes_filter(&self) -> regex::Regex {
static REGEX: LazyLock<regex::Regex> =
LazyLock::new(|| regex::Regex::new("^fake-binary").unwrap());
REGEX.clone()
}
}

View File

@@ -39,7 +39,6 @@ impl SessionId {
/// Represents a connection to the debug adapter process, either via stdout/stdin or a socket.
pub struct DebugAdapterClient {
id: SessionId,
name: DebugAdapterName,
sequence_count: AtomicU64,
binary: DebugAdapterBinary,
executor: BackgroundExecutor,
@@ -51,7 +50,6 @@ pub type DapMessageHandler = Box<dyn FnMut(Message) + 'static + Send + Sync>;
impl DebugAdapterClient {
pub async fn start(
id: SessionId,
name: DebugAdapterName,
binary: DebugAdapterBinary,
message_handler: DapMessageHandler,
cx: AsyncApp,
@@ -60,7 +58,6 @@ impl DebugAdapterClient {
TransportDelegate::start(&binary, cx.clone()).await?;
let this = Self {
id,
name,
binary,
transport_delegate,
sequence_count: AtomicU64::new(1),
@@ -91,6 +88,7 @@ impl DebugAdapterClient {
) -> Result<Self> {
let binary = match self.transport_delegate.transport() {
crate::transport::Transport::Tcp(tcp_transport) => DebugAdapterBinary {
adapter_name: binary.adapter_name,
command: binary.command,
arguments: binary.arguments,
envs: binary.envs,
@@ -100,11 +98,12 @@ impl DebugAdapterClient {
port: tcp_transport.port,
timeout: Some(tcp_transport.timeout),
}),
request_args: binary.request_args,
},
_ => self.binary.clone(),
};
Self::start(session_id, self.name(), binary, message_handler, cx).await
Self::start(session_id, binary, message_handler, cx).await
}
async fn handle_receive_messages(
@@ -189,7 +188,17 @@ impl DebugAdapterClient {
let response = response??;
match response.success {
true => Ok(serde_json::from_value(response.body.unwrap_or_default())?),
true => {
if let Some(json) = response.body {
Ok(serde_json::from_value(json)?)
// Note: dap types configure themselves to return `None` when an empty object is received,
// which then fails here...
} else if let Ok(result) = serde_json::from_value(serde_json::Value::Object(Default::default())) {
Ok(result)
} else {
Ok(serde_json::from_value(Default::default())?)
}
}
false => Err(anyhow!("Request failed: {}", response.message.unwrap_or_default())),
}
}
@@ -211,7 +220,7 @@ impl DebugAdapterClient {
}
pub fn name(&self) -> DebugAdapterName {
self.name.clone()
self.binary.adapter_name.clone()
}
pub fn binary(&self) -> &DebugAdapterBinary {
&self.binary
@@ -238,14 +247,14 @@ impl DebugAdapterClient {
}
#[cfg(any(test, feature = "test-support"))]
pub async fn on_request<R: dap_types::requests::Request, F>(&self, handler: F)
pub fn on_request<R: dap_types::requests::Request, F>(&self, handler: F)
where
F: 'static
+ Send
+ FnMut(u64, R::Arguments) -> Result<R::Response, dap_types::ErrorResponse>,
{
let transport = self.transport_delegate.transport().as_fake();
transport.on_request::<R, F>(handler).await;
transport.on_request::<R, F>(handler);
}
#[cfg(any(test, feature = "test-support"))]
@@ -282,7 +291,7 @@ mod tests {
use crate::{client::DebugAdapterClient, debugger_settings::DebuggerSettings};
use dap_types::{
Capabilities, InitializeRequestArguments, InitializeRequestArgumentsPathFormat,
RunInTerminalRequestArguments,
RunInTerminalRequestArguments, StartDebuggingRequestArguments,
messages::Events,
requests::{Initialize, Request, RunInTerminal},
};
@@ -312,13 +321,17 @@ mod tests {
let client = DebugAdapterClient::start(
crate::client::SessionId(1),
DebugAdapterName("adapter".into()),
DebugAdapterBinary {
adapter_name: "adapter".into(),
command: "command".into(),
arguments: Default::default(),
envs: Default::default(),
connection: None,
cwd: None,
request_args: StartDebuggingRequestArguments {
configuration: serde_json::Value::Null,
request: dap_types::StartDebuggingRequestArgumentsRequest::Launch,
},
},
Box::new(|_| panic!("Did not expect to hit this code path")),
cx.to_async(),
@@ -326,14 +339,12 @@ mod tests {
.await
.unwrap();
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap_types::Capabilities {
supports_configuration_done_request: Some(true),
..Default::default()
})
client.on_request::<Initialize, _>(move |_, _| {
Ok(dap_types::Capabilities {
supports_configuration_done_request: Some(true),
..Default::default()
})
.await;
});
cx.run_until_parked();
@@ -381,13 +392,17 @@ mod tests {
let client = DebugAdapterClient::start(
crate::client::SessionId(1),
DebugAdapterName("adapter".into()),
DebugAdapterBinary {
adapter_name: "adapter".into(),
command: "command".into(),
arguments: Default::default(),
envs: Default::default(),
connection: None,
cwd: None,
request_args: StartDebuggingRequestArguments {
configuration: serde_json::Value::Null,
request: dap_types::StartDebuggingRequestArgumentsRequest::Launch,
},
},
Box::new({
let called_event_handler = called_event_handler.clone();
@@ -431,13 +446,17 @@ mod tests {
let client = DebugAdapterClient::start(
crate::client::SessionId(1),
DebugAdapterName("test-adapter".into()),
DebugAdapterBinary {
adapter_name: "test-adapter".into(),
command: "command".into(),
arguments: Default::default(),
envs: Default::default(),
connection: None,
cwd: None,
request_args: dap_types::StartDebuggingRequestArguments {
configuration: serde_json::Value::Null,
request: dap_types::StartDebuggingRequestArgumentsRequest::Launch,
},
},
Box::new({
let called_event_handler = called_event_handler.clone();


@@ -8,7 +8,7 @@ struct DapRegistryState {
adapters: BTreeMap<DebugAdapterName, Arc<dyn DebugAdapter>>,
}
#[derive(Default)]
#[derive(Clone, Default)]
/// Stores available debug adapters.
pub struct DapRegistry(Arc<RwLock<DapRegistryState>>);


@@ -699,14 +699,8 @@ impl StdioTransport {
}
#[cfg(any(test, feature = "test-support"))]
type RequestHandler = Box<
dyn Send
+ FnMut(
u64,
serde_json::Value,
Arc<Mutex<async_pipe::PipeWriter>>,
) -> std::pin::Pin<Box<dyn std::future::Future<Output = ()> + Send>>,
>;
type RequestHandler =
Box<dyn Send + FnMut(u64, serde_json::Value) -> dap_types::messages::Response>;
#[cfg(any(test, feature = "test-support"))]
type ResponseHandler = Box<dyn Send + Fn(Response)>;
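The simplified `RequestHandler` alias above returns a `Response` directly instead of writing to a pipe inside a boxed future; the transport loop now owns the I/O. A std-only sketch of that shape (this `Response` struct and the `make_registry`/`dispatch` helpers are trimmed stand-ins, not the real `dap_types` API):

```rust
use std::collections::HashMap;

// Stand-in for dap_types::messages::Response; fields trimmed for the sketch.
#[derive(Debug)]
struct Response {
    request_seq: u64,
    success: bool,
    command: String,
}

// Handlers are synchronous: given a sequence number and raw arguments,
// they return the Response to send; the caller performs the write.
type RequestHandler = Box<dyn Send + FnMut(u64, String) -> Response>;

fn make_registry() -> HashMap<&'static str, RequestHandler> {
    let mut handlers: HashMap<&'static str, RequestHandler> = HashMap::new();
    handlers.insert(
        "initialize",
        Box::new(|seq, _args| Response {
            request_seq: seq,
            success: true,
            command: "initialize".into(),
        }),
    );
    handlers
}

fn dispatch(
    handlers: &mut HashMap<&'static str, RequestHandler>,
    command: &str,
    seq: u64,
) -> Response {
    handlers.get_mut(command).expect("no handler registered")(seq, String::new())
}

fn main() {
    let mut handlers = make_registry();
    let response = dispatch(&mut handlers, "initialize", 7);
    assert_eq!(response.request_seq, 7);
    assert!(response.success);
    assert_eq!(response.command, "initialize");
}
```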
@@ -714,45 +708,41 @@ type ResponseHandler = Box<dyn Send + Fn(Response)>;
#[cfg(any(test, feature = "test-support"))]
pub struct FakeTransport {
// for sending fake response back from adapter side
request_handlers: Arc<Mutex<HashMap<&'static str, RequestHandler>>>,
request_handlers: Arc<parking_lot::Mutex<HashMap<&'static str, RequestHandler>>>,
// for reverse request responses
response_handlers: Arc<Mutex<HashMap<&'static str, ResponseHandler>>>,
response_handlers: Arc<parking_lot::Mutex<HashMap<&'static str, ResponseHandler>>>,
}
#[cfg(any(test, feature = "test-support"))]
impl FakeTransport {
pub async fn on_request<R: dap_types::requests::Request, F>(&self, mut handler: F)
pub fn on_request<R: dap_types::requests::Request, F>(&self, mut handler: F)
where
F: 'static + Send + FnMut(u64, R::Arguments) -> Result<R::Response, ErrorResponse>,
{
self.request_handlers.lock().await.insert(
self.request_handlers.lock().insert(
R::COMMAND,
Box::new(
move |seq, args, writer: Arc<Mutex<async_pipe::PipeWriter>>| {
let response = handler(seq, serde_json::from_value(args).unwrap());
let message = serde_json::to_string(&Message::Response(Response {
Box::new(move |seq, args| {
let result = handler(seq, serde_json::from_value(args).unwrap());
let response = match result {
Ok(response) => Response {
seq: seq + 1,
request_seq: seq,
success: response.as_ref().is_ok(),
success: true,
command: R::COMMAND.into(),
body: util::maybe!({ serde_json::to_value(response.ok()?).ok() }),
body: Some(serde_json::to_value(response).unwrap()),
message: None,
}))
.unwrap();
let writer = writer.clone();
Box::pin(async move {
let mut writer = writer.lock().await;
writer
.write_all(TransportDelegate::build_rpc_message(message).as_bytes())
.await
.unwrap();
writer.flush().await.unwrap();
})
},
),
},
Err(response) => Response {
seq: seq + 1,
request_seq: seq,
success: false,
command: R::COMMAND.into(),
body: Some(serde_json::to_value(response).unwrap()),
message: None,
},
};
response
}),
);
}
@@ -762,14 +752,13 @@ impl FakeTransport {
{
self.response_handlers
.lock()
.await
.insert(R::COMMAND, Box::new(handler));
}
async fn start(cx: AsyncApp) -> Result<(TransportPipe, Self)> {
let this = Self {
request_handlers: Arc::new(Mutex::new(HashMap::default())),
response_handlers: Arc::new(Mutex::new(HashMap::default())),
request_handlers: Arc::new(parking_lot::Mutex::new(HashMap::default())),
response_handlers: Arc::new(parking_lot::Mutex::new(HashMap::default())),
};
use dap_types::requests::{Request, RunInTerminal, StartDebugging};
use serde_json::json;
@@ -816,23 +805,31 @@ impl FakeTransport {
.unwrap();
writer.flush().await.unwrap();
} else {
if let Some(handle) = request_handlers
let response = if let Some(handle) = request_handlers
.lock()
.await
.get_mut(request.command.as_str())
{
handle(
request.seq,
request.arguments.unwrap_or(json!({})),
stdout_writer.clone(),
)
.await;
} else {
log::error!(
"No request handler for {}",
request.command
);
}
panic!("No request handler for {}", request.command);
};
let message =
serde_json::to_string(&Message::Response(response))
.unwrap();
let mut writer = stdout_writer.lock().await;
writer
.write_all(
TransportDelegate::build_rpc_message(message)
.as_bytes(),
)
.await
.unwrap();
writer.flush().await.unwrap();
}
}
Message::Event(event) => {
@@ -850,10 +847,8 @@ impl FakeTransport {
writer.flush().await.unwrap();
}
Message::Response(response) => {
if let Some(handle) = response_handlers
.lock()
.await
.get(response.command.as_str())
if let Some(handle) =
response_handlers.lock().get(response.command.as_str())
{
handle(response);
} else {


@@ -27,7 +27,6 @@ dap.workspace = true
gpui.workspace = true
language.workspace = true
paths.workspace = true
regex.workspace = true
serde.workspace = true
serde_json.workspace = true
task.workspace = true


@@ -4,7 +4,7 @@ use anyhow::{Result, bail};
use async_trait::async_trait;
use dap::adapters::latest_github_release;
use gpui::AsyncApp;
use task::{DebugAdapterConfig, DebugRequestType, DebugTaskDefinition};
use task::{DebugRequestType, DebugTaskDefinition};
use crate::*;
@@ -15,6 +15,42 @@ pub(crate) struct CodeLldbDebugAdapter {
impl CodeLldbDebugAdapter {
const ADAPTER_NAME: &'static str = "CodeLLDB";
fn request_args(&self, config: &DebugTaskDefinition) -> dap::StartDebuggingRequestArguments {
let mut configuration = json!({
"request": match config.request {
DebugRequestType::Launch(_) => "launch",
DebugRequestType::Attach(_) => "attach",
},
});
let map = configuration.as_object_mut().unwrap();
// CodeLLDB uses `name` for a terminal label.
map.insert("name".into(), Value::String(config.label.clone()));
let request = config.request.to_dap();
match &config.request {
DebugRequestType::Attach(attach) => {
map.insert("pid".into(), attach.process_id.into());
}
DebugRequestType::Launch(launch) => {
map.insert("program".into(), launch.program.clone().into());
if !launch.args.is_empty() {
map.insert("args".into(), launch.args.clone().into());
}
if let Some(stop_on_entry) = config.stop_on_entry {
map.insert("stopOnEntry".into(), stop_on_entry.into());
}
if let Some(cwd) = launch.cwd.as_ref() {
map.insert("cwd".into(), cwd.to_string_lossy().into_owned().into());
}
}
}
dap::StartDebuggingRequestArguments {
request,
configuration,
}
}
}
#[async_trait(?Send)]
@@ -86,7 +122,7 @@ impl DebugAdapter for CodeLldbDebugAdapter {
async fn get_installed_binary(
&self,
_: &dyn DapDelegate,
_: &DebugAdapterConfig,
config: &DebugTaskDefinition,
_: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
@@ -111,39 +147,10 @@ impl DebugAdapter for CodeLldbDebugAdapter {
.to_string()
.into(),
]),
..Default::default()
request_args: self.request_args(config),
adapter_name: "test".into(),
envs: None,
connection: None,
})
}
fn request_args(&self, config: &DebugTaskDefinition) -> Value {
let mut args = json!({
"request": match config.request {
DebugRequestType::Launch(_) => "launch",
DebugRequestType::Attach(_) => "attach",
},
});
let map = args.as_object_mut().unwrap();
// CodeLLDB uses `name` for a terminal label.
map.insert("name".into(), Value::String(config.label.clone()));
match &config.request {
DebugRequestType::Attach(attach) => {
map.insert("pid".into(), attach.process_id.into());
}
DebugRequestType::Launch(launch) => {
map.insert("program".into(), launch.program.clone().into());
if !launch.args.is_empty() {
map.insert("args".into(), launch.args.clone().into());
}
if let Some(stop_on_entry) = config.stop_on_entry {
map.insert("stopOnEntry".into(), stop_on_entry.into());
}
if let Some(cwd) = launch.cwd.as_ref() {
map.insert("cwd".into(), cwd.to_string_lossy().into_owned().into());
}
}
}
args
}
}


@@ -2,7 +2,6 @@ mod codelldb;
mod gdb;
mod go;
mod javascript;
mod lldb;
mod php;
mod python;
@@ -12,7 +11,7 @@ use anyhow::{Result, anyhow};
use async_trait::async_trait;
use codelldb::CodeLldbDebugAdapter;
use dap::{
DapRegistry,
DapRegistry, DebugRequestType,
adapters::{
self, AdapterVersion, DapDelegate, DebugAdapter, DebugAdapterBinary, DebugAdapterName,
GithubRepo,
@@ -21,18 +20,16 @@ use dap::{
use gdb::GdbDebugAdapter;
use go::GoDebugAdapter;
use javascript::JsDebugAdapter;
use lldb::LldbDebugAdapter;
use php::PhpDebugAdapter;
use python::PythonDebugAdapter;
use serde_json::{Value, json};
use task::{DebugAdapterConfig, TCPHost};
use task::TCPHost;
pub fn init(registry: Arc<DapRegistry>) {
registry.add_adapter(Arc::from(CodeLldbDebugAdapter::default()));
registry.add_adapter(Arc::from(PythonDebugAdapter));
registry.add_adapter(Arc::from(PhpDebugAdapter));
registry.add_adapter(Arc::from(JsDebugAdapter::default()));
registry.add_adapter(Arc::from(LldbDebugAdapter));
registry.add_adapter(Arc::from(JsDebugAdapter));
registry.add_adapter(Arc::from(GoDebugAdapter));
registry.add_adapter(Arc::from(GdbDebugAdapter));
}
@@ -51,3 +48,16 @@ pub(crate) async fn configure_tcp_connection(
Ok((host, port, timeout))
}
trait ToDap {
fn to_dap(&self) -> dap::StartDebuggingRequestArgumentsRequest;
}
impl ToDap for DebugRequestType {
fn to_dap(&self) -> dap::StartDebuggingRequestArgumentsRequest {
match self {
Self::Launch(_) => dap::StartDebuggingRequestArgumentsRequest::Launch,
Self::Attach(_) => dap::StartDebuggingRequestArgumentsRequest::Attach,
}
}
}
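The `ToDap` conversion above discards the per-variant payload and keeps only the request kind. A self-contained usage sketch (the enums below are std-only stand-ins mirroring the `task::` and `dap::` types, which are workspace crates):

```rust
// Std-only stand-ins for the task:: and dap:: enums involved.
#[derive(Debug)]
enum DebugRequestType {
    Launch(String), // program path
    Attach(u32),    // process id
}

#[derive(Debug, PartialEq)]
enum StartDebuggingRequestArgumentsRequest {
    Launch,
    Attach,
}

trait ToDap {
    fn to_dap(&self) -> StartDebuggingRequestArgumentsRequest;
}

impl ToDap for DebugRequestType {
    // The payloads are ignored: only the request *kind* matters to DAP.
    fn to_dap(&self) -> StartDebuggingRequestArgumentsRequest {
        match self {
            Self::Launch(_) => StartDebuggingRequestArgumentsRequest::Launch,
            Self::Attach(_) => StartDebuggingRequestArgumentsRequest::Attach,
        }
    }
}

fn main() {
    let launch = DebugRequestType::Launch("a.out".into());
    assert_eq!(launch.to_dap(), StartDebuggingRequestArgumentsRequest::Launch);
    let attach = DebugRequestType::Attach(1234);
    assert_eq!(attach.to_dap(), StartDebuggingRequestArgumentsRequest::Attach);
}
```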


@@ -2,8 +2,9 @@ use std::ffi::OsStr;
use anyhow::{Result, bail};
use async_trait::async_trait;
use dap::StartDebuggingRequestArguments;
use gpui::AsyncApp;
use task::{DebugAdapterConfig, DebugRequestType, DebugTaskDefinition};
use task::{DebugRequestType, DebugTaskDefinition};
use crate::*;
@@ -12,68 +13,8 @@ pub(crate) struct GdbDebugAdapter;
impl GdbDebugAdapter {
const ADAPTER_NAME: &'static str = "GDB";
}
#[async_trait(?Send)]
impl DebugAdapter for GdbDebugAdapter {
fn name(&self) -> DebugAdapterName {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
async fn get_binary(
&self,
delegate: &dyn DapDelegate,
_: &DebugAdapterConfig,
user_installed_path: Option<std::path::PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
let user_setting_path = user_installed_path
.filter(|p| p.exists())
.and_then(|p| p.to_str().map(|s| s.to_string()));
let gdb_path = delegate
.which(OsStr::new("gdb"))
.and_then(|p| p.to_str().map(|s| s.to_string()))
.ok_or(anyhow!("Could not find gdb in path"));
if gdb_path.is_err() && user_setting_path.is_none() {
bail!("Could not find gdb path or it's not installed");
}
let gdb_path = user_setting_path.unwrap_or(gdb_path?);
Ok(DebugAdapterBinary {
command: gdb_path,
arguments: Some(vec!["-i=dap".into()]),
envs: None,
cwd: None,
connection: None,
})
}
async fn install_binary(
&self,
_version: AdapterVersion,
_delegate: &dyn DapDelegate,
) -> Result<()> {
unimplemented!("GDB debug adapter cannot be installed by Zed (yet)")
}
async fn fetch_latest_adapter_version(&self, _: &dyn DapDelegate) -> Result<AdapterVersion> {
unimplemented!("Fetch latest GDB version not implemented (yet)")
}
async fn get_installed_binary(
&self,
_: &dyn DapDelegate,
_: &DebugAdapterConfig,
_: Option<std::path::PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
unimplemented!("GDB cannot be installed by Zed (yet)")
}
fn request_args(&self, config: &DebugTaskDefinition) -> Value {
fn request_args(&self, config: &DebugTaskDefinition) -> StartDebuggingRequestArguments {
let mut args = json!({
"request": match config.request {
DebugRequestType::Launch(_) => "launch",
@@ -105,6 +46,71 @@ impl DebugAdapter for GdbDebugAdapter {
}
}
}
args
StartDebuggingRequestArguments {
configuration: args,
request: config.request.to_dap(),
}
}
}
#[async_trait(?Send)]
impl DebugAdapter for GdbDebugAdapter {
fn name(&self) -> DebugAdapterName {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
async fn get_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugTaskDefinition,
user_installed_path: Option<std::path::PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
let user_setting_path = user_installed_path
.filter(|p| p.exists())
.and_then(|p| p.to_str().map(|s| s.to_string()));
let gdb_path = delegate
.which(OsStr::new("gdb"))
.and_then(|p| p.to_str().map(|s| s.to_string()))
.ok_or(anyhow!("Could not find gdb in path"));
if gdb_path.is_err() && user_setting_path.is_none() {
bail!("Could not find gdb path or it's not installed");
}
let gdb_path = user_setting_path.unwrap_or(gdb_path?);
Ok(DebugAdapterBinary {
adapter_name: Self::ADAPTER_NAME.into(),
command: gdb_path,
arguments: Some(vec!["-i=dap".into()]),
envs: None,
cwd: None,
connection: None,
request_args: self.request_args(config),
})
}
async fn install_binary(
&self,
_version: AdapterVersion,
_delegate: &dyn DapDelegate,
) -> Result<()> {
unimplemented!("GDB debug adapter cannot be installed by Zed (yet)")
}
async fn fetch_latest_adapter_version(&self, _: &dyn DapDelegate) -> Result<AdapterVersion> {
unimplemented!("Fetch latest GDB version not implemented (yet)")
}
async fn get_installed_binary(
&self,
_: &dyn DapDelegate,
_: &DebugTaskDefinition,
_: Option<std::path::PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
unimplemented!("GDB cannot be installed by Zed (yet)")
}
}


@@ -1,3 +1,4 @@
use dap::StartDebuggingRequestArguments;
use gpui::AsyncApp;
use std::{ffi::OsStr, path::PathBuf};
use task::DebugTaskDefinition;
@@ -9,6 +10,31 @@ pub(crate) struct GoDebugAdapter;
impl GoDebugAdapter {
const ADAPTER_NAME: &'static str = "Delve";
fn request_args(&self, config: &DebugTaskDefinition) -> StartDebuggingRequestArguments {
let mut args = match &config.request {
dap::DebugRequestType::Attach(attach_config) => {
json!({
"processId": attach_config.process_id,
})
}
dap::DebugRequestType::Launch(launch_config) => json!({
"program": launch_config.program,
"cwd": launch_config.cwd,
"args": launch_config.args
}),
};
let map = args.as_object_mut().unwrap();
if let Some(stop_on_entry) = config.stop_on_entry {
map.insert("stopOnEntry".into(), stop_on_entry.into());
}
StartDebuggingRequestArguments {
configuration: args,
request: config.request.to_dap(),
}
}
}
#[async_trait(?Send)]
@@ -20,7 +46,7 @@ impl DebugAdapter for GoDebugAdapter {
async fn get_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
config: &DebugTaskDefinition,
user_installed_path: Option<PathBuf>,
cx: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
@@ -53,7 +79,7 @@ impl DebugAdapter for GoDebugAdapter {
async fn get_installed_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
config: &DebugTaskDefinition,
_: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
@@ -66,6 +92,7 @@ impl DebugAdapter for GoDebugAdapter {
let (host, port, timeout) = crate::configure_tcp_connection(tcp_connection).await?;
Ok(DebugAdapterBinary {
adapter_name: self.name(),
command: delve_path,
arguments: Some(vec![
"dap".into(),
@@ -79,29 +106,7 @@ impl DebugAdapter for GoDebugAdapter {
port,
timeout,
}),
request_args: self.request_args(config),
})
}
fn request_args(&self, config: &DebugTaskDefinition) -> Value {
let mut args = match &config.request {
dap::DebugRequestType::Attach(attach_config) => {
json!({
"processId": attach_config.process_id,
})
}
dap::DebugRequestType::Launch(launch_config) => json!({
"program": launch_config.program,
"cwd": launch_config.cwd,
"args": launch_config.args
}),
};
let map = args.as_object_mut().unwrap();
if let Some(stop_on_entry) = config.stop_on_entry {
map.insert("stopOnEntry".into(), stop_on_entry.into());
}
args
}
}


@@ -1,28 +1,52 @@
use adapters::latest_github_release;
use dap::StartDebuggingRequestArguments;
use gpui::AsyncApp;
use regex::Regex;
use std::path::PathBuf;
use task::{DebugRequestType, DebugTaskDefinition};
use crate::*;
#[derive(Debug)]
pub(crate) struct JsDebugAdapter {
attach_processes: Regex,
}
pub(crate) struct JsDebugAdapter;
impl Default for JsDebugAdapter {
fn default() -> Self {
Self {
attach_processes: Regex::new(r"(?i)^(?:node|bun|iojs)(?:$|\b)")
.expect("Regex compilation to succeed"),
}
}
}
impl JsDebugAdapter {
const ADAPTER_NAME: &'static str = "JavaScript";
const ADAPTER_NPM_NAME: &'static str = "vscode-js-debug";
const ADAPTER_PATH: &'static str = "js-debug/src/dapDebugServer.js";
fn request_args(&self, config: &DebugTaskDefinition) -> StartDebuggingRequestArguments {
let mut args = json!({
"type": "pwa-node",
"request": match config.request {
DebugRequestType::Launch(_) => "launch",
DebugRequestType::Attach(_) => "attach",
},
});
let map = args.as_object_mut().unwrap();
match &config.request {
DebugRequestType::Attach(attach) => {
map.insert("processId".into(), attach.process_id.into());
}
DebugRequestType::Launch(launch) => {
map.insert("program".into(), launch.program.clone().into());
if !launch.args.is_empty() {
map.insert("args".into(), launch.args.clone().into());
}
if let Some(stop_on_entry) = config.stop_on_entry {
map.insert("stopOnEntry".into(), stop_on_entry.into());
}
if let Some(cwd) = launch.cwd.as_ref() {
map.insert("cwd".into(), cwd.to_string_lossy().into_owned().into());
}
}
}
StartDebuggingRequestArguments {
configuration: args,
request: config.request.to_dap(),
}
}
}
#[async_trait(?Send)]
@@ -60,7 +84,7 @@ impl DebugAdapter for JsDebugAdapter {
async fn get_installed_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
config: &DebugTaskDefinition,
user_installed_path: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
@@ -82,6 +106,7 @@ impl DebugAdapter for JsDebugAdapter {
let (host, port, timeout) = crate::configure_tcp_connection(tcp_connection).await?;
Ok(DebugAdapterBinary {
adapter_name: self.name(),
command: delegate
.node_runtime()
.binary_path()
@@ -100,6 +125,7 @@ impl DebugAdapter for JsDebugAdapter {
port,
timeout,
}),
request_args: self.request_args(config),
})
}
@@ -118,39 +144,4 @@ impl DebugAdapter for JsDebugAdapter {
return Ok(());
}
fn request_args(&self, config: &DebugTaskDefinition) -> Value {
let mut args = json!({
"type": "pwa-node",
"request": match config.request {
DebugRequestType::Launch(_) => "launch",
DebugRequestType::Attach(_) => "attach",
},
});
let map = args.as_object_mut().unwrap();
match &config.request {
DebugRequestType::Attach(attach) => {
map.insert("processId".into(), attach.process_id.into());
}
DebugRequestType::Launch(launch) => {
map.insert("program".into(), launch.program.clone().into());
if !launch.args.is_empty() {
map.insert("args".into(), launch.args.clone().into());
}
if let Some(stop_on_entry) = config.stop_on_entry {
map.insert("stopOnEntry".into(), stop_on_entry.into());
}
if let Some(cwd) = launch.cwd.as_ref() {
map.insert("cwd".into(), cwd.to_string_lossy().into_owned().into());
}
}
}
args
}
fn attach_processes_filter(&self) -> Regex {
self.attach_processes.clone()
}
}


@@ -1,99 +0,0 @@
use std::{ffi::OsStr, path::PathBuf};
use anyhow::Result;
use async_trait::async_trait;
use gpui::AsyncApp;
use task::{DebugAdapterConfig, DebugRequestType, DebugTaskDefinition};
use crate::*;
#[derive(Default)]
pub(crate) struct LldbDebugAdapter;
impl LldbDebugAdapter {
const ADAPTER_NAME: &'static str = "LLDB";
}
#[async_trait(?Send)]
impl DebugAdapter for LldbDebugAdapter {
fn name(&self) -> DebugAdapterName {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
async fn get_binary(
&self,
delegate: &dyn DapDelegate,
_: &DebugAdapterConfig,
user_installed_path: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
let lldb_dap_path = if let Some(user_installed_path) = user_installed_path {
user_installed_path.to_string_lossy().into()
} else {
delegate
.which(OsStr::new("lldb-dap"))
.and_then(|p| p.to_str().map(|s| s.to_string()))
.ok_or(anyhow!("Could not find lldb-dap in path"))?
};
Ok(DebugAdapterBinary {
command: lldb_dap_path,
arguments: None,
envs: None,
cwd: None,
connection: None,
})
}
async fn install_binary(
&self,
_version: AdapterVersion,
_delegate: &dyn DapDelegate,
) -> Result<()> {
unimplemented!("LLDB debug adapter cannot be installed by Zed (yet)")
}
async fn fetch_latest_adapter_version(&self, _: &dyn DapDelegate) -> Result<AdapterVersion> {
unimplemented!("Fetch latest adapter version not implemented for lldb (yet)")
}
async fn get_installed_binary(
&self,
_: &dyn DapDelegate,
_: &DebugAdapterConfig,
_: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
unimplemented!("LLDB debug adapter cannot be installed by Zed (yet)")
}
fn request_args(&self, config: &DebugTaskDefinition) -> Value {
let mut args = json!({
"request": match config.request {
DebugRequestType::Launch(_) => "launch",
DebugRequestType::Attach(_) => "attach",
},
});
let map = args.as_object_mut().unwrap();
match &config.request {
DebugRequestType::Attach(attach) => {
map.insert("pid".into(), attach.process_id.into());
}
DebugRequestType::Launch(launch) => {
map.insert("program".into(), launch.program.clone().into());
if !launch.args.is_empty() {
map.insert("args".into(), launch.args.clone().into());
}
if let Some(stop_on_entry) = config.stop_on_entry {
map.insert("stopOnEntry".into(), stop_on_entry.into());
}
if let Some(cwd) = launch.cwd.as_ref() {
map.insert("cwd".into(), cwd.to_string_lossy().into_owned().into());
}
}
}
args
}
}


@@ -13,6 +13,28 @@ impl PhpDebugAdapter {
const ADAPTER_NAME: &'static str = "PHP";
const ADAPTER_PACKAGE_NAME: &'static str = "vscode-php-debug";
const ADAPTER_PATH: &'static str = "extension/out/phpDebug.js";
fn request_args(
&self,
config: &DebugTaskDefinition,
) -> Result<dap::StartDebuggingRequestArguments> {
match &config.request {
dap::DebugRequestType::Attach(_) => {
anyhow::bail!("php adapter does not support attaching")
}
dap::DebugRequestType::Launch(launch_config) => {
Ok(dap::StartDebuggingRequestArguments {
configuration: json!({
"program": launch_config.program,
"cwd": launch_config.cwd,
"args": launch_config.args,
"stopOnEntry": config.stop_on_entry.unwrap_or_default(),
}),
request: config.request.to_dap(),
})
}
}
}
}
#[async_trait(?Send)]
@@ -50,7 +72,7 @@ impl DebugAdapter for PhpDebugAdapter {
async fn get_installed_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
config: &DebugTaskDefinition,
user_installed_path: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
@@ -72,6 +94,7 @@ impl DebugAdapter for PhpDebugAdapter {
let (host, port, timeout) = crate::configure_tcp_connection(tcp_connection).await?;
Ok(DebugAdapterBinary {
adapter_name: self.name(),
command: delegate
.node_runtime()
.binary_path()
@@ -89,6 +112,7 @@ impl DebugAdapter for PhpDebugAdapter {
}),
cwd: None,
envs: None,
request_args: self.request_args(config)?,
})
}
@@ -107,21 +131,4 @@ impl DebugAdapter for PhpDebugAdapter {
Ok(())
}
fn request_args(&self, config: &DebugTaskDefinition) -> Value {
match &config.request {
dap::DebugRequestType::Attach(_) => {
// php adapter does not support attaching
json!({})
}
dap::DebugRequestType::Launch(launch_config) => {
json!({
"program": launch_config.program,
"cwd": launch_config.cwd,
"args": launch_config.args,
"stopOnEntry": config.stop_on_entry.unwrap_or_default(),
})
}
}
}
}


@@ -1,5 +1,5 @@
use crate::*;
use dap::DebugRequestType;
use dap::{DebugRequestType, StartDebuggingRequestArguments};
use gpui::AsyncApp;
use std::{ffi::OsStr, path::PathBuf};
use task::DebugTaskDefinition;
@@ -12,6 +12,38 @@ impl PythonDebugAdapter {
const ADAPTER_PACKAGE_NAME: &'static str = "debugpy";
const ADAPTER_PATH: &'static str = "src/debugpy/adapter";
const LANGUAGE_NAME: &'static str = "Python";
fn request_args(&self, config: &DebugTaskDefinition) -> StartDebuggingRequestArguments {
let mut args = json!({
"request": match config.request {
DebugRequestType::Launch(_) => "launch",
DebugRequestType::Attach(_) => "attach",
},
"subProcess": true,
"redirectOutput": true,
});
let map = args.as_object_mut().unwrap();
match &config.request {
DebugRequestType::Attach(attach) => {
map.insert("processId".into(), attach.process_id.into());
}
DebugRequestType::Launch(launch) => {
map.insert("program".into(), launch.program.clone().into());
map.insert("args".into(), launch.args.clone().into());
if let Some(stop_on_entry) = config.stop_on_entry {
map.insert("stopOnEntry".into(), stop_on_entry.into());
}
if let Some(cwd) = launch.cwd.as_ref() {
map.insert("cwd".into(), cwd.to_string_lossy().into_owned().into());
}
}
}
StartDebuggingRequestArguments {
configuration: args,
request: config.request.to_dap(),
}
}
}
#[async_trait(?Send)]
@@ -64,7 +96,7 @@ impl DebugAdapter for PythonDebugAdapter {
async fn get_installed_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
config: &DebugTaskDefinition,
user_installed_path: Option<PathBuf>,
cx: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
@@ -109,6 +141,7 @@ impl DebugAdapter for PythonDebugAdapter {
};
Ok(DebugAdapterBinary {
adapter_name: self.name(),
command: python_path.ok_or(anyhow!("failed to find binary path for python"))?,
arguments: Some(vec![
debugpy_dir.join(Self::ADAPTER_PATH).into(),
@@ -122,35 +155,7 @@ impl DebugAdapter for PythonDebugAdapter {
}),
cwd: None,
envs: None,
request_args: self.request_args(config),
})
}
fn request_args(&self, config: &DebugTaskDefinition) -> Value {
let mut args = json!({
"request": match config.request {
DebugRequestType::Launch(_) => "launch",
DebugRequestType::Attach(_) => "attach",
},
"subProcess": true,
"redirectOutput": true,
});
let map = args.as_object_mut().unwrap();
match &config.request {
DebugRequestType::Attach(attach) => {
map.insert("processId".into(), attach.process_id.into());
}
DebugRequestType::Launch(launch) => {
map.insert("program".into(), launch.program.clone().into());
map.insert("args".into(), launch.args.clone().into());
if let Some(stop_on_entry) = config.stop_on_entry {
map.insert("stopOnEntry".into(), stop_on_entry.into());
}
if let Some(cwd) = launch.cwd.as_ref() {
map.insert("cwd".into(), cwd.to_string_lossy().into_owned().into());
}
}
}
args
}
}


@@ -28,6 +28,7 @@ client.workspace = true
collections.workspace = true
command_palette_hooks.workspace = true
dap.workspace = true
db.workspace = true
editor.workspace = true
feature_flags.workspace = true
futures.workspace = true


@@ -4,7 +4,6 @@ use gpui::Subscription;
use gpui::{DismissEvent, Entity, EventEmitter, Focusable, Render};
use picker::{Picker, PickerDelegate};
use std::cell::LazyCell;
use std::sync::Arc;
use sysinfo::System;
use ui::{Context, Tooltip, prelude::*};
@@ -24,7 +23,7 @@ pub(crate) struct AttachModalDelegate {
matches: Vec<StringMatch>,
placeholder_text: Arc<str>,
project: Entity<project::Project>,
debug_config: task::DebugTaskDefinition,
pub(crate) debug_config: task::DebugTaskDefinition,
candidates: Arc<[Candidate]>,
}
@@ -58,7 +57,7 @@ impl AttachModal {
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let mut processes: Vec<_> = System::new_all()
let mut processes: Box<[_]> = System::new_all()
.processes()
.values()
.map(|process| {
@@ -75,30 +74,18 @@ impl AttachModal {
})
.collect();
processes.sort_by_key(|k| k.name.clone());
let processes = processes.into_iter().collect();
Self::with_processes(project, debug_config, processes, modal, window, cx)
}
pub(super) fn with_processes(
project: Entity<project::Project>,
debug_config: task::DebugTaskDefinition,
processes: Vec<Candidate>,
processes: Arc<[Candidate]>,
modal: bool,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let adapter = project
.read(cx)
.debug_adapters()
.adapter(&debug_config.adapter);
let filter = LazyCell::new(|| adapter.map(|adapter| adapter.attach_processes_filter()));
let processes = processes
.into_iter()
.filter(|process| {
filter
.as_ref()
.map_or(false, |filter| filter.is_match(&process.name))
})
.collect();
let picker = cx.new(|cx| {
Picker::uniform_list(
AttachModalDelegate::new(project, debug_config, processes),
@@ -117,9 +104,10 @@ impl AttachModal {
}
impl Render for AttachModal {
fn render(&mut self, _window: &mut Window, _: &mut Context<Self>) -> impl ui::IntoElement {
fn render(&mut self, _window: &mut Window, cx: &mut Context<Self>) -> impl ui::IntoElement {
v_flex()
.key_context("AttachModal")
.track_focus(&self.focus_handle(cx))
.w(rems(34.))
.child(self.picker.clone())
}
@@ -240,10 +228,7 @@ impl PickerDelegate for AttachModalDelegate {
let config = self.debug_config.clone();
self.project
.update(cx, |project, cx| {
#[cfg(any(test, feature = "test-support"))]
let ret = project.fake_debug_session(config.request, None, false, cx);
#[cfg(not(any(test, feature = "test-support")))]
let ret = project.start_debug_session(config.into(), cx);
let ret = project.start_debug_session(config, cx);
ret
})
.detach_and_log_err(cx);


@@ -1,6 +1,6 @@
use crate::{
ClearAllBreakpoints, Continue, CreateDebuggingSession, Disconnect, Pause, Restart, StepBack,
StepInto, StepOut, StepOver, Stop, ToggleIgnoreBreakpoints,
StepInto, StepOut, StepOver, Stop, ToggleIgnoreBreakpoints, persistence,
};
use crate::{new_session_modal::NewSessionModal, session::DebugSession};
use anyhow::{Result, anyhow};
@@ -293,35 +293,49 @@ impl DebugPanel {
);
};
let Some(project) = self.project.upgrade() else {
return log::error!("Debug Panel outlived its weak reference to Project");
};
let adapter_name = session.read(cx).adapter_name();
if self
.sessions
.iter()
.any(|item| item.read(cx).session_id(cx) == *session_id)
{
// We already have an item for this session.
return;
}
let session_item = DebugSession::running(
project,
self.workspace.clone(),
session,
cx.weak_entity(),
window,
cx,
);
let session_id = *session_id;
cx.spawn_in(window, async move |this, cx| {
let serialized_layout =
persistence::get_serialized_pane_layout(adapter_name).await;
if let Some(running) = session_item.read(cx).mode().as_running().cloned() {
// We might want to make this an event subscription and only notify when a new thread is selected
// This is used to filter the command menu correctly
cx.observe(&running, |_, _, cx| cx.notify()).detach();
}
this.update_in(cx, |this, window, cx| {
let Some(project) = this.project.upgrade() else {
return log::error!(
"Debug Panel out lived it's weak reference to Project"
);
};
self.sessions.push(session_item.clone());
self.activate_session(session_item, window, cx);
if this
.sessions
.iter()
.any(|item| item.read(cx).session_id(cx) == session_id)
{
// We already have an item for this session.
return;
}
let session_item = DebugSession::running(
project,
this.workspace.clone(),
session,
cx.weak_entity(),
serialized_layout,
window,
cx,
);
if let Some(running) = session_item.read(cx).mode().as_running().cloned() {
// We might want to make this an event subscription and only notify when a new thread is selected
// This is used to filter the command menu correctly
cx.observe(&running, |_, _, cx| cx.notify()).detach();
}
this.sessions.push(session_item.clone());
this.activate_session(session_item, window, cx);
})
})
.detach();
}
dap_store::DapStoreEvent::RunInTerminal {
title,

View File

@@ -13,6 +13,7 @@ use workspace::{ShutdownDebugAdapters, Workspace};
pub mod attach_modal;
pub mod debugger_panel;
mod new_session_modal;
mod persistence;
pub(crate) mod session;
#[cfg(test)]

View File

@@ -11,6 +11,7 @@ use gpui::{
App, AppContext, DismissEvent, Entity, EventEmitter, FocusHandle, Focusable, Render, TextStyle,
WeakEntity,
};
use project::Project;
use settings::Settings;
use task::{DebugTaskDefinition, LaunchConfig};
use theme::ThemeSettings;
@@ -59,7 +60,7 @@ impl NewSessionModal {
debug_panel: WeakEntity<DebugPanel>,
workspace: WeakEntity<Workspace>,
window: &mut Window,
cx: &mut App,
cx: &mut Context<Self>,
) -> Self {
let debugger = past_debug_definition
.as_ref()
@@ -145,7 +146,7 @@ impl NewSessionModal {
{
this.start_debug_session(debug_config, cx)
} else {
this.start_debug_session(config.into(), cx)
this.start_debug_session(config, cx)
}
})?;
let spawn_result = task.await;
@@ -171,25 +172,13 @@ impl NewSessionModal {
attach.update(cx, |this, cx| {
if selected_debugger != this.debug_definition.adapter {
this.debug_definition.adapter = selected_debugger.into();
if let Some(project) = this
.workspace
.read_with(cx, |workspace, _| workspace.project().clone())
.ok()
{
this.attach_picker = Some(cx.new(|cx| {
let modal = AttachModal::new(
project,
this.debug_definition.clone(),
false,
window,
cx,
);
window.focus(&modal.focus_handle(cx));
modal
}));
}
this.attach_picker.update(cx, |this, cx| {
this.picker.update(cx, |this, cx| {
this.delegate.debug_config.adapter = selected_debugger.into();
this.focus(window, cx);
})
});
}
cx.notify();
@@ -256,7 +245,6 @@ impl NewSessionModal {
ContextMenu::build(window, cx, move |mut menu, _, cx| {
let setter_for_name = |task: DebugTaskDefinition| {
let weak = weak.clone();
let workspace = workspace.clone();
move |window: &mut Window, cx: &mut App| {
weak.update(cx, |this, cx| {
this.last_selected_profile_name = Some(SharedString::from(&task.label));
@@ -271,12 +259,19 @@ impl NewSessionModal {
);
}
DebugRequestType::Attach(_) => {
let Ok(project) = this
.workspace
.read_with(cx, |this, _| this.project().clone())
else {
return;
};
this.mode = NewSessionMode::attach(
this.debugger.clone(),
workspace.clone(),
project,
window,
cx,
);
this.mode.focus_handle(cx).focus(window);
if let Some((debugger, attach)) =
this.debugger.as_ref().zip(this.mode.as_attach())
{
@@ -365,18 +360,16 @@ impl LaunchMode {
#[derive(Clone)]
struct AttachMode {
workspace: WeakEntity<Workspace>,
debug_definition: DebugTaskDefinition,
attach_picker: Option<Entity<AttachModal>>,
focus_handle: FocusHandle,
attach_picker: Entity<AttachModal>,
}
impl AttachMode {
fn new(
debugger: Option<SharedString>,
workspace: WeakEntity<Workspace>,
project: Entity<Project>,
window: &mut Window,
cx: &mut App,
cx: &mut Context<NewSessionModal>,
) -> Entity<Self> {
let debug_definition = DebugTaskDefinition {
label: "Attach New Session Setup".into(),
@@ -387,27 +380,15 @@ impl AttachMode {
initialize_args: None,
stop_on_entry: Some(false),
};
let attach_picker = cx.new(|cx| {
let modal = AttachModal::new(project, debug_definition.clone(), false, window, cx);
window.focus(&modal.focus_handle(cx));
let attach_picker = if let Some(project) = debugger.and(
workspace
.read_with(cx, |workspace, _| workspace.project().clone())
.ok(),
) {
Some(cx.new(|cx| {
let modal = AttachModal::new(project, debug_definition.clone(), false, window, cx);
window.focus(&modal.focus_handle(cx));
modal
}))
} else {
None
};
cx.new(|cx| Self {
workspace,
modal
});
cx.new(|_| Self {
debug_definition,
attach_picker,
focus_handle: cx.focus_handle(),
})
}
fn debug_task(&self) -> task::AttachConfig {
@@ -444,7 +425,7 @@ impl Focusable for NewSessionMode {
fn focus_handle(&self, cx: &App) -> FocusHandle {
match &self {
NewSessionMode::Launch(entity) => entity.read(cx).program.focus_handle(cx),
NewSessionMode::Attach(entity) => entity.read(cx).focus_handle.clone(),
NewSessionMode::Attach(entity) => entity.read(cx).attach_picker.focus_handle(cx),
}
}
}
@@ -476,8 +457,11 @@ impl RenderOnce for LaunchMode {
}
impl RenderOnce for AttachMode {
fn render(self, _: &mut Window, _: &mut App) -> impl IntoElement {
v_flex().w_full().children(self.attach_picker.clone())
fn render(self, _: &mut Window, cx: &mut App) -> impl IntoElement {
v_flex()
.w_full()
.track_focus(&self.attach_picker.focus_handle(cx))
.child(self.attach_picker.clone())
}
}
@@ -497,13 +481,17 @@ impl RenderOnce for NewSessionMode {
impl NewSessionMode {
fn attach(
debugger: Option<SharedString>,
workspace: WeakEntity<Workspace>,
project: Entity<Project>,
window: &mut Window,
cx: &mut App,
cx: &mut Context<NewSessionModal>,
) -> Self {
Self::Attach(AttachMode::new(debugger, workspace, window, cx))
Self::Attach(AttachMode::new(debugger, project, window, cx))
}
fn launch(past_launch_config: Option<LaunchConfig>, window: &mut Window, cx: &mut App) -> Self {
fn launch(
past_launch_config: Option<LaunchConfig>,
window: &mut Window,
cx: &mut Context<NewSessionModal>,
) -> Self {
Self::Launch(LaunchMode::new(past_launch_config, window, cx))
}
}
@@ -592,18 +580,25 @@ impl Render for NewSessionModal {
.toggle_state(matches!(self.mode, NewSessionMode::Attach(_)))
.style(ui::ButtonStyle::Subtle)
.on_click(cx.listener(|this, _, window, cx| {
let Ok(project) = this
.workspace
.read_with(cx, |this, _| this.project().clone())
else {
return;
};
this.mode = NewSessionMode::attach(
this.debugger.clone(),
this.workspace.clone(),
project,
window,
cx,
);
this.mode.focus_handle(cx).focus(window);
if let Some((debugger, attach)) =
this.debugger.as_ref().zip(this.mode.as_attach())
{
Self::update_attach_picker(&attach, &debugger, window, cx);
}
this.mode.focus_handle(cx).focus(window);
cx.notify();
}))
.last(),

View File

@@ -0,0 +1,257 @@
use collections::HashMap;
use db::kvp::KEY_VALUE_STORE;
use gpui::{Axis, Context, Entity, EntityId, Focusable, Subscription, WeakEntity, Window};
use project::Project;
use serde::{Deserialize, Serialize};
use ui::{App, SharedString};
use util::ResultExt;
use workspace::{Member, Pane, PaneAxis, Workspace};
use crate::session::running::{
self, RunningState, SubView, breakpoint_list::BreakpointList, console::Console,
module_list::ModuleList, stack_frame_list::StackFrameList, variable_list::VariableList,
};
#[derive(Clone, Copy, Debug, Serialize, Deserialize, PartialEq, Eq)]
pub(crate) enum DebuggerPaneItem {
Console,
Variables,
BreakpointList,
Frames,
Modules,
}
impl DebuggerPaneItem {
pub(crate) fn to_shared_string(self) -> SharedString {
match self {
DebuggerPaneItem::Console => SharedString::new_static("Console"),
DebuggerPaneItem::Variables => SharedString::new_static("Variables"),
DebuggerPaneItem::BreakpointList => SharedString::new_static("Breakpoints"),
DebuggerPaneItem::Frames => SharedString::new_static("Frames"),
DebuggerPaneItem::Modules => SharedString::new_static("Modules"),
}
}
}
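The `to_shared_string` mapping above is what ties persisted tab names back to pane kinds, so each string must map back to exactly one variant. A minimal stand-alone sketch of that round trip (using a stand-in `PaneItem` enum and `FromStr`, since `DebuggerPaneItem` and `SharedString` live in the Zed crates):

```rust
use std::str::FromStr;

// Stand-in for DebuggerPaneItem: pane kinds are persisted by their tab
// names, so the string form must round-trip back to the enum.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum PaneItem {
    Console,
    Variables,
    BreakpointList,
    Frames,
    Modules,
}

impl PaneItem {
    // Mirrors to_shared_string: note BreakpointList renders as "Breakpoints".
    fn as_str(self) -> &'static str {
        match self {
            PaneItem::Console => "Console",
            PaneItem::Variables => "Variables",
            PaneItem::BreakpointList => "Breakpoints",
            PaneItem::Frames => "Frames",
            PaneItem::Modules => "Modules",
        }
    }
}

impl FromStr for PaneItem {
    type Err = String;

    // Inverse mapping used when a serialized tab name is read back.
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s {
            "Console" => Ok(PaneItem::Console),
            "Variables" => Ok(PaneItem::Variables),
            "Breakpoints" => Ok(PaneItem::BreakpointList),
            "Frames" => Ok(PaneItem::Frames),
            "Modules" => Ok(PaneItem::Modules),
            other => Err(format!("unknown pane item: {other}")),
        }
    }
}
```

Keeping the two match arms in sync is the invariant the `tab_content_text` doc comment later in this diff alludes to.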
#[derive(Debug, Serialize, Deserialize)]
pub(crate) struct SerializedAxis(pub Axis);
#[derive(Debug, Serialize, Deserialize)]
pub(crate) enum SerializedPaneLayout {
Pane(SerializedPane),
Group {
axis: SerializedAxis,
flexes: Option<Vec<f32>>,
children: Vec<SerializedPaneLayout>,
},
}
#[derive(Debug, Serialize, Deserialize)]
pub(crate) struct SerializedPane {
pub children: Vec<DebuggerPaneItem>,
pub active_item: Option<DebuggerPaneItem>,
}
const DEBUGGER_PANEL_PREFIX: &str = "debugger_panel_";
pub(crate) async fn serialize_pane_layout(
adapter_name: SharedString,
pane_group: SerializedPaneLayout,
) -> anyhow::Result<()> {
if let Ok(serialized_pane_group) = serde_json::to_string(&pane_group) {
KEY_VALUE_STORE
.write_kvp(
format!("{DEBUGGER_PANEL_PREFIX}-{adapter_name}"),
serialized_pane_group,
)
.await
} else {
Err(anyhow::anyhow!(
"Failed to serialize pane group to a JSON string"
))
}
}
pub(crate) fn build_serialized_pane_layout(
pane_group: &Member,
cx: &mut App,
) -> SerializedPaneLayout {
match pane_group {
Member::Axis(PaneAxis {
axis,
members,
flexes,
bounding_boxes: _,
}) => SerializedPaneLayout::Group {
axis: SerializedAxis(*axis),
children: members
.iter()
.map(|member| build_serialized_pane_layout(member, cx))
.collect::<Vec<_>>(),
flexes: Some(flexes.lock().clone()),
},
Member::Pane(pane_handle) => SerializedPaneLayout::Pane(serialize_pane(pane_handle, cx)),
}
}
fn serialize_pane(pane: &Entity<Pane>, cx: &mut App) -> SerializedPane {
let pane = pane.read(cx);
let children = pane
.items()
.filter_map(|item| {
item.act_as::<SubView>(cx)
.map(|view| view.read(cx).view_kind())
})
.collect::<Vec<_>>();
let active_item = pane
.active_item()
.and_then(|item| item.act_as::<SubView>(cx))
.map(|view| view.read(cx).view_kind());
SerializedPane {
children,
active_item,
}
}
pub(crate) async fn get_serialized_pane_layout(
adapter_name: impl AsRef<str>,
) -> Option<SerializedPaneLayout> {
let key = format!("{DEBUGGER_PANEL_PREFIX}-{}", adapter_name.as_ref());
KEY_VALUE_STORE
.read_kvp(&key)
.log_err()
.flatten()
.and_then(|value| serde_json::from_str::<SerializedPaneLayout>(&value).ok())
}
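The functions above store one layout per debug adapter in the key-value store, keyed by a prefixed adapter name. A small in-memory sketch of that keying scheme (a `HashMap` standing in for `KEY_VALUE_STORE`, with the same key format as the diff):

```rust
use std::collections::HashMap;

const DEBUGGER_PANEL_PREFIX: &str = "debugger_panel_";

// In-memory stand-in for KEY_VALUE_STORE: one serialized layout per
// adapter, stored under "debugger_panel_-<adapter_name>".
#[derive(Default)]
struct KvStore(HashMap<String, String>);

impl KvStore {
    fn write_layout(&mut self, adapter: &str, serialized: String) {
        self.0
            .insert(format!("{DEBUGGER_PANEL_PREFIX}-{adapter}"), serialized);
    }

    fn read_layout(&self, adapter: &str) -> Option<&String> {
        self.0.get(&format!("{DEBUGGER_PANEL_PREFIX}-{adapter}"))
    }
}
```

Because write and read build the key with the same format string, a layout saved for one adapter never shadows another's.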
pub(crate) fn deserialize_pane_layout(
serialized: SerializedPaneLayout,
workspace: &WeakEntity<Workspace>,
project: &Entity<Project>,
stack_frame_list: &Entity<StackFrameList>,
variable_list: &Entity<VariableList>,
module_list: &Entity<ModuleList>,
console: &Entity<Console>,
breakpoint_list: &Entity<BreakpointList>,
subscriptions: &mut HashMap<EntityId, Subscription>,
window: &mut Window,
cx: &mut Context<RunningState>,
) -> Option<Member> {
match serialized {
SerializedPaneLayout::Group {
axis,
flexes,
children,
} => {
let mut members = Vec::new();
for child in children {
if let Some(new_member) = deserialize_pane_layout(
child,
workspace,
project,
stack_frame_list,
variable_list,
module_list,
console,
breakpoint_list,
subscriptions,
window,
cx,
) {
members.push(new_member);
}
}
if members.is_empty() {
return None;
}
if members.len() == 1 {
return Some(members.remove(0));
}
Some(Member::Axis(PaneAxis::load(
axis.0,
members,
flexes.clone(),
)))
}
SerializedPaneLayout::Pane(serialized_pane) => {
let pane = running::new_debugger_pane(workspace.clone(), project.clone(), window, cx);
subscriptions.insert(
pane.entity_id(),
cx.subscribe_in(&pane, window, RunningState::handle_pane_event),
);
let sub_views: Vec<_> = serialized_pane
.children
.iter()
.map(|child| match child {
DebuggerPaneItem::Frames => Box::new(SubView::new(
pane.focus_handle(cx),
stack_frame_list.clone().into(),
DebuggerPaneItem::Frames,
None,
cx,
)),
DebuggerPaneItem::Variables => Box::new(SubView::new(
variable_list.focus_handle(cx),
variable_list.clone().into(),
DebuggerPaneItem::Variables,
None,
cx,
)),
DebuggerPaneItem::BreakpointList => Box::new(SubView::new(
breakpoint_list.focus_handle(cx),
breakpoint_list.clone().into(),
DebuggerPaneItem::BreakpointList,
None,
cx,
)),
DebuggerPaneItem::Modules => Box::new(SubView::new(
pane.focus_handle(cx),
module_list.clone().into(),
DebuggerPaneItem::Modules,
None,
cx,
)),
DebuggerPaneItem::Console => Box::new(SubView::new(
pane.focus_handle(cx),
console.clone().into(),
DebuggerPaneItem::Console,
Some(Box::new({
let console = console.clone().downgrade();
move |cx| {
console
.read_with(cx, |console, cx| console.show_indicator(cx))
.unwrap_or_default()
}
})),
cx,
)),
})
.collect();
pane.update(cx, |pane, cx| {
let mut active_idx = 0;
for (idx, sub_view) in sub_views.into_iter().enumerate() {
if serialized_pane
.active_item
.is_some_and(|active| active == sub_view.read(cx).view_kind())
{
active_idx = idx;
}
pane.add_item(sub_view, false, false, None, window, cx);
}
pane.activate_item(active_idx, false, false, window, cx);
});
Some(Member::Pane(pane.clone()))
}
}
}
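`deserialize_pane_layout` prunes as it rebuilds: a group whose children all failed to restore yields `None`, and a group left with a single child collapses into that child. A simplified mirror of just that group-handling logic, with a stand-in `Layout` tree in place of `Member`/`PaneAxis`:

```rust
// Stand-in for the serialized pane tree; pane_ok simulates whether a
// serialized pane can still be restored.
#[derive(Debug, PartialEq)]
enum Layout {
    Pane(&'static str),
    Group(Vec<Layout>),
}

fn restore(layout: Layout, pane_ok: &dyn Fn(&str) -> bool) -> Option<Layout> {
    match layout {
        Layout::Pane(name) => pane_ok(name).then_some(Layout::Pane(name)),
        Layout::Group(children) => {
            // Recurse first, dropping children that fail to restore.
            let mut members: Vec<Layout> = children
                .into_iter()
                .filter_map(|child| restore(child, pane_ok))
                .collect();
            match members.len() {
                0 => None,                          // empty group: drop it
                1 => Some(members.remove(0)),       // single child: collapse
                _ => Some(Layout::Group(members)),  // keep the axis
            }
        }
    }
}
```

This keeps a stale serialized layout from producing degenerate empty or single-child axes in the restored `PaneGroup`.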

View File

@@ -16,6 +16,7 @@ use workspace::{
};
use crate::debugger_panel::DebugPanel;
use crate::persistence::SerializedPaneLayout;
pub(crate) enum DebugSessionState {
Running(Entity<running::RunningState>),
@@ -52,6 +53,7 @@ impl DebugSession {
workspace: WeakEntity<Workspace>,
session: Entity<Session>,
_debug_panel: WeakEntity<DebugPanel>,
serialized_pane_layout: Option<SerializedPaneLayout>,
window: &mut Window,
cx: &mut App,
) -> Entity<Self> {
@@ -60,6 +62,7 @@ impl DebugSession {
session.clone(),
project.clone(),
workspace.clone(),
serialized_pane_layout,
window,
cx,
)

View File

@@ -1,11 +1,13 @@
mod breakpoint_list;
mod console;
mod loaded_source_list;
mod module_list;
pub(crate) mod breakpoint_list;
pub(crate) mod console;
pub(crate) mod loaded_source_list;
pub(crate) mod module_list;
pub mod stack_frame_list;
pub mod variable_list;
use std::{any::Any, ops::ControlFlow, sync::Arc};
use std::{any::Any, ops::ControlFlow, sync::Arc, time::Duration};
use crate::persistence::{self, DebuggerPaneItem, SerializedPaneLayout};
use super::DebugPanelItemEvent;
use breakpoint_list::BreakpointList;
@@ -14,7 +16,7 @@ use console::Console;
use dap::{Capabilities, Thread, client::SessionId, debugger_settings::DebuggerSettings};
use gpui::{
Action as _, AnyView, AppContext, Entity, EntityId, EventEmitter, FocusHandle, Focusable,
NoAction, Subscription, WeakEntity,
NoAction, Subscription, Task, WeakEntity,
};
use loaded_source_list::LoadedSourceList;
use module_list::ModuleList;
@@ -33,8 +35,8 @@ use ui::{
use util::ResultExt;
use variable_list::VariableList;
use workspace::{
ActivePaneDecorator, DraggedTab, Item, Pane, PaneGroup, Workspace, item::TabContentParams,
move_item, pane::Event,
ActivePaneDecorator, DraggedTab, Item, Member, Pane, PaneGroup, Workspace,
item::TabContentParams, move_item, pane::Event,
};
pub struct RunningState {
@@ -51,6 +53,7 @@ pub struct RunningState {
_console: Entity<Console>,
panes: PaneGroup,
pane_close_subscriptions: HashMap<EntityId, Subscription>,
_schedule_serialize: Option<Task<()>>,
}
impl Render for RunningState {
@@ -84,28 +87,32 @@ impl Render for RunningState {
}
}
struct SubView {
pub(crate) struct SubView {
inner: AnyView,
pane_focus_handle: FocusHandle,
tab_name: SharedString,
kind: DebuggerPaneItem,
show_indicator: Box<dyn Fn(&App) -> bool>,
}
impl SubView {
fn new(
pub(crate) fn new(
pane_focus_handle: FocusHandle,
view: AnyView,
tab_name: SharedString,
kind: DebuggerPaneItem,
show_indicator: Option<Box<dyn Fn(&App) -> bool>>,
cx: &mut App,
) -> Entity<Self> {
cx.new(|_| Self {
tab_name,
kind,
inner: view,
pane_focus_handle,
show_indicator: show_indicator.unwrap_or(Box::new(|_| false)),
})
}
pub(crate) fn view_kind(&self) -> DebuggerPaneItem {
self.kind
}
}
impl Focusable for SubView {
fn focus_handle(&self, _: &App) -> FocusHandle {
@@ -116,13 +123,19 @@ impl EventEmitter<()> for SubView {}
impl Item for SubView {
type Event = ();
/// This is used to serialize debugger pane layouts.
/// A SharedString gets converted to an enum and back during serialization/deserialization.
fn tab_content_text(&self, _window: &Window, _cx: &App) -> Option<SharedString> {
Some(self.kind.to_shared_string())
}
fn tab_content(
&self,
params: workspace::item::TabContentParams,
_: &Window,
cx: &App,
) -> AnyElement {
let label = Label::new(self.tab_name.clone())
let label = Label::new(self.kind.to_shared_string())
.size(ui::LabelSize::Small)
.color(params.text_color())
.line_height_style(ui::LineHeightStyle::UiLabel);
@@ -146,7 +159,7 @@ impl Render for SubView {
}
}
fn new_debugger_pane(
pub(crate) fn new_debugger_pane(
workspace: WeakEntity<Workspace>,
project: Entity<Project>,
window: &mut Window,
@@ -185,7 +198,7 @@ fn new_debugger_pane(
new_debugger_pane(workspace.clone(), project.clone(), window, cx);
let _previous_subscription = running.pane_close_subscriptions.insert(
new_pane.entity_id(),
cx.subscribe(&new_pane, RunningState::handle_pane_event),
cx.subscribe_in(&new_pane, window, RunningState::handle_pane_event),
);
debug_assert!(_previous_subscription.is_none());
running
@@ -354,6 +367,7 @@ impl RunningState {
session: Entity<Session>,
project: Entity<Project>,
workspace: WeakEntity<Workspace>,
serialized_pane_layout: Option<SerializedPaneLayout>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
@@ -382,6 +396,8 @@ impl RunningState {
)
});
let breakpoints = BreakpointList::new(session.clone(), workspace.clone(), &project, cx);
let _subscriptions = vec![
cx.observe(&module_list, |_, _, cx| cx.notify()),
cx.subscribe_in(&session, window, |this, _, event, window, cx| {
@@ -407,112 +423,40 @@ impl RunningState {
}),
];
let leftmost_pane = new_debugger_pane(workspace.clone(), project.clone(), window, cx);
leftmost_pane.update(cx, |this, cx| {
this.add_item(
Box::new(SubView::new(
this.focus_handle(cx),
stack_frame_list.clone().into(),
SharedString::new_static("Frames"),
None,
cx,
)),
true,
false,
None,
let mut pane_close_subscriptions = HashMap::default();
let panes = if let Some(root) = serialized_pane_layout.and_then(|serialized_layout| {
persistence::deserialize_pane_layout(
serialized_layout,
&workspace,
&project,
&stack_frame_list,
&variable_list,
&module_list,
&console,
&breakpoints,
&mut pane_close_subscriptions,
window,
cx,
)
}) {
workspace::PaneGroup::with_root(root)
} else {
pane_close_subscriptions.clear();
let root = Self::default_pane_layout(
project,
&workspace,
&stack_frame_list,
&variable_list,
&module_list,
&console,
breakpoints,
&mut pane_close_subscriptions,
window,
cx,
);
let breakpoints = BreakpointList::new(session.clone(), workspace.clone(), &project, cx);
this.add_item(
Box::new(SubView::new(
breakpoints.focus_handle(cx),
breakpoints.into(),
SharedString::new_static("Breakpoints"),
None,
cx,
)),
true,
false,
None,
window,
cx,
);
this.activate_item(0, false, false, window, cx);
});
let center_pane = new_debugger_pane(workspace.clone(), project.clone(), window, cx);
center_pane.update(cx, |this, cx| {
this.add_item(
Box::new(SubView::new(
variable_list.focus_handle(cx),
variable_list.clone().into(),
SharedString::new_static("Variables"),
None,
cx,
)),
true,
false,
None,
window,
cx,
);
this.add_item(
Box::new(SubView::new(
this.focus_handle(cx),
module_list.clone().into(),
SharedString::new_static("Modules"),
None,
cx,
)),
false,
false,
None,
window,
cx,
);
this.activate_item(0, false, false, window, cx);
});
let rightmost_pane = new_debugger_pane(workspace.clone(), project.clone(), window, cx);
rightmost_pane.update(cx, |this, cx| {
let weak_console = console.downgrade();
this.add_item(
Box::new(SubView::new(
this.focus_handle(cx),
console.clone().into(),
SharedString::new_static("Console"),
Some(Box::new(move |cx| {
weak_console
.read_with(cx, |console, cx| console.show_indicator(cx))
.unwrap_or_default()
})),
cx,
)),
true,
false,
None,
window,
cx,
);
});
let pane_close_subscriptions = HashMap::from_iter(
[&leftmost_pane, &center_pane, &rightmost_pane]
.into_iter()
.map(|entity| {
(
entity.entity_id(),
cx.subscribe(entity, Self::handle_pane_event),
)
}),
);
let group_root = workspace::PaneAxis::new(
gpui::Axis::Horizontal,
[leftmost_pane, center_pane, rightmost_pane]
.into_iter()
.map(workspace::Member::Pane)
.collect(),
);
let panes = PaneGroup::with_root(workspace::Member::Axis(group_root));
workspace::PaneGroup::with_root(root)
};
Self {
session,
@@ -528,21 +472,57 @@ impl RunningState {
_module_list: module_list,
_console: console,
pane_close_subscriptions,
_schedule_serialize: None,
}
}
fn handle_pane_event(
fn serialize_layout(&mut self, window: &mut Window, cx: &mut Context<Self>) {
if self._schedule_serialize.is_none() {
self._schedule_serialize = Some(cx.spawn_in(window, async move |this, cx| {
cx.background_executor()
.timer(Duration::from_millis(100))
.await;
let Some((adapter_name, pane_group)) = this
.update(cx, |this, cx| {
let adapter_name = this.session.read(cx).adapter_name();
(
adapter_name,
persistence::build_serialized_pane_layout(&this.panes.root, cx),
)
})
.ok()
else {
return;
};
persistence::serialize_pane_layout(adapter_name, pane_group)
.await
.log_err();
this.update(cx, |this, _| {
this._schedule_serialize.take();
})
.ok();
}));
}
}
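`serialize_layout` debounces: the first pane event in a burst schedules a save after a 100ms timer, and further events are ignored while that task is pending. A thread-based sketch of the same coalescing pattern (gpui's executor replaced by `std::thread`, purely illustrative):

```rust
use std::sync::{Arc, Mutex};
use std::thread;
use std::time::Duration;

// Coalesce bursts of layout-change events into one save per 100ms window,
// mirroring the Option<Task<()>> guard in RunningState::serialize_layout.
struct Debouncer {
    pending: Arc<Mutex<bool>>,
}

impl Debouncer {
    fn new() -> Self {
        Self { pending: Arc::new(Mutex::new(false)) }
    }

    /// Returns true if this call scheduled a save; false if one was
    /// already pending (the event is coalesced into it).
    fn schedule<F: FnOnce() + Send + 'static>(&self, save: F) -> bool {
        let mut pending = self.pending.lock().unwrap();
        if *pending {
            return false;
        }
        *pending = true;
        let flag = Arc::clone(&self.pending);
        thread::spawn(move || {
            thread::sleep(Duration::from_millis(100));
            save();
            // Clearing the flag re-arms the debouncer, like
            // _schedule_serialize.take() in the diff.
            *flag.lock().unwrap() = false;
        });
        true
    }
}
```

Dragging a tab fires many pane events in quick succession; with this guard only one write to the key-value store runs per burst.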
pub(crate) fn handle_pane_event(
this: &mut RunningState,
source_pane: Entity<Pane>,
source_pane: &Entity<Pane>,
event: &Event,
window: &mut Window,
cx: &mut Context<RunningState>,
) {
this.serialize_layout(window, cx);
if let Event::Remove { .. } = event {
let _did_find_pane = this.panes.remove(&source_pane).is_ok();
debug_assert!(_did_find_pane);
cx.notify();
}
}
pub(crate) fn go_to_selected_stack_frame(&self, window: &Window, cx: &mut Context<Self>) {
if self.thread_id.is_some() {
self.stack_frame_list
@@ -586,7 +566,7 @@ impl RunningState {
.find_map(|pane| {
pane.read(cx)
.items_of_type::<SubView>()
.position(|view| view.read(cx).tab_name == *"Modules")
.position(|view| view.read(cx).view_kind().to_shared_string() == *"Modules")
.map(|view| (view, pane))
})
.unwrap();
@@ -788,7 +768,7 @@ impl RunningState {
DropdownMenu::new(
("thread-list", self.session_id.0),
selected_thread_name,
ContextMenu::build(window, cx, move |mut this, _, _| {
ContextMenu::build_eager(window, cx, move |mut this, _, _| {
for (thread, _) in threads {
let state = state.clone();
let thread_id = thread.id;
@@ -802,6 +782,127 @@ impl RunningState {
}),
)
}
fn default_pane_layout(
project: Entity<Project>,
workspace: &WeakEntity<Workspace>,
stack_frame_list: &Entity<StackFrameList>,
variable_list: &Entity<VariableList>,
module_list: &Entity<ModuleList>,
console: &Entity<Console>,
breakpoints: Entity<BreakpointList>,
subscriptions: &mut HashMap<EntityId, Subscription>,
window: &mut Window,
cx: &mut Context<'_, RunningState>,
) -> Member {
let leftmost_pane = new_debugger_pane(workspace.clone(), project.clone(), window, cx);
leftmost_pane.update(cx, |this, cx| {
this.add_item(
Box::new(SubView::new(
this.focus_handle(cx),
stack_frame_list.clone().into(),
DebuggerPaneItem::Frames,
None,
cx,
)),
true,
false,
None,
window,
cx,
);
this.add_item(
Box::new(SubView::new(
breakpoints.focus_handle(cx),
breakpoints.into(),
DebuggerPaneItem::BreakpointList,
None,
cx,
)),
true,
false,
None,
window,
cx,
);
this.activate_item(0, false, false, window, cx);
});
let center_pane = new_debugger_pane(workspace.clone(), project.clone(), window, cx);
center_pane.update(cx, |this, cx| {
this.add_item(
Box::new(SubView::new(
variable_list.focus_handle(cx),
variable_list.clone().into(),
DebuggerPaneItem::Variables,
None,
cx,
)),
true,
false,
None,
window,
cx,
);
this.add_item(
Box::new(SubView::new(
this.focus_handle(cx),
module_list.clone().into(),
DebuggerPaneItem::Modules,
None,
cx,
)),
false,
false,
None,
window,
cx,
);
this.activate_item(0, false, false, window, cx);
});
let rightmost_pane = new_debugger_pane(workspace.clone(), project.clone(), window, cx);
rightmost_pane.update(cx, |this, cx| {
let weak_console = console.downgrade();
this.add_item(
Box::new(SubView::new(
this.focus_handle(cx),
console.clone().into(),
DebuggerPaneItem::Console,
Some(Box::new(move |cx| {
weak_console
.read_with(cx, |console, cx| console.show_indicator(cx))
.unwrap_or_default()
})),
cx,
)),
true,
false,
None,
window,
cx,
);
});
subscriptions.extend(
[&leftmost_pane, &center_pane, &rightmost_pane]
.into_iter()
.map(|entity| {
(
entity.entity_id(),
cx.subscribe_in(entity, window, Self::handle_pane_event),
)
}),
);
let group_root = workspace::PaneAxis::new(
gpui::Axis::Horizontal,
[leftmost_pane, center_pane, rightmost_pane]
.into_iter()
.map(workspace::Member::Pane)
.collect(),
);
Member::Axis(group_root)
}
}
impl EventEmitter<DebugPanelItemEvent> for RunningState {}

View File

@@ -27,7 +27,7 @@ use ui::{
use util::{ResultExt, maybe};
use workspace::Workspace;
pub(super) struct BreakpointList {
pub(crate) struct BreakpointList {
workspace: WeakEntity<Workspace>,
breakpoint_store: Entity<BreakpointStore>,
worktree_store: Entity<WorktreeStore>,

View File

@@ -68,6 +68,7 @@ pub async fn init_test_workspace(
workspace_handle
}
#[track_caller]
pub fn active_debug_session_panel(
workspace: WindowHandle<Workspace>,
cx: &mut TestAppContext,

View File

@@ -26,18 +26,30 @@ async fn test_direct_attach_to_process(executor: BackgroundExecutor, cx: &mut Te
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Attach(AttachConfig {
let session = debugger::test::start_debug_session_with(
&project,
cx,
DebugTaskDefinition {
adapter: "fake-adapter".to_string(),
request: dap::DebugRequestType::Attach(AttachConfig {
process_id: Some(10),
}),
None,
false,
cx,
)
});
label: "label".to_string(),
initialize_args: None,
tcp_connection: None,
locator: None,
stop_on_entry: None,
},
|client| {
client.on_request::<dap::requests::Attach, _>(move |_, args| {
assert_eq!(json!({"request": "attach", "process_id": 10}), args.raw);
let session = task.await.unwrap();
Ok(())
});
},
)
.await
.unwrap();
cx.run_until_parked();
@@ -77,7 +89,15 @@ async fn test_show_attach_modal_and_select_process(
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
// Set up handlers for sessions spawned via modal.
let _initialize_subscription =
project::debugger::test::intercept_debug_sessions(cx, |client| {
client.on_request::<dap::requests::Attach, _>(move |_, args| {
assert_eq!(json!({"request": "attach", "process_id": 1}), args.raw);
Ok(())
});
});
let attach_modal = workspace
.update(cx, |workspace, window, cx| {
workspace.toggle_modal(window, cx, |window, cx| {
@@ -100,7 +120,7 @@ async fn test_show_attach_modal_and_select_process(
},
Candidate {
pid: 3,
name: "non-fake-binary-1".into(),
name: "real-binary-1".into(),
command: vec![],
},
Candidate {
@@ -108,7 +128,9 @@ async fn test_show_attach_modal_and_select_process(
name: "fake-binary-2".into(),
command: vec![],
},
],
]
.into_iter()
.collect(),
true,
window,
cx,
@@ -121,17 +143,30 @@ async fn test_show_attach_modal_and_select_process(
cx.run_until_parked();
// assert we got the expected processes
workspace
.update(cx, |_, window, cx| {
let names =
attach_modal.update(cx, |modal, cx| attach_modal::_process_names(&modal, cx));
// Initially all processes are visible.
assert_eq!(3, names.len());
attach_modal.update(cx, |this, cx| {
this.picker.update(cx, |this, cx| {
this.set_query("fakb", window, cx);
})
})
})
.unwrap();
cx.run_until_parked();
// assert we got the expected processes
workspace
.update(cx, |_, _, cx| {
let names =
attach_modal.update(cx, |modal, cx| attach_modal::_process_names(&modal, cx));
// we filtered out all processes that are not starting with `fake-binary`
// After filtering, only the two fake-binary processes remain.
assert_eq!(2, names.len());
})
.unwrap();
// select the only existing process
cx.dispatch_action(Confirm);

View File

@@ -3,7 +3,6 @@ use dap::requests::StackTrace;
use gpui::{BackgroundExecutor, TestAppContext, VisualTestContext};
use project::{FakeFs, Project};
use serde_json::json;
use task::LaunchConfig;
use tests::{init_test, init_test_workspace};
#[gpui::test]
@@ -29,26 +28,17 @@ async fn test_handle_output_event(executor: BackgroundExecutor, cx: &mut TestApp
})
.unwrap();
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
.on_request::<StackTrace, _>(move |_, _| {
Ok(dap::StackTraceResponse {
stack_frames: Vec::default(),
total_frames: None,
})
client.on_request::<StackTrace, _>(move |_, _| {
Ok(dap::StackTraceResponse {
stack_frames: Vec::default(),
total_frames: None,
})
.await;
});
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {

View File

@@ -1,7 +1,7 @@
use crate::*;
use dap::{
ErrorResponse, RunInTerminalRequestArguments, SourceBreakpoint, StartDebuggingRequestArguments,
StartDebuggingRequestArgumentsRequest,
ErrorResponse, Message, RunInTerminalRequestArguments, SourceBreakpoint,
StartDebuggingRequestArguments, StartDebuggingRequestArgumentsRequest,
client::SessionId,
requests::{
Continue, Disconnect, Launch, Next, RunInTerminal, SetBreakpoints, StackTrace,
@@ -25,7 +25,6 @@ use std::{
atomic::{AtomicBool, Ordering},
},
};
use task::LaunchConfig;
use terminal_view::{TerminalView, terminal_panel::TerminalPanel};
use tests::{active_debug_session_panel, init_test, init_test_workspace};
use util::path;
@@ -49,37 +48,28 @@ async fn test_basic_show_debug_panel(executor: BackgroundExecutor, cx: &mut Test
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
.on_request::<Threads, _>(move |_, _| {
Ok(dap::ThreadsResponse {
threads: vec![dap::Thread {
id: 1,
name: "Thread 1".into(),
}],
})
client.on_request::<Threads, _>(move |_, _| {
Ok(dap::ThreadsResponse {
threads: vec![dap::Thread {
id: 1,
name: "Thread 1".into(),
}],
})
.await;
});
client
.on_request::<StackTrace, _>(move |_, _| {
Ok(dap::StackTraceResponse {
stack_frames: Vec::default(),
total_frames: None,
})
client.on_request::<StackTrace, _>(move |_, _| {
Ok(dap::StackTraceResponse {
stack_frames: Vec::default(),
total_frames: None,
})
.await;
});
cx.run_until_parked();
// assert we have a debug panel item before the session has stopped
workspace
@@ -197,37 +187,28 @@ async fn test_we_can_only_have_one_panel_per_debug_session(
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
.on_request::<Threads, _>(move |_, _| {
Ok(dap::ThreadsResponse {
threads: vec![dap::Thread {
id: 1,
name: "Thread 1".into(),
}],
})
client.on_request::<Threads, _>(move |_, _| {
Ok(dap::ThreadsResponse {
threads: vec![dap::Thread {
id: 1,
name: "Thread 1".into(),
}],
})
.await;
});
client
.on_request::<StackTrace, _>(move |_, _| {
Ok(dap::StackTraceResponse {
stack_frames: Vec::default(),
total_frames: None,
})
client.on_request::<StackTrace, _>(move |_, _| {
Ok(dap::StackTraceResponse {
stack_frames: Vec::default(),
total_frames: None,
})
.await;
});
cx.run_until_parked();
// assert we have a debug panel item before the session has stopped
workspace
@@ -373,16 +354,9 @@ async fn test_handle_successful_run_in_terminal_reverse_request(
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
@@ -470,16 +444,9 @@ async fn test_handle_error_run_in_terminal_reverse_request(
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
@@ -555,28 +522,19 @@ async fn test_handle_start_debugging_reverse_request(
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
    .on_request::<dap::requests::Threads, _>(move |_, _| {
        Ok(dap::ThreadsResponse {
            threads: vec![dap::Thread {
                id: 1,
                name: "Thread 1".into(),
            }],
        })
    })
    .await;
client.on_request::<dap::requests::Threads, _>(move |_, _| {
    Ok(dap::ThreadsResponse {
        threads: vec![dap::Thread {
            id: 1,
            name: "Thread 1".into(),
        }],
    })
});
client
.on_response::<StartDebugging, _>({
@@ -589,7 +547,9 @@ async fn test_handle_start_debugging_reverse_request(
}
})
.await;
// Set up handlers for sessions spawned with reverse request too.
let _reverse_request_subscription =
project::debugger::test::intercept_debug_sessions(cx, |_| {});
client
.fake_reverse_request::<StartDebugging>(StartDebuggingRequestArguments {
configuration: json!({}),
@@ -608,20 +568,16 @@ async fn test_handle_start_debugging_reverse_request(
});
let child_client = child_session.update(cx, |session, _| session.adapter_client().unwrap());
child_client
    .on_request::<dap::requests::Threads, _>(move |_, _| {
        Ok(dap::ThreadsResponse {
            threads: vec![dap::Thread {
                id: 1,
                name: "Thread 1".into(),
            }],
        })
    })
    .await;
child_client.on_request::<dap::requests::Threads, _>(move |_, _| {
    Ok(dap::ThreadsResponse {
        threads: vec![dap::Thread {
            id: 1,
            name: "Thread 1".into(),
        }],
    })
});
child_client
.on_request::<Disconnect, _>(move |_, _| Ok(()))
.await;
child_client.on_request::<Disconnect, _>(move |_, _| Ok(()));
child_client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
@@ -673,31 +629,24 @@ async fn test_shutdown_children_when_parent_session_shutdown(
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let parent_session = task.await.unwrap();
let parent_session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = parent_session.update(cx, |session, _| session.adapter_client().unwrap());
client
    .on_request::<dap::requests::Threads, _>(move |_, _| {
        Ok(dap::ThreadsResponse {
            threads: vec![dap::Thread {
                id: 1,
                name: "Thread 1".into(),
            }],
        })
    })
    .await;
client.on_request::<dap::requests::Threads, _>(move |_, _| {
    Ok(dap::ThreadsResponse {
        threads: vec![dap::Thread {
            id: 1,
            name: "Thread 1".into(),
        }],
    })
});
client.on_response::<StartDebugging, _>(move |_| {}).await;
// Set up handlers for sessions spawned with reverse request too.
let _reverse_request_subscription =
project::debugger::test::intercept_debug_sessions(cx, |_| {});
// start first child session
client
.fake_reverse_request::<StartDebugging>(StartDebuggingRequestArguments {
@@ -725,9 +674,7 @@ async fn test_shutdown_children_when_parent_session_shutdown(
let first_child_client =
first_child_session.update(cx, |session, _| session.adapter_client().unwrap());
first_child_client
.on_request::<Disconnect, _>(move |_, _| Ok(()))
.await;
first_child_client.on_request::<Disconnect, _>(move |_, _| Ok(()));
// configure second child session
let second_child_session = dap_store.read_with(cx, |dap_store, _| {
@@ -736,9 +683,7 @@ async fn test_shutdown_children_when_parent_session_shutdown(
let second_child_client =
second_child_session.update(cx, |session, _| session.adapter_client().unwrap());
second_child_client
.on_request::<Disconnect, _>(move |_, _| Ok(()))
.await;
second_child_client.on_request::<Disconnect, _>(move |_, _| Ok(()));
cx.run_until_parked();
@@ -792,20 +737,15 @@ async fn test_shutdown_parent_session_if_all_children_are_shutdown(
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let parent_session = task.await.unwrap();
let parent_session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = parent_session.update(cx, |session, _| session.adapter_client().unwrap());
client.on_response::<StartDebugging, _>(move |_| {}).await;
// Set up handlers for sessions spawned with reverse request too.
let _reverse_request_subscription =
project::debugger::test::intercept_debug_sessions(cx, |_| {});
// start first child session
client
.fake_reverse_request::<StartDebugging>(StartDebuggingRequestArguments {
@@ -833,9 +773,7 @@ async fn test_shutdown_parent_session_if_all_children_are_shutdown(
let first_child_client =
first_child_session.update(cx, |session, _| session.adapter_client().unwrap());
first_child_client
.on_request::<Disconnect, _>(move |_, _| Ok(()))
.await;
first_child_client.on_request::<Disconnect, _>(move |_, _| Ok(()));
// configure second child session
let second_child_session = dap_store.read_with(cx, |dap_store, _| {
@@ -844,9 +782,7 @@ async fn test_shutdown_parent_session_if_all_children_are_shutdown(
let second_child_client =
second_child_session.update(cx, |session, _| session.adapter_client().unwrap());
second_child_client
.on_request::<Disconnect, _>(move |_, _| Ok(()))
.await;
second_child_client.on_request::<Disconnect, _>(move |_, _| Ok(()));
cx.run_until_parked();
@@ -922,123 +858,107 @@ async fn test_debug_panel_item_thread_status_reset_on_failure(
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
    project.fake_debug_session(
        dap::DebugRequestType::Launch(LaunchConfig::default()),
        Some(dap::Capabilities {
            supports_step_back: Some(true),
            ..Default::default()
        }),
        false,
        cx,
    )
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |client| {
    client.on_request::<dap::requests::Initialize, _>(move |_, _| {
        Ok(dap::Capabilities {
            supports_step_back: Some(true),
            ..Default::default()
        })
    });
})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
const THREAD_ID_NUM: u64 = 1;
client
    .on_request::<dap::requests::Threads, _>(move |_, _| {
        Ok(dap::ThreadsResponse {
            threads: vec![dap::Thread {
                id: THREAD_ID_NUM,
                name: "Thread 1".into(),
            }],
        })
    })
    .await;
client.on_request::<dap::requests::Threads, _>(move |_, _| {
    Ok(dap::ThreadsResponse {
        threads: vec![dap::Thread {
            id: THREAD_ID_NUM,
            name: "Thread 1".into(),
        }],
    })
});
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
client.on_request::<Launch, _>(move |_, _| Ok(()));
client
    .on_request::<StackTrace, _>(move |_, _| {
        Ok(dap::StackTraceResponse {
            stack_frames: Vec::default(),
            total_frames: None,
        })
    })
    .await;
client.on_request::<StackTrace, _>(move |_, _| {
    Ok(dap::StackTraceResponse {
        stack_frames: Vec::default(),
        total_frames: None,
    })
});
client
    .on_request::<Next, _>(move |_, _| {
        Err(ErrorResponse {
            error: Some(dap::Message {
                id: 1,
                format: "error".into(),
                variables: None,
                send_telemetry: None,
                show_user: None,
                url: None,
                url_label: None,
            }),
        })
    })
    .await;
client.on_request::<Next, _>(move |_, _| {
    Err(ErrorResponse {
        error: Some(dap::Message {
            id: 1,
            format: "error".into(),
            variables: None,
            send_telemetry: None,
            show_user: None,
            url: None,
            url_label: None,
        }),
    })
});
client
    .on_request::<StepOut, _>(move |_, _| {
        Err(ErrorResponse {
            error: Some(dap::Message {
                id: 1,
                format: "error".into(),
                variables: None,
                send_telemetry: None,
                show_user: None,
                url: None,
                url_label: None,
            }),
        })
    })
    .await;
client.on_request::<StepOut, _>(move |_, _| {
    Err(ErrorResponse {
        error: Some(dap::Message {
            id: 1,
            format: "error".into(),
            variables: None,
            send_telemetry: None,
            show_user: None,
            url: None,
            url_label: None,
        }),
    })
});
client
    .on_request::<StepIn, _>(move |_, _| {
        Err(ErrorResponse {
            error: Some(dap::Message {
                id: 1,
                format: "error".into(),
                variables: None,
                send_telemetry: None,
                show_user: None,
                url: None,
                url_label: None,
            }),
        })
    })
    .await;
client.on_request::<StepIn, _>(move |_, _| {
    Err(ErrorResponse {
        error: Some(dap::Message {
            id: 1,
            format: "error".into(),
            variables: None,
            send_telemetry: None,
            show_user: None,
            url: None,
            url_label: None,
        }),
    })
});
client
    .on_request::<StepBack, _>(move |_, _| {
        Err(ErrorResponse {
            error: Some(dap::Message {
                id: 1,
                format: "error".into(),
                variables: None,
                send_telemetry: None,
                show_user: None,
                url: None,
                url_label: None,
            }),
        })
    })
    .await;
client.on_request::<StepBack, _>(move |_, _| {
    Err(ErrorResponse {
        error: Some(dap::Message {
            id: 1,
            format: "error".into(),
            variables: None,
            send_telemetry: None,
            show_user: None,
            url: None,
            url_label: None,
        }),
    })
});
client
    .on_request::<Continue, _>(move |_, _| {
        Err(ErrorResponse {
            error: Some(dap::Message {
                id: 1,
                format: "error".into(),
                variables: None,
                send_telemetry: None,
                show_user: None,
                url: None,
                url_label: None,
            }),
        })
    })
    .await;
client.on_request::<Continue, _>(move |_, _| {
    Err(ErrorResponse {
        error: Some(dap::Message {
            id: 1,
            format: "error".into(),
            variables: None,
            send_telemetry: None,
            show_user: None,
            url: None,
            url_label: None,
        }),
    })
});
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
@@ -1052,6 +972,8 @@ async fn test_debug_panel_item_thread_status_reset_on_failure(
}))
.await;
cx.run_until_parked();
let running_state = active_debug_session_panel(workspace, cx).update_in(cx, |item, _, _| {
item.mode()
.as_running()
@@ -1151,16 +1073,9 @@ async fn test_send_breakpoints_when_editor_has_been_saved(
.update(cx, |_, _, cx| worktree.read(cx).id())
.unwrap();
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
let buffer = project
@@ -1180,16 +1095,14 @@ async fn test_send_breakpoints_when_editor_has_been_saved(
)
});
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
client.on_request::<Launch, _>(move |_, _| Ok(()));
client
    .on_request::<StackTrace, _>(move |_, _| {
        Ok(dap::StackTraceResponse {
            stack_frames: Vec::default(),
            total_frames: None,
        })
    })
    .await;
client.on_request::<StackTrace, _>(move |_, _| {
    Ok(dap::StackTraceResponse {
        stack_frames: Vec::default(),
        total_frames: None,
    })
});
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
@@ -1204,32 +1117,30 @@ async fn test_send_breakpoints_when_editor_has_been_saved(
.await;
let called_set_breakpoints = Arc::new(AtomicBool::new(false));
client
    .on_request::<SetBreakpoints, _>({
        let called_set_breakpoints = called_set_breakpoints.clone();
        move |_, args| {
            assert_eq!(path!("/project/main.rs"), args.source.path.unwrap());
            assert_eq!(
                vec![SourceBreakpoint {
                    line: 2,
                    column: None,
                    condition: None,
                    hit_condition: None,
                    log_message: None,
                    mode: None
                }],
                args.breakpoints.unwrap()
            );
            assert!(!args.source_modified.unwrap());
            called_set_breakpoints.store(true, Ordering::SeqCst);
            Ok(dap::SetBreakpointsResponse {
                breakpoints: Vec::default(),
            })
        }
    })
    .await;
client.on_request::<SetBreakpoints, _>({
    let called_set_breakpoints = called_set_breakpoints.clone();
    move |_, args| {
        assert_eq!(path!("/project/main.rs"), args.source.path.unwrap());
        assert_eq!(
            vec![SourceBreakpoint {
                line: 2,
                column: None,
                condition: None,
                hit_condition: None,
                log_message: None,
                mode: None
            }],
            args.breakpoints.unwrap()
        );
        assert!(!args.source_modified.unwrap());
        called_set_breakpoints.store(true, Ordering::SeqCst);
        Ok(dap::SetBreakpointsResponse {
            breakpoints: Vec::default(),
        })
    }
});
editor.update_in(cx, |editor, window, cx| {
editor.move_down(&actions::MoveDown, window, cx);
@@ -1244,32 +1155,30 @@ async fn test_send_breakpoints_when_editor_has_been_saved(
);
let called_set_breakpoints = Arc::new(AtomicBool::new(false));
client
    .on_request::<SetBreakpoints, _>({
        let called_set_breakpoints = called_set_breakpoints.clone();
        move |_, args| {
            assert_eq!(path!("/project/main.rs"), args.source.path.unwrap());
            assert_eq!(
                vec![SourceBreakpoint {
                    line: 3,
                    column: None,
                    condition: None,
                    hit_condition: None,
                    log_message: None,
                    mode: None
                }],
                args.breakpoints.unwrap()
            );
            assert!(args.source_modified.unwrap());
            called_set_breakpoints.store(true, Ordering::SeqCst);
            Ok(dap::SetBreakpointsResponse {
                breakpoints: Vec::default(),
            })
        }
    })
    .await;
client.on_request::<SetBreakpoints, _>({
    let called_set_breakpoints = called_set_breakpoints.clone();
    move |_, args| {
        assert_eq!(path!("/project/main.rs"), args.source.path.unwrap());
        assert_eq!(
            vec![SourceBreakpoint {
                line: 3,
                column: None,
                condition: None,
                hit_condition: None,
                log_message: None,
                mode: None
            }],
            args.breakpoints.unwrap()
        );
        assert!(args.source_modified.unwrap());
        called_set_breakpoints.store(true, Ordering::SeqCst);
        Ok(dap::SetBreakpointsResponse {
            breakpoints: Vec::default(),
        })
    }
});
editor.update_in(cx, |editor, window, cx| {
editor.move_up(&actions::MoveUp, window, cx);
@@ -1381,49 +1290,40 @@ async fn test_unsetting_breakpoints_on_clear_breakpoint_action(
editor.toggle_breakpoint(&actions::ToggleBreakpoint, window, cx);
});
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
let called_set_breakpoints = Arc::new(AtomicBool::new(false));
client
    .on_request::<SetBreakpoints, _>({
        let called_set_breakpoints = called_set_breakpoints.clone();
        move |_, args| {
            assert!(
                args.breakpoints.is_none_or(|bps| bps.is_empty()),
                "Send empty breakpoint sets to clear them from DAP servers"
            );
            match args
                .source
                .path
                .expect("We should always send a breakpoint's path")
                .as_str()
            {
                "/project/main.rs" | "/project/second.rs" => {}
                _ => {
                    panic!("Unset breakpoints for path that doesn't have any")
                }
            }
            called_set_breakpoints.store(true, Ordering::SeqCst);
            Ok(dap::SetBreakpointsResponse {
                breakpoints: Vec::default(),
            })
        }
    })
    .await;
client.on_request::<SetBreakpoints, _>({
    let called_set_breakpoints = called_set_breakpoints.clone();
    move |_, args| {
        assert!(
            args.breakpoints.is_none_or(|bps| bps.is_empty()),
            "Send empty breakpoint sets to clear them from DAP servers"
        );
        match args
            .source
            .path
            .expect("We should always send a breakpoint's path")
            .as_str()
        {
            "/project/main.rs" | "/project/second.rs" => {}
            _ => {
                panic!("Unset breakpoints for path that doesn't have any")
            }
        }
        called_set_breakpoints.store(true, Ordering::SeqCst);
        Ok(dap::SetBreakpointsResponse {
            breakpoints: Vec::default(),
        })
    }
});
cx.dispatch_action(crate::ClearAllBreakpoints);
cx.run_until_parked();
@@ -1458,13 +1358,20 @@ async fn test_debug_session_is_shutdown_when_attach_and_launch_request_fails(
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
    project.fake_debug_session(
        dap::DebugRequestType::Launch(LaunchConfig::default()),
        None,
        true,
        cx,
    )
});
let task = project::debugger::test::start_debug_session(&project, cx, |client| {
    client.on_request::<dap::requests::Initialize, _>(|_, _| {
        Err(ErrorResponse {
            error: Some(Message {
                format: "failed to launch".to_string(),
                id: 1,
                variables: None,
                send_telemetry: None,
                show_user: None,
                url: None,
                url_label: None,
            }),
        })
    });
});
assert!(


@@ -4,15 +4,17 @@ use crate::{
};
use dap::{
StoppedEvent,
requests::{Modules, StackTrace, Threads},
requests::{Initialize, Modules},
};
use gpui::{BackgroundExecutor, TestAppContext, VisualTestContext};
use project::{FakeFs, Project};
use project::{
FakeFs, Project,
debugger::{self},
};
use std::sync::{
Arc,
atomic::{AtomicBool, AtomicI32, Ordering},
};
use task::LaunchConfig;
#[gpui::test]
async fn test_module_list(executor: BackgroundExecutor, cx: &mut TestAppContext) {
@@ -29,30 +31,18 @@ async fn test_module_list(executor: BackgroundExecutor, cx: &mut TestAppContext)
.unwrap();
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
    project.fake_debug_session(
        dap::DebugRequestType::Launch(LaunchConfig::default()),
        Some(dap::Capabilities {
            supports_modules_request: Some(true),
            ..Default::default()
        }),
        false,
        cx,
    )
});
let session = task.await.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
    .on_request::<StackTrace, _>(move |_, args| {
        assert!(args.thread_id == 1);
        Ok(dap::StackTraceResponse {
            stack_frames: Vec::default(),
            total_frames: None,
        })
    })
    .await;
let session = debugger::test::start_debug_session(&project, cx, |client| {
    client.on_request::<Initialize, _>(move |_, _| {
        Ok(dap::Capabilities {
            supports_modules_request: Some(true),
            ..Default::default()
        })
    });
})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
let called_modules = Arc::new(AtomicBool::new(false));
let modules = vec![
@@ -82,38 +72,25 @@ async fn test_module_list(executor: BackgroundExecutor, cx: &mut TestAppContext)
},
];
client
    .on_request::<Threads, _>(move |_, _| {
        Ok(dap::ThreadsResponse {
            threads: vec![dap::Thread {
                id: 1,
                name: "Thread 1".into(),
            }],
        })
    })
    .await;
client
    .on_request::<Modules, _>({
        let called_modules = called_modules.clone();
        let modules_request_count = AtomicI32::new(0);
        let modules = modules.clone();
        move |_, _| {
            modules_request_count.fetch_add(1, Ordering::SeqCst);
            assert_eq!(
                1,
                modules_request_count.load(Ordering::SeqCst),
                "This request should only be called once from the host"
            );
            called_modules.store(true, Ordering::SeqCst);
            Ok(dap::ModulesResponse {
                modules: modules.clone(),
                total_modules: Some(2u64),
            })
        }
    })
    .await;
client.on_request::<Modules, _>({
    let called_modules = called_modules.clone();
    let modules_request_count = AtomicI32::new(0);
    let modules = modules.clone();
    move |_, _| {
        modules_request_count.fetch_add(1, Ordering::SeqCst);
        assert_eq!(
            1,
            modules_request_count.load(Ordering::SeqCst),
            "This request should only be called once from the host"
        );
        called_modules.store(true, Ordering::SeqCst);
        Ok(dap::ModulesResponse {
            modules: modules.clone(),
            total_modules: Some(2u64),
        })
    }
});
client
.fake_event(dap::messages::Events::Stopped(StoppedEvent {


@@ -5,14 +5,13 @@ use crate::{
};
use dap::{
StackFrame,
requests::{StackTrace, Threads},
requests::{Scopes, StackTrace, Threads},
};
use editor::{Editor, ToPoint as _};
use gpui::{BackgroundExecutor, TestAppContext, VisualTestContext};
use project::{FakeFs, Project};
use project::{FakeFs, Project, debugger};
use serde_json::json;
use std::sync::Arc;
use task::LaunchConfig;
use unindent::Unindent as _;
use util::path;
@@ -51,29 +50,20 @@ async fn test_fetch_initial_stack_frames_and_go_to_stack_frame(
let project = Project::test(fs, [path!("/project").as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client.on_request::<Scopes, _>(move |_, _| Ok(dap::ScopesResponse { scopes: vec![] }));
client
    .on_request::<Threads, _>(move |_, _| {
        Ok(dap::ThreadsResponse {
            threads: vec![dap::Thread {
                id: 1,
                name: "Thread 1".into(),
            }],
        })
    })
    .await;
client.on_request::<Threads, _>(move |_, _| {
    Ok(dap::ThreadsResponse {
        threads: vec![dap::Thread {
            id: 1,
            name: "Thread 1".into(),
        }],
    })
});
let stack_frames = vec![
StackFrame {
@@ -122,19 +112,17 @@ async fn test_fetch_initial_stack_frames_and_go_to_stack_frame(
},
];
client
    .on_request::<StackTrace, _>({
        let stack_frames = Arc::new(stack_frames.clone());
        move |_, args| {
            assert_eq!(1, args.thread_id);
            Ok(dap::StackTraceResponse {
                stack_frames: (*stack_frames).clone(),
                total_frames: None,
            })
        }
    })
    .await;
client.on_request::<StackTrace, _>({
    let stack_frames = Arc::new(stack_frames.clone());
    move |_, args| {
        assert_eq!(1, args.thread_id);
        Ok(dap::StackTraceResponse {
            stack_frames: (*stack_frames).clone(),
            total_frames: None,
        })
    }
});
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
@@ -241,29 +229,21 @@ async fn test_select_stack_frame(executor: BackgroundExecutor, cx: &mut TestAppC
});
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
    .on_request::<Threads, _>(move |_, _| {
        Ok(dap::ThreadsResponse {
            threads: vec![dap::Thread {
                id: 1,
                name: "Thread 1".into(),
            }],
        })
    })
    .await;
client.on_request::<Threads, _>(move |_, _| {
    Ok(dap::ThreadsResponse {
        threads: vec![dap::Thread {
            id: 1,
            name: "Thread 1".into(),
        }],
    })
});
client.on_request::<Scopes, _>(move |_, _| Ok(dap::ScopesResponse { scopes: vec![] }));
let stack_frames = vec![
StackFrame {
@@ -312,19 +292,17 @@ async fn test_select_stack_frame(executor: BackgroundExecutor, cx: &mut TestAppC
},
];
client
    .on_request::<StackTrace, _>({
        let stack_frames = Arc::new(stack_frames.clone());
        move |_, args| {
            assert_eq!(1, args.thread_id);
            Ok(dap::StackTraceResponse {
                stack_frames: (*stack_frames).clone(),
                total_frames: None,
            })
        }
    })
    .await;
client.on_request::<StackTrace, _>({
    let stack_frames = Arc::new(stack_frames.clone());
    move |_, args| {
        assert_eq!(1, args.thread_id);
        Ok(dap::StackTraceResponse {
            stack_frames: (*stack_frames).clone(),
            total_frames: None,
        })
    }
});
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
@@ -517,28 +495,21 @@ async fn test_collapsed_entries(executor: BackgroundExecutor, cx: &mut TestAppCo
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
    .on_request::<Threads, _>(move |_, _| {
        Ok(dap::ThreadsResponse {
            threads: vec![dap::Thread {
                id: 1,
                name: "Thread 1".into(),
            }],
        })
    })
    .await;
client.on_request::<Threads, _>(move |_, _| {
    Ok(dap::ThreadsResponse {
        threads: vec![dap::Thread {
            id: 1,
            name: "Thread 1".into(),
        }],
    })
});
client.on_request::<Scopes, _>(move |_, _| Ok(dap::ScopesResponse { scopes: vec![] }));
let stack_frames = vec![
StackFrame {
@@ -697,19 +668,17 @@ async fn test_collapsed_entries(executor: BackgroundExecutor, cx: &mut TestAppCo
},
];
client
    .on_request::<StackTrace, _>({
        let stack_frames = Arc::new(stack_frames.clone());
        move |_, args| {
            assert_eq!(1, args.thread_id);
            Ok(dap::StackTraceResponse {
                stack_frames: (*stack_frames).clone(),
                total_frames: None,
            })
        }
    })
    .await;
client.on_request::<StackTrace, _>({
    let stack_frames = Arc::new(stack_frames.clone());
    move |_, args| {
        assert_eq!(1, args.thread_id);
        Ok(dap::StackTraceResponse {
            stack_frames: (*stack_frames).clone(),
            total_frames: None,
        })
    }
});
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {


@@ -15,9 +15,8 @@ use dap::{
};
use gpui::{BackgroundExecutor, TestAppContext, VisualTestContext};
use menu::{SelectFirst, SelectNext, SelectPrevious};
use project::{FakeFs, Project};
use project::{FakeFs, Project, debugger};
use serde_json::json;
use task::LaunchConfig;
use unindent::Unindent as _;
use util::path;
@@ -55,29 +54,19 @@ async fn test_basic_fetch_initial_scope_and_variables(
})
.unwrap();
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
    .on_request::<dap::requests::Threads, _>(move |_, _| {
        Ok(dap::ThreadsResponse {
            threads: vec![dap::Thread {
                id: 1,
                name: "Thread 1".into(),
            }],
        })
    })
    .await;
client.on_request::<dap::requests::Threads, _>(move |_, _| {
    Ok(dap::ThreadsResponse {
        threads: vec![dap::Thread {
            id: 1,
            name: "Thread 1".into(),
        }],
    })
});
let stack_frames = vec![StackFrame {
id: 1,
@@ -102,19 +91,17 @@ async fn test_basic_fetch_initial_scope_and_variables(
presentation_hint: None,
}];
client
    .on_request::<StackTrace, _>({
        let stack_frames = Arc::new(stack_frames.clone());
        move |_, args| {
            assert_eq!(1, args.thread_id);
            Ok(dap::StackTraceResponse {
                stack_frames: (*stack_frames).clone(),
                total_frames: None,
            })
        }
    })
    .await;
client.on_request::<StackTrace, _>({
    let stack_frames = Arc::new(stack_frames.clone());
    move |_, args| {
        assert_eq!(1, args.thread_id);
        Ok(dap::StackTraceResponse {
            stack_frames: (*stack_frames).clone(),
            total_frames: None,
        })
    }
});
let scopes = vec![Scope {
name: "Scope 1".into(),
@@ -130,18 +117,16 @@ async fn test_basic_fetch_initial_scope_and_variables(
end_column: None,
}];
client
    .on_request::<Scopes, _>({
        let scopes = Arc::new(scopes.clone());
        move |_, args| {
            assert_eq!(1, args.frame_id);
            Ok(dap::ScopesResponse {
                scopes: (*scopes).clone(),
            })
        }
    })
    .await;
client.on_request::<Scopes, _>({
    let scopes = Arc::new(scopes.clone());
    move |_, args| {
        assert_eq!(1, args.frame_id);
        Ok(dap::ScopesResponse {
            scopes: (*scopes).clone(),
        })
    }
});
let variables = vec![
Variable {
@@ -172,18 +157,16 @@ async fn test_basic_fetch_initial_scope_and_variables(
},
];
client
    .on_request::<Variables, _>({
        let variables = Arc::new(variables.clone());
        move |_, args| {
            assert_eq!(2, args.variables_reference);
            Ok(dap::VariablesResponse {
                variables: (*variables).clone(),
            })
        }
    })
    .await;
client.on_request::<Variables, _>({
    let variables = Arc::new(variables.clone());
    move |_, args| {
        assert_eq!(2, args.variables_reference);
        Ok(dap::VariablesResponse {
            variables: (*variables).clone(),
        })
    }
});
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
@@ -283,39 +266,28 @@ async fn test_fetch_variables_for_multiple_scopes(
.unwrap();
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
    .on_request::<dap::requests::Threads, _>(move |_, _| {
        Ok(dap::ThreadsResponse {
            threads: vec![dap::Thread {
                id: 1,
                name: "Thread 1".into(),
            }],
        })
    })
    .await;
client.on_request::<dap::requests::Threads, _>(move |_, _| {
    Ok(dap::ThreadsResponse {
        threads: vec![dap::Thread {
            id: 1,
            name: "Thread 1".into(),
        }],
    })
});
client
    .on_request::<Initialize, _>(move |_, _| {
        Ok(dap::Capabilities {
            supports_step_back: Some(false),
            ..Default::default()
        })
    })
    .await;
client.on_request::<Initialize, _>(move |_, _| {
    Ok(dap::Capabilities {
        supports_step_back: Some(false),
        ..Default::default()
    })
});
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
client.on_request::<Launch, _>(move |_, _| Ok(()));
let stack_frames = vec![StackFrame {
id: 1,
@@ -340,19 +312,17 @@ async fn test_fetch_variables_for_multiple_scopes(
presentation_hint: None,
}];
client
    .on_request::<StackTrace, _>({
        let stack_frames = Arc::new(stack_frames.clone());
        move |_, args| {
            assert_eq!(1, args.thread_id);
            Ok(dap::StackTraceResponse {
                stack_frames: (*stack_frames).clone(),
                total_frames: None,
            })
        }
    })
    .await;
client.on_request::<StackTrace, _>({
    let stack_frames = Arc::new(stack_frames.clone());
    move |_, args| {
        assert_eq!(1, args.thread_id);
        Ok(dap::StackTraceResponse {
            stack_frames: (*stack_frames).clone(),
            total_frames: None,
        })
    }
});
let scopes = vec![
Scope {
@@ -383,18 +353,16 @@ async fn test_fetch_variables_for_multiple_scopes(
},
];
client
    .on_request::<Scopes, _>({
        let scopes = Arc::new(scopes.clone());
        move |_, args| {
            assert_eq!(1, args.frame_id);
            Ok(dap::ScopesResponse {
                scopes: (*scopes).clone(),
            })
        }
    })
    .await;
client.on_request::<Scopes, _>({
    let scopes = Arc::new(scopes.clone());
    move |_, args| {
        assert_eq!(1, args.frame_id);
        Ok(dap::ScopesResponse {
            scopes: (*scopes).clone(),
        })
    }
});
let mut variables = HashMap::default();
variables.insert(
@@ -445,16 +413,14 @@ async fn test_fetch_variables_for_multiple_scopes(
}],
);
client
    .on_request::<Variables, _>({
        let variables = Arc::new(variables.clone());
        move |_, args| {
            Ok(dap::VariablesResponse {
                variables: variables.get(&args.variables_reference).unwrap().clone(),
            })
        }
    })
    .await;
client.on_request::<Variables, _>({
    let variables = Arc::new(variables.clone());
    move |_, args| {
        Ok(dap::VariablesResponse {
            variables: variables.get(&args.variables_reference).unwrap().clone(),
        })
    }
});
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
@@ -562,40 +528,28 @@ async fn test_keyboard_navigation(executor: BackgroundExecutor, cx: &mut TestApp
})
.unwrap();
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
    .on_request::<dap::requests::Threads, _>(move |_, _| {
        Ok(dap::ThreadsResponse {
            threads: vec![dap::Thread {
                id: 1,
                name: "Thread 1".into(),
            }],
        })
    })
    .await;
client.on_request::<dap::requests::Threads, _>(move |_, _| {
    Ok(dap::ThreadsResponse {
        threads: vec![dap::Thread {
            id: 1,
            name: "Thread 1".into(),
        }],
    })
});
client
    .on_request::<Initialize, _>(move |_, _| {
        Ok(dap::Capabilities {
            supports_step_back: Some(false),
            ..Default::default()
        })
    })
    .await;
client.on_request::<Initialize, _>(move |_, _| {
    Ok(dap::Capabilities {
        supports_step_back: Some(false),
        ..Default::default()
    })
});
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
client.on_request::<Launch, _>(move |_, _| Ok(()));
let stack_frames = vec![StackFrame {
id: 1,
@@ -620,19 +574,17 @@ async fn test_keyboard_navigation(executor: BackgroundExecutor, cx: &mut TestApp
presentation_hint: None,
}];
client
.on_request::<StackTrace, _>({
let stack_frames = Arc::new(stack_frames.clone());
move |_, args| {
assert_eq!(1, args.thread_id);
client.on_request::<StackTrace, _>({
let stack_frames = Arc::new(stack_frames.clone());
move |_, args| {
assert_eq!(1, args.thread_id);
Ok(dap::StackTraceResponse {
stack_frames: (*stack_frames).clone(),
total_frames: None,
})
}
})
.await;
Ok(dap::StackTraceResponse {
stack_frames: (*stack_frames).clone(),
total_frames: None,
})
}
});
let scopes = vec![
Scope {
@@ -663,18 +615,16 @@ async fn test_keyboard_navigation(executor: BackgroundExecutor, cx: &mut TestApp
},
];
client
.on_request::<Scopes, _>({
let scopes = Arc::new(scopes.clone());
move |_, args| {
assert_eq!(1, args.frame_id);
client.on_request::<Scopes, _>({
let scopes = Arc::new(scopes.clone());
move |_, args| {
assert_eq!(1, args.frame_id);
Ok(dap::ScopesResponse {
scopes: (*scopes).clone(),
})
}
})
.await;
Ok(dap::ScopesResponse {
scopes: (*scopes).clone(),
})
}
});
let scope1_variables = vec![
Variable {
@@ -748,25 +698,23 @@ async fn test_keyboard_navigation(executor: BackgroundExecutor, cx: &mut TestApp
value_location_reference: None,
}];
client
.on_request::<Variables, _>({
let scope1_variables = Arc::new(scope1_variables.clone());
let nested_variables = Arc::new(nested_variables.clone());
let scope2_variables = Arc::new(scope2_variables.clone());
move |_, args| match args.variables_reference {
4 => Ok(dap::VariablesResponse {
variables: (*scope2_variables).clone(),
}),
3 => Ok(dap::VariablesResponse {
variables: (*nested_variables).clone(),
}),
2 => Ok(dap::VariablesResponse {
variables: (*scope1_variables).clone(),
}),
id => unreachable!("unexpected variables reference {id}"),
}
})
.await;
client.on_request::<Variables, _>({
let scope1_variables = Arc::new(scope1_variables.clone());
let nested_variables = Arc::new(nested_variables.clone());
let scope2_variables = Arc::new(scope2_variables.clone());
move |_, args| match args.variables_reference {
4 => Ok(dap::VariablesResponse {
variables: (*scope2_variables).clone(),
}),
3 => Ok(dap::VariablesResponse {
variables: (*nested_variables).clone(),
}),
2 => Ok(dap::VariablesResponse {
variables: (*scope1_variables).clone(),
}),
id => unreachable!("unexpected variables reference {id}"),
}
});
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
@@ -1365,39 +1313,28 @@ async fn test_variable_list_only_sends_requests_when_rendering(
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
.on_request::<dap::requests::Threads, _>(move |_, _| {
Ok(dap::ThreadsResponse {
threads: vec![dap::Thread {
id: 1,
name: "Thread 1".into(),
}],
})
client.on_request::<dap::requests::Threads, _>(move |_, _| {
Ok(dap::ThreadsResponse {
threads: vec![dap::Thread {
id: 1,
name: "Thread 1".into(),
}],
})
.await;
});
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
client.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
.await;
});
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
client.on_request::<Launch, _>(move |_, _| Ok(()));
let stack_frames = vec![
StackFrame {
@@ -1446,19 +1383,17 @@ async fn test_variable_list_only_sends_requests_when_rendering(
},
];
client
.on_request::<StackTrace, _>({
let stack_frames = Arc::new(stack_frames.clone());
move |_, args| {
assert_eq!(1, args.thread_id);
client.on_request::<StackTrace, _>({
let stack_frames = Arc::new(stack_frames.clone());
move |_, args| {
assert_eq!(1, args.thread_id);
Ok(dap::StackTraceResponse {
stack_frames: (*stack_frames).clone(),
total_frames: None,
})
}
})
.await;
Ok(dap::StackTraceResponse {
stack_frames: (*stack_frames).clone(),
total_frames: None,
})
}
});
let frame_1_scopes = vec![Scope {
name: "Frame 1 Scope 1".into(),
@@ -1476,25 +1411,23 @@ async fn test_variable_list_only_sends_requests_when_rendering(
let made_scopes_request = Arc::new(AtomicBool::new(false));
client
.on_request::<Scopes, _>({
let frame_1_scopes = Arc::new(frame_1_scopes.clone());
let made_scopes_request = made_scopes_request.clone();
move |_, args| {
assert_eq!(1, args.frame_id);
assert!(
!made_scopes_request.load(Ordering::SeqCst),
"We should be caching the scope request"
);
client.on_request::<Scopes, _>({
let frame_1_scopes = Arc::new(frame_1_scopes.clone());
let made_scopes_request = made_scopes_request.clone();
move |_, args| {
assert_eq!(1, args.frame_id);
assert!(
!made_scopes_request.load(Ordering::SeqCst),
"We should be caching the scope request"
);
made_scopes_request.store(true, Ordering::SeqCst);
made_scopes_request.store(true, Ordering::SeqCst);
Ok(dap::ScopesResponse {
scopes: (*frame_1_scopes).clone(),
})
}
})
.await;
Ok(dap::ScopesResponse {
scopes: (*frame_1_scopes).clone(),
})
}
});
let frame_1_variables = vec![
Variable {
@@ -1525,18 +1458,18 @@ async fn test_variable_list_only_sends_requests_when_rendering(
},
];
client
.on_request::<Variables, _>({
let frame_1_variables = Arc::new(frame_1_variables.clone());
move |_, args| {
assert_eq!(2, args.variables_reference);
client.on_request::<Variables, _>({
let frame_1_variables = Arc::new(frame_1_variables.clone());
move |_, args| {
assert_eq!(2, args.variables_reference);
Ok(dap::VariablesResponse {
variables: (*frame_1_variables).clone(),
})
}
})
.await;
Ok(dap::VariablesResponse {
variables: (*frame_1_variables).clone(),
})
}
});
cx.run_until_parked();
let running_state = active_debug_session_panel(workspace, cx).update_in(cx, |item, _, _| {
let state = item
@@ -1627,39 +1560,28 @@ async fn test_it_fetches_scopes_variables_when_you_select_a_stack_frame(
.unwrap();
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.fake_debug_session(
dap::DebugRequestType::Launch(LaunchConfig::default()),
None,
false,
cx,
)
});
let session = task.await.unwrap();
let session = debugger::test::start_debug_session(&project, cx, |_| {})
.await
.unwrap();
let client = session.update(cx, |session, _| session.adapter_client().unwrap());
client
.on_request::<dap::requests::Threads, _>(move |_, _| {
Ok(dap::ThreadsResponse {
threads: vec![dap::Thread {
id: 1,
name: "Thread 1".into(),
}],
})
client.on_request::<dap::requests::Threads, _>(move |_, _| {
Ok(dap::ThreadsResponse {
threads: vec![dap::Thread {
id: 1,
name: "Thread 1".into(),
}],
})
.await;
});
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
client.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
.await;
});
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
client.on_request::<Launch, _>(move |_, _| Ok(()));
let stack_frames = vec![
StackFrame {
@@ -1708,19 +1630,17 @@ async fn test_it_fetches_scopes_variables_when_you_select_a_stack_frame(
},
];
client
.on_request::<StackTrace, _>({
let stack_frames = Arc::new(stack_frames.clone());
move |_, args| {
assert_eq!(1, args.thread_id);
client.on_request::<StackTrace, _>({
let stack_frames = Arc::new(stack_frames.clone());
move |_, args| {
assert_eq!(1, args.thread_id);
Ok(dap::StackTraceResponse {
stack_frames: (*stack_frames).clone(),
total_frames: None,
})
}
})
.await;
Ok(dap::StackTraceResponse {
stack_frames: (*stack_frames).clone(),
total_frames: None,
})
}
});
let frame_1_scopes = vec![Scope {
name: "Frame 1 Scope 1".into(),
@@ -1755,30 +1675,28 @@ async fn test_it_fetches_scopes_variables_when_you_select_a_stack_frame(
let called_second_stack_frame = Arc::new(AtomicBool::new(false));
let called_first_stack_frame = Arc::new(AtomicBool::new(false));
client
.on_request::<Scopes, _>({
let frame_1_scopes = Arc::new(frame_1_scopes.clone());
let frame_2_scopes = Arc::new(frame_2_scopes.clone());
let called_first_stack_frame = called_first_stack_frame.clone();
let called_second_stack_frame = called_second_stack_frame.clone();
move |_, args| match args.frame_id {
1 => {
called_first_stack_frame.store(true, Ordering::SeqCst);
Ok(dap::ScopesResponse {
scopes: (*frame_1_scopes).clone(),
})
}
2 => {
called_second_stack_frame.store(true, Ordering::SeqCst);
Ok(dap::ScopesResponse {
scopes: (*frame_2_scopes).clone(),
})
}
_ => panic!("Made a scopes request with an invalid frame id"),
client.on_request::<Scopes, _>({
let frame_1_scopes = Arc::new(frame_1_scopes.clone());
let frame_2_scopes = Arc::new(frame_2_scopes.clone());
let called_first_stack_frame = called_first_stack_frame.clone();
let called_second_stack_frame = called_second_stack_frame.clone();
move |_, args| match args.frame_id {
1 => {
called_first_stack_frame.store(true, Ordering::SeqCst);
Ok(dap::ScopesResponse {
scopes: (*frame_1_scopes).clone(),
})
}
})
.await;
2 => {
called_second_stack_frame.store(true, Ordering::SeqCst);
Ok(dap::ScopesResponse {
scopes: (*frame_2_scopes).clone(),
})
}
_ => panic!("Made a scopes request with an invalid frame id"),
}
});
let frame_1_variables = vec![
Variable {
@@ -1838,18 +1756,16 @@ async fn test_it_fetches_scopes_variables_when_you_select_a_stack_frame(
},
];
client
.on_request::<Variables, _>({
let frame_1_variables = Arc::new(frame_1_variables.clone());
move |_, args| {
assert_eq!(2, args.variables_reference);
client.on_request::<Variables, _>({
let frame_1_variables = Arc::new(frame_1_variables.clone());
move |_, args| {
assert_eq!(2, args.variables_reference);
Ok(dap::VariablesResponse {
variables: (*frame_1_variables).clone(),
})
}
})
.await;
Ok(dap::VariablesResponse {
variables: (*frame_1_variables).clone(),
})
}
});
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
@@ -1905,18 +1821,16 @@ async fn test_it_fetches_scopes_variables_when_you_select_a_stack_frame(
assert_eq!(frame_1_variables, variables);
});
client
.on_request::<Variables, _>({
let frame_2_variables = Arc::new(frame_2_variables.clone());
move |_, args| {
assert_eq!(3, args.variables_reference);
client.on_request::<Variables, _>({
let frame_2_variables = Arc::new(frame_2_variables.clone());
move |_, args| {
assert_eq!(3, args.variables_reference);
Ok(dap::VariablesResponse {
variables: (*frame_2_variables).clone(),
})
}
})
.await;
Ok(dap::VariablesResponse {
variables: (*frame_2_variables).clone(),
})
}
});
running_state
.update_in(cx, |running_state, window, cx| {


@@ -15,17 +15,22 @@ doctest = false
[dependencies]
anyhow.workspace = true
collections.workspace = true
component.workspace = true
ctor.workspace = true
editor.workspace = true
env_logger.workspace = true
gpui.workspace = true
indoc.workspace = true
language.workspace = true
linkme.workspace = true
log.workspace = true
lsp.workspace = true
markdown.workspace = true
project.workspace = true
rand.workspace = true
serde.workspace = true
settings.workspace = true
text.workspace = true
theme.workspace = true
ui.workspace = true
util.workspace = true
@@ -37,6 +42,7 @@ client = { workspace = true, features = ["test-support"] }
editor = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, features = ["test-support"] }
language = { workspace = true, features = ["test-support"] }
markdown = { workspace = true, features = ["test-support"] }
lsp = { workspace = true, features = ["test-support"] }
serde_json.workspace = true
theme = { workspace = true, features = ["test-support"] }


@@ -0,0 +1,302 @@
use std::{ops::Range, sync::Arc};
use editor::{
Anchor, Editor, EditorSnapshot, ToOffset,
display_map::{BlockContext, BlockPlacement, BlockProperties, BlockStyle},
hover_markdown_style,
scroll::Autoscroll,
};
use gpui::{AppContext, Entity, Focusable, WeakEntity};
use language::{BufferId, DiagnosticEntry};
use lsp::DiagnosticSeverity;
use markdown::{Markdown, MarkdownElement};
use settings::Settings;
use text::{AnchorRangeExt, Point};
use theme::ThemeSettings;
use ui::{
ActiveTheme, AnyElement, App, Context, IntoElement, ParentElement, SharedString, Styled,
Window, div, px,
};
use util::maybe;
use crate::ProjectDiagnosticsEditor;
pub struct DiagnosticRenderer;
impl DiagnosticRenderer {
pub fn diagnostic_blocks_for_group(
diagnostic_group: Vec<DiagnosticEntry<Point>>,
buffer_id: BufferId,
diagnostics_editor: Option<WeakEntity<ProjectDiagnosticsEditor>>,
cx: &mut App,
) -> Vec<DiagnosticBlock> {
let Some(primary_ix) = diagnostic_group
.iter()
.position(|d| d.diagnostic.is_primary)
else {
return Vec::new();
};
let primary = diagnostic_group[primary_ix].clone();
let mut same_row = Vec::new();
let mut close = Vec::new();
let mut distant = Vec::new();
let group_id = primary.diagnostic.group_id;
for (ix, entry) in diagnostic_group.into_iter().enumerate() {
if entry.diagnostic.is_primary {
continue;
}
if entry.range.start.row == primary.range.start.row {
same_row.push(entry)
} else if entry.range.start.row.abs_diff(primary.range.start.row) < 5 {
close.push(entry)
} else {
distant.push((ix, entry))
}
}
let mut markdown =
Markdown::escape(&if let Some(source) = primary.diagnostic.source.as_ref() {
format!("{}: {}", source, primary.diagnostic.message)
} else {
primary.diagnostic.message
})
.to_string();
for entry in same_row {
markdown.push_str("\n- hint: ");
markdown.push_str(&Markdown::escape(&entry.diagnostic.message))
}
for (ix, entry) in &distant {
markdown.push_str("\n- hint: [");
markdown.push_str(&Markdown::escape(&entry.diagnostic.message));
markdown.push_str(&format!("](file://#diagnostic-{group_id}-{ix})\n",))
}
let mut results = vec![DiagnosticBlock {
initial_range: primary.range,
severity: primary.diagnostic.severity,
buffer_id,
diagnostics_editor: diagnostics_editor.clone(),
markdown: cx.new(|cx| Markdown::new(markdown.into(), None, None, cx)),
}];
for entry in close {
let markdown = if let Some(source) = entry.diagnostic.source.as_ref() {
format!("{}: {}", source, entry.diagnostic.message)
} else {
entry.diagnostic.message
};
let markdown = Markdown::escape(&markdown).to_string();
results.push(DiagnosticBlock {
initial_range: entry.range,
severity: entry.diagnostic.severity,
buffer_id,
diagnostics_editor: diagnostics_editor.clone(),
markdown: cx.new(|cx| Markdown::new(markdown.into(), None, None, cx)),
});
}
for (_, entry) in distant {
let markdown = if let Some(source) = entry.diagnostic.source.as_ref() {
format!("{}: {}", source, entry.diagnostic.message)
} else {
entry.diagnostic.message
};
let mut markdown = Markdown::escape(&markdown).to_string();
markdown.push_str(&format!(
" ([back](file://#diagnostic-{group_id}-{primary_ix}))"
));
// problem: group-id changes...
// - only an issue in diagnostics because caching
results.push(DiagnosticBlock {
initial_range: entry.range,
severity: entry.diagnostic.severity,
buffer_id,
diagnostics_editor: diagnostics_editor.clone(),
markdown: cx.new(|cx| Markdown::new(markdown.into(), None, None, cx)),
});
}
results
}
}
impl editor::DiagnosticRenderer for DiagnosticRenderer {
fn render_group(
&self,
diagnostic_group: Vec<DiagnosticEntry<Point>>,
buffer_id: BufferId,
snapshot: EditorSnapshot,
editor: WeakEntity<Editor>,
cx: &mut App,
) -> Vec<BlockProperties<Anchor>> {
let blocks = Self::diagnostic_blocks_for_group(diagnostic_group, buffer_id, None, cx);
blocks
.into_iter()
.map(|block| {
let editor = editor.clone();
BlockProperties {
placement: BlockPlacement::Near(
snapshot
.buffer_snapshot
.anchor_after(block.initial_range.start),
),
height: Some(1),
style: BlockStyle::Flex,
render: Arc::new(move |bcx| block.render_block(editor.clone(), bcx)),
priority: 1,
}
})
.collect()
}
}
#[derive(Clone)]
pub(crate) struct DiagnosticBlock {
pub(crate) initial_range: Range<Point>,
pub(crate) severity: DiagnosticSeverity,
pub(crate) buffer_id: BufferId,
pub(crate) markdown: Entity<Markdown>,
pub(crate) diagnostics_editor: Option<WeakEntity<ProjectDiagnosticsEditor>>,
}
impl DiagnosticBlock {
pub fn render_block(&self, editor: WeakEntity<Editor>, bcx: &BlockContext) -> AnyElement {
let cx = &bcx.app;
let status_colors = bcx.app.theme().status();
let max_width = px(600.);
let (background_color, border_color) = match self.severity {
DiagnosticSeverity::ERROR => (status_colors.error_background, status_colors.error),
DiagnosticSeverity::WARNING => {
(status_colors.warning_background, status_colors.warning)
}
DiagnosticSeverity::INFORMATION => (status_colors.info_background, status_colors.info),
DiagnosticSeverity::HINT => (status_colors.hint_background, status_colors.info),
_ => (status_colors.ignored_background, status_colors.ignored),
};
let settings = ThemeSettings::get_global(cx);
let editor_line_height = (settings.line_height() * settings.buffer_font_size(cx)).round();
let line_height = editor_line_height;
let buffer_id = self.buffer_id;
let diagnostics_editor = self.diagnostics_editor.clone();
div()
.border_l_2()
.px_2()
.line_height(line_height)
.bg(background_color)
.border_color(border_color)
.max_w(max_width)
.child(
MarkdownElement::new(self.markdown.clone(), hover_markdown_style(bcx.window, cx))
.on_url_click({
move |link, window, cx| {
Self::open_link(
editor.clone(),
&diagnostics_editor,
link,
window,
buffer_id,
cx,
)
}
}),
)
.into_any_element()
}
pub fn open_link(
editor: WeakEntity<Editor>,
diagnostics_editor: &Option<WeakEntity<ProjectDiagnosticsEditor>>,
link: SharedString,
window: &mut Window,
buffer_id: BufferId,
cx: &mut App,
) {
editor
.update(cx, |editor, cx| {
let Some(diagnostic_link) = link.strip_prefix("file://#diagnostic-") else {
editor::hover_popover::open_markdown_url(link, window, cx);
return;
};
let Some((group_id, ix)) = maybe!({
let (group_id, ix) = diagnostic_link.split_once('-')?;
let group_id: usize = group_id.parse().ok()?;
let ix: usize = ix.parse().ok()?;
Some((group_id, ix))
}) else {
return;
};
if let Some(diagnostics_editor) = diagnostics_editor {
if let Some(diagnostic) = diagnostics_editor
.update(cx, |diagnostics, _| {
diagnostics
.diagnostics
.get(&buffer_id)
.cloned()
.unwrap_or_default()
.into_iter()
.filter(|d| d.diagnostic.group_id == group_id)
.nth(ix)
})
.ok()
.flatten()
{
let multibuffer = editor.buffer().read(cx);
let Some(snapshot) = multibuffer
.buffer(buffer_id)
.map(|entity| entity.read(cx).snapshot())
else {
return;
};
for (excerpt_id, range) in multibuffer.excerpts_for_buffer(buffer_id, cx) {
if range.context.overlaps(&diagnostic.range, &snapshot) {
Self::jump_to(
editor,
Anchor::range_in_buffer(
excerpt_id,
buffer_id,
diagnostic.range,
),
window,
cx,
);
return;
}
}
}
} else {
if let Some(diagnostic) = editor
.snapshot(window, cx)
.buffer_snapshot
.diagnostic_group(buffer_id, group_id)
.nth(ix)
{
Self::jump_to(editor, diagnostic.range, window, cx)
}
};
})
.ok();
}
fn jump_to<T: ToOffset>(
editor: &mut Editor,
range: Range<T>,
window: &mut Window,
cx: &mut Context<Editor>,
) {
let snapshot = &editor.buffer().read(cx).snapshot(cx);
let range = range.start.to_offset(&snapshot)..range.end.to_offset(&snapshot);
editor.unfold_ranges(&[range.start..range.end], true, false, cx);
editor.change_selections(Some(Autoscroll::fit()), window, cx, |s| {
s.select_ranges([range.start..range.start]);
});
window.focus(&editor.focus_handle(cx));
}
}


@@ -1,38 +1,39 @@
pub mod items;
mod toolbar_controls;
mod diagnostic_renderer;
#[cfg(test)]
mod diagnostics_tests;
use anyhow::Result;
use collections::{BTreeSet, HashSet};
use collections::{BTreeSet, HashMap};
use diagnostic_renderer::DiagnosticBlock;
use editor::{
Editor, EditorEvent, ExcerptId, ExcerptRange, MultiBuffer, ToOffset, diagnostic_block_renderer,
display_map::{BlockPlacement, BlockProperties, BlockStyle, CustomBlockId, RenderBlock},
highlight_diagnostic_message,
DEFAULT_MULTIBUFFER_CONTEXT, Editor, EditorEvent, ExcerptRange, MultiBuffer, PathKey,
display_map::{BlockPlacement, BlockProperties, BlockStyle, CustomBlockId},
scroll::Autoscroll,
};
use gpui::{
AnyElement, AnyView, App, AsyncApp, Context, Entity, EventEmitter, FocusHandle, Focusable,
Global, HighlightStyle, InteractiveElement, IntoElement, ParentElement, Render, SharedString,
Styled, StyledText, Subscription, Task, WeakEntity, Window, actions, div, svg,
Global, InteractiveElement, IntoElement, ParentElement, Render, SharedString, Styled,
Subscription, Task, WeakEntity, Window, actions, div,
};
use language::{
Bias, Buffer, BufferRow, BufferSnapshot, Diagnostic, DiagnosticEntry, DiagnosticSeverity,
Point, Selection, SelectionGoal, ToTreeSitterPoint,
Bias, Buffer, BufferRow, BufferSnapshot, DiagnosticEntry, Point, ToTreeSitterPoint,
};
use lsp::LanguageServerId;
use lsp::DiagnosticSeverity;
use project::{DiagnosticSummary, Project, ProjectPath, project_settings::ProjectSettings};
use settings::Settings;
use std::{
any::{Any, TypeId},
cmp,
cmp::Ordering,
mem,
ops::{Range, RangeInclusive},
sync::Arc,
time::Duration,
};
use text::{BufferId, OffsetRangeExt};
use theme::ActiveTheme;
pub use toolbar_controls::ToolbarControls;
use ui::{Icon, IconName, Label, h_flex, prelude::*};
@@ -49,41 +50,28 @@ struct IncludeWarnings(bool);
impl Global for IncludeWarnings {}
pub fn init(cx: &mut App) {
editor::set_diagnostic_renderer(diagnostic_renderer::DiagnosticRenderer {}, cx);
cx.observe_new(ProjectDiagnosticsEditor::register).detach();
}
struct ProjectDiagnosticsEditor {
pub(crate) struct ProjectDiagnosticsEditor {
project: Entity<Project>,
workspace: WeakEntity<Workspace>,
focus_handle: FocusHandle,
editor: Entity<Editor>,
diagnostics: HashMap<BufferId, Vec<DiagnosticEntry<text::Anchor>>>,
blocks: HashMap<BufferId, Vec<CustomBlockId>>,
summary: DiagnosticSummary,
excerpts: Entity<MultiBuffer>,
path_states: Vec<PathState>,
paths_to_update: BTreeSet<(ProjectPath, Option<LanguageServerId>)>,
multibuffer: Entity<MultiBuffer>,
paths_to_update: BTreeSet<ProjectPath>,
include_warnings: bool,
context: u32,
update_excerpts_task: Option<Task<Result<()>>>,
_subscription: Subscription,
}
struct PathState {
path: ProjectPath,
diagnostic_groups: Vec<DiagnosticGroupState>,
}
struct DiagnosticGroupState {
language_server_id: LanguageServerId,
primary_diagnostic: DiagnosticEntry<language::Anchor>,
primary_excerpt_ix: usize,
excerpts: Vec<ExcerptId>,
blocks: HashSet<CustomBlockId>,
block_count: usize,
}
impl EventEmitter<EditorEvent> for ProjectDiagnosticsEditor {}
const DIAGNOSTICS_UPDATE_DEBOUNCE: Duration = Duration::from_millis(50);
const DIAGNOSTICS_UPDATE_DELAY: Duration = Duration::from_millis(50);
impl Render for ProjectDiagnosticsEditor {
fn render(&mut self, _: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
@@ -149,8 +137,7 @@ impl ProjectDiagnosticsEditor {
workspace.register_action(Self::deploy);
}
fn new_with_context(
context: u32,
fn new(
include_warnings: bool,
project_handle: Entity<Project>,
workspace: WeakEntity<Workspace>,
@@ -170,8 +157,7 @@ impl ProjectDiagnosticsEditor {
language_server_id,
path,
} => {
this.paths_to_update
.insert((path.clone(), Some(*language_server_id)));
this.paths_to_update.insert(path.clone());
this.summary = project.read(cx).diagnostic_summary(false, cx);
cx.emit(EditorEvent::TitleChanged);
@@ -201,6 +187,7 @@ impl ProjectDiagnosticsEditor {
Editor::for_multibuffer(excerpts.clone(), Some(project_handle.clone()), window, cx);
editor.set_vertical_scroll_margin(5, cx);
editor.disable_inline_diagnostics();
editor.set_all_diagnostics_active(cx);
editor
});
cx.subscribe_in(
@@ -210,7 +197,7 @@ impl ProjectDiagnosticsEditor {
cx.emit(event.clone());
match event {
EditorEvent::Focused => {
if this.path_states.is_empty() {
if this.multibuffer.read(cx).is_empty() {
window.focus(&this.focus_handle);
}
}
@@ -229,14 +216,14 @@ impl ProjectDiagnosticsEditor {
let project = project_handle.read(cx);
let mut this = Self {
project: project_handle.clone(),
context,
summary: project.diagnostic_summary(false, cx),
diagnostics: Default::default(),
blocks: Default::default(),
include_warnings,
workspace,
excerpts,
multibuffer: excerpts,
focus_handle,
editor,
path_states: Default::default(),
paths_to_update: Default::default(),
update_excerpts_task: None,
_subscription: project_event_subscription,
@@ -252,15 +239,15 @@ impl ProjectDiagnosticsEditor {
let project_handle = self.project.clone();
self.update_excerpts_task = Some(cx.spawn_in(window, async move |this, cx| {
cx.background_executor()
.timer(DIAGNOSTICS_UPDATE_DEBOUNCE)
.timer(DIAGNOSTICS_UPDATE_DELAY)
.await;
loop {
let Some((path, language_server_id)) = this.update(cx, |this, _| {
let Some((path, language_server_id)) = this.paths_to_update.pop_first() else {
let Some(path) = this.update(cx, |this, _| {
let Some(path) = this.paths_to_update.pop_first() else {
this.update_excerpts_task.take();
return None;
};
Some((path, language_server_id))
Some(path)
})?
else {
break;
@@ -272,7 +259,7 @@ impl ProjectDiagnosticsEditor {
.log_err()
{
this.update_in(cx, |this, window, cx| {
this.update_excerpts(path, language_server_id, buffer, window, cx)
this.update_excerpts(buffer, window, cx)
})?
.await?;
}
@@ -281,23 +268,6 @@ impl ProjectDiagnosticsEditor {
}));
}
fn new(
project_handle: Entity<Project>,
include_warnings: bool,
workspace: WeakEntity<Workspace>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
Self::new_with_context(
editor::DEFAULT_MULTIBUFFER_CONTEXT,
include_warnings,
project_handle,
workspace,
window,
cx,
)
}
fn deploy(
workspace: &mut Workspace,
_: &Deploy,
@@ -319,8 +289,8 @@ impl ProjectDiagnosticsEditor {
let diagnostics = cx.new(|cx| {
ProjectDiagnosticsEditor::new(
workspace.project().clone(),
include_warnings,
workspace.project().clone(),
workspace_handle,
window,
cx,
@@ -338,7 +308,7 @@ impl ProjectDiagnosticsEditor {
}
fn focus_in(&mut self, window: &mut Window, cx: &mut Context<Self>) {
if self.focus_handle.is_focused(window) && !self.path_states.is_empty() {
if self.focus_handle.is_focused(window) && !self.multibuffer.read(cx).is_empty() {
self.editor.focus_handle(cx).focus(window)
}
}
@@ -356,396 +326,212 @@ impl ProjectDiagnosticsEditor {
self.project.update(cx, |project, cx| {
let mut paths = project
.diagnostic_summaries(false, cx)
.map(|(path, _, _)| (path, None))
.map(|(path, _, _)| path)
.collect::<BTreeSet<_>>();
paths.extend(
self.path_states
.iter()
.map(|state| (state.path.clone(), None)),
);
let paths_to_update = std::mem::take(&mut self.paths_to_update);
paths.extend(paths_to_update.into_iter().map(|(path, _)| (path, None)));
self.multibuffer.update(cx, |multibuffer, cx| {
for buffer in multibuffer.all_buffers() {
if let Some(file) = buffer.read(cx).file() {
paths.insert(ProjectPath {
path: file.path().clone(),
worktree_id: file.worktree_id(cx),
});
}
}
});
self.paths_to_update = paths;
});
self.update_stale_excerpts(window, cx);
}
fn diagnostics_are_unchanged(
&self,
existing: &Vec<DiagnosticEntry<text::Anchor>>,
new: &Vec<DiagnosticEntry<text::Anchor>>,
snapshot: &BufferSnapshot,
) -> bool {
if existing.len() != new.len() {
return false;
}
existing.iter().zip(new.iter()).all(|(existing, new)| {
existing.diagnostic.message == new.diagnostic.message
&& existing.diagnostic.severity == new.diagnostic.severity
&& existing.diagnostic.is_primary == new.diagnostic.is_primary
&& existing.range.to_offset(snapshot) == new.range.to_offset(snapshot)
})
}
fn update_excerpts(
&mut self,
path_to_update: ProjectPath,
server_to_update: Option<LanguageServerId>,
buffer: Entity<Buffer>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
let was_empty = self.path_states.is_empty();
let snapshot = buffer.read(cx).snapshot();
let path_ix = match self
.path_states
.binary_search_by_key(&&path_to_update, |e| &e.path)
{
Ok(ix) => ix,
Err(ix) => {
self.path_states.insert(
ix,
PathState {
path: path_to_update.clone(),
diagnostic_groups: Default::default(),
},
);
ix
}
};
let mut prev_excerpt_id = if path_ix > 0 {
let prev_path_last_group = &self.path_states[path_ix - 1]
.diagnostic_groups
.last()
.unwrap();
*prev_path_last_group.excerpts.last().unwrap()
} else {
ExcerptId::min()
};
let mut new_group_ixs = Vec::new();
let mut blocks_to_add = Vec::new();
let mut blocks_to_remove = HashSet::default();
let mut first_excerpt_id = None;
let was_empty = self.multibuffer.read(cx).is_empty();
let buffer_snapshot = buffer.read(cx).snapshot();
let buffer_id = buffer_snapshot.remote_id();
let max_severity = if self.include_warnings {
DiagnosticSeverity::WARNING
} else {
DiagnosticSeverity::ERROR
};
let excerpts = self.excerpts.clone().downgrade();
let context = self.context;
let editor = self.editor.clone().downgrade();
cx.spawn_in(window, async move |this, cx| {
let mut old_groups = this
.update(cx, |this, _| {
mem::take(&mut this.path_states[path_ix].diagnostic_groups)
})?
.into_iter()
.enumerate()
.peekable();
let mut new_groups = snapshot
.diagnostic_groups(server_to_update)
.into_iter()
.filter(|(_, group)| {
group.entries[group.primary_ix].diagnostic.severity <= max_severity
})
.peekable();
loop {
let mut to_insert = None;
let mut to_remove = None;
let mut to_keep = None;
match (old_groups.peek(), new_groups.peek()) {
(None, None) => break,
(None, Some(_)) => to_insert = new_groups.next(),
(Some((_, old_group)), None) => {
if server_to_update.map_or(true, |id| id == old_group.language_server_id) {
to_remove = old_groups.next();
} else {
to_keep = old_groups.next();
}
}
(Some((_, old_group)), Some((new_language_server_id, new_group))) => {
let old_primary = &old_group.primary_diagnostic;
let new_primary = &new_group.entries[new_group.primary_ix];
match compare_diagnostics(old_primary, new_primary, &snapshot)
.then_with(|| old_group.language_server_id.cmp(new_language_server_id))
{
Ordering::Less => {
if server_to_update
.map_or(true, |id| id == old_group.language_server_id)
{
to_remove = old_groups.next();
} else {
to_keep = old_groups.next();
}
}
Ordering::Equal => {
to_keep = old_groups.next();
new_groups.next();
}
Ordering::Greater => to_insert = new_groups.next(),
}
}
cx.spawn_in(window, async move |this, mut cx| {
let diagnostics = buffer_snapshot
.diagnostics_in_range::<_, text::Anchor>(
Point::zero()..buffer_snapshot.max_point(),
false,
)
.filter(|d| !(d.diagnostic.is_primary && d.diagnostic.is_unnecessary))
.collect::<Vec<_>>();
let unchanged = this.update(cx, |this, _| {
if this.diagnostics.get(&buffer_id).is_some_and(|existing| {
this.diagnostics_are_unchanged(existing, &diagnostics, &buffer_snapshot)
}) {
return true;
}
this.diagnostics.insert(buffer_id, diagnostics.clone());
return false;
})?;
if unchanged {
return Ok(());
}
if let Some((language_server_id, group)) = to_insert {
let mut group_state = DiagnosticGroupState {
language_server_id,
primary_diagnostic: group.entries[group.primary_ix].clone(),
primary_excerpt_ix: 0,
excerpts: Default::default(),
blocks: Default::default(),
block_count: 0,
};
let mut pending_range: Option<(Range<Point>, Range<Point>, usize)> = None;
let mut is_first_excerpt_for_group = true;
for (ix, entry) in group.entries.iter().map(Some).chain([None]).enumerate() {
let resolved_entry = entry.map(|e| e.resolve::<Point>(&snapshot));
let expanded_range = if let Some(entry) = &resolved_entry {
Some(
context_range_for_entry(
entry.range.clone(),
context,
snapshot.clone(),
(**cx).clone(),
)
.await,
)
} else {
None
};
if let Some((range, context_range, start_ix)) = &mut pending_range {
if let Some(expanded_range) = expanded_range.clone() {
// If the entries are overlapping or next to each other, merge them into one excerpt.
if context_range.end.row + 1 >= expanded_range.start.row {
context_range.end = context_range.end.max(expanded_range.end);
continue;
}
let mut grouped: HashMap<usize, Vec<_>> = HashMap::default();
for entry in diagnostics {
grouped
.entry(entry.diagnostic.group_id)
.or_default()
.push(DiagnosticEntry {
range: entry.range.to_point(&buffer_snapshot),
diagnostic: entry.diagnostic,
})
}
let mut blocks: Vec<DiagnosticBlock> = Vec::new();
for (_, group) in grouped {
let group_severity = group.iter().map(|d| d.diagnostic.severity).min();
if group_severity.is_none_or(|s| s > max_severity) {
continue;
}
let more = cx.update(|_, cx| {
crate::diagnostic_renderer::DiagnosticRenderer::diagnostic_blocks_for_group(
group,
buffer_snapshot.remote_id(),
Some(this.clone()),
cx,
)
})?;
for item in more {
let insert_pos = blocks
.binary_search_by(|existing| {
match existing.initial_range.start.cmp(&item.initial_range.start) {
Ordering::Equal => item
.initial_range
.end
.cmp(&existing.initial_range.end)
.reverse(),
other => other,
}
})
.unwrap_or_else(|pos| pos);
let excerpt_id = excerpts.update(cx, |excerpts, cx| {
excerpts
.insert_excerpts_after(
prev_excerpt_id,
buffer.clone(),
[ExcerptRange {
context: context_range.clone(),
primary: range.clone(),
}],
cx,
)
.pop()
.unwrap()
})?;
prev_excerpt_id = excerpt_id;
first_excerpt_id.get_or_insert(prev_excerpt_id);
group_state.excerpts.push(excerpt_id);
let header_position = (excerpt_id, language::Anchor::MIN);
if is_first_excerpt_for_group {
is_first_excerpt_for_group = false;
let mut primary =
group.entries[group.primary_ix].diagnostic.clone();
primary.message =
primary.message.split('\n').next().unwrap().to_string();
group_state.block_count += 1;
blocks_to_add.push(BlockProperties {
placement: BlockPlacement::Above(header_position),
height: Some(2),
style: BlockStyle::Sticky,
render: diagnostic_header_renderer(primary),
priority: 0,
});
}
for entry in &group.entries[*start_ix..ix] {
let mut diagnostic = entry.diagnostic.clone();
if diagnostic.is_primary {
group_state.primary_excerpt_ix = group_state.excerpts.len() - 1;
diagnostic.message =
entry.diagnostic.message.split('\n').skip(1).collect();
}
if !diagnostic.message.is_empty() {
group_state.block_count += 1;
blocks_to_add.push(BlockProperties {
placement: BlockPlacement::Below((
excerpt_id,
entry.range.start,
)),
height: Some(
diagnostic.message.matches('\n').count() as u32 + 1,
),
style: BlockStyle::Fixed,
render: diagnostic_block_renderer(diagnostic, None, true),
priority: 0,
});
}
}
pending_range.take();
}
if let Some(entry) = resolved_entry.as_ref() {
let range = entry.range.clone();
pending_range = Some((range, expanded_range.unwrap(), ix));
}
}
this.update(cx, |this, _| {
new_group_ixs.push(this.path_states[path_ix].diagnostic_groups.len());
this.path_states[path_ix]
.diagnostic_groups
.push(group_state);
})?;
} else if let Some((_, group_state)) = to_remove {
excerpts.update(cx, |excerpts, cx| {
excerpts.remove_excerpts(group_state.excerpts.iter().copied(), cx)
})?;
blocks_to_remove.extend(group_state.blocks.iter().copied());
} else if let Some((_, group_state)) = to_keep {
prev_excerpt_id = *group_state.excerpts.last().unwrap();
first_excerpt_id.get_or_insert(prev_excerpt_id);
this.update(cx, |this, _| {
this.path_states[path_ix]
.diagnostic_groups
.push(group_state)
})?;
blocks.insert(insert_pos, item);
}
}
let excerpts_snapshot = excerpts.update(cx, |excerpts, cx| excerpts.snapshot(cx))?;
editor.update(cx, |editor, cx| {
editor.remove_blocks(blocks_to_remove, None, cx);
let block_ids = editor.insert_blocks(
blocks_to_add.into_iter().flat_map(|block| {
let placement = match block.placement {
BlockPlacement::Above((excerpt_id, text_anchor)) => {
BlockPlacement::Above(
excerpts_snapshot.anchor_in_excerpt(excerpt_id, text_anchor)?,
)
}
BlockPlacement::Below((excerpt_id, text_anchor)) => {
BlockPlacement::Below(
excerpts_snapshot.anchor_in_excerpt(excerpt_id, text_anchor)?,
)
}
BlockPlacement::Replace(_) | BlockPlacement::Near(_) => {
unreachable!(
"no Near/Replace block should have been pushed to blocks_to_add"
)
}
};
Some(BlockProperties {
placement,
height: block.height,
style: block.style,
render: block.render,
priority: 0,
})
}),
Some(Autoscroll::fit()),
cx,
);
let mut block_ids = block_ids.into_iter();
this.update(cx, |this, _| {
for ix in new_group_ixs {
let group_state = &mut this.path_states[path_ix].diagnostic_groups[ix];
group_state.blocks =
block_ids.by_ref().take(group_state.block_count).collect();
}
})?;
Result::<(), anyhow::Error>::Ok(())
})??;
let mut excerpt_ranges: Vec<ExcerptRange<Point>> = Vec::new();
for b in blocks.iter() {
let excerpt_range = context_range_for_entry(
b.initial_range.clone(),
DEFAULT_MULTIBUFFER_CONTEXT,
buffer_snapshot.clone(),
&mut cx,
)
.await;
excerpt_ranges.push(ExcerptRange {
context: excerpt_range,
primary: b.initial_range.clone(),
})
}
this.update_in(cx, |this, window, cx| {
if this.path_states[path_ix].diagnostic_groups.is_empty() {
this.path_states.remove(path_ix);
if let Some(block_ids) = this.blocks.remove(&buffer_id) {
this.editor.update(cx, |editor, cx| {
editor.display_map.update(cx, |display_map, cx| {
display_map.remove_blocks(block_ids.into_iter().collect(), cx)
});
})
}
this.editor.update(cx, |editor, cx| {
let groups;
let mut selections;
let new_excerpt_ids_by_selection_id;
if was_empty {
groups = this.path_states.first()?.diagnostic_groups.as_slice();
new_excerpt_ids_by_selection_id =
[(0, ExcerptId::min())].into_iter().collect();
selections = vec![Selection {
id: 0,
start: 0,
end: 0,
reversed: false,
goal: SelectionGoal::None,
}];
} else {
groups = this.path_states.get(path_ix)?.diagnostic_groups.as_slice();
new_excerpt_ids_by_selection_id =
editor.change_selections(Some(Autoscroll::fit()), window, cx, |s| {
s.refresh()
});
selections = editor.selections.all::<usize>(cx);
}
// If any selection has lost its position, move it to start of the next primary diagnostic.
let snapshot = editor.snapshot(window, cx);
for selection in &mut selections {
if let Some(new_excerpt_id) =
new_excerpt_ids_by_selection_id.get(&selection.id)
{
let group_ix = match groups.binary_search_by(|probe| {
probe
.excerpts
.last()
.unwrap()
.cmp(new_excerpt_id, &snapshot.buffer_snapshot)
}) {
Ok(ix) | Err(ix) => ix,
};
if let Some(group) = groups.get(group_ix) {
if let Some(offset) = excerpts_snapshot
.anchor_in_excerpt(
group.excerpts[group.primary_excerpt_ix],
group.primary_diagnostic.range.start,
)
.map(|anchor| anchor.to_offset(&excerpts_snapshot))
{
selection.start = offset;
selection.end = offset;
}
}
}
}
editor.change_selections(None, window, cx, |s| {
s.select(selections);
});
Some(())
let (anchor_ranges, _) = this.multibuffer.update(cx, |multi_buffer, cx| {
multi_buffer.set_excerpt_ranges_for_path(
PathKey::for_buffer(&buffer, cx),
buffer.clone(),
&buffer_snapshot,
excerpt_ranges,
cx,
)
});
})?;
#[cfg(test)]
let cloned_blocks = blocks.clone();
this.update_in(cx, |this, window, cx| {
if this.path_states.is_empty() {
if this.editor.focus_handle(cx).is_focused(window) {
window.focus(&this.focus_handle);
if was_empty {
if let Some(anchor_range) = anchor_ranges.first() {
let range_to_select = anchor_range.start..anchor_range.start;
this.editor.update(cx, |editor, cx| {
editor.change_selections(Some(Autoscroll::fit()), window, cx, |s| {
s.select_anchor_ranges([range_to_select]);
})
})
}
} else if this.focus_handle.is_focused(window) {
let focus_handle = this.editor.focus_handle(cx);
window.focus(&focus_handle);
}
let editor_blocks =
anchor_ranges
.into_iter()
.zip(blocks.into_iter())
.map(|(anchor, block)| {
let editor = this.editor.downgrade();
BlockProperties {
placement: BlockPlacement::Near(anchor.start),
height: Some(1),
style: BlockStyle::Flex,
render: Arc::new(move |bcx| {
block.render_block(editor.clone(), bcx)
}),
priority: 1,
}
});
let block_ids = this.editor.update(cx, |editor, cx| {
editor.display_map.update(cx, |display_map, cx| {
display_map.insert_blocks(editor_blocks, cx)
})
});
#[cfg(test)]
this.check_invariants(cx);
{
for (block_id, block) in block_ids.iter().zip(cloned_blocks.iter()) {
let markdown = block.markdown.clone();
editor::test::set_block_content_for_tests(
&this.editor,
*block_id,
cx,
move |cx| {
markdown::MarkdownElement::rendered_text(
markdown.clone(),
cx,
editor::hover_markdown_style,
)
},
);
}
}
cx.notify();
this.blocks.insert(buffer_id, block_ids);
cx.notify()
})
})
}
#[cfg(test)]
fn check_invariants(&self, cx: &mut Context<Self>) {
let mut excerpts = Vec::new();
for (id, buffer, _) in self.excerpts.read(cx).snapshot(cx).excerpts() {
if let Some(file) = buffer.file() {
excerpts.push((id, file.path().clone()));
}
}
let mut prev_path = None;
for (_, path) in &excerpts {
if let Some(prev_path) = prev_path {
if path < prev_path {
panic!("excerpts are not sorted by path {:?}", excerpts);
}
}
prev_path = Some(path);
}
}
}
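The `check_invariants` test helper above panics unless the multibuffer's excerpts are sorted by path, i.e. the sequence of paths is non-decreasing. As a minimal standalone sketch (plain `&str` paths instead of the editor's `Arc<Path>` values, with a hypothetical `paths_sorted` name), the same check can be written with `windows(2)`:

```rust
// Hedged sketch: returns true when every adjacent pair of paths is
// in non-decreasing order, mirroring the invariant checked above.
fn paths_sorted(paths: &[&str]) -> bool {
    paths.windows(2).all(|pair| pair[0] <= pair[1])
}

fn main() {
    // Duplicates are allowed; only a strict decrease is a violation.
    assert!(paths_sorted(&["a.rs", "b.rs", "b.rs"]));
    assert!(!paths_sorted(&["b.rs", "a.rs"]));
    println!("ok");
}
```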
impl Focusable for ProjectDiagnosticsEditor {
@@ -857,8 +643,8 @@ impl Item for ProjectDiagnosticsEditor {
{
Some(cx.new(|cx| {
ProjectDiagnosticsEditor::new(
self.project.clone(),
self.include_warnings,
self.project.clone(),
self.workspace.clone(),
window,
cx,
@@ -867,15 +653,15 @@ impl Item for ProjectDiagnosticsEditor {
}
fn is_dirty(&self, cx: &App) -> bool {
self.excerpts.read(cx).is_dirty(cx)
self.multibuffer.read(cx).is_dirty(cx)
}
fn has_deleted_file(&self, cx: &App) -> bool {
self.excerpts.read(cx).has_deleted_file(cx)
self.multibuffer.read(cx).has_deleted_file(cx)
}
fn has_conflict(&self, cx: &App) -> bool {
self.excerpts.read(cx).has_conflict(cx)
self.multibuffer.read(cx).has_conflict(cx)
}
fn can_save(&self, _: &App) -> bool {
@@ -950,128 +736,31 @@ impl Item for ProjectDiagnosticsEditor {
}
}
const DIAGNOSTIC_HEADER: &str = "diagnostic header";
fn diagnostic_header_renderer(diagnostic: Diagnostic) -> RenderBlock {
let (message, code_ranges) = highlight_diagnostic_message(&diagnostic, None);
let message: SharedString = message;
Arc::new(move |cx| {
let color = cx.theme().colors();
let highlight_style: HighlightStyle = color.text_accent.into();
h_flex()
.id(DIAGNOSTIC_HEADER)
.block_mouse_down()
.h(2. * cx.window.line_height())
.w_full()
.px_9()
.justify_between()
.gap_2()
.child(
h_flex()
.gap_2()
.px_1()
.rounded_sm()
.bg(color.surface_background.opacity(0.5))
.map(|stack| {
stack.child(
svg()
.size(cx.window.text_style().font_size)
.flex_none()
.map(|icon| {
if diagnostic.severity == DiagnosticSeverity::ERROR {
icon.path(IconName::XCircle.path())
.text_color(Color::Error.color(cx))
} else {
icon.path(IconName::Warning.path())
.text_color(Color::Warning.color(cx))
}
}),
)
})
.child(
h_flex()
.gap_1()
.child(
StyledText::new(message.clone()).with_default_highlights(
&cx.window.text_style(),
code_ranges
.iter()
.map(|range| (range.clone(), highlight_style)),
),
)
.when_some(diagnostic.code.as_ref(), |stack, code| {
stack.child(
div()
.child(SharedString::from(format!("({code:?})")))
.text_color(color.text_muted),
)
}),
),
)
.when_some(diagnostic.source.as_ref(), |stack, source| {
stack.child(
div()
.child(SharedString::from(source.clone()))
.text_color(color.text_muted),
)
})
.into_any_element()
})
}
fn compare_diagnostics(
old: &DiagnosticEntry<language::Anchor>,
new: &DiagnosticEntry<language::Anchor>,
snapshot: &language::BufferSnapshot,
) -> Ordering {
use language::ToOffset;
// The diagnostics may point to a previously open Buffer for this file.
if !old.range.start.is_valid(snapshot) || !new.range.start.is_valid(snapshot) {
return Ordering::Greater;
}
old.range
.start
.to_offset(snapshot)
.cmp(&new.range.start.to_offset(snapshot))
.then_with(|| {
old.range
.end
.to_offset(snapshot)
.cmp(&new.range.end.to_offset(snapshot))
})
.then_with(|| old.diagnostic.message.cmp(&new.diagnostic.message))
}
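The `compare_diagnostics` function above composes three comparisons lexicographically with `Ordering::then_with`: start offset, then end offset, then message. A hedged standalone illustration of that pattern (hypothetical `(start, end, message)` tuples, not the editor's anchor types):

```rust
use std::cmp::Ordering;

// Hedged sketch: lexicographic comparison via then_with, as in
// compare_diagnostics. Later keys only break ties in earlier ones.
fn compare(a: &(usize, usize, &str), b: &(usize, usize, &str)) -> Ordering {
    a.0.cmp(&b.0)
        .then_with(|| a.1.cmp(&b.1))
        .then_with(|| a.2.cmp(b.2))
}

fn main() {
    // Equal ranges fall through to the message comparison.
    assert_eq!(compare(&(0, 5, "x"), &(0, 5, "y")), Ordering::Less);
    // An earlier start wins regardless of the other keys.
    assert_eq!(compare(&(1, 0, "a"), &(0, 9, "z")), Ordering::Greater);
    println!("ok");
}
```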
const DIAGNOSTIC_EXPANSION_ROW_LIMIT: u32 = 32;
fn context_range_for_entry(
async fn context_range_for_entry(
range: Range<Point>,
context: u32,
snapshot: BufferSnapshot,
cx: AsyncApp,
) -> Task<Range<Point>> {
cx.spawn(async move |cx| {
if let Some(rows) = heuristic_syntactic_expand(
range.clone(),
DIAGNOSTIC_EXPANSION_ROW_LIMIT,
snapshot.clone(),
cx,
)
.await
{
return Range {
start: Point::new(*rows.start(), 0),
end: snapshot.clip_point(Point::new(*rows.end(), u32::MAX), Bias::Left),
};
}
Range {
start: Point::new(range.start.row.saturating_sub(context), 0),
end: snapshot.clip_point(Point::new(range.end.row + context, u32::MAX), Bias::Left),
}
})
cx: &mut AsyncApp,
) -> Range<Point> {
if let Some(rows) = heuristic_syntactic_expand(
range.clone(),
DIAGNOSTIC_EXPANSION_ROW_LIMIT,
snapshot.clone(),
cx,
)
.await
{
return Range {
start: Point::new(*rows.start(), 0),
end: snapshot.clip_point(Point::new(*rows.end(), u32::MAX), Bias::Left),
};
}
Range {
start: Point::new(range.start.row.saturating_sub(context), 0),
end: snapshot.clip_point(Point::new(range.end.row + context, u32::MAX), Bias::Left),
}
}
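When the syntactic heuristic above declines, `context_range_for_entry` falls back to widening the row range by `context` rows on each side, using `saturating_sub` so the start never underflows past row 0 and clipping the end to the buffer. A hedged sketch of that fallback over plain row numbers (hypothetical `expand_rows` helper, not the editor's `Point`/clipping API):

```rust
// Hedged sketch: symmetric row expansion with clamping, mirroring the
// fallback branch of context_range_for_entry.
fn expand_rows(start_row: u32, end_row: u32, context: u32, max_row: u32) -> (u32, u32) {
    // saturating_sub keeps the start at row 0 instead of underflowing;
    // min() stands in for the snapshot's clip_point at the buffer end.
    let new_start = start_row.saturating_sub(context);
    let new_end = (end_row + context).min(max_row);
    (new_start, new_end)
}

fn main() {
    // A diagnostic on rows 1..=2 with 3 rows of context in a 10-row buffer:
    assert_eq!(expand_rows(1, 2, 3, 9), (0, 5));
    // Expansion near the end of the buffer is clipped to the last row:
    assert_eq!(expand_rows(8, 9, 3, 9), (5, 9));
    println!("ok");
}
```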
/// Expands the input range using syntax information from TreeSitter. This expansion will be limited

File diff suppressed because it is too large


@@ -61,7 +61,7 @@ pub struct BlockSnapshot {
}
#[derive(Clone, Copy, Debug, Default, PartialEq, Eq, PartialOrd, Ord, Hash)]
pub struct CustomBlockId(usize);
pub struct CustomBlockId(pub usize);
impl From<CustomBlockId> for ElementId {
fn from(val: CustomBlockId) -> Self {
@@ -89,7 +89,7 @@ pub enum BlockPlacement<T> {
}
impl<T> BlockPlacement<T> {
fn start(&self) -> &T {
pub fn start(&self) -> &T {
match self {
BlockPlacement::Above(position) => position,
BlockPlacement::Below(position) => position,
@@ -187,14 +187,15 @@ impl BlockPlacement<Anchor> {
}
pub struct CustomBlock {
id: CustomBlockId,
placement: BlockPlacement<Anchor>,
height: Option<u32>,
pub id: CustomBlockId,
pub placement: BlockPlacement<Anchor>,
pub height: Option<u32>,
style: BlockStyle,
render: Arc<Mutex<RenderBlock>>,
priority: usize,
}
#[derive(Clone)]
pub struct BlockProperties<P> {
pub placement: BlockPlacement<P>,
// None if the block takes up no space
@@ -686,6 +687,9 @@ impl BlockMap {
rows_before_block = position.0 - new_transforms.summary().input_rows;
}
BlockPlacement::Near(position) | BlockPlacement::Below(position) => {
if position.0 + 1 < new_transforms.summary().input_rows {
continue;
}
rows_before_block = (position.0 + 1) - new_transforms.summary().input_rows;
}
BlockPlacement::Replace(range) => {


@@ -23,7 +23,7 @@ mod element;
mod git;
mod highlight_matching_bracket;
mod hover_links;
mod hover_popover;
pub mod hover_popover;
mod indent_guides;
mod inlay_hint_cache;
pub mod items;
@@ -88,10 +88,9 @@ use gpui::{
ClipboardItem, Context, DispatchPhase, Edges, Entity, EntityInputHandler, EventEmitter,
FocusHandle, FocusOutEvent, Focusable, FontId, FontWeight, Global, HighlightStyle, Hsla,
KeyContext, Modifiers, MouseButton, MouseDownEvent, PaintQuad, ParentElement, Pixels, Render,
SharedString, Size, Stateful, Styled, StyledText, Subscription, Task, TextStyle,
TextStyleRefinement, UTF16Selection, UnderlineStyle, UniformListScrollHandle, WeakEntity,
WeakFocusHandle, Window, div, impl_actions, point, prelude::*, pulsating_between, px, relative,
size,
SharedString, Size, Stateful, Styled, Subscription, Task, TextStyle, TextStyleRefinement,
UTF16Selection, UnderlineStyle, UniformListScrollHandle, WeakEntity, WeakFocusHandle, Window,
div, impl_actions, point, prelude::*, pulsating_between, px, relative, size,
};
use highlight_matching_bracket::refresh_matching_bracket_highlights;
use hover_links::{HoverLink, HoveredLinkState, InlayHighlight, find_file};
@@ -105,7 +104,7 @@ pub use items::MAX_TAB_TITLE_LEN;
use itertools::Itertools;
use language::{
AutoindentMode, BracketMatch, BracketPair, Buffer, Capability, CharKind, CodeLabel,
CursorShape, Diagnostic, DiffOptions, EditPredictionsMode, EditPreview, HighlightedText,
CursorShape, DiagnosticEntry, DiffOptions, EditPredictionsMode, EditPreview, HighlightedText,
IndentKind, IndentSize, Language, OffsetRangeExt, Point, Selection, SelectionGoal, TextObject,
TransactionId, TreeSitterOptions, WordsQuery,
language_settings::{
@@ -143,12 +142,12 @@ use language::BufferSnapshot;
pub use lsp_ext::lsp_tasks;
use movement::TextLayoutDetails;
pub use multi_buffer::{
Anchor, AnchorRangeExt, ExcerptId, ExcerptRange, MultiBuffer, MultiBufferSnapshot, RowInfo,
ToOffset, ToPoint,
Anchor, AnchorRangeExt, ExcerptId, ExcerptRange, MultiBuffer, MultiBufferSnapshot, PathKey,
RowInfo, ToOffset, ToPoint,
};
use multi_buffer::{
ExcerptInfo, ExpandExcerptDirection, MultiBufferDiffHunk, MultiBufferPoint, MultiBufferRow,
MultiOrSingleBufferOffsetRange, PathKey, ToOffsetUtf16,
MultiOrSingleBufferOffsetRange, ToOffsetUtf16,
};
use parking_lot::Mutex;
use project::{
@@ -356,6 +355,24 @@ pub fn set_blame_renderer(renderer: impl BlameRenderer + 'static, cx: &mut App)
cx.set_global(GlobalBlameRenderer(Arc::new(renderer)));
}
pub trait DiagnosticRenderer {
fn render_group(
&self,
diagnostic_group: Vec<DiagnosticEntry<Point>>,
buffer_id: BufferId,
snapshot: EditorSnapshot,
editor: WeakEntity<Editor>,
cx: &mut App,
) -> Vec<BlockProperties<Anchor>>;
}
pub(crate) struct GlobalDiagnosticRenderer(pub Arc<dyn DiagnosticRenderer>);
impl gpui::Global for GlobalDiagnosticRenderer {}
pub fn set_diagnostic_renderer(renderer: impl DiagnosticRenderer + 'static, cx: &mut App) {
cx.set_global(GlobalDiagnosticRenderer(Arc::new(renderer)));
}
pub struct SearchWithinRange;
trait InvalidationRegion {
@@ -701,7 +718,7 @@ pub struct Editor {
snippet_stack: InvalidationStack<SnippetState>,
select_syntax_node_history: SelectSyntaxNodeHistory,
ime_transaction: Option<TransactionId>,
active_diagnostics: Option<ActiveDiagnosticGroup>,
active_diagnostics: ActiveDiagnostic,
show_inline_diagnostics: bool,
inline_diagnostics_update: Task<()>,
inline_diagnostics_enabled: bool,
@@ -1074,12 +1091,19 @@ struct RegisteredInlineCompletionProvider {
}
#[derive(Debug, PartialEq, Eq)]
struct ActiveDiagnosticGroup {
primary_range: Range<Anchor>,
primary_message: String,
group_id: usize,
blocks: HashMap<CustomBlockId, Diagnostic>,
is_valid: bool,
pub struct ActiveDiagnosticGroup {
pub active_range: Range<Anchor>,
pub active_message: String,
pub group_id: usize,
pub blocks: HashSet<CustomBlockId>,
}
#[derive(Debug, PartialEq, Eq)]
#[allow(clippy::large_enum_variant)]
pub(crate) enum ActiveDiagnostic {
None,
All,
Group(ActiveDiagnosticGroup),
}
#[derive(Serialize, Deserialize, Clone, Debug)]
@@ -1475,7 +1499,7 @@ impl Editor {
snippet_stack: Default::default(),
select_syntax_node_history: SelectSyntaxNodeHistory::default(),
ime_transaction: Default::default(),
active_diagnostics: None,
active_diagnostics: ActiveDiagnostic::None,
show_inline_diagnostics: ProjectSettings::get_global(cx).diagnostics.inline.enabled,
inline_diagnostics_update: Task::ready(()),
inline_diagnostics: Vec::new(),
@@ -3076,7 +3100,7 @@ impl Editor {
return true;
}
if self.mode.is_full() && self.active_diagnostics.is_some() {
if self.mode.is_full() && matches!(self.active_diagnostics, ActiveDiagnostic::Group(_)) {
self.dismiss_diagnostics(cx);
return true;
}
@@ -13052,7 +13076,7 @@ impl Editor {
});
}
fn go_to_diagnostic(
pub fn go_to_diagnostic(
&mut self,
_: &GoToDiagnostic,
window: &mut Window,
@@ -13062,7 +13086,7 @@ impl Editor {
self.go_to_diagnostic_impl(Direction::Next, window, cx)
}
fn go_to_prev_diagnostic(
pub fn go_to_prev_diagnostic(
&mut self,
_: &GoToPreviousDiagnostic,
window: &mut Window,
@@ -13080,137 +13104,76 @@ impl Editor {
) {
let buffer = self.buffer.read(cx).snapshot(cx);
let selection = self.selections.newest::<usize>(cx);
// If there is an active Diagnostic Popover, jump to its diagnostic instead.
if direction == Direction::Next {
if let Some(popover) = self.hover_state.diagnostic_popover.as_ref() {
let Some(buffer_id) = popover.local_diagnostic.range.start.buffer_id else {
return;
};
self.activate_diagnostics(
buffer_id,
popover.local_diagnostic.diagnostic.group_id,
window,
cx,
);
if let Some(active_diagnostics) = self.active_diagnostics.as_ref() {
let primary_range_start = active_diagnostics.primary_range.start;
self.change_selections(Some(Autoscroll::fit()), window, cx, |s| {
let mut new_selection = s.newest_anchor().clone();
new_selection.collapse_to(primary_range_start, SelectionGoal::None);
s.select_anchors(vec![new_selection.clone()]);
});
self.refresh_inline_completion(false, true, window, cx);
}
return;
let mut active_group_id = None;
if let ActiveDiagnostic::Group(active_group) = &self.active_diagnostics {
if active_group.active_range.start.to_offset(&buffer) == selection.start {
active_group_id = Some(active_group.group_id);
}
}
let active_group_id = self
.active_diagnostics
.as_ref()
.map(|active_group| active_group.group_id);
let active_primary_range = self.active_diagnostics.as_ref().map(|active_diagnostics| {
active_diagnostics
.primary_range
.to_offset(&buffer)
.to_inclusive()
});
let search_start = if let Some(active_primary_range) = active_primary_range.as_ref() {
if active_primary_range.contains(&selection.head()) {
*active_primary_range.start()
} else {
selection.head()
}
} else {
selection.head()
};
fn filtered(
snapshot: EditorSnapshot,
diagnostics: impl Iterator<Item = DiagnosticEntry<usize>>,
) -> impl Iterator<Item = DiagnosticEntry<usize>> {
diagnostics
.filter(|entry| entry.range.start != entry.range.end)
.filter(|entry| !entry.diagnostic.is_unnecessary)
.filter(move |entry| !snapshot.intersects_fold(entry.range.start))
}
let snapshot = self.snapshot(window, cx);
let primary_diagnostics_before = buffer
.diagnostics_in_range::<usize>(0..search_start)
.filter(|entry| entry.diagnostic.is_primary)
.filter(|entry| entry.range.start != entry.range.end)
.filter(|entry| entry.diagnostic.severity <= DiagnosticSeverity::WARNING)
.filter(|entry| !snapshot.intersects_fold(entry.range.start))
.collect::<Vec<_>>();
let last_same_group_diagnostic_before = active_group_id.and_then(|active_group_id| {
primary_diagnostics_before
.iter()
.position(|entry| entry.diagnostic.group_id == active_group_id)
});
let before = filtered(
snapshot.clone(),
buffer
.diagnostics_in_range(0..selection.start)
.filter(|entry| entry.range.start <= selection.start),
);
let after = filtered(
snapshot,
buffer
.diagnostics_in_range(selection.start..buffer.len())
.filter(|entry| entry.range.start >= selection.start),
);
let primary_diagnostics_after = buffer
.diagnostics_in_range::<usize>(search_start..buffer.len())
.filter(|entry| entry.diagnostic.is_primary)
.filter(|entry| entry.range.start != entry.range.end)
.filter(|entry| entry.diagnostic.severity <= DiagnosticSeverity::WARNING)
.filter(|diagnostic| !snapshot.intersects_fold(diagnostic.range.start))
.collect::<Vec<_>>();
let last_same_group_diagnostic_after = active_group_id.and_then(|active_group_id| {
primary_diagnostics_after
.iter()
.enumerate()
.rev()
.find_map(|(i, entry)| {
if entry.diagnostic.group_id == active_group_id {
Some(i)
} else {
None
let mut found: Option<DiagnosticEntry<usize>> = None;
if direction == Direction::Prev {
'outer: for prev_diagnostics in [before.collect::<Vec<_>>(), after.collect::<Vec<_>>()]
{
for diagnostic in prev_diagnostics.into_iter().rev() {
if diagnostic.range.start != selection.start
|| active_group_id
.is_some_and(|active| diagnostic.diagnostic.group_id < active)
{
found = Some(diagnostic);
break 'outer;
}
})
});
let next_primary_diagnostic = match direction {
Direction::Prev => primary_diagnostics_before
.iter()
.take(last_same_group_diagnostic_before.unwrap_or(usize::MAX))
.rev()
.next(),
Direction::Next => primary_diagnostics_after
.iter()
.skip(
last_same_group_diagnostic_after
.map(|index| index + 1)
.unwrap_or(0),
)
.next(),
};
// Cycle around to the start of the buffer, potentially moving back to the start of
// the currently active diagnostic.
let cycle_around = || match direction {
Direction::Prev => primary_diagnostics_after
.iter()
.rev()
.chain(primary_diagnostics_before.iter().rev())
.next(),
Direction::Next => primary_diagnostics_before
.iter()
.chain(primary_diagnostics_after.iter())
.next(),
};
if let Some((primary_range, group_id)) = next_primary_diagnostic
.or_else(cycle_around)
.map(|entry| (&entry.range, entry.diagnostic.group_id))
{
let Some(buffer_id) = buffer.anchor_after(primary_range.start).buffer_id else {
return;
};
self.activate_diagnostics(buffer_id, group_id, window, cx);
if self.active_diagnostics.is_some() {
self.change_selections(Some(Autoscroll::fit()), window, cx, |s| {
s.select(vec![Selection {
id: selection.id,
start: primary_range.start,
end: primary_range.start,
reversed: false,
goal: SelectionGoal::None,
}]);
});
self.refresh_inline_completion(false, true, window, cx);
}
}
} else {
for diagnostic in after.chain(before) {
if diagnostic.range.start != selection.start
|| active_group_id.is_some_and(|active| diagnostic.diagnostic.group_id > active)
{
found = Some(diagnostic);
break;
}
}
}
let Some(next_diagnostic) = found else {
return;
};
let Some(buffer_id) = buffer.anchor_after(next_diagnostic.range.start).buffer_id else {
return;
};
self.change_selections(Some(Autoscroll::fit()), window, cx, |s| {
s.select_ranges(vec![
next_diagnostic.range.start..next_diagnostic.range.start,
])
});
self.activate_diagnostics(buffer_id, next_diagnostic, window, cx);
self.refresh_inline_completion(false, true, window, cx);
}
fn go_to_next_hunk(&mut self, _: &GoToHunk, window: &mut Window, cx: &mut Context<Self>) {
@@ -14502,110 +14465,91 @@ impl Editor {
}
fn refresh_active_diagnostics(&mut self, cx: &mut Context<Editor>) {
if let Some(active_diagnostics) = self.active_diagnostics.as_mut() {
if let ActiveDiagnostic::Group(active_diagnostics) = &mut self.active_diagnostics {
let buffer = self.buffer.read(cx).snapshot(cx);
let primary_range_start = active_diagnostics.primary_range.start.to_offset(&buffer);
let primary_range_end = active_diagnostics.primary_range.end.to_offset(&buffer);
let primary_range_start = active_diagnostics.active_range.start.to_offset(&buffer);
let primary_range_end = active_diagnostics.active_range.end.to_offset(&buffer);
let is_valid = buffer
.diagnostics_in_range::<usize>(primary_range_start..primary_range_end)
.any(|entry| {
entry.diagnostic.is_primary
&& !entry.range.is_empty()
&& entry.range.start == primary_range_start
&& entry.diagnostic.message == active_diagnostics.primary_message
&& entry.diagnostic.message == active_diagnostics.active_message
});
if is_valid != active_diagnostics.is_valid {
active_diagnostics.is_valid = is_valid;
if is_valid {
let mut new_styles = HashMap::default();
for (block_id, diagnostic) in &active_diagnostics.blocks {
new_styles.insert(
*block_id,
diagnostic_block_renderer(diagnostic.clone(), None, true),
);
}
self.display_map.update(cx, |display_map, _cx| {
display_map.replace_blocks(new_styles);
});
} else {
self.dismiss_diagnostics(cx);
}
if !is_valid {
self.dismiss_diagnostics(cx);
}
}
}
pub fn active_diagnostic_group(&self) -> Option<&ActiveDiagnosticGroup> {
match &self.active_diagnostics {
ActiveDiagnostic::Group(group) => Some(group),
_ => None,
}
}
pub fn set_all_diagnostics_active(&mut self, cx: &mut Context<Self>) {
self.dismiss_diagnostics(cx);
self.active_diagnostics = ActiveDiagnostic::All;
}
fn activate_diagnostics(
&mut self,
buffer_id: BufferId,
group_id: usize,
diagnostic: DiagnosticEntry<usize>,
window: &mut Window,
cx: &mut Context<Self>,
) {
if matches!(self.active_diagnostics, ActiveDiagnostic::All) {
return;
}
self.dismiss_diagnostics(cx);
let snapshot = self.snapshot(window, cx);
self.active_diagnostics = self.display_map.update(cx, |display_map, cx| {
let buffer = self.buffer.read(cx).snapshot(cx);
let Some(diagnostic_renderer) = cx
.try_global::<GlobalDiagnosticRenderer>()
.map(|g| g.0.clone())
else {
return;
};
let buffer = self.buffer.read(cx).snapshot(cx);
let mut primary_range = None;
let mut primary_message = None;
let diagnostic_group = buffer
.diagnostic_group(buffer_id, group_id)
.filter_map(|entry| {
let start = entry.range.start;
let end = entry.range.end;
if snapshot.is_line_folded(MultiBufferRow(start.row))
&& (start.row == end.row
|| snapshot.is_line_folded(MultiBufferRow(end.row)))
{
return None;
}
if entry.diagnostic.is_primary {
primary_range = Some(entry.range.clone());
primary_message = Some(entry.diagnostic.message.clone());
}
Some(entry)
})
.collect::<Vec<_>>();
let primary_range = primary_range?;
let primary_message = primary_message?;
let diagnostic_group = buffer
.diagnostic_group(buffer_id, diagnostic.diagnostic.group_id)
.collect::<Vec<_>>();
let blocks = display_map
.insert_blocks(
diagnostic_group.iter().map(|entry| {
let diagnostic = entry.diagnostic.clone();
let message_height = diagnostic.message.matches('\n').count() as u32 + 1;
BlockProperties {
style: BlockStyle::Fixed,
placement: BlockPlacement::Below(
buffer.anchor_after(entry.range.start),
),
height: Some(message_height),
render: diagnostic_block_renderer(diagnostic, None, true),
priority: 0,
}
}),
cx,
)
.into_iter()
.zip(diagnostic_group.into_iter().map(|entry| entry.diagnostic))
.collect();
let blocks = diagnostic_renderer.render_group(
diagnostic_group,
buffer_id,
snapshot,
cx.weak_entity(),
cx,
);
Some(ActiveDiagnosticGroup {
primary_range: buffer.anchor_before(primary_range.start)
..buffer.anchor_after(primary_range.end),
primary_message,
group_id,
blocks,
is_valid: true,
})
let blocks = self.display_map.update(cx, |display_map, cx| {
display_map.insert_blocks(blocks, cx).into_iter().collect()
});
self.active_diagnostics = ActiveDiagnostic::Group(ActiveDiagnosticGroup {
active_range: buffer.anchor_before(diagnostic.range.start)
..buffer.anchor_after(diagnostic.range.end),
active_message: diagnostic.diagnostic.message.clone(),
group_id: diagnostic.diagnostic.group_id,
blocks,
});
cx.notify();
}
fn dismiss_diagnostics(&mut self, cx: &mut Context<Self>) {
if let Some(active_diagnostic_group) = self.active_diagnostics.take() {
if matches!(self.active_diagnostics, ActiveDiagnostic::All) {
return;
};
let prev = mem::replace(&mut self.active_diagnostics, ActiveDiagnostic::None);
if let ActiveDiagnostic::Group(group) = prev {
self.display_map.update(cx, |display_map, cx| {
display_map.remove_blocks(active_diagnostic_group.blocks.into_keys().collect(), cx);
display_map.remove_blocks(group.blocks, cx);
});
cx.notify();
}
@@ -14658,6 +14602,8 @@ impl Editor {
None
};
self.inline_diagnostics_update = cx.spawn_in(window, async move |editor, cx| {
let editor = editor.upgrade().unwrap();
if let Some(debounce) = debounce {
cx.background_executor().timer(debounce).await;
}
@@ -15230,7 +15176,7 @@ impl Editor {
&mut self,
creases: Vec<Crease<T>>,
auto_scroll: bool,
window: &mut Window,
_window: &mut Window,
cx: &mut Context<Self>,
) {
if creases.is_empty() {
@@ -15255,18 +15201,6 @@ impl Editor {
cx.notify();
if let Some(active_diagnostics) = self.active_diagnostics.take() {
// Clear diagnostics block when folding a range that contains it.
let snapshot = self.snapshot(window, cx);
if snapshot.intersects_fold(active_diagnostics.primary_range.start) {
drop(snapshot);
self.active_diagnostics = Some(active_diagnostics);
self.dismiss_diagnostics(cx);
} else {
self.active_diagnostics = Some(active_diagnostics);
}
}
self.scrollbar_marker_state.dirty = true;
self.folds_did_change(cx);
}
@@ -20120,103 +20054,6 @@ impl InvalidationRegion for SnippetState {
}
}
pub fn diagnostic_block_renderer(
diagnostic: Diagnostic,
max_message_rows: Option<u8>,
allow_closing: bool,
) -> RenderBlock {
let (text_without_backticks, code_ranges) =
highlight_diagnostic_message(&diagnostic, max_message_rows);
Arc::new(move |cx: &mut BlockContext| {
let group_id: SharedString = cx.block_id.to_string().into();
let mut text_style = cx.window.text_style().clone();
text_style.color = diagnostic_style(diagnostic.severity, cx.theme().status());
let theme_settings = ThemeSettings::get_global(cx);
text_style.font_family = theme_settings.buffer_font.family.clone();
text_style.font_style = theme_settings.buffer_font.style;
text_style.font_features = theme_settings.buffer_font.features.clone();
text_style.font_weight = theme_settings.buffer_font.weight;
let multi_line_diagnostic = diagnostic.message.contains('\n');
let buttons = |diagnostic: &Diagnostic| {
if multi_line_diagnostic {
v_flex()
} else {
h_flex()
}
.when(allow_closing, |div| {
div.children(diagnostic.is_primary.then(|| {
IconButton::new("close-block", IconName::XCircle)
.icon_color(Color::Muted)
.size(ButtonSize::Compact)
.style(ButtonStyle::Transparent)
.visible_on_hover(group_id.clone())
.on_click(move |_click, window, cx| {
window.dispatch_action(Box::new(Cancel), cx)
})
.tooltip(|window, cx| {
Tooltip::for_action("Close Diagnostics", &Cancel, window, cx)
})
}))
})
.child(
IconButton::new("copy-block", IconName::Copy)
.icon_color(Color::Muted)
.size(ButtonSize::Compact)
.style(ButtonStyle::Transparent)
.visible_on_hover(group_id.clone())
.on_click({
let message = diagnostic.message.clone();
move |_click, _, cx| {
cx.write_to_clipboard(ClipboardItem::new_string(message.clone()))
}
})
.tooltip(Tooltip::text("Copy diagnostic message")),
)
};
let icon_size = buttons(&diagnostic).into_any_element().layout_as_root(
AvailableSpace::min_size(),
cx.window,
cx.app,
);
h_flex()
.id(cx.block_id)
.group(group_id.clone())
.relative()
.size_full()
.block_mouse_down()
.pl(cx.gutter_dimensions.width)
.w(cx.max_width - cx.gutter_dimensions.full_width())
.child(
div()
.flex()
.w(cx.anchor_x - cx.gutter_dimensions.width - icon_size.width)
.flex_shrink(),
)
.child(buttons(&diagnostic))
.child(div().flex().flex_shrink_0().child(
StyledText::new(text_without_backticks.clone()).with_default_highlights(
&text_style,
code_ranges.iter().map(|range| {
(
range.clone(),
HighlightStyle {
font_weight: Some(FontWeight::BOLD),
..Default::default()
},
)
}),
),
))
.into_any_element()
})
}
fn inline_completion_edit_text(
current_snapshot: &BufferSnapshot,
edits: &[(Range<Anchor>, String)],
@@ -20237,74 +20074,7 @@ fn inline_completion_edit_text(
edit_preview.highlight_edits(current_snapshot, &edits, include_deletions, cx)
}
pub fn highlight_diagnostic_message(
diagnostic: &Diagnostic,
mut max_message_rows: Option<u8>,
) -> (SharedString, Vec<Range<usize>>) {
let mut text_without_backticks = String::new();
let mut code_ranges = Vec::new();
if let Some(source) = &diagnostic.source {
text_without_backticks.push_str(source);
code_ranges.push(0..source.len());
text_without_backticks.push_str(": ");
}
let mut prev_offset = 0;
let mut in_code_block = false;
let has_row_limit = max_message_rows.is_some();
let mut newline_indices = diagnostic
.message
.match_indices('\n')
.filter(|_| has_row_limit)
.map(|(ix, _)| ix)
.fuse()
.peekable();
for (quote_ix, _) in diagnostic
.message
.match_indices('`')
.chain([(diagnostic.message.len(), "")])
{
let mut first_newline_ix = None;
let mut last_newline_ix = None;
while let Some(newline_ix) = newline_indices.peek() {
if *newline_ix < quote_ix {
if first_newline_ix.is_none() {
first_newline_ix = Some(*newline_ix);
}
last_newline_ix = Some(*newline_ix);
if let Some(rows_left) = &mut max_message_rows {
if *rows_left == 0 {
break;
} else {
*rows_left -= 1;
}
}
let _ = newline_indices.next();
} else {
break;
}
}
let prev_len = text_without_backticks.len();
let new_text = &diagnostic.message[prev_offset..first_newline_ix.unwrap_or(quote_ix)];
text_without_backticks.push_str(new_text);
if in_code_block {
code_ranges.push(prev_len..text_without_backticks.len());
}
prev_offset = last_newline_ix.unwrap_or(quote_ix) + 1;
in_code_block = !in_code_block;
if first_newline_ix.map_or(false, |newline_ix| newline_ix < quote_ix) {
text_without_backticks.push_str("...");
break;
}
}
(text_without_backticks.into(), code_ranges)
}
fn diagnostic_style(severity: DiagnosticSeverity, colors: &StatusColors) -> Hsla {
pub fn diagnostic_style(severity: DiagnosticSeverity, colors: &StatusColors) -> Hsla {
match severity {
DiagnosticSeverity::ERROR => colors.error,
DiagnosticSeverity::WARNING => colors.warning,


@@ -12585,276 +12585,6 @@ async fn go_to_prev_overlapping_diagnostic(executor: BackgroundExecutor, cx: &mu
"});
}
#[gpui::test]
async fn cycle_through_same_place_diagnostics(
executor: BackgroundExecutor,
cx: &mut TestAppContext,
) {
init_test(cx, |_| {});
let mut cx = EditorTestContext::new(cx).await;
let lsp_store =
cx.update_editor(|editor, _, cx| editor.project.as_ref().unwrap().read(cx).lsp_store());
cx.set_state(indoc! {"
ˇfn func(abc def: i32) -> u32 {
}
"});
cx.update(|_, cx| {
lsp_store.update(cx, |lsp_store, cx| {
lsp_store
.update_diagnostics(
LanguageServerId(0),
lsp::PublishDiagnosticsParams {
uri: lsp::Url::from_file_path(path!("/root/file")).unwrap(),
version: None,
diagnostics: vec![
lsp::Diagnostic {
range: lsp::Range::new(
lsp::Position::new(0, 11),
lsp::Position::new(0, 12),
),
severity: Some(lsp::DiagnosticSeverity::ERROR),
..Default::default()
},
lsp::Diagnostic {
range: lsp::Range::new(
lsp::Position::new(0, 12),
lsp::Position::new(0, 15),
),
severity: Some(lsp::DiagnosticSeverity::ERROR),
..Default::default()
},
lsp::Diagnostic {
range: lsp::Range::new(
lsp::Position::new(0, 12),
lsp::Position::new(0, 15),
),
severity: Some(lsp::DiagnosticSeverity::ERROR),
..Default::default()
},
lsp::Diagnostic {
range: lsp::Range::new(
lsp::Position::new(0, 25),
lsp::Position::new(0, 28),
),
severity: Some(lsp::DiagnosticSeverity::ERROR),
..Default::default()
},
],
},
&[],
cx,
)
.unwrap()
});
});
executor.run_until_parked();
//// Backward
// Fourth diagnostic
cx.update_editor(|editor, window, cx| {
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc def: i32) -> ˇu32 {
}
"});
// Third diagnostic
cx.update_editor(|editor, window, cx| {
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc ˇdef: i32) -> u32 {
}
"});
// Second diagnostic, same place
cx.update_editor(|editor, window, cx| {
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc ˇdef: i32) -> u32 {
}
"});
// First diagnostic
cx.update_editor(|editor, window, cx| {
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abcˇ def: i32) -> u32 {
}
"});
// Wrapped over, fourth diagnostic
cx.update_editor(|editor, window, cx| {
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc def: i32) -> ˇu32 {
}
"});
cx.update_editor(|editor, window, cx| {
editor.move_to_beginning(&MoveToBeginning, window, cx);
});
cx.assert_editor_state(indoc! {"
ˇfn func(abc def: i32) -> u32 {
}
"});
//// Forward
// First diagnostic
cx.update_editor(|editor, window, cx| {
editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abcˇ def: i32) -> u32 {
}
"});
// Second diagnostic
cx.update_editor(|editor, window, cx| {
editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc ˇdef: i32) -> u32 {
}
"});
// Third diagnostic, same place
cx.update_editor(|editor, window, cx| {
editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc ˇdef: i32) -> u32 {
}
"});
// Fourth diagnostic
cx.update_editor(|editor, window, cx| {
editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc def: i32) -> ˇu32 {
}
"});
// Wrapped around, first diagnostic
cx.update_editor(|editor, window, cx| {
editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abcˇ def: i32) -> u32 {
}
"});
}
#[gpui::test]
async fn active_diagnostics_dismiss_after_invalidation(
executor: BackgroundExecutor,
cx: &mut TestAppContext,
) {
init_test(cx, |_| {});
let mut cx = EditorTestContext::new(cx).await;
let lsp_store =
cx.update_editor(|editor, _, cx| editor.project.as_ref().unwrap().read(cx).lsp_store());
cx.set_state(indoc! {"
ˇfn func(abc def: i32) -> u32 {
}
"});
let message = "Something's wrong!";
cx.update(|_, cx| {
lsp_store.update(cx, |lsp_store, cx| {
lsp_store
.update_diagnostics(
LanguageServerId(0),
lsp::PublishDiagnosticsParams {
uri: lsp::Url::from_file_path(path!("/root/file")).unwrap(),
version: None,
diagnostics: vec![lsp::Diagnostic {
range: lsp::Range::new(
lsp::Position::new(0, 11),
lsp::Position::new(0, 12),
),
severity: Some(lsp::DiagnosticSeverity::ERROR),
message: message.to_string(),
..Default::default()
}],
},
&[],
cx,
)
.unwrap()
});
});
executor.run_until_parked();
cx.update_editor(|editor, window, cx| {
editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
assert_eq!(
editor
.active_diagnostics
.as_ref()
.map(|diagnostics_group| diagnostics_group.primary_message.as_str()),
Some(message),
"Should have a diagnostics group activated"
);
});
cx.assert_editor_state(indoc! {"
fn func(abcˇ def: i32) -> u32 {
}
"});
cx.update(|_, cx| {
lsp_store.update(cx, |lsp_store, cx| {
lsp_store
.update_diagnostics(
LanguageServerId(0),
lsp::PublishDiagnosticsParams {
uri: lsp::Url::from_file_path(path!("/root/file")).unwrap(),
version: None,
diagnostics: Vec::new(),
},
&[],
cx,
)
.unwrap()
});
});
executor.run_until_parked();
cx.update_editor(|editor, _, _| {
assert_eq!(
editor.active_diagnostics, None,
"After no diagnostics set to the editor, no diagnostics should be active"
);
});
cx.assert_editor_state(indoc! {"
fn func(abcˇ def: i32) -> u32 {
}
"});
cx.update_editor(|editor, window, cx| {
editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
assert_eq!(
editor.active_diagnostics, None,
"Should be no diagnostics to go to and activate"
);
});
cx.assert_editor_state(indoc! {"
fn func(abcˇ def: i32) -> u32 {
}
"});
}
#[gpui::test]
async fn test_diagnostics_with_links(cx: &mut TestAppContext) {
init_test(cx, |_| {});


@@ -1,11 +1,11 @@
use crate::{
BlockId, COLUMNAR_SELECTION_MODIFIERS, CURSORS_VISIBLE_FOR, ChunkRendererContext,
ChunkReplacement, ContextMenuPlacement, CursorShape, CustomBlockId, DisplayDiffHunk,
DisplayPoint, DisplayRow, DocumentHighlightRead, DocumentHighlightWrite, EditDisplayMode,
Editor, EditorMode, EditorSettings, EditorSnapshot, EditorStyle, FILE_HEADER_HEIGHT,
FocusedBlock, GutterDimensions, HalfPageDown, HalfPageUp, HandleInput, HoveredCursor,
InlayHintRefreshReason, InlineCompletion, JumpData, LineDown, LineHighlight, LineUp,
MAX_LINE_LEN, MIN_LINE_NUMBER_DIGITS, MULTI_BUFFER_EXCERPT_HEADER_HEIGHT, OpenExcerpts,
ActiveDiagnostic, BlockId, COLUMNAR_SELECTION_MODIFIERS, CURSORS_VISIBLE_FOR,
ChunkRendererContext, ChunkReplacement, ContextMenuPlacement, CursorShape, CustomBlockId,
DisplayDiffHunk, DisplayPoint, DisplayRow, DocumentHighlightRead, DocumentHighlightWrite,
EditDisplayMode, Editor, EditorMode, EditorSettings, EditorSnapshot, EditorStyle,
FILE_HEADER_HEIGHT, FocusedBlock, GutterDimensions, HalfPageDown, HalfPageUp, HandleInput,
HoveredCursor, InlayHintRefreshReason, InlineCompletion, JumpData, LineDown, LineHighlight,
LineUp, MAX_LINE_LEN, MIN_LINE_NUMBER_DIGITS, MULTI_BUFFER_EXCERPT_HEADER_HEIGHT, OpenExcerpts,
PageDown, PageUp, Point, RowExt, RowRangeExt, SelectPhase, SelectedTextHighlight, Selection,
SoftWrap, StickyHeaderExcerpt, ToPoint, ToggleFold,
code_context_menus::{CodeActionsMenu, MENU_ASIDE_MAX_WIDTH, MENU_ASIDE_MIN_WIDTH, MENU_GAP},
@@ -1614,12 +1614,12 @@ impl EditorElement {
project_settings::DiagnosticSeverity::Hint => DiagnosticSeverity::HINT,
});
let active_diagnostics_group = self
.editor
.read(cx)
.active_diagnostics
.as_ref()
.map(|active_diagnostics| active_diagnostics.group_id);
let active_diagnostics_group =
if let ActiveDiagnostic::Group(group) = &self.editor.read(cx).active_diagnostics {
Some(group.group_id)
} else {
None
};
let diagnostics_by_rows = self.editor.update(cx, |editor, cx| {
let snapshot = editor.snapshot(window, cx);
@@ -2643,12 +2643,15 @@ impl EditorElement {
sticky_header_excerpt_id: Option<ExcerptId>,
window: &mut Window,
cx: &mut App,
) -> (AnyElement, Size<Pixels>, DisplayRow, Pixels) {
) -> Option<(AnyElement, Size<Pixels>, DisplayRow, Pixels)> {
let mut x_position = None;
let mut element = match block {
Block::Custom(block) => {
let block_start = block.start().to_point(&snapshot.buffer_snapshot);
let block_end = block.end().to_point(&snapshot.buffer_snapshot);
Block::Custom(custom) => {
let block_start = custom.start().to_point(&snapshot.buffer_snapshot);
let block_end = custom.end().to_point(&snapshot.buffer_snapshot);
if block.place_near() && snapshot.is_line_folded(MultiBufferRow(block_start.row)) {
return None;
}
let align_to = block_start.to_display_point(snapshot);
let x_and_width = |layout: &LineWithInvisibles| {
Some((
@@ -2686,7 +2689,7 @@ impl EditorElement {
div()
.size_full()
.child(block.render(&mut BlockContext {
.child(custom.render(&mut BlockContext {
window,
app: cx,
anchor_x,
@@ -2774,6 +2777,7 @@ impl EditorElement {
} else {
element.layout_as_root(size(available_width, quantized_height.into()), window, cx)
};
let mut element_height_in_lines = ((final_size.height / line_height).ceil() as u32).max(1);
let mut row = block_row_start;
let mut x_offset = px(0.);
@@ -2781,20 +2785,19 @@ impl EditorElement {
if let BlockId::Custom(custom_block_id) = block_id {
if block.has_height() {
let mut element_height_in_lines =
((final_size.height / line_height).ceil() as u32).max(1);
if block.place_near() && element_height_in_lines == 1 {
if block.place_near() {
if let Some((x_target, line_width)) = x_position {
let margin = em_width * 2;
if line_width + final_size.width + margin
< editor_width + gutter_dimensions.full_width()
&& !row_block_types.contains_key(&(row - 1))
&& element_height_in_lines == 1
{
x_offset = line_width + margin;
row = row - 1;
is_block = false;
element_height_in_lines = 0;
row_block_types.insert(row, is_block);
} else {
let max_offset =
editor_width + gutter_dimensions.full_width() - final_size.width;
@@ -2809,9 +2812,11 @@ impl EditorElement {
}
}
}
row_block_types.insert(row, is_block);
for i in 0..element_height_in_lines {
row_block_types.insert(row + i, is_block);
}
(element, final_size, row, x_offset)
Some((element, final_size, row, x_offset))
}
fn render_buffer_header(
@@ -3044,7 +3049,7 @@ impl EditorElement {
focused_block = None;
}
let (element, element_size, row, x_offset) = self.render_block(
if let Some((element, element_size, row, x_offset)) = self.render_block(
block,
AvailableSpace::MinContent,
block_id,
@@ -3067,19 +3072,19 @@ impl EditorElement {
sticky_header_excerpt_id,
window,
cx,
);
fixed_block_max_width = fixed_block_max_width.max(element_size.width + em_width);
blocks.push(BlockLayout {
id: block_id,
x_offset,
row: Some(row),
element,
available_space: size(AvailableSpace::MinContent, element_size.height.into()),
style: BlockStyle::Fixed,
overlaps_gutter: true,
is_buffer_header: block.is_buffer_header(),
});
) {
fixed_block_max_width = fixed_block_max_width.max(element_size.width + em_width);
blocks.push(BlockLayout {
id: block_id,
x_offset,
row: Some(row),
element,
available_space: size(AvailableSpace::MinContent, element_size.height.into()),
style: BlockStyle::Fixed,
overlaps_gutter: true,
is_buffer_header: block.is_buffer_header(),
});
}
}
for (row, block) in non_fixed_blocks {
@@ -3101,7 +3106,7 @@ impl EditorElement {
focused_block = None;
}
let (element, element_size, row, x_offset) = self.render_block(
if let Some((element, element_size, row, x_offset)) = self.render_block(
block,
width,
block_id,
@@ -3124,18 +3129,18 @@ impl EditorElement {
sticky_header_excerpt_id,
window,
cx,
);
blocks.push(BlockLayout {
id: block_id,
x_offset,
row: Some(row),
element,
available_space: size(width, element_size.height.into()),
style,
overlaps_gutter: !block.place_near(),
is_buffer_header: block.is_buffer_header(),
});
) {
blocks.push(BlockLayout {
id: block_id,
x_offset,
row: Some(row),
element,
available_space: size(width, element_size.height.into()),
style,
overlaps_gutter: !block.place_near(),
is_buffer_header: block.is_buffer_header(),
});
}
}
if let Some(focused_block) = focused_block {
@@ -3155,7 +3160,7 @@ impl EditorElement {
BlockStyle::Sticky => AvailableSpace::Definite(hitbox.size.width),
};
let (element, element_size, _, x_offset) = self.render_block(
if let Some((element, element_size, _, x_offset)) = self.render_block(
&block,
width,
focused_block.id,
@@ -3178,18 +3183,18 @@ impl EditorElement {
sticky_header_excerpt_id,
window,
cx,
);
blocks.push(BlockLayout {
id: block.id(),
x_offset,
row: None,
element,
available_space: size(width, element_size.height.into()),
style,
overlaps_gutter: true,
is_buffer_header: block.is_buffer_header(),
});
) {
blocks.push(BlockLayout {
id: block.id(),
x_offset,
row: None,
element,
available_space: size(width, element_size.height.into()),
style,
overlaps_gutter: true,
is_buffer_header: block.is_buffer_header(),
});
}
}
}
}


@@ -1,6 +1,6 @@
use crate::{
Anchor, AnchorRangeExt, DisplayPoint, DisplayRow, Editor, EditorSettings, EditorSnapshot,
Hover,
ActiveDiagnostic, Anchor, AnchorRangeExt, DisplayPoint, DisplayRow, Editor, EditorSettings,
EditorSnapshot, Hover,
display_map::{InlayOffset, ToDisplayPoint, invisibles::is_invisible},
hover_links::{InlayHighlight, RangeInEditor},
scroll::{Autoscroll, ScrollAmount},
@@ -95,7 +95,7 @@ pub fn show_keyboard_hover(
}
pub struct InlayHover {
pub range: InlayHighlight,
pub(crate) range: InlayHighlight,
pub tooltip: HoverBlock,
}
@@ -276,6 +276,12 @@ fn show_hover(
}
let hover_popover_delay = EditorSettings::get_global(cx).hover_popover_delay;
let all_diagnostics_active = editor.active_diagnostics == ActiveDiagnostic::All;
let active_group_id = if let ActiveDiagnostic::Group(group) = &editor.active_diagnostics {
Some(group.group_id)
} else {
None
};
let task = cx.spawn_in(window, async move |this, cx| {
async move {
@@ -302,11 +308,16 @@ fn show_hover(
}
let offset = anchor.to_offset(&snapshot.buffer_snapshot);
let local_diagnostic = snapshot
.buffer_snapshot
.diagnostics_in_range::<usize>(offset..offset)
// Find the entry with the most specific range
.min_by_key(|entry| entry.range.len());
let local_diagnostic = if all_diagnostics_active {
None
} else {
snapshot
.buffer_snapshot
.diagnostics_in_range::<usize>(offset..offset)
.filter(|diagnostic| Some(diagnostic.diagnostic.group_id) != active_group_id)
// Find the entry with the most specific range
.min_by_key(|entry| entry.range.len())
};
let diagnostic_popover = if let Some(local_diagnostic) = local_diagnostic {
let text = match local_diagnostic.diagnostic.source {
@@ -638,6 +649,7 @@ pub fn hover_markdown_style(window: &Window, cx: &App) -> MarkdownStyle {
},
syntax: cx.theme().syntax().clone(),
selection_background_color: { cx.theme().players().local().selection },
height_is_multiple_of_line_height: true,
heading: StyleRefinement::default()
.font_weight(FontWeight::BOLD)
.text_base()
@@ -707,7 +719,7 @@ pub fn open_markdown_url(link: SharedString, window: &mut Window, cx: &mut App)
#[derive(Default)]
pub struct HoverState {
pub info_popovers: Vec<InfoPopover>,
pub(crate) info_popovers: Vec<InfoPopover>,
pub diagnostic_popover: Option<DiagnosticPopover>,
pub triggered_from: Option<Anchor>,
pub info_task: Option<Task<Option<()>>>,


@@ -295,55 +295,6 @@ fn build_context_menu(
ui::ContextMenu::build(window, cx, |menu, _window, cx| {
let menu = menu
.on_blur_subscription(Subscription::new(|| {}))
.when_some(code_action_load_state, |menu, state| {
match state {
CodeActionLoadState::Loading => menu.disabled_action(
"Loading code actions...",
Box::new(ConfirmCodeAction {
item_ix: None,
from_mouse_context_menu: true,
}),
),
CodeActionLoadState::Loaded(actions) => {
if actions.is_empty() {
menu.disabled_action(
"No code actions available",
Box::new(ConfirmCodeAction {
item_ix: None,
from_mouse_context_menu: true,
}),
)
} else {
actions
.iter()
.filter(|action| {
if action
.as_task()
.map(|task| {
matches!(task.task_type(), task::TaskType::Debug(_))
})
.unwrap_or(false)
{
cx.has_flag::<Debugger>()
} else {
true
}
})
.enumerate()
.fold(menu, |menu, (ix, action)| {
menu.action(
action.label(),
Box::new(ConfirmCodeAction {
item_ix: Some(ix),
from_mouse_context_menu: true,
}),
)
})
}
}
}
.separator()
})
.when(evaluate_selection && has_selections, |builder| {
builder
.action("Evaluate Selection", Box::new(DebuggerEvaluateSelectedText))
@@ -390,6 +341,54 @@ fn build_context_menu(
} else {
builder.disabled_action(COPY_PERMALINK_LABEL, Box::new(CopyPermalinkToLine))
}
})
.when_some(code_action_load_state, |menu, state| {
menu.separator().map(|menu| match state {
CodeActionLoadState::Loading => menu.disabled_action(
"Loading code actions...",
Box::new(ConfirmCodeAction {
item_ix: None,
from_mouse_context_menu: true,
}),
),
CodeActionLoadState::Loaded(actions) => {
if actions.is_empty() {
menu.disabled_action(
"No code actions available",
Box::new(ConfirmCodeAction {
item_ix: None,
from_mouse_context_menu: true,
}),
)
} else {
actions
.iter()
.filter(|action| {
if action
.as_task()
.map(|task| {
matches!(task.task_type(), task::TaskType::Debug(_))
})
.unwrap_or(false)
{
cx.has_flag::<Debugger>()
} else {
true
}
})
.enumerate()
.fold(menu, |menu, (ix, action)| {
menu.action(
action.label(),
Box::new(ConfirmCodeAction {
item_ix: Some(ix),
from_mouse_context_menu: true,
}),
)
})
}
}
})
});
match focus {
Some(focus) => menu.context(focus),


@@ -1,18 +1,25 @@
pub mod editor_lsp_test_context;
pub mod editor_test_context;
use std::{rc::Rc, sync::LazyLock};
pub use crate::rust_analyzer_ext::expand_macro_recursively;
use crate::{
DisplayPoint, Editor, EditorMode, FoldPlaceholder, MultiBuffer,
display_map::{DisplayMap, DisplaySnapshot, ToDisplayPoint},
display_map::{
Block, BlockPlacement, CustomBlockId, DisplayMap, DisplayRow, DisplaySnapshot,
ToDisplayPoint,
},
};
use collections::HashMap;
use gpui::{
AppContext as _, Context, Entity, Font, FontFeatures, FontStyle, FontWeight, Pixels, Window,
font,
AppContext as _, Context, Entity, EntityId, Font, FontFeatures, FontStyle, FontWeight, Pixels,
VisualTestContext, Window, font, size,
};
use multi_buffer::ToPoint;
use pretty_assertions::assert_eq;
use project::Project;
use std::sync::LazyLock;
use ui::{App, BorrowAppContext, px};
use util::test::{marked_text_offsets, marked_text_ranges};
#[cfg(test)]
@@ -122,3 +129,126 @@ pub(crate) fn build_editor_with_project(
) -> Editor {
Editor::new(EditorMode::full(), buffer, Some(project), window, cx)
}
#[derive(Default)]
struct TestBlockContent(
HashMap<(EntityId, CustomBlockId), Rc<dyn Fn(&mut VisualTestContext) -> String>>,
);
impl gpui::Global for TestBlockContent {}
pub fn set_block_content_for_tests(
editor: &Entity<Editor>,
id: CustomBlockId,
cx: &mut App,
f: impl Fn(&mut VisualTestContext) -> String + 'static,
) {
cx.update_default_global::<TestBlockContent, _>(|bc, _| {
bc.0.insert((editor.entity_id(), id), Rc::new(f))
});
}
pub fn block_content_for_tests(
editor: &Entity<Editor>,
id: CustomBlockId,
cx: &mut VisualTestContext,
) -> Option<String> {
let f = cx.update(|_, cx| {
cx.default_global::<TestBlockContent>()
.0
.get(&(editor.entity_id(), id))
.cloned()
})?;
Some(f(cx))
}
pub fn editor_content_with_blocks(editor: &Entity<Editor>, cx: &mut VisualTestContext) -> String {
cx.draw(
gpui::Point::default(),
size(px(3000.0), px(3000.0)),
|_, _| editor.clone(),
);
let (snapshot, mut lines, blocks) = editor.update_in(cx, |editor, window, cx| {
let snapshot = editor.snapshot(window, cx);
let text = editor.display_text(cx);
let lines = text.lines().map(|s| s.to_string()).collect::<Vec<String>>();
let blocks = snapshot
.blocks_in_range(DisplayRow(0)..snapshot.max_point().row())
.map(|(row, block)| (row, block.clone()))
.collect::<Vec<_>>();
(snapshot, lines, blocks)
});
for (row, block) in blocks {
match block {
Block::Custom(custom_block) => {
if let BlockPlacement::Near(x) = &custom_block.placement {
if snapshot.intersects_fold(x.to_point(&snapshot.buffer_snapshot)) {
continue;
}
};
let content = block_content_for_tests(&editor, custom_block.id, cx)
.expect("block content not found");
// 2: "related info 1 for diagnostic 0"
if let Some(height) = custom_block.height {
if height == 0 {
lines[row.0 as usize - 1].push_str(" § ");
lines[row.0 as usize - 1].push_str(&content);
} else {
let block_lines = content.lines().collect::<Vec<_>>();
assert_eq!(block_lines.len(), height as usize);
lines[row.0 as usize].push_str("§ ");
lines[row.0 as usize].push_str(block_lines[0].trim_end());
for i in 1..height as usize {
lines[row.0 as usize + i].push_str("§ ");
lines[row.0 as usize + i].push_str(block_lines[i].trim_end());
}
}
}
}
Block::FoldedBuffer {
first_excerpt,
height,
} => {
lines[row.0 as usize].push_str(&cx.update(|_, cx| {
format!(
"§ {}",
first_excerpt
.buffer
.file()
.unwrap()
.file_name(cx)
.to_string_lossy()
)
}));
for row in row.0 + 1..row.0 + height {
lines[row as usize].push_str("§ -----");
}
}
Block::ExcerptBoundary {
excerpt,
height,
starts_new_buffer,
} => {
if starts_new_buffer {
lines[row.0 as usize].push_str(&cx.update(|_, cx| {
format!(
"§ {}",
excerpt
.buffer
.file()
.unwrap()
.file_name(cx)
.to_string_lossy()
)
}));
} else {
lines[row.0 as usize].push_str("§ -----")
}
for row in row.0 + 1..row.0 + height {
lines[row as usize].push_str("§ -----");
}
}
}
}
lines.join("\n")
}


@@ -8,7 +8,6 @@ edition.workspace = true
agent.workspace = true
anyhow.workspace = true
async-watch.workspace = true
assistant_settings.workspace = true
assistant_tool.workspace = true
assistant_tools.workspace = true
chrono.workspace = true


@@ -0,0 +1,3 @@
url = "https://github.com/GyulyVGC/sniffnet.git"
revision = "cfb5b6519bd7838f279e5be9d360445aaffaa647"
language_extension = "rs"


@@ -0,0 +1,16 @@
1. **Protocol Enumeration:** Ensure the `Protocol` enum includes the `ARP` variant and is integrated in `Protocol::ALL`.
2. **Packet Analysis Logic:**
- Properly detect ARP packets within `analyze_headers` and `analyze_network_header`.
- Appropriately extract ARP sender/target IPs based on the protocol (IPv4 or IPv6).
- Track and store ARP operations (Request, Reply) using the `ArpType` enum.
3. **Display & User Interface:**
- Accurately represent ARP packet types in the UI (`connection_details_page.rs`) alongside ICMP types.
- Skip displaying service information for ARP packets in line with ICMP behavior.
4. **Data Struct Enhancements:**
- Update `InfoAddressPortPair` to store and count ARP operation types.
- Ensure filtering and presentation logic uses ARP data correctly.
5. **Default Behaviors:**
- Set default `protocol` in `PacketFiltersFields` to `ARP` for consistency.
6. **Testing:**
- Update unit tests for `Protocol::ALL` and `get_service` to account for ARP behavior.
- Confirm that ARP protocol toggling works properly in the GUI protocol filter handling.
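The criteria above can be sketched as a minimal, dependency-free shape of the expected changes. This is a hypothetical illustration, not sniffnet's actual code: the real `Protocol` enum has more variants, and the real `get_service` consults a port database.

```rust
// Hypothetical sketch of the ARP additions the criteria describe.
// `Protocol`, `ArpType`, and `get_service` mirror the names in the
// list above; the real sniffnet definitions differ in detail.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum Protocol {
    Tcp,
    Udp,
    Icmp,
    Arp,
}

impl Protocol {
    // Criterion 1: ARP must be part of Protocol::ALL.
    pub const ALL: [Protocol; 4] = [
        Protocol::Tcp,
        Protocol::Udp,
        Protocol::Icmp,
        Protocol::Arp,
    ];
}

// Criterion 2: track ARP operations (Request, Reply).
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum ArpType {
    Request,
    Reply,
}

// Criterion 3/6: skip service lookup for ARP (no ports), matching
// the existing ICMP behavior; other protocols would do a port lookup.
pub fn get_service(protocol: Protocol) -> Option<&'static str> {
    match protocol {
        Protocol::Arp | Protocol::Icmp => None,
        _ => Some("lookup-by-port"),
    }
}

fn main() {
    assert!(Protocol::ALL.contains(&Protocol::Arp));
    assert_eq!(get_service(Protocol::Arp), None);
    assert!(get_service(Protocol::Tcp).is_some());
}
```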


@@ -0,0 +1 @@
Add full support for the Address Resolution Protocol (ARP) in the packet sniffer. This includes recognizing ARP packets during packet analysis, displaying ARP operation types in the UI, and updating data structures to track ARP-specific metadata. Integrate ARP into the protocol filtering system, update all relevant UI logic to ensure it handles ARP packets similarly to ICMP, and ensure proper test coverage for all new functionality. Update `Protocol::ALL` to include ARP and skip service detection for ARP packets, as they don't use ports. Finally, ensure the `connection_details_page` displays the ARP operation types with counts, using a `pretty_print_types` method similar to ICMP types.


@@ -0,0 +1,3 @@
url = "https://github.com/swc-project/swc.git"
revision = "787d5fabf410fafe6595ec00c197181b27578cb1"
language_extension = "rs"


@@ -0,0 +1,6 @@
1. The `parse` and `parse_sync` functions must support both `Buffer` and `String` inputs for the `src` parameter, using the `Either` type from `napi` to avoid breaking existing string-based usage while adding buffer support.
2. A helper function `stringify` must handle conversion of `Either<Buffer, String>` to a unified `String` representation internally, ensuring consistent UTF-8 decoding for buffers and direct string passthrough.
3. The TypeScript binding declarations (`binding.d.ts`) must reflect the updated parameter types for `parse` and `parse_sync` to accept `Buffer | string`, ensuring compatibility with JavaScript/TypeScript callers.
4. Unit tests must validate both buffer and string input paths for asynchronous (`parse`) and synchronous (`parse_sync`) APIs, ensuring parity in functionality and output correctness.
5. The `filename` parameter must remain optional but use `FileName::Real` when provided and fall back to `FileName::Anon` if omitted, preserving existing file resolution logic.
6. No regressions in error handling, abort signal support, or serialization/deserialization of `ParseOptions` during the refactor.
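Point 2's `stringify` helper can be sketched in a dependency-free form. In the real swc bindings, `Either` and `Buffer` come from the `napi` crate; the stand-ins below exist only to make the sketch self-contained.

```rust
// Dependency-free sketch of the `stringify` helper described in
// point 2. In swc's bindings `Either` is `napi::Either` and `Buffer`
// is a napi buffer; both are stand-ins here for illustration.
pub enum Either<A, B> {
    A(A),
    B(B),
}

type Buffer = Vec<u8>; // stand-in for the napi Buffer type

// Unify Buffer and String inputs into one String: decode buffers as
// UTF-8 and pass strings through unchanged.
fn stringify(src: Either<Buffer, String>) -> String {
    match src {
        Either::A(buf) => String::from_utf8_lossy(&buf).into_owned(),
        Either::B(s) => s,
    }
}

fn main() {
    assert_eq!(stringify(Either::A(b"let x = 1;".to_vec())), "let x = 1;");
    assert_eq!(stringify(Either::B("let y = 2;".to_string())), "let y = 2;");
}
```

Keeping the conversion in one helper means `parse` and `parse_sync` share identical decoding behavior, which is what point 4's parity tests check.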
