Compare commits


23 Commits

Author SHA1 Message Date
Richard Feldman
0ff95fec62 Add a tooltip to the icon for the language name 2025-05-01 10:58:41 -04:00
Richard Feldman
845cf799c6 Only show icons, not language names, above code blocks 2025-05-01 10:58:41 -04:00
Richard Feldman
22c1090163 Respect cursor_pointer when ButtonLike is disabled 2025-05-01 10:57:44 -04:00
Danilo Leal
122af4fd53 agent: Show nav dropdown close button only on hover (#29732) 2025-05-01 11:21:57 -03:00
Kirill Bulatov
e07ffe7cf1 Allow to fetch cargo diagnostics separately (#29706)
Adjusts the way `cargo` and `rust-analyzer` diagnostics are fetched into
Zed.

Nothing changes for the defaults: in this mode, Zed does nothing but report
file updates, which trigger rust-analyzer's mechanisms:

* generating internal diagnostics, which it can produce on the fly without
taking the cargo lock.
Unfortunately, rust-analyzer does not have that many diagnostics of its own,
and some of them produce false positives compared to rustc's

* running `cargo check --workspace --all-targets` on each file save, taking
the cargo lock.
For large projects like Zed, this might take a while, reducing your ability
to choose how to work with the project: e.g. it's impossible to save multiple
times without long diagnostics refreshes (which may happen automatically on
e.g. focus loss), or to save the project and run it immediately without
waiting for cargo check to finish.

In addition, it's relatively tricky to reconfigure r-a to run a
different command, with different arguments and maybe different env
vars: that would require a language server restart (and a large project
reindex) and fiddling with multiple JSON fields.

The new mode aims to separate cargo diagnostics out into their own loop so
that all of Zed's diagnostics features are still supported.


For that, an extra mode was introduced:

```jsonc
"rust": {
  // When enabled, Zed runs `cargo check --message-format=json`-based commands and
  // collects cargo diagnostics instead of rust-analyzer.
  "fetch_cargo_diagnostics": false,
  // A command override for fetching the cargo diagnostics.
  // First argument is the command, followed by the arguments.
  "diagnostics_fetch_command": [
    "cargo",
    "check",
    "--quiet",
    "--workspace",
    "--message-format=json",
    "--all-targets",
    "--keep-going"
  ],
  // Extra environment variables to pass to the diagnostics fetch command.
  "env": {}
}
```

which shells out to cargo, parses its output, and mixes the results in with
the existing diagnostics:




https://github.com/user-attachments/assets/e986f955-b452-4995-8aac-3049683dd22c
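The flow described above — run cargo, parse its JSON output, mix the results into the existing diagnostics — can be sketched as follows. This is an illustrative Python analogue only (the actual implementation is Rust and, judging by the Cargo.lock changes in this PR, uses the `cargo_metadata` crate); `parse_cargo_diagnostics` and `sample` are hypothetical names:

```python
import json

def parse_cargo_diagnostics(output: str):
    """Extract rustc diagnostics from `cargo check --message-format=json` output.

    Cargo emits one JSON object per line; diagnostics arrive as
    `compiler-message` records carrying a nested rustc `message`.
    """
    diagnostics = []
    for line in output.splitlines():
        if not line.strip():
            continue
        record = json.loads(line)
        if record.get("reason") != "compiler-message":
            continue  # skip build-script and artifact records
        message = record["message"]
        diagnostics.append({
            "level": message.get("level"),
            "text": message.get("rendered") or message.get("message"),
        })
    return diagnostics

# Two sample records: an artifact (ignored) and a warning (collected).
sample = "\n".join([
    '{"reason": "compiler-artifact", "target": {"name": "zed"}}',
    '{"reason": "compiler-message", "message": {"level": "warning",'
    ' "message": "unused variable: `x`",'
    ' "rendered": "warning: unused variable: `x`"}}',
])
print(parse_cargo_diagnostics(sample))
```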




Release Notes:

- Added a way to get diagnostics from cargo and rust-analyzer without
mutually locking each other
- Added `ctrl-r` binding to refresh diagnostics in the project
diagnostics editor context
2025-05-01 11:25:52 +03:00
Finn Evers
5e4be013af zed: Fix application menu capitalization (#29722)
This PR is a quick follow-up to #29717 to ensure that the action within
the app menu has the same capitalization as in the context menu.

Release Notes:

- N/A
2025-05-01 08:19:02 +00:00
Aaron Feickert
f055dca592 editor: Fix context menu capitalization (#29717)
Fixes context menu capitalization for consistency.

Release Notes:

- N/A
2025-05-01 09:58:08 +03:00
Richard Feldman
5872276511 Re-enable open tool (#29707)
Release Notes:

- Added `open` tool for opening files or URLs.
2025-04-30 22:33:52 -04:00
Bennet Bo Fenner
1bf9e15f26 agent: Allow adding/removing context when editing existing messages (#29698)
Release Notes:

- agent: Support adding/removing context when editing existing messages

---------

Co-authored-by: Cole Miller <m@cole-miller.net>
Co-authored-by: Cole Miller <cole@zed.dev>
2025-05-01 01:39:34 +00:00
Marshall Bowers
f046d70625 collab: Look up Stripe prices with lookup keys (#29715)
This PR makes it so we look up Stripe prices via lookup keys instead of
passing in the price IDs as environment variables.

Release Notes:

- N/A
2025-05-01 00:26:31 +00:00
Richard Feldman
afeb3d4fd9 Make eval more resilient to bad input from LLM (#29703)
I saw a slice panic (for begin > end) in a debug build of the eval. This
should just be a failed assertion, not a panic that takes out the whole
eval run!

Release Notes:

- N/A
2025-04-30 18:13:45 -04:00
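The guard this commit describes — turning an inverted `begin > end` range into a failed case rather than a run-ending panic — can be sketched generically. This is a hypothetical Python analogue (`checked_slice` is not the eval's actual code; in Rust, `&text[begin..end]` panics on an inverted range):

```python
def checked_slice(text: str, begin: int, end: int):
    """Return text[begin:end], or None when the model supplies an
    inverted or out-of-bounds range.

    Treating bad LLM input as a failed case instead of asserting/panicking
    keeps one bad sample from taking out the whole run.
    """
    if begin > end or begin < 0 or end > len(text):
        return None
    return text[begin:end]
```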
Richard Feldman
92dd6b67c7 Fix potential subtraction overflow (#29697)
I saw this come up in an eval, where the LLM provided a start line of 0.

Release Notes:

- N/A
2025-04-30 18:13:37 -04:00
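A likely shape of this class of bug (hypothetical sketch, not the actual patch): converting a 1-based line number to a 0-based index computes `line - 1`, which underflows a Rust `usize` when the input is 0, as it was here. Clamping before subtracting avoids it:

```python
def to_zero_based(line_number: int) -> int:
    """Convert a 1-based line number to a 0-based index.

    Clamps invalid input such as 0 (seen from the LLM in this commit)
    instead of computing 0 - 1, which would underflow a Rust `usize`.
    """
    return max(line_number, 1) - 1
```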
Cole Miller
38ede4bae3 Fix parsing of author name in git show output (#29704)
Closes #ISSUE

Release Notes:

- Fixed a bug causing incorrect formatting of git commit tooltips
2025-04-30 20:54:53 +00:00
Ben Kunkle
fc920bf63d Improve behavior around word-based brackets in bash (#29700)
Closes #28414

Makes it so that `do`, `then`, `done`, `else`, etc. are treated as
brackets in bash. They are not auto-closed *yet* as that requires
additional work to function properly, however they can now be toggled
between using `%` in vim. Additionally, newlines are inserted like they
are with regular brackets (`{}()[]""''`) when hitting enter between
them.

While `if <-> fi`, `while/for <-> done`, and `case <-> esac` are the
*logical* matching pairs, I've opted to instead match `then <->
else/elif/fi`, `do <-> done`, and `in <-> esac`, as these are the pairs that
delimit the sub-scope and are more similar to the `{}`-style bracket pairs
than `if <-> }` would be in a C-like syntax. This does cause some weird
behavior with `else` in `if` expressions, as it matches both the previous
`then` and the following `fi`, so in this case

```bash
if true; then
   foo
else
   bar
f|i
```

after hitting `%` twice (where the cursor is `|`), the cursor will end up on
the `then` instead of back on the `fi`, since hitting `%` on the `else` will
*always* navigate up to the `then`.

Release Notes:

- vim: Improved behavior around word-based delimiters in bash (`do <->
done`, `then <-> fi`, etc.) so they can be jumped between using `%`
2025-04-30 19:57:29 +00:00
Richard Feldman
04c68dc0cf Make the default repetitions be 8, and concurrency 4 (#29576)
This is based on having observed a lot of variation between runs at `n=1`
and `n=3`.

* With `n=8`, two runs on the same branch give answers that seem close
enough to be reasonably consistent.
* With higher concurrency, running this many repetitions seems to make
language servers time out a lot, causing evals to fail.

Release Notes:

- N/A
2025-04-30 15:21:19 -04:00
Marshall Bowers
399eced884 collab: Return current usage by model from GET /billing/usage (#29693)
This PR updates the `GET /billing/usage` endpoint to return the number
of requests made to each model and mode.

Release Notes:

- N/A
2025-04-30 19:06:39 +00:00
Richard Feldman
50f705e779 Use outline (#29687)
## Before

![Screenshot 2025-04-30 at 10 56 36 AM](https://github.com/user-attachments/assets/3a435f4c-ad45-4f26-a847-2d5c9d03648e)

## After

![Screenshot 2025-04-30 at 10 55 27 AM](https://github.com/user-attachments/assets/cc3a8144-b6fe-4a15-8a47-b2487ce4f66e)

Release Notes:

- Context picker and `@`-mentions now work with very large files.
2025-04-30 18:00:00 +00:00
Ben Kunkle
8173534ad5 zed: Reinstate default file_scan_exclusions in Zed repo project settings (#29690)
Closes #ISSUE

Re-adds default `file_scan_exclusions` to [project
settings](84e4891d54/.zed/settings.json)
that were overridden in #29106

Release Notes:

- N/A
2025-04-30 17:50:56 +00:00
Nate Butler
8c03934b26 welcome: Theme preview tile (#29689)
![CleanShot 2025-04-30 at 13 26 44@2x](https://github.com/user-attachments/assets/f68fefe2-84a1-48b7-b9a2-47c2547cd06b)


- Adds the ThemePreviewTile component, used for upcoming onboarding UI
- Adds the CornerSolver utility for resolving correct nested corner
radii

Release Notes:

- N/A
2025-04-30 17:46:11 +00:00
Patrick
84e4891d54 file_finder: Add skip_focus_for_active_in_search setting (#27624)
Closes #27073

Currently, when searching for a file with Ctrl+P, if the first file found is
the active one, the file finder automatically skips focus to the second file.
This PR adds a setting to disable this behavior and make the first file
always the focused one.

The default setting still skips the active file.

Release Notes: 

- Added the `skip_focus_for_active_in_search` setting for the file
finder, which allows turning off the default behavior of skipping focus
on the active file while searching in the file finder.

---------

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
2025-04-30 22:58:33 +05:30
Ben Kunkle
d03d8ccec1 python: Fix identification of runnable tests within decorated test classes (#29688)
Closes #29486

Release Notes:

- python: Fixed identification of runnable test functions within
decorated pytest classes
2025-04-30 17:26:30 +00:00
Joseph T. Lyons
4d934f2884 Bump Zed to v0.186 (#29680)
Release Notes:

- N/A
2025-04-30 12:52:25 -04:00
Smit Barmase
e697cf9747 editor: Fix edit range for linked edits on do completion (#29650)
Closes #29544

Fixes an issue where accepting an HTML completion would correctly edit
the start tag but incorrectly update the end tag due to incorrect linked
edit ranges.

I want to handle the multi-cursor case (as it barely works now), but it
seems like this should go in first, since that might need a whole
`do_completions` overhaul.

Todo:
- [x] Tests for completion accept on linked edits

Before:


https://github.com/user-attachments/assets/917f8d2a-4a0f-46e8-a004-675fde55fe3d

After:


https://github.com/user-attachments/assets/84b760b6-a5b9-45c4-85d8-b5dccf97775f

Release Notes:

- Fixes an issue where accepting an HTML completion would correctly edit
the start tag but incorrectly update the end tag.
2025-04-30 21:44:20 +05:30
67 changed files with 3184 additions and 616 deletions

View File

@@ -69,7 +69,7 @@ jobs:
run: cargo build --package=eval
- name: Run eval
- run: cargo run --package=eval -- --repetitions=3 --concurrency=1
+ run: cargo run --package=eval -- --repetitions=8 --concurrency=1
# Even the Linux runner is not stateful, in theory there is no need to do this cleanup.
# But, to avoid potential issues in the future if we choose to use a stateful Linux runner and forget to add code

View File

@@ -46,5 +46,17 @@
"formatter": "auto",
"remove_trailing_whitespace_on_save": true,
"ensure_final_newline_on_save": true,
- "file_scan_exclusions": ["crates/eval/worktrees/", "crates/eval/repos/"]
+ "file_scan_exclusions": [
+ "crates/eval/worktrees/",
+ "crates/eval/repos/",
+ "**/.git",
+ "**/.svn",
+ "**/.hg",
+ "**/.jj",
+ "**/CVS",
+ "**/.DS_Store",
+ "**/Thumbs.db",
+ "**/.classpath",
+ "**/.settings"
+ ]
}

Cargo.lock (generated)
View File

@@ -690,6 +690,7 @@ dependencies = [
"pretty_assertions",
"project",
"rand 0.8.5",
"regex",
"serde",
"serde_json",
"settings",
@@ -731,6 +732,7 @@ dependencies = [
"serde_json",
"settings",
"task",
"tempfile",
"terminal",
"terminal_view",
"tree-sitter-rust",
@@ -4363,14 +4365,17 @@ name = "diagnostics"
version = "0.1.0"
dependencies = [
"anyhow",
"cargo_metadata",
"client",
"collections",
"component",
"ctor",
"editor",
"env_logger 0.11.8",
"futures 0.3.31",
"gpui",
"indoc",
"itertools 0.14.0",
"language",
"linkme",
"log",
@@ -4382,6 +4387,7 @@ dependencies = [
"serde",
"serde_json",
"settings",
"smol",
"text",
"theme",
"ui",
@@ -16878,18 +16884,22 @@ version = "0.1.0"
dependencies = [
"anyhow",
"client",
"component",
"db",
"documented",
"editor",
"fuzzy",
"gpui",
"install_cli",
"language",
"linkme",
"picker",
"project",
"schemars",
"serde",
"settings",
"telemetry",
"theme",
"ui",
"util",
"vim_mode_setting",
@@ -18465,7 +18475,7 @@ dependencies = [
[[package]]
name = "zed"
- version = "0.185.0"
+ version = "0.186.0"
dependencies = [
"activity_indicator",
"agent",

View File

@@ -435,6 +435,7 @@ dap-types = { git = "https://github.com/zed-industries/dap-types", rev = "be69a0
dashmap = "6.0"
derive_more = "0.99.17"
dirs = "4.0"
documented = "0.9.1"
dotenv = "0.15.0"
ec4rs = "1.1"
emojis = "0.6.1"
@@ -797,5 +798,6 @@ ignored = [
"serde",
"component",
"linkme",
"documented",
"workspace-hack",
]

View File

@@ -962,5 +962,12 @@
"bindings": {
"escape": "menu::Cancel"
}
},
{
"context": "Diagnostics",
"use_key_equivalents": true,
"bindings": {
"ctrl-r": "diagnostics::ToggleDiagnosticsRefresh"
}
}
]

View File

@@ -1068,5 +1068,12 @@
"bindings": {
"escape": "menu::Cancel"
}
},
{
"context": "Diagnostics",
"use_key_equivalents": true,
"bindings": {
"ctrl-r": "diagnostics::ToggleDiagnosticsRefresh"
}
}
]

View File

@@ -671,6 +671,7 @@
"now": true,
"find_path": true,
"read_file": true,
"open": true,
"grep": true,
"thinking": true,
"web_search": true
@@ -834,7 +835,20 @@
// "modal_max_width": "full"
//
// Default: small
- "modal_max_width": "small"
+ "modal_max_width": "small",
+ // Determines whether the file finder should skip focus for the active file in search results.
+ // There are 2 possible values:
+ //
+ // 1. true: When searching for files, if the currently active file appears as the first result,
+ // auto-focus will skip it and focus the second result instead.
+ // "skip_focus_for_active_in_search": true
+ //
+ // 2. false: When searching for files, the first result will always receive focus,
+ // even if it's the currently active file.
+ // "skip_focus_for_active_in_search": false
+ //
+ // Default: true
+ "skip_focus_for_active_in_search": true
},
// Whether or not to remove any trailing whitespace from lines of a buffer
// before saving it.
@@ -917,6 +931,24 @@
// The minimum severity of the diagnostics to show inline.
// Shows all diagnostics when not specified.
"max_severity": null
},
"rust": {
// When enabled, Zed runs `cargo check --message-format=json`-based commands and
// collects cargo diagnostics instead of rust-analyzer.
"fetch_cargo_diagnostics": false,
// A command override for fetching the cargo diagnostics.
// First argument is the command, followed by the arguments.
"diagnostics_fetch_command": [
"cargo",
"check",
"--quiet",
"--workspace",
"--message-format=json",
"--all-targets",
"--keep-going"
],
// Extra environment variables to pass to the diagnostics fetch command.
"env": {}
}
},
// Files or globs of files that will be excluded by Zed entirely. They will be skipped during file

View File

@@ -1,5 +1,7 @@
use crate::context::{AgentContextHandle, RULES_ICON};
use crate::context_picker::MentionLink;
use crate::context_picker::{ContextPicker, MentionLink};
use crate::context_store::ContextStore;
use crate::context_strip::{ContextStrip, ContextStripEvent, SuggestContextKind};
use crate::thread::{
LastRestoreCheckpoint, MessageId, MessageSegment, Thread, ThreadError, ThreadEvent,
ThreadFeedback,
@@ -14,14 +16,16 @@ use anyhow::Context as _;
use assistant_settings::{AssistantSettings, NotifyWhenAgentWaiting};
use assistant_tool::ToolUseStatus;
use collections::{HashMap, HashSet};
use editor::actions::{MoveUp, Paste};
use editor::scroll::Autoscroll;
use editor::{Editor, EditorElement, EditorEvent, EditorStyle, MultiBuffer};
use gpui::{
AbsoluteLength, Animation, AnimationExt, AnyElement, App, ClickEvent, ClipboardItem,
DefiniteLength, EdgesRefinement, Empty, Entity, EventEmitter, Focusable, Hsla, ListAlignment,
ListState, MouseButton, PlatformDisplay, ScrollHandle, Stateful, StyleRefinement, Subscription,
Task, TextStyle, TextStyleRefinement, Transformation, UnderlineStyle, WeakEntity, WindowHandle,
linear_color_stop, linear_gradient, list, percentage, pulsating_between,
AbsoluteLength, Animation, AnimationExt, AnyElement, App, ClickEvent, ClipboardEntry,
ClipboardItem, CursorStyle, DefiniteLength, EdgesRefinement, Empty, Entity, EventEmitter,
Focusable, Hsla, ListAlignment, ListState, MouseButton, PlatformDisplay, ScrollHandle,
Stateful, StyleRefinement, Subscription, Task, TextStyle, TextStyleRefinement, Transformation,
UnderlineStyle, WeakEntity, WindowHandle, linear_color_stop, linear_gradient, list, percentage,
pulsating_between,
};
use language::{Buffer, Language, LanguageRegistry};
use language_model::{
@@ -41,7 +45,8 @@ use std::time::Duration;
use text::ToPoint;
use theme::ThemeSettings;
use ui::{
Disclosure, IconButton, KeyBinding, Scrollbar, ScrollbarState, TextSize, Tooltip, prelude::*,
ButtonLike, Disclosure, IconButton, KeyBinding, PopoverMenuHandle, Scrollbar, ScrollbarState,
TextSize, Tooltip, prelude::*,
};
use util::ResultExt as _;
use util::markdown::MarkdownCodeBlock;
@@ -49,6 +54,7 @@ use workspace::Workspace;
use zed_actions::assistant::OpenRulesLibrary;
pub struct ActiveThread {
context_store: Entity<ContextStore>,
language_registry: Arc<LanguageRegistry>,
thread_store: Entity<ThreadStore>,
thread: Entity<Thread>,
@@ -61,7 +67,7 @@ pub struct ActiveThread {
hide_scrollbar_task: Option<Task<()>>,
rendered_messages_by_id: HashMap<MessageId, RenderedMessage>,
rendered_tool_uses: HashMap<LanguageModelToolUseId, RenderedToolUse>,
editing_message: Option<(MessageId, EditMessageState)>,
editing_message: Option<(MessageId, EditingMessageState)>,
expanded_tool_uses: HashMap<LanguageModelToolUseId, bool>,
expanded_thinking_segments: HashMap<(MessageId, usize), bool>,
expanded_code_blocks: HashMap<(MessageId, usize), bool>,
@@ -72,6 +78,7 @@ pub struct ActiveThread {
_subscriptions: Vec<Subscription>,
notification_subscriptions: HashMap<WindowHandle<AgentNotification>, Vec<Subscription>>,
open_feedback_editors: HashMap<MessageId, Entity<Editor>>,
_load_edited_message_context_task: Option<Task<()>>,
}
struct RenderedMessage {
@@ -353,25 +360,42 @@ fn render_markdown_code_block(
cx,
)),
CodeBlockKind::FencedSrc(path_range) => path_range.path.file_name().map(|file_name| {
let language = parsed_markdown
.languages_by_path
.get(&path_range.path)
.or_else(|| {
path_range
.path
.extension()
.and_then(OsStr::to_str)
.and_then(|str| {
let ext = SharedString::new(str.to_string());
parsed_markdown.languages_by_name.get(&ext)
})
});
// We tell the model to use /dev/null for the path instead of using ```language
// because otherwise it consistently fails to use code citations.
if path_range.path.starts_with("/dev/null") {
let ext = path_range
.path
.extension()
.and_then(OsStr::to_str)
.map(|str| SharedString::new(str.to_string()))
.unwrap_or_default();
let icon = language.and_then(|language| {
language
.config()
.matcher
.path_suffixes
.iter()
.find_map(|extension| {
file_icons::FileIcons::get_icon(Path::new(extension), cx)
})
.map(|icon_path| {
code_block_icon(ix, icon_path, Some(language.name().into()))
})
});
render_code_language(
parsed_markdown
.languages_by_path
.get(&path_range.path)
.or_else(|| parsed_markdown.languages_by_name.get(&ext)),
ext,
cx,
)
div().children(icon).into_any_element()
} else {
let icon = file_icons::FileIcons::get_icon(&path_range.path, cx).map(|icon_path| {
code_block_icon(ix, icon_path, language.map(|lang| lang.name().into()))
});
let content = if let Some(parent) = path_range.path.parent() {
h_flex()
.ml_1()
@@ -404,19 +428,11 @@ fn render_markdown_code_block(
.hover(|item| item.bg(cx.theme().colors().element_hover.opacity(0.5)))
.tooltip(Tooltip::text("Jump to File"))
.child(
h_flex()
.gap_0p5()
.children(
file_icons::FileIcons::get_icon(&path_range.path, cx)
.map(Icon::from_path)
.map(|icon| icon.color(Color::Muted).size(IconSize::XSmall)),
)
.child(content)
.child(
Icon::new(IconName::ArrowUpRight)
.size(IconSize::XSmall)
.color(Color::Ignored),
),
h_flex().gap_0p5().children(icon).child(content).child(
Icon::new(IconName::ArrowUpRight)
.size(IconSize::XSmall)
.color(Color::Ignored),
),
)
.on_click({
let path_range = path_range.clone();
@@ -605,6 +621,26 @@ fn render_markdown_code_block(
)
}
fn code_block_icon(
ix: usize,
icon_path: SharedString,
tooltip: Option<SharedString>,
) -> ButtonLike {
let without_tooltip = ButtonLike::new(("code_block_icon", ix))
.disabled(true)
.cursor_style(CursorStyle::Arrow)
.child(
Icon::from_path(icon_path)
.color(Color::Muted)
.size(IconSize::XSmall),
);
match tooltip {
Some(tooltip) => without_tooltip.tooltip(Tooltip::text(tooltip)),
None => without_tooltip,
}
}
fn render_code_language(
language: Option<&Arc<Language>>,
name_fallback: SharedString,
@@ -725,10 +761,12 @@ fn open_markdown_link(
}
}
struct EditMessageState {
struct EditingMessageState {
editor: Entity<Editor>,
context_strip: Entity<ContextStrip>,
context_picker_menu_handle: PopoverMenuHandle<ContextPicker>,
last_estimated_token_count: Option<usize>,
_subscription: Subscription,
_subscriptions: [Subscription; 2],
_update_token_count_task: Option<Task<()>>,
}
@@ -736,6 +774,7 @@ impl ActiveThread {
pub fn new(
thread: Entity<Thread>,
thread_store: Entity<ThreadStore>,
context_store: Entity<ContextStore>,
language_registry: Arc<LanguageRegistry>,
workspace: WeakEntity<Workspace>,
window: &mut Window,
@@ -758,6 +797,7 @@ impl ActiveThread {
let mut this = Self {
language_registry,
thread_store,
context_store,
thread: thread.clone(),
workspace,
save_thread_task: None,
@@ -779,6 +819,7 @@ impl ActiveThread {
_subscriptions: subscriptions,
notification_subscriptions: HashMap::default(),
open_feedback_editors: HashMap::default(),
_load_edited_message_context_task: None,
};
for message in thread.read(cx).messages().cloned().collect::<Vec<_>>() {
@@ -1237,33 +1278,49 @@ impl ActiveThread {
return;
};
let buffer = cx.new(|cx| {
MultiBuffer::singleton(cx.new(|cx| Buffer::local(message_text.clone(), cx)), cx)
});
let editor = cx.new(|cx| {
let mut editor = Editor::new(
editor::EditorMode::AutoHeight { max_lines: 8 },
buffer,
None,
window,
cx,
);
let editor = crate::message_editor::create_editor(
self.workspace.clone(),
self.context_store.downgrade(),
self.thread_store.downgrade(),
window,
cx,
);
editor.update(cx, |editor, cx| {
editor.set_text(message_text.clone(), window, cx);
editor.focus_handle(cx).focus(window);
editor.move_to_end(&editor::actions::MoveToEnd, window, cx);
editor
});
let subscription = cx.subscribe(&editor, |this, _, event, cx| match event {
let buffer_edited_subscription = cx.subscribe(&editor, |this, _, event, cx| match event {
EditorEvent::BufferEdited => {
this.update_editing_message_token_count(true, cx);
}
_ => {}
});
let context_picker_menu_handle = PopoverMenuHandle::default();
let context_strip = cx.new(|cx| {
ContextStrip::new(
self.context_store.clone(),
self.workspace.clone(),
Some(self.thread_store.downgrade()),
context_picker_menu_handle.clone(),
SuggestContextKind::File,
window,
cx,
)
});
let context_strip_subscription =
cx.subscribe_in(&context_strip, window, Self::handle_context_strip_event);
self.editing_message = Some((
message_id,
EditMessageState {
EditingMessageState {
editor: editor.clone(),
context_strip,
context_picker_menu_handle,
last_estimated_token_count: None,
_subscription: subscription,
_subscriptions: [buffer_edited_subscription, context_strip_subscription],
_update_token_count_task: None,
},
));
@@ -1271,6 +1328,26 @@ impl ActiveThread {
cx.notify();
}
fn handle_context_strip_event(
&mut self,
_context_strip: &Entity<ContextStrip>,
event: &ContextStripEvent,
window: &mut Window,
cx: &mut Context<Self>,
) {
if let Some((_, state)) = self.editing_message.as_ref() {
match event {
ContextStripEvent::PickerDismissed
| ContextStripEvent::BlurredEmpty
| ContextStripEvent::BlurredDown => {
let editor_focus_handle = state.editor.focus_handle(cx);
window.focus(&editor_focus_handle);
}
ContextStripEvent::BlurredUp => {}
}
}
}
fn update_editing_message_token_count(&mut self, debounce: bool, cx: &mut Context<Self>) {
let Some((message_id, state)) = self.editing_message.as_mut() else {
return;
@@ -1357,6 +1434,68 @@ impl ActiveThread {
}));
}
fn toggle_context_picker(
&mut self,
_: &crate::ToggleContextPicker,
window: &mut Window,
cx: &mut Context<Self>,
) {
if let Some((_, state)) = self.editing_message.as_mut() {
let handle = state.context_picker_menu_handle.clone();
window.defer(cx, move |window, cx| {
handle.toggle(window, cx);
});
}
}
fn remove_all_context(
&mut self,
_: &crate::RemoveAllContext,
_window: &mut Window,
cx: &mut Context<Self>,
) {
self.context_store.update(cx, |store, _cx| store.clear());
cx.notify();
}
fn move_up(&mut self, _: &MoveUp, window: &mut Window, cx: &mut Context<Self>) {
if let Some((_, state)) = self.editing_message.as_mut() {
if state.context_picker_menu_handle.is_deployed() {
cx.propagate();
} else {
state.context_strip.focus_handle(cx).focus(window);
}
}
}
fn paste(&mut self, _: &Paste, _window: &mut Window, cx: &mut Context<Self>) {
let images = cx
.read_from_clipboard()
.map(|item| {
item.into_entries()
.filter_map(|entry| {
if let ClipboardEntry::Image(image) = entry {
Some(image)
} else {
None
}
})
.collect::<Vec<_>>()
})
.unwrap_or_default();
if images.is_empty() {
return;
}
cx.stop_propagation();
self.context_store.update(cx, |store, cx| {
for image in images {
store.add_image_instance(Arc::new(image), cx);
}
});
}
fn cancel_editing_message(&mut self, _: &menu::Cancel, _: &mut Window, cx: &mut Context<Self>) {
self.editing_message.take();
cx.notify();
@@ -1371,21 +1510,11 @@ impl ActiveThread {
let Some((message_id, state)) = self.editing_message.take() else {
return;
};
let edited_text = state.editor.read(cx).text(cx);
let thread_model = self.thread.update(cx, |thread, cx| {
thread.edit_message(
message_id,
Role::User,
vec![MessageSegment::Text(edited_text)],
cx,
);
for message_id in self.messages_after(message_id) {
thread.delete_message(*message_id, cx);
}
thread.get_or_init_configured_model(cx)
});
let Some(model) = thread_model else {
let Some(model) = self
.thread
.update(cx, |thread, cx| thread.get_or_init_configured_model(cx))
else {
return;
};
@@ -1394,11 +1523,45 @@ impl ActiveThread {
return;
}
self.thread.update(cx, |thread, cx| {
thread.advance_prompt_id();
thread.send_to_model(model.model, Some(window.window_handle()), cx);
});
cx.notify();
let edited_text = state.editor.read(cx).text(cx);
let new_context = self
.context_store
.read(cx)
.new_context_for_thread(self.thread.read(cx), Some(message_id));
let project = self.thread.read(cx).project().clone();
let prompt_store = self.thread_store.read(cx).prompt_store().clone();
let load_context_task =
crate::context::load_context(new_context, &project, &prompt_store, cx);
self._load_edited_message_context_task =
Some(cx.spawn_in(window, async move |this, cx| {
let context = load_context_task.await;
let _ = this
.update_in(cx, |this, window, cx| {
this.thread.update(cx, |thread, cx| {
thread.edit_message(
message_id,
Role::User,
vec![MessageSegment::Text(edited_text)],
Some(context.loaded_context),
cx,
);
for message_id in this.messages_after(message_id) {
thread.delete_message(*message_id, cx);
}
});
this.thread.update(cx, |thread, cx| {
thread.advance_prompt_id();
thread.send_to_model(model.model, Some(window.window_handle()), cx);
});
this._load_edited_message_context_task = None;
cx.notify();
})
.log_err();
}));
}
fn messages_after(&self, message_id: MessageId) -> &[MessageId] {
@@ -1519,6 +1682,53 @@ impl ActiveThread {
}
}
fn render_edit_message_editor(
&self,
state: &EditingMessageState,
window: &mut Window,
cx: &Context<Self>,
) -> impl IntoElement {
let settings = ThemeSettings::get_global(cx);
let font_size = TextSize::Small.rems(cx);
let line_height = font_size.to_pixels(window.rem_size()) * 1.75;
let colors = cx.theme().colors();
let text_style = TextStyle {
color: colors.text,
font_family: settings.buffer_font.family.clone(),
font_fallbacks: settings.buffer_font.fallbacks.clone(),
font_features: settings.buffer_font.features.clone(),
font_size: font_size.into(),
line_height: line_height.into(),
..Default::default()
};
v_flex()
.key_context("EditMessageEditor")
.on_action(cx.listener(Self::toggle_context_picker))
.on_action(cx.listener(Self::remove_all_context))
.on_action(cx.listener(Self::move_up))
.on_action(cx.listener(Self::cancel_editing_message))
.on_action(cx.listener(Self::confirm_editing_message))
.capture_action(cx.listener(Self::paste))
.min_h_6()
.flex_grow()
.w_full()
.gap_2()
.child(EditorElement::new(
&state.editor,
EditorStyle {
background: colors.editor_background,
local_player: cx.theme().players().local(),
text: text_style,
syntax: cx.theme().syntax().clone(),
..Default::default()
},
))
.child(state.context_strip.clone())
}
fn render_message(&self, ix: usize, window: &mut Window, cx: &mut Context<Self>) -> AnyElement {
let message_id = self.messages[ix];
let Some(message) = self.thread.read(cx).message(message_id) else {
@@ -1551,11 +1761,11 @@ impl ActiveThread {
let generating_label = (is_generating && is_last_message)
.then(|| AnimatedLabel::new("Generating").size(LabelSize::Small));
let edit_message_editor = self
let editing_message_state = self
.editing_message
.as_ref()
.filter(|(id, _)| *id == message_id)
.map(|(_, state)| state.editor.clone());
.map(|(_, state)| state);
let colors = cx.theme().colors();
let editor_bg_color = colors.editor_background;
@@ -1690,77 +1900,43 @@ impl ActiveThread {
let has_content = !message_is_empty || !added_context.is_empty();
let message_content = has_content.then(|| {
v_flex()
.w_full()
.gap_1()
.when(!message_is_empty, |parent| {
parent.child(
if let Some(edit_message_editor) = edit_message_editor.clone() {
let settings = ThemeSettings::get_global(cx);
let font_size = TextSize::Small.rems(cx);
let line_height = font_size.to_pixels(window.rem_size()) * 1.75;
let text_style = TextStyle {
color: cx.theme().colors().text,
font_family: settings.buffer_font.family.clone(),
font_fallbacks: settings.buffer_font.fallbacks.clone(),
font_features: settings.buffer_font.features.clone(),
font_size: font_size.into(),
line_height: line_height.into(),
..Default::default()
};
div()
.key_context("EditMessageEditor")
.on_action(cx.listener(Self::cancel_editing_message))
.on_action(cx.listener(Self::confirm_editing_message))
.min_h_6()
.flex_grow()
.w_full()
.child(EditorElement::new(
&edit_message_editor,
EditorStyle {
background: colors.editor_background,
local_player: cx.theme().players().local(),
text: text_style,
syntax: cx.theme().syntax().clone(),
..Default::default()
},
))
.into_any()
} else {
div()
.min_h_6()
.child(self.render_message_content(
message_id,
rendered_message,
has_tool_uses,
workspace.clone(),
window,
cx,
))
.into_any()
},
)
})
.when(!added_context.is_empty(), |parent| {
parent.child(h_flex().flex_wrap().gap_1().children(
added_context.into_iter().map(|added_context| {
let context = added_context.handle.clone();
ContextPill::added(added_context, false, false, None).on_click(Rc::new(
cx.listener({
let workspace = workspace.clone();
move |_, _, window, cx| {
if let Some(workspace) = workspace.upgrade() {
open_context(&context, workspace, window, cx);
cx.notify();
if let Some(state) = editing_message_state.as_ref() {
self.render_edit_message_editor(state, window, cx)
.into_any_element()
} else {
v_flex()
.w_full()
.gap_1()
.when(!message_is_empty, |parent| {
parent.child(div().min_h_6().child(self.render_message_content(
message_id,
rendered_message,
has_tool_uses,
workspace.clone(),
window,
cx,
)))
})
.when(!added_context.is_empty(), |parent| {
parent.child(h_flex().flex_wrap().gap_1().children(
added_context.into_iter().map(|added_context| {
let context = added_context.handle.clone();
ContextPill::added(added_context, false, false, None).on_click(
Rc::new(cx.listener({
let workspace = workspace.clone();
move |_, _, window, cx| {
if let Some(workspace) = workspace.upgrade() {
open_context(&context, workspace, window, cx);
cx.notify();
}
}
}
}),
))
}),
))
})
})),
)
}),
))
})
.into_any_element()
}
});
let styled_message = match message.role {
@@ -1785,8 +1961,8 @@ impl ActiveThread {
.p_2p5()
.gap_1()
.children(message_content)
.when_some(edit_message_editor.clone(), |this, edit_editor| {
let edit_editor_clone = edit_editor.clone();
.when_some(editing_message_state, |this, state| {
let focus_handle = state.editor.focus_handle(cx).clone();
this.w_full().justify_between().child(
h_flex()
.gap_0p5()
@@ -1797,16 +1973,17 @@ impl ActiveThread {
)
.shape(ui::IconButtonShape::Square)
.icon_color(Color::Error)
.tooltip(move |window, cx| {
let focus_handle =
edit_editor_clone.focus_handle(cx);
Tooltip::for_action_in(
"Cancel Edit",
&menu::Cancel,
&focus_handle,
window,
cx,
)
.tooltip({
let focus_handle = focus_handle.clone();
move |window, cx| {
Tooltip::for_action_in(
"Cancel Edit",
&menu::Cancel,
&focus_handle,
window,
cx,
)
}
})
.on_click(cx.listener(Self::handle_cancel_click)),
)
@@ -1815,18 +1992,20 @@ impl ActiveThread {
"confirm-edit-message",
IconName::Check,
)
.disabled(edit_editor.read(cx).is_empty(cx))
.disabled(state.editor.read(cx).is_empty(cx))
.shape(ui::IconButtonShape::Square)
.icon_color(Color::Success)
.tooltip(move |window, cx| {
let focus_handle = edit_editor.focus_handle(cx);
Tooltip::for_action_in(
"Regenerate",
&menu::Confirm,
&focus_handle,
window,
cx,
)
.tooltip({
let focus_handle = focus_handle.clone();
move |window, cx| {
Tooltip::for_action_in(
"Regenerate",
&menu::Confirm,
&focus_handle,
window,
cx,
)
}
})
.on_click(
cx.listener(Self::handle_regenerate_click),
@@ -1835,7 +2014,7 @@ impl ActiveThread {
)
}),
)
.when(edit_message_editor.is_none(), |this| {
.when(editing_message_state.is_none(), |this| {
this.tooltip(Tooltip::text("Click To Edit"))
})
.on_click(cx.listener({


@@ -424,6 +424,7 @@ impl AssistantPanel {
ActiveThread::new(
thread.clone(),
thread_store.clone(),
message_editor_context_store.clone(),
language_registry.clone(),
workspace.clone(),
window,
@@ -457,7 +458,8 @@ impl AssistantPanel {
for entry in recently_opened.iter() {
let summary = entry.summary(cx);
menu = menu.entry_with_end_slot(
menu = menu.entry_with_end_slot_on_hover(
summary,
None,
{
@@ -626,7 +628,7 @@ impl AssistantPanel {
let thread_view = ActiveView::thread(thread.clone(), window, cx);
self.set_active_view(thread_view, window, cx);
let message_editor_context_store = cx.new(|_cx| {
let context_store = cx.new(|_cx| {
crate::context_store::ContextStore::new(
self.project.downgrade(),
Some(self.thread_store.downgrade()),
@@ -639,7 +641,7 @@ impl AssistantPanel {
.update(cx, |this, cx| this.open_thread(&other_thread_id, cx));
cx.spawn({
let context_store = message_editor_context_store.clone();
let context_store = context_store.clone();
async move |_panel, cx| {
let other_thread = other_thread_task.await?;
@@ -664,6 +666,7 @@ impl AssistantPanel {
ActiveThread::new(
thread.clone(),
self.thread_store.clone(),
context_store.clone(),
self.language_registry.clone(),
self.workspace.clone(),
window,
@@ -682,7 +685,7 @@ impl AssistantPanel {
MessageEditor::new(
self.fs.clone(),
self.workspace.clone(),
message_editor_context_store,
context_store,
self.prompt_store.clone(),
self.thread_store.downgrade(),
thread,
@@ -843,7 +846,7 @@ impl AssistantPanel {
) {
let thread_view = ActiveView::thread(thread.clone(), window, cx);
self.set_active_view(thread_view, window, cx);
let message_editor_context_store = cx.new(|_cx| {
let context_store = cx.new(|_cx| {
crate::context_store::ContextStore::new(
self.project.downgrade(),
Some(self.thread_store.downgrade()),
@@ -860,6 +863,7 @@ impl AssistantPanel {
ActiveThread::new(
thread.clone(),
self.thread_store.clone(),
context_store.clone(),
self.language_registry.clone(),
self.workspace.clone(),
window,
@@ -878,7 +882,7 @@ impl AssistantPanel {
MessageEditor::new(
self.fs.clone(),
self.workspace.clone(),
message_editor_context_store,
context_store,
self.prompt_store.clone(),
self.thread_store.downgrade(),
thread,


@@ -3,11 +3,12 @@ use std::hash::{Hash, Hasher};
use std::path::PathBuf;
use std::{ops::Range, path::Path, sync::Arc};
use assistant_tool::outline;
use collections::HashSet;
use futures::future;
use futures::{FutureExt, future::Shared};
use gpui::{App, AppContext as _, Entity, SharedString, Task};
use language::Buffer;
use language::{Buffer, ParseStatus};
use language_model::{LanguageModelImage, LanguageModelRequestMessage, MessageContent};
use project::{Project, ProjectEntryId, ProjectPath, Worktree};
use prompt_store::{PromptStore, UserPromptId};
@@ -152,6 +153,7 @@ pub struct FileContext {
pub handle: FileContextHandle,
pub full_path: Arc<Path>,
pub text: SharedString,
pub is_outline: bool,
}
impl FileContextHandle {
@@ -177,14 +179,51 @@ impl FileContextHandle {
log::error!("file context missing path");
return Task::ready(None);
};
let full_path = file.full_path(cx);
let full_path: Arc<Path> = file.full_path(cx).into();
let rope = buffer_ref.as_rope().clone();
let buffer = self.buffer.clone();
cx.background_spawn(async move {
cx.spawn(async move |cx| {
// For large files, use outline instead of full content
if rope.len() > outline::AUTO_OUTLINE_SIZE {
// Wait until the buffer has been fully parsed, so we can read its outline
if let Ok(mut parse_status) =
buffer.read_with(cx, |buffer, _| buffer.parse_status())
{
while *parse_status.borrow() != ParseStatus::Idle {
parse_status.changed().await.log_err();
}
if let Ok(snapshot) = buffer.read_with(cx, |buffer, _| buffer.snapshot()) {
if let Some(outline) = snapshot.outline(None) {
let items = outline
.items
.into_iter()
.map(|item| item.to_point(&snapshot));
if let Ok(outline_text) =
outline::render_outline(items, None, 0, usize::MAX).await
{
let context = AgentContext::File(FileContext {
handle: self,
full_path,
text: outline_text.into(),
is_outline: true,
});
return Some((context, vec![buffer]));
}
}
}
}
}
// Fallback to full content if we couldn't build an outline
// (or didn't need to because the file was small enough)
let context = AgentContext::File(FileContext {
handle: self,
full_path: full_path.into(),
full_path,
text: rope.to_string().into(),
is_outline: false,
});
Some((context, vec![buffer]))
})
@@ -996,3 +1035,115 @@ impl Hash for AgentContextKey {
}
}
}
#[cfg(test)]
mod tests {
use super::*;
use gpui::TestAppContext;
use project::{FakeFs, Project};
use serde_json::json;
use settings::SettingsStore;
use util::path;
fn init_test_settings(cx: &mut TestAppContext) {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
});
}
// Helper to create a test project with test files
async fn create_test_project(
cx: &mut TestAppContext,
files: serde_json::Value,
) -> Entity<Project> {
let fs = FakeFs::new(cx.background_executor.clone());
fs.insert_tree(path!("/test"), files).await;
Project::test(fs, [path!("/test").as_ref()], cx).await
}
#[gpui::test]
async fn test_large_file_uses_outline(cx: &mut TestAppContext) {
init_test_settings(cx);
// Create a large file that exceeds AUTO_OUTLINE_SIZE
const LINE: &str = "Line with some text\n";
let large_content = LINE.repeat(2 * (outline::AUTO_OUTLINE_SIZE / LINE.len()));
let content_len = large_content.len();
assert!(content_len > outline::AUTO_OUTLINE_SIZE);
let file_context = file_context_for(large_content, cx).await;
assert!(
file_context.is_outline,
"Large file should use outline format"
);
assert!(
file_context.text.len() < content_len,
"Outline should be smaller than original content"
);
}
#[gpui::test]
async fn test_small_file_uses_full_content(cx: &mut TestAppContext) {
init_test_settings(cx);
let small_content = "This is a small file.\n";
let content_len = small_content.len();
assert!(content_len < outline::AUTO_OUTLINE_SIZE);
let file_context = file_context_for(small_content.to_string(), cx).await;
assert!(
!file_context.is_outline,
"Small files should not get an outline"
);
assert_eq!(file_context.text, small_content);
}
async fn file_context_for(content: String, cx: &mut TestAppContext) -> FileContext {
// Create a test project with the file
let project = create_test_project(
cx,
json!({
"file.txt": content,
}),
)
.await;
// Open the buffer
let buffer_path = project
.read_with(cx, |project, cx| project.find_project_path("file.txt", cx))
.unwrap();
let buffer = project
.update(cx, |project, cx| project.open_buffer(buffer_path, cx))
.await
.unwrap();
let context_handle = AgentContextHandle::File(FileContextHandle {
buffer: buffer.clone(),
context_id: ContextId::zero(),
});
cx.update(|cx| load_context(vec![context_handle], &project, &None, cx))
.await
.loaded_context
.contexts
.into_iter()
.find_map(|ctx| {
if let AgentContext::File(file_ctx) = ctx {
Some(file_ctx)
} else {
None
}
})
.expect("Should have found a file context")
}
}


@@ -21,7 +21,7 @@ use crate::context::{
SymbolContextHandle, ThreadContextHandle,
};
use crate::context_strip::SuggestedContext;
use crate::thread::{Thread, ThreadId};
use crate::thread::{MessageId, Thread, ThreadId};
pub struct ContextStore {
project: WeakEntity<Project>,
@@ -54,9 +54,14 @@ impl ContextStore {
self.context_thread_ids.clear();
}
pub fn new_context_for_thread(&self, thread: &Thread) -> Vec<AgentContextHandle> {
pub fn new_context_for_thread(
&self,
thread: &Thread,
exclude_messages_from_id: Option<MessageId>,
) -> Vec<AgentContextHandle> {
let existing_context = thread
.messages()
.take_while(|message| exclude_messages_from_id.is_none_or(|id| message.id != id))
.flat_map(|message| {
message
.loaded_context
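The `take_while` guard above includes every message when no cutoff id is supplied, and otherwise stops at the first message whose id matches the cutoff. A minimal sketch of that pattern, using a hypothetical `MessageId` stand-in and `Option::map_or` in place of `is_none_or` (which requires a newer toolchain):

```rust
// Hypothetical stand-in for the thread's MessageId.
#[derive(PartialEq, Clone, Copy, Debug)]
struct MessageId(usize);

// Collect message ids up to (but not including) `exclude_from`, or all of
// them when no cutoff is supplied -- mirroring new_context_for_thread.
fn ids_before_cutoff(messages: &[MessageId], exclude_from: Option<MessageId>) -> Vec<MessageId> {
    messages
        .iter()
        .copied()
        .take_while(|id| exclude_from.map_or(true, |cutoff| *id != cutoff))
        .collect()
}

fn main() {
    let msgs = [MessageId(1), MessageId(2), MessageId(3)];
    // No cutoff: everything is included.
    assert_eq!(ids_before_cutoff(&msgs, None), vec![MessageId(1), MessageId(2), MessageId(3)]);
    // Cutoff at id 2: only the messages strictly before it survive.
    assert_eq!(ids_before_cutoff(&msgs, Some(MessageId(2))), vec![MessageId(1)]);
}
```

This is what lets the edit-message flow rebuild context "as of" an earlier message without pulling in context attached to the message being edited or anything after it.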


@@ -69,6 +69,56 @@ pub struct MessageEditor {
const MAX_EDITOR_LINES: usize = 8;
pub(crate) fn create_editor(
workspace: WeakEntity<Workspace>,
context_store: WeakEntity<ContextStore>,
thread_store: WeakEntity<ThreadStore>,
window: &mut Window,
cx: &mut App,
) -> Entity<Editor> {
let language = Language::new(
language::LanguageConfig {
completion_query_characters: HashSet::from_iter(['.', '-', '_', '@']),
..Default::default()
},
None,
);
let editor = cx.new(|cx| {
let buffer = cx.new(|cx| Buffer::local("", cx).with_language(Arc::new(language), cx));
let buffer = cx.new(|cx| MultiBuffer::singleton(buffer, cx));
let mut editor = Editor::new(
editor::EditorMode::AutoHeight {
max_lines: MAX_EDITOR_LINES,
},
buffer,
None,
window,
cx,
);
editor.set_placeholder_text("Ask anything, @ to mention, ↑ to select", cx);
editor.set_show_indent_guides(false, cx);
editor.set_soft_wrap();
editor.set_context_menu_options(ContextMenuOptions {
min_entries_visible: 12,
max_entries_visible: 12,
placement: Some(ContextMenuPlacement::Above),
});
editor
});
let editor_entity = editor.downgrade();
editor.update(cx, |editor, _| {
editor.set_completion_provider(Some(Box::new(ContextPickerCompletionProvider::new(
workspace,
context_store,
Some(thread_store),
editor_entity,
))));
});
editor
}
impl MessageEditor {
pub fn new(
fs: Arc<dyn Fs>,
@@ -83,47 +133,14 @@ impl MessageEditor {
let context_picker_menu_handle = PopoverMenuHandle::default();
let model_selector_menu_handle = PopoverMenuHandle::default();
let language = Language::new(
language::LanguageConfig {
completion_query_characters: HashSet::from_iter(['.', '-', '_', '@']),
..Default::default()
},
None,
let editor = create_editor(
workspace.clone(),
context_store.downgrade(),
thread_store.clone(),
window,
cx,
);
let editor = cx.new(|cx| {
let buffer = cx.new(|cx| Buffer::local("", cx).with_language(Arc::new(language), cx));
let buffer = cx.new(|cx| MultiBuffer::singleton(buffer, cx));
let mut editor = Editor::new(
editor::EditorMode::AutoHeight {
max_lines: MAX_EDITOR_LINES,
},
buffer,
None,
window,
cx,
);
editor.set_placeholder_text("Ask anything, @ to mention, ↑ to select", cx);
editor.set_show_indent_guides(false, cx);
editor.set_soft_wrap();
editor.set_context_menu_options(ContextMenuOptions {
min_entries_visible: 12,
max_entries_visible: 12,
placement: Some(ContextMenuPlacement::Above),
});
editor
});
let editor_entity = editor.downgrade();
editor.update(cx, |editor, _| {
editor.set_completion_provider(Some(Box::new(ContextPickerCompletionProvider::new(
workspace.clone(),
context_store.downgrade(),
Some(thread_store.clone()),
editor_entity,
))));
});
let context_strip = cx.new(|cx| {
ContextStrip::new(
context_store.clone(),
@@ -1041,7 +1058,7 @@ impl MessageEditor {
let load_task = cx.spawn(async move |this, cx| {
let Ok(load_task) = this.update(cx, |this, cx| {
let new_context = this.context_store.read_with(cx, |context_store, cx| {
context_store.new_context_for_thread(this.thread.read(cx))
context_store.new_context_for_thread(this.thread.read(cx), None)
});
load_context(new_context, &this.project, &this.prompt_store, cx)
}) else {


@@ -879,6 +879,7 @@ impl Thread {
id: MessageId,
new_role: Role,
new_segments: Vec<MessageSegment>,
loaded_context: Option<LoadedContext>,
cx: &mut Context<Self>,
) -> bool {
let Some(message) = self.messages.iter_mut().find(|message| message.id == id) else {
@@ -886,6 +887,9 @@ impl Thread {
};
message.role = new_role;
message.segments = new_segments;
if let Some(context) = loaded_context {
message.loaded_context = context;
}
self.touch_updated_at();
cx.emit(ThreadEvent::MessageEdited(id));
true
@@ -2546,6 +2550,7 @@ fn main() {{
"file1.rs": "fn function1() {}\n",
"file2.rs": "fn function2() {}\n",
"file3.rs": "fn function3() {}\n",
"file4.rs": "fn function4() {}\n",
}),
)
.await;
@@ -2558,7 +2563,7 @@ fn main() {{
.await
.unwrap();
let new_contexts = context_store.update(cx, |store, cx| {
store.new_context_for_thread(thread.read(cx))
store.new_context_for_thread(thread.read(cx), None)
});
assert_eq!(new_contexts.len(), 1);
let loaded_context = cx
@@ -2573,7 +2578,7 @@ fn main() {{
.await
.unwrap();
let new_contexts = context_store.update(cx, |store, cx| {
store.new_context_for_thread(thread.read(cx))
store.new_context_for_thread(thread.read(cx), None)
});
assert_eq!(new_contexts.len(), 1);
let loaded_context = cx
@@ -2589,7 +2594,7 @@ fn main() {{
.await
.unwrap();
let new_contexts = context_store.update(cx, |store, cx| {
store.new_context_for_thread(thread.read(cx))
store.new_context_for_thread(thread.read(cx), None)
});
assert_eq!(new_contexts.len(), 1);
let loaded_context = cx
@@ -2640,6 +2645,55 @@ fn main() {{
assert!(!request.messages[3].string_contents().contains("file1.rs"));
assert!(!request.messages[3].string_contents().contains("file2.rs"));
assert!(request.messages[3].string_contents().contains("file3.rs"));
add_file_to_context(&project, &context_store, "test/file4.rs", cx)
.await
.unwrap();
let new_contexts = context_store.update(cx, |store, cx| {
store.new_context_for_thread(thread.read(cx), Some(message2_id))
});
assert_eq!(new_contexts.len(), 3);
let loaded_context = cx
.update(|cx| load_context(new_contexts, &project, &None, cx))
.await
.loaded_context;
assert!(!loaded_context.text.contains("file1.rs"));
assert!(loaded_context.text.contains("file2.rs"));
assert!(loaded_context.text.contains("file3.rs"));
assert!(loaded_context.text.contains("file4.rs"));
let new_contexts = context_store.update(cx, |store, cx| {
// Remove file4.rs
store.remove_context(&loaded_context.contexts[2].handle(), cx);
store.new_context_for_thread(thread.read(cx), Some(message2_id))
});
assert_eq!(new_contexts.len(), 2);
let loaded_context = cx
.update(|cx| load_context(new_contexts, &project, &None, cx))
.await
.loaded_context;
assert!(!loaded_context.text.contains("file1.rs"));
assert!(loaded_context.text.contains("file2.rs"));
assert!(loaded_context.text.contains("file3.rs"));
assert!(!loaded_context.text.contains("file4.rs"));
let new_contexts = context_store.update(cx, |store, cx| {
// Remove file3.rs
store.remove_context(&loaded_context.contexts[1].handle(), cx);
store.new_context_for_thread(thread.read(cx), Some(message2_id))
});
assert_eq!(new_contexts.len(), 1);
let loaded_context = cx
.update(|cx| load_context(new_contexts, &project, &None, cx))
.await
.loaded_context;
assert!(!loaded_context.text.contains("file1.rs"));
assert!(loaded_context.text.contains("file2.rs"));
assert!(!loaded_context.text.contains("file3.rs"));
assert!(!loaded_context.text.contains("file4.rs"));
}
#[gpui::test]


@@ -24,6 +24,7 @@ language.workspace = true
language_model.workspace = true
parking_lot.workspace = true
project.workspace = true
regex.workspace = true
serde.workspace = true
serde_json.workspace = true
text.workspace = true


@@ -1,4 +1,5 @@
mod action_log;
pub mod outline;
mod tool_registry;
mod tool_schema;
mod tool_working_set;


@@ -0,0 +1,132 @@
use crate::ActionLog;
use anyhow::{Result, anyhow};
use gpui::{AsyncApp, Entity};
use language::{OutlineItem, ParseStatus};
use project::Project;
use regex::Regex;
use std::fmt::Write;
use text::Point;
/// For files over this size, instead of reading them (or including them in context),
/// we automatically provide the file's symbol outline instead, with line numbers.
pub const AUTO_OUTLINE_SIZE: usize = 16384;
pub async fn file_outline(
project: Entity<Project>,
path: String,
action_log: Entity<ActionLog>,
regex: Option<Regex>,
cx: &mut AsyncApp,
) -> anyhow::Result<String> {
let buffer = {
let project_path = project.read_with(cx, |project, cx| {
project
.find_project_path(&path, cx)
.ok_or_else(|| anyhow!("Path {path} not found in project"))
})??;
project
.update(cx, |project, cx| project.open_buffer(project_path, cx))?
.await?
};
action_log.update(cx, |action_log, cx| {
action_log.track_buffer(buffer.clone(), cx);
})?;
// Wait until the buffer has been fully parsed, so that we can read its outline.
let mut parse_status = buffer.read_with(cx, |buffer, _| buffer.parse_status())?;
while *parse_status.borrow() != ParseStatus::Idle {
parse_status.changed().await?;
}
let snapshot = buffer.read_with(cx, |buffer, _| buffer.snapshot())?;
let Some(outline) = snapshot.outline(None) else {
return Err(anyhow!("No outline information available for this file."));
};
render_outline(
outline
.items
.into_iter()
.map(|item| item.to_point(&snapshot)),
regex,
0,
usize::MAX,
)
.await
}
pub async fn render_outline(
items: impl IntoIterator<Item = OutlineItem<Point>>,
regex: Option<Regex>,
offset: usize,
results_per_page: usize,
) -> Result<String> {
let mut items = items.into_iter().skip(offset);
let entries = items
.by_ref()
.filter(|item| {
regex
.as_ref()
.is_none_or(|regex| regex.is_match(&item.text))
})
.take(results_per_page)
.collect::<Vec<_>>();
let has_more = items.next().is_some();
let mut output = String::new();
let entries_rendered = render_entries(&mut output, entries);
// Calculate pagination information
let page_start = offset + 1;
let page_end = offset + entries_rendered;
let total_symbols = if has_more {
format!("more than {}", page_end)
} else {
page_end.to_string()
};
// Add pagination information
if has_more {
writeln!(&mut output, "\nShowing symbols {page_start}-{page_end} (there were more symbols found; use offset: {page_end} to see next page)",
)
} else {
writeln!(
&mut output,
"\nShowing symbols {page_start}-{page_end} (total symbols: {total_symbols})",
)
}
.ok();
Ok(output)
}
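The skip/take windowing in `render_outline` (skip `offset` items, take one page, then probe for a single further item to decide whether a next page exists) can be sketched in isolation over plain strings; `paginate` is a hypothetical name for illustration only:

```rust
/// Returns the page of items starting at `offset`, plus whether more remain.
/// A sketch of the skip/take pattern used in render_outline.
fn paginate(items: &[&str], offset: usize, per_page: usize) -> (Vec<String>, bool) {
    let mut iter = items.iter().skip(offset);
    let page: Vec<String> = iter.by_ref().take(per_page).map(|s| s.to_string()).collect();
    // Probing for one more element tells us whether to advertise a next page.
    let has_more = iter.next().is_some();
    (page, has_more)
}

fn main() {
    let symbols = ["a", "b", "c", "d", "e"];
    let (page, more) = paginate(&symbols, 1, 2);
    assert_eq!(page, vec!["b", "c"]);
    assert!(more); // "d" and "e" remain

    let (page, more) = paginate(&symbols, 3, 2);
    assert_eq!(page, vec!["d", "e"]);
    assert!(!more); // last page
}
```

Using `by_ref` is what makes the trailing `iter.next()` probe possible: the `take` adaptor borrows the iterator instead of consuming it, so the cursor is left positioned just past the page.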
fn render_entries(
output: &mut String,
items: impl IntoIterator<Item = OutlineItem<Point>>,
) -> usize {
let mut entries_rendered = 0;
for item in items {
// Indent based on depth ("" for level 0, " " for level 1, etc.)
for _ in 0..item.depth {
output.push(' ');
}
output.push_str(&item.text);
// Add position information - convert to 1-based line numbers for display
let start_line = item.range.start.row + 1;
let end_line = item.range.end.row + 1;
if start_line == end_line {
writeln!(output, " [L{}]", start_line).ok();
} else {
writeln!(output, " [L{}-{}]", start_line, end_line).ok();
}
entries_rendered += 1;
}
entries_rendered
}
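The depth-based indentation and 1-based `[L…]` suffixes produced by `render_entries` can be exercised standalone; the `Item` struct below is a hypothetical stand-in for `OutlineItem<Point>`, keeping only the fields the rendering needs:

```rust
use std::fmt::Write;

// Hypothetical stand-in for OutlineItem<Point>: depth, text, 0-based row range.
struct Item {
    depth: usize,
    text: String,
    start_row: u32,
    end_row: u32,
}

fn render_entries(output: &mut String, items: impl IntoIterator<Item = Item>) -> usize {
    let mut entries_rendered = 0;
    for item in items {
        // Indent one space per depth level ("" for level 0, " " for level 1, etc.)
        for _ in 0..item.depth {
            output.push(' ');
        }
        output.push_str(&item.text);
        // Convert 0-based rows to 1-based line numbers for display.
        let (start, end) = (item.start_row + 1, item.end_row + 1);
        if start == end {
            writeln!(output, " [L{start}]").ok();
        } else {
            writeln!(output, " [L{start}-{end}]").ok();
        }
        entries_rendered += 1;
    }
    entries_rendered
}

fn main() {
    let mut out = String::new();
    let n = render_entries(
        &mut out,
        [
            Item { depth: 0, text: "mod tests".into(), start_row: 0, end_row: 10 },
            Item { depth: 1, text: "fn test_outline".into(), start_row: 2, end_row: 2 },
        ],
    );
    assert_eq!(n, 2);
    assert_eq!(out, "mod tests [L1-11]\n fn test_outline [L3]\n");
}
```

The returned count feeds the pagination footer, so it must reflect exactly the entries written, not the number of items offered.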


@@ -55,6 +55,7 @@ rand.workspace = true
pretty_assertions.workspace = true
settings = { workspace = true, features = ["test-support"] }
task = { workspace = true, features = ["test-support"]}
tempfile.workspace = true
tree-sitter-rust.workspace = true
workspace = { workspace = true, features = ["test-support"] }
unindent.workspace = true


@@ -4,10 +4,10 @@ use std::sync::Arc;
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::outline;
use assistant_tool::{ActionLog, Tool, ToolResult};
use collections::IndexMap;
use gpui::{AnyWindowHandle, App, AsyncApp, Entity, Task};
use language::{OutlineItem, ParseStatus, Point};
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
use project::{Project, Symbol};
use regex::{Regex, RegexBuilder};
@@ -148,59 +148,13 @@ impl Tool for CodeSymbolsTool {
};
cx.spawn(async move |cx| match input.path {
Some(path) => file_outline(project, path, action_log, regex, cx).await,
Some(path) => outline::file_outline(project, path, action_log, regex, cx).await,
None => project_symbols(project, regex, input.offset, cx).await,
})
.into()
}
}
pub async fn file_outline(
project: Entity<Project>,
path: String,
action_log: Entity<ActionLog>,
regex: Option<Regex>,
cx: &mut AsyncApp,
) -> anyhow::Result<String> {
let buffer = {
let project_path = project.read_with(cx, |project, cx| {
project
.find_project_path(&path, cx)
.ok_or_else(|| anyhow!("Path {path} not found in project"))
})??;
project
.update(cx, |project, cx| project.open_buffer(project_path, cx))?
.await?
};
action_log.update(cx, |action_log, cx| {
action_log.track_buffer(buffer.clone(), cx);
})?;
// Wait until the buffer has been fully parsed, so that we can read its outline.
let mut parse_status = buffer.read_with(cx, |buffer, _| buffer.parse_status())?;
while *parse_status.borrow() != ParseStatus::Idle {
parse_status.changed().await?;
}
let snapshot = buffer.read_with(cx, |buffer, _| buffer.snapshot())?;
let Some(outline) = snapshot.outline(None) else {
return Err(anyhow!("No outline information available for this file."));
};
render_outline(
outline
.items
.into_iter()
.map(|item| item.to_point(&snapshot)),
regex,
0,
usize::MAX,
)
.await
}
async fn project_symbols(
project: Entity<Project>,
regex: Option<Regex>,
@@ -291,77 +245,3 @@ async fn project_symbols(
output
})
}
async fn render_outline(
items: impl IntoIterator<Item = OutlineItem<Point>>,
regex: Option<Regex>,
offset: usize,
results_per_page: usize,
) -> Result<String> {
let mut items = items.into_iter().skip(offset);
let entries = items
.by_ref()
.filter(|item| {
regex
.as_ref()
.is_none_or(|regex| regex.is_match(&item.text))
})
.take(results_per_page)
.collect::<Vec<_>>();
let has_more = items.next().is_some();
let mut output = String::new();
let entries_rendered = render_entries(&mut output, entries);
// Calculate pagination information
let page_start = offset + 1;
let page_end = offset + entries_rendered;
let total_symbols = if has_more {
format!("more than {}", page_end)
} else {
page_end.to_string()
};
// Add pagination information
if has_more {
writeln!(&mut output, "\nShowing symbols {page_start}-{page_end} (there were more symbols found; use offset: {page_end} to see next page)",
)
} else {
writeln!(
&mut output,
"\nShowing symbols {page_start}-{page_end} (total symbols: {total_symbols})",
)
}
.ok();
Ok(output)
}
fn render_entries(
output: &mut String,
items: impl IntoIterator<Item = OutlineItem<Point>>,
) -> usize {
let mut entries_rendered = 0;
for item in items {
// Indent based on depth ("" for level 0, " " for level 1, etc.)
for _ in 0..item.depth {
output.push(' ');
}
output.push_str(&item.text);
// Add position information - convert to 1-based line numbers for display
let start_line = item.range.start.row + 1;
let end_line = item.range.end.row + 1;
if start_line == end_line {
writeln!(output, " [L{}]", start_line).ok();
} else {
writeln!(output, " [L{}-{}]", start_line, end_line).ok();
}
entries_rendered += 1;
}
entries_rendered
}


@@ -1,8 +1,8 @@
use std::sync::Arc;
use crate::{code_symbols_tool::file_outline, schema::json_schema_for};
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool, ToolResult};
use assistant_tool::{ActionLog, Tool, ToolResult, outline};
use gpui::{AnyWindowHandle, App, Entity, Task};
use itertools::Itertools;
use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat};
@@ -14,10 +14,6 @@ use ui::IconName;
use util::markdown::MarkdownInlineCode;
/// If the model requests to read a file whose size exceeds this, then
/// the tool will return the file's symbol outline instead of its contents,
/// and suggest trying again using line ranges from the outline.
const MAX_FILE_SIZE_TO_READ: usize = 16384;
/// If the model requests to list the entries in a directory with more
/// entries than this, then the tool will return a subset of the entries
/// and suggest trying again.
@@ -218,7 +214,7 @@ impl Tool for ContentsTool {
// No line ranges specified, so check file size to see if it's too big.
let file_size = buffer.read_with(cx, |buffer, _cx| buffer.text().len())?;
if file_size <= MAX_FILE_SIZE_TO_READ {
if file_size <= outline::AUTO_OUTLINE_SIZE {
let result = buffer.read_with(cx, |buffer, _cx| buffer.text())?;
action_log.update(cx, |log, cx| {
@@ -229,7 +225,7 @@ impl Tool for ContentsTool {
} else {
// File is too big, so return its outline and a suggestion to
// read again with a line number range specified.
let outline = file_outline(project, file_path, action_log, None, cx).await?;
let outline = outline::file_outline(project, file_path, action_log, None, cx).await?;
Ok(format!("This file was too big to read all at once. Here is an outline of its symbols:\n\n{outline}\n\nUsing the line numbers in this outline, you can call this tool again while specifying the start and end fields to see the implementations of symbols in the outline."))
}


@@ -6,7 +6,7 @@ use language_model::{LanguageModelRequestMessage, LanguageModelToolSchemaFormat}
use project::Project;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use std::sync::Arc;
use std::{path::PathBuf, sync::Arc};
use ui::IconName;
use util::markdown::MarkdownEscaped;
@@ -50,7 +50,7 @@ impl Tool for OpenTool {
self: Arc<Self>,
input: serde_json::Value,
_messages: &[LanguageModelRequestMessage],
_project: Entity<Project>,
project: Entity<Project>,
_action_log: Entity<ActionLog>,
_window: Option<AnyWindowHandle>,
cx: &mut App,
@@ -60,11 +60,107 @@ impl Tool for OpenTool {
Err(err) => return Task::ready(Err(anyhow!(err))).into(),
};
// If path_or_url turns out to be a path in the project, make it absolute.
let abs_path = to_absolute_path(&input.path_or_url, project, cx);
cx.background_spawn(async move {
open::that(&input.path_or_url).context("Failed to open URL or file path")?;
match abs_path {
Some(path) => open::that(path),
None => open::that(&input.path_or_url),
}
.context("Failed to open URL or file path")?;
Ok(format!("Successfully opened {}", input.path_or_url))
})
.into()
}
}
fn to_absolute_path(
potential_path: &str,
project: Entity<Project>,
cx: &mut App,
) -> Option<PathBuf> {
let project = project.read(cx);
project
.find_project_path(PathBuf::from(potential_path), cx)
.and_then(|project_path| project.absolute_path(&project_path, cx))
}
#[cfg(test)]
mod tests {
use super::*;
use gpui::TestAppContext;
use project::{FakeFs, Project};
use settings::SettingsStore;
use std::path::Path;
use tempfile::TempDir;
#[gpui::test]
async fn test_to_absolute_path(cx: &mut TestAppContext) {
init_test(cx);
let temp_dir = TempDir::new().expect("Failed to create temp directory");
let temp_path = temp_dir.path().to_string_lossy().to_string();
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
&temp_path,
serde_json::json!({
"src": {
"main.rs": "fn main() {}",
"lib.rs": "pub fn lib_fn() {}"
},
"docs": {
"readme.md": "# Project Documentation"
}
}),
)
.await;
// Use the temp_path as the root directory, not just its filename
let project = Project::test(fs.clone(), [temp_dir.path()], cx).await;
// Test cases where the function should return Some
cx.update(|cx| {
// Project-relative paths should return Some
// Create paths using the last segment of the temp path to simulate a project-relative path
let root_dir_name = Path::new(&temp_path)
.file_name()
.unwrap_or_else(|| std::ffi::OsStr::new("temp"))
.to_string_lossy();
assert!(
to_absolute_path(&format!("{root_dir_name}/src/main.rs"), project.clone(), cx)
.is_some(),
"Failed to resolve main.rs path"
);
assert!(
to_absolute_path(
&format!("{root_dir_name}/docs/readme.md",),
project.clone(),
cx,
)
.is_some(),
"Failed to resolve readme.md path"
);
// External URL should return None
let result = to_absolute_path("https://example.com", project.clone(), cx);
assert_eq!(result, None, "External URLs should return None");
// Path outside project
let result = to_absolute_path("../invalid/path", project.clone(), cx);
assert_eq!(result, None, "Paths outside the project should return None");
});
}
fn init_test(cx: &mut TestAppContext) {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
Project::init_settings(cx);
});
}
}


@@ -1,5 +1,6 @@
use crate::{code_symbols_tool::file_outline, schema::json_schema_for};
use crate::schema::json_schema_for;
use anyhow::{Result, anyhow};
use assistant_tool::outline;
use assistant_tool::{ActionLog, Tool, ToolResult};
use gpui::{AnyWindowHandle, App, Entity, Task};
@@ -14,10 +15,6 @@ use ui::IconName;
use util::markdown::MarkdownInlineCode;
/// If the model requests to read a file whose size exceeds this, then
/// the tool will return an error along with the file's symbol outline,
/// and suggest trying again using line ranges from the outline.
const MAX_FILE_SIZE_TO_READ: usize = 16384;
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct ReadFileToolInput {
/// The relative path of the file to read.
@@ -125,7 +122,8 @@ impl Tool for ReadFileTool {
if input.start_line.is_some() || input.end_line.is_some() {
let result = buffer.read_with(cx, |buffer, _cx| {
let text = buffer.text();
let start = input.start_line.unwrap_or(1);
// .max(1) because despite instructions to be 1-indexed, sometimes the model passes 0.
let start = input.start_line.unwrap_or(1).max(1);
let lines = text.split('\n').skip(start - 1);
if let Some(end) = input.end_line {
let count = end.saturating_sub(start).saturating_add(1); // Ensure at least 1 line
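The clamping here (`.max(1)` on the start, saturating arithmetic on the count) keeps the range read well-defined even for the degenerate inputs the tests below exercise. A self-contained sketch of the same logic, with `read_lines` as a hypothetical illustrative name:

```rust
/// Sketch of the clamped line-range read: 1-indexed, tolerant of
/// start_line == 0 and of end < start (always yields at least one line).
fn read_lines(text: &str, start_line: Option<usize>, end_line: Option<usize>) -> String {
    // .max(1) because despite instructions to be 1-indexed, a model may pass 0.
    let start = start_line.unwrap_or(1).max(1);
    let lines = text.split('\n').skip(start - 1);
    if let Some(end) = end_line {
        // saturating_sub avoids underflow when end < start; +1 ensures >= 1 line.
        let count = end.saturating_sub(start).saturating_add(1);
        lines.take(count).collect::<Vec<_>>().join("\n")
    } else {
        lines.collect::<Vec<_>>().join("\n")
    }
}

fn main() {
    let text = "Line 1\nLine 2\nLine 3\nLine 4\nLine 5";
    // start_line of 0 is treated as 1.
    assert_eq!(read_lines(text, Some(0), Some(2)), "Line 1\nLine 2");
    // end_line of 0 still yields at least one line.
    assert_eq!(read_lines(text, Some(1), Some(0)), "Line 1");
    // start > end also yields at least one line.
    assert_eq!(read_lines(text, Some(3), Some(2)), "Line 3");
}
```

Note that with `end < start` the count saturates to 0 and then bumps to 1, which is why the inverted range returns exactly the start line rather than erroring.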
@@ -144,7 +142,7 @@ impl Tool for ReadFileTool {
// No line ranges specified, so check file size to see if it's too big.
let file_size = buffer.read_with(cx, |buffer, _cx| buffer.text().len())?;
if file_size <= MAX_FILE_SIZE_TO_READ {
if file_size <= outline::AUTO_OUTLINE_SIZE {
// File is small enough, so return its contents.
let result = buffer.read_with(cx, |buffer, _cx| buffer.text())?;
@@ -154,9 +152,9 @@ impl Tool for ReadFileTool {
Ok(result)
} else {
// File is too big, so return an error with the outline
// File is too big, so return the outline
// and a suggestion to read again with line numbers.
let outline = file_outline(project, file_path, action_log, None, cx).await?;
let outline = outline::file_outline(project, file_path, action_log, None, cx).await?;
Ok(formatdoc! {"
This file was too big to read all at once. Here is an outline of its symbols:
@@ -332,6 +330,67 @@ mod test {
assert_eq!(result.unwrap(), "Line 2\nLine 3\nLine 4");
}
#[gpui::test]
async fn test_read_file_line_range_edge_cases(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
"/root",
json!({
"multiline.txt": "Line 1\nLine 2\nLine 3\nLine 4\nLine 5"
}),
)
.await;
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let action_log = cx.new(|_| ActionLog::new(project.clone()));
// start_line of 0 should be treated as 1
let result = cx
.update(|cx| {
let input = json!({
"path": "root/multiline.txt",
"start_line": 0,
"end_line": 2
});
Arc::new(ReadFileTool)
.run(input, &[], project.clone(), action_log.clone(), None, cx)
.output
})
.await;
assert_eq!(result.unwrap(), "Line 1\nLine 2");
// end_line of 0 should result in at least 1 line
let result = cx
.update(|cx| {
let input = json!({
"path": "root/multiline.txt",
"start_line": 1,
"end_line": 0
});
Arc::new(ReadFileTool)
.run(input, &[], project.clone(), action_log.clone(), None, cx)
.output
})
.await;
assert_eq!(result.unwrap(), "Line 1");
// when start_line > end_line, should still return at least 1 line
let result = cx
.update(|cx| {
let input = json!({
"path": "root/multiline.txt",
"start_line": 3,
"end_line": 2
});
Arc::new(ReadFileTool)
.run(input, &[], project.clone(), action_log, None, cx)
.output
})
.await;
assert_eq!(result.unwrap(), "Line 3");
}
fn init_test(cx: &mut TestAppContext) {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);


@@ -362,12 +362,7 @@ async fn create_billing_subscription(
let checkout_session_url = match body.product {
Some(ProductCode::ZedPro) => {
stripe_billing
.checkout_with_price(
app.config.zed_pro_price_id()?,
customer_id,
&user.github_login,
&success_url,
)
.checkout_with_zed_pro(customer_id, &user.github_login, &success_url)
.await?
}
Some(ProductCode::ZedProTrial) => {
@@ -384,7 +379,6 @@ async fn create_billing_subscription(
stripe_billing
.checkout_with_zed_pro_trial(
app.config.zed_pro_price_id()?,
customer_id,
&user.github_login,
feature_flags,
@@ -458,6 +452,14 @@ async fn manage_billing_subscription(
))?
};
let Some(stripe_billing) = app.stripe_billing.clone() else {
log::error!("failed to retrieve Stripe billing object");
Err(Error::http(
StatusCode::NOT_IMPLEMENTED,
"not supported".into(),
))?
};
let customer = app
.db
.get_billing_customer_by_user_id(user.id)
@@ -508,8 +510,8 @@ async fn manage_billing_subscription(
let flow = match body.intent {
ManageSubscriptionIntent::ManageSubscription => None,
ManageSubscriptionIntent::UpgradeToPro => {
let zed_pro_price_id = app.config.zed_pro_price_id()?;
let zed_free_price_id = app.config.zed_free_price_id()?;
let zed_pro_price_id = stripe_billing.zed_pro_price_id().await?;
let zed_free_price_id = stripe_billing.zed_free_price_id().await?;
let stripe_subscription =
Subscription::retrieve(&stripe_client, &subscription_id, &[]).await?;
@@ -856,9 +858,11 @@ async fn handle_customer_subscription_event(
log::info!("handling Stripe {} event: {}", event.type_, event.id);
let subscription_kind = maybe!({
let zed_pro_price_id = app.config.zed_pro_price_id().ok()?;
let zed_free_price_id = app.config.zed_free_price_id().ok()?;
let subscription_kind = maybe!(async {
let stripe_billing = app.stripe_billing.clone()?;
let zed_pro_price_id = stripe_billing.zed_pro_price_id().await.ok()?;
let zed_free_price_id = stripe_billing.zed_free_price_id().await.ok()?;
subscription.items.data.iter().find_map(|item| {
let price = item.price.as_ref()?;
@@ -875,7 +879,8 @@ async fn handle_customer_subscription_event(
None
}
})
});
})
.await;
let billing_customer =
find_or_create_billing_customer(app, stripe_client, subscription.customer)
@@ -1090,9 +1095,17 @@ struct UsageCounts {
pub remaining: Option<i32>,
}
#[derive(Debug, Serialize)]
struct ModelRequestUsage {
pub model: String,
pub mode: CompletionMode,
pub requests: i32,
}
#[derive(Debug, Serialize)]
struct GetCurrentUsageResponse {
pub model_requests: UsageCounts,
pub model_request_usage: Vec<ModelRequestUsage>,
pub edit_predictions: UsageCounts,
}
@@ -1119,6 +1132,7 @@ async fn get_current_usage(
limit: Some(0),
remaining: Some(0),
},
model_request_usage: Vec::new(),
edit_predictions: UsageCounts {
used: 0,
limit: Some(0),
@@ -1163,12 +1177,30 @@ async fn get_current_usage(
zed_llm_client::UsageLimit::Unlimited => None,
};
let subscription_usage_meters = llm_db
.get_current_subscription_usage_meters_for_user(user.id, Utc::now())
.await?;
let model_request_usage = subscription_usage_meters
.into_iter()
.filter_map(|(usage_meter, _usage)| {
let model = llm_db.model_by_id(usage_meter.model_id).ok()?;
Some(ModelRequestUsage {
model: model.name.clone(),
mode: usage_meter.mode,
requests: usage_meter.requests,
})
})
.collect::<Vec<_>>();
Ok(Json(GetCurrentUsageResponse {
model_requests: UsageCounts {
used: usage.model_requests,
limit: model_requests_limit,
remaining: model_requests_limit.map(|limit| (limit - usage.model_requests).max(0)),
},
model_request_usage,
edit_predictions: UsageCounts {
used: usage.edit_predictions,
limit: edit_prediction_limit,
@@ -1371,13 +1403,13 @@ async fn sync_model_request_usage_with_stripe(
.await?;
let claude_3_5_sonnet = stripe_billing
.find_price_by_lookup_key("claude-3-5-sonnet-requests")
.find_price_id_by_lookup_key("claude-3-5-sonnet-requests")
.await?;
let claude_3_7_sonnet = stripe_billing
.find_price_by_lookup_key("claude-3-7-sonnet-requests")
.find_price_id_by_lookup_key("claude-3-7-sonnet-requests")
.await?;
let claude_3_7_sonnet_max = stripe_billing
.find_price_by_lookup_key("claude-3-7-sonnet-requests-max")
.find_price_id_by_lookup_key("claude-3-7-sonnet-requests-max")
.await?;
for (usage_meter, usage) in usage_meters {
@@ -1403,11 +1435,11 @@ async fn sync_model_request_usage_with_stripe(
let model = llm_db.model_by_id(usage_meter.model_id)?;
let (price_id, meter_event_name) = match model.name.as_str() {
"claude-3-5-sonnet" => (&claude_3_5_sonnet.id, "claude_3_5_sonnet/requests"),
"claude-3-5-sonnet" => (&claude_3_5_sonnet, "claude_3_5_sonnet/requests"),
"claude-3-7-sonnet" => match usage_meter.mode {
CompletionMode::Normal => (&claude_3_7_sonnet.id, "claude_3_7_sonnet/requests"),
CompletionMode::Normal => (&claude_3_7_sonnet, "claude_3_7_sonnet/requests"),
CompletionMode::Max => {
(&claude_3_7_sonnet_max.id, "claude_3_7_sonnet/requests/max")
(&claude_3_7_sonnet_max, "claude_3_7_sonnet/requests/max")
}
},
model_name => {


@@ -180,9 +180,6 @@ pub struct Config {
pub slack_panics_webhook: Option<String>,
pub auto_join_channel_id: Option<ChannelId>,
pub stripe_api_key: Option<String>,
pub stripe_zed_pro_price_id: Option<String>,
pub stripe_zed_pro_trial_price_id: Option<String>,
pub stripe_zed_free_price_id: Option<String>,
pub supermaven_admin_api_key: Option<Arc<str>>,
pub user_backfiller_github_access_token: Option<Arc<str>>,
}
@@ -201,22 +198,6 @@ impl Config {
}
}
pub fn zed_pro_price_id(&self) -> anyhow::Result<stripe::PriceId> {
Self::parse_stripe_price_id("Zed Pro", self.stripe_zed_pro_price_id.as_deref())
}
pub fn zed_free_price_id(&self) -> anyhow::Result<stripe::PriceId> {
Self::parse_stripe_price_id("Zed Free", self.stripe_zed_pro_price_id.as_deref())
}
fn parse_stripe_price_id(name: &str, value: Option<&str>) -> anyhow::Result<stripe::PriceId> {
use std::str::FromStr as _;
let price_id = value.ok_or_else(|| anyhow!("{name} price ID not set"))?;
Ok(stripe::PriceId::from_str(price_id)?)
}
#[cfg(test)]
pub fn test() -> Self {
Self {
@@ -254,9 +235,6 @@ impl Config {
migrations_path: None,
seed_path: None,
stripe_api_key: None,
stripe_zed_pro_price_id: None,
stripe_zed_pro_trial_price_id: None,
stripe_zed_free_price_id: None,
supermaven_admin_api_key: None,
user_backfiller_github_access_token: None,
kinesis_region: None,


@@ -1,3 +1,4 @@
use crate::db::UserId;
use crate::llm::db::queries::subscription_usages::convert_chrono_to_time;
use super::*;
@@ -34,4 +35,38 @@ impl LlmDatabase {
})
.await
}
/// Returns all current subscription usage meters for the given user as of the given timestamp.
pub async fn get_current_subscription_usage_meters_for_user(
&self,
user_id: UserId,
now: DateTimeUtc,
) -> Result<Vec<(subscription_usage_meter::Model, subscription_usage::Model)>> {
let now = convert_chrono_to_time(now)?;
self.transaction(|tx| async move {
let result = subscription_usage_meter::Entity::find()
.inner_join(subscription_usage::Entity)
.filter(subscription_usage::Column::UserId.eq(user_id))
.filter(
subscription_usage::Column::PeriodStartAt
.lte(now)
.and(subscription_usage::Column::PeriodEndAt.gte(now)),
)
.select_also(subscription_usage::Entity)
.all(&*tx)
.await?;
let result = result
.into_iter()
.filter_map(|(meter, usage)| {
let usage = usage?;
Some((meter, usage))
})
.collect();
Ok(result)
})
.await
}
}


@@ -81,13 +81,21 @@ impl StripeBilling {
Ok(())
}
pub async fn find_price_by_lookup_key(&self, lookup_key: &str) -> Result<stripe::Price> {
pub async fn zed_pro_price_id(&self) -> Result<PriceId> {
self.find_price_id_by_lookup_key("zed-pro").await
}
pub async fn zed_free_price_id(&self) -> Result<PriceId> {
self.find_price_id_by_lookup_key("zed-free").await
}
pub async fn find_price_id_by_lookup_key(&self, lookup_key: &str) -> Result<PriceId> {
self.state
.read()
.await
.prices_by_lookup_key
.get(lookup_key)
.cloned()
.map(|price| price.id.clone())
.ok_or_else(|| crate::Error::Internal(anyhow!("no price ID found for {lookup_key:?}")))
}
@@ -463,19 +471,20 @@ impl StripeBilling {
Ok(session.url.context("no checkout session URL")?)
}
pub async fn checkout_with_price(
pub async fn checkout_with_zed_pro(
&self,
price_id: PriceId,
customer_id: stripe::CustomerId,
github_login: &str,
success_url: &str,
) -> Result<String> {
let zed_pro_price_id = self.zed_pro_price_id().await?;
let mut params = stripe::CreateCheckoutSession::new();
params.mode = Some(stripe::CheckoutSessionMode::Subscription);
params.customer = Some(customer_id);
params.client_reference_id = Some(github_login);
params.line_items = Some(vec![stripe::CreateCheckoutSessionLineItems {
price: Some(price_id.to_string()),
price: Some(zed_pro_price_id.to_string()),
quantity: Some(1),
..Default::default()
}]);
@@ -487,12 +496,13 @@ impl StripeBilling {
pub async fn checkout_with_zed_pro_trial(
&self,
zed_pro_price_id: PriceId,
customer_id: stripe::CustomerId,
github_login: &str,
feature_flags: Vec<String>,
success_url: &str,
) -> Result<String> {
let zed_pro_price_id = self.zed_pro_price_id().await?;
let eligible_for_extended_trial = feature_flags
.iter()
.any(|flag| flag == AGENT_EXTENDED_TRIAL_FEATURE_FLAG);


@@ -554,9 +554,6 @@ impl TestServer {
migrations_path: None,
seed_path: None,
stripe_api_key: None,
stripe_zed_pro_price_id: None,
stripe_zed_pro_trial_price_id: None,
stripe_zed_free_price_id: None,
supermaven_admin_api_key: None,
user_backfiller_github_access_token: None,
kinesis_region: None,


@@ -14,13 +14,16 @@ doctest = false
[dependencies]
anyhow.workspace = true
cargo_metadata.workspace = true
collections.workspace = true
component.workspace = true
ctor.workspace = true
editor.workspace = true
env_logger.workspace = true
futures.workspace = true
gpui.workspace = true
indoc.workspace = true
itertools.workspace = true
language.workspace = true
linkme.workspace = true
log.workspace = true
@@ -29,7 +32,9 @@ markdown.workspace = true
project.workspace = true
rand.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
smol.workspace = true
text.workspace = true
theme.workspace = true
ui.workspace = true


@@ -0,0 +1,603 @@
use std::{
path::{Component, Path, Prefix},
process::Stdio,
sync::atomic::{self, AtomicUsize},
};
use cargo_metadata::{
Message,
diagnostic::{Applicability, Diagnostic as CargoDiagnostic, DiagnosticLevel, DiagnosticSpan},
};
use collections::HashMap;
use gpui::{AppContext, Entity, Task};
use itertools::Itertools as _;
use language::Diagnostic;
use project::{
Worktree, lsp_store::rust_analyzer_ext::CARGO_DIAGNOSTICS_SOURCE_NAME,
project_settings::ProjectSettings,
};
use serde::{Deserialize, Serialize};
use settings::Settings;
use smol::{
channel::Receiver,
io::{AsyncBufReadExt, BufReader},
process::Command,
};
use ui::App;
use util::ResultExt;
use crate::ProjectDiagnosticsEditor;
#[derive(Debug, serde::Deserialize)]
#[serde(untagged)]
enum CargoMessage {
Cargo(Message),
Rustc(CargoDiagnostic),
}
/// Appends a formatted string to a `String`.
macro_rules! format_to {
($buf:expr) => ();
($buf:expr, $lit:literal $($arg:tt)*) => {
{
use ::std::fmt::Write as _;
// We can't do ::std::fmt::Write::write_fmt($buf, format_args!($lit $($arg)*))
// unfortunately, as that loses out on autoref behavior.
_ = $buf.write_fmt(format_args!($lit $($arg)*))
}
};
}
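The macro above keeps the `fmt::Write` import scoped to each expansion so call sites don't need it. A minimal self-contained use (the macro body is repeated here so the sketch compiles on its own; the `demo` function and its strings are illustrative):

```rust
/// Same shape as the `format_to!` macro above: append formatted
/// text to a `String`, importing `fmt::Write` only inside the expansion.
macro_rules! format_to {
    ($buf:expr) => ();
    ($buf:expr, $lit:literal $($arg:tt)*) => {{
        use ::std::fmt::Write as _;
        let _ = $buf.write_fmt(format_args!($lit $($arg)*));
    }};
}

fn demo() -> String {
    let mut message = String::from("error");
    // Appends in place, like the diagnostic message building in this file.
    format_to!(message, ": in `{}`", "main.rs");
    message
}
```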
pub fn cargo_diagnostics_sources(
editor: &ProjectDiagnosticsEditor,
cx: &App,
) -> Vec<Entity<Worktree>> {
let fetch_cargo_diagnostics = ProjectSettings::get_global(cx)
.diagnostics
.fetch_cargo_diagnostics();
if !fetch_cargo_diagnostics {
return Vec::new();
}
editor
.project
.read(cx)
.worktrees(cx)
.filter(|worktree| worktree.read(cx).entry_for_path("Cargo.toml").is_some())
.collect()
}
#[derive(Debug)]
pub enum FetchUpdate {
Diagnostic(CargoDiagnostic),
Progress(String),
}
#[derive(Debug)]
pub enum FetchStatus {
Started,
Progress { message: String },
Finished,
}
pub fn fetch_worktree_diagnostics(
worktree_root: &Path,
cx: &App,
) -> Option<(Task<()>, Receiver<FetchUpdate>)> {
let diagnostics_settings = ProjectSettings::get_global(cx)
.diagnostics
.cargo
.as_ref()
.filter(|cargo_diagnostics| cargo_diagnostics.fetch_cargo_diagnostics)?;
let command_string = diagnostics_settings
.diagnostics_fetch_command
.iter()
.join(" ");
let mut command_parts = diagnostics_settings.diagnostics_fetch_command.iter();
let mut command = Command::new(command_parts.next()?)
.args(command_parts)
.envs(diagnostics_settings.env.clone())
.current_dir(worktree_root)
.stdout(Stdio::piped())
.stderr(Stdio::null())
.kill_on_drop(true)
.spawn()
.log_err()?;
let stdout = command.stdout.take()?;
let mut reader = BufReader::new(stdout);
let (tx, rx) = smol::channel::unbounded();
let error_threshold = 10;
let cargo_diagnostics_fetch_task = cx.background_spawn(async move {
let _command = command;
let mut errors = 0;
loop {
let mut line = String::new();
match reader.read_line(&mut line).await {
Ok(0) => {
return;
},
Ok(_) => {
errors = 0;
let mut deserializer = serde_json::Deserializer::from_str(&line);
deserializer.disable_recursion_limit();
let send_result = match CargoMessage::deserialize(&mut deserializer) {
Ok(CargoMessage::Cargo(Message::CompilerMessage(message))) => tx.send(FetchUpdate::Diagnostic(message.message)).await,
Ok(CargoMessage::Cargo(Message::CompilerArtifact(artifact))) => tx.send(FetchUpdate::Progress(format!("Compiled {:?}", artifact.manifest_path.parent().unwrap_or(&artifact.manifest_path)))).await,
Ok(CargoMessage::Cargo(_)) => Ok(()),
Ok(CargoMessage::Rustc(rustc_message)) => tx.send(FetchUpdate::Diagnostic(rustc_message)).await,
Err(_) => {
log::debug!("Failed to parse cargo diagnostics from line '{line}'");
Ok(())
},
};
if send_result.is_err() {
return;
}
},
Err(e) => {
log::error!("Failed to read line from {command_string} command output when fetching cargo diagnostics: {e}");
errors += 1;
if errors >= error_threshold {
log::error!("Failed {error_threshold} times, aborting the diagnostics fetch");
return;
}
},
}
}
});
Some((cargo_diagnostics_fetch_task, rx))
}
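The read loop above tolerates transient read errors, resets its counter on every successful line, and aborts only after a threshold of consecutive failures. That policy can be sketched in isolation over a plain iterator of results (function name and error type are illustrative):

```rust
/// Consume line results, giving up after `threshold` consecutive errors.
/// Returns the successfully read lines.
fn read_with_error_threshold<I>(lines: I, threshold: usize) -> Vec<String>
where
    I: IntoIterator<Item = Result<String, String>>,
{
    let mut out = Vec::new();
    let mut errors = 0;
    for line in lines {
        match line {
            Ok(line) => {
                errors = 0; // a successful read resets the counter
                out.push(line);
            }
            Err(_) => {
                errors += 1;
                if errors >= threshold {
                    break; // too many consecutive failures: abort the fetch
                }
            }
        }
    }
    out
}
```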
static CARGO_DIAGNOSTICS_FETCH_GENERATION: AtomicUsize = AtomicUsize::new(0);
#[derive(Debug, Clone, Copy, Serialize, Deserialize)]
struct CargoFetchDiagnosticData {
generation: usize,
}
pub fn next_cargo_fetch_generation() {
CARGO_DIAGNOSTICS_FETCH_GENERATION.fetch_add(1, atomic::Ordering::Release);
}
pub fn is_outdated_cargo_fetch_diagnostic(diagnostic: &Diagnostic) -> bool {
if let Some(data) = diagnostic
.data
.clone()
.and_then(|data| serde_json::from_value::<CargoFetchDiagnosticData>(data).ok())
{
let current_generation = CARGO_DIAGNOSTICS_FETCH_GENERATION.load(atomic::Ordering::Acquire);
data.generation < current_generation
} else {
false
}
}
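The generation counter above stamps each fetched diagnostic, so a later fetch invalidates earlier results without tracking them individually. A minimal self-contained illustration of the idea (names simplified from the surrounding code):

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

static GENERATION: AtomicUsize = AtomicUsize::new(0);

/// Start a new fetch; everything stamped with an older generation becomes stale.
fn next_generation() -> usize {
    GENERATION.fetch_add(1, Ordering::Release) + 1
}

/// A diagnostic stamped with `stamped_generation` is outdated once a
/// newer fetch has started.
fn is_outdated(stamped_generation: usize) -> bool {
    stamped_generation < GENERATION.load(Ordering::Acquire)
}
```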
/// Converts a Rust root diagnostic to LSP form
///
/// This flattens the Rust diagnostic by:
///
/// 1. Creating an LSP diagnostic with the root message and primary span.
/// 2. Adding any labelled secondary spans to `relatedInformation`
/// 3. Categorising child diagnostics as either `SuggestedFix`es,
/// `relatedInformation` or additional message lines.
///
/// If the diagnostic has no primary span this will return an empty `Vec`.
///
/// Taken from https://github.com/rust-lang/rust-analyzer/blob/fe7b4f2ad96f7c13cc571f45edc2c578b35dddb4/crates/rust-analyzer/src/diagnostics/to_proto.rs#L275-L285
pub(crate) fn map_rust_diagnostic_to_lsp(
worktree_root: &Path,
cargo_diagnostic: &CargoDiagnostic,
) -> Vec<(lsp::Url, lsp::Diagnostic)> {
let primary_spans: Vec<&DiagnosticSpan> = cargo_diagnostic
.spans
.iter()
.filter(|s| s.is_primary)
.collect();
if primary_spans.is_empty() {
return Vec::new();
}
let severity = diagnostic_severity(cargo_diagnostic.level);
let mut source = String::from(CARGO_DIAGNOSTICS_SOURCE_NAME);
let mut code = cargo_diagnostic.code.as_ref().map(|c| c.code.clone());
if let Some(code_val) = &code {
// See if this is an RFC #2103 scoped lint (e.g. from Clippy)
let scoped_code: Vec<&str> = code_val.split("::").collect();
if scoped_code.len() == 2 {
source = String::from(scoped_code[0]);
code = Some(String::from(scoped_code[1]));
}
}
let mut needs_primary_span_label = true;
let mut subdiagnostics = Vec::new();
let mut tags = Vec::new();
for secondary_span in cargo_diagnostic.spans.iter().filter(|s| !s.is_primary) {
if let Some(label) = secondary_span.label.clone() {
subdiagnostics.push(lsp::DiagnosticRelatedInformation {
location: location(worktree_root, secondary_span),
message: label,
});
}
}
let mut message = cargo_diagnostic.message.clone();
for child in &cargo_diagnostic.children {
let child = map_rust_child_diagnostic(worktree_root, child);
match child {
MappedRustChildDiagnostic::SubDiagnostic(sub) => {
subdiagnostics.push(sub);
}
MappedRustChildDiagnostic::MessageLine(message_line) => {
format_to!(message, "\n{message_line}");
// These secondary messages usually duplicate the content of the
// primary span label.
needs_primary_span_label = false;
}
}
}
if let Some(code) = &cargo_diagnostic.code {
let code = code.code.as_str();
if matches!(
code,
"dead_code"
| "unknown_lints"
| "unreachable_code"
| "unused_attributes"
| "unused_imports"
| "unused_macros"
| "unused_variables"
) {
tags.push(lsp::DiagnosticTag::UNNECESSARY);
}
if matches!(code, "deprecated") {
tags.push(lsp::DiagnosticTag::DEPRECATED);
}
}
let code_description = match source.as_str() {
"rustc" => rustc_code_description(code.as_deref()),
"clippy" => clippy_code_description(code.as_deref()),
_ => None,
};
let generation = CARGO_DIAGNOSTICS_FETCH_GENERATION.load(atomic::Ordering::Acquire);
let data = Some(
serde_json::to_value(CargoFetchDiagnosticData { generation })
.expect("Serializing a regular Rust struct"),
);
primary_spans
.iter()
.flat_map(|primary_span| {
let primary_location = primary_location(worktree_root, primary_span);
let message = {
let mut message = message.clone();
if needs_primary_span_label {
if let Some(primary_span_label) = &primary_span.label {
format_to!(message, "\n{primary_span_label}");
}
}
message
};
// Each primary diagnostic span may result in multiple LSP diagnostics.
let mut diagnostics = Vec::new();
let mut related_info_macro_calls = vec![];
// If the error occurs during macro expansion, add related info pointing to
// where the error originated. Also generate an additional diagnostic so the
// exact in-macro location is highlighted at the error's origin.
let span_stack = std::iter::successors(Some(*primary_span), |span| {
Some(&span.expansion.as_ref()?.span)
});
for (i, span) in span_stack.enumerate() {
if is_dummy_macro_file(&span.file_name) {
continue;
}
// First span is the original diagnostic, others are macro call locations that
// generated that code.
let is_in_macro_call = i != 0;
let secondary_location = location(worktree_root, span);
if secondary_location == primary_location {
continue;
}
related_info_macro_calls.push(lsp::DiagnosticRelatedInformation {
location: secondary_location.clone(),
message: if is_in_macro_call {
"Error originated from macro call here".to_owned()
} else {
"Actual error occurred here".to_owned()
},
});
// For the additional in-macro diagnostic we add the inverse message pointing to the error location in code.
let information_for_additional_diagnostic =
vec![lsp::DiagnosticRelatedInformation {
location: primary_location.clone(),
message: "Exact error occurred here".to_owned(),
}];
let diagnostic = lsp::Diagnostic {
range: secondary_location.range,
// downgrade to hint if we're pointing at the macro
severity: Some(lsp::DiagnosticSeverity::HINT),
code: code.clone().map(lsp::NumberOrString::String),
code_description: code_description.clone(),
source: Some(source.clone()),
message: message.clone(),
related_information: Some(information_for_additional_diagnostic),
tags: if tags.is_empty() {
None
} else {
Some(tags.clone())
},
data: data.clone(),
};
diagnostics.push((secondary_location.uri, diagnostic));
}
// Emit the primary diagnostic.
diagnostics.push((
primary_location.uri.clone(),
lsp::Diagnostic {
range: primary_location.range,
severity,
code: code.clone().map(lsp::NumberOrString::String),
code_description: code_description.clone(),
source: Some(source.clone()),
message,
related_information: {
let info = related_info_macro_calls
.iter()
.cloned()
.chain(subdiagnostics.iter().cloned())
.collect::<Vec<_>>();
if info.is_empty() { None } else { Some(info) }
},
tags: if tags.is_empty() {
None
} else {
Some(tags.clone())
},
data: data.clone(),
},
));
// Emit hint-level diagnostics for all `related_information` entries such as "help"s.
// This is useful because they will show up in the user's editor, unlike
// `related_information`, which just produces hard-to-read links, at least in VS Code.
let back_ref = lsp::DiagnosticRelatedInformation {
location: primary_location,
message: "original diagnostic".to_owned(),
};
for sub in &subdiagnostics {
diagnostics.push((
sub.location.uri.clone(),
lsp::Diagnostic {
range: sub.location.range,
severity: Some(lsp::DiagnosticSeverity::HINT),
code: code.clone().map(lsp::NumberOrString::String),
code_description: code_description.clone(),
source: Some(source.clone()),
message: sub.message.clone(),
related_information: Some(vec![back_ref.clone()]),
tags: None, // don't apply modifiers again
data: data.clone(),
},
));
}
diagnostics
})
.collect()
}
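The RFC 2103 scoped-lint handling above splits a code such as `clippy::needless_return` into a `clippy` source and a `needless_return` code, leaving unscoped codes on the default source. A standalone sketch of just that split (`needless_return` is an example lint name):

```rust
/// Split a scoped lint code like "clippy::needless_return" into
/// (source, code); unscoped codes keep the default source.
fn split_scoped_lint(default_source: &str, code: &str) -> (String, String) {
    let scoped: Vec<&str> = code.split("::").collect();
    if scoped.len() == 2 {
        (scoped[0].to_string(), scoped[1].to_string())
    } else {
        (default_source.to_string(), code.to_string())
    }
}
```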
fn rustc_code_description(code: Option<&str>) -> Option<lsp::CodeDescription> {
code.filter(|code| {
let mut chars = code.chars();
chars.next() == Some('E')
&& chars.by_ref().take(4).all(|c| c.is_ascii_digit())
&& chars.next().is_none()
})
.and_then(|code| {
lsp::Url::parse(&format!(
"https://doc.rust-lang.org/error-index.html#{code}"
))
.ok()
.map(|href| lsp::CodeDescription { href })
})
}
fn clippy_code_description(code: Option<&str>) -> Option<lsp::CodeDescription> {
code.and_then(|code| {
lsp::Url::parse(&format!(
"https://rust-lang.github.io/rust-clippy/master/index.html#{code}"
))
.ok()
.map(|href| lsp::CodeDescription { href })
})
}
/// Determines the LSP severity from a diagnostic
fn diagnostic_severity(level: DiagnosticLevel) -> Option<lsp::DiagnosticSeverity> {
let res = match level {
DiagnosticLevel::Ice => lsp::DiagnosticSeverity::ERROR,
DiagnosticLevel::Error => lsp::DiagnosticSeverity::ERROR,
DiagnosticLevel::Warning => lsp::DiagnosticSeverity::WARNING,
DiagnosticLevel::Note => lsp::DiagnosticSeverity::INFORMATION,
DiagnosticLevel::Help => lsp::DiagnosticSeverity::HINT,
_ => return None,
};
Some(res)
}
enum MappedRustChildDiagnostic {
SubDiagnostic(lsp::DiagnosticRelatedInformation),
MessageLine(String),
}
fn map_rust_child_diagnostic(
worktree_root: &Path,
cargo_diagnostic: &CargoDiagnostic,
) -> MappedRustChildDiagnostic {
let spans: Vec<&DiagnosticSpan> = cargo_diagnostic
.spans
.iter()
.filter(|s| s.is_primary)
.collect();
if spans.is_empty() {
// `rustc` uses these spanless children as a way to print multi-line
// messages
return MappedRustChildDiagnostic::MessageLine(cargo_diagnostic.message.clone());
}
let mut edit_map: HashMap<lsp::Url, Vec<lsp::TextEdit>> = HashMap::default();
let mut suggested_replacements = Vec::new();
for &span in &spans {
if let Some(suggested_replacement) = &span.suggested_replacement {
if !suggested_replacement.is_empty() {
suggested_replacements.push(suggested_replacement);
}
let location = location(worktree_root, span);
let edit = lsp::TextEdit::new(location.range, suggested_replacement.clone());
// Only actually emit a quickfix if the suggestion is "valid enough".
// We accept both "MaybeIncorrect" and "MachineApplicable". "MaybeIncorrect" means that
// the suggestion is *complete* (contains no placeholders where code needs to be
// inserted), but might not be what the user wants, or might need minor adjustments.
if matches!(
span.suggestion_applicability,
None | Some(Applicability::MaybeIncorrect | Applicability::MachineApplicable)
) {
edit_map.entry(location.uri).or_default().push(edit);
}
}
}
// rustc renders suggestion diagnostics by appending the suggested replacement, so do the same
// here, otherwise the diagnostic text is missing useful information.
let mut message = cargo_diagnostic.message.clone();
if !suggested_replacements.is_empty() {
message.push_str(": ");
let suggestions = suggested_replacements
.iter()
.map(|suggestion| format!("`{suggestion}`"))
.join(", ");
message.push_str(&suggestions);
}
MappedRustChildDiagnostic::SubDiagnostic(lsp::DiagnosticRelatedInformation {
location: location(worktree_root, spans[0]),
message,
})
}
/// Converts a Rust span to a LSP location
fn location(worktree_root: &Path, span: &DiagnosticSpan) -> lsp::Location {
let file_name = worktree_root.join(&span.file_name);
let uri = url_from_abs_path(&file_name);
let range = {
lsp::Range::new(
position(span, span.line_start, span.column_start.saturating_sub(1)),
position(span, span.line_end, span.column_end.saturating_sub(1)),
)
};
lsp::Location::new(uri, range)
}
/// Returns a `Url` object from a given path; lowercases the drive letter if present.
/// This will only happen when processing Windows paths.
///
/// When processing a non-Windows path, this is essentially the same as `Url::from_file_path`.
pub(crate) fn url_from_abs_path(path: &Path) -> lsp::Url {
let url = lsp::Url::from_file_path(path).unwrap();
match path.components().next() {
Some(Component::Prefix(prefix))
if matches!(prefix.kind(), Prefix::Disk(_) | Prefix::VerbatimDisk(_)) =>
{
// Need to lowercase drive letter
}
_ => return url,
}
let driver_letter_range = {
let (scheme, drive_letter, _rest) = match url.as_str().splitn(3, ':').collect_tuple() {
Some(it) => it,
None => return url,
};
let start = scheme.len() + ':'.len_utf8();
start..(start + drive_letter.len())
};
// Note: lowercasing the `path` itself doesn't help, the `Url::parse`
// machinery *also* canonicalizes the drive letter. So, just massage the
// string in place.
let mut url: String = url.into();
url[driver_letter_range].make_ascii_lowercase();
lsp::Url::parse(&url).unwrap()
}
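The in-place drive-letter massaging above can be sketched on plain strings (a simplified illustration; the real code operates on `lsp::Url` and only runs for Windows disk prefixes). Note that the "drive letter" segment produced by splitting on `:` includes the leading slashes, on which lowercasing is a no-op:

```rust
/// Lowercase the drive letter of a "file:///C:/..."-style URL string,
/// mirroring the range arithmetic above: everything between the first
/// and second ':' is treated as the drive-letter segment.
fn lowercase_drive_letter(url: &str) -> String {
    let mut parts = url.splitn(3, ':');
    match (parts.next(), parts.next(), parts.next()) {
        (Some(scheme), Some(drive), Some(_rest)) => {
            let start = scheme.len() + 1; // skip past "scheme:"
            let mut s = url.to_string();
            s[start..start + drive.len()].make_ascii_lowercase();
            s
        }
        // Fewer than two ':' separators: nothing to massage.
        _ => url.to_string(),
    }
}
```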
fn position(
span: &DiagnosticSpan,
line_number: usize,
column_offset_utf32: usize,
) -> lsp::Position {
let line_index = line_number - span.line_start;
let column_offset_encoded = match span.text.get(line_index) {
// Fast path.
Some(line) if line.text.is_ascii() => column_offset_utf32,
Some(line) => {
let line_prefix_len = line
.text
.char_indices()
.take(column_offset_utf32)
.last()
.map(|(pos, c)| pos + c.len_utf8())
.unwrap_or(0);
let line_prefix = &line.text[..line_prefix_len];
line_prefix.len()
}
None => column_offset_utf32,
};
lsp::Position {
line: (line_number as u32).saturating_sub(1),
character: column_offset_encoded as u32,
}
}
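The slow path above converts a column counted in characters (UTF-32, as rustc reports spans) into a byte offset within the line text. The core conversion can be isolated as follows (function name is illustrative):

```rust
/// Convert a UTF-32 (character-count) column offset into a byte offset
/// within `line`, mirroring the slow path above: take that many chars
/// and measure the byte length of the consumed prefix.
fn utf32_col_to_byte_offset(line: &str, column_offset_utf32: usize) -> usize {
    if line.is_ascii() {
        // Fast path: chars and bytes coincide.
        return column_offset_utf32;
    }
    line.char_indices()
        .take(column_offset_utf32)
        .last()
        .map(|(pos, c)| pos + c.len_utf8())
        .unwrap_or(0)
}
```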
/// Checks whether a file name is from macro invocation and does not refer to an actual file.
fn is_dummy_macro_file(file_name: &str) -> bool {
file_name.starts_with('<') && file_name.ends_with('>')
}
/// Extracts a suitable "primary" location from a rustc diagnostic.
///
/// This takes locations pointing into the standard library, or generally outside the current
/// workspace into account and tries to avoid those, in case macros are involved.
fn primary_location(worktree_root: &Path, span: &DiagnosticSpan) -> lsp::Location {
let span_stack = std::iter::successors(Some(span), |span| Some(&span.expansion.as_ref()?.span));
for span in span_stack.clone() {
let abs_path = worktree_root.join(&span.file_name);
if !is_dummy_macro_file(&span.file_name) && abs_path.starts_with(worktree_root) {
return location(worktree_root, span);
}
}
// Fall back to the outermost macro invocation if no suitable span comes up.
let last_span = span_stack.last().unwrap();
location(worktree_root, last_span)
}


@@ -1,3 +1,4 @@
mod cargo;
pub mod items;
mod toolbar_controls;
@@ -7,7 +8,12 @@ mod diagnostic_renderer;
mod diagnostics_tests;
use anyhow::Result;
use collections::{BTreeSet, HashMap};
use cargo::{
FetchStatus, FetchUpdate, cargo_diagnostics_sources, fetch_worktree_diagnostics,
is_outdated_cargo_fetch_diagnostic, map_rust_diagnostic_to_lsp, next_cargo_fetch_generation,
url_from_abs_path,
};
use collections::{BTreeSet, HashMap, HashSet};
use diagnostic_renderer::DiagnosticBlock;
use editor::{
DEFAULT_MULTIBUFFER_CONTEXT, Editor, EditorEvent, ExcerptRange, MultiBuffer, PathKey,
@@ -22,14 +28,16 @@ use gpui::{
use language::{
Bias, Buffer, BufferRow, BufferSnapshot, DiagnosticEntry, Point, ToTreeSitterPoint,
};
use lsp::DiagnosticSeverity;
use project::{DiagnosticSummary, Project, ProjectPath, project_settings::ProjectSettings};
use lsp::{DiagnosticSeverity, LanguageServerId};
use project::{
DiagnosticSummary, Project, ProjectPath, Worktree,
lsp_store::rust_analyzer_ext::{CARGO_DIAGNOSTICS_SOURCE_NAME, RUST_ANALYZER_NAME},
project_settings::ProjectSettings,
};
use settings::Settings;
use std::{
any::{Any, TypeId},
cmp,
cmp::Ordering,
cmp::{self, Ordering},
ops::{Range, RangeInclusive},
sync::Arc,
time::Duration,
@@ -45,7 +53,10 @@ use workspace::{
searchable::SearchableItemHandle,
};
actions!(diagnostics, [Deploy, ToggleWarnings]);
actions!(
diagnostics,
[Deploy, ToggleWarnings, ToggleDiagnosticsRefresh]
);
#[derive(Default)]
pub(crate) struct IncludeWarnings(bool);
@@ -68,9 +79,15 @@ pub(crate) struct ProjectDiagnosticsEditor {
paths_to_update: BTreeSet<ProjectPath>,
include_warnings: bool,
update_excerpts_task: Option<Task<Result<()>>>,
cargo_diagnostics_fetch: CargoDiagnosticsFetchState,
_subscription: Subscription,
}
struct CargoDiagnosticsFetchState {
task: Option<Task<()>>,
rust_analyzer: Option<LanguageServerId>,
}
impl EventEmitter<EditorEvent> for ProjectDiagnosticsEditor {}
const DIAGNOSTICS_UPDATE_DELAY: Duration = Duration::from_millis(50);
@@ -126,6 +143,7 @@ impl Render for ProjectDiagnosticsEditor {
.track_focus(&self.focus_handle(cx))
.size_full()
.on_action(cx.listener(Self::toggle_warnings))
.on_action(cx.listener(Self::toggle_diagnostics_refresh))
.child(child)
}
}
@@ -212,7 +230,11 @@ impl ProjectDiagnosticsEditor {
cx.observe_global_in::<IncludeWarnings>(window, |this, window, cx| {
this.include_warnings = cx.global::<IncludeWarnings>().0;
this.diagnostics.clear();
this.update_all_excerpts(window, cx);
this.update_all_diagnostics(window, cx);
})
.detach();
cx.observe_release(&cx.entity(), |editor, _, cx| {
editor.stop_cargo_diagnostics_fetch(cx);
})
.detach();
@@ -229,9 +251,13 @@ impl ProjectDiagnosticsEditor {
editor,
paths_to_update: Default::default(),
update_excerpts_task: None,
cargo_diagnostics_fetch: CargoDiagnosticsFetchState {
task: None,
rust_analyzer: None,
},
_subscription: project_event_subscription,
};
this.update_all_excerpts(window, cx);
this.update_all_diagnostics(window, cx);
this
}
@@ -239,15 +265,17 @@ impl ProjectDiagnosticsEditor {
if self.update_excerpts_task.is_some() {
return;
}
let project_handle = self.project.clone();
self.update_excerpts_task = Some(cx.spawn_in(window, async move |this, cx| {
cx.background_executor()
.timer(DIAGNOSTICS_UPDATE_DELAY)
.await;
loop {
let Some(path) = this.update(cx, |this, _| {
let Some(path) = this.update(cx, |this, cx| {
let Some(path) = this.paths_to_update.pop_first() else {
this.update_excerpts_task.take();
this.update_excerpts_task = None;
cx.notify();
return None;
};
Some(path)
@@ -307,6 +335,32 @@ impl ProjectDiagnosticsEditor {
cx.set_global(IncludeWarnings(!self.include_warnings));
}
fn toggle_diagnostics_refresh(
&mut self,
_: &ToggleDiagnosticsRefresh,
window: &mut Window,
cx: &mut Context<Self>,
) {
let fetch_cargo_diagnostics = ProjectSettings::get_global(cx)
.diagnostics
.fetch_cargo_diagnostics();
if fetch_cargo_diagnostics {
if self.cargo_diagnostics_fetch.task.is_some() {
self.stop_cargo_diagnostics_fetch(cx);
} else {
self.update_all_diagnostics(window, cx);
}
} else {
if self.update_excerpts_task.is_some() {
self.update_excerpts_task = None;
} else {
self.update_all_diagnostics(window, cx);
}
}
cx.notify();
}
fn focus_in(&mut self, window: &mut Window, cx: &mut Context<Self>) {
if self.focus_handle.is_focused(window) && !self.multibuffer.read(cx).is_empty() {
self.editor.focus_handle(cx).focus(window)
@@ -320,6 +374,303 @@ impl ProjectDiagnosticsEditor {
}
}
fn update_all_diagnostics(&mut self, window: &mut Window, cx: &mut Context<Self>) {
let cargo_diagnostics_sources = cargo_diagnostics_sources(self, cx);
if cargo_diagnostics_sources.is_empty() {
self.update_all_excerpts(window, cx);
} else {
self.fetch_cargo_diagnostics(Arc::new(cargo_diagnostics_sources), window, cx);
}
}
fn fetch_cargo_diagnostics(
&mut self,
diagnostics_sources: Arc<Vec<Entity<Worktree>>>,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.cargo_diagnostics_fetch.task = Some(cx.spawn_in(window, async move |editor, cx| {
let rust_analyzer_server = editor
.update(cx, |editor, cx| {
editor
.project
.read(cx)
.language_server_with_name(RUST_ANALYZER_NAME, cx)
})
.ok();
let rust_analyzer_server = match rust_analyzer_server {
Some(rust_analyzer_server) => rust_analyzer_server.await,
None => None,
};
let mut worktree_diagnostics_tasks = Vec::new();
let mut paths_with_reported_cargo_diagnostics = HashSet::default();
if let Some(rust_analyzer_server) = rust_analyzer_server {
let can_continue = editor
.update(cx, |editor, cx| {
editor.cargo_diagnostics_fetch.rust_analyzer = Some(rust_analyzer_server);
let status_inserted =
editor
.project
.read(cx)
.lsp_store()
.update(cx, |lsp_store, cx| {
if let Some(rust_analyzer_status) = lsp_store
.language_server_statuses
.get_mut(&rust_analyzer_server)
{
rust_analyzer_status
.progress_tokens
.insert(fetch_cargo_diagnostics_token());
paths_with_reported_cargo_diagnostics.extend(editor.diagnostics.iter().filter_map(|(buffer_id, diagnostics)| {
if diagnostics.iter().any(|d| d.diagnostic.source.as_deref() == Some(CARGO_DIAGNOSTICS_SOURCE_NAME)) {
Some(*buffer_id)
} else {
None
}
}).filter_map(|buffer_id| {
let buffer = lsp_store.buffer_store().read(cx).get(buffer_id)?;
let path = buffer.read(cx).file()?.as_local()?.abs_path(cx);
Some(url_from_abs_path(&path))
}));
true
} else {
false
}
});
if status_inserted {
editor.update_cargo_fetch_status(FetchStatus::Started, cx);
next_cargo_fetch_generation();
true
} else {
false
}
})
.unwrap_or(false);
if can_continue {
for worktree in diagnostics_sources.iter() {
if let Some(((_task, worktree_diagnostics), worktree_root)) = cx
.update(|_, cx| {
let worktree_root = worktree.read(cx).abs_path();
log::info!("Fetching cargo diagnostics for {worktree_root:?}");
fetch_worktree_diagnostics(&worktree_root, cx)
.zip(Some(worktree_root))
})
.ok()
.flatten()
{
let editor = editor.clone();
worktree_diagnostics_tasks.push(cx.spawn(async move |cx| {
let _task = _task;
let mut file_diagnostics = HashMap::default();
let mut diagnostics_total = 0;
let mut updated_urls = HashSet::default();
while let Ok(fetch_update) = worktree_diagnostics.recv().await {
match fetch_update {
FetchUpdate::Diagnostic(diagnostic) => {
for (url, diagnostic) in map_rust_diagnostic_to_lsp(
&worktree_root,
&diagnostic,
) {
let file_diagnostics = file_diagnostics
.entry(url)
.or_insert_with(Vec::<lsp::Diagnostic>::new);
let i = file_diagnostics
.binary_search_by(|probe| {
probe.range.start.cmp(&diagnostic.range.start)
.then(probe.range.end.cmp(&diagnostic.range.end))
.then(Ordering::Greater)
})
.unwrap_or_else(|i| i);
file_diagnostics.insert(i, diagnostic);
}
let file_changed = file_diagnostics.len() > 1;
if file_changed {
if editor
.update_in(cx, |editor, window, cx| {
editor
.project
.read(cx)
.lsp_store()
.update(cx, |lsp_store, cx| {
for (uri, mut diagnostics) in
file_diagnostics.drain()
{
diagnostics.dedup();
diagnostics_total += diagnostics.len();
updated_urls.insert(uri.clone());
lsp_store.merge_diagnostics(
rust_analyzer_server,
lsp::PublishDiagnosticsParams {
uri,
diagnostics,
version: None,
},
&[],
|diagnostic, _| {
!is_outdated_cargo_fetch_diagnostic(diagnostic)
},
cx,
)?;
}
anyhow::Ok(())
})?;
editor.update_all_excerpts(window, cx);
anyhow::Ok(())
})
.ok()
.transpose()
.ok()
.flatten()
.is_none()
{
break;
}
}
}
FetchUpdate::Progress(message) => {
if editor
.update(cx, |editor, cx| {
editor.update_cargo_fetch_status(
FetchStatus::Progress { message },
cx,
);
})
.is_err()
{
return updated_urls;
}
}
}
}
editor
.update_in(cx, |editor, window, cx| {
editor
.project
.read(cx)
.lsp_store()
.update(cx, |lsp_store, cx| {
for (uri, mut diagnostics) in
file_diagnostics.drain()
{
diagnostics.dedup();
diagnostics_total += diagnostics.len();
updated_urls.insert(uri.clone());
lsp_store.merge_diagnostics(
rust_analyzer_server,
lsp::PublishDiagnosticsParams {
uri,
diagnostics,
version: None,
},
&[],
|diagnostic, _| {
!is_outdated_cargo_fetch_diagnostic(diagnostic)
},
cx,
)?;
}
anyhow::Ok(())
})?;
editor.update_all_excerpts(window, cx);
anyhow::Ok(())
})
.ok();
log::info!("Fetched {diagnostics_total} cargo diagnostics for worktree {worktree_root:?}");
updated_urls
}));
}
}
} else {
log::info!(
"No rust-analyzer language server found, skipping diagnostics fetch"
);
}
}
let updated_urls = futures::future::join_all(worktree_diagnostics_tasks).await.into_iter().flatten().collect();
if let Some(rust_analyzer_server) = rust_analyzer_server {
editor
.update_in(cx, |editor, window, cx| {
editor
.project
.read(cx)
.lsp_store()
.update(cx, |lsp_store, cx| {
for uri_to_cleanup in paths_with_reported_cargo_diagnostics.difference(&updated_urls).cloned() {
lsp_store.merge_diagnostics(
rust_analyzer_server,
lsp::PublishDiagnosticsParams {
uri: uri_to_cleanup,
diagnostics: Vec::new(),
version: None,
},
&[],
|diagnostic, _| {
!is_outdated_cargo_fetch_diagnostic(diagnostic)
},
cx,
).ok();
}
});
editor.update_all_excerpts(window, cx);
editor.stop_cargo_diagnostics_fetch(cx);
cx.notify();
})
.ok();
}
}));
}
fn update_cargo_fetch_status(&self, status: FetchStatus, cx: &mut App) {
let Some(rust_analyzer) = self.cargo_diagnostics_fetch.rust_analyzer else {
return;
};
let work_done = match status {
FetchStatus::Started => lsp::WorkDoneProgress::Begin(lsp::WorkDoneProgressBegin {
title: "cargo".to_string(),
cancellable: None,
message: Some("Fetching cargo diagnostics".to_string()),
percentage: None,
}),
FetchStatus::Progress { message } => {
lsp::WorkDoneProgress::Report(lsp::WorkDoneProgressReport {
message: Some(message),
cancellable: None,
percentage: None,
})
}
FetchStatus::Finished => {
lsp::WorkDoneProgress::End(lsp::WorkDoneProgressEnd { message: None })
}
};
let progress = lsp::ProgressParams {
token: lsp::NumberOrString::String(fetch_cargo_diagnostics_token()),
value: lsp::ProgressParamsValue::WorkDone(work_done),
};
self.project
.read(cx)
.lsp_store()
.update(cx, |lsp_store, cx| {
lsp_store.on_lsp_progress(progress, rust_analyzer, None, cx)
});
}
fn stop_cargo_diagnostics_fetch(&mut self, cx: &mut App) {
self.update_cargo_fetch_status(FetchStatus::Finished, cx);
self.cargo_diagnostics_fetch.task = None;
log::info!("Finished fetching cargo diagnostics");
}
/// Enqueue an update of all excerpts. Updates all paths that either
/// currently have diagnostics or are currently present in this view.
fn update_all_excerpts(&mut self, window: &mut Window, cx: &mut Context<Self>) {
@@ -422,20 +773,17 @@ impl ProjectDiagnosticsEditor {
})?;
for item in more {
let insert_pos = blocks
.binary_search_by(|existing| {
match existing.initial_range.start.cmp(&item.initial_range.start) {
Ordering::Equal => item
.initial_range
.end
.cmp(&existing.initial_range.end)
.reverse(),
other => other,
}
let i = blocks
.binary_search_by(|probe| {
probe
.initial_range
.start
.cmp(&item.initial_range.start)
.then(probe.initial_range.end.cmp(&item.initial_range.end))
.then(Ordering::Greater)
})
.unwrap_or_else(|pos| pos);
blocks.insert(insert_pos, item);
.unwrap_or_else(|i| i);
blocks.insert(i, item);
}
}
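The sorted-insert pattern these hunks introduce can be sketched in isolation. Ending the comparator with `.then(Ordering::Greater)` means it never returns `Equal`, so `binary_search_by` always yields `Err(i)`, a valid insertion index that keeps the vector sorted. The tuple keys below are made-up stand-ins for the diff's anchor ranges:

```rust
use std::cmp::Ordering;

// Stable sorted insert: the comparator never reports Equal, so the search
// always fails with Err(i); ties land just before existing equal entries.
fn sorted_insert(v: &mut Vec<(u32, u32)>, item: (u32, u32)) {
    let i = v
        .binary_search_by(|probe| {
            probe
                .0
                .cmp(&item.0)
                .then(probe.1.cmp(&item.1))
                .then(Ordering::Greater)
        })
        .unwrap_or_else(|i| i);
    v.insert(i, item);
}

fn main() {
    let mut v = vec![(1, 2), (3, 4)];
    sorted_insert(&mut v, (2, 9));
    sorted_insert(&mut v, (1, 2)); // duplicate: no panic, order preserved
    assert_eq!(v, vec![(1, 2), (1, 2), (2, 9), (3, 4)]);
}
```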
@@ -448,10 +796,25 @@ impl ProjectDiagnosticsEditor {
&mut cx,
)
.await;
excerpt_ranges.push(ExcerptRange {
context: excerpt_range,
primary: b.initial_range.clone(),
})
let i = excerpt_ranges
.binary_search_by(|probe| {
probe
.context
.start
.cmp(&excerpt_range.start)
.then(probe.context.end.cmp(&excerpt_range.end))
.then(probe.primary.start.cmp(&b.initial_range.start))
.then(probe.primary.end.cmp(&b.initial_range.end))
.then(cmp::Ordering::Greater)
})
.unwrap_or_else(|i| i);
excerpt_ranges.insert(
i,
ExcerptRange {
context: excerpt_range,
primary: b.initial_range.clone(),
},
)
}
this.update_in(cx, |this, window, cx| {
@@ -923,3 +1286,7 @@ fn is_line_blank_or_indented_less(
let line_indent = snapshot.line_indent_for_row(row);
line_indent.is_line_blank() || line_indent.len(tab_size) < indent_level
}
fn fetch_cargo_diagnostics_token() -> String {
"fetch_cargo_diagnostics".to_string()
}


@@ -1,4 +1,7 @@
use crate::ProjectDiagnosticsEditor;
use std::sync::Arc;
use crate::cargo::cargo_diagnostics_sources;
use crate::{ProjectDiagnosticsEditor, ToggleDiagnosticsRefresh};
use gpui::{Context, Entity, EventEmitter, ParentElement, Render, WeakEntity, Window};
use ui::prelude::*;
use ui::{IconButton, IconButtonShape, IconName, Tooltip};
@@ -13,18 +16,28 @@ impl Render for ToolbarControls {
let mut include_warnings = false;
let mut has_stale_excerpts = false;
let mut is_updating = false;
let cargo_diagnostics_sources = Arc::new(
self.diagnostics()
.map(|editor| cargo_diagnostics_sources(editor.read(cx), cx))
.unwrap_or_default(),
);
let fetch_cargo_diagnostics = !cargo_diagnostics_sources.is_empty();
if let Some(editor) = self.diagnostics() {
let diagnostics = editor.read(cx);
include_warnings = diagnostics.include_warnings;
has_stale_excerpts = !diagnostics.paths_to_update.is_empty();
is_updating = diagnostics.update_excerpts_task.is_some()
|| diagnostics
.project
.read(cx)
.language_servers_running_disk_based_diagnostics(cx)
.next()
.is_some();
is_updating = if fetch_cargo_diagnostics {
diagnostics.cargo_diagnostics_fetch.task.is_some()
} else {
diagnostics.update_excerpts_task.is_some()
|| diagnostics
.project
.read(cx)
.language_servers_running_disk_based_diagnostics(cx)
.next()
.is_some()
};
}
let tooltip = if include_warnings {
@@ -41,21 +54,57 @@ impl Render for ToolbarControls {
h_flex()
.gap_1()
.when(has_stale_excerpts, |div| {
div.child(
IconButton::new("update-excerpts", IconName::Update)
.icon_color(Color::Info)
.shape(IconButtonShape::Square)
.disabled(is_updating)
.tooltip(Tooltip::text("Update excerpts"))
.on_click(cx.listener(|this, _, window, cx| {
if let Some(diagnostics) = this.diagnostics() {
diagnostics.update(cx, |diagnostics, cx| {
diagnostics.update_all_excerpts(window, cx);
});
}
})),
)
.map(|div| {
if is_updating {
div.child(
IconButton::new("stop-updating", IconName::StopFilled)
.icon_color(Color::Info)
.shape(IconButtonShape::Square)
.tooltip(Tooltip::for_action_title(
"Stop diagnostics update",
&ToggleDiagnosticsRefresh,
))
.on_click(cx.listener(move |toolbar_controls, _, _, cx| {
if let Some(diagnostics) = toolbar_controls.diagnostics() {
diagnostics.update(cx, |diagnostics, cx| {
diagnostics.stop_cargo_diagnostics_fetch(cx);
diagnostics.update_excerpts_task = None;
cx.notify();
});
}
})),
)
} else {
div.child(
IconButton::new("refresh-diagnostics", IconName::Update)
.icon_color(Color::Info)
.shape(IconButtonShape::Square)
.disabled(!has_stale_excerpts && !fetch_cargo_diagnostics)
.tooltip(Tooltip::for_action_title(
"Refresh diagnostics",
&ToggleDiagnosticsRefresh,
))
.on_click(cx.listener({
move |toolbar_controls, _, window, cx| {
if let Some(diagnostics) = toolbar_controls.diagnostics() {
let cargo_diagnostics_sources =
Arc::clone(&cargo_diagnostics_sources);
diagnostics.update(cx, move |diagnostics, cx| {
if fetch_cargo_diagnostics {
diagnostics.fetch_cargo_diagnostics(
cargo_diagnostics_sources,
window,
cx,
);
} else {
diagnostics.update_all_excerpts(window, cx);
}
});
}
}
})),
)
}
})
.child(
IconButton::new("toggle-warnings", IconName::Warning)


@@ -5005,11 +5005,11 @@ impl Editor {
range
};
ranges.push(range);
ranges.push(range.clone());
if !self.linked_edit_ranges.is_empty() {
let start_anchor = snapshot.anchor_before(selection.head());
let end_anchor = snapshot.anchor_after(selection.tail());
let start_anchor = snapshot.anchor_before(range.start);
let end_anchor = snapshot.anchor_after(range.end);
if let Some(ranges) = self
.linked_editing_ranges_for(start_anchor.text_anchor..end_anchor.text_anchor, cx)
{


@@ -1,6 +1,7 @@
use super::*;
use crate::{
JoinLines,
linked_editing_ranges::LinkedEditingRanges,
scroll::scroll_amount::ScrollAmount,
test::{
assert_text_with_selections, build_editor,
@@ -19559,6 +19560,146 @@ async fn test_hide_mouse_context_menu_on_modal_opened(cx: &mut TestAppContext) {
});
}
#[gpui::test]
async fn test_html_linked_edits_on_completion(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let fs = FakeFs::new(cx.executor());
fs.insert_file(path!("/file.html"), Default::default())
.await;
let project = Project::test(fs, [path!("/").as_ref()], cx).await;
let language_registry = project.read_with(cx, |project, _| project.languages().clone());
let html_language = Arc::new(Language::new(
LanguageConfig {
name: "HTML".into(),
matcher: LanguageMatcher {
path_suffixes: vec!["html".to_string()],
..LanguageMatcher::default()
},
brackets: BracketPairConfig {
pairs: vec![BracketPair {
start: "<".into(),
end: ">".into(),
close: true,
..Default::default()
}],
..Default::default()
},
..Default::default()
},
Some(tree_sitter_html::LANGUAGE.into()),
));
language_registry.add(html_language);
let mut fake_servers = language_registry.register_fake_lsp(
"HTML",
FakeLspAdapter {
capabilities: lsp::ServerCapabilities {
completion_provider: Some(lsp::CompletionOptions {
resolve_provider: Some(true),
..Default::default()
}),
..Default::default()
},
..Default::default()
},
);
let workspace = cx.add_window(|window, cx| Workspace::test_new(project.clone(), window, cx));
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let worktree_id = workspace
.update(cx, |workspace, _window, cx| {
workspace.project().update(cx, |project, cx| {
project.worktrees(cx).next().unwrap().read(cx).id()
})
})
.unwrap();
project
.update(cx, |project, cx| {
project.open_local_buffer_with_lsp(path!("/file.html"), cx)
})
.await
.unwrap();
let editor = workspace
.update(cx, |workspace, window, cx| {
workspace.open_path((worktree_id, "file.html"), None, true, window, cx)
})
.unwrap()
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
let fake_server = fake_servers.next().await.unwrap();
editor.update_in(cx, |editor, window, cx| {
editor.set_text("<ad></ad>", window, cx);
editor.change_selections(None, window, cx, |selections| {
selections.select_ranges([Point::new(0, 3)..Point::new(0, 3)]);
});
let Some((buffer, _)) = editor
.buffer
.read(cx)
.text_anchor_for_position(editor.selections.newest_anchor().start, cx)
else {
panic!("Failed to get buffer for selection position");
};
let buffer = buffer.read(cx);
let buffer_id = buffer.remote_id();
let opening_range =
buffer.anchor_before(Point::new(0, 1))..buffer.anchor_after(Point::new(0, 3));
let closing_range =
buffer.anchor_before(Point::new(0, 6))..buffer.anchor_after(Point::new(0, 8));
let mut linked_ranges = HashMap::default();
linked_ranges.insert(
buffer_id,
vec![(opening_range.clone(), vec![closing_range.clone()])],
);
editor.linked_edit_ranges = LinkedEditingRanges(linked_ranges);
});
let mut completion_handle =
fake_server.set_request_handler::<lsp::request::Completion, _, _>(move |_, _| async move {
Ok(Some(lsp::CompletionResponse::Array(vec![
lsp::CompletionItem {
label: "head".to_string(),
text_edit: Some(lsp::CompletionTextEdit::InsertAndReplace(
lsp::InsertReplaceEdit {
new_text: "head".to_string(),
insert: lsp::Range::new(
lsp::Position::new(0, 1),
lsp::Position::new(0, 3),
),
replace: lsp::Range::new(
lsp::Position::new(0, 1),
lsp::Position::new(0, 3),
),
},
)),
..Default::default()
},
])))
});
editor.update_in(cx, |editor, window, cx| {
editor.show_completions(&ShowCompletions { trigger: None }, window, cx);
});
cx.run_until_parked();
completion_handle.next().await.unwrap();
editor.update(cx, |editor, _| {
assert!(
editor.context_menu_visible(),
"Completion menu should be visible"
);
});
editor.update_in(cx, |editor, window, cx| {
editor.confirm_completion(&ConfirmCompletion::default(), window, cx)
});
cx.executor().run_until_parked();
editor.update(cx, |editor, cx| {
assert_eq!(editor.text(cx), "<head></head>");
});
}
fn empty_range(row: usize, column: usize) -> Range<DisplayPoint> {
let point = DisplayPoint::new(DisplayRow(row as u32), column as u32);
point..point


@@ -2276,6 +2276,9 @@ impl EditorElement {
}
let display_row = multibuffer_point.to_display_point(snapshot).row();
if !range.contains(&display_row) {
return None;
}
if row_infos
.get((display_row - range.start).0 as usize)
.is_some_and(|row_info| row_info.expand_info.is_some())


@@ -233,7 +233,7 @@ pub fn deploy_context_menu(
.separator()
.action("Cut", Box::new(Cut))
.action("Copy", Box::new(Copy))
.action("Copy and trim", Box::new(CopyAndTrim))
.action("Copy and Trim", Box::new(CopyAndTrim))
.action("Paste", Box::new(Paste))
.separator()
.map(|builder| {


@@ -52,10 +52,10 @@ struct Args {
#[arg(long, value_delimiter = ',', default_value = "rs,ts")]
languages: Vec<String>,
/// How many times to run each example.
#[arg(long, default_value = "1")]
#[arg(long, default_value = "8")]
repetitions: usize,
/// Maximum number of examples to run concurrently.
#[arg(long, default_value = "10")]
#[arg(long, default_value = "4")]
concurrency: usize,
}


@@ -98,54 +98,64 @@ impl Example for CodeBlockCitations {
if let Some(content_len) = content_len {
// + 1 because there's a newline character after the citation.
let content =
&text[(citation.len() + 1)..content_len - (citation.len() + 1)];
let start_index = citation.len() + 1;
let end_index = content_len.saturating_sub(start_index);
// deindent (trim the start of each line) because sometimes the model
// chooses to deindent its code snippets for the sake of readability,
// which in markdown is not only reasonable but usually desirable.
cx.assert(
deindent(&buffer_text)
.trim()
.contains(deindent(&content).trim()),
"Code block content was found in file",
)
.ok();
if let Some(range) = path_range.range {
let start_line_index = range.start.line.saturating_sub(1);
let line_count =
range.end.line.saturating_sub(start_line_index);
let mut snippet = buffer_text
.lines()
.skip(start_line_index as usize)
.take(line_count as usize)
.collect::<Vec<&str>>()
.join("\n");
if let Some(start_col) = range.start.col {
snippet = snippet[start_col as usize..].to_string();
}
if let Some(end_col) = range.end.col {
let last_line = snippet.lines().last().unwrap();
snippet = snippet
[..snippet.len() - last_line.len() + end_col as usize]
.to_string();
}
if cx
.assert(
start_index <= end_index,
"Code block had a valid citation",
)
.is_ok()
{
let content = &text[start_index..end_index];
// deindent (trim the start of each line) because sometimes the model
// chooses to deindent its code snippets for the sake of readability,
// which in markdown is not only reasonable but usually desirable.
cx.assert_eq(
deindent(snippet.as_str()).trim(),
deindent(content).trim(),
format!(
"Code block was at {:?}-{:?}",
range.start, range.end
),
cx.assert(
deindent(&buffer_text)
.trim()
.contains(deindent(&content).trim()),
"Code block content was found in file",
)
.ok();
if let Some(range) = path_range.range {
let start_line_index = range.start.line.saturating_sub(1);
let line_count =
range.end.line.saturating_sub(start_line_index);
let mut snippet = buffer_text
.lines()
.skip(start_line_index as usize)
.take(line_count as usize)
.collect::<Vec<&str>>()
.join("\n");
if let Some(start_col) = range.start.col {
snippet = snippet[start_col as usize..].to_string();
}
if let Some(end_col) = range.end.col {
let last_line = snippet.lines().last().unwrap();
snippet = snippet[..snippet.len() - last_line.len()
+ end_col as usize]
.to_string();
}
// deindent (trim the start of each line) because sometimes the model
// chooses to deindent its code snippets for the sake of readability,
// which in markdown is not only reasonable but usually desirable.
cx.assert_eq(
deindent(snippet.as_str()).trim(),
deindent(content).trim(),
format!(
"Code block was at {:?}-{:?}",
range.start, range.end
),
)
.ok();
}
}
}
}
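The bounds fix above reduces to a tiny helper: `saturating_sub` avoids `usize` underflow, and the `start <= end` check rejects a malformed block before any slicing happens. A minimal sketch (the lengths in the asserts are invented):

```rust
// Guarded slice bounds, mirroring the eval fix: underflow saturates to 0
// and an inverted range is reported as None instead of panicking.
fn slice_bounds(citation_len: usize, content_len: usize) -> Option<(usize, usize)> {
    let start = citation_len + 1;
    let end = content_len.saturating_sub(start);
    (start <= end).then_some((start, end))
}

fn main() {
    assert_eq!(slice_bounds(7, 40), Some((8, 32)));
    // A block shorter than its citation no longer underflows or panics:
    assert_eq!(slice_bounds(7, 5), None);
}
```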


@@ -822,7 +822,6 @@ impl FileFinderDelegate {
did_cancel: bool,
query: FileSearchQuery,
matches: impl IntoIterator<Item = ProjectPanelOrdMatch>,
cx: &mut Context<Picker<Self>>,
) {
if search_id >= self.latest_search_id {
@@ -849,7 +848,7 @@ impl FileFinderDelegate {
);
self.selected_index = selected_match.map_or_else(
|| self.calculate_selected_index(),
|| self.calculate_selected_index(cx),
|m| {
self.matches
.position(&m, self.currently_opened_path.as_ref())
@@ -1092,12 +1091,14 @@ impl FileFinderDelegate {
}
/// Skips first history match (that is displayed topmost) if it's currently opened.
fn calculate_selected_index(&self) -> usize {
if let Some(Match::History { path, .. }) = self.matches.get(0) {
if Some(path) == self.currently_opened_path.as_ref() {
let elements_after_first = self.matches.len() - 1;
if elements_after_first > 0 {
return 1;
fn calculate_selected_index(&self, cx: &mut Context<Picker<Self>>) -> usize {
if FileFinderSettings::get_global(cx).skip_focus_for_active_in_search {
if let Some(Match::History { path, .. }) = self.matches.get(0) {
if Some(path) == self.currently_opened_path.as_ref() {
let elements_after_first = self.matches.len() - 1;
if elements_after_first > 0 {
return 1;
}
}
}
}
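The selection rule this hunk gates on a setting can be sketched standalone; a plain `bool` stands in for `skip_focus_for_active_in_search`, and `&str` paths stand in for the finder's `Match` values:

```rust
// If the topmost match is the file that is already open, select the next
// match instead - but only when the setting allows skipping it.
fn selected_index(matches: &[&str], currently_opened: Option<&str>, skip_active: bool) -> usize {
    if skip_active {
        if let (Some(first), Some(open)) = (matches.first(), currently_opened) {
            if *first == open && matches.len() > 1 {
                return 1;
            }
        }
    }
    0
}

fn main() {
    let matches = ["main.rs", "lib.rs", "bar.rs"];
    assert_eq!(selected_index(&matches, Some("main.rs"), true), 1);
    assert_eq!(selected_index(&matches, Some("main.rs"), false), 0);
    assert_eq!(selected_index(&["main.rs"], Some("main.rs"), true), 0);
}
```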


@@ -7,6 +7,7 @@ use settings::{Settings, SettingsSources};
pub struct FileFinderSettings {
pub file_icons: bool,
pub modal_max_width: Option<FileFinderWidth>,
pub skip_focus_for_active_in_search: bool,
}
#[derive(Clone, Default, Serialize, Deserialize, JsonSchema, Debug)]
@@ -19,6 +20,10 @@ pub struct FileFinderSettingsContent {
///
/// Default: small
pub modal_max_width: Option<FileFinderWidth>,
/// Determines whether the file finder should skip focus for the active file in search results.
///
/// Default: true
pub skip_focus_for_active_in_search: Option<bool>,
}
impl Settings for FileFinderSettings {


@@ -1359,6 +1359,73 @@ async fn test_keep_opened_file_on_top_of_search_results_and_select_next_one(
});
}
#[gpui::test]
async fn test_setting_auto_select_first_and_select_active_file(cx: &mut TestAppContext) {
let app_state = init_test(cx);
cx.update(|cx| {
let settings = *FileFinderSettings::get_global(cx);
FileFinderSettings::override_global(
FileFinderSettings {
skip_focus_for_active_in_search: false,
..settings
},
cx,
);
});
app_state
.fs
.as_fake()
.insert_tree(
path!("/src"),
json!({
"test": {
"bar.rs": "// Bar file",
"lib.rs": "// Lib file",
"maaa.rs": "// Maaaaaaa",
"main.rs": "// Main file",
"moo.rs": "// Moooooo",
}
}),
)
.await;
let project = Project::test(app_state.fs.clone(), [path!("/src").as_ref()], cx).await;
let (workspace, cx) = cx.add_window_view(|window, cx| Workspace::test_new(project, window, cx));
open_close_queried_buffer("bar", 1, "bar.rs", &workspace, cx).await;
open_close_queried_buffer("lib", 1, "lib.rs", &workspace, cx).await;
open_queried_buffer("main", 1, "main.rs", &workspace, cx).await;
// main.rs is on top, previously used is selected
let picker = open_file_picker(&workspace, cx);
picker.update(cx, |finder, _| {
assert_eq!(finder.delegate.matches.len(), 3);
assert_match_selection(finder, 0, "main.rs");
assert_match_at_position(finder, 1, "lib.rs");
assert_match_at_position(finder, 2, "bar.rs");
});
// all files match, main.rs is on top, and is selected
picker
.update_in(cx, |finder, window, cx| {
finder
.delegate
.update_matches(".rs".to_string(), window, cx)
})
.await;
picker.update(cx, |finder, _| {
assert_eq!(finder.delegate.matches.len(), 5);
assert_match_selection(finder, 0, "main.rs");
assert_match_at_position(finder, 1, "bar.rs");
assert_match_at_position(finder, 2, "lib.rs");
assert_match_at_position(finder, 3, "moo.rs");
assert_match_at_position(finder, 4, "maaa.rs");
});
}
#[gpui::test]
async fn test_non_separate_history_items(cx: &mut TestAppContext) {
let app_state = init_test(cx);


@@ -411,13 +411,13 @@ impl GitRepository for RealGitRepository {
"--no-optional-locks",
"show",
"--no-patch",
"--format=%H%x00%B%x00%at%x00%ae%x00%an",
"--format=%H%x00%B%x00%at%x00%ae%x00%an%x00",
&commit,
])
.output()?;
let output = std::str::from_utf8(&output.stdout)?;
let fields = output.split('\0').collect::<Vec<_>>();
if fields.len() != 5 {
if fields.len() != 6 {
bail!("unexpected git-show output for {commit:?}: {output:?}")
}
let sha = fields[0].to_string().into();
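The extra trailing `%x00` makes the split deterministic: git appends a newline after the format expansion, and without a final terminator that newline would be glued onto the author-name field. A standalone sketch with invented commit values:

```rust
// With `--format=%H%x00%B%x00%at%x00%ae%x00%an%x00` every field is
// NUL-terminated, so git's trailing newline lands in a sixth, discardable
// piece instead of polluting the author name.
fn main() {
    let output = "abc123\x00Fix a bug\x001714000000\x00a@b.c\x00Alice\x00\n";
    let fields: Vec<&str> = output.split('\x00').collect();
    assert_eq!(fields.len(), 6);
    assert_eq!(fields[4], "Alice"); // no stray trailing newline
    assert_eq!(fields[5], "\n"); // safely ignored remainder
}
```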


@@ -239,6 +239,10 @@ impl CachedLspAdapter {
.process_diagnostics(params, server_id, existing_diagnostics)
}
pub fn retain_old_diagnostic(&self, previous_diagnostic: &Diagnostic, cx: &App) -> bool {
self.adapter.retain_old_diagnostic(previous_diagnostic, cx)
}
pub fn diagnostic_message_to_markdown(&self, message: &str) -> Option<String> {
self.adapter.diagnostic_message_to_markdown(message)
}
@@ -461,6 +465,11 @@ pub trait LspAdapter: 'static + Send + Sync {
) {
}
/// When processing new `lsp::PublishDiagnosticsParams` diagnostics, determines whether to retain the previous one(s).
fn retain_old_diagnostic(&self, _previous_diagnostic: &Diagnostic, _cx: &App) -> bool {
false
}
/// Post-processes completions provided by the language server.
async fn process_completions(&self, _: &mut [lsp::CompletionItem]) {}


@@ -3,3 +3,10 @@
("{" @open "}" @close)
("\"" @open "\"" @close)
("`" @open "`" @close)
(("do" @open "done" @close) (#set! newline.only))
((case_statement ("in" @open "esac" @close)) (#set! newline.only))
((if_statement (elif_clause ("then" @open)) (else_clause ("else" @close))) (#set! newline.only))
((if_statement (else_clause ("else" @open)) "fi" @close) (#set! newline.only))
((if_statement ("then" @open) (elif_clause ("elif" @close))) (#set! newline.only))
((if_statement ("then" @open) (else_clause ("else" @close))) (#set! newline.only))
((if_statement ("then" @open "fi" @close)) (#set! newline.only))


@@ -10,6 +10,11 @@ brackets = [
{ start = "{", end = "}", close = true, newline = true },
{ start = "\"", end = "\"", close = true, newline = false, not_in = ["comment", "string"] },
{ start = "'", end = "'", close = true, newline = false, not_in = ["string", "comment"] },
{ start = "do", end = "done", close = false, newline = true, not_in = ["comment", "string"] },
{ start = "then", end = "fi", close = false, newline = true, not_in = ["comment", "string"] },
{ start = "then", end = "else", close = false, newline = true, not_in = ["comment", "string"] },
{ start = "then", end = "elif", close = false, newline = true, not_in = ["comment", "string"] },
{ start = "in", end = "esac", close = false, newline = true, not_in = ["comment", "string"] },
]
### WARN: the following does not work when you insert an `elif` just before an `else`


@@ -298,9 +298,9 @@ impl super::LspAdapter for CLspAdapter {
&self,
params: &mut lsp::PublishDiagnosticsParams,
server_id: LanguageServerId,
buffer_access: Option<&'_ Buffer>,
buffer: Option<&'_ Buffer>,
) {
if let Some(buffer) = buffer_access {
if let Some(buffer) = buffer {
let snapshot = buffer.snapshot();
let inactive_regions = buffer
.get_diagnostics(server_id)


@@ -4,12 +4,12 @@
name: (identifier) @run @_unittest_class_name
superclasses: (argument_list
[(identifier) @_superclass
(attribute (identifier) @_superclass)]
)
(attribute (identifier) @_superclass)]
)
(#eq? @_superclass "TestCase")
) @_python-unittest-class
) @_python-unittest-class
(#set! tag python-unittest-class)
)
)
; test methods whose names start with `test` in a TestCase
(
@@ -17,18 +17,18 @@
name: (identifier) @_unittest_class_name
superclasses: (argument_list
[(identifier) @_superclass
(attribute (identifier) @_superclass)]
)
(attribute (identifier) @_superclass)]
)
(#eq? @_superclass "TestCase")
body: (block
(function_definition
name: (identifier) @run @_unittest_method_name
(#match? @_unittest_method_name "^test.*")
(function_definition
name: (identifier) @run @_unittest_method_name
(#match? @_unittest_method_name "^test.*")
) @_python-unittest-method
(#set! tag python-unittest-method)
(#set! tag python-unittest-method)
)
)
)
)
; pytest functions
(
@@ -36,10 +36,10 @@
(function_definition
name: (identifier) @run @_pytest_method_name
(#match? @_pytest_method_name "^test_")
) @_python-pytest-method
)
) @_python-pytest-method
)
(#set! tag python-pytest-method)
)
)
; decorated pytest functions
(
@@ -53,7 +53,8 @@
) @_python-pytest-method
)
(#set! tag python-pytest-method)
)
)
; pytest classes
(
@@ -61,10 +62,26 @@
(class_definition
name: (identifier) @run @_pytest_class_name
(#match? @_pytest_class_name "^Test")
)
)
(#set! tag python-pytest-class)
)
)
)
; decorated pytest classes
(
(module
(decorated_definition
(decorator)+ @_decorator
definition: (class_definition
name: (identifier) @run @_pytest_class_name
(#match? @_pytest_class_name "^Test")
)
)
(#set! tag python-pytest-class)
)
)
; pytest class methods
(
@@ -73,35 +90,49 @@
name: (identifier) @_pytest_class_name
(#match? @_pytest_class_name "^Test")
body: (block
(function_definition
name: (identifier) @run @_pytest_method_name
(#match? @_pytest_method_name "^test")
) @_python-pytest-method
(#set! tag python-pytest-method)
)
)
)
)
; decorated pytest class methods
(
(module
(class_definition
name: (identifier) @_pytest_class_name
(#match? @_pytest_class_name "^Test")
body: (block
(decorated_definition
[(decorated_definition
(decorator)+ @_decorator
definition: (function_definition
name: (identifier) @run @_pytest_method_name
(#match? @_pytest_method_name "^test_")
)
)
) @_python-pytest-method
(function_definition
name: (identifier) @run @_pytest_method_name
(#match? @_pytest_method_name "^test")
)
] @_python-pytest-method)
(#set! tag python-pytest-method)
)
)
)
; decorated pytest class methods
(
(module
(decorated_definition
(decorator)+ @_decorator
definition: (class_definition
name: (identifier) @_pytest_class_name
(#match? @_pytest_class_name "^Test")
body: (block
[(decorated_definition
(decorator)+ @_decorator
definition: (function_definition
name: (identifier) @run @_pytest_method_name
(#match? @_pytest_method_name "^test_")
)
)
(function_definition
name: (identifier) @run @_pytest_method_name
(#match? @_pytest_method_name "^test")
)
] @_python-pytest-method)
(#set! tag python-pytest-method)
)
)
)
)
)
; module main method
(
@@ -111,10 +142,10 @@
(identifier) @run @_lhs
operators: "=="
(string) @_rhs
)
)
(#eq? @_lhs "__name__")
(#match? @_rhs "^[\"']__main__[\"']$")
(#set! tag python-module-main-method)
)
)
)
)


@@ -8,6 +8,7 @@ use http_client::github::AssetKind;
use http_client::github::{GitHubLspBinaryVersion, latest_github_release};
pub use language::*;
use lsp::{InitializeParams, LanguageServerBinary};
use project::lsp_store::rust_analyzer_ext::CARGO_DIAGNOSTICS_SOURCE_NAME;
use project::project_settings::ProjectSettings;
use regex::Regex;
use serde_json::json;
@@ -252,13 +253,22 @@ impl LspAdapter for RustLspAdapter {
}
fn disk_based_diagnostic_sources(&self) -> Vec<String> {
vec!["rustc".into()]
vec![CARGO_DIAGNOSTICS_SOURCE_NAME.to_owned()]
}
fn disk_based_diagnostics_progress_token(&self) -> Option<String> {
Some("rust-analyzer/flycheck".into())
}
fn retain_old_diagnostic(&self, previous_diagnostic: &Diagnostic, cx: &App) -> bool {
let zed_provides_cargo_diagnostics = ProjectSettings::get_global(cx)
.diagnostics
.fetch_cargo_diagnostics();
// Zed manages the lifecycle of cargo diagnostics when configured so.
zed_provides_cargo_diagnostics
&& previous_diagnostic.source.as_deref() == Some(CARGO_DIAGNOSTICS_SOURCE_NAME)
}
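The retain predicate added here feeds the diagnostics merge: old entries the closure approves survive, and the incoming batch is appended. A simplified sketch, with strings standing in for Zed's `Diagnostic` values (the real merge also de-duplicates and sorts by range):

```rust
// Merge-with-retain: keep only the old diagnostics the predicate accepts,
// then append the new batch.
fn merge_diagnostics<F: Fn(&str) -> bool>(
    existing: &mut Vec<String>,
    incoming: Vec<String>,
    retain_old: F,
) {
    existing.retain(|d| retain_old(d));
    existing.extend(incoming);
}

fn main() {
    let mut diags = vec!["cargo: E0308".to_string(), "ra: unused".to_string()];
    // Keep previous cargo-sourced entries, as retain_old_diagnostic does
    // when Zed itself manages cargo diagnostics.
    merge_diagnostics(&mut diags, vec!["ra: fresh".to_string()], |d| {
        d.starts_with("cargo:")
    });
    assert_eq!(diags, vec!["cargo: E0308", "ra: fresh"]);
}
```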
fn process_diagnostics(
&self,
params: &mut lsp::PublishDiagnosticsParams,
@@ -499,12 +509,27 @@ impl LspAdapter for RustLspAdapter {
"kinds": [ "cargo", "shell" ],
},
});
if let Some(ref mut original_experimental) = original.capabilities.experimental {
if let Some(original_experimental) = &mut original.capabilities.experimental {
merge_json_value_into(experimental, original_experimental);
} else {
original.capabilities.experimental = Some(experimental);
}
}
let zed_provides_cargo_diagnostics = ProjectSettings::get_global(cx)
.diagnostics
.fetch_cargo_diagnostics();
if zed_provides_cargo_diagnostics {
let disable_check_on_save = json!({
"checkOnSave": false,
});
if let Some(initialization_options) = &mut original.initialization_options {
merge_json_value_into(disable_check_on_save, initialization_options);
} else {
original.initialization_options = Some(disable_check_on_save);
}
}
Ok(original)
}
}


@@ -1686,7 +1686,10 @@ impl MultiBuffer {
let mut counts: Vec<usize> = Vec::new();
for range in expanded_ranges {
if let Some(last_range) = merged_ranges.last_mut() {
debug_assert!(last_range.context.start <= range.context.start);
debug_assert!(
last_range.context.start <= range.context.start,
"Last range: {last_range:?} Range: {range:?}"
);
if last_range.context.end >= range.context.start {
last_range.context.end = range.context.end.max(last_range.context.end);
*counts.last_mut().unwrap() += 1;


@@ -467,10 +467,11 @@ impl LocalLspStore {
adapter.process_diagnostics(&mut params, server_id, buffer);
}
this.update_diagnostics(
this.merge_diagnostics(
server_id,
params,
&adapter.disk_based_diagnostic_sources,
|diagnostic, cx| adapter.retain_old_diagnostic(diagnostic, cx),
cx,
)
.log_err();
@@ -3395,7 +3396,7 @@ pub struct LanguageServerStatus {
pub name: String,
pub pending_work: BTreeMap<String, LanguageServerProgress>,
pub has_pending_diagnostic_updates: bool,
progress_tokens: HashSet<String>,
pub progress_tokens: HashSet<String>,
}
#[derive(Clone, Debug)]
@@ -6237,6 +6238,13 @@ impl LspStore {
})
}
pub fn language_server_with_name(&self, name: &str, cx: &App) -> Option<LanguageServerId> {
self.as_local()?
.lsp_tree
.read(cx)
.server_id_for_name(&LanguageServerName::from(name))
}
pub fn language_servers_for_local_buffer<'a>(
&'a self,
buffer: &Buffer,
@@ -6380,10 +6388,10 @@ impl LspStore {
diagnostics: Vec<DiagnosticEntry<Unclipped<PointUtf16>>>,
cx: &mut Context<Self>,
) -> anyhow::Result<()> {
self.merge_diagnostic_entries(server_id, abs_path, version, diagnostics, |_| false, cx)
self.merge_diagnostic_entries(server_id, abs_path, version, diagnostics, |_, _| false, cx)
}
pub fn merge_diagnostic_entries<F: Fn(&Diagnostic) -> bool + Clone>(
pub fn merge_diagnostic_entries<F: Fn(&Diagnostic, &App) -> bool + Clone>(
&mut self,
server_id: LanguageServerId,
abs_path: PathBuf,
@@ -6416,7 +6424,7 @@ impl LspStore {
.get_diagnostics(server_id)
.into_iter()
.flat_map(|diag| {
diag.iter().filter(|v| filter(&v.diagnostic)).map(|v| {
diag.iter().filter(|v| filter(&v.diagnostic, cx)).map(|v| {
let start = Unclipped(v.range.start.to_point_utf16(&snapshot));
let end = Unclipped(v.range.end.to_point_utf16(&snapshot));
DiagnosticEntry {
@@ -7021,27 +7029,38 @@ impl LspStore {
envelope: TypedEnvelope<proto::LanguageServerIdForName>,
mut cx: AsyncApp,
) -> Result<proto::LanguageServerIdForNameResponse> {
let buffer_id = BufferId::new(envelope.payload.buffer_id)?;
let name = &envelope.payload.name;
lsp_store
.update(&mut cx, |lsp_store, cx| {
let buffer = lsp_store.buffer_store.read(cx).get_existing(buffer_id)?;
let server_id = buffer.update(cx, |buffer, cx| {
lsp_store
.language_servers_for_local_buffer(buffer, cx)
.find_map(|(adapter, server)| {
if adapter.name.0.as_ref() == name {
Some(server.server_id())
} else {
None
}
})
});
Ok(server_id)
})?
.map(|server_id| proto::LanguageServerIdForNameResponse {
server_id: server_id.map(|id| id.to_proto()),
})
match envelope.payload.buffer_id {
Some(buffer_id) => {
let buffer_id = BufferId::new(buffer_id)?;
lsp_store
.update(&mut cx, |lsp_store, cx| {
let buffer = lsp_store.buffer_store.read(cx).get_existing(buffer_id)?;
let server_id = buffer.update(cx, |buffer, cx| {
lsp_store
.language_servers_for_local_buffer(buffer, cx)
.find_map(|(adapter, server)| {
if adapter.name.0.as_ref() == name {
Some(server.server_id())
} else {
None
}
})
});
Ok(server_id)
})?
.map(|server_id| proto::LanguageServerIdForNameResponse {
server_id: server_id.map(|id| id.to_proto()),
})
}
None => lsp_store.update(&mut cx, |lsp_store, cx| {
proto::LanguageServerIdForNameResponse {
server_id: lsp_store
.language_server_with_name(name, cx)
.map(|id| id.to_proto()),
}
}),
}
}
async fn handle_rename_project_entry(
@@ -7517,7 +7536,7 @@ impl LspStore {
}
}
fn on_lsp_progress(
pub fn on_lsp_progress(
&mut self,
progress: lsp::ProgressParams,
language_server_id: LanguageServerId,
@@ -8550,12 +8569,12 @@ impl LspStore {
language_server_id,
params,
disk_based_sources,
|_| false,
|_, _| false,
cx,
)
}
pub fn merge_diagnostics<F: Fn(&Diagnostic) -> bool + Clone>(
pub fn merge_diagnostics<F: Fn(&Diagnostic, &App) -> bool + Clone>(
&mut self,
language_server_id: LanguageServerId,
mut params: lsp::PublishDiagnosticsParams,


@@ -75,7 +75,7 @@ pub fn register_notifications(
server_id,
mapped_diagnostics,
&adapter.disk_based_diagnostic_sources,
|diag| !is_inactive_region(diag),
|diag, _| !is_inactive_region(diag),
cx,
)
.log_err();


@@ -5,6 +5,7 @@ use lsp::LanguageServer;
use crate::{LanguageServerPromptRequest, LspStore, LspStoreEvent};
pub const RUST_ANALYZER_NAME: &str = "rust-analyzer";
pub const CARGO_DIAGNOSTICS_SOURCE_NAME: &str = "rustc";
/// Experimental: Informs the end user about the state of the server
///


@@ -247,6 +247,20 @@ impl LanguageServerTree {
self.languages.adapter_for_name(name)
}
pub fn server_id_for_name(&self, name: &LanguageServerName) -> Option<LanguageServerId> {
self.instances
.values()
.flat_map(|instance| instance.roots.values())
.flatten()
.find_map(|(server_name, (data, _))| {
if server_name == name {
data.id.get().copied()
} else {
None
}
})
}
fn adapters_for_language(
&self,
settings_location: SettingsLocation,


@@ -4748,6 +4748,42 @@ impl Project {
})
}
pub fn language_server_with_name(
&self,
name: &str,
cx: &App,
) -> Task<Option<LanguageServerId>> {
if self.is_local() {
Task::ready(self.lsp_store.read(cx).language_server_with_name(name, cx))
} else if let Some(project_id) = self.remote_id() {
let request = self.client.request(proto::LanguageServerIdForName {
project_id,
buffer_id: None,
name: name.to_string(),
});
cx.background_spawn(async move {
let response = request.await.log_err()?;
response.server_id.map(LanguageServerId::from_proto)
})
} else if let Some(ssh_client) = self.ssh_client.as_ref() {
let request =
ssh_client
.read(cx)
.proto_client()
.request(proto::LanguageServerIdForName {
project_id: SSH_PROJECT_ID,
buffer_id: None,
name: name.to_string(),
});
cx.background_spawn(async move {
let response = request.await.log_err()?;
response.server_id.map(LanguageServerId::from_proto)
})
} else {
Task::ready(None)
}
}
pub fn language_server_id_for_name(
&self,
buffer: &Buffer,
@@ -4769,7 +4805,7 @@ impl Project {
} else if let Some(project_id) = self.remote_id() {
let request = self.client.request(proto::LanguageServerIdForName {
project_id,
buffer_id: buffer.remote_id().to_proto(),
buffer_id: Some(buffer.remote_id().to_proto()),
name: name.to_string(),
});
cx.background_spawn(async move {
@@ -4783,7 +4819,7 @@ impl Project {
.proto_client()
.request(proto::LanguageServerIdForName {
project_id: SSH_PROJECT_ID,
buffer_id: buffer.remote_id().to_proto(),
buffer_id: Some(buffer.remote_id().to_proto()),
name: name.to_string(),
});
cx.background_spawn(async move {


@@ -99,7 +99,7 @@ pub enum DirenvSettings {
Direct,
}
#[derive(Copy, Clone, Debug, Default, Serialize, Deserialize, JsonSchema)]
#[derive(Clone, Debug, Default, Serialize, Deserialize, JsonSchema)]
pub struct DiagnosticsSettings {
/// Whether or not to include warning diagnostics
#[serde(default = "true_value")]
@@ -108,6 +108,18 @@ pub struct DiagnosticsSettings {
/// Settings for showing inline diagnostics
#[serde(default)]
pub inline: InlineDiagnosticsSettings,
/// Configuration related to Rust language diagnostics.
#[serde(default)]
pub cargo: Option<CargoDiagnosticsSettings>,
}
impl DiagnosticsSettings {
pub fn fetch_cargo_diagnostics(&self) -> bool {
self.cargo.as_ref().map_or(false, |cargo_diagnostics| {
cargo_diagnostics.fetch_cargo_diagnostics
})
}
}
#[derive(Clone, Copy, Debug, Serialize, Deserialize, JsonSchema)]
@@ -141,6 +153,41 @@ pub struct InlineDiagnosticsSettings {
pub max_severity: Option<DiagnosticSeverity>,
}
#[derive(Clone, Debug, Default, Serialize, Deserialize, JsonSchema)]
pub struct CargoDiagnosticsSettings {
/// When enabled, Zed runs `cargo check --message-format=json`-based commands and
/// collects cargo diagnostics instead of rust-analyzer.
///
/// Default: false
#[serde(default)]
pub fetch_cargo_diagnostics: bool,
/// A command override for fetching cargo diagnostics.
/// The first element is the command; the rest are its arguments.
///
/// Default: ["cargo", "check", "--quiet", "--workspace", "--message-format=json", "--all-targets", "--keep-going"]
#[serde(default = "default_diagnostics_fetch_command")]
pub diagnostics_fetch_command: Vec<String>,
/// Extra environment variables to pass to the diagnostics fetch command.
///
/// Default: {}
#[serde(default)]
pub env: HashMap<String, String>,
}
fn default_diagnostics_fetch_command() -> Vec<String> {
vec![
"cargo".to_string(),
"check".to_string(),
"--quiet".to_string(),
"--workspace".to_string(),
"--message-format=json".to_string(),
"--all-targets".to_string(),
"--keep-going".to_string(),
]
}
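Taken together, the `DiagnosticsSettings` fields above suggest a user-settings shape like the following. This is a hedged sketch assuming the struct deserializes from a top-level `diagnostics` key (per `ProjectSettings::diagnostics` earlier in the diff); field names come from the serde attributes above, and the values are illustrative:

```jsonc
{
  "diagnostics": {
    "cargo": {
      // Let Zed, not rust-analyzer, run the check command and collect diagnostics.
      "fetch_cargo_diagnostics": true,
      // Optional override; omitting it keeps the default command below.
      "diagnostics_fetch_command": [
        "cargo", "check", "--quiet", "--workspace",
        "--message-format=json", "--all-targets", "--keep-going"
      ],
      // Extra environment variables to pass to the fetch command.
      "env": {}
    }
  }
}
```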
#[derive(Clone, Copy, Debug, Serialize, Deserialize, JsonSchema)]
#[serde(rename_all = "snake_case")]
pub enum DiagnosticSeverity {


@@ -696,7 +696,7 @@ message LspResponse {
message LanguageServerIdForName {
uint64 project_id = 1;
uint64 buffer_id = 2;
optional uint64 buffer_id = 2;
string name = 3;
}


@@ -15,6 +15,7 @@ path = "src/ui.rs"
[dependencies]
chrono.workspace = true
component.workspace = true
documented.workspace = true
gpui.workspace = true
icons.workspace = true
itertools.workspace = true
@@ -28,7 +29,6 @@ strum.workspace = true
theme.workspace = true
ui_macros.workspace = true
util.workspace = true
documented = "0.9.1"
workspace-hack.workspace = true
[target.'cfg(windows)'.dependencies]


@@ -535,9 +535,15 @@ impl RenderOnce for ButtonLike {
ButtonSize::None => this,
})
.bg(style.enabled(self.layer, cx).background)
.when(self.disabled, |this| this.cursor_not_allowed())
.when(self.disabled, |this| {
if self.cursor_style == CursorStyle::PointingHand {
this.cursor_not_allowed()
} else {
this.cursor(self.cursor_style)
}
})
.when(!self.disabled, |this| {
this.cursor_pointer()
this.cursor(self.cursor_style)
.hover(|hover| hover.bg(style.hovered(self.layer, cx).background))
.active(|active| active.bg(style.active(cx).background))
})


@@ -52,6 +52,7 @@ pub struct ContextMenuEntry {
end_slot_icon: Option<IconName>,
end_slot_title: Option<SharedString>,
end_slot_handler: Option<Rc<dyn Fn(Option<&FocusHandle>, &mut Window, &mut App)>>,
show_end_slot_on_hover: bool,
}
impl ContextMenuEntry {
@@ -70,6 +71,7 @@ impl ContextMenuEntry {
end_slot_icon: None,
end_slot_title: None,
end_slot_handler: None,
show_end_slot_on_hover: false,
}
}
@@ -365,6 +367,7 @@ impl ContextMenu {
end_slot_icon: None,
end_slot_title: None,
end_slot_handler: None,
show_end_slot_on_hover: false,
}));
self
}
@@ -392,6 +395,35 @@ impl ContextMenu {
end_slot_icon: Some(end_slot_icon),
end_slot_title: Some(end_slot_title),
end_slot_handler: Some(Rc::new(move |_, window, cx| end_slot_handler(window, cx))),
show_end_slot_on_hover: false,
}));
self
}
pub fn entry_with_end_slot_on_hover(
mut self,
label: impl Into<SharedString>,
action: Option<Box<dyn Action>>,
handler: impl Fn(&mut Window, &mut App) + 'static,
end_slot_icon: IconName,
end_slot_title: SharedString,
end_slot_handler: impl Fn(&mut Window, &mut App) + 'static,
) -> Self {
self.items.push(ContextMenuItem::Entry(ContextMenuEntry {
toggle: None,
label: label.into(),
handler: Rc::new(move |_, window, cx| handler(window, cx)),
icon: None,
icon_position: IconPosition::End,
icon_size: IconSize::Small,
icon_color: None,
action,
disabled: false,
documentation_aside: None,
end_slot_icon: Some(end_slot_icon),
end_slot_title: Some(end_slot_title),
end_slot_handler: Some(Rc::new(move |_, window, cx| end_slot_handler(window, cx))),
show_end_slot_on_hover: true,
}));
self
}
@@ -418,6 +450,7 @@ impl ContextMenu {
end_slot_icon: None,
end_slot_title: None,
end_slot_handler: None,
show_end_slot_on_hover: false,
}));
self
}
@@ -472,6 +505,7 @@ impl ContextMenu {
end_slot_icon: None,
end_slot_title: None,
end_slot_handler: None,
show_end_slot_on_hover: false,
}));
self
}
@@ -500,6 +534,7 @@ impl ContextMenu {
end_slot_icon: None,
end_slot_title: None,
end_slot_handler: None,
show_end_slot_on_hover: false,
}));
self
}
@@ -519,6 +554,7 @@ impl ContextMenu {
end_slot_icon: None,
end_slot_title: None,
end_slot_handler: None,
show_end_slot_on_hover: false,
}));
self
}
@@ -822,6 +858,7 @@ impl ContextMenu {
end_slot_icon,
end_slot_title,
end_slot_handler,
show_end_slot_on_hover,
} = entry;
let this = cx.weak_entity();
@@ -884,6 +921,7 @@ impl ContextMenu {
)
.child(
ListItem::new(ix)
.group_name("label_container")
.inset(true)
.disabled(*disabled)
.toggle_state(Some(ix) == self.selected_index)
@@ -942,8 +980,8 @@ impl ContextMenu {
.zip(end_slot_title.as_ref())
.zip(end_slot_handler.as_ref()),
|el, (((icon, action), title), handler)| {
el.end_slot(
IconButton::new("end-slot-icon", *icon)
el.end_slot({
let icon_button = IconButton::new("end-slot-icon", *icon)
.shape(IconButtonShape::Square)
.tooltip({
let action_context = self.action_context.clone();
@@ -981,8 +1019,17 @@ impl ContextMenu {
})
.ok();
}
}),
)
});
if *show_end_slot_on_hover {
div()
.visible_on_hover("label_container")
.child(icon_button)
.into_any_element()
} else {
icon_button.into_any_element()
}
})
},
)
.on_click({


@@ -16,6 +16,7 @@ pub enum ListItemSpacing {
#[derive(IntoElement)]
pub struct ListItem {
id: ElementId,
group_name: Option<SharedString>,
disabled: bool,
selected: bool,
spacing: ListItemSpacing,
@@ -48,6 +49,7 @@ impl ListItem {
pub fn new(id: impl Into<ElementId>) -> Self {
Self {
id: id.into(),
group_name: None,
disabled: false,
selected: false,
spacing: ListItemSpacing::Dense,
@@ -72,6 +74,11 @@ impl ListItem {
}
}
pub fn group_name(mut self, group_name: impl Into<SharedString>) -> Self {
self.group_name = Some(group_name.into());
self
}
pub fn spacing(mut self, spacing: ListItemSpacing) -> Self {
self.spacing = spacing;
self
@@ -196,6 +203,7 @@ impl RenderOnce for ListItem {
fn render(self, _window: &mut Window, cx: &mut App) -> impl IntoElement {
h_flex()
.id(self.id)
.when_some(self.group_name, |this, group| this.group(group))
.w_full()
.relative()
// When an item is inset draw the indent spacing outside of the item


@@ -4,11 +4,13 @@ use gpui::App;
use theme::ActiveTheme;
mod color_contrast;
mod corner_solver;
mod format_distance;
mod search_input;
mod with_rem_size;
pub use color_contrast::*;
pub use corner_solver::{CornerSolver, inner_corner_radius};
pub use format_distance::*;
pub use search_input::*;
pub use with_rem_size::*;


@@ -0,0 +1,61 @@
use gpui::Pixels;
/// Calculates the child's content-corner radius for a single nested level.
///
/// child_content_radius = max(0, parent_radius - parent_border - parent_padding + self_border)
///
/// - parent_radius: outer corner radius of the parent element
/// - parent_border: border width of the parent element
/// - parent_padding: padding of the parent element
/// - self_border: border width of this child element (for content inset)
pub fn inner_corner_radius(
parent_radius: Pixels,
parent_border: Pixels,
parent_padding: Pixels,
self_border: Pixels,
) -> Pixels {
(parent_radius - parent_border - parent_padding + self_border).max(Pixels::ZERO)
}
/// Solver for arbitrarily deep nested corner radii.
///
/// Each nested level's outer border-box radius is:
/// R₀ = max(0, root_radius - root_border - root_padding)
/// Rᵢ = max(0, Rᵢ₋₁ - childᵢ₋₁_border - childᵢ₋₁_padding) for i > 0
pub struct CornerSolver {
root_radius: Pixels,
root_border: Pixels,
root_padding: Pixels,
children: Vec<(Pixels, Pixels)>, // (border, padding)
}
impl CornerSolver {
pub fn new(root_radius: Pixels, root_border: Pixels, root_padding: Pixels) -> Self {
Self {
root_radius,
root_border,
root_padding,
children: Vec::new(),
}
}
pub fn add_child(mut self, border: Pixels, padding: Pixels) -> Self {
self.children.push((border, padding));
self
}
pub fn corner_radius(&self, level: usize) -> Pixels {
if level == 0 {
return (self.root_radius - self.root_border - self.root_padding).max(Pixels::ZERO);
}
if level >= self.children.len() {
return Pixels::ZERO;
}
let mut r = (self.root_radius - self.root_border - self.root_padding).max(Pixels::ZERO);
for i in 0..level {
let (b, p) = self.children[i];
r = (r - b - p).max(Pixels::ZERO);
}
r
}
}
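A minimal standalone sketch of the recurrence the solver implements, substituting `f32` for gpui's `Pixels` so it runs on its own (function name and the numbers are illustrative, not from the codebase):

```rust
// Rᵢ = max(0, Rᵢ₋₁ - borderᵢ₋₁ - paddingᵢ₋₁), seeded with the root values.
fn nested_radius(root: (f32, f32, f32), children: &[(f32, f32)], level: usize) -> f32 {
    let (radius, border, padding) = root;
    // Level 0: subtract the root's own border and padding, clamped at zero.
    let mut r = (radius - border - padding).max(0.0);
    // Each deeper level subtracts that child's border and padding.
    for &(b, p) in children.iter().take(level) {
        r = (r - b - p).max(0.0);
    }
    r
}

fn main() {
    // Root: 8px radius, 2px border, 2px padding → level 0 radius is 8 - 2 - 2 = 4.
    let root = (8.0, 2.0, 2.0);
    let children = [(1.0, 2.0), (1.0, 1.0)];
    assert_eq!(nested_radius(root, &children, 0), 4.0);
    // One level down: 4 - 1 - 2 = 1.
    assert_eq!(nested_radius(root, &children, 1), 1.0);
    // Two levels down: 1 - 1 - 1 clamps to zero.
    assert_eq!(nested_radius(root, &children, 2), 0.0);
}
```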


@@ -17,23 +17,27 @@ test-support = []
[dependencies]
anyhow.workspace = true
client.workspace = true
component.workspace = true
db.workspace = true
documented.workspace = true
fuzzy.workspace = true
gpui.workspace = true
install_cli.workspace = true
language.workspace = true
linkme.workspace = true
picker.workspace = true
project.workspace = true
schemars.workspace = true
serde.workspace = true
settings.workspace = true
telemetry.workspace = true
theme.workspace = true
ui.workspace = true
util.workspace = true
vim_mode_setting.workspace = true
workspace-hack.workspace = true
workspace.workspace = true
zed_actions.workspace = true
workspace-hack.workspace = true
[dev-dependencies]
editor = { workspace = true, features = ["test-support"] }


@@ -1,7 +1,3 @@
mod base_keymap_picker;
mod base_keymap_setting;
mod multibuffer_hint;
use client::{TelemetrySettings, telemetry::Telemetry};
use db::kvp::KEY_VALUE_STORE;
use gpui::{
@@ -24,6 +20,11 @@ use workspace::{
pub use base_keymap_setting::BaseKeymap;
pub use multibuffer_hint::*;
mod base_keymap_picker;
mod base_keymap_setting;
mod multibuffer_hint;
mod welcome_ui;
actions!(welcome, [ResetHints]);
pub const FIRST_OPEN: &str = "first_open";


@@ -0,0 +1 @@
mod theme_preview;


@@ -0,0 +1,280 @@
#![allow(unused, dead_code)]
use gpui::{Hsla, Length};
use std::sync::Arc;
use theme::{Theme, ThemeRegistry};
use ui::{
IntoElement, RenderOnce, component_prelude::Documented, prelude::*, utils::inner_corner_radius,
};
/// Shows a preview of a theme as an abstract illustration
/// of a thumbnail-sized editor.
#[derive(IntoElement, RegisterComponent, Documented)]
pub struct ThemePreviewTile {
theme: Arc<Theme>,
selected: bool,
seed: f32,
}
impl ThemePreviewTile {
pub fn new(theme: Arc<Theme>, selected: bool, seed: f32) -> Self {
Self {
theme,
selected,
seed,
}
}
pub fn selected(mut self, selected: bool) -> Self {
self.selected = selected;
self
}
}
impl RenderOnce for ThemePreviewTile {
fn render(self, _window: &mut ui::Window, _cx: &mut ui::App) -> impl IntoElement {
let color = self.theme.colors();
let root_radius = px(8.0);
let root_border = px(2.0);
let root_padding = px(2.0);
let child_border = px(1.0);
let inner_radius =
inner_corner_radius(root_radius, root_border, root_padding, child_border);
let item_skeleton = |w: Length, h: Pixels, bg: Hsla| div().w(w).h(h).rounded_full().bg(bg);
let skeleton_height = px(4.);
let sidebar_seeded_width = |seed: f32, index: usize| {
let value = (seed * 1000.0 + index as f32 * 10.0).sin() * 0.5 + 0.5;
0.5 + value * 0.45
};
let sidebar_skeleton_items = 8;
let sidebar_skeleton = (0..sidebar_skeleton_items)
.map(|i| {
let width = sidebar_seeded_width(self.seed, i);
item_skeleton(
relative(width).into(),
skeleton_height,
color.text.alpha(0.45),
)
})
.collect::<Vec<_>>();
let sidebar = div()
.h_full()
.w(relative(0.25))
.border_r(px(1.))
.border_color(color.border_transparent)
.bg(color.panel_background)
.child(
div()
.p_2()
.flex()
.flex_col()
.size_full()
.gap(px(4.))
.children(sidebar_skeleton),
);
let pseudo_code_skeleton = |theme: Arc<Theme>, seed: f32| -> AnyElement {
let colors = theme.colors();
let syntax = theme.syntax();
let keyword_color = syntax.get("keyword").color;
let function_color = syntax.get("function").color;
let string_color = syntax.get("string").color;
let comment_color = syntax.get("comment").color;
let variable_color = syntax.get("variable").color;
let type_color = syntax.get("type").color;
let punctuation_color = syntax.get("punctuation").color;
let syntax_colors = [
keyword_color,
function_color,
string_color,
variable_color,
type_color,
punctuation_color,
comment_color,
];
let line_width = |line_idx: usize, block_idx: usize| -> f32 {
let val = (seed * 100.0 + line_idx as f32 * 20.0 + block_idx as f32 * 5.0).sin()
* 0.5
+ 0.5;
0.05 + val * 0.2
};
let indentation = |line_idx: usize| -> f32 {
let step = line_idx % 6;
if step < 3 {
step as f32 * 0.1
} else {
(5 - step) as f32 * 0.1
}
};
let pick_color = |line_idx: usize, block_idx: usize| -> Hsla {
let idx = ((seed * 10.0 + line_idx as f32 * 7.0 + block_idx as f32 * 3.0).sin()
* 3.5)
.abs() as usize
% syntax_colors.len();
syntax_colors[idx].unwrap_or(colors.text)
};
let line_count = 13;
let lines = (0..line_count)
.map(|line_idx| {
let block_count = (((seed * 30.0 + line_idx as f32 * 12.0).sin() * 0.5 + 0.5)
* 3.0)
.round() as usize
+ 2;
let indent = indentation(line_idx);
let blocks = (0..block_count)
.map(|block_idx| {
let width = line_width(line_idx, block_idx);
let color = pick_color(line_idx, block_idx);
item_skeleton(relative(width).into(), skeleton_height, color)
})
.collect::<Vec<_>>();
h_flex().gap(px(2.)).ml(relative(indent)).children(blocks)
})
.collect::<Vec<_>>();
v_flex()
.size_full()
.p_1()
.gap(px(6.))
.children(lines)
.into_any_element()
};
let pane = div()
.h_full()
.flex_grow()
.flex()
.flex_col()
// .child(
// div()
// .w_full()
// .border_color(color.border)
// .border_b(px(1.))
// .h(relative(0.1))
// .bg(color.tab_bar_background),
// )
.child(
div()
.size_full()
.overflow_hidden()
.bg(color.editor_background)
.p_2()
.child(pseudo_code_skeleton(self.theme.clone(), self.seed)),
);
let content = div().size_full().flex().child(sidebar).child(pane);
div()
.size_full()
.rounded(root_radius)
.p(root_padding)
.border(root_border)
.border_color(color.border_transparent)
.when(self.selected, |this| {
this.border_color(color.border_selected)
})
.child(
div()
.size_full()
.rounded(inner_radius)
.border(child_border)
.border_color(color.border)
.bg(color.background)
.child(content),
)
}
}
impl Component for ThemePreviewTile {
fn description() -> Option<&'static str> {
Some(Self::DOCS)
}
fn preview(_window: &mut Window, cx: &mut App) -> Option<AnyElement> {
let theme_registry = ThemeRegistry::global(cx);
let one_dark = theme_registry.get("One Dark");
let one_light = theme_registry.get("One Light");
let gruvbox_dark = theme_registry.get("Gruvbox Dark");
let gruvbox_light = theme_registry.get("Gruvbox Light");
let themes_to_preview = vec![
one_dark.clone().ok(),
one_light.clone().ok(),
gruvbox_dark.clone().ok(),
gruvbox_light.clone().ok(),
]
.into_iter()
.flatten()
.collect::<Vec<_>>();
Some(
v_flex()
.gap_6()
.p_4()
.children({
if let Some(one_dark) = one_dark.ok() {
vec![example_group(vec![
single_example(
"Default",
div()
.w(px(240.))
.h(px(180.))
.child(ThemePreviewTile::new(one_dark.clone(), false, 0.42))
.into_any_element(),
),
single_example(
"Selected",
div()
.w(px(240.))
.h(px(180.))
.child(ThemePreviewTile::new(one_dark, true, 0.42))
.into_any_element(),
),
])]
} else {
vec![]
}
})
.child(
example_group(vec![single_example(
"Default Themes",
h_flex()
.gap_4()
.children(
themes_to_preview
.iter()
.enumerate()
.map(|(i, theme)| {
div().w(px(200.)).h(px(140.)).child(ThemePreviewTile::new(
theme.clone(),
false,
0.42,
))
})
.collect::<Vec<_>>(),
)
.into_any_element(),
)])
.grow(),
)
.into_any_element(),
)
}
}


@@ -2,7 +2,7 @@
description = "The fast, collaborative code editor."
edition.workspace = true
name = "zed"
version = "0.185.0"
version = "0.186.0"
publish.workspace = true
license = "GPL-3.0-or-later"
authors = ["Zed Team <hi@zed.dev>"]


@@ -95,7 +95,7 @@ pub fn app_menus() -> Vec<Menu> {
MenuItem::separator(),
MenuItem::os_action("Cut", editor::actions::Cut, OsAction::Cut),
MenuItem::os_action("Copy", editor::actions::Copy, OsAction::Copy),
MenuItem::action("Copy and trim", editor::actions::CopyAndTrim),
MenuItem::action("Copy and Trim", editor::actions::CopyAndTrim),
MenuItem::os_action("Paste", editor::actions::Paste, OsAction::Paste),
MenuItem::separator(),
MenuItem::action("Find", search::buffer_search::Deploy::find()),


@@ -2037,12 +2037,24 @@ Or to set a `socks5` proxy:
## File Finder
### File Icons
- Description: Whether to show file icons in the file finder.
- Setting: `file_icons`
- Default: `true`
### Modal Max Width
- Description: Max-width of the file finder modal. It can take one of these values: `small`, `medium`, `large`, `xlarge`, and `full`.
- Setting: `modal_max_width`
- Default: `small`
### Skip Focus For Active In Search
- Description: Determines whether the file finder should skip focus for the active file in search results.
- Setting: `skip_focus_for_active_in_search`
- Default: `true`
## Preferred Line Length
- Description: The column at which to soft-wrap lines, for buffers where soft-wrap is enabled.