Compare commits
2 Commits
editor/tog...additional

| Author | SHA1 | Date |
|---|---|---|
| | a566f34a0d | |
| | aa3a9f11a8 | |
@@ -15,17 +15,15 @@ with the community to improve the product in ways we haven't thought of (or had

In particular we love PRs that are:

- Fixing or extending the docs.
- Fixing bugs.
- Small enhancements to existing features to make them work for more people (making things work on more platforms/modes/whatever).
- Fixes to existing bugs and issues.
- Small enhancements to existing features, particularly to make them work for more people.
- Small extra features, like keybindings or actions you miss from other editors or extensions.
- Part of a Community Program like [Let's Git Together](https://github.com/zed-industries/zed/issues/41541).
- Work towards shipping larger features on our roadmap.

If you're looking for concrete ideas:

- [Curated board of issues](https://github.com/orgs/zed-industries/projects/69) suitable for everyone from first-time contributors to seasoned community champions.
- [Triaged bugs with confirmed steps to reproduce](https://github.com/zed-industries/zed/issues?q=is%3Aissue%20state%3Aopen%20type%3ABug%20label%3Astate%3Areproducible).
- [Area labels](https://github.com/zed-industries/zed/labels?q=area%3A*) to browse bugs in a specific part of the product you care about (after clicking on an area label, add type:Bug to the search).
- Our [top-ranking issues](https://github.com/zed-industries/zed/issues/5393) based on votes by the community.
- Our [public roadmap](https://zed.dev/roadmap) contains a rough outline of our near-term priorities for Zed.

## Sending changes

@@ -39,17 +37,9 @@ like, sorry).
Although we will take a look, we tend to only merge about half the PRs that are
submitted. If you'd like your PR to have the best chance of being merged:

- Make sure the change is **desired**: we're always happy to accept bugfixes,
  but features should be confirmed with us first if you aim to avoid wasted
  effort. If there isn't already a GitHub issue for your feature with staff
  confirmation that we want it, start with a GitHub discussion rather than a PR.
- Include a clear description of **what you're solving**, and why it's important.
- Include **tests**.
- If it changes the UI, attach **screenshots** or screen recordings.
- Make the PR about **one thing only**, e.g. if it's a bugfix, don't add two
  features and a refactoring on top of that.
- Keep AI assistance under your judgement and responsibility: it's unlikely
  we'll merge a vibe-coded PR that the author doesn't understand.
- Include a clear description of what you're solving, and why it's important to you.
- Include tests.
- If it changes the UI, attach screenshots or screen recordings.

The internal advice for reviewers is as follows:

@@ -60,9 +50,10 @@ The internal advice for reviewers is as follows:
If you need more feedback from us: the best way is to be responsive to
GitHub comments, or to offer up time to pair with us.

If you need help deciding how to fix a bug, or finish implementing a feature
that we've agreed we want, please open a PR early so we can discuss how to make
the change with code in hand.
If you are making a larger change, or need advice on how to finish the change
you're making, please open the PR early. We would love to help you get
things right, and it's often easier to see how to solve a problem before the
diff gets too big.

## Things we will (probably) not merge

@@ -70,11 +61,11 @@ Although there are few hard and fast rules, typically we don't merge:

- Anything that can be provided by an extension. For example a new language, or theme. For adding themes or support for a new language to Zed, check out our [docs on developing extensions](https://zed.dev/docs/extensions/developing-extensions).
- New file icons. Zed's default icon theme consists of icons that are hand-designed to fit together in a cohesive manner; please don't submit PRs with off-the-shelf SVGs.
- Features where (in our subjective opinion) the extra complexity isn't worth it for the number of people who will benefit.
- Giant refactorings.
- Non-trivial changes with no tests.
- Stylistic code changes that do not alter any app logic. Reducing allocations, removing `.unwrap()`s, fixing typos is great; making code "more readable" — maybe not so much.
- Anything that seems AI-generated without understanding the output.
- Features where (in our subjective opinion) the extra complexity isn't worth it for the number of people who will benefit.
- Anything that seems completely AI generated.

## Bird's-eye view of Zed

22 Cargo.lock (generated)
@@ -4925,7 +4925,7 @@ dependencies = [
 "libc",
 "option-ext",
 "redox_users 0.5.2",
 "windows-sys 0.61.2",
 "windows-sys 0.59.0",
]

[[package]]
@@ -5636,7 +5636,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "39cab71617ae0d63f51a36d69f866391735b51691dbda63cf6f96d042b63efeb"
dependencies = [
 "libc",
 "windows-sys 0.61.2",
 "windows-sys 0.59.0",
]

[[package]]
@@ -8181,7 +8181,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4b0f83760fb341a774ed326568e19f5a863af4a952def8c39f9ab92fd95b88e5"
dependencies = [
 "equivalent",
 "hashbrown 0.16.1",
 "hashbrown 0.15.5",
 "serde",
 "serde_core",
]
@@ -10030,7 +10030,7 @@ version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "536bfad37a309d62069485248eeaba1e8d9853aaf951caaeaed0585a95346f08"
dependencies = [
 "windows-sys 0.61.2",
 "windows-sys 0.60.2",
]

[[package]]
@@ -10440,7 +10440,7 @@ version = "0.50.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7957b9740744892f114936ab4a57b3f487491bbeafaf8083688b16841a4240e5"
dependencies = [
 "windows-sys 0.61.2",
 "windows-sys 0.59.0",
]

[[package]]
@@ -13726,12 +13726,14 @@ dependencies = [
[[package]]
name = "rodio"
version = "0.21.1"
source = "git+https://github.com/RustAudio/rodio?rev=e2074c6c2acf07b57cf717e076bdda7a9ac6e70b#e2074c6c2acf07b57cf717e076bdda7a9ac6e70b"
source = "git+https://github.com/zed-industries/rodio#34cafdbd7df402539e490057882f7b09fcc05c2a"
dependencies = [
 "cpal",
 "dasp_sample",
 "hound",
 "num-rational",
 "rand 0.9.2",
 "rand_distr",
 "rtrb",
 "symphonia",
 "thiserror 2.0.17",
@@ -13990,7 +13992,7 @@ dependencies = [
 "errno 0.3.14",
 "libc",
 "linux-raw-sys 0.11.0",
 "windows-sys 0.61.2",
 "windows-sys 0.59.0",
]

[[package]]
@@ -16373,7 +16375,7 @@ dependencies = [
 "getrandom 0.3.4",
 "once_cell",
 "rustix 1.1.2",
 "windows-sys 0.61.2",
 "windows-sys 0.59.0",
]

[[package]]
@@ -17284,7 +17286,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2fb391ac70462b3097a755618fbf9c8f95ecc1eb379a414f7b46f202ed10db1f"
dependencies = [
 "cc",
 "windows-targets 0.52.6",
 "windows-targets 0.48.5",
]

[[package]]
@@ -19114,7 +19116,7 @@ version = "0.1.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c2a7b1c03c876122aa43f3020e6c3c3ee5c05081c9a00739faf7503aeba10d22"
dependencies = [
 "windows-sys 0.61.2",
 "windows-sys 0.48.0",
]

[[package]]

@@ -370,7 +370,8 @@ remote = { path = "crates/remote" }
remote_server = { path = "crates/remote_server" }
repl = { path = "crates/repl" }
reqwest_client = { path = "crates/reqwest_client" }
rodio = { git = "https://github.com/RustAudio/rodio", rev = "e2074c6c2acf07b57cf717e076bdda7a9ac6e70b", features = ["wav", "playback", "wav_output", "recording"] }
# rodio = { git = "https://github.com/RustAudio/rodio", rev = "e2074c6c2acf07b57cf717e076bdda7a9ac6e70b", features = ["wav", "playback", "wav_output", "recording"] }
rodio = { git = "https://github.com/zed-industries/rodio", features = ["wav", "playback", "wav_output", "recording"] }
rope = { path = "crates/rope" }
rpc = { path = "crates/rpc" }
rules_library = { path = "crates/rules_library" }

@@ -502,11 +502,6 @@
"g p": "pane::ActivatePreviousItem",
"shift-h": "pane::ActivatePreviousItem", // not a helix default
"g .": "vim::HelixGotoLastModification",
"g o": "editor::ToggleSelectedDiffHunks", // Zed specific
"g shift-o": "git::ToggleStaged", // Zed specific
"g shift-r": "git::Restore", // Zed specific
"g u": "git::StageAndNext", // Zed specific
"g shift-u": "git::UnstageAndNext", // Zed specific

// Window mode
"space w v": "pane::SplitDown",

@@ -972,8 +972,6 @@
"now": true,
"find_path": true,
"read_file": true,
"restore_file_from_disk": true,
"save_file": true,
"open": true,
"grep": true,
"terminal": true,

@@ -2,8 +2,7 @@ use crate::{
    ContextServerRegistry, CopyPathTool, CreateDirectoryTool, DbLanguageModel, DbThread,
    DeletePathTool, DiagnosticsTool, EditFileTool, FetchTool, FindPathTool, GrepTool,
    ListDirectoryTool, MovePathTool, NowTool, OpenTool, ProjectSnapshot, ReadFileTool,
    RestoreFileFromDiskTool, SaveFileTool, SystemPromptTemplate, Template, Templates, TerminalTool,
    ThinkingTool, WebSearchTool,
    SystemPromptTemplate, Template, Templates, TerminalTool, ThinkingTool, WebSearchTool,
};
use acp_thread::{MentionUri, UserMessageId};
use action_log::ActionLog;
@@ -1003,8 +1002,6 @@ impl Thread {
            self.project.clone(),
            self.action_log.clone(),
        ));
        self.add_tool(SaveFileTool::new(self.project.clone()));
        self.add_tool(RestoreFileFromDiskTool::new(self.project.clone()));
        self.add_tool(TerminalTool::new(self.project.clone(), environment));
        self.add_tool(ThinkingTool);
        self.add_tool(WebSearchTool);
@@ -1969,12 +1966,6 @@ impl Thread {
        self.running_turn.as_ref()?.tools.get(name).cloned()
    }

    pub fn has_tool(&self, name: &str) -> bool {
        self.running_turn
            .as_ref()
            .is_some_and(|turn| turn.tools.contains_key(name))
    }

    fn build_request_messages(
        &self,
        available_tools: Vec<SharedString>,

@@ -4,6 +4,7 @@ mod create_directory_tool;
mod delete_path_tool;
mod diagnostics_tool;
mod edit_file_tool;

mod fetch_tool;
mod find_path_tool;
mod grep_tool;
@@ -12,8 +13,6 @@ mod move_path_tool;
mod now_tool;
mod open_tool;
mod read_file_tool;
mod restore_file_from_disk_tool;
mod save_file_tool;

mod terminal_tool;
mod thinking_tool;
@@ -28,6 +27,7 @@ pub use create_directory_tool::*;
pub use delete_path_tool::*;
pub use diagnostics_tool::*;
pub use edit_file_tool::*;

pub use fetch_tool::*;
pub use find_path_tool::*;
pub use grep_tool::*;
@@ -36,8 +36,6 @@ pub use move_path_tool::*;
pub use now_tool::*;
pub use open_tool::*;
pub use read_file_tool::*;
pub use restore_file_from_disk_tool::*;
pub use save_file_tool::*;

pub use terminal_tool::*;
pub use thinking_tool::*;
@@ -94,8 +92,6 @@ tools! {
    NowTool,
    OpenTool,
    ReadFileTool,
    RestoreFileFromDiskTool,
    SaveFileTool,
    TerminalTool,
    ThinkingTool,
    WebSearchTool,

@@ -306,39 +306,20 @@ impl AgentTool for EditFileTool {

        // Check if the file has been modified since the agent last read it
        if let Some(abs_path) = abs_path.as_ref() {
            let (last_read_mtime, current_mtime, is_dirty, has_save_tool, has_restore_tool) = self.thread.update(cx, |thread, cx| {
            let (last_read_mtime, current_mtime, is_dirty) = self.thread.update(cx, |thread, cx| {
                let last_read = thread.file_read_times.get(abs_path).copied();
                let current = buffer.read(cx).file().and_then(|file| file.disk_state().mtime());
                let dirty = buffer.read(cx).is_dirty();
                let has_save = thread.has_tool("save_file");
                let has_restore = thread.has_tool("restore_file_from_disk");
                (last_read, current, dirty, has_save, has_restore)
                (last_read, current, dirty)
            })?;

            // Check for unsaved changes first - these indicate modifications we don't know about
            if is_dirty {
                let message = match (has_save_tool, has_restore_tool) {
                    (true, true) => {
                        "This file has unsaved changes. Ask the user whether they want to keep or discard those changes. \
                        If they want to keep them, ask for confirmation then use the save_file tool to save the file, then retry this edit. \
                        If they want to discard them, ask for confirmation then use the restore_file_from_disk tool to restore the on-disk contents, then retry this edit."
                    }
                    (true, false) => {
                        "This file has unsaved changes. Ask the user whether they want to keep or discard those changes. \
                        If they want to keep them, ask for confirmation then use the save_file tool to save the file, then retry this edit. \
                        If they want to discard them, ask the user to manually revert the file, then inform you when it's ok to proceed."
                    }
                    (false, true) => {
                        "This file has unsaved changes. Ask the user whether they want to keep or discard those changes. \
                        If they want to keep them, ask the user to manually save the file, then inform you when it's ok to proceed. \
                        If they want to discard them, ask for confirmation then use the restore_file_from_disk tool to restore the on-disk contents, then retry this edit."
                    }
                    (false, false) => {
                        "This file has unsaved changes. Ask the user whether they want to keep or discard those changes, \
                        then ask them to save or revert the file manually and inform you when it's ok to proceed."
                    }
                };
                anyhow::bail!("{}", message);
                anyhow::bail!(
                    "This file cannot be written to because it has unsaved changes. \
                    Please end the current conversation immediately by telling the user you want to write to this file (mention its path explicitly) but you can't write to it because it has unsaved changes. \
                    Ask the user to save that buffer's changes and to inform you when it's ok to proceed."
                );
            }

            // Check if the file was modified on disk since we last read it
@@ -2221,21 +2202,9 @@ mod tests {
        assert!(result.is_err(), "Edit should fail when buffer is dirty");
        let error_msg = result.unwrap_err().to_string();
        assert!(
            error_msg.contains("This file has unsaved changes."),
            error_msg.contains("cannot be written to because it has unsaved changes"),
            "Error should mention unsaved changes, got: {}",
            error_msg
        );
        assert!(
            error_msg.contains("keep or discard"),
            "Error should ask whether to keep or discard changes, got: {}",
            error_msg
        );
        // Since save_file and restore_file_from_disk tools aren't added to the thread,
        // the error message should ask the user to manually save or revert
        assert!(
            error_msg.contains("save or revert the file manually"),
            "Error should ask user to manually save or revert when tools aren't available, got: {}",
            error_msg
        );
    }
}

@@ -1,352 +0,0 @@
use agent_client_protocol as acp;
use anyhow::Result;
use collections::FxHashSet;
use gpui::{App, Entity, SharedString, Task};
use language::Buffer;
use project::Project;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
use std::sync::Arc;

use crate::{AgentTool, ToolCallEventStream};

/// Discards unsaved changes in open buffers by reloading file contents from disk.
///
/// Use this tool when:
/// - You attempted to edit files but they have unsaved changes the user does not want to keep.
/// - You want to reset files to the on-disk state before retrying an edit.
///
/// Only use this tool after asking the user for permission, because it will discard unsaved changes.
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct RestoreFileFromDiskToolInput {
    /// The paths of the files to restore from disk.
    pub paths: Vec<PathBuf>,
}

pub struct RestoreFileFromDiskTool {
    project: Entity<Project>,
}

impl RestoreFileFromDiskTool {
    pub fn new(project: Entity<Project>) -> Self {
        Self { project }
    }
}

impl AgentTool for RestoreFileFromDiskTool {
    type Input = RestoreFileFromDiskToolInput;
    type Output = String;

    fn name() -> &'static str {
        "restore_file_from_disk"
    }

    fn kind() -> acp::ToolKind {
        acp::ToolKind::Other
    }

    fn initial_title(
        &self,
        input: Result<Self::Input, serde_json::Value>,
        _cx: &mut App,
    ) -> SharedString {
        match input {
            Ok(input) if input.paths.len() == 1 => "Restore file from disk".into(),
            Ok(input) => format!("Restore {} files from disk", input.paths.len()).into(),
            Err(_) => "Restore files from disk".into(),
        }
    }

    fn run(
        self: Arc<Self>,
        input: Self::Input,
        _event_stream: ToolCallEventStream,
        cx: &mut App,
    ) -> Task<Result<String>> {
        let project = self.project.clone();
        let input_paths = input.paths;

        cx.spawn(async move |cx| {
            let mut buffers_to_reload: FxHashSet<Entity<Buffer>> = FxHashSet::default();

            let mut restored_paths: Vec<PathBuf> = Vec::new();
            let mut clean_paths: Vec<PathBuf> = Vec::new();
            let mut not_found_paths: Vec<PathBuf> = Vec::new();
            let mut open_errors: Vec<(PathBuf, String)> = Vec::new();
            let mut dirty_check_errors: Vec<(PathBuf, String)> = Vec::new();
            let mut reload_errors: Vec<String> = Vec::new();

            for path in input_paths {
                let project_path =
                    project.read_with(cx, |project, cx| project.find_project_path(&path, cx));

                let project_path = match project_path {
                    Ok(Some(project_path)) => project_path,
                    Ok(None) => {
                        not_found_paths.push(path);
                        continue;
                    }
                    Err(error) => {
                        open_errors.push((path, error.to_string()));
                        continue;
                    }
                };

                let open_buffer_task =
                    project.update(cx, |project, cx| project.open_buffer(project_path, cx));

                let buffer = match open_buffer_task {
                    Ok(task) => match task.await {
                        Ok(buffer) => buffer,
                        Err(error) => {
                            open_errors.push((path, error.to_string()));
                            continue;
                        }
                    },
                    Err(error) => {
                        open_errors.push((path, error.to_string()));
                        continue;
                    }
                };

                let is_dirty = match buffer.read_with(cx, |buffer, _| buffer.is_dirty()) {
                    Ok(is_dirty) => is_dirty,
                    Err(error) => {
                        dirty_check_errors.push((path, error.to_string()));
                        continue;
                    }
                };

                if is_dirty {
                    buffers_to_reload.insert(buffer);
                    restored_paths.push(path);
                } else {
                    clean_paths.push(path);
                }
            }

            if !buffers_to_reload.is_empty() {
                let reload_task = project.update(cx, |project, cx| {
                    project.reload_buffers(buffers_to_reload, true, cx)
                });

                match reload_task {
                    Ok(task) => {
                        if let Err(error) = task.await {
                            reload_errors.push(error.to_string());
                        }
                    }
                    Err(error) => {
                        reload_errors.push(error.to_string());
                    }
                }
            }

            let mut lines: Vec<String> = Vec::new();

            if !restored_paths.is_empty() {
                lines.push(format!("Restored {} file(s).", restored_paths.len()));
            }
            if !clean_paths.is_empty() {
                lines.push(format!("{} clean.", clean_paths.len()));
            }

            if !not_found_paths.is_empty() {
                lines.push(format!("Not found ({}):", not_found_paths.len()));
                for path in &not_found_paths {
                    lines.push(format!("- {}", path.display()));
                }
            }
            if !open_errors.is_empty() {
                lines.push(format!("Open failed ({}):", open_errors.len()));
                for (path, error) in &open_errors {
                    lines.push(format!("- {}: {}", path.display(), error));
                }
            }
            if !dirty_check_errors.is_empty() {
                lines.push(format!(
                    "Dirty check failed ({}):",
                    dirty_check_errors.len()
                ));
                for (path, error) in &dirty_check_errors {
                    lines.push(format!("- {}: {}", path.display(), error));
                }
            }
            if !reload_errors.is_empty() {
                lines.push(format!("Reload failed ({}):", reload_errors.len()));
                for error in &reload_errors {
                    lines.push(format!("- {}", error));
                }
            }

            if lines.is_empty() {
                Ok("No paths provided.".to_string())
            } else {
                Ok(lines.join("\n"))
            }
        })
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use fs::Fs;
    use gpui::TestAppContext;
    use language::LineEnding;
    use project::FakeFs;
    use serde_json::json;
    use settings::SettingsStore;
    use util::path;

    fn init_test(cx: &mut TestAppContext) {
        cx.update(|cx| {
            let settings_store = SettingsStore::test(cx);
            cx.set_global(settings_store);
        });
    }

    #[gpui::test]
    async fn test_restore_file_from_disk_output_and_effects(cx: &mut TestAppContext) {
        init_test(cx);

        let fs = FakeFs::new(cx.executor());
        fs.insert_tree(
            "/root",
            json!({
                "dirty.txt": "on disk: dirty\n",
                "clean.txt": "on disk: clean\n",
            }),
        )
        .await;

        let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
        let tool = Arc::new(RestoreFileFromDiskTool::new(project.clone()));

        // Make dirty.txt dirty in-memory by saving different content into the buffer without saving to disk.
        let dirty_project_path = project.read_with(cx, |project, cx| {
            project
                .find_project_path("root/dirty.txt", cx)
                .expect("dirty.txt should exist in project")
        });

        let dirty_buffer = project
            .update(cx, |project, cx| {
                project.open_buffer(dirty_project_path, cx)
            })
            .await
            .unwrap();
        dirty_buffer.update(cx, |buffer, cx| {
            buffer.edit([(0..buffer.len(), "in memory: dirty\n")], None, cx);
        });
        assert!(
            dirty_buffer.read_with(cx, |buffer, _| buffer.is_dirty()),
            "dirty.txt buffer should be dirty before restore"
        );

        // Ensure clean.txt is opened but remains clean.
        let clean_project_path = project.read_with(cx, |project, cx| {
            project
                .find_project_path("root/clean.txt", cx)
                .expect("clean.txt should exist in project")
        });

        let clean_buffer = project
            .update(cx, |project, cx| {
                project.open_buffer(clean_project_path, cx)
            })
            .await
            .unwrap();
        assert!(
            !clean_buffer.read_with(cx, |buffer, _| buffer.is_dirty()),
            "clean.txt buffer should start clean"
        );

        let output = cx
            .update(|cx| {
                tool.clone().run(
                    RestoreFileFromDiskToolInput {
                        paths: vec![
                            PathBuf::from("root/dirty.txt"),
                            PathBuf::from("root/clean.txt"),
                        ],
                    },
                    ToolCallEventStream::test().0,
                    cx,
                )
            })
            .await
            .unwrap();

        // Output should mention restored + clean.
        assert!(
            output.contains("Restored 1 file(s)."),
            "expected restored count line, got:\n{output}"
        );
        assert!(
            output.contains("1 clean."),
            "expected clean count line, got:\n{output}"
        );

        // Effect: dirty buffer should be restored back to disk content and become clean.
        let dirty_text = dirty_buffer.read_with(cx, |buffer, _| buffer.text());
        assert_eq!(
            dirty_text, "on disk: dirty\n",
            "dirty.txt buffer should be restored to disk contents"
        );
        assert!(
            !dirty_buffer.read_with(cx, |buffer, _| buffer.is_dirty()),
            "dirty.txt buffer should not be dirty after restore"
        );

        // Disk contents should be unchanged (restore-from-disk should not write).
        let disk_dirty = fs.load(path!("/root/dirty.txt").as_ref()).await.unwrap();
        assert_eq!(disk_dirty, "on disk: dirty\n");

        // Sanity: clean buffer should remain clean and unchanged.
        let clean_text = clean_buffer.read_with(cx, |buffer, _| buffer.text());
        assert_eq!(clean_text, "on disk: clean\n");
        assert!(
            !clean_buffer.read_with(cx, |buffer, _| buffer.is_dirty()),
            "clean.txt buffer should remain clean"
        );

        // Test empty paths case.
        let output = cx
            .update(|cx| {
                tool.clone().run(
                    RestoreFileFromDiskToolInput { paths: vec![] },
                    ToolCallEventStream::test().0,
                    cx,
                )
            })
            .await
            .unwrap();
        assert_eq!(output, "No paths provided.");

        // Test not-found path case (path outside the project root).
        let output = cx
            .update(|cx| {
                tool.clone().run(
                    RestoreFileFromDiskToolInput {
                        paths: vec![PathBuf::from("nonexistent/path.txt")],
                    },
                    ToolCallEventStream::test().0,
                    cx,
                )
            })
            .await
            .unwrap();
        assert!(
            output.contains("Not found (1):"),
            "expected not-found header line, got:\n{output}"
        );
        assert!(
            output.contains("- nonexistent/path.txt"),
            "expected not-found path bullet, got:\n{output}"
        );

        let _ = LineEnding::Unix; // keep import used if the buffer edit API changes
    }
}

@@ -1,351 +0,0 @@
|
||||
use agent_client_protocol as acp;
|
||||
use anyhow::Result;
|
||||
use collections::FxHashSet;
|
||||
use gpui::{App, Entity, SharedString, Task};
|
||||
use language::Buffer;
|
||||
use project::Project;
|
||||
use schemars::JsonSchema;
|
||||
use serde::{Deserialize, Serialize};
|
||||
use std::path::PathBuf;
|
||||
use std::sync::Arc;
|
||||
|
||||
use crate::{AgentTool, ToolCallEventStream};
|
||||
|
||||
/// Saves files that have unsaved changes.
|
||||
///
|
||||
/// Use this tool when you need to edit files but they have unsaved changes that must be saved first.
|
||||
/// Only use this tool after asking the user for permission to save their unsaved changes.
|
||||
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
|
||||
pub struct SaveFileToolInput {
|
||||
/// The paths of the files to save.
|
||||
pub paths: Vec<PathBuf>,
|
||||
}
|
||||
|
||||
pub struct SaveFileTool {
|
||||
project: Entity<Project>,
|
||||
}
|
||||
|
||||
impl SaveFileTool {
|
||||
pub fn new(project: Entity<Project>) -> Self {
|
||||
Self { project }
|
||||
}
|
||||
}
|
||||
|
||||
impl AgentTool for SaveFileTool {
|
||||
type Input = SaveFileToolInput;
|
||||
type Output = String;
|
||||
|
||||
fn name() -> &'static str {
|
||||
"save_file"
|
||||
}
|
||||
|
||||
fn kind() -> acp::ToolKind {
|
||||
acp::ToolKind::Other
|
||||
}
|
||||
|
||||
fn initial_title(
|
||||
&self,
|
||||
input: Result<Self::Input, serde_json::Value>,
|
||||
_cx: &mut App,
|
||||
) -> SharedString {
|
||||
match input {
|
||||
Ok(input) if input.paths.len() == 1 => "Save file".into(),
|
||||
Ok(input) => format!("Save {} files", input.paths.len()).into(),
|
||||
Err(_) => "Save files".into(),
|
||||
}
|
||||
}
|
||||
|
||||
fn run(
|
||||
self: Arc<Self>,
|
||||
input: Self::Input,
|
||||
_event_stream: ToolCallEventStream,
|
||||
cx: &mut App,
|
||||
) -> Task<Result<String>> {
|
||||
let project = self.project.clone();
|
||||
let input_paths = input.paths;
|
||||
|
||||
cx.spawn(async move |cx| {
|
||||
let mut buffers_to_save: FxHashSet<Entity<Buffer>> = FxHashSet::default();
|
||||
|
||||
let mut saved_paths: Vec<PathBuf> = Vec::new();
|
||||
let mut clean_paths: Vec<PathBuf> = Vec::new();
|
||||
let mut not_found_paths: Vec<PathBuf> = Vec::new();
|
||||
let mut open_errors: Vec<(PathBuf, String)> = Vec::new();
|
||||
let mut dirty_check_errors: Vec<(PathBuf, String)> = Vec::new();
|
||||
let mut save_errors: Vec<(String, String)> = Vec::new();
|
||||
|
||||
for path in input_paths {
|
||||
let project_path =
|
||||
                project.read_with(cx, |project, cx| project.find_project_path(&path, cx));

                let project_path = match project_path {
                    Ok(Some(project_path)) => project_path,
                    Ok(None) => {
                        not_found_paths.push(path);
                        continue;
                    }
                    Err(error) => {
                        open_errors.push((path, error.to_string()));
                        continue;
                    }
                };

                let open_buffer_task =
                    project.update(cx, |project, cx| project.open_buffer(project_path, cx));

                let buffer = match open_buffer_task {
                    Ok(task) => match task.await {
                        Ok(buffer) => buffer,
                        Err(error) => {
                            open_errors.push((path, error.to_string()));
                            continue;
                        }
                    },
                    Err(error) => {
                        open_errors.push((path, error.to_string()));
                        continue;
                    }
                };

                let is_dirty = match buffer.read_with(cx, |buffer, _| buffer.is_dirty()) {
                    Ok(is_dirty) => is_dirty,
                    Err(error) => {
                        dirty_check_errors.push((path, error.to_string()));
                        continue;
                    }
                };

                if is_dirty {
                    buffers_to_save.insert(buffer);
                    saved_paths.push(path);
                } else {
                    clean_paths.push(path);
                }
            }

            // Save each buffer individually since there's no batch save API.
            for buffer in buffers_to_save {
                let path_for_buffer = match buffer.read_with(cx, |buffer, _| {
                    buffer
                        .file()
                        .map(|file| file.path().to_rel_path_buf())
                        .map(|path| path.as_rel_path().as_unix_str().to_owned())
                }) {
                    Ok(path) => path.unwrap_or_else(|| "<unknown>".to_string()),
                    Err(error) => {
                        save_errors.push(("<unknown>".to_string(), error.to_string()));
                        continue;
                    }
                };

                let save_task = project.update(cx, |project, cx| project.save_buffer(buffer, cx));

                match save_task {
                    Ok(task) => {
                        if let Err(error) = task.await {
                            save_errors.push((path_for_buffer, error.to_string()));
                        }
                    }
                    Err(error) => {
                        save_errors.push((path_for_buffer, error.to_string()));
                    }
                }
            }

            let mut lines: Vec<String> = Vec::new();

            if !saved_paths.is_empty() {
                lines.push(format!("Saved {} file(s).", saved_paths.len()));
            }
            if !clean_paths.is_empty() {
                lines.push(format!("{} clean.", clean_paths.len()));
            }

            if !not_found_paths.is_empty() {
                lines.push(format!("Not found ({}):", not_found_paths.len()));
                for path in &not_found_paths {
                    lines.push(format!("- {}", path.display()));
                }
            }
            if !open_errors.is_empty() {
                lines.push(format!("Open failed ({}):", open_errors.len()));
                for (path, error) in &open_errors {
                    lines.push(format!("- {}: {}", path.display(), error));
                }
            }
            if !dirty_check_errors.is_empty() {
                lines.push(format!(
                    "Dirty check failed ({}):",
                    dirty_check_errors.len()
                ));
                for (path, error) in &dirty_check_errors {
                    lines.push(format!("- {}: {}", path.display(), error));
                }
            }
            if !save_errors.is_empty() {
                lines.push(format!("Save failed ({}):", save_errors.len()));
                for (path, error) in &save_errors {
                    lines.push(format!("- {}: {}", path, error));
                }
            }

            if lines.is_empty() {
                Ok("No paths provided.".to_string())
            } else {
                Ok(lines.join("\n"))
            }
        })
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use fs::Fs;
    use gpui::TestAppContext;
    use project::FakeFs;
    use serde_json::json;
    use settings::SettingsStore;
    use util::path;

    fn init_test(cx: &mut TestAppContext) {
        cx.update(|cx| {
            let settings_store = SettingsStore::test(cx);
            cx.set_global(settings_store);
        });
    }

    #[gpui::test]
    async fn test_save_file_output_and_effects(cx: &mut TestAppContext) {
        init_test(cx);

        let fs = FakeFs::new(cx.executor());
        fs.insert_tree(
            "/root",
            json!({
                "dirty.txt": "on disk: dirty\n",
                "clean.txt": "on disk: clean\n",
            }),
        )
        .await;

        let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
        let tool = Arc::new(SaveFileTool::new(project.clone()));

        // Make dirty.txt dirty in-memory.
        let dirty_project_path = project.read_with(cx, |project, cx| {
            project
                .find_project_path("root/dirty.txt", cx)
                .expect("dirty.txt should exist in project")
        });

        let dirty_buffer = project
            .update(cx, |project, cx| {
                project.open_buffer(dirty_project_path, cx)
            })
            .await
            .unwrap();
        dirty_buffer.update(cx, |buffer, cx| {
            buffer.edit([(0..buffer.len(), "in memory: dirty\n")], None, cx);
        });
        assert!(
            dirty_buffer.read_with(cx, |buffer, _| buffer.is_dirty()),
            "dirty.txt buffer should be dirty before save"
        );

        // Ensure clean.txt is opened but remains clean.
        let clean_project_path = project.read_with(cx, |project, cx| {
            project
                .find_project_path("root/clean.txt", cx)
                .expect("clean.txt should exist in project")
        });

        let clean_buffer = project
            .update(cx, |project, cx| {
                project.open_buffer(clean_project_path, cx)
            })
            .await
            .unwrap();
        assert!(
            !clean_buffer.read_with(cx, |buffer, _| buffer.is_dirty()),
            "clean.txt buffer should start clean"
        );

        let output = cx
            .update(|cx| {
                tool.clone().run(
                    SaveFileToolInput {
                        paths: vec![
                            PathBuf::from("root/dirty.txt"),
                            PathBuf::from("root/clean.txt"),
                        ],
                    },
                    ToolCallEventStream::test().0,
                    cx,
                )
            })
            .await
            .unwrap();

        // Output should mention saved + clean.
        assert!(
            output.contains("Saved 1 file(s)."),
            "expected saved count line, got:\n{output}"
        );
        assert!(
            output.contains("1 clean."),
            "expected clean count line, got:\n{output}"
        );

        // Effect: dirty buffer should now be clean and disk should have new content.
        assert!(
            !dirty_buffer.read_with(cx, |buffer, _| buffer.is_dirty()),
            "dirty.txt buffer should not be dirty after save"
        );

        let disk_dirty = fs.load(path!("/root/dirty.txt").as_ref()).await.unwrap();
        assert_eq!(
            disk_dirty, "in memory: dirty\n",
            "dirty.txt disk content should be updated"
        );

        // Sanity: clean buffer should remain clean and disk unchanged.
        let disk_clean = fs.load(path!("/root/clean.txt").as_ref()).await.unwrap();
        assert_eq!(disk_clean, "on disk: clean\n");

        // Test empty paths case.
        let output = cx
            .update(|cx| {
                tool.clone().run(
                    SaveFileToolInput { paths: vec![] },
                    ToolCallEventStream::test().0,
                    cx,
                )
            })
            .await
            .unwrap();
        assert_eq!(output, "No paths provided.");

        // Test not-found path case.
        let output = cx
            .update(|cx| {
                tool.clone().run(
                    SaveFileToolInput {
                        paths: vec![PathBuf::from("nonexistent/path.txt")],
                    },
                    ToolCallEventStream::test().0,
                    cx,
                )
            })
            .await
            .unwrap();
        assert!(
            output.contains("Not found (1):"),
            "expected not-found header line, got:\n{output}"
        );
        assert!(
            output.contains("- nonexistent/path.txt"),
            "expected not-found path bullet, got:\n{output}"
        );
    }
}
@@ -204,7 +204,12 @@ impl Audio {
            })
            .denoise()
            .context("Could not set up denoiser")?
            .automatic_gain_control(0.90, 1.0, 0.0, 5.0)
            .automatic_gain_control(rodio::source::AutomaticGainControlSettings {
                target_level: 0.9,
                attack_time: Duration::from_secs(1),
                release_time: Duration::ZERO,
                absolute_max_gain: 5.0,
            })
            .periodic_access(Duration::from_millis(100), move |agc_source| {
                agc_source
                    .set_enabled(LIVE_SETTINGS.auto_microphone_volume.load(Ordering::Relaxed));
@@ -234,7 +239,12 @@ impl Audio {
    ) -> anyhow::Result<()> {
        let (replay_source, source) = source
            .constant_params(CHANNEL_COUNT, SAMPLE_RATE)
            .automatic_gain_control(0.90, 1.0, 0.0, 5.0)
            .automatic_gain_control(rodio::source::AutomaticGainControlSettings {
                target_level: 0.9,
                attack_time: Duration::from_secs(1),
                release_time: Duration::ZERO,
                absolute_max_gain: 5.0,
            })
            .periodic_access(Duration::from_millis(100), move |agc_source| {
                agc_source.set_enabled(LIVE_SETTINGS.auto_speaker_volume.load(Ordering::Relaxed));
            })

@@ -616,13 +616,11 @@ impl BufferDiffInner {
        secondary: Option<&'a Self>,
    ) -> impl 'a + Iterator<Item = DiffHunk> {
        let range = range.to_offset(buffer);
        println!(" >>> range = {range:?}");

        let mut cursor = self
            .hunks
            .filter::<_, DiffHunkSummary>(buffer, move |summary| {
                let summary_range = summary.buffer_range.to_offset(buffer);
                println!(" >>> summary_range = {:?}", summary_range);
                let before_start = summary_range.end < range.start;
                let after_end = summary_range.start > range.end;
                !before_start && !after_end

@@ -20094,10 +20094,8 @@ impl Editor {
        ranges: Vec<Range<Anchor>>,
        cx: &mut Context<Editor>,
    ) {
        println!("\n\nin toggle_diff_hunks_in_ranges");
        self.buffer.update(cx, |buffer, cx| {
            let expand = !buffer.has_expanded_diff_hunks_in_ranges(&ranges, cx);
            println!("expand={expand}\n\n");
            buffer.expand_or_collapse_diff_hunks(ranges, expand, cx);
        })
    }

@@ -29335,190 +29335,3 @@ async fn test_find_references_single_case(cx: &mut TestAppContext) {

    cx.assert_editor_state(after);
}

#[gpui::test]
async fn test_toggle_selected_diff_hunks_neighbor(
    executor: BackgroundExecutor,
    cx: &mut TestAppContext,
) {
    init_test(cx, |_| {});

    let mut cx = EditorTestContext::new(cx).await;

    let diff_base = r#"
        line0
        lineA
        lineB
        lineC
        lineD
    "#
    .unindent();

    cx.set_state(
        &r#"
            line0
            ˇlineA1
            lineB
            lineD
        "#
        .unindent(),
    );
    cx.set_head_text(&diff_base);
    executor.run_until_parked();

    cx.update_editor(|editor, window, cx| {
        editor.toggle_selected_diff_hunks(&ToggleSelectedDiffHunks, window, cx);
    });
    executor.run_until_parked();
    cx.assert_state_with_diff(
        r#"
          line0
        - lineA
        + ˇlineA1
          lineB
          lineD
        "#
        .unindent(),
    );

    cx.update_editor(|editor, window, cx| {
        editor.move_down(&MoveDown, window, cx);
        editor.move_right(&MoveRight, window, cx);
        editor.toggle_selected_diff_hunks(&ToggleSelectedDiffHunks, window, cx);
    });
    executor.run_until_parked();
    cx.assert_state_with_diff(
        r#"
          line0
        - lineA
        + lineA1
          lˇineB
        - lineC
          lineD
        "#
        .unindent(),
    );

    cx.update_editor(|editor, window, cx| {
        editor.toggle_selected_diff_hunks(&ToggleSelectedDiffHunks, window, cx);
    });
    executor.run_until_parked();
    cx.assert_state_with_diff(
        r#"
          line0
        - lineA
        + lineA1
          lˇineB
          lineD
        "#
        .unindent(),
    );

    cx.update_editor(|editor, window, cx| {
        editor.move_up(&MoveUp, window, cx);
        editor.toggle_selected_diff_hunks(&ToggleSelectedDiffHunks, window, cx);
    });
    executor.run_until_parked();
    cx.assert_state_with_diff(
        r#"
        line0
        lˇineA1
        lineB
        lineD
        "#
        .unindent(),
    );
}

#[gpui::test]
async fn test_toggle_selected_diff_hunks_neighbor_edge_case(
    executor: BackgroundExecutor,
    cx: &mut TestAppContext,
) {
    init_test(cx, |_| {});

    let mut cx = EditorTestContext::new(cx).await;

    let diff_base = r#"
        line0
        lineA
        lineB
        lineC
        lineD
    "#
    .unindent();

    cx.set_state(
        &r#"
            line0
            ˇlineA1
            lineB
            lineD
        "#
        .unindent(),
    );
    cx.set_head_text(&diff_base);
    executor.run_until_parked();

    cx.update_editor(|editor, window, cx| {
        editor.toggle_selected_diff_hunks(&ToggleSelectedDiffHunks, window, cx);
    });
    executor.run_until_parked();
    cx.assert_state_with_diff(
        r#"
          line0
        - lineA
        + ˇlineA1
          lineB
          lineD
        "#
        .unindent(),
    );

    cx.update_editor(|editor, window, cx| {
        editor.move_down(&MoveDown, window, cx);
        editor.toggle_selected_diff_hunks(&ToggleSelectedDiffHunks, window, cx);
    });
    executor.run_until_parked();
    cx.assert_state_with_diff(
        r#"
          line0
        - lineA
        + lineA1
          ˇlineB
        - lineC
          lineD
        "#
        .unindent(),
    );

    cx.update_editor(|editor, window, cx| {
        editor.toggle_selected_diff_hunks(&ToggleSelectedDiffHunks, window, cx);
    });
    executor.run_until_parked();
    cx.assert_state_with_diff(
        r#"
          line0
        - lineA
        + lineA1
          ˇlineB
          lineD
        "#
        .unindent(),
    );

    cx.update_editor(|editor, window, cx| {
        editor.move_up(&MoveUp, window, cx);
        editor.toggle_selected_diff_hunks(&ToggleSelectedDiffHunks, window, cx);
    });
    executor.run_until_parked();
    cx.assert_state_with_diff(
        r#"
        line0
        ˇlineA1
        lineB
        lineD
        "#
        .unindent(),
    );
}

@@ -9164,15 +9164,6 @@ impl Element for EditorElement {
                let height_in_lines = f64::from(bounds.size.height / line_height);
                let max_row = snapshot.max_point().row().as_f64();

                // Calculate how much of the editor is clipped by parent containers (e.g., List).
                // This allows us to only render lines that are actually visible, which is
                // critical for performance when large AutoHeight editors are inside Lists.
                let visible_bounds = window.content_mask().bounds;
                let clipped_top = (visible_bounds.origin.y - bounds.origin.y).max(px(0.));
                let clipped_top_in_lines = f64::from(clipped_top / line_height);
                let visible_height_in_lines =
                    f64::from(visible_bounds.size.height / line_height);

                // The max scroll position for the top of the window
                let max_scroll_top = if matches!(
                    snapshot.mode,
@@ -9229,14 +9220,10 @@ impl Element for EditorElement {
                let mut scroll_position = snapshot.scroll_position();
                // The scroll position is a fractional point, the whole number of which represents
                // the top of the window in terms of display rows.
                // We add clipped_top_in_lines to skip rows that are clipped by parent containers,
                // but we don't modify scroll_position itself since the parent handles positioning.
                let start_row =
                    DisplayRow((scroll_position.y + clipped_top_in_lines).floor() as u32);
                let start_row = DisplayRow(scroll_position.y as u32);
                let max_row = snapshot.max_point().row();
                let end_row = cmp::min(
                    (scroll_position.y + clipped_top_in_lines + visible_height_in_lines).ceil()
                        as u32,
                    (scroll_position.y + height_in_lines).ceil() as u32,
                    max_row.next_row().0,
                );
                let end_row = DisplayRow(end_row);

@@ -176,9 +176,11 @@ pub fn block_content_for_tests(
}

pub fn editor_content_with_blocks(editor: &Entity<Editor>, cx: &mut VisualTestContext) -> String {
    let draw_size = size(px(3000.0), px(3000.0));
    cx.simulate_resize(draw_size);
    cx.draw(gpui::Point::default(), draw_size, |_, _| editor.clone());
    cx.draw(
        gpui::Point::default(),
        size(px(3000.0), px(3000.0)),
        |_, _| editor.clone(),
    );
    let (snapshot, mut lines, blocks) = editor.update_in(cx, |editor, window, cx| {
        let snapshot = editor.snapshot(window, cx);
        let text = editor.display_text(cx);

@@ -51,9 +51,17 @@ impl LiveKitStream {
            gpui::Priority::Realtime(gpui::RealtimePriority::Audio),
            {
                async move {
                    let mut last_print = std::time::Instant::now();
                    while let Some(frame) = stream.next().await {
                        let samples = frame_to_samplesbuffer(frame);
                        queue_input.append(samples);
                        let queued = queue_input.append(samples);

                        if queued > 10 && last_print.elapsed().as_secs() > 5 {
                            log::warn!(
                                "More than 10 audio frames queued in the input, total: {queued}"
                            );
                            last_print = std::time::Instant::now();
                        }
                    }
                }
            },

@@ -2606,41 +2606,19 @@ impl MultiBuffer {

    pub fn has_expanded_diff_hunks_in_ranges(&self, ranges: &[Range<Anchor>], cx: &App) -> bool {
        let snapshot = self.read(cx);
        print!("[ ");
        for range in ranges {
            let range = range.to_point(&snapshot);
            print!("{range:?}, ");
        }
        println!(" ]");
        let mut cursor = snapshot.diff_transforms.cursor::<MultiBufferOffset>(());
        cursor.next();
        println!("{{ ");
        while let Some(item) = cursor.item() {
            if let Some(hunk_info) = item.hunk_info() {
                println!(" {:?}-{:?} {hunk_info:?}, ", cursor.start(), cursor.end());
            } else {
                println!(" {:?}-{:?}, ", cursor.start(), cursor.end());
            }
            cursor.next();
        }
        println!("}}");
        let mut cursor = snapshot.diff_transforms.cursor::<MultiBufferOffset>(());
        for range in ranges {
            let range = range.to_point(&snapshot);
            let start = snapshot.point_to_offset(Point::new(range.start.row, 0));
            let end = snapshot.point_to_offset(Point::new(range.end.row + 1, 0));
            let new_start = start.saturating_sub_usize(0);
            let new_end = snapshot.len().min(end);
            println!("TRYING {range:?} ({start:?}-{end:?}) => ({new_start:?}-{new_end:?})");
            cursor.seek(&new_start, Bias::Right);
            let start = start.saturating_sub_usize(1);
            let end = snapshot.len().min(end + 1usize);
            cursor.seek(&start, Bias::Right);
            while let Some(item) = cursor.item() {
                println!(" ?? {:?}-{:?}", cursor.start(), cursor.end());
                if *cursor.start() > new_end {
                    println!(" BREAK");
                if *cursor.start() >= end {
                    break;
                }
                if item.hunk_info().is_some() {
                    println!(" HIT");
                    return true;
                }
                cursor.next();
@@ -2656,7 +2634,6 @@ impl MultiBuffer {
        cx: &mut Context<Self>,
    ) {
        if self.snapshot.borrow().all_diff_hunks_expanded && !expand {
            println!("early exit");
            return;
        }
        self.sync_mut(cx);
@@ -2665,7 +2642,6 @@ impl MultiBuffer {
        let mut last_hunk_row = None;
        for (range, end_excerpt_id) in ranges {
            for diff_hunk in snapshot.diff_hunks_in_range(range) {
                println!(" >> {diff_hunk:?}");
                if diff_hunk.excerpt_id.cmp(&end_excerpt_id, &snapshot).is_gt() {
                    continue;
                }
@@ -3883,7 +3859,6 @@ impl MultiBufferSnapshot {
        range: Range<T>,
    ) -> impl Iterator<Item = MultiBufferDiffHunk> + '_ {
        let query_range = range.start.to_point(self)..range.end.to_point(self);
        println!("query_range={query_range:?}");
        self.lift_buffer_metadata(query_range.clone(), move |buffer, buffer_range| {
            let diff = self.diffs.get(&buffer.remote_id())?;
            let buffer_start = buffer.anchor_before(buffer_range.start);
@@ -3899,7 +3874,6 @@ impl MultiBufferSnapshot {
            )
        })
        .filter_map(move |(range, hunk, excerpt)| {
            println!(" -> range={range:?}");
            if range.start != range.end && range.end == query_range.start && !hunk.range.is_empty()
            {
                return None;
Block a user