Compare commits


14 Commits

Author SHA1 Message Date
Peter Tripp
7c7f540fb4 wip 2025-03-07 16:49:51 +00:00
João Marcos
e06d010aab Test folded buffers navigation (#26286)
Same as #25944, but now with Vim mode off.

Release Notes:

- N/A
2025-03-07 16:19:12 +00:00
Marshall Bowers
14148f53d4 scripting_tool: Move description into a separate file (#26283)
This PR moves the `scripting_tool` description into a separate file so
it's a bit easier to work with.

Release Notes:

- N/A
2025-03-07 15:52:38 +00:00
Antonio Scandurra
efde5aa2bb Extract a Session struct to hold state about a given thread's scripting session (#26282)
We're still recreating a session for every tool call, but the idea is to
have a long-lived `Session` per assistant thread.

Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <hi@aguz.me>
2025-03-07 15:44:36 +00:00
Guilherme Gonçalves
fcc5e27455 Fix hotkey for toggle filters in project search (#25917)
Closes #24741 

Adjusted the shortcut key handling to properly toggle filters in the project search feature.

Release Notes:

- linux: Fixed `ctrl-alt-f` not correctly toggling search filters in project search.

---------

Co-authored-by: Peter Tripp <peter@zed.dev>
2025-03-07 15:27:10 +00:00
Marshall Bowers
ed417da536 git_ui: Try to prompt the model out of including the diff output (#26281)
This PR updates the prompt for generating commit messages to tell the
model not to include the raw diff output in the message.

Release Notes:

- N/A
2025-03-07 15:05:35 +00:00
Piotr Osiewicz
d1c67897c5 chore: Do not bust Rust build cache when opening projects with dev build (#26278)
## Problem
Running `cargo run .` twice in the Zed repository required a rebuild both
times. The second rebuild was triggered around `libz-sys`, which
in practice caused a rebuild of nearly the entire project.

Some concrete examples:
```
cargo test -p project # Requires a rebuild (warranted)
cargo run .
cargo test -p project # Requires a rebuild (unwarranted)
```
or
```
cargo run . # Requires a rebuild (warranted)
cargo run . # Requires a rebuild (unwarranted)
```

## What's going on
Zed's build script on macOS sets MACOSX_DEPLOYMENT_TARGET to 10.15. This
is fine. However, **cargo propagates all environment variables to child
processes during `cargo run`**. This then affects Rust Analyzer spawned
by dev Zed: it clobbers the build cache of whatever package it touches,
because its behavior is not the same between running under `cargo run`
(where MACOSX_DEPLOYMENT_TARGET gets propagated to the child Zed) and running
it directly via `target/debug/zed` or whatever (where the env variable
is not set, so that build behaves roughly like Zed Dev.app).


## Solution
~We'll unset that env variable from user environment when we're
reasonably confident that we're running under `cargo run` by exploiting
other env variables set by cargo:
https://doc.rust-lang.org/cargo/reference/environment-variables.html
CARGO_PKG_NAME is always set to `zed` when running it via `cargo run`,
as it's the value propagated from the build.~

~The alternative I've considered is running [via a custom
runner](https://doc.rust-lang.org/cargo/reference/config.html#targetcfgrunner),
though the problem here is that we'd have to use a shell script to unset
the env variable - that could be problematic with e.g. fish. I just
didn't want to deal with that, though admittedly it would've been
cleaner in other aspects.~

Disregard all of the above. We'll just set MACOSX_DEPLOYMENT_TARGET regardless of
whether you have it in your original shell environment or not.

Release Notes:

- N/A
2025-03-07 14:06:44 +00:00
Finn Evers
a887f3b340 Remove plain text file type association from default settings (#25420)
Closes #20291

This PR removes the plain text file association from the default
settings, as #21298 added a `LanguageMatcher` for Plain Text files,
which now associates "Plain Text" with `txt`-files (see
10053e2566/crates/language/src/language.rs (L127-L137)).

Thus, the association via the default settings is not required anymore,
which fixes #20291 as described in
https://github.com/zed-industries/zed/issues/20291#issuecomment-2500731743

Release Notes:

- Fixed default file type associations overriding associations provided
by extensions for `txt`-files.

Co-authored-by: Peter Tripp <peter@zed.dev>
2025-03-07 08:45:23 -05:00
Kirill Bulatov
f8deebc6db Fix inline diagnostics in the project diff (#26275)
205f9a9f03/crates/editor/src/element.rs (L1643)

Due to the snippet above, Zed is supposed to have `row` greater than or equal
to `start_row` here:


205f9a9f03/crates/editor/src/element.rs (L1694)

yet panics were reported when clicking in the project diff.

The project diff already has a lot of highlighting happening, so this PR
disables inline diagnostics within a git diff view.


Release Notes:

- N/A
2025-03-07 11:34:37 +00:00
Michael Sloan
205f9a9f03 Add lua script access to code using cx + reuse project search logic (#26269)
Access to `cx` will be needed for anything that queries entities. In
this commit, that means use of `WorktreeStore::find_search_candidates`. In
the future it will include things like access to LSP / tree-sitter outlines /
etc.

Changes to support access to `cx` from functions provided to the Lua
script:

* Adds a channel of requests that require a `cx`. Work enqueued to this
channel is run on the foreground thread.

* Adds `async` and `send` features to `mlua` crate so that async rust
functions can be used from Lua.

* Changes uses of `Rc<RefCell<...>>` to `Arc<Mutex<...>>` so that the
futures are `Send`.

One benefit of reusing project search logic for search candidates is
that it properly ignores paths.

Release Notes:

- N/A
2025-03-07 10:02:49 +00:00
Cole Miller
b0d1024f66 Silence a couple of noisy logs (#26262)
Closes #ISSUE

Release Notes:

- N/A
2025-03-06 22:45:47 -05:00
loczek
622ed8a032 git: Fix git panel not using default width (#26220)
Closes #26062

Removing the width here causes Zed to use the default value (from the
default settings) after restart, like other panels.

Release Notes:

- Fixed an issue where the git panel wasn't using its default width after restart.

Co-authored-by: Mikayla Maki <mikayla@zed.dev>
2025-03-07 02:27:29 +00:00
0x2CA
09c51f9641 assistant2: Fix font fallbacks (#26258)
Release Notes:

- N/A
2025-03-06 18:14:53 -08:00
Mikayla Maki
8422a81d88 Add staged variants of the hunk_style controls (#26259)
This PR adds a few more hunk style settings that flip the emphasis.
Normally, the concept at Zed has been that the project diff should
emphasize what's going into the commit. However, this leads to a problem
where the default state of all diff hunks is the non-emphasized
state, making them hard to see and interact with, especially on light
themes. This PR is an experiment in flipping the emphasis states. Now
the project diff is more like a queue of work, with the next "job" (hunk
to be evaluated) emphasized, and the "completed" (staged) hunks
deemphasized. This fixes the default-state issue but is a big jump from
how we've been thinking about it. So here we can try it out and see how
it feels :)

Release Notes:

- Git Beta: Added hunk style settings to emphasize the unstaged state,
rather than the staged state.
2025-03-07 02:13:50 +00:00
25 changed files with 1756 additions and 1943 deletions

View File

@@ -26,3 +26,6 @@ rustflags = [
"-C",
"target-feature=+crt-static", # This fixes the linking issue when compiling livekit on Windows
]
[env]
MACOSX_DEPLOYMENT_TARGET = "10.15.7"

Cargo.lock (generated, 1416 lines): file diff suppressed because it is too large.

View File

@@ -452,7 +452,7 @@ livekit = { git = "https://github.com/zed-industries/livekit-rust-sdks", rev = "
], default-features = false }
log = { version = "0.4.16", features = ["kv_unstable_serde", "serde"] }
markup5ever_rcdom = "0.3.0"
mlua = { version = "0.10", features = ["lua54", "vendored"] }
mlua = { version = "0.10", features = ["lua54", "vendored", "async", "send"] }
nanoid = "0.4"
nbformat = { version = "0.10.0" }
nix = "0.29"

View File

@@ -105,7 +105,6 @@
"ctrl-shift-home": "editor::SelectToBeginning",
"ctrl-shift-end": "editor::SelectToEnd",
"ctrl-a": "editor::SelectAll",
"ctrl-l": "editor::SelectLine",
"ctrl-shift-i": "editor::Format",
// "cmd-shift-left": ["editor::SelectToBeginningOfLine", {"stop_at_soft_wraps": true, "stop_at_indent": true }],
// "ctrl-shift-a": ["editor::SelectToBeginningOfLine", { "stop_at_soft_wraps": true, "stop_at_indent": true }],
@@ -471,17 +470,15 @@
"ctrl-shift-d": "editor::DuplicateLineDown",
"ctrl-shift-j": "editor::JoinLines",
"ctrl-alt-backspace": "editor::DeleteToPreviousSubwordStart",
"ctrl-alt-h": "editor::DeleteToPreviousSubwordStart",
"ctrl-alt-delete": "editor::DeleteToNextSubwordEnd",
"ctrl-alt-d": "editor::DeleteToNextSubwordEnd",
"ctrl-alt-left": "editor::MoveToPreviousSubwordStart",
// "ctrl-alt-b": "editor::MoveToPreviousSubwordStart",
"ctrl-alt-right": "editor::MoveToNextSubwordEnd",
"ctrl-alt-f": "editor::MoveToNextSubwordEnd",
"ctrl-alt-shift-left": "editor::SelectToPreviousSubwordStart",
"ctrl-alt-shift-b": "editor::SelectToPreviousSubwordStart",
"ctrl-alt-shift-right": "editor::SelectToNextSubwordEnd",
"ctrl-alt-shift-f": "editor::SelectToNextSubwordEnd"
"ctrl-alt-left": "editor::MoveToPreviousSubwordStart", // macos sublime
"ctrl-alt-right": "editor::MoveToNextSubwordStart", // macos sublime
"alt-left": "editor::MoveToPreviousWordStart",
"alt-right": "editor::MoveToNextWordEnd",
"ctrl-l": "editor::SelectLine", // goes downwards
//"alt-l": "editor::SelectLineUp",
"alt-shift-left": "editor::SelectToPreviousSubwordStart",
"alt-shift-right": "editor::SelectToNextSubwordEnd"
}
},
// Bindings from Atom

View File

@@ -28,6 +28,10 @@
{
"context": "Editor",
"bindings": {
"alt-left": "editor::MoveToPreviousSubwordStart",
"alt-right": "editor::MoveToNextSubwordStart",
"alt-shift-left": "editor::SelectToPreviousSubwordStart",
"alt-shift-right": "editor::SelectToNextSubwordEnd",
"ctrl-alt-up": "editor::AddSelectionAbove",
"ctrl-alt-down": "editor::AddSelectionBelow",
"ctrl-shift-up": "editor::MoveLineUp",

View File

@@ -1055,7 +1055,6 @@
// }
//
"file_types": {
"Plain Text": ["txt"],
"JSONC": ["**/.zed/**/*.json", "**/zed/**/*.json", "**/Zed/**/*.json", "**/.vscode/**/*.json"],
"Shell Script": [".env.*"]
},

View File

@@ -173,6 +173,8 @@ impl ActiveThread {
text_style.refine(&TextStyleRefinement {
font_family: Some(theme_settings.ui_font.family.clone()),
font_fallbacks: theme_settings.ui_font.fallbacks.clone(),
font_features: Some(theme_settings.ui_font.features.clone()),
font_size: Some(ui_font_size.into()),
color: Some(cx.theme().colors().text),
..Default::default()
@@ -207,6 +209,8 @@ impl ActiveThread {
},
text: Some(TextStyleRefinement {
font_family: Some(theme_settings.buffer_font.family.clone()),
font_fallbacks: theme_settings.buffer_font.fallbacks.clone(),
font_features: Some(theme_settings.buffer_font.features.clone()),
font_size: Some(buffer_font_size.into()),
..Default::default()
}),
@@ -214,6 +218,8 @@ impl ActiveThread {
},
inline_code: TextStyleRefinement {
font_family: Some(theme_settings.buffer_font.family.clone()),
font_fallbacks: theme_settings.buffer_font.fallbacks.clone(),
font_features: Some(theme_settings.buffer_font.features.clone()),
font_size: Some(buffer_font_size.into()),
background_color: Some(colors.editor_foreground.opacity(0.1)),
..Default::default()

View File

@@ -389,6 +389,7 @@ impl Render for MessageEditor {
let text_style = TextStyle {
color: cx.theme().colors().text,
font_family: settings.ui_font.family.clone(),
font_fallbacks: settings.ui_font.fallbacks.clone(),
font_features: settings.ui_font.features.clone(),
font_size: font_size.into(),
font_weight: settings.ui_font.weight,

View File

@@ -16413,6 +16413,199 @@ async fn test_folding_buffer_when_multibuffer_has_only_one_excerpt(cx: &mut Test
);
}
#[gpui::test]
async fn test_multi_buffer_navigation_with_folded_buffers(cx: &mut TestAppContext) {
init_test(cx, |_| {});
cx.update(|cx| {
let default_key_bindings = settings::KeymapFile::load_asset_allow_partial_failure(
"keymaps/default-linux.json",
cx,
)
.unwrap();
cx.bind_keys(default_key_bindings);
});
let (editor, cx) = cx.add_window_view(|window, cx| {
let multi_buffer = MultiBuffer::build_multi(
[
("a0\nb0\nc0\nd0\ne0\n", vec![Point::row_range(0..2)]),
("a1\nb1\nc1\nd1\ne1\n", vec![Point::row_range(0..2)]),
("a2\nb2\nc2\nd2\ne2\n", vec![Point::row_range(0..2)]),
("a3\nb3\nc3\nd3\ne3\n", vec![Point::row_range(0..2)]),
],
cx,
);
let mut editor = Editor::new(
EditorMode::Full,
multi_buffer.clone(),
None,
true,
window,
cx,
);
let buffer_ids = multi_buffer.read(cx).excerpt_buffer_ids();
// fold all but the second buffer, so that we test navigating between two
// adjacent folded buffers, as well as folded buffers at the start and
// end the multibuffer
editor.fold_buffer(buffer_ids[0], cx);
editor.fold_buffer(buffer_ids[2], cx);
editor.fold_buffer(buffer_ids[3], cx);
editor
});
cx.simulate_resize(size(px(1000.), px(1000.)));
let mut cx = EditorTestContext::for_editor_in(editor.clone(), cx).await;
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
ˇ[FOLDED]
[EXCERPT]
a1
b1
[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("down");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
ˇa1
b1
[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("down");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
ˇb1
[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("down");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
b1
ˇ[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("down");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
b1
[EXCERPT]
ˇ[FOLDED]
[EXCERPT]
[FOLDED]
"
});
for _ in 0..5 {
cx.simulate_keystroke("down");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
b1
[EXCERPT]
[FOLDED]
[EXCERPT]
ˇ[FOLDED]
"
});
}
cx.simulate_keystroke("up");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
b1
[EXCERPT]
ˇ[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("up");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
b1
ˇ[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("up");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
ˇb1
[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("up");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
ˇa1
b1
[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
for _ in 0..5 {
cx.simulate_keystroke("up");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
ˇ[FOLDED]
[EXCERPT]
a1
b1
[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
}
}
#[gpui::test]
async fn test_inline_completion_text(cx: &mut TestAppContext) {
init_test(cx, |_| {});

View File

@@ -1691,7 +1691,7 @@ impl EditorElement {
let pos_y = content_origin.y
+ line_height * (row.0 as f32 - scroll_pixel_position.y / line_height);
let window_ix = row.minus(start_row) as usize;
let window_ix = row.0.saturating_sub(start_row.0) as usize;
let pos_x = {
let crease_trailer_layout = &crease_trailers[window_ix];
let line_layout = &line_layouts[window_ix];
@@ -4349,6 +4349,11 @@ impl EditorElement {
fn paint_gutter_diff_hunks(layout: &mut EditorLayout, window: &mut Window, cx: &mut App) {
let is_light = cx.theme().appearance().is_light();
let hunk_style = ProjectSettings::get_global(cx)
.git
.hunk_style
.unwrap_or_default();
if layout.display_hunks.is_empty() {
return;
}
@@ -4412,9 +4417,20 @@ impl EditorElement {
if let Some((hunk_bounds, mut background_color, corner_radii, secondary_status)) =
hunk_to_paint
{
if secondary_status.has_secondary_hunk() {
background_color =
background_color.opacity(if is_light { 0.2 } else { 0.32 });
match hunk_style {
GitHunkStyleSetting::Transparent | GitHunkStyleSetting::Pattern => {
if secondary_status.has_secondary_hunk() {
background_color =
background_color.opacity(if is_light { 0.2 } else { 0.32 });
}
}
GitHunkStyleSetting::StagedPattern
| GitHunkStyleSetting::StagedTransparent => {
if !secondary_status.has_secondary_hunk() {
background_color =
background_color.opacity(if is_light { 0.2 } else { 0.32 });
}
}
}
// Flatten the background color with the editor color to prevent
@@ -6734,10 +6750,10 @@ impl Element for EditorElement {
.update(cx, |editor, cx| editor.highlighted_display_rows(window, cx));
let is_light = cx.theme().appearance().is_light();
let use_pattern = ProjectSettings::get_global(cx)
let hunk_style = ProjectSettings::get_global(cx)
.git
.hunk_style
.map_or(false, |style| matches!(style, GitHunkStyleSetting::Pattern));
.unwrap_or_default();
for (ix, row_info) in row_infos.iter().enumerate() {
let Some(diff_status) = row_info.diff_status else {
@@ -6757,20 +6773,39 @@ impl Element for EditorElement {
let unstaged = diff_status.has_secondary_hunk();
let hunk_opacity = if is_light { 0.16 } else { 0.12 };
let slash_width = line_height.0 / 1.5; // ~16 by default
let staged_background =
solid_background(background_color.opacity(hunk_opacity));
let unstaged_background = if use_pattern {
pattern_slash(
background_color.opacity(hunk_opacity),
window.rem_size().0 * 1.125, // ~18 by default
)
} else {
solid_background(background_color.opacity(if is_light {
0.08
} else {
0.04
}))
let staged_background = match hunk_style {
GitHunkStyleSetting::Transparent | GitHunkStyleSetting::Pattern => {
solid_background(background_color.opacity(hunk_opacity))
}
GitHunkStyleSetting::StagedPattern => {
pattern_slash(background_color.opacity(hunk_opacity), slash_width)
}
GitHunkStyleSetting::StagedTransparent => {
solid_background(background_color.opacity(if is_light {
0.08
} else {
0.04
}))
}
};
let unstaged_background = match hunk_style {
GitHunkStyleSetting::Transparent => {
solid_background(background_color.opacity(if is_light {
0.08
} else {
0.04
}))
}
GitHunkStyleSetting::Pattern => {
pattern_slash(background_color.opacity(hunk_opacity), slash_width)
}
GitHunkStyleSetting::StagedPattern
| GitHunkStyleSetting::StagedTransparent => {
solid_background(background_color.opacity(hunk_opacity))
}
};
let background = if unstaged {

View File

@@ -429,12 +429,14 @@ impl EditorTestContext {
if expected_selections.len() > 0 {
assert!(
is_selected,
"excerpt {} should be selected. Got {:?}",
ix,
self.editor_state()
"excerpt {ix} should be selected. got {:?}",
self.editor_state(),
);
} else {
assert!(!is_selected, "excerpt {} should not be selected", ix);
assert!(
!is_selected,
"excerpt {ix} should not be selected, got: {selections:?}",
);
}
continue;
}

View File

@@ -4,7 +4,7 @@ If you can accurately express the change in just the subject line, don't include
Don't repeat information from the subject line in the message body.
Only return the commit message in your response. Do not include any additional meta-commentary about the task.
Only return the commit message in your response. Do not include any additional meta-commentary about the task. Do not include the raw diff output in the commit message.
Follow good Git style:

View File

@@ -374,7 +374,7 @@ impl GitPanel {
tracked_count: 0,
tracked_staged_count: 0,
update_visible_entries_task: Task::ready(()),
width: Some(px(360.)),
width: None,
context_menu: None,
workspace,
modal_open: false,

View File

@@ -138,6 +138,7 @@ impl ProjectDiff {
window,
cx,
);
diff_display_editor.disable_inline_diagnostics();
diff_display_editor.set_expand_all_diff_hunks(cx);
diff_display_editor.register_addon(GitPanelAddon {
workspace: workspace.downgrade(),

View File

@@ -736,7 +736,7 @@ impl Element for MarkdownElement {
markdown_end,
);
}
_ => log::error!("unsupported markdown tag {:?}", tag),
_ => log::debug!("unsupported markdown tag {:?}", tag),
}
}
MarkdownEvent::End(tag) => match tag {
@@ -853,7 +853,7 @@ impl Element for MarkdownElement {
MarkdownTagEnd::TableCell => {
builder.pop_div();
}
_ => log::error!("unsupported markdown tag end: {:?}", tag),
_ => log::debug!("unsupported markdown tag end: {:?}", tag),
},
MarkdownEvent::Text(parsed) => {
builder.push_text(parsed, range.start);

View File

@@ -3523,10 +3523,7 @@ impl MultiBufferSnapshot {
) -> impl Iterator<Item = MultiBufferDiffHunk> + '_ {
let query_range = range.start.to_point(self)..range.end.to_point(self);
self.lift_buffer_metadata(query_range.clone(), move |buffer, buffer_range| {
let Some(diff) = self.diffs.get(&buffer.remote_id()) else {
log::debug!("no diff found for {:?}", buffer.remote_id());
return None;
};
let diff = self.diffs.get(&buffer.remote_id())?;
let buffer_start = buffer.anchor_before(buffer_range.start);
let buffer_end = buffer.anchor_after(buffer_range.end);
Some(

View File

@@ -212,6 +212,10 @@ pub enum GitHunkStyleSetting {
Transparent,
/// Show unstaged hunks with a pattern background
Pattern,
/// Show staged hunks with a pattern background
StagedPattern,
/// Show staged hunks with a transparent background
StagedTransparent,
}
#[derive(Clone, Copy, Debug, Default, Serialize, Deserialize, JsonSchema)]

View File

@@ -15,13 +15,23 @@ doctest = false
[dependencies]
anyhow.workspace = true
assistant_tool.workspace = true
collections.workspace = true
futures.workspace = true
gpui.workspace = true
mlua.workspace = true
parking_lot.workspace = true
project.workspace = true
regex.workspace = true
schemars.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
util.workspace = true
workspace.workspace = true
regex.workspace = true
[dev-dependencies]
editor = { workspace = true, features = ["test-support"] }
collections = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, features = ["test-support"] }
project = { workspace = true, features = ["test-support"] }
settings = { workspace = true, features = ["test-support"] }
workspace = { workspace = true, features = ["test-support"] }

View File

@@ -1,19 +1,12 @@
mod streaming_json;
mod streaming_lua;
mod session;
pub(crate) use session::*;
use anyhow::anyhow;
use assistant_tool::{Tool, ToolRegistry};
use gpui::{App, AppContext as _, Task, WeakEntity, Window};
use mlua::{Function, Lua, MultiValue, Result, UserData, UserDataMethods};
use schemars::JsonSchema;
use serde::Deserialize;
use std::{
cell::RefCell,
collections::HashMap,
path::{Path, PathBuf},
rc::Rc,
sync::Arc,
};
use std::sync::Arc;
use workspace::Workspace;
pub fn init(cx: &App) {
@@ -34,20 +27,7 @@ impl Tool for ScriptingTool {
}
fn description(&self) -> String {
r#"You can write a Lua script and I'll run it on my code base and tell you what its output was,
including both stdout as well as the git diff of changes it made to the filesystem. That way,
you can get more information about the code base, or make changes to the code base directly.
The lua script will have access to `io` and it will run with the current working directory being in
the root of the code base, so you can use it to explore, search, make changes, etc. You can also have
the script print things, and I'll tell you what the output was. Note that `io` only has `open`, and
then the file it returns only has the methods read, write, and close - it doesn't have popen or
anything else. Also, I'm going to be putting this Lua script into JSON, so please don't use Lua's
double quote syntax for string literals - use one of Lua's other syntaxes for string literals, so I
don't have to escape the double quotes. There will be a global called `search` which accepts a regex
(it's implemented using Rust's regex crate, so use that regex syntax) and runs that regex on the contents
of every file in the code base (aside from gitignored files), then returns an array of tables with two
fields: "path" (the path to the file that had the matches) and "matches" (an array of strings, with each
string being a match that was found within the file)."#.into()
include_str!("scripting_tool_description.txt").into()
}
fn input_schema(&self) -> serde_json::Value {
@@ -62,724 +42,22 @@ string being a match that was found within the file)."#.into()
_window: &mut Window,
cx: &mut App,
) -> Task<anyhow::Result<String>> {
let root_dir = workspace.update(cx, |workspace, cx| {
let first_worktree = workspace
.visible_worktrees(cx)
.next()
.ok_or_else(|| anyhow!("no worktrees"))?;
workspace
.absolute_path_of_worktree(first_worktree.read(cx).id(), cx)
.ok_or_else(|| anyhow!("no worktree root"))
});
let root_dir = match root_dir {
Ok(root_dir) => root_dir,
Err(err) => return Task::ready(Err(err)),
};
let root_dir = match root_dir {
Ok(root_dir) => root_dir,
Err(err) => return Task::ready(Err(err)),
};
let input = match serde_json::from_value::<ScriptingToolInput>(input) {
Err(err) => return Task::ready(Err(err.into())),
Ok(input) => input,
};
let lua_script = input.lua_script;
cx.background_spawn(async move {
let fs_changes = HashMap::new();
let output = run_sandboxed_lua(&lua_script, fs_changes, root_dir)
.map_err(|err| anyhow!(format!("{err}")))?;
let output = output.printed_lines.join("\n");
let Ok(project) = workspace.read_with(cx, |workspace, _cx| workspace.project().clone())
else {
return Task::ready(Err(anyhow::anyhow!("No project found")));
};
let session = cx.new(|cx| Session::new(project, cx));
let lua_script = input.lua_script;
let script = session.update(cx, |session, cx| session.run_script(lua_script, cx));
cx.spawn(|_cx| async move {
let output = script.await?.stdout;
drop(session);
Ok(format!("The script output the following:\n{output}"))
})
}
}
const SANDBOX_PREAMBLE: &str = include_str!("sandbox_preamble.lua");
struct FileContent(RefCell<Vec<u8>>);
impl UserData for FileContent {
fn add_methods<M: UserDataMethods<Self>>(_methods: &mut M) {
// FileContent doesn't have any methods so far.
}
}
/// Sandboxed print() function in Lua.
fn print(lua: &Lua, printed_lines: Rc<RefCell<Vec<String>>>) -> Result<Function> {
lua.create_function(move |_, args: MultiValue| {
let mut string = String::new();
for arg in args.into_iter() {
// Lua's `print()` prints tab characters between each argument.
if !string.is_empty() {
string.push('\t');
}
// If the argument's to_string() fails, have the whole function call fail.
string.push_str(arg.to_string()?.as_str())
}
printed_lines.borrow_mut().push(string);
Ok(())
})
}
fn search(
lua: &Lua,
_fs_changes: Rc<RefCell<HashMap<PathBuf, Vec<u8>>>>,
root_dir: PathBuf,
) -> Result<Function> {
lua.create_function(move |lua, regex: String| {
use mlua::Table;
use regex::Regex;
use std::fs;
// Function to recursively search directory
let search_regex = match Regex::new(&regex) {
Ok(re) => re,
Err(e) => return Err(mlua::Error::runtime(format!("Invalid regex: {}", e))),
};
let mut search_results: Vec<Result<Table>> = Vec::new();
// Create an explicit stack for directories to process
let mut dir_stack = vec![root_dir.clone()];
while let Some(current_dir) = dir_stack.pop() {
// Process each entry in the current directory
let entries = match fs::read_dir(&current_dir) {
Ok(entries) => entries,
Err(e) => return Err(e.into()),
};
for entry_result in entries {
let entry = match entry_result {
Ok(e) => e,
Err(e) => return Err(e.into()),
};
let path = entry.path();
if path.is_dir() {
// Skip .git directory and other common directories to ignore
let dir_name = path.file_name().unwrap_or_default().to_string_lossy();
if !dir_name.starts_with('.')
&& dir_name != "node_modules"
&& dir_name != "target"
{
// Instead of recursive call, add to stack
dir_stack.push(path);
}
} else if path.is_file() {
// Skip binary files and very large files
if let Ok(metadata) = fs::metadata(&path) {
if metadata.len() > 1_000_000 {
// Skip files larger than 1MB
continue;
}
}
// Attempt to read the file as text
if let Ok(content) = fs::read_to_string(&path) {
let mut matches = Vec::new();
// Find all regex matches in the content
for capture in search_regex.find_iter(&content) {
matches.push(capture.as_str().to_string());
}
// If we found matches, create a result entry
if !matches.is_empty() {
let result_entry = lua.create_table()?;
result_entry.set("path", path.to_string_lossy().to_string())?;
let matches_table = lua.create_table()?;
for (i, m) in matches.iter().enumerate() {
matches_table.set(i + 1, m.clone())?;
}
result_entry.set("matches", matches_table)?;
search_results.push(Ok(result_entry));
}
}
}
}
}
// Create a table to hold our results
let results_table = lua.create_table()?;
for (i, result) in search_results.into_iter().enumerate() {
match result {
Ok(entry) => results_table.set(i + 1, entry)?,
Err(e) => return Err(e),
}
}
Ok(results_table)
})
}
/// Sandboxed io.open() function in Lua.
fn io_open(
lua: &Lua,
fs_changes: Rc<RefCell<HashMap<PathBuf, Vec<u8>>>>,
root_dir: PathBuf,
) -> Result<Function> {
lua.create_function(move |lua, (path_str, mode): (String, Option<String>)| {
let mode = mode.unwrap_or_else(|| "r".to_string());
// Parse the mode string to determine read/write permissions
let read_perm = mode.contains('r');
let write_perm = mode.contains('w') || mode.contains('a') || mode.contains('+');
let append = mode.contains('a');
let truncate = mode.contains('w');
// This will be the Lua value returned from the `open` function.
let file = lua.create_table()?;
// Store file metadata in the file
file.set("__path", path_str.clone())?;
file.set("__mode", mode.clone())?;
file.set("__read_perm", read_perm)?;
file.set("__write_perm", write_perm)?;
// Sandbox the path; it must be within root_dir
let path: PathBuf = {
let rust_path = Path::new(&path_str);
// Get absolute path
if rust_path.is_absolute() {
// Check if path starts with root_dir prefix without resolving symlinks
if !rust_path.starts_with(&root_dir) {
return Ok((
None,
format!(
"Error: Absolute path {} is outside the current working directory",
path_str
),
));
}
rust_path.to_path_buf()
} else {
// Make relative path absolute relative to cwd
root_dir.join(rust_path)
}
};
// close method
let close_fn = {
let fs_changes = fs_changes.clone();
lua.create_function(move |_lua, file_userdata: mlua::Table| {
let write_perm = file_userdata.get::<bool>("__write_perm")?;
let path = file_userdata.get::<String>("__path")?;
if write_perm {
// When closing a writable file, record the content
let content = file_userdata.get::<mlua::AnyUserData>("__content")?;
let content_ref = content.borrow::<FileContent>()?;
let content_vec = content_ref.0.borrow();
// Don't actually write to disk; instead, just update fs_changes.
let path_buf = PathBuf::from(&path);
fs_changes
.borrow_mut()
.insert(path_buf.clone(), content_vec.clone());
}
Ok(true)
})?
};
file.set("close", close_fn)?;
// If it's a directory, give it a custom read() and return early.
if path.is_dir() {
// TODO handle the case where we changed it in the in-memory fs
// Create a special directory handle
file.set("__is_directory", true)?;
// Store directory entries
let entries = match std::fs::read_dir(&path) {
Ok(entries) => {
let mut entry_names = Vec::new();
for entry in entries.flatten() {
entry_names.push(entry.file_name().to_string_lossy().into_owned());
}
entry_names
}
Err(e) => return Ok((None, format!("Error reading directory: {}", e))),
};
// Save the list of entries
file.set("__dir_entries", entries)?;
file.set("__dir_position", 0usize)?;
// Create a directory-specific read function
let read_fn = lua.create_function(|_lua, file_userdata: mlua::Table| {
let position = file_userdata.get::<usize>("__dir_position")?;
let entries = file_userdata.get::<Vec<String>>("__dir_entries")?;
if position >= entries.len() {
return Ok(None); // No more entries
}
let entry = entries[position].clone();
file_userdata.set("__dir_position", position + 1)?;
Ok(Some(entry))
})?;
file.set("read", read_fn)?;
// If we got this far, the directory was opened successfully
return Ok((Some(file), String::new()));
}
let is_in_changes = fs_changes.borrow().contains_key(&path);
let file_exists = is_in_changes || path.exists();
let mut file_content = Vec::new();
if file_exists && !truncate {
if is_in_changes {
file_content = fs_changes.borrow().get(&path).unwrap().clone();
} else {
// Try to read existing content if file exists and we're not truncating
match std::fs::read(&path) {
Ok(content) => file_content = content,
Err(e) => return Ok((None, format!("Error reading file: {}", e))),
}
}
}
// If in append mode, position should be at the end
let position = if append && file_exists {
file_content.len()
} else {
0
};
file.set("__position", position)?;
file.set(
"__content",
lua.create_userdata(FileContent(RefCell::new(file_content)))?,
)?;
// Create file methods
// read method
let read_fn = {
lua.create_function(
|_lua, (file_userdata, format): (mlua::Table, Option<mlua::Value>)| {
let read_perm = file_userdata.get::<bool>("__read_perm")?;
if !read_perm {
return Err(mlua::Error::runtime("File not open for reading"));
}
let content = file_userdata.get::<mlua::AnyUserData>("__content")?;
let mut position = file_userdata.get::<usize>("__position")?;
let content_ref = content.borrow::<FileContent>()?;
let content_vec = content_ref.0.borrow();
if position >= content_vec.len() {
return Ok(None); // EOF
}
match format {
Some(mlua::Value::String(s)) => {
let lossy_string = s.to_string_lossy();
let format_str: &str = lossy_string.as_ref();
// Only consider the first 2 bytes, since it's common to pass e.g. "*all" instead of "*a"
match &format_str[0..2] {
"*a" => {
// Read entire file from current position
let result = String::from_utf8_lossy(&content_vec[position..])
.to_string();
position = content_vec.len();
file_userdata.set("__position", position)?;
Ok(Some(result))
}
"*l" => {
// Read next line
let mut line = Vec::new();
let mut found_newline = false;
while position < content_vec.len() {
let byte = content_vec[position];
position += 1;
if byte == b'\n' {
found_newline = true;
break;
}
// Skip \r in \r\n sequence but add it if it's alone
if byte == b'\r' {
if position < content_vec.len()
&& content_vec[position] == b'\n'
{
position += 1;
found_newline = true;
break;
}
}
line.push(byte);
}
file_userdata.set("__position", position)?;
if !found_newline
&& line.is_empty()
&& position >= content_vec.len()
{
return Ok(None); // EOF
}
let result = String::from_utf8_lossy(&line).to_string();
Ok(Some(result))
}
"*n" => {
// Try to parse as a number (number of bytes to read)
match format_str.parse::<usize>() {
Ok(n) => {
let end =
std::cmp::min(position + n, content_vec.len());
let bytes = &content_vec[position..end];
let result = String::from_utf8_lossy(bytes).to_string();
position = end;
file_userdata.set("__position", position)?;
Ok(Some(result))
}
Err(_) => Err(mlua::Error::runtime(format!(
"Invalid format: {}",
format_str
))),
}
}
"*L" => {
// Read next line keeping the end of line
let mut line = Vec::new();
while position < content_vec.len() {
let byte = content_vec[position];
position += 1;
line.push(byte);
if byte == b'\n' {
break;
}
// If we encounter a \r, add it and check if the next is \n
if byte == b'\r'
&& position < content_vec.len()
&& content_vec[position] == b'\n'
{
line.push(content_vec[position]);
position += 1;
break;
}
}
file_userdata.set("__position", position)?;
if line.is_empty() && position >= content_vec.len() {
return Ok(None); // EOF
}
let result = String::from_utf8_lossy(&line).to_string();
Ok(Some(result))
}
_ => Err(mlua::Error::runtime(format!(
"Unsupported format: {}",
format_str
))),
}
}
Some(mlua::Value::Number(n)) => {
// Read n bytes
let n = n as usize;
let end = std::cmp::min(position + n, content_vec.len());
let bytes = &content_vec[position..end];
let result = String::from_utf8_lossy(bytes).to_string();
position = end;
file_userdata.set("__position", position)?;
Ok(Some(result))
}
Some(_) => Err(mlua::Error::runtime("Invalid format")),
None => {
// Default is to read a line
let mut line = Vec::new();
let mut found_newline = false;
while position < content_vec.len() {
let byte = content_vec[position];
position += 1;
if byte == b'\n' {
found_newline = true;
break;
}
// Handle \r\n
if byte == b'\r' {
if position < content_vec.len()
&& content_vec[position] == b'\n'
{
position += 1;
found_newline = true;
break;
}
}
line.push(byte);
}
file_userdata.set("__position", position)?;
if !found_newline && line.is_empty() && position >= content_vec.len() {
return Ok(None); // EOF
}
let result = String::from_utf8_lossy(&line).to_string();
Ok(Some(result))
}
}
},
)?
};
file.set("read", read_fn)?;
// write method
let write_fn = {
let fs_changes = fs_changes.clone();
lua.create_function(move |_lua, (file_userdata, text): (mlua::Table, String)| {
let write_perm = file_userdata.get::<bool>("__write_perm")?;
if !write_perm {
return Err(mlua::Error::runtime("File not open for writing"));
}
let content = file_userdata.get::<mlua::AnyUserData>("__content")?;
let position = file_userdata.get::<usize>("__position")?;
let content_ref = content.borrow::<FileContent>()?;
let mut content_vec = content_ref.0.borrow_mut();
let bytes = text.as_bytes();
// Ensure the vector has enough capacity
if position + bytes.len() > content_vec.len() {
content_vec.resize(position + bytes.len(), 0);
}
// Write the bytes
for (i, &byte) in bytes.iter().enumerate() {
content_vec[position + i] = byte;
}
// Update position
let new_position = position + bytes.len();
file_userdata.set("__position", new_position)?;
// Update fs_changes
let path = file_userdata.get::<String>("__path")?;
let path_buf = PathBuf::from(path);
fs_changes
.borrow_mut()
.insert(path_buf, content_vec.clone());
Ok(true)
})?
};
file.set("write", write_fn)?;
// If we got this far, the file was opened successfully
Ok((Some(file), String::new()))
})
}
/// Runs a Lua script in a sandboxed environment and returns the printed lines
pub fn run_sandboxed_lua(
script: &str,
fs_changes: HashMap<PathBuf, Vec<u8>>,
root_dir: PathBuf,
) -> Result<ScriptOutput> {
let lua = Lua::new();
lua.set_memory_limit(2 * 1024 * 1024 * 1024)?; // 2 GB
let globals = lua.globals();
// Track the lines the Lua script prints out.
let printed_lines = Rc::new(RefCell::new(Vec::new()));
let fs = Rc::new(RefCell::new(fs_changes));
globals.set("sb_print", print(&lua, printed_lines.clone())?)?;
globals.set("search", search(&lua, fs.clone(), root_dir.clone())?)?;
globals.set("sb_io_open", io_open(&lua, fs.clone(), root_dir)?)?;
globals.set("user_script", script)?;
lua.load(SANDBOX_PREAMBLE).exec()?;
drop(lua); // Necessary so the Rc'd values get decremented.
Ok(ScriptOutput {
printed_lines: Rc::try_unwrap(printed_lines)
.expect("There are still other references to printed_lines")
.into_inner(),
fs_changes: Rc::try_unwrap(fs)
.expect("There are still other references to fs_changes")
.into_inner(),
})
}
pub struct ScriptOutput {
printed_lines: Vec<String>,
#[allow(dead_code)]
fs_changes: HashMap<PathBuf, Vec<u8>>,
}
#[allow(dead_code)]
impl ScriptOutput {
fn fs_diff(&self) -> HashMap<PathBuf, String> {
let mut diff_map = HashMap::new();
for (path, content) in &self.fs_changes {
let diff = if path.exists() {
// Read the current file content
match std::fs::read(path) {
Ok(current_content) => {
// Convert both to strings for diffing
let new_content = String::from_utf8_lossy(content).to_string();
let old_content = String::from_utf8_lossy(&current_content).to_string();
// Generate a git-style diff
let new_lines: Vec<&str> = new_content.lines().collect();
let old_lines: Vec<&str> = old_content.lines().collect();
let path_str = path.to_string_lossy();
let mut diff = format!("diff --git a/{} b/{}\n", path_str, path_str);
diff.push_str(&format!("--- a/{}\n", path_str));
diff.push_str(&format!("+++ b/{}\n", path_str));
// Very basic diff algorithm - this is simplified
let mut i = 0;
let mut j = 0;
while i < old_lines.len() || j < new_lines.len() {
if i < old_lines.len()
&& j < new_lines.len()
&& old_lines[i] == new_lines[j]
{
i += 1;
j += 1;
continue;
}
// Find next matching line
let mut next_i = i;
let mut next_j = j;
let mut found = false;
// Look ahead for matches
for look_i in i..std::cmp::min(i + 10, old_lines.len()) {
for look_j in j..std::cmp::min(j + 10, new_lines.len()) {
if old_lines[look_i] == new_lines[look_j] {
next_i = look_i;
next_j = look_j;
found = true;
break;
}
}
if found {
break;
}
}
// Output the hunk header
diff.push_str(&format!(
"@@ -{},{} +{},{} @@\n",
i + 1,
if found {
next_i - i
} else {
old_lines.len() - i
},
j + 1,
if found {
next_j - j
} else {
new_lines.len() - j
}
));
// Output removed lines
for k in i..next_i {
diff.push_str(&format!("-{}\n", old_lines[k]));
}
// Output added lines
for k in j..next_j {
diff.push_str(&format!("+{}\n", new_lines[k]));
}
i = next_i;
j = next_j;
if found {
i += 1;
j += 1;
} else {
break;
}
}
diff
}
Err(_) => format!("Error reading current file: {}", path.display()),
}
} else {
// New file
let content_str = String::from_utf8_lossy(content).to_string();
let path_str = path.to_string_lossy();
let mut diff = format!("diff --git a/{} b/{}\n", path_str, path_str);
diff.push_str("new file mode 100644\n");
diff.push_str("--- /dev/null\n");
diff.push_str(&format!("+++ b/{}\n", path_str));
let lines: Vec<&str> = content_str.lines().collect();
diff.push_str(&format!("@@ -0,0 +1,{} @@\n", lines.len()));
for line in lines {
diff.push_str(&format!("+{}\n", line));
}
diff
};
diff_map.insert(path.clone(), diff);
}
diff_map
}
fn diff_to_string(&self) -> String {
let mut answer = String::new();
let diff_map = self.fs_diff();
if diff_map.is_empty() {
return "No changes to files".to_string();
}
// Sort the paths for consistent output
let mut paths: Vec<&PathBuf> = diff_map.keys().collect();
paths.sort();
for path in paths {
if !answer.is_empty() {
answer.push_str("\n");
}
answer.push_str(&diff_map[path]);
}
answer
}
}

View File

@@ -0,0 +1,22 @@
You can write a Lua script and I'll run it on my codebase and tell you what its
output was, including both stdout and the git diff of any changes it made to the
filesystem. That way, you can get more information about the codebase, or make
changes to it directly.
The Lua script will have access to `io`, and it will run with the current working
directory set to the root of the codebase, so you can use it to explore, search,
make changes, etc. You can also have the script print things, and I'll tell you
what the output was. Note that `io` only has `open`, and the file handle it
returns only has the methods `read`, `write`, and `close` - it doesn't have
`popen` or anything else.
Also, I'm going to be putting this Lua script into JSON, so please don't use
Lua's double-quote syntax for string literals - use one of Lua's other string
syntaxes (single quotes or long brackets like `[[...]]`), so I don't have to
escape the double quotes.
There will be a global called `search` which accepts a regex (it's implemented
using Rust's regex crate, so use that regex syntax) and runs that regex on the
contents of every file in the code base (aside from gitignored files), then
returns an array of tables with two fields: "path" (the path to the file that
had the matches) and "matches" (an array of strings, with each string being a
match that was found within the file).
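As a sketch of a script that follows these rules - the file name, regex, and note text below are purely illustrative, and string literals avoid double quotes as requested:

```lua
-- Hypothetical example script: list every TODO comment in the codebase.
local results = search([[TODO[: ].*]])
for _, result in ipairs(results) do
  print('File: ' .. result.path)
  for _, match in ipairs(result.matches) do
    print('  ' .. match)
  end
end

-- Append a note to a hypothetical file using the restricted io API,
-- which exposes only open/read/write/close.
local file, err = io.open('NOTES.md', 'a')
if file then
  file:write('- audited TODO comments\n')
  file:close()
else
  print('Could not open file: ' .. err)
end
```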

View File

@@ -0,0 +1,743 @@
use anyhow::Result;
use collections::{HashMap, HashSet};
use futures::{
channel::{mpsc, oneshot},
pin_mut, SinkExt, StreamExt,
};
use gpui::{AppContext, AsyncApp, Context, Entity, Task, WeakEntity};
use mlua::{Lua, MultiValue, Table, UserData, UserDataMethods};
use parking_lot::Mutex;
use project::{search::SearchQuery, Fs, Project};
use regex::Regex;
use std::{
cell::RefCell,
path::{Path, PathBuf},
sync::Arc,
};
use util::{paths::PathMatcher, ResultExt};
pub struct ScriptOutput {
pub stdout: String,
}
struct ForegroundFn(Box<dyn FnOnce(WeakEntity<Session>, AsyncApp) + Send>);
pub struct Session {
project: Entity<Project>,
// TODO Remove this
fs_changes: Arc<Mutex<HashMap<PathBuf, Vec<u8>>>>,
foreground_fns_tx: mpsc::Sender<ForegroundFn>,
_invoke_foreground_fns: Task<()>,
}
impl Session {
pub fn new(project: Entity<Project>, cx: &mut Context<Self>) -> Self {
let (foreground_fns_tx, mut foreground_fns_rx) = mpsc::channel(128);
Session {
project,
fs_changes: Arc::new(Mutex::new(HashMap::default())),
foreground_fns_tx,
_invoke_foreground_fns: cx.spawn(|this, cx| async move {
while let Some(foreground_fn) = foreground_fns_rx.next().await {
foreground_fn.0(this.clone(), cx.clone());
}
}),
}
}
/// Runs a Lua script in a sandboxed environment and returns its captured stdout
pub fn run_script(
&mut self,
script: String,
cx: &mut Context<Self>,
) -> Task<Result<ScriptOutput>> {
const SANDBOX_PREAMBLE: &str = include_str!("sandbox_preamble.lua");
// TODO Remove fs_changes
let fs_changes = self.fs_changes.clone();
// TODO Honor all worktrees instead of the first one
let root_dir = self
.project
.read(cx)
.visible_worktrees(cx)
.next()
.map(|worktree| worktree.read(cx).abs_path());
let fs = self.project.read(cx).fs().clone();
let foreground_fns_tx = self.foreground_fns_tx.clone();
cx.background_spawn(async move {
let lua = Lua::new();
lua.set_memory_limit(2 * 1024 * 1024 * 1024)?; // 2 GB
let globals = lua.globals();
let stdout = Arc::new(Mutex::new(String::new()));
globals.set(
"sb_print",
lua.create_function({
let stdout = stdout.clone();
move |_, args: MultiValue| Self::print(args, &stdout)
})?,
)?;
globals.set(
"search",
lua.create_async_function({
let foreground_fns_tx = foreground_fns_tx.clone();
let fs = fs.clone();
move |lua, regex| {
Self::search(lua, foreground_fns_tx.clone(), fs.clone(), regex)
}
})?,
)?;
globals.set(
"sb_io_open",
lua.create_function({
let fs_changes = fs_changes.clone();
let root_dir = root_dir.clone();
move |lua, (path_str, mode)| {
Self::io_open(&lua, &fs_changes, root_dir.as_ref(), path_str, mode)
}
})?,
)?;
globals.set("user_script", script)?;
lua.load(SANDBOX_PREAMBLE).exec_async().await?;
// Drop Lua instance to decrement reference count.
drop(lua);
let stdout = Arc::try_unwrap(stdout)
.expect("no more references to stdout")
.into_inner();
Ok(ScriptOutput { stdout })
})
}
/// Sandboxed print() function in Lua.
fn print(args: MultiValue, stdout: &Mutex<String>) -> mlua::Result<()> {
for (index, arg) in args.into_iter().enumerate() {
// Lua's `print()` prints tab characters between each argument.
if index > 0 {
stdout.lock().push('\t');
}
// If the argument's to_string() fails, have the whole function call fail.
stdout.lock().push_str(&arg.to_string()?);
}
stdout.lock().push('\n');
Ok(())
}
/// Sandboxed io.open() function in Lua.
fn io_open(
lua: &Lua,
fs_changes: &Arc<Mutex<HashMap<PathBuf, Vec<u8>>>>,
root_dir: Option<&Arc<Path>>,
path_str: String,
mode: Option<String>,
) -> mlua::Result<(Option<Table>, String)> {
let root_dir = root_dir
.ok_or_else(|| mlua::Error::runtime("cannot open file without a root directory"))?;
let mode = mode.unwrap_or_else(|| "r".to_string());
// Parse the mode string to determine read/write permissions
let read_perm = mode.contains('r');
let write_perm = mode.contains('w') || mode.contains('a') || mode.contains('+');
let append = mode.contains('a');
let truncate = mode.contains('w');
// This will be the Lua value returned from the `open` function.
let file = lua.create_table()?;
// Store file metadata in the file
file.set("__path", path_str.clone())?;
file.set("__mode", mode.clone())?;
file.set("__read_perm", read_perm)?;
file.set("__write_perm", write_perm)?;
// Sandbox the path; it must be within root_dir
let path: PathBuf = {
let rust_path = Path::new(&path_str);
// Get absolute path
if rust_path.is_absolute() {
// Check if path starts with root_dir prefix without resolving symlinks
if !rust_path.starts_with(&root_dir) {
return Ok((
None,
format!(
"Error: Absolute path {} is outside the current working directory",
path_str
),
));
}
rust_path.to_path_buf()
} else {
// Make relative path absolute relative to cwd
root_dir.join(rust_path)
}
};
// close method
let close_fn = {
let fs_changes = fs_changes.clone();
lua.create_function(move |_lua, file_userdata: mlua::Table| {
let write_perm = file_userdata.get::<bool>("__write_perm")?;
let path = file_userdata.get::<String>("__path")?;
if write_perm {
// When closing a writable file, record the content
let content = file_userdata.get::<mlua::AnyUserData>("__content")?;
let content_ref = content.borrow::<FileContent>()?;
let content_vec = content_ref.0.borrow();
// Don't actually write to disk; instead, just update fs_changes.
let path_buf = PathBuf::from(&path);
fs_changes
.lock()
.insert(path_buf.clone(), content_vec.clone());
}
Ok(true)
})?
};
file.set("close", close_fn)?;
// If it's a directory, give it a custom read() and return early.
if path.is_dir() {
// TODO handle the case where we changed it in the in-memory fs
// Create a special directory handle
file.set("__is_directory", true)?;
// Store directory entries
let entries = match std::fs::read_dir(&path) {
Ok(entries) => {
let mut entry_names = Vec::new();
for entry in entries.flatten() {
entry_names.push(entry.file_name().to_string_lossy().into_owned());
}
entry_names
}
Err(e) => return Ok((None, format!("Error reading directory: {}", e))),
};
// Save the list of entries
file.set("__dir_entries", entries)?;
file.set("__dir_position", 0usize)?;
// Create a directory-specific read function
let read_fn = lua.create_function(|_lua, file_userdata: mlua::Table| {
let position = file_userdata.get::<usize>("__dir_position")?;
let entries = file_userdata.get::<Vec<String>>("__dir_entries")?;
if position >= entries.len() {
return Ok(None); // No more entries
}
let entry = entries[position].clone();
file_userdata.set("__dir_position", position + 1)?;
Ok(Some(entry))
})?;
file.set("read", read_fn)?;
// If we got this far, the directory was opened successfully
return Ok((Some(file), String::new()));
}
let fs_changes_map = fs_changes.lock();
let is_in_changes = fs_changes_map.contains_key(&path);
let file_exists = is_in_changes || path.exists();
let mut file_content = Vec::new();
if file_exists && !truncate {
if is_in_changes {
file_content = fs_changes_map.get(&path).unwrap().clone();
} else {
// Try to read existing content if file exists and we're not truncating
match std::fs::read(&path) {
Ok(content) => file_content = content,
Err(e) => return Ok((None, format!("Error reading file: {}", e))),
}
}
}
drop(fs_changes_map); // Unlock the fs_changes mutex.
// If in append mode, position should be at the end
let position = if append && file_exists {
file_content.len()
} else {
0
};
file.set("__position", position)?;
file.set(
"__content",
lua.create_userdata(FileContent(RefCell::new(file_content)))?,
)?;
// Create file methods
// read method
let read_fn = {
lua.create_function(
|_lua, (file_userdata, format): (mlua::Table, Option<mlua::Value>)| {
let read_perm = file_userdata.get::<bool>("__read_perm")?;
if !read_perm {
return Err(mlua::Error::runtime("File not open for reading"));
}
let content = file_userdata.get::<mlua::AnyUserData>("__content")?;
let mut position = file_userdata.get::<usize>("__position")?;
let content_ref = content.borrow::<FileContent>()?;
let content_vec = content_ref.0.borrow();
if position >= content_vec.len() {
return Ok(None); // EOF
}
match format {
Some(mlua::Value::String(s)) => {
let lossy_string = s.to_string_lossy();
let format_str: &str = lossy_string.as_ref();
// Only consider the first 2 bytes, since it's common to pass e.g. "*all" instead of "*a"
match &format_str[0..2] {
"*a" => {
// Read entire file from current position
let result = String::from_utf8_lossy(&content_vec[position..])
.to_string();
position = content_vec.len();
file_userdata.set("__position", position)?;
Ok(Some(result))
}
"*l" => {
// Read next line
let mut line = Vec::new();
let mut found_newline = false;
while position < content_vec.len() {
let byte = content_vec[position];
position += 1;
if byte == b'\n' {
found_newline = true;
break;
}
// Skip \r in \r\n sequence but add it if it's alone
if byte == b'\r' {
if position < content_vec.len()
&& content_vec[position] == b'\n'
{
position += 1;
found_newline = true;
break;
}
}
line.push(byte);
}
file_userdata.set("__position", position)?;
if !found_newline
&& line.is_empty()
&& position >= content_vec.len()
{
return Ok(None); // EOF
}
let result = String::from_utf8_lossy(&line).to_string();
Ok(Some(result))
}
"*n" => {
// Try to parse as a number (number of bytes to read)
match format_str.parse::<usize>() {
Ok(n) => {
let end =
std::cmp::min(position + n, content_vec.len());
let bytes = &content_vec[position..end];
let result = String::from_utf8_lossy(bytes).to_string();
position = end;
file_userdata.set("__position", position)?;
Ok(Some(result))
}
Err(_) => Err(mlua::Error::runtime(format!(
"Invalid format: {}",
format_str
))),
}
}
"*L" => {
// Read next line keeping the end of line
let mut line = Vec::new();
while position < content_vec.len() {
let byte = content_vec[position];
position += 1;
line.push(byte);
if byte == b'\n' {
break;
}
// If we encounter a \r, add it and check if the next is \n
if byte == b'\r'
&& position < content_vec.len()
&& content_vec[position] == b'\n'
{
line.push(content_vec[position]);
position += 1;
break;
}
}
file_userdata.set("__position", position)?;
if line.is_empty() && position >= content_vec.len() {
return Ok(None); // EOF
}
let result = String::from_utf8_lossy(&line).to_string();
Ok(Some(result))
}
_ => Err(mlua::Error::runtime(format!(
"Unsupported format: {}",
format_str
))),
}
}
Some(mlua::Value::Number(n)) => {
// Read n bytes
let n = n as usize;
let end = std::cmp::min(position + n, content_vec.len());
let bytes = &content_vec[position..end];
let result = String::from_utf8_lossy(bytes).to_string();
position = end;
file_userdata.set("__position", position)?;
Ok(Some(result))
}
Some(_) => Err(mlua::Error::runtime("Invalid format")),
None => {
// Default is to read a line
let mut line = Vec::new();
let mut found_newline = false;
while position < content_vec.len() {
let byte = content_vec[position];
position += 1;
if byte == b'\n' {
found_newline = true;
break;
}
// Handle \r\n
if byte == b'\r' {
if position < content_vec.len()
&& content_vec[position] == b'\n'
{
position += 1;
found_newline = true;
break;
}
}
line.push(byte);
}
file_userdata.set("__position", position)?;
if !found_newline && line.is_empty() && position >= content_vec.len() {
return Ok(None); // EOF
}
let result = String::from_utf8_lossy(&line).to_string();
Ok(Some(result))
}
}
},
)?
};
file.set("read", read_fn)?;
// write method
let write_fn = {
let fs_changes = fs_changes.clone();
lua.create_function(move |_lua, (file_userdata, text): (mlua::Table, String)| {
let write_perm = file_userdata.get::<bool>("__write_perm")?;
if !write_perm {
return Err(mlua::Error::runtime("File not open for writing"));
}
let content = file_userdata.get::<mlua::AnyUserData>("__content")?;
let position = file_userdata.get::<usize>("__position")?;
let content_ref = content.borrow::<FileContent>()?;
let mut content_vec = content_ref.0.borrow_mut();
let bytes = text.as_bytes();
// Ensure the vector has enough capacity
if position + bytes.len() > content_vec.len() {
content_vec.resize(position + bytes.len(), 0);
}
// Write the bytes
for (i, &byte) in bytes.iter().enumerate() {
content_vec[position + i] = byte;
}
// Update position
let new_position = position + bytes.len();
file_userdata.set("__position", new_position)?;
// Update fs_changes
let path = file_userdata.get::<String>("__path")?;
let path_buf = PathBuf::from(path);
fs_changes.lock().insert(path_buf, content_vec.clone());
Ok(true)
})?
};
file.set("write", write_fn)?;
// If we got this far, the file was opened successfully
Ok((Some(file), String::new()))
}
async fn search(
lua: Lua,
mut foreground_tx: mpsc::Sender<ForegroundFn>,
fs: Arc<dyn Fs>,
regex: String,
) -> mlua::Result<Table> {
// TODO: Allow specification of these options.
let search_query = SearchQuery::regex(
&regex,
false,
false,
false,
PathMatcher::default(),
PathMatcher::default(),
None,
);
let search_query = match search_query {
Ok(query) => query,
Err(e) => return Err(mlua::Error::runtime(format!("Invalid search query: {}", e))),
};
// TODO: Should use `search_query.regex`. The tool description should also be updated,
// as it specifies standard regex.
let search_regex = match Regex::new(&regex) {
Ok(re) => re,
Err(e) => return Err(mlua::Error::runtime(format!("Invalid regex: {}", e))),
};
let mut abs_paths_rx =
Self::find_search_candidates(search_query, &mut foreground_tx).await?;
let mut search_results: Vec<Table> = Vec::new();
while let Some(path) = abs_paths_rx.next().await {
// Skip files larger than 1MB
if let Ok(Some(metadata)) = fs.metadata(&path).await {
if metadata.len > 1_000_000 {
continue;
}
}
// Attempt to read the file as text
if let Ok(content) = fs.load(&path).await {
let mut matches = Vec::new();
// Find all regex matches in the content
for capture in search_regex.find_iter(&content) {
matches.push(capture.as_str().to_string());
}
// If we found matches, create a result entry
if !matches.is_empty() {
let result_entry = lua.create_table()?;
result_entry.set("path", path.to_string_lossy().to_string())?;
let matches_table = lua.create_table()?;
for (ix, m) in matches.iter().enumerate() {
matches_table.set(ix + 1, m.clone())?;
}
result_entry.set("matches", matches_table)?;
search_results.push(result_entry);
}
}
}
// Create a table to hold our results
let results_table = lua.create_table()?;
for (ix, entry) in search_results.into_iter().enumerate() {
results_table.set(ix + 1, entry)?;
}
Ok(results_table)
}
async fn find_search_candidates(
search_query: SearchQuery,
foreground_tx: &mut mpsc::Sender<ForegroundFn>,
) -> mlua::Result<mpsc::UnboundedReceiver<PathBuf>> {
Self::run_foreground_fn(
"finding search file candidates",
foreground_tx,
Box::new(move |session, mut cx| {
session.update(&mut cx, |session, cx| {
session.project.update(cx, |project, cx| {
project.worktree_store().update(cx, |worktree_store, cx| {
// TODO: Better limit? For now this is the same as
// MAX_SEARCH_RESULT_FILES.
let limit = 5000;
// TODO: Providing non-empty open_entries can make this a bit more
// efficient as it can skip checking that these paths are textual.
let open_entries = HashSet::default();
let candidates = worktree_store.find_search_candidates(
search_query,
limit,
open_entries,
project.fs().clone(),
cx,
);
let (abs_paths_tx, abs_paths_rx) = mpsc::unbounded();
cx.spawn(|worktree_store, cx| async move {
pin_mut!(candidates);
while let Some(project_path) = candidates.next().await {
worktree_store.read_with(&cx, |worktree_store, cx| {
if let Some(worktree) = worktree_store
.worktree_for_id(project_path.worktree_id, cx)
{
if let Some(abs_path) = worktree
.read(cx)
.absolutize(&project_path.path)
.log_err()
{
abs_paths_tx.unbounded_send(abs_path)?;
}
}
anyhow::Ok(())
})??;
}
anyhow::Ok(())
})
.detach();
abs_paths_rx
})
})
})
}),
)
.await
}
async fn run_foreground_fn<R: Send + 'static>(
description: &str,
foreground_tx: &mut mpsc::Sender<ForegroundFn>,
function: Box<dyn FnOnce(WeakEntity<Self>, AsyncApp) -> anyhow::Result<R> + Send>,
) -> mlua::Result<R> {
let (response_tx, response_rx) = oneshot::channel();
let send_result = foreground_tx
.send(ForegroundFn(Box::new(move |this, cx| {
response_tx.send(function(this, cx)).ok();
})))
.await;
match send_result {
Ok(()) => (),
Err(err) => {
return Err(mlua::Error::runtime(format!(
"Internal error while enqueuing work for {description}: {err}"
)))
}
}
match response_rx.await {
Ok(Ok(result)) => Ok(result),
Ok(Err(err)) => Err(mlua::Error::runtime(format!(
"Error while {description}: {err}"
))),
Err(oneshot::Canceled) => Err(mlua::Error::runtime(format!(
"Internal error: response oneshot was canceled while {description}."
))),
}
}
}
struct FileContent(RefCell<Vec<u8>>);
impl UserData for FileContent {
fn add_methods<M: UserDataMethods<Self>>(_methods: &mut M) {
// FileContent doesn't have any methods so far.
}
}
#[cfg(test)]
mod tests {
use gpui::TestAppContext;
use project::FakeFs;
use serde_json::json;
use settings::SettingsStore;
use super::*;
#[gpui::test]
async fn test_print(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
let project = Project::test(fs, [], cx).await;
let session = cx.new(|cx| Session::new(project, cx));
let script = r#"
print("Hello", "world!")
print("Goodbye", "moon!")
"#;
let output = session
.update(cx, |session, cx| session.run_script(script.to_string(), cx))
.await
.unwrap();
assert_eq!(output.stdout, "Hello\tworld!\nGoodbye\tmoon!\n");
}
#[gpui::test]
async fn test_search(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
"/",
json!({
"file1.txt": "Hello world!",
"file2.txt": "Goodbye moon!"
}),
)
.await;
let project = Project::test(fs, [Path::new("/")], cx).await;
let session = cx.new(|cx| Session::new(project, cx));
let script = r#"
local results = search("world")
for i, result in ipairs(results) do
print("File: " .. result.path)
print("Matches:")
for j, match in ipairs(result.matches) do
print(" " .. match)
end
end
"#;
let output = session
.update(cx, |session, cx| session.run_script(script.to_string(), cx))
.await
.unwrap();
assert_eq!(output.stdout, "File: /file1.txt\nMatches:\n world\n");
}
fn init_test(cx: &mut TestAppContext) {
let settings_store = cx.update(SettingsStore::test);
cx.set_global(settings_store);
cx.update(Project::init_settings);
}
}

View File

@@ -1,152 +0,0 @@
/// This module works with streaming_lua to allow us to run fragments of
/// Lua scripts that come back from LLM JSON tool calls immediately as they arrive,
/// even when the full script (and the full JSON) has not been received yet.
pub fn from_json(json_str: &str) {
// The JSON structure we're looking for is very simple:
// 1. Open curly bracket
// 2. Optional whitespace
// 3. Quoted key - either "lua_script" or "description" (if description, just parse it)
// 4. Colon
// 5. Optional whitespace
// 6. Open quote
// 7. Now we start streaming until we see a closed quote
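To make steps 1-5 concrete, here is a standalone sketch (a hypothetical helper, not this module's actual code) that pulls the leading key out of such a JSON prefix even when the value has not fully arrived:

```rust
// Minimal sketch of steps 1-5 above: skip '{' and whitespace, then read the
// quoted key. The value after the colon may still be mid-stream.
fn leading_key(json_prefix: &str) -> Option<String> {
    let mut chars = json_prefix.trim_start().chars().peekable();
    // Step 1: the opening curly bracket.
    if chars.next() != Some('{') {
        return None;
    }
    // Step 2: optional whitespace before the key.
    while let Some(&c) = chars.peek() {
        if c.is_whitespace() {
            chars.next();
        } else {
            break;
        }
    }
    // Step 3: the quoted key.
    if chars.next() != Some('"') {
        return None;
    }
    let key: String = chars.by_ref().take_while(|&c| c != '"').collect();
    Some(key)
}

fn main() {
    // The value is still streaming in: no closing quote or brace yet.
    let partial = r#"{ "lua_script": "print('hi"#;
    assert_eq!(leading_key(partial).as_deref(), Some("lua_script"));
    println!("{:?}", leading_key(partial));
}
```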
// TODO all of this needs to be stored in state in a struct instead of in variables,
// and that includes the iterator part.
let mut chars = json_str.trim_start().chars().peekable();
// Skip the opening curly brace
if chars.next() != Some('{') {
return;
}
let key = parse_key(&mut chars);
if key.map(|k| k.as_str()) == Some("description") {
// TODO parse the description here
parse_comma_then_quote(&mut chars);
if parse_key(&mut chars).map(|k| k.as_str()) != Some("lua_script") {
return; // This was the only remaining valid option.
}
// TODO parse the script here, remembering to s/backslash//g to unescape everything.
} else if key.map(|k| k.as_str()) == Some("lua_script") {
// TODO parse the script here, remembering to s/backslash//g to unescape everything.
parse_comma_then_quote(&mut chars);
if parse_key(&mut chars).map(|k| k.as_str()) != Some("description") {
return; // This was the only remaining valid option.
}
// TODO parse the description here
} else {
// The key wasn't one of the two valid options.
return;
}
// Parse value
let mut value = String::new();
let mut escape_next = false;
while let Some(c) = chars.next() {
if escape_next {
value.push(match c {
'n' => '\n',
't' => '\t',
'r' => '\r',
'\\' => '\\',
'"' => '"',
_ => c,
});
escape_next = false;
} else if c == '\\' {
escape_next = true;
} else if c == '"' {
break; // End of value
} else {
value.push(c);
}
}
// Process the parsed key-value pair
match key.as_str() {
"lua_script" => {
// Handle the lua script
println!("Found lua script: {}", value);
}
"description" => {
// Handle the description
println!("Found description: {}", value);
}
_ => {} // Should not reach here due to earlier check
}
}
fn parse_key(chars: &mut impl Iterator<Item = char>) -> Option<String> {
// Skip whitespace until we reach the start of the key
while let Some(c) = chars.next() {
if c.is_whitespace() {
// Consume the whitespace and continue
} else if c == '"' {
break; // Found the start of the key
} else {
return None; // Invalid format - expected a quote to start the key
}
}
// Parse the key. We don't need to escape backslashes because the exact key
// we expect does not include backslashes or quotes.
let mut key = String::new();
while let Some(c) = chars.next() {
if c == '"' {
break; // End of key
}
key.push(c);
}
// Skip colon and whitespace and next opening quote.
let mut found_colon = false;
while let Some(c) = chars.next() {
if c == ':' {
found_colon = true;
} else if found_colon && !c.is_whitespace() {
if c == '"' {
break; // Found the opening quote
}
return None; // Invalid format - expected a quote after colon and whitespace
} else if !c.is_whitespace() {
return None; // Invalid format - expected whitespace or colon
}
}
Some(key)
}
fn parse_comma_then_quote(chars: &mut std::iter::Peekable<impl Iterator<Item = char>>) -> bool {
// Skip any whitespace
while let Some(&c) = chars.peek() {
if !c.is_whitespace() {
break;
}
chars.next();
}
// Check for comma
if chars.next() != Some(',') {
return false;
}
// Skip any whitespace after the comma
while let Some(&c) = chars.peek() {
if !c.is_whitespace() {
break;
}
chars.next();
}
// Check for opening quote
if chars.next() != Some('"') {
return false;
}
true
}
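Taken together, the helpers above implement a quote/colon/quote handshake over a shared character iterator. A self-contained sketch of the same scanning approach (`scan_pair` is an illustrative name; like `parse_key`, it assumes the key itself contains no escapes):

```rust
// Hedged sketch of the scanning approach above: walk a `"key": "value"` pair
// with plain iterator state. Simplified relative to the real helpers: no
// escape handling, and the pair is parsed in one pass.
fn scan_pair(input: &str) -> Option<(String, String)> {
    let mut chars = input.chars();
    // Skip whitespace until the key's opening quote.
    loop {
        match chars.next()? {
            '"' => break,
            c if c.is_whitespace() => {}
            _ => return None, // expected a quote to start the key
        }
    }
    let key: String = chars.by_ref().take_while(|&c| c != '"').collect();
    // Skip whitespace around the colon, then find the value's opening quote.
    let mut found_colon = false;
    loop {
        match chars.next()? {
            ':' if !found_colon => found_colon = true,
            '"' if found_colon => break,
            c if c.is_whitespace() => {}
            _ => return None, // unexpected character between key and value
        }
    }
    let value: String = chars.take_while(|&c| c != '"').collect();
    Some((key, value))
}

fn main() {
    let pair = scan_pair(r#" "description": "runs a script" "#);
    assert_eq!(pair, Some(("description".into(), "runs a script".into())));
    assert_eq!(scan_pair("no quotes"), None);
}
```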


@@ -1,268 +0,0 @@
/// This module accepts fragments of Lua code from LLM responses, and executes
/// them as they come in (to the extent possible) rather than having to wait
/// for the entire script to arrive to execute it. (Since these are tool calls,
/// they will presumably come back in JSON; it's up to the caller to deal with
/// parsing the JSON, escaping `\\` and `\"` in the JSON-quoted Lua, etc.)
///
/// By design, Lua does not preserve top-level locals across chunks (a "chunk"
/// is Lua's term for a unit of code that can be compiled and executed), and
/// chunks are the smallest unit of execution Lua offers. To make sure that
/// top-level locals the LLM writes are preserved across multiple chunks, this
/// module silently translates those locals to globals. This should be harmless
/// for our use case, because we only have a single "file" and not multiple
/// files where the distinction could matter.
///
/// Since fragments will invariably arrive that don't happen to correspond to valid
/// Lua chunks (e.g. maybe they have an opening quote for a string literal and the
/// close quote will be coming in the next fragment), we use a simple heuristic to
/// split them up: we take each fragment and split it into lines, and then whenever
/// we have a complete line, we send it to Lua to process as a chunk. If it comes back
/// with a syntax error due to it being incomplete (which mlua tells us), then we
/// know to keep waiting for more lines and try again.
///
/// Eventually we'll either succeed, or else the response will end and we'll know it
/// had an actual syntax error. (Again, it's the caller's responsibility to deal
/// with detecting when the response ends due to the JSON quote having finally closed.)
///
/// This heuristic relies on the assumption that the LLM is generating normal-looking
/// Lua code where statements are split using newlines rather than semicolons.
/// In practice, this is a safe assumption.
#[derive(Default)]
struct ChunkBuffer {
buffer: String,
incomplete_multiline_string: bool,
last_newline_index: usize,
}
impl ChunkBuffer {
pub fn receive_chunk(
&mut self,
src_chunk: &str,
exec_chunk: &mut impl FnMut(&str) -> mlua::Result<()>,
) -> mlua::Result<()> {
self.buffer.push_str(src_chunk);
// Execute each line until we hit an incomplete parse
while let Some(rel_index) = self.buffer[self.last_newline_index..].find('\n') {
let mut index = self.last_newline_index + rel_index;
// LLMs can produce incredibly long multiline strings. We don't want to keep
// attempting to re-parse those every time a new line of the string comes in;
// that would be extremely wasteful! Instead, just keep waiting until it ends.
{
let line = &self.buffer[self.last_newline_index..index];
const LOCAL_PREFIX: &str = "local ";
// It's safe to assume we'll never see a line which
// includes both "]]" and "[[" other than single-line
// assignments which are just using them to escape quotes.
//
// If that assumption turns out not to hold, we can always
// make this more robust.
if line.contains("[[") && !line.contains("]]") {
self.incomplete_multiline_string = true;
}
// In practice, LLMs produce multiline strings that always end
// with the ]] at the start of the line.
if line.starts_with("]]") {
self.incomplete_multiline_string = false;
} else if line.starts_with(LOCAL_PREFIX) {
// We can't have top-level locals because they don't preserve
// across chunk executions. So just turn locals into globals.
// Since this is just one script, they're the same anyway.
self.buffer.replace_range(
self.last_newline_index..self.last_newline_index + LOCAL_PREFIX.len(),
"",
);
index -= LOCAL_PREFIX.len();
}
}
self.last_newline_index = index + 1;
if self.incomplete_multiline_string {
continue;
}
// Execute all lines up to (and including) this one.
match exec_chunk(&self.buffer[..index]) {
Ok(()) => {
// The chunk executed successfully. Advance the buffer
// to reflect the fact that we've executed that code.
self.buffer = self.buffer[index + 1..].to_string();
self.last_newline_index = 0;
}
Err(mlua::Error::SyntaxError {
incomplete_input: true,
message: _,
}) => {
// If it errored specifically because the input was incomplete, no problem.
// We'll keep trying with more and more lines until eventually we find a
// sequence of lines that are valid together!
}
Err(other) => {
return Err(other);
}
}
}
Ok(())
}
pub fn finish(
&mut self,
exec_chunk: &mut impl FnMut(&str) -> mlua::Result<()>,
) -> mlua::Result<()> {
if !self.buffer.is_empty() {
// Execute whatever is left in the buffer
match exec_chunk(&self.buffer) {
Ok(()) => {
// Clear the buffer as everything has been executed
self.buffer.clear();
self.last_newline_index = 0;
self.incomplete_multiline_string = false;
}
Err(err) => {
return Err(err);
}
}
}
Ok(())
}
}
#[cfg(test)]
mod tests {
use super::*;
use mlua::Lua;
use std::cell::RefCell;
use std::rc::Rc;
#[test]
fn test_lua_runtime_receive_chunk() {
let mut chunk_buffer = ChunkBuffer::default();
let output = Rc::new(RefCell::new(String::new()));
let mut exec_chunk = |chunk: &str| -> mlua::Result<()> {
let lua = Lua::new();
// Clone the Rc to share ownership of the same RefCell
let output_ref = output.clone();
lua.globals().set(
"print",
lua.create_function(move |_, msg: String| {
let mut output = output_ref.borrow_mut();
output.push_str(&msg);
output.push('\n');
Ok(())
})?,
)?;
lua.load(chunk).exec()
};
chunk_buffer
.receive_chunk("print('Hello, World!')", &mut exec_chunk)
.unwrap();
chunk_buffer.finish(&mut exec_chunk).unwrap();
assert_eq!(*output.borrow(), "Hello, World!\n");
}
#[test]
fn test_lua_runtime_receive_chunk_shared_lua() {
let mut chunk_buffer = ChunkBuffer::default();
let output = Rc::new(RefCell::new(String::new()));
let lua = Lua::new();
// Set up the print function once for the shared Lua instance
{
let output_ref = output.clone();
lua.globals()
.set(
"print",
lua.create_function(move |_, msg: String| {
let mut output = output_ref.borrow_mut();
output.push_str(&msg);
output.push('\n');
Ok(())
})
.unwrap(),
)
.unwrap();
}
let mut exec_chunk = |chunk: &str| -> mlua::Result<()> { lua.load(chunk).exec() };
// Send first incomplete chunk
chunk_buffer
.receive_chunk("local message = 'Hello, '\n", &mut exec_chunk)
.unwrap();
// Send second chunk that completes the code
chunk_buffer
.receive_chunk(
"message = message .. 'World!'\nprint(message)",
&mut exec_chunk,
)
.unwrap();
chunk_buffer.finish(&mut exec_chunk).unwrap();
assert_eq!(*output.borrow(), "Hello, World!\n");
}
#[test]
fn test_multiline_string_across_chunks() {
let mut chunk_buffer = ChunkBuffer::default();
let output = Rc::new(RefCell::new(String::new()));
let lua = Lua::new();
// Set up the print function for the shared Lua instance
{
let output_ref = output.clone();
lua.globals()
.set(
"print",
lua.create_function(move |_, msg: String| {
let mut output = output_ref.borrow_mut();
output.push_str(&msg);
output.push('\n');
Ok(())
})
.unwrap(),
)
.unwrap();
}
let mut exec_chunk = |chunk: &str| -> mlua::Result<()> { lua.load(chunk).exec() };
// Send first chunk with the beginning of a multiline string
chunk_buffer
.receive_chunk("local multiline = [[This is the start\n", &mut exec_chunk)
.unwrap();
// Send second chunk with more lines
chunk_buffer
.receive_chunk("of a very long\nmultiline string\n", &mut exec_chunk)
.unwrap();
// Send third chunk with more content
chunk_buffer
.receive_chunk("that spans across\n", &mut exec_chunk)
.unwrap();
// Send final chunk that completes the multiline string
chunk_buffer
.receive_chunk("multiple chunks]]\nprint(multiline)", &mut exec_chunk)
.unwrap();
chunk_buffer.finish(&mut exec_chunk).unwrap();
let expected = "This is the start\nof a very long\nmultiline string\nthat spans across\nmultiple chunks\n";
assert_eq!(*output.borrow(), expected);
}
}
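The module documentation describes the core heuristic: buffer incoming fragments, attempt execution at each newline boundary, and treat an incomplete parse as a cue to wait for more input. A std-only sketch with a toy executor standing in for mlua (balanced square brackets play the role of `incomplete_input` syntax errors; `LineBuffer` and `ExecError` are illustrative names, and the sketch omits the multiline-string and `local` handling plus the `finish` flush):

```rust
enum ExecError {
    Incomplete,
}

struct LineBuffer {
    buffer: String,
    // Where to resume searching for the next newline after an incomplete parse.
    search_from: usize,
}

impl LineBuffer {
    fn receive(&mut self, fragment: &str, exec: &mut impl FnMut(&str) -> Result<(), ExecError>) {
        self.buffer.push_str(fragment);
        while let Some(rel) = self.buffer[self.search_from..].find('\n') {
            let index = self.search_from + rel;
            match exec(&self.buffer[..index]) {
                Ok(()) => {
                    // Executed: drop the consumed lines and restart the scan.
                    self.buffer = self.buffer[index + 1..].to_string();
                    self.search_from = 0;
                }
                Err(ExecError::Incomplete) => {
                    // Keep the line buffered; retry once more lines arrive.
                    self.search_from = index + 1;
                }
            }
        }
    }
}

fn main() {
    let mut executed: Vec<String> = Vec::new();
    // Toy stand-in for mlua: a chunk is "complete" once its brackets balance.
    let mut exec = |chunk: &str| {
        if chunk.matches('[').count() == chunk.matches(']').count() {
            executed.push(chunk.to_string());
            Ok(())
        } else {
            Err(ExecError::Incomplete)
        }
    };
    let mut buf = LineBuffer { buffer: String::new(), search_from: 0 };
    buf.receive("x = [1,\n", &mut exec); // unbalanced: stays buffered
    buf.receive("2]\nprint(x)\n", &mut exec); // both statements now execute
    assert_eq!(executed, vec!["x = [1,\n2]".to_string(), "print(x)".to_string()]);
}
```

The `search_from` cursor is what keeps this from re-scanning the same newline forever: on an incomplete parse the scan advances past the newline it just tried, exactly the role `last_newline_index` plays in `ChunkBuffer` above.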


@@ -326,6 +326,7 @@ fn main() {
.or_else(read_proxy_from_env);
let http = {
let _guard = Tokio::handle(cx).enter();
ReqwestClient::proxy_and_user_agent(proxy_url, &user_agent)
.expect("could not start HTTP client")
};


@@ -70,7 +70,6 @@ export ZED_RELEASE_CHANNEL="${channel}"
popd
export ZED_BUNDLE=true
export MACOSX_DEPLOYMENT_TARGET=10.15.7
cargo_bundle_version=$(cargo -q bundle --help 2>&1 | head -n 1 || echo "")
if [ "$cargo_bundle_version" != "cargo-bundle v0.6.0-zed" ]; then