Compare commits

...

137 Commits

Author SHA1 Message Date
Richard Feldman
0e1fe66a19 Show Lua script with syntax highlighting in assistant2 2025-03-10 11:26:12 -04:00
Marshall Bowers
ed52e759d7 docs: Fix language links (#26368)
This PR fixes some language links in the docs.

The Shell Script page wasn't being linked from `SUMMARY.md`, so no page
was being generated.

There were also some differences in the language lists in the sidebar
and on the top-level languages page.

Release Notes:

- N/A
2025-03-10 15:22:51 +00:00
Richard Feldman
6da099a9d7 Unsandbox Lua scripts (#26365)
Per a conversation with @nathansobo, have the Lua scripts run
unsandboxed for now (while this feature is behind the staff feature
flag).

Release Notes:

- N/A
2025-03-10 11:04:37 -04:00
Smit Barmase
5f159bc95e go_to_line: Fix goto line + mouse click jumps to previous scroll position (#26362)
Closes #20658

Now, when the "Go to Line" palette is open:  
- Clicking on the editor will dismiss the palette without changing the
scroll position. (PR change)
- Pressing Enter will jump to the line number entered in the palette.
(Unchanged)
- Pressing Escape will jump back to the previous cursor location.
(Unchanged)

Release Notes:

- Fixed an issue where clicking the editor with the mouse while the "Go
to Line" palette is open would cause it to jump to the previous scroll
position.
2025-03-10 20:33:07 +05:30
Marshall Bowers
a4462577bf Sort Cargo.tomls (#26367)
This PR sorts some `Cargo.toml`s that had become unsorted.

Release Notes:

- N/A
2025-03-10 14:48:21 +00:00
João Marcos
c147b58558 Remove redundant checks in do_stage_or_unstage_and_next (#26364)
Release Notes:

- N/A
2025-03-10 14:23:17 +00:00
Devzeth
84fe1bfe9b Recognize ixx as part of the cpp suffix (#26333)
Adds "ixx" as path suffix to be recognized for c++. 

> ixx documentation
https://learn.microsoft.com/en-us/cpp/cpp/modules-cpp?view=msvc-

I've also added it to the icon file. 

Release Notes:

- N/A
2025-03-10 09:10:29 -05:00
Danilo Leal
657d7a911d Add logo for wgsl (WebGPU Shading Language) (#26360)
Was dabbling on the shaders these past few days and felt like we could
have the WGSL logo. This is based on the logo found on the GPU Web
repository: https://github.com/gpuweb/gpuweb/tree/main/logo

Release Notes:

- N/A
2025-03-10 09:44:19 -03:00
Danilo Leal
ee05cc3ad9 Add a line numbers toggle to the editor controls menu (#26318)
Closes https://github.com/zed-industries/zed/issues/26305

<img
src="https://github.com/user-attachments/assets/795029ad-128a-471f-9adf-c0ef26319bbf"
width="400px" />

Release Notes:

- N/A
2025-03-10 09:28:46 -03:00
Smit Barmase
5ed144f9d2 macOS: Add support for external file managers to open directory in Zed (#26357)
Closes #25421

This PR adds support for external file managers to show Zed as an option
in the "Open With" context menu for directories on macOS.

<img width="350" alt="image"
src="https://github.com/user-attachments/assets/c52acd48-73c4-47be-8683-6950e0371b73"
/>


Release Notes:

- Added support for opening folders in Zed from third-party macOS file
managers like Path Finder and Super Charge through their "Open With"
menu.
2025-03-10 15:21:39 +05:30
Julia Ryan
2a862b3c54 nix: Disable checks and remove crane workaround (#26356)
The checkPhase was failing for me in darwin so I turned it off. I think
eventually we'll want to use a separate derivation for tests (which
crane has a helper for).

Crane also solved our issue with spaces in paths so I bumped the flake
to pick up that fix and removed our workaround: ipetkov/crane#808.

Release Notes:

- N/A
2025-03-10 02:00:57 -07:00
Julia Ryan
4a7c84f490 Fix nix build (#26270)
This PR includes lots of small fixes to get our `build.nix` and
`shell.nix` back to a working state.

I've tested this by running `cargo run` (inside the devshell) and `nix
run` on x86 nixos and arm64 darwin machines. I'd appreciate it if others
could test building inside the devshell to double-check that it's not
just working because I happen to have some system-level packages
installed, as well as seeing if it works on other platforms (non-nixos
linux, arm linux, x86 darwin).

I couldn't get the full test suite (`cargo nextest run --workspace`)
passing in the devshell on darwin, but they _are_ all passing on nixos.
nixpkgs [disables some of our
tests](92d11f06d5/pkgs/by-name/ze/zed-editor/package.nix (L226-L234))
that apparently fail or are flakey on hydra, but they don't know why.
I'm going to punt on debugging those for now, especially given that they
seem to be working for me. I'm also unsure of whether we actually want
the nix checkPhase to run the full test suite (it's currently not
passing `--workspace`) given that we have separate CI that should
enforce that those pass on all PRs.

Here's an overview of the changes made:
- Fix our `generate-licenses` script
- Relaxes the `cargo-about` version requirement slightly so it doesn't
try to install an older binary when the nixpkgs one is newer than our
requirement
- Add a workaround for [this cargo-about
issue](https://github.com/zed-industries/zed/issues/19971) obviating the
need for the patching done in the nixpkgs package
- Set the new `--frozen` flag to avoid network access/mutating the
lockfile
- Use dynamic webrtc lib from nixpkgs, and fixes up the build script in
webrtc-sys that hardcodes it to be statically linked.
- Use `inputsFrom` in `shell.nix` and avoid duplicating everything from
`build.nix`
- Add a temporary workaround for an [upstream crane
bug](https://github.com/ipetkov/crane/issues/808).
- Fix shebangs in our `script` dir to not hard-code `/bin/bash`

There are still a bunch of issues that aren't resolved here, I'll make a
tracking issue for those and try to land this first just to get back to
an unbroken state. Eventually among other things I'd like to use a
`libgit2` from `staticPkgs` and musl cross compilation to build the
remote server under nix, and then add that as a separate flake output
and include it in the shell's `inputsFrom` list.

Thanks @niklaskorz, @GaetanLepage, @bbigras and all the other nixpkgs
maintainers that have kept the `zed-editor` package working and up to
date! I seriously considered just making our flake `overrideAttrs` the
package in nixpkgs given how well maintained it is.

Thanks @WeetHet for your volunteer maintenance of this flake. I
referenced #24953 while working on these fixes, and I'd love to
collaborate on adding some of those pieces like treefmt and a github
action. If you're interested I'd really appreciate some help debugging
why crane's `buildDepsOnly` isn't working for us. I'm assuming it'd make
our `nix build` times go way down from the improved dep caching if we
could get it working.

Thanks @rrbutani for all the help on this PR 💙.

Release Notes:

- N/A

---------

Co-authored-by: Rahul Butani <rrbutani@users.noreply.github.com>
Co-authored-by: Rahul Butani <rr.butani@gmail.com>
2025-03-10 01:06:11 -07:00
Mikayla Maki
230e2e4107 Restore git panel header (#26354)
Let's play around with it. This should not be added to tomorrow's
preview.

Release Notes:

- Git Beta: Added a panel header with an open diff and stage/unstage all
buttons.
2025-03-10 07:08:10 +00:00
Richard Hao
d732b8ba0f git: Disable commit message generation when commit not possible (#26329)
## Issue:

- `Generate Commit Message` will generate a random message if there are
no changes.
<img width="614" alt="image"
src="https://github.com/user-attachments/assets/c16cadac-01af-47c0-a2db-a5bbf62f84bb"
/>


## After the fix:

- `Generate Commit Message` will be disabled if committing is not possible
<img width="610" alt="image"
src="https://github.com/user-attachments/assets/5ea9ca70-6fa3-4144-ab4e-be7a986d5496"
/>


## Release Notes:

- Fixed: Disable commit message generation when commit is not possible
2025-03-09 23:45:25 -07:00
Michael Sloan
7c3eecc9c7 Add support for querying file outline in assistant script (#26351)
Release Notes:

- N/A
2025-03-10 05:26:17 +00:00
Max Brunsfeld
fff37ab823 Follow-up fixes for recent multi buffer optimizations (#26345)
I realized that the optimization broke multi buffer syncing after buffer
reparses.

Release Notes:

- N/A
2025-03-09 22:15:38 -07:00
Kirill Bulatov
8a7a78fafb Avoid modifying the LSP message before resolving it (#26347)
Closes https://github.com/zed-industries/zed/issues/21277

On the left is current Zed; on the right is the improved version.
The 3rd message from Zed, which resolves the item, does not have a `textEdit`
on the right side, but has one on the left.
This doesn't seem to influence the end result, but at least Zed behaves
more appropriately now.

<img width="1727" alt="image"
src="https://github.com/user-attachments/assets/ca1236fd-9ce2-41ba-88fe-1f3178cdcbde"
/>


Instead of modifying the original LSP completion item, store completion
list defaults and apply them when the item is requested (except `data`
defaults, needed for resolve).

Now, the only place that can modify the completion items is this method,
and the Python impl seems to be the one doing it:


ca9c3af56f/crates/languages/src/python.rs (L182-L204)

Seems ok to leave untouched for now.
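
To make the idea concrete, here is a minimal, self-contained sketch of storing the list-level defaults and applying them only when an item is handed out; the types are illustrative stand-ins, not the actual `lsp` crate structures or Zed's code.

```rust
/// Illustrative stand-ins for LSP types; not the actual `lsp` crate structs.
#[derive(Clone, Default)]
struct CompletionItem {
    label: String,
    commit_characters: Option<Vec<String>>,
    insert_text_format: Option<String>,
    data: Option<String>,
}

#[derive(Clone, Default)]
struct ListDefaults {
    commit_characters: Option<Vec<String>>,
    insert_text_format: Option<String>,
    data: Option<String>,
}

/// Keep the item exactly as the server sent it; fill in list-level defaults
/// only when handing the item out.
fn item_with_defaults(original: &CompletionItem, defaults: &ListDefaults) -> CompletionItem {
    let mut item = original.clone();
    if item.commit_characters.is_none() {
        item.commit_characters = defaults.commit_characters.clone();
    }
    if item.insert_text_format.is_none() {
        item.insert_text_format = defaults.insert_text_format.clone();
    }
    // The `data` default is the exception called out in the PR description:
    // it is needed for the resolve request, so it is not handled like the
    // other defaults here.
    item
}

fn main() {
    let defaults = ListDefaults {
        commit_characters: Some(vec![".".into()]),
        ..Default::default()
    };
    let original = CompletionItem { label: "push".into(), ..Default::default() };
    let shown = item_with_defaults(&original, &defaults);
    assert_eq!(shown.commit_characters, Some(vec![".".into()]));
    assert!(original.commit_characters.is_none()); // the stored item stays untouched
}
```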

Release Notes:

- Fixed LSP completion items modified before resolve request
2025-03-10 00:12:53 +02:00
Ben Kunkle
6de3ac3e17 Revert "Highlight super and this as keywords in JS/TS/TSX" (#26342)
Reverts zed-industries/zed#25135

This approach was not the best, as explained in the response to the
original PR. Likely, the better approach is to create a new, more specific
scope for these kinds of variables under the `@variable` prefix so that
themes can control these pseudo-keywords specifically.
2025-03-09 16:11:37 +00:00
Smit Barmase
5aae3bdc69 copilot: Fix missing sign-out button when Zed is the edit prediction provider (#26340)
Closes #25884

Added a sign-out button for Copilot in Assistant settings, allowing
sign-out even when copilot is disabled.

<img width="500" alt="image"
src="https://github.com/user-attachments/assets/43fc97ad-f73c-49e1-a7b6-a3910434d661"
/>



Release Notes:

- Added a sign-out button for Copilot in Assistant settings.
2025-03-09 21:39:14 +05:30
Agus Zubiaga
e298301b40 assistant: Make scripting a first-class concept instead of a tool (#26338)
This PR refactors the scripting functionality into a first-class
concept of the assistant instead of a generic tool, which will allow us
to build a more customized experience.

- The tool prompt has been slightly tweaked and is now included as a
system message in all conversations. I'm getting decent results, but now
that it isn't in the tools framework, it will probably require more
refining.

- The model will now include an `<eval ...>` tag at the end of the
message with the script. We parse this tag incrementally as it streams
in so that we can indicate that we are generating a script before we see
the closing `</eval>` tag. Later, this will also help us interpret the script
as it arrives (a rough sketch of this streaming detection follows after this list).

- Threads now hold a `ScriptSession` entity which manages the state of
all scripts (from parsing to exited) in a centralized way, and will
later collect all script operations so they can be displayed in the UI.

- `script_tool` has been renamed to `assistant_scripting` 

- Script source now opens in a regular read-only buffer  

Note: We still need to handle persistence properly
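
A rough sketch of the streaming detection described above, assuming the `<eval ...>`/`</eval>` tag names from this PR; the state machine is illustrative, not Zed's actual parser.

```rust
/// Spot an `<eval ...>` block while the assistant message streams in, so the
/// UI can show "generating a script" before the closing `</eval>` arrives.
#[derive(Debug, PartialEq)]
enum ScriptState<'a> {
    NotStarted,
    /// Opening tag seen; script text received so far.
    Generating(&'a str),
    /// Closing tag seen; full script text.
    Finished(&'a str),
}

fn script_state(streamed_text: &str) -> ScriptState<'_> {
    // The opening tag may carry attributes, so match on the prefix only.
    let Some(open) = streamed_text.find("<eval") else {
        return ScriptState::NotStarted;
    };
    let Some(open_end) = streamed_text[open..].find('>') else {
        // The opening tag itself is still streaming in.
        return ScriptState::Generating("");
    };
    let body = &streamed_text[open + open_end + 1..];
    match body.find("</eval>") {
        Some(close) => ScriptState::Finished(&body[..close]),
        None => ScriptState::Generating(body),
    }
}

fn main() {
    // Call this on every streamed chunk with the accumulated text so far.
    assert_eq!(script_state("Sure, here is a script: <ev"), ScriptState::NotStarted);
    assert_eq!(
        script_state("<eval type=\"lua\">print(1)"),
        ScriptState::Generating("print(1)")
    );
    assert_eq!(
        script_state("<eval type=\"lua\">print(1)</eval>"),
        ScriptState::Finished("print(1)")
    );
}
```

Interpreting the script as it arrives, mentioned above, is exactly the part this sketch leaves out.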

Release Notes:

- N/A

---------

Co-authored-by: Marshall Bowers <git@maxdeviant.com>
2025-03-09 09:01:49 +00:00
brian tan
ed6bf7f161 diagnostics: Fix losing focus when activating from diagnostics view (#25517)
Closes #25509

Changes:
- If active item is already diagnostics, don't try to focus it again.

Instead of skipping the focus, should it just not activate at all? Something
like:

    if !workspace
        .active_item(cx)
        .map(|item| item.item_id() == existing.item_id())
        .unwrap_or(false)
    {
        workspace.activate_item(&existing, true, true, window, cx);
    }


Release Notes:

- N/A
2025-03-08 22:17:20 +00:00
Smit Barmase
f14d6670ba copilot: Fix onboarding into Copilot requires Zed restart (#26330)
Closes #25594

This PR fixes an issue where signing into Copilot required restarting
Zed.

Copilot depends on an OAuth token that comes from either `hosts.json` or
`apps.json`. Initially, both files don't exist. If neither file is
found, we fall back to watching `hosts.json` for updates. However, if the
auth process creates `apps.json`, we won't receive updates from it,
causing the UI to remain outdated.

This PR fixes that by watching the parent `github-copilot` directory
instead, which will always contain one of those files along with an
additional version file.

I have tested this on macOS and Linux Wayland.
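
For illustration only (Zed has its own file-watching machinery), watching the parent directory rather than a single file could look roughly like this with the `notify` crate; the path below is a placeholder.

```rust
use std::path::Path;
use notify::{recommended_watcher, RecursiveMode, Watcher};

fn main() -> notify::Result<()> {
    // Watch the whole `github-copilot` config directory instead of a single
    // file, so we get an event whether the auth flow writes `hosts.json` or
    // `apps.json`.
    let config_dir = Path::new("/home/user/.config/github-copilot");

    let mut watcher = recommended_watcher(|event: notify::Result<notify::Event>| {
        if let Ok(event) = event {
            // Re-read whichever OAuth file now exists and refresh the UI.
            println!("copilot config changed: {:?}", event.paths);
        }
    })?;
    watcher.watch(config_dir, RecursiveMode::NonRecursive)?;

    // Keep the watcher alive for as long as sign-in status matters.
    std::thread::park();
    Ok(())
}
```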

Release Notes:

- Fixed an issue where signing into Copilot required restarting Zed.
2025-03-09 03:19:09 +05:30
Joseph T. Lyons
22d9b5d8ca Update key binding documentation (#26321)
Release Notes:

- N/A
2025-03-08 01:03:52 -05:00
Danilo Leal
6ed6e8bc26 git: Refine diff hunk controls visuals (#26317)
You may need to zoom in hard to see this 😅 but the main addition of this
PR is just ensuring there's also a horizontal border instead of just one on
the bottom. Also added some box-shadow here to make it pop out of the
diff a bit more.

| Before | After |
|--------|--------|
| ![CleanShot 2025-03-08 at 12  37
40@2x](https://github.com/user-attachments/assets/98e0329e-646f-4455-89fa-3f6ec1211361)
| ![CleanShot 2025-03-08 at 12  36
40@2x](https://github.com/user-attachments/assets/7f667e65-5b72-4156-b0ec-2b162eb76f2f)
|

Release Notes:

- N/A
2025-03-08 01:00:47 -03:00
Max Brunsfeld
4846e6fb3a Fix performance bottlenecks when multi buffers have huge numbers of buffers (#26308)
This is motivated by trying to make the Project Diff view usable with
huge Git change sets.

Release Notes:

- Improved performance of rendering multibuffers with very large numbers
of buffers
2025-03-08 02:15:15 +00:00
Mikayla Maki
cb543f9546 Git UI papercuts (#26316)
Release Notes:

- Git Beta: Added `git::Add` as an alias for the existing `git::Diff`
- Git Beta: Fixed a bug where the 'generate commit message' keybinding
wasn't working.
- Git Beta: Made the empty project diff state a little more helpful with
a button to push, and a button to close the item.
2025-03-08 01:49:06 +00:00
Michael Sloan
450d727a04 Fixes to excerpt movement actions and bindings + add multibuffer and singleton_buffer key contexts (#26264)
Closes #26002 

Release Notes:

- Added `multibuffer` key context.
- `cmd-down` and `cmd-shift-down` on Mac now moves to the end of the
last line of a singleton buffer instead of the beginning. In
multibuffers, these now move to the start of the next excerpt.
- Fixed `vim::PreviousSectionEnd` (bound to `[ ]`) to move to the
beginning of the line, matching the behavior of `vim::NextSectionEnd`.
- Added `editor::MoveToStartOfNextExcerpt` and
`editor::MoveToEndOfPreviousExcerpt`.
2025-03-08 00:58:47 +00:00
Mikayla Maki
60b3eb3f76 Add git branch switching aliases (#26315)
This gives us _very_ rudimentary support for `git switch` and `git
checkout` now, by making them aliases for our existing `git::branch`
call.

Release Notes:

- Git Beta: Added `git::Switch` and `git::CheckoutBranch` as aliases for
the existing `git::Branch`
2025-03-08 00:02:57 +00:00
Marshall Bowers
bbe7c9a738 assistant2: Factor out Thread::all_tools_finished method (#26314)
This PR factors out a new `Thread::all_tools_finished` method to
encapsulate some of the boilerplate in the `ThreadEvent::ToolFinished`
event handler.

This should make this event handler easier to replicate for the eval
use-case.

Release Notes:

- N/A
2025-03-07 23:35:32 +00:00
Mikayla Maki
f6345a6995 Improve when the commit suggestions would show (#26313)
Release Notes:

- Git Beta: Fixed a few bugs where the suggested commit text wouldn't
show in certain cases, or would update slowly.
2025-03-07 23:33:48 +00:00
Marshall Bowers
e70d0edfac assistant_tool: Pass an Entity<Project> to Tool::run (#26312)
This PR updates the `Tool::run` method to take an `Entity<Project>`
instead of a `WeakEntity<Project>`.

Release Notes:

- N/A
2025-03-07 23:30:56 +00:00
Marshall Bowers
921c24e274 assistant2: Add helper methods to Thread for dealing with tool use (#26310)
This PR adds two new helper methods to the `Thread` for dealing with
tool use:

- `use_pending_tools` - This uses all of the tools that are pending
- The reason we aren't calling this directly in `stream_completion` is
that we still might need to have a way for users to confirm that they
want tools to be run, which would need to happen at the UI layer in the
`ActiveThread`.
- `send_tool_results_to_model` - This encapsulates inserting a new user
message that contains the tool results and sending them up to the model.

Release Notes:

- N/A
2025-03-07 23:16:45 +00:00
Marshall Bowers
18f3f8097f assistant_tool: Decouple Tool from Workspace (#26309)
This PR decouples the `Tool` trait from the `Workspace` (and from the
UI, in general).

`Tool::run` now takes a `WeakEntity<Project>` instead of a
`WeakEntity<Workspace>` and a `Window`.

Release Notes:

- N/A
2025-03-07 22:41:56 +00:00
Marshall Bowers
4f6682c7fe haskell: Extract to zed-extensions/haskell repository (#26306)
This PR extracts the Haskell extension to the
[zed-extensions/haskell](https://github.com/zed-extensions/haskell)
repository.

Release Notes:

- N/A
2025-03-07 22:07:04 +00:00
Kiran_Peraka
f57dece2d5 git: Fix errors not showing in the toast notification (#26303)
Release Notes:

- Resolved an issue where error messages from Git were not being
displayed in toast notifications.
<img width="1702" alt="Screenshot 2025-03-08 at 1 11 30 AM"
src="https://github.com/user-attachments/assets/a46517db-4e64-4c5e-a64e-96e820ca9aec"
/>
2025-03-07 20:57:53 +00:00
Kirill Bulatov
103ad635d9 Refactor Completions to allow non-LSP ones better (#26300)
A preparation for https://github.com/zed-industries/zed/issues/4957 that
pushes all LSP-related data out of the basic completion item, so that
it's possible to cleanly create completion items without any trace of LSP.

Release Notes:

- N/A
2025-03-07 20:19:28 +00:00
Mikayla Maki
ec5e7a2653 Change the default staging and unstaging state display (#26299)
This adds a setting for the "border" hunk display mode, as discussed,
and makes it the default.

Here's how it looks in light mode:

<img width="1512" alt="Screenshot 2025-03-07 at 11 39 25 AM"
src="https://github.com/user-attachments/assets/a934faa3-ec69-47e1-ad46-535e48b98e9f"
/>

And dark mode: 

<img width="1511" alt="Screenshot 2025-03-07 at 11 39 56 AM"
src="https://github.com/user-attachments/assets/43c9afd1-22bb-4bd8-96ce-82702a6cbc80"
/>


Release Notes:

- Git Beta: Adjusted the default hunk styling for staged and unstaged
changes

Co-authored-by: Conrad <conrad@zed.dev>
Co-authored-by: Nate <nate@zed.dev>
2025-03-07 19:56:24 +00:00
Marshall Bowers
05d3ee8555 extension: Require that grammar names are written in snake_case (#26295)
This PR updates the `ExtensionBuilder` to require that grammar names are
written in snake_case.

The grammar names are used to construct identifiers, so we need them to
be valid C identifiers.

Release Notes:

- N/A
2025-03-07 19:02:35 +00:00
Nate Butler
1b34437839 component_preview: Add component pages (#26284)
This PR adds pages to component preview when clicking on a given
component in the sidebar.

This will let us create richer previews & better docs for using
components in the future.

Release Notes:

- N/A
2025-03-07 18:56:17 +00:00
Danilo Leal
3ff2c8fc38 Add file icon for Luau (#26293)
Closes https://github.com/zed-industries/zed/issues/14948

Release Notes:

- N/A
2025-03-07 15:27:00 -03:00
Cole Miller
b0b0b00fae worktree: Add some info-level logging about added and removed repository entries (#26291)
Trying to track down a user's reported issue with parent repositories
not getting picked up.

Release Notes:

- N/A
2025-03-07 18:02:05 +00:00
Conrad Irwin
80fb88520f Remove worktree and project notifies (#26244)
This reduces the number of multibuffer syncs from 100,000 to 20,000.
Before this change each editor individually observed the project, so
literally any project change was amplified by the number of editors you
had open.

Now editors listen to their buffers instead of the project, and other
users of `cx.observe` on the project have been updated to use specific
events to reduce churn.
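
Purely as an illustration of the fan-out problem (this is not GPUI's API), compare blanket observation with subscribing to typed events: under blanket observation every project change wakes every open editor, while typed events let each editor react only to its own buffer.

```rust
// Illustrative only: a typed event plus per-editor filtering.
#[derive(Clone, Copy, PartialEq)]
enum ProjectEvent {
    BufferEdited { buffer_id: u64 },
    WorktreeUpdated,
}

struct Editor {
    buffer_id: u64,
    syncs: u64,
}

impl Editor {
    fn on_event(&mut self, event: ProjectEvent) {
        // Only resync when the event concerns this editor's buffer.
        if event == (ProjectEvent::BufferEdited { buffer_id: self.buffer_id }) {
            self.syncs += 1;
        }
    }
}

fn main() {
    let mut editors: Vec<Editor> =
        (0u64..80).map(|id| Editor { buffer_id: id, syncs: 0 }).collect();

    // 1,000 file-system events all touching a single buffer:
    for _ in 0..1_000 {
        for editor in &mut editors {
            editor.on_event(ProjectEvent::BufferEdited { buffer_id: 0 });
        }
    }

    // Under blanket observation all 80 editors would have synced 1,000 times
    // each; with event filtering only editor 0 did.
    assert_eq!(editors[0].syncs, 1_000);
    assert!(editors[1..].iter().all(|e| e.syncs == 0));
    let _ = ProjectEvent::WorktreeUpdated; // other event kinds simply get ignored
}
```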

Follow up to #26237


Release Notes:

- Improved performance of Zed in large repos with lots of file system
events.

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-03-07 10:51:46 -07:00
Conrad Irwin
aef84d453a Update Linux Graphics troubleshooting (#26263)
Try to re-order the tips for clarity, and make it clear that /etc/prime
could be wrong either way around.

Also remove the section on FIPS now that we no longer bundle OpenSSL

Updates #15629

Release Notes:

- N/A
2025-03-07 09:57:11 -07:00
João Marcos
e06d010aab Test folded buffers navigation (#26286)
#25944 but now with Vim mode off.

Release Notes:

- N/A
2025-03-07 16:19:12 +00:00
Marshall Bowers
14148f53d4 scripting_tool: Move description into a separate file (#26283)
This PR moves the `scripting_tool` description into a separate file so
it's a bit easier to work with.

Release Notes:

- N/A
2025-03-07 15:52:38 +00:00
Antonio Scandurra
efde5aa2bb Extract a Session struct to hold state about a given thread's scripting session (#26282)
We're still recreating a session for every tool call, but the idea is to
have a long-lived `Session` per assistant thread.

Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <hi@aguz.me>
2025-03-07 15:44:36 +00:00
Guilherme Gonçalves
fcc5e27455 Fix hotkey for toggle filters in project search (#25917)
Closes #24741 

Adjusted the shortcut key handling to properly toggle filters in the project search feature.

Release Notes:

- linux: Fixed `ctrl-alt-f` not correctly toggling search filters in project search.

---------

Co-authored-by: Peter Tripp <peter@zed.dev>
2025-03-07 15:27:10 +00:00
Marshall Bowers
ed417da536 git_ui: Try to prompt the model out of including the diff output (#26281)
This PR updates the prompt for generating commit messages to tell the
model not to include the raw diff output in the message.

Release Notes:

- N/A
2025-03-07 15:05:35 +00:00
Piotr Osiewicz
d1c67897c5 chore: Do not bust Rust build cache when opening projects with dev build (#26278)
## Problem
Running `cargo run .` twice in the Zed repository required a rebuild two
times in a row. The second rebuild was triggered around libz-sys, which
in practice caused a rebuild of the ~entire project.

Some concrete examples:
```
cargo test -p project # Requires a rebuild (warranted)
cargo run .
cargo test -p project # Requires a rebuild (unwarranted)
```
or
```
cargo run . # Requires a rebuild (warranted)
cargo run . # Requires a rebuild (unwarranted)
```

## What's going on
Zed's build script on macOS sets MACOSX_DEPLOYMENT_TARGET to 10.15. This
is fine. However, **cargo propagates all environment variables to child
processes during `cargo run`**. This then affects Rust Analyzer spawned
by dev Zed - it clobbers the build cache of whatever package it touches,
because its behavior is not the same between running it with `cargo run`
(where MACOSX_DEPLOYMENT_TARGET gets propagated to the child Zed) and running
it directly via `target/debug/zed` or whatever (where the env variable
is not set, so that build behaves roughly like Zed Dev.app).


## Solution
~We'll unset that env variable from user environment when we're
reasonably confident that we're running under `cargo run` by exploiting
other env variables set by cargo:
https://doc.rust-lang.org/cargo/reference/environment-variables.html
CARGO_PKG_NAME is always set to `zed` when running it via `cargo run`,
as it's the value propagated from the build.~

~The alternative I've considered is running [via a custom
runner](https://doc.rust-lang.org/cargo/reference/config.html#targetcfgrunner),
though the problem here is that we'd have to use a shell script to unset
the env variable - that could be problematic with e.g. fish. I just
didn't want to deal with that, though admittedly it would've been
cleaner in other aspects.~

Disregard all of the above. We'll just set MACOSX_DEPLOYMENT_TARGET regardless of
whether you have it in your original shell environment or not.
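
One way to read that fix, as a hypothetical sketch only: pin the variable whenever dev Zed spawns tooling, so the child process sees the same value regardless of how Zed itself was launched.

```rust
use std::process::Command;

// Hypothetical illustration: force MACOSX_DEPLOYMENT_TARGET to a fixed value
// for spawned tooling, so its builds behave identically whether the parent
// was started via `cargo run` (which propagates build-time env vars) or
// launched directly from `target/debug`.
fn spawn_rust_analyzer() -> std::io::Result<std::process::Child> {
    Command::new("rust-analyzer")
        .env("MACOSX_DEPLOYMENT_TARGET", "10.15")
        .spawn()
}

fn main() {
    // Ignore the result in this sketch; rust-analyzer may not be installed.
    let _ = spawn_rust_analyzer();
}
```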

Release Notes:

- N/A
2025-03-07 14:06:44 +00:00
Finn Evers
a887f3b340 Remove plain text file type association from default settings (#25420)
Closes #20291

This PR removes the plain text file association from the default
settings, as #21298 added a `LanguageMatcher` for Plain Text files,
which now associates "Plain Text" with `txt`-files (see
10053e2566/crates/language/src/language.rs (L127-L137)).

Thus, the association via the default settings is not required anymore,
which fixes #20291 as described in
https://github.com/zed-industries/zed/issues/20291#issuecomment-2500731743

Release Notes:

- Fixed default file type associations overriding associations provided
by extensions for `txt`-files.

Co-authored-by: Peter Tripp <peter@zed.dev>
2025-03-07 08:45:23 -05:00
Kirill Bulatov
f8deebc6db Fix inline diagnostics in the project diff (#26275)
205f9a9f03/crates/editor/src/element.rs (L1643)

Due to the snippet above, Zed is supposed to have `row` greater than or equal
to `start_row` here:


205f9a9f03/crates/editor/src/element.rs (L1694)

yet the panic was reported when clicking in the project diff.

That project diff has a lot of highlighting happening already, so the PR
disables inline diagnostics within a git diff view.


Release Notes:

- N/A
2025-03-07 11:34:37 +00:00
Michael Sloan
205f9a9f03 Add lua script access to code using cx + reuse project search logic (#26269)
Access to `cx` will be needed for anything that queries entities. In
this commit, that means the use of `WorktreeStore::find_search_candidates`. In
the future it will be things like access to LSP / tree-sitter outlines /
etc.

Changes to support access to `cx` from functions provided to the Lua
script:

* Adds a channel of requests that require a `cx`. Work enqueued to this
channel is run on the foreground thread.

* Adds `async` and `send` features to `mlua` crate so that async rust
functions can be used from Lua.

* Changes uses of `Rc<RefCell<...>>` to `Arc<Mutex<...>>` so that the
futures are `Send`.

One benefit of reusing the project search logic for search candidates is
that it properly respects ignored paths.
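
A minimal sketch of the "channel of requests that need `cx`" idea from the list above; the names are illustrative and GPUI's actual context types differ.

```rust
use std::sync::mpsc;
use std::thread;

// Illustrative stand-in for the app context that only the foreground
// thread may touch.
struct ForegroundCx {
    // ...entities, worktree store, etc.
}

// Work items that need `cx` are boxed closures sent over a channel.
type CxRequest = Box<dyn FnOnce(&mut ForegroundCx) + Send>;

fn main() {
    let (tx, rx) = mpsc::channel::<CxRequest>();

    // A "Lua script" running off the main thread enqueues work that needs cx.
    let background = thread::spawn(move || {
        tx.send(Box::new(|_cx: &mut ForegroundCx| {
            // e.g. run a search via WorktreeStore::find_search_candidates here.
            println!("ran a search on the foreground thread");
        }))
        .unwrap();
    });

    // The foreground thread drains the channel and runs each request with cx.
    let mut cx = ForegroundCx {};
    while let Ok(request) = rx.recv() {
        request(&mut cx);
        break; // just one request in this sketch
    }
    background.join().unwrap();
}
```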

Release Notes:

- N/A
2025-03-07 10:02:49 +00:00
Cole Miller
b0d1024f66 Silence a couple of noisy logs (#26262)
Closes #ISSUE

Release Notes:

- N/A
2025-03-06 22:45:47 -05:00
loczek
622ed8a032 git: Fix git panel not using default width (#26220)
Closes #26062

Removing the width here causes Zed to use the default value (from the
default settings) after restart, like other panels.

Release Notes:

- Fixed issue where git panel wasn't using default width after restart

Co-authored-by: Mikayla Maki <mikayla@zed.dev>
2025-03-07 02:27:29 +00:00
0x2CA
09c51f9641 assistant2: Fix font fallbacks (#26258)
Release Notes:

- N/A
2025-03-06 18:14:53 -08:00
Mikayla Maki
8422a81d88 Add staged variants of the hunk_style controls (#26259)
This PR adds a few more hunk style settings that flip the emphasis.
Normally, the concept at Zed has been that the project diff should
emphasize what's going into the commit. However, this leads to a problem
where the default state of all diff hunks is the non-emphasized
state, making them hard to see and interact with, especially on light
themes. This PR is an experiment in flipping the emphasis states. Now
the project diff is more like a queue of work, with the next "job" (hunk
to be evaluated) emphasized, and the "completed" (staged) hunks
deemphasized. This fixes the default state issue but is a big jump from
how we've been thinking about it. So here we can try it out and see how
it feels :)

Release Notes:

- Git Beta: Added hunk style settings to emphasize the unstaged state,
rather than the staged state.
2025-03-07 02:13:50 +00:00
Mikayla Maki
6c025507b6 Restore co-author hiding (#26257)
Release Notes:

- N/A
2025-03-07 01:40:17 +00:00
Marshall Bowers
8f4b7aa5db Improve the generate commit message design (#26233)
[WIP]

Release Notes:

- N/A

---------

Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-03-07 01:21:20 +00:00
Cole Miller
3345666557 Fix paths on Windows in new test (#26255)
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-03-06 20:08:13 -05:00
Max Brunsfeld
40c62cda5f Fix early return when reaching end excerpt in lift_buffer_metadata (#26253)
Release Notes:

- Fixed a bug causing slowness when viewing multi buffers with lots of
excerpts
2025-03-06 16:25:27 -08:00
Marshall Bowers
349a48d937 lua: Extract to zed-extensions/lua repository (#26250)
This PR extracts the Lua extension to the
[zed-extensions/lua](https://github.com/zed-extensions/lua) repository.

Release Notes:

- N/A
2025-03-06 23:17:34 +00:00
Cole Miller
a88af7351a Disable restore hunk control for created files (#25841)
Release Notes:

- Git Beta: disable hunk restore action and button for created files
2025-03-06 23:06:03 +00:00
Finn Evers
efaf358876 lua: Update keyword operator highlighting (#26091)
Resolves #26032 

This PR changes the highlighting for `and`, `not` and `or` in Lua from
`operator` to `keyword.operator`. [VS Code also highlights these as
keyword
operators](1483add845/Syntaxes/Lua.plist (L277-L279))
and Zed does this for other languages as well, like
[Python](813e207514/crates/languages/src/python/highlights.scm (L221-L229))
or
[Zig](813e207514/extensions/zig/languages/zig/highlights.scm (L145-L149)).

Additionally, in 813e207514 I removed
duplicate matches for existing keywords to improve readability and align
them to how keywords are generally matched across languages (see
[Rust](813e207514/crates/languages/src/rust/highlights.scm (L79-L119))
and
[Typescript](813e207514/crates/languages/src/typescript/highlights.scm (L210-L269))
for example).
Whilst contributing to the majority of the diff, this does not change
any existing highlights.

| Before | After | 
| --- | --- |
| <img width="309" alt="old"
src="https://github.com/user-attachments/assets/7790817e-4a0d-442b-b176-9a84bcc6f3c4"
/> | <img width="309" alt="PR"
src="https://github.com/user-attachments/assets/34a57962-938a-4465-9406-288f5c456aa3"
/> |


Release Notes:

- N/A

---------

Co-authored-by: Peter Tripp <peter@zed.dev>
2025-03-06 18:01:13 -05:00
Marshall Bowers
06a226dc32 editor: Remove some blank lines (#26249)
This PR removes some blank lines in `blink_manager.rs`.

Release Notes:

- N/A
2025-03-06 22:58:46 +00:00
Cole Miller
1763dd714b Worktree paths in git panel, take 2 (#26047)
Modified version of #25950. We still use worktree paths, but repo paths
with a status that lie outside the worktree are not excluded; instead,
we relativize them by adding `..`. This makes the list in the git panel
match what you'd get from running `git status` (with the repo's worktree
root as the working directory).
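
A simplified sketch of that relativization (not the actual Zed logic): walk up with `..` for the part of the worktree root that isn't shared with the target path.

```rust
use std::path::{Component, Path, PathBuf};

// Express an absolute repo path relative to the worktree root. Unix-style
// paths are used for the example.
fn relativize(worktree_root: &Path, abs_path: &Path) -> PathBuf {
    let root: Vec<Component> = worktree_root.components().collect();
    let target: Vec<Component> = abs_path.components().collect();

    // Length of the shared prefix of both paths.
    let shared = root
        .iter()
        .zip(&target)
        .take_while(|(a, b)| a == b)
        .count();

    let mut relative = PathBuf::new();
    for _ in shared..root.len() {
        relative.push("..");
    }
    for component in &target[shared..] {
        relative.push(component);
    }
    relative
}

fn main() {
    // Worktree rooted at a subdirectory of the repository; a changed file at
    // the repository root shows up as "../../README.md" in the git panel.
    let worktree = Path::new("/repo/crates/zed");
    let changed = Path::new("/repo/README.md");
    assert_eq!(relativize(worktree, changed), PathBuf::from("../../README.md"));
}
```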

- [x] Implement + test new unrelativization logic
- [x] ~~When collecting repositories, dedup by .git abs path, so
worktrees can share a repo at the project level~~ dedup repos at the
repository selector layer, with repos coming from larger worktrees being
preferred
- [x] Open single-file worktree with diff when activating a path not in
the worktree

Release Notes:

- N/A
2025-03-06 22:55:28 +00:00
Marshall Bowers
330e799293 erlang: Extract to zed-extensions/erlang repository (#26248)
This PR extracts the Erlang extension to the
[zed-extensions/erlang](https://github.com/zed-extensions/erlang)
repository.

Release Notes:

- N/A
2025-03-06 22:53:13 +00:00
Max Brunsfeld
51c900366d Enable soft-wrap by default in markdown (#26247)
Release Notes:

- Enabled soft-wrap by default in markdown
2025-03-06 22:30:26 +00:00
Max Brunsfeld
be75f17429 Fix auto-indent when pasting multi-line content that was copied start… (#26246)
Closes https://github.com/zed-industries/zed/issues/24914 (again)

Release Notes:

- Fixed an issue where multi-line pasted content was auto-indented
incorrectly if copied from the middle of an existing line.
2025-03-06 22:13:34 +00:00
Conrad Irwin
f373383fc1 Track dirtiness per item (#26237)
This reduces the number of multibuffer syncs when starting the editor
with 80 files open in the Zed repo from 10,000,000 to 100,000 by avoiding
O(n**2) dirtiness checks.

Release Notes:

- Fixed a beachball when restarting in a large repo with a large number of
open files
2025-03-06 15:12:56 -07:00
Joseph T. Lyons
263d9ff755 Add event to track LLM-generated commit messages (#26245)
Release Notes:

- N/A
2025-03-06 21:46:57 +00:00
Naim A.
829ecda370 lsp: Add support for clangd's inactiveRegions extension (#26146)
Closes #13089 

Here we use `experimental` to advertise our support for
`inactiveRegions`. Note that clangd does not currently have a stable
release that reads the `experimental` object (PR
https://github.com/llvm/llvm-project/pull/116531); this can be tested
with one of clangd's recent "unstable snapshots" in their
[releases](https://github.com/clangd/clangd/releases).

Release Notes:

- Added support for clangd's `inactiveRegions` extension.

![Screen Recording 2025-03-05 at 22 39
58](https://github.com/user-attachments/assets/ceade8bd-4d8e-43c3-9520-ad44efa50d2f)

---------

Co-authored-by: Peter Tripp <peter@zed.dev>
Co-authored-by: Kirill Bulatov <kirill@zed.dev>
2025-03-06 21:30:05 +00:00
Kirill Bulatov
af5af9d7c5 Support workspace/executeCommand for actions' data (#26239)
Closes https://github.com/zed-industries/zed/issues/16746
Part of https://github.com/zed-extensions/deno/issues/2

Changes the action-related code so that:

* `lsp::Command` as actions are supported, if the server replies with them
* actions with commands are filtered out based on the servers'
`executeCommandOptions`
(https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#executeCommandOptions)
— commands that are not listed won't be executed and the corresponding
actions will be hidden in Zed (a rough sketch of this filtering follows below)
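
A rough sketch of that filtering with illustrative types (not the `lsp` crate's), where an action is kept only if its command is advertised by the server:

```rust
struct CodeAction {
    title: String,
    command: Option<String>,
}

fn filter_actions(actions: Vec<CodeAction>, supported_commands: &[String]) -> Vec<CodeAction> {
    actions
        .into_iter()
        .filter(|action| match &action.command {
            // Actions without a command are always fine.
            None => true,
            // Commands not listed by the server won't be executed, so hide them.
            Some(command) => supported_commands.iter().any(|c| c == command),
        })
        .collect()
}

fn main() {
    // Command identifiers here are made up for the example.
    let actions = vec![
        CodeAction { title: "Fix all".into(), command: Some("server.fixAll".into()) },
        CodeAction { title: "Mystery".into(), command: Some("server.unknown".into()) },
    ];
    let supported = vec!["server.fixAll".to_string()];
    let visible = filter_actions(actions, &supported);
    assert_eq!(visible.len(), 1);
    assert_eq!(visible[0].title, "Fix all");
}
```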

Release Notes:

- Added support for `workspace/executeCommand` for actions' data

---------

Co-authored-by: Peter Tripp <petertripp@gmail.com>
2025-03-06 23:26:46 +02:00
Marshall Bowers
97c0a0a86e language_models: Remove .unwraps in Bedrock provider (#26238)
This PR removes a number of `.unwrap`s in the Bedrock provider.

We must not `.unwrap` in situations where it is not provably safe to do
so, which it was not in any of these cases.

Release Notes:

- Fixed some potential panics in the AWS Bedrock model provider.
2025-03-06 21:02:37 +00:00
Nate Butler
7e964290bf Add StatusToast & the ToastLayer (#26232)
https://github.com/user-attachments/assets/b16e32e6-46c6-41dc-ab68-1824d288c8c2

This PR adds the first part of our planned extended notification system:
StatusToasts.

It also makes various updates to ComponentPreview and adds a `Styled`
extension in `ui::style::animation` to make it easier to animate styled
elements.

_**Note**: We will be very, very selective with what elements are
allowed to be animated in Zed. Assume PRs adding animation to elements
will all need to be manually signed off on by a designer._

## Status Toast

![CleanShot 2025-03-06 at 14 15
52@2x](https://github.com/user-attachments/assets/b65d4661-f8d1-4e98-b9be-2c05cba1409f)

These are designed to be used for notifying about things that don't
require an action to be taken or don't need to be triaged. They are
designed to be ignorable, and dismiss themselves automatically after a
set time.

They can optionally include a single action. 

Example: When the user enables Vim Mode, that action might let them undo
enabling it.

![CleanShot 2025-03-06 at 14 18
34@2x](https://github.com/user-attachments/assets/eb6cb20e-c968-4f03-88a5-ecb6a8809150)

Status Toasts should _not_ be used when an action is required, or for
any binary choice.

If the user must provide some input, this isn't the right component!

### Out of scope

- Toasts should fade over a short time (like AnimationDuration::Fast or
Instant) when dismissed
- We should visually show when the toast will dismiss. We'll need to
pipe the `duration_remaining` from the toast layer -> ActiveToast to do
this.
- Dismiss any active toast if another notification kind is created, like
a Notification or Alert.

Release Notes:

- N/A

---------

Co-authored-by: Cole Miller <m@cole-miller.net>
2025-03-06 20:37:54 +00:00
Marshall Bowers
b8a8b9c699 git_ui: Add support for generating commit messages with an LLM (#26227)
This PR finishes up the support for generating commit messages using an
LLM.

We're shelling out to `git diff` to get the diff text, as it seemed more
efficient than attempting to reconstruct the diff ourselves from our
internal Git state.
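
A simplified sketch of that shell-out (not Zed's actual code, and the prompt wording is made up): capture the staged diff and fold it into a prompt for the language model.

```rust
use std::process::Command;

fn commit_message_prompt(repo_path: &str) -> std::io::Result<String> {
    // `git -C <repo> diff --staged` prints the staged changes.
    let output = Command::new("git")
        .arg("-C")
        .arg(repo_path)
        .args(["diff", "--staged"])
        .output()?;
    let diff = String::from_utf8_lossy(&output.stdout);
    Ok(format!(
        "Write a concise commit message for the following changes. \
         Do not include the raw diff in the message.\n\n{diff}"
    ))
}

fn main() -> std::io::Result<()> {
    let prompt = commit_message_prompt(".")?;
    println!("{prompt}");
    Ok(())
}
```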


https://github.com/user-attachments/assets/9bcf30a7-7a08-4f49-a753-72a5d954bddd

Release Notes:

- Git Beta: Added support for generating commit messages using a
language model.

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-03-06 19:47:52 +00:00
Danilo Leal
d1cec209d4 gpui: Add rounded_md token (#26179)
This PR adds a new border-radius token: `rounded_md`, with a
value of 6px.

I feel like I was wanting to use 6px border radius a lot but avoiding
due to it being an arbitrary value... so, not anymore! It's also cool to
have this be consistent with Tailwind v4.

Follow on to the prior renames:

- `rounded_sm` -> `rounded_xs`:
https://github.com/zed-industries/zed/pull/26221
- `rounded_md` -> `rounded_sm`:
https://github.com/zed-industries/zed/pull/26228

Release Notes:

- N/A

---------

Co-authored-by: Marshall Bowers <git@maxdeviant.com>
2025-03-06 13:41:21 -05:00
Marshall Bowers
aceab76ae4 gpui: Rename rounded_md to rounded_sm (#26228)
This PR renames the `rounded_md` style method to `rounded_sm`.

Follow up to https://github.com/zed-industries/zed/pull/26221, which
freed up the `rounded_sm` name.

Release Notes:

- N/A
2025-03-06 17:57:31 +00:00
Conrad Irwin
9c054f207e Git telemetry (#26222)
Release Notes:

- git: Adds telemetry to git actions
2025-03-06 10:56:28 -07:00
smit
219d36f589 migrator: Add versioned migrations (#26215)
There is a drawback to how we currently write our migrations:

For example:

1. Suppose we change one of our actions from a string to an array and
rename it, then roll out the preview build:

      Before: `"ctrl-x": "editor::GoToPrevHunk"`
Latest: `"ctrl-x": ["editor::GoToPreviousHunk", { "center_cursor": true
}]`
      
To handle this, we wrote migration `A` to convert the string to an
array.

2. Now, suppose we decide to change it back to a string:
- User who hasn't migrated yet on Preview: `"ctrl-x":
"editor::GoToPrevHunk"`
- User who has migrated on Preview: `"ctrl-x":
["editor::GoToPreviousHunk", { "center_cursor": true }]`
    - Latest: `"ctrl-x": "editor::GoToPreviousHunk"`

To handle this, we would need to remove migration `A` and add two more
migrations:
- **Migration B**: `"ctrl-x": "editor::GoToPrevHunk"` -> `"ctrl-x":
"editor::GoToPreviousHunk"`
- **Migration C**: `"ctrl-x": ["editor::GoToPreviousHunk", {
"center_cursor": true }]` -> `"ctrl-x": "editor::GoToPreviousHunk"`

Nice. But over time, this keeps increasing, making it impossible to
track outdated versions and handle all cases. Missing a case means users
stuck on `"ctrl-x": "editor::GoToPrevHunk"` will remain there and won't
be automatically migrated to the latest state.

---

To fix this, we introduce versioned migrations. Instead of removing
migration `A`, we simply write a new migration that takes the user to
the latest version—i.e., in this case, migration `C`.

- A user who hasn't migrated before will go through both migrations `A`
and `C` in order.
- A user who has already migrated will only go through `C`, since `A`
wouldn't change anything for them.

With incremental migrations, we only need to write migrations on top of
the latest state (big win!), as we know internally they will all be on the
latest state. You *must not* modify previous migrations. Always create
new ones instead.
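
A tiny sketch of the ordered-migrations idea, reusing the `ctrl-x` example above; the real migrator operates on keymap JSON rather than plain strings, so everything here is illustrative. Each migration is written against the output of the previous one and is a no-op when it doesn't match, so users on any intermediate state converge on the latest form.

```rust
type Migration = fn(&str) -> Option<String>;

// Migration A: string action -> array form.
fn migration_a(binding: &str) -> Option<String> {
    (binding == r#""editor::GoToPrevHunk""#)
        .then(|| r#"["editor::GoToPreviousHunk", { "center_cursor": true }]"#.to_string())
}

// Migration C: array form -> back to the (renamed) string action.
fn migration_c(binding: &str) -> Option<String> {
    (binding == r#"["editor::GoToPreviousHunk", { "center_cursor": true }]"#)
        .then(|| r#""editor::GoToPreviousHunk""#.to_string())
}

fn migrate(mut binding: String) -> String {
    // Never edit old migrations; only append new ones to this list.
    let migrations: &[Migration] = &[migration_a, migration_c];
    for migration in migrations {
        if let Some(updated) = migration(&binding) {
            binding = updated;
        }
    }
    binding
}

fn main() {
    // A user who never migrated and a user already on the array form both
    // end up on the latest state.
    assert_eq!(
        migrate(r#""editor::GoToPrevHunk""#.into()),
        r#""editor::GoToPreviousHunk""#
    );
    assert_eq!(
        migrate(r#"["editor::GoToPreviousHunk", { "center_cursor": true }]"#.into()),
        r#""editor::GoToPreviousHunk""#
    );
}
```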

This also serves as a base for only prompting the user to migrate once a
feature reaches stable. That way, preview and stable keymaps and settings
stay in sync.

cc: @mgsloan @ConradIrwin @probably-neb 

Release Notes:

- N/A
2025-03-06 23:04:48 +05:30
Marshall Bowers
6fd9708eee extension: Add capabilities for the process API (#26224)
This PR adds support for capabilities for the extension process API.

In order to use the process API, an extension must declare which
commands it wants to use, with arguments:

```toml
[[capabilities]]
kind = "process:exec"
command = "echo"
args = ["hello!"]
```

A `*` can be used to denote a single wildcard in the argument list:

```toml
[[capabilities]]
kind = "process:exec"
command = "echo"
args = ["*"]
```

And `**` can be used to denote a wildcard for the remaining arguments:

```toml
[[capabilities]]
kind = "process:exec"
command = "ls"
args = ["-a", "**"]
```
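
A rough illustration of how these argument patterns could be matched (not the actual extension-host code): `*` accepts exactly one argument, `**` accepts everything that remains.

```rust
fn args_allowed(pattern: &[&str], args: &[&str]) -> bool {
    let mut args = args.iter().copied();
    for token in pattern {
        match *token {
            // "**": any remaining arguments are allowed.
            "**" => return true,
            // "*": exactly one argument, whatever it is.
            "*" => {
                if args.next().is_none() {
                    return false;
                }
            }
            // Anything else must match the next argument literally.
            literal => {
                if args.next() != Some(literal) {
                    return false;
                }
            }
        }
    }
    // Without a trailing "**", the argument list must be fully consumed.
    args.next().is_none()
}

fn main() {
    assert!(args_allowed(&["hello!"], &["hello!"])); // exact match
    assert!(args_allowed(&["*"], &["anything"])); // single wildcard
    assert!(!args_allowed(&["*"], &["too", "many"])); // "*" is exactly one
    assert!(args_allowed(&["-a", "**"], &["-a", "-l", "."])); // rest wildcard
    assert!(!args_allowed(&["-a", "**"], &["-b"])); // literal must still match
}
```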

Release Notes:

- N/A

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-03-06 11:55:00 -05:00
Marshall Bowers
99216acdec gpui: Rename rounded_sm to rounded_xs (#26221)
This PR renames the `rounded_sm` style method to `rounded_xs`.

This will allow us to add an additional step in the scale.

Release Notes:

- N/A
2025-03-06 16:08:19 +00:00
Chris Boette
aef25a3bc3 slash_commands_example: Improve setup instructions in README (#26217)
This PR improves the setup instructions for the slash-commands-example
extension by:

1. Replacing the `sed` command with a more reliable approach that
completely replaces the Cargo.toml file.

2. Explicitly showing how to create a standalone extension with a
properly configured Cargo.toml file that:
   - Uses `edition = "2021"` instead of `edition.workspace = true`
   - Doesn't include `publish.workspace = true`
   - Doesn't include the `[lints]` section

This change addresses an issue where the extension wouldn't work when
copied as a standalone project due to workspace references that are only
valid when the extension is built as part of the main Zed repository.

The updated instructions provide a clear, reliable path for developers
to create their own Zed extensions based on the slash commands example.

Release Notes:

- N/A
2025-03-06 10:56:17 -05:00
greathongtu
9b07f36199 gpui: Fix Cut action in input example (#26203)
Zed fan trying to learn GPUI here. I noticed one problem in the input example
which caused the cmd-x (Cut) function to not work.
Let me know if any adjustments are needed!

Release Notes:

- N/A
2025-03-06 10:02:52 -05:00
Ben Kunkle
ff25fa24e7 Add support for auto-closing of JSX tags (#25681)
Closes #4271

Implemented by kicking off a task on the main thread at the end of
`Editor::handle_input` which waits for the buffer to be re-parsed before
checking if JSX tag completion is possible based on the recent edits; if it
is, it spawns a task on the background thread to generate the
edits to be auto-applied to the buffer.
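
As a toy illustration of the "generate the edits" step only (the real implementation is tree-sitter-based and asynchronous), proposing a closing tag from the text before the cursor might look like this:

```rust
// Given the text up to the cursor right after a ">" was typed, propose the
// closing tag to insert at the cursor, ignoring attributes and self-closing tags.
fn closing_tag_to_insert(text_before_cursor: &str) -> Option<String> {
    if !text_before_cursor.ends_with('>') || text_before_cursor.ends_with("/>") {
        return None;
    }
    let open = text_before_cursor.rfind('<')?;
    let tag = &text_before_cursor[open + 1..text_before_cursor.len() - 1];
    // Take the tag name up to the first whitespace (ignore attributes).
    let name = tag.split_whitespace().next()?;
    if name.is_empty() || name.starts_with('/') {
        return None; // just typed a closing tag; nothing to insert
    }
    Some(format!("</{name}>"))
}

fn main() {
    assert_eq!(
        closing_tag_to_insert("<div className=\"x\">"),
        Some("</div>".to_string())
    );
    assert_eq!(closing_tag_to_insert("<br/>"), None);
    assert_eq!(closing_tag_to_insert("</div>"), None);
}
```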

Release Notes:

- Added support for auto-closing of JSX tags

---------

Co-authored-by: Cole Miller <cole@zed.dev>
Co-authored-by: Max Brunsfeld <max@zed.dev>
Co-authored-by: Marshall Bowers <git@maxdeviant.com>
Co-authored-by: Mikayla <mikayla@zed.dev>
Co-authored-by: Peter Tripp <peter@zed.dev>
2025-03-06 08:36:10 -06:00
张小白
05df3d1bd6 windows: Dock menu impl 2 (#26010)
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-03-06 12:40:34 +00:00
Piotr Osiewicz
84f4d2630f node_runtime: Use user/global configuration when using system node installation (#26209)
This partially reverts https://github.com/zed-industries/zed/pull/3324.
We will still blank out the user/global config when running managed NPM, to
keep to the spirit of #3324 (which was made at a time when we did not allow
user-provided NPM builds - the intent of the change was to make the
behavior of NPM as consistent as possible).

I tested this change by:
1. Setting up a custom NPM registry via Verdaccio
2. Adding this new registry to my .npmrc
3. Mirroring `vscode-langservers-extracted` to it 
4. Blocking access to `registry.npmjs.org`
5. Opening up settings.json file in Zed Nightly
- Verifying that language server update fails for it
6. Opening up Zed Dev build of this branch
- Confirming that language server update check goes through for it

Closes #19806
Closes #20749
Closes #9422

Release Notes:

- User and global .npmrc configuration is now respected when running
user-provided NPM binary (which also happens automatically when `npm`
from PATH is newer than 18.0.0)
2025-03-06 12:50:42 +01:00
renovate[bot]
b42930f5be Update Rust crate embed-resource to v3.0.2 (#26171)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[embed-resource](https://redirect.github.com/nabijaczleweli/rust-embed-resource)
| build-dependencies | patch | `3.0.1` -> `3.0.2` |

---

### Release Notes

<details>
<summary>nabijaczleweli/rust-embed-resource (embed-resource)</summary>

###
[`v3.0.2`](https://redirect.github.com/nabijaczleweli/rust-embed-resource/compare/v3.0.1...v3.0.2)

[Compare
Source](https://redirect.github.com/nabijaczleweli/rust-embed-resource/compare/v3.0.1...v3.0.2)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-06 12:31:11 +02:00
renovate[bot]
cb2eef6fb6 Update Rust crate chrono to v0.4.40 (#26170)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [chrono](https://redirect.github.com/chronotope/chrono) |
workspace.dependencies | patch | `0.4.39` -> `0.4.40` |

---

### Release Notes

<details>
<summary>chronotope/chrono (chrono)</summary>

###
[`v0.4.40`](https://redirect.github.com/chronotope/chrono/releases/tag/v0.4.40):
0.4.40

[Compare
Source](https://redirect.github.com/chronotope/chrono/compare/v0.4.39...v0.4.40)

#### What's Changed

- Add Month::num_days() by
[@&#8203;djc](https://redirect.github.com/djc) in
[https://github.com/chronotope/chrono/pull/1645](https://redirect.github.com/chronotope/chrono/pull/1645)
- Update Windows dependencies by
[@&#8203;kennykerr](https://redirect.github.com/kennykerr) in
[https://github.com/chronotope/chrono/pull/1646](https://redirect.github.com/chronotope/chrono/pull/1646)
- Feature/round_up method on DurationRound trait by
[@&#8203;MagnumTrader](https://redirect.github.com/MagnumTrader) in
[https://github.com/chronotope/chrono/pull/1651](https://redirect.github.com/chronotope/chrono/pull/1651)
- Expose `write_to` for `DelayedFormat` by
[@&#8203;tugtugtug](https://redirect.github.com/tugtugtug) in
[https://github.com/chronotope/chrono/pull/1654](https://redirect.github.com/chronotope/chrono/pull/1654)
- Update LICENSE.txt by
[@&#8203;maximevtush](https://redirect.github.com/maximevtush) in
[https://github.com/chronotope/chrono/pull/1656](https://redirect.github.com/chronotope/chrono/pull/1656)
- docs: fix minor typo by
[@&#8203;samfolo](https://redirect.github.com/samfolo) in
[https://github.com/chronotope/chrono/pull/1659](https://redirect.github.com/chronotope/chrono/pull/1659)
- Use NaiveDateTime for internal tz_info methods. by
[@&#8203;AVee](https://redirect.github.com/AVee) in
[https://github.com/chronotope/chrono/pull/1658](https://redirect.github.com/chronotope/chrono/pull/1658)
- Upgrade to windows-bindgen 0.60 by
[@&#8203;djc](https://redirect.github.com/djc) in
[https://github.com/chronotope/chrono/pull/1665](https://redirect.github.com/chronotope/chrono/pull/1665)
- Add quarter (%q) date string specifier by
[@&#8203;drinkcat](https://redirect.github.com/drinkcat) in
[https://github.com/chronotope/chrono/pull/1666](https://redirect.github.com/chronotope/chrono/pull/1666)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-06 12:30:56 +02:00
renovate[bot]
73668c2c4b Update Rust crate async-compression to v0.4.20 (#26154)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[async-compression](https://redirect.github.com/Nullus157/async-compression)
| workspace.dependencies | patch | `0.4.18` -> `0.4.20` |

---

### Release Notes

<details>
<summary>Nullus157/async-compression (async-compression)</summary>

###
[`v0.4.20`](https://redirect.github.com/Nullus157/async-compression/blob/HEAD/CHANGELOG.md#0420---2025-02-28)

[Compare
Source](https://redirect.github.com/Nullus157/async-compression/compare/v0.4.19...v0.4.20)

##### Added

-   Add support for `wasm32-wasip1-*` targets.

###
[`v0.4.19`](https://redirect.github.com/Nullus157/async-compression/blob/HEAD/CHANGELOG.md#0419---2025-02-27)

[Compare
Source](https://redirect.github.com/Nullus157/async-compression/compare/v0.4.18...v0.4.19)

##### Changed

-   Update `bzip2` dependency to `0.5`.

##### Fixed

-   Ensure that flush finishes before continuing.

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-06 12:30:38 +02:00
renovate[bot]
07f555ca3b Update Rust crate cargo_metadata to v0.19.2 (#26165)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [cargo_metadata](https://redirect.github.com/oli-obk/cargo_metadata) |
workspace.dependencies | patch | `0.19.1` -> `0.19.2` |

---

### Release Notes

<details>
<summary>oli-obk/cargo_metadata (cargo_metadata)</summary>

###
[`v0.19.2`](https://redirect.github.com/oli-obk/cargo_metadata/compare/0.19.1...0.19.2)

[Compare
Source](https://redirect.github.com/oli-obk/cargo_metadata/compare/0.19.1...0.19.2)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [x] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-06 09:24:48 +00:00
renovate[bot]
b0e2b57462 Update Rust crate linkme to v0.3.32 (#26191)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [linkme](https://redirect.github.com/dtolnay/linkme) |
workspace.dependencies | patch | `0.3.31` -> `0.3.32` |

---

### Release Notes

<details>
<summary>dtolnay/linkme (linkme)</summary>

###
[`v0.3.32`](https://redirect.github.com/dtolnay/linkme/releases/tag/0.3.32)

[Compare
Source](https://redirect.github.com/dtolnay/linkme/compare/0.3.31...0.3.32)

-   Documentation improvements

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-06 11:06:55 +02:00
renovate[bot]
d404d7964e Update Rust crate blake3 to v1.6.1 (#26159)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [blake3](https://redirect.github.com/BLAKE3-team/BLAKE3) |
workspace.dependencies | patch | `1.6.0` -> `1.6.1` |

---

### Release Notes

<details>
<summary>BLAKE3-team/BLAKE3 (blake3)</summary>

###
[`v1.6.1`](https://redirect.github.com/BLAKE3-team/BLAKE3/releases/tag/1.6.1)

[Compare
Source](https://redirect.github.com/BLAKE3-team/BLAKE3/compare/1.6.0...1.6.1)

version 1.6.1

Changes since 1.6.0:

-   Remove `mmap` from the default features list. It was added
    accidentally in v1.6.0, last week. This is technically a
backwards-incompatible change, but I would rather not tag v2.0.0 for a
    build-time bugfix with a simple workaround.

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-06 11:06:06 +02:00
renovate[bot]
69af9bedaf Update Rust crate oo7 to v0.4.1 (#26192)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [oo7](https://redirect.github.com/bilelmoussaoui/oo7) | dependencies |
patch | `0.4.0` -> `0.4.1` |

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-06 11:05:44 +02:00
Conrad Irwin
2ebbcf15ea Don't delete non-extant files (#26187)
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-03-05 22:45:25 -07:00
renovate[bot]
631cab5e40 Update Rust crate indoc to v2.0.6 (#26177)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [indoc](https://redirect.github.com/dtolnay/indoc) |
workspace.dependencies | patch | `2.0.5` -> `2.0.6` |

---

### Release Notes

<details>
<summary>dtolnay/indoc (indoc)</summary>

###
[`v2.0.6`](https://redirect.github.com/dtolnay/indoc/releases/tag/2.0.6)

[Compare
Source](https://redirect.github.com/dtolnay/indoc/compare/2.0.5...2.0.6)

-   Documentation improvements

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiIzOS4xODUuNCIsInVwZGF0ZWRJblZlciI6IjM5LjE4NS40IiwidGFyZ2V0QnJhbmNoIjoibWFpbiIsImxhYmVscyI6W119-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-06 05:39:10 +00:00
renovate[bot]
ba39a47c52 Update Rust crate libc to v0.2.170 (#26181)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [libc](https://redirect.github.com/rust-lang/libc) |
workspace.dependencies | patch | `0.2.169` -> `0.2.170` |

---

### Release Notes

<details>
<summary>rust-lang/libc (libc)</summary>

###
[`v0.2.170`](https://redirect.github.com/rust-lang/libc/releases/tag/0.2.170)

[Compare
Source](https://redirect.github.com/rust-lang/libc/compare/0.2.169...0.2.170)

##### Added

- Android: Declare `setdomainname` and `getdomainname`
[#&#8203;4212](https://redirect.github.com/rust-lang/libc/pull/4212)
- FreeBSD: Add `evdev` structures
[#&#8203;3756](https://redirect.github.com/rust-lang/libc/pull/3756)
- FreeBSD: Add the new `st_filerev` field to `stat32`
([#&#8203;4254](https://redirect.github.com/rust-lang/libc/pull/4254))
- Linux: Add `SI_*` and `TRAP_*` signal codes
[#&#8203;4225](https://redirect.github.com/rust-lang/libc/pull/4225)
- Linux: Add experimental configuration to enable 64-bit time in kernel
APIs, set by `RUST_LIBC_UNSTABLE_LINUX_TIME_BITS64`.
[#&#8203;4148](https://redirect.github.com/rust-lang/libc/pull/4148)
- Linux: Add recent socket timestamping flags
[#&#8203;4273](https://redirect.github.com/rust-lang/libc/pull/4273)
- Linux: Added new CANFD_FDF flag for the flags field of canfd_frame
[#&#8203;4223](https://redirect.github.com/rust-lang/libc/pull/4223)
- Musl: add CLONE_NEWTIME
[#&#8203;4226](https://redirect.github.com/rust-lang/libc/pull/4226)
- Solarish: add the posix_spawn family of functions
[#&#8203;4259](https://redirect.github.com/rust-lang/libc/pull/4259)

##### Deprecated

- Linux: deprecate kernel modules syscalls
[#&#8203;4228](https://redirect.github.com/rust-lang/libc/pull/4228)

##### Changed

- Emscripten: Assume version is at least 3.1.42
[#&#8203;4243](https://redirect.github.com/rust-lang/libc/pull/4243)

##### Fixed

- BSD: Correct the definition of `WEXITSTATUS`
[#&#8203;4213](https://redirect.github.com/rust-lang/libc/pull/4213)
- Hurd: Fix CMSG_DATA on 64bit systems
([#&#8203;4240](https://redirect.github.com/rust-lang/libc/pull/424))
- NetBSD: fix `getmntinfo`
([#&#8203;4265](https://redirect.github.com/rust-lang/libc/pull/4265))
- VxWorks: Fix the size of `time_t`
[#&#8203;426](https://redirect.github.com/rust-lang/libc/pull/426)

##### Other

- Add labels to FIXMEs
[#&#8203;4230](https://redirect.github.com/rust-lang/libc/pull/4230),
[#&#8203;4229](https://redirect.github.com/rust-lang/libc/pull/4229),
[#&#8203;4237](https://redirect.github.com/rust-lang/libc/pull/4237)
- CI: Bump FreeBSD CI to 13.4 and 14.2
[#&#8203;4260](https://redirect.github.com/rust-lang/libc/pull/4260)
- Copy definitions from core::ffi and centralize them
[#&#8203;4256](https://redirect.github.com/rust-lang/libc/pull/4256)
- Define c_char at top-level and remove per-target c_char definitions
[#&#8203;4202](https://redirect.github.com/rust-lang/libc/pull/4202)
- Port style.rs to syn and add tests for the style checker
[#&#8203;4220](https://redirect.github.com/rust-lang/libc/pull/4220)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiIzOS4xODUuNCIsInVwZGF0ZWRJblZlciI6IjM5LjE4NS40IiwidGFyZ2V0QnJhbmNoIjoibWFpbiIsImxhYmVscyI6W119-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-06 05:27:23 +00:00
Conrad Irwin
c34357e2ab Git askpass (#25953)
Supersedes #25848

Release Notes:

- git: Supporting push/pull/fetch when remote requires auth

---------

Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
2025-03-06 05:20:06 +00:00
renovate[bot]
6fdb666bb7 Update Rust crate inventory to v0.3.20 (#26180)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [inventory](https://redirect.github.com/dtolnay/inventory) |
workspace.dependencies | patch | `0.3.19` -> `0.3.20` |

---

### Release Notes

<details>
<summary>dtolnay/inventory (inventory)</summary>

###
[`v0.3.20`](https://redirect.github.com/dtolnay/inventory/releases/tag/0.3.20)

[Compare
Source](https://redirect.github.com/dtolnay/inventory/compare/0.3.19...0.3.20)

-   Documentation improvements

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiIzOS4xODUuNCIsInVwZGF0ZWRJblZlciI6IjM5LjE4NS40IiwidGFyZ2V0QnJhbmNoIjoibWFpbiIsImxhYmVscyI6W119-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-06 00:15:57 -05:00
Peter Tripp
7a9e0b37ed Improve schema_generator CLI (#25898)
Release Notes:

- N/A
2025-03-06 04:59:57 +00:00
renovate[bot]
d44ba92363 Update Rust crate globset to v0.4.16 (#26176)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[globset](https://redirect.github.com/BurntSushi/ripgrep/tree/master/crates/globset)
([source](https://redirect.github.com/BurntSushi/ripgrep/tree/HEAD/crates/globset))
| workspace.dependencies | patch | `0.4.15` -> `0.4.16` |

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiIzOS4xODUuNCIsInVwZGF0ZWRJblZlciI6IjM5LjE4NS40IiwidGFyZ2V0QnJhbmNoIjoibWFpbiIsImxhYmVscyI6W119-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-06 04:45:26 +00:00
renovate[bot]
ca4d8fc900 Update Rust crate bytes to v1.10.1 (#26164)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [bytes](https://redirect.github.com/tokio-rs/bytes) |
workspace.dependencies | patch | `1.10.0` -> `1.10.1` |

---

### Release Notes

<details>
<summary>tokio-rs/bytes (bytes)</summary>

###
[`v1.10.1`](https://redirect.github.com/tokio-rs/bytes/blob/HEAD/CHANGELOG.md#1101-March-5th-2025)

[Compare
Source](https://redirect.github.com/tokio-rs/bytes/compare/v1.10.0...v1.10.1)

##### Fixed

- Fix memory leak when using `to_vec` with `Bytes::from_owner`
([#&#8203;773](https://redirect.github.com/tokio-rs/bytes/issues/773))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiIzOS4xODUuNCIsInVwZGF0ZWRJblZlciI6IjM5LjE4NS40IiwidGFyZ2V0QnJhbmNoIjoibWFpbiIsImxhYmVscyI6W119-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-06 04:35:04 +00:00
Agus Zubiaga
352882af77 docs: Improve edit prediction tab conflict section (#25493)
To be merged when https://github.com/zed-industries/zed/pull/25491 is released

Release Notes:

- N/A

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-03-05 23:23:06 -05:00
Cole Miller
c5c4a6201b ci: Upload remote server assets to workflow run as well (#26153)
Closes #ISSUE

Release Notes:

- N/A
2025-03-06 04:20:38 +00:00
Devzeth
6327a5d665 docs: Improve documentation of ensure final new line on save (#25960)
The function `ensure_final_newline` in buffer.rs is documented as:
"Ensures that the buffer ends with a single newline character, no other
whitespace."

The documentation wasn't explaining well that we actually remove any
trailing lines containing only whitespace and keep only one newline at
the end of a buffer.
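
A minimal sketch of that behavior (illustrative only; Zed's actual implementation lives in `buffer.rs` and operates on its own buffer type, not on `String`):

```rust
/// Illustrative only: trailing whitespace-only lines are removed and the
/// buffer is left ending in exactly one newline.
fn ensure_final_newline(text: &mut String) {
    // Trimming trailing whitespace also drops whole lines at the end of the
    // buffer that contain nothing but whitespace.
    let trimmed_len = text.trim_end().len();
    text.truncate(trimmed_len);
    if !text.is_empty() {
        // Keep exactly one newline, with no other trailing whitespace.
        text.push('\n');
    }
}

// "fn main() {}\n\n   \n\t\n" becomes "fn main() {}\n".
```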

Release Notes:

- N/A

---------

Co-authored-by: Peter Tripp <peter@zed.dev>
Co-authored-by: Danilo Leal <danilo@zed.dev>
2025-03-05 23:14:20 -05:00
Devzeth
57438d30e8 docs: Add documentation for lsp_highlight_debounce (#25974)
Adds documentation for `lsp_highlight_debounce`. 

Release Notes:

- N/A
2025-03-05 23:09:04 -05:00
0x2CA
5c81dd7d39 git: Fix git commit font fallbacks (#26184)
Closes #ISSUE

Release Notes:

- Fixed git commit font_fallbacks
2025-03-06 04:06:00 +00:00
Cole Miller
aec4d5cb26 Fix panic in commit editor selections syncing (#26186)
Closes #26183 

Release Notes:

- Git Beta: Fixed a panic when selecting text in one of the commit
message editors
2025-03-06 03:21:40 +00:00
Peter Tripp
4c0750bd2f ci: Less Windows CI for PRs (#26155)
Split Windows GHA CI job into `windows_clippy` and `windows_tests`
(`cargo test` and `cargo build`). `windows_clippy` will continue to run
on every PR commit, but `windows_tests` will only be run on main. Tag a
PR `windows` if you would like to run windows tests.

Added a call to the Azure metadata service to detect the Azure hardware
used by the GitHub-hosted Windows runners. This is temporary and I'll
remove it once I've gathered some data (adds 5-15 secs to Windows CI times).

Release Notes:

- N/A

---------

Co-authored-by: Mikayla Maki <mikayla@zed.dev>
2025-03-05 21:59:58 -05:00
brian tan
22b1a02e23 vim: Implement <count>% motion (#25839)
Closes https://github.com/zed-industries/zed/discussions/25665

> Currently Zed is missing quite a useful Vim motion: `<count>%` (go to
> {count} percentage in the file).
>
> Description:
> {count}% - Go to {count} percentage in the file, on the first non-blank
> in the line linewise. To compute the new line number this formula is
> used: ({count} * number-of-lines + 99) / 100.
> [Link](https://neovim.io/doc/user/motion.html#N%25).
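
For reference, the quoted formula as a standalone sketch (integer division rounds down; this is not the actual vim crate code):

```rust
fn percentage_target_line(count: u32, number_of_lines: u32) -> u32 {
    (count * number_of_lines + 99) / 100
}

fn main() {
    // `30%` in a 250-line file: (30 * 250 + 99) / 100 = 75,
    // so the cursor lands on the first non-blank character of line 75.
    assert_eq!(percentage_target_line(30, 250), 75);
}
```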

Release Notes:

- vim: Added `<count>%` motion

---------

Co-authored-by: Conrad Irwin <conrad@zed.dev>
2025-03-05 19:59:18 -07:00
Max Brunsfeld
314ad5dd5f Clear pending staged/unstaged diff hunks hunks when writing to the git index fails (#26173)
Release Notes:

- Git Beta: Fixed a bug where discarding a hunk in the project diff view
performed two concurrent saves of the buffer.
- Git Beta: Fixed an issue where diff hunks appeared in the wrong state
after failing to write to the git index.
2025-03-05 18:45:09 -08:00
Kirill Bulatov
d3c68650c0 Improve cmd-click in terminal to find more paths (#26174)
Closes https://github.com/zed-industries/zed/issues/25701

Reworks the way cmd-click is handled (a rough sketch follows the list):

* first, all worktree entries are checked for existence

This allows a more fine-grained lookup of entries that exist in the
worktree but whose path in the terminal is not "full": the case where
neither `cwd` nor the worktree's root combined with the terminal path
forms a valid path
(https://github.com/zed-industries/zed/issues/25701).

The worktrees are sorted "closest to cwd first", so such files are
resolved in the most specific worktree.

This also fixes cmd-click not working at all over remote SSH.

* second, only if the client is local, filesystem checks are performed
to find non-indexed files
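
A hedged sketch of that lookup order; `Worktree`, `entry_for`, and the proximity sort are illustrative stand-ins rather than Zed's actual types:

```rust
use std::cmp::Reverse;
use std::path::{Path, PathBuf};

struct Worktree {
    root: PathBuf,
}

impl Worktree {
    // Stand-in for "does this worktree have an indexed entry at this relative path?"
    fn entry_for(&self, relative: &Path) -> Option<PathBuf> {
        let candidate = self.root.join(relative);
        candidate.exists().then_some(candidate)
    }
}

fn resolve_terminal_path(
    clicked: &Path,
    cwd: &Path,
    worktrees: &mut [Worktree],
    is_local: bool,
) -> Option<PathBuf> {
    // 1. Check worktree entries first, preferring the worktree closest to cwd,
    //    so a partial path printed in the terminal resolves in the most specific tree.
    worktrees.sort_by_key(|wt| {
        Reverse(
            wt.root
                .components()
                .zip(cwd.components())
                .take_while(|(a, b)| a == b)
                .count(),
        )
    });
    for worktree in worktrees.iter() {
        if let Some(path) = worktree.entry_for(clicked) {
            return Some(path);
        }
    }
    // 2. Only a local client falls back to filesystem checks for non-indexed files.
    if is_local {
        let candidate = cwd.join(clicked);
        if candidate.exists() {
            return Some(candidate);
        }
    }
    None
}
```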

Release Notes:

- Improved cmd-click in terminal to find more paths
2025-03-06 00:41:13 +00:00
Danilo Leal
43339c6869 assistant2: Improve clarity of loading state (#26178)
Follow up to https://github.com/zed-industries/zed/pull/23299.

Having the loading state on the button makes sense, but it's also too
subtle. If you're waiting on an LLM response that takes a while, like a
"thinking" state, not having anything more clearly visible communicating
that the model is still in progress can make you think something is
wrong.

<img
src="https://github.com/user-attachments/assets/da64516e-5540-4294-97a2-e4542ce704f3"
width="700px" />

Release Notes:

- N/A
2025-03-05 21:39:29 -03:00
Julia Ryan
e505d6bf5b Git uncommit warning (#25977)
Adds a prompt when clicking the uncommit button when the current commit
is already present on a remote branch:

![screenshot showing
prompt](https://github.com/user-attachments/assets/d6421875-588e-4db0-aee0-a92f36bce94b)

Release Notes:

- N/A

---------

Co-authored-by: Conrad <conrad@zed.dev>
2025-03-05 15:56:51 -08:00
Julia Ryan
0200dda83d Disable uncommit button for parentless commits (#25983)
Closes #25976

There are a couple of states that this covers:
- upon `git init`, no footer is shown at all
- after 1 commit (or when on any parentless commit), the uncommit button
is ~disabled~ hidden
- otherwise, the commit button is shown

Also updated the button with a "meta" tooltip showing a human-readable
description and the git command.

Release Notes:

- N/A

---------

Co-authored-by: Nate Butler <iamnbutler@gmail.com>
2025-03-05 23:23:05 +00:00
Marshall Bowers
4db9ab15a7 elixir: Extract to zed-extensions/elixir repository (#26167)
This PR extracts the Elixir extension to the
[zed-extensions/elixir](https://github.com/zed-extensions/elixir)
repository.

Release Notes:

- N/A
2025-03-05 22:50:35 +00:00
Cole Miller
5daadc0d30 git: Add CHERRY_PICK_HEAD to the list of merge heads (#26145)
Attempt to fix an issue where conflicts from a cherry-pick don't get
cleared out of the git panel after being resolved.

Release Notes:

- Git Beta: Fixed resolution of conflicts from cherry-picks not being
reflected in the git panel
2025-03-05 22:31:45 +00:00
Marshall Bowers
431727fdd7 csharp: Extract to zed-extensions/csharp repository (#26166)
This PR extracts the C# extension to the
[zed-extensions/csharp](https://github.com/zed-extensions/csharp)
repository.

Release Notes:

- N/A
2025-03-05 22:23:49 +00:00
Marshall Bowers
cee98f872a git_ui: Fix typo in comment (#26162)
This PR fixes a typo in a comment.

Release Notes:

- N/A
2025-03-05 21:48:23 +00:00
Marshall Bowers
e99d68a66f git_ui: Scaffold out support for generating commit messages with an LLM (#26161)
This PR adds the rough structure needed to support generating commit
messages using an LLM.

This functionality is not yet surfaced to the user.

This is the current state, if you tweak the source to show the button:


https://github.com/user-attachments/assets/66d1fbc4-09f3-4277-84f4-e9c9ebab274c

Release Notes:

- N/A
2025-03-05 21:42:48 +00:00
renovate[bot]
6a3e8044b1 Update Rust crate anyhow to v1.0.97 (#26152)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [anyhow](https://redirect.github.com/dtolnay/anyhow) |
workspace.dependencies | patch | `1.0.96` -> `1.0.97` |

---

### Release Notes

<details>
<summary>dtolnay/anyhow (anyhow)</summary>

###
[`v1.0.97`](https://redirect.github.com/dtolnay/anyhow/releases/tag/1.0.97)

[Compare
Source](https://redirect.github.com/dtolnay/anyhow/compare/1.0.96...1.0.97)

-   Documentation improvements

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiIzOS4xODUuNCIsInVwZGF0ZWRJblZlciI6IjM5LjE4NS40IiwidGFyZ2V0QnJhbmNoIjoibWFpbiIsImxhYmVscyI6W119-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-05 16:32:07 -05:00
renovate[bot]
c2375a4164 Update Rust crate async-trait to v0.1.87 (#26158)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [async-trait](https://redirect.github.com/dtolnay/async-trait) |
workspace.dependencies | patch | `0.1.86` -> `0.1.87` |

---

### Release Notes

<details>
<summary>dtolnay/async-trait (async-trait)</summary>

###
[`v0.1.87`](https://redirect.github.com/dtolnay/async-trait/releases/tag/0.1.87)

[Compare
Source](https://redirect.github.com/dtolnay/async-trait/compare/0.1.86...0.1.87)

-   Documentation improvements

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiIzOS4xODUuNCIsInVwZGF0ZWRJblZlciI6IjM5LjE4NS40IiwidGFyZ2V0QnJhbmNoIjoibWFpbiIsImxhYmVscyI6W119-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-05 16:31:40 -05:00
Mikayla Maki
9d54e63a11 Fix git branches in non-active repository (#26148)
Release Notes:

- Git Beta: Fixed a bug where the branch selector would only show for
the first repository opened.

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
Co-authored-by: Richard Feldman <oss@rtfeldman.com>
2025-03-05 21:16:46 +00:00
Conrad Irwin
2a919ad1d0 git: Make repo selector wider (#26149)
…m_item()

Closes #ISSUE

Release Notes:

- git: Fixed repository selector being too narrow
2025-03-05 13:02:29 -07:00
Julia Ryan
f13b2fd811 Fix left clicking the close button in the switcher (#25979)
The close button on each tab previously only worked when you right
clicked it, presumably because on macos people were using `ctrl+tab` to
open the picker, and clicking with `ctrl` held registers as a right
click. Now it should work with either mouse button.

Release Notes:

- N/A

Co-authored-by: Conrad <conrad@zed.dev>
2025-03-05 11:50:39 -08:00
Marshall Bowers
7c39153160 assistant_tools: Add list-worktrees and read-file tools (#26147)
This PR adds two new tools to Assistant 2:

- `list-worktrees` - Lists the worktrees in a project
- `read-file` - Reads a file at the given path in the project

I don't see `list-worktrees` sticking around long-term: once we have
tools for listing files, those will include the worktree IDs along with
the paths. But making this tool available now allows the model to use
`read-file` when it otherwise wouldn't be able to.

Release Notes:

- N/A
2025-03-05 19:41:42 +00:00
Marshall Bowers
d0c2bef8c3 anthropic: Use an empty object if no tool input is provided (#26144)
This PR changes the default value when no input is provided with a tool
use from `null` to `{}`.

This fixes an issue I was seeing where tools that didn't accept input
were not being called correctly.
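
The gist of the change, sketched with `serde_json` (assumed as a dependency here; the real code lives in Zed's `anthropic` crate):

```rust
use serde_json::{json, Value};

/// Tool-use `input` sent to the API: when the model supplies nothing,
/// default to an empty JSON object rather than `null`.
fn tool_input_or_default(input: Option<Value>) -> Value {
    input.unwrap_or_else(|| json!({}))
}
```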

Release Notes:

- N/A
2025-03-05 19:17:44 +00:00
Cole Miller
87b3fefdd1 Fix panic when expanding a deletion hunk with blame open (#26130)
Closes #26118

Release Notes:

- Fixed a panic when expanding diff hunks while git blame is open
2025-03-05 13:07:36 -05:00
Marshall Bowers
66784c0b3f Fix language model selector (#26138)
This PR fixes the language model selector.

I tried to piece together the state prior to #25697 (the state it was in
at 11838cf89e) while retaining unrelated
changes that happened since then.

Release Notes:

- Fixed an issue where language models would not be authenticated until
after the model selector was opened (Preview only).
2025-03-05 12:48:10 -05:00
Cole Miller
ad9c508a72 Fix performance regression in multibuffer diff syncing (#26137)
This fixes a performance problem introduced in #25906 and caused by
calling `BufferDiff::snapshot` too frequently.
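
The underlying pattern, sketched generically (the actual fix is in the multibuffer diff syncing code and may differ in detail): take the expensive snapshot once and reuse it, rather than re-snapshotting per hunk.

```rust
struct BufferDiff;

#[derive(Clone)]
struct DiffSnapshot;

impl BufferDiff {
    fn snapshot(&self) -> DiffSnapshot {
        // Stand-in for an expensive clone of the diff state.
        DiffSnapshot
    }
}

fn sync_hunks(diff: &BufferDiff, hunk_count: usize) {
    // Hoisted: one snapshot shared by every hunk instead of one per hunk.
    let snapshot = diff.snapshot();
    for _hunk in 0..hunk_count {
        process_hunk(&snapshot);
    }
}

fn process_hunk(_snapshot: &DiffSnapshot) {}
```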

Release Notes:

- Fixed a performance regression related to buffer diffs

Co-authored-by: Conrad <conrad@zed.dev>
2025-03-05 12:25:51 -05:00
Max Brunsfeld
aaa506c061 Bump Tree-sitter to 0.25.3 for error recovery fixes (#26092)
For https://github.com/tree-sitter/tree-sitter/pull/4257

Release Notes:

- Fixed a hang that could occur when editing certain Zig files.
2025-03-05 08:50:19 -08:00
Bennet Bo Fenner
a602c50a6c assistant2: Allow adding directories as context that contain non-UTF8 files (#26135)
We would previously return an error if there was at least one non-UTF8
file. Now we just ignore them and only add text files. If no text files
are found we show an error.
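
A hedged, non-recursive sketch of that behavior (the assistant2 code also walks subdirectories and uses its own error type):

```rust
use std::{
    fs, io,
    path::{Path, PathBuf},
};

fn collect_text_files(dir: &Path) -> io::Result<Vec<(PathBuf, String)>> {
    let mut texts = Vec::new();
    for entry in fs::read_dir(dir)? {
        let path = entry?.path();
        if !path.is_file() {
            continue;
        }
        // `read_to_string` fails on non-UTF8 content; such files are skipped
        // instead of failing the whole directory add.
        if let Ok(contents) = fs::read_to_string(&path) {
            texts.push((path, contents));
        }
    }
    if texts.is_empty() {
        // Only error when nothing usable was found.
        return Err(io::Error::new(
            io::ErrorKind::InvalidData,
            "no text files found in directory",
        ));
    }
    Ok(texts)
}
```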

Release Notes:

- N/A
2025-03-05 16:47:43 +00:00
Marshall Bowers
728c161e8d Clean up language model selector (#26134)
This PR does some cleanup for the language model selector after
https://github.com/zed-industries/zed/pull/26090.

Release Notes:

- N/A
2025-03-05 16:18:01 +00:00
Asqar Arslanov
3975d8ea93 vim: Rename wrapping keybindings + document cursor wrapping (#25694)
https://github.com/zed-industries/zed/pull/25663#issuecomment-2686095807

Renamed the `vim::Backspace` and `vim::Space` actions to
`vim::WrappingLeft` and `vim::WrappingRight` respectively. The old names
are still available, but they are marked as deprecated and users are
advised to use the new names.

Also added a paragraph to the docs describing how to enable wrapping
cursor navigation.
2025-03-05 08:54:30 -07:00
Peter Tripp
2d050a8130 Fix SSH remotes running Nushell (#25613)
- Closes: https://github.com/zed-industries/zed/issues/21005

Nushell does not support `uname -sm`, so invoke `sh -c "uname -sm"` instead,
which also works under Nushell.
See https://github.com/nushell/nushell/issues/12570 for the choice quote: "being posix/bash compliant is a non-goal".
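
The idea in a small sketch (not the exact remote-detection code):

```rust
use std::process::Command;

/// Probe the remote platform without depending on the login shell:
/// `uname -sm` fails under Nushell, but `sh -c "uname -sm"` works as long
/// as a POSIX sh exists on the host.
fn remote_platform() -> std::io::Result<String> {
    let output = Command::new("sh").arg("-c").arg("uname -sm").output()?;
    Ok(String::from_utf8_lossy(&output.stdout).trim().to_string())
}
```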

Release Notes:

- Fixed ssh remotes running Nushell
2025-03-05 15:50:32 +00:00
Dino
e600e71c1c vim: Fix tab title when using !! and disable rerun button for terminal tasks (#26122)
These changes tackle two issues with running terminal commands via vim
mode:

- When using `!!`, the tab's title was set to `!!` instead of the
previously run command; these changes fix that so the tab's title always
shows the previous command when it is re-run with `!!`
- For a terminal command, pressing the rerun button would actually bring
up the task palette, so the rerun button is now disabled when the
terminal tab was spawned via a vim command

Closes #25800 

Release Notes:

- Fixed the terminal tab title when using `!!` to rerun the last command
- Improved the terminal tab for commands run via vim mode by disabling
the rerun button, since Zed does not support rerunning them
2025-03-05 08:47:49 -07:00
Marshall Bowers
82d85fd2ed deno: Extract to zed-extensions/deno repository (#26129)
This PR extracts the Deno extension to the
[zed-extensions/deno](https://github.com/zed-extensions/deno)
repository.

Release Notes:

- N/A
2025-03-05 15:31:21 +00:00
smit
e061ebb46c editor: Fix cmd + click on a URL not working sometimes (#26128)
Closes #25647

This PR fixes two issues related to cmd + click on URL:

1. Normally, cmd + clicking a URL opens the browser. Now alt + tab back to
Zed. If you cmd + click the link again, it won't work until you plain-click
somewhere else in the buffer; it won't even show the underline.

2. Again, cmd + click a URL and it opens the browser. Now alt + tab back to
Zed. If you cmd + click somewhere else in the buffer, such as normal text,
and then try to hover over the URL, the underline won't show and cmd +
clicking it won't work, unless you again plain-click somewhere else.

Problem:

When clicking, we set a pending anchor (for the selection), and on mouse up
we clear it. This works in the normal case without any modifier pressed.

But with the cmd modifier, we set the pending anchor (when
`SelectPhase::Begin` fires) and never clear it after using that data.

Fix:

Once we've used the selection/anchor data to figure out where to navigate
(URL, definition, etc.), we clear the selection just like a normal click
does. This doesn't need to happen after the navigation task, so we do it
right after using the data.
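
A hedged sketch of the fix; the struct and field names are stand-ins for Zed's editor internals:

```rust
struct PendingSelection;

struct Editor {
    pending_selection: Option<PendingSelection>,
}

impl Editor {
    fn cmd_click(&mut self) {
        // Use the pending selection set on mouse-down to decide where to
        // navigate (URL, definition, ...), then clear it immediately,
        // just as a plain click clears it on mouse-up.
        if let Some(_pending) = self.pending_selection.take() {
            // ...resolve the navigation target and spawn the navigate task...
        }
    }
}
```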

Before:


https://github.com/user-attachments/assets/b33d93fc-f490-4fa4-ae22-1da1fd6b77a9

After:


https://github.com/user-attachments/assets/028f039a-cd13-4651-b461-3ba52f2526de


Release Notes:

- Fixed an issue where cmd + click on a URL was not working sometimes.
2025-03-05 20:58:18 +05:30
382 changed files with 13568 additions and 8817 deletions

View File

@@ -26,3 +26,6 @@ rustflags = [
"-C",
"target-feature=+crt-static", # This fixes the linking issue when compiling livekit on Windows
]
[env]
MACOSX_DEPLOYMENT_TARGET = "10.15.7"

View File

@@ -236,12 +236,24 @@ jobs:
if: always()
run: rm -rf ./../.cargo
windows_tests:
windows_clippy:
timeout-minutes: 60
name: (Windows) Run Clippy and tests
name: (Windows) Run Clippy
if: github.repository_owner == 'zed-industries'
runs-on: hosted-windows-2
steps:
# Temporarily Collect some metadata about the hardware behind our runners.
- name: GHA Runner Info
run: |
Invoke-RestMethod -Headers @{"Metadata"="true"} -Method GET -Uri "http://169.254.169.254/metadata/instance/compute?api-version=2023-07-01" |
ConvertTo-Json -Depth 10 |
jq "{ vm_size: .vmSize, location: .location, os_disk_gb: (.storageProfile.osDisk.diskSizeGB | tonumber), rs_disk_gb: (.storageProfile.resourceDisk.size | tonumber / 1024) }"
@{
Cores = (Get-CimInstance Win32_Processor).NumberOfCores
vCPUs = (Get-CimInstance Win32_Processor).NumberOfLogicalProcessors
RamGb = [math]::Round((Get-CimInstance Win32_ComputerSystem).TotalPhysicalMemory / 1GB, 2)
cpuid = (Get-CimInstance Win32_Processor).Name.Trim()
} | ConvertTo-Json
# more info here:- https://github.com/rust-lang/cargo/issues/13020
- name: Enable longer pathnames for git
run: git config --system core.longpaths true
@@ -275,6 +287,69 @@ jobs:
working-directory: ${{ env.ZED_WORKSPACE }}
run: ./script/clippy.ps1
- name: Check dev drive space
working-directory: ${{ env.ZED_WORKSPACE }}
# `setup-dev-driver.ps1` creates a 100GB drive, with CI taking up ~45GB of the drive.
run: ./script/exit-ci-if-dev-drive-is-full.ps1 95
# Since the Windows runners are stateful, so we need to remove the config file to prevent potential bug.
- name: Clean CI config file
if: always()
run: |
if (Test-Path "${{ env.CARGO_HOME }}/config.toml") {
Remove-Item -Path "${{ env.CARGO_HOME }}/config.toml" -Force
}
# Windows CI takes twice as long as our other platforms and fast github hosted runners are expensive.
# But we still want to do CI, so let's only run tests on main and come back to this when we're
# ready to self host our Windows CI (e.g. during the push for full Windows support)
windows_tests:
timeout-minutes: 60
name: (Windows) Run Tests
if: ${{ github.repository_owner == 'zed-industries' && (github.ref == 'refs/heads/main' || contains(github.event.pull_request.labels.*.name, 'windows')) }}
runs-on: hosted-windows-2
steps:
# Temporarily Collect some metadata about the hardware behind our runners.
- name: GHA Runner Info
run: |
Invoke-RestMethod -Headers @{"Metadata"="true"} -Method GET -Uri "http://169.254.169.254/metadata/instance/compute?api-version=2023-07-01" |
ConvertTo-Json -Depth 10 |
jq "{ vm_size: .vmSize, location: .location, os_disk_gb: (.storageProfile.osDisk.diskSizeGB | tonumber), rs_disk_gb: (.storageProfile.resourceDisk.size | tonumber / 1024) }"
@{
Cores = (Get-CimInstance Win32_Processor).NumberOfCores
vCPUs = (Get-CimInstance Win32_Processor).NumberOfLogicalProcessors
RamGb = [math]::Round((Get-CimInstance Win32_ComputerSystem).TotalPhysicalMemory / 1GB, 2)
cpuid = (Get-CimInstance Win32_Processor).Name.Trim()
} | ConvertTo-Json
# more info here:- https://github.com/rust-lang/cargo/issues/13020
- name: Enable longer pathnames for git
run: git config --system core.longpaths true
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Create Dev Drive using ReFS
run: ./script/setup-dev-driver.ps1
# actions/checkout does not let us clone into anywhere outside ${{ github.workspace }}, so we have to copy the clone...
- name: Copy Git Repo to Dev Drive
run: |
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.ZED_WORKSPACE }}" -Recurse
- name: Cache dependencies
uses: swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
workspaces: ${{ env.ZED_WORKSPACE }}
cache-provider: "github"
- name: Configure CI
run: |
mkdir -p ${{ env.CARGO_HOME }} -ErrorAction Ignore
cp ./.cargo/ci-config.toml ${{ env.CARGO_HOME }}/config.toml
- name: Run tests
uses: ./.github/actions/run_tests_windows
with:
@@ -292,7 +367,10 @@ jobs:
# Since the Windows runners are stateful, so we need to remove the config file to prevent potential bug.
- name: Clean CI config file
if: always()
run: Remove-Item -Path "${{ env.CARGO_HOME }}/config.toml" -Force
run: |
if (Test-Path "${{ env.CARGO_HOME }}/config.toml") {
Remove-Item -Path "${{ env.CARGO_HOME }}/config.toml" -Force
}
bundle-mac:
timeout-minutes: 120
@@ -422,6 +500,13 @@ jobs:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
- name: Upload Linux remote server to workflow run if main branch or specific label
uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.gz
path: target/zed-remote-server-linux-x86_64.gz
- name: Upload app bundle to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
@@ -470,6 +555,13 @@ jobs:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
- name: Upload Linux remote server to workflow run if main branch or specific label
uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.gz
path: target/zed-remote-server-linux-aarch64.gz
- name: Upload app bundle to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:

341
Cargo.lock generated

File diff suppressed because it is too large

View File

@@ -3,10 +3,12 @@ resolver = "2"
members = [
"crates/activity_indicator",
"crates/anthropic",
"crates/askpass",
"crates/assets",
"crates/assistant",
"crates/assistant2",
"crates/assistant_context_editor",
"crates/assistant_scripting",
"crates/assistant_settings",
"crates/assistant_slash_command",
"crates/assistant_slash_commands",
@@ -117,7 +119,6 @@ members = [
"crates/rope",
"crates/rpc",
"crates/schema_generator",
"crates/scripting_tool",
"crates/search",
"crates/semantic_index",
"crates/semantic_version",
@@ -168,15 +169,9 @@ members = [
# Extensions
#
"extensions/csharp",
"extensions/deno",
"extensions/elixir",
"extensions/emmet",
"extensions/erlang",
"extensions/glsl",
"extensions/haskell",
"extensions/html",
"extensions/lua",
"extensions/perplexity",
"extensions/proto",
"extensions/purescript",
@@ -210,6 +205,7 @@ edition = "2021"
activity_indicator = { path = "crates/activity_indicator" }
ai = { path = "crates/ai" }
anthropic = { path = "crates/anthropic" }
askpass = { path = "crates/askpass" }
assets = { path = "crates/assets" }
assistant = { path = "crates/assistant" }
assistant2 = { path = "crates/assistant2" }
@@ -322,7 +318,7 @@ reqwest_client = { path = "crates/reqwest_client" }
rich_text = { path = "crates/rich_text" }
rope = { path = "crates/rope" }
rpc = { path = "crates/rpc" }
scripting_tool = { path = "crates/scripting_tool" }
assistant_scripting = { path = "crates/assistant_scripting" }
search = { path = "crates/search" }
semantic_index = { path = "crates/semantic_index" }
semantic_version = { path = "crates/semantic_version" }
@@ -374,7 +370,7 @@ zeta = { path = "crates/zeta" }
#
aho-corasick = "1.1"
alacritty_terminal = { git = "https://github.com/zed-industries/alacritty.git", rev = "03c2907b44b4189aac5fdeaea331f5aab5c7072e" }
alacritty_terminal = { git = "https://github.com/zed-industries/alacritty.git", branch = "add-hush-login-flag" }
any_vec = "0.14"
anyhow = "1.0.86"
arrayvec = { version = "0.7.4", features = ["serde"] }
@@ -455,7 +451,7 @@ livekit = { git = "https://github.com/zed-industries/livekit-rust-sdks", rev = "
], default-features = false }
log = { version = "0.4.16", features = ["kv_unstable_serde", "serde"] }
markup5ever_rcdom = "0.3.0"
mlua = { version = "0.10", features = ["lua54", "vendored"] }
mlua = { version = "0.10", features = ["lua54", "vendored", "async", "send"] }
nanoid = "0.4"
nbformat = { version = "0.10.0" }
nix = "0.29"
@@ -541,7 +537,7 @@ tiny_http = "0.8"
toml = "0.8"
tokio = { version = "1" }
tower-http = "0.4.4"
tree-sitter = { version = "0.25.2", features = ["wasm"] }
tree-sitter = { version = "0.25.3", features = ["wasm"] }
tree-sitter-bash = "0.23"
tree-sitter-c = "0.23"
tree-sitter-cpp = "0.23"
@@ -604,9 +600,11 @@ features = [
version = "0.58"
features = [
"implement",
"Foundation_Collections",
"Foundation_Numerics",
"Storage",
"System_Threading",
"UI_StartScreen",
"UI_ViewManagement",
"Wdk_System_SystemServices",
"Win32_Globalization",

10
assets/icons/ai_edit.svg Normal file
View File

@@ -0,0 +1,10 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M12.5871 5.40624C12.8514 5.14195 13 4.78346 13 4.40965C13 4.03583 12.8516 3.67731 12.5873 3.41295C12.323 3.14859 11.9645 3.00005 11.5907 3C11.2169 2.99995 10.8584 3.14841 10.594 3.4127L3.92098 10.0874C3.80488 10.2031 3.71903 10.3456 3.67097 10.5024L3.01047 12.6784C2.99754 12.7217 2.99657 12.7676 3.00764 12.8113C3.01872 12.8551 3.04143 12.895 3.07337 12.9269C3.1053 12.9588 3.14528 12.9815 3.18905 12.9925C3.23282 13.0035 3.27875 13.0024 3.32197 12.9894L5.49849 12.3294C5.65508 12.2818 5.79758 12.1964 5.91349 12.0809L12.5871 5.40624Z" stroke="black" stroke-width="1.33" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M10 4L12 6" stroke="black" stroke-width="1.33" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M6.38818 3.53598V2.53598" stroke="black" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M9.56982 12.6995L9.56982 13.6995" stroke="black" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M2.38818 6.53598H3.38818" stroke="black" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M13.5698 9.69949L12.5698 9.69949" stroke="black" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M4.38818 4.53598L3.38818 3.53598" stroke="black" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M11.5698 11.6995L12.5698 12.6995" stroke="black" stroke-width="1.2" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


View File

@@ -0,0 +1,3 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M6.36197 1.67985C5.3748 1.41534 4.36011 2.00117 4.0956 2.98834L2.17985 10.138C1.91534 11.1252 2.50117 12.1399 3.48833 12.4044L10.638 14.3202C11.6252 14.5847 12.6399 13.9988 12.9044 13.0117L14.8202 5.86197C15.0847 4.8748 14.4988 3.86012 13.5117 3.59561L6.36197 1.67985ZM10.0457 4.58266C9.77896 4.51119 9.50479 4.66948 9.43332 4.93621L8.76235 7.44028C8.69088 7.70701 8.84917 7.98118 9.11591 8.05265L11.62 8.72362C11.8867 8.79509 12.1609 8.6368 12.2324 8.37006L12.9033 5.86599C12.9748 5.59926 12.8165 5.32509 12.5498 5.25362L10.0457 4.58266Z" fill="black"/>
</svg>


View File

@@ -0,0 +1,7 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M6.5 13L1.5 5H11.5L6.5 13Z" fill="black"/>
<path d="M14 9H9L11.5 5L14 9Z" fill="black" fill-opacity="0.75"/>
<path d="M9 9L14 9L11.5 13L9 9Z" fill="black" fill-opacity="0.65"/>
<path d="M14 5L15.25 7L12.75 7L14 5Z" fill="black" fill-opacity="0.5"/>
<path d="M14 9L12.75 7H15.25L14 9Z" fill="black" fill-opacity="0.55"/>
</svg>


View File

@@ -475,9 +475,7 @@
"ctrl-alt-delete": "editor::DeleteToNextSubwordEnd",
"ctrl-alt-d": "editor::DeleteToNextSubwordEnd",
"ctrl-alt-left": "editor::MoveToPreviousSubwordStart",
// "ctrl-alt-b": "editor::MoveToPreviousSubwordStart",
"ctrl-alt-right": "editor::MoveToNextSubwordEnd",
"ctrl-alt-f": "editor::MoveToNextSubwordEnd",
"ctrl-alt-shift-left": "editor::SelectToPreviousSubwordStart",
"ctrl-alt-shift-b": "editor::SelectToPreviousSubwordStart",
"ctrl-alt-shift-right": "editor::SelectToNextSubwordEnd",
@@ -739,7 +737,7 @@
"tab": "git_panel::FocusEditor",
"shift-tab": "git_panel::FocusEditor",
"escape": "git_panel::ToggleFocus",
"ctrl-enter": "git::ShowCommitEditor",
"ctrl-enter": "git::Commit",
"alt-enter": "menu::SecondaryConfirm"
}
},
@@ -747,13 +745,20 @@
"context": "GitCommit > Editor",
"bindings": {
"enter": "editor::Newline",
"ctrl-enter": "git::Commit"
"ctrl-enter": "git::Commit",
"alt-l": "git::GenerateCommitMessage"
}
},
{
"context": "GitDiff > Editor",
"bindings": {
"ctrl-enter": "git::ShowCommitEditor"
"ctrl-enter": "git::Commit"
}
},
{
"context": "AskPass > Editor",
"bindings": {
"enter": "menu::Confirm"
}
},
{
@@ -763,7 +768,8 @@
"tab": "git_panel::FocusChanges",
"shift-tab": "git_panel::FocusChanges",
"ctrl-enter": "git::Commit",
"alt-up": "git_panel::FocusChanges"
"alt-up": "git_panel::FocusChanges",
"alt-l": "git::GenerateCommitMessage"
}
},
{

View File

@@ -108,8 +108,8 @@
"cmd-right": ["editor::MoveToEndOfLine", { "stop_at_soft_wraps": true }],
"ctrl-e": ["editor::MoveToEndOfLine", { "stop_at_soft_wraps": false }],
"end": ["editor::MoveToEndOfLine", { "stop_at_soft_wraps": true }],
"cmd-up": "editor::MoveToStartOfExcerpt",
"cmd-down": "editor::MoveToEndOfExcerpt",
"cmd-up": "editor::MoveToBeginning",
"cmd-down": "editor::MoveToEnd",
"cmd-home": "editor::MoveToBeginning", // Typed via `cmd-fn-left`
"cmd-end": "editor::MoveToEnd", // Typed via `cmd-fn-right`
"shift-up": "editor::SelectUp",
@@ -124,8 +124,8 @@
"alt-shift-right": "editor::SelectToNextWordEnd", // cursorWordRightSelect
"ctrl-shift-up": "editor::SelectToStartOfParagraph",
"ctrl-shift-down": "editor::SelectToEndOfParagraph",
"cmd-shift-up": "editor::SelectToStartOfExcerpt",
"cmd-shift-down": "editor::SelectToEndOfExcerpt",
"cmd-shift-up": "editor::SelectToBeginning",
"cmd-shift-down": "editor::SelectToEnd",
"cmd-a": "editor::SelectAll",
"cmd-l": "editor::SelectLine",
"cmd-shift-i": "editor::Format",
@@ -172,6 +172,16 @@
"alt-enter": "editor::OpenSelectionsInMultibuffer"
}
},
{
"context": "Editor && multibuffer",
"use_key_equivalents": true,
"bindings": {
"cmd-up": "editor::MoveToStartOfExcerpt",
"cmd-down": "editor::MoveToStartOfNextExcerpt",
"cmd-shift-up": "editor::SelectToStartOfExcerpt",
"cmd-shift-down": "editor::SelectToStartOfNextExcerpt"
}
},
{
"context": "Editor && mode == full && edit_prediction",
"use_key_equivalents": true,
@@ -760,14 +770,21 @@
"tab": "git_panel::FocusEditor",
"shift-tab": "git_panel::FocusEditor",
"escape": "git_panel::ToggleFocus",
"cmd-enter": "git::ShowCommitEditor"
"cmd-enter": "git::Commit"
}
},
{
"context": "GitDiff > Editor",
"use_key_equivalents": true,
"bindings": {
"cmd-enter": "git::ShowCommitEditor"
"cmd-enter": "git::Commit"
}
},
{
"context": "AskPass > Editor",
"use_key_equivalents": true,
"bindings": {
"enter": "menu::Confirm"
}
},
{
@@ -778,7 +795,9 @@
"cmd-enter": "git::Commit",
"tab": "git_panel::FocusChanges",
"shift-tab": "git_panel::FocusChanges",
"alt-up": "git_panel::FocusChanges"
"alt-up": "git_panel::FocusChanges",
"shift-escape": "git::ExpandCommitEditor",
"alt-tab": "git::GenerateCommitMessage"
}
},
{
@@ -786,7 +805,8 @@
"use_key_equivalents": true,
"bindings": {
"enter": "editor::Newline",
"cmd-enter": "git::Commit"
"cmd-enter": "git::Commit",
"alt-tab": "git::GenerateCommitMessage"
}
},
{

View File

@@ -6,7 +6,7 @@
"a": ["vim::PushObject", { "around": true }],
"left": "vim::Left",
"h": "vim::Left",
"backspace": "vim::Backspace",
"backspace": "vim::WrappingLeft",
"down": "vim::Down",
"ctrl-j": "vim::Down",
"j": "vim::Down",
@@ -20,7 +20,7 @@
"k": "vim::Up",
"right": "vim::Right",
"l": "vim::Right",
"space": "vim::Space",
"space": "vim::WrappingRight",
"end": "vim::EndOfLine",
"$": "vim::EndOfLine",
"^": "vim::FirstNonWhitespace",
@@ -247,7 +247,8 @@
"context": "VimControl && VimCount",
"bindings": {
"0": ["vim::Number", 0],
":": "vim::CountCommand"
":": "vim::CountCommand",
"%": "vim::GoToPercentage"
}
},
{

View File

@@ -720,8 +720,8 @@
"remove_trailing_whitespace_on_save": true,
// Whether to start a new line with a comment when a previous line is a comment as well.
"extend_comment_on_newline": true,
// Whether or not to ensure there's a single newline at the end of a buffer
// when saving it.
// Removes any lines containing only whitespace at the end of the file and
// ensures just one newline at the end.
"ensure_final_newline_on_save": true,
// Whether or not to perform a buffer format before saving
//
@@ -845,7 +845,7 @@
// "hunk_style": "transparent"
// 2. Show unstaged hunks with a pattern background:
// "hunk_style": "pattern"
"hunk_style": "transparent"
"hunk_style": "staged_border"
},
// Configuration for how direnv configuration should be loaded. May take 2 values:
// 1. Load direnv configuration using `direnv export json` directly.
@@ -1055,7 +1055,6 @@
// }
//
"file_types": {
"Plain Text": ["txt"],
"JSONC": ["**/.zed/**/*.json", "**/zed/**/*.json", "**/Zed/**/*.json", "**/.vscode/**/*.json"],
"Shell Script": [".env.*"]
},
@@ -1175,6 +1174,7 @@
"format_on_save": "off",
"use_on_type_format": false,
"allow_rewrap": "anywhere",
"soft_wrap": "bounded",
"prettier": {
"allowed": true
}
@@ -1298,6 +1298,11 @@
// "semi": false,
// "singleQuote": true
},
// Settings for auto-closing of JSX tags.
"jsx_tag_auto_close": {
// // Whether to auto-close JSX tags.
// "enabled": true
},
// LSP Specific settings.
"lsp": {
// Specify the LSP name as a key here.

View File

@@ -9,7 +9,10 @@ use gpui::{
};
use language::{LanguageRegistry, LanguageServerBinaryStatus, LanguageServerId};
use lsp::LanguageServerName;
use project::{EnvironmentErrorMessage, LanguageServerProgress, Project, WorktreeId};
use project::{
EnvironmentErrorMessage, LanguageServerProgress, LspStoreEvent, Project,
ProjectEnvironmentEvent, WorktreeId,
};
use smallvec::SmallVec;
use std::{cmp::Reverse, fmt::Write, sync::Arc, time::Duration};
use ui::{prelude::*, ButtonLike, ContextMenu, PopoverMenu, PopoverMenuHandle, Tooltip};
@@ -73,7 +76,22 @@ impl ActivityIndicator {
})
.detach();
cx.observe(&project, |_, _, cx| cx.notify()).detach();
cx.subscribe(
&project.read(cx).lsp_store(),
|_, _, event, cx| match event {
LspStoreEvent::LanguageServerUpdate { .. } => cx.notify(),
_ => {}
},
)
.detach();
cx.subscribe(
&project.read(cx).environment().clone(),
|_, _, event, cx| match event {
ProjectEnvironmentEvent::ErrorsUpdated => cx.notify(),
},
)
.detach();
if let Some(auto_updater) = auto_updater.as_ref() {
cx.observe(auto_updater, |_, _, cx| cx.notify()).detach();
@@ -204,7 +222,7 @@ impl ActivityIndicator {
message: error.0.clone(),
on_click: Some(Arc::new(move |this, window, cx| {
this.project.update(cx, |project, cx| {
project.remove_environment_error(cx, worktree_id);
project.remove_environment_error(worktree_id, cx);
});
window.dispatch_action(Box::new(workspace::OpenLog), cx);
})),

21
crates/askpass/Cargo.toml Normal file
View File

@@ -0,0 +1,21 @@
[package]
name = "askpass"
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "GPL-3.0-or-later"
[lints]
workspace = true
[lib]
path = "src/askpass.rs"
[dependencies]
anyhow.workspace = true
futures.workspace = true
gpui.workspace = true
smol.workspace = true
tempfile.workspace = true
util.workspace = true
which.workspace = true

View File

@@ -0,0 +1,194 @@
use std::path::{Path, PathBuf};
use std::time::Duration;
#[cfg(unix)]
use anyhow::Context as _;
use futures::channel::{mpsc, oneshot};
#[cfg(unix)]
use futures::{io::BufReader, AsyncBufReadExt as _};
#[cfg(unix)]
use futures::{select_biased, AsyncWriteExt as _, FutureExt as _};
use futures::{SinkExt, StreamExt};
use gpui::{AsyncApp, BackgroundExecutor, Task};
#[cfg(unix)]
use smol::fs;
#[cfg(unix)]
use smol::{fs::unix::PermissionsExt as _, net::unix::UnixListener};
#[cfg(unix)]
use util::ResultExt as _;
#[derive(PartialEq, Eq)]
pub enum AskPassResult {
CancelledByUser,
Timedout,
}
pub struct AskPassDelegate {
tx: mpsc::UnboundedSender<(String, oneshot::Sender<String>)>,
_task: Task<()>,
}
impl AskPassDelegate {
pub fn new(
cx: &mut AsyncApp,
password_prompt: impl Fn(String, oneshot::Sender<String>, &mut AsyncApp) + Send + Sync + 'static,
) -> Self {
let (tx, mut rx) = mpsc::unbounded::<(String, oneshot::Sender<String>)>();
let task = cx.spawn(|mut cx| async move {
while let Some((prompt, channel)) = rx.next().await {
password_prompt(prompt, channel, &mut cx);
}
});
Self { tx, _task: task }
}
pub async fn ask_password(&mut self, prompt: String) -> anyhow::Result<String> {
let (tx, rx) = oneshot::channel();
self.tx.send((prompt, tx)).await?;
Ok(rx.await?)
}
}
#[cfg(unix)]
pub struct AskPassSession {
script_path: PathBuf,
_askpass_task: Task<()>,
askpass_opened_rx: Option<oneshot::Receiver<()>>,
askpass_kill_master_rx: Option<oneshot::Receiver<()>>,
}
#[cfg(unix)]
impl AskPassSession {
/// This will create a new AskPassSession.
/// You must retain this session until the master process exits.
#[must_use]
pub async fn new(
executor: &BackgroundExecutor,
mut delegate: AskPassDelegate,
) -> anyhow::Result<Self> {
let temp_dir = tempfile::Builder::new().prefix("zed-askpass").tempdir()?;
let askpass_socket = temp_dir.path().join("askpass.sock");
let askpass_script_path = temp_dir.path().join("askpass.sh");
let (askpass_opened_tx, askpass_opened_rx) = oneshot::channel::<()>();
let listener =
UnixListener::bind(&askpass_socket).context("failed to create askpass socket")?;
let (askpass_kill_master_tx, askpass_kill_master_rx) = oneshot::channel::<()>();
let mut kill_tx = Some(askpass_kill_master_tx);
let askpass_task = executor.spawn(async move {
let mut askpass_opened_tx = Some(askpass_opened_tx);
while let Ok((mut stream, _)) = listener.accept().await {
if let Some(askpass_opened_tx) = askpass_opened_tx.take() {
askpass_opened_tx.send(()).ok();
}
let mut buffer = Vec::new();
let mut reader = BufReader::new(&mut stream);
if reader.read_until(b'\0', &mut buffer).await.is_err() {
buffer.clear();
}
let prompt = String::from_utf8_lossy(&buffer);
if let Some(password) = delegate
.ask_password(prompt.to_string())
.await
.context("failed to get askpass password")
.log_err()
{
stream.write_all(password.as_bytes()).await.log_err();
} else {
if let Some(kill_tx) = kill_tx.take() {
kill_tx.send(()).log_err();
}
// note: we expect the caller to drop this task when it's done.
// We need to keep the stream open until the caller is done to avoid
// spurious errors from ssh.
std::future::pending::<()>().await;
drop(stream);
}
}
drop(temp_dir)
});
anyhow::ensure!(
which::which("nc").is_ok(),
"Cannot find `nc` command (netcat), which is required to connect over SSH."
);
// Create an askpass script that communicates back to this process.
let askpass_script = format!(
"{shebang}\n{print_args} | {nc} -U {askpass_socket} 2> /dev/null \n",
// on macOS `brew install netcat` provides the GNU netcat implementation
// which does not support -U.
nc = if cfg!(target_os = "macos") {
"/usr/bin/nc"
} else {
"nc"
},
askpass_socket = askpass_socket.display(),
print_args = "printf '%s\\0' \"$@\"",
shebang = "#!/bin/sh",
);
fs::write(&askpass_script_path, askpass_script).await?;
fs::set_permissions(&askpass_script_path, std::fs::Permissions::from_mode(0o755)).await?;
Ok(Self {
script_path: askpass_script_path,
_askpass_task: askpass_task,
askpass_kill_master_rx: Some(askpass_kill_master_rx),
askpass_opened_rx: Some(askpass_opened_rx),
})
}
pub fn script_path(&self) -> &Path {
&self.script_path
}
// This will run the askpass task forever, resolving as many authentication requests as needed.
// The caller is responsible for examining the result of their own commands and cancelling this
// future when this is no longer needed. Note that this can only be called once, but due to the
// drop order this takes an &mut, so you can `drop()` it after you're done with the master process.
pub async fn run(&mut self) -> AskPassResult {
let connection_timeout = Duration::from_secs(10);
let askpass_opened_rx = self.askpass_opened_rx.take().expect("Only call run once");
let askpass_kill_master_rx = self
.askpass_kill_master_rx
.take()
.expect("Only call run once");
select_biased! {
_ = askpass_opened_rx.fuse() => {
// Note: this await can only resolve after we are dropped.
askpass_kill_master_rx.await.ok();
return AskPassResult::CancelledByUser
}
_ = futures::FutureExt::fuse(smol::Timer::after(connection_timeout)) => {
return AskPassResult::Timedout
}
}
}
}
#[cfg(not(unix))]
pub struct AskPassSession {
path: PathBuf,
}
#[cfg(not(unix))]
impl AskPassSession {
pub async fn new(_: &BackgroundExecutor, _: AskPassDelegate) -> anyhow::Result<Self> {
Ok(Self {
path: PathBuf::new(),
})
}
pub fn script_path(&self) -> &Path {
&self.path
}
pub async fn run(&mut self) -> AskPassResult {
futures::FutureExt::fuse(smol::Timer::after(Duration::from_secs(10))).await;
AskPassResult::Timedout
}
}

View File

@@ -110,7 +110,7 @@ impl ConfigurationView {
.bg(cx.theme().colors().surface_background)
.border_1()
.border_color(cx.theme().colors().border_variant)
.rounded_md()
.rounded_sm()
.when(configuration_view.is_none(), |this| {
this.child(div().child(Label::new(format!(
"No configuration view for {}",

View File

@@ -35,10 +35,10 @@ use language_model::{
report_assistant_event, LanguageModel, LanguageModelRegistry, LanguageModelRequest,
LanguageModelRequestMessage, LanguageModelTextStream, Role,
};
use language_model_selector::inline_language_model_selector;
use language_model_selector::{LanguageModelSelector, LanguageModelSelectorPopoverMenu};
use multi_buffer::MultiBufferRow;
use parking_lot::Mutex;
use project::{CodeAction, ProjectTransaction};
use project::{CodeAction, LspAction, ProjectTransaction};
use prompt_store::PromptBuilder;
use rope::Rope;
use settings::{update_settings_file, Settings, SettingsStore};
@@ -1425,6 +1425,7 @@ enum PromptEditorEvent {
struct PromptEditor {
id: InlineAssistId,
editor: Entity<Editor>,
language_model_selector: Entity<LanguageModelSelector>,
edited_since_done: bool,
gutter_dimensions: Arc<Mutex<GutterDimensions>>,
prompt_history: VecDeque<String>,
@@ -1438,7 +1439,6 @@ struct PromptEditor {
_token_count_subscriptions: Vec<Subscription>,
workspace: Option<WeakEntity<Workspace>>,
show_rate_limit_notice: bool,
fs: Arc<dyn Fs>,
}
#[derive(Copy, Clone)]
@@ -1567,7 +1567,6 @@ impl Render for PromptEditor {
]
}
});
let fs_clone = self.fs.clone();
h_flex()
.key_context("PromptEditor")
@@ -1590,13 +1589,29 @@ impl Render for PromptEditor {
.w(gutter_dimensions.full_width() + (gutter_dimensions.margin / 2.0))
.justify_center()
.gap_2()
.child(inline_language_model_selector(move |model, cx| {
update_settings_file::<AssistantSettings>(
fs_clone.clone(),
cx,
move |settings, _| settings.set_model(model.clone()),
);
}))
.child(LanguageModelSelectorPopoverMenu::new(
self.language_model_selector.clone(),
IconButton::new("context", IconName::SettingsAlt)
.shape(IconButtonShape::Square)
.icon_size(IconSize::Small)
.icon_color(Color::Muted),
move |window, cx| {
Tooltip::with_meta(
format!(
"Using {}",
LanguageModelRegistry::read_global(cx)
.active_model()
.map(|model| model.name().0)
.unwrap_or_else(|| "No model selected".into()),
),
None,
"Change Model",
window,
cx,
)
},
gpui::Corner::TopRight,
))
.map(|el| {
let CodegenStatus::Error(error) = self.codegen.read(cx).status(cx) else {
return el;
@@ -1709,8 +1724,21 @@ impl PromptEditor {
let mut this = Self {
id,
fs,
editor: prompt_editor,
language_model_selector: cx.new(|cx| {
let fs = fs.clone();
LanguageModelSelector::new(
move |model, cx| {
update_settings_file::<AssistantSettings>(
fs.clone(),
cx,
move |settings, _| settings.set_model(model.clone()),
);
},
window,
cx,
)
}),
edited_since_done: false,
gutter_dimensions,
prompt_history,
@@ -3541,10 +3569,10 @@ impl CodeActionProvider for AssistantCodeActionProvider {
Task::ready(Ok(vec![CodeAction {
server_id: language::LanguageServerId(0),
range: snapshot.anchor_before(range.start)..snapshot.anchor_after(range.end),
lsp_action: lsp::CodeAction {
lsp_action: LspAction::Action(Box::new(lsp::CodeAction {
title: "Fix with Assistant".into(),
..Default::default()
},
})),
}]))
} else {
Task::ready(Ok(Vec::new()))

View File

@@ -19,7 +19,7 @@ use language_model::{
report_assistant_event, LanguageModelRegistry, LanguageModelRequest,
LanguageModelRequestMessage, Role,
};
use language_model_selector::inline_language_model_selector;
use language_model_selector::{LanguageModelSelector, LanguageModelSelectorPopoverMenu};
use prompt_store::PromptBuilder;
use settings::{update_settings_file, Settings};
use std::{
@@ -487,9 +487,9 @@ enum PromptEditorEvent {
struct PromptEditor {
id: TerminalInlineAssistId,
fs: Arc<dyn Fs>,
height_in_lines: u8,
editor: Entity<Editor>,
language_model_selector: Entity<LanguageModelSelector>,
edited_since_done: bool,
prompt_history: VecDeque<String>,
prompt_history_ix: Option<usize>,
@@ -624,8 +624,6 @@ impl Render for PromptEditor {
}
};
let fs_clone = self.fs.clone();
h_flex()
.bg(cx.theme().colors().editor_background)
.border_y_1()
@@ -643,13 +641,29 @@ impl Render for PromptEditor {
.w_12()
.justify_center()
.gap_2()
.child(inline_language_model_selector(move |model, cx| {
update_settings_file::<AssistantSettings>(
fs_clone.clone(),
cx,
move |settings, _| settings.set_model(model.clone()),
);
}))
.child(LanguageModelSelectorPopoverMenu::new(
self.language_model_selector.clone(),
IconButton::new("change-model", IconName::SettingsAlt)
.shape(IconButtonShape::Square)
.icon_size(IconSize::Small)
.icon_color(Color::Muted),
move |window, cx| {
Tooltip::with_meta(
format!(
"Using {}",
LanguageModelRegistry::read_global(cx)
.active_model()
.map(|model| model.name().0)
.unwrap_or_else(|| "No model selected".into()),
),
None,
"Change Model",
window,
cx,
)
},
gpui::Corner::TopRight,
))
.children(
if let CodegenStatus::Error(error) = &self.codegen.read(cx).status {
let error_message = SharedString::from(error.to_string());
@@ -727,9 +741,22 @@ impl PromptEditor {
let mut this = Self {
id,
fs,
height_in_lines: 1,
editor: prompt_editor,
language_model_selector: cx.new(|cx| {
let fs = fs.clone();
LanguageModelSelector::new(
move |model, cx| {
update_settings_file::<AssistantSettings>(
fs.clone(),
cx,
move |settings, _| settings.set_model(model.clone()),
);
},
window,
cx,
)
}),
edited_since_done: false,
prompt_history,
prompt_history_ix: None,

View File

@@ -21,6 +21,7 @@ test-support = [
[dependencies]
anyhow.workspace = true
assistant_context_editor.workspace = true
assistant_scripting.workspace = true
assistant_settings.workspace = true
assistant_slash_command.workspace = true
assistant_tool.workspace = true
@@ -63,6 +64,7 @@ serde.workspace = true
serde_json.workspace = true
settings.workspace = true
smol.workspace = true
rich_text.workspace = true
streaming_diff.workspace = true
telemetry_events.workspace = true
terminal.workspace = true
@@ -81,8 +83,8 @@ zed_actions.workspace = true
[dev-dependencies]
editor = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, "features" = ["test-support"] }
indoc.workspace = true
language = { workspace = true, "features" = ["test-support"] }
language_model = { workspace = true, "features" = ["test-support"] }
project = { workspace = true, features = ["test-support"] }
rand.workspace = true
indoc.workspace = true

View File

@@ -1,8 +1,7 @@
use std::sync::Arc;
use assistant_tool::ToolWorkingSet;
use collections::HashMap;
use assistant_scripting::{ScriptId, ScriptState};
use collections::{HashMap, HashSet};
use editor::{Editor, MultiBuffer};
use futures::FutureExt;
use gpui::{
list, AbsoluteLength, AnyElement, App, ClickEvent, DefiniteLength, EdgesRefinement, Empty,
Entity, Focusable, Length, ListAlignment, ListOffset, ListState, StyleRefinement, Subscription,
@@ -12,6 +11,8 @@ use language::{Buffer, LanguageRegistry};
use language_model::{LanguageModelRegistry, LanguageModelToolUseId, Role};
use markdown::{Markdown, MarkdownStyle};
use settings::Settings as _;
use std::ops::Range;
use std::sync::Arc;
use theme::ThemeSettings;
use ui::{prelude::*, Disclosure, KeyBinding};
use util::ResultExt as _;
@@ -21,11 +22,12 @@ use crate::thread::{MessageId, RequestKind, Thread, ThreadError, ThreadEvent};
use crate::thread_store::ThreadStore;
use crate::tool_use::{ToolUse, ToolUseStatus};
use crate::ui::ContextPill;
use gpui::{HighlightStyle, StyledText};
use rich_text::{self, Highlight};
pub struct ActiveThread {
workspace: WeakEntity<Workspace>,
language_registry: Arc<LanguageRegistry>,
tools: Arc<ToolWorkingSet>,
thread_store: Entity<ThreadStore>,
thread: Entity<Thread>,
save_thread_task: Option<Task<()>>,
@@ -34,6 +36,7 @@ pub struct ActiveThread {
rendered_messages_by_id: HashMap<MessageId, Entity<Markdown>>,
editing_message: Option<(MessageId, EditMessageState)>,
expanded_tool_uses: HashMap<LanguageModelToolUseId, bool>,
expanded_scripts: HashSet<ScriptId>,
last_error: Option<ThreadError>,
_subscriptions: Vec<Subscription>,
}
@@ -44,11 +47,10 @@ struct EditMessageState {
impl ActiveThread {
pub fn new(
workspace: WeakEntity<Workspace>,
thread: Entity<Thread>,
thread_store: Entity<ThreadStore>,
workspace: WeakEntity<Workspace>,
language_registry: Arc<LanguageRegistry>,
tools: Arc<ToolWorkingSet>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
@@ -60,13 +62,13 @@ impl ActiveThread {
let mut this = Self {
workspace,
language_registry,
tools,
thread_store,
thread: thread.clone(),
save_thread_task: None,
messages: Vec::new(),
rendered_messages_by_id: HashMap::default(),
expanded_tool_uses: HashMap::default(),
expanded_scripts: HashSet::default(),
list_state: ListState::new(0, ListAlignment::Bottom, px(1024.), {
let this = cx.entity().downgrade();
move |ix, window: &mut Window, cx: &mut App| {
@@ -173,6 +175,8 @@ impl ActiveThread {
text_style.refine(&TextStyleRefinement {
font_family: Some(theme_settings.ui_font.family.clone()),
font_fallbacks: theme_settings.ui_font.fallbacks.clone(),
font_features: Some(theme_settings.ui_font.features.clone()),
font_size: Some(ui_font_size.into()),
color: Some(cx.theme().colors().text),
..Default::default()
@@ -207,6 +211,8 @@ impl ActiveThread {
},
text: Some(TextStyleRefinement {
font_family: Some(theme_settings.buffer_font.family.clone()),
font_fallbacks: theme_settings.buffer_font.fallbacks.clone(),
font_features: Some(theme_settings.buffer_font.features.clone()),
font_size: Some(buffer_font_size.into()),
..Default::default()
}),
@@ -214,6 +220,8 @@ impl ActiveThread {
},
inline_code: TextStyleRefinement {
font_family: Some(theme_settings.buffer_font.family.clone()),
font_fallbacks: theme_settings.buffer_font.fallbacks.clone(),
font_features: Some(theme_settings.buffer_font.features.clone()),
font_size: Some(buffer_font_size.into()),
background_color: Some(colors.editor_foreground.opacity(0.1)),
..Default::default()
@@ -243,7 +251,7 @@ impl ActiveThread {
fn handle_thread_event(
&mut self,
_: &Entity<Thread>,
_thread: &Entity<Thread>,
event: &ThreadEvent,
window: &mut Window,
cx: &mut Context<Self>,
@@ -294,48 +302,26 @@ impl ActiveThread {
cx.notify();
}
ThreadEvent::UsePendingTools => {
let pending_tool_uses = self
.thread
.read(cx)
.pending_tool_uses()
.into_iter()
.filter(|tool_use| tool_use.status.is_idle())
.cloned()
.collect::<Vec<_>>();
for tool_use in pending_tool_uses {
if let Some(tool) = self.tools.tool(&tool_use.name, cx) {
let task = tool.run(tool_use.input, self.workspace.clone(), window, cx);
self.thread.update(cx, |thread, cx| {
thread.use_pending_tools(cx);
});
}
ThreadEvent::ToolFinished { .. } => {
if self.thread.read(cx).all_tools_finished() {
let model_registry = LanguageModelRegistry::read_global(cx);
if let Some(model) = model_registry.active_model() {
self.thread.update(cx, |thread, cx| {
thread.insert_tool_output(tool_use.id.clone(), task, cx);
thread.send_tool_results_to_model(model, cx);
});
}
}
}
ThreadEvent::ToolFinished { .. } => {
let all_tools_finished = self
.thread
.read(cx)
.pending_tool_uses()
.into_iter()
.all(|tool_use| tool_use.status.is_error());
if all_tools_finished {
let model_registry = LanguageModelRegistry::read_global(cx);
if let Some(model) = model_registry.active_model() {
self.thread.update(cx, |thread, cx| {
// Insert a user message to contain the tool results.
thread.insert_user_message(
// TODO: Sending up a user message without any content results in the model sending back
// responses that also don't have any content. We currently don't handle this case well,
// so for now we provide some text to keep the model on track.
"Here are the tool results.",
Vec::new(),
cx,
);
thread.send_to_model(model, RequestKind::Chat, true, cx);
});
}
ThreadEvent::ScriptFinished => {
let model_registry = LanguageModelRegistry::read_global(cx);
if let Some(model) = model_registry.active_model() {
self.thread.update(cx, |thread, cx| {
thread.send_to_model(model, RequestKind::Chat, false, cx);
});
}
}
}
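For orientation (the hunk above interleaves the removed and added lines without +/- markers), the new tool/script handling in handle_thread_event condenses to roughly the following. This is reconstructed from the hunks in this diff rather than additional behavior:

match event {
    ThreadEvent::UsePendingTools => {
        // Tool execution now lives on Thread itself.
        self.thread.update(cx, |thread, cx| {
            thread.use_pending_tools(cx);
        });
    }
    ThreadEvent::ToolFinished { .. } => {
        // Once every pending tool has finished (or errored), hand the results back
        // to the model in a follow-up request.
        if self.thread.read(cx).all_tools_finished() {
            if let Some(model) = LanguageModelRegistry::read_global(cx).active_model() {
                self.thread.update(cx, |thread, cx| {
                    thread.send_tool_results_to_model(model, cx);
                });
            }
        }
    }
    ThreadEvent::ScriptFinished => {
        // The thread has already inserted a hidden user message containing the
        // script output, so just kick off another completion.
        if let Some(model) = LanguageModelRegistry::read_global(cx).active_model() {
            self.thread.update(cx, |thread, cx| {
                thread.send_to_model(model, RequestKind::Chat, false, cx);
            });
        }
    }
    // Remaining arms are unchanged by this diff.
    _ => {}
}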
@@ -477,12 +463,16 @@ impl ActiveThread {
return Empty.into_any();
};
let context = self.thread.read(cx).context_for_message(message_id);
let tool_uses = self.thread.read(cx).tool_uses_for_message(message_id);
let colors = cx.theme().colors();
let thread = self.thread.read(cx);
let context = thread.context_for_message(message_id);
let tool_uses = thread.tool_uses_for_message(message_id);
// Don't render user messages that are just there for returning tool results.
if message.role == Role::User && self.thread.read(cx).message_has_tool_results(message_id) {
if message.role == Role::User
&& (thread.message_has_tool_results(message_id)
|| thread.message_has_script_output(message_id))
{
return Empty.into_any();
}
@@ -495,6 +485,8 @@ impl ActiveThread {
.filter(|(id, _)| *id == message_id)
.map(|(_, state)| state.editor.clone());
let colors = cx.theme().colors();
let message_content = v_flex()
.child(
if let Some(edit_message_editor) = edit_message_editor.clone() {
@@ -629,6 +621,7 @@ impl ActiveThread {
Role::Assistant => div()
.id(("message-container", ix))
.child(message_content)
.children(self.render_script(message_id, cx))
.map(|parent| {
if tool_uses.is_empty() {
return parent;
@@ -645,7 +638,7 @@ impl ActiveThread {
Role::System => div().id(("message-container", ix)).py_1().px_2().child(
v_flex()
.bg(colors.editor_background)
.rounded_md()
.rounded_sm()
.child(message_content),
),
};
@@ -674,7 +667,7 @@ impl ActiveThread {
.pr_2()
.bg(cx.theme().colors().editor_foreground.opacity(0.02))
.when(is_open, |element| element.border_b_1().rounded_t(px(6.)))
.when(!is_open, |element| element.rounded(px(6.)))
.when(!is_open, |element| element.rounded_md())
.border_color(cx.theme().colors().border)
.child(
h_flex()
@@ -748,6 +741,191 @@ impl ActiveThread {
}),
)
}
fn render_script(&self, message_id: MessageId, cx: &mut Context<Self>) -> Option<AnyElement> {
let script = self.thread.read(cx).script_for_message(message_id, cx)?;
let is_open = self.expanded_scripts.contains(&script.id);
let colors = cx.theme().colors();
let element = div().px_2p5().child(
v_flex()
.gap_1()
.rounded_lg()
.border_1()
.border_color(colors.border)
.child(
h_flex()
.justify_between()
.py_0p5()
.pl_1()
.pr_2()
.bg(colors.editor_foreground.opacity(0.02))
.when(is_open, |element| element.border_b_1().rounded_t(px(6.)))
.when(!is_open, |element| element.rounded_md())
.border_color(colors.border)
.child(
h_flex()
.gap_1()
.child(Disclosure::new("script-disclosure", is_open).on_click(
cx.listener({
let script_id = script.id;
move |this, _event, _window, _cx| {
if this.expanded_scripts.contains(&script_id) {
this.expanded_scripts.remove(&script_id);
} else {
this.expanded_scripts.insert(script_id);
}
}
}),
))
// TODO: Generate script description
.child(Label::new("Script")),
)
.child(
h_flex()
.gap_1()
.child(
Label::new(match script.state {
ScriptState::Generating => "Generating",
ScriptState::Running { .. } => "Running",
ScriptState::Succeeded { .. } => "Finished",
ScriptState::Failed { .. } => "Error",
})
.size(LabelSize::XSmall)
.buffer_font(cx),
)
.child(
IconButton::new("view-source", IconName::Eye)
.icon_color(Color::Muted)
.disabled(matches!(script.state, ScriptState::Generating))
.on_click(cx.listener({
let source = script.source.clone();
move |this, _event, window, cx| {
this.open_script_source(source.clone(), window, cx);
}
})),
),
),
)
.when(is_open, |parent| {
let stdout = script.stdout_snapshot();
let error = script.error();
let lua_language =
async { self.language_registry.language_for_name("Lua").await.ok() }
.now_or_never()
.flatten();
let source_display = if let Some(lua_language) = &lua_language {
let mut highlights = Vec::new();
let mut buf = String::new();
rich_text::render_code(
&mut buf,
&mut highlights,
&script.source,
lua_language,
);
let theme = cx.theme();
let gpui_highlights: Vec<(Range<usize>, HighlightStyle)> = highlights
.iter()
.map(|(range, highlight)| {
let style = match highlight {
Highlight::Code => Default::default(),
Highlight::Id(id) => {
id.style(theme.syntax()).unwrap_or_default()
}
Highlight::InlineCode(_) => Default::default(),
Highlight::Highlight(highlight) => *highlight,
_ => HighlightStyle::default(),
};
(range.clone(), style)
})
.collect();
StyledText::new(buf)
.with_highlights(gpui_highlights)
.into_any_element()
} else {
Label::new(script.source.clone())
.size(LabelSize::Small)
.buffer_font(cx)
.into_any_element()
};
parent.child(
v_flex()
.p_2()
.bg(colors.editor_background)
.gap_2()
.child(
div()
.border_1()
.border_color(colors.border)
.p_2()
.bg(colors.editor_foreground.opacity(0.025))
.rounded_md()
.child(source_display),
)
.child(if stdout.is_empty() && error.is_none() {
Label::new("No output yet")
.size(LabelSize::Small)
.color(Color::Muted)
} else {
Label::new(stdout).size(LabelSize::Small).buffer_font(cx)
})
.children(script.error().map(|err| {
Label::new(err.to_string())
.size(LabelSize::Small)
.color(Color::Error)
})),
)
}),
);
Some(element.into_any())
}
fn open_script_source(
&mut self,
source: SharedString,
window: &mut Window,
cx: &mut Context<'_, ActiveThread>,
) {
let language_registry = self.language_registry.clone();
let workspace = self.workspace.clone();
let source = source.clone();
cx.spawn_in(window, |_, mut cx| async move {
let lua = language_registry.language_for_name("Lua").await.log_err();
workspace.update_in(&mut cx, |workspace, window, cx| {
let project = workspace.project().clone();
let buffer = project.update(cx, |project, cx| {
project.create_local_buffer(&source.trim(), lua, cx)
});
let buffer = cx.new(|cx| {
MultiBuffer::singleton(buffer, cx)
// TODO: Generate script description
.with_title("Assistant script".into())
});
let editor = cx.new(|cx| {
let mut editor =
Editor::for_multibuffer(buffer, Some(project), true, window, cx);
editor.set_read_only(true);
editor
});
workspace.add_item_to_active_pane(Box::new(editor), None, true, window, cx);
})
})
.detach_and_log_err(cx);
}
}
impl Render for ActiveThread {

View File

@@ -134,7 +134,7 @@ impl AssistantConfiguration {
.bg(cx.theme().colors().editor_background)
.border_1()
.border_color(cx.theme().colors().border_variant)
.rounded_md()
.rounded_sm()
.map(|parent| match configuration_view {
Some(configuration_view) => parent.child(configuration_view),
None => parent.child(div().child(Label::new(format!(

View File

@@ -1,28 +1,45 @@
use assistant_settings::AssistantSettings;
use fs::Fs;
use gpui::FocusHandle;
use language_model_selector::{assistant_language_model_selector, LanguageModelSelector};
use gpui::{Entity, FocusHandle, SharedString};
use language_model::LanguageModelRegistry;
use language_model_selector::{
LanguageModelSelector, LanguageModelSelectorPopoverMenu, ToggleModelSelector,
};
use settings::update_settings_file;
use std::sync::Arc;
use ui::{prelude::*, PopoverMenuHandle};
use ui::{prelude::*, ButtonLike, PopoverMenuHandle, Tooltip};
pub struct AssistantModelSelector {
selector: Entity<LanguageModelSelector>,
menu_handle: PopoverMenuHandle<LanguageModelSelector>,
focus_handle: FocusHandle,
fs: Arc<dyn Fs>,
}
impl AssistantModelSelector {
pub(crate) fn new(
fs: Arc<dyn Fs>,
menu_handle: PopoverMenuHandle<LanguageModelSelector>,
focus_handle: FocusHandle,
_window: &mut Window,
_cx: &mut App,
window: &mut Window,
cx: &mut App,
) -> Self {
Self {
fs,
selector: cx.new(|cx| {
let fs = fs.clone();
LanguageModelSelector::new(
move |model, cx| {
update_settings_file::<AssistantSettings>(
fs.clone(),
cx,
move |settings, _cx| settings.set_model(model.clone()),
);
},
window,
cx,
)
}),
menu_handle,
focus_handle,
menu_handle: PopoverMenuHandle::default(),
}
}
@@ -32,19 +49,43 @@ impl AssistantModelSelector {
}
impl Render for AssistantModelSelector {
fn render(&mut self, _: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let fs_clone = self.fs.clone();
assistant_language_model_selector(
self.focus_handle.clone(),
Some(self.menu_handle.clone()),
cx,
move |model, cx| {
update_settings_file::<AssistantSettings>(
fs_clone.clone(),
fn render(&mut self, _window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let active_model = LanguageModelRegistry::read_global(cx).active_model();
let focus_handle = self.focus_handle.clone();
let model_name = match active_model {
Some(model) => model.name().0,
_ => SharedString::from("No model selected"),
};
LanguageModelSelectorPopoverMenu::new(
self.selector.clone(),
ButtonLike::new("active-model")
.style(ButtonStyle::Subtle)
.child(
h_flex()
.gap_0p5()
.child(
Label::new(model_name)
.size(LabelSize::Small)
.color(Color::Muted),
)
.child(
Icon::new(IconName::ChevronDown)
.color(Color::Muted)
.size(IconSize::XSmall),
),
),
move |window, cx| {
Tooltip::for_action_in(
"Change Model",
&ToggleModelSelector,
&focus_handle,
window,
cx,
move |settings, _| settings.set_model(model.clone()),
);
)
},
gpui::Corner::BottomRight,
)
.with_handle(self.menu_handle.clone())
}
}
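The file above (and its counterparts in the inline and terminal assistants earlier in this diff) replaces the old assistant_language_model_selector helper with a reusable pattern: construct one LanguageModelSelector entity whose callback persists the chosen model, then wrap it in a LanguageModelSelectorPopoverMenu wherever a trigger is rendered. A minimal sketch of that pattern, using only the signatures visible in this diff (the MySelector struct itself is hypothetical):

use std::sync::Arc;

use assistant_settings::AssistantSettings;
use fs::Fs;
use gpui::{App, Entity, Window};
use language_model::LanguageModelRegistry;
use language_model_selector::{LanguageModelSelector, LanguageModelSelectorPopoverMenu};
use settings::update_settings_file;
use ui::{prelude::*, PopoverMenuHandle, Tooltip};

struct MySelector {
    selector: Entity<LanguageModelSelector>,
    menu_handle: PopoverMenuHandle<LanguageModelSelector>,
}

impl MySelector {
    fn new(fs: Arc<dyn Fs>, window: &mut Window, cx: &mut App) -> Self {
        Self {
            // Build the selector once; its callback writes the chosen model to settings.
            selector: cx.new(|cx| {
                LanguageModelSelector::new(
                    move |model, cx| {
                        update_settings_file::<AssistantSettings>(
                            fs.clone(),
                            cx,
                            move |settings, _| settings.set_model(model.clone()),
                        );
                    },
                    window,
                    cx,
                )
            }),
            menu_handle: PopoverMenuHandle::default(),
        }
    }

    // Rendered wherever a trigger button is needed; the same entity can back the
    // popover across renders, and the handle lets actions toggle it.
    fn trigger(&self) -> impl IntoElement {
        LanguageModelSelectorPopoverMenu::new(
            self.selector.clone(),
            IconButton::new("change-model", IconName::SettingsAlt).icon_size(IconSize::Small),
            move |window, cx| {
                Tooltip::with_meta(
                    format!(
                        "Using {}",
                        LanguageModelRegistry::read_global(cx)
                            .active_model()
                            .map(|model| model.name().0)
                            .unwrap_or_else(|| "No model selected".into()),
                    ),
                    None,
                    "Change Model",
                    window,
                    cx,
                )
            },
            gpui::Corner::TopRight,
        )
        .with_handle(self.menu_handle.clone())
    }
}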

View File

@@ -92,7 +92,6 @@ pub struct AssistantPanel {
context_editor: Option<Entity<ContextEditor>>,
configuration: Option<Entity<AssistantConfiguration>>,
configuration_subscription: Option<Subscription>,
tools: Arc<ToolWorkingSet>,
local_timezone: UtcOffset,
active_view: ActiveView,
history_store: Entity<HistoryStore>,
@@ -133,7 +132,7 @@ impl AssistantPanel {
log::info!("[assistant2-debug] finished initializing ContextStore");
workspace.update_in(&mut cx, |workspace, window, cx| {
cx.new(|cx| Self::new(workspace, thread_store, context_store, tools, window, cx))
cx.new(|cx| Self::new(workspace, thread_store, context_store, window, cx))
})
})
}
@@ -142,7 +141,6 @@ impl AssistantPanel {
workspace: &Workspace,
thread_store: Entity<ThreadStore>,
context_store: Entity<assistant_context_editor::ContextStore>,
tools: Arc<ToolWorkingSet>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
@@ -168,30 +166,30 @@ impl AssistantPanel {
let history_store =
cx.new(|cx| HistoryStore::new(thread_store.clone(), context_store.clone(), cx));
let thread = cx.new(|cx| {
ActiveThread::new(
workspace.clone(),
thread.clone(),
thread_store.clone(),
language_registry.clone(),
window,
cx,
)
});
Self {
active_view: ActiveView::Thread,
workspace: workspace.clone(),
project,
workspace,
project: project.clone(),
fs: fs.clone(),
language_registry: language_registry.clone(),
language_registry,
thread_store: thread_store.clone(),
thread: cx.new(|cx| {
ActiveThread::new(
thread.clone(),
thread_store.clone(),
workspace,
language_registry,
tools.clone(),
window,
cx,
)
}),
thread,
message_editor,
context_store,
context_editor: None,
configuration: None,
configuration_subscription: None,
tools,
local_timezone: UtcOffset::from_whole_seconds(
chrono::Local::now().offset().local_minus_utc(),
)
@@ -244,11 +242,10 @@ impl AssistantPanel {
self.active_view = ActiveView::Thread;
self.thread = cx.new(|cx| {
ActiveThread::new(
self.workspace.clone(),
thread.clone(),
self.thread_store.clone(),
self.workspace.clone(),
self.language_registry.clone(),
self.tools.clone(),
window,
cx,
)
@@ -379,11 +376,10 @@ impl AssistantPanel {
this.active_view = ActiveView::Thread;
this.thread = cx.new(|cx| {
ActiveThread::new(
this.workspace.clone(),
thread.clone(),
this.thread_store.clone(),
this.workspace.clone(),
this.language_registry.clone(),
this.tools.clone(),
window,
cx,
)
@@ -1023,12 +1019,7 @@ impl Render for AssistantPanel {
.map(|parent| match self.active_view {
ActiveView::Thread => parent
.child(self.render_active_thread_or_empty_state(window, cx))
.child(
h_flex()
.border_t_1()
.border_color(cx.theme().colors().border)
.child(self.message_editor.clone()),
)
.child(h_flex().child(self.message_editor.clone()))
.children(self.render_last_error(cx)),
ActiveView::History => parent.child(self.history.clone()),
ActiveView::PromptEditor => parent.children(self.context_editor.clone()),

View File

@@ -208,12 +208,18 @@ impl ContextStore {
let mut text_tasks = Vec::new();
this.update(&mut cx, |_, cx| {
for (path, buffer_entity) in files.into_iter().zip(buffers) {
let buffer_entity = buffer_entity?;
let buffer = buffer_entity.read(cx);
let (buffer_info, text_task) =
collect_buffer_info_and_text(path, buffer_entity, buffer, cx.to_async());
buffer_infos.push(buffer_info);
text_tasks.push(text_task);
// Skip all binary files and other non-UTF8 files
if let Ok(buffer_entity) = buffer_entity {
let buffer = buffer_entity.read(cx);
let (buffer_info, text_task) = collect_buffer_info_and_text(
path,
buffer_entity,
buffer,
cx.to_async(),
);
buffer_infos.push(buffer_info);
text_tasks.push(text_task);
}
}
anyhow::Ok(())
})??;

View File

@@ -27,6 +27,7 @@ use language::{Buffer, Point, Selection, TransactionId};
use language_model::{report_assistant_event, LanguageModelRegistry};
use multi_buffer::MultiBufferRow;
use parking_lot::Mutex;
use project::LspAction;
use project::{CodeAction, ProjectTransaction};
use prompt_store::PromptBuilder;
use settings::{Settings, SettingsStore};
@@ -1727,10 +1728,10 @@ impl CodeActionProvider for AssistantCodeActionProvider {
Task::ready(Ok(vec![CodeAction {
server_id: language::LanguageServerId(0),
range: snapshot.anchor_before(range.start)..snapshot.anchor_after(range.end),
lsp_action: lsp::CodeAction {
lsp_action: LspAction::Action(Box::new(lsp::CodeAction {
title: "Fix with Assistant".into(),
..Default::default()
},
})),
}]))
} else {
Task::ready(Ok(Vec::new()))

View File

@@ -857,6 +857,7 @@ impl PromptEditor<BufferCodegen> {
editor
});
let context_picker_menu_handle = PopoverMenuHandle::default();
let model_selector_menu_handle = PopoverMenuHandle::default();
let context_strip = cx.new(|cx| {
ContextStrip::new(
@@ -880,7 +881,13 @@ impl PromptEditor<BufferCodegen> {
context_strip,
context_picker_menu_handle,
model_selector: cx.new(|cx| {
AssistantModelSelector::new(fs, prompt_editor.focus_handle(cx), window, cx)
AssistantModelSelector::new(
fs,
model_selector_menu_handle,
prompt_editor.focus_handle(cx),
window,
cx,
)
}),
edited_since_done: false,
prompt_history,
@@ -1005,6 +1012,7 @@ impl PromptEditor<TerminalCodegen> {
editor
});
let context_picker_menu_handle = PopoverMenuHandle::default();
let model_selector_menu_handle = PopoverMenuHandle::default();
let context_strip = cx.new(|cx| {
ContextStrip::new(
@@ -1028,7 +1036,13 @@ impl PromptEditor<TerminalCodegen> {
context_strip,
context_picker_menu_handle,
model_selector: cx.new(|cx| {
AssistantModelSelector::new(fs, prompt_editor.focus_handle(cx), window, cx)
AssistantModelSelector::new(
fs,
model_selector_menu_handle.clone(),
prompt_editor.focus_handle(cx),
window,
cx,
)
}),
edited_since_done: false,
prompt_history,

View File

@@ -4,8 +4,8 @@ use editor::actions::MoveUp;
use editor::{Editor, EditorElement, EditorEvent, EditorStyle};
use fs::Fs;
use gpui::{
pulsating_between, Animation, AnimationExt, App, DismissEvent, Entity, Focusable, Subscription,
TextStyle, WeakEntity,
Animation, AnimationExt, App, DismissEvent, Entity, Focusable, Subscription, TextStyle,
WeakEntity,
};
use language_model::LanguageModelRegistry;
use language_model_selector::ToggleModelSelector;
@@ -16,7 +16,7 @@ use text::Bias;
use theme::ThemeSettings;
use ui::{
prelude::*, ButtonLike, KeyBinding, PlatformStyle, PopoverMenu, PopoverMenuHandle, Switch,
TintColor, Tooltip,
Tooltip,
};
use vim_mode_setting::VimModeSetting;
use workspace::Workspace;
@@ -54,6 +54,7 @@ impl MessageEditor {
let context_store = cx.new(|_cx| ContextStore::new(workspace.clone()));
let context_picker_menu_handle = PopoverMenuHandle::default();
let inline_context_picker_menu_handle = PopoverMenuHandle::default();
let model_selector_menu_handle = PopoverMenuHandle::default();
let editor = cx.new(|cx| {
let mut editor = Editor::auto_height(10, window, cx);
@@ -106,8 +107,15 @@ impl MessageEditor {
context_picker_menu_handle,
inline_context_picker,
inline_context_picker_menu_handle,
model_selector: cx
.new(|cx| AssistantModelSelector::new(fs, editor.focus_handle(cx), window, cx)),
model_selector: cx.new(|cx| {
AssistantModelSelector::new(
fs,
model_selector_menu_handle,
editor.focus_handle(cx),
window,
cx,
)
}),
use_tools: false,
_subscriptions: subscriptions,
}
@@ -290,166 +298,211 @@ impl Render for MessageEditor {
let linux = platform == PlatformStyle::Linux;
let windows = platform == PlatformStyle::Windows;
let button_width = if linux || windows || vim_mode_enabled {
px(92.)
px(82.)
} else {
px(64.)
};
v_flex()
.key_context("MessageEditor")
.on_action(cx.listener(Self::chat))
.on_action(cx.listener(|this, _: &ToggleModelSelector, window, cx| {
this.model_selector
.update(cx, |model_selector, cx| model_selector.toggle(window, cx));
}))
.on_action(cx.listener(Self::toggle_context_picker))
.on_action(cx.listener(Self::remove_all_context))
.on_action(cx.listener(Self::move_up))
.on_action(cx.listener(Self::toggle_chat_mode))
.size_full()
.gap_2()
.p_2()
.bg(bg_color)
.child(self.context_strip.clone())
.when(is_streaming_completion, |parent| {
let focus_handle = self.editor.focus_handle(cx).clone();
parent.child(
h_flex().py_3().w_full().justify_center().child(
h_flex()
.flex_none()
.pl_2()
.pr_1()
.py_1()
.bg(cx.theme().colors().editor_background)
.border_1()
.border_color(cx.theme().colors().border_variant)
.rounded_lg()
.shadow_md()
.gap_1()
.child(
Icon::new(IconName::ArrowCircle)
.size(IconSize::XSmall)
.color(Color::Muted)
.with_animation(
"arrow-circle",
Animation::new(Duration::from_secs(2)).repeat(),
|icon, delta| {
icon.transform(gpui::Transformation::rotate(
gpui::percentage(delta),
))
},
),
)
.child(
Label::new("Generating…")
.size(LabelSize::XSmall)
.color(Color::Muted),
)
.child(ui::Divider::vertical())
.child(
Button::new("cancel-generation", "Cancel")
.label_size(LabelSize::XSmall)
.key_binding(
KeyBinding::for_action_in(
&editor::actions::Cancel,
&focus_handle,
window,
cx,
)
.map(|kb| kb.size(rems_from_px(10.))),
)
.on_click(move |_event, window, cx| {
focus_handle.dispatch_action(
&editor::actions::Cancel,
window,
cx,
);
}),
),
),
)
})
.child(
v_flex()
.gap_5()
.child({
let settings = ThemeSettings::get_global(cx);
let text_style = TextStyle {
color: cx.theme().colors().text,
font_family: settings.ui_font.family.clone(),
font_features: settings.ui_font.features.clone(),
font_size: font_size.into(),
font_weight: settings.ui_font.weight,
line_height: line_height.into(),
..Default::default()
};
EditorElement::new(
&self.editor,
EditorStyle {
background: bg_color,
local_player: cx.theme().players().local(),
text: text_style,
..Default::default()
},
)
})
.key_context("MessageEditor")
.on_action(cx.listener(Self::chat))
.on_action(cx.listener(|this, _: &ToggleModelSelector, window, cx| {
this.model_selector
.update(cx, |model_selector, cx| model_selector.toggle(window, cx));
}))
.on_action(cx.listener(Self::toggle_context_picker))
.on_action(cx.listener(Self::remove_all_context))
.on_action(cx.listener(Self::move_up))
.on_action(cx.listener(Self::toggle_chat_mode))
.gap_2()
.p_2()
.bg(bg_color)
.border_t_1()
.border_color(cx.theme().colors().border)
.child(self.context_strip.clone())
.child(
PopoverMenu::new("inline-context-picker")
.menu(move |window, cx| {
inline_context_picker.update(cx, |this, cx| {
this.init(window, cx);
});
v_flex()
.gap_5()
.child({
let settings = ThemeSettings::get_global(cx);
let text_style = TextStyle {
color: cx.theme().colors().text,
font_family: settings.ui_font.family.clone(),
font_fallbacks: settings.ui_font.fallbacks.clone(),
font_features: settings.ui_font.features.clone(),
font_size: font_size.into(),
font_weight: settings.ui_font.weight,
line_height: line_height.into(),
..Default::default()
};
Some(inline_context_picker.clone())
EditorElement::new(
&self.editor,
EditorStyle {
background: bg_color,
local_player: cx.theme().players().local(),
text: text_style,
..Default::default()
},
)
})
.attach(gpui::Corner::TopLeft)
.anchor(gpui::Corner::BottomLeft)
.offset(gpui::Point {
x: px(0.0),
y: (-ThemeSettings::get_global(cx).ui_font_size(cx) * 2) - px(4.0),
})
.with_handle(self.inline_context_picker_menu_handle.clone()),
)
.child(
h_flex()
.justify_between()
.child(
Switch::new("use-tools", self.use_tools.into())
.label("Tools")
.on_click(cx.listener(|this, selection, _window, _cx| {
this.use_tools = match selection {
ToggleState::Selected => true,
ToggleState::Unselected
| ToggleState::Indeterminate => false,
};
}))
.key_binding(KeyBinding::for_action_in(
&ChatMode,
&focus_handle,
window,
cx,
)),
PopoverMenu::new("inline-context-picker")
.menu(move |window, cx| {
inline_context_picker.update(cx, |this, cx| {
this.init(window, cx);
});
Some(inline_context_picker.clone())
})
.attach(gpui::Corner::TopLeft)
.anchor(gpui::Corner::BottomLeft)
.offset(gpui::Point {
x: px(0.0),
y: (-ThemeSettings::get_global(cx).ui_font_size(cx) * 2)
- px(4.0),
})
.with_handle(self.inline_context_picker_menu_handle.clone()),
)
.child(h_flex().gap_1().child(self.model_selector.clone()).child(
if is_streaming_completion {
ButtonLike::new("cancel-generation")
.width(button_width.into())
.style(ButtonStyle::Tinted(TintColor::Accent))
.child(
h_flex()
.w_full()
.justify_between()
.child(
Label::new("Cancel")
.size(LabelSize::Small)
.with_animation(
"pulsating-label",
Animation::new(Duration::from_secs(2))
.repeat()
.with_easing(pulsating_between(
0.4, 0.8,
)),
|label, delta| label.alpha(delta),
),
)
.children(
KeyBinding::for_action_in(
&editor::actions::Cancel,
&focus_handle,
window,
cx,
)
.map(|binding| binding.into_any_element()),
),
)
.on_click(move |_event, window, cx| {
focus_handle.dispatch_action(
&editor::actions::Cancel,
.child(
h_flex()
.justify_between()
.child(
Switch::new("use-tools", self.use_tools.into())
.label("Tools")
.on_click(cx.listener(
|this, selection, _window, _cx| {
this.use_tools = match selection {
ToggleState::Selected => true,
ToggleState::Unselected
| ToggleState::Indeterminate => false,
};
},
))
.key_binding(KeyBinding::for_action_in(
&ChatMode,
&focus_handle,
window,
cx,
);
})
} else {
ButtonLike::new("submit-message")
.width(button_width.into())
.style(ButtonStyle::Filled)
.disabled(is_editor_empty || !is_model_selected)
.child(
h_flex()
.w_full()
.justify_between()
.child(
Label::new("Submit")
.size(LabelSize::Small)
.color(submit_label_color),
)),
)
.child(
h_flex().gap_1().child(self.model_selector.clone()).child(
ButtonLike::new("submit-message")
.width(button_width.into())
.style(ButtonStyle::Filled)
.disabled(
is_editor_empty
|| !is_model_selected
|| is_streaming_completion,
)
.children(
KeyBinding::for_action_in(
&Chat,
&focus_handle,
window,
cx,
)
.map(|binding| binding.into_any_element()),
),
)
.on_click(move |_event, window, cx| {
focus_handle.dispatch_action(&Chat, window, cx);
})
.when(is_editor_empty, |button| {
button
.tooltip(Tooltip::text("Type a message to submit"))
})
.when(!is_model_selected, |button| {
button.tooltip(Tooltip::text(
"Select a model to continue",
))
})
},
)),
.child(
h_flex()
.w_full()
.justify_between()
.child(
Label::new("Submit")
.size(LabelSize::Small)
.color(submit_label_color),
)
.children(
KeyBinding::for_action_in(
&Chat,
&focus_handle,
window,
cx,
)
.map(|binding| {
binding
.when(vim_mode_enabled, |kb| {
kb.size(rems_from_px(12.))
})
.into_any_element()
}),
),
)
.on_click(move |_event, window, cx| {
focus_handle.dispatch_action(&Chat, window, cx);
})
.when(is_editor_empty, |button| {
button.tooltip(Tooltip::text(
"Type a message to submit",
))
})
.when(is_streaming_completion, |button| {
button.tooltip(Tooltip::text(
"Cancel to submit a new message",
))
})
.when(!is_model_selected, |button| {
button.tooltip(Tooltip::text(
"Select a model to continue",
))
}),
),
),
),
),
)
}

View File

@@ -1,17 +1,21 @@
use std::sync::Arc;
use anyhow::Result;
use assistant_scripting::{
Script, ScriptEvent, ScriptId, ScriptSession, ScriptTagParser, SCRIPTING_PROMPT,
};
use assistant_tool::ToolWorkingSet;
use chrono::{DateTime, Utc};
use collections::{BTreeMap, HashMap, HashSet};
use futures::StreamExt as _;
use gpui::{App, Context, EventEmitter, SharedString, Task};
use gpui::{App, AppContext, Context, Entity, EventEmitter, SharedString, Subscription, Task};
use language_model::{
LanguageModel, LanguageModelCompletionEvent, LanguageModelRegistry, LanguageModelRequest,
LanguageModelRequestMessage, LanguageModelRequestTool, LanguageModelToolResult,
LanguageModelToolUseId, MaxMonthlySpendReachedError, MessageContent, PaymentRequiredError,
Role, StopReason,
};
use project::Project;
use serde::{Deserialize, Serialize};
use util::{post_inc, TryFutureExt as _};
use uuid::Uuid;
@@ -71,12 +75,24 @@ pub struct Thread {
context_by_message: HashMap<MessageId, Vec<ContextId>>,
completion_count: usize,
pending_completions: Vec<PendingCompletion>,
project: Entity<Project>,
tools: Arc<ToolWorkingSet>,
tool_use: ToolUseState,
scripts_by_assistant_message: HashMap<MessageId, ScriptId>,
script_output_messages: HashSet<MessageId>,
script_session: Entity<ScriptSession>,
_script_session_subscription: Subscription,
}
impl Thread {
pub fn new(tools: Arc<ToolWorkingSet>, _cx: &mut Context<Self>) -> Self {
pub fn new(
project: Entity<Project>,
tools: Arc<ToolWorkingSet>,
cx: &mut Context<Self>,
) -> Self {
let script_session = cx.new(|cx| ScriptSession::new(project.clone(), cx));
let script_session_subscription = cx.subscribe(&script_session, Self::handle_script_event);
Self {
id: ThreadId::new(),
updated_at: Utc::now(),
@@ -88,16 +104,22 @@ impl Thread {
context_by_message: HashMap::default(),
completion_count: 0,
pending_completions: Vec::new(),
project,
tools,
tool_use: ToolUseState::new(),
scripts_by_assistant_message: HashMap::default(),
script_output_messages: HashSet::default(),
script_session,
_script_session_subscription: script_session_subscription,
}
}
pub fn from_saved(
id: ThreadId,
saved: SavedThread,
project: Entity<Project>,
tools: Arc<ToolWorkingSet>,
_cx: &mut Context<Self>,
cx: &mut Context<Self>,
) -> Self {
let next_message_id = MessageId(
saved
@@ -107,6 +129,8 @@ impl Thread {
.unwrap_or(0),
);
let tool_use = ToolUseState::from_saved_messages(&saved.messages);
let script_session = cx.new(|cx| ScriptSession::new(project.clone(), cx));
let script_session_subscription = cx.subscribe(&script_session, Self::handle_script_event);
Self {
id,
@@ -127,8 +151,13 @@ impl Thread {
context_by_message: HashMap::default(),
completion_count: 0,
pending_completions: Vec::new(),
project,
tools,
tool_use,
scripts_by_assistant_message: HashMap::default(),
script_output_messages: HashSet::default(),
script_session,
_script_session_subscription: script_session_subscription,
}
}
@@ -193,6 +222,15 @@ impl Thread {
self.tool_use.pending_tool_uses()
}
/// Returns whether all of the tool uses have finished running.
pub fn all_tools_finished(&self) -> bool {
// If the only pending tool uses left are the ones with errors, then that means that we've finished running all
// of the pending tools.
self.pending_tool_uses()
.into_iter()
.all(|tool_use| tool_use.status.is_error())
}
pub fn tool_uses_for_message(&self, id: MessageId) -> Vec<ToolUse> {
self.tool_use.tool_uses_for_message(id)
}
@@ -205,17 +243,22 @@ impl Thread {
self.tool_use.message_has_tool_results(message_id)
}
pub fn message_has_script_output(&self, message_id: MessageId) -> bool {
self.script_output_messages.contains(&message_id)
}
pub fn insert_user_message(
&mut self,
text: impl Into<String>,
context: Vec<ContextSnapshot>,
cx: &mut Context<Self>,
) {
) -> MessageId {
let message_id = self.insert_message(Role::User, text, cx);
let context_ids = context.iter().map(|context| context.id).collect::<Vec<_>>();
self.context
.extend(context.into_iter().map(|context| (context.id, context)));
self.context_by_message.insert(message_id, context_ids);
message_id
}
pub fn insert_message(
@@ -284,6 +327,39 @@ impl Thread {
text
}
pub fn script_for_message<'a>(
&'a self,
message_id: MessageId,
cx: &'a App,
) -> Option<&'a Script> {
self.scripts_by_assistant_message
.get(&message_id)
.map(|script_id| self.script_session.read(cx).get(*script_id))
}
fn handle_script_event(
&mut self,
_script_session: Entity<ScriptSession>,
event: &ScriptEvent,
cx: &mut Context<Self>,
) {
match event {
ScriptEvent::Spawned(_) => {}
ScriptEvent::Exited(script_id) => {
if let Some(output_message) = self
.script_session
.read(cx)
.get(*script_id)
.output_message_for_llm()
{
let message_id = self.insert_user_message(output_message, vec![], cx);
self.script_output_messages.insert(message_id);
cx.emit(ThreadEvent::ScriptFinished)
}
}
}
}
pub fn send_to_model(
&mut self,
model: Arc<dyn LanguageModel>,
@@ -312,7 +388,7 @@ impl Thread {
pub fn to_completion_request(
&self,
request_kind: RequestKind,
_cx: &App,
cx: &App,
) -> LanguageModelRequest {
let mut request = LanguageModelRequest {
messages: vec![],
@@ -321,6 +397,12 @@ impl Thread {
temperature: None,
};
request.messages.push(LanguageModelRequestMessage {
role: Role::System,
content: vec![SCRIPTING_PROMPT.to_string().into()],
cache: true,
});
let mut referenced_context_ids = HashSet::default();
for message in &self.messages {
@@ -333,6 +415,7 @@ impl Thread {
content: Vec::new(),
cache: false,
};
match request_kind {
RequestKind::Chat => {
self.tool_use
@@ -353,11 +436,20 @@ impl Thread {
RequestKind::Chat => {
self.tool_use
.attach_tool_uses(message.id, &mut request_message);
if matches!(message.role, Role::Assistant) {
if let Some(script_id) = self.scripts_by_assistant_message.get(&message.id)
{
let script = self.script_session.read(cx).get(*script_id);
request_message.content.push(script.source_tag().into());
}
}
}
RequestKind::Summarize => {
// We don't care about tool use during summarization.
}
}
};
request.messages.push(request_message);
}
@@ -394,6 +486,8 @@ impl Thread {
let stream_completion = async {
let mut events = stream.await?;
let mut stop_reason = StopReason::EndTurn;
let mut script_tag_parser = ScriptTagParser::new();
let mut script_id = None;
while let Some(event) = events.next().await {
let event = event?;
@@ -408,19 +502,43 @@ impl Thread {
}
LanguageModelCompletionEvent::Text(chunk) => {
if let Some(last_message) = thread.messages.last_mut() {
if last_message.role == Role::Assistant {
last_message.text.push_str(&chunk);
let chunk = script_tag_parser.parse_chunk(&chunk);
let message_id = if last_message.role == Role::Assistant {
last_message.text.push_str(&chunk.content);
cx.emit(ThreadEvent::StreamedAssistantText(
last_message.id,
chunk,
chunk.content,
));
last_message.id
} else {
// If we don't have an Assistant message yet, assume this chunk marks the beginning
// of a new Assistant response.
//
// Importantly: We do *not* want to emit a `StreamedAssistantText` event here, as it
// will result in duplicating the text of the chunk in the rendered Markdown.
thread.insert_message(Role::Assistant, chunk, cx);
thread.insert_message(Role::Assistant, chunk.content, cx)
};
if script_id.is_none() && script_tag_parser.found_script() {
let id = thread
.script_session
.update(cx, |session, _cx| session.new_script());
thread.scripts_by_assistant_message.insert(message_id, id);
script_id = Some(id);
}
if let (Some(script_source), Some(script_id)) =
(chunk.script_source, script_id)
{
// TODO: move buffer to script and run as it streams
thread
.script_session
.update(cx, |this, cx| {
this.run_script(script_id, script_source, cx)
})
.detach_and_log_err(cx);
}
}
}
@@ -550,6 +668,23 @@ impl Thread {
});
}
pub fn use_pending_tools(&mut self, cx: &mut Context<Self>) {
let pending_tool_uses = self
.pending_tool_uses()
.into_iter()
.filter(|tool_use| tool_use.status.is_idle())
.cloned()
.collect::<Vec<_>>();
for tool_use in pending_tool_uses {
if let Some(tool) = self.tools.tool(&tool_use.name, cx) {
let task = tool.run(tool_use.input, self.project.clone(), cx);
self.insert_tool_output(tool_use.id.clone(), task, cx);
}
}
}
pub fn insert_tool_output(
&mut self,
tool_use_id: LanguageModelToolUseId,
@@ -576,6 +711,23 @@ impl Thread {
.run_pending_tool(tool_use_id, insert_output_task);
}
pub fn send_tool_results_to_model(
&mut self,
model: Arc<dyn LanguageModel>,
cx: &mut Context<Self>,
) {
// Insert a user message to contain the tool results.
self.insert_user_message(
// TODO: Sending up a user message without any content results in the model sending back
// responses that also don't have any content. We currently don't handle this case well,
// so for now we provide some text to keep the model on track.
"Here are the tool results.",
Vec::new(),
cx,
);
self.send_to_model(model, RequestKind::Chat, true, cx);
}
/// Cancels the last pending completion, if there are any pending.
///
/// Returns whether a completion was canceled.
@@ -609,6 +761,7 @@ pub enum ThreadEvent {
#[allow(unused)]
tool_use_id: LanguageModelToolUseId,
},
ScriptFinished,
}
impl EventEmitter<ThreadEvent> for Thread {}
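To summarize how the streaming hunk above threads scripting through the completion loop, here is a hypothetical helper that condenses the inline logic. It uses only APIs that appear in this diff, but the helper itself (and pulling the logic out of the event loop) is illustrative rather than part of this change, and it is simplified: the real hunk also creates the Assistant message when this is the first chunk.

impl Thread {
    // Hypothetical condensation of the streaming arm above (a sketch, not code from
    // this changeset).
    fn handle_streamed_chunk(
        &mut self,
        parser: &mut ScriptTagParser,
        script_id: &mut Option<ScriptId>,
        message_id: MessageId,
        chunk: String,
        cx: &mut Context<Self>,
    ) {
        // Script tags are stripped out of the user-visible text; the complete script
        // source is handed back separately once the closing tag has streamed in.
        let parsed = parser.parse_chunk(&chunk);
        cx.emit(ThreadEvent::StreamedAssistantText(message_id, parsed.content));

        // The first time a script tag is detected, register a new script on the
        // session and remember which assistant message it belongs to.
        if script_id.is_none() && parser.found_script() {
            let id = self
                .script_session
                .update(cx, |session, _cx| session.new_script());
            self.scripts_by_assistant_message.insert(message_id, id);
            *script_id = Some(id);
        }

        // Once the full source is available, run it. ScriptSession emits
        // ScriptEvent::Exited when the Lua task finishes; handle_script_event then
        // inserts a hidden user message with the output and emits ScriptFinished.
        if let (Some(source), Some(id)) = (parsed.script_source, *script_id) {
            self.script_session
                .update(cx, |session, cx| session.run_script(id, source, cx))
                .detach_and_log_err(cx);
        }
    }
}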

View File

@@ -26,7 +26,6 @@ pub fn init(cx: &mut App) {
}
pub struct ThreadStore {
#[allow(unused)]
project: Entity<Project>,
tools: Arc<ToolWorkingSet>,
context_server_manager: Entity<ContextServerManager>,
@@ -78,7 +77,7 @@ impl ThreadStore {
}
pub fn create_thread(&mut self, cx: &mut Context<Self>) -> Entity<Thread> {
cx.new(|cx| Thread::new(self.tools.clone(), cx))
cx.new(|cx| Thread::new(self.project.clone(), self.tools.clone(), cx))
}
pub fn open_thread(
@@ -96,7 +95,15 @@ impl ThreadStore {
.ok_or_else(|| anyhow!("no thread found with ID: {id:?}"))?;
this.update(&mut cx, |this, cx| {
cx.new(|cx| Thread::from_saved(id.clone(), thread, this.tools.clone(), cx))
cx.new(|cx| {
Thread::from_saved(
id.clone(),
thread,
this.project.clone(),
this.tools.clone(),
cx,
)
})
})
})
}

View File

@@ -103,7 +103,7 @@ impl RenderOnce for ContextPill {
.pl_1()
.pb(px(1.))
.border_1()
.rounded_md()
.rounded_sm()
.gap_1()
.child(self.icon().size(IconSize::XSmall).color(Color::Muted));

View File

@@ -38,7 +38,7 @@ use language_model::{
Role,
};
use language_model_selector::{
assistant_language_model_selector, LanguageModelSelector, ToggleModelSelector,
LanguageModelSelector, LanguageModelSelectorPopoverMenu, ToggleModelSelector,
};
use multi_buffer::MultiBufferRow;
use picker::Picker;
@@ -197,7 +197,8 @@ pub struct ContextEditor {
// the file is opened. In order to keep the worktree alive for the duration of the
// context editor, we keep a reference here.
dragged_file_worktrees: Vec<Entity<Worktree>>,
language_model_selector: PopoverMenuHandle<LanguageModelSelector>,
language_model_selector: Entity<LanguageModelSelector>,
language_model_selector_menu_handle: PopoverMenuHandle<LanguageModelSelector>,
}
pub const DEFAULT_TAB_TITLE: &str = "New Chat";
@@ -263,7 +264,7 @@ impl ContextEditor {
image_blocks: Default::default(),
scroll_position: None,
remote_id: None,
fs,
fs: fs.clone(),
workspace,
project,
pending_slash_command_creases: HashMap::default(),
@@ -275,7 +276,20 @@ impl ContextEditor {
show_accept_terms: false,
slash_menu_handle: Default::default(),
dragged_file_worktrees: Vec::new(),
language_model_selector: PopoverMenuHandle::default(),
language_model_selector: cx.new(|cx| {
LanguageModelSelector::new(
move |model, cx| {
update_settings_file::<AssistantSettings>(
fs.clone(),
cx,
move |settings, _| settings.set_model(model.clone()),
);
},
window,
cx,
)
}),
language_model_selector_menu_handle: PopoverMenuHandle::default(),
};
this.update_message_headers(cx);
this.update_image_blocks(cx);
@@ -1227,7 +1241,7 @@ impl ContextEditor {
.child("Press")
.child(
h_flex()
.rounded_md()
.rounded_sm()
.px_1()
.mr_0p5()
.border_1()
@@ -2078,7 +2092,7 @@ impl ContextEditor {
.ml(gutter_width)
.pb_1()
.w(max_width - gutter_width)
.rounded_md()
.rounded_sm()
.border_1()
.border_color(theme.colors().border_variant)
.overflow_hidden()
@@ -2375,6 +2389,46 @@ impl ContextEditor {
)
}
fn render_language_model_selector(&self, cx: &mut Context<Self>) -> impl IntoElement {
let active_model = LanguageModelRegistry::read_global(cx).active_model();
let focus_handle = self.editor().focus_handle(cx).clone();
let model_name = match active_model {
Some(model) => model.name().0,
None => SharedString::from("No model selected"),
};
LanguageModelSelectorPopoverMenu::new(
self.language_model_selector.clone(),
ButtonLike::new("active-model")
.style(ButtonStyle::Subtle)
.child(
h_flex()
.gap_0p5()
.child(
Label::new(model_name)
.size(LabelSize::Small)
.color(Color::Muted),
)
.child(
Icon::new(IconName::ChevronDown)
.color(Color::Muted)
.size(IconSize::XSmall),
),
),
move |window, cx| {
Tooltip::for_action_in(
"Change Model",
&ToggleModelSelector,
&focus_handle,
window,
cx,
)
},
gpui::Corner::BottomLeft,
)
.with_handle(self.language_model_selector_menu_handle.clone())
}
fn render_last_error(&self, cx: &mut Context<Self>) -> Option<AnyElement> {
let last_error = self.last_error.as_ref()?;
@@ -2818,9 +2872,8 @@ impl Render for ContextEditor {
} else {
None
};
let fs_clone = self.fs.clone();
let language_model_selector = self.language_model_selector.clone();
let language_model_selector = self.language_model_selector_menu_handle.clone();
v_flex()
.key_context("ContextEditor")
.capture_action(cx.listener(ContextEditor::cancel))
@@ -2873,18 +2926,11 @@ impl Render for ContextEditor {
.gap_1()
.child(self.render_inject_context_menu(cx))
.child(ui::Divider::vertical())
.child(div().pl_0p5().child(assistant_language_model_selector(
self.editor().focus_handle(cx),
Some(self.language_model_selector.clone()),
cx,
move |model, cx| {
update_settings_file::<AssistantSettings>(
fs_clone.clone(),
cx,
move |settings, _| settings.set_model(model.clone()),
);
},
))),
.child(
div()
.pl_0p5()
.child(self.render_language_model_selector(cx)),
),
)
.child(
h_flex()
@@ -3376,7 +3422,7 @@ fn invoked_slash_command_fold_placeholder(
.ml_6()
.gap_2()
.bg(cx.theme().colors().surface_background)
.rounded_md()
.rounded_sm()
.child(Label::new(format!("/{}", command.name.clone())))
.map(|parent| match &command.status {
InvokedSlashCommandStatus::Running(_) => {

View File

@@ -104,49 +104,53 @@ impl ContextStore {
const CONTEXT_WATCH_DURATION: Duration = Duration::from_millis(100);
let (mut events, _) = fs.watch(contexts_dir(), CONTEXT_WATCH_DURATION).await;
let this = cx.new(|cx: &mut Context<Self>| {
let context_server_factory_registry =
ContextServerFactoryRegistry::default_global(cx);
let context_server_manager = cx.new(|cx| {
ContextServerManager::new(context_server_factory_registry, project.clone(), cx)
});
let mut this = Self {
contexts: Vec::new(),
contexts_metadata: Vec::new(),
context_server_manager,
context_server_slash_command_ids: HashMap::default(),
host_contexts: Vec::new(),
fs,
languages,
slash_commands,
telemetry,
_watch_updates: cx.spawn(|this, mut cx| {
async move {
while events.next().await.is_some() {
this.update(&mut cx, |this, cx| this.reload(cx))?
.await
.log_err();
let this =
cx.new(|cx: &mut Context<Self>| {
let context_server_factory_registry =
ContextServerFactoryRegistry::default_global(cx);
let context_server_manager = cx.new(|cx| {
ContextServerManager::new(
context_server_factory_registry,
project.clone(),
cx,
)
});
let mut this = Self {
contexts: Vec::new(),
contexts_metadata: Vec::new(),
context_server_manager,
context_server_slash_command_ids: HashMap::default(),
host_contexts: Vec::new(),
fs,
languages,
slash_commands,
telemetry,
_watch_updates: cx.spawn(|this, mut cx| {
async move {
while events.next().await.is_some() {
this.update(&mut cx, |this, cx| this.reload(cx))?
.await
.log_err();
}
anyhow::Ok(())
}
anyhow::Ok(())
}
.log_err()
}),
client_subscription: None,
_project_subscriptions: vec![
cx.observe(&project, Self::handle_project_changed),
cx.subscribe(&project, Self::handle_project_event),
],
project_is_shared: false,
client: project.read(cx).client(),
project: project.clone(),
prompt_builder,
};
this.handle_project_changed(project.clone(), cx);
this.synchronize_contexts(cx);
this.register_context_server_handlers(cx);
this.reload(cx).detach_and_log_err(cx);
this
})?;
.log_err()
}),
client_subscription: None,
_project_subscriptions: vec![
cx.subscribe(&project, Self::handle_project_event)
],
project_is_shared: false,
client: project.read(cx).client(),
project: project.clone(),
prompt_builder,
};
this.handle_project_shared(project.clone(), cx);
this.synchronize_contexts(cx);
this.register_context_server_handlers(cx);
this.reload(cx).detach_and_log_err(cx);
this
})?;
Ok(this)
})
@@ -288,7 +292,7 @@ impl ContextStore {
})?
}
fn handle_project_changed(&mut self, _: Entity<Project>, cx: &mut Context<Self>) {
fn handle_project_shared(&mut self, _: Entity<Project>, cx: &mut Context<Self>) {
let is_shared = self.project.read(cx).is_shared();
let was_shared = mem::replace(&mut self.project_is_shared, is_shared);
if is_shared == was_shared {
@@ -318,11 +322,14 @@ impl ContextStore {
fn handle_project_event(
&mut self,
_: Entity<Project>,
project: Entity<Project>,
event: &project::Event,
cx: &mut Context<Self>,
) {
match event {
project::Event::RemoteIdChanged(_) => {
self.handle_project_shared(project, cx);
}
project::Event::Reshared => {
self.advertise_contexts(cx);
}

View File

@@ -140,7 +140,7 @@ impl ResolvedPatch {
buffer.edit(
edits,
Some(AutoindentMode::Block {
original_start_columns: Vec::new(),
original_indent_columns: Vec::new(),
}),
cx,
);

View File

@@ -5,9 +5,9 @@ use assistant_slash_command::{AfterCompletion, SlashCommandLine, SlashCommandWor
use editor::{CompletionProvider, Editor};
use fuzzy::{match_strings, StringMatchCandidate};
use gpui::{App, AppContext as _, Context, Entity, Task, WeakEntity, Window};
use language::{Anchor, Buffer, LanguageServerId, ToPoint};
use language::{Anchor, Buffer, ToPoint};
use parking_lot::Mutex;
use project::{lsp_store::CompletionDocumentation, CompletionIntent};
use project::{lsp_store::CompletionDocumentation, CompletionIntent, CompletionSource};
use rope::Point;
use std::{
cell::RefCell,
@@ -125,10 +125,8 @@ impl SlashCommandCompletionProvider {
)),
new_text,
label: command.label(cx),
server_id: LanguageServerId(0),
lsp_completion: Default::default(),
confirm,
resolved: true,
source: CompletionSource::Custom,
})
})
.collect()
@@ -225,10 +223,8 @@ impl SlashCommandCompletionProvider {
label: new_argument.label,
new_text,
documentation: None,
server_id: LanguageServerId(0),
lsp_completion: Default::default(),
confirm,
resolved: true,
source: CompletionSource::Custom,
}
})
.collect())

View File

@@ -0,0 +1,35 @@
[package]
name = "assistant_scripting"
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "GPL-3.0-or-later"
[lints]
workspace = true
[lib]
path = "src/assistant_scripting.rs"
doctest = false
[dependencies]
anyhow.workspace = true
collections.workspace = true
futures.workspace = true
gpui.workspace = true
log.workspace = true
mlua.workspace = true
parking_lot.workspace = true
project.workspace = true
regex.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
util.workspace = true
[dev-dependencies]
collections = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, features = ["test-support"] }
project = { workspace = true, features = ["test-support"] }
rand.workspace = true
settings = { workspace = true, features = ["test-support"] }

View File

@@ -0,0 +1,7 @@
mod session;
mod tag;
pub use session::*;
pub use tag::*;
pub const SCRIPTING_PROMPT: &str = include_str!("./system_prompt.txt");

View File

@@ -3,6 +3,16 @@
-- Create a sandbox environment
local sandbox = {}
-- For now, add all globals to `sandbox` (so there effectively is no sandbox).
-- We still need the logic below so that we can do things like overriding print() to write
-- to our in-memory log rather than to stdout. To make things sandboxed again, we will
-- delete this loop (and re-enable the sandboxed I/O module below).
for k, v in pairs(_G) do
if sandbox[k] == nil then
sandbox[k] = v
end
end
-- Allow access to standard libraries (safe subset)
sandbox.string = string
sandbox.table = table
@@ -13,7 +23,10 @@ sandbox.tostring = tostring
sandbox.tonumber = tonumber
sandbox.pairs = pairs
sandbox.ipairs = ipairs
-- Access to custom functions
sandbox.search = search
sandbox.outline = outline
-- Create a sandboxed version of LuaFileIO
local io = {}
@@ -22,8 +35,7 @@ local io = {}
io.open = sb_io_open
-- Add the sandboxed io library to the sandbox environment
sandbox.io = io
-- sandbox.io = io -- Uncomment this line to re-enable sandboxed file I/O.
-- Load the script with the sandbox environment
local user_script_fn, err = load(user_script, nil, "t", sandbox)

View File

@@ -0,0 +1,953 @@
use anyhow::anyhow;
use collections::{HashMap, HashSet};
use futures::{
channel::{mpsc, oneshot},
pin_mut, SinkExt, StreamExt,
};
use gpui::{AppContext, AsyncApp, Context, Entity, EventEmitter, SharedString, Task, WeakEntity};
use mlua::{ExternalResult, Lua, MultiValue, Table, UserData, UserDataMethods};
use parking_lot::Mutex;
use project::{search::SearchQuery, Fs, Project};
use regex::Regex;
use std::{
cell::RefCell,
path::{Path, PathBuf},
sync::Arc,
};
use util::{paths::PathMatcher, ResultExt};
use crate::{SCRIPT_END_TAG, SCRIPT_START_TAG};
struct ForegroundFn(Box<dyn FnOnce(WeakEntity<ScriptSession>, AsyncApp) + Send>);
pub struct ScriptSession {
project: Entity<Project>,
// TODO Remove this
fs_changes: Arc<Mutex<HashMap<PathBuf, Vec<u8>>>>,
foreground_fns_tx: mpsc::Sender<ForegroundFn>,
_invoke_foreground_fns: Task<()>,
scripts: Vec<Script>,
}
impl ScriptSession {
pub fn new(project: Entity<Project>, cx: &mut Context<Self>) -> Self {
let (foreground_fns_tx, mut foreground_fns_rx) = mpsc::channel(128);
ScriptSession {
project,
fs_changes: Arc::new(Mutex::new(HashMap::default())),
foreground_fns_tx,
_invoke_foreground_fns: cx.spawn(|this, cx| async move {
while let Some(foreground_fn) = foreground_fns_rx.next().await {
foreground_fn.0(this.clone(), cx.clone());
}
}),
scripts: Vec::new(),
}
}
pub fn new_script(&mut self) -> ScriptId {
let id = ScriptId(self.scripts.len() as u32);
let script = Script {
id,
state: ScriptState::Generating,
source: SharedString::new_static(""),
};
self.scripts.push(script);
id
}
pub fn run_script(
&mut self,
script_id: ScriptId,
script_src: String,
cx: &mut Context<Self>,
) -> Task<anyhow::Result<()>> {
let script = self.get_mut(script_id);
let stdout = Arc::new(Mutex::new(String::new()));
script.source = script_src.clone().into();
script.state = ScriptState::Running {
stdout: stdout.clone(),
};
let task = self.run_lua(script_src, stdout, cx);
cx.emit(ScriptEvent::Spawned(script_id));
cx.spawn(|session, mut cx| async move {
let result = task.await;
session.update(&mut cx, |session, cx| {
let script = session.get_mut(script_id);
let stdout = script.stdout_snapshot();
script.state = match result {
Ok(()) => ScriptState::Succeeded { stdout },
Err(error) => ScriptState::Failed { stdout, error },
};
cx.emit(ScriptEvent::Exited(script_id))
})
})
}
fn run_lua(
&mut self,
script: String,
stdout: Arc<Mutex<String>>,
cx: &mut Context<Self>,
) -> Task<anyhow::Result<()>> {
const SANDBOX_PREAMBLE: &str = include_str!("sandbox_preamble.lua");
// TODO Remove fs_changes
let fs_changes = self.fs_changes.clone();
// TODO Honor all worktrees instead of the first one
let root_dir = self
.project
.read(cx)
.visible_worktrees(cx)
.next()
.map(|worktree| worktree.read(cx).abs_path());
let fs = self.project.read(cx).fs().clone();
let foreground_fns_tx = self.foreground_fns_tx.clone();
let task = cx.background_spawn({
let stdout = stdout.clone();
async move {
let lua = Lua::new();
lua.set_memory_limit(2 * 1024 * 1024 * 1024)?; // 2 GB
let globals = lua.globals();
// Use the project root dir as the script's current working dir.
if let Some(root_dir) = &root_dir {
if let Some(root_dir) = root_dir.to_str() {
globals.set("cwd", root_dir)?;
}
}
globals.set(
"sb_print",
lua.create_function({
let stdout = stdout.clone();
move |_, args: MultiValue| Self::print(args, &stdout)
})?,
)?;
globals.set(
"search",
lua.create_async_function({
let foreground_fns_tx = foreground_fns_tx.clone();
move |lua, regex| {
let mut foreground_fns_tx = foreground_fns_tx.clone();
let fs = fs.clone();
async move {
Self::search(&lua, &mut foreground_fns_tx, fs, regex)
.await
.into_lua_err()
}
}
})?,
)?;
globals.set(
"outline",
lua.create_async_function({
let root_dir = root_dir.clone();
move |_lua, path| {
let mut foreground_fns_tx = foreground_fns_tx.clone();
let root_dir = root_dir.clone();
async move {
Self::outline(root_dir, &mut foreground_fns_tx, path)
.await
.into_lua_err()
}
}
})?,
)?;
globals.set(
"sb_io_open",
lua.create_function({
let fs_changes = fs_changes.clone();
let root_dir = root_dir.clone();
move |lua, (path_str, mode)| {
Self::io_open(&lua, &fs_changes, root_dir.as_ref(), path_str, mode)
}
})?,
)?;
globals.set("user_script", script)?;
lua.load(SANDBOX_PREAMBLE).exec_async().await?;
// Drop Lua instance to decrement reference count.
drop(lua);
anyhow::Ok(())
}
});
task
}
pub fn get(&self, script_id: ScriptId) -> &Script {
&self.scripts[script_id.0 as usize]
}
fn get_mut(&mut self, script_id: ScriptId) -> &mut Script {
&mut self.scripts[script_id.0 as usize]
}
/// Sandboxed print() function in Lua.
fn print(args: MultiValue, stdout: &Mutex<String>) -> mlua::Result<()> {
for (index, arg) in args.into_iter().enumerate() {
// Lua's `print()` prints tab characters between each argument.
if index > 0 {
stdout.lock().push('\t');
}
// If the argument's to_string() fails, have the whole function call fail.
stdout.lock().push_str(&arg.to_string()?);
}
stdout.lock().push('\n');
Ok(())
}
/// Sandboxed io.open() function in Lua.
fn io_open(
lua: &Lua,
fs_changes: &Arc<Mutex<HashMap<PathBuf, Vec<u8>>>>,
root_dir: Option<&Arc<Path>>,
path_str: String,
mode: Option<String>,
) -> mlua::Result<(Option<Table>, String)> {
let root_dir = root_dir
.ok_or_else(|| mlua::Error::runtime("cannot open file without a root directory"))?;
let mode = mode.unwrap_or_else(|| "r".to_string());
// Parse the mode string to determine read/write permissions
let read_perm = mode.contains('r');
let write_perm = mode.contains('w') || mode.contains('a') || mode.contains('+');
let append = mode.contains('a');
let truncate = mode.contains('w');
// This will be the Lua value returned from the `open` function.
let file = lua.create_table()?;
// Store file metadata in the file
file.set("__path", path_str.clone())?;
file.set("__mode", mode.clone())?;
file.set("__read_perm", read_perm)?;
file.set("__write_perm", write_perm)?;
let path = match Self::parse_abs_path_in_root_dir(&root_dir, &path_str) {
Ok(path) => path,
Err(err) => return Ok((None, format!("{err}"))),
};
// close method
let close_fn = {
let fs_changes = fs_changes.clone();
lua.create_function(move |_lua, file_userdata: mlua::Table| {
let write_perm = file_userdata.get::<bool>("__write_perm")?;
let path = file_userdata.get::<String>("__path")?;
if write_perm {
// When closing a writable file, record the content
let content = file_userdata.get::<mlua::AnyUserData>("__content")?;
let content_ref = content.borrow::<FileContent>()?;
let content_vec = content_ref.0.borrow();
// Don't actually write to disk; instead, just update fs_changes.
let path_buf = PathBuf::from(&path);
fs_changes
.lock()
.insert(path_buf.clone(), content_vec.clone());
}
Ok(true)
})?
};
file.set("close", close_fn)?;
// If it's a directory, give it a custom read() and return early.
if path.is_dir() {
// TODO handle the case where we changed it in the in-memory fs
// Create a special directory handle
file.set("__is_directory", true)?;
// Store directory entries
let entries = match std::fs::read_dir(&path) {
Ok(entries) => {
let mut entry_names = Vec::new();
for entry in entries.flatten() {
entry_names.push(entry.file_name().to_string_lossy().into_owned());
}
entry_names
}
Err(e) => return Ok((None, format!("Error reading directory: {}", e))),
};
// Save the list of entries
file.set("__dir_entries", entries)?;
file.set("__dir_position", 0usize)?;
// Create a directory-specific read function
let read_fn = lua.create_function(|_lua, file_userdata: mlua::Table| {
let position = file_userdata.get::<usize>("__dir_position")?;
let entries = file_userdata.get::<Vec<String>>("__dir_entries")?;
if position >= entries.len() {
return Ok(None); // No more entries
}
let entry = entries[position].clone();
file_userdata.set("__dir_position", position + 1)?;
Ok(Some(entry))
})?;
file.set("read", read_fn)?;
// If we got this far, the directory was opened successfully
return Ok((Some(file), String::new()));
}
let fs_changes_map = fs_changes.lock();
let is_in_changes = fs_changes_map.contains_key(&path);
let file_exists = is_in_changes || path.exists();
let mut file_content = Vec::new();
if file_exists && !truncate {
if is_in_changes {
file_content = fs_changes_map.get(&path).unwrap().clone();
} else {
// Try to read existing content if file exists and we're not truncating
match std::fs::read(&path) {
Ok(content) => file_content = content,
Err(e) => return Ok((None, format!("Error reading file: {}", e))),
}
}
}
drop(fs_changes_map); // Unlock the fs_changes mutex.
// If in append mode, position should be at the end
let position = if append && file_exists {
file_content.len()
} else {
0
};
file.set("__position", position)?;
file.set(
"__content",
lua.create_userdata(FileContent(RefCell::new(file_content)))?,
)?;
// Create file methods
// read method
let read_fn = {
lua.create_function(
|_lua, (file_userdata, format): (mlua::Table, Option<mlua::Value>)| {
let read_perm = file_userdata.get::<bool>("__read_perm")?;
if !read_perm {
return Err(mlua::Error::runtime("File not open for reading"));
}
let content = file_userdata.get::<mlua::AnyUserData>("__content")?;
let mut position = file_userdata.get::<usize>("__position")?;
let content_ref = content.borrow::<FileContent>()?;
let content_vec = content_ref.0.borrow();
if position >= content_vec.len() {
return Ok(None); // EOF
}
match format {
Some(mlua::Value::String(s)) => {
let lossy_string = s.to_string_lossy();
let format_str: &str = lossy_string.as_ref();
// Only consider the first 2 bytes, since it's common to pass e.g. "*all" instead of "*a"
match &format_str[0..2] {
"*a" => {
// Read entire file from current position
let result = String::from_utf8_lossy(&content_vec[position..])
.to_string();
position = content_vec.len();
file_userdata.set("__position", position)?;
Ok(Some(result))
}
"*l" => {
// Read next line
let mut line = Vec::new();
let mut found_newline = false;
while position < content_vec.len() {
let byte = content_vec[position];
position += 1;
if byte == b'\n' {
found_newline = true;
break;
}
// Skip \r in \r\n sequence but add it if it's alone
if byte == b'\r' {
if position < content_vec.len()
&& content_vec[position] == b'\n'
{
position += 1;
found_newline = true;
break;
}
}
line.push(byte);
}
file_userdata.set("__position", position)?;
if !found_newline
&& line.is_empty()
&& position >= content_vec.len()
{
return Ok(None); // EOF
}
let result = String::from_utf8_lossy(&line).to_string();
Ok(Some(result))
}
"*n" => {
// Note: Lua's "*n" (read a number) isn't implemented; this tries to parse the format string itself as a byte count, which fails for "*n"
match format_str.parse::<usize>() {
Ok(n) => {
let end =
std::cmp::min(position + n, content_vec.len());
let bytes = &content_vec[position..end];
let result = String::from_utf8_lossy(bytes).to_string();
position = end;
file_userdata.set("__position", position)?;
Ok(Some(result))
}
Err(_) => Err(mlua::Error::runtime(format!(
"Invalid format: {}",
format_str
))),
}
}
"*L" => {
// Read next line keeping the end of line
let mut line = Vec::new();
while position < content_vec.len() {
let byte = content_vec[position];
position += 1;
line.push(byte);
if byte == b'\n' {
break;
}
// If we encounter a \r, add it and check if the next is \n
if byte == b'\r'
&& position < content_vec.len()
&& content_vec[position] == b'\n'
{
line.push(content_vec[position]);
position += 1;
break;
}
}
file_userdata.set("__position", position)?;
if line.is_empty() && position >= content_vec.len() {
return Ok(None); // EOF
}
let result = String::from_utf8_lossy(&line).to_string();
Ok(Some(result))
}
_ => Err(mlua::Error::runtime(format!(
"Unsupported format: {}",
format_str
))),
}
}
Some(mlua::Value::Number(n)) => {
// Read n bytes
let n = n as usize;
let end = std::cmp::min(position + n, content_vec.len());
let bytes = &content_vec[position..end];
let result = String::from_utf8_lossy(bytes).to_string();
position = end;
file_userdata.set("__position", position)?;
Ok(Some(result))
}
Some(_) => Err(mlua::Error::runtime("Invalid format")),
None => {
// Default is to read a line
let mut line = Vec::new();
let mut found_newline = false;
while position < content_vec.len() {
let byte = content_vec[position];
position += 1;
if byte == b'\n' {
found_newline = true;
break;
}
// Handle \r\n
if byte == b'\r' {
if position < content_vec.len()
&& content_vec[position] == b'\n'
{
position += 1;
found_newline = true;
break;
}
}
line.push(byte);
}
file_userdata.set("__position", position)?;
if !found_newline && line.is_empty() && position >= content_vec.len() {
return Ok(None); // EOF
}
let result = String::from_utf8_lossy(&line).to_string();
Ok(Some(result))
}
}
},
)?
};
file.set("read", read_fn)?;
// write method
let write_fn = {
let fs_changes = fs_changes.clone();
lua.create_function(move |_lua, (file_userdata, text): (mlua::Table, String)| {
let write_perm = file_userdata.get::<bool>("__write_perm")?;
if !write_perm {
return Err(mlua::Error::runtime("File not open for writing"));
}
let content = file_userdata.get::<mlua::AnyUserData>("__content")?;
let position = file_userdata.get::<usize>("__position")?;
let content_ref = content.borrow::<FileContent>()?;
let mut content_vec = content_ref.0.borrow_mut();
let bytes = text.as_bytes();
// Ensure the vector has enough capacity
if position + bytes.len() > content_vec.len() {
content_vec.resize(position + bytes.len(), 0);
}
// Write the bytes
for (i, &byte) in bytes.iter().enumerate() {
content_vec[position + i] = byte;
}
// Update position
let new_position = position + bytes.len();
file_userdata.set("__position", new_position)?;
// Update fs_changes
let path = file_userdata.get::<String>("__path")?;
let path_buf = PathBuf::from(path);
fs_changes.lock().insert(path_buf, content_vec.clone());
Ok(true)
})?
};
file.set("write", write_fn)?;
// If we got this far, the file was opened successfully
Ok((Some(file), String::new()))
}
async fn search(
lua: &Lua,
foreground_tx: &mut mpsc::Sender<ForegroundFn>,
fs: Arc<dyn Fs>,
regex: String,
) -> anyhow::Result<Table> {
// TODO: Allow specification of these options.
let search_query = SearchQuery::regex(
&regex,
false,
false,
false,
PathMatcher::default(),
PathMatcher::default(),
None,
);
let search_query = match search_query {
Ok(query) => query,
Err(e) => return Err(anyhow!("Invalid search query: {}", e)),
};
// TODO: Should use `search_query.regex`. The tool description should also be updated,
// as it specifies standard regex.
let search_regex = match Regex::new(&regex) {
Ok(re) => re,
Err(e) => return Err(anyhow!("Invalid regex: {}", e)),
};
let mut abs_paths_rx = Self::find_search_candidates(search_query, foreground_tx).await?;
let mut search_results: Vec<Table> = Vec::new();
while let Some(path) = abs_paths_rx.next().await {
// Skip files larger than 1MB
if let Ok(Some(metadata)) = fs.metadata(&path).await {
if metadata.len > 1_000_000 {
continue;
}
}
// Attempt to read the file as text
if let Ok(content) = fs.load(&path).await {
let mut matches = Vec::new();
// Find all regex matches in the content
for capture in search_regex.find_iter(&content) {
matches.push(capture.as_str().to_string());
}
// If we found matches, create a result entry
if !matches.is_empty() {
let result_entry = lua.create_table()?;
result_entry.set("path", path.to_string_lossy().to_string())?;
let matches_table = lua.create_table()?;
for (ix, m) in matches.iter().enumerate() {
matches_table.set(ix + 1, m.clone())?;
}
result_entry.set("matches", matches_table)?;
search_results.push(result_entry);
}
}
}
// Create a table to hold our results
let results_table = lua.create_table()?;
for (ix, entry) in search_results.into_iter().enumerate() {
results_table.set(ix + 1, entry)?;
}
Ok(results_table)
}
async fn find_search_candidates(
search_query: SearchQuery,
foreground_tx: &mut mpsc::Sender<ForegroundFn>,
) -> anyhow::Result<mpsc::UnboundedReceiver<PathBuf>> {
Self::run_foreground_fn(
"finding search file candidates",
foreground_tx,
Box::new(move |session, mut cx| {
session.update(&mut cx, |session, cx| {
session.project.update(cx, |project, cx| {
project.worktree_store().update(cx, |worktree_store, cx| {
// TODO: Better limit? For now this is the same as
// MAX_SEARCH_RESULT_FILES.
let limit = 5000;
// TODO: Providing non-empty open_entries can make this a bit more
// efficient as it can skip checking that these paths are textual.
let open_entries = HashSet::default();
let candidates = worktree_store.find_search_candidates(
search_query,
limit,
open_entries,
project.fs().clone(),
cx,
);
let (abs_paths_tx, abs_paths_rx) = mpsc::unbounded();
cx.spawn(|worktree_store, cx| async move {
pin_mut!(candidates);
while let Some(project_path) = candidates.next().await {
worktree_store.read_with(&cx, |worktree_store, cx| {
if let Some(worktree) = worktree_store
.worktree_for_id(project_path.worktree_id, cx)
{
if let Some(abs_path) = worktree
.read(cx)
.absolutize(&project_path.path)
.log_err()
{
abs_paths_tx.unbounded_send(abs_path)?;
}
}
anyhow::Ok(())
})??;
}
anyhow::Ok(())
})
.detach();
abs_paths_rx
})
})
})
}),
)
.await?
}
async fn outline(
root_dir: Option<Arc<Path>>,
foreground_tx: &mut mpsc::Sender<ForegroundFn>,
path_str: String,
) -> anyhow::Result<String> {
let root_dir = root_dir
.ok_or_else(|| mlua::Error::runtime("cannot get outline without a root directory"))?;
let path = Self::parse_abs_path_in_root_dir(&root_dir, &path_str)?;
let outline = Self::run_foreground_fn(
"getting code outline",
foreground_tx,
Box::new(move |session, cx| {
cx.spawn(move |mut cx| async move {
// TODO: This will not use file content from `fs_changes`. It will also reflect
// user changes that have not been saved.
let buffer = session
.update(&mut cx, |session, cx| {
session
.project
.update(cx, |project, cx| project.open_local_buffer(&path, cx))
})?
.await?;
buffer.update(&mut cx, |buffer, _cx| {
if let Some(outline) = buffer.snapshot().outline(None) {
Ok(outline)
} else {
Err(anyhow!("No outline for file {path_str}"))
}
})
})
}),
)
.await?
.await??;
Ok(outline
.items
.into_iter()
.map(|item| {
if item.text.contains('\n') {
log::error!("Outline item unexpectedly contains newline");
}
format!("{}{}", " ".repeat(item.depth), item.text)
})
.collect::<Vec<String>>()
.join("\n"))
}
async fn run_foreground_fn<R: Send + 'static>(
description: &str,
foreground_tx: &mut mpsc::Sender<ForegroundFn>,
function: Box<dyn FnOnce(WeakEntity<Self>, AsyncApp) -> R + Send>,
) -> anyhow::Result<R> {
let (response_tx, response_rx) = oneshot::channel();
let send_result = foreground_tx
.send(ForegroundFn(Box::new(move |this, cx| {
response_tx.send(function(this, cx)).ok();
})))
.await;
match send_result {
Ok(()) => (),
Err(err) => {
return Err(anyhow::Error::new(err).context(format!(
"Internal error while enqueuing work for {description}"
)));
}
}
match response_rx.await {
Ok(result) => Ok(result),
Err(oneshot::Canceled) => Err(anyhow!(
"Internal error: response oneshot was canceled while {description}."
)),
}
}
fn parse_abs_path_in_root_dir(root_dir: &Path, path_str: &str) -> anyhow::Result<PathBuf> {
let path = Path::new(&path_str);
if path.is_absolute() {
// Check if path starts with root_dir prefix without resolving symlinks
if path.starts_with(&root_dir) {
Ok(path.to_path_buf())
} else {
Err(anyhow!(
"Error: Absolute path {} is outside the current working directory",
path_str
))
}
} else {
// TODO: Does use of `../` break the sandbox? Is path canonicalization needed?
Ok(root_dir.join(path))
}
}
}
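/// In-memory contents of a file opened via the sandboxed `io.open`, exposed to Lua as userdata.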
struct FileContent(RefCell<Vec<u8>>);
impl UserData for FileContent {
fn add_methods<M: UserDataMethods<Self>>(_methods: &mut M) {
// FileContent doesn't have any methods so far.
}
}
#[derive(Debug)]
pub enum ScriptEvent {
Spawned(ScriptId),
Exited(ScriptId),
}
impl EventEmitter<ScriptEvent> for ScriptSession {}
#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)]
pub struct ScriptId(u32);
pub struct Script {
pub id: ScriptId,
pub state: ScriptState,
pub source: SharedString,
}
pub enum ScriptState {
Generating,
Running {
stdout: Arc<Mutex<String>>,
},
Succeeded {
stdout: String,
},
Failed {
stdout: String,
error: anyhow::Error,
},
}
impl Script {
pub fn source_tag(&self) -> String {
format!("{}{}{}", SCRIPT_START_TAG, self.source, SCRIPT_END_TAG)
}
/// If exited, returns a message with the output for the LLM
pub fn output_message_for_llm(&self) -> Option<String> {
match &self.state {
ScriptState::Generating { .. } => None,
ScriptState::Running { .. } => None,
ScriptState::Succeeded { stdout } => {
format!("Here's the script output:\n{}", stdout).into()
}
ScriptState::Failed { stdout, error } => format!(
"The script failed with:\n{}\n\nHere's the output it managed to print:\n{}",
error, stdout
)
.into(),
}
}
/// Get a snapshot of the script's stdout
pub fn stdout_snapshot(&self) -> String {
match &self.state {
ScriptState::Generating { .. } => String::new(),
ScriptState::Running { stdout } => stdout.lock().clone(),
ScriptState::Succeeded { stdout } => stdout.clone(),
ScriptState::Failed { stdout, .. } => stdout.clone(),
}
}
/// Returns the error if the script failed, otherwise None
pub fn error(&self) -> Option<&anyhow::Error> {
match &self.state {
ScriptState::Generating { .. } => None,
ScriptState::Running { .. } => None,
ScriptState::Succeeded { .. } => None,
ScriptState::Failed { error, .. } => Some(error),
}
}
}
#[cfg(test)]
mod tests {
use gpui::TestAppContext;
use project::FakeFs;
use serde_json::json;
use settings::SettingsStore;
use super::*;
#[gpui::test]
async fn test_print(cx: &mut TestAppContext) {
let script = r#"
print("Hello", "world!")
print("Goodbye", "moon!")
"#;
let output = test_script(script, cx).await.unwrap();
assert_eq!(output, "Hello\tworld!\nGoodbye\tmoon!\n");
}
#[gpui::test]
async fn test_search(cx: &mut TestAppContext) {
let script = r#"
local results = search("world")
for i, result in ipairs(results) do
print("File: " .. result.path)
print("Matches:")
for j, match in ipairs(result.matches) do
print(" " .. match)
end
end
"#;
let output = test_script(script, cx).await.unwrap();
assert_eq!(output, "File: /file1.txt\nMatches:\n world\n");
}
async fn test_script(source: &str, cx: &mut TestAppContext) -> anyhow::Result<String> {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
"/",
json!({
"file1.txt": "Hello world!",
"file2.txt": "Goodbye moon!"
}),
)
.await;
let project = Project::test(fs, [Path::new("/")], cx).await;
let session = cx.new(|cx| ScriptSession::new(project, cx));
let (script_id, task) = session.update(cx, |session, cx| {
let script_id = session.new_script();
let task = session.run_script(script_id, source.to_string(), cx);
(script_id, task)
});
task.await?;
Ok(session.read_with(cx, |session, _cx| session.get(script_id).stdout_snapshot()))
}
fn init_test(cx: &mut TestAppContext) {
let settings_store = cx.update(SettingsStore::test);
cx.set_global(settings_store);
cx.update(Project::init_settings);
}
}

View File

@@ -0,0 +1,36 @@
You can write a Lua script and I'll run it on my codebase and tell you what its
output was, including both stdout and the git diff of any changes it made to the
filesystem. That way, you can get more information about the codebase, or make
changes to it directly.
Put the Lua script inside of an `<eval>` tag like so:
<eval type="lua">
print("Hello, world!")
</eval>
The Lua script will have access to `io`, and it will run with its current working
directory set to the root of the codebase, so you can use it to explore, search,
make changes, etc. You can also have the script print things, and I'll tell you
what the output was. Note that `io` only has `open`, and the file handle it
returns only has the methods `read`, `write`, and `close` - it doesn't have
`popen` or anything else.
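For example, a script could read a file like this (the path is just an
illustration):
local file, err = io.open("src/main.rs", "r")
if file then
  -- "*a" reads the rest of the file from the current position
  local contents = file:read("*a")
  print(contents)
  file:close()
else
  print("could not open file: " .. err)
end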
There is a function called `search` which accepts a regex (it's implemented
using Rust's regex crate, so use that regex syntax) and runs that regex on the
contents of every file in the codebase (aside from gitignored files), then
returns an array of tables with two fields: "path" (the path to the file that
had the matches) and "matches" (an array of strings, each one a match that was
found within that file).
There is a function called `outline` which accepts the path to a source file
and returns a string where each line is a declaration. Lines are indented with
2 spaces to indicate when a declaration is nested inside another.
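As an illustration (the regex here is hypothetical), the two helpers can be
combined like this:
local results = search("TODO")
for _, result in ipairs(results) do
  print(result.path)
  for _, m in ipairs(result.matches) do
    print("  " .. m)
  end
end
if #results > 0 then
  print(outline(results[1].path))
end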
When I send you the script output, do not thank me for running it;
act as if you ran it yourself.
IMPORTANT!
Include at most one Lua script, at the very end of your message.
DO NOT WRITE ANYTHING ELSE AFTER THE SCRIPT. Wait for my response with the
script output before continuing.

View File

@@ -0,0 +1,260 @@
pub const SCRIPT_START_TAG: &str = "<eval type=\"lua\">";
pub const SCRIPT_END_TAG: &str = "</eval>";
const START_TAG: &[u8] = SCRIPT_START_TAG.as_bytes();
const END_TAG: &[u8] = SCRIPT_END_TAG.as_bytes();
/// Parses a script tag in an assistant message as it is being streamed.
pub struct ScriptTagParser {
state: State,
buffer: Vec<u8>,
tag_match_ix: usize,
}
enum State {
Unstarted,
Streaming,
Ended,
}
#[derive(Debug, PartialEq)]
pub struct ChunkOutput {
/// The chunk with script tags removed.
pub content: String,
/// The full script tag content. `None` until closed.
pub script_source: Option<String>,
}
impl ScriptTagParser {
/// Create a new script tag parser.
pub fn new() -> Self {
Self {
state: State::Unstarted,
buffer: Vec::new(),
tag_match_ix: 0,
}
}
/// Returns true if the parser has found a script tag.
pub fn found_script(&self) -> bool {
match self.state {
State::Unstarted => false,
State::Streaming | State::Ended => true,
}
}
/// Process a new chunk of input, splitting it into surrounding content and script source.
pub fn parse_chunk(&mut self, input: &str) -> ChunkOutput {
let mut content = Vec::with_capacity(input.len());
for byte in input.bytes() {
match self.state {
State::Unstarted => {
if collect_until_tag(byte, START_TAG, &mut self.tag_match_ix, &mut content) {
self.state = State::Streaming;
self.buffer = Vec::with_capacity(1024);
self.tag_match_ix = 0;
}
}
State::Streaming => {
if collect_until_tag(byte, END_TAG, &mut self.tag_match_ix, &mut self.buffer) {
self.state = State::Ended;
}
}
State::Ended => content.push(byte),
}
}
let content = unsafe { String::from_utf8_unchecked(content) };
let script_source = if matches!(self.state, State::Ended) && !self.buffer.is_empty() {
let source = unsafe { String::from_utf8_unchecked(std::mem::take(&mut self.buffer)) };
Some(source)
} else {
None
};
ChunkOutput {
content,
script_source,
}
}
}
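/// Feeds one byte to the tag matcher. Returns `true` when the byte completes `tag`;
/// otherwise, bytes that turn out not to be part of the tag (including any discarded
/// partial match) are appended to `buffer`.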
fn collect_until_tag(byte: u8, tag: &[u8], tag_match_ix: &mut usize, buffer: &mut Vec<u8>) -> bool {
// This can't be a method because it would require a mutable borrow of both `self` and `self.buffer`.
if match_tag_byte(byte, tag, tag_match_ix) {
*tag_match_ix >= tag.len()
} else {
if *tag_match_ix > 0 {
// push the partially matched tag to the buffer
buffer.extend_from_slice(&tag[..*tag_match_ix]);
*tag_match_ix = 0;
// the tag might start to match again
if match_tag_byte(byte, tag, tag_match_ix) {
return *tag_match_ix >= tag.len();
}
}
buffer.push(byte);
false
}
}
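/// Returns `true` and advances `tag_match_ix` if `byte` matches the next expected byte of `tag`.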
fn match_tag_byte(byte: u8, tag: &[u8], tag_match_ix: &mut usize) -> bool {
if byte == tag[*tag_match_ix] {
*tag_match_ix += 1;
true
} else {
false
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_parse_complete_tag() {
let mut parser = ScriptTagParser::new();
let input = "<eval type=\"lua\">print(\"Hello, World!\")</eval>";
let result = parser.parse_chunk(input);
assert_eq!(result.content, "");
assert_eq!(
result.script_source,
Some("print(\"Hello, World!\")".to_string())
);
}
#[test]
fn test_no_tag() {
let mut parser = ScriptTagParser::new();
let input = "No tags here, just plain text";
let result = parser.parse_chunk(input);
assert_eq!(result.content, "No tags here, just plain text");
assert_eq!(result.script_source, None);
}
#[test]
fn test_partial_end_tag() {
let mut parser = ScriptTagParser::new();
// Start the tag
let result = parser.parse_chunk("<eval type=\"lua\">let x = '</e");
assert_eq!(result.content, "");
assert_eq!(result.script_source, None);
// Finish with the rest
let result = parser.parse_chunk("val' + 'not the end';</eval>");
assert_eq!(result.content, "");
assert_eq!(
result.script_source,
Some("let x = '</eval' + 'not the end';".to_string())
);
}
#[test]
fn test_text_before_and_after_tag() {
let mut parser = ScriptTagParser::new();
let input = "Before tag <eval type=\"lua\">print(\"Hello\")</eval> After tag";
let result = parser.parse_chunk(input);
assert_eq!(result.content, "Before tag After tag");
assert_eq!(result.script_source, Some("print(\"Hello\")".to_string()));
}
#[test]
fn test_multiple_chunks_with_surrounding_text() {
let mut parser = ScriptTagParser::new();
// First chunk with text before
let result = parser.parse_chunk("Before script <eval type=\"lua\">local x = 10");
assert_eq!(result.content, "Before script ");
assert_eq!(result.script_source, None);
// Second chunk with script content
let result = parser.parse_chunk("\nlocal y = 20");
assert_eq!(result.content, "");
assert_eq!(result.script_source, None);
// Last chunk with text after
let result = parser.parse_chunk("\nprint(x + y)</eval> After script");
assert_eq!(result.content, " After script");
assert_eq!(
result.script_source,
Some("local x = 10\nlocal y = 20\nprint(x + y)".to_string())
);
let result = parser.parse_chunk(" there's more text");
assert_eq!(result.content, " there's more text");
assert_eq!(result.script_source, None);
}
#[test]
fn test_partial_start_tag_matching() {
let mut parser = ScriptTagParser::new();
// partial match of start tag...
let result = parser.parse_chunk("<ev");
assert_eq!(result.content, "");
// ...that's abandoned when the < of a real tag is encountered
let result = parser.parse_chunk("<eval type=\"lua\">script content</eval>");
// ...so it gets pushed to content
assert_eq!(result.content, "<ev");
// ...and the real tag is parsed correctly
assert_eq!(result.script_source, Some("script content".to_string()));
}
#[test]
fn test_random_chunked_parsing() {
use rand::rngs::StdRng;
use rand::{Rng, SeedableRng};
use std::time::{SystemTime, UNIX_EPOCH};
let test_inputs = [
"Before <eval type=\"lua\">print(\"Hello\")</eval> After",
"No tags here at all",
"<eval type=\"lua\">local x = 10\nlocal y = 20\nprint(x + y)</eval>",
"Text <eval type=\"lua\">if true then\nprint(\"nested </e\")\nend</eval> more",
];
let seed = SystemTime::now()
.duration_since(UNIX_EPOCH)
.unwrap()
.as_secs();
eprintln!("Using random seed: {}", seed);
let mut rng = StdRng::seed_from_u64(seed);
for test_input in &test_inputs {
let mut reference_parser = ScriptTagParser::new();
let expected = reference_parser.parse_chunk(test_input);
let mut chunked_parser = ScriptTagParser::new();
let mut remaining = test_input.as_bytes();
let mut actual_content = String::new();
let mut actual_script = None;
while !remaining.is_empty() {
let chunk_size = rng.gen_range(1..=remaining.len().min(5));
let (chunk, rest) = remaining.split_at(chunk_size);
remaining = rest;
let chunk_str = std::str::from_utf8(chunk).unwrap();
let result = chunked_parser.parse_chunk(chunk_str);
actual_content.push_str(&result.content);
if result.script_source.is_some() {
actual_script = result.script_source;
}
}
assert_eq!(actual_content, expected.content);
assert_eq!(actual_script, expected.script_source);
}
}
}

View File

@@ -17,6 +17,6 @@ collections.workspace = true
derive_more.workspace = true
gpui.workspace = true
parking_lot.workspace = true
project.workspace = true
serde.workspace = true
serde_json.workspace = true
workspace.workspace = true

View File

@@ -4,8 +4,8 @@ mod tool_working_set;
use std::sync::Arc;
use anyhow::Result;
use gpui::{App, Task, WeakEntity, Window};
use workspace::Workspace;
use gpui::{App, Entity, Task};
use project::Project;
pub use crate::tool_registry::*;
pub use crate::tool_working_set::*;
@@ -31,8 +31,7 @@ pub trait Tool: 'static + Send + Sync {
fn run(
self: Arc<Self>,
input: serde_json::Value,
workspace: WeakEntity<Workspace>,
window: &mut Window,
project: Entity<Project>,
cx: &mut App,
) -> Task<Result<String>>;
}

View File

@@ -16,7 +16,7 @@ anyhow.workspace = true
assistant_tool.workspace = true
chrono.workspace = true
gpui.workspace = true
project.workspace = true
schemars.workspace = true
serde.workspace = true
serde_json.workspace = true
workspace.workspace = true

View File

@@ -1,13 +1,19 @@
mod list_worktrees_tool;
mod now_tool;
mod read_file_tool;
use assistant_tool::ToolRegistry;
use gpui::App;
use crate::list_worktrees_tool::ListWorktreesTool;
use crate::now_tool::NowTool;
use crate::read_file_tool::ReadFileTool;
pub fn init(cx: &mut App) {
assistant_tool::init(cx);
let registry = ToolRegistry::global(cx);
registry.register_tool(NowTool);
registry.register_tool(ListWorktreesTool);
registry.register_tool(ReadFileTool);
}

View File

@@ -0,0 +1,77 @@
use std::sync::Arc;
use anyhow::Result;
use assistant_tool::Tool;
use gpui::{App, Entity, Task};
use project::Project;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct ListWorktreesToolInput {}
pub struct ListWorktreesTool;
impl Tool for ListWorktreesTool {
fn name(&self) -> String {
"list-worktrees".into()
}
fn description(&self) -> String {
"Lists all worktrees in the current project. Use this tool when you need to find available worktrees and their IDs.".into()
}
fn input_schema(&self) -> serde_json::Value {
serde_json::json!(
{
"type": "object",
"properties": {},
"required": []
}
)
}
fn run(
self: Arc<Self>,
_input: serde_json::Value,
project: Entity<Project>,
cx: &mut App,
) -> Task<Result<String>> {
cx.spawn(|cx| async move {
cx.update(|cx| {
#[derive(Debug, Serialize)]
struct WorktreeInfo {
id: usize,
root_name: String,
root_dir: Option<String>,
}
let worktrees = project.update(cx, |project, cx| {
project
.visible_worktrees(cx)
.map(|worktree| {
worktree.read_with(cx, |worktree, _cx| WorktreeInfo {
id: worktree.id().to_usize(),
root_dir: worktree
.root_dir()
.map(|root_dir| root_dir.to_string_lossy().to_string()),
root_name: worktree.root_name().to_string(),
})
})
.collect::<Vec<_>>()
});
if worktrees.is_empty() {
return Ok("No worktrees found in the current project.".to_string());
}
let mut result = String::from("Worktrees in the current project:\n\n");
for worktree in worktrees {
result.push_str(&serde_json::to_string(&worktree)?);
}
Ok(result)
})?
})
}
}

View File

@@ -3,7 +3,8 @@ use std::sync::Arc;
use anyhow::{anyhow, Result};
use assistant_tool::Tool;
use chrono::{Local, Utc};
use gpui::{App, Task, WeakEntity, Window};
use gpui::{App, Entity, Task};
use project::Project;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
@@ -41,8 +42,7 @@ impl Tool for NowTool {
fn run(
self: Arc<Self>,
input: serde_json::Value,
_workspace: WeakEntity<workspace::Workspace>,
_window: &mut Window,
_project: Entity<Project>,
_cx: &mut App,
) -> Task<Result<String>> {
let input: NowToolInput = match serde_json::from_value(input) {

View File

@@ -0,0 +1,62 @@
use std::path::Path;
use std::sync::Arc;
use anyhow::{anyhow, Result};
use assistant_tool::Tool;
use gpui::{App, Entity, Task};
use project::{Project, ProjectPath, WorktreeId};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct ReadFileToolInput {
/// The ID of the worktree in which the file resides.
pub worktree_id: usize,
/// The path to the file to read.
///
/// This path is relative to the worktree root; it must not be an absolute path.
pub path: Arc<Path>,
}
pub struct ReadFileTool;
impl Tool for ReadFileTool {
fn name(&self) -> String {
"read-file".into()
}
fn description(&self) -> String {
"Reads the content of a file specified by a worktree ID and path. Use this tool when you need to access the contents of a file in the project.".into()
}
fn input_schema(&self) -> serde_json::Value {
let schema = schemars::schema_for!(ReadFileToolInput);
serde_json::to_value(&schema).unwrap()
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
project: Entity<Project>,
cx: &mut App,
) -> Task<Result<String>> {
let input = match serde_json::from_value::<ReadFileToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
};
let project_path = ProjectPath {
worktree_id: WorktreeId::from_usize(input.worktree_id),
path: input.path,
};
cx.spawn(|cx| async move {
let buffer = cx
.update(|cx| {
project.update(cx, |project, cx| project.open_buffer(project_path, cx))
})?
.await?;
cx.update(|cx| buffer.read(cx).text())
})
}
}

View File

@@ -663,11 +663,13 @@ impl std::fmt::Debug for BufferDiff {
}
}
#[derive(Clone, Debug)]
pub enum BufferDiffEvent {
DiffChanged {
changed_range: Option<Range<text::Anchor>>,
},
LanguageChanged,
HunksStagedOrUnstaged(Option<Rope>),
}
impl EventEmitter<BufferDiffEvent> for BufferDiff {}
@@ -762,6 +764,17 @@ impl BufferDiff {
self.secondary_diff.clone()
}
pub fn clear_pending_hunks(&mut self, cx: &mut Context<Self>) {
if let Some(secondary_diff) = &self.secondary_diff {
secondary_diff.update(cx, |diff, _| {
diff.inner.pending_hunks.clear();
});
cx.emit(BufferDiffEvent::DiffChanged {
changed_range: Some(Anchor::MIN..Anchor::MAX),
});
}
}
pub fn stage_or_unstage_hunks(
&mut self,
stage: bool,
@@ -784,6 +797,9 @@ impl BufferDiff {
}
});
}
cx.emit(BufferDiffEvent::HunksStagedOrUnstaged(
new_index_text.clone(),
));
if let Some((first, last)) = hunks.first().zip(hunks.last()) {
let changed_range = first.buffer_range.start..last.buffer_range.end;
cx.emit(BufferDiffEvent::DiffChanged {
@@ -900,6 +916,14 @@ impl BufferDiff {
}
}
pub fn hunks<'a>(
&'a self,
buffer_snapshot: &'a text::BufferSnapshot,
cx: &'a App,
) -> impl 'a + Iterator<Item = DiffHunk> {
self.hunks_intersecting_range(Anchor::MIN..Anchor::MAX, buffer_snapshot, cx)
}
pub fn hunks_intersecting_range<'a>(
&'a self,
range: Range<text::Anchor>,

View File

@@ -418,6 +418,8 @@ impl Telemetry {
fn report_event(self: &Arc<Self>, event: Event) {
let mut state = self.state.lock();
// RUST_LOG=telemetry=trace to debug telemetry events
log::trace!(target: "telemetry", "{:?}", event);
if !state.settings.metrics {
return;

View File

@@ -308,7 +308,7 @@ impl Server {
.add_request_handler(forward_read_only_project_request::<proto::InlayHints>)
.add_request_handler(forward_read_only_project_request::<proto::ResolveInlayHint>)
.add_request_handler(forward_read_only_project_request::<proto::OpenBufferByPath>)
.add_request_handler(forward_read_only_project_request::<proto::GitBranches>)
.add_request_handler(forward_read_only_project_request::<proto::GitGetBranches>)
.add_request_handler(forward_read_only_project_request::<proto::OpenUnstagedDiff>)
.add_request_handler(forward_read_only_project_request::<proto::OpenUncommittedDiff>)
.add_request_handler(
@@ -393,9 +393,6 @@ impl Server {
.add_request_handler(forward_mutating_project_request::<proto::OpenContext>)
.add_request_handler(forward_mutating_project_request::<proto::CreateContext>)
.add_request_handler(forward_mutating_project_request::<proto::SynchronizeContexts>)
.add_request_handler(forward_mutating_project_request::<proto::Push>)
.add_request_handler(forward_mutating_project_request::<proto::Pull>)
.add_request_handler(forward_mutating_project_request::<proto::Fetch>)
.add_request_handler(forward_mutating_project_request::<proto::Stage>)
.add_request_handler(forward_mutating_project_request::<proto::Unstage>)
.add_request_handler(forward_mutating_project_request::<proto::Commit>)
@@ -405,6 +402,10 @@ impl Server {
.add_request_handler(forward_read_only_project_request::<proto::GitCheckoutFiles>)
.add_request_handler(forward_mutating_project_request::<proto::SetIndexText>)
.add_request_handler(forward_mutating_project_request::<proto::OpenCommitMessageBuffer>)
.add_request_handler(forward_mutating_project_request::<proto::GitDiff>)
.add_request_handler(forward_mutating_project_request::<proto::GitCreateBranch>)
.add_request_handler(forward_mutating_project_request::<proto::GitChangeBranch>)
.add_request_handler(forward_mutating_project_request::<proto::CheckForPushedCommits>)
.add_message_handler(broadcast_project_message_from_host::<proto::AdvertiseContexts>)
.add_message_handler(update_context)
.add_request_handler({

View File

@@ -2027,6 +2027,15 @@ async fn test_git_blame_is_forwarded(cx_a: &mut TestAppContext, cx_b: &mut TestA
.unwrap()
.downcast::<Editor>()
.unwrap();
let buffer_id_b = editor_b.update(cx_b, |editor_b, cx| {
editor_b
.buffer()
.read(cx)
.as_singleton()
.unwrap()
.read(cx)
.remote_id()
});
// client_b now requests git blame for the open buffer
editor_b.update_in(cx_b, |editor_b, window, cx| {
@@ -2045,6 +2054,7 @@ async fn test_git_blame_is_forwarded(cx_a: &mut TestAppContext, cx_b: &mut TestA
&(0..4)
.map(|row| RowInfo {
buffer_row: Some(row),
buffer_id: Some(buffer_id_b),
..Default::default()
})
.collect::<Vec<_>>(),
@@ -2092,6 +2102,7 @@ async fn test_git_blame_is_forwarded(cx_a: &mut TestAppContext, cx_b: &mut TestA
&(0..4)
.map(|row| RowInfo {
buffer_row: Some(row),
buffer_id: Some(buffer_id_b),
..Default::default()
})
.collect::<Vec<_>>(),
@@ -2127,6 +2138,7 @@ async fn test_git_blame_is_forwarded(cx_a: &mut TestAppContext, cx_b: &mut TestA
&(0..4)
.map(|row| RowInfo {
buffer_row: Some(row),
buffer_id: Some(buffer_id_b),
..Default::default()
})
.collect::<Vec<_>>(),

View File

@@ -6741,19 +6741,24 @@ async fn test_remote_git_branches(
.collect::<HashSet<_>>();
let (project_a, worktree_id) = client_a.build_local_project("/project", cx_a).await;
let project_id = active_call_a
.update(cx_a, |call, cx| call.share_project(project_a.clone(), cx))
.await
.unwrap();
let project_b = client_b.join_remote_project(project_id, cx_b).await;
let root_path = ProjectPath::root_path(worktree_id);
// Client A sees that a guest has joined.
// Client A sees that a guest has joined and the repo has been populated
executor.run_until_parked();
let repo_b = cx_b.update(|cx| project_b.read(cx).active_repository(cx).unwrap());
let root_path = ProjectPath::root_path(worktree_id);
let branches_b = cx_b
.update(|cx| project_b.update(cx, |project, cx| project.branches(root_path.clone(), cx)))
.update(|cx| repo_b.update(cx, |repository, _| repository.branches()))
.await
.unwrap()
.unwrap();
let new_branch = branches[2];
@@ -6765,13 +6770,10 @@ async fn test_remote_git_branches(
assert_eq!(branches_b, branches_set);
cx_b.update(|cx| {
project_b.update(cx, |project, cx| {
project.update_or_create_branch(root_path.clone(), new_branch.to_string(), cx)
})
})
.await
.unwrap();
cx_b.update(|cx| repo_b.read(cx).change_branch(new_branch.to_string()))
.await
.unwrap()
.unwrap();
executor.run_until_parked();
@@ -6789,11 +6791,21 @@ async fn test_remote_git_branches(
// Also try creating a new branch
cx_b.update(|cx| {
project_b.update(cx, |project, cx| {
project.update_or_create_branch(root_path.clone(), "totally-new-branch".to_string(), cx)
})
repo_b
.read(cx)
.create_branch("totally-new-branch".to_string())
})
.await
.unwrap()
.unwrap();
cx_b.update(|cx| {
repo_b
.read(cx)
.change_branch("totally-new-branch".to_string())
})
.await
.unwrap()
.unwrap();
executor.run_until_parked();

View File

@@ -276,11 +276,13 @@ async fn test_ssh_collaboration_git_branches(
// has some git repositories
executor.run_until_parked();
let repo_b = cx_b.update(|cx| project_b.read(cx).active_repository(cx).unwrap());
let root_path = ProjectPath::root_path(worktree_id);
let branches_b = cx_b
.update(|cx| project_b.update(cx, |project, cx| project.branches(root_path.clone(), cx)))
.update(|cx| repo_b.read(cx).branches())
.await
.unwrap()
.unwrap();
let new_branch = branches[2];
@@ -292,13 +294,10 @@ async fn test_ssh_collaboration_git_branches(
assert_eq!(&branches_b, &branches_set);
cx_b.update(|cx| {
project_b.update(cx, |project, cx| {
project.update_or_create_branch(root_path.clone(), new_branch.to_string(), cx)
})
})
.await
.unwrap();
cx_b.update(|cx| repo_b.read(cx).change_branch(new_branch.to_string()))
.await
.unwrap()
.unwrap();
executor.run_until_parked();
@@ -318,11 +317,21 @@ async fn test_ssh_collaboration_git_branches(
// Also try creating a new branch
cx_b.update(|cx| {
project_b.update(cx, |project, cx| {
project.update_or_create_branch(root_path.clone(), "totally-new-branch".to_string(), cx)
})
repo_b
.read(cx)
.create_branch("totally-new-branch".to_string())
})
.await
.unwrap()
.unwrap();
cx_b.update(|cx| {
repo_b
.read(cx)
.change_branch("totally-new-branch".to_string())
})
.await
.unwrap()
.unwrap();
executor.run_until_parked();

View File

@@ -323,7 +323,7 @@ impl ChatPanel {
.my_0p5()
.px_0p5()
.gap_x_1()
.rounded_md()
.rounded_sm()
.child(Icon::new(IconName::ReplyArrowRight).color(Color::Muted))
.when(reply_to_message.is_none(), |el| {
el.child(
@@ -358,7 +358,7 @@ impl ChatPanel {
.my_0p5()
.px_0p5()
.gap_x_1()
.rounded_md()
.rounded_sm()
.overflow_hidden()
.hover(|style| style.bg(cx.theme().colors().element_background))
.child(Icon::new(IconName::ReplyArrowRight).color(Color::Muted))
@@ -476,7 +476,7 @@ impl ChatPanel {
div()
.group("")
.bg(background)
.rounded_md()
.rounded_sm()
.overflow_hidden()
.px_1p5()
.py_0p5()
@@ -563,7 +563,7 @@ impl ChatPanel {
.child(
div()
.px_1()
.rounded_md()
.rounded_sm()
.text_ui_xs(cx)
.bg(cx.theme().colors().background)
.child("New messages"),
@@ -589,7 +589,7 @@ impl ChatPanel {
div()
.w_6()
.bg(cx.theme().colors().element_background)
.hover(|style| style.bg(cx.theme().colors().element_hover).rounded_md())
.hover(|style| style.bg(cx.theme().colors().element_hover).rounded_sm())
.child(child)
}
@@ -604,7 +604,7 @@ impl ChatPanel {
.absolute()
.right_2()
.overflow_hidden()
.rounded_md()
.rounded_sm()
.border_color(cx.theme().colors().element_selected)
.border_1()
.when(!self.has_open_menu(message_id), |el| {

View File

@@ -10,9 +10,9 @@ use gpui::{
};
use language::{
language_settings::SoftWrap, Anchor, Buffer, BufferSnapshot, CodeLabel, LanguageRegistry,
LanguageServerId, ToOffset,
ToOffset,
};
use project::{search::SearchQuery, Completion};
use project::{search::SearchQuery, Completion, CompletionSource};
use settings::Settings;
use std::{
cell::RefCell,
@@ -309,11 +309,9 @@ impl MessageEditor {
old_range: range.clone(),
new_text,
label,
documentation: None,
server_id: LanguageServerId(0), // TODO: Make this optional or something?
lsp_completion: Default::default(), // TODO: Make this optional or something?
confirm: None,
resolved: true,
documentation: None,
source: CompletionSource::Custom,
}
})
.collect()
@@ -531,7 +529,7 @@ impl Render for MessageEditor {
.px_2()
.py_1()
.bg(cx.theme().colors().editor_background)
.rounded_md()
.rounded_sm()
.child(EditorElement::new(
&self.editor,
EditorStyle {

View File

@@ -300,7 +300,7 @@ impl NotificationPanel {
.hover(|style| {
style
.bg(cx.theme().colors().element_selected)
.rounded_md()
.rounded_sm()
})
.child(Label::new(relative_timestamp).color(Color::Muted))
.tooltip(move |_, cx| {

View File

@@ -1,3 +1,4 @@
use std::fmt::Display;
use std::ops::{Deref, DerefMut};
use std::sync::LazyLock;
@@ -8,7 +9,7 @@ use parking_lot::RwLock;
use theme::ActiveTheme;
pub trait Component {
fn scope() -> Option<&'static str>;
fn scope() -> Option<ComponentScope>;
fn name() -> &'static str {
std::any::type_name::<Self>()
}
@@ -31,7 +32,7 @@ pub static COMPONENT_DATA: LazyLock<RwLock<ComponentRegistry>> =
LazyLock::new(|| RwLock::new(ComponentRegistry::new()));
pub struct ComponentRegistry {
components: Vec<(Option<&'static str>, &'static str, Option<&'static str>)>,
components: Vec<(Option<ComponentScope>, &'static str, Option<&'static str>)>,
previews: HashMap<&'static str, fn(&mut Window, &mut App) -> AnyElement>,
}
@@ -77,18 +78,23 @@ pub struct ComponentId(pub &'static str);
#[derive(Clone)]
pub struct ComponentMetadata {
id: ComponentId,
name: SharedString,
scope: Option<SharedString>,
scope: Option<ComponentScope>,
description: Option<SharedString>,
preview: Option<fn(&mut Window, &mut App) -> AnyElement>,
}
impl ComponentMetadata {
pub fn id(&self) -> ComponentId {
self.id.clone()
}
pub fn name(&self) -> SharedString {
self.name.clone()
}
pub fn scope(&self) -> Option<SharedString> {
pub fn scope(&self) -> Option<ComponentScope> {
self.scope.clone()
}
@@ -152,14 +158,16 @@ pub fn components() -> AllComponents {
let data = COMPONENT_DATA.read();
let mut all_components = AllComponents::new();
for &(scope, name, description) in &data.components {
let scope = scope.map(Into::into);
for (ref scope, name, description) in &data.components {
let preview = data.previews.get(name).cloned();
let component_name = SharedString::new_static(name);
let id = ComponentId(name);
all_components.insert(
ComponentId(name),
id.clone(),
ComponentMetadata {
name: name.into(),
scope,
id,
name: component_name,
scope: scope.clone(),
description: description.map(Into::into),
preview,
},
@@ -169,6 +177,59 @@ pub fn components() -> AllComponents {
all_components
}
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
pub enum ComponentScope {
Layout,
Input,
Notification,
Editor,
Collaboration,
VersionControl,
Unknown(SharedString),
}
impl Display for ComponentScope {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
ComponentScope::Layout => write!(f, "Layout"),
ComponentScope::Input => write!(f, "Input"),
ComponentScope::Notification => write!(f, "Notification"),
ComponentScope::Editor => write!(f, "Editor"),
ComponentScope::Collaboration => write!(f, "Collaboration"),
ComponentScope::VersionControl => write!(f, "Version Control"),
ComponentScope::Unknown(name) => write!(f, "Unknown: {}", name),
}
}
}
impl From<&str> for ComponentScope {
fn from(value: &str) -> Self {
match value {
"Layout" => ComponentScope::Layout,
"Input" => ComponentScope::Input,
"Notification" => ComponentScope::Notification,
"Editor" => ComponentScope::Editor,
"Collaboration" => ComponentScope::Collaboration,
"Version Control" | "VersionControl" => ComponentScope::VersionControl,
_ => ComponentScope::Unknown(SharedString::new(value)),
}
}
}
impl From<String> for ComponentScope {
fn from(value: String) -> Self {
match value.as_str() {
"Layout" => ComponentScope::Layout,
"Input" => ComponentScope::Input,
"Notification" => ComponentScope::Notification,
"Editor" => ComponentScope::Editor,
"Collaboration" => ComponentScope::Collaboration,
"Version Control" | "VersionControl" => ComponentScope::VersionControl,
_ => ComponentScope::Unknown(SharedString::new(value)),
}
}
}
/// Which side of the preview to show labels on
#[derive(Default, Debug, Clone, Copy, PartialEq, Eq)]
pub enum ExampleLabelSide {
@@ -177,8 +238,8 @@ pub enum ExampleLabelSide {
/// Right side
Right,
/// Top side
Top,
#[default]
Top,
/// Bottom side
Bottom,
}
@@ -208,6 +269,7 @@ impl RenderOnce for ComponentExample {
.text_size(px(10.))
.text_color(cx.theme().colors().text_muted)
.when(self.grow, |this| this.flex_1())
.when(!self.grow, |this| this.flex_none())
.child(self.element)
.child(self.variant_name)
.into_any_element()

View File

@@ -15,7 +15,12 @@ path = "src/component_preview.rs"
default = []
[dependencies]
client.workspace = true
component.workspace = true
gpui.workspace = true
languages.workspace = true
project.workspace = true
ui.workspace = true
workspace.workspace = true
notifications.workspace = true
collections.workspace = true

View File

@@ -2,18 +2,51 @@
//!
//! A view for exploring Zed components.
use component::{components, ComponentMetadata};
use gpui::{list, prelude::*, uniform_list, App, EventEmitter, FocusHandle, Focusable, Window};
use std::iter::Iterator;
use std::sync::Arc;
use client::UserStore;
use component::{components, ComponentId, ComponentMetadata};
use gpui::{
list, prelude::*, uniform_list, App, Entity, EventEmitter, FocusHandle, Focusable, Task,
WeakEntity, Window,
};
use collections::HashMap;
use gpui::{ListState, ScrollHandle, UniformListScrollHandle};
use ui::{prelude::*, ListItem};
use languages::LanguageRegistry;
use notifications::status_toast::{StatusToast, ToastIcon};
use project::Project;
use ui::{prelude::*, Divider, ListItem, ListSubHeader};
use workspace::{item::ItemEvent, Item, Workspace, WorkspaceId};
use workspace::{AppState, ItemId, SerializableItem};
pub fn init(app_state: Arc<AppState>, cx: &mut App) {
let app_state = app_state.clone();
cx.observe_new(move |workspace: &mut Workspace, _, cx| {
let app_state = app_state.clone();
let weak_workspace = cx.entity().downgrade();
pub fn init(cx: &mut App) {
cx.observe_new(|workspace: &mut Workspace, _, _cx| {
workspace.register_action(
|workspace, _: &workspace::OpenComponentPreview, window, cx| {
let component_preview = cx.new(|cx| ComponentPreview::new(window, cx));
move |workspace, _: &workspace::OpenComponentPreview, window, cx| {
let app_state = app_state.clone();
let language_registry = app_state.languages.clone();
let user_store = app_state.user_store.clone();
let component_preview = cx.new(|cx| {
ComponentPreview::new(
weak_workspace.clone(),
language_registry,
user_store,
None,
cx,
)
});
workspace.add_item_to_active_pane(
Box::new(component_preview),
None,
@@ -27,43 +60,105 @@ pub fn init(cx: &mut App) {
.detach();
}
enum PreviewEntry {
AllComponents,
Separator,
Component(ComponentMetadata),
SectionHeader(SharedString),
}
impl From<ComponentMetadata> for PreviewEntry {
fn from(component: ComponentMetadata) -> Self {
PreviewEntry::Component(component)
}
}
impl From<SharedString> for PreviewEntry {
fn from(section_header: SharedString) -> Self {
PreviewEntry::SectionHeader(section_header)
}
}
#[derive(Default, Debug, Clone, PartialEq, Eq)]
enum PreviewPage {
#[default]
AllComponents,
Component(ComponentId),
}
struct ComponentPreview {
focus_handle: FocusHandle,
_view_scroll_handle: ScrollHandle,
nav_scroll_handle: UniformListScrollHandle,
component_map: HashMap<ComponentId, ComponentMetadata>,
active_page: PreviewPage,
components: Vec<ComponentMetadata>,
component_list: ListState,
selected_index: usize,
cursor_index: usize,
language_registry: Arc<LanguageRegistry>,
workspace: WeakEntity<Workspace>,
user_store: Entity<UserStore>,
}
impl ComponentPreview {
pub fn new(_window: &mut Window, cx: &mut Context<Self>) -> Self {
let components = components().all_sorted();
let initial_length = components.len();
pub fn new(
workspace: WeakEntity<Workspace>,
language_registry: Arc<LanguageRegistry>,
user_store: Entity<UserStore>,
selected_index: impl Into<Option<usize>>,
cx: &mut Context<Self>,
) -> Self {
let sorted_components = components().all_sorted();
let selected_index = selected_index.into().unwrap_or(0);
let component_list = ListState::new(initial_length, gpui::ListAlignment::Top, px(500.0), {
let this = cx.entity().downgrade();
move |ix, window: &mut Window, cx: &mut App| {
this.update(cx, |this, cx| {
this.render_preview(ix, window, cx).into_any_element()
})
.unwrap()
}
});
let component_list = ListState::new(
sorted_components.len(),
gpui::ListAlignment::Top,
px(1500.0),
{
let this = cx.entity().downgrade();
move |ix, window: &mut Window, cx: &mut App| {
this.update(cx, |this, cx| {
let component = this.get_component(ix);
this.render_preview(&component, window, cx)
.into_any_element()
})
.unwrap()
}
},
);
Self {
let mut component_preview = Self {
focus_handle: cx.focus_handle(),
_view_scroll_handle: ScrollHandle::new(),
nav_scroll_handle: UniformListScrollHandle::new(),
components,
language_registry,
user_store,
workspace,
active_page: PreviewPage::AllComponents,
component_map: components().0,
components: sorted_components,
component_list,
selected_index: 0,
cursor_index: selected_index,
};
if component_preview.cursor_index > 0 {
component_preview.scroll_to_preview(component_preview.cursor_index, cx);
}
component_preview.update_component_list(cx);
component_preview
}
fn scroll_to_preview(&mut self, ix: usize, cx: &mut Context<Self>) {
self.component_list.scroll_to_reveal_item(ix);
self.selected_index = ix;
self.cursor_index = ix;
cx.notify();
}
fn set_active_page(&mut self, page: PreviewPage, cx: &mut Context<Self>) {
self.active_page = page;
cx.notify();
}
@@ -71,32 +166,173 @@ impl ComponentPreview {
self.components[ix].clone()
}
fn scope_ordered_entries(&self) -> Vec<PreviewEntry> {
use std::collections::HashMap;
let mut scope_groups: HashMap<Option<ComponentScope>, Vec<ComponentMetadata>> =
HashMap::default();
for component in &self.components {
scope_groups
.entry(component.scope())
.or_insert_with(Vec::new)
.push(component.clone());
}
for components in scope_groups.values_mut() {
components.sort_by_key(|c| c.name().to_lowercase());
}
let mut entries = Vec::new();
let known_scopes = [
ComponentScope::Layout,
ComponentScope::Input,
ComponentScope::Editor,
ComponentScope::Notification,
ComponentScope::Collaboration,
ComponentScope::VersionControl,
];
// Always show all components first
entries.push(PreviewEntry::AllComponents);
entries.push(PreviewEntry::Separator);
for scope in known_scopes.iter() {
let scope_key = Some(scope.clone());
if let Some(components) = scope_groups.remove(&scope_key) {
if !components.is_empty() {
entries.push(PreviewEntry::SectionHeader(scope.to_string().into()));
for component in components {
entries.push(PreviewEntry::Component(component));
}
}
}
}
for (scope, components) in &scope_groups {
if let Some(ComponentScope::Unknown(_)) = scope {
if !components.is_empty() {
if let Some(scope_value) = scope {
entries.push(PreviewEntry::SectionHeader(scope_value.to_string().into()));
}
for component in components {
entries.push(PreviewEntry::Component(component.clone()));
}
}
}
}
if let Some(components) = scope_groups.get(&None) {
if !components.is_empty() {
entries.push(PreviewEntry::Separator);
entries.push(PreviewEntry::SectionHeader("Uncategorized".into()));
for component in components {
entries.push(PreviewEntry::Component(component.clone()));
}
}
}
entries
}
fn render_sidebar_entry(
&self,
ix: usize,
selected: bool,
entry: &PreviewEntry,
cx: &Context<Self>,
) -> impl IntoElement {
let component = self.get_component(ix);
match entry {
PreviewEntry::Component(component_metadata) => {
let id = component_metadata.id();
let selected = self.active_page == PreviewPage::Component(id.clone());
ListItem::new(ix)
.child(Label::new(component_metadata.name().clone()).color(Color::Default))
.selectable(true)
.toggle_state(selected)
.inset(true)
.on_click(cx.listener(move |this, _, _, cx| {
let id = id.clone();
this.set_active_page(PreviewPage::Component(id), cx);
}))
.into_any_element()
}
PreviewEntry::SectionHeader(shared_string) => ListSubHeader::new(shared_string)
.inset(true)
.into_any_element(),
PreviewEntry::AllComponents => {
let selected = self.active_page == PreviewPage::AllComponents;
ListItem::new(ix)
.child(Label::new(component.name().clone()).color(Color::Default))
.selectable(true)
.toggle_state(selected)
.inset(true)
.on_click(cx.listener(move |this, _, _, cx| {
this.scroll_to_preview(ix, cx);
}))
ListItem::new(ix)
.child(Label::new("All Components").color(Color::Default))
.selectable(true)
.toggle_state(selected)
.inset(true)
.on_click(cx.listener(move |this, _, _, cx| {
this.set_active_page(PreviewPage::AllComponents, cx);
}))
.into_any_element()
}
PreviewEntry::Separator => ListItem::new(ix)
.child(h_flex().pt_3().child(Divider::horizontal_dashed()))
.into_any_element(),
}
}
fn update_component_list(&mut self, cx: &mut Context<Self>) {
let new_len = self.scope_ordered_entries().len();
let entries = self.scope_ordered_entries();
let weak_entity = cx.entity().downgrade();
let new_list = ListState::new(
new_len,
gpui::ListAlignment::Top,
px(1500.0),
move |ix, window, cx| {
let entry = &entries[ix];
weak_entity
.update(cx, |this, cx| match entry {
PreviewEntry::Component(component) => this
.render_preview(component, window, cx)
.into_any_element(),
PreviewEntry::SectionHeader(shared_string) => this
.render_scope_header(ix, shared_string.clone(), window, cx)
.into_any_element(),
PreviewEntry::AllComponents => div().w_full().h_0().into_any_element(),
PreviewEntry::Separator => div().w_full().h_0().into_any_element(),
})
.unwrap()
},
);
self.component_list = new_list;
}
fn render_scope_header(
&self,
_ix: usize,
title: SharedString,
_window: &Window,
_cx: &App,
) -> impl IntoElement {
h_flex()
.w_full()
.h_10()
.items_center()
.child(Headline::new(title).size(HeadlineSize::XSmall))
.child(Divider::horizontal())
}
fn render_preview(
&self,
ix: usize,
component: &ComponentMetadata,
window: &mut Window,
cx: &mut Context<Self>,
cx: &mut App,
) -> impl IntoElement {
let component = self.get_component(ix);
let name = component.name();
let scope = component.scope();
@@ -108,7 +344,7 @@ impl ComponentPreview {
v_flex()
.border_1()
.border_color(cx.theme().colors().border)
.rounded_md()
.rounded_sm()
.w_full()
.gap_4()
.py_4()
@@ -142,10 +378,71 @@ impl ComponentPreview {
)
.into_any_element()
}
fn render_all_components(&self) -> impl IntoElement {
v_flex()
.id("component-list")
.px_8()
.pt_4()
.size_full()
.child(
list(self.component_list.clone())
.flex_grow()
.with_sizing_behavior(gpui::ListSizingBehavior::Auto),
)
}
fn render_component_page(
&mut self,
component_id: &ComponentId,
window: &mut Window,
cx: &mut Context<Self>,
) -> impl IntoElement {
let component = self.component_map.get(&component_id);
if let Some(component) = component {
v_flex()
.w_full()
.flex_initial()
.min_h_full()
.child(self.render_preview(component, window, cx))
.into_any_element()
} else {
v_flex()
.size_full()
.items_center()
.justify_center()
.child("Component not found")
.into_any_element()
}
}
fn test_status_toast(&self, window: &mut Window, cx: &mut Context<Self>) {
if let Some(workspace) = self.workspace.upgrade() {
workspace.update(cx, |workspace, cx| {
let status_toast = StatusToast::new(
"`zed/new-notification-system` created!",
window,
cx,
|this, _, cx| {
this.icon(ToastIcon::new(IconName::GitBranchSmall).color(Color::Muted))
.action(
"Open Pull Request",
cx.listener(|_, _, _, cx| cx.open_url("https://github.com/")),
)
},
);
workspace.toggle_status_toast(window, cx, status_toast)
});
}
}
}
impl Render for ComponentPreview {
fn render(&mut self, _window: &mut Window, cx: &mut Context<'_, Self>) -> impl IntoElement {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let sidebar_entries = self.scope_ordered_entries();
let active_page = self.active_page.clone();
h_flex()
.id("component-preview")
.key_context("ComponentPreview")
@@ -155,35 +452,47 @@ impl Render for ComponentPreview {
.track_focus(&self.focus_handle)
.px_2()
.bg(cx.theme().colors().editor_background)
.child(
uniform_list(
cx.entity().clone(),
"component-nav",
self.components.len(),
move |this, range, _window, cx| {
range
.map(|ix| this.render_sidebar_entry(ix, ix == this.selected_index, cx))
.collect()
},
)
.track_scroll(self.nav_scroll_handle.clone())
.pt_4()
.w(px(240.))
.h_full()
.flex_grow(),
)
.child(
v_flex()
.id("component-list")
.px_8()
.pt_4()
.size_full()
.h_full()
.child(
list(self.component_list.clone())
.flex_grow()
.with_sizing_behavior(gpui::ListSizingBehavior::Auto),
uniform_list(
cx.entity().clone(),
"component-nav",
sidebar_entries.len(),
move |this, range, _window, cx| {
range
.map(|ix| {
this.render_sidebar_entry(ix, &sidebar_entries[ix], cx)
})
.collect()
},
)
.track_scroll(self.nav_scroll_handle.clone())
.pt_4()
.w(px(240.))
.h_full()
.flex_1(),
)
.child(
div().w_full().pb_4().child(
Button::new("toast-test", "Launch Toast")
.on_click(cx.listener({
move |this, _, window, cx| {
this.test_status_toast(window, cx);
cx.notify();
}
}))
.full_width(),
),
),
)
.child(match active_page {
PreviewPage::AllComponents => self.render_all_components().into_any_element(),
PreviewPage::Component(id) => self
.render_component_page(&id, window, cx)
.into_any_element(),
})
}
}
@@ -213,13 +522,26 @@ impl Item for ComponentPreview {
fn clone_on_split(
&self,
_workspace_id: Option<WorkspaceId>,
window: &mut Window,
_window: &mut Window,
cx: &mut Context<Self>,
) -> Option<gpui::Entity<Self>>
where
Self: Sized,
{
Some(cx.new(|cx| Self::new(window, cx)))
let language_registry = self.language_registry.clone();
let user_store = self.user_store.clone();
let weak_workspace = self.workspace.clone();
let selected_index = self.cursor_index;
Some(cx.new(|cx| {
Self::new(
weak_workspace,
language_registry,
user_store,
selected_index,
cx,
)
}))
}
fn to_item_events(event: &Self::Event, mut f: impl FnMut(workspace::item::ItemEvent)) {
@@ -227,6 +549,59 @@ impl Item for ComponentPreview {
}
}
// TODO: impl serializable item for component preview so it will restore with the workspace
// ref: https://github.com/zed-industries/zed/blob/32201ac70a501e63dfa2ade9c00f85aea2d4dd94/crates/image_viewer/src/image_viewer.rs#L199
// Use `ImageViewer` as a model for how to do it, except it'll be even simpler
impl SerializableItem for ComponentPreview {
fn serialized_item_kind() -> &'static str {
"ComponentPreview"
}
fn deserialize(
project: Entity<Project>,
workspace: WeakEntity<Workspace>,
_workspace_id: WorkspaceId,
_item_id: ItemId,
window: &mut Window,
cx: &mut App,
) -> Task<gpui::Result<Entity<Self>>> {
let user_store = project.read(cx).user_store().clone();
let language_registry = project.read(cx).languages().clone();
window.spawn(cx, |mut cx| async move {
let user_store = user_store.clone();
let language_registry = language_registry.clone();
let weak_workspace = workspace.clone();
cx.update(|_, cx| {
Ok(cx.new(|cx| {
ComponentPreview::new(weak_workspace, language_registry, user_store, None, cx)
}))
})?
})
}
fn cleanup(
_workspace_id: WorkspaceId,
_alive_items: Vec<ItemId>,
_window: &mut Window,
_cx: &mut App,
) -> Task<gpui::Result<()>> {
Task::ready(Ok(()))
// window.spawn(cx, |_| {
// ...
// })
}
fn serialize(
&mut self,
_workspace: &mut Workspace,
_item_id: ItemId,
_closing: bool,
_window: &mut Window,
_cx: &mut Context<Self>,
) -> Option<Task<gpui::Result<()>>> {
// TODO: Serialize the active index so we can re-open to the same place
None
}
fn should_serialize(&self, _event: &Self::Event) -> bool {
false
}
}


@@ -31,4 +31,3 @@ settings.workspace = true
smol.workspace = true
url = { workspace = true, features = ["serde"] }
util.workspace = true
workspace.workspace = true


@@ -1,8 +1,9 @@
use std::sync::Arc;
use anyhow::{anyhow, bail};
use anyhow::{anyhow, bail, Result};
use assistant_tool::Tool;
use gpui::{App, Entity, Task, Window};
use gpui::{App, Entity, Task};
use project::Project;
use crate::manager::ContextServerManager;
use crate::types;
@@ -49,12 +50,11 @@ impl Tool for ContextServerTool {
}
fn run(
self: std::sync::Arc<Self>,
self: Arc<Self>,
input: serde_json::Value,
_workspace: gpui::WeakEntity<workspace::Workspace>,
_: &mut Window,
_project: Entity<Project>,
cx: &mut App,
) -> gpui::Task<gpui::Result<String>> {
) -> Task<Result<String>> {
if let Some(server) = self.server_manager.read(cx).get_server(&self.server_id) {
cx.foreground_executor().spawn({
let tool_name = self.tool.name.clone();


@@ -623,16 +623,21 @@ impl Copilot {
pub fn sign_out(&mut self, cx: &mut Context<Self>) -> Task<Result<()>> {
self.update_sign_in_status(request::SignInStatus::NotSignedIn, cx);
if let CopilotServer::Running(RunningCopilotServer { lsp: server, .. }) = &self.server {
let server = server.clone();
cx.background_spawn(async move {
server
.request::<request::SignOut>(request::SignOutParams {})
.await?;
match &self.server {
CopilotServer::Running(RunningCopilotServer { lsp: server, .. }) => {
let server = server.clone();
cx.background_spawn(async move {
server
.request::<request::SignOut>(request::SignOutParams {})
.await?;
anyhow::Ok(())
})
}
CopilotServer::Disabled => cx.background_spawn(async move {
clear_copilot_config_dir().await;
anyhow::Ok(())
})
} else {
Task::ready(Err(anyhow!("copilot hasn't started yet")))
}),
_ => Task::ready(Err(anyhow!("copilot hasn't started yet"))),
}
}
@@ -1016,6 +1021,10 @@ async fn clear_copilot_dir() {
remove_matching(paths::copilot_dir(), |_| true).await
}
async fn clear_copilot_config_dir() {
remove_matching(copilot_chat::copilot_chat_config_dir(), |_| true).await
}
async fn get_copilot_lsp(http: Arc<dyn HttpClient>) -> anyhow::Result<PathBuf> {
const SERVER_PATH: &str = "dist/language-server.js";


@@ -4,13 +4,14 @@ use std::sync::OnceLock;
use anyhow::{anyhow, Result};
use chrono::DateTime;
use collections::HashSet;
use fs::Fs;
use futures::{io::BufReader, stream::BoxStream, AsyncBufReadExt, AsyncReadExt, StreamExt};
use gpui::{prelude::*, App, AsyncApp, Global};
use http_client::{AsyncBody, HttpClient, Method, Request as HttpRequest};
use paths::home_dir;
use serde::{Deserialize, Serialize};
use settings::watch_config_file;
use settings::watch_config_dir;
use strum::EnumIter;
pub const COPILOT_CHAT_COMPLETION_URL: &str = "https://api.githubcopilot.com/chat/completions";
@@ -212,7 +213,7 @@ pub fn init(fs: Arc<dyn Fs>, client: Arc<dyn HttpClient>, cx: &mut App) {
cx.set_global(GlobalCopilotChat(copilot_chat));
}
fn copilot_chat_config_dir() -> &'static PathBuf {
pub fn copilot_chat_config_dir() -> &'static PathBuf {
static COPILOT_CHAT_CONFIG_DIR: OnceLock<PathBuf> = OnceLock::new();
COPILOT_CHAT_CONFIG_DIR.get_or_init(|| {
@@ -237,27 +238,18 @@ impl CopilotChat {
}
pub fn new(fs: Arc<dyn Fs>, client: Arc<dyn HttpClient>, cx: &App) -> Self {
let config_paths = copilot_chat_config_paths();
let resolve_config_path = {
let fs = fs.clone();
async move {
for config_path in config_paths.iter() {
if fs.metadata(config_path).await.is_ok_and(|v| v.is_some()) {
return config_path.clone();
}
}
config_paths[0].clone()
}
};
let config_paths: HashSet<PathBuf> = copilot_chat_config_paths().into_iter().collect();
let dir_path = copilot_chat_config_dir();
cx.spawn(|cx| async move {
let config_file = resolve_config_path.await;
let mut config_file_rx = watch_config_file(cx.background_executor(), fs, config_file);
while let Some(contents) = config_file_rx.next().await {
let mut parent_watch_rx = watch_config_dir(
cx.background_executor(),
fs.clone(),
dir_path.clone(),
config_paths,
);
while let Some(contents) = parent_watch_rx.next().await {
let oauth_token = extract_oauth_token(contents);
cx.update(|cx| {
if let Some(this) = Self::global(cx).as_ref() {
this.update(cx, |this, cx| {


@@ -122,7 +122,7 @@ impl CopilotCodeVerification {
.p_1()
.border_1()
.border_muted(cx)
.rounded_md()
.rounded_sm()
.cursor_pointer()
.justify_between()
.on_mouse_down(gpui::MouseButton::Left, {


@@ -311,7 +311,10 @@ impl ProjectDiagnosticsEditor {
cx: &mut Context<Workspace>,
) {
if let Some(existing) = workspace.item_of_type::<ProjectDiagnosticsEditor>(cx) {
workspace.activate_item(&existing, true, true, window, cx);
let is_active = workspace
.active_item(cx)
.is_some_and(|item| item.item_id() == existing.item_id());
workspace.activate_item(&existing, true, !is_active, window, cx);
} else {
let workspace_handle = cx.entity().downgrade();
@@ -973,7 +976,7 @@ fn diagnostic_header_renderer(diagnostic: Diagnostic) -> RenderBlock {
h_flex()
.gap_2()
.px_1()
.rounded_md()
.rounded_sm()
.bg(color.surface_background.opacity(0.5))
.map(|stack| {
stack.child(

View File

@@ -94,6 +94,7 @@ ctor.workspace = true
env_logger.workspace = true
gpui = { workspace = true, features = ["test-support"] }
language = { workspace = true, features = ["test-support"] }
languages = {workspace = true, features = ["test-support"] }
lsp = { workspace = true, features = ["test-support"] }
multi_buffer = { workspace = true, features = ["test-support"] }
project = { workspace = true, features = ["test-support"] }


@@ -340,7 +340,9 @@ gpui::actions!(
MoveToPreviousWordStart,
MoveToStartOfParagraph,
MoveToStartOfExcerpt,
MoveToStartOfNextExcerpt,
MoveToEndOfExcerpt,
MoveToEndOfPreviousExcerpt,
MoveUp,
Newline,
NewlineAbove,
@@ -378,7 +380,9 @@ gpui::actions!(
SelectAll,
SelectAllMatches,
SelectToStartOfExcerpt,
SelectToStartOfNextExcerpt,
SelectToEndOfExcerpt,
SelectToEndOfPreviousExcerpt,
SelectDown,
SelectEnclosingSymbol,
SelectLargerSyntaxNode,


@@ -7,7 +7,6 @@ use std::time::Duration;
pub struct BlinkManager {
blink_interval: Duration,
blink_epoch: usize,
blinking_paused: bool,
visible: bool,
@@ -24,7 +23,6 @@ impl BlinkManager {
Self {
blink_interval,
blink_epoch: 0,
blinking_paused: false,
visible: true,


@@ -2,12 +2,13 @@ use anyhow::Context as _;
use gpui::{App, Context, Entity, Window};
use language::Language;
use url::Url;
use workspace::{OpenOptions, OpenVisible};
use crate::lsp_ext::find_specific_language_server_in_selection;
use crate::{element::register_action, Editor, SwitchSourceHeader};
const CLANGD_SERVER_NAME: &str = "clangd";
use project::lsp_store::clangd_ext::CLANGD_SERVER_NAME;
fn is_c_language(language: &Language) -> bool {
return language.name() == "C++".into() || language.name() == "C".into();
@@ -46,7 +47,7 @@ pub fn switch_source_header(
project.request_lsp(
buffer,
project::LanguageServerToQuery::Other(server_to_query),
project::lsp_ext_command::SwitchSourceHeader,
project::lsp_store::lsp_ext_command::SwitchSourceHeader,
cx,
)
});
@@ -72,7 +73,7 @@ pub fn switch_source_header(
workspace
.update_in(&mut cx, |workspace, window, cx| {
workspace.open_abs_path(path, false, window, cx)
workspace.open_abs_path(path, OpenOptions { visible: Some(OpenVisible::None), ..Default::default() }, window, cx)
})
.with_context(|| {
format!(


@@ -6,11 +6,11 @@ use gpui::{
};
use language::Buffer;
use language::CodeLabel;
use lsp::LanguageServerId;
use markdown::Markdown;
use multi_buffer::{Anchor, ExcerptId};
use ordered_float::OrderedFloat;
use project::lsp_store::CompletionDocumentation;
use project::CompletionSource;
use project::{CodeAction, Completion, TaskSourceKind};
use std::{
@@ -233,11 +233,9 @@ impl CompletionsMenu {
runs: Default::default(),
filter_range: Default::default(),
},
server_id: LanguageServerId(usize::MAX),
documentation: None,
lsp_completion: Default::default(),
confirm: None,
resolved: true,
source: CompletionSource::Custom,
})
.collect();
@@ -500,7 +498,12 @@ impl CompletionsMenu {
// Ignore font weight for syntax highlighting, as we'll use it
// for fuzzy matches.
highlight.font_weight = None;
if completion.lsp_completion.deprecated.unwrap_or(false) {
if completion
.source
.lsp_completion(false)
.and_then(|lsp_completion| lsp_completion.deprecated)
.unwrap_or(false)
{
highlight.strikethrough = Some(StrikethroughStyle {
thickness: 1.0.into(),
..Default::default()
@@ -534,7 +537,7 @@ impl CompletionsMenu {
};
let color_swatch = completion
.color()
.map(|color| div().size_4().bg(color).rounded_sm());
.map(|color| div().size_4().bg(color).rounded_xs());
div().min_w(px(280.)).max_w(px(540.)).child(
ListItem::new(mat.candidate_id)
@@ -708,7 +711,12 @@ impl CompletionsMenu {
let completion = &completions[mat.candidate_id];
let sort_key = completion.sort_key();
let sort_text = completion.lsp_completion.sort_text.as_deref();
let sort_text =
if let CompletionSource::Lsp { lsp_completion, .. } = &completion.source {
lsp_completion.sort_text.as_deref()
} else {
None
};
let score = Reverse(OrderedFloat(mat.score));
if mat.score >= 0.2 {
@@ -851,7 +859,7 @@ impl CodeActionsItem {
pub fn label(&self) -> String {
match self {
Self::CodeAction { action, .. } => action.lsp_action.title.clone(),
Self::CodeAction { action, .. } => action.lsp_action.title().to_owned(),
Self::Task(_, task) => task.resolved_label.clone(),
}
}
@@ -984,7 +992,7 @@ impl CodeActionsMenu {
.overflow_hidden()
.child(
// TASK: It would be good to make lsp_action.title a SharedString to avoid allocating here.
action.lsp_action.title.replace("\n", ""),
action.lsp_action.title().replace("\n", ""),
)
.when(selected, |this| {
this.text_color(colors.text_accent)
@@ -1029,7 +1037,7 @@ impl CodeActionsMenu {
.max_by_key(|(_, action)| match action {
CodeActionsItem::Task(_, task) => task.resolved_label.chars().count(),
CodeActionsItem::CodeAction { action, .. } => {
action.lsp_action.title.chars().count()
action.lsp_action.title().chars().count()
}
})
.map(|(ix, _)| ix),


@@ -28,6 +28,7 @@ mod hover_popover;
mod indent_guides;
mod inlay_hint_cache;
pub mod items;
mod jsx_tag_auto_close;
mod linked_editing_ranges;
mod lsp_ext;
mod mouse_context_menu;
@@ -137,8 +138,9 @@ use multi_buffer::{
use project::{
lsp_store::{CompletionDocumentation, FormatTrigger, LspFormatTarget, OpenLspBufferHandle},
project_settings::{GitGutterSetting, ProjectSettings},
CodeAction, Completion, CompletionIntent, DocumentHighlight, InlayHint, Location, LocationLink,
PrepareRenameResponse, Project, ProjectItem, ProjectTransaction, TaskSourceKind,
CodeAction, Completion, CompletionIntent, CompletionSource, DocumentHighlight, InlayHint,
Location, LocationLink, PrepareRenameResponse, Project, ProjectItem, ProjectTransaction,
TaskSourceKind,
};
use rand::prelude::*;
use rpc::{proto::*, ErrorExt};
@@ -260,6 +262,7 @@ enum DisplayDiffHunk {
display_row: DisplayRow,
},
Unfolded {
is_created_file: bool,
diff_base_byte_range: Range<usize>,
display_row_range: Range<DisplayRow>,
multi_buffer_range: Range<Anchor>,
@@ -724,6 +727,7 @@ pub struct Editor {
use_autoclose: bool,
use_auto_surround: bool,
auto_replace_emoji_shortcode: bool,
jsx_tag_auto_close_enabled_in_any_buffer: bool,
show_git_blame_gutter: bool,
show_git_blame_inline: bool,
show_git_blame_inline_delay_task: Option<Task<()>>,
@@ -1010,8 +1014,8 @@ pub struct ClipboardSelection {
pub len: usize,
/// Whether this was a full-line selection.
pub is_entire_line: bool,
/// The column where this selection originally started.
pub start_column: u32,
/// The indentation of the first line when this content was originally copied.
pub first_line_indent: u32,
}
#[derive(Debug)]
@@ -1199,7 +1203,7 @@ impl Editor {
.bg(cx.theme().colors().ghost_element_background)
.hover(|style| style.bg(cx.theme().colors().ghost_element_hover))
.active(|style| style.bg(cx.theme().colors().ghost_element_active))
.rounded_sm()
.rounded_xs()
.size_full()
.cursor_pointer()
.child("")
@@ -1247,11 +1251,6 @@ impl Editor {
let mut project_subscriptions = Vec::new();
if mode == EditorMode::Full {
if let Some(project) = project.as_ref() {
if buffer.read(cx).is_singleton() {
project_subscriptions.push(cx.observe_in(project, window, |_, _, _, cx| {
cx.emit(EditorEvent::TitleChanged);
}));
}
project_subscriptions.push(cx.subscribe_in(
project,
window,
@@ -1410,6 +1409,7 @@ impl Editor {
use_autoclose: true,
use_auto_surround: true,
auto_replace_emoji_shortcode: false,
jsx_tag_auto_close_enabled_in_any_buffer: false,
leader_peer_id: None,
remote_id: None,
hover_state: Default::default(),
@@ -1493,6 +1493,7 @@ impl Editor {
this.end_selection(window, cx);
this.scroll_manager.show_scrollbar(window, cx);
jsx_tag_auto_close::refresh_enabled_in_any_buffer(&mut this, &buffer, cx);
if mode == EditorMode::Full {
let should_auto_hide_scrollbars = cx.should_auto_hide_scrollbars();
@@ -1572,13 +1573,16 @@ impl Editor {
}
}
if let Some(extension) = self
.buffer
.read(cx)
.as_singleton()
.and_then(|buffer| buffer.read(cx).file()?.path().extension()?.to_str())
{
key_context.set("extension", extension.to_string());
if let Some(singleton_buffer) = self.buffer.read(cx).as_singleton() {
if let Some(extension) = singleton_buffer
.read(cx)
.file()
.and_then(|file| file.path().extension()?.to_str())
{
key_context.set("extension", extension.to_string());
}
} else {
key_context.add("multibuffer");
}
if has_active_edit_prediction {
@@ -2247,6 +2251,9 @@ impl Editor {
cx.subscribe(&other, |this, other, other_evt, cx| match other_evt {
EditorEvent::SelectionsChanged { local: true } => {
let other_selections = other.read(cx).selections.disjoint.to_vec();
if other_selections.is_empty() {
return;
}
this.selections.change_with(cx, |selections| {
selections.select_anchors(other_selections);
});
@@ -2258,6 +2265,9 @@ impl Editor {
cx.subscribe_self::<EditorEvent>(move |this, this_evt, cx| match this_evt {
EditorEvent::SelectionsChanged { local: true } => {
let these_selections = this.selections.disjoint.to_vec();
if these_selections.is_empty() {
return;
}
other.update(cx, |other_editor, cx| {
other_editor.selections.change_with(cx, |selections| {
selections.select_anchors(these_selections);
@@ -2344,7 +2354,7 @@ impl Editor {
pub fn edit_with_block_indent<I, S, T>(
&mut self,
edits: I,
original_start_columns: Vec<u32>,
original_indent_columns: Vec<Option<u32>>,
cx: &mut Context<Self>,
) where
I: IntoIterator<Item = (Range<S>, T)>,
@@ -2359,7 +2369,7 @@ impl Editor {
buffer.edit(
edits,
Some(AutoindentMode::Block {
original_start_columns,
original_indent_columns,
}),
cx,
)
@@ -3095,6 +3105,9 @@ impl Editor {
drop(snapshot);
self.transact(window, cx, |this, window, cx| {
let initial_buffer_versions =
jsx_tag_auto_close::construct_initial_buffer_versions_map(this, &edits, cx);
this.buffer.update(cx, |buffer, cx| {
buffer.edit(edits, this.autoindent_mode.clone(), cx);
});
@@ -3182,6 +3195,7 @@ impl Editor {
this.trigger_completion_on_input(&text, trigger_in_words, window, cx);
linked_editing_ranges::refresh_linked_ranges(this, window, cx);
this.refresh_inline_completion(true, false, window, cx);
jsx_tag_auto_close::handle_from(this, initial_buffer_versions, window, cx);
});
}
@@ -3466,7 +3480,7 @@ impl Editor {
pub fn insert(&mut self, text: &str, window: &mut Window, cx: &mut Context<Self>) {
let autoindent = text.is_empty().not().then(|| AutoindentMode::Block {
original_start_columns: Vec::new(),
original_indent_columns: Vec::new(),
});
self.insert_with_autoindent_mode(text, autoindent, window, cx);
}
@@ -6536,7 +6550,7 @@ impl Editor {
.pl_1()
.pr(padding_right)
.gap_1()
.rounded(px(6.))
.rounded_md()
.border_1()
.bg(Self::edit_prediction_line_popover_bg_color(cx))
.border_color(Self::edit_prediction_callout_popover_border_color(cx))
@@ -7843,7 +7857,7 @@ impl Editor {
for hunk in &hunks {
self.prepare_restore_change(&mut revert_changes, hunk, cx);
}
self.do_stage_or_unstage(false, buffer_id, hunks.into_iter(), window, cx);
self.do_stage_or_unstage(false, buffer_id, hunks.into_iter(), cx);
}
drop(chunk_by);
if !revert_changes.is_empty() {
@@ -7881,6 +7895,9 @@ impl Editor {
hunk: &MultiBufferDiffHunk,
cx: &mut App,
) -> Option<()> {
if hunk.is_created_file() {
return None;
}
let buffer = self.buffer.read(cx);
let diff = buffer.diff_for(hunk.buffer_id)?;
let buffer = buffer.buffer(hunk.buffer_id)?;
@@ -8690,7 +8707,9 @@ impl Editor {
clipboard_selections.push(ClipboardSelection {
len,
is_entire_line,
start_column: selection.start.column,
first_line_indent: buffer
.indent_size_for_line(MultiBufferRow(selection.start.row))
.len,
});
}
}
@@ -8769,7 +8788,7 @@ impl Editor {
clipboard_selections.push(ClipboardSelection {
len,
is_entire_line,
start_column: start.column,
first_line_indent: buffer.indent_size_for_line(MultiBufferRow(start.row)).len,
});
}
}
@@ -8799,8 +8818,8 @@ impl Editor {
let old_selections = this.selections.all::<usize>(cx);
let all_selections_were_entire_line =
clipboard_selections.iter().all(|s| s.is_entire_line);
let first_selection_start_column =
clipboard_selections.first().map(|s| s.start_column);
let first_selection_indent_column =
clipboard_selections.first().map(|s| s.first_line_indent);
if clipboard_selections.len() != old_selections.len() {
clipboard_selections.drain(..);
}
@@ -8815,21 +8834,21 @@ impl Editor {
let mut start_offset = 0;
let mut edits = Vec::new();
let mut original_start_columns = Vec::new();
let mut original_indent_columns = Vec::new();
for (ix, selection) in old_selections.iter().enumerate() {
let to_insert;
let entire_line;
let original_start_column;
let original_indent_column;
if let Some(clipboard_selection) = clipboard_selections.get(ix) {
let end_offset = start_offset + clipboard_selection.len;
to_insert = &clipboard_text[start_offset..end_offset];
entire_line = clipboard_selection.is_entire_line;
start_offset = end_offset + 1;
original_start_column = Some(clipboard_selection.start_column);
original_indent_column = Some(clipboard_selection.first_line_indent);
} else {
to_insert = clipboard_text.as_str();
entire_line = all_selections_were_entire_line;
original_start_column = first_selection_start_column
original_indent_column = first_selection_indent_column
}
// If the corresponding selection was empty when this slice of the
@@ -8845,7 +8864,7 @@ impl Editor {
};
edits.push((range, to_insert));
original_start_columns.extend(original_start_column);
original_indent_columns.push(original_indent_column);
}
drop(snapshot);
@@ -8853,7 +8872,7 @@ impl Editor {
edits,
if auto_indent_on_paste {
Some(AutoindentMode::Block {
original_start_columns,
original_indent_columns,
})
} else {
None
@@ -9829,6 +9848,31 @@ impl Editor {
})
}
pub fn move_to_start_of_next_excerpt(
&mut self,
_: &MoveToStartOfNextExcerpt,
window: &mut Window,
cx: &mut Context<Self>,
) {
if matches!(self.mode, EditorMode::SingleLine { .. }) {
cx.propagate();
return;
}
self.change_selections(Some(Autoscroll::fit()), window, cx, |s| {
s.move_with(|map, selection| {
selection.collapse_to(
movement::start_of_excerpt(
map,
selection.head(),
workspace::searchable::Direction::Next,
),
SelectionGoal::None,
)
});
})
}
pub fn move_to_end_of_excerpt(
&mut self,
_: &MoveToEndOfExcerpt,
@@ -9854,6 +9898,31 @@ impl Editor {
})
}
pub fn move_to_end_of_previous_excerpt(
&mut self,
_: &MoveToEndOfPreviousExcerpt,
window: &mut Window,
cx: &mut Context<Self>,
) {
if matches!(self.mode, EditorMode::SingleLine { .. }) {
cx.propagate();
return;
}
self.change_selections(Some(Autoscroll::fit()), window, cx, |s| {
s.move_with(|map, selection| {
selection.collapse_to(
movement::end_of_excerpt(
map,
selection.head(),
workspace::searchable::Direction::Prev,
),
SelectionGoal::None,
)
});
})
}
pub fn select_to_start_of_excerpt(
&mut self,
_: &SelectToStartOfExcerpt,
@@ -9875,6 +9944,27 @@ impl Editor {
})
}
pub fn select_to_start_of_next_excerpt(
&mut self,
_: &SelectToStartOfNextExcerpt,
window: &mut Window,
cx: &mut Context<Self>,
) {
if matches!(self.mode, EditorMode::SingleLine { .. }) {
cx.propagate();
return;
}
self.change_selections(Some(Autoscroll::fit()), window, cx, |s| {
s.move_heads_with(|map, head, _| {
(
movement::start_of_excerpt(map, head, workspace::searchable::Direction::Next),
SelectionGoal::None,
)
});
})
}
pub fn select_to_end_of_excerpt(
&mut self,
_: &SelectToEndOfExcerpt,
@@ -9896,6 +9986,27 @@ impl Editor {
})
}
pub fn select_to_end_of_previous_excerpt(
&mut self,
_: &SelectToEndOfPreviousExcerpt,
window: &mut Window,
cx: &mut Context<Self>,
) {
if matches!(self.mode, EditorMode::SingleLine { .. }) {
cx.propagate();
return;
}
self.change_selections(Some(Autoscroll::fit()), window, cx, |s| {
s.move_heads_with(|map, head, _| {
(
movement::end_of_excerpt(map, head, workspace::searchable::Direction::Prev),
SelectionGoal::None,
)
});
})
}
pub fn move_to_beginning(
&mut self,
_: &MoveToBeginning,
@@ -11528,7 +11639,7 @@ impl Editor {
fn go_to_next_hunk(&mut self, _: &GoToHunk, window: &mut Window, cx: &mut Context<Self>) {
let snapshot = self.snapshot(window, cx);
let selection = self.selections.newest::<Point>(cx);
self.go_to_hunk_after_or_before_position(
self.go_to_hunk_before_or_after_position(
&snapshot,
selection.head(),
Direction::Next,
@@ -11537,7 +11648,7 @@ impl Editor {
);
}
fn go_to_hunk_after_or_before_position(
fn go_to_hunk_before_or_after_position(
&mut self,
snapshot: &EditorSnapshot,
position: Point,
@@ -11588,7 +11699,7 @@ impl Editor {
) {
let snapshot = self.snapshot(window, cx);
let selection = self.selections.newest::<Point>(cx);
self.go_to_hunk_after_or_before_position(
self.go_to_hunk_before_or_after_position(
&snapshot,
selection.head(),
Direction::Prev,
@@ -13657,13 +13768,13 @@ impl Editor {
pub fn toggle_staged_selected_diff_hunks(
&mut self,
_: &::git::ToggleStaged,
window: &mut Window,
_: &mut Window,
cx: &mut Context<Self>,
) {
let snapshot = self.buffer.read(cx).snapshot(cx);
let ranges: Vec<_> = self.selections.disjoint.iter().map(|s| s.range()).collect();
let stage = self.has_stageable_diff_hunks_in_ranges(&ranges, &snapshot);
self.stage_or_unstage_diff_hunks(stage, &ranges, window, cx);
self.stage_or_unstage_diff_hunks(stage, ranges, cx);
}
pub fn stage_and_next(
@@ -13687,16 +13798,53 @@ impl Editor {
pub fn stage_or_unstage_diff_hunks(
&mut self,
stage: bool,
ranges: &[Range<Anchor>],
window: &mut Window,
ranges: Vec<Range<Anchor>>,
cx: &mut Context<Self>,
) {
let snapshot = self.buffer.read(cx).snapshot(cx);
let chunk_by = self
.diff_hunks_in_ranges(&ranges, &snapshot)
.chunk_by(|hunk| hunk.buffer_id);
for (buffer_id, hunks) in &chunk_by {
self.do_stage_or_unstage(stage, buffer_id, hunks, window, cx);
let task = self.save_buffers_for_ranges_if_needed(&ranges, cx);
cx.spawn(|this, mut cx| async move {
task.await?;
this.update(&mut cx, |this, cx| {
let snapshot = this.buffer.read(cx).snapshot(cx);
let chunk_by = this
.diff_hunks_in_ranges(&ranges, &snapshot)
.chunk_by(|hunk| hunk.buffer_id);
for (buffer_id, hunks) in &chunk_by {
this.do_stage_or_unstage(stage, buffer_id, hunks, cx);
}
})
})
.detach_and_log_err(cx);
}
fn save_buffers_for_ranges_if_needed(
&mut self,
ranges: &[Range<Anchor>],
cx: &mut Context<'_, Editor>,
) -> Task<Result<()>> {
let multibuffer = self.buffer.read(cx);
let snapshot = multibuffer.read(cx);
let buffer_ids: HashSet<_> = ranges
.iter()
.flat_map(|range| snapshot.buffer_ids_for_range(range.clone()))
.collect();
drop(snapshot);
let mut buffers = HashSet::default();
for buffer_id in buffer_ids {
if let Some(buffer_entity) = multibuffer.buffer(buffer_id) {
let buffer = buffer_entity.read(cx);
if buffer.file().is_some_and(|file| file.disk_state().exists()) && buffer.is_dirty()
{
buffers.insert(buffer_entity);
}
}
}
if let Some(project) = &self.project {
project.update(cx, |project, cx| project.save_buffers(buffers, cx))
} else {
Task::ready(Ok(()))
}
}
@@ -13709,26 +13857,11 @@ impl Editor {
let ranges = self.selections.disjoint_anchor_ranges().collect::<Vec<_>>();
if ranges.iter().any(|range| range.start != range.end) {
self.stage_or_unstage_diff_hunks(stage, &ranges[..], window, cx);
self.stage_or_unstage_diff_hunks(stage, ranges, cx);
return;
}
let snapshot = self.snapshot(window, cx);
let newest_range = self.selections.newest::<Point>(cx).range();
let run_twice = snapshot
.hunks_for_ranges([newest_range])
.first()
.is_some_and(|hunk| {
let next_line = Point::new(hunk.row_range.end.0 + 1, 0);
self.hunk_after_position(&snapshot, next_line)
.is_some_and(|other| other.row_range == hunk.row_range)
});
if run_twice {
self.go_to_next_hunk(&GoToHunk, window, cx);
}
self.stage_or_unstage_diff_hunks(stage, &ranges[..], window, cx);
self.stage_or_unstage_diff_hunks(stage, ranges, cx);
self.go_to_next_hunk(&GoToHunk, window, cx);
}
@@ -13737,31 +13870,16 @@ impl Editor {
stage: bool,
buffer_id: BufferId,
hunks: impl Iterator<Item = MultiBufferDiffHunk>,
window: &mut Window,
cx: &mut App,
) {
let Some(project) = self.project.as_ref() else {
return;
};
let Some(buffer) = project.read(cx).buffer_for_id(buffer_id, cx) else {
return;
};
let Some(diff) = self.buffer.read(cx).diff_for(buffer_id) else {
return;
};
) -> Option<()> {
let project = self.project.as_ref()?;
let buffer = project.read(cx).buffer_for_id(buffer_id, cx)?;
let diff = self.buffer.read(cx).diff_for(buffer_id)?;
let buffer_snapshot = buffer.read(cx).snapshot();
let file_exists = buffer_snapshot
.file()
.is_some_and(|file| file.disk_state().exists());
let Some((repo, path)) = project
.read(cx)
.repository_and_path_for_buffer_id(buffer_id, cx)
else {
log::debug!("no git repo for buffer id");
return;
};
let new_index_text = diff.update(cx, |diff, cx| {
diff.update(cx, |diff, cx| {
diff.stage_or_unstage_hunks(
stage,
&hunks
@@ -13777,20 +13895,7 @@ impl Editor {
cx,
)
});
if file_exists {
let buffer_store = project.read(cx).buffer_store().clone();
buffer_store
.update(cx, |buffer_store, cx| buffer_store.save_buffer(buffer, cx))
.detach_and_log_err(cx);
}
let recv = repo
.read(cx)
.set_index_text(&path, new_index_text.map(|rope| rope.to_string()));
cx.background_spawn(async move { recv.await? })
.detach_and_notify_err(window, cx);
None
}
pub fn expand_selected_diff_hunks(&mut self, cx: &mut Context<Self>) {
@@ -14183,6 +14288,13 @@ impl Editor {
EditorSettings::override_global(editor_settings, cx);
}
pub fn line_numbers_enabled(&self, cx: &App) -> bool {
if let Some(show_line_numbers) = self.show_line_numbers {
return show_line_numbers;
}
EditorSettings::get_global(cx).gutter.line_numbers
}
pub fn should_use_relative_line_numbers(&self, cx: &mut App) -> bool {
self.use_relative_line_numbers
.unwrap_or(EditorSettings::get_global(cx).relative_line_numbers)
@@ -14858,14 +14970,14 @@ impl Editor {
&self,
window: &mut Window,
cx: &mut App,
) -> BTreeMap<DisplayRow, Background> {
) -> BTreeMap<DisplayRow, LineHighlight> {
let snapshot = self.snapshot(window, cx);
let mut used_highlight_orders = HashMap::default();
self.highlighted_rows
.iter()
.flat_map(|(_, highlighted_rows)| highlighted_rows.iter())
.fold(
BTreeMap::<DisplayRow, Background>::new(),
BTreeMap::<DisplayRow, LineHighlight>::new(),
|mut unique_rows, highlight| {
let start = highlight.range.start.to_display_point(&snapshot);
let end = highlight.range.end.to_display_point(&snapshot);
@@ -15378,6 +15490,7 @@ impl Editor {
let buffer = self.buffer.read(cx);
self.registered_buffers
.retain(|buffer_id, _| buffer.buffer(*buffer_id).is_some());
jsx_tag_auto_close::refresh_enabled_in_any_buffer(self, multibuffer, cx);
cx.emit(EditorEvent::ExcerptsRemoved { ids: ids.clone() })
}
multi_buffer::Event::ExcerptsEdited {
@@ -15397,6 +15510,7 @@ impl Editor {
}
multi_buffer::Event::Reparsed(buffer_id) => {
self.tasks_update_task = Some(self.refresh_runnables(window, cx));
jsx_tag_auto_close::refresh_enabled_in_any_buffer(self, multibuffer, cx);
cx.emit(EditorEvent::Reparsed(*buffer_id));
}
@@ -15405,19 +15519,15 @@ impl Editor {
}
multi_buffer::Event::LanguageChanged(buffer_id) => {
linked_editing_ranges::refresh_linked_ranges(self, window, cx);
jsx_tag_auto_close::refresh_enabled_in_any_buffer(self, multibuffer, cx);
cx.emit(EditorEvent::Reparsed(*buffer_id));
cx.notify();
}
multi_buffer::Event::DirtyChanged => cx.emit(EditorEvent::DirtyChanged),
multi_buffer::Event::Saved => cx.emit(EditorEvent::Saved),
multi_buffer::Event::FileHandleChanged | multi_buffer::Event::Reloaded => {
cx.emit(EditorEvent::TitleChanged)
}
// multi_buffer::Event::DiffBaseChanged => {
// self.scrollbar_marker_state.dirty = true;
// cx.emit(EditorEvent::DiffBaseChanged);
// cx.notify();
// }
multi_buffer::Event::FileHandleChanged
| multi_buffer::Event::Reloaded
| multi_buffer::Event::BufferDiffChanged => cx.emit(EditorEvent::TitleChanged),
multi_buffer::Event::Closed => cx.emit(EditorEvent::Closed),
multi_buffer::Event::DiagnosticsUpdated => {
self.refresh_active_diagnostics(cx);
@@ -16305,7 +16415,7 @@ fn get_uncommitted_diff_for_buffer(
}
});
cx.spawn(|mut cx| async move {
let diffs = futures::future::join_all(tasks).await;
let diffs = future::join_all(tasks).await;
buffer
.update(&mut cx, |buffer, cx| {
for diff in diffs.into_iter().flatten() {
@@ -16875,38 +16985,41 @@ fn snippet_completions(
Some(Completion {
old_range: range,
new_text: snippet.body.clone(),
resolved: false,
source: CompletionSource::Lsp {
server_id: LanguageServerId(usize::MAX),
resolved: true,
lsp_completion: Box::new(lsp::CompletionItem {
label: snippet.prefix.first().unwrap().clone(),
kind: Some(CompletionItemKind::SNIPPET),
label_details: snippet.description.as_ref().map(|description| {
lsp::CompletionItemLabelDetails {
detail: Some(description.clone()),
description: None,
}
}),
insert_text_format: Some(InsertTextFormat::SNIPPET),
text_edit: Some(lsp::CompletionTextEdit::InsertAndReplace(
lsp::InsertReplaceEdit {
new_text: snippet.body.clone(),
insert: lsp_range,
replace: lsp_range,
},
)),
filter_text: Some(snippet.body.clone()),
sort_text: Some(char::MAX.to_string()),
..lsp::CompletionItem::default()
}),
lsp_defaults: None,
},
label: CodeLabel {
text: matching_prefix.clone(),
runs: vec![],
runs: Vec::new(),
filter_range: 0..matching_prefix.len(),
},
server_id: LanguageServerId(usize::MAX),
documentation: snippet
.description
.clone()
.map(|description| CompletionDocumentation::SingleLine(description.into())),
lsp_completion: lsp::CompletionItem {
label: snippet.prefix.first().unwrap().clone(),
kind: Some(CompletionItemKind::SNIPPET),
label_details: snippet.description.as_ref().map(|description| {
lsp::CompletionItemLabelDetails {
detail: Some(description.clone()),
description: None,
}
}),
insert_text_format: Some(InsertTextFormat::SNIPPET),
text_edit: Some(lsp::CompletionTextEdit::InsertAndReplace(
lsp::InsertReplaceEdit {
new_text: snippet.body.clone(),
insert: lsp_range,
replace: lsp_range,
},
)),
filter_text: Some(snippet.body.clone()),
sort_text: Some(char::MAX.to_string()),
..Default::default()
},
confirm: None,
})
})
@@ -17254,6 +17367,7 @@ impl EditorSnapshot {
if hunk_display_end.column() > 0 {
end_row.0 += 1;
}
let is_created_file = hunk.is_created_file();
DisplayDiffHunk::Unfolded {
status: hunk.status(),
diff_base_byte_range: hunk.diff_base_byte_range,
@@ -17263,6 +17377,7 @@ impl EditorSnapshot {
hunk.buffer_id,
hunk.buffer_range,
),
is_created_file,
}
};
@@ -18402,3 +18517,27 @@ impl Render for MissingEditPredictionKeybindingTooltip {
})
}
}
#[derive(Debug, Clone, Copy, PartialEq)]
pub struct LineHighlight {
pub background: Background,
pub border: Option<gpui::Hsla>,
}
impl From<Hsla> for LineHighlight {
fn from(hsla: Hsla) -> Self {
Self {
background: hsla.into(),
border: None,
}
}
}
impl From<Background> for LineHighlight {
fn from(background: Background) -> Self {
Self {
background,
border: None,
}
}
}
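// A minimal usage sketch (illustration only, not part of the change above): the
// `From` impls let a plain `Hsla` or `Background` stand in wherever a
// `LineHighlight` is expected, while a bordered highlight is built explicitly.
// The function and argument names below are made up for the example.
fn example_line_highlights(color: Hsla, background: Background) -> (LineHighlight, LineHighlight) {
    // A bare color converts into a highlight with no border.
    let plain: LineHighlight = color.into();
    // A bordered row highlight sets the border color explicitly.
    let bordered = LineHighlight {
        background,
        border: Some(color),
    };
    (plain, bordered)
}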


@@ -4931,6 +4931,34 @@ async fn test_paste_multiline(cx: &mut TestAppContext) {
)
);
"});
// Copy an indented block, starting mid-line
cx.set_state(indoc! {"
const a: B = (
c(),
somethin«g(
e,
f
)ˇ»
);
"});
cx.update_editor(|e, window, cx| e.copy(&Copy, window, cx));
// Paste it on a line with a lower indent level
cx.update_editor(|e, window, cx| e.move_to_end(&Default::default(), window, cx));
cx.update_editor(|e, window, cx| e.paste(&Paste, window, cx));
cx.assert_editor_state(indoc! {"
const a: B = (
c(),
something(
e,
f
)
);
g(
e,
f
"});
}
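// Illustration only, not part of the change above: a self-contained sketch of
// the indent-preserving paste behavior this test exercises. It assumes the
// clipboard records the indentation of the first copied line
// (`first_line_indent`) and that pasted continuation lines are shifted by the
// difference between that indent and the indentation at the paste location.
// This is a simplification for explanation, not Zed's block-autoindent code.
fn reindent_pasted_lines(text: &str, first_line_indent: usize, paste_indent: usize) -> String {
    let mut out = String::new();
    for (ix, line) in text.lines().enumerate() {
        if ix == 0 {
            // The first line is inserted at the cursor position unchanged.
            out.push_str(line);
        } else {
            out.push('\n');
            // Strip the indent the line had when it was copied, then re-indent
            // it relative to the destination line.
            let body = line
                .strip_prefix(&" ".repeat(first_line_indent))
                .unwrap_or_else(|| line.trim_start());
            out.push_str(&" ".repeat(paste_indent));
            out.push_str(body);
        }
    }
    out
}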
#[gpui::test]
@@ -12306,24 +12334,6 @@ async fn test_completions_default_resolve_data_handling(cx: &mut TestAppContext)
},
};
let item_0_out = lsp::CompletionItem {
commit_characters: Some(default_commit_characters.clone()),
insert_text_format: Some(default_insert_text_format),
..item_0
};
let items_out = iter::once(item_0_out)
.chain(items[1..].iter().map(|item| lsp::CompletionItem {
commit_characters: Some(default_commit_characters.clone()),
data: Some(default_data.clone()),
insert_text_mode: Some(default_insert_text_mode),
text_edit: Some(lsp::CompletionTextEdit::Edit(lsp::TextEdit {
range: default_edit_range,
new_text: item.label.clone(),
})),
..item.clone()
}))
.collect::<Vec<lsp::CompletionItem>>();
let mut cx = EditorLspTestContext::new_rust(
lsp::ServerCapabilities {
completion_provider: Some(lsp::CompletionOptions {
@@ -12342,10 +12352,11 @@ async fn test_completions_default_resolve_data_handling(cx: &mut TestAppContext)
let completion_data = default_data.clone();
let completion_characters = default_commit_characters.clone();
let completion_items = items.clone();
cx.handle_request::<lsp::request::Completion, _, _>(move |_, _, _| {
let default_data = completion_data.clone();
let default_commit_characters = completion_characters.clone();
let items = items.clone();
let items = completion_items.clone();
async move {
Ok(Some(lsp::CompletionResponse::List(lsp::CompletionList {
items,
@@ -12394,7 +12405,7 @@ async fn test_completions_default_resolve_data_handling(cx: &mut TestAppContext)
.iter()
.map(|mat| mat.string.clone())
.collect::<Vec<String>>(),
items_out
items
.iter()
.map(|completion| completion.label.clone())
.collect::<Vec<String>>()
@@ -12407,14 +12418,18 @@ async fn test_completions_default_resolve_data_handling(cx: &mut TestAppContext)
// with 4 from the end.
assert_eq!(
*resolved_items.lock(),
[
&items_out[0..16],
&items_out[items_out.len() - 4..items_out.len()]
]
.concat()
.iter()
.cloned()
.collect::<Vec<lsp::CompletionItem>>()
[&items[0..16], &items[items.len() - 4..items.len()]]
.concat()
.iter()
.cloned()
.map(|mut item| {
if item.data.is_none() {
item.data = Some(default_data.clone());
}
item
})
.collect::<Vec<lsp::CompletionItem>>(),
"Items sent for resolve should be unchanged modulo resolve `data` filled with default if missing"
);
resolved_items.lock().clear();
@@ -12425,9 +12440,15 @@ async fn test_completions_default_resolve_data_handling(cx: &mut TestAppContext)
// Completions that have already been resolved are skipped.
assert_eq!(
*resolved_items.lock(),
items_out[items_out.len() - 16..items_out.len() - 4]
items[items.len() - 16..items.len() - 4]
.iter()
.cloned()
.map(|mut item| {
if item.data.is_none() {
item.data = Some(default_data.clone());
}
item
})
.collect::<Vec<lsp::CompletionItem>>()
);
resolved_items.lock().clear();
@@ -16385,6 +16406,199 @@ async fn test_folding_buffer_when_multibuffer_has_only_one_excerpt(cx: &mut Test
);
}
#[gpui::test]
async fn test_multi_buffer_navigation_with_folded_buffers(cx: &mut TestAppContext) {
init_test(cx, |_| {});
cx.update(|cx| {
let default_key_bindings = settings::KeymapFile::load_asset_allow_partial_failure(
"keymaps/default-linux.json",
cx,
)
.unwrap();
cx.bind_keys(default_key_bindings);
});
let (editor, cx) = cx.add_window_view(|window, cx| {
let multi_buffer = MultiBuffer::build_multi(
[
("a0\nb0\nc0\nd0\ne0\n", vec![Point::row_range(0..2)]),
("a1\nb1\nc1\nd1\ne1\n", vec![Point::row_range(0..2)]),
("a2\nb2\nc2\nd2\ne2\n", vec![Point::row_range(0..2)]),
("a3\nb3\nc3\nd3\ne3\n", vec![Point::row_range(0..2)]),
],
cx,
);
let mut editor = Editor::new(
EditorMode::Full,
multi_buffer.clone(),
None,
true,
window,
cx,
);
let buffer_ids = multi_buffer.read(cx).excerpt_buffer_ids();
// fold all but the second buffer, so that we test navigating between two
// adjacent folded buffers, as well as folded buffers at the start and
// end of the multibuffer
editor.fold_buffer(buffer_ids[0], cx);
editor.fold_buffer(buffer_ids[2], cx);
editor.fold_buffer(buffer_ids[3], cx);
editor
});
cx.simulate_resize(size(px(1000.), px(1000.)));
let mut cx = EditorTestContext::for_editor_in(editor.clone(), cx).await;
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
ˇ[FOLDED]
[EXCERPT]
a1
b1
[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("down");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
ˇa1
b1
[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("down");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
ˇb1
[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("down");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
b1
ˇ[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("down");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
b1
[EXCERPT]
ˇ[FOLDED]
[EXCERPT]
[FOLDED]
"
});
for _ in 0..5 {
cx.simulate_keystroke("down");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
b1
[EXCERPT]
[FOLDED]
[EXCERPT]
ˇ[FOLDED]
"
});
}
cx.simulate_keystroke("up");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
b1
[EXCERPT]
ˇ[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("up");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
b1
ˇ[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("up");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
a1
ˇb1
[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
cx.simulate_keystroke("up");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
[FOLDED]
[EXCERPT]
ˇa1
b1
[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
for _ in 0..5 {
cx.simulate_keystroke("up");
cx.assert_excerpts_with_selections(indoc! {"
[EXCERPT]
ˇ[FOLDED]
[EXCERPT]
a1
b1
[EXCERPT]
[FOLDED]
[EXCERPT]
[FOLDED]
"
});
}
}
#[gpui::test]
async fn test_inline_completion_text(cx: &mut TestAppContext) {
init_test(cx, |_| {});
@@ -16779,6 +16993,245 @@ async fn test_tree_sitter_brackets_newline_insertion(cx: &mut TestAppContext) {
"});
}
mod autoclose_tags {
use super::*;
use language::language_settings::JsxTagAutoCloseSettings;
use languages::language;
async fn test_setup(cx: &mut TestAppContext) -> EditorTestContext {
init_test(cx, |settings| {
settings.defaults.jsx_tag_auto_close = Some(JsxTagAutoCloseSettings { enabled: true });
});
let mut cx = EditorTestContext::new(cx).await;
cx.update_buffer(|buffer, cx| {
let language = language("tsx", tree_sitter_typescript::LANGUAGE_TSX.into());
buffer.set_language(Some(language), cx)
});
cx
}
macro_rules! check {
($name:ident, $initial:literal + $input:literal => $expected:expr) => {
#[gpui::test]
async fn $name(cx: &mut TestAppContext) {
let mut cx = test_setup(cx).await;
cx.set_state($initial);
cx.run_until_parked();
cx.update_editor(|editor, window, cx| {
editor.handle_input($input, window, cx);
});
cx.run_until_parked();
cx.assert_editor_state($expected);
}
};
}
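// For orientation (added note, grounded in the macro body above): an invocation
// such as `check!(test_basic, "<divˇ" + ">" => "<div>ˇ</div>")` expands to a
// `#[gpui::test]` async fn that sets the initial editor state, types the given
// input, lets pending background work finish, and asserts the expected state.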
check!(
test_basic,
"<divˇ" + ">" => "<div>ˇ</div>"
);
check!(
test_basic_nested,
"<div><divˇ</div>" + ">" => "<div><div>ˇ</div></div>"
);
check!(
test_basic_ignore_already_closed,
"<div><divˇ</div></div>" + ">" => "<div><div>ˇ</div></div>"
);
check!(
test_doesnt_autoclose_closing_tag,
"</divˇ" + ">" => "</div>ˇ"
);
check!(
test_jsx_attr,
"<div attr={</div>}ˇ" + ">" => "<div attr={</div>}>ˇ</div>"
);
check!(
test_ignores_closing_tags_in_expr_block,
"<div><divˇ{</div>}</div>" + ">" => "<div><div>ˇ</div>{</div>}</div>"
);
check!(
test_doesnt_autoclose_on_gt_in_expr,
"<div attr={1 ˇ" + ">" => "<div attr={1 >ˇ"
);
check!(
test_ignores_closing_tags_with_different_tag_names,
"<div><divˇ</div></span>" + ">" => "<div><div>ˇ</div></div></span>"
);
check!(
test_autocloses_in_jsx_expression,
"<div>{<divˇ}</div>" + ">" => "<div>{<div>ˇ</div>}</div>"
);
check!(
test_doesnt_autoclose_already_closed_in_jsx_expression,
"<div>{<divˇ</div>}</div>" + ">" => "<div>{<div>ˇ</div>}</div>"
);
check!(
test_autocloses_fragment,
"" + ">" => "<>ˇ</>"
);
check!(
test_does_not_include_type_argument_in_autoclose_tag_name,
"<Component<T> attr={boolean_value}ˇ" + ">" => "<Component<T> attr={boolean_value}>ˇ</Component>"
);
check!(
test_does_not_autoclose_doctype,
"<!DOCTYPE htmlˇ" + ">" => "<!DOCTYPE html>ˇ"
);
check!(
test_does_not_autoclose_comment,
"<!-- comment --ˇ" + ">" => "<!-- comment -->ˇ"
);
check!(
test_multi_cursor_autoclose_same_tag,
r#"
<divˇ
<divˇ
"#
+ ">" =>
r#"
<div>ˇ</div>
<div>ˇ</div>
"#
);
check!(
test_multi_cursor_autoclose_different_tags,
r#"
<divˇ
<spanˇ
"#
+ ">" =>
r#"
<div>ˇ</div>
<span>ˇ</span>
"#
);
check!(
test_multi_cursor_autoclose_some_dont_autoclose_others,
r#"
<divˇ
<div /ˇ
<spanˇ</span>
<!DOCTYPE htmlˇ
</headˇ
<Component<T>ˇ
ˇ
"#
+ ">" =>
r#"
<div>ˇ</div>
<div />ˇ
<span>ˇ</span>
<!DOCTYPE html>ˇ
</head>ˇ
<Component<T>>ˇ</Component>
"#
);
check!(
test_doesnt_mess_up_trailing_text,
"<divˇfoobar" + ">" => "<div>ˇ</div>foobar"
);
#[gpui::test]
async fn test_multibuffer(cx: &mut TestAppContext) {
init_test(cx, |settings| {
settings.defaults.jsx_tag_auto_close = Some(JsxTagAutoCloseSettings { enabled: true });
});
let buffer_a = cx.new(|cx| {
let mut buf = language::Buffer::local("<div", cx);
buf.set_language(
Some(language("tsx", tree_sitter_typescript::LANGUAGE_TSX.into())),
cx,
);
buf
});
let buffer_b = cx.new(|cx| {
let mut buf = language::Buffer::local("<pre", cx);
buf.set_language(
Some(language("tsx", tree_sitter_typescript::LANGUAGE_TSX.into())),
cx,
);
buf
});
let buffer_c = cx.new(|cx| {
let buf = language::Buffer::local("<span", cx);
buf
});
let buffer = cx.new(|cx| {
let mut buf = MultiBuffer::new(language::Capability::ReadWrite);
buf.push_excerpts(
buffer_a,
[ExcerptRange {
context: text::Anchor::MIN..text::Anchor::MAX,
primary: None,
}],
cx,
);
buf.push_excerpts(
buffer_b,
[ExcerptRange {
context: text::Anchor::MIN..text::Anchor::MAX,
primary: None,
}],
cx,
);
buf.push_excerpts(
buffer_c,
[ExcerptRange {
context: text::Anchor::MIN..text::Anchor::MAX,
primary: None,
}],
cx,
);
buf
});
let editor = cx.add_window(|window, cx| build_editor(buffer.clone(), window, cx));
let mut cx = EditorTestContext::for_editor(editor, cx).await;
cx.update_editor(|editor, window, cx| {
editor.change_selections(None, window, cx, |selections| {
selections.select(vec![
Selection::from_offset(4),
Selection::from_offset(9),
Selection::from_offset(15),
])
})
});
cx.run_until_parked();
cx.update_editor(|editor, window, cx| {
editor.handle_input(">", window, cx);
});
cx.run_until_parked();
cx.assert_editor_state("<div>ˇ</div>\n<pre>ˇ</pre>\n<span>ˇ");
}
}
fn empty_range(row: usize, column: usize) -> Range<DisplayPoint> {
let point = DisplayPoint::new(DisplayRow(row as u32), column as u32);
point..point


@@ -20,10 +20,10 @@ use crate::{
DisplayRow, DocumentHighlightRead, DocumentHighlightWrite, EditDisplayMode, Editor, EditorMode,
EditorSettings, EditorSnapshot, EditorStyle, ExpandExcerpts, FocusedBlock, GoToHunk,
GoToPreviousHunk, GutterDimensions, HalfPageDown, HalfPageUp, HandleInput, HoveredCursor,
InlayHintRefreshReason, InlineCompletion, JumpData, LineDown, LineUp, OpenExcerpts, PageDown,
PageUp, Point, RowExt, RowRangeExt, SelectPhase, SelectedTextHighlight, Selection, SoftWrap,
StickyHeaderExcerpt, ToPoint, ToggleFold, COLUMNAR_SELECTION_MODIFIERS, CURSORS_VISIBLE_FOR,
FILE_HEADER_HEIGHT, GIT_BLAME_MAX_AUTHOR_CHARS_DISPLAYED, MAX_LINE_LEN,
InlayHintRefreshReason, InlineCompletion, JumpData, LineDown, LineHighlight, LineUp,
OpenExcerpts, PageDown, PageUp, Point, RowExt, RowRangeExt, SelectPhase, SelectedTextHighlight,
Selection, SoftWrap, StickyHeaderExcerpt, ToPoint, ToggleFold, COLUMNAR_SELECTION_MODIFIERS,
CURSORS_VISIBLE_FOR, FILE_HEADER_HEIGHT, GIT_BLAME_MAX_AUTHOR_CHARS_DISPLAYED, MAX_LINE_LEN,
MULTI_BUFFER_EXCERPT_HEADER_HEIGHT,
};
use buffer_diff::{DiffHunkStatus, DiffHunkStatusKind};
@@ -77,7 +77,7 @@ use ui::{
POPOVER_Y_PADDING,
};
use unicode_segmentation::UnicodeSegmentation;
use util::{debug_panic, maybe, RangeExt, ResultExt};
use util::{debug_panic, RangeExt, ResultExt};
use workspace::{item::Item, notifications::NotifyTaskExt};
const INLINE_BLAME_PADDING_EM_WIDTHS: f32 = 7.;
@@ -282,7 +282,9 @@ impl EditorElement {
register_action(editor, window, Editor::move_to_beginning);
register_action(editor, window, Editor::move_to_end);
register_action(editor, window, Editor::move_to_start_of_excerpt);
register_action(editor, window, Editor::move_to_start_of_next_excerpt);
register_action(editor, window, Editor::move_to_end_of_excerpt);
register_action(editor, window, Editor::move_to_end_of_previous_excerpt);
register_action(editor, window, Editor::select_up);
register_action(editor, window, Editor::select_down);
register_action(editor, window, Editor::select_left);
@@ -296,7 +298,9 @@ impl EditorElement {
register_action(editor, window, Editor::select_to_start_of_paragraph);
register_action(editor, window, Editor::select_to_end_of_paragraph);
register_action(editor, window, Editor::select_to_start_of_excerpt);
register_action(editor, window, Editor::select_to_start_of_next_excerpt);
register_action(editor, window, Editor::select_to_end_of_excerpt);
register_action(editor, window, Editor::select_to_end_of_previous_excerpt);
register_action(editor, window, Editor::select_to_beginning);
register_action(editor, window, Editor::select_to_end);
register_action(editor, window, Editor::select_all);
@@ -1691,7 +1695,7 @@ impl EditorElement {
let pos_y = content_origin.y
+ line_height * (row.0 as f32 - scroll_pixel_position.y / line_height);
let window_ix = row.minus(start_row) as usize;
let window_ix = row.0.saturating_sub(start_row.0) as usize;
let pos_x = {
let crease_trailer_layout = &crease_trailers[window_ix];
let line_layout = &line_layouts[window_ix];
@@ -1724,7 +1728,7 @@ impl EditorElement {
.h(line_height)
.w_full()
.px_1()
.rounded_sm()
.rounded_xs()
.opacity(opacity)
.bg(severity_to_color(&diagnostic_to_render.severity)
.color(cx)
@@ -2676,24 +2680,21 @@ impl EditorElement {
window: &mut Window,
cx: &mut App,
) -> Div {
let file_status = maybe!({
let project = self.editor.read(cx).project.as_ref()?.read(cx);
let (repo, path) =
project.repository_and_path_for_buffer_id(for_excerpt.buffer_id, cx)?;
let status = repo.read(cx).repository_entry.status_for_path(&path)?;
Some(status.status)
})
.filter(|_| {
self.editor
.read(cx)
.buffer
.read(cx)
.all_diff_hunks_expanded()
});
let include_root = self
.editor
let editor = self.editor.read(cx);
let file_status = editor
.buffer
.read(cx)
.all_diff_hunks_expanded()
.then(|| {
editor
.project
.as_ref()?
.read(cx)
.status_for_buffer_id(for_excerpt.buffer_id, cx)
})
.flatten();
let include_root = editor
.project
.as_ref()
.map(|project| project.read(cx).visible_worktrees(cx).count() > 1)
@@ -2705,7 +2706,7 @@ impl EditorElement {
let parent_path = path.as_ref().and_then(|path| {
Some(path.parent()?.to_string_lossy().to_string() + std::path::MAIN_SEPARATOR_STR)
});
let focus_handle = self.editor.focus_handle(cx);
let focus_handle = editor.focus_handle(cx);
let colors = cx.theme().colors();
div()
@@ -2720,7 +2721,7 @@ impl EditorElement {
.flex_basis(Length::Definite(DefiniteLength::Fraction(0.667)))
.pl_0p5()
.pr_5()
.rounded_md()
.rounded_sm()
.shadow_md()
.border_1()
.map(|div| {
@@ -2744,7 +2745,7 @@ impl EditorElement {
header.child(
div()
.hover(|style| style.bg(colors.element_selected))
.rounded_sm()
.rounded_xs()
.child(
ButtonLike::new("toggle-buffer-fold")
.style(ui::ButtonStyle::Transparent)
@@ -2778,8 +2779,7 @@ impl EditorElement {
)
})
.children(
self.editor
.read(cx)
editor
.addons
.values()
.filter_map(|addon| {
@@ -3952,6 +3952,7 @@ impl EditorElement {
display_row_range,
multi_buffer_range,
status,
is_created_file,
..
} = &hunk
{
@@ -3983,6 +3984,7 @@ impl EditorElement {
display_row_range.start.0,
status,
multi_buffer_range.clone(),
*is_created_file,
line_height,
&editor,
cx,
@@ -4134,46 +4136,74 @@ impl EditorElement {
}
}
let mut paint_highlight =
|highlight_row_start: DisplayRow, highlight_row_end: DisplayRow, color| {
let origin = point(
layout.hitbox.origin.x,
layout.hitbox.origin.y
+ (highlight_row_start.as_f32() - scroll_top)
* layout.position_map.line_height,
);
let size = size(
layout.hitbox.size.width,
layout.position_map.line_height
* highlight_row_end.next_row().minus(highlight_row_start) as f32,
);
window.paint_quad(fill(Bounds { origin, size }, color));
};
let mut paint_highlight = |highlight_row_start: DisplayRow,
highlight_row_end: DisplayRow,
highlight: crate::LineHighlight,
edges| {
let origin = point(
layout.hitbox.origin.x,
layout.hitbox.origin.y
+ (highlight_row_start.as_f32() - scroll_top)
* layout.position_map.line_height,
);
let size = size(
layout.hitbox.size.width,
layout.position_map.line_height
* highlight_row_end.next_row().minus(highlight_row_start) as f32,
);
let mut quad = fill(Bounds { origin, size }, highlight.background);
if let Some(border_color) = highlight.border {
quad.border_color = border_color;
quad.border_widths = edges
}
window.paint_quad(quad);
};
let mut current_paint: Option<(gpui::Background, Range<DisplayRow>)> = None;
let mut current_paint: Option<(LineHighlight, Range<DisplayRow>, Edges<Pixels>)> =
None;
for (&new_row, &new_background) in &layout.highlighted_rows {
match &mut current_paint {
Some((current_background, current_range)) => {
Some((current_background, current_range, mut edges)) => {
let current_background = *current_background;
let new_range_started = current_background != new_background
|| current_range.end.next_row() != new_row;
if new_range_started {
if current_range.end.next_row() == new_row {
edges.bottom = px(0.);
};
paint_highlight(
current_range.start,
current_range.end,
current_background,
edges,
);
current_paint = Some((new_background, new_row..new_row));
let edges = Edges {
top: if current_range.end.next_row() != new_row {
px(1.)
} else {
px(0.)
},
bottom: px(1.),
..Default::default()
};
current_paint = Some((new_background, new_row..new_row, edges));
continue;
} else {
current_range.end = current_range.end.next_row();
}
}
None => current_paint = Some((new_background, new_row..new_row)),
None => {
let edges = Edges {
top: px(1.),
bottom: px(1.),
..Default::default()
};
current_paint = Some((new_background, new_row..new_row, edges))
}
};
}
if let Some((color, range)) = current_paint {
paint_highlight(range.start, range.end, color);
if let Some((color, range, edges)) = current_paint {
paint_highlight(range.start, range.end, color, edges);
}
let scroll_left =
@@ -4351,6 +4381,11 @@ impl EditorElement {
fn paint_gutter_diff_hunks(layout: &mut EditorLayout, window: &mut Window, cx: &mut App) {
let is_light = cx.theme().appearance().is_light();
let hunk_style = ProjectSettings::get_global(cx)
.git
.hunk_style
.unwrap_or_default();
if layout.display_hunks.is_empty() {
return;
}
@@ -4414,9 +4449,23 @@ impl EditorElement {
if let Some((hunk_bounds, mut background_color, corner_radii, secondary_status)) =
hunk_to_paint
{
if secondary_status.has_secondary_hunk() {
background_color =
background_color.opacity(if is_light { 0.2 } else { 0.32 });
match hunk_style {
GitHunkStyleSetting::Transparent | GitHunkStyleSetting::Pattern => {
if secondary_status.has_secondary_hunk() {
background_color =
background_color.opacity(if is_light { 0.2 } else { 0.32 });
}
}
GitHunkStyleSetting::StagedPattern
| GitHunkStyleSetting::StagedTransparent => {
if !secondary_status.has_secondary_hunk() {
background_color =
background_color.opacity(if is_light { 0.2 } else { 0.32 });
}
}
GitHunkStyleSetting::StagedBorder | GitHunkStyleSetting::Border => {
// Don't change the background color
}
}
// Flatten the background color with the editor color to prevent
@@ -6736,10 +6785,10 @@ impl Element for EditorElement {
.update(cx, |editor, cx| editor.highlighted_display_rows(window, cx));
let is_light = cx.theme().appearance().is_light();
let use_pattern = ProjectSettings::get_global(cx)
let hunk_style = ProjectSettings::get_global(cx)
.git
.hunk_style
.map_or(false, |style| matches!(style, GitHunkStyleSetting::Pattern));
.unwrap_or_default();
for (ix, row_info) in row_infos.iter().enumerate() {
let Some(diff_status) = row_info.diff_status else {
@@ -6759,26 +6808,74 @@ impl Element for EditorElement {
let unstaged = diff_status.has_secondary_hunk();
let hunk_opacity = if is_light { 0.16 } else { 0.12 };
let slash_width = line_height.0 / 1.5; // ~16 by default
let staged_background =
solid_background(background_color.opacity(hunk_opacity));
let unstaged_background = if use_pattern {
pattern_slash(
background_color.opacity(hunk_opacity),
window.rem_size().0 * 1.125, // ~18 by default
)
} else {
solid_background(background_color.opacity(if is_light {
0.08
} else {
0.04
}))
let staged_highlight: LineHighlight = match hunk_style {
GitHunkStyleSetting::Transparent
| GitHunkStyleSetting::Pattern
| GitHunkStyleSetting::Border => {
solid_background(background_color.opacity(hunk_opacity)).into()
}
GitHunkStyleSetting::StagedPattern => {
pattern_slash(background_color.opacity(hunk_opacity), slash_width)
.into()
}
GitHunkStyleSetting::StagedTransparent => {
solid_background(background_color.opacity(if is_light {
0.08
} else {
0.04
}))
.into()
}
GitHunkStyleSetting::StagedBorder => LineHighlight {
background: (background_color.opacity(if is_light {
0.08
} else {
0.06
}))
.into(),
border: Some(if is_light {
background_color.opacity(0.48)
} else {
background_color.opacity(0.36)
}),
},
};
let unstaged_highlight = match hunk_style {
GitHunkStyleSetting::Transparent => {
solid_background(background_color.opacity(if is_light {
0.08
} else {
0.04
}))
.into()
}
GitHunkStyleSetting::Pattern => {
pattern_slash(background_color.opacity(hunk_opacity), slash_width)
.into()
}
GitHunkStyleSetting::Border => LineHighlight {
background: (background_color.opacity(if is_light {
0.08
} else {
0.02
}))
.into(),
border: Some(background_color.opacity(0.5)),
},
GitHunkStyleSetting::StagedPattern
| GitHunkStyleSetting::StagedTransparent
| GitHunkStyleSetting::StagedBorder => {
solid_background(background_color.opacity(hunk_opacity)).into()
}
};
let background = if unstaged {
unstaged_background
unstaged_highlight
} else {
staged_background
staged_highlight
};
highlighted_rows
@@ -7627,7 +7724,7 @@ pub struct EditorLayout {
indent_guides: Option<Vec<IndentGuideLayout>>,
visible_display_row_range: Range<DisplayRow>,
active_rows: BTreeMap<DisplayRow, bool>,
highlighted_rows: BTreeMap<DisplayRow, gpui::Background>,
highlighted_rows: BTreeMap<DisplayRow, LineHighlight>,
line_elements: SmallVec<[AnyElement; 1]>,
line_numbers: Arc<HashMap<MultiBufferRow, LineNumberLayout>>,
display_hunks: Vec<(DisplayDiffHunk, Option<Hitbox>)>,
@@ -8790,6 +8887,7 @@ fn diff_hunk_controls(
row: u32,
status: &DiffHunkStatus,
hunk_range: Range<Anchor>,
is_created_file: bool,
line_height: Pixels,
editor: &Entity<Editor>,
cx: &mut App,
@@ -8798,13 +8896,16 @@ fn diff_hunk_controls(
.h(line_height)
.mr_1()
.gap_1()
.px_1()
.px_0p5()
.pb_1()
.border_x_1()
.border_b_1()
.border_color(cx.theme().colors().border_variant)
.rounded_b_lg()
.bg(cx.theme().colors().editor_background)
.gap_1()
.occlude()
.shadow_md()
.child(if status.has_secondary_hunk() {
Button::new(("stage", row as u64), "Stage")
.alpha(if status.is_pending() { 0.66 } else { 1.0 })
@@ -8822,12 +8923,11 @@ fn diff_hunk_controls(
})
.on_click({
let editor = editor.clone();
move |_event, window, cx| {
move |_event, _window, cx| {
editor.update(cx, |editor, cx| {
editor.stage_or_unstage_diff_hunks(
true,
&[hunk_range.start..hunk_range.start],
window,
vec![hunk_range.start..hunk_range.start],
cx,
);
});
@@ -8850,12 +8950,11 @@ fn diff_hunk_controls(
})
.on_click({
let editor = editor.clone();
move |_event, window, cx| {
move |_event, _window, cx| {
editor.update(cx, |editor, cx| {
editor.stage_or_unstage_diff_hunks(
false,
&[hunk_range.start..hunk_range.start],
window,
vec![hunk_range.start..hunk_range.start],
cx,
);
});
@@ -8863,7 +8962,7 @@ fn diff_hunk_controls(
})
})
.child(
Button::new("discard", "Restore")
Button::new("restore", "Restore")
.tooltip({
let focus_handle = editor.focus_handle(cx);
move |window, cx| {
@@ -8885,7 +8984,8 @@ fn diff_hunk_controls(
editor.restore_hunks_in_ranges(vec![point..point], window, cx);
});
}
}),
})
.disabled(is_created_file),
)
.when(
!editor.read(cx).buffer().read(cx).all_diff_hunks_expanded(),
@@ -8914,7 +9014,7 @@ fn diff_hunk_controls(
let snapshot = editor.snapshot(window, cx);
let position =
hunk_range.end.to_point(&snapshot.buffer_snapshot);
editor.go_to_hunk_after_or_before_position(
editor.go_to_hunk_before_or_after_position(
&snapshot,
position,
Direction::Next,
@@ -8950,7 +9050,7 @@ fn diff_hunk_controls(
let snapshot = editor.snapshot(window, cx);
let point =
hunk_range.start.to_point(&snapshot.buffer_snapshot);
editor.go_to_hunk_after_or_before_position(
editor.go_to_hunk_before_or_after_position(
&snapshot,
point,
Direction::Prev,

View File

@@ -195,9 +195,12 @@ impl GitBlame {
) -> impl 'a + Iterator<Item = Option<BlameEntry>> {
self.sync(cx);
let buffer_id = self.buffer_snapshot.remote_id();
let mut cursor = self.entries.cursor::<u32>(&());
rows.into_iter().map(move |info| {
let row = info.buffer_row?;
let row = info
.buffer_row
.filter(|_| info.buffer_id == Some(buffer_id))?;
cursor.seek_forward(&row, Bias::Right, &());
cursor.item()?.blame.clone()
})
@@ -535,6 +538,7 @@ mod tests {
use serde_json::json;
use settings::SettingsStore;
use std::{cmp, env, ops::Range, path::Path};
use text::BufferId;
use unindent::Unindent as _;
use util::{path, RandomCharIter};
@@ -552,16 +556,18 @@ mod tests {
#[track_caller]
fn assert_blame_rows(
blame: &mut GitBlame,
buffer_id: BufferId,
rows: Range<u32>,
expected: Vec<Option<BlameEntry>>,
cx: &mut Context<GitBlame>,
) {
assert_eq!(
pretty_assertions::assert_eq!(
blame
.blame_for_rows(
&rows
.map(|row| RowInfo {
buffer_row: Some(row),
buffer_id: Some(buffer_id),
..Default::default()
})
.collect::<Vec<_>>(),
@@ -694,6 +700,7 @@ mod tests {
})
.await
.unwrap();
let buffer_id = buffer.update(cx, |buffer, _| buffer.remote_id());
let git_blame = cx.new(|cx| GitBlame::new(buffer.clone(), project, false, true, cx));
@@ -701,12 +708,13 @@ mod tests {
git_blame.update(cx, |blame, cx| {
// All lines
assert_eq!(
pretty_assertions::assert_eq!(
blame
.blame_for_rows(
&(0..8)
.map(|buffer_row| RowInfo {
buffer_row: Some(buffer_row),
buffer_id: Some(buffer_id),
..Default::default()
})
.collect::<Vec<_>>(),
@@ -725,12 +733,13 @@ mod tests {
]
);
// Subset of lines
assert_eq!(
pretty_assertions::assert_eq!(
blame
.blame_for_rows(
&(1..4)
.map(|buffer_row| RowInfo {
buffer_row: Some(buffer_row),
buffer_id: Some(buffer_id),
..Default::default()
})
.collect::<Vec<_>>(),
@@ -744,12 +753,13 @@ mod tests {
]
);
// Subset of lines, with some not displayed
assert_eq!(
pretty_assertions::assert_eq!(
blame
.blame_for_rows(
&[
RowInfo {
buffer_row: Some(1),
buffer_id: Some(buffer_id),
..Default::default()
},
Default::default(),
@@ -800,6 +810,7 @@ mod tests {
})
.await
.unwrap();
let buffer_id = buffer.update(cx, |buffer, _| buffer.remote_id());
let git_blame = cx.new(|cx| GitBlame::new(buffer.clone(), project, false, true, cx));
@@ -810,6 +821,7 @@ mod tests {
// lines.
assert_blame_rows(
blame,
buffer_id,
0..4,
vec![
Some(blame_entry("1b1b1b", 0..4)),
@@ -828,6 +840,7 @@ mod tests {
git_blame.update(cx, |blame, cx| {
assert_blame_rows(
blame,
buffer_id,
0..2,
vec![None, Some(blame_entry("1b1b1b", 0..4))],
cx,
@@ -840,6 +853,7 @@ mod tests {
git_blame.update(cx, |blame, cx| {
assert_blame_rows(
blame,
buffer_id,
1..4,
vec![
None,
@@ -852,7 +866,13 @@ mod tests {
// Before we insert a newline at the end, sanity check:
git_blame.update(cx, |blame, cx| {
assert_blame_rows(blame, 3..4, vec![Some(blame_entry("1b1b1b", 0..4))], cx);
assert_blame_rows(
blame,
buffer_id,
3..4,
vec![Some(blame_entry("1b1b1b", 0..4))],
cx,
);
});
// Insert a newline at the end
buffer.update(cx, |buffer, cx| {
@@ -862,6 +882,7 @@ mod tests {
git_blame.update(cx, |blame, cx| {
assert_blame_rows(
blame,
buffer_id,
3..5,
vec![Some(blame_entry("1b1b1b", 0..4)), None],
cx,
@@ -870,7 +891,13 @@ mod tests {
// Before we insert a newline at the start, sanity check:
git_blame.update(cx, |blame, cx| {
assert_blame_rows(blame, 2..3, vec![Some(blame_entry("1b1b1b", 0..4))], cx);
assert_blame_rows(
blame,
buffer_id,
2..3,
vec![Some(blame_entry("1b1b1b", 0..4))],
cx,
);
});
// Usage example
@@ -882,6 +909,7 @@ mod tests {
git_blame.update(cx, |blame, cx| {
assert_blame_rows(
blame,
buffer_id,
2..4,
vec![None, Some(blame_entry("1b1b1b", 0..4))],
cx,

View File

@@ -241,8 +241,10 @@ impl Editor {
}
})
.collect();
return self.navigate_to_hover_links(None, links, modifiers.alt, window, cx);
let navigate_task =
self.navigate_to_hover_links(None, links, modifiers.alt, window, cx);
self.select(SelectPhase::End, window, cx);
return navigate_task;
}
}
@@ -258,7 +260,7 @@ impl Editor {
cx,
);
if point.as_valid().is_some() {
let navigate_task = if point.as_valid().is_some() {
if modifiers.shift {
self.go_to_type_definition(&GoToTypeDefinition, window, cx)
} else {
@@ -266,7 +268,9 @@ impl Editor {
}
} else {
Task::ready(Ok(Navigated::No))
}
};
self.select(SelectPhase::End, window, cx);
return navigate_task;
}
}

View File

@@ -25,7 +25,7 @@ use theme::ThemeSettings;
use ui::{prelude::*, theme_is_transparent, Scrollbar, ScrollbarState};
use url::Url;
use util::TryFutureExt;
use workspace::Workspace;
use workspace::{OpenOptions, OpenVisible, Workspace};
pub const HOVER_REQUEST_DELAY_MILLIS: u64 = 200;
pub const MIN_POPOVER_CHARACTER_WIDTH: f32 = 20.;
@@ -632,8 +632,15 @@ pub fn open_markdown_url(link: SharedString, window: &mut Window, cx: &mut App)
if uri.scheme() == "file" {
if let Some(workspace) = window.root::<Workspace>().flatten() {
workspace.update(cx, |workspace, cx| {
let task =
workspace.open_abs_path(PathBuf::from(uri.path()), false, window, cx);
let task = workspace.open_abs_path(
PathBuf::from(uri.path()),
OpenOptions {
visible: Some(OpenVisible::None),
..Default::default()
},
window,
cx,
);
cx.spawn_in(window, |_, mut cx| async move {
let item = task.await?;

View File

@@ -38,10 +38,14 @@ use text::{BufferId, Selection};
use theme::{Theme, ThemeSettings};
use ui::{prelude::*, IconDecorationKind};
use util::{paths::PathExt, ResultExt, TryFutureExt};
use workspace::item::{Dedup, ItemSettings, SerializableItem, TabContentParams};
use workspace::{
item::{BreadcrumbText, FollowEvent},
searchable::SearchOptions,
OpenVisible,
};
use workspace::{
item::{Dedup, ItemSettings, SerializableItem, TabContentParams},
OpenOptions,
};
use workspace::{
item::{FollowableItem, Item, ItemEvent, ProjectItem},
@@ -1157,7 +1161,15 @@ impl SerializableItem for Editor {
}
None => {
let open_by_abs_path = workspace.update(cx, |workspace, cx| {
workspace.open_abs_path(abs_path.clone(), false, window, cx)
workspace.open_abs_path(
abs_path.clone(),
OpenOptions {
visible: Some(OpenVisible::None),
..Default::default()
},
window,
cx,
)
});
window.spawn(cx, |mut cx| async move {
let editor = open_by_abs_path?.await?.downcast::<Editor>().with_context(|| format!("Failed to downcast to Editor after opening abs path {abs_path:?}"))?;

View File

@@ -0,0 +1,616 @@
use anyhow::{anyhow, Context as _, Result};
use collections::HashMap;
use gpui::{Context, Entity, Window};
use multi_buffer::{MultiBuffer, ToOffset};
use std::ops::Range;
use util::ResultExt as _;
use language::{BufferSnapshot, JsxTagAutoCloseConfig, Node};
use text::{Anchor, OffsetRangeExt as _};
use crate::Editor;
pub struct JsxTagCompletionState {
edit_index: usize,
open_tag_range: Range<usize>,
}
/// Index of the named child within an open or close tag
/// that corresponds to the tag name
/// Note that this is not configurable, i.e. we assume the first
/// named child of a tag node is the tag name
const TS_NODE_TAG_NAME_CHILD_INDEX: usize = 0;
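// Illustrative example (assuming a typical JSX grammar, not taken from the original source):
// `<div className="x">` parses to an open-tag node whose first named child is the `div`
// identifier, followed by attribute nodes, so index 0 yields the tag name.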
/// Maximum number of parent elements to walk back when checking if an open tag
/// is already closed.
///
/// See the comment in `generate_auto_close_edits` for more details
const ALREADY_CLOSED_PARENT_ELEMENT_WALK_BACK_LIMIT: usize = 2;
pub(crate) fn should_auto_close(
buffer: &BufferSnapshot,
edited_ranges: &[Range<usize>],
config: &JsxTagAutoCloseConfig,
) -> Option<Vec<JsxTagCompletionState>> {
let mut to_auto_edit = vec![];
for (index, edited_range) in edited_ranges.iter().enumerate() {
let text = buffer
.text_for_range(edited_range.clone())
.collect::<String>();
if !text.ends_with(">") {
continue;
}
let Some(layer) = buffer.smallest_syntax_layer_containing(edited_range.clone()) else {
continue;
};
let Some(node) = layer
.node()
.named_descendant_for_byte_range(edited_range.start, edited_range.end)
else {
continue;
};
let mut jsx_open_tag_node = node;
if node.grammar_name() != config.open_tag_node_name {
if let Some(parent) = node.parent() {
if parent.grammar_name() == config.open_tag_node_name {
jsx_open_tag_node = parent;
}
}
}
if jsx_open_tag_node.grammar_name() != config.open_tag_node_name {
continue;
}
let first_two_chars: Option<[char; 2]> = {
let mut chars = buffer
.text_for_range(jsx_open_tag_node.byte_range())
.flat_map(|chunk| chunk.chars());
if let (Some(c1), Some(c2)) = (chars.next(), chars.next()) {
Some([c1, c2])
} else {
None
}
};
if let Some(chars) = first_two_chars {
if chars[0] != '<' {
continue;
}
if chars[1] == '!' || chars[1] == '/' {
continue;
}
}
to_auto_edit.push(JsxTagCompletionState {
edit_index: index,
open_tag_range: jsx_open_tag_node.byte_range(),
});
}
if to_auto_edit.is_empty() {
return None;
} else {
return Some(to_auto_edit);
}
}
pub(crate) fn generate_auto_close_edits(
buffer: &BufferSnapshot,
ranges: &[Range<usize>],
config: &JsxTagAutoCloseConfig,
state: Vec<JsxTagCompletionState>,
) -> Result<Vec<(Range<Anchor>, String)>> {
let mut edits = Vec::with_capacity(state.len());
for auto_edit in state {
let edited_range = ranges[auto_edit.edit_index].clone();
let Some(layer) = buffer.smallest_syntax_layer_containing(edited_range.clone()) else {
continue;
};
let layer_root_node = layer.node();
let Some(open_tag) = layer_root_node.descendant_for_byte_range(
auto_edit.open_tag_range.start,
auto_edit.open_tag_range.end,
) else {
continue;
};
assert!(open_tag.kind() == config.open_tag_node_name);
let tag_name = open_tag
.named_child(TS_NODE_TAG_NAME_CHILD_INDEX)
.filter(|node| node.kind() == config.tag_name_node_name)
.map_or("".to_string(), |node| {
buffer.text_for_range(node.byte_range()).collect::<String>()
});
/*
* Naive check to see if the tag is already closed
* Essentially all we do is count the number of open and close tags
* with the same tag name as the open tag just entered by the user
* The search is limited to some scope determined by
* `ALREADY_CLOSED_PARENT_ELEMENT_WALK_BACK_LIMIT`
*
* The limit is preferable to walking up the tree until we find a non-tag node,
* and then checking the entire tree, as this is unnecessarily expensive, and
* risks false positives
* e.g. if a `</div>` tag without a corresponding opening tag exists 25 lines away
* and the user types `<div>`, intuitively we still want to auto-close it, because
* the other `</div>` tag is almost certainly not supposed to be the closing tag for the
* current element
*
* We have to walk up the tree some amount because tree-sitter's error correction is not
* designed to handle this case, and usually does not represent the tree structure
* in the way we might expect.
*
* We have to walk up the tree until we hit an element with a different open tag name (`doing_deep_search == true`)
* because tree-sitter may pair the new open tag with the root of the tree's closing tag leaving the
* root's opening tag unclosed.
* e.g
* ```
* <div>
* <div>|cursor here|
* </div>
* ```
* in Astro/Vue/Svelte, tree-sitter represents the tree as
* (
* (jsx_element
* (jsx_opening_element
* "<div>")
* )
* (jsx_element
* (jsx_opening_element
* "<div>") // <- cursor is here
* (jsx_closing_element
* "</div>")
* )
* )
* so if we only walked to the first `jsx_element` node,
* we would mistakenly identify the div entered by the
* user as already being closed, despite this clearly
* being false
*
* The errors that error correction introduces into the tree-sitter tree
* are also why the naive algorithm was chosen. The alternative
* approach would be to maintain or construct a full parse tree (like tree-sitter's)
* that represents errors well enough that we could simply check
* the enclosing scope of the entered tag for a closing tag.
* This is far more complex and expensive, and was deemed impractical
* given that the naive algorithm is sufficient in the majority of cases.
*/
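// Illustrative walkthrough of the count described above (an assumption about typical input,
// not taken from the original source): for the `<div><div>|</div>` example, the walked-back
// scope contains two open `div` tags and one close `div` tag, so `unclosed_open_tag_count`
// ends at 1 and a `</div>` is inserted; for `<div>|</div>` the counts cancel out to 0 and
// the edit is skipped as already closed.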
{
let tag_node_name_equals = |node: &Node, tag_name_node_name: &str, name: &str| {
let is_empty = name.len() == 0;
if let Some(node_name) = node.named_child(TS_NODE_TAG_NAME_CHILD_INDEX) {
if node_name.kind() != tag_name_node_name {
return is_empty;
}
let range = node_name.byte_range();
return buffer.text_for_range(range).equals_str(name);
}
return is_empty;
};
let tree_root_node = {
let mut ancestors = Vec::with_capacity(
// estimate of max, not based on any data,
// but trying to avoid excessive reallocation
16,
);
ancestors.push(layer_root_node);
let mut cur = layer_root_node;
// walk down the tree until we hit the open tag
// note: this is what node.parent() does internally
while let Some(descendant) = cur.child_with_descendant(open_tag) {
if descendant == open_tag {
break;
}
ancestors.push(descendant);
cur = descendant;
}
assert!(ancestors.len() > 0);
let mut tree_root_node = open_tag;
let mut parent_element_node_count = 0;
let mut doing_deep_search = false;
for &ancestor in ancestors.iter().rev() {
tree_root_node = ancestor;
let is_element = ancestor.kind() == config.jsx_element_node_name;
let is_error = ancestor.is_error();
if is_error || !is_element {
break;
}
if is_element {
let is_first = parent_element_node_count == 0;
if !is_first {
let has_open_tag_with_same_tag_name = ancestor
.named_child(0)
.filter(|n| n.kind() == config.open_tag_node_name)
.map_or(false, |element_open_tag_node| {
tag_node_name_equals(
&element_open_tag_node,
&config.tag_name_node_name,
&tag_name,
)
});
if has_open_tag_with_same_tag_name {
doing_deep_search = true;
} else if doing_deep_search {
break;
}
}
parent_element_node_count += 1;
if !doing_deep_search
&& parent_element_node_count
>= ALREADY_CLOSED_PARENT_ELEMENT_WALK_BACK_LIMIT
{
break;
}
}
}
tree_root_node
};
let mut unclosed_open_tag_count: i32 = 0;
let mut cursor = layer_root_node.walk();
let mut stack = Vec::with_capacity(tree_root_node.descendant_count());
stack.extend(tree_root_node.children(&mut cursor));
let mut has_erroneous_close_tag = false;
let mut erroneous_close_tag_node_name = "";
let mut erroneous_close_tag_name_node_name = "";
if let Some(name) = config.erroneous_close_tag_node_name.as_deref() {
has_erroneous_close_tag = true;
erroneous_close_tag_node_name = name;
erroneous_close_tag_name_node_name = config
.erroneous_close_tag_name_node_name
.as_deref()
.unwrap_or(&config.tag_name_node_name);
}
let is_after_open_tag = |node: &Node| {
return node.start_byte() < open_tag.start_byte()
&& node.end_byte() < open_tag.start_byte();
};
// perf: use cursor for more efficient traversal
// if child -> go to child
// else if next sibling -> go to next sibling
// else -> go to parent
// if parent == tree_root_node -> break
while let Some(node) = stack.pop() {
let kind = node.kind();
if kind == config.open_tag_node_name {
if tag_node_name_equals(&node, &config.tag_name_node_name, &tag_name) {
unclosed_open_tag_count += 1;
}
} else if kind == config.close_tag_node_name {
if tag_node_name_equals(&node, &config.tag_name_node_name, &tag_name) {
unclosed_open_tag_count -= 1;
}
} else if has_erroneous_close_tag && kind == erroneous_close_tag_node_name {
if tag_node_name_equals(&node, erroneous_close_tag_name_node_name, &tag_name) {
if !is_after_open_tag(&node) {
unclosed_open_tag_count -= 1;
}
}
} else if kind == config.jsx_element_node_name {
// perf: filter only open,close,element,erroneous nodes
stack.extend(node.children(&mut cursor));
}
}
if unclosed_open_tag_count <= 0 {
// skip if already closed
continue;
}
}
let edit_anchor = buffer.anchor_after(edited_range.end);
let edit_range = edit_anchor..edit_anchor;
edits.push((edit_range, format!("</{}>", tag_name)));
}
return Ok(edits);
}
pub(crate) fn refresh_enabled_in_any_buffer(
editor: &mut Editor,
multi_buffer: &Entity<MultiBuffer>,
cx: &Context<Editor>,
) {
editor.jsx_tag_auto_close_enabled_in_any_buffer = {
let multi_buffer = multi_buffer.read(cx);
let mut found_enabled = false;
multi_buffer.for_each_buffer(|buffer| {
let buffer = buffer.read(cx);
let snapshot = buffer.snapshot();
for syntax_layer in snapshot.syntax_layers() {
let language = syntax_layer.language;
if language.config().jsx_tag_auto_close.is_none() {
continue;
}
let language_settings = language::language_settings::language_settings(
Some(language.name()),
snapshot.file(),
cx,
);
if language_settings.jsx_tag_auto_close.enabled {
found_enabled = true;
}
}
});
found_enabled
};
}
pub(crate) type InitialBufferVersionsMap = HashMap<language::BufferId, clock::Global>;
pub(crate) fn construct_initial_buffer_versions_map<
D: ToOffset + Copy,
_S: Into<std::sync::Arc<str>>,
>(
editor: &Editor,
edits: &[(Range<D>, _S)],
cx: &Context<Editor>,
) -> InitialBufferVersionsMap {
let mut initial_buffer_versions = InitialBufferVersionsMap::default();
if !editor.jsx_tag_auto_close_enabled_in_any_buffer {
return initial_buffer_versions;
}
for (edit_range, _) in edits {
let edit_range_buffer = editor
.buffer()
.read(cx)
.excerpt_containing(edit_range.end, cx)
.map(|e| e.1);
if let Some(buffer) = edit_range_buffer {
let (buffer_id, buffer_version) =
buffer.read_with(cx, |buffer, _| (buffer.remote_id(), buffer.version.clone()));
initial_buffer_versions.insert(buffer_id, buffer_version);
}
}
return initial_buffer_versions;
}
pub(crate) fn handle_from(
editor: &Editor,
initial_buffer_versions: InitialBufferVersionsMap,
window: &mut Window,
cx: &mut Context<Editor>,
) {
if !editor.jsx_tag_auto_close_enabled_in_any_buffer {
return;
}
struct JsxAutoCloseEditContext {
buffer: Entity<language::Buffer>,
config: language::JsxTagAutoCloseConfig,
edits: Vec<Range<usize>>,
}
let mut edit_contexts =
HashMap::<(language::BufferId, language::LanguageId), JsxAutoCloseEditContext>::default();
for (buffer_id, buffer_version_initial) in initial_buffer_versions {
let Some(buffer) = editor.buffer.read(cx).buffer(buffer_id) else {
continue;
};
let snapshot = buffer.read(cx).snapshot();
for edit in buffer.read(cx).edits_since(&buffer_version_initial) {
let Some(language) = snapshot.language_at(edit.new.end) else {
continue;
};
let Some(config) = language.config().jsx_tag_auto_close.as_ref() else {
continue;
};
let language_settings = snapshot.settings_at(edit.new.end, cx);
if !language_settings.jsx_tag_auto_close.enabled {
continue;
}
edit_contexts
.entry((snapshot.remote_id(), language.id()))
.or_insert_with(|| JsxAutoCloseEditContext {
buffer: buffer.clone(),
config: config.clone(),
edits: vec![],
})
.edits
.push(edit.new);
}
}
for ((buffer_id, _), auto_close_context) in edit_contexts {
let JsxAutoCloseEditContext {
buffer,
config: jsx_tag_auto_close_config,
edits: edited_ranges,
} = auto_close_context;
let (buffer_version_initial, mut buffer_parse_status_rx) =
buffer.read_with(cx, |buffer, _| (buffer.version(), buffer.parse_status()));
cx.spawn_in(window, |this, mut cx| async move {
let Some(buffer_parse_status) = buffer_parse_status_rx.recv().await.ok() else {
return Some(());
};
if buffer_parse_status == language::ParseStatus::Parsing {
let Some(language::ParseStatus::Idle) = buffer_parse_status_rx.recv().await.ok()
else {
return Some(());
};
}
let buffer_snapshot = buffer.read_with(&cx, |buf, _| buf.snapshot()).ok()?;
let Some(edit_behavior_state) =
should_auto_close(&buffer_snapshot, &edited_ranges, &jsx_tag_auto_close_config)
else {
return Some(());
};
let ensure_no_edits_since_start = || -> Option<()> {
let has_edits_since_start = this
.read_with(&cx, |this, cx| {
this.buffer.read_with(cx, |buffer, cx| {
buffer.buffer(buffer_id).map_or(true, |buffer| {
buffer.read_with(cx, |buffer, _| {
buffer.has_edits_since(&buffer_version_initial)
})
})
})
})
.ok()?;
if has_edits_since_start {
Err(anyhow!(
"Auto-close Operation Failed - Buffer has edits since start"
))
.log_err()?;
}
Some(())
};
ensure_no_edits_since_start()?;
let edits = cx
.background_executor()
.spawn({
let buffer_snapshot = buffer_snapshot.clone();
async move {
generate_auto_close_edits(
&buffer_snapshot,
&edited_ranges,
&jsx_tag_auto_close_config,
edit_behavior_state,
)
}
})
.await;
let edits = edits
.context("Auto-close Operation Failed - Failed to compute edits")
.log_err()?;
if edits.is_empty() {
return Some(());
}
// check again after awaiting background task before applying edits
ensure_no_edits_since_start()?;
let multi_buffer_snapshot = this
.read_with(&cx, |this, cx| {
this.buffer.read_with(cx, |buffer, cx| buffer.snapshot(cx))
})
.ok()?;
let mut base_selections = Vec::new();
let mut buffer_selection_map = HashMap::default();
{
let selections = this
.read_with(&cx, |this, _| this.selections.disjoint_anchors().clone())
.ok()?;
for selection in selections.iter() {
let Some(selection_buffer_offset_head) =
multi_buffer_snapshot.point_to_buffer_offset(selection.head())
else {
base_selections.push(selection.clone());
continue;
};
let Some(selection_buffer_offset_tail) =
multi_buffer_snapshot.point_to_buffer_offset(selection.tail())
else {
base_selections.push(selection.clone());
continue;
};
let is_entirely_in_buffer = selection_buffer_offset_head.0.remote_id()
== buffer_id
&& selection_buffer_offset_tail.0.remote_id() == buffer_id;
if !is_entirely_in_buffer {
base_selections.push(selection.clone());
continue;
}
let selection_buffer_offset_head = selection_buffer_offset_head.1;
let selection_buffer_offset_tail = selection_buffer_offset_tail.1;
buffer_selection_map.insert(
(selection_buffer_offset_head, selection_buffer_offset_tail),
(selection.clone(), None),
);
}
}
let mut any_selections_need_update = false;
for edit in &edits {
let edit_range_offset = edit.0.to_offset(&buffer_snapshot);
if edit_range_offset.start != edit_range_offset.end {
continue;
}
if let Some(selection) =
buffer_selection_map.get_mut(&(edit_range_offset.start, edit_range_offset.end))
{
if selection.0.head().bias() != text::Bias::Right
|| selection.0.tail().bias() != text::Bias::Right
{
continue;
}
if selection.1.is_none() {
any_selections_need_update = true;
selection.1 = Some(
selection
.0
.clone()
.map(|anchor| multi_buffer_snapshot.anchor_before(anchor)),
);
}
}
}
buffer
.update(&mut cx, |buffer, cx| {
buffer.edit(edits, None, cx);
})
.ok()?;
if any_selections_need_update {
let multi_buffer_snapshot = this
.read_with(&cx, |this, cx| {
this.buffer.read_with(cx, |buffer, cx| buffer.snapshot(cx))
})
.ok()?;
base_selections.extend(buffer_selection_map.values().map(|selection| {
match &selection.1 {
Some(left_biased_selection) => left_biased_selection.clone(),
None => selection.0.clone(),
}
}));
let base_selections = base_selections
.into_iter()
.map(|selection| {
selection.map(|anchor| anchor.to_offset(&multi_buffer_snapshot))
})
.collect::<Vec<_>>();
this.update_in(&mut cx, |this, window, cx| {
this.change_selections_inner(None, false, window, cx, |s| {
s.select(base_selections);
});
})
.ok()?;
}
Some(())
})
.detach();
}
}

View File

@@ -448,7 +448,9 @@ pub fn end_of_excerpt(
if start.row() > DisplayRow(0) {
*start.row_mut() -= 1;
}
map.clip_point(start, Bias::Left)
start = map.clip_point(start, Bias::Left);
*start.column_mut() = 0;
start
}
Direction::Next => {
let mut end = excerpt.end_anchor().to_display_point(&map);

View File

@@ -4,7 +4,7 @@ use anyhow::Context as _;
use gpui::{App, AppContext as _, Context, Entity, Window};
use language::{Capability, Language};
use multi_buffer::MultiBuffer;
use project::lsp_ext_command::ExpandMacro;
use project::lsp_store::{lsp_ext_command::ExpandMacro, rust_analyzer_ext::RUST_ANALYZER_NAME};
use text::ToPointUtf16;
use crate::{
@@ -12,8 +12,6 @@ use crate::{
ExpandMacroRecursively, OpenDocs,
};
const RUST_ANALYZER_NAME: &str = "rust-analyzer";
fn is_rust_language(language: &Language) -> bool {
language.name() == "Rust".into()
}
@@ -131,7 +129,7 @@ pub fn open_docs(editor: &mut Editor, _: &OpenDocs, window: &mut Window, cx: &mu
project.request_lsp(
buffer,
project::LanguageServerToQuery::Other(server_to_query),
project::lsp_ext_command::OpenDocs { position },
project::lsp_store::lsp_ext_command::OpenDocs { position },
cx,
)
});

View File

@@ -429,12 +429,14 @@ impl EditorTestContext {
if expected_selections.len() > 0 {
assert!(
is_selected,
"excerpt {} should be selected. Got {:?}",
ix,
self.editor_state()
"excerpt {ix} should be selected. got {:?}",
self.editor_state(),
);
} else {
assert!(!is_selected, "excerpt {} should not be selected", ix);
assert!(
!is_selected,
"excerpt {ix} should not be selected, got: {selections:?}",
);
}
continue;
}

View File

@@ -17,6 +17,7 @@ async-compression.workspace = true
async-tar.workspace = true
async-trait.workspace = true
collections.workspace = true
convert_case.workspace = true
fs.workspace = true
futures.workspace = true
gpui.workspace = true

View File

@@ -4,6 +4,7 @@ use crate::{
use anyhow::{anyhow, bail, Context as _, Result};
use async_compression::futures::bufread::GzipDecoder;
use async_tar::Archive;
use convert_case::{Case, Casing as _};
use futures::io::BufReader;
use futures::AsyncReadExt;
use http_client::{self, AsyncBody, HttpClient};
@@ -97,6 +98,11 @@ impl ExtensionBuilder {
}
for (grammar_name, grammar_metadata) in &extension_manifest.grammars {
let snake_cased_grammar_name = grammar_name.to_case(Case::Snake);
if grammar_name.as_ref() != snake_cased_grammar_name.as_str() {
bail!("grammar name '{grammar_name}' must be written in snake_case: {snake_cased_grammar_name}");
}
log::info!(
"compiling grammar {grammar_name} for extension {}",
extension_dir.display()

View File

@@ -1,4 +1,4 @@
use anyhow::{anyhow, Context as _, Result};
use anyhow::{anyhow, bail, Context as _, Result};
use collections::{BTreeMap, HashMap};
use fs::Fs;
use language::LanguageName;
@@ -85,6 +85,61 @@ pub struct ExtensionManifest {
pub indexed_docs_providers: BTreeMap<Arc<str>, IndexedDocsProviderEntry>,
#[serde(default)]
pub snippets: Option<PathBuf>,
#[serde(default)]
pub capabilities: Vec<ExtensionCapability>,
}
impl ExtensionManifest {
pub fn allow_exec(
&self,
desired_command: &str,
desired_args: &[impl AsRef<str> + std::fmt::Debug],
) -> Result<()> {
let is_allowed = self.capabilities.iter().any(|capability| match capability {
ExtensionCapability::ProcessExec { command, args } if command == desired_command => {
for (ix, arg) in args.iter().enumerate() {
if arg == "**" {
return true;
}
if ix >= desired_args.len() {
return false;
}
if arg != "*" && arg != desired_args[ix].as_ref() {
return false;
}
}
if args.len() < desired_args.len() {
return false;
}
true
}
_ => false,
});
if !is_allowed {
bail!(
"capability for process:exec {desired_command} {desired_args:?} was not listed in the extension manifest",
);
}
Ok(())
}
}
/// A capability for an extension.
#[derive(Debug, PartialEq, Eq, Clone, Serialize, Deserialize)]
#[serde(tag = "kind")]
pub enum ExtensionCapability {
#[serde(rename = "process:exec")]
ProcessExec {
/// The command to execute.
command: String,
/// The arguments to pass to the command. Use `*` for a single wildcard argument.
/// If the last element is `**`, then any trailing arguments are allowed.
args: Vec<String>,
},
}
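// Example of the matching rules above (illustrative sketch only): a capability with
// command "cargo" and args ["test", "**"] permits `allow_exec("cargo", &["test"])` and
// `allow_exec("cargo", &["test", "--all"])`, but rejects `allow_exec("cargo", &["build"])`
// (non-wildcard arguments must match literally) and `allow_exec("npm", &["test"])`
// (different command); a single `*` matches exactly one argument at its position.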
#[derive(Clone, Default, PartialEq, Eq, Debug, Deserialize, Serialize)]
@@ -218,5 +273,104 @@ fn manifest_from_old_manifest(
slash_commands: BTreeMap::default(),
indexed_docs_providers: BTreeMap::default(),
snippets: None,
capabilities: Vec::new(),
}
}
#[cfg(test)]
mod tests {
use super::*;
fn extension_manifest() -> ExtensionManifest {
ExtensionManifest {
id: "test".into(),
name: "Test".to_string(),
version: "1.0.0".into(),
schema_version: SchemaVersion::ZERO,
description: None,
repository: None,
authors: vec![],
lib: Default::default(),
themes: vec![],
icon_themes: vec![],
languages: vec![],
grammars: BTreeMap::default(),
language_servers: BTreeMap::default(),
context_servers: BTreeMap::default(),
slash_commands: BTreeMap::default(),
indexed_docs_providers: BTreeMap::default(),
snippets: None,
capabilities: vec![],
}
}
#[test]
fn test_allow_exact_match() {
let manifest = ExtensionManifest {
capabilities: vec![ExtensionCapability::ProcessExec {
command: "ls".to_string(),
args: vec!["-la".to_string()],
}],
..extension_manifest()
};
assert!(manifest.allow_exec("ls", &["-la"]).is_ok());
assert!(manifest.allow_exec("ls", &["-l"]).is_err());
assert!(manifest.allow_exec("pwd", &[] as &[&str]).is_err());
}
#[test]
fn test_allow_wildcard_arg() {
let manifest = ExtensionManifest {
capabilities: vec![ExtensionCapability::ProcessExec {
command: "git".to_string(),
args: vec!["*".to_string()],
}],
..extension_manifest()
};
assert!(manifest.allow_exec("git", &["status"]).is_ok());
assert!(manifest.allow_exec("git", &["commit"]).is_ok());
assert!(manifest.allow_exec("git", &["status", "-s"]).is_err()); // too many args
assert!(manifest.allow_exec("npm", &["install"]).is_err()); // wrong command
}
#[test]
fn test_allow_double_wildcard() {
let manifest = ExtensionManifest {
capabilities: vec![ExtensionCapability::ProcessExec {
command: "cargo".to_string(),
args: vec!["test".to_string(), "**".to_string()],
}],
..extension_manifest()
};
assert!(manifest.allow_exec("cargo", &["test"]).is_ok());
assert!(manifest.allow_exec("cargo", &["test", "--all"]).is_ok());
assert!(manifest
.allow_exec("cargo", &["test", "--all", "--no-fail-fast"])
.is_ok());
assert!(manifest.allow_exec("cargo", &["build"]).is_err()); // wrong first arg
}
#[test]
fn test_allow_mixed_wildcards() {
let manifest = ExtensionManifest {
capabilities: vec![ExtensionCapability::ProcessExec {
command: "docker".to_string(),
args: vec!["run".to_string(), "*".to_string(), "**".to_string()],
}],
..extension_manifest()
};
assert!(manifest.allow_exec("docker", &["run", "nginx"]).is_ok());
assert!(manifest.allow_exec("docker", &["run"]).is_err());
assert!(manifest
.allow_exec("docker", &["run", "ubuntu", "bash"])
.is_ok());
assert!(manifest
.allow_exec("docker", &["run", "alpine", "sh", "-c", "echo hello"])
.is_ok());
assert!(manifest.allow_exec("docker", &["ps"]).is_err()); // wrong first arg
}
}

View File

@@ -163,6 +163,7 @@ async fn test_extension_store(cx: &mut TestAppContext) {
slash_commands: BTreeMap::default(),
indexed_docs_providers: BTreeMap::default(),
snippets: None,
capabilities: Vec::new(),
}),
dev: false,
},
@@ -191,6 +192,7 @@ async fn test_extension_store(cx: &mut TestAppContext) {
slash_commands: BTreeMap::default(),
indexed_docs_providers: BTreeMap::default(),
snippets: None,
capabilities: Vec::new(),
}),
dev: false,
},
@@ -356,6 +358,7 @@ async fn test_extension_store(cx: &mut TestAppContext) {
slash_commands: BTreeMap::default(),
indexed_docs_providers: BTreeMap::default(),
snippets: None,
capabilities: Vec::new(),
}),
dev: false,
},

View File

@@ -592,6 +592,8 @@ impl process::Host for WasmState {
command: process::Command,
) -> wasmtime::Result<Result<process::Output, String>> {
maybe!(async {
self.manifest.allow_exec(&command.command, &command.args)?;
let output = util::command::new_smol_command(command.command.as_str())
.args(&command.args)
.envs(command.env)

View File

@@ -40,7 +40,7 @@ impl RenderOnce for ExtensionCard {
.bg(cx.theme().colors().elevated_surface_background)
.border_1()
.border_color(cx.theme().colors().border)
.rounded_md()
.rounded_sm()
.children(self.children)
.when(self.overridden_by_dev_extension, |card| {
card.child(

View File

@@ -632,7 +632,7 @@ impl ExtensionsPage {
.px_0p5()
.border_1()
.border_color(cx.theme().colors().border)
.rounded_md()
.rounded_sm()
.child(
Label::new(label).size(LabelSize::XSmall),
)

View File

@@ -470,7 +470,7 @@ impl Render for FeedbackModal {
.bg(cx.theme().colors().editor_background)
.p_2()
.border_1()
.rounded_md()
.rounded_sm()
.border_color(cx.theme().colors().border)
.child(self.feedback_editor.clone()),
)
@@ -482,7 +482,7 @@ impl Render for FeedbackModal {
.bg(cx.theme().colors().editor_background)
.p_2()
.border_1()
.rounded_md()
.rounded_sm()
.border_color(if self.valid_email_address() {
cx.theme().colors().border
} else {

View File

@@ -42,8 +42,8 @@ use ui::{
};
use util::{maybe, paths::PathWithPosition, post_inc, ResultExt};
use workspace::{
item::PreviewTabsSettings, notifications::NotifyResultExt, pane, ModalView, SplitDirection,
Workspace,
item::PreviewTabsSettings, notifications::NotifyResultExt, pane, ModalView, OpenOptions,
OpenVisible, SplitDirection, Workspace,
};
actions!(file_finder, [SelectPrevious, ToggleMenu]);
@@ -1239,7 +1239,10 @@ impl PickerDelegate for FileFinderDelegate {
} else {
workspace.open_abs_path(
abs_path.to_path_buf(),
false,
OpenOptions {
visible: Some(OpenVisible::None),
..Default::default()
},
window,
cx,
)

View File

@@ -7,7 +7,7 @@ use menu::{Confirm, SelectNext, SelectPrevious};
use project::{RemoveOptions, FS_WATCH_LATENCY};
use serde_json::json;
use util::path;
use workspace::{AppState, ToggleFileFinder, Workspace};
use workspace::{AppState, OpenOptions, ToggleFileFinder, Workspace};
#[ctor::ctor]
fn init_logger() {
@@ -951,7 +951,10 @@ async fn test_external_files_history(cx: &mut gpui::TestAppContext) {
.update_in(cx, |workspace, window, cx| {
workspace.open_abs_path(
PathBuf::from(path!("/external-src/test/third.rs")),
false,
OpenOptions {
visible: Some(OpenVisible::None),
..Default::default()
},
window,
cx,
)

View File

@@ -1448,6 +1448,12 @@ impl FakeFs {
});
}
pub fn set_error_message_for_index_write(&self, dot_git: &Path, message: Option<String>) {
self.with_git_state(dot_git, true, |state| {
state.simulated_index_write_error_message = message;
});
}
pub fn paths(&self, include_dot_git: bool) -> Vec<PathBuf> {
let mut result = Vec::new();
let mut queue = collections::VecDeque::new();

View File

@@ -16,6 +16,7 @@ test-support = []
[dependencies]
anyhow.workspace = true
askpass.workspace = true
async-trait.workspace = true
collections.workspace = true
derive_more.workspace = true
@@ -34,7 +35,7 @@ text.workspace = true
time.workspace = true
url.workspace = true
util.workspace = true
tempfile.workspace = true
futures.workspace = true
[dev-dependencies]
pretty_assertions.workspace = true

View File

@@ -8,9 +8,6 @@ pub mod status;
use anyhow::{anyhow, Context as _, Result};
use gpui::action_with_deprecated_aliases;
use gpui::actions;
use gpui::impl_actions;
use repository::PushOptions;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use std::ffi::OsStr;
use std::fmt;
@@ -31,13 +28,6 @@ pub static COMMIT_MESSAGE: LazyLock<&'static OsStr> =
LazyLock::new(|| OsStr::new("COMMIT_EDITMSG"));
pub static INDEX_LOCK: LazyLock<&'static OsStr> = LazyLock::new(|| OsStr::new("index.lock"));
#[derive(Debug, Copy, Clone, PartialEq, Deserialize, JsonSchema)]
pub struct Push {
pub options: Option<PushOptions>,
}
impl_actions!(git, [Push]);
actions!(
git,
[
@@ -54,10 +44,13 @@ actions!(
RestoreTrackedFiles,
TrashUntrackedFiles,
Uncommit,
Push,
ForcePush,
Pull,
Fetch,
Commit,
ShowCommitEditor,
ExpandCommitEditor,
GenerateCommitMessage
]
);
action_with_deprecated_aliases!(git, RestoreFile, ["editor::RevertFile"]);

View File

@@ -2,7 +2,9 @@ use crate::status::FileStatus;
use crate::GitHostingProviderRegistry;
use crate::{blame::Blame, status::GitStatus};
use anyhow::{anyhow, Context, Result};
use askpass::{AskPassResult, AskPassSession};
use collections::{HashMap, HashSet};
use futures::{select_biased, FutureExt as _};
use git2::BranchType;
use gpui::SharedString;
use parking_lot::Mutex;
@@ -11,8 +13,6 @@ use schemars::JsonSchema;
use serde::Deserialize;
use std::borrow::Borrow;
use std::io::Write as _;
#[cfg(not(windows))]
use std::os::unix::fs::PermissionsExt;
use std::process::Stdio;
use std::sync::LazyLock;
use std::{
@@ -21,9 +21,11 @@ use std::{
sync::Arc,
};
use sum_tree::MapSeekTarget;
use util::command::new_std_command;
use util::command::{new_smol_command, new_std_command};
use util::ResultExt;
pub const REMOTE_CANCELLED_BY_USER: &str = "Operation cancelled by user";
#[derive(Clone, Debug, Hash, PartialEq, Eq)]
pub struct Branch {
pub is_head: bool,
@@ -106,6 +108,7 @@ pub struct CommitSummary {
pub subject: SharedString,
/// This is a unix timestamp
pub commit_timestamp: i64,
pub has_parent: bool,
}
#[derive(Clone, Debug, Hash, PartialEq, Eq)]
@@ -199,10 +202,29 @@ pub trait GitRepository: Send + Sync {
branch_name: &str,
upstream_name: &str,
options: Option<PushOptions>,
askpass: AskPassSession,
) -> Result<RemoteCommandOutput>;
fn pull(&self, branch_name: &str, upstream_name: &str) -> Result<RemoteCommandOutput>;
fn pull(
&self,
branch_name: &str,
upstream_name: &str,
askpass: AskPassSession,
) -> Result<RemoteCommandOutput>;
fn fetch(&self, askpass: AskPassSession) -> Result<RemoteCommandOutput>;
fn get_remotes(&self, branch_name: Option<&str>) -> Result<Vec<Remote>>;
fn fetch(&self) -> Result<RemoteCommandOutput>;
/// returns a list of remote branches that contain HEAD
fn check_for_pushed_commit(&self) -> Result<Vec<SharedString>>;
/// Run git diff
fn diff(&self, diff: DiffType) -> Result<String>;
}
pub enum DiffType {
HeadToIndex,
HeadToWorktree,
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Deserialize, JsonSchema)]
@@ -427,6 +449,15 @@ impl GitRepository for RealGitRepository {
true
})
.ok();
if let Some(oid) = self
.repository
.lock()
.find_reference("CHERRY_PICK_HEAD")
.ok()
.and_then(|reference| reference.target())
{
shas.push(oid.to_string())
}
shas
}
@@ -462,6 +493,7 @@ impl GitRepository for RealGitRepository {
let fields = [
"%(HEAD)",
"%(objectname)",
"%(parent)",
"%(refname)",
"%(upstream)",
"%(upstream:track)",
@@ -553,6 +585,28 @@ impl GitRepository for RealGitRepository {
)
}
fn diff(&self, diff: DiffType) -> Result<String> {
let working_directory = self.working_directory()?;
let args = match diff {
DiffType::HeadToIndex => Some("--staged"),
DiffType::HeadToWorktree => None,
};
let output = new_std_command(&self.git_binary_path)
.current_dir(&working_directory)
.args(["diff"])
.args(args)
.output()?;
if !output.status.success() {
return Err(anyhow!(
"Failed to run git diff:\n{}",
String::from_utf8_lossy(&output.stderr)
));
}
Ok(String::from_utf8_lossy(&output.stdout).to_string())
}
fn stage_paths(&self, paths: &[RepoPath]) -> Result<()> {
let working_directory = self.working_directory()?;
@@ -563,7 +617,6 @@ impl GitRepository for RealGitRepository {
.args(paths.iter().map(|p| p.as_ref()))
.output()?;
// TODO: Get remote response out of this and show it to the user
if !output.status.success() {
return Err(anyhow!(
"Failed to stage paths:\n{}",
@@ -584,7 +637,6 @@ impl GitRepository for RealGitRepository {
.args(paths.iter().map(|p| p.as_ref()))
.output()?;
// TODO: Get remote response out of this and show it to the user
if !output.status.success() {
return Err(anyhow!(
"Failed to unstage:\n{}",
@@ -610,7 +662,6 @@ impl GitRepository for RealGitRepository {
let output = cmd.output()?;
// TODO: Get remote response out of this and show it to the user
if !output.status.success() {
return Err(anyhow!(
"Failed to commit:\n{}",
@@ -625,15 +676,15 @@ impl GitRepository for RealGitRepository {
branch_name: &str,
remote_name: &str,
options: Option<PushOptions>,
ask_pass: AskPassSession,
) -> Result<RemoteCommandOutput> {
let working_directory = self.working_directory()?;
// We do this on every operation to ensure that the askpass script exists and is executable.
#[cfg(not(windows))]
let (askpass_script_path, _temp_dir) = setup_askpass()?;
let mut command = new_std_command("git");
let mut command = new_smol_command("git");
command
.env("GIT_ASKPASS", ask_pass.script_path())
.env("SSH_ASKPASS", ask_pass.script_path())
.env("SSH_ASKPASS_REQUIRE", "force")
.current_dir(&working_directory)
.args(["push"])
.args(options.map(|option| match option {
@@ -641,92 +692,53 @@ impl GitRepository for RealGitRepository {
PushOptions::Force => "--force-with-lease",
}))
.arg(remote_name)
.arg(format!("{}:{}", branch_name, branch_name));
.arg(format!("{}:{}", branch_name, branch_name))
.stdout(smol::process::Stdio::piped())
.stderr(smol::process::Stdio::piped());
let git_process = command.spawn()?;
#[cfg(not(windows))]
{
command.env("GIT_ASKPASS", askpass_script_path);
}
let output = command.output()?;
if !output.status.success() {
return Err(anyhow!(
"Failed to push:\n{}",
String::from_utf8_lossy(&output.stderr)
));
} else {
return Ok(RemoteCommandOutput {
stdout: String::from_utf8_lossy(&output.stdout).to_string(),
stderr: String::from_utf8_lossy(&output.stderr).to_string(),
});
}
run_remote_command(ask_pass, git_process)
}
fn pull(&self, branch_name: &str, remote_name: &str) -> Result<RemoteCommandOutput> {
fn pull(
&self,
branch_name: &str,
remote_name: &str,
ask_pass: AskPassSession,
) -> Result<RemoteCommandOutput> {
let working_directory = self.working_directory()?;
// We do this on every operation to ensure that the askpass script exists and is executable.
#[cfg(not(windows))]
let (askpass_script_path, _temp_dir) = setup_askpass()?;
let mut command = new_std_command("git");
let mut command = new_smol_command("git");
command
.env("GIT_ASKPASS", ask_pass.script_path())
.env("SSH_ASKPASS", ask_pass.script_path())
.env("SSH_ASKPASS_REQUIRE", "force")
.current_dir(&working_directory)
.args(["pull"])
.arg(remote_name)
.arg(branch_name);
.arg(branch_name)
.stdout(smol::process::Stdio::piped())
.stderr(smol::process::Stdio::piped());
let git_process = command.spawn()?;
#[cfg(not(windows))]
{
command.env("GIT_ASKPASS", askpass_script_path);
}
let output = command.output()?;
if !output.status.success() {
return Err(anyhow!(
"Failed to pull:\n{}",
String::from_utf8_lossy(&output.stderr)
));
} else {
return Ok(RemoteCommandOutput {
stdout: String::from_utf8_lossy(&output.stdout).to_string(),
stderr: String::from_utf8_lossy(&output.stderr).to_string(),
});
}
run_remote_command(ask_pass, git_process)
}
fn fetch(&self) -> Result<RemoteCommandOutput> {
fn fetch(&self, ask_pass: AskPassSession) -> Result<RemoteCommandOutput> {
let working_directory = self.working_directory()?;
// We do this on every operation to ensure that the askpass script exists and is executable.
#[cfg(not(windows))]
let (askpass_script_path, _temp_dir) = setup_askpass()?;
let mut command = new_std_command("git");
let mut command = new_smol_command("git");
command
.env("GIT_ASKPASS", ask_pass.script_path())
.env("SSH_ASKPASS", ask_pass.script_path())
.env("SSH_ASKPASS_REQUIRE", "force")
.current_dir(&working_directory)
.args(["fetch", "--all"]);
.args(["fetch", "--all"])
.stdout(smol::process::Stdio::piped())
.stderr(smol::process::Stdio::piped());
let git_process = command.spawn()?;
#[cfg(not(windows))]
{
command.env("GIT_ASKPASS", askpass_script_path);
}
let output = command.output()?;
if !output.status.success() {
return Err(anyhow!(
"Failed to fetch:\n{}",
String::from_utf8_lossy(&output.stderr)
));
} else {
return Ok(RemoteCommandOutput {
stdout: String::from_utf8_lossy(&output.stdout).to_string(),
stderr: String::from_utf8_lossy(&output.stderr).to_string(),
});
}
run_remote_command(ask_pass, git_process)
}
fn get_remotes(&self, branch_name: Option<&str>) -> Result<Vec<Remote>> {
@@ -770,18 +782,88 @@ impl GitRepository for RealGitRepository {
));
}
}
fn check_for_pushed_commit(&self) -> Result<Vec<SharedString>> {
let working_directory = self.working_directory()?;
let git_cmd = |args: &[&str]| -> Result<String> {
let output = new_std_command(&self.git_binary_path)
.current_dir(&working_directory)
.args(args)
.output()?;
if output.status.success() {
Ok(String::from_utf8(output.stdout)?)
} else {
Err(anyhow!(String::from_utf8_lossy(&output.stderr).to_string()))
}
};
let head = git_cmd(&["rev-parse", "HEAD"])
.context("Failed to get HEAD")?
.trim()
.to_owned();
let mut remote_branches = vec![];
let mut add_if_matching = |remote_head: &str| {
if let Ok(merge_base) = git_cmd(&["merge-base", &head, remote_head]) {
if merge_base.trim() == head {
if let Some(s) = remote_head.strip_prefix("refs/remotes/") {
remote_branches.push(s.to_owned().into());
}
}
}
};
// check the main branch of each remote
let remotes = git_cmd(&["remote"]).context("Failed to get remotes")?;
for remote in remotes.lines() {
if let Ok(remote_head) =
git_cmd(&["symbolic-ref", &format!("refs/remotes/{remote}/HEAD")])
{
add_if_matching(remote_head.trim());
}
}
// ... and the remote branch that the checked-out one is tracking
if let Ok(remote_head) = git_cmd(&["rev-parse", "--symbolic-full-name", "@{u}"]) {
add_if_matching(remote_head.trim());
}
Ok(remote_branches)
}
}
#[cfg(not(windows))]
fn setup_askpass() -> Result<(PathBuf, tempfile::TempDir), anyhow::Error> {
let temp_dir = tempfile::Builder::new()
.prefix("zed-git-askpass")
.tempdir()?;
let askpass_script = "#!/bin/sh\necho ''";
let askpass_script_path = temp_dir.path().join("git-askpass.sh");
std::fs::write(&askpass_script_path, askpass_script)?;
std::fs::set_permissions(&askpass_script_path, std::fs::Permissions::from_mode(0o755))?;
Ok((askpass_script_path, temp_dir))
fn run_remote_command(
mut ask_pass: AskPassSession,
git_process: smol::process::Child,
) -> std::result::Result<RemoteCommandOutput, anyhow::Error> {
smol::block_on(async {
select_biased! {
result = ask_pass.run().fuse() => {
match result {
AskPassResult::CancelledByUser => {
Err(anyhow!(REMOTE_CANCELLED_BY_USER))?
}
AskPassResult::Timedout => {
Err(anyhow!("Connecting to host timed out"))?
}
}
}
output = git_process.output().fuse() => {
let output = output?;
if !output.status.success() {
Err(anyhow!(
"Operation failed:\n{}",
String::from_utf8_lossy(&output.stderr)
))
} else {
Ok(RemoteCommandOutput {
stdout: String::from_utf8_lossy(&output.stdout).to_string(),
stderr: String::from_utf8_lossy(&output.stderr).to_string(),
})
}
}
}
})
}
#[derive(Debug, Clone)]
@@ -799,6 +881,7 @@ pub struct FakeGitRepositoryState {
pub statuses: HashMap<RepoPath, FileStatus>,
pub current_branch_name: Option<String>,
pub branches: HashSet<String>,
pub simulated_index_write_error_message: Option<String>,
}
impl FakeGitRepository {
@@ -818,6 +901,7 @@ impl FakeGitRepositoryState {
statuses: Default::default(),
current_branch_name: Default::default(),
branches: Default::default(),
simulated_index_write_error_message: None,
}
}
}
@@ -837,6 +921,9 @@ impl GitRepository for FakeGitRepository {
fn set_index_text(&self, path: &RepoPath, content: Option<String>) -> anyhow::Result<()> {
let mut state = self.state.lock();
if let Some(message) = state.simulated_index_write_error_message.clone() {
return Err(anyhow::anyhow!(message));
}
if let Some(content) = content {
state.index_contents.insert(path.clone(), content);
} else {
@@ -972,21 +1059,35 @@ impl GitRepository for FakeGitRepository {
_branch: &str,
_remote: &str,
_options: Option<PushOptions>,
_ask_pass: AskPassSession,
) -> Result<RemoteCommandOutput> {
unimplemented!()
}
fn pull(&self, _branch: &str, _remote: &str) -> Result<RemoteCommandOutput> {
fn pull(
&self,
_branch: &str,
_remote: &str,
_ask_pass: AskPassSession,
) -> Result<RemoteCommandOutput> {
unimplemented!()
}
fn fetch(&self) -> Result<RemoteCommandOutput> {
fn fetch(&self, _ask_pass: AskPassSession) -> Result<RemoteCommandOutput> {
unimplemented!()
}
fn get_remotes(&self, _branch: Option<&str>) -> Result<Vec<Remote>> {
unimplemented!()
}
fn check_for_pushed_commit(&self) -> Result<Vec<SharedString>> {
unimplemented!()
}
fn diff(&self, _diff: DiffType) -> Result<String> {
unimplemented!()
}
}
fn check_path_to_repo_path_errors(relative_file_path: &Path) -> Result<()> {
@@ -1117,6 +1218,7 @@ fn parse_branch_input(input: &str) -> Result<Vec<Branch>> {
let mut fields = line.split('\x00');
let is_current_branch = fields.next().context("no HEAD")? == "*";
let head_sha: SharedString = fields.next().context("no objectname")?.to_string().into();
let parent_sha: SharedString = fields.next().context("no parent")?.to_string().into();
let ref_name: SharedString = fields
.next()
.context("no refname")?
@@ -1140,6 +1242,7 @@ fn parse_branch_input(input: &str) -> Result<Vec<Branch>> {
sha: head_sha,
subject,
commit_timestamp: commiterdate,
has_parent: !parent_sha.is_empty(),
}),
upstream: if upstream_name.is_empty() {
None
@@ -1192,7 +1295,7 @@ fn parse_upstream_track(upstream_track: &str) -> Result<UpstreamTracking> {
fn test_branches_parsing() {
// suppress "help: octal escapes are not supported, `\0` is always null"
#[allow(clippy::octal_escapes)]
let input = "*\0060964da10574cd9bf06463a53bf6e0769c5c45e\0refs/heads/zed-patches\0refs/remotes/origin/zed-patches\0\01733187470\0generated protobuf\n";
let input = "*\0060964da10574cd9bf06463a53bf6e0769c5c45e\0\0refs/heads/zed-patches\0refs/remotes/origin/zed-patches\0\01733187470\0generated protobuf\n";
assert_eq!(
parse_branch_input(&input).unwrap(),
vec![Branch {
@@ -1209,6 +1312,7 @@ fn test_branches_parsing() {
sha: "060964da10574cd9bf06463a53bf6e0769c5c45e".into(),
subject: "generated protobuf".into(),
commit_timestamp: 1733187470,
has_parent: false,
})
}]
)

View File

@@ -54,6 +54,39 @@ impl From<TrackedStatus> for FileStatus {
}
}
#[derive(Debug, PartialEq, Eq, Clone, Copy)]
pub enum StageStatus {
Staged,
Unstaged,
PartiallyStaged,
}
impl StageStatus {
pub fn is_fully_staged(&self) -> bool {
matches!(self, StageStatus::Staged)
}
pub fn is_fully_unstaged(&self) -> bool {
matches!(self, StageStatus::Unstaged)
}
pub fn has_staged(&self) -> bool {
matches!(self, StageStatus::Staged | StageStatus::PartiallyStaged)
}
pub fn has_unstaged(&self) -> bool {
matches!(self, StageStatus::Unstaged | StageStatus::PartiallyStaged)
}
pub fn as_bool(self) -> Option<bool> {
match self {
StageStatus::Staged => Some(true),
StageStatus::Unstaged => Some(false),
StageStatus::PartiallyStaged => None,
}
}
}
impl FileStatus {
pub const fn worktree(worktree_status: StatusCode) -> Self {
FileStatus::Tracked(TrackedStatus {
@@ -106,15 +139,15 @@ impl FileStatus {
Ok(status)
}
pub fn is_staged(self) -> Option<bool> {
pub fn staging(self) -> StageStatus {
match self {
FileStatus::Untracked | FileStatus::Ignored | FileStatus::Unmerged { .. } => {
Some(false)
StageStatus::Unstaged
}
FileStatus::Tracked(tracked) => match (tracked.index_status, tracked.worktree_status) {
(StatusCode::Unmodified, _) => Some(false),
(_, StatusCode::Unmodified) => Some(true),
_ => None,
(StatusCode::Unmodified, _) => StageStatus::Unstaged,
(_, StatusCode::Unmodified) => StageStatus::Staged,
_ => StageStatus::PartiallyStaged,
},
}
}

View File

@@ -18,6 +18,7 @@ test-support = ["multi_buffer/test-support"]
[dependencies]
anyhow.workspace = true
askpass.workspace= true
buffer_diff.workspace = true
collections.workspace = true
component.workspace = true
@@ -30,6 +31,7 @@ git.workspace = true
gpui.workspace = true
itertools.workspace = true
language.workspace = true
language_model.workspace = true
linkify.workspace = true
linkme.workspace = true
log.workspace = true
@@ -46,6 +48,7 @@ serde_json.workspace = true
settings.workspace = true
smallvec.workspace = true
strum.workspace = true
telemetry.workspace = true
theme.workspace = true
time.workspace = true
ui.workspace = true
@@ -61,6 +64,7 @@ ctor.workspace = true
env_logger.workspace = true
editor = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, features = ["test-support"] }
pretty_assertions.workspace = true
project = { workspace = true, features = ["test-support"] }
settings = { workspace = true, features = ["test-support"] }
unindent.workspace = true

View File

@@ -0,0 +1,101 @@
use editor::Editor;
use futures::channel::oneshot;
use gpui::{AppContext, DismissEvent, Entity, EventEmitter, Focusable, Styled};
use ui::{
div, h_flex, v_flex, ActiveTheme, App, Context, DynamicSpacing, Headline, HeadlineSize, Icon,
IconName, IconSize, InteractiveElement, IntoElement, ParentElement, Render, SharedString,
StyledExt, StyledTypography, Window,
};
use workspace::ModalView;
pub(crate) struct AskPassModal {
operation: SharedString,
prompt: SharedString,
editor: Entity<Editor>,
tx: Option<oneshot::Sender<String>>,
}
impl EventEmitter<DismissEvent> for AskPassModal {}
impl ModalView for AskPassModal {}
impl Focusable for AskPassModal {
fn focus_handle(&self, cx: &App) -> gpui::FocusHandle {
self.editor.focus_handle(cx)
}
}
impl AskPassModal {
pub fn new(
operation: SharedString,
prompt: SharedString,
tx: oneshot::Sender<String>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let editor = cx.new(|cx| {
let mut editor = Editor::single_line(window, cx);
if prompt.contains("yes/no") {
editor.set_masked(false, cx);
} else {
editor.set_masked(true, cx);
}
editor
});
Self {
operation,
prompt,
editor,
tx: Some(tx),
}
}
fn cancel(&mut self, _: &menu::Cancel, _window: &mut Window, cx: &mut Context<Self>) {
cx.emit(DismissEvent);
}
fn confirm(&mut self, _: &menu::Confirm, _window: &mut Window, cx: &mut Context<Self>) {
if let Some(tx) = self.tx.take() {
tx.send(self.editor.read(cx).text(cx)).ok();
}
cx.emit(DismissEvent);
}
}
impl Render for AskPassModal {
fn render(&mut self, _: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
v_flex()
.key_context("PasswordPrompt")
.on_action(cx.listener(Self::cancel))
.on_action(cx.listener(Self::confirm))
.elevation_2(cx)
.size_full()
.font_buffer(cx)
.child(
h_flex()
.px(DynamicSpacing::Base12.rems(cx))
.pt(DynamicSpacing::Base08.rems(cx))
.pb(DynamicSpacing::Base04.rems(cx))
.rounded_t_sm()
.w_full()
.gap_1p5()
.child(Icon::new(IconName::GitBranch).size(IconSize::XSmall))
.child(h_flex().gap_1().overflow_x_hidden().child(
div().max_w_96().overflow_x_hidden().text_ellipsis().child(
Headline::new(self.operation.clone()).size(HeadlineSize::XSmall),
),
)),
)
.child(
div()
.text_buffer(cx)
.py_2()
.px_3()
.bg(cx.theme().colors().editor_background)
.border_t_1()
.border_color(cx.theme().colors().border_variant)
.size_full()
.overflow_hidden()
.child(self.prompt.clone())
.child(self.editor.clone()),
)
}
}

View File

@@ -1,4 +1,4 @@
-use anyhow::Context as _;
+use anyhow::{anyhow, Context as _};
 use fuzzy::{StringMatch, StringMatchCandidate};
 use git::repository::Branch;
@@ -8,7 +8,7 @@ use gpui::{
     Task, Window,
 };
 use picker::{Picker, PickerDelegate};
-use project::{Project, ProjectPath};
+use project::git::Repository;
 use std::sync::Arc;
 use ui::{prelude::*, HighlightedLabel, ListItem, ListItemSpacing, PopoverMenuHandle};
 use util::ResultExt;
@@ -18,26 +18,50 @@ use workspace::{ModalView, Workspace};
 pub fn init(cx: &mut App) {
     cx.observe_new(|workspace: &mut Workspace, _, _| {
         workspace.register_action(open);
+        workspace.register_action(switch);
+        workspace.register_action(checkout_branch);
     })
     .detach();
 }
+pub fn checkout_branch(
+    workspace: &mut Workspace,
+    _: &zed_actions::git::CheckoutBranch,
+    window: &mut Window,
+    cx: &mut Context<Workspace>,
+) {
+    open(workspace, &zed_actions::git::Branch, window, cx);
+}
+pub fn switch(
+    workspace: &mut Workspace,
+    _: &zed_actions::git::Switch,
+    window: &mut Window,
+    cx: &mut Context<Workspace>,
+) {
+    open(workspace, &zed_actions::git::Branch, window, cx);
+}
 pub fn open(
     workspace: &mut Workspace,
     _: &zed_actions::git::Branch,
     window: &mut Window,
     cx: &mut Context<Workspace>,
 ) {
-    let project = workspace.project().clone();
+    let repository = workspace.project().read(cx).active_repository(cx).clone();
     let style = BranchListStyle::Modal;
     workspace.toggle_modal(window, cx, |window, cx| {
-        BranchList::new(project, style, 34., window, cx)
+        BranchList::new(repository, style, 34., window, cx)
     })
 }
-pub fn popover(project: Entity<Project>, window: &mut Window, cx: &mut App) -> Entity<BranchList> {
+pub fn popover(
+    repository: Option<Entity<Repository>>,
+    window: &mut Window,
+    cx: &mut App,
+) -> Entity<BranchList> {
     cx.new(|cx| {
-        let list = BranchList::new(project, BranchListStyle::Popover, 15., window, cx);
+        let list = BranchList::new(repository, BranchListStyle::Popover, 15., window, cx);
         list.focus_handle(cx).focus(window);
         list
     })
@@ -58,22 +82,21 @@ pub struct BranchList {
 impl BranchList {
     fn new(
-        project_handle: Entity<Project>,
+        repository: Option<Entity<Repository>>,
         style: BranchListStyle,
         rem_width: f32,
         window: &mut Window,
         cx: &mut Context<Self>,
     ) -> Self {
         let popover_handle = PopoverMenuHandle::default();
-        let project = project_handle.read(cx);
-        let all_branches_request = project
-            .visible_worktrees(cx)
-            .next()
-            .map(|worktree| project.branches(ProjectPath::root_path(worktree.read(cx).id()), cx))
-            .context("No worktrees found");
+        let all_branches_request = repository
+            .clone()
+            .map(|repository| repository.read(cx).branches());
         cx.spawn_in(window, |this, mut cx| async move {
-            let all_branches = all_branches_request?.await?;
+            let all_branches = all_branches_request
+                .context("No active repository")?
+                .await??;
             this.update_in(&mut cx, |this, window, cx| {
                 this.picker.update(cx, |picker, cx| {
@@ -86,7 +109,7 @@ impl BranchList {
         })
         .detach_and_log_err(cx);
-        let delegate = BranchListDelegate::new(project_handle.clone(), style, 20);
+        let delegate = BranchListDelegate::new(repository.clone(), style, 20);
         let picker = cx.new(|cx| Picker::uniform_list(delegate, window, cx));
         let _subscription = cx.subscribe(&picker, |_, _, _, cx| {
@@ -145,7 +168,7 @@ impl BranchEntry {
 pub struct BranchListDelegate {
     matches: Vec<BranchEntry>,
     all_branches: Option<Vec<Branch>>,
-    project: Entity<Project>,
+    repo: Option<Entity<Repository>>,
     style: BranchListStyle,
     selected_index: usize,
     last_query: String,
@@ -155,13 +178,13 @@ pub struct BranchListDelegate {
 impl BranchListDelegate {
     fn new(
-        project: Entity<Project>,
+        repo: Option<Entity<Repository>>,
         style: BranchListStyle,
         branch_name_trailoff_after: usize,
     ) -> Self {
         Self {
             matches: vec![],
-            project,
+            repo,
             style,
             all_branches: None,
             selected_index: 0,
@@ -280,14 +303,16 @@ impl PickerDelegate for BranchListDelegate {
             return;
         };
-        let current_branch = self.project.update(cx, |project, cx| {
-            project
-                .active_repository(cx)
-                .and_then(|repo| repo.read(cx).current_branch())
-                .map(|branch| branch.name.to_string())
+        let current_branch = self.repo.as_ref().map(|repo| {
+            repo.update(cx, |repo, _| {
+                repo.current_branch().map(|branch| branch.name.clone())
+            })
         });
-        if current_branch == Some(branch.name().to_string()) {
+        if current_branch
+            .flatten()
+            .is_some_and(|current_branch| current_branch == branch.name())
+        {
             cx.emit(DismissEvent);
             return;
         }
@@ -296,19 +321,33 @@ impl PickerDelegate for BranchListDelegate {
             let branch = branch.clone();
             |picker, mut cx| async move {
                 let branch_change_task = picker.update(&mut cx, |this, cx| {
-                    let project = this.delegate.project.read(cx);
-                    let branch_to_checkout = match branch {
-                        BranchEntry::Branch(branch) => branch.string,
-                        BranchEntry::History(string) => string,
-                        BranchEntry::NewBranch { name: branch_name } => branch_name,
-                    };
-                    let worktree = project
-                        .visible_worktrees(cx)
-                        .next()
-                        .context("worktree disappeared")?;
-                    let repository = ProjectPath::root_path(worktree.read(cx).id());
+                    let repo = this
+                        .delegate
+                        .repo
+                        .as_ref()
+                        .ok_or_else(|| anyhow!("No active repository"))?
+                        .clone();
-                    anyhow::Ok(project.update_or_create_branch(repository, branch_to_checkout, cx))
+                    let cx = cx.to_async();
+                    anyhow::Ok(async move {
+                        match branch {
+                            BranchEntry::Branch(StringMatch {
+                                string: branch_name,
+                                ..
+                            })
+                            | BranchEntry::History(branch_name) => {
+                                cx.update(|cx| repo.read(cx).change_branch(branch_name))?
+                                    .await?
+                            }
+                            BranchEntry::NewBranch { name: branch_name } => {
+                                cx.update(|cx| repo.read(cx).create_branch(branch_name.clone()))?
+                                    .await??;
+                                cx.update(|cx| repo.read(cx).change_branch(branch_name))?
+                                    .await?
+                            }
+                        }
+                    })
                 })??;
                 branch_change_task.await?;
@@ -316,7 +355,7 @@ impl PickerDelegate for BranchListDelegate {
                 picker.update(&mut cx, |_, cx| {
                     cx.emit(DismissEvent);
-                    Ok::<(), anyhow::Error>(())
+                    anyhow::Ok(())
                 })
             }
         })
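
A caller-side sketch (not from this diff) of the new signatures: the branch list is now built from the project's active repository rather than from the Project itself, and that repository may be absent. The workspace handle below is an assumed context; it mirrors how open above obtains the repository.

// Hypothetical call site for the popover variant of the branch list.
let repository = workspace.project().read(cx).active_repository(cx).clone();
let branch_list = popover(repository, window, cx);
// When no repository is active, BranchList::new's branch-loading task fails with
// "No active repository" and is only logged via detach_and_log_err, so the list
// simply stays unpopulated rather than panicking.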

Some files were not shown because too many files have changed in this diff.