Compare commits

..

70 Commits

Author SHA1 Message Date
Richard Feldman
7ef2a8211f Merge remote-tracking branch 'origin/main' into inlay-hint-tooltip 2025-07-15 10:53:58 -04:00
Taylor Beever
0671a4d5ae Allow for venv activation script to use pyenv (#33119)
Release Notes:

- Allows for configuration and use of `pyenv` as a virtual environment provider
2025-07-15 16:44:40 +02:00
tidely
bd78f2c493 project: Use checked_sub for next/previous in search history (#34408)
Use `checked_sub` instead of checking bounds manually. This also greatly
simplifies the logic for `next` and `previous`, and removes other manual
bounds checks as well.
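
The refactor described above can be sketched roughly as follows; the function name and shape are illustrative assumptions, not Zed's actual `SearchHistory` API:

```rust
// Hypothetical sketch of history-cursor movement using checked_sub:
// saturating at the oldest entry replaces manual `if i > 0` bounds checks.
fn previous_index(current: Option<usize>, len: usize) -> Option<usize> {
    match current {
        // checked_sub returns None at index 0, so we stop at the oldest entry.
        Some(i) => i.checked_sub(1),
        // No selection yet: start from the newest entry, if any.
        None => len.checked_sub(1),
    }
}

fn main() {
    assert_eq!(previous_index(Some(2), 5), Some(1));
    assert_eq!(previous_index(Some(0), 5), None); // already at the oldest entry
    assert_eq!(previous_index(None, 0), None); // empty history
}
```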

Release Notes:

- N/A
2025-07-15 16:42:37 +02:00
tidely
d1abba0d33 gpui: Reduce manual shifting & other minor improvements (#34407)
Minor cleanup in gpui.

- Reduce manual shifting by using `u32::to_be_bytes`
- Remove eager `Vec` allocation when listing registered actions
- Remove unnecessary return statements
- Replace manual `if let Some(_)` with `.as_deref_mut()`
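
The first bullet can be illustrated with a small sketch (not the actual gpui code) contrasting manual shifting with `u32::to_be_bytes`:

```rust
// Manual shifting and masking, the style being removed.
fn bytes_manual(x: u32) -> [u8; 4] {
    [(x >> 24) as u8, (x >> 16) as u8, (x >> 8) as u8, x as u8]
}

// The idiomatic equivalent: big-endian byte extraction in one call.
fn bytes_idiomatic(x: u32) -> [u8; 4] {
    x.to_be_bytes()
}

fn main() {
    let x: u32 = 0xDEAD_BEEF;
    assert_eq!(bytes_manual(x), bytes_idiomatic(x));
    assert_eq!(bytes_idiomatic(x), [0xDE, 0xAD, 0xBE, 0xEF]);
}
```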

Release Notes:

- N/A
2025-07-15 16:39:33 +02:00
tidely
05065985e7 cli: Remove manual std::io::copy implementation (#34409)
Removes a manual implementation of `std::io::copy`. The internal buffer
of `std::io::copy` is also 8 KiB, so it behaves exactly the same. On
Linux, `std::io::copy` also has access to better-performing OS-level
file copying.
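
A sketch of the pattern being removed: the manual loop below mirrors what such code typically does (an 8 KiB buffer), and `std::io::copy` produces the same result while being free to use faster OS-level copies where available:

```rust
use std::io::{self, Read, Write};

// Manual copy loop with an 8 KiB buffer, the style the PR removes.
fn copy_manual<R: Read, W: Write>(reader: &mut R, writer: &mut W) -> io::Result<u64> {
    let mut buf = [0u8; 8192];
    let mut total = 0u64;
    loop {
        let n = reader.read(&mut buf)?;
        if n == 0 {
            return Ok(total);
        }
        writer.write_all(&buf[..n])?;
        total += n as u64;
    }
}

fn main() -> io::Result<()> {
    let data = b"hello world".to_vec();
    let (mut out_a, mut out_b) = (Vec::new(), Vec::new());
    copy_manual(&mut &data[..], &mut out_a)?;
    io::copy(&mut &data[..], &mut out_b)?;
    // Both produce identical output.
    assert_eq!(out_a, out_b);
    Ok(())
}
```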

Release Notes:

- N/A
2025-07-15 16:37:15 +02:00
Ben Brandt
7ab8f431a7 Update to acp 0.0.9 (#34463)
Release Notes:

- N/A
2025-07-15 14:28:27 +00:00
Hilmar Wiegand
050ed85d71 Add severity argument to GoToDiagnostic actions (#33995)
This PR adds a `severity` argument so severity can be defined when
navigating through diagnostics. This allows keybinds like the following:

```json
{
  "] e": ["editor::GoToDiagnostic", { "severity": "error" }],
  "[ e": ["editor::GoToPreviousDiagnostic", { "severity": "error" }]
}
```

I've added test comments and a test. Let me know if there's anything
else you need!

Release Notes:

- Add `severity` argument to `editor::GoToDiagnostic`,
`editor::GoToPreviousDiagnostic`, `project_panel::SelectNextDiagnostic`
and `project_panel::SelectPrevDiagnostic` actions
2025-07-15 14:03:57 +00:00
Danilo Leal
858e176a1c Refine keymap UI design (#34437)
Release Notes:

- N/A

---------

Co-authored-by: Ben Kunkle <Ben.kunkle@gmail.com>
2025-07-15 13:45:59 +00:00
Marshall Bowers
a65c0b2bff collab: Fix typo in log message (#34455)
This PR fixes a small typo in a log message.

Release Notes:

- N/A
2025-07-15 13:16:49 +00:00
Marshall Bowers
848a86a385 collab: Sync model overages for all active Zed Pro subscriptions (#34452)
Release Notes:

- N/A
2025-07-15 13:01:01 +00:00
Piotr Osiewicz
52f2b32557 extension_cli: Copy over snippet file when bundling extensions (#34450)
Closes #30670

Release Notes:

- Fixed snippets from extensions not working.
2025-07-15 11:07:29 +00:00
Alvaro Parker
8dca4d150e Fix border and minimap flickering on pane split (#33973)
Closes #33972

As noted on
https://github.com/zed-industries/zed/pull/31390#discussion_r2147473526,
when splitting panes with a border size set for the active pane, or with
minimap visibility configured to the active editor only, Zed would
briefly show a flicker of the border or the minimap on the pane being
deactivated.

Release Notes:

- Fixed an issue where pane activations would sometimes have a brief
delay, causing a flicker in the process.
2025-07-15 10:47:35 +02:00
Peter Tripp
440beb8a90 Improve Java LSP documentation (#34410)
Remove references to
[ABckh/zed-java-eclipse-jdtls](https://github.com/ABckh/zed-java-eclipse-jdtls)
which hasn't seen a new version in 10 months (2024-10-01).

Release Notes:

- N/A
2025-07-14 18:16:43 -04:00
Peter Tripp
ce63a6ddd8 Exclude .repo folders by default (#34431)
These are used by [Google's `repo`
tool](https://android.googlesource.com/tools/repo), used on Android for
managing hundreds of git subprojects.

Originally reported in:
- https://github.com/zed-industries/zed/issues/34302

Release Notes:

- Add Google Repo `.repo` folders to default `file_scan_exclusions`
2025-07-14 18:16:28 -04:00
Finn Evers
26ba6e7e00 editor: Improve minimap performance (#33067)
This PR aims to improve minimap performance. This is primarily achieved
by disabling/removing functionality that is not shown in the minimap, as
well as by ensuring the display map is not updated during minimap
prepaint.

This should already be much better in parts, as the block map as well as
the fold map will be less frequently updated due to the minimap
prepainting (optimally, they should never be, but I think we're not
quite there yet).
For this, I had to remove block rendering support from the minimap,
which is not as bad as it sounds: in practice, we were not rendering
most blocks anyway, it caused issues (e.g. scrolling any visible block
offscreen in the main editor currently causes scroll jumps), and in the
long run, the minimap will most likely need its own block map or a
different approach anyway. The existing implementation caused resizes to
occur very frequently for practically no benefit. I can pull this out
into a separate PR if requested; that would most likely make the other
changes here easier to discuss.

This is WIP as we are still hitting some code paths here that we
definitely should not be hitting. E.g. there seems to be a rerender
roughly every second when the window is unfocused but visible, which
does not happen when the minimap is disabled.

While this primarily focuses on the minimap, it also touches a few other
small parts not related to the minimap where I noticed we were doing too
much work during prepaint. Happy for any feedback there as well.

Putting this up here already so we have a place to discuss the changes
early if needed.

Release Notes:

- Improved performance with the minimap enabled.
- Fixed an issue where interacting with blocks in the editor would
sometimes not properly work with the minimap enabled.
2025-07-15 00:29:27 +03:00
Joseph T. Lyons
363a265051 Add test for running Close Others on an inactive item (#34425)
Adds a test for the changes added in:
https://github.com/zed-industries/zed/pull/34355

Release Notes:

- N/A
2025-07-14 20:24:05 +00:00
Michael Sloan
37e73e3277 Only depend on scap x11 feature when gpui x11 feature is enabled (#34251)
Release Notes:

- N/A
2025-07-14 18:34:33 +00:00
Richard Feldman
32f5132bde Fix contrast adjustment for Powerline separators (#34417)
It turns out Starship uses custom Powerline separators in the Unicode
Private Use Area. This addresses some issues seen in the comments of
#34234.

Release Notes:

- Fix automatic contrast adjustment for Powerline separators
2025-07-14 18:18:41 +00:00
Anthony Eid
fd5650d4ed debugger: Add support for data breakpoints on variables (#34391)
Closes #ISSUE

Release Notes:

- N/A *or* Added/Fixed/Improved ...

---------

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
Co-authored-by: Mikayla Maki <mikayla@zed.dev>
2025-07-14 17:45:46 +00:00
domi
8b6b039b63 vim: Add missing normal mode binding for signature help overload (#34278)
Closes #ISSUE

related https://github.com/zed-industries/zed/pull/33199
2025-07-14 17:20:19 +00:00
Piotr Osiewicz
4848bd705e docs/debugger: Remove mention of onboarding calls (#34414)
Closes #ISSUE

Release Notes:

- N/A
2025-07-14 17:18:09 +00:00
Conrad Irwin
45d0686129 Remove unused KeycodeSource (#34403)
Release Notes:

- N/A
2025-07-14 11:03:16 -06:00
Marshall Bowers
eca36c502e Route all LLM traffic through cloud.zed.dev (#34404)
This PR makes it so all LLM traffic is routed through `cloud.zed.dev`.

We're already routing `llm.zed.dev` to `cloud.zed.dev` on the server,
but we want to standardize on `cloud.zed.dev` moving forward.

Release Notes:

- N/A
2025-07-14 16:03:19 +00:00
Piotr Osiewicz
6673c7cd4c debugger: Add memory view (#33955)
This is mostly setting up the UI for now; I expect it to be the biggest
chunk of work.

Release Notes:

- debugger: Added memory view

---------

Co-authored-by: Anthony Eid <hello@anthonyeid.me>
Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
Co-authored-by: Mikayla Maki <mikayla@zed.dev>
2025-07-14 16:32:06 +02:00
Peter Tripp
a2f5c47e2d Add editor::ToggleFoldAll action (#34317)
In multibuffers, adds the ability to alt-click to fold/unfold all
excerpts. In singleton buffers, it adds the ability to toggle back and
forth between `editor::FoldAll` and `editor::UnfoldAll`.

Bind it in your keymap with:

```json
  {
    "context": "Editor && (mode == full || multibuffer)",
    "bindings": {
      "cmd-k cmd-o": "editor::ToggleFoldAll"
    }
  },
```

<img width="253" height="99" alt="Screenshot 2025-07-11 at 17 04 25"
src="https://github.com/user-attachments/assets/94de8275-d2ee-4cf8-a46c-a698ccdb60e3"
/>

Release Notes:

- Added the ability to fold all excerpts in a multibuffer (alt-click)
and to toggle folding in singleton buffers via `editor::ToggleFoldAll`
2025-07-14 13:23:51 +00:00
Peter Tripp
c6a6db9754 emacs: Fix cmd-f not working in Terminal (#34400)
Release Notes:

- N/A
2025-07-14 13:16:25 +00:00
Brian Donovan
6f9e052edb languages: Add JS/TS generator functions to outline (#34388)
Functions like `function* iterateElements() {}` would not show up in the
editor's navigation outline. With this change, they do.

| **Before** | **After** |
|-|-|
| <img width="453" height="280" alt="Screenshot 2025-07-13 at 4 58 22 PM" src="https://github.com/user-attachments/assets/822f0774-bda2-4855-a6dd-80ba82fffaf3" /> | <img width="564" height="373" alt="Screenshot 2025-07-13 at 4 58 55 PM" src="https://github.com/user-attachments/assets/f4f6b84f-cd26-49b7-923b-724860eb18ad" /> |

Note that I decided to use Zed's agent assistance features to do this PR
as a sort of test run. I don't normally code with an AI assistant, but
figured it might be good in this case since I'm unfamiliar with the
codebase. I must say I was fairly impressed. All the changes in this PR
were done by Claude Sonnet 4, though I have done a manual review to
ensure the changes look sane and tested the changes by running the
re-built `zed` binary with a toy project.

Closes #21631

Release Notes:

- Fixed JS/TS outlines to show generator functions.
2025-07-14 07:26:17 -05:00
Oleksiy Syvokon
2edf85f054 evals: Switch disable_cursor_blinking to deterministic asserts (#34398)
Release Notes:

- N/A
2025-07-14 11:26:15 +00:00
vipex
00ec243771 pane: 'Close others' now closes relative to right-clicked tab (#34355)
Closes #33445

Fixed the "Close others" context menu action to close tabs relative to
the right-clicked tab instead of the currently active tab. Previously,
when right-clicking on an inactive tab and selecting "Close others", it
would keep the active tab open rather than the right-clicked tab.

## Before/After

https://github.com/user-attachments/assets/d76854c3-c490-4a41-8166-309dec26ba8a



## Changes

- Modified `close_inactive_items()` method to accept an optional
`target_item_id` parameter
- Updated context menu handler to pass the right-clicked tab's ID as the
target
- Maintained backward compatibility by defaulting to active tab when no
target is specified
- Updated all existing call sites to pass `None` for the new parameter
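
The changes above can be sketched roughly as follows; the type and method names are illustrative assumptions, not Zed's actual `Pane` API:

```rust
// Hypothetical sketch of "close others" taking an optional target,
// falling back to the active item to preserve the old behavior.
#[derive(Clone, Copy, PartialEq, Debug)]
struct ItemId(u64);

struct Pane {
    items: Vec<ItemId>,
    active: ItemId,
}

impl Pane {
    fn close_inactive_items(&mut self, target_item_id: Option<ItemId>) {
        // Default to the active tab when no right-clicked target is given,
        // which keeps existing call sites (passing None) working unchanged.
        let keep = target_item_id.unwrap_or(self.active);
        self.items.retain(|&id| id == keep);
    }
}

fn main() {
    let mut pane = Pane {
        items: vec![ItemId(1), ItemId(2), ItemId(3)],
        active: ItemId(1),
    };
    // Right-clicking tab 3 now keeps tab 3, not the active tab 1.
    pane.close_inactive_items(Some(ItemId(3)));
    assert_eq!(pane.items, vec![ItemId(3)]);
}
```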

Release Notes:

- Fixed: "Close others" context menu action now correctly keeps the
right-clicked tab open instead of the active tab
2025-07-14 14:06:40 +03:00
feeiyu
84124c60db Fix cannot select in terminal when copy_on_select is enabled (#34131)
Closes #33989


![terminal_select](https://github.com/user-attachments/assets/5027d2f2-f2b3-43a4-8262-3c266fdc5256)

Release Notes:

- N/A
2025-07-14 14:04:54 +03:00
Umesh Yadav
cf1ce1beed languages: Fix ESLint diagnostics not getting shown (#33814)
Closes #33442

Release Notes:

- Resolved an issue where the ESLint language server returned an empty
string for the `CodeDescription.href` field in diagnostics, leading to
missing diagnostics in the editor.
2025-07-14 13:48:56 +03:00
Oleksiy Syvokon
e4effa5e01 linux: Fix keycodes mapping on Wayland (#34396)
We are already converting Wayland keycodes to X11's; double conversion
results in a wrong mapping.


Release Notes:

- N/A
2025-07-14 09:44:29 +00:00
Sergei Surovtsev
f50041779d Language independent hotkeys (#34053)
Addresses #10972 
Closes #24950
Closes #24499

Adds a _key_en_ field to _Keystroke_ that is derived from the key's scan
code. This is a more lightweight approach than #32529.

This has currently been tested on X11 and Windows. The Mac code hasn't
been implemented yet.

Release Notes:

- linux: When typing non-ASCII keys on Linux we will now also match
keybindings against the QWERTY-equivalent layout. This should allow most
of Zed's builtin shortcuts to work out of the box on most keyboard
layouts. **Breaking change**: If you had been using `keysym` names in
your keyboard shortcut file (`ctrl-cyrillic_yeru`, etc.) you should now
use the QWERTY-equivalent characters instead.

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-07-13 23:00:03 -06:00
Smit Barmase
51df8a17ef project_panel: Do not render a single sticky entry when scrolled all the way to the top (#34389)
Fixes root entry not expanding/collapsing on nightly. Regressed in
https://github.com/zed-industries/zed/pull/34367.

Release Notes:

- N/A
2025-07-14 06:59:45 +05:30
Somtoo Chukwurah
85d12548a1 linux: Add file_finder::Toggle key binding (#34380)
This fixes a bug on Linux where repeated presses of `p` while holding
down the `ctrl` modifier navigate through options in reverse.

Closes #34379

The main issue is that the default binding of `ctrl-p` on Linux is
`menu::SelectPrevious`, so in the context `"FileFinder || (FileFinder >
Picker > Editor)"` it would navigate in reverse.
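
The fix amounts to a keymap entry along these lines; the context string comes from the description above, while the exact JSON shape is an assumption rather than the PR's literal diff:

```json
[
  {
    "context": "FileFinder || (FileFinder > Picker > Editor)",
    "bindings": {
      "ctrl-p": "file_finder::Toggle"
    }
  }
]
```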

Release Notes:

- Fixed `file_finder::Toggle` on Linux not scrolling forward
2025-07-13 23:35:03 +00:00
Finn Evers
0af7d32b7d keymap_ui: Dismiss context menu less frequently (#34387)
This PR fixes an issue where the context menu in the keymap UI would be
immediately dismissed after being opened when using a trackpad on macOS.

Right-clicking on macOS almost always fires a scroll event with a delta
of 0 pixels right after (which is not the case when using a mouse). The
fired scroll event caused the context menu to be removed on the next
frame. This change ensures the menu is only removed when a vertical
scroll is actually happening.
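
The guard described above can be sketched like this; the event type is illustrative, not gpui's actual API:

```rust
// Sketch: dismiss the context menu only when a vertical scroll actually
// happens, so macOS's zero-delta scroll event fired right after a
// trackpad right-click is ignored.
struct ScrollEvent {
    delta_y: f32,
}

fn should_dismiss_menu(event: &ScrollEvent) -> bool {
    event.delta_y != 0.0
}

fn main() {
    // The spurious macOS event after a right-click has a zero delta.
    assert!(!should_dismiss_menu(&ScrollEvent { delta_y: 0.0 }));
    // A real vertical scroll still dismisses the menu.
    assert!(should_dismiss_menu(&ScrollEvent { delta_y: -3.5 }));
}
```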

Release Notes:

- N/A
2025-07-13 21:41:41 +00:00
Smit Barmase
1cadff9311 project_panel: Fix sticky items horizontal scroll and hover propagation (#34367)
Release Notes:

- Fixed horizontal scrolling not working for sticky items in the Project
Panel.
- Fixed issue where hovering over the last sticky item in the Project
Panel showed a hovered state on the entry behind it.
- Improved behavior when clicking a sticky item in the Project Panel so
it scrolls just enough for the item to no longer be sticky.
2025-07-13 06:14:30 +05:30
Anthony Eid
8f6b9f0d65 debugger: Allow users to shutdown debug sessions while they're booting (#34362)
This solves problems where users couldn't shut down sessions while
locators or build tasks are running.

I renamed the `debugger::Session::Mode` enum to `SessionState` to be
clearer when it's referenced in other crates. I also embedded the boot
task that is created into the `SessionState::Building` variant. This
allows sessions to shut down all threads created in their boot process
in a clean and idiomatic way.

Finally, I added a method on terminal that allows killing the active
task.

Release Notes:

- Debugger: Allow shutting down debug sessions while they're booting up
2025-07-12 19:16:35 -04:00
Cole Miller
970a1066f5 git: Handle shift-click to stage a range of entries in the panel (#34296)
Release Notes:

- git: shift-click can now be used to stage a range of entries in the
git panel.
2025-07-12 19:04:26 +00:00
Remco Smits
833bc6979a debugger: Correctly determine replace range for debug console completions (#33959)
Follow-up #33868

This PR fixes a few issues with determining the completion range for
client‑ and variable‑list completions.

1. Non‑word completions
We previously supported only word characters and `_`, using their combined
length to compute the start offset. In PHP, however, an expression can
contain `$`, `-`, `>`, `[`, `]`, `(`, and `)`. Because these characters
weren't treated as word characters, the start offset stopped at them,
even when the preceding character was part of a word.

2. Trailing characters inside the search text
When autocompletion occurred in the middle of the search text, we didn't
account for trailing characters. As a result, the start offset was off
by the number of characters after the cursor. For example, replacing
`res` with `result` in `print(res)` produced `print(rresult)` because the
trailing `)` wasn't subtracted from the start offset.

The following completions are correctly covered now:

- **Before** `$aut` -> `$aut$author` **After** `$aut` -> `$author`
- **Before** `$author->na` -> `$author->na$author->name` **After**
`$author->na` -> `$author->name`
- **Before** `$author->books[` -> `$author->books[$author->books[0]`
**After** `$author->books[` -> `$author->books[0]`
- **Before** `print(res)` -> `print(rresult)` **After** `print(res)` ->
`print(result)`
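
A rough sketch of the start-offset computation described above; the helper name and character set are simplifications (the real implementation also matches the search text against the completion label), but it reproduces the `print(res)` and `$author->books[` cases:

```rust
// Scan back from the cursor over characters that may appear in an
// expression (including PHP's `$`, `-`, `>`, `[`, `]`); text after the
// cursor never shifts the start offset.
fn replace_start(line: &str, cursor: usize) -> usize {
    let is_expr_char =
        |c: char| c.is_alphanumeric() || matches!(c, '_' | '$' | '-' | '>' | '[' | ']');
    line[..cursor]
        .char_indices()
        .rev()
        .take_while(|&(_, c)| is_expr_char(c))
        .last()
        .map(|(i, _)| i)
        .unwrap_or(cursor)
}

fn main() {
    // Cursor after "res" in "print(res)": only "res" is replaced; the
    // trailing ")" after the cursor does not affect the start offset.
    let line = "print(res)";
    assert_eq!(replace_start(line, 9), 6);
    assert_eq!(&line[replace_start(line, 9)..9], "res");

    // "$author->na" is treated as one expression, so the whole prefix
    // is replaced by "$author->name" rather than appended to.
    assert_eq!(replace_start("$author->na", 11), 0);
}
```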

**Before**


https://github.com/user-attachments/assets/b530cf31-8d4d-45e6-9650-18574f14314c


https://github.com/user-attachments/assets/52475b7b-2bf2-4749-98ec-0dc933fcc364

**After**


https://github.com/user-attachments/assets/c065701b-31c9-4e0a-b584-d1daffe3a38c


https://github.com/user-attachments/assets/455ebb3e-632e-4a57-aea8-d214d2992c06

Release Notes:

- Debugger: Fixed autocompletion not always replacing the correct search
text
2025-07-12 14:24:49 -04:00
Cole Miller
a8cc927303 debugger: Improve appearance of session list for JavaScript debugging (#34322)
This PR updates the debugger panel's session list to be more useful in
some cases that are commonly hit when using the JavaScript adapter. We
make two adjustments, which only apply to JavaScript sessions:

- For a child session that's the only child of a root session, we
collapse it with its parent. This imitates what VS Code does in the
"call stack" view for JavaScript sessions.
- When a session has exactly one thread, we label the session with that
thread's name, instead of the session label provided by the DAP. VS Code
also makes this adjustment, which surfaces more useful information when
working with browser sessions.

Closes #33072 

Release Notes:

- debugger: Improved the appearance of JavaScript sessions in the debug
panel's session list.

---------

Co-authored-by: Julia <julia@zed.dev>
Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-07-12 15:56:05 +00:00
Kirill Bulatov
13ddd5e4cb Return back the guards when goto targets are queried for (#34340)
Closes https://github.com/zed-industries/zed/issues/34310
Follow-up of https://github.com/zed-industries/zed/pull/29359

Release Notes:

- Fixed goto definition not working in remote projects in certain
conditions
2025-07-12 18:27:52 +03:00
Cole Miller
1b6e212eba debugger: Fix endless restarts when connecting to TCP adapters over SSH (#34328)
Closes #34323
Closes #34313

The previous PR #33932 introduced a way to "close" the
`pending_requests` buffer of the `TransportDelegate`, preventing any
more requests from being added. This prevents pending requests from
accumulating without ever being drained during the shutdown sequence;
without it, some of our tests hang at this point (due to using a
single-threaded executor).

The bug occurred because we were closing `pending_requests` whenever we
detected the server side of the transport shut down, and this closed
state stuck around and interfered with the retry logic for SSH+TCP
adapter connections.

This PR fixes the bug by only closing `pending_requests` on session
shutdown, and adds a regression test covering the SSH retry logic.
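
The bug pattern can be sketched minimally; the names and structure are assumptions, not the actual `TransportDelegate` code:

```rust
// A "closed" flag that, once set on a transport disconnect, also blocked
// later retry attempts; the fix scopes closing to session shutdown only.
struct PendingRequests {
    closed: bool,
    queue: Vec<u64>,
}

impl PendingRequests {
    fn push(&mut self, req: u64) -> Result<(), &'static str> {
        if self.closed {
            return Err("pending_requests closed");
        }
        self.queue.push(req);
        Ok(())
    }
}

fn main() {
    let mut pending = PendingRequests { closed: false, queue: Vec::new() };
    assert!(pending.push(1).is_ok());
    // Before the fix: a server-side disconnect set `closed`, so the SSH
    // retry's new requests were rejected and the session restarted endlessly.
    pending.closed = true;
    assert!(pending.push(2).is_err());
}
```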

Release Notes:

- debugger: Fixed a bug causing SSH connections to some adapters
(Python, Go, JavaScript) to fail and restart endlessly.
2025-07-12 11:27:18 -04:00
Danilo Leal
46834d31f1 Refine status bar design (#34324)
Experimenting with a set of standardized icons and polishing spacing a
little bit.

Release Notes:

- N/A
2025-07-12 11:48:19 -03:00
Kirill Bulatov
e070c81687 Remove remaining plugin-related language server adapters (#34334)
Follow-up of https://github.com/zed-industries/zed/pull/34208

Release Notes:

- N/A
2025-07-12 11:42:14 +00:00
Cole Miller
5b61b8c8ed agent: Fix crash with pathological fetch output (#34253)
Closes #34029

The crash is due to a stack overflow in our `html_to_markdown`
conversion; I've added a maximum depth of 200 for the recursion in that
crate to guard against this kind of thing.

Separately, we were treating all content-types other than `text/plain`
and `application/json` as HTML; I've changed this to only treat
`text/html` and `application/xhtml+xml` as HTML, and fall back to
plaintext. (In the original crash, the content-type was
`application/octet-stream`.)
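
The depth-cap fix follows a standard pattern, sketched here with illustrative types (not the `html_to_markdown` crate's actual code):

```rust
// Cap recursion depth so pathological deeply nested input cannot
// overflow the stack during tree traversal.
const MAX_DEPTH: usize = 200;

enum Node {
    Text(String),
    Element(Vec<Node>),
}

fn render(node: &Node, depth: usize, out: &mut String) {
    if depth > MAX_DEPTH {
        // Bail out instead of recursing further into pathological input.
        return;
    }
    match node {
        Node::Text(t) => out.push_str(t),
        Node::Element(children) => {
            for child in children {
                render(child, depth + 1, out);
            }
        }
    }
}

fn main() {
    // Build nesting far deeper than the cap; rendering must not overflow.
    let mut node = Node::Text("hi".into());
    for _ in 0..1_000 {
        node = Node::Element(vec![node]);
    }
    let mut out = String::new();
    render(&node, 0, &mut out);
    // The text sits beyond the depth cap, so nothing is emitted.
    assert_eq!(out, "");
}
```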

Release Notes:

- agent: Fixed a potential crash when fetching large non-HTML files.
2025-07-11 21:01:09 -04:00
Cole Miller
625ce12a3e Revert "git: Intercept signing prompt from GPG when committing" (#34306)
Reverts zed-industries/zed#34096

This introduced a regression, because the unlocked key can't benefit
from caching.

Release Notes:
- N/A
2025-07-11 23:20:35 +00:00
vipex
12bc8907d9 Recall empty, unsaved buffers on app load (#33475)
Closes #33342

This PR implements serialization of pinned tabs regardless of their
state (empty, untitled, etc.)

The root cause was that empty untitled tabs were being skipped during
serialization but their pinned state was still being persisted, leading
to a mismatch between the stored pinned count and the actual restorable
tabs. This issue led to a crash, which was patched by @JosephTLyons, but
this PR aims to be a proper fix.

**Note**: I'm still evaluating the best approach for this fix. Currently
exploring whether it's necessary to store the pinned state in the
database schema or if there's a simpler solution that doesn't require
schema changes.

--- 

**Edit from Joseph**

We ended up altering our recall logic so that we always restore all
editors, even those that are new, empty, and unsaved. This prevents the
crash that #33335 patched, because we are no longer skipping the
restoration of pinned editors that have no text and haven't been saved,
which threw off the pinned-item count.

This solution is rather simple, but I think it's fine. We simply restore
everything the same way, with no conditional dropping of anything. This
is also consistent with VS Code, which restores all editors regardless
of whether new, unsaved buffers have content.

https://github.com/zed-industries/zed/tree/alt-solution-for-%2333342

Release Notes:
- N/A

---------

Co-authored-by: Joseph T. Lyons <JosephTLyons@gmail.com>
2025-07-11 22:23:04 +00:00
Ben Kunkle
67c765a99a keymap_ui: Dual-phase focus for keystroke input (#34312)
Closes #ISSUE

An idea I and @MrSubidubi came up with, to improve UX around the
keystroke input.

Currently, there's a hard tradeoff over what to focus first in the edit
keybind modal. If we focus the keystroke input, keybind modification
becomes very easy; however, if you don't want to edit a keybind, you
must use the mouse to escape the keystroke input before editing anything
else, breaking keyboard navigation.

The idea in this PR is a dual-phase focus system for the keystroke
input. There is an outer focus with a visual indicator to communicate
that it is focused (currently a border). While the outer focus region is
focused, keystrokes are not intercepted. Then there is a keybind
(currently hardcoded to `enter`) to enter the inner focus, where
keystrokes are intercepted and which must be exited using the mouse.
When the inner focus region is focused, there is a visual indicator
showing that it is "recording" (currently a hacked-together red pulsing
recording icon).


<details><summary>Video</summary>


https://github.com/user-attachments/assets/490538d0-f092-4df1-a53a-a47d7efe157b


</details>

Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-07-11 17:06:06 -05:00
Ben Kunkle
206cce6783 keymap_ui: Support unbinding non-user defined keybindings (#34318)
Closes #ISSUE

Makes it so that `KeymapFile::update_keybinding` treats removals of
bindings that weren't user-defined as creating a new binding to
`zed::NoAction`.


Release Notes:

- N/A *or* Added/Fixed/Improved ...
2025-07-11 21:23:14 +00:00
Richard Feldman
a509ae241a Revert "wip"
This reverts commit 0bb1a5f98a.
2025-07-11 12:00:43 -04:00
Richard Feldman
0bb1a5f98a wip 2025-07-11 12:00:41 -04:00
Richard Feldman
79f376d752 Clean up inlay hint hover logic 2025-07-11 11:05:23 -04:00
Richard Feldman
fe8b3fe53d Drop debug logging 2025-07-10 22:30:29 -04:00
Richard Feldman
2cd812e54f Delete unused import 2025-07-10 22:23:52 -04:00
Richard Feldman
6adf082e43 Revert "Add a hover when hovering over inlays"
This reverts commit 0d6232b373.
2025-07-10 22:23:39 -04:00
Richard Feldman
e4963e70cc Revert "Attempt to fix hover"
This reverts commit e7c6f228d5.
2025-07-10 22:23:34 -04:00
Richard Feldman
e7c6f228d5 Attempt to fix hover 2025-07-10 22:23:26 -04:00
Richard Feldman
0d6232b373 Add a hover when hovering over inlays 2025-07-10 19:14:36 -04:00
Richard Feldman
ca4df68f31 Remove flashed black circle 2025-07-10 17:54:22 -04:00
Richard Feldman
c96b6a06f0 Don't show loading message 2025-07-10 17:46:13 -04:00
Richard Feldman
b1cd20a435 Remove all debug logging from inlay hint hover implementation 2025-07-10 17:46:08 -04:00
Richard Feldman
509375c83c Remove some debug logging 2025-07-10 17:39:37 -04:00
Richard Feldman
b6bd9c0682 It works 2025-07-10 17:32:12 -04:00
Richard Feldman
8ee82395b8 Kinda make this work 2025-07-10 17:14:30 -04:00
Richard Feldman
a322aa33c7 wip - currently just shows a generic message, not the docs 2025-07-09 16:37:37 -04:00
Richard Feldman
2ff30d20e3 Fix inlay hint hover by not clearing hover when mouse is over inlay
When hovering over an inlay hint, `point_for_position.as_valid()` returns
`None` because inlays don't have valid text positions. This was causing
`hover_at(editor, None)` to be called, which would hide any active hovers.

The fix is simple: don't call `hover_at` when we're over an inlay position.
The inlay hover is already handled by `update_hovered_link`, so we don't
need to do anything else.
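
A minimal sketch of the guard described above, with illustrative types rather than Zed's actual editor API:

```rust
// Only clear the hover popup when the mouse is over a valid text
// position; over an inlay, leave the inlay hover (handled elsewhere) alone.
#[derive(Debug, PartialEq)]
enum HoverAction {
    ShowAt(usize),
    Hide,
    LeaveUnchanged,
}

fn hover_action(valid_text_position: Option<usize>, over_inlay: bool) -> HoverAction {
    match valid_text_position {
        Some(offset) => HoverAction::ShowAt(offset),
        // Previously this case always hid the popup, which dismissed the
        // inlay hover; the fix is to do nothing when hovering an inlay.
        None if over_inlay => HoverAction::LeaveUnchanged,
        None => HoverAction::Hide,
    }
}

fn main() {
    assert_eq!(hover_action(Some(42), false), HoverAction::ShowAt(42));
    assert_eq!(hover_action(None, true), HoverAction::LeaveUnchanged);
    assert_eq!(hover_action(None, false), HoverAction::Hide);
}
```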
2025-07-09 13:15:43 -04:00
Richard Feldman
01d7b3345b Fix inlay hint caching 2025-07-09 13:04:11 -04:00
Richard Feldman
17f7312fc0 Extract resolve_hint
Co-authored-by: Cole Miller <cole@zed.dev>
2025-07-07 16:48:33 -04:00
Richard Feldman
a61e478152 Reproduce #33715 in a test 2025-07-02 15:50:02 -04:00
143 changed files with 6134 additions and 2337 deletions

Cargo.lock generated

@@ -264,16 +264,18 @@ dependencies = [
[[package]]
name = "agentic-coding-protocol"
version = "0.0.7"
version = "0.0.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a75f520bcc049ebe40c8c99427aa61b48ad78a01bcc96a13b350b903dcfb9438"
checksum = "0e276b798eddd02562a339340a96919d90bbfcf78de118fdddc932524646fac7"
dependencies = [
"anyhow",
"chrono",
"derive_more 2.0.1",
"futures 0.3.31",
"log",
"parking_lot",
"schemars",
"semver",
"serde",
"serde_json",
]
@@ -608,7 +610,6 @@ dependencies = [
"parking_lot",
"smol",
"tempfile",
"unindent",
"util",
"workspace-hack",
]
@@ -677,7 +678,7 @@ dependencies = [
"anyhow",
"async-trait",
"collections",
"derive_more",
"derive_more 0.99.19",
"extension",
"futures 0.3.31",
"gpui",
@@ -740,7 +741,7 @@ dependencies = [
"clock",
"collections",
"ctor",
"derive_more",
"derive_more 0.99.19",
"futures 0.3.31",
"gpui",
"icons",
@@ -776,7 +777,7 @@ dependencies = [
"clock",
"collections",
"component",
"derive_more",
"derive_more 0.99.19",
"editor",
"feature_flags",
"fs",
@@ -1233,7 +1234,7 @@ version = "0.1.0"
dependencies = [
"anyhow",
"collections",
"derive_more",
"derive_more 0.99.19",
"gpui",
"parking_lot",
"rodio",
@@ -2926,7 +2927,7 @@ dependencies = [
"cocoa 0.26.0",
"collections",
"credentials_provider",
"derive_more",
"derive_more 0.99.19",
"feature_flags",
"fs",
"futures 0.3.31",
@@ -3110,10 +3111,11 @@ dependencies = [
"context_server",
"ctor",
"dap",
"dap-types",
"dap_adapters",
"dashmap 6.1.0",
"debugger_ui",
"derive_more",
"derive_more 0.99.19",
"editor",
"envy",
"extension",
@@ -3318,7 +3320,7 @@ name = "command_palette_hooks"
version = "0.1.0"
dependencies = [
"collections",
"derive_more",
"derive_more 0.99.19",
"gpui",
"workspace-hack",
]
@@ -4393,12 +4395,15 @@ dependencies = [
"futures 0.3.31",
"fuzzy",
"gpui",
"hex",
"indoc",
"itertools 0.14.0",
"language",
"log",
"menu",
"notifications",
"parking_lot",
"parse_int",
"paths",
"picker",
"pretty_assertions",
@@ -4522,6 +4527,27 @@ dependencies = [
"syn 2.0.101",
]
[[package]]
name = "derive_more"
version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "093242cf7570c207c83073cf82f79706fe7b8317e98620a47d5be7c3d8497678"
dependencies = [
"derive_more-impl",
]
[[package]]
name = "derive_more-impl"
version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bda628edc44c4bb645fbe0f758797143e4e07926f7ebf4e9bdfbd3d2ce621df3"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.101",
"unicode-xid",
]
[[package]]
name = "derive_refineable"
version = "0.1.0"
@@ -6220,7 +6246,7 @@ dependencies = [
"askpass",
"async-trait",
"collections",
"derive_more",
"derive_more 0.99.19",
"futures 0.3.31",
"git2",
"gpui",
@@ -7237,7 +7263,7 @@ dependencies = [
"core-video",
"cosmic-text",
"ctor",
"derive_more",
"derive_more 0.99.19",
"embed-resource",
"env_logger 0.11.8",
"etagere",
@@ -7783,7 +7809,7 @@ version = "0.1.0"
dependencies = [
"anyhow",
"bytes 1.10.1",
"derive_more",
"derive_more 0.99.19",
"futures 0.3.31",
"http 1.3.1",
"log",
@@ -8221,7 +8247,7 @@ dependencies = [
"async-trait",
"cargo_metadata",
"collections",
"derive_more",
"derive_more 0.99.19",
"extension",
"fs",
"futures 0.3.31",
@@ -9033,7 +9059,6 @@ dependencies = [
"credentials_provider",
"deepseek",
"editor",
"feature_flags",
"fs",
"futures 0.3.31",
"google_ai",
@@ -9134,7 +9159,6 @@ dependencies = [
"futures 0.3.31",
"gpui",
"http_client",
"indoc",
"language",
"log",
"lsp",
@@ -9661,12 +9685,11 @@ dependencies = [
[[package]]
name = "lsp-types"
version = "0.95.1"
source = "git+https://github.com/zed-industries/lsp-types?rev=c9c189f1c5dd53c624a419ce35bc77ad6a908d18#c9c189f1c5dd53c624a419ce35bc77ad6a908d18"
source = "git+https://github.com/zed-industries/lsp-types?rev=6add7052b598ea1f40f7e8913622c3958b009b60#6add7052b598ea1f40f7e8913622c3958b009b60"
dependencies = [
"bitflags 1.3.2",
"serde",
"serde_json",
"serde_repr",
"url",
]
@@ -11278,6 +11301,15 @@ dependencies = [
"windows-targets 0.52.6",
]
[[package]]
name = "parse_int"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1c464266693329dd5a8715098c7f86e6c5fd5d985018b8318f53d9c6c2b21a31"
dependencies = [
"num-traits",
]
[[package]]
name = "partial-json-fixer"
version = "0.5.3"
@@ -12321,6 +12353,7 @@ dependencies = [
"anyhow",
"askpass",
"async-trait",
"base64 0.22.1",
"buffer_diff",
"circular-buffer",
"client",
@@ -12366,6 +12399,7 @@ dependencies = [
"sha2",
"shellexpand 2.1.2",
"shlex",
"smallvec",
"smol",
"snippet",
"snippet_provider",
@@ -14100,7 +14134,7 @@ dependencies = [
[[package]]
name = "scap"
version = "0.0.8"
source = "git+https://github.com/zed-industries/scap?rev=28dd306ff2e3374404936dec778fc1e975b8dd12#28dd306ff2e3374404936dec778fc1e975b8dd12"
source = "git+https://github.com/zed-industries/scap?rev=270538dc780f5240723233ff901e1054641ed318#270538dc780f5240723233ff901e1054641ed318"
dependencies = [
"anyhow",
"cocoa 0.25.0",
@@ -14152,6 +14186,7 @@ dependencies = [
"indexmap",
"ref-cast",
"schemars_derive",
"semver",
"serde",
"serde_json",
]
@@ -14673,16 +14708,19 @@ dependencies = [
"language",
"log",
"menu",
"notifications",
"paths",
"project",
"schemars",
"search",
"serde",
"serde_json",
"settings",
"theme",
"tree-sitter-json",
"tree-sitter-rust",
"ui",
"ui_input",
"util",
"workspace",
"workspace-hack",
@@ -16110,7 +16148,7 @@ version = "0.1.0"
dependencies = [
"anyhow",
"collections",
-"derive_more",
+"derive_more 0.99.19",
"fs",
"futures 0.3.31",
"gpui",
@@ -18377,7 +18415,6 @@ version = "0.1.0"
dependencies = [
"anyhow",
"client",
"feature_flags",
"futures 0.3.31",
"gpui",
"http_client",
@@ -19649,6 +19686,7 @@ dependencies = [
"rustix 1.0.7",
"rustls 0.23.26",
"rustls-webpki 0.103.1",
"scap",
"schemars",
"scopeguard",
"sea-orm",


@@ -404,7 +404,7 @@ zlog_settings = { path = "crates/zlog_settings" }
# External crates
#
-agentic-coding-protocol = "0.0.7"
+agentic-coding-protocol = { version = "0.0.9" }
aho-corasick = "1.1"
alacritty_terminal = { git = "https://github.com/zed-industries/alacritty.git", branch = "add-hush-login-flag" }
any_vec = "0.14"
@@ -492,7 +492,7 @@ libc = "0.2"
libsqlite3-sys = { version = "0.30.1", features = ["bundled"] }
linkify = "0.10.0"
log = { version = "0.4.16", features = ["kv_unstable_serde", "serde"] }
-lsp-types = { git = "https://github.com/zed-industries/lsp-types", rev = "c9c189f1c5dd53c624a419ce35bc77ad6a908d18" }
+lsp-types = { git = "https://github.com/zed-industries/lsp-types", rev = "6add7052b598ea1f40f7e8913622c3958b009b60" }
markup5ever_rcdom = "0.3.0"
metal = "0.29"
moka = { version = "0.12.10", features = ["sync"] }
@@ -507,6 +507,7 @@ ordered-float = "2.1.1"
palette = { version = "0.7.5", default-features = false, features = ["std"] }
parking_lot = "0.12.1"
partial-json-fixer = "0.5.3"
parse_int = "0.9"
pathdiff = "0.2"
pet = { git = "https://github.com/microsoft/python-environment-tools.git", rev = "845945b830297a50de0e24020b980a65e4820559" }
pet-conda = { git = "https://github.com/microsoft/python-environment-tools.git", rev = "845945b830297a50de0e24020b980a65e4820559" }
@@ -546,7 +547,7 @@ rustc-demangle = "0.1.23"
rustc-hash = "2.1.0"
rustls = { version = "0.23.26" }
rustls-platform-verifier = "0.5.0"
-scap = { git = "https://github.com/zed-industries/scap", rev = "28dd306ff2e3374404936dec778fc1e975b8dd12", default-features = false }
+scap = { git = "https://github.com/zed-industries/scap", rev = "270538dc780f5240723233ff901e1054641ed318", default-features = false }
schemars = { version = "1.0", features = ["indexmap2"] }
semver = "1.0"
serde = { version = "1.0", features = ["derive", "rc"] }


@@ -1 +1,12 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-bug"><path d="m8 2 1.88 1.88"/><path d="M14.12 3.88 16 2"/><path d="M9 7.13v-1a3.003 3.003 0 1 1 6 0v1"/><path d="M12 20c-3.3 0-6-2.7-6-6v-3a4 4 0 0 1 4-4h4a4 4 0 0 1 4 4v3c0 3.3-2.7 6-6 6"/><path d="M12 20v-9"/><path d="M6.53 9C4.6 8.8 3 7.1 3 5"/><path d="M6 13H2"/><path d="M3 21c0-2.1 1.7-3.9 3.8-4"/><path d="M20.97 5c0 2.1-1.6 3.8-3.5 4"/><path d="M22 13h-4"/><path d="M17.2 17c2.1.1 3.8 1.9 3.8 4"/></svg>
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M5.49219 2.29071L6.41455 3.1933" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M9.61816 3.1933L10.508 2.29071" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M5.7042 5.89221V5.15749C5.69033 4.85975 5.73943 4.56239 5.84856 4.28336C5.95768 4.00434 6.12456 3.74943 6.33913 3.53402C6.55369 3.31862 6.81149 3.14718 7.09697 3.03005C7.38245 2.91292 7.68969 2.85254 8.00014 2.85254C8.3106 2.85254 8.61784 2.91292 8.90332 3.03005C9.18879 3.14718 9.44659 3.31862 9.66116 3.53402C9.87572 3.74943 10.0426 4.00434 10.1517 4.28336C10.2609 4.56239 10.31 4.85975 10.2961 5.15749V5.89221" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M8.00006 13.0426C6.13263 13.0426 4.60474 11.6005 4.60474 9.83792V8.23558C4.60474 7.66895 4.84322 7.12554 5.26772 6.72487C5.69221 6.32421 6.26796 6.09912 6.86829 6.09912H9.13184C9.73217 6.09912 10.3079 6.32421 10.7324 6.72487C11.1569 7.12554 11.3954 7.66895 11.3954 8.23558V9.83792C11.3954 11.6005 9.86749 13.0426 8.00006 13.0426Z" fill="black" fill-opacity="0.15" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M4.60452 6.25196C3.51235 6.13878 2.60693 5.17677 2.60693 3.9884" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M4.60462 8.81659H2.34106" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M2.4541 13.3186C2.4541 12.1302 3.41611 11.1116 4.60448 11.0551" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M13.0761 3.9884C13.0761 5.17677 12.1706 6.13878 11.0955 6.25196" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M13.6591 8.81659H11.3955" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M11.3955 11.0551C12.5839 11.1116 13.5459 12.1302 13.5459 13.3186" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


@@ -1,5 +1,5 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
-<path d="M3.03125 3V3.03125M3.03125 3.03125V9M3.03125 3.03125C3.03125 5 6 5 6 5M3.03125 9C3.03125 11 6 11 6 11M3.03125 9V12" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
-<rect x="8" y="2.5" width="6" height="5" rx="1.5" fill="black"/>
-<rect x="8" y="8.46875" width="6" height="5.0625" rx="1.5" fill="black"/>
+<path d="M3 3V3.03125M3 3.03125V9M3 3.03125C3 5 5.96875 5 5.96875 5M3 9C3 11 5.96875 11 5.96875 11M3 9V12" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
+<rect x="8" y="3" width="5.5" height="4" rx="1.5" fill="black"/>
+<rect x="8" y="9" width="5.5" height="4" rx="1.5" fill="black"/>
</svg>


@@ -1,6 +1,7 @@
<svg width="12" height="12" viewBox="0 0 12 12" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M3.75 3.25C4.02614 3.25 4.25 3.02614 4.25 2.75C4.25 2.47386 4.02614 2.25 3.75 2.25C3.47386 2.25 3.25 2.47386 3.25 2.75C3.25 3.02614 3.47386 3.25 3.75 3.25ZM3.75 4.25C4.57843 4.25 5.25 3.57843 5.25 2.75C5.25 1.92157 4.57843 1.25 3.75 1.25C2.92157 1.25 2.25 1.92157 2.25 2.75C2.25 3.57843 2.92157 4.25 3.75 4.25Z" fill="black"/>
<path fill-rule="evenodd" clip-rule="evenodd" d="M8.25 3.25C8.52614 3.25 8.75 3.02614 8.75 2.75C8.75 2.47386 8.52614 2.25 8.25 2.25C7.97386 2.25 7.75 2.47386 7.75 2.75C7.75 3.02614 7.97386 3.25 8.25 3.25ZM8.25 4.25C9.07843 4.25 9.75 3.57843 9.75 2.75C9.75 1.92157 9.07843 1.25 8.25 1.25C7.42157 1.25 6.75 1.92157 6.75 2.75C6.75 3.57843 7.42157 4.25 8.25 4.25Z" fill="black"/>
<path fill-rule="evenodd" clip-rule="evenodd" d="M3.75 9.75C4.02614 9.75 4.25 9.52614 4.25 9.25C4.25 8.97386 4.02614 8.75 3.75 8.75C3.47386 8.75 3.25 8.97386 3.25 9.25C3.25 9.52614 3.47386 9.75 3.75 9.75ZM3.75 10.75C4.57843 10.75 5.25 10.0784 5.25 9.25C5.25 8.42157 4.57843 7.75 3.75 7.75C2.92157 7.75 2.25 8.42157 2.25 9.25C2.25 10.0784 2.92157 10.75 3.75 10.75Z" fill="black"/>
<path fill-rule="evenodd" clip-rule="evenodd" d="M3.25 3.75H4.25V5.59609C4.67823 5.35824 5.24991 5.25 6 5.25H7.25017C7.5262 5.25 7.75 5.02625 7.75 4.75V3.75H8.75V4.75C8.75 5.57832 8.07871 6.25 7.25017 6.25H6C5.14559 6.25 4.77639 6.41132 4.59684 6.56615C4.42571 6.71373 4.33877 6.92604 4.25 7.30651V8.25H3.25V3.75Z" fill="black"/>
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<circle cx="5" cy="12" r="1.25" stroke="black" stroke-width="1.5"/>
<path d="M5 11V5" stroke="black" stroke-width="1.5"/>
<path d="M5 10C5 10 5.5 8 7 8C7.73103 8 8.69957 8 9.50049 8C10.3289 8 11 7.32843 11 6.5V5" stroke="black" stroke-width="1.5"/>
<circle cx="5" cy="4" r="1.25" stroke="black" stroke-width="1.5"/>
<circle cx="11" cy="4" r="1.25" stroke="black" stroke-width="1.5"/>
</svg>


@@ -1 +1,7 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-list-tree"><path d="M21 12h-8"/><path d="M21 6H8"/><path d="M21 18h-8"/><path d="M3 6v4c0 1.1.9 2 2 2h3"/><path d="M3 10v6c0 1.1.9 2 2 2h3"/></svg>
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M13.5 8H9.5" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M13.5 4L6.5 4" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M13.5 12H9.5" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M3 3.5V6.33333C3 7.25 3.72 8 4.6 8H7" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M3 6V10.5C3 11.325 3.72 12 4.6 12H7" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-location-edit-icon lucide-location-edit"><path d="M17.97 9.304A8 8 0 0 0 2 10c0 4.69 4.887 9.562 7.022 11.468"/><path d="M21.378 16.626a1 1 0 0 0-3.004-3.004l-4.01 4.012a2 2 0 0 0-.506.854l-.837 2.87a.5.5 0 0 0 .62.62l2.87-.837a2 2 0 0 0 .854-.506z"/><circle cx="10" cy="10" r="3"/></svg>


@@ -0,0 +1,3 @@
<svg width="14" height="14" viewBox="0 0 14 14" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M5 4L10 7L5 10V4Z" fill="black" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


@@ -0,0 +1,5 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M11.8889 3H4.11111C3.49746 3 3 3.49746 3 4.11111V11.8889C3 12.5025 3.49746 13 4.11111 13H11.8889C12.5025 13 13 12.5025 13 11.8889V4.11111C13 3.49746 12.5025 3 11.8889 3Z" fill="black" fill-opacity="0.15" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M8.37939 10.3243H10.3794" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M5.64966 9.32837L7.64966 7.32837L5.64966 5.32837" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


@@ -1,3 +1,5 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M5.9 8.00002C7.44656 8.00002 8.7 6.74637 8.7 5.20002C8.7 3.65368 7.44656 2.40002 5.9 2.40002C4.35344 2.40002 3.1 3.65368 3.1 5.20002C3.1 6.74637 4.35344 8.00002 5.9 8.00002ZM7.00906 9.05002H4.79094C2.69684 9.05002 1 10.7475 1 12.841C1 13.261 1.3395 13.6 1.75819 13.6H10.0409C10.4609 13.6 10.8 13.261 10.8 12.841C10.8 10.7475 9.1025 9.05002 7.00906 9.05002ZM11.4803 9.40002H9.86484C10.87 10.2247 11.5 11.4585 11.5 12.841C11.5 13.121 11.4169 13.3791 11.2812 13.6H14.3C14.6872 13.6 15 13.285 15 12.8803C15 10.9663 13.4338 9.40002 11.4803 9.40002ZM10.45 8.00002C11.8041 8.00002 12.9 6.90409 12.9 5.55002C12.9 4.19596 11.8041 3.10002 10.45 3.10002C9.90072 3.10002 9.39913 3.28716 8.9905 3.59243C9.2425 4.07631 9.4 4.61815 9.4 5.20002C9.4 5.97702 9.13903 6.69059 8.70897 7.27181C9.15281 7.72002 9.7675 8.00002 10.45 8.00002Z" fill="white"/>
<path d="M6.79118 8.27005C8.27568 8.27005 9.4791 7.06663 9.4791 5.58214C9.4791 4.09765 8.27568 2.89423 6.79118 2.89423C5.30669 2.89423 4.10327 4.09765 4.10327 5.58214C4.10327 7.06663 5.30669 8.27005 6.79118 8.27005Z" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M6.79112 8.60443C4.19441 8.60443 2.08936 10.7095 2.08936 13.3062H11.4929C11.4929 10.7095 9.38784 8.60443 6.79112 8.60443Z" fill="black" fill-opacity="0.15" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M14.6984 12.9263C14.6984 10.8893 13.4895 8.99736 12.2806 8.09067C12.6779 7.79254 12.9957 7.40104 13.2057 6.95083C13.4157 6.50062 13.5115 6.00558 13.4846 5.50952C13.4577 5.01346 13.309 4.53168 13.0515 4.10681C12.7941 3.68194 12.4358 3.3271 12.0085 3.07367" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


@@ -1,5 +1,5 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M8 2L6.72534 5.87534C6.6601 6.07367 6.5492 6.25392 6.40155 6.40155C6.25392 6.5492 6.07367 6.6601 5.87534 6.72534L2 8L5.87534 9.27466C6.07367 9.3399 6.25392 9.4508 6.40155 9.59845C6.5492 9.74608 6.6601 9.92633 6.72534 10.1247L8 14L9.27466 10.1247C9.3399 9.92633 9.4508 9.74608 9.59845 9.59845C9.74608 9.4508 9.92633 9.3399 10.1247 9.27466L14 8L10.1247 6.72534C9.92633 6.6601 9.74608 6.5492 9.59845 6.40155C9.4508 6.25392 9.3399 6.07367 9.27466 5.87534L8 2Z" fill="black" fill-opacity="0.15" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M8 2.93652L6.9243 6.20697C6.86924 6.37435 6.77565 6.52646 6.65105 6.65105C6.52646 6.77565 6.37435 6.86924 6.20697 6.9243L2.93652 8L6.20697 9.0757C6.37435 9.13076 6.52646 9.22435 6.65105 9.34895C6.77565 9.47354 6.86924 9.62565 6.9243 9.79306L8 13.0635L9.0757 9.79306C9.13076 9.62565 9.22435 9.47354 9.34895 9.34895C9.47354 9.22435 9.62565 9.13076 9.79306 9.0757L13.0635 8L9.79306 6.9243C9.62565 6.86924 9.47354 6.77565 9.34895 6.65105C9.22435 6.52646 9.13076 6.37435 9.0757 6.20697L8 2.93652Z" fill="black" fill-opacity="0.15" stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M3.33334 2V4.66666M2 3.33334H4.66666" stroke="black" stroke-opacity="0.75" stroke-width="1.25" stroke-linecap="round" stroke-linejoin="round"/>
<path d="M12.6665 11.3333V14M11.3333 12.6666H13.9999" stroke="black" stroke-opacity="0.75" stroke-width="1.25" stroke-linecap="round" stroke-linejoin="round"/>
</svg>


@@ -475,8 +475,8 @@
"ctrl-/": ["editor::ToggleComments", { "advance_downwards": false }],
"ctrl-u": "editor::UndoSelection",
"ctrl-shift-u": "editor::RedoSelection",
-"f8": "editor::GoToDiagnostic",
-"shift-f8": "editor::GoToPreviousDiagnostic",
+"f8": ["editor::GoToDiagnostic", { "severity": { "min": "hint", "max": "error" } }],
+"shift-f8": ["editor::GoToPreviousDiagnostic", { "severity": { "min": "hint", "max": "error" } }],
"f2": "editor::Rename",
"f12": "editor::GoToDefinition",
"alt-f12": "editor::GoToDefinitionSplit",
@@ -856,6 +856,7 @@
"alt-shift-y": "git::UnstageFile",
"ctrl-alt-y": "git::ToggleStaged",
"space": "git::ToggleStaged",
"shift-space": "git::StageRange",
"tab": "git_panel::FocusEditor",
"shift-tab": "git_panel::FocusEditor",
"escape": "git_panel::ToggleFocus",
@@ -998,6 +999,7 @@
{
"context": "FileFinder || (FileFinder > Picker > Editor)",
"bindings": {
"ctrl-p": "file_finder::Toggle",
"ctrl-shift-a": "file_finder::ToggleSplitMenu",
"ctrl-shift-i": "file_finder::ToggleFilterMenu"
}


@@ -528,8 +528,8 @@
"cmd-/": ["editor::ToggleComments", { "advance_downwards": false }],
"cmd-u": "editor::UndoSelection",
"cmd-shift-u": "editor::RedoSelection",
-"f8": "editor::GoToDiagnostic",
-"shift-f8": "editor::GoToPreviousDiagnostic",
+"f8": ["editor::GoToDiagnostic", { "severity": { "min": "hint", "max": "error" } }],
+"shift-f8": ["editor::GoToPreviousDiagnostic", { "severity": { "min": "hint", "max": "error" } }],
"f2": "editor::Rename",
"f12": "editor::GoToDefinition",
"alt-f12": "editor::GoToDefinitionSplit",
@@ -930,6 +930,7 @@
"enter": "menu::Confirm",
"cmd-alt-y": "git::ToggleStaged",
"space": "git::ToggleStaged",
"shift-space": "git::StageRange",
"cmd-y": "git::StageFile",
"cmd-shift-y": "git::UnstageFile",
"alt-down": "git_panel::FocusEditor",
@@ -1097,6 +1098,7 @@
"ctrl-cmd-space": "terminal::ShowCharacterPalette",
"cmd-c": "terminal::Copy",
"cmd-v": "terminal::Paste",
"cmd-f": "buffer_search::Deploy",
"cmd-a": "editor::SelectAll",
"cmd-k": "terminal::Clear",
"cmd-n": "workspace::NewTerminal",


@@ -466,7 +466,7 @@
}
},
{
-"context": "vim_mode == insert && showing_signature_help && !showing_completions",
+"context": "(vim_mode == insert || vim_mode == normal) && showing_signature_help && !showing_completions",
"bindings": {
"ctrl-p": "editor::SignatureHelpPrevious",
"ctrl-n": "editor::SignatureHelpNext"
@@ -841,6 +841,7 @@
"i": "git_panel::FocusEditor",
"x": "git::ToggleStaged",
"shift-x": "git::StageAll",
"g x": "git::StageRange",
"shift-u": "git::UnstageAll"
}
},


@@ -1135,6 +1135,7 @@
"**/.svn",
"**/.hg",
"**/.jj",
"**/.repo",
"**/CVS",
"**/.DS_Store",
"**/Thumbs.db",


@@ -875,7 +875,7 @@ impl AcpThread {
&self,
) -> impl use<> + Future<Output = Result<acp::InitializeResponse, acp::Error>> {
let connection = self.connection.clone();
-async move { connection.request(acp::InitializeParams).await }
+async move { connection.initialize().await }
}
pub fn authenticate(&self) -> impl use<> + Future<Output = Result<(), acp::Error>> {
@@ -1839,8 +1839,12 @@ mod tests {
}
impl acp::Agent for FakeAgent {
-async fn initialize(&self) -> Result<acp::InitializeResponse, acp::Error> {
+async fn initialize(
+&self,
+params: acp::InitializeParams,
+) -> Result<acp::InitializeResponse, acp::Error> {
Ok(acp::InitializeResponse {
+protocol_version: params.protocol_version,
is_authenticated: true,
})
}


@@ -448,7 +448,7 @@ impl ActivityIndicator {
.into_any_element(),
),
message: format!("Debug: {}", session.read(cx).adapter()),
-tooltip_message: Some(session.read(cx).label().to_string()),
+tooltip_message: session.read(cx).label().map(|label| label.to_string()),
on_click: None,
});
}


@@ -660,7 +660,6 @@ impl InlineAssistant {
height: Some(prompt_editor_height),
render: build_assist_editor_renderer(prompt_editor),
priority: 0,
-render_in_minimap: false,
},
BlockProperties {
style: BlockStyle::Sticky,
@@ -675,7 +674,6 @@ impl InlineAssistant {
.into_any_element()
}),
priority: 0,
-render_in_minimap: false,
},
];
@@ -1451,7 +1449,6 @@ impl InlineAssistant {
.into_any_element()
}),
priority: 0,
-render_in_minimap: false,
});
}


@@ -1256,7 +1256,6 @@ impl TextThreadEditor {
),
priority: usize::MAX,
render: render_block(MessageMetadata::from(message)),
-render_in_minimap: false,
};
let mut new_blocks = vec![];
let mut block_index_to_message = vec![];
@@ -1858,7 +1857,6 @@ impl TextThreadEditor {
.into_any_element()
}),
priority: 0,
-render_in_minimap: false,
})
})
.collect::<Vec<_>>();


@@ -19,6 +19,5 @@ net.workspace = true
parking_lot.workspace = true
smol.workspace = true
tempfile.workspace = true
-unindent.workspace = true
util.workspace = true
workspace-hack.workspace = true


@@ -40,21 +40,11 @@ impl AskPassDelegate {
self.tx.send((prompt, tx)).await?;
Ok(rx.await?)
}
-pub fn new_always_failing() -> Self {
-let (tx, _rx) = mpsc::unbounded::<(String, oneshot::Sender<String>)>();
-Self {
-tx,
-_task: Task::ready(()),
-}
-}
}
pub struct AskPassSession {
#[cfg(not(target_os = "windows"))]
script_path: std::path::PathBuf,
-#[cfg(not(target_os = "windows"))]
-gpg_script_path: std::path::PathBuf,
#[cfg(target_os = "windows")]
askpass_helper: String,
#[cfg(target_os = "windows")]
@@ -69,9 +59,6 @@ const ASKPASS_SCRIPT_NAME: &str = "askpass.sh";
#[cfg(target_os = "windows")]
const ASKPASS_SCRIPT_NAME: &str = "askpass.ps1";
-#[cfg(not(target_os = "windows"))]
-const GPG_SCRIPT_NAME: &str = "gpg.sh";
impl AskPassSession {
/// This will create a new AskPassSession.
/// You must retain this session until the master process exits.
@@ -85,8 +72,6 @@ impl AskPassSession {
let temp_dir = tempfile::Builder::new().prefix("zed-askpass").tempdir()?;
let askpass_socket = temp_dir.path().join("askpass.sock");
let askpass_script_path = temp_dir.path().join(ASKPASS_SCRIPT_NAME);
-#[cfg(not(target_os = "windows"))]
-let gpg_script_path = temp_dir.path().join(GPG_SCRIPT_NAME);
let (askpass_opened_tx, askpass_opened_rx) = oneshot::channel::<()>();
let listener = UnixListener::bind(&askpass_socket).context("creating askpass socket")?;
#[cfg(not(target_os = "windows"))]
@@ -150,20 +135,9 @@ impl AskPassSession {
askpass_script_path.display()
);
-#[cfg(not(target_os = "windows"))]
-{
-let gpg_script = generate_gpg_script();
-fs::write(&gpg_script_path, gpg_script)
-.await
-.with_context(|| format!("creating gpg wrapper script at {gpg_script_path:?}"))?;
-make_file_executable(&gpg_script_path).await?;
-}
Ok(Self {
#[cfg(not(target_os = "windows"))]
script_path: askpass_script_path,
-#[cfg(not(target_os = "windows"))]
-gpg_script_path,
#[cfg(target_os = "windows")]
secret,
@@ -186,19 +160,6 @@ impl AskPassSession {
&self.askpass_helper
}
-#[cfg(not(target_os = "windows"))]
-pub fn gpg_script_path(&self) -> Option<impl AsRef<OsStr>> {
-Some(&self.gpg_script_path)
-}
-#[cfg(target_os = "windows")]
-pub fn gpg_script_path(&self) -> Option<impl AsRef<OsStr>> {
-// TODO implement wrapping GPG on Windows. This is more difficult than on Unix
-// because we can't use --passphrase-fd with a nonstandard FD, and both --passphrase
-// and --passphrase-file are insecure.
-None::<std::path::PathBuf>
-}
// This will run the askpass task forever, resolving as many authentication requests as needed.
// The caller is responsible for examining the result of their own commands and cancelling this
// future when this is no longer needed. Note that this can only be called once, but due to the
@@ -302,23 +263,3 @@ fn generate_askpass_script(zed_path: &std::path::Path, askpass_socket: &std::pat
askpass_socket = askpass_socket.display(),
)
}
-#[inline]
-#[cfg(not(target_os = "windows"))]
-fn generate_gpg_script() -> String {
-use unindent::Unindent as _;
-r#"
-#!/bin/sh
-set -eu
-unset GIT_CONFIG_PARAMETERS
-GPG_PROGRAM=$(git config gpg.program || echo 'gpg')
-PROMPT="Enter passphrase to unlock GPG key:"
-PASSPHRASE=$(${GIT_ASKPASS} "${PROMPT}")
-exec "${GPG_PROGRAM}" --batch --no-tty --yes --passphrase-fd 3 --pinentry-mode loopback "$@" 3<<EOF
-${PASSPHRASE}
-EOF
-"#.unindent()
-}


@@ -365,17 +365,23 @@ fn eval_disable_cursor_blinking() {
// Model | Pass rate
// ============================================
//
-// claude-3.7-sonnet | 0.99 (2025-06-14)
-// claude-sonnet-4 | 0.85 (2025-06-14)
-// gemini-2.5-pro-preview-latest | 0.97 (2025-06-16)
-// gemini-2.5-flash-preview-04-17 |
-// gpt-4.1 |
+// claude-3.7-sonnet | 0.59 (2025-07-14)
+// claude-sonnet-4 | 0.81 (2025-07-14)
+// gemini-2.5-pro | 0.95 (2025-07-14)
+// gemini-2.5-flash-preview-04-17 | 0.78 (2025-07-14)
+// gpt-4.1 | 0.00 (2025-07-14) (follows edit_description too literally)
let input_file_path = "root/editor.rs";
let input_file_content = include_str!("evals/fixtures/disable_cursor_blinking/before.rs");
let edit_description = "Comment out the call to `BlinkManager::enable`";
let possible_diffs = vec![
include_str!("evals/fixtures/disable_cursor_blinking/possible-01.diff"),
include_str!("evals/fixtures/disable_cursor_blinking/possible-02.diff"),
include_str!("evals/fixtures/disable_cursor_blinking/possible-03.diff"),
include_str!("evals/fixtures/disable_cursor_blinking/possible-04.diff"),
];
eval(
100,
0.95,
0.51,
0.05,
EvalInput::from_conversation(
vec![
@@ -433,11 +439,7 @@ fn eval_disable_cursor_blinking() {
),
],
Some(input_file_content.into()),
-EvalAssertion::judge_diff(indoc! {"
-- Calls to BlinkManager in `observe_window_activation` were commented out
-- The call to `blink_manager.enable` above the call to show_cursor_names was commented out
-- All the edits have valid indentation
-"}),
+EvalAssertion::assert_diff_any(possible_diffs),
),
);
}


@@ -0,0 +1,28 @@
--- before.rs 2025-07-07 11:37:48.434629001 +0300
+++ expected.rs 2025-07-14 10:33:53.346906775 +0300
@@ -1780,11 +1780,11 @@
cx.observe_window_activation(window, |editor, window, cx| {
let active = window.is_window_active();
editor.blink_manager.update(cx, |blink_manager, cx| {
- if active {
- blink_manager.enable(cx);
- } else {
- blink_manager.disable(cx);
- }
+ // if active {
+ // blink_manager.enable(cx);
+ // } else {
+ // blink_manager.disable(cx);
+ // }
});
}),
],
@@ -18463,7 +18463,7 @@
}
self.blink_manager.update(cx, |blink_manager, cx| {
- blink_manager.enable(cx);
+ // blink_manager.enable(cx);
});
self.show_cursor_names(window, cx);
self.buffer.update(cx, |buffer, cx| {


@@ -0,0 +1,29 @@
@@ -1778,13 +1778,13 @@
cx.observe_global_in::<SettingsStore>(window, Self::settings_changed),
observe_buffer_font_size_adjustment(cx, |_, cx| cx.notify()),
cx.observe_window_activation(window, |editor, window, cx| {
- let active = window.is_window_active();
+ // let active = window.is_window_active();
editor.blink_manager.update(cx, |blink_manager, cx| {
- if active {
- blink_manager.enable(cx);
- } else {
- blink_manager.disable(cx);
- }
+ // if active {
+ // blink_manager.enable(cx);
+ // } else {
+ // blink_manager.disable(cx);
+ // }
});
}),
],
@@ -18463,7 +18463,7 @@
}
self.blink_manager.update(cx, |blink_manager, cx| {
- blink_manager.enable(cx);
+ // blink_manager.enable(cx);
});
self.show_cursor_names(window, cx);
self.buffer.update(cx, |buffer, cx| {


@@ -0,0 +1,34 @@
@@ -1774,17 +1774,17 @@
cx.observe(&buffer, Self::on_buffer_changed),
cx.subscribe_in(&buffer, window, Self::on_buffer_event),
cx.observe_in(&display_map, window, Self::on_display_map_changed),
- cx.observe(&blink_manager, |_, _, cx| cx.notify()),
+ // cx.observe(&blink_manager, |_, _, cx| cx.notify()),
cx.observe_global_in::<SettingsStore>(window, Self::settings_changed),
observe_buffer_font_size_adjustment(cx, |_, cx| cx.notify()),
cx.observe_window_activation(window, |editor, window, cx| {
- let active = window.is_window_active();
+ // let active = window.is_window_active();
editor.blink_manager.update(cx, |blink_manager, cx| {
- if active {
- blink_manager.enable(cx);
- } else {
- blink_manager.disable(cx);
- }
+ // if active {
+ // blink_manager.enable(cx);
+ // } else {
+ // blink_manager.disable(cx);
+ // }
});
}),
],
@@ -18463,7 +18463,7 @@
}
self.blink_manager.update(cx, |blink_manager, cx| {
- blink_manager.enable(cx);
+ // blink_manager.enable(cx);
});
self.show_cursor_names(window, cx);
self.buffer.update(cx, |buffer, cx| {


@@ -0,0 +1,33 @@
@@ -1774,17 +1774,17 @@
cx.observe(&buffer, Self::on_buffer_changed),
cx.subscribe_in(&buffer, window, Self::on_buffer_event),
cx.observe_in(&display_map, window, Self::on_display_map_changed),
- cx.observe(&blink_manager, |_, _, cx| cx.notify()),
+ // cx.observe(&blink_manager, |_, _, cx| cx.notify()),
cx.observe_global_in::<SettingsStore>(window, Self::settings_changed),
observe_buffer_font_size_adjustment(cx, |_, cx| cx.notify()),
cx.observe_window_activation(window, |editor, window, cx| {
let active = window.is_window_active();
editor.blink_manager.update(cx, |blink_manager, cx| {
- if active {
- blink_manager.enable(cx);
- } else {
- blink_manager.disable(cx);
- }
+ // if active {
+ // blink_manager.enable(cx);
+ // } else {
+ // blink_manager.disable(cx);
+ // }
});
}),
],
@@ -18463,7 +18463,7 @@
}
self.blink_manager.update(cx, |blink_manager, cx| {
- blink_manager.enable(cx);
+ // blink_manager.enable(cx);
});
self.show_cursor_names(window, cx);
self.buffer.update(cx, |buffer, cx| {


@@ -69,10 +69,9 @@ impl FetchTool {
.to_str()
.context("invalid Content-Type header")?;
let content_type = match content_type {
-"text/html" => ContentType::Html,
-"text/plain" => ContentType::Plaintext,
+"text/html" | "application/xhtml+xml" => ContentType::Html,
"application/json" => ContentType::Json,
-_ => ContentType::Html,
+_ => ContentType::Plaintext,
};
match content_type {

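The dispatch change in this hunk can be sketched in isolation. The `ContentType` enum and `classify` helper below are stand-ins for illustration, not the actual items in `FetchTool`:

```rust
// Stand-in enum for illustration; the real ContentType lives in the fetch tool's crate.
#[derive(Debug, PartialEq)]
enum ContentType {
    Html,
    Plaintext,
    Json,
}

// Mirrors the new match arms: XHTML is now treated as HTML, and unknown
// content types fall back to Plaintext instead of Html.
fn classify(content_type: &str) -> ContentType {
    match content_type {
        "text/html" | "application/xhtml+xml" => ContentType::Html,
        "application/json" => ContentType::Json,
        _ => ContentType::Plaintext,
    }
}

fn main() {
    assert_eq!(classify("application/xhtml+xml"), ContentType::Html);
    assert_eq!(classify("text/csv"), ContentType::Plaintext);
}
```

The new fallback is the safer choice: treating arbitrary responses as plain text avoids running an HTML-to-text conversion on content that was never HTML.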

@@ -315,19 +315,19 @@ fn main() -> Result<()> {
});
let stdin_pipe_handle: Option<JoinHandle<anyhow::Result<()>>> =
-stdin_tmp_file.map(|tmp_file| {
+stdin_tmp_file.map(|mut tmp_file| {
thread::spawn(move || {
-let stdin = std::io::stdin().lock();
-if io::IsTerminal::is_terminal(&stdin) {
-return Ok(());
+let mut stdin = std::io::stdin().lock();
+if !io::IsTerminal::is_terminal(&stdin) {
+io::copy(&mut stdin, &mut tmp_file)?;
}
-return pipe_to_tmp(stdin, tmp_file);
+Ok(())
})
});
-let anonymous_fd_pipe_handles: Vec<JoinHandle<anyhow::Result<()>>> = anonymous_fd_tmp_files
+let anonymous_fd_pipe_handles: Vec<_> = anonymous_fd_tmp_files
.into_iter()
-.map(|(file, tmp_file)| thread::spawn(move || pipe_to_tmp(file, tmp_file)))
+.map(|(mut file, mut tmp_file)| thread::spawn(move || io::copy(&mut file, &mut tmp_file)))
.collect();
if args.foreground {
@@ -349,22 +349,6 @@ fn main() -> Result<()> {
Ok(())
}
-fn pipe_to_tmp(mut src: impl io::Read, mut dest: fs::File) -> Result<()> {
-let mut buffer = [0; 8 * 1024];
-loop {
-let bytes_read = match src.read(&mut buffer) {
-Err(err) if err.kind() == io::ErrorKind::Interrupted => continue,
-res => res?,
-};
-if bytes_read == 0 {
-break;
-}
-io::Write::write_all(&mut dest, &buffer[..bytes_read])?;
-}
-io::Write::flush(&mut dest)?;
-Ok(())
-}
fn anonymous_fd(path: &str) -> Option<fs::File> {
#[cfg(target_os = "linux")]
{


@@ -94,6 +94,7 @@ context_server.workspace = true
ctor.workspace = true
dap = { workspace = true, features = ["test-support"] }
dap_adapters = { workspace = true, features = ["test-support"] }
dap-types.workspace = true
debugger_ui = { workspace = true, features = ["test-support"] }
editor = { workspace = true, features = ["test-support"] }
extension.workspace = true


@@ -5,7 +5,7 @@ use axum::{
routing::{get, post},
};
use chrono::{DateTime, SecondsFormat, Utc};
-use collections::HashSet;
+use collections::{HashMap, HashSet};
use reqwest::StatusCode;
use sea_orm::ActiveValue;
use serde::{Deserialize, Serialize};
@@ -21,12 +21,13 @@ use stripe::{
PaymentMethod, Subscription, SubscriptionId, SubscriptionStatus,
};
use util::{ResultExt, maybe};
use zed_llm_client::LanguageModelProvider;
use crate::api::events::SnowflakeRow;
use crate::db::billing_subscription::{
StripeCancellationReason, StripeSubscriptionStatus, SubscriptionKind,
};
-use crate::llm::db::subscription_usage_meter::CompletionMode;
+use crate::llm::db::subscription_usage_meter::{self, CompletionMode};
use crate::llm::{AGENT_EXTENDED_TRIAL_FEATURE_FLAG, DEFAULT_MAX_MONTHLY_SPEND};
use crate::rpc::{ResultExt as _, Server};
use crate::stripe_client::{
@@ -1416,18 +1417,21 @@ async fn sync_model_request_usage_with_stripe(
let usage_meters = llm_db
.get_current_subscription_usage_meters(Utc::now())
.await?;
let usage_meters = usage_meters
.into_iter()
.filter(|(_, usage)| !staff_user_ids.contains(&usage.user_id))
.collect::<Vec<_>>();
let user_ids = usage_meters
.iter()
.map(|(_, usage)| usage.user_id)
.collect::<HashSet<UserId>>();
let billing_subscriptions = app
.db
.get_active_zed_pro_billing_subscriptions(user_ids)
.await?;
let mut usage_meters_by_user_id =
HashMap::<UserId, Vec<subscription_usage_meter::Model>>::default();
for (usage_meter, usage) in usage_meters {
let meters = usage_meters_by_user_id.entry(usage.user_id).or_default();
meters.push(usage_meter);
}
log::info!("Stripe usage sync: Retrieving Zed Pro subscriptions");
let get_zed_pro_subscriptions_started_at = Utc::now();
let billing_subscriptions = app.db.get_active_zed_pro_billing_subscriptions().await?;
log::info!(
"Stripe usage sync: Retrieved {} Zed Pro subscriptions in {}",
billing_subscriptions.len(),
Utc::now() - get_zed_pro_subscriptions_started_at
);
let claude_sonnet_4 = stripe_billing
.find_price_by_lookup_key("claude-sonnet-4-requests")
@@ -1451,59 +1455,90 @@ async fn sync_model_request_usage_with_stripe(
.find_price_by_lookup_key("claude-3-7-sonnet-requests-max")
.await?;
let usage_meter_count = usage_meters.len();
let model_mode_combinations = [
("claude-opus-4", CompletionMode::Max),
("claude-opus-4", CompletionMode::Normal),
("claude-sonnet-4", CompletionMode::Max),
("claude-sonnet-4", CompletionMode::Normal),
("claude-3-7-sonnet", CompletionMode::Max),
("claude-3-7-sonnet", CompletionMode::Normal),
("claude-3-5-sonnet", CompletionMode::Normal),
];
log::info!("Stripe usage sync: Syncing {usage_meter_count} usage meters");
let billing_subscription_count = billing_subscriptions.len();
for (usage_meter, usage) in usage_meters {
log::info!("Stripe usage sync: Syncing {billing_subscription_count} Zed Pro subscriptions");
for (user_id, (billing_customer, billing_subscription)) in billing_subscriptions {
maybe!(async {
let Some((billing_customer, billing_subscription)) =
billing_subscriptions.get(&usage.user_id)
else {
bail!(
"Attempted to sync usage meter for user who is not a Stripe customer: {}",
usage.user_id
);
};
if staff_user_ids.contains(&user_id) {
return anyhow::Ok(());
}
let stripe_customer_id =
StripeCustomerId(billing_customer.stripe_customer_id.clone().into());
let stripe_subscription_id =
StripeSubscriptionId(billing_subscription.stripe_subscription_id.clone().into());
let model = llm_db.model_by_id(usage_meter.model_id)?;
let usage_meters = usage_meters_by_user_id.get(&user_id);
let (price, meter_event_name) = match model.name.as_str() {
"claude-opus-4" => match usage_meter.mode {
CompletionMode::Normal => (&claude_opus_4, "claude_opus_4/requests"),
CompletionMode::Max => (&claude_opus_4_max, "claude_opus_4/requests/max"),
},
"claude-sonnet-4" => match usage_meter.mode {
CompletionMode::Normal => (&claude_sonnet_4, "claude_sonnet_4/requests"),
CompletionMode::Max => (&claude_sonnet_4_max, "claude_sonnet_4/requests/max"),
},
"claude-3-5-sonnet" => (&claude_3_5_sonnet, "claude_3_5_sonnet/requests"),
"claude-3-7-sonnet" => match usage_meter.mode {
CompletionMode::Normal => (&claude_3_7_sonnet, "claude_3_7_sonnet/requests"),
CompletionMode::Max => {
(&claude_3_7_sonnet_max, "claude_3_7_sonnet/requests/max")
for (model, mode) in &model_mode_combinations {
let Ok(model) =
llm_db.model(LanguageModelProvider::Anthropic, model)
else {
log::warn!("Failed to load model for user {user_id}: {model}");
continue;
};
let (price, meter_event_name) = match model.name.as_str() {
"claude-opus-4" => match mode {
CompletionMode::Normal => (&claude_opus_4, "claude_opus_4/requests"),
CompletionMode::Max => (&claude_opus_4_max, "claude_opus_4/requests/max"),
},
"claude-sonnet-4" => match mode {
CompletionMode::Normal => (&claude_sonnet_4, "claude_sonnet_4/requests"),
CompletionMode::Max => {
(&claude_sonnet_4_max, "claude_sonnet_4/requests/max")
}
},
"claude-3-5-sonnet" => (&claude_3_5_sonnet, "claude_3_5_sonnet/requests"),
"claude-3-7-sonnet" => match mode {
CompletionMode::Normal => {
(&claude_3_7_sonnet, "claude_3_7_sonnet/requests")
}
CompletionMode::Max => {
(&claude_3_7_sonnet_max, "claude_3_7_sonnet/requests/max")
}
},
model_name => {
bail!("Attempted to sync usage meter for unsupported model: {model_name:?}")
}
},
model_name => {
bail!("Attempted to sync usage meter for unsupported model: {model_name:?}")
}
};
};
stripe_billing
.subscribe_to_price(&stripe_subscription_id, price)
.await?;
stripe_billing
.bill_model_request_usage(
&stripe_customer_id,
meter_event_name,
usage_meter.requests,
)
.await?;
let model_requests = usage_meters
.and_then(|usage_meters| {
usage_meters
.iter()
.find(|meter| meter.model_id == model.id && meter.mode == *mode)
})
.map(|usage_meter| usage_meter.requests)
.unwrap_or(0);
if model_requests > 0 {
stripe_billing
.subscribe_to_price(&stripe_subscription_id, price)
.await?;
}
stripe_billing
.bill_model_request_usage(&stripe_customer_id, meter_event_name, model_requests)
.await
.with_context(|| {
format!(
"Failed to bill model request usage of {model_requests} for {stripe_customer_id}: {meter_event_name}",
)
})?;
}
Ok(())
})
@@ -1512,7 +1547,7 @@ async fn sync_model_request_usage_with_stripe(
}
log::info!(
"Stripe usage sync: Synced {usage_meter_count} usage meters in {:?}",
"Stripe usage sync: Synced {billing_subscription_count} Zed Pro subscriptions in {}",
Utc::now() - started_at
);

View File

@@ -199,6 +199,33 @@ impl Database {
pub async fn get_active_zed_pro_billing_subscriptions(
&self,
) -> Result<HashMap<UserId, (billing_customer::Model, billing_subscription::Model)>> {
self.transaction(|tx| async move {
let mut rows = billing_subscription::Entity::find()
.inner_join(billing_customer::Entity)
.select_also(billing_customer::Entity)
.filter(
billing_subscription::Column::StripeSubscriptionStatus
.eq(StripeSubscriptionStatus::Active),
)
.filter(billing_subscription::Column::Kind.eq(SubscriptionKind::ZedPro))
.order_by_asc(billing_subscription::Column::Id)
.stream(&*tx)
.await?;
let mut subscriptions = HashMap::default();
while let Some(row) = rows.next().await {
if let (subscription, Some(customer)) = row? {
subscriptions.insert(customer.user_id, (customer, subscription));
}
}
Ok(subscriptions)
})
.await
}
pub async fn get_active_zed_pro_billing_subscriptions_for_users(
&self,
user_ids: HashSet<UserId>,
) -> Result<HashMap<UserId, (billing_customer::Model, billing_subscription::Model)>> {
self.transaction(|tx| {

View File

@@ -1013,7 +1013,7 @@ async fn test_peers_following_each_other(cx_a: &mut TestAppContext, cx_b: &mut T
// and some of which were originally opened by client B.
workspace_b.update_in(cx_b, |workspace, window, cx| {
workspace.active_pane().update(cx, |pane, cx| {
pane.close_inactive_items(&Default::default(), window, cx)
pane.close_inactive_items(&Default::default(), None, window, cx)
.detach();
});
});

View File

@@ -2,6 +2,7 @@ use crate::tests::TestServer;
use call::ActiveCall;
use collections::{HashMap, HashSet};
use dap::{Capabilities, adapters::DebugTaskDefinition, transport::RequestHandling};
use debugger_ui::debugger_panel::DebugPanel;
use extension::ExtensionHostProxy;
use fs::{FakeFs, Fs as _, RemoveOptions};
@@ -22,6 +23,7 @@ use language::{
use node_runtime::NodeRuntime;
use project::{
ProjectPath,
debugger::session::ThreadId,
lsp_store::{FormatTrigger, LspFormatTarget},
};
use remote::SshRemoteClient;
@@ -29,7 +31,11 @@ use remote_server::{HeadlessAppState, HeadlessProject};
use rpc::proto;
use serde_json::json;
use settings::SettingsStore;
use std::{path::Path, sync::Arc};
use std::{
path::Path,
sync::{Arc, atomic::AtomicUsize},
};
use task::TcpArgumentsTemplate;
use util::path;
#[gpui::test(iterations = 10)]
@@ -688,3 +694,162 @@ async fn test_remote_server_debugger(
shutdown_session.await.unwrap();
}
#[gpui::test]
async fn test_slow_adapter_startup_retries(
cx_a: &mut TestAppContext,
server_cx: &mut TestAppContext,
executor: BackgroundExecutor,
) {
cx_a.update(|cx| {
release_channel::init(SemanticVersion::default(), cx);
command_palette_hooks::init(cx);
zlog::init_test();
dap_adapters::init(cx);
});
server_cx.update(|cx| {
release_channel::init(SemanticVersion::default(), cx);
dap_adapters::init(cx);
});
let (opts, server_ssh) = SshRemoteClient::fake_server(cx_a, server_cx);
let remote_fs = FakeFs::new(server_cx.executor());
remote_fs
.insert_tree(
path!("/code"),
json!({
"lib.rs": "fn one() -> usize { 1 }"
}),
)
.await;
// User A connects to the remote project via SSH.
server_cx.update(HeadlessProject::init);
let remote_http_client = Arc::new(BlockedHttpClient);
let node = NodeRuntime::unavailable();
let languages = Arc::new(LanguageRegistry::new(server_cx.executor()));
let _headless_project = server_cx.new(|cx| {
client::init_settings(cx);
HeadlessProject::new(
HeadlessAppState {
session: server_ssh,
fs: remote_fs.clone(),
http_client: remote_http_client,
node_runtime: node,
languages,
extension_host_proxy: Arc::new(ExtensionHostProxy::new()),
},
cx,
)
});
let client_ssh = SshRemoteClient::fake_client(opts, cx_a).await;
let mut server = TestServer::start(server_cx.executor()).await;
let client_a = server.create_client(cx_a, "user_a").await;
cx_a.update(|cx| {
debugger_ui::init(cx);
command_palette_hooks::init(cx);
});
let (project_a, _) = client_a
.build_ssh_project(path!("/code"), client_ssh.clone(), cx_a)
.await;
let (workspace, cx_a) = client_a.build_workspace(&project_a, cx_a);
let debugger_panel = workspace
.update_in(cx_a, |_workspace, window, cx| {
cx.spawn_in(window, DebugPanel::load)
})
.await
.unwrap();
workspace.update_in(cx_a, |workspace, window, cx| {
workspace.add_panel(debugger_panel, window, cx);
});
cx_a.run_until_parked();
let debug_panel = workspace
.update(cx_a, |workspace, cx| workspace.panel::<DebugPanel>(cx))
.unwrap();
let workspace_window = cx_a
.window_handle()
.downcast::<workspace::Workspace>()
.unwrap();
let count = Arc::new(AtomicUsize::new(0));
let session = debugger_ui::tests::start_debug_session_with(
&workspace_window,
cx_a,
DebugTaskDefinition {
adapter: "fake-adapter".into(),
label: "test".into(),
config: json!({
"request": "launch"
}),
tcp_connection: Some(TcpArgumentsTemplate {
port: None,
host: None,
timeout: None,
}),
},
move |client| {
let count = count.clone();
client.on_request_ext::<dap::requests::Initialize, _>(move |_seq, _request| {
if count.fetch_add(1, std::sync::atomic::Ordering::SeqCst) < 5 {
return RequestHandling::Exit;
}
RequestHandling::Respond(Ok(Capabilities::default()))
});
},
)
.unwrap();
cx_a.run_until_parked();
let client = session.update(cx_a, |session, _| session.adapter_client().unwrap());
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
reason: dap::StoppedEventReason::Pause,
description: None,
thread_id: Some(1),
preserve_focus_hint: None,
text: None,
all_threads_stopped: None,
hit_breakpoint_ids: None,
}))
.await;
cx_a.run_until_parked();
let active_session = debug_panel
.update(cx_a, |this, _| this.active_session())
.unwrap();
let running_state = active_session.update(cx_a, |active_session, _| {
active_session.running_state().clone()
});
assert_eq!(
client.id(),
running_state.read_with(cx_a, |running_state, _| running_state.session_id())
);
assert_eq!(
ThreadId(1),
running_state.read_with(cx_a, |running_state, _| running_state
.selected_thread_id()
.unwrap())
);
let shutdown_session = workspace.update(cx_a, |workspace, cx| {
workspace.project().update(cx, |project, cx| {
project.dap_store().update(cx, |dap_store, cx| {
dap_store.shutdown_session(session.read(cx).session_id(), cx)
})
})
});
client_ssh.update(cx_a, |a, _| {
a.shutdown_processes(Some(proto::ShutdownRemoteServer {}), executor)
});
shutdown_session.await.unwrap();
}

View File

@@ -378,6 +378,14 @@ pub trait DebugAdapter: 'static + Send + Sync {
fn label_for_child_session(&self, _args: &StartDebuggingRequestArguments) -> Option<String> {
None
}
fn compact_child_session(&self) -> bool {
false
}
fn prefer_thread_name(&self) -> bool {
false
}
}
#[cfg(any(test, feature = "test-support"))]
@@ -442,10 +450,18 @@ impl DebugAdapter for FakeAdapter {
_: Option<Vec<String>>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
let connection = task_definition
.tcp_connection
.as_ref()
.map(|connection| TcpArguments {
host: connection.host(),
port: connection.port.unwrap_or(17),
timeout: connection.timeout,
});
Ok(DebugAdapterBinary {
command: Some("command".into()),
arguments: vec![],
connection: None,
connection,
envs: HashMap::default(),
cwd: None,
request_args: StartDebuggingRequestArguments {

View File

@@ -2,7 +2,7 @@ use crate::{
adapters::DebugAdapterBinary,
transport::{IoKind, LogKind, TransportDelegate},
};
use anyhow::{Context as _, Result};
use anyhow::Result;
use dap_types::{
messages::{Message, Response},
requests::Request,
@@ -110,9 +110,7 @@ impl DebugAdapterClient {
self.transport_delegate
.pending_requests
.lock()
.as_mut()
.context("client is closed")?
.insert(sequence_id, callback_tx);
.insert(sequence_id, callback_tx)?;
log::debug!(
"Client {} send `{}` request with sequence_id: {}",
@@ -170,6 +168,7 @@ impl DebugAdapterClient {
pub fn kill(&self) {
log::debug!("Killing DAP process");
self.transport_delegate.transport.lock().kill();
self.transport_delegate.pending_requests.lock().shutdown();
}
pub fn has_adapter_logs(&self) -> bool {
@@ -184,11 +183,34 @@ impl DebugAdapterClient {
}
#[cfg(any(test, feature = "test-support"))]
pub fn on_request<R: dap_types::requests::Request, F>(&self, handler: F)
pub fn on_request<R: dap_types::requests::Request, F>(&self, mut handler: F)
where
F: 'static
+ Send
+ FnMut(u64, R::Arguments) -> Result<R::Response, dap_types::ErrorResponse>,
{
use crate::transport::RequestHandling;
self.transport_delegate
.transport
.lock()
.as_fake()
.on_request::<R, _>(move |seq, request| {
RequestHandling::Respond(handler(seq, request))
});
}
#[cfg(any(test, feature = "test-support"))]
pub fn on_request_ext<R: dap_types::requests::Request, F>(&self, handler: F)
where
F: 'static
+ Send
+ FnMut(
u64,
R::Arguments,
) -> crate::transport::RequestHandling<
Result<R::Response, dap_types::ErrorResponse>,
>,
{
self.transport_delegate
.transport

View File

@@ -49,6 +49,12 @@ pub enum IoKind {
StdErr,
}
#[cfg(any(test, feature = "test-support"))]
pub enum RequestHandling<T> {
Respond(T),
Exit,
}
type LogHandlers = Arc<Mutex<SmallVec<[(LogKind, IoHandler); 2]>>>;
pub trait Transport: Send + Sync {
@@ -76,7 +82,11 @@ async fn start(
) -> Result<Box<dyn Transport>> {
#[cfg(any(test, feature = "test-support"))]
if cfg!(any(test, feature = "test-support")) {
return Ok(Box::new(FakeTransport::start(cx).await?));
if let Some(connection) = binary.connection.clone() {
return Ok(Box::new(FakeTransport::start_tcp(connection, cx).await?));
} else {
return Ok(Box::new(FakeTransport::start_stdio(cx).await?));
}
}
if binary.connection.is_some() {
@@ -90,11 +100,57 @@ async fn start(
}
}
pub(crate) struct PendingRequests {
inner: Option<HashMap<u64, oneshot::Sender<Result<Response>>>>,
}
impl PendingRequests {
fn new() -> Self {
Self {
inner: Some(HashMap::default()),
}
}
fn flush(&mut self, e: anyhow::Error) {
let Some(inner) = self.inner.as_mut() else {
return;
};
for (_, sender) in inner.drain() {
sender.send(Err(e.cloned())).ok();
}
}
pub(crate) fn insert(
&mut self,
sequence_id: u64,
callback_tx: oneshot::Sender<Result<Response>>,
) -> anyhow::Result<()> {
let Some(inner) = self.inner.as_mut() else {
bail!("client is closed")
};
inner.insert(sequence_id, callback_tx);
Ok(())
}
pub(crate) fn remove(
&mut self,
sequence_id: u64,
) -> anyhow::Result<Option<oneshot::Sender<Result<Response>>>> {
let Some(inner) = self.inner.as_mut() else {
bail!("client is closed");
};
Ok(inner.remove(&sequence_id))
}
pub(crate) fn shutdown(&mut self) {
self.flush(anyhow!("transport shutdown"));
self.inner = None;
}
}
pub(crate) struct TransportDelegate {
log_handlers: LogHandlers,
// TODO this should really be some kind of associative channel
pub(crate) pending_requests:
Arc<Mutex<Option<HashMap<u64, oneshot::Sender<Result<Response>>>>>>,
pub(crate) pending_requests: Arc<Mutex<PendingRequests>>,
pub(crate) transport: Mutex<Box<dyn Transport>>,
pub(crate) server_tx: smol::lock::Mutex<Option<Sender<Message>>>,
tasks: Mutex<Vec<Task<()>>>,
@@ -108,7 +164,7 @@ impl TransportDelegate {
transport: Mutex::new(transport),
log_handlers,
server_tx: Default::default(),
pending_requests: Arc::new(Mutex::new(Some(HashMap::default()))),
pending_requests: Arc::new(Mutex::new(PendingRequests::new())),
tasks: Default::default(),
})
}
@@ -151,24 +207,10 @@ impl TransportDelegate {
Ok(()) => {
pending_requests
.lock()
.take()
.into_iter()
.flatten()
.for_each(|(_, request)| {
request
.send(Err(anyhow!("debugger shutdown unexpectedly")))
.ok();
});
.flush(anyhow!("debugger shutdown unexpectedly"));
}
Err(e) => {
pending_requests
.lock()
.take()
.into_iter()
.flatten()
.for_each(|(_, request)| {
request.send(Err(e.cloned())).ok();
});
pending_requests.lock().flush(e);
}
}
}));
@@ -286,7 +328,7 @@ impl TransportDelegate {
async fn recv_from_server<Stdout>(
server_stdout: Stdout,
mut message_handler: DapMessageHandler,
pending_requests: Arc<Mutex<Option<HashMap<u64, oneshot::Sender<Result<Response>>>>>>,
pending_requests: Arc<Mutex<PendingRequests>>,
log_handlers: Option<LogHandlers>,
) -> Result<()>
where
@@ -303,14 +345,10 @@ impl TransportDelegate {
ConnectionResult::Timeout => anyhow::bail!("Timed out when connecting to debugger"),
ConnectionResult::ConnectionReset => {
log::info!("Debugger closed the connection");
break Ok(());
return Ok(());
}
ConnectionResult::Result(Ok(Message::Response(res))) => {
let tx = pending_requests
.lock()
.as_mut()
.context("client is closed")?
.remove(&res.request_seq);
let tx = pending_requests.lock().remove(res.request_seq)?;
if let Some(tx) = tx {
if let Err(e) = tx.send(Self::process_response(res)) {
log::trace!("Did not send response `{:?}` for a cancelled", e);
@@ -704,8 +742,7 @@ impl Drop for StdioTransport {
}
#[cfg(any(test, feature = "test-support"))]
type RequestHandler =
Box<dyn Send + FnMut(u64, serde_json::Value) -> dap_types::messages::Response>;
type RequestHandler = Box<dyn Send + FnMut(u64, serde_json::Value) -> RequestHandling<Response>>;
#[cfg(any(test, feature = "test-support"))]
type ResponseHandler = Box<dyn Send + Fn(Response)>;
@@ -716,23 +753,38 @@ pub struct FakeTransport {
request_handlers: Arc<Mutex<HashMap<&'static str, RequestHandler>>>,
// for reverse request responses
response_handlers: Arc<Mutex<HashMap<&'static str, ResponseHandler>>>,
stdin_writer: Option<PipeWriter>,
stdout_reader: Option<PipeReader>,
message_handler: Option<Task<Result<()>>>,
kind: FakeTransportKind,
}
#[cfg(any(test, feature = "test-support"))]
pub enum FakeTransportKind {
Stdio {
stdin_writer: Option<PipeWriter>,
stdout_reader: Option<PipeReader>,
},
Tcp {
connection: TcpArguments,
executor: BackgroundExecutor,
},
}
#[cfg(any(test, feature = "test-support"))]
impl FakeTransport {
pub fn on_request<R: dap_types::requests::Request, F>(&self, mut handler: F)
where
F: 'static + Send + FnMut(u64, R::Arguments) -> Result<R::Response, ErrorResponse>,
F: 'static
+ Send
+ FnMut(u64, R::Arguments) -> RequestHandling<Result<R::Response, ErrorResponse>>,
{
self.request_handlers.lock().insert(
R::COMMAND,
Box::new(move |seq, args| {
let result = handler(seq, serde_json::from_value(args).unwrap());
let response = match result {
let RequestHandling::Respond(response) = result else {
return RequestHandling::Exit;
};
let response = match response {
Ok(response) => Response {
seq: seq + 1,
request_seq: seq,
@@ -750,7 +802,7 @@ impl FakeTransport {
message: None,
},
};
response
RequestHandling::Respond(response)
}),
);
}
@@ -764,86 +816,75 @@ impl FakeTransport {
.insert(R::COMMAND, Box::new(handler));
}
async fn start(cx: &mut AsyncApp) -> Result<Self> {
async fn start_tcp(connection: TcpArguments, cx: &mut AsyncApp) -> Result<Self> {
Ok(Self {
request_handlers: Arc::new(Mutex::new(HashMap::default())),
response_handlers: Arc::new(Mutex::new(HashMap::default())),
message_handler: None,
kind: FakeTransportKind::Tcp {
connection,
executor: cx.background_executor().clone(),
},
})
}
async fn handle_messages(
request_handlers: Arc<Mutex<HashMap<&'static str, RequestHandler>>>,
response_handlers: Arc<Mutex<HashMap<&'static str, ResponseHandler>>>,
stdin_reader: PipeReader,
stdout_writer: PipeWriter,
) -> Result<()> {
use dap_types::requests::{Request, RunInTerminal, StartDebugging};
use serde_json::json;
let (stdin_writer, stdin_reader) = async_pipe::pipe();
let (stdout_writer, stdout_reader) = async_pipe::pipe();
let mut this = Self {
request_handlers: Arc::new(Mutex::new(HashMap::default())),
response_handlers: Arc::new(Mutex::new(HashMap::default())),
stdin_writer: Some(stdin_writer),
stdout_reader: Some(stdout_reader),
message_handler: None,
};
let request_handlers = this.request_handlers.clone();
let response_handlers = this.response_handlers.clone();
let mut reader = BufReader::new(stdin_reader);
let stdout_writer = Arc::new(smol::lock::Mutex::new(stdout_writer));
let mut buffer = String::new();
this.message_handler = Some(cx.background_spawn(async move {
let mut reader = BufReader::new(stdin_reader);
let mut buffer = String::new();
loop {
match TransportDelegate::receive_server_message(&mut reader, &mut buffer, None)
.await
{
ConnectionResult::Timeout => {
anyhow::bail!("Timed out when connecting to debugger");
}
ConnectionResult::ConnectionReset => {
log::info!("Debugger closed the connection");
break Ok(());
}
ConnectionResult::Result(Err(e)) => break Err(e),
ConnectionResult::Result(Ok(message)) => {
match message {
Message::Request(request) => {
// redirect reverse requests to stdout writer/reader
if request.command == RunInTerminal::COMMAND
|| request.command == StartDebugging::COMMAND
{
let message =
serde_json::to_string(&Message::Request(request)).unwrap();
let mut writer = stdout_writer.lock().await;
writer
.write_all(
TransportDelegate::build_rpc_message(message)
.as_bytes(),
)
.await
.unwrap();
writer.flush().await.unwrap();
} else {
let response = if let Some(handle) =
request_handlers.lock().get_mut(request.command.as_str())
{
handle(request.seq, request.arguments.unwrap_or(json!({})))
} else {
panic!("No request handler for {}", request.command);
};
let message =
serde_json::to_string(&Message::Response(response))
.unwrap();
let mut writer = stdout_writer.lock().await;
writer
.write_all(
TransportDelegate::build_rpc_message(message)
.as_bytes(),
)
.await
.unwrap();
writer.flush().await.unwrap();
}
}
Message::Event(event) => {
loop {
match TransportDelegate::receive_server_message(&mut reader, &mut buffer, None).await {
ConnectionResult::Timeout => {
anyhow::bail!("Timed out when connecting to debugger");
}
ConnectionResult::ConnectionReset => {
log::info!("Debugger closed the connection");
break Ok(());
}
ConnectionResult::Result(Err(e)) => break Err(e),
ConnectionResult::Result(Ok(message)) => {
match message {
Message::Request(request) => {
// redirect reverse requests to stdout writer/reader
if request.command == RunInTerminal::COMMAND
|| request.command == StartDebugging::COMMAND
{
let message =
serde_json::to_string(&Message::Event(event)).unwrap();
serde_json::to_string(&Message::Request(request)).unwrap();
let mut writer = stdout_writer.lock().await;
writer
.write_all(
TransportDelegate::build_rpc_message(message).as_bytes(),
)
.await
.unwrap();
writer.flush().await.unwrap();
} else {
let response = if let Some(handle) =
request_handlers.lock().get_mut(request.command.as_str())
{
handle(request.seq, request.arguments.unwrap_or(json!({})))
} else {
panic!("No request handler for {}", request.command);
};
let response = match response {
RequestHandling::Respond(response) => response,
RequestHandling::Exit => {
break Err(anyhow!("exit in response to request"));
}
};
let message =
serde_json::to_string(&Message::Response(response)).unwrap();
let mut writer = stdout_writer.lock().await;
writer
@@ -854,20 +895,56 @@ impl FakeTransport {
.unwrap();
writer.flush().await.unwrap();
}
Message::Response(response) => {
if let Some(handle) =
response_handlers.lock().get(response.command.as_str())
{
handle(response);
} else {
log::error!("No response handler for {}", response.command);
}
}
Message::Event(event) => {
let message = serde_json::to_string(&Message::Event(event)).unwrap();
let mut writer = stdout_writer.lock().await;
writer
.write_all(TransportDelegate::build_rpc_message(message).as_bytes())
.await
.unwrap();
writer.flush().await.unwrap();
}
Message::Response(response) => {
if let Some(handle) =
response_handlers.lock().get(response.command.as_str())
{
handle(response);
} else {
log::error!("No response handler for {}", response.command);
}
}
}
}
}
}));
}
}
async fn start_stdio(cx: &mut AsyncApp) -> Result<Self> {
let (stdin_writer, stdin_reader) = async_pipe::pipe();
let (stdout_writer, stdout_reader) = async_pipe::pipe();
let kind = FakeTransportKind::Stdio {
stdin_writer: Some(stdin_writer),
stdout_reader: Some(stdout_reader),
};
let mut this = Self {
request_handlers: Arc::new(Mutex::new(HashMap::default())),
response_handlers: Arc::new(Mutex::new(HashMap::default())),
message_handler: None,
kind,
};
let request_handlers = this.request_handlers.clone();
let response_handlers = this.response_handlers.clone();
this.message_handler = Some(cx.background_spawn(Self::handle_messages(
request_handlers,
response_handlers,
stdin_reader,
stdout_writer,
)));
Ok(this)
}
@@ -876,7 +953,10 @@ impl FakeTransport {
#[cfg(any(test, feature = "test-support"))]
impl Transport for FakeTransport {
fn tcp_arguments(&self) -> Option<TcpArguments> {
None
match &self.kind {
FakeTransportKind::Stdio { .. } => None,
FakeTransportKind::Tcp { connection, .. } => Some(connection.clone()),
}
}
fn connect(
@@ -887,12 +967,33 @@ impl Transport for FakeTransport {
Box<dyn AsyncRead + Unpin + Send + 'static>,
)>,
> {
let result = util::maybe!({
Ok((
Box::new(self.stdin_writer.take().context("Cannot reconnect")?) as _,
Box::new(self.stdout_reader.take().context("Cannot reconnect")?) as _,
))
});
let result = match &mut self.kind {
FakeTransportKind::Stdio {
stdin_writer,
stdout_reader,
} => util::maybe!({
Ok((
Box::new(stdin_writer.take().context("Cannot reconnect")?) as _,
Box::new(stdout_reader.take().context("Cannot reconnect")?) as _,
))
}),
FakeTransportKind::Tcp { executor, .. } => {
let (stdin_writer, stdin_reader) = async_pipe::pipe();
let (stdout_writer, stdout_reader) = async_pipe::pipe();
let request_handlers = self.request_handlers.clone();
let response_handlers = self.response_handlers.clone();
self.message_handler = Some(executor.spawn(Self::handle_messages(
request_handlers,
response_handlers,
stdin_reader,
stdout_writer,
)));
Ok((Box::new(stdin_writer) as _, Box::new(stdout_reader) as _))
}
};
Task::ready(result)
}

View File

@@ -534,6 +534,14 @@ impl DebugAdapter for JsDebugAdapter {
.filter(|name| !name.is_empty())?;
Some(label.to_owned())
}
fn compact_child_session(&self) -> bool {
true
}
fn prefer_thread_name(&self) -> bool {
true
}
}
fn normalize_task_type(task_type: &mut Value) {

View File

@@ -399,7 +399,8 @@ impl LogStore {
state.insert(DebugAdapterState::new(
id.session_id,
adapter_name,
session_label,
session_label
.unwrap_or_else(|| format!("Session {} (child)", id.session_id.0).into()),
has_adapter_logs,
));

View File

@@ -40,12 +40,15 @@ file_icons.workspace = true
futures.workspace = true
fuzzy.workspace = true
gpui.workspace = true
hex.workspace = true
indoc.workspace = true
itertools.workspace = true
language.workspace = true
log.workspace = true
menu.workspace = true
notifications.workspace = true
parking_lot.workspace = true
parse_int.workspace = true
paths.workspace = true
picker.workspace = true
pretty_assertions.workspace = true

View File

@@ -2,6 +2,7 @@ use crate::persistence::DebuggerPaneItem;
use crate::session::DebugSession;
use crate::session::running::RunningState;
use crate::session::running::breakpoint_list::BreakpointList;
use crate::{
ClearAllBreakpoints, Continue, CopyDebugAdapterArguments, Detach, FocusBreakpointList,
FocusConsole, FocusFrames, FocusLoadedSources, FocusModules, FocusTerminal, FocusVariables,
@@ -9,6 +10,7 @@ use crate::{
ToggleExpandItem, ToggleSessionPicker, ToggleThreadPicker, persistence, spawn_task_or_modal,
};
use anyhow::{Context as _, Result, anyhow};
use collections::IndexMap;
use dap::adapters::DebugAdapterName;
use dap::debugger_settings::DebugPanelDockPosition;
use dap::{
@@ -26,7 +28,7 @@ use text::ToPoint as _;
use itertools::Itertools as _;
use language::Buffer;
use project::debugger::session::{Session, SessionStateEvent};
use project::debugger::session::{Session, SessionQuirks, SessionState, SessionStateEvent};
use project::{DebugScenarioContext, Fs, ProjectPath, TaskSourceKind, WorktreeId};
use project::{Project, debugger::session::ThreadStatus};
use rpc::proto::{self};
@@ -35,7 +37,7 @@ use std::sync::{Arc, LazyLock};
use task::{DebugScenario, TaskContext};
use tree_sitter::{Query, StreamingIterator as _};
use ui::{ContextMenu, Divider, PopoverMenuHandle, Tooltip, prelude::*};
use util::{ResultExt, maybe};
use util::{ResultExt, debug_panic, maybe};
use workspace::SplitDirection;
use workspace::item::SaveOptions;
use workspace::{
@@ -63,13 +65,14 @@ pub enum DebugPanelEvent {
pub struct DebugPanel {
size: Pixels,
sessions: Vec<Entity<DebugSession>>,
active_session: Option<Entity<DebugSession>>,
project: Entity<Project>,
workspace: WeakEntity<Workspace>,
focus_handle: FocusHandle,
context_menu: Option<(Entity<ContextMenu>, Point<Pixels>, Subscription)>,
debug_scenario_scheduled_last: bool,
pub(crate) sessions_with_children:
IndexMap<Entity<DebugSession>, Vec<WeakEntity<DebugSession>>>,
pub(crate) thread_picker_menu_handle: PopoverMenuHandle<ContextMenu>,
pub(crate) session_picker_menu_handle: PopoverMenuHandle<ContextMenu>,
fs: Arc<dyn Fs>,
@@ -100,7 +103,7 @@ impl DebugPanel {
Self {
size: px(300.),
sessions: vec![],
sessions_with_children: Default::default(),
active_session: None,
focus_handle,
breakpoint_list: BreakpointList::new(
@@ -138,8 +141,9 @@ impl DebugPanel {
});
}
pub(crate) fn sessions(&self) -> Vec<Entity<DebugSession>> {
self.sessions.clone()
#[cfg(test)]
pub(crate) fn sessions(&self) -> impl Iterator<Item = Entity<DebugSession>> {
self.sessions_with_children.keys().cloned()
}
pub fn active_session(&self) -> Option<Entity<DebugSession>> {
@@ -185,12 +189,20 @@ impl DebugPanel {
cx: &mut Context<Self>,
) {
let dap_store = self.project.read(cx).dap_store();
let Some(adapter) = DapRegistry::global(cx).adapter(&scenario.adapter) else {
return;
};
let quirks = SessionQuirks {
compact: adapter.compact_child_session(),
prefer_thread_name: adapter.prefer_thread_name(),
};
let session = dap_store.update(cx, |dap_store, cx| {
dap_store.new_session(
scenario.label.clone(),
Some(scenario.label.clone()),
DebugAdapterName(scenario.adapter.clone()),
task_context.clone(),
None,
quirks,
cx,
)
});
@@ -267,22 +279,34 @@ impl DebugPanel {
}
});
cx.spawn(async move |_, cx| {
if let Err(error) = task.await {
log::error!("{error}");
session
.update(cx, |session, cx| {
session
.console_output(cx)
.unbounded_send(format!("error: {}", error))
.ok();
session.shutdown(cx)
})?
.await;
let boot_task = cx.spawn({
let session = session.clone();
async move |_, cx| {
if let Err(error) = task.await {
log::error!("{error}");
session
.update(cx, |session, cx| {
session
.console_output(cx)
.unbounded_send(format!("error: {}", error))
.ok();
session.shutdown(cx)
})?
.await;
}
anyhow::Ok(())
}
anyhow::Ok(())
})
.detach_and_log_err(cx);
});
session.update(cx, |session, _| match &mut session.mode {
SessionState::Building(state_task) => {
*state_task = Some(boot_task);
}
SessionState::Running(_) => {
debug_panic!("Session state should be in building because we are just starting it");
}
});
}
pub(crate) fn rerun_last_session(
@@ -363,14 +387,15 @@ impl DebugPanel {
};
let dap_store_handle = self.project.read(cx).dap_store().clone();
let label = curr_session.read(cx).label().clone();
let label = curr_session.read(cx).label();
let quirks = curr_session.read(cx).quirks();
let adapter = curr_session.read(cx).adapter().clone();
let binary = curr_session.read(cx).binary().cloned().unwrap();
let task_context = curr_session.read(cx).task_context().clone();
let curr_session_id = curr_session.read(cx).session_id();
self.sessions
.retain(|session| session.read(cx).session_id(cx) != curr_session_id);
self.sessions_with_children
.retain(|session, _| session.read(cx).session_id(cx) != curr_session_id);
let task = dap_store_handle.update(cx, |dap_store, cx| {
dap_store.shutdown_session(curr_session_id, cx)
});
@@ -379,7 +404,7 @@ impl DebugPanel {
task.await.log_err();
let (session, task) = dap_store_handle.update(cx, |dap_store, cx| {
let session = dap_store.new_session(label, adapter, task_context, None, cx);
let session = dap_store.new_session(label, adapter, task_context, None, quirks, cx);
let task = session.update(cx, |session, cx| {
session.boot(binary, worktree, dap_store_handle.downgrade(), cx)
@@ -425,6 +450,7 @@ impl DebugPanel {
let dap_store_handle = self.project.read(cx).dap_store().clone();
let label = self.label_for_child_session(&parent_session, request, cx);
let adapter = parent_session.read(cx).adapter().clone();
let quirks = parent_session.read(cx).quirks();
let Some(mut binary) = parent_session.read(cx).binary().cloned() else {
log::error!("Attempted to start a child-session without a binary");
return;
@@ -438,6 +464,7 @@ impl DebugPanel {
adapter,
task_context,
Some(parent_session.clone()),
quirks,
cx,
);
@@ -463,8 +490,8 @@ impl DebugPanel {
cx: &mut Context<Self>,
) {
let Some(session) = self
.sessions
.iter()
.sessions_with_children
.keys()
.find(|other| entity_id == other.entity_id())
.cloned()
else {
@@ -498,15 +525,14 @@ impl DebugPanel {
}
session.update(cx, |session, cx| session.shutdown(cx)).ok();
this.update(cx, |this, cx| {
this.sessions.retain(|other| entity_id != other.entity_id());
this.retain_sessions(|other| entity_id != other.entity_id());
if let Some(active_session_id) = this
.active_session
.as_ref()
.map(|session| session.entity_id())
{
if active_session_id == entity_id {
this.active_session = this.sessions.first().cloned();
this.active_session = this.sessions_with_children.keys().next().cloned();
}
}
cx.notify()
@@ -813,13 +839,24 @@ impl DebugPanel {
.on_click(window.listener_for(
&running_state,
|this, _, _window, cx| {
this.stop_thread(cx);
if this.session().read(cx).is_building() {
this.session().update(cx, |session, cx| {
session.shutdown(cx).detach()
});
} else {
this.stop_thread(cx);
}
},
))
.disabled(active_session.as_ref().is_none_or(
|session| {
session
.read(cx)
.session(cx)
.read(cx)
.is_terminated()
},
))
.disabled(
thread_status != ThreadStatus::Stopped
&& thread_status != ThreadStatus::Running,
)
.tooltip({
let focus_handle = focus_handle.clone();
let label = if capabilities
@@ -976,8 +1013,8 @@ impl DebugPanel {
cx: &mut Context<Self>,
) {
if let Some(session) = self
.sessions
.iter()
.sessions_with_children
.keys()
.find(|session| session.read(cx).session_id(cx) == session_id)
{
self.activate_session(session.clone(), window, cx);
@@ -990,7 +1027,7 @@ impl DebugPanel {
window: &mut Window,
cx: &mut Context<Self>,
) {
debug_assert!(self.sessions.contains(&session_item));
debug_assert!(self.sessions_with_children.contains_key(&session_item));
session_item.focus_handle(cx).focus(window);
session_item.update(cx, |this, cx| {
this.running_state().update(cx, |this, cx| {
@@ -1261,18 +1298,27 @@ impl DebugPanel {
parent_session: &Entity<Session>,
request: &StartDebuggingRequestArguments,
cx: &mut Context<'_, Self>,
) -> SharedString {
) -> Option<SharedString> {
let adapter = parent_session.read(cx).adapter();
if let Some(adapter) = DapRegistry::global(cx).adapter(&adapter) {
if let Some(label) = adapter.label_for_child_session(request) {
return label.into();
return Some(label.into());
}
}
let mut label = parent_session.read(cx).label().clone();
if !label.ends_with("(child)") {
label = format!("{label} (child)").into();
}
label
None
}
fn retain_sessions(&mut self, keep: impl Fn(&Entity<DebugSession>) -> bool) {
self.sessions_with_children
.retain(|session, _| keep(session));
for children in self.sessions_with_children.values_mut() {
children.retain(|child| {
let Some(child) = child.upgrade() else {
return false;
};
keep(&child)
});
}
}
}
@@ -1302,11 +1348,11 @@ async fn register_session_inner(
let serialized_layout = persistence::get_serialized_layout(adapter_name).await;
let debug_session = this.update_in(cx, |this, window, cx| {
let parent_session = this
.sessions
.iter()
.sessions_with_children
.keys()
.find(|p| Some(p.read(cx).session_id(cx)) == session.read(cx).parent_id(cx))
.cloned();
this.sessions.retain(|session| {
this.retain_sessions(|session| {
!session
.read(cx)
.running_state()
@@ -1337,13 +1383,23 @@ async fn register_session_inner(
)
.detach();
let insert_position = this
.sessions
.iter()
.sessions_with_children
.keys()
.position(|session| Some(session) == parent_session.as_ref())
.map(|position| position + 1)
.unwrap_or(this.sessions.len());
.unwrap_or(this.sessions_with_children.len());
// Maintain topological sort order of sessions
this.sessions.insert(insert_position, debug_session.clone());
let (_, old) = this.sessions_with_children.insert_before(
insert_position,
debug_session.clone(),
Default::default(),
);
debug_assert!(old.is_none());
if let Some(parent_session) = parent_session {
this.sessions_with_children
.entry(parent_session)
.and_modify(|children| children.push(debug_session.downgrade()));
}
debug_session
})?;
@@ -1383,7 +1439,7 @@ impl Panel for DebugPanel {
cx: &mut Context<Self>,
) {
if position.axis() != self.position(window, cx).axis() {
self.sessions.iter().for_each(|session_item| {
self.sessions_with_children.keys().for_each(|session_item| {
session_item.update(cx, |item, cx| {
item.running_state()
.update(cx, |state, _| state.invert_axies())
@@ -1749,6 +1805,7 @@ impl Render for DebugPanel {
.child(breakpoint_list)
.child(Divider::vertical())
.child(welcome_experience)
.child(Divider::vertical())
} else {
this.items_end()
.child(welcome_experience)


@@ -83,6 +83,8 @@ actions!(
Rerun,
/// Toggles expansion of the selected item in the debugger UI.
ToggleExpandItem,
/// Set a data breakpoint on the selected variable or memory region.
ToggleDataBreakpoint,
]
);


@@ -1,16 +1,82 @@
use std::time::Duration;
use std::{rc::Rc, time::Duration};
use collections::HashMap;
use gpui::{Animation, AnimationExt as _, Entity, Transformation, percentage};
use gpui::{Animation, AnimationExt as _, Entity, Transformation, WeakEntity, percentage};
use project::debugger::session::{ThreadId, ThreadStatus};
use ui::{ContextMenu, DropdownMenu, DropdownStyle, Indicator, prelude::*};
use util::truncate_and_trailoff;
use util::{maybe, truncate_and_trailoff};
use crate::{
debugger_panel::DebugPanel,
session::{DebugSession, running::RunningState},
};
struct SessionListEntry {
ancestors: Vec<Entity<DebugSession>>,
leaf: Entity<DebugSession>,
}
impl SessionListEntry {
pub(crate) fn label_element(&self, depth: usize, cx: &mut App) -> AnyElement {
const MAX_LABEL_CHARS: usize = 150;
let mut label = String::new();
for ancestor in &self.ancestors {
label.push_str(&ancestor.update(cx, |ancestor, cx| {
ancestor.label(cx).unwrap_or("(child)".into())
}));
label.push_str(" » ");
}
label.push_str(
&self
.leaf
.update(cx, |leaf, cx| leaf.label(cx).unwrap_or("(child)".into())),
);
let label = truncate_and_trailoff(&label, MAX_LABEL_CHARS);
let is_terminated = self
.leaf
.read(cx)
.running_state
.read(cx)
.session()
.read(cx)
.is_terminated();
let icon = {
if is_terminated {
Some(Indicator::dot().color(Color::Error))
} else {
match self
.leaf
.read(cx)
.running_state
.read(cx)
.thread_status(cx)
.unwrap_or_default()
{
project::debugger::session::ThreadStatus::Stopped => {
Some(Indicator::dot().color(Color::Conflict))
}
_ => Some(Indicator::dot().color(Color::Success)),
}
}
};
h_flex()
.id("session-label")
.ml(depth * px(16.0))
.gap_2()
.when_some(icon, |this, indicator| this.child(indicator))
.justify_between()
.child(
Label::new(label)
.size(LabelSize::Small)
.when(is_terminated, |this| this.strikethrough()),
)
.into_any_element()
}
}
impl DebugPanel {
fn dropdown_label(label: impl Into<SharedString>) -> Label {
const MAX_LABEL_CHARS: usize = 50;
@@ -25,145 +91,205 @@ impl DebugPanel {
window: &mut Window,
cx: &mut Context<Self>,
) -> Option<impl IntoElement> {
if let Some(running_state) = running_state {
let sessions = self.sessions().clone();
let weak = cx.weak_entity();
let running_state = running_state.read(cx);
let label = if let Some(active_session) = active_session.clone() {
active_session.read(cx).session(cx).read(cx).label()
} else {
SharedString::new_static("Unknown Session")
};
let running_state = running_state?;
let is_terminated = running_state.session().read(cx).is_terminated();
let is_started = active_session
.is_some_and(|session| session.read(cx).session(cx).read(cx).is_started());
let mut session_entries = Vec::with_capacity(self.sessions_with_children.len() * 3);
let mut sessions_with_children = self.sessions_with_children.iter().peekable();
let session_state_indicator = if is_terminated {
Indicator::dot().color(Color::Error).into_any_element()
} else if !is_started {
Icon::new(IconName::ArrowCircle)
.size(IconSize::Small)
.color(Color::Muted)
.with_animation(
"arrow-circle",
Animation::new(Duration::from_secs(2)).repeat(),
|icon, delta| icon.transform(Transformation::rotate(percentage(delta))),
)
.into_any_element()
while let Some((root, children)) = sessions_with_children.next() {
let root_entry = if let Ok([single_child]) = <&[_; 1]>::try_from(children.as_slice())
&& let Some(single_child) = single_child.upgrade()
&& single_child.read(cx).quirks.compact
{
sessions_with_children.next();
SessionListEntry {
leaf: single_child.clone(),
ancestors: vec![root.clone()],
}
} else {
match running_state.thread_status(cx).unwrap_or_default() {
ThreadStatus::Stopped => {
Indicator::dot().color(Color::Conflict).into_any_element()
}
_ => Indicator::dot().color(Color::Success).into_any_element(),
SessionListEntry {
leaf: root.clone(),
ancestors: Vec::new(),
}
};
session_entries.push(root_entry);
let trigger = h_flex()
.gap_2()
.child(session_state_indicator)
.justify_between()
.child(
DebugPanel::dropdown_label(label)
.when(is_terminated, |this| this.strikethrough()),
)
.into_any_element();
Some(
DropdownMenu::new_with_element(
"debugger-session-list",
trigger,
ContextMenu::build(window, cx, move |mut this, _, cx| {
let context_menu = cx.weak_entity();
let mut session_depths = HashMap::default();
for session in sessions.into_iter() {
let weak_session = session.downgrade();
let weak_session_id = weak_session.entity_id();
let session_id = session.read(cx).session_id(cx);
let parent_depth = session
.read(cx)
.session(cx)
.read(cx)
.parent_id(cx)
.and_then(|parent_id| session_depths.get(&parent_id).cloned());
let self_depth =
*session_depths.entry(session_id).or_insert_with(|| {
parent_depth.map(|depth| depth + 1).unwrap_or(0usize)
});
this = this.custom_entry(
{
let weak = weak.clone();
let context_menu = context_menu.clone();
move |_, cx| {
weak_session
.read_with(cx, |session, cx| {
let context_menu = context_menu.clone();
let id: SharedString =
format!("debug-session-{}", session_id.0)
.into();
h_flex()
.w_full()
.group(id.clone())
.justify_between()
.child(session.label_element(self_depth, cx))
.child(
IconButton::new(
"close-debug-session",
IconName::Close,
)
.visible_on_hover(id.clone())
.icon_size(IconSize::Small)
.on_click({
let weak = weak.clone();
move |_, window, cx| {
weak.update(cx, |panel, cx| {
panel.close_session(
weak_session_id,
window,
cx,
);
})
.ok();
context_menu
.update(cx, |this, cx| {
this.cancel(
&Default::default(),
window,
cx,
);
})
.ok();
}
}),
)
.into_any_element()
})
.unwrap_or_else(|_| div().into_any_element())
}
},
{
let weak = weak.clone();
move |window, cx| {
weak.update(cx, |panel, cx| {
panel.activate_session(session.clone(), window, cx);
})
.ok();
}
},
);
}
this
session_entries.extend(
sessions_with_children
.by_ref()
.take_while(|(session, _)| {
session
.read(cx)
.session(cx)
.read(cx)
.parent_id(cx)
.is_some()
})
.map(|(session, _)| SessionListEntry {
leaf: session.clone(),
ancestors: vec![],
}),
)
.style(DropdownStyle::Ghost)
.handle(self.session_picker_menu_handle.clone()),
)
} else {
None
);
}
let weak = cx.weak_entity();
let trigger_label = if let Some(active_session) = active_session.clone() {
active_session.update(cx, |active_session, cx| {
active_session.label(cx).unwrap_or("(child)".into())
})
} else {
SharedString::new_static("Unknown Session")
};
let running_state = running_state.read(cx);
let is_terminated = running_state.session().read(cx).is_terminated();
let is_started = active_session
.is_some_and(|session| session.read(cx).session(cx).read(cx).is_started());
let session_state_indicator = if is_terminated {
Indicator::dot().color(Color::Error).into_any_element()
} else if !is_started {
Icon::new(IconName::ArrowCircle)
.size(IconSize::Small)
.color(Color::Muted)
.with_animation(
"arrow-circle",
Animation::new(Duration::from_secs(2)).repeat(),
|icon, delta| icon.transform(Transformation::rotate(percentage(delta))),
)
.into_any_element()
} else {
match running_state.thread_status(cx).unwrap_or_default() {
ThreadStatus::Stopped => Indicator::dot().color(Color::Conflict).into_any_element(),
_ => Indicator::dot().color(Color::Success).into_any_element(),
}
};
let trigger = h_flex()
.gap_2()
.child(session_state_indicator)
.justify_between()
.child(
DebugPanel::dropdown_label(trigger_label)
.when(is_terminated, |this| this.strikethrough()),
)
.into_any_element();
let menu = DropdownMenu::new_with_element(
"debugger-session-list",
trigger,
ContextMenu::build(window, cx, move |mut this, _, cx| {
let context_menu = cx.weak_entity();
let mut session_depths = HashMap::default();
for session_entry in session_entries {
let session_id = session_entry.leaf.read(cx).session_id(cx);
let parent_depth = session_entry
.ancestors
.first()
.unwrap_or(&session_entry.leaf)
.read(cx)
.session(cx)
.read(cx)
.parent_id(cx)
.and_then(|parent_id| session_depths.get(&parent_id).cloned());
let self_depth = *session_depths
.entry(session_id)
.or_insert_with(|| parent_depth.map(|depth| depth + 1).unwrap_or(0usize));
this = this.custom_entry(
{
let weak = weak.clone();
let context_menu = context_menu.clone();
let ancestors: Rc<[_]> = session_entry
.ancestors
.iter()
.map(|session| session.downgrade())
.collect();
let leaf = session_entry.leaf.downgrade();
move |window, cx| {
Self::render_session_menu_entry(
weak.clone(),
context_menu.clone(),
ancestors.clone(),
leaf.clone(),
self_depth,
window,
cx,
)
}
},
{
let weak = weak.clone();
let leaf = session_entry.leaf.clone();
move |window, cx| {
weak.update(cx, |panel, cx| {
panel.activate_session(leaf.clone(), window, cx);
})
.ok();
}
},
);
}
this
}),
)
.style(DropdownStyle::Ghost)
.handle(self.session_picker_menu_handle.clone());
Some(menu)
}
fn render_session_menu_entry(
weak: WeakEntity<DebugPanel>,
context_menu: WeakEntity<ContextMenu>,
ancestors: Rc<[WeakEntity<DebugSession>]>,
leaf: WeakEntity<DebugSession>,
self_depth: usize,
_window: &mut Window,
cx: &mut App,
) -> AnyElement {
let Some(session_entry) = maybe!({
let ancestors = ancestors
.iter()
.map(|ancestor| ancestor.upgrade())
.collect::<Option<Vec<_>>>()?;
let leaf = leaf.upgrade()?;
Some(SessionListEntry { ancestors, leaf })
}) else {
return div().into_any_element();
};
let id: SharedString = format!(
"debug-session-{}",
session_entry.leaf.read(cx).session_id(cx).0
)
.into();
let session_entity_id = session_entry.leaf.entity_id();
h_flex()
.w_full()
.group(id.clone())
.justify_between()
.child(session_entry.label_element(self_depth, cx))
.child(
IconButton::new("close-debug-session", IconName::Close)
.visible_on_hover(id.clone())
.icon_size(IconSize::Small)
.on_click({
let weak = weak.clone();
move |_, window, cx| {
weak.update(cx, |panel, cx| {
panel.close_session(session_entity_id, window, cx);
})
.ok();
context_menu
.update(cx, |this, cx| {
this.cancel(&Default::default(), window, cx);
})
.ok();
}
}),
)
.into_any_element()
}
pub(crate) fn render_thread_dropdown(


@@ -11,7 +11,7 @@ use workspace::{Member, Pane, PaneAxis, Workspace};
use crate::session::running::{
self, DebugTerminal, RunningState, SubView, breakpoint_list::BreakpointList, console::Console,
loaded_source_list::LoadedSourceList, module_list::ModuleList,
loaded_source_list::LoadedSourceList, memory_view::MemoryView, module_list::ModuleList,
stack_frame_list::StackFrameList, variable_list::VariableList,
};
@@ -24,6 +24,7 @@ pub(crate) enum DebuggerPaneItem {
Modules,
LoadedSources,
Terminal,
MemoryView,
}
impl DebuggerPaneItem {
@@ -36,6 +37,7 @@ impl DebuggerPaneItem {
DebuggerPaneItem::Modules,
DebuggerPaneItem::LoadedSources,
DebuggerPaneItem::Terminal,
DebuggerPaneItem::MemoryView,
];
VARIANTS
}
@@ -43,6 +45,9 @@ impl DebuggerPaneItem {
pub(crate) fn is_supported(&self, capabilities: &Capabilities) -> bool {
match self {
DebuggerPaneItem::Modules => capabilities.supports_modules_request.unwrap_or_default(),
DebuggerPaneItem::MemoryView => capabilities
.supports_read_memory_request
.unwrap_or_default(),
DebuggerPaneItem::LoadedSources => capabilities
.supports_loaded_sources_request
.unwrap_or_default(),
@@ -59,6 +64,7 @@ impl DebuggerPaneItem {
DebuggerPaneItem::Modules => SharedString::new_static("Modules"),
DebuggerPaneItem::LoadedSources => SharedString::new_static("Sources"),
DebuggerPaneItem::Terminal => SharedString::new_static("Terminal"),
DebuggerPaneItem::MemoryView => SharedString::new_static("Memory View"),
}
}
pub(crate) fn tab_tooltip(self) -> SharedString {
@@ -80,6 +86,7 @@ impl DebuggerPaneItem {
DebuggerPaneItem::Terminal => {
"Provides an interactive terminal session within the debugging environment."
}
DebuggerPaneItem::MemoryView => "Allows inspection of memory contents.",
};
SharedString::new_static(tooltip)
}
@@ -204,6 +211,7 @@ pub(crate) fn deserialize_pane_layout(
breakpoint_list: &Entity<BreakpointList>,
loaded_sources: &Entity<LoadedSourceList>,
terminal: &Entity<DebugTerminal>,
memory_view: &Entity<MemoryView>,
subscriptions: &mut HashMap<EntityId, Subscription>,
window: &mut Window,
cx: &mut Context<RunningState>,
@@ -228,6 +236,7 @@ pub(crate) fn deserialize_pane_layout(
breakpoint_list,
loaded_sources,
terminal,
memory_view,
subscriptions,
window,
cx,
@@ -298,6 +307,12 @@ pub(crate) fn deserialize_pane_layout(
DebuggerPaneItem::Terminal,
cx,
)),
DebuggerPaneItem::MemoryView => Box::new(SubView::new(
memory_view.focus_handle(cx),
memory_view.clone().into(),
DebuggerPaneItem::MemoryView,
cx,
)),
})
.collect();


@@ -5,14 +5,13 @@ use dap::client::SessionId;
use gpui::{
App, Axis, Entity, EventEmitter, FocusHandle, Focusable, Subscription, Task, WeakEntity,
};
use project::Project;
use project::debugger::session::Session;
use project::worktree_store::WorktreeStore;
use project::{Project, debugger::session::SessionQuirks};
use rpc::proto;
use running::RunningState;
use std::{cell::OnceCell, sync::OnceLock};
use ui::{Indicator, prelude::*};
use util::truncate_and_trailoff;
use std::cell::OnceCell;
use ui::prelude::*;
use workspace::{
CollaboratorId, FollowableItem, ViewId, Workspace,
item::{self, Item},
@@ -20,8 +19,8 @@ use workspace::{
pub struct DebugSession {
remote_id: Option<workspace::ViewId>,
running_state: Entity<RunningState>,
label: OnceLock<SharedString>,
pub(crate) running_state: Entity<RunningState>,
pub(crate) quirks: SessionQuirks,
stack_trace_view: OnceCell<Entity<StackTraceView>>,
_worktree_store: WeakEntity<WorktreeStore>,
workspace: WeakEntity<Workspace>,
@@ -57,6 +56,7 @@ impl DebugSession {
cx,
)
});
let quirks = session.read(cx).quirks();
cx.new(|cx| Self {
_subscriptions: [cx.subscribe(&running_state, |_, _, _, cx| {
@@ -64,7 +64,7 @@ impl DebugSession {
})],
remote_id: None,
running_state,
label: OnceLock::new(),
quirks,
stack_trace_view: OnceCell::new(),
_worktree_store: project.read(cx).worktree_store().downgrade(),
workspace,
@@ -110,64 +110,28 @@ impl DebugSession {
.update(cx, |state, cx| state.shutdown(cx));
}
pub(crate) fn label(&self, cx: &App) -> SharedString {
if let Some(label) = self.label.get() {
return label.clone();
}
let session = self.running_state.read(cx).session();
self.label
.get_or_init(|| session.read(cx).label())
.to_owned()
}
pub(crate) fn running_state(&self) -> &Entity<RunningState> {
&self.running_state
}
pub(crate) fn label_element(&self, depth: usize, cx: &App) -> AnyElement {
const MAX_LABEL_CHARS: usize = 150;
let label = self.label(cx);
let label = truncate_and_trailoff(&label, MAX_LABEL_CHARS);
let is_terminated = self
.running_state
.read(cx)
.session()
.read(cx)
.is_terminated();
let icon = {
if is_terminated {
Some(Indicator::dot().color(Color::Error))
} else {
match self
.running_state
.read(cx)
.thread_status(cx)
.unwrap_or_default()
{
project::debugger::session::ThreadStatus::Stopped => {
Some(Indicator::dot().color(Color::Conflict))
}
_ => Some(Indicator::dot().color(Color::Success)),
pub(crate) fn label(&self, cx: &mut App) -> Option<SharedString> {
let session = self.running_state.read(cx).session().clone();
session.update(cx, |session, cx| {
let session_label = session.label();
let quirks = session.quirks();
let mut single_thread_name = || {
let threads = session.threads(cx);
match threads.as_slice() {
[(thread, _)] => Some(SharedString::from(&thread.name)),
_ => None,
}
};
if quirks.prefer_thread_name {
single_thread_name().or(session_label)
} else {
session_label.or_else(single_thread_name)
}
};
})
}
h_flex()
.id("session-label")
.ml(depth * px(16.0))
.gap_2()
.when_some(icon, |this, indicator| this.child(indicator))
.justify_between()
.child(
Label::new(label)
.size(LabelSize::Small)
.when(is_terminated, |this| this.strikethrough()),
)
.into_any_element()
pub fn running_state(&self) -> &Entity<RunningState> {
&self.running_state
}
}


@@ -1,16 +1,17 @@
pub(crate) mod breakpoint_list;
pub(crate) mod console;
pub(crate) mod loaded_source_list;
pub(crate) mod memory_view;
pub(crate) mod module_list;
pub mod stack_frame_list;
pub mod variable_list;
use std::{any::Any, ops::ControlFlow, path::PathBuf, sync::Arc, time::Duration};
use crate::{
ToggleExpandItem,
new_process_modal::resolve_path,
persistence::{self, DebuggerPaneItem, SerializedLayout},
session::running::memory_view::MemoryView,
};
use super::DebugPanelItemEvent;
@@ -34,7 +35,7 @@ use loaded_source_list::LoadedSourceList;
use module_list::ModuleList;
use project::{
DebugScenarioContext, Project, WorktreeId,
debugger::session::{Session, SessionEvent, ThreadId, ThreadStatus},
debugger::session::{self, Session, SessionEvent, SessionStateEvent, ThreadId, ThreadStatus},
terminals::TerminalKind,
};
use rpc::proto::ViewId;
@@ -81,6 +82,7 @@ pub struct RunningState {
_schedule_serialize: Option<Task<()>>,
pub(crate) scenario: Option<DebugScenario>,
pub(crate) scenario_context: Option<DebugScenarioContext>,
memory_view: Entity<MemoryView>,
}
impl RunningState {
@@ -676,14 +678,36 @@ impl RunningState {
let session_id = session.read(cx).session_id();
let weak_state = cx.weak_entity();
let stack_frame_list = cx.new(|cx| {
StackFrameList::new(workspace.clone(), session.clone(), weak_state, window, cx)
StackFrameList::new(
workspace.clone(),
session.clone(),
weak_state.clone(),
window,
cx,
)
});
let debug_terminal =
parent_terminal.unwrap_or_else(|| cx.new(|cx| DebugTerminal::empty(window, cx)));
let variable_list =
cx.new(|cx| VariableList::new(session.clone(), stack_frame_list.clone(), window, cx));
let memory_view = cx.new(|cx| {
MemoryView::new(
session.clone(),
workspace.clone(),
stack_frame_list.downgrade(),
window,
cx,
)
});
let variable_list = cx.new(|cx| {
VariableList::new(
session.clone(),
stack_frame_list.clone(),
memory_view.clone(),
weak_state.clone(),
window,
cx,
)
});
let module_list = cx.new(|cx| ModuleList::new(session.clone(), workspace.clone(), cx));
@@ -770,6 +794,15 @@ impl RunningState {
cx.on_focus_out(&focus_handle, window, |this, _, window, cx| {
this.serialize_layout(window, cx);
}),
cx.subscribe(
&session,
|this, session, event: &SessionStateEvent, cx| match event {
SessionStateEvent::Shutdown if session.read(cx).is_building() => {
this.shutdown(cx);
}
_ => {}
},
),
];
let mut pane_close_subscriptions = HashMap::default();
@@ -786,6 +819,7 @@ impl RunningState {
&breakpoint_list,
&loaded_source_list,
&debug_terminal,
&memory_view,
&mut pane_close_subscriptions,
window,
cx,
@@ -814,6 +848,7 @@ impl RunningState {
let active_pane = panes.first_pane();
Self {
memory_view,
session,
workspace,
focus_handle,
@@ -884,6 +919,7 @@ impl RunningState {
let weak_project = project.downgrade();
let weak_workspace = workspace.downgrade();
let is_local = project.read(cx).is_local();
cx.spawn_in(window, async move |this, cx| {
let DebugScenario {
adapter,
@@ -1224,6 +1260,12 @@ impl RunningState {
item_kind,
cx,
)),
DebuggerPaneItem::MemoryView => Box::new(SubView::new(
self.memory_view.focus_handle(cx),
self.memory_view.clone().into(),
item_kind,
cx,
)),
}
}
@@ -1408,7 +1450,14 @@ impl RunningState {
&self.module_list
}
pub(crate) fn activate_item(&self, item: DebuggerPaneItem, window: &mut Window, cx: &mut App) {
pub(crate) fn activate_item(
&mut self,
item: DebuggerPaneItem,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.ensure_pane_item(item, window, cx);
let (variable_list_position, pane) = self
.panes
.panes()
@@ -1420,9 +1469,10 @@ impl RunningState {
.map(|view| (view, pane))
})
.unwrap();
pane.update(cx, |this, cx| {
this.activate_item(variable_list_position, true, true, window, cx);
})
});
}
#[cfg(test)]
@@ -1459,7 +1509,7 @@ impl RunningState {
}
}
pub(crate) fn selected_thread_id(&self) -> Option<ThreadId> {
pub fn selected_thread_id(&self) -> Option<ThreadId> {
self.thread_id
}
@@ -1599,9 +1649,21 @@ impl RunningState {
})
.log_err();
self.session.update(cx, |session, cx| {
let is_building = self.session.update(cx, |session, cx| {
session.shutdown(cx).detach();
})
matches!(session.mode, session::SessionState::Building(_))
});
if is_building {
self.debug_terminal.update(cx, |terminal, cx| {
if let Some(view) = terminal.terminal.as_ref() {
view.update(cx, |view, cx| {
view.terminal()
.update(cx, |terminal, _| terminal.kill_active_task())
})
}
})
}
}
pub fn stop_thread(&self, cx: &mut Context<Self>) {


@@ -24,10 +24,10 @@ use project::{
};
use ui::{
ActiveTheme, AnyElement, App, ButtonCommon, Clickable, Color, Context, Disableable, Div,
Divider, FluentBuilder as _, Icon, IconButton, IconName, IconSize, Indicator,
InteractiveElement, IntoElement, Label, LabelCommon, LabelSize, ListItem, ParentElement,
Render, RenderOnce, Scrollbar, ScrollbarState, SharedString, StatefulInteractiveElement,
Styled, Toggleable, Tooltip, Window, div, h_flex, px, v_flex,
Divider, FluentBuilder as _, Icon, IconButton, IconName, IconSize, InteractiveElement,
IntoElement, Label, LabelCommon, LabelSize, ListItem, ParentElement, Render, RenderOnce,
Scrollbar, ScrollbarState, SharedString, StatefulInteractiveElement, Styled, Toggleable,
Tooltip, Window, div, h_flex, px, v_flex,
};
use util::ResultExt;
use workspace::Workspace;
@@ -46,6 +46,7 @@ actions!(
pub(crate) enum SelectedBreakpointKind {
Source,
Exception,
Data,
}
pub(crate) struct BreakpointList {
workspace: WeakEntity<Workspace>,
@@ -188,6 +189,9 @@ impl BreakpointList {
BreakpointEntryKind::ExceptionBreakpoint(bp) => {
(SelectedBreakpointKind::Exception, bp.is_enabled)
}
BreakpointEntryKind::DataBreakpoint(bp) => {
(SelectedBreakpointKind::Data, bp.0.is_enabled)
}
})
})
}
@@ -391,7 +395,8 @@ impl BreakpointList {
let row = line_breakpoint.breakpoint.row;
self.go_to_line_breakpoint(path, row, window, cx);
}
BreakpointEntryKind::ExceptionBreakpoint(_) => {}
BreakpointEntryKind::DataBreakpoint(_)
| BreakpointEntryKind::ExceptionBreakpoint(_) => {}
}
}
@@ -421,6 +426,10 @@ impl BreakpointList {
let id = exception_breakpoint.id.clone();
self.toggle_exception_breakpoint(&id, cx);
}
BreakpointEntryKind::DataBreakpoint(data_breakpoint) => {
let id = data_breakpoint.0.dap.data_id.clone();
self.toggle_data_breakpoint(&id, cx);
}
}
cx.notify();
}
@@ -441,7 +450,7 @@ impl BreakpointList {
let row = line_breakpoint.breakpoint.row;
self.edit_line_breakpoint(path, row, BreakpointEditAction::Toggle, cx);
}
BreakpointEntryKind::ExceptionBreakpoint(_) => {}
_ => {}
}
cx.notify();
}
@@ -490,6 +499,14 @@ impl BreakpointList {
cx.notify();
}
fn toggle_data_breakpoint(&mut self, id: &str, cx: &mut Context<Self>) {
if let Some(session) = &self.session {
session.update(cx, |this, cx| {
this.toggle_data_breakpoint(id, cx);
});
}
}
fn toggle_exception_breakpoint(&mut self, id: &str, cx: &mut Context<Self>) {
if let Some(session) = &self.session {
session.update(cx, |this, cx| {
@@ -642,6 +659,7 @@ impl BreakpointList {
SelectedBreakpointKind::Exception => {
"Exception Breakpoints cannot be removed from the breakpoint list"
}
SelectedBreakpointKind::Data => "Remove data breakpoint from the breakpoint list",
});
let toggle_label = selection_kind.map(|(_, is_enabled)| {
if is_enabled {
@@ -783,8 +801,20 @@ impl Render for BreakpointList {
weak: weak.clone(),
})
});
self.breakpoints
.extend(breakpoints.chain(exception_breakpoints));
let data_breakpoints = self.session.as_ref().into_iter().flat_map(|session| {
session
.read(cx)
.data_breakpoints()
.map(|state| BreakpointEntry {
kind: BreakpointEntryKind::DataBreakpoint(DataBreakpoint(state.clone())),
weak: weak.clone(),
})
});
self.breakpoints.extend(
breakpoints
.chain(data_breakpoints)
.chain(exception_breakpoints),
);
v_flex()
.id("breakpoint-list")
.key_context("BreakpointList")
@@ -905,7 +935,11 @@ impl LineBreakpoint {
.ok();
}
})
.child(Indicator::icon(Icon::new(icon_name)).color(Color::Debugger))
.child(
Icon::new(icon_name)
.color(Color::Debugger)
.size(IconSize::XSmall),
)
.on_mouse_down(MouseButton::Left, move |_, _, _| {});
ListItem::new(SharedString::from(format!(
@@ -996,6 +1030,103 @@ struct ExceptionBreakpoint {
data: ExceptionBreakpointsFilter,
is_enabled: bool,
}
#[derive(Clone, Debug)]
struct DataBreakpoint(project::debugger::session::DataBreakpointState);
impl DataBreakpoint {
fn render(
&self,
props: SupportedBreakpointProperties,
strip_mode: Option<ActiveBreakpointStripMode>,
ix: usize,
is_selected: bool,
focus_handle: FocusHandle,
list: WeakEntity<BreakpointList>,
) -> ListItem {
let color = if self.0.is_enabled {
Color::Debugger
} else {
Color::Muted
};
let is_enabled = self.0.is_enabled;
let id = self.0.dap.data_id.clone();
ListItem::new(SharedString::from(format!(
"data-breakpoint-ui-item-{}",
self.0.dap.data_id
)))
.rounded()
.start_slot(
div()
.id(SharedString::from(format!(
"data-breakpoint-ui-item-{}-click-handler",
self.0.dap.data_id
)))
.tooltip({
let focus_handle = focus_handle.clone();
move |window, cx| {
Tooltip::for_action_in(
if is_enabled {
"Disable Data Breakpoint"
} else {
"Enable Data Breakpoint"
},
&ToggleEnableBreakpoint,
&focus_handle,
window,
cx,
)
}
})
.on_click({
let list = list.clone();
move |_, _, cx| {
list.update(cx, |this, cx| {
this.toggle_data_breakpoint(&id, cx);
})
.ok();
}
})
.cursor_pointer()
.child(
Icon::new(IconName::Binary)
.color(color)
.size(IconSize::Small),
),
)
.child(
h_flex()
.w_full()
.mr_4()
.py_0p5()
.justify_between()
.child(
v_flex()
.py_1()
.gap_1()
.min_h(px(26.))
.justify_center()
.id(("data-breakpoint-label", ix))
.child(
Label::new(self.0.context.human_readable_label())
.size(LabelSize::Small)
.line_height_style(ui::LineHeightStyle::UiLabel),
),
)
.child(BreakpointOptionsStrip {
props,
breakpoint: BreakpointEntry {
kind: BreakpointEntryKind::DataBreakpoint(self.clone()),
weak: list,
},
is_selected,
focus_handle,
strip_mode,
index: ix,
}),
)
.toggle_state(is_selected)
}
}
impl ExceptionBreakpoint {
fn render(
@@ -1062,7 +1193,11 @@ impl ExceptionBreakpoint {
}
})
.cursor_pointer()
.child(Indicator::icon(Icon::new(IconName::Flame)).color(color)),
.child(
Icon::new(IconName::Flame)
.color(color)
.size(IconSize::Small),
),
)
.child(
h_flex()
@@ -1105,6 +1240,7 @@ impl ExceptionBreakpoint {
enum BreakpointEntryKind {
LineBreakpoint(LineBreakpoint),
ExceptionBreakpoint(ExceptionBreakpoint),
DataBreakpoint(DataBreakpoint),
}
#[derive(Clone, Debug)]
@@ -1140,6 +1276,14 @@ impl BreakpointEntry {
focus_handle,
self.weak.clone(),
),
BreakpointEntryKind::DataBreakpoint(data_breakpoint) => data_breakpoint.render(
props.for_data_breakpoints(),
strip_mode,
ix,
is_selected,
focus_handle,
self.weak.clone(),
),
}
}
@@ -1155,6 +1299,11 @@ impl BreakpointEntry {
exception_breakpoint.id
)
.into(),
BreakpointEntryKind::DataBreakpoint(data_breakpoint) => format!(
"data-breakpoint-control-strip--{}",
data_breakpoint.0.dap.data_id
)
.into(),
}
}
@@ -1172,8 +1321,8 @@ impl BreakpointEntry {
BreakpointEntryKind::LineBreakpoint(line_breakpoint) => {
line_breakpoint.breakpoint.condition.is_some()
}
// We don't support conditions on exception breakpoints
BreakpointEntryKind::ExceptionBreakpoint(_) => false,
// We don't support conditions on exception/data breakpoints
_ => false,
}
}
@@ -1225,6 +1374,10 @@ impl SupportedBreakpointProperties {
// TODO: we don't yet support conditions for exception breakpoints at the data layer, hence all props are disabled here.
Self::empty()
}
fn for_data_breakpoints(self) -> Self {
// TODO: we don't yet support conditions for data breakpoints at the data layer, hence all props are disabled here.
Self::empty()
}
}
#[derive(IntoElement)]
struct BreakpointOptionsStrip {


@@ -12,7 +12,7 @@ use gpui::{
Action as _, AppContext, Context, Corner, Entity, FocusHandle, Focusable, HighlightStyle, Hsla,
Render, Subscription, Task, TextStyle, WeakEntity, actions,
};
use language::{Buffer, CodeLabel, ToOffset};
use language::{Anchor, Buffer, CodeLabel, TextBufferSnapshot, ToOffset};
use menu::{Confirm, SelectNext, SelectPrevious};
use project::{
Completion, CompletionResponse,
@@ -637,27 +637,13 @@ impl ConsoleQueryBarCompletionProvider {
});
let snapshot = buffer.read(cx).text_snapshot();
let query = snapshot.text();
let replace_range = {
let buffer_offset = buffer_position.to_offset(&snapshot);
let reversed_chars = snapshot.reversed_chars_for_range(0..buffer_offset);
let mut word_len = 0;
for ch in reversed_chars {
if ch.is_alphanumeric() || ch == '_' {
word_len += 1;
} else {
break;
}
}
let word_start_offset = buffer_offset - word_len;
let start_anchor = snapshot.anchor_at(word_start_offset, Bias::Left);
start_anchor..buffer_position
};
let buffer_text = snapshot.text();
cx.spawn(async move |_, cx| {
const LIMIT: usize = 10;
let matches = fuzzy::match_strings(
&string_matches,
&query,
&buffer_text,
true,
true,
LIMIT,
@@ -672,7 +658,12 @@ impl ConsoleQueryBarCompletionProvider {
let variable_value = variables.get(&string_match.string)?;
Some(project::Completion {
replace_range: replace_range.clone(),
replace_range: Self::replace_range_for_completion(
&buffer_text,
buffer_position,
string_match.string.as_bytes(),
&snapshot,
),
new_text: string_match.string.clone(),
label: CodeLabel {
filter_range: 0..string_match.string.len(),
@@ -697,6 +688,28 @@ impl ConsoleQueryBarCompletionProvider {
})
}
fn replace_range_for_completion(
buffer_text: &str,
buffer_position: Anchor,
new_bytes: &[u8],
snapshot: &TextBufferSnapshot,
) -> Range<Anchor> {
let buffer_offset = buffer_position.to_offset(&snapshot);
let buffer_bytes = &buffer_text.as_bytes()[0..buffer_offset];
let mut prefix_len = 0;
for i in (0..new_bytes.len()).rev() {
if buffer_bytes.ends_with(&new_bytes[0..i]) {
prefix_len = i;
break;
}
}
let start = snapshot.clip_offset(buffer_offset - prefix_len, Bias::Left);
snapshot.anchor_before(start)..buffer_position
}
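The overlap scan in `replace_range_for_completion` can be exercised in isolation. A minimal sketch (the standalone function name is illustrative, not part of the codebase) of finding the longest suffix of the typed buffer that matches a prefix of the completion:

```rust
/// Length of the longest prefix of `new_bytes` that `buffer_bytes` already
/// ends with. Mirrors the scan above: candidate lengths are tried from
/// longest to shortest, and the empty prefix always matches, so the result
/// is well-defined for any input.
fn completion_overlap_len(buffer_bytes: &[u8], new_bytes: &[u8]) -> usize {
    (0..new_bytes.len())
        .rev()
        .find(|&i| buffer_bytes.ends_with(&new_bytes[..i]))
        .unwrap_or(0)
}
```

For `print(res` completed with `result` this yields 3, so only `res` is replaced; with no overlap it yields 0 and the completion is inserted at the cursor.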
const fn completion_type_score(completion_type: CompletionItemType) -> usize {
match completion_type {
CompletionItemType::Field | CompletionItemType::Property => 0,
@@ -744,6 +757,8 @@ impl ConsoleQueryBarCompletionProvider {
cx.background_executor().spawn(async move {
let completions = completion_task.await?;
let buffer_text = snapshot.text();
let completions = completions
.into_iter()
.map(|completion| {
@@ -753,26 +768,14 @@ impl ConsoleQueryBarCompletionProvider {
.as_ref()
.unwrap_or(&completion.label)
.to_owned();
let buffer_text = snapshot.text();
let buffer_bytes = buffer_text.as_bytes();
let new_bytes = new_text.as_bytes();
let mut prefix_len = 0;
for i in (0..new_bytes.len()).rev() {
if buffer_bytes.ends_with(&new_bytes[0..i]) {
prefix_len = i;
break;
}
}
let buffer_offset = buffer_position.to_offset(&snapshot);
let start = buffer_offset - prefix_len;
let start = snapshot.clip_offset(start, Bias::Left);
let start = snapshot.anchor_before(start);
let replace_range = start..buffer_position;
project::Completion {
replace_range,
replace_range: Self::replace_range_for_completion(
&buffer_text,
buffer_position,
new_text.as_bytes(),
&snapshot,
),
new_text,
label: CodeLabel {
filter_range: 0..completion.label.len(),
@@ -944,3 +947,64 @@ fn color_fetcher(color: ansi::Color) -> fn(&Theme) -> Hsla {
};
color_fetcher
}
#[cfg(test)]
mod tests {
use super::*;
use crate::tests::init_test;
use editor::test::editor_test_context::EditorTestContext;
use gpui::TestAppContext;
use language::Point;
#[track_caller]
fn assert_completion_range(
input: &str,
expect: &str,
replacement: &str,
cx: &mut EditorTestContext,
) {
cx.set_state(input);
let buffer_position =
cx.editor(|editor, _, cx| editor.selections.newest::<Point>(cx).start);
let snapshot = &cx.buffer_snapshot();
let replace_range = ConsoleQueryBarCompletionProvider::replace_range_for_completion(
&cx.buffer_text(),
snapshot.anchor_before(buffer_position),
replacement.as_bytes(),
&snapshot,
);
cx.update_editor(|editor, _, cx| {
editor.edit(
vec![(
snapshot.offset_for_anchor(&replace_range.start)
..snapshot.offset_for_anchor(&replace_range.end),
replacement,
)],
cx,
);
});
pretty_assertions::assert_eq!(expect, cx.display_text());
}
#[gpui::test]
async fn test_determine_completion_replace_range(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
assert_completion_range("resˇ", "result", "result", &mut cx);
assert_completion_range("print(resˇ)", "print(result)", "result", &mut cx);
assert_completion_range("$author->nˇ", "$author->name", "$author->name", &mut cx);
assert_completion_range(
"$author->books[ˇ",
"$author->books[0]",
"$author->books[0]",
&mut cx,
);
}
}


@@ -0,0 +1,963 @@
use std::{
cell::LazyCell,
fmt::Write,
ops::RangeInclusive,
sync::{Arc, LazyLock},
time::Duration,
};
use editor::{Editor, EditorElement, EditorStyle};
use gpui::{
Action, AppContext, DismissEvent, Empty, Entity, FocusHandle, Focusable, MouseButton,
MouseMoveEvent, Point, ScrollStrategy, ScrollWheelEvent, Stateful, Subscription, Task,
TextStyle, UniformList, UniformListScrollHandle, WeakEntity, actions, anchored, bounds,
deferred, point, size, uniform_list,
};
use notifications::status_toast::{StatusToast, ToastIcon};
use project::debugger::{MemoryCell, dap_command::DataBreakpointContext, session::Session};
use settings::Settings;
use theme::ThemeSettings;
use ui::{
ActiveTheme, AnyElement, App, Color, Context, ContextMenu, Div, Divider, DropdownMenu, Element,
FluentBuilder, Icon, IconName, InteractiveElement, IntoElement, Label, LabelCommon,
ParentElement, Pixels, PopoverMenuHandle, Render, Scrollbar, ScrollbarState, SharedString,
StatefulInteractiveElement, Styled, TextSize, Tooltip, Window, div, h_flex, px, v_flex,
};
use util::ResultExt;
use workspace::Workspace;
use crate::{ToggleDataBreakpoint, session::running::stack_frame_list::StackFrameList};
actions!(debugger, [GoToSelectedAddress]);
pub(crate) struct MemoryView {
workspace: WeakEntity<Workspace>,
scroll_handle: UniformListScrollHandle,
scroll_state: ScrollbarState,
show_scrollbar: bool,
stack_frame_list: WeakEntity<StackFrameList>,
hide_scrollbar_task: Option<Task<()>>,
focus_handle: FocusHandle,
view_state: ViewState,
query_editor: Entity<Editor>,
session: Entity<Session>,
width_picker_handle: PopoverMenuHandle<ContextMenu>,
is_writing_memory: bool,
open_context_menu: Option<(Entity<ContextMenu>, Point<Pixels>, Subscription)>,
}
impl Focusable for MemoryView {
fn focus_handle(&self, _: &ui::App) -> FocusHandle {
self.focus_handle.clone()
}
}
#[derive(Clone, Debug)]
struct Drag {
start_address: u64,
end_address: u64,
}
impl Drag {
fn contains(&self, address: u64) -> bool {
let range = self.memory_range();
range.contains(&address)
}
fn memory_range(&self) -> RangeInclusive<u64> {
if self.start_address < self.end_address {
self.start_address..=self.end_address
} else {
self.end_address..=self.start_address
}
}
}
#[derive(Clone, Debug)]
enum SelectedMemoryRange {
DragUnderway(Drag),
DragComplete(Drag),
}
impl SelectedMemoryRange {
fn contains(&self, address: u64) -> bool {
match self {
SelectedMemoryRange::DragUnderway(drag) => drag.contains(address),
SelectedMemoryRange::DragComplete(drag) => drag.contains(address),
}
}
fn is_dragging(&self) -> bool {
matches!(self, SelectedMemoryRange::DragUnderway(_))
}
fn drag(&self) -> &Drag {
match self {
SelectedMemoryRange::DragUnderway(drag) => drag,
SelectedMemoryRange::DragComplete(drag) => drag,
}
}
}
#[derive(Clone)]
struct ViewState {
/// Uppermost row index
base_row: u64,
/// Number of cells per row.
line_width: ViewWidth,
selection: Option<SelectedMemoryRange>,
}
impl ViewState {
fn new(base_row: u64, line_width: ViewWidth) -> Self {
Self {
base_row,
line_width,
selection: None,
}
}
fn row_count(&self) -> u64 {
// Picked arbitrarily: we have no reason to track real page sizes, but 4096
// bytes is a good middle ground for how much data to fetch per view.
const PAGE_SIZE: u64 = 4096;
PAGE_SIZE / self.line_width.width as u64
}
fn schedule_scroll_down(&mut self) {
self.base_row = self.base_row.saturating_add(1)
}
fn schedule_scroll_up(&mut self) {
self.base_row = self.base_row.saturating_sub(1);
}
}
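Because fetches are page-sized, the visible row count depends only on the line width. A quick sketch of the arithmetic (the free function is illustrative; the constant matches the one above):

```rust
// A 4096-byte page divided by a power-of-two line width gives the row count.
const PAGE_SIZE: u64 = 4096;

fn rows_per_page(line_width: u8) -> u64 {
    PAGE_SIZE / line_width as u64
}
```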
static HEX_BYTES_MEMOIZED: LazyLock<[SharedString; 256]> =
LazyLock::new(|| std::array::from_fn(|byte| SharedString::from(format!("{byte:02X}"))));
static UNKNOWN_BYTE: SharedString = SharedString::new_static("??");
impl MemoryView {
pub(crate) fn new(
session: Entity<Session>,
workspace: WeakEntity<Workspace>,
stack_frame_list: WeakEntity<StackFrameList>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let view_state = ViewState::new(0, WIDTHS[4].clone());
let scroll_handle = UniformListScrollHandle::default();
let query_editor = cx.new(|cx| Editor::single_line(window, cx));
let scroll_state = ScrollbarState::new(scroll_handle.clone());
let mut this = Self {
workspace,
scroll_state,
scroll_handle,
stack_frame_list,
show_scrollbar: false,
hide_scrollbar_task: None,
focus_handle: cx.focus_handle(),
view_state,
query_editor,
session,
width_picker_handle: Default::default(),
is_writing_memory: true,
open_context_menu: None,
};
this.change_query_bar_mode(false, window, cx);
this
}
fn hide_scrollbar(&mut self, window: &mut Window, cx: &mut Context<Self>) {
const SCROLLBAR_SHOW_INTERVAL: Duration = Duration::from_secs(1);
self.hide_scrollbar_task = Some(cx.spawn_in(window, async move |panel, cx| {
cx.background_executor()
.timer(SCROLLBAR_SHOW_INTERVAL)
.await;
panel
.update(cx, |panel, cx| {
panel.show_scrollbar = false;
cx.notify();
})
.log_err();
}))
}
fn render_vertical_scrollbar(&self, cx: &mut Context<Self>) -> Option<Stateful<Div>> {
if !(self.show_scrollbar || self.scroll_state.is_dragging()) {
return None;
}
Some(
div()
.occlude()
.id("memory-view-vertical-scrollbar")
.on_mouse_move(cx.listener(|this, evt, _, cx| {
this.handle_drag(evt);
cx.notify();
cx.stop_propagation()
}))
.on_hover(|_, _, cx| {
cx.stop_propagation();
})
.on_any_mouse_down(|_, _, cx| {
cx.stop_propagation();
})
.on_mouse_up(
MouseButton::Left,
cx.listener(|_, _, _, cx| {
cx.stop_propagation();
}),
)
.on_scroll_wheel(cx.listener(|_, _, _, cx| {
cx.notify();
}))
.h_full()
.absolute()
.right_1()
.top_1()
.bottom_0()
.w(px(12.))
.cursor_default()
.children(Scrollbar::vertical(self.scroll_state.clone())),
)
}
fn render_memory(&self, cx: &mut Context<Self>) -> UniformList {
let weak = cx.weak_entity();
let session = self.session.clone();
let view_state = self.view_state.clone();
uniform_list(
"debugger-memory-view",
self.view_state.row_count() as usize,
move |range, _, cx| {
let mut line_buffer = Vec::with_capacity(view_state.line_width.width as usize);
let memory_start =
(view_state.base_row + range.start as u64) * view_state.line_width.width as u64;
let memory_end = (view_state.base_row + range.end as u64)
* view_state.line_width.width as u64
- 1;
let mut memory = session.update(cx, |this, cx| {
this.read_memory(memory_start..=memory_end, cx)
});
let mut rows = Vec::with_capacity(range.end - range.start);
for ix in range {
line_buffer.extend((&mut memory).take(view_state.line_width.width as usize));
rows.push(render_single_memory_view_line(
&line_buffer,
ix as u64,
weak.clone(),
cx,
));
line_buffer.clear();
}
rows
},
)
.track_scroll(self.scroll_handle.clone())
.on_scroll_wheel(cx.listener(|this, evt: &ScrollWheelEvent, window, _| {
let delta = evt.delta.pixel_delta(window.line_height());
let scroll_handle = this.scroll_state.scroll_handle();
let size = scroll_handle.content_size();
let viewport = scroll_handle.viewport();
let current_offset = scroll_handle.offset();
let first_entry_offset_boundary = size.height / this.view_state.row_count() as f32;
let last_entry_offset_boundary = size.height - first_entry_offset_boundary;
if first_entry_offset_boundary + viewport.size.height > current_offset.y.abs() {
// The topmost entry is visible, hence if we're scrolling up, we need to load extra lines.
this.view_state.schedule_scroll_up();
} else if last_entry_offset_boundary < current_offset.y.abs() + viewport.size.height {
this.view_state.schedule_scroll_down();
}
scroll_handle.set_offset(current_offset + point(px(0.), delta.y));
}))
}
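The inclusive byte range backing a run of visible rows, as computed at the top of `render_memory`, can be written as a small helper (illustrative name, not in the codebase):

```rust
use std::ops::{Range, RangeInclusive};

/// Byte range covering rows `rows` of the view, given the topmost row
/// (`base_row`) and the number of bytes per row. The end bound is
/// inclusive, hence the `- 1`.
fn visible_memory_range(base_row: u64, rows: Range<u64>, width: u64) -> RangeInclusive<u64> {
    let start = (base_row + rows.start) * width;
    let end = (base_row + rows.end) * width - 1;
    start..=end
}
```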
fn render_query_bar(&self, cx: &Context<Self>) -> impl IntoElement {
EditorElement::new(
&self.query_editor,
Self::editor_style(&self.query_editor, cx),
)
}
pub(super) fn go_to_memory_reference(
&mut self,
memory_reference: &str,
evaluate_name: Option<&str>,
stack_frame_id: Option<u64>,
cx: &mut Context<Self>,
) {
use parse_int::parse;
let Ok(as_address) = parse::<u64>(&memory_reference) else {
return;
};
let access_size = evaluate_name
.map(|typ| {
self.session.update(cx, |this, cx| {
this.data_access_size(stack_frame_id, typ, cx)
})
})
.unwrap_or_else(|| Task::ready(None));
cx.spawn(async move |this, cx| {
let access_size = access_size.await.unwrap_or(1);
this.update(cx, |this, cx| {
this.view_state.selection = Some(SelectedMemoryRange::DragComplete(Drag {
start_address: as_address,
end_address: as_address + access_size - 1,
}));
this.jump_to_address(as_address, cx);
})
.ok();
})
.detach();
}
fn handle_drag(&mut self, evt: &MouseMoveEvent) {
if !evt.dragging() {
return;
}
if !self.scroll_state.is_dragging()
&& !self
.view_state
.selection
.as_ref()
.is_some_and(|selection| selection.is_dragging())
{
return;
}
let row_count = self.view_state.row_count();
debug_assert!(row_count > 1);
let scroll_handle = self.scroll_state.scroll_handle();
let viewport = scroll_handle.viewport();
let (top_area, bottom_area) = {
let size = size(viewport.size.width, viewport.size.height / 10.);
(
bounds(viewport.origin, size),
bounds(
point(viewport.origin.x, viewport.origin.y + size.height * 2.),
size,
),
)
};
if bottom_area.contains(&evt.position) {
self.view_state.schedule_scroll_down();
} else if top_area.contains(&evt.position) {
self.view_state.schedule_scroll_up();
}
}
fn editor_style(editor: &Entity<Editor>, cx: &Context<Self>) -> EditorStyle {
let is_read_only = editor.read(cx).read_only(cx);
let settings = ThemeSettings::get_global(cx);
let theme = cx.theme();
let text_style = TextStyle {
color: if is_read_only {
theme.colors().text_muted
} else {
theme.colors().text
},
font_family: settings.buffer_font.family.clone(),
font_features: settings.buffer_font.features.clone(),
font_size: TextSize::Small.rems(cx).into(),
font_weight: settings.buffer_font.weight,
..Default::default()
};
EditorStyle {
background: theme.colors().editor_background,
local_player: theme.players().local(),
text: text_style,
..Default::default()
}
}
fn render_width_picker(&self, window: &mut Window, cx: &mut Context<Self>) -> DropdownMenu {
let weak = cx.weak_entity();
let selected_width = self.view_state.line_width.clone();
DropdownMenu::new(
"memory-view-width-picker",
selected_width.label.clone(),
ContextMenu::build(window, cx, |mut this, window, cx| {
for width in &WIDTHS {
let weak = weak.clone();
let width = width.clone();
this = this.entry(width.label.clone(), None, move |_, cx| {
_ = weak.update(cx, |this, _| {
// Convert the base row between the two line widths to keep the shown memory address roughly the same.
// All widths are powers of 2, so the conversion is a plain bit shift.
match this.view_state.line_width.width.cmp(&width.width) {
std::cmp::Ordering::Less => {
// We're converting up.
let shift = width.width.trailing_zeros()
- this.view_state.line_width.width.trailing_zeros();
this.view_state.base_row >>= shift;
}
std::cmp::Ordering::Greater => {
// We're converting down.
let shift = this.view_state.line_width.width.trailing_zeros()
- width.width.trailing_zeros();
this.view_state.base_row <<= shift;
}
_ => {}
}
this.view_state.line_width = width.clone();
});
});
}
if let Some(ix) = WIDTHS
.iter()
.position(|width| width.width == selected_width.width)
{
for _ in 0..=ix {
this.select_next(&Default::default(), window, cx);
}
}
this
}),
)
.handle(self.width_picker_handle.clone())
}
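The base-row conversion in the width picker relies on every width being a power of two, so it reduces to a bit shift. A standalone sketch (function name illustrative) under those assumptions:

```rust
/// Re-express a base row in a new line width so the first visible address
/// stays roughly the same. Both widths must be powers of two; switching to
/// a wider line shifts right (rounding down to that line's start), and
/// switching to a narrower line shifts left.
fn convert_base_row(base_row: u64, old_width: u8, new_width: u8) -> u64 {
    let old_shift = old_width.trailing_zeros();
    let new_shift = new_width.trailing_zeros();
    if new_shift > old_shift {
        base_row >> (new_shift - old_shift)
    } else {
        base_row << (old_shift - new_shift)
    }
}
```

Row 8 at 4 bytes/line starts at address 32; at 16 bytes/line that address lives in row 2, which is what the shift produces.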
fn page_down(&mut self, _: &menu::SelectLast, _: &mut Window, cx: &mut Context<Self>) {
self.view_state.base_row = self
.view_state
.base_row
.overflowing_add(self.view_state.row_count())
.0;
cx.notify();
}
fn page_up(&mut self, _: &menu::SelectFirst, _: &mut Window, cx: &mut Context<Self>) {
self.view_state.base_row = self
.view_state
.base_row
.overflowing_sub(self.view_state.row_count())
.0;
cx.notify();
}
fn change_query_bar_mode(
&mut self,
is_writing_memory: bool,
window: &mut Window,
cx: &mut Context<Self>,
) {
if is_writing_memory == self.is_writing_memory {
return;
}
if !self.is_writing_memory {
self.query_editor.update(cx, |this, cx| {
this.clear(window, cx);
this.set_placeholder_text("Write to Selected Memory Range", cx);
});
self.is_writing_memory = true;
self.query_editor.focus_handle(cx).focus(window);
} else {
self.query_editor.update(cx, |this, cx| {
this.clear(window, cx);
this.set_placeholder_text("Go to Memory Address / Expression", cx);
});
self.is_writing_memory = false;
}
}
fn toggle_data_breakpoint(
&mut self,
_: &crate::ToggleDataBreakpoint,
_: &mut Window,
cx: &mut Context<Self>,
) {
let Some(SelectedMemoryRange::DragComplete(selection)) = self.view_state.selection.clone()
else {
return;
};
let range = selection.memory_range();
let context = Arc::new(DataBreakpointContext::Address {
address: range.start().to_string(),
bytes: Some(*range.end() - *range.start()),
});
self.session.update(cx, |this, cx| {
let data_breakpoint_info = this.data_breakpoint_info(context.clone(), None, cx);
cx.spawn(async move |this, cx| {
if let Some(info) = data_breakpoint_info.await {
let Some(data_id) = info.data_id.clone() else {
return;
};
_ = this.update(cx, |this, cx| {
this.create_data_breakpoint(
context,
data_id.clone(),
dap::DataBreakpoint {
data_id,
access_type: None,
condition: None,
hit_condition: None,
},
cx,
);
});
}
})
.detach();
})
}
fn confirm(&mut self, _: &menu::Confirm, window: &mut Window, cx: &mut Context<Self>) {
if let Some(SelectedMemoryRange::DragComplete(drag)) = &self.view_state.selection {
// Go into memory writing mode.
if !self.is_writing_memory {
let should_return = self.session.update(cx, |session, cx| {
if !session
.capabilities()
.supports_write_memory_request
.unwrap_or_default()
{
let adapter_name = session.adapter();
// We cannot write memory with this adapter.
_ = self.workspace.update(cx, |this, cx| {
this.toggle_status_toast(
StatusToast::new(format!(
"Debug Adapter `{adapter_name}` does not support writing to memory"
), cx, |this, cx| {
cx.spawn(async move |this, cx| {
cx.background_executor().timer(Duration::from_secs(2)).await;
_ = this.update(cx, |_, cx| {
cx.emit(DismissEvent)
});
}).detach();
this.icon(ToastIcon::new(IconName::XCircle).color(Color::Error))
}),
cx,
);
});
true
} else {
false
}
});
if should_return {
return;
}
self.change_query_bar_mode(true, window, cx);
} else if self.query_editor.focus_handle(cx).is_focused(window) {
let mut text = self.query_editor.read(cx).text(cx);
if text.chars().any(|c| !c.is_ascii_hexdigit()) {
// Not valid hex: treat the input as a raw string and encode its bytes as hex.
text = text.bytes().map(|byte| format!("{:02x}", byte)).collect();
}
self.session.update(cx, |this, cx| {
let range = drag.memory_range();
if let Ok(as_hex) = hex::decode(text) {
this.write_memory(*range.start(), &as_hex, cx);
}
});
self.change_query_bar_mode(false, window, cx);
}
cx.notify();
return;
}
// Just change the currently viewed address.
if !self.query_editor.focus_handle(cx).is_focused(window) {
return;
}
self.jump_to_query_bar_address(cx);
}
fn jump_to_query_bar_address(&mut self, cx: &mut Context<Self>) {
use parse_int::parse;
let text = self.query_editor.read(cx).text(cx);
let Ok(as_address) = parse::<u64>(&text) else {
return self.jump_to_expression(text, cx);
};
self.jump_to_address(as_address, cx);
}
fn jump_to_address(&mut self, address: u64, cx: &mut Context<Self>) {
self.view_state.base_row = (address & !0xfff) / self.view_state.line_width.width as u64;
let line_ix = (address & 0xfff) / self.view_state.line_width.width as u64;
self.scroll_handle
.scroll_to_item(line_ix as usize, ScrollStrategy::Center);
cx.notify();
}
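The masking in `jump_to_address` splits an address into a 4 KiB-aligned page row and a line index within that page. In isolation (illustrative helper, assuming the same 0x1000-byte pages):

```rust
/// Split an address into (base_row, line_ix) for a 0x1000-byte page and
/// the given bytes-per-line, matching the arithmetic above.
fn page_position(address: u64, line_width: u64) -> (u64, u64) {
    let base_row = (address & !0xfff) / line_width;
    let line_ix = (address & 0xfff) / line_width;
    (base_row, line_ix)
}
```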
fn jump_to_expression(&mut self, expr: String, cx: &mut Context<Self>) {
let Ok(selected_frame) = self
.stack_frame_list
.update(cx, |this, _| this.opened_stack_frame_id())
else {
return;
};
let reference = self.session.update(cx, |this, cx| {
this.memory_reference_of_expr(selected_frame, expr, cx)
});
cx.spawn(async move |this, cx| {
if let Some(reference) = reference.await {
_ = this.update(cx, |this, cx| {
let Ok(address) = parse_int::parse::<u64>(&reference) else {
return;
};
this.jump_to_address(address, cx);
});
}
})
.detach();
}
fn cancel(&mut self, _: &menu::Cancel, _: &mut Window, cx: &mut Context<Self>) {
self.view_state.selection = None;
cx.notify();
}
/// Jump to the memory pointed to by the selected memory range.
fn go_to_address(
&mut self,
_: &GoToSelectedAddress,
window: &mut Window,
cx: &mut Context<Self>,
) {
let Some(SelectedMemoryRange::DragComplete(drag)) = self.view_state.selection.clone()
else {
return;
};
let range = drag.memory_range();
let Some(memory): Option<Vec<u8>> = self.session.update(cx, |this, cx| {
this.read_memory(range, cx).map(|cell| cell.0).collect()
}) else {
return;
};
if memory.len() > 8 {
return;
}
let zeros_to_write = 8 - memory.len();
let mut acc = String::from("0x");
acc.extend(std::iter::repeat("00").take(zeros_to_write));
let as_query = memory.into_iter().rev().fold(acc, |mut acc, byte| {
_ = write!(&mut acc, "{:02x}", byte);
acc
});
self.query_editor.update(cx, |this, cx| {
this.set_text(as_query, window, cx);
});
self.jump_to_query_bar_address(cx);
}
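The fold in `go_to_address` assembles the selected bytes, least-significant first, into a fixed-width hex address. A self-contained sketch of the same formatting (helper name illustrative):

```rust
use std::fmt::Write;

/// Render up to 8 bytes (little-endian, as read from memory) as a
/// 0x-prefixed, 16-digit hex address string.
fn bytes_to_address_query(memory: &[u8]) -> String {
    assert!(memory.len() <= 8);
    let mut acc = String::from("0x");
    // Pad the high-order digits the read didn't cover.
    acc.extend(std::iter::repeat("00").take(8 - memory.len()));
    // Bytes arrive little-endian, so reverse them to print most-significant first.
    memory.iter().rev().fold(acc, |mut acc, byte| {
        let _ = write!(&mut acc, "{:02x}", byte);
        acc
    })
}
```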
fn deploy_memory_context_menu(
&mut self,
range: RangeInclusive<u64>,
position: Point<Pixels>,
window: &mut Window,
cx: &mut Context<Self>,
) {
let session = self.session.clone();
let context_menu = ContextMenu::build(window, cx, |menu, _, cx| {
let range_too_large = range.end() - range.start() > std::mem::size_of::<u64>() as u64;
let caps = session.read(cx).capabilities();
let supports_data_breakpoints = caps.supports_data_breakpoints.unwrap_or_default()
&& caps.supports_data_breakpoint_bytes.unwrap_or_default();
let memory_unreadable = LazyCell::new(|| {
session.update(cx, |this, cx| {
this.read_memory(range.clone(), cx)
.any(|cell| cell.0.is_none())
})
});
let mut menu = menu.action_disabled_when(
range_too_large || *memory_unreadable,
"Go To Selected Address",
GoToSelectedAddress.boxed_clone(),
);
if supports_data_breakpoints {
menu = menu.action_disabled_when(
*memory_unreadable,
"Set Data Breakpoint",
ToggleDataBreakpoint.boxed_clone(),
);
}
menu.context(self.focus_handle.clone())
});
cx.focus_view(&context_menu, window);
let subscription = cx.subscribe_in(
&context_menu,
window,
|this, _, _: &DismissEvent, window, cx| {
if this.open_context_menu.as_ref().is_some_and(|context_menu| {
context_menu.0.focus_handle(cx).contains_focused(window, cx)
}) {
cx.focus_self(window);
}
this.open_context_menu.take();
cx.notify();
},
);
self.open_context_menu = Some((context_menu, position, subscription));
}
}
#[derive(Clone)]
struct ViewWidth {
width: u8,
label: SharedString,
}
impl ViewWidth {
const fn new(width: u8, label: &'static str) -> Self {
Self {
width,
label: SharedString::new_static(label),
}
}
}
static WIDTHS: [ViewWidth; 7] = [
ViewWidth::new(1, "1 byte"),
ViewWidth::new(2, "2 bytes"),
ViewWidth::new(4, "4 bytes"),
ViewWidth::new(8, "8 bytes"),
ViewWidth::new(16, "16 bytes"),
ViewWidth::new(32, "32 bytes"),
ViewWidth::new(64, "64 bytes"),
];
fn render_single_memory_view_line(
memory: &[MemoryCell],
ix: u64,
weak: gpui::WeakEntity<MemoryView>,
cx: &mut App,
) -> AnyElement {
let Ok(view_state) = weak.update(cx, |this, _| this.view_state.clone()) else {
return div().into_any();
};
let base_address = (view_state.base_row + ix) * view_state.line_width.width as u64;
h_flex()
.id((
"memory-view-row-full",
ix * view_state.line_width.width as u64,
))
.size_full()
.gap_x_2()
.child(
div()
.child(
Label::new(format!("{:016X}", base_address))
.buffer_font(cx)
.size(ui::LabelSize::Small)
.color(Color::Muted),
)
.px_1()
.border_r_1()
.border_color(Color::Muted.color(cx)),
)
.child(
h_flex()
.id((
"memory-view-row-raw-memory",
ix * view_state.line_width.width as u64,
))
.px_1()
.children(memory.iter().enumerate().map(|(cell_ix, cell)| {
let weak = weak.clone();
div()
.id(("memory-view-row-raw-memory-cell", cell_ix as u64))
.px_0p5()
.when_some(view_state.selection.as_ref(), |this, selection| {
this.when(selection.contains(base_address + cell_ix as u64), |this| {
let weak = weak.clone();
this.bg(Color::Accent.color(cx)).when(
!selection.is_dragging(),
|this| {
let selection = selection.drag().memory_range();
this.on_mouse_down(
MouseButton::Right,
move |click, window, cx| {
_ = weak.update(cx, |this, cx| {
this.deploy_memory_context_menu(
selection.clone(),
click.position,
window,
cx,
)
});
cx.stop_propagation();
},
)
},
)
})
})
.child(
Label::new(
cell.0
.map(|val| HEX_BYTES_MEMOIZED[val as usize].clone())
.unwrap_or_else(|| UNKNOWN_BYTE.clone()),
)
.buffer_font(cx)
.when(cell.0.is_none(), |this| this.color(Color::Muted))
.size(ui::LabelSize::Small),
)
.on_drag(
Drag {
start_address: base_address + cell_ix as u64,
end_address: base_address + cell_ix as u64,
},
{
let weak = weak.clone();
move |drag, _, _, cx| {
_ = weak.update(cx, |this, _| {
this.view_state.selection =
Some(SelectedMemoryRange::DragUnderway(drag.clone()));
});
cx.new(|_| Empty)
}
},
)
.on_drop({
let weak = weak.clone();
move |drag: &Drag, _, cx| {
_ = weak.update(cx, |this, _| {
this.view_state.selection =
Some(SelectedMemoryRange::DragComplete(Drag {
start_address: drag.start_address,
end_address: base_address + cell_ix as u64,
}));
});
}
})
.drag_over(move |style, drag: &Drag, _, cx| {
_ = weak.update(cx, |this, _| {
this.view_state.selection =
Some(SelectedMemoryRange::DragUnderway(Drag {
start_address: drag.start_address,
end_address: base_address + cell_ix as u64,
}));
});
style
})
})),
)
.child(
h_flex()
.id((
"memory-view-row-ascii-memory",
ix * view_state.line_width.width as u64,
))
.h_full()
.px_1()
.mr_4()
.border_x_1()
.border_color(Color::Muted.color(cx))
.children(memory.iter().enumerate().map(|(ix, cell)| {
let as_character = char::from(cell.0.unwrap_or(0));
let as_visible = if as_character.is_ascii_graphic() {
as_character
} else {
'·'
};
div()
.px_0p5()
.when_some(view_state.selection.as_ref(), |this, selection| {
this.when(selection.contains(base_address + ix as u64), |this| {
this.bg(Color::Accent.color(cx))
})
})
.child(
Label::new(format!("{as_visible}"))
.buffer_font(cx)
.when(cell.0.is_none(), |this| this.color(Color::Muted))
.size(ui::LabelSize::Small),
)
})),
)
.into_any()
}
impl Render for MemoryView {
fn render(
&mut self,
window: &mut ui::Window,
cx: &mut ui::Context<Self>,
) -> impl ui::IntoElement {
let (icon, tooltip_text) = if self.is_writing_memory {
(IconName::Pencil, "Edit memory at a selected address")
} else {
(
IconName::LocationEdit,
"Change address of currently viewed memory",
)
};
v_flex()
.id("Memory-view")
.on_action(cx.listener(Self::cancel))
.on_action(cx.listener(Self::go_to_address))
.p_1()
.on_action(cx.listener(Self::confirm))
.on_action(cx.listener(Self::toggle_data_breakpoint))
.on_action(cx.listener(Self::page_down))
.on_action(cx.listener(Self::page_up))
.size_full()
.track_focus(&self.focus_handle)
.on_hover(cx.listener(|this, hovered, window, cx| {
if *hovered {
this.show_scrollbar = true;
this.hide_scrollbar_task.take();
cx.notify();
} else if !this.focus_handle.contains_focused(window, cx) {
this.hide_scrollbar(window, cx);
}
}))
.child(
h_flex()
.w_full()
.mb_0p5()
.gap_1()
.child(
h_flex()
.w_full()
.rounded_md()
.border_1()
.gap_x_2()
.px_2()
.py_0p5()
.mb_0p5()
.bg(cx.theme().colors().editor_background)
.when_else(
self.query_editor
.focus_handle(cx)
.contains_focused(window, cx),
|this| this.border_color(cx.theme().colors().border_focused),
|this| this.border_color(cx.theme().colors().border_transparent),
)
.child(
div()
.id("memory-view-editor-icon")
.child(Icon::new(icon).size(ui::IconSize::XSmall))
.tooltip(Tooltip::text(tooltip_text)),
)
.child(self.render_query_bar(cx)),
)
.child(self.render_width_picker(window, cx)),
)
.child(Divider::horizontal())
.child(
v_flex()
.size_full()
.on_mouse_move(cx.listener(|this, evt: &MouseMoveEvent, _, _| {
this.handle_drag(evt);
}))
.child(self.render_memory(cx).size_full())
.children(self.open_context_menu.as_ref().map(|(menu, position, _)| {
deferred(
anchored()
.position(*position)
.anchor(gpui::Corner::TopLeft)
.child(menu.clone()),
)
.with_priority(1)
}))
.children(self.render_vertical_scrollbar(cx)),
)
}
}


@@ -1,3 +1,5 @@
use crate::session::running::{RunningState, memory_view::MemoryView};
use super::stack_frame_list::{StackFrameList, StackFrameListEvent};
use dap::{
ScopePresentationHint, StackFrameId, VariablePresentationHint, VariablePresentationHintKind,
@@ -7,13 +9,17 @@ use editor::Editor;
use gpui::{
Action, AnyElement, ClickEvent, ClipboardItem, Context, DismissEvent, Empty, Entity,
FocusHandle, Focusable, Hsla, MouseButton, MouseDownEvent, Point, Stateful, Subscription,
TextStyleRefinement, UniformListScrollHandle, actions, anchored, deferred, uniform_list,
TextStyleRefinement, UniformListScrollHandle, WeakEntity, actions, anchored, deferred,
uniform_list,
};
use menu::{SelectFirst, SelectLast, SelectNext, SelectPrevious};
use project::debugger::session::{Session, SessionEvent, Watcher};
use project::debugger::{
dap_command::DataBreakpointContext,
session::{Session, SessionEvent, Watcher},
};
use std::{collections::HashMap, ops::Range, sync::Arc};
use ui::{ContextMenu, ListItem, ScrollableHandle, Scrollbar, ScrollbarState, Tooltip, prelude::*};
use util::debug_panic;
use util::{debug_panic, maybe};
actions!(
variable_list,
@@ -32,6 +38,8 @@ actions!(
AddWatch,
/// Removes the selected variable from the watch list.
RemoveWatch,
/// Jump to variable's memory location.
GoToMemory,
]
);
@@ -86,30 +94,30 @@ impl EntryPath {
}
#[derive(Debug, Clone, PartialEq)]
enum EntryKind {
enum DapEntry {
Watcher(Watcher),
Variable(dap::Variable),
Scope(dap::Scope),
}
impl EntryKind {
impl DapEntry {
fn as_watcher(&self) -> Option<&Watcher> {
match self {
EntryKind::Watcher(watcher) => Some(watcher),
DapEntry::Watcher(watcher) => Some(watcher),
_ => None,
}
}
fn as_variable(&self) -> Option<&dap::Variable> {
match self {
EntryKind::Variable(dap) => Some(dap),
DapEntry::Variable(dap) => Some(dap),
_ => None,
}
}
fn as_scope(&self) -> Option<&dap::Scope> {
match self {
EntryKind::Scope(dap) => Some(dap),
DapEntry::Scope(dap) => Some(dap),
_ => None,
}
}
@@ -117,38 +125,38 @@ impl EntryKind {
#[cfg(test)]
fn name(&self) -> &str {
match self {
EntryKind::Watcher(watcher) => &watcher.expression,
EntryKind::Variable(dap) => &dap.name,
EntryKind::Scope(dap) => &dap.name,
DapEntry::Watcher(watcher) => &watcher.expression,
DapEntry::Variable(dap) => &dap.name,
DapEntry::Scope(dap) => &dap.name,
}
}
}
#[derive(Debug, Clone, PartialEq)]
struct ListEntry {
-dap_kind: EntryKind,
+entry: DapEntry,
path: EntryPath,
}
impl ListEntry {
fn as_watcher(&self) -> Option<&Watcher> {
-self.dap_kind.as_watcher()
+self.entry.as_watcher()
}
fn as_variable(&self) -> Option<&dap::Variable> {
-self.dap_kind.as_variable()
+self.entry.as_variable()
}
fn as_scope(&self) -> Option<&dap::Scope> {
-self.dap_kind.as_scope()
+self.entry.as_scope()
}
fn item_id(&self) -> ElementId {
use std::fmt::Write;
-let mut id = match &self.dap_kind {
-EntryKind::Watcher(watcher) => format!("watcher-{}", watcher.expression),
-EntryKind::Variable(dap) => format!("variable-{}", dap.name),
-EntryKind::Scope(dap) => format!("scope-{}", dap.name),
+let mut id = match &self.entry {
+DapEntry::Watcher(watcher) => format!("watcher-{}", watcher.expression),
+DapEntry::Variable(dap) => format!("variable-{}", dap.name),
+DapEntry::Scope(dap) => format!("scope-{}", dap.name),
};
for name in self.path.indices.iter() {
_ = write!(id, "-{}", name);
@@ -158,10 +166,10 @@ impl ListEntry {
fn item_value_id(&self) -> ElementId {
use std::fmt::Write;
-let mut id = match &self.dap_kind {
-EntryKind::Watcher(watcher) => format!("watcher-{}", watcher.expression),
-EntryKind::Variable(dap) => format!("variable-{}", dap.name),
-EntryKind::Scope(dap) => format!("scope-{}", dap.name),
+let mut id = match &self.entry {
+DapEntry::Watcher(watcher) => format!("watcher-{}", watcher.expression),
+DapEntry::Variable(dap) => format!("variable-{}", dap.name),
+DapEntry::Scope(dap) => format!("scope-{}", dap.name),
};
for name in self.path.indices.iter() {
_ = write!(id, "-{}", name);
@@ -188,13 +196,17 @@ pub struct VariableList {
focus_handle: FocusHandle,
edited_path: Option<(EntryPath, Entity<Editor>)>,
disabled: bool,
+memory_view: Entity<MemoryView>,
+weak_running: WeakEntity<RunningState>,
_subscriptions: Vec<Subscription>,
}
impl VariableList {
-pub fn new(
+pub(crate) fn new(
session: Entity<Session>,
stack_frame_list: Entity<StackFrameList>,
+memory_view: Entity<MemoryView>,
+weak_running: WeakEntity<RunningState>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
@@ -211,6 +223,7 @@ impl VariableList {
SessionEvent::Variables | SessionEvent::Watchers => {
this.build_entries(cx);
}
_ => {}
}),
cx.on_focus_out(&focus_handle, window, |this, _, _, cx| {
@@ -234,6 +247,8 @@ impl VariableList {
edited_path: None,
entries: Default::default(),
entry_states: Default::default(),
+weak_running,
+memory_view,
}
}
@@ -284,7 +299,7 @@ impl VariableList {
scope.variables_reference,
scope.variables_reference,
EntryPath::for_scope(&scope.name),
-EntryKind::Scope(scope),
+DapEntry::Scope(scope),
)
})
.collect::<Vec<_>>();
@@ -298,7 +313,7 @@ impl VariableList {
watcher.variables_reference,
watcher.variables_reference,
EntryPath::for_watcher(watcher.expression.clone()),
-EntryKind::Watcher(watcher.clone()),
+DapEntry::Watcher(watcher.clone()),
)
})
.collect::<Vec<_>>(),
@@ -309,9 +324,9 @@ impl VariableList {
while let Some((container_reference, variables_reference, mut path, dap_kind)) = stack.pop()
{
match &dap_kind {
-EntryKind::Watcher(watcher) => path = path.with_child(watcher.expression.clone()),
-EntryKind::Variable(dap) => path = path.with_name(dap.name.clone().into()),
-EntryKind::Scope(dap) => path = path.with_child(dap.name.clone().into()),
+DapEntry::Watcher(watcher) => path = path.with_child(watcher.expression.clone()),
+DapEntry::Variable(dap) => path = path.with_name(dap.name.clone().into()),
+DapEntry::Scope(dap) => path = path.with_child(dap.name.clone().into()),
}
let var_state = self
@@ -336,7 +351,7 @@ impl VariableList {
});
entries.push(ListEntry {
-dap_kind,
+entry: dap_kind,
path: path.clone(),
});
@@ -349,7 +364,7 @@ impl VariableList {
variables_reference,
child.variables_reference,
path.with_child(child.name.clone().into()),
-EntryKind::Variable(child),
+DapEntry::Variable(child),
)
}));
}
@@ -380,9 +395,9 @@ impl VariableList {
pub fn completion_variables(&self, _cx: &mut Context<Self>) -> Vec<dap::Variable> {
self.entries
.iter()
-.filter_map(|entry| match &entry.dap_kind {
-EntryKind::Variable(dap) => Some(dap.clone()),
-EntryKind::Scope(_) | EntryKind::Watcher { .. } => None,
+.filter_map(|entry| match &entry.entry {
+DapEntry::Variable(dap) => Some(dap.clone()),
+DapEntry::Scope(_) | DapEntry::Watcher { .. } => None,
})
.collect()
}
@@ -400,12 +415,12 @@ impl VariableList {
.get(ix)
.and_then(|entry| Some(entry).zip(self.entry_states.get(&entry.path)))?;
-match &entry.dap_kind {
-EntryKind::Watcher { .. } => {
+match &entry.entry {
+DapEntry::Watcher { .. } => {
Some(self.render_watcher(entry, *state, window, cx))
}
-EntryKind::Variable(_) => Some(self.render_variable(entry, *state, window, cx)),
-EntryKind::Scope(_) => Some(self.render_scope(entry, *state, cx)),
+DapEntry::Variable(_) => Some(self.render_variable(entry, *state, window, cx)),
+DapEntry::Scope(_) => Some(self.render_scope(entry, *state, cx)),
}
})
.collect()
@@ -562,6 +577,51 @@ impl VariableList {
}
}
fn jump_to_variable_memory(
&mut self,
_: &GoToMemory,
window: &mut Window,
cx: &mut Context<Self>,
) {
_ = maybe!({
let selection = self.selection.as_ref()?;
let entry = self.entries.iter().find(|entry| &entry.path == selection)?;
let var = entry.entry.as_variable()?;
let memory_reference = var.memory_reference.as_deref()?;
let sizeof_expr = if var.type_.as_ref().is_some_and(|t| {
t.chars()
.all(|c| c.is_whitespace() || c.is_alphabetic() || c == '*')
}) {
var.type_.as_deref()
} else {
var.evaluate_name
.as_deref()
.map(|name| name.strip_prefix("/nat ").unwrap_or_else(|| name))
};
self.memory_view.update(cx, |this, cx| {
this.go_to_memory_reference(
memory_reference,
sizeof_expr,
self.selected_stack_frame_id,
cx,
);
});
let weak_panel = self.weak_running.clone();
window.defer(cx, move |window, cx| {
_ = weak_panel.update(cx, |this, cx| {
this.activate_item(
crate::persistence::DebuggerPaneItem::MemoryView,
window,
cx,
);
});
});
Some(())
});
}
fn deploy_list_entry_context_menu(
&mut self,
entry: ListEntry,
@@ -569,49 +629,156 @@ impl VariableList {
window: &mut Window,
cx: &mut Context<Self>,
) {
-let supports_set_variable = self
-.session
-.read(cx)
-.capabilities()
-.supports_set_variable
-.unwrap_or_default();
+let (supports_set_variable, supports_data_breakpoints, supports_go_to_memory) =
+self.session.read_with(cx, |session, _| {
+(
+session
+.capabilities()
+.supports_set_variable
+.unwrap_or_default(),
+session
+.capabilities()
+.supports_data_breakpoints
+.unwrap_or_default(),
+session
+.capabilities()
+.supports_read_memory_request
+.unwrap_or_default(),
+)
+});
+let can_toggle_data_breakpoint = entry
+.as_variable()
+.filter(|_| supports_data_breakpoints)
+.and_then(|variable| {
+let variables_reference = self
+.entry_states
+.get(&entry.path)
+.map(|state| state.parent_reference)?;
+Some(self.session.update(cx, |session, cx| {
+session.data_breakpoint_info(
+Arc::new(DataBreakpointContext::Variable {
+variables_reference,
+name: variable.name.clone(),
+bytes: None,
+}),
+None,
+cx,
+)
+}))
+});
-let context_menu = ContextMenu::build(window, cx, |menu, _, _| {
-menu.when(entry.as_variable().is_some(), |menu| {
-menu.action("Copy Name", CopyVariableName.boxed_clone())
-.action("Copy Value", CopyVariableValue.boxed_clone())
-.when(supports_set_variable, |menu| {
-menu.action("Edit Value", EditVariable.boxed_clone())
+let focus_handle = self.focus_handle.clone();
+cx.spawn_in(window, async move |this, cx| {
+let can_toggle_data_breakpoint = if let Some(task) = can_toggle_data_breakpoint {
+task.await.is_some()
+} else {
+true
+};
+cx.update(|window, cx| {
+let context_menu = ContextMenu::build(window, cx, |menu, _, _| {
+menu.when_some(entry.as_variable(), |menu, _| {
+menu.action("Copy Name", CopyVariableName.boxed_clone())
+.action("Copy Value", CopyVariableValue.boxed_clone())
+.when(supports_set_variable, |menu| {
+menu.action("Edit Value", EditVariable.boxed_clone())
+})
+.when(supports_go_to_memory, |menu| {
+menu.action("Go To Memory", GoToMemory.boxed_clone())
+})
+.action("Watch Variable", AddWatch.boxed_clone())
+.when(can_toggle_data_breakpoint, |menu| {
+menu.action(
+"Toggle Data Breakpoint",
+crate::ToggleDataBreakpoint.boxed_clone(),
+)
+})
+})
-.action("Watch Variable", AddWatch.boxed_clone())
-})
-.when(entry.as_watcher().is_some(), |menu| {
-menu.action("Copy Name", CopyVariableName.boxed_clone())
-.action("Copy Value", CopyVariableValue.boxed_clone())
-.when(supports_set_variable, |menu| {
-menu.action("Edit Value", EditVariable.boxed_clone())
+.when(entry.as_watcher().is_some(), |menu| {
+menu.action("Copy Name", CopyVariableName.boxed_clone())
+.action("Copy Value", CopyVariableValue.boxed_clone())
+.when(supports_set_variable, |menu| {
+menu.action("Edit Value", EditVariable.boxed_clone())
+})
+.action("Remove Watch", RemoveWatch.boxed_clone())
+})
-.action("Remove Watch", RemoveWatch.boxed_clone())
+.context(focus_handle.clone())
});
+_ = this.update(cx, |this, cx| {
+cx.focus_view(&context_menu, window);
+let subscription = cx.subscribe_in(
+&context_menu,
+window,
+|this, _, _: &DismissEvent, window, cx| {
+if this.open_context_menu.as_ref().is_some_and(|context_menu| {
+context_menu.0.focus_handle(cx).contains_focused(window, cx)
+}) {
+cx.focus_self(window);
+}
+this.open_context_menu.take();
+cx.notify();
+},
+);
+this.open_context_menu = Some((context_menu, position, subscription));
+});
+})
-.context(self.focus_handle.clone())
+})
+.detach();
}
+fn toggle_data_breakpoint(
+&mut self,
+_: &crate::ToggleDataBreakpoint,
+_window: &mut Window,
+cx: &mut Context<Self>,
+) {
+let Some(entry) = self
+.selection
+.as_ref()
+.and_then(|selection| self.entries.iter().find(|entry| &entry.path == selection))
+else {
+return;
+};
+let Some((name, var_ref)) = entry.as_variable().map(|var| &var.name).zip(
+self.entry_states
+.get(&entry.path)
+.map(|state| state.parent_reference),
+) else {
+return;
+};
+let context = Arc::new(DataBreakpointContext::Variable {
+variables_reference: var_ref,
+name: name.clone(),
+bytes: None,
+});
+let data_breakpoint = self.session.update(cx, |session, cx| {
+session.data_breakpoint_info(context.clone(), None, cx)
+});
-cx.focus_view(&context_menu, window);
-let subscription = cx.subscribe_in(
-&context_menu,
-window,
-|this, _, _: &DismissEvent, window, cx| {
-if this.open_context_menu.as_ref().is_some_and(|context_menu| {
-context_menu.0.focus_handle(cx).contains_focused(window, cx)
-}) {
-cx.focus_self(window);
-}
-this.open_context_menu.take();
+let session = self.session.downgrade();
+cx.spawn(async move |_, cx| {
+let Some(data_id) = data_breakpoint.await.and_then(|info| info.data_id) else {
+return;
+};
+_ = session.update(cx, |session, cx| {
+session.create_data_breakpoint(
+context,
+data_id.clone(),
+dap::DataBreakpoint {
+data_id,
+access_type: None,
+condition: None,
+hit_condition: None,
+},
+cx,
+);
-cx.notify();
-},
-);
-self.open_context_menu = Some((context_menu, position, subscription));
+});
+})
+.detach();
}
fn copy_variable_name(
@@ -628,10 +795,10 @@ impl VariableList {
return;
};
-let variable_name = match &entry.dap_kind {
-EntryKind::Variable(dap) => dap.name.clone(),
-EntryKind::Watcher(watcher) => watcher.expression.to_string(),
-EntryKind::Scope(_) => return,
+let variable_name = match &entry.entry {
+DapEntry::Variable(dap) => dap.name.clone(),
+DapEntry::Watcher(watcher) => watcher.expression.to_string(),
+DapEntry::Scope(_) => return,
};
cx.write_to_clipboard(ClipboardItem::new_string(variable_name));
@@ -651,10 +818,10 @@ impl VariableList {
return;
};
-let variable_value = match &entry.dap_kind {
-EntryKind::Variable(dap) => dap.value.clone(),
-EntryKind::Watcher(watcher) => watcher.value.to_string(),
-EntryKind::Scope(_) => return,
+let variable_value = match &entry.entry {
+DapEntry::Variable(dap) => dap.value.clone(),
+DapEntry::Watcher(watcher) => watcher.value.to_string(),
+DapEntry::Scope(_) => return,
};
cx.write_to_clipboard(ClipboardItem::new_string(variable_value));
@@ -669,10 +836,10 @@ impl VariableList {
return;
};
-let variable_value = match &entry.dap_kind {
-EntryKind::Watcher(watcher) => watcher.value.to_string(),
-EntryKind::Variable(variable) => variable.value.clone(),
-EntryKind::Scope(_) => return,
+let variable_value = match &entry.entry {
+DapEntry::Watcher(watcher) => watcher.value.to_string(),
+DapEntry::Variable(variable) => variable.value.clone(),
+DapEntry::Scope(_) => return,
};
let editor = Self::create_variable_editor(&variable_value, window, cx);
@@ -753,7 +920,7 @@ impl VariableList {
"{}{} {}{}",
INDENT.repeat(state.depth - 1),
if state.is_expanded { "v" } else { ">" },
-entry.dap_kind.name(),
+entry.entry.name(),
if self.selection.as_ref() == Some(&entry.path) {
" <=== selected"
} else {
@@ -770,8 +937,8 @@ impl VariableList {
pub(crate) fn scopes(&self) -> Vec<dap::Scope> {
self.entries
.iter()
-.filter_map(|entry| match &entry.dap_kind {
-EntryKind::Scope(scope) => Some(scope),
+.filter_map(|entry| match &entry.entry {
+DapEntry::Scope(scope) => Some(scope),
_ => None,
})
.cloned()
@@ -785,10 +952,10 @@ impl VariableList {
let mut idx = 0;
for entry in self.entries.iter() {
-match &entry.dap_kind {
-EntryKind::Watcher { .. } => continue,
-EntryKind::Variable(dap) => scopes[idx].1.push(dap.clone()),
-EntryKind::Scope(scope) => {
+match &entry.entry {
+DapEntry::Watcher { .. } => continue,
+DapEntry::Variable(dap) => scopes[idx].1.push(dap.clone()),
+DapEntry::Scope(scope) => {
if scopes.len() > 0 {
idx += 1;
}
@@ -806,8 +973,8 @@ impl VariableList {
pub(crate) fn variables(&self) -> Vec<dap::Variable> {
self.entries
.iter()
-.filter_map(|entry| match &entry.dap_kind {
-EntryKind::Variable(variable) => Some(variable),
+.filter_map(|entry| match &entry.entry {
+DapEntry::Variable(variable) => Some(variable),
_ => None,
})
.cloned()
@@ -1358,6 +1525,8 @@ impl Render for VariableList {
.on_action(cx.listener(Self::edit_variable))
.on_action(cx.listener(Self::add_watcher))
.on_action(cx.listener(Self::remove_watcher))
+.on_action(cx.listener(Self::toggle_data_breakpoint))
+.on_action(cx.listener(Self::jump_to_variable_memory))
.child(
uniform_list(
"variable-list",

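The `jump_to_variable_memory` handler above chains several `Option` lookups through `util::maybe!`, which lets `?` bail out of the whole block instead of nesting `if let`s. A minimal self-contained sketch of that pattern; the `maybe!` macro and `Variable` struct here are local stand-ins for Zed's `util::maybe!` and the DAP variable type, not its actual definitions:

```rust
// Stand-in for util::maybe!: wrap a block in an immediately-invoked
// closure so `?` can early-return an Option mid-function.
macro_rules! maybe {
    ($($body:tt)*) => {
        (|| { $($body)* })()
    };
}

// Hypothetical slice of a DAP variable, just enough for the example.
#[derive(Debug)]
struct Variable {
    memory_reference: Option<String>,
    type_: Option<String>,
}

// Mirrors the handler's shape: each `?` returns None for a missing
// field instead of requiring a nested `if let` ladder.
fn memory_target(var: &Variable) -> Option<(String, String)> {
    maybe!({
        let mem_ref = var.memory_reference.as_deref()?;
        let ty = var.type_.as_deref()?;
        Some((mem_ref.to_string(), ty.to_string()))
    })
}

fn main() {
    let var = Variable {
        memory_reference: Some("0xdeadbeef".into()),
        type_: Some("int *".into()),
    };
    assert_eq!(
        memory_target(&var),
        Some(("0xdeadbeef".to_string(), "int *".to_string()))
    );
    let incomplete = Variable { memory_reference: None, type_: None };
    assert!(memory_target(&incomplete).is_none());
    println!("ok");
}
```

The `_ = maybe!({ ... })` form in the diff discards the resulting `Option`, so any missing field silently aborts the action.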

@@ -427,7 +427,7 @@ async fn test_handle_start_debugging_request(
let sessions = workspace
.update(cx, |workspace, _window, cx| {
let debug_panel = workspace.panel::<DebugPanel>(cx).unwrap();
-debug_panel.read(cx).sessions()
+debug_panel.read(cx).sessions().collect::<Vec<_>>()
})
.unwrap();
assert_eq!(sessions.len(), 1);
@@ -451,7 +451,7 @@ async fn test_handle_start_debugging_request(
.unwrap()
.read(cx)
.session(cx);
-let current_sessions = debug_panel.read(cx).sessions();
+let current_sessions = debug_panel.read(cx).sessions().collect::<Vec<_>>();
assert_eq!(active_session, current_sessions[1].read(cx).session(cx));
assert_eq!(
active_session.read(cx).parent_session(),
@@ -1796,7 +1796,7 @@ async fn test_debug_adapters_shutdown_on_app_quit(
let panel = workspace.panel::<DebugPanel>(cx).unwrap();
panel.read_with(cx, |panel, _| {
assert!(
-!panel.sessions().is_empty(),
+panel.sessions().next().is_some(),
"Debug session should be active"
);
});

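The test changes above follow from `DebugPanel::sessions` now yielding an iterator instead of an allocated `Vec`: call sites either `.collect()` when they need indexing or probe emptiness with `.next().is_some()`. A hedged sketch of that API shape; `DebugPanel` and `Session` here are illustrative stand-ins, not Zed's actual types:

```rust
struct Session {
    id: u32,
}

struct DebugPanel {
    sessions: Vec<Session>,
}

impl DebugPanel {
    // Returning `impl Iterator` avoids cloning into a fresh Vec on
    // every call; callers that need indexing collect explicitly.
    fn sessions(&self) -> impl Iterator<Item = &Session> {
        self.sessions.iter()
    }
}

fn main() {
    let panel = DebugPanel {
        sessions: vec![Session { id: 1 }, Session { id: 2 }],
    };
    // Emptiness check without allocating, as in the updated assertion:
    assert!(panel.sessions().next().is_some());
    // Indexing requires an explicit collect, as in the updated call sites:
    let all: Vec<_> = panel.sessions().collect();
    assert_eq!(all[1].id, 2);
    println!("ok");
}
```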

@@ -111,7 +111,6 @@ async fn test_module_list(executor: BackgroundExecutor, cx: &mut TestAppContext)
});
running_state.update_in(cx, |this, window, cx| {
this.ensure_pane_item(DebuggerPaneItem::Modules, window, cx);
this.activate_item(DebuggerPaneItem::Modules, window, cx);
cx.refresh_windows();
});


@@ -6,7 +6,7 @@ use editor::{
hover_popover::diagnostics_markdown_style,
};
use gpui::{AppContext, Entity, Focusable, WeakEntity};
-use language::{BufferId, Diagnostic, DiagnosticEntry, LanguageRegistry};
+use language::{BufferId, Diagnostic, DiagnosticEntry};
use lsp::DiagnosticSeverity;
use markdown::{Markdown, MarkdownElement};
use settings::Settings;
@@ -27,7 +27,6 @@ impl DiagnosticRenderer {
diagnostic_group: Vec<DiagnosticEntry<Point>>,
buffer_id: BufferId,
diagnostics_editor: Option<WeakEntity<ProjectDiagnosticsEditor>>,
-languages: Arc<LanguageRegistry>,
cx: &mut App,
) -> Vec<DiagnosticBlock> {
let Some(primary_ix) = diagnostic_group
@@ -80,9 +79,7 @@ impl DiagnosticRenderer {
initial_range: primary.range.clone(),
severity: primary.diagnostic.severity,
diagnostics_editor: diagnostics_editor.clone(),
-markdown: cx.new(|cx| {
-Markdown::new(markdown.into(), Some(languages.clone()), None, cx)
-}),
+markdown: cx.new(|cx| Markdown::new(markdown.into(), None, None, cx)),
});
} else if entry.range.start.row.abs_diff(primary.range.start.row) < 5 {
let markdown = Self::markdown(&entry.diagnostic);
@@ -91,9 +88,7 @@ impl DiagnosticRenderer {
initial_range: entry.range.clone(),
severity: entry.diagnostic.severity,
diagnostics_editor: diagnostics_editor.clone(),
-markdown: cx.new(|cx| {
-Markdown::new(markdown.into(), Some(languages.clone()), None, cx)
-}),
+markdown: cx.new(|cx| Markdown::new(markdown.into(), None, None, cx)),
});
} else {
let mut markdown = Self::markdown(&entry.diagnostic);
@@ -105,9 +100,7 @@ impl DiagnosticRenderer {
initial_range: entry.range.clone(),
severity: entry.diagnostic.severity,
diagnostics_editor: diagnostics_editor.clone(),
-markdown: cx.new(|cx| {
-Markdown::new(markdown.into(), Some(languages.clone()), None, cx)
-}),
+markdown: cx.new(|cx| Markdown::new(markdown.into(), None, None, cx)),
});
}
}
@@ -134,11 +127,9 @@ impl editor::DiagnosticRenderer for DiagnosticRenderer {
buffer_id: BufferId,
snapshot: EditorSnapshot,
editor: WeakEntity<Editor>,
-languages: Arc<LanguageRegistry>,
cx: &mut App,
) -> Vec<BlockProperties<Anchor>> {
-let blocks =
-Self::diagnostic_blocks_for_group(diagnostic_group, buffer_id, None, languages, cx);
+let blocks = Self::diagnostic_blocks_for_group(diagnostic_group, buffer_id, None, cx);
blocks
.into_iter()
.map(|block| {
@@ -153,7 +144,6 @@ impl editor::DiagnosticRenderer for DiagnosticRenderer {
style: BlockStyle::Flex,
render: Arc::new(move |bcx| block.render_block(editor.clone(), bcx)),
priority: 1,
-render_in_minimap: false,
}
})
.collect()
@@ -164,11 +154,9 @@ impl editor::DiagnosticRenderer for DiagnosticRenderer {
diagnostic_group: Vec<DiagnosticEntry<Point>>,
range: Range<Point>,
buffer_id: BufferId,
-languages: Arc<LanguageRegistry>,
cx: &mut App,
) -> Option<Entity<Markdown>> {
-let blocks =
-Self::diagnostic_blocks_for_group(diagnostic_group, buffer_id, None, languages, cx);
+let blocks = Self::diagnostic_blocks_for_group(diagnostic_group, buffer_id, None, cx);
blocks.into_iter().find_map(|block| {
if block.initial_range == range {
Some(block.markdown)


@@ -508,15 +508,6 @@ impl ProjectDiagnosticsEditor {
window: &mut Window,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
-let languages = self
-.editor
-.read(cx)
-.project
-.as_ref()
-.unwrap()
-.read(cx)
-.languages()
-.clone();
let was_empty = self.multibuffer.read(cx).is_empty();
let buffer_snapshot = buffer.read(cx).snapshot();
let buffer_id = buffer_snapshot.remote_id();
@@ -568,7 +559,6 @@ impl ProjectDiagnosticsEditor {
group,
buffer_snapshot.remote_id(),
Some(this.clone()),
-languages.clone(),
cx,
)
})?;
@@ -666,7 +656,6 @@ impl ProjectDiagnosticsEditor {
block.render_block(editor.clone(), bcx)
}),
priority: 1,
-render_in_minimap: false,
}
});
let block_ids = this.editor.update(cx, |editor, cx| {


@@ -14,7 +14,10 @@ use indoc::indoc;
use language::{DiagnosticSourceKind, Rope};
use lsp::LanguageServerId;
use pretty_assertions::assert_eq;
-use project::FakeFs;
+use project::{
+FakeFs,
+project_settings::{GoToDiagnosticSeverity, GoToDiagnosticSeverityFilter},
+};
use rand::{Rng, rngs::StdRng, seq::IteratorRandom as _};
use serde_json::json;
use settings::SettingsStore;
@@ -1005,7 +1008,7 @@ async fn active_diagnostics_dismiss_after_invalidation(cx: &mut TestAppContext)
cx.run_until_parked();
cx.update_editor(|editor, window, cx| {
-editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
+editor.go_to_diagnostic(&GoToDiagnostic::default(), window, cx);
assert_eq!(
editor
.active_diagnostic_group()
@@ -1047,7 +1050,7 @@ async fn active_diagnostics_dismiss_after_invalidation(cx: &mut TestAppContext)
"});
cx.update_editor(|editor, window, cx| {
-editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
+editor.go_to_diagnostic(&GoToDiagnostic::default(), window, cx);
assert_eq!(editor.active_diagnostic_group(), None);
});
cx.assert_editor_state(indoc! {"
@@ -1126,7 +1129,7 @@ async fn cycle_through_same_place_diagnostics(cx: &mut TestAppContext) {
// Fourth diagnostic
cx.update_editor(|editor, window, cx| {
-editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
+editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc def: i32) -> ˇu32 {
@@ -1135,7 +1138,7 @@ async fn cycle_through_same_place_diagnostics(cx: &mut TestAppContext) {
// Third diagnostic
cx.update_editor(|editor, window, cx| {
-editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
+editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc ˇdef: i32) -> u32 {
@@ -1144,7 +1147,7 @@ async fn cycle_through_same_place_diagnostics(cx: &mut TestAppContext) {
// Second diagnostic, same place
cx.update_editor(|editor, window, cx| {
-editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
+editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc ˇdef: i32) -> u32 {
@@ -1153,7 +1156,7 @@ async fn cycle_through_same_place_diagnostics(cx: &mut TestAppContext) {
// First diagnostic
cx.update_editor(|editor, window, cx| {
-editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
+editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abcˇ def: i32) -> u32 {
@@ -1162,7 +1165,7 @@ async fn cycle_through_same_place_diagnostics(cx: &mut TestAppContext) {
// Wrapped over, fourth diagnostic
cx.update_editor(|editor, window, cx| {
-editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
+editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc def: i32) -> ˇu32 {
@@ -1181,7 +1184,7 @@ async fn cycle_through_same_place_diagnostics(cx: &mut TestAppContext) {
// First diagnostic
cx.update_editor(|editor, window, cx| {
-editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
+editor.go_to_diagnostic(&GoToDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abcˇ def: i32) -> u32 {
@@ -1190,7 +1193,7 @@ async fn cycle_through_same_place_diagnostics(cx: &mut TestAppContext) {
// Second diagnostic
cx.update_editor(|editor, window, cx| {
-editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
+editor.go_to_diagnostic(&GoToDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc ˇdef: i32) -> u32 {
@@ -1199,7 +1202,7 @@ async fn cycle_through_same_place_diagnostics(cx: &mut TestAppContext) {
// Third diagnostic, same place
cx.update_editor(|editor, window, cx| {
-editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
+editor.go_to_diagnostic(&GoToDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc ˇdef: i32) -> u32 {
@@ -1208,7 +1211,7 @@ async fn cycle_through_same_place_diagnostics(cx: &mut TestAppContext) {
// Fourth diagnostic
cx.update_editor(|editor, window, cx| {
-editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
+editor.go_to_diagnostic(&GoToDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abc def: i32) -> ˇu32 {
@@ -1217,7 +1220,7 @@ async fn cycle_through_same_place_diagnostics(cx: &mut TestAppContext) {
// Wrapped around, first diagnostic
cx.update_editor(|editor, window, cx| {
-editor.go_to_diagnostic(&GoToDiagnostic, window, cx);
+editor.go_to_diagnostic(&GoToDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
fn func(abcˇ def: i32) -> u32 {
@@ -1441,6 +1444,128 @@ async fn test_diagnostics_with_code(cx: &mut TestAppContext) {
);
}
#[gpui::test]
async fn go_to_diagnostic_with_severity(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
let lsp_store =
cx.update_editor(|editor, _, cx| editor.project.as_ref().unwrap().read(cx).lsp_store());
cx.set_state(indoc! {"error warning info hiˇnt"});
cx.update(|_, cx| {
lsp_store.update(cx, |lsp_store, cx| {
lsp_store
.update_diagnostics(
LanguageServerId(0),
lsp::PublishDiagnosticsParams {
uri: lsp::Url::from_file_path(path!("/root/file")).unwrap(),
version: None,
diagnostics: vec![
lsp::Diagnostic {
range: lsp::Range::new(
lsp::Position::new(0, 0),
lsp::Position::new(0, 5),
),
severity: Some(lsp::DiagnosticSeverity::ERROR),
..Default::default()
},
lsp::Diagnostic {
range: lsp::Range::new(
lsp::Position::new(0, 6),
lsp::Position::new(0, 13),
),
severity: Some(lsp::DiagnosticSeverity::WARNING),
..Default::default()
},
lsp::Diagnostic {
range: lsp::Range::new(
lsp::Position::new(0, 14),
lsp::Position::new(0, 18),
),
severity: Some(lsp::DiagnosticSeverity::INFORMATION),
..Default::default()
},
lsp::Diagnostic {
range: lsp::Range::new(
lsp::Position::new(0, 19),
lsp::Position::new(0, 23),
),
severity: Some(lsp::DiagnosticSeverity::HINT),
..Default::default()
},
],
},
None,
DiagnosticSourceKind::Pushed,
&[],
cx,
)
.unwrap()
});
});
cx.run_until_parked();
macro_rules! go {
($severity:expr) => {
cx.update_editor(|editor, window, cx| {
editor.go_to_diagnostic(
&GoToDiagnostic {
severity: $severity,
},
window,
cx,
);
});
};
}
// Default, should cycle through all diagnostics
go!(GoToDiagnosticSeverityFilter::default());
cx.assert_editor_state(indoc! {"ˇerror warning info hint"});
go!(GoToDiagnosticSeverityFilter::default());
cx.assert_editor_state(indoc! {"error ˇwarning info hint"});
go!(GoToDiagnosticSeverityFilter::default());
cx.assert_editor_state(indoc! {"error warning ˇinfo hint"});
go!(GoToDiagnosticSeverityFilter::default());
cx.assert_editor_state(indoc! {"error warning info ˇhint"});
go!(GoToDiagnosticSeverityFilter::default());
cx.assert_editor_state(indoc! {"ˇerror warning info hint"});
let only_info = GoToDiagnosticSeverityFilter::Only(GoToDiagnosticSeverity::Information);
go!(only_info);
cx.assert_editor_state(indoc! {"error warning ˇinfo hint"});
go!(only_info);
cx.assert_editor_state(indoc! {"error warning ˇinfo hint"});
let no_hints = GoToDiagnosticSeverityFilter::Range {
min: GoToDiagnosticSeverity::Information,
max: GoToDiagnosticSeverity::Error,
};
go!(no_hints);
cx.assert_editor_state(indoc! {"ˇerror warning info hint"});
go!(no_hints);
cx.assert_editor_state(indoc! {"error ˇwarning info hint"});
go!(no_hints);
cx.assert_editor_state(indoc! {"error warning ˇinfo hint"});
go!(no_hints);
cx.assert_editor_state(indoc! {"ˇerror warning info hint"});
let warning_info = GoToDiagnosticSeverityFilter::Range {
min: GoToDiagnosticSeverity::Information,
max: GoToDiagnosticSeverity::Warning,
};
go!(warning_info);
cx.assert_editor_state(indoc! {"error ˇwarning info hint"});
go!(warning_info);
cx.assert_editor_state(indoc! {"error warning ˇinfo hint"});
go!(warning_info);
cx.assert_editor_state(indoc! {"error ˇwarning info hint"});
}
fn init_test(cx: &mut TestAppContext) {
cx.update(|cx| {
zlog::init_test();

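The assertions above exercise three filter shapes: the default (match everything), `Only(severity)`, and an inclusive `Range { min, max }`. A minimal model of that matching logic; the variant names follow the test, but the enum below is an inferred sketch, not the actual `GoToDiagnosticSeverityFilter` from Zed's `project` crate:

```rust
// Ordered least-severe to most-severe so derived Ord gives Hint < Error.
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Debug)]
enum Severity {
    Hint,
    Information,
    Warning,
    Error,
}

#[derive(Clone, Copy, Debug)]
enum SeverityFilter {
    All,
    Only(Severity),
    // Inclusive bounds: `min` is the least severe level still matched.
    Range { min: Severity, max: Severity },
}

impl SeverityFilter {
    fn matches(self, severity: Severity) -> bool {
        match self {
            SeverityFilter::All => true,
            SeverityFilter::Only(s) => severity == s,
            SeverityFilter::Range { min, max } => min <= severity && severity <= max,
        }
    }
}

fn main() {
    use Severity::*;
    // "No hints": information through error, as in the test above.
    let no_hints = SeverityFilter::Range { min: Information, max: Error };
    assert!(no_hints.matches(Error));
    assert!(no_hints.matches(Information));
    assert!(!no_hints.matches(Hint));
    // `Only` pins the cursor to one level, skipping everything else.
    assert!(SeverityFilter::Only(Information).matches(Information));
    assert!(!SeverityFilter::Only(Information).matches(Warning));
    assert!(SeverityFilter::All.matches(Hint));
    println!("ok");
}
```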

@@ -6,7 +6,7 @@ use gpui::{
WeakEntity, Window,
};
use language::Diagnostic;
-use project::project_settings::ProjectSettings;
+use project::project_settings::{GoToDiagnosticSeverityFilter, ProjectSettings};
use settings::Settings;
use ui::{Button, ButtonLike, Color, Icon, IconName, Label, Tooltip, h_flex, prelude::*};
use workspace::{StatusItemView, ToolbarItemEvent, Workspace, item::ItemHandle};
@@ -77,7 +77,7 @@ impl Render for DiagnosticIndicator {
.tooltip(|window, cx| {
Tooltip::for_action(
"Next Diagnostic",
-&editor::actions::GoToDiagnostic,
+&editor::actions::GoToDiagnostic::default(),
window,
cx,
)
@@ -156,7 +156,12 @@ impl DiagnosticIndicator {
fn go_to_next_diagnostic(&mut self, window: &mut Window, cx: &mut Context<Self>) {
if let Some(editor) = self.active_editor.as_ref().and_then(|e| e.upgrade()) {
editor.update(cx, |editor, cx| {
-editor.go_to_diagnostic_impl(editor::Direction::Next, window, cx);
+editor.go_to_diagnostic_impl(
+editor::Direction::Next,
+GoToDiagnosticSeverityFilter::default(),
+window,
+cx,
+);
})
}
}


@@ -243,7 +243,6 @@ struct ActionDef {
fn dump_all_gpui_actions() -> Vec<ActionDef> {
let mut actions = gpui::generate_list_of_all_registered_actions()
.into_iter()
.map(|action| ActionDef {
name: action.name,
human_name: command_palette::humanize_action_name(action.name),


@@ -1,6 +1,7 @@
//! This module contains all actions supported by [`Editor`].
use super::*;
use gpui::{Action, actions};
+use project::project_settings::GoToDiagnosticSeverityFilter;
use schemars::JsonSchema;
use util::serde::default_true;
@@ -265,6 +266,24 @@ pub enum UuidVersion {
V7,
}
+/// Goes to the next diagnostic in the file.
+#[derive(PartialEq, Clone, Default, Debug, Deserialize, JsonSchema, Action)]
+#[action(namespace = editor)]
+#[serde(deny_unknown_fields)]
+pub struct GoToDiagnostic {
+#[serde(default)]
+pub severity: GoToDiagnosticSeverityFilter,
+}
+/// Goes to the previous diagnostic in the file.
+#[derive(PartialEq, Clone, Default, Debug, Deserialize, JsonSchema, Action)]
+#[action(namespace = editor)]
+#[serde(deny_unknown_fields)]
+pub struct GoToPreviousDiagnostic {
+#[serde(default)]
+pub severity: GoToDiagnosticSeverityFilter,
+}
actions!(
debugger,
[
@@ -410,6 +429,8 @@ actions!(
ToggleFold,
/// Toggles recursive folding at the current position.
ToggleFoldRecursive,
+/// Toggles all folds in a buffer, or all excerpts in a multibuffer.
+ToggleFoldAll,
/// Formats the entire document.
Format,
/// Formats only the selected text.
@@ -422,8 +443,6 @@ actions!(
GoToDefinition,
/// Goes to definition in a split pane.
GoToDefinitionSplit,
-/// Goes to the next diagnostic in the file.
-GoToDiagnostic,
/// Goes to the next diff hunk.
GoToHunk,
/// Goes to the previous diff hunk.
@@ -438,8 +457,6 @@ actions!(
GoToParentModule,
/// Goes to the previous change in the file.
GoToPreviousChange,
-/// Goes to the previous diagnostic in the file.
-GoToPreviousDiagnostic,
/// Goes to the type definition of the symbol at cursor.
GoToTypeDefinition,
/// Goes to type definition in a split pane.


@@ -271,7 +271,6 @@ impl DisplayMap {
height: Some(height),
style,
priority,
-render_in_minimap: true,
}
}),
);
@@ -1663,7 +1662,6 @@ pub mod tests {
height: Some(height),
render: Arc::new(|_| div().into_any()),
priority,
-render_in_minimap: true,
}
})
.collect::<Vec<_>>();
@@ -2029,7 +2027,6 @@ pub mod tests {
style: BlockStyle::Sticky,
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
}],
cx,
);
@@ -2227,7 +2224,6 @@ pub mod tests {
style: BlockStyle::Sticky,
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
placement: BlockPlacement::Below(
@@ -2237,7 +2233,6 @@ pub mod tests {
style: BlockStyle::Sticky,
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
],
cx,
@@ -2344,7 +2339,6 @@ pub mod tests {
style: BlockStyle::Sticky,
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
}],
cx,
)
@@ -2420,7 +2414,6 @@ pub mod tests {
style: BlockStyle::Fixed,
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
}],
cx,
);


@@ -193,7 +193,6 @@ pub struct CustomBlock {
style: BlockStyle,
render: Arc<Mutex<RenderBlock>>,
priority: usize,
-pub(crate) render_in_minimap: bool,
}
#[derive(Clone)]
@@ -205,7 +204,6 @@ pub struct BlockProperties<P> {
pub style: BlockStyle,
pub render: RenderBlock,
pub priority: usize,
pub render_in_minimap: bool,
}
impl<P: Debug> Debug for BlockProperties<P> {
@@ -1044,7 +1042,6 @@ impl BlockMapWriter<'_> {
render: Arc::new(Mutex::new(block.render)),
style: block.style,
priority: block.priority,
-render_in_minimap: block.render_in_minimap,
});
self.0.custom_blocks.insert(block_ix, new_block.clone());
self.0.custom_blocks_by_id.insert(id, new_block);
@@ -1079,7 +1076,6 @@ impl BlockMapWriter<'_> {
style: block.style,
render: block.render.clone(),
priority: block.priority,
-render_in_minimap: block.render_in_minimap,
};
let new_block = Arc::new(new_block);
*block = new_block.clone();
@@ -1976,7 +1972,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
style: BlockStyle::Fixed,
@@ -1984,7 +1979,6 @@ mod tests {
height: Some(2),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
style: BlockStyle::Fixed,
@@ -1992,7 +1986,6 @@ mod tests {
height: Some(3),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
]);
@@ -2217,7 +2210,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
style: BlockStyle::Fixed,
@@ -2225,7 +2217,6 @@ mod tests {
height: Some(2),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
style: BlockStyle::Fixed,
@@ -2233,7 +2224,6 @@ mod tests {
height: Some(3),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
]);
@@ -2322,7 +2312,6 @@ mod tests {
render: Arc::new(|_| div().into_any()),
height: Some(1),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
style: BlockStyle::Fixed,
@@ -2330,7 +2319,6 @@ mod tests {
render: Arc::new(|_| div().into_any()),
height: Some(1),
priority: 0,
render_in_minimap: true,
},
]);
@@ -2370,7 +2358,6 @@ mod tests {
height: Some(4),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
}])[0];
let blocks_snapshot = block_map.read(wraps_snapshot, Default::default());
@@ -2424,7 +2411,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
style: BlockStyle::Fixed,
@@ -2432,7 +2418,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
style: BlockStyle::Fixed,
@@ -2440,7 +2425,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
]);
let blocks_snapshot = block_map.read(wraps_snapshot.clone(), Default::default());
@@ -2455,7 +2439,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
style: BlockStyle::Fixed,
@@ -2463,7 +2446,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
style: BlockStyle::Fixed,
@@ -2471,7 +2453,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
]);
let blocks_snapshot = block_map.read(wraps_snapshot.clone(), Default::default());
@@ -2571,7 +2552,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
style: BlockStyle::Fixed,
@@ -2579,7 +2559,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
style: BlockStyle::Fixed,
@@ -2587,7 +2566,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
]);
let excerpt_blocks_3 = writer.insert(vec![
@@ -2597,7 +2575,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
BlockProperties {
style: BlockStyle::Fixed,
@@ -2605,7 +2582,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
},
]);
@@ -2653,7 +2629,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
}]);
let blocks_snapshot = block_map.read(wrap_snapshot.clone(), Patch::default());
let blocks = blocks_snapshot
@@ -3011,7 +2986,6 @@ mod tests {
height: Some(height),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
}
})
.collect::<Vec<_>>();
@@ -3032,7 +3006,6 @@ mod tests {
style: props.style,
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
}));
for (block_properties, block_id) in block_properties.iter().zip(block_ids) {
@@ -3557,7 +3530,6 @@ mod tests {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
}])[0];
let blocks_snapshot = block_map.read(wraps_snapshot.clone(), Default::default());


@@ -111,9 +111,8 @@ use itertools::Itertools;
use language::{
AutoindentMode, BracketMatch, BracketPair, Buffer, Capability, CharKind, CodeLabel,
CursorShape, DiagnosticEntry, DiffOptions, DocumentationConfig, EditPredictionsMode,
EditPreview, HighlightedText, IndentKind, IndentSize, Language, LanguageRegistry,
OffsetRangeExt, Point, Selection, SelectionGoal, TextObject, TransactionId, TreeSitterOptions,
WordsQuery,
EditPreview, HighlightedText, IndentKind, IndentSize, Language, OffsetRangeExt, Point,
Selection, SelectionGoal, TextObject, TransactionId, TreeSitterOptions, WordsQuery,
language_settings::{
self, InlayHintSettings, LspInsertMode, RewrapBehavior, WordsCompletionMode,
all_language_settings, language_settings,
@@ -135,7 +134,7 @@ use project::{
session::{Session, SessionEvent},
},
git_store::{GitStoreEvent, RepositoryEvent},
project_settings::DiagnosticSeverity,
project_settings::{DiagnosticSeverity, GoToDiagnosticSeverityFilter},
};
pub use git::blame::BlameRenderer;
@@ -403,7 +402,6 @@ pub trait DiagnosticRenderer {
buffer_id: BufferId,
snapshot: EditorSnapshot,
editor: WeakEntity<Editor>,
languages: Arc<LanguageRegistry>,
cx: &mut App,
) -> Vec<BlockProperties<Anchor>>;
@@ -412,7 +410,6 @@ pub trait DiagnosticRenderer {
diagnostic_group: Vec<DiagnosticEntry<Point>>,
range: Range<Point>,
buffer_id: BufferId,
languages: Arc<LanguageRegistry>,
cx: &mut App,
) -> Option<Entity<markdown::Markdown>>;
@@ -1798,6 +1795,7 @@ impl Editor {
);
let full_mode = mode.is_full();
let is_minimap = mode.is_minimap();
let diagnostics_max_severity = if full_mode {
EditorSettings::get_global(cx)
.diagnostics_max_severity
@@ -1858,13 +1856,19 @@ impl Editor {
let selections = SelectionsCollection::new(display_map.clone(), buffer.clone());
let blink_manager = cx.new(|cx| BlinkManager::new(CURSOR_BLINK_INTERVAL, cx));
let blink_manager = cx.new(|cx| {
let mut blink_manager = BlinkManager::new(CURSOR_BLINK_INTERVAL, cx);
if is_minimap {
blink_manager.disable(cx);
}
blink_manager
});
let soft_wrap_mode_override = matches!(mode, EditorMode::SingleLine { .. })
.then(|| language_settings::SoftWrap::None);
let mut project_subscriptions = Vec::new();
if mode.is_full() {
if full_mode {
if let Some(project) = project.as_ref() {
project_subscriptions.push(cx.subscribe_in(
project,
@@ -1975,18 +1979,23 @@ impl Editor {
let inlay_hint_settings =
inlay_hint_settings(selections.newest_anchor().head(), &buffer_snapshot, cx);
let focus_handle = cx.focus_handle();
cx.on_focus(&focus_handle, window, Self::handle_focus)
.detach();
cx.on_focus_in(&focus_handle, window, Self::handle_focus_in)
.detach();
cx.on_focus_out(&focus_handle, window, Self::handle_focus_out)
.detach();
cx.on_blur(&focus_handle, window, Self::handle_blur)
.detach();
cx.observe_pending_input(window, Self::observe_pending_input)
.detach();
if !is_minimap {
cx.on_focus(&focus_handle, window, Self::handle_focus)
.detach();
cx.on_focus_in(&focus_handle, window, Self::handle_focus_in)
.detach();
cx.on_focus_out(&focus_handle, window, Self::handle_focus_out)
.detach();
cx.on_blur(&focus_handle, window, Self::handle_blur)
.detach();
cx.observe_pending_input(window, Self::observe_pending_input)
.detach();
}
let show_indent_guides = if matches!(mode, EditorMode::SingleLine { .. }) {
let show_indent_guides = if matches!(
mode,
EditorMode::SingleLine { .. } | EditorMode::Minimap { .. }
) {
Some(false)
} else {
None
@@ -2052,10 +2061,10 @@ impl Editor {
minimap_visibility: MinimapVisibility::for_mode(&mode, cx),
offset_content: !matches!(mode, EditorMode::SingleLine { .. }),
show_breadcrumbs: EditorSettings::get_global(cx).toolbar.breadcrumbs,
show_gutter: mode.is_full(),
show_line_numbers: None,
show_gutter: full_mode,
show_line_numbers: (!full_mode).then_some(false),
use_relative_line_numbers: None,
disable_expand_excerpt_buttons: false,
disable_expand_excerpt_buttons: !full_mode,
show_git_diff_gutter: None,
show_code_actions: None,
show_runnables: None,
@@ -2089,7 +2098,7 @@ impl Editor {
document_highlights_task: None,
linked_editing_range_task: None,
pending_rename: None,
searchable: true,
searchable: !is_minimap,
cursor_shape: EditorSettings::get_global(cx)
.cursor_shape
.unwrap_or_default(),
@@ -2097,9 +2106,9 @@ impl Editor {
autoindent_mode: Some(AutoindentMode::EachLine),
collapse_matches: false,
workspace: None,
input_enabled: true,
use_modal_editing: mode.is_full(),
read_only: mode.is_minimap(),
input_enabled: !is_minimap,
use_modal_editing: full_mode,
read_only: is_minimap,
use_autoclose: true,
use_auto_surround: true,
auto_replace_emoji_shortcode: false,
@@ -2115,11 +2124,10 @@ impl Editor {
edit_prediction_preview: EditPredictionPreview::Inactive {
released_too_fast: false,
},
inline_diagnostics_enabled: mode.is_full(),
diagnostics_enabled: mode.is_full(),
inline_diagnostics_enabled: full_mode,
diagnostics_enabled: full_mode,
inline_value_cache: InlineValueCache::new(inlay_hint_settings.show_value_hints),
inlay_hint_cache: InlayHintCache::new(inlay_hint_settings),
gutter_hovered: false,
pixel_position_of_newest_cursor: None,
last_bounds: None,
@@ -2142,9 +2150,10 @@ impl Editor {
show_git_blame_inline: false,
show_selection_menu: None,
show_git_blame_inline_delay_task: None,
git_blame_inline_enabled: ProjectSettings::get_global(cx).git.inline_blame_enabled(),
git_blame_inline_enabled: full_mode
&& ProjectSettings::get_global(cx).git.inline_blame_enabled(),
render_diff_hunk_controls: Arc::new(render_diff_hunk_controls),
serialize_dirty_buffers: !mode.is_minimap()
serialize_dirty_buffers: !is_minimap
&& ProjectSettings::get_global(cx)
.session
.restore_unsaved_buffers,
@@ -2155,27 +2164,31 @@ impl Editor {
breakpoint_store,
gutter_breakpoint_indicator: (None, None),
hovered_diff_hunk_row: None,
_subscriptions: vec![
cx.observe(&buffer, Self::on_buffer_changed),
cx.subscribe_in(&buffer, window, Self::on_buffer_event),
cx.observe_in(&display_map, window, Self::on_display_map_changed),
cx.observe(&blink_manager, |_, _, cx| cx.notify()),
cx.observe_global_in::<SettingsStore>(window, Self::settings_changed),
observe_buffer_font_size_adjustment(cx, |_, cx| cx.notify()),
cx.observe_window_activation(window, |editor, window, cx| {
let active = window.is_window_active();
editor.blink_manager.update(cx, |blink_manager, cx| {
if active {
blink_manager.enable(cx);
} else {
blink_manager.disable(cx);
}
});
if active {
editor.show_mouse_cursor(cx);
}
}),
],
_subscriptions: (!is_minimap)
.then(|| {
vec![
cx.observe(&buffer, Self::on_buffer_changed),
cx.subscribe_in(&buffer, window, Self::on_buffer_event),
cx.observe_in(&display_map, window, Self::on_display_map_changed),
cx.observe(&blink_manager, |_, _, cx| cx.notify()),
cx.observe_global_in::<SettingsStore>(window, Self::settings_changed),
observe_buffer_font_size_adjustment(cx, |_, cx| cx.notify()),
cx.observe_window_activation(window, |editor, window, cx| {
let active = window.is_window_active();
editor.blink_manager.update(cx, |blink_manager, cx| {
if active {
blink_manager.enable(cx);
} else {
blink_manager.disable(cx);
}
});
if active {
editor.show_mouse_cursor(cx);
}
}),
]
})
.unwrap_or_default(),
tasks_update_task: None,
pull_diagnostics_task: Task::ready(()),
colors: None,
@@ -2206,6 +2219,11 @@ impl Editor {
selection_drag_state: SelectionDragState::None,
folding_newlines: Task::ready(()),
};
if is_minimap {
return editor;
}
if let Some(breakpoints) = editor.breakpoint_store.as_ref() {
editor
._subscriptions
@@ -10448,7 +10466,6 @@ impl Editor {
cloned_prompt.clone().into_any_element()
}),
priority: 0,
render_in_minimap: true,
}];
let focus_handle = bp_prompt.focus_handle(cx);
@@ -15069,7 +15086,7 @@ impl Editor {
pub fn go_to_diagnostic(
&mut self,
_: &GoToDiagnostic,
action: &GoToDiagnostic,
window: &mut Window,
cx: &mut Context<Self>,
) {
@@ -15077,12 +15094,12 @@ impl Editor {
return;
}
self.hide_mouse_cursor(HideMouseCursorOrigin::MovementAction, cx);
self.go_to_diagnostic_impl(Direction::Next, window, cx)
self.go_to_diagnostic_impl(Direction::Next, action.severity, window, cx)
}
pub fn go_to_prev_diagnostic(
&mut self,
_: &GoToPreviousDiagnostic,
action: &GoToPreviousDiagnostic,
window: &mut Window,
cx: &mut Context<Self>,
) {
@@ -15090,12 +15107,13 @@ impl Editor {
return;
}
self.hide_mouse_cursor(HideMouseCursorOrigin::MovementAction, cx);
self.go_to_diagnostic_impl(Direction::Prev, window, cx)
self.go_to_diagnostic_impl(Direction::Prev, action.severity, window, cx)
}
pub fn go_to_diagnostic_impl(
&mut self,
direction: Direction,
severity: GoToDiagnosticSeverityFilter,
window: &mut Window,
cx: &mut Context<Self>,
) {
@@ -15111,9 +15129,11 @@ impl Editor {
fn filtered(
snapshot: EditorSnapshot,
severity: GoToDiagnosticSeverityFilter,
diagnostics: impl Iterator<Item = DiagnosticEntry<usize>>,
) -> impl Iterator<Item = DiagnosticEntry<usize>> {
diagnostics
.filter(move |entry| severity.matches(entry.diagnostic.severity))
.filter(|entry| entry.range.start != entry.range.end)
.filter(|entry| !entry.diagnostic.is_unnecessary)
.filter(move |entry| !snapshot.intersects_fold(entry.range.start))
@@ -15122,12 +15142,14 @@ impl Editor {
let snapshot = self.snapshot(window, cx);
let before = filtered(
snapshot.clone(),
severity,
buffer
.diagnostics_in_range(0..selection.start)
.filter(|entry| entry.range.start <= selection.start),
);
let after = filtered(
snapshot,
severity,
buffer
.diagnostics_in_range(selection.start..buffer.len())
.filter(|entry| entry.range.start >= selection.start),
@@ -16144,7 +16166,6 @@ impl Editor {
}
}),
priority: 0,
render_in_minimap: true,
}],
Some(Autoscroll::fit()),
cx,
@@ -16577,20 +16598,13 @@ impl Editor {
let Some(renderer) = GlobalDiagnosticRenderer::global(cx) else {
return;
};
let languages = self.project.as_ref().unwrap().read(cx).languages().clone();
let diagnostic_group = buffer
.diagnostic_group(buffer_id, diagnostic.diagnostic.group_id)
.collect::<Vec<_>>();
let blocks = renderer.render_group(
diagnostic_group,
buffer_id,
snapshot,
cx.weak_entity(),
languages,
cx,
);
let blocks =
renderer.render_group(diagnostic_group, buffer_id, snapshot, cx.weak_entity(), cx);
let blocks = self.display_map.update(cx, |display_map, cx| {
display_map.insert_blocks(blocks, cx).into_iter().collect()
@@ -17085,6 +17099,46 @@ impl Editor {
}
}
pub fn toggle_fold_all(
&mut self,
_: &actions::ToggleFoldAll,
window: &mut Window,
cx: &mut Context<Self>,
) {
if self.buffer.read(cx).is_singleton() {
let display_map = self.display_map.update(cx, |map, cx| map.snapshot(cx));
let has_folds = display_map
.folds_in_range(0..display_map.buffer_snapshot.len())
.next()
.is_some();
if has_folds {
self.unfold_all(&actions::UnfoldAll, window, cx);
} else {
self.fold_all(&actions::FoldAll, window, cx);
}
} else {
let buffer_ids = self.buffer.read(cx).excerpt_buffer_ids();
let should_unfold = buffer_ids
.iter()
.any(|buffer_id| self.is_buffer_folded(*buffer_id, cx));
self.toggle_fold_multiple_buffers = cx.spawn_in(window, async move |editor, cx| {
editor
.update_in(cx, |editor, _, cx| {
for buffer_id in buffer_ids {
if should_unfold {
editor.unfold_buffer(buffer_id, cx);
} else {
editor.fold_buffer(buffer_id, cx);
}
}
})
.ok();
});
}
}
fn fold_at_level(
&mut self,
fold_at: &FoldAtLevel,
@@ -18014,7 +18068,7 @@ impl Editor {
parent: cx.weak_entity(),
},
self.buffer.clone(),
self.project.clone(),
None,
Some(self.display_map.clone()),
window,
cx,
@@ -19898,14 +19952,12 @@ impl Editor {
}
fn settings_changed(&mut self, window: &mut Window, cx: &mut Context<Self>) {
let new_severity = if self.diagnostics_enabled() {
EditorSettings::get_global(cx)
if self.diagnostics_enabled() {
let new_severity = EditorSettings::get_global(cx)
.diagnostics_max_severity
.unwrap_or(DiagnosticSeverity::Hint)
} else {
DiagnosticSeverity::Off
};
self.set_max_diagnostics_severity(new_severity, cx);
.unwrap_or(DiagnosticSeverity::Hint);
self.set_max_diagnostics_severity(new_severity, cx);
}
self.tasks_update_task = Some(self.refresh_runnables(window, cx));
self.update_edit_prediction_settings(cx);
self.refresh_inline_completion(true, false, window, cx);


@@ -5081,7 +5081,6 @@ fn test_move_line_up_down_with_blocks(cx: &mut TestAppContext) {
height: Some(1),
render: Arc::new(|_| div().into_any()),
priority: 0,
render_in_minimap: true,
}],
Some(Autoscroll::fit()),
cx,
@@ -5124,7 +5123,6 @@ async fn test_selections_and_replace_blocks(cx: &mut TestAppContext) {
style: BlockStyle::Sticky,
render: Arc::new(|_| gpui::div().into_any_element()),
priority: 0,
render_in_minimap: true,
}],
None,
cx,
@@ -14736,7 +14734,7 @@ async fn go_to_prev_overlapping_diagnostic(executor: BackgroundExecutor, cx: &mu
executor.run_until_parked();
cx.update_editor(|editor, window, cx| {
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
@@ -14745,7 +14743,7 @@ async fn go_to_prev_overlapping_diagnostic(executor: BackgroundExecutor, cx: &mu
"});
cx.update_editor(|editor, window, cx| {
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
@@ -14754,7 +14752,7 @@ async fn go_to_prev_overlapping_diagnostic(executor: BackgroundExecutor, cx: &mu
"});
cx.update_editor(|editor, window, cx| {
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
@@ -14763,7 +14761,7 @@ async fn go_to_prev_overlapping_diagnostic(executor: BackgroundExecutor, cx: &mu
"});
cx.update_editor(|editor, window, cx| {
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic, window, cx);
editor.go_to_prev_diagnostic(&GoToPreviousDiagnostic::default(), window, cx);
});
cx.assert_editor_state(indoc! {"
@@ -21465,7 +21463,7 @@ println!("5");
.unwrap();
pane_1
.update_in(cx, |pane, window, cx| {
pane.close_inactive_items(&CloseInactiveItems::default(), window, cx)
pane.close_inactive_items(&CloseInactiveItems::default(), None, window, cx)
})
.await
.unwrap();
@@ -21501,7 +21499,7 @@ println!("5");
.unwrap();
pane_2
.update_in(cx, |pane, window, cx| {
pane.close_inactive_items(&CloseInactiveItems::default(), window, cx)
pane.close_inactive_items(&CloseInactiveItems::default(), None, window, cx)
})
.await
.unwrap();


@@ -9,7 +9,7 @@ use crate::{
LineUp, MAX_LINE_LEN, MINIMAP_FONT_SIZE, MULTI_BUFFER_EXCERPT_HEADER_HEIGHT, OpenExcerpts,
PageDown, PageUp, PhantomBreakpointIndicator, Point, RowExt, RowRangeExt, SelectPhase,
SelectedTextHighlight, Selection, SelectionDragState, SoftWrap, StickyHeaderExcerpt, ToPoint,
ToggleFold,
ToggleFold, ToggleFoldAll,
code_context_menus::{CodeActionsMenu, MENU_ASIDE_MAX_WIDTH, MENU_ASIDE_MIN_WIDTH, MENU_GAP},
display_map::{
Block, BlockContext, BlockStyle, ChunkRendererId, DisplaySnapshot, EditorMargins,
@@ -416,6 +416,7 @@ impl EditorElement {
register_action(editor, window, Editor::fold_recursive);
register_action(editor, window, Editor::toggle_fold);
register_action(editor, window, Editor::toggle_fold_recursive);
register_action(editor, window, Editor::toggle_fold_all);
register_action(editor, window, Editor::unfold_lines);
register_action(editor, window, Editor::unfold_recursive);
register_action(editor, window, Editor::unfold_all);
@@ -2093,16 +2094,19 @@ impl EditorElement {
window: &mut Window,
cx: &mut App,
) -> HashMap<DisplayRow, AnyElement> {
if self.editor.read(cx).mode().is_minimap() {
return HashMap::default();
}
let max_severity = match ProjectSettings::get_global(cx)
.diagnostics
.inline
.max_severity
.unwrap_or_else(|| self.editor.read(cx).diagnostics_max_severity)
.into_lsp()
let max_severity = match self
.editor
.read(cx)
.inline_diagnostics_enabled()
.then(|| {
ProjectSettings::get_global(cx)
.diagnostics
.inline
.max_severity
.unwrap_or_else(|| self.editor.read(cx).diagnostics_max_severity)
.into_lsp()
})
.flatten()
{
Some(max_severity) => max_severity,
None => return HashMap::default(),
@@ -2618,9 +2622,6 @@ impl EditorElement {
window: &mut Window,
cx: &mut App,
) -> Option<Vec<IndentGuideLayout>> {
if self.editor.read(cx).mode().is_minimap() {
return None;
}
let indent_guides = self.editor.update(cx, |editor, cx| {
editor.indent_guides(visible_buffer_range, snapshot, cx)
})?;
@@ -3084,9 +3085,9 @@ impl EditorElement {
window: &mut Window,
cx: &mut App,
) -> Arc<HashMap<MultiBufferRow, LineNumberLayout>> {
let include_line_numbers = snapshot.show_line_numbers.unwrap_or_else(|| {
EditorSettings::get_global(cx).gutter.line_numbers && snapshot.mode.is_full()
});
let include_line_numbers = snapshot
.show_line_numbers
.unwrap_or_else(|| EditorSettings::get_global(cx).gutter.line_numbers);
if !include_line_numbers {
return Arc::default();
}
@@ -3399,22 +3400,18 @@ impl EditorElement {
div()
.size_full()
.children(
(!snapshot.mode.is_minimap() || custom.render_in_minimap).then(|| {
custom.render(&mut BlockContext {
window,
app: cx,
anchor_x,
margins: editor_margins,
line_height,
em_width,
block_id,
selected,
max_width: text_hitbox.size.width.max(*scroll_width),
editor_style: &self.style,
})
}),
)
.child(custom.render(&mut BlockContext {
window,
app: cx,
anchor_x,
margins: editor_margins,
line_height,
em_width,
block_id,
selected,
max_width: text_hitbox.size.width.max(*scroll_width),
editor_style: &self.style,
}))
.into_any()
}
@@ -3620,24 +3617,37 @@ impl EditorElement {
.tooltip({
let focus_handle = focus_handle.clone();
move |window, cx| {
Tooltip::for_action_in(
Tooltip::with_meta_in(
"Toggle Excerpt Fold",
&ToggleFold,
Some(&ToggleFold),
"Alt+click to toggle all",
&focus_handle,
window,
cx,
)
}
})
.on_click(move |_, _, cx| {
if is_folded {
.on_click(move |event, window, cx| {
if event.modifiers().alt {
// Alt+click toggles all buffers
editor.update(cx, |editor, cx| {
editor.unfold_buffer(buffer_id, cx);
editor.toggle_fold_all(
&ToggleFoldAll,
window,
cx,
);
});
} else {
editor.update(cx, |editor, cx| {
editor.fold_buffer(buffer_id, cx);
});
// Regular click toggles single buffer
if is_folded {
editor.update(cx, |editor, cx| {
editor.unfold_buffer(buffer_id, cx);
});
} else {
editor.update(cx, |editor, cx| {
editor.fold_buffer(buffer_id, cx);
});
}
}
}),
),
@@ -6762,7 +6772,7 @@ impl EditorElement {
}
fn paint_mouse_listeners(&mut self, layout: &EditorLayout, window: &mut Window, cx: &mut App) {
if self.editor.read(cx).mode.is_minimap() {
if layout.mode.is_minimap() {
return;
}
@@ -7889,9 +7899,14 @@ impl Element for EditorElement {
line_height: Some(self.style.text.line_height),
..Default::default()
};
let focus_handle = self.editor.focus_handle(cx);
window.set_view_id(self.editor.entity_id());
window.set_focus_handle(&focus_handle, cx);
let is_minimap = self.editor.read(cx).mode.is_minimap();
if !is_minimap {
let focus_handle = self.editor.focus_handle(cx);
window.set_view_id(self.editor.entity_id());
window.set_focus_handle(&focus_handle, cx);
}
let rem_size = self.rem_size(cx);
window.with_rem_size(rem_size, |window| {
@@ -8329,18 +8344,22 @@ impl Element for EditorElement {
window,
cx,
);
let new_renrerer_widths = line_layouts
.iter()
.flat_map(|layout| &layout.fragments)
.filter_map(|fragment| {
if let LineFragment::Element { id, size, .. } = fragment {
Some((*id, size.width))
} else {
None
}
});
if self.editor.update(cx, |editor, cx| {
editor.update_renderer_widths(new_renrerer_widths, cx)
let new_renderer_widths = (!is_minimap).then(|| {
line_layouts
.iter()
.flat_map(|layout| &layout.fragments)
.filter_map(|fragment| {
if let LineFragment::Element { id, size, .. } = fragment {
Some((*id, size.width))
} else {
None
}
})
});
if new_renderer_widths.is_some_and(|new_renderer_widths| {
self.editor.update(cx, |editor, cx| {
editor.update_renderer_widths(new_renderer_widths, cx)
})
}) {
// If the fold widths have changed, we need to prepaint
// the element again to account for any changes in
@@ -8403,27 +8422,31 @@ impl Element for EditorElement {
let sticky_header_excerpt_id =
sticky_header_excerpt.as_ref().map(|top| top.excerpt.id);
let blocks = window.with_element_namespace("blocks", |window| {
self.render_blocks(
start_row..end_row,
&snapshot,
&hitbox,
&text_hitbox,
editor_width,
&mut scroll_width,
&editor_margins,
em_width,
gutter_dimensions.full_width(),
line_height,
&mut line_layouts,
&local_selections,
&selected_buffer_ids,
is_row_soft_wrapped,
sticky_header_excerpt_id,
window,
cx,
)
});
let blocks = (!is_minimap)
.then(|| {
window.with_element_namespace("blocks", |window| {
self.render_blocks(
start_row..end_row,
&snapshot,
&hitbox,
&text_hitbox,
editor_width,
&mut scroll_width,
&editor_margins,
em_width,
gutter_dimensions.full_width(),
line_height,
&mut line_layouts,
&local_selections,
&selected_buffer_ids,
is_row_soft_wrapped,
sticky_header_excerpt_id,
window,
cx,
)
})
})
.unwrap_or_else(|| Ok((Vec::default(), HashMap::default())));
let (mut blocks, row_block_types) = match blocks {
Ok(blocks) => blocks,
Err(resized_blocks) => {
@@ -8955,19 +8978,21 @@ impl Element for EditorElement {
window: &mut Window,
cx: &mut App,
) {
let focus_handle = self.editor.focus_handle(cx);
let key_context = self
.editor
.update(cx, |editor, cx| editor.key_context(window, cx));
if !layout.mode.is_minimap() {
let focus_handle = self.editor.focus_handle(cx);
let key_context = self
.editor
.update(cx, |editor, cx| editor.key_context(window, cx));
window.set_key_context(key_context);
window.handle_input(
&focus_handle,
ElementInputHandler::new(bounds, self.editor.clone()),
cx,
);
self.register_actions(window, cx);
self.register_key_listeners(window, cx, layout);
window.set_key_context(key_context);
window.handle_input(
&focus_handle,
ElementInputHandler::new(bounds, self.editor.clone()),
cx,
);
self.register_actions(window, cx);
self.register_key_listeners(window, cx, layout);
}
let text_style = TextStyleRefinement {
font_size: Some(self.style.text.font_size),
@@ -10276,7 +10301,6 @@ mod tests {
height: Some(3),
render: Arc::new(|cx| div().h(3. * cx.window.line_height()).into_any()),
priority: 0,
render_in_minimap: true,
}],
None,
cx,


@@ -1,6 +1,7 @@
use crate::{
Anchor, Editor, EditorSettings, EditorSnapshot, FindAllReferences, GoToDefinition,
GoToTypeDefinition, GotoDefinitionKind, InlayId, Navigated, PointForPosition, SelectPhase,
display_map::InlayOffset,
editor_settings::GoToDefinitionFallback,
hover_popover::{self, InlayHover},
scroll::ScrollAmount,
@@ -15,6 +16,7 @@ use project::{
};
use settings::Settings;
use std::ops::Range;
use text;
use theme::ActiveTheme as _;
use util::{ResultExt, TryFutureExt as _, maybe};
@@ -121,13 +123,14 @@ impl Editor {
cx: &mut Context<Self>,
) {
let hovered_link_modifier = Editor::multi_cursor_modifier(false, &modifiers, cx);
if !hovered_link_modifier || self.has_pending_selection() {
self.hide_hovered_link(cx);
return;
}
match point_for_position.as_valid() {
Some(point) => {
if !hovered_link_modifier || self.has_pending_selection() {
self.hide_hovered_link(cx);
return;
}
let trigger_point = TriggerPoint::Text(
snapshot
.buffer_snapshot
@@ -284,114 +287,117 @@ pub fn update_inlay_link_and_hover_points(
window: &mut Window,
cx: &mut Context<Editor>,
) {
let hovered_offset = if point_for_position.column_overshoot_after_line_end == 0 {
Some(snapshot.display_point_to_inlay_offset(point_for_position.exact_unclipped, Bias::Left))
} else {
None
};
// For inlay hints, we need to use the exact position where the mouse is
// But we must clip it to valid bounds to avoid panics
let clipped_point = snapshot.clip_point(point_for_position.exact_unclipped, Bias::Left);
let hovered_offset = snapshot.display_point_to_inlay_offset(clipped_point, Bias::Left);
let mut go_to_definition_updated = false;
let mut hover_updated = false;
if let Some(hovered_offset) = hovered_offset {
let buffer_snapshot = editor.buffer().read(cx).snapshot(cx);
let previous_valid_anchor = buffer_snapshot.anchor_at(
point_for_position.previous_valid.to_point(snapshot),
Bias::Left,
);
let next_valid_anchor = buffer_snapshot.anchor_at(
point_for_position.next_valid.to_point(snapshot),
Bias::Right,
);
if let Some(hovered_hint) = editor
.visible_inlay_hints(cx)
.into_iter()
.skip_while(|hint| {
hint.position
.cmp(&previous_valid_anchor, &buffer_snapshot)
.is_lt()
})
.take_while(|hint| {
hint.position
.cmp(&next_valid_anchor, &buffer_snapshot)
.is_le()
})
.max_by_key(|hint| hint.id)
{
let inlay_hint_cache = editor.inlay_hint_cache();
let excerpt_id = previous_valid_anchor.excerpt_id;
if let Some(cached_hint) = inlay_hint_cache.hint_by_id(excerpt_id, hovered_hint.id) {
match cached_hint.resolve_state {
// Get all visible inlay hints
let visible_hints = editor.visible_inlay_hints(cx);
// Find if we're hovering over an inlay hint
if let Some(hovered_inlay) = visible_hints.into_iter().find(|inlay| {
// Only process hint inlays
if !matches!(inlay.id, InlayId::Hint(_)) {
return false;
}
// Check if the hovered position falls within this inlay's display range
let inlay_start = snapshot.anchor_to_inlay_offset(inlay.position);
let inlay_end = InlayOffset(inlay_start.0 + inlay.text.len());
hovered_offset >= inlay_start && hovered_offset < inlay_end
}) {
let inlay_hint_cache = editor.inlay_hint_cache();
let excerpt_id = hovered_inlay.position.excerpt_id;
// Extract the hint ID from the inlay
if let InlayId::Hint(_hint_id) = hovered_inlay.id {
if let Some(cached_hint) = inlay_hint_cache.hint_by_id(excerpt_id, hovered_inlay.id) {
// Check if we should process this hint for hover
let should_process_hint = match cached_hint.resolve_state {
ResolveState::CanResolve(_, _) => {
if let Some(buffer_id) = previous_valid_anchor.buffer_id {
// For unresolved hints, spawn resolution
if let Some(buffer_id) = hovered_inlay.position.buffer_id {
inlay_hint_cache.spawn_hint_resolve(
buffer_id,
excerpt_id,
hovered_hint.id,
hovered_inlay.id,
window,
cx,
);
}
false // Don't process unresolved hints
}
ResolveState::Resolved => {
let mut extra_shift_left = 0;
let mut extra_shift_right = 0;
if cached_hint.padding_left {
extra_shift_left += 1;
extra_shift_right += 1;
}
if cached_hint.padding_right {
extra_shift_right += 1;
}
match cached_hint.label {
project::InlayHintLabel::String(_) => {
if let Some(tooltip) = cached_hint.tooltip {
hover_popover::hover_at_inlay(
editor,
InlayHover {
tooltip: match tooltip {
InlayHintTooltip::String(text) => HoverBlock {
text,
kind: HoverBlockKind::PlainText,
},
InlayHintTooltip::MarkupContent(content) => {
HoverBlock {
text: content.value,
kind: content.kind,
}
ResolveState::Resolved => true,
ResolveState::Resolving => false,
};
if should_process_hint {
let mut extra_shift_left = 0;
let mut extra_shift_right = 0;
if cached_hint.padding_left {
extra_shift_left += 1;
extra_shift_right += 1;
}
if cached_hint.padding_right {
extra_shift_right += 1;
}
match cached_hint.label {
project::InlayHintLabel::String(_) => {
if let Some(tooltip) = cached_hint.tooltip {
hover_popover::hover_at_inlay(
editor,
InlayHover {
tooltip: match tooltip {
InlayHintTooltip::String(text) => HoverBlock {
text,
kind: HoverBlockKind::PlainText,
},
InlayHintTooltip::MarkupContent(content) => {
HoverBlock {
text: content.value,
kind: content.kind,
}
},
range: InlayHighlight {
inlay: hovered_hint.id,
inlay_position: hovered_hint.position,
range: extra_shift_left
..hovered_hint.text.len() + extra_shift_right,
},
}
},
window,
cx,
);
hover_updated = true;
}
range: InlayHighlight {
inlay: hovered_inlay.id,
inlay_position: hovered_inlay.position,
range: extra_shift_left
..hovered_inlay.text.len() + extra_shift_right,
},
},
window,
cx,
);
hover_updated = true;
}
project::InlayHintLabel::LabelParts(label_parts) => {
let hint_start =
snapshot.anchor_to_inlay_offset(hovered_hint.position);
if let Some((hovered_hint_part, part_range)) =
hover_popover::find_hovered_hint_part(
label_parts,
hint_start,
hovered_offset,
)
{
let highlight_start =
(part_range.start - hint_start).0 + extra_shift_left;
let highlight_end =
(part_range.end - hint_start).0 + extra_shift_right;
}
project::InlayHintLabel::LabelParts(label_parts) => {
// Find the first part with actual hover information (tooltip or location)
let _hint_start =
snapshot.anchor_to_inlay_offset(hovered_inlay.position);
let mut part_offset = 0;
for part in label_parts {
let part_len = part.value.chars().count();
if part.tooltip.is_some() || part.location.is_some() {
// Found the meaningful part - show hover for it
let highlight_start = part_offset + extra_shift_left;
let highlight_end = part_offset + part_len + extra_shift_right;
let highlight = InlayHighlight {
inlay: hovered_hint.id,
inlay_position: hovered_hint.position,
inlay: hovered_inlay.id,
inlay_position: hovered_inlay.position,
range: highlight_start..highlight_end,
};
if let Some(tooltip) = hovered_hint_part.tooltip {
if let Some(tooltip) = part.tooltip {
hover_popover::hover_at_inlay(
editor,
InlayHover {
@@ -415,10 +421,160 @@ pub fn update_inlay_link_and_hover_points(
cx,
);
hover_updated = true;
}
if let Some((language_server_id, location)) =
hovered_hint_part.location
} else if let Some((_language_server_id, location)) =
part.location.clone()
{
// When there's no tooltip but we have a location, perform a "Go to Definition" style operation
let filename = location
.uri
.path()
.split('/')
.next_back()
.unwrap_or("unknown")
.to_string();
hover_popover::hover_at_inlay(
editor,
InlayHover {
tooltip: HoverBlock {
text: "Loading documentation...".to_string(),
kind: HoverBlockKind::PlainText,
},
range: highlight.clone(),
},
window,
cx,
);
hover_updated = true;
// Now perform the "Go to Definition" flow to get hover documentation
if let Some(project) = editor.project.clone() {
let highlight = highlight.clone();
let hint_value = part.value.clone();
let location_uri = location.uri.clone();
cx.spawn_in(window, async move |editor, cx| {
async move {
// Small delay to show the loading message first
cx.background_executor()
.timer(std::time::Duration::from_millis(50))
.await;
// Convert LSP URL to file path
let file_path = location.uri.to_file_path()
.map_err(|_| anyhow::anyhow!("Invalid file URL"))?;
// Open the definition file
let definition_buffer = project
.update(cx, |project, cx| {
project.open_local_buffer(file_path, cx)
})?
.await?;
// Extract documentation directly from the source
let documentation = definition_buffer.update(cx, |buffer, _| {
let line_number = location.range.start.line as usize;
// Get the text of the buffer
let text = buffer.text();
let lines: Vec<&str> = text.lines().collect();
// Look backwards from the definition line to find doc comments
let mut doc_lines = Vec::new();
let mut current_line = line_number.saturating_sub(1);
// Skip any attributes like #[derive(...)]
while current_line > 0 && lines.get(current_line).map_or(false, |line| {
let trimmed = line.trim();
trimmed.starts_with("#[") || trimmed.is_empty()
}) {
current_line = current_line.saturating_sub(1);
}
// Collect doc comments
while current_line > 0 {
if let Some(line) = lines.get(current_line) {
let trimmed = line.trim();
if trimmed.starts_with("///") {
// Remove the /// and any leading space
let doc_text = trimmed.strip_prefix("///").unwrap_or("")
.strip_prefix(" ").unwrap_or_else(|| trimmed.strip_prefix("///").unwrap_or(""));
doc_lines.push(doc_text.to_string());
} else if !trimmed.is_empty() {
// Stop at the first non-doc, non-empty line
break;
}
}
current_line = current_line.saturating_sub(1);
}
// Reverse to get correct order
doc_lines.reverse();
// Also get the actual definition line
let definition = lines.get(line_number)
.map(|s| s.trim().to_string())
.unwrap_or_else(|| hint_value.clone());
if doc_lines.is_empty() {
None
} else {
let docs = doc_lines.join("\n");
Some((definition, docs))
}
})?;
if let Some((definition, docs)) = documentation {
// Format as markdown with the definition as a code block
let formatted_docs = format!("```rust\n{}\n```\n\n{}", definition, docs);
editor.update_in(cx, |editor, window, cx| {
hover_popover::hover_at_inlay(
editor,
InlayHover {
tooltip: HoverBlock {
text: formatted_docs,
kind: HoverBlockKind::Markdown,
},
range: highlight,
},
window,
cx,
);
}).log_err();
} else {
// Fallback to showing just the location info
let fallback_text = format!(
"{}\n\nDefined in {} at line {}",
hint_value.trim(),
filename,
location.range.start.line + 1
);
editor.update_in(cx, |editor, window, cx| {
hover_popover::hover_at_inlay(
editor,
InlayHover {
tooltip: HoverBlock {
text: fallback_text,
kind: HoverBlockKind::PlainText,
},
range: highlight,
},
window,
cx,
);
}).log_err();
}
anyhow::Ok(())
}
.log_err()
.await
}).detach();
}
}
if let Some((language_server_id, location)) = &part.location {
if secondary_held
&& !editor.has_pending_nonempty_selection()
{
@@ -428,8 +584,8 @@ pub fn update_inlay_link_and_hover_points(
editor,
TriggerPoint::InlayHint(
highlight,
location,
language_server_id,
location.clone(),
*language_server_id,
),
snapshot,
window,
@@ -437,11 +593,14 @@ pub fn update_inlay_link_and_hover_points(
);
}
}
break;
}
part_offset += part_len;
}
};
}
ResolveState::Resolving => {}
}
};
}
}
}


@@ -275,13 +275,6 @@ fn show_hover(
return None;
}
}
let languages = editor
.project
.as_ref()
.unwrap()
.read(cx)
.languages()
.clone();
let hover_popover_delay = EditorSettings::get_global(cx).hover_popover_delay;
let all_diagnostics_active = editor.active_diagnostics == ActiveDiagnostic::All;
@@ -347,7 +340,7 @@ fn show_hover(
renderer
.as_ref()
.and_then(|renderer| {
renderer.render_hover(group, point_range, buffer_id, languages, cx)
renderer.render_hover(group, point_range, buffer_id, cx)
})
.context("no rendered diagnostic")
})??;


@@ -1226,7 +1226,20 @@ impl SerializableItem for Editor {
abs_path: None,
contents: None,
..
} => Task::ready(Err(anyhow!("No path or contents found for buffer"))),
} => window.spawn(cx, async move |cx| {
let buffer = project
.update(cx, |project, cx| project.create_buffer(cx))?
.await?;
cx.update(|window, cx| {
cx.new(|cx| {
let mut editor = Editor::for_buffer(buffer, Some(project), window, cx);
editor.read_metadata_from_db(item_id, workspace_id, window, cx);
editor
})
})
}),
}
}
@@ -2098,5 +2111,38 @@ mod tests {
assert!(editor.has_conflict(cx)); // The editor should have a conflict
});
}
// Test case 5: Deserialize with no path, no content, no language, and no old mtime (new, empty, unsaved buffer)
{
let project = Project::test(fs.clone(), [path!("/file.rs").as_ref()], cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let workspace_id = workspace::WORKSPACE_DB.next_id().await.unwrap();
let item_id = 10000 as ItemId;
let serialized_editor = SerializedEditor {
abs_path: None,
contents: None,
language: None,
mtime: None,
};
DB.save_serialized_editor(item_id, workspace_id, serialized_editor)
.await
.unwrap();
let deserialized =
deserialize_editor(item_id, workspace_id, workspace, project, cx).await;
deserialized.update(cx, |editor, cx| {
assert_eq!(editor.text(cx), "");
assert!(!editor.is_dirty(cx));
assert!(!editor.has_conflict(cx));
let buffer = editor.buffer().read(cx).as_singleton().unwrap().read(cx);
assert!(buffer.file().is_none());
});
}
}
}


@@ -289,6 +289,24 @@ async fn copy_extension_resources(
}
}
if let Some(snippets_path) = manifest.snippets.as_ref() {
let parent = snippets_path.parent();
if let Some(parent) = parent.filter(|p| p.components().next().is_some()) {
fs::create_dir_all(output_dir.join(parent))?;
}
copy_recursive(
fs.as_ref(),
&extension_path.join(&snippets_path),
&output_dir.join(&snippets_path),
CopyOptions {
overwrite: true,
ignore_if_exists: false,
},
)
.await
.with_context(|| format!("failed to copy snippets from '{}'", snippets_path.display()))?;
}
Ok(())
}


@@ -98,17 +98,6 @@ impl FeatureFlag for AcpFeatureFlag {
const NAME: &'static str = "acp";
}
pub struct ZedCloudFeatureFlag {}
impl FeatureFlag for ZedCloudFeatureFlag {
const NAME: &'static str = "zed-cloud";
fn enabled_for_staff() -> bool {
// Require individual opt-in, for now.
false
}
}
pub trait FeatureFlagViewExt<V: 'static> {
fn observe_flag<T: FeatureFlag, F>(&mut self, window: &Window, callback: F) -> Subscription
where


@@ -1,7 +1,7 @@
use crate::FakeFs;
use crate::{FakeFs, Fs};
use anyhow::{Context as _, Result};
use collections::{HashMap, HashSet};
use futures::future::{self, BoxFuture};
use futures::future::{self, BoxFuture, join_all};
use git::{
blame::Blame,
repository::{
@@ -356,18 +356,46 @@ impl GitRepository for FakeGitRepository {
fn stage_paths(
&self,
_paths: Vec<RepoPath>,
paths: Vec<RepoPath>,
_env: Arc<HashMap<String, String>>,
) -> BoxFuture<'_, Result<()>> {
unimplemented!()
Box::pin(async move {
let contents = paths
.into_iter()
.map(|path| {
let abs_path = self.dot_git_path.parent().unwrap().join(&path);
Box::pin(async move { (path.clone(), self.fs.load(&abs_path).await.ok()) })
})
.collect::<Vec<_>>();
let contents = join_all(contents).await;
self.with_state_async(true, move |state| {
for (path, content) in contents {
if let Some(content) = content {
state.index_contents.insert(path, content);
} else {
state.index_contents.remove(&path);
}
}
Ok(())
})
.await
})
}
fn unstage_paths(
&self,
_paths: Vec<RepoPath>,
paths: Vec<RepoPath>,
_env: Arc<HashMap<String, String>>,
) -> BoxFuture<'_, Result<()>> {
unimplemented!()
self.with_state_async(true, move |state| {
for path in paths {
match state.head_contents.get(&path) {
Some(content) => state.index_contents.insert(path, content.clone()),
None => state.index_contents.remove(&path),
};
}
Ok(())
})
}
fn commit(
@@ -375,10 +403,8 @@ impl GitRepository for FakeGitRepository {
_message: gpui::SharedString,
_name_and_email: Option<(gpui::SharedString, gpui::SharedString)>,
_options: CommitOptions,
_ask_pass: AskPassDelegate,
_env: Arc<HashMap<String, String>>,
_cx: AsyncApp,
) -> BoxFuture<'static, Result<()>> {
) -> BoxFuture<'_, Result<()>> {
unimplemented!()
}


@@ -41,9 +41,9 @@ futures.workspace = true
workspace-hack.workspace = true
[dev-dependencies]
gpui = { workspace = true, features = ["test-support"] }
pretty_assertions.workspace = true
serde_json.workspace = true
tempfile.workspace = true
text = { workspace = true, features = ["test-support"] }
unindent.workspace = true
gpui = { workspace = true, features = ["test-support"] }
tempfile.workspace = true


@@ -31,8 +31,10 @@ actions!(
git,
[
// per-hunk
/// Toggles the staged state of the hunk at cursor.
/// Toggles the staged state of the hunk or status entry at cursor.
ToggleStaged,
/// Stage status entries between an anchor entry and the cursor.
StageRange,
/// Stages the current hunk and moves to the next one.
StageAndNext,
/// Unstages the current hunk and moves to the next one.


@@ -391,12 +391,8 @@ pub trait GitRepository: Send + Sync {
message: SharedString,
name_and_email: Option<(SharedString, SharedString)>,
options: CommitOptions,
askpass: AskPassDelegate,
env: Arc<HashMap<String, String>>,
// This method takes an AsyncApp to ensure it's invoked on the main thread,
// otherwise git-credentials-manager won't work.
cx: AsyncApp,
) -> BoxFuture<'static, Result<()>>;
) -> BoxFuture<'_, Result<()>>;
fn push(
&self,
@@ -1197,68 +1193,36 @@ impl GitRepository for RealGitRepository {
message: SharedString,
name_and_email: Option<(SharedString, SharedString)>,
options: CommitOptions,
ask_pass: AskPassDelegate,
env: Arc<HashMap<String, String>>,
cx: AsyncApp,
) -> BoxFuture<'static, Result<()>> {
) -> BoxFuture<'_, Result<()>> {
let working_directory = self.working_directory();
let executor = cx.background_executor().clone();
async move {
let working_directory = working_directory?;
let have_user_git_askpass = env.contains_key("GIT_ASKPASS");
let mut command = new_smol_command("git");
command.current_dir(&working_directory).envs(env.iter());
self.executor
.spawn(async move {
let mut cmd = new_smol_command("git");
cmd.current_dir(&working_directory?)
.envs(env.iter())
.args(["commit", "--quiet", "-m"])
.arg(&message.to_string())
.arg("--cleanup=strip");
let ask_pass = if have_user_git_askpass {
None
} else {
Some(AskPassSession::new(&executor, ask_pass).await?)
};
if options.amend {
cmd.arg("--amend");
}
if let Some(program) = ask_pass
.as_ref()
.and_then(|ask_pass| ask_pass.gpg_script_path())
{
command.arg("-c").arg(format!(
"gpg.program={}",
program.as_ref().to_string_lossy()
));
}
if let Some((name, email)) = name_and_email {
cmd.arg("--author").arg(&format!("{name} <{email}>"));
}
command
.args(["commit", "-m"])
.arg(message.to_string())
.arg("--cleanup=strip")
.stdin(smol::process::Stdio::null())
.stdout(smol::process::Stdio::piped())
.stderr(smol::process::Stdio::piped());
let output = cmd.output().await?;
if options.amend {
command.arg("--amend");
}
if let Some((name, email)) = name_and_email {
command.arg("--author").arg(&format!("{name} <{email}>"));
}
if let Some(ask_pass) = ask_pass {
command.env("GIT_ASKPASS", ask_pass.script_path());
let git_process = command.spawn()?;
run_askpass_command(ask_pass, git_process).await?;
Ok(())
} else {
let git_process = command.spawn()?;
let output = git_process.output().await?;
anyhow::ensure!(
output.status.success(),
"{}",
"Failed to commit:\n{}",
String::from_utf8_lossy(&output.stderr)
);
Ok(())
}
}
.boxed()
})
.boxed()
}
fn push(
@@ -2082,16 +2046,12 @@ mod tests {
)
.await
.unwrap();
cx.spawn(|cx| {
repo.commit(
"Initial commit".into(),
None,
CommitOptions::default(),
AskPassDelegate::new_always_failing(),
Arc::new(checkpoint_author_envs()),
cx,
)
})
repo.commit(
"Initial commit".into(),
None,
CommitOptions::default(),
Arc::new(checkpoint_author_envs()),
)
.await
.unwrap();
@@ -2115,16 +2075,12 @@ mod tests {
)
.await
.unwrap();
cx.spawn(|cx| {
repo.commit(
"Commit after checkpoint".into(),
None,
CommitOptions::default(),
AskPassDelegate::new_always_failing(),
Arc::new(checkpoint_author_envs()),
cx,
)
})
repo.commit(
"Commit after checkpoint".into(),
None,
CommitOptions::default(),
Arc::new(checkpoint_author_envs()),
)
.await
.unwrap();
@@ -2257,16 +2213,12 @@ mod tests {
)
.await
.unwrap();
cx.spawn(|cx| {
repo.commit(
"Initial commit".into(),
None,
CommitOptions::default(),
AskPassDelegate::new_always_failing(),
Arc::new(checkpoint_author_envs()),
cx,
)
})
repo.commit(
"Initial commit".into(),
None,
CommitOptions::default(),
Arc::new(checkpoint_author_envs()),
)
.await
.unwrap();


@@ -297,7 +297,6 @@ fn conflicts_updated(
move |cx| render_conflict_buttons(&conflict, excerpt_id, editor_handle.clone(), cx)
}),
priority: 0,
render_in_minimap: true,
})
}
let new_block_ids = editor.insert_blocks(blocks, None, cx);


@@ -30,10 +30,9 @@ use git::{ExpandCommitEditor, RestoreTrackedFiles, StageAll, TrashUntrackedFiles
use gpui::{
Action, Animation, AnimationExt as _, AsyncApp, AsyncWindowContext, Axis, ClickEvent, Corner,
DismissEvent, Entity, EventEmitter, FocusHandle, Focusable, KeyContext,
ListHorizontalSizingBehavior, ListSizingBehavior, Modifiers, ModifiersChangedEvent,
MouseButton, MouseDownEvent, Point, PromptLevel, ScrollStrategy, Subscription, Task,
Transformation, UniformListScrollHandle, WeakEntity, actions, anchored, deferred, percentage,
uniform_list,
ListHorizontalSizingBehavior, ListSizingBehavior, MouseButton, MouseDownEvent, Point,
PromptLevel, ScrollStrategy, Subscription, Task, Transformation, UniformListScrollHandle,
WeakEntity, actions, anchored, deferred, percentage, uniform_list,
};
use itertools::Itertools;
use language::{Buffer, File};
@@ -48,7 +47,7 @@ use panel::{
PanelHeader, panel_button, panel_editor_container, panel_editor_style, panel_filled_button,
panel_icon_button,
};
use project::git_store::RepositoryEvent;
use project::git_store::{RepositoryEvent, RepositoryId};
use project::{
Fs, Project, ProjectPath,
git_store::{GitStoreEvent, Repository},
@@ -212,14 +211,14 @@ impl GitHeaderEntry {
#[derive(Debug, PartialEq, Eq, Clone)]
enum GitListEntry {
GitStatusEntry(GitStatusEntry),
Status(GitStatusEntry),
Header(GitHeaderEntry),
}
impl GitListEntry {
fn status_entry(&self) -> Option<&GitStatusEntry> {
match self {
GitListEntry::GitStatusEntry(entry) => Some(entry),
GitListEntry::Status(entry) => Some(entry),
_ => None,
}
}
@@ -323,7 +322,6 @@ pub struct GitPanel {
pub(crate) commit_editor: Entity<Editor>,
conflicted_count: usize,
conflicted_staged_count: usize,
current_modifiers: Modifiers,
add_coauthors: bool,
generate_commit_message_task: Option<Task<Option<()>>>,
entries: Vec<GitListEntry>,
@@ -355,9 +353,16 @@ pub struct GitPanel {
show_placeholders: bool,
local_committer: Option<GitCommitter>,
local_committer_task: Option<Task<()>>,
bulk_staging: Option<BulkStaging>,
_settings_subscription: Subscription,
}
#[derive(Clone, Debug, PartialEq, Eq)]
struct BulkStaging {
repo_id: RepositoryId,
anchor: RepoPath,
}
const MAX_PANEL_EDITOR_LINES: usize = 6;
pub(crate) fn commit_message_editor(
@@ -497,7 +502,6 @@ impl GitPanel {
commit_editor,
conflicted_count: 0,
conflicted_staged_count: 0,
current_modifiers: window.modifiers(),
add_coauthors: true,
generate_commit_message_task: None,
entries: Vec::new(),
@@ -529,6 +533,7 @@ impl GitPanel {
entry_count: 0,
horizontal_scrollbar,
vertical_scrollbar,
bulk_staging: None,
_settings_subscription,
};
@@ -735,16 +740,6 @@ impl GitPanel {
}
}
fn handle_modifiers_changed(
&mut self,
event: &ModifiersChangedEvent,
_: &mut Window,
cx: &mut Context<Self>,
) {
self.current_modifiers = event.modifiers;
cx.notify();
}
fn scroll_to_selected_entry(&mut self, cx: &mut Context<Self>) {
if let Some(selected_entry) = self.selected_entry {
self.scroll_handle
@@ -1265,10 +1260,18 @@ impl GitPanel {
return;
};
let (stage, repo_paths) = match entry {
GitListEntry::GitStatusEntry(status_entry) => {
GitListEntry::Status(status_entry) => {
if status_entry.status.staging().is_fully_staged() {
if let Some(op) = self.bulk_staging.clone()
&& op.anchor == status_entry.repo_path
{
self.bulk_staging = None;
}
(false, vec![status_entry.clone()])
} else {
self.set_bulk_staging_anchor(status_entry.repo_path.clone(), cx);
(true, vec![status_entry.clone()])
}
}
@@ -1383,6 +1386,13 @@ impl GitPanel {
}
}
fn stage_range(&mut self, _: &git::StageRange, _window: &mut Window, cx: &mut Context<Self>) {
let Some(index) = self.selected_entry else {
return;
};
self.stage_bulk(index, cx);
}
fn stage_selected(&mut self, _: &git::StageFile, _window: &mut Window, cx: &mut Context<Self>) {
let Some(selected_entry) = self.get_selected_entry() else {
return;
@@ -1574,15 +1584,10 @@ impl GitPanel {
let task = if self.has_staged_changes() {
// Repository serializes all git operations, so we can just send a commit immediately
cx.spawn_in(window, async move |this, cx| {
let askpass_delegate = this.update_in(cx, |this, window, cx| {
this.askpass_delegate("git commit", window, cx)
})?;
let commit_task = active_repository.update(cx, |repo, cx| {
repo.commit(message.into(), None, options, askpass_delegate, cx)
})?;
commit_task.await?
})
let commit_task = active_repository.update(cx, |repo, cx| {
repo.commit(message.into(), None, options, cx)
});
cx.background_spawn(async move { commit_task.await? })
} else {
let changed_files = self
.entries
@@ -1599,13 +1604,10 @@ impl GitPanel {
let stage_task =
active_repository.update(cx, |repo, cx| repo.stage_entries(changed_files, cx));
cx.spawn_in(window, async move |this, cx| {
cx.spawn(async move |_, cx| {
stage_task.await?;
let askpass_delegate = this.update_in(cx, |this, window, cx| {
this.askpass_delegate("git commit".to_string(), window, cx)
})?;
let commit_task = active_repository.update(cx, |repo, cx| {
repo.commit(message.into(), None, options, askpass_delegate, cx)
repo.commit(message.into(), None, options, cx)
})?;
commit_task.await?
})
@@ -2457,6 +2459,11 @@ impl GitPanel {
}
fn update_visible_entries(&mut self, cx: &mut Context<Self>) {
let bulk_staging = self.bulk_staging.take();
let last_staged_path_prev_index = bulk_staging
.as_ref()
.and_then(|op| self.entry_by_path(&op.anchor, cx));
self.entries.clear();
self.single_staged_entry.take();
self.single_tracked_entry.take();
@@ -2473,7 +2480,7 @@ impl GitPanel {
let mut changed_entries = Vec::new();
let mut new_entries = Vec::new();
let mut conflict_entries = Vec::new();
let mut last_staged = None;
let mut single_staged_entry = None;
let mut staged_count = 0;
let mut max_width_item: Option<(RepoPath, usize)> = None;
@@ -2511,7 +2518,7 @@ impl GitPanel {
if staging.has_staged() {
staged_count += 1;
last_staged = Some(entry.clone());
single_staged_entry = Some(entry.clone());
}
let width_estimate = Self::item_width_estimate(
@@ -2542,27 +2549,27 @@ impl GitPanel {
let mut pending_staged_count = 0;
let mut last_pending_staged = None;
let mut pending_status_for_last_staged = None;
let mut pending_status_for_single_staged = None;
for pending in self.pending.iter() {
if pending.target_status == TargetStatus::Staged {
pending_staged_count += pending.entries.len();
last_pending_staged = pending.entries.iter().next().cloned();
}
if let Some(last_staged) = &last_staged {
if let Some(single_staged) = &single_staged_entry {
if pending
.entries
.iter()
.any(|entry| entry.repo_path == last_staged.repo_path)
.any(|entry| entry.repo_path == single_staged.repo_path)
{
pending_status_for_last_staged = Some(pending.target_status);
pending_status_for_single_staged = Some(pending.target_status);
}
}
}
if conflict_entries.len() == 0 && staged_count == 1 && pending_staged_count == 0 {
match pending_status_for_last_staged {
match pending_status_for_single_staged {
Some(TargetStatus::Staged) | None => {
self.single_staged_entry = last_staged;
self.single_staged_entry = single_staged_entry;
}
_ => {}
}
@@ -2578,11 +2585,8 @@ impl GitPanel {
self.entries.push(GitListEntry::Header(GitHeaderEntry {
header: Section::Conflict,
}));
self.entries.extend(
conflict_entries
.into_iter()
.map(GitListEntry::GitStatusEntry),
);
self.entries
.extend(conflict_entries.into_iter().map(GitListEntry::Status));
}
if changed_entries.len() > 0 {
@@ -2591,31 +2595,39 @@ impl GitPanel {
header: Section::Tracked,
}));
}
self.entries.extend(
changed_entries
.into_iter()
.map(GitListEntry::GitStatusEntry),
);
self.entries
.extend(changed_entries.into_iter().map(GitListEntry::Status));
}
if new_entries.len() > 0 {
self.entries.push(GitListEntry::Header(GitHeaderEntry {
header: Section::New,
}));
self.entries
.extend(new_entries.into_iter().map(GitListEntry::GitStatusEntry));
.extend(new_entries.into_iter().map(GitListEntry::Status));
}
if let Some((repo_path, _)) = max_width_item {
self.max_width_item_index = self.entries.iter().position(|entry| match entry {
GitListEntry::GitStatusEntry(git_status_entry) => {
git_status_entry.repo_path == repo_path
}
GitListEntry::Status(git_status_entry) => git_status_entry.repo_path == repo_path,
GitListEntry::Header(_) => false,
});
}
self.update_counts(repo);
let bulk_staging_anchor_new_index = bulk_staging
.as_ref()
.filter(|op| op.repo_id == repo.id)
.and_then(|op| self.entry_by_path(&op.anchor, cx));
if bulk_staging_anchor_new_index == last_staged_path_prev_index
&& let Some(index) = bulk_staging_anchor_new_index
&& let Some(entry) = self.entries.get(index)
&& let Some(entry) = entry.status_entry()
&& self.entry_staging(entry) == StageStatus::Staged
{
self.bulk_staging = bulk_staging;
}
self.select_first_entry_if_none(cx);
let suggested_commit_message = self.suggest_commit_message(cx);
@@ -3751,7 +3763,7 @@ impl GitPanel {
for ix in range {
match &this.entries.get(ix) {
Some(GitListEntry::GitStatusEntry(entry)) => {
Some(GitListEntry::Status(entry)) => {
items.push(this.render_entry(
ix,
entry,
@@ -4008,8 +4020,6 @@ impl GitPanel {
let marked = self.marked_entries.contains(&ix);
let status_style = GitPanelSettings::get_global(cx).status_style;
let status = entry.status;
let modifiers = self.current_modifiers;
let shift_held = modifiers.shift;
let has_conflict = status.is_conflicted();
let is_modified = status.is_modified();
@@ -4128,12 +4138,6 @@ impl GitPanel {
cx.stop_propagation();
},
)
// .on_secondary_mouse_down(cx.listener(
// move |this, event: &MouseDownEvent, window, cx| {
// this.deploy_entry_context_menu(event.position, ix, window, cx);
// cx.stop_propagation();
// },
// ))
.child(
div()
.id(checkbox_wrapper_id)
@@ -4145,46 +4149,35 @@ impl GitPanel {
.disabled(!has_write_access)
.fill()
.elevation(ElevationIndex::Surface)
.on_click({
.on_click_ext({
let entry = entry.clone();
cx.listener(move |this, _, window, cx| {
if !has_write_access {
return;
}
this.toggle_staged_for_entry(
&GitListEntry::GitStatusEntry(entry.clone()),
window,
cx,
);
cx.stop_propagation();
})
let this = cx.weak_entity();
move |_, click, window, cx| {
this.update(cx, |this, cx| {
if !has_write_access {
return;
}
if click.modifiers().shift {
this.stage_bulk(ix, cx);
} else {
this.toggle_staged_for_entry(
&GitListEntry::Status(entry.clone()),
window,
cx,
);
}
cx.stop_propagation();
})
.ok();
}
})
.tooltip(move |window, cx| {
let is_staged = entry_staging.is_fully_staged();
let action = if is_staged { "Unstage" } else { "Stage" };
let tooltip_name = if shift_held {
format!("{} section", action)
} else {
action.to_string()
};
let tooltip_name = action.to_string();
let meta = if shift_held {
format!(
"Release shift to {} single entry",
action.to_lowercase()
)
} else {
format!("Shift click to {} section", action.to_lowercase())
};
Tooltip::with_meta(
tooltip_name,
Some(&ToggleStaged),
meta,
window,
cx,
)
Tooltip::for_action(tooltip_name, &ToggleStaged, window, cx)
}),
),
)
@@ -4250,6 +4243,41 @@ impl GitPanel {
panel
})
}
fn stage_bulk(&mut self, mut index: usize, cx: &mut Context<'_, Self>) {
let Some(op) = self.bulk_staging.as_ref() else {
return;
};
let Some(mut anchor_index) = self.entry_by_path(&op.anchor, cx) else {
return;
};
if let Some(entry) = self.entries.get(index)
&& let Some(entry) = entry.status_entry()
{
self.set_bulk_staging_anchor(entry.repo_path.clone(), cx);
}
if index < anchor_index {
std::mem::swap(&mut index, &mut anchor_index);
}
let entries = self
.entries
.get(anchor_index..=index)
.unwrap_or_default()
.iter()
.filter_map(|entry| entry.status_entry().cloned())
.collect::<Vec<_>>();
self.change_file_stage(true, entries, cx);
}
fn set_bulk_staging_anchor(&mut self, path: RepoPath, cx: &mut Context<'_, GitPanel>) {
let Some(repo) = self.active_repository.as_ref() else {
return;
};
self.bulk_staging = Some(BulkStaging {
repo_id: repo.read(cx).id,
anchor: path,
});
}
}
fn current_language_model(cx: &Context<'_, GitPanel>) -> Option<Arc<dyn LanguageModel>> {
@@ -4287,9 +4315,9 @@ impl Render for GitPanel {
.id("git_panel")
.key_context(self.dispatch_context(window, cx))
.track_focus(&self.focus_handle)
.on_modifiers_changed(cx.listener(Self::handle_modifiers_changed))
.when(has_write_access && !project.is_read_only(cx), |this| {
this.on_action(cx.listener(Self::toggle_staged_for_selected))
.on_action(cx.listener(Self::stage_range))
.on_action(cx.listener(GitPanel::commit))
.on_action(cx.listener(GitPanel::amend))
.on_action(cx.listener(GitPanel::cancel))
@@ -4961,7 +4989,7 @@ impl Component for PanelRepoFooter {
#[cfg(test)]
mod tests {
use git::status::StatusCode;
use git::status::{StatusCode, UnmergedStatus, UnmergedStatusCode};
use gpui::{TestAppContext, VisualTestContext};
use project::{FakeFs, WorktreeSettings};
use serde_json::json;
@@ -5060,13 +5088,13 @@ mod tests {
GitListEntry::Header(GitHeaderEntry {
header: Section::Tracked
}),
GitListEntry::GitStatusEntry(GitStatusEntry {
GitListEntry::Status(GitStatusEntry {
abs_path: path!("/root/zed/crates/gpui/gpui.rs").into(),
repo_path: "crates/gpui/gpui.rs".into(),
status: StatusCode::Modified.worktree(),
staging: StageStatus::Unstaged,
}),
GitListEntry::GitStatusEntry(GitStatusEntry {
GitListEntry::Status(GitStatusEntry {
abs_path: path!("/root/zed/crates/util/util.rs").into(),
repo_path: "crates/util/util.rs".into(),
status: StatusCode::Modified.worktree(),
@@ -5075,54 +5103,6 @@ mod tests {
],
);
// TODO(cole) restore this once repository deduplication is implemented properly.
//cx.update_window_entity(&panel, |panel, window, cx| {
// panel.select_last(&Default::default(), window, cx);
// assert_eq!(panel.selected_entry, Some(2));
// panel.open_diff(&Default::default(), window, cx);
//});
//cx.run_until_parked();
//let worktree_roots = workspace.update(cx, |workspace, cx| {
// workspace
// .worktrees(cx)
// .map(|worktree| worktree.read(cx).abs_path())
// .collect::<Vec<_>>()
//});
//pretty_assertions::assert_eq!(
// worktree_roots,
// vec![
// Path::new(path!("/root/zed/crates/gpui")).into(),
// Path::new(path!("/root/zed/crates/util/util.rs")).into(),
// ]
//);
//project.update(cx, |project, cx| {
// let git_store = project.git_store().read(cx);
// // The repo that comes from the single-file worktree can't be selected through the UI.
// let filtered_entries = filtered_repository_entries(git_store, cx)
// .iter()
// .map(|repo| repo.read(cx).worktree_abs_path.clone())
// .collect::<Vec<_>>();
// assert_eq!(
// filtered_entries,
// [Path::new(path!("/root/zed/crates/gpui")).into()]
// );
// // But we can select it artificially here.
// let repo_from_single_file_worktree = git_store
// .repositories()
// .values()
// .find(|repo| {
// repo.read(cx).worktree_abs_path.as_ref()
// == Path::new(path!("/root/zed/crates/util/util.rs"))
// })
// .unwrap()
// .clone();
// // Paths still make sense when we somehow activate a repo that comes from a single-file worktree.
// repo_from_single_file_worktree.update(cx, |repo, cx| repo.set_as_active_repository(cx));
//});
let handle = cx.update_window_entity(&panel, |panel, _, _| {
std::mem::replace(&mut panel.update_visible_entries_task, Task::ready(()))
});
@@ -5135,13 +5115,13 @@ mod tests {
GitListEntry::Header(GitHeaderEntry {
header: Section::Tracked
}),
GitListEntry::GitStatusEntry(GitStatusEntry {
GitListEntry::Status(GitStatusEntry {
abs_path: path!("/root/zed/crates/gpui/gpui.rs").into(),
repo_path: "crates/gpui/gpui.rs".into(),
status: StatusCode::Modified.worktree(),
staging: StageStatus::Unstaged,
}),
GitListEntry::GitStatusEntry(GitStatusEntry {
GitListEntry::Status(GitStatusEntry {
abs_path: path!("/root/zed/crates/util/util.rs").into(),
repo_path: "crates/util/util.rs".into(),
status: StatusCode::Modified.worktree(),
@@ -5150,4 +5130,196 @@ mod tests {
],
);
}
#[gpui::test]
async fn test_bulk_staging(cx: &mut TestAppContext) {
use GitListEntry::*;
init_test(cx);
let fs = FakeFs::new(cx.background_executor.clone());
fs.insert_tree(
"/root",
json!({
"project": {
".git": {},
"src": {
"main.rs": "fn main() {}",
"lib.rs": "pub fn hello() {}",
"utils.rs": "pub fn util() {}"
},
"tests": {
"test.rs": "fn test() {}"
},
"new_file.txt": "new content",
"another_new.rs": "// new file",
"conflict.txt": "conflicted content"
}
}),
)
.await;
fs.set_status_for_repo(
Path::new(path!("/root/project/.git")),
&[
(Path::new("src/main.rs"), StatusCode::Modified.worktree()),
(Path::new("src/lib.rs"), StatusCode::Modified.worktree()),
(Path::new("tests/test.rs"), StatusCode::Modified.worktree()),
(Path::new("new_file.txt"), FileStatus::Untracked),
(Path::new("another_new.rs"), FileStatus::Untracked),
(Path::new("src/utils.rs"), FileStatus::Untracked),
(
Path::new("conflict.txt"),
UnmergedStatus {
first_head: UnmergedStatusCode::Updated,
second_head: UnmergedStatusCode::Updated,
}
.into(),
),
],
);
let project = Project::test(fs.clone(), [Path::new(path!("/root/project"))], cx).await;
let workspace =
cx.add_window(|window, cx| Workspace::test_new(project.clone(), window, cx));
let cx = &mut VisualTestContext::from_window(*workspace, cx);
cx.read(|cx| {
project
.read(cx)
.worktrees(cx)
.nth(0)
.unwrap()
.read(cx)
.as_local()
.unwrap()
.scan_complete()
})
.await;
cx.executor().run_until_parked();
let panel = workspace.update(cx, GitPanel::new).unwrap();
let handle = cx.update_window_entity(&panel, |panel, _, _| {
std::mem::replace(&mut panel.update_visible_entries_task, Task::ready(()))
});
cx.executor().advance_clock(2 * UPDATE_DEBOUNCE);
handle.await;
let entries = panel.read_with(cx, |panel, _| panel.entries.clone());
#[rustfmt::skip]
pretty_assertions::assert_matches!(
entries.as_slice(),
&[
Header(GitHeaderEntry { header: Section::Conflict }),
Status(GitStatusEntry { staging: StageStatus::Unstaged, .. }),
Header(GitHeaderEntry { header: Section::Tracked }),
Status(GitStatusEntry { staging: StageStatus::Unstaged, .. }),
Status(GitStatusEntry { staging: StageStatus::Unstaged, .. }),
Status(GitStatusEntry { staging: StageStatus::Unstaged, .. }),
Header(GitHeaderEntry { header: Section::New }),
Status(GitStatusEntry { staging: StageStatus::Unstaged, .. }),
Status(GitStatusEntry { staging: StageStatus::Unstaged, .. }),
Status(GitStatusEntry { staging: StageStatus::Unstaged, .. }),
],
);
let second_status_entry = entries[3].clone();
panel.update_in(cx, |panel, window, cx| {
panel.toggle_staged_for_entry(&second_status_entry, window, cx);
});
panel.update_in(cx, |panel, window, cx| {
panel.selected_entry = Some(7);
panel.stage_range(&git::StageRange, window, cx);
});
cx.read(|cx| {
project
.read(cx)
.worktrees(cx)
.nth(0)
.unwrap()
.read(cx)
.as_local()
.unwrap()
.scan_complete()
})
.await;
cx.executor().run_until_parked();
let handle = cx.update_window_entity(&panel, |panel, _, _| {
std::mem::replace(&mut panel.update_visible_entries_task, Task::ready(()))
});
cx.executor().advance_clock(2 * UPDATE_DEBOUNCE);
handle.await;
let entries = panel.read_with(cx, |panel, _| panel.entries.clone());
#[rustfmt::skip]
pretty_assertions::assert_matches!(
entries.as_slice(),
&[
Header(GitHeaderEntry { header: Section::Conflict }),
Status(GitStatusEntry { staging: StageStatus::Unstaged, .. }),
Header(GitHeaderEntry { header: Section::Tracked }),
Status(GitStatusEntry { staging: StageStatus::Staged, .. }),
Status(GitStatusEntry { staging: StageStatus::Staged, .. }),
Status(GitStatusEntry { staging: StageStatus::Staged, .. }),
Header(GitHeaderEntry { header: Section::New }),
Status(GitStatusEntry { staging: StageStatus::Staged, .. }),
Status(GitStatusEntry { staging: StageStatus::Unstaged, .. }),
Status(GitStatusEntry { staging: StageStatus::Unstaged, .. }),
],
);
let third_status_entry = entries[4].clone();
panel.update_in(cx, |panel, window, cx| {
panel.toggle_staged_for_entry(&third_status_entry, window, cx);
});
panel.update_in(cx, |panel, window, cx| {
panel.selected_entry = Some(9);
panel.stage_range(&git::StageRange, window, cx);
});
cx.read(|cx| {
project
.read(cx)
.worktrees(cx)
.nth(0)
.unwrap()
.read(cx)
.as_local()
.unwrap()
.scan_complete()
})
.await;
cx.executor().run_until_parked();
let handle = cx.update_window_entity(&panel, |panel, _, _| {
std::mem::replace(&mut panel.update_visible_entries_task, Task::ready(()))
});
cx.executor().advance_clock(2 * UPDATE_DEBOUNCE);
handle.await;
let entries = panel.read_with(cx, |panel, _| panel.entries.clone());
#[rustfmt::skip]
pretty_assertions::assert_matches!(
entries.as_slice(),
&[
Header(GitHeaderEntry { header: Section::Conflict }),
Status(GitStatusEntry { staging: StageStatus::Unstaged, .. }),
Header(GitHeaderEntry { header: Section::Tracked }),
Status(GitStatusEntry { staging: StageStatus::Staged, .. }),
Status(GitStatusEntry { staging: StageStatus::Unstaged, .. }),
Status(GitStatusEntry { staging: StageStatus::Staged, .. }),
Header(GitHeaderEntry { header: Section::New }),
Status(GitStatusEntry { staging: StageStatus::Staged, .. }),
Status(GitStatusEntry { staging: StageStatus::Staged, .. }),
Status(GitStatusEntry { staging: StageStatus::Staged, .. }),
],
);
}
}

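The bulk-staging flow exercised above (toggle one entry, move the selection, then `stage_range`) can be modeled in isolation. This is a hedged sketch, not the `GitPanel` API; the `Staging` enum and `stage_range` helper here are hypothetical stand-ins for the panel's entry state:

```rust
// Hypothetical model of range staging: every entry between an anchor index and
// the current selection becomes staged, regardless of order.
#[derive(Clone, Copy, PartialEq, Debug)]
enum Staging {
    Staged,
    Unstaged,
}

fn stage_range(entries: &[Staging], from: usize, to: usize) -> Vec<Staging> {
    let (lo, hi) = if from <= to { (from, to) } else { (to, from) };
    entries
        .iter()
        .enumerate()
        .map(|(i, &e)| if i >= lo && i <= hi { Staging::Staged } else { e })
        .collect()
}

fn main() {
    use Staging::*;
    let entries = [Unstaged; 6];
    assert_eq!(
        stage_range(&entries, 1, 3),
        vec![Unstaged, Staged, Staged, Staged, Unstaged, Unstaged]
    );
    // Anchor and cursor can be given in either order.
    assert_eq!(stage_range(&entries, 3, 1), stage_range(&entries, 1, 3));
}
```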
View File

@@ -66,6 +66,7 @@ x11 = [
"x11-clipboard",
"filedescriptor",
"open",
"scap?/x11",
]
screen-capture = [
"scap",

View File

@@ -403,12 +403,10 @@ impl ActionRegistry {
/// Useful for transforming the list of available actions into a
/// format suited for static analysis such as in validating keymaps, or
/// generating documentation.
pub fn generate_list_of_all_registered_actions() -> Vec<MacroActionData> {
let mut actions = Vec::new();
for builder in inventory::iter::<MacroActionBuilder> {
actions.push(builder.0());
}
actions
pub fn generate_list_of_all_registered_actions() -> impl Iterator<Item = MacroActionData> {
inventory::iter::<MacroActionBuilder>
.into_iter()
.map(|builder| builder.0())
}
mod no_action {

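The `ActionRegistry` change above swaps an eagerly collected `Vec` for a lazy `impl Iterator`. A standalone sketch of the trade-off, with a hypothetical `Builder` tuple struct standing in for `MacroActionBuilder`:

```rust
// Builder wraps a constructor function, like inventory-registered action builders.
struct Builder(fn() -> String);

// Eager version: allocates and fills a Vec up front.
fn all_actions_eager(builders: &[Builder]) -> Vec<String> {
    let mut actions = Vec::new();
    for b in builders {
        actions.push((b.0)());
    }
    actions
}

// Lazy version: callers collect, filter, or short-circuit without an upfront allocation.
fn all_actions_lazy(builders: &[Builder]) -> impl Iterator<Item = String> + '_ {
    builders.iter().map(|b| (b.0)())
}

fn main() {
    let builders = [
        Builder(|| "editor::Save".to_string()),
        Builder(|| "editor::Undo".to_string()),
    ];
    assert_eq!(
        all_actions_eager(&builders),
        all_actions_lazy(&builders).collect::<Vec<_>>()
    );
    // A caller that only needs the first action never builds the rest.
    assert_eq!(all_actions_lazy(&builders).next().unwrap(), "editor::Save");
}
```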
View File

@@ -1250,11 +1250,7 @@ impl App {
.downcast::<T>()
.unwrap()
.update(cx, |entity_state, cx| {
if let Some(window) = window {
on_new(entity_state, Some(window), cx);
} else {
on_new(entity_state, None, cx);
}
on_new(entity_state, window.as_deref_mut(), cx)
})
},
),

View File

@@ -12,18 +12,13 @@ use std::{
/// Convert an RGB hex color code number to a color type
pub fn rgb(hex: u32) -> Rgba {
let r = ((hex >> 16) & 0xFF) as f32 / 255.0;
let g = ((hex >> 8) & 0xFF) as f32 / 255.0;
let b = (hex & 0xFF) as f32 / 255.0;
let [_, r, g, b] = hex.to_be_bytes().map(|b| (b as f32) / 255.0);
Rgba { r, g, b, a: 1.0 }
}
/// Convert an RGBA hex color code number to [`Rgba`]
pub fn rgba(hex: u32) -> Rgba {
let r = ((hex >> 24) & 0xFF) as f32 / 255.0;
let g = ((hex >> 16) & 0xFF) as f32 / 255.0;
let b = ((hex >> 8) & 0xFF) as f32 / 255.0;
let a = (hex & 0xFF) as f32 / 255.0;
let [r, g, b, a] = hex.to_be_bytes().map(|b| (b as f32) / 255.0);
Rgba { r, g, b, a }
}
@@ -63,14 +58,14 @@ impl Rgba {
if other.a >= 1.0 {
other
} else if other.a <= 0.0 {
return *self;
*self
} else {
return Rgba {
Rgba {
r: (self.r * (1.0 - other.a)) + (other.r * other.a),
g: (self.g * (1.0 - other.a)) + (other.g * other.a),
b: (self.b * (1.0 - other.a)) + (other.b * other.a),
a: self.a,
};
}
}
}
}
@@ -494,12 +489,12 @@ impl Hsla {
if alpha >= 1.0 {
other
} else if alpha <= 0.0 {
return self;
self
} else {
let converted_self = Rgba::from(self);
let converted_other = Rgba::from(other);
let blended_rgb = converted_self.blend(converted_other);
return Hsla::from(blended_rgb);
Hsla::from(blended_rgb)
}
}

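The color refactor above replaces four manual shift-and-mask expressions with one `u32::to_be_bytes` call. A minimal standalone sketch (with hypothetical `unpack_*` helpers) confirming the two decompositions agree:

```rust
// Manual unpacking of an 0xRRGGBBAA literal, as in the pre-refactor code.
fn unpack_manual(hex: u32) -> [f32; 4] {
    let r = ((hex >> 24) & 0xFF) as f32 / 255.0;
    let g = ((hex >> 16) & 0xFF) as f32 / 255.0;
    let b = ((hex >> 8) & 0xFF) as f32 / 255.0;
    let a = (hex & 0xFF) as f32 / 255.0;
    [r, g, b, a]
}

// to_be_bytes returns the bytes most-significant first, i.e. [r, g, b, a]
// for an 0xRRGGBBAA literal, so one array `map` normalizes all channels.
fn unpack_be_bytes(hex: u32) -> [f32; 4] {
    hex.to_be_bytes().map(|byte| byte as f32 / 255.0)
}

fn main() {
    let hex = 0x11223344u32;
    assert_eq!(unpack_manual(hex), unpack_be_bytes(hex));
    // A fully opaque alpha byte normalizes to exactly 1.0.
    assert_eq!(unpack_be_bytes(0x000000FF)[3], 1.0);
}
```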
View File

@@ -903,7 +903,7 @@ pub trait InteractiveElement: Sized {
/// Apply the given style when the given data type is dragged over this element
fn drag_over<S: 'static>(
mut self,
f: impl 'static + Fn(StyleRefinement, &S, &Window, &App) -> StyleRefinement,
f: impl 'static + Fn(StyleRefinement, &S, &mut Window, &mut App) -> StyleRefinement,
) -> Self {
self.interactivity().drag_over_styles.push((
TypeId::of::<S>(),

View File

@@ -13,6 +13,9 @@ pub struct Keystroke {
/// key is the character printed on the key that was pressed
/// e.g. for option-s, key is "s"
/// On layouts that do not have ascii keys (e.g. Thai)
/// this will be the ASCII-equivalent character (q instead of ๆ),
/// and the typed character will be present in key_char.
pub key: String,
/// key_char is the character that could have been typed when

View File

@@ -706,6 +706,60 @@ pub(super) fn log_cursor_icon_warning(message: impl std::fmt::Display) {
}
}
#[cfg(any(feature = "wayland", feature = "x11"))]
fn guess_ascii(keycode: Keycode, shift: bool) -> Option<char> {
let c = match (keycode.raw(), shift) {
(24, _) => 'q',
(25, _) => 'w',
(26, _) => 'e',
(27, _) => 'r',
(28, _) => 't',
(29, _) => 'y',
(30, _) => 'u',
(31, _) => 'i',
(32, _) => 'o',
(33, _) => 'p',
(34, false) => '[',
(34, true) => '{',
(35, false) => ']',
(35, true) => '}',
(38, _) => 'a',
(39, _) => 's',
(40, _) => 'd',
(41, _) => 'f',
(42, _) => 'g',
(43, _) => 'h',
(44, _) => 'j',
(45, _) => 'k',
(46, _) => 'l',
(47, false) => ';',
(47, true) => ':',
(48, false) => '\'',
(48, true) => '"',
(49, false) => '`',
(49, true) => '~',
(51, false) => '\\',
(51, true) => '|',
(52, _) => 'z',
(53, _) => 'x',
(54, _) => 'c',
(55, _) => 'v',
(56, _) => 'b',
(57, _) => 'n',
(58, _) => 'm',
(59, false) => ',',
(59, true) => '<',
(60, false) => '.',
(60, true) => '>',
(61, false) => '/',
(61, true) => '?',
_ => return None,
};
Some(c)
}
#[cfg(any(feature = "wayland", feature = "x11"))]
impl crate::Keystroke {
pub(super) fn from_xkb(
@@ -773,6 +827,8 @@ impl crate::Keystroke {
let name = xkb::keysym_get_name(key_sym).to_lowercase();
if key_sym.is_keypad_key() {
name.replace("kp_", "")
} else if let Some(key_en) = guess_ascii(keycode, modifiers.shift) {
String::from(key_en)
} else {
name
}

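The `guess_ascii` fallback above maps raw X11 keycodes to their US-layout ASCII characters so keybindings still resolve on layouts without Latin keys. A trimmed, self-contained sketch of the same match shape, using a few entries from the table:

```rust
// Map a raw X11 keycode (evdev code + 8) to a US-layout ASCII character.
// Letter keys ignore shift; punctuation keys branch on it.
fn guess_ascii(raw_keycode: u32, shift: bool) -> Option<char> {
    let c = match (raw_keycode, shift) {
        (24, _) => 'q',     // KEY_Q
        (47, false) => ';', // KEY_SEMICOLON
        (47, true) => ':',
        _ => return None, // unknown keycode: fall back to the keysym name
    };
    Some(c)
}

fn main() {
    assert_eq!(guess_ascii(24, true), Some('q'));
    assert_eq!(guess_ascii(47, true), Some(':'));
    assert_eq!(guess_ascii(200, false), None);
}
```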
View File

@@ -119,8 +119,10 @@ impl MarkdownWriter {
.push_back(current_element.clone());
}
for child in node.children.borrow().iter() {
self.visit_node(child, handlers)?;
if self.current_element_stack.len() < 200 {
for child in node.children.borrow().iter() {
self.visit_node(child, handlers)?;
}
}
if let Some(current_element) = current_element {

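The `MarkdownWriter` change above guards recursion behind a stack-depth check. A standalone sketch of the same idea, stopping descent past a fixed nesting depth so pathological input cannot overflow the call stack (the limit of 200 mirrors the `current_element_stack.len() < 200` check in the diff):

```rust
const MAX_DEPTH: usize = 200;

struct Node {
    children: Vec<Node>,
}

// Count how many nodes a depth-capped traversal actually visits.
fn count_visited(node: &Node, depth: usize) -> usize {
    let mut visited = 1;
    // Only recurse while we are below the cap, like the guarded visit_node loop.
    if depth < MAX_DEPTH {
        for child in &node.children {
            visited += count_visited(child, depth + 1);
        }
    }
    visited
}

fn main() {
    // Build a degenerate chain 10_000 nodes deep.
    let mut node = Node { children: vec![] };
    for _ in 0..10_000 {
        node = Node { children: vec![node] };
    }
    // With the cap, only nodes at depths 0..=MAX_DEPTH are visited.
    assert_eq!(count_visited(&node, 0), MAX_DEPTH + 1);
}
```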
View File

@@ -226,21 +226,10 @@ impl HttpClientWithUrl {
}
/// Builds a Zed LLM URL using the given path.
pub fn build_zed_llm_url(
&self,
path: &str,
query: &[(&str, &str)],
use_cloud: bool,
) -> Result<Url> {
pub fn build_zed_llm_url(&self, path: &str, query: &[(&str, &str)]) -> Result<Url> {
let base_url = self.base_url();
let base_api_url = match base_url.as_ref() {
"https://zed.dev" => {
if use_cloud {
"https://cloud.zed.dev"
} else {
"https://llm.zed.dev"
}
}
"https://zed.dev" => "https://cloud.zed.dev",
"https://staging.zed.dev" => "https://llm-staging.zed.dev",
"http://localhost:3000" => "http://localhost:8787",
other => other,

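The simplified `build_zed_llm_url` above reduces to a static mapping from base URL to API host, with unknown bases (e.g. a self-hosted instance) passed through unchanged. The mapping itself, extracted as a sketch:

```rust
// Map a known Zed base URL to its LLM API host; pass anything else through.
fn api_base(base_url: &str) -> &str {
    match base_url {
        "https://zed.dev" => "https://cloud.zed.dev",
        "https://staging.zed.dev" => "https://llm-staging.zed.dev",
        "http://localhost:3000" => "http://localhost:8787",
        other => other,
    }
}

fn main() {
    assert_eq!(api_base("https://zed.dev"), "https://cloud.zed.dev");
    assert_eq!(api_base("https://example.com"), "https://example.com");
}
```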
View File

@@ -163,6 +163,7 @@ pub enum IconName {
ListTree,
ListX,
LoadCircle,
LocationEdit,
LockOutlined,
LspDebug,
LspRestart,
@@ -190,6 +191,7 @@ pub enum IconName {
Play,
PlayAlt,
PlayBug,
PlayFilled,
Plus,
PocketKnife,
Power,
@@ -247,6 +249,7 @@ pub enum IconName {
SwatchBook,
Tab,
Terminal,
TerminalAlt,
TextSnippet,
ThumbsDown,
ThumbsUp,

View File

@@ -334,6 +334,9 @@ impl LanguageRegistry {
if let Some(adapters) = state.lsp_adapters.get_mut(language_name) {
adapters.retain(|adapter| &adapter.name != name)
}
state.all_lsp_adapters.remove(name);
state.available_lsp_adapters.remove(name);
state.version += 1;
state.reload_count += 1;
*state.subscription.0.borrow_mut() = ();

View File

@@ -28,7 +28,6 @@ credentials_provider.workspace = true
copilot.workspace = true
deepseek = { workspace = true, features = ["schemars"] }
editor.workspace = true
feature_flags.workspace = true
fs.workspace = true
futures.workspace = true
google_ai = { workspace = true, features = ["schemars"] }

View File

@@ -2,7 +2,6 @@ use anthropic::AnthropicModelMode;
use anyhow::{Context as _, Result, anyhow};
use chrono::{DateTime, Utc};
use client::{Client, ModelRequestUsage, UserStore, zed_urls};
use feature_flags::{FeatureFlagAppExt as _, ZedCloudFeatureFlag};
use futures::{
AsyncBufReadExt, FutureExt, Stream, StreamExt, future::BoxFuture, stream::BoxStream,
};
@@ -137,7 +136,6 @@ impl State {
cx: &mut Context<Self>,
) -> Self {
let refresh_llm_token_listener = RefreshLlmTokenListener::global(cx);
let use_cloud = cx.has_flag::<ZedCloudFeatureFlag>();
Self {
client: client.clone(),
@@ -165,7 +163,7 @@ impl State {
.await;
}
let response = Self::fetch_models(client, llm_api_token, use_cloud).await?;
let response = Self::fetch_models(client, llm_api_token).await?;
this.update(cx, |this, cx| {
this.update_models(response, cx);
})
@@ -184,7 +182,7 @@ impl State {
let llm_api_token = this.llm_api_token.clone();
cx.spawn(async move |this, cx| {
llm_api_token.refresh(&client).await?;
let response = Self::fetch_models(client, llm_api_token, use_cloud).await?;
let response = Self::fetch_models(client, llm_api_token).await?;
this.update(cx, |this, cx| {
this.update_models(response, cx);
})
@@ -268,18 +266,13 @@ impl State {
async fn fetch_models(
client: Arc<Client>,
llm_api_token: LlmApiToken,
use_cloud: bool,
) -> Result<ListModelsResponse> {
let http_client = &client.http_client();
let token = llm_api_token.acquire(&client).await?;
let request = http_client::Request::builder()
.method(Method::GET)
.uri(
http_client
.build_zed_llm_url("/models", &[], use_cloud)?
.as_ref(),
)
.uri(http_client.build_zed_llm_url("/models", &[])?.as_ref())
.header("Authorization", format!("Bearer {token}"))
.body(AsyncBody::empty())?;
let mut response = http_client
@@ -543,7 +536,6 @@ impl CloudLanguageModel {
llm_api_token: LlmApiToken,
app_version: Option<SemanticVersion>,
body: CompletionBody,
use_cloud: bool,
) -> Result<PerformLlmCompletionResponse> {
let http_client = &client.http_client();
@@ -551,11 +543,9 @@ impl CloudLanguageModel {
let mut refreshed_token = false;
loop {
let request_builder = http_client::Request::builder().method(Method::POST).uri(
http_client
.build_zed_llm_url("/completions", &[], use_cloud)?
.as_ref(),
);
let request_builder = http_client::Request::builder()
.method(Method::POST)
.uri(http_client.build_zed_llm_url("/completions", &[])?.as_ref());
let request_builder = if let Some(app_version) = app_version {
request_builder.header(ZED_VERSION_HEADER_NAME, app_version.to_string())
} else {
@@ -782,7 +772,6 @@ impl LanguageModel for CloudLanguageModel {
let model_id = self.model.id.to_string();
let generate_content_request =
into_google(request, model_id.clone(), GoogleModelMode::Default);
let use_cloud = cx.has_flag::<ZedCloudFeatureFlag>();
async move {
let http_client = &client.http_client();
let token = llm_api_token.acquire(&client).await?;
@@ -798,7 +787,7 @@ impl LanguageModel for CloudLanguageModel {
.method(Method::POST)
.uri(
http_client
.build_zed_llm_url("/count_tokens", &[], use_cloud)?
.build_zed_llm_url("/count_tokens", &[])?
.as_ref(),
)
.header("Content-Type", "application/json")
@@ -847,9 +836,6 @@ impl LanguageModel for CloudLanguageModel {
let intent = request.intent;
let mode = request.mode;
let app_version = cx.update(|cx| AppVersion::global(cx)).ok();
let use_cloud = cx
.update(|cx| cx.has_flag::<ZedCloudFeatureFlag>())
.unwrap_or(false);
let thinking_allowed = request.thinking_allowed;
match self.model.provider {
zed_llm_client::LanguageModelProvider::Anthropic => {
@@ -888,7 +874,6 @@ impl LanguageModel for CloudLanguageModel {
provider_request: serde_json::to_value(&request)
.map_err(|e| anyhow!(e))?,
},
use_cloud,
)
.await
.map_err(|err| match err.downcast::<ApiError>() {
@@ -941,7 +926,6 @@ impl LanguageModel for CloudLanguageModel {
provider_request: serde_json::to_value(&request)
.map_err(|e| anyhow!(e))?,
},
use_cloud,
)
.await?;
@@ -982,7 +966,6 @@ impl LanguageModel for CloudLanguageModel {
provider_request: serde_json::to_value(&request)
.map_err(|e| anyhow!(e))?,
},
use_cloud,
)
.await?;

View File

@@ -44,7 +44,6 @@ dap.workspace = true
futures.workspace = true
gpui.workspace = true
http_client.workspace = true
indoc.workspace = true
language.workspace = true
log.workspace = true
lsp.workspace = true

View File

@@ -14,6 +14,15 @@
"(" @context
")" @context)) @item
(generator_function_declaration
"async"? @context
"function" @context
"*" @context
name: (_) @name
parameters: (formal_parameters
"(" @context
")" @context)) @item
(interface_declaration
"interface" @context
name: (_) @name) @item

View File

@@ -262,7 +262,6 @@ impl LspAdapter for RustLspAdapter {
_: LanguageServerId,
_: Option<&'_ Buffer>,
) {
// https://zed.dev/cla
static REGEX: LazyLock<Regex> =
LazyLock::new(|| Regex::new(r"(?m)`([^`]+)\n`$").expect("Failed to create REGEX"));

View File

@@ -18,6 +18,15 @@
"(" @context
")" @context)) @item
(generator_function_declaration
"async"? @context
"function" @context
"*" @context
name: (_) @name
parameters: (formal_parameters
"(" @context
")" @context)) @item
(interface_declaration
"interface" @context
name: (_) @name) @item

View File

@@ -8,10 +8,9 @@ use futures::future::join_all;
use gpui::{App, AppContext, AsyncApp, Task};
use http_client::github::{AssetKind, GitHubLspBinaryVersion, build_asset_url};
use language::{
Buffer, ContextLocation, ContextProvider, File, LanguageToolchainStore, LspAdapter,
LspAdapterDelegate,
ContextLocation, ContextProvider, File, LanguageToolchainStore, LspAdapter, LspAdapterDelegate,
};
use lsp::{CodeActionKind, LanguageServerBinary, LanguageServerId, LanguageServerName};
use lsp::{CodeActionKind, LanguageServerBinary, LanguageServerName};
use node_runtime::NodeRuntime;
use project::{Fs, lsp_store::language_server_settings};
use serde_json::{Value, json};
@@ -606,7 +605,6 @@ impl LspAdapter for TypeScriptLspAdapter {
}
}
// >>> https://zed.dev/cla <<<
async fn fetch_server_binary(
&self,
latest_version: Box<dyn 'static + Send + Any>,
@@ -750,15 +748,6 @@ impl LspAdapter for TypeScriptLspAdapter {
("TSX".into(), "typescriptreact".into()),
])
}
fn process_diagnostics(
&self,
d: &mut lsp::PublishDiagnosticsParams,
_: LanguageServerId,
_: Option<&'_ Buffer>,
) {
dbg!("called with ", d);
}
}
async fn get_cached_ts_server_binary(
@@ -874,7 +863,7 @@ impl LspAdapter for EsLintLspAdapter {
},
"experimental": {
"useFlatConfig": use_flat_config,
},
}
});
let override_options = cx.update(|cx| {
@@ -1086,6 +1075,62 @@ mod tests {
);
}
#[gpui::test]
async fn test_generator_function_outline(cx: &mut TestAppContext) {
let language = crate::language("javascript", tree_sitter_typescript::LANGUAGE_TSX.into());
let text = r#"
function normalFunction() {
console.log("normal");
}
function* simpleGenerator() {
yield 1;
yield 2;
}
async function* asyncGenerator() {
yield await Promise.resolve(1);
}
function* generatorWithParams(start, end) {
for (let i = start; i <= end; i++) {
yield i;
}
}
class TestClass {
*methodGenerator() {
yield "method";
}
async *asyncMethodGenerator() {
yield "async method";
}
}
"#
.unindent();
let buffer = cx.new(|cx| language::Buffer::local(text, cx).with_language(language, cx));
let outline = buffer.read_with(cx, |buffer, _| buffer.snapshot().outline(None).unwrap());
assert_eq!(
outline
.items
.iter()
.map(|item| (item.text.as_str(), item.depth))
.collect::<Vec<_>>(),
&[
("function normalFunction()", 0),
("function* simpleGenerator()", 0),
("async function* asyncGenerator()", 0),
("function* generatorWithParams( )", 0),
("class TestClass", 0),
("*methodGenerator()", 1),
("async *asyncMethodGenerator()", 1),
]
);
}
#[gpui::test]
async fn test_package_json_discovery(executor: BackgroundExecutor, cx: &mut TestAppContext) {
cx.update(|cx| {

View File

@@ -18,6 +18,15 @@
"(" @context
")" @context)) @item
(generator_function_declaration
"async"? @context
"function" @context
"*" @context
name: (_) @name
parameters: (formal_parameters
"(" @context
")" @context)) @item
(interface_declaration
"interface" @context
name: (_) @name) @item

View File

@@ -280,185 +280,6 @@ impl LspAdapter for VtslsLspAdapter {
("TSX".into(), "typescriptreact".into()),
])
}
fn diagnostic_message_to_markdown(&self, message: &str) -> Option<String> {
use regex::{Captures, Regex};
dbg!(&message);
// Helper functions for formatting
let format_type_block = |prefix: &str, content: &str| -> String {
if prefix.is_empty() {
if content.len() > 50 || content.contains('\n') || content.contains('`') {
format!("\n```typescript\ntype a ={}\n```\n", dbg!(content))
} else {
format!("`{}`", dbg!(content))
}
} else {
if content.len() > 50 || content.contains('\n') || content.contains('`') {
format!(
"{}\n```typescript\ntype a ={}\n```\n",
prefix,
dbg!(content)
)
} else {
format!("{} `{}`", prefix, dbg!(content))
}
}
};
let format_typescript_block =
|content: &str| -> String { format!("\n\n```typescript\n{}\n```\n", dbg!(content)) };
let format_simple_type_block = |content: &str| -> String { format!("`{}`", dbg!(content)) };
let unstyle_code_block = |content: &str| -> String { format!("`{}`", dbg!(content)) };
let mut result = message.to_string();
// Format 'key' with "value"
let re = Regex::new(r#"(\w+)(\s+)'(.+?)'(\s+)with(\s+)"(.+?)""#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format!(
"{}{}`{}`{} with `\"{}\"`",
&caps[1], &caps[2], &caps[3], &caps[4], &caps[6]
)
})
.to_string();
// Format "key"
let re = Regex::new(r#"(\s)'"(.*?)"'(\s|:|.|$)"#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format!("{}`\"{}\"`{}", &caps[1], &caps[2], &caps[3])
})
.to_string();
// Format declare module snippet
let re = Regex::new(r#"['"](declare module )['"](.*)['""];['"']"#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format_typescript_block(&format!("{} \"{}\"", &caps[1], &caps[2]))
})
.to_string();
// Format missing props error
let re = Regex::new(r#"(is missing the following properties from type\s?)'(.*)': ([^:]+)"#)
.unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
let props: Vec<&str> = caps[3].split(", ").filter(|s| !s.is_empty()).collect();
let props_html = props
.iter()
.map(|prop| format!("<li>{}</li>", prop))
.collect::<Vec<_>>()
.join("");
format!("{}`{}`: <ul>{}</ul>", &caps[1], &caps[2], props_html)
})
.to_string();
// Format type pairs
let re = Regex::new(r#"(?i)(types) ['"](.*?)['"] and ['"](.*?)['"][.]?"#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format!("{} `{}` and `{}`", &caps[1], &caps[2], &caps[3])
})
.to_string();
// Format type annotation options
let re = Regex::new(r#"(?i)type annotation must be ['"](.*?)['"] or ['"](.*?)['"][.]?"#)
.unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format!("type annotation must be `{}` or `{}`", &caps[1], &caps[2])
})
.to_string();
// Format overload
let re = Regex::new(r#"(?i)(Overload \d of \d), ['"](.*?)['"], "#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format!("{}, `{}`, ", &caps[1], &caps[2])
})
.to_string();
// Format simple strings
let re = Regex::new(r#"^['"]"[^"]*"['"]$"#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| format_typescript_block(&caps[0]))
.to_string();
// Replace module 'x' by module "x" for ts error #2307
let re = Regex::new(r#"(?i)(module )'([^"]*?)'"#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format!("{}\"{}\"", &caps[1], &caps[2])
})
.to_string();
// Format string types
let re = Regex::new(r#"(?i)(module|file|file name|imported via) ['""](.*?)['""]"#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format_type_block(&caps[1], &format!("\"{}\"", &caps[2]))
})
.to_string();
// Format types
dbg!(&result);
let re = Regex::new(r#"(?i)(type|type alias|interface|module|file|file name|class|method's|subtype of constraint) ['"](.*?)['"]"#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
dbg!(&caps);
format_type_block(&caps[1], &caps[2])
})
.to_string();
// Format reversed types
let re = Regex::new(r#"(?i)(.*)['"]([^>]*)['"] (type|interface|return type|file|module|is (not )?assignable)"#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format!("{}`{}` {}", &caps[1], &caps[2], &caps[3])
})
.to_string();
// Format simple types that weren't captured before
let re = Regex::new(
r#"['"]((void|null|undefined|any|boolean|string|number|bigint|symbol)(\[\])?)['"']"#,
)
.unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format_simple_type_block(&caps[1])
})
.to_string();
// Format some typescript keywords
let re = Regex::new(r#"['"](import|export|require|in|continue|break|let|false|true|const|new|throw|await|for await|[0-9]+)( ?.*?)['"]"#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format_typescript_block(&format!("{}{}", &caps[1], &caps[2]))
})
.to_string();
// Format return values
let re = Regex::new(r#"(?i)(return|operator) ['"](.*?)['"']"#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format!("{} {}", &caps[1], format_typescript_block(&caps[2]))
})
.to_string();
// Format regular code blocks
let re = Regex::new(r#"(\W|^)'([^'"]*?)'(\W|$)"#).unwrap();
result = re
.replace_all(&result, |caps: &Captures| {
format!("{}{}{}", &caps[1], unstyle_code_block(&caps[2]), &caps[3])
})
.to_string();
Some(result)
}
}
async fn get_cached_ts_server_binary(
@@ -480,25 +301,3 @@ async fn get_cached_ts_server_binary(
.await
.log_err()
}
#[cfg(test)]
mod test {
use super::*;
use indoc::indoc;
#[test]
fn test_diagnostic_message_to_markdown() {
let message = "Property 'user' is missing in type '{ person: { username: string; email: string; }; }' but required in type '{ user: { name: string; email: `${string}@${string}.${string}`; age: number; }; }'.";
let expected = indoc! { "
Property `user` is missing in type `{ person: { username: string; email: string; }; }` but required in type
```typescript
{ user: { name: string; email: `${string}@${string}.${string}`; age: number; }; }
```
"};
let result = VtslsLspAdapter::new(NodeRuntime::unavailable())
.diagnostic_message_to_markdown(message)
.unwrap();
pretty_assertions::assert_eq!(result, expected.to_string());
}
}

View File

@@ -1534,26 +1534,12 @@ impl MarkdownElementBuilder {
rendered_index: self.pending_line.text.len(),
source_index: source_range.start,
});
if text.starts_with("type a =") {
self.pending_line.text.push_str(&text["type a =".len()..]);
} else {
self.pending_line.text.push_str(text);
}
self.pending_line.text.push_str(text);
self.current_source_index = source_range.end;
if let Some(Some(language)) = self.code_block_stack.last() {
dbg!(&language);
let mut offset = 0;
for (mut range, highlight_id) in
language.highlight_text(&Rope::from(text), 0..text.len())
{
if text.starts_with("type a =") {
if range.start < "type a =".len() || range.end < "type a =".len() {
continue;
}
range.start -= "type a =".len();
range.end -= "type a =".len();
};
for (range, highlight_id) in language.highlight_text(&Rope::from(text), 0..text.len()) {
if range.start > offset {
self.pending_line
.runs

View File

@@ -31,6 +31,7 @@ aho-corasick.workspace = true
anyhow.workspace = true
askpass.workspace = true
async-trait.workspace = true
base64.workspace = true
buffer_diff.workspace = true
circular-buffer.workspace = true
client.workspace = true
@@ -72,6 +73,7 @@ settings.workspace = true
sha2.workspace = true
shellexpand.workspace = true
shlex.workspace = true
smallvec.workspace = true
smol.workspace = true
snippet.workspace = true
snippet_provider.workspace = true

View File

@@ -15,7 +15,9 @@ pub mod breakpoint_store;
pub mod dap_command;
pub mod dap_store;
pub mod locators;
mod memory;
pub mod session;
#[cfg(any(feature = "test-support", test))]
pub mod test;
pub use memory::MemoryCell;

View File

@@ -1,6 +1,7 @@
use std::sync::Arc;
use anyhow::{Context as _, Ok, Result};
use base64::Engine;
use dap::{
Capabilities, ContinueArguments, ExceptionFilterOptions, InitializeRequestArguments,
InitializeRequestArgumentsPathFormat, NextArguments, SetVariableResponse, SourceBreakpoint,
@@ -10,6 +11,7 @@ use dap::{
proto_conversions::ProtoConversion,
requests::{Continue, Next},
};
use rpc::proto;
use serde_json::Value;
use util::ResultExt;
@@ -812,7 +814,7 @@ impl DapCommand for RestartCommand {
}
}
#[derive(Debug, Hash, PartialEq, Eq)]
#[derive(Clone, Debug, Hash, PartialEq, Eq)]
pub struct VariablesCommand {
pub variables_reference: u64,
pub filter: Option<VariablesArgumentsFilter>,
@@ -1666,6 +1668,130 @@ impl LocalDapCommand for SetBreakpoints {
Ok(message.breakpoints)
}
}
#[derive(Clone, Debug, Hash, PartialEq, Eq)]
pub enum DataBreakpointContext {
Variable {
variables_reference: u64,
name: String,
bytes: Option<u64>,
},
Expression {
expression: String,
frame_id: Option<u64>,
},
Address {
address: String,
bytes: Option<u64>,
},
}
impl DataBreakpointContext {
pub fn human_readable_label(&self) -> String {
match self {
DataBreakpointContext::Variable { name, .. } => format!("Variable: {}", name),
DataBreakpointContext::Expression { expression, .. } => {
format!("Expression: {}", expression)
}
DataBreakpointContext::Address { address, bytes } => {
let mut label = format!("Address: {}", address);
if let Some(bytes) = bytes {
label.push_str(&format!(
" ({} byte{})",
bytes,
if *bytes == 1 { "" } else { "s" }
));
}
label
}
}
}
}
#[derive(Clone, Debug, Hash, PartialEq, Eq)]
pub(crate) struct DataBreakpointInfoCommand {
pub context: Arc<DataBreakpointContext>,
pub mode: Option<String>,
}
impl LocalDapCommand for DataBreakpointInfoCommand {
type Response = dap::DataBreakpointInfoResponse;
type DapRequest = dap::requests::DataBreakpointInfo;
const CACHEABLE: bool = true;
// todo(debugger): We should expand this trait in the future to take a &self
// Depending on the command, is_supported could be different
fn is_supported(capabilities: &Capabilities) -> bool {
capabilities.supports_data_breakpoints.unwrap_or(false)
}
fn to_dap(&self) -> <Self::DapRequest as dap::requests::Request>::Arguments {
let (variables_reference, name, frame_id, as_address, bytes) = match &*self.context {
DataBreakpointContext::Variable {
variables_reference,
name,
bytes,
} => (
Some(*variables_reference),
name.clone(),
None,
Some(false),
*bytes,
),
DataBreakpointContext::Expression {
expression,
frame_id,
} => (None, expression.clone(), *frame_id, Some(false), None),
DataBreakpointContext::Address { address, bytes } => {
(None, address.clone(), None, Some(true), *bytes)
}
};
dap::DataBreakpointInfoArguments {
variables_reference,
name,
frame_id,
bytes,
as_address,
mode: self.mode.clone(),
}
}
fn response_from_dap(
&self,
message: <Self::DapRequest as dap::requests::Request>::Response,
) -> Result<Self::Response> {
Ok(message)
}
}
#[derive(Clone, Debug, Hash, PartialEq, Eq)]
pub(crate) struct SetDataBreakpointsCommand {
pub breakpoints: Vec<dap::DataBreakpoint>,
}
impl LocalDapCommand for SetDataBreakpointsCommand {
type Response = Vec<dap::Breakpoint>;
type DapRequest = dap::requests::SetDataBreakpoints;
fn is_supported(capabilities: &Capabilities) -> bool {
capabilities.supports_data_breakpoints.unwrap_or(false)
}
fn to_dap(&self) -> <Self::DapRequest as dap::requests::Request>::Arguments {
dap::SetDataBreakpointsArguments {
breakpoints: self.breakpoints.clone(),
}
}
fn response_from_dap(
&self,
message: <Self::DapRequest as dap::requests::Request>::Response,
) -> Result<Self::Response> {
Ok(message.breakpoints)
}
}
#[derive(Clone, Debug, Hash, PartialEq)]
pub(super) enum SetExceptionBreakpoints {
Plain {
@@ -1774,3 +1900,76 @@ impl DapCommand for LocationsCommand {
})
}
}
#[derive(Clone, Debug, Hash, PartialEq, Eq)]
pub(crate) struct ReadMemory {
pub(crate) memory_reference: String,
pub(crate) offset: Option<u64>,
pub(crate) count: u64,
}
#[derive(Clone, Debug, PartialEq)]
pub(crate) struct ReadMemoryResponse {
pub(super) address: Arc<str>,
pub(super) unreadable_bytes: Option<u64>,
pub(super) content: Arc<[u8]>,
}
impl LocalDapCommand for ReadMemory {
type Response = ReadMemoryResponse;
type DapRequest = dap::requests::ReadMemory;
const CACHEABLE: bool = true;
fn is_supported(capabilities: &Capabilities) -> bool {
capabilities
.supports_read_memory_request
.unwrap_or_default()
}
fn to_dap(&self) -> <Self::DapRequest as dap::requests::Request>::Arguments {
dap::ReadMemoryArguments {
memory_reference: self.memory_reference.clone(),
offset: self.offset,
count: self.count,
}
}
fn response_from_dap(
&self,
message: <Self::DapRequest as dap::requests::Request>::Response,
) -> Result<Self::Response> {
let data = if let Some(data) = message.data {
base64::engine::general_purpose::STANDARD
.decode(data)
.log_err()
.context("parsing base64 data from DAP's ReadMemory response")?
} else {
vec![]
};
Ok(ReadMemoryResponse {
address: message.address.into(),
content: data.into(),
unreadable_bytes: message.unreadable_bytes,
})
}
}
impl LocalDapCommand for dap::WriteMemoryArguments {
type Response = dap::WriteMemoryResponse;
type DapRequest = dap::requests::WriteMemory;
fn is_supported(capabilities: &Capabilities) -> bool {
capabilities
.supports_write_memory_request
.unwrap_or_default()
}
fn to_dap(&self) -> <Self::DapRequest as dap::requests::Request>::Arguments {
self.clone()
}
fn response_from_dap(
&self,
message: <Self::DapRequest as dap::requests::Request>::Response,
) -> Result<Self::Response> {
Ok(message)
}
}

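The `human_readable_label` logic added above includes a small singular/plural branch for the byte count. The `Address` arm, extracted as a standalone sketch:

```rust
// Build the label for an address data breakpoint, pluralizing "byte" as needed.
fn address_label(address: &str, bytes: Option<u64>) -> String {
    let mut label = format!("Address: {}", address);
    if let Some(bytes) = bytes {
        label.push_str(&format!(
            " ({} byte{})",
            bytes,
            if bytes == 1 { "" } else { "s" }
        ));
    }
    label
}

fn main() {
    assert_eq!(address_label("0xdeadbeef", Some(1)), "Address: 0xdeadbeef (1 byte)");
    assert_eq!(address_label("0xdeadbeef", Some(8)), "Address: 0xdeadbeef (8 bytes)");
    assert_eq!(address_label("0xdeadbeef", None), "Address: 0xdeadbeef");
}
```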
View File

@@ -6,6 +6,7 @@ use super::{
};
use crate::{
InlayHint, InlayHintLabel, ProjectEnvironment, ResolveState,
debugger::session::SessionQuirks,
project_settings::ProjectSettings,
terminals::{SshCommand, wrap_for_ssh},
worktree_store::WorktreeStore,
@@ -385,10 +386,11 @@ impl DapStore {
pub fn new_session(
&mut self,
label: SharedString,
label: Option<SharedString>,
adapter: DebugAdapterName,
task_context: TaskContext,
parent_session: Option<Entity<Session>>,
quirks: SessionQuirks,
cx: &mut Context<Self>,
) -> Entity<Session> {
let session_id = SessionId(util::post_inc(&mut self.next_session_id));
@@ -406,6 +408,7 @@ impl DapStore {
label,
adapter,
task_context,
quirks,
cx,
);


@@ -0,0 +1,384 @@
//! This module defines the format in which the memory of a debuggee is represented.
//!
//! Each byte in memory can be either mapped or unmapped. We try to mimic that twofold:
//! - We assume that the memory is divided into pages of a fixed size.
//! - We assume that each page can be either mapped or unmapped.
//!
//! These two assumptions drive the shape of the memory representation.
//! In particular, we want unmapped pages to be represented without allocating any memory, as *most*
//! of the memory in a program's address space is usually unmapped.
//! Note that per DAP we don't know what the address space layout is, so we can't optimize based on it.
//! Also note that while we optimize for a paged layout, we still want to be able to represent memory that is not paged;
//! this use case is relevant to embedded targets. Furthermore, we default to a 4 KiB page size.
//! That size is picked arbitrarily as a ubiquitous default - beyond that, the underlying format of Zed's memory storage should not be relevant
//! to the users of this module.
use std::{collections::BTreeMap, ops::RangeInclusive, sync::Arc};
use gpui::BackgroundExecutor;
use smallvec::SmallVec;
const PAGE_SIZE: u64 = 4096;
/// Represents the contents of a single page. We special-case unmapped pages to be allocation-free,
/// since they're going to make up the majority of the memory in a program space (even though the user might not even get to see them - ever).
#[derive(Clone, Debug)]
pub(super) enum PageContents {
/// Whole page is unreadable.
Unmapped,
Mapped(Arc<MappedPageContents>),
}
impl PageContents {
#[cfg(test)]
fn mapped(contents: Vec<u8>) -> Self {
PageContents::Mapped(Arc::new(MappedPageContents(
vec![PageChunk::Mapped(contents.into())].into(),
)))
}
}
#[derive(Clone, Debug)]
enum PageChunk {
Mapped(Arc<[u8]>),
Unmapped(u64),
}
impl PageChunk {
fn len(&self) -> u64 {
match self {
PageChunk::Mapped(contents) => contents.len() as u64,
PageChunk::Unmapped(size) => *size,
}
}
}
impl MappedPageContents {
fn len(&self) -> u64 {
self.0.iter().map(|chunk| chunk.len()).sum()
}
}
/// We hope for the whole page to be mapped in a single chunk, but we do leave the possibility open
/// of having interleaved read permissions in a single page; debuggee's execution environment might either
/// have a different page size OR it might not have paged memory layout altogether
/// (which might be relevant to embedded systems).
///
/// As stated previously, the concept of a page in this module has more to do
/// with optimizing the fetching of memory than with the underlying layout
/// of the debuggee's memory.
#[derive(Default, Debug)]
pub(super) struct MappedPageContents(
/// Most of the time there should be only one chunk (either mapped or unmapped),
/// but we do leave the possibility open of having multiple regions of memory in a single page.
SmallVec<[PageChunk; 1]>,
);
type MemoryAddress = u64;
#[derive(Clone, Copy, Debug, PartialEq, PartialOrd, Ord, Eq)]
#[repr(transparent)]
pub(super) struct PageAddress(u64);
impl PageAddress {
pub(super) fn iter_range(
range: RangeInclusive<PageAddress>,
) -> impl Iterator<Item = PageAddress> {
let mut current = range.start().0;
let end = range.end().0;
std::iter::from_fn(move || {
if current > end {
None
} else {
let addr = PageAddress(current);
current += PAGE_SIZE;
Some(addr)
}
})
}
}
pub(super) struct Memory {
pages: BTreeMap<PageAddress, PageContents>,
}
/// Represents a single memory cell (or None if a given cell is unmapped/unknown).
#[derive(Copy, Clone, Debug, PartialEq, PartialOrd, Ord, Eq)]
#[repr(transparent)]
pub struct MemoryCell(pub Option<u8>);
impl Memory {
pub(super) fn new() -> Self {
Self {
pages: Default::default(),
}
}
pub(super) fn memory_range_to_page_range(
range: RangeInclusive<MemoryAddress>,
) -> RangeInclusive<PageAddress> {
let start_page = (range.start() / PAGE_SIZE) * PAGE_SIZE;
let end_page = (range.end() / PAGE_SIZE) * PAGE_SIZE;
PageAddress(start_page)..=PageAddress(end_page)
}
pub(super) fn build_page(&self, page_address: PageAddress) -> Option<MemoryPageBuilder> {
if self.pages.contains_key(&page_address) {
// We already know the state of this page.
None
} else {
Some(MemoryPageBuilder::new(page_address))
}
}
pub(super) fn insert_page(&mut self, address: PageAddress, page: PageContents) {
self.pages.insert(address, page);
}
pub(super) fn memory_range(&self, range: RangeInclusive<MemoryAddress>) -> MemoryIterator {
let pages = Self::memory_range_to_page_range(range.clone());
let pages = self
.pages
.range(pages)
.map(|(address, page)| (*address, page.clone()))
.collect::<Vec<_>>();
MemoryIterator::new(range, pages.into_iter())
}
pub(crate) fn clear(&mut self, background_executor: &BackgroundExecutor) {
let memory = std::mem::take(&mut self.pages);
background_executor
.spawn(async move {
drop(memory);
})
.detach();
}
}
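The page math in `memory_range_to_page_range` and `PageAddress::iter_range` boils down to rounding both ends of a byte range down to their page base and stepping between them in `PAGE_SIZE` increments. A standalone sketch of that arithmetic (function names here are illustrative, not part of the module):

```rust
const PAGE_SIZE: u64 = 4096;

/// Round an address down to the base address of its page.
fn page_base(addr: u64) -> u64 {
    (addr / PAGE_SIZE) * PAGE_SIZE
}

/// All page base addresses touched by an inclusive byte range.
fn pages_covering(range: std::ops::RangeInclusive<u64>) -> Vec<u64> {
    let (start, end) = (page_base(*range.start()), page_base(*range.end()));
    (start..=end).step_by(PAGE_SIZE as usize).collect()
}
```

For example, the byte range `4000..=9000` touches three pages: the ones based at 0, 4096, and 8192.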
/// Builder for memory pages.
///
/// Memory reads in DAP are sequential (or at least we make them so).
/// The ReadMemory response includes an `unreadableBytes` property indicating the number of bytes
/// that could not be read after the last successfully read byte.
///
/// We use it as follows:
/// - We start off with a "large" 1-page ReadMemory request.
/// - If it succeeds/fails wholesale, cool; we have no unknown memory regions in this page.
/// - If it succeeds partially, we know # of mapped bytes.
/// We might also know the # of unmapped bytes.
/// However, we're still unsure about what's *after* the unreadable region.
///
/// This is where this builder comes in. It lets us track the state of figuring out the contents of a single page.
pub(super) struct MemoryPageBuilder {
chunks: MappedPageContents,
base_address: PageAddress,
left_to_read: u64,
}
/// Represents a chunk of memory whose state (mapped or unmapped) we don't yet know; we need
/// to issue a request to figure out its state.
pub(super) struct UnknownMemory {
pub(super) address: MemoryAddress,
pub(super) size: u64,
}
impl MemoryPageBuilder {
fn new(base_address: PageAddress) -> Self {
Self {
chunks: Default::default(),
base_address,
left_to_read: PAGE_SIZE,
}
}
pub(super) fn build(self) -> (PageAddress, PageContents) {
debug_assert_eq!(self.left_to_read, 0);
debug_assert_eq!(
self.chunks.len(),
PAGE_SIZE,
"Expected `build` to be called on a fully-fetched page"
);
let contents = if let Some(first) = self.chunks.0.first()
&& self.chunks.len() == 1
&& matches!(first, PageChunk::Unmapped(PAGE_SIZE))
{
PageContents::Unmapped
} else {
PageContents::Mapped(Arc::new(MappedPageContents(self.chunks.0)))
};
(self.base_address, contents)
}
/// Drives the fetching of memory, in an iterator-esque style.
pub(super) fn next_request(&self) -> Option<UnknownMemory> {
if self.left_to_read == 0 {
None
} else {
let offset_in_current_page = PAGE_SIZE - self.left_to_read;
Some(UnknownMemory {
address: self.base_address.0 + offset_in_current_page,
size: self.left_to_read,
})
}
}
pub(super) fn unknown(&mut self, bytes: u64) {
if bytes == 0 {
return;
}
self.left_to_read -= bytes;
self.chunks.0.push(PageChunk::Unmapped(bytes));
}
pub(super) fn known(&mut self, data: Arc<[u8]>) {
if data.is_empty() {
return;
}
self.left_to_read -= data.len() as u64;
self.chunks.0.push(PageChunk::Mapped(data));
}
}
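The fetch protocol described above can be modeled with a toy 16-byte page and an oracle that stops reading at an unmapped hole and reports the hole's size, mirroring how `MemoryPageBuilder` consumes `content` and `unreadable_bytes`. This is a hypothetical simulation, not the real DAP round-trip:

```rust
const PAGE: u64 = 16;

#[derive(Debug, PartialEq)]
enum Chunk {
    Mapped(Vec<u8>),
    Unmapped(u64),
}

/// Stand-in for the DAP adapter: reads from `offset` until it hits the unmapped
/// hole at 6..10 and reports the hole's size as the unreadable-byte count.
fn read_memory(offset: u64, count: u64) -> (Vec<u8>, u64) {
    let hole = 6u64..10;
    let end = (offset + count).min(PAGE);
    if hole.contains(&offset) {
        return (Vec::new(), hole.end - offset);
    }
    let readable_end = if offset < hole.start { hole.start.min(end) } else { end };
    // Memory cell values are just their addresses, for easy checking.
    let data: Vec<u8> = (offset..readable_end).map(|a| a as u8).collect();
    let unreadable = if readable_end == hole.start && end > hole.start {
        hole.end - hole.start
    } else {
        0
    };
    (data, unreadable)
}

/// Drive sequential reads until the whole page is accounted for,
/// analogous to the next_request/known/unknown cycle above.
fn fetch_page() -> Vec<Chunk> {
    let mut chunks = Vec::new();
    let mut left = PAGE;
    while left > 0 {
        let offset = PAGE - left;
        let (data, unreadable) = read_memory(offset, left);
        if data.is_empty() && unreadable == 0 {
            // Nothing learned; mark the rest unmapped to guarantee progress.
            chunks.push(Chunk::Unmapped(left));
            break;
        }
        if !data.is_empty() {
            left -= data.len() as u64;
            chunks.push(Chunk::Mapped(data));
        }
        if unreadable > 0 {
            left -= unreadable;
            chunks.push(Chunk::Unmapped(unreadable));
        }
    }
    chunks
}
```

Running this yields three chunks: six mapped bytes, a four-byte unmapped hole, then six more mapped bytes, matching the interleaved layout the builder is designed to capture.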
fn page_contents_into_iter(data: Arc<MappedPageContents>) -> Box<dyn Iterator<Item = MemoryCell>> {
let mut data_range = 0..data.0.len();
let iter = std::iter::from_fn(move || {
let data_ref = data.clone();
data_range.next().map(move |index| {
let contents = &data_ref.0[index];
match contents {
PageChunk::Mapped(items) => {
let chunk_range = 0..items.len();
let items = items.clone();
Box::new(
chunk_range
.into_iter()
.map(move |ix| MemoryCell(Some(items[ix]))),
) as Box<dyn Iterator<Item = MemoryCell>>
}
PageChunk::Unmapped(len) => {
Box::new(std::iter::repeat_n(MemoryCell(None), *len as usize))
}
}
})
})
.flatten();
Box::new(iter)
}
/// Defines an iteration over a range of memory. Some of this memory might be unmapped or straight up missing.
/// Thus, this iterator alternates between synthesizing values and yielding known memory.
pub struct MemoryIterator {
start: MemoryAddress,
end: MemoryAddress,
current_known_page: Option<(PageAddress, Box<dyn Iterator<Item = MemoryCell>>)>,
pages: std::vec::IntoIter<(PageAddress, PageContents)>,
}
impl MemoryIterator {
fn new(
range: RangeInclusive<MemoryAddress>,
pages: std::vec::IntoIter<(PageAddress, PageContents)>,
) -> Self {
Self {
start: *range.start(),
end: *range.end(),
current_known_page: None,
pages,
}
}
fn fetch_next_page(&mut self) -> bool {
if let Some((mut address, chunk)) = self.pages.next() {
let mut contents = match chunk {
PageContents::Unmapped => None,
PageContents::Mapped(mapped_page_contents) => {
Some(page_contents_into_iter(mapped_page_contents))
}
};
if address.0 < self.start {
// Skip ahead until our iterator is at the start of the requested range,
// e.g. with address: 20 and start: 25 we skip the first 5 cells.
let to_skip = self.start - address.0;
address.0 += to_skip;
if let Some(contents) = &mut contents {
contents.nth(to_skip as usize - 1);
}
}
self.current_known_page = contents.map(|contents| (address, contents));
true
} else {
false
}
}
}
impl Iterator for MemoryIterator {
type Item = MemoryCell;
fn next(&mut self) -> Option<Self::Item> {
if self.start > self.end {
return None;
}
if let Some((current_page_address, current_memory_chunk)) = self.current_known_page.as_mut()
{
if current_page_address.0 <= self.start {
if let Some(next_cell) = current_memory_chunk.next() {
self.start += 1;
return Some(next_cell);
} else {
self.current_known_page.take();
}
}
}
if !self.fetch_next_page() {
self.start += 1;
return Some(MemoryCell(None));
} else {
self.next()
}
}
}
#[cfg(test)]
mod tests {
use crate::debugger::{
MemoryCell,
memory::{MemoryIterator, PageAddress, PageContents},
};
#[test]
fn iterate_over_unmapped_memory() {
let empty_iterator = MemoryIterator::new(0..=127, Default::default());
let actual = empty_iterator.collect::<Vec<_>>();
let expected = vec![MemoryCell(None); 128];
assert_eq!(actual.len(), expected.len());
assert_eq!(actual, expected);
}
#[test]
fn iterate_over_partially_mapped_memory() {
let it = MemoryIterator::new(
0..=127,
vec![(PageAddress(5), PageContents::mapped(vec![1]))].into_iter(),
);
let actual = it.collect::<Vec<_>>();
let expected = std::iter::repeat_n(MemoryCell(None), 5)
.chain(std::iter::once(MemoryCell(Some(1))))
.chain(std::iter::repeat_n(MemoryCell(None), 122))
.collect::<Vec<_>>();
assert_eq!(actual.len(), expected.len());
assert_eq!(actual, expected);
}
#[test]
fn reads_from_the_middle_of_a_page() {
let partial_iter = MemoryIterator::new(
20..=30,
vec![(PageAddress(0), PageContents::mapped((0..255).collect()))].into_iter(),
);
let actual = partial_iter.collect::<Vec<_>>();
let expected = (20..=30)
.map(|val| MemoryCell(Some(val)))
.collect::<Vec<_>>();
assert_eq!(actual.len(), expected.len());
assert_eq!(actual, expected);
}
}
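For contrast with the paged design, `MemoryIterator`'s contract can be expressed as a deliberately naive per-cell lookup over a sparse map; the real iterator avoids a map lookup per cell by walking whole pages. The helper below is hypothetical, for illustration only:

```rust
use std::collections::BTreeMap;

/// Naive model of MemoryIterator's contract: Some(byte) where memory is known,
/// None where it is unmapped or missing, over an inclusive address range.
fn iter_sparse(map: &BTreeMap<u64, u8>, range: std::ops::RangeInclusive<u64>) -> Vec<Option<u8>> {
    range.map(|addr| map.get(&addr).copied()).collect()
}
```

This matches the behavior exercised by `iterate_over_partially_mapped_memory` above: unmapped cells are synthesized as `None` rather than omitted.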


@@ -1,18 +1,21 @@
use crate::debugger::breakpoint_store::BreakpointSessionState;
use crate::debugger::dap_command::{DataBreakpointContext, ReadMemory};
use crate::debugger::memory::{self, Memory, MemoryIterator, MemoryPageBuilder, PageAddress};
use super::breakpoint_store::{
BreakpointStore, BreakpointStoreEvent, BreakpointUpdatedReason, SourceBreakpoint,
};
use super::dap_command::{
self, Attach, ConfigurationDone, ContinueCommand, DisconnectCommand, EvaluateCommand,
Initialize, Launch, LoadedSourcesCommand, LocalDapCommand, LocationsCommand, ModulesCommand,
NextCommand, PauseCommand, RestartCommand, RestartStackFrameCommand, ScopesCommand,
SetExceptionBreakpoints, SetVariableValueCommand, StackTraceCommand, StepBackCommand,
StepCommand, StepInCommand, StepOutCommand, TerminateCommand, TerminateThreadsCommand,
ThreadsCommand, VariablesCommand,
self, Attach, ConfigurationDone, ContinueCommand, DataBreakpointInfoCommand, DisconnectCommand,
EvaluateCommand, Initialize, Launch, LoadedSourcesCommand, LocalDapCommand, LocationsCommand,
ModulesCommand, NextCommand, PauseCommand, RestartCommand, RestartStackFrameCommand,
ScopesCommand, SetDataBreakpointsCommand, SetExceptionBreakpoints, SetVariableValueCommand,
StackTraceCommand, StepBackCommand, StepCommand, StepInCommand, StepOutCommand,
TerminateCommand, TerminateThreadsCommand, ThreadsCommand, VariablesCommand,
};
use super::dap_store::DapStore;
use anyhow::{Context as _, Result, anyhow};
use base64::Engine;
use collections::{HashMap, HashSet, IndexMap};
use dap::adapters::{DebugAdapterBinary, DebugAdapterName};
use dap::messages::Response;
@@ -26,7 +29,7 @@ use dap::{
use dap::{
ExceptionBreakpointsFilter, ExceptionFilterOptions, OutputEvent, OutputEventCategory,
RunInTerminalRequestArguments, StackFramePresentationHint, StartDebuggingRequestArguments,
StartDebuggingRequestArgumentsRequest, VariablePresentationHint,
StartDebuggingRequestArgumentsRequest, VariablePresentationHint, WriteMemoryArguments,
};
use futures::SinkExt;
use futures::channel::mpsc::UnboundedSender;
@@ -42,6 +45,7 @@ use serde_json::Value;
use smol::stream::StreamExt;
use std::any::TypeId;
use std::collections::BTreeMap;
use std::ops::RangeInclusive;
use std::u64;
use std::{
any::Any,
@@ -52,7 +56,7 @@ use std::{
};
use task::TaskContext;
use text::{PointUtf16, ToPointUtf16};
use util::ResultExt;
use util::{ResultExt, maybe};
use worktree::Worktree;
#[derive(Debug, Copy, Clone, Hash, PartialEq, PartialOrd, Ord, Eq)]
@@ -134,8 +138,15 @@ pub struct Watcher {
pub presentation_hint: Option<VariablePresentationHint>,
}
pub enum Mode {
Building,
#[derive(Debug, Clone, PartialEq)]
pub struct DataBreakpointState {
pub dap: dap::DataBreakpoint,
pub is_enabled: bool,
pub context: Arc<DataBreakpointContext>,
}
pub enum SessionState {
Building(Option<Task<Result<()>>>),
Running(RunningMode),
}
@@ -151,6 +162,12 @@ pub struct RunningMode {
messages_tx: UnboundedSender<Message>,
}
#[derive(Clone, Copy, Debug, PartialEq, Eq, Default)]
pub struct SessionQuirks {
pub compact: bool,
pub prefer_thread_name: bool,
}
fn client_source(abs_path: &Path) -> dap::Source {
dap::Source {
name: abs_path
@@ -554,15 +571,15 @@ impl RunningMode {
}
}
impl Mode {
impl SessionState {
pub(super) fn request_dap<R: LocalDapCommand>(&self, request: R) -> Task<Result<R::Response>>
where
<R::DapRequest as dap::requests::Request>::Response: 'static,
<R::DapRequest as dap::requests::Request>::Arguments: 'static + Send,
{
match self {
Mode::Running(debug_adapter_client) => debug_adapter_client.request(request),
Mode::Building => Task::ready(Err(anyhow!(
SessionState::Running(debug_adapter_client) => debug_adapter_client.request(request),
SessionState::Building(_) => Task::ready(Err(anyhow!(
"no adapter running to send request: {request:?}"
))),
}
@@ -571,13 +588,13 @@ impl Mode {
/// Did this debug session stop at least once?
pub(crate) fn has_ever_stopped(&self) -> bool {
match self {
Mode::Building => false,
Mode::Running(running_mode) => running_mode.has_ever_stopped,
SessionState::Building(_) => false,
SessionState::Running(running_mode) => running_mode.has_ever_stopped,
}
}
fn stopped(&mut self) {
if let Mode::Running(running) = self {
if let SessionState::Running(running) = self {
running.has_ever_stopped = true;
}
}
@@ -654,9 +671,9 @@ type IsEnabled = bool;
pub struct OutputToken(pub usize);
/// Represents a current state of a single debug adapter and provides ways to mutate it.
pub struct Session {
pub mode: Mode,
pub mode: SessionState,
id: SessionId,
label: SharedString,
label: Option<SharedString>,
adapter: DebugAdapterName,
pub(super) capabilities: Capabilities,
child_session_ids: HashSet<SessionId>,
@@ -676,9 +693,12 @@ pub struct Session {
pub(crate) breakpoint_store: Entity<BreakpointStore>,
ignore_breakpoints: bool,
exception_breakpoints: BTreeMap<String, (ExceptionBreakpointsFilter, IsEnabled)>,
data_breakpoints: BTreeMap<String, DataBreakpointState>,
background_tasks: Vec<Task<()>>,
restart_task: Option<Task<()>>,
task_context: TaskContext,
memory: memory::Memory,
quirks: SessionQuirks,
}
trait CacheableCommand: Any + Send + Sync {
@@ -768,6 +788,7 @@ pub enum SessionEvent {
request: RunInTerminalRequestArguments,
sender: mpsc::Sender<Result<u32>>,
},
DataBreakpointInfo,
ConsoleOutput,
}
@@ -792,9 +813,10 @@ impl Session {
breakpoint_store: Entity<BreakpointStore>,
session_id: SessionId,
parent_session: Option<Entity<Session>>,
label: SharedString,
label: Option<SharedString>,
adapter: DebugAdapterName,
task_context: TaskContext,
quirks: SessionQuirks,
cx: &mut App,
) -> Entity<Self> {
cx.new::<Self>(|cx| {
@@ -820,10 +842,9 @@ impl Session {
BreakpointStoreEvent::SetDebugLine | BreakpointStoreEvent::ClearDebugLines => {}
})
.detach();
// cx.on_app_quit(Self::on_app_quit).detach();
let this = Self {
mode: Mode::Building,
mode: SessionState::Building(None),
id: session_id,
child_session_ids: HashSet::default(),
parent_session,
@@ -844,10 +865,13 @@ impl Session {
is_session_terminated: false,
ignore_breakpoints: false,
breakpoint_store,
data_breakpoints: Default::default(),
exception_breakpoints: Default::default(),
label,
adapter,
task_context,
memory: memory::Memory::new(),
quirks,
};
this
@@ -860,8 +884,8 @@ impl Session {
pub fn worktree(&self) -> Option<Entity<Worktree>> {
match &self.mode {
Mode::Building => None,
Mode::Running(local_mode) => local_mode.worktree.upgrade(),
SessionState::Building(_) => None,
SessionState::Running(local_mode) => local_mode.worktree.upgrade(),
}
}
@@ -920,7 +944,18 @@ impl Session {
)
.await?;
this.update(cx, |this, cx| {
this.mode = Mode::Running(mode);
match &mut this.mode {
SessionState::Building(task) if task.is_some() => {
task.take().unwrap().detach_and_log_err(cx);
}
_ => {
debug_assert!(
this.parent_session.is_some(),
"Booting a root debug session without a boot task"
);
}
};
this.mode = SessionState::Running(mode);
cx.emit(SessionStateEvent::Running);
})?;
@@ -1013,8 +1048,8 @@ impl Session {
pub fn binary(&self) -> Option<&DebugAdapterBinary> {
match &self.mode {
Mode::Building => None,
Mode::Running(running_mode) => Some(&running_mode.binary),
SessionState::Building(_) => None,
SessionState::Running(running_mode) => Some(&running_mode.binary),
}
}
@@ -1022,7 +1057,7 @@ impl Session {
self.adapter.clone()
}
pub fn label(&self) -> SharedString {
pub fn label(&self) -> Option<SharedString> {
self.label.clone()
}
@@ -1059,26 +1094,26 @@ impl Session {
pub fn is_started(&self) -> bool {
match &self.mode {
Mode::Building => false,
Mode::Running(running) => running.is_started,
SessionState::Building(_) => false,
SessionState::Running(running) => running.is_started,
}
}
pub fn is_building(&self) -> bool {
matches!(self.mode, Mode::Building)
matches!(self.mode, SessionState::Building(_))
}
pub fn as_running_mut(&mut self) -> Option<&mut RunningMode> {
match &mut self.mode {
Mode::Running(local_mode) => Some(local_mode),
Mode::Building => None,
SessionState::Running(local_mode) => Some(local_mode),
SessionState::Building(_) => None,
}
}
pub fn as_running(&self) -> Option<&RunningMode> {
match &self.mode {
Mode::Running(local_mode) => Some(local_mode),
Mode::Building => None,
SessionState::Running(local_mode) => Some(local_mode),
SessionState::Building(_) => None,
}
}
@@ -1220,7 +1255,7 @@ impl Session {
let adapter_id = self.adapter().to_string();
let request = Initialize { adapter_id };
let Mode::Running(running) = &self.mode else {
let SessionState::Running(running) = &self.mode else {
return Task::ready(Err(anyhow!(
"Cannot send initialize request, task still building"
)));
@@ -1269,10 +1304,12 @@ impl Session {
cx: &mut Context<Self>,
) -> Task<Result<()>> {
match &self.mode {
Mode::Running(local_mode) => {
SessionState::Running(local_mode) => {
local_mode.initialize_sequence(&self.capabilities, initialize_rx, dap_store, cx)
}
Mode::Building => Task::ready(Err(anyhow!("cannot initialize, still building"))),
SessionState::Building(_) => {
Task::ready(Err(anyhow!("cannot initialize, still building")))
}
}
}
@@ -1283,7 +1320,7 @@ impl Session {
cx: &mut Context<Self>,
) {
match &mut self.mode {
Mode::Running(local_mode) => {
SessionState::Running(local_mode) => {
if !matches!(
self.thread_states.thread_state(active_thread_id),
Some(ThreadStatus::Stopped)
@@ -1307,7 +1344,7 @@ impl Session {
})
.detach();
}
Mode::Building => {}
SessionState::Building(_) => {}
}
}
@@ -1587,7 +1624,7 @@ impl Session {
fn request_inner<T: LocalDapCommand + PartialEq + Eq + Hash>(
capabilities: &Capabilities,
mode: &Mode,
mode: &SessionState,
request: T,
process_result: impl FnOnce(
&mut Self,
@@ -1643,6 +1680,12 @@ impl Session {
self.invalidate_command_type::<ModulesCommand>();
self.invalidate_command_type::<LoadedSourcesCommand>();
self.invalidate_command_type::<ThreadsCommand>();
self.invalidate_command_type::<DataBreakpointInfoCommand>();
self.invalidate_command_type::<ReadMemory>();
let executor = self.as_running().map(|running| running.executor.clone());
if let Some(executor) = executor {
self.memory.clear(&executor);
}
}
fn invalidate_state(&mut self, key: &RequestSlot) {
@@ -1715,6 +1758,135 @@ impl Session {
&self.modules
}
// CodeLLDB can return the size of pointed-to memory, which we can use to make the go-to-memory experience better.
pub fn data_access_size(
&mut self,
frame_id: Option<u64>,
evaluate_name: &str,
cx: &mut Context<Self>,
) -> Task<Option<u64>> {
let request = self.request(
EvaluateCommand {
expression: format!("?${{sizeof({evaluate_name})}}"),
frame_id,
context: Some(EvaluateArgumentsContext::Repl),
source: None,
},
|_, response, _| response.ok(),
cx,
);
cx.background_spawn(async move {
let result = request.await?;
result.result.parse().ok()
})
}
pub fn memory_reference_of_expr(
&mut self,
frame_id: Option<u64>,
expression: String,
cx: &mut Context<Self>,
) -> Task<Option<String>> {
let request = self.request(
EvaluateCommand {
expression,
frame_id,
context: Some(EvaluateArgumentsContext::Repl),
source: None,
},
|_, response, _| response.ok(),
cx,
);
cx.background_spawn(async move {
let result = request.await?;
result.memory_reference
})
}
pub fn write_memory(&mut self, address: u64, data: &[u8], cx: &mut Context<Self>) {
let data = base64::engine::general_purpose::STANDARD.encode(data);
self.request(
WriteMemoryArguments {
memory_reference: address.to_string(),
data,
allow_partial: None,
offset: None,
},
|this, response, cx| {
this.memory.clear(cx.background_executor());
this.invalidate_command_type::<ReadMemory>();
this.invalidate_command_type::<VariablesCommand>();
cx.emit(SessionEvent::Variables);
response.ok()
},
cx,
)
.detach();
}
pub fn read_memory(
&mut self,
range: RangeInclusive<u64>,
cx: &mut Context<Self>,
) -> MemoryIterator {
// This function is a bit more involved when it comes to fetching data.
// Since we attempt to read memory in pages, we need to account for some parts
// of memory being unreadable. Therefore, we start off by fetching a page per request.
// In case that fails, we try to re-fetch smaller regions until we have the full range.
let page_range = Memory::memory_range_to_page_range(range.clone());
for page_address in PageAddress::iter_range(page_range) {
self.read_single_page_memory(page_address, cx);
}
self.memory.memory_range(range)
}
fn read_single_page_memory(&mut self, page_start: PageAddress, cx: &mut Context<Self>) {
_ = maybe!({
let builder = self.memory.build_page(page_start)?;
self.memory_read_fetch_page_recursive(builder, cx);
Some(())
});
}
fn memory_read_fetch_page_recursive(
&mut self,
mut builder: MemoryPageBuilder,
cx: &mut Context<Self>,
) {
let Some(next_request) = builder.next_request() else {
// We're done fetching. Let's grab the page and insert it into our memory store.
let (address, contents) = builder.build();
self.memory.insert_page(address, contents);
return;
};
let size = next_request.size;
self.fetch(
ReadMemory {
memory_reference: format!("0x{:X}", next_request.address),
offset: Some(0),
count: next_request.size,
},
move |this, memory, cx| {
if let Ok(memory) = memory {
builder.known(memory.content);
if let Some(unknown) = memory.unreadable_bytes {
builder.unknown(unknown);
}
// This is the recursive bit: if we're not yet done with
// the whole page, we'll kick off a new request with smaller range.
// Note that this function is recursive only conceptually;
// since it kicks off a new request with callback, we don't need to worry about stack overflow.
this.memory_read_fetch_page_recursive(builder, cx);
} else {
builder.unknown(size);
}
},
cx,
);
}
pub fn ignore_breakpoints(&self) -> bool {
self.ignore_breakpoints
}
@@ -1745,6 +1917,10 @@ impl Session {
}
}
pub fn data_breakpoints(&self) -> impl Iterator<Item = &DataBreakpointState> {
self.data_breakpoints.values()
}
pub fn exception_breakpoints(
&self,
) -> impl Iterator<Item = &(ExceptionBreakpointsFilter, IsEnabled)> {
@@ -1778,6 +1954,45 @@ impl Session {
}
}
pub fn toggle_data_breakpoint(&mut self, id: &str, cx: &mut Context<'_, Session>) {
if let Some(state) = self.data_breakpoints.get_mut(id) {
state.is_enabled = !state.is_enabled;
self.send_data_breakpoints(cx);
}
}
fn send_data_breakpoints(&mut self, cx: &mut Context<Self>) {
if let Some(mode) = self.as_running() {
let breakpoints = self
.data_breakpoints
.values()
.filter_map(|state| state.is_enabled.then(|| state.dap.clone()))
.collect();
let command = SetDataBreakpointsCommand { breakpoints };
mode.request(command).detach_and_log_err(cx);
}
}
pub fn create_data_breakpoint(
&mut self,
context: Arc<DataBreakpointContext>,
data_id: String,
dap: dap::DataBreakpoint,
cx: &mut Context<Self>,
) {
if self.data_breakpoints.remove(&data_id).is_none() {
self.data_breakpoints.insert(
data_id,
DataBreakpointState {
dap,
is_enabled: true,
context,
},
);
}
self.send_data_breakpoints(cx);
}
pub fn breakpoints_enabled(&self) -> bool {
self.ignore_breakpoints
}
@@ -1907,28 +2122,36 @@ impl Session {
self.thread_states.exit_all_threads();
cx.notify();
let task = if self
.capabilities
.supports_terminate_request
.unwrap_or_default()
{
self.request(
TerminateCommand {
restart: Some(false),
},
Self::clear_active_debug_line_response,
cx,
)
} else {
self.request(
DisconnectCommand {
restart: Some(false),
terminate_debuggee: Some(true),
suspend_debuggee: Some(false),
},
Self::clear_active_debug_line_response,
cx,
)
let task = match &mut self.mode {
SessionState::Running(_) => {
if self
.capabilities
.supports_terminate_request
.unwrap_or_default()
{
self.request(
TerminateCommand {
restart: Some(false),
},
Self::clear_active_debug_line_response,
cx,
)
} else {
self.request(
DisconnectCommand {
restart: Some(false),
terminate_debuggee: Some(true),
suspend_debuggee: Some(false),
},
Self::clear_active_debug_line_response,
cx,
)
}
}
SessionState::Building(build_task) => {
build_task.take();
Task::ready(Some(()))
}
};
cx.emit(SessionStateEvent::Shutdown);
@@ -1978,8 +2201,8 @@ impl Session {
pub fn adapter_client(&self) -> Option<Arc<DebugAdapterClient>> {
match self.mode {
Mode::Running(ref local) => Some(local.client.clone()),
Mode::Building => None,
SessionState::Running(ref local) => Some(local.client.clone()),
SessionState::Building(_) => None,
}
}
@@ -2331,6 +2554,20 @@ impl Session {
.unwrap_or_default()
}
pub fn data_breakpoint_info(
&mut self,
context: Arc<DataBreakpointContext>,
mode: Option<String>,
cx: &mut Context<Self>,
) -> Task<Option<dap::DataBreakpointInfoResponse>> {
let command = DataBreakpointInfoCommand {
context: context.clone(),
mode,
};
self.request(command, |_, response, _| response.ok(), cx)
}
pub fn set_variable_value(
&mut self,
stack_frame_id: u64,
@@ -2349,6 +2586,8 @@ impl Session {
move |this, response, cx| {
let response = response.log_err()?;
this.invalidate_command_type::<VariablesCommand>();
this.invalidate_command_type::<ReadMemory>();
this.memory.clear(cx.background_executor());
this.refresh_watchers(stack_frame_id, cx);
cx.emit(SessionEvent::Variables);
Some(response)
@@ -2388,6 +2627,8 @@ impl Session {
cx.spawn(async move |this, cx| {
let response = request.await;
this.update(cx, |this, cx| {
this.memory.clear(cx.background_executor());
this.invalidate_command_type::<ReadMemory>();
match response {
Ok(response) => {
let event = dap::OutputEvent {
@@ -2443,7 +2684,7 @@ impl Session {
}
pub fn is_attached(&self) -> bool {
let Mode::Running(local_mode) = &self.mode else {
let SessionState::Running(local_mode) = &self.mode else {
return false;
};
local_mode.binary.request_args.request == StartDebuggingRequestArgumentsRequest::Attach
@@ -2481,4 +2722,8 @@ impl Session {
pub fn thread_state(&self, thread_id: ThreadId) -> Option<ThreadStatus> {
self.thread_states.thread_state(thread_id)
}
pub fn quirks(&self) -> SessionQuirks {
self.quirks
}
}

Some files were not shown because too many files have changed in this diff.