Compare commits

...

666 Commits

Author SHA1 Message Date
Anthony Eid
aff5050d4e Rename event handler to message handler in debug client func 2025-02-18 01:48:41 -05:00
Anthony Eid
a203b1cc36 Clean up handle initialized event return func 2025-02-18 01:47:44 -05:00
Anthony Eid
e1c7e1088b Handle initialize event 2025-02-18 01:45:09 -05:00
Anthony Eid
cbf9ef0851 Start work on handling dap message from session
Did I stash my changes and reapply them to the debugger branch? Maybe. Did I then proceed to
freak out for quite a bit and wonder why there were so many conflicts and my changes were messed up?
Perhaps. Did I realize my mistake and commit to the right branch? Hopefully
2025-02-18 00:57:06 -05:00
Piotr Osiewicz
16dba67b38 UI touchups 2025-02-17 20:12:47 +01:00
Piotr Osiewicz
45cf5b5ab5 Add basic tab content 2025-02-17 20:00:40 +01:00
Piotr Osiewicz
c07526f30b Fix focus on inert state 2025-02-17 19:55:18 +01:00
Piotr Osiewicz
24e816be38 WIP 2025-02-17 19:45:30 +01:00
Piotr Osiewicz
b67a382e57 WIP 2025-02-17 18:15:57 +01:00
Piotr Osiewicz
06338963f3 WIP 2025-02-17 17:24:50 +01:00
Piotr Osiewicz
95590967eb WIP 2025-02-17 16:51:39 +01:00
Piotr Osiewicz
9786da2fe5 WIP 2025-02-17 15:41:46 +01:00
Piotr Osiewicz
0dc90fd35e UI shred 2025-02-17 14:51:46 +01:00
Piotr Osiewicz
1e0a0fa9f9 Renames 2025-02-17 14:07:05 +01:00
Piotr Osiewicz
f81463cc93 WIP 2025-02-17 14:06:02 +01:00
Piotr Osiewicz
c9a9ab22f0 And the rename is complete 2025-02-17 13:46:17 +01:00
Piotr Osiewicz
cf13a62bb5 Rename client module to session 2025-02-17 13:45:16 +01:00
Piotr Osiewicz
fbf4eb9472 fixup! Building! And crashing 2025-02-17 13:40:42 +01:00
Piotr Osiewicz
ad356d9c8e Building! And crashing 2025-02-17 13:39:45 +01:00
Piotr Osiewicz
45db28a5ce Project compiles, yay 2025-02-17 12:50:55 +01:00
Piotr Osiewicz
a0c91d620a WIP 2025-02-17 11:28:22 +01:00
Piotr Osiewicz
b1ca8c3e45 Merge branch 'debugger' into remove-dap-session 2025-02-17 11:11:45 +01:00
Piotr Osiewicz
f8e248286e Merge branch 'main' into debugger 2025-02-17 11:04:44 +01:00
Piotr Osiewicz
0233152bd1 Get rid of supports_attach (we'll configure it differently later on) 2025-02-17 11:02:34 +01:00
Piotr Osiewicz
294ce96b24 WIP 2025-02-17 00:50:05 +01:00
Anthony Eid
f7886adf9a Remove display_row from line number layout struct 2025-02-16 18:21:44 -05:00
Anthony Eid
bc5904e485 Fix breakpoint line numbers not being colored 2025-02-16 18:18:15 -05:00
Piotr Osiewicz
12c02a12ca WIP 2025-02-15 01:51:57 +01:00
Piotr Osiewicz
36b430a1ae Add a doc comment for the debugger module 2025-02-15 01:41:24 +01:00
Piotr Osiewicz
7fe8c626d3 Fix up one call site 2025-02-15 01:26:57 +01:00
Piotr Osiewicz
43e2c4de06 Keep going 2025-02-15 00:41:48 +01:00
Piotr Osiewicz
6648d6299f Use dropdown menu instead 2025-02-15 00:09:08 +01:00
Piotr Osiewicz
d0f64d475d Add placeholder for thread dropdown 2025-02-15 00:07:29 +01:00
Piotr Osiewicz
e285ae60c3 WIP 2025-02-14 23:27:13 +01:00
Anthony Eid
53336eedb7 Move breakpoint serialization/deserialization to breakpoint store from project 2025-02-14 16:56:34 -05:00
Anthony Eid
3aab70c7bb Fix some warnings 2025-02-14 15:51:29 -05:00
Anthony Eid
cdb5af2906 Remove project from toggle breakpoint dependencies
Project is no longer responsible for toggling breakpoints or telling breakpoint store to toggle breakpoints.
Now editor toggles breakpoints by telling breakpoint store directly. In the future I plan on moving
all breakpoint-related handling from project to breakpoint store.

I also fixed some debugger-related test compile errors. Plenty of them won't pass because we're still mid-refactor,
but they build now
2025-02-14 15:38:32 -05:00
Piotr Osiewicz
936ac212e7 Another WIP 2025-02-14 21:18:37 +01:00
Piotr Osiewicz
debcb1f26f WIP
Co-Authored-By: Anthony Eid <hello@anthonyeid.me>
2025-02-14 19:30:14 +01:00
Piotr Osiewicz
90440d9652 fixup! Check capabilities outside of dap_session 2025-02-14 16:44:46 +01:00
Piotr Osiewicz
be80dedd8c Merge branch 'main' into debugger 2025-02-14 16:43:04 +01:00
Piotr Osiewicz
00c24ce289 Simplify DapCommand::is_supported 2025-02-14 16:27:44 +01:00
Piotr Osiewicz
c820f12e9c Check capabilities outside of dap_session
That way we don't need to roundtrip for RPC peers. Plus the logs for unsupported features will be less noisy
2025-02-14 16:24:54 +01:00
Remco Smits
3833788cbc Add missing init for breakpointstore 2025-02-14 14:23:27 +01:00
Anthony Eid
91e60d79bb Add BreakpointStore to debugger crate (#114)
* Initial setup for breakpoint store

* WIP Move more methods to breakpoint store

* Move event handler to breakpoint store

* Fix compiler errors

* Fix more compiler errors

* Get Zed to compile

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-02-14 13:55:03 +01:00
Remco Smits
ecfc0ef12d Move requests into client (#112)
* Move most of the requests into the client state itself.

Co-authored-by: Anthony Eid <hello@anthonyeid.me>

* WIP

* Fix some errors and create new errors

* More teardown

* Fix detach requests

* Move set variable value to dap command

* Fix dap command error and add evaluate command

* Fix more compiler errors

* Fix more compiler errors

* Clipppyyyy

* Fix more

* One more

* Fix one more

* Use threadId from project instead of u64

* Mostly fix stack frame list

* More compile errors

* More

* WIP transfer completions to dap command

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
Co-Authored-By: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>

* Finish console completions DapCommand impl

* Get Zed to build !!

* Fix test compile errors: The debugger tests will still fail

* Add threads request to debug session

Co-authored-by: Piotr Osiewicz <piotr@zed.dev>

---------

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
Co-authored-by: Anthony Eid <hello@anthonyeid.me>
Co-authored-by: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
Co-authored-by: Piotr Osiewicz <piotr@zed.dev>
2025-02-13 16:25:32 -05:00
Piotr Osiewicz
661e5b0335 Merge branch 'main' into debugger 2025-02-13 11:28:20 +01:00
Piotr Osiewicz
041e1eef38 Move sessions mapping to DapStore itself
Co-authored-by: Anthony Eid <hello@anthonyeid.me>
2025-02-11 14:44:44 +01:00
Piotr Osiewicz
c8ba6d7c57 Move caps onto the debug client state 2025-02-11 02:12:13 +01:00
Piotr Osiewicz
9d5525e0fb Move DAP modules in project into debugger submodule 2025-02-11 00:35:25 +01:00
Piotr Osiewicz
9ac18e9404 Add assert for unexpected debug command 2025-02-11 00:22:39 +01:00
Piotr Osiewicz
90deb4a82e Merge branch 'main' into debugger 2025-02-11 00:18:52 +01:00
Remco Smits
8b45634d39 Invalidate dap session information on stopped event 2025-02-10 11:51:38 +01:00
Remco Smits
e216805e66 Add check if request is supported inside dap command
This also fixes an issue that Piotr was having with step debugging Python, since we previously always sent the request even though the adapter didn't support it. This is only for the new request structure
2025-02-10 10:45:46 +01:00
Piotr Osiewicz
062e64d227 cargo fmt 2025-02-10 02:15:29 +01:00
Piotr Osiewicz
39d4d624fb Merge branch 'main' into debugger 2025-02-10 02:10:39 +01:00
Remco Smits
dad39abeeb Move module list to new structure 2025-02-08 14:25:12 +01:00
Remco Smits
05ca096a5b Use observe for module list instead of clearing only when the length is different
This should fix issues where we update a module's content but the change wouldn't show up. By always resetting the list we should get the most up-to-date value.

NOTE: We probably want to reduce the amount of resets and notifies that happen every time we update the client state, even when we didn't update the modules.

We could send a specific event from the client state each time we update a module. So instead of using an observer we could use a normal event subscriber and only update the list based on that event
2025-02-08 13:44:45 +01:00
Anthony Eid
2fc7c17497 Remove debugger collab db tables (#111)
* WIP setup active debug sessions request

* Set up active debug sessions request that's sent when joining a project

* Fix test debug panel console

* Remove debugger tables from collab db

* Wip request active debug sessions
2025-02-06 22:29:32 -05:00
Remco Smits
af77b2f99f WIP refactor (#110)
* WIP

Co-Authored-By: Piotr Osiewicz <piotr@zed.dev>
Co-Authored-By: Anthony Eid <hello@anthonyeid.me>

* Tear stuff out and make the world burn

Co-authored-by: Remco Smits <djsmits12@gmail.com>
Co-authored-by: Piotr <piotr@zed.dev>

* Fix compile errors

* Remove dap_store from module list and fix some warnings

Module list now uses Entity<DebugSession> to get active modules and handle remote/local state
so dap_store is no longer needed

* Add Cacheable Command trait

This gets rid of ClientRequest or whatever the name was; we don't need to enumerate every possible request and repeat ourselves, instead letting you mark any request as cached with no extra boilerplate.

* Add Eq requirement for RequestSlot

* Implement DapCommand for Arc<DapCommand>

That way we can use a single allocated block for each dap command to store it as both the cache key and the command itself.

* Clone Arc on demand

* Add request helper

* Start work on setting up a new dap_command_handler

* Make clippy pass

* Add loaded sources dap command

* Set up module list test to see if Modules request is called

* Fix compile warnings

* Add basic local module_list test

* Add module list event testing to test_module_list

* Bring back as_any_arc

* Only reset module list's list state when modules_len changes

---------

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
Co-authored-by: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
Co-authored-by: Anthony Eid <hello@anthonyeid.me>
Co-authored-by: Piotr <piotr@zed.dev>
2025-02-06 21:12:53 -05:00
Piotr Osiewicz
16eff80a7b Merge branch 'main' into debugger 2025-02-05 20:22:19 +01:00
Kirill Bulatov
64bc112fe5 Revert recent anti-aliasing improvements (#24289)
This reverts commit 31fa414422.
This reverts commit b9e0aae49f.

`lyon` commit revert:

![image](https://github.com/user-attachments/assets/0243f61c-0713-416d-b8db-47372e04abaa)

`MSAA` commit revert:

![image](https://github.com/user-attachments/assets/b1a4a9fe-0192-47ef-be6f-52e03c025724)

cc @huacnlee, @as-cii had decided to revert this PR due to a selection
right corner rendering bug.
Not sure what to propose for a fix from my side

Release Notes:

- N/A
2025-02-05 19:24:40 +01:00
Bennet Bo Fenner
9ef8aceb81 edit prediction: Improve UX around disabled_globs and show_inline_completions (#24207)
Release Notes:

- N/A

---------

Co-authored-by: Danilo <danilo@zed.dev>
Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-02-05 19:24:40 +01:00
Danilo Leal
2973b46c78 Revise the MessageNotification component (#24287)
This PR makes adding icons to the primary and secondary actions, in the
`MessageNotification` component, optional. Also took the opportunity to
remove a probably unnecessary "third action" from it; streamlining the
component API (we had added that for a design that we're not using
anymore). I did keep the "more info" possibility, which may be useful in
the future, though.

Release Notes:

- N/A
2025-02-05 19:24:40 +01:00
Danilo Leal
b6182d0781 edit prediction: Fix license detection error logging + check for different spellings (#24281)
Follow-up to https://github.com/zed-industries/zed/pull/24278

This PR ensures we're checking if there's a license-type file in both US
& UK English spelling, and fixes the error logging again, handling both when the worktree
contains just a single file and when it contains multiple.

Release Notes:

- N/A

Co-authored-by: Bennet Bo Fenner <53836821+bennetbo@users.noreply.github.com>
2025-02-05 19:24:40 +01:00
Cole Miller
dac28730e5 Fix the worktree's repository_for_path (#24279)
Go back to a less optimized implementation for now since the custom
cursor target seems to have some bugs.

Release Notes:

- Fixed missing git blame and status output in some projects with
multiple git repositories
2025-02-05 19:24:40 +01:00
Kirill Bulatov
9c3e85a6ff Rework shared commit editors (#24274)
Rework of https://github.com/zed-industries/zed/pull/24130
Uses
1033c0b57e
`COMMIT_EDITMSG` language-related definitions (thanks @d1y )

Instead of using real `.git/COMMIT_EDITMSG` file, create a buffer
without FS representation, stored in the `Repository` and shared the
regular way via the `BufferStore`.
Adds a knowledge of what `Git Commit` language is, and uses it in the
buffers which are rendered in the git panel.

Release Notes:

- N/A

---------

Co-authored-by: Conrad Irwin <conrad@zed.dev>
Co-authored-by: d1y <chenhonzhou@gmail.com>
Co-authored-by: Smit <smit@zed.dev>
2025-02-05 19:24:39 +01:00
Danilo Leal
e21ea19fc6 edit prediction: Don't log an error if license file isn't found (#24278)
Logging an error in this case isn't super necessary.

Release Notes:

- N/A

Co-authored-by: Bennet Bo Fenner <53836821+bennetbo@users.noreply.github.com>
2025-02-05 19:24:09 +01:00
Agus Zubiaga
cee4e25d51 edit predictions: Onboarding funnel telemetry (#24237)
Release Notes:

- N/A
2025-02-05 19:24:09 +01:00
Marshall Bowers
33e39281a6 languages: Sort dependencies in Cargo.toml (#24277)
This PR sorts the dependency lists in the `Cargo.toml` for the
`languages` crate.

Release Notes:

- N/A
2025-02-05 19:24:09 +01:00
Peter Tripp
da66f75410 Revert "copilot: Correct o3-mini context length" (#24275)
Reverts zed-industries/zed#24152
See comment: https://github.com/zed-industries/zed/pull/24152#issuecomment-2636808170
Manually confirmed >20k generates an error.
2025-02-05 19:24:09 +01:00
张小白
b7f0a1f5c8 windows: Fix tests on Windows (#22616)
Release Notes:

- N/A

---------

Co-authored-by: Mikayla <mikayla.c.maki@gmail.com>
2025-02-05 19:24:09 +01:00
Agus Zubiaga
6ec4adffe7 Accept edit predictions with alt-tab in addition to tab (#24272)
When you have an edit prediction available, you can now also accept it
with `alt-tab` (or `alt-enter` on Linux) even if you don't have an LSP
completions menu open. This is meant to lower the mental load when going
from one mode to another.

Release Notes:

- N/A
2025-02-05 19:24:09 +01:00
Agus Zubiaga
180ce5e9ba edit prediction: Allow enabling OSS data collection with no project open (#24265)
This was a leftover from when we were persisting a per-project setting.

Release Notes:

- N/A
2025-02-05 19:24:09 +01:00
Piotr Osiewicz
4fcb10dd26 lsp: Add support for default rename behavior in prepareRename request (#24246)
Fixes #24184

Release Notes:

- Fixed renaming not working with some language servers (e.g. hls)
2025-02-05 19:24:08 +01:00
Remco Smits
1be2836f60 Refactor session id to use entity instead (#109)
* Move common session fields to struct

* Use session entity instead of session id inside UI

This is to prepare to move the state to the
2025-02-05 19:19:23 +01:00
Remco Smits
c76144e918 Re-Apply Lazy load stack frame information (#108)
* Reapply "Lazy load stack frame information (scopes & variables) (#106)"

This reverts commit 27b60436b8.

* Reapply "Remove futures from debugger ui crate"

This reverts commit 3cf96588e2.

* Don't fetch initial variables twice

This was introduced by my original PR: I added the fetch on stack frame select, but when the stack frames were updated we would already fetch the initial variables. And when the selectedStackFrameUpdated event was received we would refetch them because the first fetch was not done yet.

* Remove duplicated method just for testing

* Make keep open entries work again

The issue was that we used the scope_id, which changes after each debug step. So using the name of the scope instead should be more reliable.

* Correctly invalidate variable list information

* Comment out collab variable list for now

Also commenting out an event that triggers the infinite loop.

We want to revisited the collab stuff anyway so we decided to do it this way.
2025-02-05 17:45:03 +01:00
Remco Smits
aa268d1379 Merge branch 'main' into debugger 2025-02-05 11:24:31 +01:00
Anthony Eid
3cf96588e2 Revert "Remove futures from debugger ui crate"
Futures are once again needed by variable list due to the lazy stack
frame fetching PR revert

This reverts commit 26f14fd036.
2025-02-04 07:50:43 -05:00
Anthony Eid
27b60436b8 Revert "Lazy load stack frame information (scopes & variables) (#106)"
I'm reverting this because it introduced a regression with collab's test
variable list & didn't support fetching stack frames initiated by a
remote user.

This reverts commit 945e3226d8.
2025-02-04 07:47:42 -05:00
Piotr Osiewicz
28874b60cf Fix up symlinks for license files 2025-02-04 02:34:55 +01:00
Piotr Osiewicz
dcde289f94 Merge branch 'main' into debugger 2025-02-04 02:21:09 +01:00
Piotr Osiewicz
52350245e6 cargo fmt 2025-02-04 00:22:37 +01:00
Piotr Osiewicz
a56e3eafdf Merge branch 'main' into debugger 2025-02-04 00:19:15 +01:00
Piotr Osiewicz
ad98bf444d Trigger CLA check 2025-02-04 00:10:19 +01:00
Piotr Osiewicz
080d0c46ad Remove sqlez dependency on project 2025-02-04 00:03:36 +01:00
Piotr Osiewicz
dfeddaae8a Remove unused dep from activity indicator 2025-02-03 23:28:27 +01:00
Anthony Eid
5cd93ca158 Fix test_extension_store_with_test_extension 2025-02-03 16:24:23 -05:00
Anthony Eid
d60089888f Show console output on remote debug clients (#107)
This PR shows console output & removes the query bar for remote clients.

We don't show the query bar because it's currently impossible for remote clients to send evaluation requests from their console. We didn't implement that feature yet because of security concerns and will do so in the future with permissions.

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Add basic collab debug console test

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Show output events on remote client debug consoles

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Don't show debug console query bar on remote clients

Co-authored-by: Remco Smits <djsmits12@gmail.com>

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-02-03 11:31:06 -05:00
Piotr Osiewicz
4c930652af Use workspace values for edition and publish in new crates 2025-02-03 16:53:00 +01:00
Remco Smits
44e9444c8c Fix choose debugger action didn't work anymore 2025-02-03 16:14:39 +01:00
Piotr Osiewicz
69548f5f34 Clean up naming around activity indicator and binary statuses
Co-authored-by: Anthony <hello@anthonyeid.me>
2025-02-03 15:25:16 +01:00
Piotr Osiewicz
c45b6e9271 Clean up naming in activity_indicator.rs
Co-authored-by: Anthony <hello@anthonyeid.me>
2025-02-03 15:25:16 +01:00
Remco Smits
26f14fd036 Remove futures from debugger ui crate 2025-02-03 15:09:02 +01:00
Remco Smits
945e3226d8 Lazy load stack frame information (scopes & variables) (#106)
* Add tests for incremental fetching scopes & variables for stack frames

* Fetch scopes and variables when you select a stack frame

* Send proto update message when you select a stack frame
2025-02-03 15:05:17 +01:00
Piotr Osiewicz
3bf5833135 Make Pane take non-optional double click action again
Co-authored-by: Anthony <hello@anthonyeid.me>
2025-02-03 14:48:44 +01:00
Anthony Eid
47b3f55a17 Minor cleanup of breakpoint context menu code 2025-02-03 08:35:51 -05:00
Anthony Eid
c994075327 Clean up Debugger collab tests (#104)
The debugger collab tests are difficult to understand and read due to the sheer size of each test and the setup involved to start a collab debugger session. This PR aims to mitigate these problems by creating a struct called ZedInstance that manages setting up tests and joining/rejoining collab debug sessions.

* Create util functions to set up debugger collab tests

* WIP converting debugger collab tests to use ZedInstance

* Clean up collab test utility functions to work with 3 member calls

* Update item set on join project test to use new api

* Update test update breakpoints send to dap

* Update test_ignore_breakpoints

* Update last collab tests

* Don't setup file tree for tests that don't need it
2025-02-03 08:22:06 -05:00
Remco Smits
b6b7ad38b5 Remove commented code in lldb attach code 2025-02-03 10:11:35 +01:00
Remco Smits
e81a7e1e06 Merge branch 'main' into debugger 2025-02-03 10:11:12 +01:00
Anthony Eid
889949ca76 Add lldb attach support 2025-02-02 13:39:12 -05:00
Remco Smits
ef3a6deb05 Improve the visibility of process entries (#105)
Before this change it was impossible to see the command arguments, as the executable was too long.

This changes that so we use the process name as the executable name, as VSCode seems to do. That way you can still see the arguments of the program.

I also added a tooltip to see the correct executable + arguments.
2025-02-01 16:52:48 +01:00
Anthony Eid
197166a8c1 Fix clippy errors 2025-01-31 04:33:35 -05:00
Remco Smits
dfe978b06a Add toolchain support for python debug adapter (#90)
* WIP add toolchain for python

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>

* Integrate toolchain store with dap_store & use it for debugpy

* Wip require worktree to start debug session

* Move start debug session to project for determining the worktree

* Make all tests pass again

* Make collab tests pass

* Use notify instead of manual notification

* Use reference instead of clone

* Revert "Use reference instead of clone"

This reverts commit 61469bb1679bc35d5d3bf0b93e5b7cfc94357c80.

* Revert "Use notify instead of manual notification"

This reverts commit a0b9bf52a1d948dfb244c4b7040576a34ec6f465.

* Revert debugger branch merge

* Revert "Revert debugger branch merge"

This reverts commit 56c883d4dba4877826ea2185a8177fddefa0d054.

* Clean up

* Make node runtime required

* Pass worktree id into get_environment

* Fix use the resolved debug adapter config

* Add fallback if toolchain could not be found to common binary names

---------

Co-authored-by: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
Co-authored-by: Anthony Eid <hello@anthonyeid.me>
2025-01-29 20:55:57 +01:00
Anthony Eid
a2e0b6778b Merge branch 'main' into debugger 2025-01-29 04:13:19 -05:00
Anthony Eid
25db814d0c Merge branch 'main' into debugger 2025-01-29 04:04:55 -05:00
Anthony Eid
78865820d8 Transition to gpui3 (#103)
Update debugger's branch gpui version to 3

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-01-29 03:46:37 -05:00
Anthony Eid
1fb0c5b84e Fix bug where lldb user installed path wasn't used in some cases
We checked if the user passed in an lldb-dap path only after we searched for lldb-dap in PATH.
I switched the order around so that if a user_installed_path is passed through, it takes
priority
2025-01-24 03:21:53 -05:00
Remco Smits
ec547b8ca3 Add debug configuration documentation (#96)
* Start adding docs for configurations and adapters

* Update debugger setting files with settings, attach config, & themes

I also ran prettier and typos on it too.

---------

Co-authored-by: Anthony Eid <hello@anthonyeid.me>
2025-01-21 17:43:11 -05:00
Anthony Eid
56abc60a34 Collab - Toggle Ignore Breakpoints (#93)
* Create collab ignore breakpoint integration test

* Add collab ignore breakpoints message handlers

Still need to give remote dap_stores the ability to store/manage ignore breakpoint state

* Refactor session to have remote and local modes

This was done to allow remote clients access to some session details they need such as ignore breakpoints.

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* DapStore so both local & remote modes have access to DebugSessions

* Add remote sessions when creating new debug panel items

* Finish implementing collab breakpoints ignore

* Clippy & clean up

* Clean up session information when sessions end on remote clients

* Rename proto message

* Add ignore breakpoints state to collab db

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-01-21 16:53:33 -05:00
Remco Smits
a89e29554f Rename server to adapter inside the dap log 2025-01-21 17:48:53 +01:00
Remco Smits
8ceb115f3c Fix typo in docs 2025-01-21 16:50:41 +01:00
Remco Smits
953843acdd Add stack frame source origin to tooltip 2025-01-21 12:38:36 +01:00
Remco Smits
5a52c7e21f Add test for shutdown on launch/attach failure
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2025-01-20 21:32:07 +01:00
Remco Smits
9bca023b1a Shutdown debug session when launch/attach fails
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2025-01-20 21:31:41 +01:00
Remco Smits
647f411b10 Add test for sending breakpoints for unopened buffers
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2025-01-20 21:17:30 +01:00
Remco Smits
e1392c9717 Allow sending breakpoint request for unopened buffers
So before this change, if you had the save breakpoints feature enabled, you had to have the buffer open for the breakpoint to be sent after you launched Zed again, because we needed a buffer to determine the right position based on an anchor.

This changes that so we fall back to the cached (last known) position that we stored in the DB. It could be that the file has changed on disk without Zed knowing, but that's better than not sending the breakpoints at all

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2025-01-20 21:17:06 +01:00
Remco Smits
7b6d20b9ff Stack frame presentation hint (#94)
* Update dap type to include new presentation hint for stack frames

* Implement grouped/collapsed stack frames based on presentation hint

* Add test

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>

---------

Co-authored-by: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2025-01-20 20:34:28 +01:00
Remco Smits
675fb2f54b Fix only show restart frame icon for frames that allow it 2025-01-20 17:38:16 +01:00
Anthony Eid
1a5feff9ed Implement Variable List Toggling Function in Collab (#88)
* Impl VariablesCommand for dap requests

* Have remote variable list generate entries & other fields locally

* Finish first version of collab variable list test (It fails)

* Get variable list collab test to pass

* Update variable test again (it failing)

* Improve tests more WIP

* Remove a test to make merge easier

* Finalize collab test

* Only send variables one time to collab server instead of each rebuild

* Finish variable list fetch!
2025-01-19 18:58:46 -05:00
Remco Smits
4de9921fa8 Add support for adapter specific clipboard value formatting 2025-01-19 16:07:18 +01:00
Remco Smits
5ecfef0aa0 Fix failing test because of race condition
By sorting the breakpoints based on the line number, the order does not matter.
2025-01-19 14:36:02 +01:00
Remco Smits
868f55c074 Fix failing test 2025-01-19 14:33:41 +01:00
Remco Smits
272721ae30 Merge branch 'main' into debugger 2025-01-19 13:59:07 +01:00
Remco Smits
f34fc7fbe8 Use arc for dap command request instead of cloning the command itself 2025-01-19 13:08:41 +01:00
Remco Smits
a193efd65a Re-send breakpoints when source file has been changed (#92)
* Resend breakpoints when the source/edit has been changed/saved

* Check for dapstore first before checking project path

* Add test to validate we re-send variables when editor is saved

* Add new line after edit

* Also send breakpoints changed when source on disk changed

* Fix the test

* Don't send breakpoints changed for saved event

We were sending the breakpoints twice, because we already sent them when the FileHandleChanged event was received, which arrives after the Saved event itself. And the Saved event does not guarantee that the source has already changed on disk, which it must be before we can send the new breakpoints.

* Assert in more places that the source modified is false
2025-01-19 12:58:45 +01:00
Anthony Eid
568f127f43 Fix name error in module list func 2025-01-16 23:40:27 -05:00
Anthony Eid
bc49c18481 Fix Collab Module List Bug (#91)
This PR fixes a module list bug where remote clients wouldn't have any modules in their module list when they hit their first breakpoint.

We now send module list changes whenever there is a module list update and don't send update messages in the SetDebuggerPanelItem proto message. This is because the module list usually doesn't change each time a user steps over, so sending the module list was wasting bandwidth.


* Add collab module list test

* Get module list to send during all changes & stop redundant update

* Update module list test for remote clients joining mid session
2025-01-16 23:37:12 -05:00
Remco Smits
d03e4149ea Fix CustomEvent type field are not public 2025-01-16 21:32:23 +01:00
Anthony Eid
86946506bf Send synced breakpoints to active DAP servers in collab (#89)
* Send changed breakpoints to DAP servers on sync breakpoints handle

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Add more sync breakpoints test

Co-authored-by: Remco Smits <djsmits12@gmail.com>

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-01-16 14:40:22 -05:00
Remco Smits
41b702807d Fix only allow autocompletion for variables that are from the current (first) stack frame
So this changes the behavior of providing variables for autocompletion inside the debug console when the adapter does not support autocompletion.

Before this change you would get variables based on the selected stack frame. But that is not correct, as you cannot use variables that are no longer in scope. By only providing variables for the current (first) stack frame, we provide variables that can always be used for autocompletion and for expressions.
2025-01-15 14:13:01 +01:00
Remco Smits
a9d7858f30 Add restart stack frame (#85)
* Add restart stack frame

* Add collab support

* Add restart frame to proto capabilities

* Add collab test
2025-01-14 22:30:04 +01:00
Remco Smits
06c11f97bb Move send request to dap client to background thread 2025-01-14 20:04:37 +01:00
Remco Smits
8ecd5479ac Debug output grouping (#86)
* Remove output editor

* Implement output grouping

* Remove OutputGroup when we found the end position

* Fix make gutter smaller

* Render placeholder

* Show group end on the same level as group start

* Add tests

* Add support for collapsed grouped output

* Fix crease placeholder is not showing up

* Don't trim output multiple times

* Update tests

* Fix clippy
2025-01-14 19:38:17 +01:00
Anthony Eid
a7e26bbfb5 Proxy dap requests to upstream clients (#77)
* WIP Start work to send all dap client requests with request_dap

* Continue work on converting dap client requests

* WIP setup dap command for proto dap requests

* WIP dap command

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* revert "WIP dap command"

This reverts commit fd2a6832b667aa23caf588c3ab55243319bc1654.

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* More WIP with Dap Command trait
Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Get step over command to work with remote dap clients

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Fix thread status not being set to stop on remote debug panel items

* Create an inner wrapper type to use for dap remote step requests

* Implement step in,back,out for remote debugger sessions

* Add Continue Command

* Add more dap command impls

TerminateThreads, Pause, and Disconnect, as well as a shutdown session request that downstream clients can send to the host

* Add Disconnect & Terminate dap command impls

* Add basic dap proxy test over collab

* Fix clippy error

* Start work on syncing breakpoint thread status

Co-authored-by: Remco Smits <djsmits12@gmail.com>
Co-authored-by: Carter Canedy <cartercanedy42@gmail.com>

* WIP Fix thread status not syncing

* Add thread state model's to remote debug panels when setting panel items

* Sync thread state on step out command

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
Co-authored-by: Carter Canedy <cartercanedy42@gmail.com>
2025-01-14 01:08:31 -05:00
Anthony Eid
f9f28107f5 Add launch delay for gdb debugger 2025-01-13 22:59:28 -05:00
Anthony Eid
2736b2f477 Update rust-embed version to fix cicd build error (#87) 2025-01-12 23:42:51 -05:00
Anthony Eid
5aa816e85b Fix telemetry spelling error 2025-01-12 15:26:25 -05:00
Remco Smits
4baa7f7742 Don't show Telemetry data in output console 2025-01-12 15:47:43 +01:00
Remco Smits
4e13a00c44 Add missing action handler for OpenDebugTasks 2025-01-11 19:02:32 +01:00
Remco Smits
943609f1ac Fix failing editor test 2025-01-11 18:46:00 +01:00
Anthony Eid
887e2a65e1 Use Toast Notification for Debug Session Warning (#83)
* Switch debug session exited without hitting breakpoint to toast notification

* Move notify below the thread event, so we register that the thread was added

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2025-01-11 17:26:49 +01:00
Remco Smits
c91f51dd68 Log errors in stepping tasks, and update tests
I updated the tests so we don't reuse the debug panel item for each operation; instead we request a new instance each time so we can ensure the status actually changed.
2025-01-11 17:20:07 +01:00
Remco Smits
797f5bd967 Merge branch 'main' into debugger 2025-01-11 15:48:36 +01:00
Remco Smits
14a2f7526b Variable list keyboard navigation (#84)
* WIP

* Add assert helper for variable list visual entries

* Wip rework toggle entry (scope/variable) code

* Remove commented code

* Move colors to struct

* Add entry to selection if you click on them

* Add selected option to visual entries assert

* Use pretty assertions for visual entries assert helper

* Use focus handle method to give focus handle

* Register select first and last actions

* Correctly format selected entry

* Add tests helper to get active debug panel item

* Add tests for keyboard navigation

* Remove not needed comment

* Move other tests to test helper

This also removes a test that is duplicated with the keyboard navigation tests; it covers the same behavior

* Update console test to use new test helper

* Fix failing test

I forgot to update the test, because we now always send a body back in an error case.

* Fix clippyyyy
2025-01-11 15:47:41 +01:00
Anthony Eid
918869fcfe Merge branch 'main' into debugger 2025-01-10 20:19:11 -05:00
Anthony Eid
46b72f6e3b Reset thread status to stop after failed step{over,in,out,back} & continue requests
When the thread status is not Stopped, users are unable to click buttons. So the thread status
needs to be reset if any of the above requests fail, or else a user loses the ability to
click any of the debug buttons related to those requests
2025-01-10 17:53:17 -05:00
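A minimal sketch of that reset behavior, assuming illustrative names (`ThreadStatus` and `step_with_reset` are not Zed's actual API):

```rust
// Hedged sketch: illustrative types and names, not Zed's real debugger code.
#[derive(Clone, Copy, Debug, PartialEq)]
pub enum ThreadStatus {
    Running,
    Stopped,
}

/// Runs a stepping request; on failure the thread status is reset to
/// `Stopped`, since the debug buttons are only clickable in that state.
pub fn step_with_reset(
    status: &mut ThreadStatus,
    request: impl FnOnce() -> Result<(), String>,
) -> Result<(), String> {
    *status = ThreadStatus::Running;
    let result = request();
    if result.is_err() {
        // Without this reset, a failed step{over,in,out,back}/continue
        // would leave the thread "Running" forever, disabling the buttons.
        *status = ThreadStatus::Stopped;
    }
    result
}
```

On a failed request the status lands back at `Stopped`; on success the adapter's next stop event is what eventually flips it back.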
Anthony Eid
73627a3843 Get downstream collab clients to set their own active debug line
Before this commit downstream clients in active debug sessions relied
on the host to send them the active debug line. This had three main
limitations (Which are solved by this commit)

1. Downstream clients didn't have the ability to click on their own stack
frame list and go to that frame's location

2. Downstream clients would always follow the host when checking out stack
frames even from a different debug adapter or thread

3. If a user joins an active debug session they wouldn't have an active
debug line until the host's debug adapter sent another stop event
2025-01-10 04:06:08 -05:00
Anthony Eid
fa17737332 Correctly set thread state in SetDebuggerPanelItem handler 2025-01-09 16:49:41 -05:00
Anthony Eid
01648b9370 Integrate Debugger within Collab DB (#81)
This PR integrates Zed's Debugger with the Collab database, enabling Zed to guarantee that clients joining a project with active debug sessions in progress will receive and set up those sessions.

* Add test for setting active debug panel items on project join

* Add DebuggerSession proto message

* Modify debugger session

* Get collab server to build

* Get collab test to compile

* Add proto messages

* Set up message handler for get debugger sessions

* Send set debug panel requests when handling get debugger sessions msg

* Get request to send and set debug panel

* Setup ground work for debug sessions collab db table

* Set up debug_client table for collab db

* Remove GetDebuggerSession proto request code

* Get proto::DebuggerSession during join_project_internal

* Remove debug_sessions table from collab db

* Add migration for debug_client and fix some bugs

* Create dap store event queue for remote daps

When creating a project in from_join_project_response(...) the debug panel hasn't been initialized yet, so
it can't handle dap_store events. The solution is an event queue that the debug panel takes
from the dap store if it's remote, and then it handles all the queued events

* Fix debug panel join project during session test

* Add debug_panel_items table to collab db

* Integrate debug_panel_item table into collab msg handlers

* Finalize debug_panel_item table refactor for collab db

* Integrate UpdateDebugAdapter RPC with collab DB

* Handle ShutdownDebugClient RPC for collab db

* Fix clippy
2025-01-08 12:17:50 -05:00
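The "event queue for remote daps" idea from the PR above can be sketched like this (the `DapEvent` variants and method names are assumptions, not Zed's real types):

```rust
use std::collections::VecDeque;

// Hedged sketch of the remote dap_store event queue; types are illustrative.
#[derive(Debug, PartialEq)]
pub enum DapEvent {
    Stopped { thread_id: u64 },
    Output(String),
}

#[derive(Default)]
pub struct RemoteEventQueue {
    queued: VecDeque<DapEvent>,
}

impl RemoteEventQueue {
    /// Called while the debug panel has not been initialized yet:
    /// events are buffered instead of being dropped.
    pub fn push(&mut self, event: DapEvent) {
        self.queued.push_back(event);
    }

    /// Called once by the debug panel after it is created: takes every
    /// queued event, in order, so none are lost.
    pub fn drain(&mut self) -> Vec<DapEvent> {
        self.queued.drain(..).collect()
    }
}
```

This preserves events that arrive between project join and panel initialization, which is exactly the window the commit describes.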
Remco Smits
dd4a1f7d30 Only change the request timeout for tests 2025-01-08 15:29:16 +01:00
Remco Smits
8911c96399 Send error response back for RunInTerminal response for debugging 2025-01-08 12:28:01 +01:00
Remco Smits
404f77357d Use title from adapter when spawning a debug terminal 2025-01-08 12:15:28 +01:00
Remco Smits
d99653fd15 Add basic test to validate StartDebugging reverse request works 2025-01-08 11:49:23 +01:00
Remco Smits
72f13890f0 Make respond to start debugging code more readable 2025-01-08 11:48:11 +01:00
Anthony Eid
2551a0fe13 Fix variable list for participants during collab debugging 2025-01-08 05:31:55 -05:00
Remco Smits
501073393c Merge branch 'main' into debugger 2025-01-07 19:35:45 +01:00
Remco Smits
78d342d582 Fix CWD is not correctly respected when appending child paths
Right now when you add `"cwd": "$ZED_WORKTREE_ROOT/some/path"` to your config,
it won't work as expected, because we only added the `cwd` to the task template when it was already a valid path.
But a value like `"cwd": "$ZED_WORKTREE_ROOT"` is never a valid path before resolution, so the `cwd` was never added to the task template and we always fell back to `$ZED_WORKTREE_ROOT`.

By always adding it to the task template, we now try to resolve it and otherwise fall back to `$ZED_WORKTREE_ROOT`.
2025-01-03 22:10:01 +01:00
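The always-add-then-resolve behavior described above could look roughly like this (function and parameter names are illustrative, not Zed's actual task-template code):

```rust
// Hedged sketch: always take the configured cwd, resolve the task variable
// in it, and only fall back to the worktree root when nothing was configured.
pub fn resolve_cwd(configured: Option<&str>, worktree_root: &str) -> String {
    match configured {
        // The raw value always goes into the template and gets resolved,
        // even if it isn't a valid path before substitution.
        Some(raw) => raw.replace("$ZED_WORKTREE_ROOT", worktree_root),
        // Fallback only when no cwd was configured at all.
        None => worktree_root.to_string(),
    }
}
```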
Remco Smits
f2722db366 Fix crash with RunInTerminal request
This occurs when you start a JavaScript debug session that tells Zed to spawn a debug terminal, without having a breakpoint for the session. The debug adapter terminates the terminal, but it also terminated Zed, because we sent Zed's own process ID to the debug adapter as extra information to keep track of the terminal, even though we already send the pid of the spawned terminal.

So removing the process_id of Zed itself fixes the crash😀

You can reproduce this by using the following config:
```json
{
  "label": "JavaScript debug terminal",
  "adapter": "javascript",
  "request": "launch",
  "cwd": "$ZED_WORKTREE_ROOT",
  // "program": "$ZED_FILE", // this is optional, but will also crash
  "initialize_args": {
    "console": "integratedTerminal"
  }
}
```
2025-01-03 21:48:00 +01:00
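For context: the DAP `runInTerminal` response carries both a `processId` and a `shellProcessId` field, and the crash above came from reporting Zed's own pid in it. The struct below is an illustrative sketch, not Zed's actual type:

```rust
// Hedged sketch mirroring DAP's runInTerminal response fields.
pub struct RunInTerminalResponse {
    /// Pid of the spawned terminal's shell — safe to report.
    pub shell_process_id: Option<u64>,
    /// Zed's own pid must NOT go here: the adapter may kill this process
    /// when the terminal ends, which is what terminated Zed itself.
    pub process_id: Option<u64>,
}

/// Builds the response with only the spawned terminal's pid.
pub fn response_for(shell_pid: u64) -> RunInTerminalResponse {
    RunInTerminalResponse {
        shell_process_id: Some(shell_pid),
        process_id: None,
    }
}
```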
Anthony Eid
f0cd2bfa61 Fix unused test import 2025-01-02 22:21:15 -05:00
Anthony Eid
020623a19d Fix regression causing some debugger tests to fail
After fn on_response() was created, we added a test feature that allowed us to
return error responses. We serialized responses wrapped in
a Result type, causing serde_json::from_value(R::Response) to fail when
attempting to get a response and return a default response instead.
2025-01-02 22:12:58 -05:00
Anthony Eid
e551923477 Merge branch 'main' into debugger
I made the debugger's panel activation priority 9
2025-01-02 18:54:47 -05:00
Anthony Eid
cbbf2daa2f Update debugger docs with more basic information 2025-01-02 18:36:37 -05:00
Anthony Eid
9497987438 Add workspace breakpoint persistence test 2025-01-02 18:17:00 -05:00
Anthony Eid
331aafde23 Add log breakpoint tests 2025-01-02 18:04:16 -05:00
Anthony Eid
f9c88fb50f Add edit log breakpoint action
I also fixed two bugs: not adding a log message to a standard breakpoint
would remove it (that shouldn't happen), and the breakpoint prompt editor didn't
always return focus to the editor
2025-01-02 17:12:41 -05:00
Anthony Eid
1698965317 Create set breakpoint context menu
We repeated the same code for creating a breakpoint context menu in three places, so I extracted
it into a function.
2025-01-02 03:11:42 -05:00
Remco Smits
1c59d2203b Send error response message back for StartDebugging request 2024-12-30 23:24:52 +01:00
Remco Smits
42d1b484bd Testing: Allow sending error response inside request handler 2024-12-30 23:17:54 +01:00
Remco Smits
7223bf93ba Update toggle breakpoint test to cover more cases
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-12-30 20:16:30 +01:00
Remco Smits
3afda611b4 Add editor toggle breakpoint test
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-12-30 20:10:18 +01:00
Remco Smits
acb3ee23a8 Clean up remove breakpoint code
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-12-30 20:10:03 +01:00
Anthony Eid
dd5083ad62 Remove empty log breakpoints when confirming log message 2024-12-30 05:15:20 -05:00
Anthony Eid
0b68944397 Fix bug where toggling breakpoints on line 1 wouldn't work
This only affected breakpoints that were being toggled from a code runner/actions symbol right click
2024-12-30 04:55:03 -05:00
Anthony Eid
f0947c1197 Fix bug where untoggling the last breakpoint in a buffer wouldn't be sent to debug sessions
The bug was caused by removing the breakpoint_set associated with a project_path
when removing the last breakpoint in the set while there were active debug sessions,
causing Zed to never tell active DAPs that the last breakpoint was removed.

We now only remove breakpoint_set if there are no active sessions. A better solution
would be removing breakpoint_set after sending all breakpoint requests to active DAPs.

Also, we only send initial breakpoint requests for project paths that contain breakpoints now.
2024-12-30 04:34:45 -05:00
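The keep-the-empty-set behavior described above, as a hedged sketch (key types and the function name are made up for illustration):

```rust
use std::collections::{BTreeSet, HashMap};

// Hedged sketch: breakpoints keyed by project path; names are illustrative.
pub fn remove_breakpoint(
    breakpoints: &mut HashMap<String, BTreeSet<u32>>,
    path: &str,
    row: u32,
    has_active_sessions: bool,
) {
    if let Some(set) = breakpoints.get_mut(path) {
        set.remove(&row);
        // Keep the (now possibly empty) set around while sessions are
        // active, so the empty list is still sent to active DAPs and they
        // learn that the last breakpoint was removed.
        if set.is_empty() && !has_active_sessions {
            breakpoints.remove(path);
        }
    }
}
```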
Remco Smits
9cd7e072fe Keep scroll position in variable list when opening nested variables (#82)
* Keep scroll position when opening nested variables

* Remove debug statement

* Fix clippy

* Update determine scroll position to cover more cases
2024-12-29 17:54:23 +01:00
Anthony Eid
f682077ffd Remove dbg! statement
We already notify the user of the error and now log it too
2024-12-29 11:36:47 -05:00
Remco Smits
b3aa18d70c Update variable list test to cover hide sub variables 2024-12-29 16:33:50 +01:00
Remco Smits
030ee04e54 Swap memory for breakpoints instead of manual clear 2024-12-29 16:02:53 +01:00
Remco Smits
e2206b876b Move variable index to SumTree
This change replaces the Vec-based implementation with SumTree,
significantly improving performance for frequent insertions. While
the impact may not be immediately noticeable to users, it eliminates the need for costly vector shifts during each insert operation.
2024-12-29 16:02:06 +01:00
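SumTree is Zed-internal; as a rough illustration of why a tree structure beats a `Vec` here, compare an ordered `Vec` insert (which shifts the tail on every middle insert) with a `BTreeMap` insert (no shifting, O(log n)):

```rust
use std::collections::BTreeMap;

// Illustrative comparison only — Zed uses its own SumTree, not BTreeMap.
pub fn insert_vec(vars: &mut Vec<(u64, String)>, key: u64, name: String) {
    // Vec: find the sorted position, then shift everything after it — O(n).
    let pos = vars.partition_point(|(k, _)| *k < key);
    vars.insert(pos, (key, name));
}

pub fn insert_tree(vars: &mut BTreeMap<u64, String>, key: u64, name: String) {
    // Tree: O(log n) insert, order maintained without shifting elements.
    vars.insert(key, name);
}
```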
Anthony Eid
1f0304f002 Fix debug_panel_item::to_proto function signature
I switched it to use AppContext to improve the function's flexibility
2024-12-29 03:37:00 -05:00
Remco Smits
b5b877bd26 Add test to validate we shutdown session when attach modal is dismissed 2024-12-27 20:19:58 +01:00
Remco Smits
bfdf12c51c Fix typo 2024-12-27 20:13:25 +01:00
Remco Smits
e0891a1df7 Add test for attach select process flow 2024-12-27 19:31:57 +01:00
Remco Smits
c1e15c0907 Merge branch 'main' into debugger 2024-12-27 18:59:03 +01:00
Remco Smits
f4669e8965 Add basic test for attach flow 2024-12-27 16:59:45 +01:00
Remco Smits
43defed0a4 Move run in terminal tests to debug panel module 2024-12-27 16:59:06 +01:00
Remco Smits
ea18dfbe94 Add unhappy path test for handle RunInTerminal reverse request 2024-12-27 15:26:33 +01:00
Remco Smits
7ced61d798 Add basic test for RunInTerminal reverse request to spawn terminal
This commit also adds a way to add a handler when a response is received from a specific reverse request.
2024-12-27 00:56:42 +01:00
Remco Smits
fba00c728e Notify spawn terminal errors to user 2024-12-27 00:13:02 +01:00
Remco Smits
ca970dd77d Add test option for faking reverse requests 2024-12-26 23:04:57 +01:00
Remco Smits
569f500b52 Remove not needed hash derive 2024-12-26 18:32:59 +01:00
Remco Smits
b911fb1c9a Fix failing debug task test 2024-12-26 17:08:30 +01:00
Remco Smits
b8c89d3d8a Fix formatting 2024-12-26 16:56:35 +01:00
Remco Smits
098bc759a9 Merge branch 'main' into debugger 2024-12-26 16:54:53 +01:00
Remco Smits
ccacce9ccb introduce debug session model (#80)
* WIP introduce debug session model

* Wip more refactor towards session model

* Fix some compile errors

* Move sub menu item into own struct

* Don't show client id inside selected value

* Remove unused session_id from shutdown event

* Remove clients from dap store use from sessions instead

* Fix clippy

* Add client id to received event log

* Move reconnect client work again

* Move sessions to local dap store

* Move ignore breakpoints to session model

* Move next session/client id to local store

* Remove duplicated test support only method

* Show client id first in dap log menu entry

* Show sub menu better

* Sort clients by their id

* Sort sessions by their id

* Fix configuration done race condition with sending breakpoints

@Anthony-Eid I think this fixed the race condition you noticed with the configuration done request.

So the issue is that when the task is created, it directly starts executing.
So the `configuration_done` request is most of the time finished before the breakpoints are sent
if you don't have a lot of breakpoints.

So instead of creating both tasks up front, we now create the 2 tasks right after each other,
so we can be sure that the `configuration_done` request is sent after the breakpoints are sent.

```
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 received event `Initialized`
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 send `setBreakpoints` request with sequence_id: 3
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 send `setBreakpoints` request with sequence_id: 4
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 send `configurationDone` request with sequence_id: 8
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 send `setBreakpoints` request with sequence_id: 9
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 send `setBreakpoints` request with sequence_id: 7
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 send `setBreakpoints` request with sequence_id: 5
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 received response for: `setBreakpoints` sequence_id: 3
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 received response for: `setBreakpoints` sequence_id: 4
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 send `setBreakpoints` request with sequence_id: 6
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 received response for: `configurationDone` sequence_id: 8
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 received response for: `setBreakpoints` sequence_id: 9
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 received response for: `setBreakpoints` sequence_id: 7
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 received response for: `setBreakpoints` sequence_id: 5
[2024-12-25T21:51:09+01:00 DEBUG dap::client] Client 0 received response for: `setBreakpoints` sequence_id: 6
```

```
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 received event `Initialized`
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 send `setBreakpoints` request with sequence_id: 4
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 send `setBreakpoints` request with sequence_id: 5
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 send `setBreakpoints` request with sequence_id: 6
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 send `setBreakpoints` request with sequence_id: 7
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 send `setBreakpoints` request with sequence_id: 8
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 send `setBreakpoints` request with sequence_id: 3
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 received response for: `setBreakpoints` sequence_id: 4
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 received response for: `setBreakpoints` sequence_id: 5
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 received response for: `setBreakpoints` sequence_id: 6
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 received response for: `setBreakpoints` sequence_id: 7
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 received response for: `setBreakpoints` sequence_id: 8
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 received response for: `setBreakpoints` sequence_id: 3
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 send `configurationDone` request with sequence_id: 9
[2024-12-25T21:49:51+01:00 DEBUG dap::client] Client 0 received response for: `configurationDone` sequence_id: 9
```

* Move capabilities back to dapstore itself

* Fix failing remote debug panel test

* Remove capabilities on remote when client shutdown

* Move client_by_session to local store and impl multi client shutdown

* Remove unused code

* Fix clippyyy

* Rename merge capabilities method

We don't only merge; we also insert if there are no capabilities for the client yet.

* Use resolved label for debug session name

* Notify errors when start client to user

* Notify reconnect error to user

* Always shutdown all clients

We should always shut down all clients from a single debug session when one is shutdown/terminated.
2024-12-26 16:09:54 +01:00
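The configurationDone ordering fix from this PR can be sketched synchronously (in Zed these are async tasks; the function here is illustrative):

```rust
// Hedged synchronous sketch of the ordering guarantee: all setBreakpoints
// work is performed first, and configurationDone is only issued afterwards.
pub fn send_requests_sequentially(breakpoint_paths: &[&str]) -> Vec<String> {
    let mut sent = Vec::new();
    // Send a setBreakpoints request per file with breakpoints…
    for path in breakpoint_paths {
        sent.push(format!("setBreakpoints {path}"));
    }
    // …and only then configurationDone, so the adapter never starts the
    // program before it knows about every breakpoint.
    sent.push("configurationDone".to_string());
    sent
}
```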
Remco Smits
1929dec4a0 Use task label for debug panel item tab content 2024-12-22 22:28:22 +01:00
Remco Smits
ed7443db6e Update console evaluate test to validate we invalidate variables 2024-12-22 14:23:39 +01:00
Remco Smits
3e9139eeb1 Add basic flow tests for console/output editor 2024-12-21 15:43:53 +01:00
Remco Smits
ad1e51f64a Add tests for variable list toggle scope & variables
This also covers fetching the variables, and validating the visual entries
2024-12-21 00:46:13 +01:00
Remco Smits
a91faaceec Rename DebugClientStopped event to DebugClientShutdown
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-12-20 21:43:11 +01:00
Remco Smits
cea5cc7cd0 Add basic test for collab show debug panel
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-12-20 21:38:15 +01:00
Anthony Eid
f6d26afb13 Collab: Sync DAP Client Capabilities (#79)
* Start work on getting dap clients to sync capabilities

* Add notify when syncing capabilities 

* Remove client capabilities in handle shutdown debug client
2024-12-20 11:42:00 -05:00
Remco Smits
f8fe1652a7 Add basic flow tests for variable list 2024-12-19 22:44:39 +01:00
Remco Smits
39e70354c1 Add test for selecting stackframe and open correct editor 2024-12-19 22:40:54 +01:00
Remco Smits
4f8c19a93a Fix don't remove request handler for tests
This fixes an issue where we couldn't handle a request multiple times.
2024-12-19 22:38:54 +01:00
Anthony Eid
1b1d37484b Add DAP Step back support (#78)
* Add step back support for DAP

The step back button is hidden because most dap implementations don't support
it.

* Add step back as global action - Thanks Remco for the advice! 

* Filter step back action when not available
2024-12-19 13:39:21 -05:00
Anthony Eid
5dbadab1ac Fix failing test_debug_panel_following
We don't have a FollowableItem impl for debug_panel_item yet,
so this test will always fail. I commented out some lines and
put a todo that we'll get to after standard collab works for the
debugger
2024-12-18 13:25:22 -05:00
Anthony Eid
d7fa7c208d WIP Implement Debugger Collab Functionality (#69)
* Initial WIP for impl FollowableItem for DebugPanelItem

* Implement DebuggerThreadState proto functions

* Add Debug panel item variable list definition to zed.proto

* Add more debug panel item messages to zed.proto

* WIP

* Fix compile errors

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* More WIP

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Further WIP lol

* Start working on fake adapter WIP

Co-authored-by: Remco Smits <djsmits12@gmail.com>
Co-authored-by: Mikayla Maki  <mikayla@zed.dev>

* Merge with Remco's mock debug adapter client branch

* Fix false positive clippy error

This error was a match variant not being covered when the variant wasn't possible due
to a feature flag. I'm pretty sure this is a bug in clippy/rust-analyzer and will
open an issue on their repos

* Add todo to change in dap adapter downloads

* WIP Get variable to send during variable list

* Get variable list from/to_proto working

Note: For some reason variable entries aren't rendering even though
everything is being sent

* Fix warning messages

* Fix typo

* Impl stack frame list from/to_proto for debug panel item

* Change order of set_from_proto for debug panel item

* Impl Variable list variables to/from proto funcs

* Start work on Set Debugger Panel Item event

* WIP with remco

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Get SetDebugPanelItem proto message sending and handled

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Setup UpdateDebugAdapter collab message & send live stack frame list updates

* Use proto enums instead of hardcoded integers

* Use read instead of update

* Send variable list update message each time we build the entries

* Send stack frame message when we selected the active stackframe

* Add more mappings

* Remove debug and rename method to be more inline with others

* Use the correct entries to reset

* Add tests to validate we can go and from proto ScopeVariableIndex

* Rename test

* Create UpdateAdapter ModuleList variant WIP

* Change proto conversion trait to have some types return Result enums

* Get clippy to pass

I removed some proto messages we used in the DebugPanelItem FollowableItem implementation
because they were causing clippy errors and will need to be changed in the near
future anyway

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
Co-authored-by: Mikayla Maki <mikayla@zed.dev>
2024-12-18 11:45:21 -05:00
Remco Smits
6aba39874c Add test for stack frame list
This covers the initial fetch and opening of the editor that belongs to the current stack frame (always the first one)
2024-12-18 17:28:36 +01:00
Remco Smits
5fe2da3e72 Minimize delay when clear entries and build entries 2024-12-18 17:27:28 +01:00
Remco Smits
ab4973fbdb Add more tests for creating a debug panel item per client & thread 2024-12-18 15:46:49 +01:00
Remco Smits
e3cb3d143e Fix failing test 2024-12-18 15:11:07 +01:00
Remco Smits
6ed42285d6 Remove unused dep from debugger_ui crate 2024-12-18 14:12:37 +01:00
Remco Smits
0b97ffad13 Merge branch 'main' into debugger 2024-12-18 14:08:48 +01:00
Remco Smits
7f4f7b056e Add basic debug panel flow test (#74)
* Add debug panel flow tests

* Wip debug

* Wip fix test

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>

* WIP Get test_show_debug_panel to run

While the test now runs without panicking due to being parked with nothing left to run,
the debug panel item is not being spawned

* Get test_show_debug_panel to pass & clean up

* Make clippy pass

* Assert debug panel item is removed when client shutdown

* Move send event back to dapstore

---------

Co-authored-by: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
Co-authored-by: Anthony Eid <hello@anthonyeid.me>
2024-12-18 10:47:50 +01:00
Gaauwe Rombouts
473ebbb6c3 Continue with successful variable requests (#75) 2024-12-17 23:21:32 +01:00
Remco Smits
3474750588 Fix don't panic when we receive a response after the request timeout was exceeded 2024-12-17 18:38:05 +01:00
Remco Smits
28c6012ff8 Update dap types to correctly fix other event case 2024-12-16 20:03:05 +01:00
Anthony Eid
9a802b9133 Add dap settings to disable dap logs & format dap messages within logs 2024-12-16 12:30:30 -05:00
Anthony Eid
d7b5132bd9 Update dap-types version
The past version always returned an error when attempting to deserialize custom DAP
events, causing Zed to stop receiving messages from the debug adapter
2024-12-16 12:14:53 -05:00
Remco Smits
65789d3925 Fix don't serialize terminal view when it's a debug terminal 2024-12-14 23:14:11 +01:00
Remco Smits
bea29464f9 Show client id inside dap log
This makes it easier to see which adapter was started first.
2024-12-14 20:06:29 +01:00
Remco Smits
980f9c7353 Correctly reconnect to adapter for StartDebugging reverse request
Before this change, we always started a new adapter but this was causing some issues:

- Infinite client starts
- Cannot start client port already in use
- Not able to start a debug session

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-12-14 20:04:39 +01:00
Remco Smits
aefef9c58c Log event names for debugging 2024-12-14 19:59:41 +01:00
Remco Smits
29a9eaf5fc Fix RunInTerminal not working due to env variable values getting wrapped with " 2024-12-14 19:00:33 +01:00
Anthony Eid
490d42599b Fix GDB Debug Adapter on linux & Clippy
The GDB adapter was failing because a stderr stream failed to
connect to Zed from the adapter. We now log the error and continue
the debugging sequence if the std input and output streams connect.
2024-12-11 05:20:50 -05:00
Remco Smits
04cd04eb44 Move thread status to debug_panel 2024-12-10 17:54:38 +01:00
Remco Smits
22e52c306a Use util command for windows support 2024-12-10 17:36:13 +01:00
Remco Smits
6a4a285ee6 Add test to validate that you can add/remove breakpoints in collab setting 2024-12-10 17:28:47 +01:00
Remco Smits
a50f3c36b0 Merge branch 'main' into debugger 2024-12-09 22:28:59 +01:00
Remco Smits
1a0ecf0c16 Add FakeAdapter and FakeTransport for testing purposes (#73)
* Move process to transport struct itself

* Add test to make sure we can send request and receive a response

* Align other handle methods

* Fix issues inside cargo.toml require test features inside the correct *-dependencies

* Remove comments that are not needed

* Add as_fake instead of downcasting

* Clean up

* Override get_binary method so we don't fail on install

* Fix false positive clippy error

This error was a match variant not being covered when the variant wasn't possible due
to a feature flag. I'm pretty sure this is a bug in clippy/rust-analyzer and will
open an issue on their repos

* Remove not needed clone

* Panic when we receive an event/reverse request inside the test

* reuse the type of the closure

* Add a way to fake receiving events

* Oops remove fake event from different test

* Clipppyyyy

---------

Co-authored-by: Anthony Eid <hello@anthonyeid.me>
2024-12-09 20:16:06 +01:00
Remco Smits
2b8ae367c7 Rework download adapters and allow multiple download formats (#72)
* Rework download adapters and allow multiple download formats

* Move version path check up, so we don't always have to check 2 paths

* Add user installed binary back

* Fix dynamic ports not working for javascript when using the start debugging path
2024-12-06 20:31:21 +01:00
Remco Smits
61e8d0c39b Add resolved cwd to lldb initialize options 2024-12-01 10:57:09 +01:00
Remco Smits
386031e6dd Collab: Sync active debug line (#71)
* Add sync active debug line over collab

* Remove unused downcast

* Remove unused import
2024-11-28 21:48:36 +01:00
Anthony Eid
b8b65f7a8f Disable gdb dap on non x86 platforms
Gdb doesn't run on ARM platforms so I'm disabling it to avoid users
running into bugs. It will still work on Intel Macs and x86 devices
2024-11-28 15:32:07 -05:00
jansol
4068960686 Add gdb debug adapter implementation (#70)
* dap_adapters: add gdb

* debug: Add debug zed with gdb task
2024-11-28 21:14:27 +01:00
Remco Smits
df71b972e1 Fix a rare panic when selecting a stack frame but new stack frames are being loaded 2024-11-27 17:21:53 +01:00
Remco Smits
6bc5679c86 Always send a response back when you receive a startDebugging reverse request 2024-11-27 11:18:43 +01:00
Anthony Eid
35a52a7f90 Fix bug where breakpoints weren't rendered on active lines 2024-11-26 19:07:11 -05:00
Remco Smits
5971d37942 Fix failing tests 2024-11-25 19:03:35 +01:00
Remco Smits
046ff02062 Merge branch 'main' into debugger 2024-11-25 18:36:27 +01:00
Remco Smits
2c45c57f7b Collab: Sync breakpoints (#68)
* Sync breakpoints on toggle to other client

* WIP Seperate local/remote in dap store

* WIP initial breakpoints sync

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>

* Get zed to compile

* Don't remove dap data when you unshare

* Add breakpoints table migration

* Update collab db when changing breakpoints

* Store breakpoints inside collab db when you change them

* Clean up

* Fix incorrect clearing of breakpoints during collab sync

* Get breakpoints to sync correctly on project join

We now send SynchronizedBreakpoints within the JoinProjectResponse
and use those breakpoints to initialize a remote DapStore.

* Set breakpoints from proto method

---------

Co-authored-by: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
Co-authored-by: Anthony Eid <hello@anthonyeid.me>
2024-11-25 15:09:56 +01:00
Remco Smits
8e738ba4d5 Reduce request timeout
The php adapter hangs when you click too fast on the UI buttons, so waiting 15 seconds before the request timeout is reached is a bit too much, because after the timeout you can still continue the debug session.
2024-11-19 23:20:16 +01:00
Remco Smits
008bd534af Add debug task file watcher
This also fixes that we did not see the initial debug tasks defined in `initial_debug_tasks.json`. So new users should see at least these debug tasks they could run, without having to define them themselves.
2024-11-16 11:07:31 +01:00
Remco Smits
8c72b99031 Don't starve the main thread when receiving a lot of messages 2024-11-16 09:33:29 +01:00
Anthony Eid
7e0150790a Fix debugger hang caused by repeatedly processing the same output event 2024-11-16 00:13:03 -05:00
Anthony Eid
a3220dc31d Add breakpoint context menu to all gutter symbols on right click 2024-11-15 22:02:49 -05:00
Anthony Eid
3aaee14ecc Extract breakpoint context menu into function to use for code action symbols 2024-11-15 21:12:40 -05:00
Remco Smits
0d84881641 Add default tasks for built-in adapters 2024-11-15 23:36:42 +01:00
Remco Smits
b6dc3ca86f Send cwd to PHP and Python adapter 2024-11-15 23:18:39 +01:00
Remco Smits
99bfc342ab Merge branch 'main' into debugger 2024-11-15 22:36:56 +01:00
Anthony Eid
3a77d7a655 Fix output that was produced before initial stop was not visible at first stop (#57)
* Fix log breakpoint output bug when hit before any breakpoints

* Fix always push to output queue

* Don't pop output queue

We still want all the output in new threads

* Fix clippy error

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-11-15 21:45:10 +01:00
Remco Smits
2c1f348c49 Refine spacing 2024-11-13 12:31:37 +01:00
Remco Smits
d8f8140965 Fix compile error 2024-11-13 12:26:05 +01:00
Remco Smits
ae0d08f36c Format docs with prettier 2024-11-13 12:21:49 +01:00
Remco Smits
23ccf08da1 Merge branch 'main' into debugger 2024-11-13 12:16:48 +01:00
Anthony Eid
977fd87a51 Create basic debug.json user documentation 2024-11-12 17:08:22 -05:00
Anthony Eid
80f775e186 Add debug console indicator for unread messages 2024-11-12 16:50:21 -05:00
Anthony Eid
6f0e223dc7 Improve debug client not found log error messages 2024-11-12 16:23:48 -05:00
Anthony Eid
ca80d0c3bd Fix debug client terminate bug where some highlights were not cleared
Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-11-12 15:48:14 -05:00
Remco Smits
ce30deac63 Prevent building variable entries twice while step debugging 2024-11-12 20:14:37 +01:00
Remco Smits
31e3e48052 Keep open variable list entries when changing stackframe 2024-11-11 19:33:38 +01:00
Remco Smits
81ca004ee6 Fix clippy error 2024-11-11 19:33:02 +01:00
Remco Smits
15dd1ee22e Only clear the active debug line for the editor that belongs to the current debug line
Also added a missing `cx.notify()` which fixes a delay when the line is removed visually
2024-11-10 23:48:32 +01:00
Remco Smits
c54454fa42 Remove not needed notify
I added this to fix the race issue where a session exited without stopping on a breakpoint. It turns out the adapter was sending these events for other threads that were not visible.
2024-11-10 22:38:03 +01:00
Remco Smits
964a6e8585 Fix don't show active debug line for finished session when you reopen a file 2024-11-10 22:10:32 +01:00
Remco Smits
aa5d51d776 Merge branch 'debugger' of https://github.com/RemcoSmitsDev/zed into debugger 2024-11-10 21:41:36 +01:00
Remco Smits
ba25aa26c9 Fix race condition that would trigger debug did not stop warning 2024-11-10 21:41:01 +01:00
Anthony Eid
b69d031e15 Transfer breakpoints on project_panel file rename (#65) 2024-11-10 14:42:06 -05:00
Remco Smits
68ee3c747a Fix flickering of rpc messages 2024-11-10 20:06:54 +01:00
Remco Smits
7cec577daa Add request timeout and add more logging for dropping tasks 2024-11-10 19:48:54 +01:00
Remco Smits
0976e85eb0 Fallback to disconnect if client does not support terminate request for shutdown 2024-11-10 19:37:40 +01:00
Remco Smits
c19c86085b Allow users to toggle ignore breakpoints (#62)
* Allow users to ignore breakpoints

* Add different colors for both states

* Add source name for breakpoints

* Move ignore breakpoints to dap store

* Change icon instead of color

* Add action for ignore breakpoints for a client

* Remove spacing

* Fix compile error

* Return task instead of detaching itself
2024-11-10 18:20:16 +01:00
Remco Smits
055ffc17cd Merge branch 'main' into debugger 2024-11-10 13:27:54 +01:00
Remco Smits
b728779a9a Merge branch 'main' into debugger 2024-11-10 13:20:36 +01:00
Anthony Eid
fb5bee3ba8 Merge branch 'main' into debugger 2024-11-10 02:38:16 -05:00
Anthony Eid
ffaaadf2b9 Get lldb-dap working on linux if already installed by user 2024-11-08 03:03:10 -05:00
Anthony Eid
9781292cba Fix breakpoint deserialization when loading workspaces from database 2024-11-08 02:28:37 -05:00
Anthony Eid
9e37b4708f Go Adapter Implementation (#64)
* Go DAP WIP

* Start work on getting go adapter working

* Get beta version of go adapter working & fix breakpoint line msgs

This adapter only works if a user has Go & delve in their PATH. It doesn't automatically download & update itself because the official download process from the Delve docs would add delve to a user's PATH. I want to discuss with the Zed dev team & Remco whether that is behavior we're ok with, because typical downloads don't affect a user's PATH.

This PR also fixes a bug where some breakpoint line numbers were incorrect when sending to active DAP servers.
2024-11-08 02:15:07 -05:00
Anthony Eid
bae6edba88 Fix failing unit test 2024-11-06 23:55:14 -05:00
Anthony Eid
9772573816 Fix adapter completion not showing in debug.json
schemars seems to have a bug when generating a schema for a struct with more than
one flattened field. Only the first flattened field works correctly; the second
one doesn't show up in the schema.
2024-11-06 19:06:29 -05:00
Anthony Eid
56f77c3192 Disable the ability to create debug tasks from tasks.json (#61)
This was done to simplify the process of setting up a debug task and improve task organization.
This commit also improves parsing of debug.json so it's able to ignore misconfigured debug tasks
instead of failing and returning no configured tasks.
2024-11-06 16:55:51 -05:00
Anthony Eid
56df4fbe6e Get user's python binary path based on which instead of python3 & pass shell_env to adapters (#63)
* Add ProjectEnvironment to Dap Store & get the user's python3 path from the which command

* Pass shell env to debug adapters
2024-11-06 16:26:55 -05:00
Remco Smits
45c4aef0da Fix non mac os compile error 2024-11-03 12:18:19 +01:00
Remco Smits
1b2871aac2 Fix failing test 2024-11-02 22:44:03 +01:00
Remco Smits
65cd774baa Implement Attach debugger + picker (#58)
* Rename StopDebugAdapters to ShutdownDebugAdapters

* Remove debug config from install methods

* Move requests methods to background executor

* Wip attach with picker

* Move find client to the first line of the method

The client should always be there at this point.

* Fix correctly determine when to restart

While debugging a reverse request issue, the top-level client sent a terminated event with `restart: false`, but we tried to restart the client, resulting in the client never being cleaned up by us.

Before this change we always assumed that if we got a JSON value we were going to restart the client, which is wrong.

We now try to restart the client if:
- the restart arg is a boolean and it's true
- the restart arg is a JSON value but not a boolean

* Clean up response to adapter

* Fix clippy errors

* WIP tasks

* Simplified debug task schema

This changes debug.json to look for adapter: adapter_name instead of
an object when a user selects a debug adapter, and fixes the default
behavior of request (to launch).

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Make default and flatten work for request

* Rename enum case

* Remove dbg

* Dismiss when candidate is not found

* Add docs for why we update the process id on the config

* Show error when `attach` request is selected but not supported

---------

Co-authored-by: Anthony Eid <hello@anthonyeid.me>
2024-11-02 22:23:15 +01:00
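The restart heuristic described in the commit above (restart on a boolean `true` or on any non-boolean JSON value) can be sketched as follows. This is a minimal sketch with a simplified JSON stand-in type; the names here are hypothetical and not Zed's actual API.

```rust
// Simplified stand-in for a JSON value, so the sketch is self-contained;
// the real code would use a serde_json::Value.
#[derive(Debug)]
enum Json {
    Bool(bool),
    Object, // stands in for any non-boolean JSON payload
}

/// Restart only when the `terminated` event's `restart` argument is the
/// boolean `true`, or is any non-boolean JSON value (adapter-supplied
/// restart arguments). A bare `false` or a missing argument means the
/// session is really done.
fn should_restart(restart_arg: Option<&Json>) -> bool {
    match restart_arg {
        Some(Json::Bool(value)) => *value,
        Some(_) => true,
        None => false,
    }
}

fn main() {
    // The bug case from the commit: `restart: false` must not restart.
    assert!(!should_restart(Some(&Json::Bool(false))));
    // A non-boolean payload carries restart arguments, so we restart.
    assert!(should_restart(Some(&Json::Object)));
    // No restart argument at all: clean up the client.
    assert!(!should_restart(None));
}
```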
Remco Smits
591f6cc9a2 Fix linux clippy error 2024-11-01 23:36:05 +01:00
Anthony Eid
bb89c1d913 Check if debug task cwd field is a valid path
If the path isn't valid, the default cwd value is used (the user's worktree directory)
2024-11-01 23:29:49 +01:00
Remco Smits
d1ddbfc586 Merge branch 'main' into debugger 2024-11-01 23:27:59 +01:00
Remco Smits
d226de3230 Add current working directory to adapter (#59)
* Add current working directory to adapter

* Prioritize using the resolved cwd
2024-11-01 22:47:43 +01:00
Anthony Eid
5efdaf3ca7 Allow users to set debug adapter path to use from settings 2024-11-01 17:18:23 -04:00
Remco Smits
cd2b4a8714 Merge branch 'main' into debugger 2024-10-30 21:21:59 +01:00
Remco Smits
15535d3cec Don't try to install an adapter that does not support it 2024-10-27 17:21:45 +01:00
Anthony Eid
747ef3e7ec Show debugger actions only during active sessions
Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-10-27 08:35:51 -04:00
Anthony Eid
aa9875277a Fix dap update status not going away
Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-10-27 08:05:19 -04:00
Anthony Eid
0c647ae071 Switch debugpy to use TCP instead of STDIO
Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-10-27 07:41:52 -04:00
Anthony Eid
f6556c5bb6 Show users error notices when dap fails to start
Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-10-27 07:29:21 -04:00
Remco Smits
f7eb5213a5 Allow users to configure host, port and timeout for built-in TCP adapters (#56)
* Remove space in cargo.toml

* Add TCPHost for Javascript and PHP adapter

* Allow falling back to first open port

* Add some more docs to TCPHost

* Fix cached binaries preventing multiple debug sessions for one adapter

This was an issue because we stored the port arg inside the DebugAdapterBinary, which was cached and so could not change anymore.

Co-authored-by: Anthony Eid <hello@anthonyeid.me>

* Add default setting for tcp timeout

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>

---------

Co-authored-by: Anthony Eid <hello@anthonyeid.me>
Co-authored-by: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-10-27 12:16:09 +01:00
Anthony Eid
d279afa41c Show DAP status in activity bar (#54)
* Begin integrating languages with DAP

* Add dap status type to activity indicator

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Show dap status to users

* Change Status enum to use ServerStatus struct in activity indicator

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-10-27 06:08:37 -04:00
Remco Smits
ddaf1508d3 Remove unused dependencies 2024-10-26 17:00:13 +02:00
Remco Smits
96871b493f Merge branch 'main' into debugger 2024-10-26 16:52:42 +02:00
Remco Smits
a45aa3d014 Correctly refetch/invalidate stackframes/scopes/variables (#55)
* Correctly invalidate and refetch data

* Fix show current stack frame after invalidating

* Remove not used return value

* Clean up

* Don't send SelectedStackFrameChanged when id is still the same

* Only update the active debug line if we got the editor
2024-10-26 16:27:20 +02:00
Remco Smits
57668dbaeb Fix compile error for missing field 2024-10-24 20:50:40 +02:00
Anthony Eid
115f2eb2e2 Merge branch 'main' into debugger 2024-10-24 13:23:55 -04:00
Anthony Eid
290c76daef Enable Debug Adapter Updates (#53)
* Get debug adapter to update when new versions are available

* Debug adapter download only happens if there's a new version available

* Use DebugAdapter.name() instead of string literals

* Fix bug where some daps wouldn't update/install correctly

* Clean up download adapter from github function

* Add debug adapter caching

* Add basic notification event to dap_store
2024-10-23 11:17:12 -04:00
Remco Smits
e2d449a11f Rename event case so it's clearer 2024-10-23 11:44:24 +02:00
Remco Smits
a95b16aa67 Remove borrow_mut because it's not needed 2024-10-23 11:01:48 +02:00
Remco Smits
932f4ed10a Remove not used method from merge 2024-10-23 11:01:17 +02:00
Anthony Eid
61daad2377 Clean up DAP install code (#51)
* Clean up dap adapter install_binary functions 

* Get lldb-dap working when on macos

* Add release tag to name of downloaded DAPs

* Check if adapter is already installed before installing one
2024-10-23 04:45:02 -04:00
Remco Smits
8baa2805d0 Use the correct logkind for stderr 2024-10-22 08:13:22 +02:00
Remco Smits
ffc058209d Add missing license 2024-10-21 20:05:11 +02:00
Remco Smits
b426ff6074 Remove unused dep 2024-10-21 20:00:30 +02:00
Remco Smits
6f973bdda4 Merge branch 'main' into debugger 2024-10-21 19:56:26 +02:00
Borna Butković
43ea3b47a4 Implement logging for debug adapter clients (#45)
* Implement RPC logging for debug adapter clients

* Implement server logs for debugger servers

* Clean up how we pass the input and output readers through for logging

This cleans up the way we pass through the input and output readers for logging, so not each debug adapter has to map the AdapterLogIO fields.

I also removed some specific `has logs` handling from the client, because the client is not responsible for that.

Removed a not needed/duplicated dependency.

Fix formatting & clippy

* Implement `has_adapter_logs` for each transport impl

* Make adapter stdout logging work

* Add conditional render for adapter log back

* Oops forgot to pipe the output

* Always enable rpc messages

Previously, RPC messages were only stored when explicitly enabled, which occurred after the client was already running. This approach prevented debugging of requests sent during the initial connection period. By always enabling RPC messages, we ensure that all requests, including those during the connection phase, are captured and available for debugging purposes.

This could help us debug when someone has trouble getting a debug session started. This improvement could be particularly helpful in debugging scenarios where users encounter issues during the initial connection or startup phase of their debugging sessions.

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
Co-authored-by: Anthony Eid <hello@anthonyeid.me>
2024-10-21 19:44:35 +02:00
Remco Smits
30a4c84a48 Remove unused dependency & sort 2024-10-20 21:23:16 +02:00
Remco Smits
211fd50776 Merge branch 'main' into debugger 2024-10-20 19:50:25 +02:00
Remco Smits
fc78c40385 Move how we connect to a debug adapter to the transport layer (#52)
* Move how we connect to a debug adapter to the transport layer

This PR cleans up how we connect to a debug adapter.
Previously, this was done inside the debug adapter implementation.

While reviewing the debugger RPC log view PR,
I noticed that we could simplify the transport/client code,
making it easier to attach handlers for logging RPC messages.

* Remove not needed async block

* Change hardcoded delay before connecting to tcp adapter to timeout approach

Co-Authored-By: Anthony Eid <hello@anthonyeid.me>

---------

Co-authored-by: Anthony Eid <hello@anthonyeid.me>
2024-10-20 19:29:34 +02:00
Anthony Eid
6735cfad67 Fix debug tasks and regular tasks overwriting each other 2024-10-20 06:21:32 -04:00
Anthony Eid
2e028a7038 Fix debug tasks loading from .zed/debug.json 2024-10-20 04:10:16 -04:00
Remco Smits
afe228fd77 Fix debug kind in debug panel item tab content 2024-10-17 20:14:12 +02:00
Remco Smits
fdea7d9de9 Merge branch 'main' into debugger 2024-10-17 14:04:43 +02:00
Remco Smits
1c1e34b3d2 Debug terminal (#50)
* Make program optional in debug task format

* Remove default config for skipFile JavaScript debugger

* Add Debug case to TerminalKind

* Don't allow serializing debug terminals

* Add respond method so we can send response back for reverse requests

* Implement run in terminal reverse request

* Move client calls to dap store

This commit also fixes an issue with not sending a response for the `StartDebugging` reverse request.

* Make clippy happy
2024-10-17 13:46:11 +02:00
Remco Smits
3a6f2adcc6 Merge branch 'main' into debugger 2024-10-15 12:32:09 +02:00
Remco Smits
13afc3741f Make the variable list flicker less when rapidly clicking the step over control 2024-10-14 20:57:07 +02:00
Remco Smits
c65ed1c738 Remove type duplication for debug settings JSON schema 2024-10-14 17:40:13 +02:00
Remco Smits
f5dc1175b8 Merge branch 'main' into debugger 2024-10-14 16:56:12 +02:00
Remco Smits
5758f664bc Add loaded sources (#49)
* Remove not needed notify

* Add loaded sources list

* Remove not needed double nested div.child()

* Remove not needed block

* Fix todo for updating loaded source
2024-10-14 16:08:32 +02:00
Remco Smits
b46b8aa76a Fix clippy 2024-10-14 15:38:26 +02:00
Remco Smits
222cd4ba43 Fix missing thread_state status update when client was terminated
Before this change the debug panel was still in the running state, which was wrong. Updating the status to Ended brings the UI to a correct state.
2024-10-13 15:19:05 +02:00
Remco Smits
5bb7f2408a Update dap types to fix invalid value for PresentationHint enums 2024-10-11 18:55:05 +02:00
Remco Smits
b1d24a0524 Flatten custom adapter connection config 2024-10-11 17:56:18 +02:00
Remco Smits
177ae28ab2 Add support for custom adapters
Also simplify the DebugAdapterBinary struct
2024-10-10 21:07:17 +02:00
Remco Smits
554a402cec Fix wrong symlink for license 2024-10-10 17:46:23 +02:00
Remco Smits
7e2c1386fc Cleanup adapters code and add javascript adapter (#48)
* Clean up how adapters install and fetch their binary

Before, this was one method, which was hard to read and did too many things.

* Add javascript debug adapter

* Get node binary from node_runtime

* Undo remove request args for lldb

* Remove initialize value for now

* Remove default values

* Fix did not compile javascript

* Fix php did not build
2024-10-09 15:47:07 +02:00
Remco Smits
8a4f677119 Merge branch 'main' into debugger 2024-10-09 11:42:59 +02:00
Remco Smits
a728f9e751 Remove not needed clone 2024-10-09 11:39:02 +02:00
Remco Smits
171c74223c Make start debugging request work again
This also fixes an issue where you could not add your own initialize_args for non-custom adapters
2024-10-09 11:30:59 +02:00
Remco Smits
5f1de1ab65 Add missing license 2024-10-08 23:54:53 +02:00
Remco Smits
91926cdcfe Fix correctly install and build php adapter 2024-10-08 23:46:37 +02:00
Remco Smits
08935b29ef Remove new lines from variable list value
This fixes a display issue for JavaScript expression values.
2024-10-08 23:14:28 +02:00
Remco Smits
eedd865ae8 Remove unused dep 2024-10-08 07:46:54 +02:00
Remco Smits
00b6fdc098 Make ci pass 2024-10-08 07:45:33 +02:00
Anthony Eid
187d909736 DapAdapter Updates (#40)
* Pass http client to dap store

* Set up DapAdapterDelegate to use for DAP binary fetches

* WIP to get debug adapters to use zed directory for installs

* Get debugpy automatic download working

* Change DapAdapter fetch_or_install to return a Result

* Add node_runtime to dap delegate

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Create dap_adapter crate & move language dap adapter code to that crate

This is the first phase of dap::client refactor to organize debug adapters with zed lsp adapters.
Eventually dap::client will have a TransportParams pass to it instead of an adapter, and adapters
will handle custom debug events.

Co-authored-by: Remco Smits <djsmits12@gmail.com>

* Move language specific dap adapter code to their own files

* Move DebugAdapter member out of ClientDebugAdapter struct

This change was done to make dap::client more in line with zed's lsp::client, it might be reverted depending
on if this solution is cleaner or not.

* Get php debug adapter to auto download when needed

* Get adapter_path argument to work with auto download dap adapters

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-10-07 16:33:03 -04:00
Remco Smits
ac0ba07c61 Merge main 2024-10-07 20:11:42 +02:00
Remco Smits
984cb686a8 Merge main 2024-10-07 20:02:55 +02:00
Remco Smits
3b545a79ba Move stack frame list to its own view (#47)
* WIP Move stack frame list to own module

* Add missing notify

* Remove some more duplicated state for the current stack frame

* Clear highlights when stack frame has changed

* Fix go to stack frame again
2024-10-07 19:39:46 +02:00
Remco Smits
f76b7c9337 Implement handle Capabilities event 2024-10-06 13:55:20 +02:00
Remco Smits
8c0a7b1024 Module list (#46) 2024-10-06 01:43:00 +02:00
Remco Smits
c2ed56af0b Add copy memory reference menu item to variable list 2024-10-05 13:51:24 +02:00
Remco Smits
f9b045bd23 Remove unused crates 2024-10-04 20:23:20 +02:00
Remco Smits
3c301c3ea4 Format file 2024-10-04 20:21:07 +02:00
Anthony Eid
93af1bfae7 Breakpoint prompt editor (#44)
* Create basic breakpoint prompt editor structure

* Get breakpoint prompt to properly render

* Fix bug where toggle breakpoint failed to work

This bug occurs when a breakpoint anchor position is moved from the beginning
of a line. This causes the dap_store.breakpoint hashmap to fail to get
the correct element, thus toggling the wrong breakpoint.

The fix to this bug is passing a breakpoint anchor to an editor's display map
and to the render breakpoint function. Instead of creating a new anchor when
clicking on a breakpoint icon, zed will use the breakpoint anchor passed to
the display map.

In the case of using toggle breakpoint action, zed will iterate through all
breakpoints in that buffer to check if any are on the cursor's line number,
then use anchor if found. Otherwise, zed creates a new anchor.

* Fix bug where breakpoint icon overlaps with other icons

This bug happened when an icon {code action | code runner} was rendered on the same line as a breakpoint
whose anchor was not at the start of the line

* Get breakpoint prompt to add log breakpoint's correctly

* Clean up breakpoint prompt UI & allow editing of log messages

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-10-04 14:16:04 -04:00
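The toggle behavior described in this commit (reuse an existing breakpoint's anchor when one sits on the cursor's line, otherwise create a new anchor) can be sketched roughly as below. This is a hedged sketch that models positions as plain line numbers rather than Zed's real text anchors; the function name is hypothetical.

```rust
/// Toggle a breakpoint on `cursor_line`: if a breakpoint already exists
/// on that line, reuse it (toggling removes it); otherwise place a new
/// one. Positions are simplified to line numbers for this sketch.
fn toggle_breakpoint(breakpoints: &mut Vec<u32>, cursor_line: u32) {
    if let Some(idx) = breakpoints.iter().position(|&line| line == cursor_line) {
        // Found an existing breakpoint on this line: remove it.
        breakpoints.remove(idx);
    } else {
        // No breakpoint on this line yet: create a new one.
        breakpoints.push(cursor_line);
    }
}

fn main() {
    let mut breakpoints = vec![3, 10];
    toggle_breakpoint(&mut breakpoints, 10); // removes the breakpoint on line 10
    toggle_breakpoint(&mut breakpoints, 7); // adds a breakpoint on line 7
    assert_eq!(breakpoints, vec![3, 7]);
}
```

The same reuse-or-create lookup avoids the wrong-breakpoint toggle the commit describes, since the existing anchor is matched by line instead of recreating one.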
Remco Smits
6b4ebac822 Fix current debug line highlight did not work
Co-authored-by: Anthony Eid <hello@anthonyeid.me>
2024-10-04 20:10:49 +02:00
Remco Smits
55c65700ad Merge branch 'main' into debugger
Co-authored-by: Anthony Eid <hello@anthonyeid.me>
2024-10-04 19:39:12 +02:00
Remco Smits
842bf0287d Debug console (#43)
* Use buffer settings for font, size etc.

* Trim end of message

* By default send the output values to the output editor

* WIP send evaluate request

* Rename variable

* Add result to console from evaluate response

* Remove not needed arc

* Remove store capabilities on variable_list

* Specify the capacity for the task vec

* Add placeholder

* WIP add completion provider for existing variables

* Add value to auto completion label

* Make todo for debugger

* Specify the capacity of the vec's

* Make clippy happy

* Remove not needed notifies and add missing one

* WIP move scopes and variables to variable_list

* Rename configuration done method

* Add support for adapter completions and manual variable completions

* Move type to variabel_list

* Change update to read

* Show debug panel when debug session stops

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
Co-Authored-By: Mikayla Maki <mikayla.c.maki@gmail.com>

* Also use the scope reference to determine where the set value editor should display

* Refetch existing variables after

* Rebuild entries after refetching the variables

---------

Co-authored-by: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
2024-10-04 09:00:45 +02:00
Anthony Eid
1914cef0aa Improve contrast for breakpoint & debug active line colors 2024-09-27 11:11:53 -04:00
Remco Smits
171ddfb554 Add stop debug adapter command 2024-09-25 19:15:04 +02:00
Remco Smits
bfddc634be Show current debug line when you reopen a buffer (#42)
* Only include dap store if editor.mode is FULL

* Move go to stack frame to debug_panel_item

* Notify dap_store when updating active breakpoints location

* Fix clippyyyy

* Show active debug line when you reopen a buffer

* Remove commented-out code

This is not needed anymore, we already clear the highlights when thread exited

* Make clippy happy

* Fix todo for removing highlights on exited event
2024-09-25 17:53:40 +02:00
Remco Smits
29918d96a2 Fix wrong function call for step in button 2024-09-24 20:00:38 +02:00
Anthony Eid
3b3ac85199 Change source/serialize breakpoints to be indexed zero 2024-09-24 11:20:33 -04:00
Anthony Eid
9bc08f9f84 Move breakpoint sql binding to sqlez crate 2024-09-24 10:54:45 -04:00
Remco Smits
ce77773796 Remove useless method 2024-09-23 19:45:12 +02:00
Remco Smits
9016a03e90 Update the correct status for continued event 2024-09-22 17:05:20 +02:00
Remco Smits
a3dff431c5 Move all the debug panel actions to the workspace (#41)
This PR moves all the actions out of the **debug_panel_item** into the **workspace**, which allows people to use these actions inside their key binds.

I also had to remove the debug_panel dependency inside the debug_panel_item, because we hit a `"cannot update debug_panel while it is already being updated"` panic. So instead of updating the thread status inside the **debug_panel**, we now do this inside the **debug_panel_item** to prevent this panic.

I also moved the actions to their own debugger namespace, so it's clearer what the actions are for.

The new actions can now also be used for key binds:
```
debugger: start
debugger: continue
debugger: step into
debugger: step over
debugger: step out
debugger: restart
debugger: stop
debugger: pause
```

/cc @Anthony-Eid We now can have key binds for debugger actions.
2024-09-22 17:02:33 +02:00
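With the actions moved to the workspace, they could be bound from a user keymap. The fragment below is an illustrative sketch only: the action identifiers and keys are assumptions based on the action names listed in the commit, not confirmed bindings from the Zed codebase.

```json
[
  {
    "context": "Workspace",
    "bindings": {
      "f5": "debugger::Continue",
      "f10": "debugger::StepOver",
      "f11": "debugger::StepInto",
      "shift-f11": "debugger::StepOut"
    }
  }
]
```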
Anthony Eid
231b5a910a Merge pull request #38 from Anthony-Eid/breakpoint-context-menu
Breakpoint Context Menu & Log Breakpoints
2024-09-21 23:32:43 -04:00
Anthony Eid
7dec58ce89 Merge branch 'debugger' into breakpoint-context-menu 2024-09-21 23:13:46 -04:00
Remco Smits
8b96ac8138 Make clippy pass 2024-09-21 19:39:55 +02:00
Remco Smits
4ddb65bdaa Make test pass again 2024-09-21 19:25:11 +02:00
Remco Smits
c26a8f1537 Remove unused dep 2024-09-21 18:47:44 +02:00
Remco Smits
f1f1426635 Make CI pass 2024-09-21 18:38:40 +02:00
Remco Smits
278699f2f7 Merge branch 'main' into debugger 2024-09-21 18:35:47 +02:00
Remco Smits
9612b60ccb Refactor: Move types to the correct place and move specific request code to the dapstore (#39)
* Make dap store global

* Partially move initialize & capability code to dap store

* Reuse shutdown for shutdown clients

* Rename merge_capabilities

* Correctly fallback to current thread id for checking to skip event

* Move method

* Move terminate threads to dap store

* Move disconnect and restart to dap store

* Update dap-types to the correct version

This includes the capabilities::merge method

* Change dap store to WeakModel in debug panels

* Make clippy happy

* Move pause thread to dap store

* WIP refactor out thread state and capabilities

* Move update thread status to debug panel method

* Remove double notify

* Move continue thread to dap store

* Change WeakModel dapStore to Model dapStore

* Move step over to dap store

* Move step in to dap store

* Move step out to dap store

* Remove step back

we don't support this yet

* Move threadState type to debugPanel

* Change to background task

* Fix panic when debugSession stopped

* Remove capabilities when debug client stops

* Add missing notify

* Remove drain that causes panic

* Remove Arc<DebugAdapterClient> from debug_panel_item instead use the id

* Reset stack_frame_list to prevent crash

* Refactor ThreadState to model to prevent recursion dependency in variable_list

* WIP

* WIP move set_variable_value and get_variables to dap store

* Remove unused method

* Fix correctly insert updated variables

Before this change you would see variables × scopes entries, because it did not insert the variables per scope.

* Correctly update current stack frame on variable list

* Only allow building variable list entries for current stack frame

* Make toggle variables & scopes work again

* Fix clippy

* Pass around id instead of entire client

* Move set breakpoints to dap store

* Show thread status again in tooltip text

* Move stack frames and scope requests to dap store

* Move terminate request to dap store

* Remove gap that is not doing anything

* Add retain back to remove also threads that belong to the client

* Add debug kind back to tab content
2024-09-21 17:31:50 +02:00
Anthony
c88316655c Fix typos & clippy warnings 2024-09-17 02:10:10 -04:00
Anthony
c3a778724f Handle breakpoint toggle with different kinds & on_click for log breakpoints 2024-09-17 02:01:46 -04:00
Anthony Eid
c1ab059d54 Get log breakpoint's to serialize/deserialize correctly in the database 2024-09-16 22:49:59 -04:00
Anthony Eid
142a6dea17 Get log breakpoint indicator to change into hint color on hover 2024-09-16 02:14:16 -04:00
Anthony Eid
e7f7fb759d Send log breakpoint's to debug adapter's when able 2024-09-16 01:51:15 -04:00
Anthony Eid
621d1812d1 Set up basic infrastructure for log breakpoints 2024-09-16 01:24:31 -04:00
Anthony Eid
8cdb1fb55a Get toggle breakpoint working from breakpoint context menu 2024-09-14 14:54:25 -04:00
Anthony Eid
716a81756f Add on_right_click function to IconButton struct 2024-09-14 14:54:25 -04:00
Anthony Eid
56943e2c78 Fix bug where breakpoints from past sessions on line 1 were unable to be toggled 2024-09-14 14:53:59 -04:00
Remco Smits
4694de8e8a Return adapter error message when error response comes back 2024-09-14 20:28:26 +02:00
Remco Smits
3985963259 Move Request, Response, Event types to dap-types repo
This also changes the concept of passing the receiver inside the request to make sure we handle all the requests in the right order.

Now we have another piece of state that keeps track of the current requests; when a request comes in we move the receiver to the pending requests state and handle the request as we did before.
2024-09-13 16:06:00 +02:00
Remco Smits
559173e550 Fix typo 2024-09-13 15:05:16 +02:00
Remco Smits
9b82278bb1 Move current stack frame id to debug panel item 2024-09-13 14:48:01 +02:00
Remco Smits
4405ae2d19 Correctly shutdown adapter (#37)
* Kill adapter and close channels

* Always try to terminate the debug adapter

* Remove TODO

* Always allow the user to restart if the capability allows it

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>

* Drop tasks

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>

* WIP fix hang

* Remove debug code

* clean up

---------

Co-authored-by: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-09-13 14:04:20 +02:00
Anthony Eid
571d99cecf Fix bug where breakpoints were unable to be toggled on the first line 2024-09-12 11:10:54 -04:00
Anthony Eid
ed6da4a7f6 Get zed variables to work in debug tasks program 2024-09-12 11:10:54 -04:00
Anthony Eid
8a835bcf88 Get .zed/debug.json to resolve debug tasks properly
Co-authored-by: Piotr <piotr@zed.dev>
2024-09-12 11:10:54 -04:00
Anthony Eid
3fac36bd77 Add custom debug config option to speed up DAP development 2024-09-12 11:10:54 -04:00
Remco Smits
7a6f9d9559 Remove default value for DebugItemAction struct 2024-09-12 11:06:33 +02:00
Remco Smits
ab6f3341c3 Remove background for breakpoint icon 2024-09-12 10:56:24 +02:00
Remco Smits
a4ce44629c Remove unused deps 2024-09-12 10:51:42 +02:00
Anthony Eid
18fb45f526 Move debugger breakpoint code from dap::client to project::dap_store 2024-09-11 18:41:04 -04:00
Anthony Eid
3184ba1df0 Add a debug breakpoint button from lucide 2024-09-11 18:20:19 -04:00
Anthony Eid
e6049f9830 Merge open/close breakpoints into one data structure
This is done by making the breakpoint.position field optional and
adding a cached_position field too. I also added dap_store to
buffer_store and got buffer_store to initialize breakpoints
when a new buffer is opened, fixing a bug where some breakpoints
wouldn't appear within multi buffers.
2024-09-11 17:13:55 -04:00
Anthony Eid
0f5e5ea3b4 Change open_breakpoint's BTree to use project_path as its key
This is the first step of merging open/close breakpoints into
one data structure.

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-09-11 17:13:55 -04:00
Anthony Eid
5a0c7d2885 Change breakpoint position from multi_buffer::Anchor -> text::Anchor
This fixes a crash that happened when placing a breakpoint within a multi buffer
where its excerpt_id != 1.

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-09-11 17:13:55 -04:00
Piotr Osiewicz
499024297d Touch up UI with borders and whatnot 2024-09-11 14:57:44 -04:00
Piotr Osiewicz
3683920dab Add support for LLDB 2024-09-10 22:45:22 -04:00
Remco Smits
5d07ab0d03 Show correctly adapter name in thread tab 2024-09-10 22:35:17 +02:00
Remco Smits
09c195e78e Remove debug code 2024-09-08 17:04:16 +02:00
Remco Smits
165e058dce Make php adapter work again with hardcoded values 2024-09-08 16:58:13 +02:00
Remco Smits
0e6042e12f Remove unused dep 2024-09-08 16:26:47 +02:00
Anthony Eid
1e99694f29 Make debug tasks easier for a user to configure
* Start setting up new debug task format

* Set up blueprint for converting from debug task to regular task

* Create debug adapter trait

* Get debug.json schema to json lsp to show users hints

* Start debugger task refactor to enable easier debugger setup

* Get python adapter to work within task.json

co-authored-by: Piotr <piotr@zed.dev>

* Start work on getting Php Debug adapter working

* Make debug adapter trait work with async functions

Fix CICD spell check & clippy warnings

Co-authored-by: Remco Smits <djsmits12@gmail.com>

---------

Co-authored-by: Piotr <piotr@zed.dev>
Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-09-08 10:14:36 -04:00
Remco Smits
dc5d0f4148 Don't pre-paint breakpoints that are outside the viewport 2024-09-08 12:37:16 +02:00
Remco Smits
b2927a07e4 Remove commented code 2024-09-08 11:08:25 +02:00
Remco Smits
b00d63b6c4 Move breakpoint & debug client code to dap_store (#36)
* Move breakpoint and client code to dap store

This also changes how we sync breakpoints. Now we just update the dap store model instead of having an RwLock.

The goal is to prepare for making inlay hints for debugging work.

* Remove debug code

* Remove unused method

* Don't grow the number of tasks when sending all the breakpoints

* Partially implement terminate clients when app quits

* Sync open breakpoints to closed breakpoints when buffer is closed

* Call terminate request also for not already started clients

* Fix missing import

* Remove not needed read_with call
2024-09-08 11:06:42 +02:00
Remco Smits
edf4e53571 Fix Clippy errors 2024-09-07 19:17:23 +02:00
Remco Smits
ac2aa795cd Merge main 2024-09-07 15:51:40 +02:00
Remco Smits
b009832229 Allow setting variable value (#34)
* Wip render set variable value editor

* Remove unused subscriptions

* Set current variable value & select all when setting value

* Send set variable request

* Rename thread entry to VariableListEntry

* WIP allow setting variables for nested structures

* Refactor & rename vars on thread state to be only the ids

* Fix: we did not correctly notify the right context when updating the variable list

* Clean open entries when debugger stops

* Use SetExpression if adapter supports it when setting value

* Refetch scope variables after setting value

This commit also reworks how we store scopes & variables

* Remove debug code

* Make Clippy happy

* Rename variable

* Change order for variable id

* Allow cancelling set value using escape key
2024-09-07 14:00:00 +02:00
Remco Smits
7b7a4757cd Show warning when session exited without hitting any breakpoint (#31)
* Show warning when session did not stop but exited

* Fix code was not formatted any more

* Fix clippy
2024-09-03 08:19:11 +02:00
Remco Smits
83cc452465 Make stepping granularity configurable (#33)
This commit also changes the default granularity to line; before, it was statement, but most editors have line as the default.
2024-09-01 11:44:32 +02:00
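The granularity change above can be sketched as a small settings type — a minimal sketch with illustrative names, not Zed's actual implementation:

```rust
// Hypothetical sketch of a configurable stepping granularity that
// defaults to Line, as the commit describes. Names are illustrative.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Default)]
pub enum SteppingGranularity {
    Statement,
    #[default]
    Line,
    Instruction,
}

// Fall back to the default (Line) when the user configured nothing.
pub fn granularity_or_default(configured: Option<SteppingGranularity>) -> SteppingGranularity {
    configured.unwrap_or_default()
}
```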
Remco Smits
4cf735bd93 Add right click menu to copy variable name & value (#32) 2024-08-31 14:50:57 +02:00
Remco Smits
fc4078f8e8 Fix we did not open the first scope after stopping for the second time 2024-08-28 10:22:39 +02:00
Remco Smits
243bd4b225 Merge branch 'main' into debugger 2024-08-28 10:20:51 +02:00
Anthony Eid
3ac4d1eaac Fix CICD spell check error 2024-08-26 23:24:07 -04:00
Anthony Eid
921d0c54a3 Merge branch 'main' into debugger 2024-08-26 17:03:36 -04:00
Anthony Eid
008b6b591b Debugger console (#30)
* Create debugger console

* Get console to output console messages during output events

* Use match expression in handle_output_event

* Move debug console code to its own console file
2024-08-26 16:46:37 -04:00
Anthony Eid
2b504b3248 Debugger elements (#29) 2024-08-26 20:36:46 +02:00
Remco Smits
045f927b4c Don't close the debug panel when debug session ends (#28) 2024-08-25 19:44:01 +02:00
Remco Smits
199b6657c6 Remove duplicated content code for debugger settings (#27)
Refactored this because it's being refactored in: https://github.com/zed-industries/zed/pull/16744
2024-08-25 19:29:59 +02:00
Remco Smits
31b27e1e85 Allow clicking on stack frame again (#26) 2024-08-25 15:31:56 +02:00
Remco Smits
5a301fc83a Improve clear highlights performance (#25)
* Only clear highlights of open tabs

This is much better since we don't have to open each path of each stack frame in each thread.

* Don't open the same file twice for clearing thread highlights
2024-08-25 12:31:03 +02:00
Remco Smits
618d81a3de Fix that debug tab was not showing anymore (#24) 2024-08-24 15:25:00 +02:00
Remco Smits
9bcd03b755 Remove default methods 2024-08-23 20:55:49 +02:00
Remco Smits
f3e7129479 Add debug icon to toggle debug panel (#23)
* Add debug icon to toggle debug panel

* Add setting for whether to show the debug button

* Fix clippy
2024-08-23 20:39:58 +02:00
Remco Smits
149116e76f Close debug panel tabs when client stopped (#20) 2024-08-22 07:17:50 +02:00
Anthony Eid
bbb449c1b7 Get debugger panel buttons to trigger while panel isn't focused 2024-08-21 17:17:26 -04:00
Remco Smits
651c31c338 Change icons from vscode to lucide (#21) 2024-08-21 22:33:00 +02:00
Remco Smits
3c98e8986f Fix failing test 2024-08-21 22:17:03 +02:00
Remco Smits
7885da1a2c Fix typos 2024-08-21 21:24:12 +02:00
Anthony Eid
2843a36853 Merge pull request #13 from Anthony-Eid/breakpoints
Make breakpoints persistent across Zed sessions
2024-08-21 11:33:43 -04:00
Anthony Eid
5c45e459bd Merge debugger into breakpoints
Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-08-21 11:30:29 -04:00
Anthony Eid
fb169af400 Add todo to fix bug & bug information
Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-08-21 11:23:17 -04:00
Remco Smits
db8b8beb70 Fix typo 2024-08-21 13:53:55 +02:00
Remco Smits
1ac97d2e87 Merge branch 'main' into debugger 2024-08-21 13:51:29 +02:00
Remco Smits
11b2bc1ffc Lazy fetch variables (#19)
* Lazy fetch

* Remove unused code

* Clean up fetch nested variable

* Include scope id for variable id to be more unique

* Clean up not needed get_mut function calls
2024-08-18 16:44:53 +02:00
Anthony Eid
da1fdd25cd Merge branch 'debugger' into breakpoints 2024-08-15 19:58:09 -04:00
Anthony Eid
e9f0e936ea Set up ground work for debugger settings & save breakpoints setting
Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-08-15 19:40:40 -04:00
Anthony Eid
3a3f4990ba Breakpoint PR review edits
Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-08-15 17:13:04 -04:00
Remco Smits
bb40689472 Merge branch 'main' into debugger 2024-08-14 17:22:27 +02:00
Remco Smits
9643e71947 Remove debugger client when its terminated (#18)
* Only terminate thread if it did not already end

* Remove debug client when its terminated
2024-08-14 17:00:18 +02:00
Remco Smits
9ea23b0a62 Fix wrong value for always showing disclosure icon
Oops, after testing I committed the wrong value.
2024-08-14 15:28:39 +02:00
Remco Smits
b561c68d28 Always show disclosure icon for variables and scopes (#17) 2024-08-14 10:46:07 +02:00
Anthony
26e9843a2a Merge branch 'debugger' into breakpoints 2024-08-13 20:14:24 -04:00
Remco Smits
5678469f9d Show variables recursively (#16)
* WIP Show variables in togglable list items

* Use more unique element id so we can group variables by name and reference

* Fix: pass the correct ListItem toggle value

* Fix we did not set the current stack frame id

* WIP start fetching variables recursively

* Remove duplicated code for current stack frame id

* Fetch all variables for each stack frame in parallel

* Fetch all variables on the same level in parallel

* Rename vars so it's clearer

* Refactor how we store thread data

We now store the information in a better way so we can filter on it more easily and quickly.

* Remove unused code

* Use stack frame id as top level key instead of struct itself

* Add has_children to scope thread entry

* Fix allow switching current stack frame

Also fixed building list entries twice right after each other

* Show message when scope does not have variables

* Remove scope if it does not have variables

* Add allow collapsing scopes

* Add allow collapsing variables

* Add docs to determine collapsed variables

* Allow collapsing variables and always show first scope variables

* Correctly fix collapse variables

* Fix clippy
2024-08-12 22:12:25 +02:00
Anthony Eid
fe58a702a0 Add debugger.accent as a configurable theme 2024-08-07 23:05:29 -04:00
Anthony Eid
30b59a6c92 Add breakpoint documentation 2024-08-07 22:32:01 -04:00
Anthony Eid
0f1738c8cc Allow toggling breakpoints to work from within multibuffer gutter 2024-08-07 19:17:49 -04:00
Anthony Eid
d3fe698c23 Fix bug that caused a breakpoints to only remain persistent in one open file when booting zed up 2024-08-07 19:08:28 -04:00
Remco Smits
cb52082821 Merge branch 'main' into debugger 2024-08-07 23:22:03 +02:00
Anthony Eid
a4db59cabb Merge branch 'remco-debugger' into breakpoints 2024-08-07 15:57:31 -04:00
Anthony Eid
7d243ac274 Get breakpoints to display in the correct positions within multi buffers
Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-08-07 15:26:34 -04:00
Anthony Eid
5a08fc2033 Fix breakpoint line number not being red when cursor is on it & clippy warnings 2024-08-06 22:48:28 -04:00
Anthony Eid
111a0dc539 Get project::has_active_debuggers to return true if a debugger is running instead of starting
We shouldn't consider a debugger active until it's running because it may fail during its starting phase.
Also, this check is used to determine if we should update a file's breakpoints when one is toggled.
2024-08-06 22:24:01 -04:00
Anthony Eid
1ad8ed77e4 Show a breakpoint on line with code action while code action symbol is loading
I also handle overlaps with code test buttons as well
2024-08-06 20:39:36 -04:00
Anthony Eid
c92ecc690f Handle case where code action indicator overlaps with breakpoint
Code actions that overlap with breakpoints will now show as red. Still need to
add a code action to toggle a breakpoint & handle overlaps with code runners
2024-08-06 16:34:35 -04:00
Anthony Eid
a4cc28f480 Fix bug where toggle breakpoint action wouldn't untoggle 2024-08-05 17:31:23 -04:00
Remco Smits
f4af5afe62 Fix merge conflicts 2024-08-05 22:50:11 +02:00
Remco Smits
49dd57d1f9 Add output for each thread (#15) 2024-08-05 21:38:30 +02:00
Remco Smits
96bdeca2ac Terminate thread when pane is closed/removed (#14) 2024-08-05 21:24:01 +02:00
Remco Smits
ef63ccff66 Merge branch 'main' into debugger 2024-08-04 11:38:05 +02:00
Anthony Eid
4181c39224 Delete println that was used for debugging 2024-08-04 01:21:45 -04:00
Anthony Eid
f3cb4a467b Send breakpoints from unopened files to adapters & fix workspace serialization 2024-08-04 00:42:18 -04:00
Anthony Eid
5f4affdefe Introduce serialized breakpoint type & refactor code base to use it 2024-08-03 23:26:12 -04:00
Anthony Eid
b1d830cb09 Editor doesn't need to know about closed breakpoint 2024-08-03 20:18:35 -04:00
Anthony Eid
e24e5f7e89 Transfer breakpoints from open -> close when buffer maintaining them is released 2024-08-03 20:15:16 -04:00
Anthony Eid
68158d3fc5 Transfer breakpoints from closed DS -> open DS when a buffer is open 2024-08-03 16:14:00 -04:00
Anthony Eid
0247fd6087 Get deserialization to work for file's that open when initializing Zed 2024-08-03 15:22:26 -04:00
Anthony Eid
d4904f97bb Start work on deserializing Workspace 2024-08-03 13:42:01 -04:00
Anthony Eid
8b63c1ab6f Change breakpoint DS back so it uses buffer_id as a key
I also made a new DS specifically for breakpoints that aren't part of any open buffers.
It should be easier to serialize and deserialize breakpoints now.
2024-08-01 19:17:25 -04:00
Anthony Eid
9dfd2f5bdd Get load workspace to get breakpoint data 2024-08-01 18:29:46 -04:00
Anthony Eid
ca844637f7 Get workplace to serialize breakpoints 2024-08-01 16:56:36 -04:00
Anthony Eid
42aefb4034 Change breakpoints DS to use project path as key instead of bufferID 2024-08-01 14:50:59 -04:00
Anthony Eid
e974ddedce Start work on setting up workplace serialization for breakpoints 2024-08-01 11:22:28 -04:00
Remco Smits
a82d759940 Merge branch 'main' into debugger 2024-07-31 17:54:39 +02:00
Remco Smits
a0123e8557 Merge branch 'main' into debugger 2024-07-31 16:06:40 +02:00
Remco Smits
7ff1a08356 Remove unused crates 2024-07-31 15:45:11 +02:00
Remco Smits
c99865b853 Make clippy pass 2024-07-31 15:17:45 +02:00
Remco Smits
fc4d46ec22 Add terminate request 2024-07-31 15:11:54 +02:00
Remco Smits
1c98c1c302 Act on capabilities when sending requests (#12)
* Fix used wrong request args in set breakpoints request

Some debug adapters depend on getting the exact data that you passed in the `launch` or `attach` request.

* Send correct request for stopping debug adapter

I changed the name to be more in line with the request name. We now also send the correct request values based on the `support_terminate_debuggee` capability.

* Send disconnect request for terminate threads if it does not support it

* Only send configuration done request if it's supported

* Add disconnect icon

* Only send step over request params when adapter supports it

* Only send resume(continue) request params if adapter supports it

* Step in: only send request args if the adapter supports it

* Step out: only send request args if the adapter supports it

* Step back: only send request args if the adapter supports it

* Log error using `detach_and_log_err` instead of manually
2024-07-31 15:09:27 +02:00
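The capability-gating this commit describes can be sketched roughly as follows. The `Capabilities` field names mirror the DAP spec's `Capabilities` type; the client-side helpers themselves are hypothetical, not Zed's actual code:

```rust
// Illustrative sketch of gating DAP requests on adapter capabilities.
#[derive(Default)]
pub struct Capabilities {
    pub supports_configuration_done_request: bool,
    pub supports_terminate_request: bool,
}

pub enum Request {
    ConfigurationDone,
    Terminate,
    Disconnect,
}

/// Pick the shutdown request based on what the adapter advertises:
/// `terminate` when supported, otherwise fall back to `disconnect`.
pub fn shutdown_request(caps: &Capabilities) -> Request {
    if caps.supports_terminate_request {
        Request::Terminate
    } else {
        Request::Disconnect
    }
}

/// Only emit `configurationDone` when the adapter supports it.
pub fn configuration_done(caps: &Capabilities) -> Option<Request> {
    caps.supports_configuration_done_request
        .then_some(Request::ConfigurationDone)
}
```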
Remco Smits
aa257ece86 Fix use correct request to stop thread (#11)
* Fix clippy

* Remove increment request sequence (wrong while merging)

* Send correct request to stop thread based on capabilities
2024-07-31 11:24:59 +02:00
Remco Smits
b9c1e511a9 Fix auto pick port didn't use the configured host (#10)
* Fix: use the configured host to determine the port that we can use.

* Make clippy happy
2024-07-31 10:28:31 +02:00
Remco Smits
2ab7f834b2 Implement startDebugging reverse request (#9)
* Wip move handle reverse requests on the debug panel

* Remove todo for startDebugging

* Remove unused code

* Make clippy happy

* Log error instead of ignoring it

* Remove debug code
2024-07-30 18:07:33 +02:00
Anthony Eid
26a5770e08 Merge pull request #7 from Anthony-Eid/breakpoints
This pull request enables users to set breakpoints by clicking to the left of a line number within the editor. It also anchors breakpoints to the original line they were placed on, which allows breakpoints to stay in their relative position when a line before a breakpoint is removed/added.
2024-07-29 20:06:56 -04:00
Anthony Eid
d6dd59c83f Merge with debugger branch 2024-07-29 20:03:48 -04:00
Anthony Eid
655b23c635 Get clippy to pass 2024-07-29 18:48:27 -04:00
Anthony Eid
620e65411b Fix crash that can happen after placing breakpoint in file and changing files
The crash was happening because the editor was trying to render all breakpoints even if they didn't
belong to the current buffer. Breakpoints don't persist after closing a tab; we'll need to
change the breakpoint DS key from buffer_id to something more permanent
2024-07-29 17:58:52 -04:00
Anthony Eid
d222fbe84c Fix breakpoint indicator lagging behind mouse in gutter 2024-07-29 16:42:19 -04:00
Anthony Eid
74931bd472 Make active debug line stand out more & set unique DAP client ids (#8) 2024-07-29 19:27:07 +02:00
Anthony Eid
4c5deb0b4e Finalize refactoring breakpoints to use RWLocks 2024-07-28 18:13:14 -04:00
Anthony Eid
a545400534 Set up breakpoint toggle to send breakpoints to dap when necessary 2024-07-28 13:51:28 -04:00
Anthony Eid
12c853e3f0 Merge branch 'remco-debugger' into breakpoints 2024-07-27 18:54:40 -04:00
Anthony Eid
4bb8ec96fd Get gutter breakpoint hint to display at correct point and set breakpoint
The breakpoints that are added by clicking on the gutter can't be affected by the toggle
breakpoint command. This is a bug
2024-07-27 18:10:12 -04:00
Anthony Eid
08dbf365bb Start work on getting breakpoint indicator to show in gutter
The end goal of this commit is to allow a user to set a breakpoint by
hovering over a line in the editor's gutter. Currently breakpoints
only show on line 6 when the mouse is on a gutter.
2024-07-27 17:53:44 -04:00
Anthony Eid
c39c0a55f5 Have project share breakpoints pointer with editor
Instead of the editor sharing with the project each time a breakpoint is toggled, an
editor will clone the project's breakpoints if a project is passed into the editor's
new function
2024-07-27 15:44:46 -04:00
Anthony Eid
4373e479f7 Get Editor & Project to share breakpoints through RWLock
Previously the editor would send breakpoints to the project whenever
a breakpoint was deleted or added. Now the editor shares a
pointer to a breakpoint data structure. This behavior
is more efficient because it doesn't have to repeatedly send
breakpoints every time one is updated.

Still have to handle cases when a breakpoint is added/removed during
a debugging phase. I also have to figure out how to share the breakpoints
pointer only once per project and editor.
2024-07-27 14:05:11 -04:00
Remco Smits
7f8c28877f Wip start debugging request 2024-07-27 10:46:55 +02:00
Remco Smits
1ff23477de Clean up run in terminal code 2024-07-27 10:45:31 +02:00
Remco Smits
d28950c633 Line up toggle debug panel focus with other panels action names 2024-07-26 23:29:08 +02:00
Remco Smits
6ff5e00740 Fix don't go to stack frame if only the thread id of the stopped event matches
This makes sure that if you have multiple debug adapters running, we don't match a thread id that belongs to a different client
2024-07-25 22:21:27 +02:00
Remco Smits
b70acdfa4a Wip run in terminal request 2024-07-25 21:30:49 +02:00
Remco Smits
403ae10087 Fix infinite loop of threads, because sequence id was wrong
This was added to make sure we had the same sequence id as the Python debugger, but it ended up breaking the xdebug (PHP) debug adapter
2024-07-25 21:22:51 +02:00
Anthony Eid
9a8a54109e Fix race condition in debugger (#6)
* Add support for DAP to use std for communication

* Add more descriptive error logs for DAP

* Implement handler for continued event

* Add PR feedback to handle_continued_event function

* Updated debugger

* Fix race condition when using late case debug adapters

The main thing this commit does is fix race conditions between a
dap client's event handler and sending a launch/attach request.
Some adapters would only respond to a starting request after the
client handled an init event, which could never happen because
the client used to start its handler after it sent a launch request.

This commit also ignores undefined errors instead of crashing. This
allows the client to work with adapters that send custom events.

Finally, I added some more descriptive error messages and changed the
client's starting request seq from 0 to 1 to be more in line with
the specs.

* Get clippy to run successfully

* Add some function docs to dap client

* Fix debugger's highlight when stepping through code

Merging with the main zed branch removed a function that
the debugger panel relied on. I added the function back
and changed it to work with the updated internals of zed.

* Get clippy to pass & add an error log instead of using dbg!()

* Get sequence id to be incremented after getting a response or event

---------

Co-authored-by: Remco Smits <djsmits12@gmail.com>
2024-07-25 20:24:19 +02:00
Anthony Eid
f0a5775204 Fix toggle breakpoints having a max of one bp per buffer
Breakpoints are now stored in a BTreeMap<buffer_id, HashSet<Breakpoint>>
I did this because we need to constantly check if a breakpoint exists in
a buffer whenever we toggle one.
2024-07-25 14:23:33 -04:00
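The storage scheme this commit names — a `BTreeMap<buffer_id, HashSet<Breakpoint>>` with cheap per-buffer membership checks — might look roughly like this sketch, with `Breakpoint` reduced to an illustrative stand-in:

```rust
use std::collections::{BTreeMap, HashSet};

// Simplified stand-in for a breakpoint; Zed's real type holds more state.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub struct Breakpoint {
    pub row: u32,
}

// Breakpoints keyed by buffer id, matching the commit's description.
pub type BreakpointStore = BTreeMap<u64, HashSet<Breakpoint>>;

/// Insert the breakpoint if absent, remove it if present.
/// Returns true if the breakpoint is set after the call.
pub fn toggle(store: &mut BreakpointStore, buffer_id: u64, bp: Breakpoint) -> bool {
    let set = store.entry(buffer_id).or_default();
    if set.remove(&bp) {
        false
    } else {
        set.insert(bp);
        true
    }
}
```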
Anthony Eid
916150a8e0 Get breakpoints to use anchors instead
Warning: Project is not being sent breakpoints right now
2024-07-25 14:23:33 -04:00
Anthony Eid
61533d737c Merge branch 'remco-debugger' into debugger 2024-07-25 14:22:44 -04:00
Remco Smits
11c740b47a Merge branch 'main' into debugger 2024-07-25 20:03:06 +02:00
Anthony Eid
9c2b909df5 Merge branch 'remco-debugger' into debugger 2024-07-25 13:47:48 -04:00
Remco Smits
8b05b88ada Merge branch 'main' into debugger 2024-07-25 18:59:05 +02:00
Anthony Eid
0975ca844c Get sequence id to be incremented after getting a response or event 2024-07-25 01:13:53 -04:00
Anthony Eid
a136b7279a Merge branch 'main' into debugger 2024-07-25 01:07:51 -04:00
Anthony Eid
7f1bd3b1d9 Get clippy to pass & add an error log instead of using dbg!() 2024-07-25 00:20:29 -04:00
Anthony Eid
c7f7d18681 Fix debugger's highlight when stepping through code
Merging with the main zed branch removed a function that
the debugger panel relied on. I added the function back
and changed it to work with the updated internals of zed.
2024-07-25 00:06:51 -04:00
Anthony Eid
96d3f13937 Merge branch 'remco-debugger' into debugger
Adds the ability to ignore custom events. Also,
some more descriptive error handling and documentation.
2024-07-24 23:54:22 -04:00
Remco Smits
1a421db92d Make clippy happier than ever 2024-07-24 13:05:14 +02:00
Remco Smits
9c60884771 Merge branch 'main' into debugger 2024-07-24 12:53:44 +02:00
Remco Smits
ee3323d12a Make clippy happy again 2024-07-24 12:14:18 +02:00
Remco Smits
b88fd3e0c5 Make sure we read events as early as possible
We have to read them as early as possible to make sure we support more debug adapters. Some of them wait until you have handled the initialized event. That is also why I moved the launch/attach request to be directly after the initialize request. For example, the xdebug adapter sends the initialized event only when you send the launch request, so before, if we sent the breakpoints and configurationDone requests, the adapter would stop and all channels were closed.
2024-07-24 11:51:03 +02:00
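The ordering constraint described above can be sketched as a message sequence. Only the DAP message names are real; the driver function is an illustrative assumption, not Zed's implementation:

```rust
// Sketch of the startup ordering the commit describes: send
// `launch`/`attach` directly after `initialize`, and only send
// `setBreakpoints`/`configurationDone` once the adapter has emitted
// its `initialized` event (xdebug, for example, emits it only after
// receiving `launch`).
pub fn startup_sequence(initialized_after_launch: bool) -> Vec<&'static str> {
    let mut msgs = vec!["initialize", "launch"];
    if initialized_after_launch {
        // The adapter only now sends `initialized`, unblocking configuration.
        msgs.push("event:initialized");
    }
    msgs.extend(["setBreakpoints", "configurationDone"]);
    msgs
}
```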
Remco Smits
2e10853b34 Fix sending incorrect start request id 2024-07-23 14:45:04 +02:00
Anthony Eid
780bfafedf Add some function docs to dap client 2024-07-23 02:56:20 -04:00
Anthony Eid
0b8c4ded9e Get clippy to run successfully 2024-07-23 02:12:09 -04:00
Anthony Eid
15d5186399 Fix race condition when using late case debug adapters
The main thing this commit does is fix race conditions between a
dap client's event handler and sending a launch/attach request.
Some adapters would only respond to a starting request after the
client handled an init event, which could never happen because
the client used to start its handler after it sent a launch request.

This commit also ignores undefined errors instead of crashing. This
allows the client to work with adapters that send custom events.

Finally, I added some more descriptive error messages and changed the
client's starting request seq from 0 to 1 to be more in line with
the specs.
2024-07-23 01:56:54 -04:00
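The seq change mentioned above (starting at 1 rather than 0, which matches the DAP spec's `ProtocolMessage.seq`) can be sketched as a simple counter; the type itself is an illustrative assumption:

```rust
// Illustrative DAP sequence-number counter: the spec's `seq` field
// starts at 1, and each outgoing message takes the next value.
pub struct SequenceCounter {
    next: u64,
}

impl SequenceCounter {
    pub fn new() -> Self {
        // Start at 1, per the DAP spec, rather than 0.
        Self { next: 1 }
    }

    /// Take the current sequence number and advance.
    pub fn next_seq(&mut self) -> u64 {
        let seq = self.next;
        self.next += 1;
        seq
    }
}
```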
Anthony Eid
9bc1d4a8ae Merge branch 'remco-debugger' into debugger 2024-07-22 22:18:45 -04:00
Anthony Eid
7c9771b8d7 Updated debugger 2024-07-18 14:20:01 -04:00
Remco Smits
d08e28f4e0 Clean up show continue/pause buttons 2024-07-17 15:43:16 +02:00
Remco Smits
ef67321ff2 Implement restart program 2024-07-17 14:41:22 +02:00
Remco Smits
4c777ad140 Fix deserialize error when debug adapter sends an empty {} body but type () was expected 2024-07-17 14:09:56 +02:00
Remco Smits
923ae5473a Fix that we change the thread state status to running
Not all the debug adapters send the continue event, so we couldn't determine that the thread state status was `running`. So we have to determine it ourselves: if the thread state status did not change, we set it to `running`.

This allows us to show the pause button when the thread state status is running, so you can pause the programs execution.
2024-07-17 10:21:39 +02:00
Remco Smits
a48166e5a0 Add method to get the capabilities from the client 2024-07-16 23:07:43 +02:00
Remco Smits
ef098c028b Show/disable actions buttons based on thread state status 2024-07-16 23:07:29 +02:00
Remco Smits
7ce1b8dc76 Update the thread state status after continue request if continue event was not sent 2024-07-16 23:06:39 +02:00
Remco Smits
e350417a33 Add debug pause icon 2024-07-16 22:55:37 +02:00
Remco Smits
ea9e0755df Implement stop action 2024-07-16 21:51:46 +02:00
Remco Smits
6aced1b3aa Add tooltip for tab content to indicate the thread status 2024-07-16 21:29:47 +02:00
Remco Smits
99e01fc608 Clippyyyyy 2024-07-16 21:28:52 +02:00
Remco Smits
fe899c9164 Change that current thread state does not return option 2024-07-16 21:28:36 +02:00
Remco Smits
f8b9937e51 Handle exited event 2024-07-16 21:19:20 +02:00
Remco Smits
dc5928374e Fix we did not notify after stack frame list reset was done
This fixes an issue where, if you step-debug the same thread, the stack frame list was not up to date.
2024-07-16 21:18:41 +02:00
Remco Smits
0deb3cc606 Move handle initialized event to own method 2024-07-16 21:17:05 +02:00
Remco Smits
ba4a70d7ae Make TCP connect delay configurable 2024-07-16 20:56:24 +02:00
Remco Smits
65a790e4ca Make double click on pane action optional 2024-07-16 20:55:55 +02:00
Anthony Eid
18cb54280a Merge branch 'remco-debugger' into debugger 2024-07-16 13:07:49 -04:00
Remco Smits
b6e677eb06 Wip configure host for tcp debug adapter
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-07-15 18:44:21 +02:00
Anthony Eid
ffa0609f8d Implement stdio debugger connection type (#5)
* Add support for DAP to use std for communication

* Add more descriptive error logs for DAP

* Implement handler for continued event

* Add PR feedback to handle_continued_event function
2024-07-15 18:12:05 +02:00
Remco Smits
77a314350f Fix autocompletion for connection type
before: `s_t_d_i_o`, after: `stdio`

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-07-15 18:03:44 +02:00
Anthony Eid
8d7ec33183 Add PR feedback to handle_continued_event function 2024-07-15 11:43:10 -04:00
Anthony Eid
73ef771875 Implement handler for continued event 2024-07-14 22:36:54 -04:00
Anthony Eid
3e1aa65b20 Add more descriptive error logs for DAP 2024-07-14 19:16:08 -04:00
Anthony Eid
92b93850bb Add support for DAP to use std for communication 2024-07-14 15:35:37 -04:00
Remco Smits
737b03c928 Refactor go_to_stack_frame & clear_hightlights 2024-07-14 16:44:47 +02:00
Remco Smits
1b42dd5865 Add client id to thread tab title 2024-07-14 15:48:43 +02:00
Remco Smits
c9074b1c25 Wire through debugger id for initialize request 2024-07-14 15:19:05 +02:00
Remco Smits
00379280f3 Don't store request type on client(its already in the config) 2024-07-14 15:18:37 +02:00
Remco Smits
4e2d0351cc Rename thread state -> thread states 2024-07-14 15:18:07 +02:00
Remco Smits
a583efd9b9 Give clippy the best day of his life 2024-07-12 22:34:42 +02:00
Remco Smits
6237c29a42 Merge branch 'main' into debugger 2024-07-12 22:21:46 +02:00
Remco Smits
9ea9b41e73 Merge main into debugger
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-07-12 22:18:39 +02:00
Remco Smits
8d99f9b7d2 Fix send all breakpoints when toggling breakpoints
We now send all the breakpoints for the current buffer when toggling one breakpoint.

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-07-12 22:00:58 +02:00
Remco Smits
014ffbce2e Send the initial breakpoints from the project
We now send the initial breakpoints that we created before the debug adapter was running.

Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-07-12 21:11:38 +02:00
Remco Smits
68dd3c90c2 Fix we never created the first pane OOPS
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-07-12 21:10:39 +02:00
Remco Smits
4a6f6151f0 Remove editor highlights when debugger terminated event was received 2024-07-11 20:59:02 +02:00
Remco Smits
f108d4c705 Replace find method with any
Because we don't use the result of find, any is better in this case
2024-07-11 20:54:41 +02:00
Remco Smits
5ab95f1e1a Remove unused method 2024-07-11 20:51:56 +02:00
Remco Smits
49da08ffa4 Make goto stack trace frame work again after stopped event 2024-07-11 20:44:55 +02:00
Remco Smits
f287c897a4 Remove commented out code 2024-07-11 19:13:48 +02:00
Remco Smits
d15ff2d06f Don't spawn task for empty tasks vec 2024-07-11 19:13:26 +02:00
Remco Smits
648daa3237 WIP start adding support for debugging multiple threads at the same time
Start showing all the threads as tabs so you can debug multiple threads at the same time.
2024-07-10 22:31:49 +02:00
Remco Smits
33e127de09 Merge pull request #4 from Anthony-Eid/debugger
Clean up debugger Inventory code
2024-07-09 16:25:14 +02:00
Anthony Eid
5a9b279039 Merge with remco-debugger branch 2024-07-08 19:55:11 -04:00
Remco Smits
cce58570dc Remove unneeded clone 2024-07-07 20:52:40 +02:00
Remco Smits
361bbec3a0 Make some more room to show stack trace 2024-07-07 20:37:33 +02:00
Remco Smits
a87409813c Fix that thread is not always known in stopped event
This fixes an issue where not all debuggers send the `thread` event, while we assumed that we always know the thread inside the handling code of the `stopped` event.
2024-07-07 20:24:33 +02:00
Remco Smits
da84aa1ac2 Fix panic with line - 1 and column - 1 2024-07-07 20:21:38 +02:00
Remco Smits
817760688a Don't support removing current thread_id 2024-07-07 17:32:29 +02:00
Remco Smits
13e56010c1 Add status types for thread state so we can disable buttons on this status 2024-07-07 17:27:40 +02:00
Remco Smits
b827a35e44 Make clippy's day again with using async mutex 2024-07-05 19:26:12 +02:00
Remco Smits
9006e8fdff Make clippy's day 2024-07-05 19:19:00 +02:00
Remco Smits
ce8ec033f4 Fix clippy 2024-07-05 19:16:38 +02:00
Remco Smits
9678cc9bc3 Remove editor highlights 2024-07-05 19:15:16 +02:00
Remco Smits
e87c4ddadc Merge branch 'main' into debugger 2024-07-05 17:02:58 +02:00
Remco Smits
3f8581a2fb Merge branch 'main' into debugger 2024-07-05 16:54:35 +02:00
Remco Smits
00c5b83384 Update handle terminated event to restart debugger 2024-07-05 16:50:43 +02:00
Remco Smits
4aedc1cd0b Save request type on client 2024-07-05 16:44:27 +02:00
Remco Smits
d238675c1a Wip support for determining whether to use launch or attach request 2024-07-05 16:31:07 +02:00
Remco Smits
dcf6f6ca30 Implement handle terminated event 2024-07-05 15:50:11 +02:00
Remco Smits
f4606bd951 Don't go to stack frame if event is not meant for current client 2024-07-05 15:11:26 +02:00
Remco Smits
12bef0830a Allow clicking on stack frames 2024-07-05 14:25:01 +02:00
Remco Smits
953a2b376c Make sure we don't resume other threads with action for one thread 2024-07-05 00:08:54 +02:00
Remco Smits
ac3b9f7a4c Wip show current file & column 2024-07-04 23:57:30 +02:00
Remco Smits
ef5990d427 Move ToggleBreakpoint action to editor instead of workspace 2024-07-04 14:52:50 +02:00
Remco Smits
23a81d5d70 Send all stack frame, scope, and variable requests in parallel
This will improve performance for loading large stack frames.
2024-07-04 13:58:12 +02:00
Remco Smits
515122c54d Clean up debug panel code
For now we only support debugger to be positioned on the bottom.
2024-07-03 23:20:39 +02:00
Remco Smits
f4eacca987 Fix 2 TODO's for multiple client support 2024-07-03 15:48:47 +02:00
Remco Smits
003fb7c81e Move thread state to the client
This allows us to show the current line and is closer for multi client/adapter support
2024-07-03 14:53:22 +02:00
Remco Smits
7d2f63ebbd Split event handling code to separate functions 2024-07-03 12:49:49 +02:00
Remco Smits
93e0bbb833 Use parking lot mutex instead of sync mutex 2024-07-03 10:40:02 +02:00
Remco Smits
8c5f6a0be7 Make clippppy more happy 2024-07-03 10:12:07 +02:00
Remco Smits
d6cafb8315 Fix typo 2024-07-03 09:45:54 +02:00
Remco Smits
a24b76b30b Merge branch 'main' into debugger 2024-07-03 09:38:56 +02:00
Remco Smits
11d74ea4ec Add tooltip for stack trace debug items
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-07-02 19:00:35 +02:00
Remco Smits
93f4775cf6 Properly fix race condition of receive events and handling responses
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-07-02 19:00:00 +02:00
Remco Smits
a93913b9a3 Swap handle events with handle request
This is a temporary fix so we don't crash when a race condition occurs and the event is sent before the request has finished. The debug adapter was still `starting`, meaning the task was still pending.
2024-07-02 16:51:30 +02:00
Remco Smits
08afbc6b58 Update capabilities 2024-07-02 15:09:44 +02:00
Remco Smits
b869465f00 Fix send configuration done request when initialized event was received.
This fixes an issue where the debug adapter was notified too early, so it had incorrect breakpoint information and would always stop on the first line without giving a stack trace
2024-07-02 15:09:26 +02:00
Remco Smits
2debea8115 Wire through transport type from config 2024-07-02 15:06:09 +02:00
Remco Smits
cae295ff65 Fix send correct line to set breakpoint 2024-07-02 15:02:59 +02:00
Remco Smits
c51206e980 Wire through launch config values to launch request 2024-07-02 10:59:02 +02:00
Remco Smits
1baa5aea94 Fix missing match arg for Initialized event 2024-07-02 10:57:49 +02:00
Remco Smits
3e022a5565 Change error message when non-success response came back 2024-07-02 10:46:52 +02:00
Remco Smits
3dd769be94 Fix panic denormalizing Terminated event without body 2024-07-02 10:46:12 +02:00
Remco Smits
7936a4bee3 Fix panic when denormalizing Initialized event with body 2024-07-02 10:45:41 +02:00
Remco Smits
09aabe481c Undo wrong request value 2024-07-01 16:09:06 +02:00
Remco Smits
2ea1e4fa85 Add debug panel actions 2024-07-01 12:45:08 +02:00
Remco Smits
8699dad0e3 Add restart & pause method & log request errors 2024-07-01 12:43:57 +02:00
Remco Smits
47a5f0c620 Start wiring through custom launch data 2024-06-29 16:47:51 +02:00
Remco Smits
854ff68bac Clean up transport code 2024-06-29 16:38:43 +02:00
Remco Smits
01d384e676 Add missing event body mapping 2024-06-29 16:36:32 +02:00
Anthony Eid
ddd893a795 Clean up debugger inventory files/code 2024-06-28 00:52:15 -04:00
Remco Smits
3d7cd5dac7 Remove todo and change casing 2024-06-27 22:00:01 +02:00
Remco Smits
a6fdfb5191 Merge pull request #3 from Anthony-Eid/debugger
Debugger Tasks
2024-06-27 20:35:10 +02:00
Anthony Eid
73a68d560f Change abs_path to path in project update worktree settings 2024-06-27 14:31:50 -04:00
Anthony Eid
b4eeb25f55 Merge branch 'remco-debugger' into debugger 2024-06-27 14:15:03 -04:00
Remco Smits
fc991ab273 Merge branch 'main' into debugger 2024-06-27 20:12:43 +02:00
Remco Smits
ab58d14559 Wip display scopes & variables 2024-06-27 20:05:02 +02:00
Anthony Eid
3a0b311378 Get start debugger task to run DAP on confirm 2024-06-27 13:23:31 -04:00
Anthony Eid
b81065fe63 Get debug tasks to show in start debugger menu 2024-06-27 13:23:31 -04:00
Anthony Eid
a67f28dba2 Get Start Debugger action to open task menu 2024-06-27 13:23:31 -04:00
Remco Smits
99b2472e83 Merge pull request #2 from d1y/debugger
debug add icon
2024-06-27 18:40:37 +02:00
Remco Smits
be45d5aa73 Merge branch 'debugger' into debugger 2024-06-27 18:24:00 +02:00
Remco Smits
0508df9e7b Remove commit hash, use repo URL only 2024-06-26 18:20:18 +02:00
Remco Smits
153efab377 Migrate to dap-types crate 2024-06-26 17:58:48 +02:00
Remco Smits
8015fb70e3 Remove unused mut self 2024-06-26 11:44:43 +02:00
Remco Smits
79d23aa4fe Start adding support for multiple threads + beginning of stack frame UI 2024-06-25 21:38:01 +02:00
Remco Smits
d5dae425fc Allow stepping to next line 2024-06-25 21:35:47 +02:00
d1y
d9e09c4a66 add debug icon 2024-06-26 03:31:19 +08:00
Remco Smits
331625e876 Change hardcoded set breakpoint to TODO 2024-06-25 21:07:01 +02:00
Remco Smits
61949fb348 Add enum for SteppingGranularity 2024-06-25 21:06:33 +02:00
Remco Smits
c7f4e09496 Merge pull request #1 from Anthony-Eid/debugger
Set up ground work for debugger config files
2024-06-25 09:27:58 +02:00
Anthony Eid
5442e116ce Get debugger inventory to add configs from .zed/debug.json 2024-06-25 02:45:46 -04:00
Anthony Eid
11a4fc8b02 Start work on debugger inventory
The debugger config inventory is used to track config files for the debugger. Currently, only .zed/debug.json is set up to work, but it's still untested.
2024-06-25 02:13:52 -04:00
Remco Smits
d303ebd46e Wip add select launch config modal
Co-Authored-By: Anthony Eid <56899983+Anthony-Eid@users.noreply.github.com>
2024-06-24 21:00:09 +02:00
Remco Smits
e1de8dc50e Remove unused code 2024-06-24 18:47:17 +02:00
Remco Smits
5fe110c1dd Rename play to continue debug 2024-06-24 00:07:43 +02:00
Remco Smits
89b203d03a Move current thread to debug panel instead of client 2024-06-23 23:52:10 +02:00
Remco Smits
0f4f8abbaa Add start debugger command 2024-06-23 23:51:53 +02:00
Remco Smits
0d97e9e579 Send debug adapter event through project and listen in debug panel 2024-06-23 14:27:07 +02:00
Remco Smits
14b913fb4b Make request return result of struct instead of Value 2024-06-23 11:21:15 +02:00
Remco Smits
9f1cd2bdb5 Merge branch 'main' into debugger 2024-06-23 10:52:52 +02:00
Remco Smits
547c40e332 change duration 2024-06-23 10:42:34 +02:00
Remco Smits
9cff6d5aa5 Wire through project path
Co-Authored-By: Bennet Bo Fenner <53836821+bennetbo@users.noreply.github.com>
2024-06-23 01:53:41 +02:00
Remco Smits
7e438bc1f3 Wip breakpoints
Co-Authored-By: Bennet Bo Fenner <53836821+bennetbo@users.noreply.github.com>
2024-06-23 00:36:31 +02:00
Remco Smits
944a52ce91 Wip breakpoints 2024-06-22 23:47:18 +02:00
Co-Authored-By: Bennet Bo Fenner <53836821+bennetbo@users.noreply.github.com>
2024-06-22 23:47:18 +02:00
Remco Smits
95a814ed41 Wip breakpoints
Co-Authored-By: Bennet Bo Fenner <53836821+bennetbo@users.noreply.github.com>
2024-06-22 22:55:39 +02:00
Remco Smits
6b9295b6c4 wip write through buttons
Co-Authored-By: Bennet Bo Fenner <53836821+bennetbo@users.noreply.github.com>
2024-06-22 21:17:43 +02:00
Remco Smits
1128fce61a make thread id public 2024-06-22 20:22:32 +02:00
Remco Smits
7c355fdb0f clean up 2024-06-22 20:22:23 +02:00
Remco Smits
0e2a0b9edc Remove deps 2024-06-22 20:22:12 +02:00
Remco Smits
c130f9c2f2 Wip 2024-06-10 21:41:36 +02:00
Remco Smits
08300c6e90 Wip debugger client 2024-06-10 20:38:00 +02:00
Remco Smits
7b71119094 Move to imports 2024-06-09 13:01:46 +02:00
Remco Smits
c18db76862 Merge main 2024-06-09 12:27:47 +02:00
Remco Smits
c0dd152509 Merge branch 'main' into debugger 2024-06-09 12:27:39 +02:00
Remco Smits
f402a4e5ce WIP 2024-04-10 15:57:29 +02:00
152 changed files with 26234 additions and 422 deletions

19
.zed/debug.json Normal file
View File

@@ -0,0 +1,19 @@
[
{
"label": "Debug Zed with LLDB",
"adapter": "lldb",
"program": "$ZED_WORKTREE_ROOT/target/debug/zed",
"request": "launch",
"cwd": "$ZED_WORKTREE_ROOT"
},
{
"label": "Debug Zed with GDB",
"adapter": "gdb",
"program": "$ZED_WORKTREE_ROOT/target/debug/zed",
"request": "launch",
"cwd": "$ZED_WORKTREE_ROOT",
"initialize_args": {
"stopAtBeginningOfMainSubprogram": true
}
}
]

124
Cargo.lock generated
View File

@@ -13,7 +13,6 @@ dependencies = [
"futures 0.3.31",
"gpui",
"language",
"lsp",
"project",
"smallvec",
"ui",
@@ -2770,9 +2769,12 @@ dependencies = [
"clock",
"collab_ui",
"collections",
"command_palette_hooks",
"context_server",
"ctor",
"dap",
"dashmap 6.1.0",
"debugger_ui",
"derive_more",
"editor",
"env_logger 0.11.6",
@@ -3690,6 +3692,67 @@ dependencies = [
"syn 2.0.90",
]
[[package]]
name = "dap"
version = "0.1.0"
dependencies = [
"anyhow",
"async-compression",
"async-pipe",
"async-tar",
"async-trait",
"client",
"collections",
"dap-types",
"env_logger 0.11.6",
"fs",
"futures 0.3.31",
"gpui",
"http_client",
"language",
"log",
"node_runtime",
"parking_lot",
"paths",
"schemars",
"serde",
"serde_json",
"settings",
"smallvec",
"smol",
"sysinfo",
"task",
"util",
]
[[package]]
name = "dap-types"
version = "0.0.1"
source = "git+https://github.com/zed-industries/dap-types?rev=bf5632dc19f806e8a435c9f04a4bfe7322badea2#bf5632dc19f806e8a435c9f04a4bfe7322badea2"
dependencies = [
"schemars",
"serde",
"serde_json",
]
[[package]]
name = "dap_adapters"
version = "0.1.0"
dependencies = [
"anyhow",
"async-trait",
"dap",
"gpui",
"language",
"paths",
"regex",
"serde",
"serde_json",
"sysinfo",
"task",
"util",
]
[[package]]
name = "dashmap"
version = "5.5.3"
@@ -3763,6 +3826,57 @@ dependencies = [
"winapi",
]
[[package]]
name = "debugger_tools"
version = "0.1.0"
dependencies = [
"anyhow",
"dap",
"editor",
"futures 0.3.31",
"gpui",
"project",
"serde_json",
"settings",
"smol",
"util",
"workspace",
]
[[package]]
name = "debugger_ui"
version = "0.1.0"
dependencies = [
"anyhow",
"client",
"collections",
"command_palette_hooks",
"dap",
"editor",
"env_logger 0.11.6",
"fuzzy",
"gpui",
"language",
"menu",
"picker",
"pretty_assertions",
"project",
"rpc",
"serde",
"serde_json",
"settings",
"sum_tree",
"sysinfo",
"task",
"tasks_ui",
"terminal_view",
"theme",
"ui",
"unindent",
"util",
"workspace",
]
[[package]]
name = "deepseek"
version = "0.1.0"
@@ -4064,6 +4178,7 @@ dependencies = [
"log",
"lsp",
"markdown",
"menu",
"multi_buffer",
"ordered-float 2.10.1",
"parking_lot",
@@ -10112,6 +10227,8 @@ dependencies = [
"client",
"clock",
"collections",
"dap",
"dap_adapters",
"env_logger 0.11.6",
"fancy-regex 0.14.0",
"fs",
@@ -10123,6 +10240,7 @@ dependencies = [
"gpui",
"http_client",
"image",
"indexmap",
"itertools 0.14.0",
"language",
"log",
@@ -10760,6 +10878,7 @@ version = "0.1.0"
dependencies = [
"anyhow",
"auto_update",
"dap",
"editor",
"extension_host",
"file_finder",
@@ -13282,6 +13401,7 @@ dependencies = [
"parking_lot",
"schemars",
"serde",
"serde_json",
"serde_json_lenient",
"sha2",
"shellexpand 2.1.2",
@@ -16640,6 +16760,8 @@ dependencies = [
"component_preview",
"copilot",
"db",
"debugger_tools",
"debugger_ui",
"diagnostics",
"editor",
"env_logger 0.11.6",

View File

@@ -31,6 +31,10 @@ members = [
"crates/context_server",
"crates/context_server_settings",
"crates/copilot",
"crates/dap",
"crates/dap_adapters",
"crates/debugger_tools",
"crates/debugger_ui",
"crates/db",
"crates/deepseek",
"crates/diagnostics",
@@ -233,7 +237,11 @@ component_preview = { path = "crates/component_preview" }
context_server = { path = "crates/context_server" }
context_server_settings = { path = "crates/context_server_settings" }
copilot = { path = "crates/copilot" }
dap = { path = "crates/dap" }
dap_adapters = { path = "crates/dap_adapters" }
db = { path = "crates/db" }
debugger_ui = { path = "crates/debugger_ui" }
debugger_tools = { path = "crates/debugger_tools" }
deepseek = { path = "crates/deepseek" }
diagnostics = { path = "crates/diagnostics" }
buffer_diff = { path = "crates/buffer_diff" }

1
assets/icons/debug.svg Normal file
View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-bug"><path d="m8 2 1.88 1.88"/><path d="M14.12 3.88 16 2"/><path d="M9 7.13v-1a3.003 3.003 0 1 1 6 0v1"/><path d="M12 20c-3.3 0-6-2.7-6-6v-3a4 4 0 0 1 4-4h4a4 4 0 0 1 4 4v3c0 3.3-2.7 6-6 6"/><path d="M12 20v-9"/><path d="M6.53 9C4.6 8.8 3 7.1 3 5"/><path d="M6 13H2"/><path d="M3 21c0-2.1 1.7-3.9 3.8-4"/><path d="M20.97 5c0 2.1-1.6 3.8-3.5 4"/><path d="M22 13h-4"/><path d="M17.2 17c2.1.1 3.8 1.9 3.8 4"/></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="currentColor" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-circle"><circle cx="12" cy="12" r="10"/></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-step-forward"><line x1="6" x2="6" y1="4" y2="20"/><polygon points="10,4 20,12 10,20"/></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-unplug"><path d="m19 5 3-3"/><path d="m2 22 3-3"/><path d="M6.3 20.3a2.4 2.4 0 0 0 3.4 0L12 18l-6-6-2.3 2.3a2.4 2.4 0 0 0 0 3.4Z"/><path d="M7.5 13.5 10 11"/><path d="M10.5 16.5 13 14"/><path d="m12 6 6 6 2.3-2.3a2.4 2.4 0 0 0 0-3.4l-2.6-2.6a2.4 2.4 0 0 0-3.4 0Z"/></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-circle-off"><path d="m2 2 20 20"/><path d="M8.35 2.69A10 10 0 0 1 21.3 15.65"/><path d="M19.08 19.08A10 10 0 1 1 4.92 4.92"/></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="currentColor" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-message-circle"><path d="M7.9 20A9 9 0 1 0 4 16.1L2 22Z"/></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-pause"><rect x="14" y="4" width="4" height="16" rx="1"/><rect x="6" y="4" width="4" height="16" rx="1"/></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-rotate-ccw"><path d="M3 12a9 9 0 1 0 9-9 9.75 9.75 0 0 0-6.74 2.74L3 8"/><path d="M3 3v5h5"/></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-undo-dot"><path d="M21 17a9 9 0 0 0-15-6.7L3 13"/><path d="M3 7v6h6"/><circle cx="12" cy="17" r="1"/></svg>


View File

@@ -0,0 +1,5 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-arrow-up-from-dot">
<path d="m5 15 7 7 7-7"/>
<path d="M12 8v14"/>
<circle cx="12" cy="3" r="1"/>
</svg>


View File

@@ -0,0 +1,5 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-arrow-up-from-dot">
<path d="m3 10 9-8 9 8"/>
<path d="M12 17V2"/>
<circle cx="12" cy="21" r="1"/>
</svg>


View File

@@ -0,0 +1,5 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-redo-dot">
<circle cx="12" cy="17" r="1"/>
<path d="M21 7v6h-6"/>
<path d="M3 17a9 9 0 0 1 9-9 9 9 0 0 1 6 2.3l3 2.7"/>
</svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-square"><rect width="18" height="18" x="3" y="3" rx="2"/></svg>


View File

@@ -716,6 +716,14 @@
"space": "project_panel::Open"
}
},
{
"context": "VariableList",
"use_key_equivalents": true,
"bindings": {
"left": "variable_list::CollapseSelectedEntry",
"right": "variable_list::ExpandSelectedEntry"
}
},
{
"context": "GitPanel && ChangesList",
"use_key_equivalents": true,

View File

@@ -1321,6 +1321,12 @@
// }
// ]
"ssh_connections": [],
// Configures context servers for use in the Assistant.
"context_servers": {}
"context_servers": {},
"debugger": {
"stepping_granularity": "line",
"save_breakpoints": true,
"button": true
}
}

View File

@@ -0,0 +1,32 @@
[
{
"label": "Debug active PHP file",
"adapter": "php",
"program": "$ZED_FILE",
"request": "launch",
"cwd": "$ZED_WORKTREE_ROOT"
},
{
"label": "Debug active Python file",
"adapter": "python",
"program": "$ZED_FILE",
"request": "launch",
"cwd": "$ZED_WORKTREE_ROOT"
},
{
"label": "Debug active JavaScript file",
"adapter": "javascript",
"program": "$ZED_FILE",
"request": "launch",
"cwd": "$ZED_WORKTREE_ROOT"
},
{
"label": "JavaScript debug terminal",
"adapter": "javascript",
"request": "launch",
"cwd": "$ZED_WORKTREE_ROOT",
"initialize_args": {
"console": "integratedTerminal"
}
}
]
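The `$ZED_FILE` and `$ZED_WORKTREE_ROOT` placeholders in the templates above are task variables expanded at launch time. A minimal sketch of that substitution, assuming a simple replace-based scheme (Zed's real expansion lives in its `task` crate and handles more cases):

```rust
use std::collections::HashMap;

// Assumption-laden sketch of `$ZED_*` task-variable expansion for a
// debug config field; not the actual implementation.
pub fn expand(template: &str, vars: &HashMap<&str, &str>) -> String {
    // Replace longer variable names first so `$ZED_FILE` never clobbers
    // a hypothetical longer name like `$ZED_FILE_DIR`.
    let mut keys: Vec<&&str> = vars.keys().collect();
    keys.sort_by_key(|k| std::cmp::Reverse(k.len()));
    let mut out = template.to_string();
    for key in keys {
        out = out.replace(*key, vars[*key]);
    }
    out
}
```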

View File

@@ -20,7 +20,6 @@ extension_host.workspace = true
futures.workspace = true
gpui.workspace = true
language.workspace = true
lsp.workspace = true
project.workspace = true
smallvec.workspace = true
ui.workspace = true

View File

@@ -7,8 +7,7 @@ use gpui::{
EventEmitter, InteractiveElement as _, ParentElement as _, Render, SharedString,
StatefulInteractiveElement, Styled, Transformation, Window,
};
use language::{LanguageRegistry, LanguageServerBinaryStatus, LanguageServerId};
use lsp::LanguageServerName;
use language::{BinaryStatus, LanguageRegistry, LanguageServerId};
use project::{EnvironmentErrorMessage, LanguageServerProgress, Project, WorktreeId};
use smallvec::SmallVec;
use std::{cmp::Reverse, fmt::Write, sync::Arc, time::Duration};
@@ -20,21 +19,21 @@ actions!(activity_indicator, [ShowErrorMessage]);
pub enum Event {
ShowError {
lsp_name: LanguageServerName,
server_name: SharedString,
error: String,
},
}
pub struct ActivityIndicator {
statuses: Vec<LspStatus>,
statuses: Vec<ServerStatus>,
project: Entity<Project>,
auto_updater: Option<Entity<AutoUpdater>>,
context_menu_handle: PopoverMenuHandle<ContextMenu>,
}
struct LspStatus {
name: LanguageServerName,
status: LanguageServerBinaryStatus,
struct ServerStatus {
name: SharedString,
status: BinaryStatus,
}
struct PendingWork<'a> {
@@ -65,7 +64,20 @@ impl ActivityIndicator {
while let Some((name, status)) = status_events.next().await {
this.update(&mut cx, |this: &mut ActivityIndicator, cx| {
this.statuses.retain(|s| s.name != name);
this.statuses.push(LspStatus { name, status });
this.statuses.push(ServerStatus { name, status });
cx.notify();
})?;
}
anyhow::Ok(())
})
.detach();
let mut status_events = languages.dap_server_binary_statuses();
cx.spawn(|this, mut cx| async move {
while let Some((name, status)) = status_events.next().await {
this.update(&mut cx, |this, cx| {
this.statuses.retain(|s| s.name != name);
this.statuses.push(ServerStatus { name, status });
cx.notify();
})?;
}
@@ -88,18 +100,18 @@ impl ActivityIndicator {
});
cx.subscribe_in(&this, window, move |_, _, event, window, cx| match event {
Event::ShowError { lsp_name, error } => {
Event::ShowError { server_name, error } => {
let create_buffer = project.update(cx, |project, cx| project.create_buffer(cx));
let project = project.clone();
let error = error.clone();
let lsp_name = lsp_name.clone();
let server_name = server_name.clone();
cx.spawn_in(window, |workspace, mut cx| async move {
let buffer = create_buffer.await?;
buffer.update(&mut cx, |buffer, cx| {
buffer.edit(
[(
0..0,
format!("Language server error: {}\n\n{}", lsp_name, error),
format!("Language server error: {}\n\n{}", server_name, error),
)],
None,
cx,
@@ -129,9 +141,9 @@ impl ActivityIndicator {
fn show_error_message(&mut self, _: &ShowErrorMessage, _: &mut Window, cx: &mut Context<Self>) {
self.statuses.retain(|status| {
if let LanguageServerBinaryStatus::Failed { error } = &status.status {
if let BinaryStatus::Failed { error } = &status.status {
cx.emit(Event::ShowError {
lsp_name: status.name.clone(),
server_name: status.name.clone(),
error: error.clone(),
});
false
@@ -260,12 +272,10 @@ impl ActivityIndicator {
let mut failed = SmallVec::<[_; 3]>::new();
for status in &self.statuses {
match status.status {
LanguageServerBinaryStatus::CheckingForUpdate => {
checking_for_update.push(status.name.clone())
}
LanguageServerBinaryStatus::Downloading => downloading.push(status.name.clone()),
LanguageServerBinaryStatus::Failed { .. } => failed.push(status.name.clone()),
LanguageServerBinaryStatus::None => {}
BinaryStatus::CheckingForUpdate => checking_for_update.push(status.name.clone()),
BinaryStatus::Downloading => downloading.push(status.name.clone()),
BinaryStatus::Failed { .. } => failed.push(status.name.clone()),
BinaryStatus::None => {}
}
}
@@ -278,7 +288,7 @@ impl ActivityIndicator {
),
message: format!(
"Downloading {}...",
downloading.iter().map(|name| name.0.as_ref()).fold(
downloading.iter().map(|name| name.as_ref()).fold(
String::new(),
|mut acc, s| {
if !acc.is_empty() {
@@ -306,7 +316,7 @@ impl ActivityIndicator {
),
message: format!(
"Checking for updates to {}...",
checking_for_update.iter().map(|name| name.0.as_ref()).fold(
checking_for_update.iter().map(|name| name.as_ref()).fold(
String::new(),
|mut acc, s| {
if !acc.is_empty() {
@@ -336,7 +346,7 @@ impl ActivityIndicator {
"Failed to run {}. Click to show error.",
failed
.iter()
.map(|name| name.0.as_ref())
.map(|name| name.as_ref())
.fold(String::new(), |mut acc, s| {
if !acc.is_empty() {
acc.push_str(", ");

View File

@@ -89,8 +89,11 @@ channel.workspace = true
client = { workspace = true, features = ["test-support"] }
collab_ui = { workspace = true, features = ["test-support"] }
collections = { workspace = true, features = ["test-support"] }
command_palette_hooks.workspace = true
context_server.workspace = true
ctor.workspace = true
dap = { workspace = true, features = ["test-support"] }
debugger_ui = { workspace = true, features = ["test-support"] }
editor = { workspace = true, features = ["test-support"] }
env_logger.workspace = true
extension.workspace = true

View File

@@ -469,3 +469,14 @@ CREATE TABLE IF NOT EXISTS processed_stripe_events (
);
CREATE INDEX "ix_processed_stripe_events_on_stripe_event_created_timestamp" ON processed_stripe_events (stripe_event_created_timestamp);
CREATE TABLE IF NOT EXISTS "breakpoints" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
"project_id" INTEGER NOT NULL REFERENCES projects (id) ON DELETE CASCADE,
"position" INTEGER NOT NULL,
"log_message" TEXT NULL,
"worktree_id" BIGINT NOT NULL,
"path" TEXT NOT NULL,
"kind" VARCHAR NOT NULL
);
CREATE INDEX "index_breakpoints_on_project_id" ON "breakpoints" ("project_id");

View File

@@ -0,0 +1,11 @@
CREATE TABLE IF NOT EXISTS "breakpoints" (
"id" SERIAL PRIMARY KEY,
"project_id" INTEGER NOT NULL REFERENCES projects (id) ON DELETE CASCADE,
"position" INTEGER NOT NULL,
"log_message" TEXT NULL,
"worktree_id" BIGINT NOT NULL,
"path" TEXT NOT NULL,
"kind" VARCHAR NOT NULL
);
CREATE INDEX "index_breakpoints_on_project_id" ON "breakpoints" ("project_id");

View File

@@ -659,6 +659,7 @@ pub struct RejoinedProject {
pub collaborators: Vec<ProjectCollaborator>,
pub worktrees: Vec<RejoinedWorktree>,
pub language_servers: Vec<proto::LanguageServer>,
pub breakpoints: HashMap<proto::ProjectPath, HashSet<proto::Breakpoint>>,
}
impl RejoinedProject {
@@ -681,6 +682,17 @@ impl RejoinedProject {
.map(|collaborator| collaborator.to_proto())
.collect(),
language_servers: self.language_servers.clone(),
breakpoints: self
.breakpoints
.iter()
.map(
|(project_path, breakpoints)| proto::SynchronizeBreakpoints {
project_id: self.id.to_proto(),
breakpoints: breakpoints.iter().cloned().collect(),
project_path: Some(project_path.clone()),
},
)
.collect(),
}
}
}
@@ -727,6 +739,7 @@ pub struct Project {
pub collaborators: Vec<ProjectCollaborator>,
pub worktrees: BTreeMap<u64, Worktree>,
pub language_servers: Vec<proto::LanguageServer>,
pub breakpoints: HashMap<proto::ProjectPath, HashSet<proto::Breakpoint>>,
}
pub struct ProjectCollaborator {

View File

@@ -94,6 +94,9 @@ id_type!(RoomParticipantId);
id_type!(ServerId);
id_type!(SignupId);
id_type!(UserId);
id_type!(DebugClientId);
id_type!(SessionId);
id_type!(ThreadId);
/// ChannelRole gives you permissions for both channels and calls.
#[derive(

View File

@@ -1,5 +1,5 @@
use anyhow::Context as _;
use collections::{HashMap, HashSet};
use util::ResultExt;
use super::*;
@@ -571,6 +571,60 @@ impl Database {
.await
}
pub async fn update_breakpoints(
&self,
connection_id: ConnectionId,
update: &proto::SynchronizeBreakpoints,
) -> Result<TransactionGuard<HashSet<ConnectionId>>> {
let project_id = ProjectId::from_proto(update.project_id);
self.project_transaction(project_id, |tx| async move {
let project_path = update
.project_path
.as_ref()
.ok_or_else(|| anyhow!("invalid project path"))?;
// Ensure the update comes from the host.
let project = project::Entity::find_by_id(project_id)
.one(&*tx)
.await?
.ok_or_else(|| anyhow!("no such project"))?;
// remove all existing breakpoints
breakpoints::Entity::delete_many()
.filter(breakpoints::Column::ProjectId.eq(project.id))
.exec(&*tx)
.await?;
if !update.breakpoints.is_empty() {
breakpoints::Entity::insert_many(update.breakpoints.iter().map(|breakpoint| {
breakpoints::ActiveModel {
id: ActiveValue::NotSet,
project_id: ActiveValue::Set(project_id),
worktree_id: ActiveValue::Set(project_path.worktree_id as i64),
path: ActiveValue::Set(project_path.path.clone()),
kind: match proto::BreakpointKind::from_i32(breakpoint.kind) {
Some(proto::BreakpointKind::Log) => {
ActiveValue::Set(breakpoints::BreakpointKind::Log)
}
Some(proto::BreakpointKind::Standard) => {
ActiveValue::Set(breakpoints::BreakpointKind::Standard)
}
None => ActiveValue::Set(breakpoints::BreakpointKind::Standard),
},
log_message: ActiveValue::Set(breakpoint.message.clone()),
position: ActiveValue::Set(breakpoint.cached_position as i32),
}
}))
.exec_without_returning(&*tx)
.await?;
}
self.internal_project_connection_ids(project_id, connection_id, true, &tx)
.await
})
.await
}
/// Updates the worktree settings for the given connection.
pub async fn update_worktree_settings(
&self,
@@ -852,6 +906,33 @@ impl Database {
}
}
let mut breakpoints: HashMap<proto::ProjectPath, HashSet<proto::Breakpoint>> =
HashMap::default();
let db_breakpoints = project.find_related(breakpoints::Entity).all(tx).await?;
for breakpoint in db_breakpoints.iter() {
let project_path = proto::ProjectPath {
worktree_id: breakpoint.worktree_id as u64,
path: breakpoint.path.clone(),
};
breakpoints
.entry(project_path)
.or_default()
.insert(proto::Breakpoint {
position: None,
cached_position: breakpoint.position as u32,
kind: match breakpoint.kind {
breakpoints::BreakpointKind::Standard => {
proto::BreakpointKind::Standard.into()
}
breakpoints::BreakpointKind::Log => proto::BreakpointKind::Log.into(),
},
message: breakpoint.log_message.clone(),
});
}
// Populate language servers.
let language_servers = project
.find_related(language_server::Entity)
@@ -879,6 +960,7 @@ impl Database {
worktree_id: None,
})
.collect(),
breakpoints,
};
Ok((project, replica_id as ReplicaId))
}
@@ -1106,41 +1188,52 @@ impl Database {
exclude_dev_server: bool,
) -> Result<TransactionGuard<HashSet<ConnectionId>>> {
self.project_transaction(project_id, |tx| async move {
let project = project::Entity::find_by_id(project_id)
.one(&*tx)
.await?
.ok_or_else(|| anyhow!("no such project"))?;
let mut collaborators = project_collaborator::Entity::find()
.filter(project_collaborator::Column::ProjectId.eq(project_id))
.stream(&*tx)
.await?;
let mut connection_ids = HashSet::default();
if let Some(host_connection) = project.host_connection().log_err() {
if !exclude_dev_server {
connection_ids.insert(host_connection);
}
}
while let Some(collaborator) = collaborators.next().await {
let collaborator = collaborator?;
connection_ids.insert(collaborator.connection());
}
if connection_ids.contains(&connection_id)
|| Some(connection_id) == project.host_connection().ok()
{
Ok(connection_ids)
} else {
Err(anyhow!(
"can only send project updates to a project you're in"
))?
}
self.internal_project_connection_ids(project_id, connection_id, exclude_dev_server, &tx)
.await
})
.await
}
async fn internal_project_connection_ids(
&self,
project_id: ProjectId,
connection_id: ConnectionId,
exclude_dev_server: bool,
tx: &DatabaseTransaction,
) -> Result<HashSet<ConnectionId>> {
let project = project::Entity::find_by_id(project_id)
.one(tx)
.await?
.ok_or_else(|| anyhow!("no such project"))?;
let mut collaborators = project_collaborator::Entity::find()
.filter(project_collaborator::Column::ProjectId.eq(project_id))
.stream(tx)
.await?;
let mut connection_ids = HashSet::default();
if let Some(host_connection) = project.host_connection().log_err() {
if !exclude_dev_server {
connection_ids.insert(host_connection);
}
}
while let Some(collaborator) = collaborators.next().await {
let collaborator = collaborator?;
connection_ids.insert(collaborator.connection());
}
if connection_ids.contains(&connection_id)
|| Some(connection_id) == project.host_connection().ok()
{
Ok(connection_ids)
} else {
Err(anyhow!(
"can only send project updates to a project you're in"
))?
}
}
async fn project_guest_connection_ids(
&self,
project_id: ProjectId,

View File

@@ -765,6 +765,33 @@ impl Database {
worktrees.push(worktree);
}
let mut breakpoints: HashMap<proto::ProjectPath, HashSet<proto::Breakpoint>> =
HashMap::default();
let db_breakpoints = project.find_related(breakpoints::Entity).all(tx).await?;
for breakpoint in db_breakpoints.iter() {
let project_path = proto::ProjectPath {
worktree_id: breakpoint.worktree_id as u64,
path: breakpoint.path.clone(),
};
breakpoints
.entry(project_path)
.or_default()
.insert(proto::Breakpoint {
position: None,
cached_position: breakpoint.position as u32,
kind: match breakpoint.kind {
breakpoints::BreakpointKind::Standard => {
proto::BreakpointKind::Standard.into()
}
breakpoints::BreakpointKind::Log => proto::BreakpointKind::Log.into(),
},
message: breakpoint.log_message.clone(),
});
}
let language_servers = project
.find_related(language_server::Entity)
.all(tx)
@@ -834,6 +861,7 @@ impl Database {
collaborators,
worktrees,
language_servers,
breakpoints,
}))
}
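The grouping in the hunk above can be sketched independently of SeaORM: flat rows from the `breakpoints` table, keyed by `(worktree_id, path)`, collapse into a map from project path to a set of breakpoints. Types here are simplified stand-ins for `proto::ProjectPath` and `proto::Breakpoint`:

```rust
use std::collections::{HashMap, HashSet};

// Simplified stand-in for proto::ProjectPath.
#[derive(Clone, Debug, PartialEq, Eq, Hash)]
pub struct ProjectPath {
    pub worktree_id: u64,
    pub path: String,
}

// Group flat DB rows (worktree_id, path, cached_position) into the
// per-path breakpoint sets sent to rejoining guests; positions stand in
// for full breakpoint records.
pub fn group_breakpoints(rows: &[(u64, &str, u32)]) -> HashMap<ProjectPath, HashSet<u32>> {
    let mut map: HashMap<ProjectPath, HashSet<u32>> = HashMap::new();
    for &(worktree_id, path, position) in rows {
        map.entry(ProjectPath {
            worktree_id,
            path: path.to_string(),
        })
        .or_default()
        .insert(position);
    }
    map
}
```

Using a `HashSet` per path mirrors the diff's `HashMap<proto::ProjectPath, HashSet<proto::Breakpoint>>` and deduplicates repeated rows for the same position.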

View File

@@ -2,6 +2,7 @@ pub mod access_token;
pub mod billing_customer;
pub mod billing_preference;
pub mod billing_subscription;
pub mod breakpoints;
pub mod buffer;
pub mod buffer_operation;
pub mod buffer_snapshot;

View File

@@ -0,0 +1,47 @@
use crate::db::ProjectId;
use sea_orm::entity::prelude::*;
#[derive(Clone, Debug, PartialEq, Eq, DeriveEntityModel)]
#[sea_orm(table_name = "breakpoints")]
pub struct Model {
#[sea_orm(primary_key)]
pub id: i32,
#[sea_orm(primary_key)]
pub project_id: ProjectId,
pub worktree_id: i64,
pub path: String,
pub kind: BreakpointKind,
pub log_message: Option<String>,
pub position: i32,
}
#[derive(
Copy, Clone, Debug, PartialEq, Eq, EnumIter, DeriveActiveEnum, Default, Hash, serde::Serialize,
)]
#[sea_orm(rs_type = "String", db_type = "String(StringLen::None)")]
#[serde(rename_all = "snake_case")]
pub enum BreakpointKind {
#[default]
#[sea_orm(string_value = "standard")]
Standard,
#[sea_orm(string_value = "log")]
Log,
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {
#[sea_orm(
belongs_to = "super::project::Entity",
from = "Column::ProjectId",
to = "super::project::Column::Id"
)]
Project,
}
impl Related<super::project::Entity> for Entity {
fn to() -> RelationDef {
Relation::Project.def()
}
}
impl ActiveModelBehavior for ActiveModel {}

View File

@@ -49,6 +49,8 @@ pub enum Relation {
Collaborators,
#[sea_orm(has_many = "super::language_server::Entity")]
LanguageServers,
#[sea_orm(has_many = "super::breakpoints::Entity")]
Breakpoints,
}
impl Related<super::user::Entity> for Entity {
@@ -81,4 +83,10 @@ impl Related<super::language_server::Entity> for Entity {
}
}
impl Related<super::breakpoints::Entity> for Entity {
fn to() -> RelationDef {
Relation::Breakpoints.def()
}
}
impl ActiveModelBehavior for ActiveModel {}

View File

@@ -423,7 +423,40 @@ impl Server {
app_state.config.openai_api_key.clone(),
)
}
});
})
.add_message_handler(update_breakpoints)
.add_message_handler(broadcast_project_message_from_host::<proto::SetActiveDebugLine>)
.add_message_handler(
broadcast_project_message_from_host::<proto::RemoveActiveDebugLine>,
)
.add_message_handler(broadcast_project_message_from_host::<proto::UpdateDebugAdapter>)
.add_message_handler(
broadcast_project_message_from_host::<proto::SetDebugClientCapabilities>,
)
.add_message_handler(broadcast_project_message_from_host::<proto::ShutdownDebugClient>)
.add_request_handler(forward_mutating_project_request::<proto::DapNextRequest>)
.add_request_handler(forward_mutating_project_request::<proto::DapStepInRequest>)
.add_request_handler(forward_mutating_project_request::<proto::DapStepOutRequest>)
.add_request_handler(forward_mutating_project_request::<proto::DapStepBackRequest>)
.add_request_handler(forward_mutating_project_request::<proto::DapContinueRequest>)
.add_request_handler(forward_mutating_project_request::<proto::DapPauseRequest>)
.add_request_handler(forward_mutating_project_request::<proto::DapDisconnectRequest>)
.add_request_handler(
forward_mutating_project_request::<proto::DapTerminateThreadsRequest>,
)
.add_request_handler(forward_mutating_project_request::<proto::DapRestartRequest>)
.add_request_handler(forward_mutating_project_request::<proto::DapTerminateRequest>)
.add_message_handler(broadcast_project_message_from_host::<proto::UpdateThreadStatus>)
.add_request_handler(forward_mutating_project_request::<proto::VariablesRequest>)
.add_message_handler(
broadcast_project_message_from_host::<proto::DapRestartStackFrameRequest>,
)
.add_message_handler(
broadcast_project_message_from_host::<proto::ToggleIgnoreBreakpoints>,
)
.add_message_handler(
broadcast_project_message_from_host::<proto::IgnoreBreakpointState>,
);
Arc::new(server)
}
@@ -1863,6 +1896,18 @@ fn join_project_internal(
.trace_err();
}
let breakpoints = project
.breakpoints
.iter()
.map(
|(project_path, breakpoint_set)| proto::SynchronizeBreakpoints {
project_id: project.id.0 as u64,
breakpoints: breakpoint_set.iter().cloned().collect(),
project_path: Some(project_path.clone()),
},
)
.collect();
// First, we send the metadata associated with each worktree.
response.send(proto::JoinProjectResponse {
project_id: project.id.0 as u64,
@@ -1871,6 +1916,7 @@ fn join_project_internal(
collaborators: collaborators.clone(),
language_servers: project.language_servers.clone(),
role: project.role.into(),
breakpoints,
})?;
for (worktree_id, worktree) in mem::take(&mut project.worktrees) {
@@ -2057,7 +2103,7 @@ async fn update_worktree_settings(
Ok(())
}
/// Notify other participants that a language server has started.
async fn start_language_server(
request: proto::StartLanguageServer,
session: Session,
@@ -2103,6 +2149,29 @@ async fn update_language_server(
Ok(())
}
/// Notify other participants that breakpoints have changed.
async fn update_breakpoints(
request: proto::SynchronizeBreakpoints,
session: Session,
) -> Result<()> {
let guest_connection_ids = session
.db()
.await
.update_breakpoints(session.connection_id, &request)
.await?;
broadcast(
Some(session.connection_id),
guest_connection_ids.iter().copied(),
|connection_id| {
session
.peer
.forward_send(session.connection_id, connection_id, request.clone())
},
);
Ok(())
}
/// Forward a project request to the host. These requests should be read-only,
/// as guests are allowed to send them.
async fn forward_read_only_project_request<T>(


@@ -11,6 +11,7 @@ mod channel_buffer_tests;
mod channel_guest_tests;
mod channel_message_tests;
mod channel_tests;
mod debug_panel_tests;
mod editor_tests;
mod following_tests;
mod integration_tests;

File diff suppressed because it is too large


@@ -22,7 +22,7 @@ use language::{
};
use project::{
project_settings::{InlineBlameSettings, ProjectSettings},
SERVER_PROGRESS_THROTTLE_TIMEOUT,
ProjectPath, SERVER_PROGRESS_THROTTLE_TIMEOUT,
};
use recent_projects::disconnected_overlay::DisconnectedOverlay;
use rpc::RECEIVE_TIMEOUT;
@@ -1591,6 +1591,8 @@ async fn test_mutual_editor_inlay_hint_cache_update(
.await
.unwrap();
executor.run_until_parked();
// Client B joins the project
let project_b = client_b.join_remote_project(project_id, cx_b).await;
active_call_b
@@ -2394,6 +2396,202 @@ fn main() { let foo = other::foo(); }"};
);
}
#[gpui::test]
async fn test_add_breakpoints(cx_a: &mut TestAppContext, cx_b: &mut TestAppContext) {
let executor = cx_a.executor();
let mut server = TestServer::start(executor.clone()).await;
let client_a = server.create_client(cx_a, "user_a").await;
let client_b = server.create_client(cx_b, "user_b").await;
server
.create_room(&mut [(&client_a, cx_a), (&client_b, cx_b)])
.await;
let active_call_a = cx_a.read(ActiveCall::global);
let active_call_b = cx_b.read(ActiveCall::global);
cx_a.update(editor::init);
cx_b.update(editor::init);
client_a
.fs()
.insert_tree(
"/a",
json!({
"test.txt": "one\ntwo\nthree\nfour\nfive",
}),
)
.await;
let (project_a, worktree_id) = client_a.build_local_project("/a", cx_a).await;
let project_path = ProjectPath {
worktree_id,
path: Arc::from(Path::new(&"test.txt")),
};
active_call_a
.update(cx_a, |call, cx| call.set_location(Some(&project_a), cx))
.await
.unwrap();
let project_id = active_call_a
.update(cx_a, |call, cx| call.share_project(project_a.clone(), cx))
.await
.unwrap();
let project_b = client_b.join_remote_project(project_id, cx_b).await;
active_call_b
.update(cx_b, |call, cx| call.set_location(Some(&project_b), cx))
.await
.unwrap();
let (workspace_a, cx_a) = client_a.build_workspace(&project_a, cx_a);
let (workspace_b, cx_b) = client_b.build_workspace(&project_b, cx_b);
// Client A opens an editor.
let editor_a = workspace_a
.update_in(cx_a, |workspace, window, cx| {
workspace.open_path(project_path.clone(), None, true, window, cx)
})
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
// Client B opens same editor as A.
let editor_b = workspace_b
.update_in(cx_b, |workspace, window, cx| {
workspace.open_path(project_path.clone(), None, true, window, cx)
})
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
cx_a.run_until_parked();
cx_b.run_until_parked();
// Client A adds breakpoint on line (1)
editor_a.update_in(cx_a, |editor, window, cx| {
editor.toggle_breakpoint(&editor::actions::ToggleBreakpoint, window, cx);
});
cx_a.run_until_parked();
cx_b.run_until_parked();
let breakpoints_a = editor_a.update(cx_a, |editor, cx| {
editor
.breakpoint_store()
.clone()
.unwrap()
.read(cx)
.breakpoints()
.clone()
});
let breakpoints_b = editor_b.update(cx_b, |editor, cx| {
editor
.breakpoint_store()
.clone()
.unwrap()
.read(cx)
.breakpoints()
.clone()
});
assert_eq!(1, breakpoints_a.len());
assert_eq!(1, breakpoints_a.get(&project_path).unwrap().len());
assert_eq!(breakpoints_a, breakpoints_b);
// Client B adds breakpoint on line (2)
editor_b.update_in(cx_b, |editor, window, cx| {
editor.move_down(&editor::actions::MoveDown, window, cx);
editor.move_down(&editor::actions::MoveDown, window, cx);
editor.toggle_breakpoint(&editor::actions::ToggleBreakpoint, window, cx);
});
cx_a.run_until_parked();
cx_b.run_until_parked();
let breakpoints_a = editor_a.update(cx_a, |editor, cx| {
editor
.breakpoint_store()
.clone()
.unwrap()
.read(cx)
.breakpoints()
.clone()
});
let breakpoints_b = editor_b.update(cx_b, |editor, cx| {
editor
.breakpoint_store()
.clone()
.unwrap()
.read(cx)
.breakpoints()
.clone()
});
assert_eq!(1, breakpoints_a.len());
assert_eq!(breakpoints_a, breakpoints_b);
assert_eq!(2, breakpoints_a.get(&project_path).unwrap().len());
// Client A removes the breakpoint last added by client B
editor_a.update_in(cx_a, |editor, window, cx| {
editor.move_down(&editor::actions::MoveDown, window, cx);
editor.move_down(&editor::actions::MoveDown, window, cx);
editor.toggle_breakpoint(&editor::actions::ToggleBreakpoint, window, cx);
});
cx_a.run_until_parked();
cx_b.run_until_parked();
let breakpoints_a = editor_a.update(cx_a, |editor, cx| {
editor
.breakpoint_store()
.clone()
.unwrap()
.read(cx)
.breakpoints()
.clone()
});
let breakpoints_b = editor_b.update(cx_b, |editor, cx| {
editor
.breakpoint_store()
.clone()
.unwrap()
.read(cx)
.breakpoints()
.clone()
});
assert_eq!(1, breakpoints_a.len());
assert_eq!(breakpoints_a, breakpoints_b);
assert_eq!(1, breakpoints_a.get(&project_path).unwrap().len());
// Client B removes the breakpoint first added by client A
editor_b.update_in(cx_b, |editor, window, cx| {
editor.move_up(&editor::actions::MoveUp, window, cx);
editor.move_up(&editor::actions::MoveUp, window, cx);
editor.toggle_breakpoint(&editor::actions::ToggleBreakpoint, window, cx);
});
cx_a.run_until_parked();
cx_b.run_until_parked();
let breakpoints_a = editor_a.update(cx_a, |editor, cx| {
editor
.breakpoint_store()
.clone()
.unwrap()
.read(cx)
.breakpoints()
.clone()
});
let breakpoints_b = editor_b.update(cx_b, |editor, cx| {
editor
.breakpoint_store()
.clone()
.unwrap()
.read(cx)
.breakpoints()
.clone()
});
assert_eq!(0, breakpoints_a.len());
assert_eq!(breakpoints_a, breakpoints_b);
}
#[track_caller]
fn tab_undo_assert(
cx_a: &mut EditorTestContext,

crates/dap/Cargo.toml Normal file

@@ -0,0 +1,54 @@
[package]
name = "dap"
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "GPL-3.0-or-later"
[lints]
workspace = true
[features]
test-support = [
"gpui/test-support",
"util/test-support",
"task/test-support",
"async-pipe",
"settings/test-support",
]
[dependencies]
anyhow.workspace = true
async-compression.workspace = true
async-pipe = { workspace = true, optional = true }
async-tar.workspace = true
async-trait.workspace = true
client.workspace = true
collections.workspace = true
dap-types = { git = "https://github.com/zed-industries/dap-types", rev = "bf5632dc19f806e8a435c9f04a4bfe7322badea2" }
fs.workspace = true
futures.workspace = true
gpui.workspace = true
http_client.workspace = true
language.workspace = true
log.workspace = true
node_runtime.workspace = true
parking_lot.workspace = true
paths.workspace = true
schemars.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
smallvec.workspace = true
smol.workspace = true
sysinfo.workspace = true
task.workspace = true
util.workspace = true
[dev-dependencies]
async-pipe.workspace = true
env_logger.workspace = true
gpui = { workspace = true, features = ["test-support"] }
settings = { workspace = true, features = ["test-support"] }
task = { workspace = true, features = ["test-support"] }
util = { workspace = true, features = ["test-support"] }

crates/dap/LICENSE-GPL Symbolic link

@@ -0,0 +1 @@
../../LICENSE-GPL


@@ -0,0 +1,9 @@
# Overview
The active `Project` is responsible for maintaining breakpoints in both open and
closed buffers, as well as serializing breakpoints so they can be saved. At a high
level, `Project` serializes the positions of breakpoints that don't belong to any
active buffer, and converts breakpoints between their serialized and active forms
whenever a buffer is opened or closed. `Project` also sends all relevant breakpoint
information to debug adapters when a debug session is started or while it is running.
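The open/closed bookkeeping described above can be sketched in isolation. The types below are simplified stand-ins (the real `Project` tracks anchors inside buffers and proto breakpoints, not bare row numbers):

```rust
use std::collections::{BTreeMap, BTreeSet};

// Hypothetical stand-ins for Zed's ProjectPath and Breakpoint types.
type ProjectPath = String;
type Row = u32;

/// Minimal sketch of the breakpoint bookkeeping: breakpoints in open buffers
/// are tracked live, while breakpoints in closed buffers live in a
/// serialized map keyed by project path.
#[derive(Default)]
struct BreakpointStore {
    open: BTreeMap<ProjectPath, BTreeSet<Row>>,
    serialized: BTreeMap<ProjectPath, BTreeSet<Row>>,
}

impl BreakpointStore {
    /// Toggle a breakpoint in an open buffer.
    fn toggle(&mut self, path: &str, row: Row) {
        let set = self.open.entry(path.to_string()).or_default();
        if !set.remove(&row) {
            set.insert(row);
        }
    }

    /// When a buffer closes, move its breakpoints into the serialized map.
    fn on_buffer_closed(&mut self, path: &str) {
        if let Some(set) = self.open.remove(path) {
            self.serialized.insert(path.to_string(), set);
        }
    }

    /// When a buffer opens, restore any serialized breakpoints for it.
    fn on_buffer_opened(&mut self, path: &str) {
        if let Some(set) = self.serialized.remove(path) {
            self.open.insert(path.to_string(), set);
        }
    }
}
```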

crates/dap/src/adapters.rs Normal file

@@ -0,0 +1,402 @@
#[cfg(any(test, feature = "test-support"))]
use crate::transport::FakeTransport;
use ::fs::Fs;
use anyhow::{anyhow, Context as _, Ok, Result};
use async_compression::futures::bufread::GzipDecoder;
use async_tar::Archive;
use async_trait::async_trait;
use futures::io::BufReader;
use gpui::{AsyncApp, SharedString};
pub use http_client::{github::latest_github_release, HttpClient};
use language::LanguageToolchainStore;
use node_runtime::NodeRuntime;
use serde::{Deserialize, Serialize};
use serde_json::Value;
use settings::WorktreeId;
use smol::{self, fs::File, lock::Mutex};
use std::{
collections::{HashMap, HashSet},
ffi::{OsStr, OsString},
fmt::Debug,
net::Ipv4Addr,
ops::Deref,
path::{Path, PathBuf},
sync::Arc,
time::Duration,
};
use sysinfo::{Pid, Process};
use task::DebugAdapterConfig;
use util::ResultExt;
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum DapStatus {
None,
CheckingForUpdate,
Downloading,
Failed { error: String },
}
#[async_trait(?Send)]
pub trait DapDelegate {
fn worktree_id(&self) -> WorktreeId;
fn http_client(&self) -> Arc<dyn HttpClient>;
fn node_runtime(&self) -> NodeRuntime;
fn toolchain_store(&self) -> Arc<dyn LanguageToolchainStore>;
fn fs(&self) -> Arc<dyn Fs>;
fn updated_adapters(&self) -> Arc<Mutex<HashSet<DebugAdapterName>>>;
fn update_status(&self, dap_name: DebugAdapterName, status: DapStatus);
fn which(&self, command: &OsStr) -> Option<PathBuf>;
async fn shell_env(&self) -> collections::HashMap<String, String>;
}
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub struct DebugAdapterName(pub Arc<str>);
impl Deref for DebugAdapterName {
type Target = str;
fn deref(&self) -> &Self::Target {
&self.0
}
}
impl AsRef<str> for DebugAdapterName {
fn as_ref(&self) -> &str {
&self.0
}
}
impl AsRef<Path> for DebugAdapterName {
fn as_ref(&self) -> &Path {
Path::new(&*self.0)
}
}
impl std::fmt::Display for DebugAdapterName {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
std::fmt::Display::fmt(&self.0, f)
}
}
impl From<DebugAdapterName> for SharedString {
fn from(name: DebugAdapterName) -> Self {
SharedString::from(name.0)
}
}
impl<'a> From<&'a str> for DebugAdapterName {
fn from(str: &'a str) -> DebugAdapterName {
DebugAdapterName(str.to_string().into())
}
}
#[derive(Debug, Clone)]
pub struct TcpArguments {
pub host: Ipv4Addr,
pub port: Option<u16>,
pub timeout: Option<u64>,
}
#[derive(Debug, Clone)]
pub struct DebugAdapterBinary {
pub command: String,
pub arguments: Option<Vec<OsString>>,
pub envs: Option<HashMap<String, String>>,
pub cwd: Option<PathBuf>,
pub connection: Option<TcpArguments>,
#[cfg(any(test, feature = "test-support"))]
// todo(debugger) Find a way to remove this. It's a hack for FakeTransport
pub is_fake: bool,
}
pub struct AdapterVersion {
pub tag_name: String,
pub url: String,
}
pub enum DownloadedFileType {
Vsix,
GzipTar,
Zip,
}
pub struct GithubRepo {
pub repo_name: String,
pub repo_owner: String,
}
pub async fn download_adapter_from_github(
adapter_name: DebugAdapterName,
github_version: AdapterVersion,
file_type: DownloadedFileType,
delegate: &dyn DapDelegate,
) -> Result<PathBuf> {
let adapter_path = paths::debug_adapters_dir().join(&adapter_name);
let version_path = adapter_path.join(format!("{}_{}", adapter_name, github_version.tag_name));
let fs = delegate.fs();
if version_path.exists() {
return Ok(version_path);
}
if !adapter_path.exists() {
fs.create_dir(&adapter_path.as_path())
.await
.context("Failed creating adapter path")?;
}
log::debug!(
"Downloading adapter {} from {}",
adapter_name,
&github_version.url,
);
let mut response = delegate
.http_client()
.get(&github_version.url, Default::default(), true)
.await
.context("Error downloading release")?;
if !response.status().is_success() {
Err(anyhow!(
"download failed with status {}",
response.status().to_string()
))?;
}
match file_type {
DownloadedFileType::GzipTar => {
let decompressed_bytes = GzipDecoder::new(BufReader::new(response.body_mut()));
let archive = Archive::new(decompressed_bytes);
archive.unpack(&version_path).await?;
}
DownloadedFileType::Zip | DownloadedFileType::Vsix => {
let zip_path = version_path.with_extension("zip");
let mut file = File::create(&zip_path).await?;
futures::io::copy(response.body_mut(), &mut file).await?;
// We cannot check the exit status, as some adapters include files with names that trigger `Illegal byte sequence` errors
util::command::new_smol_command("unzip")
.arg(&zip_path)
.arg("-d")
.arg(&version_path)
.output()
.await?;
util::fs::remove_matching(&adapter_path, |entry| {
entry
.file_name()
.is_some_and(|file| file.to_string_lossy().ends_with(".zip"))
})
.await;
}
}
// remove older versions
util::fs::remove_matching(&adapter_path, |entry| {
entry.to_string_lossy() != version_path.to_string_lossy()
})
.await;
Ok(version_path)
}
pub async fn fetch_latest_adapter_version_from_github(
github_repo: GithubRepo,
delegate: &dyn DapDelegate,
) -> Result<AdapterVersion> {
let release = latest_github_release(
&format!("{}/{}", github_repo.repo_owner, github_repo.repo_name),
false,
false,
delegate.http_client(),
)
.await?;
Ok(AdapterVersion {
tag_name: release.tag_name,
url: release.zipball_url,
})
}
#[async_trait(?Send)]
pub trait DebugAdapter: 'static + Send + Sync {
fn name(&self) -> DebugAdapterName;
async fn get_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
user_installed_path: Option<PathBuf>,
cx: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
if delegate
.updated_adapters()
.lock()
.await
.contains(&self.name())
{
log::info!("Using cached debug adapter binary {}", self.name());
if let Some(binary) = self
.get_installed_binary(delegate, &config, user_installed_path.clone(), cx)
.await
.log_err()
{
return Ok(binary);
}
log::info!(
"Cached binary {} is corrupt, falling back to install",
self.name()
);
}
log::info!("Getting latest version of debug adapter {}", self.name());
delegate.update_status(self.name(), DapStatus::CheckingForUpdate);
if let Some(version) = self.fetch_latest_adapter_version(delegate).await.log_err() {
log::info!(
"Installing latest version of debug adapter {}",
self.name()
);
delegate.update_status(self.name(), DapStatus::Downloading);
self.install_binary(version, delegate).await?;
delegate
.updated_adapters()
.lock_arc()
.await
.insert(self.name());
}
self.get_installed_binary(delegate, &config, user_installed_path, cx)
.await
}
async fn fetch_latest_adapter_version(
&self,
delegate: &dyn DapDelegate,
) -> Result<AdapterVersion>;
/// Installs the binary for the debug adapter.
/// This method is called when the adapter binary is not found or needs to be updated.
/// It should download and install the necessary files for the debug adapter to function.
async fn install_binary(
&self,
version: AdapterVersion,
delegate: &dyn DapDelegate,
) -> Result<()>;
async fn get_installed_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
user_installed_path: Option<PathBuf>,
cx: &mut AsyncApp,
) -> Result<DebugAdapterBinary>;
/// Should return the base configuration needed to make the debug adapter work
fn request_args(&self, config: &DebugAdapterConfig) -> Value;
/// Filters the given process list down to the processes that the adapter can attach to for debugging
fn attach_processes<'a>(
&self,
_: &'a HashMap<Pid, Process>,
) -> Option<Vec<(&'a Pid, &'a Process)>> {
None
}
}
#[cfg(any(test, feature = "test-support"))]
pub struct FakeAdapter {}
#[cfg(any(test, feature = "test-support"))]
impl FakeAdapter {
const ADAPTER_NAME: &'static str = "fake-adapter";
pub fn new() -> Self {
Self {}
}
}
#[cfg(any(test, feature = "test-support"))]
#[async_trait(?Send)]
impl DebugAdapter for FakeAdapter {
fn name(&self) -> DebugAdapterName {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
async fn get_binary(
&self,
_: &dyn DapDelegate,
_: &DebugAdapterConfig,
_: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
Ok(DebugAdapterBinary {
command: "command".into(),
arguments: None,
connection: Some(TcpArguments {
host: Ipv4Addr::LOCALHOST,
port: None,
timeout: None,
}),
is_fake: true,
envs: None,
cwd: None,
})
}
async fn fetch_latest_adapter_version(
&self,
_delegate: &dyn DapDelegate,
) -> Result<AdapterVersion> {
unimplemented!("fetch latest adapter version");
}
async fn install_binary(
&self,
_version: AdapterVersion,
_delegate: &dyn DapDelegate,
) -> Result<()> {
unimplemented!("install binary");
}
async fn get_installed_binary(
&self,
_: &dyn DapDelegate,
_: &DebugAdapterConfig,
_: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
unimplemented!("get installed binary");
}
fn request_args(&self, config: &DebugAdapterConfig) -> Value {
use serde_json::json;
use task::DebugRequestType;
json!({
"request": match config.request {
DebugRequestType::Launch => "launch",
DebugRequestType::Attach(_) => "attach",
},
"process_id": if let DebugRequestType::Attach(attach_config) = &config.request {
attach_config.process_id
} else {
None
},
})
}
fn attach_processes<'a>(
&self,
processes: &'a HashMap<Pid, Process>,
) -> Option<Vec<(&'a Pid, &'a Process)>> {
Some(
processes
.iter()
.filter(|(pid, _)| pid.as_u32() == std::process::id())
.collect::<Vec<_>>(),
)
}
}
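The caching scheme in `download_adapter_from_github` above keys each installed release by adapter name and release tag; the download is skipped when that directory already exists, and older version directories are removed after a successful install. A hypothetical helper mirroring just the path construction:

```rust
use std::path::{Path, PathBuf};

// Hypothetical helper (not part of the crate) mirroring the
// version-directory scheme used by `download_adapter_from_github`:
// each release unpacks into <adapters_dir>/<adapter>/<adapter>_<tag>.
fn version_dir(adapters_dir: &Path, adapter_name: &str, tag_name: &str) -> PathBuf {
    adapters_dir
        .join(adapter_name)
        .join(format!("{}_{}", adapter_name, tag_name))
}
```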

crates/dap/src/client.rs Normal file

@@ -0,0 +1,484 @@
use crate::{
adapters::{DebugAdapter, DebugAdapterBinary, TcpArguments},
transport::{IoKind, LogKind, TransportDelegate},
};
use anyhow::{anyhow, Result};
use dap_types::{
messages::{Message, Response},
requests::Request,
};
use futures::{channel::oneshot, select, FutureExt as _};
use gpui::{App, AsyncApp, BackgroundExecutor};
use smol::channel::{Receiver, Sender};
use std::{
hash::Hash,
net::Ipv4Addr,
sync::{
atomic::{AtomicU64, Ordering},
Arc,
},
time::Duration,
};
#[cfg(any(test, feature = "test-support"))]
const DAP_REQUEST_TIMEOUT: Duration = Duration::from_secs(2);
#[cfg(not(any(test, feature = "test-support")))]
const DAP_REQUEST_TIMEOUT: Duration = Duration::from_secs(12);
#[derive(Copy, Clone, Debug, PartialEq, Eq, PartialOrd, Ord, Hash)]
#[repr(transparent)]
pub struct DebugAdapterClientId(pub u32);
impl DebugAdapterClientId {
pub fn from_proto(client_id: u64) -> Self {
Self(client_id as u32)
}
pub fn to_proto(&self) -> u64 {
self.0 as u64
}
}
/// Represents a connection to the debug adapter process, either via stdout/stdin or a socket.
pub struct DebugAdapterClient {
id: DebugAdapterClientId,
sequence_count: AtomicU64,
binary: DebugAdapterBinary,
executor: BackgroundExecutor,
transport_delegate: TransportDelegate,
}
impl DebugAdapterClient {
pub async fn start<F>(
id: DebugAdapterClientId,
binary: DebugAdapterBinary,
message_handler: F,
cx: AsyncApp,
) -> Result<Self>
where
F: FnMut(Message, &mut App) + 'static + Send + Sync + Clone,
{
let ((server_rx, server_tx), transport_delegate) =
TransportDelegate::start(&binary, cx.clone()).await?;
let this = Self {
id,
binary,
transport_delegate,
sequence_count: AtomicU64::new(1),
executor: cx.background_executor().clone(),
};
log::info!("Successfully connected to debug adapter");
let client_id = this.id;
// start handling events/reverse requests
cx.update(|cx| {
cx.spawn({
let server_tx = server_tx.clone();
|mut cx| async move {
Self::handle_receive_messages(
client_id,
server_rx,
server_tx,
message_handler,
&mut cx,
)
.await
}
})
.detach_and_log_err(cx);
this
})
}
async fn handle_receive_messages<F>(
client_id: DebugAdapterClientId,
server_rx: Receiver<Message>,
client_tx: Sender<Message>,
mut message_handler: F,
cx: &mut AsyncApp,
) -> Result<()>
where
F: FnMut(Message, &mut App) + 'static + Send + Sync + Clone,
{
let result = loop {
let message = match server_rx.recv().await {
Ok(message) => message,
Err(e) => break Err(e.into()),
};
if let Err(e) = match message {
Message::Event(ev) => {
log::debug!("Client {} received event `{}`", client_id.0, &ev);
cx.update(|cx| message_handler(Message::Event(ev), cx))
}
Message::Request(req) => cx.update(|cx| message_handler(Message::Request(req), cx)),
Message::Response(response) => {
log::debug!("Received response after request timeout: {:#?}", response);
Ok(())
}
} {
break Err(e);
}
smol::future::yield_now().await;
};
drop(client_tx);
log::debug!("Handle receive messages dropped");
result
}
/// Send a request to an adapter and get a response back.
/// Note: this function blocks until the adapter responds or the request times out.
pub async fn request<R: Request>(&self, arguments: R::Arguments) -> Result<R::Response> {
let serialized_arguments = serde_json::to_value(arguments)?;
let (callback_tx, callback_rx) = oneshot::channel::<Result<Response>>();
let sequence_id = self.next_sequence_id();
let request = crate::messages::Request {
seq: sequence_id,
command: R::COMMAND.to_string(),
arguments: Some(serialized_arguments),
};
self.transport_delegate
.add_pending_request(sequence_id, callback_tx)
.await;
log::debug!(
"Client {} sent `{}` request with sequence_id: {}",
self.id.0,
R::COMMAND.to_string(),
sequence_id
);
self.send_message(Message::Request(request)).await?;
let mut timeout = self.executor.timer(DAP_REQUEST_TIMEOUT).fuse();
let command = R::COMMAND.to_string();
select! {
response = callback_rx.fuse() => {
log::debug!(
"Client {} received response for: `{}` sequence_id: {}",
self.id.0,
command,
sequence_id
);
let response = response??;
match response.success {
true => Ok(serde_json::from_value(response.body.unwrap_or_default())?),
false => Err(anyhow!("Request failed")),
}
}
_ = timeout => {
self.transport_delegate.cancel_pending_request(&sequence_id).await;
log::error!("Cancelled DAP request for {command:?} id {sequence_id} which took over {DAP_REQUEST_TIMEOUT:?}");
anyhow::bail!("DAP request timeout");
}
}
}
pub async fn send_message(&self, message: Message) -> Result<()> {
self.transport_delegate.send_message(message).await
}
pub fn id(&self) -> DebugAdapterClientId {
self.id
}
pub fn binary(&self) -> &DebugAdapterBinary {
&self.binary
}
/// Get the next sequence id to be used in a request
pub fn next_sequence_id(&self) -> u64 {
self.sequence_count.fetch_add(1, Ordering::Relaxed)
}
pub async fn shutdown(&self) -> Result<()> {
self.transport_delegate.shutdown().await
}
pub fn has_adapter_logs(&self) -> bool {
self.transport_delegate.has_adapter_logs()
}
pub fn add_log_handler<F>(&self, f: F, kind: LogKind)
where
F: 'static + Send + FnMut(IoKind, &str),
{
self.transport_delegate.add_log_handler(f, kind);
}
#[cfg(any(test, feature = "test-support"))]
pub async fn on_request<R: dap_types::requests::Request, F>(&self, handler: F)
where
F: 'static
+ Send
+ FnMut(u64, R::Arguments) -> Result<R::Response, dap_types::ErrorResponse>,
{
let transport = self.transport_delegate.transport();
transport.on_request::<R, F>(handler).await;
}
#[cfg(any(test, feature = "test-support"))]
pub async fn fake_reverse_request<R: dap_types::requests::Request>(&self, args: R::Arguments) {
self.send_message(Message::Request(dap_types::messages::Request {
seq: self.sequence_count.load(Ordering::Relaxed),
command: R::COMMAND.into(),
arguments: serde_json::to_value(args).ok(),
}))
.await
.unwrap();
}
#[cfg(any(test, feature = "test-support"))]
pub async fn on_response<R: dap_types::requests::Request, F>(&self, handler: F)
where
F: 'static + Send + Fn(Response),
{
let transport = self.transport_delegate.transport();
transport.on_response::<R, F>(handler).await;
}
#[cfg(any(test, feature = "test-support"))]
pub async fn fake_event(&self, event: dap_types::messages::Events) {
self.send_message(Message::Event(Box::new(event)))
.await
.unwrap();
}
}
#[cfg(test)]
mod tests {
use super::*;
use crate::{
adapters::FakeAdapter, client::DebugAdapterClient, debugger_settings::DebuggerSettings,
};
use dap_types::{
messages::Events,
requests::{Initialize, Request, RunInTerminal},
Capabilities, InitializeRequestArguments, InitializeRequestArgumentsPathFormat,
RunInTerminalRequestArguments,
};
use gpui::TestAppContext;
use serde_json::json;
use settings::{Settings, SettingsStore};
use std::sync::atomic::{AtomicBool, Ordering};
pub fn init_test(cx: &mut gpui::TestAppContext) {
if std::env::var("RUST_LOG").is_ok() {
env_logger::try_init().ok();
}
cx.update(|cx| {
let settings = SettingsStore::test(cx);
cx.set_global(settings);
DebuggerSettings::register(cx);
});
}
#[gpui::test]
pub async fn test_initialize_client(cx: &mut TestAppContext) {
init_test(cx);
let client = DebugAdapterClient::start(
crate::client::DebugAdapterClientId(1),
DebugAdapterBinary {
command: "command".into(),
arguments: Default::default(),
envs: Default::default(),
connection: Some(crate::adapters::TcpArguments {
host: Ipv4Addr::LOCALHOST,
port: None,
timeout: None,
}),
is_fake: true,
cwd: None,
},
|_, _| panic!("Did not expect to hit this code path"),
cx.to_async(),
)
.await
.unwrap();
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap_types::Capabilities {
supports_configuration_done_request: Some(true),
..Default::default()
})
})
.await;
cx.run_until_parked();
let response = client
.request::<Initialize>(InitializeRequestArguments {
client_id: Some("zed".to_owned()),
client_name: Some("Zed".to_owned()),
adapter_id: "fake-adapter".to_owned(),
locale: Some("en-US".to_owned()),
path_format: Some(InitializeRequestArgumentsPathFormat::Path),
supports_variable_type: Some(true),
supports_variable_paging: Some(false),
supports_run_in_terminal_request: Some(true),
supports_memory_references: Some(true),
supports_progress_reporting: Some(false),
supports_invalidated_event: Some(false),
lines_start_at1: Some(true),
columns_start_at1: Some(true),
supports_memory_event: Some(false),
supports_args_can_be_interpreted_by_shell: Some(false),
supports_start_debugging_request: Some(true),
})
.await
.unwrap();
cx.run_until_parked();
assert_eq!(
dap_types::Capabilities {
supports_configuration_done_request: Some(true),
..Default::default()
},
response
);
client.shutdown().await.unwrap();
}
#[gpui::test]
pub async fn test_calls_event_handler(cx: &mut TestAppContext) {
init_test(cx);
let called_event_handler = Arc::new(AtomicBool::new(false));
let client = DebugAdapterClient::start(
crate::client::DebugAdapterClientId(1),
DebugAdapterBinary {
command: "command".into(),
arguments: Default::default(),
envs: Default::default(),
connection: Some(TcpArguments {
host: Ipv4Addr::LOCALHOST,
port: None,
timeout: None,
}),
is_fake: true,
cwd: None,
},
{
let called_event_handler = called_event_handler.clone();
move |event, _| {
called_event_handler.store(true, Ordering::SeqCst);
assert_eq!(
Message::Event(Box::new(Events::Initialized(
Some(Capabilities::default())
))),
event
);
}
},
cx.to_async(),
)
.await
.unwrap();
cx.run_until_parked();
client
.fake_event(Events::Initialized(Some(Capabilities::default())))
.await;
cx.run_until_parked();
assert!(
called_event_handler.load(std::sync::atomic::Ordering::SeqCst),
"Event handler was not called"
);
client.shutdown().await.unwrap();
}
#[gpui::test]
pub async fn test_calls_event_handler_for_reverse_request(cx: &mut TestAppContext) {
init_test(cx);
let called_event_handler = Arc::new(AtomicBool::new(false));
let client = DebugAdapterClient::start(
crate::client::DebugAdapterClientId(1),
DebugAdapterBinary {
command: "command".into(),
arguments: Default::default(),
envs: Default::default(),
connection: Some(TcpArguments {
host: Ipv4Addr::LOCALHOST,
port: None,
timeout: None,
}),
is_fake: true,
cwd: None,
},
{
let called_event_handler = called_event_handler.clone();
move |event, _| {
called_event_handler.store(true, Ordering::SeqCst);
assert_eq!(
Message::Request(dap_types::messages::Request {
seq: 1,
command: RunInTerminal::COMMAND.into(),
arguments: Some(json!({
"cwd": "/project/path/src",
"args": ["node", "test.js"],
}))
}),
event
);
}
},
cx.to_async(),
)
.await
.unwrap();
cx.run_until_parked();
client
.fake_reverse_request::<RunInTerminal>(RunInTerminalRequestArguments {
kind: None,
title: None,
cwd: "/project/path/src".into(),
args: vec!["node".into(), "test.js".into()],
env: None,
args_can_be_interpreted_by_shell: None,
})
.await;
cx.run_until_parked();
assert!(
called_event_handler.load(std::sync::atomic::Ordering::SeqCst),
"Event handler was not called"
);
client.shutdown().await.unwrap();
}
}
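The request/response correlation in `DebugAdapterClient::request` (register a callback keyed by sequence id, race it against `DAP_REQUEST_TIMEOUT`, cancel the pending entry on timeout) can be sketched with std-only primitives. This is a simplified stand-in, not the crate's actual API, which uses `futures` oneshot channels and the GPUI executor's timer:

```rust
use std::collections::HashMap;
use std::sync::mpsc::{channel, RecvTimeoutError, Sender};
use std::time::Duration;

// Hypothetical stand-in for dap_types::messages::Response.
type Response = String;

/// Sketch of the pending-request table: each outgoing request registers a
/// callback channel keyed by its sequence id, and the receive loop
/// completes it when the matching response arrives.
#[derive(Default)]
struct PendingRequests {
    callbacks: HashMap<u64, Sender<Response>>,
}

impl PendingRequests {
    fn add(&mut self, seq: u64, tx: Sender<Response>) {
        self.callbacks.insert(seq, tx);
    }

    /// Called by the receive loop when a response with `seq` arrives.
    fn complete(&mut self, seq: u64, response: Response) {
        if let Some(tx) = self.callbacks.remove(&seq) {
            let _ = tx.send(response);
        }
    }
}

fn request_with_timeout(
    pending: &mut PendingRequests,
    seq: u64,
    timeout: Duration,
) -> Result<Response, RecvTimeoutError> {
    let (tx, rx) = channel();
    pending.add(seq, tx);
    // The real client sends the request, then races the callback against a
    // timer; recv_timeout plays both roles in this sketch.
    let result = rx.recv_timeout(timeout);
    if result.is_err() {
        // On timeout, drop the pending entry (cancel_pending_request).
        pending.callbacks.remove(&seq);
    }
    result
}
```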


@@ -0,0 +1,59 @@
use dap_types::SteppingGranularity;
use gpui::{App, Global};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsSources};
#[derive(Serialize, Deserialize, JsonSchema, Clone, Copy)]
#[serde(default)]
pub struct DebuggerSettings {
/// Determines the stepping granularity.
///
/// Default: line
pub stepping_granularity: SteppingGranularity,
/// Whether breakpoints should be saved and restored across Zed sessions.
///
/// Default: true
pub save_breakpoints: bool,
/// Whether to show the debug button in the status bar.
///
/// Default: true
pub button: bool,
/// Time in milliseconds until a timeout error is raised when connecting to a TCP debug adapter.
///
/// Default: 2000ms
pub timeout: u64,
/// Whether to log messages between active debug adapters and Zed.
///
/// Default: true
pub log_dap_communications: bool,
/// Whether to format DAP messages when adding them to the debug adapter logger.
///
/// Default: true
pub format_dap_log_messages: bool,
}
impl Default for DebuggerSettings {
fn default() -> Self {
Self {
button: true,
save_breakpoints: true,
stepping_granularity: SteppingGranularity::Line,
timeout: 2000,
log_dap_communications: true,
format_dap_log_messages: true,
}
}
}
impl Settings for DebuggerSettings {
const KEY: Option<&'static str> = Some("debugger");
type FileContent = Self;
fn load(sources: SettingsSources<Self::FileContent>, _: &mut App) -> anyhow::Result<Self> {
sources.json_merge()
}
}
impl Global for DebuggerSettings {}
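Given the `debugger` settings key and the defaults above, a user-level override in Zed's `settings.json` might look like the fragment below. This is a sketch based on the field names and defaults in this file; the lowercase serialized form of `stepping_granularity` is an assumption about how `SteppingGranularity` serializes:

```json
{
  "debugger": {
    "stepping_granularity": "line",
    "save_breakpoints": true,
    "button": true,
    "timeout": 2000,
    "log_dap_communications": true,
    "format_dap_log_messages": true
  }
}
```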

crates/dap/src/lib.rs Normal file

@@ -0,0 +1,24 @@
pub mod adapters;
pub mod client;
pub mod debugger_settings;
pub mod proto_conversions;
pub mod transport;
pub use dap_types::*;
pub use task::{DebugAdapterConfig, DebugAdapterKind, DebugRequestType};
#[cfg(any(test, feature = "test-support"))]
pub use adapters::FakeAdapter;
#[cfg(any(test, feature = "test-support"))]
pub fn test_config() -> DebugAdapterConfig {
DebugAdapterConfig {
label: "test config".into(),
kind: DebugAdapterKind::Fake,
request: DebugRequestType::Launch,
program: None,
supports_attach: false,
cwd: None,
initialize_args: None,
}
}


@@ -0,0 +1,638 @@
use anyhow::{anyhow, Result};
use client::proto::{
self, DapChecksum, DapChecksumAlgorithm, DapEvaluateContext, DapModule, DapScope,
DapScopePresentationHint, DapSource, DapSourcePresentationHint, DapStackFrame, DapVariable,
SetDebugClientCapabilities,
};
use dap_types::{
Capabilities, OutputEventCategory, OutputEventGroup, ScopePresentationHint, Source,
};
pub trait ProtoConversion {
type ProtoType;
type Output;
fn to_proto(&self) -> Self::ProtoType;
fn from_proto(payload: Self::ProtoType) -> Self::Output;
}
impl<T> ProtoConversion for Vec<T>
where
T: ProtoConversion<Output = T>,
{
type ProtoType = Vec<T::ProtoType>;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
self.iter().map(|item| item.to_proto()).collect()
}
fn from_proto(payload: Self::ProtoType) -> Self {
payload
.into_iter()
.map(|item| T::from_proto(item))
.collect()
}
}
impl ProtoConversion for dap_types::Scope {
type ProtoType = DapScope;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
Self::ProtoType {
name: self.name.clone(),
presentation_hint: self
.presentation_hint
.as_ref()
.map(|hint| hint.to_proto().into()),
variables_reference: self.variables_reference,
named_variables: self.named_variables,
indexed_variables: self.indexed_variables,
expensive: self.expensive,
source: self.source.as_ref().map(Source::to_proto),
line: self.line,
end_line: self.end_line,
column: self.column,
end_column: self.end_column,
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
let presentation_hint = payload
.presentation_hint
.and_then(DapScopePresentationHint::from_i32);
Self {
name: payload.name,
presentation_hint: presentation_hint.map(ScopePresentationHint::from_proto),
variables_reference: payload.variables_reference,
named_variables: payload.named_variables,
indexed_variables: payload.indexed_variables,
expensive: payload.expensive,
source: payload.source.map(dap_types::Source::from_proto),
line: payload.line,
end_line: payload.end_line,
column: payload.column,
end_column: payload.end_column,
}
}
}
impl ProtoConversion for dap_types::Variable {
type ProtoType = DapVariable;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
Self::ProtoType {
name: self.name.clone(),
value: self.value.clone(),
r#type: self.type_.clone(),
evaluate_name: self.evaluate_name.clone(),
variables_reference: self.variables_reference,
named_variables: self.named_variables,
indexed_variables: self.indexed_variables,
memory_reference: self.memory_reference.clone(),
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
Self {
name: payload.name,
value: payload.value,
type_: payload.r#type,
evaluate_name: payload.evaluate_name,
presentation_hint: None, // TODO Debugger Collab Add this
variables_reference: payload.variables_reference,
named_variables: payload.named_variables,
indexed_variables: payload.indexed_variables,
memory_reference: payload.memory_reference,
}
}
}
impl ProtoConversion for dap_types::ScopePresentationHint {
type ProtoType = DapScopePresentationHint;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
match self {
dap_types::ScopePresentationHint::Locals => DapScopePresentationHint::Locals,
dap_types::ScopePresentationHint::Arguments => DapScopePresentationHint::Arguments,
dap_types::ScopePresentationHint::Registers => DapScopePresentationHint::Registers,
dap_types::ScopePresentationHint::ReturnValue => DapScopePresentationHint::ReturnValue,
dap_types::ScopePresentationHint::Unknown => DapScopePresentationHint::ScopeUnknown,
&_ => unreachable!(),
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
match payload {
DapScopePresentationHint::Locals => dap_types::ScopePresentationHint::Locals,
DapScopePresentationHint::Arguments => dap_types::ScopePresentationHint::Arguments,
DapScopePresentationHint::Registers => dap_types::ScopePresentationHint::Registers,
DapScopePresentationHint::ReturnValue => dap_types::ScopePresentationHint::ReturnValue,
DapScopePresentationHint::ScopeUnknown => dap_types::ScopePresentationHint::Unknown,
}
}
}
impl ProtoConversion for dap_types::SourcePresentationHint {
type ProtoType = DapSourcePresentationHint;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
match self {
dap_types::SourcePresentationHint::Normal => DapSourcePresentationHint::SourceNormal,
dap_types::SourcePresentationHint::Emphasize => DapSourcePresentationHint::Emphasize,
dap_types::SourcePresentationHint::Deemphasize => {
DapSourcePresentationHint::Deemphasize
}
dap_types::SourcePresentationHint::Unknown => DapSourcePresentationHint::SourceUnknown,
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
match payload {
DapSourcePresentationHint::SourceNormal => dap_types::SourcePresentationHint::Normal,
DapSourcePresentationHint::Emphasize => dap_types::SourcePresentationHint::Emphasize,
DapSourcePresentationHint::Deemphasize => {
dap_types::SourcePresentationHint::Deemphasize
}
DapSourcePresentationHint::SourceUnknown => dap_types::SourcePresentationHint::Unknown,
}
}
}
impl ProtoConversion for dap_types::Checksum {
type ProtoType = DapChecksum;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
DapChecksum {
algorithm: self.algorithm.to_proto().into(),
checksum: self.checksum.clone(),
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
Self {
algorithm: dap_types::ChecksumAlgorithm::from_proto(payload.algorithm()),
checksum: payload.checksum,
}
}
}
impl ProtoConversion for dap_types::ChecksumAlgorithm {
type ProtoType = DapChecksumAlgorithm;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
match self {
dap_types::ChecksumAlgorithm::Md5 => DapChecksumAlgorithm::Md5,
dap_types::ChecksumAlgorithm::Sha1 => DapChecksumAlgorithm::Sha1,
dap_types::ChecksumAlgorithm::Sha256 => DapChecksumAlgorithm::Sha256,
dap_types::ChecksumAlgorithm::Timestamp => DapChecksumAlgorithm::Timestamp,
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
match payload {
DapChecksumAlgorithm::Md5 => dap_types::ChecksumAlgorithm::Md5,
DapChecksumAlgorithm::Sha1 => dap_types::ChecksumAlgorithm::Sha1,
DapChecksumAlgorithm::Sha256 => dap_types::ChecksumAlgorithm::Sha256,
DapChecksumAlgorithm::Timestamp => dap_types::ChecksumAlgorithm::Timestamp,
DapChecksumAlgorithm::ChecksumAlgorithmUnspecified => unreachable!(),
}
}
}
impl ProtoConversion for dap_types::Source {
type ProtoType = DapSource;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
Self::ProtoType {
name: self.name.clone(),
path: self.path.clone(),
source_reference: self.source_reference,
presentation_hint: self.presentation_hint.map(|hint| hint.to_proto().into()),
origin: self.origin.clone(),
sources: self
.sources
.clone()
.map(|src| src.to_proto())
.unwrap_or_default(),
adapter_data: Default::default(), // TODO Debugger Collab
checksums: self
.checksums
.clone()
.map(|c| c.to_proto())
.unwrap_or_default(),
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
Self {
name: payload.name.clone(),
path: payload.path.clone(),
source_reference: payload.source_reference,
presentation_hint: payload
.presentation_hint
.and_then(DapSourcePresentationHint::from_i32)
.map(dap_types::SourcePresentationHint::from_proto),
origin: payload.origin.clone(),
sources: Some(Vec::<dap_types::Source>::from_proto(payload.sources)),
checksums: Some(Vec::<dap_types::Checksum>::from_proto(payload.checksums)),
adapter_data: None, // TODO Debugger Collab
}
}
}
impl ProtoConversion for dap_types::StackFrame {
type ProtoType = DapStackFrame;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
Self::ProtoType {
id: self.id,
name: self.name.clone(),
source: self.source.as_ref().map(|src| src.to_proto()),
line: self.line,
column: self.column,
end_line: self.end_line,
end_column: self.end_column,
can_restart: self.can_restart,
instruction_pointer_reference: self.instruction_pointer_reference.clone(),
module_id: None, // TODO Debugger Collab
presentation_hint: None, // TODO Debugger Collab
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
Self {
id: payload.id,
name: payload.name,
source: payload.source.map(dap_types::Source::from_proto),
line: payload.line,
column: payload.column,
end_line: payload.end_line,
end_column: payload.end_column,
can_restart: payload.can_restart,
instruction_pointer_reference: payload.instruction_pointer_reference,
module_id: None, // TODO Debugger Collab
presentation_hint: None, // TODO Debugger Collab
}
}
}
impl ProtoConversion for dap_types::Module {
type ProtoType = DapModule;
type Output = Result<Self>;
fn to_proto(&self) -> Self::ProtoType {
let id = match &self.id {
dap_types::ModuleId::Number(num) => proto::dap_module_id::Id::Number(*num),
dap_types::ModuleId::String(string) => proto::dap_module_id::Id::String(string.clone()),
};
DapModule {
id: Some(proto::DapModuleId { id: Some(id) }),
name: self.name.clone(),
path: self.path.clone(),
is_optimized: self.is_optimized,
is_user_code: self.is_user_code,
version: self.version.clone(),
symbol_status: self.symbol_status.clone(),
symbol_file_path: self.symbol_file_path.clone(),
date_time_stamp: self.date_time_stamp.clone(),
address_range: self.address_range.clone(),
}
}
fn from_proto(payload: Self::ProtoType) -> Result<Self> {
let id = match payload
.id
.ok_or(anyhow!("All DapModule proto messages must have an id"))?
.id
.ok_or(anyhow!("All DapModuleID proto messages must have an id"))?
{
proto::dap_module_id::Id::String(string) => dap_types::ModuleId::String(string),
proto::dap_module_id::Id::Number(num) => dap_types::ModuleId::Number(num),
};
Ok(Self {
id,
name: payload.name,
path: payload.path,
is_optimized: payload.is_optimized,
is_user_code: payload.is_user_code,
version: payload.version,
symbol_status: payload.symbol_status,
symbol_file_path: payload.symbol_file_path,
date_time_stamp: payload.date_time_stamp,
address_range: payload.address_range,
})
}
}
pub fn capabilities_from_proto(payload: &SetDebugClientCapabilities) -> Capabilities {
Capabilities {
supports_loaded_sources_request: Some(payload.supports_loaded_sources_request),
supports_modules_request: Some(payload.supports_modules_request),
supports_restart_request: Some(payload.supports_restart_request),
supports_set_expression: Some(payload.supports_set_expression),
supports_single_thread_execution_requests: Some(
payload.supports_single_thread_execution_requests,
),
supports_step_back: Some(payload.supports_step_back),
supports_stepping_granularity: Some(payload.supports_stepping_granularity),
supports_terminate_threads_request: Some(payload.supports_terminate_threads_request),
supports_restart_frame: Some(payload.supports_restart_frame_request),
supports_clipboard_context: Some(payload.supports_clipboard_context),
..Default::default()
}
}
pub fn capabilities_to_proto(
capabilities: &Capabilities,
project_id: u64,
client_id: u64,
) -> SetDebugClientCapabilities {
SetDebugClientCapabilities {
client_id,
project_id,
supports_loaded_sources_request: capabilities
.supports_loaded_sources_request
.unwrap_or_default(),
supports_modules_request: capabilities.supports_modules_request.unwrap_or_default(),
supports_restart_request: capabilities.supports_restart_request.unwrap_or_default(),
supports_set_expression: capabilities.supports_set_expression.unwrap_or_default(),
supports_single_thread_execution_requests: capabilities
.supports_single_thread_execution_requests
.unwrap_or_default(),
supports_step_back: capabilities.supports_step_back.unwrap_or_default(),
supports_stepping_granularity: capabilities
.supports_stepping_granularity
.unwrap_or_default(),
supports_terminate_threads_request: capabilities
.supports_terminate_threads_request
.unwrap_or_default(),
supports_restart_frame_request: capabilities.supports_restart_frame.unwrap_or_default(),
supports_clipboard_context: capabilities.supports_clipboard_context.unwrap_or_default(),
}
}
impl ProtoConversion for dap_types::SteppingGranularity {
type ProtoType = proto::SteppingGranularity;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
match self {
dap_types::SteppingGranularity::Statement => proto::SteppingGranularity::Statement,
dap_types::SteppingGranularity::Line => proto::SteppingGranularity::Line,
dap_types::SteppingGranularity::Instruction => proto::SteppingGranularity::Instruction,
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
match payload {
proto::SteppingGranularity::Line => dap_types::SteppingGranularity::Line,
proto::SteppingGranularity::Instruction => dap_types::SteppingGranularity::Instruction,
proto::SteppingGranularity::Statement => dap_types::SteppingGranularity::Statement,
}
}
}
impl ProtoConversion for dap_types::OutputEventCategory {
type ProtoType = proto::DapOutputCategory;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
match self {
Self::Console => proto::DapOutputCategory::ConsoleOutput,
Self::Important => proto::DapOutputCategory::Important,
Self::Stdout => proto::DapOutputCategory::Stdout,
Self::Stderr => proto::DapOutputCategory::Stderr,
_ => proto::DapOutputCategory::Unknown,
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
match payload {
proto::DapOutputCategory::ConsoleOutput => Self::Console,
proto::DapOutputCategory::Important => Self::Important,
proto::DapOutputCategory::Stdout => Self::Stdout,
proto::DapOutputCategory::Stderr => Self::Stderr,
proto::DapOutputCategory::Unknown => Self::Unknown,
}
}
}
impl ProtoConversion for dap_types::OutputEvent {
type ProtoType = proto::DapOutputEvent;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
proto::DapOutputEvent {
category: self
.category
.as_ref()
.map(|category| category.to_proto().into()),
output: self.output.clone(),
variables_reference: self.variables_reference,
source: self.source.as_ref().map(|source| source.to_proto()),
line: self.line.map(|line| line as u32),
column: self.column.map(|column| column as u32),
group: self.group.map(|group| group.to_proto().into()),
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
dap_types::OutputEvent {
category: payload
.category
.and_then(proto::DapOutputCategory::from_i32)
.map(OutputEventCategory::from_proto),
output: payload.output.clone(),
variables_reference: payload.variables_reference,
source: payload.source.map(Source::from_proto),
line: payload.line.map(|line| line as u64),
column: payload.column.map(|column| column as u64),
group: payload
.group
.and_then(proto::DapOutputEventGroup::from_i32)
.map(OutputEventGroup::from_proto),
data: None,
}
}
}
impl ProtoConversion for dap_types::OutputEventGroup {
type ProtoType = proto::DapOutputEventGroup;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
match self {
dap_types::OutputEventGroup::Start => proto::DapOutputEventGroup::Start,
dap_types::OutputEventGroup::StartCollapsed => {
proto::DapOutputEventGroup::StartCollapsed
}
dap_types::OutputEventGroup::End => proto::DapOutputEventGroup::End,
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
match payload {
proto::DapOutputEventGroup::Start => Self::Start,
proto::DapOutputEventGroup::StartCollapsed => Self::StartCollapsed,
proto::DapOutputEventGroup::End => Self::End,
}
}
}
impl ProtoConversion for dap_types::CompletionItem {
type ProtoType = proto::DapCompletionItem;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
proto::DapCompletionItem {
label: self.label.clone(),
text: self.text.clone(),
detail: self.detail.clone(),
typ: self
.type_
.as_ref()
.map(ProtoConversion::to_proto)
.map(|typ| typ.into()),
start: self.start,
length: self.length,
selection_start: self.selection_start,
selection_length: self.selection_length,
sort_text: self.sort_text.clone(),
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
let typ = payload.typ(); // todo(debugger): potential bug: `typ()` falls back to a default variant when the field is None
Self {
label: payload.label,
detail: payload.detail,
sort_text: payload.sort_text,
text: payload.text.clone(),
type_: Some(dap_types::CompletionItemType::from_proto(typ)),
start: payload.start,
length: payload.length,
selection_start: payload.selection_start,
selection_length: payload.selection_length,
}
}
}
impl ProtoConversion for dap_types::EvaluateArgumentsContext {
type ProtoType = DapEvaluateContext;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
match self {
dap_types::EvaluateArgumentsContext::Variables => {
proto::DapEvaluateContext::EvaluateVariables
}
dap_types::EvaluateArgumentsContext::Watch => proto::DapEvaluateContext::Watch,
dap_types::EvaluateArgumentsContext::Hover => proto::DapEvaluateContext::Hover,
dap_types::EvaluateArgumentsContext::Repl => proto::DapEvaluateContext::Repl,
dap_types::EvaluateArgumentsContext::Clipboard => proto::DapEvaluateContext::Clipboard,
dap_types::EvaluateArgumentsContext::Unknown => {
proto::DapEvaluateContext::EvaluateUnknown
}
_ => proto::DapEvaluateContext::EvaluateUnknown,
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
match payload {
proto::DapEvaluateContext::EvaluateVariables => {
dap_types::EvaluateArgumentsContext::Variables
}
proto::DapEvaluateContext::Watch => dap_types::EvaluateArgumentsContext::Watch,
proto::DapEvaluateContext::Hover => dap_types::EvaluateArgumentsContext::Hover,
proto::DapEvaluateContext::Repl => dap_types::EvaluateArgumentsContext::Repl,
proto::DapEvaluateContext::Clipboard => dap_types::EvaluateArgumentsContext::Clipboard,
proto::DapEvaluateContext::EvaluateUnknown => {
dap_types::EvaluateArgumentsContext::Unknown
}
}
}
}
impl ProtoConversion for dap_types::CompletionItemType {
type ProtoType = proto::DapCompletionItemType;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
match self {
dap_types::CompletionItemType::Class => proto::DapCompletionItemType::Class,
dap_types::CompletionItemType::Color => proto::DapCompletionItemType::Color,
dap_types::CompletionItemType::Constructor => proto::DapCompletionItemType::Constructor,
dap_types::CompletionItemType::Customcolor => proto::DapCompletionItemType::Customcolor,
dap_types::CompletionItemType::Enum => proto::DapCompletionItemType::Enum,
dap_types::CompletionItemType::Field => proto::DapCompletionItemType::Field,
dap_types::CompletionItemType::File => proto::DapCompletionItemType::CompletionItemFile,
dap_types::CompletionItemType::Function => proto::DapCompletionItemType::Function,
dap_types::CompletionItemType::Interface => proto::DapCompletionItemType::Interface,
dap_types::CompletionItemType::Keyword => proto::DapCompletionItemType::Keyword,
dap_types::CompletionItemType::Method => proto::DapCompletionItemType::Method,
dap_types::CompletionItemType::Module => proto::DapCompletionItemType::Module,
dap_types::CompletionItemType::Property => proto::DapCompletionItemType::Property,
dap_types::CompletionItemType::Reference => proto::DapCompletionItemType::Reference,
dap_types::CompletionItemType::Snippet => proto::DapCompletionItemType::Snippet,
dap_types::CompletionItemType::Text => proto::DapCompletionItemType::Text,
dap_types::CompletionItemType::Unit => proto::DapCompletionItemType::Unit,
dap_types::CompletionItemType::Value => proto::DapCompletionItemType::Value,
dap_types::CompletionItemType::Variable => proto::DapCompletionItemType::Variable,
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
match payload {
proto::DapCompletionItemType::Class => dap_types::CompletionItemType::Class,
proto::DapCompletionItemType::Color => dap_types::CompletionItemType::Color,
proto::DapCompletionItemType::CompletionItemFile => dap_types::CompletionItemType::File,
proto::DapCompletionItemType::Constructor => dap_types::CompletionItemType::Constructor,
proto::DapCompletionItemType::Customcolor => dap_types::CompletionItemType::Customcolor,
proto::DapCompletionItemType::Enum => dap_types::CompletionItemType::Enum,
proto::DapCompletionItemType::Field => dap_types::CompletionItemType::Field,
proto::DapCompletionItemType::Function => dap_types::CompletionItemType::Function,
proto::DapCompletionItemType::Interface => dap_types::CompletionItemType::Interface,
proto::DapCompletionItemType::Keyword => dap_types::CompletionItemType::Keyword,
proto::DapCompletionItemType::Method => dap_types::CompletionItemType::Method,
proto::DapCompletionItemType::Module => dap_types::CompletionItemType::Module,
proto::DapCompletionItemType::Property => dap_types::CompletionItemType::Property,
proto::DapCompletionItemType::Reference => dap_types::CompletionItemType::Reference,
proto::DapCompletionItemType::Snippet => dap_types::CompletionItemType::Snippet,
proto::DapCompletionItemType::Text => dap_types::CompletionItemType::Text,
proto::DapCompletionItemType::Unit => dap_types::CompletionItemType::Unit,
proto::DapCompletionItemType::Value => dap_types::CompletionItemType::Value,
proto::DapCompletionItemType::Variable => dap_types::CompletionItemType::Variable,
}
}
}
impl ProtoConversion for dap_types::Thread {
type ProtoType = proto::DapThread;
type Output = Self;
fn to_proto(&self) -> Self::ProtoType {
proto::DapThread {
id: self.id,
name: self.name.clone(),
}
}
fn from_proto(payload: Self::ProtoType) -> Self {
Self {
id: payload.id,
name: payload.name,
}
}
}
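Every conversion in this file follows the same shape, and the blanket `Vec<T>` impl at the top makes element-wise conversion come for free. A minimal, self-contained sketch of the pattern, using a stand-in enum and a plain `i32` wire representation in place of the real `dap_types` and prost-generated proto types:

```rust
// Self-contained sketch of the ProtoConversion pattern; the enum and
// i32 wire format are stand-ins, not the real dap_types/proto types.
trait ProtoConversion {
    type ProtoType;
    type Output;
    fn to_proto(&self) -> Self::ProtoType;
    fn from_proto(payload: Self::ProtoType) -> Self::Output;
}

#[derive(Clone, Copy, Debug, PartialEq)]
enum Granularity { Statement, Line, Instruction }

impl ProtoConversion for Granularity {
    type ProtoType = i32; // prost enums travel as i32 on the wire
    type Output = Self;
    fn to_proto(&self) -> i32 {
        match self { Self::Statement => 0, Self::Line => 1, Self::Instruction => 2 }
    }
    fn from_proto(payload: i32) -> Self {
        match payload { 0 => Self::Statement, 1 => Self::Line, _ => Self::Instruction }
    }
}

// Blanket impl: converting a Vec is just converting each element.
impl<T> ProtoConversion for Vec<T>
where T: ProtoConversion<Output = T> {
    type ProtoType = Vec<T::ProtoType>;
    type Output = Self;
    fn to_proto(&self) -> Self::ProtoType {
        self.iter().map(|item| item.to_proto()).collect()
    }
    fn from_proto(payload: Self::ProtoType) -> Self {
        payload.into_iter().map(T::from_proto).collect()
    }
}

fn main() {
    let scopes = vec![Granularity::Line, Granularity::Instruction];
    let wire = scopes.to_proto();
    assert_eq!(wire, vec![1, 2]);
    // Round-trip through the wire format is lossless.
    assert_eq!(Vec::<Granularity>::from_proto(wire), scopes);
}
```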

crates/dap/src/transport.rs Normal file

@@ -0,0 +1,936 @@
use anyhow::{anyhow, bail, Context, Result};
use async_trait::async_trait;
use dap_types::{
messages::{Message, Response},
ErrorResponse,
};
use futures::{channel::oneshot, select, AsyncRead, AsyncReadExt as _, AsyncWrite, FutureExt as _};
use gpui::AsyncApp;
use settings::Settings as _;
use smallvec::SmallVec;
use smol::{
channel::{unbounded, Receiver, Sender},
io::{AsyncBufReadExt as _, AsyncWriteExt, BufReader},
lock::Mutex,
net::{TcpListener, TcpStream},
process::Child,
};
use std::{
any::Any,
collections::HashMap,
net::{Ipv4Addr, SocketAddrV4},
process::Stdio,
sync::Arc,
time::Duration,
};
use task::{DebugAdapterKind, TCPHost};
use util::ResultExt as _;
use crate::{adapters::DebugAdapterBinary, debugger_settings::DebuggerSettings};
pub type IoHandler = Box<dyn Send + FnMut(IoKind, &str)>;
#[derive(PartialEq, Eq, Clone, Copy)]
pub enum LogKind {
Adapter,
Rpc,
}
pub enum IoKind {
StdIn,
StdOut,
StdErr,
}
pub struct TransportPipe {
input: Box<dyn AsyncWrite + Unpin + Send + 'static>,
output: Box<dyn AsyncRead + Unpin + Send + 'static>,
stdout: Option<Box<dyn AsyncRead + Unpin + Send + 'static>>,
stderr: Option<Box<dyn AsyncRead + Unpin + Send + 'static>>,
}
impl TransportPipe {
pub fn new(
input: Box<dyn AsyncWrite + Unpin + Send + 'static>,
output: Box<dyn AsyncRead + Unpin + Send + 'static>,
stdout: Option<Box<dyn AsyncRead + Unpin + Send + 'static>>,
stderr: Option<Box<dyn AsyncRead + Unpin + Send + 'static>>,
) -> Self {
TransportPipe {
input,
output,
stdout,
stderr,
}
}
}
type Requests = Arc<Mutex<HashMap<u64, oneshot::Sender<Result<Response>>>>>;
type LogHandlers = Arc<parking_lot::Mutex<SmallVec<[(LogKind, IoHandler); 2]>>>;
enum Transport {
Stdio(StdioTransport),
Tcp(TcpTransport),
#[cfg(any(test, feature = "test-support"))]
Fake(FakeTransport),
}
impl Transport {
async fn start(binary: &DebugAdapterBinary, cx: AsyncApp) -> Result<(TransportPipe, Self)> {
#[cfg(any(test, feature = "test-support"))]
if binary.is_fake {
return FakeTransport::start(cx)
.await
.map(|(transports, fake)| (transports, Self::Fake(fake)));
}
if binary.connection.is_some() {
TcpTransport::start(binary, cx)
.await
.map(|(transports, tcp)| (transports, Self::Tcp(tcp)))
} else {
StdioTransport::start(binary, cx)
.await
.map(|(transports, stdio)| (transports, Self::Stdio(stdio)))
}
}
fn has_adapter_logs(&self) -> bool {
match self {
Transport::Stdio(stdio_transport) => stdio_transport.has_adapter_logs(),
Transport::Tcp(tcp_transport) => tcp_transport.has_adapter_logs(),
#[cfg(any(test, feature = "test-support"))]
Transport::Fake(fake_transport) => fake_transport.has_adapter_logs(),
}
}
async fn kill(&self) -> Result<()> {
match self {
Transport::Stdio(stdio_transport) => stdio_transport.kill().await,
Transport::Tcp(tcp_transport) => tcp_transport.kill().await,
#[cfg(any(test, feature = "test-support"))]
Transport::Fake(fake_transport) => fake_transport.kill().await,
}
}
#[cfg(any(test, feature = "test-support"))]
fn as_fake(&self) -> &FakeTransport {
match self {
Transport::Fake(fake_transport) => fake_transport,
_ => panic!("Not a fake transport layer"),
}
}
}
pub(crate) struct TransportDelegate {
log_handlers: LogHandlers,
current_requests: Requests,
pending_requests: Requests,
transport: Transport,
server_tx: Arc<Mutex<Option<Sender<Message>>>>,
}
impl Transport {
#[cfg(any(test, feature = "test-support"))]
fn fake(args: DebugAdapterBinary) -> Self {
todo!()
}
}
impl TransportDelegate {
#[cfg(any(test, feature = "test-support"))]
pub fn fake(args: DebugAdapterBinary) -> Self {
Self {
transport: Transport::fake(args),
server_tx: Default::default(),
log_handlers: Default::default(),
current_requests: Default::default(),
pending_requests: Default::default(),
}
}
pub(crate) async fn start(
binary: &DebugAdapterBinary,
cx: AsyncApp,
) -> Result<((Receiver<Message>, Sender<Message>), Self)> {
let (transport_pipes, transport) = Transport::start(binary, cx.clone()).await?;
let mut this = Self {
transport,
server_tx: Default::default(),
log_handlers: Default::default(),
current_requests: Default::default(),
pending_requests: Default::default(),
};
let messages = this.start_handlers(transport_pipes, cx).await?;
Ok((messages, this))
}
async fn start_handlers(
&mut self,
mut params: TransportPipe,
cx: AsyncApp,
) -> Result<(Receiver<Message>, Sender<Message>)> {
let (client_tx, server_rx) = unbounded::<Message>();
let (server_tx, client_rx) = unbounded::<Message>();
let log_dap_communications =
cx.update(|cx| DebuggerSettings::get_global(cx).log_dap_communications)
.with_context(|| "Failed to get Debugger Setting log dap communications error in transport::start_handlers. Defaulting to false")
.unwrap_or(false);
let log_handler = if log_dap_communications {
Some(self.log_handlers.clone())
} else {
None
};
cx.update(|cx| {
if let Some(stdout) = params.stdout.take() {
cx.background_executor()
.spawn(Self::handle_adapter_log(stdout, log_handler.clone()))
.detach_and_log_err(cx);
}
cx.background_executor()
.spawn(Self::handle_output(
params.output,
client_tx,
self.pending_requests.clone(),
log_handler.clone(),
))
.detach_and_log_err(cx);
if let Some(stderr) = params.stderr.take() {
cx.background_executor()
.spawn(Self::handle_error(stderr, self.log_handlers.clone()))
.detach_and_log_err(cx);
}
cx.background_executor()
.spawn(Self::handle_input(
params.input,
client_rx,
self.current_requests.clone(),
self.pending_requests.clone(),
log_handler.clone(),
))
.detach_and_log_err(cx);
})?;
{
let mut lock = self.server_tx.lock().await;
*lock = Some(server_tx.clone());
}
Ok((server_rx, server_tx))
}
pub(crate) async fn add_pending_request(
&self,
sequence_id: u64,
request: oneshot::Sender<Result<Response>>,
) {
let mut pending_requests = self.pending_requests.lock().await;
pending_requests.insert(sequence_id, request);
}
pub(crate) async fn cancel_pending_request(&self, sequence_id: &u64) {
let mut pending_requests = self.pending_requests.lock().await;
pending_requests.remove(sequence_id);
}
pub(crate) async fn send_message(&self, message: Message) -> Result<()> {
if let Some(server_tx) = self.server_tx.lock().await.as_ref() {
server_tx
.send(message)
.await
.map_err(|e| anyhow!("Failed to send message: {}", e))
} else {
Err(anyhow!("Server tx already dropped"))
}
}
async fn handle_adapter_log<Stdout>(
stdout: Stdout,
log_handlers: Option<LogHandlers>,
) -> Result<()>
where
Stdout: AsyncRead + Unpin + Send + 'static,
{
let mut reader = BufReader::new(stdout);
let mut line = String::new();
let result = loop {
line.truncate(0);
let bytes_read = match reader.read_line(&mut line).await {
Ok(bytes_read) => bytes_read,
Err(e) => break Err(e.into()),
};
if bytes_read == 0 {
break Err(anyhow!("Debugger log stream closed"));
}
if let Some(log_handlers) = log_handlers.as_ref() {
for (kind, handler) in log_handlers.lock().iter_mut() {
if matches!(kind, LogKind::Adapter) {
handler(IoKind::StdOut, line.as_str());
}
}
}
smol::future::yield_now().await;
};
log::debug!("Handle adapter log dropped");
result
}
fn build_rpc_message(message: String) -> String {
format!("Content-Length: {}\r\n\r\n{}", message.len(), message)
}
async fn handle_input<Stdin>(
mut server_stdin: Stdin,
client_rx: Receiver<Message>,
current_requests: Requests,
pending_requests: Requests,
log_handlers: Option<LogHandlers>,
) -> Result<()>
where
Stdin: AsyncWrite + Unpin + Send + 'static,
{
let result = loop {
match client_rx.recv().await {
Ok(message) => {
if let Message::Request(request) = &message {
if let Some(sender) = current_requests.lock().await.remove(&request.seq) {
pending_requests.lock().await.insert(request.seq, sender);
}
}
let message = match serde_json::to_string(&message) {
Ok(message) => message,
Err(e) => break Err(e.into()),
};
if let Some(log_handlers) = log_handlers.as_ref() {
for (kind, log_handler) in log_handlers.lock().iter_mut() {
if matches!(kind, LogKind::Rpc) {
log_handler(IoKind::StdIn, &message);
}
}
}
if let Err(e) = server_stdin
.write_all(Self::build_rpc_message(message).as_bytes())
.await
{
break Err(e.into());
}
if let Err(e) = server_stdin.flush().await {
break Err(e.into());
}
}
Err(error) => break Err(error.into()),
}
smol::future::yield_now().await;
};
log::debug!("Handle adapter input dropped");
result
}
async fn handle_output<Stdout>(
server_stdout: Stdout,
client_tx: Sender<Message>,
pending_requests: Requests,
log_handlers: Option<LogHandlers>,
) -> Result<()>
where
Stdout: AsyncRead + Unpin + Send + 'static,
{
let mut recv_buffer = String::new();
let mut reader = BufReader::new(server_stdout);
let result = loop {
let message =
Self::receive_server_message(&mut reader, &mut recv_buffer, log_handlers.as_ref())
.await;
match message {
Ok(Message::Response(res)) => {
if let Some(tx) = pending_requests.lock().await.remove(&res.request_seq) {
if let Err(e) = tx.send(Self::process_response(res)) {
break Err(anyhow!("Failed to send response: {:?}", e));
}
} else {
client_tx.send(Message::Response(res)).await?;
};
}
Ok(message) => {
client_tx.send(message).await?;
}
Err(e) => break Err(e),
}
smol::future::yield_now().await;
};
drop(client_tx);
log::debug!("Handle adapter output dropped");
result
}
async fn handle_error<Stderr>(stderr: Stderr, log_handlers: LogHandlers) -> Result<()>
where
Stderr: AsyncRead + Unpin + Send + 'static,
{
let mut buffer = String::new();
let mut reader = BufReader::new(stderr);
let result = loop {
match reader.read_line(&mut buffer).await {
Ok(0) => break Err(anyhow!("debugger error stream closed")),
Ok(_) => {
for (kind, log_handler) in log_handlers.lock().iter_mut() {
if matches!(kind, LogKind::Adapter) {
log_handler(IoKind::StdErr, buffer.as_str());
}
}
buffer.truncate(0);
}
Err(error) => break Err(error.into()),
}
smol::future::yield_now().await;
};
log::debug!("Handle adapter error dropped");
result
}
fn process_response(response: Response) -> Result<Response> {
if response.success {
Ok(response)
} else {
if let Some(body) = response.body.clone() {
if let Ok(error) = serde_json::from_value::<ErrorResponse>(body) {
if let Some(message) = error.error {
return Err(anyhow!(message.format));
};
};
}
Err(anyhow!(
"Received error response from adapter. Response: {:?}",
response.clone()
))
}
}
async fn receive_server_message<Stdout>(
reader: &mut BufReader<Stdout>,
buffer: &mut String,
log_handlers: Option<&LogHandlers>,
) -> Result<Message>
where
Stdout: AsyncRead + Unpin + Send + 'static,
{
let mut content_length = None;
loop {
buffer.truncate(0);
if reader
.read_line(buffer)
.await
.with_context(|| "reading a message from server")?
== 0
{
return Err(anyhow!("debugger reader stream closed"));
};
if buffer == "\r\n" {
break;
}
let parts = buffer.trim().split_once(": ");
match parts {
Some(("Content-Length", value)) => {
content_length = Some(value.parse().context("invalid content length")?);
}
_ => {}
}
}
let content_length = content_length.context("missing content length")?;
let mut content = vec![0; content_length];
reader
.read_exact(&mut content)
.await
.context("reading the message body from server")?;
let message = std::str::from_utf8(&content).context("invalid utf8 from server")?;
if let Some(log_handlers) = log_handlers {
for (kind, log_handler) in log_handlers.lock().iter_mut() {
if matches!(kind, LogKind::Rpc) {
log_handler(IoKind::StdOut, &message);
}
}
}
Ok(serde_json::from_str::<Message>(message)?)
}
pub async fn shutdown(&self) -> Result<()> {
log::debug!("Start shutdown client");
if let Some(server_tx) = self.server_tx.lock().await.take().as_ref() {
server_tx.close();
}
let mut current_requests = self.current_requests.lock().await;
let mut pending_requests = self.pending_requests.lock().await;
current_requests.clear();
pending_requests.clear();
let _ = self.transport.kill().await.log_err();
drop(current_requests);
drop(pending_requests);
log::debug!("Shutdown client completed");
anyhow::Ok(())
}
pub fn has_adapter_logs(&self) -> bool {
self.transport.has_adapter_logs()
}
#[cfg(any(test, feature = "test-support"))]
pub fn transport(&self) -> Arc<&FakeTransport> {
Arc::new(self.transport.as_fake())
}
pub fn add_log_handler<F>(&self, f: F, kind: LogKind)
where
F: 'static + Send + FnMut(IoKind, &str),
{
let mut log_handlers = self.log_handlers.lock();
log_handlers.push((kind, Box::new(f)));
}
}
pub struct TcpTransport {
port: u16,
host: Ipv4Addr,
timeout: u64,
process: Mutex<Child>,
}
impl TcpTransport {
/// Get an open port to use with the TCP client when one is not supplied by the debug config
pub async fn port(host: &TCPHost) -> Result<u16> {
if let Some(port) = host.port {
Ok(port)
} else {
Ok(TcpListener::bind(SocketAddrV4::new(host.host(), 0))
.await?
.local_addr()?
.port())
}
}
async fn port_for_host(host: Ipv4Addr) -> Result<u16> {
Ok(TcpListener::bind(SocketAddrV4::new(host, 0))
.await?
.local_addr()?
.port())
}
async fn start(binary: &DebugAdapterBinary, cx: AsyncApp) -> Result<(TransportPipe, Self)> {
let Some(connection_args) = binary.connection.as_ref() else {
return Err(anyhow!("No connection arguments provided"));
};
let host = connection_args.host;
let port = if let Some(port) = connection_args.port {
port
} else {
TcpListener::bind(SocketAddrV4::new(host, 0))
.await
.with_context(|| {
format!(
"Failed to connect to debug adapter over tcp. host: {}",
host
)
})?
.local_addr()?
.port()
};
let mut command = util::command::new_smol_command(&binary.command);
if let Some(cwd) = &binary.cwd {
command.current_dir(cwd);
}
if let Some(args) = &binary.arguments {
command.args(args);
}
if let Some(envs) = &binary.envs {
command.envs(envs);
}
command
.stdin(Stdio::null())
.stdout(Stdio::piped())
.stderr(Stdio::piped())
.kill_on_drop(true);
let mut process = command
.spawn()
.context("failed to start debug adapter")?;
let address = SocketAddrV4::new(host, port);
let timeout = connection_args.timeout.unwrap_or_else(|| {
cx.update(|cx| DebuggerSettings::get_global(cx).timeout)
.unwrap_or(2000u64)
});
let (rx, tx) = select! {
_ = cx.background_executor().timer(Duration::from_millis(timeout)).fuse() => {
return Err(anyhow!("Connection to TCP DAP server timed out at {}:{}", host, port))
},
result = cx.spawn(|cx| async move {
loop {
match TcpStream::connect(address).await {
Ok(stream) => return stream.split(),
Err(_) => {
cx.background_executor().timer(Duration::from_millis(100)).await;
}
}
}
}).fuse() => result
};
log::info!(
"Debug adapter has connected to TCP server {}:{}",
host,
port
);
let stdout = process.stdout.take();
let stderr = process.stderr.take();
let this = Self {
port,
host,
process: Mutex::new(process),
timeout,
};
let pipe = TransportPipe::new(
Box::new(tx),
Box::new(BufReader::new(rx)),
stdout.map(|s| Box::new(s) as Box<dyn AsyncRead + Unpin + Send>),
stderr.map(|s| Box::new(s) as Box<dyn AsyncRead + Unpin + Send>),
);
Ok((pipe, this))
}
fn has_adapter_logs(&self) -> bool {
true
}
async fn kill(&self) -> Result<()> {
self.process.lock().await.kill()?;
Ok(())
}
}
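The `select!` in `TcpTransport::start` races a connect loop, retrying every 100 ms until the freshly spawned adapter starts listening, against a timeout timer. A blocking, std-only sketch of the same retry-until-deadline idea (the helper name and parameters are illustrative, not part of this crate):

```rust
use std::{
    net::{SocketAddr, TcpListener, TcpStream},
    thread,
    time::{Duration, Instant},
};

// Retry connecting until `timeout` elapses, sleeping briefly between attempts.
fn connect_with_retry(addr: SocketAddr, timeout: Duration) -> Option<TcpStream> {
    let deadline = Instant::now() + timeout;
    loop {
        match TcpStream::connect(addr) {
            Ok(stream) => return Some(stream),
            Err(_) if Instant::now() < deadline => thread::sleep(Duration::from_millis(100)),
            Err(_) => return None,
        }
    }
}

fn main() {
    // Binding port 0 asks the OS for any free port, mirroring `TcpTransport::port`.
    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
    let addr = listener.local_addr().unwrap();
    assert!(connect_with_retry(addr, Duration::from_secs(2)).is_some());
    println!("connected to {addr}");
}
```

The async version above additionally splits the stream into read/write halves with `stream.split()` once the connection succeeds.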
pub struct StdioTransport {
process: Mutex<Child>,
}
impl StdioTransport {
async fn start(binary: &DebugAdapterBinary, _: AsyncApp) -> Result<(TransportPipe, Self)> {
let mut command = util::command::new_smol_command(&binary.command);
if let Some(cwd) = &binary.cwd {
command.current_dir(cwd);
}
if let Some(args) = &binary.arguments {
command.args(args);
}
if let Some(envs) = &binary.envs {
command.envs(envs);
}
command
.stdin(Stdio::piped())
.stdout(Stdio::piped())
.stderr(Stdio::piped())
.kill_on_drop(true);
let mut process = command
.spawn()
.context("failed to spawn command")?;
let stdin = process
.stdin
.take()
.ok_or_else(|| anyhow!("Failed to open stdin"))?;
let stdout = process
.stdout
.take()
.ok_or_else(|| anyhow!("Failed to open stdout"))?;
let stderr = process
.stderr
.take()
.map(|stderr| Box::new(stderr) as Box<dyn AsyncRead + Unpin + Send>);
if stderr.is_none() {
log::error!(
"Failed to connect to stderr for debug adapter command {}",
&binary.command
);
}
log::info!("Debug adapter has connected to stdio adapter");
let process = Mutex::new(process);
Ok((
TransportPipe::new(
Box::new(stdin),
Box::new(BufReader::new(stdout)),
None,
stderr,
),
Self { process },
))
}
fn has_adapter_logs(&self) -> bool {
false
}
async fn reconnect(&self, _: AsyncApp) -> Result<TransportPipe> {
bail!("Cannot reconnect to adapter")
}
async fn kill(&self) -> Result<()> {
self.process.lock().await.kill()?;
Ok(())
}
}
#[cfg(any(test, feature = "test-support"))]
type RequestHandler = Box<
dyn Send
+ FnMut(
u64,
serde_json::Value,
Arc<Mutex<async_pipe::PipeWriter>>,
) -> std::pin::Pin<Box<dyn std::future::Future<Output = ()> + Send>>,
>;
#[cfg(any(test, feature = "test-support"))]
type ResponseHandler = Box<dyn Send + Fn(Response)>;
#[cfg(any(test, feature = "test-support"))]
pub struct FakeTransport {
// for sending fake response back from adapter side
request_handlers: Arc<Mutex<HashMap<&'static str, RequestHandler>>>,
// for reverse request responses
response_handlers: Arc<Mutex<HashMap<&'static str, ResponseHandler>>>,
}
#[cfg(any(test, feature = "test-support"))]
impl FakeTransport {
pub async fn on_request<R: dap_types::requests::Request, F>(&self, mut handler: F)
where
F: 'static + Send + FnMut(u64, R::Arguments) -> Result<R::Response, ErrorResponse>,
{
self.request_handlers.lock().await.insert(
R::COMMAND,
Box::new(
move |seq, args, writer: Arc<Mutex<async_pipe::PipeWriter>>| {
let response = handler(seq, serde_json::from_value(args).unwrap());
let message = serde_json::to_string(&Message::Response(Response {
seq: seq + 1,
request_seq: seq,
success: response.as_ref().is_ok(),
command: R::COMMAND.into(),
body: util::maybe!({ serde_json::to_value(response.ok()?).ok() }),
}))
.unwrap();
let writer = writer.clone();
Box::pin(async move {
let mut writer = writer.lock().await;
writer
.write_all(TransportDelegate::build_rpc_message(message).as_bytes())
.await
.unwrap();
writer.flush().await.unwrap();
})
},
),
);
}
pub async fn on_response<R: dap_types::requests::Request, F>(&self, handler: F)
where
F: 'static + Send + Fn(Response),
{
self.response_handlers
.lock()
.await
.insert(R::COMMAND, Box::new(handler));
}
async fn reconnect(&self, cx: AsyncApp) -> Result<(TransportPipe, Self)> {
FakeTransport::start(cx).await
}
async fn start(cx: AsyncApp) -> Result<(TransportPipe, Self)> {
let this = Self {
request_handlers: Arc::new(Mutex::new(HashMap::default())),
response_handlers: Arc::new(Mutex::new(HashMap::default())),
};
use dap_types::requests::{Request, RunInTerminal, StartDebugging};
use serde_json::json;
let (stdin_writer, stdin_reader) = async_pipe::pipe();
let (stdout_writer, stdout_reader) = async_pipe::pipe();
let request_handlers = this.request_handlers.clone();
let response_handlers = this.response_handlers.clone();
let stdout_writer = Arc::new(Mutex::new(stdout_writer));
cx.background_executor()
.spawn(async move {
let mut reader = BufReader::new(stdin_reader);
let mut buffer = String::new();
loop {
let message =
TransportDelegate::receive_server_message(&mut reader, &mut buffer, None)
.await;
match message {
Err(error) => {
break anyhow!(error);
}
Ok(message) => {
match message {
Message::Request(request) => {
// redirect reverse requests to stdout writer/reader
if request.command == RunInTerminal::COMMAND
|| request.command == StartDebugging::COMMAND
{
let message =
serde_json::to_string(&Message::Request(request))
.unwrap();
let mut writer = stdout_writer.lock().await;
writer
.write_all(
TransportDelegate::build_rpc_message(message)
.as_bytes(),
)
.await
.unwrap();
writer.flush().await.unwrap();
} else if let Some(handle) = request_handlers
.lock()
.await
.get_mut(request.command.as_str())
{
handle(
request.seq,
request.arguments.unwrap_or(json!({})),
stdout_writer.clone(),
)
.await;
} else {
log::error!(
"No request handler for {}",
request.command
);
}
}
Message::Event(event) => {
let message =
serde_json::to_string(&Message::Event(event)).unwrap();
let mut writer = stdout_writer.lock().await;
writer
.write_all(
TransportDelegate::build_rpc_message(message)
.as_bytes(),
)
.await
.unwrap();
writer.flush().await.unwrap();
}
Message::Response(response) => {
if let Some(handle) = response_handlers
.lock()
.await
.get(response.command.as_str())
{
handle(response);
} else {
log::error!("No response handler for {}", response.command);
}
}
}
}
}
}
})
.detach();
Ok((
TransportPipe::new(Box::new(stdin_writer), Box::new(stdout_reader), None, None),
this,
))
}
fn has_adapter_logs(&self) -> bool {
false
}
async fn kill(&self) -> Result<()> {
Ok(())
}
#[cfg(any(test, feature = "test-support"))]
fn as_fake(&self) -> &FakeTransport {
self
}
}
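For reference, the framing that `receive_server_message` parses (and `build_rpc_message` produces) is DAP's HTTP-like base protocol: a `Content-Length: N` header, a blank `\r\n` line, then N bytes of JSON. A standalone sketch of that framing with hypothetical helper names (ASCII bodies assumed, so byte length equals char length):

```rust
// Frame a JSON body with DAP's Content-Length header (sketch, not the crate's API).
fn frame(body: &str) -> String {
    format!("Content-Length: {}\r\n\r\n{}", body.len(), body)
}

// Recover the body from a framed message, honoring the declared length.
fn unframe(raw: &str) -> Option<&str> {
    let (headers, rest) = raw.split_once("\r\n\r\n")?;
    let len: usize = headers
        .lines()
        .find_map(|line| line.strip_prefix("Content-Length: "))?
        .trim()
        .parse()
        .ok()?;
    rest.get(..len)
}

fn main() {
    let body = r#"{"seq":1,"type":"request","command":"initialize"}"#;
    let framed = frame(body);
    assert_eq!(unframe(&framed), Some(body));
    println!("{framed}");
}
```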

@@ -0,0 +1,41 @@
[package]
name = "dap_adapters"
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "GPL-3.0-or-later"
[features]
test-support = [
"dap/test-support",
"gpui/test-support",
"task/test-support",
"util/test-support",
]
[lints]
workspace = true
[lib]
path = "src/dap_adapters.rs"
doctest = false
[dependencies]
anyhow.workspace = true
async-trait.workspace = true
dap.workspace = true
gpui.workspace = true
language.workspace = true
paths.workspace = true
regex.workspace = true
serde.workspace = true
serde_json.workspace = true
sysinfo.workspace = true
task.workspace = true
util.workspace = true
[dev-dependencies]
dap = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, features = ["test-support"] }
task = { workspace = true, features = ["test-support"] }
util = { workspace = true, features = ["test-support"] }

@@ -0,0 +1 @@
../../LICENSE-GPL

@@ -0,0 +1,82 @@
use std::{ffi::OsString, path::PathBuf, sync::Arc};
use dap::transport::{StdioTransport, TcpTransport};
use gpui::AsyncApp;
use serde_json::Value;
use task::DebugAdapterConfig;
use crate::*;
pub(crate) struct CustomDebugAdapter {
custom_args: CustomArgs,
}
impl CustomDebugAdapter {
const ADAPTER_NAME: &'static str = "custom_dap";
pub(crate) async fn new(custom_args: CustomArgs) -> Result<Self> {
Ok(CustomDebugAdapter { custom_args })
}
}
#[async_trait(?Send)]
impl DebugAdapter for CustomDebugAdapter {
fn name(&self) -> DebugAdapterName {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
async fn get_binary(
&self,
_: &dyn DapDelegate,
config: &DebugAdapterConfig,
_: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
let connection = if let DebugConnectionType::TCP(connection) = &self.custom_args.connection
{
Some(adapters::TcpArguments {
host: connection.host(),
port: connection.port,
timeout: connection.timeout,
})
} else {
None
};
let ret = DebugAdapterBinary {
command: self.custom_args.command.clone(),
arguments: self
.custom_args
.args
.clone()
.map(|args| args.iter().map(OsString::from).collect()),
cwd: config.cwd.clone(),
envs: self.custom_args.envs.clone(),
#[cfg(any(test, feature = "test-support"))]
is_fake: false,
connection,
};
Ok(ret)
}
async fn fetch_latest_adapter_version(&self, _: &dyn DapDelegate) -> Result<AdapterVersion> {
bail!("Custom debug adapters don't have latest versions")
}
async fn install_binary(&self, _: AdapterVersion, _: &dyn DapDelegate) -> Result<()> {
bail!("Custom debug adapters cannot be installed")
}
async fn get_installed_binary(
&self,
_: &dyn DapDelegate,
_: &DebugAdapterConfig,
_: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
bail!("Custom debug adapters cannot be installed")
}
fn request_args(&self, config: &DebugAdapterConfig) -> Value {
json!({"program": config.program})
}
}

@@ -0,0 +1,49 @@
mod custom;
#[cfg(any(target_arch = "x86", target_arch = "x86_64"))]
mod gdb;
mod go;
mod javascript;
mod lldb;
mod php;
mod python;
use std::sync::Arc;
use anyhow::{anyhow, bail, Result};
use async_trait::async_trait;
use custom::CustomDebugAdapter;
use dap::adapters::{
self, AdapterVersion, DapDelegate, DebugAdapter, DebugAdapterBinary, DebugAdapterName,
GithubRepo,
};
#[cfg(any(target_arch = "x86", target_arch = "x86_64"))]
use gdb::GdbDebugAdapter;
use go::GoDebugAdapter;
use javascript::JsDebugAdapter;
use lldb::LldbDebugAdapter;
use php::PhpDebugAdapter;
use python::PythonDebugAdapter;
use serde_json::{json, Value};
use task::{CustomArgs, DebugAdapterConfig, DebugAdapterKind, DebugConnectionType, TCPHost};
pub async fn build_adapter(kind: &DebugAdapterKind) -> Result<Arc<dyn DebugAdapter>> {
match kind {
DebugAdapterKind::Custom(start_args) => {
Ok(Arc::new(CustomDebugAdapter::new(start_args.clone()).await?))
}
DebugAdapterKind::Python(host) => Ok(Arc::new(PythonDebugAdapter::new(host).await?)),
DebugAdapterKind::Php(host) => Ok(Arc::new(PhpDebugAdapter::new(host.clone()).await?)),
DebugAdapterKind::Javascript(host) => {
Ok(Arc::new(JsDebugAdapter::new(host.clone()).await?))
}
DebugAdapterKind::Lldb => Ok(Arc::new(LldbDebugAdapter::new())),
DebugAdapterKind::Go(host) => Ok(Arc::new(GoDebugAdapter::new(host).await?)),
#[cfg(any(target_arch = "x86", target_arch = "x86_64"))]
DebugAdapterKind::Gdb => Ok(Arc::new(GdbDebugAdapter::new())),
#[cfg(any(test, feature = "test-support"))]
DebugAdapterKind::Fake => Ok(Arc::new(dap::adapters::FakeAdapter::new())),
#[cfg(not(any(test, feature = "test-support")))]
#[allow(unreachable_patterns)]
_ => unreachable!("Fake variant only exists with test-support feature"),
}
}
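`build_adapter` is a straightforward registry: a serializable kind enum dispatched to trait objects behind `Arc<dyn DebugAdapter>`. A trimmed, self-contained sketch of the pattern (the types here are hypothetical stand-ins, not this crate's):

```rust
use std::sync::Arc;

// Hypothetical, trimmed-down version of the kind -> adapter dispatch.
trait Adapter {
    fn name(&self) -> &'static str;
}

struct Lldb;
impl Adapter for Lldb {
    fn name(&self) -> &'static str { "lldb" }
}

struct Gdb;
impl Adapter for Gdb {
    fn name(&self) -> &'static str { "gdb" }
}

enum Kind { Lldb, Gdb }

fn build(kind: &Kind) -> Arc<dyn Adapter> {
    match kind {
        Kind::Lldb => Arc::new(Lldb),
        Kind::Gdb => Arc::new(Gdb),
    }
}

fn main() {
    assert_eq!(build(&Kind::Lldb).name(), "lldb");
    println!("{}", build(&Kind::Gdb).name());
}
```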

@@ -0,0 +1,87 @@
use std::ffi::OsStr;
use anyhow::Result;
use async_trait::async_trait;
use dap::transport::{StdioTransport, Transport};
use gpui::AsyncApp;
use task::DebugAdapterConfig;
use crate::*;
pub(crate) struct GdbDebugAdapter {}
impl GdbDebugAdapter {
const ADAPTER_NAME: &'static str = "gdb";
pub(crate) fn new() -> Self {
GdbDebugAdapter {}
}
}
#[async_trait(?Send)]
impl DebugAdapter for GdbDebugAdapter {
fn name(&self) -> DebugAdapterName {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
fn transport(&self) -> Arc<dyn Transport> {
Arc::new(StdioTransport::new())
}
async fn get_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
user_installed_path: Option<std::path::PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
let user_setting_path = user_installed_path
.filter(|p| p.exists())
.and_then(|p| p.to_str().map(|s| s.to_string()));
// GDB implements DAP natively, so we only need to locate the gdb binary on the user's system.
let gdb_path = delegate
.which(OsStr::new("gdb"))
.and_then(|p| p.to_str().map(|s| s.to_string()))
.ok_or(anyhow!("Could not find gdb in path"));
if gdb_path.is_err() && user_setting_path.is_none() {
bail!("Could not find gdb in PATH and no valid user-provided path was set");
}
let gdb_path = user_setting_path.unwrap_or(gdb_path?);
Ok(DebugAdapterBinary {
command: gdb_path,
arguments: Some(vec!["-i=dap".into()]),
envs: None,
cwd: config.cwd.clone(),
})
}
async fn install_binary(
&self,
_version: AdapterVersion,
_delegate: &dyn DapDelegate,
) -> Result<()> {
unimplemented!("GDB debug adapter cannot be installed by Zed (yet)")
}
async fn fetch_latest_adapter_version(&self, _: &dyn DapDelegate) -> Result<AdapterVersion> {
unimplemented!("Fetch latest GDB version not implemented (yet)")
}
async fn get_installed_binary(
&self,
_: &dyn DapDelegate,
_: &DebugAdapterConfig,
_: Option<std::path::PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
unimplemented!("GDB cannot be installed by Zed (yet)")
}
fn request_args(&self, config: &DebugAdapterConfig) -> Value {
json!({"program": config.program, "cwd": config.cwd})
}
}

@@ -0,0 +1,98 @@
use dap::transport::TcpTransport;
use gpui::AsyncApp;
use std::{ffi::OsStr, net::Ipv4Addr, path::PathBuf, sync::Arc};
use crate::*;
pub(crate) struct GoDebugAdapter {
port: u16,
host: Ipv4Addr,
timeout: Option<u64>,
}
impl GoDebugAdapter {
const ADAPTER_NAME: &'static str = "delve";
pub(crate) async fn new(host: &TCPHost) -> Result<Self> {
Ok(GoDebugAdapter {
port: TcpTransport::port(host).await?,
host: host.host(),
timeout: host.timeout,
})
}
}
#[async_trait(?Send)]
impl DebugAdapter for GoDebugAdapter {
fn name(&self) -> DebugAdapterName {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
async fn get_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
user_installed_path: Option<PathBuf>,
cx: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
self.get_installed_binary(delegate, config, user_installed_path, cx)
.await
}
async fn fetch_latest_adapter_version(
&self,
_delegate: &dyn DapDelegate,
) -> Result<AdapterVersion> {
unimplemented!("This adapter is currently used directly from the user's PATH");
}
async fn install_binary(
&self,
version: AdapterVersion,
delegate: &dyn DapDelegate,
) -> Result<()> {
adapters::download_adapter_from_github(
self.name(),
version,
adapters::DownloadedFileType::Zip,
delegate,
)
.await?;
Ok(())
}
async fn get_installed_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
_: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
let delve_path = delegate
.which(OsStr::new("dlv"))
.and_then(|p| p.to_str().map(|p| p.to_string()))
.ok_or(anyhow!("Dlv not found in path"))?;
Ok(DebugAdapterBinary {
command: delve_path,
arguments: Some(vec![
"dap".into(),
"--listen".into(),
format!("{}:{}", self.host, self.port).into(),
]),
cwd: config.cwd.clone(),
envs: None,
connection: None,
#[cfg(any(test, feature = "test-support"))]
is_fake: false,
})
}
fn request_args(&self, config: &DebugAdapterConfig) -> Value {
json!({
"program": config.program,
"cwd": config.cwd,
"subProcess": true,
})
}
}

@@ -0,0 +1,155 @@
use adapters::latest_github_release;
use dap::transport::TcpTransport;
use gpui::AsyncApp;
use regex::Regex;
use std::{collections::HashMap, net::Ipv4Addr, path::PathBuf, sync::Arc};
use sysinfo::{Pid, Process};
use task::DebugRequestType;
use crate::*;
pub(crate) struct JsDebugAdapter {
port: u16,
host: Ipv4Addr,
timeout: Option<u64>,
}
impl JsDebugAdapter {
const ADAPTER_NAME: &'static str = "vscode-js-debug";
const ADAPTER_PATH: &'static str = "js-debug/src/dapDebugServer.js";
pub(crate) async fn new(host: TCPHost) -> Result<Self> {
Ok(JsDebugAdapter {
host: host.host(),
timeout: host.timeout,
port: TcpTransport::port(&host).await?,
})
}
}
#[async_trait(?Send)]
impl DebugAdapter for JsDebugAdapter {
fn name(&self) -> DebugAdapterName {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
async fn fetch_latest_adapter_version(
&self,
delegate: &dyn DapDelegate,
) -> Result<AdapterVersion> {
let release = latest_github_release(
&format!("{}/{}", "microsoft", Self::ADAPTER_NAME),
true,
false,
delegate.http_client(),
)
.await?;
let asset_name = format!("js-debug-dap-{}.tar.gz", release.tag_name);
Ok(AdapterVersion {
tag_name: release.tag_name,
url: release
.assets
.iter()
.find(|asset| asset.name == asset_name)
.ok_or_else(|| anyhow!("no asset found matching {:?}", asset_name))?
.browser_download_url
.clone(),
})
}
async fn get_installed_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
user_installed_path: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
let adapter_path = if let Some(user_installed_path) = user_installed_path {
user_installed_path
} else {
let adapter_path = paths::debug_adapters_dir().join(self.name());
let file_name_prefix = format!("{}_", self.name());
util::fs::find_file_name_in_dir(adapter_path.as_path(), |file_name| {
file_name.starts_with(&file_name_prefix)
})
.await
.ok_or_else(|| anyhow!("Couldn't find JavaScript dap directory"))?
};
Ok(DebugAdapterBinary {
command: delegate
.node_runtime()
.binary_path()
.await?
.to_string_lossy()
.into_owned(),
arguments: Some(vec![
adapter_path.join(Self::ADAPTER_PATH).into(),
self.port.to_string().into(),
self.host.to_string().into(),
]),
cwd: config.cwd.clone(),
envs: None,
#[cfg(any(test, feature = "test-support"))]
is_fake: false,
connection: Some(adapters::TcpArguments {
host: self.host,
port: Some(self.port),
timeout: self.timeout,
}),
})
}
async fn install_binary(
&self,
version: AdapterVersion,
delegate: &dyn DapDelegate,
) -> Result<()> {
adapters::download_adapter_from_github(
self.name(),
version,
adapters::DownloadedFileType::GzipTar,
delegate,
)
.await?;
Ok(())
}
fn request_args(&self, config: &DebugAdapterConfig) -> Value {
let pid = if let DebugRequestType::Attach(attach_config) = &config.request {
attach_config.process_id
} else {
None
};
json!({
"program": config.program,
"type": "pwa-node",
"request": match config.request {
DebugRequestType::Launch => "launch",
DebugRequestType::Attach(_) => "attach",
},
"processId": pid,
"cwd": config.cwd,
})
}
fn attach_processes<'a>(
&self,
processes: &'a HashMap<Pid, Process>,
) -> Option<Vec<(&'a Pid, &'a Process)>> {
let regex = Regex::new(r"(?i)^(?:node|bun|iojs)(?:$|\b)").unwrap();
Some(
processes
.iter()
.filter(|(_, process)| regex.is_match(&process.name().to_string_lossy()))
.collect::<Vec<_>>(),
)
}
}

@@ -0,0 +1,110 @@
use std::{collections::HashMap, ffi::OsStr, path::PathBuf, sync::Arc};
use anyhow::Result;
use async_trait::async_trait;
use dap::transport::StdioTransport;
use gpui::AsyncApp;
use sysinfo::{Pid, Process};
use task::{DebugAdapterConfig, DebugRequestType};
use crate::*;
pub(crate) struct LldbDebugAdapter {}
impl LldbDebugAdapter {
const ADAPTER_NAME: &'static str = "lldb";
pub(crate) fn new() -> Self {
LldbDebugAdapter {}
}
}
#[async_trait(?Send)]
impl DebugAdapter for LldbDebugAdapter {
fn name(&self) -> DebugAdapterName {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
async fn get_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
user_installed_path: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
let lldb_dap_path = if let Some(user_installed_path) = user_installed_path {
user_installed_path.to_string_lossy().into()
} else if cfg!(target_os = "macos") {
util::command::new_smol_command("xcrun")
.args(&["-f", "lldb-dap"])
.output()
.await
.ok()
.and_then(|output| String::from_utf8(output.stdout).ok())
.map(|path| path.trim().to_string())
.ok_or(anyhow!("Failed to find lldb-dap in user's path"))?
} else {
delegate
.which(OsStr::new("lldb-dap"))
.and_then(|p| p.to_str().map(|s| s.to_string()))
.ok_or(anyhow!("Could not find lldb-dap in path"))?
};
Ok(DebugAdapterBinary {
command: lldb_dap_path,
arguments: None,
envs: None,
cwd: config.cwd.clone(),
connection: None,
#[cfg(any(test, feature = "test-support"))]
is_fake: false,
})
}
async fn install_binary(
&self,
_version: AdapterVersion,
_delegate: &dyn DapDelegate,
) -> Result<()> {
unimplemented!("LLDB debug adapter cannot be installed by Zed (yet)")
}
async fn fetch_latest_adapter_version(&self, _: &dyn DapDelegate) -> Result<AdapterVersion> {
unimplemented!("Fetch latest adapter version not implemented for lldb (yet)")
}
async fn get_installed_binary(
&self,
_: &dyn DapDelegate,
_: &DebugAdapterConfig,
_: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
unimplemented!("LLDB debug adapter cannot be installed by Zed (yet)")
}
fn request_args(&self, config: &DebugAdapterConfig) -> Value {
let pid = if let DebugRequestType::Attach(attach_config) = &config.request {
attach_config.process_id
} else {
None
};
json!({
"program": config.program,
"request": match config.request {
DebugRequestType::Launch => "launch",
DebugRequestType::Attach(_) => "attach",
},
"pid": pid,
"cwd": config.cwd,
})
}
fn attach_processes<'a>(
&self,
processes: &'a HashMap<Pid, Process>,
) -> Option<Vec<(&'a Pid, &'a Process)>> {
Some(processes.iter().collect::<Vec<_>>())
}
}

View File

@@ -0,0 +1,125 @@
use adapters::latest_github_release;
use dap::{adapters::TcpArguments, transport::TcpTransport};
use gpui::AsyncApp;
use std::{net::Ipv4Addr, path::PathBuf, sync::Arc};
use crate::*;
pub(crate) struct PhpDebugAdapter {
port: u16,
host: Ipv4Addr,
timeout: Option<u64>,
}
impl PhpDebugAdapter {
const ADAPTER_NAME: &'static str = "vscode-php-debug";
const ADAPTER_PATH: &'static str = "extension/out/phpDebug.js";
pub(crate) async fn new(host: TCPHost) -> Result<Self> {
Ok(PhpDebugAdapter {
port: TcpTransport::port(&host).await?,
host: host.host(),
timeout: host.timeout,
})
}
}
#[async_trait(?Send)]
impl DebugAdapter for PhpDebugAdapter {
fn name(&self) -> DebugAdapterName {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
async fn fetch_latest_adapter_version(
&self,
delegate: &dyn DapDelegate,
) -> Result<AdapterVersion> {
let release = latest_github_release(
&format!("{}/{}", "xdebug", Self::ADAPTER_NAME),
true,
false,
delegate.http_client(),
)
.await?;
let asset_name = format!("php-debug-{}.vsix", release.tag_name.replace("v", ""));
Ok(AdapterVersion {
tag_name: release.tag_name,
url: release
.assets
.iter()
.find(|asset| asset.name == asset_name)
.ok_or_else(|| anyhow!("no asset found matching {:?}", asset_name))?
.browser_download_url
.clone(),
})
}
async fn get_installed_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
user_installed_path: Option<PathBuf>,
_: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
let adapter_path = if let Some(user_installed_path) = user_installed_path {
user_installed_path
} else {
let adapter_path = paths::debug_adapters_dir().join(self.name());
let file_name_prefix = format!("{}_", self.name());
util::fs::find_file_name_in_dir(adapter_path.as_path(), |file_name| {
file_name.starts_with(&file_name_prefix)
})
.await
.ok_or_else(|| anyhow!("Couldn't find PHP dap directory"))?
};
Ok(DebugAdapterBinary {
command: delegate
.node_runtime()
.binary_path()
.await?
.to_string_lossy()
.into_owned(),
arguments: Some(vec![
adapter_path.join(Self::ADAPTER_PATH).into(),
format!("--server={}", self.port).into(),
]),
connection: Some(TcpArguments {
port: Some(self.port),
host: Ipv4Addr::LOCALHOST,
timeout: None,
}),
cwd: config.cwd.clone(),
envs: None,
#[cfg(any(test, feature = "test-support"))]
is_fake: false,
})
}
async fn install_binary(
&self,
version: AdapterVersion,
delegate: &dyn DapDelegate,
) -> Result<()> {
adapters::download_adapter_from_github(
self.name(),
version,
adapters::DownloadedFileType::Vsix,
delegate,
)
.await?;
Ok(())
}
fn request_args(&self, config: &DebugAdapterConfig) -> Value {
json!({
"program": config.program,
"cwd": config.cwd,
})
}
}

@@ -0,0 +1,143 @@
use crate::*;
use dap::transport::TcpTransport;
use gpui::AsyncApp;
use std::{ffi::OsStr, net::Ipv4Addr, path::PathBuf, sync::Arc};
pub(crate) struct PythonDebugAdapter {
port: u16,
host: Ipv4Addr,
timeout: Option<u64>,
}
impl PythonDebugAdapter {
const ADAPTER_NAME: &'static str = "debugpy";
const ADAPTER_PATH: &'static str = "src/debugpy/adapter";
const LANGUAGE_NAME: &'static str = "Python";
pub(crate) async fn new(host: &TCPHost) -> Result<Self> {
Ok(PythonDebugAdapter {
port: TcpTransport::port(host).await?,
host: host.host(),
timeout: host.timeout,
})
}
}
#[async_trait(?Send)]
impl DebugAdapter for PythonDebugAdapter {
fn name(&self) -> DebugAdapterName {
DebugAdapterName(Self::ADAPTER_NAME.into())
}
async fn fetch_latest_adapter_version(
&self,
delegate: &dyn DapDelegate,
) -> Result<AdapterVersion> {
let github_repo = GithubRepo {
repo_name: Self::ADAPTER_NAME.into(),
repo_owner: "microsoft".into(),
};
adapters::fetch_latest_adapter_version_from_github(github_repo, delegate).await
}
async fn install_binary(
&self,
version: AdapterVersion,
delegate: &dyn DapDelegate,
) -> Result<()> {
let version_path = adapters::download_adapter_from_github(
self.name(),
version,
adapters::DownloadedFileType::Zip,
delegate,
)
.await?;
// only needed when you install the latest version for the first time
if let Some(debugpy_dir) =
util::fs::find_file_name_in_dir(version_path.as_path(), |file_name| {
file_name.starts_with("microsoft-debugpy-")
})
.await
{
// TODO Debugger: Rename the folder instead of moving all files to another folder;
// we're doing unnecessary IO work right now
util::fs::move_folder_files_to_folder(debugpy_dir.as_path(), version_path.as_path())
.await?;
}
Ok(())
}
async fn get_installed_binary(
&self,
delegate: &dyn DapDelegate,
config: &DebugAdapterConfig,
user_installed_path: Option<PathBuf>,
cx: &mut AsyncApp,
) -> Result<DebugAdapterBinary> {
const BINARY_NAMES: [&str; 3] = ["python3", "python", "py"];
let debugpy_dir = if let Some(user_installed_path) = user_installed_path {
user_installed_path
} else {
let adapter_path = paths::debug_adapters_dir().join(self.name());
let file_name_prefix = format!("{}_", self.name());
util::fs::find_file_name_in_dir(adapter_path.as_path(), |file_name| {
file_name.starts_with(&file_name_prefix)
})
.await
.ok_or_else(|| anyhow!("Debugpy directory not found"))?
};
let toolchain = delegate
.toolchain_store()
.active_toolchain(
delegate.worktree_id(),
language::LanguageName::new(Self::LANGUAGE_NAME),
cx,
)
.await;
let python_path = if let Some(toolchain) = toolchain {
Some(toolchain.path.to_string())
} else {
BINARY_NAMES
.iter()
.filter_map(|cmd| {
delegate
.which(OsStr::new(cmd))
.map(|path| path.to_string_lossy().to_string())
})
.next()
};
Ok(DebugAdapterBinary {
command: python_path.ok_or(anyhow!("failed to find binary path for python"))?,
arguments: Some(vec![
debugpy_dir.join(Self::ADAPTER_PATH).into(),
format!("--port={}", self.port).into(),
format!("--host={}", self.host).into(),
]),
connection: Some(adapters::TcpArguments {
host: self.host,
port: Some(self.port),
timeout: self.timeout,
}),
cwd: config.cwd.clone(),
envs: None,
#[cfg(any(test, feature = "test-support"))]
is_fake: false,
})
}
fn request_args(&self, config: &DebugAdapterConfig) -> Value {
json!({
"program": config.program,
"subProcess": true,
"cwd": config.cwd,
})
}
}
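`PythonDebugAdapter::get_installed_binary` falls back from the active toolchain to scanning PATH for the first of several interpreter names. A std-only sketch of that `which`-style fallback (helper names are illustrative; the crate delegates the actual lookup to `DapDelegate::which`):

```rust
use std::{env, path::PathBuf};

// Return the first file on PATH with this name (sketch; ignores executable bits).
fn which(cmd: &str) -> Option<PathBuf> {
    let path = env::var_os("PATH")?;
    env::split_paths(&path)
        .map(|dir| dir.join(cmd))
        .find(|candidate| candidate.is_file())
}

// Try several interpreter names and take the first hit, as the adapter does.
fn find_python() -> Option<PathBuf> {
    ["python3", "python", "py"].iter().find_map(|cmd| which(cmd))
}

fn main() {
    match find_python() {
        Some(path) => println!("found: {}", path.display()),
        None => println!("no python on PATH"),
    }
}
```

On Windows the real lookup also considers `PATHEXT` suffixes such as `.exe`; this sketch ignores that.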

@@ -0,0 +1,26 @@
[package]
name = "debugger_tools"
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "GPL-3.0-or-later"
[lints]
workspace = true
[lib]
path = "src/debugger_tools.rs"
doctest = false
[dependencies]
anyhow.workspace = true
dap.workspace = true
editor.workspace = true
futures.workspace = true
gpui.workspace = true
project.workspace = true
serde_json.workspace = true
settings.workspace = true
smol.workspace = true
util.workspace = true
workspace.workspace = true

@@ -0,0 +1 @@
../../LICENSE-GPL

@@ -0,0 +1,846 @@
use dap::{
client::{DebugAdapterClient, DebugAdapterClientId},
debugger_settings::DebuggerSettings,
transport::{IoKind, LogKind},
};
use editor::{Editor, EditorEvent};
use futures::{
channel::mpsc::{unbounded, UnboundedSender},
StreamExt,
};
use gpui::{
actions, div, App, AppContext, Context, Empty, Entity, EventEmitter, FocusHandle, Focusable,
IntoElement, ParentElement, Render, SharedString, Styled, Subscription, WeakEntity, Window,
};
use project::{debugger::session::Session, search::SearchQuery, Project};
use settings::Settings as _;
use std::{
borrow::Cow,
collections::{HashMap, VecDeque},
sync::Arc,
};
use util::maybe;
use workspace::{
item::Item,
searchable::{SearchEvent, SearchableItem, SearchableItemHandle},
ui::{h_flex, Button, Clickable, ContextMenu, Label, LabelCommon, PopoverMenu},
ToolbarItemEvent, ToolbarItemView, Workspace,
};
struct DapLogView {
editor: Entity<Editor>,
focus_handle: FocusHandle,
log_store: Entity<LogStore>,
editor_subscriptions: Vec<Subscription>,
current_view: Option<(DebugAdapterClientId, LogKind)>,
project: Entity<Project>,
_subscriptions: Vec<Subscription>,
}
struct LogStore {
projects: HashMap<WeakEntity<Project>, ProjectState>,
debug_clients: HashMap<DebugAdapterClientId, DebugAdapterState>,
rpc_tx: UnboundedSender<(DebugAdapterClientId, IoKind, String)>,
adapter_log_tx: UnboundedSender<(DebugAdapterClientId, IoKind, String)>,
}
struct ProjectState {
_subscriptions: [gpui::Subscription; 2],
}
struct DebugAdapterState {
log_messages: VecDeque<String>,
rpc_messages: RpcMessages,
}
struct RpcMessages {
messages: VecDeque<String>,
last_message_kind: Option<MessageKind>,
}
impl RpcMessages {
const MESSAGE_QUEUE_LIMIT: usize = 255;
fn new() -> Self {
Self {
last_message_kind: None,
messages: VecDeque::with_capacity(Self::MESSAGE_QUEUE_LIMIT),
}
}
}
const SEND: &str = "// Send";
const RECEIVE: &str = "// Receive";
#[derive(Clone, Copy, PartialEq, Eq)]
enum MessageKind {
Send,
Receive,
}
impl MessageKind {
fn label(&self) -> &'static str {
match self {
Self::Send => SEND,
Self::Receive => RECEIVE,
}
}
}
impl DebugAdapterState {
fn new() -> Self {
Self {
log_messages: VecDeque::new(),
rpc_messages: RpcMessages::new(),
}
}
}
impl LogStore {
fn new(cx: &Context<Self>) -> Self {
let (rpc_tx, mut rpc_rx) = unbounded::<(DebugAdapterClientId, IoKind, String)>();
cx.spawn(|this, mut cx| async move {
while let Some((client_id, io_kind, message)) = rpc_rx.next().await {
if let Some(this) = this.upgrade() {
this.update(&mut cx, |this, cx| {
this.on_rpc_log(client_id, io_kind, &message, cx);
})?;
}
smol::future::yield_now().await;
}
anyhow::Ok(())
})
.detach_and_log_err(cx);
let (adapter_log_tx, mut adapter_log_rx) =
unbounded::<(DebugAdapterClientId, IoKind, String)>();
cx.spawn(|this, mut cx| async move {
while let Some((client_id, io_kind, message)) = adapter_log_rx.next().await {
if let Some(this) = this.upgrade() {
this.update(&mut cx, |this, cx| {
this.on_adapter_log(client_id, io_kind, &message, cx);
})?;
}
smol::future::yield_now().await;
}
anyhow::Ok(())
})
.detach_and_log_err(cx);
Self {
rpc_tx,
adapter_log_tx,
projects: HashMap::new(),
debug_clients: HashMap::new(),
}
}
fn on_rpc_log(
&mut self,
client_id: DebugAdapterClientId,
io_kind: IoKind,
message: &str,
cx: &mut Context<Self>,
) {
self.add_debug_client_message(client_id, io_kind, message.to_string(), cx);
}
fn on_adapter_log(
&mut self,
client_id: DebugAdapterClientId,
io_kind: IoKind,
message: &str,
cx: &mut Context<Self>,
) {
self.add_debug_client_log(client_id, io_kind, message.to_string(), cx);
}
pub fn add_project(&mut self, project: &Entity<Project>, cx: &mut Context<Self>) {
let weak_project = project.downgrade();
self.projects.insert(
project.downgrade(),
ProjectState {
_subscriptions: [
cx.observe_release(project, move |this, _, _| {
this.projects.remove(&weak_project);
}),
cx.subscribe(project, |this, project, event, cx| match event {
project::Event::DebugClientStarted(client_id) => {
let client = project.update(cx, |project, cx| {
project.dap_store().update(cx, |store, cx| {
store.client_by_id(client_id)
})
});
if let Some(client) = client {
this.add_debug_client(*client_id, client, cx);
}
}
project::Event::DebugClientShutdown(client_id) => {
this.remove_debug_client(*client_id, cx);
}
_ => {}
}),
],
},
);
}
fn get_debug_adapter_state(
&mut self,
id: DebugAdapterClientId,
) -> Option<&mut DebugAdapterState> {
self.debug_clients.get_mut(&id)
}
fn add_debug_client_message(
&mut self,
id: DebugAdapterClientId,
io_kind: IoKind,
message: String,
cx: &mut Context<Self>,
) {
let Some(debug_client_state) = self.get_debug_adapter_state(id) else {
return;
};
let kind = match io_kind {
IoKind::StdOut | IoKind::StdErr => MessageKind::Receive,
IoKind::StdIn => MessageKind::Send,
};
let rpc_messages = &mut debug_client_state.rpc_messages;
if rpc_messages.last_message_kind != Some(kind) {
Self::add_debug_client_entry(
&mut rpc_messages.messages,
id,
kind.label().to_string(),
LogKind::Rpc,
cx,
);
rpc_messages.last_message_kind = Some(kind);
}
Self::add_debug_client_entry(&mut rpc_messages.messages, id, message, LogKind::Rpc, cx);
cx.notify();
}
fn add_debug_client_log(
&mut self,
id: DebugAdapterClientId,
io_kind: IoKind,
message: String,
cx: &mut Context<Self>,
) {
let Some(debug_client_state) = self.get_debug_adapter_state(id) else {
return;
};
let message = match io_kind {
IoKind::StdErr => {
let mut message = message.clone();
message.insert_str(0, "stderr: ");
message
}
_ => message,
};
Self::add_debug_client_entry(
&mut debug_client_state.log_messages,
id,
message,
LogKind::Adapter,
cx,
);
cx.notify();
}
fn add_debug_client_entry(
log_lines: &mut VecDeque<String>,
id: DebugAdapterClientId,
message: String,
kind: LogKind,
cx: &mut Context<Self>,
) {
while log_lines.len() >= RpcMessages::MESSAGE_QUEUE_LIMIT {
log_lines.pop_front();
}
let format_messages = DebuggerSettings::get_global(cx).format_dap_log_messages;
let entry = if format_messages {
maybe!({
serde_json::to_string_pretty::<serde_json::Value>(
&serde_json::from_str(&message).ok()?,
)
.ok()
})
.unwrap_or(message)
} else {
message
};
log_lines.push_back(entry.clone());
cx.emit(Event::NewLogEntry { id, entry, kind });
}
fn add_debug_client(
&mut self,
client_id: DebugAdapterClientId,
client: Entity<Session>,
cx: &App,
) -> Option<&mut DebugAdapterState> {
let client_state = self
.debug_clients
.entry(client_id)
.or_insert_with(DebugAdapterState::new);
let io_tx = self.rpc_tx.clone();
let client = client.read(cx).adapter_client()?;
client.add_log_handler(
move |io_kind, message| {
io_tx
.unbounded_send((client_id, io_kind, message.to_string()))
.ok();
},
LogKind::Rpc,
);
let log_io_tx = self.adapter_log_tx.clone();
client.add_log_handler(
move |io_kind, message| {
log_io_tx
.unbounded_send((client_id, io_kind, message.to_string()))
.ok();
},
LogKind::Adapter,
);
Some(client_state)
}
fn remove_debug_client(&mut self, client_id: DebugAdapterClientId, cx: &mut Context<Self>) {
self.debug_clients.remove(&client_id);
cx.notify();
}
fn log_messages_for_client(
&mut self,
client_id: DebugAdapterClientId,
) -> Option<&mut VecDeque<String>> {
Some(&mut self.debug_clients.get_mut(&client_id)?.log_messages)
}
fn rpc_messages_for_client(
&mut self,
client_id: DebugAdapterClientId,
) -> Option<&mut VecDeque<String>> {
Some(
&mut self
.debug_clients
.get_mut(&client_id)?
.rpc_messages
.messages,
)
}
}
pub struct DapLogToolbarItemView {
log_view: Option<Entity<DapLogView>>,
}
impl DapLogToolbarItemView {
pub fn new() -> Self {
Self { log_view: None }
}
}
impl Render for DapLogToolbarItemView {
fn render(&mut self, _window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let Some(log_view) = self.log_view.clone() else {
return Empty.into_any_element();
};
let (menu_rows, current_client_id) = log_view.update(cx, |log_view, cx| {
(
log_view.menu_items(cx).unwrap_or_default(),
log_view.current_view.map(|(client_id, _)| client_id),
)
});
let current_client = current_client_id.and_then(|current_client_id| {
menu_rows
.iter()
.find(|row| row.client_id == current_client_id)
});
let dap_menu: PopoverMenu<_> = PopoverMenu::new("DapLogView")
.anchor(gpui::Corner::TopLeft)
.trigger(Button::new(
"debug_client_menu_header",
current_client
.map(|sub_item| {
Cow::Owned(format!(
"{} ({}) - {}",
sub_item.client_name,
sub_item.client_id.0,
match sub_item.selected_entry {
LogKind::Adapter => ADAPTER_LOGS,
LogKind::Rpc => RPC_MESSAGES,
}
))
})
.unwrap_or_else(|| "No adapter selected".into()),
))
.menu(move |mut window, cx| {
let log_view = log_view.clone();
let menu_rows = menu_rows.clone();
ContextMenu::build(&mut window, cx, move |mut menu, window, _cx| {
for row in menu_rows.into_iter() {
menu = menu.custom_row(move |_window, _cx| {
div()
.w_full()
.pl_2()
.child(
Label::new(
format!("{}. {}", row.client_id.0, row.client_name,),
)
.color(workspace::ui::Color::Muted),
)
.into_any_element()
});
if row.has_adapter_logs {
menu = menu.custom_entry(
move |_window, _cx| {
div()
.w_full()
.pl_4()
.child(Label::new(ADAPTER_LOGS))
.into_any_element()
},
window.handler_for(&log_view, move |view, window, cx| {
view.show_log_messages_for_adapter(row.client_id, window, cx);
}),
);
}
menu = menu.custom_entry(
move |_window, _cx| {
div()
.w_full()
.pl_4()
.child(Label::new(RPC_MESSAGES))
.into_any_element()
},
window.handler_for(&log_view, move |view, window, cx| {
view.show_rpc_trace_for_server(row.client_id, window, cx);
}),
);
}
menu
})
.into()
});
h_flex()
.size_full()
.child(dap_menu)
.child(
div()
.child(
Button::new("clear_log_button", "Clear").on_click(cx.listener(
|this, _, window, cx| {
if let Some(log_view) = this.log_view.as_ref() {
log_view.update(cx, |log_view, cx| {
log_view.editor.update(cx, |editor, cx| {
editor.set_read_only(false);
editor.clear(window, cx);
editor.set_read_only(true);
});
})
}
},
)),
)
.ml_2(),
)
.into_any_element()
}
}
impl EventEmitter<ToolbarItemEvent> for DapLogToolbarItemView {}
impl ToolbarItemView for DapLogToolbarItemView {
fn set_active_pane_item(
&mut self,
active_pane_item: Option<&dyn workspace::item::ItemHandle>,
_window: &mut Window,
cx: &mut Context<Self>,
) -> workspace::ToolbarItemLocation {
if let Some(item) = active_pane_item {
if let Some(log_view) = item.downcast::<DapLogView>() {
self.log_view = Some(log_view.clone());
return workspace::ToolbarItemLocation::PrimaryLeft;
}
}
self.log_view = None;
cx.notify();
workspace::ToolbarItemLocation::Hidden
}
}
impl DapLogView {
pub fn new(
project: Entity<Project>,
log_store: Entity<LogStore>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let (editor, editor_subscriptions) = Self::editor_for_logs(String::new(), window, cx);
let focus_handle = cx.focus_handle();
let events_subscriptions = cx.subscribe(&log_store, |log_view, _, event, cx| match event {
Event::NewLogEntry { id, entry, kind } => {
if log_view.current_view == Some((*id, *kind)) {
log_view.editor.update(cx, |editor, cx| {
editor.set_read_only(false);
let last_point = editor.buffer().read(cx).len(cx);
editor.edit(
vec![
(last_point..last_point, entry.trim()),
(last_point..last_point, "\n"),
],
cx,
);
editor.set_read_only(true);
});
}
}
});
Self {
editor,
focus_handle,
project,
log_store,
editor_subscriptions,
current_view: None,
_subscriptions: vec![events_subscriptions],
}
}
fn editor_for_logs(
log_contents: String,
window: &mut Window,
cx: &mut Context<Self>,
) -> (Entity<Editor>, Vec<Subscription>) {
let editor = cx.new(|cx| {
let mut editor = Editor::multi_line(window, cx);
editor.set_text(log_contents, window, cx);
editor.move_to_end(&editor::actions::MoveToEnd, window, cx);
editor.set_read_only(true);
editor.set_show_edit_predictions(Some(false), window, cx);
editor
});
let editor_subscription = cx.subscribe(
&editor,
|_, _, event: &EditorEvent, cx: &mut Context<'_, DapLogView>| cx.emit(event.clone()),
);
let search_subscription = cx.subscribe(
&editor,
|_, _, event: &SearchEvent, cx: &mut Context<'_, DapLogView>| cx.emit(event.clone()),
);
(editor, vec![editor_subscription, search_subscription])
}
fn menu_items(&self, cx: &App) -> Option<Vec<DapMenuItem>> {
let mut menu_items = self
.project
.read(cx)
.dap_store()
.read(cx)
.clients()
.filter_map(|client| {
let client = client.read(cx).adapter_client()?;
Some(DapMenuItem {
client_id: client.id(),
client_name: unimplemented!(),
has_adapter_logs: client.has_adapter_logs(),
selected_entry: self.current_view.map_or(LogKind::Adapter, |(_, kind)| kind),
})
})
.collect::<Vec<_>>();
menu_items.sort_by_key(|item| item.client_id.0);
Some(menu_items)
}
fn show_rpc_trace_for_server(
&mut self,
client_id: DebugAdapterClientId,
window: &mut Window,
cx: &mut Context<Self>,
) {
let rpc_log = self.log_store.update(cx, |log_store, _| {
log_store
.rpc_messages_for_client(client_id)
.map(|state| log_contents(&state))
});
if let Some(rpc_log) = rpc_log {
self.current_view = Some((client_id, LogKind::Rpc));
let (editor, editor_subscriptions) = Self::editor_for_logs(rpc_log, window, cx);
let language = self.project.read(cx).languages().language_for_name("JSON");
editor
.read(cx)
.buffer()
.read(cx)
.as_singleton()
.expect("log buffer should be a singleton")
.update(cx, |_, cx| {
cx.spawn({
let buffer = cx.entity();
|_, mut cx| async move {
let language = language.await.ok();
buffer.update(&mut cx, |buffer, cx| {
buffer.set_language(language, cx);
})
}
})
.detach_and_log_err(cx);
});
self.editor = editor;
self.editor_subscriptions = editor_subscriptions;
cx.notify();
}
cx.focus_self(window);
}
fn show_log_messages_for_adapter(
&mut self,
client_id: DebugAdapterClientId,
window: &mut Window,
cx: &mut Context<Self>,
) {
let message_log = self.log_store.update(cx, |log_store, _| {
log_store
.log_messages_for_client(client_id)
.map(|state| log_contents(&state))
});
if let Some(message_log) = message_log {
self.current_view = Some((client_id, LogKind::Adapter));
let (editor, editor_subscriptions) = Self::editor_for_logs(message_log, window, cx);
editor
.read(cx)
.buffer()
.read(cx)
.as_singleton()
.expect("log buffer should be a singleton");
self.editor = editor;
self.editor_subscriptions = editor_subscriptions;
cx.notify();
}
cx.focus_self(window);
}
}
fn log_contents(lines: &VecDeque<String>) -> String {
let (a, b) = lines.as_slices();
let a = a.iter().map(move |v| v.as_ref());
let b = b.iter().map(move |v| v.as_ref());
a.chain(b).fold(String::new(), |mut acc, el| {
acc.push_str(el);
acc.push('\n');
acc
})
}
#[derive(Clone, PartialEq)]
pub(crate) struct DapMenuItem {
pub client_id: DebugAdapterClientId,
pub client_name: String,
pub has_adapter_logs: bool,
pub selected_entry: LogKind,
}
const ADAPTER_LOGS: &str = "Adapter Logs";
const RPC_MESSAGES: &str = "RPC Messages";
impl Render for DapLogView {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
self.editor.update(cx, |editor, cx| {
editor.render(window, cx).into_any_element()
})
}
}
actions!(debug, [OpenDebuggerAdapterLogs]);
pub fn init(cx: &mut App) {
let log_store = cx.new(|cx| LogStore::new(cx));
cx.observe_new(move |workspace: &mut Workspace, window, cx| {
let Some(_window) = window else {
return;
};
let project = workspace.project();
if project.read(cx).is_local() {
log_store.update(cx, |store, cx| {
store.add_project(project, cx);
});
}
let log_store = log_store.clone();
workspace.register_action(move |workspace, _: &OpenDebuggerAdapterLogs, window, cx| {
let project = workspace.project().read(cx);
if project.is_local() {
workspace.add_item_to_active_pane(
Box::new(cx.new(|cx| {
DapLogView::new(workspace.project().clone(), log_store.clone(), window, cx)
})),
None,
true,
window,
cx,
);
}
});
})
.detach();
}
impl Item for DapLogView {
type Event = EditorEvent;
fn to_item_events(event: &Self::Event, f: impl FnMut(workspace::item::ItemEvent)) {
Editor::to_item_events(event, f)
}
fn tab_content_text(&self, _window: &Window, _cx: &App) -> Option<SharedString> {
Some("DAP Logs".into())
}
fn telemetry_event_text(&self) -> Option<&'static str> {
None
}
fn as_searchable(&self, handle: &Entity<Self>) -> Option<Box<dyn SearchableItemHandle>> {
Some(Box::new(handle.clone()))
}
}
impl SearchableItem for DapLogView {
type Match = <Editor as SearchableItem>::Match;
fn clear_matches(&mut self, window: &mut Window, cx: &mut Context<Self>) {
self.editor.update(cx, |e, cx| e.clear_matches(window, cx))
}
fn update_matches(
&mut self,
matches: &[Self::Match],
window: &mut Window,
cx: &mut Context<Self>,
) {
self.editor
.update(cx, |e, cx| e.update_matches(matches, window, cx))
}
fn query_suggestion(&mut self, window: &mut Window, cx: &mut Context<Self>) -> String {
self.editor
.update(cx, |e, cx| e.query_suggestion(window, cx))
}
fn activate_match(
&mut self,
index: usize,
matches: &[Self::Match],
window: &mut Window,
cx: &mut Context<Self>,
) {
self.editor
.update(cx, |e, cx| e.activate_match(index, matches, window, cx))
}
fn select_matches(
&mut self,
matches: &[Self::Match],
window: &mut Window,
cx: &mut Context<Self>,
) {
self.editor
.update(cx, |e, cx| e.select_matches(matches, window, cx))
}
fn find_matches(
&mut self,
query: Arc<project::search::SearchQuery>,
window: &mut Window,
cx: &mut Context<Self>,
) -> gpui::Task<Vec<Self::Match>> {
self.editor
.update(cx, |e, cx| e.find_matches(query, window, cx))
}
fn replace(
&mut self,
_: &Self::Match,
_: &SearchQuery,
_window: &mut Window,
_: &mut Context<Self>,
) {
// Since the DAP log is read-only, it doesn't make sense to support the replace operation.
}
fn supported_options(&self) -> workspace::searchable::SearchOptions {
workspace::searchable::SearchOptions {
case: true,
word: true,
regex: true,
find_in_results: true,
// DAP log is read-only.
replacement: false,
selection: false,
}
}
fn active_match_index(
&mut self,
matches: &[Self::Match],
window: &mut Window,
cx: &mut Context<Self>,
) -> Option<usize> {
self.editor
.update(cx, |e, cx| e.active_match_index(matches, window, cx))
}
}
impl Focusable for DapLogView {
fn focus_handle(&self, _cx: &App) -> FocusHandle {
self.focus_handle.clone()
}
}
pub enum Event {
NewLogEntry {
id: DebugAdapterClientId,
entry: String,
kind: LogKind,
},
}
impl EventEmitter<Event> for LogStore {}
impl EventEmitter<Event> for DapLogView {}
impl EventEmitter<EditorEvent> for DapLogView {}
impl EventEmitter<SearchEvent> for DapLogView {}

View File

@@ -0,0 +1,8 @@
mod dap_log;
pub use dap_log::*;
use gpui::App;
pub fn init(cx: &mut App) {
dap_log::init(cx);
}

View File

@@ -0,0 +1,57 @@
[package]
name = "debugger_ui"
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "GPL-3.0-or-later"
[lints]
workspace = true
[features]
test-support = [
"dap/test-support",
"editor/test-support",
"gpui/test-support",
"project/test-support",
"util/test-support",
"workspace/test-support",
]
[dependencies]
anyhow.workspace = true
client.workspace = true
collections.workspace = true
command_palette_hooks.workspace = true
dap.workspace = true
editor.workspace = true
fuzzy.workspace = true
gpui.workspace = true
language.workspace = true
menu.workspace = true
picker.workspace = true
pretty_assertions.workspace = true
project.workspace = true
rpc.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
sum_tree.workspace = true
sysinfo.workspace = true
task.workspace = true
tasks_ui.workspace = true
terminal_view.workspace = true
theme.workspace = true
ui.workspace = true
util.workspace = true
workspace.workspace = true
[dev-dependencies]
dap = { workspace = true, features = ["test-support"] }
editor = { workspace = true, features = ["test-support"] }
env_logger.workspace = true
gpui = { workspace = true, features = ["test-support"] }
project = { workspace = true, features = ["test-support"] }
unindent.workspace = true
util = { workspace = true, features = ["test-support"] }
workspace = { workspace = true, features = ["test-support"] }

View File

@@ -0,0 +1 @@
../../LICENSE-GPL

View File

@@ -0,0 +1,305 @@
use dap::client::DebugAdapterClientId;
use fuzzy::{StringMatch, StringMatchCandidate};
use gpui::Subscription;
use gpui::{DismissEvent, Entity, EventEmitter, Focusable, Render};
use picker::{Picker, PickerDelegate};
use project::debugger::dap_store::DapStore;
use std::sync::Arc;
use sysinfo::System;
use ui::{prelude::*, Context, Tooltip};
use ui::{ListItem, ListItemSpacing};
use workspace::ModalView;
#[derive(Debug, Clone)]
struct Candidate {
pid: u32,
name: String,
command: Vec<String>,
}
pub(crate) struct AttachModalDelegate {
selected_index: usize,
matches: Vec<StringMatch>,
session_id: DebugAdapterClientId,
placeholder_text: Arc<str>,
dap_store: Entity<DapStore>,
client_id: DebugAdapterClientId,
candidates: Option<Vec<Candidate>>,
}
impl AttachModalDelegate {
pub fn new(
session_id: DebugAdapterClientId,
client_id: DebugAdapterClientId,
dap_store: Entity<DapStore>,
) -> Self {
Self {
client_id,
dap_store,
session_id,
candidates: None,
selected_index: 0,
matches: Vec::default(),
placeholder_text: Arc::from("Select the process you want to attach the debugger to"),
}
}
}
pub(crate) struct AttachModal {
_subscription: Subscription,
pub(crate) picker: Entity<Picker<AttachModalDelegate>>,
}
impl AttachModal {
pub fn new(
session_id: &DebugAdapterClientId,
client_id: DebugAdapterClientId,
dap_store: Entity<DapStore>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let picker = cx.new(|cx| {
Picker::uniform_list(
AttachModalDelegate::new(*session_id, client_id, dap_store),
window,
cx,
)
});
let _subscription = cx.subscribe(&picker, |_, _, _, cx| {
cx.emit(DismissEvent);
});
Self {
picker,
_subscription,
}
}
}
impl Render for AttachModal {
fn render(&mut self, _window: &mut Window, _: &mut Context<Self>) -> impl ui::IntoElement {
v_flex()
.key_context("AttachModal")
.w(rems(34.))
.child(self.picker.clone())
}
}
impl EventEmitter<DismissEvent> for AttachModal {}
impl Focusable for AttachModal {
fn focus_handle(&self, cx: &App) -> gpui::FocusHandle {
self.picker.read(cx).focus_handle(cx)
}
}
impl ModalView for AttachModal {}
impl PickerDelegate for AttachModalDelegate {
type ListItem = ListItem;
fn match_count(&self) -> usize {
self.matches.len()
}
fn selected_index(&self) -> usize {
self.selected_index
}
fn set_selected_index(
&mut self,
ix: usize,
_window: &mut Window,
_: &mut Context<Picker<Self>>,
) {
self.selected_index = ix;
}
fn placeholder_text(&self, _window: &mut Window, _cx: &mut App) -> std::sync::Arc<str> {
self.placeholder_text.clone()
}
fn update_matches(
&mut self,
query: String,
_window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> gpui::Task<()> {
cx.spawn(|this, mut cx| async move {
let Some(processes) = this
.update(&mut cx, |this, cx| {
if let Some(processes) = this.delegate.candidates.clone() {
processes
} else {
let Some(client) = this.delegate.dap_store.update(cx, |store, cx| {
store
.client_by_id(&this.delegate.client_id)
.and_then(|client| client.read(cx).adapter_client())
}) else {
return Vec::new();
};
let system = System::new_all();
todo!("client.adapter().attach_processes(&system.processes())");
let processes: Vec<(&sysinfo::Pid, &sysinfo::Process)> = vec![];
let processes = processes
.into_iter()
.map(|(pid, process)| Candidate {
pid: pid.as_u32(),
name: process.name().to_string_lossy().into_owned(),
command: process
.cmd()
.iter()
.map(|s| s.to_string_lossy().to_string())
.collect::<Vec<_>>(),
})
.collect::<Vec<Candidate>>();
let _ = this.delegate.candidates.insert(processes.clone());
processes
}
})
.ok()
else {
return;
};
let matches = fuzzy::match_strings(
&processes
.iter()
.enumerate()
.map(|(id, candidate)| {
StringMatchCandidate::new(
id,
format!(
"{} {} {}",
candidate.command.join(" "),
candidate.pid,
candidate.name
)
.as_str(),
)
})
.collect::<Vec<_>>(),
&query,
true,
100,
&Default::default(),
cx.background_executor().clone(),
)
.await;
this.update(&mut cx, |this, _| {
let delegate = &mut this.delegate;
delegate.matches = matches;
delegate.candidates = Some(processes);
if delegate.matches.is_empty() {
delegate.selected_index = 0;
} else {
delegate.selected_index =
delegate.selected_index.min(delegate.matches.len() - 1);
}
})
.ok();
})
}
fn confirm(&mut self, _: bool, _window: &mut Window, cx: &mut Context<Picker<Self>>) {
let candidate = self
.matches
.get(self.selected_index())
.and_then(|current_match| {
let ix = current_match.candidate_id;
self.candidates.as_ref().map(|candidates| &candidates[ix])
});
let Some(candidate) = candidate else {
return cx.emit(DismissEvent);
};
self.dap_store.update(cx, |store, cx| {
store
.attach(self.client_id, candidate.pid, cx)
.detach_and_log_err(cx);
});
cx.emit(DismissEvent);
}
fn dismissed(&mut self, _window: &mut Window, cx: &mut Context<Picker<Self>>) {
self.selected_index = 0;
self.candidates.take();
self.dap_store.update(cx, |store, cx| {
store.shutdown_client(&self.session_id, cx).detach();
});
cx.emit(DismissEvent);
}
fn render_match(
&self,
ix: usize,
selected: bool,
_window: &mut Window,
_: &mut Context<Picker<Self>>,
) -> Option<Self::ListItem> {
let candidates = self.candidates.as_ref()?;
let hit = &self.matches[ix];
let candidate = candidates.get(hit.candidate_id)?;
Some(
ListItem::new(SharedString::from(format!("process-entry-{ix}")))
.inset(true)
.spacing(ListItemSpacing::Sparse)
.toggle_state(selected)
.child(
v_flex()
.items_start()
.child(Label::new(format!("{} {}", candidate.name, candidate.pid)))
.child(
div()
.id(SharedString::from(format!("process-entry-{ix}-command")))
.tooltip(Tooltip::text(candidate.command.join(" ")))
.child(
Label::new(format!(
"{} {}",
candidate.name,
candidate
.command
.iter()
.skip(1)
.map(String::as_str)
.collect::<Vec<_>>()
.join(" ")
))
.size(LabelSize::Small)
.color(Color::Muted),
),
),
),
)
}
}
#[allow(dead_code)]
#[cfg(any(test, feature = "test-support"))]
pub(crate) fn process_names(modal: &AttachModal, cx: &mut Context<AttachModal>) -> Vec<String> {
modal.picker.update(cx, |picker, _| {
picker
.delegate
.matches
.iter()
.map(|hit| hit.string.clone())
.collect::<Vec<_>>()
})
}

View File

@@ -0,0 +1,329 @@
use crate::{attach_modal::AttachModal, session::DebugSession};
use anyhow::Result;
use collections::{BTreeMap, HashMap};
use command_palette_hooks::CommandPaletteFilter;
use dap::{
client::DebugAdapterClientId,
debugger_settings::DebuggerSettings,
messages::{Events, Message},
requests::{Request, RunInTerminal, StartDebugging},
Capabilities, CapabilitiesEvent, ContinuedEvent, ErrorResponse, ExitedEvent, LoadedSourceEvent,
ModuleEvent, OutputEvent, RunInTerminalRequestArguments, RunInTerminalResponse, StoppedEvent,
TerminatedEvent, ThreadEvent, ThreadEventReason,
};
use gpui::{
actions, Action, App, AsyncWindowContext, Context, Entity, EventEmitter, FocusHandle,
Focusable, Subscription, Task, WeakEntity,
};
use project::{
debugger::{
dap_store::{DapStore, DapStoreEvent},
session::ThreadId,
},
terminals::TerminalKind,
};
use rpc::proto::{self, UpdateDebugAdapter};
use serde_json::Value;
use settings::Settings;
use std::{any::TypeId, collections::VecDeque, path::PathBuf, u64};
use task::DebugRequestType;
use terminal_view::terminal_panel::TerminalPanel;
use ui::prelude::*;
use util::ResultExt as _;
use workspace::{
dock::{DockPosition, Panel, PanelEvent},
pane, Continue, Disconnect, Pane, Pause, Restart, Start, StepBack, StepInto, StepOut, StepOver,
Stop, ToggleIgnoreBreakpoints, Workspace,
};
pub enum DebugPanelEvent {
Exited(DebugAdapterClientId),
Terminated(DebugAdapterClientId),
Stopped {
client_id: DebugAdapterClientId,
event: StoppedEvent,
go_to_stack_frame: bool,
},
Thread((DebugAdapterClientId, ThreadEvent)),
Continued((DebugAdapterClientId, ContinuedEvent)),
Output((DebugAdapterClientId, OutputEvent)),
Module((DebugAdapterClientId, ModuleEvent)),
LoadedSource((DebugAdapterClientId, LoadedSourceEvent)),
ClientShutdown(DebugAdapterClientId),
CapabilitiesChanged(DebugAdapterClientId),
}
actions!(debug_panel, [ToggleFocus]);
pub struct DebugPanel {
size: Pixels,
pane: Entity<Pane>,
workspace: WeakEntity<Workspace>,
_subscriptions: Vec<Subscription>,
}
impl DebugPanel {
pub fn new(
workspace: &Workspace,
window: &mut Window,
cx: &mut Context<Workspace>,
) -> Entity<Self> {
cx.new(|cx| {
let pane = cx.new(|cx| {
let mut pane = Pane::new(
workspace.weak_handle(),
workspace.project().clone(),
Default::default(),
None,
gpui::NoAction.boxed_clone(),
window,
cx,
);
pane.set_can_split(None);
pane.set_can_navigate(true, cx);
pane.display_nav_history_buttons(None);
pane.set_should_display_tab_bar(|_window, _cx| true);
pane.set_close_pane_if_empty(true, cx);
pane.set_render_tab_bar_buttons(cx, |_, _, _| {
(
None,
Some(
h_flex()
.child(
IconButton::new("new-debug-session", IconName::Plus)
.icon_size(IconSize::Small),
)
.into_any_element(),
),
)
});
pane.add_item(
Box::new(DebugSession::inert(cx)),
false,
false,
None,
window,
cx,
);
pane
});
let project = workspace.project().clone();
let _subscriptions = vec![
cx.observe(&pane, |_, _, cx| cx.notify()),
cx.subscribe_in(&pane, window, Self::handle_pane_event),
];
let debug_panel = Self {
pane,
size: px(300.),
_subscriptions,
workspace: workspace.weak_handle(),
};
debug_panel
})
}
pub fn load(
workspace: WeakEntity<Workspace>,
cx: AsyncWindowContext,
) -> Task<Result<Entity<Self>>> {
cx.spawn(|mut cx| async move {
workspace.update_in(&mut cx, |workspace, window, cx| {
let debug_panel = DebugPanel::new(workspace, window, cx);
cx.observe(&debug_panel, |_, debug_panel, cx| {
let (has_active_session, support_step_back) =
debug_panel.update(cx, |this, cx| {
this.active_debug_panel_item(cx)
.map(|item| (true, false))
.unwrap_or((false, false))
});
let filter = CommandPaletteFilter::global_mut(cx);
let debugger_action_types = [
TypeId::of::<Continue>(),
TypeId::of::<StepOver>(),
TypeId::of::<StepInto>(),
TypeId::of::<StepOut>(),
TypeId::of::<Stop>(),
TypeId::of::<Disconnect>(),
TypeId::of::<Pause>(),
TypeId::of::<Restart>(),
TypeId::of::<ToggleIgnoreBreakpoints>(),
];
let step_back_action_type = [TypeId::of::<StepBack>()];
if has_active_session {
filter.show_action_types(debugger_action_types.iter());
if support_step_back {
filter.show_action_types(step_back_action_type.iter());
} else {
filter.hide_action_types(&step_back_action_type);
}
} else {
// show only the `debug: start` action
filter.hide_action_types(&debugger_action_types);
filter.hide_action_types(&step_back_action_type);
}
})
.detach();
debug_panel
})
})
}
#[cfg(any(test, feature = "test-support"))]
pub fn message_queue(&self) -> &HashMap<DebugAdapterClientId, VecDeque<OutputEvent>> {
// &self.message_queue
unimplemented!("Should check session for console messages")
}
#[cfg(any(test, feature = "test-support"))]
pub fn dap_store(&self) -> Entity<DapStore> {
self.dap_store.clone()
}
pub fn active_debug_panel_item(&self, cx: &Context<Self>) -> Option<Entity<DebugSession>> {
self.pane
.read(cx)
.active_item()
.and_then(|panel| panel.downcast::<DebugSession>())
}
pub fn debug_panel_items_by_client(
&self,
client_id: &DebugAdapterClientId,
cx: &Context<Self>,
) -> Vec<Entity<DebugSession>> {
self.pane
.read(cx)
.items()
.filter_map(|item| item.downcast::<DebugSession>())
.filter(|item| item.read(cx).session_id(cx) == Some(*client_id))
.collect()
}
pub fn debug_panel_item_by_client(
&self,
client_id: DebugAdapterClientId,
cx: &mut Context<Self>,
) -> Option<Entity<DebugSession>> {
self.pane
.read(cx)
.items()
.filter_map(|item| item.downcast::<DebugSession>())
.find(|item| {
let item = item.read(cx);
item.session_id(cx) == Some(client_id)
})
}
fn handle_pane_event(
&mut self,
_: &Entity<Pane>,
event: &pane::Event,
window: &mut Window,
cx: &mut Context<Self>,
) {
match event {
pane::Event::Remove { .. } => cx.emit(PanelEvent::Close),
pane::Event::ZoomIn => cx.emit(PanelEvent::ZoomIn),
pane::Event::ZoomOut => cx.emit(PanelEvent::ZoomOut),
pane::Event::AddItem { item } => {
self.workspace
.update(cx, |workspace, cx| {
item.added_to_pane(workspace, self.pane.clone(), window, cx)
})
.ok();
}
_ => {}
}
}
}
impl EventEmitter<PanelEvent> for DebugPanel {}
impl EventEmitter<DebugPanelEvent> for DebugPanel {}
impl EventEmitter<project::Event> for DebugPanel {}
impl Focusable for DebugPanel {
fn focus_handle(&self, cx: &App) -> FocusHandle {
self.pane.focus_handle(cx)
}
}
impl Panel for DebugPanel {
fn pane(&self) -> Option<Entity<Pane>> {
Some(self.pane.clone())
}
fn persistent_name() -> &'static str {
"DebugPanel"
}
fn position(&self, _window: &Window, _cx: &App) -> DockPosition {
DockPosition::Bottom
}
fn position_is_valid(&self, position: DockPosition) -> bool {
position == DockPosition::Bottom
}
fn set_position(
&mut self,
_position: DockPosition,
_window: &mut Window,
_cx: &mut Context<Self>,
) {
}
fn size(&self, _window: &Window, _cx: &App) -> Pixels {
self.size
}
fn set_size(&mut self, size: Option<Pixels>, _window: &mut Window, _cx: &mut Context<Self>) {
self.size = size.unwrap();
}
fn remote_id() -> Option<proto::PanelId> {
Some(proto::PanelId::DebugPanel)
}
fn icon(&self, _window: &Window, _cx: &App) -> Option<IconName> {
Some(IconName::Debug)
}
fn icon_tooltip(&self, _window: &Window, cx: &App) -> Option<&'static str> {
if DebuggerSettings::get_global(cx).button {
Some("Debug Panel")
} else {
None
}
}
fn toggle_action(&self) -> Box<dyn Action> {
Box::new(ToggleFocus)
}
fn activation_priority(&self) -> u32 {
9
}
}
impl Render for DebugPanel {
fn render(&mut self, _window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
v_flex()
.key_context("DebugPanel")
.track_focus(&self.focus_handle(cx))
.size_full()
.child(self.pane.clone())
.into_any()
}
}


@@ -0,0 +1,46 @@
use dap::debugger_settings::DebuggerSettings;
use debugger_panel::{DebugPanel, ToggleFocus};
use gpui::App;
use session::DebugSession;
use settings::Settings;
use workspace::{
Continue, Pause, Restart, ShutdownDebugAdapters, Start, StepBack, StepInto, StepOut, StepOver,
Stop, ToggleIgnoreBreakpoints, Workspace,
};
pub mod attach_modal;
pub mod debugger_panel;
pub mod session;
#[cfg(test)]
mod tests;
pub fn init(cx: &mut App) {
DebuggerSettings::register(cx);
workspace::FollowableViewRegistry::register::<DebugSession>(cx);
cx.observe_new(|workspace: &mut Workspace, window, _cx| {
        if window.is_none() {
            return;
        }
workspace
.register_action(|workspace, _: &ToggleFocus, window, cx| {
workspace.toggle_panel_focus::<DebugPanel>(window, cx);
})
.register_action(|workspace: &mut Workspace, _: &Start, window, cx| {
tasks_ui::toggle_modal(workspace, None, task::TaskModal::DebugModal, window, cx)
.detach();
})
.register_action(
|workspace: &mut Workspace, _: &ShutdownDebugAdapters, _window, cx| {
workspace.project().update(cx, |project, cx| {
project.dap_store().update(cx, |store, cx| {
store.shutdown_clients(cx).detach();
})
})
},
);
})
.detach();
}


@@ -0,0 +1,191 @@
mod inert;
mod running;
mod starting;
use crate::debugger_panel::{DebugPanel, DebugPanelEvent};
use dap::{
client::DebugAdapterClientId, debugger_settings::DebuggerSettings, Capabilities,
ContinuedEvent, LoadedSourceEvent, ModuleEvent, OutputEvent, OutputEventCategory, StoppedEvent,
ThreadEvent,
};
use gpui::{
AnyElement, App, Entity, EventEmitter, FocusHandle, Focusable, Subscription, Task, WeakEntity,
};
use inert::InertState;
use project::debugger::session::Session;
use project::debugger::session::{ThreadId, ThreadStatus};
use rpc::proto::{self, PeerId};
use settings::Settings;
use starting::StartingState;
use ui::{prelude::*, ContextMenu, DropdownMenu, Indicator, Tooltip};
use workspace::{
item::{self, Item, ItemEvent},
FollowableItem, ViewId, Workspace,
};
enum DebugSessionState {
Inert(Entity<InertState>),
Starting(Entity<StartingState>),
Running(Entity<running::RunningState>),
}
pub struct DebugSession {
remote_id: Option<workspace::ViewId>,
mode: DebugSessionState,
}
#[derive(Debug)]
pub enum DebugPanelItemEvent {
Close,
Stopped { go_to_stack_frame: bool },
}
#[derive(Clone, PartialEq, Eq)]
pub enum ThreadItem {
Console,
LoadedSource,
Modules,
Variables,
}
impl ThreadItem {
fn _to_proto(&self) -> proto::DebuggerThreadItem {
match self {
ThreadItem::Console => proto::DebuggerThreadItem::Console,
ThreadItem::LoadedSource => proto::DebuggerThreadItem::LoadedSource,
ThreadItem::Modules => proto::DebuggerThreadItem::Modules,
ThreadItem::Variables => proto::DebuggerThreadItem::Variables,
}
}
fn from_proto(active_thread_item: proto::DebuggerThreadItem) -> Self {
match active_thread_item {
proto::DebuggerThreadItem::Console => ThreadItem::Console,
proto::DebuggerThreadItem::LoadedSource => ThreadItem::LoadedSource,
proto::DebuggerThreadItem::Modules => ThreadItem::Modules,
proto::DebuggerThreadItem::Variables => ThreadItem::Variables,
}
}
}
impl DebugSession {
pub(super) fn inert(cx: &mut App) -> Entity<Self> {
cx.new(|cx| Self {
remote_id: None,
mode: DebugSessionState::Inert(cx.new(|cx| InertState::new(cx))),
})
}
pub(crate) fn session_id(&self, cx: &App) -> Option<DebugAdapterClientId> {
match &self.mode {
DebugSessionState::Inert(_) => None,
DebugSessionState::Starting(_entity) => unimplemented!(),
DebugSessionState::Running(entity) => Some(entity.read(cx).client_id()),
}
}
}
impl EventEmitter<DebugPanelItemEvent> for DebugSession {}
impl Focusable for DebugSession {
fn focus_handle(&self, cx: &App) -> FocusHandle {
match &self.mode {
DebugSessionState::Inert(inert_state) => inert_state.focus_handle(cx),
DebugSessionState::Starting(starting_state) => starting_state.focus_handle(cx),
DebugSessionState::Running(running_state) => running_state.focus_handle(cx),
}
}
}
impl Item for DebugSession {
type Event = DebugPanelItemEvent;
fn tab_content(&self, _: item::TabContentParams, _: &Window, _: &App) -> AnyElement {
let label = match &self.mode {
DebugSessionState::Inert(_) => "New Session",
DebugSessionState::Starting(_) => "Starting",
DebugSessionState::Running(_) => "Running",
};
div().child(Label::new(label)).into_any_element()
}
}
impl FollowableItem for DebugSession {
fn remote_id(&self) -> Option<workspace::ViewId> {
self.remote_id
}
fn to_state_proto(&self, _window: &Window, _cx: &App) -> Option<proto::view::Variant> {
None
}
fn from_state_proto(
_workspace: Entity<Workspace>,
_remote_id: ViewId,
_state: &mut Option<proto::view::Variant>,
_window: &mut Window,
_cx: &mut App,
) -> Option<gpui::Task<gpui::Result<Entity<Self>>>> {
None
}
fn add_event_to_update_proto(
&self,
_event: &Self::Event,
_update: &mut Option<proto::update_view::Variant>,
_window: &Window,
_cx: &App,
) -> bool {
// update.get_or_insert_with(|| proto::update_view::Variant::DebugPanel(Default::default()));
true
}
fn apply_update_proto(
&mut self,
_project: &Entity<project::Project>,
_message: proto::update_view::Variant,
_window: &mut Window,
_cx: &mut Context<Self>,
) -> gpui::Task<gpui::Result<()>> {
Task::ready(Ok(()))
}
fn set_leader_peer_id(
&mut self,
_leader_peer_id: Option<PeerId>,
_window: &mut Window,
_cx: &mut Context<Self>,
) {
}
fn to_follow_event(_event: &Self::Event) -> Option<workspace::item::FollowEvent> {
None
}
fn dedup(&self, existing: &Self, _window: &Window, cx: &App) -> Option<workspace::item::Dedup> {
if existing.session_id(cx) == self.session_id(cx) {
Some(item::Dedup::KeepExisting)
} else {
None
}
}
fn is_project_item(&self, _window: &Window, _cx: &App) -> bool {
true
}
}
impl Render for DebugSession {
fn render(&mut self, window: &mut Window, cx: &mut Context<'_, Self>) -> impl IntoElement {
match &self.mode {
DebugSessionState::Inert(inert_state) => {
inert_state.update(cx, |this, cx| this.render(window, cx).into_any_element())
}
DebugSessionState::Starting(starting_state) => {
starting_state.update(cx, |this, cx| this.render(window, cx).into_any_element())
}
DebugSessionState::Running(running_state) => {
running_state.update(cx, |this, cx| this.render(window, cx).into_any_element())
}
}
}
}


@@ -0,0 +1,44 @@
use gpui::{App, FocusHandle, Focusable};
use ui::{
div, h_flex, v_flex, Context, ContextMenu, DropdownMenu, Element, InteractiveElement,
ParentElement, Render, Styled,
};
pub(super) struct InertState {
focus_handle: FocusHandle,
}
impl InertState {
pub(super) fn new(cx: &mut Context<Self>) -> Self {
Self {
focus_handle: cx.focus_handle(),
}
}
}
impl Focusable for InertState {
    fn focus_handle(&self, _cx: &App) -> FocusHandle {
self.focus_handle.clone()
}
}
impl Render for InertState {
fn render(
&mut self,
window: &mut ui::Window,
cx: &mut ui::Context<'_, Self>,
) -> impl ui::IntoElement {
v_flex()
.track_focus(&self.focus_handle)
.size_full()
.p_1()
.child(h_flex().child(DropdownMenu::new(
"dap-adapter-picker",
"Select Debug Adapter",
ContextMenu::build(window, cx, |this, _, _| {
this.entry("GDB", None, |_, _| {})
.entry("Delve", None, |_, _| {})
.entry("LLDB", None, |_, _| {})
}),
)))
}
}


@@ -0,0 +1,653 @@
mod console;
mod loaded_source_list;
mod module_list;
mod stack_frame_list;
mod variable_list;
use console::Console;
use dap::{
client::DebugAdapterClientId, debugger_settings::DebuggerSettings, Capabilities, ContinuedEvent,
};
use gpui::{
AppContext, Entity, EventEmitter, FocusHandle, Focusable, Subscription, Task, WeakEntity,
};
use loaded_source_list::LoadedSourceList;
use module_list::ModuleList;
use project::debugger::session::{Session, ThreadId, ThreadStatus};
use rpc::proto::{self, ViewId};
use settings::Settings;
use stack_frame_list::{StackFrameList, StackFrameListEvent};
use ui::{
div, h_flex, v_flex, ActiveTheme, AnyElement, App, Button, ButtonCommon, Clickable, Color,
Context, ContextMenu, Disableable, DropdownMenu, Element, FluentBuilder, IconButton, IconName,
IconSize, Indicator, InteractiveElement, IntoElement, Label, LabelCommon, ParentElement,
Render, SharedString, StatefulInteractiveElement, Styled, Tooltip, Window,
};
use variable_list::VariableList;
use workspace::{
item::{self, ItemEvent},
FollowableItem, Item, Workspace,
};
use crate::debugger_panel::{DebugPanel, DebugPanelEvent};
use super::{DebugPanelItemEvent, ThreadItem};
pub struct RunningState {
session: Entity<Session>,
thread_id: ThreadId,
console: Entity<console::Console>,
focus_handle: FocusHandle,
remote_id: Option<ViewId>,
show_console_indicator: bool,
module_list: Entity<module_list::ModuleList>,
active_thread_item: ThreadItem,
_workspace: WeakEntity<Workspace>,
client_id: DebugAdapterClientId,
variable_list: Entity<variable_list::VariableList>,
_subscriptions: Vec<Subscription>,
stack_frame_list: Entity<stack_frame_list::StackFrameList>,
loaded_source_list: Entity<loaded_source_list::LoadedSourceList>,
}
impl Render for RunningState {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let thread_status = ThreadStatus::Running;
let active_thread_item = &self.active_thread_item;
let capabilities = self.capabilities(cx);
h_flex()
.key_context("DebugPanelItem")
.track_focus(&self.focus_handle(cx))
.size_full()
.items_start()
.child(
v_flex()
.size_full()
.items_start()
.child(
h_flex()
.w_full()
.border_b_1()
.border_color(cx.theme().colors().border_variant)
.justify_between()
.child(
h_flex()
.p_1()
.w_full()
.gap_2()
.map(|this| {
if thread_status == ThreadStatus::Running {
this.child(
IconButton::new(
"debug-pause",
IconName::DebugPause,
)
.icon_size(IconSize::Small)
.on_click(cx.listener(|this, _, _window, cx| {
this.pause_thread(cx);
}))
.tooltip(move |window, cx| {
Tooltip::text("Pause program")(window, cx)
}),
)
} else {
this.child(
IconButton::new(
"debug-continue",
IconName::DebugContinue,
)
.icon_size(IconSize::Small)
.on_click(cx.listener(|this, _, _window, cx| {
this.continue_thread(cx)
}))
.disabled(thread_status != ThreadStatus::Stopped)
.tooltip(move |window, cx| {
Tooltip::text("Continue program")(window, cx)
}),
)
}
})
.when(
capabilities.supports_step_back.unwrap_or(false),
|this| {
this.child(
IconButton::new(
"debug-step-back",
IconName::DebugStepBack,
)
.icon_size(IconSize::Small)
.on_click(cx.listener(|this, _, _window, cx| {
this.step_back(cx);
}))
.disabled(thread_status != ThreadStatus::Stopped)
.tooltip(move |window, cx| {
Tooltip::text("Step back")(window, cx)
}),
)
},
)
.child(
IconButton::new("debug-step-over", IconName::DebugStepOver)
.icon_size(IconSize::Small)
.on_click(cx.listener(|this, _, _window, cx| {
this.step_over(cx);
}))
.disabled(thread_status != ThreadStatus::Stopped)
.tooltip(move |window, cx| {
Tooltip::text("Step over")(window, cx)
}),
)
.child(
IconButton::new("debug-step-in", IconName::DebugStepInto)
.icon_size(IconSize::Small)
.on_click(cx.listener(|this, _, _window, cx| {
this.step_in(cx);
}))
.disabled(thread_status != ThreadStatus::Stopped)
.tooltip(move |window, cx| {
Tooltip::text("Step in")(window, cx)
}),
)
.child(
IconButton::new("debug-step-out", IconName::DebugStepOut)
.icon_size(IconSize::Small)
.on_click(cx.listener(|this, _, _window, cx| {
this.step_out(cx);
}))
.disabled(thread_status != ThreadStatus::Stopped)
.tooltip(move |window, cx| {
Tooltip::text("Step out")(window, cx)
}),
)
.child(
IconButton::new("debug-restart", IconName::DebugRestart)
.icon_size(IconSize::Small)
.on_click(cx.listener(|this, _, _window, cx| {
this.restart_client(cx);
}))
.disabled(
!capabilities
.supports_restart_request
.unwrap_or_default(),
)
.tooltip(move |window, cx| {
Tooltip::text("Restart")(window, cx)
}),
)
.child(
IconButton::new("debug-stop", IconName::DebugStop)
.icon_size(IconSize::Small)
.on_click(cx.listener(|this, _, _window, cx| {
this.stop_thread(cx);
}))
.disabled(
thread_status != ThreadStatus::Stopped
&& thread_status != ThreadStatus::Running,
)
.tooltip(move |window, cx| {
Tooltip::text("Stop")(window, cx)
}),
)
.child(
IconButton::new(
"debug-disconnect",
IconName::DebugDisconnect,
)
.icon_size(IconSize::Small)
.on_click(cx.listener(|this, _, _window, cx| {
this.disconnect_client(cx);
}))
.disabled(
thread_status == ThreadStatus::Exited
|| thread_status == ThreadStatus::Ended,
)
.tooltip(
move |window, cx| {
Tooltip::text("Disconnect")(window, cx)
},
),
)
.child(
IconButton::new(
"debug-ignore-breakpoints",
if self.session.read(cx).breakpoints_enabled() {
IconName::DebugBreakpoint
} else {
IconName::DebugIgnoreBreakpoints
},
)
.icon_size(IconSize::Small)
.on_click(cx.listener(|this, _, _window, cx| {
this.toggle_ignore_breakpoints(cx);
}))
.disabled(
thread_status == ThreadStatus::Exited
|| thread_status == ThreadStatus::Ended,
)
.tooltip(
move |window, cx| {
Tooltip::text("Ignore breakpoints")(window, cx)
},
),
),
)
//.child(h_flex())
.child(h_flex().p_1().mx_2().w_3_4().justify_end().child(
DropdownMenu::new(
"thread-list",
"Threads",
ContextMenu::build(window, cx, |this, _, _| {
this.entry("Thread 1", None, |_, _| {}).entry(
"Thread 2",
None,
|_, _| {},
)
}),
),
)),
)
.child(
h_flex()
.size_full()
.items_start()
.p_1()
.gap_4()
.child(self.stack_frame_list.clone()),
),
)
.child(
v_flex()
.border_l_1()
.border_color(cx.theme().colors().border_variant)
.size_full()
.items_start()
.child(
h_flex()
.border_b_1()
.w_full()
.border_color(cx.theme().colors().border_variant)
.child(self.render_entry_button(
&SharedString::from("Variables"),
ThreadItem::Variables,
cx,
))
.when(
capabilities.supports_modules_request.unwrap_or_default(),
|this| {
this.child(self.render_entry_button(
&SharedString::from("Modules"),
ThreadItem::Modules,
cx,
))
},
)
.when(
capabilities
.supports_loaded_sources_request
.unwrap_or_default(),
|this| {
this.child(self.render_entry_button(
&SharedString::from("Loaded Sources"),
ThreadItem::LoadedSource,
cx,
))
},
)
.child(self.render_entry_button(
&SharedString::from("Console"),
ThreadItem::Console,
cx,
)),
)
.when(*active_thread_item == ThreadItem::Variables, |this| {
this.size_full().child(self.variable_list.clone())
})
.when(*active_thread_item == ThreadItem::Modules, |this| {
this.size_full().child(self.module_list.clone())
})
.when(*active_thread_item == ThreadItem::LoadedSource, |this| {
this.size_full().child(self.loaded_source_list.clone())
})
.when(*active_thread_item == ThreadItem::Console, |this| {
this.child(self.console.clone())
}),
)
}
}
impl RunningState {
#[allow(clippy::too_many_arguments)]
pub fn new(
session: Entity<Session>,
client_id: DebugAdapterClientId,
thread_id: ThreadId,
debug_panel: &Entity<DebugPanel>,
workspace: WeakEntity<Workspace>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let focus_handle = cx.focus_handle();
let stack_frame_list = cx.new(|cx| {
StackFrameList::new(workspace.clone(), session.clone(), thread_id, window, cx)
});
let variable_list = cx.new(|cx| {
VariableList::new(
session.clone(),
client_id,
stack_frame_list.clone(),
window,
cx,
)
});
let module_list = cx.new(|cx| ModuleList::new(session.clone(), client_id, cx));
let loaded_source_list = cx.new(|cx| LoadedSourceList::new(session.clone(), client_id, cx));
let console = cx.new(|cx| {
Console::new(
session.clone(),
client_id,
stack_frame_list.clone(),
variable_list.clone(),
window,
cx,
)
});
cx.observe(&module_list, |_, _, cx| cx.notify()).detach();
let _subscriptions = vec![cx.subscribe(
&stack_frame_list,
move |this: &mut Self, _, event: &StackFrameListEvent, cx| match event {
StackFrameListEvent::SelectedStackFrameChanged(_)
| StackFrameListEvent::StackFramesUpdated => this.clear_highlights(cx),
},
)];
Self {
session,
console,
thread_id,
_workspace: workspace,
module_list,
focus_handle,
variable_list,
_subscriptions,
remote_id: None,
stack_frame_list,
loaded_source_list,
client_id,
show_console_indicator: false,
active_thread_item: ThreadItem::Variables,
}
}
// pub(crate) fn update_adapter(
// &mut self,
// update: &UpdateDebugAdapter,
// window: &mut Window,
// cx: &mut Context<Self>,
// ) {
// if let Some(update_variant) = update.variant.as_ref() {
// match update_variant {
// proto::update_debug_adapter::Variant::StackFrameList(stack_frame_list) => {
// self.stack_frame_list.update(cx, |this, cx| {
// this.set_from_proto(stack_frame_list.clone(), cx);
// })
// }
// proto::update_debug_adapter::Variant::ThreadState(thread_state) => {
// self.thread_state.update(cx, |this, _| {
// *this = ThreadState::from_proto(thread_state.clone());
// })
// }
// proto::update_debug_adapter::Variant::VariableList(variable_list) => self
// .variable_list
// .update(cx, |this, cx| this.set_from_proto(variable_list, cx)),
// proto::update_debug_adapter::Variant::AddToVariableList(variables_to_add) => self
// .variable_list
// .update(cx, |this, _| this.add_variables(variables_to_add.clone())),
// proto::update_debug_adapter::Variant::Modules(_) => {}
// proto::update_debug_adapter::Variant::OutputEvent(output_event) => {
// self.console.update(cx, |this, cx| {
// this.add_message(OutputEvent::from_proto(output_event.clone()), window, cx);
// })
// }
// }
// }
// }
pub fn session(&self) -> &Entity<Session> {
&self.session
}
pub fn client_id(&self) -> DebugAdapterClientId {
self.client_id
}
pub fn thread_id(&self) -> ThreadId {
self.thread_id
}
#[cfg(any(test, feature = "test-support"))]
pub fn set_thread_item(&mut self, thread_item: ThreadItem, cx: &mut Context<Self>) {
self.active_thread_item = thread_item;
cx.notify()
}
#[cfg(any(test, feature = "test-support"))]
pub fn stack_frame_list(&self) -> &Entity<StackFrameList> {
&self.stack_frame_list
}
#[cfg(any(test, feature = "test-support"))]
pub fn console(&self) -> &Entity<Console> {
&self.console
}
#[cfg(any(test, feature = "test-support"))]
pub fn module_list(&self) -> &Entity<ModuleList> {
&self.module_list
}
#[cfg(any(test, feature = "test-support"))]
pub fn variable_list(&self) -> &Entity<VariableList> {
&self.variable_list
}
#[cfg(any(test, feature = "test-support"))]
pub fn are_breakpoints_ignored(&self, cx: &App) -> bool {
self.session.read(cx).ignore_breakpoints()
}
pub fn capabilities(&self, cx: &mut Context<Self>) -> Capabilities {
self.session().read(cx).capabilities().clone()
}
fn clear_highlights(&self, _cx: &mut Context<Self>) {
// TODO(debugger): make this work again
// if let Some((_, project_path, _)) = self.dap_store.read(cx).active_debug_line() {
// self.workspace
// .update(cx, |workspace, cx| {
// let editor = workspace
// .items_of_type::<Editor>(cx)
// .find(|editor| Some(project_path.clone()) == editor.project_path(cx));
// if let Some(editor) = editor {
// editor.update(cx, |editor, cx| {
// editor.clear_row_highlights::<editor::DebugCurrentRowHighlight>();
// cx.notify();
// });
// }
// })
// .ok();
// }
}
pub fn go_to_current_stack_frame(&self, window: &mut Window, cx: &mut Context<Self>) {
self.stack_frame_list.update(cx, |stack_frame_list, cx| {
if let Some(stack_frame) = stack_frame_list
.stack_frames(cx)
.iter()
.find(|frame| frame.dap.id == stack_frame_list.current_stack_frame_id())
.cloned()
{
stack_frame_list
.select_stack_frame(&stack_frame.dap, true, window, cx)
.detach_and_log_err(cx);
}
});
}
fn render_entry_button(
&self,
label: &SharedString,
thread_item: ThreadItem,
cx: &mut Context<Self>,
) -> AnyElement {
let has_indicator =
matches!(thread_item, ThreadItem::Console) && self.show_console_indicator;
div()
.id(label.clone())
.px_2()
.py_1()
.cursor_pointer()
.border_b_2()
.when(self.active_thread_item == thread_item, |this| {
this.border_color(cx.theme().colors().border)
})
.child(
h_flex()
.child(Button::new(label.clone(), label.clone()))
.when(has_indicator, |this| this.child(Indicator::dot())),
)
.on_click(cx.listener(move |this, _, _window, cx| {
this.active_thread_item = thread_item.clone();
if matches!(this.active_thread_item, ThreadItem::Console) {
this.show_console_indicator = false;
}
cx.notify();
}))
.into_any_element()
}
pub fn continue_thread(&mut self, cx: &mut Context<Self>) {
self.session().update(cx, |state, cx| {
state.continue_thread(self.thread_id, cx);
});
}
pub fn step_over(&mut self, cx: &mut Context<Self>) {
let granularity = DebuggerSettings::get_global(cx).stepping_granularity;
self.session().update(cx, |state, cx| {
state.step_over(self.thread_id, granularity, cx);
});
}
pub fn step_in(&mut self, cx: &mut Context<Self>) {
let granularity = DebuggerSettings::get_global(cx).stepping_granularity;
self.session().update(cx, |state, cx| {
state.step_in(self.thread_id, granularity, cx);
});
}
pub fn step_out(&mut self, cx: &mut Context<Self>) {
let granularity = DebuggerSettings::get_global(cx).stepping_granularity;
self.session().update(cx, |state, cx| {
state.step_out(self.thread_id, granularity, cx);
});
}
pub fn step_back(&mut self, cx: &mut Context<Self>) {
let granularity = DebuggerSettings::get_global(cx).stepping_granularity;
self.session().update(cx, |state, cx| {
state.step_back(self.thread_id, granularity, cx);
});
}
pub fn restart_client(&self, cx: &mut Context<Self>) {
self.session().update(cx, |state, cx| {
state.restart(None, cx);
});
}
pub fn pause_thread(&self, cx: &mut Context<Self>) {
self.session().update(cx, |state, cx| {
state.pause_thread(self.thread_id, cx);
});
}
pub fn stop_thread(&self, cx: &mut Context<Self>) {
self.session().update(cx, |state, cx| {
            state.terminate_threads(Some(vec![self.thread_id]), cx);
});
}
pub fn disconnect_client(&self, cx: &mut Context<Self>) {
self.session().update(cx, |state, cx| {
state.disconnect_client(cx);
});
}
pub fn toggle_ignore_breakpoints(&mut self, cx: &mut Context<Self>) {
self.session.update(cx, |session, cx| {
session.set_ignore_breakpoints(!session.breakpoints_enabled());
});
}
}
impl EventEmitter<DebugPanelItemEvent> for RunningState {}
impl Focusable for RunningState {
fn focus_handle(&self, _: &App) -> FocusHandle {
self.focus_handle.clone()
}
}
impl Item for RunningState {
type Event = DebugPanelItemEvent;
fn tab_content(
&self,
params: workspace::item::TabContentParams,
_window: &Window,
cx: &App,
) -> AnyElement {
Label::new(format!("{} - Thread {}", todo!(), self.thread_id.0))
.color(if params.selected {
Color::Default
} else {
Color::Muted
})
.into_any_element()
}
fn tab_tooltip_text(&self, cx: &App) -> Option<SharedString> {
Some(SharedString::from(format!(
"{} Thread {} - {:?}",
todo!(),
self.thread_id.0,
todo!("thread state"),
)))
}
fn to_item_events(event: &Self::Event, mut f: impl FnMut(ItemEvent)) {
match event {
DebugPanelItemEvent::Close => f(ItemEvent::CloseItem),
DebugPanelItemEvent::Stopped { .. } => {}
}
}
}


@@ -0,0 +1,527 @@
use super::{
stack_frame_list::{StackFrameList, StackFrameListEvent},
variable_list::VariableList,
};
use anyhow::anyhow;
use dap::{client::DebugAdapterClientId, OutputEvent, OutputEventGroup};
use editor::{
display_map::{Crease, CreaseId},
Anchor, CompletionProvider, Editor, EditorElement, EditorStyle, FoldPlaceholder,
};
use fuzzy::StringMatchCandidate;
use gpui::{Context, Entity, Render, Subscription, Task, TextStyle, WeakEntity};
use language::{Buffer, CodeLabel, LanguageServerId};
use menu::Confirm;
use project::{
debugger::session::{CompletionsQuery, Session},
Completion,
};
use settings::Settings;
use std::{cell::RefCell, collections::HashMap, rc::Rc, sync::Arc};
use theme::ThemeSettings;
use ui::{prelude::*, ButtonLike, Disclosure, ElevationIndex};
pub struct OutputGroup {
pub start: Anchor,
pub collapsed: bool,
pub end: Option<Anchor>,
pub crease_ids: Vec<CreaseId>,
pub placeholder: SharedString,
}
pub struct Console {
groups: Vec<OutputGroup>,
console: Entity<Editor>,
query_bar: Entity<Editor>,
session: Entity<Session>,
client_id: DebugAdapterClientId,
_subscriptions: Vec<Subscription>,
variable_list: Entity<VariableList>,
stack_frame_list: Entity<StackFrameList>,
}
impl Console {
pub fn new(
session: Entity<Session>,
client_id: DebugAdapterClientId,
stack_frame_list: Entity<StackFrameList>,
variable_list: Entity<VariableList>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let console = cx.new(|cx| {
let mut editor = Editor::multi_line(window, cx);
editor.move_to_end(&editor::actions::MoveToEnd, window, cx);
editor.set_read_only(true);
editor.set_show_gutter(true, cx);
editor.set_show_runnables(false, cx);
editor.set_show_code_actions(false, cx);
editor.set_show_line_numbers(false, cx);
editor.set_show_git_diff_gutter(false, cx);
editor.set_autoindent(false);
editor.set_input_enabled(false);
editor.set_use_autoclose(false);
editor.set_show_wrap_guides(false, cx);
editor.set_show_indent_guides(false, cx);
editor.set_show_edit_predictions(Some(false), window, cx);
editor
});
let this = cx.weak_entity();
let query_bar = cx.new(|cx| {
let mut editor = Editor::single_line(window, cx);
editor.set_placeholder_text("Evaluate an expression", cx);
editor.set_use_autoclose(false);
editor.set_show_gutter(false, cx);
editor.set_show_wrap_guides(false, cx);
editor.set_show_indent_guides(false, cx);
editor.set_completion_provider(Some(Box::new(ConsoleQueryBarCompletionProvider(this))));
editor
});
let _subscriptions =
vec![cx.subscribe(&stack_frame_list, Self::handle_stack_frame_list_events)];
Self {
session,
console,
query_bar,
variable_list,
_subscriptions,
stack_frame_list,
client_id,
groups: Vec::default(),
}
}
#[cfg(any(test, feature = "test-support"))]
pub fn editor(&self) -> &Entity<Editor> {
&self.console
}
#[cfg(any(test, feature = "test-support"))]
pub fn query_bar(&self) -> &Entity<Editor> {
&self.query_bar
}
fn is_local(&self, _cx: &Context<Self>) -> bool {
// todo(debugger): Fix this function
true
}
fn handle_stack_frame_list_events(
&mut self,
_: Entity<StackFrameList>,
event: &StackFrameListEvent,
cx: &mut Context<Self>,
) {
match event {
StackFrameListEvent::SelectedStackFrameChanged(_) => cx.notify(),
StackFrameListEvent::StackFramesUpdated => {}
}
}
pub fn add_message(&mut self, event: OutputEvent, window: &mut Window, cx: &mut Context<Self>) {
self.console.update(cx, |console, cx| {
let output = event.output.trim_end().to_string();
let snapshot = console.buffer().read(cx).snapshot(cx);
let start = snapshot.anchor_before(snapshot.max_point());
let mut indent_size = self
.groups
.iter()
.filter(|group| group.end.is_none())
.count();
            if event.group == Some(OutputEventGroup::End) {
                indent_size = indent_size.saturating_sub(1);
            }
            // `repeat(0)` yields an empty string, so no branch is needed here.
            let indent = " ".repeat(indent_size);
console.set_read_only(false);
console.move_to_end(&editor::actions::MoveToEnd, window, cx);
console.insert(format!("{}{}\n", indent, output).as_str(), window, cx);
console.set_read_only(true);
let end = snapshot.anchor_before(snapshot.max_point());
match event.group {
Some(OutputEventGroup::Start) => {
self.groups.push(OutputGroup {
start,
end: None,
collapsed: false,
placeholder: output.clone().into(),
crease_ids: console.insert_creases(
vec![Self::create_crease(output.into(), start, end)],
cx,
),
});
}
Some(OutputEventGroup::StartCollapsed) => {
self.groups.push(OutputGroup {
start,
end: None,
collapsed: true,
placeholder: output.clone().into(),
crease_ids: console.insert_creases(
vec![Self::create_crease(output.into(), start, end)],
cx,
),
});
}
Some(OutputEventGroup::End) => {
if let Some(index) = self.groups.iter().rposition(|group| group.end.is_none()) {
let group = self.groups.remove(index);
console.remove_creases(group.crease_ids.clone(), cx);
let creases =
vec![Self::create_crease(group.placeholder, group.start, end)];
console.insert_creases(creases.clone(), cx);
if group.collapsed {
console.fold_creases(creases, false, window, cx);
}
}
}
None => {}
}
cx.notify();
});
}
fn create_crease(placeholder: SharedString, start: Anchor, end: Anchor) -> Crease<Anchor> {
Crease::inline(
start..end,
FoldPlaceholder {
render: Arc::new({
let placeholder = placeholder.clone();
move |_id, _range, _window, _cx| {
ButtonLike::new("output-group-placeholder")
.style(ButtonStyle::Transparent)
.layer(ElevationIndex::ElevatedSurface)
.child(Label::new(placeholder.clone()).single_line())
.into_any_element()
}
}),
..Default::default()
},
move |row, is_folded, fold, _window, _cx| {
Disclosure::new(("output-group", row.0 as u64), !is_folded)
.toggle_state(is_folded)
.on_click(move |_event, window, cx| fold(!is_folded, window, cx))
.into_any_element()
},
move |_id, _range, _window, _cx| gpui::Empty.into_any_element(),
)
}
pub fn evaluate(&mut self, _: &Confirm, window: &mut Window, cx: &mut Context<Self>) {
let expression = self.query_bar.update(cx, |editor, cx| {
let expression = editor.text(cx);
editor.clear(window, cx);
expression
});
self.session.update(cx, |state, cx| {
state.evaluate(
expression,
Some(dap::EvaluateArgumentsContext::Variables),
Some(self.stack_frame_list.read(cx).current_stack_frame_id()),
None,
cx,
);
});
// TODO(debugger): make this work again
// let weak_console = cx.weak_entity();
// window
// .spawn(cx, |mut cx| async move {
// let response = evaluate_task.await?;
// weak_console.update_in(&mut cx, |console, window, cx| {
// console.add_message(
// OutputEvent {
// category: None,
// output: response.result,
// group: None,
// variables_reference: Some(response.variables_reference),
// source: None,
// line: None,
// column: None,
// data: None,
// },
// window,
// cx,
// );
// console.variable_list.update(cx, |variable_list, cx| {
// variable_list.invalidate(window, cx);
// })
// })
// })
// .detach_and_log_err(cx);
}
fn render_console(&self, cx: &Context<Self>) -> impl IntoElement {
let settings = ThemeSettings::get_global(cx);
let text_style = TextStyle {
color: if self.console.read(cx).read_only(cx) {
cx.theme().colors().text_disabled
} else {
cx.theme().colors().text
},
font_family: settings.buffer_font.family.clone(),
font_features: settings.buffer_font.features.clone(),
font_size: settings.buffer_font_size.into(),
font_weight: settings.buffer_font.weight,
line_height: relative(settings.buffer_line_height.value()),
..Default::default()
};
EditorElement::new(
&self.console,
EditorStyle {
background: cx.theme().colors().editor_background,
local_player: cx.theme().players().local(),
text: text_style,
..Default::default()
},
)
}
fn render_query_bar(&self, cx: &Context<Self>) -> impl IntoElement {
let settings = ThemeSettings::get_global(cx);
let text_style = TextStyle {
            color: if self.query_bar.read(cx).read_only(cx) {
cx.theme().colors().text_disabled
} else {
cx.theme().colors().text
},
font_family: settings.ui_font.family.clone(),
font_features: settings.ui_font.features.clone(),
font_fallbacks: settings.ui_font.fallbacks.clone(),
font_size: TextSize::Editor.rems(cx).into(),
font_weight: settings.ui_font.weight,
line_height: relative(1.3),
..Default::default()
};
EditorElement::new(
&self.query_bar,
EditorStyle {
background: cx.theme().colors().editor_background,
local_player: cx.theme().players().local(),
text: text_style,
..Default::default()
},
)
}
}
impl Render for Console {
fn render(&mut self, _window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
v_flex()
.key_context("DebugConsole")
.on_action(cx.listener(Self::evaluate))
.size_full()
.child(self.render_console(cx))
.when(self.is_local(cx), |this| {
this.child(self.render_query_bar(cx))
.pt(DynamicSpacing::Base04.rems(cx))
})
.border_2()
}
}
struct ConsoleQueryBarCompletionProvider(WeakEntity<Console>);
impl CompletionProvider for ConsoleQueryBarCompletionProvider {
fn completions(
&self,
buffer: &Entity<Buffer>,
buffer_position: language::Anchor,
_trigger: editor::CompletionContext,
_window: &mut Window,
cx: &mut Context<Editor>,
) -> gpui::Task<gpui::Result<Vec<project::Completion>>> {
let Some(console) = self.0.upgrade() else {
return Task::ready(Ok(Vec::new()));
};
let support_completions = console
.read(cx)
.session
.read(cx)
.capabilities()
.supports_completions_request
.unwrap_or_default();
if support_completions {
self.client_completions(&console, buffer, buffer_position, cx)
} else {
self.variable_list_completions(&console, buffer, buffer_position, cx)
}
}
fn resolve_completions(
&self,
_buffer: Entity<Buffer>,
_completion_indices: Vec<usize>,
_completions: Rc<RefCell<Box<[Completion]>>>,
_cx: &mut Context<Editor>,
) -> gpui::Task<gpui::Result<bool>> {
Task::ready(Ok(false))
}
fn apply_additional_edits_for_completion(
&self,
_buffer: Entity<Buffer>,
_completions: Rc<RefCell<Box<[Completion]>>>,
_completion_index: usize,
_push_to_history: bool,
_cx: &mut Context<Editor>,
) -> gpui::Task<gpui::Result<Option<language::Transaction>>> {
Task::ready(Ok(None))
}
fn is_completion_trigger(
&self,
_buffer: &Entity<Buffer>,
_position: language::Anchor,
_text: &str,
_trigger_in_words: bool,
_cx: &mut Context<Editor>,
) -> bool {
true
}
}
impl ConsoleQueryBarCompletionProvider {
fn variable_list_completions(
&self,
console: &Entity<Console>,
buffer: &Entity<Buffer>,
buffer_position: language::Anchor,
cx: &mut Context<Editor>,
) -> gpui::Task<gpui::Result<Vec<project::Completion>>> {
let (variables, string_matches) = console.update(cx, |console, cx| {
let mut variables = HashMap::new();
let mut string_matches = Vec::new();
for variable in console.variable_list.update(cx, |variable_list, cx| {
variable_list.completion_variables(cx)
}) {
if let Some(evaluate_name) = &variable.variable.evaluate_name {
variables.insert(evaluate_name.clone(), variable.variable.value.clone());
string_matches.push(StringMatchCandidate {
id: 0,
string: evaluate_name.clone(),
char_bag: evaluate_name.chars().collect(),
});
}
variables.insert(
variable.variable.name.clone(),
variable.variable.value.clone(),
);
string_matches.push(StringMatchCandidate {
id: 0,
string: variable.variable.name.clone(),
char_bag: variable.variable.name.chars().collect(),
});
}
(variables, string_matches)
});
let query = buffer.read(cx).text();
cx.spawn(|_, cx| async move {
let matches = fuzzy::match_strings(
&string_matches,
&query,
true,
10,
&Default::default(),
cx.background_executor().clone(),
)
.await;
Ok(matches
.iter()
.filter_map(|string_match| {
let variable_value = variables.get(&string_match.string)?;
Some(project::Completion {
old_range: buffer_position..buffer_position,
new_text: string_match.string.clone(),
label: CodeLabel {
filter_range: 0..string_match.string.len(),
text: format!("{} {}", string_match.string.clone(), variable_value),
runs: Vec::new(),
},
server_id: LanguageServerId(usize::MAX),
documentation: None,
lsp_completion: Default::default(),
confirm: None,
resolved: true,
})
})
.collect())
})
}
fn client_completions(
&self,
console: &Entity<Console>,
buffer: &Entity<Buffer>,
buffer_position: language::Anchor,
cx: &mut Context<Editor>,
) -> gpui::Task<gpui::Result<Vec<project::Completion>>> {
let completion_task = console.update(cx, |console, cx| {
console.session.update(cx, |state, cx| {
let frame_id = Some(console.stack_frame_list.read(cx).current_stack_frame_id());
state.completions(
CompletionsQuery::new(buffer.read(cx), buffer_position, frame_id),
cx,
)
})
});
cx.background_executor().spawn(async move {
Ok(completion_task
.await?
.iter()
.map(|completion| project::Completion {
old_range: buffer_position..buffer_position, // TODO(debugger): change this
new_text: completion.text.clone().unwrap_or(completion.label.clone()),
label: CodeLabel {
filter_range: 0..completion.label.len(),
text: completion.label.clone(),
runs: Vec::new(),
},
server_id: LanguageServerId(usize::MAX),
documentation: None,
lsp_completion: Default::default(),
confirm: None,
resolved: true,
})
.collect())
})
}
}


@@ -0,0 +1,102 @@
use dap::client::DebugAdapterClientId;
use gpui::{list, AnyElement, Empty, Entity, FocusHandle, Focusable, ListState, Subscription};
use project::debugger::session::Session;
use ui::prelude::*;
use util::maybe;
pub struct LoadedSourceList {
list: ListState,
focus_handle: FocusHandle,
_subscription: Subscription,
session: Entity<Session>,
client_id: DebugAdapterClientId,
}
impl LoadedSourceList {
pub fn new(
session: Entity<Session>,
client_id: DebugAdapterClientId,
cx: &mut Context<Self>,
) -> Self {
let weak_entity = cx.weak_entity();
let focus_handle = cx.focus_handle();
let list = ListState::new(
0,
gpui::ListAlignment::Top,
px(1000.),
move |ix, _window, cx| {
weak_entity
.upgrade()
.map(|loaded_sources| {
loaded_sources.update(cx, |this, cx| this.render_entry(ix, cx))
})
.unwrap_or(div().into_any())
},
);
let _subscription = cx.observe(&session, |loaded_source_list, state, cx| {
let len = state.update(cx, |state, cx| state.loaded_sources(cx).len());
loaded_source_list.list.reset(len);
cx.notify();
});
Self {
list,
session,
focus_handle,
_subscription,
client_id,
}
}
fn render_entry(&mut self, ix: usize, cx: &mut Context<Self>) -> AnyElement {
let Some(source) = maybe!({
self.session
.update(cx, |state, cx| state.loaded_sources(cx).get(ix).cloned())
}) else {
return Empty.into_any();
};
v_flex()
.rounded_md()
.w_full()
.group("")
.p_1()
.hover(|s| s.bg(cx.theme().colors().element_hover))
.child(
h_flex()
.gap_0p5()
.text_ui_sm(cx)
.when_some(source.name.clone(), |this, name| this.child(name)),
)
.child(
h_flex()
.text_ui_xs(cx)
.text_color(cx.theme().colors().text_muted)
.when_some(source.path.clone(), |this, path| this.child(path)),
)
.into_any()
}
}
impl Focusable for LoadedSourceList {
fn focus_handle(&self, _: &gpui::App) -> gpui::FocusHandle {
self.focus_handle.clone()
}
}
impl Render for LoadedSourceList {
fn render(&mut self, _window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
self.session.update(cx, |state, cx| {
state.loaded_sources(cx);
});
div()
.track_focus(&self.focus_handle)
.size_full()
.p_1()
.child(list(self.list.clone()).size_full())
}
}


@@ -0,0 +1,112 @@
use dap::{client::DebugAdapterClientId, ModuleEvent};
use gpui::{list, AnyElement, Empty, Entity, FocusHandle, Focusable, ListState, Subscription};
use project::debugger::session::Session;
use ui::prelude::*;
pub struct ModuleList {
list: ListState,
focus_handle: FocusHandle,
_subscription: Subscription,
session: Entity<Session>,
client_id: DebugAdapterClientId,
}
impl ModuleList {
pub fn new(
session: Entity<Session>,
client_id: DebugAdapterClientId,
cx: &mut Context<Self>,
) -> Self {
let weak_entity = cx.weak_entity();
let focus_handle = cx.focus_handle();
let list = ListState::new(
0,
gpui::ListAlignment::Top,
px(1000.),
move |ix, _window, cx| {
weak_entity
.upgrade()
.map(|module_list| module_list.update(cx, |this, cx| this.render_entry(ix, cx)))
.unwrap_or(div().into_any())
},
);
let _subscription = cx.observe(&session, |module_list, state, cx| {
let modules_len = state.update(cx, |state, cx| state.modules(cx).len());
module_list.list.reset(modules_len);
cx.notify();
});
Self {
list,
session,
focus_handle,
_subscription,
client_id,
}
}
pub fn on_module_event(&mut self, event: &ModuleEvent, cx: &mut Context<Self>) {
self.session
.update(cx, |state, cx| state.handle_module_event(event, cx));
}
fn render_entry(&mut self, ix: usize, cx: &mut Context<Self>) -> AnyElement {
let Some(module) = maybe!({
self.session
.update(cx, |state, cx| state.modules(cx).get(ix).cloned())
}) else {
return Empty.into_any();
};
v_flex()
.rounded_md()
.w_full()
.group("")
.p_1()
.hover(|s| s.bg(cx.theme().colors().element_hover))
.child(h_flex().gap_0p5().text_ui_sm(cx).child(module.name.clone()))
.child(
h_flex()
.text_ui_xs(cx)
.text_color(cx.theme().colors().text_muted)
.when_some(module.path.clone(), |this, path| this.child(path)),
)
.into_any()
}
}
impl Focusable for ModuleList {
fn focus_handle(&self, _: &gpui::App) -> gpui::FocusHandle {
self.focus_handle.clone()
}
}
impl Render for ModuleList {
fn render(&mut self, _window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
self.session.update(cx, |state, cx| {
state.modules(cx);
});
div()
.track_focus(&self.focus_handle)
.size_full()
.p_1()
.child(list(self.list.clone()).size_full())
}
}
#[cfg(any(test, feature = "test-support"))]
use dap::Module;
use util::maybe;
#[cfg(any(test, feature = "test-support"))]
impl ModuleList {
pub fn modules(&self, cx: &mut Context<Self>) -> Vec<Module> {
self.session.update(cx, |session, cx| {
session.modules(cx).iter().cloned().collect()
})
}
}


@@ -0,0 +1,468 @@
use std::path::Path;
use anyhow::{anyhow, Result};
use dap::client::DebugAdapterClientId;
use gpui::{
list, AnyElement, Entity, EventEmitter, FocusHandle, Focusable, ListState, Subscription, Task,
WeakEntity,
};
use project::debugger::session::{Session, StackFrame, ThreadId};
use project::ProjectPath;
use ui::{prelude::*, Tooltip};
use workspace::Workspace;
pub type StackFrameId = u64;
#[derive(Debug)]
pub enum StackFrameListEvent {
SelectedStackFrameChanged(StackFrameId),
StackFramesUpdated,
}
pub struct StackFrameList {
list: ListState,
thread_id: ThreadId,
focus_handle: FocusHandle,
_subscription: Subscription,
session: Entity<Session>,
entries: Vec<StackFrameEntry>,
workspace: WeakEntity<Workspace>,
current_stack_frame_id: StackFrameId,
_fetch_stack_frames_task: Option<Task<Result<()>>>,
}
#[derive(Debug, PartialEq, Eq)]
pub enum StackFrameEntry {
Normal(dap::StackFrame),
Collapsed(Vec<dap::StackFrame>),
}
impl StackFrameList {
pub fn new(
workspace: WeakEntity<Workspace>,
session: Entity<Session>,
thread_id: ThreadId,
_window: &Window,
cx: &mut Context<Self>,
) -> Self {
let weak_entity = cx.weak_entity();
let focus_handle = cx.focus_handle();
let list = ListState::new(
0,
gpui::ListAlignment::Top,
px(1000.),
move |ix, _window, cx| {
weak_entity
.upgrade()
.map(|stack_frame_list| {
stack_frame_list.update(cx, |this, cx| this.render_entry(ix, cx))
})
.unwrap_or(div().into_any())
},
);
let _subscription = cx.observe(&session, |stack_frame_list, state, cx| {
let _frame_len = state.update(cx, |state, cx| {
state.stack_frames(stack_frame_list.thread_id, cx).len()
});
stack_frame_list.build_entries(cx);
});
Self {
list,
session,
workspace,
thread_id,
focus_handle,
_subscription,
entries: Default::default(),
_fetch_stack_frames_task: None,
current_stack_frame_id: Default::default(),
}
}
pub(crate) fn thread_id(&self) -> ThreadId {
self.thread_id
}
#[cfg(any(test, feature = "test-support"))]
pub fn entries(&self) -> &[StackFrameEntry] {
&self.entries
}
pub fn stack_frames(&self, cx: &mut App) -> Vec<StackFrame> {
self.session
.update(cx, |this, cx| this.stack_frames(self.thread_id, cx))
}
#[cfg(any(test, feature = "test-support"))]
pub fn dap_stack_frames(&self, cx: &mut App) -> Vec<dap::StackFrame> {
self.stack_frames(cx)
.into_iter()
.map(|stack_frame| stack_frame.dap.clone())
.collect()
}
pub fn get_main_stack_frame_id(&self, cx: &mut Context<Self>) -> u64 {
self.stack_frames(cx)
.first()
.map(|stack_frame| stack_frame.dap.id)
.unwrap_or(0)
}
pub fn current_stack_frame_id(&self) -> u64 {
self.current_stack_frame_id
}
fn build_entries(&mut self, cx: &mut Context<Self>) {
let mut entries = Vec::new();
let mut collapsed_entries = Vec::new();
for stack_frame in &self.stack_frames(cx) {
match stack_frame.dap.presentation_hint {
Some(dap::StackFramePresentationHint::Deemphasize) => {
collapsed_entries.push(stack_frame.dap.clone());
}
_ => {
let collapsed_entries = std::mem::take(&mut collapsed_entries);
if !collapsed_entries.is_empty() {
entries.push(StackFrameEntry::Collapsed(collapsed_entries));
}
entries.push(StackFrameEntry::Normal(stack_frame.dap.clone()));
}
}
}
let collapsed_entries = std::mem::take(&mut collapsed_entries);
if !collapsed_entries.is_empty() {
entries.push(StackFrameEntry::Collapsed(collapsed_entries));
}
std::mem::swap(&mut self.entries, &mut entries);
self.list.reset(self.entries.len());
cx.notify();
}
// fn fetch_stack_frames(
// &mut self,
// go_to_stack_frame: bool,
// window: &Window,
// cx: &mut Context<Self>,
// ) {
// // If this is a remote debug session we never need to fetch stack frames ourselves
// // because the host will fetch and send us stack frames whenever there's a stop event
// if self.dap_store.read(cx).as_remote().is_some() {
// return;
// }
// let task = self.dap_store.update(cx, |store, cx| {
// store.stack_frames(&self.client_id, self.thread_id, cx)
// });
// self.fetch_stack_frames_task = Some(cx.spawn_in(window, |this, mut cx| async move {
// let mut stack_frames = task.await?;
// let task = this.update_in(&mut cx, |this, window, cx| {
// std::mem::swap(&mut this.stack_frames, &mut stack_frames);
// this.build_entries();
// cx.emit(StackFrameListEvent::StackFramesUpdated);
// let stack_frame = this
// .stack_frames
// .first()
// .cloned()
// .ok_or_else(|| anyhow!("No stack frame found to select"))?;
// anyhow::Ok(this.select_stack_frame(&stack_frame, go_to_stack_frame, window, cx))
// })?;
// task?.await?;
// this.update(&mut cx, |this, _| {
// this.fetch_stack_frames_task.take();
// })
// }));
// }
pub fn select_stack_frame(
&mut self,
stack_frame: &dap::StackFrame,
go_to_stack_frame: bool,
window: &Window,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
self.current_stack_frame_id = stack_frame.id;
cx.emit(StackFrameListEvent::SelectedStackFrameChanged(
stack_frame.id,
));
cx.notify();
if !go_to_stack_frame {
return Task::ready(Ok(()));
};
let _row = (stack_frame.line.saturating_sub(1)) as u32;
let Some(project_path) = self.project_path_from_stack_frame(stack_frame, cx) else {
return Task::ready(Err(anyhow!("Project path not found")));
};
cx.spawn_in(window, {
// let client_id = self.client_id;
move |this, mut cx| async move {
this.update_in(&mut cx, |this, window, cx| {
this.workspace.update(cx, |workspace, cx| {
workspace.open_path_preview(
project_path.clone(),
None,
false,
true,
window,
cx,
)
})
})??
.await?;
Ok(())
// TODO(debugger): make this work again
// this.update(&mut cx, |this, cx| {
// this.dap_store.update(cx, |store, cx| {
// store.set_active_debug_line(client_id, &project_path, row, cx);
// })
// })
}
})
}
fn project_path_from_stack_frame(
&self,
stack_frame: &dap::StackFrame,
cx: &mut Context<Self>,
) -> Option<ProjectPath> {
let path = stack_frame.source.as_ref().and_then(|s| s.path.as_ref())?;
self.workspace
.update(cx, |workspace, cx| {
workspace.project().read_with(cx, |project, cx| {
project.project_path_for_absolute_path(Path::new(path), cx)
})
})
.ok()?
}
pub fn restart_stack_frame(&mut self, stack_frame_id: u64, cx: &mut Context<Self>) {
self.session.update(cx, |state, cx| {
state.restart_stack_frame(stack_frame_id, cx)
});
}
fn render_normal_entry(
&self,
stack_frame: &dap::StackFrame,
cx: &mut Context<Self>,
) -> AnyElement {
let source = stack_frame.source.clone();
let is_selected_frame = stack_frame.id == self.current_stack_frame_id;
let formatted_path = format!(
"{}:{}",
source.clone().and_then(|s| s.name).unwrap_or_default(),
stack_frame.line,
);
let supports_frame_restart = self
.session
.read(cx)
.capabilities()
.supports_restart_frame
.unwrap_or_default();
let origin = stack_frame
.source
.to_owned()
.and_then(|source| source.origin);
h_flex()
.rounded_md()
.justify_between()
.w_full()
.group("")
.id(("stack-frame", stack_frame.id))
.tooltip({
let formatted_path = formatted_path.clone();
move |_window, app| {
app.new(|_| {
let mut tooltip = Tooltip::new(formatted_path.clone());
if let Some(origin) = &origin {
tooltip = tooltip.meta(origin);
}
tooltip
})
.into()
}
})
.p_1()
.when(is_selected_frame, |this| {
this.bg(cx.theme().colors().element_hover)
})
.on_click(cx.listener({
let stack_frame = stack_frame.clone();
move |this, _, window, cx| {
this.select_stack_frame(&stack_frame, true, window, cx)
.detach_and_log_err(cx);
}
}))
.hover(|style| style.bg(cx.theme().colors().element_hover).cursor_pointer())
.child(
v_flex()
.child(
h_flex()
.gap_0p5()
.text_ui_sm(cx)
.truncate()
.child(stack_frame.name.clone())
.child(formatted_path),
)
.child(
h_flex()
.text_ui_xs(cx)
.truncate()
.text_color(cx.theme().colors().text_muted)
.when_some(source.and_then(|s| s.path), |this, path| this.child(path)),
),
)
.when(
supports_frame_restart && stack_frame.can_restart.unwrap_or(true),
|this| {
this.child(
h_flex()
.id(("restart-stack-frame", stack_frame.id))
.visible_on_hover("")
.absolute()
.right_2()
.overflow_hidden()
.rounded_md()
.border_1()
.border_color(cx.theme().colors().element_selected)
.bg(cx.theme().colors().element_background)
.hover(|style| {
style
.bg(cx.theme().colors().ghost_element_hover)
.cursor_pointer()
})
.child(
IconButton::new(
("restart-stack-frame", stack_frame.id),
IconName::DebugRestart,
)
.icon_size(IconSize::Small)
.on_click(cx.listener({
let stack_frame_id = stack_frame.id;
move |this, _, _window, cx| {
this.restart_stack_frame(stack_frame_id, cx);
}
}))
.tooltip(move |window, cx| {
Tooltip::text("Restart Stack Frame")(window, cx)
}),
),
)
},
)
.into_any()
}
pub fn expand_collapsed_entry(
&mut self,
ix: usize,
stack_frames: &[dap::StackFrame],
cx: &mut Context<Self>,
) {
self.entries.splice(
ix..ix + 1,
stack_frames
.iter()
.map(|frame| StackFrameEntry::Normal(frame.clone())),
);
self.list.reset(self.entries.len());
cx.notify();
}
fn render_collapsed_entry(
&self,
ix: usize,
stack_frames: &Vec<dap::StackFrame>,
cx: &mut Context<Self>,
) -> AnyElement {
let first_stack_frame = &stack_frames[0];
h_flex()
.rounded_md()
.justify_between()
.w_full()
.group("")
.id(("stack-frame", first_stack_frame.id))
.p_1()
.on_click(cx.listener({
let stack_frames = stack_frames.clone();
move |this, _, _window, cx| {
this.expand_collapsed_entry(ix, &stack_frames, cx);
}
}))
.hover(|style| style.bg(cx.theme().colors().element_hover).cursor_pointer())
.child(
v_flex()
.text_ui_sm(cx)
.truncate()
.text_color(cx.theme().colors().text_muted)
.child(format!(
"Show {} more{}",
stack_frames.len(),
first_stack_frame
.source
.as_ref()
.and_then(|source| source.origin.as_ref())
.map_or(String::new(), |origin| format!(": {}", origin))
)),
)
.into_any()
}
fn render_entry(&self, ix: usize, cx: &mut Context<Self>) -> AnyElement {
match &self.entries[ix] {
StackFrameEntry::Normal(stack_frame) => self.render_normal_entry(stack_frame, cx),
StackFrameEntry::Collapsed(stack_frames) => {
self.render_collapsed_entry(ix, stack_frames, cx)
}
}
}
}
impl Render for StackFrameList {
fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
div()
.size_full()
.p_1()
.child(list(self.list.clone()).size_full())
}
}
impl Focusable for StackFrameList {
fn focus_handle(&self, _: &gpui::App) -> gpui::FocusHandle {
self.focus_handle.clone()
}
}
impl EventEmitter<StackFrameListEvent> for StackFrameList {}

File diff suppressed because it is too large


@@ -0,0 +1,22 @@
use gpui::{FocusHandle, Focusable};
use ui::{div, Element, ParentElement, Render, Styled};
pub(super) struct StartingState {
focus_handle: FocusHandle,
}
impl Focusable for StartingState {
fn focus_handle(&self, _cx: &ui::App) -> FocusHandle {
self.focus_handle.clone()
}
}
impl Render for StartingState {
fn render(
&mut self,
_window: &mut ui::Window,
_cx: &mut ui::Context<'_, Self>,
) -> impl ui::IntoElement {
div().size_full().child("Starting a debug adapter")
}
}


@@ -0,0 +1,81 @@
use gpui::{Entity, TestAppContext, WindowHandle};
use project::{Project, Worktree};
use settings::SettingsStore;
use terminal_view::terminal_panel::TerminalPanel;
use workspace::Workspace;
use crate::{debugger_panel::DebugPanel, session::DebugSession};
mod attach_modal;
mod console;
mod debugger_panel;
mod module_list;
mod stack_frame_list;
mod variable_list;
pub fn init_test(cx: &mut gpui::TestAppContext) {
if std::env::var("RUST_LOG").is_ok() {
env_logger::try_init().ok();
}
cx.update(|cx| {
let settings = SettingsStore::test(cx);
cx.set_global(settings);
terminal_view::init(cx);
theme::init(theme::LoadThemes::JustBase, cx);
command_palette_hooks::init(cx);
language::init(cx);
workspace::init_settings(cx);
Project::init_settings(cx);
editor::init(cx);
});
}
pub async fn init_test_workspace(
project: &Entity<Project>,
cx: &mut TestAppContext,
) -> WindowHandle<Workspace> {
let workspace_handle =
cx.add_window(|window, cx| Workspace::test_new(project.clone(), window, cx));
let debugger_panel = workspace_handle
.update(cx, |_, window, cx| cx.spawn_in(window, DebugPanel::load))
.unwrap()
.await
.expect("Failed to load debug panel");
let terminal_panel = workspace_handle
.update(cx, |_, window, cx| cx.spawn_in(window, TerminalPanel::load))
.unwrap()
.await
.expect("Failed to load terminal panel");
workspace_handle
.update(cx, |workspace, window, cx| {
workspace.add_panel(debugger_panel, window, cx);
workspace.add_panel(terminal_panel, window, cx);
})
.unwrap();
workspace_handle
}
pub fn active_debug_panel_item(
workspace: WindowHandle<Workspace>,
cx: &mut TestAppContext,
) -> Entity<DebugSession> {
workspace
.update(cx, |workspace, _window, cx| {
let debug_panel = workspace.panel::<DebugPanel>(cx).unwrap();
debug_panel
.update(cx, |this, cx| this.active_debug_panel_item(cx))
.unwrap()
})
.unwrap()
}
pub fn worktree_from_project(
project: &Entity<Project>,
cx: &mut TestAppContext,
) -> Entity<Worktree> {
project.read_with(cx, |project, cx| project.worktrees(cx).next().unwrap())
}


@@ -0,0 +1,277 @@
use crate::*;
use attach_modal::AttachModal;
use dap::requests::{Attach, Disconnect, Initialize};
use gpui::{BackgroundExecutor, TestAppContext, VisualTestContext};
use menu::{Cancel, Confirm};
use project::{FakeFs, Project};
use serde_json::json;
use std::sync::{
atomic::{AtomicBool, Ordering},
Arc,
};
use task::AttachConfig;
use tests::{init_test, init_test_workspace};
#[gpui::test]
async fn test_direct_attach_to_process(executor: BackgroundExecutor, cx: &mut TestAppContext) {
init_test(cx);
let send_attach_request = Arc::new(AtomicBool::new(false));
let fs = FakeFs::new(executor.clone());
fs.insert_tree(
"/project",
json!({
"main.rs": "First line\nSecond line\nThird line\nFourth line",
}),
)
.await;
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.start_debug_session(dap::test_config(), cx)
});
let (session, client) = task.await.unwrap();
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
})
.await;
client
.on_request::<Attach, _>({
let send_attach_request = send_attach_request.clone();
move |_, args| {
send_attach_request.store(true, Ordering::SeqCst);
assert_eq!(json!({"request": "attach", "process_id": 10}), args.raw);
Ok(())
}
})
.await;
client.on_request::<Disconnect, _>(move |_, _| Ok(())).await;
cx.run_until_parked();
assert!(
send_attach_request.load(std::sync::atomic::Ordering::SeqCst),
"Expected to send attach request, because we passed in the processId"
);
// assert we didn't show the attach modal
workspace
.update(cx, |workspace, _window, cx| {
assert!(workspace.active_modal::<AttachModal>(cx).is_none());
})
.unwrap();
let shutdown_session = project.update(cx, |project, cx| {
project.dap_store().update(cx, |dap_store, cx| {
dap_store.shutdown_client(&session.read(cx).id(), cx)
})
});
shutdown_session.await.unwrap();
}
#[gpui::test]
async fn test_show_attach_modal_and_select_process(
executor: BackgroundExecutor,
cx: &mut TestAppContext,
) {
init_test(cx);
let send_attach_request = Arc::new(AtomicBool::new(false));
let fs = FakeFs::new(executor.clone());
fs.insert_tree(
"/project",
json!({
"main.rs": "First line\nSecond line\nThird line\nFourth line",
}),
)
.await;
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.start_debug_session(dap::test_config(), cx)
});
let (session, client) = task.await.unwrap();
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
})
.await;
client
.on_request::<Attach, _>({
let send_attach_request = send_attach_request.clone();
move |_, args| {
send_attach_request.store(true, Ordering::SeqCst);
assert_eq!(
json!({
"request": "attach",
// note: FakeAdapter::attach_processes filters out every process whose id
// is not equal to the current process id
"process_id": std::process::id(),
}),
args.raw
);
Ok(())
}
})
.await;
client.on_request::<Disconnect, _>(move |_, _| Ok(())).await;
cx.run_until_parked();
// assert we show the attach modal
workspace
.update(cx, |workspace, _window, cx| {
let attach_modal = workspace.active_modal::<AttachModal>(cx).unwrap();
let names = attach_modal.update(cx, |modal, cx| attach_modal::procss_names(&modal, cx));
// we filtered out all processes that are not the current process (Zed itself)
assert_eq!(1, names.len());
})
.unwrap();
// select the only existing process
cx.dispatch_action(Confirm);
cx.run_until_parked();
// assert attach modal was dismissed
workspace
.update(cx, |workspace, _window, cx| {
assert!(workspace.active_modal::<AttachModal>(cx).is_none());
})
.unwrap();
assert!(
send_attach_request.load(std::sync::atomic::Ordering::SeqCst),
"Expected to send attach request, because we passed in the processId"
);
let shutdown_session = project.update(cx, |project, cx| {
project.dap_store().update(cx, |dap_store, cx| {
dap_store.shutdown_client(&session.read(cx).id(), cx)
})
});
shutdown_session.await.unwrap();
}
#[gpui::test]
async fn test_shutdown_session_when_modal_is_dismissed(
executor: BackgroundExecutor,
cx: &mut TestAppContext,
) {
init_test(cx);
let send_attach_request = Arc::new(AtomicBool::new(false));
let fs = FakeFs::new(executor.clone());
fs.insert_tree(
"/project",
json!({
"main.rs": "First line\nSecond line\nThird line\nFourth line",
}),
)
.await;
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.start_debug_session(dap::test_config(), cx)
});
let (session, client) = task.await.unwrap();
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
})
.await;
client
.on_request::<Attach, _>({
let send_attach_request = send_attach_request.clone();
move |_, _| {
send_attach_request.store(true, Ordering::SeqCst);
Ok(())
}
})
.await;
client.on_request::<Disconnect, _>(move |_, _| Ok(())).await;
cx.run_until_parked();
// assert we show the attach modal
workspace
.update(cx, |workspace, _window, cx| {
let attach_modal = workspace.active_modal::<AttachModal>(cx).unwrap();
let names = attach_modal.update(cx, |modal, cx| attach_modal::procss_names(&modal, cx));
// we filtered out all processes that are not the current process (Zed itself)
assert_eq!(1, names.len());
})
.unwrap();
// close the modal
cx.dispatch_action(Cancel);
cx.run_until_parked();
// assert attach modal was dismissed
workspace
.update(cx, |workspace, _window, cx| {
assert!(workspace.active_modal::<AttachModal>(cx).is_none());
})
.unwrap();
assert!(
!send_attach_request.load(std::sync::atomic::Ordering::SeqCst),
"Didn't expected to send attach request, because we closed the modal"
);
// assert debug session is shutdown
project.update(cx, |project, cx| {
project.dap_store().update(cx, |dap_store, cx| {
assert!(dap_store.session_by_id(&session.read(cx).id()).is_none())
});
});
}


@@ -0,0 +1,853 @@
use crate::*;
use dap::{
requests::{Disconnect, Evaluate, Initialize, Launch, Scopes, StackTrace, Variables},
Scope, StackFrame, Variable,
};
use gpui::{BackgroundExecutor, TestAppContext, VisualTestContext};
use project::{FakeFs, Project};
use serde_json::json;
use std::sync::{
atomic::{AtomicBool, Ordering},
Arc, Mutex,
};
use tests::{active_debug_panel_item, init_test, init_test_workspace};
use unindent::Unindent as _;
use variable_list::{VariableContainer, VariableListEntry};
#[gpui::test]
async fn test_handle_output_event(executor: BackgroundExecutor, cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(executor.clone());
fs.insert_tree(
"/project",
json!({
"main.rs": "First line\nSecond line\nThird line\nFourth line",
}),
)
.await;
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.start_debug_session(dap::test_config(), cx)
});
let (session, client) = task.await.unwrap();
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
})
.await;
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
client
.on_request::<StackTrace, _>(move |_, _| {
Ok(dap::StackTraceResponse {
stack_frames: Vec::default(),
total_frames: None,
})
})
.await;
client.on_request::<Disconnect, _>(move |_, _| Ok(())).await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: None,
output: "First console output line before thread stopped!".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: None,
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "First output line before thread stopped!".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: None,
}))
.await;
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
reason: dap::StoppedEventReason::Pause,
description: None,
thread_id: Some(1),
preserve_focus_hint: None,
text: None,
all_threads_stopped: None,
hit_breakpoint_ids: None,
}))
.await;
cx.run_until_parked();
// assert we have output from before the thread stopped
workspace
.update(cx, |workspace, _window, cx| {
let debug_panel = workspace.panel::<DebugPanel>(cx).unwrap();
let active_debug_panel_item = debug_panel
.update(cx, |this, cx| this.active_debug_panel_item(cx))
.unwrap();
assert_eq!(1, debug_panel.read(cx).message_queue().len());
assert_eq!(
"First console output line before thread stopped!\nFirst output line before thread stopped!\n",
active_debug_panel_item.read(cx).console().read(cx).editor().read(cx).text(cx).as_str()
);
})
.unwrap();
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "Second output line after thread stopped!".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: None,
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Console),
output: "Second console output line after thread stopped!".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: None,
}))
.await;
cx.run_until_parked();
// assert we have output from before and after the thread stopped
workspace
.update(cx, |workspace, _window, cx| {
let debug_panel = workspace.panel::<DebugPanel>(cx).unwrap();
let active_debug_panel_item = debug_panel
.update(cx, |this, cx| this.active_debug_panel_item(cx))
.unwrap();
assert!(!debug_panel.read(cx).message_queue().is_empty());
assert_eq!(
"First console output line before thread stopped!\nFirst output line before thread stopped!\nSecond output line after thread stopped!\nSecond console output line after thread stopped!\n",
active_debug_panel_item.read(cx).console().read(cx).editor().read(cx).text(cx).as_str()
);
})
.unwrap();
let shutdown_session = project.update(cx, |project, cx| {
project.dap_store().update(cx, |dap_store, cx| {
dap_store.shutdown_client(&session.read(cx).id(), cx)
})
});
shutdown_session.await.unwrap();
}
#[gpui::test]
async fn test_grouped_output(executor: BackgroundExecutor, cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(executor.clone());
fs.insert_tree(
"/project",
json!({
"main.rs": "First line\nSecond line\nThird line\nFourth line",
}),
)
.await;
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.start_debug_session(dap::test_config(), cx)
});
let (session, client) = task.await.unwrap();
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
})
.await;
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
client
.on_request::<StackTrace, _>(move |_, _| {
Ok(dap::StackTraceResponse {
stack_frames: Vec::default(),
total_frames: None,
})
})
.await;
client.on_request::<Disconnect, _>(move |_, _| Ok(())).await;
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
reason: dap::StoppedEventReason::Pause,
description: None,
thread_id: Some(1),
preserve_focus_hint: None,
text: None,
all_threads_stopped: None,
hit_breakpoint_ids: None,
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: None,
output: "First line".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: None,
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "First group".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: Some(dap::OutputEventGroup::Start),
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "First item in group 1".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: None,
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "Second item in group 1".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: None,
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "Second group".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: Some(dap::OutputEventGroup::Start),
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "First item in group 2".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: None,
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "Second item in group 2".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: None,
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "End group 2".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: Some(dap::OutputEventGroup::End),
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "Third group".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: Some(dap::OutputEventGroup::StartCollapsed),
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "First item in group 3".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: None,
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "Second item in group 3".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: None,
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "End group 3".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: Some(dap::OutputEventGroup::End),
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "Third item in group 1".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: None,
}))
.await;
client
.fake_event(dap::messages::Events::Output(dap::OutputEvent {
category: Some(dap::OutputEventCategory::Stdout),
output: "Second item".to_string(),
data: None,
variables_reference: None,
source: None,
line: None,
column: None,
group: Some(dap::OutputEventGroup::End),
}))
.await;
cx.run_until_parked();
active_debug_panel_item(workspace, cx).update(cx, |debug_panel_item, cx| {
debug_panel_item.console().update(cx, |console, cx| {
console.editor().update(cx, |editor, cx| {
pretty_assertions::assert_eq!(
"
First line
First group
First item in group 1
Second item in group 1
Second group
First item in group 2
Second item in group 2
End group 2
⋯ End group 3
Third item in group 1
Second item
"
.unindent(),
editor.display_text(cx)
);
})
});
});
let shutdown_session = project.update(cx, |project, cx| {
project.dap_store().update(cx, |dap_store, cx| {
dap_store.shutdown_client(&session.read(cx).id(), cx)
})
});
shutdown_session.await.unwrap();
}
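The grouped-output test above relies on `OutputEventGroup::Start`, `StartCollapsed`, and `End` markers nesting correctly. A minimal, self-contained sketch of one way to fold such markers into a transcript (names like `Group` and `render` are hypothetical; Zed's console actually renders groups as foldable editor regions, shown collapsed as `⋯` above, rather than as indentation):

```rust
// Hypothetical, simplified model of DAP output grouping.
#[derive(Clone, Copy, PartialEq)]
enum Group {
    Start,
    StartCollapsed,
    End,
    None,
}

struct OutputLine<'a> {
    text: &'a str,
    group: Group,
}

/// Fold group markers into an indentation depth while emitting lines.
fn render(events: &[OutputLine]) -> String {
    let mut depth = 0usize;
    let mut out = String::new();
    for ev in events {
        // An End marker closes the current group before its own line is emitted.
        if ev.group == Group::End {
            depth = depth.saturating_sub(1);
        }
        out.push_str(&"    ".repeat(depth));
        out.push_str(ev.text);
        out.push('\n');
        // Start markers open a group for all subsequent lines.
        if matches!(ev.group, Group::Start | Group::StartCollapsed) {
            depth += 1;
        }
    }
    out
}

fn main() {
    let events = [
        OutputLine { text: "First group", group: Group::Start },
        OutputLine { text: "Item in group", group: Group::None },
        OutputLine { text: "End group", group: Group::End },
        OutputLine { text: "After group", group: Group::None },
    ];
    assert_eq!(
        render(&events),
        "First group\n    Item in group\nEnd group\nAfter group\n"
    );
}
```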
#[gpui::test]
async fn test_evaluate_expression(executor: BackgroundExecutor, cx: &mut TestAppContext) {
init_test(cx);
const NEW_VALUE: &str = "{nested1: \"Nested 1 updated\", nested2: \"Nested 2 updated\"}";
let called_evaluate = Arc::new(AtomicBool::new(false));
let fs = FakeFs::new(executor.clone());
let test_file_content = r#"
const variable1 = {
nested1: "Nested 1",
nested2: "Nested 2",
};
const variable2 = "Value 2";
const variable3 = "Value 3";
"#
.unindent();
fs.insert_tree(
"/project",
json!({
"src": {
"test.js": test_file_content,
}
}),
)
.await;
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.start_debug_session(dap::test_config(), cx)
});
let (session, client) = task.await.unwrap();
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
})
.await;
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
let stack_frames = vec![StackFrame {
id: 1,
name: "Stack Frame 1".into(),
source: Some(dap::Source {
name: Some("test.js".into()),
path: Some("/project/src/test.js".into()),
source_reference: None,
presentation_hint: None,
origin: None,
sources: None,
adapter_data: None,
checksums: None,
}),
line: 3,
column: 1,
end_line: None,
end_column: None,
can_restart: None,
instruction_pointer_reference: None,
module_id: None,
presentation_hint: None,
}];
client
.on_request::<StackTrace, _>({
let stack_frames = Arc::new(stack_frames.clone());
move |_, args| {
assert_eq!(1, args.thread_id);
Ok(dap::StackTraceResponse {
stack_frames: (*stack_frames).clone(),
total_frames: None,
})
}
})
.await;
let scopes = vec![
Scope {
name: "Scope 1".into(),
presentation_hint: None,
variables_reference: 2,
named_variables: None,
indexed_variables: None,
expensive: false,
source: None,
line: None,
column: None,
end_line: None,
end_column: None,
},
Scope {
name: "Scope 2".into(),
presentation_hint: None,
variables_reference: 4,
named_variables: None,
indexed_variables: None,
expensive: false,
source: None,
line: None,
column: None,
end_line: None,
end_column: None,
},
];
client
.on_request::<Scopes, _>({
let scopes = Arc::new(scopes.clone());
move |_, args| {
assert_eq!(1, args.frame_id);
Ok(dap::ScopesResponse {
scopes: (*scopes).clone(),
})
}
})
.await;
let scope1_variables = Arc::new(Mutex::new(vec![
Variable {
name: "variable1".into(),
value: "{nested1: \"Nested 1\", nested2: \"Nested 2\"}".into(),
type_: None,
presentation_hint: None,
evaluate_name: None,
variables_reference: 3,
named_variables: None,
indexed_variables: None,
memory_reference: None,
},
Variable {
name: "variable2".into(),
value: "Value 2".into(),
type_: None,
presentation_hint: None,
evaluate_name: None,
variables_reference: 0,
named_variables: None,
indexed_variables: None,
memory_reference: None,
},
]));
let nested_variables = vec![
Variable {
name: "nested1".into(),
value: "Nested 1".into(),
type_: None,
presentation_hint: None,
evaluate_name: None,
variables_reference: 0,
named_variables: None,
indexed_variables: None,
memory_reference: None,
},
Variable {
name: "nested2".into(),
value: "Nested 2".into(),
type_: None,
presentation_hint: None,
evaluate_name: None,
variables_reference: 0,
named_variables: None,
indexed_variables: None,
memory_reference: None,
},
];
let scope2_variables = vec![Variable {
name: "variable3".into(),
value: "Value 3".into(),
type_: None,
presentation_hint: None,
evaluate_name: None,
variables_reference: 0,
named_variables: None,
indexed_variables: None,
memory_reference: None,
}];
client
.on_request::<Variables, _>({
let scope1_variables = scope1_variables.clone();
let nested_variables = Arc::new(nested_variables.clone());
let scope2_variables = Arc::new(scope2_variables.clone());
move |_, args| match args.variables_reference {
4 => Ok(dap::VariablesResponse {
variables: (*scope2_variables).clone(),
}),
3 => Ok(dap::VariablesResponse {
variables: (*nested_variables).clone(),
}),
2 => Ok(dap::VariablesResponse {
variables: scope1_variables.lock().unwrap().clone(),
}),
id => unreachable!("unexpected variables reference {id}"),
}
})
.await;
client
.on_request::<Evaluate, _>({
let called_evaluate = called_evaluate.clone();
let scope1_variables = scope1_variables.clone();
move |_, args| {
called_evaluate.store(true, Ordering::SeqCst);
assert_eq!(format!("$variable1 = {}", NEW_VALUE), args.expression);
assert_eq!(Some(1), args.frame_id);
assert_eq!(Some(dap::EvaluateArgumentsContext::Variables), args.context);
scope1_variables.lock().unwrap()[0].value = NEW_VALUE.to_string();
Ok(dap::EvaluateResponse {
result: NEW_VALUE.into(),
type_: None,
presentation_hint: None,
variables_reference: 0,
named_variables: None,
indexed_variables: None,
memory_reference: None,
})
}
})
.await;
client.on_request::<Disconnect, _>(move |_, _| Ok(())).await;
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
reason: dap::StoppedEventReason::Pause,
description: None,
thread_id: Some(1),
preserve_focus_hint: None,
text: None,
all_threads_stopped: None,
hit_breakpoint_ids: None,
}))
.await;
cx.run_until_parked();
// toggle nested variables for scope 1
active_debug_panel_item(workspace, cx).update(cx, |debug_panel_item, cx| {
let scope1_variables = scope1_variables.lock().unwrap().clone();
debug_panel_item
.variable_list()
.update(cx, |variable_list, cx| {
variable_list.toggle_entry(
&variable_list::OpenEntry::Variable {
scope_name: scopes[0].name.clone(),
name: scope1_variables[0].name.clone(),
depth: 1,
},
cx,
);
});
});
cx.run_until_parked();
active_debug_panel_item(workspace, cx).update_in(cx, |debug_panel_item, window, cx| {
debug_panel_item.console().update(cx, |console, item_cx| {
console
.query_bar()
.update(item_cx, |query_bar, console_cx| {
query_bar.set_text(format!("$variable1 = {}", NEW_VALUE), window, console_cx);
});
console.evaluate(&menu::Confirm, window, item_cx);
});
});
cx.run_until_parked();
active_debug_panel_item(workspace, cx).update(cx, |debug_panel_item, cx| {
assert_eq!(
"",
debug_panel_item
.console()
.read(cx)
.query_bar()
.read(cx)
.text(cx)
.as_str()
);
assert_eq!(
format!("{}\n", NEW_VALUE),
debug_panel_item
.console()
.read(cx)
.editor()
.read(cx)
.text(cx)
.as_str()
);
debug_panel_item
.variable_list()
.update(cx, |variable_list, _| {
let scope1_variables = scope1_variables.lock().unwrap().clone();
// scope 1
assert_eq!(
vec![
VariableContainer {
container_reference: scopes[0].variables_reference,
variable: scope1_variables[0].clone(),
depth: 1,
},
VariableContainer {
container_reference: scope1_variables[0].variables_reference,
variable: nested_variables[0].clone(),
depth: 2,
},
VariableContainer {
container_reference: scope1_variables[0].variables_reference,
variable: nested_variables[1].clone(),
depth: 2,
},
VariableContainer {
container_reference: scopes[0].variables_reference,
variable: scope1_variables[1].clone(),
depth: 1,
},
],
variable_list.variables_by_scope(1, 2).unwrap().variables()
);
// scope 2
assert_eq!(
vec![VariableContainer {
container_reference: scopes[1].variables_reference,
variable: scope2_variables[0].clone(),
depth: 1,
}],
variable_list.variables_by_scope(1, 4).unwrap().variables()
);
// assert visual entries
assert_eq!(
vec![
VariableListEntry::Scope(scopes[0].clone()),
VariableListEntry::Variable {
depth: 1,
scope: Arc::new(scopes[0].clone()),
has_children: true,
variable: Arc::new(scope1_variables[0].clone()),
container_reference: scopes[0].variables_reference,
},
VariableListEntry::Variable {
depth: 2,
scope: Arc::new(scopes[0].clone()),
has_children: false,
variable: Arc::new(nested_variables[0].clone()),
container_reference: scope1_variables[0].variables_reference,
},
VariableListEntry::Variable {
depth: 2,
scope: Arc::new(scopes[0].clone()),
has_children: false,
variable: Arc::new(nested_variables[1].clone()),
container_reference: scope1_variables[0].variables_reference,
},
VariableListEntry::Variable {
depth: 1,
scope: Arc::new(scopes[0].clone()),
has_children: false,
variable: Arc::new(scope1_variables[1].clone()),
container_reference: scopes[0].variables_reference,
},
VariableListEntry::Scope(scopes[1].clone()),
],
variable_list.entries().get(&1).unwrap().clone()
);
});
});
assert!(
called_evaluate.load(std::sync::atomic::Ordering::SeqCst),
"Expected evaluate request to be called"
);
let shutdown_session = project.update(cx, |project, cx| {
project.dap_store().update(cx, |dap_store, cx| {
dap_store.shutdown_client(&session.read(cx).id(), cx)
})
});
shutdown_session.await.unwrap();
}


@@ -0,0 +1,223 @@
use crate::{
session::ThreadItem,
tests::{active_debug_panel_item, init_test, init_test_workspace},
};
use dap::{
requests::{Disconnect, Initialize, Launch, Modules, StackTrace},
StoppedEvent,
};
use gpui::{BackgroundExecutor, TestAppContext, VisualTestContext};
use project::{FakeFs, Project};
use std::sync::{
atomic::{AtomicBool, Ordering},
Arc,
};
#[gpui::test]
async fn test_module_list(executor: BackgroundExecutor, cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(executor.clone());
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.start_debug_session(dap::test_config(), cx)
});
let (session, client) = task.await.unwrap();
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_modules_request: Some(true),
..Default::default()
})
})
.await;
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
client
.on_request::<StackTrace, _>(move |_, _| {
Ok(dap::StackTraceResponse {
stack_frames: Vec::default(),
total_frames: None,
})
})
.await;
let called_modules = Arc::new(AtomicBool::new(false));
let modules = vec![
dap::Module {
id: dap::ModuleId::Number(1),
name: "First Module".into(),
address_range: None,
date_time_stamp: None,
path: None,
symbol_file_path: None,
symbol_status: None,
version: None,
is_optimized: None,
is_user_code: None,
},
dap::Module {
id: dap::ModuleId::Number(2),
name: "Second Module".into(),
address_range: None,
date_time_stamp: None,
path: None,
symbol_file_path: None,
symbol_status: None,
version: None,
is_optimized: None,
is_user_code: None,
},
];
client
.on_request::<Modules, _>({
let called_modules = called_modules.clone();
let modules = modules.clone();
move |_, _| {
assert!(
!called_modules.swap(true, Ordering::SeqCst),
"This request should only be called once from the host"
);
Ok(dap::ModulesResponse {
modules: modules.clone(),
total_modules: Some(2u64),
})
}
})
.await;
client
.fake_event(dap::messages::Events::Stopped(StoppedEvent {
reason: dap::StoppedEventReason::Pause,
description: None,
thread_id: Some(1),
preserve_focus_hint: None,
text: None,
all_threads_stopped: None,
hit_breakpoint_ids: None,
}))
.await;
client.on_request::<Disconnect, _>(move |_, _| Ok(())).await;
cx.run_until_parked();
assert!(
!called_modules.load(std::sync::atomic::Ordering::SeqCst),
"Request Modules shouldn't be called before it's needed"
);
active_debug_panel_item(workspace, cx).update(cx, |item, cx| {
item.set_thread_item(ThreadItem::Modules, cx);
});
cx.run_until_parked();
assert!(
called_modules.load(std::sync::atomic::Ordering::SeqCst),
"Request Modules should be called because a user clicked on the module list"
);
active_debug_panel_item(workspace, cx).update(cx, |item, cx| {
item.set_thread_item(ThreadItem::Modules, cx);
let actual_modules = item.modules().update(cx, |list, cx| list.modules(cx));
assert_eq!(modules, actual_modules);
});
// Exercise all module event reasons: New, Changed, and Removed
let new_module = dap::Module {
id: dap::ModuleId::Number(3),
name: "Third Module".into(),
address_range: None,
date_time_stamp: None,
path: None,
symbol_file_path: None,
symbol_status: None,
version: None,
is_optimized: None,
is_user_code: None,
};
client
.fake_event(dap::messages::Events::Module(dap::ModuleEvent {
reason: dap::ModuleEventReason::New,
module: new_module.clone(),
}))
.await;
cx.run_until_parked();
active_debug_panel_item(workspace, cx).update(cx, |item, cx| {
let actual_modules = item.modules().update(cx, |list, cx| list.modules(cx));
assert_eq!(actual_modules.len(), 3);
assert!(actual_modules.contains(&new_module));
});
let changed_module = dap::Module {
id: dap::ModuleId::Number(2),
name: "Modified Second Module".into(),
address_range: None,
date_time_stamp: None,
path: None,
symbol_file_path: None,
symbol_status: None,
version: None,
is_optimized: None,
is_user_code: None,
};
client
.fake_event(dap::messages::Events::Module(dap::ModuleEvent {
reason: dap::ModuleEventReason::Changed,
module: changed_module.clone(),
}))
.await;
cx.run_until_parked();
active_debug_panel_item(workspace, cx).update(cx, |item, cx| {
let actual_modules = item.modules().update(cx, |list, cx| list.modules(cx));
assert_eq!(actual_modules.len(), 3);
assert!(actual_modules.contains(&changed_module));
});
client
.fake_event(dap::messages::Events::Module(dap::ModuleEvent {
reason: dap::ModuleEventReason::Removed,
module: changed_module.clone(),
}))
.await;
cx.run_until_parked();
active_debug_panel_item(workspace, cx).update(cx, |item, cx| {
let actual_modules = item.modules().update(cx, |list, cx| list.modules(cx));
assert_eq!(actual_modules.len(), 2);
assert!(!actual_modules.contains(&changed_module));
});
let shutdown_session = project.update(cx, |project, cx| {
project.dap_store().update(cx, |dap_store, cx| {
dap_store.shutdown_client(&session.read(cx).id(), cx)
})
});
shutdown_session.await.unwrap();
}
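The module-event handling exercised by the test above reduces to applying the event's reason (`new`, `changed`, `removed`) against the current module list. A minimal sketch under that assumption, with `Module` as a simplified stand-in for `dap::Module` (the real handler lives in Zed's module list view):

```rust
// Simplified stand-in for dap::Module; only identity and name matter here.
#[derive(Clone, Debug, PartialEq)]
struct Module {
    id: u64,
    name: String,
}

enum ModuleEventReason {
    New,
    Changed,
    Removed,
}

/// Apply one DAP `module` event to the tracked module list.
fn apply_module_event(modules: &mut Vec<Module>, reason: ModuleEventReason, module: Module) {
    match reason {
        ModuleEventReason::New => modules.push(module),
        ModuleEventReason::Changed => {
            if let Some(existing) = modules.iter_mut().find(|m| m.id == module.id) {
                *existing = module;
            }
        }
        ModuleEventReason::Removed => modules.retain(|m| m.id != module.id),
    }
}

fn main() {
    let mut modules = vec![Module { id: 1, name: "First Module".into() }];
    apply_module_event(
        &mut modules,
        ModuleEventReason::New,
        Module { id: 2, name: "Second Module".into() },
    );
    assert_eq!(modules.len(), 2);
    apply_module_event(
        &mut modules,
        ModuleEventReason::Changed,
        Module { id: 2, name: "Modified Second Module".into() },
    );
    assert_eq!(modules[1].name, "Modified Second Module");
    apply_module_event(
        &mut modules,
        ModuleEventReason::Removed,
        Module { id: 2, name: "Modified Second Module".into() },
    );
    assert_eq!(modules.len(), 1);
}
```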


@@ -0,0 +1,645 @@
use crate::{
debugger_panel::DebugPanel,
stack_frame_list::StackFrameEntry,
tests::{active_debug_panel_item, init_test, init_test_workspace},
};
use dap::{
requests::{Disconnect, Initialize, Launch, StackTrace},
StackFrame,
};
use editor::{Editor, ToPoint as _};
use gpui::{BackgroundExecutor, TestAppContext, VisualTestContext};
use project::{FakeFs, Project};
use serde_json::json;
use std::sync::Arc;
use unindent::Unindent as _;
#[gpui::test]
async fn test_fetch_initial_stack_frames_and_go_to_stack_frame(
executor: BackgroundExecutor,
cx: &mut TestAppContext,
) {
init_test(cx);
let fs = FakeFs::new(executor.clone());
let test_file_content = r#"
import { SOME_VALUE } from './module.js';
console.log(SOME_VALUE);
"#
.unindent();
let module_file_content = r#"
export const SOME_VALUE = 'some value';
"#
.unindent();
fs.insert_tree(
"/project",
json!({
"src": {
"test.js": test_file_content,
"module.js": module_file_content,
}
}),
)
.await;
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.start_debug_session(dap::test_config(), cx)
});
let (session, client) = task.await.unwrap();
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
})
.await;
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
let stack_frames = vec![
StackFrame {
id: 1,
name: "Stack Frame 1".into(),
source: Some(dap::Source {
name: Some("test.js".into()),
path: Some("/project/src/test.js".into()),
source_reference: None,
presentation_hint: None,
origin: None,
sources: None,
adapter_data: None,
checksums: None,
}),
line: 3,
column: 1,
end_line: None,
end_column: None,
can_restart: None,
instruction_pointer_reference: None,
module_id: None,
presentation_hint: None,
},
StackFrame {
id: 2,
name: "Stack Frame 2".into(),
source: Some(dap::Source {
name: Some("module.js".into()),
path: Some("/project/src/module.js".into()),
source_reference: None,
presentation_hint: None,
origin: None,
sources: None,
adapter_data: None,
checksums: None,
}),
line: 1,
column: 1,
end_line: None,
end_column: None,
can_restart: None,
instruction_pointer_reference: None,
module_id: None,
presentation_hint: None,
},
];
client
.on_request::<StackTrace, _>({
let stack_frames = Arc::new(stack_frames.clone());
move |_, args| {
assert_eq!(1, args.thread_id);
Ok(dap::StackTraceResponse {
stack_frames: (*stack_frames).clone(),
total_frames: None,
})
}
})
.await;
client.on_request::<Disconnect, _>(move |_, _| Ok(())).await;
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
reason: dap::StoppedEventReason::Pause,
description: None,
thread_id: Some(1),
preserve_focus_hint: None,
text: None,
all_threads_stopped: None,
hit_breakpoint_ids: None,
}))
.await;
cx.run_until_parked();
workspace
.update(cx, |workspace, _window, cx| {
let debug_panel = workspace.panel::<DebugPanel>(cx).unwrap();
let active_debug_panel_item = debug_panel
.update(cx, |this, cx| this.active_debug_panel_item(cx))
.unwrap();
active_debug_panel_item.update(cx, |debug_panel_item, cx| {
let (stack_frame_list, stack_frame_id) =
debug_panel_item.stack_frame_list().update(cx, |list, cx| {
(
list.stack_frames(cx)
.into_iter()
.map(|frame| frame.dap)
.collect::<Vec<_>>(),
list.current_stack_frame_id(),
)
});
assert_eq!(1, stack_frame_id);
assert_eq!(stack_frames, stack_frame_list);
});
})
.unwrap();
let shutdown_session = project.update(cx, |project, cx| {
project.dap_store().update(cx, |dap_store, cx| {
dap_store.shutdown_client(&session.read(cx).id(), cx)
})
});
shutdown_session.await.unwrap();
}
#[gpui::test]
async fn test_select_stack_frame(executor: BackgroundExecutor, cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(executor.clone());
let test_file_content = r#"
import { SOME_VALUE } from './module.js';
console.log(SOME_VALUE);
"#
.unindent();
let module_file_content = r#"
export const SOME_VALUE = 'some value';
"#
.unindent();
fs.insert_tree(
"/project",
json!({
"src": {
"test.js": test_file_content,
"module.js": module_file_content,
}
}),
)
.await;
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.start_debug_session(dap::test_config(), cx)
});
let (session, client) = task.await.unwrap();
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
})
.await;
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
let stack_frames = vec![
StackFrame {
id: 1,
name: "Stack Frame 1".into(),
source: Some(dap::Source {
name: Some("test.js".into()),
path: Some("/project/src/test.js".into()),
source_reference: None,
presentation_hint: None,
origin: None,
sources: None,
adapter_data: None,
checksums: None,
}),
line: 3,
column: 1,
end_line: None,
end_column: None,
can_restart: None,
instruction_pointer_reference: None,
module_id: None,
presentation_hint: None,
},
StackFrame {
id: 2,
name: "Stack Frame 2".into(),
source: Some(dap::Source {
name: Some("module.js".into()),
path: Some("/project/src/module.js".into()),
source_reference: None,
presentation_hint: None,
origin: None,
sources: None,
adapter_data: None,
checksums: None,
}),
line: 1,
column: 1,
end_line: None,
end_column: None,
can_restart: None,
instruction_pointer_reference: None,
module_id: None,
presentation_hint: None,
},
];
client
.on_request::<StackTrace, _>({
let stack_frames = Arc::new(stack_frames.clone());
move |_, args| {
assert_eq!(1, args.thread_id);
Ok(dap::StackTraceResponse {
stack_frames: (*stack_frames).clone(),
total_frames: None,
})
}
})
.await;
client.on_request::<Disconnect, _>(move |_, _| Ok(())).await;
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
reason: dap::StoppedEventReason::Pause,
description: None,
thread_id: Some(1),
preserve_focus_hint: None,
text: None,
all_threads_stopped: None,
hit_breakpoint_ids: None,
}))
.await;
cx.run_until_parked();
workspace
.update(cx, |workspace, window, cx| {
let debug_panel = workspace.panel::<DebugPanel>(cx).unwrap();
let active_debug_panel_item = debug_panel
.update(cx, |this, cx| this.active_debug_panel_item(cx))
.unwrap();
active_debug_panel_item.update(cx, |debug_panel_item, cx| {
let (stack_frame_list, stack_frame_id) =
debug_panel_item.stack_frame_list().update(cx, |list, cx| {
(
list.stack_frames(cx)
.into_iter()
.map(|frame| frame.dap)
.collect::<Vec<_>>(),
list.current_stack_frame_id(),
)
});
assert_eq!(1, stack_frame_id);
assert_eq!(stack_frames, stack_frame_list);
});
let editors = workspace.items_of_type::<Editor>(cx).collect::<Vec<_>>();
assert_eq!(1, editors.len());
let project_path = editors[0]
.update(cx, |editor, cx| editor.project_path(cx))
.unwrap();
assert_eq!("src/test.js", project_path.path.to_string_lossy());
assert_eq!(test_file_content, editors[0].read(cx).text(cx));
assert_eq!(
vec![2..3],
editors[0].update(cx, |editor, cx| {
let snapshot = editor.snapshot(window, cx);
editor
.highlighted_rows::<editor::DebugCurrentRowHighlight>()
.map(|(range, _)| {
let start = range.start.to_point(&snapshot.buffer_snapshot);
let end = range.end.to_point(&snapshot.buffer_snapshot);
start.row..end.row
})
.collect::<Vec<_>>()
})
);
})
.unwrap();
let stack_frame_list = workspace
.update(cx, |workspace, _window, cx| {
let debug_panel = workspace.panel::<DebugPanel>(cx).unwrap();
let active_debug_panel_item = debug_panel
.update(cx, |this, cx| this.active_debug_panel_item(cx))
.unwrap();
active_debug_panel_item.read(cx).stack_frame_list().clone()
})
.unwrap();
// select second stack frame
stack_frame_list
.update_in(cx, |stack_frame_list, window, cx| {
stack_frame_list.select_stack_frame(&stack_frames[1], true, window, cx)
})
.await
.unwrap();
workspace
.update(cx, |workspace, window, cx| {
let debug_panel = workspace.panel::<DebugPanel>(cx).unwrap();
let active_debug_panel_item = debug_panel
.update(cx, |this, cx| this.active_debug_panel_item(cx))
.unwrap();
active_debug_panel_item.update(cx, |debug_panel_item, cx| {
let (stack_frame_list, stack_frame_id) =
debug_panel_item.stack_frame_list().update(cx, |list, cx| {
(
list.stack_frames(cx)
.into_iter()
.map(|frame| frame.dap)
.collect::<Vec<_>>(),
list.current_stack_frame_id(),
)
});
assert_eq!(2, stack_frame_id);
assert_eq!(stack_frames, stack_frame_list);
});
let editors = workspace.items_of_type::<Editor>(cx).collect::<Vec<_>>();
assert_eq!(1, editors.len());
let project_path = editors[0]
.update(cx, |editor, cx| editor.project_path(cx))
.unwrap();
assert_eq!("src/module.js", project_path.path.to_string_lossy());
assert_eq!(module_file_content, editors[0].read(cx).text(cx));
assert_eq!(
vec![0..1],
editors[0].update(cx, |editor, cx| {
let snapshot = editor.snapshot(window, cx);
editor
.highlighted_rows::<editor::DebugCurrentRowHighlight>()
.map(|(range, _)| {
let start = range.start.to_point(&snapshot.buffer_snapshot);
let end = range.end.to_point(&snapshot.buffer_snapshot);
start.row..end.row
})
.collect::<Vec<_>>()
})
);
})
.unwrap();
let shutdown_session = project.update(cx, |project, cx| {
project.dap_store().update(cx, |dap_store, cx| {
dap_store.shutdown_client(&session.read(cx).id(), cx)
})
});
shutdown_session.await.unwrap();
}
#[gpui::test]
async fn test_collapsed_entries(executor: BackgroundExecutor, cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(executor.clone());
let test_file_content = r#"
import { SOME_VALUE } from './module.js';
console.log(SOME_VALUE);
"#
.unindent();
let module_file_content = r#"
export const SOME_VALUE = 'some value';
"#
.unindent();
fs.insert_tree(
"/project",
json!({
"src": {
"test.js": test_file_content,
"module.js": module_file_content,
}
}),
)
.await;
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.start_debug_session(dap::test_config(), cx)
});
let (session, client) = task.await.unwrap();
client
.on_request::<Initialize, _>(move |_, _| {
Ok(dap::Capabilities {
supports_step_back: Some(false),
..Default::default()
})
})
.await;
client.on_request::<Launch, _>(move |_, _| Ok(())).await;
let stack_frames = vec![
StackFrame {
id: 1,
name: "Stack Frame 1".into(),
source: Some(dap::Source {
name: Some("test.js".into()),
path: Some("/project/src/test.js".into()),
source_reference: None,
presentation_hint: None,
origin: None,
sources: None,
adapter_data: None,
checksums: None,
}),
line: 3,
column: 1,
end_line: None,
end_column: None,
can_restart: None,
instruction_pointer_reference: None,
module_id: None,
presentation_hint: None,
},
StackFrame {
id: 2,
name: "Stack Frame 2".into(),
source: Some(dap::Source {
name: Some("module.js".into()),
path: Some("/project/src/module.js".into()),
source_reference: None,
presentation_hint: None,
origin: Some("ignored".into()),
sources: None,
adapter_data: None,
checksums: None,
}),
line: 1,
column: 1,
end_line: None,
end_column: None,
can_restart: None,
instruction_pointer_reference: None,
module_id: None,
presentation_hint: Some(dap::StackFramePresentationHint::Deemphasize),
},
StackFrame {
id: 3,
name: "Stack Frame 3".into(),
source: Some(dap::Source {
name: Some("module.js".into()),
path: Some("/project/src/module.js".into()),
source_reference: None,
presentation_hint: None,
origin: Some("ignored".into()),
sources: None,
adapter_data: None,
checksums: None,
}),
line: 1,
column: 1,
end_line: None,
end_column: None,
can_restart: None,
instruction_pointer_reference: None,
module_id: None,
presentation_hint: Some(dap::StackFramePresentationHint::Deemphasize),
},
StackFrame {
id: 4,
name: "Stack Frame 4".into(),
source: Some(dap::Source {
name: Some("module.js".into()),
path: Some("/project/src/module.js".into()),
source_reference: None,
presentation_hint: None,
origin: None,
sources: None,
adapter_data: None,
checksums: None,
}),
line: 1,
column: 1,
end_line: None,
end_column: None,
can_restart: None,
instruction_pointer_reference: None,
module_id: None,
presentation_hint: None,
},
];
client
.on_request::<StackTrace, _>({
let stack_frames = Arc::new(stack_frames.clone());
move |_, args| {
assert_eq!(1, args.thread_id);
Ok(dap::StackTraceResponse {
stack_frames: (*stack_frames).clone(),
total_frames: None,
})
}
})
.await;
client.on_request::<Disconnect, _>(move |_, _| Ok(())).await;
client
.fake_event(dap::messages::Events::Stopped(dap::StoppedEvent {
reason: dap::StoppedEventReason::Pause,
description: None,
thread_id: Some(1),
preserve_focus_hint: None,
text: None,
all_threads_stopped: None,
hit_breakpoint_ids: None,
}))
.await;
cx.run_until_parked();
active_debug_panel_item(workspace, cx).update(cx, |debug_panel_item, cx| {
debug_panel_item
.stack_frame_list()
.update(cx, |stack_frame_list, cx| {
assert_eq!(
&vec![
StackFrameEntry::Normal(stack_frames[0].clone()),
StackFrameEntry::Collapsed(vec![
stack_frames[1].clone(),
stack_frames[2].clone()
]),
StackFrameEntry::Normal(stack_frames[3].clone()),
],
stack_frame_list.entries()
);
stack_frame_list.expand_collapsed_entry(
1,
&vec![stack_frames[1].clone(), stack_frames[2].clone()],
cx,
);
assert_eq!(
&vec![
StackFrameEntry::Normal(stack_frames[0].clone()),
StackFrameEntry::Normal(stack_frames[1].clone()),
StackFrameEntry::Normal(stack_frames[2].clone()),
StackFrameEntry::Normal(stack_frames[3].clone()),
],
stack_frame_list.entries()
);
});
});
let shutdown_session = project.update(cx, |project, cx| {
project.dap_store().update(cx, |dap_store, cx| {
dap_store.shutdown_client(&session.read(cx).id(), cx)
})
});
shutdown_session.await.unwrap();
}

File diff suppressed because it is too large


@@ -55,6 +55,7 @@ linkify.workspace = true
log.workspace = true
lsp.workspace = true
markdown.workspace = true
menu.workspace = true
multi_buffer.workspace = true
ordered-float.workspace = true
parking_lot.workspace = true


@@ -393,6 +393,8 @@ gpui::actions!(
SwitchSourceHeader,
Tab,
TabPrev,
ToggleBreakpoint,
EditLogBreakpoint,
ToggleAutoSignatureHelp,
ToggleGitBlame,
ToggleGitBlameInline,


@@ -811,6 +811,17 @@ impl DisplaySnapshot {
.anchor_at(point.to_offset(self, bias), bias)
}
pub fn display_point_to_breakpoint_anchor(&self, point: DisplayPoint) -> Anchor {
let bias = if point.is_zero() {
Bias::Right
} else {
Bias::Left
};
self.buffer_snapshot
.anchor_at(point.to_offset(self, bias), bias)
}
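The bias choice in `display_point_to_breakpoint_anchor` above can be sketched in isolation. This is a minimal illustration with hypothetical simplified types (a plain `(row, column)` pair instead of `DisplayPoint`), not the real editor API: an anchor at the very start of the buffer leans `Right` so it stays attached to the first character when text is inserted at offset zero, while every other breakpoint anchor leans `Left`.

```rust
// Hypothetical sketch of the bias rule used above; `breakpoint_bias`
// and its (row, column) arguments stand in for DisplayPoint::is_zero().
#[derive(Debug, Clone, Copy, PartialEq)]
enum Bias {
    Left,
    Right,
}

fn breakpoint_bias(row: u32, column: u32) -> Bias {
    // Mirrors `point.is_zero()` in the code above, which checks both
    // coordinates of the display point.
    if row == 0 && column == 0 {
        Bias::Right
    } else {
        Bias::Left
    }
}

fn main() {
    assert_eq!(breakpoint_bias(0, 0), Bias::Right);
    assert_eq!(breakpoint_bias(0, 4), Bias::Left);
    assert_eq!(breakpoint_bias(3, 0), Bias::Left);
}
```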
fn display_point_to_inlay_point(&self, point: DisplayPoint, bias: Bias) -> InlayPoint {
let block_point = point.0;
let wrap_point = self.block_snapshot.to_wrap_point(block_point, bias);


@@ -80,15 +80,16 @@ use code_context_menus::{
use git::blame::GitBlame;
use gpui::{
div, impl_actions, point, prelude::*, pulsating_between, px, relative, size, Action, Animation,
AnimationExt, AnyElement, App, AsyncWindowContext, AvailableSpace, Bounds, ClipboardEntry,
ClipboardItem, Context, DispatchPhase, ElementId, Entity, EntityInputHandler, EventEmitter,
FocusHandle, FocusOutEvent, Focusable, FontId, FontWeight, Global, HighlightStyle, Hsla,
InteractiveText, KeyContext, Modifiers, MouseButton, MouseDownEvent, PaintQuad, ParentElement,
Pixels, Render, SharedString, Size, Styled, StyledText, Subscription, Task, TextStyle,
TextStyleRefinement, UTF16Selection, UnderlineStyle, UniformListScrollHandle, WeakEntity,
WeakFocusHandle, Window,
AnimationExt, AnyElement, App, AsyncWindowContext, AvailableSpace, Bounds, ClickEvent,
ClipboardEntry, ClipboardItem, Context, DispatchPhase, ElementId, Entity, EntityInputHandler,
EventEmitter, FocusHandle, FocusOutEvent, Focusable, FontId, FontWeight, Global,
HighlightStyle, Hsla, InteractiveText, KeyContext, Modifiers, MouseButton, MouseDownEvent,
PaintQuad, ParentElement, Pixels, Render, SharedString, Size, Styled, StyledText, Subscription,
Task, TextStyle, TextStyleRefinement, UTF16Selection, UnderlineStyle, UniformListScrollHandle,
WeakEntity, WeakFocusHandle, Window,
};
use highlight_matching_bracket::refresh_matching_bracket_highlights;
use hover_links::{find_file, HoverLink, HoveredLinkState, InlayHighlight};
use hover_popover::{hide_hover, HoverState};
use indent_guides::ActiveIndentGuidesState;
use inlay_hint_cache::{InlayHintCache, InlaySplice, InvalidationStrategy};
@@ -106,6 +107,10 @@ use language::{
use language::{point_to_lsp, BufferRow, CharClassifier, Runnable, RunnableRange};
use linked_editing_ranges::refresh_linked_ranges;
use mouse_context_menu::MouseContextMenu;
use project::{
debugger::breakpoint_store::{BreakpointEditAction, BreakpointStore, BreakpointStoreEvent},
ProjectPath,
};
pub use proposed_changes_editor::{
ProposedChangeLocation, ProposedChangesEditor, ProposedChangesEditorToolbar,
};
@@ -113,7 +118,6 @@ use similar::{ChangeTag, TextDiff};
use std::iter::Peekable;
use task::{ResolvedTask, TaskTemplate, TaskVariables};
use hover_links::{find_file, HoverLink, HoveredLinkState, InlayHighlight};
pub use lsp::CompletionContext;
use lsp::{
CompletionItemKind, CompletionTriggerKind, DiagnosticSeverity, InsertTextFormat,
@@ -130,7 +134,12 @@ use multi_buffer::{
ExcerptInfo, ExpandExcerptDirection, MultiBufferDiffHunk, MultiBufferPoint, MultiBufferRow,
ToOffsetUtf16,
};
use parking_lot::Mutex;
use project::{
debugger::{
breakpoint_store::{Breakpoint, BreakpointKind},
dap_store::DapStore,
},
lsp_store::{FormatTrigger, LspFormatTarget, OpenLspBufferHandle},
project_settings::{GitGutterSetting, ProjectSettings},
CodeAction, Completion, CompletionIntent, DocumentHighlight, InlayHint, Location, LocationLink,
@@ -146,6 +155,7 @@ use serde::{Deserialize, Serialize};
use settings::{update_settings_file, Settings, SettingsLocation, SettingsStore};
use smallvec::SmallVec;
use snippet::Snippet;
use std::sync::Arc;
use std::{
any::TypeId,
borrow::Cow,
@@ -156,7 +166,6 @@ use std::{
ops::{ControlFlow, Deref, DerefMut, Not as _, Range, RangeInclusive},
path::{Path, PathBuf},
rc::Rc,
sync::Arc,
time::{Duration, Instant},
};
pub use sum_tree::Bias;
@@ -280,6 +289,7 @@ impl InlayId {
}
}
pub enum DebugCurrentRowHighlight {}
enum DocumentHighlightRead {}
enum DocumentHighlightWrite {}
enum InputComposition {}
@@ -589,6 +599,7 @@ struct ResolvedTasks {
templates: SmallVec<[(TaskSourceKind, ResolvedTask); 1]>,
position: Anchor,
}
#[derive(Copy, Clone, Debug)]
struct MultiBufferOffset(usize);
#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)]
@@ -750,6 +761,12 @@ pub struct Editor {
expect_bounds_change: Option<Bounds<Pixels>>,
tasks: BTreeMap<(BufferId, BufferRow), RunnableTasks>,
tasks_update_task: Option<Task<()>>,
pub dap_store: Option<Entity<DapStore>>,
pub breakpoint_store: Option<Entity<BreakpointStore>>,
/// Allows a user to create a breakpoint by selecting this indicator.
/// It should be `None` while the user is not hovering over the gutter;
/// otherwise it represents the point where the breakpoint will be shown.
pub gutter_breakpoint_indicator: Option<DisplayPoint>,
in_project_search: bool,
previous_search_ranges: Option<Arc<[Range<Anchor>]>>,
breadcrumb_header: Option<String>,
@@ -1273,6 +1290,8 @@ impl Editor {
}
}
}
} else if let project::Event::ActiveDebugLineChanged = event {
editor.go_to_active_debug_line(window, cx);
}
},
));
@@ -1314,6 +1333,15 @@ impl Editor {
None
};
let (dap_store, breakpoint_store) = match (mode, project.as_ref()) {
(EditorMode::Full, Some(project)) => {
let dap_store = project.read(cx).dap_store();
let breakpoint_store = project.read(cx).breakpoint_store();
(Some(dap_store), Some(breakpoint_store))
}
_ => (None, None),
};
let mut code_action_providers = Vec::new();
let mut load_uncommitted_diff = None;
if let Some(project) = project.clone() {
@@ -1442,6 +1470,9 @@ impl Editor {
blame: None,
blame_subscription: None,
tasks: Default::default(),
dap_store,
breakpoint_store,
gutter_breakpoint_indicator: None,
_subscriptions: vec![
cx.observe(&buffer, Self::on_buffer_changed),
cx.subscribe_in(&buffer, window, Self::on_buffer_event),
@@ -1490,6 +1521,8 @@ impl Editor {
this.start_git_blame_inline(false, window, cx);
}
this.go_to_active_debug_line(window, cx);
if let Some(buffer) = buffer.read(cx).as_singleton() {
if let Some(project) = this.project.as_ref() {
let handle = project.update(cx, |project, cx| {
@@ -5443,14 +5476,28 @@ impl Editor {
_style: &EditorStyle,
row: DisplayRow,
is_active: bool,
breakpoint: Option<&Breakpoint>,
cx: &mut Context<Self>,
) -> Option<IconButton> {
let color = if breakpoint.is_some() {
Color::Debugger
} else {
Color::Muted
};
let position = breakpoint.as_ref().and_then(|bp| bp.active_position);
let bp_kind = Arc::new(
breakpoint
.map(|bp| bp.kind.clone())
.unwrap_or(BreakpointKind::Standard),
);
if self.available_code_actions.is_some() {
Some(
IconButton::new("code_actions_indicator", ui::IconName::Bolt)
.shape(ui::IconButtonShape::Square)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.icon_color(color)
.toggle_state(is_active)
.tooltip({
let focus_handle = self.focus_handle.clone();
@@ -5475,6 +5522,16 @@ impl Editor {
window,
cx,
);
}))
.on_right_click(cx.listener(move |editor, event: &ClickEvent, window, cx| {
editor.set_breakpoint_context_menu(
row,
position,
bp_kind.clone(),
event.down.position,
window,
cx,
);
})),
)
} else {
@@ -5493,6 +5550,193 @@ impl Editor {
}
}
/// Get all display points of breakpoints that will be rendered within the editor.
///
/// This function is used to handle overlaps between breakpoints and the code action/runner symbol.
/// It's also used to set the color of line numbers with breakpoints to the breakpoint color.
/// TODO debugger: Use this function to color toggle symbols that house nested breakpoints
fn active_breakpoint_points(
&mut self,
window: &mut Window,
cx: &mut Context<Self>,
) -> HashMap<DisplayRow, Breakpoint> {
let mut breakpoint_display_points = HashMap::default();
let Some(dap_store) = self.dap_store.clone() else {
return breakpoint_display_points;
};
let snapshot = self.snapshot(window, cx);
let breakpoints = &dap_store.read(cx).breakpoint_store().read(cx).breakpoints;
if let Some(buffer) = self.buffer.read(cx).as_singleton() {
let buffer = buffer.read(cx);
if let Some(project_path) = buffer.project_path(cx) {
if let Some(breakpoints) = breakpoints.get(&project_path) {
for breakpoint in breakpoints {
let point = breakpoint.point_for_buffer(&buffer);
breakpoint_display_points
.insert(point.to_display_point(&snapshot).row(), breakpoint.clone());
}
};
};
return breakpoint_display_points;
}
let multi_buffer_snapshot = &snapshot.display_snapshot.buffer_snapshot;
let Some(project) = self.project.as_ref() else {
return breakpoint_display_points;
};
for excerpt_boundary in
multi_buffer_snapshot.excerpt_boundaries_in_range(Point::new(0, 0)..)
{
let info = excerpt_boundary.next.as_ref();
if let Some(info) = info {
let Some(excerpt_ranges) = multi_buffer_snapshot.range_for_excerpt(info.id) else {
continue;
};
// To translate a breakpoint's position within a singular buffer to a multi-buffer
// position, we need to know its excerpt's starting location, its position within
// the singular buffer, and whether that position is within the excerpt's range.
let excerpt_head = excerpt_ranges
.start
.to_display_point(&snapshot.display_snapshot);
let buffer_range = info // Buffer lines being shown within the excerpt
.buffer
.summary_for_anchor::<Point>(&info.range.context.start)
..info
.buffer
.summary_for_anchor::<Point>(&info.range.context.end);
let Some(project_path) = project.read_with(cx, |this, cx| {
this.buffer_for_id(info.buffer_id, cx)
.and_then(|buffer| buffer.read_with(cx, |b, cx| b.project_path(cx)))
}) else {
continue;
};
if let Some(breakpoint_set) = breakpoints.get(&project_path) {
for breakpoint in breakpoint_set {
let breakpoint_position =
breakpoint.point_for_buffer_snapshot(&info.buffer);
if buffer_range.contains(&breakpoint_position) {
// Translated breakpoint position from singular buffer to multi buffer
let delta = breakpoint_position.row - buffer_range.start.row;
let position = excerpt_head + DisplayPoint::new(DisplayRow(delta), 0);
breakpoint_display_points.insert(position.row(), breakpoint.clone());
}
}
};
};
}
breakpoint_display_points
}
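The row translation described in the comment inside `active_breakpoint_points` can be sketched on its own. This is a hypothetical simplification (plain `u32` rows instead of `Point`/`DisplayPoint`, and `translate_breakpoint_row` is not a real function in this codebase): a breakpoint row inside a single buffer maps into a multi-buffer excerpt by offsetting from the excerpt's starting row, but only when the breakpoint falls inside the line range the excerpt displays.

```rust
// Hypothetical sketch of the delta computation above:
// position = excerpt_head + (breakpoint_row - buffer_range.start).
use std::ops::Range;

fn translate_breakpoint_row(
    excerpt_head_row: u32,    // row where the excerpt starts in the multi-buffer
    buffer_range: Range<u32>, // buffer rows shown by the excerpt
    breakpoint_row: u32,      // breakpoint row within the single buffer
) -> Option<u32> {
    if buffer_range.contains(&breakpoint_row) {
        let delta = breakpoint_row - buffer_range.start;
        Some(excerpt_head_row + delta)
    } else {
        None // the breakpoint is outside the lines this excerpt shows
    }
}

fn main() {
    assert_eq!(translate_breakpoint_row(10, 5..9, 7), Some(12));
    assert_eq!(translate_breakpoint_row(10, 5..9, 20), None);
}
```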
fn breakpoint_context_menu(
&self,
anchor: text::Anchor,
kind: Arc<BreakpointKind>,
row: DisplayRow,
window: &mut Window,
cx: &mut Context<Self>,
) -> Entity<ui::ContextMenu> {
let weak_editor = cx.weak_entity();
let focus_handle = self.focus_handle(cx);
let second_entry_msg = if kind.log_message().is_some() {
"Edit Log Breakpoint"
} else {
"Add Log Breakpoint"
};
ui::ContextMenu::build(window, cx, |menu, _, _cx| {
menu.on_blur_subscription(Subscription::new(|| {}))
.context(focus_handle)
.entry("Toggle Breakpoint", None, {
let weak_editor = weak_editor.clone();
move |_window, cx| {
weak_editor
.update(cx, |this, cx| {
this.edit_breakpoint_at_anchor(
anchor,
BreakpointKind::Standard,
BreakpointEditAction::Toggle,
cx,
);
})
.log_err();
}
})
.entry(second_entry_msg, None, move |window, cx| {
weak_editor
.update(cx, |this, cx| {
this.add_edit_breakpoint_block(row, anchor, kind.as_ref(), window, cx);
})
.log_err();
})
})
}
fn render_breakpoint(
&self,
position: text::Anchor,
row: DisplayRow,
kind: &BreakpointKind,
cx: &mut Context<Self>,
) -> IconButton {
let color = if self
.gutter_breakpoint_indicator
.is_some_and(|gutter_bp| gutter_bp.row() == row)
{
Color::Hint
} else {
Color::Debugger
};
let icon = match &kind {
BreakpointKind::Standard => ui::IconName::DebugBreakpoint,
BreakpointKind::Log(_) => ui::IconName::DebugLogBreakpoint,
};
let arc_kind = Arc::new(kind.clone());
let arc_kind2 = arc_kind.clone();
IconButton::new(("breakpoint_indicator", row.0 as usize), icon)
.icon_size(IconSize::XSmall)
.size(ui::ButtonSize::None)
.icon_color(color)
.style(ButtonStyle::Transparent)
.on_click(cx.listener(move |editor, _e, window, cx| {
window.focus(&editor.focus_handle(cx));
editor.edit_breakpoint_at_anchor(
position,
arc_kind.as_ref().clone(),
BreakpointEditAction::Toggle,
cx,
);
}))
.on_right_click(cx.listener(move |editor, event: &ClickEvent, window, cx| {
editor.set_breakpoint_context_menu(
row,
Some(position),
arc_kind2.clone(),
event.down.position,
window,
cx,
);
}))
}
fn build_tasks_context(
project: &Entity<Project>,
buffer: &Entity<Buffer>,
@@ -5629,12 +5873,26 @@ impl Editor {
_style: &EditorStyle,
is_active: bool,
row: DisplayRow,
breakpoint: Option<Breakpoint>,
cx: &mut Context<Self>,
) -> IconButton {
let color = if breakpoint.is_some() {
Color::Debugger
} else {
Color::Muted
};
let position = breakpoint.as_ref().and_then(|bp| bp.active_position);
let bp_kind = Arc::new(
breakpoint
.map(|bp| bp.kind)
.unwrap_or(BreakpointKind::Standard),
);
IconButton::new(("run_indicator", row.0 as usize), ui::IconName::Play)
.shape(ui::IconButtonShape::Square)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.icon_color(color)
.toggle_state(is_active)
.on_click(cx.listener(move |editor, _e, window, cx| {
window.focus(&editor.focus_handle(cx));
@@ -5646,6 +5904,16 @@ impl Editor {
cx,
);
}))
.on_right_click(cx.listener(move |editor, event: &ClickEvent, window, cx| {
editor.set_breakpoint_context_menu(
row,
position,
bp_kind.clone(),
event.down.position,
window,
cx,
);
}))
}
pub fn context_menu_visible(&self) -> bool {
@@ -6976,6 +7244,236 @@ impl Editor {
}
}
fn set_breakpoint_context_menu(
&mut self,
row: DisplayRow,
position: Option<text::Anchor>,
kind: Arc<BreakpointKind>,
clicked_point: gpui::Point<Pixels>,
window: &mut Window,
cx: &mut Context<Self>,
) {
let source = self
.buffer
.read(cx)
.snapshot(cx)
.breakpoint_anchor(Point::new(row.0, 0u32));
let context_menu = self.breakpoint_context_menu(
position.unwrap_or(source.text_anchor),
kind,
row,
window,
cx,
);
self.mouse_context_menu = MouseContextMenu::pinned_to_editor(
self,
source,
clicked_point,
context_menu,
window,
cx,
);
}
fn add_edit_breakpoint_block(
&mut self,
row: DisplayRow,
anchor: text::Anchor,
kind: &BreakpointKind,
window: &mut Window,
cx: &mut Context<Self>,
) {
let position = self
.snapshot(window, cx)
.display_point_to_anchor(DisplayPoint::new(row, 0), Bias::Right);
let weak_editor = cx.weak_entity();
let bp_prompt =
cx.new(|cx| BreakpointPromptEditor::new(weak_editor, anchor, kind.clone(), window, cx));
let height = bp_prompt.update(cx, |this, cx| {
this.prompt
.update(cx, |prompt, cx| prompt.max_point(cx).row().0 + 1 + 2)
});
let cloned_prompt = bp_prompt.clone();
let blocks = vec![BlockProperties {
style: BlockStyle::Sticky,
placement: BlockPlacement::Above(position),
height,
render: Arc::new(move |cx| {
*cloned_prompt.read(cx).gutter_dimensions.lock() = *cx.gutter_dimensions;
cloned_prompt.clone().into_any_element()
}),
priority: 0,
}];
let focus_handle = bp_prompt.focus_handle(cx);
window.focus(&focus_handle);
let block_ids = self.insert_blocks(blocks, None, cx);
bp_prompt.update(cx, |prompt, _| {
prompt.add_block_ids(block_ids);
});
}
pub(crate) fn breakpoint_at_cursor_head(
&mut self,
window: &mut Window,
cx: &mut Context<Self>,
) -> Option<(text::Anchor, BreakpointKind)> {
let cursor_position: Point = self.selections.newest(cx).head();
// We set the column position to zero so this function interacts correctly
// between calls by clicking on the gutter & using an action to toggle a
// breakpoint. Otherwise, toggling a breakpoint through an action wouldn't
// untoggle a breakpoint that was added through clicking on the gutter
let breakpoint_position = self
.snapshot(window, cx)
.display_snapshot
.buffer_snapshot
.breakpoint_anchor(Point::new(cursor_position.row, 0))
.text_anchor;
let project = self.project.clone();
let buffer_id = breakpoint_position.buffer_id?;
let buffer = project?.read_with(cx, |project, cx| project.buffer_for_id(buffer_id, cx))?;
let (buffer_snapshot, project_path) = (
buffer.read(cx).snapshot(),
buffer.read(cx).project_path(cx)?,
);
let row = buffer_snapshot
.summary_for_anchor::<Point>(&breakpoint_position)
.row;
let bp = self.dap_store.clone()?.read_with(cx, |dap_store, cx| {
dap_store.breakpoint_store().read(cx).breakpoint_at_row(
row,
&project_path,
buffer_snapshot,
)
})?;
Some((bp.active_position?, bp.kind))
}
pub fn edit_log_breakpoint(
&mut self,
_: &EditLogBreakpoint,
window: &mut Window,
cx: &mut Context<Self>,
) {
let (anchor, kind) = self
.breakpoint_at_cursor_head(window, cx)
.unwrap_or_else(|| {
let cursor_position: Point = self.selections.newest(cx).head();
let breakpoint_position = self
.snapshot(window, cx)
.display_snapshot
.buffer_snapshot
.breakpoint_anchor(Point::new(cursor_position.row, 0))
.text_anchor;
let kind = BreakpointKind::Standard;
(breakpoint_position, kind)
});
if let Some(buffer) = self
.buffer()
.read(cx)
.as_singleton()
.map(|buffer| buffer.read(cx))
{
let row = buffer
.summary_for_anchor::<Point>(&anchor)
.to_display_point(&self.snapshot(window, cx))
.row();
self.add_edit_breakpoint_block(row, anchor, &kind, window, cx);
}
}
pub fn toggle_breakpoint(
&mut self,
_: &ToggleBreakpoint,
window: &mut Window,
cx: &mut Context<Self>,
) {
let edit_action = BreakpointEditAction::Toggle;
if let Some((anchor, kind)) = self.breakpoint_at_cursor_head(window, cx) {
self.edit_breakpoint_at_anchor(anchor, kind, edit_action, cx);
} else {
let cursor_position: Point = self.selections.newest(cx).head();
let breakpoint_position = self
.snapshot(window, cx)
.display_snapshot
.buffer_snapshot
.breakpoint_anchor(Point::new(cursor_position.row, 0))
.text_anchor;
self.edit_breakpoint_at_anchor(
breakpoint_position,
BreakpointKind::Standard,
edit_action,
cx,
);
}
}
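The branch in `toggle_breakpoint` above reduces to one decision, sketched here with hypothetical simplified types (a `u32` row in place of a `text::Anchor`; `breakpoint_to_toggle` is not part of the real API): reuse the existing breakpoint's position and kind when the cursor row already carries one, otherwise fall back to a `Standard` breakpoint anchored at the cursor row.

```rust
// Hypothetical sketch of the toggle decision above.
#[derive(Debug, Clone, PartialEq)]
enum BreakpointKind {
    Standard,
    Log(String),
}

fn breakpoint_to_toggle(
    existing: Option<(u32, BreakpointKind)>, // (row, kind) found at the cursor head
    cursor_row: u32,
) -> (u32, BreakpointKind) {
    // When a breakpoint already exists at the cursor head, toggling it must
    // reuse its exact position and kind so the edit removes it; otherwise a
    // fresh Standard breakpoint is placed at the cursor row.
    existing.unwrap_or((cursor_row, BreakpointKind::Standard))
}

fn main() {
    assert_eq!(
        breakpoint_to_toggle(None, 5),
        (5, BreakpointKind::Standard)
    );
    assert_eq!(
        breakpoint_to_toggle(Some((2, BreakpointKind::Log("x".into()))), 5),
        (2, BreakpointKind::Log("x".into()))
    );
}
```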
pub fn edit_breakpoint_at_anchor(
&mut self,
breakpoint_position: text::Anchor,
kind: BreakpointKind,
edit_action: BreakpointEditAction,
cx: &mut Context<Self>,
) {
let Some(breakpoint_store) = &self.breakpoint_store else {
return;
};
let Some(buffer_id) = breakpoint_position.buffer_id else {
return;
};
let Some(cache_position) = self.buffer.read_with(cx, |buffer, cx| {
buffer.buffer(buffer_id).map(|buffer| {
buffer
.read(cx)
.summary_for_anchor::<Point>(&breakpoint_position)
.row
})
}) else {
return;
};
breakpoint_store.update(cx, |breakpoint_store, cx| {
breakpoint_store.toggle_breakpoint(
buffer_id,
Breakpoint {
cached_position: cache_position,
active_position: Some(breakpoint_position),
kind,
},
edit_action,
cx,
);
});
cx.notify();
}
#[cfg(any(test, feature = "test-support"))]
pub fn breakpoint_store(&self) -> Option<Entity<BreakpointStore>> {
self.breakpoint_store.clone()
}
pub fn prepare_revert_change(
&self,
revert_changes: &mut HashMap<BufferId, Vec<(Range<text::Anchor>, Rope)>>,
@@ -10582,6 +11080,32 @@ impl Editor {
hunk
}
pub fn go_to_line<T: 'static>(
&mut self,
row: u32,
highlight_color: Option<Hsla>,
window: &mut Window,
cx: &mut Context<Self>,
) {
let snapshot = self.snapshot(window, cx).display_snapshot;
let start = snapshot
.buffer_snapshot
.clip_point(Point::new(row, 0), Bias::Left);
let end = start + Point::new(1, 0);
let start = snapshot.buffer_snapshot.anchor_before(start);
let end = snapshot.buffer_snapshot.anchor_before(end);
self.clear_row_highlights::<T>();
self.highlight_rows::<T>(
start..end,
highlight_color
.unwrap_or_else(|| cx.theme().colors().editor_highlighted_line_background),
true,
cx,
);
self.request_autoscroll(Autoscroll::center(), cx);
}
pub fn go_to_definition(
&mut self,
_: &GoToDefinition,
@@ -13090,6 +13614,40 @@ impl Editor {
}
}
pub fn project_path(&self, cx: &mut Context<Self>) -> Option<ProjectPath> {
if let Some(buffer) = self.buffer.read(cx).as_singleton() {
buffer.read_with(cx, |buffer, cx| buffer.project_path(cx))
} else {
None
}
}
pub fn go_to_active_debug_line(&mut self, window: &mut Window, cx: &mut Context<Self>) {
let Some(dap_store) = self.dap_store.as_ref() else {
return;
};
let Some(project_path) = self.project_path(cx) else {
return;
};
if let Some((_, path, position)) = dap_store.read(cx).active_debug_line() {
if path == project_path {
self.go_to_line::<DebugCurrentRowHighlight>(
position,
Some(cx.theme().colors().editor_debugger_active_line_background),
window,
cx,
);
return;
}
}
self.clear_row_highlights::<DebugCurrentRowHighlight>();
cx.notify();
}
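The decision in `go_to_active_debug_line` above can be sketched standalone. This is a hypothetical simplification (string paths instead of `ProjectPath`, and `debug_highlight_row` is illustrative only): highlight the reported row only when the debugger's path matches the path of the buffer shown in this editor; any other case clears the highlight.

```rust
// Hypothetical sketch of the highlight decision above.
fn debug_highlight_row(
    active_debug_line: Option<(&str, u32)>, // (project path, row) from the dap store
    editor_path: Option<&str>,              // project path of this editor's buffer
) -> Option<u32> {
    match (active_debug_line, editor_path) {
        (Some((path, row)), Some(editor_path)) if path == editor_path => Some(row),
        _ => None, // different file, multi-buffer, or no active debug session
    }
}

fn main() {
    assert_eq!(
        debug_highlight_row(Some(("src/main.rs", 3)), Some("src/main.rs")),
        Some(3)
    );
    assert_eq!(
        debug_highlight_row(Some(("src/main.rs", 3)), Some("src/lib.rs")),
        None
    );
    assert_eq!(debug_highlight_row(None, Some("src/main.rs")), None);
}
```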
pub fn copy_file_name_without_extension(
&mut self,
_: &CopyFileNameWithoutExtension,
@@ -14152,9 +14710,23 @@ impl Editor {
}
multi_buffer::Event::DirtyChanged => cx.emit(EditorEvent::DirtyChanged),
multi_buffer::Event::Saved => cx.emit(EditorEvent::Saved),
multi_buffer::Event::FileHandleChanged | multi_buffer::Event::Reloaded => {
cx.emit(EditorEvent::TitleChanged)
multi_buffer::Event::FileHandleChanged => {
cx.emit(EditorEvent::TitleChanged);
if let Some(dap_store) = &self.dap_store {
if let Some(project_path) = self.project_path(cx) {
dap_store.update(cx, |dap_store, cx| {
dap_store.breakpoint_store().update(cx, |_, cx| {
cx.emit(BreakpointStoreEvent::BreakpointsChanged {
project_path,
source_changed: true,
});
});
});
}
}
}
multi_buffer::Event::Reloaded => cx.emit(EditorEvent::TitleChanged),
// multi_buffer::Event::DiffBaseChanged => {
// self.scrollbar_marker_state.dirty = true;
// cx.emit(EditorEvent::DiffBaseChanged);
@@ -16883,6 +17455,158 @@ impl Global for KillRing {}
const UPDATE_DEBOUNCE: Duration = Duration::from_millis(50);
struct BreakpointPromptEditor {
pub(crate) prompt: Entity<Editor>,
editor: WeakEntity<Editor>,
breakpoint_anchor: text::Anchor,
kind: BreakpointKind,
block_ids: HashSet<CustomBlockId>,
gutter_dimensions: Arc<Mutex<GutterDimensions>>,
_subscriptions: Vec<Subscription>,
}
impl BreakpointPromptEditor {
const MAX_LINES: u8 = 4;
fn new(
editor: WeakEntity<Editor>,
breakpoint_anchor: text::Anchor,
kind: BreakpointKind,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let buffer = cx.new(|cx| {
Buffer::local(
kind.log_message()
.map(|msg| msg.to_string())
.unwrap_or_default(),
cx,
)
});
let buffer = cx.new(|cx| MultiBuffer::singleton(buffer, cx));
let prompt = cx.new(|cx| {
let mut prompt = Editor::new(
EditorMode::AutoHeight {
max_lines: Self::MAX_LINES as usize,
},
buffer,
None,
false,
window,
cx,
);
prompt.set_soft_wrap_mode(language::language_settings::SoftWrap::EditorWidth, cx);
prompt.set_show_cursor_when_unfocused(false, cx);
prompt.set_placeholder_text(
"Message to log when breakpoint is hit. Expressions within {} are interpolated.",
cx,
);
prompt
});
Self {
prompt,
editor,
breakpoint_anchor,
kind,
gutter_dimensions: Arc::new(Mutex::new(GutterDimensions::default())),
block_ids: Default::default(),
_subscriptions: vec![],
}
}
pub(crate) fn add_block_ids(&mut self, block_ids: Vec<CustomBlockId>) {
self.block_ids.extend(block_ids)
}
fn confirm(&mut self, _: &menu::Confirm, window: &mut Window, cx: &mut Context<Self>) {
if let Some(editor) = self.editor.upgrade() {
let log_message = self
.prompt
.read(cx)
.buffer
.read(cx)
.as_singleton()
.expect("A multi buffer in breakpoint prompt isn't possible")
.read(cx)
.as_rope()
.to_string();
editor.update(cx, |editor, cx| {
editor.edit_breakpoint_at_anchor(
self.breakpoint_anchor,
self.kind.clone(),
BreakpointEditAction::EditLogMessage(log_message.into()),
cx,
);
editor.remove_blocks(self.block_ids.clone(), None, cx);
cx.focus_self(window);
});
}
}
fn cancel(&mut self, _: &menu::Cancel, window: &mut Window, cx: &mut Context<Self>) {
self.editor
.update(cx, |editor, cx| {
editor.remove_blocks(self.block_ids.clone(), None, cx);
window.focus(&editor.focus_handle);
})
.log_err();
}
fn render_prompt_editor(&self, cx: &mut Context<Self>) -> impl IntoElement {
let settings = ThemeSettings::get_global(cx);
let text_style = TextStyle {
color: if self.prompt.read(cx).read_only(cx) {
cx.theme().colors().text_disabled
} else {
cx.theme().colors().text
},
font_family: settings.buffer_font.family.clone(),
font_fallbacks: settings.buffer_font.fallbacks.clone(),
font_size: settings.buffer_font_size.into(),
font_weight: settings.buffer_font.weight,
line_height: relative(settings.buffer_line_height.value()),
..Default::default()
};
EditorElement::new(
&self.prompt,
EditorStyle {
background: cx.theme().colors().editor_background,
local_player: cx.theme().players().local(),
text: text_style,
..Default::default()
},
)
}
}
impl Render for BreakpointPromptEditor {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let gutter_dimensions = *self.gutter_dimensions.lock();
h_flex()
.key_context("Editor")
.bg(cx.theme().colors().editor_background)
.border_y_1()
.border_color(cx.theme().status().info_border)
.size_full()
.py(window.line_height() / 2.5)
.on_action(cx.listener(Self::confirm))
.on_action(cx.listener(Self::cancel))
.child(h_flex().w(gutter_dimensions.full_width() + (gutter_dimensions.margin / 2.0)))
.child(div().flex_1().child(self.render_prompt_editor(cx)))
}
}
impl Focusable for BreakpointPromptEditor {
fn focus_handle(&self, cx: &App) -> FocusHandle {
self.prompt.focus_handle(cx)
}
}
fn all_edits_insertions_or_deletions(
edits: &Vec<(Range<Anchor>, String)>,
snapshot: &MultiBufferSnapshot,


@@ -29,6 +29,7 @@ use parking_lot::Mutex;
use pretty_assertions::{assert_eq, assert_ne};
use project::FakeFs;
use project::{
debugger::breakpoint_store::BreakpointKind,
lsp_command::SIGNATURE_HELP_HIGHLIGHT_CURRENT,
project_settings::{LspSettings, ProjectSettings},
};
@@ -11070,6 +11071,7 @@ async fn test_move_to_enclosing_bracket(cx: &mut gpui::TestAppContext) {
cx.update_editor(|editor, window, cx| {
editor.move_to_enclosing_bracket(&MoveToEnclosingBracket, window, cx)
});
cx.run_until_parked();
cx.assert_editor_state(after);
};
@@ -15693,6 +15695,348 @@ async fn assert_highlighted_edits(
});
}
#[track_caller]
fn assert_breakpoint(
breakpoints: &BTreeMap<ProjectPath, collections::HashSet<Breakpoint>>,
project_path: &ProjectPath,
expected: Vec<(u32, BreakpointKind)>,
) {
if expected.is_empty() {
assert!(!breakpoints.contains_key(project_path));
} else {
let mut breakpoint = breakpoints
.get(project_path)
.unwrap()
.into_iter()
.map(|breakpoint| (breakpoint.cached_position, breakpoint.kind.clone()))
.collect::<Vec<_>>();
breakpoint.sort_by_key(|(cached_position, _)| *cached_position);
assert_eq!(expected, breakpoint);
}
}
fn add_log_breakpoint_at_cursor(
editor: &mut Editor,
log_message: &str,
window: &mut Window,
cx: &mut Context<Editor>,
) {
let (anchor, kind) = editor
.breakpoint_at_cursor_head(window, cx)
.unwrap_or_else(|| {
let cursor_position: Point = editor.selections.newest(cx).head();
let breakpoint_position = editor
.snapshot(window, cx)
.display_snapshot
.buffer_snapshot
.breakpoint_anchor(Point::new(cursor_position.row, 0))
.text_anchor;
let kind = BreakpointKind::Standard;
(breakpoint_position, kind)
});
editor.edit_breakpoint_at_anchor(
anchor,
kind,
BreakpointEditAction::EditLogMessage(log_message.into()),
cx,
);
}
#[gpui::test]
async fn test_breakpoint_toggling(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let sample_text = "First line\nSecond line\nThird line\nFourth line".to_string();
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
"/a",
json!({
"main.rs": sample_text,
}),
)
.await;
let project = Project::test(fs, ["/a".as_ref()], cx).await;
let workspace = cx.add_window(|window, cx| Workspace::test_new(project.clone(), window, cx));
let cx = &mut VisualTestContext::from_window(*workspace.deref(), cx);
let worktree_id = workspace
.update(cx, |workspace, _window, cx| {
workspace.project().update(cx, |project, cx| {
project.worktrees(cx).next().unwrap().read(cx).id()
})
})
.unwrap();
let buffer = project
.update(cx, |project, cx| {
project.open_buffer((worktree_id, "main.rs"), cx)
})
.await
.unwrap();
let (editor, cx) = cx.add_window_view(|window, cx| {
Editor::new(
EditorMode::Full,
MultiBuffer::build_from_buffer(buffer, cx),
Some(project),
true,
window,
cx,
)
});
let project_path = editor.update(cx, |editor, cx| editor.project_path(cx).unwrap());
// assert we can add breakpoint on the first line
editor.update_in(cx, |editor, window, cx| {
editor.toggle_breakpoint(&actions::ToggleBreakpoint, window, cx);
editor.move_to_end(&MoveToEnd, window, cx);
editor.toggle_breakpoint(&actions::ToggleBreakpoint, window, cx);
});
let breakpoints = editor.update(cx, |editor, cx| {
editor
.project
.as_ref()
.unwrap()
.read(cx)
.breakpoint_store()
.read(cx)
.breakpoints()
.clone()
});
assert_eq!(1, breakpoints.len());
assert_breakpoint(
&breakpoints,
&project_path,
vec![(0, BreakpointKind::Standard), (3, BreakpointKind::Standard)],
);
editor.update_in(cx, |editor, window, cx| {
editor.move_to_beginning(&MoveToBeginning, window, cx);
editor.toggle_breakpoint(&actions::ToggleBreakpoint, window, cx);
});
let breakpoints = editor.update(cx, |editor, cx| {
editor
.project
.as_ref()
.unwrap()
.read(cx)
.breakpoint_store()
.read(cx)
.breakpoints()
.clone()
});
assert_eq!(1, breakpoints.len());
assert_breakpoint(
&breakpoints,
&project_path,
vec![(3, BreakpointKind::Standard)],
);
editor.update_in(cx, |editor, window, cx| {
editor.move_to_end(&MoveToEnd, window, cx);
editor.toggle_breakpoint(&actions::ToggleBreakpoint, window, cx);
});
let breakpoints = editor.update(cx, |editor, cx| {
editor
.project
.as_ref()
.unwrap()
.read(cx)
.breakpoint_store()
.read(cx)
.breakpoints()
.clone()
});
assert_eq!(0, breakpoints.len());
assert_breakpoint(&breakpoints, &project_path, vec![]);
}
#[gpui::test]
async fn test_log_breakpoint_editing(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let sample_text = "First line\nSecond line\nThird line\nFourth line".to_string();
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
"/a",
json!({
"main.rs": sample_text,
}),
)
.await;
let project = Project::test(fs, ["/a".as_ref()], cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let worktree_id = workspace.update(cx, |workspace, cx| {
workspace.project().update(cx, |project, cx| {
project.worktrees(cx).next().unwrap().read(cx).id()
})
});
let buffer = project
.update(cx, |project, cx| {
project.open_buffer((worktree_id, "main.rs"), cx)
})
.await
.unwrap();
let (editor, cx) = cx.add_window_view(|window, cx| {
Editor::new(
EditorMode::Full,
MultiBuffer::build_from_buffer(buffer, cx),
Some(project),
true,
window,
cx,
)
});
let project_path = editor.update(cx, |editor, cx| editor.project_path(cx).unwrap());
editor.update_in(cx, |editor, window, cx| {
add_log_breakpoint_at_cursor(editor, "hello world", window, cx);
});
let breakpoints = editor.update(cx, |editor, cx| {
editor
.project
.as_ref()
.unwrap()
.read(cx)
.breakpoint_store()
.read(cx)
.breakpoints()
.clone()
});
assert_breakpoint(
&breakpoints,
&project_path,
vec![(0, BreakpointKind::Log("hello world".into()))],
);
// Removing a log message from a log breakpoint should remove it
editor.update_in(cx, |editor, window, cx| {
add_log_breakpoint_at_cursor(editor, "", window, cx);
});
let breakpoints = editor.update(cx, |editor, cx| {
editor
.project
.as_ref()
.unwrap()
.read(cx)
.breakpoint_store()
.read(cx)
.breakpoints()
.clone()
});
assert_breakpoint(&breakpoints, &project_path, vec![]);
editor.update_in(cx, |editor, window, cx| {
editor.toggle_breakpoint(&actions::ToggleBreakpoint, window, cx);
editor.move_to_end(&MoveToEnd, window, cx);
editor.toggle_breakpoint(&actions::ToggleBreakpoint, window, cx);
// Not adding a log message to a standard breakpoint shouldn't remove it
add_log_breakpoint_at_cursor(editor, "", window, cx);
});
let breakpoints = editor.update(cx, |editor, cx| {
editor
.project
.as_ref()
.unwrap()
.read(cx)
.breakpoint_store()
.read(cx)
.breakpoints()
.clone()
});
assert_breakpoint(
&breakpoints,
&project_path,
vec![(0, BreakpointKind::Standard), (3, BreakpointKind::Standard)],
);
editor.update_in(cx, |editor, window, cx| {
add_log_breakpoint_at_cursor(editor, "hello world", window, cx);
});
let breakpoints = editor.update(cx, |editor, cx| {
editor
.project
.as_ref()
.unwrap()
.read(cx)
.breakpoint_store()
.read(cx)
.breakpoints()
.clone()
});
assert_breakpoint(
&breakpoints,
&project_path,
vec![
(0, BreakpointKind::Standard),
(3, BreakpointKind::Log("hello world".into())),
],
);
editor.update_in(cx, |editor, window, cx| {
add_log_breakpoint_at_cursor(editor, "hello Earth!!", window, cx);
});
let breakpoints = editor.update(cx, |editor, cx| {
editor
.project
.as_ref()
.unwrap()
.read(cx)
.breakpoint_store()
.read(cx)
.breakpoints()
.clone()
});
assert_breakpoint(
&breakpoints,
&project_path,
vec![
(0, BreakpointKind::Standard),
(3, BreakpointKind::Log("hello Earth!!".into())),
],
);
}
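The editing rules this test exercises (an empty log message deletes a log breakpoint but leaves a standard one alone; a non-empty message creates or rewrites a log breakpoint) can be distilled into a small standalone sketch. This is a hypothetical condensation for illustration, not Zed's actual implementation:

```rust
#[derive(Debug, Clone, PartialEq)]
enum BreakpointKind {
    Standard,
    Log(String),
}

// Hypothetical distillation of the log-breakpoint editing rules the test
// above checks; names mirror the test but the function is illustrative.
fn edit_log_message(current: Option<BreakpointKind>, msg: &str) -> Option<BreakpointKind> {
    match (current, msg.is_empty()) {
        // Emptying the message of a log breakpoint removes it entirely.
        (Some(BreakpointKind::Log(_)), true) => None,
        // An empty message leaves a standard breakpoint untouched.
        (Some(BreakpointKind::Standard), true) => Some(BreakpointKind::Standard),
        // Any non-empty message creates or overwrites a log breakpoint.
        (_, false) => Some(BreakpointKind::Log(msg.to_string())),
        // No breakpoint plus an empty message: nothing happens.
        (None, true) => None,
    }
}

fn main() {
    assert_eq!(edit_log_message(Some(BreakpointKind::Log("hi".into())), ""), None);
    assert_eq!(
        edit_log_message(Some(BreakpointKind::Standard), ""),
        Some(BreakpointKind::Standard)
    );
    assert_eq!(
        edit_log_message(Some(BreakpointKind::Standard), "hello world"),
        Some(BreakpointKind::Log("hello world".into()))
    );
    println!("ok");
}
```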
#[gpui::test]
async fn test_rename_with_duplicate_edits(cx: &mut gpui::TestAppContext) {
init_test(cx, |_| {});

View File

@@ -53,7 +53,10 @@ use multi_buffer::{
Anchor, ExcerptId, ExcerptInfo, ExpandExcerptDirection, MultiBufferPoint, MultiBufferRow,
RowInfo, ToOffset,
};
use project::project_settings::{GitGutterSetting, ProjectSettings};
use project::{
debugger::breakpoint_store::{Breakpoint, BreakpointKind},
project_settings::{GitGutterSetting, ProjectSettings},
};
use settings::Settings;
use smallvec::{smallvec, SmallVec};
use std::{
@@ -503,6 +506,8 @@ impl EditorElement {
register_action(editor, window, Editor::insert_uuid_v4);
register_action(editor, window, Editor::insert_uuid_v7);
register_action(editor, window, Editor::open_selections_in_multibuffer);
register_action(editor, window, Editor::toggle_breakpoint);
register_action(editor, window, Editor::edit_log_breakpoint);
}
fn register_key_listeners(&self, window: &mut Window, _: &mut App, layout: &EditorLayout) {
@@ -849,6 +854,18 @@ impl EditorElement {
let gutter_hovered = gutter_hitbox.is_hovered(window);
editor.set_gutter_hovered(gutter_hovered, cx);
if gutter_hovered {
editor.gutter_breakpoint_indicator = Some(
position_map
.point_for_position(event.position)
.previous_valid,
);
} else {
editor.gutter_breakpoint_indicator = None;
}
cx.notify();
// Don't trigger hover popover if mouse is hovering over context menu
if text_hitbox.is_hovered(window) {
let point_for_position = position_map.point_for_position(event.position);
@@ -1946,6 +1963,66 @@ impl EditorElement {
(offset_y, length)
}
#[allow(clippy::too_many_arguments)]
fn layout_breakpoints(
&self,
line_height: Pixels,
range: Range<DisplayRow>,
scroll_pixel_position: gpui::Point<Pixels>,
gutter_dimensions: &GutterDimensions,
gutter_hitbox: &Hitbox,
rows_with_hunk_bounds: &HashMap<DisplayRow, Bounds<Pixels>>,
snapshot: &EditorSnapshot,
breakpoints: HashMap<DisplayRow, Breakpoint>,
window: &mut Window,
cx: &mut App,
) -> Vec<AnyElement> {
self.editor.update(cx, |editor, cx| {
if editor.dap_store.is_none() {
return Vec::new();
};
breakpoints
.iter()
.filter_map(|(point, bp)| {
let row = MultiBufferRow(point.0);
if range.start > *point || range.end < *point {
return None;
}
if snapshot.is_line_folded(row) {
return None;
}
let backup_position = snapshot
.display_point_to_breakpoint_anchor(DisplayPoint::new(*point, 0))
.text_anchor;
let button = editor.render_breakpoint(
bp.active_position.unwrap_or(backup_position),
*point,
&bp.kind,
cx,
);
let button = prepaint_gutter_button(
button,
*point,
line_height,
gutter_dimensions,
scroll_pixel_position,
gutter_hitbox,
rows_with_hunk_bounds,
window,
cx,
);
Some(button)
})
.collect_vec()
})
}
#[allow(clippy::too_many_arguments)]
fn layout_run_indicators(
&self,
@@ -1956,6 +2033,7 @@ impl EditorElement {
gutter_hitbox: &Hitbox,
rows_with_hunk_bounds: &HashMap<DisplayRow, Bounds<Pixels>>,
snapshot: &EditorSnapshot,
breakpoints: &mut HashMap<DisplayRow, Breakpoint>,
window: &mut Window,
cx: &mut App,
) -> Vec<AnyElement> {
@@ -2015,11 +2093,13 @@ impl EditorElement {
return None;
}
}
let display_row = multibuffer_point.to_display_point(snapshot).row();
let button = editor.render_run_indicator(
&self.style,
Some(display_row) == active_task_indicator_row,
display_row,
breakpoints.remove(&display_row),
cx,
);
@@ -2049,6 +2129,7 @@ impl EditorElement {
gutter_dimensions: &GutterDimensions,
gutter_hitbox: &Hitbox,
rows_with_hunk_bounds: &HashMap<DisplayRow, Bounds<Pixels>>,
breakpoint_points: &mut HashMap<DisplayRow, Breakpoint>,
window: &mut Window,
cx: &mut App,
) -> Option<AnyElement> {
@@ -2063,11 +2144,16 @@ impl EditorElement {
{
active = deployed_from_indicator.map_or(true, |indicator_row| indicator_row == row);
};
button = editor.render_code_actions_indicator(&self.style, row, active, cx);
let breakpoint = breakpoint_points.get(&row);
button = editor.render_code_actions_indicator(&self.style, row, active, breakpoint, cx);
});
let button = button?;
breakpoint_points.remove(&row);
let button = prepaint_gutter_button(
button?,
button,
row,
line_height,
gutter_dimensions,
@@ -2148,8 +2234,10 @@ impl EditorElement {
scroll_position: gpui::Point<f32>,
rows: Range<DisplayRow>,
buffer_rows: &[RowInfo],
active_rows: &BTreeMap<DisplayRow, bool>,
newest_selection_head: Option<DisplayPoint>,
snapshot: &EditorSnapshot,
breakpoint_rows: &HashMap<DisplayRow, Breakpoint>,
window: &mut Window,
cx: &mut App,
) -> Arc<HashMap<MultiBufferRow, LineNumberLayout>> {
@@ -2200,7 +2288,13 @@ impl EditorElement {
return None;
}
let color = cx.theme().colors().editor_line_number;
let color = if breakpoint_rows.contains_key(&display_row) {
cx.theme().colors().debugger_accent
} else if active_rows.contains_key(&display_row) {
cx.theme().colors().editor_active_line_number
} else {
cx.theme().colors().editor_line_number
};
let shaped_line = self
.shape_line_number(SharedString::from(&line_number), color, window)
.log_err()?;
@@ -2231,7 +2325,6 @@ impl EditorElement {
let line_number = LineNumberLayout {
shaped_line,
hitbox,
display_row,
};
Some((multi_buffer_row, line_number))
})
@@ -4593,32 +4686,31 @@ impl EditorElement {
for LineNumberLayout {
shaped_line,
hitbox,
display_row,
} in layout.line_numbers.values()
{
let Some(hitbox) = hitbox else {
continue;
};
let is_active = layout.active_rows.contains_key(&display_row);
let Some(()) = (if !is_singleton && hitbox.is_hovered(window) {
let color = cx.theme().colors().editor_hover_line_number;
let color = if is_active {
cx.theme().colors().editor_active_line_number
} else if !is_singleton && hitbox.is_hovered(window) {
cx.theme().colors().editor_hover_line_number
let Some(line) = self
.shape_line_number(shaped_line.text.clone(), color, window)
.log_err()
else {
continue;
};
line.paint(hitbox.origin, line_height, window, cx).log_err()
} else {
cx.theme().colors().editor_line_number
shaped_line
.paint(hitbox.origin, line_height, window, cx)
.log_err()
}) else {
continue;
};
let Some(line) = self
.shape_line_number(shaped_line.text.clone(), color, window)
.log_err()
else {
continue;
};
let Some(()) = line.paint(hitbox.origin, line_height, window, cx).log_err() else {
continue;
};
// In singleton buffers, clicking a line number selects the corresponding line, so we use an I-beam-like (`|`) cursor.
// In multi buffers, clicking a line number opens the file at that line, so we use a pointing-hand cursor.
if is_singleton {
@@ -4649,7 +4741,7 @@ impl EditorElement {
&layout.position_map.snapshot,
line_height,
layout.gutter_hitbox.bounds,
hunk,
&hunk,
);
Some((
hunk_bounds,
@@ -4811,6 +4903,9 @@ impl EditorElement {
}
});
for breakpoint in layout.breakpoints.iter_mut() {
breakpoint.paint(window, cx);
}
for test_indicator in layout.test_indicators.iter_mut() {
test_indicator.paint(window, cx);
}
@@ -5974,6 +6069,7 @@ fn prepaint_gutter_button(
cx: &mut App,
) -> AnyElement {
let mut button = button.into_any_element();
let available_space = size(
AvailableSpace::MinContent,
AvailableSpace::Definite(line_height),
@@ -6843,6 +6939,10 @@ impl Element for EditorElement {
window.set_view_id(self.editor.entity_id());
window.set_focus_handle(&focus_handle, cx);
let mut breakpoint_rows = self
.editor
.update(cx, |editor, cx| editor.active_breakpoint_points(window, cx));
let rem_size = self.rem_size(cx);
window.with_rem_size(rem_size, |window| {
window.with_text_style(Some(text_style), |window| {
@@ -7088,12 +7188,33 @@ impl Element for EditorElement {
scroll_position,
start_row..end_row,
&row_infos,
&active_rows,
newest_selection_head,
&snapshot,
&breakpoint_rows,
window,
cx,
);
// Add the gutter breakpoint indicator to breakpoint_rows only after painting
// line numbers, so that hovering over a line without a breakpoint doesn't
// tint its line number with the debugger accent color
let gutter_breakpoint_indicator =
self.editor.read(cx).gutter_breakpoint_indicator;
if let Some(gutter_breakpoint_point) = gutter_breakpoint_indicator {
breakpoint_rows
.entry(gutter_breakpoint_point.row())
.or_insert(Breakpoint {
active_position: Some(
snapshot
.display_point_to_breakpoint_anchor(gutter_breakpoint_point)
.text_anchor,
),
cached_position: 0,
kind: BreakpointKind::Standard,
});
}
let mut crease_toggles =
window.with_element_namespace("crease_toggles", |window| {
self.layout_crease_toggles(
@@ -7498,6 +7619,7 @@ impl Element for EditorElement {
&gutter_dimensions,
&gutter_hitbox,
&rows_with_hunk_bounds,
&mut breakpoint_rows,
window,
cx,
);
@@ -7527,6 +7649,7 @@ impl Element for EditorElement {
&gutter_hitbox,
&rows_with_hunk_bounds,
&snapshot,
&mut breakpoint_rows,
window,
cx,
)
@@ -7534,6 +7657,19 @@ impl Element for EditorElement {
Vec::new()
};
let breakpoints = self.layout_breakpoints(
line_height,
start_row..end_row,
scroll_pixel_position,
&gutter_dimensions,
&gutter_hitbox,
&rows_with_hunk_bounds,
&snapshot,
breakpoint_rows,
window,
cx,
);
self.layout_signature_help(
&hitbox,
content_origin,
@@ -7695,6 +7831,7 @@ impl Element for EditorElement {
diff_hunk_controls: hunk_controls,
mouse_context_menu,
test_indicators,
breakpoints,
code_actions_indicator,
crease_toggles,
crease_trailers,
@@ -7873,6 +8010,7 @@ pub struct EditorLayout {
selections: Vec<(PlayerColor, Vec<SelectionLayout>)>,
code_actions_indicator: Option<AnyElement>,
test_indicators: Vec<AnyElement>,
breakpoints: Vec<AnyElement>,
crease_toggles: Vec<Option<AnyElement>>,
diff_hunk_controls: Vec<AnyElement>,
crease_trailers: Vec<Option<CreaseTrailerLayout>>,
@@ -7892,7 +8030,6 @@ impl EditorLayout {
struct LineNumberLayout {
shaped_line: ShapedLine,
hitbox: Option<Hitbox>,
display_row: DisplayRow,
}
struct ColoredRange<T> {
@@ -8543,8 +8680,10 @@ mod tests {
..Default::default()
})
.collect::<Vec<_>>(),
&BTreeMap::default(),
Some(DisplayPoint::new(DisplayRow(0), 0)),
&snapshot,
&HashMap::default(),
window,
cx,
)

View File

@@ -4,7 +4,7 @@ use std::sync::Arc;
use anyhow::Result;
use fs::Fs;
use gpui::{App, Global, ReadGlobal, SharedString, Task};
use language::{LanguageMatcher, LanguageName, LanguageServerBinaryStatus, LoadedLanguage};
use language::{BinaryStatus, LanguageMatcher, LanguageName, LoadedLanguage};
use lsp::LanguageServerName;
use parking_lot::RwLock;
@@ -274,7 +274,7 @@ pub trait ExtensionLanguageServerProxy: Send + Sync + 'static {
fn update_language_server_status(
&self,
language_server_id: LanguageServerName,
status: LanguageServerBinaryStatus,
status: BinaryStatus,
);
}
@@ -307,7 +307,7 @@ impl ExtensionLanguageServerProxy for ExtensionHostProxy {
fn update_language_server_status(
&self,
language_server_id: LanguageServerName,
status: LanguageServerBinaryStatus,
status: BinaryStatus,
) {
let Some(proxy) = self.language_server_proxy.read().clone() else {
return;

View File

@@ -8,9 +8,9 @@ use collections::BTreeMap;
use extension::ExtensionHostProxy;
use fs::{FakeFs, Fs, RealFs};
use futures::{io::BufReader, AsyncReadExt, StreamExt};
use gpui::{AppContext as _, SemanticVersion, TestAppContext};
use gpui::{AppContext as _, SemanticVersion, SharedString, TestAppContext};
use http_client::{FakeHttpClient, Response};
use language::{LanguageMatcher, LanguageRegistry, LanguageServerBinaryStatus};
use language::{BinaryStatus, LanguageMatcher, LanguageRegistry};
use lsp::LanguageServerName;
use node_runtime::NodeRuntime;
use parking_lot::Mutex;
@@ -660,18 +660,9 @@ async fn test_extension_store_with_test_extension(cx: &mut TestAppContext) {
status_updates.next().await.unwrap(),
],
[
(
LanguageServerName("gleam".into()),
LanguageServerBinaryStatus::CheckingForUpdate
),
(
LanguageServerName("gleam".into()),
LanguageServerBinaryStatus::Downloading
),
(
LanguageServerName("gleam".into()),
LanguageServerBinaryStatus::None
)
(SharedString::new("gleam"), BinaryStatus::CheckingForUpdate),
(SharedString::new("gleam"), BinaryStatus::Downloading),
(SharedString::new("gleam"), BinaryStatus::None)
]
);

View File

@@ -4,7 +4,7 @@ use crate::wasm_host::WasmState;
use anyhow::Result;
use async_trait::async_trait;
use extension::{ExtensionLanguageServerProxy, WorktreeDelegate};
use language::LanguageServerBinaryStatus;
use language::BinaryStatus;
use semantic_version::SemanticVersion;
use std::sync::{Arc, OnceLock};
use wasmtime::component::{Linker, Resource};
@@ -135,17 +135,11 @@ impl ExtensionImports for WasmState {
status: LanguageServerInstallationStatus,
) -> wasmtime::Result<()> {
let status = match status {
LanguageServerInstallationStatus::CheckingForUpdate => {
LanguageServerBinaryStatus::CheckingForUpdate
}
LanguageServerInstallationStatus::Downloading => {
LanguageServerBinaryStatus::Downloading
}
LanguageServerInstallationStatus::CheckingForUpdate => BinaryStatus::CheckingForUpdate,
LanguageServerInstallationStatus::Downloading => BinaryStatus::Downloading,
LanguageServerInstallationStatus::Cached
| LanguageServerInstallationStatus::Downloaded => LanguageServerBinaryStatus::None,
LanguageServerInstallationStatus::Failed(error) => {
LanguageServerBinaryStatus::Failed { error }
}
| LanguageServerInstallationStatus::Downloaded => BinaryStatus::None,
LanguageServerInstallationStatus::Failed(error) => BinaryStatus::Failed { error },
};
self.host

View File

@@ -9,7 +9,7 @@ use extension::{ExtensionLanguageServerProxy, KeyValueStoreDelegate, WorktreeDel
use futures::{io::BufReader, FutureExt as _};
use futures::{lock::Mutex, AsyncReadExt};
use language::LanguageName;
use language::{language_settings::AllLanguageSettings, LanguageServerBinaryStatus};
use language::{language_settings::AllLanguageSettings, BinaryStatus};
use project::project_settings::ProjectSettings;
use semantic_version::SemanticVersion;
use std::{
@@ -482,16 +482,10 @@ impl ExtensionImports for WasmState {
status: LanguageServerInstallationStatus,
) -> wasmtime::Result<()> {
let status = match status {
LanguageServerInstallationStatus::CheckingForUpdate => {
LanguageServerBinaryStatus::CheckingForUpdate
}
LanguageServerInstallationStatus::Downloading => {
LanguageServerBinaryStatus::Downloading
}
LanguageServerInstallationStatus::None => LanguageServerBinaryStatus::None,
LanguageServerInstallationStatus::Failed(error) => {
LanguageServerBinaryStatus::Failed { error }
}
LanguageServerInstallationStatus::CheckingForUpdate => BinaryStatus::CheckingForUpdate,
LanguageServerInstallationStatus::Downloading => BinaryStatus::Downloading,
LanguageServerInstallationStatus::None => BinaryStatus::None,
LanguageServerInstallationStatus::Failed(error) => BinaryStatus::Failed { error },
};
self.host

View File

@@ -13,7 +13,7 @@ use extension::{
};
use futures::{io::BufReader, FutureExt as _};
use futures::{lock::Mutex, AsyncReadExt};
use language::{language_settings::AllLanguageSettings, LanguageName, LanguageServerBinaryStatus};
use language::{language_settings::AllLanguageSettings, BinaryStatus, LanguageName};
use project::project_settings::ProjectSettings;
use semantic_version::SemanticVersion;
use std::{
@@ -671,16 +671,10 @@ impl ExtensionImports for WasmState {
status: LanguageServerInstallationStatus,
) -> wasmtime::Result<()> {
let status = match status {
LanguageServerInstallationStatus::CheckingForUpdate => {
LanguageServerBinaryStatus::CheckingForUpdate
}
LanguageServerInstallationStatus::Downloading => {
LanguageServerBinaryStatus::Downloading
}
LanguageServerInstallationStatus::None => LanguageServerBinaryStatus::None,
LanguageServerInstallationStatus::Failed(error) => {
LanguageServerBinaryStatus::Failed { error }
}
LanguageServerInstallationStatus::CheckingForUpdate => BinaryStatus::CheckingForUpdate,
LanguageServerInstallationStatus::Downloading => BinaryStatus::Downloading,
LanguageServerInstallationStatus::None => BinaryStatus::None,
LanguageServerInstallationStatus::Failed(error) => BinaryStatus::Failed { error },
};
self.host

View File

@@ -72,8 +72,8 @@ pub use buffer::Operation;
pub use buffer::*;
pub use diagnostic_set::{DiagnosticEntry, DiagnosticGroup};
pub use language_registry::{
AvailableLanguage, LanguageNotFound, LanguageQueries, LanguageRegistry,
LanguageServerBinaryStatus, QUERY_FILENAME_PREFIXES,
AvailableLanguage, BinaryStatus, LanguageNotFound, LanguageQueries, LanguageRegistry,
QUERY_FILENAME_PREFIXES,
};
pub use lsp::{LanguageServerId, LanguageServerName};
pub use outline::*;
@@ -303,7 +303,7 @@ pub trait LspAdapterDelegate: Send + Sync {
fn worktree_id(&self) -> WorktreeId;
fn worktree_root_path(&self) -> &Path;
fn exists(&self, path: &Path, is_dir: Option<bool>) -> bool;
fn update_status(&self, language: LanguageServerName, status: LanguageServerBinaryStatus);
fn update_status(&self, language: LanguageServerName, status: BinaryStatus);
async fn language_server_download_dir(&self, name: &LanguageServerName) -> Option<Arc<Path>>;
async fn npm_package_installed_version(
@@ -381,7 +381,7 @@ pub trait LspAdapter: 'static + Send + Sync {
} else {
delegate.update_status(
self.name(),
LanguageServerBinaryStatus::Failed {
BinaryStatus::Failed {
error: format!("{error:?}"),
},
);
@@ -568,7 +568,7 @@ async fn try_fetch_server_binary<L: LspAdapter + 'static + Send + Sync + ?Sized>
let name = adapter.name();
log::info!("fetching latest version of language server {:?}", name.0);
delegate.update_status(name.clone(), LanguageServerBinaryStatus::CheckingForUpdate);
delegate.update_status(name.clone(), BinaryStatus::CheckingForUpdate);
let latest_version = adapter
.fetch_latest_server_version(delegate.as_ref())
@@ -579,16 +579,16 @@ async fn try_fetch_server_binary<L: LspAdapter + 'static + Send + Sync + ?Sized>
.await
{
log::info!("language server {:?} is already installed", name.0);
delegate.update_status(name.clone(), LanguageServerBinaryStatus::None);
delegate.update_status(name.clone(), BinaryStatus::None);
Ok(binary)
} else {
log::info!("downloading language server {:?}", name.0);
delegate.update_status(adapter.name(), LanguageServerBinaryStatus::Downloading);
delegate.update_status(adapter.name(), BinaryStatus::Downloading);
let binary = adapter
.fetch_server_binary(latest_version, container_dir, delegate.as_ref())
.await;
delegate.update_status(name.clone(), LanguageServerBinaryStatus::None);
delegate.update_status(name.clone(), BinaryStatus::None);
binary
}
}

View File

@@ -98,7 +98,8 @@ pub struct LanguageRegistry {
state: RwLock<LanguageRegistryState>,
language_server_download_dir: Option<Arc<Path>>,
executor: BackgroundExecutor,
lsp_binary_status_tx: LspBinaryStatusSender,
lsp_binary_status_tx: BinaryStatusSender,
dap_binary_status_tx: BinaryStatusSender,
}
struct LanguageRegistryState {
@@ -130,7 +131,7 @@ pub struct FakeLanguageServerEntry {
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum LanguageServerBinaryStatus {
pub enum BinaryStatus {
None,
CheckingForUpdate,
Downloading,
@@ -213,8 +214,8 @@ pub struct LanguageQueries {
}
#[derive(Clone, Default)]
struct LspBinaryStatusSender {
txs: Arc<Mutex<Vec<mpsc::UnboundedSender<(LanguageServerName, LanguageServerBinaryStatus)>>>>,
struct BinaryStatusSender {
txs: Arc<Mutex<Vec<mpsc::UnboundedSender<(SharedString, BinaryStatus)>>>>,
}
pub struct LoadedLanguage {
@@ -247,6 +248,7 @@ impl LanguageRegistry {
}),
language_server_download_dir: None,
lsp_binary_status_tx: Default::default(),
dap_binary_status_tx: Default::default(),
executor,
};
this.add(PLAIN_TEXT.clone());
@@ -914,12 +916,12 @@ impl LanguageRegistry {
self.state.read().all_lsp_adapters.get(name).cloned()
}
pub fn update_lsp_status(
&self,
server_name: LanguageServerName,
status: LanguageServerBinaryStatus,
) {
self.lsp_binary_status_tx.send(server_name, status);
pub fn update_lsp_status(&self, server_name: LanguageServerName, status: BinaryStatus) {
self.lsp_binary_status_tx.send(server_name.0, status);
}
pub fn update_dap_status(&self, server_name: LanguageServerName, status: BinaryStatus) {
self.dap_binary_status_tx.send(server_name.0, status);
}
pub fn next_language_server_id(&self) -> LanguageServerId {
@@ -973,10 +975,16 @@ impl LanguageRegistry {
pub fn language_server_binary_statuses(
&self,
) -> mpsc::UnboundedReceiver<(LanguageServerName, LanguageServerBinaryStatus)> {
) -> mpsc::UnboundedReceiver<(SharedString, BinaryStatus)> {
self.lsp_binary_status_tx.subscribe()
}
pub fn dap_server_binary_statuses(
&self,
) -> mpsc::UnboundedReceiver<(SharedString, BinaryStatus)> {
self.dap_binary_status_tx.subscribe()
}
pub async fn delete_server_container(&self, name: LanguageServerName) {
log::info!("deleting server container");
let Some(dir) = self.language_server_download_dir(&name) else {
@@ -1087,16 +1095,14 @@ impl LanguageRegistryState {
}
}
impl LspBinaryStatusSender {
fn subscribe(
&self,
) -> mpsc::UnboundedReceiver<(LanguageServerName, LanguageServerBinaryStatus)> {
impl BinaryStatusSender {
fn subscribe(&self) -> mpsc::UnboundedReceiver<(SharedString, BinaryStatus)> {
let (tx, rx) = mpsc::unbounded();
self.txs.lock().push(tx);
rx
}
fn send(&self, name: LanguageServerName, status: LanguageServerBinaryStatus) {
fn send(&self, name: SharedString, status: BinaryStatus) {
let mut txs = self.txs.lock();
txs.retain(|tx| tx.unbounded_send((name.clone(), status.clone())).is_ok());
}
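The `BinaryStatusSender` above fans one status update out to every live subscriber and prunes dead ones via `retain`. A rough standalone analogue of that pattern, using std channels instead of `futures::mpsc` (types and names here are illustrative, not Zed's):

```rust
use std::sync::mpsc;
use std::sync::Mutex;

// Each subscriber gets its own channel; `send` clones the message to every
// live subscriber and drops senders whose receiver has hung up.
struct StatusSender {
    txs: Mutex<Vec<mpsc::Sender<String>>>,
}

impl StatusSender {
    fn subscribe(&self) -> mpsc::Receiver<String> {
        let (tx, rx) = mpsc::channel();
        self.txs.lock().unwrap().push(tx);
        rx
    }

    fn send(&self, msg: &str) {
        let mut txs = self.txs.lock().unwrap();
        // std's Sender::send errors once the receiver is dropped, so retain
        // prunes dead subscribers as a side effect of broadcasting.
        txs.retain(|tx| tx.send(msg.to_string()).is_ok());
    }
}

fn main() {
    let sender = StatusSender { txs: Mutex::new(Vec::new()) };
    let live = sender.subscribe();
    let dead = sender.subscribe();
    drop(dead);
    sender.send("Downloading");
    assert_eq!(live.recv().unwrap(), "Downloading");
    // The dropped subscriber was pruned during send.
    assert_eq!(sender.txs.lock().unwrap().len(), 1);
    println!("ok");
}
```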

View File

@@ -12,8 +12,8 @@ use fs::Fs;
use futures::{Future, FutureExt};
use gpui::AsyncApp;
use language::{
CodeLabel, HighlightId, Language, LanguageName, LanguageServerBinaryStatus,
LanguageToolchainStore, LspAdapter, LspAdapterDelegate,
BinaryStatus, CodeLabel, HighlightId, Language, LanguageName, LanguageToolchainStore,
LspAdapter, LspAdapterDelegate,
};
use lsp::{CodeActionKind, LanguageServerBinary, LanguageServerBinaryOptions, LanguageServerName};
use serde::Serialize;
@@ -80,7 +80,7 @@ impl ExtensionLanguageServerProxy for LanguageServerRegistryProxy {
fn update_language_server_status(
&self,
language_server_id: LanguageServerName,
status: LanguageServerBinaryStatus,
status: BinaryStatus,
) {
self.language_registry
.update_lsp_status(language_server_id, status);

View File

@@ -85,6 +85,7 @@ impl JsonLspAdapter {
cx,
);
let tasks_schema = task::TaskTemplates::generate_json_schema();
let debug_schema = task::DebugTaskFile::generate_json_schema();
let snippets_schema = snippet_provider::format::VSSnippetsFile::generate_json_schema();
let tsconfig_schema = serde_json::Value::from_str(TSCONFIG_SCHEMA).unwrap();
let package_json_schema = serde_json::Value::from_str(PACKAGE_JSON_SCHEMA).unwrap();
@@ -136,7 +137,15 @@ impl JsonLspAdapter {
)
],
"schema": snippets_schema,
}
},
{
"fileMatch": [
schema_file_match(paths::debug_tasks_file()),
paths::local_debug_file_relative_path()
],
"schema": debug_schema,
},
]
}
})

View File

@@ -4822,6 +4822,16 @@ impl MultiBufferSnapshot {
self.anchor_at(position, Bias::Right)
}
pub fn breakpoint_anchor<T: ToOffset>(&self, position: T) -> Anchor {
let bias = if position.to_offset(self) == 0usize {
Bias::Right
} else {
Bias::Left
};
self.anchor_at(position, bias)
}
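The bias selection in `breakpoint_anchor` can be sketched in isolation. This is a minimal stand-in, not Zed's actual anchor machinery, and the rationale is an assumption: an anchor at offset 0 has nothing to its left to attach to, so it is right-biased, while all other breakpoint anchors are left-biased so they stay glued to the text they were set on:

```rust
// Shape-only mirror of the text crate's Bias enum, for illustration.
#[derive(Debug, PartialEq)]
enum Bias {
    Left,
    Right,
}

// Stand-in for the bias choice inside breakpoint_anchor above.
fn breakpoint_bias(offset: usize) -> Bias {
    if offset == 0 {
        Bias::Right
    } else {
        Bias::Left
    }
}

fn main() {
    assert_eq!(breakpoint_bias(0), Bias::Right);
    assert_eq!(breakpoint_bias(17), Bias::Left);
    println!("ok");
}
```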
pub fn anchor_at<T: ToOffset>(&self, position: T, mut bias: Bias) -> Anchor {
let offset = position.to_offset(self);

View File

@@ -169,6 +169,12 @@ pub fn tasks_file() -> &'static PathBuf {
TASKS_FILE.get_or_init(|| config_dir().join("tasks.json"))
}
/// Returns the path to the `debug.json` file.
pub fn debug_tasks_file() -> &'static PathBuf {
static DEBUG_TASKS_FILE: OnceLock<PathBuf> = OnceLock::new();
DEBUG_TASKS_FILE.get_or_init(|| config_dir().join("debug.json"))
}
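The `OnceLock` idiom used by `debug_tasks_file` (compute once, hand out a `'static` reference forever after) can be exercised standalone. The `config_dir` stand-in below is illustrative; the real one resolves a per-platform config path:

```rust
use std::path::PathBuf;
use std::sync::OnceLock;

// Stand-in for Zed's config_dir(); hypothetical path for illustration.
fn config_dir() -> PathBuf {
    PathBuf::from("/tmp/zed-config")
}

/// Lazily computed on first call, then shared for the process lifetime.
fn debug_tasks_file() -> &'static PathBuf {
    static DEBUG_TASKS_FILE: OnceLock<PathBuf> = OnceLock::new();
    DEBUG_TASKS_FILE.get_or_init(|| config_dir().join("debug.json"))
}

fn main() {
    let first = debug_tasks_file();
    let second = debug_tasks_file();
    // Both calls return the same 'static allocation.
    assert!(std::ptr::eq(first, second));
    assert!(first.ends_with("debug.json"));
    println!("{}", first.display());
}
```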
/// Returns the path to the extensions directory.
///
/// This is where installed extensions are stored.
@@ -284,6 +290,14 @@ pub fn languages_dir() -> &'static PathBuf {
LANGUAGES_DIR.get_or_init(|| support_dir().join("languages"))
}
/// Returns the path to the debug adapters directory.
///
/// This is where debug adapters are downloaded for DAPs that are built into Zed.
pub fn debug_adapters_dir() -> &'static PathBuf {
static DEBUG_ADAPTERS_DIR: OnceLock<PathBuf> = OnceLock::new();
DEBUG_ADAPTERS_DIR.get_or_init(|| support_dir().join("debug_adapters"))
}
/// Returns the path to the Copilot directory.
pub fn copilot_dir() -> &'static PathBuf {
static COPILOT_DIR: OnceLock<PathBuf> = OnceLock::new();
@@ -328,5 +342,15 @@ pub fn local_vscode_tasks_file_relative_path() -> &'static Path {
Path::new(".vscode/tasks.json")
}
/// Returns the relative path to a `launch.json` file within a project.
pub fn local_debug_file_relative_path() -> &'static Path {
Path::new(".zed/debug.json")
}
/// Returns the relative path to a `.vscode/launch.json` file within a project.
pub fn local_vscode_launch_file_relative_path() -> &'static Path {
Path::new(".vscode/launch.json")
}
/// A default editorconfig file name to use when resolving project settings.
pub const EDITORCONFIG_NAME: &str = ".editorconfig";

View File

@@ -22,16 +22,20 @@ test-support = [
"prettier/test-support",
"worktree/test-support",
"gpui/test-support",
"dap/test-support",
"dap_adapters/test-support",
]
[dependencies]
aho-corasick.workspace = true
anyhow.workspace = true
async-trait.workspace = true
buffer_diff.workspace = true
client.workspace = true
clock.workspace = true
collections.workspace = true
buffer_diff.workspace = true
dap.workspace = true
dap_adapters.workspace = true
fs.workspace = true
futures.workspace = true
fuzzy.workspace = true
@@ -40,6 +44,7 @@ globset.workspace = true
gpui.workspace = true
http_client.workspace = true
itertools.workspace = true
indexmap.workspace = true
language.workspace = true
log.workspace = true
lsp.workspace = true
@@ -80,17 +85,19 @@ fancy-regex.workspace = true
client = { workspace = true, features = ["test-support"] }
collections = { workspace = true, features = ["test-support"] }
buffer_diff = { workspace = true, features = ["test-support"] }
dap = { workspace = true, features = ["test-support"] }
dap_adapters = { workspace = true, features = ["test-support"] }
env_logger.workspace = true
fs = { workspace = true, features = ["test-support"] }
git2.workspace = true
gpui = { workspace = true, features = ["test-support"] }
language = { workspace = true, features = ["test-support"] }
release_channel.workspace = true
lsp = { workspace = true, features = ["test-support"] }
prettier = { workspace = true, features = ["test-support"] }
pretty_assertions.workspace = true
worktree = { workspace = true, features = ["test-support"] }
release_channel.workspace = true
rpc = { workspace = true, features = ["test-support"] }
settings = { workspace = true, features = ["test-support"] }
unindent.workspace = true
util = { workspace = true, features = ["test-support"] }
worktree = { workspace = true, features = ["test-support"] }

View File

@@ -330,6 +330,10 @@ enum OpenBuffer {
pub enum BufferStoreEvent {
BufferAdded(Entity<Buffer>),
BufferOpened {
buffer: Entity<Buffer>,
project_path: ProjectPath,
},
BufferDropped(BufferId),
BufferChangedFilePath {
buffer: Entity<Buffer>,
@@ -1284,6 +1288,13 @@ impl BufferStore {
}
}
fn as_local(&self) -> Option<&LocalBufferStore> {
match &self.state {
BufferStoreState::Local(state) => Some(state),
_ => None,
}
}
fn as_local_mut(&mut self) -> Option<&mut LocalBufferStore> {
match &mut self.state {
BufferStoreState::Local(state) => Some(state),
@@ -1311,6 +1322,11 @@ impl BufferStore {
cx: &mut Context<Self>,
) -> Task<Result<Entity<Buffer>>> {
if let Some(buffer) = self.get_by_path(&project_path, cx) {
cx.emit(BufferStoreEvent::BufferOpened {
buffer: buffer.clone(),
project_path,
});
return Task::ready(Ok(buffer));
}
@@ -1334,12 +1350,18 @@ impl BufferStore {
.insert(
cx.spawn(move |this, mut cx| async move {
let load_result = load_buffer.await;
this.update(&mut cx, |this, _cx| {
this.update(&mut cx, |this, cx| {
// Record the fact that the buffer is no longer loading.
this.loading_buffers.remove(&project_path);
})
.ok();
load_result.map_err(Arc::new)
let buffer = load_result.map_err(Arc::new)?;
cx.emit(BufferStoreEvent::BufferOpened {
buffer: buffer.clone(),
project_path,
});
Ok(buffer)
})?
})
.shared(),
)
@@ -1778,6 +1800,11 @@ impl BufferStore {
})
}
pub fn buffer_id_for_project_path(&self, project_path: &ProjectPath) -> Option<&BufferId> {
self.as_local()
.and_then(|state| state.local_buffer_ids_by_path.get(project_path))
}
pub fn get_by_path(&self, path: &ProjectPath, cx: &App) -> Option<Entity<Buffer>> {
self.buffers().find_map(|buffer| {
let file = File::from_dyn(buffer.read(cx).file())?;

View File

@@ -0,0 +1,17 @@
//! Zed's debugger data layer is implemented in terms of 3 concepts:
//! - DAP store - that knows about all of the available debug sessions.
//! - Debug sessions - that bear the responsibility of communicating with debug adapters and managing the state of each individual session.
//! For the most part it is agnostic over the communication layer (it'll use RPC for peers and actual DAP requests for the host).
//! - Breakpoint store - that knows about all breakpoints set for a project.
//!
//! There are a few reasons for this divide:
//! - Breakpoints persist across debug sessions and they're not really specific to any particular session. Sure, we have to send protocol messages for them
//! (so they're a "thing" in the protocol), but we also want to be able to set them before any session starts up.
//! - Debug clients do the heavy lifting, and this is where the UI grabs all of its data from. They also rely on the breakpoint store during initialization to obtain
//! the current set of breakpoints.
//! - Since the DAP store knows about all of the available debug sessions, it is responsible for routing RPC requests to sessions. It also knows how to find the adapter for a particular kind of session.
pub mod breakpoint_store;
pub mod dap_command;
pub mod dap_store;
pub mod session;
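The three-layer split the doc comment describes can be caricatured as plain data ownership. These are hypothetical shapes for illustration; the real types live behind gpui entities and RPC plumbing:

```rust
use std::collections::HashMap;

// Breakpoints outlive sessions: keyed by path, independent of any session.
struct BreakpointStore {
    breakpoints: HashMap<String, Vec<u32>>,
}

// One session talks to one debug adapter (or to a peer over RPC).
struct Session {
    id: u64,
}

// The DAP store knows every session, so it can route RPC to the right one.
struct DapStore {
    sessions: HashMap<u64, Session>,
}

impl DapStore {
    fn route(&self, session_id: u64) -> Option<&Session> {
        self.sessions.get(&session_id)
    }
}

fn main() {
    let store = BreakpointStore {
        breakpoints: HashMap::from([("src/main.rs".to_string(), vec![3, 7])]),
    };
    let dap = DapStore {
        sessions: HashMap::from([(1, Session { id: 1 })]),
    };
    // A new session reads the already-set breakpoints during initialization.
    assert_eq!(store.breakpoints["src/main.rs"], vec![3, 7]);
    assert_eq!(dap.route(1).map(|s| s.id), Some(1));
    println!("ok");
}
```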

View File

@@ -0,0 +1,751 @@
use crate::{
buffer_store::{BufferStore, BufferStoreEvent},
BufferId, ProjectItem as _, ProjectPath, WorktreeStore,
};
use anyhow::{Context as _, Result};
use collections::{BTreeMap, HashMap, HashSet};
use dap::{debugger_settings::DebuggerSettings, SourceBreakpoint};
use gpui::{App, AsyncApp, Context, Entity, EventEmitter};
use language::{
proto::{deserialize_anchor, serialize_anchor as serialize_text_anchor},
Buffer, BufferSnapshot,
};
use rpc::{proto, AnyProtoClient, TypedEnvelope};
use settings::Settings;
use settings::WorktreeId;
use std::{
hash::{Hash, Hasher},
path::Path,
sync::Arc,
};
use text::Point;
use util::{maybe, ResultExt as _};
struct RemoteBreakpointStore {
upstream_client: Option<AnyProtoClient>,
upstream_project_id: u64,
}
enum BreakpointMode {
Local,
Remote(RemoteBreakpointStore),
}
pub struct BreakpointStore {
pub breakpoints: BTreeMap<ProjectPath, HashSet<Breakpoint>>,
buffer_store: Entity<BufferStore>,
worktree_store: Entity<WorktreeStore>,
downstream_client: Option<(AnyProtoClient, u64)>,
mode: BreakpointMode,
}
pub enum BreakpointStoreEvent {
BreakpointsChanged {
project_path: ProjectPath,
source_changed: bool,
},
}
impl EventEmitter<BreakpointStoreEvent> for BreakpointStore {}
impl BreakpointStore {
pub fn init(client: &AnyProtoClient) {
client.add_entity_message_handler(Self::handle_synchronize_breakpoints);
}
pub fn local(
buffer_store: Entity<BufferStore>,
worktree_store: Entity<WorktreeStore>,
cx: &mut Context<Self>,
) -> Self {
cx.subscribe(&buffer_store, Self::handle_buffer_event)
.detach();
BreakpointStore {
breakpoints: BTreeMap::new(),
buffer_store,
worktree_store,
mode: BreakpointMode::Local,
downstream_client: None,
}
}
pub(crate) fn remote(
upstream_project_id: u64,
upstream_client: AnyProtoClient,
buffer_store: Entity<BufferStore>,
worktree_store: Entity<WorktreeStore>,
cx: &mut Context<Self>,
) -> Self {
cx.subscribe(&buffer_store, Self::handle_buffer_event)
.detach();
BreakpointStore {
breakpoints: BTreeMap::new(),
buffer_store,
worktree_store,
mode: BreakpointMode::Remote(RemoteBreakpointStore {
upstream_client: Some(upstream_client),
upstream_project_id,
}),
downstream_client: None,
}
}
pub fn shared(&mut self, project_id: u64, downstream_client: AnyProtoClient) {
self.downstream_client = Some((downstream_client.clone(), project_id));
for (project_path, breakpoints) in self.breakpoints.iter() {
downstream_client
.send(proto::SynchronizeBreakpoints {
project_id,
project_path: Some(project_path.to_proto()),
breakpoints: breakpoints
.iter()
.filter_map(|breakpoint| breakpoint.to_proto())
.collect(),
})
.log_err();
}
}
pub fn unshared(&mut self, cx: &mut Context<Self>) {
self.downstream_client.take();
cx.notify();
}
pub fn upstream_client(&self) -> Option<(AnyProtoClient, u64)> {
match &self.mode {
BreakpointMode::Remote(RemoteBreakpointStore {
upstream_client: Some(upstream_client),
upstream_project_id,
..
}) => Some((upstream_client.clone(), *upstream_project_id)),
BreakpointMode::Remote(RemoteBreakpointStore {
upstream_client: None,
..
}) => None,
BreakpointMode::Local => None,
}
}
pub fn set_breakpoints_from_proto(
&mut self,
breakpoints: Vec<proto::SynchronizeBreakpoints>,
cx: &mut Context<Self>,
) {
let mut new_breakpoints = BTreeMap::new();
for project_breakpoints in breakpoints {
let Some(project_path) = project_breakpoints.project_path else {
continue;
};
new_breakpoints.insert(
ProjectPath::from_proto(project_path),
project_breakpoints
.breakpoints
.into_iter()
.filter_map(Breakpoint::from_proto)
.collect::<HashSet<_>>(),
);
}
std::mem::swap(&mut self.breakpoints, &mut new_breakpoints);
cx.notify();
}
pub fn toggle_breakpoint(
&mut self,
buffer_id: BufferId,
mut breakpoint: Breakpoint,
edit_action: BreakpointEditAction,
cx: &mut Context<Self>,
) {
let Some(project_path) = self
.buffer_store
.read(cx)
.get(buffer_id)
.and_then(|buffer| buffer.read(cx).project_path(cx))
else {
return;
};
let upstream_client = self.upstream_client();
let breakpoint_set = self.breakpoints.entry(project_path.clone()).or_default();
match edit_action {
BreakpointEditAction::Toggle => {
if !breakpoint_set.remove(&breakpoint) {
breakpoint_set.insert(breakpoint);
}
}
BreakpointEditAction::EditLogMessage(log_message) => {
if !log_message.is_empty() {
breakpoint.kind = BreakpointKind::Log(log_message.clone());
breakpoint_set.remove(&breakpoint);
breakpoint_set.insert(breakpoint);
} else if matches!(&breakpoint.kind, BreakpointKind::Log(_)) {
breakpoint_set.remove(&breakpoint);
}
}
}
if let Some((client, project_id)) = upstream_client.or(self.downstream_client.clone()) {
client
.send(client::proto::SynchronizeBreakpoints {
project_id,
project_path: Some(project_path.to_proto()),
breakpoints: breakpoint_set
.iter()
.filter_map(|breakpoint| breakpoint.to_proto())
.collect(),
})
.log_err();
}
if breakpoint_set.is_empty() {
self.breakpoints.remove(&project_path);
}
cx.emit(BreakpointStoreEvent::BreakpointsChanged {
project_path: project_path.clone(),
source_changed: false,
});
cx.notify();
}
fn handle_buffer_event(
&mut self,
_buffer_store: Entity<BufferStore>,
event: &BufferStoreEvent,
cx: &mut Context<Self>,
) {
match event {
BufferStoreEvent::BufferOpened {
buffer,
project_path,
} => self.on_open_buffer(&project_path, &buffer, cx),
_ => {}
}
}
pub fn on_open_buffer(
&mut self,
project_path: &ProjectPath,
buffer: &Entity<Buffer>,
cx: &mut Context<Self>,
) {
let entry = self.breakpoints.remove(project_path).unwrap_or_default();
let mut set_bp: HashSet<Breakpoint> = HashSet::default();
let buffer = buffer.read(cx);
for mut bp in entry.into_iter() {
bp.set_active_position(&buffer);
set_bp.insert(bp);
}
self.breakpoints.insert(project_path.clone(), set_bp);
cx.emit(BreakpointStoreEvent::BreakpointsChanged {
project_path: project_path.clone(),
source_changed: true,
});
cx.notify();
}
pub fn on_file_rename(
&mut self,
old_project_path: ProjectPath,
new_project_path: ProjectPath,
cx: &mut Context<Self>,
) {
if let Some(breakpoints) = self.breakpoints.remove(&old_project_path) {
self.breakpoints
.insert(new_project_path.clone(), breakpoints);
cx.emit(BreakpointStoreEvent::BreakpointsChanged {
project_path: new_project_path,
source_changed: false,
});
cx.notify();
}
}
pub fn sync_open_breakpoints_to_closed_breakpoints(
&mut self,
buffer: &Entity<Buffer>,
cx: &mut Context<Self>,
) {
let Some(project_path) = buffer.read(cx).project_path(cx) else {
return;
};
if let Some(breakpoint_set) = self.breakpoints.remove(&project_path) {
let breakpoint_iter = breakpoint_set.into_iter().map(|mut breakpoint| {
breakpoint.cached_position = breakpoint.point_for_buffer(buffer.read(cx)).row;
breakpoint.active_position = None;
breakpoint
});
self.breakpoints.insert(
project_path.clone(),
breakpoint_iter.collect::<HashSet<_>>(),
);
cx.emit(BreakpointStoreEvent::BreakpointsChanged {
project_path,
source_changed: false,
});
cx.notify();
}
}
pub fn breakpoint_at_row(
&self,
row: u32,
project_path: &ProjectPath,
buffer_snapshot: BufferSnapshot,
) -> Option<Breakpoint> {
let breakpoint_set = self.breakpoints.get(project_path)?;
breakpoint_set
.iter()
.find(|breakpoint| breakpoint.point_for_buffer_snapshot(&buffer_snapshot).row == row)
.cloned()
}
pub fn toggle_breakpoint_for_buffer(
&mut self,
project_path: &ProjectPath,
mut breakpoint: Breakpoint,
edit_action: BreakpointEditAction,
cx: &mut Context<Self>,
) {
let upstream_client = self.upstream_client();
let breakpoint_set = self.breakpoints.entry(project_path.clone()).or_default();
match edit_action {
BreakpointEditAction::Toggle => {
if !breakpoint_set.remove(&breakpoint) {
breakpoint_set.insert(breakpoint);
}
}
BreakpointEditAction::EditLogMessage(log_message) => {
if !log_message.is_empty() {
breakpoint.kind = BreakpointKind::Log(log_message.clone());
breakpoint_set.remove(&breakpoint);
breakpoint_set.insert(breakpoint);
} else if matches!(&breakpoint.kind, BreakpointKind::Log(_)) {
breakpoint_set.remove(&breakpoint);
}
}
}
if let Some((client, project_id)) = upstream_client.or(self.downstream_client.clone()) {
client
.send(client::proto::SynchronizeBreakpoints {
project_id,
project_path: Some(project_path.to_proto()),
breakpoints: breakpoint_set
.iter()
.filter_map(|breakpoint| breakpoint.to_proto())
.collect(),
})
.log_err();
}
if breakpoint_set.is_empty() {
self.breakpoints.remove(project_path);
}
cx.emit(BreakpointStoreEvent::BreakpointsChanged {
project_path: project_path.clone(),
source_changed: false,
});
cx.notify();
}
pub fn deserialize_breakpoints(
&mut self,
worktree_id: WorktreeId,
serialize_breakpoints: Vec<SerializedBreakpoint>,
) {
for serialize_breakpoint in serialize_breakpoints {
self.breakpoints
.entry(ProjectPath {
worktree_id,
path: serialize_breakpoint.path.clone(),
})
.or_default()
.insert(Breakpoint {
active_position: None,
cached_position: serialize_breakpoint.position,
kind: serialize_breakpoint.kind,
});
}
}
async fn handle_synchronize_breakpoints(
this: Entity<Self>,
envelope: TypedEnvelope<proto::SynchronizeBreakpoints>,
mut cx: AsyncApp,
) -> Result<()> {
let project_path = ProjectPath::from_proto(
envelope
.payload
.project_path
.context("Invalid Breakpoint call")?,
);
this.update(&mut cx, |store, cx| {
let breakpoints = envelope
.payload
.breakpoints
.into_iter()
.filter_map(Breakpoint::from_proto)
.collect::<HashSet<_>>();
if breakpoints.is_empty() {
store.breakpoints.remove(&project_path);
} else {
store.breakpoints.insert(project_path.clone(), breakpoints);
}
cx.emit(BreakpointStoreEvent::BreakpointsChanged {
project_path,
source_changed: false,
});
cx.notify();
})
}
fn serialize_breakpoints_for_project_path(
&self,
project_path: &ProjectPath,
cx: &App,
) -> Option<(Arc<Path>, Vec<SerializedBreakpoint>)> {
let buffer = maybe!({
let buffer_id = self
.buffer_store
.read(cx)
.buffer_id_for_project_path(project_path)?;
Some(self.buffer_store.read(cx).get(*buffer_id)?.read(cx))
});
let worktree_path = self
.worktree_store
.read(cx)
.worktree_for_id(project_path.worktree_id, cx)?
.read(cx)
.abs_path();
Some((
worktree_path,
self.breakpoints
.get(project_path)?
.iter()
.map(|bp| bp.to_serialized(buffer, project_path.path.clone()))
.collect(),
))
}
pub fn serialize_breakpoints(&self, cx: &App) -> HashMap<Arc<Path>, Vec<SerializedBreakpoint>> {
let mut result: HashMap<Arc<Path>, Vec<SerializedBreakpoint>> = Default::default();
if !DebuggerSettings::get_global(cx).save_breakpoints {
return result;
}
for project_path in self.breakpoints.keys() {
if let Some((worktree_path, mut serialized_breakpoint)) =
self.serialize_breakpoints_for_project_path(project_path, cx)
{
result
.entry(worktree_path.clone())
.or_default()
.append(&mut serialized_breakpoint)
}
}
result
}
pub fn all_breakpoints(
&self,
as_abs_path: bool,
cx: &App,
) -> HashMap<Arc<Path>, Vec<SerializedBreakpoint>> {
let mut all_breakpoints: HashMap<Arc<Path>, Vec<SerializedBreakpoint>> = Default::default();
for (project_path, breakpoints) in &self.breakpoints {
let buffer = maybe!({
let buffer_store = self.buffer_store.read(cx);
let buffer_id = buffer_store.buffer_id_for_project_path(project_path)?;
let buffer = buffer_store.get(*buffer_id)?;
Some(buffer.read(cx))
});
let Some(path) = maybe!({
if as_abs_path {
let worktree = self
.worktree_store
.read(cx)
.worktree_for_id(project_path.worktree_id, cx)?;
Some(Arc::from(
worktree
.read(cx)
.absolutize(&project_path.path)
.ok()?
.as_path(),
))
} else {
Some(project_path.clone().path)
}
}) else {
continue;
};
all_breakpoints.entry(path).or_default().extend(
breakpoints
.into_iter()
.map(|bp| bp.to_serialized(buffer, project_path.clone().path)),
);
}
all_breakpoints
}
#[cfg(any(test, feature = "test-support"))]
pub fn breakpoints(&self) -> &BTreeMap<ProjectPath, HashSet<Breakpoint>> {
&self.breakpoints
}
}
type LogMessage = Arc<str>;
#[derive(Clone, Debug)]
pub enum BreakpointEditAction {
Toggle,
EditLogMessage(LogMessage),
}
#[derive(Clone, Debug)]
pub enum BreakpointKind {
Standard,
Log(LogMessage),
}
impl BreakpointKind {
pub fn to_int(&self) -> i32 {
match self {
BreakpointKind::Standard => 0,
BreakpointKind::Log(_) => 1,
}
}
pub fn log_message(&self) -> Option<LogMessage> {
match self {
BreakpointKind::Standard => None,
BreakpointKind::Log(message) => Some(message.clone()),
}
}
}
impl PartialEq for BreakpointKind {
fn eq(&self, other: &Self) -> bool {
std::mem::discriminant(self) == std::mem::discriminant(other)
}
}
impl Eq for BreakpointKind {}
impl Hash for BreakpointKind {
fn hash<H: Hasher>(&self, state: &mut H) {
std::mem::discriminant(self).hash(state);
}
}
#[derive(Clone, Debug)]
pub struct Breakpoint {
pub active_position: Option<text::Anchor>,
pub cached_position: u32,
pub kind: BreakpointKind,
}
// Custom implementations of PartialEq, Eq, and Hash are provided so that
// toggling a breakpoint is based solely on the breakpoint's location.
// Otherwise, a user could end up with overlapping breakpoints without
// being aware of it.
impl PartialEq for Breakpoint {
fn eq(&self, other: &Self) -> bool {
match (&self.active_position, &other.active_position) {
(None, None) => self.cached_position == other.cached_position,
(None, Some(_)) => false,
(Some(_), None) => false,
(Some(self_position), Some(other_position)) => self_position == other_position,
}
}
}
impl Eq for Breakpoint {}
impl Hash for Breakpoint {
fn hash<H: Hasher>(&self, state: &mut H) {
if self.active_position.is_some() {
self.active_position.hash(state);
} else {
self.cached_position.hash(state);
}
}
}
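The location-only semantics described in the comment above can be demonstrated with a minimal sketch. The `Bp` and `Kind` types here are hypothetical stand-ins for `Breakpoint` and `BreakpointKind`; equality and hashing look only at the row, so inserting a second breakpoint at the same location into a `HashSet` is a no-op, which is what makes toggling remove the existing breakpoint regardless of kind.

```rust
// Minimal sketch with stand-in types; mirrors location-only Eq/Hash.
use std::collections::HashSet;
use std::hash::{Hash, Hasher};

#[derive(Clone, Debug)]
enum Kind {
    Standard,
    Log(String),
}

#[derive(Clone, Debug)]
struct Bp {
    row: u32,
    kind: Kind,
}

// Compare and hash on the location only, ignoring the kind.
impl PartialEq for Bp {
    fn eq(&self, other: &Self) -> bool {
        self.row == other.row
    }
}
impl Eq for Bp {}
impl Hash for Bp {
    fn hash<H: Hasher>(&self, state: &mut H) {
        self.row.hash(state);
    }
}

fn main() {
    let mut set: HashSet<Bp> = HashSet::new();
    set.insert(Bp { row: 3, kind: Kind::Standard });
    // Same row, different kind: `insert` returns false because the set
    // already holds a breakpoint at this location.
    let was_new = set.insert(Bp { row: 3, kind: Kind::Log("hit".into()) });
    assert!(!was_new);
    assert_eq!(set.len(), 1);
}
```

Note that this relies on the usual `HashSet` invariant: `Eq` and `Hash` must agree, which is why the real `Breakpoint` hashes the active position when present and the cached position otherwise.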
impl Breakpoint {
pub fn to_source_breakpoint(&self, buffer: &Buffer) -> SourceBreakpoint {
let line = self
.active_position
.map(|position| buffer.summary_for_anchor::<Point>(&position).row)
.unwrap_or(self.cached_position) as u64;
let log_message = match &self.kind {
BreakpointKind::Standard => None,
BreakpointKind::Log(message) => Some(message.clone().to_string()),
};
SourceBreakpoint {
line,
condition: None,
hit_condition: None,
log_message,
column: None,
mode: None,
}
}
pub fn set_active_position(&mut self, buffer: &Buffer) {
if self.active_position.is_none() {
self.active_position =
Some(buffer.breakpoint_anchor(Point::new(self.cached_position, 0)));
}
}
pub fn point_for_buffer(&self, buffer: &Buffer) -> Point {
self.active_position
.map(|position| buffer.summary_for_anchor::<Point>(&position))
.unwrap_or(Point::new(self.cached_position, 0))
}
pub fn point_for_buffer_snapshot(&self, buffer_snapshot: &BufferSnapshot) -> Point {
self.active_position
.map(|position| buffer_snapshot.summary_for_anchor::<Point>(&position))
.unwrap_or(Point::new(self.cached_position, 0))
}
pub fn source_for_snapshot(&self, snapshot: Option<&BufferSnapshot>) -> SourceBreakpoint {
let line = match snapshot {
Some(snapshot) => self
.active_position
.map(|position| snapshot.summary_for_anchor::<Point>(&position).row)
.unwrap_or(self.cached_position) as u64,
None => self.cached_position as u64,
};
let log_message = match &self.kind {
BreakpointKind::Standard => None,
BreakpointKind::Log(log_message) => Some(log_message.clone().to_string()),
};
SourceBreakpoint {
line,
condition: None,
hit_condition: None,
log_message,
column: None,
mode: None,
}
}
pub fn to_serialized(&self, buffer: Option<&Buffer>, path: Arc<Path>) -> SerializedBreakpoint {
match buffer {
Some(buffer) => SerializedBreakpoint {
position: self
.active_position
.map(|position| buffer.summary_for_anchor::<Point>(&position).row)
.unwrap_or(self.cached_position),
path,
kind: self.kind.clone(),
},
None => SerializedBreakpoint {
position: self.cached_position,
path,
kind: self.kind.clone(),
},
}
}
pub fn to_proto(&self) -> Option<client::proto::Breakpoint> {
Some(client::proto::Breakpoint {
position: if let Some(position) = &self.active_position {
Some(serialize_text_anchor(position))
} else {
None
},
cached_position: self.cached_position,
kind: match self.kind {
BreakpointKind::Standard => proto::BreakpointKind::Standard.into(),
BreakpointKind::Log(_) => proto::BreakpointKind::Log.into(),
},
message: if let BreakpointKind::Log(message) = &self.kind {
Some(message.to_string())
} else {
None
},
})
}
pub fn from_proto(breakpoint: client::proto::Breakpoint) -> Option<Self> {
Some(Self {
active_position: if let Some(position) = breakpoint.position.clone() {
deserialize_anchor(position)
} else {
None
},
cached_position: breakpoint.cached_position,
kind: match proto::BreakpointKind::from_i32(breakpoint.kind) {
Some(proto::BreakpointKind::Log) => {
BreakpointKind::Log(breakpoint.message.clone().unwrap_or_default().into())
}
None | Some(proto::BreakpointKind::Standard) => BreakpointKind::Standard,
},
})
}
}
#[derive(Clone, Debug, Hash, PartialEq, Eq)]
pub struct SerializedBreakpoint {
pub position: u32,
pub path: Arc<Path>,
pub kind: BreakpointKind,
}
impl SerializedBreakpoint {
pub fn to_source_breakpoint(&self) -> SourceBreakpoint {
let log_message = match &self.kind {
BreakpointKind::Standard => None,
BreakpointKind::Log(message) => Some(message.clone().to_string()),
};
SourceBreakpoint {
line: self.position as u64,
condition: None,
hit_condition: None,
log_message,
column: None,
mode: None,
}
}
}
