Compare commits

...

41 Commits

Author SHA1 Message Date
Antonio Scandurra
3d81b007bc Go to rope.rs, scroll around line 30. Notice underlines disappearing 2024-10-29 10:37:42 +01:00
Antonio Scandurra
33d12fb8fb Checkpoint 2024-10-28 23:10:49 +01:00
Antonio Scandurra
e1e5cb75a5 WIP 2024-10-28 19:31:04 +01:00
Antonio Scandurra
d71314b4df Start on indexing rope chunks 2024-10-28 16:18:34 +01:00
Bennet Bo Fenner
b13940720a markdown preview: Ignore inline HTML tags in text (#19804)
Follow up to #19785

This PR ensures that we explicitly ignore inline HTML tags so that we can
still extract the text between the tags and show it to the user (see the
sketch after this entry).

Release Notes:

- N/A
2024-10-27 14:34:59 +01:00
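A minimal, std-only sketch of the intent described in the commit above: keep the text, drop the tags. The real change in #19804 operates on the markdown parser's event stream rather than on raw strings, so this is only an illustration of the behaviour, not the actual implementation.

```rust
// Strip inline HTML tags from a line of Markdown-ish text while keeping the
// text between them, so "<kbd>cmd-k</kbd> then <kbd>1</kbd>" still shows as
// "cmd-k then 1". Illustrative only; not Zed's parser.
fn strip_inline_html(input: &str) -> String {
    let mut output = String::with_capacity(input.len());
    let mut in_tag = false;
    for ch in input.chars() {
        match ch {
            '<' => in_tag = true,
            '>' if in_tag => in_tag = false,
            _ if !in_tag => output.push(ch),
            _ => {}
        }
    }
    output
}

fn main() {
    assert_eq!(strip_inline_html("a <b>bold</b> word"), "a bold word");
    println!("{}", strip_inline_html("<kbd>cmd-k</kbd> then <kbd>1</kbd>"));
}
```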
Peter Tripp
db61711753 ci: Don't run GitHub Actions workflows on forks (#19789)
- Closes: https://github.com/zed-industries/zed/issues/19351

Release Notes:

- N/A
2024-10-27 09:04:52 -04:00
Joseph T. Lyons
c12a9f2673 Add fold_at_level test (#19800) 2024-10-26 21:57:55 -04:00
Kirill Bulatov
2e32f1c8a1 Restore horizontal scrollbar checks (#19767)
Closes https://github.com/zed-industries/zed/issues/19637

Follow-up of https://github.com/zed-industries/zed/pull/18927; restores
the condition that removes the horizontal scrollbar when the panel's items
are not long enough.

Release Notes:

- Fixed horizontal scrollbar not being hidden
([#19637](https://github.com/zed-industries/zed/issues/19637))
2024-10-26 21:57:22 +03:00
Bennet Bo Fenner
03a1c8d2b8 markdown preview: Fix infinite loop in parser when parsing list items (#19785)
Release Notes:

- Fixed an issue with the markdown parser when opening a markdown
preview file that contained HTML tags inside a list item
2024-10-26 14:59:46 +02:00
Kirill Bulatov
d7a277607b Fix a few Windows tests (#19773) 2024-10-26 03:32:22 +03:00
Thorsten Ball
fc8a72cdd8 WIP: ssh remoting: Add upload_binary field to SshConnections (#19748)
This removes the old `remote_server { "download_binary_on_host": bool }`
field and replaces it with an `upload_binary: bool` on every
`ssh_connection` (see the sketch after this entry).


@ConradIrwin it compiles, it connects, but I haven't really tested it yet

Release Notes:

- N/A

---------

Co-authored-by: Conrad <conrad@zed.dev>
Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2024-10-25 17:32:54 -06:00
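A rough sketch of where the new flag lives. The types and field names below are illustrative, not Zed's actual settings structs; the point is that `upload_binary` is now a per-connection option rather than a single global `remote_server` setting.

```rust
// Hypothetical shape of the per-connection setting (names are assumptions).
struct SshConnection {
    host: String,
    // When true, Zed uploads the remote server binary over this SSH
    // connection instead of letting the host download it itself.
    upload_binary: bool,
}

fn main() {
    let connections = vec![
        SshConnection { host: "build-box".into(), upload_binary: true },
        SshConnection { host: "staging".into(), upload_binary: false },
    ];
    for conn in &connections {
        println!("{}: upload_binary = {}", conn.host, conn.upload_binary);
    }
}
```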
Kirill Bulatov
1acebb3c47 Remove another false-positive Danger message (#19769)
Follow-up of https://github.com/zed-industries/zed/pull/19151

Ignores any URLs after `Release Notes:` (if present) and after the
`Follow-up of` and `Part of` phrases.


Release Notes:

- N/A
2024-10-26 01:55:46 +03:00
Conrad Irwin
78ed0c9312 vim: Copy comment to new lines with o/O (#19766)
Co-Authored-By: Kurt Wolf <kurtwolfbuilds@gmail.com>

Closes: #4691

Closes #ISSUE

Release Notes:

- vim: o/O now respect `extend_comment_on_newline`
2024-10-25 16:47:44 -06:00
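A simplified illustration of what `extend_comment_on_newline` means for the `o`/`O` behaviour in the commit above: if the current line is a line comment, the newly inserted line starts with the same comment prefix. This is not Zed's implementation, just a sketch of the behaviour.

```rust
// Return the prefix a new line should start with: the indentation, plus the
// comment marker if the current line is a line comment.
fn new_line_prefix(current_line: &str, comment_prefixes: &[&str]) -> String {
    let trimmed = current_line.trim_start();
    let indent = &current_line[..current_line.len() - trimmed.len()];
    for prefix in comment_prefixes {
        if trimmed.starts_with(*prefix) {
            return format!("{indent}{prefix} ");
        }
    }
    indent.to_string()
}

fn main() {
    // Pressing `o` on a comment line continues the comment...
    assert_eq!(new_line_prefix("    // a comment", &["//"]), "    // ");
    // ...while a plain code line only keeps the indentation.
    assert_eq!(new_line_prefix("    let x = 1;", &["//"]), "    ");
}
```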
Conrad Irwin
98d2e5fe73 Quote fixes (#19765)
Closes #19372

Release Notes:

- Fixed autoclosing quotes when the string is already open.
- Added autoclosing of Rust multiline strings.

---------

Co-authored-by: Kurt Wolf <kurtwolfbuilds@gmail.com>
2024-10-25 16:28:08 -06:00
Max Brunsfeld
4325819075 Fix more failure cases of assistant edits (#19653)
* Make `description` optional (since we describe it as optional in the
prompt, and we're currently not showing it)
* Fix a fuzzy-location bug that neglected the cost of deleting prefixes of
the query (see the sketch after this entry).
* Make auto-indent work for single-line edits. Previously, auto-indent
would not occur when overwriting a single line (without inserting or
deleting a newline).

Release Notes:

- N/A
2024-10-25 14:30:34 -07:00
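A toy, std-only version of the fuzzy-location idea from the second bullet above: matching is a minimum-cost alignment, and the fix is to also charge for deleting a prefix of the query (the first statement of the outer loop). The real matrix, visible in the diff further down, additionally tracks the direction of the best step and uses cheaper costs for whitespace.

```rust
// Minimum-cost fuzzy match of `query` against any span of `buffer`.
fn match_cost(query: &str, buffer: &str) -> u32 {
    const INSERTION_COST: u32 = 3;
    const DELETION_COST: u32 = 3;

    let q: Vec<u8> = query.bytes().collect();
    let b: Vec<u8> = buffer.bytes().collect();
    // matrix[row][col]: cheapest alignment of query[..row] ending at buffer[..col].
    let mut matrix = vec![vec![0u32; b.len() + 1]; q.len() + 1];
    for row in 1..=q.len() {
        // The fix: skipping a prefix of the query is not free.
        matrix[row][0] = matrix[row - 1][0] + DELETION_COST;
        for col in 1..=b.len() {
            let up = matrix[row - 1][col] + DELETION_COST; // drop a query byte
            let left = matrix[row][col - 1] + INSERTION_COST; // skip a buffer byte
            let diagonal = matrix[row - 1][col - 1]
                + if q[row - 1] == b[col - 1] {
                    0
                } else {
                    DELETION_COST + INSERTION_COST
                };
            matrix[row][col] = up.min(left).min(diagonal);
        }
    }
    // The match may end anywhere in the buffer; take the cheapest endpoint.
    (1..=b.len())
        .map(|col| matrix[q.len()][col])
        .min()
        .unwrap_or(u32::MAX)
}

fn main() {
    println!("{}", match_cost("fn foo", "fn foo(a: usize) -> usize { 40 }"));
}
```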
Marshall Bowers
c19c89e6df collab: Include checkout_complete query parameter after checking out (#19763)
This PR updates the checkout flow to include the `?checkout_complete=1`
query parameter after successfully checking out.

We'll use this on the account page to adapt the UI accordingly.

Release Notes:

- N/A
2024-10-25 16:29:20 -04:00
Joseph T. Lyons
507929cb79 Add editor: fold at level <level> commands (#19750)
Closes https://github.com/zed-industries/zed/issues/5142

Note that I only moved the cursor to the top of the file so it wouldn't
jump - the commands work no matter where you are in the file.


https://github.com/user-attachments/assets/78c74ca6-5c17-477c-b5d1-97c5665e44b0

Also, is VS Code doing the right thing here, or is it busted?


https://github.com/user-attachments/assets/8c503b50-9671-4221-b9f8-1e692fe8cd9a

Release Notes:

- Added `editor: fold at level <level>` commands. macOS: `cmd-k,
cmd-<number>`, Linux: `ctrl-k, ctrl-<number>`.

---------

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
2024-10-25 16:08:37 -04:00
Max Brunsfeld
7d0a7aff44 Fix condition for re-using highlights when seeking buffer chunks iterator (#19760)
Fixes a syntax highlighting regression introduced in
https://github.com/zed-industries/zed/pull/19531, which caused syntax
highlighting to be missing after any block.

Release Notes:

- N/A
2024-10-25 12:37:03 -07:00
Kirill Bulatov
92ba18342c Properly deserialize active pane in the workspace (#19744)
Without setting the active pane metadata, no center pane events are
emitted on start before the pane is focused manually. That breaks
deserialization of other components such as the outline panel, which should
show the active pane's active item's outline on start.

Release Notes:

- N/A

Co-authored-by: Thorsten Ball <thorsten@zed.dev>
2024-10-25 22:04:09 +03:00
Kirill Bulatov
6de5ace116 Update outline panel representation when a theme is changed (#19747)
Release Notes:

- N/A
2024-10-25 22:04:02 +03:00
Richard Feldman
c9db1b9a7b Add keybindings for accepting hunks (#19749)
I went with Cmd-Shift-Y on macOS (Ctrl-Shift-Y on Linux) for "yes accept
this individual hunk" - both are currently unused.

I went with Cmd-Shift-A on macOS (Ctrl-Alt-A on Linux) for "accept all
hunks" - both are unused. (Ctrl-Shift-A on Linux was taken, as is
Ctrl-Alt-Y, so although the pairing of Ctrl-Shift-Y and Ctrl-Alt-A isn't
necessarily obvious, the letters seem intuitive - "yes" and "all" - and
those key combinations don't conflict with anything.)

Release Notes:

- Added keybindings for applying hunks in Proposed Changes
<img width="247" alt="Screenshot 2024-10-25 at 12 47 00 PM"
src="https://github.com/user-attachments/assets/d6355621-ba80-4ee2-8918-b7239a4d29be">
2024-10-25 14:10:04 -04:00
Nathan Sobo
24cb694494 Update placeholder text with key bindings to focus context panel and navigate history (#19447)
Hopefully, this will help people understand how easy it is to add
context to an inline transformation.

![CleanShot 2024-10-18 at 22 41
00@2x](https://github.com/user-attachments/assets/c09c1d89-3df2-4079-9849-9de7ac63c003)

@as-cii @maxdeviant @rtfeldman could somebody update this to display the
actual correct key bindings and ship it? I have them hard-coded for now.

Release Notes:

- Updated placeholder text with key bindings to focus context panel and
navigate history.

---------

Co-authored-by: Richard Feldman <oss@rtfeldman.com>
2024-10-25 14:09:21 -04:00
Conrad Irwin
85bdd9329b Revert "Show invisibles in editor (#19298)" (#19752)
Closes: #19714

This reverts commit 6dcec47235.

Release Notes:

- (preview only) Fixes a crash when rendering invisibles
2024-10-25 11:59:22 -06:00
Kirill Bulatov
d40ea8fc81 Make macOS bundle script compatible with GNU sed (#19745)
Closes https://github.com/zed-industries/zed/issues/19742

Release Notes:

- N/A
2024-10-25 19:04:38 +03:00
Marshall Bowers
5f9a1482f1 assistant: Make /file emit events as they occur (#19743)
This PR updates the `/file` command to emit its `SlashCommandEvent`s in
a way that can actually be streamed.

Previously, it buffered all of the events and returned them all at once
(see the sketch after this entry).

Note that we still don't yet support streaming in the context editor on
`main`, so there won't be any visible changes just yet.

Release Notes:

- N/A
2024-10-25 11:02:27 -04:00
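A minimal sketch of the buffering-versus-streaming difference, using plain std channels and strings instead of Zed's `SlashCommandEvent` types: the command sends each event as soon as it is produced, and the caller gets the receiver immediately instead of waiting for a fully buffered `Vec`.

```rust
use std::sync::mpsc;
use std::thread;

// Produce events on a background thread and hand back the receiver at once.
fn run_command_streaming(paths: Vec<String>) -> mpsc::Receiver<String> {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        for path in paths {
            // Each event is sent as soon as it is ready, not collected first.
            let _ = tx.send(format!("section: {path}"));
        }
    });
    rx // the consumer can start rendering before the command finishes
}

fn main() {
    let rx = run_command_streaming(vec!["src/main.rs".into(), "Cargo.toml".into()]);
    for event in rx {
        println!("{event}");
    }
}
```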
Thorsten Ball
5c2238c7a5 ssh remoting: Use matching versions of remote server binary (#19740)
This changes the download logic to fetch the version matching the current
version of Zed instead of the latest version (see the sketch after this
entry).


Release Notes:

- Changed the update logic of the SSH remote server to fetch the version
matching the current Zed version instead of the latest version for the
current channel. If Zed is updated, the server is updated too. If the
server is newer than the Zed version, an error is displayed.
2024-10-25 16:27:36 +02:00
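A rough sketch of the version-matching policy described above. The base URL, paths, and version string are illustrative rather than the real endpoints: when the client knows its own version it requests exactly that remote-server build, and it only falls back to the latest release on the channel otherwise.

```rust
// Build the download URL for the remote server binary (illustrative URLs).
fn remote_server_url(channel: &str, zed_version: Option<&str>, os: &str, arch: &str) -> String {
    match zed_version {
        // Pin to the running Zed version so client and server stay in lockstep.
        Some(version) => format!(
            "https://example.invalid/api/releases/{channel}/{version}/zed-remote-server-{os}-{arch}.gz"
        ),
        // Fallback: the previous behaviour, fetch the latest release on the channel.
        None => format!(
            "https://example.invalid/api/releases/latest?asset=zed-remote-server&os={os}&arch={arch}"
        ),
    }
}

fn main() {
    println!("{}", remote_server_url("stable", Some("0.159.0"), "linux", "x86_64"));
    println!("{}", remote_server_url("stable", None, "linux", "x86_64"));
}
```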
Piotr Osiewicz
5769065f27 project panel: Persist full filename when renaming auto-folded entries (#19728)
This fixes a debug-only panic when processing filenames. The underflow
that happens in Preview/Stable shouldn't cause any issues (other than
maybe unmarking an entry in the project panel).

/cc @notpeter

Closes #ISSUE

Release Notes:

- N/A
2024-10-25 13:47:01 +02:00
Thorsten Ball
0173479d18 ssh remoting: Lock file becomes stale if connection drops & no update if binary is running (#19724)
Release Notes:

- Changed the update process of the remote server binary to not attempt
an update if we can detect that the current binary is used by another
process.
- Changed the update process of the remote server binary to mark the
lock file as stale in case the SSH connection of the process that
created the lock file isn't open anymore.
2024-10-25 12:42:31 +02:00
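A generic illustration of the stale-lock idea. The real check described above looks at whether the SSH connection that created the lock file is still open; this std-only sketch substitutes a simple file-age check for that signal, so the shape of the decision is the same but the signal is not.

```rust
use std::fs;
use std::path::Path;
use std::time::{Duration, SystemTime};

// Treat the lock as stale if it is older than `max_age` (stand-in signal;
// the real implementation checks the creating SSH connection instead).
fn lock_is_stale(lock_path: &Path, max_age: Duration) -> bool {
    match fs::metadata(lock_path).and_then(|meta| meta.modified()) {
        Ok(modified) => SystemTime::now()
            .duration_since(modified)
            .map(|age| age > max_age)
            .unwrap_or(false),
        // Missing or unreadable lock file: nothing is holding the update back.
        Err(_) => true,
    }
}

fn main() {
    let stale = lock_is_stale(Path::new("/tmp/zed-remote-server.lock"), Duration::from_secs(600));
    println!("lock stale: {stale}");
}
```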
Max Brunsfeld
08a3c54bac Allow editor blocks to replace ranges of text (#19531)
This PR adds the ability for editor blocks to replace lines of text, but
does not yet use that feature anywhere. We'll update assistant patches
to use replace blocks on another branch:
https://github.com/zed-industries/zed/tree/assistant-patch-replace-blocks

Release Notes:

- N/A

---------

Co-authored-by: Antonio Scandurra <me@as-cii.com>
Co-authored-by: Richard Feldman <richard@zed.dev>
Co-authored-by: Marshall Bowers <marshall@zed.dev>
Co-authored-by: Nathan Sobo <nathan@zed.dev>
2024-10-25 12:29:25 +02:00
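A conceptual sketch of the new capability. The variant names below are assumptions rather than the exact Zed API: the point is that a block can now replace a range of buffer rows instead of only being rendered above or below a single row.

```rust
use std::ops::RangeInclusive;

// Hypothetical placement enum; Above and Below appear in the diffs further
// down, while a Replace-style variant is the newly added capability.
enum BlockPlacement {
    Above(u32),                   // render the block above this buffer row
    Below(u32),                   // render the block below this buffer row
    Replace(RangeInclusive<u32>), // hide these rows and render the block instead
}

fn describe(placement: &BlockPlacement) -> String {
    match placement {
        BlockPlacement::Above(row) => format!("above row {row}"),
        BlockPlacement::Below(row) => format!("below row {row}"),
        BlockPlacement::Replace(rows) => {
            format!("replacing rows {}..={}", rows.start(), rows.end())
        }
    }
}

fn main() {
    println!("{}", describe(&BlockPlacement::Replace(10..=14)));
}
```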
Piotr Osiewicz
3617873431 project panel: Fix interactions with auto-folded directories (#19723)
Closes https://github.com/zed-industries/zed/issues/19566

Release Notes:

- N/A

---------

Co-authored-by: Peter Tripp <peter@zed.dev>
2024-10-25 12:13:21 +02:00
Bennet Bo Fenner
6eb6788201 image viewer: Reuse existing tabs (#19717)

Fixes #9896

Release Notes:

- Fixed an issue where clicking on an image inside the project panel
would not re-use an existing image tab

Co-authored-by: Kirill <kirill@zed.dev>
Co-authored-by: Mikayla <mikayla@zed.dev>
2024-10-25 09:34:50 +02:00
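An illustrative sketch of the tab-reuse logic with simplified pane and item types (Zed's real workspace types differ): before opening a new image tab, look for an existing item that already shows the same path and activate it instead.

```rust
use std::path::PathBuf;

struct Pane {
    items: Vec<PathBuf>, // paths of the open image tabs, heavily simplified
    active: Option<usize>,
}

impl Pane {
    fn open_image(&mut self, path: PathBuf) {
        let existing = self.items.iter().position(|item| *item == path);
        match existing {
            // Re-use the tab that already shows this image.
            Some(index) => self.active = Some(index),
            None => {
                self.items.push(path);
                self.active = Some(self.items.len() - 1);
            }
        }
    }
}

fn main() {
    let mut pane = Pane { items: Vec::new(), active: None };
    pane.open_image(PathBuf::from("logo.png"));
    pane.open_image(PathBuf::from("logo.png")); // second open re-uses the tab
    assert_eq!(pane.items.len(), 1);
    assert_eq!(pane.active, Some(0));
}
```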
Conrad Irwin
ebc3031fd9 Inline initialization (#19711)
This restores all the init behaviour into `main` again. This means we
never need to call `init_ui` (and so we can't call it more than once).

Release Notes:

- (Nightly only) Fixes a panic when using the CLI to open another file
in a running Zed.
2024-10-24 20:43:40 -06:00
Danilo Leal
42a7402cc5 assistant: Use a labeled button for the slash command menu (#19703)
This should help a bit with the discoverability of the slash commands.

Release Notes:

- N/A
2024-10-24 19:37:42 -03:00
Danilo Leal
6cd5c9e32f assistant: Tweak the model selector design (#19704)
Exploring using the UI font for it, as that is more common for dropdowns
and popovers throughout the app. It feels lighter and also narrower!

| Before | After |
|--------|--------|
| <img width="1296" alt="Screenshot 2024-10-24 at 16 39 04"
src="https://github.com/user-attachments/assets/0412f922-77a9-4d83-adf9-5632534d6c5b">
| <img width="1296" alt="Screenshot 2024-10-24 at 16 38 26"
src="https://github.com/user-attachments/assets/8bf52ba7-fda7-4437-b53e-903c282f2931">
|

Release Notes:

- N/A
2024-10-24 19:37:09 -03:00
Conrad Irwin
d45b830412 SSH connection pooling (#19692)
Co-Authored-By: Max <max@zed.dev>

Closes #ISSUE

Release Notes:

- SSH Remoting: Reuse connections across hosts

---------

Co-authored-by: Max <max@zed.dev>
2024-10-24 14:37:54 -06:00
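A hand-wavy sketch of connection reuse with made-up types: connections are cached by their connection options, so two projects opened on the same host share a single SSH session instead of each spawning their own.

```rust
use std::collections::HashMap;

#[derive(Clone, PartialEq, Eq, Hash)]
struct SshConnectionOptions {
    host: String,
    port: u16,
    username: String,
}

// Stand-in for a live SSH session.
struct Connection {
    options: SshConnectionOptions,
}

#[derive(Default)]
struct ConnectionPool {
    connections: HashMap<SshConnectionOptions, Connection>,
}

impl ConnectionPool {
    // Return the existing connection for these options, creating one if needed.
    fn connect(&mut self, options: SshConnectionOptions) -> &Connection {
        self.connections
            .entry(options.clone())
            .or_insert_with(|| Connection { options })
    }
}

fn main() {
    let mut pool = ConnectionPool::default();
    let opts = SshConnectionOptions {
        host: "example-host".into(),
        port: 22,
        username: "me".into(),
    };
    let conn = pool.connect(opts.clone());
    println!("connected to {}", conn.options.host);
    pool.connect(opts); // second call re-uses the pooled connection
    assert_eq!(pool.connections.len(), 1);
}
```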
Conrad Irwin
3a9c071e6e Fix partial downloads of ssh remote server (#19700)
Release Notes:

- SSH Remoting: fix a bug where interrupting the SSH connection process
could leave your local binary cached in an invalid state
2024-10-24 14:37:02 -06:00
Marshall Bowers
ca861bb1bb ui: Fix swapped element background colors (#19701)
This PR fixes an issue introduced in
https://github.com/zed-industries/zed/pull/18768 where the element
background colors for `ElevationIndex::ElevatedSurface` and
`ElevationIndex::Surface` were swapped.

Release Notes:

- N/A
2024-10-24 16:34:45 -04:00
Kirill Bulatov
454d3dd52b Fix ssh project history (#19683)
Use `Fs` instead of `std::fs` and do entry existence checks better:
* first, check for the worktree entry's existence without any FS checks
* then, only for local cases, use `Fs` to check for abs_path existence of
items, in case those came from single-file worktrees that got closed
and removed.

Remote entries do not get file existence checks, so they might try to open
previously removed buffers for now.

Release Notes:

- N/A
2024-10-24 21:49:07 +03:00
Peter Tripp
3ec015b325 docs: Example theme_overrides for docstrings as italic (#19694) 2024-10-24 14:37:57 -04:00
Mikayla Maki
02718284ef Remove dev servers (#19638)
TODO:

- [ ] Check that workspace migration worked
- [ ] Add server migrations and make sure SeaORM files are in sync
(maybe?)

Release Notes:

- N/A

---------

Co-authored-by: Conrad <conrad@zed.dev>
Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2024-10-24 12:14:03 -06:00
Marshall Bowers
b5f816dde5 assistant: Add implementation for /delta argument completion (#19693)
This PR fixes a panic that could occur when trying to complete arguments
for the `/delta` slash command.

We were using `unimplemented!()` instead of providing a default no-op
implementation like we do for other slash commands that do not support
completing arguments.

Closes https://github.com/zed-industries/zed/issues/19686.

Release Notes:

- Fixed a panic that could occur when trying to complete arguments with
the `/delta` command.
2024-10-24 13:39:06 -04:00
126 changed files with 5184 additions and 7954 deletions

View File

@@ -91,6 +91,7 @@ jobs:
macos_tests:
timeout-minutes: 60
name: (macOS) Run Clippy and tests
if: github.repository_owner == 'zed-industries'
runs-on:
- self-hosted
- test
@@ -126,6 +127,7 @@ jobs:
linux_tests:
timeout-minutes: 60
name: (Linux) Run Clippy and tests
if: github.repository_owner == 'zed-industries'
runs-on:
- buildjet-16vcpu-ubuntu-2204
steps:
@@ -158,6 +160,7 @@ jobs:
build_remote_server:
timeout-minutes: 60
name: (Linux) Build Remote Server
if: github.repository_owner == 'zed-industries'
runs-on:
- buildjet-16vcpu-ubuntu-2204
steps:
@@ -185,6 +188,7 @@ jobs:
windows_tests:
timeout-minutes: 60
name: (Windows) Run Clippy and tests
if: github.repository_owner == 'zed-industries'
runs-on: hosted-windows-1
steps:
- name: Checkout repo

46
Cargo.lock generated
View File

@@ -2550,7 +2550,6 @@ dependencies = [
"ctor",
"dashmap 6.0.1",
"derive_more",
"dev_server_projects",
"editor",
"env_logger",
"envy",
@@ -2561,7 +2560,6 @@ dependencies = [
"git_hosting_providers",
"google_ai",
"gpui",
"headless",
"hex",
"http_client",
"hyper 0.14.30",
@@ -3476,18 +3474,6 @@ dependencies = [
"syn 1.0.109",
]
[[package]]
name = "dev_server_projects"
version = "0.1.0"
dependencies = [
"anyhow",
"client",
"gpui",
"rpc",
"serde",
"serde_json",
]
[[package]]
name = "diagnostics"
version = "0.1.0"
@@ -3725,7 +3711,6 @@ dependencies = [
"tree-sitter-rust",
"tree-sitter-typescript",
"ui",
"unicode-segmentation",
"unindent",
"url",
"util",
@@ -5274,28 +5259,6 @@ dependencies = [
"http 0.2.12",
]
[[package]]
name = "headless"
version = "0.1.0"
dependencies = [
"anyhow",
"client",
"extension",
"fs",
"futures 0.3.30",
"gpui",
"language",
"log",
"node_runtime",
"postage",
"project",
"proto",
"settings",
"shellexpand 2.1.2",
"signal-hook",
"util",
]
[[package]]
name = "heck"
version = "0.3.3"
@@ -8443,7 +8406,6 @@ dependencies = [
"client",
"clock",
"collections",
"dev_server_projects",
"env_logger",
"fs",
"futures 0.3.30",
@@ -8981,8 +8943,6 @@ version = "0.1.0"
dependencies = [
"anyhow",
"auto_update",
"client",
"dev_server_projects",
"editor",
"file_finder",
"futures 0.3.30",
@@ -8999,14 +8959,12 @@ dependencies = [
"project",
"release_channel",
"remote",
"rpc",
"schemars",
"serde",
"serde_json",
"settings",
"smol",
"task",
"terminal_view",
"theme",
"ui",
"util",
@@ -11912,7 +11870,6 @@ dependencies = [
"client",
"collections",
"command_palette",
"dev_server_projects",
"editor",
"extensions_ui",
"feature_flags",
@@ -14309,7 +14266,6 @@ dependencies = [
"collections",
"db",
"derive_more",
"dev_server_projects",
"env_logger",
"fs",
"futures 0.3.30",
@@ -14628,7 +14584,6 @@ dependencies = [
"command_palette_hooks",
"copilot",
"db",
"dev_server_projects",
"diagnostics",
"editor",
"env_logger",
@@ -14644,7 +14599,6 @@ dependencies = [
"git_hosting_providers",
"go_to_line",
"gpui",
"headless",
"http_client",
"image_viewer",
"inline_completion_button",

View File

@@ -23,7 +23,6 @@ members = [
"crates/context_servers",
"crates/copilot",
"crates/db",
"crates/dev_server_projects",
"crates/diagnostics",
"crates/docs_preprocessor",
"crates/editor",
@@ -45,7 +44,6 @@ members = [
"crates/google_ai",
"crates/gpui",
"crates/gpui_macros",
"crates/headless",
"crates/html_to_markdown",
"crates/http_client",
"crates/image_viewer",
@@ -201,7 +199,6 @@ command_palette_hooks = { path = "crates/command_palette_hooks" }
context_servers = { path = "crates/context_servers" }
copilot = { path = "crates/copilot" }
db = { path = "crates/db" }
dev_server_projects = { path = "crates/dev_server_projects" }
diagnostics = { path = "crates/diagnostics" }
editor = { path = "crates/editor" }
extension = { path = "crates/extension" }
@@ -219,7 +216,6 @@ go_to_line = { path = "crates/go_to_line" }
google_ai = { path = "crates/google_ai" }
gpui = { path = "crates/gpui", default-features = false, features = ["http_client"]}
gpui_macros = { path = "crates/gpui_macros" }
headless = { path = "crates/headless" }
html_to_markdown = { path = "crates/html_to_markdown" }
http_client = { path = "crates/http_client" }
image_viewer = { path = "crates/image_viewer" }
@@ -468,7 +464,7 @@ tree-sitter-typescript = "0.23"
tree-sitter-yaml = { git = "https://github.com/zed-industries/tree-sitter-yaml", rev = "baff0b51c64ef6a1fb1f8390f3ad6015b83ec13a" }
unicase = "2.6"
unindent = "0.1.7"
unicode-segmentation = "1.11"
unicode-segmentation = "1.10"
url = "2.2"
uuid = { version = "1.1.2", features = ["v4", "v5", "serde"] }
wasmparser = "0.215"

View File

@@ -313,6 +313,15 @@
"ctrl-k ctrl-l": "editor::ToggleFold",
"ctrl-k ctrl-[": "editor::FoldRecursive",
"ctrl-k ctrl-]": "editor::UnfoldRecursive",
"ctrl-k ctrl-1": ["editor::FoldAtLevel", { "level": 1 }],
"ctrl-k ctrl-2": ["editor::FoldAtLevel", { "level": 2 }],
"ctrl-k ctrl-3": ["editor::FoldAtLevel", { "level": 3 }],
"ctrl-k ctrl-4": ["editor::FoldAtLevel", { "level": 4 }],
"ctrl-k ctrl-5": ["editor::FoldAtLevel", { "level": 5 }],
"ctrl-k ctrl-6": ["editor::FoldAtLevel", { "level": 6 }],
"ctrl-k ctrl-7": ["editor::FoldAtLevel", { "level": 7 }],
"ctrl-k ctrl-8": ["editor::FoldAtLevel", { "level": 8 }],
"ctrl-k ctrl-9": ["editor::FoldAtLevel", { "level": 9 }],
"ctrl-k ctrl-0": "editor::FoldAll",
"ctrl-k ctrl-j": "editor::UnfoldAll",
"ctrl-space": "editor::ShowCompletions",
@@ -505,6 +514,13 @@
"ctrl-enter": "assistant::InlineAssist"
}
},
{
"context": "ProposedChangesEditor",
"bindings": {
"ctrl-shift-y": "editor::ApplyDiffHunk",
"ctrl-alt-a": "editor::ApplyAllDiffHunks"
}
},
{
"context": "Editor && jupyter && !ContextEditor",
"bindings": {

View File

@@ -349,7 +349,15 @@
"alt-cmd-]": "editor::UnfoldLines",
"cmd-k cmd-l": "editor::ToggleFold",
"cmd-k cmd-[": "editor::FoldRecursive",
"cmd-k cmd-]": "editor::UnfoldRecursive",
"cmd-k cmd-1": ["editor::FoldAtLevel", { "level": 1 }],
"cmd-k cmd-2": ["editor::FoldAtLevel", { "level": 2 }],
"cmd-k cmd-3": ["editor::FoldAtLevel", { "level": 3 }],
"cmd-k cmd-4": ["editor::FoldAtLevel", { "level": 4 }],
"cmd-k cmd-5": ["editor::FoldAtLevel", { "level": 5 }],
"cmd-k cmd-6": ["editor::FoldAtLevel", { "level": 6 }],
"cmd-k cmd-7": ["editor::FoldAtLevel", { "level": 7 }],
"cmd-k cmd-8": ["editor::FoldAtLevel", { "level": 8 }],
"cmd-k cmd-9": ["editor::FoldAtLevel", { "level": 9 }],
"cmd-k cmd-0": "editor::FoldAll",
"cmd-k cmd-j": "editor::UnfoldAll",
"ctrl-space": "editor::ShowCompletions",
@@ -538,6 +546,13 @@
"ctrl-enter": "assistant::InlineAssist"
}
},
{
"context": "ProposedChangesEditor",
"bindings": {
"cmd-shift-y": "editor::ApplyDiffHunk",
"cmd-shift-a": "editor::ApplyAllDiffHunks"
}
},
{
"context": "PromptEditor",
"bindings": {

View File

@@ -88,7 +88,6 @@ origin: (f64, f64),
<edit>
<path>src/shapes/rectangle.rs</path>
<description>Update the Rectangle's new function to take an origin parameter</description>
<operation>update</operation>
<old_text>
fn new(width: f64, height: f64) -> Self {
@@ -117,7 +116,6 @@ pub struct Circle {
<edit>
<path>src/shapes/circle.rs</path>
<description>Update the Circle's new function to take an origin parameter</description>
<operation>update</operation>
<old_text>
fn new(radius: f64) -> Self {
@@ -134,7 +132,6 @@ fn new(origin: (f64, f64), radius: f64) -> Self {
<edit>
<path>src/shapes/rectangle.rs</path>
<description>Add an import for the std::fmt module</description>
<operation>insert_before</operation>
<old_text>
struct Rectangle {
@@ -147,7 +144,10 @@ use std::fmt;
<edit>
<path>src/shapes/rectangle.rs</path>
<description>Add a Display implementation for Rectangle</description>
<description>
Add a manual Display implementation for Rectangle.
Currently, this is the same as a derived Display implementation.
</description>
<operation>insert_after</operation>
<old_text>
Rectangle { width, height }
@@ -169,7 +169,6 @@ impl fmt::Display for Rectangle {
<edit>
<path>src/shapes/circle.rs</path>
<description>Add an import for the `std::fmt` module</description>
<operation>insert_before</operation>
<old_text>
struct Circle {
@@ -181,7 +180,6 @@ use std::fmt;
<edit>
<path>src/shapes/circle.rs</path>
<description>Add a Display implementation for Circle</description>
<operation>insert_after</operation>
<old_text>
Circle { radius }

View File

@@ -26,8 +26,8 @@ use collections::{BTreeSet, HashMap, HashSet};
use editor::{
actions::{FoldAt, MoveToEndOfLine, Newline, ShowCompletions, UnfoldAt},
display_map::{
BlockContext, BlockDisposition, BlockId, BlockProperties, BlockStyle, Crease,
CreaseMetadata, CustomBlockId, FoldId, RenderBlock, ToDisplayPoint,
BlockContext, BlockId, BlockPlacement, BlockProperties, BlockStyle, Crease, CreaseMetadata,
CustomBlockId, FoldId, RenderBlock, ToDisplayPoint,
},
scroll::{Autoscroll, AutoscrollStrategy},
Anchor, Editor, EditorEvent, ProposedChangeLocation, ProposedChangesEditor, RowExt,
@@ -963,7 +963,7 @@ impl AssistantPanel {
fn new_context(&mut self, cx: &mut ViewContext<Self>) -> Option<View<ContextEditor>> {
let project = self.project.read(cx);
if project.is_via_collab() && project.dev_server_project_id().is_none() {
if project.is_via_collab() {
let task = self
.context_store
.update(cx, |store, cx| store.create_remote_context(cx));
@@ -2009,13 +2009,12 @@ impl ContextEditor {
})
.map(|(command, error_message)| BlockProperties {
style: BlockStyle::Fixed,
position: Anchor {
height: 1,
placement: BlockPlacement::Below(Anchor {
buffer_id: Some(buffer_id),
excerpt_id,
text_anchor: command.source_range.start,
},
height: 1,
disposition: BlockDisposition::Below,
}),
render: slash_command_error_block_renderer(error_message),
priority: 0,
}),
@@ -2242,11 +2241,10 @@ impl ContextEditor {
} else {
let block_ids = editor.insert_blocks(
[BlockProperties {
position: patch_start,
height: path_count as u32 + 1,
style: BlockStyle::Flex,
render: render_block,
disposition: BlockDisposition::Below,
placement: BlockPlacement::Below(patch_start),
priority: 0,
}],
None,
@@ -2731,12 +2729,13 @@ impl ContextEditor {
})
};
let create_block_properties = |message: &Message| BlockProperties {
position: buffer
.anchor_in_excerpt(excerpt_id, message.anchor_range.start)
.unwrap(),
height: 2,
style: BlockStyle::Sticky,
disposition: BlockDisposition::Above,
placement: BlockPlacement::Above(
buffer
.anchor_in_excerpt(excerpt_id, message.anchor_range.start)
.unwrap(),
),
priority: usize::MAX,
render: render_block(MessageMetadata::from(message)),
};
@@ -3372,7 +3371,7 @@ impl ContextEditor {
let anchor = buffer.anchor_in_excerpt(excerpt_id, anchor).unwrap();
let image = render_image.clone();
anchor.is_valid(&buffer).then(|| BlockProperties {
position: anchor,
placement: BlockPlacement::Above(anchor),
height: MAX_HEIGHT_IN_LINES,
style: BlockStyle::Sticky,
render: Box::new(move |cx| {
@@ -3393,8 +3392,6 @@ impl ContextEditor {
)
.into_any_element()
}),
disposition: BlockDisposition::Above,
priority: 0,
})
})
@@ -3949,7 +3946,7 @@ impl Render for ContextEditor {
.bg(cx.theme().colors().editor_background)
.child(
h_flex()
.gap_2()
.gap_1()
.child(render_inject_context_menu(cx.view().downgrade(), cx))
.child(
IconButton::new("quote-button", IconName::Quote)
@@ -4249,11 +4246,11 @@ fn render_inject_context_menu(
slash_command_picker::SlashCommandSelector::new(
commands.clone(),
active_context_editor,
IconButton::new("trigger", IconName::SlashSquare)
Button::new("trigger", "Add Context")
.icon(IconName::Plus)
.icon_size(IconSize::Small)
.tooltip(|cx| {
Tooltip::with_meta("Insert Context", None, "Type / to insert via keyboard", cx)
}),
.icon_position(IconPosition::Start)
.tooltip(|cx| Tooltip::text("Type / to insert via keyboard", cx)),
)
}

View File

@@ -636,7 +636,7 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
kind: AssistantEditKind::InsertAfter {
old_text: "fn one".into(),
new_text: "fn two() {}".into(),
description: "add a `two` function".into(),
description: Some("add a `two` function".into()),
},
}]],
cx,
@@ -690,7 +690,7 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
kind: AssistantEditKind::InsertAfter {
old_text: "fn zero".into(),
new_text: "fn two() {}".into(),
description: "add a `two` function".into(),
description: Some("add a `two` function".into()),
},
}]],
cx,
@@ -754,7 +754,7 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
kind: AssistantEditKind::InsertAfter {
old_text: "fn zero".into(),
new_text: "fn two() {}".into(),
description: "add a `two` function".into(),
description: Some("add a `two` function".into()),
},
}]],
cx,
@@ -798,7 +798,7 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
kind: AssistantEditKind::InsertAfter {
old_text: "fn zero".into(),
new_text: "fn two() {}".into(),
description: "add a `two` function".into(),
description: Some("add a `two` function".into()),
},
}]],
cx,

View File

@@ -9,7 +9,7 @@ use collections::{hash_map, HashMap, HashSet, VecDeque};
use editor::{
actions::{MoveDown, MoveUp, SelectAll},
display_map::{
BlockContext, BlockDisposition, BlockProperties, BlockStyle, CustomBlockId, RenderBlock,
BlockContext, BlockPlacement, BlockProperties, BlockStyle, CustomBlockId, RenderBlock,
ToDisplayPoint,
},
Anchor, AnchorRangeExt, CodeActionProvider, Editor, EditorElement, EditorEvent, EditorMode,
@@ -54,7 +54,7 @@ use telemetry_events::{AssistantEvent, AssistantKind, AssistantPhase};
use terminal_view::terminal_panel::TerminalPanel;
use text::{OffsetRangeExt, ToPoint as _};
use theme::ThemeSettings;
use ui::{prelude::*, CheckboxWithLabel, IconButtonShape, Popover, Tooltip};
use ui::{prelude::*, text_for_action, CheckboxWithLabel, IconButtonShape, Popover, Tooltip};
use util::{RangeExt, ResultExt};
use workspace::{notifications::NotificationId, ItemHandle, Toast, Workspace};
@@ -446,15 +446,14 @@ impl InlineAssistant {
let assist_blocks = vec![
BlockProperties {
style: BlockStyle::Sticky,
position: range.start,
placement: BlockPlacement::Above(range.start),
height: prompt_editor_height,
render: build_assist_editor_renderer(prompt_editor),
disposition: BlockDisposition::Above,
priority: 0,
},
BlockProperties {
style: BlockStyle::Sticky,
position: range.end,
placement: BlockPlacement::Below(range.end),
height: 0,
render: Box::new(|cx| {
v_flex()
@@ -464,7 +463,6 @@ impl InlineAssistant {
.border_color(cx.theme().status().info_border)
.into_any_element()
}),
disposition: BlockDisposition::Below,
priority: 0,
},
];
@@ -1179,7 +1177,7 @@ impl InlineAssistant {
let height =
deleted_lines_editor.update(cx, |editor, cx| editor.max_point(cx).row().0 + 1);
new_blocks.push(BlockProperties {
position: new_row,
placement: BlockPlacement::Above(new_row),
height,
style: BlockStyle::Flex,
render: Box::new(move |cx| {
@@ -1191,7 +1189,6 @@ impl InlineAssistant {
.child(deleted_lines_editor.clone())
.into_any_element()
}),
disposition: BlockDisposition::Above,
priority: 0,
});
}
@@ -1599,7 +1596,7 @@ impl PromptEditor {
// always show the cursor (even when it isn't focused) because
// typing in one will make what you typed appear in all of them.
editor.set_show_cursor_when_unfocused(true, cx);
editor.set_placeholder_text("Add a prompt…", cx);
editor.set_placeholder_text(Self::placeholder_text(codegen.read(cx), cx), cx);
editor
});
@@ -1656,6 +1653,7 @@ impl PromptEditor {
self.editor = cx.new_view(|cx| {
let mut editor = Editor::auto_height(Self::MAX_LINES as usize, cx);
editor.set_soft_wrap_mode(language::language_settings::SoftWrap::EditorWidth, cx);
editor.set_placeholder_text(Self::placeholder_text(self.codegen.read(cx), cx), cx);
editor.set_placeholder_text("Add a prompt…", cx);
editor.set_text(prompt, cx);
if focus {
@@ -1666,6 +1664,20 @@ impl PromptEditor {
self.subscribe_to_editor(cx);
}
fn placeholder_text(codegen: &Codegen, cx: &WindowContext) -> String {
let context_keybinding = text_for_action(&crate::ToggleFocus, cx)
.map(|keybinding| format!("{keybinding} for context"))
.unwrap_or_default();
let action = if codegen.is_insertion {
"Generate"
} else {
"Transform"
};
format!("{action}{context_keybinding} • ↓↑ for history")
}
fn prompt(&self, cx: &AppContext) -> String {
self.editor.read(cx).text(cx)
}
@@ -2263,6 +2275,7 @@ pub struct Codegen {
initial_transaction_id: Option<TransactionId>,
telemetry: Option<Arc<Telemetry>>,
builder: Arc<PromptBuilder>,
is_insertion: bool,
}
impl Codegen {
@@ -2285,6 +2298,7 @@ impl Codegen {
)
});
let mut this = Self {
is_insertion: range.to_offset(&buffer.read(cx).snapshot(cx)).is_empty(),
alternatives: vec![codegen],
active_alternative: 0,
seen_alternatives: HashSet::default(),
@@ -2686,7 +2700,7 @@ impl CodegenAlternative {
let prompt = self
.builder
.generate_content_prompt(user_prompt, language_name, buffer, range)
.generate_inline_transformation_prompt(user_prompt, language_name, buffer, range)
.map_err(|e| anyhow::anyhow!("Failed to generate content prompt: {}", e))?;
let mut messages = Vec::new();

View File

@@ -158,39 +158,34 @@ impl PickerDelegate for ModelPickerDelegate {
.spacing(ListItemSpacing::Sparse)
.selected(selected)
.start_slot(
div().pr_1().child(
div().pr_0p5().child(
Icon::new(model_info.icon)
.color(Color::Muted)
.size(IconSize::Medium),
),
)
.child(
h_flex()
.w_full()
.justify_between()
.font_buffer(cx)
.min_w(px(240.))
.child(
h_flex()
.gap_2()
.child(Label::new(model_info.model.name().0.clone()))
.child(
Label::new(provider_name)
.size(LabelSize::XSmall)
.color(Color::Muted),
)
.children(match model_info.availability {
LanguageModelAvailability::Public => None,
LanguageModelAvailability::RequiresPlan(Plan::Free) => None,
LanguageModelAvailability::RequiresPlan(Plan::ZedPro) => {
show_badges.then(|| {
Label::new("Pro")
.size(LabelSize::XSmall)
.color(Color::Muted)
})
}
}),
),
h_flex().w_full().justify_between().min_w(px(200.)).child(
h_flex()
.gap_1p5()
.child(Label::new(model_info.model.name().0.clone()))
.child(
Label::new(provider_name)
.size(LabelSize::XSmall)
.color(Color::Muted),
)
.children(match model_info.availability {
LanguageModelAvailability::Public => None,
LanguageModelAvailability::RequiresPlan(Plan::Free) => None,
LanguageModelAvailability::RequiresPlan(Plan::ZedPro) => {
show_badges.then(|| {
Label::new("Pro")
.size(LabelSize::XSmall)
.color(Color::Muted)
})
}
}),
),
)
.end_slot(div().when(model_info.is_selected, |this| {
this.child(
@@ -212,7 +207,7 @@ impl PickerDelegate for ModelPickerDelegate {
h_flex()
.w_full()
.border_t_1()
.border_color(cx.theme().colors().border)
.border_color(cx.theme().colors().border_variant)
.p_1()
.gap_4()
.justify_between()

View File

@@ -33,21 +33,21 @@ pub enum AssistantEditKind {
Update {
old_text: String,
new_text: String,
description: String,
description: Option<String>,
},
Create {
new_text: String,
description: String,
description: Option<String>,
},
InsertBefore {
old_text: String,
new_text: String,
description: String,
description: Option<String>,
},
InsertAfter {
old_text: String,
new_text: String,
description: String,
description: Option<String>,
},
Delete {
old_text: String,
@@ -86,19 +86,37 @@ enum SearchDirection {
Diagonal,
}
// A measure of the current quality of an in-progress fuzzy search.
//
// Uses 60 bits to store a numeric cost, and 4 bits to store the preceding
// operation in the search.
#[derive(Copy, Clone, PartialEq, Eq, PartialOrd, Ord)]
struct SearchState {
score: u32,
cost: u32,
direction: SearchDirection,
}
impl SearchState {
fn new(score: u32, direction: SearchDirection) -> Self {
Self { score, direction }
fn new(cost: u32, direction: SearchDirection) -> Self {
Self { cost, direction }
}
}
struct SearchMatrix {
cols: usize,
data: Vec<SearchState>,
}
impl SearchMatrix {
fn new(rows: usize, cols: usize) -> Self {
SearchMatrix {
cols,
data: vec![SearchState::new(0, SearchDirection::Diagonal); rows * cols],
}
}
fn get(&self, row: usize, col: usize) -> SearchState {
self.data[row * self.cols + col]
}
fn set(&mut self, row: usize, col: usize, cost: SearchState) {
self.data[row * self.cols + col] = cost;
}
}
@@ -187,23 +205,23 @@ impl AssistantEdit {
"update" => AssistantEditKind::Update {
old_text: old_text.ok_or_else(|| anyhow!("missing old_text"))?,
new_text: new_text.ok_or_else(|| anyhow!("missing new_text"))?,
description: description.ok_or_else(|| anyhow!("missing description"))?,
description,
},
"insert_before" => AssistantEditKind::InsertBefore {
old_text: old_text.ok_or_else(|| anyhow!("missing old_text"))?,
new_text: new_text.ok_or_else(|| anyhow!("missing new_text"))?,
description: description.ok_or_else(|| anyhow!("missing description"))?,
description,
},
"insert_after" => AssistantEditKind::InsertAfter {
old_text: old_text.ok_or_else(|| anyhow!("missing old_text"))?,
new_text: new_text.ok_or_else(|| anyhow!("missing new_text"))?,
description: description.ok_or_else(|| anyhow!("missing description"))?,
description,
},
"delete" => AssistantEditKind::Delete {
old_text: old_text.ok_or_else(|| anyhow!("missing old_text"))?,
},
"create" => AssistantEditKind::Create {
description: description.ok_or_else(|| anyhow!("missing description"))?,
description,
new_text: new_text.ok_or_else(|| anyhow!("missing new_text"))?,
},
_ => Err(anyhow!("unknown operation {operation:?}"))?,
@@ -264,7 +282,7 @@ impl AssistantEditKind {
ResolvedEdit {
range,
new_text,
description: Some(description),
description,
}
}
Self::Create {
@@ -272,7 +290,7 @@ impl AssistantEditKind {
description,
} => ResolvedEdit {
range: text::Anchor::MIN..text::Anchor::MAX,
description: Some(description),
description,
new_text,
},
Self::InsertBefore {
@@ -285,7 +303,7 @@ impl AssistantEditKind {
ResolvedEdit {
range: range.start..range.start,
new_text,
description: Some(description),
description,
}
}
Self::InsertAfter {
@@ -298,7 +316,7 @@ impl AssistantEditKind {
ResolvedEdit {
range: range.end..range.end,
new_text,
description: Some(description),
description,
}
}
Self::Delete { old_text } => {
@@ -314,44 +332,29 @@ impl AssistantEditKind {
fn resolve_location(buffer: &text::BufferSnapshot, search_query: &str) -> Range<text::Anchor> {
const INSERTION_COST: u32 = 3;
const DELETION_COST: u32 = 10;
const WHITESPACE_INSERTION_COST: u32 = 1;
const DELETION_COST: u32 = 3;
const WHITESPACE_DELETION_COST: u32 = 1;
const EQUALITY_BONUS: u32 = 5;
struct Matrix {
cols: usize,
data: Vec<SearchState>,
}
impl Matrix {
fn new(rows: usize, cols: usize) -> Self {
Matrix {
cols,
data: vec![SearchState::new(0, SearchDirection::Diagonal); rows * cols],
}
}
fn get(&self, row: usize, col: usize) -> SearchState {
self.data[row * self.cols + col]
}
fn set(&mut self, row: usize, col: usize, cost: SearchState) {
self.data[row * self.cols + col] = cost;
}
}
let buffer_len = buffer.len();
let query_len = search_query.len();
let mut matrix = Matrix::new(query_len + 1, buffer_len + 1);
let mut matrix = SearchMatrix::new(query_len + 1, buffer_len + 1);
let mut leading_deletion_cost = 0_u32;
for (row, query_byte) in search_query.bytes().enumerate() {
let deletion_cost = if query_byte.is_ascii_whitespace() {
WHITESPACE_DELETION_COST
} else {
DELETION_COST
};
leading_deletion_cost = leading_deletion_cost.saturating_add(deletion_cost);
matrix.set(
row + 1,
0,
SearchState::new(leading_deletion_cost, SearchDirection::Diagonal),
);
for (col, buffer_byte) in buffer.bytes_in_range(0..buffer.len()).flatten().enumerate() {
let deletion_cost = if query_byte.is_ascii_whitespace() {
WHITESPACE_DELETION_COST
} else {
DELETION_COST
};
let insertion_cost = if buffer_byte.is_ascii_whitespace() {
WHITESPACE_INSERTION_COST
} else {
@@ -359,38 +362,35 @@ impl AssistantEditKind {
};
let up = SearchState::new(
matrix.get(row, col + 1).score.saturating_sub(deletion_cost),
matrix.get(row, col + 1).cost.saturating_add(deletion_cost),
SearchDirection::Up,
);
let left = SearchState::new(
matrix
.get(row + 1, col)
.score
.saturating_sub(insertion_cost),
matrix.get(row + 1, col).cost.saturating_add(insertion_cost),
SearchDirection::Left,
);
let diagonal = SearchState::new(
if query_byte == *buffer_byte {
matrix.get(row, col).score.saturating_add(EQUALITY_BONUS)
matrix.get(row, col).cost
} else {
matrix
.get(row, col)
.score
.saturating_sub(deletion_cost + insertion_cost)
.cost
.saturating_add(deletion_cost + insertion_cost)
},
SearchDirection::Diagonal,
);
matrix.set(row + 1, col + 1, up.max(left).max(diagonal));
matrix.set(row + 1, col + 1, up.min(left).min(diagonal));
}
}
// Traceback to find the best match
let mut best_buffer_end = buffer_len;
let mut best_score = 0;
let mut best_cost = u32::MAX;
for col in 1..=buffer_len {
let score = matrix.get(query_len, col).score;
if score > best_score {
best_score = score;
let cost = matrix.get(query_len, col).cost;
if cost < best_cost {
best_cost = cost;
best_buffer_end = col;
}
}
@@ -560,89 +560,84 @@ mod tests {
language_settings::AllLanguageSettings, Language, LanguageConfig, LanguageMatcher,
};
use settings::SettingsStore;
use text::{OffsetRangeExt, Point};
use ui::BorrowAppContext;
use unindent::Unindent as _;
use util::test::{generate_marked_text, marked_text_ranges};
#[gpui::test]
fn test_resolve_location(cx: &mut AppContext) {
{
let buffer = cx.new_model(|cx| {
Buffer::local(
concat!(
" Lorem\n",
" ipsum\n",
" dolor sit amet\n",
" consecteur",
),
cx,
)
});
let snapshot = buffer.read(cx).snapshot();
assert_eq!(
AssistantEditKind::resolve_location(&snapshot, "ipsum\ndolor").to_point(&snapshot),
Point::new(1, 0)..Point::new(2, 18)
);
}
assert_location_resolution(
concat!(
" Lorem\n",
"« ipsum\n",
" dolor sit amet»\n",
" consecteur",
),
"ipsum\ndolor",
cx,
);
{
let buffer = cx.new_model(|cx| {
Buffer::local(
concat!(
"fn foo1(a: usize) -> usize {\n",
" 40\n",
"}\n",
"\n",
"fn foo2(b: usize) -> usize {\n",
" 42\n",
"}\n",
),
cx,
)
});
let snapshot = buffer.read(cx).snapshot();
assert_eq!(
AssistantEditKind::resolve_location(&snapshot, "fn foo1(b: usize) {\n40\n}")
.to_point(&snapshot),
Point::new(0, 0)..Point::new(2, 1)
);
}
assert_location_resolution(
&"
«fn foo1(a: usize) -> usize {
40
{
let buffer = cx.new_model(|cx| {
Buffer::local(
concat!(
"fn main() {\n",
" Foo\n",
" .bar()\n",
" .baz()\n",
" .qux()\n",
"}\n",
"\n",
"fn foo2(b: usize) -> usize {\n",
" 42\n",
"}\n",
),
cx,
)
});
let snapshot = buffer.read(cx).snapshot();
assert_eq!(
AssistantEditKind::resolve_location(&snapshot, "Foo.bar.baz.qux()")
.to_point(&snapshot),
Point::new(1, 0)..Point::new(4, 14)
);
}
fn foo2(b: usize) -> usize {
42
}
"
.unindent(),
"fn foo1(b: usize) {\n40\n}",
cx,
);
assert_location_resolution(
&"
fn main() {
« Foo
.bar()
.baz()
.qux()»
}
fn foo2(b: usize) -> usize {
42
}
"
.unindent(),
"Foo.bar.baz.qux()",
cx,
);
assert_location_resolution(
&"
class Something {
one() { return 1; }
« two() { return 2222; }
three() { return 333; }
four() { return 4444; }
five() { return 5555; }
six() { return 6666; }
» seven() { return 7; }
eight() { return 8; }
}
"
.unindent(),
&"
two() { return 2222; }
four() { return 4444; }
five() { return 5555; }
six() { return 6666; }
"
.unindent(),
cx,
);
}
#[gpui::test]
fn test_resolve_edits(cx: &mut AppContext) {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
cx.update_global::<SettingsStore, _>(|settings, cx| {
settings.update_user_settings::<AllLanguageSettings>(cx, |_| {});
});
init_test(cx);
assert_edits(
"
@@ -675,7 +670,7 @@ mod tests {
last_name: String,
"
.unindent(),
description: "".into(),
description: None,
},
AssistantEditKind::Update {
old_text: "
@@ -690,7 +685,7 @@ mod tests {
}
"
.unindent(),
description: "".into(),
description: None,
},
],
"
@@ -734,7 +729,7 @@ mod tests {
qux();
}"
.unindent(),
description: "implement bar".into(),
description: Some("implement bar".into()),
},
AssistantEditKind::Update {
old_text: "
@@ -747,7 +742,7 @@ mod tests {
bar();
}"
.unindent(),
description: "call bar in foo".into(),
description: Some("call bar in foo".into()),
},
AssistantEditKind::InsertAfter {
old_text: "
@@ -762,7 +757,7 @@ mod tests {
}
"
.unindent(),
description: "implement qux".into(),
description: Some("implement qux".into()),
},
],
"
@@ -814,7 +809,7 @@ mod tests {
}
"
.unindent(),
description: "pick better number".into(),
description: None,
},
AssistantEditKind::Update {
old_text: "
@@ -829,7 +824,7 @@ mod tests {
}
"
.unindent(),
description: "pick better number".into(),
description: None,
},
AssistantEditKind::Update {
old_text: "
@@ -844,7 +839,7 @@ mod tests {
}
"
.unindent(),
description: "pick better number".into(),
description: None,
},
],
"
@@ -865,6 +860,69 @@ mod tests {
.unindent(),
cx,
);
assert_edits(
"
impl Person {
fn set_name(&mut self, name: String) {
self.name = name;
}
fn name(&self) -> String {
return self.name;
}
}
"
.unindent(),
vec![
AssistantEditKind::Update {
old_text: "self.name = name;".unindent(),
new_text: "self._name = name;".unindent(),
description: None,
},
AssistantEditKind::Update {
old_text: "return self.name;\n".unindent(),
new_text: "return self._name;\n".unindent(),
description: None,
},
],
"
impl Person {
fn set_name(&mut self, name: String) {
self._name = name;
}
fn name(&self) -> String {
return self._name;
}
}
"
.unindent(),
cx,
);
}
fn init_test(cx: &mut AppContext) {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
cx.update_global::<SettingsStore, _>(|settings, cx| {
settings.update_user_settings::<AllLanguageSettings>(cx, |_| {});
});
}
#[track_caller]
fn assert_location_resolution(
text_with_expected_range: &str,
query: &str,
cx: &mut AppContext,
) {
let (text, _) = marked_text_ranges(text_with_expected_range, false);
let buffer = cx.new_model(|cx| Buffer::local(text.clone(), cx));
let snapshot = buffer.read(cx).snapshot();
let range = AssistantEditKind::resolve_location(&snapshot, query).to_offset(&snapshot);
let text_with_actual_range = generate_marked_text(&text, &[range], false);
pretty_assertions::assert_eq!(text_with_actual_range, text_with_expected_range);
}
#[track_caller]

View File

@@ -204,7 +204,7 @@ impl PromptBuilder {
Ok(())
}
pub fn generate_content_prompt(
pub fn generate_inline_transformation_prompt(
&self,
user_prompt: String,
language_name: Option<&LanguageName>,

View File

@@ -1,5 +1,5 @@
use crate::slash_command::file_command::{FileCommandMetadata, FileSlashCommand};
use anyhow::Result;
use anyhow::{anyhow, Result};
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
@@ -38,7 +38,7 @@ impl SlashCommand for DeltaSlashCommand {
_workspace: Option<WeakView<Workspace>>,
_cx: &mut WindowContext,
) -> Task<Result<Vec<ArgumentCompletion>>> {
unimplemented!()
Task::ready(Err(anyhow!("this command does not require argument")))
}
fn run(

View File

@@ -4,6 +4,7 @@ use assistant_slash_command::{
SlashCommandOutput, SlashCommandOutputSection, SlashCommandResult,
};
use futures::channel::mpsc;
use futures::Stream;
use fuzzy::PathMatch;
use gpui::{AppContext, Model, Task, View, WeakView};
use language::{BufferSnapshot, CodeLabel, HighlightId, LineEnding, LspAdapterDelegate};
@@ -196,7 +197,12 @@ impl SlashCommand for FileSlashCommand {
return Task::ready(Err(anyhow!("missing path")));
};
collect_files(workspace.read(cx).project().clone(), arguments, cx)
Task::ready(Ok(collect_files(
workspace.read(cx).project().clone(),
arguments,
cx,
)
.boxed()))
}
}
@@ -204,7 +210,7 @@ fn collect_files(
project: Model<Project>,
glob_inputs: &[String],
cx: &mut AppContext,
) -> Task<SlashCommandResult> {
) -> impl Stream<Item = Result<SlashCommandEvent>> {
let Ok(matchers) = glob_inputs
.into_iter()
.map(|glob_input| {
@@ -213,7 +219,7 @@ fn collect_files(
})
.collect::<anyhow::Result<Vec<custom_path_matcher::PathMatcher>>>()
else {
return Task::ready(Err(anyhow!("invalid path")));
return futures::stream::once(async { Err(anyhow!("invalid path")) }).boxed();
};
let project_handle = project.downgrade();
@@ -357,8 +363,12 @@ fn collect_files(
events_tx.unbounded_send(Ok(SlashCommandEvent::EndSection { metadata: None }))?;
}
}
Ok(events_rx.boxed())
anyhow::Ok(())
})
.detach_and_log_err(cx);
events_rx.boxed()
}
pub fn codeblock_fence_for_path(
@@ -550,6 +560,7 @@ mod test {
use project::Project;
use serde_json::json;
use settings::SettingsStore;
use smol::stream::StreamExt;
use crate::slash_command::file_command::collect_files;
@@ -590,11 +601,9 @@ mod test {
let project = Project::test(fs, ["/root".as_ref()], cx).await;
let result_1 = cx
.update(|cx| collect_files(project.clone(), &["root/dir".to_string()], cx))
.await
.unwrap();
let result_1 = SlashCommandOutput::from_event_stream(result_1)
let result_1 =
cx.update(|cx| collect_files(project.clone(), &["root/dir".to_string()], cx));
let result_1 = SlashCommandOutput::from_event_stream(result_1.boxed())
.await
.unwrap();
@@ -602,20 +611,16 @@ mod test {
// 4 files + 2 directories
assert_eq!(result_1.sections.len(), 6);
let result_2 = cx
.update(|cx| collect_files(project.clone(), &["root/dir/".to_string()], cx))
.await
.unwrap();
let result_2 = SlashCommandOutput::from_event_stream(result_2)
let result_2 =
cx.update(|cx| collect_files(project.clone(), &["root/dir/".to_string()], cx));
let result_2 = SlashCommandOutput::from_event_stream(result_2.boxed())
.await
.unwrap();
assert_eq!(result_1, result_2);
let result = cx
.update(|cx| collect_files(project.clone(), &["root/dir*".to_string()], cx))
.await
.unwrap();
let result =
cx.update(|cx| collect_files(project.clone(), &["root/dir*".to_string()], cx).boxed());
let result = SlashCommandOutput::from_event_stream(result).await.unwrap();
assert!(result.text.starts_with("root/dir"));
@@ -659,11 +664,11 @@ mod test {
let project = Project::test(fs, ["/zed".as_ref()], cx).await;
let result = cx
.update(|cx| collect_files(project.clone(), &["zed/assets/themes".to_string()], cx))
let result =
cx.update(|cx| collect_files(project.clone(), &["zed/assets/themes".to_string()], cx));
let result = SlashCommandOutput::from_event_stream(result.boxed())
.await
.unwrap();
let result = SlashCommandOutput::from_event_stream(result).await.unwrap();
// Sanity check
assert!(result.text.starts_with("zed/assets/themes\n"));
@@ -721,11 +726,11 @@ mod test {
let project = Project::test(fs, ["/zed".as_ref()], cx).await;
let result = cx
.update(|cx| collect_files(project.clone(), &["zed/assets/themes".to_string()], cx))
let result =
cx.update(|cx| collect_files(project.clone(), &["zed/assets/themes".to_string()], cx));
let result = SlashCommandOutput::from_event_stream(result.boxed())
.await
.unwrap();
let result = SlashCommandOutput::from_event_stream(result).await.unwrap();
assert!(result.text.starts_with("zed/assets/themes\n"));
assert_eq!(result.sections[0].label, "zed/assets/themes/LICENSE");

View File

@@ -178,7 +178,7 @@ impl PickerDelegate for SlashCommandDelegate {
SlashCommandEntry::Info(info) => Some(
ListItem::new(ix)
.inset(true)
.spacing(ListItemSpacing::Sparse)
.spacing(ListItemSpacing::Dense)
.selected(selected)
.child(
h_flex()
@@ -224,7 +224,7 @@ impl PickerDelegate for SlashCommandDelegate {
SlashCommandEntry::Advert { renderer, .. } => Some(
ListItem::new(ix)
.inset(true)
.spacing(ListItemSpacing::Sparse)
.spacing(ListItemSpacing::Dense)
.selected(selected)
.child(renderer(cx)),
),

View File

@@ -11,6 +11,7 @@ use gpui::{
};
use markdown_preview::markdown_preview_view::{MarkdownPreviewMode, MarkdownPreviewView};
use paths::remote_servers_dir;
use schemars::JsonSchema;
use serde::Deserialize;
use serde_derive::Serialize;
@@ -431,10 +432,11 @@ impl AutoUpdater {
cx.notify();
}
pub async fn get_latest_remote_server_release(
pub async fn download_remote_server_release(
os: &str,
arch: &str,
mut release_channel: ReleaseChannel,
release_channel: ReleaseChannel,
version: Option<SemanticVersion>,
cx: &mut AsyncAppContext,
) -> Result<PathBuf> {
let this = cx.update(|cx| {
@@ -444,15 +446,12 @@ impl AutoUpdater {
.ok_or_else(|| anyhow!("auto-update not initialized"))
})??;
if release_channel == ReleaseChannel::Dev {
release_channel = ReleaseChannel::Nightly;
}
let release = Self::get_latest_release(
let release = Self::get_release(
&this,
"zed-remote-server",
os,
arch,
version,
Some(release_channel),
cx,
)
@@ -467,17 +466,21 @@ impl AutoUpdater {
let client = this.read_with(cx, |this, _| this.http_client.clone())?;
if smol::fs::metadata(&version_path).await.is_err() {
log::info!("downloading zed-remote-server {os} {arch}");
log::info!(
"downloading zed-remote-server {os} {arch} version {}",
release.version
);
download_remote_server_binary(&version_path, release, client, cx).await?;
}
Ok(version_path)
}
pub async fn get_latest_remote_server_release_url(
pub async fn get_remote_server_release_url(
os: &str,
arch: &str,
mut release_channel: ReleaseChannel,
release_channel: ReleaseChannel,
version: Option<SemanticVersion>,
cx: &mut AsyncAppContext,
) -> Result<(String, String)> {
let this = cx.update(|cx| {
@@ -487,15 +490,12 @@ impl AutoUpdater {
.ok_or_else(|| anyhow!("auto-update not initialized"))
})??;
if release_channel == ReleaseChannel::Dev {
release_channel = ReleaseChannel::Nightly;
}
let release = Self::get_latest_release(
let release = Self::get_release(
&this,
"zed-remote-server",
os,
arch,
version,
Some(release_channel),
cx,
)
@@ -507,6 +507,56 @@ impl AutoUpdater {
Ok((release.url, body))
}
async fn get_release(
this: &Model<Self>,
asset: &str,
os: &str,
arch: &str,
version: Option<SemanticVersion>,
release_channel: Option<ReleaseChannel>,
cx: &mut AsyncAppContext,
) -> Result<JsonRelease> {
let client = this.read_with(cx, |this, _| this.http_client.clone())?;
if let Some(version) = version {
let channel = release_channel.map(|c| c.dev_name()).unwrap_or("stable");
let url = format!("/api/releases/{channel}/{version}/{asset}-{os}-{arch}.gz?update=1",);
Ok(JsonRelease {
version: version.to_string(),
url: client.build_url(&url),
})
} else {
let mut url_string = client.build_url(&format!(
"/api/releases/latest?asset={}&os={}&arch={}",
asset, os, arch
));
if let Some(param) = release_channel.and_then(|c| c.release_query_param()) {
url_string += "&";
url_string += param;
}
let mut response = client.get(&url_string, Default::default(), true).await?;
let mut body = Vec::new();
response.body_mut().read_to_end(&mut body).await?;
if !response.status().is_success() {
return Err(anyhow!(
"failed to fetch release: {:?}",
String::from_utf8_lossy(&body),
));
}
serde_json::from_slice(body.as_slice()).with_context(|| {
format!(
"error deserializing release {:?}",
String::from_utf8_lossy(&body),
)
})
}
}
async fn get_latest_release(
this: &Model<Self>,
asset: &str,
@@ -515,38 +565,7 @@ impl AutoUpdater {
release_channel: Option<ReleaseChannel>,
cx: &mut AsyncAppContext,
) -> Result<JsonRelease> {
let client = this.read_with(cx, |this, _| this.http_client.clone())?;
let mut url_string = client.build_url(&format!(
"/api/releases/latest?asset={}&os={}&arch={}",
asset, os, arch
));
if let Some(param) = release_channel.and_then(|c| c.release_query_param()) {
url_string += "&";
url_string += param;
}
let mut response = client.get(&url_string, Default::default(), true).await?;
let mut body = Vec::new();
response
.body_mut()
.read_to_end(&mut body)
.await
.context("error reading release")?;
if !response.status().is_success() {
Err(anyhow!(
"failed to fetch release: {:?}",
String::from_utf8_lossy(&body),
))?;
}
serde_json::from_slice(body.as_slice()).with_context(|| {
format!(
"error deserializing release {:?}",
String::from_utf8_lossy(&body),
)
})
Self::get_release(this, asset, os, arch, None, release_channel, cx).await
}
async fn update(this: Model<Self>, mut cx: AsyncAppContext) -> Result<()> {
@@ -661,12 +680,15 @@ async fn download_remote_server_binary(
client: Arc<HttpClientWithUrl>,
cx: &AsyncAppContext,
) -> Result<()> {
let mut target_file = File::create(&target_path).await?;
let temp = tempfile::Builder::new().tempfile_in(remote_servers_dir())?;
let mut temp_file = File::create(&temp).await?;
let update_request_body = build_remote_server_update_request_body(cx)?;
let request_body = AsyncBody::from(serde_json::to_string(&update_request_body)?);
let mut response = client.get(&release.url, request_body, true).await?;
smol::io::copy(response.body_mut(), &mut target_file).await?;
smol::io::copy(response.body_mut(), &mut temp_file).await?;
smol::fs::rename(&temp, &target_path).await?;
Ok(())
}

View File

@@ -1194,26 +1194,15 @@ impl Room {
project: Model<Project>,
cx: &mut ModelContext<Self>,
) -> Task<Result<u64>> {
let request = if let Some(dev_server_project_id) = project.read(cx).dev_server_project_id()
{
self.client.request(proto::ShareProject {
room_id: self.id(),
worktrees: vec![],
dev_server_project_id: Some(dev_server_project_id.0),
is_ssh_project: false,
})
} else {
if let Some(project_id) = project.read(cx).remote_id() {
return Task::ready(Ok(project_id));
}
if let Some(project_id) = project.read(cx).remote_id() {
return Task::ready(Ok(project_id));
}
self.client.request(proto::ShareProject {
room_id: self.id(),
worktrees: project.read(cx).worktree_metadata_protos(cx),
dev_server_project_id: None,
is_ssh_project: project.read(cx).is_via_ssh(),
})
};
let request = self.client.request(proto::ShareProject {
room_id: self.id(),
worktrees: project.read(cx).worktree_metadata_protos(cx),
is_ssh_project: project.read(cx).is_via_ssh(),
});
cx.spawn(|this, mut cx| async move {
let response = request.await?;

View File

@@ -15,7 +15,6 @@ pub enum CliRequest {
urls: Vec<String>,
wait: bool,
open_new_workspace: Option<bool>,
dev_server_token: Option<String>,
env: Option<HashMap<String, String>>,
},
}

View File

@@ -151,6 +151,12 @@ fn main() -> Result<()> {
}
}
if let Some(_) = args.dev_server_token {
return Err(anyhow::anyhow!(
"Dev servers were removed in v0.157.x please upgrade to SSH remoting: https://zed.dev/docs/remote-development"
))?;
}
let sender: JoinHandle<anyhow::Result<()>> = thread::spawn({
let exit_status = exit_status.clone();
move || {
@@ -162,7 +168,6 @@ fn main() -> Result<()> {
urls,
wait: args.wait,
open_new_workspace,
dev_server_token: args.dev_server_token,
env,
})?;

View File

@@ -30,7 +30,6 @@ use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsSources};
use socks::connect_socks_proxy_stream;
use std::fmt;
use std::pin::Pin;
use std::{
any::TypeId,
@@ -54,15 +53,6 @@ pub use rpc::*;
pub use telemetry_events::Event;
pub use user::*;
#[derive(Debug, Clone, Eq, PartialEq)]
pub struct DevServerToken(pub String);
impl fmt::Display for DevServerToken {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f, "{}", self.0)
}
}
static ZED_SERVER_URL: LazyLock<Option<String>> =
LazyLock::new(|| std::env::var("ZED_SERVER_URL").ok());
static ZED_RPC_URL: LazyLock<Option<String>> = LazyLock::new(|| std::env::var("ZED_RPC_URL").ok());
@@ -304,20 +294,14 @@ struct ClientState {
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub enum Credentials {
DevServer { token: DevServerToken },
User { user_id: u64, access_token: String },
pub struct Credentials {
pub user_id: u64,
pub access_token: String,
}
impl Credentials {
pub fn authorization_header(&self) -> String {
match self {
Credentials::DevServer { token } => format!("dev-server-token {}", token),
Credentials::User {
user_id,
access_token,
} => format!("{} {}", user_id, access_token),
}
format!("{} {}", self.user_id, self.access_token)
}
}
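After this hunk Credentials is a plain struct rather than a two-variant enum, and the authorization header only ever carries the user form. For reference, this is how the type reads once the old and new lines above are untangled; the derives and field names are taken from the diff, nothing else is added:

#[derive(Clone, Debug, Eq, PartialEq)]
pub struct Credentials {
    pub user_id: u64,
    pub access_token: String,
}

impl Credentials {
    // "<user_id> <access_token>"; the "dev-server-token ..." form no longer exists.
    pub fn authorization_header(&self) -> String {
        format!("{} {}", self.user_id, self.access_token)
    }
}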
@@ -600,11 +584,11 @@ impl Client {
}
pub fn user_id(&self) -> Option<u64> {
if let Some(Credentials::User { user_id, .. }) = self.state.read().credentials.as_ref() {
Some(*user_id)
} else {
None
}
self.state
.read()
.credentials
.as_ref()
.map(|credentials| credentials.user_id)
}
pub fn peer_id(&self) -> Option<PeerId> {
@@ -793,11 +777,6 @@ impl Client {
.is_some()
}
pub fn set_dev_server_token(&self, token: DevServerToken) -> &Self {
self.state.write().credentials = Some(Credentials::DevServer { token });
self
}
#[async_recursion(?Send)]
pub async fn authenticate_and_connect(
self: &Arc<Self>,
@@ -848,9 +827,7 @@ impl Client {
}
}
let credentials = credentials.unwrap();
if let Credentials::User { user_id, .. } = &credentials {
self.set_id(*user_id);
}
self.set_id(credentials.user_id);
if was_disconnected {
self.set_status(Status::Connecting, cx);
@@ -866,9 +843,8 @@ impl Client {
Ok(conn) => {
self.state.write().credentials = Some(credentials.clone());
if !read_from_provider && IMPERSONATE_LOGIN.is_none() {
if let Credentials::User{user_id, access_token} = credentials {
self.credentials_provider.write_credentials(user_id, access_token, cx).await.log_err();
}
self.credentials_provider.write_credentials(credentials.user_id, credentials.access_token, cx).await.log_err();
}
futures::select_biased! {
@@ -1301,7 +1277,7 @@ impl Client {
.decrypt_string(&access_token)
.context("failed to decrypt access token")?;
Ok(Credentials::User {
Ok(Credentials {
user_id: user_id.parse()?,
access_token,
})
@@ -1422,7 +1398,7 @@ impl Client {
// Use the admin API token to authenticate as the impersonated user.
api_token.insert_str(0, "ADMIN_TOKEN:");
Ok(Credentials::User {
Ok(Credentials {
user_id: response.user.id,
access_token: api_token,
})
@@ -1667,7 +1643,7 @@ impl CredentialsProvider for DevelopmentCredentialsProvider {
let credentials: DevelopmentCredentials = serde_json::from_slice(&json).log_err()?;
Some(Credentials::User {
Some(Credentials {
user_id: credentials.user_id,
access_token: credentials.access_token,
})
@@ -1721,7 +1697,7 @@ impl CredentialsProvider for KeychainCredentialsProvider {
.await
.log_err()??;
Some(Credentials::User {
Some(Credentials {
user_id: user_id.parse().ok()?,
access_token: String::from_utf8(access_token).ok()?,
})
@@ -1855,7 +1831,7 @@ mod tests {
// Time out when client tries to connect.
client.override_authenticate(move |cx| {
cx.background_executor().spawn(async move {
Ok(Credentials::User {
Ok(Credentials {
user_id,
access_token: "token".into(),
})


@@ -49,7 +49,7 @@ impl FakeServer {
let mut state = state.lock();
state.auth_count += 1;
let access_token = state.access_token.to_string();
Ok(Credentials::User {
Ok(Credentials {
user_id: client_user_id,
access_token,
})
@@ -73,7 +73,7 @@ impl FakeServer {
}
if credentials
!= (Credentials::User {
!= (Credentials {
user_id: client_user_id,
access_token: state.lock().access_token.to_string(),
})


@@ -28,9 +28,6 @@ impl std::fmt::Display for ChannelId {
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone, Copy)]
pub struct ProjectId(pub u64);
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone, Copy)]
pub struct DevServerId(pub u64);
#[derive(
Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone, Copy, serde::Serialize, serde::Deserialize,
)]


@@ -86,7 +86,6 @@ client = { workspace = true, features = ["test-support"] }
collab_ui = { workspace = true, features = ["test-support"] }
collections = { workspace = true, features = ["test-support"] }
ctor.workspace = true
dev_server_projects.workspace = true
editor = { workspace = true, features = ["test-support"] }
env_logger.workspace = true
file_finder.workspace = true
@@ -94,7 +93,6 @@ fs = { workspace = true, features = ["test-support"] }
git = { workspace = true, features = ["test-support"] }
git_hosting_providers.workspace = true
gpui = { workspace = true, features = ["test-support"] }
headless.workspace = true
hyper.workspace = true
indoc.workspace = true
language = { workspace = true, features = ["test-support"] }


@@ -252,7 +252,10 @@ async fn create_billing_subscription(
let default_model = llm_db.model(rpc::LanguageModelProvider::Anthropic, "claude-3-5-sonnet")?;
let stripe_model = stripe_billing.register_model(default_model).await?;
let success_url = format!("{}/account", app.config.zed_dot_dev_url());
let success_url = format!(
"{}/account?checkout_complete=1",
app.config.zed_dot_dev_url()
);
let checkout_session_url = stripe_billing
.checkout(customer_id, &user.github_login, &stripe_model, &success_url)
.await?;


@@ -1,5 +1,5 @@
use crate::{
db::{self, dev_server, AccessTokenId, Database, DevServerId, UserId},
db::{self, AccessTokenId, Database, UserId},
rpc::Principal,
AppState, Error, Result,
};
@@ -44,19 +44,10 @@ pub async fn validate_header<B>(mut req: Request<B>, next: Next<B>) -> impl Into
let first = auth_header.next().unwrap_or("");
if first == "dev-server-token" {
let dev_server_token = auth_header.next().ok_or_else(|| {
Error::http(
StatusCode::BAD_REQUEST,
"missing dev-server-token token in authorization header".to_string(),
)
})?;
let dev_server = verify_dev_server_token(dev_server_token, &state.db)
.await
.map_err(|e| Error::http(StatusCode::UNAUTHORIZED, format!("{}", e)))?;
req.extensions_mut()
.insert(Principal::DevServer(dev_server));
return Ok::<_, Error>(next.run(req).await);
Err(Error::http(
StatusCode::UNAUTHORIZED,
"Dev servers were removed in Zed 0.157 please upgrade to SSH remoting".to_string(),
))?;
}
let user_id = UserId(first.parse().map_err(|_| {
@@ -240,41 +231,6 @@ pub async fn verify_access_token(
})
}
pub fn generate_dev_server_token(id: usize, access_token: String) -> String {
format!("{}.{}", id, access_token)
}
pub async fn verify_dev_server_token(
dev_server_token: &str,
db: &Arc<Database>,
) -> anyhow::Result<dev_server::Model> {
let (id, token) = split_dev_server_token(dev_server_token)?;
let token_hash = hash_access_token(token);
let server = db.get_dev_server(id).await?;
if server
.hashed_token
.as_bytes()
.ct_eq(token_hash.as_ref())
.into()
{
Ok(server)
} else {
Err(anyhow!("wrong token for dev server"))
}
}
// a dev_server_token has the format <id>.<base64>. This is to make them
// relatively easy to copy/paste around.
pub fn split_dev_server_token(dev_server_token: &str) -> anyhow::Result<(DevServerId, &str)> {
let mut parts = dev_server_token.splitn(2, '.');
let id = DevServerId(parts.next().unwrap_or_default().parse()?);
let token = parts
.next()
.ok_or_else(|| anyhow!("invalid dev server token format"))?;
Ok((id, token))
}
#[cfg(test)]
mod test {
use rand::thread_rng;


@@ -726,7 +726,6 @@ pub struct Project {
pub collaborators: Vec<ProjectCollaborator>,
pub worktrees: BTreeMap<u64, Worktree>,
pub language_servers: Vec<proto::LanguageServer>,
pub dev_server_project_id: Option<DevServerProjectId>,
}
pub struct ProjectCollaborator {


@@ -79,7 +79,6 @@ id_type!(ChannelChatParticipantId);
id_type!(ChannelId);
id_type!(ChannelMemberId);
id_type!(ContactId);
id_type!(DevServerId);
id_type!(ExtensionId);
id_type!(FlagId);
id_type!(FollowerId);
@@ -89,7 +88,6 @@ id_type!(NotificationId);
id_type!(NotificationKindId);
id_type!(ProjectCollaboratorId);
id_type!(ProjectId);
id_type!(DevServerProjectId);
id_type!(ReplicaId);
id_type!(RoomId);
id_type!(RoomParticipantId);
@@ -277,12 +275,6 @@ impl From<ChannelVisibility> for i32 {
}
}
#[derive(Copy, Clone, Debug, Serialize, PartialEq)]
pub enum PrincipalId {
UserId(UserId),
DevServerId(DevServerId),
}
/// Indicate whether a [Buffer] has permissions to edit.
#[derive(PartialEq, Clone, Copy, Debug)]
pub enum Capability {


@@ -8,8 +8,6 @@ pub mod buffers;
pub mod channels;
pub mod contacts;
pub mod contributors;
pub mod dev_server_projects;
pub mod dev_servers;
pub mod embeddings;
pub mod extensions;
pub mod hosted_projects;


@@ -1,365 +1 @@
use anyhow::anyhow;
use rpc::{
proto::{self},
ConnectionId,
};
use sea_orm::{
ActiveModelTrait, ActiveValue, ColumnTrait, Condition, DatabaseTransaction, EntityTrait,
IntoActiveModel, ModelTrait, QueryFilter,
};
use crate::db::ProjectId;
use super::{
dev_server, dev_server_project, project, project_collaborator, worktree, Database, DevServerId,
DevServerProjectId, RejoinedProject, ResharedProject, ServerId, UserId,
};
impl Database {
pub async fn get_dev_server_project(
&self,
dev_server_project_id: DevServerProjectId,
) -> crate::Result<dev_server_project::Model> {
self.transaction(|tx| async move {
Ok(
dev_server_project::Entity::find_by_id(dev_server_project_id)
.one(&*tx)
.await?
.ok_or_else(|| {
anyhow!("no dev server project with id {}", dev_server_project_id)
})?,
)
})
.await
}
pub async fn get_projects_for_dev_server(
&self,
dev_server_id: DevServerId,
) -> crate::Result<Vec<proto::DevServerProject>> {
self.transaction(|tx| async move {
self.get_projects_for_dev_server_internal(dev_server_id, &tx)
.await
})
.await
}
pub async fn get_projects_for_dev_server_internal(
&self,
dev_server_id: DevServerId,
tx: &DatabaseTransaction,
) -> crate::Result<Vec<proto::DevServerProject>> {
let servers = dev_server_project::Entity::find()
.filter(dev_server_project::Column::DevServerId.eq(dev_server_id))
.find_also_related(project::Entity)
.all(tx)
.await?;
Ok(servers
.into_iter()
.map(|(dev_server_project, project)| dev_server_project.to_proto(project))
.collect())
}
pub async fn dev_server_project_ids_for_user(
&self,
user_id: UserId,
tx: &DatabaseTransaction,
) -> crate::Result<Vec<DevServerProjectId>> {
let dev_servers = dev_server::Entity::find()
.filter(dev_server::Column::UserId.eq(user_id))
.find_with_related(dev_server_project::Entity)
.all(tx)
.await?;
Ok(dev_servers
.into_iter()
.flat_map(|(_, projects)| projects.into_iter().map(|p| p.id))
.collect())
}
pub async fn owner_for_dev_server_project(
&self,
dev_server_project_id: DevServerProjectId,
tx: &DatabaseTransaction,
) -> crate::Result<UserId> {
let dev_server = dev_server_project::Entity::find_by_id(dev_server_project_id)
.find_also_related(dev_server::Entity)
.one(tx)
.await?
.and_then(|(_, dev_server)| dev_server)
.ok_or_else(|| anyhow!("no dev server project"))?;
Ok(dev_server.user_id)
}
pub async fn get_stale_dev_server_projects(
&self,
connection: ConnectionId,
) -> crate::Result<Vec<ProjectId>> {
self.transaction(|tx| async move {
let projects = project::Entity::find()
.filter(
Condition::all()
.add(project::Column::HostConnectionId.eq(connection.id))
.add(project::Column::HostConnectionServerId.eq(connection.owner_id)),
)
.all(&*tx)
.await?;
Ok(projects.into_iter().map(|p| p.id).collect())
})
.await
}
pub async fn create_dev_server_project(
&self,
dev_server_id: DevServerId,
path: &str,
user_id: UserId,
) -> crate::Result<(dev_server_project::Model, proto::DevServerProjectsUpdate)> {
self.transaction(|tx| async move {
let dev_server = dev_server::Entity::find_by_id(dev_server_id)
.one(&*tx)
.await?
.ok_or_else(|| anyhow!("no dev server with id {}", dev_server_id))?;
if dev_server.user_id != user_id {
return Err(anyhow!("not your dev server"))?;
}
let project = dev_server_project::Entity::insert(dev_server_project::ActiveModel {
id: ActiveValue::NotSet,
dev_server_id: ActiveValue::Set(dev_server_id),
paths: ActiveValue::Set(dev_server_project::JSONPaths(vec![path.to_string()])),
})
.exec_with_returning(&*tx)
.await?;
let status = self
.dev_server_projects_update_internal(user_id, &tx)
.await?;
Ok((project, status))
})
.await
}
pub async fn update_dev_server_project(
&self,
id: DevServerProjectId,
paths: &[String],
user_id: UserId,
) -> crate::Result<(dev_server_project::Model, proto::DevServerProjectsUpdate)> {
self.transaction(move |tx| async move {
let paths = paths.to_owned();
let Some((project, Some(dev_server))) = dev_server_project::Entity::find_by_id(id)
.find_also_related(dev_server::Entity)
.one(&*tx)
.await?
else {
return Err(anyhow!("no such dev server project"))?;
};
if dev_server.user_id != user_id {
return Err(anyhow!("not your dev server"))?;
}
let mut project = project.into_active_model();
project.paths = ActiveValue::Set(dev_server_project::JSONPaths(paths));
let project = project.update(&*tx).await?;
let status = self
.dev_server_projects_update_internal(user_id, &tx)
.await?;
Ok((project, status))
})
.await
}
pub async fn delete_dev_server_project(
&self,
dev_server_project_id: DevServerProjectId,
dev_server_id: DevServerId,
user_id: UserId,
) -> crate::Result<(Vec<proto::DevServerProject>, proto::DevServerProjectsUpdate)> {
self.transaction(|tx| async move {
project::Entity::delete_many()
.filter(project::Column::DevServerProjectId.eq(dev_server_project_id))
.exec(&*tx)
.await?;
let result = dev_server_project::Entity::delete_by_id(dev_server_project_id)
.exec(&*tx)
.await?;
if result.rows_affected != 1 {
return Err(anyhow!(
"no dev server project with id {}",
dev_server_project_id
))?;
}
let status = self
.dev_server_projects_update_internal(user_id, &tx)
.await?;
let projects = self
.get_projects_for_dev_server_internal(dev_server_id, &tx)
.await?;
Ok((projects, status))
})
.await
}
pub async fn share_dev_server_project(
&self,
dev_server_project_id: DevServerProjectId,
dev_server_id: DevServerId,
connection: ConnectionId,
worktrees: &[proto::WorktreeMetadata],
) -> crate::Result<(
proto::DevServerProject,
UserId,
proto::DevServerProjectsUpdate,
)> {
self.transaction(|tx| async move {
let dev_server = dev_server::Entity::find_by_id(dev_server_id)
.one(&*tx)
.await?
.ok_or_else(|| anyhow!("no dev server with id {}", dev_server_id))?;
let dev_server_project = dev_server_project::Entity::find_by_id(dev_server_project_id)
.one(&*tx)
.await?
.ok_or_else(|| {
anyhow!("no dev server project with id {}", dev_server_project_id)
})?;
if dev_server_project.dev_server_id != dev_server_id {
return Err(anyhow!("dev server project shared from wrong server"))?;
}
let project = project::ActiveModel {
room_id: ActiveValue::Set(None),
host_user_id: ActiveValue::Set(None),
host_connection_id: ActiveValue::set(Some(connection.id as i32)),
host_connection_server_id: ActiveValue::set(Some(ServerId(
connection.owner_id as i32,
))),
id: ActiveValue::NotSet,
hosted_project_id: ActiveValue::Set(None),
dev_server_project_id: ActiveValue::Set(Some(dev_server_project_id)),
}
.insert(&*tx)
.await?;
if !worktrees.is_empty() {
worktree::Entity::insert_many(worktrees.iter().map(|worktree| {
worktree::ActiveModel {
id: ActiveValue::set(worktree.id as i64),
project_id: ActiveValue::set(project.id),
abs_path: ActiveValue::set(worktree.abs_path.clone()),
root_name: ActiveValue::set(worktree.root_name.clone()),
visible: ActiveValue::set(worktree.visible),
scan_id: ActiveValue::set(0),
completed_scan_id: ActiveValue::set(0),
}
}))
.exec(&*tx)
.await?;
}
let status = self
.dev_server_projects_update_internal(dev_server.user_id, &tx)
.await?;
Ok((
dev_server_project.to_proto(Some(project)),
dev_server.user_id,
status,
))
})
.await
}
pub async fn reshare_dev_server_projects(
&self,
reshared_projects: &Vec<proto::UpdateProject>,
dev_server_id: DevServerId,
connection: ConnectionId,
) -> crate::Result<Vec<ResharedProject>> {
self.transaction(|tx| async move {
let mut ret = Vec::new();
for reshared_project in reshared_projects {
let project_id = ProjectId::from_proto(reshared_project.project_id);
let (project, dev_server_project) = project::Entity::find_by_id(project_id)
.find_also_related(dev_server_project::Entity)
.one(&*tx)
.await?
.ok_or_else(|| anyhow!("project does not exist"))?;
if dev_server_project.map(|rp| rp.dev_server_id) != Some(dev_server_id) {
return Err(anyhow!("dev server project reshared from wrong server"))?;
}
let Ok(old_connection_id) = project.host_connection() else {
return Err(anyhow!("dev server project was not shared"))?;
};
project::Entity::update(project::ActiveModel {
id: ActiveValue::set(project_id),
host_connection_id: ActiveValue::set(Some(connection.id as i32)),
host_connection_server_id: ActiveValue::set(Some(ServerId(
connection.owner_id as i32,
))),
..Default::default()
})
.exec(&*tx)
.await?;
let collaborators = project
.find_related(project_collaborator::Entity)
.all(&*tx)
.await?;
self.update_project_worktrees(project_id, &reshared_project.worktrees, &tx)
.await?;
ret.push(super::ResharedProject {
id: project_id,
old_connection_id,
collaborators: collaborators
.iter()
.map(|collaborator| super::ProjectCollaborator {
connection_id: collaborator.connection(),
user_id: collaborator.user_id,
replica_id: collaborator.replica_id,
is_host: collaborator.is_host,
})
.collect(),
worktrees: reshared_project.worktrees.clone(),
});
}
Ok(ret)
})
.await
}
pub async fn rejoin_dev_server_projects(
&self,
rejoined_projects: &Vec<proto::RejoinProject>,
user_id: UserId,
connection_id: ConnectionId,
) -> crate::Result<Vec<RejoinedProject>> {
self.transaction(|tx| async move {
let mut ret = Vec::new();
for rejoined_project in rejoined_projects {
if let Some(project) = self
.rejoin_project_internal(&tx, rejoined_project, user_id, connection_id)
.await?
{
ret.push(project);
}
}
Ok(ret)
})
.await
}
}


@@ -1,222 +1 @@
use rpc::proto;
use sea_orm::{
ActiveValue, ColumnTrait, DatabaseTransaction, EntityTrait, IntoActiveModel, QueryFilter,
};
use super::{dev_server, dev_server_project, Database, DevServerId, UserId};
impl Database {
pub async fn get_dev_server(
&self,
dev_server_id: DevServerId,
) -> crate::Result<dev_server::Model> {
self.transaction(|tx| async move {
Ok(dev_server::Entity::find_by_id(dev_server_id)
.one(&*tx)
.await?
.ok_or_else(|| anyhow::anyhow!("no dev server with id {}", dev_server_id))?)
})
.await
}
pub async fn get_dev_server_for_user(
&self,
dev_server_id: DevServerId,
user_id: UserId,
) -> crate::Result<dev_server::Model> {
self.transaction(|tx| async move {
let server = dev_server::Entity::find_by_id(dev_server_id)
.one(&*tx)
.await?
.ok_or_else(|| anyhow::anyhow!("no dev server with id {}", dev_server_id))?;
if server.user_id != user_id {
return Err(anyhow::anyhow!(
"dev server {} is not owned by user {}",
dev_server_id,
user_id
))?;
}
Ok(server)
})
.await
}
pub async fn get_dev_servers(&self, user_id: UserId) -> crate::Result<Vec<dev_server::Model>> {
self.transaction(|tx| async move {
Ok(dev_server::Entity::find()
.filter(dev_server::Column::UserId.eq(user_id))
.all(&*tx)
.await?)
})
.await
}
pub async fn dev_server_projects_update(
&self,
user_id: UserId,
) -> crate::Result<proto::DevServerProjectsUpdate> {
self.transaction(|tx| async move {
self.dev_server_projects_update_internal(user_id, &tx).await
})
.await
}
pub async fn dev_server_projects_update_internal(
&self,
user_id: UserId,
tx: &DatabaseTransaction,
) -> crate::Result<proto::DevServerProjectsUpdate> {
let dev_servers = dev_server::Entity::find()
.filter(dev_server::Column::UserId.eq(user_id))
.all(tx)
.await?;
let dev_server_projects = dev_server_project::Entity::find()
.filter(
dev_server_project::Column::DevServerId
.is_in(dev_servers.iter().map(|d| d.id).collect::<Vec<_>>()),
)
.find_also_related(super::project::Entity)
.all(tx)
.await?;
Ok(proto::DevServerProjectsUpdate {
dev_servers: dev_servers
.into_iter()
.map(|d| d.to_proto(proto::DevServerStatus::Offline))
.collect(),
dev_server_projects: dev_server_projects
.into_iter()
.map(|(dev_server_project, project)| dev_server_project.to_proto(project))
.collect(),
})
}
pub async fn create_dev_server(
&self,
name: &str,
ssh_connection_string: Option<&str>,
hashed_access_token: &str,
user_id: UserId,
) -> crate::Result<(dev_server::Model, proto::DevServerProjectsUpdate)> {
self.transaction(|tx| async move {
if name.trim().is_empty() {
return Err(anyhow::anyhow!(proto::ErrorCode::Forbidden))?;
}
let dev_server = dev_server::Entity::insert(dev_server::ActiveModel {
id: ActiveValue::NotSet,
hashed_token: ActiveValue::Set(hashed_access_token.to_string()),
name: ActiveValue::Set(name.trim().to_string()),
user_id: ActiveValue::Set(user_id),
ssh_connection_string: ActiveValue::Set(
ssh_connection_string.map(ToOwned::to_owned),
),
})
.exec_with_returning(&*tx)
.await?;
let dev_server_projects = self
.dev_server_projects_update_internal(user_id, &tx)
.await?;
Ok((dev_server, dev_server_projects))
})
.await
}
pub async fn update_dev_server_token(
&self,
id: DevServerId,
hashed_token: &str,
user_id: UserId,
) -> crate::Result<proto::DevServerProjectsUpdate> {
self.transaction(|tx| async move {
let Some(dev_server) = dev_server::Entity::find_by_id(id).one(&*tx).await? else {
return Err(anyhow::anyhow!("no dev server with id {}", id))?;
};
if dev_server.user_id != user_id {
return Err(anyhow::anyhow!(proto::ErrorCode::Forbidden))?;
}
dev_server::Entity::update(dev_server::ActiveModel {
hashed_token: ActiveValue::Set(hashed_token.to_string()),
..dev_server.clone().into_active_model()
})
.exec(&*tx)
.await?;
let dev_server_projects = self
.dev_server_projects_update_internal(user_id, &tx)
.await?;
Ok(dev_server_projects)
})
.await
}
pub async fn rename_dev_server(
&self,
id: DevServerId,
name: &str,
ssh_connection_string: Option<&str>,
user_id: UserId,
) -> crate::Result<proto::DevServerProjectsUpdate> {
self.transaction(|tx| async move {
let Some(dev_server) = dev_server::Entity::find_by_id(id).one(&*tx).await? else {
return Err(anyhow::anyhow!("no dev server with id {}", id))?;
};
if dev_server.user_id != user_id || name.trim().is_empty() {
return Err(anyhow::anyhow!(proto::ErrorCode::Forbidden))?;
}
dev_server::Entity::update(dev_server::ActiveModel {
name: ActiveValue::Set(name.trim().to_string()),
ssh_connection_string: ActiveValue::Set(
ssh_connection_string.map(ToOwned::to_owned),
),
..dev_server.clone().into_active_model()
})
.exec(&*tx)
.await?;
let dev_server_projects = self
.dev_server_projects_update_internal(user_id, &tx)
.await?;
Ok(dev_server_projects)
})
.await
}
pub async fn delete_dev_server(
&self,
id: DevServerId,
user_id: UserId,
) -> crate::Result<proto::DevServerProjectsUpdate> {
self.transaction(|tx| async move {
let Some(dev_server) = dev_server::Entity::find_by_id(id).one(&*tx).await? else {
return Err(anyhow::anyhow!("no dev server with id {}", id))?;
};
if dev_server.user_id != user_id {
return Err(anyhow::anyhow!(proto::ErrorCode::Forbidden))?;
}
dev_server_project::Entity::delete_many()
.filter(dev_server_project::Column::DevServerId.eq(id))
.exec(&*tx)
.await?;
dev_server::Entity::delete(dev_server.into_active_model())
.exec(&*tx)
.await?;
let dev_server_projects = self
.dev_server_projects_update_internal(user_id, &tx)
.await?;
Ok(dev_server_projects)
})
.await
}
}


@@ -32,7 +32,6 @@ impl Database {
connection: ConnectionId,
worktrees: &[proto::WorktreeMetadata],
is_ssh_project: bool,
dev_server_project_id: Option<DevServerProjectId>,
) -> Result<TransactionGuard<(ProjectId, proto::Room)>> {
self.room_transaction(room_id, |tx| async move {
let participant = room_participant::Entity::find()
@@ -61,38 +60,6 @@ impl Database {
return Err(anyhow!("guests cannot share projects"))?;
}
if let Some(dev_server_project_id) = dev_server_project_id {
let project = project::Entity::find()
.filter(project::Column::DevServerProjectId.eq(Some(dev_server_project_id)))
.one(&*tx)
.await?
.ok_or_else(|| anyhow!("no remote project"))?;
let (_, dev_server) = dev_server_project::Entity::find_by_id(dev_server_project_id)
.find_also_related(dev_server::Entity)
.one(&*tx)
.await?
.ok_or_else(|| anyhow!("no dev_server_project"))?;
if !dev_server.is_some_and(|dev_server| dev_server.user_id == participant.user_id) {
return Err(anyhow!("not your dev server"))?;
}
if project.room_id.is_some() {
return Err(anyhow!("project already shared"))?;
};
let project = project::Entity::update(project::ActiveModel {
room_id: ActiveValue::Set(Some(room_id)),
..project.into_active_model()
})
.exec(&*tx)
.await?;
let room = self.get_room(room_id, &tx).await?;
return Ok((project.id, room));
}
let project = project::ActiveModel {
room_id: ActiveValue::set(Some(participant.room_id)),
host_user_id: ActiveValue::set(Some(participant.user_id)),
@@ -102,7 +69,6 @@ impl Database {
))),
id: ActiveValue::NotSet,
hosted_project_id: ActiveValue::Set(None),
dev_server_project_id: ActiveValue::Set(None),
}
.insert(&*tx)
.await?;
@@ -156,7 +122,6 @@ impl Database {
&self,
project_id: ProjectId,
connection: ConnectionId,
user_id: Option<UserId>,
) -> Result<TransactionGuard<(bool, Option<proto::Room>, Vec<ConnectionId>)>> {
self.project_transaction(project_id, |tx| async move {
let guest_connection_ids = self.project_guest_connection_ids(project_id, &tx).await?;
@@ -172,25 +137,6 @@ impl Database {
if project.host_connection()? == connection {
return Ok((true, room, guest_connection_ids));
}
if let Some(dev_server_project_id) = project.dev_server_project_id {
if let Some(user_id) = user_id {
if user_id
!= self
.owner_for_dev_server_project(dev_server_project_id, &tx)
.await?
{
Err(anyhow!("cannot unshare a project hosted by another user"))?
}
project::Entity::update(project::ActiveModel {
room_id: ActiveValue::Set(None),
..project.into_active_model()
})
.exec(&*tx)
.await?;
return Ok((false, room, guest_connection_ids));
}
}
Err(anyhow!("cannot unshare a project hosted by another user"))?
})
.await
@@ -633,17 +579,6 @@ impl Database {
.await
}
pub async fn find_dev_server_project(&self, id: DevServerProjectId) -> Result<project::Model> {
self.transaction(|tx| async move {
Ok(project::Entity::find()
.filter(project::Column::DevServerProjectId.eq(id))
.one(&*tx)
.await?
.ok_or_else(|| anyhow!("no such project"))?)
})
.await
}
/// Adds the given connection to the specified project
/// in the current room.
pub async fn join_project(
@@ -654,13 +589,7 @@ impl Database {
) -> Result<TransactionGuard<(Project, ReplicaId)>> {
self.project_transaction(project_id, |tx| async move {
let (project, role) = self
.access_project(
project_id,
connection,
PrincipalId::UserId(user_id),
Capability::ReadOnly,
&tx,
)
.access_project(project_id, connection, Capability::ReadOnly, &tx)
.await?;
self.join_project_internal(project, user_id, connection, role, &tx)
.await
@@ -851,7 +780,6 @@ impl Database {
worktree_id: None,
})
.collect(),
dev_server_project_id: project.dev_server_project_id,
};
Ok((project, replica_id as ReplicaId))
}
@@ -1007,29 +935,14 @@ impl Database {
&self,
project_id: ProjectId,
connection_id: ConnectionId,
principal_id: PrincipalId,
capability: Capability,
tx: &DatabaseTransaction,
) -> Result<(project::Model, ChannelRole)> {
let (mut project, dev_server_project) = project::Entity::find_by_id(project_id)
.find_also_related(dev_server_project::Entity)
let project = project::Entity::find_by_id(project_id)
.one(tx)
.await?
.ok_or_else(|| anyhow!("no such project"))?;
let user_id = match principal_id {
PrincipalId::DevServerId(_) => {
if project
.host_connection()
.is_ok_and(|connection| connection == connection_id)
{
return Ok((project, ChannelRole::Admin));
}
return Err(anyhow!("not the project host"))?;
}
PrincipalId::UserId(user_id) => user_id,
};
let role_from_room = if let Some(room_id) = project.room_id {
room_participant::Entity::find()
.filter(room_participant::Column::RoomId.eq(room_id))
@@ -1040,34 +953,8 @@ impl Database {
} else {
None
};
let role_from_dev_server = if let Some(dev_server_project) = dev_server_project {
let dev_server = dev_server::Entity::find_by_id(dev_server_project.dev_server_id)
.one(tx)
.await?
.ok_or_else(|| anyhow!("no such channel"))?;
if user_id == dev_server.user_id {
// If the user left the room "uncleanly" they may rejoin the
// remote project before leave_room runs. IN that case kick
// the project out of the room pre-emptively.
if role_from_room.is_none() {
project = project::Entity::update(project::ActiveModel {
room_id: ActiveValue::Set(None),
..project.into_active_model()
})
.exec(tx)
.await?;
}
Some(ChannelRole::Admin)
} else {
None
}
} else {
None
};
let role = role_from_dev_server
.or(role_from_room)
.unwrap_or(ChannelRole::Banned);
let role = role_from_room.unwrap_or(ChannelRole::Banned);
match capability {
Capability::ReadWrite => {
@@ -1090,17 +977,10 @@ impl Database {
&self,
project_id: ProjectId,
connection_id: ConnectionId,
user_id: UserId,
) -> Result<ConnectionId> {
self.project_transaction(project_id, |tx| async move {
let (project, _) = self
.access_project(
project_id,
connection_id,
PrincipalId::UserId(user_id),
Capability::ReadOnly,
&tx,
)
.access_project(project_id, connection_id, Capability::ReadOnly, &tx)
.await?;
project.host_connection()
})
@@ -1113,17 +993,10 @@ impl Database {
&self,
project_id: ProjectId,
connection_id: ConnectionId,
user_id: UserId,
) -> Result<ConnectionId> {
self.project_transaction(project_id, |tx| async move {
let (project, _) = self
.access_project(
project_id,
connection_id,
PrincipalId::UserId(user_id),
Capability::ReadWrite,
&tx,
)
.access_project(project_id, connection_id, Capability::ReadWrite, &tx)
.await?;
project.host_connection()
})
@@ -1131,47 +1004,16 @@ impl Database {
.map(|guard| guard.into_inner())
}
/// Returns the host connection for a request to join a shared project.
pub async fn host_for_owner_project_request(
&self,
project_id: ProjectId,
_connection_id: ConnectionId,
user_id: UserId,
) -> Result<ConnectionId> {
self.project_transaction(project_id, |tx| async move {
let (project, dev_server_project) = project::Entity::find_by_id(project_id)
.find_also_related(dev_server_project::Entity)
.one(&*tx)
.await?
.ok_or_else(|| anyhow!("no such project"))?;
let Some(dev_server_project) = dev_server_project else {
return Err(anyhow!("not a dev server project"))?;
};
let dev_server = dev_server::Entity::find_by_id(dev_server_project.dev_server_id)
.one(&*tx)
.await?
.ok_or_else(|| anyhow!("no such dev server"))?;
if dev_server.user_id != user_id {
return Err(anyhow!("not your project"))?;
}
project.host_connection()
})
.await
.map(|guard| guard.into_inner())
}
pub async fn connections_for_buffer_update(
&self,
project_id: ProjectId,
principal_id: PrincipalId,
connection_id: ConnectionId,
capability: Capability,
) -> Result<TransactionGuard<(ConnectionId, Vec<ConnectionId>)>> {
self.project_transaction(project_id, |tx| async move {
// Authorize
let (project, _) = self
.access_project(project_id, connection_id, principal_id, capability, &tx)
.access_project(project_id, connection_id, capability, &tx)
.await?;
let host_connection_id = project.host_connection()?;


@@ -858,25 +858,6 @@ impl Database {
.all(&*tx)
.await?;
// if any project in the room has a remote-project-id that belongs to a dev server that this user owns.
let dev_server_projects_for_user = self
.dev_server_project_ids_for_user(leaving_participant.user_id, &tx)
.await?;
let dev_server_projects_to_unshare = project::Entity::find()
.filter(
Condition::all()
.add(project::Column::RoomId.eq(room_id))
.add(
project::Column::DevServerProjectId
.is_in(dev_server_projects_for_user.clone()),
),
)
.all(&*tx)
.await?
.into_iter()
.map(|project| project.id)
.collect::<HashSet<_>>();
let mut left_projects = HashMap::default();
let mut collaborators = project_collaborator::Entity::find()
.filter(project_collaborator::Column::ProjectId.is_in(project_ids))
@@ -899,9 +880,7 @@ impl Database {
left_project.connection_ids.push(collaborator_connection_id);
}
if (collaborator.is_host && collaborator.connection() == connection)
|| dev_server_projects_to_unshare.contains(&collaborator.project_id)
{
if collaborator.is_host && collaborator.connection() == connection {
left_project.should_unshare = true;
}
}
@@ -944,17 +923,6 @@ impl Database {
.exec(&*tx)
.await?;
if !dev_server_projects_to_unshare.is_empty() {
project::Entity::update_many()
.filter(project::Column::Id.is_in(dev_server_projects_to_unshare))
.set(project::ActiveModel {
room_id: ActiveValue::Set(None),
..Default::default()
})
.exec(&*tx)
.await?;
}
let (channel, room) = self.get_channel_room(room_id, &tx).await?;
let deleted = if room.participants.is_empty() {
let result = room::Entity::delete_by_id(room_id).exec(&*tx).await?;
@@ -1323,26 +1291,6 @@ impl Database {
project.worktree_root_names.push(db_worktree.root_name);
}
}
} else if let Some(dev_server_project_id) = db_project.dev_server_project_id {
let host = self
.owner_for_dev_server_project(dev_server_project_id, tx)
.await?;
if let Some((_, participant)) = participants
.iter_mut()
.find(|(_, v)| v.user_id == host.to_proto())
{
participant.projects.push(proto::ParticipantProject {
id: db_project.id.to_proto(),
worktree_root_names: Default::default(),
});
let project = participant.projects.last_mut().unwrap();
for db_worktree in db_worktrees {
if db_worktree.visible {
project.worktree_root_names.push(db_worktree.root_name);
}
}
}
}
}


@@ -13,8 +13,6 @@ pub mod channel_message;
pub mod channel_message_mention;
pub mod contact;
pub mod contributor;
pub mod dev_server;
pub mod dev_server_project;
pub mod embedding;
pub mod extension;
pub mod extension_version;


@@ -1,39 +0,0 @@
use crate::db::{DevServerId, UserId};
use rpc::proto;
use sea_orm::entity::prelude::*;
#[derive(Clone, Debug, PartialEq, Eq, DeriveEntityModel)]
#[sea_orm(table_name = "dev_servers")]
pub struct Model {
#[sea_orm(primary_key)]
pub id: DevServerId,
pub name: String,
pub user_id: UserId,
pub hashed_token: String,
pub ssh_connection_string: Option<String>,
}
impl ActiveModelBehavior for ActiveModel {}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {
#[sea_orm(has_many = "super::dev_server_project::Entity")]
RemoteProject,
}
impl Related<super::dev_server_project::Entity> for Entity {
fn to() -> RelationDef {
Relation::RemoteProject.def()
}
}
impl Model {
pub fn to_proto(&self, status: proto::DevServerStatus) -> proto::DevServer {
proto::DevServer {
dev_server_id: self.id.to_proto(),
name: self.name.clone(),
status: status as i32,
ssh_connection_string: self.ssh_connection_string.clone(),
}
}
}


@@ -1,59 +0,0 @@
use super::project;
use crate::db::{DevServerId, DevServerProjectId};
use rpc::proto;
use sea_orm::{entity::prelude::*, FromJsonQueryResult};
use serde::{Deserialize, Serialize};
#[derive(Clone, Debug, PartialEq, Eq, DeriveEntityModel)]
#[sea_orm(table_name = "dev_server_projects")]
pub struct Model {
#[sea_orm(primary_key)]
pub id: DevServerProjectId,
pub dev_server_id: DevServerId,
pub paths: JSONPaths,
}
#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize, FromJsonQueryResult)]
pub struct JSONPaths(pub Vec<String>);
impl ActiveModelBehavior for ActiveModel {}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {
#[sea_orm(has_one = "super::project::Entity")]
Project,
#[sea_orm(
belongs_to = "super::dev_server::Entity",
from = "Column::DevServerId",
to = "super::dev_server::Column::Id"
)]
DevServer,
}
impl Related<super::project::Entity> for Entity {
fn to() -> RelationDef {
Relation::Project.def()
}
}
impl Related<super::dev_server::Entity> for Entity {
fn to() -> RelationDef {
Relation::DevServer.def()
}
}
impl Model {
pub fn to_proto(&self, project: Option<project::Model>) -> proto::DevServerProject {
proto::DevServerProject {
id: self.id.to_proto(),
project_id: project.map(|p| p.id.to_proto()),
dev_server_id: self.dev_server_id.to_proto(),
path: self.paths().first().cloned().unwrap_or_default(),
paths: self.paths().clone(),
}
}
pub fn paths(&self) -> &Vec<String> {
&self.paths.0
}
}


@@ -1,4 +1,4 @@
use crate::db::{DevServerProjectId, HostedProjectId, ProjectId, Result, RoomId, ServerId, UserId};
use crate::db::{HostedProjectId, ProjectId, Result, RoomId, ServerId, UserId};
use anyhow::anyhow;
use rpc::ConnectionId;
use sea_orm::entity::prelude::*;
@@ -13,7 +13,6 @@ pub struct Model {
pub host_connection_id: Option<i32>,
pub host_connection_server_id: Option<ServerId>,
pub hosted_project_id: Option<HostedProjectId>,
pub dev_server_project_id: Option<DevServerProjectId>,
}
impl Model {
@@ -57,12 +56,6 @@ pub enum Relation {
to = "super::hosted_project::Column::Id"
)]
HostedProject,
#[sea_orm(
belongs_to = "super::dev_server_project::Entity",
from = "Column::DevServerProjectId",
to = "super::dev_server_project::Column::Id"
)]
RemoteProject,
}
impl Related<super::user::Entity> for Entity {
@@ -101,10 +94,4 @@ impl Related<super::hosted_project::Entity> for Entity {
}
}
impl Related<super::dev_server_project::Entity> for Entity {
fn to() -> RelationDef {
Relation::RemoteProject.def()
}
}
impl ActiveModelBehavior for ActiveModel {}


@@ -540,18 +540,18 @@ async fn test_project_count(db: &Arc<Database>) {
.unwrap();
assert_eq!(db.project_count_excluding_admins().await.unwrap(), 0);
db.share_project(room_id, ConnectionId { owner_id, id: 1 }, &[], false, None)
db.share_project(room_id, ConnectionId { owner_id, id: 1 }, &[], false)
.await
.unwrap();
assert_eq!(db.project_count_excluding_admins().await.unwrap(), 1);
db.share_project(room_id, ConnectionId { owner_id, id: 1 }, &[], false, None)
db.share_project(room_id, ConnectionId { owner_id, id: 1 }, &[], false)
.await
.unwrap();
assert_eq!(db.project_count_excluding_admins().await.unwrap(), 2);
// Projects shared by admins aren't counted.
db.share_project(room_id, ConnectionId { owner_id, id: 0 }, &[], false, None)
db.share_project(room_id, ConnectionId { owner_id, id: 0 }, &[], false)
.await
.unwrap();
assert_eq!(db.project_count_excluding_admins().await.unwrap(), 2);

File diff suppressed because it is too large.


@@ -1,7 +1,7 @@
use crate::db::{ChannelId, ChannelRole, DevServerId, PrincipalId, UserId};
use crate::db::{ChannelId, ChannelRole, UserId};
use anyhow::{anyhow, Result};
use collections::{BTreeMap, HashMap, HashSet};
use rpc::{proto, ConnectionId};
use rpc::ConnectionId;
use semantic_version::SemanticVersion;
use serde::Serialize;
use std::fmt;
@@ -11,9 +11,7 @@ use tracing::instrument;
pub struct ConnectionPool {
connections: BTreeMap<ConnectionId, Connection>,
connected_users: BTreeMap<UserId, ConnectedPrincipal>,
connected_dev_servers: BTreeMap<DevServerId, ConnectionId>,
channels: ChannelPool,
offline_dev_servers: HashSet<DevServerId>,
}
#[derive(Default, Serialize)]
@@ -32,13 +30,13 @@ impl fmt::Display for ZedVersion {
impl ZedVersion {
pub fn can_collaborate(&self) -> bool {
self.0 >= SemanticVersion::new(0, 151, 0)
self.0 >= SemanticVersion::new(0, 157, 0)
}
}
#[derive(Serialize)]
pub struct Connection {
pub principal_id: PrincipalId,
pub user_id: UserId,
pub admin: bool,
pub zed_version: ZedVersion,
}
@@ -47,7 +45,6 @@ impl ConnectionPool {
pub fn reset(&mut self) {
self.connections.clear();
self.connected_users.clear();
self.connected_dev_servers.clear();
self.channels.clear();
}
@@ -66,7 +63,7 @@ impl ConnectionPool {
self.connections.insert(
connection_id,
Connection {
principal_id: PrincipalId::UserId(user_id),
user_id,
admin,
zed_version,
},
@@ -75,25 +72,6 @@ impl ConnectionPool {
connected_user.connection_ids.insert(connection_id);
}
pub fn add_dev_server(
&mut self,
connection_id: ConnectionId,
dev_server_id: DevServerId,
zed_version: ZedVersion,
) {
self.connections.insert(
connection_id,
Connection {
principal_id: PrincipalId::DevServerId(dev_server_id),
admin: false,
zed_version,
},
);
self.connected_dev_servers
.insert(dev_server_id, connection_id);
}
#[instrument(skip(self))]
pub fn remove_connection(&mut self, connection_id: ConnectionId) -> Result<()> {
let connection = self
@@ -101,28 +79,18 @@ impl ConnectionPool {
.get_mut(&connection_id)
.ok_or_else(|| anyhow!("no such connection"))?;
match connection.principal_id {
PrincipalId::UserId(user_id) => {
let connected_user = self.connected_users.get_mut(&user_id).unwrap();
connected_user.connection_ids.remove(&connection_id);
if connected_user.connection_ids.is_empty() {
self.connected_users.remove(&user_id);
self.channels.remove_user(&user_id);
}
}
PrincipalId::DevServerId(dev_server_id) => {
self.connected_dev_servers.remove(&dev_server_id);
self.offline_dev_servers.remove(&dev_server_id);
}
}
let user_id = connection.user_id;
let connected_user = self.connected_users.get_mut(&user_id).unwrap();
connected_user.connection_ids.remove(&connection_id);
if connected_user.connection_ids.is_empty() {
self.connected_users.remove(&user_id);
self.channels.remove_user(&user_id);
};
self.connections.remove(&connection_id).unwrap();
Ok(())
}
pub fn set_dev_server_offline(&mut self, dev_server_id: DevServerId) {
self.offline_dev_servers.insert(dev_server_id);
}
pub fn connections(&self) -> impl Iterator<Item = &Connection> {
self.connections.values()
}
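With dev-server connections gone, every entry in the pool belongs to a user, so remove_connection above reduces to: drop the connection, remove its id from the owning user's set, and forget the user once that set is empty. A self-contained sketch of that bookkeeping, using plain integers in place of the crate's ConnectionId and UserId types:

use std::collections::{BTreeMap, BTreeSet};

struct Pool {
    // connection id -> user id
    connections: BTreeMap<u64, u64>,
    // user id -> that user's live connection ids
    connected_users: BTreeMap<u64, BTreeSet<u64>>,
}

impl Pool {
    fn remove_connection(&mut self, connection_id: u64) -> Option<u64> {
        let user_id = self.connections.remove(&connection_id)?;
        if let Some(ids) = self.connected_users.get_mut(&user_id) {
            ids.remove(&connection_id);
            if ids.is_empty() {
                self.connected_users.remove(&user_id);
            }
        }
        Some(user_id)
    }
}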
@@ -147,42 +115,6 @@ impl ConnectionPool {
.copied()
}
pub fn dev_server_status(&self, dev_server_id: DevServerId) -> proto::DevServerStatus {
if self.dev_server_connection_id(dev_server_id).is_some()
&& !self.offline_dev_servers.contains(&dev_server_id)
{
proto::DevServerStatus::Online
} else {
proto::DevServerStatus::Offline
}
}
pub fn dev_server_connection_id(&self, dev_server_id: DevServerId) -> Option<ConnectionId> {
self.connected_dev_servers.get(&dev_server_id).copied()
}
pub fn online_dev_server_connection_id(
&self,
dev_server_id: DevServerId,
) -> Result<ConnectionId> {
match self.connected_dev_servers.get(&dev_server_id) {
Some(cid) => Ok(*cid),
None => Err(anyhow!(proto::ErrorCode::DevServerOffline)),
}
}
pub fn dev_server_connection_id_supporting(
&self,
dev_server_id: DevServerId,
required: ZedVersion,
) -> Result<ConnectionId> {
match self.connected_dev_servers.get(&dev_server_id) {
Some(cid) if self.connections[cid].zed_version >= required => Ok(*cid),
Some(_) => Err(anyhow!(proto::ErrorCode::RemoteUpgradeRequired)),
None => Err(anyhow!(proto::ErrorCode::DevServerOffline)),
}
}
pub fn channel_user_ids(
&self,
channel_id: ChannelId,
@@ -227,39 +159,22 @@ impl ConnectionPool {
#[cfg(test)]
pub fn check_invariants(&self) {
for (connection_id, connection) in &self.connections {
match &connection.principal_id {
PrincipalId::UserId(user_id) => {
assert!(self
.connected_users
.get(user_id)
.unwrap()
.connection_ids
.contains(connection_id));
}
PrincipalId::DevServerId(dev_server_id) => {
assert_eq!(
self.connected_dev_servers.get(dev_server_id).unwrap(),
connection_id
);
}
}
assert!(self
.connected_users
.get(&connection.user_id)
.unwrap()
.connection_ids
.contains(connection_id));
}
for (user_id, state) in &self.connected_users {
for connection_id in &state.connection_ids {
assert_eq!(
self.connections.get(connection_id).unwrap().principal_id,
PrincipalId::UserId(*user_id)
self.connections.get(connection_id).unwrap().user_id,
*user_id
);
}
}
for (dev_server_id, connection_id) in &self.connected_dev_servers {
assert_eq!(
self.connections.get(connection_id).unwrap().principal_id,
PrincipalId::DevServerId(*dev_server_id)
);
}
}
}


@@ -8,7 +8,6 @@ mod channel_buffer_tests;
mod channel_guest_tests;
mod channel_message_tests;
mod channel_tests;
mod dev_server_tests;
mod editor_tests;
mod following_tests;
mod integration_tests;


@@ -95,7 +95,9 @@ async fn test_channel_guest_promotion(cx_a: &mut TestAppContext, cx_b: &mut Test
let room_b = cx_b
.read(ActiveCall::global)
.update(cx_b, |call, _| call.room().unwrap().clone());
cx_b.simulate_keystrokes("cmd-p 1 enter");
cx_b.simulate_keystrokes("cmd-p");
cx_a.run_until_parked();
cx_b.simulate_keystrokes("1 enter");
let (project_b, editor_b) = workspace_b.update(cx_b, |workspace, cx| {
(


@@ -1,643 +0,0 @@
use std::{path::Path, sync::Arc};
use call::ActiveCall;
use editor::Editor;
use fs::Fs;
use gpui::{TestAppContext, VisualTestContext, WindowHandle};
use rpc::{proto::DevServerStatus, ErrorCode, ErrorExt};
use serde_json::json;
use workspace::{AppState, Workspace};
use crate::tests::{following_tests::join_channel, TestServer};
use super::TestClient;
#[gpui::test]
async fn test_dev_server(cx: &mut gpui::TestAppContext, cx2: &mut gpui::TestAppContext) {
let (server, client) = TestServer::start1(cx).await;
let store = cx.update(|cx| dev_server_projects::Store::global(cx).clone());
let resp = store
.update(cx, |store, cx| {
store.create_dev_server("server-1".to_string(), None, cx)
})
.await
.unwrap();
store.update(cx, |store, _| {
assert_eq!(store.dev_servers().len(), 1);
assert_eq!(store.dev_servers()[0].name, "server-1");
assert_eq!(store.dev_servers()[0].status, DevServerStatus::Offline);
});
let dev_server = server.create_dev_server(resp.access_token, cx2).await;
cx.executor().run_until_parked();
store.update(cx, |store, _| {
assert_eq!(store.dev_servers()[0].status, DevServerStatus::Online);
});
dev_server
.fs()
.insert_tree(
"/remote",
json!({
"1.txt": "remote\nremote\nremote",
"2.js": "function two() { return 2; }",
"3.rs": "mod test",
}),
)
.await;
store
.update(cx, |store, cx| {
store.create_dev_server_project(
client::DevServerId(resp.dev_server_id),
"/remote".to_string(),
cx,
)
})
.await
.unwrap();
cx.executor().run_until_parked();
let remote_workspace = store
.update(cx, |store, cx| {
let projects = store.dev_server_projects();
assert_eq!(projects.len(), 1);
assert_eq!(projects[0].paths, vec!["/remote"]);
workspace::join_dev_server_project(
projects[0].id,
projects[0].project_id.unwrap(),
client.app_state.clone(),
None,
cx,
)
})
.await
.unwrap();
cx.executor().run_until_parked();
let cx = VisualTestContext::from_window(remote_workspace.into(), cx).as_mut();
cx.simulate_keystrokes("cmd-p 1 enter");
let editor = remote_workspace
.update(cx, |ws, cx| {
ws.active_item_as::<Editor>(cx).unwrap().clone()
})
.unwrap();
editor.update(cx, |ed, cx| {
assert_eq!(ed.text(cx).to_string(), "remote\nremote\nremote");
});
cx.simulate_input("wow!");
cx.simulate_keystrokes("cmd-s");
let content = dev_server
.fs()
.load(Path::new("/remote/1.txt"))
.await
.unwrap();
assert_eq!(content, "wow!remote\nremote\nremote\n");
}
#[gpui::test]
async fn test_dev_server_env_files(
cx1: &mut gpui::TestAppContext,
cx2: &mut gpui::TestAppContext,
cx3: &mut gpui::TestAppContext,
) {
let (server, client1, client2, channel_id) = TestServer::start2(cx1, cx2).await;
let (_dev_server, remote_workspace) =
create_dev_server_project(&server, client1.app_state.clone(), cx1, cx3).await;
cx1.executor().run_until_parked();
let cx1 = VisualTestContext::from_window(remote_workspace.into(), cx1).as_mut();
cx1.simulate_keystrokes("cmd-p . e enter");
let editor = remote_workspace
.update(cx1, |ws, cx| {
ws.active_item_as::<Editor>(cx).unwrap().clone()
})
.unwrap();
editor.update(cx1, |ed, cx| {
assert_eq!(ed.text(cx).to_string(), "SECRET");
});
cx1.update(|cx| {
workspace::join_channel(
channel_id,
client1.app_state.clone(),
Some(remote_workspace),
cx,
)
})
.await
.unwrap();
cx1.executor().run_until_parked();
remote_workspace
.update(cx1, |ws, cx| {
assert!(ws.project().read(cx).is_shared());
})
.unwrap();
join_channel(channel_id, &client2, cx2).await.unwrap();
cx2.executor().run_until_parked();
let (workspace2, cx2) = client2.active_workspace(cx2);
let editor = workspace2.update(cx2, |ws, cx| {
ws.active_item_as::<Editor>(cx).unwrap().clone()
});
// TODO: it'd be nice to hide .env files from other people
editor.update(cx2, |ed, cx| {
assert_eq!(ed.text(cx).to_string(), "SECRET");
});
}
async fn create_dev_server_project(
server: &TestServer,
client_app_state: Arc<AppState>,
cx: &mut TestAppContext,
cx_devserver: &mut TestAppContext,
) -> (TestClient, WindowHandle<Workspace>) {
let store = cx.update(|cx| dev_server_projects::Store::global(cx).clone());
let resp = store
.update(cx, |store, cx| {
store.create_dev_server("server-1".to_string(), None, cx)
})
.await
.unwrap();
let dev_server = server
.create_dev_server(resp.access_token, cx_devserver)
.await;
cx.executor().run_until_parked();
dev_server
.fs()
.insert_tree(
"/remote",
json!({
"1.txt": "remote\nremote\nremote",
".env": "SECRET",
}),
)
.await;
store
.update(cx, |store, cx| {
store.create_dev_server_project(
client::DevServerId(resp.dev_server_id),
"/remote".to_string(),
cx,
)
})
.await
.unwrap();
cx.executor().run_until_parked();
let workspace = store
.update(cx, |store, cx| {
let projects = store.dev_server_projects();
assert_eq!(projects.len(), 1);
assert_eq!(projects[0].paths, vec!["/remote"]);
workspace::join_dev_server_project(
projects[0].id,
projects[0].project_id.unwrap(),
client_app_state,
None,
cx,
)
})
.await
.unwrap();
cx.executor().run_until_parked();
(dev_server, workspace)
}
#[gpui::test]
async fn test_dev_server_leave_room(
cx1: &mut gpui::TestAppContext,
cx2: &mut gpui::TestAppContext,
cx3: &mut gpui::TestAppContext,
) {
let (server, client1, client2, channel_id) = TestServer::start2(cx1, cx2).await;
let (_dev_server, remote_workspace) =
create_dev_server_project(&server, client1.app_state.clone(), cx1, cx3).await;
cx1.update(|cx| {
workspace::join_channel(
channel_id,
client1.app_state.clone(),
Some(remote_workspace),
cx,
)
})
.await
.unwrap();
cx1.executor().run_until_parked();
remote_workspace
.update(cx1, |ws, cx| {
assert!(ws.project().read(cx).is_shared());
})
.unwrap();
join_channel(channel_id, &client2, cx2).await.unwrap();
cx2.executor().run_until_parked();
cx1.update(|cx| ActiveCall::global(cx).update(cx, |active_call, cx| active_call.hang_up(cx)))
.await
.unwrap();
cx1.executor().run_until_parked();
let (workspace, cx2) = client2.active_workspace(cx2);
cx2.update(|cx| assert!(workspace.read(cx).project().read(cx).is_disconnected(cx)));
}
#[gpui::test]
async fn test_dev_server_delete(
cx1: &mut gpui::TestAppContext,
cx2: &mut gpui::TestAppContext,
cx3: &mut gpui::TestAppContext,
) {
let (server, client1, client2, channel_id) = TestServer::start2(cx1, cx2).await;
let (_dev_server, remote_workspace) =
create_dev_server_project(&server, client1.app_state.clone(), cx1, cx3).await;
cx1.update(|cx| {
workspace::join_channel(
channel_id,
client1.app_state.clone(),
Some(remote_workspace),
cx,
)
})
.await
.unwrap();
cx1.executor().run_until_parked();
remote_workspace
.update(cx1, |ws, cx| {
assert!(ws.project().read(cx).is_shared());
})
.unwrap();
join_channel(channel_id, &client2, cx2).await.unwrap();
cx2.executor().run_until_parked();
cx1.update(|cx| {
dev_server_projects::Store::global(cx).update(cx, |store, cx| {
store.delete_dev_server_project(store.dev_server_projects().first().unwrap().id, cx)
})
})
.await
.unwrap();
cx1.executor().run_until_parked();
let (workspace, cx2) = client2.active_workspace(cx2);
cx2.update(|cx| assert!(workspace.read(cx).project().read(cx).is_disconnected(cx)));
cx1.update(|cx| {
dev_server_projects::Store::global(cx).update(cx, |store, _| {
assert_eq!(store.dev_server_projects().len(), 0);
})
})
}
#[gpui::test]
async fn test_dev_server_rename(
cx1: &mut gpui::TestAppContext,
cx2: &mut gpui::TestAppContext,
cx3: &mut gpui::TestAppContext,
) {
let (server, client1, client2, channel_id) = TestServer::start2(cx1, cx2).await;
let (_dev_server, remote_workspace) =
create_dev_server_project(&server, client1.app_state.clone(), cx1, cx3).await;
cx1.update(|cx| {
workspace::join_channel(
channel_id,
client1.app_state.clone(),
Some(remote_workspace),
cx,
)
})
.await
.unwrap();
cx1.executor().run_until_parked();
remote_workspace
.update(cx1, |ws, cx| {
assert!(ws.project().read(cx).is_shared());
})
.unwrap();
join_channel(channel_id, &client2, cx2).await.unwrap();
cx2.executor().run_until_parked();
cx1.update(|cx| {
dev_server_projects::Store::global(cx).update(cx, |store, cx| {
store.rename_dev_server(
store.dev_servers().first().unwrap().id,
"name-edited".to_string(),
None,
cx,
)
})
})
.await
.unwrap();
cx1.executor().run_until_parked();
cx1.update(|cx| {
dev_server_projects::Store::global(cx).update(cx, |store, _| {
assert_eq!(store.dev_servers().first().unwrap().name, "name-edited");
})
})
}
#[gpui::test]
async fn test_dev_server_refresh_access_token(
cx1: &mut gpui::TestAppContext,
cx2: &mut gpui::TestAppContext,
cx3: &mut gpui::TestAppContext,
cx4: &mut gpui::TestAppContext,
) {
let (server, client1, client2, channel_id) = TestServer::start2(cx1, cx2).await;
let (_dev_server, remote_workspace) =
create_dev_server_project(&server, client1.app_state.clone(), cx1, cx3).await;
cx1.update(|cx| {
workspace::join_channel(
channel_id,
client1.app_state.clone(),
Some(remote_workspace),
cx,
)
})
.await
.unwrap();
cx1.executor().run_until_parked();
remote_workspace
.update(cx1, |ws, cx| {
assert!(ws.project().read(cx).is_shared());
})
.unwrap();
join_channel(channel_id, &client2, cx2).await.unwrap();
cx2.executor().run_until_parked();
// Regenerate the access token
let new_token_response = cx1
.update(|cx| {
dev_server_projects::Store::global(cx).update(cx, |store, cx| {
store.regenerate_dev_server_token(store.dev_servers().first().unwrap().id, cx)
})
})
.await
.unwrap();
cx1.executor().run_until_parked();
// Assert that the other client was disconnected
let (workspace, cx2) = client2.active_workspace(cx2);
cx2.update(|cx| assert!(workspace.read(cx).project().read(cx).is_disconnected(cx)));
// Assert that the owner of the dev server does not see the dev server as online anymore
let (workspace, cx1) = client1.active_workspace(cx1);
cx1.update(|cx| {
assert!(workspace.read(cx).project().read(cx).is_disconnected(cx));
dev_server_projects::Store::global(cx).update(cx, |store, _| {
assert_eq!(
store.dev_servers().first().unwrap().status,
DevServerStatus::Offline
);
})
});
// Reconnect the dev server with the new token
let _dev_server = server
.create_dev_server(new_token_response.access_token, cx4)
.await;
cx1.executor().run_until_parked();
// Assert that the dev server is online again
cx1.update(|cx| {
dev_server_projects::Store::global(cx).update(cx, |store, _| {
assert_eq!(store.dev_servers().len(), 1);
assert_eq!(
store.dev_servers().first().unwrap().status,
DevServerStatus::Online
);
})
});
}
#[gpui::test]
async fn test_dev_server_reconnect(
cx1: &mut gpui::TestAppContext,
cx2: &mut gpui::TestAppContext,
cx3: &mut gpui::TestAppContext,
) {
let (mut server, client1) = TestServer::start1(cx1).await;
let channel_id = server
.make_channel("test", None, (&client1, cx1), &mut [])
.await;
let (_dev_server, remote_workspace) =
create_dev_server_project(&server, client1.app_state.clone(), cx1, cx3).await;
cx1.update(|cx| {
workspace::join_channel(
channel_id,
client1.app_state.clone(),
Some(remote_workspace),
cx,
)
})
.await
.unwrap();
cx1.executor().run_until_parked();
remote_workspace
.update(cx1, |ws, cx| {
assert!(ws.project().read(cx).is_shared());
})
.unwrap();
drop(client1);
let client2 = server.create_client(cx2, "user_a").await;
let store = cx2.update(|cx| dev_server_projects::Store::global(cx).clone());
store
.update(cx2, |store, cx| {
let projects = store.dev_server_projects();
workspace::join_dev_server_project(
projects[0].id,
projects[0].project_id.unwrap(),
client2.app_state.clone(),
None,
cx,
)
})
.await
.unwrap();
}
#[gpui::test]
async fn test_dev_server_restart(cx1: &mut gpui::TestAppContext, cx2: &mut gpui::TestAppContext) {
let (server, client1) = TestServer::start1(cx1).await;
let (_dev_server, remote_workspace) =
create_dev_server_project(&server, client1.app_state.clone(), cx1, cx2).await;
let cx = VisualTestContext::from_window(remote_workspace.into(), cx1).as_mut();
server.reset().await;
cx.run_until_parked();
cx.simulate_keystrokes("cmd-p 1 enter");
remote_workspace
.update(cx, |ws, cx| {
ws.active_item_as::<Editor>(cx)
.unwrap()
.update(cx, |ed, cx| {
assert_eq!(ed.text(cx).to_string(), "remote\nremote\nremote");
})
})
.unwrap();
}
#[gpui::test]
async fn test_create_dev_server_project_path_validation(
cx1: &mut gpui::TestAppContext,
cx2: &mut gpui::TestAppContext,
cx3: &mut gpui::TestAppContext,
) {
let (server, client1) = TestServer::start1(cx1).await;
let _channel_id = server
.make_channel("test", None, (&client1, cx1), &mut [])
.await;
// Creating a project with a path that does exist should not fail
let (_dev_server, _) =
create_dev_server_project(&server, client1.app_state.clone(), cx1, cx2).await;
cx1.executor().run_until_parked();
let store = cx1.update(|cx| dev_server_projects::Store::global(cx).clone());
let resp = store
.update(cx1, |store, cx| {
store.create_dev_server("server-2".to_string(), None, cx)
})
.await
.unwrap();
cx1.executor().run_until_parked();
let _dev_server = server.create_dev_server(resp.access_token, cx3).await;
cx1.executor().run_until_parked();
// Creating a remote project with a path that does not exist should fail
let result = store
.update(cx1, |store, cx| {
store.create_dev_server_project(
client::DevServerId(resp.dev_server_id),
"/notfound".to_string(),
cx,
)
})
.await;
cx1.executor().run_until_parked();
let error = result.unwrap_err();
assert!(matches!(
error.error_code(),
ErrorCode::DevServerProjectPathDoesNotExist
));
}
#[gpui::test]
async fn test_save_as_remote(cx1: &mut gpui::TestAppContext, cx2: &mut gpui::TestAppContext) {
let (server, client1) = TestServer::start1(cx1).await;
// Creating a project with a path that does exist should not fail
let (dev_server, remote_workspace) =
create_dev_server_project(&server, client1.app_state.clone(), cx1, cx2).await;
let mut cx = VisualTestContext::from_window(remote_workspace.into(), cx1);
cx.simulate_keystrokes("cmd-p 1 enter");
cx.simulate_keystrokes("cmd-shift-s");
cx.simulate_input("2.txt");
cx.simulate_keystrokes("enter");
cx.executor().run_until_parked();
let title = remote_workspace
.update(&mut cx, |ws, cx| {
let active_item = ws.active_item(cx).unwrap();
active_item.tab_description(0, cx).unwrap()
})
.unwrap();
assert_eq!(title, "2.txt");
let path = Path::new("/remote/2.txt");
assert_eq!(
dev_server.fs().load(path).await.unwrap(),
"remote\nremote\nremote"
);
}
#[gpui::test]
async fn test_new_file_remote(cx1: &mut gpui::TestAppContext, cx2: &mut gpui::TestAppContext) {
let (server, client1) = TestServer::start1(cx1).await;
// Creating a project with a path that does exist should not fail
let (dev_server, remote_workspace) =
create_dev_server_project(&server, client1.app_state.clone(), cx1, cx2).await;
let mut cx = VisualTestContext::from_window(remote_workspace.into(), cx1);
cx.simulate_keystrokes("cmd-n");
cx.simulate_input("new!");
cx.simulate_keystrokes("cmd-shift-s");
cx.simulate_input("2.txt");
cx.simulate_keystrokes("enter");
cx.executor().run_until_parked();
let title = remote_workspace
.update(&mut cx, |ws, cx| {
ws.active_item(cx).unwrap().tab_description(0, cx).unwrap()
})
.unwrap();
assert_eq!(title, "2.txt");
let path = Path::new("/remote/2.txt");
assert_eq!(dev_server.fs().load(path).await.unwrap(), "new!");
}
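
The two tests above repeat the same "save the active buffer as <name>" keystroke sequence; a minimal sketch of it as a helper (hypothetical name, same simulate_* calls as above):

// Hypothetical helper, not part of the diff.
fn save_active_buffer_as(cx: &mut VisualTestContext, name: &str) {
    cx.simulate_keystrokes("cmd-shift-s");
    cx.simulate_input(name);
    cx.simulate_keystrokes("enter");
    cx.executor().run_until_parked();
}
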


@@ -1589,8 +1589,9 @@ async fn test_following_stops_on_unshare(cx_a: &mut TestAppContext, cx_b: &mut T
.await;
let (workspace_b, cx_b) = client_b.join_workspace(channel_id, cx_b).await;
cx_a.simulate_keystrokes("cmd-p 2 enter");
cx_a.simulate_keystrokes("cmd-p");
cx_a.run_until_parked();
cx_a.simulate_keystrokes("2 enter");
let editor_a = workspace_a.update(cx_a, |workspace, cx| {
workspace.active_item_as::<Editor>(cx).unwrap()
@@ -2041,7 +2042,9 @@ async fn test_following_to_channel_notes_other_workspace(
share_workspace(&workspace_a, cx_a).await.unwrap();
// a opens 1.txt
cx_a.simulate_keystrokes("cmd-p 1 enter");
cx_a.simulate_keystrokes("cmd-p");
cx_a.run_until_parked();
cx_a.simulate_keystrokes("1 enter");
cx_a.run_until_parked();
workspace_a.update(cx_a, |workspace, cx| {
let editor = workspace.active_item(cx).unwrap();
@@ -2098,7 +2101,9 @@ async fn test_following_while_deactivated(cx_a: &mut TestAppContext, cx_b: &mut
share_workspace(&workspace_a, cx_a).await.unwrap();
// a opens 1.txt
cx_a.simulate_keystrokes("cmd-p 1 enter");
cx_a.simulate_keystrokes("cmd-p");
cx_a.run_until_parked();
cx_a.simulate_keystrokes("1 enter");
cx_a.run_until_parked();
workspace_a.update(cx_a, |workspace, cx| {
let editor = workspace.active_item(cx).unwrap();
@@ -2118,7 +2123,9 @@ async fn test_following_while_deactivated(cx_a: &mut TestAppContext, cx_b: &mut
cx_b.simulate_keystrokes("down");
// a opens a different file while not followed
cx_a.simulate_keystrokes("cmd-p 2 enter");
cx_a.simulate_keystrokes("cmd-p");
cx_a.run_until_parked();
cx_a.simulate_keystrokes("2 enter");
workspace_b.update(cx_b, |workspace, cx| {
let editor = workspace.active_item_as::<Editor>(cx).unwrap();
@@ -2128,7 +2135,9 @@ async fn test_following_while_deactivated(cx_a: &mut TestAppContext, cx_b: &mut
// a opens a file in a new window
let (_, cx_a2) = client_a.build_test_workspace(&mut cx_a2).await;
cx_a2.update(|cx| cx.activate_window());
cx_a2.simulate_keystrokes("cmd-p 3 enter");
cx_a2.simulate_keystrokes("cmd-p");
cx_a2.run_until_parked();
cx_a2.simulate_keystrokes("3 enter");
cx_a2.run_until_parked();
// b starts following a again
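
The hunks above split "cmd-p <n> enter" into separate steps, apparently so the file finder can finish opening before the query is typed. A hedged sketch of that pattern as a helper (hypothetical name):

// Illustration only, not part of the diff.
fn open_via_file_finder(cx: &mut VisualTestContext, query: &str) {
    cx.simulate_keystrokes("cmd-p");
    cx.run_until_parked();
    cx.simulate_keystrokes(&format!("{query} enter"));
}
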


@@ -26,7 +26,7 @@ async fn test_sharing_an_ssh_remote_project(
.await;
// Set up project on remote FS
let (port, server_ssh) = SshRemoteClient::fake_server(cx_a, server_cx);
let (opts, server_ssh) = SshRemoteClient::fake_server(cx_a, server_cx);
let remote_fs = FakeFs::new(server_cx.executor());
remote_fs
.insert_tree(
@@ -67,7 +67,7 @@ async fn test_sharing_an_ssh_remote_project(
)
});
let client_ssh = SshRemoteClient::fake_client(port, cx_a).await;
let client_ssh = SshRemoteClient::fake_client(opts, cx_a).await;
let (project_a, worktree_id) = client_a
.build_ssh_project("/code/project1", client_ssh, cx_a)
.await;


@@ -1,5 +1,4 @@
use crate::{
auth::split_dev_server_token,
db::{tests::TestDb, NewUserParams, UserId},
executor::Executor,
rpc::{Principal, Server, ZedVersion, CLEANUP_TIMEOUT, RECONNECT_TIMEOUT},
@@ -204,7 +203,7 @@ impl TestServer {
.override_authenticate(move |cx| {
cx.spawn(|_| async move {
let access_token = "the-token".to_string();
Ok(Credentials::User {
Ok(Credentials {
user_id: user_id.to_proto(),
access_token,
})
@@ -213,7 +212,7 @@ impl TestServer {
.override_establish_connection(move |credentials, cx| {
assert_eq!(
credentials,
&Credentials::User {
&Credentials {
user_id: user_id.0 as u64,
access_token: "the-token".into()
}
@@ -297,7 +296,6 @@ impl TestServer {
collab_ui::init(&app_state, cx);
file_finder::init(cx);
menu::init();
dev_server_projects::init(client.clone(), cx);
settings::KeymapFile::load_asset(os_keymap, cx).unwrap();
language_model::LanguageModelRegistry::test(cx);
assistant::context_store::init(&client.clone().into());
@@ -319,135 +317,6 @@ impl TestServer {
client
}
pub async fn create_dev_server(
&self,
access_token: String,
cx: &mut TestAppContext,
) -> TestClient {
cx.update(|cx| {
if cx.has_global::<SettingsStore>() {
panic!("Same cx used to create two test clients")
}
let settings = SettingsStore::test(cx);
cx.set_global(settings);
release_channel::init(SemanticVersion::default(), cx);
client::init_settings(cx);
});
let (dev_server_id, _) = split_dev_server_token(&access_token).unwrap();
let clock = Arc::new(FakeSystemClock::default());
let http = FakeHttpClient::with_404_response();
let mut client = cx.update(|cx| Client::new(clock, http.clone(), cx));
let server = self.server.clone();
let db = self.app_state.db.clone();
let connection_killers = self.connection_killers.clone();
let forbid_connections = self.forbid_connections.clone();
Arc::get_mut(&mut client)
.unwrap()
.set_id(1)
.set_dev_server_token(client::DevServerToken(access_token.clone()))
.override_establish_connection(move |credentials, cx| {
assert_eq!(
credentials,
&Credentials::DevServer {
token: client::DevServerToken(access_token.to_string())
}
);
let server = server.clone();
let db = db.clone();
let connection_killers = connection_killers.clone();
let forbid_connections = forbid_connections.clone();
cx.spawn(move |cx| async move {
if forbid_connections.load(SeqCst) {
Err(EstablishConnectionError::other(anyhow!(
"server is forbidding connections"
)))
} else {
let (client_conn, server_conn, killed) =
Connection::in_memory(cx.background_executor().clone());
let (connection_id_tx, connection_id_rx) = oneshot::channel();
let dev_server = db
.get_dev_server(dev_server_id)
.await
.expect("retrieving dev_server failed");
cx.background_executor()
.spawn(server.handle_connection(
server_conn,
"dev-server".to_string(),
Principal::DevServer(dev_server),
ZedVersion(SemanticVersion::new(1, 0, 0)),
None,
Some(connection_id_tx),
Executor::Deterministic(cx.background_executor().clone()),
))
.detach();
let connection_id = connection_id_rx.await.map_err(|e| {
EstablishConnectionError::Other(anyhow!(
"{} (is server shutting down?)",
e
))
})?;
connection_killers
.lock()
.insert(connection_id.into(), killed);
Ok(client_conn)
}
})
});
let fs = FakeFs::new(cx.executor());
let user_store = cx.new_model(|cx| UserStore::new(client.clone(), cx));
let workspace_store = cx.new_model(|cx| WorkspaceStore::new(client.clone(), cx));
let language_registry = Arc::new(LanguageRegistry::test(cx.executor()));
let session = cx.new_model(|cx| AppSession::new(Session::test(), cx));
let app_state = Arc::new(workspace::AppState {
client: client.clone(),
user_store: user_store.clone(),
workspace_store,
languages: language_registry,
fs: fs.clone(),
build_window_options: |_, _| Default::default(),
node_runtime: NodeRuntime::unavailable(),
session,
});
cx.update(|cx| {
theme::init(theme::LoadThemes::JustBase, cx);
Project::init(&client, cx);
client::init(&client, cx);
language::init(cx);
editor::init(cx);
workspace::init(app_state.clone(), cx);
call::init(client.clone(), user_store.clone(), cx);
channel::init(&client, user_store.clone(), cx);
notifications::init(client.clone(), user_store, cx);
collab_ui::init(&app_state, cx);
file_finder::init(cx);
menu::init();
headless::init(
client.clone(),
headless::AppState {
languages: app_state.languages.clone(),
user_store: app_state.user_store.clone(),
fs: fs.clone(),
node_runtime: app_state.node_runtime.clone(),
},
cx,
)
})
.await
.unwrap();
TestClient {
app_state,
username: "dev-server".to_string(),
channel_store: cx.read(ChannelStore::global).clone(),
notification_store: cx.read(NotificationStore::global).clone(),
state: Default::default(),
}
}
pub fn disconnect_client(&self, peer_id: PeerId) {
self.connection_killers
.lock()


@@ -1,23 +0,0 @@
[package]
name = "dev_server_projects"
version = "0.1.0"
edition = "2021"
publish = false
license = "GPL-3.0-or-later"
[lints]
workspace = true
[lib]
path = "src/dev_server_projects.rs"
doctest = false
[dependencies]
anyhow.workspace = true
gpui.workspace = true
serde.workspace = true
client.workspace = true
rpc.workspace = true
[dev-dependencies]
serde_json.workspace = true


@@ -1 +0,0 @@
../../LICENSE-GPL


@@ -1,249 +1 @@
use anyhow::Result;
use gpui::{AppContext, AsyncAppContext, Context, Global, Model, ModelContext, SharedString, Task};
use rpc::{
proto::{self, DevServerStatus},
TypedEnvelope,
};
use std::{collections::HashMap, sync::Arc};
use client::{Client, ProjectId};
pub use client::{DevServerId, DevServerProjectId};
pub struct Store {
dev_server_projects: HashMap<DevServerProjectId, DevServerProject>,
dev_servers: HashMap<DevServerId, DevServer>,
_subscriptions: Vec<client::Subscription>,
client: Arc<Client>,
}
#[derive(Debug, Clone)]
pub struct DevServerProject {
pub id: DevServerProjectId,
pub project_id: Option<ProjectId>,
pub paths: Vec<SharedString>,
pub dev_server_id: DevServerId,
}
impl From<proto::DevServerProject> for DevServerProject {
fn from(project: proto::DevServerProject) -> Self {
Self {
id: DevServerProjectId(project.id),
project_id: project.project_id.map(ProjectId),
paths: project.paths.into_iter().map(|path| path.into()).collect(),
dev_server_id: DevServerId(project.dev_server_id),
}
}
}
#[derive(Debug, Clone)]
pub struct DevServer {
pub id: DevServerId,
pub name: SharedString,
pub ssh_connection_string: Option<SharedString>,
pub status: DevServerStatus,
}
impl From<proto::DevServer> for DevServer {
fn from(dev_server: proto::DevServer) -> Self {
Self {
id: DevServerId(dev_server.dev_server_id),
status: dev_server.status(),
name: dev_server.name.into(),
ssh_connection_string: dev_server.ssh_connection_string.map(|s| s.into()),
}
}
}
struct GlobalStore(Model<Store>);
impl Global for GlobalStore {}
pub fn init(client: Arc<Client>, cx: &mut AppContext) {
let store = cx.new_model(|cx| Store::new(client, cx));
cx.set_global(GlobalStore(store));
}
impl Store {
pub fn global(cx: &AppContext) -> Model<Store> {
cx.global::<GlobalStore>().0.clone()
}
pub fn new(client: Arc<Client>, cx: &ModelContext<Self>) -> Self {
Self {
dev_server_projects: Default::default(),
dev_servers: Default::default(),
_subscriptions: vec![client
.add_message_handler(cx.weak_model(), Self::handle_dev_server_projects_update)],
client,
}
}
pub fn projects_for_server(&self, id: DevServerId) -> Vec<DevServerProject> {
let mut projects: Vec<DevServerProject> = self
.dev_server_projects
.values()
.filter(|project| project.dev_server_id == id)
.cloned()
.collect();
projects.sort_by_key(|p| (p.paths.clone(), p.id));
projects
}
pub fn dev_servers(&self) -> Vec<DevServer> {
let mut dev_servers: Vec<DevServer> = self.dev_servers.values().cloned().collect();
dev_servers.sort_by_key(|d| (d.status == DevServerStatus::Offline, d.name.clone(), d.id));
dev_servers
}
pub fn dev_server(&self, id: DevServerId) -> Option<&DevServer> {
self.dev_servers.get(&id)
}
pub fn dev_server_status(&self, id: DevServerId) -> DevServerStatus {
self.dev_server(id)
.map(|server| server.status)
.unwrap_or(DevServerStatus::Offline)
}
pub fn dev_server_projects(&self) -> Vec<DevServerProject> {
let mut projects: Vec<DevServerProject> =
self.dev_server_projects.values().cloned().collect();
projects.sort_by_key(|p| (p.paths.clone(), p.id));
projects
}
pub fn dev_server_project(&self, id: DevServerProjectId) -> Option<&DevServerProject> {
self.dev_server_projects.get(&id)
}
pub fn dev_server_for_project(&self, id: DevServerProjectId) -> Option<&DevServer> {
self.dev_server_project(id)
.and_then(|project| self.dev_server(project.dev_server_id))
}
async fn handle_dev_server_projects_update(
this: Model<Self>,
envelope: TypedEnvelope<proto::DevServerProjectsUpdate>,
mut cx: AsyncAppContext,
) -> Result<()> {
this.update(&mut cx, |this, cx| {
this.dev_servers = envelope
.payload
.dev_servers
.into_iter()
.map(|dev_server| (DevServerId(dev_server.dev_server_id), dev_server.into()))
.collect();
this.dev_server_projects = envelope
.payload
.dev_server_projects
.into_iter()
.map(|project| (DevServerProjectId(project.id), project.into()))
.collect();
cx.notify();
})?;
Ok(())
}
pub fn create_dev_server_project(
&mut self,
dev_server_id: DevServerId,
path: String,
cx: &mut ModelContext<Self>,
) -> Task<Result<proto::CreateDevServerProjectResponse>> {
let client = self.client.clone();
cx.background_executor().spawn(async move {
client
.request(proto::CreateDevServerProject {
dev_server_id: dev_server_id.0,
path,
})
.await
})
}
pub fn create_dev_server(
&mut self,
name: String,
ssh_connection_string: Option<String>,
cx: &mut ModelContext<Self>,
) -> Task<Result<proto::CreateDevServerResponse>> {
let client = self.client.clone();
cx.background_executor().spawn(async move {
let result = client
.request(proto::CreateDevServer {
name,
ssh_connection_string,
})
.await?;
Ok(result)
})
}
pub fn rename_dev_server(
&mut self,
dev_server_id: DevServerId,
name: String,
ssh_connection_string: Option<String>,
cx: &mut ModelContext<Self>,
) -> Task<Result<()>> {
let client = self.client.clone();
cx.background_executor().spawn(async move {
client
.request(proto::RenameDevServer {
dev_server_id: dev_server_id.0,
name,
ssh_connection_string,
})
.await?;
Ok(())
})
}
pub fn regenerate_dev_server_token(
&mut self,
dev_server_id: DevServerId,
cx: &mut ModelContext<Self>,
) -> Task<Result<proto::RegenerateDevServerTokenResponse>> {
let client = self.client.clone();
cx.background_executor().spawn(async move {
client
.request(proto::RegenerateDevServerToken {
dev_server_id: dev_server_id.0,
})
.await
})
}
pub fn delete_dev_server(
&mut self,
id: DevServerId,
cx: &mut ModelContext<Self>,
) -> Task<Result<()>> {
let client = self.client.clone();
cx.background_executor().spawn(async move {
client
.request(proto::DeleteDevServer {
dev_server_id: id.0,
})
.await?;
Ok(())
})
}
pub fn delete_dev_server_project(
&mut self,
id: DevServerProjectId,
cx: &mut ModelContext<Self>,
) -> Task<Result<()>> {
let client = self.client.clone();
cx.background_executor().spawn(async move {
client
.request(proto::DeleteDevServerProject {
dev_server_project_id: id.0,
})
.await?;
Ok(())
})
}
}
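
Illustration only: the Store removed above was read through its global handle; a hedged sketch of a typical read-side caller (hypothetical function, using the same types as the removed file):

// Not part of the diff.
fn online_dev_servers(cx: &AppContext) -> Vec<DevServer> {
    let store = Store::global(cx);
    store
        .read(cx)
        .dev_servers()
        .into_iter()
        .filter(|server| server.status == DevServerStatus::Online)
        .collect()
}
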


@@ -9,7 +9,7 @@ use anyhow::Result;
use collections::{BTreeSet, HashSet};
use editor::{
diagnostic_block_renderer,
display_map::{BlockDisposition, BlockProperties, BlockStyle, CustomBlockId, RenderBlock},
display_map::{BlockPlacement, BlockProperties, BlockStyle, CustomBlockId, RenderBlock},
highlight_diagnostic_message,
scroll::Autoscroll,
Editor, EditorEvent, ExcerptId, ExcerptRange, MultiBuffer, ToOffset,
@@ -439,11 +439,10 @@ impl ProjectDiagnosticsEditor {
primary.message.split('\n').next().unwrap().to_string();
group_state.block_count += 1;
blocks_to_add.push(BlockProperties {
position: header_position,
placement: BlockPlacement::Above(header_position),
height: 2,
style: BlockStyle::Sticky,
render: diagnostic_header_renderer(primary),
disposition: BlockDisposition::Above,
priority: 0,
});
}
@@ -459,13 +458,15 @@ impl ProjectDiagnosticsEditor {
if !diagnostic.message.is_empty() {
group_state.block_count += 1;
blocks_to_add.push(BlockProperties {
position: (excerpt_id, entry.range.start),
placement: BlockPlacement::Below((
excerpt_id,
entry.range.start,
)),
height: diagnostic.message.matches('\n').count() as u32 + 1,
style: BlockStyle::Fixed,
render: diagnostic_block_renderer(
diagnostic, None, true, true,
),
disposition: BlockDisposition::Below,
priority: 0,
});
}
@@ -498,13 +499,24 @@ impl ProjectDiagnosticsEditor {
editor.remove_blocks(blocks_to_remove, None, cx);
let block_ids = editor.insert_blocks(
blocks_to_add.into_iter().flat_map(|block| {
let (excerpt_id, text_anchor) = block.position;
let placement = match block.placement {
BlockPlacement::Above((excerpt_id, text_anchor)) => BlockPlacement::Above(
excerpts_snapshot.anchor_in_excerpt(excerpt_id, text_anchor)?,
),
BlockPlacement::Below((excerpt_id, text_anchor)) => BlockPlacement::Below(
excerpts_snapshot.anchor_in_excerpt(excerpt_id, text_anchor)?,
),
BlockPlacement::Replace(_) => {
unreachable!(
"no Replace block should have been pushed to blocks_to_add"
)
}
};
Some(BlockProperties {
position: excerpts_snapshot.anchor_in_excerpt(excerpt_id, text_anchor)?,
placement,
height: block.height,
style: block.style,
render: block.render,
disposition: block.disposition,
priority: 0,
})
}),
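
For orientation while reading the hunk above: the placement variants it uses suggest roughly this shape (inferred from this diff only; the generic parameter and the Replace payload are assumed, and the real definition lives in the editor's display_map):

// Sketch inferred from the diff above, not the authoritative definition.
pub enum BlockPlacement<P> {
    Above(P),
    Below(P),
    Replace(std::ops::Range<P>), // payload assumed; only Replace(_) appears in this diff
}
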


@@ -81,7 +81,6 @@ ui.workspace = true
url.workspace = true
util.workspace = true
workspace.workspace = true
unicode-segmentation.workspace = true
[dev-dependencies]
ctor.workspace = true


@@ -153,6 +153,10 @@ pub struct DeleteToPreviousWordStart {
pub ignore_newlines: bool,
}
#[derive(PartialEq, Clone, Deserialize, Default)]
pub struct FoldAtLevel {
pub level: u32,
}
impl_actions!(
editor,
[
@@ -182,6 +186,7 @@ impl_actions!(
ToggleCodeActions,
ToggleComments,
UnfoldAt,
FoldAtLevel
]
);
@@ -193,6 +198,7 @@ gpui::actions!(
AcceptPartialInlineCompletion,
AddSelectionAbove,
AddSelectionBelow,
ApplyAllDiffHunks,
ApplyDiffHunk,
Backspace,
Cancel,


@@ -8,7 +8,7 @@
//! of several smaller structures that form a hierarchy (starting at the bottom):
//! - [`InlayMap`] that decides where the [`Inlay`]s should be displayed.
//! - [`FoldMap`] that decides where the fold indicators should be; it also tracks parts of a source file that are currently folded.
//! - [`CharMap`] that replaces tabs and non-printable characters
//! - [`TabMap`] that keeps track of hard tabs in a buffer.
//! - [`WrapMap`] that handles soft wrapping.
//! - [`BlockMap`] that tracks custom blocks such as diagnostics that should be displayed within buffer.
//! - [`DisplayMap`] that adds background highlights to the regions of text.
@@ -18,22 +18,20 @@
//! [EditorElement]: crate::element::EditorElement
mod block_map;
mod char_map;
mod crease_map;
mod fold_map;
mod inlay_map;
mod invisibles;
mod tab_map;
mod wrap_map;
use crate::{
hover_links::InlayHighlight, movement::TextLayoutDetails, EditorStyle, InlayId, RowExt,
};
pub use block_map::{
Block, BlockBufferRows, BlockChunks as DisplayChunks, BlockContext, BlockDisposition, BlockId,
BlockMap, BlockPoint, BlockProperties, BlockStyle, CustomBlockId, RenderBlock,
Block, BlockBufferRows, BlockChunks as DisplayChunks, BlockContext, BlockId, BlockMap,
BlockPlacement, BlockPoint, BlockProperties, BlockStyle, CustomBlockId, RenderBlock,
};
use block_map::{BlockRow, BlockSnapshot};
use char_map::{CharMap, CharSnapshot};
use collections::{HashMap, HashSet};
pub use crease_map::*;
pub use fold_map::{Fold, FoldId, FoldPlaceholder, FoldPoint};
@@ -44,7 +42,6 @@ use gpui::{
pub(crate) use inlay_map::Inlay;
use inlay_map::{InlayMap, InlaySnapshot};
pub use inlay_map::{InlayOffset, InlayPoint};
pub use invisibles::is_invisible;
use language::{
language_settings::language_settings, ChunkRenderer, OffsetUtf16, Point,
Subscription as BufferSubscription,
@@ -64,9 +61,9 @@ use std::{
sync::Arc,
};
use sum_tree::{Bias, TreeMap};
use tab_map::{TabMap, TabSnapshot};
use text::LineIndent;
use ui::{px, WindowContext};
use unicode_segmentation::UnicodeSegmentation;
use ui::WindowContext;
use wrap_map::{WrapMap, WrapSnapshot};
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
@@ -97,7 +94,7 @@ pub struct DisplayMap {
/// Decides where the fold indicators should be and tracks parts of a source file that are currently folded.
fold_map: FoldMap,
/// Keeps track of hard tabs in a buffer.
char_map: CharMap,
tab_map: TabMap,
/// Handles soft wrapping.
wrap_map: Model<WrapMap>,
/// Tracks custom blocks such as diagnostics that should be displayed within buffer.
@@ -134,7 +131,7 @@ impl DisplayMap {
let crease_map = CreaseMap::new(&buffer_snapshot);
let (inlay_map, snapshot) = InlayMap::new(buffer_snapshot);
let (fold_map, snapshot) = FoldMap::new(snapshot);
let (char_map, snapshot) = CharMap::new(snapshot, tab_size);
let (tab_map, snapshot) = TabMap::new(snapshot, tab_size);
let (wrap_map, snapshot) = WrapMap::new(snapshot, font, font_size, wrap_width, cx);
let block_map = BlockMap::new(
snapshot,
@@ -151,7 +148,7 @@ impl DisplayMap {
buffer_subscription,
fold_map,
inlay_map,
char_map,
tab_map,
wrap_map,
block_map,
crease_map,
@@ -169,17 +166,17 @@ impl DisplayMap {
let (inlay_snapshot, edits) = self.inlay_map.sync(buffer_snapshot, edits);
let (fold_snapshot, edits) = self.fold_map.read(inlay_snapshot.clone(), edits);
let tab_size = Self::tab_size(&self.buffer, cx);
let (char_snapshot, edits) = self.char_map.sync(fold_snapshot.clone(), edits, tab_size);
let (tab_snapshot, edits) = self.tab_map.sync(fold_snapshot.clone(), edits, tab_size);
let (wrap_snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(char_snapshot.clone(), edits, cx));
.update(cx, |map, cx| map.sync(tab_snapshot.clone(), edits, cx));
let block_snapshot = self.block_map.read(wrap_snapshot.clone(), edits).snapshot;
DisplaySnapshot {
buffer_snapshot: self.buffer.read(cx).snapshot(cx),
fold_snapshot,
inlay_snapshot,
char_snapshot,
tab_snapshot,
wrap_snapshot,
block_snapshot,
crease_snapshot: self.crease_map.snapshot(),
@@ -215,13 +212,13 @@ impl DisplayMap {
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (mut fold_map, snapshot, edits) = self.fold_map.write(snapshot, edits);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
self.block_map.read(snapshot, edits);
let (snapshot, edits) = fold_map.fold(ranges);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -239,13 +236,13 @@ impl DisplayMap {
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (mut fold_map, snapshot, edits) = self.fold_map.write(snapshot, edits);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
self.block_map.read(snapshot, edits);
let (snapshot, edits) = fold_map.unfold(ranges, inclusive);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -280,7 +277,7 @@ impl DisplayMap {
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -298,7 +295,7 @@ impl DisplayMap {
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -316,7 +313,7 @@ impl DisplayMap {
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -334,7 +331,7 @@ impl DisplayMap {
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -410,7 +407,7 @@ impl DisplayMap {
let (snapshot, edits) = self.inlay_map.sync(buffer_snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -418,7 +415,7 @@ impl DisplayMap {
let (snapshot, edits) = self.inlay_map.splice(to_remove, to_insert);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -470,7 +467,7 @@ pub struct DisplaySnapshot {
pub fold_snapshot: FoldSnapshot,
pub crease_snapshot: CreaseSnapshot,
inlay_snapshot: InlaySnapshot,
char_snapshot: CharSnapshot,
tab_snapshot: TabSnapshot,
wrap_snapshot: WrapSnapshot,
block_snapshot: BlockSnapshot,
text_highlights: TextHighlights,
@@ -570,8 +567,8 @@ impl DisplaySnapshot {
fn point_to_display_point(&self, point: MultiBufferPoint, bias: Bias) -> DisplayPoint {
let inlay_point = self.inlay_snapshot.to_inlay_point(point);
let fold_point = self.fold_snapshot.to_fold_point(inlay_point, bias);
let char_point = self.char_snapshot.to_char_point(fold_point);
let wrap_point = self.wrap_snapshot.char_point_to_wrap_point(char_point);
let tab_point = self.tab_snapshot.to_tab_point(fold_point);
let wrap_point = self.wrap_snapshot.tab_point_to_wrap_point(tab_point);
let block_point = self.block_snapshot.to_block_point(wrap_point);
DisplayPoint(block_point)
}
@@ -599,21 +596,21 @@ impl DisplaySnapshot {
fn display_point_to_inlay_point(&self, point: DisplayPoint, bias: Bias) -> InlayPoint {
let block_point = point.0;
let wrap_point = self.block_snapshot.to_wrap_point(block_point);
let char_point = self.wrap_snapshot.to_char_point(wrap_point);
let fold_point = self.char_snapshot.to_fold_point(char_point, bias).0;
let tab_point = self.wrap_snapshot.to_tab_point(wrap_point);
let fold_point = self.tab_snapshot.to_fold_point(tab_point, bias).0;
fold_point.to_inlay_point(&self.fold_snapshot)
}
pub fn display_point_to_fold_point(&self, point: DisplayPoint, bias: Bias) -> FoldPoint {
let block_point = point.0;
let wrap_point = self.block_snapshot.to_wrap_point(block_point);
let char_point = self.wrap_snapshot.to_char_point(wrap_point);
self.char_snapshot.to_fold_point(char_point, bias).0
let tab_point = self.wrap_snapshot.to_tab_point(wrap_point);
self.tab_snapshot.to_fold_point(tab_point, bias).0
}
pub fn fold_point_to_display_point(&self, fold_point: FoldPoint) -> DisplayPoint {
let char_point = self.char_snapshot.to_char_point(fold_point);
let wrap_point = self.wrap_snapshot.char_point_to_wrap_point(char_point);
let tab_point = self.tab_snapshot.to_tab_point(fold_point);
let wrap_point = self.wrap_snapshot.tab_point_to_wrap_point(tab_point);
let block_point = self.block_snapshot.to_block_point(wrap_point);
DisplayPoint(block_point)
}
@@ -691,23 +688,6 @@ impl DisplaySnapshot {
}
}
if chunk.is_invisible {
let invisible_highlight = HighlightStyle {
background_color: Some(editor_style.status.hint_background),
underline: Some(UnderlineStyle {
color: Some(editor_style.status.hint),
thickness: px(1.),
wavy: false,
}),
..Default::default()
};
if let Some(highlight_style) = highlight_style.as_mut() {
highlight_style.highlight(invisible_highlight);
} else {
highlight_style = Some(invisible_highlight);
}
}
let mut diagnostic_highlight = HighlightStyle::default();
if chunk.is_unnecessary {
@@ -804,11 +784,12 @@ impl DisplaySnapshot {
layout_line.closest_index_for_x(x) as u32
}
pub fn grapheme_at(&self, mut point: DisplayPoint) -> Option<String> {
pub fn display_chars_at(
&self,
mut point: DisplayPoint,
) -> impl Iterator<Item = (char, DisplayPoint)> + '_ {
point = DisplayPoint(self.block_snapshot.clip_point(point.0, Bias::Left));
let chars = self
.text_chunks(point.row())
self.text_chunks(point.row())
.flat_map(str::chars)
.skip_while({
let mut column = 0;
@@ -818,21 +799,16 @@ impl DisplaySnapshot {
!at_point
}
})
.take_while({
let mut prev = false;
move |char| {
let now = char.is_ascii();
let end = char.is_ascii() && (char.is_ascii_whitespace() || prev);
prev = now;
!end
.map(move |ch| {
let result = (ch, point);
if ch == '\n' {
*point.row_mut() += 1;
*point.column_mut() = 0;
} else {
*point.column_mut() += ch.len_utf8() as u32;
}
});
chars
.collect::<String>()
.graphemes(true)
.next()
.map(|s| s.to_owned())
result
})
}
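
The hunk above swaps grapheme_at for a character iterator. As a hedged illustration, a single-character lookup can still be recovered from it (simplified — the removed version collected a full grapheme cluster):

// Illustration only, not part of the diff.
fn first_display_char(snapshot: &DisplaySnapshot, point: DisplayPoint) -> Option<(char, DisplayPoint)> {
    snapshot.display_chars_at(point).next()
}
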
pub fn buffer_chars_at(&self, mut offset: usize) -> impl Iterator<Item = (char, usize)> + '_ {
@@ -1144,8 +1120,8 @@ impl DisplayPoint {
pub fn to_offset(self, map: &DisplaySnapshot, bias: Bias) -> usize {
let wrap_point = map.block_snapshot.to_wrap_point(self.0);
let char_point = map.wrap_snapshot.to_char_point(wrap_point);
let fold_point = map.char_snapshot.to_fold_point(char_point, bias).0;
let tab_point = map.wrap_snapshot.to_tab_point(wrap_point);
let fold_point = map.tab_snapshot.to_fold_point(tab_point, bias).0;
let inlay_point = fold_point.to_inlay_point(&map.fold_snapshot);
map.inlay_snapshot
.to_buffer_offset(map.inlay_snapshot.to_offset(inlay_point))
@@ -1180,6 +1156,7 @@ impl ToDisplayPoint for Anchor {
pub mod tests {
use super::*;
use crate::{movement, test::marked_display_snapshot};
use block_map::BlockPlacement;
use gpui::{div, font, observe, px, AppContext, BorrowAppContext, Context, Element, Hsla};
use language::{
language_settings::{AllLanguageSettings, AllLanguageSettingsContent},
@@ -1191,6 +1168,7 @@ pub mod tests {
use smol::stream::StreamExt;
use std::{env, sync::Arc};
use theme::{LoadThemes, SyntaxTheme};
use unindent::Unindent as _;
use util::test::{marked_text_ranges, sample_text};
use Bias::*;
@@ -1252,7 +1230,7 @@ pub mod tests {
let snapshot = map.update(cx, |map, cx| map.snapshot(cx));
log::info!("buffer text: {:?}", snapshot.buffer_snapshot.text());
log::info!("fold text: {:?}", snapshot.fold_snapshot.text());
log::info!("char text: {:?}", snapshot.char_snapshot.text());
log::info!("tab text: {:?}", snapshot.tab_snapshot.text());
log::info!("wrap text: {:?}", snapshot.wrap_snapshot.text());
log::info!("block text: {:?}", snapshot.block_snapshot.text());
log::info!("display text: {:?}", snapshot.text());
@@ -1293,24 +1271,22 @@ pub mod tests {
Bias::Left,
));
let disposition = if rng.gen() {
BlockDisposition::Above
let placement = if rng.gen() {
BlockPlacement::Above(position)
} else {
BlockDisposition::Below
BlockPlacement::Below(position)
};
let height = rng.gen_range(1..5);
log::info!(
"inserting block {:?} {:?} with height {}",
disposition,
position.to_point(&buffer),
"inserting block {:?} with height {}",
placement.as_ref().map(|p| p.to_point(&buffer)),
height
);
let priority = rng.gen_range(1..100);
BlockProperties {
placement,
style: BlockStyle::Fixed,
position,
height,
disposition,
render: Box::new(|_| div().into_any()),
priority,
}
@@ -1369,7 +1345,7 @@ pub mod tests {
fold_count = snapshot.fold_count();
log::info!("buffer text: {:?}", snapshot.buffer_snapshot.text());
log::info!("fold text: {:?}", snapshot.fold_snapshot.text());
log::info!("char text: {:?}", snapshot.char_snapshot.text());
log::info!("tab text: {:?}", snapshot.tab_snapshot.text());
log::info!("wrap text: {:?}", snapshot.wrap_snapshot.text());
log::info!("block text: {:?}", snapshot.block_snapshot.text());
log::info!("display text: {:?}", snapshot.text());
@@ -1649,8 +1625,6 @@ pub mod tests {
#[gpui::test]
async fn test_chunks(cx: &mut gpui::TestAppContext) {
use unindent::Unindent as _;
let text = r#"
fn outer() {}
@@ -1747,12 +1721,110 @@ pub mod tests {
);
}
#[gpui::test]
async fn test_chunks_with_syntax_highlighting_across_blocks(cx: &mut gpui::TestAppContext) {
cx.background_executor
.set_block_on_ticks(usize::MAX..=usize::MAX);
let text = r#"
const A: &str = "
one
two
three
";
const B: &str = "four";
"#
.unindent();
let theme = SyntaxTheme::new_test(vec![
("string", Hsla::red()),
("punctuation", Hsla::blue()),
("keyword", Hsla::green()),
]);
let language = Arc::new(
Language::new(
LanguageConfig {
name: "Rust".into(),
..Default::default()
},
Some(tree_sitter_rust::LANGUAGE.into()),
)
.with_highlights_query(
r#"
(string_literal) @string
"const" @keyword
[":" ";"] @punctuation
"#,
)
.unwrap(),
);
language.set_theme(&theme);
cx.update(|cx| init_test(cx, |_| {}));
let buffer = cx.new_model(|cx| Buffer::local(text, cx).with_language(language, cx));
cx.condition(&buffer, |buf, _| !buf.is_parsing()).await;
let buffer = cx.new_model(|cx| MultiBuffer::singleton(buffer, cx));
let buffer_snapshot = buffer.read_with(cx, |buffer, cx| buffer.snapshot(cx));
let map = cx.new_model(|cx| {
DisplayMap::new(
buffer,
font("Courier"),
px(16.0),
None,
true,
1,
1,
0,
FoldPlaceholder::test(),
cx,
)
});
// Insert a block in the middle of a multi-line string literal
map.update(cx, |map, cx| {
map.insert_blocks(
[BlockProperties {
placement: BlockPlacement::Below(
buffer_snapshot.anchor_before(Point::new(1, 0)),
),
height: 1,
style: BlockStyle::Sticky,
render: Box::new(|_| div().into_any()),
priority: 0,
}],
cx,
)
});
pretty_assertions::assert_eq!(
cx.update(|cx| syntax_chunks(DisplayRow(0)..DisplayRow(7), &map, &theme, cx)),
[
("const".into(), Some(Hsla::green())),
(" A".into(), None),
(":".into(), Some(Hsla::blue())),
(" &str = ".into(), None),
("\"\n one\n".into(), Some(Hsla::red())),
("\n".into(), None),
(" two\n three\n\"".into(), Some(Hsla::red())),
(";".into(), Some(Hsla::blue())),
("\n".into(), None),
("const".into(), Some(Hsla::green())),
(" B".into(), None),
(":".into(), Some(Hsla::blue())),
(" &str = ".into(), None),
("\"four\"".into(), Some(Hsla::red())),
(";".into(), Some(Hsla::blue())),
("\n".into(), None),
]
);
}
// todo(linux) fails due to pixel differences in text rendering
#[cfg(target_os = "macos")]
#[gpui::test]
async fn test_chunks_with_soft_wrapping(cx: &mut gpui::TestAppContext) {
use unindent::Unindent as _;
cx.background_executor
.set_block_on_ticks(usize::MAX..=usize::MAX);

File diff suppressed because it is too large


@@ -1100,6 +1100,17 @@ pub struct FoldBufferRows<'a> {
fold_point: FoldPoint,
}
impl<'a> FoldBufferRows<'a> {
pub(crate) fn seek(&mut self, row: u32) {
let fold_point = FoldPoint::new(row, 0);
self.cursor.seek(&fold_point, Bias::Left, &());
let overshoot = fold_point.0 - self.cursor.start().0 .0;
let inlay_point = InlayPoint(self.cursor.start().1 .0 + overshoot);
self.input_buffer_rows.seek(inlay_point.row());
self.fold_point = fold_point;
}
}
impl<'a> Iterator for FoldBufferRows<'a> {
type Item = Option<u32>;
@@ -1135,6 +1146,38 @@ pub struct FoldChunks<'a> {
max_output_offset: FoldOffset,
}
impl<'a> FoldChunks<'a> {
pub(crate) fn seek(&mut self, range: Range<FoldOffset>) {
self.transform_cursor.seek(&range.start, Bias::Right, &());
let inlay_start = {
let overshoot = range.start.0 - self.transform_cursor.start().0 .0;
self.transform_cursor.start().1 + InlayOffset(overshoot)
};
let transform_end = self.transform_cursor.end(&());
let inlay_end = if self
.transform_cursor
.item()
.map_or(true, |transform| transform.is_fold())
{
inlay_start
} else if range.end < transform_end.0 {
let overshoot = range.end.0 - self.transform_cursor.start().0 .0;
self.transform_cursor.start().1 + InlayOffset(overshoot)
} else {
transform_end.1
};
self.inlay_chunks.seek(inlay_start..inlay_end);
self.inlay_chunk = None;
self.inlay_offset = inlay_start;
self.output_offset = range.start;
self.max_output_offset = range.end;
}
}
impl<'a> Iterator for FoldChunks<'a> {
type Item = Chunk<'a>;


@@ -1,157 +0,0 @@
use std::sync::LazyLock;
use collections::HashMap;
// Invisibility in a Unicode context is not well defined, so we have to guess.
//
// We highlight all ASCII control codes, and unicode whitespace because they are likely
// confused with a normal space (U+0020).
//
// We also highlight the handful of blank non-space characters:
// U+2800 BRAILLE PATTERN BLANK - Category: So
// U+115F HANGUL CHOSEONG FILLER - Category: Lo
// U+1160 HANGUL CHOSEONG FILLER - Category: Lo
// U+3164 HANGUL FILLER - Category: Lo
// U+FFA0 HALFWIDTH HANGUL FILLER - Category: Lo
// U+FFFC OBJECT REPLACEMENT CHARACTER - Category: So
//
// For the rest of Unicode, invisibility happens for two reasons:
// * A Format character (like a byte order mark or right-to-left override)
// * An invisible Nonspacing Mark character (like U+034F, or variation selectors)
//
// We don't consider unassigned codepoints invisible as the font renderer already shows
// a replacement character in that case (and there are a *lot* of them)
//
// Control characters are mostly fine to highlight; except:
// * U+E0020..=U+E007F are used in emoji flags. We don't highlight them right now, but we could if we tightened our heuristics.
// * U+200D is used to join characters. We highlight this but don't replace it. As our font system ignores mid-glyph highlights this mostly works to highlight unexpected uses.
//
// Nonspacing marks are handled like U+200D. This means that mid-glyph we ignore them, but
// probably causes issues with end-of-glyph usage.
//
// ref: https://invisible-characters.com
// ref: https://www.compart.com/en/unicode/category/Cf
// ref: https://gist.github.com/ConradIrwin/f759e1fc29267143c4c7895aa495dca5?h=1
// ref: https://unicode.org/Public/emoji/13.0/emoji-test.txt
// https://github.com/bits/UTF-8-Unicode-Test-Documents/blob/master/UTF-8_sequence_separated/utf8_sequence_0-0x10ffff_assigned_including-unprintable-asis.txt
pub fn is_invisible(c: char) -> bool {
if c <= '\u{1f}' {
c != '\t' && c != '\n' && c != '\r'
} else if c >= '\u{7f}' {
c <= '\u{9f}' || c.is_whitespace() || contains(c, &FORMAT) || contains(c, &OTHER)
} else {
false
}
}
pub(crate) fn replacement(c: char) -> Option<&'static str> {
if !is_invisible(c) {
return None;
}
if c <= '\x7f' {
REPLACEMENTS.get(&c).copied()
} else if contains(c, &PRESERVE) {
None
} else {
Some(" ")
}
}
const REPLACEMENTS: LazyLock<HashMap<char, &'static str>> = LazyLock::new(|| {
[
('\x00', ""),
('\x01', ""),
('\x02', ""),
('\x03', ""),
('\x04', ""),
('\x05', ""),
('\x06', ""),
('\x07', ""),
('\x08', ""),
('\x0B', ""),
('\x0C', ""),
('\x0D', ""),
('\x0E', ""),
('\x0F', ""),
('\x10', ""),
('\x11', ""),
('\x12', ""),
('\x13', ""),
('\x14', ""),
('\x15', ""),
('\x16', ""),
('\x17', ""),
('\x18', ""),
('\x19', ""),
('\x1A', ""),
('\x1B', ""),
('\x1C', ""),
('\x1D', ""),
('\x1E', ""),
('\x1F', ""),
('\u{007F}', ""),
]
.into_iter()
.collect()
});
// generated using ucd-generate: ucd-generate general-category --include Format --chars ucd-16.0.0
pub const FORMAT: &'static [(char, char)] = &[
('\u{ad}', '\u{ad}'),
('\u{600}', '\u{605}'),
('\u{61c}', '\u{61c}'),
('\u{6dd}', '\u{6dd}'),
('\u{70f}', '\u{70f}'),
('\u{890}', '\u{891}'),
('\u{8e2}', '\u{8e2}'),
('\u{180e}', '\u{180e}'),
('\u{200b}', '\u{200f}'),
('\u{202a}', '\u{202e}'),
('\u{2060}', '\u{2064}'),
('\u{2066}', '\u{206f}'),
('\u{feff}', '\u{feff}'),
('\u{fff9}', '\u{fffb}'),
('\u{110bd}', '\u{110bd}'),
('\u{110cd}', '\u{110cd}'),
('\u{13430}', '\u{1343f}'),
('\u{1bca0}', '\u{1bca3}'),
('\u{1d173}', '\u{1d17a}'),
('\u{e0001}', '\u{e0001}'),
('\u{e0020}', '\u{e007f}'),
];
// hand-made, based on https://invisible-characters.com (Excluding Cf)
pub const OTHER: &'static [(char, char)] = &[
('\u{034f}', '\u{034f}'),
('\u{115F}', '\u{1160}'),
('\u{17b4}', '\u{17b5}'),
('\u{180b}', '\u{180d}'),
('\u{2800}', '\u{2800}'),
('\u{3164}', '\u{3164}'),
('\u{fe00}', '\u{fe0d}'),
('\u{ffa0}', '\u{ffa0}'),
('\u{fffc}', '\u{fffc}'),
('\u{e0100}', '\u{e01ef}'),
];
// a subset of FORMAT/OTHER that may appear within glyphs
const PRESERVE: &'static [(char, char)] = &[
('\u{034f}', '\u{034f}'),
('\u{200d}', '\u{200d}'),
('\u{17b4}', '\u{17b5}'),
('\u{180b}', '\u{180d}'),
('\u{e0061}', '\u{e007a}'),
('\u{e007f}', '\u{e007f}'),
];
fn contains(c: char, list: &[(char, char)]) -> bool {
for (start, end) in list {
if c < *start {
return false;
}
if c <= *end {
return true;
}
}
false
}
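
As a quick, hedged illustration of the removed heuristics (values taken from the FORMAT, OTHER, and PRESERVE tables above; not part of the diff):

#[test]
fn invisibles_sanity_check() {
    assert!(is_invisible('\u{200b}')); // ZERO WIDTH SPACE, listed in FORMAT
    assert!(is_invisible('\u{2800}')); // BRAILLE PATTERN BLANK, listed in OTHER
    assert!(!is_invisible('\t')); // tabs are handled by the tab map, not flagged here
    assert_eq!(replacement('\u{200d}'), None); // ZERO WIDTH JOINER is preserved mid-glyph
}
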


@@ -1,6 +1,5 @@
use super::{
fold_map::{self, FoldChunks, FoldEdit, FoldPoint, FoldSnapshot},
invisibles::{is_invisible, replacement},
Highlights,
};
use language::{Chunk, Point};
@@ -10,14 +9,14 @@ use sum_tree::Bias;
const MAX_EXPANSION_COLUMN: u32 = 256;
/// Keeps track of hard tabs and non-printable characters in a text buffer.
/// Keeps track of hard tabs in a text buffer.
///
/// See the [`display_map` module documentation](crate::display_map) for more information.
pub struct CharMap(CharSnapshot);
pub struct TabMap(TabSnapshot);
impl CharMap {
pub fn new(fold_snapshot: FoldSnapshot, tab_size: NonZeroU32) -> (Self, CharSnapshot) {
let snapshot = CharSnapshot {
impl TabMap {
pub fn new(fold_snapshot: FoldSnapshot, tab_size: NonZeroU32) -> (Self, TabSnapshot) {
let snapshot = TabSnapshot {
fold_snapshot,
tab_size,
max_expansion_column: MAX_EXPANSION_COLUMN,
@@ -27,7 +26,7 @@ impl CharMap {
}
#[cfg(test)]
pub fn set_max_expansion_column(&mut self, column: u32) -> CharSnapshot {
pub fn set_max_expansion_column(&mut self, column: u32) -> TabSnapshot {
self.0.max_expansion_column = column;
self.0.clone()
}
@@ -37,9 +36,9 @@ impl CharMap {
fold_snapshot: FoldSnapshot,
mut fold_edits: Vec<FoldEdit>,
tab_size: NonZeroU32,
) -> (CharSnapshot, Vec<TabEdit>) {
) -> (TabSnapshot, Vec<TabEdit>) {
let old_snapshot = &mut self.0;
let mut new_snapshot = CharSnapshot {
let mut new_snapshot = TabSnapshot {
fold_snapshot,
tab_size,
max_expansion_column: old_snapshot.max_expansion_column,
@@ -138,15 +137,15 @@ impl CharMap {
let new_start = fold_edit.new.start.to_point(&new_snapshot.fold_snapshot);
let new_end = fold_edit.new.end.to_point(&new_snapshot.fold_snapshot);
tab_edits.push(TabEdit {
old: old_snapshot.to_char_point(old_start)..old_snapshot.to_char_point(old_end),
new: new_snapshot.to_char_point(new_start)..new_snapshot.to_char_point(new_end),
old: old_snapshot.to_tab_point(old_start)..old_snapshot.to_tab_point(old_end),
new: new_snapshot.to_tab_point(new_start)..new_snapshot.to_tab_point(new_end),
});
}
} else {
new_snapshot.version += 1;
tab_edits.push(TabEdit {
old: CharPoint::zero()..old_snapshot.max_point(),
new: CharPoint::zero()..new_snapshot.max_point(),
old: TabPoint::zero()..old_snapshot.max_point(),
new: TabPoint::zero()..new_snapshot.max_point(),
});
}
@@ -156,14 +155,14 @@ impl CharMap {
}
#[derive(Clone)]
pub struct CharSnapshot {
pub struct TabSnapshot {
pub fold_snapshot: FoldSnapshot,
pub tab_size: NonZeroU32,
pub max_expansion_column: u32,
pub version: usize,
}
impl CharSnapshot {
impl TabSnapshot {
pub fn buffer_snapshot(&self) -> &MultiBufferSnapshot {
&self.fold_snapshot.inlay_snapshot.buffer
}
@@ -171,7 +170,7 @@ impl CharSnapshot {
pub fn line_len(&self, row: u32) -> u32 {
let max_point = self.max_point();
if row < max_point.row() {
self.to_char_point(FoldPoint::new(row, self.fold_snapshot.line_len(row)))
self.to_tab_point(FoldPoint::new(row, self.fold_snapshot.line_len(row)))
.0
.column
} else {
@@ -180,10 +179,10 @@ impl CharSnapshot {
}
pub fn text_summary(&self) -> TextSummary {
self.text_summary_for_range(CharPoint::zero()..self.max_point())
self.text_summary_for_range(TabPoint::zero()..self.max_point())
}
pub fn text_summary_for_range(&self, range: Range<CharPoint>) -> TextSummary {
pub fn text_summary_for_range(&self, range: Range<TabPoint>) -> TextSummary {
let input_start = self.to_fold_point(range.start, Bias::Left).0;
let input_end = self.to_fold_point(range.end, Bias::Right).0;
let input_summary = self
@@ -212,7 +211,7 @@ impl CharSnapshot {
} else {
for _ in self
.chunks(
CharPoint::new(range.end.row(), 0)..range.end,
TabPoint::new(range.end.row(), 0)..range.end,
false,
Highlights::default(),
)
@@ -233,7 +232,7 @@ impl CharSnapshot {
pub fn chunks<'a>(
&'a self,
range: Range<CharPoint>,
range: Range<TabPoint>,
language_aware: bool,
highlights: Highlights<'a>,
) -> TabChunks<'a> {
@@ -252,6 +251,7 @@ impl CharSnapshot {
};
TabChunks {
snapshot: self,
fold_chunks: self.fold_snapshot.chunks(
input_start..input_end,
language_aware,
@@ -279,7 +279,7 @@ impl CharSnapshot {
#[cfg(test)]
pub fn text(&self) -> String {
self.chunks(
CharPoint::zero()..self.max_point(),
TabPoint::zero()..self.max_point(),
false,
Highlights::default(),
)
@@ -287,24 +287,24 @@ impl CharSnapshot {
.collect()
}
pub fn max_point(&self) -> CharPoint {
self.to_char_point(self.fold_snapshot.max_point())
pub fn max_point(&self) -> TabPoint {
self.to_tab_point(self.fold_snapshot.max_point())
}
pub fn clip_point(&self, point: CharPoint, bias: Bias) -> CharPoint {
self.to_char_point(
pub fn clip_point(&self, point: TabPoint, bias: Bias) -> TabPoint {
self.to_tab_point(
self.fold_snapshot
.clip_point(self.to_fold_point(point, bias).0, bias),
)
}
pub fn to_char_point(&self, input: FoldPoint) -> CharPoint {
pub fn to_tab_point(&self, input: FoldPoint) -> TabPoint {
let chars = self.fold_snapshot.chars_at(FoldPoint::new(input.row(), 0));
let expanded = self.expand_tabs(chars, input.column());
CharPoint::new(input.row(), expanded)
TabPoint::new(input.row(), expanded)
}
pub fn to_fold_point(&self, output: CharPoint, bias: Bias) -> (FoldPoint, u32, u32) {
pub fn to_fold_point(&self, output: TabPoint, bias: Bias) -> (FoldPoint, u32, u32) {
let chars = self.fold_snapshot.chars_at(FoldPoint::new(output.row(), 0));
let expanded = output.column();
let (collapsed, expanded_char_column, to_next_stop) =
@@ -316,13 +316,13 @@ impl CharSnapshot {
)
}
pub fn make_char_point(&self, point: Point, bias: Bias) -> CharPoint {
pub fn make_tab_point(&self, point: Point, bias: Bias) -> TabPoint {
let inlay_point = self.fold_snapshot.inlay_snapshot.to_inlay_point(point);
let fold_point = self.fold_snapshot.to_fold_point(inlay_point, bias);
self.to_char_point(fold_point)
self.to_tab_point(fold_point)
}
pub fn to_point(&self, point: CharPoint, bias: Bias) -> Point {
pub fn to_point(&self, point: TabPoint, bias: Bias) -> Point {
let fold_point = self.to_fold_point(point, bias).0;
let inlay_point = fold_point.to_inlay_point(&self.fold_snapshot);
self.fold_snapshot
@@ -345,9 +345,6 @@ impl CharSnapshot {
let tab_len = tab_size - expanded_chars % tab_size;
expanded_bytes += tab_len;
expanded_chars += tab_len;
} else if let Some(replacement) = replacement(c) {
expanded_chars += replacement.chars().count() as u32;
expanded_bytes += replacement.len() as u32;
} else {
expanded_bytes += c.len_utf8() as u32;
expanded_chars += 1;
@@ -387,9 +384,6 @@ impl CharSnapshot {
Bias::Right => (collapsed_bytes + 1, expanded_chars, 0),
};
}
} else if let Some(replacement) = replacement(c) {
expanded_chars += replacement.chars().count() as u32;
expanded_bytes += replacement.len() as u32;
} else {
expanded_chars += 1;
expanded_bytes += c.len_utf8() as u32;
@@ -411,9 +405,9 @@ impl CharSnapshot {
}
#[derive(Copy, Clone, Debug, Default, Eq, Ord, PartialOrd, PartialEq)]
pub struct CharPoint(pub Point);
pub struct TabPoint(pub Point);
impl CharPoint {
impl TabPoint {
pub fn new(row: u32, column: u32) -> Self {
Self(Point::new(row, column))
}
@@ -431,13 +425,13 @@ impl CharPoint {
}
}
impl From<Point> for CharPoint {
impl From<Point> for TabPoint {
fn from(point: Point) -> Self {
Self(point)
}
}
pub type TabEdit = text::Edit<CharPoint>;
pub type TabEdit = text::Edit<TabPoint>;
#[derive(Clone, Debug, Default, Eq, PartialEq)]
pub struct TextSummary {
@@ -492,6 +486,7 @@ impl<'a> std::ops::AddAssign<&'a Self> for TextSummary {
const SPACES: &str = " ";
pub struct TabChunks<'a> {
snapshot: &'a TabSnapshot,
fold_chunks: FoldChunks<'a>,
chunk: Chunk<'a>,
column: u32,
@@ -503,6 +498,37 @@ pub struct TabChunks<'a> {
inside_leading_tab: bool,
}
impl<'a> TabChunks<'a> {
pub(crate) fn seek(&mut self, range: Range<TabPoint>) {
let (input_start, expanded_char_column, to_next_stop) =
self.snapshot.to_fold_point(range.start, Bias::Left);
let input_column = input_start.column();
let input_start = input_start.to_offset(&self.snapshot.fold_snapshot);
let input_end = self
.snapshot
.to_fold_point(range.end, Bias::Right)
.0
.to_offset(&self.snapshot.fold_snapshot);
let to_next_stop = if range.start.0 + Point::new(0, to_next_stop) > range.end.0 {
range.end.column() - range.start.column()
} else {
to_next_stop
};
self.fold_chunks.seek(input_start..input_end);
self.input_column = input_column;
self.column = expanded_char_column;
self.output_position = range.start.0;
self.max_output_position = range.end.0;
self.chunk = Chunk {
text: &SPACES[0..(to_next_stop as usize)],
is_tab: true,
..Default::default()
};
self.inside_leading_tab = to_next_stop > 0;
}
}
impl<'a> Iterator for TabChunks<'a> {
type Item = Chunk<'a>;
@@ -558,37 +584,6 @@ impl<'a> Iterator for TabChunks<'a> {
self.input_column = 0;
self.output_position += Point::new(1, 0);
}
_ if is_invisible(c) => {
if ix > 0 {
let (prefix, suffix) = self.chunk.text.split_at(ix);
self.chunk.text = suffix;
return Some(Chunk {
text: prefix,
is_invisible: false,
..self.chunk.clone()
});
}
let c_len = c.len_utf8();
let replacement = replacement(c).unwrap_or(&self.chunk.text[..c_len]);
if self.chunk.text.len() >= c_len {
self.chunk.text = &self.chunk.text[c_len..];
} else {
self.chunk.text = "";
}
let len = replacement.chars().count() as u32;
let next_output_position = cmp::min(
self.output_position + Point::new(0, len),
self.max_output_position,
);
self.column += len;
self.input_column += 1;
self.output_position = next_output_position;
return Some(Chunk {
text: replacement,
is_invisible: true,
..self.chunk.clone()
});
}
_ => {
self.column += 1;
if !self.inside_leading_tab {
@@ -618,11 +613,11 @@ mod tests {
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, char_snapshot) = CharMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
assert_eq!(char_snapshot.expand_tabs("\t".chars(), 0), 0);
assert_eq!(char_snapshot.expand_tabs("\t".chars(), 1), 4);
assert_eq!(char_snapshot.expand_tabs("\ta".chars(), 2), 5);
assert_eq!(tab_snapshot.expand_tabs("\t".chars(), 0), 0);
assert_eq!(tab_snapshot.expand_tabs("\t".chars(), 1), 4);
assert_eq!(tab_snapshot.expand_tabs("\ta".chars(), 2), 5);
}
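
A worked sketch of the tab-stop arithmetic these assertions exercise (tab_size = 4; the helper name is hypothetical):

// The expansion step used above is `tab_size - col % tab_size`.
fn next_tab_stop(col: u32, tab_size: u32) -> u32 {
    col + (tab_size - col % tab_size) // next_tab_stop(0, 4) == 4, so expand_tabs("\ta", 2) == 4 + 1 == 5
}
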
#[gpui::test]
@@ -635,16 +630,16 @@ mod tests {
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, mut char_snapshot) = CharMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, mut tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
char_snapshot.max_expansion_column = max_expansion_column;
assert_eq!(char_snapshot.text(), output);
tab_snapshot.max_expansion_column = max_expansion_column;
assert_eq!(tab_snapshot.text(), output);
for (ix, c) in input.char_indices() {
assert_eq!(
char_snapshot
tab_snapshot
.chunks(
CharPoint::new(0, ix as u32)..char_snapshot.max_point(),
TabPoint::new(0, ix as u32)..tab_snapshot.max_point(),
false,
Highlights::default(),
)
@@ -658,13 +653,13 @@ mod tests {
let input_point = Point::new(0, ix as u32);
let output_point = Point::new(0, output.find(c).unwrap() as u32);
assert_eq!(
char_snapshot.to_char_point(FoldPoint(input_point)),
CharPoint(output_point),
"to_char_point({input_point:?})"
tab_snapshot.to_tab_point(FoldPoint(input_point)),
TabPoint(output_point),
"to_tab_point({input_point:?})"
);
assert_eq!(
char_snapshot
.to_fold_point(CharPoint(output_point), Bias::Left)
tab_snapshot
.to_fold_point(TabPoint(output_point), Bias::Left)
.0,
FoldPoint(input_point),
"to_fold_point({output_point:?})"
@@ -682,10 +677,10 @@ mod tests {
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, mut char_snapshot) = CharMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, mut tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
char_snapshot.max_expansion_column = max_expansion_column;
assert_eq!(char_snapshot.text(), input);
tab_snapshot.max_expansion_column = max_expansion_column;
assert_eq!(tab_snapshot.text(), input);
}
#[gpui::test]
@@ -696,10 +691,10 @@ mod tests {
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, char_snapshot) = CharMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
assert_eq!(
chunks(&char_snapshot, CharPoint::zero()),
chunks(&tab_snapshot, TabPoint::zero()),
vec![
(" ".to_string(), true),
(" ".to_string(), false),
@@ -708,7 +703,7 @@ mod tests {
]
);
assert_eq!(
chunks(&char_snapshot, CharPoint::new(0, 2)),
chunks(&tab_snapshot, TabPoint::new(0, 2)),
vec![
(" ".to_string(), true),
(" ".to_string(), false),
@@ -717,7 +712,7 @@ mod tests {
]
);
fn chunks(snapshot: &CharSnapshot, start: CharPoint) -> Vec<(String, bool)> {
fn chunks(snapshot: &TabSnapshot, start: TabPoint) -> Vec<(String, bool)> {
let mut chunks = Vec::new();
let mut was_tab = false;
let mut text = String::new();
@@ -763,12 +758,12 @@ mod tests {
let (inlay_snapshot, _) = inlay_map.randomly_mutate(&mut 0, &mut rng);
log::info!("InlayMap text: {:?}", inlay_snapshot.text());
let (mut char_map, _) = CharMap::new(fold_snapshot.clone(), tab_size);
let tabs_snapshot = char_map.set_max_expansion_column(32);
let (mut tab_map, _) = TabMap::new(fold_snapshot.clone(), tab_size);
let tabs_snapshot = tab_map.set_max_expansion_column(32);
let text = text::Rope::from(tabs_snapshot.text().as_str());
log::info!(
"CharMap text (tab size: {}): {:?}",
"TabMap text (tab size: {}): {:?}",
tab_size,
tabs_snapshot.text(),
);
@@ -776,11 +771,11 @@ mod tests {
for _ in 0..5 {
let end_row = rng.gen_range(0..=text.max_point().row);
let end_column = rng.gen_range(0..=text.line_len(end_row));
let mut end = CharPoint(text.clip_point(Point::new(end_row, end_column), Bias::Right));
let mut end = TabPoint(text.clip_point(Point::new(end_row, end_column), Bias::Right));
let start_row = rng.gen_range(0..=text.max_point().row);
let start_column = rng.gen_range(0..=text.line_len(start_row));
let mut start =
CharPoint(text.clip_point(Point::new(start_row, start_column), Bias::Left));
TabPoint(text.clip_point(Point::new(start_row, start_column), Bias::Left));
if start > end {
mem::swap(&mut start, &mut end);
}


@@ -1,6 +1,6 @@
use super::{
char_map::{self, CharPoint, CharSnapshot, TabEdit},
fold_map::FoldBufferRows,
tab_map::{self, TabEdit, TabPoint, TabSnapshot},
Highlights,
};
use gpui::{AppContext, Context, Font, LineWrapper, Model, ModelContext, Pixels, Task};
@@ -12,7 +12,7 @@ use std::{cmp, collections::VecDeque, mem, ops::Range, time::Duration};
use sum_tree::{Bias, Cursor, SumTree};
use text::Patch;
pub use super::char_map::TextSummary;
pub use super::tab_map::TextSummary;
pub type WrapEdit = text::Edit<u32>;
/// Handles soft wrapping of text.
@@ -20,7 +20,7 @@ pub type WrapEdit = text::Edit<u32>;
/// See the [`display_map` module documentation](crate::display_map) for more information.
pub struct WrapMap {
snapshot: WrapSnapshot,
pending_edits: VecDeque<(CharSnapshot, Vec<TabEdit>)>,
pending_edits: VecDeque<(TabSnapshot, Vec<TabEdit>)>,
interpolated_edits: Patch<u32>,
edits_since_sync: Patch<u32>,
wrap_width: Option<Pixels>,
@@ -30,7 +30,7 @@ pub struct WrapMap {
#[derive(Clone)]
pub struct WrapSnapshot {
char_snapshot: CharSnapshot,
tab_snapshot: TabSnapshot,
transforms: SumTree<Transform>,
interpolated: bool,
}
@@ -51,11 +51,12 @@ struct TransformSummary {
pub struct WrapPoint(pub Point);
pub struct WrapChunks<'a> {
input_chunks: char_map::TabChunks<'a>,
input_chunks: tab_map::TabChunks<'a>,
input_chunk: Chunk<'a>,
output_position: WrapPoint,
max_output_row: u32,
transforms: Cursor<'a, Transform, (WrapPoint, CharPoint)>,
transforms: Cursor<'a, Transform, (WrapPoint, TabPoint)>,
snapshot: &'a WrapSnapshot,
}
#[derive(Clone)]
@@ -65,12 +66,27 @@ pub struct WrapBufferRows<'a> {
output_row: u32,
soft_wrapped: bool,
max_output_row: u32,
transforms: Cursor<'a, Transform, (WrapPoint, CharPoint)>,
transforms: Cursor<'a, Transform, (WrapPoint, TabPoint)>,
}
impl<'a> WrapBufferRows<'a> {
pub(crate) fn seek(&mut self, start_row: u32) {
self.transforms
.seek(&WrapPoint::new(start_row, 0), Bias::Left, &());
let mut input_row = self.transforms.start().1.row();
if self.transforms.item().map_or(false, |t| t.is_isomorphic()) {
input_row += start_row - self.transforms.start().0.row();
}
self.soft_wrapped = self.transforms.item().map_or(false, |t| !t.is_isomorphic());
self.input_buffer_rows.seek(input_row);
self.input_buffer_row = self.input_buffer_rows.next().unwrap();
self.output_row = start_row;
}
}
impl WrapMap {
pub fn new(
char_snapshot: CharSnapshot,
tab_snapshot: TabSnapshot,
font: Font,
font_size: Pixels,
wrap_width: Option<Pixels>,
@@ -83,7 +99,7 @@ impl WrapMap {
pending_edits: Default::default(),
interpolated_edits: Default::default(),
edits_since_sync: Default::default(),
snapshot: WrapSnapshot::new(char_snapshot),
snapshot: WrapSnapshot::new(tab_snapshot),
background_task: None,
};
this.set_wrap_width(wrap_width, cx);
@@ -101,17 +117,17 @@ impl WrapMap {
pub fn sync(
&mut self,
char_snapshot: CharSnapshot,
tab_snapshot: TabSnapshot,
edits: Vec<TabEdit>,
cx: &mut ModelContext<Self>,
) -> (WrapSnapshot, Patch<u32>) {
if self.wrap_width.is_some() {
self.pending_edits.push_back((char_snapshot, edits));
self.pending_edits.push_back((tab_snapshot, edits));
self.flush_edits(cx);
} else {
self.edits_since_sync = self
.edits_since_sync
.compose(self.snapshot.interpolate(char_snapshot, &edits));
.compose(self.snapshot.interpolate(tab_snapshot, &edits));
self.snapshot.interpolated = false;
}
@@ -161,11 +177,11 @@ impl WrapMap {
let (font, font_size) = self.font_with_size.clone();
let task = cx.background_executor().spawn(async move {
let mut line_wrapper = text_system.line_wrapper(font, font_size);
let char_snapshot = new_snapshot.char_snapshot.clone();
let range = CharPoint::zero()..char_snapshot.max_point();
let tab_snapshot = new_snapshot.tab_snapshot.clone();
let range = TabPoint::zero()..tab_snapshot.max_point();
let edits = new_snapshot
.update(
char_snapshot,
tab_snapshot,
&[TabEdit {
old: range.clone(),
new: range.clone(),
@@ -205,7 +221,7 @@ impl WrapMap {
} else {
let old_rows = self.snapshot.transforms.summary().output.lines.row + 1;
self.snapshot.transforms = SumTree::default();
let summary = self.snapshot.char_snapshot.text_summary();
let summary = self.snapshot.tab_snapshot.text_summary();
if !summary.lines.is_zero() {
self.snapshot
.transforms
@@ -223,8 +239,8 @@ impl WrapMap {
fn flush_edits(&mut self, cx: &mut ModelContext<Self>) {
if !self.snapshot.interpolated {
let mut to_remove_len = 0;
for (char_snapshot, _) in &self.pending_edits {
if char_snapshot.version <= self.snapshot.char_snapshot.version {
for (tab_snapshot, _) in &self.pending_edits {
if tab_snapshot.version <= self.snapshot.tab_snapshot.version {
to_remove_len += 1;
} else {
break;
@@ -246,9 +262,9 @@ impl WrapMap {
let update_task = cx.background_executor().spawn(async move {
let mut edits = Patch::default();
let mut line_wrapper = text_system.line_wrapper(font, font_size);
for (char_snapshot, tab_edits) in pending_edits {
for (tab_snapshot, tab_edits) in pending_edits {
let wrap_edits = snapshot
.update(char_snapshot, &tab_edits, wrap_width, &mut line_wrapper)
.update(tab_snapshot, &tab_edits, wrap_width, &mut line_wrapper)
.await;
edits = edits.compose(&wrap_edits);
}
@@ -285,11 +301,11 @@ impl WrapMap {
let was_interpolated = self.snapshot.interpolated;
let mut to_remove_len = 0;
for (char_snapshot, edits) in &self.pending_edits {
if char_snapshot.version <= self.snapshot.char_snapshot.version {
for (tab_snapshot, edits) in &self.pending_edits {
if tab_snapshot.version <= self.snapshot.tab_snapshot.version {
to_remove_len += 1;
} else {
let interpolated_edits = self.snapshot.interpolate(char_snapshot.clone(), edits);
let interpolated_edits = self.snapshot.interpolate(tab_snapshot.clone(), edits);
self.edits_since_sync = self.edits_since_sync.compose(&interpolated_edits);
self.interpolated_edits = self.interpolated_edits.compose(&interpolated_edits);
}
@@ -302,49 +318,45 @@ impl WrapMap {
}
impl WrapSnapshot {
fn new(char_snapshot: CharSnapshot) -> Self {
fn new(tab_snapshot: TabSnapshot) -> Self {
let mut transforms = SumTree::default();
let extent = char_snapshot.text_summary();
let extent = tab_snapshot.text_summary();
if !extent.lines.is_zero() {
transforms.push(Transform::isomorphic(extent), &());
}
Self {
transforms,
char_snapshot,
tab_snapshot,
interpolated: true,
}
}
pub fn buffer_snapshot(&self) -> &MultiBufferSnapshot {
self.char_snapshot.buffer_snapshot()
self.tab_snapshot.buffer_snapshot()
}
fn interpolate(
&mut self,
new_char_snapshot: CharSnapshot,
tab_edits: &[TabEdit],
) -> Patch<u32> {
fn interpolate(&mut self, new_tab_snapshot: TabSnapshot, tab_edits: &[TabEdit]) -> Patch<u32> {
let mut new_transforms;
if tab_edits.is_empty() {
new_transforms = self.transforms.clone();
} else {
let mut old_cursor = self.transforms.cursor::<CharPoint>(&());
let mut old_cursor = self.transforms.cursor::<TabPoint>(&());
let mut tab_edits_iter = tab_edits.iter().peekable();
new_transforms =
old_cursor.slice(&tab_edits_iter.peek().unwrap().old.start, Bias::Right, &());
while let Some(edit) = tab_edits_iter.next() {
if edit.new.start > CharPoint::from(new_transforms.summary().input.lines) {
let summary = new_char_snapshot.text_summary_for_range(
CharPoint::from(new_transforms.summary().input.lines)..edit.new.start,
if edit.new.start > TabPoint::from(new_transforms.summary().input.lines) {
let summary = new_tab_snapshot.text_summary_for_range(
TabPoint::from(new_transforms.summary().input.lines)..edit.new.start,
);
new_transforms.push_or_extend(Transform::isomorphic(summary));
}
if !edit.new.is_empty() {
new_transforms.push_or_extend(Transform::isomorphic(
new_char_snapshot.text_summary_for_range(edit.new.clone()),
new_tab_snapshot.text_summary_for_range(edit.new.clone()),
));
}
@@ -353,7 +365,7 @@ impl WrapSnapshot {
if next_edit.old.start > old_cursor.end(&()) {
if old_cursor.end(&()) > edit.old.end {
let summary = self
.char_snapshot
.tab_snapshot
.text_summary_for_range(edit.old.end..old_cursor.end(&()));
new_transforms.push_or_extend(Transform::isomorphic(summary));
}
@@ -367,7 +379,7 @@ impl WrapSnapshot {
} else {
if old_cursor.end(&()) > edit.old.end {
let summary = self
.char_snapshot
.tab_snapshot
.text_summary_for_range(edit.old.end..old_cursor.end(&()));
new_transforms.push_or_extend(Transform::isomorphic(summary));
}
@@ -380,7 +392,7 @@ impl WrapSnapshot {
let old_snapshot = mem::replace(
self,
WrapSnapshot {
char_snapshot: new_char_snapshot,
tab_snapshot: new_tab_snapshot,
transforms: new_transforms,
interpolated: true,
},
@@ -391,7 +403,7 @@ impl WrapSnapshot {
async fn update(
&mut self,
new_char_snapshot: CharSnapshot,
new_tab_snapshot: TabSnapshot,
tab_edits: &[TabEdit],
wrap_width: Pixels,
line_wrapper: &mut LineWrapper,
@@ -428,27 +440,27 @@ impl WrapSnapshot {
new_transforms = self.transforms.clone();
} else {
let mut row_edits = row_edits.into_iter().peekable();
let mut old_cursor = self.transforms.cursor::<CharPoint>(&());
let mut old_cursor = self.transforms.cursor::<TabPoint>(&());
new_transforms = old_cursor.slice(
&CharPoint::new(row_edits.peek().unwrap().old_rows.start, 0),
&TabPoint::new(row_edits.peek().unwrap().old_rows.start, 0),
Bias::Right,
&(),
);
while let Some(edit) = row_edits.next() {
if edit.new_rows.start > new_transforms.summary().input.lines.row {
let summary = new_char_snapshot.text_summary_for_range(
CharPoint(new_transforms.summary().input.lines)
..CharPoint::new(edit.new_rows.start, 0),
let summary = new_tab_snapshot.text_summary_for_range(
TabPoint(new_transforms.summary().input.lines)
..TabPoint::new(edit.new_rows.start, 0),
);
new_transforms.push_or_extend(Transform::isomorphic(summary));
}
let mut line = String::new();
let mut remaining = None;
let mut chunks = new_char_snapshot.chunks(
CharPoint::new(edit.new_rows.start, 0)..new_char_snapshot.max_point(),
let mut chunks = new_tab_snapshot.chunks(
TabPoint::new(edit.new_rows.start, 0)..new_tab_snapshot.max_point(),
false,
Highlights::default(),
);
@@ -495,19 +507,19 @@ impl WrapSnapshot {
}
new_transforms.extend(edit_transforms, &());
old_cursor.seek_forward(&CharPoint::new(edit.old_rows.end, 0), Bias::Right, &());
old_cursor.seek_forward(&TabPoint::new(edit.old_rows.end, 0), Bias::Right, &());
if let Some(next_edit) = row_edits.peek() {
if next_edit.old_rows.start > old_cursor.end(&()).row() {
if old_cursor.end(&()) > CharPoint::new(edit.old_rows.end, 0) {
let summary = self.char_snapshot.text_summary_for_range(
CharPoint::new(edit.old_rows.end, 0)..old_cursor.end(&()),
if old_cursor.end(&()) > TabPoint::new(edit.old_rows.end, 0) {
let summary = self.tab_snapshot.text_summary_for_range(
TabPoint::new(edit.old_rows.end, 0)..old_cursor.end(&()),
);
new_transforms.push_or_extend(Transform::isomorphic(summary));
}
old_cursor.next(&());
new_transforms.append(
old_cursor.slice(
&CharPoint::new(next_edit.old_rows.start, 0),
&TabPoint::new(next_edit.old_rows.start, 0),
Bias::Right,
&(),
),
@@ -515,9 +527,9 @@ impl WrapSnapshot {
);
}
} else {
if old_cursor.end(&()) > CharPoint::new(edit.old_rows.end, 0) {
let summary = self.char_snapshot.text_summary_for_range(
CharPoint::new(edit.old_rows.end, 0)..old_cursor.end(&()),
if old_cursor.end(&()) > TabPoint::new(edit.old_rows.end, 0) {
let summary = self.tab_snapshot.text_summary_for_range(
TabPoint::new(edit.old_rows.end, 0)..old_cursor.end(&()),
);
new_transforms.push_or_extend(Transform::isomorphic(summary));
}
@@ -530,7 +542,7 @@ impl WrapSnapshot {
let old_snapshot = mem::replace(
self,
WrapSnapshot {
char_snapshot: new_char_snapshot,
tab_snapshot: new_tab_snapshot,
transforms: new_transforms,
interpolated: false,
},
@@ -583,17 +595,17 @@ impl WrapSnapshot {
) -> WrapChunks<'a> {
let output_start = WrapPoint::new(rows.start, 0);
let output_end = WrapPoint::new(rows.end, 0);
let mut transforms = self.transforms.cursor::<(WrapPoint, CharPoint)>(&());
let mut transforms = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
transforms.seek(&output_start, Bias::Right, &());
let mut input_start = CharPoint(transforms.start().1 .0);
let mut input_start = TabPoint(transforms.start().1 .0);
if transforms.item().map_or(false, |t| t.is_isomorphic()) {
input_start.0 += output_start.0 - transforms.start().0 .0;
}
let input_end = self
.to_char_point(output_end)
.min(self.char_snapshot.max_point());
.to_tab_point(output_end)
.min(self.tab_snapshot.max_point());
WrapChunks {
input_chunks: self.char_snapshot.chunks(
input_chunks: self.tab_snapshot.chunks(
input_start..input_end,
language_aware,
highlights,
@@ -602,6 +614,7 @@ impl WrapSnapshot {
output_position: output_start,
max_output_row: rows.end,
transforms,
snapshot: self,
}
}
@@ -610,7 +623,7 @@ impl WrapSnapshot {
}
pub fn line_len(&self, row: u32) -> u32 {
let mut cursor = self.transforms.cursor::<(WrapPoint, CharPoint)>(&());
let mut cursor = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
cursor.seek(&WrapPoint::new(row + 1, 0), Bias::Left, &());
if cursor
.item()
@@ -618,7 +631,7 @@ impl WrapSnapshot {
{
let overshoot = row - cursor.start().0.row();
let tab_row = cursor.start().1.row() + overshoot;
let tab_line_len = self.char_snapshot.line_len(tab_row);
let tab_line_len = self.tab_snapshot.line_len(tab_row);
if overshoot == 0 {
cursor.start().0.column() + (tab_line_len - cursor.start().1.column())
} else {
@@ -629,6 +642,65 @@ impl WrapSnapshot {
}
}
pub fn text_summary_for_range(&self, rows: Range<u32>) -> TextSummary {
let mut summary = TextSummary::default();
let start = WrapPoint::new(rows.start, 0);
let end = WrapPoint::new(rows.end, 0);
let mut cursor = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
cursor.seek(&start, Bias::Right, &());
if let Some(transform) = cursor.item() {
let start_in_transform = start.0 - cursor.start().0 .0;
let end_in_transform = cmp::min(end, cursor.end(&()).0).0 - cursor.start().0 .0;
if transform.is_isomorphic() {
let tab_start = TabPoint(cursor.start().1 .0 + start_in_transform);
let tab_end = TabPoint(cursor.start().1 .0 + end_in_transform);
summary += &self.tab_snapshot.text_summary_for_range(tab_start..tab_end);
} else {
debug_assert_eq!(start_in_transform.row, end_in_transform.row);
let indent_len = end_in_transform.column - start_in_transform.column;
summary += &TextSummary {
lines: Point::new(0, indent_len),
first_line_chars: indent_len,
last_line_chars: indent_len,
longest_row: 0,
longest_row_chars: indent_len,
};
}
cursor.next(&());
}
if rows.end > cursor.start().0.row() {
summary += &cursor
.summary::<_, TransformSummary>(&WrapPoint::new(rows.end, 0), Bias::Right, &())
.output;
if let Some(transform) = cursor.item() {
let end_in_transform = end.0 - cursor.start().0 .0;
if transform.is_isomorphic() {
let char_start = cursor.start().1;
let char_end = TabPoint(char_start.0 + end_in_transform);
summary += &self
.tab_snapshot
.text_summary_for_range(char_start..char_end);
} else {
debug_assert_eq!(end_in_transform, Point::new(1, 0));
summary += &TextSummary {
lines: Point::new(1, 0),
first_line_chars: 0,
last_line_chars: 0,
longest_row: 0,
longest_row_chars: 0,
};
}
}
}
summary
}
pub fn soft_wrap_indent(&self, row: u32) -> Option<u32> {
let mut cursor = self.transforms.cursor::<WrapPoint>(&());
cursor.seek(&WrapPoint::new(row + 1, 0), Bias::Right, &());
@@ -646,14 +718,14 @@ impl WrapSnapshot {
}
pub fn buffer_rows(&self, start_row: u32) -> WrapBufferRows {
let mut transforms = self.transforms.cursor::<(WrapPoint, CharPoint)>(&());
let mut transforms = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
transforms.seek(&WrapPoint::new(start_row, 0), Bias::Left, &());
let mut input_row = transforms.start().1.row();
if transforms.item().map_or(false, |t| t.is_isomorphic()) {
input_row += start_row - transforms.start().0.row();
}
let soft_wrapped = transforms.item().map_or(false, |t| !t.is_isomorphic());
let mut input_buffer_rows = self.char_snapshot.buffer_rows(input_row);
let mut input_buffer_rows = self.tab_snapshot.buffer_rows(input_row);
let input_buffer_row = input_buffer_rows.next().unwrap();
WrapBufferRows {
transforms,
@@ -665,26 +737,26 @@ impl WrapSnapshot {
}
}
pub fn to_char_point(&self, point: WrapPoint) -> CharPoint {
let mut cursor = self.transforms.cursor::<(WrapPoint, CharPoint)>(&());
pub fn to_tab_point(&self, point: WrapPoint) -> TabPoint {
let mut cursor = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
cursor.seek(&point, Bias::Right, &());
let mut char_point = cursor.start().1 .0;
let mut tab_point = cursor.start().1 .0;
if cursor.item().map_or(false, |t| t.is_isomorphic()) {
char_point += point.0 - cursor.start().0 .0;
tab_point += point.0 - cursor.start().0 .0;
}
CharPoint(char_point)
TabPoint(tab_point)
}
pub fn to_point(&self, point: WrapPoint, bias: Bias) -> Point {
self.char_snapshot.to_point(self.to_char_point(point), bias)
self.tab_snapshot.to_point(self.to_tab_point(point), bias)
}
pub fn make_wrap_point(&self, point: Point, bias: Bias) -> WrapPoint {
self.char_point_to_wrap_point(self.char_snapshot.make_char_point(point, bias))
self.tab_point_to_wrap_point(self.tab_snapshot.make_tab_point(point, bias))
}
pub fn char_point_to_wrap_point(&self, point: CharPoint) -> WrapPoint {
let mut cursor = self.transforms.cursor::<(CharPoint, WrapPoint)>(&());
pub fn tab_point_to_wrap_point(&self, point: TabPoint) -> WrapPoint {
let mut cursor = self.transforms.cursor::<(TabPoint, WrapPoint)>(&());
cursor.seek(&point, Bias::Right, &());
WrapPoint(cursor.start().1 .0 + (point.0 - cursor.start().0 .0))
}
@@ -699,10 +771,7 @@ impl WrapSnapshot {
}
}
self.char_point_to_wrap_point(
self.char_snapshot
.clip_point(self.to_char_point(point), bias),
)
self.tab_point_to_wrap_point(self.tab_snapshot.clip_point(self.to_tab_point(point), bias))
}
pub fn prev_row_boundary(&self, mut point: WrapPoint) -> u32 {
@@ -712,7 +781,7 @@ impl WrapSnapshot {
*point.column_mut() = 0;
let mut cursor = self.transforms.cursor::<(WrapPoint, CharPoint)>(&());
let mut cursor = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
cursor.seek(&point, Bias::Right, &());
if cursor.item().is_none() {
cursor.prev(&());
@@ -732,7 +801,7 @@ impl WrapSnapshot {
pub fn next_row_boundary(&self, mut point: WrapPoint) -> Option<u32> {
point.0 += Point::new(1, 0);
let mut cursor = self.transforms.cursor::<(WrapPoint, CharPoint)>(&());
let mut cursor = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
cursor.seek(&point, Bias::Right, &());
while let Some(transform) = cursor.item() {
if transform.is_isomorphic() && cursor.start().1.column() == 0 {
@@ -745,12 +814,27 @@ impl WrapSnapshot {
None
}
#[cfg(test)]
pub fn text(&self) -> String {
self.text_chunks(0).collect()
}
#[cfg(test)]
pub fn text_chunks(&self, wrap_row: u32) -> impl Iterator<Item = &str> {
self.chunks(
wrap_row..self.max_point().row() + 1,
false,
Highlights::default(),
)
.map(|h| h.text)
}
fn check_invariants(&self) {
#[cfg(test)]
{
assert_eq!(
CharPoint::from(self.transforms.summary().input.lines),
self.char_snapshot.max_point()
TabPoint::from(self.transforms.summary().input.lines),
self.tab_snapshot.max_point()
);
{
@@ -763,18 +847,18 @@ impl WrapSnapshot {
}
let text = language::Rope::from(self.text().as_str());
let mut input_buffer_rows = self.char_snapshot.buffer_rows(0);
let mut input_buffer_rows = self.tab_snapshot.buffer_rows(0);
let mut expected_buffer_rows = Vec::new();
let mut prev_tab_row = 0;
for display_row in 0..=self.max_point().row() {
let char_point = self.to_char_point(WrapPoint::new(display_row, 0));
if char_point.row() == prev_tab_row && display_row != 0 {
let tab_point = self.to_tab_point(WrapPoint::new(display_row, 0));
if tab_point.row() == prev_tab_row && display_row != 0 {
expected_buffer_rows.push(None);
} else {
expected_buffer_rows.push(input_buffer_rows.next().unwrap());
}
prev_tab_row = char_point.row();
prev_tab_row = tab_point.row();
assert_eq!(self.line_len(display_row), text.line_len(display_row));
}
@@ -791,6 +875,26 @@ impl WrapSnapshot {
}
}
impl<'a> WrapChunks<'a> {
pub(crate) fn seek(&mut self, rows: Range<u32>) {
let output_start = WrapPoint::new(rows.start, 0);
let output_end = WrapPoint::new(rows.end, 0);
self.transforms.seek(&output_start, Bias::Right, &());
let mut input_start = TabPoint(self.transforms.start().1 .0);
if self.transforms.item().map_or(false, |t| t.is_isomorphic()) {
input_start.0 += output_start.0 - self.transforms.start().0 .0;
}
let input_end = self
.snapshot
.to_tab_point(output_end)
.min(self.snapshot.tab_snapshot.max_point());
self.input_chunks.seek(input_start..input_end);
self.input_chunk = Chunk::default();
self.output_position = output_start;
self.max_output_row = rows.end;
}
}
impl<'a> Iterator for WrapChunks<'a> {
type Item = Chunk<'a>;
@@ -838,11 +942,13 @@ impl<'a> Iterator for WrapChunks<'a> {
} else {
*self.output_position.column_mut() += char_len as u32;
}
if self.output_position >= transform_end {
self.transforms.next(&());
break;
}
}
let (prefix, suffix) = self.input_chunk.text.split_at(input_len);
self.input_chunk.text = suffix;
Some(Chunk {
@@ -997,7 +1103,7 @@ impl sum_tree::Summary for TransformSummary {
}
}
impl<'a> sum_tree::Dimension<'a, TransformSummary> for CharPoint {
impl<'a> sum_tree::Dimension<'a, TransformSummary> for TabPoint {
fn zero(_cx: &()) -> Self {
Default::default()
}
@@ -1007,7 +1113,7 @@ impl<'a> sum_tree::Dimension<'a, TransformSummary> for CharPoint {
}
}
impl<'a> sum_tree::SeekTarget<'a, TransformSummary, TransformSummary> for CharPoint {
impl<'a> sum_tree::SeekTarget<'a, TransformSummary, TransformSummary> for TabPoint {
fn cmp(&self, cursor_location: &TransformSummary, _: &()) -> std::cmp::Ordering {
Ord::cmp(&self.0, &cursor_location.input.lines)
}
@@ -1055,7 +1161,7 @@ fn consolidate_wrap_edits(edits: Vec<WrapEdit>) -> Vec<WrapEdit> {
mod tests {
use super::*;
use crate::{
display_map::{char_map::CharMap, fold_map::FoldMap, inlay_map::InlayMap},
display_map::{fold_map::FoldMap, inlay_map::InlayMap, tab_map::TabMap},
MultiBuffer,
};
use gpui::{font, px, test::observe};
@@ -1107,9 +1213,9 @@ mod tests {
log::info!("InlayMap text: {:?}", inlay_snapshot.text());
let (mut fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot.clone());
log::info!("FoldMap text: {:?}", fold_snapshot.text());
let (mut char_map, _) = CharMap::new(fold_snapshot.clone(), tab_size);
let tabs_snapshot = char_map.set_max_expansion_column(32);
log::info!("CharMap text: {:?}", tabs_snapshot.text());
let (mut tab_map, _) = TabMap::new(fold_snapshot.clone(), tab_size);
let tabs_snapshot = tab_map.set_max_expansion_column(32);
log::info!("TabMap text: {:?}", tabs_snapshot.text());
let mut line_wrapper = text_system.line_wrapper(font.clone(), font_size);
let unwrapped_text = tabs_snapshot.text();
@@ -1155,7 +1261,7 @@ mod tests {
20..=39 => {
for (fold_snapshot, fold_edits) in fold_map.randomly_mutate(&mut rng) {
let (tabs_snapshot, tab_edits) =
char_map.sync(fold_snapshot, fold_edits, tab_size);
tab_map.sync(fold_snapshot, fold_edits, tab_size);
let (mut snapshot, wrap_edits) =
wrap_map.update(cx, |map, cx| map.sync(tabs_snapshot, tab_edits, cx));
snapshot.check_invariants();
@@ -1168,7 +1274,7 @@ mod tests {
inlay_map.randomly_mutate(&mut next_inlay_id, &mut rng);
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tabs_snapshot, tab_edits) =
char_map.sync(fold_snapshot, fold_edits, tab_size);
tab_map.sync(fold_snapshot, fold_edits, tab_size);
let (mut snapshot, wrap_edits) =
wrap_map.update(cx, |map, cx| map.sync(tabs_snapshot, tab_edits, cx));
snapshot.check_invariants();
@@ -1192,8 +1298,8 @@ mod tests {
log::info!("InlayMap text: {:?}", inlay_snapshot.text());
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
log::info!("FoldMap text: {:?}", fold_snapshot.text());
let (tabs_snapshot, tab_edits) = char_map.sync(fold_snapshot, fold_edits, tab_size);
log::info!("CharMap text: {:?}", tabs_snapshot.text());
let (tabs_snapshot, tab_edits) = tab_map.sync(fold_snapshot, fold_edits, tab_size);
log::info!("TabMap text: {:?}", tabs_snapshot.text());
let unwrapped_text = tabs_snapshot.text();
let expected_text = wrap_text(&unwrapped_text, wrap_width, &mut line_wrapper);
@@ -1239,7 +1345,7 @@ mod tests {
if tab_size.get() == 1
|| !wrapped_snapshot
.char_snapshot
.tab_snapshot
.fold_snapshot
.text()
.contains('\t')
@@ -1336,19 +1442,6 @@ mod tests {
}
impl WrapSnapshot {
pub fn text(&self) -> String {
self.text_chunks(0).collect()
}
pub fn text_chunks(&self, wrap_row: u32) -> impl Iterator<Item = &str> {
self.chunks(
wrap_row..self.max_point().row() + 1,
false,
Highlights::default(),
)
.map(|h| h.text)
}
fn verify_chunks(&mut self, rng: &mut impl Rng) {
for _ in 0..5 {
let mut end_row = rng.gen_range(0..=self.max_point().row());


@@ -223,6 +223,7 @@ pub fn render_parsed_markdown(
}
}),
);
// hello
let mut links = Vec::new();
let mut link_ranges = Vec::new();
@@ -3282,10 +3283,25 @@ impl Editor {
&bracket_pair.start[..prefix_len],
));
let is_closing_quote = if bracket_pair.end == bracket_pair.start
&& bracket_pair.start.len() == 1
{
let target = bracket_pair.start.chars().next().unwrap();
let current_line_count = snapshot
.reversed_chars_at(selection.start)
.take_while(|&c| c != '\n')
.filter(|&c| c == target)
.count();
current_line_count % 2 == 1
} else {
false
};
if autoclose
&& bracket_pair.close
&& following_text_allows_autoclose
&& preceding_text_matches_prefix
&& !is_closing_quote
{
let anchor = snapshot.anchor_before(selection.end);
new_selections.push((selection.map(|_| anchor), text.len()));
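For illustration, a self-contained sketch of the parity check introduced above (the helper name is hypothetical): a typed quote is treated as closing an already-open string when an odd number of the same quote character precedes the cursor on the current line, in which case autoclosing is skipped.

// Sketch of the heuristic; `closes_open_quote` is a hypothetical helper.
fn closes_open_quote(line_before_cursor: &str, quote: char) -> bool {
    // Odd count => the new quote pairs up with an existing opener.
    line_before_cursor.chars().filter(|&c| c == quote).count() % 2 == 1
}

fn main() {
    assert!(closes_open_quote("let s = \"hello", '"')); // closing quote: skip autoclose
    assert!(!closes_open_quote("let s = ", '"'));       // opening quote: autoclose as usual
}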
@@ -3769,6 +3785,9 @@ impl Editor {
pub fn newline_below(&mut self, _: &NewlineBelow, cx: &mut ViewContext<Self>) {
let buffer = self.buffer.read(cx);
let snapshot = buffer.snapshot(cx);
//
//
//
let mut edits = Vec::new();
let mut rows = Vec::new();
@@ -10210,7 +10229,7 @@ impl Editor {
let block_id = this.insert_blocks(
[BlockProperties {
style: BlockStyle::Flex,
position: range.start,
placement: BlockPlacement::Below(range.start),
height: 1,
render: Box::new({
let rename_editor = rename_editor.clone();
@@ -10246,7 +10265,6 @@ impl Editor {
.into_any_element()
}
}),
disposition: BlockDisposition::Below,
priority: 0,
}],
Some(Autoscroll::fit()),
@@ -10531,10 +10549,11 @@ impl Editor {
let message_height = diagnostic.message.matches('\n').count() as u32 + 1;
BlockProperties {
style: BlockStyle::Fixed,
position: buffer.anchor_after(entry.range.start),
placement: BlockPlacement::Below(
buffer.anchor_after(entry.range.start),
),
height: message_height,
render: diagnostic_block_renderer(diagnostic, None, true, true),
disposition: BlockDisposition::Below,
priority: 0,
}
}),
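The recurring position/disposition change in the hunks above and below folds two fields into one value; a hypothetical, abridged sketch of the resulting shape:

// Hypothetical, abridged sketch: the anchor and the Above/Below side that
// used to live in `position` + `disposition` are now carried by a single
// `placement` value.
enum BlockPlacement<A> {
    Above(A),
    Below(A),
}

struct BlockProperties<A> {
    placement: BlockPlacement<A>,
    height: u32,
    priority: usize,
}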
@@ -10728,15 +10747,44 @@ impl Editor {
self.fold_ranges(fold_ranges, true, cx);
}
fn fold_at_level(&mut self, fold_at: &FoldAtLevel, cx: &mut ViewContext<Self>) {
let fold_at_level = fold_at.level;
let snapshot = self.buffer.read(cx).snapshot(cx);
let mut fold_ranges = Vec::new();
let mut stack = vec![(0, snapshot.max_buffer_row().0, 1)];
while let Some((mut start_row, end_row, current_level)) = stack.pop() {
while start_row < end_row {
match self.snapshot(cx).foldable_range(MultiBufferRow(start_row)) {
Some(foldable_range) => {
let nested_start_row = foldable_range.0.start.row + 1;
let nested_end_row = foldable_range.0.end.row;
if current_level == fold_at_level {
fold_ranges.push(foldable_range);
}
if current_level <= fold_at_level {
stack.push((nested_start_row, nested_end_row, current_level + 1));
}
start_row = nested_end_row + 1;
}
None => start_row += 1,
}
}
}
self.fold_ranges(fold_ranges, true, cx);
}
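As a rough, framework-free sketch of the traversal above (the `foldable_range` closure stands in for the editor's lookup), the walk keeps a stack of row ranges with their nesting level, records ranges at the requested level, and only descends while that level has not yet been exceeded:

use std::ops::Range;

// Self-contained sketch; not the editor implementation.
fn ranges_at_level(
    max_row: u32,
    target_level: u32,
    foldable_range: impl Fn(u32) -> Option<Range<u32>>,
) -> Vec<Range<u32>> {
    let mut found = Vec::new();
    let mut stack = vec![(0u32, max_row, 1u32)];
    while let Some((mut row, end_row, level)) = stack.pop() {
        while row < end_row {
            if let Some(range) = foldable_range(row) {
                let nested = (range.start + 1)..range.end;
                if level == target_level {
                    found.push(range);
                }
                if level <= target_level {
                    stack.push((nested.start, nested.end, level + 1));
                }
                row = nested.end + 1;
            } else {
                row += 1;
            }
        }
    }
    found
}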
pub fn fold_all(&mut self, _: &actions::FoldAll, cx: &mut ViewContext<Self>) {
let mut fold_ranges = Vec::new();
let display_map = self.display_map.update(cx, |map, cx| map.snapshot(cx));
let snapshot = self.buffer.read(cx).snapshot(cx);
for row in 0..display_map.max_buffer_row().0 {
if let Some((foldable_range, fold_text)) =
display_map.foldable_range(MultiBufferRow(row))
{
fold_ranges.push((foldable_range, fold_text));
for row in 0..snapshot.max_buffer_row().0 {
if let Some(foldable_range) = self.snapshot(cx).foldable_range(MultiBufferRow(row)) {
fold_ranges.push(foldable_range);
}
}


@@ -1080,6 +1080,112 @@ fn test_fold_action_multiple_line_breaks(cx: &mut TestAppContext) {
});
}
#[gpui::test]
fn test_fold_at_level(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let view = cx.add_window(|cx| {
let buffer = MultiBuffer::build_simple(
&"
class Foo:
# Hello!
def a():
print(1)
def b():
print(2)
class Bar:
# World!
def a():
print(1)
def b():
print(2)
"
.unindent(),
cx,
);
build_editor(buffer.clone(), cx)
});
_ = view.update(cx, |view, cx| {
view.fold_at_level(&FoldAtLevel { level: 2 }, cx);
assert_eq!(
view.display_text(cx),
"
class Foo:
# Hello!
def a():⋯
def b():⋯
class Bar:
# World!
def a():⋯
def b():⋯
"
.unindent(),
);
view.fold_at_level(&FoldAtLevel { level: 1 }, cx);
assert_eq!(
view.display_text(cx),
"
class Foo:⋯
class Bar:⋯
"
.unindent(),
);
view.unfold_all(&UnfoldAll, cx);
view.fold_at_level(&FoldAtLevel { level: 0 }, cx);
assert_eq!(
view.display_text(cx),
"
class Foo:
# Hello!
def a():
print(1)
def b():
print(2)
class Bar:
# World!
def a():
print(1)
def b():
print(2)
"
.unindent(),
);
assert_eq!(view.display_text(cx), view.buffer.read(cx).read(cx).text());
});
}
#[gpui::test]
fn test_move_cursor(cx: &mut TestAppContext) {
init_test(cx, |_| {});
@@ -3868,8 +3974,7 @@ fn test_move_line_up_down_with_blocks(cx: &mut TestAppContext) {
editor.insert_blocks(
[BlockProperties {
style: BlockStyle::Fixed,
position: snapshot.anchor_after(Point::new(2, 0)),
disposition: BlockDisposition::Below,
placement: BlockPlacement::Below(snapshot.anchor_after(Point::new(2, 0))),
height: 1,
render: Box::new(|_| div().into_any()),
priority: 0,


@@ -68,7 +68,6 @@ use sum_tree::Bias;
use theme::{ActiveTheme, Appearance, PlayerColor};
use ui::prelude::*;
use ui::{h_flex, ButtonLike, ButtonStyle, ContextMenu, Tooltip};
use unicode_segmentation::UnicodeSegmentation;
use util::RangeExt;
use util::ResultExt;
use workspace::{item::Item, Workspace};
@@ -337,6 +336,7 @@ impl EditorElement {
register_action(view, cx, Editor::open_url);
register_action(view, cx, Editor::open_file);
register_action(view, cx, Editor::fold);
register_action(view, cx, Editor::fold_at_level);
register_action(view, cx, Editor::fold_all);
register_action(view, cx, Editor::fold_at);
register_action(view, cx, Editor::fold_recursive);
@@ -445,6 +445,7 @@ impl EditorElement {
register_action(view, cx, Editor::accept_inline_completion);
register_action(view, cx, Editor::revert_file);
register_action(view, cx, Editor::revert_selected_hunks);
register_action(view, cx, Editor::apply_all_diff_hunks);
register_action(view, cx, Editor::apply_selected_diff_hunks);
register_action(view, cx, Editor::open_active_item_in_terminal);
register_action(view, cx, Editor::reload_file)
@@ -1026,21 +1027,23 @@ impl EditorElement {
}
let block_text = if let CursorShape::Block = selection.cursor_shape {
snapshot
.grapheme_at(cursor_position)
.display_chars_at(cursor_position)
.next()
.or_else(|| {
if cursor_column == 0 {
snapshot.placeholder_text().and_then(|s| {
s.graphemes(true).next().map(|s| s.to_owned())
})
snapshot
.placeholder_text()
.and_then(|s| s.chars().next())
.map(|c| (c, cursor_position))
} else {
None
}
})
.and_then(|grapheme| {
let text = if grapheme == "\n" {
.and_then(|(character, _)| {
let text = if character == '\n' {
SharedString::from(" ")
} else {
SharedString::from(grapheme)
SharedString::from(character.to_string())
};
let len = text.len();
@@ -2071,7 +2074,7 @@ impl EditorElement {
let mut element = match block {
Block::Custom(block) => {
let align_to = block
.position()
.start()
.to_point(&snapshot.buffer_snapshot)
.to_display_point(snapshot);
let anchor_x = text_x
@@ -6294,7 +6297,7 @@ fn compute_auto_height_layout(
mod tests {
use super::*;
use crate::{
display_map::{BlockDisposition, BlockProperties},
display_map::{BlockPlacement, BlockProperties},
editor_tests::{init_test, update_test_language_settings},
Editor, MultiBuffer,
};
@@ -6550,9 +6553,8 @@ mod tests {
editor.insert_blocks(
[BlockProperties {
style: BlockStyle::Fixed,
disposition: BlockDisposition::Above,
placement: BlockPlacement::Above(Anchor::min()),
height: 3,
position: Anchor::min(),
render: Box::new(|cx| div().h(3. * cx.line_height()).into_any()),
priority: 0,
}],


@@ -1,7 +1,6 @@
use crate::{
display_map::{InlayOffset, ToDisplayPoint},
hover_links::{InlayHighlight, RangeInEditor},
is_invisible,
scroll::ScrollAmount,
Anchor, AnchorRangeExt, DisplayPoint, DisplayRow, Editor, EditorSettings, EditorSnapshot,
Hover, RangeToAnchorExt,
@@ -12,7 +11,7 @@ use gpui::{
StyleRefinement, Styled, Task, TextStyleRefinement, View, ViewContext,
};
use itertools::Itertools;
use language::{Diagnostic, DiagnosticEntry, Language, LanguageRegistry};
use language::{DiagnosticEntry, Language, LanguageRegistry};
use lsp::DiagnosticSeverity;
use markdown::{Markdown, MarkdownStyle};
use multi_buffer::ToOffset;
@@ -200,6 +199,7 @@ fn show_hover(
if editor.pending_rename.is_some() {
return None;
}
let snapshot = editor.snapshot(cx);
let (buffer, buffer_position) = editor
@@ -259,7 +259,7 @@ fn show_hover(
}
// If there's a diagnostic, assign it on the hover state and notify
let mut local_diagnostic = snapshot
let local_diagnostic = snapshot
.buffer_snapshot
.diagnostics_in_range::<_, usize>(anchor..anchor, false)
// Find the entry with the most specific range
@@ -281,42 +281,6 @@ fn show_hover(
})
});
if let Some(invisible) = snapshot
.buffer_snapshot
.chars_at(anchor)
.next()
.filter(|&c| is_invisible(c))
{
let after = snapshot.buffer_snapshot.anchor_after(
anchor.to_offset(&snapshot.buffer_snapshot) + invisible.len_utf8(),
);
local_diagnostic = Some(DiagnosticEntry {
diagnostic: Diagnostic {
severity: DiagnosticSeverity::HINT,
message: format!("Unicode character U+{:02X}", invisible as u32),
..Default::default()
},
range: anchor..after,
})
} else if let Some(invisible) = snapshot
.buffer_snapshot
.reversed_chars_at(anchor)
.next()
.filter(|&c| is_invisible(c))
{
let before = snapshot.buffer_snapshot.anchor_before(
anchor.to_offset(&snapshot.buffer_snapshot) - invisible.len_utf8(),
);
local_diagnostic = Some(DiagnosticEntry {
diagnostic: Diagnostic {
severity: DiagnosticSeverity::HINT,
message: format!("Unicode character U+{:02X}", invisible as u32),
..Default::default()
},
range: before..anchor,
})
}
let diagnostic_popover = if let Some(local_diagnostic) = local_diagnostic {
let text = match local_diagnostic.diagnostic.source {
Some(ref source) => {
@@ -324,6 +288,7 @@ fn show_hover(
}
None => local_diagnostic.diagnostic.message.clone(),
};
let mut border_color: Option<Hsla> = None;
let mut background_color: Option<Hsla> = None;
@@ -379,6 +344,7 @@ fn show_hover(
Markdown::new_text(text, markdown_style.clone(), None, cx, None)
})
.ok();
Some(DiagnosticPopover {
local_diagnostic,
primary_diagnostic,
@@ -466,6 +432,7 @@ fn show_hover(
cx.notify();
cx.refresh();
})?;
anyhow::Ok(())
}
.log_err()


@@ -16,10 +16,10 @@ use util::RangeExt;
use workspace::Item;
use crate::{
editor_settings::CurrentLineHighlight, hunk_status, hunks_for_selections, ApplyDiffHunk,
BlockDisposition, BlockProperties, BlockStyle, CustomBlockId, DiffRowHighlight, DisplayRow,
DisplaySnapshot, Editor, EditorElement, ExpandAllHunkDiffs, GoToHunk, GoToPrevHunk, RevertFile,
RevertSelectedHunks, ToDisplayPoint, ToggleHunkDiff,
editor_settings::CurrentLineHighlight, hunk_status, hunks_for_selections, ApplyAllDiffHunks,
ApplyDiffHunk, BlockPlacement, BlockProperties, BlockStyle, CustomBlockId, DiffRowHighlight,
DisplayRow, DisplaySnapshot, Editor, EditorElement, ExpandAllHunkDiffs, GoToHunk, GoToPrevHunk,
RevertFile, RevertSelectedHunks, ToDisplayPoint, ToggleHunkDiff,
};
#[derive(Debug, Clone)]
@@ -352,7 +352,11 @@ impl Editor {
None
}
pub(crate) fn apply_all_diff_hunks(&mut self, cx: &mut ViewContext<Self>) {
pub(crate) fn apply_all_diff_hunks(
&mut self,
_: &ApplyAllDiffHunks,
cx: &mut ViewContext<Self>,
) {
let buffers = self.buffer.read(cx).all_buffers();
for branch_buffer in buffers {
branch_buffer.update(cx, |branch_buffer, cx| {
@@ -417,10 +421,9 @@ impl Editor {
};
BlockProperties {
position: hunk.multi_buffer_range.start,
placement: BlockPlacement::Above(hunk.multi_buffer_range.start),
height: 1,
style: BlockStyle::Sticky,
disposition: BlockDisposition::Above,
priority: 0,
render: Box::new({
let editor = cx.view().clone();
@@ -700,10 +703,9 @@ impl Editor {
let hunk = hunk.clone();
let height = editor_height.max(deleted_text_height);
BlockProperties {
position: hunk.multi_buffer_range.start,
placement: BlockPlacement::Above(hunk.multi_buffer_range.start),
height,
style: BlockStyle::Flex,
disposition: BlockDisposition::Above,
priority: 0,
render: Box::new(move |cx| {
let width = EditorElement::diff_hunk_strip_width(cx.line_height());


@@ -1,4 +1,4 @@
use crate::{Editor, EditorEvent, SemanticsProvider};
use crate::{ApplyAllDiffHunks, Editor, EditorEvent, SemanticsProvider};
use collections::HashSet;
use futures::{channel::mpsc, future::join_all};
use gpui::{AppContext, EventEmitter, FocusableView, Model, Render, Subscription, Task, View};
@@ -8,7 +8,7 @@ use project::Project;
use smol::stream::StreamExt;
use std::{any::TypeId, ops::Range, rc::Rc, time::Duration};
use text::ToOffset;
use ui::prelude::*;
use ui::{prelude::*, ButtonLike, KeyBinding};
use workspace::{
searchable::SearchableItemHandle, Item, ItemHandle as _, ToolbarItemEvent, ToolbarItemLocation,
ToolbarItemView, Workspace,
@@ -232,7 +232,10 @@ impl ProposedChangesEditor {
impl Render for ProposedChangesEditor {
fn render(&mut self, _cx: &mut ViewContext<Self>) -> impl IntoElement {
self.editor.clone()
div()
.size_full()
.key_context("ProposedChangesEditor")
.child(self.editor.clone())
}
}
@@ -331,17 +334,21 @@ impl ProposedChangesEditorToolbar {
}
impl Render for ProposedChangesEditorToolbar {
fn render(&mut self, _cx: &mut ViewContext<Self>) -> impl IntoElement {
let editor = self.current_editor.clone();
Button::new("apply-changes", "Apply All").on_click(move |_, cx| {
if let Some(editor) = &editor {
editor.update(cx, |editor, cx| {
editor.editor.update(cx, |editor, cx| {
editor.apply_all_diff_hunks(cx);
})
});
fn render(&mut self, cx: &mut ViewContext<Self>) -> impl IntoElement {
let button_like = ButtonLike::new("apply-changes").child(Label::new("Apply All"));
match &self.current_editor {
Some(editor) => {
let focus_handle = editor.focus_handle(cx);
let keybinding = KeyBinding::for_action_in(&ApplyAllDiffHunks, &focus_handle, cx)
.map(|binding| binding.into_any_element());
button_like.children(keybinding).on_click({
move |_event, cx| focus_handle.dispatch_action(&ApplyAllDiffHunks, cx)
})
}
})
None => button_like.disabled(true),
}
}
}


@@ -1,6 +1,7 @@
use std::{
borrow::Cow,
ops::{Deref, DerefMut, Range},
path::Path,
sync::Arc,
};
@@ -66,10 +67,12 @@ impl EditorLspTestContext {
);
language_registry.add(Arc::new(language));
let root = Self::root_path();
app_state
.fs
.as_fake()
.insert_tree("/root", json!({ "dir": { file_name.clone(): "" }}))
.insert_tree(root, json!({ "dir": { file_name.clone(): "" }}))
.await;
let window = cx.add_window(|cx| Workspace::test_new(project.clone(), cx));
@@ -79,7 +82,7 @@ impl EditorLspTestContext {
let mut cx = VisualTestContext::from_window(*window.deref(), cx);
project
.update(&mut cx, |project, cx| {
project.find_or_create_worktree("/root", true, cx)
project.find_or_create_worktree(root, true, cx)
})
.await
.unwrap();
@@ -108,7 +111,7 @@ impl EditorLspTestContext {
},
lsp,
workspace,
buffer_lsp_url: lsp::Url::from_file_path(format!("/root/dir/{file_name}")).unwrap(),
buffer_lsp_url: lsp::Url::from_file_path(root.join("dir").join(file_name)).unwrap(),
}
}
@@ -123,6 +126,7 @@ impl EditorLspTestContext {
path_suffixes: vec!["rs".to_string()],
..Default::default()
},
line_comments: vec!["// ".into(), "/// ".into(), "//! ".into()],
..Default::default()
},
Some(tree_sitter_rust::LANGUAGE.into()),
@@ -309,6 +313,16 @@ impl EditorLspTestContext {
pub fn notify<T: notification::Notification>(&self, params: T::Params) {
self.lsp.notify::<T>(params);
}
#[cfg(target_os = "windows")]
fn root_path() -> &'static Path {
Path::new("C:\\root")
}
#[cfg(not(target_os = "windows"))]
fn root_path() -> &'static Path {
Path::new("/root")
}
}
impl Deref for EditorLspTestContext {


@@ -5,6 +5,7 @@ mod file_finder_settings;
mod new_path_prompt;
mod open_path_prompt;
use futures::future::join_all;
pub use open_path_prompt::OpenPathDelegate;
use collections::HashMap;
@@ -59,7 +60,7 @@ impl FileFinder {
fn register(workspace: &mut Workspace, _: &mut ViewContext<Workspace>) {
workspace.register_action(|workspace, action: &workspace::ToggleFileFinder, cx| {
let Some(file_finder) = workspace.active_modal::<Self>(cx) else {
Self::open(workspace, action.separate_history, cx);
Self::open(workspace, action.separate_history, cx).detach();
return;
};
@@ -72,8 +73,13 @@ impl FileFinder {
});
}
fn open(workspace: &mut Workspace, separate_history: bool, cx: &mut ViewContext<Workspace>) {
fn open(
workspace: &mut Workspace,
separate_history: bool,
cx: &mut ViewContext<Workspace>,
) -> Task<()> {
let project = workspace.project().read(cx);
let fs = project.fs();
let currently_opened_path = workspace
.active_item(cx)
@@ -88,28 +94,51 @@ impl FileFinder {
let history_items = workspace
.recent_navigation_history(Some(MAX_RECENT_SELECTIONS), cx)
.into_iter()
.filter(|(_, history_abs_path)| match history_abs_path {
Some(abs_path) => history_file_exists(abs_path),
None => true,
.filter_map(|(project_path, abs_path)| {
if project.entry_for_path(&project_path, cx).is_some() {
return Some(Task::ready(Some(FoundPath::new(project_path, abs_path))));
}
let abs_path = abs_path?;
if project.is_local() {
let fs = fs.clone();
Some(cx.background_executor().spawn(async move {
if fs.is_file(&abs_path).await {
Some(FoundPath::new(project_path, Some(abs_path)))
} else {
None
}
}))
} else {
Some(Task::ready(Some(FoundPath::new(
project_path,
Some(abs_path),
))))
}
})
.map(|(history_path, abs_path)| FoundPath::new(history_path, abs_path))
.collect::<Vec<_>>();
cx.spawn(move |workspace, mut cx| async move {
let history_items = join_all(history_items).await.into_iter().flatten();
let project = workspace.project().clone();
let weak_workspace = cx.view().downgrade();
workspace.toggle_modal(cx, |cx| {
let delegate = FileFinderDelegate::new(
cx.view().downgrade(),
weak_workspace,
project,
currently_opened_path,
history_items,
separate_history,
cx,
);
workspace
.update(&mut cx, |workspace, cx| {
let project = workspace.project().clone();
let weak_workspace = cx.view().downgrade();
workspace.toggle_modal(cx, |cx| {
let delegate = FileFinderDelegate::new(
cx.view().downgrade(),
weak_workspace,
project,
currently_opened_path,
history_items.collect(),
separate_history,
cx,
);
FileFinder::new(delegate, cx)
});
FileFinder::new(delegate, cx)
});
})
.ok();
})
}
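The reworked `open` above moves the history-file existence checks off the UI thread: local entries become background tasks and the picker is only toggled once all of them resolve. A framework-agnostic sketch of that pattern (the crate choices here are illustrative, not what the editor uses):

use futures::future::join_all;
use std::path::PathBuf;

// Sketch: check each candidate concurrently and keep only the files that
// still exist; callers await this before building the picker.
async fn surviving_paths(paths: Vec<PathBuf>) -> Vec<PathBuf> {
    let checks = paths.into_iter().map(|path| async move {
        match async_fs::metadata(&path).await {
            Ok(meta) if meta.is_file() => Some(path),
            _ => None,
        }
    });
    join_all(checks).await.into_iter().flatten().collect()
}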
fn new(delegate: FileFinderDelegate, cx: &mut ViewContext<Self>) -> Self {
@@ -456,16 +485,6 @@ impl FoundPath {
const MAX_RECENT_SELECTIONS: usize = 20;
#[cfg(not(test))]
fn history_file_exists(abs_path: &PathBuf) -> bool {
abs_path.exists()
}
#[cfg(test)]
fn history_file_exists(abs_path: &Path) -> bool {
!abs_path.ends_with("nonexistent.rs")
}
pub enum Event {
Selected(ProjectPath),
Dismissed,


@@ -4,7 +4,7 @@ use super::*;
use editor::Editor;
use gpui::{Entity, TestAppContext, VisualTestContext};
use menu::{Confirm, SelectNext, SelectPrev};
use project::FS_WATCH_LATENCY;
use project::{RemoveOptions, FS_WATCH_LATENCY};
use serde_json::json;
use workspace::{AppState, ToggleFileFinder, Workspace};
@@ -1450,6 +1450,15 @@ async fn test_nonexistent_history_items_not_shown(cx: &mut gpui::TestAppContext)
open_close_queried_buffer("non", 1, "nonexistent.rs", &workspace, cx).await;
open_close_queried_buffer("thi", 1, "third.rs", &workspace, cx).await;
open_close_queried_buffer("fir", 1, "first.rs", &workspace, cx).await;
app_state
.fs
.remove_file(
Path::new("/src/test/nonexistent.rs"),
RemoveOptions::default(),
)
.await
.unwrap();
cx.run_until_parked();
let picker = open_file_picker(&workspace, cx);
cx.simulate_input("rs");


@@ -865,14 +865,20 @@ impl FakeFsState {
let mut entry_stack = Vec::new();
'outer: loop {
let mut path_components = path.components().peekable();
let mut prefix = None;
while let Some(component) = path_components.next() {
match component {
Component::Prefix(_) => panic!("prefix paths aren't supported"),
Component::Prefix(prefix_component) => prefix = Some(prefix_component),
Component::RootDir => {
entry_stack.clear();
entry_stack.push(self.root.clone());
canonical_path.clear();
canonical_path.push("/");
match prefix {
Some(prefix_component) => {
canonical_path.push(prefix_component.as_os_str());
}
None => canonical_path.push("/"),
}
}
Component::CurDir => {}
Component::ParentDir => {
@@ -1384,11 +1390,12 @@ impl Fs for FakeFs {
let mut created_dirs = Vec::new();
let mut cur_path = PathBuf::new();
for component in path.components() {
let mut state = self.state.lock();
let should_skip = matches!(component, Component::Prefix(..) | Component::RootDir);
cur_path.push(component);
if cur_path == Path::new("/") {
if should_skip {
continue;
}
let mut state = self.state.lock();
let inode = state.next_inode;
let mtime = state.next_mtime;
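The FakeFs changes above exist because absolute Windows paths begin with a Prefix component (for example `C:`) before RootDir, whereas Unix paths start directly at RootDir. A small sketch of how std::path splits the two (the function name is hypothetical):

use std::path::{Component, Path};

// Returns the drive-style prefix (if any) and whether the path is rooted.
// On Windows, Path::new("C:\\root") yields Prefix("C:") then RootDir;
// on Unix, "/root" yields RootDir immediately and there is no prefix.
fn leading_components(path: &Path) -> (Option<String>, bool) {
    let mut prefix = None;
    let mut has_root = false;
    for component in path.components() {
        match component {
            Component::Prefix(p) => {
                prefix = Some(p.as_os_str().to_string_lossy().into_owned())
            }
            Component::RootDir => {
                has_root = true;
                break;
            }
            _ => break,
        }
    }
    (prefix, has_root)
}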


@@ -1,7 +1,6 @@
use crate::{
black, fill, point, px, size, Bounds, Half, Hsla, LineLayout, Pixels, Point, Result,
SharedString, StrikethroughStyle, UnderlineStyle, WindowContext, WrapBoundary,
WrappedLineLayout,
black, fill, point, px, size, Bounds, Hsla, LineLayout, Pixels, Point, Result, SharedString,
StrikethroughStyle, UnderlineStyle, WindowContext, WrapBoundary, WrappedLineLayout,
};
use derive_more::{Deref, DerefMut};
use smallvec::SmallVec;
@@ -130,9 +129,8 @@ fn paint_line(
let text_system = cx.text_system().clone();
let mut glyph_origin = origin;
let mut prev_glyph_position = Point::default();
let mut max_glyph_size = size(px(0.), px(0.));
for (run_ix, run) in layout.runs.iter().enumerate() {
max_glyph_size = text_system.bounding_box(run.font_id, layout.font_size).size;
let max_glyph_size = text_system.bounding_box(run.font_id, layout.font_size).size;
for (glyph_ix, glyph) in run.glyphs.iter().enumerate() {
glyph_origin.x += glyph.position.x - prev_glyph_position.x;
@@ -141,9 +139,6 @@ fn paint_line(
wraps.next();
if let Some((background_origin, background_color)) = current_background.as_mut()
{
if glyph_origin.x == background_origin.x {
background_origin.x -= max_glyph_size.width.half()
}
cx.paint_quad(fill(
Bounds {
origin: *background_origin,
@@ -155,9 +150,6 @@ fn paint_line(
background_origin.y += line_height;
}
if let Some((underline_origin, underline_style)) = current_underline.as_mut() {
if glyph_origin.x == underline_origin.x {
underline_origin.x -= max_glyph_size.width.half();
};
cx.paint_underline(
*underline_origin,
glyph_origin.x - underline_origin.x,
@@ -169,9 +161,6 @@ fn paint_line(
if let Some((strikethrough_origin, strikethrough_style)) =
current_strikethrough.as_mut()
{
if glyph_origin.x == strikethrough_origin.x {
strikethrough_origin.x -= max_glyph_size.width.half();
};
cx.paint_strikethrough(
*strikethrough_origin,
glyph_origin.x - strikethrough_origin.x,
@@ -190,18 +179,7 @@ fn paint_line(
let mut finished_underline: Option<(Point<Pixels>, UnderlineStyle)> = None;
let mut finished_strikethrough: Option<(Point<Pixels>, StrikethroughStyle)> = None;
if glyph.index >= run_end {
let mut style_run = decoration_runs.next();
// ignore style runs that apply to a partial glyph
while let Some(run) = style_run {
if glyph.index < run_end + (run.len as usize) {
break;
}
run_end += run.len as usize;
style_run = decoration_runs.next();
}
if let Some(style_run) = style_run {
if let Some(style_run) = decoration_runs.next() {
if let Some((_, background_color)) = &mut current_background {
if style_run.background_color.as_ref() != Some(background_color) {
finished_background = current_background.take();
@@ -262,14 +240,10 @@ fn paint_line(
}
if let Some((background_origin, background_color)) = finished_background {
let mut width = glyph_origin.x - background_origin.x;
if width == px(0.) {
width = px(5.)
};
cx.paint_quad(fill(
Bounds {
origin: background_origin,
size: size(width, line_height),
size: size(glyph_origin.x - background_origin.x, line_height),
},
background_color,
));
@@ -325,10 +299,7 @@ fn paint_line(
last_line_end_x -= glyph.position.x;
}
if let Some((mut background_origin, background_color)) = current_background.take() {
if last_line_end_x == background_origin.x {
background_origin.x -= max_glyph_size.width.half()
};
if let Some((background_origin, background_color)) = current_background.take() {
cx.paint_quad(fill(
Bounds {
origin: background_origin,
@@ -338,10 +309,7 @@ fn paint_line(
));
}
if let Some((mut underline_start, underline_style)) = current_underline.take() {
if last_line_end_x == underline_start.x {
underline_start.x -= max_glyph_size.width.half()
};
if let Some((underline_start, underline_style)) = current_underline.take() {
cx.paint_underline(
underline_start,
last_line_end_x - underline_start.x,
@@ -349,10 +317,7 @@ fn paint_line(
);
}
if let Some((mut strikethrough_start, strikethrough_style)) = current_strikethrough.take() {
if last_line_end_x == strikethrough_start.x {
strikethrough_start.x -= max_glyph_size.width.half()
};
if let Some((strikethrough_start, strikethrough_style)) = current_strikethrough.take() {
cx.paint_strikethrough(
strikethrough_start,
last_line_end_x - strikethrough_start.x,


@@ -1,37 +0,0 @@
[package]
name = "headless"
version = "0.1.0"
edition = "2021"
publish = false
license = "GPL-3.0-or-later"
[lints]
workspace = true
[lib]
path = "src/headless.rs"
doctest = false
[dependencies]
anyhow.workspace = true
client.workspace = true
extension.workspace = true
signal-hook.workspace = true
gpui.workspace = true
log.workspace = true
util.workspace = true
node_runtime.workspace = true
language.workspace = true
project.workspace = true
proto.workspace = true
fs.workspace = true
futures.workspace = true
settings.workspace = true
shellexpand.workspace = true
postage.workspace = true
[dev-dependencies]
client = { workspace = true, features = ["test-support"] }
fs = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, features = ["test-support"] }
util = { workspace = true, features = ["test-support"] }


@@ -1 +0,0 @@
../../LICENSE-GPL


@@ -1,397 +0,0 @@
use anyhow::{anyhow, Result};
use client::DevServerProjectId;
use client::{user::UserStore, Client, ClientSettings};
use extension::ExtensionStore;
use fs::Fs;
use futures::{Future, StreamExt};
use gpui::{AppContext, AsyncAppContext, Context, Global, Model, ModelContext, Task, WeakModel};
use language::LanguageRegistry;
use node_runtime::NodeRuntime;
use postage::stream::Stream;
use project::Project;
use proto::{self, ErrorCode, TypedEnvelope};
use settings::{Settings, SettingsStore};
use std::path::Path;
use std::{collections::HashMap, sync::Arc};
use util::{ResultExt, TryFutureExt};
pub struct DevServer {
client: Arc<Client>,
app_state: AppState,
remote_shutdown: bool,
projects: HashMap<DevServerProjectId, Model<Project>>,
_subscriptions: Vec<client::Subscription>,
_maintain_connection: Task<Option<()>>,
}
pub struct AppState {
pub node_runtime: NodeRuntime,
pub user_store: Model<UserStore>,
pub languages: Arc<LanguageRegistry>,
pub fs: Arc<dyn Fs>,
}
struct GlobalDevServer(Model<DevServer>);
impl Global for GlobalDevServer {}
pub fn init(client: Arc<Client>, app_state: AppState, cx: &mut AppContext) -> Task<Result<()>> {
let dev_server = cx.new_model(|cx| DevServer::new(client.clone(), app_state, cx));
cx.set_global(GlobalDevServer(dev_server.clone()));
#[cfg(not(target_os = "windows"))]
{
use signal_hook::consts::{SIGINT, SIGTERM};
use signal_hook::iterator::Signals;
// Set up a handler when the dev server is shut down
// with ctrl-c or kill
let (tx, rx) = futures::channel::oneshot::channel();
let mut signals = Signals::new([SIGTERM, SIGINT]).unwrap();
std::thread::spawn({
move || {
if let Some(sig) = signals.forever().next() {
tx.send(sig).log_err();
}
}
});
cx.spawn(|cx| async move {
if let Ok(sig) = rx.await {
log::info!("received signal {sig:?}");
cx.update(|cx| cx.quit()).log_err();
}
})
.detach();
}
let server_url = ClientSettings::get_global(cx).server_url.clone();
cx.spawn(|cx| async move {
client
.authenticate_and_connect(false, &cx)
.await
.map_err(|e| anyhow!("Error connecting to '{}': {}", server_url, e))
})
}
impl DevServer {
pub fn global(cx: &AppContext) -> Model<DevServer> {
cx.global::<GlobalDevServer>().0.clone()
}
pub fn new(client: Arc<Client>, app_state: AppState, cx: &mut ModelContext<Self>) -> Self {
cx.on_app_quit(Self::app_will_quit).detach();
let maintain_connection = cx.spawn({
let client = client.clone();
move |this, cx| Self::maintain_connection(this, client.clone(), cx).log_err()
});
cx.observe_global::<SettingsStore>(|_, cx| {
ExtensionStore::global(cx).update(cx, |store, cx| store.auto_install_extensions(cx))
})
.detach();
DevServer {
_subscriptions: vec![
client.add_message_handler(cx.weak_model(), Self::handle_dev_server_instructions),
client.add_request_handler(
cx.weak_model(),
Self::handle_validate_dev_server_project_request,
),
client.add_request_handler(cx.weak_model(), Self::handle_list_remote_directory),
client.add_message_handler(cx.weak_model(), Self::handle_shutdown),
],
_maintain_connection: maintain_connection,
projects: Default::default(),
remote_shutdown: false,
app_state,
client,
}
}
fn app_will_quit(&mut self, _: &mut ModelContext<Self>) -> impl Future<Output = ()> {
let request = if self.remote_shutdown {
None
} else {
Some(
self.client
.request(proto::ShutdownDevServer { reason: None }),
)
};
async move {
if let Some(request) = request {
request.await.log_err();
}
}
}
async fn handle_dev_server_instructions(
this: Model<Self>,
envelope: TypedEnvelope<proto::DevServerInstructions>,
mut cx: AsyncAppContext,
) -> Result<()> {
let (added_projects, retained_projects, removed_projects_ids) =
this.read_with(&mut cx, |this, _| {
let removed_projects = this
.projects
.keys()
.filter(|dev_server_project_id| {
!envelope
.payload
.projects
.iter()
.any(|p| p.id == dev_server_project_id.0)
})
.cloned()
.collect::<Vec<_>>();
let mut added_projects = vec![];
let mut retained_projects = vec![];
for project in envelope.payload.projects.iter() {
if this.projects.contains_key(&DevServerProjectId(project.id)) {
retained_projects.push(project.clone());
} else {
added_projects.push(project.clone());
}
}
(added_projects, retained_projects, removed_projects)
})?;
for dev_server_project in added_projects {
DevServer::share_project(this.clone(), &dev_server_project, &mut cx).await?;
}
for dev_server_project in retained_projects {
DevServer::update_project(this.clone(), &dev_server_project, &mut cx).await?;
}
this.update(&mut cx, |this, cx| {
for old_project_id in &removed_projects_ids {
this.unshare_project(old_project_id, cx)?;
}
Ok::<(), anyhow::Error>(())
})??;
Ok(())
}
async fn handle_validate_dev_server_project_request(
this: Model<Self>,
envelope: TypedEnvelope<proto::ValidateDevServerProjectRequest>,
cx: AsyncAppContext,
) -> Result<proto::Ack> {
let expanded = shellexpand::tilde(&envelope.payload.path).to_string();
let path = std::path::Path::new(&expanded);
let fs = cx.read_model(&this, |this, _| this.app_state.fs.clone())?;
let path_exists = fs.metadata(path).await.is_ok_and(|result| result.is_some());
if !path_exists {
return Err(anyhow!(ErrorCode::DevServerProjectPathDoesNotExist))?;
}
Ok(proto::Ack {})
}
async fn handle_list_remote_directory(
this: Model<Self>,
envelope: TypedEnvelope<proto::ListRemoteDirectory>,
cx: AsyncAppContext,
) -> Result<proto::ListRemoteDirectoryResponse> {
let expanded = shellexpand::tilde(&envelope.payload.path).to_string();
let fs = cx.read_model(&this, |this, _| this.app_state.fs.clone())?;
let mut entries = Vec::new();
let mut response = fs.read_dir(Path::new(&expanded)).await?;
while let Some(path) = response.next().await {
if let Some(file_name) = path?.file_name() {
entries.push(file_name.to_string_lossy().to_string());
}
}
Ok(proto::ListRemoteDirectoryResponse { entries })
}
async fn handle_shutdown(
this: Model<Self>,
_envelope: TypedEnvelope<proto::ShutdownDevServer>,
mut cx: AsyncAppContext,
) -> Result<()> {
this.update(&mut cx, |this, cx| {
this.remote_shutdown = true;
cx.quit();
})
}
fn unshare_project(
&mut self,
dev_server_project_id: &DevServerProjectId,
cx: &mut ModelContext<Self>,
) -> Result<()> {
if let Some(project) = self.projects.remove(dev_server_project_id) {
project.update(cx, |project, cx| project.unshare(cx))?;
}
Ok(())
}
async fn share_project(
this: Model<Self>,
dev_server_project: &proto::DevServerProject,
cx: &mut AsyncAppContext,
) -> Result<()> {
let (client, project) = this.update(cx, |this, cx| {
let project = Project::local(
this.client.clone(),
this.app_state.node_runtime.clone(),
this.app_state.user_store.clone(),
this.app_state.languages.clone(),
this.app_state.fs.clone(),
None,
cx,
);
(this.client.clone(), project)
})?;
for path in &dev_server_project.paths {
let path = shellexpand::tilde(path).to_string();
let (worktree, _) = project
.update(cx, |project, cx| {
project.find_or_create_worktree(&path, true, cx)
})?
.await?;
worktree.update(cx, |worktree, cx| {
worktree.as_local_mut().unwrap().share_private_files(cx)
})?;
}
let worktrees =
project.read_with(cx, |project, cx| project.worktree_metadata_protos(cx))?;
let response = client
.request(proto::ShareDevServerProject {
dev_server_project_id: dev_server_project.id,
worktrees,
})
.await?;
let project_id = response.project_id;
project.update(cx, |project, cx| project.shared(project_id, cx))??;
this.update(cx, |this, _| {
this.projects
.insert(DevServerProjectId(dev_server_project.id), project);
})?;
Ok(())
}
async fn update_project(
this: Model<Self>,
dev_server_project: &proto::DevServerProject,
cx: &mut AsyncAppContext,
) -> Result<()> {
let tasks = this.update(cx, |this, cx| {
let Some(project) = this
.projects
.get(&DevServerProjectId(dev_server_project.id))
else {
return vec![];
};
let mut to_delete = vec![];
let mut tasks = vec![];
project.update(cx, |project, cx| {
for worktree in project.visible_worktrees(cx) {
let mut delete = true;
for config in dev_server_project.paths.iter() {
if worktree.read(cx).abs_path().to_string_lossy()
== shellexpand::tilde(config)
{
delete = false;
}
}
if delete {
to_delete.push(worktree.read(cx).id())
}
}
for worktree_id in to_delete {
project.remove_worktree(worktree_id, cx)
}
for config in dev_server_project.paths.iter() {
tasks.push(project.find_or_create_worktree(
shellexpand::tilde(config).to_string(),
true,
cx,
));
}
tasks
})
})?;
futures::future::join_all(tasks).await;
Ok(())
}
async fn maintain_connection(
this: WeakModel<Self>,
client: Arc<Client>,
mut cx: AsyncAppContext,
) -> Result<()> {
let mut client_status = client.status();
let _ = client_status.try_recv();
let current_status = *client_status.borrow();
if current_status.is_connected() {
// wait for first disconnect
client_status.recv().await;
}
loop {
let Some(current_status) = client_status.recv().await else {
return Ok(());
};
let Some(this) = this.upgrade() else {
return Ok(());
};
if !current_status.is_connected() {
continue;
}
this.update(&mut cx, |this, cx| this.rejoin(cx))?.await?;
}
}
fn rejoin(&mut self, cx: &mut ModelContext<Self>) -> Task<Result<()>> {
let mut projects: HashMap<u64, Model<Project>> = HashMap::default();
let request = self.client.request(proto::ReconnectDevServer {
reshared_projects: self
.projects
.iter()
.flat_map(|(_, handle)| {
let project = handle.read(cx);
let project_id = project.remote_id()?;
projects.insert(project_id, handle.clone());
Some(proto::UpdateProject {
project_id,
worktrees: project.worktree_metadata_protos(cx),
})
})
.collect(),
});
cx.spawn(|_, mut cx| async move {
let response = request.await?;
for reshared_project in response.reshared_projects {
if let Some(project) = projects.get(&reshared_project.id) {
project.update(&mut cx, |project, cx| {
project.reshared(reshared_project, cx).log_err();
})?;
}
}
Ok(())
})
}
}

View File

@@ -1,3 +1,4 @@
use anyhow::Context as _;
use gpui::{
canvas, div, fill, img, opaque_grey, point, size, AnyElement, AppContext, Bounds, Context,
EventEmitter, FocusHandle, FocusableView, Img, InteractiveElement, IntoElement, Model,
@@ -19,6 +20,7 @@ use workspace::{
const IMAGE_VIEWER_KIND: &str = "ImageView";
pub struct ImageItem {
id: ProjectEntryId,
path: PathBuf,
project_path: ProjectPath,
}
@@ -48,9 +50,15 @@ impl project::Item for ImageItem {
.read_with(&cx, |project, cx| project.absolute_path(&path, cx))?
.ok_or_else(|| anyhow::anyhow!("Failed to find the absolute path"))?;
let id = project
.update(&mut cx, |project, cx| project.entry_for_path(&path, cx))?
.context("Entry not found")?
.id;
cx.new_model(|_| ImageItem {
path: abs_path,
project_path: path,
id,
})
}))
} else {
@@ -59,7 +67,7 @@ impl project::Item for ImageItem {
}
fn entry_id(&self, _: &AppContext) -> Option<ProjectEntryId> {
None
Some(self.id)
}
fn project_path(&self, _: &AppContext) -> Option<ProjectPath> {
@@ -68,18 +76,30 @@ impl project::Item for ImageItem {
}
pub struct ImageView {
path: PathBuf,
image: Model<ImageItem>,
focus_handle: FocusHandle,
}
impl Item for ImageView {
type Event = ();
fn tab_content(&self, params: TabContentParams, _cx: &WindowContext) -> AnyElement {
let title = self
.path
fn for_each_project_item(
&self,
cx: &AppContext,
f: &mut dyn FnMut(gpui::EntityId, &dyn project::Item),
) {
f(self.image.entity_id(), self.image.read(cx))
}
fn is_singleton(&self, _cx: &AppContext) -> bool {
true
}
fn tab_content(&self, params: TabContentParams, cx: &WindowContext) -> AnyElement {
let path = &self.image.read(cx).path;
let title = path
.file_name()
.unwrap_or_else(|| self.path.as_os_str())
.unwrap_or_else(|| path.as_os_str())
.to_string_lossy()
.to_string();
Label::new(title)
@@ -90,9 +110,10 @@ impl Item for ImageView {
}
fn tab_icon(&self, cx: &WindowContext) -> Option<Icon> {
let path = &self.image.read(cx).path;
ItemSettings::get_global(cx)
.file_icons
.then(|| FileIcons::get_icon(self.path.as_path(), cx))
.then(|| FileIcons::get_icon(path.as_path(), cx))
.flatten()
.map(Icon::from_path)
}
@@ -106,7 +127,7 @@ impl Item for ImageView {
Self: Sized,
{
Some(cx.new_view(|cx| Self {
path: self.path.clone(),
image: self.image.clone(),
focus_handle: cx.focus_handle(),
}))
}
@@ -118,7 +139,7 @@ impl SerializableItem for ImageView {
}
fn deserialize(
_project: Model<Project>,
project: Model<Project>,
_workspace: WeakView<Workspace>,
workspace_id: WorkspaceId,
item_id: ItemId,
@@ -129,10 +150,38 @@ impl SerializableItem for ImageView {
.get_image_path(item_id, workspace_id)?
.ok_or_else(|| anyhow::anyhow!("No image path found"))?;
cx.new_view(|cx| ImageView {
path: image_path,
focus_handle: cx.focus_handle(),
})
let (worktree, relative_path) = project
.update(&mut cx, |project, cx| {
project.find_or_create_worktree(image_path.clone(), false, cx)
})?
.await
.context("Path not found")?;
let worktree_id = worktree.update(&mut cx, |worktree, _cx| worktree.id())?;
let project_path = ProjectPath {
worktree_id,
path: relative_path.into(),
};
let id = project
.update(&mut cx, |project, cx| {
project.entry_for_path(&project_path, cx)
})?
.context("No entry found")?
.id;
cx.update(|cx| {
let image = cx.new_model(|_| ImageItem {
id,
path: image_path,
project_path,
});
Ok(cx.new_view(|cx| ImageView {
image,
focus_handle: cx.focus_handle(),
}))
})?
})
}
@@ -154,7 +203,7 @@ impl SerializableItem for ImageView {
let workspace_id = workspace.database_id()?;
Some(cx.background_executor().spawn({
let image_path = self.path.clone();
let image_path = self.image.read(cx).path.clone();
async move {
IMAGE_VIEWER
.save_image_path(item_id, workspace_id, image_path)
@@ -177,6 +226,7 @@ impl FocusableView for ImageView {
impl Render for ImageView {
fn render(&mut self, cx: &mut ViewContext<Self>) -> impl IntoElement {
let image_path = self.image.read(cx).path.clone();
let checkered_background = |bounds: Bounds<Pixels>, _, cx: &mut WindowContext| {
let square_size = 32.0;
@@ -233,7 +283,7 @@ impl Render for ImageView {
// TODO: In browser based Tailwind & Flex this would be h-screen and we'd use w-full
.h_full()
.child(
img(self.path.clone())
img(image_path)
.object_fit(ObjectFit::ScaleDown)
.max_w_full()
.max_h_full(),
@@ -254,7 +304,7 @@ impl ProjectItem for ImageView {
Self: Sized,
{
Self {
path: item.read(cx).path.clone(),
image: item,
focus_handle: cx.focus_handle(),
}
}

View File

@@ -501,8 +501,6 @@ pub struct Chunk<'a> {
pub is_unnecessary: bool,
/// Whether this chunk of text was originally a tab character.
pub is_tab: bool,
/// Whether this chunk of text is an invisible character.
pub is_invisible: bool,
/// An optional recipe for how the chunk should be presented.
pub renderer: Option<ChunkRenderer>,
}
@@ -1969,18 +1967,27 @@ impl Buffer {
let new_text_length = new_text.len();
let old_start = range.start.to_point(&before_edit);
let new_start = (delta + range.start as isize) as usize;
delta += new_text_length as isize - (range.end as isize - range.start as isize);
let range_len = range.end - range.start;
delta += new_text_length as isize - range_len as isize;
// Decide what range of the insertion to auto-indent, and whether
// the first line of the insertion should be considered a newly-inserted line
// or an edit to an existing line.
let mut range_of_insertion_to_indent = 0..new_text_length;
let mut first_line_is_new = false;
let mut original_indent_column = None;
let mut first_line_is_new = true;
// When inserting an entire line at the beginning of an existing line,
// treat the insertion as new.
if new_text.contains('\n')
&& old_start.column <= before_edit.indent_size_for_line(old_start.row).len
let old_line_start = before_edit.indent_size_for_line(old_start.row).len;
let old_line_end = before_edit.line_len(old_start.row);
if old_start.column > old_line_start {
first_line_is_new = false;
}
if !new_text.contains('\n')
&& (old_start.column + (range_len as u32) < old_line_end
|| old_line_end == old_line_start)
{
first_line_is_new = true;
first_line_is_new = false;
}
// When inserting text starting with a newline, avoid auto-indenting the
@@ -1990,7 +1997,7 @@ impl Buffer {
first_line_is_new = true;
}
// Avoid auto-indenting after the insertion.
let mut original_indent_column = None;
if let AutoindentMode::Block {
original_indent_columns,
} = &mode
@@ -2002,6 +2009,8 @@ impl Buffer {
)
.len
}));
// Avoid auto-indenting the line after the edit.
if new_text[range_of_insertion_to_indent.clone()].ends_with('\n') {
range_of_insertion_to_indent.end -= 1;
}
@@ -4037,7 +4046,7 @@ impl<'a> BufferChunks<'a> {
let old_range = std::mem::replace(&mut self.range, range.clone());
self.chunks.set_range(self.range.clone());
if let Some(highlights) = self.highlights.as_mut() {
if old_range.start >= self.range.start && old_range.end <= self.range.end {
if old_range.start <= self.range.start && old_range.end >= self.range.end {
// Reuse existing highlights stack, as the new range is a subrange of the old one.
highlights
.stack
@@ -4213,6 +4222,7 @@ impl<'a> Iterator for BufferChunks<'a> {
if self.range.start == self.chunks.offset() + chunk.len() {
self.chunks.next().unwrap();
}
Some(Chunk {
text: slice,
syntax_highlight_id: highlight_id,

View File

@@ -1241,7 +1241,6 @@ fn test_autoindent_does_not_adjust_lines_with_unchanged_suggestion(cx: &mut AppC
Some(AutoindentMode::EachLine),
cx,
);
assert_eq!(
buffer.text(),
"
@@ -1256,6 +1255,74 @@ fn test_autoindent_does_not_adjust_lines_with_unchanged_suggestion(cx: &mut AppC
"
.unindent()
);
// Insert a newline after the open brace. It is auto-indented
buffer.edit_via_marked_text(
&"
fn a() {«
»
c
.f
.g();
d
.f
.g();
}
"
.unindent(),
Some(AutoindentMode::EachLine),
cx,
);
assert_eq!(
buffer.text(),
"
fn a() {
ˇ
c
.f
.g();
d
.f
.g();
}
"
.unindent()
.replace("ˇ", "")
);
// Manually outdent the line. It stays outdented.
buffer.edit_via_marked_text(
&"
fn a() {
«»
c
.f
.g();
d
.f
.g();
}
"
.unindent(),
Some(AutoindentMode::EachLine),
cx,
);
assert_eq!(
buffer.text(),
"
fn a() {
c
.f
.g();
d
.f
.g();
}
"
.unindent()
);
buffer
});

View File

@@ -10,7 +10,7 @@ workspace = true
[features]
test-support = [
"tree-sitter"
"load-grammars"
]
load-grammars = [
"tree-sitter-bash",
@@ -82,3 +82,8 @@ text.workspace = true
theme = { workspace = true, features = ["test-support"] }
unindent.workspace = true
workspace = { workspace = true, features = ["test-support"] }
tree-sitter-typescript.workspace = true
tree-sitter-python.workspace = true
tree-sitter-go.workspace = true
tree-sitter-c.workspace = true
tree-sitter-css.workspace = true

View File

@@ -288,15 +288,6 @@ fn load_config(name: &str) -> LanguageConfig {
.with_context(|| format!("failed to load config.toml for language {name:?}"))
.unwrap();
#[cfg(not(feature = "load-grammars"))]
{
config = LanguageConfig {
name: config.name,
matcher: config.matcher,
..Default::default()
}
}
config
}

View File

@@ -5,6 +5,9 @@ line_comments = ["// ", "/// ", "//! "]
autoclose_before = ";:.,=}])>"
brackets = [
{ start = "{", end = "}", close = true, newline = true },
{ start = "r#\"", end = "\"#", close = true, newline = true },
{ start = "r##\"", end = "\"##", close = true, newline = true },
{ start = "r###\"", end = "\"###", close = true, newline = true },
{ start = "[", end = "]", close = true, newline = true },
{ start = "(", end = ")", close = true, newline = true },
{ start = "<", end = ">", close = false, newline = true, not_in = ["string", "comment"] },
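The three pairs added above auto-close Rust raw-string delimiters. For reference, a small standalone example of the literals each level of `#` markers corresponds to:

fn main() {
    let one = r#"a "quoted" word"#;           // matched by the r#" / "# pair
    let two = r##"contains "# inside"##;      // r##" / "##
    let three = r###"contains "## inside"###; // r###" / "###
    println!("{one}\n{two}\n{three}");
}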

View File

@@ -1177,6 +1177,8 @@ impl FakeLanguageServer {
let (stdout_writer, stdout_reader) = async_pipe::pipe();
let (notifications_tx, notifications_rx) = channel::unbounded();
let root = Self::root_path();
let mut server = LanguageServer::new_internal(
server_id,
stdin_writer,
@@ -1184,8 +1186,8 @@ impl FakeLanguageServer {
None::<async_pipe::PipeReader>,
Arc::new(Mutex::new(None)),
None,
Path::new("/"),
Path::new("/"),
root,
root,
None,
cx.clone(),
|_| {},
@@ -1201,8 +1203,8 @@ impl FakeLanguageServer {
None::<async_pipe::PipeReader>,
Arc::new(Mutex::new(None)),
None,
Path::new("/"),
Path::new("/"),
root,
root,
None,
cx,
move |msg| {
@@ -1238,6 +1240,16 @@ impl FakeLanguageServer {
(server, fake)
}
#[cfg(target_os = "windows")]
fn root_path() -> &'static Path {
Path::new("C:\\")
}
#[cfg(not(target_os = "windows"))]
fn root_path() -> &'static Path {
Path::new("/")
}
}
#[cfg(any(test, feature = "test-support"))]

View File

@@ -234,6 +234,10 @@ impl<'a> MarkdownParser<'a> {
text.push('\n');
}
// We want to ignore any inline HTML tags in the text but keep
// the text between them
Event::InlineHtml(_) => {}
Event::Text(t) => {
text.push_str(t.as_ref());
@@ -626,6 +630,8 @@ impl<'a> MarkdownParser<'a> {
// Otherwise we need to insert the block after all the nested items
// that have been parsed so far
items.extend(block);
} else {
self.cursor += 1;
}
}
}
@@ -847,6 +853,16 @@ mod tests {
);
}
#[gpui::test]
async fn test_text_with_inline_html() {
let parsed = parse("This is a paragraph with an inline HTML <sometag>tag</sometag>.").await;
assert_eq!(
parsed.children,
vec![p("This is a paragraph with an inline HTML tag.", 0..63),],
);
}
#[gpui::test]
async fn test_raw_links_detection() {
let parsed = parse("Checkout this https://zed.dev link").await;
@@ -1090,6 +1106,26 @@ Some other content
);
}
#[gpui::test]
async fn test_list_item_with_inline_html() {
let parsed = parse(
"\
* This is a list item with an inline HTML <sometag>tag</sometag>.
",
)
.await;
assert_eq!(
parsed.children,
vec![list_item(
0..67,
1,
Unordered,
vec![p("This is a list item with an inline HTML tag.", 4..44),],
),],
);
}
#[gpui::test]
async fn test_nested_list_with_paragraph_inside() {
let parsed = parse(

View File

@@ -2862,6 +2862,30 @@ impl MultiBufferSnapshot {
}
}
pub fn indent_and_comment_for_line(&self, row: MultiBufferRow, cx: &AppContext) -> String {
let mut indent = self.indent_size_for_line(row).chars().collect::<String>();
if self.settings_at(0, cx).extend_comment_on_newline {
if let Some(language_scope) = self.language_scope_at(Point::new(row.0, 0)) {
let delimiters = language_scope.line_comment_prefixes();
for delimiter in delimiters {
if *self
.chars_at(Point::new(row.0, indent.len() as u32))
.take(delimiter.chars().count())
.collect::<String>()
.as_str()
== **delimiter
{
indent.push_str(&delimiter);
break;
}
}
}
}
indent
}
pub fn prev_non_blank_row(&self, mut row: MultiBufferRow) -> Option<MultiBufferRow> {
while row.0 > 0 {
row.0 -= 1;

View File

@@ -41,7 +41,7 @@ use search::{BufferSearchBar, ProjectSearchView};
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsStore};
use smol::channel;
use theme::SyntaxTheme;
use theme::{SyntaxTheme, ThemeSettings};
use util::{debug_panic, RangeExt, ResultExt, TryFutureExt};
use workspace::{
dock::{DockPosition, Panel, PanelEvent},
@@ -653,13 +653,25 @@ impl OutlinePanel {
});
let mut outline_panel_settings = *OutlinePanelSettings::get_global(cx);
let settings_subscription = cx.observe_global::<SettingsStore>(move |_, cx| {
let new_settings = *OutlinePanelSettings::get_global(cx);
if outline_panel_settings != new_settings {
outline_panel_settings = new_settings;
cx.notify();
}
});
let mut current_theme = ThemeSettings::get_global(cx).clone();
let settings_subscription =
cx.observe_global::<SettingsStore>(move |outline_panel, cx| {
let new_settings = OutlinePanelSettings::get_global(cx);
let new_theme = ThemeSettings::get_global(cx);
if &current_theme != new_theme {
outline_panel_settings = *new_settings;
current_theme = new_theme.clone();
for excerpts in outline_panel.excerpts.values_mut() {
for excerpt in excerpts.values_mut() {
excerpt.invalidate_outlines();
}
}
outline_panel.update_non_fs_items(cx);
} else if &outline_panel_settings != new_settings {
outline_panel_settings = *new_settings;
cx.notify();
}
});
let mut outline_panel = Self {
mode: ItemsDisplayMode::Outline,

View File

@@ -30,7 +30,6 @@ async-trait.workspace = true
client.workspace = true
clock.workspace = true
collections.workspace = true
dev_server_projects.workspace = true
fs.workspace = true
futures.workspace = true
fuzzy.workspace = true

View File

@@ -25,8 +25,7 @@ mod yarn;
use anyhow::{anyhow, Context as _, Result};
use buffer_store::{BufferStore, BufferStoreEvent};
use client::{
proto, Client, Collaborator, DevServerProjectId, PendingEntitySubscription, ProjectId,
TypedEnvelope, UserStore,
proto, Client, Collaborator, PendingEntitySubscription, ProjectId, TypedEnvelope, UserStore,
};
use clock::ReplicaId;
use collections::{BTreeSet, HashMap, HashSet};
@@ -156,7 +155,6 @@ pub struct Project {
terminals: Terminals,
node: Option<NodeRuntime>,
hosted_project_id: Option<ProjectId>,
dev_server_project_id: Option<client::DevServerProjectId>,
search_history: SearchHistory,
search_included_history: SearchHistory,
search_excluded_history: SearchHistory,
@@ -217,7 +215,6 @@ enum ProjectClientState {
capability: Capability,
remote_id: u64,
replica_id: ReplicaId,
in_room: bool,
},
}
@@ -675,7 +672,6 @@ impl Project {
},
node: Some(node),
hosted_project_id: None,
dev_server_project_id: None,
search_history: Self::new_search_history(),
environment,
remotely_created_models: Default::default(),
@@ -705,7 +701,7 @@ impl Project {
let ssh_proto = ssh.read(cx).proto_client();
let worktree_store =
cx.new_model(|_| WorktreeStore::remote(false, ssh_proto.clone(), 0, None));
cx.new_model(|_| WorktreeStore::remote(false, ssh_proto.clone(), 0));
cx.subscribe(&worktree_store, Self::on_worktree_store_event)
.detach();
@@ -794,7 +790,6 @@ impl Project {
},
node: Some(node),
hosted_project_id: None,
dev_server_project_id: None,
search_history: Self::new_search_history(),
environment,
remotely_created_models: Default::default(),
@@ -898,15 +893,7 @@ impl Project {
let role = response.payload.role();
let worktree_store = cx.new_model(|_| {
WorktreeStore::remote(
true,
client.clone().into(),
response.payload.project_id,
response
.payload
.dev_server_project_id
.map(DevServerProjectId),
)
WorktreeStore::remote(true, client.clone().into(), response.payload.project_id)
})?;
let buffer_store = cx.new_model(|cx| {
BufferStore::remote(worktree_store.clone(), client.clone().into(), remote_id, cx)
@@ -992,7 +979,6 @@ impl Project {
capability: Capability::ReadWrite,
remote_id,
replica_id,
in_room: response.payload.dev_server_project_id.is_none(),
},
buffers_needing_diff: Default::default(),
git_diff_debouncer: DebouncedDelay::new(),
@@ -1001,10 +987,6 @@ impl Project {
},
node: None,
hosted_project_id: None,
dev_server_project_id: response
.payload
.dev_server_project_id
.map(DevServerProjectId),
search_history: Self::new_search_history(),
search_included_history: Self::new_search_history(),
search_excluded_history: Self::new_search_history(),
@@ -1305,39 +1287,23 @@ impl Project {
self.hosted_project_id
}
pub fn dev_server_project_id(&self) -> Option<DevServerProjectId> {
self.dev_server_project_id
}
pub fn supports_terminal(&self, cx: &AppContext) -> bool {
pub fn supports_terminal(&self, _cx: &AppContext) -> bool {
if self.is_local() {
return true;
}
if self.is_via_ssh() {
return true;
}
let Some(id) = self.dev_server_project_id else {
return false;
};
let Some(server) = dev_server_projects::Store::global(cx)
.read(cx)
.dev_server_for_project(id)
else {
return false;
};
server.ssh_connection_string.is_some()
return false;
}
pub fn ssh_connection_string(&self, cx: &AppContext) -> Option<SharedString> {
if let Some(ssh_state) = &self.ssh_client {
return Some(ssh_state.read(cx).connection_string().into());
}
let dev_server_id = self.dev_server_project_id()?;
dev_server_projects::Store::global(cx)
.read(cx)
.dev_server_for_project(dev_server_id)?
.ssh_connection_string
.clone()
return None;
}
pub fn ssh_connection_state(&self, cx: &AppContext) -> Option<remote::ConnectionState> {
@@ -1549,17 +1515,9 @@ impl Project {
pub fn shared(&mut self, project_id: u64, cx: &mut ModelContext<Self>) -> Result<()> {
if !matches!(self.client_state, ProjectClientState::Local) {
if let ProjectClientState::Remote { in_room, .. } = &mut self.client_state {
if *in_room || self.dev_server_project_id.is_none() {
return Err(anyhow!("project was already shared"));
} else {
*in_room = true;
return Ok(());
}
} else {
return Err(anyhow!("project was already shared"));
}
return Err(anyhow!("project was already shared"));
}
self.client_subscriptions.extend([
self.client
.subscribe_to_entity(project_id)?
@@ -1657,14 +1615,7 @@ impl Project {
fn unshare_internal(&mut self, cx: &mut AppContext) -> Result<()> {
if self.is_via_collab() {
if self.dev_server_project_id().is_some() {
if let ProjectClientState::Remote { in_room, .. } = &mut self.client_state {
*in_room = false
}
return Ok(());
} else {
return Err(anyhow!("attempted to unshare a remote project"));
}
return Err(anyhow!("attempted to unshare a remote project"));
}
if let ProjectClientState::Shared { remote_id, .. } = self.client_state {
@@ -1924,11 +1875,7 @@ impl Project {
})
}
pub fn get_open_buffer(
&mut self,
path: &ProjectPath,
cx: &mut ModelContext<Self>,
) -> Option<Model<Buffer>> {
pub fn get_open_buffer(&self, path: &ProjectPath, cx: &AppContext) -> Option<Model<Buffer>> {
self.buffer_store.read(cx).get_by_path(path, cx)
}
@@ -2265,29 +2212,6 @@ impl Project {
}
fn on_worktree_released(&mut self, id_to_remove: WorktreeId, cx: &mut ModelContext<Self>) {
if let Some(dev_server_project_id) = self.dev_server_project_id {
let paths: Vec<String> = self
.visible_worktrees(cx)
.filter_map(|worktree| {
if worktree.read(cx).id() == id_to_remove {
None
} else {
Some(worktree.read(cx).abs_path().to_string_lossy().to_string())
}
})
.collect();
if !paths.is_empty() {
let request = self.client.request(proto::UpdateDevServerProject {
dev_server_project_id: dev_server_project_id.0,
paths,
});
cx.background_executor()
.spawn(request)
.detach_and_log_err(cx);
}
return;
}
if let Some(ssh) = &self.ssh_client {
ssh.read(cx)
.proto_client()
@@ -3152,7 +3076,7 @@ impl Project {
match &self.client_state {
ProjectClientState::Shared { .. } => true,
ProjectClientState::Local => false,
ProjectClientState::Remote { in_room, .. } => *in_room,
ProjectClientState::Remote { .. } => true,
}
}
@@ -3279,20 +3203,6 @@ impl Project {
let response = response.await?;
Ok(response.entries.into_iter().map(PathBuf::from).collect())
})
} else if let Some(dev_server) = self.dev_server_project_id().and_then(|id| {
dev_server_projects::Store::global(cx)
.read(cx)
.dev_server_for_project(id)
}) {
let request = proto::ListRemoteDirectory {
dev_server_id: dev_server.id.0,
path: query,
};
let response = self.client.request(request);
cx.background_executor().spawn(async move {
let response = response.await?;
Ok(response.entries.into_iter().map(PathBuf::from).collect())
})
} else {
Task::ready(Err(anyhow!("cannot list directory in remote project")))
}
@@ -3381,17 +3291,10 @@ impl Project {
}
pub fn absolute_path(&self, project_path: &ProjectPath, cx: &AppContext) -> Option<PathBuf> {
let workspace_root = self
.worktree_for_id(project_path.worktree_id, cx)?
self.worktree_for_id(project_path.worktree_id, cx)?
.read(cx)
.abs_path();
let project_path = project_path.path.as_ref();
Some(if project_path == Path::new("") {
workspace_root.to_path_buf()
} else {
workspace_root.join(project_path)
})
.absolutize(&project_path.path)
.ok()
}
/// Attempts to find a `ProjectPath` corresponding to the given path. If the path

View File

@@ -37,11 +37,8 @@ pub enum TerminalKind {
/// SshCommand describes how to connect to a remote server
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum SshCommand {
/// DevServers give a string from the user
DevServer(String),
/// Direct ssh has a list of arguments to pass to ssh
Direct(Vec<String>),
pub struct SshCommand {
arguments: Vec<String>,
}
impl Project {
@@ -73,19 +70,12 @@ impl Project {
if let Some(args) = ssh_client.ssh_args() {
return Some((
ssh_client.connection_options().host.clone(),
SshCommand::Direct(args),
SshCommand { arguments: args },
));
}
}
let dev_server_project_id = self.dev_server_project_id()?;
let projects_store = dev_server_projects::Store::global(cx).read(cx);
let ssh_command = projects_store
.dev_server_for_project(dev_server_project_id)?
.ssh_connection_string
.as_ref()?
.to_string();
Some(("".to_string(), SshCommand::DevServer(ssh_command)))
return None;
}
pub fn create_terminal(
@@ -399,14 +389,8 @@ pub fn wrap_for_ssh(
};
let shell_invocation = format!("sh -c {}", shlex::try_quote(&commands).unwrap());
let (program, mut args) = match ssh_command {
SshCommand::DevServer(ssh_command) => {
let mut args = shlex::split(ssh_command).unwrap_or_default();
let program = args.drain(0..1).next().unwrap_or("ssh".to_string());
(program, args)
}
SshCommand::Direct(ssh_args) => ("ssh".to_string(), ssh_args.clone()),
};
let program = "ssh".to_string();
let mut args = ssh_command.arguments.clone();
args.push("-t".to_string());
args.push(shell_invocation);

View File

@@ -1,11 +1,9 @@
use std::{
cell::RefCell,
path::{Path, PathBuf},
sync::{atomic::AtomicUsize, Arc},
};
use anyhow::{anyhow, Context as _, Result};
use client::DevServerProjectId;
use collections::{HashMap, HashSet};
use fs::Fs;
use futures::{
@@ -41,7 +39,6 @@ enum WorktreeStoreState {
fs: Arc<dyn Fs>,
},
Remote {
dev_server_project_id: Option<DevServerProjectId>,
upstream_client: AnyProtoClient,
upstream_project_id: u64,
},
@@ -94,7 +91,6 @@ impl WorktreeStore {
retain_worktrees: bool,
upstream_client: AnyProtoClient,
upstream_project_id: u64,
dev_server_project_id: Option<DevServerProjectId>,
) -> Self {
Self {
next_entry_id: Default::default(),
@@ -106,7 +102,6 @@ impl WorktreeStore {
state: WorktreeStoreState::Remote {
upstream_client,
upstream_project_id,
dev_server_project_id,
},
}
}
@@ -196,18 +191,9 @@ impl WorktreeStore {
if !self.loading_worktrees.contains_key(&path) {
let task = match &self.state {
WorktreeStoreState::Remote {
upstream_client,
dev_server_project_id,
..
upstream_client, ..
} => {
if let Some(dev_server_project_id) = dev_server_project_id {
self.create_dev_server_worktree(
upstream_client.clone(),
*dev_server_project_id,
abs_path,
cx,
)
} else if upstream_client.is_via_collab() {
if upstream_client.is_via_collab() {
Task::ready(Err(Arc::new(anyhow!("cannot create worktrees via collab"))))
} else {
self.create_ssh_worktree(upstream_client.clone(), abs_path, visible, cx)
@@ -322,51 +308,6 @@ impl WorktreeStore {
})
}
fn create_dev_server_worktree(
&mut self,
client: AnyProtoClient,
dev_server_project_id: DevServerProjectId,
abs_path: impl AsRef<Path>,
cx: &mut ModelContext<Self>,
) -> Task<Result<Model<Worktree>, Arc<anyhow::Error>>> {
let path: Arc<Path> = abs_path.as_ref().into();
let mut paths: Vec<String> = self
.visible_worktrees(cx)
.map(|worktree| worktree.read(cx).abs_path().to_string_lossy().to_string())
.collect();
paths.push(path.to_string_lossy().to_string());
let request = client.request(proto::UpdateDevServerProject {
dev_server_project_id: dev_server_project_id.0,
paths,
});
let abs_path = abs_path.as_ref().to_path_buf();
cx.spawn(move |project, cx| async move {
let (tx, rx) = futures::channel::oneshot::channel();
let tx = RefCell::new(Some(tx));
let Some(project) = project.upgrade() else {
return Err(anyhow!("project dropped"))?;
};
let observer = cx.update(|cx| {
cx.observe(&project, move |project, cx| {
let abs_path = abs_path.clone();
project.update(cx, |project, cx| {
if let Some((worktree, _)) = project.find_worktree(&abs_path, cx) {
if let Some(tx) = tx.borrow_mut().take() {
tx.send(worktree).ok();
}
}
})
})
})?;
request.await?;
let worktree = rx.await.map_err(|e| anyhow!(e))?;
drop(observer);
Ok(worktree)
})
}
pub fn add(&mut self, worktree: &Model<Worktree>, cx: &mut ModelContext<Self>) {
let worktree_id = worktree.read(cx).id();
debug_assert!(self.worktrees().all(|w| w.read(cx).id() != worktree_id));

View File

@@ -94,12 +94,18 @@ pub struct ProjectPanel {
struct EditState {
worktree_id: WorktreeId,
entry_id: ProjectEntryId,
is_new_entry: bool,
leaf_entry_id: Option<ProjectEntryId>,
is_dir: bool,
depth: usize,
processing_filename: Option<String>,
}
impl EditState {
fn is_new_entry(&self) -> bool {
self.leaf_entry_id.is_none()
}
}
#[derive(Clone, Debug)]
enum ClipboardEntry {
Copied(BTreeSet<SelectedEntry>),
@@ -497,13 +503,14 @@ impl ProjectPanel {
if let Some((worktree, entry)) = self.selected_sub_entry(cx) {
let auto_fold_dirs = ProjectPanelSettings::get_global(cx).auto_fold_dirs;
let worktree = worktree.read(cx);
let is_root = Some(entry) == worktree.root_entry();
let is_dir = entry.is_dir();
let is_foldable = auto_fold_dirs && self.is_foldable(entry, worktree);
let is_unfoldable = auto_fold_dirs && self.is_unfoldable(entry, worktree);
let worktree_id = worktree.id();
let is_read_only = project.is_read_only(cx);
let is_remote = project.is_via_collab() && project.dev_server_project_id().is_none();
let is_remote = project.is_via_collab();
let is_local = project.is_local();
let context_menu = ContextMenu::build(cx, |menu, cx| {
@@ -823,10 +830,10 @@ impl ProjectPanel {
cx.focus(&self.focus_handle);
let worktree_id = edit_state.worktree_id;
let is_new_entry = edit_state.is_new_entry;
let is_new_entry = edit_state.is_new_entry();
let filename = self.filename_editor.read(cx).text(cx);
edit_state.is_dir = edit_state.is_dir
|| (edit_state.is_new_entry && filename.ends_with(std::path::MAIN_SEPARATOR));
|| (edit_state.is_new_entry() && filename.ends_with(std::path::MAIN_SEPARATOR));
let is_dir = edit_state.is_dir;
let worktree = self.project.read(cx).worktree_for_id(worktree_id, cx)?;
let entry = worktree.read(cx).entry_for_id(edit_state.entry_id)?.clone();
@@ -857,7 +864,6 @@ impl ProjectPanel {
if path_already_exists(new_path.as_path()) {
return None;
}
edited_entry_id = entry.id;
edit_task = self.project.update(cx, |project, cx| {
project.rename_entry(entry.id, new_path.as_path(), cx)
@@ -976,6 +982,7 @@ impl ProjectPanel {
}) = self.selection
{
let directory_id;
let new_entry_id = self.resolve_entry(entry_id);
if let Some((worktree, expanded_dir_ids)) = self
.project
.read(cx)
@@ -983,7 +990,7 @@ impl ProjectPanel {
.zip(self.expanded_dir_ids.get_mut(&worktree_id))
{
let worktree = worktree.read(cx);
if let Some(mut entry) = worktree.entry_for_id(entry_id) {
if let Some(mut entry) = worktree.entry_for_id(new_entry_id) {
loop {
if entry.is_dir() {
if let Err(ix) = expanded_dir_ids.binary_search(&entry.id) {
@@ -1011,7 +1018,7 @@ impl ProjectPanel {
self.edit_state = Some(EditState {
worktree_id,
entry_id: directory_id,
is_new_entry: true,
leaf_entry_id: None,
is_dir,
processing_filename: None,
depth: 0,
@@ -1045,12 +1052,12 @@ impl ProjectPanel {
}) = self.selection
{
if let Some(worktree) = self.project.read(cx).worktree_for_id(worktree_id, cx) {
let entry_id = self.unflatten_entry_id(entry_id);
if let Some(entry) = worktree.read(cx).entry_for_id(entry_id) {
let sub_entry_id = self.unflatten_entry_id(entry_id);
if let Some(entry) = worktree.read(cx).entry_for_id(sub_entry_id) {
self.edit_state = Some(EditState {
worktree_id,
entry_id,
is_new_entry: false,
entry_id: sub_entry_id,
leaf_entry_id: Some(entry_id),
is_dir: entry.is_dir(),
processing_filename: None,
depth: 0,
@@ -1273,6 +1280,7 @@ impl ProjectPanel {
fn select_parent(&mut self, _: &SelectParent, cx: &mut ViewContext<Self>) {
if let Some((worktree, entry)) = self.selected_sub_entry(cx) {
if let Some(parent) = entry.path.parent() {
let worktree = worktree.read(cx);
if let Some(parent_entry) = worktree.entry_for_path(parent) {
self.selection = Some(SelectedEntry {
worktree_id: worktree.id(),
@@ -1406,7 +1414,6 @@ impl ProjectPanel {
.clipboard
.as_ref()
.filter(|clipboard| !clipboard.items().is_empty())?;
enum PasteTask {
Rename(Task<Result<CreatedEntry>>),
Copy(Task<Result<Option<Entry>>>),
@@ -1416,7 +1423,7 @@ impl ProjectPanel {
let clip_is_cut = clipboard_entries.is_cut();
for clipboard_entry in clipboard_entries.items() {
let new_path =
self.create_paste_path(clipboard_entry, self.selected_entry_handle(cx)?, cx)?;
self.create_paste_path(clipboard_entry, self.selected_sub_entry(cx)?, cx)?;
let clip_entry_id = clipboard_entry.entry_id;
let is_same_worktree = clipboard_entry.worktree_id == worktree_id;
let relative_worktree_source_path = if !is_same_worktree {
@@ -1558,7 +1565,7 @@ impl ProjectPanel {
fn reveal_in_finder(&mut self, _: &RevealInFileManager, cx: &mut ViewContext<Self>) {
if let Some((worktree, entry)) = self.selected_sub_entry(cx) {
cx.reveal_path(&worktree.abs_path().join(&entry.path));
cx.reveal_path(&worktree.read(cx).abs_path().join(&entry.path));
}
}
@@ -1573,7 +1580,7 @@ impl ProjectPanel {
if let Some((worktree, entry)) = self.selected_sub_entry(cx) {
let abs_path = match &entry.canonical_path {
Some(canonical_path) => Some(canonical_path.to_path_buf()),
None => worktree.absolutize(&entry.path).ok(),
None => worktree.read(cx).absolutize(&entry.path).ok(),
};
let working_directory = if entry.is_dir() {
@@ -1596,7 +1603,7 @@ impl ProjectPanel {
if entry.is_dir() {
let include_root = self.project.read(cx).visible_worktrees(cx).count() > 1;
let dir_path = if include_root {
let mut full_path = PathBuf::from(worktree.root_name());
let mut full_path = PathBuf::from(worktree.read(cx).root_name());
full_path.push(&entry.path);
Arc::from(full_path)
} else {
@@ -1730,6 +1737,8 @@ impl ProjectPanel {
}
}
/// Finds the currently selected subentry for a given leaf entry id. If a given entry
/// has no ancestors, the project entry ID that's passed in is returned as-is.
fn resolve_entry(&self, id: ProjectEntryId) -> ProjectEntryId {
self.ancestors
.get(&id)
@@ -1756,12 +1765,12 @@ impl ProjectPanel {
fn selected_sub_entry<'a>(
&self,
cx: &'a AppContext,
) -> Option<(&'a Worktree, &'a project::Entry)> {
) -> Option<(Model<Worktree>, &'a project::Entry)> {
let (worktree, mut entry) = self.selected_entry_handle(cx)?;
let worktree = worktree.read(cx);
let resolved_id = self.resolve_entry(entry.id);
if resolved_id != entry.id {
let worktree = worktree.read(cx);
entry = worktree.entry_for_id(resolved_id)?;
}
Some((worktree, entry))
@@ -1831,7 +1840,7 @@ impl ProjectPanel {
let mut new_entry_parent_id = None;
let mut new_entry_kind = EntryKind::Dir;
if let Some(edit_state) = &self.edit_state {
if edit_state.worktree_id == worktree_id && edit_state.is_new_entry {
if edit_state.worktree_id == worktree_id && edit_state.is_new_entry() {
new_entry_parent_id = Some(edit_state.entry_id);
new_entry_kind = if edit_state.is_dir {
EntryKind::Dir
@@ -1885,7 +1894,19 @@ impl ProjectPanel {
}
auto_folded_ancestors.clear();
visible_worktree_entries.push(entry.clone());
if Some(entry.id) == new_entry_parent_id {
let precedes_new_entry = if let Some(new_entry_id) = new_entry_parent_id {
entry.id == new_entry_id || {
self.ancestors.get(&entry.id).map_or(false, |entries| {
entries
.ancestors
.iter()
.any(|entry_id| *entry_id == new_entry_id)
})
}
} else {
false
};
if precedes_new_entry {
visible_worktree_entries.push(Entry {
id: NEW_ENTRY_ID,
kind: new_entry_kind,
@@ -2335,7 +2356,7 @@ impl ProjectPanel {
};
if let Some(edit_state) = &self.edit_state {
let is_edited_entry = if edit_state.is_new_entry {
let is_edited_entry = if edit_state.is_new_entry() {
entry.id == NEW_ENTRY_ID
} else {
entry.id == edit_state.entry_id
@@ -2353,10 +2374,41 @@ impl ProjectPanel {
if is_edited_entry {
if let Some(processing_filename) = &edit_state.processing_filename {
details.is_processing = true;
details.filename.clear();
details.filename.push_str(processing_filename);
if let Some(ancestors) = edit_state
.leaf_entry_id
.and_then(|entry| self.ancestors.get(&entry))
{
let position = ancestors
    .ancestors
    .iter()
    .position(|entry_id| *entry_id == edit_state.entry_id)
    .expect("Edited sub-entry should be an ancestor of selected leaf entry")
    + 1;
let all_components = ancestors.ancestors.len();
let prefix_components = all_components - position;
let suffix_components = position.checked_sub(1);
let mut previous_components =
Path::new(&details.filename).components();
let mut new_path = previous_components
.by_ref()
.take(prefix_components)
.collect::<PathBuf>();
if let Some(last_component) =
Path::new(processing_filename).components().last()
{
new_path.push(last_component);
previous_components.next();
}
if let Some(_) = suffix_components {
new_path.push(previous_components);
}
if let Some(str) = new_path.to_str() {
details.filename.clear();
details.filename.push_str(str);
}
} else {
details.filename.clear();
details.filename.push_str(processing_filename);
}
} else {
if edit_state.is_new_entry {
if edit_state.is_new_entry() {
details.filename.clear();
}
details.is_editing = true;
@@ -2546,9 +2598,7 @@ impl ProjectPanel {
h_flex().h_6().w_full().child(editor.clone())
} else {
h_flex().h_6().map(|mut this| {
if let Some(folded_ancestors) =
is_active.then(|| self.ancestors.get(&entry_id)).flatten()
{
if let Some(folded_ancestors) = self.ancestors.get(&entry_id) {
let components = Path::new(&file_name)
.components()
.map(|comp| {
@@ -2557,6 +2607,7 @@ impl ProjectPanel {
comp_str
})
.collect::<Vec<_>>();
let components_len = components.len();
let active_index = components_len
- 1
@@ -2592,9 +2643,10 @@ impl ProjectPanel {
Label::new(component)
.single_line()
.color(filename_text_color)
.when(index == active_index, |this| {
this.underline(true)
}),
.when(
is_active && index == active_index,
|this| this.underline(true),
),
);
this = this.child(label);
@@ -2769,6 +2821,17 @@ impl ProjectPanel {
return None;
}
let scroll_handle = self.scroll_handle.0.borrow();
let longest_item_width = scroll_handle
.last_item_size
.filter(|size| size.contents.width > size.item.width)?
.contents
.width
.0 as f64;
if longest_item_width < scroll_handle.base_handle.bounds().size.width.0 as f64 {
return None;
}
Some(
div()
.occlude()
@@ -3334,12 +3397,11 @@ impl Panel for ProjectPanel {
fn starts_open(&self, cx: &WindowContext) -> bool {
let project = &self.project.read(cx);
project.dev_server_project_id().is_some()
|| project.visible_worktrees(cx).any(|tree| {
tree.read(cx)
.root_entry()
.map_or(false, |entry| entry.is_dir())
})
project.visible_worktrees(cx).any(|tree| {
tree.read(cx)
.root_entry()
.map_or(false, |entry| entry.is_dir())
})
}
}
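A simplified standalone version of the restored horizontal-scrollbar check in this file (plain f64 widths instead of gpui's Pixels and Bounds; the function name is illustrative): the scrollbar is rendered only when the widest rendered item overflows its own bounds and is at least as wide as the panel's viewport.

// Show the horizontal scrollbar only when the rendered contents are wider than
// the item bounds (something actually overflows) and at least as wide as the
// panel's viewport, mirroring the check above.
fn show_horizontal_scrollbar(item_width: f64, contents_width: f64, viewport_width: f64) -> bool {
    contents_width > item_width && contents_width >= viewport_width
}

fn main() {
    assert!(show_horizontal_scrollbar(120.0, 300.0, 250.0));  // long entry, narrow panel: show
    assert!(!show_horizontal_scrollbar(120.0, 300.0, 400.0)); // panel wide enough: hide
    assert!(!show_horizontal_scrollbar(120.0, 120.0, 100.0)); // nothing overflows: hide
}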

View File

@@ -217,33 +217,14 @@ message Envelope {
MultiLspQueryResponse multi_lsp_query_response = 176;
RestartLanguageServers restart_language_servers = 208;
CreateDevServerProject create_dev_server_project = 177;
CreateDevServerProjectResponse create_dev_server_project_response = 188;
CreateDevServer create_dev_server = 178;
CreateDevServerResponse create_dev_server_response = 179;
ShutdownDevServer shutdown_dev_server = 180;
DevServerInstructions dev_server_instructions = 181;
ReconnectDevServer reconnect_dev_server = 182;
ReconnectDevServerResponse reconnect_dev_server_response = 183;
ShareDevServerProject share_dev_server_project = 184;
JoinDevServerProject join_dev_server_project = 185;
RejoinRemoteProjects rejoin_remote_projects = 186;
RejoinRemoteProjectsResponse rejoin_remote_projects_response = 187;
DevServerProjectsUpdate dev_server_projects_update = 193;
ValidateDevServerProjectRequest validate_dev_server_project_request = 194;
DeleteDevServer delete_dev_server = 195;
OpenNewBuffer open_new_buffer = 196;
DeleteDevServerProject delete_dev_server_project = 197;
GetSupermavenApiKey get_supermaven_api_key = 198;
GetSupermavenApiKeyResponse get_supermaven_api_key_response = 199;
RegenerateDevServerToken regenerate_dev_server_token = 200;
RegenerateDevServerTokenResponse regenerate_dev_server_token_response = 201;
RenameDevServer rename_dev_server = 202;
TaskContextForLocation task_context_for_location = 203;
TaskContext task_context = 204;
@@ -264,7 +245,6 @@ message Envelope {
ListRemoteDirectory list_remote_directory = 219;
ListRemoteDirectoryResponse list_remote_directory_response = 220;
UpdateDevServerProject update_dev_server_project = 221;
AddWorktree add_worktree = 222;
AddWorktreeResponse add_worktree_response = 223;
@@ -304,10 +284,17 @@ message Envelope {
LanguageServerPromptResponse language_server_prompt_response = 269; // current max
}
reserved 87 to 88;
reserved 158 to 161;
reserved 166 to 169;
reserved 177 to 185;
reserved 188;
reserved 193 to 195;
reserved 197;
reserved 200 to 202;
reserved 205 to 206;
reserved 221;
reserved 224 to 229;
reserved 247 to 254;
}
@@ -342,12 +329,11 @@ enum ErrorCode {
WrongMoveTarget = 11;
UnsharedItem = 12;
NoSuchProject = 13;
DevServerAlreadyOnline = 14;
DevServerOffline = 15;
DevServerProjectPathDoesNotExist = 16;
RemoteUpgradeRequired = 17;
RateLimitExceeded = 18;
reserved 6;
reserved 14 to 15;
}
message EndStream {}
@@ -511,7 +497,7 @@ message LiveKitConnectionInfo {
message ShareProject {
uint64 room_id = 1;
repeated WorktreeMetadata worktrees = 2;
optional uint64 dev_server_project_id = 3;
reserved 3;
bool is_ssh_project = 4;
}
@@ -536,19 +522,6 @@ message JoinHostedProject {
uint64 project_id = 1;
}
message CreateDevServerProject {
reserved 1;
reserved 2;
uint64 dev_server_id = 3;
string path = 4;
}
message CreateDevServerProjectResponse {
DevServerProject dev_server_project = 1;
}
message ValidateDevServerProjectRequest {
string path = 1;
}
message ListRemoteDirectory {
uint64 dev_server_id = 1;
@@ -559,77 +532,6 @@ message ListRemoteDirectoryResponse {
repeated string entries = 1;
}
message UpdateDevServerProject {
uint64 dev_server_project_id = 1;
repeated string paths = 2;
}
message CreateDevServer {
reserved 1;
string name = 2;
optional string ssh_connection_string = 3;
}
message RegenerateDevServerToken {
uint64 dev_server_id = 1;
}
message RegenerateDevServerTokenResponse {
uint64 dev_server_id = 1;
string access_token = 2;
}
message CreateDevServerResponse {
uint64 dev_server_id = 1;
reserved 2;
string access_token = 3;
string name = 4;
}
message ShutdownDevServer {
optional string reason = 1;
}
message RenameDevServer {
uint64 dev_server_id = 1;
string name = 2;
optional string ssh_connection_string = 3;
}
message DeleteDevServer {
uint64 dev_server_id = 1;
}
message DeleteDevServerProject {
uint64 dev_server_project_id = 1;
}
message ReconnectDevServer {
repeated UpdateProject reshared_projects = 1;
}
message ReconnectDevServerResponse {
repeated ResharedProject reshared_projects = 1;
}
message DevServerInstructions {
repeated DevServerProject projects = 1;
}
message DevServerProjectsUpdate {
repeated DevServer dev_servers = 1;
repeated DevServerProject dev_server_projects = 2;
}
message ShareDevServerProject {
uint64 dev_server_project_id = 1;
repeated WorktreeMetadata worktrees = 2;
}
message JoinDevServerProject {
uint64 dev_server_project_id = 1;
}
message JoinProjectResponse {
uint64 project_id = 5;
uint32 replica_id = 1;
@@ -637,7 +539,7 @@ message JoinProjectResponse {
repeated Collaborator collaborators = 3;
repeated LanguageServer language_servers = 4;
ChannelRole role = 6;
optional uint64 dev_server_project_id = 7;
reserved 7;
}
message LeaveProject {
@@ -1429,29 +1331,6 @@ message HostedProject {
ChannelVisibility visibility = 4;
}
message DevServerProject {
uint64 id = 1;
optional uint64 project_id = 2;
reserved 3;
reserved 4;
uint64 dev_server_id = 5;
string path = 6;
repeated string paths = 7;
}
message DevServer {
reserved 1;
uint64 dev_server_id = 2;
string name = 3;
DevServerStatus status = 4;
optional string ssh_connection_string = 5;
}
enum DevServerStatus {
Offline = 0;
Online = 1;
}
message JoinChannel {
uint64 channel_id = 1;
}

View File

@@ -318,30 +318,12 @@ messages!(
(SetRoomParticipantRole, Foreground),
(BlameBuffer, Foreground),
(BlameBufferResponse, Foreground),
(CreateDevServerProject, Background),
(CreateDevServerProjectResponse, Foreground),
(CreateDevServer, Foreground),
(CreateDevServerResponse, Foreground),
(DevServerInstructions, Foreground),
(ShutdownDevServer, Foreground),
(ReconnectDevServer, Foreground),
(ReconnectDevServerResponse, Foreground),
(ShareDevServerProject, Foreground),
(JoinDevServerProject, Foreground),
(RejoinRemoteProjects, Foreground),
(RejoinRemoteProjectsResponse, Foreground),
(MultiLspQuery, Background),
(MultiLspQueryResponse, Background),
(DevServerProjectsUpdate, Foreground),
(ValidateDevServerProjectRequest, Background),
(ListRemoteDirectory, Background),
(ListRemoteDirectoryResponse, Background),
(UpdateDevServerProject, Background),
(DeleteDevServer, Foreground),
(DeleteDevServerProject, Foreground),
(RegenerateDevServerToken, Foreground),
(RegenerateDevServerTokenResponse, Foreground),
(RenameDevServer, Foreground),
(OpenNewBuffer, Foreground),
(RestartLanguageServers, Foreground),
(LinkedEditingRange, Background),
@@ -419,7 +401,6 @@ request_messages!(
(GetTypeDefinition, GetTypeDefinitionResponse),
(LinkedEditingRange, LinkedEditingRangeResponse),
(ListRemoteDirectory, ListRemoteDirectoryResponse),
(UpdateDevServerProject, Ack),
(GetUsers, UsersResponse),
(IncomingCall, Ack),
(InlayHints, InlayHintsResponse),
@@ -477,19 +458,8 @@ request_messages!(
(LspExtExpandMacro, LspExtExpandMacroResponse),
(SetRoomParticipantRole, Ack),
(BlameBuffer, BlameBufferResponse),
(CreateDevServerProject, CreateDevServerProjectResponse),
(CreateDevServer, CreateDevServerResponse),
(ShutdownDevServer, Ack),
(ShareDevServerProject, ShareProjectResponse),
(JoinDevServerProject, JoinProjectResponse),
(RejoinRemoteProjects, RejoinRemoteProjectsResponse),
(ReconnectDevServer, ReconnectDevServerResponse),
(ValidateDevServerProjectRequest, Ack),
(MultiLspQuery, MultiLspQueryResponse),
(DeleteDevServer, Ack),
(DeleteDevServerProject, Ack),
(RegenerateDevServerToken, RegenerateDevServerTokenResponse),
(RenameDevServer, Ack),
(RestartLanguageServers, Ack),
(OpenContext, OpenContextResponse),
(CreateContext, CreateContextResponse),

View File

@@ -16,7 +16,6 @@ doctest = false
anyhow.workspace = true
auto_update.workspace = true
release_channel.workspace = true
client.workspace = true
editor.workspace = true
file_finder.workspace = true
futures.workspace = true
@@ -30,15 +29,12 @@ menu.workspace = true
ordered-float.workspace = true
picker.workspace = true
project.workspace = true
dev_server_projects.workspace = true
remote.workspace = true
rpc.workspace = true
schemars.workspace = true
serde.workspace = true
settings.workspace = true
smol.workspace = true
task.workspace = true
terminal_view.workspace = true
theme.workspace = true
ui.workspace = true
util.workspace = true

View File

@@ -1,6 +1,5 @@
use std::path::PathBuf;
use dev_server_projects::DevServer;
use gpui::{ClickEvent, DismissEvent, EventEmitter, FocusHandle, FocusableView, Render, WeakView};
use project::project_settings::ProjectSettings;
use remote::SshConnectionOptions;
@@ -12,14 +11,10 @@ use ui::{
};
use workspace::{notifications::DetachAndPromptErr, ModalView, OpenOptions, Workspace};
use crate::{
open_dev_server_project, open_ssh_project, remote_servers::reconnect_to_dev_server_project,
RemoteServerProjects, SshSettings,
};
use crate::open_ssh_project;
enum Host {
RemoteProject,
DevServerProject(DevServer),
SshRemoteProject(SshConnectionOptions),
}
@@ -55,20 +50,9 @@ impl DisconnectedOverlay {
return;
}
let handle = cx.view().downgrade();
let dev_server = project
.read(cx)
.dev_server_project_id()
.and_then(|id| {
dev_server_projects::Store::global(cx)
.read(cx)
.dev_server_for_project(id)
})
.cloned();
let ssh_connection_options = project.read(cx).ssh_connection_options(cx);
let host = if let Some(dev_server) = dev_server {
Host::DevServerProject(dev_server)
} else if let Some(ssh_connection_options) = ssh_connection_options {
let host = if let Some(ssh_connection_options) = ssh_connection_options {
Host::SshRemoteProject(ssh_connection_options)
} else {
Host::RemoteProject
@@ -89,9 +73,6 @@ impl DisconnectedOverlay {
cx.emit(DismissEvent);
match &self.host {
Host::DevServerProject(dev_server) => {
self.reconnect_to_dev_server(dev_server.clone(), cx);
}
Host::SshRemoteProject(ssh_connection_options) => {
self.reconnect_to_ssh_remote(ssh_connection_options.clone(), cx);
}
@@ -99,50 +80,6 @@ impl DisconnectedOverlay {
}
}
fn reconnect_to_dev_server(&self, dev_server: DevServer, cx: &mut ViewContext<Self>) {
let Some(workspace) = self.workspace.upgrade() else {
return;
};
let Some(dev_server_project_id) = workspace
.read(cx)
.project()
.read(cx)
.dev_server_project_id()
else {
return;
};
if let Some(project_id) = dev_server_projects::Store::global(cx)
.read(cx)
.dev_server_project(dev_server_project_id)
.and_then(|project| project.project_id)
{
return workspace.update(cx, move |_, cx| {
open_dev_server_project(true, dev_server_project_id, project_id, cx)
.detach_and_prompt_err("Failed to reconnect", cx, |_, _| None)
});
}
if dev_server.ssh_connection_string.is_some() {
let task = workspace.update(cx, |_, cx| {
reconnect_to_dev_server_project(
cx.view().clone(),
dev_server,
dev_server_project_id,
true,
cx,
)
});
task.detach_and_prompt_err("Failed to reconnect", cx, |_, _| None);
} else {
return workspace.update(cx, |workspace, cx| {
let handle = cx.view().downgrade();
workspace.toggle_modal(cx, |cx| RemoteServerProjects::new(cx, handle))
});
}
}
fn reconnect_to_ssh_remote(
&self,
connection_options: SshConnectionOptions,
@@ -165,16 +102,6 @@ impl DisconnectedOverlay {
let paths = ssh_project.paths.iter().map(PathBuf::from).collect();
cx.spawn(move |_, mut cx| async move {
let nickname = cx
.update(|cx| {
SshSettings::get_global(cx).nickname_for(
&connection_options.host,
connection_options.port,
&connection_options.username,
)
})
.ok()
.flatten();
open_ssh_project(
connection_options,
paths,
@@ -183,7 +110,6 @@ impl DisconnectedOverlay {
replace_window: Some(window),
..Default::default()
},
nickname,
&mut cx,
)
.await?;
@@ -200,13 +126,10 @@ impl DisconnectedOverlay {
impl Render for DisconnectedOverlay {
fn render(&mut self, cx: &mut ViewContext<Self>) -> impl IntoElement {
let can_reconnect = matches!(
self.host,
Host::DevServerProject(_) | Host::SshRemoteProject(_)
);
let can_reconnect = matches!(self.host, Host::SshRemoteProject(_));
let message = match &self.host {
Host::RemoteProject | Host::DevServerProject(_) => {
Host::RemoteProject => {
"Your connection to the remote project has been lost.".to_string()
}
Host::SshRemoteProject(options) => {

View File

@@ -1,10 +1,8 @@
pub mod disconnected_overlay;
mod remote_servers;
mod ssh_connections;
use remote::SshConnectionOptions;
pub use ssh_connections::open_ssh_project;
use client::{DevServerProjectId, ProjectId};
use disconnected_overlay::DisconnectedOverlay;
use fuzzy::{StringMatch, StringMatchCandidate};
use gpui::{
@@ -17,9 +15,7 @@ use picker::{
highlighted_match_with_paths::{HighlightedMatchWithPaths, HighlightedText},
Picker, PickerDelegate,
};
use remote_servers::reconnect_to_dev_server_project;
pub use remote_servers::RemoteServerProjects;
use rpc::proto::DevServerStatus;
use serde::Deserialize;
use settings::Settings;
pub use ssh_connections::SshSettings;
@@ -28,13 +24,12 @@ use std::{
sync::Arc,
};
use ui::{
prelude::*, tooltip_container, ButtonLike, IconWithIndicator, Indicator, KeyBinding, ListItem,
ListItemSpacing, Tooltip,
prelude::*, tooltip_container, ButtonLike, KeyBinding, ListItem, ListItemSpacing, Tooltip,
};
use util::{paths::PathExt, ResultExt};
use workspace::{
AppState, CloseIntent, ModalView, OpenOptions, SerializedWorkspaceLocation, Workspace,
WorkspaceId, WORKSPACE_DB,
CloseIntent, ModalView, OpenOptions, SerializedWorkspaceLocation, Workspace, WorkspaceId,
WORKSPACE_DB,
};
#[derive(PartialEq, Clone, Deserialize, Default)]
@@ -101,7 +96,7 @@ impl RecentProjects {
}
}
fn register(workspace: &mut Workspace, cx: &mut ViewContext<Workspace>) {
fn register(workspace: &mut Workspace, _cx: &mut ViewContext<Workspace>) {
workspace.register_action(|workspace, open_recent: &OpenRecent, cx| {
let Some(recent_projects) = workspace.active_modal::<Self>(cx) else {
Self::open(workspace, open_recent.create_new_window, cx);
@@ -114,20 +109,6 @@ impl RecentProjects {
.update(cx, |picker, cx| picker.cycle_selection(cx))
});
});
if workspace
.project()
.read(cx)
.dev_server_project_id()
.is_some()
{
workspace.register_action(|workspace, _: &workspace::Open, cx| {
if workspace.active_modal::<Self>(cx).is_some() {
cx.propagate();
} else {
Self::open(workspace, true, cx);
}
});
}
}
pub fn open(
@@ -254,13 +235,6 @@ impl PickerDelegate for RecentProjectsDelegate {
.map(|(_, path)| path.compact().to_string_lossy().into_owned())
.collect::<Vec<_>>()
.join(""),
SerializedWorkspaceLocation::DevServer(dev_server_project) => {
format!(
"{}{}",
dev_server_project.dev_server_name,
dev_server_project.paths.join("")
)
}
SerializedWorkspaceLocation::Ssh(ssh_project) => ssh_project
.ssh_urls()
.iter()
@@ -321,7 +295,10 @@ impl PickerDelegate for RecentProjectsDelegate {
cx.spawn(move |workspace, mut cx| async move {
let continue_replacing = workspace
.update(&mut cx, |workspace, cx| {
workspace.prepare_to_close(CloseIntent::ReplaceWindow, cx)
workspace.prepare_to_close(
CloseIntent::ReplaceWindow,
cx,
)
})?
.await?;
if continue_replacing {
@@ -339,74 +316,44 @@ impl PickerDelegate for RecentProjectsDelegate {
workspace.open_workspace_for_paths(false, paths, cx)
}
}
SerializedWorkspaceLocation::DevServer(dev_server_project) => {
let store = dev_server_projects::Store::global(cx);
let Some(project_id) = store.read(cx)
.dev_server_project(dev_server_project.id)
.and_then(|p| p.project_id)
else {
let server = store.read(cx).dev_server_for_project(dev_server_project.id);
if server.is_some_and(|server| server.ssh_connection_string.is_some()) {
return reconnect_to_dev_server_project(cx.view().clone(), server.unwrap().clone(), dev_server_project.id, replace_current_window, cx);
} else {
let dev_server_name = dev_server_project.dev_server_name.clone();
return cx.spawn(|workspace, mut cx| async move {
let response =
cx.prompt(gpui::PromptLevel::Warning,
"Dev Server is offline",
Some(format!("Cannot connect to {}. To debug open the remote project settings.", dev_server_name).as_str()),
&["Ok", "Open Settings"]
).await?;
if response == 1 {
workspace.update(&mut cx, |workspace, cx| {
let handle = cx.view().downgrade();
workspace.toggle_modal(cx, |cx| RemoteServerProjects::new(cx, handle))
})?;
} else {
workspace.update(&mut cx, |workspace, cx| {
RecentProjects::open(workspace, true, cx);
})?;
}
Ok(())
})
}
SerializedWorkspaceLocation::Ssh(ssh_project) => {
let app_state = workspace.app_state().clone();
let replace_window = if replace_current_window {
cx.window_handle().downcast::<Workspace>()
} else {
None
};
open_dev_server_project(replace_current_window, dev_server_project.id, project_id, cx)
}
SerializedWorkspaceLocation::Ssh(ssh_project) => {
let app_state = workspace.app_state().clone();
let replace_window = if replace_current_window {
cx.window_handle().downcast::<Workspace>()
} else {
None
};
let open_options = OpenOptions {
replace_window,
..Default::default()
};
let open_options = OpenOptions {
replace_window,
..Default::default()
};
let connection_options = SshSettings::get_global(cx)
.connection_options_for(
ssh_project.host.clone(),
ssh_project.port,
ssh_project.user.clone(),
);
let args = SshSettings::get_global(cx).args_for(&ssh_project.host, ssh_project.port, &ssh_project.user);
let nickname = SshSettings::get_global(cx).nickname_for(&ssh_project.host, ssh_project.port, &ssh_project.user);
let connection_options = SshConnectionOptions {
host: ssh_project.host.clone(),
username: ssh_project.user.clone(),
port: ssh_project.port,
password: None,
args,
};
let paths = ssh_project.paths.iter().map(PathBuf::from).collect();
let paths = ssh_project.paths.iter().map(PathBuf::from).collect();
cx.spawn(|_, mut cx| async move {
open_ssh_project(connection_options, paths, app_state, open_options, nickname, &mut cx).await
})
cx.spawn(|_, mut cx| async move {
open_ssh_project(
connection_options,
paths,
app_state,
open_options,
&mut cx,
)
.await
})
}
}
}
}
})
.detach_and_log_err(cx);
.detach_and_log_err(cx);
cx.emit(DismissEvent);
}
}
@@ -431,20 +378,6 @@ impl PickerDelegate for RecentProjectsDelegate {
let (_, location) = self.workspaces.get(hit.candidate_id)?;
let dev_server_status =
if let SerializedWorkspaceLocation::DevServer(dev_server_project) = location {
let store = dev_server_projects::Store::global(cx).read(cx);
Some(
store
.dev_server_project(dev_server_project.id)
.and_then(|p| store.dev_server(p.dev_server_id))
.map(|s| s.status)
.unwrap_or_default(),
)
} else {
None
};
let mut path_start_offset = 0;
let paths = match location {
SerializedWorkspaceLocation::Local(paths, order) => Arc::new(
@@ -457,13 +390,6 @@ impl PickerDelegate for RecentProjectsDelegate {
.collect(),
),
SerializedWorkspaceLocation::Ssh(ssh_project) => Arc::new(ssh_project.ssh_urls()),
SerializedWorkspaceLocation::DevServer(dev_server_project) => {
Arc::new(vec![PathBuf::from(format!(
"{}:{}",
dev_server_project.dev_server_name,
dev_server_project.paths.join(", ")
))])
}
};
let (match_labels, paths): (Vec<_>, Vec<_>) = paths
@@ -478,13 +404,7 @@ impl PickerDelegate for RecentProjectsDelegate {
.unzip();
let highlighted_match = HighlightedMatchWithPaths {
match_label: HighlightedText::join(match_labels.into_iter().flatten(), ", ").color(
if matches!(dev_server_status, Some(DevServerStatus::Offline)) {
Color::Disabled
} else {
Color::Default
},
),
match_label: HighlightedText::join(match_labels.into_iter().flatten(), ", "),
paths,
};
@@ -507,24 +427,6 @@ impl PickerDelegate for RecentProjectsDelegate {
SerializedWorkspaceLocation::Ssh(_) => Icon::new(IconName::Server)
.color(Color::Muted)
.into_any_element(),
SerializedWorkspaceLocation::DevServer(_) => {
let indicator_color = match dev_server_status {
Some(DevServerStatus::Online) => Color::Created,
Some(DevServerStatus::Offline) => Color::Hidden,
_ => unreachable!(),
};
IconWithIndicator::new(
Icon::new(IconName::Server).color(Color::Muted),
Some(Indicator::dot()),
)
.indicator_color(indicator_color)
.indicator_border_color(if selected {
Some(cx.theme().colors().element_selected)
} else {
None
})
.into_any_element()
}
})
})
.child({
@@ -597,59 +499,6 @@ impl PickerDelegate for RecentProjectsDelegate {
}
}
fn open_dev_server_project(
replace_current_window: bool,
dev_server_project_id: DevServerProjectId,
project_id: ProjectId,
cx: &mut ViewContext<Workspace>,
) -> Task<anyhow::Result<()>> {
if let Some(app_state) = AppState::global(cx).upgrade() {
let handle = if replace_current_window {
cx.window_handle().downcast::<Workspace>()
} else {
None
};
if let Some(handle) = handle {
cx.spawn(move |workspace, mut cx| async move {
let continue_replacing = workspace
.update(&mut cx, |workspace, cx| {
workspace.prepare_to_close(CloseIntent::ReplaceWindow, cx)
})?
.await?;
if continue_replacing {
workspace
.update(&mut cx, |_workspace, cx| {
workspace::join_dev_server_project(
dev_server_project_id,
project_id,
app_state,
Some(handle),
cx,
)
})?
.await?;
}
Ok(())
})
} else {
let task = workspace::join_dev_server_project(
dev_server_project_id,
project_id,
app_state,
None,
cx,
);
cx.spawn(|_, _| async move {
task.await?;
Ok(())
})
}
} else {
Task::ready(Err(anyhow::anyhow!("App state not found")))
}
}
// Compute the highlighted text for the name and path
fn highlights_for_path(
path: &Path,


@@ -1,19 +1,12 @@
use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::Arc;
use std::time::Duration;
use anyhow::anyhow;
use anyhow::Context;
use anyhow::Result;
use dev_server_projects::{DevServer, DevServerId, DevServerProjectId};
use editor::Editor;
use file_finder::OpenPathDelegate;
use futures::channel::oneshot;
use futures::future::Shared;
use futures::FutureExt;
use gpui::canvas;
use gpui::AsyncWindowContext;
use gpui::ClipboardItem;
use gpui::Task;
use gpui::WeakView;
@@ -22,17 +15,11 @@ use gpui::{
PromptLevel, ScrollHandle, View, ViewContext,
};
use picker::Picker;
use project::terminals::wrap_for_ssh;
use project::terminals::SshCommand;
use project::Project;
use remote::SshConnectionOptions;
use rpc::proto::DevServerStatus;
use remote::SshRemoteClient;
use settings::update_settings_file;
use settings::Settings;
use task::HideStrategy;
use task::RevealStrategy;
use task::SpawnInTerminal;
use terminal_view::terminal_panel::TerminalPanel;
use ui::{
prelude::*, IconButtonShape, List, ListItem, ListSeparator, Modal, ModalHeader, Scrollbar,
ScrollbarState, Section, Tooltip,
@@ -43,7 +30,6 @@ use workspace::OpenOptions;
use workspace::Toast;
use workspace::{notifications::DetachAndPromptErr, ModalView, Workspace};
use crate::open_dev_server_project;
use crate::ssh_connections::connect_over_ssh;
use crate::ssh_connections::open_ssh_project;
use crate::ssh_connections::RemoteSettingsContent;
@@ -61,6 +47,7 @@ pub struct RemoteServerProjects {
scroll_handle: ScrollHandle,
workspace: WeakView<Workspace>,
selectable_items: SelectableItemList,
retained_connections: Vec<Model<SshRemoteClient>>,
}
struct CreateRemoteServer {
@@ -210,11 +197,7 @@ impl ProjectPicker {
picker
});
let connection_string = connection.connection_string().into();
let nickname = SshSettings::get_global(cx).nickname_for(
&connection.host,
connection.port,
&connection.username,
);
let nickname = connection.nickname.clone().map(|nick| nick.into());
let _path_task = cx
.spawn({
let workspace = workspace.clone();
@@ -370,6 +353,7 @@ impl RemoteServerProjects {
scroll_handle: ScrollHandle::new(),
workspace,
selectable_items: Default::default(),
retained_connections: Vec::new(),
}
}
@@ -426,7 +410,7 @@ impl RemoteServerProjects {
return;
}
};
let ssh_prompt = cx.new_view(|cx| SshPrompt::new(&connection_options, None, cx));
let ssh_prompt = cx.new_view(|cx| SshPrompt::new(&connection_options, cx));
let connection = connect_over_ssh(
connection_options.remote_server_identifier(),
@@ -439,7 +423,7 @@ impl RemoteServerProjects {
let address_editor = editor.clone();
let creating = cx.spawn(move |this, mut cx| async move {
match connection.await {
Some(_) => this
Some(Some(client)) => this
.update(&mut cx, |this, cx| {
let _ = this.workspace.update(cx, |workspace, _| {
workspace
@@ -447,14 +431,14 @@ impl RemoteServerProjects {
.telemetry()
.report_app_event("create ssh server".to_string())
});
this.retained_connections.push(client);
this.add_ssh_server(connection_options, cx);
this.mode = Mode::default_mode();
this.selectable_items.reset_selection();
cx.notify()
})
.log_err(),
None => this
_ => this
.update(&mut cx, |this, cx| {
address_editor.update(cx, |this, _| {
this.set_read_only(false);
@@ -503,12 +487,11 @@ impl RemoteServerProjects {
return;
};
let nickname = ssh_connection.nickname.clone();
let connection_options = ssh_connection.into();
workspace.update(cx, |_, cx| {
cx.defer(move |workspace, cx| {
workspace.toggle_modal(cx, |cx| {
SshConnectionModal::new(&connection_options, Vec::new(), nickname, cx)
SshConnectionModal::new(&connection_options, Vec::new(), cx)
});
let prompt = workspace
.active_modal::<SshConnectionModal>(cx)
@@ -596,9 +579,7 @@ impl RemoteServerProjects {
self.create_ssh_server(state.address_editor.clone(), cx);
}
Mode::EditNickname(state) => {
let text = Some(state.editor.read(cx).text(cx))
.filter(|text| !text.is_empty())
.map(SharedString::from);
let text = Some(state.editor.read(cx).text(cx)).filter(|text| !text.is_empty());
let index = state.index;
self.update_settings_file(cx, move |setting, _| {
if let Some(connections) = setting.ssh_connections.as_mut() {
@@ -645,7 +626,7 @@ impl RemoteServerProjects {
) -> impl IntoElement {
let (main_label, aux_label) = if let Some(nickname) = ssh_connection.nickname.clone() {
let aux_label = SharedString::from(format!("({})", ssh_connection.host));
(nickname, Some(aux_label))
(nickname.into(), Some(aux_label))
} else {
(ssh_connection.host.clone(), None)
};
@@ -758,13 +739,11 @@ impl RemoteServerProjects {
let project = project.clone();
let server = server.clone();
cx.spawn(|remote_server_projects, mut cx| async move {
let nickname = server.nickname.clone();
let result = open_ssh_project(
server.into(),
project.paths.into_iter().map(PathBuf::from).collect(),
app_state,
OpenOptions::default(),
nickname,
&mut cx,
)
.await;
@@ -873,6 +852,7 @@ impl RemoteServerProjects {
projects: vec![],
nickname: None,
args: connection_options.args.unwrap_or_default(),
upload_binary_over_ssh: None,
})
});
}
@@ -965,7 +945,7 @@ impl RemoteServerProjects {
SshConnectionHeader {
connection_string: connection_string.clone(),
paths: Default::default(),
nickname: connection.nickname.clone(),
nickname: connection.nickname.clone().map(|s| s.into()),
}
.render(cx),
)
@@ -1071,7 +1051,7 @@ impl RemoteServerProjects {
);
cx.spawn(|mut cx| async move {
if confirmation.await.ok() == Some(1) {
if confirmation.await.ok() == Some(0) {
remote_servers
.update(&mut cx, |this, cx| {
this.delete_ssh_server(index, cx);
@@ -1147,13 +1127,14 @@ impl RemoteServerProjects {
};
let connection_string = connection.host.clone();
let nickname = connection.nickname.clone().map(|s| s.into());
v_flex()
.child(
SshConnectionHeader {
connection_string,
paths: Default::default(),
nickname: connection.nickname.clone(),
nickname,
}
.render(cx),
)
@@ -1319,146 +1300,3 @@ impl Render for RemoteServerProjects {
})
}
}
pub fn reconnect_to_dev_server_project(
workspace: View<Workspace>,
dev_server: DevServer,
dev_server_project_id: DevServerProjectId,
replace_current_window: bool,
cx: &mut WindowContext,
) -> Task<Result<()>> {
let store = dev_server_projects::Store::global(cx);
let reconnect = reconnect_to_dev_server(workspace.clone(), dev_server, cx);
cx.spawn(|mut cx| async move {
reconnect.await?;
cx.background_executor()
.timer(Duration::from_millis(1000))
.await;
if let Some(project_id) = store.update(&mut cx, |store, _| {
store
.dev_server_project(dev_server_project_id)
.and_then(|p| p.project_id)
})? {
workspace
.update(&mut cx, move |_, cx| {
open_dev_server_project(
replace_current_window,
dev_server_project_id,
project_id,
cx,
)
})?
.await?;
}
Ok(())
})
}
pub fn reconnect_to_dev_server(
workspace: View<Workspace>,
dev_server: DevServer,
cx: &mut WindowContext,
) -> Task<Result<()>> {
let Some(ssh_connection_string) = dev_server.ssh_connection_string else {
return Task::ready(Err(anyhow!("Can't reconnect, no ssh_connection_string")));
};
let dev_server_store = dev_server_projects::Store::global(cx);
let get_access_token = dev_server_store.update(cx, |store, cx| {
store.regenerate_dev_server_token(dev_server.id, cx)
});
cx.spawn(|mut cx| async move {
let access_token = get_access_token.await?.access_token;
spawn_ssh_task(
workspace,
dev_server_store,
dev_server.id,
ssh_connection_string.to_string(),
access_token,
&mut cx,
)
.await
})
}
pub async fn spawn_ssh_task(
workspace: View<Workspace>,
dev_server_store: Model<dev_server_projects::Store>,
dev_server_id: DevServerId,
ssh_connection_string: String,
access_token: String,
cx: &mut AsyncWindowContext,
) -> Result<()> {
let terminal_panel = workspace
.update(cx, |workspace, cx| workspace.panel::<TerminalPanel>(cx))
.ok()
.flatten()
.with_context(|| anyhow!("No terminal panel"))?;
let command = "sh".to_string();
let args = vec![
"-x".to_string(),
"-c".to_string(),
format!(
r#"~/.local/bin/zed -v >/dev/stderr || (curl -f https://zed.dev/install.sh || wget -qO- https://zed.dev/install.sh) | sh && ZED_HEADLESS=1 ~/.local/bin/zed --dev-server-token {}"#,
access_token
),
];
let ssh_connection_string = ssh_connection_string.to_string();
let (command, args) = wrap_for_ssh(
&SshCommand::DevServer(ssh_connection_string.clone()),
Some((&command, &args)),
None,
HashMap::default(),
None,
);
let terminal = terminal_panel
.update(cx, |terminal_panel, cx| {
terminal_panel.spawn_in_new_terminal(
SpawnInTerminal {
id: task::TaskId("ssh-remote".into()),
full_label: "Install zed over ssh".into(),
label: "Install zed over ssh".into(),
command,
args,
command_label: ssh_connection_string.clone(),
cwd: None,
use_new_terminal: true,
allow_concurrent_runs: false,
reveal: RevealStrategy::Always,
hide: HideStrategy::Never,
env: Default::default(),
shell: Default::default(),
},
cx,
)
})?
.await?;
terminal
.update(cx, |terminal, cx| terminal.wait_for_completed_task(cx))?
.await;
// There's a race-condition between the task completing successfully, and the server sending us the online status. Make it less likely we'll show the error state.
if dev_server_store.update(cx, |this, _| this.dev_server_status(dev_server_id))?
== DevServerStatus::Offline
{
cx.background_executor()
.timer(Duration::from_millis(200))
.await
}
if dev_server_store.update(cx, |this, _| this.dev_server_status(dev_server_id))?
== DevServerStatus::Offline
{
return Err(anyhow!("couldn't reconnect"))?;
}
Ok(())
}


@@ -26,15 +26,9 @@ use ui::{
};
use workspace::{AppState, ModalView, Workspace};
#[derive(Clone, Default, Serialize, Deserialize, JsonSchema)]
pub struct RemoteServerSettings {
pub download_on_host: Option<bool>,
}
#[derive(Deserialize)]
pub struct SshSettings {
pub ssh_connections: Option<Vec<SshConnection>>,
pub remote_server: Option<RemoteServerSettings>,
}
impl SshSettings {
@@ -42,39 +36,31 @@ impl SshSettings {
self.ssh_connections.clone().into_iter().flatten()
}
pub fn args_for(
pub fn connection_options_for(
&self,
host: &str,
host: String,
port: Option<u16>,
user: &Option<String>,
) -> Option<Vec<String>> {
self.ssh_connections()
.filter_map(|conn| {
if conn.host == host && &conn.username == user && conn.port == port {
Some(conn.args)
} else {
None
}
})
.next()
}
pub fn nickname_for(
&self,
host: &str,
port: Option<u16>,
user: &Option<String>,
) -> Option<SharedString> {
self.ssh_connections()
.filter_map(|conn| {
if conn.host == host && &conn.username == user && conn.port == port {
Some(conn.nickname)
} else {
None
}
})
.next()
.flatten()
username: Option<String>,
) -> SshConnectionOptions {
for conn in self.ssh_connections() {
if conn.host == host && conn.username == username && conn.port == port {
return SshConnectionOptions {
nickname: conn.nickname,
upload_binary_over_ssh: conn.upload_binary_over_ssh.unwrap_or_default(),
args: Some(conn.args),
host,
port,
username,
password: None,
};
}
}
SshConnectionOptions {
host,
port,
username,
..Default::default()
}
}
}
@@ -85,13 +71,20 @@ pub struct SshConnection {
pub username: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub port: Option<u16>,
pub projects: Vec<SshProject>,
/// Name to use for this server in UI.
#[serde(skip_serializing_if = "Option::is_none")]
pub nickname: Option<SharedString>,
#[serde(skip_serializing_if = "Vec::is_empty")]
#[serde(default)]
pub args: Vec<String>,
#[serde(default)]
pub projects: Vec<SshProject>,
/// Name to use for this server in UI.
#[serde(skip_serializing_if = "Option::is_none")]
pub nickname: Option<String>,
// By default Zed will download the binary to the host directly.
// If this is set to true, Zed will download the binary to your local machine,
// and then upload it over the SSH connection. Useful if your SSH server has
// limited outbound internet access.
#[serde(skip_serializing_if = "Option::is_none")]
pub upload_binary_over_ssh: Option<bool>,
}
impl From<SshConnection> for SshConnectionOptions {
@@ -102,6 +95,8 @@ impl From<SshConnection> for SshConnectionOptions {
port: val.port,
password: None,
args: Some(val.args),
nickname: val.nickname,
upload_binary_over_ssh: val.upload_binary_over_ssh.unwrap_or_default(),
}
}
}
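// Editorial sketch (not part of the diff): how the new `upload_binary_over_ssh` setting
// flows into SshConnectionOptions via the From impl above. The helper name is hypothetical;
// only the field and the unwrap_or_default() mapping visible in this diff are assumed.
fn wants_binary_upload(connection: SshConnection) -> bool {
    // None (the default) means the remote host downloads the server binary itself;
    // Some(true) makes Zed download the binary locally and upload it over the SSH
    // connection, which helps when the host has limited outbound internet access.
    let options: SshConnectionOptions = connection.into();
    options.upload_binary_over_ssh
}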
@@ -114,7 +109,6 @@ pub struct SshProject {
#[derive(Clone, Default, Serialize, Deserialize, JsonSchema)]
pub struct RemoteSettingsContent {
pub ssh_connections: Option<Vec<SshConnection>>,
pub remote_server: Option<RemoteServerSettings>,
}
impl Settings for SshSettings {
@@ -153,10 +147,10 @@ pub struct SshConnectionModal {
impl SshPrompt {
pub(crate) fn new(
connection_options: &SshConnectionOptions,
nickname: Option<SharedString>,
cx: &mut ViewContext<Self>,
) -> Self {
let connection_string = connection_options.connection_string().into();
let nickname = connection_options.nickname.clone().map(|s| s.into());
Self {
connection_string,
@@ -276,11 +270,10 @@ impl SshConnectionModal {
pub(crate) fn new(
connection_options: &SshConnectionOptions,
paths: Vec<PathBuf>,
nickname: Option<SharedString>,
cx: &mut ViewContext<Self>,
) -> Self {
Self {
prompt: cx.new_view(|cx| SshPrompt::new(connection_options, nickname, cx)),
prompt: cx.new_view(|cx| SshPrompt::new(connection_options, cx)),
finished: false,
paths,
}
@@ -451,13 +444,17 @@ impl remote::SshClientDelegate for SshClientDelegate {
fn get_server_binary(
&self,
platform: SshPlatform,
upload_binary_over_ssh: bool,
cx: &mut AsyncAppContext,
) -> oneshot::Receiver<Result<(ServerBinary, SemanticVersion)>> {
let (tx, rx) = oneshot::channel();
let this = self.clone();
cx.spawn(|mut cx| async move {
tx.send(this.get_server_binary_impl(platform, &mut cx).await)
.ok();
tx.send(
this.get_server_binary_impl(platform, upload_binary_over_ssh, &mut cx)
.await,
)
.ok();
})
.detach();
rx
@@ -492,19 +489,14 @@ impl SshClientDelegate {
async fn get_server_binary_impl(
&self,
platform: SshPlatform,
upload_binary_via_ssh: bool,
cx: &mut AsyncAppContext,
) -> Result<(ServerBinary, SemanticVersion)> {
let (version, release_channel, download_binary_on_host) = cx.update(|cx| {
let (version, release_channel) = cx.update(|cx| {
let version = AppVersion::global(cx);
let channel = ReleaseChannel::global(cx);
let ssh_settings = SshSettings::get_global(cx);
let download_binary_on_host = ssh_settings
.remote_server
.as_ref()
.and_then(|server| server.download_on_host)
.unwrap_or(false);
(version, channel, download_binary_on_host)
(version, channel)
})?;
// In dev mode, build the remote server binary from source
@@ -517,23 +509,57 @@ impl SshClientDelegate {
}
}
if download_binary_on_host {
let (request_url, request_body) = AutoUpdater::get_latest_remote_server_release_url(
// For nightly channel, always get latest
let current_version = if release_channel == ReleaseChannel::Nightly {
None
} else {
Some(version)
};
self.update_status(
Some(&format!("Checking remote server release {}", version)),
cx,
);
if upload_binary_via_ssh {
let binary_path = AutoUpdater::download_remote_server_release(
platform.os,
platform.arch,
release_channel,
current_version,
cx,
)
.await
.map_err(|e| {
anyhow!(
"Failed to get remote server binary download url (os: {}, arch: {}): {}",
"Failed to download remote server binary (version: {}, os: {}, arch: {}): {}",
version,
platform.os,
platform.arch,
e
)
})?;
Ok((ServerBinary::LocalBinary(binary_path), version))
} else {
let (request_url, request_body) = AutoUpdater::get_remote_server_release_url(
platform.os,
platform.arch,
release_channel,
current_version,
cx,
)
.await
.map_err(|e| {
anyhow!(
"Failed to get remote server binary download url (version: {}, os: {}, arch: {}): {}",
version,
platform.os,
platform.arch,
e
)
})?;
Ok((
ServerBinary::ReleaseUrl {
url: request_url,
@@ -541,25 +567,6 @@ impl SshClientDelegate {
},
version,
))
} else {
self.update_status(Some("Checking for latest version of remote server"), cx);
let binary_path = AutoUpdater::get_latest_remote_server_release(
platform.os,
platform.arch,
release_channel,
cx,
)
.await
.map_err(|e| {
anyhow!(
"Failed to download remote server binary (os: {}, arch: {}): {}",
platform.os,
platform.arch,
e
)
})?;
Ok((ServerBinary::LocalBinary(binary_path), version))
}
}
@@ -700,7 +707,6 @@ pub async fn open_ssh_project(
paths: Vec<PathBuf>,
app_state: Arc<AppState>,
open_options: workspace::OpenOptions,
nickname: Option<SharedString>,
cx: &mut AsyncAppContext,
) -> Result<()> {
let window = if let Some(window) = open_options.replace_window {
@@ -725,12 +731,11 @@ pub async fn open_ssh_project(
let (cancel_tx, cancel_rx) = oneshot::channel();
let delegate = window.update(cx, {
let connection_options = connection_options.clone();
let nickname = nickname.clone();
let paths = paths.clone();
move |workspace, cx| {
cx.activate_window();
workspace.toggle_modal(cx, |cx| {
SshConnectionModal::new(&connection_options, paths, nickname.clone(), cx)
SshConnectionModal::new(&connection_options, paths, cx)
});
let ui = workspace

File diff suppressed because it is too large


@@ -702,7 +702,7 @@ async fn init_test(
) -> (Model<Project>, Model<HeadlessProject>, Arc<FakeFs>) {
init_logger();
let (forwarder, ssh_server_client) = SshRemoteClient::fake_server(cx, server_cx);
let (opts, ssh_server_client) = SshRemoteClient::fake_server(cx, server_cx);
let fs = FakeFs::new(server_cx.executor());
fs.insert_tree(
"/code",
@@ -744,7 +744,7 @@ async fn init_test(
)
});
let ssh = SshRemoteClient::fake_client(forwarder, cx).await;
let ssh = SshRemoteClient::fake_client(opts, cx).await;
let project = build_project(ssh, cx);
project
.update(cx, {


@@ -8,7 +8,7 @@ use client::telemetry::Telemetry;
use collections::{HashMap, HashSet};
use editor::{
display_map::{
BlockContext, BlockDisposition, BlockId, BlockProperties, BlockStyle, CustomBlockId,
BlockContext, BlockId, BlockPlacement, BlockProperties, BlockStyle, CustomBlockId,
RenderBlock,
},
scroll::Autoscroll,
@@ -90,12 +90,11 @@ impl EditorBlock {
let invalidation_anchor = buffer.read(cx).read(cx).anchor_before(next_row_start);
let block = BlockProperties {
position: code_range.end,
placement: BlockPlacement::Below(code_range.end),
// Take up at least one height for status, allow the editor to determine the real height based on the content from render
height: 1,
style: BlockStyle::Sticky,
render: Self::create_output_area_renderer(execution_view.clone(), on_close.clone()),
disposition: BlockDisposition::Below,
priority: 0,
};

crates/rope/src/chunk.rs (new file, 839 lines)

@@ -0,0 +1,839 @@
use crate::{OffsetUtf16, Point, PointUtf16, TextSummary, Unclipped};
use arrayvec::ArrayString;
use std::{cmp, ops::Range};
use sum_tree::Bias;
use unicode_segmentation::GraphemeCursor;
use util::debug_panic;
#[cfg(test)]
pub(crate) const MIN_BASE: usize = 6;
#[cfg(not(test))]
pub(crate) const MIN_BASE: usize = 32;
pub(crate) const MAX_BASE: usize = MIN_BASE * 2;
#[derive(Clone, Debug, Default)]
pub struct Chunk {
chars: usize,
chars_utf16: usize,
tabs: usize,
newlines: usize,
pub text: ArrayString<MAX_BASE>,
}
impl Chunk {
#[inline(always)]
pub fn new(text: &str) -> Self {
let mut this = Chunk::default();
this.push_str(text);
this
}
#[inline(always)]
pub fn push_str(&mut self, text: &str) {
for (char_ix, c) in text.char_indices() {
let ix = self.text.len() + char_ix;
self.chars |= 1 << ix;
self.chars_utf16 |= 1 << ix;
self.chars_utf16 |= c.len_utf16() << ix;
self.tabs |= ((c == '\t') as usize) << ix;
self.newlines |= ((c == '\n') as usize) << ix;
}
self.text.push_str(text);
}
#[inline(always)]
pub fn append(&mut self, slice: ChunkSlice) {
if slice.is_empty() {
return;
};
let base_ix = self.text.len();
self.chars |= slice.chars << base_ix;
self.chars_utf16 |= slice.chars_utf16 << base_ix;
self.tabs |= slice.tabs << base_ix;
self.newlines |= slice.newlines << base_ix;
self.text.push_str(&slice.text);
}
#[inline(always)]
pub fn as_slice(&self) -> ChunkSlice {
ChunkSlice {
chars: self.chars,
chars_utf16: self.chars_utf16,
tabs: self.tabs,
newlines: self.newlines,
text: &self.text,
}
}
#[inline(always)]
pub fn slice(&self, range: Range<usize>) -> ChunkSlice {
self.as_slice().slice(range)
}
}
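// Editorial worked example (not part of the diff): how the bitmaps encode "a😀\n".
// Byte indices: 'a' -> 0, '😀' -> 1..5 (4 UTF-8 bytes, 2 UTF-16 units), '\n' -> 5, so
//   chars       = 0b100011  (one bit at each character's starting byte)
//   chars_utf16 = 0b100111  (one bit per UTF-16 code unit of each character)
//   newlines    = 0b100000  ('\n' starts at byte 5)
// and the count_ones()-based queries answer in O(1). The function name is hypothetical.
fn bitmap_example() {
    let chunk = Chunk::new("a😀\n");
    let slice = chunk.as_slice();
    assert_eq!(slice.len(), 6); // bytes
    assert_eq!(slice.len_utf16().0, 4); // 1 + 2 + 1 UTF-16 code units
    assert_eq!(slice.lines(), Point::new(1, 0)); // one newline, empty last line
    assert_eq!(slice.first_line_chars(), 2); // 'a' and '😀'
}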
#[derive(Clone, Copy, Debug)]
pub struct ChunkSlice<'a> {
chars: usize,
chars_utf16: usize,
tabs: usize,
newlines: usize,
text: &'a str,
}
impl<'a> Into<Chunk> for ChunkSlice<'a> {
fn into(self) -> Chunk {
Chunk {
chars: self.chars,
chars_utf16: self.chars_utf16,
tabs: self.tabs,
newlines: self.newlines,
text: self.text.try_into().unwrap(),
}
}
}
impl<'a> ChunkSlice<'a> {
#[inline(always)]
pub fn is_empty(self) -> bool {
self.text.is_empty()
}
#[inline(always)]
pub fn is_char_boundary(self, offset: usize) -> bool {
self.text.is_char_boundary(offset)
}
#[inline(always)]
pub fn split_at(self, mid: usize) -> (ChunkSlice<'a>, ChunkSlice<'a>) {
if mid == 64 {
let left = self;
let right = ChunkSlice {
chars: 0,
chars_utf16: 0,
tabs: 0,
newlines: 0,
text: "",
};
(left, right)
} else {
let mask = ((1u128 << mid) - 1) as usize;
let (left_text, right_text) = self.text.split_at(mid);
let left = ChunkSlice {
chars: self.chars & mask,
chars_utf16: self.chars_utf16 & mask,
tabs: self.tabs & mask,
newlines: self.newlines & mask,
text: left_text,
};
let right = ChunkSlice {
chars: self.chars >> mid,
chars_utf16: self.chars_utf16 >> mid,
tabs: self.tabs >> mid,
newlines: self.newlines >> mid,
text: right_text,
};
(left, right)
}
}
#[inline(always)]
pub fn slice(self, range: Range<usize>) -> Self {
let mask = ((1u128 << range.end) - 1) as usize;
if range.start == 64 {
Self {
chars: 0,
chars_utf16: 0,
tabs: 0,
newlines: 0,
text: "",
}
} else {
Self {
chars: (self.chars & mask) >> range.start,
chars_utf16: (self.chars_utf16 & mask) >> range.start,
tabs: (self.tabs & mask) >> range.start,
newlines: (self.newlines & mask) >> range.start,
text: &self.text[range],
}
}
}
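// Editorial note (not part of the diff): the masks above are built in u128 because shifting
// a 64-bit usize left by 64 overflows (a panic in debug builds). Widening first keeps the
// "whole chunk" case well-defined, and the explicit `== 64` branches in split_at/slice avoid
// shifting the bitmaps themselves by a full word width. For example:
const _FULL_CHUNK_MASK: usize = ((1u128 << 64) - 1) as usize; // == usize::MAX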
#[inline(always)]
pub fn text_summary(&self) -> TextSummary {
let (longest_row, longest_row_chars) = self.longest_row();
TextSummary {
len: self.len(),
len_utf16: self.len_utf16(),
lines: self.lines(),
first_line_chars: self.first_line_chars(),
last_line_chars: self.last_line_chars(),
last_line_len_utf16: self.last_line_len_utf16(),
longest_row,
longest_row_chars,
}
}
/// Get length in bytes
#[inline(always)]
pub fn len(&self) -> usize {
self.text.len()
}
/// Get length in UTF-16 code units
#[inline(always)]
pub fn len_utf16(&self) -> OffsetUtf16 {
OffsetUtf16(self.chars_utf16.count_ones() as usize)
}
/// Get point representing number of lines and length of last line
#[inline(always)]
pub fn lines(&self) -> Point {
let row = self.newlines.count_ones();
let column = self.newlines.leading_zeros() - (usize::BITS - self.text.len() as u32);
Point::new(row, column)
}
/// Get number of chars in first line
#[inline(always)]
pub fn first_line_chars(&self) -> u32 {
if self.newlines == 0 {
self.chars.count_ones()
} else {
let mask = ((1u128 << self.newlines.trailing_zeros() as usize) - 1) as usize;
(self.chars & mask).count_ones()
}
}
/// Get number of chars in last line
#[inline(always)]
pub fn last_line_chars(&self) -> u32 {
if self.newlines == 0 {
self.chars.count_ones()
} else {
let mask = !(usize::MAX >> self.newlines.leading_zeros());
(self.chars & mask).count_ones()
}
}
/// Get number of UTF-16 code units in last line
#[inline(always)]
pub fn last_line_len_utf16(&self) -> u32 {
if self.newlines == 0 {
self.chars_utf16.count_ones()
} else {
let mask = !(usize::MAX >> self.newlines.leading_zeros());
(self.chars_utf16 & mask).count_ones()
}
}
/// Get the longest row in the chunk and its length in characters.
#[inline(always)]
pub fn longest_row(&self) -> (u32, u32) {
let mut chars = self.chars;
let mut newlines = self.newlines;
let mut row = 0;
let mut longest_row = 0;
let mut longest_row_chars = 0;
while newlines > 0 {
let newline_ix = newlines.trailing_zeros();
let row_chars = (chars & ((1 << newline_ix) - 1)).count_ones() as u8;
if row_chars > longest_row_chars {
longest_row = row;
longest_row_chars = row_chars;
}
newlines >>= newline_ix;
newlines >>= 1;
chars >>= newline_ix;
chars >>= 1;
row += 1;
}
let row_chars = chars.count_ones() as u8;
if row_chars > longest_row_chars {
(row, row_chars as u32)
} else {
(longest_row, longest_row_chars as u32)
}
}
#[inline(always)]
pub fn offset_to_point(&self, offset: usize) -> Point {
if !self.text.is_char_boundary(offset) {
debug_panic!(
"offset {:?} is not a char boundary for string {:?}",
offset,
self.text
);
return Point::zero();
}
let mask = ((1u128 << offset) - 1) as usize;
let row = (self.newlines & mask).count_ones();
let newline_ix = usize::BITS - (self.newlines & mask).leading_zeros();
let column = (offset - newline_ix as usize) as u32;
Point::new(row, column)
}
#[inline(always)]
pub fn point_to_offset(&self, point: Point) -> usize {
if point.row > self.newlines.count_ones() {
debug_panic!(
"point {:?} extends beyond rows for string {:?}",
point,
self.text
);
return 0;
}
let row_start_offset = if point.row > 0 {
(nth_set_bit(self.newlines, point.row as usize) + 1) as usize
} else {
0
};
let newlines = if row_start_offset == usize::BITS as usize {
0
} else {
self.newlines >> row_start_offset
};
let row_len = cmp::min(newlines.trailing_zeros(), self.text.len() as u32);
if point.column > row_len {
debug_panic!(
"point {:?} extends beyond row for string {:?}",
point,
self.text
);
return row_start_offset + row_len as usize;
}
row_start_offset + point.column as usize
}
#[inline(always)]
pub fn offset_to_offset_utf16(&self, offset: usize) -> OffsetUtf16 {
let mask = ((1u128 << offset) - 1) as usize;
OffsetUtf16((self.chars_utf16 & mask).count_ones() as usize)
}
#[inline(always)]
pub fn offset_utf16_to_offset(&self, target: OffsetUtf16) -> usize {
if target.0 == 0 {
0
} else {
let ix = nth_set_bit(self.chars_utf16, target.0) + 1;
if ix == 64 {
64
} else {
let utf8_additional_len = cmp::min(
(self.chars_utf16 >> ix).trailing_zeros() as usize,
self.text.len() - ix,
);
ix + utf8_additional_len
}
}
}
#[inline(always)]
pub fn offset_to_point_utf16(&self, offset: usize) -> PointUtf16 {
let mask = ((1u128 << offset) - 1) as usize;
let row = (self.newlines & mask).count_ones();
let newline_ix = usize::BITS - (self.newlines & mask).leading_zeros();
let column = if newline_ix == 64 {
0
} else {
((self.chars_utf16 & mask) >> newline_ix).count_ones()
};
PointUtf16::new(row, column)
}
#[inline(always)]
pub fn point_to_point_utf16(&self, point: Point) -> PointUtf16 {
self.offset_to_point_utf16(self.point_to_offset(point))
}
#[inline(always)]
pub fn point_utf16_to_offset(&self, point: PointUtf16, clip: bool) -> usize {
let lines = self.lines();
if point.row > lines.row {
if !clip {
debug_panic!(
"point {:?} is beyond this chunk's extent {:?}",
point,
self.text
);
}
return self.len();
}
let row_start_offset = if point.row > 0 {
(nth_set_bit(self.newlines, point.row as usize) + 1) as usize
} else {
0
};
let row_len_utf8 = if row_start_offset == 64 {
0
} else {
cmp::min(
(self.newlines >> row_start_offset).trailing_zeros(),
(self.text.len() - row_start_offset) as u32,
)
};
let mask = ((1u128 << row_len_utf8) - 1) as usize;
let row_chars_utf16 = if row_start_offset == 64 {
0
} else {
(self.chars_utf16 >> row_start_offset) & mask
};
if point.column > row_chars_utf16.count_ones() {
if !clip {
debug_panic!(
"point {:?} is beyond the end of the line in chunk {:?}",
point,
self.text
);
}
return row_start_offset + row_len_utf8 as usize;
}
let mut offset = row_start_offset;
if point.column > 0 {
let offset_within_row = nth_set_bit(row_chars_utf16, point.column as usize) + 1;
offset += offset_within_row;
if offset < 64 {
offset += cmp::min(
(self.chars_utf16 >> offset).trailing_zeros() as usize,
self.text.len() - offset,
);
}
if !self.text.is_char_boundary(offset) {
offset -= 1;
while !self.text.is_char_boundary(offset) {
offset -= 1;
}
if !clip {
debug_panic!(
"point {:?} is within character in chunk {:?}",
point,
self.text,
);
}
}
}
offset
}
pub fn unclipped_point_utf16_to_point(&self, target: Unclipped<PointUtf16>) -> Point {
let mut point = Point::zero();
let mut point_utf16 = PointUtf16::zero();
for ch in self.text.chars() {
if point_utf16 == target.0 {
break;
}
if point_utf16 > target.0 {
// If the point is past the end of a line or inside of a code point,
// return the last valid point before the target.
return point;
}
if ch == '\n' {
point_utf16 += PointUtf16::new(1, 0);
point += Point::new(1, 0);
} else {
point_utf16 += PointUtf16::new(0, ch.len_utf16() as u32);
point += Point::new(0, ch.len_utf8() as u32);
}
}
point
}
// todo!("use bitsets")
pub fn clip_point(&self, target: Point, bias: Bias) -> Point {
for (row, line) in self.text.split('\n').enumerate() {
if row == target.row as usize {
let bytes = line.as_bytes();
let mut column = target.column.min(bytes.len() as u32) as usize;
if column == 0
|| column == bytes.len()
|| (bytes[column - 1] < 128 && bytes[column] < 128)
{
return Point::new(row as u32, column as u32);
}
let mut grapheme_cursor = GraphemeCursor::new(column, bytes.len(), true);
loop {
if line.is_char_boundary(column)
&& grapheme_cursor.is_boundary(line, 0).unwrap_or(false)
{
break;
}
match bias {
Bias::Left => column -= 1,
Bias::Right => column += 1,
}
grapheme_cursor.set_cursor(column);
}
return Point::new(row as u32, column as u32);
}
}
unreachable!()
}
// todo!("use bitsets")
pub fn clip_point_utf16(&self, target: Unclipped<PointUtf16>, bias: Bias) -> PointUtf16 {
for (row, line) in self.text.split('\n').enumerate() {
if row == target.0.row as usize {
let mut code_units = line.encode_utf16();
let mut column = code_units.by_ref().take(target.0.column as usize).count();
if char::decode_utf16(code_units).next().transpose().is_err() {
match bias {
Bias::Left => column -= 1,
Bias::Right => column += 1,
}
}
return PointUtf16::new(row as u32, column as u32);
}
}
unreachable!()
}
// todo!("use bitsets")
pub fn clip_offset_utf16(&self, target: OffsetUtf16, bias: Bias) -> OffsetUtf16 {
let mut code_units = self.text.encode_utf16();
let mut offset = code_units.by_ref().take(target.0).count();
if char::decode_utf16(code_units).next().transpose().is_err() {
match bias {
Bias::Left => offset -= 1,
Bias::Right => offset += 1,
}
}
OffsetUtf16(offset)
}
}
/// Finds the n-th bit that is set to 1.
#[inline(always)]
fn nth_set_bit(v: usize, mut n: usize) -> usize {
let v = v.reverse_bits();
let mut s: usize = 64;
let mut t: usize;
// Parallel bit count intermediates
let a = v - ((v >> 1) & usize::MAX / 3);
let b = (a & usize::MAX / 5) + ((a >> 2) & usize::MAX / 5);
let c = (b + (b >> 4)) & usize::MAX / 0x11;
let d = (c + (c >> 8)) & usize::MAX / 0x101;
t = (d >> 32) + (d >> 48);
// Branchless select
s -= ((t.wrapping_sub(n)) & 256) >> 3;
n -= t & ((t.wrapping_sub(n)) >> 8);
t = (d >> (s - 16)) & 0xff;
s -= ((t.wrapping_sub(n)) & 256) >> 4;
n -= t & ((t.wrapping_sub(n)) >> 8);
t = (c >> (s - 8)) & 0xf;
s -= ((t.wrapping_sub(n)) & 256) >> 5;
n -= t & ((t.wrapping_sub(n)) >> 8);
t = (b >> (s - 4)) & 0x7;
s -= ((t.wrapping_sub(n)) & 256) >> 6;
n -= t & ((t.wrapping_sub(n)) >> 8);
t = (a >> (s - 2)) & 0x3;
s -= ((t.wrapping_sub(n)) & 256) >> 7;
n -= t & ((t.wrapping_sub(n)) >> 8);
t = (v >> (s - 1)) & 0x1;
s -= ((t.wrapping_sub(n)) & 256) >> 8;
65 - s - 1
}
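// Editorial reference (not part of the diff): a straightforward version of nth_set_bit,
// useful for checking what the branchless code above computes. `n` is 1-based, and the
// return value is the 0-based index (counted from the least significant bit) of the n-th
// set bit, matching the tests below.
#[allow(dead_code)]
fn nth_set_bit_naive(v: usize, n: usize) -> usize {
    let mut remaining = n;
    for ix in 0..usize::BITS as usize {
        if v & (1 << ix) != 0 {
            remaining -= 1;
            if remaining == 0 {
                return ix;
            }
        }
    }
    panic!("{v:#b} has fewer than {n} set bits");
}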
#[cfg(test)]
mod tests {
use super::*;
use rand::prelude::*;
use util::RandomCharIter;
#[gpui::test(iterations = 100)]
fn test_random_chunks(mut rng: StdRng) {
let max_len = std::env::var("CHUNK_MAX_LEN")
.ok()
.and_then(|s| s.parse().ok())
.unwrap_or(64);
let chunk_len = rng.gen_range(0..=max_len);
let text = RandomCharIter::new(&mut rng)
.take(chunk_len)
.collect::<String>();
let mut ix = chunk_len;
while !text.is_char_boundary(ix) {
ix -= 1;
}
let text = &text[..ix];
log::info!("Chunk: {:?}", text);
let chunk = Chunk::new(&text);
verify_chunk(chunk.as_slice(), text);
for _ in 0..10 {
let mut start = rng.gen_range(0..=chunk.text.len());
let mut end = rng.gen_range(start..=chunk.text.len());
while !chunk.text.is_char_boundary(start) {
start -= 1;
}
while !chunk.text.is_char_boundary(end) {
end -= 1;
}
let range = start..end;
log::info!("Range: {:?}", range);
let text_slice = &text[range.clone()];
let chunk_slice = chunk.slice(range);
verify_chunk(chunk_slice, text_slice);
}
}
#[test]
fn test_nth_set_bit() {
assert_eq!(
nth_set_bit(
0b1000000000000000000000000000000000000000000000000000000000000000,
1
),
63
);
assert_eq!(
nth_set_bit(
0b1100000000000000000000000000000000000000000000000000000000000000,
1
),
62
);
assert_eq!(
nth_set_bit(
0b1100000000000000000000000000000000000000000000000000000000000000,
2
),
63
);
assert_eq!(
nth_set_bit(
0b0000000000000000000000000000000000000000000000000000000000000001,
1
),
0
);
assert_eq!(
nth_set_bit(
0b0000000000000000000000000000000000000000000000000000000000000011,
2
),
1
);
assert_eq!(
nth_set_bit(
0b0101010101010101010101010101010101010101010101010101010101010101,
1
),
0
);
assert_eq!(
nth_set_bit(
0b0101010101010101010101010101010101010101010101010101010101010101,
32
),
62
);
assert_eq!(
nth_set_bit(
0b1111111111111111111111111111111111111111111111111111111111111111,
64
),
63
);
assert_eq!(
nth_set_bit(
0b1111111111111111111111111111111111111111111111111111111111111111,
1
),
0
);
assert_eq!(
nth_set_bit(
0b1010101010101010101010101010101010101010101010101010101010101010,
1
),
1
);
assert_eq!(
nth_set_bit(
0b1111000011110000111100001111000011110000111100001111000011110000,
8
),
15
);
}
fn verify_chunk(chunk: ChunkSlice<'_>, text: &str) {
let mut offset = 0;
let mut offset_utf16 = OffsetUtf16(0);
let mut point = Point::zero();
let mut point_utf16 = PointUtf16::zero();
log::info!("Verifying chunk {:?}", text);
assert_eq!(chunk.offset_to_point(0), Point::zero());
for c in text.chars() {
let expected_point = chunk.offset_to_point(offset);
assert_eq!(point, expected_point, "mismatch at offset {}", offset);
assert_eq!(
chunk.point_to_offset(point),
offset,
"mismatch at point {:?}",
point
);
assert_eq!(
chunk.offset_to_offset_utf16(offset),
offset_utf16,
"mismatch at offset {}",
offset
);
assert_eq!(
chunk.offset_utf16_to_offset(offset_utf16),
offset,
"mismatch at offset_utf16 {:?}",
offset_utf16
);
assert_eq!(
chunk.point_to_point_utf16(point),
point_utf16,
"mismatch at point {:?}",
point
);
assert_eq!(
chunk.point_utf16_to_offset(point_utf16, false),
offset,
"mismatch at point_utf16 {:?}",
point_utf16
);
if c == '\n' {
point.row += 1;
point.column = 0;
point_utf16.row += 1;
point_utf16.column = 0;
} else {
point.column += c.len_utf8() as u32;
point_utf16.column += c.len_utf16() as u32;
}
offset += c.len_utf8();
offset_utf16.0 += c.len_utf16();
}
let final_point = chunk.offset_to_point(offset);
assert_eq!(point, final_point, "mismatch at final offset {}", offset);
assert_eq!(
chunk.point_to_offset(point),
offset,
"mismatch at point {:?}",
point
);
assert_eq!(
chunk.offset_to_offset_utf16(offset),
offset_utf16,
"mismatch at offset {}",
offset
);
assert_eq!(
chunk.offset_utf16_to_offset(offset_utf16),
offset,
"mismatch at offset_utf16 {:?}",
offset_utf16
);
assert_eq!(
chunk.point_to_point_utf16(point),
point_utf16,
"mismatch at final point {:?}",
point
);
assert_eq!(
chunk.point_utf16_to_offset(point_utf16, false),
offset,
"mismatch at final point_utf16 {:?}",
point_utf16
);
// Verify length methods
assert_eq!(chunk.len(), text.len());
assert_eq!(
chunk.len_utf16().0,
text.chars().map(|c| c.len_utf16()).sum::<usize>()
);
// Verify line counting
let lines = chunk.lines();
let mut newline_count = 0;
let mut last_line_len = 0;
for c in text.chars() {
if c == '\n' {
newline_count += 1;
last_line_len = 0;
} else {
last_line_len += c.len_utf8() as u32;
}
}
assert_eq!(lines, Point::new(newline_count, last_line_len));
// Verify first/last line chars
if !text.is_empty() {
let first_line = text.split('\n').next().unwrap();
assert_eq!(chunk.first_line_chars(), first_line.chars().count() as u32);
let last_line = text.split('\n').last().unwrap();
assert_eq!(chunk.last_line_chars(), last_line.chars().count() as u32);
assert_eq!(
chunk.last_line_len_utf16(),
last_line.chars().map(|c| c.len_utf16() as u32).sum::<u32>()
);
}
// Verify longest row
let (longest_row, longest_chars) = chunk.longest_row();
let mut max_chars = 0;
let mut current_row = 0;
let mut current_chars = 0;
let mut max_row = 0;
for c in text.chars() {
if c == '\n' {
if current_chars > max_chars {
max_chars = current_chars;
max_row = current_row;
}
current_row += 1;
current_chars = 0;
} else {
current_chars += 1;
}
}
if current_chars > max_chars {
max_chars = current_chars;
max_row = current_row;
}
assert_eq!((max_row, max_chars as u32), (longest_row, longest_chars));
}
}


@@ -1,9 +1,10 @@
mod chunk;
mod offset_utf16;
mod point;
mod point_utf16;
mod unclipped;
use arrayvec::ArrayString;
use chunk::{Chunk, ChunkSlice};
use smallvec::SmallVec;
use std::{
cmp, fmt, io, mem,
@@ -11,20 +12,12 @@ use std::{
str,
};
use sum_tree::{Bias, Dimension, SumTree};
use unicode_segmentation::GraphemeCursor;
use util::debug_panic;
pub use offset_utf16::OffsetUtf16;
pub use point::Point;
pub use point_utf16::PointUtf16;
pub use unclipped::Unclipped;
#[cfg(test)]
const CHUNK_BASE: usize = 6;
#[cfg(not(test))]
const CHUNK_BASE: usize = 64;
#[derive(Clone, Default)]
pub struct Rope {
chunks: SumTree<Chunk>,
@@ -39,10 +32,13 @@ impl Rope {
let mut chunks = rope.chunks.cursor::<()>(&());
chunks.next(&());
if let Some(chunk) = chunks.item() {
if self.chunks.last().map_or(false, |c| c.0.len() < CHUNK_BASE)
|| chunk.0.len() < CHUNK_BASE
if self
.chunks
.last()
.map_or(false, |c| c.text.len() < chunk::MIN_BASE)
|| chunk.text.len() < chunk::MIN_BASE
{
self.push(&chunk.0);
self.push(&chunk.text);
chunks.next(&());
}
}
@@ -77,11 +73,13 @@ impl Rope {
pub fn push(&mut self, mut text: &str) {
self.chunks.update_last(
|last_chunk| {
let split_ix = if last_chunk.0.len() + text.len() <= 2 * CHUNK_BASE {
let split_ix = if last_chunk.text.len() + text.len() <= chunk::MAX_BASE {
text.len()
} else {
let mut split_ix =
cmp::min(CHUNK_BASE.saturating_sub(last_chunk.0.len()), text.len());
let mut split_ix = cmp::min(
chunk::MIN_BASE.saturating_sub(last_chunk.text.len()),
text.len(),
);
while !text.is_char_boundary(split_ix) {
split_ix += 1;
}
@@ -89,7 +87,7 @@ impl Rope {
};
let (suffix, remainder) = text.split_at(split_ix);
last_chunk.0.push_str(suffix);
last_chunk.push_str(suffix);
text = remainder;
},
&(),
@@ -101,12 +99,12 @@ impl Rope {
let mut new_chunks = SmallVec::<[_; 16]>::new();
while !text.is_empty() {
let mut split_ix = cmp::min(2 * CHUNK_BASE, text.len());
let mut split_ix = cmp::min(chunk::MAX_BASE, text.len());
while !text.is_char_boundary(split_ix) {
split_ix -= 1;
}
let (chunk, remainder) = text.split_at(split_ix);
new_chunks.push(Chunk(ArrayString::from(chunk).unwrap()));
new_chunks.push(Chunk::new(chunk));
text = remainder;
}
@@ -135,7 +133,7 @@ impl Rope {
// a chunk ends with 3 bytes of a 4-byte character. These 3 bytes end up being stored in the following chunk, thus wasting
// 3 bytes of storage in current chunk.
// For example, a 1024-byte string can occupy between 32 (full ASCII, 1024/32) and 36 (full 4-byte UTF-8, 1024 / 29 rounded up) chunks.
const MIN_CHUNK_SIZE: usize = 2 * CHUNK_BASE - 3;
const MIN_CHUNK_SIZE: usize = chunk::MAX_BASE - 3;
// We also round up the capacity up by one, for a good measure; we *really* don't want to realloc here, as we assume that the # of characters
// we're working with there is large.
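// Editorial worked arithmetic (not part of the diff): with the non-test constants from
// chunk.rs above (MAX_BASE = 64), MIN_CHUNK_SIZE = 61. Assuming the elided capacity formula
// is text.len() / MIN_CHUNK_SIZE + 1, a 1024-byte insertion reserves 1024 / 61 + 1 = 17
// chunk slots, covering both all-ASCII packing (1024 / 64 = 16 chunks) and the worst-case
// 4-byte-UTF-8 packing (ceil(1024 / 61) = 17 chunks).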
@@ -143,12 +141,12 @@ impl Rope {
let mut new_chunks = Vec::with_capacity(capacity);
while !text.is_empty() {
let mut split_ix = cmp::min(2 * CHUNK_BASE, text.len());
let mut split_ix = cmp::min(chunk::MAX_BASE, text.len());
while !text.is_char_boundary(split_ix) {
split_ix -= 1;
}
let (chunk, remainder) = text.split_at(split_ix);
new_chunks.push(Chunk(ArrayString::from(chunk).unwrap()));
new_chunks.push(Chunk::new(chunk));
text = remainder;
}
@@ -165,6 +163,35 @@ impl Rope {
self.check_invariants();
}
fn push_chunk(&mut self, mut chunk: ChunkSlice) {
self.chunks.update_last(
|last_chunk| {
let split_ix = if last_chunk.text.len() + chunk.len() <= chunk::MAX_BASE {
chunk.len()
} else {
let mut split_ix = cmp::min(
chunk::MIN_BASE.saturating_sub(last_chunk.text.len()),
chunk.len(),
);
while !chunk.is_char_boundary(split_ix) {
split_ix += 1;
}
split_ix
};
let (suffix, remainder) = chunk.split_at(split_ix);
last_chunk.append(suffix);
chunk = remainder;
},
&(),
);
if !chunk.is_empty() {
self.chunks.push(chunk.into());
}
}
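// Editorial sketch (not part of the diff) of how push()/push_chunk() pack text, using the
// non-test constants MIN_BASE = 32 and MAX_BASE = 64 from chunk.rs. The function name is
// hypothetical.
fn packing_example() {
    let mut rope = Rope::default();
    rope.push(&"a".repeat(100)); // empty rope: nothing to top up, so it splits into 64 + 36 byte chunks
    rope.push(&"b".repeat(10)); // 36 + 10 <= MAX_BASE, so the last chunk simply grows to 46 bytes
}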
pub fn push_front(&mut self, text: &str) {
let suffix = mem::replace(self, Rope::from(text));
self.append(suffix);
@@ -178,7 +205,7 @@ impl Rope {
let mut chunks = self.chunks.cursor::<()>(&()).peekable();
while let Some(chunk) = chunks.next() {
if chunks.peek().is_some() {
assert!(chunk.0.len() + 3 >= CHUNK_BASE);
assert!(chunk.text.len() + 3 >= chunk::MIN_BASE);
}
}
}
@@ -250,7 +277,7 @@ impl Rope {
let overshoot = offset - cursor.start().0;
cursor.start().1
+ cursor.item().map_or(Default::default(), |chunk| {
chunk.offset_to_offset_utf16(overshoot)
chunk.as_slice().offset_to_offset_utf16(overshoot)
})
}
@@ -263,7 +290,7 @@ impl Rope {
let overshoot = offset - cursor.start().0;
cursor.start().1
+ cursor.item().map_or(Default::default(), |chunk| {
chunk.offset_utf16_to_offset(overshoot)
chunk.as_slice().offset_utf16_to_offset(overshoot)
})
}
@@ -275,9 +302,9 @@ impl Rope {
cursor.seek(&offset, Bias::Left, &());
let overshoot = offset - cursor.start().0;
cursor.start().1
+ cursor
.item()
.map_or(Point::zero(), |chunk| chunk.offset_to_point(overshoot))
+ cursor.item().map_or(Point::zero(), |chunk| {
chunk.as_slice().offset_to_point(overshoot)
})
}
pub fn offset_to_point_utf16(&self, offset: usize) -> PointUtf16 {
@@ -289,7 +316,7 @@ impl Rope {
let overshoot = offset - cursor.start().0;
cursor.start().1
+ cursor.item().map_or(PointUtf16::zero(), |chunk| {
chunk.offset_to_point_utf16(overshoot)
chunk.as_slice().offset_to_point_utf16(overshoot)
})
}
@@ -302,7 +329,7 @@ impl Rope {
let overshoot = point - cursor.start().0;
cursor.start().1
+ cursor.item().map_or(PointUtf16::zero(), |chunk| {
chunk.point_to_point_utf16(overshoot)
chunk.as_slice().point_to_point_utf16(overshoot)
})
}
@@ -316,7 +343,7 @@ impl Rope {
cursor.start().1
+ cursor
.item()
.map_or(0, |chunk| chunk.point_to_offset(overshoot))
.map_or(0, |chunk| chunk.as_slice().point_to_offset(overshoot))
}
pub fn point_utf16_to_offset(&self, point: PointUtf16) -> usize {
@@ -335,9 +362,9 @@ impl Rope {
cursor.seek(&point, Bias::Left, &());
let overshoot = point - cursor.start().0;
cursor.start().1
+ cursor
.item()
.map_or(0, |chunk| chunk.point_utf16_to_offset(overshoot, clip))
+ cursor.item().map_or(0, |chunk| {
chunk.as_slice().point_utf16_to_offset(overshoot, clip)
})
}
pub fn unclipped_point_utf16_to_point(&self, point: Unclipped<PointUtf16>) -> Point {
@@ -349,7 +376,7 @@ impl Rope {
let overshoot = Unclipped(point.0 - cursor.start().0);
cursor.start().1
+ cursor.item().map_or(Point::zero(), |chunk| {
chunk.unclipped_point_utf16_to_point(overshoot)
chunk.as_slice().unclipped_point_utf16_to_point(overshoot)
})
}
@@ -358,7 +385,7 @@ impl Rope {
cursor.seek(&offset, Bias::Left, &());
if let Some(chunk) = cursor.item() {
let mut ix = offset - cursor.start();
while !chunk.0.is_char_boundary(ix) {
while !chunk.text.is_char_boundary(ix) {
match bias {
Bias::Left => {
ix -= 1;
@@ -381,7 +408,7 @@ impl Rope {
cursor.seek(&offset, Bias::Right, &());
if let Some(chunk) = cursor.item() {
let overshoot = offset - cursor.start();
*cursor.start() + chunk.clip_offset_utf16(overshoot, bias)
*cursor.start() + chunk.as_slice().clip_offset_utf16(overshoot, bias)
} else {
self.summary().len_utf16
}
@@ -392,7 +419,7 @@ impl Rope {
cursor.seek(&point, Bias::Right, &());
if let Some(chunk) = cursor.item() {
let overshoot = point - cursor.start();
*cursor.start() + chunk.clip_point(overshoot, bias)
*cursor.start() + chunk.as_slice().clip_point(overshoot, bias)
} else {
self.summary().lines
}
@@ -403,7 +430,7 @@ impl Rope {
cursor.seek(&point.0, Bias::Right, &());
if let Some(chunk) = cursor.item() {
let overshoot = Unclipped(point.0 - cursor.start());
*cursor.start() + chunk.clip_point_utf16(overshoot, bias)
*cursor.start() + chunk.as_slice().clip_point_utf16(overshoot, bias)
} else {
self.summary().lines_utf16()
}
@@ -466,7 +493,7 @@ impl fmt::Debug for Rope {
pub struct Cursor<'a> {
rope: &'a Rope,
chunks: sum_tree::Cursor<'a, Chunk, usize>,
chunks: sum_tree::Cursor<'a, Chunk<{ chunk::MAX_BASE }>, usize>,
offset: usize,
}
@@ -500,7 +527,7 @@ impl<'a> Cursor<'a> {
if let Some(start_chunk) = self.chunks.item() {
let start_ix = self.offset - self.chunks.start();
let end_ix = cmp::min(end_offset, self.chunks.end(&())) - self.chunks.start();
slice.push(&start_chunk.0[start_ix..end_ix]);
slice.push(&start_chunk.text[start_ix..end_ix]);
}
if end_offset > self.chunks.end(&()) {
@@ -510,7 +537,7 @@ impl<'a> Cursor<'a> {
});
if let Some(end_chunk) = self.chunks.item() {
let end_ix = end_offset - self.chunks.start();
slice.push(&end_chunk.0[..end_ix]);
slice.push(&end_chunk.text[..end_ix]);
}
}
@@ -525,9 +552,7 @@ impl<'a> Cursor<'a> {
if let Some(start_chunk) = self.chunks.item() {
let start_ix = self.offset - self.chunks.start();
let end_ix = cmp::min(end_offset, self.chunks.end(&())) - self.chunks.start();
summary.add_assign(&D::from_text_summary(&TextSummary::from(
&start_chunk.0[start_ix..end_ix],
)));
summary.add_assign(&D::from_chunk(start_chunk.slice(start_ix..end_ix)));
}
if end_offset > self.chunks.end(&()) {
@@ -535,9 +560,7 @@ impl<'a> Cursor<'a> {
summary.add_assign(&self.chunks.summary(&end_offset, Bias::Right, &()));
if let Some(end_chunk) = self.chunks.item() {
let end_ix = end_offset - self.chunks.start();
summary.add_assign(&D::from_text_summary(&TextSummary::from(
&end_chunk.0[..end_ix],
)));
summary.add_assign(&D::from_chunk(end_chunk.slice(0..end_ix)));
}
}
@@ -555,7 +578,7 @@ impl<'a> Cursor<'a> {
}
pub struct Chunks<'a> {
chunks: sum_tree::Cursor<'a, Chunk, usize>,
chunks: sum_tree::Cursor<'a, Chunk<{ chunk::MAX_BASE }>, usize>,
range: Range<usize>,
offset: usize,
reversed: bool,
@@ -678,11 +701,11 @@ impl<'a> Chunks<'a> {
if let Some(chunk) = self.chunks.item() {
let mut end_ix = self.offset - *self.chunks.start();
if chunk.0.as_bytes()[end_ix - 1] == b'\n' {
if chunk.text.as_bytes()[end_ix - 1] == b'\n' {
end_ix -= 1;
}
if let Some(newline_ix) = chunk.0[..end_ix].rfind('\n') {
if let Some(newline_ix) = chunk.text[..end_ix].rfind('\n') {
self.offset = *self.chunks.start() + newline_ix + 1;
if self.offset_is_valid() {
return true;
@@ -694,7 +717,7 @@ impl<'a> Chunks<'a> {
.search_backward(|summary| summary.text.lines.row > 0, &());
self.offset = *self.chunks.start();
if let Some(chunk) = self.chunks.item() {
if let Some(newline_ix) = chunk.0.rfind('\n') {
if let Some(newline_ix) = chunk.text.rfind('\n') {
self.offset += newline_ix + 1;
if self.offset_is_valid() {
if self.offset == self.chunks.end(&()) {
@@ -731,7 +754,7 @@ impl<'a> Chunks<'a> {
slice_start..slice_end
};
Some(&chunk.0[slice_range])
Some(&chunk.text[slice_range])
}
pub fn lines(self) -> Lines<'a> {
@@ -767,7 +790,7 @@ impl<'a> Iterator for Chunks<'a> {
}
pub struct Bytes<'a> {
chunks: sum_tree::Cursor<'a, Chunk, usize>,
chunks: sum_tree::Cursor<'a, Chunk<{ chunk::MAX_BASE }>, usize>,
range: Range<usize>,
reversed: bool,
}
@@ -798,7 +821,7 @@ impl<'a> Bytes<'a> {
}
let start = self.range.start.saturating_sub(chunk_start);
let end = self.range.end - chunk_start;
Some(&chunk.0.as_bytes()[start..chunk.0.len().min(end)])
Some(&chunk.text.as_bytes()[start..chunk.text.len().min(end)])
}
}
@@ -902,265 +925,13 @@ impl<'a> Lines<'a> {
}
}
#[derive(Clone, Debug, Default)]
struct Chunk(ArrayString<{ 2 * CHUNK_BASE }>);
impl Chunk {
fn offset_to_offset_utf16(&self, target: usize) -> OffsetUtf16 {
let mut offset = 0;
let mut offset_utf16 = OffsetUtf16(0);
for ch in self.0.chars() {
if offset >= target {
break;
}
offset += ch.len_utf8();
offset_utf16.0 += ch.len_utf16();
}
offset_utf16
}
fn offset_utf16_to_offset(&self, target: OffsetUtf16) -> usize {
let mut offset_utf16 = OffsetUtf16(0);
let mut offset = 0;
for ch in self.0.chars() {
if offset_utf16 >= target {
break;
}
offset += ch.len_utf8();
offset_utf16.0 += ch.len_utf16();
}
offset
}
fn offset_to_point(&self, target: usize) -> Point {
let mut offset = 0;
let mut point = Point::new(0, 0);
for ch in self.0.chars() {
if offset >= target {
break;
}
if ch == '\n' {
point.row += 1;
point.column = 0;
} else {
point.column += ch.len_utf8() as u32;
}
offset += ch.len_utf8();
}
point
}
fn offset_to_point_utf16(&self, target: usize) -> PointUtf16 {
let mut offset = 0;
let mut point = PointUtf16::new(0, 0);
for ch in self.0.chars() {
if offset >= target {
break;
}
if ch == '\n' {
point.row += 1;
point.column = 0;
} else {
point.column += ch.len_utf16() as u32;
}
offset += ch.len_utf8();
}
point
}
fn point_to_offset(&self, target: Point) -> usize {
let mut offset = 0;
let mut point = Point::new(0, 0);
for ch in self.0.chars() {
if point >= target {
if point > target {
debug_panic!("point {target:?} is inside of character {ch:?}");
}
break;
}
if ch == '\n' {
point.row += 1;
point.column = 0;
if point.row > target.row {
debug_panic!(
"point {target:?} is beyond the end of a line with length {}",
point.column
);
break;
}
} else {
point.column += ch.len_utf8() as u32;
}
offset += ch.len_utf8();
}
offset
}
fn point_to_point_utf16(&self, target: Point) -> PointUtf16 {
let mut point = Point::zero();
let mut point_utf16 = PointUtf16::new(0, 0);
for ch in self.0.chars() {
if point >= target {
break;
}
if ch == '\n' {
point_utf16.row += 1;
point_utf16.column = 0;
point.row += 1;
point.column = 0;
} else {
point_utf16.column += ch.len_utf16() as u32;
point.column += ch.len_utf8() as u32;
}
}
point_utf16
}
fn point_utf16_to_offset(&self, target: PointUtf16, clip: bool) -> usize {
let mut offset = 0;
let mut point = PointUtf16::new(0, 0);
for ch in self.0.chars() {
if point == target {
break;
}
if ch == '\n' {
point.row += 1;
point.column = 0;
if point.row > target.row {
if !clip {
debug_panic!(
"point {target:?} is beyond the end of a line with length {}",
point.column
);
}
// Return the offset of the newline
return offset;
}
} else {
point.column += ch.len_utf16() as u32;
}
if point > target {
if !clip {
debug_panic!("point {target:?} is inside of codepoint {ch:?}");
}
// Return the offset of the codepoint which we have landed within, bias left
return offset;
}
offset += ch.len_utf8();
}
offset
}
fn unclipped_point_utf16_to_point(&self, target: Unclipped<PointUtf16>) -> Point {
let mut point = Point::zero();
let mut point_utf16 = PointUtf16::zero();
for ch in self.0.chars() {
if point_utf16 == target.0 {
break;
}
if point_utf16 > target.0 {
// If the point is past the end of a line or inside of a code point,
// return the last valid point before the target.
return point;
}
if ch == '\n' {
point_utf16 += PointUtf16::new(1, 0);
point += Point::new(1, 0);
} else {
point_utf16 += PointUtf16::new(0, ch.len_utf16() as u32);
point += Point::new(0, ch.len_utf8() as u32);
}
}
point
}
fn clip_point(&self, target: Point, bias: Bias) -> Point {
for (row, line) in self.0.split('\n').enumerate() {
if row == target.row as usize {
let bytes = line.as_bytes();
let mut column = target.column.min(bytes.len() as u32) as usize;
if column == 0
|| column == bytes.len()
|| (bytes[column - 1] < 128 && bytes[column] < 128)
{
return Point::new(row as u32, column as u32);
}
let mut grapheme_cursor = GraphemeCursor::new(column, bytes.len(), true);
loop {
if line.is_char_boundary(column)
&& grapheme_cursor.is_boundary(line, 0).unwrap_or(false)
{
break;
}
match bias {
Bias::Left => column -= 1,
Bias::Right => column += 1,
}
grapheme_cursor.set_cursor(column);
}
return Point::new(row as u32, column as u32);
}
}
unreachable!()
}
fn clip_point_utf16(&self, target: Unclipped<PointUtf16>, bias: Bias) -> PointUtf16 {
for (row, line) in self.0.split('\n').enumerate() {
if row == target.0.row as usize {
let mut code_units = line.encode_utf16();
let mut column = code_units.by_ref().take(target.0.column as usize).count();
if char::decode_utf16(code_units).next().transpose().is_err() {
match bias {
Bias::Left => column -= 1,
Bias::Right => column += 1,
}
}
return PointUtf16::new(row as u32, column as u32);
}
}
unreachable!()
}
fn clip_offset_utf16(&self, target: OffsetUtf16, bias: Bias) -> OffsetUtf16 {
let mut code_units = self.0.encode_utf16();
let mut offset = code_units.by_ref().take(target.0).count();
if char::decode_utf16(code_units).next().transpose().is_err() {
match bias {
Bias::Left => offset -= 1,
Bias::Right => offset += 1,
}
}
OffsetUtf16(offset)
}
}
impl sum_tree::Item for Chunk {
impl sum_tree::Item for Chunk<{ chunk::MAX_BASE }> {
type Summary = ChunkSummary;
fn summary(&self, _cx: &()) -> Self::Summary {
ChunkSummary::from(self.0.as_str())
ChunkSummary {
text: self.as_slice().text_summary(),
}
}
}
@@ -1169,14 +940,6 @@ pub struct ChunkSummary {
text: TextSummary,
}
impl<'a> From<&'a str> for ChunkSummary {
fn from(text: &'a str) -> Self {
Self {
text: TextSummary::from(text),
}
}
}
impl sum_tree::Summary for ChunkSummary {
type Context = ();
@@ -1323,6 +1086,7 @@ impl std::ops::AddAssign<Self> for TextSummary {
pub trait TextDimension: 'static + for<'a> Dimension<'a, ChunkSummary> {
fn from_text_summary(summary: &TextSummary) -> Self;
fn from_chunk(chunk: ChunkSlice) -> Self;
fn add_assign(&mut self, other: &Self);
}
@@ -1334,6 +1098,10 @@ impl<D1: TextDimension, D2: TextDimension> TextDimension for (D1, D2) {
)
}
fn from_chunk(chunk: ChunkSlice) -> Self {
(D1::from_chunk(chunk), D2::from_chunk(chunk))
}
fn add_assign(&mut self, other: &Self) {
self.0.add_assign(&other.0);
self.1.add_assign(&other.1);
@@ -1355,6 +1123,10 @@ impl TextDimension for TextSummary {
summary.clone()
}
fn from_chunk(chunk: ChunkSlice) -> Self {
chunk.text_summary()
}
fn add_assign(&mut self, other: &Self) {
*self += other;
}
@@ -1375,6 +1147,10 @@ impl TextDimension for usize {
summary.len
}
fn from_chunk(chunk: ChunkSlice) -> Self {
chunk.len()
}
fn add_assign(&mut self, other: &Self) {
*self += other;
}
@@ -1395,6 +1171,10 @@ impl TextDimension for OffsetUtf16 {
summary.len_utf16
}
fn from_chunk(chunk: ChunkSlice) -> Self {
chunk.len_utf16()
}
fn add_assign(&mut self, other: &Self) {
*self += other;
}
@@ -1415,6 +1195,10 @@ impl TextDimension for Point {
summary.lines
}
fn from_chunk(chunk: ChunkSlice) -> Self {
chunk.lines()
}
fn add_assign(&mut self, other: &Self) {
*self += other;
}
@@ -1435,6 +1219,13 @@ impl TextDimension for PointUtf16 {
summary.lines_utf16()
}
fn from_chunk(chunk: ChunkSlice) -> Self {
PointUtf16 {
row: chunk.lines().row,
column: chunk.last_line_len_utf16(),
}
}
fn add_assign(&mut self, other: &Self) {
*self += other;
}
@@ -1919,7 +1710,7 @@ mod tests {
fn text(&self) -> String {
let mut text = String::new();
for chunk in self.chunks.cursor::<()>(&()) {
text.push_str(&chunk.0);
text.push_str(&chunk.text);
}
text
}


@@ -1,4 +1,4 @@
use crate::{ChunkSummary, TextDimension, TextSummary};
use crate::{chunk::ChunkSlice, ChunkSummary, TextDimension, TextSummary};
use std::ops::{Add, AddAssign, Sub, SubAssign};
#[derive(Debug, Default, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash)]
@@ -27,6 +27,10 @@ impl<T: TextDimension> TextDimension for Unclipped<T> {
Unclipped(T::from_text_summary(summary))
}
fn from_chunk(chunk: ChunkSlice) -> Self {
Unclipped(T::from_chunk(chunk))
}
fn add_assign(&mut self, other: &Self) {
TextDimension::add_assign(&mut self.0, &other.0);
}


@@ -71,7 +71,7 @@ pub struct ThemeContent {
}
/// The content of a serialized theme.
#[derive(Debug, Clone, Default, Serialize, Deserialize, JsonSchema)]
#[derive(Debug, Clone, Default, Serialize, Deserialize, JsonSchema, PartialEq)]
#[serde(default)]
pub struct ThemeStyleContent {
#[serde(default, rename = "background.appearance")]
@@ -133,7 +133,7 @@ impl ThemeStyleContent {
}
}
#[derive(Debug, Clone, Default, Serialize, Deserialize, JsonSchema)]
#[derive(Debug, Clone, Default, Serialize, Deserialize, JsonSchema, PartialEq)]
#[serde(default)]
pub struct ThemeColorsContent {
/// Border color. Used for most borders, is usually a high contrast color.
@@ -952,7 +952,7 @@ impl ThemeColorsContent {
}
}
#[derive(Debug, Clone, Default, Serialize, Deserialize, JsonSchema)]
#[derive(Debug, Clone, Default, Serialize, Deserialize, JsonSchema, PartialEq)]
#[serde(default)]
pub struct StatusColorsContent {
/// Indicates some kind of conflict, like a file changed on disk while it was open, or
@@ -1273,17 +1273,17 @@ impl StatusColorsContent {
}
}
#[derive(Debug, Clone, Serialize, Deserialize, JsonSchema)]
#[derive(Debug, Clone, Serialize, Deserialize, JsonSchema, PartialEq)]
pub struct AccentContent(pub Option<String>);
#[derive(Debug, Clone, Serialize, Deserialize, JsonSchema)]
#[derive(Debug, Clone, Serialize, Deserialize, JsonSchema, PartialEq)]
pub struct PlayerColorContent {
pub cursor: Option<String>,
pub background: Option<String>,
pub selection: Option<String>,
}
#[derive(Debug, Clone, Copy, Serialize, Deserialize, JsonSchema)]
#[derive(Debug, Clone, Copy, Serialize, Deserialize, JsonSchema, PartialEq)]
#[serde(rename_all = "snake_case")]
pub enum FontStyleContent {
Normal,
@@ -1301,7 +1301,7 @@ impl From<FontStyleContent> for FontStyle {
}
}
#[derive(Debug, Clone, Copy, Serialize_repr, Deserialize_repr)]
#[derive(Debug, Clone, Copy, Serialize_repr, Deserialize_repr, PartialEq)]
#[repr(u16)]
pub enum FontWeightContent {
Thin = 100,
@@ -1359,7 +1359,7 @@ impl From<FontWeightContent> for FontWeight {
}
}
#[derive(Debug, Clone, Default, Serialize, Deserialize, JsonSchema)]
#[derive(Debug, Clone, Default, Serialize, Deserialize, JsonSchema, PartialEq)]
#[serde(default)]
pub struct HighlightStyleContent {
pub color: Option<String>,

Some files were not shown because too many files have changed in this diff.