Compare commits


39 Commits

Author SHA1 Message Date
Nathan Sobo
db9f079598 Move MessageStream to its own module
Split the MessageStream type into a separate module to keep the proto
module focused on protobuf message definitions. The new message_stream module
contains the MessageStream type and its implementation for reading and writing
protobuf messages over a WebSocket connection.

Refactored imports and updated references to MessageStream in the peer and
proto modules to use the new path.
2024-06-09 21:06:57 -06:00
Max Brunsfeld
48581167b7 Remove dependencies from the Worktree crate and make it more focused (#12747)
The `worktree` crate mainly provides an in-memory model of a directory
and its git repositories. But because it was originally extracted from
the Project crate, it also contained lingering bits of code that were
outside of that area:
* it had a little bit of logic related to buffers (though most buffer
management lives in `project`)
* it had a *little* bit of logic for storing diagnostics (though the
vast majority of LSP and diagnostic logic lives in `project`)
* it had a little bit of logic for sending RPC messages (though the
*receiving* logic for those RPC messages lived in `project`)

In this PR, I've moved those concerns entirely to the project crate
(where they were already dealt with for the most part), so that the
worktree crate can be more focused on its main job, and have fewer
dependencies.

Worktree no longer depends on `client` or `lsp`. It still depends on
`language`, but only because of `impl language::File for
worktree::File`.

Release Notes:

- N/A
2024-06-06 11:16:58 -07:00
Thorsten Ball
00dfd217d8 astro: Bump version to 0.0.3 (#12744)
(Forgot in the other PR)

Release Notes:

- N/A
2024-06-06 19:17:51 +02:00
Thorsten Ball
22490f7968 ruby: Bump version to 0.0.7 (#12743)
(Forgot to bump in the other PR)
Release Notes:

- N/A
2024-06-06 19:07:41 +02:00
Thorsten Ball
880940856d ruby: Allow opt-in to Tailwind LS in string (#12742)
This fixes #12728, as far as I can tell.

The problem was that inside ERB files, when inside Ruby code, we didn't
treat `-` as part of the word, which broke completions.

So, with the change in here, and the following Zed settings, it works.

```json
{
  "languages": {
    "Ruby": {
      "language_servers": ["tailwindcss-language-server", "solargraph"]
    }
  },
  "lsp": {
    "tailwindcss-language-server": {
      "settings": {
        "includeLanguages": {
          "erb": "html",
          "ruby": "html"
        },
        "experimental": {
          "classRegex": ["\\bclass:\\s*['\"]([^'\"]*)['\"]"]
        }
      }
    }
  }
}
```

This enables `tailwindcss-language-server` for Ruby files and tells the
language server to look for classes inside `class: ""` strings.

See demo video.

Release Notes:

- Fixed `tailwindcss-language-server` not being activated inside Ruby
strings (inside `.erb`)
([#12728](https://github.com/zed-industries/zed/issues/12728)).

Demo video:


https://github.com/zed-industries/zed/assets/1185253/643343b4-d64f-4c4e-98a1-d10df0b24e31

Co-authored-by: Max Brunsfeld <max@zed.dev>
2024-06-06 19:02:25 +02:00
Thorsten Ball
c354793871 astro: Fix Tailwind LS not working in attributes (#12741)
This fixes #12402.

We already had the `tailwind-language-server` config in Astro's
`config.toml` here:
fd39f20842/extensions/astro/languages/astro/config.toml (L17-L23)

But it's not enough to add `overrides.string` to the `config.toml`; you
also need an `overrides.scm` file that sets the overrides.

And, tricky bit, when you add a single override to the `overrides.scm`
file you have to add all of them that Zed knows about. In my case, I had
to add `@comment` too, because Zed somehow expects that.
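For reference, a minimal `overrides.scm` along those lines might look like the sketch below. The capture names (`@string`, `@comment`) are what Zed keys the overrides on; the node names are illustrative and depend on the Astro tree-sitter grammar:

```scheme
; Mark quoted attribute values as strings so language servers such as
; tailwindcss-language-server are allowed to run inside them.
(attribute
  (quoted_attribute_value) @string)

; Once any override is defined, Zed also expects @comment to be present.
(comment) @comment
```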

Release Notes:

- Fixed `tailwind-language-server` not working in attributes inside of
`*.astro` files.
([#12402](https://github.com/zed-industries/zed/issues/12402)).

Demo/proof:


https://github.com/zed-industries/zed/assets/1185253/05677a2d-831d-4e05-a1a2-4d1730ce2a46
2024-06-06 18:57:10 +02:00
Paul Eguisier
2f057785f7 Maintain cursor to upper line in visual mode indent/outdent (#12582)
Release Notes:

- vim: Fix indent via `<` and `>` not being repeatable with `.`.
[#12351](https://github.com/zed-industries/zed/issues/12351)
2024-06-06 17:45:25 +02:00
Panghu
fd39f20842 Prevent folder expansion when all items are closed (#12729)
Release Notes:

- Prevent folder expansion when all items are closed

### Problem
When all items are closed, the next activated file expands (see the
video below).


https://github.com/zed-industries/zed/assets/21101490/a7631cd2-4e97-4954-8b01-d283dd4796be


### Cause
When the currently active item is closed, Zed tries to activate the
previously active item. Activating an item by default expands the
corresponding folder, which can result in folders being expanded when
all files are closed.

### Fixed Video


https://github.com/zed-industries/zed/assets/21101490/d30f05c5-6d86-4e11-b349-337fa75586f3
2024-06-06 17:32:58 +02:00
Thorsten Ball
0c7e745be8 docs: Fix Vim documentation for bindings (#12735)
Release Notes:

- N/A
2024-06-06 15:55:07 +02:00
Nycheporuk Zakhar
3000f6ea22 Use cwd to run package.json script (#12700)
Fixes the case where `package.json` is not in the root directory, which
is common in monorepos where multiple `package.json` files may be present.

Release Notes:

- Fixed runnable for package.json in monorepos
2024-06-06 15:17:45 +03:00
Piotr Osiewicz
377e24b798 chore: Fix clippy for upcoming 1.79 Rust release (#12727)
1.79 is due for release in a week.
Release Notes:

- N/A
2024-06-06 12:46:53 +02:00
Antonio Scandurra
a0c0f1ebcd Rename conversations to contexts (#12724)
This just changes nomenclature within the codebase and should have no
external effect.

Release Notes:

- N/A
2024-06-06 11:40:54 +02:00
Antonio Scandurra
70ce06cb95 Improve UX for saved contexts (#12721)
Release Notes:

- Added search for saved contexts.
- Fixed a bug that caused titles generated by the LLM to be longer than
one line.
2024-06-06 10:22:39 +02:00
Thorsten Ball
9a5b97db00 linux/x11: Don't surround selection when XMI composing (#12632)
On X11 I was unable to type ä ü and other umlauts in files with
autoclose enabled, because typing ä requires me to hit

- compose key
- `"`
- `a`

When the `"` was typed, Zed would insert a matching `"` because it had a
selection around the dead-key that was inserted by the compose key.

We ran into a similar issue in #7611, but in the case of the Brazilian
keyboard, the `"` is the compose key so we didn't trigger the matching
`"`, because we didn't have a selection yet.

This fixes the issue by making the
surround-selection-with-quotes-or-brackets behavior also depend on the
autoclose settings, which it didn't do before. This is a breaking change for users
of a Brazilian keyboard layout in which `"` cannot be used to surround
an existing selection with quotes anymore.

That _might_ be a change that users notice, but I can't think of a
scenario where the user wants, say, `"` to NOT be autoclosed but still
work with selections. (An example is Markdown, for which autoclose for
`"` is disabled. Do we want that but allow surrounding with quotes?)

So it fixes the issue and makes the behavior slightly more consistent,
in my eyes.
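For users who do want quote-surround back in such a language, the implication is to re-enable autoclose there. A settings sketch, assuming Zed's per-language `use_autoclose` toggle is the relevant knob:

```json
{
  "languages": {
    "Markdown": {
      // With this change, re-enabling autoclose also restores
      // surrounding selections with quotes/brackets.
      "use_autoclose": true
    }
  }
}
```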

Release Notes:

- Changed the behavior of surrounding selections with brackets/quotes to
also depend on the auto-close settings of the language. This is a
breaking change for users of a Brazilian keyboard layout in which `"`
cannot be used to surround an existing selection with quotes anymore.

Before:

[Screencast from 2024-06-04
11-49-51.webm](https://github.com/zed-industries/zed/assets/1185253/6bf255b5-32e9-4ba7-8b46-1e49ace2ba7c)

After:

[Screencast from 2024-06-04
11-52-19.webm](https://github.com/zed-industries/zed/assets/1185253/3cd196fc-20ba-465f-bb54-e257f7f6d9f3)
2024-06-06 10:01:38 +02:00
Dhairya Nadapara
0b75afd322 chore: added inl to cpp config (#12710)
Screenshot:
<img width="1027" alt="image"
src="https://github.com/zed-industries/zed/assets/19250981/1d35d35c-d31c-4feb-b2ca-a417972fadf6">

Release Notes:

- Added `inl` to cpp config ([12605](https://github.com/zed-industries/zed/issues/12605))
2024-06-06 10:23:36 +03:00
Brian Schwind
4fd698a093 Fix key-bindings doc typo (#12718)
`base_keymap` is a property of `settings.json`, not `keymap.json`. If
you run "toggle base keymap selector" and select a particular editor,
you will notice that it places the `base_keymap` property in
`settings.json`.
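In other words, the binding preset lives in `settings.json`, e.g. (the value shown is one of the preset names offered by the selector):

```json
{
  "base_keymap": "VSCode"
}
```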

Release Notes:

- N/A
2024-06-06 09:41:28 +03:00
Chung Wei Leong
b50846205c Fixed default LSP settings for JavaScript, TypeScript & TSX (#12716)
Fixed the default LSP settings for `JavaScript`, `TypeScript` & `TSX`,
correcting the "rest" value from `".."` to `"..."`.
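With the corrected placeholder, a per-language override that puts the listed servers ahead of the remaining defaults looks like this (mirroring the default settings in this diff):

```json
{
  "languages": {
    "TypeScript": {
      "language_servers": ["typescript-language-server", "!vtsls", "..."]
    }
  }
}
```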

Release Notes:
- N/A
2024-06-06 09:40:20 +03:00
Kirill Bulatov
a574036efd Update the whitespace docs in the default settings file (#12717) 2024-06-06 08:29:01 +03:00
Conrad Irwin
89641acf2f Fix ordering of keyboard shortcuts so that you can use AI on linux (#12714)
Release Notes:

- N/A
2024-06-05 21:58:37 -06:00
Nate Butler
611bf2d905 Update prompt library styles (#12689)
- Extend Picker to allow passing a custom editor. This allows creating a
custom styled input.
- Updates various picker styles

Before:

![CleanShot 2024-06-05 at 22 08
36@2x](https://github.com/zed-industries/zed/assets/1714999/96bc62c6-839d-405b-b030-31491aab8710)

After:

![CleanShot 2024-06-05 at 22 09
15@2x](https://github.com/zed-industries/zed/assets/1714999/a4938885-e825-4880-955e-f3f47c81e1e3)

Release Notes:

- N/A
2024-06-05 22:10:02 -04:00
Andrew Lygin
f476a8bc2a editor: Add ToggleTabBar action (#12499)
This PR adds the `editor: toggle tab bar` action that hides / shows the
tab bar and updates the `tab_bar.show` setting in `settings.json`
accordingly.
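Running the action toggles the equivalent of this `settings.json` entry:

```json
{
  "tab_bar": {
    "show": false
  }
}
```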

First mentioned in
https://github.com/zed-industries/zed/pull/7356#issuecomment-2118445379.

Release Notes:

- Added the `editor: toggle tab bar` action.
2024-06-05 19:50:57 -06:00
Mikayla Maki
d3d0d01571 Adjust IME action buffering to only apply to insert actions (#12702)
Follow up to https://github.com/zed-industries/zed/pull/12678
fixes https://github.com/zed-industries/zed/issues/11829

In this solution, we only buffer Insert Text actions from the macOS IME.
The marked text and unmark actions are eagerly processed, so that the
IME state is synchronized with the editor state during multi step
pre-edit composition.

Release Notes:

- Fixed an issue where the IME pre-edit could desynchronize from the
editor on macOS
([#11829](https://github.com/zed-industries/zed/pull/12651)).

Co-authored-by: Conrad <conrad@zed.dev>
2024-06-05 16:13:03 -07:00
Marshall Bowers
29d29f5a90 assistant: Initialize the UI font in the prompt library window (#12701)
This PR fixes an issue where the prompt library did not properly have
the UI font or rem size set.

Since it is being opened in a new window, we need to re-initialize these
values the same way we do in the main window.

Release Notes:

- N/A
2024-06-05 17:41:03 -04:00
Bennet Bo Fenner
9824e40878 lsp: Handle responses in background thread (#12640)
Release Notes:

- Improved performance when handling large responses from language
servers

---------

Co-authored-by: Piotr <piotr@zed.dev>
2024-06-05 23:06:44 +02:00
Conrad Irwin
1ad8d6ab1c Don't show backtraces in prompts (#12699)
Release Notes:

- N/A
2024-06-05 15:00:23 -06:00
CharlesChen0823
8745719687 vim: Fix g _ not having the expected behavior (#12607)
Release Notes:

- N/A

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2024-06-05 15:00:13 -06:00
kshokhin
c7c19609b3 Search in selections (#10831)
Release Notes:

- Added search in selections ([#8617](https://github.com/zed-industries/zed/issues/8617))

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2024-06-05 13:42:51 -06:00
Conrad Irwin
428c143fbb Add ability to scroll popovers with vim (#12650)
Co-Authored-By: ahmadraheel@gmail.com



Release Notes:

- vim: allow scrolling the currently open information overlay using
`ctrl-{u,d,e,y}`, etc. (#11883)
2024-06-05 13:39:17 -06:00
Marshall Bowers
f3460d440c html_to_markdown: Move TableHandler out of rustdoc (#12697)
This PR moves the `TableHandler` out of the `rustdoc` module, as it
doesn't contain anything specific to rustdoc.

Release Notes:

- N/A
2024-06-05 15:37:02 -04:00
Joseph T Lyons
071270fe88 Remove meta label
This label has been deleted. Now, the only label used for ignoring top-ranking issues is `ignore top-ranking issues`.
2024-06-05 13:29:34 -04:00
Joseph T Lyons
a59dd7d06d v0.140.x dev 2024-06-05 12:23:48 -04:00
Conrad Irwin
868284876d Bump alacritty to fix some file descriptor yuck (#12687)
https://github.com/alacritty/alacritty/pull/7996

Release Notes:

- Fixed a crash caused by bad file descriptor lifetime handling.
2024-06-05 09:12:05 -06:00
Antonio Scandurra
6bbe9a2253 Polish prompt library some more (#12686)
Release Notes:

- N/A
2024-06-05 16:55:37 +02:00
Antonio Scandurra
7a05db6d3d Cancel inline assist editor on blur if it wasn't confirmed (#12684)
Release Notes:

- N/A
2024-06-05 16:31:45 +02:00
Antonio Scandurra
3587e9726b Support wrapping and hard newlines in inline assistant (#12683)
Release Notes:

- Improved UX for the inline assistant. It will now automatically wrap
when the text gets too long, and you can insert newlines using
`shift-enter`.
2024-06-05 16:10:56 +02:00
Antonio Scandurra
a96782cc6b Allow using the inline assistant in prompt library (#12680)
Release Notes:

- N/A
2024-06-05 14:46:33 +02:00
Nicholas Cioli
0289c312c9 editor: Render boundary whitespace (#11954)
![image](https://github.com/zed-industries/zed/assets/1240491/3dd06e45-ae8e-49d5-984d-3d8bdf98d983)

Added support for only rendering whitespace that is on a
boundary, the logic of which is explained below:

- Any tab character
- Whitespace at the start and end of a line
- Whitespace that is directly adjacent to another whitespace
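To opt in, set the new whitespace mode in settings; a sketch, using the value from the release notes (the default settings in this diff spell the comment "boundaries", so check the shipped default settings for the exact string):

```json
{
  "show_whitespaces": "boundary"
}
```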


Release Notes:

- Added `boundary` whitespace rendering option
([#4290](https://github.com/zed-industries/zed/issues/4290)).




---------

Co-authored-by: Nicholas Cioli <nicholascioli@users.noreply.github.com>
2024-06-05 14:02:55 +03:00
Kirill Bulatov
63a8095879 Revert "Fix a bug where the IME pre-edit would desync from Zed (#12651)" (#12678)
This reverts commit 1a0708f28c since after
that, default task-related keybindings (alt-t and alt-shift-t) started
to leave `†` and `ˇ` symbols in the text editors before triggering
actions.


Release Notes:

- N/A
2024-06-05 13:54:06 +03:00
Kirill Bulatov
1768c0d996 Do not occlude terminal pane by terminal element (#12677)
Release Notes:

- Fixed file drag and drop not working for terminal

Co-authored-by: Antonio Scandurra <antonio@zed.dev>
2024-06-05 13:43:18 +03:00
92 changed files with 4965 additions and 3416 deletions

Cargo.lock

@@ -88,9 +88,8 @@ dependencies = [
[[package]]
name = "alacritty_terminal"
version = "0.23.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f6d1ea4484c8676f295307a4892d478c70ac8da1dbd8c7c10830a504b7f1022f"
version = "0.24.1-dev"
source = "git+https://github.com/alacritty/alacritty?rev=cacdb5bb3b72bad2c729227537979d95af75978f#cacdb5bb3b72bad2c729227537979d95af75978f"
dependencies = [
"base64 0.22.0",
"bitflags 2.4.2",
@@ -107,7 +106,7 @@ dependencies = [
"signal-hook",
"unicode-width",
"vte",
"windows-sys 0.48.0",
"windows-sys 0.52.0",
]
[[package]]
@@ -12888,7 +12887,6 @@ name = "worktree"
version = "0.1.0"
dependencies = [
"anyhow",
"client",
"clock",
"collections",
"env_logger",
@@ -12903,7 +12901,6 @@ dependencies = [
"itertools 0.11.0",
"language",
"log",
"lsp",
"parking_lot",
"postage",
"pretty_assertions",
@@ -13152,7 +13149,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.139.0"
version = "0.140.0"
dependencies = [
"activity_indicator",
"anyhow",


@@ -74,6 +74,7 @@
"ib": "storage",
"ico": "image",
"ini": "settings",
"inl": "cpp",
"j2k": "image",
"java": "java",
"jfif": "image",


@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-text-quote"><path d="M17 6H3"/><path d="M21 12H8"/><path d="M21 18H8"/><path d="M3 12v6"/></svg>


assets/icons/sparkle.svg

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-sparkle"><path d="M9.937 15.5A2 2 0 0 0 8.5 14.063l-6.135-1.582a.5.5 0 0 1 0-.962L8.5 9.936A2 2 0 0 0 9.937 8.5l1.582-6.135a.5.5 0 0 1 .963 0L14.063 8.5A2 2 0 0 0 15.5 9.937l6.135 1.581a.5.5 0 0 1 0 .964L15.5 14.063a2 2 0 0 0-1.437 1.437l-1.582 6.135a.5.5 0 0 1-.963 0z"/></svg>



@@ -0,0 +1,3 @@
<svg width="24" height="24" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M9.937 15.5C9.84772 15.1539 9.66734 14.8381 9.41462 14.5854C9.1619 14.3327 8.84607 14.1523 8.5 14.063L2.365 12.481C2.26033 12.4513 2.16821 12.3883 2.10261 12.3014C2.03702 12.2146 2.00153 12.1088 2.00153 12C2.00153 11.8912 2.03702 11.7854 2.10261 11.6986C2.16821 11.6118 2.26033 11.5487 2.365 11.519L8.5 9.93601C8.84595 9.84681 9.16169 9.66658 9.4144 9.41404C9.66711 9.16151 9.84757 8.84589 9.937 8.50001L11.519 2.36501C11.5484 2.25992 11.6114 2.16735 11.6983 2.1014C11.7853 2.03545 11.8914 1.99976 12.0005 1.99976C12.1096 1.99976 12.2157 2.03545 12.3027 2.1014C12.3896 2.16735 12.4526 2.25992 12.482 2.36501L14.063 8.50001C14.1523 8.84608 14.3327 9.1619 14.5854 9.41462C14.8381 9.66734 15.1539 9.84773 15.5 9.93701L21.635 11.518C21.7405 11.5471 21.8335 11.61 21.8998 11.6971C21.9661 11.7841 22.0021 11.8906 22.0021 12C22.0021 12.1094 21.9661 12.2159 21.8998 12.3029C21.8335 12.39 21.7405 12.4529 21.635 12.482L15.5 14.063C15.1539 14.1523 14.8381 14.3327 14.5854 14.5854C14.3327 14.8381 14.1523 15.1539 14.063 15.5L12.481 21.635C12.4516 21.7401 12.3886 21.8327 12.3017 21.8986C12.2147 21.9646 12.1086 22.0003 11.9995 22.0003C11.8904 22.0003 11.7843 21.9646 11.6973 21.8986C11.6104 21.8327 11.5474 21.7401 11.518 21.635L9.937 15.5Z" fill="black" stroke="black" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"/>
</svg>



@@ -204,18 +204,6 @@
"alt-m": "assistant::ToggleModelSelector"
}
},
{
"context": "ConversationEditor > Editor",
"bindings": {
"ctrl-enter": "assistant::Assist",
"ctrl-s": "workspace::Save",
"ctrl->": "assistant::QuoteSelection",
"shift-enter": "assistant::Split",
"ctrl-r": "assistant::CycleMessageRole",
"enter": "assistant::ConfirmCommand",
"alt-enter": "editor::Newline"
}
},
{
"context": "PromptLibrary",
"bindings": {
@@ -232,7 +220,8 @@
"shift-enter": "search::SelectPrevMatch",
"alt-enter": "search::SelectAllMatches",
"ctrl-f": "search::FocusSearch",
"ctrl-h": "search::ToggleReplace"
"ctrl-h": "search::ToggleReplace",
"ctrl-l": "search::ToggleSelection"
}
},
{
@@ -296,6 +285,7 @@
"ctrl-alt-g": "search::SelectNextMatch",
"ctrl-alt-shift-g": "search::SelectPrevMatch",
"ctrl-alt-shift-h": "search::ToggleReplace",
"ctrl-alt-shift-l": "search::ToggleSelection",
"alt-enter": "search::SelectAllMatches",
"alt-c": "search::ToggleCaseSensitive",
"alt-w": "search::ToggleWholeWord",
@@ -554,6 +544,18 @@
"ctrl-enter": "assistant::InlineAssist"
}
},
{
"context": "ContextEditor > Editor",
"bindings": {
"ctrl-enter": "assistant::Assist",
"ctrl-s": "workspace::Save",
"ctrl->": "assistant::QuoteSelection",
"shift-enter": "assistant::Split",
"ctrl-r": "assistant::CycleMessageRole",
"enter": "assistant::ConfirmCommand",
"alt-enter": "editor::Newline"
}
},
{
"context": "ProjectSearchBar && !in_replace",
"bindings": {


@@ -176,6 +176,12 @@
"replace_enabled": true
}
],
"cmd-alt-l": [
"buffer_search::Deploy",
{
"selection_search_enabled": true
}
],
"cmd-e": [
"buffer_search::Deploy",
{
@@ -222,7 +228,7 @@
}
},
{
"context": "ConversationEditor > Editor",
"context": "ContextEditor > Editor",
"bindings": {
"cmd-enter": "assistant::Assist",
"cmd-s": "workspace::Save",
@@ -250,7 +256,8 @@
"shift-enter": "search::SelectPrevMatch",
"alt-enter": "search::SelectAllMatches",
"cmd-f": "search::FocusSearch",
"cmd-alt-f": "search::ToggleReplace"
"cmd-alt-f": "search::ToggleReplace",
"cmd-alt-l": "search::ToggleSelection"
}
},
{
@@ -316,6 +323,7 @@
"cmd-g": "search::SelectNextMatch",
"cmd-shift-g": "search::SelectPrevMatch",
"cmd-shift-h": "search::ToggleReplace",
"cmd-alt-l": "search::ToggleSelection",
"alt-enter": "search::SelectAllMatches",
"alt-cmd-c": "search::ToggleCaseSensitive",
"alt-cmd-w": "search::ToggleWholeWord",


@@ -131,14 +131,7 @@
// The default number of lines to expand excerpts in the multibuffer by.
"expand_excerpt_lines": 3,
// Globs to match against file paths to determine if a file is private.
"private_files": [
"**/.env*",
"**/*.pem",
"**/*.key",
"**/*.cert",
"**/*.crt",
"**/secrets.yml"
],
"private_files": ["**/.env*", "**/*.pem", "**/*.key", "**/*.cert", "**/*.crt", "**/secrets.yml"],
// Whether to use additional LSP queries to format (and amend) the code after
// every "trigger" symbol input, defined by LSP server capabilities.
"use_on_type_format": true,
@@ -164,6 +157,12 @@
// "none"
// 3. Draw all invisible symbols:
// "all"
// 4. Draw whitespaces at boundaries only:
// "boundaries"
// For a whitespace to be on a boundary, any of the following conditions need to be met:
// - It is a tab
// - It is adjacent to an edge (start or end)
// - It is adjacent to a whitespace (left or right)
"show_whitespaces": "selection",
// Settings related to calls in Zed
"calls": {
@@ -698,7 +697,7 @@
}
},
"JavaScript": {
"language_servers": ["typescript-language-server", "!vtsls", ".."],
"language_servers": ["typescript-language-server", "!vtsls", "..."],
"prettier": {
"allowed": true
}
@@ -741,7 +740,7 @@
}
},
"TSX": {
"language_servers": ["typescript-language-server", "!vtsls", ".."],
"language_servers": ["typescript-language-server", "!vtsls", "..."],
"prettier": {
"allowed": true
}
@@ -752,7 +751,7 @@
}
},
"TypeScript": {
"language_servers": ["typescript-language-server", "!vtsls", ".."],
"language_servers": ["typescript-language-server", "!vtsls", "..."],
"prettier": {
"allowed": true
}


@@ -1,11 +1,11 @@
pub mod assistant_panel;
pub mod assistant_settings;
mod codegen;
mod completion_provider;
mod context_store;
mod inline_assistant;
mod model_selector;
mod prompt_library;
mod prompts;
mod saved_conversation;
mod search;
mod slash_command;
mod streaming_diff;
@@ -17,9 +17,10 @@ use assistant_slash_command::SlashCommandRegistry;
use client::{proto, Client};
use command_palette_hooks::CommandPaletteFilter;
pub(crate) use completion_provider::*;
pub(crate) use context_store::*;
use gpui::{actions, AppContext, Global, SharedString, UpdateGlobal};
pub(crate) use inline_assistant::*;
pub(crate) use model_selector::*;
pub(crate) use saved_conversation::*;
use semantic_index::{CloudEmbeddingProvider, SemanticIndex};
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsStore};
@@ -31,6 +32,7 @@ use std::{
fmt::{self, Display},
sync::Arc,
};
pub(crate) use streaming_diff::*;
use util::paths::EMBEDDINGS_DIR;
actions!(
@@ -273,10 +275,11 @@ pub fn init(client: Arc<Client>, cx: &mut AppContext) {
.detach();
prompt_library::init(cx);
completion_provider::init(client, cx);
completion_provider::init(client.clone(), cx);
assistant_slash_command::init(cx);
register_slash_commands(cx);
assistant_panel::init(cx);
inline_assistant::init(client.telemetry().clone(), cx);
CommandPaletteFilter::update_global(cx, |filter, _cx| {
filter.hide_namespace(Assistant::NAMESPACE);

File diff suppressed because it is too large.


@@ -1,704 +0,0 @@
use crate::{
streaming_diff::{Hunk, StreamingDiff},
CompletionProvider, LanguageModelRequest,
};
use anyhow::Result;
use client::telemetry::Telemetry;
use editor::{Anchor, MultiBuffer, MultiBufferSnapshot, ToOffset, ToPoint};
use futures::{channel::mpsc, SinkExt, Stream, StreamExt};
use gpui::{EventEmitter, Model, ModelContext, Task};
use language::{Rope, TransactionId};
use multi_buffer::MultiBufferRow;
use std::{cmp, future, ops::Range, sync::Arc, time::Instant};
#[derive(Debug)]
pub enum Event {
Finished,
Undone,
}
#[derive(Clone)]
pub enum CodegenKind {
Transform { range: Range<Anchor> },
Generate { position: Anchor },
}
pub struct Codegen {
buffer: Model<MultiBuffer>,
snapshot: MultiBufferSnapshot,
kind: CodegenKind,
last_equal_ranges: Vec<Range<Anchor>>,
transaction_id: Option<TransactionId>,
error: Option<anyhow::Error>,
generation: Task<()>,
idle: bool,
telemetry: Option<Arc<Telemetry>>,
_subscription: gpui::Subscription,
}
impl EventEmitter<Event> for Codegen {}
impl Codegen {
pub fn new(
buffer: Model<MultiBuffer>,
kind: CodegenKind,
telemetry: Option<Arc<Telemetry>>,
cx: &mut ModelContext<Self>,
) -> Self {
let snapshot = buffer.read(cx).snapshot(cx);
Self {
buffer: buffer.clone(),
snapshot,
kind,
last_equal_ranges: Default::default(),
transaction_id: Default::default(),
error: Default::default(),
idle: true,
generation: Task::ready(()),
telemetry,
_subscription: cx.subscribe(&buffer, Self::handle_buffer_event),
}
}
fn handle_buffer_event(
&mut self,
_buffer: Model<MultiBuffer>,
event: &multi_buffer::Event,
cx: &mut ModelContext<Self>,
) {
if let multi_buffer::Event::TransactionUndone { transaction_id } = event {
if self.transaction_id == Some(*transaction_id) {
self.transaction_id = None;
self.generation = Task::ready(());
cx.emit(Event::Undone);
}
}
}
pub fn range(&self) -> Range<Anchor> {
match &self.kind {
CodegenKind::Transform { range } => range.clone(),
CodegenKind::Generate { position } => position.bias_left(&self.snapshot)..*position,
}
}
pub fn kind(&self) -> &CodegenKind {
&self.kind
}
pub fn last_equal_ranges(&self) -> &[Range<Anchor>] {
&self.last_equal_ranges
}
pub fn idle(&self) -> bool {
self.idle
}
pub fn error(&self) -> Option<&anyhow::Error> {
self.error.as_ref()
}
pub fn start(&mut self, prompt: LanguageModelRequest, cx: &mut ModelContext<Self>) {
let range = self.range();
let snapshot = self.snapshot.clone();
let selected_text = snapshot
.text_for_range(range.start..range.end)
.collect::<Rope>();
let selection_start = range.start.to_point(&snapshot);
let suggested_line_indent = snapshot
.suggested_indents(selection_start.row..selection_start.row + 1, cx)
.into_values()
.next()
.unwrap_or_else(|| snapshot.indent_size_for_line(MultiBufferRow(selection_start.row)));
let model_telemetry_id = prompt.model.telemetry_id();
let response = CompletionProvider::global(cx).complete(prompt);
let telemetry = self.telemetry.clone();
self.generation = cx.spawn(|this, mut cx| {
async move {
let generate = async {
let mut edit_start = range.start.to_offset(&snapshot);
let (mut hunks_tx, mut hunks_rx) = mpsc::channel(1);
let diff: Task<anyhow::Result<()>> =
cx.background_executor().spawn(async move {
let mut response_latency = None;
let request_start = Instant::now();
let diff = async {
let chunks = strip_invalid_spans_from_codeblock(response.await?);
futures::pin_mut!(chunks);
let mut diff = StreamingDiff::new(selected_text.to_string());
let mut new_text = String::new();
let mut base_indent = None;
let mut line_indent = None;
let mut first_line = true;
while let Some(chunk) = chunks.next().await {
if response_latency.is_none() {
response_latency = Some(request_start.elapsed());
}
let chunk = chunk?;
let mut lines = chunk.split('\n').peekable();
while let Some(line) = lines.next() {
new_text.push_str(line);
if line_indent.is_none() {
if let Some(non_whitespace_ch_ix) =
new_text.find(|ch: char| !ch.is_whitespace())
{
line_indent = Some(non_whitespace_ch_ix);
base_indent = base_indent.or(line_indent);
let line_indent = line_indent.unwrap();
let base_indent = base_indent.unwrap();
let indent_delta =
line_indent as i32 - base_indent as i32;
let mut corrected_indent_len = cmp::max(
0,
suggested_line_indent.len as i32 + indent_delta,
)
as usize;
if first_line {
corrected_indent_len = corrected_indent_len
.saturating_sub(
selection_start.column as usize,
);
}
let indent_char = suggested_line_indent.char();
let mut indent_buffer = [0; 4];
let indent_str =
indent_char.encode_utf8(&mut indent_buffer);
new_text.replace_range(
..line_indent,
&indent_str.repeat(corrected_indent_len),
);
}
}
if line_indent.is_some() {
hunks_tx.send(diff.push_new(&new_text)).await?;
new_text.clear();
}
if lines.peek().is_some() {
hunks_tx.send(diff.push_new("\n")).await?;
line_indent = None;
first_line = false;
}
}
}
hunks_tx.send(diff.push_new(&new_text)).await?;
hunks_tx.send(diff.finish()).await?;
anyhow::Ok(())
};
let result = diff.await;
let error_message =
result.as_ref().err().map(|error| error.to_string());
if let Some(telemetry) = telemetry {
telemetry.report_assistant_event(
None,
telemetry_events::AssistantKind::Inline,
model_telemetry_id,
response_latency,
error_message,
);
}
result?;
Ok(())
});
while let Some(hunks) = hunks_rx.next().await {
this.update(&mut cx, |this, cx| {
this.last_equal_ranges.clear();
let transaction = this.buffer.update(cx, |buffer, cx| {
// Avoid grouping assistant edits with user edits.
buffer.finalize_last_transaction(cx);
buffer.start_transaction(cx);
buffer.edit(
hunks.into_iter().filter_map(|hunk| match hunk {
Hunk::Insert { text } => {
let edit_start = snapshot.anchor_after(edit_start);
Some((edit_start..edit_start, text))
}
Hunk::Remove { len } => {
let edit_end = edit_start + len;
let edit_range = snapshot.anchor_after(edit_start)
..snapshot.anchor_before(edit_end);
edit_start = edit_end;
Some((edit_range, String::new()))
}
Hunk::Keep { len } => {
let edit_end = edit_start + len;
let edit_range = snapshot.anchor_after(edit_start)
..snapshot.anchor_before(edit_end);
edit_start = edit_end;
this.last_equal_ranges.push(edit_range);
None
}
}),
None,
cx,
);
buffer.end_transaction(cx)
});
if let Some(transaction) = transaction {
if let Some(first_transaction) = this.transaction_id {
// Group all assistant edits into the first transaction.
this.buffer.update(cx, |buffer, cx| {
buffer.merge_transactions(
transaction,
first_transaction,
cx,
)
});
} else {
this.transaction_id = Some(transaction);
this.buffer.update(cx, |buffer, cx| {
buffer.finalize_last_transaction(cx)
});
}
}
cx.notify();
})?;
}
diff.await?;
anyhow::Ok(())
};
let result = generate.await;
this.update(&mut cx, |this, cx| {
this.last_equal_ranges.clear();
this.idle = true;
if let Err(error) = result {
this.error = Some(error);
}
cx.emit(Event::Finished);
cx.notify();
})
.ok();
}
});
self.error.take();
self.idle = false;
cx.notify();
}
pub fn undo(&mut self, cx: &mut ModelContext<Self>) {
if let Some(transaction_id) = self.transaction_id {
self.buffer
.update(cx, |buffer, cx| buffer.undo_transaction(transaction_id, cx));
}
}
}
fn strip_invalid_spans_from_codeblock(
stream: impl Stream<Item = Result<String>>,
) -> impl Stream<Item = Result<String>> {
let mut first_line = true;
let mut buffer = String::new();
let mut starts_with_markdown_codeblock = false;
let mut includes_start_or_end_span = false;
stream.filter_map(move |chunk| {
let chunk = match chunk {
Ok(chunk) => chunk,
Err(err) => return future::ready(Some(Err(err))),
};
buffer.push_str(&chunk);
if buffer.len() > "<|S|".len() && buffer.starts_with("<|S|") {
includes_start_or_end_span = true;
buffer = buffer
.strip_prefix("<|S|>")
.or_else(|| buffer.strip_prefix("<|S|"))
.unwrap_or(&buffer)
.to_string();
} else if buffer.ends_with("|E|>") {
includes_start_or_end_span = true;
} else if buffer.starts_with("<|")
|| buffer.starts_with("<|S")
|| buffer.starts_with("<|S|")
|| buffer.ends_with('|')
|| buffer.ends_with("|E")
|| buffer.ends_with("|E|")
{
return future::ready(None);
}
if first_line {
if buffer.is_empty() || buffer == "`" || buffer == "``" {
return future::ready(None);
} else if buffer.starts_with("```") {
starts_with_markdown_codeblock = true;
if let Some(newline_ix) = buffer.find('\n') {
buffer.replace_range(..newline_ix + 1, "");
first_line = false;
} else {
return future::ready(None);
}
}
}
let mut text = buffer.to_string();
if starts_with_markdown_codeblock {
text = text
.strip_suffix("\n```\n")
.or_else(|| text.strip_suffix("\n```"))
.or_else(|| text.strip_suffix("\n``"))
.or_else(|| text.strip_suffix("\n`"))
.or_else(|| text.strip_suffix('\n'))
.unwrap_or(&text)
.to_string();
}
if includes_start_or_end_span {
text = text
.strip_suffix("|E|>")
.or_else(|| text.strip_suffix("E|>"))
.or_else(|| text.strip_prefix("|>"))
.or_else(|| text.strip_prefix('>'))
.unwrap_or(&text)
.to_string();
};
if text.contains('\n') {
first_line = false;
}
let remainder = buffer.split_off(text.len());
let result = if buffer.is_empty() {
None
} else {
Some(Ok(buffer.clone()))
};
buffer = remainder;
future::ready(result)
})
}
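Before the tests, here is a synchronous sketch of what `strip_invalid_spans_from_codeblock` computes once the whole completion has been buffered: drop a leading Markdown code-fence line, a trailing closing fence, and the `<|S|>`…`|E|>` sentinel spans. The streaming version above does the same work incrementally, chunk by chunk; the function name here is invented for illustration:

```rust
// Synchronous sketch (invented name) of the span stripping above, applied
// to a fully buffered completion: strip a leading fence line, a trailing
// closing fence, and the <|S|>…|E|> sentinel spans.
fn strip_invalid_spans(input: &str) -> String {
    let mut text = input.to_string();
    if text.starts_with("```") {
        // Drop the opening fence line, then any closing fence.
        if let Some(newline_ix) = text.find('\n') {
            text.replace_range(..newline_ix + 1, "");
        }
        text = text
            .strip_suffix("\n```\n")
            .or_else(|| text.strip_suffix("\n```"))
            .unwrap_or(&text)
            .to_string();
    }
    text = text
        .strip_prefix("<|S|>")
        .or_else(|| text.strip_prefix("<|S|"))
        .unwrap_or(&text)
        .to_string();
    text = text.strip_suffix("|E|>").unwrap_or(&text).to_string();
    text
}

fn main() {
    assert_eq!(strip_invalid_spans("```\nLorem ipsum\n```"), "Lorem ipsum");
    assert_eq!(strip_invalid_spans("<|S|>Lorem ipsum|E|>"), "Lorem ipsum");
    assert_eq!(strip_invalid_spans("plain text"), "plain text");
}
```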
#[cfg(test)]
mod tests {
use std::sync::Arc;
use crate::FakeCompletionProvider;
use super::*;
use futures::stream::{self};
use gpui::{Context, TestAppContext};
use indoc::indoc;
use language::{
language_settings, tree_sitter_rust, Buffer, Language, LanguageConfig, LanguageMatcher,
Point,
};
use rand::prelude::*;
use serde::Serialize;
use settings::SettingsStore;
#[derive(Serialize)]
pub struct DummyCompletionRequest {
pub name: String,
}
#[gpui::test(iterations = 10)]
async fn test_transform_autoindent(cx: &mut TestAppContext, mut rng: StdRng) {
let provider = FakeCompletionProvider::default();
cx.set_global(cx.update(SettingsStore::test));
cx.set_global(CompletionProvider::Fake(provider.clone()));
cx.update(language_settings::init);
let text = indoc! {"
fn main() {
let x = 0;
for _ in 0..10 {
x += 1;
}
}
"};
let buffer =
cx.new_model(|cx| Buffer::local(text, cx).with_language(Arc::new(rust_lang()), cx));
let buffer = cx.new_model(|cx| MultiBuffer::singleton(buffer, cx));
let range = buffer.read_with(cx, |buffer, cx| {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(1, 0))..snapshot.anchor_after(Point::new(4, 5))
});
let codegen = cx.new_model(|cx| {
Codegen::new(buffer.clone(), CodegenKind::Transform { range }, None, cx)
});
let request = LanguageModelRequest::default();
codegen.update(cx, |codegen, cx| codegen.start(request, cx));
let mut new_text = concat!(
" let mut x = 0;\n",
" while x < 10 {\n",
" x += 1;\n",
" }",
);
while !new_text.is_empty() {
let max_len = cmp::min(new_text.len(), 10);
let len = rng.gen_range(1..=max_len);
let (chunk, suffix) = new_text.split_at(len);
provider.send_completion(chunk.into());
new_text = suffix;
cx.background_executor.run_until_parked();
}
provider.finish_completion();
cx.background_executor.run_until_parked();
assert_eq!(
buffer.read_with(cx, |buffer, cx| buffer.snapshot(cx).text()),
indoc! {"
fn main() {
let mut x = 0;
while x < 10 {
x += 1;
}
}
"}
);
}
#[gpui::test(iterations = 10)]
async fn test_autoindent_when_generating_past_indentation(
cx: &mut TestAppContext,
mut rng: StdRng,
) {
let provider = FakeCompletionProvider::default();
cx.set_global(CompletionProvider::Fake(provider.clone()));
cx.set_global(cx.update(SettingsStore::test));
cx.update(language_settings::init);
let text = indoc! {"
fn main() {
le
}
"};
let buffer =
cx.new_model(|cx| Buffer::local(text, cx).with_language(Arc::new(rust_lang()), cx));
let buffer = cx.new_model(|cx| MultiBuffer::singleton(buffer, cx));
let position = buffer.read_with(cx, |buffer, cx| {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(1, 6))
});
let codegen = cx.new_model(|cx| {
Codegen::new(buffer.clone(), CodegenKind::Generate { position }, None, cx)
});
let request = LanguageModelRequest::default();
codegen.update(cx, |codegen, cx| codegen.start(request, cx));
let mut new_text = concat!(
"t mut x = 0;\n",
"while x < 10 {\n",
" x += 1;\n",
"}", //
);
while !new_text.is_empty() {
let max_len = cmp::min(new_text.len(), 10);
let len = rng.gen_range(1..=max_len);
let (chunk, suffix) = new_text.split_at(len);
provider.send_completion(chunk.into());
new_text = suffix;
cx.background_executor.run_until_parked();
}
provider.finish_completion();
cx.background_executor.run_until_parked();
assert_eq!(
buffer.read_with(cx, |buffer, cx| buffer.snapshot(cx).text()),
indoc! {"
fn main() {
let mut x = 0;
while x < 10 {
x += 1;
}
}
"}
);
}
#[gpui::test(iterations = 10)]
async fn test_autoindent_when_generating_before_indentation(
cx: &mut TestAppContext,
mut rng: StdRng,
) {
let provider = FakeCompletionProvider::default();
cx.set_global(CompletionProvider::Fake(provider.clone()));
cx.set_global(cx.update(SettingsStore::test));
cx.update(language_settings::init);
let text = concat!(
"fn main() {\n",
" \n",
"}\n" //
);
let buffer =
cx.new_model(|cx| Buffer::local(text, cx).with_language(Arc::new(rust_lang()), cx));
let buffer = cx.new_model(|cx| MultiBuffer::singleton(buffer, cx));
let position = buffer.read_with(cx, |buffer, cx| {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(1, 2))
});
let codegen = cx.new_model(|cx| {
Codegen::new(buffer.clone(), CodegenKind::Generate { position }, None, cx)
});
let request = LanguageModelRequest::default();
codegen.update(cx, |codegen, cx| codegen.start(request, cx));
let mut new_text = concat!(
"let mut x = 0;\n",
"while x < 10 {\n",
" x += 1;\n",
"}", //
);
while !new_text.is_empty() {
let max_len = cmp::min(new_text.len(), 10);
let len = rng.gen_range(1..=max_len);
let (chunk, suffix) = new_text.split_at(len);
provider.send_completion(chunk.into());
new_text = suffix;
cx.background_executor.run_until_parked();
}
provider.finish_completion();
cx.background_executor.run_until_parked();
assert_eq!(
buffer.read_with(cx, |buffer, cx| buffer.snapshot(cx).text()),
indoc! {"
fn main() {
let mut x = 0;
while x < 10 {
x += 1;
}
}
"}
);
}
#[gpui::test]
async fn test_strip_invalid_spans_from_codeblock() {
assert_eq!(
strip_invalid_spans_from_codeblock(chunks("Lorem ipsum dolor", 2))
.map(|chunk| chunk.unwrap())
.collect::<String>()
.await,
"Lorem ipsum dolor"
);
assert_eq!(
strip_invalid_spans_from_codeblock(chunks("```\nLorem ipsum dolor", 2))
.map(|chunk| chunk.unwrap())
.collect::<String>()
.await,
"Lorem ipsum dolor"
);
assert_eq!(
strip_invalid_spans_from_codeblock(chunks("```\nLorem ipsum dolor\n```", 2))
.map(|chunk| chunk.unwrap())
.collect::<String>()
.await,
"Lorem ipsum dolor"
);
assert_eq!(
strip_invalid_spans_from_codeblock(chunks("```\nLorem ipsum dolor\n```\n", 2))
.map(|chunk| chunk.unwrap())
.collect::<String>()
.await,
"Lorem ipsum dolor"
);
assert_eq!(
strip_invalid_spans_from_codeblock(chunks(
"```html\n```js\nLorem ipsum dolor\n```\n```",
2
))
.map(|chunk| chunk.unwrap())
.collect::<String>()
.await,
"```js\nLorem ipsum dolor\n```"
);
assert_eq!(
strip_invalid_spans_from_codeblock(chunks("``\nLorem ipsum dolor\n```", 2))
.map(|chunk| chunk.unwrap())
.collect::<String>()
.await,
"``\nLorem ipsum dolor\n```"
);
assert_eq!(
strip_invalid_spans_from_codeblock(chunks("<|S|Lorem ipsum|E|>", 2))
.map(|chunk| chunk.unwrap())
.collect::<String>()
.await,
"Lorem ipsum"
);
assert_eq!(
strip_invalid_spans_from_codeblock(chunks("<|S|>Lorem ipsum", 2))
.map(|chunk| chunk.unwrap())
.collect::<String>()
.await,
"Lorem ipsum"
);
assert_eq!(
strip_invalid_spans_from_codeblock(chunks("```\n<|S|>Lorem ipsum\n```", 2))
.map(|chunk| chunk.unwrap())
.collect::<String>()
.await,
"Lorem ipsum"
);
assert_eq!(
strip_invalid_spans_from_codeblock(chunks("```\n<|S|Lorem ipsum|E|>\n```", 2))
.map(|chunk| chunk.unwrap())
.collect::<String>()
.await,
"Lorem ipsum"
);
fn chunks(text: &str, size: usize) -> impl Stream<Item = Result<String>> {
stream::iter(
text.chars()
.collect::<Vec<_>>()
.chunks(size)
.map(|chunk| Ok(chunk.iter().collect::<String>()))
.collect::<Vec<_>>(),
)
}
}
fn rust_lang() -> Language {
Language::new(
LanguageConfig {
name: "Rust".into(),
matcher: LanguageMatcher {
path_suffixes: vec!["rs".to_string()],
..Default::default()
},
..Default::default()
},
Some(tree_sitter_rust::language()),
)
.with_indents_query(
r#"
(call_expression) @indent
(field_expression) @indent
(_ "(" ")" @end) @indent
(_ "{" "}" @end) @indent
"#,
)
.unwrap()
}
}

View File

@@ -0,0 +1,196 @@
use crate::{assistant_settings::OpenAiModel, MessageId, MessageMetadata};
use anyhow::{anyhow, Result};
use collections::HashMap;
use fs::Fs;
use futures::StreamExt;
use fuzzy::StringMatchCandidate;
use gpui::{AppContext, Model, ModelContext, Task};
use regex::Regex;
use serde::{Deserialize, Serialize};
use std::{cmp::Reverse, ffi::OsStr, path::PathBuf, sync::Arc, time::Duration};
use ui::Context;
use util::{paths::CONTEXTS_DIR, ResultExt, TryFutureExt};
#[derive(Serialize, Deserialize)]
pub struct SavedMessage {
pub id: MessageId,
pub start: usize,
}
#[derive(Serialize, Deserialize)]
pub struct SavedContext {
pub id: Option<String>,
pub zed: String,
pub version: String,
pub text: String,
pub messages: Vec<SavedMessage>,
pub message_metadata: HashMap<MessageId, MessageMetadata>,
pub summary: String,
}
impl SavedContext {
pub const VERSION: &'static str = "0.2.0";
}
#[derive(Serialize, Deserialize)]
struct SavedContextV0_1_0 {
id: Option<String>,
zed: String,
version: String,
text: String,
messages: Vec<SavedMessage>,
message_metadata: HashMap<MessageId, MessageMetadata>,
summary: String,
api_url: Option<String>,
model: OpenAiModel,
}
#[derive(Clone)]
pub struct SavedContextMetadata {
pub title: String,
pub path: PathBuf,
pub mtime: chrono::DateTime<chrono::Local>,
}
pub struct ContextStore {
contexts_metadata: Vec<SavedContextMetadata>,
fs: Arc<dyn Fs>,
_watch_updates: Task<Option<()>>,
}
impl ContextStore {
pub fn new(fs: Arc<dyn Fs>, cx: &mut AppContext) -> Task<Result<Model<Self>>> {
cx.spawn(|mut cx| async move {
const CONTEXT_WATCH_DURATION: Duration = Duration::from_millis(100);
let (mut events, _) = fs.watch(&CONTEXTS_DIR, CONTEXT_WATCH_DURATION).await;
let this = cx.new_model(|cx: &mut ModelContext<Self>| Self {
contexts_metadata: Vec::new(),
fs,
_watch_updates: cx.spawn(|this, mut cx| {
async move {
while events.next().await.is_some() {
this.update(&mut cx, |this, cx| this.reload(cx))?
.await
.log_err();
}
anyhow::Ok(())
}
.log_err()
}),
})?;
this.update(&mut cx, |this, cx| this.reload(cx))?
.await
.log_err();
Ok(this)
})
}
pub fn load(&self, path: PathBuf, cx: &AppContext) -> Task<Result<SavedContext>> {
let fs = self.fs.clone();
cx.background_executor().spawn(async move {
let saved_context = fs.load(&path).await?;
let saved_context_json = serde_json::from_str::<serde_json::Value>(&saved_context)?;
match saved_context_json
.get("version")
.ok_or_else(|| anyhow!("version not found"))?
{
serde_json::Value::String(version) => match version.as_str() {
SavedContext::VERSION => {
Ok(serde_json::from_value::<SavedContext>(saved_context_json)?)
}
"0.1.0" => {
let saved_context =
serde_json::from_value::<SavedContextV0_1_0>(saved_context_json)?;
Ok(SavedContext {
id: saved_context.id,
zed: saved_context.zed,
version: saved_context.version,
text: saved_context.text,
messages: saved_context.messages,
message_metadata: saved_context.message_metadata,
summary: saved_context.summary,
})
}
_ => Err(anyhow!("unrecognized saved context version: {}", version)),
},
_ => Err(anyhow!("version not found on saved context")),
}
})
}
pub fn search(&self, query: String, cx: &AppContext) -> Task<Vec<SavedContextMetadata>> {
let metadata = self.contexts_metadata.clone();
let executor = cx.background_executor().clone();
cx.background_executor().spawn(async move {
if query.is_empty() {
metadata
} else {
let candidates = metadata
.iter()
.enumerate()
.map(|(id, metadata)| StringMatchCandidate::new(id, metadata.title.clone()))
.collect::<Vec<_>>();
let matches = fuzzy::match_strings(
&candidates,
&query,
false,
100,
&Default::default(),
executor,
)
.await;
matches
.into_iter()
.map(|mat| metadata[mat.candidate_id].clone())
.collect()
}
})
}
fn reload(&mut self, cx: &mut ModelContext<Self>) -> Task<Result<()>> {
let fs = self.fs.clone();
cx.spawn(|this, mut cx| async move {
fs.create_dir(&CONTEXTS_DIR).await?;
let mut paths = fs.read_dir(&CONTEXTS_DIR).await?;
let mut contexts = Vec::<SavedContextMetadata>::new();
while let Some(path) = paths.next().await {
let path = path?;
if path.extension() != Some(OsStr::new("json")) {
continue;
}
let pattern = r" - \d+.zed.json$";
let re = Regex::new(pattern).unwrap();
let metadata = fs.metadata(&path).await?;
if let Some((file_name, metadata)) = path
.file_name()
.and_then(|name| name.to_str())
.zip(metadata)
{
// This is used to filter out contexts saved by the new assistant.
if !re.is_match(file_name) {
continue;
}
if let Some(title) = re.replace(file_name, "").lines().next() {
contexts.push(SavedContextMetadata {
title: title.to_string(),
path,
mtime: metadata.mtime.into(),
});
}
}
}
contexts.sort_unstable_by_key(|context| Reverse(context.mtime));
this.update(&mut cx, |this, cx| {
this.contexts_metadata = contexts;
cx.notify();
})
})
}
}
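`reload()` above filters legacy context files with the regex ` - \d+.zed.json$` (note the unescaped dot, which matches any character). The same predicate can be approximated with plain std string operations; this helper name is hypothetical:

```rust
// Hypothetical std-only stand-in for the ` - \d+.zed.json$` regex used in
// reload(): a legacy context file name ends with ".zed.json", preceded by
// " - " and a run of digits.
fn is_legacy_context_file(name: &str) -> bool {
    let Some(stem) = name.strip_suffix(".zed.json") else {
        return false;
    };
    let Some(dash_ix) = stem.rfind(" - ") else {
        return false;
    };
    let digits = &stem[dash_ix + 3..];
    !digits.is_empty() && digits.chars().all(|c| c.is_ascii_digit())
}

fn main() {
    assert!(is_legacy_context_file("My chat - 1234.zed.json"));
    assert!(!is_legacy_context_file("notes.json"));
    assert!(!is_legacy_context_file("chat.zed.json"));
}
```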

File diff suppressed because it is too large

View File

@@ -1,21 +1,22 @@
use crate::{
slash_command::SlashCommandCompletionProvider, CompletionProvider, LanguageModelRequest,
LanguageModelRequestMessage, Role,
slash_command::SlashCommandCompletionProvider, AssistantPanel, CompletionProvider,
InlineAssist, InlineAssistant, LanguageModelRequest, LanguageModelRequestMessage, Role,
};
use anyhow::{anyhow, Result};
use assistant_slash_command::SlashCommandRegistry;
use chrono::{DateTime, Utc};
use collections::HashMap;
use editor::{actions::Tab, Editor, EditorEvent};
use editor::{actions::Tab, CurrentLineHighlight, Editor, EditorEvent};
use futures::{
future::{self, BoxFuture, Shared},
FutureExt,
};
use fuzzy::StringMatchCandidate;
use gpui::{
actions, point, size, AnyElement, AppContext, BackgroundExecutor, Bounds, DevicePixels,
EventEmitter, Global, PromptLevel, ReadGlobal, Subscription, Task, TitlebarOptions, View,
WindowBounds, WindowHandle, WindowOptions,
actions, percentage, point, size, Animation, AnimationExt, AnyElement, AppContext,
BackgroundExecutor, Bounds, DevicePixels, EventEmitter, Global, PromptLevel, ReadGlobal,
Subscription, Task, TitlebarOptions, Transformation, UpdateGlobal, View, WindowBounds,
WindowHandle, WindowOptions,
};
use heed::{types::SerdeBincode, Database, RoTxn};
use language::{language_settings::SoftWrap, Buffer, LanguageRegistry};
@@ -23,18 +24,21 @@ use parking_lot::RwLock;
use picker::{Picker, PickerDelegate};
use rope::Rope;
use serde::{Deserialize, Serialize};
use settings::Settings;
use std::{
future::Future,
path::PathBuf,
sync::{atomic::AtomicBool, Arc},
time::Duration,
};
use theme::ThemeSettings;
use ui::{
div, prelude::*, IconButtonShape, ListHeader, ListItem, ListItemSpacing, ListSubHeader,
ParentElement, Render, SharedString, Styled, TitleBar, Tooltip, ViewContext, VisualContext,
};
use util::{paths::PROMPTS_DIR, ResultExt, TryFutureExt};
use uuid::Uuid;
use workspace::Workspace;
actions!(
prompt_library,
@@ -124,7 +128,7 @@ struct PromptPickerDelegate {
}
enum PromptPickerEvent {
Selected { prompt_id: PromptId },
Selected { prompt_id: Option<PromptId> },
Confirmed { prompt_id: PromptId },
Deleted { prompt_id: PromptId },
ToggledDefault { prompt_id: PromptId },
@@ -163,11 +167,14 @@ impl PickerDelegate for PromptPickerDelegate {
fn set_selected_index(&mut self, ix: usize, cx: &mut ViewContext<Picker<Self>>) {
self.selected_index = ix;
if let Some(PromptPickerEntry::Prompt(prompt)) = self.entries.get(self.selected_index) {
cx.emit(PromptPickerEvent::Selected {
prompt_id: prompt.id,
});
}
let prompt_id = if let Some(PromptPickerEntry::Prompt(prompt)) =
self.entries.get(self.selected_index)
{
Some(prompt.id)
} else {
None
};
cx.emit(PromptPickerEvent::Selected { prompt_id });
}
fn placeholder_text(&self, _cx: &mut WindowContext) -> Arc<str> {
@@ -245,7 +252,11 @@ impl PickerDelegate for PromptPickerDelegate {
let element = match prompt {
PromptPickerEntry::DefaultPromptsHeader => ListHeader::new("Default Prompts")
.inset(true)
.start_slot(Icon::new(IconName::ZedAssistant))
.start_slot(
Icon::new(IconName::Sparkle)
.color(Color::Muted)
.size(IconSize::XSmall),
)
.selected(selected)
.into_any_element(),
PromptPickerEntry::DefaultPromptsEmpty => {
@@ -256,7 +267,11 @@ impl PickerDelegate for PromptPickerDelegate {
}
PromptPickerEntry::AllPromptsHeader => ListHeader::new("All Prompts")
.inset(true)
.start_slot(Icon::new(IconName::Library))
.start_slot(
Icon::new(IconName::Library)
.color(Color::Muted)
.size(IconSize::XSmall),
)
.selected(selected)
.into_any_element(),
PromptPickerEntry::AllPromptsEmpty => ListSubHeader::new("No prompts")
@@ -270,14 +285,15 @@ impl PickerDelegate for PromptPickerDelegate {
.inset(true)
.spacing(ListItemSpacing::Sparse)
.selected(selected)
.child(Label::new(
.child(h_flex().h_5().line_height(relative(1.)).child(Label::new(
prompt.title.clone().unwrap_or("Untitled".into()),
))
)))
.end_hover_slot(
h_flex()
.gap_2()
.child(
IconButton::new("delete-prompt", IconName::Trash)
.icon_color(Color::Muted)
.shape(IconButtonShape::Square)
.tooltip(move |cx| Tooltip::text("Delete Prompt", cx))
.on_click(cx.listener(move |_, _, cx| {
@@ -285,30 +301,24 @@ impl PickerDelegate for PromptPickerDelegate {
})),
)
.child(
IconButton::new(
"toggle-default-prompt",
if default {
IconName::ZedAssistantFilled
} else {
IconName::ZedAssistant
},
)
.shape(IconButtonShape::Square)
.tooltip(move |cx| {
Tooltip::text(
if default {
"Remove from Default Prompt"
} else {
"Add to Default Prompt"
},
cx,
)
})
.on_click(cx.listener(
move |_, _, cx| {
IconButton::new("toggle-default-prompt", IconName::Sparkle)
.selected(default)
.selected_icon(IconName::SparkleFilled)
.icon_color(if default { Color::Accent } else { Color::Muted })
.shape(IconButtonShape::Square)
.tooltip(move |cx| {
Tooltip::text(
if default {
"Remove from Default Prompt"
} else {
"Add to Default Prompt"
},
cx,
)
})
.on_click(cx.listener(move |_, _, cx| {
cx.emit(PromptPickerEvent::ToggledDefault { prompt_id })
},
)),
})),
),
)
.into_any_element()
@@ -316,6 +326,18 @@ impl PickerDelegate for PromptPickerDelegate {
};
Some(element)
}
fn render_editor(&self, editor: &View<Editor>, cx: &mut ViewContext<Picker<Self>>) -> Div {
h_flex()
.bg(cx.theme().colors().editor_background)
.rounded_md()
.overflow_hidden()
.flex_none()
.py_1()
.px_2()
.mx_2()
.child(editor.clone())
}
}
impl PromptLibrary {
@@ -354,7 +376,11 @@ impl PromptLibrary {
) {
match event {
PromptPickerEvent::Selected { prompt_id } => {
self.load_prompt(*prompt_id, false, cx);
if let Some(prompt_id) = *prompt_id {
self.load_prompt(prompt_id, false, cx);
} else {
self.focus_picker(&Default::default(), cx);
}
}
PromptPickerEvent::Confirmed { prompt_id } => {
self.load_prompt(*prompt_id, true, cx);
@@ -498,6 +524,7 @@ impl PromptLibrary {
editor.set_show_gutter(false, cx);
editor.set_show_wrap_guides(false, cx);
editor.set_show_indent_guides(false, cx);
editor.set_current_line_highlight(Some(CurrentLineHighlight::None));
editor.set_completion_provider(Box::new(
SlashCommandCompletionProvider::new(commands, None, None),
));
@@ -603,6 +630,49 @@ impl PromptLibrary {
self.picker.update(cx, |picker, cx| picker.focus(cx));
}
pub fn inline_assist(&mut self, _: &InlineAssist, cx: &mut ViewContext<Self>) {
let Some(active_prompt_id) = self.active_prompt_id else {
cx.propagate();
return;
};
let prompt_editor = &self.prompt_editors[&active_prompt_id].editor;
let provider = CompletionProvider::global(cx);
if provider.is_authenticated() {
InlineAssistant::update_global(cx, |assistant, cx| {
assistant.assist(&prompt_editor, None, false, cx)
})
} else {
for window in cx.windows() {
if let Some(workspace) = window.downcast::<Workspace>() {
let panel = workspace
.update(cx, |workspace, cx| {
cx.activate_window();
workspace.focus_panel::<AssistantPanel>(cx)
})
.ok()
.flatten();
if panel.is_some() {
return;
}
}
}
}
}
fn cancel_last_inline_assist(
&mut self,
_: &editor::actions::Cancel,
cx: &mut ViewContext<Self>,
) {
let canceled = InlineAssistant::update_global(cx, |assistant, cx| {
assistant.cancel_last_inline_assist(cx)
});
if !canceled {
cx.propagate();
}
}
fn handle_prompt_editor_event(
&mut self,
prompt_id: PromptId,
@@ -694,14 +764,13 @@ impl PromptLibrary {
.child(
h_flex()
.p(Spacing::Small.rems(cx))
.border_b_1()
.border_color(cx.theme().colors().border)
.h(TitleBar::height(cx))
.w_full()
.flex_none()
.justify_end()
.child(
IconButton::new("new-prompt", IconName::Plus)
.style(ButtonStyle::Transparent)
.shape(IconButtonShape::Square)
.tooltip(move |cx| Tooltip::for_action("New Prompt", &NewPrompt, cx))
.on_click(|_, cx| {
@@ -723,19 +792,30 @@ impl PromptLibrary {
.flex_none()
.min_w_64()
.children(self.active_prompt_id.and_then(|prompt_id| {
let buffer_font = ThemeSettings::get_global(cx).buffer_font.family.clone();
let prompt_metadata = self.store.metadata(prompt_id)?;
let prompt_editor = &self.prompt_editors[&prompt_id];
let focus_handle = prompt_editor.editor.focus_handle(cx);
let current_model = CompletionProvider::global(cx).model();
let token_count = prompt_editor.token_count.map(|count| count.to_string());
Some(
h_flex()
.id("prompt-editor-inner")
.size_full()
.items_start()
.on_click(cx.listener(move |_, _, cx| {
cx.focus(&focus_handle);
}))
.child(
div()
.on_action(cx.listener(Self::focus_picker))
.on_action(cx.listener(Self::inline_assist))
.on_action(cx.listener(Self::cancel_last_inline_assist))
.flex_grow()
.h_full()
.pt(Spacing::Large.rems(cx))
.pl(Spacing::Large.rems(cx))
.pt(Spacing::XXLarge.rems(cx))
.pl(Spacing::XXLarge.rems(cx))
.child(prompt_editor.editor.clone()),
)
.child(
@@ -743,49 +823,92 @@ impl PromptLibrary {
.w_12()
.py(Spacing::Large.rems(cx))
.justify_start()
.items_center()
.gap_4()
.child(
IconButton::new(
"toggle-default-prompt",
if prompt_metadata.default {
IconName::ZedAssistantFilled
} else {
IconName::ZedAssistant
},
)
.size(ButtonSize::Large)
.shape(IconButtonShape::Square)
.tooltip(move |cx| {
Tooltip::for_action(
if prompt_metadata.default {
"Remove from Default Prompt"
} else {
"Add to Default Prompt"
},
&ToggleDefaultPrompt,
cx,
.items_end()
.gap_1()
.child(h_flex().h_8().font_family(buffer_font).when_some_else(
token_count,
|tokens_ready, token_count| {
tokens_ready.pr_3().justify_end().child(
// This isn't actually a button, it just lets us easily add
// a tooltip to the token count.
Button::new("token_count", token_count.clone())
.style(ButtonStyle::Transparent)
.color(Color::Muted)
.tooltip(move |cx| {
Tooltip::with_meta(
format!("{} tokens", token_count,),
None,
format!(
"Model: {}",
current_model.display_name()
),
cx,
)
}),
)
})
.on_click(|_, cx| {
cx.dispatch_action(Box::new(ToggleDefaultPrompt));
}),
},
|tokens_loading| {
tokens_loading.w_12().justify_center().child(
Icon::new(IconName::ArrowCircle)
.size(IconSize::Small)
.color(Color::Muted)
.with_animation(
"arrow-circle",
Animation::new(Duration::from_secs(4)).repeat(),
|icon, delta| {
icon.transform(Transformation::rotate(
percentage(delta),
))
},
),
)
},
))
.child(
h_flex().justify_center().w_12().h_8().child(
IconButton::new("toggle-default-prompt", IconName::Sparkle)
.style(ButtonStyle::Transparent)
.selected(prompt_metadata.default)
.selected_icon(IconName::SparkleFilled)
.icon_color(if prompt_metadata.default {
Color::Accent
} else {
Color::Muted
})
.shape(IconButtonShape::Square)
.tooltip(move |cx| {
Tooltip::text(
if prompt_metadata.default {
"Remove from Default Prompt"
} else {
"Add to Default Prompt"
},
cx,
)
})
.on_click(|_, cx| {
cx.dispatch_action(Box::new(ToggleDefaultPrompt));
}),
),
)
.child(
IconButton::new("delete-prompt", IconName::Trash)
.shape(IconButtonShape::Square)
.tooltip(move |cx| {
Tooltip::for_action("Delete Prompt", &DeletePrompt, cx)
})
.on_click(|_, cx| {
cx.dispatch_action(Box::new(DeletePrompt));
}),
)
.children(prompt_editor.token_count.map(|token_count| {
h_flex()
.justify_center()
.child(Label::new(token_count.to_string()))
})),
h_flex().justify_center().w_12().h_8().child(
IconButton::new("delete-prompt", IconName::Trash)
.size(ButtonSize::Large)
.style(ButtonStyle::Transparent)
.shape(IconButtonShape::Square)
.tooltip(move |cx| {
Tooltip::for_action(
"Delete Prompt",
&DeletePrompt,
cx,
)
})
.on_click(|_, cx| {
cx.dispatch_action(Box::new(DeletePrompt));
}),
),
),
),
)
}))
@@ -794,6 +917,14 @@ impl PromptLibrary {
impl Render for PromptLibrary {
fn render(&mut self, cx: &mut ViewContext<Self>) -> impl IntoElement {
let (ui_font, ui_font_size) = {
let theme_settings = ThemeSettings::get_global(cx);
(theme_settings.ui_font.clone(), theme_settings.ui_font_size)
};
let theme = cx.theme().clone();
cx.set_rem_size(ui_font_size);
h_flex()
.id("prompt-manager")
.key_context("PromptLibrary")
@@ -804,6 +935,8 @@ impl Render for PromptLibrary {
}))
.size_full()
.overflow_hidden()
.font(ui_font)
.text_color(theme.colors().text)
.child(self.render_prompt_list(cx))
.child(self.render_active_prompt(cx))
}

View File

@@ -1,126 +0,0 @@
use crate::{assistant_settings::OpenAiModel, MessageId, MessageMetadata};
use anyhow::{anyhow, Result};
use collections::HashMap;
use fs::Fs;
use futures::StreamExt;
use regex::Regex;
use serde::{Deserialize, Serialize};
use std::{
cmp::Reverse,
ffi::OsStr,
path::{Path, PathBuf},
sync::Arc,
};
use util::paths::CONVERSATIONS_DIR;
#[derive(Serialize, Deserialize)]
pub struct SavedMessage {
pub id: MessageId,
pub start: usize,
}
#[derive(Serialize, Deserialize)]
pub struct SavedConversation {
pub id: Option<String>,
pub zed: String,
pub version: String,
pub text: String,
pub messages: Vec<SavedMessage>,
pub message_metadata: HashMap<MessageId, MessageMetadata>,
pub summary: String,
}
impl SavedConversation {
pub const VERSION: &'static str = "0.2.0";
pub async fn load(path: &Path, fs: &dyn Fs) -> Result<Self> {
let saved_conversation = fs.load(path).await?;
let saved_conversation_json =
serde_json::from_str::<serde_json::Value>(&saved_conversation)?;
match saved_conversation_json
.get("version")
.ok_or_else(|| anyhow!("version not found"))?
{
serde_json::Value::String(version) => match version.as_str() {
Self::VERSION => Ok(serde_json::from_value::<Self>(saved_conversation_json)?),
"0.1.0" => {
let saved_conversation =
serde_json::from_value::<SavedConversationV0_1_0>(saved_conversation_json)?;
Ok(Self {
id: saved_conversation.id,
zed: saved_conversation.zed,
version: saved_conversation.version,
text: saved_conversation.text,
messages: saved_conversation.messages,
message_metadata: saved_conversation.message_metadata,
summary: saved_conversation.summary,
})
}
_ => Err(anyhow!(
"unrecognized saved conversation version: {}",
version
)),
},
_ => Err(anyhow!("version not found on saved conversation")),
}
}
}
#[derive(Serialize, Deserialize)]
struct SavedConversationV0_1_0 {
id: Option<String>,
zed: String,
version: String,
text: String,
messages: Vec<SavedMessage>,
message_metadata: HashMap<MessageId, MessageMetadata>,
summary: String,
api_url: Option<String>,
model: OpenAiModel,
}
pub struct SavedConversationMetadata {
pub title: String,
pub path: PathBuf,
pub mtime: chrono::DateTime<chrono::Local>,
}
impl SavedConversationMetadata {
pub async fn list(fs: Arc<dyn Fs>) -> Result<Vec<Self>> {
fs.create_dir(&CONVERSATIONS_DIR).await?;
let mut paths = fs.read_dir(&CONVERSATIONS_DIR).await?;
let mut conversations = Vec::<SavedConversationMetadata>::new();
while let Some(path) = paths.next().await {
let path = path?;
if path.extension() != Some(OsStr::new("json")) {
continue;
}
let pattern = r" - \d+.zed.json$";
let re = Regex::new(pattern).unwrap();
let metadata = fs.metadata(&path).await?;
if let Some((file_name, metadata)) = path
.file_name()
.and_then(|name| name.to_str())
.zip(metadata)
{
// This is used to filter out conversations saved by the new assistant.
if !re.is_match(file_name) {
continue;
}
let title = re.replace(file_name, "");
conversations.push(Self {
title: title.into_owned(),
path,
mtime: metadata.mtime.into(),
});
}
}
conversations.sort_unstable_by_key(|conversation| Reverse(conversation.mtime));
Ok(conversations)
}
}
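Both loaders dispatch on the saved `"version"` field and migrate 0.1.0 payloads forward by dropping the inline provider settings (`api_url`, `model`). A minimal sketch of that migration shape, with serde and JSON parsing elided and all type names invented:

```rust
// Invented types sketching the version dispatch in SavedConversation::load:
// 0.1.0 stored provider settings inline; migrating to 0.2.0 drops them.
#[allow(dead_code)]
struct SavedV0_1_0 {
    text: String,
    api_url: Option<String>, // dropped by the migration
}

struct SavedV0_2_0 {
    text: String,
}

enum Versioned {
    V0_1_0(SavedV0_1_0),
    V0_2_0(SavedV0_2_0),
}

fn migrate(saved: Versioned) -> SavedV0_2_0 {
    match saved {
        Versioned::V0_2_0(s) => s,
        Versioned::V0_1_0(s) => SavedV0_2_0 { text: s.text },
    }
}

fn main() {
    let old = Versioned::V0_1_0(SavedV0_1_0 {
        text: "hello".into(),
        api_url: Some("https://example.invalid".into()),
    });
    assert_eq!(migrate(old).text, "hello");
}
```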

View File

@@ -1,4 +1,4 @@
use crate::assistant_panel::ConversationEditor;
use crate::assistant_panel::ContextEditor;
use anyhow::Result;
pub use assistant_slash_command::{SlashCommand, SlashCommandOutput, SlashCommandRegistry};
use editor::{CompletionProvider, Editor};
@@ -29,7 +29,7 @@ pub mod tabs_command;
pub(crate) struct SlashCommandCompletionProvider {
commands: Arc<SlashCommandRegistry>,
cancel_flag: Mutex<Arc<AtomicBool>>,
editor: Option<WeakView<ConversationEditor>>,
editor: Option<WeakView<ContextEditor>>,
workspace: Option<WeakView<Workspace>>,
}
@@ -43,7 +43,7 @@ pub(crate) struct SlashCommandLine {
impl SlashCommandCompletionProvider {
pub fn new(
commands: Arc<SlashCommandRegistry>,
editor: Option<WeakView<ConversationEditor>>,
editor: Option<WeakView<ContextEditor>>,
workspace: Option<WeakView<Workspace>>,
) -> Self {
Self {

View File

@@ -267,7 +267,7 @@ impl Room {
.await
{
Ok(()) => Ok(room),
Err(error) => Err(anyhow!("room creation failed: {:?}", error)),
Err(error) => Err(error.context("room creation failed")),
}
})
}

View File

@@ -83,7 +83,10 @@ async fn test_host_disconnect(
let project_b = client_b.build_dev_server_project(project_id, cx_b).await;
cx_a.background_executor.run_until_parked();
assert!(worktree_a.read_with(cx_a, |tree, _| tree.as_local().unwrap().is_shared()));
assert!(worktree_a.read_with(cx_a, |tree, _| tree
.as_local()
.unwrap()
.has_update_observer()));
let workspace_b = cx_b
.add_window(|cx| Workspace::new(None, project_b.clone(), client_b.app_state.clone(), cx));
@@ -120,7 +123,10 @@ async fn test_host_disconnect(
project_b.read_with(cx_b, |project, _| project.is_read_only());
assert!(worktree_a.read_with(cx_a, |tree, _| !tree.as_local().unwrap().is_shared()));
assert!(worktree_a.read_with(cx_a, |tree, _| !tree
.as_local()
.unwrap()
.has_update_observer()));
// Ensure client B's edited state is reset and that the whole window is blurred.

View File

@@ -1378,7 +1378,10 @@ async fn test_unshare_project(
let project_b = client_b.build_dev_server_project(project_id, cx_b).await;
executor.run_until_parked();
assert!(worktree_a.read_with(cx_a, |tree, _| tree.as_local().unwrap().is_shared()));
assert!(worktree_a.read_with(cx_a, |tree, _| tree
.as_local()
.unwrap()
.has_update_observer()));
project_b
.update(cx_b, |p, cx| p.open_buffer((worktree_id, "a.txt"), cx))
@@ -1403,7 +1406,10 @@ async fn test_unshare_project(
.unwrap();
executor.run_until_parked();
assert!(worktree_a.read_with(cx_a, |tree, _| !tree.as_local().unwrap().is_shared()));
assert!(worktree_a.read_with(cx_a, |tree, _| !tree
.as_local()
.unwrap()
.has_update_observer()));
assert!(project_c.read_with(cx_c, |project, _| project.is_disconnected()));
@@ -1415,7 +1421,10 @@ async fn test_unshare_project(
let project_c2 = client_c.build_dev_server_project(project_id, cx_c).await;
executor.run_until_parked();
assert!(worktree_a.read_with(cx_a, |tree, _| tree.as_local().unwrap().is_shared()));
assert!(worktree_a.read_with(cx_a, |tree, _| tree
.as_local()
.unwrap()
.has_update_observer()));
project_c2
.update(cx_c, |p, cx| p.open_buffer((worktree_id, "a.txt"), cx))
.await
@@ -1522,7 +1531,7 @@ async fn test_project_reconnect(
executor.run_until_parked();
let worktree1_id = worktree_a1.read_with(cx_a, |worktree, _| {
assert!(worktree.as_local().unwrap().is_shared());
assert!(worktree.as_local().unwrap().has_update_observer());
worktree.id()
});
let (worktree_a2, _) = project_a1
@@ -1534,7 +1543,7 @@ async fn test_project_reconnect(
executor.run_until_parked();
let worktree2_id = worktree_a2.read_with(cx_a, |tree, _| {
assert!(tree.as_local().unwrap().is_shared());
assert!(tree.as_local().unwrap().has_update_observer());
tree.id()
});
executor.run_until_parked();
@@ -1568,7 +1577,7 @@ async fn test_project_reconnect(
});
worktree_a1.read_with(cx_a, |tree, _| {
assert!(tree.as_local().unwrap().is_shared())
assert!(tree.as_local().unwrap().has_update_observer())
});
// While client A is disconnected, add and remove files from client A's project.
@@ -1611,7 +1620,7 @@ async fn test_project_reconnect(
.await;
let worktree3_id = worktree_a3.read_with(cx_a, |tree, _| {
assert!(!tree.as_local().unwrap().is_shared());
assert!(!tree.as_local().unwrap().has_update_observer());
tree.id()
});
executor.run_until_parked();
@@ -1634,7 +1643,11 @@ async fn test_project_reconnect(
project_a1.read_with(cx_a, |project, cx| {
assert!(project.is_shared());
assert!(worktree_a1.read(cx).as_local().unwrap().is_shared());
assert!(worktree_a1
.read(cx)
.as_local()
.unwrap()
.has_update_observer());
assert_eq!(
worktree_a1
.read(cx)
@@ -1652,7 +1665,11 @@ async fn test_project_reconnect(
"subdir2/i.txt"
]
);
assert!(worktree_a3.read(cx).as_local().unwrap().is_shared());
assert!(worktree_a3
.read(cx)
.as_local()
.unwrap()
.has_update_observer());
assert_eq!(
worktree_a3
.read(cx)
@@ -1733,7 +1750,7 @@ async fn test_project_reconnect(
executor.run_until_parked();
let worktree4_id = worktree_a4.read_with(cx_a, |tree, _| {
assert!(tree.as_local().unwrap().is_shared());
assert!(tree.as_local().unwrap().has_update_observer());
tree.id()
});
project_a1.update(cx_a, |project, cx| {

View File

@@ -69,7 +69,6 @@ struct TestPlan<T: RandomizedTest> {
pub struct UserTestPlan {
pub user_id: UserId,
pub username: String,
pub allow_client_reconnection: bool,
pub allow_client_disconnection: bool,
next_root_id: usize,
operation_ix: usize,
@@ -237,7 +236,6 @@ impl<T: RandomizedTest> TestPlan<T> {
next_root_id: 0,
operation_ix: 0,
allow_client_disconnection,
allow_client_reconnection,
});
}

View File

@@ -129,10 +129,10 @@ where
impl Clamp for RGBAColor {
fn clamp(self) -> Self {
RGBAColor {
r: self.r.min(1.0).max(0.0),
g: self.g.min(1.0).max(0.0),
b: self.b.min(1.0).max(0.0),
a: self.a.min(1.0).max(0.0),
r: self.r.clamp(0., 1.),
g: self.g.clamp(0., 1.),
b: self.b.clamp(0., 1.),
a: self.a.clamp(0., 1.),
}
}
}

View File

@@ -1044,7 +1044,6 @@ async fn get_copilot_lsp(http: Arc<dyn HttpClient>) -> anyhow::Result<PathBuf> {
mod tests {
use super::*;
use gpui::TestAppContext;
use language::BufferId;
#[gpui::test(iterations = 10)]
async fn test_buffer_management(cx: &mut TestAppContext) {
@@ -1258,16 +1257,5 @@ mod tests {
fn load(&self, _: &AppContext) -> Task<Result<String>> {
unimplemented!()
}
fn buffer_reloaded(
&self,
_: BufferId,
_: &clock::Global,
_: language::LineEnding,
_: Option<std::time::SystemTime>,
_: &mut AppContext,
) {
unimplemented!()
}
}
}

View File

@@ -289,6 +289,7 @@ gpui::actions!(
ToggleLineNumbers,
ToggleIndentGuides,
ToggleSoftWrap,
ToggleTabBar,
Transpose,
Undo,
UndoSelection,

View File

@@ -277,8 +277,55 @@ impl DisplayMap {
block_map.insert(blocks)
}
pub fn replace_blocks(&mut self, styles: HashMap<BlockId, RenderBlock>) {
self.block_map.replace(styles);
pub fn replace_blocks(
&mut self,
heights_and_renderers: HashMap<BlockId, (Option<u8>, RenderBlock)>,
cx: &mut ModelContext<Self>,
) {
// Note: the previous implementation of `replace_blocks` simply called
// `self.block_map.replace(styles)`, which swapped in each new `RenderBlock`
// without touching the transform tree:
//
// ```rust
// for block in &self.blocks {
// if let Some(render) = renderers.remove(&block.id) {
// *block.render.lock() = render;
// }
// }
// ```
//
// If a block's height changes, however, the tree must be updated. That has a
// performance cost, so we split the replacement into a renderer-only fast path
// and a full height-and-renderer path.
let mut only_renderers = HashMap::<BlockId, RenderBlock>::default();
let mut full_replace = HashMap::<BlockId, (u8, RenderBlock)>::default();
for (id, (height, render)) in heights_and_renderers {
if let Some(height) = height {
full_replace.insert(id, (height, render));
} else {
only_renderers.insert(id, render);
}
}
self.block_map.replace_renderers(only_renderers);
if full_replace.is_empty() {
return;
}
let snapshot = self.buffer.read(cx).snapshot(cx);
let edits = self.buffer_subscription.consume().into_inner();
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
let mut block_map = self.block_map.write(snapshot, edits);
block_map.replace(full_replace);
}
pub fn remove_blocks(&mut self, ids: HashSet<BlockId>, cx: &mut ModelContext<Self>) {

View File

@@ -467,8 +467,8 @@ impl BlockMap {
*transforms = new_transforms;
}
pub fn replace(&mut self, mut renderers: HashMap<BlockId, RenderBlock>) {
for block in &self.blocks {
pub fn replace_renderers(&mut self, mut renderers: HashMap<BlockId, RenderBlock>) {
for block in &mut self.blocks {
if let Some(render) = renderers.remove(&block.id) {
*block.render.lock() = render;
}
@@ -659,6 +659,48 @@ impl<'a> BlockMapWriter<'a> {
ids
}
pub fn replace(&mut self, mut heights_and_renderers: HashMap<BlockId, (u8, RenderBlock)>) {
let wrap_snapshot = &*self.0.wrap_snapshot.borrow();
let buffer = wrap_snapshot.buffer_snapshot();
let mut edits = Patch::default();
let mut last_block_buffer_row = None;
for block in &mut self.0.blocks {
if let Some((new_height, render)) = heights_and_renderers.remove(&block.id) {
if block.height != new_height {
let new_block = Block {
id: block.id,
position: block.position,
height: new_height,
style: block.style,
render: Mutex::new(render),
disposition: block.disposition,
};
*block = Arc::new(new_block);
let buffer_row = block.position.to_point(buffer).row;
if last_block_buffer_row != Some(buffer_row) {
last_block_buffer_row = Some(buffer_row);
let wrap_row = wrap_snapshot
.make_wrap_point(Point::new(buffer_row, 0), Bias::Left)
.row();
let start_row =
wrap_snapshot.prev_row_boundary(WrapPoint::new(wrap_row, 0));
let end_row = wrap_snapshot
.next_row_boundary(WrapPoint::new(wrap_row, 0))
.unwrap_or(wrap_snapshot.max_point().row() + 1);
edits.push(Edit {
old: start_row..end_row,
new: start_row..end_row,
})
}
}
}
}
self.0.sync(wrap_snapshot, edits);
}
pub fn remove(&mut self, block_ids: HashSet<BlockId>) {
let wrap_snapshot = &*self.0.wrap_snapshot.borrow();
let buffer = wrap_snapshot.buffer_snapshot();
@@ -1305,6 +1347,111 @@ mod tests {
assert_eq!(snapshot.text(), "aaa\n\nb!!!\n\n\nbb\nccc\nddd\n\n\n");
}
#[gpui::test]
fn test_replace_with_heights(cx: &mut gpui::TestAppContext) {
let _update = cx.update(|cx| init_test(cx));
let text = "aaa\nbbb\nccc\nddd";
let buffer = cx.update(|cx| MultiBuffer::build_simple(text, cx));
let buffer_snapshot = cx.update(|cx| buffer.read(cx).snapshot(cx));
let _subscription = buffer.update(cx, |buffer, _| buffer.subscribe());
let (_inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_tab_map, tab_snapshot) = TabMap::new(fold_snapshot, 1.try_into().unwrap());
let (_wrap_map, wraps_snapshot) =
cx.update(|cx| WrapMap::new(tab_snapshot, font("Helvetica"), px(14.0), None, cx));
let mut block_map = BlockMap::new(wraps_snapshot.clone(), false, 1, 1, 0);
let mut writer = block_map.write(wraps_snapshot.clone(), Default::default());
let block_ids = writer.insert(vec![
BlockProperties {
style: BlockStyle::Fixed,
position: buffer_snapshot.anchor_after(Point::new(1, 0)),
height: 1,
disposition: BlockDisposition::Above,
render: Box::new(|_| div().into_any()),
},
BlockProperties {
style: BlockStyle::Fixed,
position: buffer_snapshot.anchor_after(Point::new(1, 2)),
height: 2,
disposition: BlockDisposition::Above,
render: Box::new(|_| div().into_any()),
},
BlockProperties {
style: BlockStyle::Fixed,
position: buffer_snapshot.anchor_after(Point::new(3, 3)),
height: 3,
disposition: BlockDisposition::Below,
render: Box::new(|_| div().into_any()),
},
]);
{
let snapshot = block_map.read(wraps_snapshot.clone(), Default::default());
assert_eq!(snapshot.text(), "aaa\n\n\n\nbbb\nccc\nddd\n\n\n");
let mut block_map_writer = block_map.write(wraps_snapshot.clone(), Default::default());
let mut hash_map = HashMap::default();
let render: RenderBlock = Box::new(|_| div().into_any());
hash_map.insert(block_ids[0], (2_u8, render));
block_map_writer.replace(hash_map);
let snapshot = block_map.read(wraps_snapshot.clone(), Default::default());
assert_eq!(snapshot.text(), "aaa\n\n\n\n\nbbb\nccc\nddd\n\n\n");
}
{
let mut block_map_writer = block_map.write(wraps_snapshot.clone(), Default::default());
let mut hash_map = HashMap::default();
let render: RenderBlock = Box::new(|_| div().into_any());
hash_map.insert(block_ids[0], (1_u8, render));
block_map_writer.replace(hash_map);
let snapshot = block_map.read(wraps_snapshot.clone(), Default::default());
assert_eq!(snapshot.text(), "aaa\n\n\n\nbbb\nccc\nddd\n\n\n");
}
{
let mut block_map_writer = block_map.write(wraps_snapshot.clone(), Default::default());
let mut hash_map = HashMap::default();
let render: RenderBlock = Box::new(|_| div().into_any());
hash_map.insert(block_ids[0], (0_u8, render));
block_map_writer.replace(hash_map);
let snapshot = block_map.read(wraps_snapshot.clone(), Default::default());
assert_eq!(snapshot.text(), "aaa\n\n\nbbb\nccc\nddd\n\n\n");
}
{
let mut block_map_writer = block_map.write(wraps_snapshot.clone(), Default::default());
let mut hash_map = HashMap::default();
let render: RenderBlock = Box::new(|_| div().into_any());
hash_map.insert(block_ids[0], (3_u8, render));
block_map_writer.replace(hash_map);
let snapshot = block_map.read(wraps_snapshot.clone(), Default::default());
assert_eq!(snapshot.text(), "aaa\n\n\n\n\n\nbbb\nccc\nddd\n\n\n");
}
{
let mut block_map_writer = block_map.write(wraps_snapshot.clone(), Default::default());
let mut hash_map = HashMap::default();
let render: RenderBlock = Box::new(|_| div().into_any());
hash_map.insert(block_ids[0], (3_u8, render));
block_map_writer.replace(hash_map);
let snapshot = block_map.read(wraps_snapshot.clone(), Default::default());
// Same height as before; the layout should be unchanged
assert_eq!(snapshot.text(), "aaa\n\n\n\n\n\nbbb\nccc\nddd\n\n\n");
}
}
#[gpui::test]
fn test_blocks_on_wrapped_lines(cx: &mut gpui::TestAppContext) {
cx.update(|cx| init_test(cx));

View File

@@ -53,8 +53,7 @@ use convert_case::{Case, Casing};
use debounced_delay::DebouncedDelay;
use display_map::*;
pub use display_map::{DisplayPoint, FoldPlaceholder};
use editor_settings::CurrentLineHighlight;
pub use editor_settings::EditorSettings;
pub use editor_settings::{CurrentLineHighlight, EditorSettings};
use element::LineWithInvisibles;
pub use element::{
CursorLayout, EditorElement, HighlightedRange, HighlightedRangeLine, PointForPosition,
@@ -112,7 +111,7 @@ use rpc::{proto::*, ErrorExt};
use scroll::{Autoscroll, OngoingScroll, ScrollAnchor, ScrollManager, ScrollbarAutoHide};
use selections_collection::{resolve_multiple, MutableSelectionsCollection, SelectionsCollection};
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsStore};
use settings::{update_settings_file, Settings, SettingsStore};
use smallvec::SmallVec;
use snippet::Snippet;
use std::ops::Not as _;
@@ -144,7 +143,7 @@ use workspace::notifications::{DetachAndPromptErr, NotificationId};
use workspace::{
searchable::SearchEvent, ItemNavHistory, SplitDirection, ViewId, Workspace, WorkspaceId,
};
use workspace::{OpenInTerminal, OpenTerminal, Toast};
use workspace::{OpenInTerminal, OpenTerminal, TabBarSettings, Toast};
use crate::hover_links::find_url;
@@ -480,7 +479,7 @@ pub struct Editor {
pending_rename: Option<RenameState>,
searchable: bool,
cursor_shape: CursorShape,
current_line_highlight: CurrentLineHighlight,
current_line_highlight: Option<CurrentLineHighlight>,
collapse_matches: bool,
autoindent_mode: Option<AutoindentMode>,
workspace: Option<(WeakView<Workspace>, Option<WorkspaceId>)>,
@@ -523,6 +522,7 @@ pub struct Editor {
expect_bounds_change: Option<Bounds<Pixels>>,
tasks: BTreeMap<(BufferId, BufferRow), RunnableTasks>,
tasks_update_task: Option<Task<()>>,
previous_search_ranges: Option<Arc<[Range<Anchor>]>>,
}
#[derive(Clone)]
@@ -1768,7 +1768,7 @@ impl Editor {
pending_rename: Default::default(),
searchable: true,
cursor_shape: Default::default(),
current_line_highlight: EditorSettings::get_global(cx).current_line_highlight,
current_line_highlight: None,
autoindent_mode: Some(AutoindentMode::EachLine),
collapse_matches: false,
workspace: None,
@@ -1825,6 +1825,7 @@ impl Editor {
}),
],
tasks_update_task: None,
previous_search_ranges: None,
};
this.tasks_update_task = Some(this.refresh_runnables(cx));
this._subscriptions.extend(project_subscriptions);
@@ -1992,7 +1993,9 @@ impl Editor {
ongoing_scroll: self.scroll_manager.ongoing_scroll(),
placeholder_text: self.placeholder_text.clone(),
is_focused: self.focus_handle.is_focused(cx),
current_line_highlight: self.current_line_highlight,
current_line_highlight: self
.current_line_highlight
.unwrap_or_else(|| EditorSettings::get_global(cx).current_line_highlight),
gutter_hovered: self.gutter_hovered,
}
}
@@ -2082,7 +2085,10 @@ impl Editor {
cx.notify();
}
pub fn set_current_line_highlight(&mut self, current_line_highlight: CurrentLineHighlight) {
pub fn set_current_line_highlight(
&mut self,
current_line_highlight: Option<CurrentLineHighlight>,
) {
self.current_line_highlight = current_line_highlight;
}
@@ -2813,6 +2819,9 @@ impl Editor {
}
if let Some(bracket_pair) = bracket_pair {
let autoclose = self.use_autoclose
&& snapshot.settings_at(selection.start, cx).use_autoclose;
if selection.is_empty() {
if is_bracket_pair_start {
let prefix_len = bracket_pair.start.len() - text.len();
@@ -2833,8 +2842,6 @@ impl Editor {
),
&bracket_pair.start[..prefix_len],
));
let autoclose = self.use_autoclose
&& snapshot.settings_at(selection.start, cx).use_autoclose;
if autoclose
&& following_text_allows_autoclose
&& preceding_text_matches_prefix
@@ -2887,7 +2894,10 @@ impl Editor {
}
// If an opening bracket is 1 character long and is typed while
// text is selected, then surround that text with the bracket pair.
else if is_bracket_pair_start && bracket_pair.start.chars().count() == 1 {
else if autoclose
&& is_bracket_pair_start
&& bracket_pair.start.chars().count() == 1
{
edits.push((selection.start..selection.start, text.clone()));
edits.push((
selection.end..selection.end,
@@ -3010,12 +3020,7 @@ impl Editor {
s.select(new_selections)
});
if brace_inserted {
// If we inserted a brace while composing text (i.e. typing `"` on a
// Brazilian keyboard), exit the composing state because most likely
// the user wanted to surround the selection.
this.unmark_text(cx);
} else if EditorSettings::get_global(cx).use_on_type_format {
if !brace_inserted && EditorSettings::get_global(cx).use_on_type_format {
if let Some(on_type_format_task) =
this.trigger_on_type_formatting(text.to_string(), cx)
{
@@ -9263,11 +9268,15 @@ impl Editor {
for (block_id, diagnostic) in &active_diagnostics.blocks {
new_styles.insert(
*block_id,
diagnostic_block_renderer(diagnostic.clone(), is_valid),
(
None,
diagnostic_block_renderer(diagnostic.clone(), is_valid),
),
);
}
self.display_map
.update(cx, |display_map, _| display_map.replace_blocks(new_styles));
self.display_map.update(cx, |display_map, cx| {
display_map.replace_blocks(new_styles, cx)
});
}
}
}
@@ -9624,12 +9633,12 @@ impl Editor {
pub fn replace_blocks(
&mut self,
blocks: HashMap<BlockId, RenderBlock>,
blocks: HashMap<BlockId, (Option<u8>, RenderBlock)>,
autoscroll: Option<Autoscroll>,
cx: &mut ViewContext<Self>,
) {
self.display_map
.update(cx, |display_map, _| display_map.replace_blocks(blocks));
.update(cx, |display_map, cx| display_map.replace_blocks(blocks, cx));
if let Some(autoscroll) = autoscroll {
self.request_autoscroll(autoscroll, cx);
}
@@ -9790,6 +9799,17 @@ impl Editor {
cx.notify();
}
pub fn toggle_tab_bar(&mut self, _: &ToggleTabBar, cx: &mut ViewContext<Self>) {
let Some(workspace) = self.workspace() else {
return;
};
let fs = workspace.read(cx).app_state().fs.clone();
let current_show = TabBarSettings::get_global(cx).show;
update_settings_file::<TabBarSettings>(fs, cx, move |setting| {
setting.show = Some(!current_show);
});
}
pub fn toggle_indent_guides(&mut self, _: &ToggleIndentGuides, cx: &mut ViewContext<Self>) {
let currently_enabled = self.should_show_indent_guides().unwrap_or_else(|| {
self.buffer
@@ -10256,6 +10276,27 @@ impl Editor {
self.background_highlights_in_range(start..end, &snapshot, theme)
}
#[cfg(feature = "test-support")]
pub fn search_background_highlights(
&mut self,
cx: &mut ViewContext<Self>,
) -> Vec<Range<Point>> {
let snapshot = self.buffer().read(cx).snapshot(cx);
let highlights = self
.background_highlights
.get(&TypeId::of::<items::BufferSearchHighlights>());
if let Some((_color, ranges)) = highlights {
ranges
.iter()
.map(|range| range.start.to_point(&snapshot)..range.end.to_point(&snapshot))
.collect_vec()
} else {
vec![]
}
}
fn document_highlights_for_position<'a>(
&'a self,
position: Anchor,
@@ -10604,7 +10645,6 @@ impl Editor {
let editor_settings = EditorSettings::get_global(cx);
self.scroll_manager.vertical_scroll_margin = editor_settings.vertical_scroll_margin;
self.show_breadcrumbs = editor_settings.toolbar.breadcrumbs;
self.current_line_highlight = editor_settings.current_line_highlight;
if self.mode == EditorMode::Full {
let inline_blame_enabled = ProjectSettings::get_global(cx).git.inline_blame_enabled();

View File

@@ -318,6 +318,7 @@ impl EditorElement {
register_action(view, cx, Editor::open_excerpts);
register_action(view, cx, Editor::open_excerpts_in_split);
register_action(view, cx, Editor::toggle_soft_wrap);
register_action(view, cx, Editor::toggle_tab_bar);
register_action(view, cx, Editor::toggle_line_numbers);
register_action(view, cx, Editor::toggle_indent_guides);
register_action(view, cx, Editor::toggle_inlay_hints);
@@ -4071,6 +4072,7 @@ impl LineWithInvisibles {
if non_whitespace_added || !inside_wrapped_string {
invisibles.push(Invisible::Tab {
line_start_offset: line.len(),
line_end_offset: line.len() + line_chunk.len(),
});
}
} else {
@@ -4186,16 +4188,15 @@ impl LineWithInvisibles {
whitespace_setting: ShowWhitespaceSetting,
cx: &mut WindowContext,
) {
let allowed_invisibles_regions = match whitespace_setting {
ShowWhitespaceSetting::None => return,
ShowWhitespaceSetting::Selection => Some(selection_ranges),
ShowWhitespaceSetting::All => None,
};
for invisible in &self.invisibles {
let (&token_offset, invisible_symbol) = match invisible {
Invisible::Tab { line_start_offset } => (line_start_offset, &layout.tab_invisible),
Invisible::Whitespace { line_offset } => (line_offset, &layout.space_invisible),
let extract_whitespace_info = |invisible: &Invisible| {
let (token_offset, token_end_offset, invisible_symbol) = match invisible {
Invisible::Tab {
line_start_offset,
line_end_offset,
} => (*line_start_offset, *line_end_offset, &layout.tab_invisible),
Invisible::Whitespace { line_offset } => {
(*line_offset, line_offset + 1, &layout.space_invisible)
}
};
let x_offset = self.x_for_index(token_offset);
@@ -4207,17 +4208,73 @@ impl LineWithInvisibles {
line_y,
);
if let Some(allowed_regions) = allowed_invisibles_regions {
let invisible_point = DisplayPoint::new(row, token_offset as u32);
if !allowed_regions
(
[token_offset, token_end_offset],
Box::new(move |cx: &mut WindowContext| {
invisible_symbol.paint(origin, line_height, cx).log_err();
}),
)
};
let invisible_iter = self.invisibles.iter().map(extract_whitespace_info);
match whitespace_setting {
ShowWhitespaceSetting::None => return,
ShowWhitespaceSetting::All => invisible_iter.for_each(|(_, paint)| paint(cx)),
ShowWhitespaceSetting::Selection => invisible_iter.for_each(|([start, _], paint)| {
let invisible_point = DisplayPoint::new(row, start as u32);
if !selection_ranges
.iter()
.any(|region| region.start <= invisible_point && invisible_point < region.end)
{
continue;
return;
}
paint(cx);
}),
// A whitespace character sits on a boundary if any of the following holds:
// - It is a tab
// - It is adjacent to an edge (the start or end of the line)
// - It is adjacent to another whitespace character (left or right)
ShowWhitespaceSetting::Boundary => {
// Track the last invisible seen so we can check adjacency against it for the
// cases above.
// Note: we zip in the original `invisibles` to check whether an entry is a tab.
let mut last_seen: Option<(bool, usize, Box<dyn Fn(&mut WindowContext)>)> = None;
for (([start, end], paint), invisible) in
invisible_iter.zip_eq(self.invisibles.iter())
{
let should_render = match (&last_seen, invisible) {
(_, Invisible::Tab { .. }) => true,
(Some((_, last_end, _)), _) => *last_end == start,
_ => false,
};
if should_render || start == 0 || end == self.len {
paint(cx);
// Since we scan from the left, we skip the first whitespace of a boundary
// between non-whitespace segments, so we correct by redrawing it here if needed.
if let Some((should_render_last, last_end, paint_last)) = last_seen {
// Only redraw the previous invisible if it is actually adjacent
if !should_render_last && last_end == start {
paint_last(cx);
}
}
}
// Manually render anything within a selection
let invisible_point = DisplayPoint::new(row, start as u32);
if selection_ranges.iter().any(|region| {
region.start <= invisible_point && invisible_point < region.end
}) {
paint(cx);
}
last_seen = Some((should_render, end, paint));
}
}
invisible_symbol.paint(origin, line_height, cx).log_err();
}
};
}
pub fn x_for_index(&self, index: usize) -> Pixels {
@@ -4307,8 +4364,18 @@ impl LineWithInvisibles {
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Invisible {
Tab { line_start_offset: usize },
Whitespace { line_offset: usize },
/// A tab character
///
/// A tab character is internally represented by spaces (configured by the user's tab width)
/// aligned to the next tab stop, so it's necessary to store the start and end offsets for
/// adjacency checks.
Tab {
line_start_offset: usize,
line_end_offset: usize,
},
Whitespace {
line_offset: usize,
},
}
impl EditorElement {
@@ -5853,15 +5920,18 @@ mod tests {
let expected_invisibles = vec![
Invisible::Tab {
line_start_offset: 0,
line_end_offset: TAB_SIZE as usize,
},
Invisible::Whitespace {
line_offset: TAB_SIZE as usize,
},
Invisible::Tab {
line_start_offset: TAB_SIZE as usize + 1,
line_end_offset: TAB_SIZE as usize * 2,
},
Invisible::Tab {
line_start_offset: TAB_SIZE as usize * 2 + 1,
line_end_offset: TAB_SIZE as usize * 3,
},
Invisible::Whitespace {
line_offset: TAB_SIZE as usize * 3 + 1,
@@ -5915,10 +5985,11 @@ mod tests {
#[gpui::test]
fn test_wrapped_invisibles_drawing(cx: &mut TestAppContext) {
let tab_size = 4;
let input_text = "a\tbcd ".repeat(9);
let input_text = "a\tbcd ".repeat(9);
let repeated_invisibles = [
Invisible::Tab {
line_start_offset: 1,
line_end_offset: tab_size as usize,
},
Invisible::Whitespace {
line_offset: tab_size as usize + 3,
@@ -5929,6 +6000,12 @@ mod tests {
Invisible::Whitespace {
line_offset: tab_size as usize + 5,
},
Invisible::Whitespace {
line_offset: tab_size as usize + 6,
},
Invisible::Whitespace {
line_offset: tab_size as usize + 7,
},
];
let expected_invisibles = std::iter::once(repeated_invisibles)
.cycle()

View File

@@ -1,5 +1,6 @@
use crate::{
hover_popover::{self, InlayHover},
scroll::ScrollAmount,
Anchor, Editor, EditorSnapshot, FindAllReferences, GoToDefinition, GoToTypeDefinition, InlayId,
PointForPosition, SelectPhase,
};
@@ -38,7 +39,11 @@ impl RangeInEditor {
}
}
fn point_within_range(&self, trigger_point: &TriggerPoint, snapshot: &EditorSnapshot) -> bool {
pub fn point_within_range(
&self,
trigger_point: &TriggerPoint,
snapshot: &EditorSnapshot,
) -> bool {
match (self, trigger_point) {
(Self::Text(range), TriggerPoint::Text(point)) => {
let point_after_start = range.start.cmp(point, &snapshot.buffer_snapshot).is_le();
@@ -169,6 +174,21 @@ impl Editor {
.detach();
}
pub fn scroll_hover(&mut self, amount: &ScrollAmount, cx: &mut ViewContext<Self>) -> bool {
let selection = self.selections.newest_anchor().head();
let snapshot = self.snapshot(cx);
let Some(popover) = self.hover_state.info_popovers.iter().find(|popover| {
popover
.symbol_range
.point_within_range(&TriggerPoint::Text(selection), &snapshot)
}) else {
return false;
};
popover.scroll(amount, cx);
true
}
fn cmd_click_reveal_task(
&mut self,
point: PointForPosition,
@@ -302,7 +322,6 @@ pub fn update_inlay_link_and_hover_points(
hover_popover::hover_at_inlay(
editor,
InlayHover {
excerpt: excerpt_id,
tooltip: match tooltip {
InlayHintTooltip::String(text) => HoverBlock {
text,
@@ -350,7 +369,6 @@ pub fn update_inlay_link_and_hover_points(
hover_popover::hover_at_inlay(
editor,
InlayHover {
excerpt: excerpt_id,
tooltip: match tooltip {
InlayHintLabelPartTooltip::String(text) => {
HoverBlock {

View File

@@ -1,14 +1,15 @@
use crate::{
display_map::{InlayOffset, ToDisplayPoint},
hover_links::{InlayHighlight, RangeInEditor},
scroll::ScrollAmount,
Anchor, AnchorRangeExt, DisplayPoint, DisplayRow, Editor, EditorSettings, EditorSnapshot,
EditorStyle, ExcerptId, Hover, RangeToAnchorExt,
EditorStyle, Hover, RangeToAnchorExt,
};
use futures::{stream::FuturesUnordered, FutureExt};
use gpui::{
div, px, AnyElement, CursorStyle, Hsla, InteractiveElement, IntoElement, MouseButton,
ParentElement, Pixels, SharedString, Size, StatefulInteractiveElement, Styled, Task,
ViewContext, WeakView,
ParentElement, Pixels, ScrollHandle, SharedString, Size, StatefulInteractiveElement, Styled,
Task, ViewContext, WeakView,
};
use language::{markdown, DiagnosticEntry, Language, LanguageRegistry, ParsedMarkdown};
@@ -48,7 +49,6 @@ pub fn hover_at(editor: &mut Editor, anchor: Option<Anchor>, cx: &mut ViewContex
}
pub struct InlayHover {
pub excerpt: ExcerptId,
pub range: InlayHighlight,
pub tooltip: HoverBlock,
}
@@ -118,6 +118,7 @@ pub fn hover_at_inlay(editor: &mut Editor, inlay_hover: InlayHover, cx: &mut Vie
let hover_popover = InfoPopover {
symbol_range: RangeInEditor::Inlay(inlay_hover.range.clone()),
parsed_content,
scroll_handle: ScrollHandle::new(),
};
this.update(&mut cx, |this, cx| {
@@ -317,6 +318,7 @@ fn show_hover(
InfoPopover {
symbol_range: RangeInEditor::Text(range),
parsed_content,
scroll_handle: ScrollHandle::new(),
},
)
})
@@ -423,7 +425,7 @@ async fn parse_blocks(
}
}
#[derive(Default)]
#[derive(Default, Debug)]
pub struct HoverState {
pub info_popovers: Vec<InfoPopover>,
pub diagnostic_popover: Option<DiagnosticPopover>,
@@ -487,10 +489,11 @@ impl HoverState {
}
}
#[derive(Debug, Clone)]
#[derive(Clone, Debug)]
pub struct InfoPopover {
symbol_range: RangeInEditor,
parsed_content: ParsedMarkdown,
pub symbol_range: RangeInEditor,
pub parsed_content: ParsedMarkdown,
pub scroll_handle: ScrollHandle,
}
impl InfoPopover {
@@ -504,23 +507,33 @@ impl InfoPopover {
div()
.id("info_popover")
.elevation_2(cx)
.p_2()
.overflow_y_scroll()
.track_scroll(&self.scroll_handle)
.max_w(max_size.width)
.max_h(max_size.height)
// Prevent a mouse down/move on the popover from being propagated to the editor,
// because that would dismiss the popover.
.on_mouse_move(|_, cx| cx.stop_propagation())
.on_mouse_down(MouseButton::Left, |_, cx| cx.stop_propagation())
.child(crate::render_parsed_markdown(
.child(div().p_2().child(crate::render_parsed_markdown(
"content",
&self.parsed_content,
style,
workspace,
cx,
))
)))
.into_any_element()
}
pub fn scroll(&self, amount: &ScrollAmount, cx: &mut ViewContext<Editor>) {
let mut current = self.scroll_handle.offset();
current.y -= amount.pixels(
cx.line_height(),
self.scroll_handle.bounds().size.height - px(16.),
) / 2.0;
cx.notify();
self.scroll_handle.set_offset(current);
}
}
#[derive(Debug, Clone)]

View File

@@ -10,7 +10,7 @@ use language::Buffer;
use multi_buffer::{
Anchor, ExcerptRange, MultiBuffer, MultiBufferRow, MultiBufferSnapshot, ToPoint,
};
use settings::{Settings, SettingsStore};
use settings::SettingsStore;
use text::{BufferId, Point};
use ui::{
div, ActiveTheme, Context as _, IntoElement, ParentElement, Styled, ViewContext, VisualContext,
@@ -21,7 +21,7 @@ use crate::{
editor_settings::CurrentLineHighlight,
git::{diff_hunk_to_display, DisplayDiffHunk},
hunk_status, hunks_for_selections, BlockDisposition, BlockId, BlockProperties, BlockStyle,
DiffRowHighlight, Editor, EditorSettings, EditorSnapshot, ExpandAllHunkDiffs, RangeToAnchorExt,
DiffRowHighlight, Editor, EditorSnapshot, ExpandAllHunkDiffs, RangeToAnchorExt,
RevertSelectedHunks, ToDisplayPoint, ToggleHunkDiff,
};
@@ -591,7 +591,7 @@ fn editor_with_deleted_text(
let subscription_editor = parent_editor.clone();
editor._subscriptions.extend([
cx.on_blur(&editor.focus_handle, |editor, cx| {
editor.set_current_line_highlight(CurrentLineHighlight::None);
editor.set_current_line_highlight(Some(CurrentLineHighlight::None));
editor.change_selections(None, cx, |s| {
s.try_cancel();
});
@@ -602,14 +602,14 @@ fn editor_with_deleted_text(
{
parent_editor.read(cx).current_line_highlight
} else {
EditorSettings::get_global(cx).current_line_highlight
None
};
editor.set_current_line_highlight(restored_highlight);
cx.notify();
}),
cx.observe_global::<SettingsStore>(|editor, cx| {
if !editor.is_focused(cx) {
editor.set_current_line_highlight(CurrentLineHighlight::None);
editor.set_current_line_highlight(Some(CurrentLineHighlight::None));
}
}),
]);

View File

@@ -13,8 +13,7 @@ use gpui::{
VisualContext, WeakView, WindowContext,
};
use language::{
proto::serialize_anchor as serialize_text_anchor, Bias, Buffer, CharKind, OffsetRangeExt,
Point, SelectionGoal,
proto::serialize_anchor as serialize_text_anchor, Bias, Buffer, CharKind, Point, SelectionGoal,
};
use multi_buffer::AnchorRangeExt;
use project::{search::SearchQuery, FormatTrigger, Item as _, Project, ProjectPath};
@@ -1008,6 +1007,25 @@ impl SearchableItem for Editor {
self.has_background_highlights::<SearchWithinRange>()
}
fn toggle_filtered_search_ranges(&mut self, enabled: bool, cx: &mut ViewContext<Self>) {
if self.has_filtered_search_ranges() {
self.previous_search_ranges = self
.clear_background_highlights::<SearchWithinRange>(cx)
.map(|(_, ranges)| ranges)
}
if !enabled {
return;
}
let ranges = self.selections.disjoint_anchor_ranges();
if ranges.iter().any(|range| range.start != range.end) {
self.set_search_within_ranges(&ranges, cx);
} else if let Some(previous_search_ranges) = self.previous_search_ranges.take() {
self.set_search_within_ranges(&previous_search_ranges, cx)
}
}
fn query_suggestion(&mut self, cx: &mut ViewContext<Self>) -> String {
let setting = EditorSettings::get_global(cx).seed_search_query_from_cursor;
let snapshot = &self.snapshot(cx).buffer_snapshot;
@@ -1016,9 +1034,14 @@ impl SearchableItem for Editor {
match setting {
SeedQuerySetting::Never => String::new(),
SeedQuerySetting::Selection | SeedQuerySetting::Always if !selection.is_empty() => {
snapshot
let text: String = snapshot
.text_for_range(selection.start..selection.end)
.collect()
.collect();
if text.contains('\n') {
String::new()
} else {
text
}
}
SeedQuerySetting::Selection => String::new(),
SeedQuerySetting::Always => {
@@ -1135,58 +1158,64 @@ impl SearchableItem for Editor {
let search_within_ranges = self
.background_highlights
.get(&TypeId::of::<SearchWithinRange>())
.map(|(_color, ranges)| {
ranges
.iter()
.map(|range| range.to_offset(&buffer))
.collect::<Vec<_>>()
.map_or(vec![], |(_color, ranges)| {
ranges.iter().cloned().collect::<Vec<_>>()
});
cx.background_executor().spawn(async move {
let mut ranges = Vec::new();
if let Some((_, _, excerpt_buffer)) = buffer.as_singleton() {
if let Some(search_within_ranges) = search_within_ranges {
for range in search_within_ranges {
let offset = range.start;
ranges.extend(
query
.search(excerpt_buffer, Some(range))
.await
.into_iter()
.map(|range| {
buffer.anchor_after(range.start + offset)
..buffer.anchor_before(range.end + offset)
}),
);
}
let search_within_ranges = if search_within_ranges.is_empty() {
vec![None]
} else {
ranges.extend(query.search(excerpt_buffer, None).await.into_iter().map(
|range| buffer.anchor_after(range.start)..buffer.anchor_before(range.end),
));
search_within_ranges
.into_iter()
.map(|range| Some(range.to_offset(&buffer)))
.collect::<Vec<_>>()
};
for range in search_within_ranges {
let buffer = &buffer;
ranges.extend(
query
.search(excerpt_buffer, range.clone())
.await
.into_iter()
.map(|matched_range| {
let offset = range.clone().map(|r| r.start).unwrap_or(0);
buffer.anchor_after(matched_range.start + offset)
..buffer.anchor_before(matched_range.end + offset)
}),
);
}
} else {
for excerpt in buffer.excerpt_boundaries_in_range(0..buffer.len()) {
if let Some(next_excerpt) = excerpt.next {
let excerpt_range =
next_excerpt.range.context.to_offset(&next_excerpt.buffer);
ranges.extend(
query
.search(&next_excerpt.buffer, Some(excerpt_range.clone()))
.await
.into_iter()
.map(|range| {
let start = next_excerpt
.buffer
.anchor_after(excerpt_range.start + range.start);
let end = next_excerpt
.buffer
.anchor_before(excerpt_range.start + range.end);
buffer.anchor_in_excerpt(next_excerpt.id, start).unwrap()
..buffer.anchor_in_excerpt(next_excerpt.id, end).unwrap()
}),
);
}
let search_within_ranges = if search_within_ranges.is_empty() {
vec![buffer.anchor_before(0)..buffer.anchor_after(buffer.len())]
} else {
search_within_ranges
};
for (excerpt_id, search_buffer, search_range) in
buffer.excerpts_in_ranges(search_within_ranges)
{
ranges.extend(
query
.search(&search_buffer, Some(search_range.clone()))
.await
.into_iter()
.map(|match_range| {
let start = search_buffer
.anchor_after(search_range.start + match_range.start);
let end = search_buffer
.anchor_before(search_range.start + match_range.end);
buffer.anchor_in_excerpt(excerpt_id, start).unwrap()
..buffer.anchor_in_excerpt(excerpt_id, end).unwrap()
}),
);
}
}
};
ranges
})
}

View File

@@ -1,7 +1,8 @@
use crate::Editor;
use serde::Deserialize;
use ui::{px, Pixels};
#[derive(Clone, PartialEq, Deserialize)]
#[derive(Debug, Clone, PartialEq, Deserialize)]
pub enum ScrollAmount {
// Scroll N lines (positive is towards the end of the document)
Line(f32),
@@ -25,4 +26,11 @@ impl ScrollAmount {
.unwrap_or(0.),
}
}
pub fn pixels(&self, line_height: Pixels, height: Pixels) -> Pixels {
match self {
ScrollAmount::Line(x) => px(line_height.0 * x),
ScrollAmount::Page(x) => px(height.0 * x),
}
}
}
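The new `pixels` helper above converts a scroll amount into a pixel delta. A minimal, self-contained sketch of the same logic, assuming a stand-in `Pixels` newtype and `px` constructor in place of gpui's types:

```rust
// Stand-in for gpui's Pixels type (an assumption for self-containment).
#[derive(Debug, Clone, Copy, PartialEq)]
struct Pixels(f32);

fn px(value: f32) -> Pixels {
    Pixels(value)
}

#[derive(Debug, Clone, PartialEq)]
enum ScrollAmount {
    /// Scroll N lines (positive is towards the end of the document).
    Line(f32),
    /// Scroll N pages.
    Page(f32),
}

impl ScrollAmount {
    /// Convert the scroll amount to a pixel delta, given the line height
    /// and the viewport height.
    fn pixels(&self, line_height: Pixels, height: Pixels) -> Pixels {
        match self {
            ScrollAmount::Line(x) => px(line_height.0 * x),
            ScrollAmount::Page(x) => px(height.0 * x),
        }
    }
}

fn main() {
    // Three lines at a 20px line height → 60px.
    assert_eq!(ScrollAmount::Line(3.0).pixels(px(20.0), px(400.0)), px(60.0));
    // Half a page of a 400px viewport → 200px.
    assert_eq!(ScrollAmount::Page(0.5).pixels(px(20.0), px(400.0)), px(200.0));
}
```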

View File

@@ -273,6 +273,13 @@ impl SelectionsCollection {
self.all(cx).last().unwrap().clone()
}
pub fn disjoint_anchor_ranges(&self) -> Vec<Range<Anchor>> {
self.disjoint_anchors()
.iter()
.map(|s| s.start..s.end)
.collect()
}
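The new `disjoint_anchor_ranges` helper above collapses each selection to its `start..end` range. A sketch of the same mapping with a hypothetical `Selection` struct standing in for the editor's anchor-based type:

```rust
use std::ops::Range;

// Hypothetical stand-in for the editor's Selection<Anchor> type.
struct Selection {
    start: usize,
    end: usize,
}

// Mirror of disjoint_anchor_ranges: map each selection to its range.
fn disjoint_ranges(selections: &[Selection]) -> Vec<Range<usize>> {
    selections.iter().map(|s| s.start..s.end).collect()
}

fn main() {
    let sels = [
        Selection { start: 0, end: 4 },
        Selection { start: 10, end: 10 }, // an empty (cursor) selection
    ];
    assert_eq!(disjoint_ranges(&sels), vec![0..4, 10..10]);
}
```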
#[cfg(any(test, feature = "test-support"))]
pub fn ranges<D: TextDimension + Ord + Sub<D, Output = D> + std::fmt::Debug>(
&self,

View File

@@ -2437,7 +2437,7 @@ where
}
}
#[derive(Default)]
#[derive(Default, Debug)]
struct ScrollHandleState {
offset: Rc<RefCell<Point<Pixels>>>,
bounds: Bounds<Pixels>,
@@ -2449,7 +2449,7 @@ struct ScrollHandleState {
/// A handle to the scrollable aspects of an element.
/// Used for accessing scroll state, like the current scroll offset,
/// and for mutating the scroll state, like scrolling to a specific child.
#[derive(Clone)]
#[derive(Clone, Debug)]
pub struct ScrollHandle(Rc<RefCell<ScrollHandleState>>);
impl Default for ScrollHandle {
@@ -2526,6 +2526,14 @@ impl ScrollHandle {
}
}
/// Set the offset explicitly. The offset is the distance from the top left of the
/// parent container to the top left of the first child.
/// As you scroll further down the offset becomes more negative.
pub fn set_offset(&self, mut position: Point<Pixels>) {
let state = self.0.borrow();
*state.offset.borrow_mut() = position;
}
/// Get the logical scroll top, based on a child index and a pixel offset.
pub fn logical_scroll_top(&self) -> (usize, Pixels) {
let ix = self.top_item();

View File

@@ -201,7 +201,7 @@ mod sys {
#[link(name = "CoreFoundation", kind = "framework")]
#[link(name = "CoreVideo", kind = "framework")]
#[allow(improper_ctypes)]
#[allow(improper_ctypes, unknown_lints, clippy::duplicated_attributes)]
extern "C" {
pub fn CVDisplayLinkCreateWithActiveCGDisplays(
display_link_out: *mut *mut CVDisplayLink,

View File

@@ -310,8 +310,8 @@ unsafe fn build_window_class(name: &'static str, superclass: &Class) -> *const C
decl.register()
}
#[derive(Debug, Clone)]
#[allow(clippy::enum_variant_names)]
#[derive(Clone)]
enum ImeInput {
InsertText(String, Option<Range<usize>>),
SetMarkedText(String, Option<Range<usize>>, Option<Range<usize>>),
@@ -340,7 +340,7 @@ struct MacWindowState {
traffic_light_position: Option<Point<Pixels>>,
previous_modifiers_changed_event: Option<PlatformInput>,
// State tracking what the IME did after the last request
last_ime_action: Option<ImeInput>,
last_ime_inputs: Option<SmallVec<[(String, Option<Range<usize>>); 1]>>,
previous_keydown_inserted_text: Option<String>,
external_files_dragged: bool,
// Whether the next left-mouse click is also the focusing click.
@@ -636,7 +636,7 @@ impl MacWindow {
.as_ref()
.and_then(|titlebar| titlebar.traffic_light_position),
previous_modifiers_changed_event: None,
last_ime_action: None,
last_ime_inputs: None,
previous_keydown_inserted_text: None,
external_files_dragged: false,
first_mouse: false,
@@ -1195,18 +1195,26 @@ extern "C" fn handle_key_down(this: &Object, _: Sel, native_event: id) {
// - The IME consumes characters like 'j' and 'k', which makes paging through `less` in
// the terminal behave incorrectly by default. This behavior should be patched by our
// IME integration
// - `alt-t` should open the tasks menu
// - In vim mode, this keybinding should work:
// ```
// {
// "context": "Editor && vim_mode == insert",
// "bindings": {"j j": "vim::NormalBefore"}
// }
// ```
// and typing 'j k' in insert mode with this keybinding should insert the two characters
// Brazilian layout:
// - `" space` should create an unmarked quote
// - `" backspace` should delete the marked quote
// - `" up` should insert a quote, unmark it, and move up one line
// - `" cmd-down` should insert a quote, unmark it, and move to the end of the file
// - NOTE: The current implementation does not move the selection to the end of the file
// - `cmd-ctrl-space` and clicking on an emoji should type it
// Czech (QWERTY) layout:
// - in vim mode `option-4` should go to end of line (same as $)
// Japanese (Romaji) layout:
// - Triggering the IME composer (e.g. via typing 'a i' and then the left key), and then selecting
// results of different length (e.g. kana -> kanji -> emoji -> back to kanji via the up and down keys)
// should maintain the composing state in the editor
// - type `a i left down up enter enter` should create an unmarked text "愛"
extern "C" fn handle_key_event(this: &Object, native_event: id, key_equivalent: bool) -> BOOL {
let window_state = unsafe { get_window_state(this) };
let mut lock = window_state.as_ref().lock();
@@ -1236,12 +1244,12 @@ extern "C" fn handle_key_event(this: &Object, native_event: id, key_equivalent:
} else {
lock.last_fresh_keydown = Some(keydown.clone());
}
lock.last_ime_inputs = Some(Default::default());
drop(lock);
// Send the event to the input context for IME handling, unless the `fn` modifier is
// being pressed. This will call back into other functions like `insert_text`, etc.
// Note that the IME expects its actions to be applied immediately, and buffering them
// can break pre-edit
// being pressed.
// this will call back into `insert_text`, etc.
if !fn_modifier {
unsafe {
let input_context: id = msg_send![this, inputContext];
@@ -1252,27 +1260,36 @@ extern "C" fn handle_key_event(this: &Object, native_event: id, key_equivalent:
let mut handled = false;
let mut lock = window_state.lock();
let previous_keydown_inserted_text = lock.previous_keydown_inserted_text.take();
let mut last_ime = lock.last_ime_action.take();
let mut last_inserts = lock.last_ime_inputs.take().unwrap();
let mut callback = lock.event_callback.take();
drop(lock);
let last_insert = last_inserts.pop();
// On a Brazilian keyboard, typing `"` and then hitting `up` will cause two IME
// events: one to unmark the quote, and one to send the up arrow.
for (text, range) in last_inserts {
send_to_input_handler(this, ImeInput::InsertText(text, range));
}
let is_composing =
with_input_handler(this, |input_handler| input_handler.marked_text_range())
.flatten()
.is_some();
if let Some(ime) = last_ime {
if let ImeInput::InsertText(text, _) = &ime {
if !is_composing {
window_state.lock().previous_keydown_inserted_text = Some(text.clone());
if let Some(callback) = callback.as_mut() {
event.keystroke.ime_key = Some(text.clone());
let _ = callback(PlatformInput::KeyDown(event));
}
if let Some((text, range)) = last_insert {
if !is_composing {
window_state.lock().previous_keydown_inserted_text = Some(text.clone());
if let Some(callback) = callback.as_mut() {
event.keystroke.ime_key = Some(text.clone());
handled = !callback(PlatformInput::KeyDown(event)).propagate;
}
}
handled = true;
if !handled {
handled = true;
send_to_input_handler(this, ImeInput::InsertText(text, range));
}
} else if !is_composing {
let is_held = event.is_held;
@@ -1653,21 +1670,24 @@ extern "C" fn valid_attributes_for_marked_text(_: &Object, _: Sel) -> id {
}
extern "C" fn has_marked_text(this: &Object, _: Sel) -> BOOL {
with_input_handler(this, |input_handler| input_handler.marked_text_range())
.flatten()
.is_some() as BOOL
let has_marked_text_result =
with_input_handler(this, |input_handler| input_handler.marked_text_range()).flatten();
has_marked_text_result.is_some() as BOOL
}
extern "C" fn marked_range(this: &Object, _: Sel) -> NSRange {
with_input_handler(this, |input_handler| input_handler.marked_text_range())
.flatten()
.map_or(NSRange::invalid(), |range| range.into())
let marked_range_result =
with_input_handler(this, |input_handler| input_handler.marked_text_range()).flatten();
marked_range_result.map_or(NSRange::invalid(), |range| range.into())
}
extern "C" fn selected_range(this: &Object, _: Sel) -> NSRange {
with_input_handler(this, |input_handler| input_handler.selected_text_range())
.flatten()
.map_or(NSRange::invalid(), |range| range.into())
let selected_range_result =
with_input_handler(this, |input_handler| input_handler.selected_text_range()).flatten();
selected_range_result.map_or(NSRange::invalid(), |range| range.into())
}
extern "C" fn first_rect_for_character_range(
@@ -1760,7 +1780,7 @@ extern "C" fn attributed_substring_for_proposed_range(
return None;
}
let selected_text = input_handler.text_for_range(range)?;
let selected_text = input_handler.text_for_range(range.clone())?;
unsafe {
let string: id = msg_send![class!(NSAttributedString), alloc];
let string: id = msg_send![string, initWithString: ns_string(&selected_text)];
@@ -1926,17 +1946,25 @@ fn send_to_input_handler(window: &Object, ime: ImeInput) {
let window_state = get_window_state(window);
let mut lock = window_state.lock();
lock.last_ime_action = Some(ime.clone());
if let Some(mut input_handler) = lock.input_handler.take() {
drop(lock);
match ime {
match ime.clone() {
ImeInput::InsertText(text, range) => {
if let Some(ime_input) = lock.last_ime_inputs.as_mut() {
ime_input.push((text, range));
lock.input_handler = Some(input_handler);
return;
}
drop(lock);
input_handler.replace_text_in_range(range, &text)
}
ImeInput::SetMarkedText(text, range, marked_range) => {
drop(lock);
input_handler.replace_and_mark_text_in_range(range, &text, marked_range)
}
ImeInput::UnmarkText => input_handler.unmark_text(),
ImeInput::UnmarkText => {
drop(lock);
input_handler.unmark_text()
}
}
window_state.lock().input_handler = Some(input_handler);
}

View File

@@ -63,7 +63,9 @@ impl TaffyLayoutEngine {
let parent_id = self
.taffy
// This is safe because LayoutId is repr(transparent) to taffy::tree::NodeId.
.new_with_children(taffy_style, unsafe { std::mem::transmute(children) })
.new_with_children(taffy_style, unsafe {
std::mem::transmute::<&[LayoutId], &[taffy::NodeId]>(children)
})
.expect(EXPECT_MESSAGE)
.into();
self.children_to_parents
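Spelling out both transmute type parameters, as the hunk above does, lets the compiler and clippy check the exact cast instead of inferring it. A sketch of the same pattern with hypothetical `LayoutId`/`NodeId` newtypes (the cast is only sound because of `repr(transparent)`):

```rust
#[repr(transparent)]
#[derive(Debug, Clone, Copy, PartialEq)]
struct NodeId(u64);

// LayoutId is repr(transparent) over NodeId, so a slice of one can be
// reinterpreted as a slice of the other.
#[repr(transparent)]
#[derive(Debug, Clone, Copy, PartialEq)]
struct LayoutId(NodeId);

fn as_node_ids(children: &[LayoutId]) -> &[NodeId] {
    // Naming both the source and target types makes the cast explicit,
    // instead of letting type inference pick them silently.
    unsafe { std::mem::transmute::<&[LayoutId], &[NodeId]>(children) }
}

fn main() {
    let layout = [LayoutId(NodeId(1)), LayoutId(NodeId(2))];
    let nodes = as_node_ids(&layout);
    assert_eq!(nodes, &[NodeId(1), NodeId(2)][..]);
}
```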

View File

@@ -1036,6 +1036,37 @@ impl<'a> WindowContext<'a> {
});
}
/// Observe notifications from a model or view.
/// The entity you're observing can be either a [`View`] or a [`Model`].
/// The callback will be invoked with a handle to the entity and a window context for the current window whenever the entity notifies.
pub fn observe<E, T>(
&mut self,
entity: &E,
mut on_notify: impl FnMut(E, &mut WindowContext<'_>) + 'static,
) -> Subscription
where
E: Entity<T>,
{
let entity_id = entity.entity_id();
let entity = entity.downgrade();
let window_handle = self.window.handle;
self.app.new_observer(
entity_id,
Box::new(move |cx| {
window_handle
.update(cx, |_, cx| {
if let Some(handle) = E::upgrade_from(&entity) {
on_notify(handle, cx);
true
} else {
false
}
})
.unwrap_or(false)
}),
)
}
/// Subscribe to events emitted by a model or view.
/// The entity to which you're subscribing must implement the [`EventEmitter`] trait.
/// The callback will be invoked with a handle to the emitting entity (either a [`View`] or [`Model`]), the event, and a window context for the current window.

View File

@@ -16,7 +16,9 @@ use html5ever::tendril::TendrilSink;
use html5ever::tree_builder::TreeBuilderOpts;
use markup5ever_rcdom::RcDom;
use crate::markdown::{HeadingHandler, ListHandler, ParagraphHandler, StyledTextHandler};
use crate::markdown::{
HeadingHandler, ListHandler, ParagraphHandler, StyledTextHandler, TableHandler,
};
use crate::markdown_writer::{HandleTag, MarkdownWriter};
/// Converts the provided HTML to Markdown.
@@ -27,11 +29,11 @@ pub fn convert_html_to_markdown(html: impl Read) -> Result<String> {
Box::new(ParagraphHandler),
Box::new(HeadingHandler),
Box::new(ListHandler),
Box::new(TableHandler::new()),
Box::new(StyledTextHandler),
Box::new(structure::rustdoc::RustdocChromeRemover),
Box::new(structure::rustdoc::RustdocHeadingHandler),
Box::new(structure::rustdoc::RustdocCodeHandler),
Box::new(structure::rustdoc::RustdocTableHandler::new()),
Box::new(structure::rustdoc::RustdocItemHandler),
];
@@ -51,11 +53,11 @@ pub fn convert_rustdoc_to_markdown(html: impl Read) -> Result<String> {
Box::new(ParagraphHandler),
Box::new(HeadingHandler),
Box::new(ListHandler),
Box::new(TableHandler::new()),
Box::new(StyledTextHandler),
Box::new(structure::rustdoc::RustdocChromeRemover),
Box::new(structure::rustdoc::RustdocHeadingHandler),
Box::new(structure::rustdoc::RustdocCodeHandler),
Box::new(structure::rustdoc::RustdocTableHandler::new()),
Box::new(structure::rustdoc::RustdocItemHandler),
];

View File

@@ -101,6 +101,87 @@ impl HandleTag for ListHandler {
}
}
pub struct TableHandler {
/// The number of columns in the current `<table>`.
current_table_columns: usize,
is_first_th: bool,
is_first_td: bool,
}
impl TableHandler {
pub fn new() -> Self {
Self {
current_table_columns: 0,
is_first_th: true,
is_first_td: true,
}
}
}
impl HandleTag for TableHandler {
fn should_handle(&self, tag: &str) -> bool {
match tag {
"table" | "thead" | "tbody" | "tr" | "th" | "td" => true,
_ => false,
}
}
fn handle_tag_start(
&mut self,
tag: &HtmlElement,
writer: &mut MarkdownWriter,
) -> StartTagOutcome {
match tag.tag.as_str() {
"thead" => writer.push_blank_line(),
"tr" => writer.push_newline(),
"th" => {
self.current_table_columns += 1;
if self.is_first_th {
self.is_first_th = false;
} else {
writer.push_str(" ");
}
writer.push_str("| ");
}
"td" => {
if self.is_first_td {
self.is_first_td = false;
} else {
writer.push_str(" ");
}
writer.push_str("| ");
}
_ => {}
}
StartTagOutcome::Continue
}
fn handle_tag_end(&mut self, tag: &HtmlElement, writer: &mut MarkdownWriter) {
match tag.tag.as_str() {
"thead" => {
writer.push_newline();
for ix in 0..self.current_table_columns {
if ix > 0 {
writer.push_str(" ");
}
writer.push_str("| ---");
}
writer.push_str(" |");
self.is_first_th = true;
}
"tr" => {
writer.push_str(" |");
self.is_first_td = true;
}
"table" => {
self.current_table_columns = 0;
}
_ => {}
}
}
}
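The `handle_tag_end` arm for `thead` above emits the Markdown header separator row cell by cell. The same loop, extracted into a small standalone function for illustration:

```rust
// Build the separator row emitted after a <thead>: one "| ---" cell per
// column, closed by " |".
fn table_separator_row(columns: usize) -> String {
    let mut row = String::new();
    for ix in 0..columns {
        if ix > 0 {
            row.push(' ');
        }
        row.push_str("| ---");
    }
    row.push_str(" |");
    row
}

fn main() {
    assert_eq!(table_separator_row(3), "| --- | --- | --- |");
}
```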
pub struct StyledTextHandler;
impl HandleTag for StyledTextHandler {

View File

@@ -96,87 +96,6 @@ impl HandleTag for RustdocCodeHandler {
}
}
pub struct RustdocTableHandler {
/// The number of columns in the current `<table>`.
current_table_columns: usize,
is_first_th: bool,
is_first_td: bool,
}
impl RustdocTableHandler {
pub fn new() -> Self {
Self {
current_table_columns: 0,
is_first_th: true,
is_first_td: true,
}
}
}
impl HandleTag for RustdocTableHandler {
fn should_handle(&self, tag: &str) -> bool {
match tag {
"table" | "thead" | "tbody" | "tr" | "th" | "td" => true,
_ => false,
}
}
fn handle_tag_start(
&mut self,
tag: &HtmlElement,
writer: &mut MarkdownWriter,
) -> StartTagOutcome {
match tag.tag.as_str() {
"thead" => writer.push_blank_line(),
"tr" => writer.push_newline(),
"th" => {
self.current_table_columns += 1;
if self.is_first_th {
self.is_first_th = false;
} else {
writer.push_str(" ");
}
writer.push_str("| ");
}
"td" => {
if self.is_first_td {
self.is_first_td = false;
} else {
writer.push_str(" ");
}
writer.push_str("| ");
}
_ => {}
}
StartTagOutcome::Continue
}
fn handle_tag_end(&mut self, tag: &HtmlElement, writer: &mut MarkdownWriter) {
match tag.tag.as_str() {
"thead" => {
writer.push_newline();
for ix in 0..self.current_table_columns {
if ix > 0 {
writer.push_str(" ");
}
writer.push_str("| ---");
}
writer.push_str(" |");
self.is_first_th = true;
}
"tr" => {
writer.push_str(" |");
self.is_first_td = true;
}
"table" => {
self.current_table_columns = 0;
}
_ => {}
}
}
}
const RUSTDOC_ITEM_NAME_CLASS: &str = "item-name";
pub struct RustdocItemHandler;

View File

@@ -382,16 +382,6 @@ pub trait LocalFile: File {
/// Loads the file's contents from disk.
fn load(&self, cx: &AppContext) -> Task<Result<String>>;
/// Called when the buffer is reloaded from disk.
fn buffer_reloaded(
&self,
buffer_id: BufferId,
version: &clock::Global,
line_ending: LineEnding,
mtime: Option<SystemTime>,
cx: &mut AppContext,
);
/// Returns true if the file should not be shared with collaborators.
fn is_private(&self, _: &AppContext) -> bool {
false
@@ -884,15 +874,6 @@ impl Buffer {
self.saved_version = version;
self.text.set_line_ending(line_ending);
self.saved_mtime = mtime;
if let Some(file) = self.file.as_ref().and_then(|f| f.as_local()) {
file.buffer_reloaded(
self.remote_id(),
&self.saved_version,
self.line_ending(),
self.saved_mtime,
cx,
);
}
cx.emit(Event::Reloaded);
cx.notify();
}

View File

@@ -391,6 +391,13 @@ pub enum ShowWhitespaceSetting {
None,
/// Draw all invisible symbols.
All,
/// Draw whitespaces at boundaries only.
///
/// For a whitespace to be on a boundary, any of the following conditions need to be met:
/// - It is a tab
/// - It is adjacent to an edge (start or end)
/// - It is adjacent to a whitespace (left or right)
Boundary,
}
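The three `Boundary` conditions listed in the doc comment above can be expressed as a predicate. This is a hypothetical helper mirroring the documented rule, not Zed's actual rendering code:

```rust
// A whitespace character is drawn under the Boundary setting if it is a tab,
// sits at the start or end of the line, or neighbors another whitespace.
fn is_boundary_whitespace(line: &[char], ix: usize) -> bool {
    let ch = line[ix];
    if !ch.is_whitespace() {
        return false;
    }
    if ch == '\t' {
        return true;
    }
    let at_edge = ix == 0 || ix + 1 == line.len();
    let next_to_ws = (ix > 0 && line[ix - 1].is_whitespace())
        || (ix + 1 < line.len() && line[ix + 1].is_whitespace());
    at_edge || next_to_ws
}

fn main() {
    let line: Vec<char> = "  foo bar  ".chars().collect();
    assert!(is_boundary_whitespace(&line, 0)); // leading indent (edge)
    assert!(is_boundary_whitespace(&line, 1)); // adjacent to a whitespace
    assert!(!is_boundary_whitespace(&line, 5)); // lone space between words
    assert!(is_boundary_whitespace(&line, 9)); // trailing whitespace
}
```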
/// Controls which formatter should be used when formatting code.

View File

@@ -48,7 +48,6 @@ pub struct SyntaxMapMatches<'a> {
#[derive(Debug)]
pub struct SyntaxMapCapture<'a> {
pub depth: usize,
pub node: Node<'a>,
pub index: u32,
pub grammar_index: usize,
@@ -886,7 +885,9 @@ impl<'a> SyntaxMapCaptures<'a> {
// TODO - add a Tree-sitter API to remove the need for this.
let cursor = unsafe {
std::mem::transmute::<_, &'static mut QueryCursor>(query_cursor.deref_mut())
std::mem::transmute::<&mut tree_sitter::QueryCursor, &'static mut QueryCursor>(
query_cursor.deref_mut(),
)
};
cursor.set_byte_range(range.clone());
@@ -933,7 +934,6 @@ impl<'a> SyntaxMapCaptures<'a> {
let layer = self.layers[..self.active_layer_count].first()?;
let capture = layer.next_capture?;
Some(SyntaxMapCapture {
depth: layer.depth,
grammar_index: layer.grammar_index,
index: capture.index,
node: capture.node,
@@ -1004,7 +1004,9 @@ impl<'a> SyntaxMapMatches<'a> {
// TODO - add a Tree-sitter API to remove the need for this.
let cursor = unsafe {
std::mem::transmute::<_, &'static mut QueryCursor>(query_cursor.deref_mut())
std::mem::transmute::<&mut tree_sitter::QueryCursor, &'static mut QueryCursor>(
query_cursor.deref_mut(),
)
};
cursor.set_byte_range(range.clone());

View File

@@ -765,6 +765,7 @@ impl SearchableItem for LspLogView {
regex: true,
// LSP log is read-only.
replacement: false,
selection: false,
}
}
fn active_match_index(

View File

@@ -1,6 +1,6 @@
name = "C++"
grammar = "cpp"
path_suffixes = ["cc", "hh", "cpp", "h", "hpp", "cxx", "hxx", "c++", "ipp"]
path_suffixes = ["cc", "hh", "cpp", "h", "hpp", "cxx", "hxx", "c++", "ipp", "inl"]
line_comments = ["// "]
autoclose_before = ";:.,=}])>"
brackets = [

View File

@@ -29,14 +29,14 @@ pub(super) fn json_task_context() -> ContextProviderWithTasks {
ContextProviderWithTasks::new(TaskTemplates(vec![
TaskTemplate {
label: "package script $ZED_CUSTOM_script".to_owned(),
command: "npm run".to_owned(),
command: "npm --prefix $ZED_DIRNAME run".to_owned(),
args: vec![VariableName::Custom("script".into()).template_value()],
tags: vec!["package-script".into()],
..TaskTemplate::default()
},
TaskTemplate {
label: "composer script $ZED_CUSTOM_script".to_owned(),
command: "composer".to_owned(),
command: "composer -d $ZED_DIRNAME".to_owned(),
args: vec![VariableName::Custom("script".into()).template_value()],
tags: vec!["composer-script".into()],
..TaskTemplate::default()

View File

@@ -0,0 +1,159 @@
use std::str;
use std::sync::Arc;
use anyhow::{anyhow, Result};
use collections::HashMap;
use futures::{
channel::mpsc::{unbounded, UnboundedReceiver, UnboundedSender},
AsyncBufReadExt, AsyncRead, AsyncReadExt as _,
};
use gpui::{BackgroundExecutor, Task};
use log::warn;
use parking_lot::Mutex;
use smol::io::BufReader;
use crate::{
AnyNotification, AnyResponse, IoHandler, IoKind, RequestId, ResponseHandler, CONTENT_LEN_HEADER,
};
const HEADER_DELIMITER: &'static [u8; 4] = b"\r\n\r\n";
/// Handler for stdout of language server.
pub struct LspStdoutHandler {
pub(super) loop_handle: Task<Result<()>>,
pub(super) notifications_channel: UnboundedReceiver<AnyNotification>,
}
pub(self) async fn read_headers<Stdout>(
reader: &mut BufReader<Stdout>,
buffer: &mut Vec<u8>,
) -> Result<()>
where
Stdout: AsyncRead + Unpin + Send + 'static,
{
loop {
if buffer.len() >= HEADER_DELIMITER.len()
&& buffer[(buffer.len() - HEADER_DELIMITER.len())..] == HEADER_DELIMITER[..]
{
return Ok(());
}
if reader.read_until(b'\n', buffer).await? == 0 {
return Err(anyhow!("cannot read LSP message headers"));
}
}
}
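`read_headers` above accumulates lines until the buffer ends with the `\r\n\r\n` delimiter. A synchronous sketch of the same loop over `std::io::BufRead` (the version in the diff is async over a smol `BufReader`):

```rust
use std::io::{BufRead, Result};

const HEADER_DELIMITER: &[u8; 4] = b"\r\n\r\n";

// Read header lines into `buffer` until it ends with "\r\n\r\n".
fn read_headers(reader: &mut impl BufRead, buffer: &mut Vec<u8>) -> Result<()> {
    loop {
        if buffer.len() >= HEADER_DELIMITER.len()
            && buffer[buffer.len() - HEADER_DELIMITER.len()..] == HEADER_DELIMITER[..]
        {
            return Ok(());
        }
        if reader.read_until(b'\n', buffer)? == 0 {
            // EOF before the delimiter: the stream is truncated.
            return Err(std::io::Error::new(
                std::io::ErrorKind::UnexpectedEof,
                "cannot read LSP message headers",
            ));
        }
    }
}

fn main() {
    let mut input: &[u8] = b"Content-Length: 123\r\n\r\n{\"id\":1}";
    let mut buffer = Vec::new();
    read_headers(&mut input, &mut buffer).unwrap();
    assert_eq!(buffer, b"Content-Length: 123\r\n\r\n");
}
```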
impl LspStdoutHandler {
pub fn new<Input>(
stdout: Input,
response_handlers: Arc<Mutex<Option<HashMap<RequestId, ResponseHandler>>>>,
io_handlers: Arc<Mutex<HashMap<i32, IoHandler>>>,
cx: BackgroundExecutor,
) -> Self
where
Input: AsyncRead + Unpin + Send + 'static,
{
let (tx, notifications_channel) = unbounded();
let loop_handle = cx.spawn(Self::handler(stdout, tx, response_handlers, io_handlers));
Self {
loop_handle,
notifications_channel,
}
}
async fn handler<Input>(
stdout: Input,
notifications_sender: UnboundedSender<AnyNotification>,
response_handlers: Arc<Mutex<Option<HashMap<RequestId, ResponseHandler>>>>,
io_handlers: Arc<Mutex<HashMap<i32, IoHandler>>>,
) -> anyhow::Result<()>
where
Input: AsyncRead + Unpin + Send + 'static,
{
let mut stdout = BufReader::new(stdout);
let mut buffer = Vec::new();
loop {
buffer.clear();
read_headers(&mut stdout, &mut buffer).await?;
let headers = std::str::from_utf8(&buffer)?;
let message_len = headers
.split('\n')
.find(|line| line.starts_with(CONTENT_LEN_HEADER))
.and_then(|line| line.strip_prefix(CONTENT_LEN_HEADER))
.ok_or_else(|| anyhow!("invalid LSP message header {headers:?}"))?
.trim_end()
.parse()?;
buffer.resize(message_len, 0);
stdout.read_exact(&mut buffer).await?;
if let Ok(message) = str::from_utf8(&buffer) {
log::trace!("incoming message: {message}");
for handler in io_handlers.lock().values_mut() {
handler(IoKind::StdOut, message);
}
}
if let Ok(msg) = serde_json::from_slice::<AnyNotification>(&buffer) {
notifications_sender.unbounded_send(msg)?;
} else if let Ok(AnyResponse {
id, error, result, ..
}) = serde_json::from_slice(&buffer)
{
let mut response_handlers = response_handlers.lock();
if let Some(handler) = response_handlers
.as_mut()
.and_then(|handlers| handlers.remove(&id))
{
drop(response_handlers);
if let Some(error) = error {
handler(Err(error));
} else if let Some(result) = result {
handler(Ok(result.get().into()));
} else {
handler(Ok("null".into()));
}
}
} else {
warn!(
"failed to deserialize LSP message:\n{}",
std::str::from_utf8(&buffer)?
);
}
}
}
}
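The handler loop above extracts the message length from the `Content-Length: ` header before reading the body. That parsing step in isolation (a sketch; `message_len` is a hypothetical helper, not part of the crate):

```rust
const CONTENT_LEN_HEADER: &str = "Content-Length: ";

// Find the Content-Length line, strip the prefix, and parse the digits.
fn message_len(headers: &str) -> Option<usize> {
    headers
        .split('\n')
        .find(|line| line.starts_with(CONTENT_LEN_HEADER))
        .and_then(|line| line.strip_prefix(CONTENT_LEN_HEADER))
        .and_then(|len| len.trim_end().parse().ok())
}

fn main() {
    let headers = "Content-Type: application/vscode-jsonrpc\r\nContent-Length: 1235\r\n\r\n";
    assert_eq!(message_len(headers), Some(1235));
    assert_eq!(message_len("X-Other: 1\r\n\r\n"), None);
}
```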
#[cfg(test)]
mod tests {
use super::*;
#[gpui::test]
async fn test_read_headers() {
let mut buf = Vec::new();
let mut reader = smol::io::BufReader::new(b"Content-Length: 123\r\n\r\n" as &[u8]);
read_headers(&mut reader, &mut buf).await.unwrap();
assert_eq!(buf, b"Content-Length: 123\r\n\r\n");
let mut buf = Vec::new();
let mut reader = smol::io::BufReader::new(b"Content-Type: application/vscode-jsonrpc\r\nContent-Length: 1235\r\n\r\n{\"somecontent\":123}" as &[u8]);
read_headers(&mut reader, &mut buf).await.unwrap();
assert_eq!(
buf,
b"Content-Type: application/vscode-jsonrpc\r\nContent-Length: 1235\r\n\r\n"
);
let mut buf = Vec::new();
let mut reader = smol::io::BufReader::new(b"Content-Length: 1235\r\nContent-Type: application/vscode-jsonrpc\r\n\r\n{\"somecontent\":true}" as &[u8]);
read_headers(&mut reader, &mut buf).await.unwrap();
assert_eq!(
buf,
b"Content-Length: 1235\r\nContent-Type: application/vscode-jsonrpc\r\n\r\n"
);
}
}

View File

@@ -1,4 +1,5 @@
use log::warn;
mod input_handler;
pub use lsp_types::request::*;
pub use lsp_types::*;
@@ -12,7 +13,7 @@ use serde::{de::DeserializeOwned, Deserialize, Serialize};
use serde_json::{json, value::RawValue, Value};
use smol::{
channel,
io::{AsyncBufReadExt, AsyncReadExt, AsyncWriteExt, BufReader},
io::{AsyncBufReadExt, AsyncWriteExt, BufReader},
process::{self, Child},
};
@@ -25,7 +26,6 @@ use std::{
io::Write,
path::PathBuf,
pin::Pin,
str::{self, FromStr as _},
sync::{
atomic::{AtomicI32, Ordering::SeqCst},
Arc, Weak,
@@ -36,13 +36,13 @@ use std::{
use std::{path::Path, process::Stdio};
use util::{ResultExt, TryFutureExt};
const HEADER_DELIMITER: &'static [u8; 4] = b"\r\n\r\n";
const JSON_RPC_VERSION: &str = "2.0";
const CONTENT_LEN_HEADER: &str = "Content-Length: ";
const LSP_REQUEST_TIMEOUT: Duration = Duration::from_secs(60 * 2);
const SERVER_SHUTDOWN_TIMEOUT: Duration = Duration::from_secs(5);
type NotificationHandler = Box<dyn Send + FnMut(Option<RequestId>, &str, AsyncAppContext)>;
type NotificationHandler = Box<dyn Send + FnMut(Option<RequestId>, Value, AsyncAppContext)>;
type ResponseHandler = Box<dyn Send + FnOnce(Result<String, Error>)>;
type IoHandler = Box<dyn Send + FnMut(IoKind, &str)>;
@@ -164,13 +164,12 @@ struct Notification<'a, T> {
/// Language server RPC notification message before it is deserialized into a concrete type.
#[derive(Debug, Clone, Deserialize)]
struct AnyNotification<'a> {
struct AnyNotification {
#[serde(default)]
id: Option<RequestId>,
#[serde(borrow)]
method: &'a str,
#[serde(borrow, default)]
params: Option<&'a RawValue>,
method: String,
#[serde(default)]
params: Option<Value>,
}
#[derive(Debug, Serialize, Deserialize)]
@@ -297,13 +296,7 @@ impl LanguageServer {
"Language server with id {} sent unhandled notification {}:\n{}",
server_id,
notification.method,
serde_json::to_string_pretty(
&notification
.params
.and_then(|params| Value::from_str(params.get()).ok())
.unwrap_or(Value::Null)
)
.unwrap(),
serde_json::to_string_pretty(&notification.params).unwrap(),
);
},
);
@@ -418,79 +411,36 @@ impl LanguageServer {
Stdout: AsyncRead + Unpin + Send + 'static,
F: FnMut(AnyNotification) + 'static + Send,
{
let mut stdout = BufReader::new(stdout);
use smol::stream::StreamExt;
let stdout = BufReader::new(stdout);
let _clear_response_handlers = util::defer({
let response_handlers = response_handlers.clone();
move || {
response_handlers.lock().take();
}
});
let mut buffer = Vec::new();
loop {
buffer.clear();
let mut input_handler = input_handler::LspStdoutHandler::new(
stdout,
response_handlers,
io_handlers,
cx.background_executor().clone(),
);
read_headers(&mut stdout, &mut buffer).await?;
let headers = std::str::from_utf8(&buffer)?;
let message_len = headers
.split('\n')
.find(|line| line.starts_with(CONTENT_LEN_HEADER))
.and_then(|line| line.strip_prefix(CONTENT_LEN_HEADER))
.ok_or_else(|| anyhow!("invalid LSP message header {headers:?}"))?
.trim_end()
.parse()?;
buffer.resize(message_len, 0);
stdout.read_exact(&mut buffer).await?;
if let Ok(message) = str::from_utf8(&buffer) {
log::trace!("incoming message: {message}");
for handler in io_handlers.lock().values_mut() {
handler(IoKind::StdOut, message);
}
}
if let Ok(msg) = serde_json::from_slice::<AnyNotification>(&buffer) {
while let Some(msg) = input_handler.notifications_channel.next().await {
{
let mut notification_handlers = notification_handlers.lock();
if let Some(handler) = notification_handlers.get_mut(msg.method) {
handler(
msg.id,
msg.params.map(|params| params.get()).unwrap_or("null"),
cx.clone(),
);
if let Some(handler) = notification_handlers.get_mut(msg.method.as_str()) {
handler(msg.id, msg.params.unwrap_or(Value::Null), cx.clone());
} else {
drop(notification_handlers);
on_unhandled_notification(msg);
}
} else if let Ok(AnyResponse {
id, error, result, ..
}) = serde_json::from_slice(&buffer)
{
let mut response_handlers = response_handlers.lock();
if let Some(handler) = response_handlers
.as_mut()
.and_then(|handlers| handlers.remove(&id))
{
drop(response_handlers);
if let Some(error) = error {
handler(Err(error));
} else if let Some(result) = result {
handler(Ok(result.get().into()));
} else {
handler(Ok("null".into()));
}
}
} else {
warn!(
"failed to deserialize LSP message:\n{}",
std::str::from_utf8(&buffer)?
);
}
// Don't starve the main thread when receiving lots of messages at once.
// Don't starve the main thread when receiving lots of notifications at once.
smol::future::yield_now().await;
}
input_handler.loop_handle.await
}
async fn handle_stderr<Stderr>(
@@ -512,7 +462,7 @@ impl LanguageServer {
return Ok(());
}
if let Ok(message) = str::from_utf8(&buffer) {
if let Ok(message) = std::str::from_utf8(&buffer) {
log::trace!("incoming stderr message:{message}");
for handler in io_handlers.lock().values_mut() {
handler(IoKind::StdErr, message);
@@ -850,7 +800,7 @@ impl LanguageServer {
let prev_handler = self.notification_handlers.lock().insert(
method,
Box::new(move |_, params, cx| {
if let Some(params) = serde_json::from_str(params).log_err() {
if let Some(params) = serde_json::from_value(params).log_err() {
f(params, cx);
}
}),
@@ -878,7 +828,7 @@ impl LanguageServer {
method,
Box::new(move |id, params, cx| {
if let Some(id) = id {
match serde_json::from_str(params) {
match serde_json::from_value(params) {
Ok(params) => {
let response = f(params, cx.clone());
cx.foreground_executor()
@@ -910,12 +860,7 @@ impl LanguageServer {
}
Err(error) => {
log::error!(
"error deserializing {} request: {:?}, message: {:?}",
method,
error,
params
);
log::error!("error deserializing {} request: {:?}", method, error);
let response = AnyResponse {
jsonrpc: JSON_RPC_VERSION,
id,
@@ -1202,10 +1147,7 @@ impl FakeLanguageServer {
notifications_tx
.try_send((
msg.method.to_string(),
msg.params
.map(|raw_value| raw_value.get())
.unwrap_or("null")
.to_string(),
msg.params.unwrap_or(Value::Null).to_string(),
))
.ok();
},
@@ -1372,30 +1314,11 @@ impl FakeLanguageServer {
}
}
pub(self) async fn read_headers<Stdout>(
reader: &mut BufReader<Stdout>,
buffer: &mut Vec<u8>,
) -> Result<()>
where
Stdout: AsyncRead + Unpin + Send + 'static,
{
loop {
if buffer.len() >= HEADER_DELIMITER.len()
&& buffer[(buffer.len() - HEADER_DELIMITER.len())..] == HEADER_DELIMITER[..]
{
return Ok(());
}
if reader.read_until(b'\n', buffer).await? == 0 {
return Err(anyhow!("cannot read LSP message headers"));
}
}
}
#[cfg(test)]
mod tests {
use super::*;
use gpui::TestAppContext;
use std::str::FromStr;
#[ctor::ctor]
fn init_logger() {
@@ -1475,30 +1398,6 @@ mod tests {
fake.receive_notification::<notification::Exit>().await;
}
#[gpui::test]
async fn test_read_headers() {
let mut buf = Vec::new();
let mut reader = smol::io::BufReader::new(b"Content-Length: 123\r\n\r\n" as &[u8]);
read_headers(&mut reader, &mut buf).await.unwrap();
assert_eq!(buf, b"Content-Length: 123\r\n\r\n");
let mut buf = Vec::new();
let mut reader = smol::io::BufReader::new(b"Content-Type: application/vscode-jsonrpc\r\nContent-Length: 1235\r\n\r\n{\"somecontent\":123}" as &[u8]);
read_headers(&mut reader, &mut buf).await.unwrap();
assert_eq!(
buf,
b"Content-Type: application/vscode-jsonrpc\r\nContent-Length: 1235\r\n\r\n"
);
let mut buf = Vec::new();
let mut reader = smol::io::BufReader::new(b"Content-Length: 1235\r\nContent-Type: application/vscode-jsonrpc\r\n\r\n{\"somecontent\":true}" as &[u8]);
read_headers(&mut reader, &mut buf).await.unwrap();
assert_eq!(
buf,
b"Content-Length: 1235\r\nContent-Type: application/vscode-jsonrpc\r\n\r\n"
);
}
#[gpui::test]
fn test_deserialize_string_digit_id() {
let json = r#"{"jsonrpc":"2.0","id":"2","method":"workspace/configuration","params":{"items":[{"scopeUri":"file:///Users/mph/Devel/personal/hello-scala/","section":"metals"}]}}"#;


@@ -3740,6 +3740,62 @@ impl MultiBufferSnapshot {
}
}
/// Returns the excerpts overlapping the given ranges. If a range spans multiple excerpts, one range is returned for each excerpt.
pub fn excerpts_in_ranges(
&self,
ranges: impl IntoIterator<Item = Range<Anchor>>,
) -> impl Iterator<Item = (ExcerptId, &BufferSnapshot, Range<usize>)> {
let mut ranges = ranges.into_iter().map(|range| range.to_offset(self));
let mut cursor = self.excerpts.cursor::<usize>();
let mut next_range = move |cursor: &mut Cursor<Excerpt, usize>| {
let range = ranges.next();
if let Some(range) = range.as_ref() {
cursor.seek_forward(&range.start, Bias::Right, &());
}
range
};
let mut range = next_range(&mut cursor);
iter::from_fn(move || {
if range.is_none() {
return None;
}
if range.as_ref().unwrap().is_empty() || *cursor.start() >= range.as_ref().unwrap().end
{
range = next_range(&mut cursor);
if range.is_none() {
return None;
}
}
cursor.item().map(|excerpt| {
let multibuffer_excerpt = MultiBufferExcerpt::new(&excerpt, *cursor.start());
let multibuffer_excerpt_range = multibuffer_excerpt
.map_range_from_buffer(excerpt.range.context.to_offset(&excerpt.buffer));
let overlap_range = cmp::max(
range.as_ref().unwrap().start,
multibuffer_excerpt_range.start,
)
..cmp::min(range.as_ref().unwrap().end, multibuffer_excerpt_range.end);
let overlap_range = multibuffer_excerpt.map_range_to_buffer(overlap_range);
if multibuffer_excerpt_range.end <= range.as_ref().unwrap().end {
cursor.next(&());
} else {
range = next_range(&mut cursor);
}
(excerpt.id, &excerpt.buffer, overlap_range)
})
})
}
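The core of the overlap computation above is clamping each query range to the excerpt's range in multibuffer coordinates: the overlap runs from the larger of the two starts to the smaller of the two ends. A standalone sketch of just that step (plain `Range<usize>` values, not the multibuffer types):

```rust
use std::cmp;
use std::ops::Range;

// Clamp a query range to an excerpt's range in a shared coordinate space,
// returning the overlapping portion, or None if the ranges are disjoint.
fn overlap(excerpt: &Range<usize>, query: &Range<usize>) -> Option<Range<usize>> {
    let start = cmp::max(excerpt.start, query.start);
    let end = cmp::min(excerpt.end, query.end);
    (start < end).then(|| start..end)
}

fn main() {
    // A query that crosses the excerpt boundary keeps only the part inside it.
    assert_eq!(overlap(&(0..10), &(5..20)), Some(5..10));
    // A query entirely inside the excerpt is returned unchanged.
    assert_eq!(overlap(&(0..10), &(3..7)), Some(3..7));
    // Disjoint ranges yield no overlap.
    assert_eq!(overlap(&(0..10), &(15..20)), None);
}
```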
pub fn remote_selections_in_range<'a>(
&'a self,
range: &'a Range<Anchor>,
@@ -6076,4 +6132,415 @@ mod tests {
assert_eq!(multibuffer.read(cx).text(), "XABCD1234\nAB5678");
});
}
#[gpui::test]
fn test_excerpts_in_ranges_no_ranges(cx: &mut AppContext) {
let buffer_1 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'a'), cx));
let buffer_2 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'g'), cx));
let multibuffer = cx.new_model(|_| MultiBuffer::new(0, Capability::ReadWrite));
multibuffer.update(cx, |multibuffer, cx| {
multibuffer.push_excerpts(
buffer_1.clone(),
[ExcerptRange {
context: 0..buffer_1.read(cx).len(),
primary: None,
}],
cx,
);
multibuffer.push_excerpts(
buffer_2.clone(),
[ExcerptRange {
context: 0..buffer_2.read(cx).len(),
primary: None,
}],
cx,
);
});
let snapshot = multibuffer.update(cx, |multibuffer, cx| multibuffer.snapshot(cx));
let mut excerpts = snapshot.excerpts_in_ranges(iter::from_fn(|| None));
assert!(excerpts.next().is_none());
}
fn validate_excerpts(
actual: &Vec<(ExcerptId, BufferId, Range<Anchor>)>,
expected: &Vec<(ExcerptId, BufferId, Range<Anchor>)>,
) {
assert_eq!(actual.len(), expected.len());
    for (actual, expected) in actual.iter().zip(expected) {
        assert_eq!(actual.0, expected.0);
        assert_eq!(actual.1, expected.1);
        assert_eq!(actual.2.start, expected.2.start);
        assert_eq!(actual.2.end, expected.2.end);
    }
}
fn map_range_from_excerpt(
snapshot: &MultiBufferSnapshot,
excerpt_id: ExcerptId,
excerpt_buffer: &BufferSnapshot,
range: Range<usize>,
) -> Range<Anchor> {
snapshot
.anchor_in_excerpt(excerpt_id, excerpt_buffer.anchor_before(range.start))
.unwrap()
..snapshot
.anchor_in_excerpt(excerpt_id, excerpt_buffer.anchor_after(range.end))
.unwrap()
}
fn make_expected_excerpt_info(
snapshot: &MultiBufferSnapshot,
cx: &mut AppContext,
excerpt_id: ExcerptId,
buffer: &Model<Buffer>,
range: Range<usize>,
) -> (ExcerptId, BufferId, Range<Anchor>) {
(
excerpt_id,
buffer.read(cx).remote_id(),
map_range_from_excerpt(&snapshot, excerpt_id, &buffer.read(cx).snapshot(), range),
)
}
#[gpui::test]
fn test_excerpts_in_ranges_range_inside_the_excerpt(cx: &mut AppContext) {
let buffer_1 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'a'), cx));
let buffer_2 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'g'), cx));
let buffer_len = buffer_1.read(cx).len();
let multibuffer = cx.new_model(|_| MultiBuffer::new(0, Capability::ReadWrite));
let mut expected_excerpt_id = ExcerptId(0);
multibuffer.update(cx, |multibuffer, cx| {
expected_excerpt_id = multibuffer.push_excerpts(
buffer_1.clone(),
[ExcerptRange {
context: 0..buffer_1.read(cx).len(),
primary: None,
}],
cx,
)[0];
multibuffer.push_excerpts(
buffer_2.clone(),
[ExcerptRange {
context: 0..buffer_2.read(cx).len(),
primary: None,
}],
cx,
);
});
let snapshot = multibuffer.update(cx, |multibuffer, cx| multibuffer.snapshot(cx));
let range = snapshot
.anchor_in_excerpt(expected_excerpt_id, buffer_1.read(cx).anchor_before(1))
.unwrap()
..snapshot
.anchor_in_excerpt(
expected_excerpt_id,
buffer_1.read(cx).anchor_after(buffer_len / 2),
)
.unwrap();
let expected_excerpts = vec![make_expected_excerpt_info(
&snapshot,
cx,
expected_excerpt_id,
&buffer_1,
1..(buffer_len / 2),
)];
let excerpts = snapshot
.excerpts_in_ranges(vec![range.clone()].into_iter())
.map(|(excerpt_id, buffer, actual_range)| {
(
excerpt_id,
buffer.remote_id(),
map_range_from_excerpt(&snapshot, excerpt_id, buffer, actual_range),
)
})
.collect_vec();
validate_excerpts(&excerpts, &expected_excerpts);
}
#[gpui::test]
fn test_excerpts_in_ranges_range_crosses_excerpts_boundary(cx: &mut AppContext) {
let buffer_1 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'a'), cx));
let buffer_2 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'g'), cx));
let buffer_len = buffer_1.read(cx).len();
let multibuffer = cx.new_model(|_| MultiBuffer::new(0, Capability::ReadWrite));
let mut excerpt_1_id = ExcerptId(0);
let mut excerpt_2_id = ExcerptId(0);
multibuffer.update(cx, |multibuffer, cx| {
excerpt_1_id = multibuffer.push_excerpts(
buffer_1.clone(),
[ExcerptRange {
context: 0..buffer_1.read(cx).len(),
primary: None,
}],
cx,
)[0];
excerpt_2_id = multibuffer.push_excerpts(
buffer_2.clone(),
[ExcerptRange {
context: 0..buffer_2.read(cx).len(),
primary: None,
}],
cx,
)[0];
});
let snapshot = multibuffer.read(cx).snapshot(cx);
let expected_range = snapshot
.anchor_in_excerpt(
excerpt_1_id,
buffer_1.read(cx).anchor_before(buffer_len / 2),
)
.unwrap()
..snapshot
.anchor_in_excerpt(excerpt_2_id, buffer_2.read(cx).anchor_after(buffer_len / 2))
.unwrap();
let expected_excerpts = vec![
make_expected_excerpt_info(
&snapshot,
cx,
excerpt_1_id,
&buffer_1,
(buffer_len / 2)..buffer_len,
),
make_expected_excerpt_info(&snapshot, cx, excerpt_2_id, &buffer_2, 0..buffer_len / 2),
];
let excerpts = snapshot
.excerpts_in_ranges(vec![expected_range.clone()].into_iter())
.map(|(excerpt_id, buffer, actual_range)| {
(
excerpt_id,
buffer.remote_id(),
map_range_from_excerpt(&snapshot, excerpt_id, buffer, actual_range),
)
})
.collect_vec();
validate_excerpts(&excerpts, &expected_excerpts);
}
#[gpui::test]
fn test_excerpts_in_ranges_range_encloses_excerpt(cx: &mut AppContext) {
let buffer_1 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'a'), cx));
let buffer_2 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'g'), cx));
let buffer_3 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'r'), cx));
let buffer_len = buffer_1.read(cx).len();
let multibuffer = cx.new_model(|_| MultiBuffer::new(0, Capability::ReadWrite));
let mut excerpt_1_id = ExcerptId(0);
let mut excerpt_2_id = ExcerptId(0);
let mut excerpt_3_id = ExcerptId(0);
multibuffer.update(cx, |multibuffer, cx| {
excerpt_1_id = multibuffer.push_excerpts(
buffer_1.clone(),
[ExcerptRange {
context: 0..buffer_1.read(cx).len(),
primary: None,
}],
cx,
)[0];
excerpt_2_id = multibuffer.push_excerpts(
buffer_2.clone(),
[ExcerptRange {
context: 0..buffer_2.read(cx).len(),
primary: None,
}],
cx,
)[0];
excerpt_3_id = multibuffer.push_excerpts(
buffer_3.clone(),
[ExcerptRange {
context: 0..buffer_3.read(cx).len(),
primary: None,
}],
cx,
)[0];
});
let snapshot = multibuffer.read(cx).snapshot(cx);
let expected_range = snapshot
.anchor_in_excerpt(
excerpt_1_id,
buffer_1.read(cx).anchor_before(buffer_len / 2),
)
.unwrap()
..snapshot
.anchor_in_excerpt(excerpt_3_id, buffer_3.read(cx).anchor_after(buffer_len / 2))
.unwrap();
let expected_excerpts = vec![
make_expected_excerpt_info(
&snapshot,
cx,
excerpt_1_id,
&buffer_1,
(buffer_len / 2)..buffer_len,
),
make_expected_excerpt_info(&snapshot, cx, excerpt_2_id, &buffer_2, 0..buffer_len),
make_expected_excerpt_info(&snapshot, cx, excerpt_3_id, &buffer_3, 0..buffer_len / 2),
];
let excerpts = snapshot
.excerpts_in_ranges(vec![expected_range.clone()].into_iter())
.map(|(excerpt_id, buffer, actual_range)| {
(
excerpt_id,
buffer.remote_id(),
map_range_from_excerpt(&snapshot, excerpt_id, buffer, actual_range),
)
})
.collect_vec();
validate_excerpts(&excerpts, &expected_excerpts);
}
#[gpui::test]
fn test_excerpts_in_ranges_multiple_ranges(cx: &mut AppContext) {
let buffer_1 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'a'), cx));
let buffer_2 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'g'), cx));
let buffer_len = buffer_1.read(cx).len();
let multibuffer = cx.new_model(|_| MultiBuffer::new(0, Capability::ReadWrite));
let mut excerpt_1_id = ExcerptId(0);
let mut excerpt_2_id = ExcerptId(0);
multibuffer.update(cx, |multibuffer, cx| {
excerpt_1_id = multibuffer.push_excerpts(
buffer_1.clone(),
[ExcerptRange {
context: 0..buffer_1.read(cx).len(),
primary: None,
}],
cx,
)[0];
excerpt_2_id = multibuffer.push_excerpts(
buffer_2.clone(),
[ExcerptRange {
context: 0..buffer_2.read(cx).len(),
primary: None,
}],
cx,
)[0];
});
let snapshot = multibuffer.read(cx).snapshot(cx);
let ranges = vec![
1..(buffer_len / 4),
(buffer_len / 3)..(buffer_len / 2),
(buffer_len / 4 * 3)..(buffer_len),
];
let expected_excerpts = ranges
.iter()
.map(|range| {
make_expected_excerpt_info(&snapshot, cx, excerpt_1_id, &buffer_1, range.clone())
})
.collect_vec();
let ranges = ranges.into_iter().map(|range| {
map_range_from_excerpt(
&snapshot,
excerpt_1_id,
&buffer_1.read(cx).snapshot(),
range,
)
});
let excerpts = snapshot
.excerpts_in_ranges(ranges)
.map(|(excerpt_id, buffer, actual_range)| {
(
excerpt_id,
buffer.remote_id(),
map_range_from_excerpt(&snapshot, excerpt_id, buffer, actual_range),
)
})
.collect_vec();
validate_excerpts(&excerpts, &expected_excerpts);
}
#[gpui::test]
fn test_excerpts_in_ranges_range_ends_at_excerpt_end(cx: &mut AppContext) {
let buffer_1 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'a'), cx));
let buffer_2 = cx.new_model(|cx| Buffer::local(sample_text(6, 6, 'g'), cx));
let buffer_len = buffer_1.read(cx).len();
let multibuffer = cx.new_model(|_| MultiBuffer::new(0, Capability::ReadWrite));
let mut excerpt_1_id = ExcerptId(0);
let mut excerpt_2_id = ExcerptId(0);
multibuffer.update(cx, |multibuffer, cx| {
excerpt_1_id = multibuffer.push_excerpts(
buffer_1.clone(),
[ExcerptRange {
context: 0..buffer_1.read(cx).len(),
primary: None,
}],
cx,
)[0];
excerpt_2_id = multibuffer.push_excerpts(
buffer_2.clone(),
[ExcerptRange {
context: 0..buffer_2.read(cx).len(),
primary: None,
}],
cx,
)[0];
});
let snapshot = multibuffer.read(cx).snapshot(cx);
let ranges = [0..buffer_len, (buffer_len / 3)..(buffer_len / 2)];
let expected_excerpts = vec![
make_expected_excerpt_info(&snapshot, cx, excerpt_1_id, &buffer_1, ranges[0].clone()),
make_expected_excerpt_info(&snapshot, cx, excerpt_2_id, &buffer_2, ranges[1].clone()),
];
let ranges = [
map_range_from_excerpt(
&snapshot,
excerpt_1_id,
&buffer_1.read(cx).snapshot(),
ranges[0].clone(),
),
map_range_from_excerpt(
&snapshot,
excerpt_2_id,
&buffer_2.read(cx).snapshot(),
ranges[1].clone(),
),
];
let excerpts = snapshot
.excerpts_in_ranges(ranges.into_iter())
.map(|(excerpt_id, buffer, actual_range)| {
(
excerpt_id,
buffer.remote_id(),
map_range_from_excerpt(&snapshot, excerpt_id, buffer, actual_range),
)
})
.collect_vec();
validate_excerpts(&excerpts, &expected_excerpts);
}
}


@@ -103,6 +103,19 @@ pub trait PickerDelegate: Sized + 'static {
None
}
fn render_editor(&self, editor: &View<Editor>, _cx: &mut ViewContext<Picker<Self>>) -> Div {
v_flex()
.child(
h_flex()
.overflow_hidden()
.flex_none()
.h_9()
.px_4()
.child(editor.clone()),
)
.child(Divider::horizontal())
}
fn render_match(
&self,
ix: usize,
@@ -552,16 +565,7 @@ impl<D: PickerDelegate> Render for Picker<D> {
.on_action(cx.listener(Self::use_selected_query))
.on_action(cx.listener(Self::confirm_input))
.child(match &self.head {
Head::Editor(editor) => v_flex()
.child(
h_flex()
.overflow_hidden()
.flex_none()
.h_9()
.px_4()
.child(editor.clone()),
)
.child(Divider::horizontal()),
Head::Editor(editor) => self.delegate.render_editor(&editor.clone(), cx),
Head::Empty(empty_head) => div().child(empty_head.clone()),
})
.when(self.delegate.match_count() > 0, |el| {


@@ -44,7 +44,7 @@ use language::{
markdown, point_to_lsp, prepare_completion_documentation,
proto::{
deserialize_anchor, deserialize_line_ending, deserialize_version, serialize_anchor,
serialize_version, split_operations,
serialize_line_ending, serialize_version, split_operations,
},
range_from_lsp, Bias, Buffer, BufferSnapshot, CachedLspAdapter, Capability, CodeLabel,
ContextProvider, Diagnostic, DiagnosticEntry, DiagnosticSet, Diff, Documentation,
@@ -121,9 +121,9 @@ pub use task_inventory::{
BasicContextProvider, ContextProviderWithTasks, Inventory, TaskSourceKind,
};
pub use worktree::{
DiagnosticSummary, Entry, EntryKind, File, LocalWorktree, PathChange, ProjectEntryId,
RepositoryEntry, UpdatedEntriesSet, UpdatedGitRepositoriesSet, Worktree, WorktreeId,
WorktreeSettings, FS_WATCH_LATENCY,
Entry, EntryKind, File, LocalWorktree, PathChange, ProjectEntryId, RepositoryEntry,
UpdatedEntriesSet, UpdatedGitRepositoriesSet, Worktree, WorktreeId, WorktreeSettings,
FS_WATCH_LATENCY,
};
const MAX_SERVER_REINSTALL_ATTEMPT_COUNT: u64 = 4;
@@ -180,6 +180,18 @@ pub struct Project {
next_entry_id: Arc<AtomicUsize>,
join_project_response_message_id: u32,
next_diagnostic_group_id: usize,
diagnostic_summaries:
HashMap<WorktreeId, HashMap<Arc<Path>, HashMap<LanguageServerId, DiagnosticSummary>>>,
diagnostics: HashMap<
WorktreeId,
HashMap<
Arc<Path>,
Vec<(
LanguageServerId,
Vec<DiagnosticEntry<Unclipped<PointUtf16>>>,
)>,
>,
>,
user_store: Model<UserStore>,
fs: Arc<dyn Fs>,
client_state: ProjectClientState,
@@ -749,6 +761,8 @@ impl Project {
fs,
next_entry_id: Default::default(),
next_diagnostic_group_id: Default::default(),
diagnostics: Default::default(),
diagnostic_summaries: Default::default(),
supplementary_language_servers: HashMap::default(),
language_servers: Default::default(),
language_server_ids: HashMap::default(),
@@ -842,8 +856,7 @@ impl Project {
// That's because Worktree's identifier is entity id, which should probably be changed.
let mut worktrees = Vec::new();
for worktree in response.payload.worktrees {
let worktree =
Worktree::remote(remote_id, replica_id, worktree, client.clone(), cx);
let worktree = Worktree::remote(replica_id, worktree, cx);
worktrees.push(worktree);
}
@@ -873,6 +886,8 @@ impl Project {
fs,
next_entry_id: Default::default(),
next_diagnostic_group_id: Default::default(),
diagnostic_summaries: Default::default(),
diagnostics: Default::default(),
client_subscriptions: Default::default(),
_subscriptions: vec![
cx.on_release(Self::release),
@@ -1733,6 +1748,7 @@ impl Project {
let worktrees = this.update(&mut cx, |this, _cx| {
this.worktrees().collect::<Vec<_>>()
})?;
let update_project = this
.update(&mut cx, |this, cx| {
this.client.request(proto::UpdateProject {
@@ -1741,14 +1757,49 @@ impl Project {
})
})?
.await;
if update_project.log_err().is_some() {
if update_project.log_err().is_none() {
continue;
}
this.update(&mut cx, |this, cx| {
for worktree in worktrees {
worktree.update(&mut cx, |worktree, cx| {
let worktree = worktree.as_local_mut().unwrap();
worktree.share(project_id, cx).detach_and_log_err(cx)
worktree.update(cx, |worktree, cx| {
if let Some(summaries) =
this.diagnostic_summaries.get(&worktree.id())
{
for (path, summaries) in summaries {
for (&server_id, summary) in summaries {
this.client.send(
proto::UpdateDiagnosticSummary {
project_id,
worktree_id: cx.entity_id().as_u64(),
summary: Some(
summary.to_proto(server_id, path),
),
},
)?;
}
}
}
worktree.as_local_mut().unwrap().observe_updates(
project_id,
cx,
{
let client = client.clone();
move |update| {
client
.request(update)
.map(|result| result.is_ok())
}
},
);
anyhow::Ok(())
})?;
}
}
anyhow::Ok(())
})??;
}
LocalProjectUpdate::CreateBufferForPeer { peer_id, buffer_id } => {
let buffer = this.update(&mut cx, |this, _| {
@@ -1893,7 +1944,7 @@ impl Project {
for worktree_handle in self.worktrees.iter_mut() {
if let WorktreeHandle::Strong(worktree) = worktree_handle {
let is_visible = worktree.update(cx, |worktree, _| {
worktree.as_local_mut().unwrap().unshare();
worktree.as_local_mut().unwrap().stop_observing_updates();
worktree.is_visible()
});
if !is_visible {
@@ -2177,23 +2228,47 @@ impl Project {
) -> Task<Result<Model<Buffer>>> {
let load_buffer = worktree.update(cx, |worktree, cx| {
let worktree = worktree.as_local_mut().unwrap();
worktree.load_buffer(&path, cx)
let file = worktree.load_file(path.as_ref(), cx);
let reservation = cx.reserve_model();
let buffer_id = BufferId::from(reservation.entity_id().as_non_zero_u64());
cx.spawn(move |_, mut cx| async move {
let (file, contents, diff_base) = file.await?;
let text_buffer = cx
.background_executor()
.spawn(async move { text::Buffer::new(0, buffer_id, contents) })
.await;
cx.insert_model(reservation, |_| {
Buffer::build(
text_buffer,
diff_base,
Some(Arc::new(file)),
Capability::ReadWrite,
)
})
})
});
fn is_not_found_error(error: &anyhow::Error) -> bool {
error
.root_cause()
.downcast_ref::<io::Error>()
.is_some_and(|err| err.kind() == io::ErrorKind::NotFound)
}
cx.spawn(move |this, mut cx| async move {
let buffer = match load_buffer.await {
Ok(buffer) => Ok(buffer),
Err(error) if is_not_found_error(&error) => {
worktree.update(&mut cx, |worktree, cx| {
let worktree = worktree.as_local_mut().unwrap();
worktree.new_buffer(path, cx)
})
}
Err(error) if is_not_found_error(&error) => cx.new_model(|cx| {
let buffer_id = BufferId::from(cx.entity_id().as_non_zero_u64());
let text_buffer = text::Buffer::new(0, buffer_id, "".into());
Buffer::build(
text_buffer,
None,
Some(Arc::new(File {
worktree,
path,
mtime: None,
entry_id: None,
is_local: true,
is_deleted: false,
is_private: false,
})),
Capability::ReadWrite,
)
}),
Err(e) => Err(e),
}?;
this.update(&mut cx, |this, cx| this.register_buffer(&buffer, cx))??;
@@ -2321,8 +2396,8 @@ impl Project {
let worktree = file.worktree.clone();
let path = file.path.clone();
worktree.update(cx, |worktree, cx| match worktree {
Worktree::Local(worktree) => worktree.save_buffer(buffer, path, false, cx),
Worktree::Remote(worktree) => worktree.save_buffer(buffer, None, cx),
Worktree::Local(worktree) => self.save_local_buffer(&worktree, buffer, path, false, cx),
Worktree::Remote(_) => self.save_remote_buffer(buffer, None, cx),
})
}
@@ -2340,21 +2415,20 @@ impl Project {
};
cx.spawn(move |this, mut cx| async move {
if let Some(old_file) = &old_file {
this.update(&mut cx, |this, cx| {
this.update(&mut cx, |this, cx| {
if let Some(old_file) = &old_file {
this.unregister_buffer_from_language_servers(&buffer, old_file, cx);
})?;
}
worktree
.update(&mut cx, |worktree, cx| match worktree {
}
worktree.update(cx, |worktree, cx| match worktree {
Worktree::Local(worktree) => {
worktree.save_buffer(buffer.clone(), path.path, true, cx)
this.save_local_buffer(worktree, buffer.clone(), path.path, true, cx)
}
Worktree::Remote(worktree) => {
worktree.save_buffer(buffer.clone(), Some(path.to_proto()), cx)
Worktree::Remote(_) => {
this.save_remote_buffer(buffer.clone(), Some(path.to_proto()), cx)
}
})?
.await?;
})
})?
.await?;
this.update(&mut cx, |this, cx| {
this.detect_language_for_buffer(&buffer, cx);
@@ -2364,6 +2438,129 @@ impl Project {
})
}
pub fn save_local_buffer(
&self,
worktree: &LocalWorktree,
buffer_handle: Model<Buffer>,
path: Arc<Path>,
mut has_changed_file: bool,
cx: &mut ModelContext<Worktree>,
) -> Task<Result<()>> {
let buffer = buffer_handle.read(cx);
let rpc = self.client.clone();
let buffer_id: u64 = buffer.remote_id().into();
let project_id = self.remote_id();
if buffer.file().is_some_and(|file| !file.is_created()) {
has_changed_file = true;
}
let text = buffer.as_rope().clone();
let version = buffer.version();
let save = worktree.write_file(path.as_ref(), text, buffer.line_ending(), cx);
let fs = Arc::clone(&self.fs);
let abs_path = worktree.absolutize(&path);
let is_private = worktree.is_path_private(&path);
cx.spawn(move |this, mut cx| async move {
let entry = save.await?;
let abs_path = abs_path?;
let this = this.upgrade().context("worktree dropped")?;
let (entry_id, mtime, path, is_private) = match entry {
Some(entry) => (Some(entry.id), entry.mtime, entry.path, entry.is_private),
None => {
let metadata = fs
.metadata(&abs_path)
.await
.with_context(|| {
format!(
"Fetching metadata after saving the excluded buffer {abs_path:?}"
)
})?
.with_context(|| {
format!("Excluded buffer {path:?} got removed during saving")
})?;
(None, Some(metadata.mtime), path, is_private)
}
};
if has_changed_file {
let new_file = Arc::new(File {
entry_id,
worktree: this,
path,
mtime,
is_local: true,
is_deleted: false,
is_private,
});
if let Some(project_id) = project_id {
rpc.send(proto::UpdateBufferFile {
project_id,
buffer_id,
file: Some(new_file.to_proto()),
})
.log_err();
}
buffer_handle.update(&mut cx, |buffer, cx| {
if has_changed_file {
buffer.file_updated(new_file, cx);
}
})?;
}
if let Some(project_id) = project_id {
rpc.send(proto::BufferSaved {
project_id,
buffer_id,
version: serialize_version(&version),
mtime: mtime.map(|time| time.into()),
})?;
}
buffer_handle.update(&mut cx, |buffer, cx| {
buffer.did_save(version.clone(), mtime, cx);
})?;
Ok(())
})
}
pub fn save_remote_buffer(
&self,
buffer_handle: Model<Buffer>,
new_path: Option<proto::ProjectPath>,
cx: &mut ModelContext<Worktree>,
) -> Task<Result<()>> {
let buffer = buffer_handle.read(cx);
let buffer_id = buffer.remote_id().into();
let version = buffer.version();
let rpc = self.client.clone();
let project_id = self.remote_id();
cx.spawn(move |_, mut cx| async move {
let response = rpc
.request(proto::SaveBuffer {
project_id: project_id.ok_or_else(|| anyhow!("project_id is not set"))?,
buffer_id,
new_path,
version: serialize_version(&version),
})
.await?;
let version = deserialize_version(&response.version);
let mtime = response.mtime.map(|mtime| mtime.into());
buffer_handle.update(&mut cx, |buffer, cx| {
buffer.did_save(version.clone(), mtime, cx);
})?;
Ok(())
})
}
pub fn get_open_buffer(
&mut self,
path: &ProjectPath,
@@ -2489,8 +2686,10 @@ impl Project {
let language = buffer.language().cloned();
let worktree_id = file.worktree_id(cx);
if let Some(local_worktree) = file.worktree.read(cx).as_local() {
for (server_id, diagnostics) in local_worktree.diagnostics_for_path(file.path()) {
if let Some(diagnostics) = self.diagnostics.get(&worktree_id) {
for (server_id, diagnostics) in
diagnostics.get(file.path()).cloned().unwrap_or_default()
{
self.update_buffer_diagnostics(buffer_handle, server_id, None, diagnostics, cx)
.log_err();
}
@@ -2729,6 +2928,23 @@ impl Project {
.ok();
}
BufferEvent::Reloaded => {
if self.is_local() {
if let Some(project_id) = self.remote_id() {
let buffer = buffer.read(cx);
self.client
.send(proto::BufferReloaded {
project_id,
buffer_id: buffer.remote_id().to_proto(),
version: serialize_version(&buffer.version()),
mtime: buffer.saved_mtime().map(|t| t.into()),
line_ending: serialize_line_ending(buffer.line_ending()) as i32,
})
.log_err();
}
}
}
BufferEvent::Edited { .. } => {
let buffer = buffer.read(cx);
let file = File::from_dyn(buffer.file())?;
@@ -3929,14 +4145,43 @@ impl Project {
});
}
}
for worktree in &self.worktrees {
if let Some(worktree) = worktree.upgrade() {
worktree.update(cx, |worktree, cx| {
if let Some(worktree) = worktree.as_local_mut() {
worktree.clear_diagnostics_for_language_server(server_id, cx);
let project_id = self.remote_id();
for (worktree_id, summaries) in self.diagnostic_summaries.iter_mut() {
summaries.retain(|path, summaries_by_server_id| {
if summaries_by_server_id.remove(&server_id).is_some() {
if let Some(project_id) = project_id {
self.client
.send(proto::UpdateDiagnosticSummary {
project_id,
worktree_id: worktree_id.to_proto(),
summary: Some(proto::DiagnosticSummary {
path: path.to_string_lossy().to_string(),
language_server_id: server_id.0 as u64,
error_count: 0,
warning_count: 0,
}),
})
.log_err();
}
});
}
!summaries_by_server_id.is_empty()
} else {
true
}
});
}
for diagnostics in self.diagnostics.values_mut() {
diagnostics.retain(|_, diagnostics_by_server_id| {
if let Ok(ix) =
diagnostics_by_server_id.binary_search_by_key(&server_id, |e| e.0)
{
diagnostics_by_server_id.remove(ix);
!diagnostics_by_server_id.is_empty()
} else {
true
}
});
}
self.language_server_watched_paths.remove(&server_id);
@@ -4673,10 +4918,13 @@ impl Project {
}
let updated = worktree.update(cx, |worktree, cx| {
worktree
.as_local_mut()
.ok_or_else(|| anyhow!("not a local worktree"))?
.update_diagnostics(server_id, project_path.path.clone(), diagnostics, cx)
self.update_worktree_diagnostics(
worktree.id(),
server_id,
project_path.path.clone(),
diagnostics,
cx,
)
})?;
if updated {
cx.emit(Event::DiagnosticsUpdated {
@@ -4687,6 +4935,67 @@ impl Project {
Ok(())
}
pub fn update_worktree_diagnostics(
&mut self,
worktree_id: WorktreeId,
server_id: LanguageServerId,
worktree_path: Arc<Path>,
diagnostics: Vec<DiagnosticEntry<Unclipped<PointUtf16>>>,
_: &mut ModelContext<Worktree>,
) -> Result<bool> {
let summaries_for_tree = self.diagnostic_summaries.entry(worktree_id).or_default();
let diagnostics_for_tree = self.diagnostics.entry(worktree_id).or_default();
let summaries_by_server_id = summaries_for_tree.entry(worktree_path.clone()).or_default();
let old_summary = summaries_by_server_id
.remove(&server_id)
.unwrap_or_default();
let new_summary = DiagnosticSummary::new(&diagnostics);
if new_summary.is_empty() {
if let Some(diagnostics_by_server_id) = diagnostics_for_tree.get_mut(&worktree_path) {
if let Ok(ix) = diagnostics_by_server_id.binary_search_by_key(&server_id, |e| e.0) {
diagnostics_by_server_id.remove(ix);
}
if diagnostics_by_server_id.is_empty() {
diagnostics_for_tree.remove(&worktree_path);
}
}
} else {
summaries_by_server_id.insert(server_id, new_summary);
let diagnostics_by_server_id = diagnostics_for_tree
.entry(worktree_path.clone())
.or_default();
match diagnostics_by_server_id.binary_search_by_key(&server_id, |e| e.0) {
Ok(ix) => {
diagnostics_by_server_id[ix] = (server_id, diagnostics);
}
Err(ix) => {
diagnostics_by_server_id.insert(ix, (server_id, diagnostics));
}
}
}
if !old_summary.is_empty() || !new_summary.is_empty() {
if let Some(project_id) = self.remote_id() {
self.client
.send(proto::UpdateDiagnosticSummary {
project_id,
worktree_id: worktree_id.to_proto(),
summary: Some(proto::DiagnosticSummary {
path: worktree_path.to_string_lossy().to_string(),
language_server_id: server_id.0 as u64,
error_count: new_summary.error_count as u32,
warning_count: new_summary.warning_count as u32,
}),
})
.log_err();
}
}
Ok(!old_summary.is_empty() || !new_summary.is_empty())
}
fn update_buffer_diagnostics(
&mut self,
buffer: &Model<Buffer>,
@@ -7453,7 +7762,6 @@ impl Project {
cx: &mut ModelContext<Self>,
) -> Task<Result<Model<Worktree>>> {
let fs = self.fs.clone();
let client = self.client.clone();
let next_entry_id = self.next_entry_id.clone();
let path: Arc<Path> = abs_path.as_ref().into();
let task = self
@@ -7462,15 +7770,9 @@ impl Project {
.or_insert_with(|| {
cx.spawn(move |project, mut cx| {
async move {
let worktree = Worktree::local(
client.clone(),
path.clone(),
visible,
fs,
next_entry_id,
&mut cx,
)
.await;
let worktree =
Worktree::local(path.clone(), visible, fs, next_entry_id, &mut cx)
.await;
project.update(&mut cx, |project, _| {
project.loading_local_worktrees.remove(&path);
@@ -7503,6 +7805,9 @@ impl Project {
}
pub fn remove_worktree(&mut self, id_to_remove: WorktreeId, cx: &mut ModelContext<Self>) {
self.diagnostics.remove(&id_to_remove);
self.diagnostic_summaries.remove(&id_to_remove);
let mut servers_to_remove = HashMap::default();
let mut servers_to_preserve = HashSet::default();
for ((worktree_id, server_name), &server_id) in &self.language_server_ids {
@@ -8128,23 +8433,34 @@ impl Project {
include_ignored: bool,
cx: &'a AppContext,
) -> impl Iterator<Item = (ProjectPath, LanguageServerId, DiagnosticSummary)> + 'a {
self.visible_worktrees(cx).flat_map(move |worktree| {
let worktree = worktree.read(cx);
let worktree_id = worktree.id();
worktree
.diagnostic_summaries()
.filter_map(move |(path, server_id, summary)| {
if include_ignored
|| worktree
.entry_for_path(path.as_ref())
.map_or(false, |entry| !entry.is_ignored)
{
Some((ProjectPath { worktree_id, path }, server_id, summary))
} else {
None
}
})
})
self.visible_worktrees(cx)
.filter_map(|worktree| {
let worktree = worktree.read(cx);
Some((worktree, self.diagnostic_summaries.get(&worktree.id())?))
})
.flat_map(move |(worktree, summaries)| {
let worktree_id = worktree.id();
summaries
.iter()
.filter(move |(path, _)| {
include_ignored
|| worktree
.entry_for_path(path.as_ref())
.map_or(false, |entry| !entry.is_ignored)
})
.flat_map(move |(path, summaries)| {
summaries.iter().map(move |(server_id, summary)| {
(
ProjectPath {
worktree_id,
path: path.clone(),
},
*server_id,
*summary,
)
})
})
})
}
pub fn disk_based_diagnostics_started(
@@ -8805,23 +9121,41 @@ impl Project {
) -> Result<()> {
this.update(&mut cx, |this, cx| {
let worktree_id = WorktreeId::from_proto(envelope.payload.worktree_id);
if let Some(worktree) = this.worktree_for_id(worktree_id, cx) {
if let Some(summary) = envelope.payload.summary {
let project_path = ProjectPath {
worktree_id,
path: Path::new(&summary.path).into(),
};
worktree.update(cx, |worktree, _| {
worktree
.as_remote_mut()
.unwrap()
.update_diagnostic_summary(project_path.path.clone(), &summary);
});
cx.emit(Event::DiagnosticsUpdated {
language_server_id: LanguageServerId(summary.language_server_id as usize),
path: project_path,
});
if let Some(message) = envelope.payload.summary {
let project_path = ProjectPath {
worktree_id,
path: Path::new(&message.path).into(),
};
let path = project_path.path.clone();
let server_id = LanguageServerId(message.language_server_id as usize);
let summary = DiagnosticSummary {
error_count: message.error_count as usize,
warning_count: message.warning_count as usize,
};
if summary.is_empty() {
if let Some(worktree_summaries) =
this.diagnostic_summaries.get_mut(&worktree_id)
{
if let Some(summaries) = worktree_summaries.get_mut(&path) {
summaries.remove(&server_id);
if summaries.is_empty() {
worktree_summaries.remove(&path);
}
}
}
} else {
this.diagnostic_summaries
.entry(worktree_id)
.or_default()
.entry(path)
.or_default()
.insert(server_id, summary);
}
cx.emit(Event::DiagnosticsUpdated {
language_server_id: LanguageServerId(message.language_server_id as usize),
path: project_path,
});
}
Ok(())
})?
@@ -10229,7 +10563,6 @@ impl Project {
cx: &mut ModelContext<Project>,
) -> Result<()> {
let replica_id = self.replica_id();
let remote_id = self.remote_id().ok_or_else(|| anyhow!("invalid project"))?;
let mut old_worktrees_by_id = self
.worktrees
@@ -10246,8 +10579,7 @@ impl Project {
{
self.worktrees.push(WorktreeHandle::Strong(old_worktree));
} else {
let worktree =
Worktree::remote(remote_id, replica_id, worktree, self.client.clone(), cx);
let worktree = Worktree::remote(replica_id, worktree, cx);
let _ = self.add_worktree(&worktree, cx);
}
}
@@ -10282,7 +10614,7 @@ impl Project {
fn deserialize_symbol(serialized_symbol: proto::Symbol) -> Result<CoreSymbol> {
let source_worktree_id = WorktreeId::from_proto(serialized_symbol.source_worktree_id);
let worktree_id = WorktreeId::from_proto(serialized_symbol.worktree_id);
let kind = unsafe { mem::transmute(serialized_symbol.kind) };
let kind = unsafe { mem::transmute::<i32, lsp::SymbolKind>(serialized_symbol.kind) };
let path = ProjectPath {
worktree_id,
path: PathBuf::from(serialized_symbol.path).into(),
@@ -11393,7 +11725,7 @@ fn serialize_symbol(symbol: &Symbol) -> proto::Symbol {
worktree_id: symbol.path.worktree_id.to_proto(),
path: symbol.path.path.to_string_lossy().to_string(),
name: symbol.name.clone(),
kind: unsafe { mem::transmute(symbol.kind) },
kind: unsafe { mem::transmute::<lsp::SymbolKind, i32>(symbol.kind) },
start: Some(proto::PointUtf16 {
row: symbol.range.start.0.row,
column: symbol.range.start.0.column,
@@ -11502,6 +11834,13 @@ async fn wait_for_loading_buffer(
}
}
fn is_not_found_error(error: &anyhow::Error) -> bool {
error
.root_cause()
.downcast_ref::<io::Error>()
.is_some_and(|err| err.kind() == io::ErrorKind::NotFound)
}
fn include_text(server: &lsp::LanguageServer) -> bool {
server
.capabilities()
@@ -11740,3 +12079,47 @@ fn deserialize_location(
})
})
}
#[derive(Copy, Clone, Debug, Default, PartialEq, Serialize)]
pub struct DiagnosticSummary {
pub error_count: usize,
pub warning_count: usize,
}
impl DiagnosticSummary {
pub fn new<'a, T: 'a>(diagnostics: impl IntoIterator<Item = &'a DiagnosticEntry<T>>) -> Self {
let mut this = Self {
error_count: 0,
warning_count: 0,
};
for entry in diagnostics {
if entry.diagnostic.is_primary {
match entry.diagnostic.severity {
DiagnosticSeverity::ERROR => this.error_count += 1,
DiagnosticSeverity::WARNING => this.warning_count += 1,
_ => {}
}
}
}
this
}
pub fn is_empty(&self) -> bool {
self.error_count == 0 && self.warning_count == 0
}
pub fn to_proto(
&self,
language_server_id: LanguageServerId,
path: &Path,
) -> proto::DiagnosticSummary {
proto::DiagnosticSummary {
path: path.to_string_lossy().to_string(),
language_server_id: language_server_id.0 as u64,
error_count: self.error_count as u32,
warning_count: self.warning_count as u32,
}
}
}

View File

@@ -38,6 +38,7 @@ pub struct GitSettings {
impl GitSettings {
pub fn inline_blame_enabled(&self) -> bool {
#[allow(unknown_lints, clippy::manual_unwrap_or_default)]
match self.inline_blame {
Some(InlineBlameSettings { enabled, .. }) => enabled,
_ => false,

View File

@@ -1295,7 +1295,7 @@ async fn test_restarting_server_with_diagnostics_running(cx: &mut gpui::TestAppC
project
.language_servers_running_disk_based_diagnostics()
.collect::<Vec<_>>(),
[LanguageServerId(0); 0]
[] as [language::LanguageServerId; 0]
);
});
}
@@ -2955,7 +2955,6 @@ async fn test_rescan_and_remote_updates(cx: &mut gpui::TestAppContext) {
}));
let project = Project::test(Arc::new(RealFs::default()), [dir.path()], cx).await;
let rpc = project.update(cx, |p, _| p.client.clone());
let buffer_for_path = |path: &'static str, cx: &mut gpui::TestAppContext| {
let buffer = project.update(cx, |p, cx| p.open_local_buffer(dir.path().join(path), cx));
@@ -2987,7 +2986,7 @@ async fn test_rescan_and_remote_updates(cx: &mut gpui::TestAppContext) {
let updates = Arc::new(Mutex::new(Vec::new()));
tree.update(cx, |tree, cx| {
let _ = tree.as_local_mut().unwrap().observe_updates(0, cx, {
tree.as_local_mut().unwrap().observe_updates(0, cx, {
let updates = updates.clone();
move |update| {
updates.lock().push(update);
@@ -2996,7 +2995,7 @@ async fn test_rescan_and_remote_updates(cx: &mut gpui::TestAppContext) {
});
});
let remote = cx.update(|cx| Worktree::remote(1, 1, metadata, rpc.clone(), cx));
let remote = cx.update(|cx| Worktree::remote(1, metadata, cx));
cx.executor().run_until_parked();

View File

@@ -128,7 +128,7 @@ impl Project {
id: spawn_task.id,
full_label: spawn_task.full_label,
label: spawn_task.label,
command_label: spawn_task.command_flattened,
command_label: spawn_task.command_label,
status: TaskStatus::Running,
completion_rx,
}),
@@ -152,7 +152,7 @@ impl Project {
id: spawn_task.id,
full_label: spawn_task.full_label,
label: spawn_task.label,
command_label: spawn_task.command_flattened,
command_label: spawn_task.command_label,
status: TaskStatus::Running,
completion_rx,
}),

View File

@@ -1107,7 +1107,7 @@ pub async fn spawn_ssh_task(
label: "Install zed over ssh".into(),
command,
args,
command_flattened: ssh_connection_string.clone(),
command_label: ssh_connection_string.clone(),
cwd: Some(TerminalWorkDir::Ssh {
ssh_command: ssh_connection_string,
path: None,

View File

@@ -0,0 +1,136 @@
use crate::proto::{Envelope, Message};
use anyhow::{anyhow, Result};
use async_tungstenite::tungstenite::Message as WebSocketMessage;
use futures::{SinkExt, StreamExt};
use prost::Message as _;
use std::{io, time::Instant};
const KIB: usize = 1024;
const MIB: usize = KIB * 1024;
const MAX_BUFFER_LEN: usize = MIB;
/// A stream of protobuf messages.
pub struct MessageStream<S> {
stream: S,
encoding_buffer: Vec<u8>,
}
impl<S> MessageStream<S> {
pub fn new(stream: S) -> Self {
Self {
stream,
encoding_buffer: Vec::new(),
}
}
pub fn inner_mut(&mut self) -> &mut S {
&mut self.stream
}
}
impl<S> MessageStream<S>
where
S: futures::Sink<WebSocketMessage, Error = anyhow::Error> + Unpin,
{
pub async fn write(&mut self, message: Message) -> Result<(), anyhow::Error> {
#[cfg(any(test, feature = "test-support"))]
const COMPRESSION_LEVEL: i32 = -7;
#[cfg(not(any(test, feature = "test-support")))]
const COMPRESSION_LEVEL: i32 = 4;
match message {
Message::Envelope(message) => {
self.encoding_buffer.reserve(message.encoded_len());
message
.encode(&mut self.encoding_buffer)
.map_err(io::Error::from)?;
let buffer =
zstd::stream::encode_all(self.encoding_buffer.as_slice(), COMPRESSION_LEVEL)
.unwrap();
self.encoding_buffer.clear();
self.encoding_buffer.shrink_to(MAX_BUFFER_LEN);
self.stream.send(WebSocketMessage::Binary(buffer)).await?;
}
Message::Ping => {
self.stream
.send(WebSocketMessage::Ping(Default::default()))
.await?;
}
Message::Pong => {
self.stream
.send(WebSocketMessage::Pong(Default::default()))
.await?;
}
}
Ok(())
}
}
impl<S> MessageStream<S>
where
S: futures::Stream<Item = Result<WebSocketMessage, anyhow::Error>> + Unpin,
{
pub async fn read(&mut self) -> Result<(Message, Instant), anyhow::Error> {
while let Some(bytes) = self.stream.next().await {
let received_at = Instant::now();
match bytes? {
WebSocketMessage::Binary(bytes) => {
zstd::stream::copy_decode(bytes.as_slice(), &mut self.encoding_buffer).unwrap();
let envelope = Envelope::decode(self.encoding_buffer.as_slice())
.map_err(io::Error::from)?;
self.encoding_buffer.clear();
self.encoding_buffer.shrink_to(MAX_BUFFER_LEN);
return Ok((Message::Envelope(envelope), received_at));
}
WebSocketMessage::Ping(_) => return Ok((Message::Ping, received_at)),
WebSocketMessage::Pong(_) => return Ok((Message::Pong, received_at)),
WebSocketMessage::Close(_) => break,
_ => {}
}
}
Err(anyhow!("connection closed"))
}
}
#[cfg(test)]
mod tests {
use crate::proto::{envelope, Envelope, UpdateWorktree};
use super::*;
#[gpui::test]
async fn test_buffer_size() {
let (tx, rx) = futures::channel::mpsc::unbounded();
let mut sink = MessageStream::new(tx.sink_map_err(|_| anyhow!("")));
sink.write(Message::Envelope(Envelope {
payload: Some(envelope::Payload::UpdateWorktree(UpdateWorktree {
root_name: "abcdefg".repeat(10),
..Default::default()
})),
..Default::default()
}))
.await
.unwrap();
assert!(sink.encoding_buffer.capacity() <= MAX_BUFFER_LEN);
sink.write(Message::Envelope(Envelope {
payload: Some(envelope::Payload::UpdateWorktree(UpdateWorktree {
root_name: "abcdefg".repeat(1000000),
..Default::default()
})),
..Default::default()
}))
.await
.unwrap();
assert!(sink.encoding_buffer.capacity() <= MAX_BUFFER_LEN);
let mut stream = MessageStream::new(rx.map(anyhow::Ok));
stream.read().await.unwrap();
assert!(stream.encoding_buffer.capacity() <= MAX_BUFFER_LEN);
stream.read().await.unwrap();
assert!(stream.encoding_buffer.capacity() <= MAX_BUFFER_LEN);
}
}

View File

@@ -1,8 +1,8 @@
use crate::{ErrorCode, ErrorCodeExt, ErrorExt, RpcError};
use super::{
proto::{self, AnyTypedEnvelope, EnvelopedMessage, MessageStream, PeerId, RequestMessage},
Connection,
proto::{self, AnyTypedEnvelope, EnvelopedMessage, PeerId, RequestMessage},
Connection, MessageStream,
};
use anyhow::{anyhow, Context, Result};
use collections::HashMap;

View File

@@ -1,21 +1,15 @@
#![allow(non_snake_case)]
use super::{entity_messages, messages, request_messages, ConnectionId, TypedEnvelope};
use anyhow::{anyhow, Result};
use async_tungstenite::tungstenite::Message as WebSocketMessage;
use collections::HashMap;
use futures::{SinkExt as _, StreamExt as _};
use prost::Message as _;
use serde::Serialize;
use std::any::{Any, TypeId};
use std::time::Instant;
use std::{
any::{Any, TypeId},
cmp,
fmt::Debug,
io, iter,
time::{Duration, SystemTime, UNIX_EPOCH},
fmt::{self, Debug},
iter, mem,
time::{Duration, Instant, SystemTime, UNIX_EPOCH},
};
use std::{fmt, mem};
include!(concat!(env!("OUT_DIR"), "/zed.messages.rs"));
@@ -516,16 +510,6 @@ entity_messages!(
UpdateChannelBufferCollaborators,
);
const KIB: usize = 1024;
const MIB: usize = KIB * 1024;
const MAX_BUFFER_LEN: usize = MIB;
/// A stream of protobuf messages.
pub struct MessageStream<S> {
stream: S,
encoding_buffer: Vec<u8>,
}
#[allow(clippy::large_enum_variant)]
#[derive(Debug)]
pub enum Message {
@@ -534,87 +518,6 @@ pub enum Message {
Pong,
}
impl<S> MessageStream<S> {
pub fn new(stream: S) -> Self {
Self {
stream,
encoding_buffer: Vec::new(),
}
}
pub fn inner_mut(&mut self) -> &mut S {
&mut self.stream
}
}
impl<S> MessageStream<S>
where
S: futures::Sink<WebSocketMessage, Error = anyhow::Error> + Unpin,
{
pub async fn write(&mut self, message: Message) -> Result<(), anyhow::Error> {
#[cfg(any(test, feature = "test-support"))]
const COMPRESSION_LEVEL: i32 = -7;
#[cfg(not(any(test, feature = "test-support")))]
const COMPRESSION_LEVEL: i32 = 4;
match message {
Message::Envelope(message) => {
self.encoding_buffer.reserve(message.encoded_len());
message
.encode(&mut self.encoding_buffer)
.map_err(io::Error::from)?;
let buffer =
zstd::stream::encode_all(self.encoding_buffer.as_slice(), COMPRESSION_LEVEL)
.unwrap();
self.encoding_buffer.clear();
self.encoding_buffer.shrink_to(MAX_BUFFER_LEN);
self.stream.send(WebSocketMessage::Binary(buffer)).await?;
}
Message::Ping => {
self.stream
.send(WebSocketMessage::Ping(Default::default()))
.await?;
}
Message::Pong => {
self.stream
.send(WebSocketMessage::Pong(Default::default()))
.await?;
}
}
Ok(())
}
}
impl<S> MessageStream<S>
where
S: futures::Stream<Item = Result<WebSocketMessage, anyhow::Error>> + Unpin,
{
pub async fn read(&mut self) -> Result<(Message, Instant), anyhow::Error> {
while let Some(bytes) = self.stream.next().await {
let received_at = Instant::now();
match bytes? {
WebSocketMessage::Binary(bytes) => {
zstd::stream::copy_decode(bytes.as_slice(), &mut self.encoding_buffer).unwrap();
let envelope = Envelope::decode(self.encoding_buffer.as_slice())
.map_err(io::Error::from)?;
self.encoding_buffer.clear();
self.encoding_buffer.shrink_to(MAX_BUFFER_LEN);
return Ok((Message::Envelope(envelope), received_at));
}
WebSocketMessage::Ping(_) => return Ok((Message::Ping, received_at)),
WebSocketMessage::Pong(_) => return Ok((Message::Pong, received_at)),
WebSocketMessage::Close(_) => break,
_ => {}
}
}
Err(anyhow!("connection closed"))
}
}
impl From<Timestamp> for SystemTime {
fn from(val: Timestamp) -> Self {
UNIX_EPOCH
@@ -722,38 +625,6 @@ pub fn split_worktree_update(
mod tests {
use super::*;
#[gpui::test]
async fn test_buffer_size() {
let (tx, rx) = futures::channel::mpsc::unbounded();
let mut sink = MessageStream::new(tx.sink_map_err(|_| anyhow!("")));
sink.write(Message::Envelope(Envelope {
payload: Some(envelope::Payload::UpdateWorktree(UpdateWorktree {
root_name: "abcdefg".repeat(10),
..Default::default()
})),
..Default::default()
}))
.await
.unwrap();
assert!(sink.encoding_buffer.capacity() <= MAX_BUFFER_LEN);
sink.write(Message::Envelope(Envelope {
payload: Some(envelope::Payload::UpdateWorktree(UpdateWorktree {
root_name: "abcdefg".repeat(1000000),
..Default::default()
})),
..Default::default()
}))
.await
.unwrap();
assert!(sink.encoding_buffer.capacity() <= MAX_BUFFER_LEN);
let mut stream = MessageStream::new(rx.map(anyhow::Ok));
stream.read().await.unwrap();
assert!(stream.encoding_buffer.capacity() <= MAX_BUFFER_LEN);
stream.read().await.unwrap();
assert!(stream.encoding_buffer.capacity() <= MAX_BUFFER_LEN);
}
#[gpui::test]
fn test_converting_peer_id_from_and_to_u64() {
let peer_id = PeerId {

View File

@@ -2,6 +2,7 @@ pub mod auth;
mod conn;
mod error;
mod extension;
mod message_stream;
mod notification;
mod peer;
pub mod proto;
@@ -9,6 +10,7 @@ pub mod proto;
pub use conn::Connection;
pub use error::*;
pub use extension::*;
pub use message_stream::*;
pub use notification::*;
pub use peer::*;
mod macros;

View File

@@ -3,7 +3,7 @@ mod registrar;
use crate::{
search_bar::render_nav_button, FocusSearch, NextHistoryQuery, PreviousHistoryQuery, ReplaceAll,
ReplaceNext, SearchOptions, SelectAllMatches, SelectNextMatch, SelectPrevMatch,
ToggleCaseSensitive, ToggleRegex, ToggleReplace, ToggleWholeWord,
ToggleCaseSensitive, ToggleRegex, ToggleReplace, ToggleSelection, ToggleWholeWord,
};
use any_vec::AnyVec;
use collections::HashMap;
@@ -48,6 +48,8 @@ pub struct Deploy {
pub focus: bool,
#[serde(default)]
pub replace_enabled: bool,
#[serde(default)]
pub selection_search_enabled: bool,
}
impl_actions!(buffer_search, [Deploy]);
@@ -59,6 +61,7 @@ impl Deploy {
Self {
focus: true,
replace_enabled: false,
selection_search_enabled: false,
}
}
}
@@ -90,6 +93,7 @@ pub struct BufferSearchBar {
search_history: SearchHistory,
search_history_cursor: SearchHistoryCursor,
replace_enabled: bool,
selection_search_enabled: bool,
scroll_handle: ScrollHandle,
editor_scroll_handle: ScrollHandle,
editor_needed_width: Pixels,
@@ -228,7 +232,7 @@ impl Render for BufferSearchBar {
}),
)
}))
.children(supported_options.word.then(|| {
.children(supported_options.regex.then(|| {
self.render_search_option_button(
SearchOptions::REGEX,
cx.listener(|this, _, cx| this.toggle_regex(&ToggleRegex, cx)),
@@ -251,6 +255,26 @@ impl Render for BufferSearchBar {
.tooltip(|cx| Tooltip::for_action("Toggle replace", &ToggleReplace, cx)),
)
})
.when(supported_options.selection, |this| {
this.child(
IconButton::new(
"buffer-search-bar-toggle-search-selection-button",
IconName::SearchSelection,
)
.style(ButtonStyle::Subtle)
.when(self.selection_search_enabled, |button| {
button.style(ButtonStyle::Filled)
})
.on_click(cx.listener(|this, _: &ClickEvent, cx| {
this.toggle_selection(&ToggleSelection, cx);
}))
.selected(self.selection_search_enabled)
.size(ButtonSize::Compact)
.tooltip(|cx| {
Tooltip::for_action("Toggle search selection", &ToggleSelection, cx)
}),
)
})
.child(
h_flex()
.flex_none()
@@ -359,6 +383,9 @@ impl Render for BufferSearchBar {
.when(self.supported_options().regex, |this| {
this.on_action(cx.listener(Self::toggle_regex))
})
.when(self.supported_options().selection, |this| {
this.on_action(cx.listener(Self::toggle_selection))
})
.gap_2()
.child(
h_flex()
@@ -440,6 +467,11 @@ impl BufferSearchBar {
this.toggle_whole_word(action, cx);
}
}));
registrar.register_handler(ForDeployed(|this, action: &ToggleSelection, cx| {
if this.supported_options().selection {
this.toggle_selection(action, cx);
}
}));
registrar.register_handler(ForDeployed(|this, action: &ToggleReplace, cx| {
if this.supported_options().replacement {
this.toggle_replace(action, cx);
@@ -497,6 +529,7 @@ impl BufferSearchBar {
search_history_cursor: Default::default(),
active_search: None,
replace_enabled: false,
selection_search_enabled: false,
scroll_handle: ScrollHandle::new(),
editor_scroll_handle: ScrollHandle::new(),
editor_needed_width: px(0.),
@@ -516,8 +549,11 @@ impl BufferSearchBar {
searchable_item.clear_matches(cx);
}
}
if let Some(active_editor) = self.active_searchable_item.as_ref() {
if let Some(active_editor) = self.active_searchable_item.as_mut() {
self.selection_search_enabled = false;
self.replace_enabled = false;
active_editor.search_bar_visibility_changed(false, cx);
active_editor.toggle_filtered_search_ranges(false, cx);
let handle = active_editor.focus_handle(cx);
cx.focus(&handle);
}
@@ -530,8 +566,12 @@ impl BufferSearchBar {
pub fn deploy(&mut self, deploy: &Deploy, cx: &mut ViewContext<Self>) -> bool {
if self.show(cx) {
if let Some(active_item) = self.active_searchable_item.as_mut() {
active_item.toggle_filtered_search_ranges(deploy.selection_search_enabled, cx);
}
self.search_suggested(cx);
self.replace_enabled = deploy.replace_enabled;
self.selection_search_enabled = deploy.selection_search_enabled;
if deploy.focus {
let mut handle = self.query_editor.focus_handle(cx).clone();
let mut select_query = true;
@@ -539,9 +579,11 @@ impl BufferSearchBar {
handle = self.replacement_editor.focus_handle(cx).clone();
select_query = false;
};
if select_query {
self.select_query(cx);
}
cx.focus(&handle);
}
return true;
@@ -823,6 +865,15 @@ impl BufferSearchBar {
self.toggle_search_option(SearchOptions::WHOLE_WORD, cx)
}
fn toggle_selection(&mut self, _: &ToggleSelection, cx: &mut ViewContext<Self>) {
if let Some(active_item) = self.active_searchable_item.as_mut() {
self.selection_search_enabled = !self.selection_search_enabled;
active_item.toggle_filtered_search_ranges(self.selection_search_enabled, cx);
let _ = self.update_matches(cx);
cx.notify();
}
}
fn toggle_regex(&mut self, _: &ToggleRegex, cx: &mut ViewContext<Self>) {
self.toggle_search_option(SearchOptions::REGEX, cx)
}
@@ -1090,9 +1141,9 @@ mod tests {
use std::ops::Range;
use super::*;
use editor::{display_map::DisplayRow, DisplayPoint, Editor};
use editor::{display_map::DisplayRow, DisplayPoint, Editor, MultiBuffer};
use gpui::{Context, Hsla, TestAppContext, VisualTestContext};
use language::Buffer;
use language::{Buffer, Point};
use project::Project;
use smol::stream::StreamExt as _;
use unindent::Unindent as _;
@@ -1405,6 +1456,15 @@ mod tests {
});
}
fn display_points_of(
background_highlights: Vec<(Range<DisplayPoint>, Hsla)>,
) -> Vec<Range<DisplayPoint>> {
background_highlights
.into_iter()
.map(|(range, _)| range)
.collect::<Vec<_>>()
}
#[gpui::test]
async fn test_search_option_handling(cx: &mut TestAppContext) {
let (editor, search_bar, cx) = init_test(cx);
@@ -1417,12 +1477,6 @@ mod tests {
})
.await
.unwrap();
let display_points_of = |background_highlights: Vec<(Range<DisplayPoint>, Hsla)>| {
background_highlights
.into_iter()
.map(|(range, _)| range)
.collect::<Vec<_>>()
};
editor.update(cx, |editor, cx| {
assert_eq!(
display_points_of(editor.all_text_background_highlights(cx)),
@@ -2032,15 +2086,156 @@ mod tests {
.await;
}
#[gpui::test]
async fn test_find_matches_in_selections_singleton_buffer_multiple_selections(
cx: &mut TestAppContext,
) {
init_globals(cx);
let buffer = cx.new_model(|cx| {
Buffer::local(
r#"
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
"#
.unindent(),
cx,
)
});
let cx = cx.add_empty_window();
let editor = cx.new_view(|cx| Editor::for_buffer(buffer.clone(), None, cx));
let search_bar = cx.new_view(|cx| {
let mut search_bar = BufferSearchBar::new(cx);
search_bar.set_active_pane_item(Some(&editor), cx);
search_bar.show(cx);
search_bar
});
editor.update(cx, |editor, cx| {
editor.change_selections(None, cx, |s| {
s.select_ranges(vec![Point::new(1, 0)..Point::new(2, 4)])
})
});
search_bar.update(cx, |search_bar, cx| {
let deploy = Deploy {
focus: true,
replace_enabled: false,
selection_search_enabled: true,
};
search_bar.deploy(&deploy, cx);
});
cx.run_until_parked();
search_bar
.update(cx, |search_bar, cx| search_bar.search("aaa", None, cx))
.await
.unwrap();
editor.update(cx, |editor, cx| {
assert_eq!(
editor.search_background_highlights(cx),
&[
Point::new(1, 0)..Point::new(1, 3),
Point::new(1, 8)..Point::new(1, 11),
Point::new(2, 0)..Point::new(2, 3),
]
);
});
}
#[gpui::test]
async fn test_find_matches_in_selections_multiple_excerpts_buffer_multiple_selections(
cx: &mut TestAppContext,
) {
init_globals(cx);
let text = r#"
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
aaa bbb aaa ccc
"#
.unindent();
let cx = cx.add_empty_window();
let editor = cx.new_view(|cx| {
let multibuffer = MultiBuffer::build_multi(
[
(
&text,
vec![
Point::new(0, 0)..Point::new(2, 0),
Point::new(4, 0)..Point::new(5, 0),
],
),
(&text, vec![Point::new(9, 0)..Point::new(11, 0)]),
],
cx,
);
Editor::for_multibuffer(multibuffer, None, false, cx)
});
let search_bar = cx.new_view(|cx| {
let mut search_bar = BufferSearchBar::new(cx);
search_bar.set_active_pane_item(Some(&editor), cx);
search_bar.show(cx);
search_bar
});
editor.update(cx, |editor, cx| {
editor.change_selections(None, cx, |s| {
s.select_ranges(vec![
Point::new(1, 0)..Point::new(1, 4),
Point::new(5, 3)..Point::new(6, 4),
])
})
});
search_bar.update(cx, |search_bar, cx| {
let deploy = Deploy {
focus: true,
replace_enabled: false,
selection_search_enabled: true,
};
search_bar.deploy(&deploy, cx);
});
cx.run_until_parked();
search_bar
.update(cx, |search_bar, cx| search_bar.search("aaa", None, cx))
.await
.unwrap();
editor.update(cx, |editor, cx| {
assert_eq!(
editor.search_background_highlights(cx),
&[
Point::new(1, 0)..Point::new(1, 3),
Point::new(5, 8)..Point::new(5, 11),
Point::new(6, 0)..Point::new(6, 3),
]
);
});
}
#[gpui::test]
async fn test_invalid_regexp_search_after_valid(cx: &mut TestAppContext) {
let (editor, search_bar, cx) = init_test(cx);
let display_points_of = |background_highlights: Vec<(Range<DisplayPoint>, Hsla)>| {
background_highlights
.into_iter()
.map(|(range, _)| range)
.collect::<Vec<_>>()
};
// Search using valid regexp
search_bar
.update(cx, |search_bar, cx| {

View File

@@ -25,6 +25,7 @@ actions!(
ToggleIncludeIgnored,
ToggleRegex,
ToggleReplace,
ToggleSelection,
SelectNextMatch,
SelectPrevMatch,
SelectAllMatches,

View File

@@ -66,8 +66,8 @@ pub struct SpawnInTerminal {
/// Arguments to the command, potentially unsubstituted,
/// to let the shell that spawns the command to do the substitution, if needed.
pub args: Vec<String>,
/// A command with all of its arguments appended.
pub command_flattened: String,
/// A human-readable label, containing command and all of its arguments, joined and substituted.
pub command_label: String,
/// Current working directory to spawn the command into.
pub cwd: Option<TerminalWorkDir>,
/// Env overrides for the command, will be appended to the terminal's environment from the settings.

View File

@@ -143,13 +143,13 @@ impl TaskTemplate {
&variable_names,
&mut substituted_variables,
)?;
let _command = substitute_all_template_variables_in_str(
let command = substitute_all_template_variables_in_str(
&self.command,
&task_variables,
&variable_names,
&mut substituted_variables,
)?;
let _args_with_substitutions = substitute_all_template_variables_in_vec(
let args_with_substitutions = substitute_all_template_variables_in_vec(
&self.args,
&task_variables,
&variable_names,
@@ -163,14 +163,12 @@ impl TaskTemplate {
.context("hashing task variables")
.log_err()?;
let id = TaskId(format!("{id_base}_{task_hash}_{variables_hash}"));
let mut _env = substitute_all_template_variables_in_map(
let mut env = substitute_all_template_variables_in_map(
&self.env,
&task_variables,
&variable_names,
&mut substituted_variables,
)?;
let mut env = self.env.clone();
env.extend(task_variables.into_iter().map(|(k, v)| (k, v.to_owned())));
Some(ResolvedTask {
id: id.clone(),
@@ -182,15 +180,15 @@ impl TaskTemplate {
cwd,
full_label,
label: human_readable_label,
command_flattened: self.args.iter().fold(
self.command.clone(),
command_label: args_with_substitutions.iter().fold(
command.clone(),
|mut command_label, arg| {
command_label.push(' ');
command_label.push_str(arg);
command_label
},
),
command: self.command.clone(),
command,
args: self.args.clone(),
env,
use_new_terminal: self.use_new_terminal,
@@ -530,8 +528,8 @@ mod tests {
);
assert_eq!(
spawn_in_terminal.command,
format!("echo $ZED_FILE $ZED_SYMBOL"),
"Command should not be substituted with variables"
format!("echo test_file {long_value}"),
"Command should be substituted with variables and those should not be shortened"
);
assert_eq!(
spawn_in_terminal.args,
@@ -543,12 +541,9 @@ mod tests {
"Args should not be substituted with variables"
);
assert_eq!(
spawn_in_terminal.command_flattened,
format!(
"{} arg1 $ZED_SELECTED_TEXT arg2 $ZED_COLUMN arg3 $ZED_SYMBOL",
spawn_in_terminal.command
),
"Command label args should not be substituted with variables"
spawn_in_terminal.command_label,
format!("{} arg1 test_selected_text arg2 5678 arg3 {long_value}", spawn_in_terminal.command),
"Command label args should be substituted with variables and those should not be shortened"
);
assert_eq!(
@@ -560,16 +555,16 @@ mod tests {
);
assert_eq!(
spawn_in_terminal.env.get("env_key_1").map(|s| s.as_str()),
Some("$ZED_WORKTREE_ROOT")
Some("/test_root/")
);
assert_eq!(
spawn_in_terminal.env.get("env_key_2").map(|s| s.as_str()),
Some("env_var_2 $ZED_CUSTOM_custom_variable_1 $ZED_CUSTOM_custom_variable_2")
Some("env_var_2 test_custom_variable_1 test_custom_variable_2")
);
assert_eq!(
spawn_in_terminal.env.get("env_key_3").map(|s| s.as_str()),
Some("env_var_3 $ZED_SYMBOL"),
"Env vars should not be substituted with variables and those should not be shortened"
spawn_in_terminal.env.get("env_key_3"),
Some(&format!("env_var_3 {long_value}")),
"Env vars should be substituted with variables and those should not be shortened"
);
}

View File

@@ -362,13 +362,13 @@ impl PickerDelegate for TasksModalDelegate {
String::new()
};
if let Some(resolved) = resolved_task.resolved.as_ref() {
if resolved.command_flattened != display_label
&& resolved.command_flattened != resolved_task.resolved_label
if resolved.command_label != display_label
&& resolved.command_label != resolved_task.resolved_label
{
if !tooltip_label_text.trim().is_empty() {
tooltip_label_text.push('\n');
}
tooltip_label_text.push_str(&resolved.command_flattened);
tooltip_label_text.push_str(&resolved.command_label);
}
}
let tooltip_label = if tooltip_label_text.trim().is_empty() {
@@ -466,7 +466,7 @@ impl PickerDelegate for TasksModalDelegate {
let task_index = self.matches.get(self.selected_index())?.candidate_id;
let tasks = self.candidates.as_ref()?;
let (_, task) = tasks.get(task_index)?;
Some(task.resolved.as_ref()?.command_flattened.clone())
Some(task.resolved.as_ref()?.command_label.clone())
}
fn confirm_input(&mut self, omit_history_entry: bool, cx: &mut ViewContext<Picker<Self>>) {

View File

@@ -14,7 +14,7 @@ doctest = false
[dependencies]
alacritty_terminal = "0.23"
alacritty_terminal = { git = "https://github.com/alacritty/alacritty", rev = "cacdb5bb3b72bad2c729227537979d95af75978f" }
anyhow.workspace = true
collections.workspace = true
dirs = "4.0.0"

View File

@@ -553,7 +553,6 @@ impl Element for TerminalElement {
global_id: Option<&GlobalElementId>,
cx: &mut WindowContext,
) -> (LayoutId, Self::RequestLayoutState) {
self.interactivity.occlude_mouse();
let layout_id = self
.interactivity
.request_layout(global_id, cx, |mut style, cx| {

View File

@@ -357,7 +357,7 @@ impl TerminalPanel {
return;
};
spawn_task.command_flattened = format!("{shell} -i -c `{}`", spawn_task.command_flattened);
spawn_task.command_label = format!("{shell} -i -c `{}`", spawn_task.command_label);
let task_command = std::mem::replace(&mut spawn_task.command, shell);
let task_args = std::mem::take(&mut spawn_task.args);
let combined_command = task_args

View File

@@ -972,6 +972,7 @@ impl SearchableItem for TerminalView {
word: false,
regex: true,
replacement: false,
selection: false,
}
}

View File

@@ -54,10 +54,14 @@ pub enum IconDecoration {
#[derive(Default, PartialEq, Copy, Clone)]
pub enum IconSize {
/// 10px
Indicator,
/// 12px
XSmall,
/// 14px
Small,
#[default]
/// 16px
Medium,
}
@@ -169,12 +173,15 @@ pub enum IconName {
Save,
Screen,
SelectAll,
SearchSelection,
Server,
Settings,
Shift,
Sliders,
Snip,
Space,
Sparkle,
SparkleFilled,
Spinner,
Split,
Star,
@@ -293,12 +300,15 @@ impl IconName {
IconName::Save => "icons/save.svg",
IconName::Screen => "icons/desktop.svg",
IconName::SelectAll => "icons/select_all.svg",
IconName::SearchSelection => "icons/search_selection.svg",
IconName::Server => "icons/server.svg",
IconName::Settings => "icons/file_icons/settings.svg",
IconName::Shift => "icons/shift.svg",
IconName::Sliders => "icons/sliders.svg",
IconName::Snip => "icons/snip.svg",
IconName::Space => "icons/space.svg",
IconName::Sparkle => "icons/sparkle.svg",
IconName::SparkleFilled => "icons/sparkle_filled.svg",
IconName::Spinner => "icons/spinner.svg",
IconName::Split => "icons/split.svg",
IconName::Star => "icons/star.svg",

View File

@@ -21,7 +21,7 @@ lazy_static::lazy_static! {
} else {
HOME.join(".config").join("zed")
};
pub static ref CONVERSATIONS_DIR: PathBuf = if cfg!(target_os = "macos") {
pub static ref CONTEXTS_DIR: PathBuf = if cfg!(target_os = "macos") {
CONFIG_DIR.join("conversations")
} else {
SUPPORT_DIR.join("conversations")

View File

@@ -1440,6 +1440,14 @@ pub(crate) fn last_non_whitespace(
) -> DisplayPoint {
let mut end_of_line = end_of_line(map, false, from, count).to_offset(map, Bias::Left);
let scope = map.buffer_snapshot.language_scope_at(from.to_point(map));
// NOTE: depending on clip_at_line_end we may already be one char back from the end.
if let Some((ch, _)) = map.buffer_chars_at(end_of_line).next() {
if char_kind(&scope, ch) != CharKind::Whitespace {
return end_of_line.to_display_point(map);
}
}
for (ch, offset) in map.reverse_buffer_chars_at(end_of_line) {
if ch == '\n' {
break;
@@ -1935,6 +1943,10 @@ mod test {
#[gpui::test]
async fn test_end_of_line_downward(cx: &mut gpui::TestAppContext) {
let mut cx = NeovimBackedTestContext::new(cx).await;
cx.set_shared_state("ˇ one\n two \nthree").await;
cx.simulate_shared_keystrokes("g _").await;
cx.shared_state().await.assert_eq(" onˇe\n two \nthree");
cx.set_shared_state("ˇ one \n two \nthree").await;
cx.simulate_shared_keystrokes("g _").await;
cx.shared_state().await.assert_eq(" onˇe \n two \nthree");

View File

@@ -11,6 +11,7 @@ pub(crate) mod search;
pub mod substitute;
mod yank;
use std::collections::HashMap;
use std::sync::Arc;
use crate::{
@@ -21,8 +22,11 @@ use crate::{
Vim,
};
use collections::BTreeSet;
use editor::display_map::ToDisplayPoint;
use editor::scroll::Autoscroll;
use editor::Anchor;
use editor::Bias;
use editor::Editor;
use gpui::{actions, ViewContext, WindowContext};
use language::{Point, SelectionGoal};
use log::error;
@@ -143,7 +147,11 @@ pub(crate) fn register(workspace: &mut Workspace, cx: &mut ViewContext<Workspace
Vim::update(cx, |vim, cx| {
vim.record_current_action(cx);
vim.update_active_editor(cx, |_, editor, cx| {
editor.transact(cx, |editor, cx| editor.indent(&Default::default(), cx))
editor.transact(cx, |editor, cx| {
let mut original_positions = save_selection_starts(editor, cx);
editor.indent(&Default::default(), cx);
restore_selection_cursors(editor, cx, &mut original_positions);
});
});
if vim.state().mode.is_visual() {
vim.switch_mode(Mode::Normal, false, cx)
@@ -155,7 +163,11 @@ pub(crate) fn register(workspace: &mut Workspace, cx: &mut ViewContext<Workspace
Vim::update(cx, |vim, cx| {
vim.record_current_action(cx);
vim.update_active_editor(cx, |_, editor, cx| {
editor.transact(cx, |editor, cx| editor.outdent(&Default::default(), cx))
editor.transact(cx, |editor, cx| {
let mut original_positions = save_selection_starts(editor, cx);
editor.outdent(&Default::default(), cx);
restore_selection_cursors(editor, cx, &mut original_positions);
});
});
if vim.state().mode.is_visual() {
vim.switch_mode(Mode::Normal, false, cx)
@@ -390,6 +402,33 @@ fn yank_line(_: &mut Workspace, _: &YankLine, cx: &mut ViewContext<Workspace>) {
})
}
fn save_selection_starts(editor: &Editor, cx: &mut ViewContext<Editor>) -> HashMap<usize, Anchor> {
let (map, selections) = editor.selections.all_display(cx);
selections
.iter()
.map(|selection| {
(
selection.id,
map.display_point_to_anchor(selection.start, Bias::Right),
)
})
.collect::<HashMap<_, _>>()
}
fn restore_selection_cursors(
editor: &mut Editor,
cx: &mut ViewContext<Editor>,
positions: &mut HashMap<usize, Anchor>,
) {
editor.change_selections(Some(Autoscroll::fit()), cx, |s| {
s.move_with(|map, selection| {
if let Some(anchor) = positions.remove(&selection.id) {
selection.collapse_to(anchor.to_display_point(map), SelectionGoal::None);
}
});
});
}
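The two helpers above snapshot each selection's start as an `Anchor` keyed by `selection.id`, run the indent/outdent, then collapse each selection back to its saved cursor. The same id-keyed save/restore pattern, reduced to plain offsets with hypothetical stand-in types (not Zed's editor API):

```rust
use std::collections::HashMap;

// Simplified stand-in for an editor selection; the real code keys anchors
// by `selection.id` in exactly this way.
struct Selection {
    id: usize,
    start: usize,
    end: usize,
}

// Snapshot each selection's start position, keyed by id.
fn save_starts(selections: &[Selection]) -> HashMap<usize, usize> {
    selections.iter().map(|s| (s.id, s.start)).collect()
}

// Collapse each selection back to its saved cursor, consuming the map entries.
fn restore_cursors(selections: &mut [Selection], saved: &mut HashMap<usize, usize>) {
    for s in selections.iter_mut() {
        if let Some(start) = saved.remove(&s.id) {
            s.start = start;
            s.end = start;
        }
    }
}

fn main() {
    let mut sels = vec![Selection { id: 1, start: 4, end: 9 }];
    let mut saved = save_starts(&sels);
    // ...an indent or outdent would mutate the selections here...
    restore_cursors(&mut sels, &mut saved);
    assert_eq!((sels[0].start, sels[0].end), (4, 4));
}
```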
pub(crate) fn normal_replace(text: Arc<str>, cx: &mut WindowContext) {
Vim::update(cx, |vim, cx| {
vim.stop_recording();


@@ -69,6 +69,10 @@ fn scroll_editor(
let should_move_cursor = editor.newest_selection_on_screen(cx).is_eq();
let old_top_anchor = editor.scroll_manager.anchor().anchor;
if editor.scroll_hover(amount, cx) {
return;
}
editor.scroll_screen(amount, cx);
if should_move_cursor {
let visible_rows = if let Some(visible_rows) = editor.visible_line_count() {


@@ -179,7 +179,7 @@ async fn test_indent_outdent(cx: &mut gpui::TestAppContext) {
// works in visual mode
cx.simulate_keystrokes("shift-v down >");
cx.assert_editor_state("aa\n bb\n cˇc");
cx.assert_editor_state("aa\n bˇb\n cc");
// works as operator
cx.set_state("aa\nbˇb\ncc\n", Mode::Normal);
@@ -202,11 +202,16 @@ async fn test_indent_outdent(cx: &mut gpui::TestAppContext) {
cx.simulate_keystrokes("> 2 k");
cx.assert_editor_state(" aa\n bb\n ˇcc\n");
// works with repeat
cx.set_state("a\nb\nccˇc\n", Mode::Normal);
cx.simulate_keystrokes("> 2 k");
cx.assert_editor_state(" a\n b\n ccˇc\n");
cx.simulate_keystrokes(".");
cx.assert_editor_state(" a\n b\n ccˇc\n");
cx.simulate_keystrokes("v k <");
cx.assert_editor_state(" a\n\n ccc\n");
cx.simulate_keystrokes(".");
cx.assert_editor_state(" a\n\nccc\n");
}
#[gpui::test]


@@ -1,3 +1,7 @@
{"Put":{"state":"ˇ one\n two \nthree"}}
{"Key":"g"}
{"Key":"_"}
{"Get":{"state":" onˇe\n two \nthree","mode":"Normal"}}
{"Put":{"state":"ˇ one \n two \nthree"}}
{"Key":"g"}
{"Key":"_"}


@@ -551,8 +551,7 @@ where
if let Err(err) = self.await {
log::error!("{err:?}");
if let Ok(prompt) = cx.update(|cx| {
let detail = f(&err, cx)
.unwrap_or_else(|| format!("{err:?}. Please try again.", err = err));
let detail = f(&err, cx).unwrap_or_else(|| format!("{err}. Please try again."));
cx.prompt(PromptLevel::Critical, &msg, Some(&detail), &["Ok"])
}) {
prompt.await.ok();
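The hunk above drops the redundant named argument and switches the prompt detail from `{err:?}` (Debug) to `{err}` (Display), which reads better in a user-facing dialog. The difference between the two format traits, using a hypothetical error type:

```rust
use std::fmt;

// Hypothetical error type: Debug shows the struct, Display shows a message.
#[derive(Debug)]
struct SaveError {
    path: String,
}

impl fmt::Display for SaveError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "failed to save {}", self.path)
    }
}

fn main() {
    let err = SaveError { path: "a.txt".into() };
    // Display ({err}): the user-facing message, as in the new prompt detail.
    assert_eq!(
        format!("{err}. Please try again."),
        "failed to save a.txt. Please try again."
    );
    // Debug ({err:?}): implementation detail, noisier in a dialog.
    assert_eq!(format!("{err:?}"), r#"SaveError { path: "a.txt" }"#);
}
```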


@@ -1133,11 +1133,19 @@ impl Pane {
}
}
// If a buffer is open both in a singleton editor and in a multibuffer, make sure
// to focus the singleton buffer when prompting to save that buffer, as opposed
// to focusing the multibuffer, because this gives the user a more clear idea
// of what content they would be saving.
items_to_close.sort_by_key(|item| !item.is_singleton(cx));
let active_item_id = self.active_item().map(|item| item.item_id());
items_to_close.sort_by_key(|item| {
// Put the currently active item at the end, because if the currently active item is not closed last
// closing the currently active item will cause the focus to switch to another item
// This will cause Zed to expand the content of the currently active item
active_item_id.filter(|&id| id == item.item_id()).is_some()
// If a buffer is open both in a singleton editor and in a multibuffer, make sure
// to focus the singleton buffer when prompting to save that buffer, as opposed
// to focusing the multibuffer, because this gives the user a more clear idea
// of what content they would be saving.
|| !item.is_singleton(cx)
});
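`sort_by_key` with a `bool` key is a stable sort in which `false` orders before `true`, so items whose predicate holds end up last; the hunk above uses this to close the active item (and multibuffers) after everything else. A minimal sketch of the ordering behavior, with made-up item ids:

```rust
fn main() {
    // (id, is_multibuffer); hypothetical data, active item id is 3.
    let active_id = 3;
    let mut items = vec![(1, true), (2, false), (3, false), (4, false)];

    // `false < true`, and the sort is stable, so relative order is preserved
    // within the "close first" and "close last" groups.
    items.sort_by_key(|&(id, is_multibuffer)| id == active_id || is_multibuffer);

    let close_order: Vec<_> = items.iter().map(|&(id, _)| id).collect();
    // Plain singletons first (2, 4), then the multibuffer (1), then the active item (3).
    assert_eq!(close_order, vec![2, 4, 1, 3]);
}
```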
let workspace = self.workspace.clone();
cx.spawn(|pane, mut cx| async move {


@@ -39,8 +39,9 @@ pub struct SearchOptions {
pub case: bool,
pub word: bool,
pub regex: bool,
/// Specifies whether the item supports search & replace.
pub replacement: bool,
pub selection: bool,
}
pub trait SearchableItem: Item + EventEmitter<SearchEvent> {
@@ -52,15 +53,18 @@ pub trait SearchableItem: Item + EventEmitter<SearchEvent> {
word: true,
regex: true,
replacement: true,
selection: true,
}
}
fn search_bar_visibility_changed(&mut self, _visible: bool, _cx: &mut ViewContext<Self>) {}
fn has_filtered_search_ranges(&mut self) -> bool {
false
Self::supported_options().selection
}
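With this change, `has_filtered_search_ranges` defaults to the item's own declared `supported_options().selection` instead of a hard-coded `false`, so an item opts in by declaring the capability once. A pared-down sketch of that pattern with hypothetical types (not the real `SearchableItem` trait):

```rust
// Hypothetical, reduced version of the pattern: a trait default method
// derived from an associated options constructor, so implementors declare
// a capability in one place instead of overriding two methods.
struct SearchOptions {
    selection: bool,
}

trait Searchable {
    fn supported_options() -> SearchOptions {
        SearchOptions { selection: false }
    }

    // Default follows the declared options rather than a fixed `false`.
    fn has_filtered_search_ranges(&self) -> bool {
        Self::supported_options().selection
    }
}

struct Editor;
impl Searchable for Editor {
    fn supported_options() -> SearchOptions {
        SearchOptions { selection: true }
    }
}

struct Terminal;
impl Searchable for Terminal {}

fn main() {
    assert!(Editor.has_filtered_search_ranges());
    assert!(!Terminal.has_filtered_search_ranges());
}
```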
fn toggle_filtered_search_ranges(&mut self, _enabled: bool, _cx: &mut ViewContext<Self>) {}
fn clear_matches(&mut self, cx: &mut ViewContext<Self>);
fn update_matches(&mut self, matches: &[Self::Match], cx: &mut ViewContext<Self>);
fn query_suggestion(&mut self, cx: &mut ViewContext<Self>) -> String;
@@ -138,6 +142,8 @@ pub trait SearchableItemHandle: ItemHandle {
cx: &mut WindowContext,
) -> Option<usize>;
fn search_bar_visibility_changed(&self, visible: bool, cx: &mut WindowContext);
fn toggle_filtered_search_ranges(&mut self, enabled: bool, cx: &mut WindowContext);
}
impl<T: SearchableItem> SearchableItemHandle for View<T> {
@@ -240,6 +246,12 @@ impl<T: SearchableItem> SearchableItemHandle for View<T> {
this.search_bar_visibility_changed(visible, cx)
});
}
fn toggle_filtered_search_ranges(&mut self, enabled: bool, cx: &mut WindowContext) {
self.update(cx, |this, cx| {
this.toggle_filtered_search_ranges(enabled, cx)
});
}
}
impl From<Box<dyn SearchableItemHandle>> for AnyView {


@@ -3969,8 +3969,7 @@ impl Workspace {
fn adjust_padding(padding: Option<f32>) -> f32 {
padding
.unwrap_or(Self::DEFAULT_PADDING)
.min(Self::MAX_PADDING)
.max(0.0)
.clamp(0.0, Self::MAX_PADDING)
}
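The `.min(MAX).max(0.0)` chain above is replaced with the equivalent `clamp(0.0, MAX)`, which states the bounds in one call (and, unlike the chain, panics if the bounds are inverted). A quick check that the two forms agree, using an assumed `MAX_PADDING` value for illustration:

```rust
fn main() {
    // Assumed constant for the sketch; the real value lives on `Workspace`.
    const MAX_PADDING: f32 = 0.25;

    // The old chain and the new clamp agree for in-range, negative,
    // and oversized inputs.
    for padding in [-1.0_f32, 0.1, 0.9] {
        let old = padding.min(MAX_PADDING).max(0.0);
        let new = padding.clamp(0.0, MAX_PADDING);
        assert_eq!(old, new);
    }

    // Out-of-range inputs are pinned to the nearest bound.
    assert_eq!((-1.0_f32).clamp(0.0, MAX_PADDING), 0.0);
    assert_eq!((0.9_f32).clamp(0.0, MAX_PADDING), MAX_PADDING);
}
```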
}


@@ -14,7 +14,6 @@ workspace = true
[features]
test-support = [
"client/test-support",
"language/test-support",
"settings/test-support",
"text/test-support",
@@ -24,7 +23,6 @@ test-support = [
[dependencies]
anyhow.workspace = true
client.workspace = true
clock.workspace = true
collections.workspace = true
fs.workspace = true
@@ -36,7 +34,6 @@ ignore.workspace = true
itertools.workspace = true
language.workspace = true
log.workspace = true
lsp.workspace = true
parking_lot.workspace = true
postage.workspace = true
rpc.workspace = true


@@ -5,11 +5,9 @@ mod worktree_tests;
use ::ignore::gitignore::{Gitignore, GitignoreBuilder};
use anyhow::{anyhow, Context as _, Result};
use client::{proto, Client};
use clock::ReplicaId;
use collections::{HashMap, HashSet, VecDeque};
use fs::{copy_recursive, RemoveOptions};
use fs::{Fs, Watcher};
use fs::{copy_recursive, Fs, RemoveOptions, Watcher};
use futures::{
channel::{
mpsc::{self, UnboundedSender},
@@ -21,9 +19,9 @@ use futures::{
FutureExt as _, Stream, StreamExt,
};
use fuzzy::CharBag;
use git::status::GitStatus;
use git::{
repository::{GitFileStatus, GitRepository, RepoPath},
status::GitStatus,
DOT_GIT, GITIGNORE,
};
use gpui::{
@@ -32,21 +30,15 @@ use gpui::{
};
use ignore::IgnoreStack;
use itertools::Itertools;
use language::{
proto::{deserialize_version, serialize_line_ending, serialize_version},
Buffer, Capability, DiagnosticEntry, File as _, LineEnding, PointUtf16, Rope, Unclipped,
};
use lsp::{DiagnosticSeverity, LanguageServerId};
use parking_lot::Mutex;
use postage::{
barrier,
prelude::{Sink as _, Stream as _},
watch,
};
use serde::Serialize;
use rpc::proto;
use settings::{Settings, SettingsLocation, SettingsStore};
use smol::channel::{self, Sender};
use std::time::Instant;
use std::{
any::Any,
cmp::{self, Ordering},
@@ -62,10 +54,10 @@ use std::{
atomic::{AtomicUsize, Ordering::SeqCst},
Arc,
},
time::{Duration, SystemTime},
time::{Duration, Instant, SystemTime},
};
use sum_tree::{Bias, Edit, SeekTarget, SumTree, TreeMap, TreeSet};
use text::BufferId;
use text::{LineEnding, Rope};
use util::{
paths::{PathMatcher, HOME},
ResultExt,
@@ -106,36 +98,16 @@ pub enum CreatedEntry {
Excluded { abs_path: PathBuf },
}
#[cfg(any(test, feature = "test-support"))]
impl CreatedEntry {
pub fn to_included(self) -> Option<Entry> {
match self {
CreatedEntry::Included(entry) => Some(entry),
CreatedEntry::Excluded { .. } => None,
}
}
}
pub struct LocalWorktree {
snapshot: LocalSnapshot,
scan_requests_tx: channel::Sender<ScanRequest>,
path_prefixes_to_scan_tx: channel::Sender<Arc<Path>>,
is_scanning: (watch::Sender<bool>, watch::Receiver<bool>),
_background_scanner_tasks: Vec<Task<()>>,
share: Option<ShareState>,
diagnostics: HashMap<
Arc<Path>,
Vec<(
LanguageServerId,
Vec<DiagnosticEntry<Unclipped<PointUtf16>>>,
)>,
>,
diagnostic_summaries: HashMap<Arc<Path>, HashMap<LanguageServerId, DiagnosticSummary>>,
client: Arc<Client>,
update_observer: Option<ShareState>,
fs: Arc<dyn Fs>,
fs_case_sensitive: bool,
visible: bool,
next_entry_id: Arc<AtomicUsize>,
}
@@ -147,12 +119,9 @@ struct ScanRequest {
pub struct RemoteWorktree {
snapshot: Snapshot,
background_snapshot: Arc<Mutex<Snapshot>>,
project_id: u64,
client: Arc<Client>,
updates_tx: Option<UnboundedSender<proto::UpdateWorktree>>,
snapshot_subscriptions: VecDeque<(usize, oneshot::Sender<()>)>,
replica_id: ReplicaId,
diagnostic_summaries: HashMap<Arc<Path>, HashMap<LanguageServerId, DiagnosticSummary>>,
visible: bool,
disconnected: bool,
}
@@ -365,7 +334,6 @@ enum ScanState {
}
struct ShareState {
project_id: u64,
snapshots_tx:
mpsc::UnboundedSender<(LocalSnapshot, UpdatedEntriesSet, UpdatedGitRepositoriesSet)>,
resume_updates: watch::Sender<()>,
@@ -382,7 +350,6 @@ impl EventEmitter<Event> for Worktree {}
impl Worktree {
pub async fn local(
client: Arc<Client>,
path: impl Into<Arc<Path>>,
visible: bool,
fs: Arc<dyn Fs>,
@@ -502,7 +469,7 @@ impl Worktree {
next_entry_id: Arc::clone(&next_entry_id),
snapshot,
is_scanning: watch::channel_with(true),
share: None,
update_observer: None,
scan_requests_tx,
path_prefixes_to_scan_tx,
_background_scanner_tasks: start_background_scan_tasks(
@@ -514,9 +481,6 @@ impl Worktree {
Arc::clone(&fs),
cx,
),
diagnostics: Default::default(),
diagnostic_summaries: Default::default(),
client,
fs,
fs_case_sensitive,
visible,
@@ -525,10 +489,8 @@ impl Worktree {
}
pub fn remote(
project_remote_id: u64,
replica_id: ReplicaId,
worktree: proto::WorktreeMetadata,
client: Arc<Client>,
cx: &mut AppContext,
) -> Model<Self> {
cx.new_model(|cx: &mut ModelContext<Self>| {
@@ -590,14 +552,11 @@ impl Worktree {
.detach();
Worktree::Remote(RemoteWorktree {
project_id: project_remote_id,
replica_id,
snapshot: snapshot.clone(),
background_snapshot,
updates_tx: Some(updates_tx),
snapshot_subscriptions: Default::default(),
client: client.clone(),
diagnostic_summaries: Default::default(),
visible: worktree.visible,
disconnected: false,
})
@@ -679,21 +638,6 @@ impl Worktree {
}
}
pub fn diagnostic_summaries(
&self,
) -> impl Iterator<Item = (Arc<Path>, LanguageServerId, DiagnosticSummary)> + '_ {
match self {
Worktree::Local(worktree) => &worktree.diagnostic_summaries,
Worktree::Remote(worktree) => &worktree.diagnostic_summaries,
}
.iter()
.flat_map(|(path, summaries)| {
summaries
.iter()
.map(move |(&server_id, &summary)| (path.clone(), server_id, summary))
})
}
pub fn abs_path(&self) -> Arc<Path> {
match self {
Worktree::Local(worktree) => worktree.abs_path.clone(),
@@ -807,168 +751,6 @@ impl LocalWorktree {
path.starts_with(&self.abs_path)
}
pub fn load_buffer(
&mut self,
path: &Path,
cx: &mut ModelContext<Worktree>,
) -> Task<Result<Model<Buffer>>> {
let path = Arc::from(path);
let reservation = cx.reserve_model();
let buffer_id = BufferId::from(reservation.entity_id().as_non_zero_u64());
cx.spawn(move |this, mut cx| async move {
let (file, contents, diff_base) = this
.update(&mut cx, |t, cx| t.as_local().unwrap().load(&path, cx))?
.await?;
let text_buffer = cx
.background_executor()
.spawn(async move { text::Buffer::new(0, buffer_id, contents) })
.await;
cx.insert_model(reservation, |_| {
Buffer::build(
text_buffer,
diff_base,
Some(Arc::new(file)),
Capability::ReadWrite,
)
})
})
}
pub fn new_buffer(
&mut self,
path: Arc<Path>,
cx: &mut ModelContext<Worktree>,
) -> Model<Buffer> {
let worktree = cx.handle();
cx.new_model(|cx| {
let buffer_id = BufferId::from(cx.entity_id().as_non_zero_u64());
let text_buffer = text::Buffer::new(0, buffer_id, "".into());
Buffer::build(
text_buffer,
None,
Some(Arc::new(File {
worktree,
path,
mtime: None,
entry_id: None,
is_local: true,
is_deleted: false,
is_private: false,
})),
Capability::ReadWrite,
)
})
}
pub fn diagnostics_for_path(
&self,
path: &Path,
) -> Vec<(
LanguageServerId,
Vec<DiagnosticEntry<Unclipped<PointUtf16>>>,
)> {
self.diagnostics.get(path).cloned().unwrap_or_default()
}
pub fn clear_diagnostics_for_language_server(
&mut self,
server_id: LanguageServerId,
_: &mut ModelContext<Worktree>,
) {
let worktree_id = self.id().to_proto();
self.diagnostic_summaries
.retain(|path, summaries_by_server_id| {
if summaries_by_server_id.remove(&server_id).is_some() {
if let Some(share) = self.share.as_ref() {
self.client
.send(proto::UpdateDiagnosticSummary {
project_id: share.project_id,
worktree_id,
summary: Some(proto::DiagnosticSummary {
path: path.to_string_lossy().to_string(),
language_server_id: server_id.0 as u64,
error_count: 0,
warning_count: 0,
}),
})
.log_err();
}
!summaries_by_server_id.is_empty()
} else {
true
}
});
self.diagnostics.retain(|_, diagnostics_by_server_id| {
if let Ok(ix) = diagnostics_by_server_id.binary_search_by_key(&server_id, |e| e.0) {
diagnostics_by_server_id.remove(ix);
!diagnostics_by_server_id.is_empty()
} else {
true
}
});
}
pub fn update_diagnostics(
&mut self,
server_id: LanguageServerId,
worktree_path: Arc<Path>,
diagnostics: Vec<DiagnosticEntry<Unclipped<PointUtf16>>>,
_: &mut ModelContext<Worktree>,
) -> Result<bool> {
let summaries_by_server_id = self
.diagnostic_summaries
.entry(worktree_path.clone())
.or_default();
let old_summary = summaries_by_server_id
.remove(&server_id)
.unwrap_or_default();
let new_summary = DiagnosticSummary::new(&diagnostics);
if new_summary.is_empty() {
if let Some(diagnostics_by_server_id) = self.diagnostics.get_mut(&worktree_path) {
if let Ok(ix) = diagnostics_by_server_id.binary_search_by_key(&server_id, |e| e.0) {
diagnostics_by_server_id.remove(ix);
}
if diagnostics_by_server_id.is_empty() {
self.diagnostics.remove(&worktree_path);
}
}
} else {
summaries_by_server_id.insert(server_id, new_summary);
let diagnostics_by_server_id =
self.diagnostics.entry(worktree_path.clone()).or_default();
match diagnostics_by_server_id.binary_search_by_key(&server_id, |e| e.0) {
Ok(ix) => {
diagnostics_by_server_id[ix] = (server_id, diagnostics);
}
Err(ix) => {
diagnostics_by_server_id.insert(ix, (server_id, diagnostics));
}
}
}
if !old_summary.is_empty() || !new_summary.is_empty() {
if let Some(share) = self.share.as_ref() {
self.client
.send(proto::UpdateDiagnosticSummary {
project_id: share.project_id,
worktree_id: self.id().to_proto(),
summary: Some(proto::DiagnosticSummary {
path: worktree_path.to_string_lossy().to_string(),
language_server_id: server_id.0 as u64,
error_count: new_summary.error_count as u32,
warning_count: new_summary.warning_count as u32,
}),
})
.log_err();
}
}
Ok(!old_summary.is_empty() || !new_summary.is_empty())
}
fn restart_background_scanners(&mut self, cx: &mut ModelContext<Worktree>) {
let (scan_requests_tx, scan_requests_rx) = channel::unbounded();
let (path_prefixes_to_scan_tx, path_prefixes_to_scan_rx) = channel::unbounded();
@@ -997,7 +779,7 @@ impl LocalWorktree {
new_snapshot.share_private_files = self.snapshot.share_private_files;
self.snapshot = new_snapshot;
if let Some(share) = self.share.as_mut() {
if let Some(share) = self.update_observer.as_mut() {
share
.snapshots_tx
.unbounded_send((
@@ -1138,7 +920,7 @@ impl LocalWorktree {
}
}
fn load(
pub fn load_file(
&self,
path: &Path,
cx: &mut ModelContext<Worktree>,
@@ -1232,97 +1014,6 @@ impl LocalWorktree {
})
}
pub fn save_buffer(
&self,
buffer_handle: Model<Buffer>,
path: Arc<Path>,
mut has_changed_file: bool,
cx: &mut ModelContext<Worktree>,
) -> Task<Result<()>> {
let buffer = buffer_handle.read(cx);
let rpc = self.client.clone();
let buffer_id: u64 = buffer.remote_id().into();
let project_id = self.share.as_ref().map(|share| share.project_id);
if buffer.file().is_some_and(|file| !file.is_created()) {
has_changed_file = true;
}
let text = buffer.as_rope().clone();
let version = buffer.version();
let save = self.write_file(path.as_ref(), text, buffer.line_ending(), cx);
let fs = Arc::clone(&self.fs);
let abs_path = self.absolutize(&path);
let is_private = self.snapshot.is_path_private(&path);
cx.spawn(move |this, mut cx| async move {
let entry = save.await?;
let abs_path = abs_path?;
let this = this.upgrade().context("worktree dropped")?;
let (entry_id, mtime, path, is_dotenv) = match entry {
Some(entry) => (Some(entry.id), entry.mtime, entry.path, entry.is_private),
None => {
let metadata = fs
.metadata(&abs_path)
.await
.with_context(|| {
format!(
"Fetching metadata after saving the excluded buffer {abs_path:?}"
)
})?
.with_context(|| {
format!("Excluded buffer {path:?} got removed during saving")
})?;
(None, Some(metadata.mtime), path, is_private)
}
};
if has_changed_file {
let new_file = Arc::new(File {
entry_id,
worktree: this,
path,
mtime,
is_local: true,
is_deleted: false,
is_private: is_dotenv,
});
if let Some(project_id) = project_id {
rpc.send(proto::UpdateBufferFile {
project_id,
buffer_id,
file: Some(new_file.to_proto()),
})
.log_err();
}
buffer_handle.update(&mut cx, |buffer, cx| {
if has_changed_file {
buffer.file_updated(new_file, cx);
}
})?;
}
if let Some(project_id) = project_id {
rpc.send(proto::BufferSaved {
project_id,
buffer_id,
version: serialize_version(&version),
mtime: mtime.map(|time| time.into()),
})?;
}
buffer_handle.update(&mut cx, |buffer, cx| {
buffer.did_save(version.clone(), mtime, cx);
})?;
Ok(())
})
}
/// Find the lowest path in the worktree's datastructures that is an ancestor
fn lowest_ancestor(&self, path: &Path) -> PathBuf {
let mut lowest_ancestor = None;
@@ -1400,7 +1091,7 @@ impl LocalWorktree {
})
}
pub(crate) fn write_file(
pub fn write_file(
&self,
path: impl Into<Arc<Path>>,
text: Rope,
@@ -1630,8 +1321,7 @@ impl LocalWorktree {
project_id: u64,
cx: &mut ModelContext<Worktree>,
callback: F,
) -> oneshot::Receiver<()>
where
) where
F: 'static + Send + Fn(proto::UpdateWorktree) -> Fut,
Fut: Send + Future<Output = bool>,
{
@@ -1640,12 +1330,9 @@ impl LocalWorktree {
#[cfg(not(any(test, feature = "test-support")))]
const MAX_CHUNK_SIZE: usize = 256;
let (share_tx, share_rx) = oneshot::channel();
if let Some(share) = self.share.as_mut() {
share_tx.send(()).ok();
*share.resume_updates.borrow_mut() = ();
return share_rx;
if let Some(observer) = self.update_observer.as_mut() {
*observer.resume_updates.borrow_mut() = ();
return;
}
let (resume_updates_tx, mut resume_updates_rx) = watch::channel::<()>();
@@ -1683,47 +1370,23 @@ impl LocalWorktree {
}
}
}
share_tx.send(()).ok();
Some(())
});
self.share = Some(ShareState {
project_id,
self.update_observer = Some(ShareState {
snapshots_tx,
resume_updates: resume_updates_tx,
_maintain_remote_snapshot,
});
share_rx
}
pub fn share(&mut self, project_id: u64, cx: &mut ModelContext<Worktree>) -> Task<Result<()>> {
let client = self.client.clone();
for (path, summaries) in &self.diagnostic_summaries {
for (&server_id, summary) in summaries {
if let Err(e) = self.client.send(proto::UpdateDiagnosticSummary {
project_id,
worktree_id: cx.entity_id().as_u64(),
summary: Some(summary.to_proto(server_id, path)),
}) {
return Task::ready(Err(e));
}
}
}
let rx = self.observe_updates(project_id, cx, move |update| {
client.request(update).map(|result| result.is_ok())
});
cx.background_executor()
.spawn(async move { rx.await.map_err(|_| anyhow!("share ended")) })
pub fn stop_observing_updates(&mut self) {
self.update_observer.take();
}
pub fn unshare(&mut self) {
self.share.take();
}
pub fn is_shared(&self) -> bool {
self.share.is_some()
#[cfg(any(test, feature = "test-support"))]
pub fn has_update_observer(&self) -> bool {
self.update_observer.is_some()
}
pub fn share_private_files(&mut self, cx: &mut ModelContext<Worktree>) {
@@ -1743,37 +1406,6 @@ impl RemoteWorktree {
self.disconnected = true;
}
pub fn save_buffer(
&self,
buffer_handle: Model<Buffer>,
new_path: Option<proto::ProjectPath>,
cx: &mut ModelContext<Worktree>,
) -> Task<Result<()>> {
let buffer = buffer_handle.read(cx);
let buffer_id = buffer.remote_id().into();
let version = buffer.version();
let rpc = self.client.clone();
let project_id = self.project_id;
cx.spawn(move |_, mut cx| async move {
let response = rpc
.request(proto::SaveBuffer {
project_id,
buffer_id,
new_path,
version: serialize_version(&version),
})
.await?;
let version = deserialize_version(&response.version);
let mtime = response.mtime.map(|mtime| mtime.into());
buffer_handle.update(&mut cx, |buffer, cx| {
buffer.did_save(version.clone(), mtime, cx);
})?;
Ok(())
})
}
pub fn update_from_remote(&mut self, update: proto::UpdateWorktree) {
if let Some(updates_tx) = &self.updates_tx {
updates_tx
@@ -1807,32 +1439,6 @@ impl RemoteWorktree {
}
}
pub fn update_diagnostic_summary(
&mut self,
path: Arc<Path>,
summary: &proto::DiagnosticSummary,
) {
let server_id = LanguageServerId(summary.language_server_id as usize);
let summary = DiagnosticSummary {
error_count: summary.error_count as usize,
warning_count: summary.warning_count as usize,
};
if summary.is_empty() {
if let Some(summaries) = self.diagnostic_summaries.get_mut(&path) {
summaries.remove(&server_id);
if summaries.is_empty() {
self.diagnostic_summaries.remove(&path);
}
}
} else {
self.diagnostic_summaries
.entry(path)
.or_default()
.insert(server_id, summary);
}
}
pub fn insert_entry(
&mut self,
entry: proto::Entry,
@@ -3023,29 +2629,6 @@ impl language::LocalFile for File {
cx.background_executor()
.spawn(async move { fs.load(&abs_path?).await })
}
fn buffer_reloaded(
&self,
buffer_id: BufferId,
version: &clock::Global,
line_ending: LineEnding,
mtime: Option<SystemTime>,
cx: &mut AppContext,
) {
let worktree = self.worktree.read(cx).as_local().unwrap();
if let Some(project_id) = worktree.share.as_ref().map(|share| share.project_id) {
worktree
.client
.send(proto::BufferReloaded {
project_id,
buffer_id: buffer_id.into(),
version: serialize_version(version),
mtime: mtime.map(|time| time.into()),
line_ending: serialize_line_ending(line_ending) as i32,
})
.log_err();
}
}
}
impl File {
@@ -5109,46 +4692,12 @@ impl ProjectEntryId {
}
}
#[derive(Copy, Clone, Debug, Default, PartialEq, Serialize)]
pub struct DiagnosticSummary {
pub error_count: usize,
pub warning_count: usize,
}
impl DiagnosticSummary {
fn new<'a, T: 'a>(diagnostics: impl IntoIterator<Item = &'a DiagnosticEntry<T>>) -> Self {
let mut this = Self {
error_count: 0,
warning_count: 0,
};
for entry in diagnostics {
if entry.diagnostic.is_primary {
match entry.diagnostic.severity {
DiagnosticSeverity::ERROR => this.error_count += 1,
DiagnosticSeverity::WARNING => this.warning_count += 1,
_ => {}
}
}
}
this
}
pub fn is_empty(&self) -> bool {
self.error_count == 0 && self.warning_count == 0
}
pub fn to_proto(
&self,
language_server_id: LanguageServerId,
path: &Path,
) -> proto::DiagnosticSummary {
proto::DiagnosticSummary {
path: path.to_string_lossy().to_string(),
language_server_id: language_server_id.0 as u64,
error_count: self.error_count as u32,
warning_count: self.warning_count as u32,
#[cfg(any(test, feature = "test-support"))]
impl CreatedEntry {
pub fn to_included(self) -> Option<Entry> {
match self {
CreatedEntry::Included(entry) => Some(entry),
CreatedEntry::Excluded { .. } => None,
}
}
}


@@ -3,12 +3,9 @@ use crate::{
WorktreeModelHandle,
};
use anyhow::Result;
use client::Client;
use clock::FakeSystemClock;
use fs::{FakeFs, Fs, RealFs, RemoveOptions};
use git::{repository::GitFileStatus, GITIGNORE};
use gpui::{BorrowAppContext, ModelContext, Task, TestAppContext};
use http::FakeHttpClient;
use parking_lot::Mutex;
use postage::stream::Stream;
use pretty_assertions::assert_eq;
@@ -35,7 +32,6 @@ async fn test_traversal(cx: &mut TestAppContext) {
.await;
let tree = Worktree::local(
build_client(cx),
Path::new("/root"),
true,
fs,
@@ -100,7 +96,6 @@ async fn test_circular_symlinks(cx: &mut TestAppContext) {
.unwrap();
let tree = Worktree::local(
build_client(cx),
Path::new("/root"),
true,
fs.clone(),
@@ -200,7 +195,6 @@ async fn test_symlinks_pointing_outside(cx: &mut TestAppContext) {
.unwrap();
let tree = Worktree::local(
build_client(cx),
Path::new("/root/dir1"),
true,
fs.clone(),
@@ -351,7 +345,6 @@ async fn test_renaming_case_only(cx: &mut TestAppContext) {
}));
let tree = Worktree::local(
build_client(cx),
temp_root.path(),
true,
fs.clone(),
@@ -428,7 +421,6 @@ async fn test_open_gitignored_files(cx: &mut TestAppContext) {
.await;
let tree = Worktree::local(
build_client(cx),
Path::new("/root"),
true,
fs.clone(),
@@ -461,16 +453,16 @@ async fn test_open_gitignored_files(cx: &mut TestAppContext) {
// Open a file that is nested inside of a gitignored directory that
// has not yet been expanded.
let prev_read_dir_count = fs.read_dir_call_count();
let buffer = tree
let (file, _, _) = tree
.update(cx, |tree, cx| {
tree.as_local_mut()
.unwrap()
.load_buffer("one/node_modules/b/b1.js".as_ref(), cx)
.load_file("one/node_modules/b/b1.js".as_ref(), cx)
})
.await
.unwrap();
tree.read_with(cx, |tree, cx| {
tree.read_with(cx, |tree, _| {
assert_eq!(
tree.entries(true)
.map(|entry| (entry.path.as_ref(), entry.is_ignored))
@@ -491,10 +483,7 @@ async fn test_open_gitignored_files(cx: &mut TestAppContext) {
]
);
assert_eq!(
buffer.read(cx).file().unwrap().path().as_ref(),
Path::new("one/node_modules/b/b1.js")
);
assert_eq!(file.path.as_ref(), Path::new("one/node_modules/b/b1.js"));
// Only the newly-expanded directories are scanned.
assert_eq!(fs.read_dir_call_count() - prev_read_dir_count, 2);
@@ -503,16 +492,16 @@ async fn test_open_gitignored_files(cx: &mut TestAppContext) {
// Open another file in a different subdirectory of the same
// gitignored directory.
let prev_read_dir_count = fs.read_dir_call_count();
let buffer = tree
let (file, _, _) = tree
.update(cx, |tree, cx| {
tree.as_local_mut()
.unwrap()
.load_buffer("one/node_modules/a/a2.js".as_ref(), cx)
.load_file("one/node_modules/a/a2.js".as_ref(), cx)
})
.await
.unwrap();
tree.read_with(cx, |tree, cx| {
tree.read_with(cx, |tree, _| {
assert_eq!(
tree.entries(true)
.map(|entry| (entry.path.as_ref(), entry.is_ignored))
@@ -535,10 +524,7 @@ async fn test_open_gitignored_files(cx: &mut TestAppContext) {
]
);
assert_eq!(
buffer.read(cx).file().unwrap().path().as_ref(),
Path::new("one/node_modules/a/a2.js")
);
assert_eq!(file.path.as_ref(), Path::new("one/node_modules/a/a2.js"));
// Only the newly-expanded directory is scanned.
assert_eq!(fs.read_dir_call_count() - prev_read_dir_count, 1);
@@ -591,7 +577,6 @@ async fn test_dirs_no_longer_ignored(cx: &mut TestAppContext) {
.await;
let tree = Worktree::local(
build_client(cx),
Path::new("/root"),
true,
fs.clone(),
@@ -711,7 +696,6 @@ async fn test_rescan_with_gitignore(cx: &mut TestAppContext) {
.await;
let tree = Worktree::local(
build_client(cx),
"/root/tree".as_ref(),
true,
fs.clone(),
@@ -793,7 +777,6 @@ async fn test_update_gitignore(cx: &mut TestAppContext) {
.await;
let tree = Worktree::local(
build_client(cx),
"/root".as_ref(),
true,
fs.clone(),
@@ -848,7 +831,6 @@ async fn test_write_file(cx: &mut TestAppContext) {
}));
let tree = Worktree::local(
build_client(cx),
dir.path(),
true,
Arc::new(RealFs::default()),
@@ -928,7 +910,6 @@ async fn test_file_scan_exclusions(cx: &mut TestAppContext) {
});
let tree = Worktree::local(
build_client(cx),
dir.path(),
true,
Arc::new(RealFs::default()),
@@ -1032,7 +1013,6 @@ async fn test_fs_events_in_exclusions(cx: &mut TestAppContext) {
});
let tree = Worktree::local(
build_client(cx),
dir.path(),
true,
Arc::new(RealFs::default()),
@@ -1142,7 +1122,6 @@ async fn test_fs_events_in_dot_git_worktree(cx: &mut TestAppContext) {
let dot_git_worktree_dir = dir.path().join(".git");
let tree = Worktree::local(
build_client(cx),
dot_git_worktree_dir.clone(),
true,
Arc::new(RealFs::default()),
@@ -1181,7 +1160,6 @@ async fn test_create_directory_during_initial_scan(cx: &mut TestAppContext) {
.await;
let tree = Worktree::local(
build_client(cx),
"/root".as_ref(),
true,
fs,
@@ -1194,7 +1172,7 @@ async fn test_create_directory_during_initial_scan(cx: &mut TestAppContext) {
let snapshot1 = tree.update(cx, |tree, cx| {
let tree = tree.as_local_mut().unwrap();
let snapshot = Arc::new(Mutex::new(tree.snapshot()));
let _ = tree.observe_updates(0, cx, {
tree.observe_updates(0, cx, {
let snapshot = snapshot.clone();
move |update| {
snapshot.lock().apply_remote_update(update).unwrap();
@@ -1232,13 +1210,6 @@ async fn test_create_directory_during_initial_scan(cx: &mut TestAppContext) {
async fn test_create_dir_all_on_create_entry(cx: &mut TestAppContext) {
init_test(cx);
cx.executor().allow_parking();
let client_fake = cx.update(|cx| {
Client::new(
Arc::new(FakeSystemClock::default()),
FakeHttpClient::with_404_response(),
cx,
)
});
let fs_fake = FakeFs::new(cx.background_executor.clone());
fs_fake
@@ -1251,7 +1222,6 @@ async fn test_create_dir_all_on_create_entry(cx: &mut TestAppContext) {
.await;
let tree_fake = Worktree::local(
client_fake,
"/root".as_ref(),
true,
fs_fake,
@@ -1280,21 +1250,12 @@ async fn test_create_dir_all_on_create_entry(cx: &mut TestAppContext) {
assert!(tree.entry_for_path("a/b/").unwrap().is_dir());
});
let client_real = cx.update(|cx| {
Client::new(
Arc::new(FakeSystemClock::default()),
FakeHttpClient::with_404_response(),
cx,
)
});
let fs_real = Arc::new(RealFs::default());
let temp_root = temp_tree(json!({
"a": {}
}));
let tree_real = Worktree::local(
client_real,
temp_root.path(),
true,
fs_real,
@@ -1385,7 +1346,6 @@ async fn test_random_worktree_operations_during_initial_scan(
log::info!("generated initial tree");
let worktree = Worktree::local(
build_client(cx),
root_dir,
true,
fs.clone(),
@@ -1400,7 +1360,7 @@ async fn test_random_worktree_operations_during_initial_scan(
worktree.update(cx, |tree, cx| {
check_worktree_change_events(tree, cx);
let _ = tree.as_local_mut().unwrap().observe_updates(0, cx, {
tree.as_local_mut().unwrap().observe_updates(0, cx, {
let updates = updates.clone();
move |update| {
updates.lock().push(update);
@@ -1475,7 +1435,6 @@ async fn test_random_worktree_changes(cx: &mut TestAppContext, mut rng: StdRng)
log::info!("generated initial tree");
let worktree = Worktree::local(
build_client(cx),
root_dir,
true,
fs.clone(),
@@ -1489,7 +1448,7 @@ async fn test_random_worktree_changes(cx: &mut TestAppContext, mut rng: StdRng)
worktree.update(cx, |tree, cx| {
check_worktree_change_events(tree, cx);
let _ = tree.as_local_mut().unwrap().observe_updates(0, cx, {
tree.as_local_mut().unwrap().observe_updates(0, cx, {
let updates = updates.clone();
move |update| {
updates.lock().push(update);
@@ -1548,7 +1507,6 @@ async fn test_random_worktree_changes(cx: &mut TestAppContext, mut rng: StdRng)
{
let new_worktree = Worktree::local(
build_client(cx),
root_dir,
true,
fs.clone(),
@@ -1892,7 +1850,6 @@ async fn test_rename_work_directory(cx: &mut TestAppContext) {
let root_path = root.path();
let tree = Worktree::local(
build_client(cx),
root_path,
true,
Arc::new(RealFs::default()),
@@ -1971,7 +1928,6 @@ async fn test_git_repository_for_path(cx: &mut TestAppContext) {
}));
let tree = Worktree::local(
build_client(cx),
root.path(),
true,
Arc::new(RealFs::default()),
@@ -2112,7 +2068,6 @@ async fn test_git_status(cx: &mut TestAppContext) {
git_commit("Initial commit", &repo);
let tree = Worktree::local(
build_client(cx),
root.path(),
true,
Arc::new(RealFs::default()),
@@ -2294,7 +2249,6 @@ async fn test_repository_subfolder_git_status(cx: &mut TestAppContext) {
     // Open the worktree in subfolder
     let project_root = Path::new("my-repo/sub-folder-1/sub-folder-2");
     let tree = Worktree::local(
-        build_client(cx),
        root.path().join(project_root),
        true,
        Arc::new(RealFs::default()),
@@ -2392,7 +2346,6 @@ async fn test_propagate_git_statuses(cx: &mut TestAppContext) {
     );
     let tree = Worktree::local(
-        build_client(cx),
        Path::new("/root"),
        true,
        fs.clone(),
@@ -2471,12 +2424,6 @@ async fn test_propagate_git_statuses(cx: &mut TestAppContext) {
}
}
fn build_client(cx: &mut TestAppContext) -> Arc<Client> {
let clock = Arc::new(FakeSystemClock::default());
let http_client = FakeHttpClient::with_404_response();
cx.update(|cx| Client::new(clock, http_client, cx))
}
#[track_caller]
fn git_init(path: &Path) -> git2::Repository {
git2::Repository::init(path).expect("Failed to initialize git repository")


@@ -2,7 +2,7 @@
 description = "The fast, collaborative code editor."
 edition = "2021"
 name = "zed"
-version = "0.139.0"
+version = "0.140.0"
 publish = false
 license = "GPL-3.0-or-later"
 authors = ["Zed Team <hi@zed.dev>"]


@@ -12,7 +12,7 @@ We have a growing collection of pre-defined keymaps in [zed repository's keymaps
 - TextMate
 - VSCode (default)
 
-These keymaps can be set via the `base_keymap` setting in your `keymap.json` file. Additionally, if you'd like to work from a clean slate, you can provide `"None"` to the setting.
+These keymaps can be set via the `base_keymap` setting in your `settings.json` file. Additionally, if you'd like to work from a clean slate, you can provide `"None"` to the setting.
 
 ## Custom key bindings


@@ -27,12 +27,14 @@ g S Find symbol in entire project
 g ] Go to next diagnostic
 g [ Go to previous diagnostic
+] d Go to next diagnostic
+[ d Go to previous diagnostic
 g h Show inline error (hover)
 g . Open the code actions menu
 
 # Git
-] c Go to previous git change
-[ c Go to next git change
+] c Go to next git change
+[ c Go to previous git change
 
 # Treesitter
 ] x Select a smaller syntax node


@@ -1,7 +1,7 @@
 id = "astro"
 name = "Astro"
 description = "Astro support."
-version = "0.0.2"
+version = "0.0.3"
 schema_version = 1
 authors = ["Alvaro Gaona <alvgaona@gmail.com>"]
 repository = "https://github.com/zed-industries/zed"


@@ -0,0 +1,6 @@
+[
+  (attribute_value)
+  (quoted_attribute_value)
+] @string
+
+(comment) @comment


@@ -1,7 +1,7 @@
 id = "ruby"
 name = "Ruby"
 description = "Ruby support."
-version = "0.0.6"
+version = "0.0.7"
 schema_version = 1
 authors = ["Vitaly Slobodin <vitaliy.slobodin@gmail.com>"]
 repository = "https://github.com/zed-industries/zed"


@@ -40,3 +40,8 @@ brackets = [
 ]
 collapsed_placeholder = "# ..."
 tab_size = 2
+scope_opt_in_language_servers = ["tailwindcss-language-server"]
+
+[overrides.string]
+word_characters = ["-"]
+opt_into_language_servers = ["tailwindcss-language-server"]
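For context on the hunk above: `word_characters = ["-"]` makes hyphenated tokens such as `bg-blue-500` count as single words inside Ruby strings, and `opt_into_language_servers` enables the Tailwind language server there. A minimal sketch of the kind of Ruby code this change targets (hypothetical helper, not part of this diff):

```ruby
# Hypothetical Rails-style view helper: Tailwind utility classes live in
# plain Ruby strings, where the override above now allows completion.
def button_classes(variant)
  base = "px-4 py-2 rounded font-semibold"
  color =
    if variant == :primary
      "bg-blue-500 hover:bg-blue-700 text-white"
    else
      "bg-gray-200 text-gray-800"
    end
  "#{base} #{color}"
end
```

With the string override in place, typing inside `color`'s string literal triggers tailwindcss-language-server completions, and `hover:bg-blue-700` is treated as one word when navigating or selecting.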


@@ -23,6 +23,7 @@ if [[ -n $apt ]]; then
         libzstd-dev
         libvulkan1
         libgit2-dev
+        make
     )
     $maysudo "$apt" install -y "${deps[@]}"
     exit 0


@@ -32,7 +32,6 @@ ADDITIONAL_LABELS: set[str] = {
 }
 
 IGNORED_LABELS: set[str] = {
     "ignore top-ranking issues",
-    "meta",
 }
 
 ISSUES_PER_LABEL: int = 20